
Content provided by Audioboom and Information Security Forum Podcast. All podcast content, including episodes, graphics, and podcast descriptions, is uploaded and provided directly by Audioboom and Information Security Forum Podcast or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process described here: https://ru.player.fm/legal.

S30 Ep1: Dr. Andrew Newell - Deep Fakes: An attack on human identity

23:35
Manage episode 444128075 series 2984965
Today, Steve sits down with Dr. Andrew Newell, Chief Scientific Officer at the British biometrics firm iProov, for a conversation about deep fakes. As technology improves, it’s becoming ever more difficult to determine what’s real and what’s fake. Steve and Andrew discuss what this will mean going forward for security, social media platforms, and everyday technology users.
Key Takeaways:
1. Technology is the key to mitigating the threat of deep fakes, which are synthetic images or videos created to deceive.
2. Deep fakes are becoming increasingly sophisticated, making them hard to spot.
3. Newell breaks down the problem into two parts: secure identity verification and detecting synthetic images.
4. Incentives for verifying imagery will radically shift as deep fakes become more prevalent.

Tune in to hear more about:
1. Deep fake technology and its potential impact on identity verification processes (5:57)
2. Preventing deep fake images and videos using technology and algorithmic systems (9:57)
3. Deep fakes and their potential uses, including filmmaking and education (13:11)
4. Deep fakes and their impact on society, with a focus on technology’s role in verifying authenticity (18:43)

Standout Quotes:
1. “I think the urgency here — and this is the absolutely key part — is that we need to get the technology in place to make sure that the processes that rely on the genuineness of the person in imagery, that we can have something in place that we know works, that we know that we can trust, and is something that is very easy to use.” - Andrew Newell
2. “I think on the protection of identity proofing systems against the threat from deep fakes, we have a technology solution now. And the urgency is to make sure that this technology is used wherever that we need to actually guard against that threat.” - Andrew Newell
3. “And one of the most important things, if not the most important thing, is: when we think about a way to mitigate these threats, it has to be something that works for everybody. We cannot end up with a system that only works for certain groups in a society.” - Andrew Newell
Mentioned in this episode:
Read the transcript of this episode
Subscribe to the ISF Podcast wherever you listen to podcasts
Connect with us on LinkedIn and Twitter
From the Information Security Forum, the leading authority on cyber, information security, and risk management.


283 episodes
