"Do You Have a Safe Word Yet?" A Call to Arms Against Deep Fake Voice Attacks

In the digital age, we've seen a steady evolution of threats, but perhaps none as chilling as the rise of deep fake voices and videos. With relative ease, malicious actors can use voice-cloning technology to mimic someone's voice and exploit it criminally: convincing others to take dangerous actions, authorizing fraudulent payments, or opening gaps in security. This is a threat too severe to overlook. So how do we safeguard ourselves in a landscape where our ears can't always be trusted?