Industry Talk

Regular Industry Development Updates, Opinions and Talking Points relating to Manufacturing, the Supply Chain and Logistics.

AI Voice Cloning Scam Warning

Social media has become a deeply rooted part of our lives, with the sharing of video and audio content a daily occurrence for many. However, given the sheer volume of publicly available content now online, it has never been easier for malicious actors to create convincing deepfakes.

We have already seen many consumers fall victim to such scams, sharing sensitive information with someone they believe to be a trusted friend or family member. For businesses, the deepfake threat paints a similar picture, with CEOs' voices being cloned to extract financial details from employees. To mitigate this risk, we need stronger authentication, such as cryptographically signed proof of identity that can only be unlocked with biometrics.
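To make that idea concrete, the sketch below shows one possible shape of such a check: a simple challenge-response flow using the Python `cryptography` package, where the verifier only trusts a caller who can sign a fresh random challenge with a previously enrolled private key. The function names (`issue_challenge`, `sign_challenge`, `verify_response`) are illustrative assumptions, not any vendor's API, and the keys are generated in memory purely for demonstration; in a real deployment the private key would sit in a device's secure enclave behind a biometric unlock.

```python
# Minimal sketch of a challenge-response identity check (assumed design,
# not a specific product). Requires the "cryptography" package.
import os

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import (
    Ed25519PrivateKey,
    Ed25519PublicKey,
)


def issue_challenge() -> bytes:
    """Verifier creates a fresh random challenge so responses cannot be replayed."""
    return os.urandom(32)


def sign_challenge(private_key: Ed25519PrivateKey, challenge: bytes) -> bytes:
    """Claimed sender signs the challenge; in practice this step would sit behind a biometric prompt."""
    return private_key.sign(challenge)


def verify_response(public_key: Ed25519PublicKey, challenge: bytes, signature: bytes) -> bool:
    """Verifier checks the signature against the enrolled public key."""
    try:
        public_key.verify(signature, challenge)
        return True
    except InvalidSignature:
        return False


if __name__ == "__main__":
    # Enrolment: the genuine person registers a public key with the verifier.
    enrolled_private = Ed25519PrivateKey.generate()
    enrolled_public = enrolled_private.public_key()

    # Later, someone claiming to be that person must answer a fresh challenge.
    challenge = issue_challenge()
    signature = sign_challenge(enrolled_private, challenge)
    print("genuine caller verified:", verify_response(enrolled_public, challenge, signature))

    # A voice clone cannot produce a valid signature without the private key.
    impostor_private = Ed25519PrivateKey.generate()
    forged = impostor_private.sign(challenge)
    print("impostor verified:", verify_response(enrolled_public, challenge, forged))
```

The point of the design is that a convincing voice or face is no longer enough: whatever the caller sounds or looks like, they still have to prove possession of a key that only the real person's device can unlock.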

This technology doesn't undermine the importance of human vigilance, though. Knowing what to look for (blurry or distorted imagery, muffled or disjointed audio) is a vital frontline defence against deepfake threats for businesses and consumers alike. Much of this comes down to secondary checks: if something seems off, call the person in question directly, or ask a question that only the real person would know the answer to.