Deepfakes: The Newest Frontier in Fraud
Fraud perpetrators constantly alter their methods to evade detection; nimble cybercriminals are one reason IT security companies update their software so frequently. The use of deepfakes (a word derived from “deep learning” and “fake”) is one of the latest threats to emerge. Deepfakes are enabled by artificial intelligence (AI), and they need to be on your company’s radar: if you haven’t seen a deepfake yet, you will.
Spotting an imposter
A deepfake uses AI to create video, audio or still images that seem real. You may have seen them in viral videos of famous people, such as one in which Facebook’s Mark Zuckerberg appears to say he has “total control of billions of people’s stolen data.” As realistic as it looked and sounded, the video depicted something that never happened.
Aside from manipulating public opinion and generating outrage, deepfakes can be used to steal. With an expertly altered audio file, someone can trick a bank’s voice authentication tools into granting access to funds. Or a deepfake combining audio and video could convince a company to open a customer account and sell goods on credit. In such cases, the nonpaying customers are untraceable.
Proving what’s real
Since deepfakes use emerging technology, detecting them can be challenging. But depending on a deepfake’s format, some third-party detection solutions are available.
Software designed to detect video deepfakes can use a “liveness” detector, which analyzes a person’s face for natural movements such as blinking. Computers can also analyze images at the pixel level for signs of manipulation. On the audio side, detection software can discern almost-imperceptible sounds that weren’t generated by a human.
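To make the pixel-level idea concrete, here is a minimal Python sketch of one common screening heuristic, error level analysis, which re-saves an image at a known JPEG quality and flags regions that compress differently from the rest. It is an illustrative example only, not one of the commercial detection tools described above, and the file names and resave quality are placeholder assumptions.

```python
# Minimal error level analysis (ELA) sketch using Pillow.
# Illustrative heuristic only; file names and quality setting are assumptions.
import io

from PIL import Image, ImageChops


def error_level_analysis(path, resave_quality=90):
    """Re-save the image as JPEG and compare it to the original.

    Regions that were pasted in or synthetically generated often compress
    differently, so they can stand out in the difference image.
    """
    original = Image.open(path).convert("RGB")

    # Re-save at a known JPEG quality, then reload the recompressed copy.
    buffer = io.BytesIO()
    original.save(buffer, format="JPEG", quality=resave_quality)
    buffer.seek(0)
    recompressed = Image.open(buffer)

    # The per-pixel difference is the "error level" map.
    diff = ImageChops.difference(original, recompressed)

    # Scale the difference so subtle discrepancies are visible for review.
    extrema = diff.getextrema()
    max_channel = max(channel_max for _, channel_max in extrema)
    scale = 255.0 / max_channel if max_channel else 1.0
    return diff.point(lambda value: min(255, int(value * scale)))


if __name__ == "__main__":
    # Hypothetical file name; replace with an image frame you want to screen.
    error_level_analysis("suspect_frame.jpg").save("suspect_frame_ela.png")
```

A human reviewer would then inspect the output image: evenly distributed noise is normal, while a bright, sharply bounded region suggests that part of the picture was altered or generated separately.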
Keeping current
You can protect your business from deepfake-related fraud by updating your internal controls. For example, if your company operates a call center, make sure you have procedures that prevent audio deepfakes from being used to gain unauthorized account access. In addition, stay on top of deepfake developments. You might, for example, set up a Google Alert to deliver articles relevant to your industry and its particular vulnerabilities.