Scammers Pull Off $240,000 Heist Using AI-Powered Deepfake Voice
The wanted poster for this crime needs a picture of a computer.
Thieves in Europe have used highly sophisticated software that can mimic voices to make off with 220,000 euros (about $240,000) in what one report labels one of the first publicly reported crimes in which artificial intelligence was used to steal.
The Washington Post recounted the episode in which an executive of a British energy company wired the money to an account in Hungary after being called by someone he thought was his boss. The Wall Street Journal first reported the incident.
The Post’s information was provided by Euler Hermes, a French insurance company, but Euler Hermes would not identify the company that was scammed.
Although the executive thought the request was strange, he complied because he was certain that he was speaking to the boss of the firm’s parent company in Germany, with whom he had spoken before.
“The software was able to imitate the voice, and not only the voice: the tonality, the punctuation, the German accent,” Euler Hermes spokeswoman Antje Wolters said.
Technology has fueled the creation of what is known as “deepfakes,” in which politicians appear to say things they never did, or videos are altered to put a celebrity’s face on someone else’s body. As the technology advances, so does the quality of the fakery, to the point where the person on the other end can be hard-pressed to tell the difference.
“Criminals are going to use whatever tools enable them to achieve their objectives cheapest,” said Andrew Grotto, a fellow at Stanford University’s Cyber Policy Center.
“This is a technology that would have sounded exotic in the extreme 10 years ago, now being well within the range of any lay criminal who’s got creativity to spare.”
Saurabh Shintre, a senior researcher at Symantec, said that the new technology, which may not fool a victim who has time to scrutinize it, is most effective when combined with age-old scamming tricks.
“When you create a stressful situation like this for the victim, their ability to question themselves for a second — ‘Wait, what the hell is going on? Why is the CEO calling me?’ — goes away, and that lets them get away with it,” Shintre said.
That’s what happened in the British case, until the thieves pushed their luck with a second request for cash.
After the second call arrived, the executive called his boss. The game was soon up because he was speaking to his real boss at the same time his fake boss was trying to con more money out of him, Euler Hermes told The Post.
The thieves have not yet been apprehended.
As the 2020 elections approach, Microsoft and other tech giants fear that deepfake videos may be used to sway the contest and are working to develop a simple way to detect them, Reuters reported.
“There’s a tension in the commercial space between wanting to make the best product and considering the bad applications that product could have,” Charlotte Stanton, the director of the Silicon Valley office of the Carnegie Endowment for International Peace, told The Post.
“Researchers need to be more cautious as they release technology as powerful as voice-synthesis technology, because clearly it’s at a point where it can be misused.”