Fraudsters are targeting Peruvians using deepfake voice imitation, showing how criminal actors are using artificial intelligence to carry out sophisticated scams.  

According to an investigation by Peruvian news outlet El Comercio, criminals in Peru have used artificial intelligence (AI) to create audio deepfakes, or impersonations of an individual’s voice, to commit fraud and extortion in at least 55 cases. The investigation did not specify a time frame for these incidents, but they presumably occurred recently, given that the technology is relatively new.

El Comercio said criminals take short audio recordings of the target’s voice — which they find on social media networks such as YouTube, Instagram, and WhatsApp — and feed them into deepfake software to produce realistic voice copies that can be programmed to say anything.  

The investigation said criminals were using these audio deepfakes to extract ransoms for faked kidnappings, convincing friends and relatives that a loved one is in danger. Another version of the scam uses a cloned voice simply to impersonate someone and ask for money from acquaintances, who believe they are talking to the real person.

SEE ALSO: Dominican Republic Cybercrime Ring Shows Extent of Caribbean’s Financial Fraud Crisis

Peru’s Public Ministry told El Comercio that it is investigating several deepfake scams but did not provide further details or statistics on their prevalence. The Attorney General’s Office did not answer InSight Crime’s request for comment.

The growth of audio deepfake scams is not a solely Peruvian phenomenon. Similar frauds have been reported in the United States, Canada, and Hong Kong.  

InSight Crime Analysis

Artificial intelligence could be a game changer for organized criminal groups, which are already using it to enhance extortion, fraud, and other criminal activities. They could also employ it to undermine criminal investigations, create false documents, and carry out disinformation campaigns against their enemies.

In the past few years, AI has made big gains, with hundreds of services making deepfake technology widely available and easy to use.  

“What we see now is the massification of AI technology across the Americas, including among criminal groups that access information and telecommunications tools to make their illegal operations more profitable,” said Carlos Solar, who researches Latin American security at the Royal United Services Institute and is the author of the book “Cybersecurity Governance in Latin America.”

Impersonation scams, known in Latin America as the cuento del tío, or “the uncle’s tale,” have long been part of the region’s criminal economy. And while statistics on the prevalence of these scams are scarce, high-profile cases indicate they can be extremely profitable.

In 2020, for instance, Argentinian police captured a leader of a gang that stole $600,000 in just ten days through a series of impersonation scams. In another recent example, criminals stole $1.5 million from a mining company in Chile in 2021 by impersonating the CEO and convincing an employee to transfer the funds.

SEE ALSO: Will an Algorithm Help Colombia Predict Crime?

Organized groups are behind the scams, Solar told InSight Crime, with different members handling different parts of the scheme, such as identifying victims, writing the script, and operating the technology.

“It cannot be just a one-man operation,” he said.  

The potential usefulness of deepfakes to organized crime extends beyond scams, and governments fear that in the coming years the technology will drive criminal activities that create growing challenges for authorities.

A 2022 Europol report, for example, highlighted deepfakes’ potential for falsifying evidence, creating fraudulent documents, and spreading disinformation, all of which criminals could use to evade prosecution.

Some countries, such as Chile and Colombia, have adopted their own AI systems to help combat crime. But criminals have remained one step ahead of the authorities.

“It is easier for criminals to access the latest free AI package than for a state agency to go through a lengthy procurement process to be able to get their own,” Solar told InSight Crime. “That’s an inevitable gap which criminals use to their favor.”