TPG Online Daily

AG: AI-Generated Scams are Widespread & Tricky to Spot

On May 31, California Attorney General Rob Bonta issued a consumer alert warning Californians to beware of scams that use artificial intelligence (AI) or “deepfakes” to impersonate government officials, distressed family members, or other trusted figures.

Rob Bonta

“Scammers are often quite literally in our pockets, just a phone call, social media message, or text away,” said Bonta. “AI and other novel and evolving technologies can make scams harder to spot. Knowing what to look for is an important way to keep consumers safe against these tactics. I urge Californians to take practical steps to guard against being victimized by scammers, including talking to friends and family who may be unaware of these dangers.”

New technology, such as AI and deepfake video or voice manipulation, makes it easier for scammers to create sophisticated impersonations and to make more convincing requests for money or personal information. Scammers can use information available on the internet, including images and audio from social media, to convince people that the voice on the other end of the call is someone they can trust. Bad actors can clone a person's voice through AI technology using clips of audio taken from that person's social media account(s) and can refer to personal information about the victim found on the internet, making the scam appear credible.

For example, a troubling new scam targets parents by sending them AI voice impersonations of their child begging for help. Recent reports have included parents receiving a phone call using the cloned voice of their child claiming to have been badly injured in a car accident or in need of money to pay bail. Grandparents are often the target of scams claiming that their grandchild is in trouble and in need of money. In 2023, the FBI received victim complaints regarding grandparent scams that resulted in nearly $1.9 million in losses.

Scammers often target consumers on their phones. In 2023, robocalls and robotexts resulted in more than $1.2 billion in reported losses nationwide. And most other methods of contact used by scammers, including email, social media, and the internet, are also accessible by smartphone. These phone-based scams are designed to steal money, identities, or passwords, or to urgently demand payment through cash or gift cards. Scams can result in significant financial losses, ruined credit scores, and compromised security clearances for service members and others.

While younger adults reported losing money to fraud more often in 2023 than older adults, older adults who lose money tend to lose larger amounts.
Imposter scams were the most commonly reported fraud in 2023. These imposter scams often involve a bad actor pretending to be a bank’s fraud department, the government, a well-known business, a technical support expert, or a distressed relative, such as a kidnapped child. Other common phone-based scams include calls related to medical needs and prescriptions, debt reduction, utilities, bank fraud warnings, warranties, or IRS notices.

These scams can also spread misinformation about elections or political candidates. For example, in January residents of New Hampshire received scam election robocalls that allegedly used AI to impersonate the president and discourage voters from participating in the New Hampshire primary.

Protect Yourself

Here are some tips to protect yourself and those you know from phone-based scams.

In January, Bonta called on the FCC to address the threat of AI-generated robocalls, and the FCC subsequently declared voice-cloning technology used in common robocall scams illegal under the Telephone Consumer Protection Act.

In February, Bonta joined a coalition of 51 bipartisan attorneys general in issuing a warning letter to a company that allegedly sent New Hampshire residents scam election robocalls during the New Hampshire primary election. The calls allegedly used AI to impersonate the president and discourage voters from participating in the primary.
