CyberSolace has been writing about the risks of deepfake impersonation of the human voice for a few years now. The storm winds of this threat continue to gather force, as highlighted in a recent article by The Washington Post.

The Washington Post article, published on March 5th, 2023, reports on the rising trend of scams that use artificial intelligence (AI) to mimic the voices of loved ones in financial distress. In essence, scammers use deepfake technology and voice-spoofing software to manipulate audio and create convincing imitations of individuals’ voices. The article discusses how fraudsters use these technologies to target vulnerable individuals, such as the elderly or those suffering from dementia, to extract money or personal information. Scammers reportedly use these fake voices to impersonate family members or friends, creating a sense of urgency and desperation that pushes victims to send money or divulge sensitive information.

The Federal Trade Commission (FTC), too, has sounded the alarm, indicating a significant increase in imposter scams using voice-mimicking technology. It reported that in 2022 alone, more than 5,000 victims were scammed out of $11 million over the phone. Last month, the FTC told companies, “You need to know about the reasonably foreseeable risks and impact of your AI product before putting it on the market.”

Read the full article by clicking the button below.