CyberSolace has been writing about the risks of deepfake voice impersonation for several years now. The storm winds of this threat continue to gather force, as highlighted in a recent article by The Washington Post.
The Washington Post article, published on March 5th, 2023, reports on a rising trend of scams that use artificial intelligence (AI) to mimic the voices of loved ones in apparent financial distress. In essence, fraudsters use deepfake technology and voice-spoofing software to manipulate audio and create convincing imitations of individuals’ voices. The article discusses how they target vulnerable people, such as the elderly or those suffering from dementia, to extract money or personal information. Scammers reportedly use these fake voices to impersonate family members or friends, manufacturing a sense of urgency and desperation and pressing victims to send money or divulge sensitive information.