
They Clone the Voices of Your Loved Ones to Deceive You: Do Not Trust Everything You Hear in Phone or Voice Calls

Artificial intelligence (AI) presents a dual challenge: while it offers real benefits for humanity, it also gives organized crime new ways to forge text, video, images, voices, and phone calls.

Criminals are leveraging this technology to replicate the voices of familiar people and run sophisticated scams. They can mimic the voice of a family member, such as a mother, to ask the victim to call a phone number or to carry out a financial transaction. These schemes are dangerous because distinguishing an authentic voice from a counterfeit one is becoming increasingly difficult, undermining trust and exploiting personal relationships for fraudulent gain.
They clone the voice of your mother, partner or friends to deceive you. Don't trust what they tell you


Recent warnings from security experts highlight a rise in fraud cases involving voice cloning. In one instance, a user received an anonymous call and, on answering, heard her husband's voice telling her:
"Hello! I can't reach you, send me a message at this number [$]"
Although the message appeared to come from her partner, she ignored the call and confirmed that he had not actually contacted her. The case illustrates how criminals are using voice-cloning technology in fraudulent operations, which are increasingly prevalent.

Artificial intelligence now makes it possible to generate speech in the voice of virtually anyone, not just celebrities but people we know personally. A sample of a person's voice can be captured and used to produce a recording saying whatever the attacker desires, a concern that reportedly worries even the White House security chief.

Criminals have taken note of this capability, and voice-based fraud may soon become their preferred method. This type of fraud seems more credible to victims because it allows convincing impersonation of a bank employee or a representative of a financial institution. While the technology is not yet perfect, the realism already achieved in these recordings poses significant risks.

The larger problem is that criminals often hold personal information about us, which lets them execute fraudulent operations convincingly. Whether through our social networks or through the massive data breaches they gain access to, they can learn who our close contacts are.