Elderly residents of the United States and Canada are facing a wave of scams in which callers posing as distressed relatives ask them for financial help.



What sets these calls apart is that the voices sound clear, natural, and very convincing. Mid-conversation, the fraudsters can change tone and offer new arguments without arousing suspicion, because it is not actually them speaking: an AI voice-cloning system is generating the speech.


Two stories illustrate the scheme. In the first, 73-year-old Ruth Card spoke to someone claiming to be her grandson Brandon, who asked for a modest sum for police bail.

Card was stopped from sending the money by a bank employee who had already encountered such crimes. In the case of Benjamin Perkin's elderly parents, things went worse: the scammers deliberately suggested that the couple hang up and call their son back themselves.


When they heard the same voice on the return call, they relaxed, and in the calls that followed they obediently carried out the criminals' instructions.


In all these cases, the victims say they noticed no difference at all between their relatives' voices and the imitation. Strictly speaking, there is none, since the AI-generated speech is built from recordings of the original human voices.


An unnamed American journalist proved how effective the method is by breaking into his own bank account: he used one AI to fool another AI that was verifying his voice biometrics.


The main danger is accessibility. Just a couple of years ago, cloning a voice required a large number of original audio recordings and a lot of time to process them.


Today, for a token $5 subscription, such services can produce a near-perfect voice clone from just a 30-second file, and any random post on social media can serve as the source. They are available to everyone, which gives fraudsters an enormous field of opportunity.