
Criminals use AI to create fake audio: they extort friends and family members

by Trend News
9 minute read


Artificial intelligence (AI) advances by leaps and bounds every day, presented as a tool designed to make its users’ lives easier. However, the legal vacuum surrounding these new technologies has allowed criminals to use them to commit offenses that reach beyond the digital world.

Against this backdrop, a new problem has emerged: criminals use AI to imitate people’s voices, pretending that they are in danger, in order to extort their family members or loved ones. In countries like Mexico, the practice is becoming more common, and cases keep increasing as the technology develops, while there are still no laws in place to penalize these actions.

Keep reading:

AI-generated nudes, the new form of digital violence stalking women: “It’s my face, but it’s not my body”

Diego “N” arrested: the student charged with using AI to edit photos of his classmates and selling them

“It was her voice, it was identical”: Natalia handed over money thinking she was helping her sister

Criminals use AI to commit their crimes in the digital space. Photo: Freepik.

“I need you to deposit money for me,” a young woman was told. Natalia quickly recognized the voice as that of her sister, Alexis, who had traveled to Morelos a day earlier with her friends; the woman listened to the voice note she received via WhatsApp more than five times to confirm that it was a message from her sister asking her to make a transfer to help resolve a financial problem she was supposedly caught up in.

Natalia recalled that panic ran through her body, and that the telephone number from which she received the audio message was not her sister’s, although the voice was identical. “They sent me the (voice) message from a number I didn’t have saved. I did think she was asking me for a deposit because she had a problem and needed (money),” the woman revealed in conversation with El Heraldo Digital. Frightened by what Alexis might be going through, Natalia made a transfer of just 300 pesos: “I didn’t have more money, so I told her, ‘I can only send you 300; ask your friends to lend you the rest and I’ll pay them back here’ (in Mexico City),” she added.

Natalia sent money to the criminals, thinking she was helping her sister. Photo: Freepik.

The young educator confessed that she had distrusted the audio, because it was sent from a number she didn’t know, but fearing that her sister was in trouble, she made the payment. “Yes, I had doubts, but in the (voice) message she told me she was about to lose her cell phone and that it was urgent. So I didn’t look into it, and it really was her voice, it was identical,” said Natalia. However, after she completed the deposit, the WhatsApp contact that had sent the audio blocked her and deleted the messages it had sent, and she began to suspect that something was wrong.

Her sister confirmed that she had been the victim of extortion

“They blocked me, they deleted the conversation, and I panicked. I started calling her friends and they didn’t answer. After a while they responded; I asked them about her and they told me she was fine, that nothing had been stolen from her,” Natalia told this outlet. She added that when she finally managed to talk to her sister, Alexis confirmed that she had been the victim of an extortion attempt, since Alexis was never in danger and had never lost her cell phone: “She told me it wasn’t true, that she was fine, that she had never been in that situation. It didn’t affect me much, only the 300 pesos; something inside me said ‘don’t fall for it, it’s not true,’ and that’s why I didn’t transfer the 5,000 pesos they were asking for.”

Natalia was blocked after being extorted. Photo: Freepik.

On this point, information technology engineer Nancy Salazar explained that it is now increasingly common for criminals to move into the digital space and commit their crimes using tools such as artificial intelligence (AI). In an interview with El Heraldo Digital, the founder of Tekis.services said that creating fake audio or images has become a widespread practice among cybercriminals, who most frequently produce fake audio clips, photographs, and even videos with the aim of extorting their victims.

What are deepfakes and how are they made?

According to the expert, these types of crimes are relatively “easy” to commit, since the internet offers a wide variety of AI tools that make it simple for users to create fake audio, images, or videos. “Part of the design and rollout of these tools is that they are extremely simple to use; the user experience is very straightforward. In the case of voice cloning, there are platforms where you only need to upload audio of the voice you want to replicate, and the software analyzes the audio, the voice patterns, the tone, and the rhythm, and from that it generates the clone. If you know how to attach a file to an email, you can easily use these kinds of tools.”
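
To illustrate how low that barrier is, here is a minimal sketch of the workflow the engineer describes, using the open-source Coqui TTS library as one example of such a tool; the model name follows that project’s documentation, and the file names and spoken text are placeholders.

```python
# Minimal sketch of off-the-shelf voice cloning, for illustration only.
# Assumes the open-source Coqui TTS package (pip install TTS); file names
# are placeholders. Legitimate uses require the speaker's consent.
from TTS.api import TTS

# Load a multilingual model that supports zero-shot voice cloning.
tts = TTS("tts_models/multilingual/multi-dataset/xtts_v2")

# A few seconds of reference audio is all the model needs to imitate
# the voice's tone and rhythm in any text it is asked to speak.
tts.tts_to_file(
    text="Hi, it's me. Call me back when you can.",
    speaker_wav="reference_voice_sample.wav",  # placeholder reference clip
    language="en",
    file_path="cloned_voice_note.wav",
)
```

The point is exactly what Salazar describes: the user supplies a short sample, and the software handles the analysis of patterns, tone, and rhythm on its own.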

Making fake audio, images or videos is relatively simple. Photo: Freepik.

To clone a voice, criminals need only a video or a photograph of the person of interest, and once they have an audio fragment, artificial intelligence takes care of the rest. “These audio clips are taken from the videos you upload to your social networks, such as TikTok, Facebook, Instagram, and others. All that is needed is a tool to extract the audio, and with that they make these replicas. They can also clone your image itself, your body movements, your face, the way you move, to make it appear that it really is you,” Nancy warned.
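
As a concrete illustration of how trivial that extraction step is, the short sketch below pulls the audio track out of a saved video clip using Python and the widely available ffmpeg command-line tool; the file names are placeholders.

```python
# Sketch: extracting the audio track from a downloaded video clip.
# Assumes ffmpeg is installed and on the PATH; file names are placeholders.
import subprocess

subprocess.run(
    [
        "ffmpeg",
        "-i", "downloaded_clip.mp4",  # e.g. a clip saved from a public profile
        "-vn",                        # drop the video stream
        "-ac", "1",                   # mix down to mono
        "-ar", "16000",               # 16 kHz, a common rate for voice models
        "voice_sample.wav",
    ],
    check=True,
)
```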

Artificial intelligence regulations are advancing slowly

Nancy stressed that this is not a new problem, since these schemes emerged alongside the development of AI; the engineer recalled that the first victims were politicians and celebrities, such as former United States president Barack Obama, whose image and voice were used to generate and spread a fake video in which he appeared to deliver a speech. The specialist lamented that AI regulation moves very slowly compared with the pace of new technologies: “These kinds of regulations need to be accelerated so that they are implemented in step with the technology. They could include laws that penalize the malicious use of artificial intelligence and the creation of false content, as well as transparency rules, such as labeling or making it visible that a piece of content was made with artificial intelligence,” said the founder of Tekis.services.

Celebrities and politicians were the first victims. Photo: Freepik.

How to identify deepfakes and what to do if you are a victim of AI extortion?

With the aim of ensuring that fewer people fall victim to cybercriminals, Nancy shared a series of recommendations for anyone who receives voice messages, images, or videos of their loved ones in alarming situations. These tips will help you avoid extortion or a scam:

  • In images or videos, the person’s eyes have an unnaturally fixed gaze and rarely blink.
  • Abnormal edges on the face, mainly around the ears, nose, and eyes.
  • Inconsistencies in the lighting, or extra shadows.
  • In audio, pay attention to the tone of the voice, as well as to the person’s natural speech patterns and habitual filler words (a rough automated check is sketched after this list).
  • If you are asked for money or confronted with an alarming situation, call the person who is supposedly in danger directly to verify the information.
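
For readers comfortable with code, one rough automated aid is to compare the suspicious voice note against a recording you know is genuine using speaker embeddings; the sketch below uses the open-source Resemblyzer library, with placeholder file names and an illustrative, uncalibrated threshold. An important caveat: a well-made clone is designed to score high on exactly this kind of similarity, so a match proves nothing; only a clear mismatch is informative, and the direct phone call above remains the real safeguard.

```python
# Sketch: comparing a suspicious voice note against a known-genuine
# recording via speaker embeddings. Assumes the Resemblyzer package
# (pip install resemblyzer); file paths are placeholders.
from pathlib import Path

import numpy as np
from resemblyzer import VoiceEncoder, preprocess_wav

encoder = VoiceEncoder()

# Embed a recording you know is real and the note you just received.
known = encoder.embed_utterance(preprocess_wav(Path("known_genuine_sample.wav")))
suspect = encoder.embed_utterance(preprocess_wav(Path("suspicious_voice_note.wav")))

# The embeddings are L2-normalized, so a dot product is cosine similarity.
similarity = float(np.dot(known, suspect))
print(f"Speaker similarity: {similarity:.2f}")

# A low score suggests a different (or crudely faked) speaker; a high
# score does NOT prove authenticity, because good clones imitate the voice.
if similarity < 0.75:  # illustrative threshold, not a calibrated value
    print("The voices do not match well; treat the message as suspect.")
```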
