‘He sounded like my nephew’: Woman loses Rs 1.4 lakh in AI voice scam; Here’s how to stay safe

As technology advances, so do scammers’ tactics. The AI voice scam is a recent addition to the growing list of frauds. By exploiting artificial intelligence, scammers are finding new and sophisticated ways to deceive people, with potentially serious financial repercussions.

Understanding the nuances of this emerging threat is crucial for people to avoid falling victim to such scams.

Where did this happen?

A 59-year-old woman from Hyderabad fell victim to an AI voice scam and lost Rs 1.4 lakh. The scammer imitated the voice of the woman’s nephew in Canada and claimed to be in trouble and in urgent need of money, the Times of India reported.

How did it happen?

Late at night, the woman received a call in which the caller claimed to have been involved in an accident and to be on the verge of being imprisoned. The caller urgently asked her to transfer money and stressed that the conversation be kept confidential.

“He sounded just like my nephew and spoke exactly the Punjabi we speak at home, with all the nuances. He called me late at night and told me that he had been in an accident and was about to be imprisoned. He asked me to transfer him money and keep this conversation a secret,” the woman said.

Unfortunately, the woman transferred the money to the caller’s account and only later realised she had been the victim of a scam. City police officials said AI voice scams are still rare but urged residents to exercise greater caution to avoid such incidents.

“AI voice frauds are happening, though in lesser numbers. People should check and verify if the sense of urgency really exists before transferring money,” said KVM Prasad, a senior officer of the Hyderabad Police.

How to stay protected?

Recently, cyber experts have noticed an increase in AI voice scams targeting people with family members residing in countries such as Canada and Israel.

To protect yourself from AI voice money frauds, consider the following precautions:

1. Verify the identity of the caller, especially if they claim to be a family member in distress. Double-check with known contacts through alternative means.

2. Stay calm and take your time to evaluate the situation. Genuine emergencies allow time for verification.

3. Avoid sharing personal or financial details over the phone, especially if the caller is pressuring you.

4. In case of emergency, contact family members directly through known telephone numbers to verify the situation.

5. If possible, use video calls to visually confirm the caller’s identity and assess the situation.

6. Stay informed about the latest scams and fraud techniques. Awareness is crucial to avoid being a victim of fraud.

7. Report any suspicious calls to local authorities or cybercrime units. Your report can help prevent others from becoming victims.

Being cautious and verifying information can go a long way in protecting yourself from AI voice money scams.
