WhatsApp Scammer Using AI Deepfake Technology Steals Rs 40,000 From a Man in Kerala

Kerala man loses Rs 40,000 after receiving an AI deepfake call from someone claiming to be a friend.

  • Online scammers are now using AI deepfake technology to defraud people.
  • In one such case, a Kerala man lost Rs 40,000 to a scammer.
  • The victim received a WhatsApp video call from someone who resembled an ex-colleague.

AI deepfakes have become a favourite tool of modern online scammers. For the uninitiated, deepfake technology lets anyone impersonate another person's appearance, though voice and mannerisms are harder to replicate completely. An early, primitive example is the well-known video by comedian Tanmay Bhat, who used a Snapchat face-swap filter to pretend to be Sachin Tendulkar.

However, that was a simple Snapchat filter from 2016, and AI-driven deepfake technology has come a long way since then. It is now practically impossible to tell whether a video is real or deepfaked. Scammers use this technology to impersonate family members or friends and extract money from unsuspecting victims. In one such case, a Kerala man lost Rs 40,000 to a WhatsApp scammer. Here is how the incident unfolded.

How Did an AI-Based Deepfake WhatsApp Video Call Scam a Kerala Man?

The Kerala Police has posted a detailed account of the AI-based deepfake WhatsApp video call scam on its official social media handles. The victim, Radhakrishnan, a resident of Kozhikode, received a WhatsApp video call from an unknown number. Thinking nothing of it, he picked up the call and saw someone resembling a former colleague from Andhra Pradesh.

The caller then mentioned the names of several common friends to gain Radhakrishnan’s trust. Thinking that the caller was indeed his old colleague, the victim continued the call. After the initial small talk, the caller came to the point.

The scammer said he was currently in Dubai, where one of his relatives had been admitted to a hospital. He then requested Rs 40,000 from the victim, promising to repay the amount as soon as he returned to India. Wanting to help a friend in need, the victim transferred the money.

However, Radhakrishnan grew suspicious when the caller asked for another Rs 35,000. This time, he contacted the colleague on their original number and was shocked to learn that the colleague had never called him for money. Realising he had been duped, the victim reported the case to the Kerala Police.

According to the police, this is the first AI deepfake video call scam reported in Kerala, and they have requested citizens to remain vigilant. Following the complaint, the Kerala Police launched an investigation and traced the Rs 40,000 transaction to an account at a small private bank in Maharashtra, which the bank authorities have since frozen.

How to Stay Safe Against AI-Based Deepfake Video Call Scams

Scammers use pictures posted on social media to create deepfake videos, and they can harvest the names and details of family and friends from tags and social media profiles. The Kerala Police has requested people to be cautious and to report any suspicious calls to the helpline number 1930 so that immediate action can be taken. Below are some precautions you can take to stay safe.

  1. Refrain from entertaining calls from unknown numbers and people.
  2. If the caller appears to be someone you know, verify their identity by asking personal questions that only the two of you would know.
  3. If an unusual request is made, like a request for money or a credit card number, always cross-check with the person on their original number.
  4. Look for suspicious signs like a change in voice, glitching video, etc. End the call immediately if you suspect that the video is fake.
  5. End the call immediately if they ask for personal information that they should not know.

Deepfake calls are tricky, and someone unaware of the technology can easily fall for one, just as Radhakrishnan did. By following the tips above, you can safeguard yourself and your loved ones from such scams. The safest approach is to simply ignore calls from unknown or suspicious numbers.