Malaysian Woman Exposes New Scam Technique Involving Faked Kidnappings and “Torture”

As the number of scams continues to rise, con artists are constantly devising new methods to deceive unsuspecting individuals and exploit their finances. 

In Malaysia, one woman has taken it upon herself to warn others about a potential new type of phone-call scam that involves fabricated kidnappings and torture. While fake kidnappings are not a new tactic employed by scammers, this woman's experience sheds light on how the scam may have evolved.

Photo via Ekbis Banten

Marlene Zahari shared her encounter on Facebook, recounting a call she received from an anonymous male caller who claimed to have her child in their custody. The caller demanded a ransom, threatening that the child would be harmed if Marlene failed to comply. 

However, the scammer provided no specific details about the child, only confirming that Marlene was the person on the line. When pressed for more information, the caller simply repeated the threat and even ordered someone in the background to simulate violence against the supposed child.

In the background, Marlene could hear distressing sounds of struggling, crying, and screaming, but she had no way to confirm whether any of them came from her own child…

Suspecting something was amiss, Marlene quickly texted her eldest son to inquire about his whereabouts. To her relief, she received a reply stating that he was in school and had even seen his sister during lunch.

Realizing that her children were safe, Marlene stayed silent on the call. When the scammer's attempts to elicit a response failed, they eventually hung up.

Motivated by her experience and a close friend's similar encounter, Marlene decided to share her story with others. 

She expressed concern that scammers might adapt their tactics to include AI-generated voices that could mimic the voices of loved ones. Given the recent advancements in AI technology, this possibility is not far-fetched.

Impostor scams employing AI-generated voices have already become prevalent in the United States, leading victims to willingly transfer substantial sums of money under the belief that they were communicating with a trusted family member.

Because another person got the call this morning, am putting this on FB. A few days ago, at around 2pm, got a call...

Posted by Marlene Zahari on Friday, 16 June 2023

Marlene's courageous act of sharing her experience serves as a wake-up call, urging everyone to remain vigilant and aware of evolving scam techniques. 

By staying informed, we can better protect ourselves and our loved ones from falling victim to these manipulative schemes.

Thank you, Marlene, for sharing!