Article Details

Scrape Timestamp (UTC): 2025-12-05 23:28:11.073

Source: https://www.theregister.com/2025/12/05/virtual_kidnapping_scam/

Original Article Text


Crims using social media images, videos in 'virtual kidnapping' scams

Proof of life? Or an active social media presence?

Criminals are altering social media and other publicly available images of people to use as fake proof-of-life photos in "virtual kidnapping" and extortion scams, the FBI warned on Friday.

In these truly heinous extortion attempts, miscreants contact victims via text message and claim to have kidnapped a loved one. Some of these are entirely fake and don't involve any abducted people. However, the FBI's Friday alert also warns about posting real missing-person info online, indicating that scammers may also be scraping these images and contacting the missing person's family with fake information.

The moves are similar to the age-old grandparent scams, in which fraudsters call seniors, impersonate their children or grandchildren, and claim to be in great physical danger unless the grandparent sends money ASAP. The FBI classifies this type of fraud as "emergency scams" [PDF] and says it received 357 complaints about them last year, costing victims $2.7 million.

This newer version, however, adds a 2025 twist: in addition to sending a text, the criminals typically send what appears to be a real image or video of the "kidnapped" person to show proof of life. Plus, to increase the pressure on victims to pay, the scammers often "express significant claims of violence towards the loved one if the ransom is not paid immediately," the federal cops said.

The FBI didn't immediately respond to The Register's questions, including how many complaints and/or cases of these fake kidnappings it has received.

It's easy enough to find photos and videos of people – and connect potential victims to family and friends – via social media, and then use AI tools to doctor this footage, or create entirely new images or videos.
However, these proof-of-life images, "upon close inspection often [reveal] inaccuracies when compared to confirmed photos of the loved one," the FBI notes. For example, the supposed kidnap victim may be missing a tattoo or scar, or the body proportions may be slightly off.

"Criminal actors will sometimes purposefully send these photos using timed message features to limit the amount of time victims have to analyze the images," the alert adds.

To protect yourself – and your family and friends – from falling victim to these types of scams, the FBI recommends not providing personal information to strangers while traveling, and setting a code word that only you and your loved ones know. Also, screenshot or record proof-of-life scam images if possible, and report any incidents to the FBI's Internet Crime Complaint Center at www.ic3.gov. Include as much information as possible about the interaction, including phone numbers, payment information, text and audio communications, and photos or videos. And always attempt to contact the supposed victim before paying any ransom demand.

Criminals are also using fake images and videos to scam corporations, typically with an AI boost. The technique is perhaps most prevalent in the ongoing fake IT worker scams that have hit companies across multiple sectors, including one high-profile scheme that generated at least $88 million over about six years, the Department of Justice said last year.

These scammers largely originate from North Korea, or at least funnel money back to Pyongyang after fraudulently obtaining a remote worker job, generally in a software development role. Increasingly, they also rely on AI tools not only to write resumes and cover letters, but also to help with video call interviews, using software that changes the interviewee's appearance in real time.

Daily Brief Summary

CYBERCRIME // FBI Warns of AI-Enhanced Virtual Kidnapping and Extortion Scams

The FBI has issued a warning about criminals using altered social media images in virtual kidnapping scams, demanding ransoms from victims' families.

Scammers claim to have kidnapped loved ones and send doctored images or videos as fake proof of life, leveraging AI tools to enhance credibility.

The FBI received 357 complaints of such "emergency scams" last year, resulting in $2.7 million in losses, with tactics now evolving to include AI-generated content.

Criminals often use social media to gather images and personal information, making it easier to target victims and their families.

The FBI advises against sharing personal details with strangers and recommends setting a code word known only to family members to verify authenticity.

Victims are encouraged to report incidents to the FBI's Internet Crime Complaint Center, providing detailed information to aid investigations.

Similar scams targeting corporations involve fake IT workers using AI to alter their appearance during video interviews, with links to North Korean operations.