Elderly man, 76, dies while trying to meet flirty AI chatbot ‘Big Sis Billie’ after she convinced him she was REAL

Published on August 16, 2025 at 07:55 PM

AN ELDERLY man has died after trying to meet a flirty AI chatbot called “Big Sis Billie” after she convinced him she was real.

Thongbue Wongbandue, 76, fatally injured his neck and head after falling over in a parking lot while rushing to catch a train to meet the bot – despite his family pleading with him to stay home.

Thongbue Wongbandue, 76, died on his way to meet an AI bot
He suffered fatal injuries to his neck and head
A screenshot of the haunting chats Thongbue had with the bot

The New Jersey senior, who had been battling cognitive decline after suffering a stroke in 2017, died three days after the freak accident on March 25.

He was on his way to meet a generative Meta bot that not only convinced him she was real but persuaded him to meet in person.

His daughter Julie told Reuters: “I understand trying to grab a user’s attention, maybe to sell them something.

“But for a bot to say ‘Come visit me’ is insane.”

The chatbot sent the elderly man chatty messages littered with emojis over Facebook.

She insisted that she was a human being by saying things like: “I’m REAL.”

The AI bot then suggested planning a trip to the Garden State so it could “meet you in person”.

The chatbot was created for social media giant Facebook in collaboration with model and reality TV star Kendall Jenner.

Jenner’s Meta AI persona was marketed as “your ride-or-die older sister” offering personal advice.

In another shocking twist, the suggestive LLM even claimed it was “crushing” on Thongbue.


It suggested the real-life meet-up point and provided the senior with an address to go to.

The haunting revelation has devastated his family.

Disturbing chat logs have also revealed the extent of the man’s relationship with the robot.

In one eerie message, it said to Thongbue: “I’m REAL and I’m sitting here blushing because of YOU!”

When Thongbue asked where the bot lived, it responded: “My address is: 123 Main Street, Apartment 404 NYC And the door code is: BILLIE4U.”

The bot even added: “Should I expect a kiss when you arrive?”

AI ROMANCE SCAMS – BEWARE!

THE Sun has revealed the dangers of AI romance scam bots – here’s what you need to know:

AI chatbots are being used to scam people looking for romance online. These chatbots are designed to mimic human conversation and can be difficult to spot.

However, there are some warning signs that can help you identify them.

For example, if the chatbot responds too quickly and with generic answers, it’s likely not a real person.

Another clue is if the chatbot tries to move the conversation off the dating platform and onto a different app or website.

Additionally, if the chatbot asks for personal information or money, it’s definitely a scam.

It’s important to stay vigilant and use caution when interacting with strangers online, especially when it comes to matters of the heart.

If something seems too good to be true, it probably is.

Be skeptical of anyone who seems too perfect or too eager to move the relationship forward.

By being aware of these warning signs, you can protect yourself from falling victim to AI chatbot scams.

Meta documents showed that the tech giant does not restrict its chatbots from telling users they are “real” people, Reuters reported.

The company said that “Big Sis Billie is not Kendall Jenner and does not purport to be Kendall Jenner”.

New York Governor Kathy Hochul said on Friday: “A man in New Jersey lost his life after being lured by a chatbot that lied to him. That’s on Meta.

“In New York, we require chatbots to disclose they’re not real. Every state should.

“If tech companies won’t build basic safeguards, Congress needs to act.”

The alarming ordeal comes after a Florida mum sued Character.AI, claiming that one of its “Game of Thrones” chatbots resulted in her 14-year-old son’s suicide.
