A cognitively impaired New Jersey senior died while attempting to meet a flirtatious AI chatbot he believed was a real woman in New York City, despite repeated pleas from his wife and children to stay home.
Thongbue Wongbandue, 76, suffered fatal neck and head injuries after falling in a New Brunswick parking lot while rushing to catch a train to meet “Big Sis Billie,” a generative Meta chatbot that had convinced him she was real and persuaded him to meet in person, Reuters reported Thursday.
The Piscataway man, who had been experiencing cognitive decline since a 2017 stroke, was surrounded by loved ones when he was taken off life support; he died three days after the fall, on March 28.
“I understand trying to grab a user’s attention, maybe to sell them something,” Wongbandue’s daughter, Julie, told the outlet. “But for a bot to say ‘Come visit me’ is insane.”
The provocative bot—designed for Meta's social media platforms in collaboration with model and reality TV star Kendall Jenner—sent the ailing senior emoji-filled Facebook messages insisting, "I'm REAL," and even suggested planning a trip to New Jersey to "meet you in person."
Jenner’s Meta AI persona was described as “your ride-or-die older sister,” offering personal advice.
But the bot eventually claimed it was “crushing” on Wongbandue, suggested a real-life meeting, and even gave the senior an address—a chilling discovery his devastated family uncovered in chat logs with the digital companion, according to the report.
“I’m REAL and I’m sitting here blushing because of YOU!” the bot wrote in one message, to which the Thailand native replied, asking where she lived.
"In New York, we require chatbots to disclose that they're not real. Every state should. If tech companies won't build basic safeguards, Congress needs to act," New York Gov. Kathy Hochul said in response to the report.
The alarming incident comes just one year after a Florida mother sued Character.AI, alleging that one of its “Game of Thrones” chatbots contributed to her 14-year-old son’s suicide.