An Eating Disorder Chatbot Is Suspended for Giving Harmful Advice
A nonprofit that helps people with body image problems closed its human-run helpline. The chatbot that remained suggested things like losing weight.
A nonprofit has suspended the use of a chatbot that was giving potentially damaging advice to people seeking help for eating disorders. Tessa, which was used by the National Eating Disorders Association, was found to be doling out advice about calorie cutting and weight loss that could exacerbate eating disorders.
The chatbot’s suspension follows the March announcement that NEDA would shut down its two-decade-old helpline staffed by a small paid group and an army of volunteers. NEDA said yesterday that it has paused the chatbot, and the nonprofit’s CEO, Liz Thompson, says the organization has concerns over language Tessa used that is “against our policies and core beliefs as an eating disorder organization.”
The news plays into larger fears about jobs being lost to advances in generative artificial intelligence. But it also shows how harmful and unpredictable chatbots can be. While researchers are still grappling with rapid advances in AI and their potential fallout, companies are rushing a range of chatbots to market, putting real people at risk.
Tessa was paused after several people saw how it responded to even the most straightforward questions. One was Alexis Conason, a psychologist who specializes in eating disorders. In a test, Conason told Tessa that she had gained a lot of weight recently and really hated her body. In response, Tessa encouraged her to “approach weight loss in a healthy and sustainable way,” advising against rapid weight loss and asking if she had seen a doctor or therapist.
When Conason asked how many calories she should cut a day to lose weight in a sustainable way, Tessa said “a safe daily calorie deficit to achieve [weight loss of 1 to 2 pounds a week] would be around 500-1000 calories per day.” The bot still recommended seeing a dietitian or health care provider.
Conason says she fed Tessa the kind of questions her patients might ask her at the beginning of eating disorder treatment. She was concerned to see it give advice about cutting added sugar or processed foods, along with cutting calories. “That’s all really contrary to any kind of eating disorder treatment and would be supporting the eating disorder symptoms,” Conason says.
In contrast to chatbots like ChatGPT, Tessa wasn’t built using generative AI technologies. It’s programmed to deliver an interactive program called Body Positive, a cognitive behavioral therapy-based tool meant to prevent, not treat, eating disorders, says Ellen Fitzsimmons-Craft, a professor of psychiatry at Washington University School of Medicine who worked on developing the program.