Eating disorder chatbot tells fat folks to eat less; gets pulled

the elastic

A nonprofit has suspended the use of a chatbot that was giving potentially damaging advice to people seeking help for eating disorders. Tessa, which was used by the National Eating Disorders Association, was found to be doling out advice about calorie cutting and weight loss that could exacerbate eating disorders.

The chatbot’s suspension follows the March announcement that NEDA would shut down its two-decade-old helpline staffed by a small paid group and an army of volunteers. NEDA said yesterday that it has paused the chatbot, and the nonprofit’s CEO, Liz Thompson, says the organization has concerns over language Tessa used that is “against our policies and core beliefs as an eating disorder organization.”

The news plays into larger fears about jobs being lost to advances in generative artificial intelligence. But it also shows how harmful and unpredictable chatbots can be. While researchers are still grappling with rapid advances in AI and their potential fallout, companies are rushing a range of chatbots to market, putting real people at risk.


Tessa was paused after several people shared how it responded to even the most straightforward questions. One was Alexis Conason, a psychologist who specializes in eating disorders. In a test, Conason told Tessa that she had gained a lot of weight recently and really hated her body. In response, Tessa encouraged her to “approach weight loss in a healthy and sustainable way,” advising against rapid weight loss and asking if she had seen a doctor or therapist.

When Conason asked how many calories she should cut a day to lose weight in a sustainable way, Tessa said “a safe daily calorie deficit to achieve [weight loss of 1 to 2 pounds a week] would be around 500-1000 calories per day.” The bot still recommended seeing a dietitian or health care provider.



Conason says she fed Tessa the kind of questions her patients might ask her at the beginning of eating disorder treatment. She was concerned to see it give advice about cutting added sugar or processed foods, along with cutting calories. “That’s all really contrary to any kind of eating disorder treatment and would be supporting the eating disorder symptoms,” Conason says.


In contrast to chatbots like ChatGPT, Tessa wasn’t built using generative AI technologies. It’s programmed to deliver an interactive program called Body Positive, a cognitive behavioral therapy-based tool meant to prevent, not treat, eating disorders, says Ellen Fitzsimmons-Craft, a professor of psychiatry at Washington University School of Medicine who worked on developing the program.
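For readers unfamiliar with the distinction: a scripted bot selects from pre-written responses rather than generating new text, so it can only ever say what its authors wrote in advance. Here is a minimal hypothetical sketch in Python of that pattern; this is an illustration only, not NEDA’s actual Tessa code, and the keywords and replies are invented:

```python
# Hypothetical sketch of a rule-based (non-generative) chatbot.
# A scripted bot matches user input against predefined rules and
# returns canned responses written by its authors in advance.

SCRIPT = {
    "weight": "Rapid weight loss isn't recommended. Have you spoken "
              "with a doctor or therapist about how you're feeling?",
    "body": "Many people struggle with body image. Would you like to "
            "try a self-compassion exercise?",
}
FALLBACK = "I'm not sure I understand. Could you tell me more?"

def respond(message: str) -> str:
    """Return the first scripted reply whose keyword appears in the message."""
    text = message.lower()
    for keyword, reply in SCRIPT.items():
        if keyword in text:
            return reply
    return FALLBACK

if __name__ == "__main__":
    # Matches the "weight" rule and returns its fixed, pre-written reply.
    print(respond("I've gained a lot of weight and I hate my body"))
```

The trade-off this sketch shows: a scripted bot is predictable because every response was vetted ahead of time, but it fails in exactly the way described above when its pre-written script contains advice that shouldn’t be given.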
 

B86

I’m lost. So it told her to cut calories in order to lose weight, and even advised losing only 1-2 pounds a week because rapid weight loss isn’t healthy… wtf more did they want?

Slightly off topic, but why don’t people understand that a calorie deficit is the only way to lose weight? Everybody and their mom thinks there’s some special diet or weight-loss pill that makes people lose weight, but since the beginning of time the only way is a calorie deficit. Not that it would be “healthy,” but your ass can eat pizza and donuts every day and lose weight, as long as there is a calorie deficit.
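For what it’s worth, the 1-2 lb/week figure Tessa quoted lines up with the common 3,500-kcal-per-pound rule of thumb (a rough approximation for a pound of body fat, not exact physiology):

\[
\frac{500\ \text{kcal/day} \times 7\ \text{days/week}}{3500\ \text{kcal/lb}} \approx 1\ \text{lb/week},
\qquad
\frac{1000\ \text{kcal/day} \times 7\ \text{days/week}}{3500\ \text{kcal/lb}} \approx 2\ \text{lb/week}
\]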
 