
NEDA, the largest nonprofit organization dedicated to eating disorders, has had a helpline for the last twenty years that provided support to hundreds of thousands of people via chat, phone call, and text. [...]
"We asked for adequate staffing and ongoing training to keep up with our changing and growing Helpline, and opportunities for promotion to grow within NEDA. We didn't even ask for more money," Harper wrote. "When NEDA refused [to recognize our union], we filed for an election with the National Labor Relations Board and won on March 17. Then, four days after our election results were certified, all four of us were told we were being let go and replaced by a chatbot."
Previously, previously, previously, previously, previously, previously.
JFC
Obsolete Human: “Oh wise and all-knowing AI chatbot, please tell me how to keep you from taking my job.”
AI Chatbot: “Hahaha, no. What is your job, by the way?”
This timeline can fuck right off.
I don't know which is more messed up: the fact that an ostensibly nonprofit organization feels the need to engage in union busting, or the fact that a chatbot, which tends to uncritically agree with and encourage its human partner, will be entrusted to provide "support" for people going through a mental health crisis. Actually, I'm pretty sure the second one is worse. But still.
I don't have an eating disorder, but I bet that people who do often need to talk to a person, not a machine. What are the chances this change is going to kill people?
What's worse is that eating disorders don't tend to kill immediately, so it'll also be hard to pin the deaths on these fuckers.
Exactly. How ... convenient for them.
I mean, although I don't have an eating disorder I have been fairly catastrophically depressed at times and if somebody told me I was going to be talking to a machine not a human ... well, I know how that would end. So of course they won't do that because in that case it would be easier to pin the result on them.
Like you say: fuckers.
Worse than telling someone they will be talking to a machine would be not telling them, so that they gradually come to realize the charade over the course of the "conversation". If it were just a prank it would be disturbing; it's far worse when it's your allotted health care. Gaslighting empathy.
This is the frog-boiling dream of "tele-health": (1) get people to put up with mediocre health care squeezed through a phone or computer; (2) move the provider to a low-wage country; (3) replace the provider by a bot. It has worked for customer service, and now med-tech ghouls like Teladoc are running the same play in health care.
Oh better still: I guarantee that "Better Help" and the like are using recordings of sessions to try to build an AI therapist which will suck, but they'll advertise it as "better than the nothing you get when you can't get an appointment with a real person!" because it's good enough for the proles.
You'd think being a non-profit would insulate workers from owners driven to squeeze every last drop of value from the workforce to maximize profit, with total indifference to the suffering of their workers, but that's not been my experience. My anecdotal experience from friends and family has been that non-profits are just as bad as, if not worse than, a lot of for-profit organizations when it comes to owners and management.
In a lot of non-profits, you get managers and owners who see your labor as free and easily replaceable, and they treat you accordingly. They will happily exploit your free or low-cost labor as much as you let them. Worse, the same sociopathic mindset that you see collecting at the top of for-profit organizations seems to exist at the top of many non-profits, and likely for the same reason: they ruthlessly outcompete sane people and focus on their own personal gain. If there is power and money to be had, assholes will fight to get it.
There are some good non-profits out there, and plenty of good people working for and even leading them, but it isn't the land of milk and cookies you might hope it is. I'm not shocked that people tried to unionize to improve working conditions, and I'm not shocked that it was shot down just as quickly as at a McDonald's or Starbucks. I will give them points for finding a new and horrible spin on it by replacing the workers with a fucking chat bot.
Everything going as expected:
https://www.vice.com/en/article/qjvk97/eating-disorder-helpline-disables-chatbot-for-harmful-responses-after-firing-human-staff
Good luck isolating that ‘bug’ in the ML model. The public still has a ways to go before understanding the nature of this beast.
Yeah, NEDA is super shady, and their response, calling the person a liar before even asking "oh, do you have screenshots?", is just vile.