Crisis Text Line selling data to for-profit company

futureapppsy2
Assistant professor · Volunteer Staff · Lifetime Donor · 15+ Year Member

I know that as a suicidologist and clinician I'm supposed to be in favor of these services, but honestly, I'm beginning to think they're a lot more iatrogenic than we'd like to admit, between stuff like this, doxxing callers, etc.

Thoughts?

It's just like all those other mental health treatment apps like BetterHelp. They're either grifts to get VC money for a product that will never actually be profitable, or ways to scrape up as much personal data as possible to be sold to other companies or used in the owners' other for-profit ventures.

This is what happens when every aspect of our lives has been commodified and financialized.
 
Exactly. Any claims that they want to help patients or providers are just a smokescreen, whitewashing their actual interest in monetizing data.

And there are some pretty significant ethical issues in claiming that customers/clients consented to having their information collected and used like this just because there's an EULA and disclaimer when they first go to use the app. If someone is distraught to the point of HI/SI and that desperate for help to avoid harming themselves or others, what are the odds they're actually going to read that information? And have they really consented, free of duress, if they were in that state?
 