Rules for Mental Health Tech

A couple months ago, I built a neat-o app called Quirk that helps folks with Cognitive Behavioral Therapy.

Through Quirk, I got a lot of chances to meet other folks building in the space and to gather advice from those who've done well and those who've done poorly. The following is a summary of that advice.

1. Avoid Invisible Crowded Spaces

If your product concept starts with "why is there no easy way to do X?" there's a reasonable chance you're looking at an invisible crowded space.

An invisible crowded space is a product area that lots of people want to build in, but the concept is flawed and nearly every single product fails. Because every product does so poorly, you probably haven't heard of any of them, even if you consider yourself familiar with the subject matter.

That leads to two types of biases that convince people to add bad products to a failing space:

  • "If I haven't heard of them, and I'm an expert in this area, then they must all be bad, and I can make a better one!"
  • "If I haven't heard of them, and I'm an expert in this area, then they must not exist!"

A good example of a common invisible crowded space is mental health journaling.

There are literally thousands of mental health journaling apps out there, and tons of them are far better than what 99% of people could create.

Take Jour. I would argue Jour is the top of the market here. It's absolutely gorgeous and it's great to use, but every attempt to market it seems to have fallen flat. This isn't for lack of trying; they had a pretty good Product Hunt launch that was upvoted by the folks who make Product Hunt!


My best guess is that Jour has somewhere between 5k and 25k total downloads, with a daily active user base of around 100 to 500. These are super respectable numbers, but likely not what the creators were hoping for, given the clear effort put into the product.

Mental health journaling is really difficult to get anyone to do. The people who would benefit most from the practice are the least likely to do it! Plus, getting people to do anything daily is a notoriously difficult problem! On top of that, anyone who already likes journaling will just use a non-MH-specific journaling tool.

If you're building in a space that's invisible to you, can you beat the Jours of that space? If Jour can't succeed in this space, what value are you providing?

2. Be skeptical of your engagement mechanics

Mental health consumer tech products suffer from leaky user engagement much more than products in other industries.

It's often very difficult to get someone going through something to try new tools or treatments. Professional treatment can mitigate this; people are much more likely to do something if their therapist tells them to.

But often the people who could most benefit from your product are the least likely to find or use it.

Some budding product people try to solve this problem with pretty sketchy engagement mechanics.

Don't use guilt

If your product is any variant of: "If the user records a thought, we'll donate to Oxfam" or another "good action," then take a step back.

Stop.

You may convince yourself you're solving two problems at once, and that this is the moral way of helping people.

But that ignores how people work. People don't like doing things or changing their habits. Using your product means changing their daily routine. By itself, that's hard.

Because it's hard, many users will fail. When they do, what will they think?

If you're using guilt as an engagement mechanic, you've implicitly told them that failure to use your product is not only their fault, but that it has serious consequences.

If you're giving food to a starving child for every week the user pokes at your app, then you've heightened the stakes for a game many users are destined to fail.

Guilt can come in a lot of forms. When you see it, be skeptical:

  • When a user doesn't log their mood that day, send a message to their family
  • When a user fails to log in for X time, expire their account
  • When a user fails to log in, dock points in a gamified system

3. Mental health user data is more sensitive than passwords

If you're recording someone's thoughts, their mood over time, or their meetings with therapists, you're storing some of the most sensitive data a user can have.

Most people would rather give you their password than their Cognitive Behavioral Therapy thoughts. They're often the most embarrassing thing in someone's life. They contain sensitive details about other people. By definition, they are the source of shame, anxiety, fear, and depression.

They're also prime material for blackmail, targeted defamation, and revenge, and a prime target for abusive spouses.

Often budding product people will think that because they're doing something "good" with the information, they're absolved of the effort required to protect it.

Be skeptical of products that:

  • Do machine learning or NLP to detect depression, suicide risk, or other mental health concerns. What data was the model trained on? Who had to look at it? Is it encrypted? Where is it stored?
  • Needlessly send the information over the internet. Why would you send thoughts to a therapist you're meeting with in person every week? Why not just store them on the device and let the user share them out loud? (There's a rough sketch of this approach after this list.)
  • Have no security strategy. If no one on the team has even considered the security of the information: bail. 🏃‍♂️💨
  • Mix MH data with analytics data. If the marketing team has as much access to people's thoughts as they do to usage information: double bail. 🏃‍♂️💨🏃‍♂️💨
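
To make the on-device point concrete, here's a minimal sketch of what "never leaves the phone" can look like: thought records encrypted at rest with a locally held key, written only to local storage, and never sent anywhere. This is illustrative Node-style TypeScript, not Quirk's actual code; the file paths are made up, and a real mobile app would keep the key in the platform keystore (iOS Keychain / Android Keystore) rather than in a file next to the data.

    // Sketch: local-only, encrypted-at-rest storage for thought records.
    // Assumptions: Node's built-in crypto/fs, hypothetical file paths.
    import { randomBytes, createCipheriv, createDecipheriv } from "crypto";
    import { existsSync, readFileSync, writeFileSync } from "fs";

    const KEY_PATH = "thoughts.key";   // hypothetical; use a keystore in a real app
    const DATA_PATH = "thoughts.enc";  // hypothetical local data file

    // Load or create a 256-bit key that never leaves the device.
    function loadKey(): Buffer {
      if (!existsSync(KEY_PATH)) writeFileSync(KEY_PATH, randomBytes(32));
      return readFileSync(KEY_PATH);
    }

    // Encrypt the whole journal with AES-256-GCM and write it to local disk.
    export function saveThoughts(thoughts: string[]): void {
      const key = loadKey();
      const iv = randomBytes(12);
      const cipher = createCipheriv("aes-256-gcm", key, iv);
      const ciphertext = Buffer.concat([
        cipher.update(JSON.stringify(thoughts), "utf8"),
        cipher.final(),
      ]);
      // Store IV + auth tag + ciphertext together; no network call anywhere.
      writeFileSync(
        DATA_PATH,
        Buffer.concat([iv, cipher.getAuthTag(), ciphertext])
      );
    }

    // Decrypt and return the journal, or an empty list if none exists yet.
    export function loadThoughts(): string[] {
      if (!existsSync(DATA_PATH)) return [];
      const key = loadKey();
      const blob = readFileSync(DATA_PATH);
      const iv = blob.subarray(0, 12);
      const tag = blob.subarray(12, 28);
      const ciphertext = blob.subarray(28);
      const decipher = createDecipheriv("aes-256-gcm", key, iv);
      decipher.setAuthTag(tag);
      const plaintext = Buffer.concat([
        decipher.update(ciphertext),
        decipher.final(),
      ]);
      return JSON.parse(plaintext.toString("utf8"));
    }

The details matter less than the shape: there's no network call anywhere in the data path, so there's nothing for a breach, a subpoena, or a curious marketing team to read.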

If this sort of thing interests you and you live in the bay area, let's grab coffee sometime! Shoot me an email:

coffee @ econn.dev