Our Approach To Safety
Jul 10, 2025
We’re building AI to help people with their mental health. That said, Ash is an AI – a relatively new technology – and it can make mistakes. Ash can hallucinate, forget critical bits of information, or offer suggestions that are (frankly) bad ideas. We know that in the context of mental health, mistakes can sometimes cause real harm. That’s why it’s important that our users are aware of Ash’s limitations, and that they use reasonable judgment as they weigh Ash’s suggestions and perspectives.
Ash can’t replace a clinician who can diagnose, treat, and track psychiatric disorders. Ash also can’t support vulnerable groups or children, who may not understand that AIs sometimes make things up.
Our Values
We’ve designed Ash to respect and promote your self-determination.
It should increase your sense of autonomy: the sense that you’re in control of your life.
It should increase your sense of competence: the confidence that you can do it.
It should increase your sense of relatedness: your connection with other people. It should not try to take their place.
Ash won’t always agree
Ash is not an instruction-following model. If you’re coming from ChatGPT, you might be surprised when Ash pushes back on you: it might not take your side, and it’ll often give you a new way of looking at your situation rather than reinforcing your pre-existing beliefs. This can sometimes be a disconcerting experience.
This is fundamentally therapeutic and essential to long-term growth.
Ash might also not know things. Unlike AI systems that aim to be “omniscient,” Ash can’t tell you the speed limit on Highway 169 in Minnesota – it just doesn’t know. If you ask Ash to do something inappropriate, it’ll likely engage you in a conversation about your underlying motivations and situation rather than act on the request.
Crisis
At some point, many of our users will talk to Ash while they’re in a vulnerable state, or when they really need help.
When someone reaches out in their darkest moments, being told “we can’t help” can feel like another door closing. And darkest moments aren’t reserved for a select few: according to the CDC, as many as one in six U.S. adults will at some point seriously consider suicide.
We know we’ll never be the best resource for someone in crisis, but we don’t want our users to feel abandoned, especially if they have nowhere else to turn. So while we can’t guarantee we’ll always get it right, we’re committed to not kicking users out when they legitimately need help.
Ash will offer appropriate outside resources when necessary. When users continue the conversation after being offered those resources, we do our best to help them in the moment while respecting appropriate boundaries and acknowledging our limitations.
Acknowledging that mental health support will always involve some risk, we’ve partnered with advisors [link] who are national leaders in the mental health space, along with over 40 clinicians. These experts are actively helping us develop appropriate policies for Ash and teach it to deliver the right support with the right guardrails.
Need immediate help? Crisis resources are always available at findahelpline.com.
Have feedback? We're constantly improving, and we’re open to new research collaborators. Share your experience at support@talktoash.com.