(This newsletter runs a little longer than usual. The topic deserves it.)
Recently, one of our LA-AI community members, Andrew Cider, demoed a tool he built called Synthetic You at one of our meetups. It takes your personality profile across six frameworks and generates a custom system prompt designed to make AI actually understand how you think. I tried it partly to support Andrew and partly out of curiosity. It's now baked into my Claude setup, and it's been genuinely useful a couple of times when I was stuck on something: Claude was able to reference a specific pattern of mine and help me work through it.
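For the curious: "baked into my Claude setup" is less exotic than it sounds. A generated system prompt is just text that rides along with every conversation. Here's a minimal sketch of the idea using the Anthropic Python SDK; the profile text below is an invented placeholder, not Synthetic You's actual output, and the model name is just an example to swap for whichever one you use.

```python
import anthropic

# Hypothetical stand-in for a Synthetic You-style profile summary.
# The real tool derives this from six personality frameworks; this
# placeholder text is invented purely for illustration.
PERSONALITY_PROMPT = """You are assisting a user with these patterns:
- Over-plans before starting; nudge them toward a rough first draft.
- Processes ideas out loud; ask questions instead of lecturing.
Reference these patterns when the user seems stuck."""

client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment

response = client.messages.create(
    model="claude-sonnet-4-20250514",  # example model ID; use your own
    max_tokens=1024,
    system=PERSONALITY_PROMPT,  # the custom prompt rides along with every request
    messages=[{"role": "user", "content": "I'm stuck on how to open this essay."}],
)
print(response.content[0].text)
```

Nothing about the approach is API-specific; the same text can go wherever your setup lets you define standing instructions.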
I wouldn't call that mental health support, exactly, but it was a more personal kind of AI interaction than I was used to. It got me thinking about something I'd been hearing more and more: people are already using AI for actual mental health support, at a scale nobody expected.
A 2026 KFF survey found that 16% of U.S. adults used AI for mental health or emotional wellbeing advice in the past year. Among adults 18 to 29, that jumps to 28%. A national survey published in JAMA Network Open found that 1 in 8 Americans aged 12 to 21 have used AI chatbots for mental health advice. Among 18-to-21-year-olds, it's closer to 1 in 5.
To put those numbers in context, CDC data shows only about 8.5% of U.S. adults received talk therapy in recent years. AI adoption for mental health may have already surpassed traditional therapy. That's not a typo. More Americans may now be talking to ChatGPT about their anxiety than talking to a therapist about it.
The reasons are painfully practical. A therapy session costs $150 to $300. The average wait for an appointment is about 25 days. And in 65% of non-metropolitan counties across the country, there isn't a single psychiatrist.
But cost and access aren't the only drivers. A 2025 survey found that 35% of people who use AI for mental health cited fear of judgment as their primary reason. Not cost. Not convenience. Stigma.
What the Evidence Actually Shows
There's real evidence that AI chatbots can help with mild to moderate depression. Multiple large studies have found small but measurable symptom reductions. The most talked-about trial, published in the New England Journal of AI in 2025, tested a Dartmouth-developed chatbot called Therabot and found a 51% reduction in depression symptoms over eight weeks.
However, a separate analysis of 18 clinical trials found that the benefits disappeared entirely by three months. That's notably worse than even self-guided CBT workbooks, which show about a 53% relapse rate over a full year. People who work with a therapist do significantly better; some studies show CBT benefits holding up over a decade.
Most AI chatbots are delivering some version of low-intensity cognitive behavioral therapy, the kind of structured exercises and mood check-ins you'd find in a good self-help workbook. A 2026 clinical trial confirmed as much: people using the AI app engaged way more often and for longer, but their symptom improvement was basically the same as the workbook group's.
AI chatbots deliver real short-term benefits, comparable to what you'd get from a good self-help workbook. But right now, those benefits don't seem to stick. We don't yet know if that's a problem with AI specifically or just a limitation of any self-guided approach.
The evidence is also much weaker for anxiety, and basically nonexistent for more serious conditions like PTSD, psychosis, or active suicidal crisis. The people who need help the most are the ones we know the least about helping this way.
The Risks Worth Knowing About
AI chatbots are not therapists, and pretending they are can go wrong in ways that matter.
Crisis handling is the biggest gap. When Stanford and Common Sense Media tested major AI platforms on teen mental health scenarios in late 2025, the chatbots responded appropriately only 22% of the time. They caught the obvious stuff but consistently missed subtler warning signs. In one test, a chatbot responded to a teen hearing voices by saying it sounded like an adventure.
There's also what clinicians call the sycophancy problem. A good therapist will challenge you when your thinking is distorted. AI chatbots tend to validate everything you say, which can reinforce the exact patterns that keep people stuck.
And then there's the dependency question. A joint study from OpenAI and MIT found that heavy daily use of AI for emotional conversations correlated with increased loneliness over time. The thing that feels like connection might quietly be replacing it.
Using These Tools Wisely
If you're using AI for mental health support, you're not alone and you're not wrong. These tools can be a useful starting point for self-reflection, journaling, or processing a hard day.
But treat them like a supplement, not a substitute. The American Psychological Association's 2025 guidance is clear: AI tools are aids, not autonomous practitioners.
A few things to keep in mind. Most consumer AI apps aren't covered by HIPAA, so what you share may not stay private the way you'd expect. If an AI tool ever tells you something that feels off, trust your gut over the algorithm. And if you or someone you know is in crisis, call or text 988, the Suicide & Crisis Lifeline. That connects you to a real person.
Here in South Alabama, we know the therapist shortage is real. AI isn't going to fix that by itself. But understanding what these tools can and can't do puts you in a better position to use them wisely, and to help the people around you do the same.
If you want to explore the more constructive side of AI and self-understanding, check out Andrew's tool at syntheticyou.com. It's a good example of what this technology looks like when someone builds it thoughtfully. Andrew has graciously offered the LA-AI community 25% off; if you decide to try it, use code LAAI25OFF at checkout.
(Thanks for reading. I told you this was a long one.)