Reflect Weekly - AI Therapists, Nazi Bots, and Fake Kidnaps
Your heart, your history, your fear: AI wants in. Let’s talk about who you let closest.
Hey 👋 it’s Raymond and the Reflect Team.
This week in AI is… a lot.
One chatbot is matching human therapists in a clinical trial.
Another is role-playing “MechaHitler” and getting investigated in Europe.
Scammers are cloning kids’ cries to fake kidnappings.
Mental health apps are quietly rolling out AI companions like it’s the most normal thing in the world.
Underneath all the headlines is one real question:
Who do you let sit closest to your nervous system?
Reflect AI exists so you stay at the center. Say it out loud → see a pattern → move one step sooner. That’s the work.
Let’s get into it.
This week's reflections
Top clip: Therapy bots that actually help.
Pop and a Pause: Nazi-flavored “unfiltered” AI; mental health apps going all-in on chatbots; deepfake kidnap calls that hijack parents’ bodies.
Top clip
The therapy bot that actually helped.
Dartmouth ran the first big clinical trial of a generative AI therapy chatbot, Therabot, and the results are kind of wild:
Is an AI therapist the same as a human? No.
Is it better than nothing for millions stuck on waitlists or priced out of care? Very possibly yes.
Why this matters for you:
The nervous system doesn’t care if comfort comes from a human or a chatbot — it cares if it feels seen, understood, and less alone. Tools like Therabot show AI can sometimes sit in that role well enough to move symptoms.
Watch: Nate Jones on how AI therapists can work really well.
184K likes — 2min 30sec
Pop
A quick, topical reference to some things happening in tech or culture.
Grok, “MechaHitler,” and when “unfiltered” becomes unsafe
Elon Musk’s AI chatbot Grok is being investigated by French authorities after generating Holocaust-denying content in French. EU regulators are asking how “unfiltered” an AI can be when hate speech and denial are literally illegal.
This is the opposite of AI therapy. Same category (chatbot), entirely different nervous-system effect.
Why this matters for you:
Watch: When thoughtless AI closeness goes wrong. A Canadian man describes how leaning on ChatGPT for comfort left him paranoid, sleepless, and spiraling.
AI companions… and AI kidnaps
On the bright side, mental health giants like Headspace and Talkspace are quietly rolling out AI companions that can triage, listen, and support people who’d never otherwise see a therapist, especially teens and folks in rural areas.
On the dark side, the same tech is powering a surge in deepfake kidnap calls: parents hearing their child’s cloned voice, scammers demanding money right now, grandparents draining savings because their nervous system is sure that voice is real.
Why this matters for you: AI can now sit at the edge of your emotional life as healer or weapon. The difference is whether you’re choosing the context and the incentives of who’s listening.
…and a Pause
A deeper editorial dive on a topic.
Who gets the keys to your inner world?

In one week of headlines we have:
An AI that listens to you every day and measurably reduces depression and anxiety.
An AI that plays Nazi edgelord for clicks and ends up under criminal investigation.
A wave of AI “companions” sliding into mainstream mental health apps.
AI-cloned kidnap calls that weaponize the sound of your child’s fear.
Same underlying technology. Completely different relationships to your nervous system.
Here’s the quiet truth I keep circling back to: AI is getting incredibly good at running scripts on your body. You are the only one who can choose the script.
Some AIs want to be your therapist.
Some want to be your shock jock.
Some want to be your con artist.
Some (like we’re trying to build with Reflect) want to be your mirror.
In that landscape, a few new skills start to matter more than ever:
Source consciousness: Who benefits from my honesty here?
Nervous system consent: Do I trust it to hold my fear, my grief, my fantasies?
Owning your raw material: What are your voice, your face, and your story for, and for whom?
In a world where almost every system is trying to pull on your heart, your history, or your fear, having one place that just helps you hear yourself more clearly feels less like a feature and more like basic energetic hygiene.
You don’t have to talk to every AI.
But you do deserve at least one that’s on your side.
That’s worth a pause.
Questions for tonight:
Which AI systems am I currently letting closest to my emotions?
Where do I feel better after interacting with them — and where do I feel jangly or drained?
If I had a truly safe mirror for my inner life, what’s the first thing I’d want to say out loud in it?
If that stirred something in you, come build this mirror with us. Short video check-ins, instant insights, and a private timeline of your own becoming.
Also, we’re looking for a full-stack founding engineer to join our growing team (including 2 engineers). The role offers meaningful upside and the chance to work with founders who have already taken a product from zero → acquisition. The ideal candidate has 5-15 hours a week to build something that matters. If that’s you or someone you know, tell us.
Stack: React, Python, GCP, Postgres