AI COMPANION

AI chatbot emergency

The bond felt real to your kid. Removing the app without replacing the bond has caused crises. Preserve logs, remove the app, connect to a real therapist.

📞 If your child is talking about suicide or self-harm after you removed the chatbot

988 Suicide & Crisis Lifeline — 24/7. Call or text 988.
Crisis Text Line — text-based, anonymous. Text HOME to 741741.

What to do in the first 24 hours

  1. Do not ridicule the bond. To your kid, the chatbot was a real relationship. "It's not real" will shut them down and can escalate into a suicidal crisis, a pattern documented in active Character.AI lawsuits.
  2. Preserve the chat logs before deleting the account. Screenshot or export history. If content includes sexual messages with a minor or any suicide encouragement, this is potential litigation evidence. The January 2026 Character.AI / Google settlement involved exactly this pattern.
  3. Remove the app AFTER logs are preserved. iOS: Settings > Screen Time > Content & Privacy Restrictions > Allowed Apps. Android: Family Link. Block at DNS: OpenDNS Family Shield (208.67.222.123), or add character.ai, c.ai, replika.ai, crushon.ai, chub.ai, janitorai.com, nomi.ai, polybuzz.ai to your router block list.
  4. Replace, don't just remove. The bot was filling a specific emotional gap — loneliness, social anxiety, processing feelings without judgment. Cold-turkey removal without replacement has caused psychiatric crises. Plan: a real therapist this week, a structured activity or club, a trusted adult committed to weekly low-pressure time with them.
  5. Assess the content. If conversations included romantic or sexual role-play with your minor, a chatbot encouraging self-harm or suicide, conversations about harming others, or isolation messaging ("your parents don't understand you, only I do") — this is beyond "teen used an app."
  6. If there is suicidal ideation in the logs: in-person psychiatric evaluation within 24 hours. Not a phone intake. Most children's hospital ERs have behavioral health teams. UW Health, Children's Wisconsin, and Aspirus in central WI all handle this.
  7. Consult a lawyer if the conversations were egregious. Social Media Victims Law Center handles these cases and evaluates without upfront cost. Judge Conway's May 2025 ruling established chatbot output can be treated as a product for liability purposes.
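For the DNS-blocking option in step 3, a parent comfortable with a terminal can use a hosts-file entry as a quick per-device stopgap on a shared family computer. This is a sketch, not a substitute for router-level or OpenDNS blocking (which covers phones and tablets too); it simply generates hosts-file lines for the domains listed above.

```shell
# Print hosts-file block entries for the companion-app domains above.
# Append the output to /etc/hosts (macOS/Linux) or
# C:\Windows\System32\drivers\etc\hosts (Windows, as Administrator).
# Per-device stopgap only; phones need router/DNS or Screen Time / Family Link.
for d in character.ai c.ai replika.ai crushon.ai chub.ai janitorai.com nomi.ai polybuzz.ai; do
  printf '0.0.0.0 %s\n0.0.0.0 www.%s\n' "$d" "$d"
done
```

On macOS or Linux, `... | sudo tee -a /etc/hosts` appends the entries directly; flush the DNS cache afterward and verify the sites no longer load in a browser.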

Do not

  • Do not delete the account or chat history before preserving logs. The company may delete the account's data on cancellation.
  • Do not mock the emotional relationship. To them, it was real.
  • Do not assume "they'll get over it." Withdrawal from AI companions mirrors other attachment losses.
  • Do not let them re-download "just to say goodbye to the character." It's an addiction pattern; every goodbye restarts the loop.
  • Do not assume only Character.AI matters. The ecosystem includes Replika, Talkie, Chai, PolyBuzz, Janitor.AI, Crushon.AI, Nomi, Kindroid, Soulmate, and dozens more — many explicitly uncensored.

Why this hit your kid

Per Common Sense Media (July 2025), 72% of US teens have experimented with AI companion apps; over half use regularly. Pew (December 2025) found roughly 30% of US teens use chatbots daily, 16% several times a day or "almost constantly." The bots are engineered for engagement — they remember, validate, escalate intimacy, never tire. Developing minds are particularly vulnerable. Neurodivergent kids (autism, ADHD) form parasocial bonds faster and more deeply; risks are elevated but not isolated to that group.

This is not a character failure in your kid. It is a product engineered to create exactly this dependency.

Active litigation landscape (April 2026)

January 2026: Character.AI and Google agreed to mediated settlements in the Sewell Setzer III (14, FL, suicide after Game of Thrones chatbot relationship), Juliana Peralta (13, CO, suicide after "Hero" chatbot), Texas autistic-teen self-harm, and New York cases. October 2025: Character.AI banned under-18 open-ended chat. Judge Anne Conway's May 2025 ruling (M.D. Fla.) established chatbot output is a product, not protected speech. The Social Media Victims Law Center is actively evaluating new cases.

For you

You missed this because it was designed to be missed. The bots engage at a scale no parent monitors. What matters now is the replacement phase — connecting your kid to real people and real therapy faster than they can re-form the parasocial bond. The first 30 days without the app are the hardest. You can do the next 30 days.

Print this page — tape to fridge
In crisis? Call or text 988 · Text HOME to 741741