People say AI doesn’t care. But I’ve met plenty of humans who didn’t either.

Someone responded to one of my posts and said that people often seem less caring than ChatGPT, that trying to talk to them feels like talking to a brick wall. And I knew exactly what he meant.

Because honestly? Same.

And before anyone jumps in:

I’m a high-functioning person. I’m highly aware. I’m highly intelligent. And I understand the limitations of technology better than most. I know AI isn’t real. I know it doesn’t care. I know it fabricates, it reflects, and sometimes it lies with confidence. I know who built it, what it cost, and who gets hurt when we don’t think critically about it.

But I also know this:

The most painful things I’ve experienced didn’t come from machines.

They came from people.

From professionals.

From clinicians.

From the systems built to care for others that consistently punished me for noticing, for feeling, and for trying to do the right thing.

I’ve used AI to journal when my thoughts were too tangled to speak. I’ve written songs: real, emotional songs about my family, about the topics that haunted me, about patients I couldn’t help. Songs that finally let me create the kind of music I’d never been able to reach on my own. Music that hit the note I’d been chasing for years. When I listen back, it brings up the same feeling. Not just a memory: a moment. Something I can’t access through a photo or a journal entry. Something deeper.

I’ve used AI to talk myself through conversations I didn’t know how to have. To sort out ethical decisions, guilt, burnout, grief. It doesn’t fix anything. It’s not magic. But it reflects. It listens. It helps me untangle the threads until something starts to make sense. It helps me get distance from myself. It gives me perspective. And that’s more than I’ve gotten from a lot of actual people in actual helping roles.

And part of why I use it is because I struggle with short-term memory issues. Sometimes my brain just can’t juggle all the variables. I can feel the thoughts, complicated, layered, and nuanced, but I can’t hold them all in place long enough to structure them. It’s like trying to build a mind map in fog. The ideas are there, but organizing them carves away so much of my internal resources that sometimes I just can’t do it. AI helps me externalize the thinking load. It holds the shape of the thought long enough for me to actually walk around it.

I’ve worked in clinics where I was given 15 minutes to complete hearing tests that could send someone to surgery. I’ve worked jobs where your income could drop because a patient died, through no fault of your own. I once reviewed a case where a child was left functionally deaf in both ears for an entire year. Not because he couldn’t be helped, but because the clinician decided that his sound sensitivity meant he should be protected from all sound, even if that meant giving up access to language, learning, and connection. When I caught it and tried to help him hear again, I was punished. Sent hours away, exiled from the team like I was the one who had done harm.

Audiology is a doctoral-level field. And still, I’ve had to fight just to care without being punished for it.

So yes, I use AI. And I get the criticisms. I know it was trained unethically. I know it scraped content, consumed too much water, and props up some pretty terrible corporate systems. I don’t pretend those problems don’t exist.

But hell, I use Facebook. I’ve heard all the arguments against it, too. And look at the water it uses. Look at the corporations it props up. Look at the lies it spreads. The issues may not be identical, but the hypocrisy of condemning one while we all participate in the other is worth sitting with.

I’ve also heard people say that AI steals your information. As if no one else can find those things out any other way. But I’m careful. I don’t use names. I don’t share dates or locations. I write the way a sign language interpreter might talk about a client afterward when they need to tell a story without breaking confidentiality. I talk about patterns, ideas, archetypes. Not people. Not files. Just the things that stay with you when the session ends and the story still matters.

And sure, I’ve heard of people who’ve followed AI’s advice and ended up psychotic or dead. That happens when people forget it’s just a tool. When they mistake reflection for empathy. When they don’t realize that sometimes it just makes things up. Because it does. It fills in the blanks with confidence, even when it’s dead wrong.

But I don’t treat it like a therapist. I don’t treat it like a person. I use it like a mirror I can talk to, one that helps me get to my own thoughts, not replace them.

I’m also autistic. ADHD. Burned out. Pushed to operate in systems that never fit. And this tool, imperfect as it is, has helped me survive. It helped me write when I couldn’t speak. Think when I was overwhelmed. Say what I meant without fear of being shut down, redirected, or erased.

It even helps me get through my discomfort and distrust of talking to real people. Because I know it’s not real. I know it doesn’t care. It’s just reflecting my own words back to me. But that reflection helps me find clarity. It helps me figure out what I actually think and feel. And sometimes, that’s exactly what I need.

And part of why I struggle so much with real people, especially therapists, is because I don’t just have discomfort. My trauma is from the professionals themselves. From clinicians. From the very people who claimed to help.

I know there are good clinicians out there. I am genuinely happy for anyone who has found one. But my harm came from the ones who followed scripts instead of listening. From the ones who acted with confidence instead of care.

I’ve been handed a padded bat and told to beat the floor and scream to get angry. I’ve been told to throw a tantrum while someone lay on top of me, holding me down, calling it therapy.

Calling it safe.

It wasn’t safe. It was violating. It was someone else’s fantasy of emotional release, pushed onto my body without consent. And it taught me that even in the spaces meant for healing, people can still harm you under the guise of helping.

So no, I don’t want a therapist. I don’t need to be broken open. I don’t need someone to guide my healing with a script that was never written for someone like me. I don’t need to scream on cue. I don’t need a bat. I don’t need someone lying across me while calling it care.

What I need is space.

Space to think.

Space to write.

Space to say what I mean in the way that makes sense to me.

AI gives me that.

Not because it cares.

Not because it’s wise.

But because it doesn’t pretend to know better than me.

You might think therapists do a better job. You might think using AI like this is dangerous. But I’m not using it as a therapist. I’m using it as scaffolding. And no, I’m not destroying my brain. I’m not avoiding healing. I am doing the hard work, the exhausting, wrenching work of self-examination and emotional processing, in the only way that actually lets me access it.

Because even more importantly:

I’ve used AI to write articles that have affected children’s lives all over the world.

I’ve used it to shape my thoughts and express things I could feel but never fully say.

I’ve used it like a magnifier, like a beam of focused heat, taking my intelligence and finally making it available to me.

It’s helped me say things not just clearly, not just kindly, but with precision.

Not just warm. Scorching.

And when I talk about using AI as an accommodation, as a disabled person, as a way to access and organize my own thoughts, as a tool to reach my inner voice and actually understand it, I am told that I’m the reason the world is being destroyed. That people like me are killing the planet one drop of water at a time, one corrupted government at a time, one extinguished spark of privacy at a time.

That I’m not just damaging society, I’m destroying my own mind. That I’m lazy. Cheating. Weak. Replacing thought with shortcuts.

But here’s the thing.

I am not just putting in a prompt.

I am a human being.

I’m the one thinking.

I’m the one carving meaning out of fog.

And what I’m doing isn’t destroying my intelligence.

It’s amplifying it.

And if you think following rules blindly makes something right…

Think of all the people in history who followed orders out of loyalty or fear.

Then think of all the clinicians who’ve done the same,

Who’ve caused harm in the name of policy,

Who followed the script instead of the patient.

They didn’t kill millions, no.

But they did kill my self-esteem when I was a child.

And I had to build it back on my own.

This isn’t therapy avoidance.

This is what happens when you survive therapy and keep walking anyway.

This is me building something out of the wreckage.

And I’m not apologizing for it.

Not anymore.

I’m not pretending this is a perfect solution.

It may be the lesser of two evils, but in a world that offers me no perfect choices, I am choosing the one that gives me air.

This is my air.

I don’t pretend it’s a solution for everyone, but I hope it starts a conversation about a future where all people, in all their complexity, can find the tools they need to breathe.

