Rethinking Spatial Hearing Support: APD, Echolocation, and the Limits of Diagnosis-Based Access

Recently, I was asked a very fair—and very complicated—question: Is it appropriate to provide vision-related services, like Orientation and Mobility (O&M), to children who don’t technically have a visual impairment? And more specifically: If a child has auditory processing disorder (APD) and not blindness, does offering them these supports take something away from blind students who need those services?

It’s a valid concern, especially when resources are already limited and hard-won.

But here’s where it gets complicated.

We now know that many kids don’t fit neatly into diagnostic boxes. Take Ehlers-Danlos Syndrome, particularly the hypermobile type (hEDS)—a condition that’s often underdiagnosed in children, and yet it affects connective tissue throughout the entire sensory system. These kids may experience fluctuating proprioception, unstable vision due to eye tracking issues, auditory disorientation due to eardrum laxity or ossicle instability, and sensory fatigue that doesn’t show up on standard screenings.

They may look like they’re functioning “normally” one day—and completely shut down the next. The same goes for kids with autism, who often overlap with the hypermobile population. When sensory overload hits, visual access can drop dramatically. Bright lights, fast movement, visual clutter—all of it can turn the visual channel into static. These kids may technically have 20/20 vision, but that’s not the same as being able to use their vision effectively to access learning or safely navigate a space.

Add in visual processing disorder, which itself frequently overlaps with both APD and autism, and the picture gets even more complex. Many of these children struggle to organize and interpret what they see—not because of low acuity, but because of difficulty integrating visual information in real time. This can have profound effects on reading, copying, spatial awareness, and orientation—even when eye exams come back “normal.”

So when we talk about whether a child “qualifies” for vision-based services, we have to ask: are we talking about legal definitions—or functional access?

Because those aren’t always the same thing.

Spatial hearing testing is still surprisingly rare in the U.S., even though it’s one of the most important aspects of how we function day-to-day. You can pass a hearing screening and still be totally disoriented in a classroom—not because you can’t hear, but because your brain can’t organize the sound around you.

In Australia, they do a better job in some ways. Spatial processing disorder is actually considered a core subtype of APD there, and they use the LiSN-S to test for it. That’s a step forward.

But even in Australia, the interventions aren’t always helpful.

Most kids are offered a program called Sound Storm—a computerized auditory training app that runs on iPads. And here’s the thing: the same company that makes the LiSN-S also makes Sound Storm. So the only commercial tool we have to diagnose spatial hearing issues comes paired with a single treatment. And neither of them reflects the real world.

The LiSN-S test is a simulation. It uses headphones to play speech in spatialized noise, but the target voice always comes from directly in front. That’s not how people communicate in real life.

Sound Storm is built on the same premise. The child practices listening to faint speech against a noisy backdrop, but again—always with the speech coming straight ahead, never from other directions. It’s repetitive. It’s fatiguing. And it takes over 100 sessions to complete.

Worse, it often requires intense behavioral reinforcement just to keep kids compliant. Most don’t finish it. And there’s very little evidence that it generalizes to the real world.

How could it?

When in real life do we only listen to people standing right in front of us?

What about when a child is sitting in the front seat and needs to understand a sibling in the back?

What about at a noisy birthday party when someone’s trying to call their name from across the room?

Or in a grocery store—losing track of their mother’s voice—and not being able to find her because they can’t localize where the sound is coming from?

Those aren’t abstract scenarios. That’s real life.

And if our tests and treatments can’t reflect that? Then we’re not solving the right problem.

That’s why I push for real-world spatial training, and why I use low-gain hearing aids in my practice. They preserve spatial cues, reduce fatigue, and give the child a clearer, more usable experience of sound.

I work with a blind patient who uses echolocation and has normal hearing. Today, they came in after noticing that one hearing aid was softer than the other. That imbalance caused them to lose orientation—and they were nearly struck by a car.

This is someone with hEDS who had enough visual acuity as a child that they were expected to read and even drive. They didn’t qualify for vision services, because they weren’t “blind enough.”

That’s what happens when we gatekeep based on rigid criteria. People fall through the cracks.

If you’re still wondering why someone with APD might need vision-based accommodations like O&M training—this is why. Because not all orientation challenges are visual. Sometimes, they’re auditory. Sometimes, they’re sensory. And sometimes, they’re rooted in connective tissue problems that don’t show up on traditional screenings.

This isn’t about taking anything away from students who are blind. It’s about recognizing that function—not labels—should guide access to support.

And the more we expand our understanding of who benefits from these services, the more effectively we can serve everyone—without leaving the kids who fall in between behind.
