Key Takeaways
- Facial recognition algorithms misidentify Black faces at rates 10 to 100 times higher than white faces, leading to documented wrongful arrests.
- Being “coded out” means being made invisible by design—from voice assistants that ignore accents to credit models that mark entire communities as too risky.
- The “default user” myth places a narrow identity at the center of every design decision, treating everyone else as an “edge case.”
- Four pillars—Lead with Curiosity, Be Accountable, Solve for One/Extend to Many, and Take Action—provide a framework for inclusive AI.
- When we solve for people at the sharpest edge of harm, we improve systems for everyone—from margins to multipliers.
Three Men, One Pattern
January 2020, Detroit, Michigan. Robert Williams is in his front yard, playing with his daughters, when a police car pulls up. Officers step out and tell him he’s under arrest for larceny. His daughters watch as their father is handcuffed in front of them and taken away. Robert spends 30 hours in jail for a crime he didn’t commit. The “evidence”? A facial recognition algorithm that couldn’t tell Black faces apart.
Robert Williams isn’t alone. There’s Michael Oliver—a similar story, another wrongful arrest built on a facial recognition match. And Nijeer Parks—accused of shoplifting and trying to hit a police officer with a car in a town he had never even visited. He spent 10 days in jail and nearly three years fighting the charges before they were dismissed.
Three Black men. Three wrongful arrests. Three lives disrupted. And these are only the cases we know about.
This Isn’t Anecdotal. It’s Systemic.
In 2019, the National Institute of Standards and Technology (NIST) published a major study on facial recognition accuracy across demographic groups. Their findings were damning: facial recognition algorithms misidentified Black faces at rates 10 to 100 times higher than white faces. For Black women specifically, error rates were even worse.
The systems being sold into policing, airports, housing, banking, and hiring are systematically worse at recognizing us. This isn’t a random glitch. It’s the predictable result of how these systems are built.
What It Means to Be “Coded Out”
When I talk about being “coded out,” I’m talking about being made invisible by design. You’ve been coded out when a voice assistant never understands your accent or your disabled speech pattern. When a “smart” form forces you into two gender options that don’t fit. When a navigation app forgets wheelchair users exist. When a credit scoring model marks your entire community as “too risky,” ignoring the structural barriers you face.
The common thread: a mythical “default user” at the center of every design decision.
The Myth of the Default User
Most tech products are built around an imaginary “typical” user—usually defined by whoever’s in the room building it. Everyone else gets labeled an “edge case.” We see this in name fields that reject non-Western formats, recognition engines trained on narrow datasets, interfaces that quietly assume a certain way of seeing, moving, hearing, or processing information, and data pipelines that treat Black, disabled, and Global South communities as statistical noise.
The result? People at the margins get locked out, misclassified, or harmed first and worst. Trust erodes. Lives are disrupted. And yet the system is marketed as “objective,” “innovative,” or “unbiased.”
From Coded Out to Coded In: Four Pillars
Pillar 1: Lead with Curiosity
- Who is missing from this dataset, this user research, this leadership meeting?
- Whose support tickets are dismissed as “rare” or “edge cases”?
- Where are people dropping off silently because the system doesn’t recognize them?
Pillar 2: Be Accountable
- Own the harm: Acknowledge when your system locks people out or misidentifies them.
- Put real resources behind the fix: Budget, time, and actual authority.
- Assign clear owners: Make someone responsible for inclusive outcomes, and track progress like any other key metric.
Pillar 3: Solve for One, Extend to Many
- Case study: A global bank’s AI voice authentication was locking out users with regional accents and speech disabilities. Initial success rate: ~30% for some groups.
- The approach: Co-designed with affected users, performed reverse bias audits, and expanded and diversified the training data (a simplified sketch of such an audit follows this list).
- The result: Success rates jumped to ~90%—not just for “edge cases,” but across their entire customer base.
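To make Pillar 3 concrete, here is a minimal sketch of the disaggregated check that a reverse bias audit starts with: measuring success rates per group instead of trusting one overall number. The group labels and outcomes below are hypothetical placeholders, not the bank’s actual data.

```python
# Hypothetical example: per-group authentication success rates.
# Group names and outcomes are invented for illustration only.
from collections import defaultdict

# Each record: (user group, whether voice authentication succeeded)
attempts = [
    ("regional_accent", False), ("regional_accent", True), ("regional_accent", False),
    ("speech_disability", False), ("speech_disability", False), ("speech_disability", True),
    ("baseline", True), ("baseline", True), ("baseline", True), ("baseline", False),
]

totals, successes = defaultdict(int), defaultdict(int)
for group, succeeded in attempts:
    totals[group] += 1
    successes[group] += succeeded

overall = sum(successes.values()) / sum(totals.values())
print(f"overall success rate: {overall:.0%}")
for group in sorted(totals):
    rate = successes[group] / totals[group]
    print(f"{group:18s} {rate:.0%} ({rate - overall:+.0%} vs overall)")
```

The arithmetic is trivial; the point is that an overall rate can look healthy while specific groups fail most of the time, and disaggregating is what makes that visible.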
Pillar 4: Take Action
- Run real audits: Bias and impact audits on your models, not just internal slide decks (a minimal example follows this list).
- Redesign for real users: Forms, flows, and interfaces that reflect your actual users, especially those historically excluded.
- Design with communities: Co-creation sessions, community councils, feedback loops that actually change decisions.
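If you are starting the first audit named above, one place to begin is a single, simple metric: the false positive rate for each demographic group and the disparity ratio between the best- and worst-served groups, the same style of comparison behind NIST’s “10 to 100 times” finding. Everything below is placeholder data for illustration; a real audit uses production predictions, covers more metrics, and is reviewed with the communities affected.

```python
# Illustrative audit metric: false positive rate by group and the disparity ratio.
# All records below are placeholder data, not output from any real system.

# Each record: (group, ground-truth match?, model predicted a match?)
records = [
    ("group_a", False, True), ("group_a", False, False), ("group_a", False, True),
    ("group_a", True, True), ("group_a", False, False),
    ("group_b", False, False), ("group_b", False, False), ("group_b", False, True),
    ("group_b", True, True), ("group_b", False, False),
]

def false_positive_rate(rows):
    # Among true non-matches, how often did the model still predict a match?
    negatives = [predicted for _, actual, predicted in rows if not actual]
    return sum(negatives) / len(negatives) if negatives else float("nan")

groups = sorted({group for group, _, _ in records})
rates = {g: false_positive_rate([r for r in records if r[0] == g]) for g in groups}

for group, rate in rates.items():
    print(f"{group}: false positive rate = {rate:.0%}")
print(f"disparity ratio (worst / best): {max(rates.values()) / min(rates.values()):.1f}x")
```

A ratio far from 1 means the system is not “objective”; it is failing specific groups, and the number gives you something concrete to put in front of leadership and to track after every fix.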
Proof That Coding-In Works
From Margins to Multipliers
Voiceitt + Webex
Real-time captioning for atypical speech patterns meant managers finally heard ideas from team members who had been silent. Teams saw measurable productivity gains.
Microsoft Adaptive Controller
Created with disabled gamers, it unlocked play for players who had been shut out of gaming and sparked new ways for families and creators to interact around games.
Project Euphonia
By centering people with atypical speech—ALS, Parkinson’s, stroke survivors—Google’s voice recognition became more accurate for everyone.
Fintech credit equity
Bias audits and retraining models to include excluded borrowers increased approval rates, reduced default rates, and improved customer retention across India, Kenya, and Nigeria.
Every time we treat “edge cases” as design constraints instead of afterthoughts, we see the same pattern: higher trust, better outcomes, and real business value. From margins to multipliers.
What Do We Do Now?
Questions for Action
- What is one system in your world—at work, in your city, in your product—that clearly codes people out? How would you redesign it to be inclusive by default?
- Which dataset, model, or process in your org has never been audited for exclusionary patterns? What would it take to change that this quarter?
- Who are your so-called “edge cases”? What might happen if you centered them for your next product cycle instead of designing around them?
Robert Williams is still fighting for justice. So are Michael Oliver and Nijeer Parks. They’re fighting not just for themselves, but for every Black person who might be the next victim of an algorithm that can’t see us.
This Black History Month, honoring our ancestors means extending the fight for justice into the digital age. You’ve been coded out. My work is about helping you—and the systems around you—get coded back in.
