
The Pitfalls of Adversarial Clothing

Dr. Dédé Tetsubayashi · 7 min read

When I present on panels about equitable and inclusive design, there are two areas I emphasize. As both a social scientist and a tech ethicist, I believe these are the areas where we, as humans, have the greatest opportunity to bring about transformative change.

The first, and most fundamental, tool in our arsenal is the call-in. The call-in is the seed from which the best accessible, equitable, and inclusive products and processes take shape. Who am I designing this for? Who am I designing it with? If the answers are not one and the same, we must go back and begin again.

The second area is anticipating harm—thinking proactively about how technology might be misused or might disproportionately impact certain communities. This is where adversarial clothing enters the conversation, and where I have significant concerns.

The Problem with Individual Solutions

Adversarial clothing—garments designed to confuse facial recognition systems—represents an individual solution to a systemic problem. These clothes use patterns that disrupt how AI systems identify human forms and faces. On the surface, this seems like an empowering response to surveillance overreach.
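For readers curious about the mechanics, the patterns on these garments are typically found by gradient-based optimization: a printable patch is nudged, pixel by pixel, until a detector's confidence that it sees a person drops. The sketch below is a toy illustration of that loop, not a working attack; the detector is a deliberately untrained stand-in network, and the image, patch size, and placement are all hypothetical.

```python
# Toy sketch of adversarial-patch optimization. The "detector" here is an
# untrained stand-in network (hypothetical), not a real surveillance model.
import torch
import torch.nn as nn

torch.manual_seed(0)

# Stand-in detector: maps an image to a single "person present" score.
detector = nn.Sequential(
    nn.Conv2d(3, 8, 3, stride=2, padding=1), nn.ReLU(),
    nn.Conv2d(8, 16, 3, stride=2, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(16, 1),
)
for p in detector.parameters():
    p.requires_grad_(False)  # the attack only updates the patch

image = torch.rand(1, 3, 64, 64)                      # stand-in photo
patch = torch.rand(1, 3, 24, 24, requires_grad=True)  # the "fabric" pattern
opt = torch.optim.Adam([patch], lr=0.05)

for step in range(200):
    patched = image.clone()
    patched[:, :, 20:44, 20:44] = patch.clamp(0, 1)   # "wear" the patch
    loss = detector(patched).mean()                   # detection confidence
    opt.zero_grad()
    loss.backward()                                   # gradient w.r.t. patch
    opt.step()                                        # push the score down

print("final detection score:", detector(patched).item())
```

A real pattern must also survive printing, fabric folds, lighting, and camera angle, which is precisely why its effectiveness is so uneven in practice.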

But it places the burden of evading surveillance on the very people most likely to be harmed by it. Black and brown communities, who already face disproportionate surveillance and policing, are now being asked to purchase specialized clothing to protect themselves from technology that shouldn't be deployed against them in the first place.

Moreover, adversarial clothing doesn't work equally for everyone. The effectiveness of these techniques varies based on body type, skin tone, and environmental conditions. Those with more resources have better access to the most effective options. When we offer individual technical fixes for structural problems, we often deepen existing inequalities.

The Commodification of Resistance

There's something deeply troubling about turning resistance to surveillance into a consumer product. Companies are now profiting from selling protection against systems that they—or companies like them—helped create. This is the logic of late capitalism: create the problem, then sell the solution.

Meanwhile, those who can't afford adversarial clothing remain vulnerable. Privacy becomes a luxury good, available only to those who can pay for it. This isn't liberation—it's a two-tiered system where the wealthy can buy their way out of surveillance while everyone else remains trapped.

Systemic Change Required

The solution to harmful surveillance isn't better camouflage; it's not building harmful surveillance systems in the first place. We need to shift our focus from helping individuals evade bad systems to preventing those systems from being deployed at all.

This means advocating for laws that restrict facial recognition technology. It means holding companies accountable for biased AI systems. It means demanding transparency about how surveillance technologies are used and against whom. The fight for privacy shouldn't require a new wardrobe—it should require a new relationship between technology and justice.

About Dr. Dédé Tetsubayashi

Dr. Dédé is a global advisor on AI governance, disability innovation, and inclusive technology strategy. She helps organizations navigate the intersection of AI regulation, accessibility, and responsible innovation.
