Nothing About Us Without Us: A Disability Justice Framework for Artificial Intelligence
What the AI Conversation is Missing and How to Build the Future We Actually Want
AI has genuinely changed things. For a lot of people building and creating with it right now (including me), it has opened up new ways of thinking, working, and making that did not exist before. That’s very real. And so is the unease. The sense that the world being built with these tools is not quite the one we would choose if we stopped to ask the question. Or that the people deciding what AI optimizes for, whose needs it centers, and what counts as a good outcome are a remarkably small and similar group.
The call for the humanities in AI has been one response to that feeling. Philosophers, ethicists, sociologists, historians, linguists. Their perspectives matter greatly and have been largely absent from a conversation dominated by engineers and investors. But academic expertise and lived experience are not the same thing. Knowing how to analyze a system is different from knowing what it means to survive one. People with that lived experience are still almost entirely absent from the rooms where AI is being shaped, even though they have the deepest knowledge of what it looks like when systems fail, when infrastructure excludes, and when the gap between what a policy promises and what it delivers lands on your body and your family. Disability justice is a framework built by exactly those people. And it has more to offer this moment than most of us have begun to realize.
What Is Disability Justice?
Disability justice is a framework developed in 2005 by the Disability Justice Collective, a group including Patty Berne, Mia Mingus, Stacey Milbern, Leroy F. Moore Jr., and Eli Clare, organized under Sins Invalid. It centers multiply marginalized disabled people and arose directly from the exclusion they felt from the disability rights movement, which at the time focused heavily on the experiences of straight white men with physical disabilities.
Disability justice centers the voices of those who experience exclusion, regularly and in many ways. It centers Black, Brown, and Indigenous disabled people. Queer, houseless, and incarcerated disabled people. Immigrants with disabilities. People whose disabilities intersect with every other axis of marginalization. Their lives are not just harder, but categorically different in ways that the dominant system was never designed to see. According to Project LETS, a national disability justice grassroots organization, disability justice “recognizes the intersecting legacies of white supremacy, colonial capitalism, gendered oppression and ableism in understanding how people’s bodies and minds are labelled deviant, unproductive, disposable and/or invalid.”
You’ve probably heard about the disability rights movement, but it differs significantly from disability justice. Disability rights seeks to accommodate more people within the existing system. Disability justice asks whether the system should even exist in its current form. Mia Mingus, one of the framework’s founding thinkers, gave remarks at the 2017 Paul K. Longmore Lecture on Disability Studies and invoked a metaphor attributed to the late philosopher Grace Lee Boggs: gaining access to a burning house and doing nothing to change the situation is not liberation. It’s a continued struggle in a failing environment. That distinction between inclusion in an unjust system and the transformation of that system is the core of disability justice.
Sins Invalid identified ten principles of disability justice that operationalize this framework: intersectionality, leadership of those most impacted, anti-capitalist politic, commitment to cross-movement organizing, commitment to cross-disability solidarity, interdependence, collective access, recognizing wholeness, sustainability, and collective liberation. The follow-up to this essay will apply each principle directly to the AI conversation.
Where I’m Standing
It’s important I situate myself before going further. I am neurodivergent. You may be as well (there seems to be a large segment of neurodivergent people building with AI). I am also a white woman, relatively privileged, living in the United States, able to work and supplemented financially by my partner, a native English speaker who benefits from the system in which I live. I don’t have to think about disability justice if I don’t want to. But my life has made it impossible not to.
I have an aunt with Down Syndrome who lived with my family growing up. She was born 15 years before the Americans with Disabilities Act was passed, so my grandparents dedicated their lives to building a more inclusive world for her. After college I worked as an employment consultant for people with intellectual and developmental disabilities, quickly learning how privileged my family was within the disability world. I worked with immigrant families where my clients were the translators for their families in everyday life, navigating systems that were never designed with them in mind. I worked with Indigenous clients whose tribal community was already taking care of them fully, generously, and in ways the system never seemed to replicate. And yet the state still required formal services, monthly check-ins, and documentation. I worked with clients who had been institutionalized, and then watched life outside become a different form of institutionalization. And with families navigating housing, benefits, education, and employment simultaneously, who fell through many cracks.
I’ve seen the system work adequately (though still far from perfectly) for families that look like mine, and fail partially or completely for everyone else. That gap is what disability justice is about. This framework is not a common worldview when you’re working inside the system just trying to do your job. I had to seek it out. I’m definitely not an expert. But I’ve applied it to my thinking about AI, and I’m convinced we would build a better future if more people did the same.
What We’re Dealing With
According to a 2024 CDC report, 28.7% of American adults have a disability. Even more have a close family member or friend with one. And now we’re facing technology that will touch every single one of our lives, if it hasn’t already. The question of how AI impacts the largest minority group in the United States, and globally, is not a niche concern. It should be one of the central questions we’re asking.
AI is genuinely transformative. The scale of what’s becoming possible is real. But instead of pausing to ask what kind of world we want to build, we are accelerating the world we already live in — with the same hierarchies, the same exclusions, the same violence. Just faster and at scale. Ruha Benjamin, a sociologist at Princeton, calls this the “New Jim Code.” Technology that presents itself as neutral or progressive can actually deepen existing systems of oppression. She writes primarily about race, but the analysis applies directly to disability.
In March of 2025, the ACLU filed a complaint with the Colorado Civil Rights Division and the U.S. Equal Employment Opportunity Commission on behalf of a Deaf and Indigenous Intuit employee. The complaint claims that the employee was required to use an AI-powered hiring assessment tool made by HireVue when she applied for a promotion, even though her manager recommended her. According to the complaint, she requested and was denied human-generated captioning, meaning she had to rely on spotty AI-generated captions. She was rejected for the promotion due to her communication style. In December 2025, Donald Trump signed an executive order aimed at halting laws limiting artificial intelligence and creating a task force to challenge state laws like Colorado’s, which required employers to conduct risk assessments for algorithmic discrimination in hiring and take precautions against it.
In 2016, Arkansas used an algorithm to determine care hours for the Medicaid home- and community-based services (HCBS) program. Nearly half of the beneficiaries experienced unexpected and dramatic cuts. In 2019, the state replaced the algorithm with a new assessment tool that deemed over 25% of those in the HCBS program ineligible for care. Both of these scenarios led to unnecessary suffering. In 2025, Congress passed Donald Trump’s funding bill, which slashes almost $1 trillion from Medicaid over the next decade and implements a work requirement for eligibility. The requirement applies to healthy adults, with exemptions for those with special medical needs. But disabled people, disability advocates, and state Medicaid officials argue that deciding who is eligible is tricky. For example, the process to grant “medically frail” status is not universal across states, and some people who seem healthy now would not be if required to work. These changes will lead to even more people losing benefits, increase surveillance of those who remain enrolled (eligibility checks are required every six months), and inevitably worsen health outcomes across the country. Arkansas soft launches its work requirement this summer.
According to a 2024 report from the Center for Law and Social Policy, the rise in surveillance technology used in schools “blurs the boundary between the schoolhouse and the jailhouse by providing a digital infrastructure” that increases the role of law enforcement, including ICE, in the lives of marginalized youth. Schools across the United States have implemented device monitoring, vape detection, social media surveillance, and facial recognition among other surveillance techniques, systems, and technologies. And we know that Black, Brown, and disabled students are the most impacted. Disability justice asks us to acknowledge the reality that disabled kids of color live in, then dismantle it. Artificial Intelligence in its current form entrenches some students into the school-to-prison pipeline, perpetuating the world we’ve created and the biases we’ve built into our systems. The calls for increased surveillance argue it’s for child safety, but the question becomes which children and at what cost.
These are not isolated incidents. They are the predictable outcome of building systems that encode existing hierarchies and call it progress. Most people building with AI right now aren’t trying to cause harm, but good intentions without new frameworks will lead to the same outcomes. Just faster. The promise of transformation can mask the reproduction of harm if we let it.
What We’re Up Against
There’s a lot of money and momentum keeping us on this path. AI companies and international governments have created a global AI arms race at all of our expense. CEOs need to prove their worth, acquire funding, make their models bigger, faster, stronger. They need to win, allegedly in the name of national security. Moving cautiously is explicitly framed as irresponsible by the people with the most power in this moment.
On the disability side, we’re up against the charity and medical models, which are frameworks that position disabled people as recipients of help rather than architects of their own lives. These models are built into policy, benefit systems, and the design of assistive technology. They keep the people with the most insight into navigating broken systems out of the rooms where decisions get made.
There’s a common phrase in disability activism — nothing about us without us. The phrase was popularized by disability activist James Charlton when he published his 1998 book by the same name. He argues that disability oppression is rooted in powerlessness and dependency, with the only remedy being self-representation and self-determination. It has since been adopted by the UN Convention on the Rights of Persons with Disabilities. Building superintelligence the way it’s currently being built violates that principle at scale. But this isn’t only about the people at the top of the hierarchy. Even those of us with good intentions tend to build for the most comfortable, most legible, most profitable user rather than toward the margins where the real design challenges live.
A Different Way Forward
Disability justice is a critique of our current system, but it is also a tradition of imagination. Disabled people, particularly disabled people of color, have been building visions of different worlds out of necessity for a long time. That imaginative work is exactly what this moment needs.
Mia Mingus developed the concept of Access Intimacy to describe that rare feeling when someone simply understands your access needs without you having to over-explain or justify yourself. Most disabled people only experience this in their closest relationships. In the same 2017 speech at the Paul K. Longmore Lecture on Disability Studies, Mingus described Access Intimacy as “something that can transform ordinary access into a tool for liberation, instead of merely reinforcing ‘inclusion’ and ‘equality.’” What if AI could extend that quality of being understood more broadly? Not to replace human connection, but to make that quality of care more available. A world where you don’t have to translate yourself into the system’s terms. Where infrastructure holds you without interrogation.
There are already people for whom technology and body are not separate — where that integration is simply how they move through the world. People who use electric wheelchairs, AAC devices, eye gaze technology, pacemakers, prosthetic limbs. People for whom coexisting with technology is not a future scenario but a present reality. Some even identify as cyborgs. Disabled people have been navigating this terrain for years, yet they are largely absent from the conversations deciding how the rest of us make this transition.
In 1985, feminist theorist Donna Haraway published A Cyborg Manifesto, using the figure of the cyborg — part human, part machine — to argue that the merging of technology and body could go two ways. It could become another mechanism for controlling and diminishing the oppressed, or it could become genuinely liberatory. Which direction it goes depends entirely on who shapes it and toward what end. Alison Kafer, in Feminist, Queer, Crip, extends that framework to ask whose technological future gets imagined and whose gets left out. One answer is that technology rushes to fix disabled people. To make Deaf people hear. To make autistic people less autistic. To develop stair-climbing exoskeletons. To eliminate difference rather than build for it. But that assumes the problem is the person, not the world. And it assumes disabled people want to be fixed, which is not ours to decide.
The technology we develop will not build a world different from the one we can imagine. If we can only imagine fixing people, that’s what we’ll build. If we can imagine a world that works for everyone on their own terms, we might build that instead.
What could AI look like if it started there? A world where AI helps autistic people articulate their thoughts exactly as they thought them. Where Deaf people can communicate in their own language through AI-assisted tools without being pushed toward interventions they didn’t ask for. Where tools are about fulfillment, not just survival. Where the question isn’t how do we make this work for more people but what becomes possible when we let the people the system has most completely failed lead.
And we have to be genuinely open to what that leadership might tell us. Sometimes the answer will be that AI is not the right tool at all. The more urgent work might be cultural, not technological. If Deaf people tell us overwhelmingly that what would make them feel more included is teaching sign language in every elementary school, the solution isn’t an app. That is a message we need to hear and act on. Disability justice does not guarantee that AI is always the answer. It guarantees that we are asking the right people the question.
Here’s what it comes down to: AI could extend the capacity to know and be known. Or it could become another system you have to translate yourself for. Another system that surveils and perpetuates harmful ideologies. A system that only makes life easier for the people around the disabled person doesn’t change much for them. The direction we go depends entirely on the conversations we have right now and the people who build the tech.
Why Disability Justice
Disability justice is political. It has consequences that reach into every part of life. And disabled people are already experts at navigating systems that get it wrong because they have had no other choice. Disability justice is knowledge built from surviving hostile infrastructure. It tells us where to start: with the people most marginalized, most surveilled, most excluded from the rooms where decisions are made.
Disability justice is a heavy topic to sit with. Confronting who the world excludes (and how deliberately) is not comfortable. But the world disability justice imagines is not gloomy. It is a world where disability is not a barrier to joy, community, belonging, and fulfillment. It is generative and specific and already being built at the margins by people who had no other choice. And it turns out that world looks a lot like the one many of us say we want from AI. Disability justice is the only framework that refuses to leave anyone out in the pursuit of it.
In my next essay I’ll go through each of the ten Sins Invalid principles and apply them directly to AI. In the meantime, take thirty minutes. Read something written by a disability justice activist. Look into Sins Invalid. Read Mia Mingus’ blog Leaving Evidence. Listen to Leah Lakshmi Piepzna-Samarasinha talk about their book The Future is Disabled: Prophecies, Love Notes, and Mourning Songs. Browse the Disability Visibility Project website. Let yourself be surprised. It may open your eyes to another way of existing — and change how you build going forward.
Sources and Further Reading:
The Future Is Disabled Envisions a Time “Where the World Has Been Cripped” by Sarah Nielson
Disability activist Alice Wong writes about the past and futurism by Jeneé Darden
Access Intimacy: The Missing Link by Mia Mingus
Cyborg identity: what can MedTech learn from the disability community?
Nothing about us without us — Stimpunks
Models of Disability: An Overview — Mobility International USA
Black Students Are Being Watched Under AI — and They Know It by Quintessa Williams
Arkansas to “soft launch” upcoming Medicaid work requirement checks by Ainsley Platt
Arkansas Medicaid Home and Community Based Services Hours Cut — Benefits Tech Advocacy Hub
ACLU Sues Intuit and HireVue Over Discriminatory AI Interviewing Practices — nquiringminds
Disability Impacts All of Us Infographic — Centers for Disease Control and Prevention
10 Principles of Disability Justice — Sins Invalid