
AI Is Building the Web Backwards
The first long-playing record wasn't made for music. It was made for a blind person who wanted to read a book.
In the 1930s, the American Foundation for the Blind lobbied Congress to fund recorded books for people who couldn't see print. The Pratt-Smoot Act passed in 1931. The Library of Congress began pressing audio recordings of printed works.
The format they needed, a disc that could hold more than three minutes of audio, didn't exist yet. So they built it. The long-playing record, the LP, the format that would later carry Miles Davis and the Beatles, was born as an assistive technology for blind readers.
The typewriter has a similar origin. In 1808, Italian inventor Pellegrino Turri built one of the first working typewriters for Countess Carolina Fantoni da Fivizzano, who was blind and wanted to write letters without a secretary reading her private correspondence. Turri also invented carbon paper to provide ink for the device.
Closed captions were mandated for deaf viewers. Today, 80% of people who use captions are not deaf or hard of hearing. Seventy percent of Gen Z watches video with captions on by default. The feature designed for a specific disability became the way most young people experience media.
In 1989, Sam Farber watched his wife Betsey struggle with a vegetable peeler in a rented house in southern France. Betsey had arthritis. The thin metal handle was painful to grip.
Farber hired the design firm Smart Design and told them to build a peeler that someone with arthritis could use comfortably. The result, OXO Good Grips, is now a $750 million company. The flagship peeler sits in MoMA's permanent collection.
A product designed for a specific disability turned out to be the product everyone preferred.
Angela Glover Blackwell gave this pattern a name in a 2017 article in the Stanford Social Innovation Review. She called it "the curb cut effect." In the early 1970s, disability activists in Berkeley, California, including Ed Roberts and a group known as the Rolling Quads, fought for curb cuts on every street corner. The city built the first planned wheelchair-accessible route in the United States. Parents with strollers used the ramps. Delivery workers with hand trucks. Business travelers with rolling luggage. Skateboarders. Runners.
The accommodation designed for the few became infrastructure used by everyone.
That's the inclusion dividend. Designing for the edge improves the center. It has held true for centuries, across technologies, industries, and continents.
AI-assisted development is running this pattern in reverse.
94.8% and Counting
The web was already broken before AI started writing code for it.
WebAIM's 2025 analysis of the top one million home pages found that 94.8% had detectable WCAG failures. The average page contained 51 accessibility errors. The six most common problems (low-contrast text, missing alt text, missing form labels, empty links, empty buttons, and missing language declarations) accounted for 96% of all detected errors.
These aren't obscure edge cases. Missing alt text means a screen reader user encounters an image and hears nothing. An empty button means a keyboard-only user tabs to a control and has no idea what it does. A missing form label means a blind person filling out an application is guessing which field they're in.
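Three of those failures are mechanically detectable from markup alone. Here is a minimal sketch using only Python's standard library; real audits use engines like axe-core that cover far more rules, and the label heuristic below is a rough proxy (a proper check would also resolve `<label for="...">` associations):

```python
from html.parser import HTMLParser

class A11yChecker(HTMLParser):
    """Flags three of the most common WCAG failures:
    images without alt text, empty buttons, and unlabeled inputs."""

    def __init__(self):
        super().__init__()
        self.errors = []
        self._open_button = None  # [has_aria_label, text_so_far]

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "img" and "alt" not in attrs:
            self.errors.append("img missing alt attribute")
        if tag == "input" and attrs.get("type") not in ("hidden", "submit", "button"):
            # Rough proxy: treat an aria-label or an id (which a <label for>
            # could reference) as evidence of a label association.
            if "aria-label" not in attrs and "id" not in attrs:
                self.errors.append("input with no label association")
        if tag == "button":
            self._open_button = [bool(attrs.get("aria-label")), ""]

    def handle_data(self, data):
        if self._open_button is not None:
            self._open_button[1] += data

    def handle_endtag(self, tag):
        if tag == "button" and self._open_button is not None:
            named, text = self._open_button
            if not named and not text.strip():
                self.errors.append("empty button")
            self._open_button = None

checker = A11yChecker()
checker.feed('<img src="chart.png"><button></button><input type="text">')
print(checker.errors)  # all three failures are flagged
```

The point of the sketch is how little it takes: these are not subtle bugs requiring deep expertise to catch, yet they dominate the error counts on a million home pages.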
Ninety-four point eight percent. That's the training data.
When an AI coding assistant generates a web interface, it draws on patterns learned from the web as it exists. And the web as it exists fails accessibility standards almost everywhere. The models didn't learn accessible code because accessible code is the exception, not the norm.
Researchers at Carnegie Mellon published a study at CHI 2025 called CodeA11y, examining what happens when developers use GitHub Copilot to build web interfaces. They found three barriers.
First, developers had to explicitly prompt for accessible code. If they didn't already know accessibility mattered, Copilot didn't mention it. The assistant generated what was asked for, and what was asked for rarely included accessibility.
Second, developers skipped critical manual steps. Copilot would suggest placeholder alt text for images. Developers accepted the placeholder and moved on.
The code looked complete. The alt text was meaningless.
Third, developers couldn't verify whether the output was actually accessible. They lacked the expertise to evaluate what Copilot produced. The code compiled, and the page rendered. It looked fine on their screen.
Whether someone using a screen reader could navigate it was a question nobody asked and nobody could answer.
The Ghost in the Training Data
The problem goes deeper than output quality. The tools themselves are inaccessible.
A second CHI 2025 study examined how AI coding assistants affect developers who are visually impaired. Not the people using the code, but the people writing it.
The researchers found that code completion suggestions appear as "grayed-out ghost text" with no controls to inspect them. A sighted developer sees a suggestion appear and accepts or rejects it with a glance. A screen reader user encounters a phantom. Something has been added to their editor, but there's no reliable way to review it, and dismissing it can trigger unintended edits.
The visually impaired developers reported cognitive overload from the volume of suggestions. They wanted "AI timeouts." The tool designed to make coding faster was making coding harder for the people who already faced barriers.
The ASSETS 2025 conference published an evaluation of three widely used conversational programming tools, including Replit and Cursor. All suffered from poor keyboard interactions, broken focus management, and insufficient feedback for assistive technology users. The vibe coding platforms that let anyone build software are, by default, excluding the developers who navigate by keyboard and screen reader.
Vic Kostrzewski, an accessibility specialist, catalogued ten reasons why vibe coding threatens digital accessibility. His sharpest point is that AI code is "trained on inaccessible material." The models absorbed a web where 94.8% of pages fail WCAG. They produce what they learned.
The bias is structural, not incidental.
A subtler problem is that vibe coding changes the incentive structure around quality. When generating code is nearly free, the pressure to validate it drops. If accessibility testing was already the first thing teams skipped under deadline pressure, vibe coding gives them a new justification. The AI probably handled it.
The AI did not handle it.
The Thirty-to-One Rule
There's a rule of thumb in accessibility work that carries the weight of decades of institutional pain: fixing an accessibility bug in testing costs thirty times more than building accessibility in from the start.
Thirty to one. That's the cost of remediation when an organization builds first and accommodates later.
New designs can include semantic HTML, proper heading structures, and ARIA labels from the beginning at near-zero marginal cost. Retrofitting those into an existing codebase means auditing every component, rewriting interaction patterns, and regression testing the entire application.
The Department of Veterans Affairs is a case study in what accumulation looks like. A 2021 VA report to Congress found that only approximately 9% of the VA's internet sites were fully Section 508 compliant. A July 2025 audit by the VA's Office of Inspector General reviewed 30 critical IT systems and found that only four met accessibility standards. A blind social worker at a VA medical center in Oregon filed a lawsuit claiming the Oracle Cerner electronic health record system was inaccessible to disabled clinicians and veterans.
The VA serves 9 million enrolled veterans. Its systems are used by medical professionals making treatment decisions. Eighty-seven percent of the IT systems those professionals depend on don't meet basic accessibility requirements.
The FTC offered another data point in January 2025, when it fined accessiBe $1 million for claiming its AI-powered widget could make any website WCAG-compliant. The company had sold an automated overlay that promised accessibility without manual work. The FTC found the claims were false. The order bars accessiBe from repeating them for 20 years.
An AI tool that promised to fix accessibility through automation. The government's conclusion: it doesn't work.
Every vibe-coded product that ships without accessibility validation is a loan taken at thirty-to-one interest. The debt is accumulating faster than it ever has. And unlike technical debt, which degrades performance for everyone gradually, accessibility debt excludes specific people completely and immediately.
The Bill Comes Due
In the first half of 2025, 2,014 federal ADA lawsuits were filed over digital accessibility. That's a 37% increase over the same period in 2024. Projections for the full year approach 5,000 cases.
Illinois saw a 745% increase. Plaintiffs are increasingly using AI to draft complaints and identify violations. The same technology accelerating inaccessible development is accelerating the legal response.
In 2016, Guillermo Robles, a blind man in California, sued Domino's Pizza because neither its website nor its mobile app worked with his screen reader. He couldn't order a pizza. The case went to the Ninth Circuit, which ruled the ADA applies to digital content. The Supreme Court declined to hear Domino's appeal. The case settled in 2022.
In 2006, the National Federation of the Blind sued Target because Target.com lacked alt text, had inaccessible navigation, and couldn't be used with assistive technology. Target settled for $6 million in damages plus $3.7 million in legal fees.
The regulatory walls are closing from multiple directions. In April 2024, the Department of Justice published a final rule requiring state and local governments to make web content and mobile apps meet WCAG 2.1, Level AA. Entities serving populations of 50,000 or more must comply by April 2026. The European Accessibility Act began enforcement in June 2025, covering websites, mobile apps, e-commerce, banking, and transport services across the EU. Penalties reach up to 4% of annual revenue.
The World Health Organization reports 1.3 billion people globally experience significant disability. The CDC found 28.7% of U.S. adults, more than 70 million people, have a disability. Nearly 44% of people over 65 report one.
A quarter of the adult population. Excluded by products built too fast to notice.
The Curb Cut in Reverse
In 1985, an architect named Ronald Mace gave a name to something that had been practiced for decades without a label. He called it "universal design": "the design of products and environments to be usable by all people, to the greatest extent possible, without the need for adaptation or specialized design."
Mace had contracted polio at age nine and used a wheelchair for the rest of his life. He studied architecture at NC State, where inaccessible campus buildings were a daily obstacle. In 1997, he led the working group that codified the seven principles of universal design: equitable use, flexibility, simplicity, perceptible information, tolerance for error, low physical effort, and appropriate size and space.
Those principles read like a usability checklist for any well-designed product. They were written by and for people who had been excluded from poorly designed ones.
If you're shipping AI-generated code, the practical version of universal design starts with the prompt. Include accessibility upfront, not as an afterthought. Add a linter, wire accessibility checks into your CI pipeline, and test with a keyboard before anything ships. Run a screen reader on your core flows. Build it into your components once so every feature inherits it automatically. The CHI 2025 research showed that accessibility-aware AI tools produce significantly better code than the defaults. The default is the problem. Change the default.
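Wiring a check into CI can be as simple as a script that exits nonzero on violations. A minimal sketch of one such gate, covering a single structural rule (heading levels must not skip); the function name and regex approach are illustrative assumptions, not a substitute for a full engine like axe-core:

```python
import re

def heading_violations(html: str) -> list[str]:
    """Check one WCAG-relevant structural rule: a page should have
    exactly one <h1>, and heading levels should not skip
    (h1 followed by h3 with no h2 is a failure)."""
    levels = [int(m) for m in re.findall(r"<h([1-6])[\s>]", html, re.IGNORECASE)]
    problems = []
    if levels.count(1) != 1:
        problems.append(f"expected exactly one <h1>, found {levels.count(1)}")
    for prev, cur in zip(levels, levels[1:]):
        if cur > prev + 1:
            problems.append(f"heading skips from h{prev} to h{cur}")
    return problems

page = "<h1>Docs</h1><h3>Setup</h3>"  # h2 is missing
problems = heading_violations(page)
print(problems)  # flags the h1 -> h3 skip
# In CI, fail the build on any violation:
# raise SystemExit(1) if problems else None
```

A check like this runs in milliseconds on every commit, which is the thirty-to-one arithmetic in practice: the violation is caught before it ships instead of remediated after.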
The curb cut story, the LP story, the OXO story, the caption story. Every time someone designs for the edge, the center benefits. Every time someone designs for the center and ignores the edge, they accumulate a debt that compounds.
AI is the fastest edge-ignoring tool ever built. It generates from the statistical center of its training data. It doesn't know who's missing. It doesn't ask.
It produces what looks right to the people who can see the screen, hear the audio, use a mouse, and read the text at the font size it chose.
The question facing every team shipping AI-generated interfaces is not whether they can build faster.
It's whether they're building something anyone can use.