
Little Boxes Made of Ticky-Tacky
In 1947, a builder named William Levitt bought 4,000 acres of potato farmland on Long Island and started constructing houses. One house, repeated 17,447 times.
Each home had the same floor plan. Same kitchen. Same two bedrooms. Same 12-by-16-foot living room with a built-in television. Levitt's crews could finish one every 16 minutes during peak production. Families moved in as fast as the paint dried.
Levittown solved a real problem. Millions of GIs had come home from the war to a country with almost no available housing. Levitt gave them something affordable, functional, and genuinely better than what existed.
The homes were a hit. The waiting list was enormous.
They were also all identical.
Critic Lewis Mumford looked at the development and saw "a multitude of uniform, unidentifiable houses, lined up inflexibly, at uniform distances on uniform roads." Folk singer Malvina Reynolds drove past a similar development in Daly City, California, and wrote a song that captured it in 90 seconds. Little boxes on the hillside, little boxes made of ticky-tacky, little boxes, all the same.
A university professor told Time magazine in 1964 that he'd been lecturing about middle-class conformity for an entire semester. Reynolds' song said it all in a minute and a half.
AI-generated interfaces are having their Levittown moment.
The Statistical Average of Everything
The tools are trained on the same datasets. They've ingested the same design systems, the same Dribbble shots, the same landing page templates. When you ask one to generate a dashboard, it produces the statistical average of every dashboard it has ever seen.
The output is competent. It is also exactly what everyone else is getting.
This is how convergence works. It arrives as convenience. Each individual output looks fine. The sameness only becomes visible when you step back and see the street. The same card layouts, the same hero sections, the same tasteful sans-serif type, the same rounded corners at the same radius, stretching to the horizon like potato fields that became subdivisions.
The Nielsen Norman Group calls it "synthetic genericism." Seventy-five percent of firms have adopted generative AI tools. The result is a flood of interfaces that are polished in identical ways, the way Levittown houses were polished in identical ways. Clean lines, neat lawns, built-in televisions. Professional from a distance.
That qualifier matters. From a distance.
Zoom In
From a distance, a Levittown house looked fine. From a distance, every AI-generated landing page looks professional. Clean typography, balanced whitespace, tasteful color palette.
The problem shows up when someone actually has to live there.
Consider a medical records system. It serves doctors, nurses, billing staff, and patients. Each group has needs that conflict with the others. Doctors need speed. Patients need clarity. Billing staff need precision. The system must comply with HIPAA, support accessibility requirements that go beyond standard web guidelines, and display data dense enough to inform clinical decisions without overwhelming the person making them.
An AI can generate something that looks like healthcare software. Clean cards, a sidebar, a dashboard with charts. From a distance, it checks every box.
Zoom in. The contrast ratios fail a clinician scanning results at 3 AM under fluorescent light. The navigation assumes a linear workflow when clinical workflows branch constantly. The data display looks elegant with five entries and becomes unusable with three hundred. The interface was designed for a screenshot, and screenshots don't have patients.
This is where the Levittown metaphor sharpens. Levitt's houses worked for a young couple with one kid and a car. They worked less well for a family of six, or a household where someone used a wheelchair, or anyone whose life didn't match the floor plan's assumptions about what a life looked like. The house was fine. The house just didn't know anything about the people inside it.
Generic interfaces have the same problem. They're built for the average user, and the average user doesn't exist. Every real person using a real product in a real context has specific needs that the composite flattens out. The interface works from a distance. Up close, where people actually live, the seams show.
The Polarization
Fast fashion shows where this trajectory leads.
Shein can produce a new garment in ten days. The industry churns through 52 micro-seasons per year. Prices dropped, speed increased, and everything started looking the same.
The result was polarization. The middle of the market collapsed. Brands that stood for something specific, that carried a point of view you could articulate in a sentence, became more valuable precisely because the generic middle became so crowded that standing in it meant standing nowhere. Patagonia nearly tripled its revenue during the decade when fast fashion was eating everything around it. The garments that won weren't cheaper or faster. They were identifiable.
The same reflex is surfacing in tech. Merriam-Webster named "slop" its 2025 Word of the Year. Reddit's CEO built his entire investor narrative around being "the most human place on the internet." When a company worth $20 billion positions itself against AI-generated content in an earnings call, specificity has become a market force.
AI-generated interfaces are following the same arc. Levittown solved the housing crisis. Fast fashion solved the price problem. AI tools are solving the access problem. In each case, the scarcity gets solved first. Then the sameness becomes the problem, and the market starts rewarding the products that look like someone was paying attention.
Being in the Room
If convergence comes from everyone using the same tools on the same data, divergence requires something the tools can't provide. Specific knowledge about a specific context. The kind of knowledge that only comes from being in the room.
An AI can generate a patient intake form. It cannot know that elderly patients at a particular rural clinic tend to arrive with a family member who fills out the form for them, and that the interface needs to account for that second person without creating a confusing permissions model. That knowledge comes from watching people use the thing. Sitting in the waiting room. Noticing that the daughter is holding the phone while the patient answers the questions, and that the form wasn't designed for two people and one screen.
An AI can generate a project management tool. It will produce the composite of every project management tool it has seen, which means it will include every feature that every competitor includes, arranged in the most common arrangement.
Linear, the project management tool that engineering teams have been quietly migrating to, did the opposite. Every feature in Linear reflects specific opinions about how software teams actually work. The keyboard-first interface. The opinionated workflow states. The deliberate absence of features competitors treat as table stakes. None of those decisions would survive the averaging process. They came from a team that watched engineering teams use project tools and formed strong opinions about what was broken.
The difference between Linear and a generic AI-generated project tool is the difference between a Levittown house and a house an architect designed after spending a month with the family who'd live in it. One is a floor plan. The other is a response to a life.
This is the work that resists automation. Knowing that elderly patients arrive with a family member. Knowing that engineering teams think in keyboard shortcuts. Knowing that the clinician at 3 AM needs contrast ratios that the screenshot on Dribbble didn't have to worry about. The training data contains the what. The why lives somewhere else.
Divergence is domain knowledge. It's the accumulation of all the things you learn by being in the room that the model was never in. Every product where the design actually matters, where the interface is carrying strategic weight, is built on a foundation of context that no dataset contains.
One designer described the current moment this way: people lowered their expectations of what design is. First designers themselves, then clients. The bar dropped to "looks professional." AI can clear that bar all day. The bar that matters is "works for these specific people in this specific situation," and clearing it requires knowing something the training data doesn't know.
They All Look Just the Same
Levittown's identical houses didn't stay identical forever. Within a decade, homeowners started renovating. They added rooms, changed facades, planted different gardens. The uniformity was a starting point.
But the renovations required something Levitt's assembly line couldn't provide. Individual judgment about what each family actually needed. Someone had to look at the house and look at the family and understand why the floor plan didn't fit. The assembly line built the box. A person had to make it a home.
For simple applications, marketing sites, and low-stakes tools, the AI-generated interface might be the whole answer. A Levittown house is fine if all you need is a roof and four walls. Millions of families were happy in them. The houses were warm. The televisions worked.
For products where the interface is the experience, where the design is the strategy, where the users have needs that a statistical average will never surface, the box is where you start. It was always where you start.
Little boxes, little boxes, all the same.
Someone has to build the ones that aren't.