[Header image: dense generative botanical forms in copper, verdigris, and moss, branching diagonally from the bottom-left corner.]

Creative Coding as Design Tool

1,256 words · 6 min read

You write a function that places shapes according to some rule. Perlin noise, recursive subdivision, flocking behavior. You run it. The screen fills with something you didn’t draw.

A lot of the time it's garbage, but you expect that.

You change a parameter, run it again. Garbage, garbage, then suddenly something clicks into place and you’re looking at a composition that feels intentional even though no one intended it. This is the happy accident pipeline. You build a machine that produces accidents and then you select the good ones.
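That pipeline fits in a few lines. Here is a minimal sketch in Python, where a toy scoring heuristic stands in for human judgment; the placement rule, the `spread` parameter, and the scoring function are all invented for illustration:

```python
import random

def generate(seed, spread):
    # Hypothetical rule: scatter 50 points with Gaussian jitter.
    rng = random.Random(seed)
    return [(rng.gauss(0, spread), rng.gauss(0, spread)) for _ in range(50)]

def score(points):
    # Stand-in for human judgment: prefer tighter, more centered compositions.
    return -sum(x * x + y * y for x, y in points) / len(points)

# Build a machine that produces accidents, then select the good ones.
candidates = [(score(generate(seed, spread=1.5)), seed) for seed in range(100)]
best_score, best_seed = max(candidates)
```

In practice the scoring step is you, squinting at a hundred renders. The code only automates the accident production.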

The designer’s role shifts in a way that feels unfamiliar at first but ends up being more honest about what design always was. You stop being the person who authors every pixel and start being the person who builds the system and judges its output. Miles Davis didn’t improvise from nothing. The structure of “So What” gave him two chords for 32 bars, and inside those constraints he found melodies that no amount of deliberate composition would have produced. Creative coding works the same way. You set the boundaries, play the parameters, and listen to what the system plays back.

Parameters as Taste

People often think the creative part of generative work is the algorithm. The creative part is the constraints.

Consider a simple particle system. Particles spawn at a point, move outward, fade over time. Everyone’s seen it. But what if you constrain the spawn angle to a 15-degree arc? What if velocity is mapped to a sine wave? What if the fade curve is exponential instead of linear? Every one of those choices is a design decision, and they don’t feel like design decisions because you’re typing numbers into a function call while your designer friends are dragging handles in Figma. But that’s exactly what they are.
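Those three constraints can be sketched directly. The arc width, the sine mapping, and the fade constant below are illustrative values, not canonical ones:

```python
import math
import random

ARC_DEG = 15       # spawn angle constrained to a 15-degree arc
FADE_RATE = 0.05   # exponential fade constant

def spawn(t, rng):
    """One particle: angle confined to the arc, speed modulated by a sine wave."""
    angle = math.radians(rng.uniform(-ARC_DEG / 2, ARC_DEG / 2))
    speed = 2.0 + math.sin(t * 0.1)          # velocity mapped to a sine wave
    return {"x": 0.0, "y": 0.0,
            "vx": speed * math.cos(angle),
            "vy": speed * math.sin(angle),
            "alpha": 1.0}

def step(p):
    p["x"] += p["vx"]
    p["y"] += p["vy"]
    p["alpha"] *= math.exp(-FADE_RATE)       # exponential, not linear, fade
    return p

rng = random.Random(1)
particles = [spawn(t, rng) for t in range(60)]
for _ in range(30):
    particles = [step(p) for p in particles]
```

Change `ARC_DEG` to 360 and the piece becomes a different piece. That one number is doing compositional work.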

A noise function has frequency, amplitude, octaves, persistence. Those four numbers are a design vocabulary. Choosing 0.003 instead of 0.03 for frequency is the difference between marble and TV static. That’s an aesthetic decision disguised as a technical one.
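To make that vocabulary concrete, here is a hedged sketch of fractal noise in one dimension, using simple value noise as a stand-in for Perlin noise; the hashing scheme and the default parameter values are invented for illustration:

```python
import math

def hash_noise(ix):
    # Deterministic pseudo-random value in [0, 1) for an integer lattice point.
    n = (ix * 374761393 + 668265263) & 0xFFFFFFFF
    n = ((n ^ (n >> 13)) * 1274126177) & 0xFFFFFFFF
    return (n & 0xFFFF) / 0x10000

def value_noise(x):
    # Smoothly interpolate between lattice values (a crude Perlin substitute).
    i, f = int(math.floor(x)), x - math.floor(x)
    t = f * f * (3 - 2 * f)  # smoothstep
    return hash_noise(i) * (1 - t) + hash_noise(i + 1) * t

def fbm(x, frequency=0.003, amplitude=1.0, octaves=4, persistence=0.5):
    """The four numbers of the design vocabulary, summed over octaves."""
    total, freq, amp = 0.0, frequency, amplitude
    for _ in range(octaves):
        total += amp * value_noise(x * freq)
        freq *= 2           # each octave doubles the frequency...
        amp *= persistence  # ...and persistence shrinks its contribution
    return total

# frequency 0.003 drifts like marble; 0.03 jitters toward static
marble = [fbm(i, frequency=0.003) for i in range(200)]
static = [fbm(i, frequency=0.03) for i in range(200)]
```

Sample both lists side by side and the marble sequence barely moves between neighbors while the static one churns. Same function, one digit apart.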

I think about this when people dismiss generative work as “random.” Randomness is the raw material, the way light is raw material for a photographer. The choices you make about how to shape, filter, and constrain that randomness are the design. Aperture, exposure, framing. That’s where the taste lives. Your taste is expressed through boundaries.

When the Tool Talks Back

Most design tools do what you tell them. You click, you drag, you get output. Photoshop doesn’t argue with you. Figma doesn’t suggest you’re wrong.

Code that generates visual output is different. It has opinions. Systems with even simple rules can produce behavior that exceeds your mental model of them. Three rules about separation, alignment, and cohesion give you flocking. That emergent complexity wasn’t in any individual rule. It appeared in the space between them, the way harmony appears between notes that weren’t necessarily written to be played together.
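The three rules fit in one function. This is a minimal sketch, not Reynolds’s original boids formulation; the weights, the inverse-square falloff, and the speed cap are illustrative guesses:

```python
import math

def limit(vx, vy, cap):
    # Clamp a velocity vector to a maximum speed.
    m = math.hypot(vx, vy)
    return (vx, vy) if m <= cap or m == 0 else (vx / m * cap, vy / m * cap)

def flock_step(boids, sep_w=1.5, ali_w=1.0, coh_w=0.01, max_speed=2.0):
    """One tick: each boid reacts to every other via the three rules."""
    out = []
    for x, y, vx, vy in boids:
        sx = sy = ax = ay = cx = cy = 0.0
        for ox, oy, ovx, ovy in boids:
            if (ox, oy) == (x, y):
                continue
            dx, dy = x - ox, y - oy
            d2 = (dx * dx + dy * dy) or 1e-9
            sx += dx / d2; sy += dy / d2   # separation: push away, harder when close
            ax += ovx;     ay += ovy       # alignment: match neighbors' headings
            cx += ox;      cy += oy        # cohesion: drift toward the local center
        n = len(boids) - 1
        cx, cy = cx / n - x, cy / n - y
        ax, ay = ax / n - vx, ay / n - vy
        nvx = vx + sep_w * sx + ali_w * ax + coh_w * cx
        nvy = vy + sep_w * sy + ali_w * ay + coh_w * cy
        nvx, nvy = limit(nvx, nvy, max_speed)
        out.append((x + nvx, y + nvy, nvx, nvy))
    return out

boids = [(0, 0, 1, 0), (5, 1, 0, 1), (2, 6, -1, 0), (7, 7, 0, -1)]
for _ in range(50):
    boids = flock_step(boids)
```

Nothing in `flock_step` says “form a flock.” The flock is what the three terms negotiate between themselves, tick by tick.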

This is what makes generative tools feel alive compared to direct-manipulation tools. When you work in Illustrator, you’re giving a monologue. The tool executes. When you work with a generative system, you’re in a conversation. You propose, the system responds, you adjust, it responds again, and each iteration teaches you something about the parameter space that you couldn’t have learned by thinking about it. You have to run the code to know, the same way that most of us have to play the chord to hear whether the voicing works.

That feedback loop is why creative coding produces work with a quality that’s hard to achieve manually. There’s an organic complexity to it, something that comes from being discovered on the way to somewhere else. Like the difference between a planned city grid and a path worn through grass by people actually walking.

The Moment It Goes Wrong

There’s a version of this that doesn’t work, and you’ve seen it everywhere. A marketing site with a particle background. A portfolio with animated Perlin noise behind the hero text. Generative wallpaper.

The system isn’t informing the design. It’s sitting behind it, and you could swap it for a gradient and lose nothing.

That’s decoration. Nobody gets hurt by it, the same way nobody gets hurt by a screensaver. But the distinction matters. Generative design means the system’s output is the design. The forms, the composition, the color relationships, the spatial hierarchy. These come from the process you built.

Think about the difference between tiling a floor with a repeating pattern you bought and writing a program that generates tile arrangements based on rules about adjacency and color contrast. The first is application. The second is discovery. You might find an arrangement you’d never have drawn manually, one where the rules create a visual rhythm that a human hand would have smoothed out but that the system preserves because it doesn’t know what “normal” looks like. It only knows the rules you gave it. And sometimes those rules, followed faithfully, produce something better than your assumptions would have allowed.
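The second approach can be sketched in a few lines. The palette and the adjacency rules below are hypothetical, picked only to show the shape of the idea:

```python
import random

PALETTE = ["copper", "verdigris", "moss", "ink"]
# Invented rules: neighbors must differ, and "copper" never touches "ink".
FORBIDDEN = {("copper", "ink"), ("ink", "copper")}

def ok(a, b):
    return a != b and (a, b) not in FORBIDDEN

def generate_tiling(w, h, rng):
    # Fill left-to-right, top-to-bottom, choosing only colors the rules allow.
    grid = [[None] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            options = [c for c in PALETTE
                       if (x == 0 or ok(grid[y][x - 1], c))
                       and (y == 0 or ok(grid[y - 1][x], c))]
            grid[y][x] = rng.choice(options)
    return grid

grid = generate_tiling(8, 8, random.Random(42))
```

Reseed the generator and you get a different legal arrangement. The rules stay fixed; the discoveries don’t.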

One is wallpaper. The other is architecture.

Why This Matters Now

If any of this sounds familiar, it’s because the feedback loop I’ve been describing is the same loop that now defines working with AI tools. Propose, observe, adjust, propose again. You give an LLM a prompt. It responds with something you didn’t write and couldn’t have predicted. You refine, it responds again, and the gap between your intention and its output becomes the space where the interesting work happens.

Creative coding taught this skill years before AI made it mainstream. Setting constraints. Judging output. Treating the unexpected result as information. That’s the muscle generative art has been building all along, and it’s the same muscle you need now when you’re directing an agent through a codebase or iterating on a design with a model that has its own tendencies and blind spots.

The designers and developers who already think this way have an advantage they might not even recognize yet. They’ve been in conversation with systems that talk back for years. They already know that the interesting part is knowing which output to keep.

The Disagreement You Need

A noise function doesn’t know what looks good. It doesn’t have opinions about whitespace or balance or visual hierarchy. It just maps coordinates to values according to math that you chose.

That’s exactly why it’s useful.

When you design something manually, you bring every assumption you’ve ever absorbed. Every trend you’ve internalized, every “rule” about grids and spacing and color theory. These are useful, and they’re also a cage. You’ll never draw something that surprises you, because you’re the one drawing it.

A generative system starts from a different place. Math, physics, biology, pure geometric logic. It doesn’t know about design trends or your portfolio. It follows the rules and produces output, and your job is to look at that output and decide what’s good. The word for this is curating. And that distinction gets at something important about what design actually is, or at least what the part of design that matters most has always been.

Design is judgment. It’s knowing which of the thousand possible outputs is the right one. You don’t have to be the one who generated all thousand. You just have to be the one who knows.

Set up the chord changes. Play the system. Listen to what comes back. And understand that the ability to hear the difference, between the take that works and the nine hundred that don’t, that’s the skill that doesn’t automate. That’s the part that’s yours.