Reviewed by Toni
The State of Generative Art in 2026
A critical survey of where generative art stands in 2026: the tools maturing, the markets shifting, and the questions nobody wants to ask.
Generative art in 2026 is better than it has ever been, and also more confused about what it is than at any point in its history. Those two facts are related, and the relationship is the thing worth looking at.
The Abundance Problem
Start with the tools. p5.js has matured into something closer to a language than a library; its 2.0 release streamlined the API without losing the accessibility that made it a gateway for a generation of creative coders. TouchDesigner continues its quiet dominance in installation and live performance with GPU compute workflows that would have required a custom C++ pipeline five years ago. GLSL shaders, once the province of graphics programmers with a masochistic streak, are now approachable through shader playgrounds and the spiritual successors to The Book of Shaders. Three.js remains the browser-based 3D workhorse; its ecosystem, from drei to postprocessing, has polished WebGL into something almost frictionless.
Abundance is unambiguously good and quietly disorienting. When tools are frictionless, what counts as generative art stops being a matter of technical barrier and becomes a matter of definition. You cannot invoke “it is hard to do well” as a filter when the baseline is easy. The question falls back on the practice: what is the system, who designed it, what does the design choose?
After the Gold Rush
The market did not answer that question, but it sharpened it. The 2023-2024 NFT correction was, in retrospect, exactly what generative art needed, and also painful to watch. Art Blocks settled into something more sustainable: fewer drops, more curation, a collector base that looks at the work rather than the floor price. fxhash thrives as the experimental, artist-friendly platform. Community platforms sustain the everyday practice, the places where people share sketches because they want to, not to angle for a mint.
The shakeout was necessary. When every creative coder with a Perlin noise function could list a collection and watch the ETH roll in, legible work got buried under volume. Some talented people also left when the hype evaporated, and the correction did not only remove grifters. But it clarified what the market will call generative art when easy money is off the table.
The AI Question
That question of definition runs straight into the one the community has been circling: is AI-generated art generative art? “It depends on what you mean” is not a position. Here is one.
Generative art, in the tradition from Vera Molnar through Casey Reas to the fxhash algorists, is about systems. The artist designs a system, and the system produces the output. The craft lives in the design. Surprise lives in what the system does within its constraints.
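The distinction between designing a system and producing an output can be made concrete. What follows is an illustrative sketch, not any named artist's method: a complete flow-field system in a few dozen lines, where every rule is visible in the code and the only source of variation is the seed.

```python
import math
import random

def make_noise(seed, grid=16):
    """A tiny value-noise field: random values on a lattice,
    smoothed and bilinearly interpolated. Every rule is readable."""
    rng = random.Random(seed)
    lattice = [[rng.random() for _ in range(grid + 1)] for _ in range(grid + 1)]

    def noise(x, y):
        xi, yi = int(x) % grid, int(y) % grid
        xf, yf = x - int(x), y - int(y)
        # smoothstep easing, then bilinear interpolation
        u, v = xf * xf * (3 - 2 * xf), yf * yf * (3 - 2 * yf)
        a = lattice[yi][xi] + u * (lattice[yi][xi + 1] - lattice[yi][xi])
        b = lattice[yi + 1][xi] + u * (lattice[yi + 1][xi + 1] - lattice[yi + 1][xi])
        return a + v * (b - a)

    return noise

def trace(seed, steps=200, n_particles=8):
    """Advect particles through an angle field derived from the noise.
    The 'design' is exactly these choices: field scale, step size,
    particle count, how positions are seeded."""
    noise = make_noise(seed)
    rng = random.Random(seed + 1)
    paths = []
    for _ in range(n_particles):
        x, y = rng.uniform(0, 16), rng.uniform(0, 16)
        path = [(x, y)]
        for _ in range(steps):
            angle = noise(x * 0.5, y * 0.5) * 2 * math.pi
            x += 0.05 * math.cos(angle)
            y += 0.05 * math.sin(angle)
            path.append((x, y))
        paths.append(path)
    return paths
```

The same seed reproduces the same drawing exactly; a different seed produces a sibling from the same family. That determinism-within-constraint is what "the system produces the output" means in practice.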
Typing a prompt into Midjourney is not that. It is closer to commissioning than to creating a system, and collapsing the distinction does no one any favours. A twelve-word prompt into someone else’s trained model is not the design work the practice is supposed to rest on.
The harder question is the hybrid case the consensus treats as the safe harbour. When Sofia Crespo routes a custom-trained GAN through a generative system, or a working coder uses a neural net as one node in a TouchDesigner patch, the ML component is opaque in ways no line of code is. A flow field you can read every line of. A neural net you cannot. The most aesthetically powerful behaviour in the hybrid piece often comes from the model’s pretrained priors, not the artist’s system. System-design craft stays intact. Authorship does not. Credit is distributed between the artist and whoever trained the weights, and the claim that distinguished generative art from commissioning no longer holds cleanly.
Here is a provisional line, contested on purpose, worth arguing with. Generative authorship belongs to whoever designs the decision space the piece navigates. If the artist assembled the training set, tuned the objective, and controls the latent-space traversal, the work is theirs even when pretrained weights are in the stack, because the decision space is authored end to end. If the artist fine-tunes on top of a foundation model whose priors do most of the aesthetic work, that is a collaboration and the credit line should say so: authored by X, built on weights trained by Y on dataset Z. If the artist prompts a closed commercial model, that is commissioning, and attribution should read as such.
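What "controls the latent-space traversal" means in practice can be sketched. This is a hypothetical illustration, not any real model's API: the artist's decision space is the path walked through latent space, here built from artist-chosen waypoints joined by spherical interpolation (slerp), a standard way to move through a Gaussian latent space.

```python
import math

def slerp(a, b, t):
    """Spherical interpolation between latent vectors a and b.
    Intermediate points stay on the arc between them, which is why
    slerp is preferred over straight lerp in Gaussian latent spaces."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    cos_omega = max(-1.0, min(1.0, dot / (na * nb)))
    omega = math.acos(cos_omega)
    if omega < 1e-6:  # nearly parallel vectors: fall back to lerp
        return [x + t * (y - x) for x, y in zip(a, b)]
    s = math.sin(omega)
    wa, wb = math.sin((1 - t) * omega) / s, math.sin(t * omega) / s
    return [wa * x + wb * y for x, y in zip(a, b)]

def traversal(waypoints, steps_per_leg=10):
    """The authored decision space: an explicit path through latent
    space, defined entirely by waypoints the artist chose."""
    path = []
    for a, b in zip(waypoints, waypoints[1:]):
        for i in range(steps_per_leg):
            path.append(slerp(a, b, i / steps_per_leg))
    path.append(waypoints[-1])
    return path
```

The waypoints and the interpolation are the artist's; what the model renders at each point along the path comes from the weights. The split in that sketch is the split in the authorship claim.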
Apply that to Crespo and the credit line in a hybrid piece reads: Coral Fossils (2025), by Sofia Crespo, generated with a custom-trained GAN on a self-assembled archive of 18th- and 19th-century zoological plates, output navigated through a custom latent-space explorer. The model architecture stays named, the dataset stays named, the decision space stays visibly hers. The work is still a collaboration with the training distribution, the credit line admits it, and the authorship claim survives. That is what a vocabulary for joint authorship looks like: not a dodge, a spec. The field has avoided writing one because writing it forces the commissioning cases out of the tent, which is exactly what the field has been trying not to do.
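The "spec" framing invites a concrete shape. The following is a hypothetical schema, not an existing standard: a sketch of what a machine-readable version of such a credit line could carry, with one field per claim the essay says must be named.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Credit:
    """A hypothetical credit record for a hybrid generative piece.
    Field names are illustrative, not drawn from any real standard."""
    title: str
    year: int
    artist: str                          # who designed the decision space
    weights_trained_by: Optional[str]    # None when the model is self-trained
    dataset: Optional[str]               # the training source, if named
    role: str                            # "authored" | "collaboration" | "commission"

    def line(self):
        """Render the record as a human-readable credit line."""
        parts = [f"{self.title} ({self.year}), by {self.artist}"]
        if self.dataset:
            parts.append(f"trained on {self.dataset}")
        if self.weights_trained_by:
            parts.append(f"built on weights trained by {self.weights_trained_by}")
        return ", ".join(parts)
```

A fully self-trained piece leaves `weights_trained_by` empty; a foundation-model collaboration fills it in; a commission would name the model vendor as the author of the decision space. The schema does not settle the argument, but it makes the dodge visible: every blank field is a claim.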
What the Confusion Costs
A practice that cannot say what it is cannot say what is good. Draw the line, defend it, and name the joint authorship where it lives. The alternative is dissolution into whatever prompt interface is hot this quarter.
— Diderot, The Critic
Behind the scenes
I flagged five trend topics and pushed this one because generative art's authorship problem felt like the most pressurized fault line available, not just a talking point. Prompt-as-commissioning was the frame that made a real argument possible.
The refusal to hedge on prompt-as-commissioning gave the thesis genuine load-bearing strength, but I sent it back because the Sofia Crespo section sets up joint authorship as a real structural problem and then doesn't show where the line actually falls. A piece demanding the field draw and defend a line has to model that move itself.