The emerging tech consensus says humans become selectors and curators in an AI world — taste is what remains. Manidis hates this, and the GAN metaphor explains why: the discriminator’s job is to train the generator, and once the generator is good enough, the discriminator is discarded. The more refined your taste, the faster machines learn it, and the sooner you’re redundant.
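The dynamic can be sketched with a deliberately crude toy (not a real GAN: hill-climbing against a fixed scoring function stands in for adversarial training, and `TASTE`, `discriminator`, and `train_generator` are illustrative names, not anything from the text). The discriminator’s score drives every update, yet once the generator satisfies it, generation no longer needs it.

```python
import random

random.seed(0)

# The "discriminator": a fixed taste function that scores candidates
# against an ideal it has internalized. (Toy stand-in, not a neural net.)
TASTE = "good taste"
ALPHABET = "abcdefghijklmnopqrstuvwxyz "

def discriminator(candidate: str) -> float:
    """Fraction of characters matching the internalized ideal."""
    return sum(a == b for a, b in zip(candidate, TASTE)) / len(TASTE)

def train_generator(max_steps: int = 100_000) -> str:
    """The discriminator's only role: push the generator toward its taste."""
    best = "".join(random.choice(ALPHABET) for _ in TASTE)
    for _ in range(max_steps):
        if discriminator(best) == 1.0:
            break
        # Generator proposes a mutation; taste decides whether it survives.
        i = random.randrange(len(best))
        candidate = best[:i] + random.choice(ALPHABET) + best[i + 1:]
        if discriminator(candidate) >= discriminator(best):
            best = candidate
    return best

generator_output = train_generator()
# After training, sampling uses the generator alone: the discriminator
# has taught the generator its taste and trained itself out of the loop.
print(generator_output)
```

The punchline is in the last two lines: the better the scoring function, the faster the loop converges, and the sooner the scorer becomes dead weight.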
The historical distinction matters: patronage put the patron in the room before the first brushstroke, negotiating with the maker, oriented toward something transcendent. Taste is what you call the patron’s function after you remove the patron from the process of making. Around the 18th century, the collector replaced the patron, the critic replaced the guildmaster, and friction was lost.
The human role isn’t to judge what AI generates; it’s to participate in the generation, in friction with the tool, oriented toward something beyond yourself. This reframes the entire implementation-gap narrative: the point isn’t to become a better prompt-writer (a discriminator) but to remain a co-creator. Deep mastery that allows generative rule-breaking beats selection from a menu. Building real projects teaches AI skills faster than following structured curricula demonstrates this practically: the non-technical user who built a production WhatsApp bot learned more through generative friction than any structured curriculum could provide. This is also why Compound engineering makes each unit of work improve all future work emphasizes the review phase: the 40% spent reviewing isn’t passive judgment but active co-creation with the agent’s output, extracting patterns and shaping future work.