Since AI came out, everyone has been talking about taste.
People I respect said it first: taste is the last human advantage. The thing AI can't copy. There were articles, hot takes, frameworks. I read all of it. Saved it. Kept going back. And then I deliberately didn't write about it.
Not because I didn't have thoughts. But because I wanted to understand what taste actually meant to me — not as a strategic moat, not as career advice, but as something I live with every day and was still figuring out.
This is what I think.
Taste is not about being picky. It's about being coherent.
People confuse taste with having strong preferences — liking certain things, rejecting others. That's not it. Taste is about coherence. It's the ability to know what belongs in your world and what doesn't — not because you read somewhere that it was good or bad, but because you understand your own signal well enough to tell.
The problem is that most people have never actually developed that signal. Their preferences are downstream of engagement patterns, not real attention. When you ask someone why they like something, the honest answer is often: because I kept seeing it.
That's not taste. That's exposure mistaken for preference.
Taste is built slowly. You can't download it.
Everything I've observed confirms this: taste accumulates. It's earned through making things, discarding things, sitting with things, returning to things. A preference that doesn't survive repetition isn't taste — it's reaction. Real taste is what holds up when the mood shifts and no one's watching.
This is why friction matters. Not difficulty for its own sake, but what the difficulty produces — the ability to articulate why something is right or wrong, not just that it feels that way. The chef who's oversalted a thousand dishes doesn't just avoid salt now. They understand salt. That's different.
AI doesn't threaten taste. It exposes whether you have it.
People frame taste as a defense against AI — the human advantage machines can't replicate. I think that's the wrong frame. The real question AI puts on the table isn't "can machines have taste?" It's: do you know your own taste well enough to use a tool that amplifies whatever you give it?
Used passively, AI makes your taste irrelevant. It averages you toward the median. Used intentionally — as a mirror, something that feeds your own inclinations back to you — it can sharpen what you already have. But only if you already know what you have.
Most people don't.
The real work is knowing your own signal.
Steve Jobs said that technology married with the liberal arts yields results that make your heart sing. I think he was pointing at this. It's not enough to know how to build. You need to know what deserves to exist. And that's not a technology question — it's a question about who you are and what you actually care about.
Taste, in the end, is self-pursuit. The slow work of discovering what matters to you — not because you should care, but because you genuinely do. The only way to do that is to keep making, keep discarding, keep paying attention. To protect the space where you're creating for no audience but yourself.
I wrote the longer version of this on Liminal Letters — more on the research, the friction, and what a different relationship with AI might look like. Read it here.
Still figuring it out. Probably always will be.