Let's talk about the elephant in the room that everyone is either panicking about or pretending doesn't exist.
IBM paused hiring for hundreds of roles, expecting AI to automate 30% of them within five years. Google Cloud cut over 100 design and UX positions. Duolingo replaced 10% of its contractors with AI for content creation. Atlassian, Goldman Sachs, McKinsey, UPS. The list is long, and it's getting longer.
Scary? Sure. A little. But also, chill.
The AI Doesn't Have Taste
Here's what nobody is saying clearly enough: AI is run by humans. Every single output, every generated image, every block of vibe-coded frontend, every AI-written caption that sounds slightly off in a way you can't quite place. All of it starts and ends with a human making decisions. The AI doesn't have taste. It doesn't know if something looks right. It doesn't know if the composition is off or the copy is flat or the layout makes no sense on mobile. It just does what it's told and generates what's statistically likely based on everything it's seen.
There is, in very small text under almost every AI chat interface, a disclaimer that says something like: "AI can make mistakes. Please double-check responses."
Be the person who does that. Genuinely. That's the whole strategy.
Generating Is Easy. Knowing What to Do With It Isn't.
The problem isn't AI. The problem is people using AI without understanding what they're actually making.
The design gurus will tell you that you don't need to understand design to make something with an AI design tool. Technically true. You can generate an image without knowing what composition means. You can vibe code an app without knowing how to debug it. You can write a blog post without knowing how to structure an argument.
But can you tell if it's good? Can you tell if it's wrong? Can you catch it when it hallucinates a fact, breaks on mobile, or produces something that looks fine in isolation and completely off-brand in context?
Generating is easy. Knowing what to do with the output is the actual skill. And that skill requires understanding the thing you're generating. At least enough to have taste. At least enough to know when something isn't working and why.
If you want to use AI to generate images, learn basic composition first. If you want to vibe code, learn enough to debug it when it breaks — because it will break, and wasted tokens and bad output cost more than just time. If you want AI to write your content, know what your voice actually sounds like so you can tell when it drifts.
The floor got lower. The ceiling didn't.
The Tool Doesn't Come With Taste
I had a client once who watched me use AI as part of my process. Not hiding it, just using it the way I use any other tool. They saw the output, liked it, figured they could just do it themselves and cut me out.
A few weeks later they came back. The output was bad. Not broken, just bad. Wrong tone, off composition, no coherence across pieces. They couldn't figure out why.
I didn't say much. But the reason was pretty simple: they had the tool and no taste.
The tool doesn't come with taste. That's not something you can generate.
This Has Happened Before
This isn't new by the way. In the 80s, machines replaced human labour in factories. People didn't quit. They learned to operate the machines. When computers arrived, people learned to use computers. Every time a tool changes the game, the people who adapt and learn how to use it well become more valuable, not less.
The generations before you figured it out with less information and fewer resources. You have the entire internet, documentation, tutorials, and honestly pretty good AI assistants to help you learn. The excuse of "I don't know how" has never been thinner.
You will not be replaced if you are not replaceable. And you become irreplaceable by actually understanding what you're doing, using the tools well, and having enough taste to know the difference between output that works and output that just exists.
The Gap Is Where the Work Lives
AI didn't lower the quality bar. It lowered the barrier to entry, which is different. Now everyone can make something. Not everyone can make something good. Not everyone can tell the difference. And not everyone can do it consistently, on brief, on brand, and actually useful to a real human on the other end.
That gap — between generated and good, between output and craft — is where the work actually lives.
It always was.


