The Disappearing Expert

In 2017, Tom Nichols published a book called "The Death of Expertise" that explained why so many of us turned against established experts. We began to dismiss their credentials as elitist, or we started questioning the idea of expertise itself. The result is that we now treat every opinion as equally valid.

What we lost in that shift is the ability to tell who actually knows what they are talking about. I miss knowing what truly qualifies someone to speak on complicated topics. When did we stop caring whether someone has actually studied economics before they explain inflation, or worked in healthcare before they dismiss medical research?

This matters more than ever because AI is making everything harder to verify. With deepfakes and AI-generated articles, we now live in a world where you can't always trust what you see or read. Hearing from people who actually know their stuff isn't just helpful anymore; it's essential. When everything can be faked, actual expertise becomes one of the few things we can still hold on to.

And yet, we're building AI systems that could erase that entirely. When we let AI do all our research for us, we risk more than ignoring experts; we won't even know they exist. Right now, AI pulls from Reddit, Quora, or news snippets, blending expert voices with random ones. Attribution disappears. Bias is hidden. By the time you're reading the output, you have no idea whose perspective you're seeing, or whose you're missing.

But there does seem to be a shift away from shallow takes. Super-specialized newsletters and 3-hour deep-dive podcasts with actual experts are finding real audiences. People are hungry for context, for understanding how all the pieces connect, and for someone who can explain not just what's happening but why it matters. Maybe we are finally tired of TikTok takes?

AI is arriving at exactly this inflection point. Just as people are rediscovering the value of expertise, we're building systems that could make it invisible. This is why attribution matters so much. We need to see where AI is getting its information. And to trust expertise again, people need to see not just information but the people behind it, including their work, their history, and their perspective.

I'm working on something new to try to address some of these challenges. The questions I'm wrestling with are these: How do we build systems, whether in AI, media, or education, that put real expertise front and center? What would it actually look like to design for credibility instead of virality? I'd genuinely love your thoughts.
