I’ve sat through hundreds of hours of “analysis” sessions. Want to know how many of those hours changed what we actually did?

A number that rounds to zero.

This sounds cynical. It’s not meant to be. It’s an observation about how time gets spent when smart people gather around a conference table - or more likely now, a Zoom call - to discuss “the data.”

The Pattern

Here’s how it usually goes.

Someone asks, “What percentage of our traffic was brand vs non-brand last month?”

Reasonable question. Except the reporting is murky. GA4 says one thing. Search Console says another. The paid team has their own numbers that don’t quite reconcile with either. So you spend 45 minutes triangulating, making caveats, explaining why the sources don’t match.

Eventually you arrive at a number. Let’s say 62% brand.

Everyone nods. Someone writes it down. Maybe it goes into a slide deck.

And then… nothing changes.

The campaign stays the same. The budget allocation stays the same. The content calendar stays the same. You’ve established a fact, and that fact has had exactly zero impact on what anyone does next.

Here’s the question that never gets asked: “Okay, and what would we do differently if it were 55%? Or 70%?”

If the answer is “nothing” - which it almost always is - you just burned an hour on trivia.

The Distinction

Effective knowledge is information that changes your behavior, strategy, or resource allocation.

Trivia is accurate information that has zero impact on what you actually do.

The distinction isn’t about whether something is true. It’s about whether knowing it matters.

There’s a simple test: “If this number were 20% different, would we do anything differently?” If the answer is no, you’re looking at trivia dressed up as analysis.

Brand vs non-brand split? Trivia, unless you have a specific threshold that triggers a budget reallocation.

Which landing pages have the highest bounce rate? Trivia, unless you’re actually going to rewrite them.

How does our domain authority compare to competitors? Trivia, unless you’re making a specific decision about link building investment.

The list goes on. Most of what gets discussed in “strategy” meetings falls into this bucket. Facts that feel important but don’t change anything.

Why This Happens

It’s worth understanding why smart people end up in these loops, because it’s not about incompetence.

Clients often feel they need to understand everything before acting. There’s comfort in data, even when that data doesn’t point anywhere specific. Meetings feel productive even when they’re not - activity and progress are easy to confuse.

On the agency or consultant side, nobody wants to say “this doesn’t matter” because it sounds dismissive. We’re being paid to provide expertise, and expertise is supposed to involve knowing things. So we know things. Lots of things. Whether those things are useful is a separate question that doesn’t always get asked.

There’s also a billing incentive, if we’re being honest. Hours spent on “analysis” are easier to justify than “we already know what to do, we’re just doing it.” The former sounds rigorous. The latter sounds like you’re not working hard enough.

The uncomfortable truth is that sometimes the right answer is obvious from the first meeting. Everything after that is expensive stalling.

The Real Cost

This isn’t just about wasted time in meetings. Though that’s bad enough.

It’s about delayed execution on things that actually move the needle. Every hour spent debating metrics that don’t matter is an hour not spent on work that compounds - content that builds authority, outreach that earns links, product improvements that drive retention.

It’s about mental overhead. Tracking metrics that don’t matter clutters your thinking. You end up with dashboards full of numbers that create the illusion of insight without any of the utility.

And it’s about opportunity cost. The right questions don’t get asked because everyone’s busy answering the wrong ones.

A Better Return on Useless Knowledge

I have friends back home who do horror movie trivia nights. Every week they gather at a bar and answer questions about obscure slashers from the ’80s and which director made which sequel.

This knowledge is, by any practical measure, useless. It cannot change their behavior in any meaningful way. There is no life decision that hinges on knowing who played Michael Myers in Halloween 4.

But here’s the thing: they get cheap drinks and cash prizes.

Their trivia has a positive expected value. They’ve accumulated deep, useless knowledge and converted it into beer money. That’s an honest transaction.
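
If you want to be nerdy about it, the math holds up even with made-up numbers. Everything below is hypothetical - the odds, the prize, the drink prices - it's just to show the shape of the transaction:

```python
# Toy expected-value calculation for a weekly trivia night.
# Every number here is hypothetical, purely for illustration.

p_win = 0.25            # chance their team takes the cash prize
prize = 50.00           # cash prize for the winning team
drink_savings = 10.00   # discounted drinks vs. regular prices
entry_fee = 0.00        # many bars run trivia free to fill seats

ev = p_win * prize + drink_savings - entry_fee
print(f"Expected value per night: ${ev:.2f}")  # $22.50 with these numbers
```

Positive number, honest transaction. The useless knowledge pays out in beer money.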

Now, as a consultant, I get paid whether the meeting is useful or not. The meter runs regardless. So in a narrow sense, sitting through an hour of brand vs non-brand archaeology pays the same as an hour of actual strategy work.

But the client is the one paying for trivia and getting nothing back. No prizes. No drinks. Just invoices for conversations that didn’t move anything forward.

At least horror movie trivia is honest about what it is. Nobody pretends it’s strategic.

The Test

Before diving into any analysis or metric, ask one question:

“If this number were different, would we change anything?”

If no - stop. Move on. It’s trivia. You’re not being rigorous, you’re just burning time.

If yes - define what threshold would trigger the change. Then go find the answer. Now you have a reason to care about the number.
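
Here's a minimal sketch of what that commitment looks like if you actually write it down. The metric names, thresholds, and actions below are hypothetical placeholders, not anyone's real rules - the point is that the "unless" clause from earlier gets encoded before anyone opens a dashboard:

```python
# A minimal sketch of the "trivia test": every metric we bother to
# measure must be tied, in advance, to a threshold and an action.
# All names, thresholds, and actions here are hypothetical examples.

from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class DecisionRule:
    metric: str
    trigger: Callable[[float], bool]  # does this value demand a change?
    action: str                       # what we commit to doing if it fires

rules = [
    DecisionRule(
        metric="brand_traffic_share",
        trigger=lambda v: v < 0.50,   # below 50% brand, rebalance spend
        action="Shift paid budget toward non-brand campaigns",
    ),
    DecisionRule(
        metric="landing_page_bounce_rate",
        trigger=lambda v: v > 0.80,   # above 80%, the page gets rewritten
        action="Rewrite the page; it goes on next sprint's calendar",
    ),
]

def worth_measuring(metric: str) -> bool:
    """If no rule exists for a metric, measuring it is trivia."""
    return any(r.metric == metric for r in rules)

def evaluate(metric: str, value: float) -> Optional[str]:
    """Return the committed action if the threshold fires, else None."""
    for r in rules:
        if r.metric == metric and r.trigger(value):
            return r.action
    return None

print(worth_measuring("domain_authority"))         # False: trivia, skip it
print(evaluate("brand_traffic_share", 0.62))       # None: 62% changes nothing
print(evaluate("landing_page_bounce_rate", 0.91))  # fires: rewrite the page
```

None of this is really about the code. It's about the discipline: if no rule exists for a metric, you've pre-decided it's trivia, and the meeting ends an hour earlier.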

This is the difference between data-driven decision making and data-decorated status quo maintenance. One of these is useful. The other just makes everyone feel busy.

My friends know their horror movie knowledge is for fun. They’re not pretending it’s strategic. The problem starts when we dress up professional trivia as analysis and bill hours for the privilege of not changing anything.

Know the difference. Or at least get some drink tickets out of it.