
Why Your Dashboard Is Lying to You

Empirium Team · 10 min read

Your marketing dashboard says website traffic is up 40%, email open rates are at 28%, and social media impressions hit 500,000 last month. The executive team is impressed. The board deck looks great.

Revenue is flat.

This isn't a paradox. It's the difference between vanity metrics and business metrics. Your dashboard isn't technically wrong — traffic probably did increase 40%. But traffic doesn't pay salaries. Pipeline does. Revenue does. And those metrics either aren't on the dashboard or they're buried behind three clicks nobody makes.

Here's how marketing dashboards lie, and how to build ones that tell the truth.

The Vanity Metric Trap

Vanity metrics are numbers that go up and to the right without correlating to business outcomes. They feel productive. They fill dashboards beautifully. They justify marketing budgets in the short term. And they obscure whether marketing is actually working.

The Usual Suspects

| Vanity Metric | Why It Looks Good | Why It Lies |
| --- | --- | --- |
| Website traffic | Always grows with more content and ads | Traffic without conversion is an expense, not an asset |
| Social media followers | Steady growth curve | Followers don't correlate with pipeline |
| Email open rate | 25-30% seems healthy | Apple Mail Privacy Protection inflates opens by 40-70% |
| Blog page views | Content team can show growing consumption | Page views don't measure whether readers became buyers |
| Impressions | Big numbers impress stakeholders | Impressions measure exposure, not influence |
| Marketing qualified leads (MQLs) | Funnel metrics improve | MQL definitions often include low-quality actions (whitepaper downloads, webinar registrations) that don't predict purchase |

The problem isn't that these metrics are useless — they have diagnostic value when used correctly. The problem is that they're reported as success metrics. "We generated 500 MQLs" is celebrated without asking "how many became revenue?"

The Open Rate Illusion

Email open rates deserve special attention because they're the most commonly reported and most systematically misleading metric in marketing.

Since Apple introduced Mail Privacy Protection in iOS 15, all emails loaded through Apple Mail are marked as "opened" regardless of whether the user actually read them. Apple Mail accounts for 50-60% of email clients globally. This means your 28% open rate is probably 12-15% actual opens + 13-16% phantom opens from Apple's proxy.

If your email strategy is optimized for open rates, you're optimizing for a metric that doesn't reflect human behavior. Click-through rate and reply rate are the only email engagement metrics that can't be faked by a proxy.
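One honest workaround is to measure opens only in the segment where they still mean something: recipients outside Apple Mail's proxy. A minimal sketch of that extrapolation (the function name and segment counts are ours, purely illustrative):

```python
def true_open_rate(total_recipients, apple_recipients, non_apple_opens):
    """Estimate the human open rate from the non-Apple segment,
    where a recorded open still reflects a real reader rather than
    Apple's proxy preloading the message."""
    non_apple = total_recipients - apple_recipients
    if non_apple <= 0:
        raise ValueError("need a non-Apple segment to estimate from")
    return non_apple_opens / non_apple

# Hypothetical send: 10,000 recipients, 5,500 on Apple Mail,
# 585 opens recorded among the 4,500 non-Apple recipients.
rate = true_open_rate(10_000, 5_500, 585)  # 0.13, i.e. ~13% human opens
```

The blended rate your ESP reports will sit well above this, because the Apple segment registers near-universal "opens" regardless of reader behavior.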

Common Dashboard Lies

Beyond vanity metrics, dashboards mislead through structural problems in how data is presented.

Lie 1: Survivorship Bias

Your dashboard shows a 15% conversion rate from demo to closed-won. Impressive. But it only counts deals that reached the demo stage. It doesn't show the 80% of leads that never reached a demo, the 30% of demos that were disqualified immediately, or the 25% of "closed-won" deals that churned in the first 90 days.

Survivorship bias makes every stage look better than reality because you're only measuring the survivors.

Fix: Show full-funnel metrics. Total leads → Demo-qualified → Demo held → Proposal sent → Closed-won → Retained at 90 days. The bottom number is the only one that matters, and it's always smaller than the dashboard suggests.
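A full-funnel report is a few lines of code once the stage counts live in one place. A sketch with made-up counts (note that closed-won over demos held is still the flattering 15%, while the end-to-end retained rate is 1.6%):

```python
def funnel_report(stages):
    """For each (name, count) stage, report conversion from the previous
    stage and from the top of the funnel, so no stage can hide the
    drop-off above it."""
    top = stages[0][1]
    prev = top
    rows = []
    for name, count in stages:
        rows.append((name, count, count / prev, count / top))
        prev = count
    return rows

# Hypothetical counts for one quarter.
funnel = [
    ("Total leads", 2000),
    ("Demo-qualified", 400),
    ("Demo held", 280),
    ("Proposal sent", 140),
    ("Closed-won", 42),          # 42 / 280 demos held = 15% demo-to-won
    ("Retained at 90 days", 32), # 32 / 2000 leads = 1.6% end to end
]
rows = funnel_report(funnel)
```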

Lie 2: Attribution Inflation

Your Google Ads dashboard says Google Ads generated $500K in pipeline. Your LinkedIn Ads dashboard says LinkedIn generated $300K. Your content dashboard says organic content generated $200K. Your email dashboard says email generated $150K.

Total: $1.15M in attributed pipeline. Actual pipeline: $600K.

Each tool claims credit for every deal it touched. When a buyer clicked a Google ad, read a blog post, and received a nurture email before requesting a demo, all three channels claim the full deal value. The sum of channel-attributed pipeline is always higher than actual pipeline.

Fix: Use a single source of truth for pipeline attribution — your CRM — not individual channel dashboards. Report attributed pipeline with a clear model (first-touch, last-touch, or multi-touch) and note that channel totals will exceed actual pipeline due to multi-touch overlap.
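The arithmetic behind a multi-touch model is simple. A sketch of linear (even-split) attribution over hypothetical deals, showing why channel credits sum to actual pipeline instead of exceeding it:

```python
from collections import defaultdict

def linear_attribution(deals):
    """Split each deal's value evenly across every channel that touched it.
    Channel credits then sum to actual pipeline, unlike per-tool dashboards
    where each channel claims the whole deal."""
    credit = defaultdict(float)
    for value, channels in deals:
        share = value / len(channels)
        for channel in channels:
            credit[channel] += share
    return dict(credit)

# Hypothetical deals: the first was touched by three channels.
deals = [
    (100_000, ["google_ads", "content", "email"]),
    (50_000,  ["linkedin_ads"]),
]
credit = linear_attribution(deals)
# Credits sum to the actual 150,000 pipeline. Per-tool dashboards would
# claim 100k + 100k + 100k + 50k = 350,000 for the same two deals.
```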

Lie 3: Sampling and Aggregation Errors

Google Analytics 4 uses data sampling when your dataset exceeds certain thresholds. The dashboard shows an aggregate, but the underlying data may only include 20-50% of actual sessions. For large websites, this means your traffic numbers are estimates, not counts.

Additionally, different tools measure the same thing differently. Google Analytics counts "sessions." Your CRM counts "leads." Your ad platform counts "conversions." None of these numbers will match because each tool defines its terms differently and measures at different points in the user journey.

Fix: Define each metric precisely in your data warehouse. "A lead is a unique email address that submitted a form on our website" — not "whatever Google Analytics calls a conversion." When metrics from different tools disagree, the warehouse definition is authoritative.
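That definition can be enforced as code rather than prose. A sketch of a warehouse-style lead count (the field name `email` and the records are assumptions for illustration):

```python
def count_leads(form_submissions):
    """Apply the warehouse definition literally: a lead is a unique,
    normalized email address that submitted a form. Two tools reading
    the same records can't disagree about this count."""
    emails = {
        s["email"].strip().lower()
        for s in form_submissions
        if s.get("email")
    }
    return len(emails)

submissions = [
    {"email": "Ada@example.com"},
    {"email": "ada@example.com "},  # same person: casing and whitespace differ
    {"email": "grace@example.com"},
    {"email": ""},                  # submission with no address, not a lead
]
# count_leads(submissions) == 2
```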

Lie 4: Cherry-Picked Time Ranges

"Traffic is up 40% month-over-month." True. Also true: the previous month was an all-time low because of a site migration. Compared to the same month last year, traffic is down 10%.

Time range selection can make any metric tell any story. Month-over-month comparisons amplify noise. Year-over-year comparisons smooth seasonality but hide recent trends.

Fix: Always show both: month-over-month trend AND year-over-year comparison. Include the rolling 3-month average to smooth short-term noise. Context prevents cherry-picking.
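Computing all three views together is trivial, which is why omitting two of them is a choice, not a constraint. A sketch with a hypothetical series (a stable year, a migration crater, a partial recovery):

```python
def traffic_context(monthly, i):
    """Month-over-month change, year-over-year change, and rolling
    3-month average for month i of a chronological series: three views
    that are hard to cherry-pick in combination."""
    mom = monthly[i] / monthly[i - 1] - 1
    yoy = monthly[i] / monthly[i - 12] - 1 if i >= 12 else None
    rolling3 = sum(monthly[i - 2:i + 1]) / 3
    return mom, yoy, rolling3

# Hypothetical sessions (thousands): a year near 100, a migration
# crater at 50, then a partial recovery to 70.
monthly = [100] * 12 + [50, 70]
mom, yoy, rolling3 = traffic_context(monthly, 13)
# mom is +40% ("traffic is up 40%!"); yoy is -30% against the same
# month last year; the rolling average shows the crater still in view.
```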

The Metrics That Matter

Replace vanity metrics with business metrics that predict or measure revenue.

Leading Indicators (Predict Future Revenue)

| Metric | What It Predicts | Measurement |
| --- | --- | --- |
| Qualified pipeline created | Revenue in 3-6 months | $ value of new opportunities by source |
| Pipeline velocity | Deal cycle health | Average days from opportunity to close |
| Demo-to-opportunity conversion | Sales process effectiveness | % of demos that become pipeline |
| Lead score accuracy | Marketing-sales alignment | % of high-scored leads accepted by sales |
| Content-to-pipeline ratio | Content ROI | Pipeline $ attributed to content touchpoints |

Lagging Indicators (Measure Actual Revenue)

| Metric | What It Measures | Measurement |
| --- | --- | --- |
| Customer acquisition cost (CAC) | Marketing efficiency | Total marketing spend / new customers |
| CAC payback period | Capital efficiency | Months to recover CAC from customer revenue |
| Marketing-sourced revenue | Marketing's direct contribution | Revenue from marketing-generated pipeline |
| Net revenue retention (NRR) | Customer health | Revenue retained + expanded from existing customers |
| LTV:CAC ratio | Unit economics | Lifetime value divided by acquisition cost (target: 3:1+) |
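The lagging indicators compose into a few lines of arithmetic. A sketch with illustrative inputs (none of these numbers are benchmarks; LTV is computed on a gross-margin basis, which is one common convention):

```python
def unit_economics(marketing_spend, new_customers,
                   monthly_revenue_per_customer, gross_margin,
                   avg_lifetime_months):
    """CAC, CAC payback in months, and LTV:CAC, computed from the
    definitions in the table above."""
    cac = marketing_spend / new_customers
    monthly_gross_profit = monthly_revenue_per_customer * gross_margin
    payback_months = cac / monthly_gross_profit
    ltv = monthly_gross_profit * avg_lifetime_months
    return cac, payback_months, ltv / cac

cac, payback, ratio = unit_economics(
    marketing_spend=300_000, new_customers=50,
    monthly_revenue_per_customer=1_000, gross_margin=0.8,
    avg_lifetime_months=36,
)
# cac = 6,000; payback = 7.5 months; LTV:CAC = 4.8, above the 3:1 target
```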

The Dashboard That Works

A single-page marketing dashboard should contain exactly these metrics:

| Row | Metric | Timeframe |
| --- | --- | --- |
| 1 | Marketing-sourced pipeline created | This month vs last month vs same month last year |
| 2 | Qualified opportunities (sales-accepted) | This month, trending |
| 3 | CAC by channel | Rolling 3-month average |
| 4 | Pipeline velocity | Current average vs 90-day average |
| 5 | Marketing-sourced revenue closed | This month, this quarter, YTD |
| 6 | LTV:CAC ratio | Rolling 12-month |

Six metrics. One page. Everything a CMO needs to know whether marketing is working. Everything else is diagnostic — available for drill-down when a top-level metric moves in the wrong direction.

Building Honest Dashboards

Principle 1: Start with the Business Question

Don't build a dashboard and then look for insights. Start with the questions: "Is marketing generating enough pipeline?" "Are we acquiring customers efficiently?" "Which channels produce the best customers?" Then build the minimum dashboard that answers those questions.

Principle 2: Show Uncertainty

No metric is perfectly accurate. Email open rates are inflated by Apple. Attribution is incomplete. Forecasts have confidence intervals. An honest dashboard acknowledges uncertainty:

  • Attribution-based metrics include a caveat: "Based on last-touch attribution; actual influence may differ"
  • Forecasts include a range: "$500K-$700K projected pipeline" not "$600K projected pipeline"
  • Sampled data is labeled: "Based on 35% sample (GA4 sampling active)"

Principle 3: Include the Counter-Metric

Every metric has a counter-metric that prevents gaming. If you optimize for MQL volume, quality drops. If you optimize for CAC, volume drops. Always pair:

| Metric | Counter-Metric |
| --- | --- |
| MQL volume | MQL-to-SQL conversion rate |
| CAC | Pipeline coverage ratio |
| Deal size | Sales cycle length |
| Traffic | Conversion rate |
| Email sends | Unsubscribe rate |
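The pairing can even be enforced mechanically: never report a primary-metric win without its counter-metric, and flag the pair when the counter-metric degraded. A minimal sketch (the 5% tolerance is an arbitrary assumption):

```python
def paired_readout(metric_name, metric_delta, counter_name, counter_delta,
                   tolerance=0.05):
    """Report a primary metric only alongside its counter-metric, flagging
    the pair when a 'win' coincides with a degraded counter-metric."""
    flagged = metric_delta > 0 and counter_delta < -tolerance
    status = "SUSPECT: gain may be gaming" if flagged else "ok"
    return (f"{metric_name} {metric_delta:+.0%}, "
            f"{counter_name} {counter_delta:+.0%} [{status}]")

# MQL volume up 30% while MQL-to-SQL conversion fell 20%:
print(paired_readout("MQL volume", 0.30, "MQL-to-SQL conversion", -0.20))
```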

Principle 4: Audit Data Sources Quarterly

Dashboards rot. Data connections break. Definitions drift. Quarterly, validate that each metric on your dashboard still measures what you think it measures. Check: Are the data sources connected? Has the source tool changed its tracking methodology? Do the numbers reconcile with your data warehouse?

FAQ

What dashboard tool should we use? Looker Studio (free) for simple dashboards connected to GA4 and Google Ads. Metabase (free, self-hosted) for dashboards connected to your data warehouse. Looker ($3,000+/month) when you need governed metric definitions and self-service analytics. Don't buy Tableau for marketing dashboards — it's overkill.

How often should we review dashboards? Weekly for operational metrics (pipeline created, MQL volume). Monthly for business metrics (CAC, revenue, LTV:CAC). Quarterly for strategic metrics (channel efficiency trends, year-over-year comparisons). Daily monitoring of marketing dashboards is a waste of time — the data is too noisy at daily resolution.

What do we do when the dashboard shows bad metrics? Present them anyway. Hiding negative trends destroys trust faster than any bad number. Frame negatives with context and a plan: "CAC increased 20% this quarter. The cause is [specific reason]. The fix is [specific action]. We expect improvement by [specific timeline]." Executives respect honesty and action plans. They don't respect dashboards that only show good news.

How do we transition from vanity metrics to business metrics? Don't remove vanity metrics overnight — that creates resistance. Add business metrics alongside existing ones. Over 2-3 months, shift the narrative: lead with pipeline and revenue, then reference traffic and engagement as supporting context. Within a quarter, the team will naturally focus on the metrics that appear first and receive the most discussion.

Your dashboard is a lens. If the lens is distorted, every decision based on it is wrong. Build dashboards that start with business questions, show honest numbers with appropriate context, and resist the temptation to present only the metrics that make marketing look good. That's how you build trust — and trust is what earns marketing its budget. Empirium builds the data infrastructure and reporting systems that produce honest marketing analytics. Let's build yours.

