Marketing Dashboards & Reporting
A dashboard that takes 45 minutes to understand every Monday is a failure. Marketing reporting should take 10 minutes to read and produce 30 minutes of decisions. Here's how to build reports that drive action.
The CMO who threw out 6 dashboards and built one
In 2022, the CMO of a UK retail chain walked into a Monday marketing review and found a 40-slide deck waiting for her. Every metric her five-person team tracked was in it. Channel by channel, week by week, comparison after comparison.
She asked one question: "What should we do differently this week based on this?"
The room went quiet. The deck contained everything except the decision.
She spent an afternoon rebuilding the reporting structure. The new format: one page. Five metrics. One commentary section with three sentences: what changed this week, why it changed, and what the team will do about it.
The Monday meetings went from 90 minutes to 20 minutes. Decision quality improved because the decision-making context was clear. Three months later, the team had completed eight A/B tests — none had been possible before because reporting had consumed all available time.
The data was always there. It was the format that was broken.
What a marketing dashboard should do
A dashboard is a decision support tool, not a data repository.
The question to ask about every metric on every dashboard: What decision does this metric enable? If the answer is "none" — if you can't describe a specific action you'd take based on this number being different — the metric doesn't belong on the dashboard.
Dashboard types serve different audiences:
| Dashboard type | Audience | Frequency | Metrics |
|---|---|---|---|
| Operational | Marketing team | Daily/weekly | Campaign spend, leads, CPL, CTR, anomaly flags |
| Performance | Marketing manager | Weekly | Channel KPIs, CPA by source, funnel conversion rates |
| Business | Leadership/CEO | Monthly | CAC, ROAS, MQL→SQL rate, revenue attribution, LTV trend |
| Strategic | Board/investors | Quarterly | LTV:CAC, payback period, channel mix, market share proxies |
The hierarchy: Operational dashboards track inputs and activities. Performance dashboards track marketing outcomes. Business dashboards track business outcomes. Strategic dashboards track health indicators over time.
The anatomy of a useful marketing report
The weekly performance report needs two parts: the metrics themselves and a commentary that interprets them. The three-sentence commentary model:
- What happened (the number, the direction, the magnitude)
- Why it happened (the root cause — not speculation)
- What's being done about it (specific action, specific person, specific timeline)
Marketing reports that just show numbers require the reader to figure out the "so what." Reports that follow the three-sentence model give the reader the analysis and the recommendation.
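The three-sentence model can be captured as a simple template so every weekly report follows the same shape. A minimal sketch — the `Commentary` class, its field names, and the example figures are illustrative, not a prescribed schema:

```python
# Three-sentence commentary model: what happened, why, what's being done.
from dataclasses import dataclass

@dataclass
class Commentary:
    what: str    # the number, the direction, the magnitude
    why: str     # the root cause, not speculation
    action: str  # specific action, specific person, specific timeline

    def render(self) -> str:
        return (
            f"What happened: {self.what}\n"
            f"Why: {self.why}\n"
            f"Action: {self.action}"
        )

# Example entry (invented figures):
note = Commentary(
    what="CPL rose 22% week-over-week, from £41 to £50.",
    why="A competitor entered the auction on our top three keywords.",
    action="Sam will launch two new ad variants by Thursday.",
)
print(note.render())
```

Forcing commentary into three named fields makes it hard to submit a report that states a number without a cause and an owner.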
Building your metrics stack
Start with business outcomes:
| Layer | Metric | Target | Source |
|---|---|---|---|
| Business | Revenue from marketing | £X/month | CRM / payment processor |
| Business | New customers acquired | Y/month | CRM |
| Business | CAC | £Z | Calculation |
| Marketing | Marketing qualified leads | A/month | CRM |
| Marketing | CPL | £B | Calculation |
| Marketing | ROAS (e-commerce) | C× | GA4 |
| Channel | Channel-by-channel CPL | Various | Platform reports |
| Channel | Landing page conversion rate | D% | GA4 |
| Channel | Email open/click rates | E% | Email platform |
The north star metric: Each business should identify one metric that best represents health and progress. For SaaS: MRR or ARR. For e-commerce: revenue ÷ CAC (revenue efficiency). For lead generation: SQL pipeline value. Everything else either explains why the north star is moving or is subordinate to it.
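The calculated metrics in the stack above are simple ratios. A sketch of the core calculations, with made-up figures — plug in your own CRM and ad-platform exports:

```python
# Core calculated metrics from the stack above.

def cac(marketing_spend: float, new_customers: int) -> float:
    """Customer acquisition cost: total spend / customers acquired."""
    return marketing_spend / new_customers

def cpl(marketing_spend: float, leads: int) -> float:
    """Cost per lead: total spend / leads generated."""
    return marketing_spend / leads

def roas(revenue: float, ad_spend: float) -> float:
    """Return on ad spend: attributed revenue / ad spend."""
    return revenue / ad_spend

def ltv_to_cac(ltv: float, cac_value: float) -> float:
    """LTV:CAC ratio — the strategic health indicator."""
    return ltv / cac_value

# Invented monthly figures for illustration:
spend, customers, leads = 12_000.0, 40, 300
print(f"CAC: £{cac(spend, customers):.0f}")    # £300
print(f"CPL: £{cpl(spend, leads):.0f}")        # £40
print(f"LTV:CAC: {ltv_to_cac(900.0, cac(spend, customers)):.1f}:1")  # 3.0:1
```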
Anomaly detection: when to escalate
Not every metric change is worth reporting. The skill of good reporting is distinguishing signal from noise:
Normal variance (don't escalate):
- Day-to-day fluctuations within 10–15% of average
- Week-over-week changes within seasonal patterns
- CPCs moving 5–15% (normal auction variability)
Worth investigating (investigate before escalating):
- A metric moving 20%+ week-over-week without an obvious cause
- A metric declining for 3+ consecutive weeks
- A metric suddenly diverging from its historical relationship with another metric (e.g., traffic stable but conversions dropping)
Escalate immediately:
- Conversion tracking stops firing (zero conversions suddenly)
- Delivery rate drops on email (deliverability issue)
- Ad account disapproved or suspended
- Landing page goes down during an active campaign
Building anomaly detection:
- Set weekly performance thresholds for key metrics: "If CPL exceeds £X, investigate"
- Check daily for operational anomalies: spend spikes, tracking failures
- Use GA4's anomaly detection or platform alerts for automated flagging
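The "worth investigating" rules above can be expressed as a small check you run over weekly metric values. A sketch using the thresholds from the text (20% week-over-week, three consecutive declines); the function name and defaults are assumptions:

```python
# Flags a metric for investigation per the rules above:
# >20% week-over-week move, or 3+ consecutive weekly declines.

def wow_change(current: float, previous: float) -> float:
    """Fractional week-over-week change."""
    return (current - previous) / previous

def needs_investigation(weekly_values: list[float],
                        wow_threshold: float = 0.20,
                        decline_weeks: int = 3) -> bool:
    # Rule 1: latest week moved more than the threshold vs the prior week.
    if len(weekly_values) >= 2:
        if abs(wow_change(weekly_values[-1], weekly_values[-2])) > wow_threshold:
            return True
    # Rule 2: declined for `decline_weeks` consecutive weeks.
    recent = weekly_values[-(decline_weeks + 1):]
    declines = [later < earlier for earlier, later in zip(recent, recent[1:])]
    return len(declines) >= decline_weeks and all(declines)

leads = [210, 205, 198, 190]           # three straight declines, each small
print(needs_investigation(leads))       # True
print(needs_investigation([200, 204]))  # False: +2% WoW, normal variance
```

Run this against each key metric when the weekly report is assembled; anything flagged gets a root-cause note before the Monday meeting, not during it.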
There Are No Dumb Questions
"How do I get buy-in from leadership on marketing attribution when they only understand revenue?"
Meet them where they are. Start with revenue and work backward: "Marketing generated [X] customers this quarter. At an average LTV of £[Y], that's £[Z] in expected lifetime revenue. Our total marketing investment was £[A], giving us an LTV:CAC of [B]:1." Then show what improved or declined. Leadership doesn't need to understand UTM parameters — they need to understand whether marketing is creating value relative to what it costs. Translate every metric into pound signs or business outcomes before presenting to non-marketing stakeholders.
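The backward-from-revenue framing above is simple arithmetic. A sketch with invented placeholder figures, producing the leadership-facing sentence:

```python
# Translating attribution data into the revenue-first framing above.
# All figures are invented placeholders.
customers, ltv, spend = 80, 900.0, 24_000.0

lifetime_revenue = customers * ltv  # expected lifetime revenue: £72,000
cac = spend / customers             # £300 per customer
ratio = ltv / cac                   # LTV:CAC of 3.0

print(
    f"Marketing generated {customers} customers this quarter. "
    f"At an average LTV of £{ltv:,.0f}, that's £{lifetime_revenue:,.0f} "
    f"in expected lifetime revenue. Total investment was £{spend:,.0f}, "
    f"giving an LTV:CAC of {ratio:.0f}:1."
)
```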
"Should I use Google Looker Studio, a spreadsheet, or a proper BI tool for my marketing dashboard?"
Start with what you'll actually use and maintain. For most marketing teams: Google Looker Studio (free, connects to GA4 and Google Ads directly) for visual dashboards; Google Sheets for manual tracking and bespoke calculations. Dedicated BI tools (Tableau, Metabase, Amplitude) are worth the investment only once you have a data analyst to own them. The best dashboard is the one that's actually looked at — a beautifully designed Looker Studio dashboard that no one opens is worse than a shared Google Sheet that the team updates every Monday.
Back to the CMO
The 40-slide deck hadn't been failing because it contained the wrong data — it had been failing because it required everyone in the room to do the analysis themselves, slide by slide, while the meeting burned. The CMO's one-page replacement didn't reduce information; it moved the interpretation from the meeting to the document. Five metrics, three sentences, one commentary section: what changed, why it changed, what we're doing about it. Meetings went from 90 minutes to 20. Eight A/B tests were completed in the following quarter — none had been possible before because reporting had consumed all the available thinking time. And the decisions that used to happen after the meeting, over email, days later, started happening in the room. One dashboard. One decision-making rhythm.
Key takeaways
- A dashboard is a decision tool, not a data display. Every metric should enable a specific decision. If it doesn't, it doesn't belong.
- Different audiences need different views. Operational dashboards for daily team use; performance dashboards for weekly review; business dashboards for leadership; strategic dashboards for quarterly planning.
- The three-sentence commentary format. What happened, why it happened, what's being done. Reports that only show numbers require the reader to do the analysis — good reports do it for them.
- Anomaly detection distinguishes signal from noise. Set thresholds, monitor daily for operational failures, and reserve escalation for genuine outliers.
- Translate metrics into £ for non-marketing stakeholders. Revenue, customers acquired, and LTV:CAC are universally understood. UTM-level detail is not. Meet your audience where they are.
Knowledge Check
1. A marketing manager's weekly report shows: website traffic +8%, email open rate +4%, social followers +220, blog page views +12%, LinkedIn impressions +34%, webinar registrations +15%. The CEO asks: 'Did we acquire more customers this week?' The manager doesn't know. What does this illustrate?
2. A marketing dashboard shows: this week's leads = 187, last week's leads = 203, target = 175. The change is −7.9% week-over-week. How should this be reported and what action, if any, should be taken?
3. A marketing team spends 6 hours every Friday building a 35-slide performance deck for Monday's review meeting. The meeting takes 90 minutes. The CEO reads 3–4 slides and asks follow-up questions that the team can't answer because the data isn't in the deck. What should change?
4. A marketing director presents to the board: 'Our email open rate is 34%, which is 12 points above the industry average. Our blog generates 45,000 monthly page views, up 23% year-over-year. Our social media engagement rate is 4.2%.' The board asks: 'What's our marketing ROI?' How should the director respond?