Rigorous Revenue Architecture Reporting
Created a comprehensive, AI-native revenue reporting system that gave executive leadership end-to-end visibility into marketing, sales, and customer success metrics across the entire customer journey. Deployed 37 production queries with automated weekly reporting, cutting report delivery from months to days and weekly prep to a 30-minute refresh, while saving ~$0.6M in ineffective marketing spend.

Impact at a Glance
- Report delivery: 3-6 months → 1-2 days
- Weekly reporting prep: 8-10 hours → 30 min
The Challenge
When I started at CloudTrucks, the Chief Revenue Officer (CRO) told me there was no owner for conversion metrics. He flagged this as a critical gap: without an owner, no one deeply understood the end-to-end customer lifecycle, and senior leaders lacked the clarity they needed to make strategic decisions about sales and marketing initiatives. The company was flying blind - without visibility into which areas of the bowtie funnel to focus on, it was impossible to measure the impact of initiatives, build business cases for new projects, or prioritize by expected impact.
Before This Project
Leadership had top-level metrics like close rates and revenue, but no detailed conversion data or segmentation. More critically, data integrity was frequently in question, so whatever reports leadership did receive were largely written off as unreliable. That forced leadership to manage go-to-market motions on intuition and gut feel - a costly and inefficient way to operate at scale.
The challenges were multi-layered:
- ❌ Data Quality Crisis - The foundational data integrity issues meant any reporting would be dismissed unless I could rebuild trust from the ground up
- ❌ Building Stakeholder Confidence - Even with clean data, I needed to validate outputs with subject matter experts and prove the reporting was rigorous enough to base decisions on
- ❌ Automation & Scale - Manual reporting was unsustainable - I needed to build automated workflows for a weekly cadence without constant manual effort
- ❌ Organizational Ownership - Navigating the politics of democratizing database access when a centralized data team was supposed to own all reporting
- ❌ Sustainable Handoff - Training my successor to maintain the same level of specificity and rigor I had established
What Made This Challenging
- ⚠️ Technical Complexity: Navigating complex data architecture and building deep expertise in schemas, relationships, edge cases, and gotchas
- ⚠️ Organizational Politics: Democratizing database access when a centralized data team was supposed to own all reporting
- ⚠️ Trust Building: Overcoming historical data quality issues and building stakeholder confidence from scratch
The Approach
My approach was to become the GTM data expert the CRO desperately wanted. I rolled up my sleeves and got elbow-deep in the data to understand it rigorously myself. Initially, I was directed to work with the data team and have analysts build these reports. But it quickly became clear that it would be faster, more accurate, and more reliable for me to build the reports myself - once I successfully navigated the ownership lines and democratized database access.
Phase 1: Deep Data Understanding
I started with an exhaustive dive into the data architecture - understanding schemas, relationships, edge cases, and gotchas. This wasn't just about writing queries; it was about understanding why the data looked the way it did and where the trust issues originated.
Phase 2: AI-Native Workflow Development
Once I understood the data model, I built an AI-native query generation workflow:
- Voice-to-Text - Used AI dictation tools (Wispr Flow, later Monologue) to describe reporting needs in plain English - the specific filters, segmentation criteria, and business logic
- Custom Gemini Gem - Created and trained a Gemini Gem specifically for our data architecture. I fed it our schemas and iteratively taught it the nuances and gotchas of our data model
- Query Generation - The gem converted my plain-English requirements into business logic specific to our data architecture, then generated BigQuery SQL (an illustrative example follows this list)
- Human-in-the-Loop - I manually reviewed every query, refined the logic, and used each correction as an opportunity to train the gem further - creating a flywheel that got better with every iteration
- Stakeholder Validation - Before deploying any report to production, I validated outputs with key stakeholders to ensure accuracy and build trust
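To make the output concrete, here is a minimal sketch of the kind of stage-conversion query this workflow produced, together with the review step around it. The project ID, dataset, table, and column names below are hypothetical placeholders, not the real schema.

```python
# Illustrative sketch only - hypothetical project, dataset, table, and column names.
from google.cloud import bigquery

client = bigquery.Client(project="my-gcp-project")  # placeholder project ID

# The kind of query the Gem would generate from a plain-English request like
# "weekly lead -> opportunity -> closed-won conversion by vertical".
FUNNEL_SQL = """
SELECT
  DATE_TRUNC(created_date, WEEK)                        AS week,
  vertical,
  COUNT(*)                                              AS leads,
  COUNTIF(stage IN ('Opportunity', 'Closed Won'))       AS opportunities,
  COUNTIF(stage = 'Closed Won')                         AS closed_won,
  SAFE_DIVIDE(COUNTIF(stage = 'Closed Won'), COUNT(*))  AS lead_to_won_rate
FROM `my-gcp-project.revops.sfdc_leads`                 -- placeholder table
GROUP BY week, vertical
ORDER BY week, vertical
"""

# Human-in-the-loop step: run the generated query and spot-check the numbers
# with stakeholders before the query is promoted into the weekly report.
for row in client.query(FUNNEL_SQL).result():
    print(dict(row))
```

In practice the SQL came out of the Gem; my role was the review loop at the bottom and feeding every correction back into the Gem's instructions.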
Phase 3: Automated Deployment
Once queries were validated, I embedded them into an automated workflow (a minimal refresh sketch follows the list below):
- BigQuery → Google Sheets → Google Slides
- One-click weekly refresh instead of 8-10 hours of manual copy-paste drudgery
- Reduced weekly reporting prep from a full day to 30 minutes
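Below is a minimal sketch of what the refresh step can look like, assuming a service account with access to both BigQuery and the target Google Sheet. The project, credentials file, table, and sheet names are placeholders, and a production setup could just as easily use Connected Sheets or Apps Script for the BigQuery → Sheets hop.

```python
# Minimal sketch of the one-click refresh. All names are placeholders.
import gspread
from google.cloud import bigquery

bq = bigquery.Client(project="my-gcp-project")          # placeholder project ID
gc = gspread.service_account(filename="sa-key.json")    # placeholder credentials

# Pull the pre-validated reporting query into a DataFrame.
df = bq.query(
    "SELECT * FROM `my-gcp-project.revops.weekly_revvitals`"  # placeholder table
).to_dataframe()

# Overwrite the raw-data tab that the reporting charts are built on.
ws = gc.open("RevVitals Data").worksheet("raw")         # placeholder sheet/tab names
ws.clear()
ws.update([df.columns.tolist()] + df.astype(str).values.tolist())

# The linked charts in the Google Slides deck then refresh via
# Tools -> Linked objects -> Update all, replacing the old screenshot-and-paste step.
```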
Phase 4: Phased Rollout
I followed change management best practices by slowly dripping new reports into the weekly RevVitals cadence. Rather than overwhelming senior leadership with everything at once, I carefully managed the flow of new information so they could digest insights incrementally until we reached full depth and specificity.
The rollout followed a logical sequence:
- Sales Pipeline - Standard sales stage progression with segmentation by customer cohorts, verticals, and individual salespeople
- Marketing Attribution - Upstream metrics, campaign performance, and forecasting
- Customer Success - Downstream retention, expansion, and health metrics
- Business Case Simulator - Expected project impact modeling using DoubleLoop to pinpoint exactly which funnel metrics an initiative would affect and forecast the expected change to business outcomes (a toy example of the funnel math follows this list)
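The funnel math behind the simulator is straightforward; the sketch below uses invented numbers purely to show the shape of the calculation that DoubleLoop handled for us.

```python
# Toy funnel-impact model - every number here is invented for illustration.
def closed_won_revenue(leads, lead_to_opp, opp_to_won, avg_deal):
    """Push a lead count through two conversion stages to projected revenue."""
    return leads * lead_to_opp * opp_to_won * avg_deal

baseline = closed_won_revenue(leads=1_000, lead_to_opp=0.20, opp_to_won=0.25, avg_deal=8_000)
# Proposed initiative: lift opp-to-won by three points, everything else held constant.
scenario = closed_won_revenue(leads=1_000, lead_to_opp=0.20, opp_to_won=0.28, avg_deal=8_000)

print(f"Projected incremental revenue: ${scenario - baseline:,.0f}")  # -> $48,000
```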
Phase 5: Knowledge Transfer
After months of managing this system myself, I was moved to a high-impact strategic project implementing the enterprise voice AI program. I trained my successor, documented the workflows, and ensured they could maintain the same rigor and specificity I had established.
Key Phases
- Deep data understanding (schemas, relationships, gotchas)
- AI-native workflow development (voice dictation → custom Gem → BigQuery SQL)
- Automated deployment (BigQuery → Sheets → Slides, one-click refresh)
- Phased rollout (sales pipeline → marketing → customer success → business case simulator)
- Knowledge transfer and documentation
Technology Stack
- AI Tools: Gemini Gems (custom-trained for data architecture), Wispr Flow/Monologue (voice dictation)
- Data & Analytics: BigQuery (queries), Google Sheets (data layer), Google Slides (presentation layer), DoubleLoop (business case modeling)
- Data Sources: Salesforce, internal customer performance data
Results & Impact
Quantifiable Metrics
37 Production Queries Deployed across marketing, sales, customer success, and operations - covering the entire customer journey from awareness through expansion.
Report Delivery Acceleration: 3-6 months → 1-2 days
The old process for a single report request: submit a ticket to the data team, wait 1-2 weeks for sprint prioritization, wait another 1-2 months for initial output, then spend another 1-2 months in back-and-forth refinement. If you were lucky, you'd get something usable in 3-6 months. If you weren't, the request would die in the backlog.
With my AI-native workflow, I went from concept to a production-ready, stakeholder-validated report in days. This represented a 50-100× acceleration in time-to-insight.
Weekly Reporting: 10 Hours → 30 Minutes
The weekly RevVitals prep used to consume a full workday: 8-10 hours of manual drudgery pulling data, taking screenshots, copy-pasting into slides, formatting, and checking for errors. After automation, it took 30 minutes for a one-click refresh and final review.
~$0.6M Saved in Low-Value Marketing Spend
The detailed conversion tracking revealed that paid advertising wasn't performing as effectively as originally believed. We cut ~$0.6M in marketing spend that wasn't delivering ROI.
Strategic Pivot: Acquisition → Retention
The reporting highlighted critical gaps in customer retention - we were spending heavily on acquisition while operating a leaky bucket. This insight drove a strategic shift toward retention even as we reduced acquisition spend.
Organization-Wide Adoption
Every member of the C-suite and senior leadership used these reports: CEO, CTO, CFO, COO, CRO, Head of Sales, Head of RevOps, Head of Customer Success. Frequency ranged from daily (for operators) to weekly (for executives in RevVitals meetings).
Qualitative Impact
New Standard of Excellence
This project set a new bar for rigor and specificity in data-driven decision-making. Once senior leadership experienced having the information they needed to make confident decisions, they never wanted to go back to flying blind.
Cultural Shift in Data Access
The success of my AI-native workflow changed expectations around data democratization. I trained a handful of other individual contributors on my SQL-generating Gem workflow, and they were able to build the reports they needed much more quickly and effectively than routing requests through a centralized team.
From "What Happened?" to "What Should We Do Next?"
The reporting layer evolved into a strategic modeling capability. Using the detailed bowtie funnel metrics in conjunction with DoubleLoop, we could forecast the expected impact of any proposed initiative on specific funnel metrics - enabling rigorous ROI modeling and prioritization based on projected business impact rather than gut feel.
Unexpected Wins
- 💡 Strategic Planning Rigor: The detailed reporting enabled a level of strategic planning rigor that hadn't been possible before. We could model expected outcomes with precision, showing exactly which metrics in the bowtie funnel a given project would impact, by how much, and what that meant for overarching business goals. This gave us the ability to prioritize projects by expected ROI rather than by the loudest voice in the room.