Connected spend, at the granularity marketers actually plan in
When marketers plan multi-quarter campaigns but the system only shows full-year totals, the gap doesn't show up as a usability bug — it shows up as spreadsheets, escalations, and reconciliation work that no one tracks but everyone feels.
I led the end-to-end design of Time Phasing through handoff: the discovery, problem framing, configuration model, and final design for surfacing connected spend at quarterly granularity.
timeline
Q1 2025 – Q2 2025 · ~5 months
team
2 PMs · Eng Lead · Devs · Professional Services
platform
Marketing Performance Management · B2B SaaS
my role
Lead Designer (through design handoff)
context
Enhancement to Uptempo's connected spend system
the problem
Marketers at Fortune 500 customers — IBM, Cisco, SolarWinds — were planning campaigns that played out across four quarters. But the connected spend panel only showed full-year totals. There was no way to compare planned spend against estimated cost at a quarterly grain, no way to see whether Q3's budget was actually allocated where it needed to be, and no way to track multi-year activities cleanly.
The result: Marketers exported data into spreadsheets to do quarter-by-quarter math the product should have been doing for them. Power users had built tribal workarounds that broke whenever data refreshed. Enterprise readiness commitments were being held up by a gap that, on paper, looked like a small enhancement.
the challenge
How might we expose connected spend at quarterly granularity without breaking existing customer configurations, the underlying data model, or the mental model marketers already hold for the Plan and Spend modules?
The solution had to:
support multiple time-phasing models (FY-only, quarterly, multi-year)
respect existing data architecture — calendar year, fiscal year, and quarter were already three separate concepts in the system
work for both admins and marketers, who have fundamentally different mental models
from the product brief
Five separate signals pointed at the same gap — direct customer insights from Cisco, SolarWinds, and IBM; enterprise-readiness input from internal stakeholders; and internal feedback from Thao Ngo on Campaign Planning.
Before any design work began, I worked with both PMs to translate these signals into a single problem statement — so we weren't designing five features that happened to share a name.
my contribution
Led discovery, definition, and design through engineering handoff
Synthesized customer insights from five separate sources into a single problem framework
Coordinated across two PMs whose products both touched the same data model
Made the call to scrap V1 and rebuild from a high-level flow
Documented system logic in a flowchart that became the engineering source of truth
Defined the component model the budget tab now uses
outcomes at handoff
This case study covers the work through design handoff. Another designer carried the project into validation and ship, so the metrics on the live feature aren't mine to claim. The directional outcomes from the design phase:
3
Fortune 500 accounts unblocked on enterprise readiness scoping — IBM, Cisco, SolarWinds.
2
PMs aligned on a single problem framework — preventing two adjacent features from shipping with a seam between them.
4 → 1
Collapsed four competing config patterns into one hybrid model, before a line of production code was written.
∞
Established a component-based budget tab pattern that downstream teams reused.
DISCOVERY
A framework before transcripts
I ran a structured discovery framework with seven questions designed to cut through the surface ask: Who is this information for? At what stage in the workflow? Single activity or multiple? FY-to-FY or quarter-to-quarter comparison?
The reason I built the framework instead of jumping into transcripts was that the same word — time phasing — was being used by five different stakeholders to describe what I suspected were different problems. Without a shared question set, synthesis would just average the asks instead of separating them.
I synthesized customer transcripts from Cisco (quarterly breakouts), SolarWinds (budget-linking time phasing), IBM (calculated columns at activity level), and internal voices (B.B. on enterprise readiness; T.N. on campaign planning roll-overs).
The pattern that emerged confirmed the hypothesis: "time phasing" was shorthand for at least three distinct jobs, and each pointed at a different solution.
See breakdowns of connected spend across quarters — primary
Compare planned vs. estimated cost at a quarterly grain — secondary
Track multi-year activities cleanly — separate problem, scoped out
Naming the difference let me argue — with evidence — for which job V1 should solve.
Aligning two PMs on one problem
Two PMs owned adjacent products that both touched this work: one focused on the activity workflow side, one on budget/spend views. They each had customer asks pointing at "time phasing," but their customers wanted slightly different things.
I built a shared problem framework in FigJam — jobs, outcomes, scope boundaries, and explicit "not in this release" calls — and walked both PMs through it together. The framework became the artifact we revisited every time scope crept.
SOLUTIONING & TRADE-OFFS
Four config models, one decision
Once the jobs were clear, the question became how to expose time phasing in the configuration layer. Four models were viable. Each had a different cost.
Option 1 · Use Insights (existing)
Strength: Existing pattern, zero new UX
Cost: Short-term solution; same per-FY config structure as Insights
Option 2 · FY org-wide setting
Strength: One FY setting for Plan and Spend
Cost: New setting (similar to Insights); charts may break; roll-over edge cases
Option 3 · Per-budget FY setting
Strength: Each budget can support a custom FY
Cost: New setting; doesn't solve cross-budget views
Option 4 · Quarter per column
Strength: Granular control, custom config
Cost: Heavy config burden; complex per-column setup at scale
I documented each in FigJam with annotated screens, ran the matrix past both PMs and the eng lead, and made the case for a hybrid: master-level enable/disable + per-activity-type config + a view-level period switch (FY ↔ Quarterly).
The hybrid wasn't the simplest option. It was the one that respected the three constraints I couldn't trade off: existing customer configs, the data model, and the marketer mental model.
The data complexity V1 missed
What forced the rebuild wasn't a UI problem. It was a data-model problem hiding inside a UI problem.
This is the artifact that made the V2 case to eng and PMs. Once everyone could see that quarters meant different things to different customers, "fix the UI" stopped being a viable answer.
Scrapping V1
Halfway through, I marked the V1 Figma file deprecated and rebuilt from the high-level flow up.
V1 had tried to solve every edge case — calculated columns, currency tags, data category restrictions, multi-year rollovers — inside one budget tab. It was technically thorough and operationally unusable. When I walked it through with the partner team, we both saw the same thing: too much logic was being surfaced to the user, and the screens were starting to encode rules instead of expose them.
I rebuilt by mapping every conditional state first — EC enabled? Connected spend allowed on type? Has child activities? Time phasing enabled? — then designing the budget tab as a set of components that resolve into the right configuration based on those conditions, instead of one screen trying to be everything.
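For readers who think in code, the resolution idea above can be sketched as a small decision function: conditions in, one resolved budget-tab state out. Everything here — the flag names, the state labels, the precedence order — is a hypothetical illustration of the pattern, not Uptempo's actual data model or logic.

```typescript
// Illustrative sketch only: flag names and state labels are invented,
// and the precedence order is an assumption, not the shipped logic.
interface ActivityConditions {
  ecEnabled: boolean;              // estimated cost enabled?
  connectedSpendAllowed: boolean;  // connected spend allowed on this activity type?
  hasChildActivities: boolean;     // does the activity have children?
  timePhasingEnabled: boolean;     // quarterly breakout turned on?
}

type BudgetTabState =
  | "connected-spend-disabled"
  | "rollup-from-children"
  | "quarterly-breakout"
  | "fy-summary-only";

// Each condition check narrows toward exactly one resolved state,
// so every combination of flags lands on a defined configuration.
function resolveBudgetTab(c: ActivityConditions): BudgetTabState {
  if (!c.connectedSpendAllowed) return "connected-spend-disabled";
  if (c.hasChildActivities) return "rollup-from-children";
  return c.timePhasingEnabled ? "quarterly-breakout" : "fy-summary-only";
}

console.log(resolveBudgetTab({
  ecEnabled: true,
  connectedSpendAllowed: true,
  hasChildActivities: false,
  timePhasingEnabled: true,
})); // "quarterly-breakout"
```

The point of the sketch is the shape, not the specifics: once the states are enumerated exhaustively, each screen becomes one resolved case rather than one screen encoding every rule.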
The restart cost a sprint. It saved the project.
DESIGN
A system, not a screen
The V2 design wasn't one budget tab. It was a system that resolves into the right configuration based on the conditional state of each activity — whether connected spend exists at the parent level, what funding sources are linked, and what the activity hierarchy looks like.
I worked directly in hi-fi using Uptempo's existing design system. The configuration patterns, panels, and table components already existed — the work was composing them correctly against the conditional flow, not creating new primitives.
This kept the project moving fast and meant the budget tab landed visually consistent with the rest of the product on day one of handoff.
The shipped V2
Below: the V2 design organized into the two scenarios that mattered for shipping — activities without connected spend at the parent level, and activities with it. Each scenario shows the resolved budget tab states the system needs to handle.
Scenario 1: No connected spend at parent level (5 states) · Scenario 2: With connected spend at parent level (2 states)
HANDOFF
I handed the project off at design-complete at the end of Q2; another designer carried the work forward from there.
The deliverables: the conditional flowchart, the hi-fi screens, and the trade-off documentation explaining why we landed where we did.
reflection
The most important decision I made on this project wasn't a design decision — it was the decision to scrap V1 and rebuild from a flowchart. Senior design isn't about getting it right the first time. It's about being honest enough to admit when "right enough" is a trap, and structured enough to know what to build instead.
The second thing this project reinforced: when a problem looks like a UI gap, check the data model first. Time phasing read as a screen issue and was actually a mapping issue between spend columns and each customer's fiscal calendar. The Calendar vs. Fiscal Year exercise was what made that visible to the team — and once it was visible, the real shape of the work could be designed for, not designed around.
The third: aligning two PMs on a single problem framework before any design work started is what kept the project from fragmenting. Five customer signals were already being treated as five feature requests. The job of the lead designer in that moment wasn't to design — it was to make the problem unmistakable.

