Building Data Logic That Works Like Marketers Think

Attribute Dependencies is an enterprise feature that automates data validation for marketing teams managing complex campaign attributes, including teams at IBM Corporate, SolarWinds, and SAP Emarsys. When business rules govern which attribute combinations are valid - such as regional legal requirements or logistical constraints - the feature prevents invalid data entry by dynamically limiting available options based on user selections.

About Uptempo


Uptempo is a B2B SaaS company that provides marketing operations solutions for large enterprise teams. Their platform helps unify strategic planning, budgeting, and performance tracking in one place - enabling marketing leaders to align their investments with business goals, improve visibility, and make more data-informed decisions.


  • Uptempo Plan (SaaS Platform)

  • 2024-2025

  • Lead Product Designer

  • Project Manager, Engineering Lead, Lead Software Developer, Four (4) Supporting Software Developers, Two (2) Professional Services Team Members, & UX Technical Writer

  • Marketing Technology (Enterprise SaaS)

Problem

Marketing teams at enterprise clients like IBM and Cisco manage hundreds of campaigns with dozens of custom attributes. While some attributes follow standard hierarchies (like campaign structure or finance codes), many are governed by external business rules—regional compliance requirements, physical logistics, or legal obligations.

The core problem:
Users had to mentally track which attribute combinations were valid, leading to:

  1. Data errors - Invalid combinations slipping through without system validation

  2. Wasted time - Marketers constantly checking documentation or asking admins which values were allowed

  3. Low confidence - Users uncertain whether their entries would cause downstream issues


Real example from research:

A marketer selecting "Germany" as a region should only see EU-compliant campaign types, but the system showed all global options—forcing users to remember complex rules or risk compliance violations.
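A rule like this reduces to a simple lookup: the controlling selection narrows which dependent values remain available. The sketch below is illustrative only - the attribute names, values, and rule structure are hypothetical, not Uptempo's actual data model:

```python
# Hypothetical dependency rules: each maps (controlling attribute, selected
# value) to the dependent values that remain valid for another attribute.
DEPENDENCY_RULES = {
    ("Region", "Germany"): {"Campaign Type": ["EU Webinar", "EU Field Event"]},
}

def allowed_options(controlling, selected, dependent, all_options):
    """Return the dependent options valid for the current controlling selection."""
    rule = DEPENDENCY_RULES.get((controlling, selected), {})
    allowed = rule.get(dependent)
    if allowed is None:
        return list(all_options)  # no rule defined -> no restriction
    return [opt for opt in all_options if opt in allowed]

campaign_types = ["Global Webinar", "EU Webinar", "US Trade Show", "EU Field Event"]
# Selecting "Germany" hides the non-EU-compliant campaign types.
print(allowed_options("Region", "Germany", "Campaign Type", campaign_types))
```

With dependencies in place, the system applies the rule at the point of entry instead of leaving the marketer to remember it.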

challenge

How do we automate complex business logic without overwhelming users?

The technical capability existed to create dependencies, but the design challenge was threefold:

  1. Making invisible logic visible - Users needed to understand why certain options disappeared when they made selections

  2. Balancing control vs. automation - Some users wanted full automation; others needed transparency to maintain trust

  3. Scaling within constraints - The solution had to work within Uptempo's existing design system while introducing entirely new interaction patterns for dependency visualization


Key tension:
Too much automation feels like a "black box." Too little defeats the purpose. We needed to find the sweet spot where the system guides users without removing their agency.

IMPACT

Quantitative Results:

Based on post-launch analysis and stakeholder feedback:

40%

Reduction in time spent on attribute updates

Users no longer needed to manually verify valid attribute combinations against documentation

65%

Decrease in data inconsistencies flagged

Invalid combinations were prevented at the point of entry during campaign reviews

Qualitative wins:


Reduced support burden

Customer Success teams reported noticeably fewer tickets related to invalid attribute combinations and data corrections

Increased user confidence

Marketers expressed greater trust in their data entry, knowing the system prevented compliance violations and invalid combinations

Competitive advantage

Feature became a key differentiator in enterprise sales demos, particularly for clients with complex compliance requirements

Immediate adoption

Power users integrated the feature into their workflows with zero additional training required

Business Value:
By preventing errors upstream, clients avoided costly downstream fixes in reporting and compliance audits—directly impacting their ROI on the Uptempo platform.

NOTE:
Some platform screenshots and sticky notes are blurred throughout this case study. These represent my design process at different stages - from research synthesis to testing - while keeping customer data and internal materials confidential.

My goal is to show the thinking and collaboration behind each stage, even when specific content can’t be shared.

how customers experienced the feature

  • Attribute Dependencies cut down errors and manual work, making our data much easier to manage.

    —Operations Lead, SAP Implementation Partner

  • Attribute Dependencies gave our admins more control while keeping reporting data aligned across teams.

    —Data Systems Manager, SolarWinds

  • Attribute Dependencies feature keeps our marketing data clean and saves us from endless manual checks. Setting up rules feels intuitive - we got it right the first time without needing extra documentation.

    —Marketing Operations Manager, IBM Corporate

design process

part 1 | UX : discovery → definition

This part of the project focuses on understanding our users, their challenges, and our internal constraints - setting the foundation for design decisions that followed.

Due to confidentiality, some UX artifacts (research notes, Miro boards, internal workflows) are shown as static images and are not clickable or zoomable. Screenshots have been selectively cropped or blurred where necessary to protect customer and product information.

stage 1: AI-Assisted Research Synthesis

Using Claude AI to analyze meeting transcripts at scale

  • Collaborated with PM on stakeholder meetings - Attended IBM, SAP, and SolarWinds discovery sessions via Microsoft Teams, observing how users currently manage attribute dependencies across disconnected systems (Excel, emails, Jira)

  • Leveraged Claude AI for transcript analysis - Fed 3 Microsoft Teams meeting transcripts (2h 16min total) into Claude AI, asking it to identify recurring pain points, mental models, and feature requests across all sessions

Why AI-assisted synthesis: Traditional manual transcript analysis is time-intensive (an estimated 3-4 hours for 2+ hours of recordings). Claude AI produced a first-pass synthesis in ~10 minutes while maintaining qualitative depth; with manual validation, the full analysis took about 30 minutes.

Impact:

  • Reduced synthesis time by 90% (4 hours → 30 minutes including validation)

  • Identified pain points with 100% consensus across clients: multi-system tracking, no visual representation, dependency type confusion

  • Surfaced critical insight: Users understand concepts through real-world examples but struggle with abstract labels

Tools used: Microsoft Teams (meeting platform, transcripts), Claude AI (transcript analysis, pattern recognition)

stage 2: Visual Research Synthesis in FigJam

Organizing AI-generated insights into actionable design requirements

  • Created FigJam interview analysis board - Took Claude AI-extracted insights and organized them into 5 key categories for visual synthesis: Pain Points, Mental Models, What Works Well, Critical Issues, and Feature Requests

  • Color-coded by theme - Used visual categorization to make patterns immediately scannable for cross-functional team (Yellow: Pain Points, Purple: Mental Models, Green: What Works Well, Red: Critical Issues, Blue: Feature Requests)

  • Mapped insights to source clients - Tagged each finding with IBM, SAP, or SolarWinds to show cross-customer validation and identify company-specific vs. universal needs

Tools used: FigJam (visual synthesis, sticky notes), Claude AI insights (input), color-coding system

stage 3: Defining the Problem Space

Translating research insights into structured design challenges

After gathering user insights, I partnered with my PM to synthesize findings and structure our approach for the solutioning phase.

What I did:

  • Broke down the overarching problem into three main user jobs based on research findings

  • Divided each main job into micro-jobs for technical discussion with the development team

  • Created a framework to guide cross-functional solutioning sessions

Impact: Established clear scope boundaries grounded in user needs and prevented feature creep during design exploration.

stage 4: System Analysis + Competitive Review

Learning from legacy platforms to inform our direction

Uptempo was formed through the merger of three leading innovators - BrandMaker, Allocadia, and Hive9 - so many customers carried expectations shaped by those legacy platforms.

What I did:

  • Analyzed how customers had been using these legacy platforms

  • Identified what worked well and what didn't in existing dependency management approaches

  • Documented patterns and anti-patterns to inform our design direction

Hive9 user flow

Hive9 Platform
-Confidential

Allocadia user flow

Uptempo Spend Platform
-Confidential

BrandMaker user flow

BrandMaker Platform
-Confidential

Impact: This analysis helped us understand inherited user expectations and avoid repeating known usability issues from previous platforms.

stage 5: Internal Expert Validation

Connecting user insights with backend realities

Reached out to the Professional Services (PS) team, who manage customer onboarding and data setup.

What I learned:

  • How data configurations influence the user experience, connecting user insights with backend realities

  • The PS team shared UI/UX pain points they observed while supporting customers - confusing data entry flows, unclear field dependencies, inconsistent terminology

  • These insights validated research findings and uncovered new usability issues not immediately visible from Clarity data

Team Questions to Ask the PS Team
-Confidential

List of UX Concerns from the Professional Services Team

Why this mattered:
By collaborating with PS early, we spotted technical constraints and design opportunities before moving into ideation, ensuring solutions would be user-friendly and feasible to implement.

stage 6: Cross-Functional Solutioning Workshops

Transforming insights into design opportunities through collaborative ideation

After completing research, I gathered additional insights from the PS team, engineering lead, developers, and PM to understand technical limitations and backend dependencies.

Workshop Activities:
I facilitated design activities using a Miro board to encourage collaborative ideation:

  • "How Might We" exercises - Reframed user pain points into design challenges

  • Brainwriting sessions - Generated ideas across disciplines without groupthink

  • Crazy 8s sketching - Explored multiple solution directions quickly

Team Sketching from the 2nd Solutioning Workshop

Outcome: These sessions enabled the team to co-create early design ideas, identify promising directions, and ensure concepts met user goals with technical feasibility.

stage 7: Prioritization + Scope Definition

Navigating healthy conflicts to align on the MVP

Organized the final alignment session with the PM and engineering lead to establish feasibility and prioritization.

Prioritization Framework:

  • Built a Value vs. Effort prioritization matrix to evaluate each proposed solution

  • Mapped what would deliver the highest user and business impact against what was realistic to implement

Navigating Trade-offs:

  • This stage surfaced healthy conflicts - the PM focused on delivery timelines, while the engineering lead highlighted development complexity

  • I facilitated structured trade-off conversations, reframing debates around user value and possible MVP scope

  • By focusing on shared goals and user impact, we reached common ground

Prioritization Matrix

Final Outcome:
Agreement on an MVP that balanced usability, feasibility, and business goals - a clear roadmap that could deliver quick wins while planning for long-term improvements.
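A Value vs. Effort triage like the one we ran can be expressed as a small classifier. The 1-5 scale, thresholds, and quadrant names below are illustrative, not the actual scoring framework:

```python
def quadrant(value: int, effort: int, threshold: int = 3) -> str:
    """Place an idea on a Value vs. Effort matrix (illustrative 1-5 scores)."""
    if value >= threshold and effort < threshold:
        return "Quick win"   # high value, low effort -> strong MVP candidate
    if value >= threshold:
        return "Big bet"     # high value, high effort -> roadmap item
    if effort < threshold:
        return "Fill-in"     # low value, low effort -> do if time allows
    return "Avoid"           # low value, high effort -> cut from scope

# Hypothetical scores: inline deletion lands as a quick win,
# while modal deletion's extra effort pushes it to the roadmap.
print(quadrant(value=4, effort=2))  # Quick win
print(quadrant(value=4, effort=4))  # Big bet
```

Framing each debate in these terms kept trade-off conversations anchored to shared criteria rather than individual preferences.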


part 2 | UI : Solutioning → Design Execution

Building on the insights and priorities defined during the discovery phase, I moved into translating concepts into tangible designs, validating them with users, and refining the experience through iteration.

UI designs and prototypes in this section are clickable and expandable. These visuals are recreated versions of the original flows and contain no sensitive customer data.

stage 1: Component Exploration + First Design

Translating workshop insights into tangible design directions

After the solutioning workshop, I synthesized all the team sketches and ideas generated during brainstorming sessions.

Design Process:

  • Merged overlapping concepts and refined the most promising ideas into a cohesive end-to-end flow

  • The main focus was designing a table view that lets users easily view, edit, and delete attribute dependencies directly from each row

  • Since our existing UI library lacked a standardized table pattern, I created a new table component by combining existing Ant Design elements

  • Explored multiple layout and interaction options to define the most intuitive and scalable structure

First Version:

  • Created the first version of the design to visualize how all elements could come together

  • Focused on ensuring table interactions were intuitive and aligned with real user workflows

  • Reused as many components from our UI library as possible while adapting them for better usability

  • Presented it to the core UX team to gather feedback on layout hierarchy and clarity

Creating a Flexible Table Component by Combining Ant Design Components

Assembling the First Cohesive Design

Outcome: This version served as the foundation for subsequent refinements - bridging the gap between conceptual exploration and practical, system-ready design.

stage 2: Technical Feasibility Alignment

Balancing design vision with engineering constraints

Met with the engineering lead and developers to review technical feasibility and ensure consistency across the product.

Technical Reality:

  • Due to time constraints and the need for UI consistency, the engineering team suggested reusing an existing table component

  • That table supported row-level deletion but lacked inline editing, which was essential for this use case

  • The team confirmed inline editing could be added with minimal effort, but the migration from Angular to React introduced extra development time

Scope Adjustment:
To balance effort vs. value, I proposed adjusting the MVP scope:

  • Keep the existing table design for consistency

  • Enable inline deletion as part of the MVP

  • Exclude deletion via modal from this release and defer it as a future enhancement

Collaborating with Engineers

Impact: This alignment ensured a consistent user experience across features while maintaining a realistic delivery timeline.

stage 3: Validation Testing with PS Team + Customers

Testing with real users to uncover critical UX issues

Conducted three validation sessions with the PS team and representatives from IBM, SAP, and SolarWinds, who actively manage data configuration in our platform.

Testing Focus:
Focused on the dependency configuration flow, where users set relationships between controlling and dependent attributes during setup.

Creating dependencies:
The "Save" button was initially active throughout the process, allowing incomplete saves. Both PS and customers pointed out that it should remain disabled until both controlling and dependent attributes were selected. I updated the logic accordingly and added a tooltip to the disabled dropdown, improving clarity and preventing user errors.

Editing dependencies:
The inline editing experience tested extremely well. Both PS and customer participants noted that the direct edit-in-table approach made updates faster and more intuitive - no changes required.

Deleting dependencies:
For the MVP, I replaced the modal-based deletion with inline delete actions to streamline the flow and reduce clicks. The PS team and customers agreed the simplified interaction worked smoothly and aligned with their expectations. Modal deletion was excluded from MVP scope but documented for a future enhancement round.

stage 4: AI-Assisted Usability Analysis

Using Claude AI to analyze validation session recordings

  • Recorded validation sessions - Captured screen recordings of the PS team and customers (IBM, SAP, SolarWinds) testing the dependency configuration flow in live Figma prototypes

  • Leveraged Claude AI for behavioral analysis - Fed the 3 validation session recordings into Claude AI, asking it to identify interaction patterns, confusion points, and task completion blockers

  • Automated pattern detection - Claude flagged critical usability issues across sessions: button state errors appearing in 2/3 sessions, tooltip gaps noticed by all participants, and selection order confusion in first-time user attempts

  • Task completion analysis - Claude tracked success vs. failure patterns for the three core tasks (creating, editing, and deleting dependencies), highlighting that inline editing tested perfectly while dependency creation needed refinement

Why AI-assisted analysis: Watching 3 hours of validation recordings and manually noting every issue is time-intensive. Claude AI surfaced patterns across all sessions in ~15 minutes; with manual validation for context, the full analysis took about 30 minutes, and no critical usability issue was missed.

Impact:

  • Validated design decisions (inline editing worked perfectly - no changes needed)

  • Prioritized fixes based on frequency across sessions (button state issue: 2/3 sessions = high priority)

  • Reduced analysis time by 85% (3 hours → 30 minutes including validation)

Tools used: Figma (prototype), Microsoft Clarity, Claude AI (session analysis, pattern detection), manual validation for context

stage 5: Final Alignment + Handoff

Ensuring a smooth transition from design to implementation

Held a final review meeting with the PM and engineering lead to summarize testing outcomes and confirm what would ship in the MVP.

Handoff Process:

  • Reviewed the Value vs. Effort matrix to align priorities - confirmed inline editing and deletion as MVP features

  • Documented design specs and interaction logic directly in Figma for developer reference, including tooltips, button states, and validation rules

  • Collaborated closely during the handoff phase, clarifying interaction edge cases and verifying implementation details

  • Joined early feature flag testing sessions to confirm UI accuracy and functional parity with the prototype

Documentation Approach:
Created designer notes cards showing implementation details in a dev-ready Figma file, including:

  • Component states and variants

  • Interaction logic and validation rules

  • Tooltip copy and positioning

  • Edge case handling

Designer Notes cards showing implementation details in a dev-ready Figma file

Outcome: Smooth handoff with minimal back-and-forth during development, ensuring design intent was preserved in implementation.

stage 6: Pre-Launch Demo + Final Validation

Confirming the feature works as intended before release

Participated in a pre-launch demo session with the PS and engineering teams to review the feature under the live environment's feature flag.

Validation Process:

  • Invited customer representatives from IBM and SAP to observe the flow and provide early impressions

  • The feature performed as expected - no major usability issues were reported

  • Logged minor enhancements (e.g., tooltip delay timing, icon consistency) for post-launch iteration

Final Product

Impact: The demo acted as a final confidence check before rollout, ensuring design, technical, and business stakeholders were fully aligned.
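The Save-button rule that came out of validation testing - stay disabled, with a tooltip, until both sides of a dependency are chosen - reduces to a small piece of state logic. This is a hypothetical sketch; the function name and tooltip copy are mine for illustration, not the production implementation:

```python
def save_state(controlling_value, dependent_values):
    """Return (enabled, tooltip) for the Save button: disabled with a hint
    until both the controlling and dependent attributes are selected."""
    if not controlling_value:
        return False, "Select a controlling attribute first"
    if not dependent_values:
        return False, "Select at least one dependent value"
    return True, None

print(save_state("Germany", []))              # still disabled, with guidance
print(save_state("Germany", ["EU Webinar"]))  # both sides chosen -> enabled
```

Encoding the rule this way keeps incomplete dependencies out of the system while the tooltip tells users exactly what is missing.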

key learnings 💭

  • Early engineering alignment prevented significant redesign later in the process

  • Testing with PS + customers revealed edge cases no internal UX review could catch

  • System constraints are not blockers - they shape smarter, more scalable UX patterns

  • Maintaining consistency across features (table patterns, edit patterns) reduces cognitive load and dev effort

  • Clear microcopy and tooltips can eliminate major user confusion in complex workflows

  • AI-assisted behavioral analysis scales insight discovery beyond traditional usability testing limitations

project takeaway 🧠

This project reaffirmed the importance of designing within system and technical constraints, a reality of enterprise UX I'm very familiar with. It strengthened the need for structured, systems-first thinking when handling complex logic flows.

It also emphasized how early alignment with engineering and PS is essential - not optional - for preventing rework and keeping delivery efficient.

And ultimately, it served as a reminder that enterprise UX succeeds through clarity, predictability, and error prevention, enabling users to work with confidence and minimal friction.

