Disciplines: User Research, UX / Visual Design, Product Strategy
Role: UX Lead
Duration: 8 months
Team: 3-5 members
Deliverables: Research Reports, Low-fidelity Wireframes, High-fidelity Prototypes
Skills: Mixed-methods Research, Statistical Analysis, UX / UI Design
As UX Lead, I guided a cross-functional team through a complete redesign of a global enterprise’s support experience. Though the project was originally scoped as a virtual agent design, our team uncovered more foundational issues with content strategy, user trust, and experience design. By reframing the problem and aligning with stakeholders, we delivered significant improvements to user outcomes and business efficiency.
The client’s internal support portal was underperforming, creating friction for employees and burdening support teams. Employees often abandoned official workflows, instead contacting support through backchannels. This behavior skewed analytics, slowed ticket resolution, and hurt the credibility of the support system itself.
Our initial ask was to design a virtual agent that could redirect these inquiries. However, early stakeholder conversations revealed that the assumptions about user behavior behind this request were only partially grounded in evidence. Stakeholders believed users were lazy or resistant to the process, an explanation that felt too simplistic and risked masking deeper UX failures.
I guided stakeholders through a discussion on problem framing, explaining that what we often perceive as “laziness” is rooted in human psychology and problem-solving behavior. Drawing parallels to cognitive biases and design constraints, I made the case that we should first explore user motivations and barriers before committing to a solution.
To their credit, stakeholders gave us space to investigate. We shifted from solutioning to problem definition, aligning our team around the hypothesis that the failures were systemic, not just behavioral.
We conducted mixed-methods research to explore how users interacted with the support system.
Highlighted pain points in trust, past failures, and lack of clarity
Surfaced consistent dissatisfaction with findability and usefulness
Identified high bounce rates and failure loops in key flows (see the sketch after this list)
Mapped misalignments between system labels and user mental models
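To make the “failure loop” pattern concrete, the sketch below shows the kind of session analysis involved. It is illustrative only: the session schema, field names, and revisit threshold are hypothetical, not the client’s actual analytics pipeline.

```python
# Hypothetical session records: each has an id, the support pages visited in
# order, and whether the session ended in a resolution event.
from collections import Counter

def find_failure_loops(sessions, min_visits=3):
    """Flag sessions that hit the same page repeatedly yet never resolved."""
    flagged = []
    for s in sessions:
        counts = Counter(s["pages"])
        looped = [page for page, n in counts.items() if n >= min_visits]
        if looped and not s["resolved"]:
            flagged.append((s["id"], looped))
    return flagged

sessions = [
    {"id": "a1", "pages": ["search", "faq", "search", "faq", "search"], "resolved": False},
    {"id": "b2", "pages": ["search", "article", "ticket"], "resolved": True},
]
print(find_failure_loops(sessions))  # [('a1', ['search'])]
```

Sessions like “a1,” which cycle back to the same page without ever resolving, were the quantitative signature of the workaround behavior users described in interviews.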
One user summarized the sentiment well: “I tried going through the appropriate process, but at the end of the day I still had to call someone.”
Patterns pointed to a core insight: users weren’t bypassing the system because they were lazy; they bypassed it because previous attempts to use the portal had failed them. They had learned the system couldn’t be trusted to deliver results, and they resorted to workaround behaviors.
We developed a strategic proposal to restructure the support experience at multiple levels. This proposal was organized into three progressive stages based on user needs and business readiness.
We began with a comprehensive content audit, cataloguing over 500 support categories. We imported these into Mural, clustered them by theme, and analyzed labeling patterns and redundancies. Many categories overlapped in vague or inconsistent ways, undermining search and navigation.
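The audit itself was a manual clustering exercise in Mural, but the redundancy problem it exposed is easy to illustrate programmatically. The sketch below, with invented category names and a tunable threshold, flags label pairs that share most of their words, the pattern behind many of the overlaps we found.

```python
# Illustrative only: the category labels and threshold are invented, not the
# client's real taxonomy. Flags label pairs with high word overlap (Jaccard).
from itertools import combinations

def flag_overlaps(labels, threshold=0.5):
    """Return label pairs whose word sets overlap above the threshold."""
    pairs = []
    for a, b in combinations(labels, 2):
        wa, wb = set(a.lower().split()), set(b.lower().split())
        jaccard = len(wa & wb) / len(wa | wb)
        if jaccard >= threshold:
            pairs.append((a, b, round(jaccard, 2)))
    return pairs

categories = ["Password Reset", "Reset Password", "Password Issues", "VPN Access"]
print(flag_overlaps(categories))  # [('Password Reset', 'Reset Password', 1.0)]
```

At the scale of 500+ categories, even a rough pass like this surfaces the duplicate and near-duplicate labels that undermine search and navigation.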
We used tree testing via Optimal Workshop to evaluate the clarity and logic of reorganized taxonomies. We ran three rounds of testing with 65 participants, adjusting the taxonomy based on user input and failed tasks. As the structure improved, misclassification rates dropped significantly.
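As a rough illustration of how we tracked improvement across rounds, the sketch below tabulates task success per round. The CSV layout and file name are assumptions for the example; Optimal Workshop provides its own reporting.

```python
# Assumes a hypothetical export with columns: round, participant, task, success
# (success is 1 for a correct destination, 0 for a misclassification).
import csv
from collections import defaultdict

def success_by_round(path):
    """Return the fraction of successful tree-test tasks for each round."""
    totals = defaultdict(lambda: [0, 0])  # round -> [successes, attempts]
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            totals[row["round"]][0] += int(row["success"])
            totals[row["round"]][1] += 1
    return {r: round(s / n, 2) for r, (s, n) in sorted(totals.items())}

# A rising success rate round over round means falling misclassification,
# e.g. {'1': 0.54, '2': 0.71, '3': 0.86}
print(success_by_round("tree_test_results.csv"))
```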
The final taxonomy was aligned to ServiceNow’s system capabilities and handed off to the Knowledge Management team for implementation.
With the content groundwork in place, we turned our attention to the portal interface. Our primary design goal was to reduce dependency on support providers by improving self-service outcomes. We mapped new user flows based on our improved information architecture, aligning them to common user goals. Rapid iteration let us test low-fidelity prototypes early and often.
Despite this, we encountered repeated breakdowns during developer handoff. While initial sprint reviews were positive, engineering later deemed our solutions too complex. This created delivery delays and eroded trust.
To resolve the issue, I restructured our design validation process:
Introduced mid-fidelity feasibility checkpoints to flag complexity earlier
Embedded system documentation into design artifacts
Facilitated cross-team retrospectives to rebuild rapport
These steps restored alignment and enabled us to deliver a feasible, validated interface redesign.
Though originally staffed to design a predetermined solution to a misunderstood problem, we not only improved stakeholders’ metrics but also helped restore their confidence in the portal as a product. This led to the successful relaunch of a virtual agent that had failed in the past.
In rising to the challenge of this project, we were able to transform how a global enterprise empowered and supported its workforce.
“Users are lazy” is an indicator of design failure, not a user flaw. We must design systems that account for human psychology.
Politics are part of the process. Understanding power dynamics and organizational pressures allowed us to build better alignment.
Assumptions drain velocity. Teams need shared definitions and explicit constraints up front to move quickly without rework.