Every day, millions of banking transactions flow through the commercial ecosystem. Behind the scenes, data scientists work to ensure each transaction is properly categorized, a process that relies on accurate transaction data, or 'strings.' Until now, this had been a heavily manual process. We were about to change that.
As the sole UX designer for the Validation Tool ('Val'), I was tasked with reimagining how data scientists interact with these critical but complex transaction reviews. The challenge? Transform a manual, multi-system process into an intuitive workflow that would scale across the organization.
An early challenge was convincing my new team that a "rewind" of the project was necessary. To make the case, I quickly set up a usability test with actual users, then presented my findings alongside a time-bound action plan.
To understand the true scope of the challenge, we needed to see how data scientists actually worked. I structured our research in two parts: first, understanding their daily lives and aspirations (user interviews), and second, observing them clean transaction strings in real time (contextual interviews). Key findings revealed a process filled with friction:
Through journey mapping, we visualized these pain points and discovered something crucial: many of these system switches could be eliminated through API integration. This insight became the cornerstone of our solution strategy.
With multiple service design projects running in parallel, we faced a critical challenge: how do we prevent feature overlap and ensure each tool serves a distinct purpose? The solution came through collaborative design activities with our users.
First, we conducted card sorting sessions with data scientists to organize and assign functionalities across tools. The relief on their faces was immediate—finally, a clear understanding of which tool would handle which task.
Building on this clarity, we moved to hands-on wireframing sessions. I provided component libraries and guided teams as they designed their ideal workflows. This collaborative approach accomplished two crucial goals:
The result? A shared vision for each tool's role in the data validation ecosystem, with Val's specific purpose clearly defined and supported by user research.
While reusing an existing design can be efficient, it can also be shortsighted. I wanted to ensure we hadn't missed better solutions, so we took a step back to explore new possibilities.
Using InVision Freehand as our collaborative canvas, I facilitated ideation sessions that went beyond the original single-design approach. The team explored multiple concepts, focusing on two key challenges:
The initial concept featured a list view of strings alongside a comparison window. But was this the best approach? Our exploration would soon reveal that users had different ideas about what would make their workflow most efficient.
Through iterative testing with our Data Science team, two unexpected insights emerged that transformed our approach:
These insights helped us evolve from a basic data comparison tool into a more thoughtful workspace that considered both the emotional and functional needs of our users.
While we knew automating manual tasks would improve efficiency, the actual impact exceeded our expectations. One month after launch, the numbers told a compelling story:
Having our users in-house proved invaluable—we could quickly iterate, test, and refine based on real-time feedback. Looking ahead, these results are just the beginning. With a solid foundation built on user needs and validated through real-world use, we're well-positioned to continue evolving the tool as data science needs grow.
I learned that it's important not to simply take a handoff from a fellow designer and run forward. You must ask, "Is there anything missing from our evidence that would inform a better design?"