04 — UX & Product Design · Travelers Insurance

Business Audit Experience

How do you ask a business user to complete a complex insurance audit with as little friction as possible? By running the research to find out exactly what you can stop asking — then redesigning around that answer.

Travelers internal tool — details available on request
3 Research Methods · 2 User Groups · 17 Issues Identified
7
Users Studied (3 Auditors · 4 Customers)
6.0/7
Prototype vs. Current (86% of max)
17
Issues Identified & Severity-Categorized
6
Critical & High Issues Addressed Immediately
/ The Problem

Business users were completing insurance audits through a process that felt overwhelming and opaque — answering questions that weren't always necessary and uploading documents without understanding why. The goal wasn't just to simplify the UI. It was to determine the absolute minimum information and documentation needed for a valid audit, then design a flow that collected exactly that — no more.

/ My Role

I led end-to-end UX research and design — running click tests to identify navigation failures, designing and executing an A/B test on our data grid component, and conducting moderated user studies with two distinct audiences: internal auditors who administer the process and external customers who experience it. I introduced a new RAS (Recommended Adoption Score) tracking method to categorize and prioritize the 17 issues surfaced. Findings were synthesized into a stakeholder readout with formal recommendations.

/ Research Approach

Two user groups. One process. Completely different mental models.

Internal auditors understand exactly what documentation they need and why — but have no visibility into how their requests land on the other side. External customers experience the audit as an opaque, high-stakes checklist with no clear rationale. Designing for both meant running two research tracks in parallel: one to define what was truly required, and one to understand where external users were confused, hesitating, or providing incomplete information.

/ Research Methods
01

Click Testing

Identified 2 specific points of navigation confusion where users were directed off the correct path. Updated the language and removed the options that were derailing users before they reached the audit questions.

02

A/B Testing

Tested a live prototype migrating data tables from AG Grid to our in-house design system. Results were comparable across versions, with a slightly higher pass rate for the new component — the data-backed green light to proceed with the full conversion.

03

Moderated User Studies

Ran 7 sessions with 3 internal auditors and 4 external customers, using moderated think-aloud walk-throughs of a Figma prototype in UserZoom. Preliminary Questions scored 6.3/7 and Document Upload 6.1/7 on Ease of Use, with the prototype rated 6.0/7 against the current process. Sessions surfaced 17 distinct issues, severity-categorized using a new RAS (Recommended Adoption Score) tracking method introduced for this project.

/ New Methodology

17 issues. 6 that couldn't wait.

Rather than presenting a flat list of problems, I introduced the RAS — Recommended Adoption Score — as a new severity-categorization framework for tracking research findings at Travelers. The 17 issues identified were classified by impact: all critical and high-severity items (6 total) were flagged for immediate resolution. The RAS gives product and engineering teams a clear, prioritized action list rather than an undifferentiated pile of feedback.

/ Design Decisions

Ask less. Verify more.

The redesigned flow eliminated 3 questions entirely and reworded 4 others for clarity. Three UX patterns were introduced: conditional logic so questions only appear based on previous answers, a multi-step format breaking the audit into manageable stages, and inline validation giving users immediate feedback on their inputs. Three new features were added to reduce manual effort: a printable checklist, automatic pre-population of last year's data, and AI-powered document scraping — which reads uploaded payroll and tax documents, extracts the relevant data, and verifies it automatically so users don't have to enter it twice.
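As a rough illustration of the conditional-logic and inline-validation patterns described above, here is a minimal TypeScript sketch. It is hypothetical, not the production implementation; the names (AuditQuestion, showIf, visibleQuestions) and the sample questions are illustrative only.

    // Hypothetical sketch of the conditional-question pattern; not Travelers' actual code.
    type Answers = Record<string, string | number | boolean | undefined>;

    interface AuditQuestion {
      id: string;
      prompt: string;
      // The question renders only when this predicate (if present) passes.
      showIf?: (answers: Answers) => boolean;
      // Inline validation: return an error message, or null when the value is valid.
      validate?: (value: unknown) => string | null;
    }

    const questions: AuditQuestion[] = [
      { id: "usedSubcontractors", prompt: "Did you use subcontractors during the audit period?" },
      {
        id: "subcontractorPayroll",
        prompt: "Total amount paid to subcontractors",
        showIf: (a) => a.usedSubcontractors === true,
        validate: (v) =>
          typeof v === "number" && v >= 0 ? null : "Enter a dollar amount of 0 or more.",
      },
    ];

    // Only questions whose conditions are met by the answers so far are shown,
    // so users never see items that do not apply to them.
    function visibleQuestions(answers: Answers): AuditQuestion[] {
      return questions.filter((q) => !q.showIf || q.showIf(answers));
    }

In this sketch, visibleQuestions({ usedSubcontractors: false }) never shows the payroll question, and the validate hook gives immediate feedback on the input — the "ask less" behavior the redesign was built around.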

/ Outcomes
01

Prototype Beat the Status Quo

6.0/7 vs. current process · 6.3 prelim · 6.1 upload.

Across 7 moderated sessions, the prototype scored 6.0/7 against the current workflow, with Preliminary Questions at 6.3/7 and Document Upload at 6.1/7 on Ease of Use — clear evidence the redesign improved on what auditors and customers use today.

02

Critical Findings, Triaged Immediately

State-specific docs, password-protected files, downloadable summary.

Two CRITICAL issues (state-specific document and pay requests for workers' compensation (WC) audits) and four HIGH issues — including password-protected payroll files, a missing FL signature not being surfaced, and the lack of a downloadable audit summary — were routed straight into the build via the RAS framework.

03

AI-Powered Verification Added

Bulk uploads, automatically categorized.

Introduced AI document scraping and categorization as a new audit feature — automatically organizing 'mass dump' uploads from experienced insureds and verifying data from payroll and tax documents, eliminating a major manual burden for both auditors and customers.

/ What I Learned
/ Selected Artifacts


01 · Study Overview — Participants & Methodology
02 · Prototype Flow — Premium & Virtual Audit Document Collection
03 · Critical & High Severity Findings
04 · Combined Score Summary — SEQ Ease of Use