Company-Wide Quantitative / Qualitative Research
A company-wide experience survey across nearly 3,000 employees — designed to surface friction, benchmark internal tools, and give leadership a data-backed lens on where to invest next.
Were we actually improving the employee experience through our internal tools — or just assuming we were? Leadership lacked a consistent, measurable way to know.
I co-led survey design and owned the qualitative analysis — building the coding framework, categorizing 1,900+ open-text responses, and translating findings into stakeholder-specific presentations for both technical teams and leadership.
Numbers for the what. Commentary for the why.
We fielded a survey to ~3,000 employees measuring SUS, Ease of Use, Value, Learnability, and Responsiveness per system — segmented by role, tenure, and business unit. Quantitative scores provided benchmarks; open-ended commentary provided the why behind them.
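SUS here refers to the standard ten-item System Usability Scale, which converts 1–5 Likert responses into a 0–100 score. A minimal sketch of that standard scoring (the function name is illustrative, not from the actual analysis pipeline):

```python
def sus_score(responses):
    """Compute a System Usability Scale score (0-100) from the ten
    standard 1-5 Likert item responses.

    Odd-numbered items are positively worded and contribute (score - 1);
    even-numbered items are negatively worded and contribute (5 - score).
    The summed contributions are scaled by 2.5 to yield a 0-100 range.
    """
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 item responses")
    total = 0
    for item_number, score in enumerate(responses, start=1):
        total += (score - 1) if item_number % 2 == 1 else (5 - score)
    return total * 2.5


# A respondent who strongly agrees with every positive item (5) and
# strongly disagrees with every negative item (1) scores the maximum:
print(sus_score([5, 1, 5, 1, 5, 1, 5, 1, 5, 1]))  # → 100.0
```

Per-system scores like these can then be averaged within each role, tenure, or business-unit segment to produce the benchmarks described above.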
* Portions of these artifacts have been redacted to protect confidential company data, proprietary metrics, and internal commentary.
Roadmap influence
Directly shaped Q1 2024 prioritization.
Findings determined which systems were prioritized for improvement in the following cycle.
Baseline established
First year-over-year UX benchmark at the company.
Teams now had year-over-year benchmarks to measure against, shifting internal conversations from opinion to data.
Executive alignment
Two tailored readouts — technical and strategic.
The same research was packaged two ways: detailed findings for technical leads, strategic summary for leadership.
- Structure is critical at scale.
Analyzing nearly 2,000 qualitative responses taught me to establish coding frameworks early and test them before diving into full analysis. Without clear structure upfront, patterns get lost in the volume.
- Context makes metrics meaningful.
SUS scores provided benchmarks, but pairing them with user commentary revealed the real story. A low score could mean different things — poor usability, missing features, or simply high expectations.
- Stakeholder communication is as important as research.
I learned to translate findings differently for different audiences: technical teams wanted specifics, leadership needed strategic implications. The same insights packaged differently created much stronger impact.
- Measurement drives change.
Simply tracking experience metrics consistently shifted how teams viewed their internal tools. Once we established baselines and made scores visible, teams became invested in improving their systems' performance year over year.
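The coding-framework lesson above can be sketched in miniature: a keyword codebook, defined and tested before full analysis, gives every open-text response a consistent first-pass theme. The themes and keywords below are hypothetical illustrations, not the actual framework used in this study:

```python
# Hypothetical codebook: each theme maps to keywords that signal it.
# In practice this would be drafted from a pilot sample of responses
# and refined before coding the full set.
CODEBOOK = {
    "performance": ["slow", "lag", "timeout"],
    "usability": ["confusing", "hard to find", "unintuitive"],
    "feature_gap": ["missing", "wish it had", "no way to"],
}


def code_response(text, codebook=CODEBOOK):
    """Tag one open-text response with every theme whose keywords appear.

    Returns ["uncoded"] when nothing matches, so gaps in the codebook
    surface early instead of disappearing into the volume.
    """
    text_lower = text.lower()
    tags = [
        theme
        for theme, keywords in codebook.items()
        if any(keyword in text_lower for keyword in keywords)
    ]
    return tags or ["uncoded"]


print(code_response("The search is slow and the menus are confusing."))
# → ['performance', 'usability']
```

A rule-assisted pass like this is only a starting point; at the ~1,900-response scale described above, it mainly serves to pressure-test the category structure before the human coding begins.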