Adobe TTV Baseline Research Project

Adobe User Labs

Time To Value Benchmark

At a glance:

Adobe User Labs offers Adobe's Experience Cloud product teams a streamlined process to regularly validate concepts and designs with real users. With a new product release approaching, the product team needed a way to show UX improvement over time. The goal of the TTV (time-to-value) Benchmark was to establish a clear view of the current state of one product in the stack so improvement could be measured over time.

Date

March - April 2020

Product Team

Kristy Duncan

Anurag Dodeja

Aaron Shields

David Kao

Siddhartha Goal

Research Team

Pert Eilers

Michaela Behrman

Ande Reisman

Michael Grieder

Summary

Our immediate team comprised two UX researchers and five project managers (one for each main product area and the primary sponsor for the project). The project managers provided the needed product expertise so we could coordinate research questions and key tasks for the benchmark. Working as the research lead, I organized the benchmark efforts from exploratory meetings with the product teams to writing the PowerPoint decks and presenting the results. 

Time-to-Value Benchmark Study Summary

In this usability study, the product team wanted to know how usable the current version of the product was so they could measure improvement with future releases. Their goal was to improve time-to-value for new users. Using a mixed-methods unmoderated approach, we gathered both behavioral and attitudinal data to establish a combined baseline usability score that the team can compare against future iterations after key product releases. We also identified the major areas of the product that need improvement and some sticking points for users. Knowing the key areas that need improvement helps the team prioritize UX updates and allocate resources, saving Adobe engineering effort, time, and money.

Time-to-Value Benchmark Study Setup

Measuring the usability of a product can be approached in many ways. The team came to us with three specific questions:

  1. Can a user add data from a source to a Profile?

  2. Are users able to create a segment with specific attributes using the segment builder?

  3. Can they add the audience they created to a destination for action?

Using specific usability principles to frame the project goals, the following refined research questions emerged:

  • Learnability and Usability: Can users perform common tasks with little or no assistance?

  • Usefulness & Efficiency: How easily & successfully can users add data, find profiles, build segments, and add segments to a destination?

  • Appearance & Loyalty: Do the functionality and appearance of the product meet users' expectations?

Method

Participants, procedure, & metrics

Since starting at Adobe in April 2019, we have been steadily building a panel of Adobe Experience Cloud customers who have opted into research. All the feedback individual panelists have given is tracked to provide a better user experience and inform Adobe's product development. Using the panelists' profiles, we were able to segment and invite Adobe marketing customers who regularly segment audiences as part of their job responsibilities. We recruited a total of 15 participants for the unmoderated usability study.

Time-to-Value Benchmark Study Procedure

We decided to run this study unmoderated so we could measure behavioral metrics such as task success, time on task, number of clicks, and number of page views. The participants were given access to a sandbox instance of the product and prompted to complete a series of common tasks within the test instance. After each task, participants rated its difficulty on a scale from 1 (very difficult) to 5 (very easy) in a survey format. Once all tasks were complete, we asked seven additional survey questions to measure attitudinal metrics such as overall ease of use, visual appeal, productivity impact, and NPS. These were combined with the task success rates to produce a single combined UX score (qxScore) for the product.

qxScore infographic

The qxScore, or Quality of Experience Score, combines behavioral metrics (task success) and attitudinal metrics (SUS, SUPR-Q & NPS) into a single overall usability score. This industry benchmark metric, created collaboratively by UserZoom and lead UX researchers, gives companies a consistent, comparable way to measure UX across studies, products, and business units.
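To make the idea concrete, here is a minimal sketch of how a combined score like this can be computed. The actual qxScore formula and weighting are defined by UserZoom and are not specified here; this sketch assumes, purely for illustration, an equal-weight blend of a behavioral component (average task success) and an attitudinal component (average survey rating rescaled to 0–100).

```python
# Hedged illustration only: the real qxScore weighting is UserZoom's;
# here both components are assumed to contribute 50% each.

def combined_ux_score(task_success_rates, attitudinal_ratings, rating_max=5):
    """Blend behavioral and attitudinal data into one 0-100 score.

    task_success_rates: per-task success rates, each between 0.0 and 1.0
    attitudinal_ratings: survey ratings on a 1..rating_max scale
    """
    # Behavioral component: mean task success, scaled to 0-100.
    behavioral = 100 * sum(task_success_rates) / len(task_success_rates)
    # Attitudinal component: mean rating, rescaled to 0-100.
    attitudinal = 100 * (sum(attitudinal_ratings) / len(attitudinal_ratings)) / rating_max
    # Assumed equal weighting of the two components.
    return 0.5 * behavioral + 0.5 * attitudinal

# Example: three tasks and four post-study ratings.
score = combined_ux_score([1.0, 0.8, 0.6], [4, 5, 3, 4])
```

A single blended number like this makes it easy to track one baseline figure across releases, while the underlying per-task and per-question data remain available for diagnosing where the score moved.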

Sessions were run unmoderated through UserZoom from April 3rd to April 13th, 2020. Participants took an average of 41 minutes to complete the entire study. Each participant's voice and screen actions were recorded in UserZoom and later reviewed by research facilitators to gain additional insight into the metrics.

Results

Analysis & Results

With the proper setup, we were able not only to provide a usability scorecard and baseline metrics to measure against in the future, but also to gather attitudinal data to understand people's perceptions of the product's usefulness and ease of use. All of this data can be used to measure improvement with future iterations and to better understand the product's current state.

Because the remote sessions were recorded, we were also able to identify areas where people struggled and provide teams with quotes and video examples of key tasks, concepts, and findings.

Image of example methods used to find profiles

With this information, the product team gained a clear understanding of the product's current usability and the areas that need improvement. This allowed the team to better prioritize the product backlog and enabled the product manager to define clear goals for their OKRs (objectives and key results).

© 2020 Michaela Behrman. All Rights Reserved.*

*All other copyrights and trademarks are the property of their respective owners. Use of them does not imply any affiliation with or endorsement by them. 
