Campaign View | Usability Benchmarking

A usability benchmarking study for a social media platform, focused on improving the efficiency of key actions within the Campaign Page for the Ads Growth team.

Lead Researcher
Advertiser Tools
Qualitative Study
Usability

Client & Problem

The client is a social media company with an advertiser tools platform where advertisers can run campaigns for their own business or, if they work at an agency, on behalf of clients.

The goal of this benchmarking round was to evaluate the success of a previous improvement: a metrics taxonomy framework implemented to make customizing columns in the campaign view easier.

My Roles

  • Lead Researcher
  • Direct point-of-contact to Client 
  • Moderator
  • Developer of Materials (Discussion Guide, Final Reports, etc.)

Discussion Guide Development

Task 1: Incorporate Recommended Metrics into Advertiser Tools

Task 2: Search & Filter for a Specific Campaign

Task 3: Identify a Campaign Performing Below Threshold

Recruitment

10 advertisers who are regular to frequent users of the platform.

A mix of spend levels among advertisers.

Fielding

Fielding occurred via remote sessions on the Discuss.io platform.

Qual data collected: quotes, video clips

Quant data collected: task success rate, error rate, decision count
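
For context on how these measures roll up, here is a minimal sketch in Python of how per-task benchmarking metrics could be tabulated. The schema and field names (SessionTask, completed, errors, decisions) are hypothetical illustrations, not the study's actual instrument:

```python
from dataclasses import dataclass

@dataclass
class SessionTask:
    """One participant's attempt at one task (hypothetical schema)."""
    participant: str
    task: str
    completed: bool   # did the participant finish the task?
    errors: int       # missteps (wrong screens, wrong features) during the attempt
    decisions: int    # distinct choice points navigated en route to completion

def summarize(sessions: list[SessionTask], task: str) -> dict:
    """Roll up the three quantitative measures for a single task."""
    attempts = [s for s in sessions if s.task == task]
    n = len(attempts)
    return {
        "task_success_rate": sum(s.completed for s in attempts) / n,
        "mean_error_rate": sum(s.errors for s in attempts) / n,
        "mean_decision_count": sum(s.decisions for s in attempts) / n,
    }

# Illustrative data only: 7 of 10 participants complete the task.
logs = [SessionTask(f"P{i}", "task2", completed=(i > 2), errors=i % 3, decisions=4 + i % 2)
        for i in range(10)]
print(summarize(logs, "task2"))  # e.g. {'task_success_rate': 0.7, ...}
```

Note that "error rate" can equally be defined as the share of attempts containing at least one error; the sketch above uses mean errors per attempt, which is just one common convention.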

Analysis & Reporting 

Findings were presented via both an issue tracker and one-pagers. Using multiple mediums allowed different teams to prioritize the issues most pertinent to them.

Finding 01

Entry point confusion undermines task comprehension

  • Participants frequently landed on incorrect screens or used the wrong features
  • Unclear terminology made it difficult to predict where to complete tasks
  • Users relied on familiar pathways that didn't exist in the interface

Finding 02

Search and filtering interactions created friction

  • Users expected a visible, persistent search bar within the campaigns table
  • Unclear filter categorization made it hard to identify what was being searched
  • The extra step of exiting a filter box after searching disrupted flow and delayed completion

Finding 03

Anomaly detection lacked visual clarity and discoverability

  • Participants struggled to locate flagged anomalies within metrics columns
  • Adjacent UI elements were misidentified as the correct action path
  • Key links were perceived as too small or poorly positioned

Stakeholder Response

Client XFN teams responded positively to the findings.

Product Changes

  • Clearer labeling for the campaign entry point
  • Removal of the extra step of exiting a filter box after searching
  • Implementation of an Anomaly Detection Highlight button

Continuation of the Benchmarking Program

  • Additional rounds of usability benchmarking across various product sectors

As an agency-side researcher, I'll note that I have limited visibility into which specific product implementations come to fruition.

This study stood out to me because of its rapid research format: it required staying efficient across every phase while collaborating closely with recruiters and client-side engineers simultaneously. It was also a unique experience in that I wore multiple hats beyond moderating and leading the study, including async clip tagging post-session, a result of being short-staffed across concurrent projects. That stretched my range in a meaningful way. What made the impact especially tangible was the speed of implementation: by the next benchmarking round, we could directly compare usability scores against previous sessions and see how the product changes moved the needle.