Nielsen panel apps product design examples

June 2019 - April 2023

Improved data quality and simplified support for 750k audience measurement research participants in 30+ countries with an enterprise research platform

TL;DR

In this case study, I detail how I co-led a cross-functional team at Nielsen to simplify technical support and enhance engagement for 750,000 audience measurement research participants across over 30 countries.

By designing inclusive experiences that accommodate global diversity, adopting flexible design strategies, and creating a modular architecture optimised for localisation, we reduced operational complexity, improved data quality, and accelerated time to market.

Despite challenges such as navigating varied operating models, cultural differences, and strict anonymity regulations, our team successfully developed scalable solutions that transformed participant engagement on a global scale.

The project resulted in significant achievements, including reduced development costs, improved participant engagement, and recognition through the CEO's Annual Award for Global Top Performers.

Research Platform Interface Snapshots
Nielsen Panels Explainer

Introduction

Audience measurement is a critical component of media analytics: it captures how people interact with media and content.

This measurement data is indispensable in various use cases within the global media buying market, which was worth over $307 billion in 2022.

  • Broadcasters use audience measurement data to plan and optimise programming, understand audience demographics, and price media effectively within the competitive broadcasting industry.
  • Marketers leverage it to buy media, time campaigns, and optimise strategies to reach target demographics efficiently.
  • Content Producers rely on audience insights to price content appropriately and assess its performance in the content production market.
  • Media Researchers utilise this data to build accurate ratings, validate big data sets, and conduct in-depth media research analysis.

Over the past decade, the rapid digital transformation has dramatically changed global media consumption habits. As audiences and media platforms evolve, understanding what keeps us informed, entertained, and connected has become increasingly complex.

Panel Explainer

How does audience measurement work?

Everything starts with audience measurement data collection. Selected individuals, representative of the measured community, voluntarily contribute their digital footprint for research.

Depending on the study, participants may install measurement meters on their phones, place them in their homes, wear them as smartwatches, or use several meter types simultaneously.

Every day, data scientists aggregate, anonymise, and process participants' data, which is fed into a portfolio of rating products and analytics tools.

This data also helps validate big data sets, offering deeper insights into media interactions within specific markets or countries.

However…

Keeping this process up and running at scale is a complex operation.

Audience Measurement Explainer

The Challenge: Simplifying Complexity at Scale

Our team was given a clear yet challenging mission:

  1. Simplify technical support and quality control to reduce operational overhead.
  2. Improve research participants' engagement and compliance to improve data quality.
  3. Create a scalable solution optimised for localisation across products in over 30 countries on six continents.

Initially, the project was driven by the diminishing returns of years of internal process optimisation and the massive overhead generated by a fragmented, non-scalable third-party setup.

However, growing market demands for innovative solutions in high-profile tenders and the onset of the COVID-19 pandemic intensified the internal push to transition as many parts of audience measurement participation as possible to self-service channels.

Research Platform Interface Localisation to RTL language Snapshots

Designing inclusive experiences for global diversity

Adopting a global mindset was essential in designing for 750,000 research participants across 30+ regulated markets.

Key diversity considerations included:

  • Demographics: Spanning various ages, genders, ethnicities, income levels, and occupations.
  • Tech Savviness: Ranging from highly tech-savvy individuals to those with minimal experience with technology.

Moreover, strict anonymity regulations in most markets added another layer of complexity, as we needed to design personalised experiences without accessing identifiable participant information.

These factors demanded flexible design strategies that could deliver an inclusive, user-friendly experience for all participants, regardless of their background or technological comfort level.

Research Platform Interface Snapshots

A great team for a great challenge

I had an opportunity to lead a distributed team of product designers embedded into product development teams.

Collaborating across domains, I partnered with product, technology and data science leaders while reporting to the VP of Tech and later to the Global SVP of UX.

Our development teams, spread across Europe, Asia, and the US, delivered market-specific apps across a portfolio of research products. Our progress wouldn’t have been possible without the help and insights of dozens of domain experts representing product, data science, software development, hardware engineering, and operations.

Initially, my focus was on:

  1. Setting the direction for the product's modular architecture.
  2. Developing tools to facilitate research and collaboration among domain experts.
  3. Designing the company's first multi-platform design system optimised for touch interfaces and internationalisation.

As the team gained momentum and release cycles ramped up, my focus shifted to:

  1. Expanding components, patterns, and features to support more products and an expanding scope.
  2. Orchestrating delivery, handoff processes, and localisation while implementing company rebranding.
  3. Optimising processes and best practices to further scale the team’s impact and efficiency.

Research Platform Interface Snapshots

The primary challenge: complexity by variety

Navigating global "complexity by variety" in operating models, client commitments, data governance, and privacy regulations across 30+ markets, with a stakeholder map spanning six continents, was initially intimidating.

Adding to this complexity, we faced secondary challenges such as:

  • The requirement to support multiple platforms (including web, native, PWAs, and side-loaded apps) across various devices.
  • Cultural differences that complicated the standard approach to imagery, colours, icons, wording, and more, requiring a mindset change.
  • Mapping complex use cases involving proprietary hardware.

We identified web and native apps as the fastest and most efficient means to:

  1. Improve onboarding, engagement, and support for audience measurement participants in over 30 countries.
  2. Reduce operational complexity and third-party dependencies.
  3. Lower development costs and accelerate time to market.

Combined with ambitious goals and the number of countries involved, these constraints demanded a flexible approach with built-in mechanisms for course correction.

To tackle these challenges effectively, we first needed to learn more about participants' daily experiences in audience measurement studies.

Research Platform Research Deliverables

Establishing shared understanding

Our research aimed to:

  1. Contextualise complexity through real-world examples to enable problem reframing.
  2. Engage cross-domain groups of subject matter experts to understand end-to-end processes beyond the dynamics of individual organisational units.
  3. Establish a working context across domains by extracting expertise from isolated work environments.

We conducted internal research and quantitative data analysis of historical issues, such as support tickets and repair logs, to align the 'why' behind the 'what' for our large, cross-domain team.

Deep dives with subject matter experts expanded our domain knowledge while shadowing staff in offices and on-site provided real-life context. A series of remote workshops allowed us to systematise our findings and identify opportunities.

Together, we mapped the entire end-to-end experience, capturing all interactions between research participants, company infrastructure, communication channels, and staff into a comprehensive service blueprint.

By building this shared understanding, we created a solid foundation for developing solutions that were informed, effective, and aligned across all domains.

Research Platform Interface Snapshots

Formulating strategic assumptions

Information architecture decisions:

  1. Apply progressive disclosure liberally to simplify complex processes, breaking them into smaller steps that give users a faster sense of progress.
  2. Use a one-question or one-step-at-a-time approach to allow more space for in-context instructions and troubleshooting, especially for hardware setups.
  3. Avoid linear workflows by offering multiple entry and exit points, acknowledging that solutions vary across products and markets (see the sketch below).
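
To illustrate the first and third assumptions, here is a minimal TypeScript sketch, with all names hypothetical rather than taken from the actual codebase, of a one-step-at-a-time flow where each step declares its own entry conditions, so a participant can enter or leave at multiple points instead of following one fixed path:

```typescript
// Hypothetical meter-setup flow: each step is self-contained and computes
// the next step from state, rather than relying on a fixed linear order.
type StepId = "scan-qr" | "pair-meter" | "test-signal" | "done";

interface SetupState {
  meterPaired: boolean;
  signalOk: boolean;
}

interface Step {
  id: StepId;
  canEnter: (s: SetupState) => boolean;   // guards deep links and support hand-offs
  next: (s: SetupState) => StepId | null; // null marks an exit point
}

const steps: Record<StepId, Step> = {
  "scan-qr": { id: "scan-qr", canEnter: () => true, next: () => "pair-meter" },
  "pair-meter": {
    id: "pair-meter",
    canEnter: () => true,
    next: (s) => (s.meterPaired ? "test-signal" : "pair-meter"),
  },
  "test-signal": {
    id: "test-signal",
    canEnter: (s) => s.meterPaired,
    next: (s) => (s.signalOk ? "done" : "test-signal"),
  },
  "done": { id: "done", canEnter: (s) => s.meterPaired && s.signalOk, next: () => null },
};

// Resolve where a participant lands when arriving via an arbitrary entry point,
// falling back to the first step if preconditions are not yet met.
function resolveEntry(requested: StepId, state: SetupState): StepId {
  return steps[requested].canEnter(state) ? requested : "scan-qr";
}
```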

Design decisions:

  1. Use the card metaphor extensively to surface contextual information and provide larger interactive areas.
  2. Rely on a primary colour combined with saturated greys to allow easy rebranding across different markets.
  3. Explore design iterations with multiple locales to efficiently handle different languages, label lengths, and date formats, ensuring a seamless experience in every language (see the sketch after this list).
  4. Design for adaptability to ensure consistency of the experience when the product’s feature set shrinks or expands in a given market.
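
As a minimal sketch of the third decision, assuming a design-time preview script rather than any real internal tooling, the standard Intl API can surface label-length and date-format differences across locales, including right-to-left ones:

```typescript
// Preview the same date across locales to catch layout issues early.
// The locale list is illustrative; RTL detection here is a simple lookup.
const previewLocales = ["en-US", "de-DE", "ja-JP", "ar-SA", "he-IL"];
const sampleDate = new Date(2022, 5, 14);

for (const locale of previewLocales) {
  const formatted = new Intl.DateTimeFormat(locale, { dateStyle: "long" }).format(sampleDate);
  const lang = locale.split("-")[0];
  // The rendered layout also needs dir="rtl" for these languages.
  const direction = ["ar", "he", "fa", "ur"].includes(lang) ? "rtl" : "ltr";
  console.log(`${locale} (${direction}): ${formatted}`);
}
```
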
Research Platform Interface Snapshots

Designing for continuous improvement

While conducting research, we simultaneously engaged in rounds of prototyping, review, refinement, and testing.

This parallel approach aimed to replace third-party solutions in the initial market and set the stage for continuous product optimisation and expansion in subsequent releases.

After the initial release, we leveraged usage data and feedback from local operations to make even more informed decisions.

Our modular architecture, leveraging horizontal and vertical feature toggling at the component, pattern, and feature levels (sketched after the list below), enabled support for:

  1. Various use cases, including standalone apps, diary apps, and wearable companion apps.
  2. Multiple platforms and devices, including web, native apps, Progressive Web Apps (PWAs), and side-loaded applications.
  3. Multiple locales and languages, including those requiring right-to-left (RTL) text orientation.
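
A minimal sketch of this idea, using a hypothetical config schema rather than the production one: each market declares its platform, locale, and feature set, and the app shell mounts only the enabled modules.

```typescript
// Hypothetical per-market configuration: "horizontal" toggles switch whole
// features on or off; "vertical" toggles pick a variant of a single pattern.
interface MarketConfig {
  locale: string;
  rtl: boolean;
  platform: "web" | "native" | "pwa" | "sideload";
  features: {
    diary: boolean;                      // diary-app use case
    wearableCompanion: boolean;          // smartwatch companion use case
    meterSetup: "qr" | "manual" | "off"; // vertical variant of one pattern
  };
}

const markets: Record<string, MarketConfig> = {
  "pl-PL": {
    locale: "pl-PL", rtl: false, platform: "native",
    features: { diary: true, wearableCompanion: false, meterSetup: "qr" },
  },
  "ar-SA": {
    locale: "ar-SA", rtl: true, platform: "pwa",
    features: { diary: false, wearableCompanion: true, meterSetup: "manual" },
  },
};

// The app shell reads the config at startup and mounts only enabled modules.
function enabledModules(cfg: MarketConfig): string[] {
  const mods = ["core"];
  if (cfg.features.meterSetup !== "off") mods.push("meter-setup");
  if (cfg.features.diary) mods.push("diary");
  if (cfg.features.wearableCompanion) mods.push("wearable-companion");
  return mods;
}
```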

After multiple iterations, release cycles, and a company rebranding, our design solidified into the company's first mobile-oriented component library. This library contributed to the global Nielsen Design System, significantly reducing the time to market for a portfolio of research products.

With an adaptable design in place, we shifted our focus back to facilitating collaboration at scale.

Research Platform Collaboration Framework

Facilitating collaboration at scale

With a global stakeholder map, it is easy to get buried in alignment meetings, attempting to build consensus across contradictory agendas in different domains or competing priorities in different markets.

To address this challenge, we collaborated with leaders across domains to streamline facilitation and consensus-building. We experimented with tools that offered stakeholders an efficient, algorithmic approach to build consensus and prioritise further research and development.

This initiative, promoting a culture of evidence-based decision-making in an S&P 500 company, concluded with the idea of a scorecard roadmap where:

  1. Stakeholder groups submit feature requests and negotiate numerical values representing their importance across geographies and domains.
  2. Engineering teams estimate the effort required and, together with designers, assess the confidence added by research.
  3. These values feed into a prioritisation formula that calculates each improvement's overall value to the company, effectively ordering the roadmap for development or further research (a simplified sketch follows this list).
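
The case study does not disclose the actual formula, so the following is an illustrative sketch assuming a RICE-style score: negotiated importance multiplied by research confidence, divided by estimated effort.

```typescript
// Assumed scoring model, not the real one: higher importance and confidence
// raise the score; higher engineering effort lowers it.
interface FeatureRequest {
  name: string;
  importance: number; // negotiated across geographies and domains
  confidence: number; // 0..1, how strongly research supports the request
  effort: number;     // engineering estimate, e.g. person-weeks
}

const score = (r: FeatureRequest): number => (r.importance * r.confidence) / r.effort;

// Example backlog (values are made up for illustration).
const backlog: FeatureRequest[] = [
  { name: "In-app meter troubleshooting", importance: 8, confidence: 0.8, effort: 5 },
  { name: "RTL onboarding revamp", importance: 5, confidence: 0.9, effort: 3 },
];

// Highest score first: this ordering becomes the development roadmap.
backlog.sort((a, b) => score(b) - score(a));
backlog.forEach((r) => console.log(`${r.name}: ${score(r).toFixed(2)}`));
```
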
Research Platform Interface Snapshots

Rewarding results

The panel apps project continues transforming audience measurement participation at Nielsen, with new apps launching to support an ever-growing number of countries (20 markets and counting) and products.

Key achievements:

  1. Reduced cost of development, localisation, and time to market.
  2. Reduced operational complexity, 3rd party dependency and overhead.
  3. Improved data quality, reliability, and participants’ engagement.
  4. Reduced participant churn.

The project's impact and results were so significant that I received the CEO's Annual Award for Global Top Performers two years in a row.

The three most important things I learned

This transformative experience in UX leadership was an unforgettable adventure, and it taught me three invaluable lessons:

Communication and collaboration are the best areas to focus on initially, as you can improve them the fastest. Read more → “Countermeasures against adhocracy and three other enterprise problems affecting software development.”

Collaborating on a global scale means a lot more people to learn from. Read more → “Where to work, when and why?”

Confidence in public speaking takes practice, but there are tricks to gain it faster. Read more → “Introvert’s Guide to Confident Presentations.”