Designing with Data

Book description

On the surface, design practice and data science may not seem like obvious partners. But the two disciplines work toward the same goal: helping designers and product managers understand users so they can craft elegant digital experiences. While data can enhance design, design can bring deeper meaning to data.

This practical guide shows you how to conduct data-driven A/B testing to inform design decisions on everything from small tweaks to large-scale UX concepts. Complete with real-world examples, it demonstrates how to make data-driven design part of your product design workflow.

  • Understand the relationship between data, business, and design
  • Get a firm grounding in data, data types, and components of A/B testing
  • Use an experimentation framework to define opportunities, formulate hypotheses, and test different options
  • Create hypotheses that connect to key metrics and business goals
  • Design proposed solutions for hypotheses that are most promising
  • Interpret the results of an A/B test and determine your next move
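To give a flavor of the last point, interpreting an A/B test typically comes down to asking whether the observed difference between groups is statistically significant. The sketch below is an illustration of that idea, not an example from the book: a standard two-proportion z-test comparing conversion rates in a control (A) and treatment (B) group, using only the Python standard library. The function name and the sample numbers are hypothetical.

```python
import math

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for a difference between two conversion rates.

    conv_a, n_a: conversions and sample size in the control (A) group.
    conv_b, n_b: conversions and sample size in the treatment (B) group.
    Returns (z_statistic, p_value).
    """
    p_a = conv_a / n_a
    p_b = conv_b / n_b
    # Pooled proportion under the null hypothesis that A and B perform equally.
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal distribution.
    p_value = math.erfc(abs(z) / math.sqrt(2))
    return z, p_value

# Hypothetical experiment: 1,200/10,000 conversions in A vs. 1,320/10,000 in B.
z, p = two_proportion_z_test(1200, 10_000, 1320, 10_000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

If the resulting p-value falls below your chosen significance level (commonly 0.05), the difference is unlikely to be due to chance alone; the book covers how sample size, significance level, and minimum detectable effect interact when making that call.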


Table of contents

  1. Praise for Designing with Data
  2. Foreword
  3. Preface
    1. Design and Data: A Perfect Synergy
    2. Our Focus: A/B Testing
    3. Some Orienting Principles
    4. Who Is This Book For?
    5. Scope
    6. About Us
      1. A Word from Rochelle
      2. A Word from Elizabeth
      3. A Word from Caitlin
    7. How This Book Is Organized
    8. How to Read This Book
      1. Introducing our “Running a Camp” Metaphor
    9. O’Reilly Safari
    10. How to Contact Us
    11. Acknowledgments
      1. Rochelle
      2. Elizabeth
      3. Caitlin
  4. 1. Introducing a Data Mindset
    1. Data as a Trend
    2. Three Ways to Think About Data
    3. What Does This Mean for You as a Designer?
    4. Data Can Help to Align Design with Business
      1. On Data Quality
    5. With a Little Help from Your Friends...
      1. Data Producers
      2. Data Consumers
    6. What If You Don’t Have Data Friends (Yet)?
    7. Themes You’ll See in This Book
    8. Summary
    9. Questions to Ask Yourself
  5. 2. The ABCs of Using Data
    1. The Diversity of Data
      1. Many Dimensions of Data
      2. Why are you collecting data?
    2. When is the data collected?
    3. How is the data collected?
      1. How much data to collect?
    4. Why Experiment?
      1. Learning About Causality
      2. Statistically Significant, not Anecdotal
      3. Informed Opinions About What Will Happen in the Wild
    5. Basics of Experimentation
      1. Language and Concepts
      2. Race to the Campsite!
      3. Experimentation in the Internet Age
    6. A/B Testing: Online Experiments
      1. Sampling Your Users Online
      2. Cohorts and segments
      3. Demographic information
    7. New users versus existing users
      1. Metrics: The Dependent Variable of A/B Testing
      2. Detecting a Difference in Your Groups
      3. How big is the difference you want to measure?
    8. A big enough sample to power your test
      1. Significance level
    9. Your Hypothesis and Why It Matters
      1. Defining a Hypothesis or Hypotheses
      2. Know What You Want to Learn
    10. Running Creative A/B Tests
      1. Data Triangulation: Strength in Mixed Methods
      2. The Landscape of Design Activities
      3. Exploring and Evaluating Ideas
      4. Thinking Global and Thinking Local
    11. Summary
    12. Questions to Ask Yourself
  6. 3. A Framework for Experimentation
    1. Introducing Our Framework
      1. Working with Data Should Feel Familiar...
    2. Three Phases: Definition, Execution, and Analysis
      1. The Definition Phase
      2. The Execution Phase
      3. The Analysis Phase
    3. Examples: Data and Design in Action
    4. Summary
    5. Questions to Ask Yourself
  7. 4. The Definition Phase (How to Frame Your Experiments)
    1. Getting Started: Defining Your Goal
      1. Defining Your Metric of Interest
      2. Metric sensitivity
      3. Tracking multiple metrics
      4. Getting the full picture
      5. Your metrics may change over time
    2. Competing metrics
      1. Refining Your Goals with Data
    3. Identifying the Problem You Are Solving
      1. Remember Where You Are
    4. Building Hypotheses for the Problem at Hand
      1. Example: A Summer Camp Hypothesis
      2. Example: Netflix—transitioning from DVD rentals to Streaming
    5. The Importance of Going Broad
      1. Multiple Ways to Influence a Metric
      2. Focus on New and Existing Users
      3. Revisit the Scope of Your Problem
      4. Example: Netflix on the PlayStation 3
      5. Involve Your Team and Your Data Friends
    6. Which Hypotheses to Choose?
      1. Consider Potential Impact
      2. Using What You Already Know
      3. Using Other Methods to Evaluate Your Hypotheses
      4. Consider the Reality of Your Test
      5. How much measurable impact do you believe your hypothesis can make?
      6. Can you draw all the conclusions you want to draw from your test?
      7. Balancing learning and speed
      8. Keep Your Old Hypotheses in Your Back Pocket
    7. Summary
    8. Questions to Ask Yourself
  8. 5. The Execution Phase (How to Put Your Experiments into Action)
    1. Designing to Learn
      1. Engaging Your Users in a Conversation
      2. Having Quality Conversations
      3. Designing to extremes to learn about your users
    2. Revisiting the minimum detectable effect
    3. Designing the Best Representation of Your Hypothesis
      1. Understanding Your Variables
    4. Not all variables are visible
      1. Your Design Can Influence Your Data
      2. Example: Netflix Wii
      3. Revisiting the Space of Design Activities
      4. Avoiding Local Maxima
    5. Different problems for summer camp
      1. Directional testing: “Painted door” tests
      2. Picking the right level of granularity for your experiment
      3. Example: Netflix on PlayStation 3
      4. Example: Spotify Navigation
      5. Experiment 1: Defining the hypothesis to get early directional feedback
      6. Experiment 1: Designing the hypotheses
      7. Interlude: Quick explorations using prototypes and usability testing
      8. Experiment 2: Refining the “tabbed” navigation
      9. “Designing” your tests
      10. Other Considerations When Designing to Learn
      11. Polishing your design too much, too early
      12. Edge cases and “worst-case” scenarios
      13. Taking advantage of other opportunities to learn about your design
      14. Identifying the Right Level of Testing for Different Stages of Experimentation
    6. Running parallel experiments
    7. Thinking about “Experiment 0”
    8. Summary
    9. Questions to Ask Yourself
  9. 6. The Analysis Phase (Getting Answers From Your Experiments)
    1. Vetting Your Designs Ahead of Launch
      1. Lab Studies: Interviews and Usability Testing
      2. Surveys
      3. Working with Your Peers in Data
    2. Launching Your Design
      1. Balancing Trade-Offs to Power Your Test
      2. Weighing sample size and significance level
      3. Getting the sample that you need (rollout % versus test time)
      4. Who are you including in your sample?
      5. Practical Implementation Details
    3. Is your experience “normal” right now?
      1. Sanity check: Questions to ask yourself
    4. Evaluating Your Results
      1. Revisiting Statistical Significance
    5. What Does the Data Say?
      1. Expected (“Positive”) Results
      2. Unexpected and Undesirable (“Negative”) Results
      3. When the World is Flat
      4. Errors
      5. Replication
      6. Using secondary metrics
      7. Using multiple test cells
      8. Rolling out to more users
    6. Revisiting “thick” data
      1. Getting Trustworthy Data
      2. Novelty effect
    7. Seasonality bias
    8. Rolling Out Your Experience, or Not
      1. What’s Next for Your Designs?
      2. Were you exploring or evaluating?
      3. Was your problem global or local?
      4. Knowing when to stop
      5. Ramp Up
      6. Holdback Groups
      7. Taking Communication into Account
    9. Case Study: Netflix on PlayStation 3
      1. Many Treatments of the Four Hypotheses
      2. Evolving the Design Through Iterative Tests
      3. What If You Still Believe?
    10. Summary
    11. Questions to Ask Yourself
  10. 7. Creating the Right Environment for Data-Aware Design
    1. Principle 1: Shared Company Culture and Values
      1. Depth: Communicating Across Levels
      2. Breadth: Beyond Design and Product
      3. The Importance of a Learning Culture
      4. The rewards of taking risks: Redefining “failure”
      5. The value of developing your customer instinct
    2. Principle 2: Hiring and Growing the Right People
      1. Establishing a Data-Aware Environment Through Your Peers
      2. Hiring for Success
      3. Building the team with data involved from the start
    3. Principle 3: Processes to Support and Align
      1. Establishing a Knowledge Baseline
      2. Establishing a Common Vocabulary
      3. Developing a Rhythm Around Data Collection and Sharing
      4. Project review meetings
    4. Spreading data across the organization
      1. Creating a Presence in the Office
      2. Learning from the Past
    5. Summary
    6. Questions to Ask Yourself
  11. 8. Conclusion
    1. Ethical Considerations
      1. Ethics in Online Experimentation
      2. Design Experimentation Versus Social Experimentation
      3. Two “Power of Suggestion” Experiments
      4. Toward Ethical A/B Testing
      5. Key Concepts
      6. Asking Questions, Thinking Ethically
    2. Last Words
  12. A. Resources
    1. Keywords
      1. Chapter 1
      2. Chapter 2
      3. Chapter 3
      4. Chapters 4, 5, and 6
      5. Chapter 7
      6. Chapter 8
    2. Books
    3. Online Articles, Papers, and Blogs
    4. Courses
    5. Tools
    6. Professional Groups, Meetups, and Societies
  13. Index

Product information

  • Title: Designing with Data
  • Author(s): Rochelle King, Elizabeth F. Churchill, Caitlin Tan
  • Release date: March 2017
  • Publisher(s): O'Reilly Media, Inc.
  • ISBN: 9781449334956