

Designing with Data. Improving the User Experience with A/B Testing
ebook
Author: Rochelle King, Elizabeth F. Churchill, Caitlin Tan
ISBN: 978-1-4493-3495-6
Pages: 370, Format: ebook
Publication date: 2017-03-29
Bookstore: Helion

Book price: 126.65 zł (previously: 147.27 zł)
You save: 14% (-20.62 zł)


Tags: Website usability and UX | Programming techniques

On the surface, design practices and data science may not seem like obvious partners. But these disciplines actually work toward the same goal, helping designers and product managers understand users so they can craft elegant digital experiences. While data can enhance design, design can bring deeper meaning to data.

This practical guide shows you how to conduct data-driven A/B testing to inform design decisions on everything from small tweaks to large-scale UX concepts. Complete with real-world examples, it demonstrates how to make data-driven design part of your product design workflow.

  • Understand the relationship between data, business, and design
  • Get a firm grounding in data, data types, and components of A/B testing
  • Use an experimentation framework to define opportunities, formulate hypotheses, and test different options
  • Create hypotheses that connect to key metrics and business goals
  • Design proposed solutions for hypotheses that are most promising
  • Interpret the results of an A/B test and determine your next move
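The last two bullets boil down to a standard significance check. As a minimal sketch of that idea (not code from the book; the conversion counts and the `ab_test_pvalue` helper are hypothetical), a two-proportion z-test compares conversion rates between a control (A) and a variant (B):

```python
import math

def ab_test_pvalue(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: does variant B's conversion rate
    differ from variant A's? Returns (z, two-sided p-value)."""
    p_a = conv_a / n_a
    p_b = conv_b / n_b
    # Pooled conversion rate under the null hypothesis (no difference).
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal distribution.
    p_value = math.erfc(abs(z) / math.sqrt(2))
    return z, p_value

# Hypothetical numbers: 200/5000 conversions in A, 260/5000 in B.
z, p = ab_test_pvalue(200, 5000, 260, 5000)
print(f"z = {z:.2f}, p = {p:.4f}")  # prints "z = 2.86, p = 0.0042"
```

Here p < 0.05, so the difference would be called statistically significant at the conventional 5% level; the book's framework covers how to choose that threshold, size the sample, and interpret the result.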


Customers who bought "Designing with Data. Improving the User Experience with A/B Testing" also chose:

  • Projektowanie oprogramowania dla zupełnie początkujących. Owoce programowania. Wydanie V
  • D3.js w akcji
  • Nie każ mi myśleć! O życiowym podejściu do funkcjonalności stron internetowych. Wydanie III
  • Ucieczka z pułapki budowania. Efektywne zarządzanie produktem
  • Badanie UX. Praktyczne techniki projektowania bezkonkurencyjnych produktów


Table of Contents


  • Designing with Data: Improving the User Experience with A/B Testing
  • Praise for Designing with Data
  • Foreword
  • Preface
    • Design and Data: A Perfect Synergy
    • Our Focus: A/B Testing
    • Some Orienting Principles
    • Who Is This Book For?
    • Scope
    • About Us
      • A Word from Rochelle
      • A Word from Elizabeth
      • A Word from Caitlin
    • How This Book Is Organized
    • How to Read This Book
      • Introducing our Running a Camp Metaphor
    • O'Reilly Safari
    • How to Contact Us
    • Acknowledgments
      • Rochelle
      • Elizabeth
      • Caitlin
  • 1. Introducing a Data Mindset
    • Data as a Trend
    • Three Ways to Think About Data
    • What Does This Mean for You as a Designer?
    • Data Can Help to Align Design with Business
      • On Data Quality
    • With a Little Help from Your Friends...
      • Data Producers
      • Data Consumers
    • What If You Don't Have Data Friends (Yet)?
    • Themes You'll See in This Book
    • Summary
    • Questions to Ask Yourself
  • 2. The ABCs of Using Data
    • The Diversity of Data
      • Many Dimensions of Data
      • Why are you collecting data?
    • When is the data collected?
    • How is the data collected?
      • How much data to collect?
    • Why Experiment?
      • Learning About Causality
      • Statistically Significant, not Anecdotal
      • Informed Opinions about what will happen in the Wild
    • Basics of Experimentation
      • Language and Concepts
      • Race to the Campsite!
      • Experimentation in the Internet Age
    • A/B Testing: Online Experiments
      • Sampling Your Users Online
      • Cohorts and segments
      • Demographic information
    • New users versus existing users
      • Metrics: The Dependent Variable of A/B Testing
      • Detecting a Difference in Your Groups
      • How big is the difference you want to measure?
    • A big enough sample to power your test
      • Significance level
    • Your Hypothesis and Why It Matters
      • Defining a Hypothesis or Hypotheses
      • Know What You Want to Learn
    • Running Creative A/B Tests
      • Data Triangulation: Strength in Mixed Methods
      • The Landscape of Design Activities
      • Exploring and evaluating Ideas
      • Thinking Global and Thinking Local
    • Summary
    • Questions to Ask Yourself
  • 3. A Framework for Experimentation
    • Introducing Our Framework
      • Working with Data Should Feel Familiar...
    • Three Phases: Definition, Execution, and Analysis
      • The Definition Phase
      • The Execution Phase
      • The Analysis Phase
    • Examples: Data and Design in Action
    • Summary
    • Questions to Ask Yourself
  • 4. The Definition Phase (How to Frame Your Experiments)
    • Getting Started: Defining Your Goal
      • Defining Your Metric of Interest
      • Metric sensitivity
      • Tracking multiple metrics
      • Getting the full picture
      • Your metrics may change over time
    • Competing metrics
      • Refining Your Goals with Data
    • Identifying the Problem You Are Solving
      • Remember Where You Are
    • Building Hypotheses for the Problem at Hand
      • Example: A Summer Camp Hypothesis
      • Example: Netflix Transitioning from DVD Rentals to Streaming
    • The Importance of Going Broad
      • Multiple Ways to Influence a Metric
      • Focus on New and Existing Users
      • Revisit the Scope of Your Problem
      • Example: Netflix on the PlayStation 3
      • Involve Your Team and Your Data Friends
    • Which Hypotheses to Choose?
      • Consider Potential Impact
      • Using What You Already Know
      • Using Other Methods to Evaluate Your Hypotheses
      • Consider the Reality of Your Test
      • How much measurable impact do you believe your hypothesis can make?
      • Can you draw all the conclusions you want to draw from your test?
      • Balancing learning and speed
      • Keep Your Old Hypotheses in Your Back Pocket
    • Summary
    • Questions to Ask Yourself
  • 5. The Execution Phase (How to Put Your Experiments into Action)
    • Designing to Learn
      • Engaging Your Users in a Conversation
      • Having Quality Conversations
      • Designing to extremes to learn about your users
    • Revisiting the minimum detectable effect
    • Designing the Best Representation of Your Hypothesis
      • Understanding Your Variables
    • Not all variables are visible
      • Your Design Can Influence Your Data
      • Example: Netflix Wii
      • Revisiting the Space of Design Activities
      • Avoiding Local Maxima
    • Different problems for summer camp
      • Directional testing: Painted door tests
      • Picking the right level of granularity for your experiment
      • Example: Netflix on PlayStation 3
      • Example: Spotify Navigation
      • Experiment 1: Defining the hypothesis to get early directional feedback
      • Experiment 1: Designing the hypotheses
      • Interlude: Quick explorations using prototypes and usability testing
      • Experiment 2: Refining the tabbed navigation
      • Designing your tests
      • Other Considerations When Designing to Learn
      • Polishing your design too much, too early
      • Edge cases and worst-case scenarios
      • Taking advantage of other opportunities to learn about your design
      • Identifying the Right Level of Testing for Different Stages of Experimentation
    • Running parallel experiments
    • Thinking about Experiment 0
    • Summary
    • Questions to Ask Yourself
  • 6. The Analysis Phase (Getting Answers From Your Experiments)
    • Vetting Your Designs Ahead of Launch
      • Lab Studies: Interviews and Usability Testing
      • Surveys
      • Working with Your Peers in Data
    • Launching Your Design
      • Balancing Trade-Offs to Power Your Test
      • Weighing sample size and significance level
      • Getting the sample that you need (rollout % versus test time)
      • Who are you including in your sample?
      • Practical Implementation Details
    • Is your experience normal right now?
      • Sanity check: Questions to ask yourself
    • Evaluating Your Results
      • Revisiting Statistical Significance
    • What Does the Data Say?
      • Expected (Positive) Results
      • Unexpected and Undesirable (Negative) Results
      • When the World is Flat
      • Errors
      • Replication
      • Using secondary metrics
      • Using multiple test cells
      • Rolling out to more users
    • Revisiting thick data
      • Getting Trustworthy Data
      • Novelty effect
    • Seasonality bias
    • Rolling Out Your Experience, or Not
      • What's Next for Your Designs?
      • Were you exploring or evaluating?
      • Was your problem global or local?
      • Knowing when to stop
      • Ramp Up
      • Holdback Groups
      • Taking Communication into Account
    • Case Study: Netflix on PlayStation 3
      • Many Treatments of the Four Hypotheses
      • Evolving the Design Through Iterative Tests
      • What If You Still Believe?
    • Summary
    • Questions to Ask Yourself
  • 7. Creating the Right Environment for Data-Aware Design
    • Principle 1: Shared Company Culture and Values
      • Depth: Communicating Across Levels
      • Breadth: Beyond Design and Product
      • The Importance of a Learning Culture
      • The rewards of taking risks: Redefining failure
      • The value of developing your customer instinct
    • Principle 2: Hiring and Growing the Right People
      • Establishing a Data-Aware Environment Through Your Peers
      • Hiring for Success
      • Building the team with data involved from the start
    • Principle 3: Processes to Support and Align
      • Establishing a Knowledge Baseline
      • Establishing a Common Vocabulary
      • Developing a Rhythm Around Data Collection and Sharing
      • Project review meetings
    • Spreading data across the organization
      • Creating a Presence in the Office
      • Learning from the Past
    • Summary
    • Questions to Ask Yourself
  • 8. Conclusion
    • Ethical Considerations
      • Ethics in Online Experimentation
      • Design Experimentation Versus Social Experimentation
      • Two Power of Suggestion Experiments
      • Toward Ethical A/B Testing
      • Key Concepts
      • Asking Questions, Thinking Ethically
    • Last Words
  • A. Resources
    • Keywords
      • Chapter 1
      • Chapter 2
      • Chapter 3
      • Chapters 4, 5, and 6
      • Chapter 7
      • Chapter 8
    • Books
    • Online Articles, Papers, and Blogs
    • Courses
    • Tools
    • Professional Groups, Meetups, and Societies
  • B. About the Authors
  • About the Authors
  • Colophon
  • Index
  • Copyright





(c) 2005-2024 CATALIST interactive agency; trademarks belong to the publisher Helion S.A.