

Bandit Algorithms for Website Optimization
ebook
Author: John Myles White
ISBN: 978-14-493-4158-9
Pages: 88, Format: ebook
Publication date: 2012-12-10
Bookstore: Helion

Book price: 59,42 zł (previously: 69,09 zł)
You save: 14% (-9,67 zł)


Tags: Algorithms - Programming

When looking for ways to improve your website, how do you decide which changes to make? And which changes to keep? This concise book shows you how to use Multiarmed Bandit algorithms to measure the real-world value of any modifications you make to your site. Author John Myles White shows you how this powerful class of algorithms can help you boost website traffic, convert visitors to customers, and increase many other measures of success.

This is the first developer-focused book on bandit algorithms, which were previously described only in research papers. You’ll quickly learn the benefits of several simple algorithms—including the epsilon-Greedy, Softmax, and Upper Confidence Bound (UCB) algorithms—by working through code examples written in Python, which you can easily adapt for deployment on your own website.

  • Learn the basics of A/B testing—and recognize when it’s better to use bandit algorithms
  • Develop a unit testing framework for debugging bandit algorithms
  • Get additional code examples written in Julia, Ruby, and JavaScript with supplemental online materials
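
The epsilon-Greedy algorithm highlighted above follows a simple rule: with probability epsilon it explores a randomly chosen arm, and otherwise it exploits the arm with the highest estimated reward. As a purely illustrative sketch in Python (not code from the book; the class name and example conversion rates below are hypothetical), it might look like this:

    import random

    class EpsilonGreedy:
        """Minimal epsilon-Greedy bandit: explore a random arm with
        probability epsilon, otherwise exploit the best-known arm."""

        def __init__(self, epsilon, n_arms):
            self.epsilon = epsilon
            self.counts = [0] * n_arms    # times each arm was pulled
            self.values = [0.0] * n_arms  # running mean reward per arm

        def select_arm(self):
            if random.random() < self.epsilon:
                return random.randrange(len(self.values))  # explore
            return self.values.index(max(self.values))     # exploit

        def update(self, arm, reward):
            self.counts[arm] += 1
            n = self.counts[arm]
            # incremental update of the running mean reward for this arm
            self.values[arm] += (reward - self.values[arm]) / n

    # Toy usage: two site variants with different (unknown) conversion rates.
    true_rates = [0.05, 0.10]
    algo = EpsilonGreedy(epsilon=0.1, n_arms=len(true_rates))
    for _ in range(10000):
        arm = algo.select_arm()
        reward = 1.0 if random.random() < true_rates[arm] else 0.0
        algo.update(arm, reward)
    print(algo.values)  # estimates should approach the true rates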


Customers who bought "Bandit Algorithms for Website Optimization" also chose:

  • Python na maturze. Kurs video. Algorytmy i podstawy j
  • Algorytmy kryptograficzne. Przewodnik po algorytmach w blockchain, kryptografii kwantowej, protoko
  • Informatyk samouk. Przewodnik po strukturach danych i algorytmach dla pocz
  • My
  • Nauka algorytm


Table of Contents

Bandit Algorithms for Website Optimization eBook -- table of contents

  • Bandit Algorithms for Website Optimization
  • Preface
    • Finding the Code for This Book
    • Dealing with Jargon: A Glossary
    • Conventions Used in This Book
    • Using Code Examples
    • Safari Books Online
    • How to Contact Us
    • Acknowledgments
  • 1. Two Characters: Exploration and Exploitation
    • The Scientist and the Businessman
      • Cynthia the Scientist
      • Bob the Businessman
      • Oscar the Operations Researcher
    • The Explore-Exploit Dilemma
  • 2. Why Use Multiarmed Bandit Algorithms?
    • What Are We Trying to Do?
    • The Business Scientist: Web-Scale A/B Testing
  • 3. The epsilon-Greedy Algorithm
    • Introducing the epsilon-Greedy Algorithm
    • Describing Our Logo-Choosing Problem Abstractly
      • What's an Arm?
      • What's a Reward?
      • What's a Bandit Problem?
    • Implementing the epsilon-Greedy Algorithm
    • Thinking Critically about the epsilon-Greedy Algorithm
  • 4. Debugging Bandit Algorithms
    • Monte Carlo Simulations Are Like Unit Tests for Bandit Algorithms
    • Simulating the Arms of a Bandit Problem
    • Analyzing Results from a Monte Carlo Study
      • Approach 1: Track the Probability of Choosing the Best Arm
      • Approach 2: Track the Average Reward at Each Point in Time
      • Approach 3: Track the Cumulative Reward at Each Point in Time
    • Exercises
  • 5. The Softmax Algorithm
    • Introducing the Softmax Algorithm
    • Implementing the Softmax Algorithm
    • Measuring the Performance of the Softmax Algorithm
    • The Annealing Softmax Algorithm
    • Exercises
  • 6. UCB: The Upper Confidence Bound Algorithm
    • Introducing the UCB Algorithm
    • Implementing UCB
    • Comparing Bandit Algorithms Side-by-Side
    • Exercises
  • 7. Bandits in the Real World: Complexity and Complications
    • A/A Testing
    • Running Concurrent Experiments
    • Continuous Experimentation vs. Periodic Testing
    • Bad Metrics of Success
    • Scaling Problems with Good Metrics of Success
    • Intelligent Initialization of Values
    • Running Better Simulations
    • Moving Worlds
    • Correlated Bandits
    • Contextual Bandits
    • Implementing Bandit Algorithms at Scale
  • 8. Conclusion
    • Learning Life Lessons from Bandit Algorithms
    • A Taxonomy of Bandit Algorithms
    • Learning More and Other Topics
  • Colophon
  • Copyright

