Bandit Algorithms for Website Optimization

O'Reilly Media
When looking for ways to improve your website, how do you decide which changes to make? And which changes to keep? This concise book shows you how to use multiarmed bandit algorithms to measure the real-world value of any modifications you make to your site. Author John Myles White shows you how this powerful class of algorithms can help you boost website traffic, convert visitors to customers, and increase many other measures of success. This is the first developer-focused book on bandit algorithms, which were previously described only in research papers. You'll quickly learn the benefits of several simple algorithms, including the epsilon-Greedy, Softmax, and Upper Confidence Bound (UCB) algorithms, by working through code examples written in Python, which you can easily adapt for deployment on your own website.

  • Learn the basics of A/B testing, and recognize when it's better to use bandit algorithms
  • Develop a unit testing framework for debugging bandit algorithms
  • Get additional code examples written in Julia, Ruby, and JavaScript with supplemental online materials
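The book's own examples are not reproduced here, but as a rough illustration of the epsilon-Greedy idea it covers, the sketch below is a minimal, self-contained Python version: serve a random page variant with probability epsilon (explore), otherwise serve the variant with the best observed average reward (exploit). The class name, parameter values, and simulated conversion rates are illustrative assumptions, not material from the book.

    import random

    class EpsilonGreedy:
        # Minimal epsilon-Greedy bandit: explore a random arm with probability
        # epsilon, otherwise exploit the arm with the highest average reward so far.
        def __init__(self, epsilon, n_arms):
            self.epsilon = epsilon
            self.counts = [0] * n_arms    # number of pulls per arm
            self.values = [0.0] * n_arms  # running average reward per arm

        def select_arm(self):
            if random.random() < self.epsilon:
                return random.randrange(len(self.counts))  # explore
            return self.values.index(max(self.values))     # exploit

        def update(self, arm, reward):
            self.counts[arm] += 1
            # incremental update of the running mean reward for this arm
            self.values[arm] += (reward - self.values[arm]) / self.counts[arm]

    # Hypothetical simulation: two page variants with made-up conversion rates.
    bandit = EpsilonGreedy(epsilon=0.1, n_arms=2)
    true_rates = [0.05, 0.10]
    for _ in range(10000):
        arm = bandit.select_arm()
        reward = 1.0 if random.random() < true_rates[arm] else 0.0
        bandit.update(arm, reward)
    print(bandit.counts, [round(v, 3) for v in bandit.values])

With these made-up rates, the higher-converting variant typically accumulates most of the traffic over time, which is the exploration/exploitation trade-off that the algorithms in the book manage.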


  • Author: John Myles White
  • Publisher: O'Reilly Media
  • Publication Date: January 03, 2013
  • Number of Pages: 88 pages
  • Language: English
  • Binding: Paperback
  • ISBN-10: 1449341330
  • ISBN-13: 9781449341336