
Lessons from Wikipedia and other A/B testers

December 7, 2010 at 11:48 am PST

Chances are you have used Wikipedia for something in the last few months. And if so, chances are you have seen one of their fundraising pleas, such as the banner appeals that run across the top of the site.

What’s really interesting is that Wikipedia is publishing the results of the ads, in something we in the biz call an “A/B” or multivariate test. The idea is this: create a series of different ads, with different pictures, headlines, buttons and links, then rotate them across the site and see which combinations work best. You can measure “version A” against “version B” (that’s called an A/B test), or you can do a more sophisticated mix-and-match of the elements, called a multivariate test. The great thing is that you get data from 10,000 or 100,000 or 1,000,000+ interactions with your creative content, and from that you can see what visitors are actually interested in, based on their behavior.
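
To make the mechanics concrete, here is a minimal sketch of how a site might split traffic, in Python, with hypothetical variant names (Wikipedia’s actual campaign tooling is surely more elaborate). Hashing the visitor ID means each visitor keeps seeing the same banner while traffic splits evenly across the versions:

    import hashlib

    # Hypothetical variant names, for illustration only.
    VARIANTS = ["jimmy_appeal", "editor_photo", "report_headline"]

    def assign_variant(visitor_id: str) -> str:
        """Hash the visitor ID so each visitor always gets the same banner."""
        digest = hashlib.sha256(visitor_id.encode("utf-8")).hexdigest()
        return VARIANTS[int(digest, 16) % len(VARIANTS)]

    # Log an impression for the assigned variant, then a click if one happens;
    # each variant's click-through rate is simply clicks / impressions.
    print(assign_variant("visitor-42"))  # same visitor, same banner, every time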

Commercial web sites do this all day, every day, but the difference is that Wikipedia is posting its results publicly, in real time, for the whole world to see. There’s a ton of raw data about click behavior in multiple languages.

On their A/B test data updates page you can see a lot of interesting and counter-intuitive behavior. For instance, I thought “Admit it- without Wikipedia, you never could have finished that report.” was a fantastic headline, but it got less than a 1% click-through rate, whereas the “personal appeal from Jimmy Wales” with his picture got almost a 3% click-through rate in a recent week.
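
If you want to convince yourself that a gap like 1% versus 3% isn’t just noise, a standard two-proportion z-test is enough. Here’s a rough sketch; the impression counts are invented, and only the approximate click-through rates come from the test data:

    import math

    def two_proportion_ztest(clicks_a, n_a, clicks_b, n_b):
        """Two-sided z-test for the difference between two click-through rates."""
        p_pool = (clicks_a + clicks_b) / (n_a + n_b)           # pooled CTR
        se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
        z = (clicks_b / n_b - clicks_a / n_a) / se
        p_value = math.erfc(abs(z) / math.sqrt(2))             # two-sided p
        return z, p_value

    # Invented impression counts: ~1% CTR versus ~3% CTR.
    z, p = two_proportion_ztest(clicks_a=1_000, n_a=100_000,
                                clicks_b=3_000, n_b=100_000)
    print(f"z = {z:.1f}, p = {p:.2g}")  # a difference this large is not chance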

The Wikipedia team is also posting regular summaries of its findings, in case you don’t want to slog through the detailed testing page or download the spreadsheets of raw data. Some findings from last week emphasize that the total experience, from the click all the way through to the donation, is what matters, not just the initial click-through rate (there’s a quick sketch of the difference right after this list):

  • “The original two-step payment form we’ve been using is the most effective, it performed better than the new one-step process.”
  • “Adding an editor’s image to the landing page did not significantly affect donations.”
  • “The click-through rates on the editor banners continue to be on par with the winning Jimmy banner, but bring in fewer donations.”
  • “Many donors appear to relate better to letters which focus on readers showing support instead of individuals editing.”
  • “There has been a positive response to the new editor banners, the variety keeps our campaign interesting.”
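
Here’s a quick sketch of that point, with invented numbers shaped to match the quoted finding: two banners can have nearly identical click-through rates yet bring in very different donations per impression, and the end-to-end number is the one that actually funds Wikipedia:

    # Invented numbers, for illustration only: (impressions, clicks, donations)
    banners = {
        "jimmy_banner":  (1_000_000, 29_000, 1_400),
        "editor_banner": (1_000_000, 28_500,   700),
    }

    for name, (impressions, clicks, donations) in banners.items():
        ctr = clicks / impressions
        per_thousand = 1000 * donations / impressions  # end-to-end yield
        print(f"{name}: CTR {ctr:.2%}, donations per 1,000 impressions {per_thousand:.2f}")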

From the commercial web world, there are a couple of other sites that post results of A/B and multivariate tests: WhichTestWon.com and MarketingExperiments.com both show recent design tests and let you guess which ones were most effective.

By the way, A/B tests aren’t just for advertising. In fact, many companies use them as part of their design regimen, to test which designs actually help users complete a task. The A/B test tells you which design works better, but of course not why it works better; you still need usability testing for that.

P.S. A shoutout to the folks at digitaloptimizer and others for pointing out the Wikipedia tests to me. Very interesting.



1 Comment.


  1. Good post, thanks Martin.

    It’s great that Wikipedia is publishing this data. I use A/B split testing for a lot of my online marketing projects, and it’s nice to be able to compare my data with that of an online powerhouse!
