Have you ever debated what should be included in your site’s header? This week’s header test by Harry & David’s ecommerce store increased purchases by a double-digit amount. See if you can guess which header did the trick…
Seven elements were tested in all: search box elements such as outline, highlight, and color; search term copy and location; and the look of the header’s cart icon.
See the creative samples and guess which one won. Let us know your thoughts in the comments.
Landing page content is always a balancing act of not enough information vs. too much copy clutter. See whether longer or shorter copy increased sales by 62% on this landing page.
This week’s case study is a clean A/B test – the versions are identical except one omits a FAQ section that had appeared below the fold. Note: Click twice on each image to blow it up to full size.
Site navigation is something every website should be testing. This week’s Ecommerce Product Page test has some interesting results that should remind you to always be testing.
This is a clean A/B test juxtaposing two product pages that are 100% identical except one version includes site navigation in the left-hand margin and the other version does not.
Can you guess WhichTestWon? Let us know your thoughts in the comments.
Have you ever wondered how important trust icons are on your lead generation forms? We’ve wondered the same thing! Can you guess which form got the most submissions?
This is a straightforward A/B test pitting a short form with no trust indicator against one with the TRUSTe insignia. This was the ONLY element that changed on the page.
Does a clean entrance page or additional site navigation increase leads for Dell’s cloud computing solution? Can you guess which test won in this 2012 award-winning test?
This section of Dell’s site has more than 60 pages of content to digest. Dell hypothesized that the additional nav would increase leads, and ran this test to find out whether it would lift conversions or add too much clutter.
What do you think? Did this additional nav increase leads or confusion? Vote and tell us what you think:
This week’s test showcases a subscription offer page test by social media giant LinkedIn. Can you guess which of the radically redesigned pages increased subscription purchases by 20%?
There are many differences between the two pages. The most notable is one page uses dynamic content to display specific call-outs based on user info & site location, and the other uses static content while maintaining the familiar LinkedIn UI.
This week’s test is a must-view for all sites getting traffic from more than one country. If you market globally (or want to someday), see if you can guess which homepage won the test:
The pages were identical except for the addition of a dynamic text call-out targeted at international visitors. The country’s name was displayed based on the visitor’s geolocation.
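Dynamic geotargeted copy like this is simple to prototype: look up the visitor’s country code (from a geolocation service) and swap in a localized call-out, with generic copy as the fallback. A minimal sketch — the country list, function name, and call-out wording are illustrative assumptions, not the actual test creative:

```python
# Hypothetical sketch of a geotargeted headline call-out.
# The mapping and copy below are assumptions for illustration only.

COUNTRY_NAMES = {
    "DE": "Germany",
    "FR": "France",
    "JP": "Japan",
}

def headline_callout(country_code: str) -> str:
    """Return a call-out naming the visitor's country, falling back
    to generic copy for locations not explicitly targeted."""
    name = COUNTRY_NAMES.get(country_code)
    if name:
        return f"Now shipping to {name}!"
    return "Now shipping worldwide!"
```

In a real test, the country code would come from an IP-geolocation lookup server-side or from your testing tool’s targeting features.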
Have you run an HTML vs. plain text email test lately? With the winning version here increasing revenue by 303.8%, you may want to soon… Can you guess which version it is?
Both email versions shared an identical subject line, landing page, and offer, and both were sent to segments of the same list.
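Splitting one list into comparable segments is usually done by deterministic assignment, so each subscriber lands in the same segment every time. A minimal sketch of one common approach, hashing each address — the function name and salt are assumptions, not details from this case study:

```python
import hashlib

def assign_variant(email: str, salt: str = "html-vs-text") -> str:
    """Deterministically assign a subscriber to the HTML or plain-text
    segment by hashing their address, giving a stable ~50/50 split."""
    digest = hashlib.sha256(f"{salt}:{email}".encode()).hexdigest()
    return "html" if int(digest, 16) % 2 == 0 else "plain_text"
```

Because the assignment depends only on the address and salt, re-running the send (or a follow-up test with the same salt) keeps everyone in their original segment.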
We were surprised by the 439% difference in leads generated between these two pages from California Closets®. Can you tell which one massively increased conversions?
These pages were the final step in a 4-step conversion path for PPC traffic. Both pages are pretty well designed – probably nicer than 90% of the lead generation pages online today. So, it’s nice to see how much testing can help even when your site is not all that bad to begin with.
Here’s the link to vote – and then join the debate in our comments section:
Side Note: We’ve only got 15 seats left for the Advanced Testing Workshop I’ll be leading in San Francisco in April. If you’re interested in attending, get more info and reserve seats at http://whichtestwon.com/training
Recently, the famous British newspaper The Telegraph wanted to improve revenue from the Google AdSense ads on its site. So, it tested a wide variety of tweaked AdSense display designs.
Can you guess which design got more clicks?
If your company runs PPC AdSense campaigns, this is a great benchmark test to share with the sites your ads run on. Chances are those publishers may not realize they can A/B test Google ad designs to improve your clicks (not to mention their revenues!).
So, vote for your favorite now, and then consider spreading the word to your AdSense publishing partners: