The Invisible Traps in A/B Testing: A Conversion Optimization Case Study
As a conversion optimization hacker, you pride yourself on making data-backed decisions for your business. You follow the A/B testing guidelines to a T. After dozens of tests, you've finally found your diamond in the rough: your new landing page converts visitors into registrations 15% better. You're so pumped, you're ready to ship!

Don't.
At least not before you've checked all the data.
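The first sanity check on a lift like that 15% number is whether it's statistically significant at all. Here's a minimal sketch using a two-proportion z-test; the visitor and conversion counts are made up for illustration, not anyone's real data.

```python
# Minimal sketch: is a conversion lift statistically significant?
# Uses a two-proportion z-test; all counts below are hypothetical.
from math import sqrt
from statistics import NormalDist

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Return the z statistic and two-sided p-value for the
    difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)          # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value

# Hypothetical example: control converts 1,000 of 20,000 visitors (5.0%),
# variant converts 1,150 of 20,000 (5.75%) -- roughly a 15% relative lift.
z, p = two_proportion_z_test(1000, 20000, 1150, 20000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

A significant p-value only tells you the lift on this one metric is real. It says nothing about what the change did to the rest of your funnel, which is exactly the trap in the case study below.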
Case study
Pinterest once ran a test on the visual design of a landing page: they enlarged the related pins so they matched the size of the original image that had brought the user to the page.
This gave the extended image roll more visual weight and could nudge users into scrolling sooner. Brilliant idea! The data agreed, too: engagement jumped, and the new layout was 25% more effective at convincing a visitor to sign up.
They didn't ship that page.
Why? Because the new design broke SEO. Well, kinda. Resizing the images forced search engines to re-crawl tons of pages, and the traffic coming in through image search dropped by double digits. This was a highly trafficked landing page, so the percentage drop at the top of the funnel was considerable.
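To see why a 25% conversion win can still be a net loss, here's a back-of-the-envelope sketch. All of the numbers are hypothetical, not Pinterest's actual figures.

```python
# Back-of-the-envelope sketch of the trade-off in the story above:
# a big conversion lift can still lose signups overall if the variant
# cuts top-of-funnel traffic. All numbers here are hypothetical.
def signups(visitors, conv_rate):
    """Registrations for a given traffic volume and conversion rate."""
    return visitors * conv_rate

control = signups(1_000_000, 0.05)                  # 50,000 signups
# Variant: +25% conversion, but a 25% drop in search traffic.
variant = signups(1_000_000 * 0.75, 0.05 * 1.25)    # 46,875 signups
print(f"control: {control:,.0f}, variant: {variant:,.0f}")
# Despite converting 25% better, the variant produces fewer total
# registrations: the top-of-funnel loss swamps the conversion gain.
```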
Key Takeaway
It's a simple but often overlooked phenomenon in conversion optimization: when you're laser-focused on improving one key metric, make sure you've checked the entire data flow before making your call. Pride yourself on being the data detective, and keep your ears open when your data is screaming a B-side story.
Follow me on Twitter (@qindizhang) or add me to your Google+ circle, my friend! Let me know in the comments what other topics you want to read about!