I’ve said previously that one of my current goals is to redefine the process for completing our major page revisions to include more A/B testing to demonstrate why changes are being made. While it’s just good practice, it’s also becoming increasingly valuable to show stakeholders how changes to text, layout and presentation impact user experience.
But I wanted to take it a step further. If we launch a new version of a page, how can we show the campus leadership the positive impact of what we’ve done? I’ve been trying to come up with a better process to illustrate the improvements to a user’s journey on our site. Here are the highlights of the process I’ve been putting into place:
- Capture a preliminary heat map to look at clicks & scroll depth, then document how the page is performing in your analytics tool of choice. What is the current goal performance of the page? What is the bounce rate? Are we seeing return interest in the content? How far are people scrolling? How much extra content is on the page that doesn’t speak to the topic? (If your analytics setup doesn’t already record scroll depth, there’s a sketch after this list.)
- Show the stakeholder what’s going on currently. Have an honest conversation after reviewing the data about what’s working and what isn’t. Many times, this comes down to whether we’re seeing our visitors accomplish the key tasks we want them to: explore more of the site, complete our calls to action, return to the page for more information.
- Here comes the fun part. Now it’s time to think about what adjustments should be made. What do we want/need people to do? What is causing paralysis of choice? Are there any elements we’re required to include? Can we get creative with them? (The requirements piece can be huge in higher education, as we all have to deal with multiple accreditors.)
- Construct an A/B test to measure how revising the current page and the associated content can help us meet our goals. I typically use Google Optimize (see the variant-handling sketch after this list). At this point, reach out to the stakeholders to share the reasoning behind the suggested changes.
- Identify the criteria for success. Do we want to see longer sessions? The completion of specific goals? A lower bounce rate? Better mobile performance? Share that information with your stakeholders so everyone understands what a successful experiment will look like. (A quick way to check whether a difference in results is more than noise is sketched after this list.)
- Launch the A/B test. This is probably the worst part because you’ve put all this work into creating what you think will be a better user experience and now you need to wait for the results to roll in.
- If the test shows that the new version outperforms the original against the goals you set, celebrate the launch of an improved user experience.
- Repeat the heat map experiment. While we’re happy with the results of the A/B test, we also want to make sure the improvement holds up across a broader audience. Are visitors now clicking on our new calls to action? Scrolling further? Not getting lost in the extra information we were previously including on the page? Converting at a higher rate?
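A few of these steps can be wired up with small snippets. First, scroll depth: many analytics configurations don’t record it out of the box, but a tiny tracker can report it through gtag.js. This is a minimal sketch, assuming gtag.js is already installed on the page; the thresholds and the `scroll_depth` event name are my own illustrative choices, not a standard:

```ts
// Minimal scroll-depth tracker. Assumes gtag.js is already installed;
// the event name and thresholds are illustrative choices.
declare function gtag(...args: unknown[]): void;

const thresholds = [25, 50, 75, 100];
const fired = new Set<number>();

window.addEventListener(
  "scroll",
  () => {
    const scrollable =
      document.documentElement.scrollHeight - window.innerHeight;
    if (scrollable <= 0) return; // page fits in the viewport; nothing to track

    const percent = (window.scrollY / scrollable) * 100;
    for (const t of thresholds) {
      if (percent >= t && !fired.has(t)) {
        fired.add(t); // report each threshold once per page view
        gtag("event", "scroll_depth", {
          event_category: "engagement",
          event_label: `${t}%`,
        });
      }
    }
  },
  { passive: true }
);
```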
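Second, the experiment itself. Once the Optimize snippet is installed, most content swaps can be built in Optimize’s visual editor with no code at all; when a variant needs JavaScript, Optimize can hand your page the assigned variant through a gtag callback. A sketch of that pattern, where the experiment ID, the `.hero-cta` selector and the `variant-cta` class are all placeholders:

```ts
// Reacting to a Google Optimize experiment from page code.
// Assumes the Optimize and gtag.js snippets are installed;
// "ABC123xyz" is a placeholder experiment ID.
declare function gtag(...args: unknown[]): void;

function applyVariant(value: string): void {
  // Optimize reports "0" for the original and "1" for the first variant.
  if (value === "1") {
    // Hypothetical selector and class name for illustration.
    document.querySelector(".hero-cta")?.classList.add("variant-cta");
  }
}

gtag("event", "optimize.callback", {
  name: "ABC123xyz",
  callback: applyVariant,
});
```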
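Finally, on judging success: Google Optimize reports a probability-to-beat-baseline for you, but if you’re comparing conversion counts pulled straight from analytics exports, a two-proportion z-test is a quick sanity check that a difference is more than noise. A sketch with made-up numbers:

```ts
// Two-proportion z-test for comparing conversion rates.
// convA/nA = original conversions and sessions; convB/nB = variant.
function zTest(convA: number, nA: number, convB: number, nB: number): number {
  const pA = convA / nA;
  const pB = convB / nB;
  const pPooled = (convA + convB) / (nA + nB);
  const se = Math.sqrt(pPooled * (1 - pPooled) * (1 / nA + 1 / nB));
  return (pB - pA) / se; // positive z means the variant converted better
}

// Made-up example: original converted 120/2400, variant 161/2350.
const z = zTest(120, 2400, 161, 2350);
// |z| > 1.96 corresponds roughly to p < 0.05 (two-tailed).
console.log(`z = ${z.toFixed(2)}, significant: ${Math.abs(z) > 1.96}`);
```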
Most importantly, be sure to show your success and be consistent in testing.