David Mannheim runs User Conversion, a CRO (Conversion Rate Optimization) agency based in Manchester, UK. User Conversion works on everything related to optimizing conversions—from content to design. They use FullStory session replay to identify opportunities, aggregate insights, and run experiments.
We believe that session recordings are one of the best conversion research techniques out there.
—David Mannheim, User Conversion
In one example, the User Conversion team used session replay analysis to drive a 26% conversion rate uplift for an e-commerce client.
What is User Conversion's process for using session replay for CRO? And how do you manage the thousands of session recordings in FullStory and create structured outputs from subjective analysis of session replay?
David shared the answers to the above questions (and more) in a recent webinar—you can find the recording linked below. What follows is a deep dive into the User Conversion methodology wherein we explore in detail how you can use session replay for conversion rate optimization. There is much to cover, so if you prefer to jump around, simply navigate using the in-page links below:
TABLE OF CONTENTS » Session Replay for CRO
- How to use session replay for conversion rate optimization.
- Case-Study: How Travis Perkins achieved a 26% lift in conversions.
- Tips for session replay analysis.
- Webinar recording.
- Outline of the webinar.
- Slides from the presentation.
How to use session replay for CRO.
For David and the User Conversion team, FullStory's search and session replay are how they conduct user research—identifying opportunities, uncovering insights, and building experiments—all for the purpose of improving conversions.
We believe that session recordings are one of the best conversion research techniques out there. [It's] natural user behavior that's not skewed by tasks or any kind of bias ... [Session recordings can be] hard to analyze, but they're natural, they're organic, they're real.
As David goes on to explain, the challenge in analyzing session recordings comes down to two things:
- How do you manage the volume of recordings? As David points out, a given client can have tens of thousands of session recordings. How do you know where to start?
- How do you bring rigor to the subjectivity of analysis? Ask different people to watch the same user session recording and you'll get completely different observations.
For the first problem—managing the volume of session recordings—David explains you need to focus your analysis around an objective. Identify an aspect of your site you suspect could be optimized. As an example, if your objective is to improve the conversion rate around the checkout process, you could use FullStory search to segment user sessions by those who engage with the checkout flow but do not convert.
If the resulting set of sessions still seems too big to manage, refine your search further by filtering your sessions using additional criteria—e.g. device, acquisition source, click behaviors, or Rage Clicks. Once you have a workable set of sessions to observe—and you still won't need to watch them all—you can begin your qualitative research.
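The narrowing step above can be sketched in code. This is a minimal, hypothetical illustration—the session records and field names below are made up for the example; in practice these filters correspond to FullStory search criteria (device, acquisition source, Rage Clicks, and so on), not to any FullStory API.

```python
# Hypothetical session records standing in for FullStory search results.
sessions = [
    {"id": "s1", "device": "mobile",  "reached_checkout": True,  "converted": False, "rage_clicks": 2},
    {"id": "s2", "device": "desktop", "reached_checkout": True,  "converted": True,  "rage_clicks": 0},
    {"id": "s3", "device": "mobile",  "reached_checkout": False, "converted": False, "rage_clicks": 0},
    {"id": "s4", "device": "mobile",  "reached_checkout": True,  "converted": False, "rage_clicks": 5},
]

def sessions_to_review(sessions, device=None, min_rage_clicks=0):
    """Keep sessions that engaged with checkout but did not convert,
    optionally narrowed by device and minimum Rage Click count."""
    return [
        s for s in sessions
        if s["reached_checkout"] and not s["converted"]
        and (device is None or s["device"] == device)
        and s["rage_clicks"] >= min_rage_clicks
    ]

# Start broad, then tighten the filters until the set is workable.
print([s["id"] for s in sessions_to_review(sessions, device="mobile")])
```

The point of the sketch is the shape of the workflow: each additional filter criterion shrinks the review set until it is small enough to watch by hand.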
You might be wondering: how do you apply rigor to qualitatively analyzing session recordings? For this, David has developed a methodology that brings a quantitative feel to the qualitative nature of session replay analysis.
David's structured analysis uses a Google Sheet—you can find the template here. What the template does is force you to record observations from sessions into a file that becomes the foundation for identifying opportunities and insights. This methodical process is important because of the outputs it creates.
What you see in the above screenshot is the result of User Conversion mining session recordings over the course of a few hours. You can see how the URL column includes a link to the FullStory session and each following column allows for notes and categorization of observations, including device, template (or page in question), action, and priority.
Keeping your qualitative research simple—making a specific observation and recording it in a spreadsheet—allows you to move through many sessions and make tons of observations. While running through this observational process, reserve judgment for what needs to be done and, as David suggests, try not to go down the rabbit hole. Stay focused on the objective of your analysis.
As your observations pile up, they feed into a "Summary Document." You can see what that looks like below (it's also in the template):
Note the highlighted lines. These are instances where the same observation was made across different session recordings (as in the first screenshot above). User Conversion's template aggregates subjective, qualitative observations from different user session recordings.
From this aggregation of subjective observations, patterns begin to emerge. Common problems surface across sessions. And the insights that result allow User Conversion to prioritize next steps.
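The aggregation step can be approximated with a few lines of code. This is a hedged sketch, not User Conversion's actual template logic: the rows below are invented examples mirroring the template's columns (session link, device, template, observation, priority), and the counting simply surfaces observations that recur across sessions.

```python
from collections import Counter

# Hypothetical observation rows, mirroring the spreadsheet template's columns:
# (session link, device, template/page, observation, priority).
observations = [
    ("session-link-1", "mobile",  "checkout", "promo code field hidden",      "high"),
    ("session-link-2", "desktop", "checkout", "promo code field hidden",      "high"),
    ("session-link-3", "mobile",  "nav",      "searches multiple categories", "medium"),
    ("session-link-4", "mobile",  "checkout", "promo code field hidden",      "high"),
]

# Count how often the same observation recurs across different sessions;
# recurring observations are the patterns worth prioritizing.
counts = Counter(obs for _, _, _, obs, _ in observations)
for observation, n in counts.most_common():
    if n > 1:
        print(f"{n}x  {observation}")
```

However the counting is done—spreadsheet formulas or a script—the idea is the same: once subjective observations are recorded in a consistent structure, repetition across sessions becomes measurable, and measurable repetition is what turns anecdotes into prioritized insights.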
Do the work: record and categorize your observations. Once you reach a point of diminishing returns, when watching more sessions yields few novel observations, stop and either refine the set further (based on new observations) or determine next steps.
Case Study—Travis Perkins
From session replay analytics to A/B testing experiments.
Having identified opportunities and insights, User Conversion creates hypotheses centered on whatever problems customers appear to be having. These hypotheses drive User Conversion to conduct experiments using Monetate, an A/B testing tool.
People tend to think that when you experiment, you experiment from an idea, and that's not really the case.
You actually experiment from a user problem.
—David Mannheim, User Conversion
One of User Conversion's clients is Travis Perkins—a building products retailer in the UK. While analyzing Travis Perkins' session recordings, User Conversion observed that website visitors on certain pages appeared to be aimlessly searching through different product categories. David comments on a given user's behavior while watching a session recording:
Within the navigation, we go to Building Materials, they scroll over. They go to Doors and Joinery. And you can almost say this out loud. You're like, "What are you looking for?"
Okay, solid wood flooring. Seems simple enough. Scrolling down, you see the user almost trying to underline Solid Wood Flooring. They're hovering over that. Maybe suggests some level of intent to see what's going on ... Then [the customer goes to] Products from Timber, and then back to Solid Wood Flooring. Nothing's going on. Okay, and then they're back to the search.
What is this user trying to do? As David says, "When you're analyzing session recordings, it's often the things that you don't see that are the most important. Try to think about why they're doing what they're doing."
Building off this observation, User Conversion refined their focus further, hunting for other sessions that contained similar customer behaviors.
Having watched many different user sessions and seen the same kinds of customer behaviors crop up time and time again, the User Conversion team came up with a hypothesis: perhaps customers were having trouble with the way information was organized on the Travis Perkins site, and were showing signs of struggle because the navigation and search functionalities weren't helping them find what they were looking for.
Based on this hypothesis, User Conversion came up with an experiment: what if they provided an indexed, alphabetically organized navigation menu in addition to the product categories? Perhaps that would help users get to where they wanted to go.
Having come up with an experiment they could test, User Conversion used Monetate to conduct A/B tests. Again using FullStory, they made new observations about just those consumers who interacted with the A-Z product listing.
The results of this experiment were incredible. Of those users who interacted with the menu, User Conversion saw a 26% lift in conversion. They continue to refine the menu through watching session recordings of users who interact with it so that they can optimize the UI further.
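For clarity on what a "26% lift" means here: it is a relative improvement in conversion rate, not a 26-percentage-point jump. A small sketch with illustrative numbers (these are not Travis Perkins' actual figures) shows the arithmetic:

```python
def conversion_lift(control_conversions, control_sessions,
                    variant_conversions, variant_sessions):
    """Relative lift of the variant's conversion rate over the control's."""
    control_rate = control_conversions / control_sessions
    variant_rate = variant_conversions / variant_sessions
    return (variant_rate - control_rate) / control_rate

# Illustrative numbers only: a 5.0% control rate vs. a 6.3% variant rate
# is a 26% relative lift.
lift = conversion_lift(500, 10_000, 630, 10_000)
print(f"{lift:.0%}")  # 26%
```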
You don't have to be working on e-commerce stores or product pages to use this method of conversion optimization; it readily applies to any landing page or homepage. Session replay provides the nuanced information you need to understand opportunities for higher conversions. Furthermore, combining your analysis with multivariate tests and using session replay to observe the user experience will produce test results you can act on to drive your CRO efforts.
It's a powerful methodology.
Tips for successful session replay analysis.
Successful session replay analysis requires attention to detail. That's why David and his team only watch user session recordings at 1x speed. David cautions that while increasing the playback speed can make watching your user recordings go faster, it can mean missing customer behavior nuances—like hesitating before clicking on a button. It's often the nuances that reveal what is—or isn't—happening with that customer.
Watch the mouse movements—hypothesize what they might mean. David relates how at times you can derive offsite behavior by simply watching mouse movements. For example, if you see the mouse move up and to the left, it may signal that the user is opening a new tab, as might be the case when a customer is going to look for reviews or price checking. Again, the details captured in session recordings make all the difference.
Another example: they've observed that customers often appear to highlight product names and then mouse up to the top of the browser navigation. What's happening? They're highlighting a product name to copy it, so they can run a price check or hunt down reviews on Google or Amazon.
Knowing how customers behave can result in creative solutions or experiments to try and improve the customer experience—and optimize for conversions.
Be ready to set aside a few hours or even a couple days to run an analysis. The success of session replay analysis hinges on doing the hard work of making observations across enough sessions that actionable insights emerge. The good news is that from this effort will come a large set of opportunities, whether they're immediate fixes or insights that lead to experiments you can run.
Session replay for CRO requires a systematic approach.
User Conversion's overall process for user research revolves around their "COIE Optimisation Framework," where "COIE" is an acronym for Configuration, Opportunities, Insights, and Experiment.
You can read more about how User Conversion conducts user research on their site, but the critical tool they use—as noted in David's presentation by the "FS" between the overlapping circles you see above—is FullStory session replay.
User Conversion's approach works because it brings a systematic approach to session replay analysis (See our discussion of how Quantitative Data and Qualitative Research Work Better Together), introducing rigor to an otherwise subjective process in order to identify opportunities and tease out insights.
All it takes to get started is determining your objective. From there, use FullStory to search your session recordings; the results will leave you with a focused list of user sessions filled with opportunities and insights you can use to drive higher conversion rates.
Want to know more?
If you're intrigued by User Conversion's process for conversion optimization and interested in putting it to work, watching the full webinar below is highly recommended (pro tip: you can boost the playback speed by tapping the gear icon). More than 90% of live attendees stayed for the entire hour-long webinar.
Thank you, David and everyone at User Conversion for sharing this wealth of information with FullStory users.
WEBINAR RECORDING » How to use session replay for conversion rate optimisation.
If you prefer, you can find the webinar recording on YouTube here.
OUTLINE » How to use session replay for conversion rate optimization.
- [5:22] How session replay helps identify opportunities, create and manage insights, and conduct experiments. (COIE model—Configuration, Opportunities, Insights, Experiment)
- [9:53] How to structure inputs to manage session replay analysis.
- Why session replay analysis is hard—quantity of sessions and subjectivity of analysis.
- Focus—create an objective that constrains your analysis (e.g. a checkout flow).
- Categorization—how to categorize problems identified through session replay.
- [19:39] Case study—David walks through a completed analysis that led to an experiment on the navigation for a building products (retail) website. The experiment resulted in a 26% conversion rate uplift for a User Conversion e-commerce client.
- Watching a session replay in FullStory, teasing out insights.
- Creating a hypothesis based on observations from session recordings.
- [26:00] How to structure outputs. On managing subjectivity of session replay analysis.
- Narrating using "thematic analysis" doesn't work. Takes too long.
- Session replay analysis template that User Conversion uses to track observations from FullStory sessions, categorize sessions by device and follow-up action (e.g. bug, experiment, etc.). See the template here.
- Template aggregates insights and produces a report.
- [32:05] Live example—analysis of users who engage with the navigation.
- [32:50] Using FullStory with variations (uses A/B testing tool Monetate).
- [34:20] Using FullStory Page Insights Inspect Mode or Chrome Inspect to constrain FullStory sessions to those that include interaction with the navigation element.
- [36:22] Identifying from FullStory session list which sessions to watch (paying attention to events, etc.)
- [37:00] Law of diminishing returns on watching session recordings.
- [40:30] How mouse movements may signal offsite behaviors (e.g. researching prices, mouse moves up to open a new tab to search on another web page).
- "When you're analyzing session recordings, it's often the things that you don't see that are the most important."
- "Try to think about why they're doing what they're doing."
- [47:44] How long does analysis take? Timeboxing analysis.
- [49:00] Q&A