Is Data Ingrained in Your Company DNA? [eBook]
At Optimizely, our vision is to enable the world to turn data into action. We know that this is much easier said than done, so we have a few thoughts on how to grow your business more efficiently and improve your data-driven decision-making.

How many metrics do you report to your team on a weekly basis? Do you know which one(s) you’re actively trying to improve? What’s your process for tackling that challenge?
While crafting our latest in-depth guide to optimization, we collected a few thoughts on how to be truly data-driven. We have access to a tremendous amount of data, with dozens of metrics and KPIs at our disposal. Despite our best efforts, collecting data and reporting on metrics can still leave us guessing.
“Data-collecting doesn’t mean data-informed, and data-informed doesn’t mean data-acting. 59% of companies do not have processes in place in their organizations to ensure that data is understood and acted upon.” —Data Driven Culture Survey Report, Econsultancy & Geckoboard
We’ve discovered that data isn’t just difficult to act on; it’s also tough to know which data is the right kind and how to incorporate it into a continuous process of testing. Without a framework for regularly collecting and acting on our data, we’ll never reach our goal of becoming more data-driven.
Just using data isn’t enough
The ideal of being ‘Data-Driven’ is really to be ‘Data-Informed.’ Data points are an important set of inputs, but they are not a replacement for human intuition and judgment. Quantitative analysis can uncover gaps in your website’s performance and opportunities for improvement through optimization. What data points cannot provide, however, is input on what your business metrics should be or which initiatives to prioritize. Quantitative data can also fail to provide context. Once your analytics data provides a framework for where to test, qualitative insights fill in the gaps of what should be tested—and how. Hiten Shah, Co-Founder of KISSmetrics, recently shared some key questions to ask when formulating A/B tests.
Use a structured process for building data into optimization
Your website, app, and other customer touchpoints are prime opportunities to proactively experiment and take action with your data. Experiments based on data-informed hypotheses are also more likely to succeed than those from a randomly structured testing program. Data plays a key role in the following steps of the Optimization Flywheel:
Set Goals
The first step in a successful optimization process is to clearly align tests around a business goal. A clear understanding of the metrics you are optimizing for helps with prioritization and enables continual iteration and learning from experiments.
Determine Optimization Points
Identify a step in your funnel that is a prime candidate for optimization. It might be your homepage call to action (CTA), your campaign landing pages, your checkout flow, or your recommended content. Choose an area for optimization that connects directly to your business goals. A quick funnel analysis in your analytics software will show which areas of your website can be improved, as in the sketch below. Will optimizing this step of your website experience create a measurable change in the metric you identified when setting your goals?
Hypothesize Improvements
A strong hypothesis about how your experiment will perform is core to running a winning A/B test. Hypotheses are statements, not open-ended questions: they address a question with a proposed solution. Crafting a hypothesis to address an open question or problem on your website forces a well-reasoned, thoughtful proposal for how to solve that problem. To take your hypothesis further, consider what you would learn if an experiment proved your prediction correct, and what you would learn if it proved it incorrect. Leverage your knowledge of your customers’ needs and other qualitative data for clearer indicators of what should be tested to better match your visitors’ intent and resolve their frustrations with your product or website. Take the time to collaborate on hypotheses with your team, and document them along with the quantitative and qualitative data that informed them, as in the sketch below.
Measure and Iterate
After running an experiment, evaluate whether your hypothesis was proven correct (see the sketch below for one way to check a result). A discovery from a winning test can be applied to other areas of your website, or used to test a more advanced hypothesis that builds on the previous discovery (for instance, segmenting high-value audiences within the previous test). If your test was a draw or your variation lost, investigate why. Your hypothesis may have needed additional research, or you may not have accounted for a behavior or event that skewed your results. Even a losing test or a draw can yield insights you had not anticipated through further analysis and discussion with your testing team. At this point, discuss what the data couldn’t account for as you planned the experiment. What would you do differently in your next hypothesis and experiment?
We’re just getting started
Uncover an intentional, measurable approach to running better A/B tests and experiments that will help you connect with customers, boost your KPIs, and grow your business with our guide, Building Your Company’s Data DNA. Download the complete resource for:
- Organizational recommendations for alignment around key metrics
- Tactical advice for CRO practitioners around uncovering and prioritizing impactful test ideas
- Tips from data and optimization experts at KISSmetrics and Qualaroo