Introducing Stats Engine Service and Enhancements to Our Experiment Data Platform
The era of a single black-box data product as your source of truth is over. Teams need complete visibility throughout the customer lifecycle, forcing a move from all-in-one tools to open, integrated stacks with many data sources.

The key is ensuring you’ve got clean, actionable data in the right place, with the proper tools to derive insights. Today, we’re excited to announce several enhancements to our experimentation data platform to meet the growing needs of product, engineering, data science, and growth marketing teams and to help you solve more complex data and statistics problems.
Introducing Stats Engine Service
The new Stats Engine Service exposes Optimizely’s proprietary statistical model for A/B test analysis via an API. For the first time, you can run Stats Engine on non-Optimizely datasets from external sources like a data warehouse or analytics tool. We developed Stats Engine with leading researchers from Stanford University to democratize results review, so anyone in a company can interpret results and make sound decisions. Historically, customers could only use Stats Engine in the context of our Optimizely Results page. Now, we’re helping you bring the power of Stats Engine anywhere you have experimentation data.
For example, you can join Optimizely decisions with private financial data and then run Stats Engine on those metrics. Go beyond optimizing conversion rates by using Stats Engine to measure the impact of experiments on complex metrics like LTV, MAUs, and retention. Now, you can even plot Stats Engine analysis charts in business intelligence (BI) tools like Tableau or Chartio. The Stats Engine Service is coming soon and you can request access here.
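To make the idea concrete, here is a minimal sketch of what calling Stats Engine on warehouse-computed metrics might look like. The Stats Engine Service API is not yet public, so the endpoint URL, payload fields, and auth scheme below are assumptions for illustration only.

```python
import requests

# Hypothetical endpoint: the real URL, field names, and auth scheme may differ.
STATS_ENGINE_URL = "https://api.optimizely.com/stats-engine/v1/analyze"

# Aggregated metric data computed in your own warehouse (illustrative numbers),
# e.g. customer LTV per variation joined from private financial data.
payload = {
    "metric": "customer_ltv",
    "unit": "visitor",
    "variations": [
        {"name": "control",   "visitors": 10421, "mean": 182.4, "variance": 950.2},
        {"name": "treatment", "visitors": 10398, "mean": 191.7, "variance": 987.5},
    ],
}

response = requests.post(
    STATS_ENGINE_URL,
    json=payload,
    headers={"Authorization": "Bearer <API_TOKEN>"},  # placeholder token
)
response.raise_for_status()

# Assumed response shape: Stats Engine outputs such as lift and statistical
# significance, which you could then chart in a BI tool like Tableau or Chartio.
print(response.json())
```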
Introducing SSRM (Sequential Sample Ratio Mismatch) Service
Proper experiment design is critical to making sound decisions about your site, app, and business. From time to time, an implementation issue can influence how visitors are counted in a test, possibly introducing bias into the results. When the number of users in each variation differs significantly from what is expected under the intended random allocation, you may have a Sample Ratio Mismatch (SRM). For instance, you may have specified a 50/50 traffic split between variations in your test but are observing a 35/65 distribution of traffic. We’ve built a sequential SRM test that you can use to detect incorrectly implemented experiments as each visitor is counted, so you can troubleshoot and fix them faster, as illustrated in the sketch below. The SSRM service is coming soon and you can request access here.
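For intuition, here is the classic fixed-horizon chi-square SRM check on observed visitor counts. This is not the sequential test the SSRM service runs (that test is valid as each visitor arrives); it is a simple illustration of what a mismatch looks like, using the 35/65 example above.

```python
from scipy.stats import chisquare

# Observed visitor counts per variation (illustrative numbers: a 35/65 split).
observed = [3500, 6500]
total = sum(observed)

# Expected counts under the intended 50/50 allocation.
expected = [total * 0.5, total * 0.5]

stat, p_value = chisquare(f_obs=observed, f_exp=expected)

# A very small p-value suggests the observed traffic split deviates from the
# configured allocation, i.e. a likely Sample Ratio Mismatch worth investigating.
if p_value < 0.001:
    print(f"Possible SRM detected (p = {p_value:.2e})")
else:
    print(f"No evidence of SRM (p = {p_value:.3f})")
```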
Introducing Enriched Events Export and Streaming Export
Optimizely’s Enriched Events Export is a new way to help teams integrate Optimizely Results data into their analysis workflows and data stack. Customers with advanced analysis needs can now join Enriched Events with other data to develop machine-learning models and build custom reports and dashboards. Enriched Events Export supports event-level joins, with tags and metadata preserved so no resolution is lost. It features an intuitive schema, partitioned into decisions and conversions, to make joins with internal data easier. It is enriched with useful information like session IDs and details about which experiments and variations were active when each event fired. Enriched event data is stored in an AWS S3 bucket, updated daily, and can be downloaded programmatically via an API.
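As a rough sketch of the event-level join described above, the snippet below loads a day of decisions and conversions that have already been downloaded from the S3 bucket and rolls up a metric per variation. The file paths and column names (visitor_id, experiment_id, variation_id, revenue) are illustrative; consult the export schema documentation for exact field names.

```python
import pandas as pd

# Hypothetical local copies of one daily partition of the export.
decisions = pd.read_parquet("decisions/date=2020-09-01/")
conversions = pd.read_parquet("events/date=2020-09-01/")

# Join conversions back to the experiment and variation that were active
# for each visitor when the event fired.
joined = conversions.merge(
    decisions[["visitor_id", "experiment_id", "variation_id"]],
    on="visitor_id",
    how="inner",
)

# Example rollup: revenue per variation, ready to join with internal data
# (e.g. LTV or retention tables keyed on the same visitor or user IDs).
summary = joined.groupby(["experiment_id", "variation_id"])["revenue"].sum()
print(summary)
```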
Looking for real-time data? We’ll be releasing a streaming export capability that makes Enriched Events data available almost immediately after you send it to our backend, so you can build real-time experiment analysis pipelines. Enriched Events Export is generally available today, and streaming export will be available later in Q3.
Improving Your Data Workflows with Command-Line, Snowflake, and Fivetran Integrations
We’re committed to fostering an open platform and accelerating your time to insight, so we’re rolling out a few new features that relieve development overhead when integrating with Optimizely’s experiment data. We’ve released a command-line tool called oevents that lets data scientists and other technical users load just the data they need from the Optimizely Digital Experience Platform, without a production ETL job in place. Check out a demo of oevents in action. We’ve built an integration with Snowflake that automatically loads your Enriched Events Export data into a Snowflake instance with zero engineering work needed. We’re also supporting a new Fivetran integration that automatically loads Enriched Events and other Optimizely data into a destination of your choice, as sketched below. All three of these new capabilities are generally available today.
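Once the Snowflake integration has loaded your Enriched Events data, querying it from Python is straightforward. The sketch below assumes hypothetical database, schema, and table names (the integration defines the actual ones); credentials are placeholders.

```python
import snowflake.connector

# Connection details are placeholders; use your own account and warehouse.
conn = snowflake.connector.connect(
    account="your_account",
    user="your_user",
    password="your_password",
    warehouse="ANALYTICS_WH",
    database="OPTIMIZELY",   # assumed database created by the integration
    schema="ENRICHED_EVENTS" # assumed schema name
)

# Assumed table and column names, for illustration only:
# count unique visitors per experiment and variation from decision events.
query = """
    SELECT experiment_id,
           variation_id,
           COUNT(DISTINCT visitor_id) AS visitors
    FROM decisions
    GROUP BY experiment_id, variation_id
    ORDER BY experiment_id, variation_id
"""

cur = conn.cursor()
try:
    for row in cur.execute(query):
        print(row)
finally:
    cur.close()
    conn.close()
```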
Introducing Labs: A collection of integrations and tutorials for working with Optimizely data and APIs
We’ve introduced a new section of our website called Labs. Labs is a library of reference implementations and integrations that you can use to extend Optimizely’s platform. It includes recipes for integrating Optimizely SDKs with specific programming-language frameworks, reference code for sending data between Optimizely and different data providers, and Jupyter notebooks that enable more complex experiment analyses. All of these labs are open source and hosted on GitHub. To learn more, go to optimizely.com/labs.
Learn More
- Watch the Opticon keynote live today to learn more about all of these new solutions.
- Come to our co-founder Pete Koomen’s Opticode workshop tomorrow, where he’ll show how to uncover insights in data using these new solutions.
- Learn more and request access for these upcoming features on the Data Teams page.