Let's Get Real About Real-Time Data Processing!
Published on Oct 15, 2024
by Zoë Oakes
Real-time online controlled experimentation is transforming how companies test new ideas, detect issues early, and make revenue-driving decisions with agility. Experimentation industry leaders like ABsmartly have championed this evolution, creating systems that deliver immediate insights and foster a culture of continuous improvement and innovation.
Jonas Alves, co-founder and CEO of ABsmartly, and Lukas Vermeer, Senior Director of Experimentation at Vista, each have extensive experience building and leading experimentation teams at companies like Booking.com. They know firsthand how real-time data can revolutionise business experimentation and drive increased revenue.
What Do We Mean By Real-Time in Experimentation?
In online experimentation, real-time often becomes a buzzword, but what does it really mean? Real-time implies immediate access to data or outcomes, but in practice we’re talking about seconds or minutes. In the world of experimentation, real-time generally refers to the ability to process and respond to data quickly enough to make meaningful and timely decisions.
What counts as real-time? In most scenarios, real-time implies that data latency is low enough for teams to identify trends, spot issues, and make decisions at speed. While the precise definition is subjective, in this article real-time refers to a latency window of seconds or minutes rather than hours or days.
What is Latency?
In experimentation, latency is the delay between when an event occurs and when its effects are visible in reports or dashboards. It is essentially the time it takes for data to travel through the various collection, processing, and analysis stages to reach a report or dashboard. Latency determines how quickly teams can detect trends, identify bugs, and make informed decisions based on live data. High latency can impact the ability to react quickly to negative outcomes, leaving problems unnoticed or untouched for hours or days, which can have dire effects on customer experience, the brand, and the business. This is what makes real-time data essential for effective, agile experimentation that can react without delay.
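To make that concrete, end-to-end latency is simply the gap between the moment a user acts and the moment that action is queryable in a report. A minimal Python sketch, with hypothetical timestamps:

```python
# A minimal sketch of measuring end-to-end latency, assuming events carry a
# client-side timestamp and the reporting layer records when each event
# becomes queryable. All names and timestamps here are hypothetical.
from datetime import datetime, timezone

def end_to_end_latency(event_created_at: datetime, visible_in_report_at: datetime) -> float:
    """Seconds between the user action and its appearance in a dashboard."""
    return (visible_in_report_at - event_created_at).total_seconds()

created = datetime(2024, 10, 15, 12, 0, 0, tzinfo=timezone.utc)
visible = datetime(2024, 10, 15, 12, 1, 30, tzinfo=timezone.utc)
print(end_to_end_latency(created, visible))  # 90.0 -> comfortably "real-time"
```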
What Causes Latency in Experimentation?
Several factors contribute to latency in data processing and reporting within experimentation platforms (a small instrumentation sketch follows the list):
Data Collection: The first step involves collecting user interactions or events. This process can vary in speed, depending on the complexity of the tracked data and the efficiency of the tools used for logging these interactions. Any delay in this phase adds to the overall latency.
Data Transmission: Once collected, the data needs to be transmitted from the user's device to a central processing system. Network speed, server load, and data size all affect how quickly data is transmitted.
Aggregation and Processing: After data reaches the server, it undergoes aggregation and processing to generate insights. This step introduces further delays, particularly if the infrastructure relies on batch processing (e.g., data warehouses) rather than real-time streaming.
Dashboard and Reporting: Finally, processed data is loaded into dashboards or reports for analysis. While some dashboards are built to display data quickly, the latency incurred in previous stages often limits the speed.
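One way to see where latency accumulates across these stages is to stamp each event as it passes through the pipeline. A minimal Python sketch, with purely illustrative stage names and event structure:

```python
# Illustrative sketch: each pipeline stage stamps the event so latency can be
# attributed per stage. The stage names and event structure are hypothetical.
import time

def stamp(event: dict, stage: str) -> dict:
    event.setdefault("stamps", []).append((stage, time.time()))
    return event

event = stamp({"type": "click"}, "collected")
event = stamp(event, "transmitted")   # after the device uploads it
event = stamp(event, "aggregated")    # after server-side processing
event = stamp(event, "reported")      # once visible in a dashboard

stamps = event["stamps"]
for (prev_stage, t0), (stage, t1) in zip(stamps, stamps[1:]):
    print(f"{prev_stage} -> {stage}: {t1 - t0:.3f}s")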
Streaming-based systems like ABsmartly tend to provide near real-time results within minutes, while data warehouses could introduce delays of up to 36 hours due to batch processing schedules. At companies like Booking.com, a mix of architectures resulted in latency windows ranging from a few minutes to several hours, depending on the system used.
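The difference between the two approaches can be sketched in a few lines of Python. Here a hypothetical streaming consumer updates running totals the moment an event arrives, while a batch job only refreshes them when its schedule fires; all field names are illustrative, not any specific platform's schema:

```python
# Hedged sketch contrasting streaming and batch aggregation; all field names
# are illustrative, not any specific platform's schema.
from collections import defaultdict

# Streaming: totals update the moment an event arrives, so reports lag by
# seconds or minutes at most.
totals = defaultdict(int)

def on_event(event: dict) -> None:
    totals[(event["experiment"], event["variant"], event["metric"])] += event["value"]

on_event({"experiment": "exp-42", "variant": "B", "metric": "purchases", "value": 1})

# Batch: the same totals only refresh when the scheduled job runs, so results
# can lag by hours (e.g. a nightly or twice-daily warehouse load).
def batch_job(all_events: list) -> dict:
    out = defaultdict(int)
    for e in all_events:
        out[(e["experiment"], e["variant"], e["metric"])] += e["value"]
    return out
```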
Why Is Real-Time Data Such a Game Changer for Product-Led Growth?
Real-time data in experimentation is transformative because it enables quicker, data-driven decisions. Immediate data enables teams to detect and address problems as they occur. If an experiment negatively impacts key metrics like reservations, returns, or cancellations, real-time insights can trigger an early warning, allowing teams to promptly investigate and mitigate potential issues, reduce risk, and potentially save the business thousands or even millions.
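As a rough illustration of such an early warning, the sketch below flags an experiment when the treatment's conversion rate is significantly below control. This is a generic one-sided two-proportion z-test with fabricated numbers, not ABsmartly's actual method:

```python
# Hedged sketch of an early-warning guardrail: flag a treatment that looks
# significantly worse than control. This is a generic one-sided two-proportion
# z-test, not ABsmartly's actual algorithm; all numbers are fabricated.
import math

def harm_p_value(control_conv: int, control_n: int,
                 treat_conv: int, treat_n: int) -> float:
    """One-sided p-value for 'treatment converts worse than control'."""
    p_c, p_t = control_conv / control_n, treat_conv / treat_n
    pooled = (control_conv + treat_conv) / (control_n + treat_n)
    se = math.sqrt(pooled * (1 - pooled) * (1 / control_n + 1 / treat_n))
    z = (p_t - p_c) / se
    return 0.5 * math.erfc(-z / math.sqrt(2))  # Phi(z): small when treatment is worse

# Example: 4.3% vs 4.8% conversion triggers the alert at alpha = 0.05.
if harm_p_value(480, 10_000, 430, 10_000) < 0.05:
    print("ALERT: treatment may be hurting conversions; investigate or stop.")
```

One caveat worth noting: checking results continuously means peeking many times, so production systems typically use sequential or always-valid tests rather than a fixed-horizon test like the one above.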
Real-time data allows developers and product teams to iterate rapidly. By quickly assessing the impact of changes, teams can refine their strategies and improve user experiences faster. With real-time monitoring, companies can adopt a risk-tolerant approach to experimentation. They can deploy changes more confidently, knowing they have the tools to catch negative outcomes early. This shift in risk perception empowers teams to try new ideas without fear of long-term negative consequences.
The ability to view results within minutes rather than days or weeks changes the entire landscape of experimentation. It encourages a proactive, data-centric approach to product development that prioritises speed, responsiveness, and continuous improvement.
Consequences of High Latency in Experimentation
When data takes hours or days to surface, it delays decision-making, and teams are left in the dark about the impact of their changes. This delay can mean missed opportunities to optimise products or services and could make negative outcomes worse if a problematic change goes undetected for too long. High latency often leads to conservative behaviour in deploying experiments. Teams become more cautious about rolling out changes because the delayed feedback loop makes it harder to identify and correct problems quickly. This risk aversion can stifle innovation and slow down the development cycle or experimentation flywheel.
Without real-time insights from experimentation platforms, teams may turn to traditional monitoring tools for immediate feedback. However, these tools lack the sensitivity and specificity of controlled experiments, potentially leading to incomplete or misleading conclusions.
Internal processes also become more cumbersome when data takes longer to process. Teams may need to adopt more extensive QA measures pre-launch, which can slow down the product development pipeline. High latency also forces teams to rely on intuition and hypothesis-driven decision-making rather than objective, data-driven insights.
Opportunities of Low Latency in Experimentation
Low-latency experimentation opens up a wealth of opportunities for organisations. One of the most significant advantages of real-time experimentation platforms is the ability to immediately identify bugs. Small issues, especially those affecting niche user segments, can be detected through experimentation data that traditional monitoring tools might overlook.
Real-time data enables teams to confidently deploy changes. With the ability to closely monitor the impact of new features, companies can mitigate risk by halting or adjusting experiments if they show negative trends. This shortened feedback loop allows for rapid iteration: teams can make small, incremental changes and quickly assess their impact. Real-time insights also allow teams to allocate resources efficiently. QA teams, for instance, can prioritise experiments showing potential issues, focusing on high-risk areas rather than conducting exhaustive pre-launch testing on every new feature.
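One way those niche-segment issues surface is by scanning per-segment results for localised regressions. A minimal sketch; the segments, rates, and threshold below are fabricated for illustration:

```python
# Illustrative sketch: scan per-segment results for localised regressions.
# The segments, rates, and threshold are fabricated for the example.
results = {
    "desktop":    {"control_rate": 0.050, "treatment_rate": 0.051},
    "mobile":     {"control_rate": 0.047, "treatment_rate": 0.046},
    "safari_ios": {"control_rate": 0.049, "treatment_rate": 0.021},  # hidden bug
}

THRESHOLD = -0.02  # flag drops of more than two percentage points

for segment, r in results.items():
    delta = r["treatment_rate"] - r["control_rate"]
    if delta < THRESHOLD:
        print(f"Possible regression in segment '{segment}': {delta:+.3f}")
```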
How and When Would You Use Real-Time Experimentation?
Real-time data should be leveraged whenever teams need immediate feedback to guide decision-making. Real-time experimentation is particularly valuable in these scenarios:
Feature Rollouts: To monitor the release of new features. If negative trends are detected, feature flags can be used to quickly disable or adjust the feature for specific user segments (see the sketch after this list).
Bug Fixes: Running a controlled experiment for bug fixes allows teams to ensure that the change doesn't introduce new issues, providing an additional safety net.
Risky or Uncertain Changes: When testing changes with potentially high risk (e.g., major UI overhauls).
Marketing Campaigns: Real-time data is important in monitoring user behaviour in response to marketing campaigns. Immediate insights can guide adjustments to campaign strategies.
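As a rough illustration of that feature-flag kill switch, here is a minimal Python sketch using a hypothetical in-memory flag store; real platforms manage flags and segment targeting remotely through their SDKs:

```python
# Minimal sketch of a feature-flag kill switch, using a hypothetical in-memory
# flag store; real platforms manage flags and targeting remotely via SDKs.
from dataclasses import dataclass, field

@dataclass
class FeatureFlag:
    name: str
    enabled: bool = True
    disabled_segments: set = field(default_factory=set)

    def is_on(self, user_segment: str) -> bool:
        return self.enabled and user_segment not in self.disabled_segments

checkout_redesign = FeatureFlag("checkout_redesign")

# Real-time monitoring spots a negative trend for mobile users:
checkout_redesign.disabled_segments.add("mobile")

print(checkout_redesign.is_on("desktop"))  # True
print(checkout_redesign.is_on("mobile"))   # False
```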
The Importance of Post-Release Monitoring
Post-release monitoring is invaluable for identifying issues that may not have surfaced during initial testing. Real-time experimentation platforms act as a continuous testing layer in production, tracking user interactions and system performance after a change goes live.
For example, a feature that works well during testing may show unexpected behaviour in production due to unforeseen interactions with other systems or user segments. Real-time post-release monitoring helps identify such issues quickly, allowing teams to act before they cause widespread problems.
The Advantage of Using Controlled Experiments for Monitoring
Using controlled experiments (A/B tests) for monitoring gives teams a more sensitive and specific way to detect issues than traditional methods. Controlled experiments reveal the impact of changes on specific user segments, highlighting areas that may require attention.
If each change you want to make runs as its own experiment, you can pinpoint right away which change created a problem and immediately stop that experiment instead of rolling back a full code release. Separating the experiments also means you don’t have to examine every change you’ve made to identify where the problem came from, assuming the problem is even noticed in the first place. Without experiments, hidden breakages can take months to find. This level of granularity is often missing in general monitoring tools, which tend to provide aggregate data.
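A toy sketch of why this granularity helps: when each change ships behind its own experiment, a regression points directly at the offending change. All experiment names and conversion numbers below are fabricated:

```python
# Toy sketch: one experiment per change makes attribution immediate.
# All experiment names and conversion numbers are fabricated.
experiments = {
    "new-search-ranking": {"control": 0.062, "treatment": 0.063},
    "checkout-redesign":  {"control": 0.031, "treatment": 0.022},  # the culprit
    "faster-image-cdn":   {"control": 0.058, "treatment": 0.059},
}

for name, m in experiments.items():
    if m["treatment"] < m["control"] * 0.9:  # more than a 10% relative drop
        print(f"Stop experiment '{name}' instead of rolling back the whole release.")
```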
By using real-time experimentation as part of the post-release monitoring strategy, organisations can maintain a high standard of product quality while continuing to innovate and experiment.
Latency Has an Outsized Impact on Development
Data latency in experimentation platforms makes a huge difference in how teams operate, make decisions, and manage risk. Low-latency, real-time data can be a game-changer in creating an agile development environment, improving risk management, and enhancing the overall quality of products and services. On the other hand, high latency can lead to conservative behaviour, inefficient processes, and missed opportunities.
By embracing real-time data and integrating it into their experimentation strategies, organisations can confidently innovate, respond to user feedback, and make informed, data-driven decisions. While real-time monitoring isn’t a cure-all, its ability to provide immediate insights and early detection makes it an indispensable tool in modern product development. Companies must understand and manage latency to unlock the full potential of real-time experimentation and use it to drive their success.
To explore how ABsmartly’s real-time experimentation platform can accelerate your business growth, visit our pricing page for more information on tailored solutions.