Steve Wilkes

21 Posts

Striim Achieves SOC 2 Type II Certification

At Striim we recognize the essential role that our software plays in the data architecture of our customers. Our unified real-time data integration and stream processing platform, and our fully managed SaaS data products in Striim Cloud, are the vital engines that drive the data for many mission-critical applications. Our customers need to trust us, and our software, to be secure and available.

Nine months ago we announced our SOC 2 Type I certification. To further this trust, we are very excited to announce that Striim has now achieved SOC 2 Type II certification.

A SOC 2 assessment report provides detailed information and assurance about an organization’s security, confidentiality, availability, processing integrity, and/or privacy controls, based on its compliance with the American Institute of Certified Public Accountants (AICPA) Trust Services Principles and Criteria for Security. A SOC 2 report is often the primary document that our customers’ security departments rely on to assess Striim’s ability to maintain adequate security, and reviewing such documents is itself often required by SOC 2 controls.

SOC 2 compliance comes in two forms: the SOC 2 Type I report, which describes the design of the controls we have in place to meet relevant trust criteria at a specific point in time; and the SOC 2 Type II report, which details the operational effectiveness of those controls over a specified period of time. These reports are the results of audits performed by independent third parties, in our case Grant Thornton LLP.

We completed SOC 2 Type I last year, and successfully operated the controls for a period of nine months in order to become SOC 2 Type II certified. The controls that the audit covers include Striim as a corporation, our on-premises platform, and the Striim Cloud managed SaaS offering. They cover infrastructure, software, devices, people, data, and our corporate and customer policies, procedures, and processes.

To achieve this certification, we relied on the investments we made for SOC 2 Type I certification in defining processes, policies, and procedures, as well as training and utilization of technologies. Continual internal audits ensured we were meeting our goals and not straying from the many controls we have in place. This required the continual efforts of a cross-functional team, with contributions from executive management, security, human resources, engineering, infrastructure, and legal departments.

SOC 2 is not just a certification, nor something you do once to gain a check mark. The annual audits require that the controls, and the processes around them, are ingrained into the DNA of every Striimer, and the insight gained during the process is a stepping stone to broader and industry-specific certifications.

This is just the start of our journey, so stay tuned for further exciting updates. The SOC 2 Type II report is available on request for our customers and those in the process of evaluating Striim.

Striim is now SOC 2 Type 1 Certified


As a unified data streaming and integration company, the Striim platform sits at the heart of our customers’ data architecture. It is crucial that our customers trust our software, and our company, to always do the right thing from a security perspective.

With that in mind, we are thrilled to announce that Striim is now officially SOC 2 Type 1 certified. 

A SOC 2 assessment report provides detailed information and assurance about an organization’s security, confidentiality, availability, processing integrity, and/or privacy controls, based on its compliance with the American Institute of Certified Public Accountants (AICPA) Trust Services Principles and Criteria for Security. A SOC 2 report is often the primary document that our customers’ security departments rely on to assess Striim’s ability to maintain adequate security.

SOC 2 compliance comes in two forms: the SOC 2 Type 1 report, which describes the design of the controls we have in place to meet relevant trust criteria at a specific point in time; and the SOC 2 Type 2 report, which details the operational effectiveness of those controls over a specified period of time. These reports are the results of audits performed by independent third parties, in our case Grant Thornton LLP.

We have completed SOC 2 Type 1 and are now undergoing the requisite period of assessment to complete SOC 2 Type 2.

To achieve this certification, we have undergone a year-long effort to ensure that our people, principles, and processes are fully aligned with the level of security our customers would expect from a SaaS company. This has involved investments in training and new technologies to help automate processes and protect infrastructure, and a lot of documentation, reporting, and continual internal reviews.

The scope of the report covers all people, systems, and processes involved in getting the Striim software into the hands of our customers, whether they are using Striim on-premise, in their own cloud environment, utilizing containers, or are one of the initial Striim Cloud private preview customers.


SOC 2 is not just a certification; it is a way of thinking, and a journey that requires a deep dive into everything you do. Completing this certification has given us the opportunity to solidify security as the number one operating principle within the company, and to ensure that all actions involve security considerations. Now that we have all of the required controls in place, we are working diligently to show that we can maintain those controls throughout the year, as we work towards SOC 2 Type 2 certification. We’ll keep you posted.


Striim Overview – Real-Time Enterprise Data Integration to the Cloud

A 3 minute overview of the Striim platform and common use-cases.

Striim’s real-time enterprise data integration platform is a next-generation, cloud-based platform that can ingest real-time data from a variety of sources, including change data from enterprise databases such as Oracle and Microsoft SQL Server, and rapidly deliver it to your cloud systems such as Google Cloud, Azure, and AWS.

Video Transcription:

Cloud adoption is an essential part of your digital transformation journey.

Whether you are modernizing legacy applications and databases, or have a ‘cloud-first’ strategy for all new applications and analytics, you have to consider how to integrate data from diverse sources, much of which may reside on-premises or in other clouds, and deliver it where it needs to go in a timely and non-disruptive fashion.

How do you collect, move, process, and deliver data from existing and legacy sources to your new cloud technologies in a continuous, scalable, and reliable way? How do you address this without interruption to your business applications? 

That’s why you need Striim’s real-time data integration platform, a next-generation, cloud-based platform with built-in intelligence and AI.

Striim can ingest real-time data from a variety of sources, including change data from enterprise databases such as Oracle and Microsoft SQL Server, and rapidly deliver it to your cloud systems such as Google Cloud, Azure and AWS.

While the data is moving, it’s easy to filter, transform, enrich, and correlate this data, using simple SQL-based transformations, to get it into the correct form for the target.

Real-time data delivery validation, monitoring, and alerts give you visibility into your continuous data pipelines, providing enterprise-grade real-time integration in a scalable, reliable, and secure platform.

Financial organizations are using Striim to migrate legacy enterprise databases to the cloud, without taking any downtime, through our wizards and intuitive UI.

Global delivery companies and retailers are continuously feeding data to the cloud for real-time reporting and operational intelligence.

And large-scale cloud analytics driven by continuous data from Striim is powering real-time decision making in education and healthcare, with continuous monitoring of their data flows in real time as part of the solution.

Built with the cloud in mind, Striim can scale with your workloads, and provides the high availability, reliability, and security you would expect from a mission-critical piece of your cloud transformation.

You can try Striim from our website, find us in all major cloud marketplaces, or contact us for a demo tailored to your exact use case. 

Stream to the cloud, today, with Striim.


Striim Migration Service to Google Cloud Tutorials

Striim Migration Service for PostgreSQL to Cloud SQL for PostgreSQL

In this tutorial you will learn how you can use Striim to migrate an on-premise PostgreSQL database to Cloud SQL for PostgreSQL in Google Cloud, through Striim’s wizard-based UI and intuitive data pipelines, with zero database downtime.

Striim Migration Service for Oracle to Cloud SQL for PostgreSQL

Migrating from Oracle to Cloud SQL in Google Cloud opens up cloud services that offer a wealth of capabilities with low management overhead and cost. But moving your existing on-premises applications to the cloud can be a challenge, because they are built on top of on-premises deployments of databases like Oracle. In this tutorial we step you through using a database technology called Change Data Capture to synchronize data from Oracle into a Cloud SQL instance.

Striim Migration Service for MySQL to Cloud SQL for MySQL

This guide provides a really quick and easy way to synchronize an on-premises instance of MySQL to Cloud SQL using Striim. You can then start using the cloud database to run additional applications or do data analysis, without affecting the performance and use of your existing system.

Striim Migration Service for SQL Server to Cloud SQL for SQL Server

This guide provides a really quick and easy way to synchronize an on-premises instance of SQL Server to a Cloud SQL database in Google Cloud using Striim. You can then start using the cloud database to run additional applications or do data analysis, without affecting the performance and use of your existing system.

Striim Migration Service for Oracle to Cloud Spanner

In this tutorial you will learn how you can use Striim to migrate an on-premise Oracle database to Cloud Spanner, through Striim’s wizard-based UI and intuitive data pipelines, with zero database downtime.

Striim Product Demo – Oracle To Cloud Spanner

In this demo, you are going to see how you can use Striim to continuously move data from Oracle to Google Cloud Spanner. We will show you how to use Striim’s wizards and intuitive UI to build data flows; run the data flows to collect data from Oracle using Change Data Capture, and deliver it in real-time to Cloud Spanner; and see continuous monitoring of your cloud migration solution.

Video Transcription:

In this demo, you are going to see how you can use Striim to continuously move data from Oracle to Google Cloud Spanner. We will show you how to use Striim’s wizards and intuitive UI to build data flows; run the data flows to collect data from Oracle using Change Data Capture, and deliver it in real-time to Cloud Spanner; and see continuous monitoring of your cloud migration solution.

Performing streaming data integration with Striim starts with our wizards. We will select Oracle as the source, and Cloud Spanner as the target. After clicking the wizard and entering a name for our data flow, you just need to complete a few simple steps.

First, you will configure the source. Enter the necessary information to connect to the source and click Next. Don’t worry: any sensitive information, like passwords, is encrypted. The wizard will check that the connection information is correct, and that the connection has the correct privileges and supports change data capture.

Next you select the tables that you are interested in collecting real-time data from. You can change this selection afterwards, so start with a few tables initially. Finally you need to configure the target connection information, including how the source data is mapped to target tables. 

When you complete the wizard, a data flow is created from the information you entered. You can see the source and target configuration here. To start the data flow, first deploy it to get it ready to run, then start it to begin collecting data from Oracle and delivering it to Cloud Spanner.

Initially, there is no data flowing, because we are not generating any new data in Oracle. You can see from the UI for Cloud Spanner that there is no data present in any of the target tables.

Now we will run a data generator for Oracle that creates a set of inserts, updates and deletes. You can see the data in the data flow preview window, and view the rate of data collection and delivery in the UI. We can also look at the application progress here to see a summary view of your tables. After a number of operations have been generated, we can check back with the Cloud Spanner UI and see the data in the target tables.

Of course, Striim can perform initial loads as well through similar data flows. Here we are moving a million rows from tables in Oracle to Cloud Spanner using our smart delivery pipeline. You can monitor the progress through the Striim UI, and, if we switch to the Cloud Spanner UI, you can see the data in the target.

We can also use the Striim monitor UI to look at overall metrics, and drill down to see the application statistics, and detailed information for each of the application components.

This has been a quick demo of using Striim to deliver data continuously from Oracle to Cloud Spanner. Please go to our website to try Striim yourself, find Striim in the Google Cloud Marketplace, or contact us to learn more.


Online Enterprise Database Migration to Google Cloud

Migrate to cloud

Migrating existing workloads to the cloud is a formidable step in the journey of digital transformation for enterprises. Moving an enterprise application from on-premises to run in the cloud, or modernizing it to make the best use of cloud-native technologies, is only part of the challenge. A major part of the task is to move the existing enterprise databases while the business continues to operate at full speed.

Pause never

How the data is extracted and loaded into the new cloud environment plays a big role in keeping the business critical systems performant. Particularly for enterprise databases supporting mission-critical applications, avoiding downtime is a must-have requirement during migrations to minimize both the risk and operational disruption.

For business-critical applications, the acceptable downtime rapidly approaches zero. All the while, moving large amounts of data and the essential testing of business-critical applications can take days, weeks, or even months.

Keep running your business

The best practice in enterprise database migration, to minimize or even altogether eliminate downtime, is to use an online database migration that keeps the application running.

In an online migration, changes from the enterprise source database are captured non-intrusively as real-time data streams using Change Data Capture (CDC) technology. This capability is available for most major databases, including Oracle, Microsoft SQL Server, HPE NonStop, MySQL, PostgreSQL, MongoDB, and Amazon RDS, but it has to be harnessed in the correct way.

In an online database migration, you first load the source database to the cloud. Then, any changes in the source database that occurred while the initial load was running are applied to the target cloud database continuously from the real-time data stream. The source and target databases remain in sync until you are ready to cut over completely. You also have the option to fall back to the source at any point, further minimizing risk.
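To make the sequencing concrete, here is a toy, runnable sketch of that flow, with invented data and no Striim APIs: snapshot the existing rows, then replay the changes that were captured while the load ran.

```python
# A toy, runnable sketch of the online-migration sequence described above:
# take a consistent snapshot, bulk-load it, then apply the buffered CDC
# changes before cutover. All names and data here are illustrative.

source_rows = {1: "alice", 2: "bob"}           # existing data to bulk-load
change_log = [("insert", 3, "carol"),           # changes made during the load
              ("update", 1, "alice2"),
              ("delete", 2, None)]

target = {}

# Step 1: initial load of the snapshot.
target.update(source_rows)

# Step 2: apply the changes captured (via CDC) while the load was running.
for op, key, value in change_log:
    if op == "delete":
        target.pop(key, None)
    else:                                       # insert or update
        target[key] = value

print(target)   # {1: 'alice2', 3: 'carol'} -> in sync, ready to cut over
```

In a real migration the change stream keeps flowing after this point, so the target stays current until you choose to cut over.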

Integrate continuously

Online database migration also provides essential data integration services for new application development in the cloud. The change delivery can be kept running while you develop and test the new cloud applications. You may even choose to keep the target and source databases in sync indefinitely, typically for continuous database replication in hybrid or multi-cloud use cases.

Keep fresh

Once the real-time streaming data pipelines to the cloud are set up, businesses can easily build new applications, and seamlessly adopt new cloud services to get the most operational value from the cloud environment. Real-time streaming is a crucial element in all such data movement use cases, and it can be widely applied to hybrid or multi-cloud architectures, operational machine learning, analytics offloading, large scale cloud analytics, or any other scenario where having up-to-the-second data is essential to the business.

Change Data Capture

Striim, in strategic partnership with Google Cloud, offers online database migrations and real-time hybrid cloud data integration to Google Cloud through non-intrusive Change Data Capture (CDC) technologies. Striim enables real-time continuous data integration from on-premises and other cloud data sources to BigQuery, Cloud Spanner, Cloud SQL for PostgreSQL, MySQL, and SQL Server, as well as Cloud Pub/Sub, Cloud Storage, and other databases running in Google Cloud.

Replicate to Google Cloud

In addition to data migration, data replication is an important use case as well. In contrast to migration, replication continuously copies data from a source system to a target system “forever,” with no intent to shut down the source system.

An example target system in the context of data replication is BigQuery, the data analytics platform of choice in Google Cloud. Striim supports continuous data streaming (replication) from an on-premises database to BigQuery in Google Cloud for cases where the data has to remain on-premises and cannot be migrated. Striim bridges the two worlds and makes Google Cloud data analytics accessible by supporting the hybrid environment.

Transform in flight

Data migration and continuous streaming in many cases transport the data unmodified from the source to the target systems. However, many use cases require data to be transformed to match the target systems, or enriched and combined with data from different sources, in order to complement and complete the target data set for increased value and expressiveness in a simple and robust architecture. This method is frequently referred to as Extract, Transform, Load, or ETL.

Striim provides a very flexible and powerful in-flight transformation and augmentation functionality in order to support use cases that go beyond simple one-time data migration.

More to migrate? Keep replicating!

Enterprises in general have several data migration and online streaming use cases at the same time. Often data migration takes place for some source databases, while data replication is ongoing for others.

A single Striim installation can support several use cases at the same time, reducing the need for management and operational supervision. The Striim platform supports high-volume, high-velocity data with built-in validation, security, high availability, reliability, and scalability, as well as backup-driven disaster recovery, addressing enterprise requirements and operational excellence.

The following architecture shows an example where migration and online streaming are implemented at the same time. On the left, the database in the cloud is migrated to the Cloud SQL database on the right. After a successful migration the source database will be removed. In addition, the two source databases on the left, in an on-premises data center, are continuously streamed (replicated) to BigQuery for analytics and to Cloud Spanner for in-cloud processing.

Keep going

In addition, Striim, as the data migration technology, is deployed in a high-availability configuration. The three servers on Compute Engine form a cluster, and each server runs in a different zone, making the cluster highly available and protecting the migration and online streaming from zone failures or outages.

Accelerate Cloud adoption

As organizations modernize their data infrastructure, integrating mission-critical databases is essential to ensure information is accessible, valuable, and actionable. Striim and Google Cloud’s partnership supports Google customers with smooth data movement and continuous integration solutions, accelerating Google Cloud adoption and driving business growth.

Learn more

To learn more about enterprise cloud data integration, feel free to reach out to Striim and check out these references:

Google Cloud Solution Architecture: Architecting database migration and replication using Striim

Blog: Zero downtime database migration and replication to and from Cloud Spanner

Tutorial: Migrating from MySQL to BigQuery for Real-Time Data Analytics


Striim 3.10.1 Further Speeds Cloud Adoption


We are pleased to announce the general availability of Striim 3.10.1, which includes support for new and enhanced cloud targets, extends manageability and diagnostics capabilities, and introduces new ease-of-use features to speed our customers’ cloud adoption. Key features released in Striim 3.10.1 are directly available through Snowflake Partner Connect to enable rapid movement of enterprise data into Snowflake.

Striim 3.10.1 Focus Areas Including Cloud Adoption

This new release introduces many new features and capabilities, summarized here:

3.10.1 Features Summary


Let’s review the key themes and features of this new release, starting with the new and expanded cloud targets.

Striim on Snowflake Partner Connect

From Snowflake Partner Connect, customers can launch a trial Striim Cloud instance directly from the Snowflake UI as part of the Snowflake on-boarding process, and load data, optionally with change data capture, directly into Snowflake from any of our supported sources. You can read about this in a separate blog post.

Expanded Support for Cloud Targets to Further Enhance Cloud Adoption

The Striim platform has been chosen as a standard for our customers’ cloud adoption use cases partly because of the wide range of cloud targets it supports. Striim provides integration with databases, data warehouses, storage, messaging systems, and other technologies across all three major cloud environments.

A major enhancement is the introduction of support for the Google BigQuery Streaming API. This not only enables real-time analytics on large-scale data in BigQuery by ensuring that data is available within seconds of its creation, but also helps with quota issues that can be faced by high-volume customers. The integration through the BigQuery Streaming API can support data transfer at rates of up to 1 GB per second.
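For context, the underlying mechanism is Google’s streaming insert API, which makes rows queryable almost immediately. Below is a minimal sketch using the google-cloud-bigquery client; the project, dataset, and table names are invented, and Striim’s own integration is internal to the platform, so this only illustrates the mechanism.

```python
# Minimal sketch of a BigQuery streaming insert (tabledata.insertAll) using
# the google-cloud-bigquery client library. IDs below are made up.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")          # assumed project
rows = [{"order_id": 7, "amount_usd": 99.99}]            # example payload

# Streamed rows become available for query within seconds.
errors = client.insert_rows_json("my-project.sales.orders", rows)
if errors:
    raise RuntimeError(f"streaming insert failed: {errors}")
```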

In addition to this, Striim 3.10.1 also has the following enhancements:

  • Optimized delivery to Snowflake and Azure Synapse that compacts multiple operations on the same data into a single operation on the target, resulting in much lower change volume (see the sketch after this list)
  • Delivery to MongoDB cloud and the MongoDB API for Azure Cosmos DB
  • Delivery to Apache Cassandra, DataStax Cassandra, and the Cassandra API for Azure Cosmos DB
  • Support for delivery of data in Parquet format to cloud storage and cloud data lakes, to further support cloud analytics environments
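As a rough illustration of the compaction idea in the first bullet, the sketch below (plain Python, not Striim’s implementation) folds a batch of CDC operations down to one net operation per row.

```python
# Illustrative sketch of compacting several CDC operations on the same row
# into one net operation for the target; not Striim's actual algorithm.

def compact(ops):
    """ops: list of (op, key, value); returns one net op per key."""
    net = {}                                    # key -> (op, value)
    for op, key, value in ops:
        prev = net.get(key)
        if op == "delete":
            if prev and prev[0] == "insert":
                del net[key]                    # insert then delete cancels out
            else:
                net[key] = ("delete", None)
        elif op == "update" and prev and prev[0] == "insert":
            net[key] = ("insert", value)        # insert+update -> one insert
        else:
            net[key] = (op, value)
    return [(op, key, value) for key, (op, value) in net.items()]

ops = [("insert", 1, "a"), ("update", 1, "b"), ("update", 1, "c"),
       ("insert", 2, "x"), ("delete", 2, None)]
print(compact(ops))   # [('insert', 1, 'c')] -- five operations become one
```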

Schema Conversion to Simplify Cloud Adoption Workflows

As part of many cloud migration or cloud integration use cases, especially during the initial phases, developers often need to create target schemas to match those of the source data. Striim adds the capability to use source schema information from popular databases such as Oracle, SQL Server, and PostgreSQL to create appropriate target schemas in cloud targets such as Google BigQuery and Snowflake. Importantly, these conversions understand the data type and structure differences between heterogeneous sources and targets, and act intelligently to spot problems and inconsistencies before progressing to data movement, simplifying cloud adoption.
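As a hypothetical illustration of what such a conversion involves, the sketch below maps a few Oracle column types to BigQuery types and fails fast on anything it cannot map. The mappings and names are illustrative assumptions, not Striim’s actual conversion rules.

```python
# Illustrative heterogeneous type mapping; not Striim's actual rules.
ORACLE_TO_BIGQUERY = {
    "VARCHAR2":  "STRING",
    "NUMBER":    "NUMERIC",
    "DATE":      "DATETIME",
    "TIMESTAMP": "TIMESTAMP",
    "CLOB":      "STRING",
}

def convert_column(name, oracle_type):
    bq_type = ORACLE_TO_BIGQUERY.get(oracle_type)
    if bq_type is None:
        # surface inconsistencies before any data movement begins
        raise ValueError(f"no BigQuery mapping for {name} ({oracle_type})")
    return f"{name} {bq_type}"

cols = [("ORDER_ID", "NUMBER"), ("NOTES", "CLOB")]
ddl = ", ".join(convert_column(n, t) for n, t in cols)
print(f"CREATE TABLE sales.orders ({ddl})")
```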

Enhanced Monitoring, Alerting and Diagnostics

Ongoing data movement between on-premises and cloud environments, whether for migrations or for powering reporting and analytics solutions, is often part of an enterprise’s critical applications. As such, it demands deep insight into the status of all active data flows.

Striim 3.10.1 adds the capability to monitor data from its creation in the source to its successful delivery in a target, generate detailed lag reports, and alert on situations where lag falls outside of SLAs.

End to End Lag Visualization
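Conceptually, the lag calculation behind such reports and alerts looks something like the following sketch; the field name and the 30-second SLA are assumptions for illustration, not Striim’s monitoring API.

```python
# Toy sketch of end-to-end lag tracking with an SLA alert; the threshold
# and field names are illustrative, not Striim's monitoring API.
from datetime import datetime, timezone

SLA_SECONDS = 30                         # assumed lag SLA

def check_lag(event_created_at: datetime) -> float:
    """Lag = time from creation in the source to delivery at the target."""
    lag = (datetime.now(timezone.utc) - event_created_at).total_seconds()
    if lag > SLA_SECONDS:
        print(f"ALERT: end-to-end lag {lag:.1f}s exceeds {SLA_SECONDS}s SLA")
    return lag

# Called as each event is confirmed written to the target:
# check_lag(event.source_commit_time)
```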

In addition, this release provides detailed status on checkpointing information for recovery and high availability scenarios, with insight into checkpointing history and currency.

Real-time Checkpointing Information

Simplifies Working with Complex Data

As customers work with heterogeneous environments and adopt more complex integration scenarios, they often have to work with complex data types or perform necessary data conversions. While this has always been possible through user-defined functions, this release adds multiple commonly requested data manipulation functions out of the box. These simplify working with JSON data and document structures, while also facilitating data cleansing and regular expression operations.
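To give a flavor of the kind of manipulation these functions cover, here is a plain-Python illustration of JSON field extraction and regex-based cleansing; it is not Striim syntax, and the data is invented.

```python
# Plain-Python illustration of JSON extraction and regex cleansing, the
# kind of manipulation the new built-in functions address; not Striim syntax.
import json
import re

event = '{"user": {"id": 42, "phone": "(555) 123-4567"}}'
doc = json.loads(event)

user_id = doc["user"]["id"]                          # JSON field extraction
phone = re.sub(r"\D", "", doc["user"]["phone"])      # cleanse to digits only

print(user_id, phone)   # 42 5551234567
```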

On-Going Support for Enterprise Sources

As customers upgrade their environments or adopt new technologies, it is essential that their integration platform keeps pace. In Striim 3.10.1 we extend our support for the Oracle database to include Oracle 19c, including change data capture; add support for schema information and metadata for Oracle GoldenGate trails; and certify our support for Hive 3.1.0.

This has been a high-level view of the new features of Striim 3.10.1. There is a lot more to discover to aid you on your cloud adoption journey. If you would like to learn more about the new release, please reach out to schedule a demo with a Striim expert.

Streaming Data: The Nexus of Cloud-Modernized Analytics


On April 9th I am going to be having a conversation with Andrew Brust of GigaOm about the role of streaming integration in digital transformation initiatives, especially cloud modernization and real-time analytics. The format of this webinar is light on PowerPoint and rich in lively discussion and interaction, so we hope you can join us.

Streaming Data: The Nexus of Cloud-Modernized Analytics

April 9, 2020, 10:00 AM PDT / 1:00 PM EDT

Digital transformation is the integration of digital technology into all areas of a business, resulting in fundamental changes to how the business operates and how it delivers value to customers. The cloud has been the number one driving technology in the majority of such transformations. You may have a cloud-first strategy, with all new applications being built in the cloud, or you may need to migrate databases online without taking downtime. You may want to take advantage of cloud scale for infinite data storage, coupled with machine learning to gain new insights and make proactive decisions.

In all cases, the key component is data. The data for your new applications, cloud analytics, or data migration could originate on-premises, in another cloud, or from millions of IoT devices. It is essential that this data can be collected, processed, and delivered rapidly, reliably, and at scale. This is why streaming data is the key component of data modernization, and why streaming integration platforms are vital to the success of digital transformation initiatives.

In a modern data architecture, the goal is to harvest your existing data sources and enable your analysts and data scientists to provide value in the form of applications, visualizations, and alerts to your decision makers, customers, and partners.

In this webinar we will discuss the key aspects of this architecture, including the role of change data capture (CDC) and IoT technologies in data collection, options for data processing, and the differing requirements for data delivery. You will also learn how streaming integration platforms can be utilized for cloud modernization, large-scale and streaming analytics, and machine learning operationalization, in a reliable and scalable way.

I hope you can join us on April 9th, and see why streaming integration is the engine of data modernization for digital transformation.


Real-Time Data is for Much More Than Just Analytics

Striim’s article “Real-Time Data is for Much More Than Just Analytics” was originally published on Forbes.

The conversation around real-time data, fast data and streaming data is getting louder and more energetic. As the age of big data fades into the sunset — and many industry folks are even reluctant to use the term — there is much more focus on fast data and obtaining timely insights. The focus of many of these discussions is on real-time analytics (otherwise known as streaming analytics), but this only scratches the surface of what real-time data can be used for.

If you look at how real-time data pipelines are actually being utilized, you find that about 75% of the use cases are integration related. That is, continuous data collection creates real-time data streams, which are processed and enriched and then delivered to other systems. Often these other systems are not themselves streaming. The target could be a database, data warehouse or cloud storage, with a goal of ensuring that these systems are always up to date. This leaves only about 25% of companies doing immediate streaming analytics on real-time data. But these are the use cases that are getting much more attention.

There are many reasons why streaming data integration is more common, but the main reason is quite simple: This is a relatively new technology, and you cannot do streaming analytics without first sourcing real-time data. This is known as a “streaming first” data architecture, where the first problem to solve is obtaining real-time data feeds.

Organizations can be quite pragmatic about this and approach stream-enabling their sources on a need-to-have, use-case-specific basis. This could be because batch ETL systems no longer scale or batch windows have gone away in a 24/7 enterprise. Or, they want to move to more modern technologies, which are most suitable for the task at hand, and keep them continually up to date as part of a digital transformation initiative.

Cloud Is Driving Streaming Data Integration

The rise of cloud has made a streaming-first approach to data integration much more attractive. Simple use cases, like migrating an on-premise database that services an in-house business application to the cloud, are often not even viable without streaming data integration.

The naive approach would be to back up the database, load it into the cloud and point the cloud application at it. However, this assumes a few things:

1. You can afford application downtime.

2. Your application can be stopped while you are doing this.

3. You can spin up and use the cloud application without testing it.

For most business-critical applications, none of these things are true.

A better approach to minimizing or eliminating downtime is an online migration that keeps the application running. To perform this task, capture changes from the in-house database as real-time data streams, using a technology called change data capture (CDC); load the database to the cloud; then apply any changes from the real-time stream that happened while you were doing the loading. The change delivery to the cloud can be kept running while you test the cloud application, and when you cut over, it will already be up to date.

Streaming data integration is a crucial element of this type of use case, and it can also be applied to cloud bursting, operational machine learning, large scale cloud analytics or any other scenario where having up-to-the-second data is essential.

Streaming Data Integration Is The Precursor To Streaming Analytics

Once organizations are doing real-time data collection, typically for integration purposes, it then opens the door to doing streaming analytics. But you can’t put the cart before the horse and do streaming analytics unless you already have streaming data.

Streaming analytics also requires prepared data. It’s a commonly cited metric that 80% of the time spent in data science goes to data preparation. This is true for machine learning, and also true for streaming analytics. Obtaining the real-time data feed is just the beginning. You may also need to transform, join, cleanse, and enrich data streams to give the data more context before performing analytics.

As a simple example, imagine you are performing CDC on a source database and have a stream of orders being made by customers. In any well-normalized relational database, these order records are mostly just numbers relating to detail contained in other tables.

This might be perfect for a relational, transactional system, but it’s not very useful for analytics. However, if you can join the streaming data with reference data for customers and items, you have now added more context and more value. The analytics can now show real-time sales by customer location or item category and truly provide business insights.
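A minimal sketch of that enrichment, with invented reference data, might look like the following: each order event is joined against customer and item lookups, then folded into a running aggregate of sales by region and category.

```python
# Minimal sketch of enriching a stream of normalized order events with
# customer and item reference data, then aggregating sales by region and
# category. All data values are invented for illustration.
from collections import defaultdict

customers = {101: {"region": "US-West"}, 102: {"region": "EMEA"}}
items     = {55: {"category": "books"}, 56: {"category": "games"}}

sales_by_region_category = defaultdict(float)

def on_order(order):
    """Enrich one order event, then update the running aggregate."""
    region = customers[order["customer_id"]]["region"]
    category = items[order["item_id"]]["category"]
    sales_by_region_category[(region, category)] += order["amount"]

for order in [{"customer_id": 101, "item_id": 55, "amount": 20.0},
              {"customer_id": 102, "item_id": 55, "amount": 12.5}]:
    on_order(order)

print(dict(sales_by_region_category))
# {('US-West', 'books'): 20.0, ('EMEA', 'books'): 12.5}
```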

Without the processing steps of streaming data integration, the streaming analytics would lose value, again showing how important the real-time integration layer really is.

Busting The Myth That Real-Time Data Is Prohibitively Expensive

A final consideration is cost. Something that has been said repeatedly is that real-time systems are expensive and should only be used when absolutely necessary. The typically cited use cases are algorithmic trading and critical control systems.

While this may have been true in the past, the massive improvements in the price-performance equation for CPU and memory over the last few decades have made real-time systems, and in-memory processing in general, affordable for mass consumption. Coupled with cloud deployments and containerization, the capability to have real-time data streamed to any system is within reach of any enterprise.

While real-time analytics and instant operational insights may get the most publicity and represent the long-term goal of many organizations, the real workhorse behind the scenes is streaming data integration. 
