Striim is pleased to introduce a broad set of enhancements to our on-premise and cloud marketplace offerings that add new sources and targets, increase manageability, and further enhance performance.
Striim Cloud
Striim Overview – Real-Time Enterprise Data Integration to the Cloud
A three-minute overview of the Striim platform and common use cases.
Striim’s real-time enterprise data integration platform is a next-generation, cloud-based platform that can ingest real-time data from a variety of sources, including change data from enterprise databases such as Oracle and Microsoft SQL Server, and rapidly deliver it to your cloud systems such as Google Cloud, Azure, and AWS.
Video Transcription:
Cloud adoption is an essential part of your digital transformation journey.
Whether you are modernizing legacy applications and databases, or have a ‘cloud first’ strategy for all new applications and analytics, you have to consider how to integrate data from diverse sources – much of which may reside on-premise, or in other clouds – and move it to where it needs to go in a timely and non-disruptive fashion.
How do you collect, move, process, and deliver data from existing and legacy sources to your new cloud technologies in a continuous, scalable, and reliable way? How do you address this without interruption to your business applications?
That’s why you need Striim’s real-time data integration platform, a next-generation, cloud-based platform with built-in intelligence and AI.
Striim can ingest real-time data from a variety of sources, including change data from enterprise databases such as Oracle and Microsoft SQL Server, and rapidly deliver it to your cloud systems such as Google Cloud, Azure and AWS.
While the data is moving, it’s easy to filter, transform, enrich, and correlate this data, using simple SQL-based transformations, to get it into the correct form for the target.
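To make that concrete, here is a minimal sketch of what such a SQL-based transformation looks like as a Striim continuous query (CQ). The stream and field names are hypothetical, purely for illustration:

```sql
-- Hypothetical filter-and-project CQ; stream and field names are
-- illustrative, not taken from the video.
CREATE CQ FilterHighValueOrders
INSERT INTO HighValueOrderStream
SELECT o.orderId, o.customerId, o.amount
FROM OrderStream o
WHERE o.amount > 1000;   -- only high-value orders continue downstream
```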
Real-time data delivery validation, monitoring, and alerts provide visibility into your continuous data pipelines, delivering enterprise-grade real-time integration in a scalable, reliable, and secure platform.
Financial organizations are using Striim to migrate legacy enterprise databases to the cloud, without taking any downtime, through our wizards and intuitive UI.
Global delivery companies and retailers are continuously feeding data to the cloud for real-time reporting and operational intelligence.
And large-scale cloud analytics driven by continuous data from Striim is powering real-time decision making in education and healthcare, with continuous monitoring of the data flows in real time as part of the solution.
Built with the cloud in mind, Striim can scale with your workloads, and provides the high availability, reliability, and security you would expect from a mission-critical piece of your cloud transformation.
You can try Striim from our website, find us in all major cloud marketplaces, or contact us for a demo tailored to your exact use case.
Stream to the cloud, today, with Striim.
Striim Migration Service to Google Cloud Tutorials
Striim Migration Service for PostgreSQL to Cloud SQL for PostgreSQL
In this tutorial you will learn how you can use Striim to migrate an on-premise PostgreSQL database to Cloud SQL for PostgreSQL in Google Cloud, through Striim’s wizard-based UI and intuitive data pipelines, with zero database downtime.
Striim Migration Service for Oracle to Cloud SQL for PostgreSQL
Migrating from Oracle to Cloud SQL in Google Cloud opens up cloud services that offer a wealth of capabilities with low management overhead and cost. But moving your existing on-premises applications to the cloud can be a challenge, because they are built on top of on-premises deployments of databases like Oracle. In this tutorial we step you through using a database technology called Change Data Capture (CDC) to synchronize data from Oracle into a Cloud SQL for PostgreSQL instance.
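As a rough illustration of what such a CDC pipeline looks like in Striim’s SQL-based TQL; all connection values are placeholders, and the adapter names shown are Striim’s generic Oracle reader and JDBC writer, used here on the assumption that they fit this scenario:

```sql
-- Minimal sketch of an Oracle CDC to Cloud SQL for PostgreSQL flow.
-- Connection values are placeholders; see the tutorial for real setup.
CREATE APPLICATION OracleToCloudSQL;

CREATE SOURCE OracleCDC USING OracleReader (
  Username: 'striim',
  Password: '********',
  ConnectionURL: 'onprem-host:1521:ORCL',
  Tables: 'SALES.ORDERS'
) OUTPUT TO OrdersChangeStream;

CREATE TARGET CloudSQLTarget USING DatabaseWriter (
  Username: 'postgres',
  Password: '********',
  ConnectionURL: 'jdbc:postgresql://<cloud-sql-ip>:5432/sales',
  Tables: 'SALES.ORDERS,public.orders'   -- source-to-target table mapping
) INPUT FROM OrdersChangeStream;

END APPLICATION OracleToCloudSQL;
```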
Striim Migration Service for MySQL to Cloud SQL for MySQL
This guide provides a quick and easy way to synchronize an on-premises instance of MySQL to Cloud SQL using Striim. You can start using the cloud database to run additional applications or do data analysis, without affecting the performance and use of your existing system.
Striim Migration Service for SQL Server to Cloud SQL for SQL Server
This guide provides a quick and easy way to synchronize an on-premises instance of SQL Server to a Cloud SQL database in Google Cloud using Striim. You can start using the cloud database to run additional applications or do data analysis, without affecting the performance and use of your existing system.
Striim Migration Service for Oracle to Cloud Spanner
In this tutorial you will learn how you can use Striim to migrate an on-premise Oracle database to Cloud Spanner, through Striim’s wizard-based UI and intuitive data pipelines, with zero database downtime.
Striim Product Demo – Oracle To Cloud Spanner
In this demo, you are going to see how you can use Striim to continuously move data from Oracle to Google Cloud Spanner. We will show you how to use Striim’s wizards and intuitive UI to build data flows; run the data flows to collect data from Oracle using Change Data Capture, and deliver it in real-time to Cloud Spanner; and see continuous monitoring of your cloud migration solution.
Video Transcription:
In this demo, you are going to see how you can use Striim to continuously move data from Oracle to Google Cloud Spanner. We will show you how to use Striim’s wizards and intuitive UI to build data flows; run the data flows to collect data from Oracle using Change Data Capture, and deliver it in real-time to Cloud Spanner; and see continuous monitoring of your cloud migration solution.
Performing streaming data integration with Striim starts with our wizards. We will select Oracle as the source, and Cloud Spanner as the target. After clicking the wizard and entering a name for our data flow, you just need to complete a few simple steps.
First, you will configure the source. Enter the necessary information to connect to the source and click on next. Don’t worry, any secure information like passwords is encrypted. The wizard will check that the connection information is correct, and that the connection has the correct privileges and supports change data capture.
Next you select the tables that you are interested in collecting real-time data from. You can change this selection afterwards, so start with a few tables initially. Finally you need to configure the target connection information, including how the source data is mapped to target tables.
When you complete the wizard, a data flow is created from the information you entered. You can see the source and target configuration here. To start the data flow, first deploy it to get it ready to run, then start it to begin collecting data from Oracle and delivering it to Cloud Spanner.
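The same deploy-and-start sequence can also be issued from the Striim console; a minimal sketch, assuming the wizard saved the data flow as an application named OracleToSpanner (a hypothetical name):

```sql
-- Console commands; the application name is a placeholder.
DEPLOY APPLICATION OracleToSpanner;
START APPLICATION OracleToSpanner;
```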
Initially, there is no data flowing, because we are not generating any new data in Oracle. You can see from the UI for Cloud Spanner that there is no data present in any of the target tables.
Now we will run a data generator for Oracle that creates a set of inserts, updates and deletes. You can see the data in the data flow preview window, and view the rate of data collection and delivery in the UI. We can also look at the application progress here to see a summary view of your tables. After a number of operations have been generated, we can check back with the Cloud Spanner UI and see the data in the target tables.
Of course, Striim can perform initial loads as well through similar data flows. Here we are moving a million rows from tables in Oracle to Cloud Spanner using our smart delivery pipeline. You can monitor the progress through the Striim UI, and, if we switch to the Cloud Spanner UI, you can see the data in the target.
We can also use the Striim monitor UI to look at overall metrics, and drill down to see the application statistics, and detailed information for each of the application components.
This has been a quick demo of using Striim to deliver data continuously from Oracle to Cloud Spanner. Please go to our website to try Striim yourself, find Striim in the Google Cloud Marketplace, or contact us to learn more.
Case Study: How Macy’s Streamlined Retail Ops
Speakers
- Alok Pareek, Founder and EVP of Products, Striim
- Neel Chinta, Technology Manager, Engineering Databases, Macy’s
As retailers strive to meet the growing expectations of shoppers, they are turning to Google Cloud to transform their businesses and tackle opportunities in an increasingly challenging industry. From optimizing inventory management, to increasing collaboration between employees across locations and roles, to helping build omnichannel experiences for customers, Google Cloud is working together with retailers to help make the shopping experience as seamless and personalized as possible.
Striim, a premier technology partner for Google Cloud, delivers streaming data integration with intelligence, using customer behavior, sales, inventory, and other operational data to detect time-sensitive buyer opportunities and operational risks and notify you of them. It helps you make automated decisions with deeper, timelier customer insight, while bringing operational efficiencies that raise profitability.
A standout Google Cloud customer is Macy’s, one of the world’s largest retailers. Founded in 1858, Macy’s operates approximately 680 Macy’s and Bloomingdale’s stores and 190 specialty stores, including Bloomingdale’s The Outlet, Bluemercury, and Macy’s Backstage. Through macys.com, bloomingdales.com, and bluemercury.com, it also serves millions of customers across more than 100 countries.
By moving its infrastructure to the cloud, and taking advantage of Google Cloud data warehousing and analytics solutions, Macy’s is streamlining retail operational functions across its network.
Macy’s, Google, and Striim work together to build cloud technology solutions to improve digital and mobile experiences, site stability, store technology, fulfillment, and logistics, and integrate its front line and back office to reinvent retail.
Stream Data into Snowflake with Streaming Data Integration
In this video, learn why enterprises must stream data into Snowflake to take full advantage of this data warehouse built for the cloud.
To learn more about Striim for Snowflake Data Warehouse, visit our Snowflake solution page.
Video Transcription:
You chose Snowflake to provide rapid insights into your data on a massive scale, on AWS or Azure. However, most of your source data resides elsewhere – in a wide variety of on-premise or cloud sources. How do you continually move data to Snowflake in real-time, processing it along the way, so that your fast analytics and insights are reporting on timely data?
Snowflake was built for the cloud, and built for speed. By separating compute from storage you can easily scale up and down as needed. This gives you instant elasticity supporting any amount of data, and high speed queries for any number of users, coupled with the peace of mind provided by secure data sharing. The per-second pricing and support for multiple clouds allows you to choose your infrastructure and only pay when you are using the data warehouse.
However, residing in the cloud means you have to determine how to most effectively move data to Snowflake. This could be migrating an existing Teradata or Exadata data warehouse, or continually populating Snowflake with newly generated on-premises data from operational databases, logs, or device information. In order for the warehouse to provide up-to-date information, there should be as little latency as possible between the original data creation and its delivery to Snowflake.
The Striim platform can help with all these requirements and more. Our database adapters support change data capture, or CDC, from enterprise or cloud databases. CDC directly intercepts database activity and collects all the inserts, updates, and deletes as they happen, ready to stream into Snowflake. Adapters for machine logs and other files tail multiple files in parallel, streaming out data as it is written and removing the inherent latency of batch. Meanwhile, data from devices and messaging systems can be collected easily, independent of its format, through a variety of high-speed adapters and parsers.
After being collected continuously, the streaming data can be delivered directly into Snowflake with very low latency, or pushed through a data pipeline where it can be pre-processed through filtering, transformation, enrichment, and correlation using SQL-based queries, before delivery into Snowflake. This enables such things as data denormalization, change detection, de-duplication, and quality checking before the data is ever stored.
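For instance, a quality-checking step of the kind described above might be sketched as a CQ sitting between the CDC stream and the Snowflake target. Everything here is hypothetical: the stream and field names, and the SnowflakeWriter property values.

```sql
-- Hypothetical quality gate: drop malformed rows before Snowflake.
CREATE CQ QualityCheckOrders
INSERT INTO CleanOrderStream
SELECT o.orderId, o.customerId, o.amount, o.orderDate
FROM RawOrderStream o
WHERE o.orderId IS NOT NULL   -- reject rows missing a key
  AND o.amount >= 0;          -- reject obviously bad amounts

CREATE TARGET SnowflakeOrders USING SnowflakeWriter (
  ConnectionURL: 'jdbc:snowflake://<account>.snowflakecomputing.com',
  Username: 'striim',
  Password: '********',
  Tables: 'MYDB.PUBLIC.ORDERS'
) INPUT FROM CleanOrderStream;
```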
In addition to this, because Striim is an enterprise-grade platform, it can scale with Snowflake and reliably guarantee delivery of source data while also providing built-in dashboards and verification of data pipelines for operational monitoring purposes.
The Striim wizard-based UI enables users to rapidly create a new data flow to move data to Snowflake. In this example, real-time change data from Oracle is being continually delivered to Snowflake. The wizard walks you through all the configuration steps, checking that everything is set up properly, and results in a data flow application. This data flow can be enhanced to filter, transform, and enrich the data through SQL-based queries. In the video, we add a name and email address from a cache, based on an ID present in the original data.
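The cache-based enrichment shown in the video can be sketched in TQL roughly as follows. The type, cache, file, and field names are hypothetical stand-ins, not the actual configuration from the video:

```sql
-- Hypothetical lookup cache keyed on customer ID, loaded from a CSV file.
CREATE TYPE CustomerType (
  custId String,
  name   String,
  email  String
);

CREATE CACHE CustomerCache USING FileReader (
  directory: '/data/lookup',
  wildcard: 'customers.csv'
)
PARSE USING DSVParser ( header: true )
QUERY ( keytomap: 'custId' )
OF CustomerType;

-- Join each change event against the cache to add name and email.
CREATE CQ EnrichWithCustomer
INSERT INTO EnrichedStream
SELECT s.orderId, s.custId, c.name, c.email, s.amount
FROM OrderStream s, CustomerCache c
WHERE s.custId = c.custId;
```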
When the application is started, data flows in real-time from Oracle to Snowflake. Making changes in Oracle results in the transformed data being written continually to Snowflake, visible through the Snowflake UI.
Striim and Snowflake can change the way you do analytics, with Snowflake providing rapid insight to the real-time data provided by Striim. The data warehouse that is built for the cloud needs data delivered to the cloud, and Striim can continuously move data to Snowflake to support your business operations and decision-making.
To learn more about how Striim makes it easy to continuously move data to Snowflake, visit our Striim for Snowflake product page, schedule a demo with a Striim technologist, or download the platform and try it for yourself.
Rapid Adoption of Google Cloud SQL Using Streaming Integration & CDC
In this video, we will demonstrate how Striim can provide continuous data integration via CDC to Google Cloud SQL through a pipeline for the real-time collection, processing, and delivery of enterprise data, sourcing from Oracle on-prem.
Are you looking to migrate to Google Cloud Platform? Start our free 90-day migration service today!
To learn more about the Striim platform, visit our platform overview page.
This was originally published as a blog post here.
Unedited Transcript:
Rapid adoption of Google Cloud SQL starts with using Striim for streaming integration from any enterprise data source you want to move to Google Cloud SQL. Much of your data may currently be locked up elsewhere: in operational databases, data warehouses, legacy systems, and other locations. You need a new hybrid cloud integration strategy for the continuous movement of enterprise data to and from Google Cloud, with continuous collection, processing, and delivery of enterprise data in real time, not batch, to ensure Google Cloud SQL is always up to date. Data from on-premise and cloud sources needs to be delivered to Google Cloud SQL, including the one-time load and continuous change delivery, with in-flight processing to ensure up-to-the-second information for your users. That’s where Striim comes in. Striim is a next-generation streaming integration and intelligence platform that supports your hybrid cloud initiatives and integrates with multiple Google Cloud technologies. We will demonstrate how Striim can provide continuous data integration into Google Cloud SQL through a pipeline for the real-time collection, processing, and delivery of enterprise data, sourcing from Oracle on-premise.
In this case, we’ll be doing an initial load followed by continuous change delivery from Oracle to Google Cloud SQL. Striim’s UI makes it easy to continuously and non-intrusively ingest all your enterprise data from a variety of sources in real time. We’ll start by doing an initial load of data from Oracle on-premise to Google Cloud SQL using a data flow. When the flow is started, the full contents of the on-premise customer table are loaded into Google Cloud SQL. After a short time, all the rows in the source table are present in the Google Cloud SQL customer target table. This can be monitored using Striim and the Google Cloud monitor UI. Once the initial load is complete, we can continuously deliver changes using CDC from Oracle into the Google Cloud SQL instance. A separate flow is used so that the initial load and CDC can be coordinated. After many changes, you can see that Google Cloud SQL is completely up to date with the on-premise Oracle instance. The continuous updates can also be monitored through the Striim UI. You can see how Striim can enable your hybrid cloud initiatives and accelerate the adoption of Google Cloud SQL. Get started with Striim now with a trial download on our website, or contact us if you want to know more.
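A rough TQL sketch of the two coordinated flows described above; connection values are placeholders, table names are hypothetical, and the initial-load flow’s writer is omitted for brevity:

```sql
-- Flow 1: one-time initial load of existing rows via a JDBC query reader.
CREATE SOURCE InitialLoad USING DatabaseReader (
  Username: 'striim',
  Password: '********',
  ConnectionURL: 'jdbc:oracle:thin:@onprem-host:1521:ORCL',
  Tables: 'SALES.CUSTOMER'
) OUTPUT TO InitialLoadStream;

-- Flow 2: continuous change delivery via log-based CDC, run after the load.
CREATE SOURCE CustomerCDC USING OracleReader (
  Username: 'striim',
  Password: '********',
  ConnectionURL: 'onprem-host:1521:ORCL',
  Tables: 'SALES.CUSTOMER'
) OUTPUT TO CustomerChangeStream;

-- Each stream feeds a DatabaseWriter target pointing at Cloud SQL;
-- the CDC one is shown here.
CREATE TARGET CloudSQLCustomer USING DatabaseWriter (
  Username: 'striim',
  Password: '********',
  ConnectionURL: 'jdbc:mysql://<cloud-sql-ip>:3306/sales',
  Tables: 'SALES.CUSTOMER,sales.customer'
) INPUT FROM CustomerChangeStream;
```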
Google Cloud Next – Cloud Spanner Demo
Alok Pareek, EVP of Products at Striim, and Codin Pora, Director of Partner Technology at Striim, provide a demo of the Striim platform at Google Cloud Next SF, April 2019. Alok goes into detail about how Google Cloud users can move real-time data from a variety of sources into their Google Cloud Spanner environment using the Striim platform.
Unedited Transcript:
So with that, I’d like to invite Alok and Codin up to the stage to give us a demo of Spanner. Their company, Striim, is a strategic partner of ours that does replication and migration of data into Google Cloud. Thank you. Thank you.
Thank you, Tobias. So today I’m going to show a demonstration. You have these wonderful endpoints on the Google Cloud: how do you actually use them? How do you actually move your data into them? In this demo I’m going to talk about how we move real-time data from your applications, from an on-premise Oracle database, into Cloud Spanner. Before I get into the demo, just a little bit about Striim. Striim is a next-generation platform that helps in three solution categories: cloud adoption, hybrid cloud data integration, and in-memory stream processing. Today I’m going to be focusing on cloud adoption, specifically, how do we move data into Spanner? So with that, we’re going to jump into the demo.
Okay. So what you see on the screen is the landing page, and I’m gonna keep this going pretty fast. We’re going to step into the apps part of the demo; that’s where the data pipelines are defined that help you move the data from on-premise to Spanner. In this case, you are seeing two pipelines. One of them is meant to do an initial load, or an instantiation, of your existing data onto Cloud Spanner tables. The other one is meant to catch it up. While you are actually moving the data, you might have very large tables, for example, or massive volumes. So how do you go ahead and not lose any data, and keep all of the consistency properties that we heard about from Tobias earlier?
It’s important that while you are moving the data, you also don’t have disruption to your applications and to your business. So let’s step into the pipeline here. This is a very simple pipeline; it has a simple flow. At the top you have a data source, which in this case is Oracle, running on-premise. So we connect into this Oracle database. It has a line items table, and we’re going to show you a movement of about a hundred thousand records. There’s also an orders table, where we’re going to show you the delta processing. The way this application is constructed is by using these components on the left side of the UI in the flow designer: you drag and drop one of these components and push it into the pipeline.
And that’s how you actually construct your data flow. We can also step into the Spanner target definition; this is your service account, the connectivity, and the config for your Spanner instance. We’re gonna next deploy this application, or pipeline, and once we deploy it, you can see that I can actually run this within the Striim platform. This can be run either on-premise or on the Google Cloud. We want to show, Codin, that there’s nothing available yet in the tables on the Spanner side. So let’s go ahead and execute a query against the line item table. In this case you’re seeing that there are zero records there, and you can take my word that there are a hundred thousand records on the Oracle side.
In the interest of time we’ll assume that, and let’s go ahead and run the application. As soon as we run the application, you can see in the preview in the lower part of your screen the records running live. This is while we are uploading the data and applying it into Cloud Spanner. You can see that we have completed 100,000 records, and it was pretty fast. This morning I’d done a million records, so I was holding my breath there, but that was pretty fast as well. So now you can see that the data load is completed. I mentioned to you that there’s a second phase here: the change data capture phase. While you’re actually executing this query, of course, this query is consistent as of a specific snapshot.
In Oracle, there’s also DML activity against your application. So how do we actually take this data? This is the second pipeline now, so we can step into pipeline number two. Codin has already deployed it. In this case we use a special reader that operates against the redo logs of the Oracle database and monitors them. So it doesn’t actually have any impact on the production system per se; at least it’s not doing any query impact there. We grab the data from the redo logs and then we reapply that as DML, as inserts, updates, and so forth, on the Cloud Spanner system. So let’s go ahead and run this application. We are going to generate some DML using a data generator.
And let’s go ahead and run the generator, and you’ll see that there’s a number of inserts, updates, and deletes against the orders table. Now let’s switch over to the Cloud Spanner system and query the orders table here. As you can see, there’s data in the orders table; this was just propagated. So this is the two-phase, very fast demo of how you get data from your on-prem databases into Cloud Spanner. And of course this can work against other databases that we support as well. And this is available in the Google Cloud. So with that, I’m gonna hand the control back to Tobias.
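For readers who want to map the spoken walkthrough onto Striim’s SQL-based TQL, a hypothetical sketch of the second (CDC) pipeline might look like this; adapter property names and values are illustrative assumptions, not the demo’s actual configuration:

```sql
-- Hypothetical CDC flow from the demo: Oracle redo logs to Cloud Spanner.
CREATE SOURCE OrdersCDC USING OracleReader (
  Username: 'striim',
  Password: '********',
  ConnectionURL: 'onprem-host:1521:ORCL',
  Tables: 'DEMO.ORDERS'
) OUTPUT TO OrdersChangeStream;

CREATE TARGET SpannerOrders USING SpannerWriter (
  ServiceAccountKey: '/path/to/service-account-key.json',
  InstanceID: 'demo-instance',
  Tables: 'DEMO.ORDERS,orders'   -- source-to-target table mapping
) INPUT FROM OrdersChangeStream;
```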
Moving Real-Time Data to Azure Cosmos DB with Striim
In this video you will see how Striim can help feed Cosmos DB in real-time through our wizard-based UI and intuitive data pipelines.
Azure Cosmos DB is Microsoft’s globally distributed, multi-model database service. You have chosen Cosmos DB to store ever-increasing volumes of data and make this data available in milliseconds. However, most of your source data resides elsewhere – in a wide variety of on-premise or cloud sources. How do you continually move this data to Cosmos DB in real-time, so that your fast analytics and insights are reporting on timely data?
Video Transcription:
Azure Cosmos DB was built to achieve low latency and high availability in a globally distributed world. By elastically and independently scaling throughput and storage across multiple Azure regions worldwide, you can access your data when and where you want. And support for multiple models means you can use SQL, Cassandra, MongoDB, and other APIs to get to your data.
However, residing in the cloud means you have to determine how to move your existing data to Cosmos DB. This could be migrating an existing SQL Server, Oracle, MySQL, or PostgreSQL operational database, or continually populating Cosmos DB with newly generated on-premise data from logs, or device information. In order for Cosmos DB to provide up-to-date information, there should be as little latency as possible between the original data creation and its delivery to the cloud.
The Striim platform can help with all these requirements and more. Our database adapters support change data capture, or CDC, from enterprise or cloud databases. CDC directly intercepts database activity and collects all the inserts, updates, and deletes as they happen, ready to stream into Cosmos DB. Adapters for machine logs and other files tail multiple files in parallel, streaming out data as it is written and removing the inherent latency of batch. Meanwhile, data from devices and messaging systems can be collected easily, independent of its format, through a variety of high-speed adapters and parsers.
After being collected continuously, the streaming data can be delivered directly into Azure Cosmos DB with very low latency, or pushed through a data pipeline where it can be pre-processed through filtering, transformation, enrichment, and correlation using SQL-based queries, before delivery into Cosmos DB. This enables such things as data denormalization, change detection, deduplication, and quality checking before the data is ever stored.
In addition to this, because Striim is an enterprise-grade platform, it can scale with Cosmos DB and reliably guarantee delivery of source data, while also providing built-in dashboards and verification of data pipelines for operational monitoring purposes.
The Striim wizard-based UI enables users to rapidly create a new data flow to move data to Cosmos DB. In this example, real-time change data from Oracle is being continually delivered to Cosmos DB through the SQL API. The wizard walks you through all the configuration steps, checking that everything is set up properly, and results in a data flow application. This data flow can be enhanced to filter, transform, and enrich the data through SQL-based queries. Here we are adding a name and email address from a cache, based on an ID present in the original data.
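The enrichment itself is the same kind of cache join sketched in the Snowflake section above; what changes is the target. A hypothetical CosmosDBWriter target for the enriched stream might look like this, with every property value a placeholder:

```sql
-- Hypothetical Cosmos DB target (SQL API) for the enriched stream.
CREATE TARGET CosmosOrders USING CosmosDBWriter (
  ServiceEndpoint: 'https://<account>.documents.azure.com:443/',
  AccessKey: '********',
  Collections: 'ordersdb.orders'   -- database.collection mapping
) INPUT FROM EnrichedStream;
```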
When the application is started, data will begin flowing in real-time from Oracle to Cosmos DB. Making changes in Oracle results in the transformed data being written continually to Cosmos DB, as you can see through the Cosmos DB data explorer UI.
Of course, we are not limited to writing through the SQL API. In this example, we are writing Oracle data to a Cassandra model, which can be utilized directly by existing or new Cassandra applications. Here’s what the data looks like in this case.
Striim and Cosmos DB can change the way you do analytics, with Cosmos DB providing global rapid access to the real-time data provided by Striim. The globally distributed cloud database service needs data delivered to the cloud, and Striim can continually feed Cosmos DB with the data you need to run your business.
Try Striim and Cosmos DB today through the Striim for Real-Time Data Integration to Cosmos DB offering on the Azure Marketplace, to see your data how, where, and when you want it.