Striim Team

How to Build Decentralized Data Products with Data Streaming

In this period of economic uncertainty, data leaders are under increasing pressure to supply useful data to their teams so they can make quick decisions and provide exceptional customer experiences.

Data streaming and Change Data Capture can help decentralize data management and provide the necessary flexibility to meet the needs of various stakeholders.

In this live webinar, we will demonstrate how to source data from operational systems (OLTP databases, sensors, API data) and transform it into actionable Data Products, while meeting SLAs for uptime and delivery speeds. The presentation will be followed by a Q&A session. Save your spot and join us live!

 

Presented by:

John Kutay
Director of Product, Striim

What’s New In Data with Jacob Matson (S2E3)

In this episode, we host Jacob Matson, VP of Finance and Operations at Simetric. Jacob is a long-time data leader who dives deep into whether data engineers should ship production-ready code, as well as data streaming use cases in finance. He also talks about his love for DuckDB.

See it in action: Schema Evolution

With Striim’s schema evolution capabilities, you have full control whenever data drifts. Capture schema changes, configure how each consumer propagates them, or simply halt and alert when manual resolution is needed.
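The propagate-or-halt decision described above can be sketched as a simple per-change-type policy. This is an illustrative sketch only, not Striim’s actual API; the event shape and policy names are assumptions:

```python
# Illustrative sketch (not Striim's API): a consumer-side policy for
# handling schema-drift events captured by a CDC pipeline.
from dataclasses import dataclass

@dataclass
class SchemaChange:
    table: str
    kind: str      # e.g. "ADD_COLUMN", "DROP_COLUMN", "ALTER_TYPE"
    detail: str

# Per-consumer policy: propagate the change automatically, or halt and alert.
POLICY = {
    "ADD_COLUMN": "propagate",   # additive and usually safe downstream
    "DROP_COLUMN": "halt",       # destructive: require manual resolution
    "ALTER_TYPE": "halt",        # may break downstream readers
}

def handle(change: SchemaChange) -> str:
    action = POLICY.get(change.kind, "halt")  # default to the safe choice
    if action == "propagate":
        return f"propagated {change.kind} on {change.table}: {change.detail}"
    return f"halted pipeline; alert raised for {change.kind} on {change.table}"

print(handle(SchemaChange("orders", "ADD_COLUMN", "coupon_code VARCHAR")))
print(handle(SchemaChange("orders", "ALTER_TYPE", "total NUMBER -> FLOAT")))
```

Defaulting unknown change types to "halt" keeps the pipeline conservative: an unrecognized drift stops delivery rather than silently corrupting consumers.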

Building Real-Time Data Products with Data Streaming

John Kutay, Product Manager, Striim

Data leaders are evaluating methods to meet the varied needs of cross-functional stakeholders. Some business users and customers need data in real time, others need materialized views in a business-friendly format, and some want all of the above! Learn how data streaming and change data capture can decentralize your data operations. This talk will include specifics on sourcing data from operational systems (OLTP databases, sensors, API data) and transforming it into actionable Data Products, with strong SLAs and SLOs for uptime and delivery speeds.

How Change Data Capture Enables Real-Time Data Streaming From Oracle Databases

 

Oracle CDC Webinar Replay

All businesses rely on data. Historically, this data resided in monolithic, on-premises databases, and for many enterprises, Oracle was the database of choice. As businesses modernize, they are looking to the cloud for analytics and striving for real-time data insights. While legacy databases are often difficult to replace completely, the data and transactions within them are essential for analytics.

To unlock the full value of their data, companies need to stream critical transactions from their Oracle databases to their cloud provider in real time. An effective way to do this is Change Data Capture (CDC). But not all CDC is created equal, and not all CDC solutions can handle mission-critical workloads.
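The core CDC idea is simple: rather than repeatedly re-querying the source, consume the ordered stream of row-level changes and apply each one to the target. A minimal sketch, with a hypothetical event shape (not any specific vendor format):

```python
# Minimal illustration of Change Data Capture: apply an ordered stream of
# row-level change events to keep a target replica in sync with the source.
replica = {}  # target state keyed by primary key

def apply_change(event: dict) -> None:
    op, key = event["op"], event["key"]
    if op in ("insert", "update"):
        replica[key] = event["row"]       # upsert the latest row image
    elif op == "delete":
        replica.pop(key, None)            # remove the row if present

# A captured transaction log, in commit order:
log = [
    {"op": "insert", "key": 1, "row": {"id": 1, "status": "new"}},
    {"op": "update", "key": 1, "row": {"id": 1, "status": "shipped"}},
    {"op": "insert", "key": 2, "row": {"id": 2, "status": "new"}},
    {"op": "delete", "key": 2},
]
for event in log:
    apply_change(event)

print(replica)  # {1: {'id': 1, 'status': 'shipped'}}
```

Because events are applied in commit order, the replica converges on the source state without any bulk re-extraction, which is what makes real-time cloud synchronization feasible.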

Watch our on-demand technical webinar and demo, where we provide you with an overview of Striim’s Oracle CDC capabilities. We cover topics including:

  • Introduction to OJet, Striim’s new Oracle reader, which can read 150+ gigabytes of data per hour from Oracle (up to version 21c)
  • Demo: how to set up zero-downtime migrations and replications from Oracle to Google BigQuery, from initial load to ongoing real-time sync. We also demo unique capabilities of Striim’s BigQuery writer, including partition pruning and a streaming API.
  • An overview of Striim’s features that support high-volume, mission-critical enterprise environments

Presented by:

Sai Natarajan
VP Chief Field Technologist, Striim

 

Connect Data Sources and Targets

Striim makes it easy to connect to your sources with a point-and-click wizard. Select tables, migrate schemas, and start moving data in seconds.

Data Pipeline Monitoring

Visualize your pipeline health, end-to-end data latency, and table-level metrics. Plug in with Striim’s REST APIs.
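Programmatic monitoring like this usually means polling a metrics endpoint and checking latency against an SLA. A hypothetical sketch; the payload shape below is illustrative, not Striim’s actual REST response format:

```python
# Hypothetical sketch of consuming pipeline metrics from a monitoring REST API.
# The JSON shape is an assumption for illustration, not a real Striim response.
import json

sample = json.loads("""
{
  "pipeline": "oracle_to_bigquery",
  "tables": [
    {"name": "ORDERS",    "lag_ms": 420,  "rows_per_s": 1800},
    {"name": "CUSTOMERS", "lag_ms": 5300, "rows_per_s": 90}
  ]
}
""")

SLA_LAG_MS = 1000  # alert when end-to-end latency exceeds 1 second

def breaches(payload: dict) -> list:
    """Return names of tables whose end-to-end lag violates the SLA."""
    return [t["name"] for t in payload["tables"] if t["lag_ms"] > SLA_LAG_MS]

print(breaches(sample))  # ['CUSTOMERS']
```

In practice the `sample` payload would come from an authenticated HTTP GET against the monitoring API, with the breach list feeding an alerting system.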

How to Stream Data to Azure Synapse Analytics with Striim

 

Azure Synapse Webinar, July 2022 (On-Demand)

There is significant demand for zero downtime database migrations and continuous data replication as workloads shift to the cloud. Modernizing databases by offloading workloads to the cloud requires building real-time data pipelines from legacy systems.

The Striim® platform is an enterprise-grade cloud data integration solution that continuously ingests, processes, and delivers high volumes of streaming data from diverse sources, on-premises or in the cloud.

In this joint session with Striim partner Microsoft, we discuss why and how cloud and data architects and engineers use Striim to move data into Azure in a consumable form, quickly and with sub-second latency, to run critical transactional and analytical workloads.

We also demonstrate how to use Striim to migrate or continuously replicate enterprise databases to Azure Synapse Analytics with no downtime:

  • Prepare Azure data targets for data integration, with table creation that reflects the source database
  • Set up in-flight transformations right in the GUI to minimize end-to-end latency and enable real-time analytics & operational reporting
  • Deploy zero-downtime migrations to Azure from existing enterprise databases anywhere
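The in-flight transformation step above can be illustrated with a small sketch. This is assumed logic written as plain Python, not Striim’s GUI configuration; the field names are hypothetical:

```python
# Illustrative in-flight transformation: enrich and reshape each change event
# before it is written to the cloud target, so analytics-ready rows land
# without a separate downstream batch step. Field names are hypothetical.
from datetime import datetime, timezone

def transform(row: dict) -> dict:
    out = dict(row)  # never mutate the in-flight event in place
    out["full_name"] = f"{row['first_name']} {row['last_name']}"
    out["loaded_at"] = datetime.now(timezone.utc).isoformat()
    return out

print(transform({"first_name": "Ada", "last_name": "Lovelace"})["full_name"])
```

Applying such transformations in the stream, rather than after landing, is what keeps end-to-end latency low enough for real-time reporting.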

Presented by:

Edward Bell
Senior Solutions Architect, Striim

 

Karlien Vanden Eynde
Director of Product Marketing Cloud Analytics, Microsoft

Is “The Modern Data Stack” Dead?

In this episode, recorded live from New York City, we hosted Ethan Aaron, CEO of Portable and a thought leader in the data industry. Ethan covers a range of topics, such as out-of-the-box analytics, the ‘Post-Modern Data Stack,’ and running a dashboard-driven organization.

Ethan explains why having too many tools too early can weigh down small data teams and reduce the business value they provide. Make sure to follow Ethan on LinkedIn for more.

 

Back to top