Modernizing Healthcare Regulation: Inside the GMC’s Cloud Analytics Transformation with Striim and Azure

About the General Medical Council

The General Medical Council (GMC) is the independent regulator of doctors in the United Kingdom, responsible for protecting patient safety and upholding standards in medical practice. Established in 1858, the GMC maintains the official register of medical practitioners, ensures the quality of medical education and training, and investigates complaints about doctors’ conduct and performance. Its regulatory duties span the full medical career lifecycle—from medical school accreditation to post-graduate training oversight and fitness to practise tribunals—making it a cornerstone of the UK’s healthcare system.

With over 1,700 employees supporting more than 300,000 registered doctors, the GMC depends on timely, accurate, and secure data to fulfill its mission. Its work involves sensitive and complex data, including personal identifiers, legal casework, and educational records. As the organization modernizes its infrastructure, the move toward real-time, cloud-based analytics is essential for faster reporting, enhanced transparency, and future-ready capabilities like AI-driven insights. This transformation enables GMC to deliver more responsive regulation and support high-quality care across the UK.

Legacy Infrastructure Slows Progress Toward Cloud Analytics

GMC’s strategic goal was to migrate to a modern, cloud-based analytics stack built around Azure and Power BI. But there was one major obstacle: their primary data source, Siebel CRM, wasn’t ready to move to the cloud.

The organization faced several limitations:

  • Delayed access to up-to-date data, with ETLs running only once per day
  • High costs tied to legacy tools like Tableau and Oracle
  • Inefficient processes that made rerunning failed ETLs slow and resource-intensive
  • A growing need to enable self-service analytics across business teams using Power BI

A thorough review made clear that getting the right data into the right architecture was critical, and it pointed the team toward real-time streaming.

Why GMC Chose Striim for Real-Time Data Streaming

To solve this challenge, GMC needed a real-time integration layer that could stream on-prem data to Azure reliably. After evaluating several solutions—including Oracle GoldenGate and Qlik—they selected Striim for its:

  • Ease of use
  • Responsive support team
  • Built-in CDC and real-time sync

GMC’s team worked with Striim to deploy a streaming solution that connected their Siebel source data to the cloud—while simultaneously scaling up their Azure environment. The implementation helped the team build out its new architecture while laying the groundwork for broader real-time data access.

Early Wins: Cutting Costs, Saving Time, and Improving Agility

Even before completing their full migration, GMC saw significant operational benefits:

✅ Cost Savings
By retiring Tableau (an estimated £90,000/year) and planning the decommissioning of Oracle analytics and Informatica, GMC reduced analytics costs while positioning the organization for scalable growth.

✅ Faster Back-End Operations
Previously, if an ETL failed, re-running it meant uploading over 150 GB of data—a process that could take hours and disrupt business operations. With Striim’s live streaming in place, data is always current, and ETLs can be triggered on demand.

✅ Minimal Disruption
Because Striim runs in parallel with existing ETLs, GMC was able to phase in their new system gradually, minimizing risk during the transition.

✅ Strategic Flexibility
Striim enabled decoupling from legacy infrastructure, empowering GMC to scale up Power BI adoption and build out its modern cloud analytics stack with confidence.


Powering a Cloud-First, Real-Time Future for GMC

By connecting on-prem systems with Azure in real time, GMC is not only solving today’s data integration challenges but also laying the groundwork for tomorrow’s AI, analytics, and compliance initiatives. 

Looking ahead, GMC’s analytics roadmap includes:

  • Enabling near-real-time dashboards across key departments
  • Expanding Power BI adoption through Azure-based centralized reporting

Explore What’s Possible with Real-Time Data Streaming

GMC’s transformation highlights the power of real-time data integration in modernizing legacy systems and enabling a cloud-first future. Striim delivered the scalability, compliance, and speed needed to help GMC accelerate its journey while keeping costs in check and teams empowered.

Want to see what Striim can do for your organization?

Start Your Free Trial | Schedule a Demo

Vector Search in the Aisles: How Morrisons Made Product Discovery Smarter with Peter Laflin


Peter Laflin, Chief Data Officer at Morrisons, shares how his team turned customer confusion into a cutting-edge vector search experience—bridging physical retail with AI-powered search. He and John Kutay dive into the practical challenges of implementing LLMs and real-time data pipelines at scale, the importance of starting with actual customer problems, and why the best engineering feels a little lazy (on purpose). A real-world look at what happens when modern search meets supermarket shelves.

What’s New In Data is a data thought leadership series hosted by John Kutay, who leads data and products at Striim. The show features industry practitioners discussing the latest trends, common patterns in real-world data architectures, and analytics success stories.

Unlocking Real-Time Decision-Making with High-Velocity Data Analytics

As data volumes surge and the need for fast, data-driven decisions intensifies, traditional data processing methods no longer suffice. This growing demand for real-time analytics, scalable infrastructures, and optimized algorithms is driven by the need to handle large volumes of high-velocity data without compromising performance or accuracy. To stay competitive, organizations must embrace technologies that enable them to process data in real time, empowering them to make intelligent, on-the-fly decisions.

With industries facing an increasing pace of change, businesses require the capability to quickly extract valuable insights from dynamic data streams. Real-time AI and machine learning (ML) models play a crucial role in ensuring both speed and precision, enabling businesses to navigate and respond to ever-changing conditions efficiently. These technologies must not only scale but also adapt to the complexity of high-velocity data.

Optimizing Operations Through High-Throughput Data Processing

Real-time analytics offer organizations the ability to enhance operational efficiency by making faster, more informed decisions. Below are key advantages of leveraging high-throughput data processing:

Real-Time Actionable Insights: By applying trained AI models to incoming data streams in real time, businesses can extract actionable insights immediately. This ensures that critical decisions—such as identifying new business opportunities or mitigating risks—are made quickly, reducing delays and increasing agility. Striim plays a key role in enabling businesses to extract these insights by seamlessly processing and integrating data in real time from various sources.

Improved Efficiency and Scalability: Real-time data processing platforms like Striim allow businesses to manage vast datasets without sacrificing performance. By using advanced algorithms and parallel processing techniques, Striim helps organizations scale their operations to accommodate increasing data volumes while maintaining low-latency performance. This scalability ensures that businesses can handle large, complex datasets efficiently, even as they grow.

Cost Savings Through Automation: High-throughput data processing allows organizations to automate decision-making tasks that would otherwise require manual intervention. This reduces reliance on human resources, minimizes errors, and lowers operational costs, enabling businesses to allocate resources more effectively. Striim’s platform supports this automation, ensuring that businesses can optimize their operations and reduce the need for manual data handling.

Enhanced Accuracy: Real-time processing utilizes sophisticated algorithms. These models improve the accuracy of insights derived from data streams, supporting more reliable, up-to-date decision-making and minimizing risks associated with outdated or incomplete data. With Striim’s advanced data integration capabilities, businesses can ensure that their decision-making is based on the most accurate and timely data available.

Seamless Integration for Instant Insight: To maximize the benefits of real-time analytics, organizations need platforms that can seamlessly integrate AI models into their data pipelines. Striim provides the architecture to apply trained models to incoming data as it flows through the system. By deploying lightweight inference agents within the streaming pipeline, Striim delivers real-time insights without delays, ensuring businesses can act on them instantly.

Flexibility Across Use Cases: Real-time data analytics can be applied across a variety of use cases, from predictive maintenance to anomaly detection, and customer behavior analysis. Whether businesses are looking to monitor equipment performance, detect fraud, or gain insights into customer trends, Striim’s platform provides the flexibility to implement AI models quickly and effectively, delivering insights tailored to specific business needs.
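To illustrate the “lightweight inference agent” idea described above, here is a minimal sketch in Python. The event shape and the threshold model are hypothetical stand-ins rather than Striim APIs; in a real pipeline the agent would wrap a trained model and run inside the streaming platform.

```python
from dataclasses import dataclass
from typing import Callable, Iterable, Iterator, Tuple

@dataclass
class Event:
    sensor_id: str
    value: float

def inference_agent(model: Callable[[Event], bool],
                    stream: Iterable[Event]) -> Iterator[Tuple[Event, bool]]:
    """Apply a trained model to each event as it arrives, emitting
    (event, prediction) pairs downstream without batching."""
    for event in stream:
        yield event, model(event)

# A stand-in "trained model": flag readings outside an expected range.
def anomaly_model(event: Event) -> bool:
    return not (10.0 <= event.value <= 90.0)

if __name__ == "__main__":
    events = [Event("s1", 42.0), Event("s1", 97.5), Event("s2", 3.2)]
    for event, is_anomaly in inference_agent(anomaly_model, events):
        print(event.sensor_id, event.value, "ANOMALY" if is_anomaly else "ok")
```

Because the agent is a generator, each prediction is emitted as soon as its event arrives, which is the property that makes insights actionable in flight rather than after a batch window closes.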

Key Benefits of Real-Time AI Inference with Striim

  • Cost Efficiency: Automating high-throughput inference tasks reduces manual processes, saving time and resources while minimizing errors.
  • Real-Time Actionability: Striim empowers businesses to make faster decisions by processing incoming data in real time, ensuring that opportunities are seized and risks are mitigated promptly.
  • Scalability: Striim’s platform can seamlessly handle large-scale data applications, enabling businesses to scale their operations without sacrificing speed or accuracy.
  • Accuracy: With continuous optimization of ML algorithms and integration of real-time data, Striim ensures that businesses can make decisions based on accurate, up-to-date insights.

The Future of High-Velocity Data: Agility and Intelligence at Scale

As industries continue to generate enormous volumes of data, the ability to process and manage this data at high speeds will be critical to success. Organizations that can leverage real-time analytics to extract insights from fast-moving data streams will be better equipped to make informed decisions in today’s dynamic landscape. Striim’s platform plays an integral role in enabling businesses to achieve this by delivering real-time data processing, scalable architectures, and seamless integration of advanced analytics models.

The future of high-velocity data demands agility, scalability, and precision—qualities that Striim delivers, helping businesses turn real-time insights into actionable outcomes with minimal delay.

Start Your Free Trial | Schedule a Demo

Architecting the Future: Alok Pareek on Databases, Logs, and Real-Time AI


Alok Pareek, Co-founder and EVP of Products at Striim, joins What’s New in Data to dive into the game-changing innovations in Striim’s latest release. We explore how real-time data streaming is transforming analytics, operations, and decision-making across industries. Alok breaks down the challenges of building reliable, low-latency data pipelines and shares how Striim’s newest advancements help businesses process and act on data faster than ever. From cloud adoption to AI-driven insights, we discuss what’s next for streaming-first architectures and why the shift to real-time data is more critical than ever.

Learn more about our latest release on Striim’s Release Highlight page: https://www.striim.com/whats-new-in-striim/


Accelerating SQL Server Data Replication with SQL2Fabric-X

Striim augments SQL2Fabric Mirroring to additionally replicate real-time data to Azure Databricks and Microsoft Fabric Data Warehouse

For years, SQL Server has been a cornerstone for enterprise data management, but moving that data in real time to modern cloud platforms has often been complex, slow, and operationally intrusive.

But real-time data movement for replication, mirroring, or analytics shouldn’t be a bottleneck—it should be an enabler. That’s why we’re excited to announce the general availability of SQL2Fabric-X, a purpose-built managed service designed to simplify and accelerate SQL Server data replication into the Microsoft ecosystem for delivering AI and BI solutions.

With SQL2Fabric-X, organizations can seamlessly replicate their SQL Server databases and tables to Microsoft Fabric Mirrored database, Microsoft Fabric Data Warehouse, and Azure Databricks. This means data teams can shift from batch-oriented processes to real-time insights, enabling more agile decision-making and unlocking new state-of-the-art AI and analytics use cases.

Transforming How Businesses Move and Use SQL Server Data

Data is only as valuable as the speed at which it can be accessed, analyzed, and acted upon. Historically, organizations have struggled with managing ETL pipelines that introduce complexity, latency, operational overhead, and the risk of data inconsistency. SQL2Fabric-X eliminates these challenges by offering the highest performance, lowest-latency streaming approach that aligns with modern cloud-first strategies.

With this launch, businesses no longer have to choose between flexibility and simplicity. SQL2Fabric-X provides:

  • Near Real-Time Replication – Keep data fresh across cloud environments, eliminating reliance on outdated snapshots and batch processing.
  • Operational Resilience – Automated failover and consistency mechanisms ensure high availability and accuracy, reducing downtime risks.
  • Broad Workload Compatibility – Replicate data to Microsoft Fabric Mirrored database, Fabric Data Warehouse, and/or Azure Databricks to support analytics, reporting, and AI-driven workloads.
  • Optimized Performance – Designed for high-throughput workloads, reducing the time it takes to move and process data for business-critical applications.

“At Ignite in November 2024, we jointly announced our strategic partnership with Microsoft on open mirroring by launching a public preview of a simple, low-cost, low-latency solution to mirror on-premises SQL Server data,” said Alok Pareek, co-founder and EVP of Products at Striim. “We were the first partner to announce that, and now we are delighted to offer broader, more flexible capabilities in this GA service, with great feedback from early customers who expressed an interest in unlocking on-premises SQL Server data for Azure Databricks in addition to Mirroring.”

Make Smarter, Faster Decisions with SQL2Fabric-X

SQL2Fabric-X isn’t just about moving data—it’s about removing friction in decision-making. By enabling real-time, event-driven pipelines, companies in all industries can shift from reactive analytics to proactive intelligence, ensuring that operational and analytical systems are always working with the freshest insights.

Take customer 360 initiatives as an example: Instead of waiting for daily ETL jobs to update customer data, businesses can have real-time visibility into purchases, support interactions, and engagement, making personalization and service improvements instantaneous. Similarly, finance and operations teams can leverage real-time reporting, ensuring that inventory levels, pricing models, and risk assessments are dynamically adjusted to current market conditions.

The Next Step in Microsoft Fabric’s Evolution

SQL2Fabric-X is a strategic enabler for Microsoft Fabric customers. By offering direct, native integration, it expands the capabilities of Microsoft Fabric, allowing organizations to maximize their investment in Microsoft’s ecosystem while reducing data silos and improving accessibility.

For organizations looking to take the next step, SQL2Fabric-X is now generally available with a 30-day free trial. For those attending the Microsoft Fabric Community Conference in Las Vegas from March 31–April 2, visit Striim at booth #312 to see SQL2Fabric-X in action and discuss how real-time data streaming can accelerate your cloud strategy.

Building for Scale: AWS’s Marc Brooker on Distributed SQL


In this episode of What’s New in Data, AWS VP and Distinguished Engineer Marc Brooker joins us to break down DSQL, Amazon’s latest innovation in serverless, distributed databases. We discuss how DSQL balances consistency, availability, and scalability—without the headaches of traditional relational databases. Tune in to hear how this new approach simplifies architecture, eliminates operational pain points, and sets a new standard for high-performance cloud databases.

Follow Marc on: X, Bluesky, LinkedIn, or his blog for more insights on distributed systems, databases, and the future of cloud computing.

Seamless Database Migration and Replication to AWS Aurora PostgreSQL with Striim

AWS PostgreSQL, a managed database service, provides a robust platform for enterprises to modernize their data infrastructure. However, the challenge lies in migrating and replicating data seamlessly while ensuring minimal downtime and maintaining transactional consistency. Striim, a leader in real-time data integration, offers a comprehensive solution to address these challenges.

Why Migrate to AWS PostgreSQL?

AWS PostgreSQL, including Amazon RDS for PostgreSQL and Amazon Aurora PostgreSQL, provides a managed, scalable, and secure environment for enterprise-grade applications. Some key benefits include:

  • Scalability & High Availability: Elastic scaling and automated failover mechanisms ensure business continuity.
  • Performance Optimization: Support for parallel queries, enhanced indexing, and optimized storage for large datasets.
  • Security & Compliance: Built-in encryption, IAM authentication, and compliance with industry standards like GDPR and HIPAA.
  • Fully Managed Service: Automated backups, patching, and monitoring reduce operational overhead.

Challenges of Database Migration and Replication

Migrating a database from on-premises or another cloud provider to AWS PostgreSQL involves several complexities:

  • Downtime Risks: Traditional migration methods often require extended downtime, impacting business operations.
  • Data Consistency: Ensuring data integrity during migration and replication is critical for transactional consistency.
  • Schema Evolution: Differences in data structures and evolving schemas can lead to errors if not handled properly.
  • Real-Time Synchronization: Businesses need up-to-date data without disruptions, making real-time replication essential.

How Striim Enables Seamless Migration and Replication

Striim provides an enterprise-grade, cloud-native platform for real-time data integration, featuring change data capture (CDC), continuous replication, and zero-downtime migration. Here’s how Striim simplifies the process:

1. Change Data Capture (CDC) for Minimal Downtime

Striim’s CDC technology captures changes from source databases in real time, allowing continuous data movement without disrupting ongoing operations. This ensures:

  • Zero Downtime Migration: Keeps source and target databases in sync during the transition.
  • Transactional Integrity: Guarantees consistency, preserving primary keys, foreign keys, and dependencies.
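To make the CDC mechanism concrete, here is a minimal, platform-agnostic sketch of how log-based change events keep a target in sync. The event format is hypothetical; Striim’s actual CDC readers parse database transaction logs rather than an in-memory list.

```python
# Hypothetical change events as a CDC reader might emit them from a
# database transaction log: (operation, primary_key, row).
change_log = [
    ("INSERT", 1, {"id": 1, "name": "Ada"}),
    ("INSERT", 2, {"id": 2, "name": "Grace"}),
    ("UPDATE", 1, {"id": 1, "name": "Ada Lovelace"}),
    ("DELETE", 2, None),
]

def apply_changes(target: dict, changes) -> dict:
    """Replay log-based change events against a target table (modelled
    as a dict keyed by primary key), keeping it in sync with the source."""
    for op, pk, row in changes:
        if op in ("INSERT", "UPDATE"):
            target[pk] = row  # upsert: apply the latest image of the row
        elif op == "DELETE":
            target.pop(pk, None)
    return target

target_table: dict = {}
apply_changes(target_table, change_log)
print(target_table)  # {1: {'id': 1, 'name': 'Ada Lovelace'}}
```

Because changes are applied continuously as they occur, the source database keeps serving traffic while the target converges, which is what makes zero-downtime cutover possible.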

2. Real-Time Data Replication for Always-Current Data

With Striim, businesses can continuously replicate data from on-premises databases or cloud platforms to AWS PostgreSQL with sub-second latency. This supports:

  • Hybrid and Multi-Cloud Strategies: Ensures real-time data synchronization across diverse environments.
  • Disaster Recovery & High Availability: Replicating to standby instances enhances resilience.

3. Schema Evolution and Automated Transformation

Striim dynamically handles schema changes and applies transformations, including:

  • Automated Data Mapping: Adapts source schema to target PostgreSQL schema seamlessly.
  • Pre-Built Connectors: Supports heterogeneous environments such as Oracle, SQL Server, MySQL, and NoSQL databases.
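A toy version of the automated data mapping idea might look like the following. The type map is deliberately tiny and illustrative; real schema conversion also handles precision, scale, constraints, defaults, and many more types.

```python
# Illustrative Oracle-to-PostgreSQL type mapping (hypothetical subset).
ORACLE_TO_POSTGRES = {
    "VARCHAR2": "varchar",
    "NUMBER": "numeric",
    "DATE": "timestamp",
    "CLOB": "text",
}

def map_table_ddl(table: str, columns) -> str:
    """Translate a simplified Oracle column list into a PostgreSQL
    CREATE TABLE statement, defaulting unknown types to text."""
    cols = ", ".join(
        f"{name} {ORACLE_TO_POSTGRES.get(src_type, 'text')}"
        for name, src_type in columns
    )
    return f"CREATE TABLE {table} ({cols});"

ddl = map_table_ddl(
    "customers", [("id", "NUMBER"), ("name", "VARCHAR2"), ("created", "DATE")]
)
print(ddl)  # CREATE TABLE customers (id numeric, name varchar, created timestamp);
```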

4. Secure, Scalable, and Fully Managed Solution

Striim is designed to meet enterprise security and scalability requirements:

  • Encryption & Access Control: Secure data movement with TLS encryption and role-based access control.
  • Scalable Architecture: Distributes workloads efficiently to handle large-scale data replication.
  • Monitoring & Alerts: Provides real-time dashboards and alerts for tracking pipeline health.

Use Case: Large-Scale Enterprise Migration to AWS PostgreSQL

A leading financial services company needed to migrate its mission-critical Oracle database to AWS PostgreSQL without disrupting ongoing transactions. By leveraging Striim’s CDC-based replication, they achieved:

  • Zero downtime migration, allowing continuous business operations.
  • End-to-end encryption, ensuring regulatory compliance.
  • Automated schema conversion, simplifying PostgreSQL adoption.
  • Real-time failover, enhancing disaster recovery and availability.

Conclusion

Migrating and replicating databases to AWS PostgreSQL doesn’t have to be complex or disruptive. With Striim’s real-time data integration platform, businesses can achieve a seamless transition with zero downtime, data consistency, and operational resilience. Whether modernizing data infrastructure, enabling hybrid cloud strategies, or ensuring high availability, Striim provides the tools to accelerate your cloud journey.

Get Started Today

Ready to migrate or replicate your database to AWS PostgreSQL? Schedule a demo with Striim to see real-time data integration in action.

What are Preview Application Connectors?

What are Preview Adapters?

A preview adapter is an adapter that is available for early prototyping by users for functional testing of their use cases. Preview adapters have a subset of functionality of generally available adapters, and should not be used for production or business-critical workloads, or for performance testing and benchmarking. Striim may choose to make the preview connectors generally available at a future point in time. Striim does not offer guarantees that the generally available version of a preview adapter will retain the functionality, performance or architecture of the preview connector.

Why Preview Adapters?

These adapters facilitate trying out Striim’s upcoming adapters (with initially restricted functionality) to test basic scenarios before using the subsequent GA versions of these adapters for production usage.

Where can I try the Preview adapters?

We have made Preview adapters available in the Striim Developer Edition so that developers and data engineers can try them out quickly in their sandbox environments. Explore our Preview adapters by signing up for free for Striim’s Developer Edition, which includes a 14-day trial for each adapter.

How can I identify Preview adapters?

Preview adapters are marked with a Preview icon on the adapter logo.

How can I try the Preview adapters?

After you have signed in to the Striim Developer Edition, use the Flow Designer to create an app: drag and drop the source and target components from the panel, then modify the connection properties and table details. Details are in this doc, and you can check this video for reference.

Which adapters are available in the Preview?

CRM and Customer Service (8)
Salesforce Marketing Cloud, ActiveCampaign, Acumatica, Pipedrive, SugarCRM, Freshdesk, Veeva Vault, Odoo

Marketing & Related tools (17)
Gmail, Google Search, Google Campaign Manager 360, Google Ad Manager, LinkedIn Company Pages, LinkedIn Ads, Facebook Ads, Facebook Pages, Meta for Business, Act-On, Mailchimp, Pinterest Ads, Snapchat Ads, X(Twitter) Ads, SendGrid, WordPress, Marketo

Analytics and BI (4)
Adobe Analytics, Tableau, Google Analytics, YouTube Analytics

IT tools, Workflow, and Communications (24)
Google Drive, Microsoft Excel, Microsoft Excel Online, Microsoft Project, Office 365, Microsoft SharePoint, Azure DevOps, Microsoft Dynamics 365, Microsoft OneDrive, Microsoft Advertising, Microsoft Bing Search, Microsoft Dataverse, Dropbox, Box, Asana, Airtable, Monday.com, Smartsheet, Trello, Twilio, Kintone, Paymo, SurveyMonkey, Splunk

Human Resources and People Management (4)
BambooHR, Workday, Epicor Kinetic, Certinia

Financial Accounting and Payments (9)
QuickBooks Online, Sage 50 Accounts, Sage Intacct, NetSuite, Exact Online, Xero, Zuora, Oracle Fusion Cloud Financials, Square

E-commerce and Logistics (6)
Amazon Marketplace, Adobe Commerce, BigCommerce, eBay, Shopify, WooCommerce

What are the differences between Preview and Generally Available (GA) adapters?

The two classes of adapters differ across the following feature areas:

  • Objects: Standard Objects, Custom Objects
  • Authentication: Basic Authentication, OAuth Authentication, Custom Authentication Methods
  • Building Applications: Using wizards, Flow Designer, Striim TQL
  • Operations: Automated mode, Initial Load, Continuous replication using incremental loading
  • Schema Handling: Initial Schema Creation
  • Runtime: Resilience / Recovery, Parallel Execution, Metrics
  • Governance: Connection Profiles, Sherlock AI, Sentinel AI
  • Customer Support: Contractual SLAs

The full support matrix is available in our documentation.

How do I request an adapter?

Use this form to request a new adapter: https://go2.striim.com/request-connector

Why Real-Time Data Will Define 2025

AI adoption is accelerating, but most enterprises are still stuck with outdated data management. The organizations that win in 2025 won’t be the ones with the biggest AI models—they’ll be the ones with real-time, AI-ready data infrastructures that enable continuous learning, adaptive decision-making, and assist regulatory compliance at scale.

What’s changing? The shift to always-on data pipelines, AI governance built for real-time, and architectures that unify multi-cloud complexity. Here’s what’s coming next (and why the winners are already making moves today).

1. Real-Time Data is the Baseline

For decades, businesses have treated data latency as a tolerable issue. That era is over. The shift from batch to real-time data pipelines is an existential requirement for AI-driven businesses.

Static AI models trained on stale data will deliver poor outcomes. Whether it’s anomaly detection, predictive analytics, or AI-powered decision-making, AI needs live data streams to work effectively. This is why companies are abandoning traditional ETL in favor of Change Data Capture (CDC) and event-driven architectures.

Figure: events (deposits and withdrawals) are captured and streamed in real time using change data capture.

At Striim, we’re seeing enterprises move to always-on data pipelines that integrate with AI applications in real time. AI-driven decision-making needs millisecond-level freshness, not insights delayed by hours or days. If your AI isn’t reacting in real time, it’s already obsolete.

2. AI Governance Requires Detecting and Classifying PII in Flight

The last 18 months have seen a surge in AI regulatory frameworks, and enterprises must navigate a new reality where AI decisions will be scrutinized at every level. Enterprises must also solve practical problems to ensure AI models don’t have access to customer PII.

The problem? Most companies still operate with outdated data governance policies that aren’t built for AI. If your governance model doesn’t account for real-time data flows and LLMs, you have some catching up to do. The solution is a continuous compliance approach, where security, governance, and access controls are applied dynamically.

We see organizations implementing real-time data lineage tracking, automatic PII detection, and encryption at the ingestion layer—not as an afterthought, but as an integral part of the data pipeline. By combining AI-ready data lakes with fine-grained, real-time access controls, enterprises can work towards compliance without sacrificing speed. 

Microsoft Fabric, for example, enables governance at scale, making it easier to enforce real-time security policies across AI applications.
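As a simplified sketch of in-flight PII masking at the ingestion layer, consider the following. The regular expressions are stand-ins for real classifiers, which cover far more identifier types and use context-aware models.

```python
import re

# Illustrative patterns only; production PII detection covers names,
# addresses, national IDs, and context-dependent matches.
PII_PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "uk_phone": re.compile(r"\b(?:\+44|0)\d{10}\b"),
}

def mask_pii(record: dict) -> dict:
    """Scan string fields as the record is ingested and mask anything
    matching a PII pattern before it reaches downstream consumers."""
    clean = {}
    for key, value in record.items():
        if isinstance(value, str):
            for label, pattern in PII_PATTERNS.items():
                value = pattern.sub(f"<{label} redacted>", value)
        clean[key] = value
    return clean

event = {"user": "j.doe@example.com", "note": "call 07123456789", "amount": 42}
print(mask_pii(event))
```

Running this step inside the pipeline, rather than as a batch audit after the fact, is what turns governance into the continuous, dynamic control described above.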

3. Hybrid and Multi-Cloud is the Default… But That’s Not Enough

For years, technical leaders have debated cloud vs. on-prem. The reality is, in 2025, every company is multi-cloud by default—whether they planned to be or not. SaaS sprawl, vendor lock-in concerns, and performance optimization mean enterprises now run workloads across AWS, Azure, GCP, and private clouds.

The challenge now isn’t deciding where to store data—it’s ensuring seamless real-time movement between these environments. This is why we’re seeing rapid adoption of cross-cloud data fabrics, where organizations treat data infrastructure as a fluid, event-driven system rather than a collection of disconnected storage silos.

With Microsoft Fabric’s OneLake and Striim’s real-time CDC technology, enterprises can create a single, AI-powered data layer that unifies ingestion, transformation, and analytics regardless of where the data originates.

4. Build AI for Business Outcomes, Not the Hype

AI adoption is often driven by technology-first thinking, where enterprises chase the latest model instead of solving real problems. In 2025, this approach will fail.

The shift is towards AI that drives measurable business impact, rather than AI that exists for its own sake. That means:

  • AI must be deeply embedded in real-time business processes, not just dashboards.
  • Models must be continuously trained on the freshest, most relevant data, not just historical snapshots.
  • AI applications must be iterative and adaptable, evolving alongside changing business needs.

Organizations truly succeeding with AI are integrating it into live decision-making loops, where insights automatically trigger actions. For example, streaming fraud detection models in financial services do more than just identify risks—they initiate automated responses in real time.

The companies that win with AI will be the ones that build adaptive, event-driven architectures that continuously improve with every data point that enters the system.

5. Retrieval-Augmented Generation (RAG) Will Separate AI Winners from the Rest

Most AI models today generate insights based on publicly available data or predefined training sets. This is no longer good enough. The next phase of enterprise AI is RAG (Retrieval-Augmented Generation), in which models dynamically pull in real-time enterprise data before generating responses.

RAG introduces a fundamental shift in how AI interacts with business operations. Instead of relying on static knowledge, RAG-based systems connect directly to live operational databases, SaaS applications, and event streams to produce context-aware, business-specific insights.

In my opinion, the impact of RAG will be widespread and profound, resulting in:

  • AI-generated insights grounded in real business reality instead of generic knowledge.
  • Enterprises maintaining tight control over their proprietary data and reducing compliance risks.

AI is moving from being a static analysis tool to a real-time decision-making engine. And as AI moves into mission-critical workflows, RAG becomes a requirement rather than an option.
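To ground the idea, here is a toy RAG retrieval step in Python. The keyword-overlap ranking and in-memory corpus are placeholders; production systems run vector search over live enterprise data and pass the assembled prompt to an LLM.

```python
# Hypothetical in-memory corpus standing in for live operational data.
DOCUMENTS = [
    "Order 1042 shipped on 2025-03-02 via express courier.",
    "Refund policy: refunds are processed within 5 business days.",
    "Order 1042 contains two items: a keyboard and a mouse.",
]

def retrieve(query: str, docs, k: int = 2):
    """Rank documents by naive keyword overlap with the query and
    return the top k (a stand-in for vector similarity search)."""
    q_terms = set(query.lower().split())
    scored = sorted(docs, key=lambda d: -len(q_terms & set(d.lower().split())))
    return scored[:k]

def build_prompt(query: str) -> str:
    """Assemble retrieved context plus the question into an LLM prompt."""
    context = "\n".join(retrieve(query, DOCUMENTS))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

print(build_prompt("what is in order 1042"))
```

The key property is that the context is fetched at question time, so the model answers from the current state of the business rather than from whatever it saw during training.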

The Road Ahead: Real-Time AI is the Only AI That Matters

We are at the tipping point where real-time data infrastructure and AI are converging. The companies that recognize this will redefine industries, while those that cling to legacy architectures will fall behind.

2025 will belong to organizations that build real-time, AI-ready infrastructures that continuously adapt, govern, and act on data the moment it is created.

At Striim, we’re enabling this shift by helping enterprises move beyond batch processing and into the world of always-on, real-time AI pipelines. Microsoft Fabric is accelerating this movement, providing a unified foundation for real-time analytics, governance, and AI integration.

If you want to see these trends in action, check out our recent webinar, Data and AI Trends 2025. And if you’re heading to FabCon in Las Vegas March 31-April 2, don’t miss our session on Real-Time Data for Real-Time AI—where we’ll show how enterprises are making real-time AI a reality today.

Simplify User Access with SSO Support for On-Premise Striim

In the latest Striim 5.0 release, we are excited to introduce a highly anticipated feature: Single Sign-On (SSO) support for on-premise Striim. This new capability empowers users to access Striim seamlessly using their existing corporate credentials, streamlining the login process and enhancing security. Let’s dive into what this feature does, how to use it, and how Striim adds value to your business.

What Does SSO Support Do?

Traditionally, users needed to manage multiple sets of credentials, such as usernames and passwords, for each enterprise application, including Striim. With the introduction of SSO support, Striim now integrates with popular identity providers such as Microsoft Entra ID (formerly Azure AD) and Okta, enabling customers to use a single set of login credentials across multiple enterprise systems.

SSO utilizes the SAML 2.0 protocol, allowing Striim users to log in without having to remember separate usernames and passwords for each application. By eliminating this friction, the user experience becomes more streamlined and secure.

How Do You Use It?

Striim offers two methods for setting up SSO:

  1. Identity Provider (IDP) Initiated Login: In this method, the user launches Striim from the Azure AD App Gallery. They are automatically federated through the Identity Provider (IDP), completing the login process.
  2. Service Provider (SP) Initiated Login: Here, the user starts the login process directly within Striim. Striim redirects them to Microsoft Entra ID or Okta (the IDP), where they authenticate, and then they are returned to Striim.

These two options give flexibility in how users can access Striim, ensuring that the setup aligns with your enterprise’s workflow.
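For a sense of what the SP-initiated flow involves under the hood, here is a sketch of the SAML 2.0 HTTP-Redirect binding, in which the AuthnRequest is deflate-compressed, base64-encoded, and URL-encoded into a SAMLRequest parameter. The endpoints and entity IDs are placeholders; a real integration such as Striim’s uses a full SAML library with request signing and response validation.

```python
import base64
import urllib.parse
import zlib

def build_sp_redirect(idp_sso_url: str, sp_entity_id: str, acs_url: str) -> str:
    """Build the redirect URL for an SP-initiated login under the
    SAML 2.0 HTTP-Redirect binding."""
    authn_request = (
        '<samlp:AuthnRequest xmlns:samlp="urn:oasis:names:tc:SAML:2.0:protocol" '
        f'AssertionConsumerServiceURL="{acs_url}">'
        '<saml:Issuer xmlns:saml="urn:oasis:names:tc:SAML:2.0:assertion">'
        f"{sp_entity_id}</saml:Issuer>"
        "</samlp:AuthnRequest>"
    )
    # Raw DEFLATE (no zlib header), as the Redirect binding requires.
    compressor = zlib.compressobj(9, zlib.DEFLATED, -15)
    deflated = compressor.compress(authn_request.encode()) + compressor.flush()
    saml_request = urllib.parse.quote_plus(base64.b64encode(deflated).decode())
    return f"{idp_sso_url}?SAMLRequest={saml_request}"

# Placeholder endpoints for illustration only.
url = build_sp_redirect(
    "https://idp.example.com/sso",
    "https://striim.example.com/sp",
    "https://striim.example.com/acs",
)
print(url.split("?")[0])  # https://idp.example.com/sso
```

The browser follows this URL to the IDP, authenticates the user, and the IDP posts a signed assertion back to the AssertionConsumerServiceURL, completing the round trip back into Striim.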

Want to dive deeper? Check out the doc and explore more.

How Does Striim Add Value?

The addition of SSO support provides numerous advantages for both users and businesses:

  • Enhanced Security and Compliance: With SSO and Multi-Factor Authentication (MFA) support via Entra or Okta, Striim meets enterprise information security requirements for authentication and authorization. This helps businesses maintain compliance with security standards while reducing the risk of password-related breaches.
  • Simplified User Management: Striim users can now be managed directly through their Entra or Okta dashboards, making it easier for IT teams to control access and permissions, just like they do with other enterprise software. This centralization simplifies administration and improves user access management.

Transform Your Business with SSO Support for On-Premise Striim – Try It Today!

Striim 5.0’s SSO feature brings a new level of convenience and security to your enterprise applications. By allowing your users to access Striim with their existing credentials from Microsoft Entra ID or Okta, you can streamline workflows, improve security, and reduce administrative overhead. Ready to power your business with real-time data? Try Striim today with a free trial or book a demo to see it in action.

Start Your Free Trial | Schedule a Demo

Back to top