StreamNative now supports streaming Kafka and Pulsar data directly into Unity Catalog Managed Tables with automated governance, lifecycle management, and performance optimization.
• OVERVIEW
StreamNative partners with Databricks to let enterprises stream data seamlessly into the Databricks Lakehouse. With multiple integration options available, organizations can select the approach that best fits their needs, whether the priority is governance, storage efficiency, or advanced analytics.

- Easily stream real-time data into the Databricks Lakehouse.
- Choose from native Unity Catalog integration, Delta Lake streaming, or Spark connectors.
- Ensure high-performance, cost-effective data streaming at scale.

• USE CASES

Configure the out-of-the-box integration within StreamNative Cloud to stream data directly into Databricks Unity Catalog.
- Unified data governance across real-time workloads
- Simplified data management for seamless analytics
Use the Lakehouse Connector to sink streaming data into Databricks tables with configurable batching, partitioning, and schema management.
- Configurable batch size and interval
- Automatic table creation
- Schema registry integration
- Dead letter queue support
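As an illustration, a Lakehouse Connector sink configuration covering the features above might look like the following sketch. The key names and values here are hypothetical placeholders, not the connector's documented configuration keys; consult the StreamNative connector reference for the exact settings.

```python
# Hypothetical sink settings for the Lakehouse Connector. Every key name and
# value below is an illustrative placeholder, not a documented configuration key.
lakehouse_sink_config = {
    "table": "main.analytics.clickstream",     # assumed target Databricks table
    "batch_size": 1000,                        # flush after this many records...
    "batch_interval_ms": 5000,                 # ...or after this many milliseconds
    "auto_create_table": True,                 # create the table if it is missing
    "schema_registry_url": "https://schema.example.com",  # assumed registry endpoint
    "dead_letter_topic": "persistent://public/default/clickstream-dlq",
}
```

Batching by both size and interval bounds end-to-end latency while keeping writes to the table efficiently sized.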
Read and write Pulsar topics from Databricks Spark jobs using the native Pulsar Spark connector for batch and structured streaming workloads.
- Structured streaming support
- Batch read from Pulsar topics
- Exactly-once processing
- Checkpoint-based recovery
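A minimal PySpark sketch of this pattern is shown below: a structured-streaming read via the Pulsar Spark connector, and a checkpointed write that enables recovery. The broker endpoints, topic, and target table are assumed example values, and the sink table name is hypothetical.

```python
def read_pulsar_stream(spark):
    """Build a streaming DataFrame from a Pulsar topic (sketch; endpoints assumed)."""
    return (
        spark.readStream
        .format("pulsar")                                    # Pulsar Spark connector
        .option("service.url", "pulsar://broker.example.com:6650")  # assumed broker
        .option("admin.url", "http://broker.example.com:8080")      # assumed admin endpoint
        .option("topic", "persistent://public/default/events")
        .load()
    )

def write_with_recovery(df, checkpoint_path):
    """Write the stream with a checkpoint location so the query can resume
    from its last committed offsets after a failure."""
    return (
        df.writeStream
        .format("delta")                                     # sink into a Delta table
        .option("checkpointLocation", checkpoint_path)       # checkpoint-based recovery
        .toTable("main.analytics.events")                    # hypothetical table name
    )
```

The checkpoint location is what ties the two feature bullets together: Spark persists source offsets there, so a restarted query resumes without reprocessing committed data.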
Connect Databricks Spark jobs to StreamNative using the standard Kafka Spark connector. No code changes are required for existing Kafka workloads.
- Native Kafka protocol support
- No application code changes
- Databricks SQL support
- Real-time dashboards and alerts
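Because StreamNative speaks the Kafka protocol natively, an existing Spark job using the built-in Kafka source only needs its bootstrap address pointed at the StreamNative endpoint. A minimal sketch, assuming an example broker address and topic name (production clusters typically also require SASL/SSL options):

```python
def read_kafka_stream(spark):
    """Read a StreamNative topic over the Kafka protocol using Spark's built-in
    Kafka source; broker address and topic name are assumed example values."""
    return (
        spark.readStream
        .format("kafka")
        .option("kafka.bootstrap.servers", "broker.example.com:9092")  # assumed endpoint
        .option("subscribe", "events")            # topic to consume (assumed name)
        .option("startingOffsets", "latest")      # begin from the newest records
        .load()
        # Kafka records arrive as binary; cast key/value for downstream use.
        .selectExpr("CAST(key AS STRING)", "CAST(value AS STRING)")
    )
```

The resulting DataFrame can feed Databricks SQL tables that back real-time dashboards and alerts.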