
How Contextual Scaled its AI Orchestration Platform with StreamNative’s Managed Apache Pulsar Service

Contextual is an innovative AI Orchestration platform that enables businesses to develop, prototype, and deploy AI solutions quickly and cost-effectively. Founded by a team of seasoned entrepreneurs with a track record of building scalable technology platforms, Contextual simplifies AI adoption for enterprises by providing a comprehensive, integrated technology stack.

Challenge

As an AI Orchestration platform, Contextual faced several critical challenges:

  • Need for a scalable, dependable messaging infrastructure to support their data management and low-code processing platform.
  • Requirement for a multi-tenant, dynamically scalable topic/partition solution to match customer tenants and agent configurations.
  • Cost-effective partitioning to support growth from 1 to 1,000 businesses using the platform.
  • A solution that could handle continuous development and deployment by users, triggering immediate events and messages for seamless AI solution building.

Solution

After researching various options, Contextual chose StreamNative's managed Apache Pulsar service as its messaging infrastructure. The team started with the pay-as-you-go option for development and QA environments, then moved to the Bring Your Own Cloud (BYOC) product when it became available on Azure.

Technical Journey

Contextual's journey with StreamNative began by replacing its previous managed Kafka platform with StreamNative's solution. This transition marked the start of a collaborative effort between Contextual and StreamNative to optimize Contextual's messaging infrastructure.

As the implementation progressed, the team encountered technical challenges, particularly with the Bring Your Own Cloud (BYOC) deployment in Azure, and these provided valuable learning experiences for both parties. During the process, Contextual hit issues running the Kafka protocol on StreamNative, including instances of message loss in the production cluster. Working through these obstacles made it clear that the architecture had to scale to handle scenarios in which many client and producer connections need to reconnect at the same time. Throughout this technical journey, close collaboration between Contextual and StreamNative was instrumental in identifying and addressing these challenges, ultimately producing a more robust and scalable solution.
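The case study does not detail how the simultaneous-reconnection scenarios were ultimately handled. A common client-side mitigation for this class of problem is jittered exponential backoff, which spreads a herd of reconnecting clients across a time window instead of letting them retry in lockstep. The sketch below is illustrative only; the function name and parameters are assumptions, not code from Contextual's platform:

```python
import random


def backoff_delays(attempts: int, base: float = 0.5, cap: float = 30.0) -> list[float]:
    """Full-jitter exponential backoff: attempt i waits a random time in
    [0, min(cap, base * 2**i)], so clients that disconnected together do
    not all hammer the broker again at the same instant."""
    return [random.uniform(0.0, min(cap, base * (2 ** i))) for i in range(attempts)]


# Example: the delays one client might sleep between reconnect attempts.
for i, delay in enumerate(backoff_delays(5)):
    print(f"attempt {i + 1}: sleep {delay:.2f}s")
```

Because each client draws its own random delays, two clients that lose their connections simultaneously will almost never retry on the same schedule, which is the property that matters when thousands of producers reconnect at once.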

Results

Compared to a managed Kafka solution where partition counts were fixed and additional partitions cost more money, StreamNative gave Contextual the ability to dynamically allocate partitions to its clients without significant additional cost. This allowed Contextual to go to market with a cost-effective, multi-tenant solution in which new topics and partitions can be created easily for each new customer.
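Pulsar addresses every topic as `persistent://<tenant>/<namespace>/<topic>`, which is what makes this per-customer layout cheap to create on demand. As an illustration only (the tenant/namespace/topic naming scheme below is an assumption for the sketch, not Contextual's actual schema), each customer can map to its own Pulsar tenant and each agent to its own topic:

```python
def agent_topic(customer: str, agent: str, env: str = "prod") -> str:
    """Build a persistent Pulsar topic name for one customer's agent stream.

    Pulsar topics are addressed as persistent://<tenant>/<namespace>/<topic>,
    so a new customer only needs a new tenant/namespace, and a new agent
    only needs a new topic under it.
    """
    return f"persistent://{customer}/{env}-agents/{agent}-events"


topic = agent_topic("acme", "support-bot")
print(topic)  # persistent://acme/prod-agents/support-bot-events

# A matching partitioned topic could then be created with the admin CLI, e.g.:
#   pulsar-admin topics create-partitioned-topic \
#       persistent://acme/prod-agents/support-bot-events -p 4
# and later grown with update-partitioned-topic as the tenant's load increases.
```

The hypothetical `agent_topic` helper only builds names; actual tenant, namespace, and partitioned-topic creation would go through the Pulsar admin API or CLI as noted in the comments.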

Key Takeaways

Contextual's implementation underscored the critical importance of flexible, cost-effective partitioning in multi-tenant AI orchestration platforms. This StreamNative capability proved to be a game-changer for Contextual, enabling them to efficiently scale their services to meet the diverse needs of their growing customer base. 

The project also highlighted the complexities involved in implementing a Bring Your Own Cloud (BYOC) solution in Microsoft Azure, especially when integrating with Kafka. These challenges provided both Contextual and StreamNative with valuable learning experiences, contributing to the ongoing refinement of the platform. Furthermore, the journey emphasized the need for robust architecture capable of handling simultaneous reconnections of multiple clients and producers, a crucial factor in maintaining the reliability and performance of an event-driven AI orchestration platform. 

These key learnings not only shaped Contextual's approach to their infrastructure but also provided valuable insights for StreamNative in enhancing their service offerings for similar use cases in the future.

Future Prospects

As Contextual launches its production platform at Contextual.io, it will begin to push the scale limits of the StreamNative solution. This will provide valuable insights into the platform's performance at scale and may lead to further optimizations and enhancements.

Conclusion

StreamNative's managed Apache Pulsar service has provided Contextual with a scalable, cost-effective messaging infrastructure that is crucial for their AI Orchestration platform. The StreamNative solution has positioned Contextual to offer a competitive, multi-tenant AI solution platform that can scale efficiently with their business growth.

"Compared to a managed Kafka solution where partitions were fixed and additional partitions cost more money, StreamNative gave us the ability to dynamically allocate partitions to our clients without incurring significant additional cost. This allows us to go-to-market with a multi-tenant solution where new topics and partitions can be easily created for each new customer without breaking the bank." - Andrew Brooks, CEO of Contextual
