30 min

Open Q&A Session - Pulsar Virtual Summit Europe 2023


One highlight of the Pulsar Virtual Summit Europe 2023 was this open Q&A session, where a panel of facilitators and Pulsar engineers answered audience questions live. Listen along to hear questions and responses, including the announcement of the new Apache Pulsar website, discussion of the release of Apache Pulsar 3.0, the recent open sourcing of Oxia, and a range of other questions on both technical and non-technical Pulsar topics.

One question to listen for came from an attendee who asked how to convince their leadership to adopt Pulsar over a traditional Kafka deployment. Nakul Mish (Solutions Engineer, StreamNative) provided a thoughtful response, which you can hear in full in this recording:

“When you are trying to think about fighting and convincing, then you are coming from an ego perspective: 'I want to win or they want to win.' I would just take a step back and think about everything having a price, right? So Kafka can solve a certain set of problems and Pulsar can also solve more sets of problems. And also both of them are not the golden bullet. I mean, we are not saying Pulsar is the golden bullet nor is Kafka the golden bullet. So you have to look at: 'What is your use case?' If you have the use case where you are going to have streaming as well as messaging, well, Kafka announced recently that they will begin building that soon. But with Pulsar, you have this functionality already for a long time and people are running it successfully in production currently…”

This recording was originally presented at Pulsar Virtual Summit Europe 2023.

Asaf Mesika
Principal Engineer, StreamNative
Julien Jakubowski
Developer Advocate, StreamNative
Gilles Barbier
Head Of Community, StreamNative
Nakul Mish
Solutions Engineer, StreamNative

