
Event Streaming in 3 Minutes

https://cnfl.io/event-streams-and-design-intro | What is event streaming? Danica Fine (Staff Developer Advocate, Confluent) explains the basics: from an event to an event stream to event stream processing, and how it all works. An event stream is a continuous sequence of data points, most often brokered by Apache Kafka®, the de facto distributed messaging system for highly available environments. Data points can also stem from the physical world when a measuring tool is present; for example, a temperature reading sent by a jet engine sensor could be an event. Each data point emitted by a system is referred to as an event, the fundamental unit of stream processing. Events have a repeating and evolving nature, so the ongoing delivery of events is referred to as a stream. Event stream processing (ESP) combines streams of data for real-time delivery, which opens up new capabilities like real-time data processing and analytics. This is where the true power of event streaming lies.

LEARN MORE
► Designing Events and Event Streams: https://cnfl.io/event-streams-and-design-intro
► Event Sourcing course: https://cnfl.io/event-sourcing-fundamentals
► Kafka Streams course: https://cnfl.io/kafka-streams-introduction
► Learn Apache Kafka with Tutorials: https://cnfl.io/confluent-developer-what-is-event-streaming
► Join the Confluent Developer Community: https://cnfl.io/join-community-what-is-event-streaming

ABOUT CONFLUENT
Confluent is pioneering a fundamentally new category of data infrastructure focused on data in motion. Confluent's cloud-native offering is the foundational platform for data in motion, designed to be the intelligent connective tissue enabling real-time data from multiple sources to constantly stream across the organization. With Confluent, organizations can meet the new business imperative of delivering rich, digital front-end customer experiences and transitioning to sophisticated, real-time, software-driven backend operations. To learn more, please visit www.confluent.io.

#streamprocessing #kafkastreams #apachekafka #kafka #confluent

Confluent

2 years ago

Hi, I'm Danica Fine with Confluent. Today we'll talk about event streaming.

To understand event streaming, let's start with its fundamental unit: the event. An event is something that happens at a specific point in time, signaling both that something happened and carrying the information about exactly what happened. An event could be the user of a website adding a product to a cart, or a customer completing a business process like paying an invoice. But events can also stem from the physical world if a measuring tool is present: the geographical coordinates of a pedestrian could be an event, and so could a temperature reading sent by a jet engine sensor.
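
To make "a point in time plus exactly what happened" concrete, here is a minimal sketch of an event as a plain data structure in Java. The CartEvent record, its field names, and the sample values are illustrative assumptions, not something from the video:

    public class EventExample {
        // An event couples a point in time (timestampMs) with the details
        // of exactly what happened (who did what to which product).
        record CartEvent(String userId, String productId, String action, long timestampMs) {}

        public static void main(String[] args) {
            CartEvent event = new CartEvent("user-42", "sku-123", "ADD_TO_CART",
                    System.currentTimeMillis());
            System.out.println(event); // one immutable fact about the system
        }
    }
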
But events aren't interesting if they happen just once or infrequently; we wouldn't even need a database to record them were that the case. Instead, it's their repeating and evolving nature that makes them worth observing. Events form into event streams, which reflect the changing behavior of a system: the continuing path of a pedestrian, or the entire sequence of products added to and removed from an e-commerce cart. Distributing this behavior-level data around a system or company allows us to arrive at useful conclusions just a few milliseconds after the original event occurred.
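
For a sense of what publishing such a stream can look like in practice, here is a minimal sketch using the Apache Kafka producer client. The broker address, the cart-events topic, and the string-encoded values are assumptions made for illustration:

    import java.util.Properties;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerRecord;
    import org.apache.kafka.common.serialization.StringSerializer;

    public class CartEventProducer {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put("bootstrap.servers", "localhost:9092"); // assumed local broker
            props.put("key.serializer", StringSerializer.class.getName());
            props.put("value.serializer", StringSerializer.class.getName());

            try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
                // Each record is one event; the topic accumulates the evolving stream.
                producer.send(new ProducerRecord<>("cart-events", "user-42", "ADD:sku-123"));
                producer.send(new ProducerRecord<>("cart-events", "user-42", "REMOVE:sku-123"));
            } // closing the producer flushes any buffered events
        }
    }

Keying each record by user ID sends all of one user's cart events to the same partition, which preserves their order.
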
So far, I have spoken about event streams as individual things: purchases, GPS coordinates, and so forth. But the true power of event streaming lies in combining many streams together, which we call event stream processing: one stream reads from the output of another, making calculations that then feed into a third, which in turn sends its calculations to a completely different streaming pipeline, and so on. It's a somewhat kaleidoscopic scenario if you diagram it. Because event stream processing allows immediate action to be taken on conclusions drawn from many sources of data, it can be used directly with the operational workloads that actually run a business, not just for the end-of-day analytics we saw in earlier generations of data tooling.
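
The Kafka Streams course linked in the description covers this chaining in depth. As a rough sketch, with topic names and filtering logic invented for illustration, the first stage below writes its output to a topic that the second stage reads as its input:

    import java.util.Properties;
    import org.apache.kafka.common.serialization.Serdes;
    import org.apache.kafka.streams.KafkaStreams;
    import org.apache.kafka.streams.StreamsBuilder;
    import org.apache.kafka.streams.StreamsConfig;
    import org.apache.kafka.streams.kstream.KStream;

    public class CartAnalytics {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put(StreamsConfig.APPLICATION_ID_CONFIG, "cart-analytics");
            props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
            props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
            props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

            StreamsBuilder builder = new StreamsBuilder();

            // Stage 1: read raw cart events and keep only additions.
            KStream<String, String> raw = builder.stream("cart-events");
            raw.filter((user, action) -> action.startsWith("ADD:"))
               .to("cart-additions");

            // Stage 2: a second stream reads stage 1's output, transforms it,
            // and feeds yet another downstream pipeline.
            builder.<String, String>stream("cart-additions")
                   .mapValues(action -> action.substring("ADD:".length()))
                   .to("products-added");

            new KafkaStreams(builder.build(), props).start();
        }
    }
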
Companies are increasingly putting these technologies to good use, applying them to everything from real-time fraud analysis and ticketing to self-driving vehicles, vehicle maintenance monitoring, and ride-share matching. In fact, customers of digital products across the economy are starting to expect product features that rely on the event streaming paradigm in order to function, even if they aren't aware of it happening.

But as event streaming grows in popularity, users are finding that not all event streaming solutions are built alike; many are retrofits of static products, and these can have issues with scaling. That brings me to the final important thing I'd like to impart about event streams: they generate unbounded data sets. You may not be able to predict how often an event will happen, how long it will continue, or how long you will need to measure it; you may not even know how many streams you'll need to monitor. So when working with events, you need a solution that can handle unknowable sizes, in other words, one that scales really well, and horizontally. Your best bet is to use one that was designed for event streaming from its inception.
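
One concrete mechanism behind that horizontal scaling, in Kafka's case, is the consumer group: every consumer sharing a group.id splits the topic's partitions with its peers, so you add capacity for an unpredictable stream simply by starting more instances. A minimal sketch, with the topic and group names assumed:

    import java.time.Duration;
    import java.util.List;
    import java.util.Properties;
    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.clients.consumer.ConsumerRecords;
    import org.apache.kafka.clients.consumer.KafkaConsumer;
    import org.apache.kafka.common.serialization.StringDeserializer;

    public class CartEventReader {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put("bootstrap.servers", "localhost:9092");
            props.put("group.id", "cart-readers"); // instances sharing this id divide the partitions
            props.put("key.deserializer", StringDeserializer.class.getName());
            props.put("value.deserializer", StringDeserializer.class.getName());

            try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
                consumer.subscribe(List.of("cart-events"));
                while (true) {
                    ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                    for (ConsumerRecord<String, String> record : records) {
                        System.out.printf("%s -> %s%n", record.key(), record.value());
                    }
                }
            }
        }
    }

Running a second copy of this program triggers a rebalance, after which each instance is responsible for roughly half of the partitions.
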

Comments

@pierrealvarez3773

Student here - found this very helpful. Thank you!

@edharrod

Nice explanation; diagrams always help me to understand a concept better, though 👍

@francksgenlecroyant

Thanks, Confluent Team

@toenytv7946

Good job keeping it simple. 👍