The Ops Community ⚙️

Bhagya

Azure Event Hubs

Microsoft Azure Event Hubs is a cloud-based big data streaming platform and event ingestion service that can receive and process millions of events per second. Data generated or stored in a source system can be delivered to an event hub, where the user can apply the appropriate transformations before storing it, using either real-time ingestion or batch/storage operations. Event Hubs is used for anomaly detection, application logging, and applications that require real-time data, such as live dashboards.

Event Hubs processes data in real time with minimal latency, allowing users to gain additional insight from their data, and it integrates with Azure's data analytics services. An event hub acts as an event ingestor and operates as the "front door" to an event pipeline: the ingestor sits between the event publisher and the event consumer. It is a unified streaming service designed to decouple the production of an event stream from its consumption.

To process a stream of data, an event hub uses the components below:

Event producers: any entity that sends data to an event hub. Data sent to the hub is first published by an event publisher/producer, using HTTPS, AMQP 1.0, or the Apache Kafka protocol.
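
Publishing over HTTPS requires a Shared Access Signature (SAS) token in the request's Authorization header. The sketch below shows the standard SAS generation algorithm (HMAC-SHA256 over the encoded resource URI and expiry); the namespace, hub, policy name, and key are placeholders, not real credentials.

```python
import base64
import hashlib
import hmac
import time
import urllib.parse

def generate_sas_token(resource_uri: str, key_name: str, key: str,
                       ttl_seconds: int = 3600) -> str:
    """Build a SAS token for an Event Hubs HTTPS request.

    resource_uri, key_name, and key come from your namespace's
    shared access policy (placeholders here).
    """
    expiry = str(int(time.time()) + ttl_seconds)
    encoded_uri = urllib.parse.quote_plus(resource_uri)
    # Sign "<encoded-uri>\n<expiry>" with the policy key.
    string_to_sign = f"{encoded_uri}\n{expiry}".encode("utf-8")
    signature = base64.b64encode(
        hmac.new(key.encode("utf-8"), string_to_sign, hashlib.sha256).digest()
    )
    return (
        "SharedAccessSignature "
        f"sr={encoded_uri}"
        f"&sig={urllib.parse.quote_plus(signature)}"
        f"&se={expiry}"
        f"&skn={key_name}"
    )

# Hypothetical namespace and hub names for illustration only.
token = generate_sas_token(
    "https://mynamespace.servicebus.windows.net/myhub",
    "RootManageSharedAccessKey",
    "base64-policy-key",
)
```

The resulting token would accompany a POST to `https://<namespace>.servicebus.windows.net/<eventhub>/messages`; in practice the official SDKs handle this for you.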

Partitions: an event hub divides the message stream into partitions, so that each consumer reads only its particular subset (partition) of the streamed messages.
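
Events that share a partition key always land on the same partition, which preserves per-key ordering. Event Hubs' actual hash function is internal to the service, but the routing idea can be modeled with an illustrative hash like this:

```python
import hashlib

# Partition count is fixed when the event hub is created.
PARTITION_COUNT = 4

def partition_for(partition_key: str, partition_count: int = PARTITION_COUNT) -> int:
    """Illustrative stand-in for Event Hubs' internal key-to-partition
    hashing: the same key always maps to the same partition."""
    digest = hashlib.sha256(partition_key.encode("utf-8")).digest()
    return int.from_bytes(digest[:8], "big") % partition_count

# Same device id -> same partition, so events from one device stay ordered.
keys = ["device-1", "device-2", "device-1", "device-3"]
assignments = [partition_for(k) for k in keys]
```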

Consumer groups: Event Hubs follows a publish-subscribe mechanism, which is enabled through consumer groups. A consumer group is a view (state, position, and offset) of an event hub: each subscribing group sees the entire event stream, and each group can read the stream at its own pace and from its own offset.
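
The key property is that groups do not share a cursor: each one advances its own offset over the same log. A toy model of one partition (not the Event Hubs API, just a sketch of the mechanism) makes this concrete:

```python
from collections import defaultdict

class PartitionLog:
    """Toy append-only log standing in for one Event Hubs partition."""

    def __init__(self):
        self.events = []
        # Each consumer group keeps its own offset into the same log.
        self.offsets = defaultdict(int)

    def append(self, event):
        self.events.append(event)

    def read(self, consumer_group: str, max_count: int = 10):
        """Return the next batch for this group, advancing only its offset."""
        start = self.offsets[consumer_group]
        batch = self.events[start:start + max_count]
        self.offsets[consumer_group] += len(batch)
        return batch

log = PartitionLog()
for i in range(5):
    log.append(f"event-{i}")

# Two groups read the same stream independently, at different paces.
dashboard = log.read("dashboard", max_count=2)  # gets event-0, event-1
archiver = log.read("archiver", max_count=5)    # gets all five events
```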

Throughput units: pre-purchased units of capacity that control the throughput of an event hub.

Event receivers: entities that read data from the event hub. All Event Hubs consumers connect through AMQP 1.0 sessions, over which events are delivered as soon as they become available. For real-time streaming and ingestion, Kafka consumers connect using the Kafka protocol 1.0 or later.

Top comments (1)

arul2arul

Hi Bhagya,
If we're thinking of enabling monitoring and observability, what topics should be considered? What alerts should be enabled?