Chapter 11. Implementing a Streaming Solution
In this chapter, we embark on building a simple IoT pipeline using a Raspberry Pi and a DHT22 humidity and temperature sensor. This setup will enable continuous environmental data collection, which we will stream in near real time to Confluent Cloud using Apache Kafka. The goal is to create a foundational system that combines hardware, software, and cloud infrastructure to demonstrate end-to-end data streaming.
We begin by assembling the physical components and configuring the Raspberry Pi running Ubuntu, then proceed to connecting and programming the sensor. After that, we’ll establish a Kafka cluster in Confluent Cloud, set up topics, and configure producer and consumer applications in Python. Finally, we’ll create a connector to persist the streaming data to Amazon S3. By the end of this chapter, you’ll have a fully integrated pipeline capable of capturing, streaming, and storing sensor data in a scalable, cloud-native environment.
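To make the shape of the producer side concrete, here is a minimal sketch of the sensor-to-Kafka path using the `confluent-kafka` and Adafruit CircuitPython DHT libraries. The topic name `sensor-readings`, the GPIO4 data pin, and the placeholder credentials are assumptions for illustration; substitute the bootstrap server and API key/secret from your own Confluent Cloud cluster.

```python
import json
import time


def make_record(temperature_c, humidity_pct):
    """Serialize one sensor reading as a JSON-encoded Kafka message value."""
    return json.dumps({
        "timestamp": time.time(),
        "temperature": temperature_c,
        "humidity": humidity_pct,
    }).encode("utf-8")


def main():
    # Hardware and Kafka imports are deferred so the module can be imported
    # on machines that don't have the Pi-specific libraries installed.
    import board                 # Adafruit Blinka pin definitions
    import adafruit_dht          # CircuitPython DHT driver
    from confluent_kafka import Producer

    # Placeholder connection settings -- replace with your cluster's values.
    producer = Producer({
        "bootstrap.servers": "<BOOTSTRAP_SERVER>",
        "security.protocol": "SASL_SSL",
        "sasl.mechanisms": "PLAIN",
        "sasl.username": "<API_KEY>",
        "sasl.password": "<API_SECRET>",
    })

    # Data pin wired to GPIO4 is an assumption; match your own wiring.
    dht = adafruit_dht.DHT22(board.D4)
    while True:
        try:
            value = make_record(dht.temperature, dht.humidity)
            producer.produce("sensor-readings", value=value)
            producer.poll(0)     # serve delivery callbacks
        except RuntimeError:
            pass                 # DHT reads fail intermittently; just retry
        time.sleep(2)            # DHT22 supports at most ~0.5 Hz sampling


if __name__ == "__main__":
    main()
```

Keeping serialization in a small pure function such as `make_record` makes the message format easy to test and evolve separately from the hardware loop.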
Raspberry Pi and Sensor Setup
We’ll now set up a Raspberry Pi with a humidity and temperature sensor and feed streaming data to Confluent. The setup involves several steps, beginning with the assembly and configuration of the hardware components. By the end of this chapter, you will have a fully functional system that continuously monitors humidity and temperature and streams the data via Confluent Kafka.
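Before any streaming is involved, it helps to confirm the sensor itself is wired and readable. The sketch below, assuming the Adafruit CircuitPython DHT library and the data line on GPIO4 (a common wiring choice, not a requirement), simply polls the DHT22 and prints each reading:

```python
import time


def c_to_f(celsius):
    """Convert Celsius to Fahrenheit for logging alongside the raw reading."""
    return celsius * 9 / 5 + 32


def read_forever():
    # Deferred imports: board and adafruit_dht are only available on the Pi.
    import board
    import adafruit_dht

    # DHT22 data pin on GPIO4 -- adjust to match your wiring.
    dht = adafruit_dht.DHT22(board.D4)
    while True:
        try:
            t, h = dht.temperature, dht.humidity
            print(f"{t:.1f} C ({c_to_f(t):.1f} F), {h:.1f}% RH")
        except RuntimeError as err:
            # DHT sensors drop readings regularly; skip and retry.
            print(f"read failed: {err}")
        time.sleep(2)


if __name__ == "__main__":
    read_forever()
```

If this loop prints plausible values every couple of seconds, the hardware side is working and you can move on to producing the same readings to Kafka.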
Bill of Materials
This is a pretty simple project from a hardware perspective. ...