Apache Kafka is a popular distributed streaming data platform. A Kafka cluster stores streams of records (messages) in categories called topics, and Kafka has become the architectural backbone of modern data analytics. Data flowing into Kafka often originates from native data streams such as social media feeds, telemetry data, financial transactions and many others. But these streams only carry part of the picture: a lot of the data needed in stream processing is stored in traditional systems backed by relational databases. To implement modern, real-time solutions, an up-to-date view of that information is needed. So how do we make sure that information can flow between the RDBMS and Kafka, so that changes become available in Kafka in near real time? In this session, we present different approaches for integrating relational databases with Kafka, such as Kafka Connect, Oracle GoldenGate and bridging Kafka with Oracle Advanced Queuing (AQ).
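To make the first of these approaches concrete, the following is a minimal sketch of a Kafka Connect source connector configuration using the Confluent JDBC source connector to poll an Oracle table into a Kafka topic. The connection URL, credentials, table name (CUSTOMERS) and column names (ID, LAST_MODIFIED) are illustrative placeholders, not values from the session itself.

    # Hypothetical standalone-worker config: poll an Oracle table into the topic "oracle-CUSTOMERS"
    name=jdbc-source-customers
    connector.class=io.confluent.connect.jdbc.JdbcSourceConnector
    connection.url=jdbc:oracle:thin:@//dbhost:1521/ORCLPDB1
    connection.user=kafka_connect
    connection.password=<secret>
    # Only capture the CUSTOMERS table (placeholder name)
    table.whitelist=CUSTOMERS
    # Detect both inserts and updates using an incrementing key plus a last-modified timestamp
    mode=timestamp+incrementing
    incrementing.column.name=ID
    timestamp.column.name=LAST_MODIFIED
    # Prefix for the target topic name; poll the table every 5 seconds
    topic.prefix=oracle-
    poll.interval.ms=5000

A query-based connector like this is the simplest to set up but adds polling latency and load on the source database; log-based change data capture tools such as Oracle GoldenGate, also covered in the session, avoid that by reading the database redo logs instead.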