Redpanda open-sources top 16 AI connectors

Stream fast and free with Redpanda Connect’s pre-built AI processors

By Mike Broberg on August 18, 2025

We’re excited to release Redpanda’s top AI connectors to the most used destinations — including OpenAI, Cohere, Bedrock, Ollama, and Vertex AI — under the Apache 2.0 open-source license. You can find all the AI processors in the Redpanda Connect catalog.

Today, Redpanda Connect's unique, embeddable architecture already powers hundreds of the world's largest businesses, from top U.S. retailers and the biggest power grid companies in Europe to the largest banks in South Africa and leading sports betting companies in Australia. Our product philosophy is to build simple, safe, efficient systems that plug easily into your ecosystem. Over the last year, we've proven this by adding over 100 connectors and expanding into new markets, including AI, building one of the best connector portfolios out there.

While our recent focus has been on enterprise readiness and security hardening; general cloud availability, with more BYOC regions and expanded Serverless capabilities; and building the fastest, lightest-weight CDC for PostgreSQL, MySQL, and MongoDB — we’re proud to continue the original Benthos ethos of “Fancy stream processing made operationally mundane.” And now, we're ready to share our powerful AI connectors with the rest of the world.

Explore all components in the Redpanda Connect catalog →

Shifting from AI experimentation to AI monetization

By open-sourcing our AI connectors, we’re not just giving you tools; we’re unlocking new business models. Because Redpanda Connect is embeddable and cloud-native, you can now package these AI capabilities directly into your own commercial products and services.

This allows you to:

  • Build your own AI-powered features: Seamlessly integrate generative AI, embeddings, and real-time analysis into your applications.
  • Monetize your innovations: Ship products with powerful, streaming AI features without the friction of enterprise licensing for the connectors.
  • Empower agents and teams: Both human and agentic AI workflows can now iterate freely, running at full speed.

Meet the AI processor components

Here’s a quick look at the streaming AI processors that come pre-built in Redpanda Connect across OpenAI, Amazon Bedrock, Google Cloud Vertex AI, Cohere, and Ollama.

OpenAI

You may have heard of this company. OpenAI’s models are versatile, capable of handling text, audio, and image data in real time. They excel in pipelines that require a mix of generation, summarization, classification, and translation.

Redpanda Connect supports six processors for OpenAI:

  • openai_chat_completion: generate text with GPT models
  • openai_embeddings: create vector embeddings from text
  • openai_image_generation: generate images from prompts
  • openai_speech: synthesize speech from text
  • openai_transcription: transcribe audio to text
  • openai_translation: translate audio into English text

This makes OpenAI ideal for multi-modal pipelines and rapid prototyping. See this recent example of streaming text embeddings for RAG. The GitHub repo provides a ready-to-run template.
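As a rough sketch, a Connect pipeline that embeds each incoming record before indexing it for RAG might look like the following. The broker address, topic names, model choice, and MongoDB settings are all placeholders; check the `openai_embeddings` component reference for the authoritative field list.

```yaml
# Hypothetical pipeline: embed each Kafka record with OpenAI, then index it.
input:
  kafka_franz:
    seed_brokers: [ "localhost:9092" ]   # placeholder broker
    topics: [ "articles" ]               # placeholder topic
    consumer_group: "rag-embedder"

pipeline:
  processors:
    - openai_embeddings:
        api_key: "${OPENAI_API_KEY}"
        model: text-embedding-3-small    # placeholder model

output:
  mongodb:
    url: "mongodb://localhost:27017"     # placeholder vector store
    database: rag
    collection: embeddings
    operation: insert-one
    document_map: |
      root.embedding = this
```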

Diagram: the RAG demo app featuring Redpanda Connect's OpenAI processor alongside MongoDB for vector storage, LangChain for orchestration, and Redpanda for streaming events

Amazon Bedrock

“Hey Alexa, where can I find a bunch of foundation models that are tightly integrated with AWS services?”

If only Alexa had the answer! But that’s exactly what Bedrock does: it gives you access to foundation models without you having to manage the underlying infrastructure, making it a natural fit for streaming workflows that need content generation, summarization, or intelligent chatbots.

Redpanda Connect offers two processors for Bedrock:

  • aws_bedrock_chat: generate text with Bedrock foundation models
  • aws_bedrock_embeddings: create vector embeddings with Bedrock models

In a recent demo, we streamed support tickets into Bedrock models to summarize issues and generate draft responses in real time. If you’re curious, we recorded it.
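A ticket-summarization pipeline along those lines could be sketched as below. The model ID, topics, and prompt are illustrative only; verify field names against the `aws_bedrock_chat` reference.

```yaml
# Hypothetical pipeline: summarize each support ticket with a Bedrock model.
input:
  kafka_franz:
    seed_brokers: [ "localhost:9092" ]           # placeholder broker
    topics: [ "support-tickets" ]                # placeholder topic
    consumer_group: "ticket-summarizer"

pipeline:
  processors:
    - aws_bedrock_chat:
        model: anthropic.claude-3-haiku-20240307-v1:0  # placeholder model ID
        prompt: "Summarize this support ticket in two sentences: ${! content() }"

output:
  kafka_franz:
    seed_brokers: [ "localhost:9092" ]
    topic: "ticket-summaries"                    # placeholder topic
```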

Watch the webcast: Build GenAI apps with serverless architecture

Vertex AI

Vertex AI is designed for structured machine learning at scale. (Managed GPUs or TPUs? Yes, please!) 

It’s also currently the only managed service where you can access Google’s Gemini models. Gemini models provide the latest in large language model capabilities, tightly integrated with Vertex’s structured ML pipelines. These state-of-the-art language models make Vertex AI ideal for pipelines that require continuous predictions, anomaly detection, or personalization based on streaming data.

Redpanda Connect includes two processors for Vertex AI:

  • gcp_vertex_ai_chat: generate text with Gemini models on Vertex AI
  • gcp_vertex_ai_embeddings: create vector embeddings with Vertex AI models

Whether you’re delivering real-time recommendations, running predictive analytics, or experimenting with the cutting-edge Gemini models, Vertex AI makes it possible — all fully integrated into streaming workflows with Redpanda Connect.
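For instance, enriching a stream with a Gemini classification might look roughly like this fragment. The project ID, region, model name, and prompt are assumptions for illustration; consult the `gcp_vertex_ai_chat` reference for the exact fields.

```yaml
# Hypothetical fragment: classify streaming records with Gemini on Vertex AI.
pipeline:
  processors:
    - gcp_vertex_ai_chat:
        project: my-gcp-project        # placeholder project ID
        location: us-central1          # placeholder region
        model: gemini-1.5-flash        # placeholder model
        prompt: "Classify the sentiment of this review as positive, negative, or neutral: ${! content() }"
```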

Cohere

Cohere excels at embeddings and semantic search, which makes it perfect for pipelines that need to extract meaning from massive streams of text. (Basically, you could analyze every battle rap ever written and figure out who’s really spitting the hardest bars.) 

Redpanda Connect supports three Cohere processors:

  • cohere_chat: generate text with Cohere’s Command models
  • cohere_embeddings: create vector embeddings from text
  • cohere_rerank: re-rank documents by relevance to a query

Cohere shines in real-time semantic search, NLP-heavy agent workflows, or embeddings pipelines, allowing you to extract meaning and context from large streams of text effortlessly.
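A minimal embeddings fragment for a semantic search pipeline might look like the following sketch; the model name is a placeholder, and field names should be verified against the `cohere_embeddings` reference.

```yaml
# Hypothetical fragment: embed streaming text for semantic search with Cohere.
pipeline:
  processors:
    - cohere_embeddings:
        api_key: "${COHERE_API_KEY}"
        model: embed-english-v3.0      # placeholder model
```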

Ollama

Ollama is your local-first, full-control AI option that's perfect for privacy-sensitive or on-premises workflows. Of course, we have a RAG example here too, and are more than happy to share it with you. (See the cookbook example in our docs.)

Redpanda Connect provides three Ollama processors:

  • ollama_chat: generate text with locally hosted models
  • ollama_embeddings: create vector embeddings with local models
  • ollama_moderation: check generated responses with safety models such as Llama Guard

This setup is perfect when you need full sovereignty over data and models, while still leveraging the power of real-time AI streaming, complete with moderation guardrails.
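A local-first pipeline along these lines could be sketched as follows, generating responses against a locally running Ollama server. The model name and server address are assumptions (the address shown is Ollama's default); check the `ollama_chat` reference for the full field list.

```yaml
# Hypothetical local-first pipeline: generate text via a local Ollama server.
input:
  stdin: {}

pipeline:
  processors:
    - ollama_chat:
        server_address: http://127.0.0.1:11434  # Ollama's default port
        model: llama3.1                         # placeholder local model
        prompt: "Answer briefly: ${! content() }"

output:
  stdout: {}
```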

Recapping the Redpanda Connect AI lineup

Choose the connector family that fits your workload, whether that’s broad flexibility, foundation models, Gemini access, NLP pipelines, or complete control over your data.

| Connector | Best for |
| --- | --- |
| OpenAI | Broad, flexible, multi-modal pipelines |
| Bedrock | Foundation model workflows at scale |
| Vertex AI | Structured ML + streaming predictions + Gemini access |
| Cohere | NLP pipelines and embedding-heavy workflows |
| Ollama | Full data/model sovereignty and secure local-first processing |

Stream AI with Redpanda Connect today

This crop of AI components is officially ready for action. Apache 2.0 means you can integrate, extend, and experiment freely. Embeddings, RAG pipelines, generative content, or local-first models. You name it, you can stream it.

Go ahead: stream the freshest context to your AI, prototype your next experiment, and watch both your team and your agents hum right along. Redpanda Connect AI connectors are open source, and they’re your ticket to streaming AI at full speed.

Explore the entire Redpanda Connect catalog, or get started quickly with pipelines on Redpanda Serverless. Happy streaming!

Mike Broberg
Author