Trying out Kong Konnect AI Gateway

Last updated: April 28, 2026

Recently I came across a Kong billboard that might seem a bit odd if you’ve never had a dog… but for those of us who have, it’s a funny and unexpected take 🙂

It reminded me about API management and my experience with Oracle Integration Cloud (OIC). Back then, we had Apiary (for API design) through acquisition, along with a mix of API Portal and Catalog solutions that felt quite disconnected and were not competitive at the time. These have since evolved into more native OIC capabilities, including a proper API Catalog and Portal, which definitely improved the experience.

I decided to explore what Kong has to offer in this space.

Exploring Kong Konnect

I just spent some time diving into the Kong Konnect portal, and honestly, the UI design is very nice: simple and intuitive. In particular, I love the fact that on any of the connectivity options it immediately shows a diagram visualizing the end-to-end (E2E) connectivity.

Look at this example from the Event Gateway landing page:

So the main options are:

  • API Gateway: The main entry point. It handles incoming traffic, secures it, routes it, and makes APIs easy to use.
  • AI Gateway: A layer for working with LLMs. It centralizes API keys and helps control usage.
  • Event Gateway: Makes event streams (like Kafka) easier to use by treating them more like APIs.
  • Service Mesh: Manages communication between internal services, keeping it secure and reliable so failures don’t spread.

I came here with the intention of testing the API Gateway, but I must say the AI Gateway piqued my curiosity 🙂

AI Gateway

Again, we can see the E2E diagram that clearly explains the flow, with the AI Gateway sitting in the middle.

I could start from scratch, but there is also the option to start with a demo, which I will use for now.

Then we get the screen below with clear steps on what we need to set up.

#1 Connect LLM

Next, you configure which provider and model will handle the request:

  • Choose a provider (OpenAI, Cohere, etc.) – the list is extensive
  • Select the model (for Cohere I chose command-a-03-2025)
  • Add your API key

At this point, Kong acts as a proxy in front of the LLM, abstracting direct access.
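
Under the hood this maps to Kong’s ai-proxy plugin. Just to make the idea concrete, here is a minimal declarative sketch of what that wiring could look like for the Cohere setup above; the service and route names and the key placeholder are illustrative, so check the exact fields against the ai-proxy documentation for your gateway version.

# Minimal sketch, not the exact config Konnect generates:
# a service/route whose traffic is taken over by the ai-proxy plugin.
_format_version: "3.0"
services:
  - name: ai-demo                        # hypothetical service name
    url: http://localhost:32000          # dummy upstream; ai-proxy handles the real call
    routes:
      - name: ai-demo-route              # hypothetical route name
        paths: ["/test"]
    plugins:
      - name: ai-proxy
        config:
          route_type: llm/v1/chat
          auth:
            header_name: Authorization
            header_value: Bearer <COHERE_API_KEY>   # stored on the gateway, never in the client
          model:
            provider: cohere
            name: command-a-03-2025

The key point is that the provider credentials live on the gateway, so client applications only ever see the gateway URL.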

#2 Add a plugin

Plugins in this context offer extra control (which is the main reason to use an AI Gateway in the first place).

You can apply rate limiting to prevent excessive or costly usage, enforce authentication to ensure only authorized clients can access the models, and enable logging and observability to track prompts and responses for monitoring or debugging.
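
As a concrete example of the first point, rate limiting can be attached to the same route with the standard rate-limiting plugin. This is only a sketch with an arbitrary limit, reusing the hypothetical route name from the config above:

# Sketch: cap calls to the LLM route at 20 requests per minute.
plugins:
  - name: rate-limiting
    route: ai-demo-route        # hypothetical route name from the earlier sketch
    config:
      minute: 20
      policy: local             # in-node counters; use redis or cluster across multiple nodes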

On top of that, we can enforce guardrails and transform prompts, all without changing the application logic that calls the gateway. Separation of duties and enforced governance.
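
The “modified plugin prompt” that shows up in the test response further down is an example of exactly this: prompt decoration applied by the gateway. A sketch of what an ai-prompt-decorator configuration along those lines could look like (the exact system prompt wording here is illustrative):

# Sketch: inject a system message in front of every prompt that passes
# through the gateway, without the client application changing anything.
plugins:
  - name: ai-prompt-decorator
    route: ai-demo-route        # hypothetical route name from the earlier sketch
    config:
      prompts:
        prepend:
          - role: system
            content: "You are a helpful assistant. Always end your answer with a Chuck Norris quote."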

#3 Test your setup

And that’s it. I can test the AI Gateway with the cURL command below.

curl -X POST https://xxxxxx.eu.serverless.gateways.konggateway.com/test \
  -H 'Content-Type: application/json' \
  -d '{
    "messages": [
      {
        "role": "user",
        "content": "How does Kong AI Gateway work?"
      }
    ]
  }'

Below is the response, which does comply with my modified plugin prompt 🙂

{
  "created": 1777373995,
  "object": "chat.completion",
  "model": "command-a-03-2025",
  "choices": [
    {
      "finish_reason": "stop",
      "index": 0,
      "message": {
        "role": "assistant",
        "content": "Kong AI Gateway is a powerful tool designed to integrate AI capabilities into API management.

It acts as a middleware layer that sits between clients and backend services, enabling the application of AI-driven features such as traffic management, security, and analytics.

Here’s a high-level overview of how it works:

1. API Traffic Routing:
Kong AI Gateway routes incoming API requests to the appropriate backend services based on predefined rules and policies.
It can dynamically adjust routing based on AI-driven insights, such as load balancing or user behavior patterns.

2. AI-Powered Plugins:
Kong leverages plugins that can be enhanced with AI capabilities.
These plugins can perform tasks like rate limiting, authentication, logging, and more.
With AI integration, these plugins can make smarter decisions, such as detecting anomalies or optimizing performance in real time.

3. Security Enhancements:
The gateway can employ AI to detect and mitigate security threats, such as DDoS attacks, SQL injection, or unauthorized access attempts.
Machine learning models can analyze traffic patterns to identify and block malicious activity.

4. Analytics and Monitoring:
Kong AI Gateway can collect and analyze API usage data, providing insights into performance, user behavior, and potential bottlenecks.
AI algorithms can predict trends, recommend optimizations, and generate actionable reports.

5. Dynamic Policy Enforcement:
AI enables the gateway to enforce policies dynamically based on context.
For example, it can adjust rate limits or access controls in response to changing conditions, such as increased traffic or suspicious activity.

6. Integration with AI Services:
Kong can integrate with external AI services or models to enhance its capabilities.
This allows developers to leverage pre-trained models or custom AI solutions for specific use cases.

By combining Kong’s robust API management features with AI-driven intelligence, the Kong AI Gateway provides a flexible and scalable solution for modern API ecosystems.

Chuck Norris quote:
\"Chuck Norris doesn't use APIs. APIs use Chuck Norris.\""
      }
    }
  ]
}

Note the Chuck Norris quote at the end 🙂

What impressed me most is how simple it was to get started. Within a short time, I had a working setup that sits in front of an LLM, applies policies, and gives me a controlled entry point. That’s a powerful building block.
