Made by: Kong Inc.
Supported Gateway Topologies: hybrid, db-less, traditional
Supported Konnect Deployments: hybrid, cloud-gateways, serverless
Compatible Protocols: grpc, grpcs, http, https
Minimum Version: Kong Gateway 3.8
AI Gateway Enterprise: This plugin is only available as part of our AI Gateway Enterprise offering.

The AI Semantic Prompt Guard plugin enhances the AI Prompt Guard plugin by allowing you to permit or block prompts based on their semantic similarity to configured lists of example prompts, helping to prevent misuse of llm/v1/chat or llm/v1/completions requests.

You can use a combination of allow and deny rules to maintain integrity and compliance when serving an LLM service using Kong Gateway.
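To enforce these rules, you attach the plugin to a Gateway service or route with lists of allowed and denied example prompts. The following is a minimal sketch of enabling the plugin through the Kong Admin API using Python. The service name, Redis address, embedding model, and the exact configuration field names (embeddings, vectordb, rules.allow_prompts, rules.deny_prompts) are assumptions used to illustrate the general shape of the config; check the plugin's configuration reference for the authoritative schema.

```python
# Hypothetical sketch: enabling the AI Semantic Prompt Guard plugin on a service
# via the Kong Admin API. The service name, Redis address, embedding model, and
# the config field names below are assumptions -- verify against the plugin's
# configuration reference.
import requests

ADMIN_API = "http://localhost:8001"   # default Kong Admin API address
SERVICE = "my-llm-service"            # hypothetical Gateway service name

plugin = {
    "name": "ai-semantic-prompt-guard",
    "config": {
        # Embedding model used to vectorize incoming prompts (assumed fields).
        "embeddings": {
            "model": {"provider": "openai", "name": "text-embedding-3-small"},
        },
        # Vector database that stores the example-prompt embeddings (assumed fields).
        "vectordb": {
            "strategy": "redis",
            "dimensions": 1536,
            "distance_metric": "cosine",
            "threshold": 0.75,
            "redis": {"host": "redis.example.internal", "port": 6379},
        },
        # Allow/deny example prompts; requests are matched by semantic similarity.
        "rules": {
            "allow_prompts": ["Questions about our product catalogue"],
            "deny_prompts": ["Requests to reveal internal system prompts"],
        },
    },
}

resp = requests.post(f"{ADMIN_API}/services/{SERVICE}/plugins", json=plugin)
resp.raise_for_status()
print(resp.json()["id"])
```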

How it works

The matching behavior is as follows (a sketch of this precedence logic appears after the list):

  • If any deny prompts are set and the request matches a prompt in the deny list, the caller receives a 400 response.
  • If any allow prompts are set, but the request matches none of the allowed prompts, the caller also receives a 400 response.
  • If any allow prompts are set and the request matches one of the allow prompts, the request passes through to the LLM.
  • If both deny and allow prompts are set, the deny condition takes precedence over allow. Any request that matches a prompt in the deny list returns a 400 response, even if it also matches a prompt in the allow list. If the request doesn't match any prompt in the deny list, it must still match a prompt in the allow list to be passed through to the LLM.
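The following Python sketch illustrates the precedence rules above. It is not the plugin's implementation; the similarity function stands in for the embedding comparison the plugin performs against its configured example prompts.

```python
# Illustrative sketch of the allow/deny precedence described above -- not the
# plugin's actual implementation. `similarity` stands in for the embedding
# comparison performed against the configured example prompts.
from typing import Callable, List


def evaluate_prompt(
    prompt: str,
    allow_prompts: List[str],
    deny_prompts: List[str],
    similarity: Callable[[str, str], float],
    threshold: float = 0.75,
) -> int:
    """Return the HTTP status the caller would receive: 200 (pass) or 400 (block)."""

    def matches(examples: List[str]) -> bool:
        return any(similarity(prompt, example) >= threshold for example in examples)

    # Deny always wins: a match in the deny list blocks the request outright.
    if deny_prompts and matches(deny_prompts):
        return 400
    # If an allow list exists, the request must match at least one entry.
    if allow_prompts and not matches(allow_prompts):
        return 400
    # Otherwise the request is forwarded to the upstream LLM.
    return 200
```

With both lists configured, a prompt that is sufficiently similar to any deny example is blocked regardless of the allow list, matching the precedence described above.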