Overview
The OpenAPI connector allows you to use external REST APIs, described by OpenAPI (Swagger) specifications, as LLM functions. The LLM can then call these APIs automatically as functions to retrieve data or perform actions.

Info: OpenAPI specifications can be imported in JSON or YAML format. Endpoints are automatically detected and made available as LLM functions.
Features Overview
- Import OpenAPI Specifications: Support for JSON and YAML formats
- Automatic Endpoint Detection: All endpoints are automatically extracted from the specification
- Authentication: Support for API Key, Bearer Token, and Basic Auth
- Test Interface: Built-in interface for testing API calls
- Group Management: Access control via groups
Creating an OpenAPI Configuration
When creating a new OpenAPI configuration, the following options are available:

1. Title

A freely selectable name to identify the configuration, e.g., GitHub API or Stripe API.

2. OpenAPI Specification
The complete OpenAPI specification in JSON or YAML format. This can be:
- Entered directly
- Imported from a URL
- Uploaded from a file
Note: The specification must comply with the OpenAPI 3.0 or Swagger 2.0 standard.
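For illustration, a minimal specification might look like the following (a hypothetical Weather API used purely as an example; any valid OpenAPI 3.0 document is imported the same way):

```yaml
openapi: 3.0.0
info:
  title: Weather API        # hypothetical example API
  version: 1.0.0
paths:
  /weather:
    get:
      operationId: getCurrentWeather
      summary: Returns the current weather for a city
      parameters:
        - name: city
          in: query
          required: true
          schema:
            type: string
      responses:
        "200":
          description: Current weather data
```

After import, the single operation in this example (getCurrentWeather) would appear as one LLM function.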
3. Authentication
Supported authentication types:

- No Authentication (none): For public APIs without authentication
- API Key (apiKey):
  - API Key Name: Name of the API key (e.g., X-API-Key)
  - API Key Value: The actual API key value
  - Location: Header or Query parameter
- Bearer Token (bearer):
  - Token: The Bearer token value
  - Automatically sent in the Authorization: Bearer <token> header
- Basic Auth (basic):
  - Username: Username for Basic Authentication
  - Password: Password for Basic Authentication
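As an illustrative sketch only (the field names below are not the product's exact configuration schema, and all values are placeholders), an API Key configuration corresponds roughly to:

```yaml
authentication:
  type: apiKey
  apiKeyName: X-API-Key          # header or query parameter name
  apiKeyValue: "<your-api-key>"  # placeholder value
  location: header               # or: query
# Bearer Token is sent instead as:  Authorization: Bearer <token>
# Basic Auth is sent instead as:    Authorization: Basic <base64(username:password)>
```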
4. Groups
Selection of which groups have access to this OpenAPI configuration. Only users from authorized groups can use the endpoints as LLM functions.

Managing Endpoints
After importing an OpenAPI specification, all endpoints are automatically detected and extracted. Each endpoint is made available as a separate LLM function.

Endpoint Details
Each endpoint contains:
- Name: The function name for the LLM
- Description: Description of the function for the LLM
- HTTP Method: GET, POST, PUT, DELETE, PATCH
- Path: The API endpoint path
- Parameters: All parameters (Query, Header, Path, Body) with their schemas
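As a rough sketch, the hypothetical getCurrentWeather operation from the example above would be exposed to the LLM along these lines (shown in YAML for readability; the exact internal representation may differ):

```yaml
name: getCurrentWeather
description: Returns the current weather for a city
method: GET
path: /weather
parameters:
  type: object
  properties:
    city:
      type: string
      description: Query parameter "city"
  required:
    - city
```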
Testing Endpoints
You can test each endpoint directly via the integrated test interface:
- Select an endpoint
- Fill in the required parameters
- Click “Test”
- The response is displayed with status code, headers, and body
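For example, a successful test of the hypothetical getCurrentWeather endpoint could display a result like this (all values illustrative):

```yaml
status: 200
headers:
  content-type: application/json
body:
  city: Berlin
  temperature: 18.5
  conditions: partly cloudy
```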
Providing OpenAPI Endpoints as Connectors
After creating an OpenAPI configuration and importing endpoints, you can use these endpoints as connectors in your assistants.

Creating a Connector
- Edit Assistant: Open the assistant configuration
- Add Connector: Add a new connector
- Select OpenAPI Endpoints: Select the desired endpoints from the list
  - You can select endpoints from different OpenAPI configurations
  - Each endpoint is provided as a separate LLM function
- Save: Save the connector configuration
Info: OpenAPI endpoints are automatically converted to LLM functions. The LLM can call these functions during conversation to retrieve data or perform actions.
Endpoint Selection
- Multiple Selection: You can select multiple endpoints from different OpenAPI configurations
- Search: Use the search function to quickly find endpoints
- Filtering: Endpoints are displayed grouped by OpenAPI configuration
How It Works
When an assistant is configured with OpenAPI endpoints:
- The endpoints are provided to the LLM as available functions
- The LLM can call these functions based on the user’s request
- The API calls are forwarded to the corresponding REST API
- The responses are provided to the LLM and integrated into the conversation
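Sketched end to end with the hypothetical Weather API from above (an illustrative flow, not an exact trace of the product's internals):

```yaml
# 1. User asks: "What's the weather in Berlin?"
# 2. The LLM chooses one of the available functions and calls it:
function_call:
  name: getCurrentWeather
  arguments:
    city: Berlin
# 3. The connector forwards the call to the REST API: GET /weather?city=Berlin
# 4. The API response is returned to the LLM as the function result:
function_result:
  city: Berlin
  temperature: 18.5
  conditions: partly cloudy
# 5. The LLM uses the result to answer the user in natural language.
```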
