To use AI models, you must first configure connections to LLM providers. Multiple providers can be connected simultaneously, so users can choose from a variety of models. Configure connections under Admin > Settings > Connections.
*Screenshot: Connections settings tab*

## OpenAI API Connection

Connect to OpenAI or any OpenAI-compatible API (Azure OpenAI, AI Foundry, etc.). You can configure multiple connections to distribute load.
1. **Add a connection** — Click the **+** button in the OpenAI API section.
2. **Enter connection info** — In the connection modal, enter the API Base URL and API Key.
3. **Verify the connection** — After saving, confirm that the model list refreshes automatically.
| Setting | Description |
| --- | --- |
| API Base URL | e.g. `https://api.openai.com/v1` |
| API Key | Key issued by OpenAI (`sk-...`) |
| Prefix ID | Optional model ID prefix; prevents ID collisions when multiple connections are configured |
| Model IDs | Manually specified model IDs; fetched automatically if left blank |
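As a quick sanity check outside the admin UI, the same Base URL and API Key can be tested against the provider's standard `/models` endpoint. A minimal sketch (the URL and key below are placeholders, not real credentials):

```python
import json
import urllib.request

def models_endpoint(base_url: str) -> str:
    """Join the API Base URL with the standard /models path."""
    return base_url.rstrip("/") + "/models"

def list_models(base_url: str, api_key: str) -> list[str]:
    """Fetch model IDs from an OpenAI-compatible /models endpoint."""
    req = urllib.request.Request(
        models_endpoint(base_url),
        headers={"Authorization": f"Bearer {api_key}"},
    )
    with urllib.request.urlopen(req) as resp:
        return [m["id"] for m in json.load(resp).get("data", [])]

# Example (requires a real key):
# list_models("https://api.openai.com/v1", "sk-...")
```

If this call succeeds from the server hosting the application, the same values should work in the connection modal.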

## Common Connection Settings

Each connection supports these common features:
| Feature | Description |
| --- | --- |
| Verify Connection | Test API access via the verify button in the settings modal |
| Prefix ID | Distinguishes models as `{prefix}.{model}` when names collide across connections |
| Activate/Deactivate | Per-connection toggle to pause a connection without deleting it |
| Model IDs | When specified manually, only those models are exposed; fetched automatically if blank |
Adding multiple connections to the same provider automatically merges their model lists. For example, adding two Azure OpenAI connections (US and Korea regions) shows models from both regions; use a Prefix ID to tell them apart.
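The merge-and-prefix behavior described above can be sketched as follows (the connection dictionaries are illustrative, not the application's actual internal format):

```python
def merge_models(connections: list[dict]) -> list[str]:
    """Merge model lists from several connections, applying each
    connection's optional Prefix ID as {prefix}.{model}."""
    merged = []
    for conn in connections:
        prefix = conn.get("prefix_id")
        for model in conn["models"]:
            merged.append(f"{prefix}.{model}" if prefix else model)
    return merged

# Two Azure OpenAI connections exposing the same model name:
conns = [
    {"prefix_id": "us", "models": ["gpt-4o"]},
    {"prefix_id": "kr", "models": ["gpt-4o"]},
]
# merge_models(conns) → ["us.gpt-4o", "kr.gpt-4o"]
```

Without the prefixes, both connections would expose an identical `gpt-4o` entry, which is exactly the collision the Prefix ID setting prevents.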

## Ollama API Connection

Connect to an Ollama instance running locally or on an internal server.
| Setting | Description | Example |
| --- | --- | --- |
| Base URL | Ollama server address | `http://localhost:11434` |
You can add multiple servers to balance load. Because Ollama runs entirely on your own infrastructure with no external network access, it provides complete data privacy.
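Load balancing across several Ollama servers can be as simple as round-robin selection. A minimal sketch (the server addresses are examples, and the real application may use a different strategy):

```python
from itertools import cycle

# Example Ollama servers, each added as a separate connection.
servers = ["http://ollama-1:11434", "http://ollama-2:11434"]
picker = cycle(servers)  # endless round-robin iterator

def next_server() -> str:
    """Return the next server in round-robin order."""
    return next(picker)

# next_server() → "http://ollama-1:11434"
# next_server() → "http://ollama-2:11434"
# next_server() → "http://ollama-1:11434"
```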

## Direct Connections

In addition to admin-configured connections, you can allow users to connect directly to LLMs using their own personal API keys.
| Setting | Description | Default |
| --- | --- | --- |
| Allow direct connections | Allow per-user personal API connections | Enabled |
When enabled, users can add OpenAI-compatible APIs in Personal Settings > Connections.
Direct connections support OpenAI-compatible APIs only (Ollama is not supported). Because these requests are made from the user's browser, the API server must have CORS configured correctly.
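CORS failures are a common reason a direct connection works from the command line but not in the browser. A simplified sketch of what the browser's preflight check effectively requires (the header names are standard CORS headers; the check itself is a rough approximation, not the browser's full algorithm):

```python
REQUIRED_HEADERS = {
    "Access-Control-Allow-Origin",
    "Access-Control-Allow-Headers",
    "Access-Control-Allow-Methods",
}

def cors_ok(preflight_headers: dict, origin: str) -> bool:
    """Rough check: does a preflight response allow calls from `origin`?"""
    if not REQUIRED_HEADERS.issubset(preflight_headers):
        return False
    allowed = preflight_headers["Access-Control-Allow-Origin"]
    return allowed in ("*", origin)

ok = {
    "Access-Control-Allow-Origin": "*",
    "Access-Control-Allow-Headers": "Authorization, Content-Type",
    "Access-Control-Allow-Methods": "GET, POST, OPTIONS",
}
# cors_ok(ok, "https://chat.example.com") → True
# cors_ok({}, "https://chat.example.com") → False
```

If a user's direct connection fails with a network error in the browser console, inspecting the preflight (OPTIONS) response headers against this list is a good first step.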

## Cloud Accounts

Configure global cloud service accounts. The Service Account Keys set here serve as a fallback when individual features (TTS, STT, image generation) do not have their own keys configured.
| Provider | Settings |
| --- | --- |
| Google Cloud | Service Account Key (JSON) |
Leaving Service Account Key empty uses Application Default Credentials (ADC).

## Image Attachment Mode

Configure how images attached to chats are processed and stored.
| Setting | Description | Default |
| --- | --- | --- |
| Image Upload Mode | Base64 Inline or Cloud Storage | Base64 Inline |
| Storage Provider | Local, AWS S3, Azure Blob, Google Cloud Storage | Local |
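In Base64 Inline mode, the image bytes are embedded directly in the chat payload as a `data:` URL rather than stored separately. A sketch of that encoding (the helper name is illustrative, not the application's actual function):

```python
import base64

def inline_image(data: bytes, mime: str = "image/png") -> str:
    """Encode raw image bytes as a data: URL (Base64 Inline mode)."""
    return f"data:{mime};base64," + base64.b64encode(data).decode("ascii")

# inline_image(b"\x89PNG") → "data:image/png;base64,iVBORw=="
```

Inline mode keeps everything in the database, which is simple but grows chat payloads quickly; Cloud Storage mode stores only a reference instead.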
When AWS S3 is selected as the storage provider, configure the following:

| Setting | Description |
| --- | --- |
| Bucket Name | S3 bucket name |
| Region | AWS region |
| Endpoint URL | Custom endpoint (optional) |
| Access Key ID | Authentication key ID |
| Secret Access Key | Secret key |
| Key Prefix | File path prefix (optional) |
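The optional Key Prefix simply namespaces uploaded files inside the bucket. A sketch of the resulting object key (the helper is illustrative; the application's actual key layout may differ):

```python
def object_key(key_prefix: str, filename: str) -> str:
    """Build the S3 object key, honoring the optional Key Prefix setting."""
    key_prefix = key_prefix.strip("/")
    return f"{key_prefix}/{filename}" if key_prefix else filename

# object_key("chat-uploads", "photo.png") → "chat-uploads/photo.png"
# object_key("", "photo.png") → "photo.png"
```

A prefix is useful when one bucket is shared by several applications or environments.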

## Shared Storage (file uploads)

Configure where uploaded files such as documents and PDFs are stored. The storage provider options are the same as in Image Attachment Mode.
| Setting | Description | Default |
| --- | --- | --- |
| File Storage Provider | Local, AWS S3, Azure Blob, Google Cloud Storage | Local |
After changing storage settings, always verify connection state with the Test button.

- **Model Management** — Activate/deactivate connected models and configure Workspace models.
- **Documents** — Embedding engine and RAG pipeline settings.
- **Code Gateway** — LLM proxy settings for coding tools.