Overview
This guide walks you through connecting OpenAI (Standard) to Matrix using OpenClaw. You'll learn how to configure authentication, set up the channel, and deploy your AI assistant.
OpenAI (Standard) Features
- Standard OpenAI access
- GPT-4o models
- Simple setup
- Wide compatibility
Matrix Capabilities
- Direct messages
- Group chats
- Media support
- Reactions
- Thread support
Step 1: Configure OpenAI (Standard)
OpenAI (Standard) authenticates with a plain API key:
- Create account at platform.openai.com
- Generate API key
- Set OPENAI_API_KEY environment variable
Environment variable: OPENAI_API_KEY
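As an optional sanity check before wiring the key into OpenClaw (assuming a Unix-like shell), you can export it and hit the OpenAI models endpoint; the key value below is a placeholder:

```bash
# Placeholder key: replace with the API key generated at platform.openai.com
export OPENAI_API_KEY="sk-your-key-here"

# A successful JSON response from the models endpoint confirms the key works
curl -s https://api.openai.com/v1/models \
  -H "Authorization: Bearer $OPENAI_API_KEY"
```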
Step 2: Configure Matrix
- Create a Matrix account on your preferred homeserver
- Generate an access token for the bot account (see the login sketch after this list)
- Configure the matrix channel in openclaw.json
- Start the gateway
- Invite the bot to rooms
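One common way to generate the access token is the Matrix client-server login endpoint. This is a minimal sketch, assuming password login is enabled on your homeserver and the bot account is @mybot:matrix.org; the homeserver URL and credentials are placeholders:

```bash
# Log in as the bot account; the JSON response includes an "access_token" field
curl -s -X POST https://matrix.org/_matrix/client/v3/login \
  -H "Content-Type: application/json" \
  -d '{
        "type": "m.login.password",
        "identifier": { "type": "m.id.user", "user": "mybot" },
        "password": "REPLACE_WITH_BOT_PASSWORD"
      }'

# Export the returned token so the gateway can pick it up
export MATRIX_ACCESS_TOKEN="syt_replace_with_returned_token"
```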
Step 3: Combined Configuration
Add both configurations to your openclaw.json:
```json
{
  "agents": {
    "defaults": {
      "model": {
        "primary": "openai/gpt-4o"
      }
    }
  },
  "models": {
    "providers": {
      "openai": {
        "apiKey": "${OPENAI_API_KEY}"
      }
    }
  },
  "channels": {
    "matrix": {
      "homeserver": "https://matrix.org",
      "accessToken": "${MATRIX_ACCESS_TOKEN}",
      "userId": "@mybot:matrix.org"
    }
  }
}
```
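The ${...} placeholders assume OpenClaw substitutes environment variables when it reads openclaw.json, so both variables must be set in the environment that runs the gateway. A small check you can run first (the indirect variable expansion assumes bash, not plain sh):

```bash
# Report which of the referenced variables are set; fix any that print MISSING
for v in OPENAI_API_KEY MATRIX_ACCESS_TOKEN; do
  if [ -n "${!v}" ]; then echo "$v is set"; else echo "$v is MISSING"; fi
done
```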
Step 4: Start the Gateway
```bash
# Start the gateway
openclaw gateway start
# Check status
openclaw status
# View logs
openclaw logs --follow
```
Access Control
Matrix supports the following access control policies:
DM Policies
| Policy | Description |
|---|---|
| allowlist | Only senders in the allowFrom list are processed |
| pairing | Unknown senders receive a pairing code; an admin must approve them |
| open | All DMs are processed (requires allowFrom: ["*"]) |
Group Policies
| Policy | Description |
|---|---|
| allowlist | Only groups in groupAllowFrom are processed |
| open | All groups are processed |
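This guide does not show where these policies live in openclaw.json, so the following is only an illustrative sketch: allowFrom and groupAllowFrom come from the tables above, while the dmPolicy and groupPolicy key names and their placement under the matrix channel entry are assumptions, not confirmed configuration. Check OpenClaw's Matrix channel documentation for the exact keys before relying on this.

```json
{
  "channels": {
    "matrix": {
      "dmPolicy": "pairing",
      "allowFrom": ["@alice:matrix.org"],
      "groupPolicy": "allowlist",
      "groupAllowFrom": ["#team:matrix.org"]
    }
  }
}
```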
Deploy Options
Choose how to deploy your OpenAI (Standard) + Matrix setup; each option has its own guide:
- Local Deployment: run on your personal machine with local-only access
- VPS Deployment: always-on deployment on a Linux VPS
- Cloud Deployment: managed container deployment on Fly.io or similar
- Docker Deployment: containerized deployment with Docker
Frequently Asked Questions
How do I set up OpenAI (Standard) with Matrix?
Configure OpenAI (Standard) as your AI provider and enable Matrix as a channel in openclaw.json. The gateway then automatically routes Matrix messages to OpenAI (Standard) for processing.
Is OpenAI (Standard) a good choice for Matrix bots?
OpenAI (Standard) works well with Matrix. Its simple setup and thorough documentation make it well suited to privacy-focused teams and open-source communities.