Overview
This guide shows you how to connect OpenAI (Standard) to Nostr using OpenClaw. You'll configure the messaging channel, set up AI provider authentication, and deploy your assistant.
Nostr Capabilities
- Direct messages
- Reactions
OpenAI (Standard) Features
- Standard OpenAI access
- GPT-4o models
- Simple setup
- Wide compatibility
Step 1: Configure Nostr
- Generate a Nostr key pair
- Choose relay servers
- Configure the nostr channel in openclaw.json
- Start the gateway
- Publish your public key
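If you don't already have a key pair, the private-key half can be sketched as below. This is a minimal illustration, not a substitute for a proper Nostr key tool: a Nostr private key is 32 random bytes, conventionally shown as 64 hex characters, and deriving the matching public key (secp256k1) and the bech32 nsec/npub encodings is omitted here.

```python
import secrets

def generate_nostr_private_key() -> str:
    # A Nostr private key is 32 cryptographically random bytes,
    # rendered as 64 lowercase hex characters.
    return secrets.token_hex(32)

key = generate_nostr_private_key()
print(key)
```

Store the result in the NOSTR_PRIVATE_KEY environment variable referenced by the configuration below, and never commit it to version control.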
Step 2: Configure OpenAI (Standard)
Authentication uses a standard API key only:
- Create account at platform.openai.com
- Generate API key
- Set OPENAI_API_KEY environment variable
Environment variable: OPENAI_API_KEY
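For example, in the shell that will run the gateway (the sk-... value is a placeholder for your real key from platform.openai.com):

```shell
# Make the key available to the gateway process.
# Replace the placeholder with your actual API key.
export OPENAI_API_KEY="sk-..."
```

To persist it across sessions, add the export line to your shell profile or a secrets manager rather than typing it each time.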
Step 3: Combined Configuration
Add both configurations to your openclaw.json:
```json
{
  "agents": {
    "defaults": {
      "model": {
        "primary": "openai/gpt-4o"
      }
    }
  },
  "models": {
    "providers": {
      "openai": {
        "apiKey": "${OPENAI_API_KEY}"
      }
    }
  },
  "channels": {
    "nostr": {
      "privateKey": "${NOSTR_PRIVATE_KEY}",
      "relays": ["wss://relay.damus.io", "wss://nos.lol"]
    }
  }
}
```
Step 4: Start and Test
```shell
# Start the gateway
openclaw gateway start

# Check connection status
openclaw status

# View real-time logs
openclaw logs --follow
```
Access Control
Nostr supports the following access control policies:
DM Policies
| Policy | Description |
|---|---|
| allowlist | Only senders in the allowFrom list are processed |
| open | All DMs are processed (requires allowFrom: ["*"]) |
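As an illustration, an allowlist setup could look like the following. The dmPolicy and allowFrom field names and placement are assumptions based on the table above; check the OpenClaw channel reference for the exact schema.

```json
{
  "channels": {
    "nostr": {
      "privateKey": "${NOSTR_PRIVATE_KEY}",
      "relays": ["wss://relay.damus.io", "wss://nos.lol"],
      "dmPolicy": "allowlist",
      "allowFrom": ["npub1..."]
    }
  }
}
```

With this policy, DMs from any sender not listed in allowFrom are silently ignored.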
Group Policies
| Policy | Description |
|---|---|
| open | All groups are processed |
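By analogy with the DM policy, an open group policy might be expressed as follows; the groupPolicy key name is hypothetical, so verify it against the OpenClaw configuration schema before use.

```json
{
  "channels": {
    "nostr": {
      "groupPolicy": "open"
    }
  }
}
```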
Deploy Options
Choose how to deploy your Nostr + OpenAI (Standard) setup:
Local Deployment
Run on your personal machine with local-only access
VPS Deployment
Always-on deployment on a Linux VPS
Cloud Deployment
Managed container deployment on Fly.io or similar
Docker Deployment
Containerized deployment with Docker