
Deploy


Deploy your ConnectOnion agents to production. Docker, GCP, AWS - your choice.

Why deploy? Run agents 24/7, scale horizontally, integrate with existing infrastructure. Your agents become production services.

Two Deployment Options

co deploy (Easiest)

Quick deployment to ConnectOnion Cloud. Managed hosting, so there is no infrastructure for you to run.

  • One command deployment
  • Automatic HTTPS
  • Re-deploy updates same URL

Self-Host

Run on your own infrastructure with Docker, GCP, AWS, or any VPS.

  • Full control
  • Custom domains
  • Compliance requirements

60-Second Quick Start

Deploy your agent with the CLI - one command:

Terminal
# Create an agent project
co create my-agent

# Navigate to the project
cd my-agent

# Deploy to ConnectOnion Cloud
co deploy

Output
Deploying to ConnectOnion Cloud...
 
Project: my-agent
Secrets: 3 keys
 
Uploading...
Building...
 
Deployed!
Agent URL: https://my-agent-abc123.agents.openonion.ai

co deploy Requirements

  • Git repository with committed code
  • .co/config.toml (created by co create or co init)
  • Authenticated (co auth)

Configuration

.co/config.toml
# .co/config.toml
[project]
name = "my-agent"
secrets = ".env"

[deploy]
entrypoint = "agent.py"

Secrets

Secrets from .env are securely passed to your agent:

.env
# .env
OPENAI_API_KEY=sk-xxx
DATABASE_URL=postgres://...
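
In production, co deploy passes these values to your agent for you. For local runs you can load the same file yourself, for example with python-dotenv (an optional dependency, not something ConnectOnion requires):

local_dev.py
# Hypothetical helper: mirror production secrets when running the agent locally.
from dotenv import load_dotenv

load_dotenv()  # reads .env from the current directory into os.environ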

Note: URL format is {project_name}-{your_address[:10]}.agents.openonion.ai. Re-deploying updates the same URL.
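
Because the URL is plain string interpolation, you can predict it before deploying. A rough sketch of the format described in the note (the address below is a hypothetical placeholder; yours comes from co auth):

url_format.py
# Illustrates the {project_name}-{your_address[:10]} URL scheme from the note above.
project_name = "my-agent"
your_address = "3d4017c3e843895a92b70aa74d1b7ebc"  # hypothetical placeholder
url = f"https://{project_name}-{your_address[:10]}.agents.openonion.ai"
print(url)  # https://my-agent-3d4017c3e8.agents.openonion.ai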

Self-Host with host()

Deploy to your own infrastructure using host():

agent.py
# agent.py
from connectonion import Agent, host

agent = Agent("my-agent", tools=[my_tool])

# Export ASGI app for uvicorn/gunicorn
app = host.app(agent)

if __name__ == "__main__":
    host(agent)

Run with uvicorn/gunicorn

Terminal
# Direct
python agent.py

# Uvicorn
uvicorn agent:app --workers 4

# Gunicorn
gunicorn agent:app -w 4 -k uvicorn.workers.UvicornWorker

For full API reference, see host() documentation.
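
If you prefer starting the server from Python rather than from the shell, here is a minimal sketch using uvicorn's programmatic API. The port and worker count are arbitrary choices for illustration, not ConnectOnion defaults:

serve.py
# Hypothetical launcher for the ASGI app exported as `app = host.app(agent)` above.
import uvicorn

if __name__ == "__main__":
    # "agent:app" is an import string, which uvicorn requires when workers > 1
    uvicorn.run("agent:app", host="0.0.0.0", port=8000, workers=4)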

Docker Deployment

The most portable way to self-host - works anywhere Docker runs:

1. Create Your Agent

agent.py
# agent.py
from connectonion import Agent, host

def search(query: str) -> str:
    """Search for information."""
    return f"Results for: {query}"

agent = Agent(
    name="my-agent",
    tools=[search],
    system_prompt="You are a helpful assistant."
)

# Host the agent
host(agent)

2. Dockerfile

Dockerfile
FROM python:3.11-slim

WORKDIR /app

# Install dependencies
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy agent code
COPY agent.py .

# Run the agent
CMD ["python", "agent.py"]

3. Build & Run

Terminal
# Build the image
docker build -t my-agent .

# Run with API key
docker run -d \
  -e OPENAI_API_KEY=$OPENAI_API_KEY \
  --name my-agent \
  my-agent

Output
Building image...
Successfully built a1b2c3d4e5f6
Successfully tagged my-agent:latest
 
Container started: my-agent
Agent serving at: 0x3d4017c3e843895a...

Deploy to Google Cloud Run

Serverless deployment with automatic scaling:

1. Build & Push to Container Registry

Terminal
# Authenticate with GCP
gcloud auth login

# Set project
gcloud config set project YOUR_PROJECT_ID

# Build and push
gcloud builds submit --tag gcr.io/YOUR_PROJECT_ID/my-agent

2. Deploy to Cloud Run

Terminal
gcloud run deploy my-agent \
  --image gcr.io/YOUR_PROJECT_ID/my-agent \
  --platform managed \
  --region us-central1 \
  --set-env-vars "OPENAI_API_KEY=sk-..." \
  --allow-unauthenticated

Output
Deploying container to Cloud Run service [my-agent]...
Done.
 
Service URL: https://my-agent-abc123-uc.a.run.app
Agent address: 0x3d4017c3e843895a92b70aa74d1b7ebc...

Security Note

Use Secret Manager for API keys in production instead of environment variables.
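
For example, you can fetch the key at startup with the google-cloud-secret-manager client and expose it to the agent process. This is a minimal sketch: the project ID and secret name are assumptions, so adapt them to your setup.

secrets_loader.py
# Hypothetical helper: load OPENAI_API_KEY from GCP Secret Manager at startup.
import os

from google.cloud import secretmanager

def load_openai_key(project_id: str = "YOUR_PROJECT_ID") -> None:
    client = secretmanager.SecretManagerServiceClient()
    name = f"projects/{project_id}/secrets/OPENAI_API_KEY/versions/latest"
    response = client.access_secret_version(request={"name": name})
    # Expose the key to the agent without baking it into the container image
    os.environ["OPENAI_API_KEY"] = response.payload.data.decode("utf-8")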

Deploy to AWS

Multiple options depending on your needs:

EC2 / Lightsail

Simple VPS deployment. Best for always-on agents.

  • Full control over environment
  • Predictable pricing
  • Easy SSH access

ECS / Fargate

Container orchestration. Best for scaling.

  • Auto-scaling built in
  • Load balancing
  • Rolling deployments

EC2 Quick Deploy

Terminal
# SSH to your EC2 instance
ssh -i your-key.pem ubuntu@your-instance-ip

# Install dependencies
sudo apt update && sudo apt install -y python3-pip docker.io

# Clone and run
git clone https://github.com/your/agent-repo.git
cd agent-repo
docker build -t my-agent .
docker run -d -e OPENAI_API_KEY=$OPENAI_API_KEY my-agent

Environment Variables

Configure your agent for different environments:

Variable             Description                              Required
OPENAI_API_KEY       OpenAI API key for GPT models            Yes*
ANTHROPIC_API_KEY    Anthropic API key for Claude             Optional
GOOGLE_API_KEY       Google API key for Gemini                Optional
RELAY_URL            Custom relay server URL                  Optional
LOG_LEVEL            Logging verbosity (DEBUG, INFO, WARN)    Optional

* Or use ConnectOnion managed keys with co auth - no API keys needed!
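
If you manage keys yourself, a small startup check keeps a misconfigured deployment from failing silently. This is a sketch of application-side validation you add to your own code, not something ConnectOnion runs for you:

config_check.py
# Hypothetical startup check: configure logging from LOG_LEVEL and fail fast
# if a required key is missing.
import logging
import os
import sys

log_level = os.getenv("LOG_LEVEL", "INFO").upper()
logging.basicConfig(level=getattr(logging, log_level, logging.INFO))

if not os.getenv("OPENAI_API_KEY"):
    # Skip this check if you rely on ConnectOnion managed keys via `co auth`.
    sys.exit("OPENAI_API_KEY is not set")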

Best Practices

Security

  • Use secret managers for API keys
  • Never commit .co/ or .env
  • Use non-root container users
  • Enable HTTPS for all endpoints

Reliability

  • Add health checks to containers (see the sketch below)
  • Set up automatic restarts
  • Configure logging and monitoring
  • Use persistent volumes for keys
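
For the health-check item above, a probe along these lines can back a Docker HEALTHCHECK or an external monitor. The port and /health path are assumptions about how you expose the agent, so adjust them to whatever your deployment actually serves:

healthcheck.py
# Hypothetical liveness probe: exit non-zero if the agent stops responding.
import sys
import urllib.request

try:
    with urllib.request.urlopen("http://localhost:8000/health", timeout=5) as resp:
        sys.exit(0 if resp.status == 200 else 1)
except Exception:
    sys.exit(1)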

Ready to Deploy?

Your agents are production-ready. Ship them!