ConnectOnion
Make Agents Network-Accessible

Share Your AI Agents Over the Network

Call agent.serve() to make your agent accessible from anywhere. One line of code, cryptographic identity, zero configuration.

Why serve? Turn local agents into network services. Access specialized agents from anywhere, build distributed workflows, scale horizontally.

60-Second Quick Start

Create an agent and call .serve() - that's it:

serve_agent.py
from connectonion import Agent

def search(query: str) -> str:
    """Search for information."""
    return f"Results for: {query}"

agent = Agent("helper", tools=[search])

# Make it network-accessible
agent.serve()
Output:
Agent 'helper' serving at: 0x3d4017c3e843895a92b70aa74d1b7ebc9c982ccf2ec4968cc0cd55f12af4660c
Connected to relay: wss://oo.openonion.ai/ws/announce
Waiting for connections...

What Just Happened?

Generated Ed25519 keys → Saved to .co/keys/helper/
Connected to relay → WebSocket at wss://oo.openonion.ai/ws/announce
Announced presence → Published public key to relay
Started listening → Waiting for INPUT messages
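
If you're curious, you can inspect the generated key material yourself. The sketch below is not part of ConnectOnion: it assumes the public key PEM can be read with the cryptography package and that the printed address is simply the hex-encoded raw 32-byte Ed25519 public key.

# Illustration only: peek at the keys .serve() generated (assumptions noted above)
from cryptography.hazmat.primitives.serialization import (
    load_pem_public_key, Encoding, PublicFormat,
)

with open(".co/keys/helper/public_key.pem", "rb") as f:
    public_key = load_pem_public_key(f.read())

raw = public_key.public_bytes(Encoding.Raw, PublicFormat.Raw)  # 32 bytes
print("0x" + raw.hex())  # should match the address printed by agent.serve()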

Testing Your Served Agent

From another Python script, connect using the agent's address:

use_agent.py
from connectonion import connect

# Connect using the agent's address
remote = connect("0x3d4017c3e843895a92b70aa74d1b7ebc9c982ccf2ec4968cc0cd55f12af4660c")

# Use it like a local agent
result = remote.input("Search for Python docs")
print(result)
Output:
Results for: Python docs

Or from the terminal: start serving in Terminal 1 and connect from Terminal 2. Your agent is now a network service!

How It Works

Client                      Relay Server                 Your Agent
  |                              |                             |
  |--- INPUT message ----------->|                             |
  |                              |--- INPUT message ---------->|
  |                              |                             |
  |                              |                      [Process task]
  |                              |                             |
  |                              |<-- OUTPUT message ----------|
  |<-- OUTPUT message -----------|                             |
  |                              |                             |

INPUT Message

{ "type": "INPUT", "from": "0xclient...", "task": "Search for Python docs" }

OUTPUT Message

{ "type": "OUTPUT", "to": "0xclient...", "result": "Results for: ..." }

All messages are automatically signed with your agent's private key and verified by the relay.
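
agent.serve() and connect() handle this whole exchange (including the signing) for you. Purely to illustrate the flow above, and not as ConnectOnion's internal code, a hand-rolled agent loop might look roughly like this; it assumes the third-party websockets package and the message shapes shown, and the announce payload is a guess:

# Sketch of the relay flow -- agent.serve() already does this (plus signing) for you.
import asyncio
import json
import websockets  # assumed dependency for this illustration

async def toy_serve(my_address, handle_task,
                    relay_url="wss://oo.openonion.ai/ws/announce"):
    async with websockets.connect(relay_url) as ws:
        # Announce presence (the real payload and its signature are handled by ConnectOnion)
        await ws.send(json.dumps({"type": "ANNOUNCE", "address": my_address}))
        async for raw in ws:
            msg = json.loads(raw)
            if msg.get("type") == "INPUT":
                result = handle_task(msg["task"])  # [Process task]
                await ws.send(json.dumps({
                    "type": "OUTPUT",
                    "to": msg["from"],
                    "result": result,
                }))

# asyncio.run(toy_serve("0x...", lambda task: f"Results for: {task}"))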

Configuration

Default Relay (Production)

main.py
# Uses wss://oo.openonion.ai/ws/announce by default
agent.serve()

Custom Relay (Development)

main.py
# Connect to local relay server
agent.serve(relay_url="ws://localhost:8000/ws/announce")

Environment-Based

main.py
import os

relay_url = os.getenv(
    "RELAY_URL",
    "wss://oo.openonion.ai/ws/announce"
)

agent.serve(relay_url=relay_url)

Security

Ed25519 Cryptography

Every message is signed with your agent's private key. The relay verifies signatures to ensure authenticity.

# Automatic signing
message = {"type": "OUTPUT", "result": "..."}
signature = signing_key.sign(json.dumps(message))

# Relay verifies with public key
verify_key.verify(signature)  # Raises if invalid
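
You never call these APIs directly, but to see the primitive in isolation, here is a self-contained Ed25519 sign/verify demo. It uses PyNaCl purely as an assumption for illustration; ConnectOnion's own key handling may use a different library.

# Standalone Ed25519 demo (PyNaCl); not ConnectOnion's actual implementation.
import json
from nacl.signing import SigningKey
from nacl.exceptions import BadSignatureError

signing_key = SigningKey.generate()        # private key (keep secret)
verify_key = signing_key.verify_key        # public key

message = json.dumps({"type": "OUTPUT", "result": "..."}).encode()
signed = signing_key.sign(message)         # 64-byte signature + message

verify_key.verify(signed)                  # OK: returns the original message

try:
    verify_key.verify(b"tampered!" + signed[9:])  # corrupt the signature
except BadSignatureError:
    print("Invalid signature rejected")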

Key Storage

Keys are stored in .co/keys/{agent_name}/:

  • private_key.pem - Keep this secret! Never commit to git.
  • public_key.pem - Your agent's address, safe to share.
# Add to .gitignore
.co/

Complete Example

research_assistant.py
from connectonion import Agent

# Tool 1: Web search
def search(query: str) -> str:
    """Search the web."""
    import requests
    # Actual search implementation would go here
    return f"Search results for {query}"

# Tool 2: Save to file
def save_file(filename: str, content: str) -> str:
    """Save content to a file."""
    with open(filename, 'w') as f:
        f.write(content)
    return f"Saved to {filename}"

# Create agent
agent = Agent(
    name="research_assistant",
    tools=[search, save_file],
    system_prompt="You are a research assistant."
)

# Serve it
print(f"Starting {agent.name}...")
agent.serve()
Output:
Starting research_assistant...
Agent 'research_assistant' serving at: 0x7a8f9d4c2b1e3f5a6c8d9e0f1a2b3c4d5e6f7a8b9c0d1e2f3a4b5c6d7e8f9b2c
Connected to relay: wss://oo.openonion.ai/ws/announce
 
Keys saved to: .co/keys/research_assistant/
- private_key.pem
- public_key.pem
 
Waiting for connections...
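
From any other machine, this agent can be used exactly like the helper agent in the quick start. For example (the client filename is just for illustration; use the address printed by your own run):

use_research_assistant.py
from connectonion import connect

# Paste the address printed by research_assistant.py
remote = connect("0x7a8f9d4c2b1e3f5a6c8d9e0f1a2b3c4d5e6f7a8b9c0d1e2f3a4b5c6d7e8f9b2c")

result = remote.input("Research Python typing and save a summary to notes.txt")
print(result)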

Ready to Share Your Agents?

Just call agent.serve() and your agent goes live!