ConnectOnion
Use Remote Agents

Use Any Agent, Anywhere, As If Local

Call connect(address) to create a proxy to a remote agent. The proxy exposes the same interface as a local agent and works across networks.

Why connect? Access specialized agents from anywhere, build distributed workflows, scale horizontally across multiple machines.

60-Second Quick Start

Connect to a remote agent with one function call:

use_remote.py
from connectonion import connect

# Connect to a remote agent
remote_agent = connect("0x3d4017c3e843895a92b70aa74d1b7ebc9c982ccf2ec4968cc0cd55f12af4660c")

# Use it like a local agent
result = remote_agent.input("Search for Python documentation")
print(result)
Output:
I found extensive Python documentation at docs.python.org covering tutorials,
library reference, and language specifications.

What Just Happened?

Created proxy agent → Acts like a local Agent instance
Connected to relay → WebSocket at wss://oo.openonion.ai/ws/announce
Sent INPUT message → Routed to the remote agent
Received OUTPUT → Got the result back
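
For intuition, here is a rough sketch of that flow using the websockets package directly. The message shape shown (type, to, prompt, result fields) is an illustrative placeholder, not ConnectOnion's actual wire format; in practice connect() handles all of this for you behind the Agent-like interface.

# Illustration only - connect() does this for you; the field names are hypothetical
import asyncio
import json
import websockets  # third-party: pip install websockets

async def ask_remote(address: str, prompt: str) -> str:
    # Open a WebSocket to the relay (the same endpoint connect() uses)
    async with websockets.connect("wss://oo.openonion.ai/ws/announce") as ws:
        # Send a hypothetical INPUT message addressed to the remote agent
        await ws.send(json.dumps({"type": "INPUT", "to": address, "prompt": prompt}))
        # Wait for the OUTPUT message the relay routes back
        reply = json.loads(await ws.recv())
        return reply.get("result", "")

result = asyncio.run(ask_remote("0x7a8f...", "Search for Python documentation"))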

Complete Example: Two Terminals

Terminal 1: Start a Serving Agent

serve_agent.py
# serve_agent.py
from connectonion import Agent

def calculate(expression: str) -> str:
    """Perform calculations."""
    # NOTE: eval() is fine for a demo, but never use it on untrusted input
    return str(eval(expression))

def get_weather(city: str) -> str:
    """Get weather information."""
    return f"Weather in {city}: Sunny, 72°F"

agent = Agent(
    "assistant",
    tools=[calculate, get_weather],
    system_prompt="You are a helpful assistant."
)

print("Starting agent...")
agent.serve()
Output:
Starting agent...
Agent 'assistant' serving at: 0x7a8f9d4c2b1e3f5a6c8d9e0f1a2b3c4d5e6f7a8b9c0d1e2f3a4b5c6d7e8f9a0b
Connected to relay: wss://oo.openonion.ai/ws/announce
Waiting for connections...

Terminal 2: Connect and Use

use_agent.py
# use_agent.py
from connectonion import connect

# Connect using the agent's address
assistant = connect("0x7a8f9d4c2b1e3f5a6c8d9e0f1a2b3c4d5e6f7a8b9c0d1e2f3a4b5c6d7e8f9a0b")

# Use it
result1 = assistant.input("What is 42 * 17?")
print(result1)

result2 = assistant.input("What's the weather in Seattle?")
print(result2)
Output:
The result of 42 * 17 is 714.
 
Weather in Seattle: Sunny, 72°F

Common Patterns

1. Connect to Multiple Agents

Build workflows with specialized remote agents:

main.py
from connectonion import connect

# Connect to specialized agents
searcher = connect("0xaaa...")
writer = connect("0xbbb...")
reviewer = connect("0xccc...")

# Use them together
research = searcher.input("Research AI trends")
draft = writer.input(f"Write article about: {research}")
final = reviewer.input(f"Review and improve: {draft}")

print(final)

2. Retry on Connection Failure

Handle network failures gracefully:

main.py
import time
from connectonion import connect

def connect_with_retry(address, max_retries=3):
    for attempt in range(max_retries):
        try:
            return connect(address)
        except Exception:
            if attempt < max_retries - 1:
                print(f"Retrying... ({attempt + 1}/{max_retries})")
                time.sleep(2)
            else:
                raise

agent = connect_with_retry("0x7a8f...")
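
If the relay or the remote agent takes a while to come back, a fixed two-second delay can be too aggressive. A minimal variation with exponential backoff (the base delay and the 30-second cap are arbitrary choices, not ConnectOnion defaults):

import time
from connectonion import connect

def connect_with_backoff(address, max_retries=5):
    for attempt in range(max_retries):
        try:
            return connect(address)
        except Exception:
            if attempt == max_retries - 1:
                raise
            # Wait 1s, 2s, 4s, ... capped at 30s between attempts
            time.sleep(min(2 ** attempt, 30))

agent = connect_with_backoff("0x7a8f...")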

3. Agent Pool (Load Balancing)

Distribute load across multiple identical agents:

main.py
from connectonion import connect

# Pool of identical agents
agent_addresses = [
    "0xaaa...",
    "0xbbb...",
    "0xccc..."
]

agents = [connect(addr) for addr in agent_addresses]

# Simple round-robin
def get_agent():
    agent = agents.pop(0)
    agents.append(agent)
    return agent

# Use a different agent each time
result1 = get_agent().input("Task 1")
result2 = get_agent().input("Task 2")
result3 = get_agent().input("Task 3")
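
Patterns 2 and 3 combine naturally into simple failover: if one agent in the pool is down, try the next. A sketch that assumes a failed call surfaces as an exception from input():

from connectonion import connect

agents = [connect(addr) for addr in ["0xaaa...", "0xbbb...", "0xccc..."]]

def input_with_failover(prompt: str) -> str:
    # Try each agent in the pool until one succeeds
    last_error = None
    for agent in agents:
        try:
            return agent.input(prompt)
        except Exception as e:  # assumption: a dead agent raises here
            last_error = e
    raise RuntimeError("All agents in the pool failed") from last_error

result = input_with_failover("Task 1")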

Multi-Turn Conversations

Remote agents maintain conversation state across multiple input() calls:

main.py
from connectonion import connect

remote = connect("0x7a8f...")

# Turn 1
response1 = remote.input("Calculate 100 + 50")
print(response1)

# Turn 2 - remembers context
response2 = remote.input("Multiply that by 2")
print(response2)
Output:
The result is 150
 
The result is 300

Real-World: Distributed Workflow

Local orchestrator using remote specialized agents:

main.py
from connectonion import Agent, connect

# Local orchestrator agent
def run_workflow(task: str) -> str:
    """Run distributed workflow."""

    # Connect to remote specialized agents
    researcher = connect("0xaaa...")
    analyst = connect("0xbbb...")
    writer = connect("0xccc...")

    # Step 1: Research
    research = researcher.input(f"Research: {task}")

    # Step 2: Analyze
    analysis = analyst.input(f"Analyze this data: {research}")

    # Step 3: Write report
    report = writer.input(f"Write report based on: {analysis}")

    return report

# Local agent with access to remote agents via tool
orchestrator = Agent("orchestrator", tools=[run_workflow])

# User just talks to the local agent
result = orchestrator.input("Create a report on AI market trends")
print(result)

Configuration

Default Relay (Production)

main.py
from connectonion import connect

# Uses wss://oo.openonion.ai/ws/announce by default
agent = connect("0x7a8f...")

Local Relay (Development)

main.py
from connectonion import connect

# Connect to a local relay server
agent = connect("0x7a8f...", relay_url="ws://localhost:8000/ws/announce")

Environment-Based

main.py
import os

from connectonion import connect

relay_url = os.getenv(
    "RELAY_URL",
    "wss://oo.openonion.ai/ws/announce"
)

agent = connect("0x7a8f...", relay_url=relay_url)

Local vs Remote Agents

Local Agent

main.py
from connectonion import Agent

# search and calculate are tool functions defined elsewhere in your code
agent = Agent("local", tools=[search, calculate])

result = agent.input("task")

+ No network latency

+ Works offline

− Limited to one machine

− No sharing

Remote Agent

main.py
from connectonion import connect

agent = connect("0x7a8f...")

result = agent.input("task")

+ Access from anywhere

+ Share across team

− Network latency

− Requires connectivity
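
You can also get the best of both: prefer the shared remote agent when the relay is reachable and fall back to a local agent otherwise. A sketch that assumes connect() raises when it cannot reach the relay (if it fails lazily instead, wrap the first input() call):

from connectonion import Agent, connect

def calculate(expression: str) -> str:
    """Perform calculations locally."""
    # Demo only: eval() is unsafe on untrusted input
    return str(eval(expression))

def get_assistant():
    try:
        # Prefer the shared remote agent
        return connect("0x7a8f...")
    except Exception:
        # Fall back to a local agent when the relay is unreachable
        return Agent("local-fallback", tools=[calculate])

assistant = get_assistant()
print(assistant.input("What is 42 * 17?"))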

Ready to Use Remote Agents?

Just call connect(address) and start building distributed workflows!