A Scalable Communication Protocol for Networks of Large Language Models

1University of Oxford 2Camel AI

Agora is a cross-platform, dead-simple protocol for efficient communication between LLM agents. It enables heterogeneous agents to communicate with each other at a fraction of the cost of pure natural-language exchanges.
Agora can also be easily integrated with existing multi-agent frameworks, such as Camel AI, LangChain and Swarm.

Abstract

Communication is a prerequisite for collaboration. When scaling networks of AI-powered agents, communication must be versatile, efficient, and portable. These requisites, which we refer to as the Agent Communication Trilemma, are hard to achieve in large networks of agents. We introduce Agora, a meta protocol that leverages existing communication standards to make LLM-powered agents solve complex problems efficiently. In Agora, agents typically use standardised routines for frequent communications, natural language for rare communications, and LLM-written routines for everything in between. Agora sidesteps the Agent Communication Trilemma and robustly handles changes in interfaces and members, allowing unprecedented scalability with full decentralisation and minimal involvement of human beings. On large Agora networks, we observe the emergence of self-organising, fully automated protocols that achieve complex goals without human intervention.

Agora in 4 Steps

Step 1: Natural language is useful for rare communications, but inefficient.
Step 2: LLMs negotiate a structured protocol.
Step 3: LLMs implement routines that follow the agreed-upon protocol.
Step 4: Routines communicate without needing to invoke the LLMs.
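The four steps above can be sketched in code. This is a hypothetical illustration, not the Agora reference implementation; every class and method name here is invented for the example.

```python
# Hypothetical sketch of the four steps; all names here are illustrative
# and not part of the actual Agora implementation.

class Agent:
    def __init__(self, name):
        self.name = name
        self.protocols = {}  # message kind -> negotiated routine (Step 4 cache)
        self.llm_calls = 0   # tracks how often the expensive LLM is invoked

    def llm(self, prompt):
        """Stand-in for a real LLM call (used in Steps 1-3)."""
        self.llm_calls += 1
        return f"response to: {prompt}"

    def negotiate(self, peer, kind):
        """Steps 2-3: negotiate a structured protocol with the peer, then
        implement a routine that follows it."""
        self.llm(f"negotiate a {kind} protocol with {peer.name}")
        # The resulting routine is plain code: no LLM needed at call time.
        return lambda payload: {"kind": kind, "payload": payload}

    def send(self, peer, kind, payload):
        """Step 4: reuse a cached routine when possible; negotiate once if not."""
        routine = self.protocols.get(kind)
        if routine is None:
            routine = self.negotiate(peer, kind)  # one-off LLM cost
            self.protocols[kind] = routine
        return routine(payload)

alice, bob = Agent("alice"), Agent("bob")
alice.send(bob, "weather_query", {"city": "Oxford"})
alice.send(bob, "weather_query", {"city": "London"})
assert alice.llm_calls == 1  # the second message reused the routine
```

Once a routine has been negotiated, it handles every subsequent message of the same kind, so the per-message LLM cost drops to zero.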

Demo

Our demo showcases a network of 100 agents interacting with each other. The agents run different LLMs (OpenAI GPT-4o, Llama 3 405b, Gemini 1.5 Pro) and use different database technologies (MongoDB, SQL), yet they still complete complex, multi-agent tasks at significantly lower cost. In one picture:

[Demo overview figure]

FAQ

Q: Can negotiated protocols be recycled?

A: Yes. Once two agents agree on a protocol, it can be shared and reused by other agents.

Q: Does Agora require authoritative nodes?

A: No, Agora is fully decentralised. All agents can communicate without relying on any central node.

Q: What if a routine fails?

A: The LLM takes over the communication and sends the query (or the reply) in place of the routine.
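This fallback can be sketched as follows; the function names are hypothetical, not the actual Agora API:

```python
# Hypothetical sketch of the failure fallback described above.

def communicate(routine, llm_fallback, query):
    """Try the cheap negotiated routine first; on any error, let the
    LLM handle the exchange in natural language instead."""
    try:
        return routine(query)
    except Exception:
        return llm_fallback(query)

# Example: a routine that only understands dict-shaped queries.
def strict_routine(query):
    return {"ok": True, "echo": query["text"]}

def llm_fallback(query):
    # Stand-in for handing the raw query to the LLM.
    return {"ok": True, "echo": str(query), "via": "llm"}

communicate(strict_routine, llm_fallback, {"text": "hi"})  # routine handles it
communicate(strict_routine, llm_fallback, "malformed")     # routine raises, LLM takes over
```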

Q: How do agents know which protocol is being used in a given communication?

A: Agents include in each message the SHA1 hash of the plaintext (TXT) document describing the protocol. Refer to the paper for more info.
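As a minimal sketch, assuming a hypothetical message layout (the actual format is specified in the paper):

```python
# Tagging a message with the SHA1 hash of the protocol document.
# The protocol text and message layout below are illustrative only.
import hashlib

protocol_doc = "Weather protocol v1: request = {city: str}; reply = {temp_c: float}"
protocol_hash = hashlib.sha1(protocol_doc.encode("utf-8")).hexdigest()

message = {
    "protocolHash": protocol_hash,   # lets the receiver look up the protocol
    "body": {"city": "Oxford"},
}
```

Because both agents can recompute the hash over the shared plaintext document, the identifier requires no central registry.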

Next Steps & Contributing

We're building the next iteration of Agora, with more features for real-world use cases. If you're interested in contributing or simply want to stay updated about Agora, check out our Discord or subscribe to our mailing list.

BibTeX


@article{marro2024scalable,
  title={A Scalable Communication Protocol for Networks of Large Language Models},
  author={Marro, Samuele and La Malfa, Emanuele and Wright, Jesse and Li, Guohao and Shadbolt, Nigel and Wooldridge, Michael and Torr, Philip},
  journal={arXiv preprint arXiv:2410.11905},
  year={2024}
}

Acknowledgements

Funding acknowledgements:

We also thank these fantastic organizations:

  • The Alan Turing Institute, for providing the computational power to run our agent network.
  • Camel AI, for providing OpenAI credits to run our GPT-4o agents.
  • SambaNova, for providing credits to run our Llama 3 agents.