Valory's grant to Algovera to build decentralized LLM agent workflows
Exploring the benefits of combining Crypto and LLM agent frameworks
The intersection of AI and Crypto is a popular topic these days, with many exploring what the potential use cases may be. For one, it’s hoped that Crypto can offer decentralized alternatives to centralized AI providers like OpenAI. The Crypto world has long been interested in autonomous agents and organizations, and already provides infrastructure, frameworks and tooling towards achieving them. On the other hand, recent advances with LLMs in the AI space have enabled agents to do more useful things, such as writing code or boosting general productivity.
Valory has awarded a grant to Algovera to explore and enhance the benefits of combining Crypto and LLM agent frameworks.
LLM x Crypto Agent Frameworks
LLMs represent a new computing platform thanks to capabilities, such as in-context learning, generalization to samples outside the training distribution, and reasoning, that emerge above a certain model size. LLM programming is the practice of building on top of these general, personalizable reasoning engines, and frameworks such as LangChain, Llama Index and BabyAGI have been widely adopted for this purpose. Typically, the LLM is just one component of a larger AI system that includes other building blocks such as short-term memory, long-term memory (e.g. vector databases) and skills. Together, these enable useful apps like chains (or workflows), agents and multi-agent systems. Services for deploying such agents are still in the early stages of development and tend to take centralized approaches, which raises questions about who ultimately owns and governs these apps.
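To make these building blocks concrete, here is a minimal sketch of a chain that pairs an LLM with short-term memory, written against LangChain’s 0.0.x-era API (ConversationChain and ConversationBufferMemory). It is illustrative only, not a prescription for how such apps must be built.

```python
# Illustrative only: a chain that pairs an LLM with short-term memory,
# using LangChain's 0.0.x-era API. Requires the openai package and an
# OPENAI_API_KEY to actually run.
from langchain.llms import OpenAI
from langchain.chains import ConversationChain
from langchain.memory import ConversationBufferMemory

llm = OpenAI(temperature=0)

# The memory object stores the running chat history (short-term memory),
# so follow-up questions can refer back to earlier turns.
conversation = ConversationChain(llm=llm, memory=ConversationBufferMemory())

print(conversation.predict(input="What is a finite state machine?"))
print(conversation.predict(input="Give me an example of one."))  # follow-up uses the stored history
```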
Crypto agent frameworks such as Open Autonomous Economic Agent (AEA) and Open Autonomy have been under development for considerably longer than LLM agent frameworks. Open Autonomy is a framework for creating agent services: off-chain autonomous services which run as a multi-agent system and offer enhanced functionalities on-chain. Open Autonomy models these services as finite state machines (FSMs), which generalize linear chains or workflows by allowing branching, looping and error-handling transitions between states. Most importantly, agent services are decentralized, trust-minimized, transparent, and robust. Recently, Valory awarded a grant to Algovera to facilitate the implementation of new, powerful building blocks of modern AI systems within Open Autonomy.
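As a rough illustration of why an FSM is more expressive than a linear chain, consider a workflow that retries a failing step and only advances once it succeeds. The sketch below is framework-agnostic Python, not Open Autonomy’s actual API; the state and handler names are made up for the example.

```python
# Framework-agnostic sketch of a finite state machine (not Open Autonomy's API).
# Unlike a linear chain, an FSM can branch and loop: here the "query_llm" state
# retries on error and only moves forward once it reports success.

TRANSITIONS = {
    ("collect_input", "done"): "query_llm",
    ("query_llm", "done"): "submit_onchain",
    ("query_llm", "error"): "query_llm",       # loop back and retry
    ("submit_onchain", "done"): "finished",
}

def run_fsm(handlers, state="collect_input"):
    """Drive the FSM: each handler returns an event that selects the next state."""
    while state != "finished":
        event = handlers[state]()               # e.g. "done" or "error"
        state = TRANSITIONS[(state, event)]
    return state
```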
For a deeper dive on LLM x Crypto agent frameworks, you can watch this recent presentation by Algovera at Valory’s AI x Crypto mini-conference.
Valory Chat & Question Answering Assistant
To demonstrate what you can build with building blocks such as LLM skills, short-term memory and long-term memory, we created two apps: Chat and Question Answering. Since they use the Olas stack, both apps can be run in a decentralized manner, which is a key differentiator between these and other AI chat apps.

The Chat app pairs an LLM with short-term memory that keeps track of the conversation history, which allows users to ask follow-up queries and allows the assistant to correct errors in previous responses. The Question Answering app uses an LLM together with a vector database that acts as long-term memory. We created embeddings of Valory’s technical documentation and stored the resulting vector database on IPFS. When a user submits a query, the embeddings are retrieved from IPFS, similarity search is performed to select the most relevant documents, and those documents are used to generate the answer. This technique allows the assistant to answer questions about Valory’s Open Autonomy library that were not in the training set of the LLM itself. For example, here is the answer provided to the question:
What is Open Autonomy?
Open Autonomy is a framework for the creation of agent services, which are off-chain autonomous services that run as a multi-agent system (MAS) and offer enhanced functionalities on-chain. Agent services expand the range of operations that traditional smart contracts provide, allowing for the execution of complex operations like machine-learning algorithms. These agent services are decentralized and offer features such as transparency and robustness. The framework provides command line tools and packages to build, deploy, publish, and test agent services. It is designed to speed up the development life cycle of autonomous services.
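The retrieval-augmented flow behind an answer like this can be sketched roughly as follows. This is illustrative only and not the app’s actual implementation: an in-memory FAISS index (via LangChain) stands in for the vector database that the real service stores on and retrieves from IPFS, and the document chunks are placeholders.

```python
# Illustrative sketch of the Question Answering flow (not the app's actual code):
# embed the docs, index them in a vector store, then answer queries by
# similarity search + LLM generation. Requires openai and faiss-cpu.
from langchain.llms import OpenAI
from langchain.embeddings import OpenAIEmbeddings
from langchain.vectorstores import FAISS
from langchain.chains import RetrievalQA

# Chunks of Valory's technical documentation (placeholder content).
docs = [
    "Open Autonomy is a framework for the creation of agent services...",
    "Agent services are off-chain autonomous services run as a multi-agent system...",
]

# Long-term memory: embed the documents and index them for similarity search.
vectordb = FAISS.from_texts(docs, OpenAIEmbeddings())

# Retrieval-augmented QA: fetch the most similar chunks, then generate an answer.
qa = RetrievalQA.from_chain_type(llm=OpenAI(temperature=0), retriever=vectordb.as_retriever())
print(qa.run("What is Open Autonomy?"))
```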
For more information, check out the video demo and the code for the Valory Chat & Question Answering Assistant. You can also see two services minted on the Olas registry here and here.