
Experimenting with OpenAI’s Swarm Framework

I’ve been experimenting with multi-agent setups for a few months now, mostly in Rivet but more recently within some WordPress and Node.js projects. So I thought it was about time I tried out the OpenAI Swarm framework.

OpenAI’s Swarm is an experimental educational framework designed to simplify the development and orchestration of multi-agent systems. It focuses on enabling the coordination of lightweight, modular agents, allowing developers to create systems where multiple AI agents can collaborate, hand off tasks and manage complex workflows autonomously.

First, we’ll walk through what it is and a bit about how it works. Then we’ll build a multi-agent system that works together to generate the best possible output. We’ll create two agents and a few functions that make external calls to pull in real data. By the end, you should be able to run the code yourself and hopefully customize it for your own needs.

Key Parts of Swarm

Agents

In Swarm, agents are autonomous entities designed to handle specific tasks with their own set of tools and instructions. Agents can communicate and work collaboratively, allowing them to specialize in different areas. For instance, one agent might gather data, while another focuses on analysis or decision-making.
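
For example, defining an agent takes little more than a name, a set of instructions, and the functions it’s allowed to call. A minimal sketch (the gather_data function is a made-up placeholder, just to show the shape):

from swarm import Agent

def gather_data(topic):
    """Hypothetical tool: fetch raw data about a topic."""
    return f"Some data about {topic}"

data_agent = Agent(
    name="Data Gathering Agent",
    instructions="You gather raw data on the user's topic and summarize it.",
    functions=[gather_data],
)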

Task Handoffs

One of Swarm’s strengths is the ability to seamlessly transfer tasks between agents. If an agent identifies that another agent is better suited to complete a specific task, it can delegate that task to the appropriate agent. This handoff system ensures that specialized agents can handle complex workflows without overwhelming a single agent with all responsibilities.
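
Under the hood, a handoff is simply a function that returns another Agent; when the current agent calls it, Swarm switches execution to the returned agent. We’ll use exactly this pattern in the tutorial below. A minimal sketch with made-up agent names:

from swarm import Agent

analysis_agent = Agent(
    name="Analysis Agent",
    instructions="You analyze the data you are given and report the key findings.",
)

def transfer_to_analysis():
    """Returning an Agent from a function tells Swarm to hand off to it."""
    return analysis_agent

research_agent = Agent(
    name="Research Agent",
    instructions="Gather raw information, then hand off to the Analysis Agent.",
    functions=[transfer_to_analysis],
)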

Stateless Design

Swarm operates in a stateless manner, meaning it doesn’t retain memory between separate interactions. Each session is independent, requiring the context and instructions to be passed explicitly every time. This design allows for efficient parallel processing but requires careful management of context within each session to maintain continuity.

Within a single client.run() session, all agents and tools share the same context and conversation messages. Swarm manages the context and the exchange of information between agents and tools during the entire process, ensuring smooth execution until the final result is delivered to the user.
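
Because nothing persists between runs, the conversation history and any shared variables have to be passed in explicitly on every client.run() call. A minimal sketch, assuming an agent named my_agent already exists:

from swarm import Swarm

client = Swarm()

# Each run starts fresh: messages and context_variables must be supplied explicitly.
response = client.run(
    agent=my_agent,
    messages=[{"role": "user", "content": "Summarize today's WordPress news."}],
    context_variables={"topic": "WordPress news"},
)

# To continue the conversation, feed the returned messages back in yourself.
followup = client.run(
    agent=response.agent,
    messages=response.messages + [{"role": "user", "content": "Make it shorter."}],
    context_variables={"topic": "WordPress news"},
)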

Execution Model

Swarm’s execution revolves around the client.run() method, which drives the interaction between agents and tools. The process follows several key steps:

  • Fetching a response from the current agent based on its instructions and the provided context.
  • Executing functions or tools, such as API calls, data gathering, or other tasks.
  • Checking if a handoff is required to another agent for specialized tasks.
  • Updating the context with new information as the process continues.
  • Returning the final results to the user once the task is completed.

This loop allows agents to work through tasks, tools, and handoffs in a structured manner.
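
Put together, the loop looks roughly like the pseudocode below. This is not Swarm’s actual source, just a simplified sketch of the steps above, with get_chat_completion and execute_tool_calls standing in for the real internals:

# Simplified pseudocode of what client.run() does (not the real implementation).
def run(agent, messages, context_variables, max_turns):
    active_agent = agent
    while len(messages) < max_turns:
        # 1. Fetch a response from the current agent, given its instructions and context.
        completion = get_chat_completion(active_agent, messages, context_variables)
        messages.append(completion)

        # 5. No tool calls means the agent is done; return the final result.
        if not completion.tool_calls:
            break

        # 2. Execute the requested functions/tools (API calls, data gathering, etc.).
        result = execute_tool_calls(completion.tool_calls, active_agent, context_variables)
        messages.extend(result.messages)

        # 3. If a tool returned an Agent, hand off to it.
        if result.agent:
            active_agent = result.agent

        # 4. Update the context with any new information.
        context_variables.update(result.context_variables)

    return messages, active_agent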

Agent Interaction Flow

The process starts with an initial agent based on user input or system design. As tasks progress, agents can hand off responsibilities to other agents better equipped for specific tasks. For example, in a customer support scenario, an initial triage agent might assess the user’s issue and pass it to a more specialized agent, such as technical support or billing. This fluid interaction ensures agents can collaborate efficiently and focus on their areas of expertise.
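
Sticking with the customer support example, the wiring could look like this (agent names and instructions here are made up for illustration):

from swarm import Swarm, Agent

tech_agent = Agent(
    name="Technical Support Agent",
    instructions="You troubleshoot technical problems.",
)

billing_agent = Agent(
    name="Billing Agent",
    instructions="You handle billing and refund questions.",
)

def transfer_to_technical():
    """Hand off to the Technical Support Agent."""
    return tech_agent

def transfer_to_billing():
    """Hand off to the Billing Agent."""
    return billing_agent

triage_agent = Agent(
    name="Triage Agent",
    instructions="Decide whether the issue is technical or billing-related, then hand off.",
    functions=[transfer_to_technical, transfer_to_billing],
)

client = Swarm()
response = client.run(
    agent=triage_agent,
    messages=[{"role": "user", "content": "I was charged twice last month."}],
)
print(response.messages[-1]["content"])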

Why Swarm Is Significant

The introduction of Swarm brings several advantages:

  • Simplified Multi-Agent Orchestration: Swarm makes it easier to manage complex interactions between multiple agents, reducing the overhead typically associated with such systems.
  • Educational Value: Its lightweight, client-side nature makes Swarm ideal for learning and experimentation.
  • Scalability: The stateless design allows for parallel execution and easier scaling, which is crucial for handling large workloads.
  • Flexibility: Agents can be specialized and reconfigured without affecting the entire system, promoting modular development.

Helpful Tip
Think of agents as specialized team members, each with their own expertise, collaborating to achieve a common goal.

When diving into multi-agent systems, it’s worth mentioning some other cool projects in this space. One example is GPTSwarm, a graph-based framework for LLM-powered agents. GPTSwarm lets developers build agent swarms from graphs, enabling automatic self-organization and self-improvement. OpenAI’s Swarm is just one of several frameworks out there for orchestrating these kinds of multi-agent setups.

Building a Blog Post Title Generator with Swarm

Let’s put theory into practice. In this tutorial, we’ll build a multi-agent system that generates unique blog post titles on a given topic. The system will fetch the latest articles for research and ensure the titles don’t duplicate existing posts on your WordPress site. We’ll set it up so the agents go back and forth until they are satisfied with the final results and have collected enough links for each topic idea for us to research.

Setting Up the Environment

Before we dive into the code, make sure you have the following:

  • Python 3.10+ installed on your system (the Swarm repo lists 3.10 as the minimum).
  • An OpenAI API key set up in your environment variables.
  • A NewsAPI.org API key for fetching the latest articles.
  • Access to your WordPress site’s REST API to retrieve existing post titles.
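
If you’re not sure about that last one, a quick way to check is to request the standard posts endpoint directly and confirm you get JSON back (swap in your own site URL):

import requests

# The core WordPress REST API exposes posts at /wp-json/wp/v2/posts.
resp = requests.get("https://your-wordpress-site.com/wp-json/wp/v2/posts", params={"per_page": 1})
print(resp.status_code)                     # expect 200
print(resp.json()[0]["title"]["rendered"])  # the most recent post's title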

Understanding the Code

Here’s the full code we’ll be working with:

# main.py

from swarm import Swarm, Agent
import requests
import json
from newsapi import NewsApiClient

# Replace 'YOUR_NEWSAPI_KEY' with your actual NewsAPI.org API key
NEWSAPI_KEY = 'YOUR_NEWSAPI_KEY'

# Replace 'https://your-wordpress-site.com' with your actual WordPress site URL
WORDPRESS_SITE_URL = 'https://your-wordpress-site.com'

def get_latest_articles(topic):
    """
    Get latest articles about a topic from NewsAPI.org.

    Args:
        topic (str): The topic to search for.

    Returns:
        str: A JSON string of the list of URLs.
    """
    newsapi = NewsApiClient(api_key=NEWSAPI_KEY)
    articles = newsapi.get_everything(q=topic, language='en', sort_by='relevancy', page_size=5)
    links = []
    for article in articles['articles']:
        links.append(article['url'])
    print(f"Getting latest articles for: {topic}")
    return json.dumps({'links': links})

def get_existing_titles():
    """
    Get existing post titles from the WordPress REST API.

    Returns:
        str: A JSON string of the list of titles.
    """
    response = requests.get(f'{WORDPRESS_SITE_URL}/wp-json/wp/v2/posts')
    posts = response.json()
    titles = [post['title']['rendered'] for post in posts]
    print(f"Getting existing post titles")
    return json.dumps({'titles': titles})

def handoff_to_filter_agent():
    """
    Hand off to the Title Filter Agent.
    """
    print(f"Handoff to Filter Agent")
    return TitleFilterAgent

def handoff_to_generator_agent():
    """
    Hand off to the Title Generator Agent.
    """
    print(f"Handoff to Generator Agent")
    return TitleGeneratorAgent

def title_generator_instructions(context_variables):
    topic = context_variables.get('topic', '')
    return f"""
You are a creative assistant that generates interesting and specific blog post titles on the topic '{topic}'.
Identify groupings of articles within 'article_links' that would make for good topics.
Generate a list of 5 potential blog post titles based on groupings that match the user's topic.
Call the function 'get_latest_articles' with the topic to gather relevant links for each title to help with research.
Store the generated titles and associated links in your response.
Once you have generated the titles and gathered the links, call 'handoff_to_filter_agent' to hand off to the Title Filter Agent.
After you've generated the titles and the filter agent has cleaned up tht title name, perform additional news lookups to find more links for each title.
Go back and forth with the filter agent until you have a unique list of post ideas and at least 2-5 links per title.
Output the final list of titles and URLs once you are satisfied with their quality.
"""

TitleGeneratorAgent = Agent(
    name="Title Generator Agent",
    instructions=title_generator_instructions,
    functions=[get_latest_articles, handoff_to_filter_agent]
)

def title_filter_instructions(context_variables):
    return f"""
You are an assistant that filters and adapts blog post titles to avoid duplication with existing posts.
Call the function 'get_existing_titles' to get a list of existing post titles.
Compare the provided titles/links with existing titles and remove or adapt any that are too similar.
Avoid titles that closely resemble any of the existing titles.
Make sure the new titles will fit in well alongside the existing ones.
Go back and forth with the generator agent until you have a unique list of post ideas and at least 2-5 links per title.
Once you have optimized the titles, call 'handoff_to_generator_agent' to continue doing research.
"""

TitleFilterAgent = Agent(
    name="Title Filter Agent",
    instructions=title_filter_instructions,
    functions=[get_existing_titles, handoff_to_generator_agent]
)

if __name__ == '__main__':
    client = Swarm()
    topic = input("Enter a topic for blog post ideas: ")

    context_variables = {'topic': topic}

    messages = [{"role": "user", "content": f"Please generate blog post titles on '{topic}'."}]

    response = client.run(
        agent=TitleGeneratorAgent,
        messages=messages,
        context_variables=context_variables,
        max_turns=50,
    )

    print("\nFinal blog post title ideas with research links:\n")
    print(response.messages[-1]['content'])

Importing Libraries

We start by importing the necessary libraries:

  • swarm: The Swarm framework for orchestrating agents.
  • requests and json: For HTTP requests and JSON handling.
  • newsapi: To interact with NewsAPI.org for fetching articles.

Defining Functions

We define several functions that our agents will use:

  • get_latest_articles(topic): Fetches the latest articles on a given topic.
  • get_existing_titles(): Retrieves existing post titles from your WordPress site.
  • handoff_to_filter_agent(): Transfers control to the Title Filter Agent.
  • handoff_to_generator_agent(): Transfers control back to the Title Generator Agent.

Creating Agents

We create two agents with specific roles:

  • Title Generator Agent: Gathers recent news about a specific topic and identifies patterns in the results to generate blog post ideas.
  • Title Filter Agent: Removes or adapts post ideas based on the existing post titles on our site.

Each agent has its own set of instructions and functions it can call.

Running the Client

Finally, we instantiate the Swarm client and run it, starting with the Title Generator Agent. The system will handle the handoff to the Title Filter Agent automatically and vice versa.

Helpful Tip
Pass debug=True to client.run() to view each agent’s requests and responses in the console.

This can be helpful when trying to troubleshoot issues with your multi-agent setup and it’s also just kind of cool to see under the hood.
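
For example, it’s the same call we make in main.py, just with the extra flag:

response = client.run(
    agent=TitleGeneratorAgent,
    messages=messages,
    context_variables=context_variables,
    max_turns=50,
    debug=True,  # prints each request and response to the console
)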

Step-by-Step Guide

Running it is pretty simple. Save the code above as main.py somewhere on your computer.

1. Set Up API Keys and URLs

Replace the placeholders in the code with your actual API keys and WordPress site URL:

# Replace with your actual NewsAPI.org API key
NEWSAPI_KEY = 'YOUR_NEWSAPI_KEY'

# Replace with your actual WordPress site URL
WORDPRESS_SITE_URL = 'https://your-wordpress-site.com'
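
If you’d rather not hard-code secrets, a small optional tweak is to read them from environment variables instead:

import os

# Fall back to the placeholders if the environment variables aren't set.
NEWSAPI_KEY = os.environ.get("NEWSAPI_KEY", "YOUR_NEWSAPI_KEY")
WORDPRESS_SITE_URL = os.environ.get("WORDPRESS_SITE_URL", "https://your-wordpress-site.com")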

2. Install Necessary Libraries

Open a command prompt and navigate to the folder where you downloaded main.py. Install the required libraries using pip:

pip install git+https://github.com/openai/swarm.git newsapi-python requests

3. Test the System

Run the script and enter a topic when prompted:

python main.py

Sample Output:

Enter a topic for blog post ideas: WordPress News
Getting latest articles for: WordPress News
Handoff to Filter Agent
Getting existing post titles
Handoff to Generator Agent
Getting latest articles for: Automattic WP Engine WordPress Revenue
Getting latest articles for: WordPress.org Alternatives
Getting latest articles for: Relevance of WordPress Today
Getting latest articles for: WordPress ACF Hijacking
Getting latest articles for: WordPress Ecosystem Updates

Final blog post title ideas with research links:

Here are the finalized blog post titles on 'WordPress News' along with additional research links:

1. **Automattic vs. WP Engine: Unveiling the WordPress Revenue Battle**
   - [The Verge](https://www.theverge.com/2024/10/2/24260158/automattic-demand-wp-engine-revenue-wordpress-battle)
   - [Automattic Blog](https://automattic.com/2024/10/01/wpe-terms/)
   - [404 Media](https://www.404media.co/wordpress-checkbox-login-wp-engine/)

2. **Navigating WordPress: Making the Choice Between WordPress.org and Alternatives**
   - [Megabyterose](https://megabyterose.com/2024/10/leaving-wordpress-org-or-wpf-still-unsure-which-one/)
   - [WP Beginner Review](https://www.wpbeginner.com/showcase/best-wp-engine-alternatives/)
   - [WP Tavern](https://wptavern.com/impact-of-wpengines-ban-on-acf-plugin)

3. **The Relevance of WordPress Today: Is it Here to Stay?**
   - [Molodtsov](https://molodtsov.me/2024/10/wordpress-doesnt-matter-for-the-future-of-web/)
   - [Hongkiat](https://www.hongkiat.com/blog/15-noteworthy-websites-that-changed-the-internet/)

4. **WordPress Security Update: The Hijacking of ACF**
   - [Anderegg](https://anderegg.ca/2024/10/13/acf-has-been-hijacked)
   - [Advanced Custom Fields Blog](https://www.advancedcustomfields.com/blog/acf-plugin-no-longer-available-on-wordpress-org/)

5. **October WordPress Ecosystem Updates: What You Need to Know**
   - [Scripting](http://scripting.com/2024/10/07.html)
   - [WP Beginner Spotlight](https://www.wpbeginner.com/news/wpbeginner-spotlight-04-wordcamp-us-highlights-plugin-updates-more/)
   - [Speckyboy](https://speckyboy.com/manage-website-technical-debt/)

These titles are ready to be used for creating informative blog posts.

Helpful Tip
If you encounter errors, double-check your API keys and ensure all libraries are correctly installed. Dependencies can be tricky, so it’s always worth verifying your setup if something doesn’t work as expected.

And that’s it! This can easily be expanded to include additional agents and functions to further refine the quality of the results: social media monitoring, SEO keyword research, user comment analysis, and so on. What I’ve found great about it so far is that once I have a few functions and agents built, they’re pretty interchangeable between swarms, needing only minimal modifications aside from the system instructions.
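
As a rough sketch of what one of those additions could look like, here’s a hypothetical SEO keyword agent wired up the same way. It reuses the Agent import and handoff_to_generator_agent from main.py, and get_keyword_ideas is a placeholder you’d need to back with a real keyword research API:

def get_keyword_ideas(title):
    """
    Hypothetical tool: return candidate SEO keywords for a proposed title.
    In a real setup this would call a keyword research API.
    """
    print(f"Getting keyword ideas for: {title}")
    return json.dumps({'keywords': ['placeholder keyword 1', 'placeholder keyword 2']})

def handoff_to_seo_agent():
    """
    Hand off to the SEO Keyword Agent.
    """
    print("Handoff to SEO Keyword Agent")
    return SeoKeywordAgent

SeoKeywordAgent = Agent(
    name="SEO Keyword Agent",
    instructions="Suggest target keywords for each proposed title, then hand control back to the Title Generator Agent.",
    functions=[get_keyword_ideas, handoff_to_generator_agent]
)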


The ideas behind OpenAI’s Swarm framework open up exciting possibilities for building and managing multi-agent systems. The hardest part of working with multi-agent systems is managing their complexity as they scale. This architecture immediately abstracts away a lot of that complexity, and I’ve found myself getting from idea to PoC a lot faster.

In general, this architecture has opened my eyes a bit to what’s possible with multi-agent systems. While I’ve built non-linear multi-agent systems in the past, they were still fairly rigid. I’ve since adopted newer approaches to help manage my tool dependencies a bit better, but they still don’t compare to the fluidity of my experience with Swarm so far.

I might dig a bit deeper, but I’ve become overwhelmed with distractions, er, curiosities, around generative AI applications and their evolution. I have a feeling Swarm, or something very similar, will soon become a part of my regular AI dev toolkit.
