Building Your Farcaster Agent

Today, I'll show you how to build a powerful Farcaster agent using Cloudflare Workers. This agent comes with built-in memory capabilities and an extensible action system. I initially tried creating a Farcaster account manually, but trust me - use Neynar's API. It'll save you hours of headache!

What you'll need: a Neynar account (for your agent's Farcaster account, API key, and signer UUID), a Cloudflare account with Workers and KV access, and an OpenRouter API key.

1. Creating Your Farcaster Account

First, you'll need a Farcaster account for your agent. Neynar provides an easy API for this - head over to their documentation for the full setup process. You'll get an FID (Farcaster ID) and API keys that we'll need later. You'll also need to generate a signer UUID for the bot from the Neynar dashboard - keep it handy, as we'll need it in the configuration step.

2. Setting Up Your Worker

Let's create a new Cloudflare Worker project:

npm create cloudflare@latest my-farcaster-agent
cd my-farcaster-agent

Now, grab the agent code from our template repository. Copy the contents of the `fagent/src` directory into your worker's `src` directory.

3. Configuration

Create a `wrangler.toml` file in your project root:

name = "farcaster-agent"
main = "src/index.js"
compatibility_date = "2023-01-01"
node_compat = true

[vars]
FARCASTER_FID = "your_fid"
FARCASTER_NEYNAR_SIGNER_UUID = "your_signer_uuid"
FARCASTER_NEYNAR_API_KEY = "your_neynar_key"
OPENROUTER_API_KEY = "your_openrouter_key"

# KV namespace binding
[[kv_namespaces]]
binding = "AGENT_KV"
id = "your_kv_namespace_id"

4. Setting Up Cloudflare KV

The agent uses Cloudflare KV for memory storage. Create your KV namespace:

# Create the KV namespace
npx wrangler kv:namespace create AGENT_KV

# You'll get output like:
# Add the following to your wrangler.toml
# [[kv_namespaces]]
# binding = "AGENT_KV"
# id = "xxxxx-xxxxx-xxxxx"

5. LLM Configuration

The agent uses OpenRouter for LLM access. In your environment variables, set your OpenRouter API key and configure the model in `src/core/agent.js`:

// Default model configuration
const modelConfig = {
  model: "openai/gpt-4-turbo-preview",  // or your preferred model
  max_tokens: 1000,
  temperature: 0.7,
  system_prompt: characterConfig.system_prompt
};
Model Selection:

You can use any model available on OpenRouter's model page. They offer a wide range of models including GPT-4, Claude, and more. Choose based on your needs for performance and cost.
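For the curious, the OpenRouter call itself is a standard OpenAI-compatible chat completion request. Here's a rough sketch using the `modelConfig` above - the template's actual request logic lives in `src/core/agent.js`, and `castText` is just a placeholder for the incoming message:

// Rough sketch of the OpenRouter request, using the modelConfig shown above.
// castText stands in for the text of the cast we're replying to.
async function generateReply(env, castText) {
  const response = await fetch("https://openrouter.ai/api/v1/chat/completions", {
    method: "POST",
    headers: {
      "Authorization": `Bearer ${env.OPENROUTER_API_KEY}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      model: modelConfig.model,
      max_tokens: modelConfig.max_tokens,
      temperature: modelConfig.temperature,
      messages: [
        { role: "system", content: modelConfig.system_prompt },
        { role: "user", content: castText },
      ],
    }),
  });
  const completion = await response.json();
  return completion.choices[0].message.content;
}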

6. Character Configuration

The character.json file defines your agent's personality and behavior. This is passed directly to the LLM as part of the system prompt:

{
  "name": "YourAgent",
  "bio": [
    "A knowledgeable AI agent on Farcaster",
    "Specializes in [your specialty]",
    "Known for [unique traits]"
  ],
  "style": {
    "tone": [
      "friendly but professional",
      "technically accurate",
      "clear and concise"
    ],
    "writing_style": [
      "use clear explanations",
      "maintain conversation context"
    ]
  },
  "system_prompt": "You are [name], [key characteristics]..."
}
Character Components:

The character file is processed and combined with conversation memory before being sent to the LLM, ensuring consistent personality across interactions.
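As a rough illustration of that processing (the template's exact prompt assembly may differ), the character fields can be flattened into a single system prompt string:

// Rough illustration -- the template's own prompt assembly may differ.
function buildSystemPrompt(characterConfig) {
  const bio = characterConfig.bio.join(" ");
  const tone = characterConfig.style.tone.join(", ");
  const writingStyle = characterConfig.style.writing_style.join(", ");
  return `${characterConfig.system_prompt}\n\nBio: ${bio}\nTone: ${tone}\nWriting style: ${writingStyle}`;
}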

7. Memory System

Our agent uses a two-tier memory system backed by Cloudflare KV.

The memory system automatically maintains context across conversations, making your agent feel more natural and consistent.
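To give you an idea of how conversation memory can sit on top of KV, here's an illustrative sketch - the key naming and trimming strategy are mine, not necessarily the template's:

// Illustrative sketch: keep a rolling window of messages per conversation in KV.
async function loadHistory(env, conversationId) {
  return (await env.AGENT_KV.get(`history:${conversationId}`, "json")) ?? [];
}

async function appendToHistory(env, conversationId, message, maxMessages = 20) {
  const history = await loadHistory(env, conversationId);
  history.push(message);
  // Trim to the most recent messages so the prompt stays a reasonable size.
  await env.AGENT_KV.put(
    `history:${conversationId}`,
    JSON.stringify(history.slice(-maxMessages))
  );
}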

8. Deployment and Webhooks

Deploy your worker:

npx wrangler deploy

This command will deploy your worker and return a URL endpoint (something like https://your-worker-name.username.workers.dev). Make sure to save this URL - we'll need it in the next step when setting up the webhook.

Set up your webhook in the Neynar dashboard:

  1. Go to the webhooks tab
  2. Create a new webhook
  3. Enter your worker URL as the target
  4. Add your bot's FID to both mentioned_fids and parent_author_fids

For detailed webhook setup instructions, check out the Neynar webhook documentation.
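To tie it together, here's roughly what the Worker does when a webhook event arrives. Treat the field and header names as approximations to verify against Neynar's docs, and note that `generateReply` is the sketch from step 5:

// Sketch of the webhook flow; check Neynar's docs for the exact payload
// shape and header names before relying on these field names.
export default {
  async fetch(request, env) {
    const event = await request.json();

    // Mentions and replies arrive as cast.created events for the FIDs configured above.
    if (event.type === "cast.created") {
      const cast = event.data;
      const replyText = await generateReply(env, cast.text); // LLM call from step 5

      // Publish the reply through Neynar using the bot's signer.
      await fetch("https://api.neynar.com/v2/farcaster/cast", {
        method: "POST",
        headers: {
          "x-api-key": env.FARCASTER_NEYNAR_API_KEY,
          "Content-Type": "application/json",
        },
        body: JSON.stringify({
          signer_uuid: env.FARCASTER_NEYNAR_SIGNER_UUID,
          text: replyText,
          parent: cast.hash, // reply to the cast that triggered the event
        }),
      });
    }
    return new Response("ok");
  },
};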

Extensibility

The agent comes with a trending action that monitors Clanker for trending tokens. You can extend its capabilities by creating custom actions - we'll cover that in a future post!
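To give a flavor of what a custom action could look like - this shape is purely illustrative, and the template's action interface may differ - an action can be as simple as a trigger check plus a handler:

// Purely illustrative action shape -- the template's interface may differ.
const priceAction = {
  name: "price",
  // Decide whether this action should handle the cast.
  matches: (castText) => /price of/i.test(castText),
  // Produce the text the agent should reply with.
  run: async (castText, env) => {
    // ...fetch data from your source of choice...
    return "Here's what I found about that price.";
  },
};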

Conclusion

You now have a powerful Farcaster agent running on Cloudflare Workers! The built-in memory system and extensible action framework make it easy to create sophisticated interactions. Stay tuned for our next post on creating custom actions for your agent.