Build a local deep research agent with DeerFlow

DeerFlow is an open-source framework that transforms a simple research topic into a comprehensive, detailed report.

This guide walks you through setting up DeerFlow on your Olares device and integrating it with a local Ollama model and the Tavily search engine for web-enabled research.

Learning objectives

In this guide, you will learn how to:

  • Configure DeerFlow to communicate with a local LLM via Ollama.
  • Configure the Tavily search API for web access.
  • Execute deep research tasks and manage reports.

Prerequisites

Before you begin, make sure:

  • Ollama is installed and running in your Olares environment.
  • At least one model is installed via Ollama. See Ollama for details.
  • You have a Tavily account (a free account is sufficient).

Install DeerFlow

  1. Open Market, and search for "DeerFlow".

  2. Click Get, then Install, and wait for installation to complete.

Configure DeerFlow

DeerFlow requires connection details for the LLM. You will configure this by editing the conf.yaml file using either the graphical interface or the command line.

Configure DeerFlow to use Ollama
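
DeerFlow reads its model connection details from conf.yaml. The exact keys can vary between DeerFlow releases, but a minimal sketch for a local Ollama model served through Ollama's OpenAI-compatible endpoint looks roughly like the following. The hostname, port, and model name are placeholders; replace them with the values from your own Ollama installation, and note the required /v1 suffix:

    yaml
    # Minimal sketch. Adjust the host, port, and model to match your Ollama setup.
    BASIC_MODEL:
      base_url: http://ollama.olares.local:11434/v1 # Ollama's OpenAI-compatible endpoint; keep the /v1 suffix
      model: "qwen2.5:14b" # a chat model you have already pulled in Ollama
      api_key: "ollama" # Ollama ignores the key, but the field may still need a non-empty value

You can edit this file from Control Hub, similar to the ConfigMap steps in the next section, or from a terminal inside the DeerFlow container. Restart the service afterwards as described below.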

Configure DeerFlow to use Tavily

To enable web search, add your Tavily API key to the application configuration.

  1. In Control Hub, select the DeerFlow project.

  2. Click Configmaps in the resource list and select deerflow-config.

  3. Click the edit icon in the top-right to open the editor.

  4. Add the following key-value pairs under the data section:

    yaml
    SEARCH_API: tavily
    TAVILY_API_KEY: tvly-xxx # Your Tavily API Key


  5. Click Confirm to save the changes.

Restart DeerFlow

Restart the service to apply the new model and search configurations.

  1. In Control Hub, select the DeerFlow project.

  2. Under Deployments, locate deerflow and click Restart.

  3. In the confirmation dialog, type deerflow and click Confirm.

  4. Wait for the status icon to turn green, which indicates the service has successfully restarted.

Run DeerFlow

Run a deep research task

  1. Open DeerFlow from the Olares Launchpad.

  2. Click Get Started and enter your research topic in the prompt box.

  3. Click the wand icon to have DeerFlow refine your prompt for better results.

  4. Enable Investigation.

  5. Select your preferred writing style (e.g., Popular Science).

  6. Click the up-arrow (send) icon to send the request.

DeerFlow will generate a preliminary research plan. Review and edit this plan if necessary, or allow it to proceed.

Once the process is complete, a detailed analysis report will be displayed.

To audit the sources and steps taken, click the Activities tab.

Edit and save the report

Verify citations

AI models may occasionally generate inaccurate citations or "hallucinated" links. Be sure to manually verify important sources in the citations section.

  1. Click the edit icon in the top-right corner to enter editing mode.
  2. You can adjust formatting using Markdown, or select a section and ask the AI to improve or expand it.
  3. Click the undo icon in the top-right corner to exit editing mode.
  4. Click the download icon to save the report to your local machine as a Markdown file.

Add an MCP server

The Model Context Protocol (MCP) extends DeerFlow's capabilities by integrating external tools. For example, adding the Fetch server allows the agent to scrape and convert web content into Markdown for analysis.

  1. Open your DeerFlow app, and click the settings icon to open the Settings dialog.
  2. Select the MCP tab and click Add Servers.
  3. Paste the JSON configuration for the server. The following example adds the fetch server:
    json
     {
       "mcpServers": {
         "fetch": {
           "command": "uvx",
           "args": ["mcp-server-fetch"]
         }
       }
     }
  4. Click Add. The server is automatically enabled and available for research agents.

Turn a research report into a podcast (TTS)

DeerFlow can convert reports into MP3 audio using a Text-to-Speech (TTS) service, such as Volcengine TTS. This requires adding API credentials to the application environment.

  1. Obtain your Access Token and App ID from the Volcengine console.
  2. In Control Hub, select the DeerFlow project and go to Configmaps > deerflow-config.
  3. Click the Edit icon in the top-right corner.
  4. Add the following keys under the data section:
    yaml
    VOLCENGINE_TTS_ACCESS_TOKEN: # Your Access Token
    VOLCENGINE_TTS_APPID: # Your App ID
  5. Click Confirm to save the changes.
  6. Navigate to Deployments > deerflow and click Restart.

Once restarted, DeerFlow should detect these keys and the podcast/TTS feature will be available.

FAQ

DeerFlow does not generate a response

If the agent fails to start or hangs:

  • Check model compatibility: DeerFlow does not support reasoning models (e.g., DeepSeek R1). Switch to a standard chat model and try again.
  • Check endpoint configuration: Ensure the Ollama API endpoint in conf.yaml includes the /v1 suffix, as shown in the example below.
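
For reference, the difference usually comes down to the base_url value in conf.yaml; the hostname and port below are placeholders:

    yaml
    # Wrong: the bare Ollama address, missing the OpenAI-compatible path
    # base_url: http://ollama.olares.local:11434
    # Correct: includes the /v1 suffix
    base_url: http://ollama.olares.local:11434/v1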

No web search results during the research

If the report is generic and lacks external data:

  • Check model capabilities: The selected LLM may lack strong tool-calling capabilities. Switch to a model known for effective tool use, such as Qwen 2.5 or Llama 3.1.
  • Verify API Key: Ensure the TAVILY_API_KEY in the ConfigMap is correct and the account has remaining quota.