

Implementation:HKUDS AI Trader BaseAgent Initialize

From Leeroopedia


Knowledge Sources
Domains LLM_Agents, Infrastructure
Last Updated 2026-02-09 14:00 GMT

Overview

Concrete tool for initializing the BaseAgent's MCP client, tool set, and LLM model instance for trading sessions.

Description

The BaseAgent.initialize() async method creates the MultiServerMCPClient from self.mcp_config, fetches tools via await self.client.get_tools(), and instantiates ChatOpenAI (or DeepSeekChatOpenAI for DeepSeek models) with the configured API credentials. The method stores the results on self.client, self.tools, and self.model for use in subsequent trading sessions.
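The flow described above can be sketched with small stand-in classes so the example runs without the real dependencies. In AI-Trader, MultiServerMCPClient is provided by the langchain-mcp-adapters package and ChatOpenAI by langchain-openai; the stand-ins below only mimic the parts of their interfaces this page describes.

```python
import asyncio

class MultiServerMCPClient:
    """Stand-in for the real MCP client from langchain-mcp-adapters."""
    def __init__(self, config):
        self.config = config

    async def get_tools(self):
        # The real client aggregates tools from every connected server.
        return [f"tool_from_{name}" for name in self.config]

class ChatOpenAI:
    """Stand-in for the real chat model from langchain-openai."""
    def __init__(self, model, api_key, base_url):
        self.model = model

class Agent:
    def __init__(self, basemodel, mcp_config, api_key, base_url):
        self.basemodel = basemodel
        self.mcp_config = mcp_config
        self.openai_api_key = api_key
        self.openai_base_url = base_url

    async def initialize(self) -> None:
        # 1. Connect to every MCP server declared in the config
        self.client = MultiServerMCPClient(self.mcp_config)
        # 2. Fetch the aggregated tool list from the connected servers
        self.tools = await self.client.get_tools()
        # 3. Instantiate the chat model with the configured credentials
        self.model = ChatOpenAI(self.basemodel, self.openai_api_key,
                                self.openai_base_url)

agent = Agent("gpt-4o", {"market_data": {}}, "sk-demo",
              "https://api.openai.com/v1")
asyncio.run(agent.initialize())
print(agent.tools)  # → ['tool_from_market_data']
```

The real method additionally selects DeepSeekChatOpenAI when the configured base model is a DeepSeek model; that branch is omitted here for brevity.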

Usage

Call await agent.initialize() after constructing the BaseAgent instance and before calling agent.run_date_range(). The call must be awaited because it opens asynchronous MCP connections.
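Because initialize() is a coroutine, it needs an event loop to run in. The toy coroutine below stands in for agent.initialize() to show the two common entry-point patterns:

```python
import asyncio

# Toy coroutine standing in for agent.initialize(); the real method opens
# MCP connections, which is why it must be awaited.
async def initialize():
    await asyncio.sleep(0)  # placeholder for the async MCP handshake
    return "ready"

# In a plain script, drive the coroutine with asyncio.run(); in a notebook
# or async REPL you can `await initialize()` at top level instead.
status = asyncio.run(initialize())
print(status)  # → ready
```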

Code Reference

Source Location

  • Repository: AI-Trader
  • File: agent/base_agent/base_agent.py
  • Lines: L330-404

Signature

async def initialize(self) -> None:
    """
    Initialize MCP client, tools, and LLM model.

    Sets:
        self.client: MultiServerMCPClient connected to MCP servers
        self.tools: List of available MCP tools
        self.model: ChatOpenAI or DeepSeekChatOpenAI instance

    Returns:
        None
    """

Import

from agent.base_agent.base_agent import BaseAgent
# Then: await agent.initialize()

I/O Contract

Inputs

  • self.mcp_config (Dict[str, Dict], required): MCP server configuration (URLs and transport types)
  • self.basemodel (str, required): LLM model name (e.g., "gpt-4o", "deepseek-chat")
  • self.openai_api_key (str, required): API key for the LLM provider
  • self.openai_base_url (str, required): Base URL for the LLM API endpoint
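For illustration, self.mcp_config might look like the dictionary below. The server names, URLs, and transport values are hypothetical; the actual entries depend on which MCP servers the deployment configures.

```python
# Hypothetical mcp_config: keys are server names, values hold the
# connection details (URL and transport type) for each MCP server.
mcp_config = {
    "market_data": {
        "url": "http://localhost:8000/mcp",
        "transport": "streamable_http",
    },
    "trade_execution": {
        "url": "http://localhost:8001/mcp",
        "transport": "streamable_http",
    },
}
```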

Outputs

  • self.client (MultiServerMCPClient): connected MCP client
  • self.tools (List): available MCP tools from all connected servers
  • self.model (ChatOpenAI or DeepSeekChatOpenAI): LLM model instance ready for inference

Usage Examples

Initialize Agent

from agent.base_agent.base_agent import BaseAgent

agent = BaseAgent(
    signature="gpt-4o",
    basemodel="gpt-4o",
    openai_api_key="sk-...",
    openai_base_url="https://api.openai.com/v1"
)

await agent.initialize()
# agent.client, agent.tools, agent.model are now set

Related Pages

Requires Environment

Uses Heuristic

Implements Principle
