AITools API Reference
AI tools for the Bot class using LangChain and OpenRouter: run_ai_with_tools (main LLM with tools), run_ai_simple (cheap LLM, single-turn), and run_ai_simple_with_fallback (cheap-first with sanity check and optional fallback to main LLM). The module also provides _default_sanity_check for validating cheap-LLM output.
tradingbot.utils.aitools
AI tools for the Bot class using LangChain and OpenRouter.
LangSmith tracing (EU): If LANGSMITH_API_KEY is set, tracing is enabled; if LANGSMITH_ENDPOINT is not set, the EU endpoint is used. Set LANGSMITH_TRACING=false to disable tracing.
Two LLMs are supported:

- Main LLM (OPENROUTER_MAIN_MODEL, default deepseek/deepseek-v3.2): used by run_ai_with_tools for complex, multi-turn flows with tool use (market data, portfolio, trades).
- Cheap LLM (OPENROUTER_CHEAP_MODEL, default openai/gpt-oss-120b): used by run_ai_simple for simple single-turn text tasks that do not need tools, e.g. summarization, extraction, classification, rewriting, or formatting. Prefer run_ai_simple for these to save cost.
Cheap-first with fallback: Use run_ai_simple_with_fallback() (or Bot.run_ai_simple_with_fallback) to try the cheap LLM first, verify the output for sanity, and retry with the main LLM if the result fails validation. This keeps cost low while catching low-quality cheap-LLM output.
Extensibility: Subclasses can override Bot.get_ai_tools() to add custom tools; run_ai() merges them automatically. run_ai_with_tools() accepts extra_tools= and optional tool_names= to whitelist which base tools to include.
run_ai_simple(system_prompt: str, user_message: str, model: Optional[str] = None) -> str
Run the AI for a single-turn, no-tools task (summarization, extraction, classification, rewriting). Uses the cheap LLM (OPENROUTER_CHEAP_MODEL, default openai/gpt-oss-120b). Pass model= to override. Use run_ai_with_tools when you need tool access (market data, portfolio, trades).
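The model-selection behavior described above (explicit model= wins, otherwise the OPENROUTER_CHEAP_MODEL environment variable, otherwise the documented default) can be sketched as follows; resolve_simple_model is a hypothetical helper for illustration, not part of the module's public API:

```python
import os
from typing import Optional

def resolve_simple_model(model: Optional[str] = None) -> str:
    # Explicit override takes precedence; then the env var; then the
    # documented default cheap model.
    return model or os.environ.get("OPENROUTER_CHEAP_MODEL", "openai/gpt-oss-120b")
```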
Source code in tradingbot/utils/aitools.py
run_ai_simple_with_fallback(system_prompt: str, user_message: str, sanity_check: Optional[Callable[[str], bool]] = None, fallback_to_main: bool = True) -> str
Run a simple (no-tools) task with cheap LLM first; verify output for sanity; if validation fails, retry with main LLM. Use this to save cost when the task does not require tools.
sanity_check: Callable that takes the response string and returns True if sane. If None, uses _default_sanity_check (non-empty, no refusal/error prefix).

fallback_to_main: If True and sanity_check fails, run again with the main model (OPENROUTER_MAIN_MODEL) and return that result.
Returns the first sane response, or the main-model response after fallback.
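A minimal sketch of this cheap-first pattern, with stand-in callables for the two LLM calls. The default check mirrors the documented behavior (non-empty, no refusal/error prefix), but the exact prefix list is an assumption, as are the helper names:

```python
from typing import Callable, Optional

# Assumed refusal/error prefixes; the real _default_sanity_check may differ.
_BAD_PREFIXES = ("i can't", "i cannot", "error:", "sorry")

def default_sanity_check(text: str) -> bool:
    """True if the response is non-empty and lacks a refusal/error prefix."""
    stripped = text.strip()
    return bool(stripped) and not stripped.lower().startswith(_BAD_PREFIXES)

def run_with_fallback(
    cheap_llm: Callable[[], str],
    main_llm: Callable[[], str],
    sanity_check: Optional[Callable[[str], bool]] = None,
    fallback_to_main: bool = True,
) -> str:
    # Try the cheap model first; fall back to the main model only when
    # the result fails the sanity check and fallback is enabled.
    check = sanity_check or default_sanity_check
    result = cheap_llm()
    if check(result) or not fallback_to_main:
        return result
    return main_llm()
```

Note that, as documented above, the main-model response after fallback is returned as-is rather than re-validated.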
Source code in tradingbot/utils/aitools.py
run_ai_with_tools(system_prompt: str, user_message: str, bot: 'Bot', model: Optional[str] = None, max_tool_rounds: int = 5, extra_tools: Optional[List] = None, tool_names: Optional[List[str]] = None) -> str
Run the AI with the given system prompt and user message, using tools bound to the bot. Uses the main LLM (OPENROUTER_MAIN_MODEL, default deepseek/deepseek-v3.2) for complex, multi-turn tool use. Pass model= to override.
extra_tools: Optional list of LangChain tools to add (e.g. from bot.get_ai_tools()). Tools with the same name as a base tool override the base tool.

tool_names: Optional whitelist of base tool names to include (e.g. ["get_market_data", "get_portfolio_status"]). If None, all base tools are included.
Returns the final model response as a string.
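The tool selection described above (whitelist base tools by name, then merge extras, with same-name extras overriding) can be sketched as plain dict logic; select_tools is a hypothetical helper, and tool objects are represented only by their name attribute:

```python
from typing import List, Optional

def select_tools(
    base_tools: List,
    extra_tools: Optional[List] = None,
    tool_names: Optional[List[str]] = None,
) -> List:
    # Keep only whitelisted base tools when tool_names is given.
    if tool_names is not None:
        base_tools = [t for t in base_tools if t.name in tool_names]
    # Merge by name; an extra tool with the same name replaces the base tool.
    merged = {t.name: t for t in base_tools}
    for t in extra_tools or []:
        merged[t.name] = t
    return list(merged.values())
```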