bots package
Subpackages
- bots.dev package
- bots.flows package
- bots.foundation package
- Submodules
- bots.foundation.anthropic_bots module
AnthropicNode, AnthropicToolHandler, AnthropicMailbox, AnthropicBot (api_key, model_engine, max_tokens, temperature, name, role, role_description, system_message, tool_handler, conversation, mailbox, autosave, __init__())
CacheController
- bots.foundation.base module
load(), Engines (GPT4, GPT4_0613, GPT4_32K, GPT4_32K_0613, GPT4TURBO, GPT4TURBO_0125, GPT4TURBO_VISION, GPT35TURBO, GPT35TURBO_16K, GPT35TURBO_0125, GPT35TURBO_INSTRUCT, CLAUDE3_HAIKU, CLAUDE3_SONNET, CLAUDE3_OPUS, CLAUDE35_SONNET_20240620, CLAUDE35_SONNET_20241022, CLAUDE37_SONNET_20250219, get(), get_bot_class(), get_conversation_node_class())
ConversationNode (role, content, parent, replies, tool_calls, tool_results, pending_results, __init__(), _create_empty(), _is_empty(), _add_reply(), _sync_tool_context(), _add_tool_calls(), _add_tool_results(), _find_root(), _root_dict(), _to_dict_recursive(), _to_dict_self(), _build_messages(), _node_count())
ModuleContext, ToolHandlerError, ToolNotFoundError, ModuleLoadError, ToolHandler (tools, function_map, requests, results, modules, __init__(), generate_tool_schema(), generate_request_schema(), tool_name_and_input(), generate_response_schema(), generate_error_schema(), extract_requests(), exec_requests(), _create_builtin_wrapper(), _create_dynamic_wrapper(), add_tool(), _add_tools_from_file(), _add_tools_from_module(), to_dict(), from_dict(), get_tools_json(), clear(), add_request(), add_result(), get_results(), get_requests(), _get_code_hash(), __str__(), __repr__())
Mailbox, Bot (api_key, name, model_engine, max_tokens, temperature, role, role_description, conversation, system_message, tool_handler, mailbox, autosave, __init__(), respond(), add_tools(), _cvsn_respond(), set_system_message(), load(), save(), chat(), __mul__(), __str__())
- bots.foundation.openai_bots module
- Module contents
- bots.tools package
- Submodules
- bots.tools.code_tools module
- bots.tools.meta_tools module
- bots.tools.python_editing_tools module
NodeTransformerWithAsyncSupport, FunctionReplacer, MethodAdder, add_imports(), remove_import(), replace_import(), add_class(), replace_class(), add_function_to_class(), add_function_to_file(), replace_function(), _make_file(), _add_single_function_to_class(), _add_single_function_to_file(), _replace_single_function()
- bots.tools.python_execution_tool module
- bots.tools.self_tools module
- bots.tools.terminal_tools module
- Module contents
Module contents
Bots package initialization and primary interface.
This module serves as the main entry point for the bots package, providing access to:
- Core bot implementations:
AnthropicBot: Claude-based bot implementation
ChatGPT_Bot: GPT-based bot implementation
- Development tools:
auto_terminal: Advanced terminal interface for autonomous coding
lazy: Runtime code generation decorator
project_tree: Project structure analysis and management
- Tool collections:
python_editing_tools: Python code modification utilities
meta_tools: Bot self-modification capabilities
terminal_tools: Command-line interaction tools
code_tools: General code manipulation utilities
self_tools: Bot introspection utilities
The package follows a layered architecture with foundation, flows, and tools layers. All commonly used components are imported here for convenient access.
- Example Usage:
>>> from bots import AnthropicBot
>>> import bots.tools.code_tools as code_tools
>>>
>>> # Initialize and equip bot
>>> bot = AnthropicBot()
>>> bot.add_tools(code_tools)
>>>
>>> # Single response
>>> response = bot.respond("Please write a basic Flask app")
>>>
>>> # Interactive mode
>>> bot.chat()
- class bots.AnthropicBot(api_key: str | None = None, model_engine: Engines = Engines.CLAUDE37_SONNET_20250219, max_tokens: int = 4096, temperature: float = 0.3, name: str = 'Claude', role: str = 'assistant', role_description: str = 'a friendly AI assistant', autosave: bool = True)[source]
Bases: Bot
A bot implementation using the Anthropic API.
Use when you need to create a bot that interfaces with Anthropic’s chat completion models. Provides a complete implementation with Anthropic-specific conversation management, tool handling, and message processing. Supports both simple chat interactions and complex tool-using conversations.
- Inherits from:
Bot: Base class for all bot implementations, providing core conversation and tool management
- api_key
Anthropic API key for authentication
- Type:
str
- max_tokens
Maximum tokens allowed in completion responses
- Type:
int
- temperature
Response randomness factor (0-1)
- Type:
float
- name
Instance name for identification
- Type:
str
- role
Bot’s role identifier
- Type:
str
- role_description
Detailed description of the bot’s role/personality (for humans to read; not sent to the API)
- Type:
str
- system_message
System-level instructions for the bot
- Type:
str
- tool_handler
Manages function calling capabilities
- Type:
AnthropicToolHandler
- conversation
Manages conversation history
- Type:
AnthropicNode
- mailbox
Handles API communication
- Type:
AnthropicMailbox
- autosave
Whether to automatically save state after responses
- Type:
bool
Example
```python
# Create a documentation expert bot
bot = AnthropicBot(
    model_engine=Engines.CLAUDE37_SONNET_20250219,
    temperature=0.3,
    role_description="a Python documentation expert",
)

# Add tools and use the bot
bot.add_tools(my_function)
response = bot.respond("Please help document this code.")

# Save the bot's state for later use
bot.save("doc_expert.bot")
```
- __init__(api_key: str | None = None, model_engine: Engines = Engines.CLAUDE37_SONNET_20250219, max_tokens: int = 4096, temperature: float = 0.3, name: str = 'Claude', role: str = 'assistant', role_description: str = 'a friendly AI assistant', autosave: bool = True) None[source]
Initialize an AnthropicBot.
- Parameters:
api_key – Optional API key (will use ANTHROPIC_API_KEY env var if not provided)
model_engine – The Anthropic model to use (default: CLAUDE37_SONNET_20250219)
max_tokens – Maximum tokens per response (default: 4096)
temperature – Response randomness, 0-1 (default: 0.3)
name – Bot’s name (default: ‘Claude’)
role – Bot’s role (default: ‘assistant’)
role_description – Description of bot’s role (default: ‘a friendly AI assistant’)
autosave – Whether to autosave state after responses (default: True, saves to cwd)
- class bots.ChatGPT_Bot(api_key: str | None = None, model_engine: Engines = Engines.GPT4, max_tokens: int = 4096, temperature: float = 0.3, name: str = 'bot', role: str = 'assistant', role_description: str = 'a friendly AI assistant', autosave: bool = True)[source]
Bases: Bot
A bot implementation using the OpenAI GPT API.
Use when you need to create a bot that interfaces with OpenAI’s chat completion models. Provides a complete implementation with OpenAI-specific conversation management, tool handling, and message processing. Supports both simple chat interactions and complex tool-using conversations.
- Inherits from:
Bot: Base class for all bot implementations, providing core conversation and tool management
- api_key
OpenAI API key for authentication
- Type:
str
- max_tokens
Maximum tokens allowed in completion responses
- Type:
int
- temperature
Response randomness factor (0-1)
- Type:
float
- name
Instance name for identification
- Type:
str
- role
Bot’s role identifier
- Type:
str
- role_description
Detailed description of bot’s role/personality
- Type:
str
- system_message
System-level instructions for the bot
- Type:
str
- tool_handler
Manages function calling capabilities
- Type:
OpenAIToolHandler
- conversation
Manages conversation history
- Type:
OpenAINode
- mailbox
Handles API communication
- Type:
OpenAIMailbox
- autosave
Whether to automatically save state after responses
- Type:
bool
Example
```python
# Create a documentation expert bot
bot = ChatGPT_Bot(
    model_engine=Engines.GPT4,
    temperature=0.3,
    role_description="a Python documentation expert",
)

# Add tools and use the bot
bot.add_tools(my_function)
response = bot.respond("Please help document this code.")

# Save the bot's state for later use
bot.save("doc_expert.bot")
```
- __init__(api_key: str | None = None, model_engine: Engines = Engines.GPT4, max_tokens: int = 4096, temperature: float = 0.3, name: str = 'bot', role: str = 'assistant', role_description: str = 'a friendly AI assistant', autosave: bool = True)[source]
Initialize a ChatGPT bot with OpenAI-specific components.
Use when you need to create a new OpenAI-based bot instance with specific configuration. Sets up all necessary components for OpenAI interaction including conversation management, tool handling, and API communication.
- Parameters:
api_key (Optional[str]) – OpenAI API key. If not provided, attempts to read from OPENAI_API_KEY environment variable
model_engine (Engines) – The OpenAI model to use, defaults to GPT-4. Determines capabilities and pricing
max_tokens (int) – Maximum tokens in completion response, defaults to 4096. Affects response length and API costs
temperature (float) – Response randomness (0-1), defaults to 0.3. Higher values make responses more creative but less focused
name (str) – Name of the bot instance, defaults to ‘bot’. Used for identification in logs and saved states
role (str) – Role identifier for the bot, defaults to ‘assistant’. Used in message formatting
role_description (str) – Description of the bot’s role/personality, defaults to ‘a friendly AI assistant’. Guides bot behavior
autosave (bool) – Whether to automatically save conversation state, defaults to True. Enables conversation recovery
Note
The bot is initialized with OpenAI-specific implementations of:
- OpenAIToolHandler for function calling
- OpenAINode for conversation management
- OpenAIMailbox for API communication
- class bots.Engines(*values)[source]
Bases: str, Enum
Enum class representing different AI model engines.
- static get(name: str) Engines | None[source]
Retrieve an Engines enum member by its string value.
Use when you need to convert a model name string to an Engines enum member.
- Parameters:
name (str) – The string value of the engine (e.g., ‘gpt-4’, ‘claude-3-opus-20240229’)
- Returns:
The corresponding Engines enum member, or None if not found
- Return type:
Optional[Engines]
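The by-value lookup can be sketched with a plain `str`-backed `Enum`. This is a self-contained illustration, not the library's actual source, and the member set is abbreviated:

```python
from enum import Enum
from typing import Optional


class Engines(str, Enum):
    # Abbreviated member set for illustration only
    GPT4 = "gpt-4"
    CLAUDE3_OPUS = "claude-3-opus-20240229"

    @staticmethod
    def get(name: str) -> Optional["Engines"]:
        # Scan members for a matching string value; return None if absent
        for member in Engines:
            if member.value == name:
                return member
        return None
```

Because the enum mixes in `str`, each member also compares equal to its model-name string, which is what makes a value-based lookup like this natural.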
- static get_bot_class(model_engine: Engines) Type[Bot][source]
Get the appropriate Bot subclass for a given model engine.
Use when you need to programmatically determine which Bot implementation to use for a specific model engine.
- Parameters:
model_engine (Engines) – The engine enum member to get the bot class for
- Returns:
The Bot subclass (ChatGPT_Bot or AnthropicBot)
- Return type:
Type[Bot]
- Raises:
ValueError – If the model engine is not supported
Example
```python
bot_class = Engines.get_bot_class(Engines.GPT4)
bot = bot_class(api_key="key")
```
- static get_conversation_node_class(class_name: str) Type[ConversationNode][source]
Get the appropriate ConversationNode subclass by name.
Use when you need to reconstruct conversation nodes from saved bot state.
- Parameters:
class_name (str) – Name of the node class (‘OpenAINode’ or ‘AnthropicNode’)
- Returns:
The ConversationNode subclass
- Return type:
Type[ConversationNode]
- Raises:
ValueError – If the class name is not a supported node type
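A name-to-class lookup like this is typically a small registry keyed by class name. A hypothetical sketch (the placeholder class bodies are not the library's real node classes):

```python
from typing import Dict, Type


class ConversationNode:
    """Placeholder base class for this sketch."""


class OpenAINode(ConversationNode):
    pass


class AnthropicNode(ConversationNode):
    pass


# Registry mapping saved class names back to node classes
_NODE_CLASSES: Dict[str, Type[ConversationNode]] = {
    "OpenAINode": OpenAINode,
    "AnthropicNode": AnthropicNode,
}


def get_conversation_node_class(class_name: str) -> Type[ConversationNode]:
    # Mirror the documented contract: ValueError for unsupported names
    try:
        return _NODE_CLASSES[class_name]
    except KeyError:
        raise ValueError(f"Unsupported conversation node class: {class_name}")
```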
- GPT4 = 'gpt-4'
- GPT4_0613 = 'gpt-4-0613'
- GPT4_32K = 'gpt-4-32k'
- GPT4_32K_0613 = 'gpt-4-32k-0613'
- GPT4TURBO = 'gpt-4-turbo-preview'
- GPT4TURBO_0125 = 'gpt-4-0125-preview'
- GPT4TURBO_VISION = 'gpt-4-vision-preview'
- GPT35TURBO = 'gpt-3.5-turbo'
- GPT35TURBO_16K = 'gpt-3.5-turbo-16k'
- GPT35TURBO_0125 = 'gpt-3.5-turbo-0125'
- GPT35TURBO_INSTRUCT = 'gpt-3.5-turbo-instruct'
- CLAUDE3_HAIKU = 'claude-3-haiku-20240307'
- CLAUDE3_SONNET = 'claude-3-sonnet-20240229'
- CLAUDE3_OPUS = 'claude-3-opus-20240229'
- CLAUDE35_SONNET_20240620 = 'claude-3-5-sonnet-20240620'
- CLAUDE35_SONNET_20241022 = 'claude-3-5-sonnet-20241022'
- CLAUDE37_SONNET_20250219 = 'claude-3-7-sonnet-20250219'
- bots.load(filepath: str) Bot[source]
Load a saved bot from a file.
Use when you need to restore a previously saved bot with its complete state, including conversation history, tools, and configuration.
- Parameters:
filepath (str) – Path to the .bot file containing the saved bot state
- Returns:
A reconstructed Bot instance with the saved state
- Return type:
Bot
Example
```python
bot = bots.load("my_saved_bot.bot")
bot.respond("Continue our previous conversation")
```
- bots.lazy(prompt: str | None = None, bot: Bot | None = None, context: str | None = None) Callable[source]
Decorator that lazily implements a function using an LLM at runtime.
Use when you need to generate function implementations dynamically using an LLM. The implementation will be generated on first call and persisted to the source file.
- Parameters:
prompt (Optional[str]) – Additional instructions for the LLM about how to implement the function. Defaults to empty string.
bot (Optional[Bot]) – The bot instance to use for implementation. Defaults to a new AnthropicBot instance.
context (Optional[str]) – Level of context to provide to the LLM. Options are:
- ‘None’: no additional context
- ‘low’: only the containing class
- ‘medium’: the entire current file
- ‘high’: current file and interfaces of other files in the directory
- ‘very high’: all Python files in the directory
Defaults to ‘None’.
- Returns:
A decorator function that wraps the target function
- Return type:
Callable
Example
```python
@lazy("Sort using a funny algorithm. Name variables as though you're a clown.")
def sort(arr: list[int]) -> list[int]:
    pass
```
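The generate-on-first-call shape of the decorator can be sketched as a caching wrapper. This is a simplified stand-in: the real decorator asks an LLM for the implementation and persists it to the source file, whereas this stub substitutes a fixed sorting function:

```python
import functools


def lazy(prompt: str = ""):
    def decorator(func):
        cache = {}

        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            if "impl" not in cache:
                # The real decorator generates an implementation with an LLM
                # here and writes it back to the source file. This sketch
                # installs a fixed stub on first call instead.
                cache["impl"] = lambda arr: sorted(arr)
            return cache["impl"](*args, **kwargs)

        return wrapper

    return decorator


@lazy("Sort using a funny algorithm.")
def sort(arr):
    pass  # Body is intentionally empty; the wrapper supplies it lazily
```

The key property is that the decorated function's body is never executed: the wrapper intercepts the first call, installs an implementation, and reuses it on every subsequent call.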