bots package

Subpackages

Module contents

Bots package initialization and primary interface.

This module serves as the main entry point for the bots package, providing access to:

  • Core bot implementations:
    • AnthropicBot: Claude-based bot implementation

    • ChatGPT_Bot: GPT-based bot implementation

  • Development tools:
    • auto_terminal: Advanced terminal interface for autonomous coding

    • lazy: Runtime code generation decorator

    • project_tree: Project structure analysis and management

  • Tool collections:
    • python_editing_tools: Python code modification utilities

    • meta_tools: Bot self-modification capabilities

    • terminal_tools: Command-line interaction tools

    • code_tools: General code manipulation utilities

    • self_tools: Bot introspection utilities

The package follows a layered architecture with foundation, flows, and tools layers. All commonly used components are imported here for convenient access.

Example Usage:
>>> from bots import AnthropicBot
>>> import bots.tools.code_tools as code_tools
>>>
>>> # Initialize and equip bot
>>> bot = AnthropicBot()
>>> bot.add_tools(code_tools)
>>>
>>> # Single response
>>> response = bot.respond("Please write a basic Flask app")
>>>
>>> # Interactive mode
>>> bot.chat()
class bots.AnthropicBot(api_key: str | None = None, model_engine: Engines = Engines.CLAUDE37_SONNET_20250219, max_tokens: int = 4096, temperature: float = 0.3, name: str = 'Claude', role: str = 'assistant', role_description: str = 'a friendly AI assistant', autosave: bool = True)[source]

Bases: Bot

A bot implementation using the Anthropic API.

Use when you need to create a bot that interfaces with Anthropic’s chat completion models. Provides a complete implementation with Anthropic-specific conversation management, tool handling, and message processing. Supports both simple chat interactions and complex tool-using conversations.

Inherits from:

Bot: Base class for all bot implementations, providing core conversation and tool management

api_key

Anthropic API key for authentication

Type:

str

model_engine

The Anthropic model being used (e.g., CLAUDE37_SONNET_20250219)

Type:

Engines

max_tokens

Maximum tokens allowed in completion responses

Type:

int

temperature

Response randomness factor (0-1)

Type:

float

name

Instance name for identification

Type:

str

role

Bot’s role identifier

Type:

str

role_description

Detailed description of bot’s role/personality (for human readers; not sent to the API)

Type:

str

system_message

System-level instructions for the bot

Type:

str

tool_handler

Manages function calling capabilities

Type:

AnthropicToolHandler

conversation

Manages conversation history

Type:

AnthropicNode

mailbox

Handles API communication

Type:

AnthropicMailbox

autosave

Whether to automatically save state after responses

Type:

bool

Example

```python
# Create a documentation expert bot
bot = AnthropicBot(
    model_engine=Engines.CLAUDE37_SONNET_20250219,
    temperature=0.3,
    role_description="a Python documentation expert"
)

# Add tools and use the bot
bot.add_tool(my_function)
response = bot.respond("Please help document this code.")

# Save the bot's state for later use
bot.save("doc_expert.bot")
```

__init__(api_key: str | None = None, model_engine: Engines = Engines.CLAUDE37_SONNET_20250219, max_tokens: int = 4096, temperature: float = 0.3, name: str = 'Claude', role: str = 'assistant', role_description: str = 'a friendly AI assistant', autosave: bool = True) None[source]

Initialize an AnthropicBot.

Parameters:
  • api_key – Optional API key (will use ANTHROPIC_API_KEY env var if not provided)

  • model_engine – The Anthropic model to use (default: CLAUDE37_SONNET_20250219)

  • max_tokens – Maximum tokens per response (default: 4096)

  • temperature – Response randomness, 0-1 (default: 0.3)

  • name – Bot’s name (default: ‘Claude’)

  • role – Bot’s role (default: ‘assistant’)

  • role_description – Description of bot’s role (default: ‘a friendly AI assistant’)

  • autosave – Whether to autosave state after responses (default: True, saves to cwd)
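A minimal initialization sketch using the parameters documented above (assumes ANTHROPIC_API_KEY is set in the environment):

```python
from bots import AnthropicBot, Engines

# With api_key omitted, the key is read from the ANTHROPIC_API_KEY environment variable
bot = AnthropicBot(
    model_engine=Engines.CLAUDE37_SONNET_20250219,
    max_tokens=4096,
    temperature=0.3,
    name="Claude",
    autosave=False,  # skip writing state to the current directory after each response
)
response = bot.respond("Summarize this module in one sentence.")
```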

class bots.ChatGPT_Bot(api_key: str | None = None, model_engine: Engines = Engines.GPT4, max_tokens: int = 4096, temperature: float = 0.3, name: str = 'bot', role: str = 'assistant', role_description: str = 'a friendly AI assistant', autosave: bool = True)[source]

Bases: Bot

A bot implementation using the OpenAI GPT API.

Use when you need to create a bot that interfaces with OpenAI’s chat completion models. Provides a complete implementation with OpenAI-specific conversation management, tool handling, and message processing. Supports both simple chat interactions and complex tool-using conversations.

Inherits from:

Bot: Base class for all bot implementations, providing core conversation and tool management

api_key

OpenAI API key for authentication

Type:

str

model_engine

The OpenAI model being used (e.g., GPT4)

Type:

Engines

max_tokens

Maximum tokens allowed in completion responses

Type:

int

temperature

Response randomness factor (0-1)

Type:

float

name

Instance name for identification

Type:

str

role

Bot’s role identifier

Type:

str

role_description

Detailed description of bot’s role/personality

Type:

str

system_message

System-level instructions for the bot

Type:

str

tool_handler

Manages function calling capabilities

Type:

OpenAIToolHandler

conversation

Manages conversation history

Type:

OpenAINode

mailbox

Handles API communication

Type:

OpenAIMailbox

autosave

Whether to automatically save state after responses

Type:

bool

Example

```python
# Create a documentation expert bot
bot = ChatGPT_Bot(
    model_engine=Engines.GPT4,
    temperature=0.3,
    role_description="a Python documentation expert"
)

# Add tools and use the bot
bot.add_tool(my_function)
response = bot.respond("Please help document this code.")

# Save the bot's state for later use
bot.save("doc_expert.bot")
```

__init__(api_key: str | None = None, model_engine: Engines = Engines.GPT4, max_tokens: int = 4096, temperature: float = 0.3, name: str = 'bot', role: str = 'assistant', role_description: str = 'a friendly AI assistant', autosave: bool = True)[source]

Initialize a ChatGPT bot with OpenAI-specific components.

Use when you need to create a new OpenAI-based bot instance with specific configuration. Sets up all necessary components for OpenAI interaction including conversation management, tool handling, and API communication.

Parameters:
  • api_key (Optional[str]) – OpenAI API key. If not provided, attempts to read from OPENAI_API_KEY environment variable

  • model_engine (Engines) – The OpenAI model to use, defaults to GPT-4. Determines capabilities and pricing

  • max_tokens (int) – Maximum tokens in completion response, defaults to 4096. Affects response length and API costs

  • temperature (float) – Response randomness (0-1), defaults to 0.3. Higher values make responses more creative but less focused

  • name (str) – Name of the bot instance, defaults to ‘bot’. Used for identification in logs and saved states

  • role (str) – Role identifier for the bot, defaults to ‘assistant’. Used in message formatting

  • role_description (str) – Description of the bot’s role/personality, defaults to ‘a friendly AI assistant’. Guides bot behavior

  • autosave (bool) – Whether to automatically save conversation state, defaults to True. Enables conversation recovery

Note

The bot is initialized with OpenAI-specific implementations of:

  • OpenAIToolHandler for function calling

  • OpenAINode for conversation management

  • OpenAIMailbox for API communication
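For comparison with the Anthropic example above, a minimal sketch of an OpenAI-backed setup (assumes OPENAI_API_KEY is set in the environment; the prompt text is illustrative):

```python
from bots import ChatGPT_Bot, Engines

# With api_key omitted, the key is read from the OPENAI_API_KEY environment variable
bot = ChatGPT_Bot(
    model_engine=Engines.GPT4TURBO,
    temperature=0.3,
    role_description="a code reviewer",
)
print(bot.respond("Review this function for style issues: def f(x): return x"))
```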

class bots.Engines(*values)[source]

Bases: str, Enum

Enum class representing different AI model engines.
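Because Engines subclasses str, each member’s value is the provider’s model-name string; a quick sketch:

```python
from bots import Engines

print(Engines.GPT4.value)                      # 'gpt-4'
print(Engines.CLAUDE37_SONNET_20250219.value)  # 'claude-3-7-sonnet-20250219'

# Round-trip a model-name string back to an enum member
assert Engines.get("gpt-4") == Engines.GPT4
```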

static get(name: str) Engines | None[source]

Retrieve an Engines enum member by its string value.

Use when you need to convert a model name string to an Engines enum member.

Parameters:

name (str) – The string value of the engine (e.g., ‘gpt-4’, ‘claude-3-opus-20240229’)

Returns:

The corresponding Engines enum member, or None if not found

Return type:

Optional[Engines]

Example

```python
engine = Engines.get('gpt-4')
if engine:
    bot = Bot(model_engine=engine)
```

static get_bot_class(model_engine: Engines) Type[Bot][source]

Get the appropriate Bot subclass for a given model engine.

Use when you need to programmatically determine which Bot implementation to use for a specific model engine.

Parameters:

model_engine (Engines) – The engine enum member to get the bot class for

Returns:

The Bot subclass (ChatGPT_Bot or AnthropicBot)

Return type:

Type[Bot]

Raises:

ValueError – If the model engine is not supported

Example

```python
bot_class = Engines.get_bot_class(Engines.GPT4)
bot = bot_class(api_key="key")
```

static get_conversation_node_class(class_name: str) Type[ConversationNode][source]

Get the appropriate ConversationNode subclass by name.

Use when you need to reconstruct conversation nodes from saved bot state.

Parameters:

class_name (str) – Name of the node class (‘OpenAINode’ or ‘AnthropicNode’)

Returns:

The ConversationNode subclass

Return type:

Type[ConversationNode]

Raises:

ValueError – If the class name is not a supported node type
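A short sketch of the lookup and its documented error behavior:

```python
from bots import Engines

# Resolve the conversation node class recorded in a saved bot's state
node_class = Engines.get_conversation_node_class("AnthropicNode")

# Unsupported names raise ValueError, as documented above
try:
    Engines.get_conversation_node_class("UnknownNode")
except ValueError as exc:
    print(f"Unsupported node type: {exc}")
```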

GPT4 = 'gpt-4'
GPT4_0613 = 'gpt-4-0613'
GPT4_32K = 'gpt-4-32k'
GPT4_32K_0613 = 'gpt-4-32k-0613'
GPT4TURBO = 'gpt-4-turbo-preview'
GPT4TURBO_0125 = 'gpt-4-0125-preview'
GPT4TURBO_VISION = 'gpt-4-vision-preview'
GPT35TURBO = 'gpt-3.5-turbo'
GPT35TURBO_16K = 'gpt-3.5-turbo-16k'
GPT35TURBO_0125 = 'gpt-3.5-turbo-0125'
GPT35TURBO_INSTRUCT = 'gpt-3.5-turbo-instruct'
CLAUDE3_HAIKU = 'claude-3-haiku-20240307'
CLAUDE3_SONNET = 'claude-3-sonnet-20240229'
CLAUDE3_OPUS = 'claude-3-opus-20240229'
CLAUDE35_SONNET_20240620 = 'claude-3-5-sonnet-20240620'
CLAUDE35_SONNET_20241022 = 'claude-3-5-sonnet-20241022'
CLAUDE37_SONNET_20250219 = 'claude-3-7-sonnet-20250219'
bots.load(filepath: str) Bot[source]

Load a saved bot from a file.

Use when you need to restore a previously saved bot with its complete state, including conversation history, tools, and configuration.

Parameters:

filepath (str) – Path to the .bot file containing the saved bot state

Returns:

A reconstructed Bot instance with the saved state

Return type:

Bot

Example

```python
bot = bots.load("my_saved_bot.bot")
bot.respond("Continue our previous conversation")
```

bots.lazy(prompt: str | None = None, bot: Bot | None = None, context: str | None = None) Callable[source]

Decorator that lazily implements a function using an LLM at runtime.

Use when you need to generate function implementations dynamically using an LLM. The implementation will be generated on first call and persisted to the source file.

Parameters:
  • prompt (Optional[str]) – Additional instructions for the LLM about how to implement the function. Defaults to empty string.

  • bot (Optional[Bot]) – The bot instance to use for implementation. Defaults to a new AnthropicBot instance.

  • context (Optional[str]) – Level of context to provide to the LLM. Defaults to ‘None’. Options are:
    • ‘None’: No additional context

    • ‘low’: Only the containing class

    • ‘medium’: The entire current file

    • ‘high’: Current file and interfaces of other files in directory

    • ‘very high’: All Python files in directory

Returns:

A decorator function that wraps the target function

Return type:

Callable

Example

```python
@lazy("Sort using a funny algorithm. Name variables as though you're a clown.")
def sort(arr: list[int]) -> list[int]:
    pass
```
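The other documented parameters (bot and context) can be supplied the same way; a sketch, assuming the decorated function lives in a file the bot may edit:

```python
from bots import AnthropicBot, lazy

reviewer = AnthropicBot(temperature=0.2, autosave=False)

@lazy(
    "Validate that the input looks like a well-formed email address.",
    bot=reviewer,      # reuse an existing bot instead of a fresh AnthropicBot
    context="medium",  # give the LLM the entire current file as context
)
def is_valid_email(address: str) -> bool:
    pass  # replaced with a generated implementation on first call
```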