TOXO DOCS

Complete documentation for the TOXO Python Library - Transform any LLM into a Context Augmented Language Model (CALM)

📦 PyPI Package: pip install toxo

Quick Start

Get up and running with TOXO in minutes

Basic Usage
# Install the TOXO Python Library
pip install toxo

# Import and use the .toxo file
from toxo import ToxoLayer

# Load your trained CALM layer
layer = ToxoLayer.load("viral_linkedin_ultimate.toxo")

# Connect to ANY LLM with specific model selection (pick one provider below)
# Gemini (recommended)
layer.setup_api_key("your_gemini_key", "gemini-2.0-flash-exp", "gemini")

# OpenAI GPT
layer.setup_api_key("your_openai_key", "gpt-4", "openai")

# Claude
layer.setup_api_key("your_claude_key", "claude-3.5-sonnet", "claude")

# Your LLM is now a domain expert!
response = layer.query("Create a viral LinkedIn post about AI trends")
print(response)

Core Concepts

Smart Layers

TOXO creates intelligent layers that attach to any LLM API, providing instant domain expertise without retraining.

CALM Technology

Context Augmented Language Models (CALM) enhance any black-box LLM with persistent knowledge and domain expertise.

Universal Compatibility

Works with ANY LLM: Gemini, GPT, Claude, local models, and custom APIs. No model-specific training required.

API Reference

ToxoLayer.load(path)

Load a trained smart layer from a .toxo file

Parameters:
  • path - Path to the .toxo file
Returns:
ToxoLayer instance
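
A minimal sketch, assuming the sample file from Quick Start is in the working directory:

from toxo import ToxoLayer

# Returns a ToxoLayer instance, ready for setup_api_key() and query()
layer = ToxoLayer.load("viral_linkedin_ultimate.toxo")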

setup_api_key(key, model, provider)

Configure API credentials and model selection

Parameters:
  • key - Your API key
  • model - Model name (e.g., "gpt-4", "gemini-2.0-flash-exp")
  • provider - Provider ("openai", "gemini", "claude")
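
A minimal sketch; the key below is a placeholder for your own credentials:

# key: your API key; model: provider-specific model name; provider: "openai", "gemini", or "claude"
layer.setup_api_key("your_openai_key", "gpt-4o", "openai")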

query(prompt)

Query the enhanced LLM with domain expertise

Parameters:
  • prompt - Your query or prompt
Returns:
Enhanced response with domain knowledge
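
A minimal sketch, assuming the layer has been loaded and configured as above:

# The response is shaped by the layer's domain knowledge
response = layer.query("Draft a LinkedIn hook about multimodal AI")
print(response)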

query_async(prompt)

Asynchronous version of query() for high-performance applications

Parameters:
  • prompt - Your query or prompt
Returns:
Awaitable enhanced response
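
A minimal async sketch, assuming the layer has been loaded and configured as above:

import asyncio

async def ask():
    # Awaitable counterpart of query(), suited to concurrent workloads
    return await layer.query_async("Summarize this week's AI news for LinkedIn")

print(asyncio.run(ask()))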

Advanced Features

Memory Systems

Establish context once and use it across all future queries

Memory Example
# Establish context once (Memory Systems)
from toxo import ToxoLayer
layer = ToxoLayer.load("viral_linkedin_ultimate.toxo")
layer.setup_api_key("your_gemini_key", "gemini-2.0-flash-exp", "gemini")

# Teach the layer about you/your brand once
layer.query("""
Remember these details about me:
- I'm the founder of Toxo (AI training platform)
- Target audience: AI/ML professionals and startup founders
- Goal: Build thought leadership and drive B2B leads
""")

# Subsequent queries will automatically use this context
response = layer.query("Create a 7-day LinkedIn content plan tailored to my audience")
print(response)

Response Ranking

Generate multiple options and get AI-evaluated rankings with reasoning

Ranking Example
# Generate multiple options and rank them (Response Ranking)
from toxo import ToxoLayer
layer = ToxoLayer.load("viral_linkedin_ultimate.toxo")
layer.setup_api_key("your_openai_key", "gpt-4", "openai")

prompt = """
Generate 5 LinkedIn post ideas about AI trends in 2025.
Rank them from 1-5 by viral potential, engagement likelihood, and business impact.
For each, explain the ranking decision and suggested call-to-action.
"""
response = layer.query(prompt)
print(response)

Capability Detection

Programmatically discover domain, components, and capabilities of your layer

Discovery Example
# Discover what your .toxo layer can do (Capability Detection)
from toxo import ToxoLayer
layer = ToxoLayer.load("viral_linkedin_ultimate.toxo")
layer.setup_api_key("your_claude_key", "claude-3.5-sonnet", "claude")

info = layer.get_info()
print("Domain:", info.get("domain"))
print("Training examples:", info.get("training_examples"))
print("Components:", info.get("components"))

caps = layer.get_capabilities()
print("Capabilities:", caps)

Async Support

High-performance async queries for production applications

Async Example
import asyncio
from toxo import ToxoLayer

async def main():
    # Load your trained CALM layer
    layer = ToxoLayer.load("viral_linkedin_ultimate.toxo")
    
    # Setup API key for your preferred LLM
    layer.setup_api_key("your_api_key", "gpt-4", "openai")
    
    # Query with domain expertise
    response = layer.query("Write a compelling LinkedIn post about AI innovation")
    print(response)
    
    # Async support for production applications
    async_response = await layer.query_async("Analyze market trends for AI startups")
    print(async_response)

asyncio.run(main())

Continuous Learning

Improve performance with feedback and suggestions

Feedback System
# Provide feedback to enhance performance
# (assumes `layer` is a ToxoLayer that has already been loaded and configured)
layer.add_feedback(
    question="Investment strategy question",
    response="Generated response...",
    rating=8.5  # Quality score 0-10
)

Multi-Agent Systems

Orchestrate multiple specialized layers for complex workflows

Multi-Agent Example
# Multiple domain experts working together
import asyncio
from toxo import ToxoLayer

research_agent = ToxoLayer.load("research_expert.toxo")
writing_agent = ToxoLayer.load("writing_expert.toxo")

# Set up a different provider for each agent
research_agent.setup_api_key("gemini_key", "gemini-2.0-flash-exp", "gemini")
writing_agent.setup_api_key("openai_key", "gpt-4", "openai")

# Collaborative AI workflow: the awaits must run inside an async function
async def collaborate():
    research = await research_agent.query_async("Research quantum computing")
    report = await writing_agent.query_async(f"Write report: {research}")
    return report

print(asyncio.run(collaborate()))

Supported LLM Providers

Google Gemini

Recommended: optimized integration

  • gemini-2.0-flash-exp
  • gemini-1.5-pro
  • gemini-1.5-flash

OpenAI GPT

Full GPT model support

  • gpt-4
  • gpt-4o
  • gpt-3.5-turbo

Anthropic Claude

Advanced reasoning models

  • claude-3.5-sonnet
  • claude-3-haiku
  • claude-3-opus
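
Provider Selection Example

A sketch of pairing a model name from the lists above with the matching provider string from setup_api_key(); the keys are placeholders:

# The provider argument is one of "gemini", "openai", or "claude"
layer.setup_api_key("your_gemini_key", "gemini-1.5-flash", "gemini")
# layer.setup_api_key("your_openai_key", "gpt-4o", "openai")
# layer.setup_api_key("your_claude_key", "claude-3-haiku", "claude")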

Ready to Get Started?

Download the sample .toxo file and start building Context Augmented Language Models today.