
JustLLMs Documentation

Complete guide to building production-ready LLM applications with intelligent routing, enterprise analytics, and multi-provider management.


Installation

Get started with JustLLMs in seconds. Choose the installation option that fits your needs.

Basic Installation

```bash
pip install justllms
```

With PDF Export Support

```bash
pip install justllms[pdf]
```

Full Installation (All Features)

Includes PDF export, Redis caching, and advanced analytics.

```bash
pip install justllms[all]
```

📦 Package Stats

- Size: 1.1 MB
- Lines of Code: ~11K
- Dependencies: Minimal

Quick Start

Get your first LLM response in under 30 seconds with automatic provider routing.

Quick Start Example

```python
from justllms import JustLLM

# Initialize with your API keys
client = JustLLM({
    "providers": {
        "openai": {"api_key": "your-openai-key"},
        "google": {"api_key": "your-google-key"},
        "anthropic": {"api_key": "your-anthropic-key"}
    }
})

# Simple completion - automatically routes to the best provider
response = client.completion.create(
    messages=[{"role": "user", "content": "Explain quantum computing"}]
)

print(response.content)
print(f"Used provider: {response.provider}")
print(f"Cost: ${response.cost:.4f}")
```

✅ That's it!

JustLLMs automatically chose the best provider based on cost, availability, and performance. No manual provider switching required.
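In practice, API keys are usually read from the environment instead of being hard-coded. Here is a minimal sketch of building the `providers` section that way; the variable names (`OPENAI_API_KEY`, and so on) are common conventions assumed for illustration, not something JustLLMs mandates:

```python
import os

# Collect keys from the environment; unset variables come back as None.
env_keys = {
    "openai": os.environ.get("OPENAI_API_KEY"),
    "google": os.environ.get("GOOGLE_API_KEY"),
    "anthropic": os.environ.get("ANTHROPIC_API_KEY"),
}

# Only configure providers whose key is actually present.
providers = {name: {"api_key": key} for name, key in env_keys.items() if key}

# client = JustLLM({"providers": providers})
```

This keeps secrets out of source control and lets the same code run with whichever providers happen to be configured in a given environment.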

Multi-Provider Support

Connect to all major LLM providers with a single, consistent interface.

OpenAI (GPT-5, GPT-4, etc.)
Google (Gemini 2.5, etc.)
Anthropic (Claude 3.5 models)
Azure OpenAI
xAI Grok
DeepSeek
Ollama (Local Models)
Multi-Provider Configuration

```python
from justllms import JustLLM

client = JustLLM({
    "providers": {
        "openai": {
            "api_key": "your-openai-key",
        },
        "anthropic": {
            "api_key": "your-anthropic-key",
        },
        "google": {
            "api_key": "your-google-key",
        },
        "ollama": {
            "base_url": "http://localhost:11434"
        }
    },
    "default_provider": "openai",  # Fallback if routing fails
    "timeout": 30  # Request timeout in seconds
})
```
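Because the configuration is a plain dictionary, it can also live outside the code, for example in a JSON file. A sketch under that assumption, using the same keys shown throughout this guide (`providers`, `default_provider`, `timeout`):

```python
import json

# Hypothetical config file contents; in a real project this would come from
# something like json.load(open("justllms.json")) instead of an inline string.
config_text = """
{
  "providers": {
    "openai": {"api_key": "your-openai-key"},
    "anthropic": {"api_key": "your-anthropic-key"}
  },
  "default_provider": "openai",
  "timeout": 30
}
"""
config = json.loads(config_text)

# client = JustLLM(config)
```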

Provider-Agnostic Tools

Define tools once with a simple Python decorator and use them seamlessly across OpenAI, Anthropic, or Google without learning each provider's native tool API.

Universal Tool Definition

```python
from justllms import JustLLM, tool

@tool
def get_weather(location: str) -> dict:
    """Get weather for a location."""
    return {"temperature": 22, "condition": "sunny"}

# Works with OpenAI, Anthropic, and Google - same code!
response = client.completion.create(
    messages=[{"role": "user", "content": "What's the weather in Paris?"}],
    tools=[get_weather],
    provider="openai",  # or "anthropic", "google"
    execute_tools=True
)

print(response.content)  # "The weather in Paris is sunny and 22 degrees."
```

SXS Model Comparison

Compare responses from multiple LLM providers and models side by side with an interactive CLI tool. Perfect for evaluating prompt performance and cost differences.

Run SXS Comparison

```bash
# Launch the interactive menu
justllms sxs
```

Example output:

```
================================================================================
Prompt: Which programming language is better for beginners: Python or JavaScript?
================================================================================
┌─ openai/gpt-5 ──────────────────────────────────────────────────────────────┐
│ Python is generally better for beginners due to its clean, readable syntax  │
│ that resembles natural language. It has fewer confusing concepts...         │
└─────────────────────────────────────────────────────────────────────────────┘
┌─ google/gemini-2.5-pro ─────────────────────────────────────────────────────┐
│ JavaScript has advantages for beginners because it runs everywhere - in     │
│ browsers, servers, and mobile apps. You can see immediate visual results    │
└─────────────────────────────────────────────────────────────────────────────┘
```

Automatic Fallbacks

Configure fallback providers and models for reliability. If an API outage occurs, your requests are cleanly and automatically rerouted.

Configuring Fallbacks

```python
from justllms import JustLLM

client = JustLLM({
    "providers": {
        "openai": {"api_key": "your-key"},
        "anthropic": {"api_key": "your-key"}
    },
    "routing": {
        "fallback_provider": "anthropic",
        "fallback_model": "claude-3-5-sonnet-20241022"
    }
})

# If no model is specified or the primary model fails, the fallback is used seamlessly
response = client.completion.create(
    messages=[{"role": "user", "content": "Hello"}]
)
```

Native Tools Implementation

Out-of-the-box support for server-side Google Search and Python code execution, well suited to building intelligent autonomous agents.

Native Tool Usage

```python
from justllms import GoogleSearch, GoogleCodeExecution

# Server-side Google Search and Python execution, out of the box
response = client.completion.create(
    messages=[{"role": "user", "content": "Latest AI news and calculate 2^10"}],
    tools=[GoogleSearch(), GoogleCodeExecution()],
    provider="google"
)
```