Installation
Get mycontext-ai installed and ready in under a minute.
Requirements
- Python 3.11 or newer (download from python.org)
- A package manager: pip, uv, Poetry, or conda
Install the SDK
# pip
pip install mycontext-ai

# uv
uv add mycontext-ai

# Poetry
poetry add mycontext-ai

# Conda
conda install -c conda-forge mycontext-ai
This installs the core SDK with 16 free cognitive patterns, the full intelligence layer, quality metrics, and all 13 export formats.
Add LLM Execution (Recommended)
To execute contexts against LLMs, install LiteLLM alongside the SDK. LiteLLM provides access to 100+ LLM providers through a single interface:
# pip
pip install mycontext-ai litellm

# uv
uv add mycontext-ai litellm

# Poetry
poetry add mycontext-ai litellm
Provider-Specific Extras
Install the official SDK for your preferred provider:
# OpenAI
pip install "mycontext-ai[openai]"
# Anthropic (Claude)
pip install "mycontext-ai[anthropic]"
# Google (Gemini)
pip install "mycontext-ai[google]"
# All providers at once
pip install "mycontext-ai[all]"
# OpenAI
uv add "mycontext-ai[openai]"
# Anthropic (Claude)
uv add "mycontext-ai[anthropic]"
# Google (Gemini)
uv add "mycontext-ai[google]"
# All providers
uv add "mycontext-ai[all]"
# OpenAI
poetry add "mycontext-ai[openai]"
# Anthropic (Claude)
poetry add "mycontext-ai[anthropic]"
# Google (Gemini)
poetry add "mycontext-ai[google]"
# All providers
poetry add "mycontext-ai[all]"
What each extra includes
| Extra | Package | Use case |
|---|---|---|
| `openai` | `openai>=2.0.0` | Direct OpenAI API usage |
| `anthropic` | `anthropic>=0.79.0` | Direct Anthropic API usage |
| `google` | `google-genai>=1.0.0` | Direct Google Gemini API usage |
| `tokens` | `tiktoken>=0.7.0` | Token counting and optimization |
| `all` | All of the above | Full provider support |
LiteLLM handles routing for all providers; the extras install each provider's official SDK for cases where you need direct API access.
Optional: Structured Output Parsing
Install instructor to enable JSON-mode LLM output with automatic retry on validation failure in the intelligence layer:
pip install instructor
When installed, all intelligence-layer LLM calls (suggest_patterns, generate_context, TemplateIntegratorAgent) use structured function-calling mode (~98% parse success). Without it, the SDK falls back to its Pydantic + regex parser transparently — no code changes needed.
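This "use it if present, fall back if not" behavior is the standard optional-dependency pattern in Python. A minimal sketch of the idea (the function and its fallback logic are illustrative, not the SDK's actual code):

```python
import importlib.util
import json

# Detect the optional dependency without importing it eagerly.
HAS_INSTRUCTOR = importlib.util.find_spec("instructor") is not None

def parse_output(raw: str) -> dict:
    """Parse LLM output, degrading gracefully when instructor is absent."""
    if HAS_INSTRUCTOR:
        # Structured function-calling path would go here (omitted: it
        # depends on the instructor API and a live LLM client).
        pass
    try:
        return json.loads(raw)
    except json.JSONDecodeError:
        return {"raw": raw}  # last-resort wrapper, mirroring a lenient parser
```

Because the check happens at import-detection time rather than call time, no code changes are needed when instructor is installed later.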
Optional: Accurate Token Counting
For assemble_for_model() token-budget assembly and token-aware context trimming, install tiktoken:
pip install "mycontext-ai[tokens]"
# or directly:
pip install tiktoken
Without tiktoken, the SDK falls back to character-based estimation. Install [all] to get everything at once.
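Character-based estimation typically means dividing text length by a fixed chars-per-token ratio. A sketch of the idea (the 4-chars-per-token constant is a common rule of thumb, not necessarily the SDK's exact value):

```python
def estimate_tokens(text: str, chars_per_token: float = 4.0) -> int:
    """Rough token estimate for when tiktoken is unavailable.

    English prose averages roughly 4 characters per token, so this
    over- or under-counts for code, CJK text, and long identifiers.
    """
    return max(1, round(len(text) / chars_per_token))
```

This is why tiktoken is recommended for strict token budgets: a heuristic like this can miss by 20% or more on non-prose input.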
Orchestration Extras
If you're integrating with agent frameworks:
pip install "mycontext-ai[orchestration]"
This includes LangChain, LangGraph, CrewAI, AutoGen, and Semantic Kernel.
Development Install
To contribute to mycontext-ai or develop locally:
git clone https://github.com/SadhiraAI/mycontext.git
cd mycontext
pip install -e ".[dev]"
This installs the SDK in editable mode with pytest, mypy, ruff, black, and other dev tools.
Verify Installation
python -c "import mycontext; print(f'mycontext-ai v{mycontext.__version__} installed successfully')"
Expected output:
mycontext-ai v0.6.0 installed successfully
Configure Your API Key
Set the API key for your LLM provider as an environment variable:
macOS / Linux:

export OPENAI_API_KEY="sk-..."
export ANTHROPIC_API_KEY="sk-ant-..."
export GOOGLE_API_KEY="AI..."

Windows (PowerShell):

$env:OPENAI_API_KEY = "sk-..."
$env:ANTHROPIC_API_KEY = "sk-ant-..."
$env:GOOGLE_API_KEY = "AI..."
Or use a .env file in your project root:
OPENAI_API_KEY=sk-...
ANTHROPIC_API_KEY=sk-ant-...
GOOGLE_API_KEY=AI...
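Loading a .env file is usually handled by a library such as python-dotenv, but the mechanism is simple enough to sketch with the standard library (a minimal stand-in, not the SDK's loader):

```python
import os

def load_env_file(path: str = ".env") -> None:
    """Load KEY=value lines from a .env file into os.environ.

    Minimal stand-in for python-dotenv: skips blank lines and comments,
    and never overwrites variables that are already set.
    """
    try:
        with open(path, encoding="utf-8") as fh:
            for line in fh:
                line = line.strip()
                if not line or line.startswith("#") or "=" not in line:
                    continue
                key, _, value = line.partition("=")
                os.environ.setdefault(key.strip(), value.strip().strip('"'))
    except FileNotFoundError:
        pass  # no .env file is fine; fall back to the real environment
```

Note the setdefault: a real environment variable always wins over the file, which is the behavior most .env loaders default to.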
You only need an API key when executing contexts against an LLM (ctx.execute(), smart_execute(), etc.). Building, exporting, and scoring contexts works entirely offline.
What's Included
| Component | Description |
|---|---|
| Core SDK | Context, Guidance, Directive, Constraints classes |
| 16 Free Patterns | RootCauseAnalyzer, CodeReviewer, StepByStepReasoner, and 13 more |
| Intelligence Layer | transform(), suggest_patterns(), smart_execute(), generate_context() |
| Async Execution | ctx.aexecute() — non-blocking LLM calls via litellm.acompletion |
| Token-Budget Assembly | ctx.assemble_for_model(model, max_tokens) — tiktoken-accurate trimming |
| Validated Output Parsing | Pydantic + instructor structured parsing with automatic retry |
| Quality Metrics | 6-dimension context scoring + 5-dimension output evaluation |
| CAI | Context Amplification Index — proves templates produce better output |
| 13 Export Formats | OpenAI, Anthropic, Gemini, LangChain, YAML, JSON, XML, and more |
| 7 Integrations | LangChain, LlamaIndex, CrewAI, AutoGen, DSPy, Semantic Kernel, Google ADK |
Troubleshooting
ModuleNotFoundError: No module named 'mycontext'
Make sure you installed the PyPI package mycontext-ai; the import name is mycontext, but that is not the package name:
pip install mycontext-ai # ← correct
# NOT: pip install mycontext
Python version errors
mycontext-ai requires Python 3.11+. Check your version:
python --version
If you have multiple Python versions, try:
python3.11 -m pip install mycontext-ai
# or
python3.12 -m pip install mycontext-ai
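When the wrong interpreter keeps sneaking in, a script can check the version itself before importing anything. A small helper for that (generic pattern, not part of the SDK):

```python
import sys

def check_python(min_version: tuple[int, int] = (3, 11)) -> bool:
    """Return True when the running interpreter meets the minimum version."""
    return sys.version_info[:2] >= min_version

# Typical use at the top of an entry point, to fail fast with a clear
# message instead of a cryptic SyntaxError or ImportError later:
#   if not check_python():
#       raise SystemExit("mycontext-ai requires Python 3.11+")
```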
LiteLLM import errors
If you see errors related to LiteLLM, install it explicitly:
pip install "litellm>=1.55.0"
Next: Quick Start →