Integrations

mycontext-ai is framework-agnostic. Every Context object can export itself into the native format of any major AI framework — no adapters, no glue code, no fighting with APIs.

How Integration Works

The Context object is the universal unit. Once you build one (from scratch or via a cognitive pattern), it can export itself to any framework:

from mycontext.templates.free.reasoning import RootCauseAnalyzer

# Build once
ctx = RootCauseAnalyzer().build_context(
    problem="API latency tripled after deployment",
    depth="comprehensive",
)

# Export anywhere
messages = ctx.to_messages() # OpenAI / universal messages
lc_format = ctx.to_langchain() # LangChain
li_format = ctx.to_llamaindex() # LlamaIndex
crew_cfg = ctx.to_crewai() # CrewAI Agent + Task
autogen_cfg = ctx.to_autogen() # AutoGen AssistantAgent

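Because to_messages() yields the standard chat-completions shape, the export drops straight into any OpenAI-compatible request body. A minimal sketch (the literal messages list, model name, and helper function below are illustrative stand-ins, not output from or part of the library):

```python
# Illustrative: this list mirrors the shape ctx.to_messages() returns.
messages = [
    {"role": "system", "content": "You are a root-cause analyst."},
    {"role": "user", "content": "API latency tripled after deployment."},
]

def build_chat_payload(messages, model="gpt-4o-mini", temperature=0.7):
    """Wrap a universal messages list in an OpenAI-style request body."""
    return {"model": model, "messages": messages, "temperature": temperature}

payload = build_chat_payload(messages)
# payload can be POSTed to any OpenAI-compatible /chat/completions endpoint
```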
Supported Frameworks

Framework          Install                        Helper class            Native export
LangChain          pip install langchain-core     LangChainHelper         ctx.to_langchain()
LlamaIndex         pip install llama-index        LlamaIndexHelper        ctx.to_llamaindex()
CrewAI             pip install crewai             CrewAIHelper            ctx.to_crewai()
AutoGen            pip install pyautogen          AutoGenHelper           ctx.to_autogen()
DSPy               pip install dspy-ai            DSPyHelper              ctx.assemble()
Semantic Kernel    pip install semantic-kernel    SemanticKernelHelper    ctx.assemble()
Google ADK         pip install google-adk         GoogleADKHelper         ctx.assemble()
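Each helper requires its framework's package to be importable. A generic pre-flight check (this helper function is our sketch, not part of mycontext-ai):

```python
import importlib.util

def framework_available(module_name: str) -> bool:
    """Return True if the given framework package can be imported."""
    return importlib.util.find_spec(module_name) is not None

# e.g. framework_available("langchain_core"), framework_available("crewai")
```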

Two Integration Paths

Path 1: Native Context export methods

Every Context has built-in export methods:

from mycontext import Context
from mycontext.foundation import Guidance, Directive

ctx = Context(
    guidance=Guidance(role="Senior data analyst"),
    directive=Directive(content="Identify anomalies in the sales data"),
)

# Universal messages format
messages = ctx.to_messages()
# → [{"role": "system", "content": "..."}]

# LangChain
lc = ctx.to_langchain()
# → {"system_message": "...", "context": {...}}

# CrewAI
crew = ctx.to_crewai()
# → {"role": "...", "goal": "...", "backstory": "...", "expected_output": "..."}

# AutoGen
ag = ctx.to_autogen()
# → {"system_message": "...", "description": "...", ...}

# LlamaIndex
li = ctx.to_llamaindex()
# → {"template": "...", "system_prompt": "...", "query_instruction": "..."}

# Provider-specific
openai_cfg = ctx.to_openai() # {"messages": [...], "temperature": 0.7}
anthropic_cfg = ctx.to_anthropic() # {"system": "...", "messages": [...]}
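Note how the Anthropic shape differs from the universal one: the system prompt is a top-level field rather than a message. to_anthropic() handles this for you; the implied transformation looks roughly like this (a sketch over the universal messages shape, not the library's code):

```python
def messages_to_anthropic(messages):
    """Hoist system messages into Anthropic's top-level 'system' field."""
    system_parts = [m["content"] for m in messages if m["role"] == "system"]
    chat = [m for m in messages if m["role"] != "system"]
    return {"system": "\n\n".join(system_parts), "messages": chat}

cfg = messages_to_anthropic([
    {"role": "system", "content": "Senior data analyst"},
    {"role": "user", "content": "Identify anomalies in the sales data"},
])
```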

Path 2: Helper classes

The helper classes add framework-specific functionality on top of the export methods:

from mycontext.integrations import (
    LangChainHelper,
    LlamaIndexHelper,
    CrewAIHelper,
    AutoGenHelper,
    DSPyHelper,
    SemanticKernelHelper,
    GoogleADKHelper,
)

auto_integrate() — One-liner

from mycontext.integrations import auto_integrate

ctx = RootCauseAnalyzer().build_context(problem="API latency spiked")

# Export to the named framework in one call
messages = auto_integrate(ctx, "langchain")
agent = auto_integrate(ctx, "crewai", name="analyst", tools=my_tools)
prompt = auto_integrate(ctx, "dspy")
adk_agent = auto_integrate(ctx, "google_adk", name="diagnostician")

Supported framework strings: "langchain", "llamaindex", "crewai", "autogen", "dspy", "semantic_kernel", "google_adk"
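Conceptually, auto_integrate is a dispatch table over the export methods, keyed by the framework string. A hypothetical sketch of that dispatch (illustrative only, not the library's actual implementation):

```python
def dispatch_export(ctx, framework):
    """Route a context to the matching export method by framework name (sketch)."""
    exporters = {
        "langchain": lambda c: c.to_langchain(),
        "llamaindex": lambda c: c.to_llamaindex(),
        "crewai": lambda c: c.to_crewai(),
        "autogen": lambda c: c.to_autogen(),
        "dspy": lambda c: c.assemble(),
        "semantic_kernel": lambda c: c.assemble(),
        "google_adk": lambda c: c.assemble(),
    }
    if framework not in exporters:
        raise ValueError(f"Unsupported framework: {framework!r}")
    return exporters[framework](ctx)
```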

The Shared Pattern

Every integration follows the same pattern:

1. Build a Context (from scratch or via a cognitive pattern)
2. Export it to the framework's native format
3. Use the framework normally — mycontext just powered the prompt

This means your framework code doesn't change. You're just replacing ad-hoc system prompts with research-backed cognitive frameworks.
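The pattern in miniature, with a stub standing in for a real Context and a lambda standing in for your client call (everything here is illustrative):

```python
class StubContext:
    """Stands in for a mycontext Context in this sketch."""
    def to_messages(self):
        return [{"role": "system", "content": "You are a careful analyst."}]

def run(ctx, send):
    """Steps 2-3: export the context, then hand it to whatever client you use."""
    return send(ctx.to_messages())

result = run(StubContext(), lambda msgs: f"sent {len(msgs)} message(s)")
# → "sent 1 message(s)"
```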

All Context Export Methods

Method                 Returns       Use with
ctx.to_messages()      list[dict]    Any OpenAI-compatible API
ctx.to_openai()        dict          OpenAI Python client
ctx.to_anthropic()     dict          Anthropic Python client
ctx.to_langchain()     dict          LangChain / LangGraph
ctx.to_llamaindex()    dict          LlamaIndex query/chat engines
ctx.to_crewai()        dict          CrewAI Agent + Task
ctx.to_autogen()       dict          AutoGen AssistantAgent
ctx.assemble()         str           Any framework accepting a string
ctx.to_markdown()      str           Documentation, debugging
ctx.to_json()          str           Storage, APIs
ctx.to_yaml()          str           Config files
ctx.to_xml()           str           XML-based systems

Install Integration Dependencies

# Install only what you need
pip install mycontext-ai # Core only

pip install mycontext-ai langchain-core # + LangChain
pip install mycontext-ai llama-index # + LlamaIndex
pip install mycontext-ai crewai # + CrewAI
pip install mycontext-ai pyautogen # + AutoGen
pip install mycontext-ai dspy-ai # + DSPy
pip install mycontext-ai semantic-kernel # + Semantic Kernel
pip install mycontext-ai google-adk # + Google ADK

# Or all at once
pip install "mycontext-ai[all]"