madakit
Composable AI client library — one async interface for cloud, local, and native providers, layered with production middleware.
Provider Agnostic
21 providers across cloud (OpenAI, Anthropic, Gemini, DeepSeek), local (Ollama, vLLM, LM Studio), and native (Transformers, llama.cpp).
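Provider-agnostic here means application code targets one async interface while backends are swapped freely. A minimal stdlib-only sketch of that pattern — all names (`Provider`, `complete`, the stand-in providers) are hypothetical illustrations, not madakit's actual API:

```python
import asyncio
from typing import Protocol


class Provider(Protocol):
    """One async interface every backend satisfies (illustrative)."""
    async def complete(self, prompt: str) -> str: ...


class EchoProvider:
    """Stand-in 'local' provider used purely for illustration."""
    async def complete(self, prompt: str) -> str:
        return f"echo: {prompt}"


class UppercaseProvider:
    """Another stand-in backend; callers cannot tell them apart."""
    async def complete(self, prompt: str) -> str:
        return prompt.upper()


async def ask(provider: Provider, prompt: str) -> str:
    # Application code depends only on the protocol, never the backend.
    return await provider.complete(prompt)


print(asyncio.run(ask(EchoProvider(), "hi")))       # echo: hi
print(asyncio.run(ask(UppercaseProvider(), "hi")))  # HI
```

Because the protocol is structural, swapping a cloud provider for a local one is a one-line change at the call site.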
Composable Middleware
16 production-grade components: retry, circuit breaker, caching, fallbacks, rate limiting, load balancing, A/B testing, and more.
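Composable middleware of this kind typically follows a wrap-the-callable pattern: each layer takes an async callable and returns a new one with the same signature, so layers stack in any order. A stdlib-only sketch of two such layers, retry and caching — the function names and composition order are assumptions for illustration, not madakit's actual API:

```python
import asyncio


def with_retry(call, attempts=3, base_delay=0.01):
    """Retry the wrapped call with exponential backoff on any exception."""
    async def wrapped(prompt):
        for i in range(attempts):
            try:
                return await call(prompt)
            except Exception:
                if i == attempts - 1:
                    raise
                await asyncio.sleep(base_delay * 2 ** i)
    return wrapped


def with_cache(call, cache=None):
    """Memoize responses by prompt so repeated requests skip the provider."""
    cache = {} if cache is None else cache
    async def wrapped(prompt):
        if prompt not in cache:
            cache[prompt] = await call(prompt)
        return cache[prompt]
    return wrapped


# Stand-in "provider" that fails once, then succeeds.
_calls = {"n": 0}
async def flaky_provider(prompt):
    _calls["n"] += 1
    if _calls["n"] == 1:
        raise ConnectionError("transient failure")
    return f"echo: {prompt}"


# Compose: cache outermost, retry closest to the provider.
client = with_cache(with_retry(flaky_provider))


async def main():
    first = await client("hello")   # one retry, then success; result cached
    second = await client("hello")  # served from cache, no provider call
    return first, second, _calls["n"]

print(asyncio.run(main()))  # ('echo: hello', 'echo: hello', 2)
```

Ordering matters: with retry inside the cache, only successful responses are memoized, while a cache inside the retry layer would be consulted on every attempt.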
Zero Core Dependencies
Core uses only the Python stdlib. Add httpx for cloud providers, transformers for native inference, and prometheus-client for metrics.
Production Hardened
Async-first with streaming. Full type annotations (mypy strict). Framework integrations for LangChain, LlamaIndex, FastAPI, Flask.
Installation
# Core library (zero dependencies)
pip install madakit
# With cloud providers
pip install madakit[cloud]
# Everything
pip install madakit[all]
MIT License