## Key takeaways
- Research demonstrator from Microsoft — explicitly not production-ready, no safety systems built in yet
- Modular bundle architecture lets you compose tools, agents, behaviors, and providers like LEGO bricks
- Multi-provider: Anthropic Claude (primary), OpenAI, Azure OpenAI, and Ollama for local inference
- Ships 14 specialized agents in the foundation bundle including zen-architect, bug-hunter, and explorer
## FAQ

### What is Microsoft Amplifier?

Amplifier is Microsoft's open-source research demonstrator for modular AI-assisted development. It provides a CLI-first interface with a composable architecture underneath — bundles of tools, agents, and behaviors that can be mixed and matched.

### Is Amplifier production-ready?

No. Microsoft explicitly labels it a research demonstrator in early development. Safety systems haven't been built yet, APIs may change, and breaking changes are expected. It's open exploration, not a product release.

### What AI providers does Amplifier support?

Anthropic Claude (recommended, most tested), OpenAI (GPT-5.2 family), Azure OpenAI (with managed identity support), and Ollama for local models like llama3 and codellama.

### What is the bundle system?

Bundles are composable configuration packages that define providers, tools, agents, and behaviors. The default "foundation" bundle includes filesystem, bash, web, and search tools plus 14 specialized agents. You can add external bundles from git repos or create your own.
## What Is Microsoft Amplifier?
Amplifier is Microsoft's open-source research project exploring modular, extensible AI-assisted development. Currently a CLI tool, it's designed as a platform where the CLI is just one interface — with web, mobile, voice, and collaborative AI-to-AI interfaces on the roadmap.
The project is explicit about what it is: a research demonstrator, not a product. Microsoft is performing "active exploration in the open" and hasn't built safety systems yet. This honesty is refreshing — and important context for anyone evaluating it.
Previously built on top of Claude Code, Amplifier has since been rewritten with its own modular kernel. The old Claude Code version lives on the `amplifier-claude` branch.
## Architecture: Bundles, Modules, and Agents
Amplifier's core idea is composability through bundles. A bundle is a configuration package that defines:
- Providers — which AI model to use (Anthropic, OpenAI, Azure, Ollama)
- Tools — filesystem, bash, web, search, task delegation
- Agents — specialized AI personas for focused tasks
- Behaviors — logging, redaction, streaming UI, todo tracking
The default foundation bundle ships with 14 specialized agents:
| Agent | Purpose |
|---|---|
| zen-architect | System design with "ruthless simplicity" |
| bug-hunter | Systematic debugging |
| web-research | Web research and content fetching |
| modular-builder | Code implementation |
| explorer | Breadth-first code/doc exploration with citations |
| git-ops | Git operations |
| + 8 more | Various specialized tasks |
Bundles can be installed from git repos, switched on the fly, or scoped per-project vs. per-user. The `amplifier bundle add` command pulls external bundles, and `amplifier bundle use` switches context.
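As a concrete sketch, the bundle workflow above looks like this. Only the `amplifier bundle add` and `amplifier bundle use` subcommands are named in the project's description; the git URL and bundle name here are hypothetical placeholders:

```shell
# Pull an external bundle from a git repo (URL is a made-up example)
amplifier bundle add https://github.com/example/my-bundle

# Switch the active bundle context (bundle name is a placeholder)
amplifier bundle use my-bundle
```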
### The Kernel

The architecture splits into three layers:

- `amplifier-core` — Ultra-thin kernel (~2,600 lines) providing module protocols, session lifecycle, and hooks
- `amplifier-foundation` — Bundle composition library plus the default foundation bundle
- `amplifier-app-cli` — The reference CLI implementation
This separation means the platform can support entirely different frontends while sharing the same module system.
## Multi-Provider Support

Amplifier supports four AI providers:

- Anthropic Claude — Primary, most tested (Sonnet 4.5, Opus 4.6, Haiku 4.5)
- OpenAI — GPT-5.2, GPT-5.2-Pro, GPT-5.1-Codex
- Azure OpenAI — Enterprise path with managed identity and `az login` support
- Ollama — Local inference with llama3, codellama, mistral

Switching is a single command: `amplifier provider use openai --model gpt-5.2`. Provider config can be scoped to local (just you) or project (team-wide).

The Azure OpenAI support is notable — it uses `DefaultAzureCredential`, which means it works with `az login` locally and managed identity in production. This is the enterprise-friendly path that most open-source agents skip.
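In shell terms, the switching and Azure paths described above look roughly like this. The `openai` command line is the one quoted in the project's docs; the `azure` provider name is an assumption that it follows the same `provider use` pattern:

```shell
# Switch providers and models in one command (as given in the docs)
amplifier provider use openai --model gpt-5.2

# Azure OpenAI path: authenticate with the Azure CLI first; locally,
# DefaultAzureCredential picks up the az login session, and in
# production it falls back to managed identity automatically.
az login

# Assumed invocation: the Azure provider name is not spelled out above
amplifier provider use azure
```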
## Session Persistence

Sessions are automatically saved and project-scoped. Working in `/home/user/myapp` shows only that project's sessions. `amplifier continue` picks up the last session; `amplifier session resume <id>` resumes a specific one.

This project-scoping is a nice design choice — it mirrors how developers actually context-switch between codebases.
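The two resumption paths mentioned above, as a quick sketch (`<id>` is a placeholder for an actual session identifier):

```shell
# Resume the most recent session for the current project directory
amplifier continue

# Resume a specific saved session by its identifier
amplifier session resume <id>
```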
## Self-Improvement Patterns
One of Amplifier's more interesting claims: "Use Amplifier to create new Amplifier modules." The system is designed to be self-extending — agents can build new modules, agents, and bundles. This is the "AI builds AI" part of their vision.
The modular architecture makes this more plausible than it sounds. Since modules follow defined protocols and bundles are just configuration packages, generating new ones is a structured problem rather than open-ended code generation.
## Why It Matters

Microsoft releasing a research-stage agent framework in the open is significant for a few reasons:

1. Honest positioning — Calling it a research demonstrator sets correct expectations. Most tools in this space overpromise.
2. Bundle composability — The idea that agent configurations should be packaged, shared, and composed like libraries is compelling. Most agent frameworks treat configuration as monolithic.
3. Multi-provider from day one — Supporting Anthropic, OpenAI, Azure, and local models means teams aren't locked in. The Azure path is particularly important for enterprise adoption.
4. Platform, not just a CLI — The kernel/foundation/app separation suggests Microsoft is thinking about this as infrastructure, not just a developer tool.
## Current Limitations
- Not accepting external contributions yet (CLA required, PRs not open)
- Safety systems not built — they're explicit about this
- Anthropic-first — other providers have "rough edges"
- Documentation incomplete — moving fast, docs are catching up
- ~3k GitHub stars — early community, not yet widely adopted
## The Tembo Angle
Amplifier's bundle system is philosophically aligned with agent orchestration — composable, modular, multi-provider. The key difference: Amplifier bundles are static configuration packages, while true orchestration involves dynamic routing, cost optimization, and runtime decisions about which agent handles what. Amplifier is a single-agent system with delegation; orchestration platforms manage fleets.
Worth watching as Microsoft iterates. The modular kernel could become infrastructure that other tools build on — or it could remain a research artifact. The "not currently accepting contributions" stance suggests Microsoft isn't sure yet either.