Key takeaways
- CLI tool that converts codebases into formatted LLM prompts with source tree visualization, prompt templating, and token counting. Rust core for speed
- 7.2k stars, MIT license. Fast Python bindings for integration into RAG pipelines and AI agent automation scripts
- Template system lets you customize prompt format for different use cases — code review, documentation, refactoring, etc.
- Outputs to clipboard or stdout. Ideal for AI agents, automation scripts, and deep integration into existing workflows
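To make the template idea concrete, here is a sketch of a Handlebars-style template for a code-review prompt. The variable names (`source_tree`, `files`, `path`, `code`) are illustrative placeholders, not necessarily code2prompt's exact built-in names; check the project's template documentation for the real ones.

```handlebars
Project source tree:
{{source_tree}}

{{#each files}}
--- {{path}} ---
{{code}}
{{/each}}

Please review the code above for correctness, bugs, and style issues.
```

Swapping the final instruction line is enough to repurpose the same packed context for documentation or refactoring prompts.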
FAQ
What is code2prompt?
A CLI tool (Rust) that converts your codebase into a single formatted LLM prompt with source tree, prompt templates, and token counting. Also provides fast Python bindings for programmatic use.
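A typical invocation looks roughly like the following. Flag names here follow the project's documented options but can differ between versions, so treat them as illustrative rather than definitive:

```shell
# Pack a repository into a single prompt (copied to the clipboard by default)
code2prompt ./my-project

# Apply a custom Handlebars template instead of the default format
code2prompt ./my-project --template review.hbs

# Write the generated prompt to a file rather than the clipboard
code2prompt ./my-project --output-file prompt.txt
```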
How does it compare to Repomix?
Repomix is written in TypeScript, with XML output optimized for Claude and Tree-sitter-based compression. code2prompt is written in Rust (faster) and offers a template system plus Python bindings. Repomix has broader adoption (22k vs 7k stars).
Overview
code2prompt is a Rust-powered CLI tool that converts codebases into formatted LLM prompts. It generates a source tree, applies customizable prompt templates, counts tokens, and outputs to clipboard or stdout — ideal for feeding codebases into any LLM.
With 7.2k stars, it is the second most popular context-packing tool after Repomix. The Rust core provides speed, and Python bindings enable integration into RAG pipelines and automation scripts.
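The bindings expose the packing logic to Python, but the core idea is easy to sketch even without them. The following is a minimal, hand-rolled approximation of what a context packer does, not code2prompt's actual API: walk a directory, render a source tree, concatenate file contents, and estimate tokens.

```python
import os

def build_prompt(root, extensions=(".py", ".rs")):
    """Pack a codebase into one LLM prompt string: source tree + file contents.

    A simplified stand-in for tools like code2prompt; not their real API.
    Returns (prompt, approx_token_count).
    """
    tree_lines, file_blocks = [], []
    for dirpath, dirnames, filenames in os.walk(root):
        dirnames.sort()
        rel_dir = os.path.relpath(dirpath, root)
        depth = 0 if rel_dir == "." else rel_dir.count(os.sep) + 1
        indent = "  " * depth
        for name in sorted(filenames):
            if not name.endswith(extensions):
                continue
            rel = os.path.relpath(os.path.join(dirpath, name), root)
            # Tree shows files only; directory nesting is implied by indentation
            tree_lines.append(f"{indent}{name}")
            with open(os.path.join(dirpath, name), encoding="utf-8") as fh:
                file_blocks.append(f"--- {rel} ---\n{fh.read()}")
    prompt = ("Source tree:\n" + "\n".join(tree_lines)
              + "\n\n" + "\n\n".join(file_blocks))
    # Crude estimate: ~4 characters per token is a common rule of thumb
    return prompt, len(prompt) // 4
```

In a real RAG pipeline you would feed the returned prompt (or chunks of it) into your retrieval or generation step; code2prompt's bindings do the walking, templating, and token counting for you in Rust.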
Key stats: 7,224 stars, MIT license, Rust with Python bindings. Created March 2024.
Competitive Position
Strengths: Fast (Rust). Template system for customizable output formats. Python bindings for pipeline integration. MIT license.
Weaknesses: Smaller community than Repomix. No MCP server mode. No Tree-sitter compression. Output format is less optimized for Claude.
Research by Ry Walker Research