pro@coder AI Pack Demo - Generate Doc

AI Pack pro@coder: Summarizing a Codebase with Production-Grade AI Workflows

In this demo, we showcase how to generate structured documentation for a codebase using the pro@coder AI Pack, following a production-first approach.

Step-by-Step Setup

Install AI Pack CLI
Make sure you have Rust installed.

# Install AIPACK (aip command line)
cargo install aipack

# Install pro@coder
aip install pro@coder

This will install the pro@coder pack at ~/.aipack-base/pack/installed/pro/code

🔑 Check Your API Keys

Ensure your environment has the necessary keys:

aip check-keys
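Before the check, the relevant provider keys need to be in your environment. As a sketch, the variable names below follow the providers' common conventions (only the keys for the models you actually use are required; the values are placeholders):

```shell
# Export provider API keys so aip can find them (placeholder values shown).
export OPENAI_API_KEY="sk-..."
export GEMINI_API_KEY="<your-gemini-key>"
export ANTHROPIC_API_KEY="<your-anthropic-key>"

# Quick sanity check that the variables are visible to child processes
printenv GEMINI_API_KEY >/dev/null && echo "GEMINI_API_KEY is set"
```

Add these to your shell profile if you want them available in every session.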

🚀 Run pro@coder

aip run pro@coder

This launches the AI agent with a customizable prompt system. By pro@coder convention, prompts live in:

.aipack/.prompt/pro@coder/coder-prompt.md

🛠️ Configure the Prompt

  • The prompt supports parameters via a toml code block.

  • Set the model (e.g., gemini-2.0-flash for fast/free or gemini-2.5-pro for high quality).

  • Write Mode:

    • Set to false to avoid file generation.

    • Set to true to enable the agent to write output files.
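Putting these options together, the toml block at the top of coder-prompt.md might look like the following sketch (the model value is just an example; use any model your provider keys support):

```toml
# Parameter block inside .aipack/.prompt/pro@coder/coder-prompt.md
model = "gemini-2.5-pro"

# Start with write = false to preview the agent's output without touching
# files; flip it to true when you are ready to generate them.
write = false
```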

📂 Step 1: Generate Per-File Summaries

  1. Set Working Globs (for parallel file processing):

...
# context_globs = ...

working_globs = ["web/content/**/*"]
working_concurrency = true
input_concurrency = 6
write = true
  2. Prompt Example:

Can you summarize the working files? Save each file's summary in .doc/summary-<normalized-path>.md
  3. Run the prompt (e.g., using R inside the pro@coder interface).

Result: Summaries for each source file saved in .doc/.

📘 Step 2: Create a Consolidated Summary

  1. Switch to Context Globs:

...

context_globs = [".doc/summary-*.md"]

# working_globs       = ["web-content/**/*.*"]
# working_concurrency = true
# input_concurrency   = 6
  2. Prompt Example:

Can you summarize your context files and save the result in doc/summary.md?
  3. Run again with write = true.

Result: A clean, full summary generated at doc/summary.md.

✨ Highlights

  • Tiny pack: pro@coder is only ~9 KB.

  • Offline & Secure: Works with local keys, local models, or remote models.

  • Highly concurrent: processes multiple files at once (6 in this demo, via input_concurrency).

  • Any AI Models/Providers: Gemini, OpenAI, Anthropic, Ollama, Groq, xAI

  • Production Practices: Modular, guided prompts and output control.

🧠 Pro Tips

  • Use working globs for parallel processing.

  • Use context globs to give the agent a holistic context (working and context globs can be combined).

  • Add the `.*` temporary output folders (e.g., `.doc/`) to your `.gitignore`.
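As a sketch of combining the two glob tips above, a single run can supply both sets at once (paths and values reuse the ones from this demo):

```toml
# Per-file work items, processed in parallel
working_globs       = ["web/content/**/*"]
working_concurrency = true
input_concurrency   = 6

# Holistic context made available alongside every input
context_globs = [".doc/summary-*.md"]

write = true
```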

That’s it! With AI Pack pro@coder, you can turn any codebase into clean, structured documentation in no time, and much more.

Until next time—happy coding!
