Claude Code + 15 Repos: How a Non-Engineer Answers Every Customer Question | Al Chen

How I AI
Apr 6, 2026

Why It Matters

By turning the live codebase into an AI‑accessible knowledge hub, companies can deliver faster, more accurate support while cutting engineering overhead, giving them a competitive edge in enterprise SaaS.

Key Takeaways

  • Pull all repos into VS Code for unified codebase queries.
  • Use Claude Code to generate scripts and answer customer questions.
  • Combine code, Confluence, and custom commands for tailored deployment guidance.
  • Automate daily repo updates with a short AI‑written pull script.
  • Treat the AI-queried live codebase as the source of truth, reducing reliance on static documentation.

Summary

Al Chen, a field engineer at Galileo, demonstrates how he leverages Claude Code and a unified VS Code workspace containing fifteen micro-service repositories to answer highly technical customer queries that standard documentation cannot resolve. By loading every repo into a single IDE and running Claude Code against it, he can ask the AI to traverse the entire codebase, retrieve up-to-date implementations, and synthesize step-by-step answers for enterprise developers.

The workflow hinges on a few practical tricks: an AI-generated 16-line script pulls the latest main branch across all repositories each morning, keeping the local view current without manual git commands. Claude Code also pulls contextual data from Confluence, and a custom “DPL” command merges deployment guides with customer-specific quirks stored in a shared page. This hybrid of code, documentation, and AI-driven prompts yields precise, version-accurate responses, eliminating the need to ping engineering channels for clarification.

During the demo, Chen asks Claude to generate a deployment checklist for a client using Google Secrets Manager, showing how the model first consults Confluence, then falls back to the relevant repo files if needed. He also highlights the “coin‑operated Claude” concept—rewarding the model with quota for correct answers—to continuously improve answer quality. The approach turns what was once a chaotic mix of docs, Slack threads, and stale pages into a single, searchable knowledge source.

For SaaS firms, this method reduces support latency, frees engineering bandwidth, and establishes the live codebase as the definitive source of truth. It also encourages teams to be less obsessive about where information lives, trusting AI to stitch together context across repositories, wikis, and chat tools, thereby scaling technical support without expanding headcount.

Original Description

Al Chen is a field engineer at Galileo, an observability platform for AI applications, where he works on the front lines with enterprise customers asking highly technical questions. Despite never having held an engineering role, Al has built a system using Claude Code to query Galileo’s 15 separate repositories, combine that with Confluence documentation and customer-specific quirks, and deliver hyper-personalized answers that would otherwise require constant engineering support.
What you’ll learn:
1. How to use Claude Code to query multiple repositories simultaneously for customer support
2. Why code is often a better source of truth than documentation
3. How to combine repository context with Confluence and Slack using MCPs
4. The “customer quirks” system that creates hyper-personalized deployment guides
5. How to build virtuous loops that turn single customer questions into scalable knowledge
6. Why information organization matters less in the AI era
7. A simple 16-line script (written by Claude Code) that pulls the latest main branch across all your repositories to keep your context current
8. How to reduce engineering interruptions to near-zero by empowering customer-facing teams to query the codebase directly
Brought to you by:
Orkes—The enterprise platform for reliable applications and agentic workflows: https://www.orkes.io/
Tines—Start building intelligent workflows today: https://tines.com/howiai
In this episode, we cover:
(00:00) Introduction to Al Chen
(02:50) The problem: documentation wasn’t enough
(04:23) Pulling 15 repos into VS Code
(06:03) How Claude Code queries the entire codebase
(08:00) Why current code beats documentation
(08:31) The pull script that keeps everything updated
(09:54) Opening projects at the multi-repo level
(11:40) Live demo: answering deployment questions
(13:25) The customer quirks system
(15:00) Living in chaos: why organization matters less now
(17:03) Competing on customer experience, not just product
(18:20) Should customers be able to query the code directly?
(20:05) Where humans still add value
(25:46) Using AI for reactive Slack support
(29:16) The “and then” workflow discovery
(32:07) Scaling processes across the team
(34:07) Lightning round and final thoughts
Detailed workflow walkthroughs from this episode:
• How Al Chen Uses Claude Code and 15 Repos to Answer Any Customer Question: https://www.chatprd.ai/how-i-ai/claude-code-and-repos-to-answer-any-customer-question
• How to Use AI to Answer Customer Questions from Your Entire Codebase: https://www.chatprd.ai/how-i-ai/workflows/how-to-use-ai-to-answer-customer-questions-from-your-entire-codebase
Tools referenced:
• Claude Code: https://claude.ai/code
Other references:
• Kubernetes: https://kubernetes.io/
• Stack Overflow: https://stackoverflow.com/
Where to find Al Chen:
Where to find Claire Vo:
_Production and marketing by https://penname.co/._
_For inquiries about sponsoring the podcast, email jordan@penname.co._
