We'd like to introduce our codebase chat, which retrieves from a wiki of your repo instead of relying on traditional vector or keyword RAG.
We have been pleasantly surprised by the quality of the responses. We think building a natural-language knowledge base for LLMs to retrieve from is the future of RAG. It works like this: we first write a Wikipedia-style article on your codebase, complete with diagrams and citations to your code, and the chat then answers questions using that article.
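To make the idea concrete, here is a minimal, hypothetical sketch of wiki-based retrieval: rather than embedding raw code chunks, you pick the wiki section that best matches the question and hand its prose (with its code citations) to the LLM. The section titles, scoring, and prompt shape below are illustrative assumptions, not mutable.ai's actual implementation.

```python
# Toy wiki for a repo: section title -> prose with citations to code.
# (Contents are invented for illustration.)
WIKI_SECTIONS = {
    "Authentication flow": "Login requests are validated in auth/session.py "
                           "before a token is issued by auth/tokens.py.",
    "Build system": "The project is built with Make; the main targets live "
                    "in the top-level Makefile.",
}

def retrieve_section(question: str) -> tuple[str, str]:
    """Pick the wiki section with the most word overlap with the question.

    A real system would use something smarter (an LLM or a ranker);
    bag-of-words overlap keeps the sketch self-contained.
    """
    q_words = set(question.lower().split())

    def score(item: tuple[str, str]) -> int:
        title, body = item
        return len(q_words & set((title + " " + body).lower().split()))

    return max(WIKI_SECTIONS.items(), key=score)

def build_prompt(question: str) -> str:
    """Assemble the retrieved wiki prose and the question into one prompt."""
    title, body = retrieve_section(question)
    return f"Context ({title}):\n{body}\n\nQuestion: {question}"

print(build_prompt("How does login work?"))
```

The key difference from vector RAG is that the retrieved context is curated prose about the code, not raw code chunks, so the LLM gets an explanation plus pointers into the source.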
We’ve run it on some of the most popular repos. If you want us to run it on an open-source repo, please comment below. You can also run it on private repos, where access permissions follow those of the repository on GitHub. The wikis auto-update via a GitHub bot, which even shows you previews of the new wikis in a diff view.
Please check it out and let us know your thoughts.
Thank you! Omar
PS Here are some examples:
https://wiki.mutable.ai/hashicorp/terraform
https://wiki.mutable.ai/ggerganov/llama.cpp
https://wiki.mutable.ai/ethereum/go-ethereum
https://wiki.mutable.ai/NVIDIA/TensorRT
https://wiki.mutable.ai/langchain-ai/langchain
https://wiki.mutable.ai/ollama/ollama
https://wiki.mutable.ai/tensorflow/models
https://wiki.mutable.ai/grafana/grafana
https://wiki.mutable.ai/OpenAutoCoder/Agentless
https://wiki.mutable.ai/unslothai/unsloth
https://wiki.mutable.ai/Dao-AILab/flash-attention
https://wiki.mutable.ai/vercel/next.js
https://wiki.mutable.ai/microsoft/vscode
https://wiki.mutable.ai/wasm3/wasm3
https://wiki.mutable.ai/deepfakes/faceswap