An Emacs interface for AI-assisted software development. The purpose is to provide a uniform interface and experience for different AI backends. It is also designed to integrate seamlessly with AI-driven agile development workflows.
- It currently supports multiple AI coding CLIs (see the supported backends list below).
- I switch between different CLI-based AI tools in Emacs: Claude Code / Gemini CLI / Aider / OpenAI Codex. If you also use different AI tools inside Emacs but want to keep the same user interface and experience, this package is for you.
- Many features and tools are ported from aider.el. If you like the features in aider.el but wish to switch to a modern AI coding CLI, this package is also for you.
- Screenshot
Enable installation of packages from MELPA by adding an entry to `package-archives` after `(require 'package)` and before the call to `package-initialize` in your init.el or .emacs file:
```elisp
(require 'package)
(add-to-list 'package-archives '("melpa" . "https://melpa.org/packages/") t)
(package-initialize)
```

- Use `M-x package-refresh-contents` or `M-x package-list-packages` to ensure that Emacs has fetched the MELPA package list
- Use `M-x package-install` to install the `ai-code` package
- Import and configure `ai-code` in your init.el or .emacs file:
```elisp
(use-package ai-code
  :config
  ;; Use Codex as the backend; other options are 'gemini, 'github-copilot-cli,
  ;; 'opencode, 'grok, 'claude-code-ide, 'claude-code, 'cursor, 'kiro
  (ai-code-set-backend 'codex)
  ;; Enable global keybinding for the main menu
  (global-set-key (kbd "C-c a") #'ai-code-menu)
  ;; Optional: use eat if you prefer; the default is vterm.
  ;; This applies to OpenAI Codex, GitHub Copilot CLI, Opencode, Grok, and
  ;; Cursor CLI; for claude-code-ide.el, check that package's own configuration.
  ;; (setq ai-code-backends-infra-terminal-backend 'eat)
  ;; Optional: turn on auto-revert so AI code changes appear in buffers automatically
  (global-auto-revert-mode 1)
  (setq auto-revert-interval 1) ;; 1 second for faster updates
  ;; Optional: set up Magit integration for AI commands in Magit popups
  (with-eval-after-load 'magit
    (ai-code-magit-setup-transients)))
```

Or, with straight.el:

```elisp
(use-package ai-code
  :straight (:host github :repo "tninja/ai-code-interface.el")
  :config
  ;; Same configuration as above
  (ai-code-set-backend 'codex)
  (global-set-key (kbd "C-c a") #'ai-code-menu)
  (global-auto-revert-mode 1)
  (setq auto-revert-interval 1)
  (with-eval-after-load 'magit
    (ai-code-magit-setup-transients)))
```

- Emacs 28.1 or later
- `org`: Org-mode support
- `magit`: Git integration
- `transient`: For the menu system
- `vterm` (default) or `eat`: a terminal emulator package must be installed to run the AI coding CLI backends.
- `helm`: For an enhanced auto-completion experience (`ai-code-input.el`).
- `yasnippet`: For snippet support in the prompt file. A library of snippets is included.
- `gptel`: For intelligent, AI-generated headlines in the prompt file.
- `flycheck`: To enable the `ai-code-flycheck-fix-errors-in-scope` command.
- `projectile`: For project root initialization.
- `helm-gtags`: For tags creation.
- `python-pytest`: For running python tests in the TDD workflow.
- `jest`: For running JavaScript / TypeScript tests in the TDD workflow.
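If you do not yet have a terminal emulator package, vterm can be installed from MELPA. A minimal sketch, assuming MELPA is already configured as shown above:

```elisp
;; vterm compiles a native module on first use; it needs CMake and
;; libvterm available on the system.
(use-package vterm
  :ensure t)
```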
- Transient-Driven Hub (`C-c a`): One keystroke opens a contextual transient menu that groups every capability (CLI control, code actions, agile workflows, utilities) so you never need to memorize scattered keybindings.
- AI CLI Session Management: Start (`a`), resume (`R`), or jump back into (`z`) the active AI CLI buffer, instantly swap backends (`s`), upgrade them (`u`), edit backend configs (`g`), and run prompts against the current file (`|`). It supports multiple sessions per project.
- Context-Aware Code Actions: The menu exposes dedicated entries for changing code (`c`), implementing TODOs (`i`), asking questions (`q`), explaining code (`x`), sending free-form commands (`<SPC>`), and refreshing AI context (`@`). Each command automatically captures the surrounding function, region, or clipboard contents (via `C-u`) to keep prompts precise.
- Agile Development Workflows: Use the refactoring navigator (`r`), the guided TDD cycle (`t`), and the pull/review diff helper (`v`) to keep AI-assisted work aligned with agile best practices. Prompt authoring is first-class through quick access to the prompt file (`p`), block sending (`b`), and AI-assisted shell/file execution (`!`).
- Productivity & Debugging Utilities: Initialize project navigation assets (`.`), investigate exceptions (`e`), auto-fix Flycheck issues in scope (`f`), copy or open file paths formatted for prompts (`k`, `o`), generate MCP inspector commands (`m`), and capture session notes straight into Org (`n`).
- Seamless Prompt Management: Open `.ai.code.prompt.org`, send regions with `ai-code-prompt-send-block`, and reuse prompt snippets via `yasnippet` to keep conversations organized.
- Interactive Chat & Context Tools: Dedicated buffers hold long-running chats, automatically enriched with file paths, diffs, and history from Magit or Git commands for richer AI responses.
- AI-Assisted Bash Commands: From Dired, shell, eshell, or vterm, run `C-c a !` and type natural-language commands prefixed with `:` (e.g., `:count lines of python code recursively`); the tool generates the shell command for review and executes it in a compile buffer.
- Changing Code: Position the cursor on a function or select a region of code. Press `C-c a`, then `c` (`ai-code-code-change`). Describe the change you want to make in the prompt. The AI will receive the context of the function or region and your instruction.
- Implementing a TODO: Write a comment in your code, like `;; TODO: Implement caching for this function`. Place your cursor on that line and press `C-c a`, then `i` (`ai-code-implement-todo`). The AI will generate the implementation based on the comment.
- Asking a Question: Place your cursor within a function, press `C-c a`, then `q` (`ai-code-ask-question`), type your question, and press Enter. The question, along with context, will be sent to the AI.
- Refactoring a Function: With the cursor in a function, press `C-c a`, then `r` (`ai-code-refactor-book-method`). Select a refactoring technique from the list, provide any required input (e.g., a new method name), and the prompt will be generated.
- Reviewing a Pull Request: Press `C-c a`, then `v` (`ai-code-pull-or-review-diff-file`). Choose to generate a diff between two branches. The diff will be created in a new buffer, and you’ll be prompted to start a review.
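If you prefer direct keybindings in addition to the transient menu, the commands named above can also be bound globally. A sketch (the key choices here are just examples; pick ones that do not clash with your setup):

```elisp
;; Bind frequently used ai-code commands directly
(global-set-key (kbd "C-c q") #'ai-code-ask-question)   ;; ask about code at point
(global-set-key (kbd "C-c i") #'ai-code-implement-todo) ;; implement the TODO at point
```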
Context engineering is the deliberate practice of selecting, structuring, and delivering the right information to an AI model so the output is specific, accurate, and actionable. For AI-assisted programming, the model cannot read your whole codebase by default, so the quality of the result depends heavily on the clarity and relevance of the provided context (file paths, functions, regions, related files, and repo-level notes). Good context engineering reduces ambiguity, prevents irrelevant suggestions, and keeps changes aligned with the current code.
This package makes context engineering easy by automatically assembling precise context blocks and letting you curate additional context on demand:
- Automatic file and window context: prompts can include the current file and other visible files (`ai-code--get-context-files-string`), so the AI sees related code without manual copying.
- Function or region scoping: most actions capture the current function or active region, keeping requests focused (e.g., `ai-code-code-change`, `ai-code-implement-todo`, `ai-code-ask-question`).
- Manual context curation: `C-c a @` (`ai-code-context-action`) stores file paths, function anchors, or region ranges in a repo-scoped list, which is appended to prompts via `ai-code--format-repo-context-info`.
- Optional clipboard context: prefix with `C-u` to append clipboard content to prompts for external snippets or logs.
- Prompt suffix guardrails: set `ai-code-prompt-suffix` to append persistent constraints to every prompt (when `ai-code-use-prompt-suffix` is non-nil). Example: `(setq ai-code-prompt-suffix "Only use English in code files, but reply in Simplified Chinese")`.
Example (focused refactor with curated context):
- In a buffer, run `C-c a @` to add the current function or selected region to stored repo context.
- Open another related file in a window so it is picked up by `ai-code--get-context-files-string`.
- Place the cursor in the target function and run `C-c a c` to request a change.
The generated prompt will include the function/region scope, visible file list, and stored repo context entries, giving the AI exactly the surrounding information it needs.
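The suffix guardrail mentioned above can be enabled persistently in your config. A sketch, using the variables named in this section (the suffix text itself is just an example):

```elisp
;; Append a fixed constraint to every generated prompt
(setq ai-code-use-prompt-suffix t)
(setq ai-code-prompt-suffix
      "Keep changes minimal; do not modify files outside the discussed scope.")
```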
This package acts as a generic interface that requires a backend AI assistant package to function. You can configure it to work with different backends.
- Press `C-c a` to open the AI menu, then `s` to "Select Backend".
- Pick one of the supported backends and the integration will switch immediately.
- The selection updates the start/switch/send commands and the CLI used by `ai-code-apply-prompt-on-current-file`.
Supported options:
- Claude Code (`claude-code.el`)
- Claude Code IDE (`claude-code-ide.el`)
- Gemini CLI (`ai-code-gemini-cli.el`)
- OpenAI Codex CLI (`ai-code-codex-cli.el`)
- GitHub Copilot CLI (`ai-code-github-copilot-cli.el`)
- Opencode (`ai-code-opencode.el`)
- Grok CLI (`ai-code-grok-cli.el`)
- Cursor CLI (`ai-code-cursor-cli.el`)
- Kiro CLI (`ai-code-kiro-cli.el`)
Install grok-cli and ensure the `grok` executable is on your PATH. Customize `grok-cli-program` or `grok-cli-program-switches` if you want to point at a different binary or pass additional flags (for example, selecting a profile). After that, select the backend through `ai-code-select-backend` or bind a helper in your config.
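For instance, a Grok setup might look like the following sketch. The binary path and the `--profile` switch are hypothetical examples; check your grok-cli documentation for the flags it actually accepts:

```elisp
;; Point at a specific binary and pass extra flags
;; (the "--profile work" switch is an illustrative example)
(setq grok-cli-program "/usr/local/bin/grok")
(setq grok-cli-program-switches '("--profile" "work"))
;; Then switch to the Grok backend
(ai-code-set-backend 'grok)
```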
You can add other backends by customizing the `ai-code-backends` variable.
- This PR adds github-copilot-cli; it can serve as an example of adding basic support for another AI coding CLI.
- Open an issue with information about the new AI coding CLI backend (e.g., Cursor CLI), at least providing the command-line name. You can also include the version-upgrade command, how to resume a session, where the configuration files are located, and so on. We can then ask GitHub Copilot to add support based on the issue.
Q: Using Opencode as backend, there can be performance issues with eat.el in Doom Emacs (see the linked Issue).
- A: Use vterm as the terminal backend; Opencode won't trigger mouse hover and will not cause Emacs flickering. Setting "theme" to "system" in the Opencode config can also reduce glitches. From gkzhb's answer:

```json
{
  "$schema": "https://opencode.ai/config.json",
  "theme": "system"
}
```

Q: Which Gemini model is fast and free?
- A: Use the gemini-3-flash model; it is fast, good quality (able to solve LeetCode hard problems), and free. You can set the following in your Emacs config:

```elisp
(setq ai-code-gemini-cli-program-switches '("--model" "gemini-3-flash-preview"))
```

Q: Codex CLI uses my API key instead of my ChatGPT Plus subscription and costs money. How do I fix that?
- A: Use `codex login` to log in with the OpenAI account that has your ChatGPT Plus subscription. After that, Codex CLI will use the subscription automatically. To confirm, run `/status` inside the Codex CLI buffer.
The following books introduce how to use AI to assist programming and may be helpful to aider / aider.el users.
- Beyond Vibe Coding, by Addy Osmani, August 2025
- Critical Thinking Habits for Coding with AI, by Andrew Stellman, October 2025
- Software Testing with Generative AI, by Mark Winteringham, December 2024
- More AI Assisted Programming related books
- Claude Code (`claude-code.el`)
- Claude Code IDE (`claude-code-ide.el`)
- Gemini CLI (`gemini-cli.el`)
- agent-shell (`acp.el`)
Apache-2.0 License
Contributions, issue reports, and improvement suggestions are welcome! Please open an issue or submit a pull request on the project’s GitHub repository.
