Install Codex CLI with Ollama
Step-by-step guide to installing and configuring the OpenAI Codex CLI with local Ollama models for offline AI coding assistance.
Difficulty: 🟡 Intermediate
Estimated Time: 15-20 minutes
Prerequisites: Node.js, Basic command line knowledge, Ubuntu/Debian system
What You'll Learn
This tutorial covers essential AI coding setup concepts and tools:
- Codex CLI Installation - Setting up OpenAI's command-line coding assistant
- Ollama Integration - Configuring local LLM models for offline use
- Local Development Setup - Creating a secure, private AI coding environment
- Workflow Integration - Using AI assistance in your development process
Prerequisites
- Node.js and npm installed
- Basic command line knowledge
- Ubuntu/Debian system
- curl utility
Related Tutorials
- VS Code Extensions - Enhance your development environment
- Docker GPU Setup - Containerize your AI development environment
- Security Best Practices - Secure your development workflow
Introduction
Welcome to the world of local AI coding assistance! In this guide, you'll learn how to install OpenAI's Codex CLI and configure it to use Ollama as a local LLM provider. This setup is perfect if you're looking for offline, private AI coding assistance without cloud dependencies.
Understanding Codex CLI
What is OpenAI Codex CLI?
The OpenAI Codex CLI is an open-source command-line tool that brings the power of OpenAI's latest reasoning models directly to your terminal. It acts as a lightweight coding agent that can:
- Read, modify, and run code on your local machine
- Help you build features faster
- Squash bugs efficiently
- Understand unfamiliar code
It runs entirely locally, so your code stays secure and private unless you choose to share it.
Key Features
- Zero-setup install: npm install -g @openai/codex
- Multimodal inputs: text, screenshots, diagrams
- Rich approval workflow: Suggest, Auto Edit, Full Auto
- Terminal-based: Works fully in the terminal for fast iteration
Approval Modes
| Mode | What It Can Do | When to Use |
|---|---|---|
| Suggest | Proposes edits & commands, requires approval | Safe exploration & code review |
| Auto Edit | Reads/writes files, asks before running commands | Repetitive or refactoring tasks |
| Full Auto | Full autonomy in a sandboxed environment | Long builds or automated prototyping |
Pro tip: Codex warns before entering Auto modes if the directory isn't under version control!
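Depending on your Codex CLI version, you can also pick a mode at launch from the command line; the flag below follows the Codex CLI README, but verify it against codex --help for your build:

```bash
# Default Suggest mode: Codex proposes every edit and command for approval
codex "refactor utils.py to use pathlib"

# Auto Edit mode: apply file edits directly, still ask before shell commands
# (--approval-mode flag assumed; confirm with `codex --help`)
codex --approval-mode auto-edit "rename the User class to Account"
```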
Installation Steps
Step 1: Install Ollama (Ubuntu Edition)
curl -fsSL https://ollama.com/install.sh | sh
ollama --version
sudo systemctl enable ollama
sudo systemctl start ollama
Troubleshooting: if the install script fails because curl is missing, install curl first, then download the script and run it manually:
sudo apt install curl
curl -fsSL https://ollama.com/install.sh -o install.sh
chmod +x install.sh && ./install.sh
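Once the service is running, you can confirm the API is reachable; /api/tags is a standard Ollama endpoint that lists locally installed models:

```bash
# Expect a JSON "models" array (empty until you pull a model in Step 3)
curl http://localhost:11434/api/tags

# Check the systemd service if the request fails
systemctl status ollama
```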
Step 2: Install Codex CLI Globally
curl -fsSL https://deb.nodesource.com/setup_22.x | sudo bash -
sudo apt install -y nodejs
sudo npm install -g @openai/codex
Note: the NodeSource nodejs package bundles npm, so there is no need to install Ubuntu's separate npm package (mixing the two can cause version conflicts).
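A quick sanity check that everything landed on your PATH:

```bash
node --version     # should print v22.x
npm --version
codex --version    # confirms the global install worked
```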
Step 3: Run Ollama Locally
ollama pull qwen3
ollama run qwen3
The pull command downloads the model; run opens an interactive chat so you can confirm the model responds (type /bye to exit). With the systemd service from Step 1 active, the Ollama API listens on http://localhost:11434.
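Because Codex will talk to Ollama through its OpenAI-compatible /v1 endpoint (configured in Step 4), you can smoke-test that endpoint directly:

```bash
# Minimal chat completion against Ollama's OpenAI-compatible API
curl http://localhost:11434/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
        "model": "qwen3",
        "messages": [{"role": "user", "content": "Say hello in one word."}]
      }'
```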
Step 4: Create Your Codex Config File
mkdir -p ~/.codex
nano ~/.codex/config.json
Paste the following configuration:
{
  "model": "qwen3",
  "provider": "ollama",
  "providers": {
    "ollama": {
      "name": "Ollama",
      "baseURL": "http://localhost:11434/v1",
      "envKey": "OLLAMA_API_KEY"
    }
  },
  "history": {
    "maxSize": 1000,
    "saveHistory": true,
    "sensitivePatterns": []
  }
}
The provider value should match the lowercase key inside providers ("ollama").
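Ollama itself doesn't check API keys, but because the config names OLLAMA_API_KEY in envKey, Codex expects that environment variable to exist; a placeholder value is enough (this is inferred from the envKey field above, so adjust if your version behaves differently):

```bash
# Placeholder only: Ollama ignores the value, Codex just needs the variable set
export OLLAMA_API_KEY="ollama"
```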
Using Codex CLI
Start Coding with Codex CLI
codex "generate a python hello world"
The prompt is passed as a single quoted argument.
Example Usage
> codex
Enter prompt: Generate a Python function to fetch data from an API.
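You can also override the config file for a single run; the flags below follow the original Codex CLI README, so double-check them with codex --help:

```bash
# One-shot prompt with an explicit provider and model (flag names assumed)
codex --provider ollama --model qwen3 "Generate a Python function to fetch data from an API"
```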
Troubleshooting Common Issues
Ollama Connection Issues
- Ollama not responding: Ensure the service is running and listening on localhost:11434 (see the diagnostic commands after this list)
- Model not found: Run ollama pull qwen3 to download the model first
- codex: command not found: Make sure the Codex CLI was installed globally (sudo npm install -g @openai/codex)
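Two commands that help narrow down where the problem is (both are standard systemd/Ollama tooling):

```bash
# Inspect recent Ollama service logs
journalctl -u ollama --since "10 minutes ago"

# List the models Ollama has downloaded locally
ollama list
```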
Node.js Installation Issues
- Permission denied: Use sudo for global npm installations, or see the sudo-free alternative sketched after this list
- Version conflicts: Ensure you're using Node.js 22.x or later
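If you'd rather avoid sudo for global npm installs altogether, one common approach is a user-level prefix (the directory name here is just a convention, not required by Codex):

```bash
# Install global npm packages into your home directory instead of /usr
npm config set prefix "$HOME/.npm-global"
echo 'export PATH="$HOME/.npm-global/bin:$PATH"' >> ~/.bashrc
source ~/.bashrc
npm install -g @openai/codex   # no sudo needed now
```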
Conclusion
You've successfully set up a local AI coding assistant! The Codex CLI with Ollama integration provides you with powerful, offline AI coding capabilities while maintaining the privacy and security of your code.
Key Takeaways:
- Codex CLI offers powerful local AI coding assistance
- Ollama provides offline LLM capabilities
- Local setup ensures code privacy and security
Next Steps:
- Explore different Ollama models for various coding tasks
- Integrate Codex into your daily development workflow
- Consider setting up version control for safer Auto mode usage
Tags: #Codex #Ollama #AICoding #LocalLLM #CLI #Development #OpenAI #NodeJS #Ubuntu