Zed + Groq = Insanely Fast Coding Assistance
Sometimes in life you stumble upon unexpected joy :) For me, it was recently Zed. After a couple of weeks, I was completely sold on Zed as my new IDE. Then one morning I was thinking about how insanely fast Groq’s inference is and it dawned on me: what if I could use Groq as my inference backend in Zed?
Sure enough, it just requires the following config in Zed's settings.json file:
You can find the available Groq API models and their context window sizes in Groq's model documentation.
"language_models": {
"openai_compatible": {
"groq": {
"api_url": "https://api.groq.com/openai/v1",
"available_models": [
{
"name": "openai/gpt-oss-120b",
"display_name": "OpenAI GPT OSS 120B",
"max_tokens": 131072,
"capabilities": {
"tools": true,
"images": false,
"parallel_tool_calls": false,
"prompt_cache_key": false
}
}
]
}
}
},
An API key is required to use Groq as an inference backend. You can obtain one from the Groq console, then add it to your shell's environment variables, where Zed will look for it:
echo 'export GROQ_API_KEY=<your_api_key>' >> ~/.zshrc
source ~/.zshrc
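To confirm the key works end to end before opening Zed, you can hit the same OpenAI-compatible endpoint the editor will use. A minimal sketch, assuming GROQ_API_KEY is set and using the model configured above:

```shell
# Send a one-off chat completion request to Groq's OpenAI-compatible API.
# A JSON response with a "choices" array means the key and model are good to go.
curl https://api.groq.com/openai/v1/chat/completions \
  -H "Authorization: Bearer $GROQ_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "openai/gpt-oss-120b",
    "messages": [{"role": "user", "content": "Say hello"}]
  }'
```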
Big shout‑out to Zed for bringing back the joy of coding! Cheers 🥂