# qq - Quick Question CLI for LLMs

A CLI tool for querying LLMs.

## Installation

```shell
go install github.com/mkozhukh/qq@latest
```
## Configuration

Set your API key and preferred model using environment variables:

```shell
export QQ_KEY="your-api-key"
export QQ_MODEL="provider/model-name"
export QQ_PROMPT="Optional system prompt"
```

Supported providers and example models:

- `openai/gpt-4.1-mini`
- `anthropic/claude-4-0-sonnet`
- `gemini/gemini-2.5-flash`
## Usage Scenarios

### Basic Query

```shell
qq "What is the capital of France?"
```
### Interactive Mode

When run without arguments, `qq` enters interactive mode:

```shell
qq
# Enter your message in the form that appears
```
### Piping Content

```shell
cat error.log | qq "What's causing these errors?"
git diff | qq "Summarize these changes"
```
### System Prompts

Add context with a system prompt:

```shell
qq --prompt "You are a senior DevOps engineer" "How do I set up a CI/CD pipeline?"
```
### Non-Streaming Mode

Use the `--buffer` flag to get the complete response at once:

```shell
qq --buffer "List 5 programming languages" | grep -i python
```
## Environment Variables

- `QQ_MODEL`: Default model (e.g., `openai/gpt-4.1`)
- `QQ_KEY`: API key for the provider
- `QQ_PROMPT`: Default system prompt
If the `QQ_*` variables are not defined, qq falls back to the variables used by mkozhukh/echo:

- `ECHO_MODEL`: the model
- `ECHO_KEY`: the API key

If no API key is found there either, qq checks the provider-specific variables:

- `GEMINI_API_KEY`
- `OPENAI_API_KEY`
- `ANTHROPIC_API_KEY`
## Command Line Options

- `--model`: Override the model for this query
- `--key`: Override the API key for this query
- `--prompt`: Add a system prompt for context
- `--buffer`: Disable streaming (print the complete response at once)
- `--help`: Show usage information
## License

MIT License - see the LICENSE file for details.