This page covers what you need to get configured quickly. For the complete settings reference — every key, type, default, and CLI override — see Configuration Reference.

Config file

notewise reads configuration from:
~/.notewise/config.env
This file is created by notewise setup. It uses a simple KEY=VALUE format; quotes are optional and stripped if present. To use a different config directory, set NOTEWISE_HOME:
export NOTEWISE_HOME=/custom/path
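The loading behavior described above (KEY=VALUE pairs, optional quotes stripped, NOTEWISE_HOME override) can be sketched as follows. This is an illustration only, not notewise's actual loader; the function name is hypothetical.

```python
import os
from pathlib import Path

def load_config_env():
    """Parse KEY=VALUE pairs from config.env, stripping optional quotes.

    Illustrative sketch; notewise's real loader may differ in detail.
    """
    home = Path(os.environ.get("NOTEWISE_HOME", Path.home() / ".notewise"))
    path = home / "config.env"
    settings = {}
    if not path.exists():
        return settings
    for line in path.read_text().splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue  # skip blank lines and comments
        key, _, value = line.partition("=")
        settings[key.strip()] = value.strip().strip("'\"")  # quotes optional
    return settings
```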

Load order

Settings are resolved in this priority order — later entries win:
1. Code defaults: built-in values defined in _constants.py.
2. config.env: values written by notewise setup or edited manually.
3. Environment variables: always override the config file.
4. CLI flags: override everything for a single run (e.g. --model, --output).
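The resolution order above amounts to merging the four layers in sequence, with later layers winning. A minimal sketch (names and layer dicts are illustrative, not notewise's internals):

```python
def resolve(defaults, file_cfg, env_cfg, cli_cfg):
    """Merge config layers; later (higher-priority) layers win."""
    merged = {}
    for layer in (defaults, file_cfg, env_cfg, cli_cfg):
        # Only set values override; an unset env var or CLI flag
        # (None) leaves the lower-priority value in place.
        merged.update({k: v for k, v in layer.items() if v is not None})
    return merged
```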

Essential settings

Model & API key

Set your default LLM and its matching API key. Only the key for your chosen provider needs to be set.
# Default — free tier via Google AI Studio
DEFAULT_MODEL=gemini/gemini-2.5-flash
GEMINI_API_KEY=your_key_here
Switching providers:
DEFAULT_MODEL=gpt-4o
OPENAI_API_KEY=sk-...
See LLM Providers for all supported models and key names.

Output directory

OUTPUT_DIR=~/study-notes    # default: ./output

Concurrency

MAX_CONCURRENT_VIDEOS=5     # default: 5 videos processed in parallel
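MAX_CONCURRENT_VIDEOS caps how many videos are processed at once. One common way to enforce such a cap is a semaphore around each worker; this is a generic sketch under that assumption, not notewise's actual pipeline code:

```python
import asyncio

async def process_all(videos, max_concurrent=5):
    """Process videos in parallel, at most max_concurrent at a time."""
    sem = asyncio.Semaphore(max_concurrent)

    async def worker(video):
        async with sem:  # blocks while max_concurrent tasks are active
            await asyncio.sleep(0)  # stand-in for real per-video work
            return f"notes for {video}"

    # gather preserves input order regardless of completion order
    return await asyncio.gather(*(worker(v) for v in videos))
```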

Temperature

TEMPERATURE=0.7             # default: 0.7 — range 0.0–1.0

Example config.env

# ~/.notewise/config.env

DEFAULT_MODEL=gemini/gemini-2.5-flash
OUTPUT_DIR=~/study-notes
MAX_CONCURRENT_VIDEOS=3
TEMPERATURE=0.7
# MAX_TOKENS=2000

GEMINI_API_KEY=your_key_here
# OPENAI_API_KEY=sk-...
# ANTHROPIC_API_KEY=sk-ant-...

Config commands

Command                  What it does
notewise config          Show resolved config (API keys masked)
notewise config-path     Print path to the active config file
notewise edit-config     Open config.env in $EDITOR
notewise setup --force   Re-run the interactive wizard
After any config change, run notewise config to confirm the resolved values before your next run.

Next steps

Full configuration reference

Every key, type, default, code-only constant, and CLI override.

LLM Providers

All supported providers, model string format, and routing table.