Managing models, providers, logging, and settings using the config engine
Tiny Claw uses a config engine backed by SQLite to manage all configuration. The tinyclaw config command provides a command-line interface for managing models, providers, and logging settings.
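To make the idea of a SQLite-backed config engine concrete, here is a minimal sketch of a key-value settings store. The table name, column names, and setting keys are illustrative assumptions, not Tiny Claw's actual schema:

```python
import sqlite3

# Hypothetical sketch of a SQLite-backed config store.
# Schema and keys are illustrative, not Tiny Claw's internals.
def open_config(path=":memory:"):
    db = sqlite3.connect(path)
    db.execute(
        "CREATE TABLE IF NOT EXISTS config (key TEXT PRIMARY KEY, value TEXT)"
    )
    return db

def set_value(db, key, value):
    # INSERT OR REPLACE keeps exactly one row per setting key.
    db.execute(
        "INSERT OR REPLACE INTO config (key, value) VALUES (?, ?)",
        (key, value),
    )
    db.commit()

def get_value(db, key, default=None):
    row = db.execute(
        "SELECT value FROM config WHERE key = ?", (key,)
    ).fetchone()
    return row[0] if row else default

db = open_config()
set_value(db, "model.builtin", "qwen2.5:32b-instruct")
print(get_value(db, "model.builtin"))  # qwen2.5:32b-instruct
```

Persisting settings in a single SQLite file gives atomic updates and easy inspection with standard tools, which is the usual motivation for this design.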
Display the active model configuration for both tiers:
tinyclaw config model
Model Configuration

Built-in (Ollama Cloud - always available as fallback)
  Model    : qwen2.5:32b-instruct
  Base URL : https://ollama.com/api

Primary (overrides built-in as default provider)
  Model    : gpt-4
  Base URL : https://api.openai.com/v1

The smart router uses Primary as the default provider.
Built-in is the fallback if Primary becomes unavailable.
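The routing rule described above (Primary by default, Built-in as fallback) can be sketched as follows. The function and provider names here are illustrative assumptions, not Tiny Claw's internals:

```python
# Illustrative sketch of the smart router's provider choice:
# use Primary when it is configured and reachable, otherwise
# fall back to the always-available Built-in provider.
def pick_provider(primary, builtin, is_available):
    """Return the provider the router should use.

    primary      -- the configured primary provider, or None if unset
    builtin      -- the always-available built-in provider
    is_available -- callable probing whether a provider is reachable
    """
    if primary is not None and is_available(primary):
        return primary
    return builtin

# With a primary configured and reachable, it wins:
print(pick_provider("openai", "ollama-cloud", lambda p: True))   # openai
# If the primary is unreachable (or unset), the router falls back:
print(pick_provider("openai", "ollama-cloud", lambda p: False))  # ollama-cloud
```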
See all models available for the built-in Ollama Cloud provider:
tinyclaw config model list
Available Built-in Models

● qwen2.5:32b-instruct
  Best all-around model - balanced speed and capability
○ qwen2.5:14b-instruct
  Faster, smaller model for simple tasks
○ qwen2.5:72b-instruct
  Largest model for complex reasoning

Switch with: tinyclaw config model builtin <tag>
The ● marker indicates the currently active model.
When a primary provider is configured, its details are shown:

Primary Provider
  Model    : gpt-4
  Base URL : https://api.openai.com/v1
  API Key  : stored as "provider:openai"

Clear with: tinyclaw config model primary clear
If no primary provider is configured, the output explains the default:

Primary Provider (not configured)

  No primary provider set. The built-in Ollama Cloud provider is used
  as the default for the smart router.

  To add a primary provider, install a provider plugin and ask Tiny Claw
  to set it as primary. You can also tell Tiny Claw: "list my providers"
  or "set OpenAI as my primary provider".
Remove the primary provider override and revert to built-in:
tinyclaw config model primary clear
$ tinyclaw config model primary clear
✔ Primary provider cleared

Built-in will be used as the default provider.
Restart Tiny Claw for changes to take effect.