# Centralized Configuration
Set up once, use everywhere. AbstractCore's centralized configuration system eliminates repetitive provider and model specifications across your projects.
## Overview

AbstractCore provides a unified configuration system that manages default models, cache directories, logging settings, and API keys from a single location: `~/.abstractcore/config/abstractcore.json`.

This eliminates the need to specify providers and models repeatedly and provides consistent behavior across all applications.
## Quick Start

```bash
# Check current configuration status
abstractcore --status

# Set the global fallback model (used when no app-specific default exists)
abstractcore --set-global-default ollama/llama3:8b

# Set app-specific defaults for optimal performance
abstractcore --set-app-default summarizer openai gpt-4o-mini
abstractcore --set-app-default cli ollama qwen3:4b

# Set API keys
abstractcore --set-api-key openai sk-your-key-here
abstractcore --set-api-key anthropic your-anthropic-key

# Configure logging
abstractcore --set-console-log-level WARNING
```
## Configuration Priority System

AbstractCore uses a clear priority hierarchy to determine which model to use:

1. **Explicit parameters** (highest priority): `summarizer document.txt --provider openai --model gpt-4o-mini`
2. **App-specific configuration**: `abstractcore --set-app-default summarizer openai gpt-4o-mini`
3. **Global configuration**: `abstractcore --set-global-default openai/gpt-4o-mini`
4. **Hardcoded defaults** (lowest priority). Current default: `huggingface/unsloth/Qwen3-4B-Instruct-2507-GGUF`
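The priority chain above can be sketched as a simple resolution function. This is an illustrative model of the lookup order, not AbstractCore's internal code; the function name and config keys (`app_defaults`, `global_default`) are assumptions.

```python
# Hypothetical sketch of the four-level model-resolution order.
# Config keys and function names are illustrative assumptions.

HARDCODED_DEFAULT = "huggingface/unsloth/Qwen3-4B-Instruct-2507-GGUF"

def resolve_model(explicit=None, app=None, config=None):
    """Return the first model found, walking the priority chain top-down."""
    config = config or {}
    # 1. Explicit --provider/--model parameters win outright.
    if explicit:
        return explicit
    # 2. App-specific default (set via --set-app-default).
    app_defaults = config.get("app_defaults", {})
    if app and app in app_defaults:
        return app_defaults[app]
    # 3. Global fallback (set via --set-global-default).
    if config.get("global_default"):
        return config["global_default"]
    # 4. Hardcoded library default, lowest priority.
    return HARDCODED_DEFAULT

cfg = {
    "global_default": "ollama/llama3:8b",
    "app_defaults": {"summarizer": "openai/gpt-4o-mini"},
}
print(resolve_model(app="summarizer", config=cfg))  # openai/gpt-4o-mini
print(resolve_model(app="cli", config=cfg))         # ollama/llama3:8b
```

Note that an app with no specific default (`cli` here) falls through to the global setting, and an empty config falls all the way through to the hardcoded default.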
## Application-Specific Defaults

Set different providers and models for different AbstractCore applications:

```bash
# Fast local model for CLI testing
abstractcore --set-app-default cli ollama qwen3:4b

# High-quality cloud model for document summarization
abstractcore --set-app-default summarizer openai gpt-4o-mini

# Balanced model for extraction tasks
abstractcore --set-app-default extractor anthropic claude-3-5-haiku

# Analytical model for evaluation
abstractcore --set-app-default judge anthropic claude-3-5-haiku
```
## Vision Fallback Configuration

Enable text-only models to process images through smart vision fallback:

```bash
# Download a local vision model (recommended)
abstractcore --download-vision-model

# Or use an existing Ollama model
abstractcore --set-vision-caption qwen2.5vl:7b

# Or use a cloud API
abstractcore --set-vision-provider openai --model gpt-4o

# Disable vision fallback
abstractcore --disable-vision
```
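The idea behind vision fallback can be sketched as follows: when the target model cannot see images, the configured vision model captions them first, and the caption travels with the prompt as plain text. Everything here (function names, the caption format) is an illustrative assumption, not AbstractCore's actual implementation.

```python
# Illustrative sketch of vision fallback, not library internals.

def prepare_prompt(prompt, image, model_supports_vision, caption_fn=None):
    """Return (text, image) ready for the target model."""
    if model_supports_vision:
        return prompt, image                  # pass the image through directly
    if caption_fn is None:
        # corresponds to --disable-vision with a text-only model
        raise ValueError("vision fallback disabled and model is text-only")
    caption = caption_fn(image)               # e.g. qwen2.5vl:7b or gpt-4o
    return f"{prompt}\n[Image description: {caption}]", None

# A text-only model receives the caption instead of the raw image.
text, img = prepare_prompt(
    "Summarize this chart", "chart.png",
    model_supports_vision=False,
    caption_fn=lambda path: "a bar chart of sales by region",
)
```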
## Logging Configuration

Control logging behavior across all AbstractCore components:

```bash
# Set the console logging level
abstractcore --set-console-log-level DEBUG    # Show all messages
abstractcore --set-console-log-level INFO     # Show info and above
abstractcore --set-console-log-level WARNING  # Show warnings and errors (default)
abstractcore --set-console-log-level ERROR    # Show only errors
abstractcore --set-console-log-level NONE     # Disable console logging

# File logging controls
abstractcore --enable-file-logging            # Start saving logs to files
abstractcore --disable-file-logging           # Stop saving logs to files
abstractcore --set-log-base-dir ~/.abstractcore/logs

# Quick commands
abstractcore --enable-debug-logging           # Sets both console and file to DEBUG
abstractcore --disable-console-logging        # Keeps file logging if enabled
```
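One way to picture the console levels is as a mapping onto Python's standard `logging` levels, with `NONE` attaching no console handler at all. This mapping is a sketch of the concept, not AbstractCore's actual implementation.

```python
import logging

# Illustrative mapping of the documented console levels onto stdlib logging.
CONSOLE_LEVELS = {
    "DEBUG": logging.DEBUG,      # show all messages
    "INFO": logging.INFO,        # info and above
    "WARNING": logging.WARNING,  # warnings and errors (default)
    "ERROR": logging.ERROR,      # only errors
}

def make_console_logger(level_name="WARNING"):
    """Build a logger for the given console level; "NONE" silences it."""
    logger = logging.getLogger(f"abstractcore.demo.{level_name}")
    logger.handlers.clear()                       # keep repeated calls idempotent
    if level_name == "NONE":
        logger.addHandler(logging.NullHandler())  # console output disabled
        return logger
    level = CONSOLE_LEVELS[level_name]
    handler = logging.StreamHandler()             # writes to stderr
    handler.setLevel(level)
    logger.addHandler(handler)
    logger.setLevel(level)
    return logger

log = make_console_logger("WARNING")
log.debug("suppressed at WARNING level")
log.warning("emitted at WARNING level")
```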
## Cache and Storage Configuration

```bash
# Set cache directories
abstractcore --set-default-cache-dir ~/.cache/abstractcore
abstractcore --set-huggingface-cache-dir ~/.cache/huggingface
abstractcore --set-local-models-cache-dir ~/.abstractcore/models
```
## Configuration Status Dashboard

View the complete configuration, along with the commands for changing each setting:

```bash
abstractcore --status
```

This displays a hierarchical dashboard showing:

- 🎯 Application defaults (CLI, Summarizer, Extractor, Judge)
- 🌐 Global fallback settings
- 👁️ Media processing configuration
- 🔑 Provider access (API key status)
- 📝 Logging configuration
- 💾 Storage locations
## Common Configuration Workflows

### First-Time Setup

```bash
# Check what's available
abstractcore --status

# Configure for development (free local models)
abstractcore --set-global-default ollama/llama3:8b
abstractcore --set-console-log-level WARNING

# Add API keys when ready for cloud providers
abstractcore --set-api-key openai sk-your-key-here
abstractcore --set-api-key anthropic your-anthropic-key

# Verify everything works
abstractcore --status
```
### Development Environment

```bash
# Optimize for local development
abstractcore --set-global-default ollama/llama3:8b   # Free local models
abstractcore --enable-debug-logging                  # Detailed logs for debugging
abstractcore --set-app-default cli ollama qwen3:4b   # Fast model for CLI testing
```
### Production Environment

```bash
# Configure for production reliability and performance
abstractcore --set-global-default openai/gpt-4o-mini          # Reliable cloud provider
abstractcore --set-console-log-level WARNING                  # Reduce noise
abstractcore --enable-file-logging                            # Persistent logs
abstractcore --set-app-default summarizer openai gpt-4o-mini  # Optimize for quality
```
### Multi-Environment Approach

```bash
# Use different providers for different applications
abstractcore --set-app-default cli ollama qwen3:4b               # Fast for development
abstractcore --set-app-default summarizer openai gpt-4o-mini     # Quality for documents
abstractcore --set-app-default judge anthropic claude-3-5-haiku  # Detailed analysis
```
## Configuration File Location

Configuration is stored in `~/.abstractcore/config/abstractcore.json`.

You can edit this file directly if needed, but the CLI commands are recommended because they validate values before writing.
## Benefits

- **No repetition** - Set your preferences once, use them everywhere
- **Environment-specific** - Different settings for dev, staging, and production
- **App-optimized** - Different models for different tasks, automatically
- **Centralized API keys** - Manage all provider credentials in one place
- **Consistent logging** - Unified logging behavior across all components
- **Easy switching** - Change providers globally with one command