A CLI tool for managing your locally downloaded Hugging Face models and datasets
Disclaimer: This tool is not affiliated with or endorsed by Hugging Face. It is an independent, community-developed utility.
- Smart Asset Detection: Detect HuggingFace models, datasets, LoRA adapters, fine-tuned models, Ollama models, and custom formats
- Asset Listing: View all your AI assets with size information and metadata
- Duplicate Detection: Find and clean duplicate downloads to save disk space
- Asset Details: View model configurations and dataset documentation with rich formatting
- Directory Management: Add and manage custom directories containing your AI assets
- Manifest System: Customize model names, publishers, and metadata with JSON manifests
- HuggingFace Models & Datasets: Standard cached downloads from Hugging Face Hub
- LoRA Adapters: Fine-tuned adapters from training frameworks like Unsloth
- Custom Models: Fine-tuned models, merged models, and other custom formats
- Ollama Models: GGUF format models from Ollama (both user and system directories)
pip install hf-model-tool
git clone https://github.com/Chen-zexi/hf-model-tool.git
cd hf-model-tool
pip install -e .
hf-model-tool
Launches the interactive CLI with:
- System status showing assets across all configured directories
- Asset management tools for all supported formats
- Easy directory configuration and management
The tool provides an API specifically designed for vLLM-CLI model discovery and management, and can also be launched directly from vLLM-CLI.
For detailed instructions on serving models from custom directories with vLLM-CLI, see:
- Custom Model Serving Guide - Complete guide for vLLM-CLI integration
from hf_model_tool import get_downloaded_models
from hf_model_tool.api import HFModelAPI
# Quick access to models
models = get_downloaded_models()
# Full API access
api = HFModelAPI()
api.add_directory("/path/to/models", "custom")
assets = api.list_assets()
See API Reference for complete documentation.
The tool provides comprehensive command-line options for direct operations:
# Launch interactive mode
hf-model-tool
# List all detected assets
hf-model-tool -l
hf-model-tool --list
# Enter asset management mode
hf-model-tool -m
hf-model-tool --manage
# View detailed asset information
hf-model-tool -v
hf-model-tool --view
hf-model-tool --details
# Show version
hf-model-tool --version
# Show help
hf-model-tool -h
hf-model-tool --help
# Add a directory containing LoRA adapters
hf-model-tool -path ~/my-lora-models
hf-model-tool --add-path ~/my-lora-models
# Add a custom model directory
hf-model-tool -path /data/custom-models
# Add current working directory
hf-model-tool -path .
# Add with absolute path
hf-model-tool -path /home/user/ai-projects/models
# List assets sorted by size (default)
hf-model-tool -l --sort size
# List assets sorted by name
hf-model-tool -l --sort name
# List assets sorted by date
hf-model-tool -l --sort date
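The three sort orders amount to simple key functions over the asset records. A minimal sketch (the `size`/`name`/`date` field names and the descending order for size and date are illustrative assumptions, not the tool's documented schema):

```python
from operator import itemgetter

def sort_assets(assets: list, by: str = "size") -> list:
    """Sort asset records the way a --sort flag might.

    size and date sort descending (largest / newest first);
    name sorts ascending, case-insensitively.
    """
    if by == "size":
        return sorted(assets, key=itemgetter("size"), reverse=True)
    if by == "date":
        return sorted(assets, key=itemgetter("date"), reverse=True)
    if by == "name":
        return sorted(assets, key=lambda a: a["name"].lower())
    raise ValueError(f"unknown sort key: {by}")
```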
- ↑/↓ arrows: Navigate menu options
- Enter: Select current option
- Back: Select to return to previous menu
- Config: Select to access settings and directory management
- Main Menu: Select to return to main menu from anywhere
- Exit: Select for a clean application shutdown
- Ctrl+C: Force exit
- Directory Setup: Add directories containing your AI assets (HuggingFace cache, LoRA adapters, custom models)
- List Assets: View all detected assets with size information across all directories
- Manage Assets: Delete unwanted files and deduplicate identical assets
- View Details: Inspect model configurations and dataset documentation
- Configuration: Manage directories, change sorting preferences, and access help
- Manifest System Guide - Learn how to customize model metadata with JSON manifests
- Custom Directories Guide - Configure and manage custom model directories
- API Reference - Complete API documentation for programmatic usage
Add custom directories containing your AI assets:
- HuggingFace Cache: Standard HF cache with `models--publisher--name` structure
- Custom Directory: LoRA adapters, fine-tuned models, or other custom formats
- Auto-detect: Let the tool automatically determine the directory type
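The `models--publisher--name` convention is how the Hugging Face Hub cache flattens repo IDs into directory names (`/` becomes `--`). A small helper to convert between the two forms, independent of the tool's own code:

```python
def cache_dir_to_repo_id(dir_name: str) -> str:
    """Convert an HF cache directory name back to a repo ID.

    e.g. 'models--meta-llama--Llama-2-7b-hf' -> 'meta-llama/Llama-2-7b-hf'
    """
    prefix, _, rest = dir_name.partition("--")
    if prefix not in ("models", "datasets") or not rest:
        raise ValueError(f"not an HF cache entry: {dir_name}")
    # Only the first '--' separates publisher from name; later ones
    # may be part of the model name itself.
    return rest.replace("--", "/", 1)

def repo_id_to_cache_dir(repo_id: str, kind: str = "models") -> str:
    """Inverse mapping: 'meta-llama/Llama-2-7b-hf' -> 'models--meta-llama--Llama-2-7b-hf'."""
    return f"{kind}--{repo_id.replace('/', '--')}"
```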
Access via "Config" from any screen:
- Directory Management: Add, remove, and test directories
- Sort Options: Size (default), Date, or Name
- Help System: Navigation and usage guide
Automatic Generation: When you add a custom directory, the tool automatically generates a `models_manifest.json` file that:
- Becomes the primary source for model information
- Is always read first for classification
- Can be edited to ensure accurate display in vLLM-CLI
Customize model metadata using JSON manifests:
- Define custom names for your models
- Specify publishers and organizations
- Add notes and documentation
- See Manifest System Guide for details
Important: Review and edit auto-generated manifests to ensure model names and publishers are accurate for your use case.
hf_model_tool/
├── __main__.py # Application entry point with welcome screen
├── cache.py # Multi-directory asset scanning
├── ui.py # Rich terminal interface components
├── utils.py # Asset grouping and duplicate detection
├── navigation.py # Menu navigation
├── config.py # Configuration and directory management
└── asset_detector.py # Asset detection (LoRA, custom models, etc.)
- Python ≥ 3.7
- Dependencies: rich, inquirer, html2text
Application logs are written to ~/.hf-model-tool.log for debugging and monitoring.
Settings and directory configurations are stored in ~/.config/hf-model-tool/config.json.
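Those locations can be read from a script; a sketch (the keys inside config.json are not documented here, so the example only loads the raw JSON and falls back to an empty dict if the tool has never run):

```python
import json
from pathlib import Path

# Paths used by hf-model-tool for settings and logs.
CONFIG_PATH = Path.home() / ".config" / "hf-model-tool" / "config.json"
LOG_PATH = Path.home() / ".hf-model-tool.log"

def load_config() -> dict:
    """Return the tool's saved settings, or {} if none exist yet."""
    if CONFIG_PATH.exists():
        return json.loads(CONFIG_PATH.read_text())
    return {}
```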
We welcome contributions from the community! Please feel free to:
- Open an issue at GitHub Issues
- Submit a pull request with your improvements
- Share feedback about your experience using the tool
This project is licensed under the MIT License - see the LICENSE file for details.