Manage model-specific optimal parameter settings: save, display, delete, and test the best parameters found for each model. To manage optimal parameters for different models, use:
```bash
python -m wisent model-config SUBCOMMAND [OPTIONS]
```
Save optimal parameters for a model after optimization.
```bash
python -m wisent model-config save \
    meta-llama/Llama-3.1-8B-Instruct \
    --classification-layer 15 \
    --steering-layer 15 \
    --token-aggregation average \
    --detection-threshold 0.6 \
    --optimization-method optuna
```
| Argument | Required | Description |
|---|---|---|
| model | Yes | Model name or path |
| --classification-layer | Yes | Optimal layer for classification |
| --steering-layer | No | Optimal layer for steering (defaults to classification layer) |
| --token-aggregation | No | Token aggregation method (default: average) |
| --detection-threshold | No | Detection threshold (default: 0.6) |
| --optimization-method | No | How parameters were determined (default: manual) |
| --metrics | No | JSON string with optimization metrics |
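For example, optimization metrics can be recorded alongside the saved parameters via `--metrics`. The sketch below assumes a free-form JSON object; the metric names (`accuracy`, `f1`) are illustrative, not a fixed schema.

```bash
# Hypothetical example: save parameters together with optimization metrics.
# The metric names and values are illustrative only.
python -m wisent model-config save \
    meta-llama/Llama-3.1-8B-Instruct \
    --classification-layer 15 \
    --optimization-method optuna \
    --metrics '{"accuracy": 0.87, "f1": 0.84}'
```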
List all saved model configurations, optionally with full parameter details:

```bash
python -m wisent model-config list
python -m wisent model-config list --detailed
```

Show the saved configuration for a model, optionally for a specific task:

```bash
python -m wisent model-config show \
    meta-llama/Llama-3.1-8B-Instruct

python -m wisent model-config show \
    meta-llama/Llama-3.1-8B-Instruct \
    --task truthfulqa_mc1
```

Remove a model's saved configuration:

```bash
python -m wisent model-config remove \
    meta-llama/Llama-3.1-8B-Instruct \
    --confirm
```

Test a saved configuration on a task with a small sample limit:

```bash
python -m wisent model-config test \
    meta-llama/Llama-3.1-8B-Instruct \
    --task truthfulqa_mc1 \
    --limit 5
```
Additional options for model-config commands:

| Argument | Description |
|---|---|
| --config-dir | Custom directory for configuration files (default: ~/.wisent/model_configs/) |
| --verbose | Enable verbose output |
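For instance, configurations can be read from a project-local directory instead of the default `~/.wisent/model_configs/`. The path below is illustrative, and the sketch assumes the options are passed after the subcommand:

```bash
# Example: use a project-local configuration directory with verbose output.
# "./model_configs" is an illustrative path, not a required location.
python -m wisent model-config list \
    --config-dir ./model_configs \
    --verbose
```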