Implementation:Shiyu coder Kronos Config Init

Field Value
implementation_name Config_Init
type API Doc
repository https://github.com/shiyu-coder/Kronos
source_file finetune/config.py:L3-131
implements Principle:Shiyu_coder_Kronos_Qlib_Experiment_Configuration
last_updated 2026-02-09 14:00 GMT

Summary

The Config class centralizes all experiment hyperparameters, data paths, time ranges, training settings, and backtesting parameters into a single configuration object with no constructor arguments.

Class

Config

API Signature

Config() -> Config

Import

from config import Config  # run from finetune/ directory
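
If a script is launched from somewhere other than finetune/, one workaround is to put that directory on sys.path first. The checkout path below is a placeholder, not from the repository:

import sys
sys.path.append("/path/to/Kronos/finetune")  # placeholder: adjust to your checkout

from config import Config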

Constructor

No constructor arguments. All fields are set as attributes in __init__.
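
A minimal sketch of the pattern; the attribute names and defaults are taken from the tables below, and the real class sets many more fields:

class Config:
    def __init__(self):
        # every hyperparameter is a plain attribute with a hard-coded default
        self.instrument = "csi300"
        self.lookback_window = 90
        self.predict_window = 10
        # ... all remaining fields from the tables below are set the same way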

Key Attributes

Data and Feature Parameters

Attribute Default Description
qlib_data_path "~/.qlib/qlib_data/cn_data" Path to Qlib CN data directory
instrument "csi300" Instrument universe for data loading
lookback_window 90 Number of past time steps for input
predict_window 10 Number of future time steps for prediction
max_context 512 Maximum context length for the model
feature_list ['open', 'high', 'low', 'close', 'vol', 'amt'] Features used from raw data
time_feature_list ['minute', 'hour', 'weekday', 'day', 'month'] Time-based features to generate
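
The two windows must fit within the model context; the check below illustrates the arithmetic and is not code from the repository:

from config import Config

config = Config()
window = config.lookback_window + config.predict_window  # 90 + 10 = 100
assert window <= config.max_context                      # 100 <= 512, so the defaults fit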

Time Ranges

Attribute Default Description
dataset_begin_time "2011-01-01" Overall data loading start
dataset_end_time "2025-06-05" Overall data loading end
train_time_range ["2011-01-01", "2022-12-31"] Training split range
val_time_range ["2022-09-01", "2024-06-30"] Validation split range
test_time_range ["2024-04-01", "2025-06-05"] Test split range
backtest_time_range ["2024-07-01", "2025-06-05"] Backtesting time range
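
Note that adjacent splits overlap (validation starts before training ends, and testing starts before validation ends), plausibly so each split has lookback history before its first prediction. Because the ranges are plain lists of "YYYY-MM-DD" strings, they can be overridden in place; the dates below are arbitrary examples:

from config import Config

config = Config()
config.train_time_range = ["2012-01-01", "2021-12-31"]  # arbitrary example dates
config.val_time_range = ["2021-10-01", "2023-06-30"]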

Training Hyperparameters

Attribute Default Description
epochs 30 Number of training epochs
batch_size 50 Batch size per GPU
tokenizer_learning_rate 2e-4 Learning rate for tokenizer finetuning
predictor_learning_rate 4e-5 Learning rate for predictor finetuning
accumulation_steps 1 Gradient accumulation steps
adam_beta1 0.9 AdamW beta1
adam_beta2 0.95 AdamW beta2
adam_weight_decay 0.1 AdamW weight decay
clip 5.0 Clipping value for normalized data
n_train_iter 2000 * batch_size Training samples per epoch
n_val_iter 400 * batch_size Validation samples per epoch
seed 100 Global random seed
log_interval 100 Log every N batches
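
Since n_train_iter and n_val_iter count samples, the implied number of optimizer steps per epoch follows directly; the arithmetic below is illustrative:

from config import Config

config = Config()
steps_per_epoch = config.n_train_iter // (config.batch_size * config.accumulation_steps)
print(steps_per_epoch)  # 2000 with the defaults: 100000 / (50 * 1)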

Paths

Attribute Default Description
save_path "./outputs/models" Base directory for model checkpoints
dataset_path "./data/processed_datasets" Pickled dataset directory
pretrained_tokenizer_path "path/to/your/Kronos-Tokenizer-base" Pretrained tokenizer location
pretrained_predictor_path "path/to/your/Kronos-small" Pretrained predictor location
finetuned_tokenizer_path (derived) {save_path}/{tokenizer_save_folder_name}/checkpoints/best_model
finetuned_predictor_path (derived) {save_path}/{predictor_save_folder_name}/checkpoints/best_model
backtest_result_path "./outputs/backtest_results" Backtest output directory
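
The two finetuned paths are derived by filling the template shown above; a sketch of the composition, where the folder name is a placeholder (the actual tokenizer_save_folder_name value is not shown on this page):

from config import Config

config = Config()
folder = "finetune_tokenizer_demo"  # placeholder for tokenizer_save_folder_name
path = f"{config.save_path}/{folder}/checkpoints/best_model"
print(path)  # ./outputs/models/finetune_tokenizer_demo/checkpoints/best_model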

Backtesting Parameters

Attribute Default Description
backtest_n_symbol_hold 50 Number of symbols to hold in portfolio
backtest_n_symbol_drop 5 Number of symbols to drop per rebalance
backtest_hold_thresh 5 Minimum holding period for a stock
inference_T 0.6 Sampling temperature for inference
inference_top_p 0.9 Top-p (nucleus) sampling threshold
inference_top_k 0 Top-k sampling (0 = disabled)
inference_sample_count 5 Number of samples per inference
backtest_batch_size 1000 Batch size for inference during backtest
backtest_benchmark (derived) Benchmark index, set by _set_benchmark()
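
One plausible way a caller gathers the sampling settings before inference; the keyword names on the receiving side are assumptions, not the repository's API:

from config import Config

config = Config()
sampling_kwargs = {
    "T": config.inference_T,                       # temperature, 0.6
    "top_p": config.inference_top_p,               # nucleus threshold, 0.9
    "top_k": config.inference_top_k,               # 0 disables top-k
    "sample_count": config.inference_sample_count, # 5 samples per inference
}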

Experiment Logging

Attribute Default Description
use_comet True Enable/disable Comet ML logging
comet_config {"api_key": ..., "project_name": ..., "workspace": ...} Comet ML connection settings
comet_tag "finetune_demo" Tag for the Comet experiment
comet_name "finetune_demo" Name for the Comet experiment
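
For a local smoke test, logging can be switched off (or real credentials supplied) before training starts; the API key below is a placeholder:

from config import Config

config = Config()
config.use_comet = False  # skip Comet ML logging entirely
# or keep it enabled and fill in real credentials:
# config.comet_config["api_key"] = "YOUR_COMET_API_KEY"  # placeholder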

Output

A Config instance with all attributes set; it can be converted to a dict via config.__dict__ for use in training functions.

Example Usage

from config import Config

config = Config()

# Access attributes directly
print(config.instrument)          # "csi300"
print(config.epochs)              # 30
print(config.lookback_window)     # 90

# Convert to dict for training functions
config_dict = config.__dict__
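
Because every field is a plain attribute, a quick experiment only needs assignments before the config is consumed. Note that n_train_iter is computed from batch_size at construction time, so it should be re-derived if batch_size changes; the values below are arbitrary:

config = Config()
config.epochs = 5       # arbitrary shorter run for a smoke test
config.batch_size = 16
config.n_train_iter = 2000 * config.batch_size  # re-derive after changing batch_size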

Private Methods

  • _set_benchmark(instrument: str) -> str -- Maps instrument names to benchmark index codes (e.g., "csi300" to "SH000300") and raises ValueError if the instrument is not recognized; see the sketch below.
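
An illustrative stand-in for the mapping; only the csi300 entry is confirmed by this page:

def set_benchmark(instrument: str) -> str:
    # stand-in for Config._set_benchmark; extend the dict for other universes
    benchmarks = {"csi300": "SH000300"}  # only this pair is confirmed here
    if instrument not in benchmarks:
        raise ValueError(f"Unrecognized instrument: {instrument}")
    return benchmarks[instrument]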

Source Reference

File: finetune/config.py, lines 3-131.
