# Changelog
Release history for foreblocks and foretools.
## v0.1.15 (current)

### foreblocks

- Added mHC (multi-Hyper-Connections) residual stream mixing to `TransformerEncoder` and `TransformerDecoder`
- Added paper-style Attention Residuals (`use_attention_residual`, `attn_residual_type`, `attention_residual_block_size`)
- Added Mixture-of-Depths token routing (`use_mod`, `mod_mode`, `mod_budget_scheduler`)
- Added latent-MoE path (`moe_use_latent`, `moe_latent_dim`, `moe_latent_d_ff`) for higher expert counts at lower cost
- Added `adaptive_noisy_topk`, `hash_topk`, and `multi_hash_topk` router families
- Added `afocp` conformal method with an attention-based feature network
- Added `cptc` (state-aware rolling conformal) method
- Removed the classic dense load-balancing auxiliary loss; expert utilization is now handled via router expert-bias adaptation
- `label_len <= 0` in Informer-like mode no longer generates a full-decoder Informer padding mask
- `forward_one_step(...)` is now incompatible with `use_mod=True` and `use_mhc=True` (explicitly guarded)
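Mixture-of-Depths routing keeps only a budgeted fraction of tokens on the expensive compute path and lets the rest ride the residual stream. A minimal NumPy sketch of that selection step, assuming scalar router scores per token (the function name and shapes are illustrative assumptions, not the foreblocks API):

```python
import numpy as np

def mod_route(scores: np.ndarray, budget: float) -> np.ndarray:
    """Illustrative MoD token selection: keep the top-`budget` fraction
    of tokens (by router score) for full computation; the rest would
    skip the block via the residual path."""
    seq_len = scores.shape[-1]
    k = max(1, int(round(budget * seq_len)))
    # Indices of the k highest-scoring tokens (order within the k is arbitrary).
    topk = np.argpartition(scores, -k, axis=-1)[..., -k:]
    mask = np.zeros_like(scores, dtype=bool)
    np.put_along_axis(mask, topk, True, axis=-1)
    return mask

scores = np.array([0.1, 0.9, 0.3, 0.7, 0.2, 0.8])
mask = mod_route(scores, budget=0.5)  # selects 3 of 6 tokens
```

A budget scheduler, as the `mod_budget_scheduler` option suggests, would vary `budget` over training rather than fix it.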
### foretools

- Added `tsgen` synthetic series generator with AR, seasonal, trend, and noise components
- Added `tsaug` AutoDA augmentation search
- Added `foreminer` changepoint detection, cluster analysis, and stationarity diagnostics
- `bohb` plotter and observation-store improvements
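A tsgen-style generator composes a series from autoregressive, seasonal, trend, and noise parts. A minimal sketch of that composition, assuming an AR(1) component and sinusoidal seasonality (the function and its parameters are illustrative, not the `tsgen` API):

```python
import numpy as np

def make_series(n=500, ar=0.7, period=24, trend=0.01, noise=0.1, seed=0):
    """Illustrative synthetic series: AR(1) component + sinusoidal
    seasonality + linear trend + Gaussian noise."""
    rng = np.random.default_rng(seed)
    t = np.arange(n)
    eps = rng.normal(0.0, noise, n)
    x = np.zeros(n)
    for i in range(1, n):
        x[i] = ar * x[i - 1] + eps[i]      # AR(1) recursion
    x += np.sin(2 * np.pi * t / period)    # seasonal component
    x += trend * t                         # linear trend
    return x

y = make_series()
```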
### Breaking changes

- Code importing `TimeSeriesSeq2Seq` directly should switch to `ForecastingModel`
- `load_balance_weight` no longer drives the primary balancing mechanism in MoE; tune `z_loss_weight` and the router type instead
## v0.1.14

- Initial public release of `foretools/bohb` Bayesian hyperparameter optimization
- Added `ConformalPredictionEngine` with `split`, `rolling`, `agaci`, and `enbpi` methods
- Added `GateSkip` sublayer-level residual gating
- Added CT-PatchTST encoder tokenization path (`ct_patchtst=True`)
- Added `foretools/fengineer` feature engineering pipeline with RFECV and mutual-information selection
- Added `foretools/vmd` decomposition toolkit (VMD, EMD-family, hierarchical VMD)
- Stabilized `MLTracker` local FastAPI server and `mltracker-tui` TUI
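Split conformal prediction, the simplest of the listed methods, widens point forecasts by a quantile of held-out calibration residuals. A minimal NumPy sketch of the idea (the function name and signature are assumptions, not the `ConformalPredictionEngine` API):

```python
import numpy as np

def split_conformal(cal_resid, y_pred, alpha=0.1):
    """Illustrative split conformal interval: widen point predictions
    by the finite-sample-corrected (1 - alpha) quantile of absolute
    calibration residuals."""
    n = len(cal_resid)
    q_level = min(1.0, np.ceil((n + 1) * (1 - alpha)) / n)
    q = np.quantile(np.abs(cal_resid), q_level)
    return y_pred - q, y_pred + q

resid = np.array([0.1, -0.2, 0.05, 0.3, -0.15, 0.25, 0.1, -0.05, 0.2, -0.1])
lo, hi = split_conformal(resid, np.array([1.0, 2.0]), alpha=0.2)
```

The rolling variant recomputes the quantile over a moving window of recent residuals rather than a fixed calibration split.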
## v0.1.x and earlier
See git log for commit-level history.