API Reference
Models
DaskLGBMForecast
dask LightGBM forecaster
Wrapper of lightgbm.dask.DaskLGBMRegressor that adds a model_ property containing the fitted booster, which is sent to the workers in the forecasting step.
DaskLGBMForecast
DaskLGBMForecast (
    boosting_type: str = 'gbdt',
    num_leaves: int = 31,
    max_depth: int = -1,
    learning_rate: float = 0.1,
    n_estimators: int = 100,
    subsample_for_bin: int = 200000,
    objective: Union[
        str,
        Callable[[Optional[numpy.ndarray], numpy.ndarray], Tuple[numpy.ndarray, numpy.ndarray]],
        Callable[[Optional[numpy.ndarray], numpy.ndarray, Optional[numpy.ndarray]], Tuple[numpy.ndarray, numpy.ndarray]],
        Callable[[Optional[numpy.ndarray], numpy.ndarray, Optional[numpy.ndarray], Optional[numpy.ndarray]], Tuple[numpy.ndarray, numpy.ndarray]],
        NoneType,
    ] = None,
    class_weight: Union[dict, str, NoneType] = None,
    min_split_gain: float = 0.0,
    min_child_weight: float = 0.001,
    min_child_samples: int = 20,
    subsample: float = 1.0,
    subsample_freq: int = 0,
    colsample_bytree: float = 1.0,
    reg_alpha: float = 0.0,
    reg_lambda: float = 0.0,
    random_state: Union[int, numpy.random.mtrand.RandomState, 'np.random.Generator', NoneType] = None,
    n_jobs: Optional[int] = None,
    importance_type: str = 'split',
    client: Optional[distributed.client.Client] = None,
    **kwargs: Any,
)
Distributed version of lightgbm.LGBMRegressor.