Add XGBoost based models #98
Easy to implement. A generic scikit-learn model can already be used (SVR models are OK). This adds a new package dependency.
Create a new branch pyaf_xgboost.
Sample model in a Jupyter notebook.
Time series models based on XGBoost regressors have been added. They are not activated by default; they need to be activated with something like: lEngine.mOptions.set_active_autoregressions(['AR', 'XGB'])
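To make the activation semantics concrete, here is a minimal sketch of how such an options object might behave. This is a hypothetical stand-in written for illustration, not pyaf's actual implementation; only the method name set_active_autoregressions and the ['AR', 'XGB'] argument come from the comment above, and the list of known method codes is an assumption.

```python
class MockModelOptions:
    """Hypothetical stand-in for pyaf's mOptions object (illustration only)."""

    # Assumed set of autoregression method codes; not taken from pyaf's source.
    KNOWN_AUTOREGRESSIONS = ('AR', 'SVR', 'MLP', 'XGB')

    def __init__(self):
        # XGB models are off by default, matching the comment above.
        self.mActiveAutoregressions = {'AR'}

    def set_active_autoregressions(self, methods):
        unknown = [m for m in methods if m not in self.KNOWN_AUTOREGRESSIONS]
        if unknown:
            raise ValueError(f"unknown autoregression methods: {unknown}")
        self.mActiveAutoregressions = set(methods)


options = MockModelOptions()
options.set_active_autoregressions(['AR', 'XGB'])  # enable XGBoost-based models
```

With the real library, the same call would be made on the engine's mOptions attribute before training, as shown in the comment above.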
Added a test with custom XGBoostRegressor options
Added the possibility to customize XGBRegressors
Added some tests for xgboost models with exogenous data
Add XGBX models (past of the signal + past of the exogenous variables):
Transformed_Signal = Trend + Periodic + XGBoostRegressor(target = PeriodicResidue, input = PeriodicResidue_Lags + Exogenous_Lags)
Add XGBX models. Update Model Complexity.
Add XGBX models. Update Model Complexity. Updated reference logs
Add XGBX models. Update Model Complexity. Added some reference logs
Fixed. Some tests are here: https://github.com/antoinecarme/pyaf/tree/master/tests/xgb. Closing.
Need to evaluate models of the type :
Transformed_Signal = Trend + Periodic + XGBoostRegressor(target = PeriodicResidue, input = PeriodicResidue_Lags)
Of course, this is done inside the competition (all possible combinations of transformations, trends and periodics are tested).
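The decomposition above can be sketched end to end with plain numpy. This is an illustration under stated assumptions, not pyaf's pipeline: a linear least-squares fit on the residue's lags stands in for the XGBoost regressor (so the sketch has no xgboost dependency), the trend is a simple linear fit, and the periodic part is a per-cycle-position mean. All variable names below are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
n, period, p = 200, 12, 3  # series length, season length, number of residue lags

t = np.arange(n)
# Synthetic signal: linear trend + seasonal cycle + noise (toy data).
signal = 0.05 * t + np.sin(2 * np.pi * t / period) + 0.1 * rng.standard_normal(n)

# 1. Trend: linear fit in t.
trend = np.polyval(np.polyfit(t, signal, 1), t)

# 2. Periodic: mean of the detrended signal at each position in the cycle.
detrended = signal - trend
seasonal_means = np.array(
    [detrended[t % period == k].mean() for k in range(period)]
)
periodic = seasonal_means[t % period]

# 3. Residue model on its own lags -- the role XGBoost plays in pyaf.
#    Here a least-squares fit stands in for XGBoostRegressor.
residue = detrended - periodic
X = np.column_stack([residue[p - k - 1 : n - k - 1] for k in range(p)])
y = residue[p:]
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
residue_hat = X @ coef

# Reassemble: Transformed_Signal ~= Trend + Periodic + RegressorOnResidueLags.
fitted = trend[p:] + periodic[p:] + residue_hat
```

For the XGBX variant, the design matrix X would additionally include lagged exogenous variables next to the residue lags; the rest of the decomposition is unchanged. In pyaf itself this search runs inside the model competition, which tries all combinations of transformations, trends, and periodics.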