This is the code repository for Machine Learning for Time Series, published by Packt.
Time-series analysis: statistical and machine learning models for forecasting, regression, and classification
You can install your local environment with conda (recommended) or pip. The environment configurations for both conda and pip are provided. Please note that if you choose pip as your installation tool, you might need some additional tweaking.
If you have any problems with the environment, please raise an issue that shows the error you got. If you feel confident, please go ahead and create a pull request with a fix.
This is the recommended method for installing dependencies. Please make sure you have Anaconda (or Miniconda) installed.
First create the environment for the book that contains all the dependencies:
conda env create --file time_series.yml
The conda environment is called time_series. You can activate it as follows:
conda activate time_series
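Once the environment is active, a quick sanity check is to import a couple of the core libraries. This is just a minimal sketch, assuming pandas and statsmodels are among the book's dependencies:

python -c "import pandas, statsmodels; print(pandas.__version__)"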
Pip is Python's default package installer. With pip, you should be able to install all the libraries from the requirements file:
pip install -r requirements.txt
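If you prefer to keep the book's packages isolated, you can install them into a virtual environment instead (ts_env below is just an example name):

python -m venv ts_env
source ts_env/bin/activate
pip install -r requirements.txt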
There's a Dockerfile for the environment as well. It builds the environment and starts a Jupyter notebook server. To use it, first build the image, then run it:
docker build -t new_image .
docker run -it -p 8080:8080 new_image
You should be able to find the notebook in your browser at http://localhost:8080.
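If you're curious what such a Dockerfile can look like, here is a minimal sketch assuming a Miniconda base image; the actual Dockerfile in this repository may differ:

FROM continuumio/miniconda3
# create the book's conda environment inside the image
COPY time_series.yml .
RUN conda env create --file time_series.yml
# serve the notebook on the port referenced above
EXPOSE 8080
CMD ["conda", "run", "-n", "time_series", "jupyter", "notebook", "--ip=0.0.0.0", "--port=8080", "--no-browser", "--allow-root"]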
Make sure you have Poetry installed. On Linux and macOS, you should be able to use the requirements file:
poetry init
cat requirements.txt | xargs poetry add
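Once the packages are added, you can run commands inside the Poetry-managed environment, for example (assuming jupyter is listed in requirements.txt):

poetry run jupyter notebook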
If you find anything amiss with the notebooks or dependencies, please feel free to create a pull request.
If you want to change the conda dependency specification (the yaml file), you can test it like this:
conda env create --file time_series.yml --force
You can update the pip requirements like this:
pip freeze > requirements.txt
Please make sure that you keep these two ways of maintaining dependencies in sync.
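For the conda side, one way to regenerate the yaml file from the currently active environment is conda's export command; note that, like pip freeze, it captures everything installed in the environment, not just direct dependencies:

conda env export --no-builds > time_series.yml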
Then make sure you test the notebooks in the new environment to confirm that they run.
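One way to run a notebook non-interactively is nbconvert's execute mode; chapter01.ipynb below is just a placeholder for whichever notebook you want to test:

jupyter nbconvert --to notebook --execute chapter01.ipynb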