
A Unified Python Toolkit for Industrial Time-Series Analytics
⦿ **Motivation**: Industrial Time-Series (ITS) analytics is essential across diverse sectors, including manufacturing, energy, transportation, and healthcare. While general time-series models have advanced significantly, ITS encompasses specialized tasks such as fault diagnosis, process monitoring, soft sensing, RUL prediction, and predictive maintenance. These tasks require unique data-processing, training, and evaluation protocols, which makes existing toolkits unsuitable for ITS applications. Furthermore, the implementation details of these protocols are often not disclosed in the current literature. The field of ITS analytics therefore needs a dedicated toolkit, and PyITS is created to fill this gap.
⦿ **Mission**: PyITS is designed to be a versatile toolbox that streamlines the application of modern machine learning techniques to Industrial Time-Series (ITS) analytics. Our mission is twofold: (1) enable engineers to quickly implement and compare existing algorithms, facilitating the efficient completion of ITS projects; (2) provide researchers with a robust collection of up-to-date baselines and standardized experimental protocols, allowing them to focus on algorithmic innovation. PyITS continuously integrates both classical and state-of-the-art machine learning algorithms, offering unified APIs across models and tasks. We also provide comprehensive documentation and interactive tutorials to guide beginners through different applications.
🤗 Please star this repo to help others notice PyITS if you think it is a useful toolkit. This really means a lot to our open-source research. Thank you!
PyITS supports fault diagnosis, process monitoring, soft sensing, RUL prediction, and predictive maintenance tasks on ITS datasets. Suppose $T$ is the historical window length; then, at time step $t$:

- **SS**: Soft sensing, which aims to estimate the current value of the quality variable $y_t$. Available inputs are the historical process variables $x_{t-T+1:t}$ and quality variable $y_{t-T+1:t-1}$.
- **PM**: Process monitoring, which aims to predict the next-step quality variable $y_{t+1}$. Available inputs are the historical process variables $x_{t-T+1:t}$ and quality variable $y_{t-T+1:t}$.
- **FD**: Fault diagnosis, which aims to determine the status of a given patch, covering the normal condition and various fault types. Available inputs are the historical values of all variables $z_{t-T+1:t}$.
- **RUL**: RUL prediction, which aims to forecast the remaining useful life, i.e., the time remaining before a fault occurs. Available inputs are the historical values of all variables $z_{t-T+1:t}$.
- **PdM**: Predictive maintenance, which aims to estimate whether a fault occurs within a specified future window. Available inputs are the historical values of all variables $z_{t-T+1:t}$.
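To make these conventions concrete, here is a minimal sketch of the tensor shapes each task consumes and produces. The batch size, window length, and variable counts are illustrative assumptions, not values fixed by PyITS:

```python
import torch

B, T, n_x, n_z, n_fault = 32, 96, 8, 10, 5  # illustrative sizes, not fixed by PyITS

# SS: estimate y_t from x_{t-T+1:t} and y_{t-T+1:t-1}
x_hist = torch.randn(B, T, n_x)      # process variables over the window
y_hist = torch.randn(B, T - 1, 1)    # quality variable up to step t-1
y_t = torch.randn(B, 1)              # target: current quality value

# PM: predict y_{t+1} from x_{t-T+1:t} and y_{t-T+1:t}
y_hist_pm = torch.randn(B, T, 1)     # quality variable including step t
y_next = torch.randn(B, 1)           # target: next-step quality value

# FD / RUL / PdM: all variables z_{t-T+1:t} as input
z_hist = torch.randn(B, T, n_z)
fault_label = torch.randint(0, n_fault, (B,))  # FD: normal + fault classes
rul_label = torch.rand(B, 1)                   # RUL: remaining time (often normalized)
pdm_label = torch.randint(0, 2, (B,))          # PdM: fault within the future window or not
```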
The table below shows the availability of each algorithm for different tasks.
The symbol ✅ indicates that the algorithm is available for the corresponding task (note that some models are specifically designed for particular tasks, and PyITS modifies their data-processing and output protocols to extend the spectrum of supported tasks).
Type | Model | Paper | SS | PM | FD | RUL | PdM | Remarks |
---|---|---|---|---|---|---|---|---|
Foundation model | Autoformer | NIPS2021 | ✅ | ✅ | ✅ | ✅ | ✅ | Transformer-based time-series foundation model |
Foundation model | Crossformer | ICLR2023 | ✅ | ✅ | ✅ | ✅ | ✅ | Transformer-based time-series foundation model |
Foundation model | ETSformer | arxiv | ✅ | ✅ | ✅ | ✅ | ✅ | Transformer-based time-series foundation model |
Foundation model | FEDformer | ICML2022 | ✅ | ✅ | ✅ | ✅ | ✅ | Transformer-based time-series foundation model |
Foundation model | Informer | AAAI2021 | ✅ | ✅ | ✅ | ✅ | ✅ | Transformer-based time-series foundation model |
Foundation model | Nonstationary Transformer | NIPS2022 | ✅ | ✅ | ✅ | ✅ | ✅ | Transformer-based time-series foundation model |
Foundation model | PatchTST | ICLR2023 | ✅ | ✅ | ✅ | ✅ | ✅ | Transformer-based time-series foundation model |
Foundation model | Pathformer | ICLR2024 | ✅ | ✅ | ✅ | ✅ | ✅ | Transformer-based time-series foundation model |
Foundation model | PAttn | NIPS2024 | ✅ | ✅ | ✅ | ✅ | ✅ | Transformer-based time-series foundation model |
Foundation model | Pyraformer | ICLR2022 | ✅ | ✅ | ✅ | ✅ | ✅ | Transformer-based time-series foundation model |
Foundation model | Reformer | ICLR2020 | ✅ | ✅ | ✅ | ✅ | ✅ | Transformer-based time-series foundation model |
Foundation model | TimeXer | NIPS2024 | ✅ | ✅ | ✅ | ✅ | ✅ | Transformer-based time-series foundation model |
Foundation model | Transformer | NIPS2017 | ✅ | ✅ | ✅ | ✅ | ✅ | Transformer-based time-series foundation model |
Foundation model | iTransformer | ICLR2024 | ✅ | ✅ | ✅ | ✅ | ✅ | Transformer-based time-series foundation model |
Foundation model | FITS | ICLR2024 | ✅ | ✅ | ✅ | ✅ | ✅ | MLP-based time-series foundation model |
Foundation model | DLinear | AAAI2023 | ✅ | ✅ | ✅ | ✅ | ✅ | MLP-based time-series foundation model |
Foundation model | LightTS | PACMMOD2023 | ✅ | ✅ | ✅ | ✅ | ✅ | MLP-based time-series foundation model |
Foundation model | FreTS | NIPS2023 | ✅ | ✅ | ✅ | ✅ | ✅ | MLP-based time-series foundation model |
Foundation model | TiDE | TMLR2023 | ✅ | ✅ | ✅ | ✅ | ✅ | MLP-based time-series foundation model |
Foundation model | TimeMixer | ICLR2024 | ✅ | ✅ | ✅ | ✅ | ✅ | MLP-based time-series foundation model |
Foundation model | Triformer | IJCAI2022 | ✅ | ✅ | ✅ | ✅ | ✅ | MLP-based time-series foundation model |
Foundation model | TSMixer | TMLR2023 | ✅ | ✅ | ✅ | ✅ | ✅ | MLP-based time-series foundation model |
Foundation model | Mamba | arxiv | ✅ | ✅ | ✅ | ✅ | ✅ | RNN-based time-series foundation model |
Foundation model | LSTM | | ✅ | ✅ | ✅ | ✅ | ✅ | RNN-based time-series foundation model |
Foundation model | SegRNN | arxiv | ✅ | ✅ | ✅ | ✅ | ✅ | RNN-based time-series foundation model |
Foundation model | MICN | ICLR2023 | ✅ | ✅ | ✅ | ✅ | ✅ | CNN-based time-series foundation model |
Foundation model | SCINet | NIPS2022 | ✅ | ✅ | ✅ | ✅ | ✅ | CNN-based time-series foundation model |
Foundation model | TCN | arxiv | ✅ | ✅ | ✅ | ✅ | ✅ | CNN-based time-series foundation model |
Foundation model | TimesNet | ICLR2023 | ✅ | ✅ | ✅ | ✅ | ✅ | CNN-based time-series foundation model |
Foundation model | Koopa | NIPS2023 | ✅ | ✅ | ✅ | ✅ | ✅ | Other time-series foundation model |
Foundation model | FiLM | NIPS2022 | ✅ | ✅ | ✅ | ✅ | ✅ | Other time-series foundation model |
Task-specific model | DLSTM | TII2022 | ✅ | ✅ | ✅ | ✅ | ✅ | Originally developed for SS task |
Task-specific model | DTGRU | TII2023 | ✅ | ✅ | ✅ | ✅ | ✅ | Originally developed for RUL task |
Task-specific model | AdaNet | TII2024 | ✅ | ✅ | ✅ | ✅ | ✅ | Originally developed for RUL task |
Task-specific model | DeepPLS | TNNLS2023 | ✅ | ✅ | | | ✅ | Originally developed for PdM task |
Task-specific model | RSN | TII2023 | ✅ | ✅ | ✅ | ✅ | ✅ | Originally developed for FD task |
Task-specific model | MCN | TSMC2024 | ✅ | ✅ | ✅ | ✅ | ✅ | Originally developed for FD task |
Task-specific model | MCTAN | TNNLS2023 | ✅ | ✅ | ✅ | ✅ | ✅ | Originally developed for SS task |
Task-specific model | DLformer | TNNLS2024 | | | | ✅ | | Originally developed for RUL task |
Task-specific model | TR-LT | TII2022 | ✅ | ✅ | ✅ | ✅ | ✅ | Originally developed for RUL task |
✨ Contribute your model right now to enhance your research impact! Your work will be widely used and cited by the community.
Refer to `models/Transformer.py` to see how to devise your own model with PyITS.
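Before diving into that file, the following minimal sketch illustrates the general shape of a PyITS model: a task-agnostic encoder configured from the parsed arguments. The attribute names (`args.enc_in`, `args.d_model`) and the GRU backbone are assumptions for illustration, not the toolkit's guaranteed API:

```python
import torch.nn as nn

class Model(nn.Module):
    """Minimal task-agnostic encoder sketch; attribute names are illustrative."""

    def __init__(self, args):
        super().__init__()
        # project the input variables into a shared representation space
        self.embed = nn.Linear(args.enc_in, args.d_model)
        self.encoder = nn.GRU(args.d_model, args.d_model, batch_first=True)

    def forward(self, x):
        # x: [batch, seq_len, n_vars] -> representation z: [batch, d_model]
        h = self.embed(x)
        out, _ = self.encoder(h)
        return out[:, -1]  # the estimator's decoder maps z to task-specific outputs
```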
PyITS incorporates several ITS datasets for task validation.
The table below shows the availability of each dataset for different tasks.
The symbol ✅ indicates that the dataset is available for the corresponding task (note that some datasets lack the labels required by particular tasks).
You can download each dataset either from the source link or from our zipped file, which is available on Google Drive and Baidu Disk.
Dataset | Paper | Source | SS | PM | FD | RUL | PdM | Remarks |
---|---|---|---|---|---|---|---|---|
SRU | Control Eng. Pract.2003 | github | ✅ | ✅ | | | | Sulfur Recovery Unit |
Debutanizer | TII2020 | github | ✅ | ✅ | | | | Debutanizer Column |
TEP | Measurement2023 | github | ✅ | ✅ | ✅ | | | Tennessee Eastman Process |
CWRU | arxiv | official | | | ✅ | | | Case Western Reserve University Bearing Fault Dataset |
C-MAPSS | ETFA2021 | official | | | | ✅ | | Turbofan Engine Degradation Simulation from NASA |
NASA-Li-ion | NASA2007 | official | | | | ✅ | | Charging and discharging experiments on Li-Ion batteries from NASA |
SWaT | CRITIS2016 | official | | | ✅ | | | Secure Water Treatment Equipment Anomaly Dataset |
SKAB | Kaggle2020 | kaggle | | | ✅ | | | Skoltech Anomaly Benchmark |
✨ Contribute related datasets right now to enhance your research impact! Your work will be widely used and cited by the community.
Refer to `data_provider/data_generator.py` to see how to incorporate your datasets with PyITS.
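As a rough orientation, a new data_provider class might look like the following sketch, assuming the `Base_Dataset` interface described in the framework overview that follows; the import path, class name, and helper method here are hypothetical:

```python
from data_provider.Base_Dataset import Base_Dataset  # assumed import path for the base class

class Dataset_MyPlant(Base_Dataset):
    """Illustrative data_provider: one in-class method per supported ITS task."""

    def __init__(self, args, logger):
        super().__init__(args, logger)

    def generate_data(self, task_name):
        # dispatch to the protocol of the requested task, mirroring the
        # per-task in-class methods described above
        if task_name == 'process_monitoring':
            return self._generate_process_monitoring()  # hypothetical helper
        raise NotImplementedError(f'task {task_name} not supported by this dataset')
```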
The PyITS framework consists of three main components: data_provider, model, and estimator. This modular architecture ensures flexibility, scalability, and ease of use across diverse ITS applications.

- **Data provider**: processes datasets with diverse features and protocols to provide appropriate inputs for different ITS tasks. Each data_provider class inherits from the base class `data_provider/Base_Dataset`, which standardizes data handling across the framework. Every dataset has its own data_provider class, and every ITS task corresponds to a specific in-class method of that class.
- **Model**: serves as the encoder that generates representations from the inputs supplied by the data provider. Leveraging advanced machine learning algorithms, a model captures the intricate dependencies inherent in ITS data. Notably, models are generally designed to be task-agnostic, allowing them to be reused across different tasks.
- **Estimator**: wraps the data_provider, the model, and a task-specific decoder, defining the training and evaluation protocols. The decoder, cascaded after the model, transforms the encoded representation into task-specific outputs. For example, in the SS task the decoder is an affine layer that maps the representation $z$ to a real-valued estimate of the quality variable $y$; in the PdM task it is an affine layer followed by a softmax layer that maps $z$ to a probability vector indicating the likelihood of anomalies at the corresponding time stamps. A minimal decoder sketch follows this list.
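In PyTorch terms, those two decoders amount to something like the sketch below; the sizes are illustrative assumptions, not PyITS's actual configuration:

```python
import torch.nn as nn

d_model, n_classes = 128, 2  # illustrative sizes, not PyITS defaults

# SS task: an affine layer mapping the representation z to a real-valued estimate of y
ss_decoder = nn.Linear(d_model, 1)

# PdM task: an affine layer followed by softmax, yielding a probability vector
pdm_decoder = nn.Sequential(nn.Linear(d_model, n_classes), nn.Softmax(dim=-1))
```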
We will release a package on PyPI and Anaconda very soon. Until then, you can install PyITS by cloning the repository:

```bash
git clone https://github.com/Master-PLC/PyITS.git
```
Below we present two usage examples that perform process monitoring with iTransformer as the model and SRU as the dataset; click each one to view it.
Click here to see an example applying iTransformer on SRU for process monitoring:

```bash
python run.py --task_name process_monitoring --model iTransformer --data SRU --is_training 1
```
Click here to see an example applying iTransformer on SRU for process monitoring with your own Python script:

```python
from data_provider.data_generator import Dataset_SRU
from estimator.foundation.process_monitoring_estimator import Process_Monitoring_Estimator
from models.iTransformer import Model
from utils.argument_parser import parse_arguments
from utils.logger import Logger
from utils.tools import load_device, seed_everything

if __name__ == '__main__':
    args = parse_arguments()

    ## fix the random seed for reproducibility
    seed_everything(args.fix_seed)

    ## build the logger
    logger = Logger(log_dir=args.save_dir, remove_old=args.remove_log)

    ## load the computing device
    args.device = load_device(gpu_ids=args.gpu_ids)

    ## build the dataset
    dataset = Dataset_SRU(args, logger)
    args = dataset.generate_data(task_name=args.task_name)
    train_data = dataset.get_data(flag='train')

    ## build the model
    model = Model(args).float().to(args.device)

    ## build the estimator
    estimator = Process_Monitoring_Estimator(args, dataset=dataset, model=model, device=args.device, logger=logger)

    ## training and testing
    estimator.fit()
    estimator.test()
```
Save the above code as `toy.py`; you can then run it with the following command:

```bash
python toy.py --task_name process_monitoring --model iTransformer --data SRU --is_training 1
```
We warmly invite you to contribute to PyITS. By committing your code:

- You make your work readily available for PyITS users to run, helping it gain more recognition and credit.
- You will be listed as a PyITS contributor and a volunteer developer.
- Your contributions will be highlighted in the PyITS release notes.
Refer to `data_provider/data_generator.py`, `models/Transformer.py`, and `estimator/foundation/soft_sensor_estimator.py` to see how to include your own dataset, model, and estimator in PyITS. A minimal estimator sketch is shown below.
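For orientation only, a new estimator might take roughly the following shape. This is a minimal sketch assuming the `fit`/`test` interface from the usage example above; the constructor signature mirrors `Process_Monitoring_Estimator`, and the class name, decoder size, and training loop details are assumptions:

```python
import torch.nn as nn

class My_Task_Estimator:
    """Illustrative estimator: wires a data_provider, a model, and a task head."""

    def __init__(self, args, dataset, model, device, logger):
        self.dataset, self.model, self.device, self.logger = dataset, model, device, logger
        self.decoder = nn.Linear(args.d_model, 1).to(device)  # task-specific head; size assumed

    def fit(self):
        train_data = self.dataset.get_data(flag='train')
        # training protocol: iterate batches, encode with self.model,
        # decode with self.decoder, and optimize a task-appropriate loss
        ...

    def test(self):
        # evaluation protocol: report task-appropriate metrics
        ...
```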
We invite you to join the PyITS community on WeChat (via our WeChat official account). We also run a group chat on WeChat, and you can get access by scanning the QR code. By joining the community, you can get the latest updates on PyITS, share your ideas, and discuss with other members.
Some aspects of the implementation are based on the following repositories:
- Foundation models: https://github.com/thuml/Time-Series-Library
- Tutorials: https://pypots.com, https://pythonot.github.io