🚩 News (2024.05.14): TEFN is compatible with the MPS backend and can now be trained on Apple Silicon devices.
This is the official code implementation for the paper “Time Evidence Fusion Network: Multi-source View in Long-Term Time Series Forecasting”. The implementation builds on the open-source repositories acknowledged at the end of this README; thanks very much for their contributions to this project.
The Time Evidence Fusion Network (TEFN) is a groundbreaking deep learning model designed for long-term time series forecasting. It integrates the principles of information fusion and evidence theory to achieve superior performance in real-world applications where timely predictions are crucial. TEFN introduces a Basic Probability Assignment (BPA) module, grounded in fuzzy theory, and fuses the resulting evidence from multiple source views to enhance prediction accuracy, stability, and interpretability.
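The BPA-and-fusion idea can be illustrated with a short, self-contained PyTorch sketch. This is only a conceptual toy under assumed names and shapes (`SimpleBPA`, `ToyEvidenceFusion`, `frame_size` are all hypothetical), not the authors' implementation; see the model code in this repository for the real architecture.

```python
# Toy illustration of the BPA + evidence-fusion idea (NOT the repository's model).
import torch
import torch.nn as nn


class SimpleBPA(nn.Module):
    """Map a scalar to masses over `frame_size` focal elements via learnable
    membership functions, then collapse back to a scalar by expectation."""

    def __init__(self, frame_size: int = 8):
        super().__init__()
        self.to_mass = nn.Linear(1, frame_size)  # learnable fuzzy membership functions
        self.values = nn.Linear(frame_size, 1)   # value attached to each focal element

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        mass = torch.softmax(self.to_mass(x), dim=-1)  # masses sum to 1, like a BPA
        return self.values(mass)                       # expectation over the frame


class ToyEvidenceFusion(nn.Module):
    """Fuse evidence gathered from a time view and a channel view of the input."""

    def __init__(self, seq_len: int, pred_len: int, frame_size: int = 8):
        super().__init__()
        self.time_bpa = SimpleBPA(frame_size)        # evidence per time step
        self.chan_bpa = SimpleBPA(frame_size)        # evidence per channel
        self.project = nn.Linear(seq_len, pred_len)  # map history length to horizon

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, channels)
        time_view = self.time_bpa(x.mean(dim=2, keepdim=True).unsqueeze(-1)).squeeze(-1)  # (B, T, 1)
        chan_view = self.chan_bpa(x.mean(dim=1, keepdim=True).unsqueeze(-1)).squeeze(-1)  # (B, 1, C)
        fused = x + time_view + chan_view            # broadcast-additive fusion of both views
        return self.project(fused.transpose(1, 2)).transpose(1, 2)


if __name__ == "__main__":
    model = ToyEvidenceFusion(seq_len=96, pred_len=24)
    out = model(torch.randn(2, 96, 7))
    print(out.shape)  # torch.Size([2, 24, 7])
```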
Install the dependencies listed in requirements.txt.

Clone the repository and install the requirements:

```bash
git clone https://github.com/ztxtech/Time-Evidence-Fusion-Network.git
cd Time-Evidence-Fusion-Network
pip install -r requirements.txt
```
You can obtain the datasets from the public links listed in the Acknowledgement section below. Then place the downloaded data in the folder ./dataset.
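As a quick sanity check before launching an experiment, you can confirm the data landed where the code expects it. This is a hypothetical helper, not part of the repository, and it assumes the datasets are CSV files:

```python
# Hypothetical sanity check: confirm ./dataset exists and contains the downloaded files.
from pathlib import Path

dataset_dir = Path("./dataset")
assert dataset_dir.is_dir(), "Create ./dataset and place the downloaded data inside it."

csv_files = sorted(dataset_dir.rglob("*.csv"))
print(f"Found {len(csv_files)} CSV file(s) under {dataset_dir}:")
for f in csv_files:
    print(" -", f.relative_to(dataset_dir))
```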
Set `config_path` in ./run_config.py to your chosen config file, i.e. `config_path = '{your chosen config file path}'`, then run ./run_config.py directly with `python run_config.py`.
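For orientation, the run-by-config pattern looks roughly like the sketch below. The actual run_config.py in this repository may be structured differently; the config file name is just an example:

```python
# Hypothetical sketch of the run-by-config pattern: point config_path at a JSON
# file under ./configs, load it, and hand the settings to the training entry point.
import json

config_path = './configs/example_experiment.json'  # hypothetical config file name

with open(config_path, 'r') as f:
    args = json.load(f)

print(f"Loaded {len(args)} settings from {config_path}")
# pass `args` on to the experiment/training entry point defined by the repo
```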
The config files (*.json) are in ./configs. Set the device in your chosen *.json file:

```
{
    # ...
    # Nvidia CUDA Device {0}
    # 'gpu': 0
    # Apple MPS Device
    # 'gpu': 'mps'
    # ...
}
```
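If you prefer to switch devices programmatically rather than editing the file by hand, a hypothetical helper along these lines works. The `'gpu'` key is taken from the snippet above; the config file name is an assumption:

```python
# Hypothetical helper: choose the 'gpu' value in a config JSON based on available hardware.
import json
import torch

config_file = './configs/example_experiment.json'  # hypothetical path

with open(config_file, 'r') as f:
    cfg = json.load(f)

if torch.cuda.is_available():
    cfg['gpu'] = 0          # Nvidia CUDA device 0
elif torch.backends.mps.is_available():
    cfg['gpu'] = 'mps'      # Apple MPS backend

with open(config_file, 'w') as f:
    json.dump(cfg, f, indent=4)
```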
For other related operations, refer to the repositories listed in the Acknowledgement section below.
If you find TEFN useful in your research, please cite our work as follows:
```bibtex
@misc{TEFN,
  title={Time Evidence Fusion Network: Multi-source View in Long-Term Time Series Forecasting},
  author={Tianxiang Zhan and Yuanpeng He and Zhen Li and Yong Deng},
  year={2024},
  journal={arXiv}
}
```
We appreciate the following GitHub repos a lot for their valuable code and efforts; this library is constructed based on their work.
All the experiment datasets are public, and we obtain them from the following links:
Long-term Forecasting and Imputation: https://github.com/thuml/Autoformer.
Short-term Forecasting: https://github.com/ServiceNow/N-BEATS.
If you have any questions or suggestions, feel free to contact the authors or describe them in the GitHub Issues.