(Article is unfinished, check back soon!)

Changepoint detection in real-time data streams is a critical challenge in time-series analysis. Traditional methods often rely on rigid assumptions about the noise, the data distribution, or the type of changepoint, which limits their adaptability to diverse scenarios. To address this, Z. Atashgahi et al. developed a hybrid model combining an LSTM neural network with autoregressive components.

The LSTM network excels at capturing complex, non-linear temporal dependencies: its “forget gate” selectively retains or discards information, giving the model a form of long-term memory. It is complemented by an autoregressive model optimised for detecting simpler patterns, such as step changes in the mean. Together, these components form an ensemble that balances flexibility with precision in identifying both abrupt and gradual changes in a data stream. The ensemble is further diversified by using several different LSTM architectures, so that changepoints can be detected in a variety of settings.
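To make the architecture concrete, here is a minimal sketch of one such hybrid in TensorFlow/Keras. The layer sizes, the number of autoregressive lags, and the way the two branches are combined are my own illustrative choices, not the authors' exact design:

```python
import tensorflow as tf
from tensorflow.keras import layers

def build_hybrid_model(window_size, n_features, ar_lags=8, lstm_units=32):
    """One ensemble member: an LSTM branch for non-linear structure
    plus a linear autoregressive branch. All hyperparameter values
    here are illustrative, not the paper's settings."""
    inputs = tf.keras.Input(shape=(window_size, n_features))

    # Non-linear branch: the LSTM's gating (including the forget gate)
    # lets it retain or discard information over long horizons.
    nonlinear = layers.Dense(n_features)(layers.LSTM(lstm_units)(inputs))

    # Linear branch: an autoregressive model over the last `ar_lags`
    # time steps, suited to simple patterns such as step changes in mean.
    recent = layers.Lambda(lambda x: x[:, -ar_lags:, :])(inputs)
    linear = layers.Dense(n_features)(layers.Flatten()(recent))

    # Sum the branches: the AR part models the simple structure and
    # the LSTM corrects the non-linear residual.
    outputs = layers.Add()([nonlinear, linear])
    model = tf.keras.Model(inputs, outputs)
    model.compile(optimizer="adam", loss="mse")
    return model
```

Roughly, the ensemble runs several models of this flavour with different recurrent architectures side by side, and declares a changepoint when enough of them agree that the prediction error has jumped and stays high.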

This method improves the scalability and accuracy of online changepoint detection, offering a promising solution for dynamic, real-world applications. However, the requirement to retrain the network as new information arrives means the algorithm is computationally intensive to run, even though its runtime is technically O(n).
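To see where that cost comes from, here is a sketch of a generic online loop of this kind. It is written under my own assumptions (each stream element is a 1-D NumPy array of length `n_features`, `model` is a Keras-style one-step predictor such as the hybrid above, and a fixed error threshold stands in for the paper's actual decision rule); it is not the authors' implementation:

```python
import numpy as np

def make_windows(series, window_size):
    """Stack sliding windows and their next-step targets."""
    X = np.stack([series[i:i + window_size]
                  for i in range(len(series) - window_size)])
    y = series[window_size:]
    return X, y

def detect_online(stream, model, window_size, threshold,
                  buffer_size=200, retrain_epochs=5):
    """Score each incoming point by its one-step prediction error and,
    when a changepoint is declared, retrain briefly on a recent buffer
    so the model adapts to the new regime."""
    changepoints, history = [], []
    for t, x in enumerate(stream):
        if len(history) >= window_size:
            window = np.asarray(history[-window_size:])[None, ...]
            error = float(np.mean((model.predict(window, verbose=0) - x) ** 2))
            if error > threshold:
                changepoints.append(t)
                # Adapt to the new regime: retrain on recent data only.
                recent = np.asarray(history[-buffer_size:] + [x])
                X, y = make_windows(recent, window_size)
                model.fit(X, y, epochs=retrain_epochs, verbose=0)
        history.append(x)
    return changepoints
```

The loop visits every point once, hence the O(n) runtime, but each declared changepoint triggers several epochs of gradient descent, and that retraining is what dominates the wall-clock cost in practice.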

In my report, I will delve into the details of this approach and explain why it appears well suited to online changepoint detection. If time permits, I may also update the code repository, which currently relies on early package versions compatible only with Python 3.6, including an early version of TensorFlow.
