Time Series Classification with Multistage Modeling Using Deep Learning
Time series classification (TSC) can be implemented efficiently with several techniques, many of which analyze the 1-D signals in the time series data. In this work, we present a new time series classification method that involves a two-stage process. First, we use Recurrence Plots (RP) to transform the time series into 2D images. The second stage applies deep learning models to perform the classification. The image representation of a time series exposes feature types that are not available in the 1D signal, so our classification problem can be treated as a 2D image recognition task. Experimental results show that our multistage time series modeling is highly effective compared with traditional classification frameworks.

A significant amount of data is stored in the form of time series. Climatic measurements, performance metrics, medical tests, stock exchanges, satellite locations, and political opinions are all saved as time series. Time series data can be any information collected successively in time; since processes are often measured relative to time, this data type exists in almost every task. Examples include stock prices, industrial processes, electronic health records, human activities, sensor readings, and language. Because it is so ubiquitous, there is great practical value in extracting information from time series data. Time series classification has an extensive range of applications, from identifying soil sedimentation, stock market anomalies, and the spread of viruses to the automated detection of heart and brain diseases.

Time series classification can be conducted with many techniques, most of which have two stages. The first stage uses mathematical, statistical, or programmatic methods to represent each time series as a feature vector.
In the second stage, an algorithm measures the differences between the time series one wants to classify; here one can apply anything from k-nearest neighbors and SVMs to deep neural network models. The Convolutional Neural Network (CNN) model has many attractive properties: it jointly and automatically learns multiple levels of representation together with a classifier. Combining Recurrence Plots (RP) and a CNN in a compact, unified framework is therefore expected to boost the time series classification recognition rate. Experimental results on the UCR time series classification archive demonstrate the highly competitive accuracy of the proposed approach compared with existing deep architectures and state-of-the-art TSC algorithms.
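The first stage, transforming a 1-D series into a 2-D recurrence image, can be sketched as follows. This is a minimal illustration rather than the dissertation's implementation; the embedding dimension `dim`, delay `delay`, and threshold `eps` are assumed hyperparameters.

```python
import numpy as np

def recurrence_plot(series, dim=3, delay=1, eps=0.2):
    """Binary recurrence matrix R[i, j] = 1 iff the embedded states
    x_i and x_j lie within eps of each other (Euclidean norm)."""
    # Time-delay embedding: x_i = (s_i, s_{i+delay}, ..., s_{i+(dim-1)*delay})
    n = len(series) - (dim - 1) * delay
    emb = np.array([series[i:i + (dim - 1) * delay + 1:delay] for i in range(n)])
    # Pairwise distances between all embedded states
    dists = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
    return (dists <= eps).astype(np.uint8)

# A periodic signal yields the characteristic diagonal-line texture
rp = recurrence_plot(np.sin(np.linspace(0, 8 * np.pi, 200)))
print(rp.shape)  # (198, 198); each 1-pixel marks a recurrence of states
```

The resulting binary matrix can be saved or stacked as an image channel and fed to any 2-D image classifier.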
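As a toy end-to-end illustration of the two-stage pipeline, the sketch below classifies recurrence images with 1-nearest-neighbor in place of a CNN (a CNN would require a deep learning framework). The synthetic "periodic" vs. "trend" classes and all parameter values are assumptions made for demonstration only, not data or settings from this work.

```python
import numpy as np

def recurrence_plot(series, dim=3, delay=1, eps=0.2):
    # Stage 1: time-delay embed, then threshold pairwise distances
    n = len(series) - (dim - 1) * delay
    emb = np.array([series[i:i + (dim - 1) * delay + 1:delay] for i in range(n)])
    dists = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
    return (dists <= eps).astype(np.float32)

def nn1_classify(train_imgs, train_labels, test_img):
    # Stage 2 (stand-in for a CNN): 1-NN on recurrence images,
    # Euclidean distance between the binary matrices
    dists = [np.linalg.norm(img - test_img) for img in train_imgs]
    return train_labels[int(np.argmin(dists))]

rng = np.random.default_rng(0)
t = np.linspace(0, 4 * np.pi, 120)
# Two hypothetical classes: phase-shifted sines vs. noisy linear trends
train = [np.sin(t + p) for p in (0.0, 0.5)] + \
        [0.01 * t * k + 0.05 * rng.standard_normal(120) for k in (1.0, 1.2)]
labels = ["periodic", "periodic", "trend", "trend"]
train_rps = [recurrence_plot(s) for s in train]

test_rp = recurrence_plot(np.sin(t + 1.0))
print(nn1_classify(train_rps, labels, test_rp))  # prints "periodic"
```

The periodic signal's recurrence image shares its diagonal-line texture with the sine training images, so it lands in the "periodic" class; swapping the 1-NN for a CNN trained on the same images gives the framework described above.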
Arthur, James Ekow, "Time Series Classification with Multistage Modeling Using Deep Learning" (2022). ETD Collection for University of Texas, El Paso. AAI30242419.