Open Access Journal ArticleDOI

Unsupervised electric motor fault detection by using deep autoencoders

TLDR
This paper proposes an unsupervised method for diagnosing faults of electric motors using a novelty detection approach based on deep autoencoders; the results show that all the autoencoder-based approaches outperform the OC-SVM algorithm.
Abstract
Fault diagnosis of electric motors is a fundamental task for production line testing, and it is usually performed by experienced human operators. In recent years, several methods have been proposed in the literature for detecting faults automatically. Deep neural networks have been successfully employed for this task but, to the authors' knowledge, they have never been used in an unsupervised scenario. This paper proposes an unsupervised method for diagnosing faults of electric motors by using a novelty detection approach based on deep autoencoders. In the proposed method, vibration signals are acquired with accelerometers and processed to extract Log-Mel coefficients as features. Autoencoders are trained by using normal data only, i.e., data that do not contain faults. Three different autoencoder architectures have been evaluated: the multi-layer perceptron (MLP) autoencoder, the convolutional neural network autoencoder, and the recurrent autoencoder composed of long short-term memory (LSTM) units. The experiments have been conducted on a dataset created by the authors, and the proposed approaches have been compared to the one-class support vector machine (OC-SVM) algorithm. The performance has been evaluated in terms of the area under the curve (AUC) of the receiver operating characteristic curve, and the results showed that all the autoencoder-based approaches outperform the OC-SVM algorithm. Moreover, the MLP autoencoder is the best-performing architecture, achieving an AUC equal to 99.11%.
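The paper's code is not reproduced on this page; the sketch below only illustrates the described pipeline under assumed choices (frame parameters, layer sizes, and the PyTorch/librosa implementation are hypothetical): Log-Mel features are extracted from the vibration signal, an MLP autoencoder is trained on normal frames only, and the reconstruction error serves as the novelty score evaluated with the ROC AUC.

```python
# Minimal sketch of the described pipeline (hypothetical parameters):
# Log-Mel features from accelerometer signals, an MLP autoencoder trained on
# normal data only, and reconstruction error as the novelty score.
import numpy as np
import librosa
import torch
from torch import nn
from sklearn.metrics import roc_auc_score

def log_mel_features(signal, sr, n_mels=40, n_fft=1024, hop_length=512):
    """Frame-wise Log-Mel coefficients from a 1-D vibration signal."""
    mel = librosa.feature.melspectrogram(y=signal, sr=sr, n_fft=n_fft,
                                         hop_length=hop_length, n_mels=n_mels)
    return librosa.power_to_db(mel).T              # shape: (frames, n_mels)

class MLPAutoencoder(nn.Module):
    def __init__(self, n_features=40, bottleneck=8):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(n_features, 32), nn.ReLU(),
                                     nn.Linear(32, bottleneck), nn.ReLU())
        self.decoder = nn.Sequential(nn.Linear(bottleneck, 32), nn.ReLU(),
                                     nn.Linear(32, n_features))
    def forward(self, x):
        return self.decoder(self.encoder(x))

def train_on_normal(model, normal_frames, epochs=50, lr=1e-3):
    """Train only on fault-free frames, minimizing reconstruction MSE."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.MSELoss()
    x = torch.tensor(normal_frames, dtype=torch.float32)
    for _ in range(epochs):
        opt.zero_grad()
        loss = loss_fn(model(x), x)
        loss.backward()
        opt.step()
    return model

def novelty_scores(model, frames):
    """Per-frame reconstruction error: large error suggests a fault."""
    x = torch.tensor(frames, dtype=torch.float32)
    with torch.no_grad():
        err = ((model(x) - x) ** 2).mean(dim=1)
    return err.numpy()

# Usage (hypothetical data): scores = novelty_scores(model, test_frames)
# auc = roc_auc_score(test_labels, scores)   # labels: 0 = normal, 1 = faulty
```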


Citations
Journal ArticleDOI

Forecasting and Anomaly Detection approaches using LSTM and LSTM Autoencoder techniques with the applications in supply chain management

TL;DR: A Long Short-Term Memory network-based method for forecasting multivariate time series data and an LSTM autoencoder network-based method combined with a one-class support vector machine algorithm for detecting anomalies in sales are suggested. A rough illustration follows below.
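As a rough illustration of how an LSTM autoencoder can be paired with a one-class SVM for anomaly detection (not the cited authors' implementation; shapes, hyper-parameters, and the choice of latent summaries are assumptions):

```python
# Hedged sketch of an LSTM-autoencoder / OC-SVM combination for sequence
# anomaly detection (shapes and hyper-parameters are illustrative only).
import torch
from torch import nn
from sklearn.svm import OneClassSVM

class LSTMAutoencoder(nn.Module):
    def __init__(self, n_features, hidden=16):
        super().__init__()
        self.encoder = nn.LSTM(n_features, hidden, batch_first=True)
        self.decoder = nn.LSTM(hidden, n_features, batch_first=True)
    def forward(self, x):                          # x: (batch, time, features)
        z, _ = self.encoder(x)
        out, _ = self.decoder(z)
        return out

def latent_summary(model, x):
    """Mean encoder state per sequence, used here as the OC-SVM input."""
    with torch.no_grad():
        z, _ = model.encoder(x)
    return z.mean(dim=1).numpy()

# After training the autoencoder on normal sequences (not shown), fit the
# one-class SVM on their latent summaries and flag outliers at test time.
# ocsvm = OneClassSVM(nu=0.05, kernel="rbf").fit(latent_summary(model, normal_x))
# flags = ocsvm.predict(latent_summary(model, test_x))   # -1 = anomaly
```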
Journal ArticleDOI

A Review on Deep Learning Applications in Prognostics and Health Management

TL;DR: The survey validates the universal applicability of deep learning to various types of input in PHM, including vibration, imagery, time-series, and structured data, and suggests the possibility of transfer learning across PHM applications.
Journal ArticleDOI

Machine learning applications in production lines: A systematic literature review

TL;DR: This paper aims to identify, assess, and synthesize the reported studies related to the application of machine learning in production lines, to provide a systematic overview of the current state-of-the-art and pave the way for further research.
Journal ArticleDOI

Data-Driven Fault Diagnosis for Traction Systems in High-Speed Trains: A Survey, Challenges, and Perspectives

TL;DR: In this article, the authors systematically review and categorize most of the mainstream FDD methods for high-speed trains and analyze the characteristics of observations from sensors equipped in traction systems.
Journal ArticleDOI

WaveletKernelNet: An Interpretable Deep Neural Network for Industrial Intelligent Diagnosis

TL;DR: In this paper, a wavelet-driven deep neural network termed WaveletKernelNet (WKN) is presented, in which a continuous wavelet convolutional (CWConv) layer is designed to replace the first convolutional layer of a standard CNN.
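A loose sketch of the CWConv idea only: the first layer's filters are sampled from a continuous wavelet with learnable scale and translation parameters rather than being free convolution weights. A Morlet-like wavelet is assumed here; the exact wavelet family and parameterization used in WKN may differ.

```python
# Rough sketch of a wavelet-parameterized first convolution layer.
# Not the cited authors' exact CWConv layer.
import torch
from torch import nn
import torch.nn.functional as F

class MorletConv1d(nn.Module):
    def __init__(self, out_channels, kernel_size):
        super().__init__()
        self.kernel_size = kernel_size
        # Learnable scale and translation per output channel.
        self.scale = nn.Parameter(torch.linspace(0.1, 1.0, out_channels))
        self.shift = nn.Parameter(torch.zeros(out_channels))

    def forward(self, x):                          # x: (batch, 1, time)
        t = torch.linspace(-1.0, 1.0, self.kernel_size, device=x.device)
        u = (t[None, :] - self.shift[:, None]) / self.scale[:, None].clamp(min=1e-3)
        filt = torch.exp(-0.5 * u ** 2) * torch.cos(5.0 * u)   # Morlet-like kernel
        return F.conv1d(x, filt[:, None, :], padding=self.kernel_size // 2)
```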
References
Proceedings Article

Adam: A Method for Stochastic Optimization

TL;DR: This work introduces Adam, an algorithm for first-order gradient-based optimization of stochastic objective functions, based on adaptive estimates of lower-order moments, and provides a regret bound on the convergence rate that is comparable to the best known results under the online convex optimization framework.
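For reference, a single Adam update step expressed in plain NumPy, using the published update rule with the usual default hyper-parameters:

```python
# One Adam step: exponential moving averages of the gradient and its square,
# bias correction, then a rescaled parameter update.
import numpy as np

def adam_step(theta, grad, m, v, t, alpha=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    m = beta1 * m + (1 - beta1) * grad             # 1st-moment estimate
    v = beta2 * v + (1 - beta2) * grad ** 2        # 2nd-moment estimate
    m_hat = m / (1 - beta1 ** t)                   # bias correction (t >= 1)
    v_hat = v / (1 - beta2 ** t)
    theta = theta - alpha * m_hat / (np.sqrt(v_hat) + eps)
    return theta, m, v
```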
Journal ArticleDOI

Long short-term memory

TL;DR: A novel, efficient, gradient based method called long short-term memory (LSTM) is introduced, which can learn to bridge minimal time lags in excess of 1000 discrete-time steps by enforcing constant error flow through constant error carousels within special units.
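For reference, one step of the now-standard LSTM cell, sketched in NumPy; the forget gate shown here is a later extension of the original 1997 formulation:

```python
# One LSTM cell step: gated updates let the cell state carry error over
# long time lags (the "constant error carousel" idea).
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h, c, W, U, b):
    """W: (4H, D), U: (4H, H), b: (4H,) stack the input/forget/output/candidate params."""
    z = W @ x + U @ h + b
    i, f, o, g = np.split(z, 4)
    c = sigmoid(f) * c + sigmoid(i) * np.tanh(g)   # cell state update
    h = sigmoid(o) * np.tanh(c)                    # hidden state
    return h, c
```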
Proceedings Article

Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift

TL;DR: Applied to a state-of-the-art image classification model, Batch Normalization achieves the same accuracy with 14 times fewer training steps, and beats the original model by a significant margin.
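For reference, the training-time batch normalization transform in NumPy; the running statistics used at inference time are omitted from this sketch:

```python
# Normalize each feature over the mini-batch, then apply the learned
# per-feature scale (gamma) and shift (beta).
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    """x: (batch, features); gamma, beta: per-feature scale and shift."""
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    x_hat = (x - mean) / np.sqrt(var + eps)
    return gamma * x_hat + beta
```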
Book

Discrete-Time Signal Processing

TL;DR: In this book, the authors provide a thorough treatment of the fundamental theorems and properties of discrete-time linear systems, filtering, sampling, and discrete-time Fourier analysis.
Journal Article

Random search for hyper-parameter optimization

TL;DR: This paper shows empirically and theoretically that randomly chosen trials are more efficient for hyper-parameter optimization than trials on a grid, and shows that random search is a natural baseline against which to judge progress in the development of adaptive (sequential) hyper-parameter optimization algorithms.
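A minimal sketch of random search over an assumed hyper-parameter space; train_and_score is a hypothetical user-supplied objective that trains a model with the given parameters and returns a validation score:

```python
# Draw hyper-parameters independently at random and keep the best-scoring
# configuration; the search space below is illustrative only.
import random

def random_search(train_and_score, n_trials=20, seed=0):
    rng = random.Random(seed)
    best = (None, float("-inf"))
    for _ in range(n_trials):
        params = {
            "lr": 10 ** rng.uniform(-5, -1),        # log-uniform learning rate
            "hidden": rng.choice([16, 32, 64, 128]),
            "dropout": rng.uniform(0.0, 0.5),
        }
        score = train_and_score(params)             # user-supplied objective
        if score > best[1]:
            best = (params, score)
    return best
```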