Open Access · Journal Article · DOI

Estimating Electric Motor Temperatures With Deep Residual Machine Learning

TL;DR
In this paper, deep recurrent and convolutional neural networks with residual connections are empirically evaluated for their feasibility in continuously predicting latent, highly dynamic temperatures inside permanent magnet synchronous motors.
Abstract
Most traction drive applications lack accurate temperature monitoring capabilities, so safe operation is instead ensured through expensive, oversized motor designs. Classic thermal modeling requires expertise in choosing model parameters, which are affected by motor geometry, cooling dynamics, and hot-spot definition. Moreover, its major advantage over data-driven approaches, namely physical interpretability, tends to deteriorate as soon as its degrees of freedom are curtailed in order to meet real-time requirements. In this article, deep recurrent and convolutional neural networks (NNs) with residual connections are empirically evaluated for their feasibility in continuously predicting latent, highly dynamic temperatures inside permanent magnet synchronous motors. Here, the temperature profiles in the stator teeth, winding, and yoke as well as in the rotor's permanent magnets are estimated, while their ground truth is available as test bench data. With an automated hyperparameter search through Bayesian optimization and a manual merge of target estimators into a multihead architecture, lean models are presented that exhibit strong estimation performance at minimal model size. It has been found that the mean squared error and maximum absolute deviation performances of both deep recurrent and convolutional NNs with residual connections meet those of classic thermodynamics-based approaches, without requiring domain expertise or specific drive train specifications for their topological design. Finally, learning curves for varying training set sizes and interpretations of model estimates through expected gradients are presented.
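For readers who want a concrete picture of the model family described above, the following sketch shows a residual convolutional estimator with one output head per target temperature, written in PyTorch. It is a minimal illustration assuming a generic input feature set (e.g., currents, voltages, speed, coolant and ambient temperatures) and arbitrary layer sizes; it is not the authors' published topology or hyperparameter choice.

```python
# Illustrative sketch only: a residual temporal CNN with a multi-head output,
# loosely following the ideas in the abstract (residual connections, one head
# per target temperature). Layer sizes, kernel widths, and the input feature
# set are assumptions, not the authors' published topology.
import torch
import torch.nn as nn


class ResidualBlock(nn.Module):
    """1-D convolutional block whose input is added back to its output."""

    def __init__(self, channels: int, kernel_size: int = 3):
        super().__init__()
        padding = kernel_size // 2
        self.conv1 = nn.Conv1d(channels, channels, kernel_size, padding=padding)
        self.conv2 = nn.Conv1d(channels, channels, kernel_size, padding=padding)
        self.act = nn.ReLU()

    def forward(self, x):
        residual = x
        out = self.act(self.conv1(x))
        out = self.conv2(out)
        return self.act(out + residual)  # residual (skip) connection


class MultiHeadThermalEstimator(nn.Module):
    """Shared residual trunk with one regression head per target temperature."""

    def __init__(self, n_features: int = 8, channels: int = 16, n_blocks: int = 2,
                 targets=("stator_teeth", "stator_winding", "stator_yoke", "pm")):
        super().__init__()
        self.input_proj = nn.Conv1d(n_features, channels, kernel_size=1)
        self.trunk = nn.Sequential(*[ResidualBlock(channels) for _ in range(n_blocks)])
        # One small head per target component, mirroring the "multihead" merge.
        self.heads = nn.ModuleDict({t: nn.Conv1d(channels, 1, kernel_size=1) for t in targets})

    def forward(self, x):
        # x: (batch, n_features, time) of measured drive quantities (assumed feature set).
        h = self.trunk(self.input_proj(x))
        return {name: head(h).squeeze(1) for name, head in self.heads.items()}


if __name__ == "__main__":
    model = MultiHeadThermalEstimator()
    dummy = torch.randn(4, 8, 128)          # 4 sequences, 8 features, 128 time steps
    outputs = model(dummy)
    print({k: v.shape for k, v in outputs.items()})
```

A recurrent variant could swap the convolutional trunk for residual GRU layers without changing the multihead output structure.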

Citations
Journal Article · DOI

Machine Learning for Design Optimization of Electromagnetic Devices: Recent Developments and Future Directions

TL;DR: To meet modern requirements of high manufacturing/production quality and lifetime reliability, several promising topics, including the application of cloud services and digital twins, are discussed as future directions for the design optimization of electromagnetic devices.
Journal Article · DOI

Data-Driven Permanent Magnet Temperature Estimation in Synchronous Motors With Supervised Machine Learning: A Benchmark

TL;DR: Through benchmarking, this work reveals the potential of simpler ML models in terms of regression accuracy, model size, and data demand in comparison to the parameter-heavy deep neural networks investigated in the literature before.
Journal Article · DOI

Thermal Monitoring of Electric Motors: State-of-the-Art Review and Future Challenges

TL;DR: In this article, the authors present a broad overview of the literature on temperature sensing in electric motors and a bird's-eye view of the three most important estimation classes: indirect methods, which track temperature-sensitive electrical motor parameters, and direct methods, namely lumped-parameter thermal networks and supervised machine learning.
Journal Article · DOI

Optimized Two-Level Ensemble Model for Predicting the Parameters of Metamaterial Antenna

TL;DR: In this paper, the authors propose a two-level ensemble model for predicting the bandwidth of a metamaterial antenna, composed of three strong base models, namely random forest, support vector regression, and light gradient boosting machine.
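As an aside, a two-level (stacking) ensemble of the three base learners named in this TL;DR can be sketched with scikit-learn; the meta-learner, synthetic data, and hyperparameters below are illustrative assumptions only and do not reproduce the cited paper's optimized setup.

```python
# Minimal sketch of a two-level (stacking) ensemble with the three base models
# named in the TL;DR. Meta-learner, hyperparameters, and synthetic data are
# assumptions for illustration, not the cited paper's configuration.
import numpy as np
from lightgbm import LGBMRegressor          # assumes the lightgbm package is installed
from sklearn.ensemble import RandomForestRegressor, StackingRegressor
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split
from sklearn.svm import SVR

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 6))                              # stand-in design parameters
y = X @ rng.normal(size=6) + 0.1 * rng.normal(size=500)    # stand-in bandwidth target

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

level_one = [
    ("rf", RandomForestRegressor(n_estimators=200, random_state=0)),
    ("svr", SVR(C=1.0, epsilon=0.1)),
    ("lgbm", LGBMRegressor(n_estimators=200, random_state=0)),
]
# Level two: a simple linear meta-learner combines the base predictions.
model = StackingRegressor(estimators=level_one, final_estimator=Ridge())
model.fit(X_train, y_train)
print("R^2 on held-out data:", model.score(X_test, y_test))
```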
Journal Article · DOI

Flux Linkage-Based Direct Model Predictive Current Control for Synchronous Machines

TL;DR: This paper presents a flux linkage-based direct model predictive current control approach that achieves favorable performance during both steady-state and transient operation and exhibits excellent dynamic response owing to its direct control nature.
References
Proceedings Article · DOI

Deep Residual Learning for Image Recognition

TL;DR: In this paper, the authors propose a residual learning framework that eases the training of networks substantially deeper than those used previously; the resulting residual networks won first place in the ILSVRC 2015 classification task.
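The core idea of this reference, learning a residual F(x) and adding it back to the input through an identity shortcut, can be summarized in a few lines; the channel count and kernel sizes below are illustrative assumptions rather than the paper's exact configuration.

```python
# Sketch of the identity-shortcut building block described in the TL;DR:
# the block learns a residual F(x) and outputs F(x) + x. Channel counts and
# kernel sizes are illustrative assumptions.
import torch
import torch.nn as nn


class BasicBlock(nn.Module):
    def __init__(self, channels: int):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(channels)

    def forward(self, x):
        out = torch.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        return torch.relu(out + x)   # identity shortcut: only the residual is learned


print(BasicBlock(16)(torch.randn(1, 16, 8, 8)).shape)  # torch.Size([1, 16, 8, 8])
```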
Proceedings Article

Adam: A Method for Stochastic Optimization

TL;DR: This work introduces Adam, an algorithm for first-order gradient-based optimization of stochastic objective functions, based on adaptive estimates of lower-order moments, and provides a regret bound on the convergence rate that is comparable to the best known results under the online convex optimization framework.
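For reference, the Adam update summarized above amounts to maintaining bias-corrected exponential moving averages of the gradient and its square; a compact NumPy sketch with the commonly used default hyperparameters is given below.

```python
# Compact NumPy sketch of the Adam update rule: per-parameter step sizes are
# derived from exponential moving averages of the gradient (first moment) and
# squared gradient (second moment), with bias correction.
import numpy as np

def adam_step(theta, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    m = beta1 * m + (1 - beta1) * grad            # first-moment estimate
    v = beta2 * v + (1 - beta2) * grad ** 2       # second-moment estimate
    m_hat = m / (1 - beta1 ** t)                  # bias correction
    v_hat = v / (1 - beta2 ** t)
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)
    return theta, m, v

# Toy usage: minimize f(theta) = ||theta||^2, whose gradient is 2 * theta.
theta = np.array([1.0, -2.0])
m = v = np.zeros_like(theta)
for t in range(1, 1001):
    theta, m, v = adam_step(theta, 2 * theta, m, v, t)
print(theta)  # approaches [0, 0]
```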
Journal Article

Dropout: a simple way to prevent neural networks from overfitting

TL;DR: It is shown that dropout improves the performance of neural networks on supervised learning tasks in vision, speech recognition, document classification and computational biology, obtaining state-of-the-art results on many benchmark data sets.
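Dropout as summarized here reduces to randomly zeroing activations during training; the sketch below uses the common "inverted" formulation, which rescales the surviving activations so that no adjustment is needed at inference time (a practical variant, not a verbatim reproduction of the cited paper's scheme).

```python
# Minimal sketch of (inverted) dropout: each activation is zeroed with
# probability p during training and the survivors are rescaled, so the
# expected activation is preserved and inference needs no change.
import numpy as np

def dropout(activations, p=0.5, training=True, rng=np.random.default_rng(0)):
    if not training or p == 0.0:
        return activations
    mask = rng.random(activations.shape) >= p      # keep with probability 1 - p
    return activations * mask / (1.0 - p)          # rescale to preserve expectation

x = np.ones((2, 4))
print(dropout(x, p=0.5))            # roughly half the units zeroed, rest scaled by 2
print(dropout(x, training=False))   # unchanged at inference
```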
Proceedings Article · DOI

Learning Phrase Representations using RNN Encoder-Decoder for Statistical Machine Translation

TL;DR: In this paper, the encoder and decoder of the RNN Encoder-Decoder model are jointly trained to maximize the conditional probability of a target sequence given a source sequence.
Posted Content

Empirical Evaluation of Gated Recurrent Neural Networks on Sequence Modeling

TL;DR: Advanced recurrent units that implement a gating mechanism, such as the long short-term memory (LSTM) unit and the recently proposed gated recurrent unit (GRU), are evaluated empirically on sequence modeling tasks, and the GRU is found to be comparable to the LSTM.
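To make the gating mechanism concrete, the following NumPy sketch performs a single GRU step; the weight shapes and gate sign convention follow one common formulation and are not taken verbatim from the cited paper.

```python
# Sketch of a single GRU cell step: an update gate interpolates between the
# previous hidden state and a candidate state, and a reset gate controls how
# much of the previous state feeds the candidate. Shapes are illustrative.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(x, h_prev, Wz, Uz, Wr, Ur, Wh, Uh):
    z = sigmoid(Wz @ x + Uz @ h_prev)                 # update gate
    r = sigmoid(Wr @ x + Ur @ h_prev)                 # reset gate
    h_tilde = np.tanh(Wh @ x + Uh @ (r * h_prev))     # candidate state
    return (1 - z) * h_prev + z * h_tilde             # gated interpolation

rng = np.random.default_rng(0)
n_in, n_hidden = 3, 5
params = [rng.normal(scale=0.1, size=s) for s in
          [(n_hidden, n_in), (n_hidden, n_hidden)] * 3]   # Wz, Uz, Wr, Ur, Wh, Uh
h = np.zeros(n_hidden)
for _ in range(10):                                   # unroll over a short input sequence
    h = gru_step(rng.normal(size=n_in), h, *params)
print(h.shape)  # (5,)
```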