Open Access Journal Article (DOI)

Industry 4.0 - Optimization of Dressing Cycle Frequency and Power Requirements of Grinding Machine using Neural Networks

TLDR
The main objective is to develop a method that increases the efficiency and reliability of machines using machine learning; the method can also alert line supervisors and engineers to analyse and rectify problems, thereby reducing the chances of machine failure.
Abstract
In today’s world, Industry 4.0 is an emerging trend towards the automation of manufacturing processes and technologies, including the Internet of Things, cognitive and cloud computing, artificial intelligence and machine learning. Our main objective is to develop a method that increases the efficiency and reliability of machines using machine learning.

I. INTRODUCTION
Grinding is one of the most important processes in manufacturing industries. It is an abrasive machining process that removes excess material from a raw surface. The process uses a grinding wheel as the cutting tool to perform precision grinding and produces a very fine finish within the required tolerance. After successive grinding operations, the grinding wheel surface becomes blunt and loses its sharpness. A diamond is then used to remove the blunt surface and expose the sharp edges underneath; this process is called “dressing”. At present, no active feedback is given to the machine by which the machine controller could decide the exact time to perform grinding wheel dressing and, consequently, the dressing frequency. In the absence of this closed-loop feedback, machine operators set the dressing frequency manually. Too high a dressing frequency removes grinding wheel material unnecessarily, reducing the life of the wheel. Too low a dressing frequency leads to quality issues such as chatter marks, ovality, poor surface finish (high Ra value) and high power consumption. A correct balance of the dressing cycle (dressing frequency) must therefore be achieved to maintain both product quality and grinding wheel life, thereby increasing the reliability of the finished product and reducing grinding wheel cost. Neural networks [9] can be used to give the machine real-time feedback on dressing frequency based on past cycles. They can also forecast worst-case cycles and alert line supervisors and engineers to analyse and rectify the problem, reducing the chances of machine failure. This helps reduce production losses and increases overall manufacturing efficiency.

II. SELECTION OF DATA
Various dynamic forces act on the grinding head during operation. As the wheel becomes blunt, changes in these forces can be observed in the vibration and acceleration parameters, and more force is applied to remove the material. The increased forces lead to higher power consumption by the machine spindle, so the power count of the variable frequency drive correlates directly with the forces present. Hence, the power count is selected as the input quantity for the neural network.

III. COLLECTION OF DATA
Most machines in industry are controlled by programmable logic controllers (PLCs). The power count is collected from a Mitsubishi MELSEC Q PLC as an Open Database Connectivity (ODBC) client via Ethernet. Kepware ServerX software was used to collect and decode the data, which is stored in SQL tabular format in MS Access. Over 5000 dressing cycles were collected over a period of 26 days. To make the resulting neural network more generalized and applicable, cycles recorded during specific environments, failures and worst-case scenarios were also collected.
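As an illustration of how the logged power counts might be read back out of the MS Access store for analysis, here is a minimal sketch using pyodbc and pandas. The file path, table name and column names are assumptions for illustration only; they are not given in the paper.

```python
# Illustrative sketch: pull logged power counts from the MS Access database
# that the OPC server writes into. Path, table and column names are hypothetical.
import pyodbc
import pandas as pd

conn = pyodbc.connect(
    r"DRIVER={Microsoft Access Driver (*.mdb, *.accdb)};"
    r"DBQ=C:\data\grinding_log.accdb"  # hypothetical database file
)

# One row per logged sample, ordered in time.
query = "SELECT log_time, power_count FROM PowerCountLog ORDER BY log_time"
df = pd.read_sql(query, conn)
conn.close()

print(df.head())
```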
IV. PREPARATION OF DATA
Noisy and pilot cycles were removed, as they were the result of human error or intervention, or of other unrelated problems. The data were then split into three parts: training data, test data and hold-out data, and normalized using sklearn’s MinMaxScaler. TensorFlow’s timeseries generator was used to generate batches of data together with the output array for validation. The training data were also shaped according to the number of features, the length of the input data with timestamps, and the batch size.

Fig. 1: Power count vs. one-cycle time
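The preparation steps described above could be sketched roughly as follows. The split ratios, window length and batch size are placeholder assumptions, not values reported in the paper, and the random series stands in for the cleaned power-count data.

```python
# Sketch of the described preparation pipeline: split, MinMax-normalize,
# then window the series into (batch, timesteps, features) batches.
import numpy as np
from sklearn.preprocessing import MinMaxScaler
from tensorflow.keras.preprocessing.sequence import TimeseriesGenerator

rng = np.random.default_rng(0)
power = rng.random((5000, 1))  # stand-in for the ~5000 cleaned dressing-cycle power counts

# Split into training, test and hold-out sets before scaling (assumed 70/20/10).
n = len(power)
train = power[: int(0.7 * n)]
test = power[int(0.7 * n): int(0.9 * n)]
hold = power[int(0.9 * n):]  # kept aside for final evaluation

# Normalize with MinMaxScaler fitted on the training data only.
scaler = MinMaxScaler()
train_s = scaler.fit_transform(train)
test_s = scaler.transform(test)

# Generate batches of sliding windows, with the next value as the target.
length, batch_size = 50, 32  # assumed hyperparameters
train_gen = TimeseriesGenerator(train_s, train_s, length=length, batch_size=batch_size)
test_gen = TimeseriesGenerator(test_s, test_s, length=length, batch_size=batch_size)

x_batch, y_batch = train_gen[0]
print(x_batch.shape)  # (batch_size, length, 1) -> (batch, timesteps, features)
```

Fitting the scaler only on the training portion keeps information from the test and hold-out data from leaking into the normalization.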



References
Journal Article (DOI)

Long short-term memory

TL;DR: A novel, efficient, gradient based method called long short-term memory (LSTM) is introduced, which can learn to bridge minimal time lags in excess of 1000 discrete-time steps by enforcing constant error flow through constant error carousels within special units.
Journal Article (DOI)

Bidirectional recurrent neural networks

TL;DR: It is shown how the proposed bidirectional structure can be easily modified to allow efficient estimation of the conditional posterior probability of complete symbol sequences without making any explicit assumption about the shape of the distribution.
Journal Article (DOI)

LSTM: A Search Space Odyssey

TL;DR: This paper presents the first large-scale analysis of eight LSTM variants on three representative tasks: speech recognition, handwriting recognition, and polyphonic music modeling, and observes that the studied hyperparameters are virtually independent and derive guidelines for their efficient adjustment.
Posted Content

Generating Sequences With Recurrent Neural Networks

TL;DR: This paper shows how Long Short-term Memory recurrent neural networks can be used to generate complex sequences with long-range structure, simply by predicting one data point at a time.
Book

Supervised Sequence Labelling with Recurrent Neural Networks

Alex Graves
TL;DR: A new type of output layer that allows recurrent networks to be trained directly for sequence labelling tasks where the alignment between the inputs and the labels is unknown, and an extension of the long short-term memory network architecture to multidimensional data, such as images and video sequences.