scispace - formally typeset
Author

Fabrizio De Vita

Bio: Fabrizio De Vita is an academic researcher from the University of Messina. The author has contributed to research in the topics of computer science and deep learning, has an h-index of 4, and has co-authored 13 publications receiving 61 citations.

Papers
Proceedings ArticleDOI
12 Jun 2019
TL;DR: This paper presents a machine learning approach based on LSTM networks, demonstrating that they are a feasible technique for analyzing the "history" of a system in order to predict its Remaining Useful Life (RUL).
Abstract: Maintenance scheduling has become a crucial problem, especially in sectors where the failure of a component can compromise the operation of an entire system or endanger a human life. Current systems can only warn once a failure has occurred, causing, in the worst case, an offline period that is costly in terms of money, time, and safety. Recently, new approaches to the problem have been proposed with the support of machine learning techniques, aiming to predict the Remaining Useful Life (RUL) of a system by correlating data coming from a set of sensors attached to its components. In this paper, we present a machine learning approach based on LSTM networks to demonstrate that they are a feasible technique for analyzing the "history" of a system in order to predict its RUL. Moreover, we propose a technique for tuning the hyperparameters of LSTM networks. To train the models, we used a dataset provided by NASA containing sensor measurements from jet engines. Finally, we present our results and compare them with other machine learning techniques and models found in the literature.
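The setup the abstract describes (run-to-failure sensor series from the NASA jet-engine dataset, fed to an LSTM to predict RUL) can be sketched in its data-preparation step. This is a hypothetical illustration, not the authors' code: the RUL cap of 125 cycles is a common convention in the literature for this dataset, and the window length is arbitrary.

```python
# Hypothetical sketch of RUL label generation and windowing for
# run-to-failure sensor data (NASA C-MAPSS-style). Not the authors' code.

def rul_labels(num_cycles: int, cap: int = 125) -> list:
    """For an engine observed for num_cycles cycles until failure, the RUL at
    cycle t is (num_cycles - t), capped because degradation is negligible
    early in life (the cap value is a common convention, assumed here)."""
    return [min(num_cycles - t, cap) for t in range(1, num_cycles + 1)]

def sliding_windows(series: list, window: int) -> list:
    """Cut a per-engine time series into fixed-length windows: the input an
    LSTM expects, one window mapping to one RUL prediction."""
    return [series[i:i + window] for i in range(len(series) - window + 1)]
```

Each window of sensor readings would then be paired with the RUL label of its last cycle when training the network.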

36 citations

Journal ArticleDOI
TL;DR: The aim is to design and implement a non-invasive system of wearable sensors for the prevention of PUs through deep learning techniques, using inertial sensors to estimate patients' positions and sending an alert signal when a patient remains in the same position for too long.
Abstract: In recent years, statistics have confirmed that the number of elderly people is increasing. Aging has a strong impact on human health; from a biological point of view, this process usually leads to several types of diseases, mainly due to the impairment of the organism. In this context, healthcare plays an important role in the healing process by trying to address these problems. One consequence of aging is the formation of pressure ulcers (PUs), which negatively affect the quality of life of hospitalized patients, not only from a health perspective but also psychologically. E-health proposes several approaches to deal with this problem; however, these are not always accurate or capable of preventing such issues efficiently, and the proposed solutions are usually expensive and invasive. In this paper, in line with the Human-centric Computing (HC) paradigm, we collect data from inertial sensors to design and implement a non-invasive system of wearable sensors for the prevention of PUs through deep learning techniques. In particular, using inertial sensors we estimate the positions of patients and send an alert signal when a patient remains in the same position for too long. To train our system, we built a dataset by monitoring the positions of a set of patients during their hospitalization. We present the results, demonstrating the feasibility of this technique and the level of accuracy reached, and compare our model with other popular machine learning approaches.
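The alerting rule the abstract describes can be illustrated downstream of the classifier: given a stream of timestamped position labels (which in the paper would come from the deep-learning model fed with inertial-sensor data), raise an alert when the same position persists past a threshold. This is a minimal sketch under assumptions; the two-hour threshold is a common repositioning guideline, not a value from the paper.

```python
# Hypothetical sketch of the position-duration alerting logic.
# stream: iterable of (timestamp_seconds, position_label) pairs.

def position_alerts(stream, threshold_s=7200):
    alerts = []
    current, start_t = None, None
    for t, pos in stream:
        if pos != current:
            current, start_t = pos, t          # position changed: restart timer
        elif t - start_t >= threshold_s:
            alerts.append((t, pos))            # same position held too long
            start_t = t                        # reset so alerts repeat periodically
    return alerts
```

Resetting the timer after each alert means a patient who is never repositioned keeps generating reminders rather than a single one.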

32 citations

Proceedings ArticleDOI
01 Nov 2018
TL;DR: A deep reinforcement learning approach is proposed that manages data migration in MEC scenarios by learning during the evolution of the system, implemented with the Keras machine learning framework.
Abstract: 5G technology promises to improve network performance by allowing users to seamlessly access distributed services in a powerful way. In this perspective, Multi-access Edge Computing (MEC) is a relevant paradigm that pushes data and computational resources close to users, with the final goal of reducing latency and improving resource utilization. Such a scenario requires strong policies to react to the dynamics of the environment while taking multiple parameter settings into account. In this paper, we propose a deep reinforcement learning approach that manages data migration in MEC scenarios by learning during the evolution of the system. We set up a simulation environment based on the OMNeT++/SimuLTE simulator integrated with the Keras machine learning framework, and we discuss preliminary results showing the feasibility of the proposed approach.
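The migration decision the abstract frames as a reinforcement learning problem can be illustrated with plain tabular Q-learning on a toy two-cell model; the paper itself uses a deep agent built with Keras inside an OMNeT++/SimuLTE simulation, so everything below (states, costs, rewards) is an invented stand-in, not the authors' formulation.

```python
import random

# Toy tabular Q-learning for a data-migration decision (illustrative only).
# State: (user_cell, data_cell) in {0,1} x {0,1}. Action: 0 = keep, 1 = migrate.
# Reward penalises remote-access latency and a fixed one-off migration cost.

def train(episodes=2000, alpha=0.1, gamma=0.9, eps=0.2, seed=0):
    rng = random.Random(seed)
    Q = {(u, d): [0.0, 0.0] for u in (0, 1) for d in (0, 1)}
    u, d = 0, 0
    for _ in range(episodes):
        s = (u, d)
        # epsilon-greedy action selection
        a = rng.randrange(2) if rng.random() < eps else max((0, 1), key=lambda x: Q[s][x])
        if a == 1:
            d = u                              # migrate data to the user's cell
        r = -(5.0 if u != d else 1.0) - (2.0 if a == 1 else 0.0)
        u = rng.randrange(2)                   # the user moves at random
        s2 = (u, d)
        Q[s][a] += alpha * (r + gamma * max(Q[s2]) - Q[s][a])
    return Q
```

After training, the learned policy should migrate when user and data are in different cells and keep the data in place otherwise, which is the qualitative behaviour a latency-driven migration policy is expected to learn.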

21 citations

Proceedings ArticleDOI
21 Apr 2020
TL;DR: An industrial IoT architectural framework that allows data offloading between the cloud and the edge is proposed and an anomaly detection algorithm that exploits deep learning techniques to assess the working conditions of the plant is designed.
Abstract: The advent of the IoT has catalyzed the development of a variety of cyber-physical systems in which hundreds of sensor- and actuator-enabled devices (including industrial IoT devices) cooperatively interact with the physical and human worlds. However, due to the large volume and heterogeneity of the data generated by such systems and the stringent timing requirements of industrial applications, designing efficient frameworks to store, monitor, and analyze IoT data is quite challenging. This paper proposes an industrial IoT architectural framework that allows data offloading between the cloud and the edge. Specifically, we use this framework for telemetry of a set of heterogeneous sensors attached to a scale replica of an industrial assembly plant. We also design an anomaly detection algorithm that exploits deep learning techniques to assess the working conditions of the plant. Experimental results show that the proposed detector identifies 99% of the anomalies that occurred in the industrial system, demonstrating the feasibility of our approach.
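The contract of the anomaly detector described above (fit on normal sensor readings, flag readings that deviate from the learned working conditions) can be illustrated with a simple statistical stand-in. The paper's detector is a deep-learning model; the z-score detector and the 3-sigma threshold below are conventional baseline choices assumed for illustration.

```python
import statistics

# Minimal z-score anomaly detector: a hypothetical stand-in for the paper's
# deep-learning model, sharing the same fit/detect contract.

class ZScoreDetector:
    def fit(self, normal_readings):
        """Estimate the normal operating regime from anomaly-free data."""
        self.mu = statistics.mean(normal_readings)
        self.sigma = statistics.stdev(normal_readings)
        return self

    def is_anomaly(self, x, k=3.0):
        """Flag a reading more than k standard deviations from the mean."""
        return abs(x - self.mu) > k * self.sigma
```

A deep model replaces the mean/deviation statistics with a learned representation, but the deployment shape (train on healthy telemetry, threshold a deviation score) is the same.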

17 citations

Journal ArticleDOI
TL;DR: This paper proposes a full-stack hardware/software infrastructure to collect, manage, and analyze data gathered from a set of heterogeneous sensors attached to a real scale replica of an industrial plant available in the laboratory, along with a fault prediction algorithm that exploits sensor data fusion to assess the working conditions of the plant.

16 citations


Cited by
Journal ArticleDOI
TL;DR: It was concluded that computer science, including the fields of artificial intelligence and distributed computing, is increasingly present in an area where engineering was the dominant expertise, highlighting the importance of a multidisciplinary approach to addressing Industry 4.0 effectively.

322 citations

Journal ArticleDOI
TL;DR: It is pointed out that predictive maintenance is a hot topic in the context of Industry 4.0, but one with several challenges still to be investigated in the areas of machine learning and the application of reasoning.

189 citations

Journal ArticleDOI
TL;DR: A comprehensive survey on the use of ML in MEC systems is provided, offering insight into the current progress of this research area; helpful guidance is supplied by pointing out which MEC challenges can be solved by ML, what the current trending algorithms in frontier ML research are, and how they could be used in MEC.
Abstract: Mobile Edge Computing (MEC) is considered an essential future service for the implementation of 5G networks and the Internet of Things, as it is the best method of delivering computation and communication resources to mobile devices. It is based on connecting users to servers located on the edge of the network, which is especially relevant for real-time applications that demand minimal latency. In order to guarantee a resource-efficient MEC (which, for example, could mean improved Quality of Service for users or lower costs for service providers), it is important to consider certain aspects of the service model, such as where to offload the tasks generated by the devices, how many resources to allocate to each user (especially in the wired or wireless device-server communication), and how to handle inter-server communication. However, in MEC scenarios with many and varied users, servers, and applications, these problems are characterized by parameters with exceedingly high dimensionality, resulting in too much data to process and complicating the task of finding efficient configurations. This will be particularly troublesome when 5G networks and the Internet of Things roll out, with their massive numbers of devices. To address this concern, the best solution is to utilize Machine Learning (ML) algorithms, which enable the computer to draw conclusions and make predictions based on existing data without human supervision, leading to quick near-optimal solutions even in problems with high dimensionality. Indeed, in scenarios with too much data and too many parameters, ML algorithms are often the only feasible alternative. In this paper, a comprehensive survey on the use of ML in MEC systems is provided, offering insight into the current progress of this research area.
Furthermore, helpful guidance is supplied by pointing out which MEC challenges can be solved by ML solutions, what the current trending algorithms in frontier ML research are, and how they could be used in MEC. These pieces of information should prove fundamental in encouraging future research that combines ML and MEC.

186 citations

Journal ArticleDOI
TL;DR: This work presents some important edge computing architectures, classifies previous works on computation offloading into different categories, and discusses basic models, such as the channel model, the computation and communication model, and the energy harvesting model, that have been proposed in offloading modeling.

159 citations

Journal ArticleDOI
TL;DR: In this article, a systematic review of current Industrial Artificial Intelligence literature is presented, focusing on its application in real manufacturing environments to identify the main enabling technologies and core design principles, along with a set of key challenges and opportunities to be addressed by future research efforts.
Abstract: The advent of the Industry 4.0 initiative has made it so that manufacturing environments are becoming more and more dynamic, connected but also inherently more complex, with additional inter-dependencies, uncertainties and large volumes of data being generated. Recent advances in Industrial Artificial Intelligence have showcased the potential of this technology to assist manufacturers in tackling the challenges associated with this digital transformation of Cyber-Physical Systems, through its data-driven predictive analytics and capacity to assist decision-making in highly complex, non-linear and often multistage environments. However, the industrial adoption of such solutions is still relatively low beyond the experimental pilot stage, as real environments provide unique and difficult challenges for which organizations are still unprepared. The aim of this paper is thus two-fold. First, a systematic review of current Industrial Artificial Intelligence literature is presented, focusing on its application in real manufacturing environments to identify the main enabling technologies and core design principles. Then, a set of key challenges and opportunities to be addressed by future research efforts are formulated along with a conceptual framework to bridge the gap between research in this field and the manufacturing industry, with the goal of promoting industrial adoption through a successful transition towards a digitized and data-driven company-wide culture. This paper is among the first to provide a clear definition and holistic view of Industrial Artificial Intelligence in the Industry 4.0 landscape, identifying and analysing its fundamental building blocks and ongoing trends. Its findings are expected to assist and empower researchers and manufacturers alike to better understand the requirements and steps necessary for a successful transition into Industry 4.0 supported by AI, as well as the challenges that may arise during this process.

139 citations