scispace - formally typeset
Author

S. Sankar Ganesh

Bio: S. Sankar Ganesh is an academic researcher from VIT University. The author has contributed to research in the topics of air quality index and gradient descent. The author has an h-index of 6, has co-authored 20 publications, and has received 85 citations. Previous affiliations of S. Sankar Ganesh include CMR Institute of Technology and Lovely Professional University.

Papers
Journal ArticleDOI
TL;DR: Several ensemble models of individual neural network predictors and individual regression predictors have been presented for the final forecast of the PM2.5 concentration.
Abstract: Inhaling particulate matter such as PM2.5 can have a hazardous impact on human health. In order to predict the PM2.5 concentration, artificial neural networks trained with conjugate gradient descent, such as the Multi Layer Perceptron (MLP), cascade forward neural network, Elman neural network, Radial Basis Function (RBF) neural network and Non-linear Autoregressive model with exogenous input (NARX), were implemented, along with regression models: Multiple Linear Regression (MLR) trained with batch gradient descent, stochastic gradient descent, mini-batch gradient descent and conjugate gradient descent algorithms, and Support Vector Regression (SVR). In these models, the concentration of PM2.5 was the dependent variable, and the independent variables were the concentrations of PM2.5, SO2 and O3 together with meteorological data, including average Maximum Temperature (MAX T) and daily Wind Speed (WS), for the years 2010–2016 in Houston and New York. For the final forecast, several ensemble models of individual neural network predictors and individual regression predictors have been presented.
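The final ensemble step the abstract describes — combining the forecasts of individual neural-network and regression predictors — can be sketched as a (weighted) average. The function name, weights and sample values below are illustrative assumptions, not from the paper:

```python
# Hypothetical sketch of the ensemble step: combine forecasts from
# individual predictors (neural networks, regression models) by a
# weighted average. Names, weights and values are illustrative.

def ensemble_forecast(predictions, weights=None):
    """Combine per-model forecasts for one time step into one value.

    predictions: list of forecasts, one per trained model.
    weights: optional per-model weights (e.g. inverse validation error);
             if omitted, a plain mean is used.
    """
    if weights is None:
        weights = [1.0] * len(predictions)
    total = sum(weights)
    return sum(p * w for p, w in zip(predictions, weights)) / total

# Three models forecasting PM2.5 (ug/m3) for the same day.
models = [12.4, 14.0, 13.1]
mean_forecast = ensemble_forecast(models)
weighted_forecast = ensemble_forecast(models, [0.5, 0.2, 0.3])
```

Weighting by (for example) inverse validation error lets better individual predictors dominate the ensemble, which is one common way such combinations outperform any single model.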

21 citations

Proceedings ArticleDOI
01 Apr 2017
TL;DR: Different regression models to forecast air quality index (AQI) in particular areas of interest are presented and support vector regression (SVR) exhibited high performance in terms of investigated measures of quality.
Abstract: It is always important to monitor the quality of the air we inhale to protect ourselves from respiratory diseases. In this paper, we present different regression models to forecast the air quality index (AQI) in particular areas of interest. Support vector regression (SVR) and linear models such as multiple linear regression trained with gradient descent, stochastic gradient descent and mini-batch gradient descent were implemented. In these models, the air quality index (AQI) depends on the pollutant concentrations of NO2, CO, O3, PM2.5, PM10 and SO2. Among these models, support vector regression (SVR) exhibited the highest performance in terms of the investigated measures of quality.
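One of the training schemes listed above, mini-batch gradient descent for multiple linear regression, can be sketched as follows. The toy data, learning rate, batch size and epoch count are illustrative assumptions, not the authors' settings:

```python
import numpy as np

# Hedged sketch of multiple linear regression trained with mini-batch
# gradient descent. Data and hyperparameters are made up for the example.

def mbgd_linear_regression(X, y, lr=0.05, batch_size=2, epochs=10_000, seed=0):
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(epochs):
        order = rng.permutation(n)               # reshuffle each epoch
        for start in range(0, n, batch_size):
            batch = order[start:start + batch_size]
            Xb, yb = X[batch], y[batch]
            err = Xb @ w + b - yb                # residuals on the batch
            w -= lr * (Xb.T @ err) / len(batch)  # gradient step, weights
            b -= lr * err.mean()                 # gradient step, bias
    return w, b

# Toy example: AQI as a (made-up) linear function of two pollutant levels.
X = np.array([[1.0, 2.0], [2.0, 1.0], [3.0, 3.0], [4.0, 2.0]])
y = 2.0 * X[:, 0] + 1.0 * X[:, 1] + 0.5
w, b = mbgd_linear_regression(X, y)
```

Batch gradient descent is the special case `batch_size = n`, and stochastic gradient descent is `batch_size = 1`; mini-batch sits between the two, trading gradient noise against per-step cost.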

18 citations

Journal ArticleDOI
TL;DR: A 3×4 array of Jerusalem-cross-shaped microstrip patch antennas has been designed, fabricated and validated in this paper. It consists of a 1 mm thick substrate mounted on metal strips, with conductive strips of 31.25 × 22.9 on one side.

17 citations

Journal ArticleDOI
TL;DR: In this paper, two different shapes of conductive strips (EF and ΨU), stacked on an epoxy-based composite substrate in an array pattern, have been presented in order to provide multiband operation, which also serves various modern wireless communication systems.

16 citations

Journal ArticleDOI
TL;DR: To forecast the air quality index (AQI), artificial neural networks trained with conjugate gradient descent (CGD) along with regression models such as multiple linear regression (MLR) and support vector regression (SVR) are implemented.
Abstract: Air is the most essential constituent for the sustenance of life on earth. The air we inhale has a tremendous impact on our health and well-being. Hence, it is always advisable to monitor the quality of air in our environment. To forecast the air quality index (AQI), artificial neural networks (ANNs) trained with conjugate gradient descent (CGD), such as the multilayer perceptron (MLP), cascade forward neural network, Elman neural network, radial basis function (RBF) neural network, and nonlinear autoregressive model with exogenous input (NARX), along with regression models such as multiple linear regression (MLR) consisting of batch gradient descent (BGD), stochastic gradient descent (SGD), mini-BGD (MBGD) and CGD algorithms, and support vector regression (SVR), are implemented. In these models, the AQI is the dependent variable and the concentrations of NO2, CO, O3, PM2.5, SO2, and PM10 for the years 2010–2016 in Houston and Los Angeles are the independent variables. For the final forecast, several ensemble models of individual neural network predictors and individual regression predictors are presented. This proposed approach performs with the highest efficiency in terms of forecasting the air quality index.
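Conjugate gradient descent, the training method named above, can be illustrated on the least-squares normal equations of a linear AQI model. This is a generic sketch, not the authors' code; the toy features and targets are made up for the example:

```python
import numpy as np

# Illustrative: conjugate gradient applied to the normal equations
# A w = b with A = X^T X (symmetric positive definite) and b = X^T y,
# i.e. one way to fit a linear AQI model with CGD.

def conjugate_gradient(A, b, tol=1e-10, max_iter=100):
    w = np.zeros_like(b)
    r = b - A @ w              # residual of the linear system
    p = r.copy()               # initial search direction
    rs_old = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs_old / (p @ Ap)      # exact step length along p
        w += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs_old) * p  # next A-conjugate direction
        rs_old = rs_new
    return w

# Columns: [pollutant_1, pollutant_2, bias]; here y = 2*x1 + 1*x2 + 1.
X = np.array([[1.0, 2.0, 1.0], [2.0, 1.0, 1.0],
              [3.0, 3.0, 1.0], [4.0, 2.0, 1.0]])
y = np.array([5.0, 6.0, 10.0, 11.0])
w = conjugate_gradient(X.T @ X, X.T @ y)
```

Unlike plain gradient descent, CG chooses mutually conjugate search directions, so on an n-dimensional quadratic it converges in at most n iterations, three here.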

13 citations


Cited by
Journal Article
TL;DR: The DES is based on a block cipher ("Lucifer") developed by Horst Feistel at IBM with a key length of 128 bits; growing computing power has made its effective 56-bit key a security risk, and in 1998 a special-purpose machine built by the "Electronic Frontier Foundation" (EFF) with 1,800 custom crypto processors working in parallel recovered a DES key in record time.
Abstract: In 1977, the "Data Encryption Algorithm" (DEA) was declared the American encryption standard for federal agencies by the "National Bureau of Standards" (NBS, later the "National Institute of Standards and Technology", NIST) [NBS_77]. In 1981, the DEA specification was adopted as the ANSI standard "DES" [ANSI_81]. The recommendation of DES as the standard encryption method was limited to five years and was extended by a further five years in 1983, 1988 and 1993. A revised version of the NIST standard is currently available [NIST_99], in which DES is to remain provisionally approved for a further five years, but the use of Triple-DES is recommended: a threefold application of DES with three different keys (effective key length: 168 bits) [NIST_99]. DES is based on a block cipher ("Lucifer") developed by Horst Feistel at IBM with a key length of 128 bits. Because the American "National Security Agency" (NSA) had ensured that DES received a key length of only 64 bits, of which only 56 bits are relevant, as well as special substitution boxes (the "cryptographic core" of the method) whose design criteria were never published by the NSA, the method was controversial from the start. Critics assumed that there was a secret "trapdoor" in the method that would allow the NSA to decrypt online traffic even without knowledge of the key. Although this suspicion could not be substantiated, both the increase in computing power and the parallelization of search algorithms make a key length of 56 bits a security risk today. Most recently, in 1998, a special-purpose machine developed by the "Electronic Frontier Foundation" (EFF), with 1,800 custom-built crypto processors working in parallel, recovered a DES key in a record time of 2.5 days. To find a successor to DES, NIST announced the search for an "Advanced Encryption Standard" (AES) on 2 January 1997. The goal of this initiative is to find, in close cooperation with research and industry, a symmetric encryption method suitable for effectively encrypting American government data well into the 21st century. To this end, an official "Call for Algorithms" was issued on 12 September 1997. The proposed symmetric encryption algorithms had to meet the following requirements: unclassified and published, available royalty-free worldwide, efficiently implementable in hardware and software, and block ciphers with a block length of 128 bits supporting key lengths of 128, 192 and 256 bits. At the first "AES Candidate Conference" (AES1), NIST published a list of 15 proposed algorithms on 20 August 1998 and invited the expert community to analyze them. The results were presented at the second "AES Candidate Conference" (22-23 March 1999 in Rome, AES2) and discussed among international cryptologists. The comment period ended on 15 April 1999. On the basis of the comments and analyses received, NIST selected five candidates, which it announced publicly on 9 August 1999: MARS (IBM), RC6 (RSA Lab.), Rijndael (Daemen, Rijmen), Serpent (Anderson, Biham, Knudsen), Twofish (Schneier, Kelsey, Whiting, Wagner, Hall, Ferguson).

624 citations

Journal ArticleDOI
TL;DR: An Aggregated LSTM (Long Short-Term Memory) model (ALSTM) based on the LSTM deep learning method is proposed that can effectively improve the accuracy of air pollution prediction.

143 citations

Journal ArticleDOI
TL;DR: A holistic review of existing smart airport applications and services enabled by IoT sensors and systems is presented, and several types of cyber defence tools including AI and data mining techniques are investigated, and their strengths and weaknesses are analysed in the context of smart airports.
Abstract: Advances in the Internet of Things (IoT) and aviation sector have resulted in the emergence of smart airports. Services and systems powered by the IoT enable smart airports to have enhanced robustness, efficiency and control, governed by real-time monitoring and analytics. Smart sensors control the environmental conditions inside the airport, automate passenger-related actions and support airport security. However, these augmentations and automation introduce security threats to network systems of smart airports. Cyber-attackers demonstrated the susceptibility of IoT systems and networks to Advanced Persistent Threats (APT), due to hardware constraints, software flaws or IoT misconfigurations. With the increasing complexity of attacks, it is imperative to safeguard IoT networks of smart airports and ensure reliability of services, as cyber-attacks can have tremendous consequences such as disrupting networks, cancelling travel, or stealing sensitive information. There is a need to adopt and develop new Artificial Intelligence (AI)-enabled cyber-defence techniques for smart airports, which will address the challenges brought about by the incorporation of IoT systems to the airport business processes, and the constantly evolving nature of contemporary cyber-attacks. In this study, we present a holistic review of existing smart airport applications and services enabled by IoT sensors and systems. Additionally, we investigate several types of cyber defence tools including AI and data mining techniques, and analyse their strengths and weaknesses in the context of smart airports. Furthermore, we provide a classification of smart airport sub-systems based on their purpose and criticality and address cyber threats that can affect the security of smart airport’s networks.

42 citations

Journal ArticleDOI
TL;DR: Ten established, data-driven dynamic algorithms are surveyed, a practical guide for understanding these methods is provided, and generalizable results demonstrate the suitability of each method for prediction over a multi-step future horizon in other complex dynamic systems.

36 citations

Journal ArticleDOI
Liang Ge, Kunyan Wu, Yi Zeng, Feng Chang, Yaqian Wang, Siyu Li
TL;DR: A multi-scale spatiotemporal graph convolution network (MST-GCN), which consists of a multi-scale block, several spatial-temporal blocks and a fusion block, achieves the highest performance compared with state-of-the-art and baseline models for air quality prediction.
Abstract: Air pollution is a serious environmental problem that has attracted much attention. Air quality prediction can provide useful information for urban environmental governance decision-making and residents’ daily health control. However, existing research methods have suffered from a weak ability to capture the spatial correlations and fail to model the long-term temporal dependencies of air quality. To overcome these limitations, we propose a multi-scale spatiotemporal graph convolution network (MST-GCN), which consists of a multi-scale block, several spatial-temporal blocks and a fusion block. We first divide the extracted features into several groups based on their domain categories, and represent the spatial correlations across stations as two graphs. Then we combine the grouped features and the constructed graphs in pairs to form a multi-scale block that feeds into spatial-temporal blocks. Each spatial-temporal block contains a graph convolution layer and a temporal convolution layer, which can model the spatial correlations and long-term temporal dependencies. To capture the group interactions, we use a fusion block to fuse multiple groups. Extensive experiments on a real-world dataset demonstrate that our model achieves the highest performance compared with state-of-the-art and baseline models for air quality prediction.
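A single graph convolution layer of the kind MST-GCN stacks can be sketched as symmetric-normalized neighborhood aggregation followed by a linear map and ReLU. The three-station toy graph, features and identity weight matrix below are illustrative assumptions, not the paper's setup:

```python
import numpy as np

# Minimal hypothetical sketch of one graph convolution layer:
# propagate station features over a normalized adjacency matrix,
# then apply a linear map and ReLU. Toy values only.

def graph_conv(H, A, W):
    """One GCN layer: ReLU(D^{-1/2} (A + I) D^{-1/2} H W)."""
    A_hat = A + np.eye(A.shape[0])            # add self-loops
    d_inv_sqrt = np.diag(1.0 / np.sqrt(A_hat.sum(axis=1)))
    return np.maximum(d_inv_sqrt @ A_hat @ d_inv_sqrt @ H @ W, 0.0)

# Three monitoring stations in a line: 0 -- 1 -- 2.
A = np.array([[0.0, 1.0, 0.0],
              [1.0, 0.0, 1.0],
              [0.0, 1.0, 0.0]])
H = np.array([[1.0, 0.0],   # two features per station,
              [0.0, 1.0],   # e.g. scaled PM2.5 and NO2 readings
              [1.0, 1.0]])
W = np.eye(2)               # identity weights keep the example checkable
out = graph_conv(H, A, W)
```

Stacking such layers (with learned `W` per layer) lets each station's representation absorb pollution readings from progressively larger graph neighborhoods, which is how the spatial correlations across stations are modeled.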

35 citations