
Showing papers by "Kongu Engineering College published in 2022"


Journal ArticleDOI
TL;DR: In this article, the pectin extraction from pineapple peel (PP) waste employing the ultrasound-assisted extraction (UAE) technique was investigated, in which the optimum conditions for maximum PP extraction were determined using independent variables like ultrasonication time (15–30 min), liquid-to-solid (LS) ratio (10–20 mL/g), temperature (50–80 °C) and pH (1–2).

31 citations


Journal ArticleDOI
TL;DR: In this article, the crystallographic structure of (E)-2-(2,5-dimethoxybenzylidene)hydrazinecarbothioamide (DBH) was described, and further spectroscopic studies (IR, Raman, 1H, and 13C NMR) were performed.

11 citations


Book ChapterDOI
01 Jan 2022
TL;DR: In this paper, a distributed approach is employed to replace the centralized data environment: the data is processed based on feature selection at each data source, and from the resulting data representatives the informative data is collected into a single site.
Abstract: In the real-world environment, the term big data refers to huge volumes of complex structured and unstructured data that grow exponentially over time. It is particularly applicable to the overgrowth of biological data processed through cloud sources. Cloud computing refers to processing and storing massive volumes of data over the Internet instead of on a single computer's hard drive. Various types of services are offered to process the data. The cloud provides intelligent services such as security, performance, productivity, reliability, scalability, speed, and accurate access. The data is distributed across various places and comes in various sizes. Centralizing all the data into a single site increases the required processing speed and memory. The distributed approach employed in the present study replaces this centralized data environment; "distributed" refers to a collection of independent components. To access the data in a distributed way, the data is processed based on feature selection at each data source, and from the resulting data representatives the informative data is collected into a single site. Hybrid machine learning and deep learning models are used to detect diseases in biological data, improving computational efficiency and reducing memory usage. The hybrid distributed models show excellent performance in biological research.

10 citations


Journal ArticleDOI
TL;DR: In this article, the authors evaluated the performance of Marula oil (MAO) against mineral oil (MO) and showed that the antioxidant-rich MAO exhibits superior dielectric properties and restores faster than MO after instances of breakdown.
Abstract: The ever-increasing power demand has forced power sector companies to increase their generation, transmission and distribution, in turn increasing the number of power transformers in the system where necessary. Depending on the nature of operations, transformers are chosen as either dry-type or liquid-cooled, with liquid-cooled being the most common preference. Naphthenic oils have been the primary choice over the years, and alternative solutions in the form of natural esters are being researched extensively due to the environmental effects of Mineral Oil (MO). One such ester, Marula oil (MAO), is presented in this research. The experiment focuses on the measurement of electrical, physical, chemical and thermal characteristics as per ASTM and IEC standards. The results prove the superior dielectric performance of MAO over MO. The antioxidant-rich MAO exhibits superior dielectric properties and restores faster than MO after instances of breakdown. The antioxidants show stability at both low and high temperatures, and temperature causes minimal damage to them. In addition, the use of such a dielectric coolant minimizes winding damage inside the transformer tank. Thus, MAO can be considered an excellent coolant that improves the efficiency of the transformer without requiring additional cooling devices.

10 citations


DOI
01 Mar 2022
TL;DR: In this paper, the use of nano-silica and polypropylene fiber in problematic clayey soil to enhance shear strength and compaction characteristics was investigated; the diameter of the nano-particles used in the study was observed to be in the range of 10–20 nm.
Abstract: The current study presents a laboratory investigation on the use of nano-silica (0.2, 0.4, 0.8 and 1.0%) and polypropylene fiber (0.25, 0.50, 0.75 and 1.0%) in problematic clayey soil to enhance its shear strength and compaction characteristics. Transmission electron microscopy (TEM) analysis showed that the diameter of the nano-particles used in this study was in the range of 10–20 nm; the nano-particles are spherical in shape and amorphous in nature. Extensive laboratory tests, such as the standard Proctor compaction test and the unconfined compressive strength test, were conducted on untreated soil and on clayey soil treated with polypropylene fiber along with nano-silica. The outcomes showed that the addition of polypropylene fiber to poor soil increases the maximum dry density and reduces the optimum moisture content of the soil, whereas the addition of nano-silica to the clay soil reduces the maximum dry density and increases the optimum moisture content. The unconfined compressive strength of the clay soil increased with the addition of polypropylene fiber and nano-silica. The optimum dosages of polypropylene fiber and nano-silica added to the poor soil were 0.75% and 0.8%, respectively. The Young's modulus of the clay soil also increased with the addition of polypropylene fiber and nano-silica. Microscopic analysis confirmed that C–S–H gel was the main cementitious product and that the inclusion of nano-silica can contribute to a denser packing of soil particles.

8 citations


Journal ArticleDOI
TL;DR: In this paper, Graphite Particle (GP) and Carbon Cloth (CC) electrodes were employed as anodes to study simultaneous bio-energy generation and Chemical Oxygen Demand (COD) reduction using tannery effluent.

6 citations


Journal ArticleDOI
Abstract: © 2021 Wiley Periodicals LLC. This is the accepted manuscript version of an article which has been published in final form at https://doi.org/10.1002/app.51602

6 citations


Book ChapterDOI
01 Jan 2022
TL;DR: In this article, the authors proposed an algorithm, SLA-GTMax-Min, which schedules tasks efficiently onto a heterogeneous multi-cloud environment while satisfying the SLA and balancing makespan, gain cost, and penalty/violation cost.
Abstract: Cloud is a distributed heterogeneous computing paradigm that facilitates on-demand delivery of heterogeneous IT resources to customers over the Internet, based on their needs and on a pay-per-use basis. A service level agreement (SLA) specifies the service levels the customer expects from the cloud service provider (CSP) and the remedies or penalties if the CSP does not meet the agreed-on service levels. Before providing the requested services, the CSP and the customer negotiate and sign an SLA. The CSP earns money for the service provided to the customer on satisfying the agreed-on service levels; otherwise, the CSP pays a penalty cost to the customer for the SLA violation. Task scheduling minimizes task execution time and maximizes the resource usage rate. The scheduling objective is to improve quality of service (QoS) parameters such as resource usage, with minimum execution time and cost (without violating the SLA). The proposed algorithm SLA-GTMax-Min schedules tasks efficiently onto the heterogeneous multi-cloud environment while satisfying the SLA and balancing makespan, gain cost, and penalty/violation cost. SLA-GTMax-Min supports three levels of SLA corresponding to three types of service expected by customers: minimum task execution time, minimum gain cost, and a percentage trade-off between both. Makespan denotes the minimum task execution time; gain cost represents the minimum execution cost for completing the tasks. The algorithm incorporates the SLA gain cost for providing the service successfully and the SLA violation cost for providing it unsuccessfully. The performance of SLA-GTMax-Min and existing algorithms is measured on benchmark dataset values.
The experimental results of the SLA-GTMax-Min algorithm are compared against the existing scheduling algorithms SLA-MCT, Execution-MCT, Profit-MCT, SLA-Min-Min, Execution-Min-Min, and Profit-Min-Min. The evaluation metrics are makespan, cloud utilization ratio, gain cost (the cost earned by the CSP for successful completion of the tasks), and penalty cost (the cost the CSP pays to the customer for SLA violation). The experimental results clearly illustrate that SLA-GTMax-Min achieves a better balance among makespan, gain cost, and penalty cost than the existing algorithms.
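The exact rules of SLA-GTMax-Min are not given in the abstract; as a point of orientation, the classic Max-Min heuristic that schedulers in this family extend assigns, in each round, the task whose best-case completion time is largest to the machine that finishes it earliest. A minimal sketch with a hypothetical expected-time-to-compute (ETC) matrix, not the paper's algorithm or data:

```python
# Hedged sketch of the classic Max-Min scheduling heuristic.
# etc[i][j] = expected time to compute task i on machine j (hypothetical values).
def max_min_schedule(etc):
    n_tasks, n_machines = len(etc), len(etc[0])
    ready = [0.0] * n_machines           # machine-available times
    unassigned = set(range(n_tasks))
    schedule = {}                        # task -> machine
    while unassigned:
        # best (earliest) completion time of each unassigned task
        best = {}
        for t in unassigned:
            ct = [(ready[m] + etc[t][m], m) for m in range(n_machines)]
            best[t] = min(ct)
        # Max-Min rule: pick the task whose *minimum* completion time is largest
        task = max(best, key=lambda t: best[t][0])
        finish, machine = best[task]
        schedule[task] = machine
        ready[machine] = finish
        unassigned.remove(task)
    return schedule, max(ready)          # assignment and makespan
```

An SLA-aware variant would additionally weigh each candidate assignment by its gain cost and the penalty incurred if the agreed service level is missed.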

4 citations


Journal ArticleDOI
TL;DR: Lithium-ion batteries are widely used in consumer devices and electric vehicles due to their higher energy density, output, and extended cycle durability as discussed by the authors. But heavy metals, polymers, and toxic...
Abstract: Lithium-ion batteries are widely used in consumer devices and electric vehicles due to their higher energy density, output, and extended cycle durability. Heavy metals, polymers, and toxic...

4 citations


Book ChapterDOI
01 Jan 2022
TL;DR: The proposed trust community and optimized RPL are best for trust-based data transmission in a social Internet of things network, achieving 90% alive nodes and a hop-localization delay of 30 s.
Abstract: The Internet of things plays a major role in all applications used for automating the environment. Here, searching for devices in a large network is a tedious process; it becomes easier by defining social relationships among the devices in the network. The social Internet of things combines a real-world social network with the multiple devices of the Internet of things. In the social Internet of things (SIoT), a user can access information globally at any place based on a device's ID, but finding this ID in a large database is difficult. Several works have been carried out to locate device IDs in the SIoT; community- and effective-search-based algorithms are the conventional methods for device localization. The Routing Protocol for Low-power and Lossy Networks (RPL) is able to increase network lifetime but suffers from high packet loss, and in existing work localization and routing are performed separately. This problem is overcome by the proposed trust-based community nodes and optimized RPL in the social Internet of things network. Here, a community is formed from a combination of high-trust and low-trust nodes of similar taste. This trusted-node-based community formation helps to locate trustworthy nodes during the transmission process. Dragonfly optimization is used to improve RPL performance by selecting trusted nodes as the next hop in the routing process. Owing to the combination of trust community and optimized RPL, the lifetime and success rate of data transmission are increased by 90% compared to the existing preference-based device localization and the traditional RPL protocol. The whole process is realized with MATLAB R2018a in a Windows 10 environment.
Therefore, the proposed trust community and optimized RPL are best for trust-based data transmission in the social Internet of things network, achieving 90% alive nodes and a hop-localization delay of 30 s.
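The chapter's trust metric and the dragonfly optimizer are not reproduced in the abstract; as a toy illustration of the underlying idea, a next-hop choice in an RPL-like routing step can rank candidate neighbors by a weighted combination of trust score and remaining hop distance. All node IDs, trust values and weights below are hypothetical:

```python
# Toy illustration of trust-based next-hop selection in an RPL-like routing
# step. Trust scores, hop counts and weights are hypothetical, not the paper's.
def select_next_hop(neighbors, w_trust=0.7, w_dist=0.3):
    """neighbors: list of (node_id, trust in [0,1], hops_to_sink)."""
    max_hops = max(h for _, _, h in neighbors) or 1
    def score(n):
        _, trust, hops = n
        # reward high trust and fewer remaining hops to the sink
        return w_trust * trust + w_dist * (1 - hops / max_hops)
    return max(neighbors, key=score)[0]
```

In the paper, this ranking role is played by the dragonfly optimization over the trust-community structure rather than a fixed weighted sum.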

4 citations



Book ChapterDOI
01 Jan 2022
TL;DR: In this article, the authors analyze and review the outbreaks from the start of steam power generation to the modern days, their impact to generate new values to the society that translated the newer solutions to become new norms.
Abstract: A new era of epidemics started due to unhealthy practices, population density, environmental changes, migration and deforestation. The rapidity of the spread is primarily due to globalization, as we moved into an industrial revolution where everything is internet-connected. Over the past 30 years, the trend shows an increase in the number of epidemics challenging social well-being, the economy and, to some extent, national security. This translates into an impact on industrial growth and on the shared race toward the future while fighting the newest viruses. This paper analyzes and reviews the outbreaks from the start of revolutionary steam power generation to the modern day, and their impact in generating new values for society that turned newer solutions into new norms. The impact of the outbreaks on various key sectors and the measures that led us to overcome them are presented. We present the new normal, which would become normal in the near future: the post-COVID-19 scenario.

DOI
01 Jan 2022
TL;DR: In this article, a combination of an optimized CNN model with a deep learning neural network model provides promising results for an IoT-based smart farming system, in which machine learning algorithms predict solutions to agricultural problems.
Abstract: In agriculture, technological advancement is essential for better growth and long-run sustainability. The conventional way of farming is less efficient and time-consuming because of higher labor cost and high energy consumption. Hence, a forefront technology like the Internet of Things (IoT) would be an affordable and more precise solution for the betterment of agriculture. By deploying intelligent systems, the agricultural process can be automated and human intervention reduced. To increase agricultural yield, most of the industry is adopting automation methodologies in which agricultural data are collected and processed efficiently. To analyze the sensed data, machine learning approaches are used in Agro-IoT. Several machine learning algorithms are available for predicting solutions to agricultural problems; ML algorithms learn from the given data and make precise predictions. A combination of an optimized CNN model with a deep learning neural network model provides promising results for an IoT-based smart farming system.

Book ChapterDOI
01 Jan 2022
TL;DR: In this article, supervised learning technologies are incorporated into 5G-based narrowband IoT networks to effectively fight off computer security threats such as pervasive distributed denial-of-service (DDoS) assaults, fully unlocking the hidden potential of such IoT applications on a broad scale in the 5G period.
Abstract: 5G is generally regarded as a major improvement over 4G in terms of data encryption and network consumer authentication. 5G will be a physical transformation of our critical networks, with long-term implications. According to experts, the weakness in 5G protection is likely to be the communication between internet-connected devices. Anything from vehicles and factory production lines to traffic lights with incorporated internet-connected sensors is part of the Internet of Things (IoT). Lower latency, improved bandwidth, and the capacity to restrict network slices to unique use cases, all of which are inherent in 5G design requirements, will allow a variety of new mobile and remote applications that were previously impossible to implement with 4G technology. The IoT is a core commercial driver for the next-generation (5G) mobile networks, which will support a slew of revolutionary IoT applications such as smart cities, wearable sensors, and the numerous other IoT use cases specified in the 5G standards. Supervised learning technologies are incorporated into 5G-based narrowband IoT (NB-IoT) networks to effectively fight off computer security threats such as pervasive distributed denial-of-service (DDoS) assaults, in order to fully unlock the hidden potential of such IoT applications on a broad scale in the 5G period.

Journal ArticleDOI
TL;DR: In this article, an adhesive wear model based on a deterministic approach is developed to predict the galling behavior in a deep drawing process, which uses the surface topography, material properties and contact conditions to predict surface roughening of tool surfaces under perfectly plastic conditions.
Abstract: Galling is a recurring phenomenon in deep drawing processes which requires frequent maintenance of tools to improve product surface quality. Adhesive transfer of the softer material onto the hard tool surface results in sharp features which cause surface roughening of the tools and deterioration of deep-drawn products. In this article, an adhesive wear model based on a deterministic approach is developed to predict the galling behavior in a deep drawing process. The model uses the surface topography, material properties and contact conditions to predict the surface roughening of tool surfaces under perfectly plastic conditions. The adhesive transfer of material is modelled by the growth of the asperities, based on their geometry, in the height and radial directions while preserving the original shape and volume consistency. The results of the multi-asperity models show the growth of the transfer layer and its dependence on load, sliding cycle, sliding distance, and the affinity of the materials. The results show the influence of these parameters and the model's applicability to deep drawing process conditions. The simulated results show an 85% level of confidence in comparison with experiments from the literature for the prediction of the surface evolution due to the galling mechanism.
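The deterministic per-asperity equations are not reproduced in the abstract; as a point of orientation, the simplest classical relation for adhesive wear volume is Archard's law, V = k·F·s/H, which multi-asperity models of this kind refine asperity by asperity. A minimal sketch with hypothetical values, not the paper's model:

```python
# Archard's adhesive wear law: V = k * F * s / H
# (wear volume V from wear coefficient k, normal load F, sliding distance s,
# and hardness H of the softer material). Inputs below are hypothetical.
def archard_wear_volume(k, load_n, sliding_m, hardness_pa):
    return k * load_n * sliding_m / hardness_pa  # volume in m^3

# e.g. k = 1e-4, 100 N load, 50 m of sliding, 2 GPa sheet hardness
v = archard_wear_volume(1e-4, 100.0, 50.0, 2e9)
```

The deterministic model in the paper replaces the single lumped coefficient k with explicit surface topography and per-asperity growth rules.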

Proceedings ArticleDOI
01 Jan 2022
TL;DR: In this paper, the authors presented a simplified approach to the 2019-nCoV outbreak in Malaysia, based on a simple mathematical model and limited reference data, considering that recovered individuals may become infected again.
Abstract: The 2019 novel coronavirus (2019-nCoV) outbreak has been declared a pandemic by the World Health Organization. This chapter presents a simplified approach to the 2019-nCoV outbreak in Malaysia, based on a simple mathematical model and limited reference data. The model predictions are based on the actual data on the date of confirmation, excluding deaths, and consider that recovered individuals may become infected again. The 14-day incubation characteristic pronounced by the CDC is used in the computations to improve the prediction. The approach covers the four stages of recovery characteristic of any pandemic, from cluster segregation and contact tracing towards flattening the growth curve. The model from China was taken as a reference, and the Malaysian recovery phase was analysed and compared with the measures in place. The computational approach for the available dataset is presented, and the similarity measure is a good reference point for handling a pandemic of this size in the future.
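The chapter's exact equations are not given in the abstract; one common way to capture the stated assumption that recovered individuals can be infected again is a discrete-time SIRS-style update. A minimal sketch: the recovery rate 1/14 loosely mirrors the 14-day figure mentioned above, while beta (transmission) and xi (immunity loss) are illustrative values only, not fitted to Malaysian data:

```python
# Discrete-time SIRS-style sketch: recovered individuals can be reinfected.
# beta, gamma and xi are hypothetical parameters for illustration.
def sirs_step(s, i, r, beta=0.3, gamma=1 / 14, xi=0.01):
    n = s + i + r
    new_inf = beta * s * i / n      # new infections this day
    new_rec = gamma * i             # recoveries (~14-day mean duration)
    new_sus = xi * r                # recovered who become susceptible again
    return s - new_inf + new_sus, i + new_inf - new_rec, r + new_rec - new_sus

def simulate(days, s0, i0, r0=0.0):
    s, i, r = float(s0), float(i0), float(r0)
    for _ in range(days):
        s, i, r = sirs_step(s, i, r)
    return s, i, r
```

The update conserves total population, so the three compartments always sum to the initial total.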

Book ChapterDOI
01 Jan 2022
TL;DR: The proposed regression-based data mining and image analysis help to forecast the flood level in an area accurately, and broadcasting the prediction saves precious lives.
Abstract: Natural disasters are unpredictable, but the damage they cause is severe: they are hazardous to both humans and their property. Among hazards such as earthquakes, eruptions and floods, flooding is the most predictable, but it requires proper learning to predict. Floods occur due to the overflow of water from dams and rivers caused by heavy rainfall, so with proper learning based on weather conditions and previous flood data, the possible flood range and area can be detected. Such flood prediction is performed on a real-time data set collected from the Columbia province, using a data mining approach on the collected information to forecast the flood level. But this alone is not sufficient in real time; hence, image-based flood prediction is proposed to analyse the flood level in a particular region using a satellite image of the area. Although both techniques offer good prediction individually, each suffers when predicting nearby flood areas. Hence, a combined approach of image processing and data mining is proposed to forecast the flood level. For data mining, a regression learning approach is used to forecast future flood levels. It is combined with image processing of the particular area to accurately estimate the flood level by extracting the depth of water in the area. Finally, the predicted level is broadcast to the people through social media for immediate action to save their lives. The proposed regression-based data mining and image analysis help to forecast the flood level in the area accurately, and broadcasting the prediction saves precious lives.
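The regression step can be illustrated with a toy ordinary-least-squares fit of flood level against rainfall; the (rainfall, level) pairs and the single-feature setup below are hypothetical, not the Columbia data set or the paper's feature set:

```python
# Toy least-squares regression: predict flood level from rainfall.
# Data points are hypothetical, chosen only to illustrate the fit.
def fit_line(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx        # slope and intercept

rain = [10.0, 20.0, 30.0, 40.0]          # rainfall in mm
level = [1.2, 2.1, 3.0, 3.9]             # river level in metres
m, b = fit_line(rain, level)
predicted = m * 50.0 + b                 # forecast level for 50 mm of rainfall
```

In the proposed system this numeric forecast would then be cross-checked against the water depth extracted from the satellite image before broadcasting.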


DOI
01 Jan 2022
TL;DR: In this article, three signalized intersections in Erode are taken into consideration and the junctions are redesigned by varying the red and green times of the signals; the signals are then modelled using the VISSIM computer simulation model.
Abstract: Traffic flow at signalized intersections plays a major part in the formation of delays in the urban street network. Reducing delays at signalized intersections in turn helps to reduce travelling time and pollution. Most of the existing signals in urban streets have not been updated according to the increase or decrease of traffic flow at each arm of these intersections. In this paper, three signalized intersections in Erode are taken into consideration, and the junctions are redesigned by varying the red time and green time of those signals. The signals are then modelled using the VISSIM computer simulation model, which models vehicular traffic effectively and is largely used in the context of Indian heterogeneous driving conditions. By changing the red time and green time of the signal, the average waiting time of the vehicles is expected to be reduced by 50%.
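The VISSIM runs themselves are not reproduced here; the direction of the effect (more effective green means less average delay) can be sanity-checked by hand with Webster's two-term delay approximation, d = c(1-λ)²/(2(1-λx)) + x²/(2q(1-x)), where c is the cycle time, λ the effective green ratio, x the degree of saturation and q the arrival flow. A hedged sketch with hypothetical inputs, not the Erode intersection data:

```python
# Webster's two-term average-delay approximation for a signalized approach:
#   d = c(1-lam)^2 / (2(1 - lam*x)) + x^2 / (2q(1-x))
# c: cycle time (s), lam: effective green ratio g/c,
# q: arrival flow (veh/s), s: saturation flow (veh/s). Inputs are hypothetical.
def webster_delay(c, lam, q, s):
    x = q / (lam * s)                    # degree of saturation
    uniform = c * (1 - lam) ** 2 / (2 * (1 - lam * x))
    random_ = x ** 2 / (2 * q * (1 - x))
    return uniform + random_             # average delay per vehicle (s)
```

For a fixed 90 s cycle and the same arrival flow, raising the green ratio from 0.4 to 0.5 lowers both the uniform and the overflow term, matching the expected direction of the redesign.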