
Showing papers in "Indian Journal of Science and Technology" in 2017


Journal ArticleDOI
TL;DR: In this paper, a lead-based perovskite solar cell model with the flexible architecture of glass/FTO/PCBM/CH3NH3PbI3/PEDOT:PSS/Ag is designed and analysed.
Abstract: Objectives: Perovskite photovoltaics are becoming a distinctly predominant alternative to conventional solar cells, achieving a maximum efficiency of 22.1%. This work concerns the design and analysis of a lead-based perovskite solar cell model with the flexible architecture of glass/FTO/PCBM/CH3NH3PbI3/PEDOT:PSS/Ag. Method/Analysis: The analysis of the solar cell architecture is done using the Solar Cell Capacitance Simulator (SCAPS). It is a computer-based software tool, well adapted for the analysis of homojunction, heterojunction, multi-junction and Schottky-barrier photovoltaic devices. The tool simulates devices based on the Poisson equation and the continuity equations for electrons and holes. For this model, it is used to optimize various parameters such as the thickness and defect density of the absorber layer and the doping concentrations (ND and NA) of the Electron Transport Material (ETM) and Hole Transport Material (HTM). Findings: The thickness of CH3NH3PbI3 was varied from 0.1 μm to 0.6 μm and the best results are observed at 0.3 μm. The total defect density of the absorber was varied from 10^13 cm^-3 to 10^18 cm^-3 and the minimum defect density of the absorber layer is predicted as 10^14 cm^-3. The ND or NA of the HTM and ETM was varied from 10^14 to 10^19 cm^-3, and the PCE is maximum when ND and NA are both kept at 10^19 cm^-3. By tuning the thickness of the absorber layer and the doping concentrations, the predicted results are as follows: maximum power conversion efficiency (PCE) 31.77%, short-circuit current density (Jsc) 25.60 mA/cm^2, open-circuit voltage (Voc) 1.52 V, fill factor (FF) 81.58%. Improvements: With this proposed simulated model, the efficiency of the perovskite solar cell reaches 31.77%, an improvement of 4-5% over previous models, obtained by optimizing a few material parameters. Hence this simulation work provides handy information for fabricating perovskite solar cells, helping to choose material parameters reasonably and achieve high efficiency.
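The reported figures of merit are mutually consistent under the standard relation PCE = Jsc × Voc × FF / Pin. A minimal sanity-check sketch, assuming the usual AM1.5G incident power of 100 mW/cm^2 (the abstract does not state Pin):

```python
# Sanity-check of the reported photovoltaic figures of merit.
# Standard relation: PCE = (Jsc * Voc * FF) / Pin, with Pin assumed to be
# 100 mW/cm^2 (AM1.5G); the paper does not state Pin explicitly.

jsc = 25.60    # short-circuit current density, mA/cm^2
voc = 1.52     # open-circuit voltage, V
ff = 0.8158    # fill factor (81.58%)
p_in = 100.0   # incident power density, mW/cm^2 (assumed AM1.5G)

pce = jsc * voc * ff / p_in * 100  # percent
print(f"PCE = {pce:.2f}%")         # ~31.75%, matching the reported 31.77%
```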

96 citations


Journal ArticleDOI
TL;DR: The polarity of tweets, i.e., whether each tweet is positive, negative or neutral, is provided; polarity confidence and subjectivity confidence are found; and the classification accuracy of tweets is evaluated using the Naive Bayes and SVM classifiers.
Abstract: Background: Sarcasm detection in Twitter is a very important task, as it helps in the analysis of tweets. With the help of sarcasm detection, companies can analyze users' feelings about their products and thereby improve product quality. Methods: TextBlob is used for preprocessing of data. TextBlob is a package installed in the Natural Language Toolkit. The preprocessing steps include tokenization, part-of-speech tagging and parsing. The stop words are removed using Python programming; the stop words corpus distributed with NLTK, which consists of 2,400 stop words, has been used. RapidMiner is used for finding the polarity and subjectivity of tweets. TextBlob is used for finding the polarity and subjectivity confidence. Weka is used for calculating the classification accuracy of tweets with the Naive Bayes and SVM classifiers. Findings: The paper provides the polarity of tweets, i.e., whether the tweet is positive, negative or neutral. Polarity confidence and subjectivity confidence are also found, and the classification accuracy of tweets is found using the Naive Bayes and SVM classifiers. Applications: Sarcasm detection can be helpful in analyzing the exact opinion of the user about a certain thing.
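A minimal sketch of the TextBlob polarity/subjectivity step described above (the example tweet is hypothetical; TextBlob's default sentiment analyzer returns polarity in [-1, 1] and subjectivity in [0, 1]):

```python
# Sketch of the polarity/subjectivity step, using TextBlob
# (pip install textblob). The example tweet is hypothetical.
from textblob import TextBlob

tweet = "Oh great, another Monday. I just love waiting in traffic."
blob = TextBlob(tweet)

print(blob.sentiment.polarity)      # -1.0 (negative) .. +1.0 (positive)
print(blob.sentiment.subjectivity)  #  0.0 (objective) .. 1.0 (subjective)

# Label by the sign of polarity, mirroring the positive/negative/neutral split
label = ("positive" if blob.sentiment.polarity > 0
         else "negative" if blob.sentiment.polarity < 0 else "neutral")
print(label)
```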

46 citations


Journal ArticleDOI
TL;DR: In this article, the authors examined the emission characteristics of mustard oil methyl ester at various injection timings and found that advancing the fuel injection by 6° bTDC increased NOx emission for mustard oil methyl ester and diesel by 11.61% and 16.12% respectively.
Abstract: Objectives: The intention of this work is to examine the emission characteristics of mustard oil methyl ester at various injection timings. Methods/Statistical Analysis: The work was done on a vertical-cylinder, air-cooled diesel engine running at atmospheric conditions and at 1500 rpm. Findings: Results indicated that by advancing the fuel injection by 6° bTDC, NOx emission for mustard oil methyl ester and diesel is increased by 11.61% and 16.12% respectively. In addition, HC emission rises by 16.9% and 12.59% for MO and diesel. Furthermore, CO increases by 6.94% and 10.42% for MO and diesel. Application/Improvements: This experimental study evidently indicates that altering the fuel injection timing increases the various emissions associated with the CI engine for both fuels, showing that the fuels should be injected at the appropriate timing (24° bTDC) to ensure uniform mixing, improved combustion and lower emissions.

44 citations


Journal ArticleDOI
TL;DR: This paper concisely covers some of the existing techniques to design an impedance matching network that can be used to solve the impedance matching problem encountered during antenna design.
Abstract: Objective: To present a concise and comprehensive summary of various impedance matching techniques for microstrip patch antennas. Method: Designing an impedance matching network is a central issue for optimum performance in every part of an RF system, such as the transceiver, amplifier and antenna, to ensure maximum power transfer. Various design formulae to calculate the input impedance of a patch antenna and techniques to design a matching network should be known to the RF designer. Finding: In this paper, various impedance matching techniques along with their design equations are presented, utilizing quarter-wave transformers, tapered lines, open or short stubs, lumped elements, etc. Methods to calculate the input impedance for various antenna structures such as rectangular, circular and triangular patch antennas are described. Application: This paper concisely covers some of the existing techniques to design an impedance matching network that can be used to solve the impedance matching problem encountered during antenna design.
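As an illustration of one technique from the survey, a quarter-wave transformer matches a real load impedance ZL to a line impedance Z0 with a λ/4 section of impedance Z1 = sqrt(Z0·ZL). A small sketch with hypothetical values (the paper's own design equations are not reproduced here):

```python
# Quarter-wave transformer sketch: matching a hypothetical 100-ohm patch
# edge impedance to a 50-ohm feed line with a lambda/4 section.
import math

z0 = 50.0      # feed line impedance, ohms
zl = 100.0     # antenna input impedance, ohms (assumed purely resistive)
f = 2.4e9      # design frequency, Hz (hypothetical)
eps_eff = 2.0  # effective permittivity of the line (hypothetical)

z1 = math.sqrt(z0 * zl)                      # ~70.7 ohms
length = 3e8 / (f * math.sqrt(eps_eff)) / 4  # guided quarter wavelength, m
print(f"Z1 = {z1:.1f} ohm, section length = {length*1000:.1f} mm")
```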

38 citations


Journal ArticleDOI
TL;DR: The experimental results show that IKPDS reduces the classification completion time compared with kNN and KPDS while preserving the same classification accuracy and the same error rate for different types of attacks.
Abstract: Objective and Background: To adapt two fast kNN classification algorithms, i.e., Indexed Partial Distance Search k-Nearest Neighbor (IKPDS) and Partial Distance Search k-Nearest Neighbor (KPDS), and compare them with traditional kNN classification for Network Intrusion Detection. Methods/Statistical Analysis: The NSL-KDD data set is used to evaluate kNN classification, KPDS and IKPDS with a 10-fold cross validation test. The experimental results show that IKPDS reduces the classification completion time compared with kNN and KPDS while preserving the same classification accuracy and the same error rate for different types of attacks. A novel method is proposed for classifying unknown patterns as malicious or legitimate using the IKPDS algorithm. Findings: The efficiency of these algorithms was tested with a sample of 12,597 instances and verified against the actual class labels. The results show 99.6% accuracy for the proposed method. Applications/Improvements: A deeper analysis can be performed on DoS and Probe attacks, as they exhibit similar characteristics, and feature selection techniques may also be implemented in order to improve the accuracy and reduce the computational time.
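A minimal sketch of the partial-distance-search idea underlying KPDS/IKPDS: while accumulating the squared distance feature by feature, a training point is abandoned as soon as the partial sum exceeds the current k-th best distance. This is a generic illustration, not the authors' exact indexed variant:

```python
# Partial Distance Search kNN sketch (generic illustration).
import heapq

def pds_knn(query, train, labels, k=3):
    best = []  # max-heap of (-dist, label), size <= k
    for x, y in zip(train, labels):
        bound = -best[0][0] if len(best) == k else float("inf")
        d = 0.0
        for qi, xi in zip(query, x):
            d += (qi - xi) ** 2
            if d >= bound:            # partial distance already too large:
                break                 # abandon this training point early
        else:                         # loop completed: candidate survives
            if len(best) < k:
                heapq.heappush(best, (-d, y))
            else:
                heapq.heappushpop(best, (-d, y))
    votes = [y for _, y in best]
    return max(set(votes), key=votes.count)   # majority vote

# Tiny hypothetical example
train = [(0, 0), (1, 1), (9, 9), (10, 10)]
labels = ["normal", "normal", "attack", "attack"]
print(pds_knn((8.5, 9.2), train, labels, k=3))  # -> "attack"
```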

36 citations


Journal ArticleDOI
TL;DR: It can be elucidated that despite the various issues that the usage of electronic payment systems poses, these systems are identified as a positive step towards the economic development of a nation.
Abstract: Objectives: This paper is aimed at investigating and increasing awareness of various concepts related to Electronic Payment Systems (EPS), including their advantages, challenges and security considerations. The proposed study also evaluates the adoption of e-payment systems and the resulting impact on the economy of a nation. Methods/Statistical Analysis: In this paper, a comprehensive survey of all aspects of electronic payment was conducted after analysis of several research studies on online payment systems. The most recent references and information have been explored in order to gain significant information about electronic payment systems. Findings: From the study conducted, it can be elucidated that despite the various issues that the usage of electronic payment systems poses, these systems are identified as a positive step towards the economic development of a nation. Nevertheless, their full potential can be realized only by raising awareness among people. Applications/Improvements: With the advancement in technology and the popularity of the Internet, the practice of making online transactions is bound to gain momentum. In the future, the payment modes currently used and supported shall see a declining trend owing to the numerous benefits offered by electronic payment systems.

35 citations


Journal ArticleDOI
TL;DR: Ammonium-based Ionic Liquids (AILs) have also recently been applied as potential gas hydrate inhibitors for hydrate mitigation, as discussed by the authors, whose review shows that a higher concentration of ACs provides more effective thermodynamic inhibition.
Abstract: Objectives: In natural gas transmission pipelines, hydrate formation is a major flow assurance challenge. This communication presents a brief review focused on the potentially applicable area of Ammonium-based Compounds (ACs) for gas hydrate inhibition. Methods/Statistical Analysis: To avoid gas hydrate formation, the oil and gas industry spends millions of dollars annually without any permanent solution. In most cases, the injection of chemical inhibitors is established as the most effective route for gas hydrate mitigation. Quaternary ammonium salts have been widely applied as commercial hydrate inhibitors. Ammonium-based Ionic Liquids (AILs) have also recently been applied as potential gas hydrate inhibitors for hydrate mitigation. Findings: The systematic review of ACs shows that a higher concentration of ACs provides more effective thermodynamic inhibition. The alkyl chain length of ACs plays an important part: as the chain length increases, their effectiveness as inhibitors decreases. Usually, the imidazolium-based ILs employed as gas hydrate inhibitors contain fluorinated anions, which are categorized as toxic compounds. ACs hold a strong tendency to act as dual-function chemical inhibitors and a possible replacement for the widely used imidazolium-based ILs, due to their cheaper sources, easy synthesis and fairly better environmental friendliness. Application/Improvements: This study highlights the recent advancements achieved in the field of ACs, emphasizing their vigorous prospects as potential gas hydrate inhibitors, and provides an avenue to address flow assurance problems caused by gas hydrates.

34 citations


Journal ArticleDOI
TL;DR: This study facilitates identification of an efficient LBA for optimized resource use, minimum response time, maximum throughput and avoidance of overload in a cloud computing environment.
Abstract: Objectives: To highlight the issues and challenges of Load Balancing Algorithms (LBAs) for the cloud computing environment. In this study the authors propose an LBA for a cloud computing server. Methods/Statistical Analysis: Numerous LBAs are compared with respect to their objectives, merits, demerits and challenges, in terms of parameters such as performance, response time, scalability, overhead and many more. This study facilitates identification of an efficient LBA for optimized resource use, minimum response time, maximum throughput and avoidance of overload. Findings: In most instances, the load balancer provides service continuity as well as handling of extra traffic. Consequently, effective load balancing algorithms are required to achieve inexpensive resource usage by provisioning resources to cloud users. The proposed algorithm aims to improve the overhead and the load imbalance factor of the system. Application: Load Balancing (LB) is a vital feature of the cloud computing environment. An economical LB algorithm assures cost-effective resource usage by provisioning resources to cloud users on an on-demand schedule in a pay-as-you-go manner. LB may also assist in prioritizing users through suitable scheduling requirements; it is a crucial part of cloud computing, maintaining the workload across various systems over the network to give minimum processing time and minimum response time and to avoid overload.

33 citations


Journal ArticleDOI
TL;DR: Zinc ferrite particles were synthesized using the co-precipitation method; it was observed that the immobilized enzyme was stable at 60˚C while retaining its activity for up to 3 recycles, and the immobilized enzyme showed a 53% hydrolysis yield on pretreated sunn hemp biomass.
Abstract: Objectives: To synthesize ferrite nanoparticles; to characterize the particles by FTIR and XRD studies; and to perform enzymatic saccharification of ultrasound-assisted alkaline-pretreated biomass using free and immobilized enzyme. Methods/Statistical Analysis: In the present work, zinc ferrite particles were synthesized using the co-precipitation method. Cellulase enzymes were immobilized on covalently activated ferrite nanoparticles via glutaraldehyde as a crosslinker. Biochemical characterization of free and immobilized enzyme activity was performed with CMC as a substrate. The efficiency of the immobilized enzyme was evaluated based on its binding efficiency on the nanoparticles, thermostability and reusability. Enzymatic hydrolysis was performed on ultrasound-assisted alkaline-pretreated sunn hemp biomass using free and immobilized enzymes. Findings: Around 74% binding was achieved at 4 mg/ml of ferrite loading for an enzyme concentration of 20 units. A comparative study of the effects of pH and temperature was done on both free and immobilized enzyme, and it was observed that the immobilized enzyme has maximum activity at pH 5 and a temperature of 60˚C. Also, the immobilized enzyme was stable at 60˚C while retaining its activity for up to 3 recycles. The immobilized enzyme showed a 53% hydrolysis yield on pretreated sunn hemp biomass. Application/Improvements: Research on the interaction between zinc ferrite and cellulase in immobilization, and on the recovery of enzymes, can establish an efficient approach for bioethanol production at industrial scale. The lab-scale process can be scaled up for use at pilot and industrial scales.

30 citations


Journal ArticleDOI
TL;DR: In this article, alccofine is shown to play a significant role in improving the mechanical and transport properties of low-calcium fly ash based geopolymer concrete at ambient conditions, providing an alternative to heat-cured GPC.
Abstract: Objective: To develop geopolymer concrete (GPC) using 100% industrial waste as a binder at ambient temperature. Methods/Analysis: Low-calcium fly ash based GPC was prepared with different percentages (0%, 5% and 10%) of alccofine and fly ash contents (350, 370 and 400 kg/m3) to examine the fresh and hardened properties of alccofine-activated GPC, such as density, workability, water absorption, permeable voids, water permeability, and compressive and split tensile strengths, using international standards. Nine mixes were prepared and investigated by X-ray diffraction (XRD) and scanning electron microscopy (SEM) for the determination of their phase, composition and microstructural properties. Findings: The results show that alccofine enhances the mechanical properties and significantly reduces the transport properties of GPC. Furthermore, GPC specimens prepared with alccofine appear to improve the densification process. The investigations reveal that a higher percentage of alccofine and fly ash content has a significant effect on the polymerisation of the GPC, which in turn improves the strength and microstructural features. A maximum compressive strength of 42 MPa is achieved with 10% alccofine without elevated heat curing. Novelty/Improvement: Alccofine plays a significant role in improving the mechanical and transport properties of low-calcium fly ash based geopolymer concrete at ambient conditions, providing an alternative to heat-cured GPC.

30 citations


Journal ArticleDOI
TL;DR: The results from the comparative study show that the ANN model is a superior method for load forecasting due to its ability to handle the load data; it has a lower MAPE and RMSE of 0.0285 and 1.124 respectively, a far better result than the regression model.
Abstract: Objectives: Load forecasting is the operation of predicting future load demand in electrical systems using previous or historical data. This paper reports a study on a medium-term load forecast carried out with a load demand data set obtained from the Covenant University campus in Nigeria, and presents a comparative study of the two methods used. Methods/Statistical analysis: Regression analysis and Artificial Neural Network (ANN) models were used to show the feasibility of generating an accurate medium-term load forecast for the case study despite the peculiarity of its load data. The statistical evaluation methods used are the Mean Absolute Percentage Error (MAPE) and the Root Mean Square Error (RMSE). Findings: The results from the comparative study show that the ANN model is a superior method for load forecasting due to its ability to handle the load data; it has a lower MAPE and RMSE of 0.0285 and 1.124 respectively, which is a far better result than the regression model. Application/Improvements: This result provides a benchmark for power system planning and future studies in this research domain.
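For reference, the two evaluation metrics used in the comparison can be computed as below; the load values are hypothetical placeholders, and MAPE is returned as a fraction to match the reported 0.0285:

```python
# MAPE and RMSE, the two forecast-evaluation metrics named in the abstract.
import numpy as np

def mape(actual, forecast):
    actual, forecast = np.asarray(actual), np.asarray(forecast)
    return np.mean(np.abs((actual - forecast) / actual))  # fraction, not %

def rmse(actual, forecast):
    actual, forecast = np.asarray(actual), np.asarray(forecast)
    return np.sqrt(np.mean((actual - forecast) ** 2))

actual = [120.0, 135.0, 128.0, 140.0]  # load series, hypothetical units
ann = [118.5, 136.0, 127.2, 141.1]     # hypothetical ANN forecast
print(mape(actual, ann), rmse(actual, ann))
```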

Journal ArticleDOI
TL;DR: This paper reviews common SCADA implementation approaches utilized in previous related works and examines security vulnerabilities and loopholes in such systems, with a view to developing and testing security solutions that protect SCADA systems.
Abstract: Objectives: SCADA systems are turning into the central nervous system of the electric power system's critical infrastructure. With the increasing availability and use of computer networks and the Internet, as well as the convenience of cloud computing, SCADA systems have increasingly adopted Internet-of-Things technologies to significantly reduce infrastructure costs and increase ease of maintenance and integration. However, SCADA systems are obvious targets for cyber attacks that seek to disrupt the critical infrastructure systems that are governed by a SCADA system. Methods/Statistical Analysis: Cyber attacks exploit SCADA security vulnerabilities in order to take control of or disrupt the normal operation of the system. Analyzing security vulnerabilities and loopholes is critical in developing security solutions for such systems. It is equally important to test the security solutions developed to protect SCADA systems. Findings: Experimenting on live systems is generally inadvisable and impractical, as this may render the system unstable. Such a situation calls for an experimental setup equivalent or quite close to the real scenario for developing and testing security solutions. Application/Improvements: This paper reviews common SCADA implementation approaches utilized in previous related works.

Journal ArticleDOI
TL;DR: A comparison of MICE using various methods to deal with missing values is provided, and it is confirmed that the power of Multiple Imputation lies in obtaining smaller standard errors and narrower confidence intervals.
Abstract: Missing data is relatively common in all types of research and can reduce statistical power and bias results if not handled properly. Multivariate Imputation by Chained Equations (MICE) has emerged as one of the principled methods of addressing missing data. This paper provides a comparison of MICE using various methods to deal with missing values. The chained equations approach is very flexible and can handle various types of data, such as continuous or binary, as well as various missing data patterns. Objectives: To discuss commonly used techniques for handling missing data and common issues that can arise when these techniques are used. In particular, we focus on different approaches within one of the most popular methods, Multiple Imputation using Chained Equations (MICE). Methods/Statistical Analysis: Multivariate Imputation by Chained Equations is a statistical method for missing value imputation. The paper focuses on Multiple Imputation using Predictive Mean Matching, Multiple Random Forest Regression Imputation, Multiple Bayesian Regression Imputation, Multiple Linear Regression using Non-Bayesian Imputation, Multiple Classification and Regression Tree (CART) Imputation, and Multiple Linear Regression with Bootstrap Imputation, which provide a general framework for analyzing data with missing values. Findings: We have chosen to explore Multiple Imputation using MICE through an examination of a sample data set. Our analysis confirms that the power of Multiple Imputation lies in obtaining smaller standard errors and narrower confidence intervals. The smaller the standard error and the narrower the confidence interval, the more accurate the predicted value, thus minimizing bias and inefficiency considerably. In our results from the sample data set, it has been observed that the standard error and mean confidence interval length are least in the case of Multiple Imputation combined with Bayesian Regression. It is also evident from the density plot that the imputed values are closer to the observed values in this method than in other methods. Even in the case of random forest, the results are quite close to Bayesian Regression. Application/Improvements: These Multiple Imputation methods can further be combined with machine learning and Genetic Algorithms on real data sets to further reduce bias and inefficiency.
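A sketch of chained-equations imputation in the spirit of MICE, using scikit-learn's IterativeImputer (a related implementation that must be enabled explicitly; it produces one imputation per run, so repeating with different seeds approximates multiple imputation). The default BayesianRidge estimator parallels the paper's Bayesian regression variant, and swapping in a random forest parallels the random-forest variant; the data are hypothetical:

```python
# MICE-style chained-equations imputation via scikit-learn.
import numpy as np
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer
from sklearn.ensemble import RandomForestRegressor

X = np.array([[7.0, 2.0, 3.0],
              [4.0, np.nan, 6.0],
              [10.0, 5.0, np.nan],
              [8.0, 8.0, 9.0]])

# "Multiple" imputation: run the chained equations m times with different
# seeds (sampling from the posterior) and pool the completed data sets.
imputations = [
    IterativeImputer(random_state=seed, sample_posterior=True).fit_transform(X)
    for seed in range(5)
]
print(np.mean(imputations, axis=0))  # pooled (averaged) completed data

# Random-forest flavour (single run shown):
rf_imp = IterativeImputer(estimator=RandomForestRegressor(n_estimators=50,
                                                          random_state=0))
print(rf_imp.fit_transform(X))
```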

Journal ArticleDOI
TL;DR: The advantages of real-time data, i.e., data generated by IoT systems, were identified to arrive at the impact on organizations through deductive reasoning, helping the system to be agile and enabling corrections on the go where required.
Abstract: Objectives: To study the impact of data generated from the Internet of Things (IoT) on demand forecasting. Methods/Statistical Analysis: An exploratory study of the impact of IoT data on demand forecasting was conducted. Preliminary information on IoT and demand forecasting, including the different types of forecasting and data collection methods, was gathered through various available sources. Research papers, journals, Internet sites and books were used to collate the relevant content on the subject. Analysis of almost all the relevant examples was completed as a part of this study. The advantages of real-time data, i.e., data generated by IoT systems, were identified to arrive at the impact on organizations through deductive reasoning. Findings: Industrial revolution 4.0 has begun, and IoT systems will play a vital role in it. The population of devices that can transmit data over the network will increase exponentially. Data from such smart devices will get collated, analysed and used in various forecasting models. Since managerial decision-making is enabled by the forecast, efforts are being made to align the forecasting model to respond proactively to market dynamics. Application: The IoT data gathered is used in different forecasting models to arrive at the most accurate forecast. The accuracy of the forecast is verified by the calculated error value, and the relevance of the forecasting model is thereby established. This helps the system to be agile and enables corrections on the go where required.

Journal ArticleDOI
TL;DR: In this paper, the authors collected all the information about major floods in the history of Malaysia and gathered the facts about the official flood loss estimates for selected major flood events from 1967 to 2012.
Abstract: Background/Objectives: To gather information on the natural disasters that occurred worldwide from 2004 to 2013 (on average) and in 2014, with emphasis on the disaster type that has most severely affected the continent of Asia, and particularly Malaysia. Methods: This paper collects all the information about major floods in the history of Malaysia and gathers the facts about the official flood loss estimates for selected major flood events from 1967 to 2012. It further provides information on the general causes and effects of floods and explains the flood mitigation measures being used in this region. Additionally, it explains the allocation of funds for flood mitigation measures by the Malaysian government under the Malaysia Plan (1971 to 2020) and outlines the responsibility of the government agencies accountable for mitigation measures during flooding conditions. Findings: Experience from past floods demonstrates that a common hazard posing a risk of death or serious injury to people is the instability of vehicles in floodwaters. Therefore, the stability of vehicles during urban flood events has aroused recent interest from the Environment Agency in the United Kingdom and other flood management authorities around the world. However, it is believed that there is still a need for an integrated smart alarm system that can be used in flood-prone regions to minimize vehicle-related fatalities. Application: To provide researchers, government agencies, decision makers, etc., with an overview of the most notable disaster type that has severely affected the Malaysian region.

Journal ArticleDOI
TL;DR: A Load Balancing Decision Algorithm (LBDA) is proposed to manage and balance the load between the virtual machines in a datacenter while reducing the completion time (makespan) and response time, and is shown to be more efficient than the existing algorithms.
Abstract: Objectives: Load balancing is an important factor in the performance and stability of a system; therefore, an algorithm is needed to enhance system performance by balancing the workload among VMs. Methods: Task scheduling algorithms are used to achieve load balancing and QoS. The proposed Load Balancing Decision Algorithm (LBDA) manages and balances the load between the virtual machines in a datacenter while reducing the completion time (makespan) and response time. Findings: The mechanism of LBDA is based on three stages: first, it calculates the VM capacity and VM load to categorize the VMs' states (underloaded, balanced, high-balanced, overloaded); second, it calculates the time required to execute the task on each VM; finally, it makes a decision to distribute the tasks among the VMs based on VM state and the required task time. Improvements: We compared the results of our proposed LBDA with Max-Min, Shortest Job First and Round Robin. The results showed that the proposed LBDA is more efficient than the existing algorithms.
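A minimal sketch of the three-stage idea as described: classify each VM's state from its load/capacity ratio, estimate the task's completion time per VM, then assign. The thresholds and the MIPS-based capacity model are assumptions, not the authors' exact formulation:

```python
# LBDA-style load balancing sketch (thresholds are assumed, not the paper's).
from dataclasses import dataclass

@dataclass
class VM:
    mips: float        # processing capacity (million instructions/second)
    load: float = 0.0  # currently assigned work, in MI

    def state(self):
        ratio = self.load / self.mips
        if ratio < 0.5:
            return "underloaded"
        if ratio < 0.8:
            return "balanced"
        if ratio < 1.0:
            return "high-balanced"
        return "overloaded"

def assign(task_mi, vms):
    # Stage 3: prefer non-overloaded VMs; among them pick the smallest
    # estimated completion time (current load + task) / capacity (stage 2).
    candidates = [vm for vm in vms if vm.state() != "overloaded"] or vms
    best = min(candidates, key=lambda vm: (vm.load + task_mi) / vm.mips)
    best.load += task_mi
    return best

vms = [VM(1000), VM(2000), VM(500)]
for task in [400, 900, 1200, 300]:   # task lengths in MI (hypothetical)
    vm = assign(task, vms)
    print(f"task {task} MI -> VM(mips={vm.mips}), state now {vm.state()}")
```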

Journal ArticleDOI
TL;DR: In this paper, the effect of boron carbide abrasives on the surface finish and material removal rate of a flat brass plate in the magnetic abrasive finishing process was investigated using a full factorial experimental design technique.
Abstract: Magnetic abrasive machining is a surface finishing process in which a magnetic field is utilized to force abrasive particles against the workpiece surface to remove material as microchips. The aim of the present work is to investigate the effect of boron carbide abrasives on the surface finish and material removal rate of a flat brass plate in the magnetic abrasive finishing process. Four input parameters are considered in this work: rotational speed, quantity of magnetic abrasives, mesh number and machining time. A full factorial experimental design technique was used to investigate the effect of the input factors on the surface roughness and material removal rate. Analysis of variance (ANOVA) was carried out using statistical software to find optimal conditions for better surface roughness and material removal rate. Regression models were developed using MINITAB-17 statistical software for both surface roughness and material removal rate. Experimental results showed that rotational speed is the most significant parameter for the change in surface roughness, and machining time for the change in MRR. A minimum surface roughness (Ra) of 0.061 μm and a maximum material removal rate (MRR) of 3 mg/min were achieved.
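A sketch of the analysis pipeline described (full factorial results fed to regression/ANOVA), using statsmodels in place of MINITAB-17; the data frame is a hypothetical stand-in for the measured Ra values, trimmed to three factors for brevity:

```python
# Full factorial + ANOVA sketch with statsmodels (hypothetical data).
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

df = pd.DataFrame({
    "speed": [200, 200, 400, 400, 200, 200, 400, 400],  # rpm
    "mesh":  [400, 600, 400, 600, 400, 600, 400, 600],  # mesh number
    "time":  [10, 10, 10, 10, 20, 20, 20, 20],          # minutes
    "Ra":    [0.30, 0.25, 0.18, 0.12, 0.24, 0.20, 0.11, 0.06],  # um
})

# Fit a regression model with categorical factors, then run ANOVA to rank
# factor significance by F and p values.
model = ols("Ra ~ C(speed) + C(mesh) + C(time)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))
```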

Journal ArticleDOI
TL;DR: In this paper, the effect of different curing conditions on the compressive strength of eggshell concrete was assessed; the results showed that the initial growth in compressive strength increases with the replacement percentage up to 15%, this behaviour being more evident for the full water curing environment, by up to 67.53%.
Abstract: The work assesses the effect of different curing conditions on the compressive strength of eggshell concrete. Two different qualities of eggshell powder were used to make concrete with a 0.45 water/cement ratio. The eggshell powders were used as partial cement replacement at 5%, 10%, 15% and 20%. The concrete cube specimens (100 x 100 x 100 mm) were exposed to two different environments (full water curing and open-air curing) for 1, 7 and 28 days. The results show that the initial growth in compressive strength increases with the replacement percentage up to 15%, this behaviour being more evident for the full water curing environment, by up to 67.53%. The 28-day compressive strengths of the eggshell concrete for full water curing and open-air curing were found to be 49.23 MPa and 46.34 MPa respectively. However, the concrete specimens lost 24.7% and 34.83% of strength when the eggshell powder replaced up to 20% of the cement, for full water curing and open-air curing respectively. Thus, water curing is found to be more suitable compared to open-air curing.

Journal ArticleDOI
TL;DR: Convergence was seen where researchers from different disciplines collaborated on technology solutions supporting product development with feasible techno-economics and prospects for successful commercialization.
Abstract: The International Conference on Fluids and Chemical Engineering (FluidsChE 2017), organised by the Centre of Excellence for Advanced Research in Fluid Flow (CARIFF), Universiti Malaysia Pahang, Malaysia, aims to provide a platform to discuss ideas and the latest research findings in different disciplines of chemical engineering. The 1st FluidsChE was conducted in Langkawi, Malaysia in November 2015 and stirred and kindled intellectual minds to make this a biennial event for disseminating knowledge and findings in the chemical engineering fields to the entire world. As an outcome of the event, 60+ research articles were published in reputed journals with open access for all spectrums of researchers to gain knowledge. The research publications were sectioned into three volumes: volumes 1, 2 and 3. The first volume was segmented into the disciplines of fluid flow dynamics, distillation technology and absorption/adsorption technology. The second volume consisted of natural resources product lifecycle, polymer technology and pharmaceutical technology. The third volume contains allied fields of chemical engineering. Researchers from different parts of the world contributed to the knowledge pool. Convergence was seen where researchers from different disciplines collaborated on technology solutions supporting product development with feasible techno-economics and prospects for successful commercialization.

Journal ArticleDOI
TL;DR: A digital PID controller is designed using the CHR-I and CHR-II tuning techniques, which help in finding the tuning parameters of controllers for a specific system; a system that can inject the exact amount of insulin into the patient's blood and bring the blood glucose level to the normal range is thereby achieved.
Abstract: Objectives: To design a digital PID controller using the CHR-I and CHR-II tuning techniques, as they help in finding the tuning parameters of controllers for a specific system. Transformation of the analog PID controller to digital form is carried out using various transformation techniques such as the first-order hold method, impulse-invariant mapping, Tustin approximation and zero-pole mapping equivalents, along with mathematical modeling of the blood glucose level, such that the system injects the exact amount of insulin into the body of a diabetic patient to maintain his/her glucose level in the normal range. Method/Statistical Analysis: The differential equation of the blood glucose level is formulated and converted into an s-domain equation using the Laplace transform. Then, using the s-domain equation as the equation of the system and the tuning techniques CHR-I and CHR-II, the tuning parameters (Kp, Ki and Kd) are acquired. The controller is then converted into digital form, i.e., into the z-domain, by applying the different transformation techniques. Findings: On analyzing the acquired equation, it is seen that on tuning the controller with the CHR-I tuning technique the system exhibits zero overshoot, which is most reliable and efficient for a diabetic patient. A considerable settling time of 63362 seconds is also achieved. Application/Improvement: Therefore, a system that can inject the exact amount of insulin into the patient's blood and bring the blood glucose level to the normal range, by automatically calculating the amount of insulin required from the available blood glucose level status, is achieved.
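A sketch of one transformation step named above: discretizing a continuous (filtered) PID controller with the Tustin (bilinear) approximation via SciPy. The gains, filter constant and sampling period are hypothetical, not the paper's CHR-tuned values:

```python
# Analog-to-digital PID conversion via the Tustin (bilinear) method.
import numpy as np
from scipy import signal

Kp, Ki, Kd = 2.0, 0.5, 0.1   # hypothetical PID gains (not the CHR values)
tau = 0.01                   # derivative filter time constant (assumed)
Ts = 0.1                     # sampling period, s (assumed)

# C(s) = Kp + Ki/s + Kd*s/(tau*s + 1), written over the common denominator
# s*(tau*s + 1) = tau*s^2 + s:
num = [Kp * tau + Kd, Kp + Ki * tau, Ki]
den = [tau, 1.0, 0.0]

numd, dend, _ = signal.cont2discrete((num, den), Ts, method="bilinear")
print("z-domain numerator:  ", np.ravel(numd))
print("z-domain denominator:", dend)
```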

Journal ArticleDOI
TL;DR: In this paper, a detailed review on the potential use of natural fibers as reinforcement in polymeric strengthening materials is presented, where a comparison was made between various types of fibers in terms of their chemical and mechanical properties.
Abstract: Synthetic Fiber Reinforced Polymer (FRP) composites have been widely accepted by the construction industry as an effective external strengthening material to rehabilitate deficiencies in existing structures. These materials possess outstanding properties such as a high strength-to-weight ratio, resistance to corrosion, and lightness. However, their drawbacks include high costs during manufacturing and end-of-life service, lower environmental friendliness and adverse effects on human health. Environmental concerns about global warming have triggered the rapid development of natural fibers as sustainable materials for the strengthening of Reinforced Concrete (RC) structures. This paper presents a detailed review of the potential use of natural fibers as reinforcement in polymeric strengthening materials. A comparison was made between various types of fibers in terms of their chemical and mechanical properties. Bamboo fiber has demonstrated great potential among natural fibers due to its superior physico-mechanical and thermal properties.

Journal ArticleDOI
TL;DR: The results prove that the NB classifier contributes effectively to Gujarati document classification and is very useful for implementing directory search functionality in many web portals to sort useful documents, as well as in many Information Retrieval (IR) applications.
Abstract: Objectives: Information overload on the web is a major problem faced by institutions and businesses today. Sorting out useful documents from the web that are written in an Indian language is a challenging task due to morphological variance and the language barrier. To date, there is no document classifier available for the Gujarati language. Methods: Keyword search is one way to retrieve meaningful documents from the web, but it does not discriminate by context. In this paper we present the Naive Bayes (NB) statistical machine learning algorithm for classification of Gujarati documents. Six pre-defined categories are used for this work: sports, health, entertainment, business, astrology and spirituality. A corpus of 280 Gujarati documents for each category is used for training and testing the categorizer. We have used k-fold cross validation to evaluate the performance of the Naive Bayes classifier. Findings: The experimental results show that the accuracy of the NB classifier without and with feature selection was 75.74% and 88.96% respectively. These results prove that the NB classifier contributes effectively to Gujarati document classification. Applications: The proposed work is very useful for implementing directory search functionality in many web portals to sort useful documents, and in many Information Retrieval (IR) applications.
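A minimal sketch of the described pipeline: bag-of-words features with a Multinomial Naive Bayes classifier evaluated by k-fold cross validation. The toy two-category corpus below is hypothetical and in English; the paper's corpus is 280 Gujarati documents per category across six categories:

```python
# Naive Bayes document classification with k-fold cross validation.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

docs = ["the team won the cricket match", "player scored a century today",
        "stock markets rallied this week", "company profits rose sharply",
        "the bowler took five wickets", "shares fell after the report"]
labels = ["sports", "sports", "business", "business", "sports", "business"]

clf = make_pipeline(CountVectorizer(), MultinomialNB())
scores = cross_val_score(clf, docs, labels, cv=3)  # 3-fold for the toy data
print(scores.mean())
```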

Journal ArticleDOI
TL;DR: In this paper, the effect of incorporating glass fibre with treated banana-hemp fibres on mechanical properties has been studied, and the experimental results show that the glass/banana/hemp fibre composite exhibits higher tensile strength than the other two combinations.
Abstract: Background/Objectives: In this experimental work, the effect of incorporating glass fibre with treated banana-hemp fibres on mechanical properties has been studied. Methods/Statistical analysis: Banana and hemp fibres were extracted by enzymatic processes to successfully remove lignin. An alkaline solution was used to treat the fibres, since it increases their mechanical strength. The banana, hemp and glass fibres were reinforced with the epoxy matrix, and the hybrid composites were fabricated using the hand layup process. Alternating orientations of fibres were used for the fabrication of the laminates in order to improve their strength. Findings: As per the ASTM standards, test specimens were prepared from the laminates with the stacking sequences of glass/banana fibre, glass/hemp fibre and glass/banana/hemp fibre. The mechanical characteristics were obtained by impact, tensile and flexural tests on the fabricated samples. Interfacial analysis was conducted using a scanning electron microscope to estimate voids, fractures and fibre pull-out. The experimental results show that the glass/banana/hemp fibre composite exhibits higher tensile strength than the other two combinations. The hemp-glass fibre composite holds the maximum flexural strength, followed by the glass/hemp/banana fibre composites. The glass/banana fibre composites hold the maximum impact energy, and the impact energy of the composites varies from 7.33 to 9.33 Joules. Application/Improvements: These composites perform well under all kinds of mechanical loading. It is suggested that these materials can be used in the relevant fields.

Journal ArticleDOI
TL;DR: This research focuses on the major issues, challenges and current requirements of networks implemented in any big organization where a traditional network is in place, and on how to select the best possible SDN controller, which in turn will help achieve scalability, lower hardware and software requirements, centralized visibility, hassle-free traffic engineering and high availability of the network.
Abstract: Objectives: This paper focuses on the challenges, opportunities and research issues of Software Defined Networking (SDN), as well as how to select the best possible SDN controller, which in turn will help to reduce the complexity of a network and the cost of implementing and maintaining the network in any big organization. Methods/Statistical Analysis: In order to meet the objective, a review of the literature has been carried out in the following contexts: software defined networking, the SDN protocol (OpenFlow) and SDN research challenges. Software defined networking is one of the most discussed topics these days, and is considered one of the favorable technologies for the isolation of the control plane and data plane and the logical placement of centralized control in an SDN controller. This research focuses on the major issues, challenges and current requirements of networks implemented in any big organization where a traditional network is in place. Findings: To solve the issues of branch networks at multiple locations, cost, technical resources at each location, expertise, separate control planes for configuration, decentralized visibility of network devices, separate VLANs for each branch, complex traffic engineering, limited physical access to branches with respect to working hours and bandwidth bottlenecks at each branch, we surveyed the literature and web resources for existing SDN controllers such as NOX, POX, Ryu, Floodlight and OpenDaylight, all of which are based on the OpenFlow protocol. The primary concern is that a conventional network evolves gradually, has a generally high level of operational expenditure and is relatively static in nature. SDN holds the promise of overcoming those limitations. The major issues being faced are increasing requirements from the user side, bandwidth availability, hardware (switches required at every location), technical resources required at remote sites for configuration, scalability issues, cost, high processing power required at each device, traffic engineering, resiliency against failures, decentralized visibility of hardware devices, etc. SDN helps to improve centralized visibility: since all the underlying OpenFlow switches are connected to the controller, all switches can be configured from the SDN controller without accessing individual switches. The research papers referred to in this paper provide a bird's-eye view of what may cause hurdles in the further development and integration of the technology. Application/Improvements: This research will help in selecting the best possible SDN controller, which in turn will help achieve scalability, lower hardware and software requirements, fewer technical resource requirements, centralized visibility, hassle-free traffic engineering and high availability of the network.

Journal ArticleDOI
TL;DR: The Ericsson model will help in revamping radio frequency planning and system design for the investigated and similar terrains, thereby optimizing overall system performance while minimizing dropped calls, handover/quality issues and other inherent network failings.
Abstract: Objectives: Radio propagation models are used to predict signal strength in order to characterize the radio frequency channel. This helps provide sufficient data for the design of appropriate receivers that can recover the transmitted signal distorted by fading and multipath effects. Methods/Statistical analysis: Data collection was carried out through drive tests using a TEst Mobile System (TEMS) W995 phone interfaced with the TEMS Investigation tool version 13.1, a Gstar GPS location finder and MapInfo Professional, and the data were analyzed using the Root Mean Squared Error (RMSE) statistic and a tenth-degree polynomial for fitting the measured data to the empirical models. Findings: Among the contending empirical propagation models, the Ericsson model showed a better fit for the measured path loss data, with root mean squared errors of 5.86 dB, 5.86 dB and 5.85 dB at 1.0 m, 1.5 m and 2.0 m mobile antenna heights respectively, in comparison with the Okumura model which is currently in use. It also outperformed the other investigated models, namely the Hata, COST 231 and SUI models, at 2100 MHz. These findings will help in revamping radio frequency planning and system design for the investigated and similar terrains, thereby optimizing overall system performance while minimizing dropped calls, handover/quality issues and other inherent network failings. Application/Improvements: The results showed a minimum error estimate within the acceptable range of 6 dB for signal prediction. This model can be used for signal prediction and channel characterization of any wireless mobile environment with similar channel characteristics. The other propagation models that over-predicted the radio channel could be further investigated in future work and possibly tuned to accommodate dense urban areas.
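A sketch of the model-selection step: compute each empirical model's RMSE against the drive-test measurements and rank the models by fit. All values below are hypothetical placeholders, not the paper's data:

```python
# Ranking path loss models by RMSE against measurements (hypothetical data).
import numpy as np

measured = np.array([132.0, 138.5, 144.2, 150.9])       # path loss, dB

predictions = {                                          # model outputs, dB
    "Ericsson": np.array([130.5, 137.0, 145.0, 149.8]),
    "Okumura":  np.array([125.0, 133.0, 139.5, 146.0]),
    "COST 231": np.array([128.0, 141.0, 147.5, 155.0]),
}

for name, pred in predictions.items():
    rmse = np.sqrt(np.mean((measured - pred) ** 2))
    print(f"{name:9s} RMSE = {rmse:.2f} dB")  # smallest RMSE = best fit
```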

Journal ArticleDOI
TL;DR: A brief survey of the various RL algorithms is given, along with a perspective on how the trajectory is moving in the research landscape, and the entire landscape is summarized as seen from a bird's-eye view.
Abstract: Reinforcement Learning (RL) has emerged as a strong approach in the field of artificial intelligence, specifically in machine learning, robotic navigation, etc. In this paper we present a brief survey of the various RL algorithms and try to give a perspective on how the trajectory is moving in the research landscape. We also attempt to classify RL as a 3-D (dimensional) problem, and give a perspective on how the journey of various algorithms along each of these dimensions is progressing. We provide a quick recap of the basic classifications in RL, and of some popular, though older, algorithms. The paper then discusses some of the recent trends, and summarizes the entire landscape as seen from a bird's-eye view. We offer our perspective that reinforcement learning is a 3-D problem and conclude with the challenges that remain ahead of us. We have deliberately kept deep mathematical equations and concepts out of this paper, as its purpose is to provide an overview and abstractions to a serious onlooker. We hope this paper provides a useful summary of trends in RL for researchers, students and interested scholars.
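As a flavour of the classic algorithms such a survey covers, here is a tabular Q-learning sketch on a toy chain environment (entirely invented for illustration; the paper itself deliberately avoids equations and code):

```python
# Tabular Q-learning on a toy 5-state chain: the agent earns a reward by
# walking right to the last state. Environment and hyperparameters are
# invented for illustration only.
import random

N_STATES, ACTIONS = 5, [0, 1]          # actions: 0 = left, 1 = right
Q = [[0.0, 0.0] for _ in range(N_STATES)]
alpha, gamma, eps = 0.1, 0.9, 0.2      # learning rate, discount, exploration

def step(s, a):
    s2 = max(0, s - 1) if a == 0 else min(N_STATES - 1, s + 1)
    r = 1.0 if s2 == N_STATES - 1 else 0.0
    return s2, r, s2 == N_STATES - 1   # next state, reward, done

for _ in range(500):                   # training episodes
    s, done = 0, False
    while not done:
        a = random.choice(ACTIONS) if random.random() < eps else \
            max(ACTIONS, key=lambda act: Q[s][act])
        s2, r, done = step(s, a)
        # Q-learning update: move Q(s,a) toward r + gamma * max_a' Q(s',a')
        Q[s][a] += alpha * (r + gamma * max(Q[s2]) - Q[s][a])
        s = s2

print([round(max(q), 2) for q in Q])   # learned state values
```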

Journal ArticleDOI
TL;DR: In this paper, the authors have compiled the scattered available research on the use of various adsorbents for the removal of commonly occurring heavy metals from effluent, and have calculated the adsorption efficiency of all the adsorbents used by different researchers in order to find the best and most efficient adsorbent for the removal of each particular metal.
Abstract: Objectives: To explore the maximum adsorption efficiency for the removal of commonly occurring heavy metals from wastewater using various adsorbents. Methods/Statistical Analysis: In this review paper, we have compiled the scattered available research on the use of various adsorbents for the removal of commonly occurring heavy metals present in effluent, and have calculated the adsorption efficiency of all the adsorbents used by different researchers in order to find the best and most efficient adsorbent for the removal of each particular metal. Findings: It has been found that the maximum adsorption efficiency for zinc is obtained using cassava waste (55.9% removed), for cadmium using smectite clay particles (97%), for lead using dried water hyacinth stems and leaves (90%), and for copper and nickel using sugar bagasse (94.2% and 87% respectively). Application/Improvements: This paper will be helpful to anybody seeking the best and most efficient adsorbent for the removal of a particular heavy metal present in effluent.
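The percentages quoted above are of the standard batch removal-efficiency form E = (C0 − Ce)/C0 × 100, where C0 and Ce are the initial and equilibrium metal concentrations; a tiny sketch with hypothetical concentrations:

```python
# Standard batch adsorption removal-efficiency formula.
def removal_efficiency(c0, ce):
    # c0: initial concentration, ce: equilibrium concentration (same units)
    return (c0 - ce) / c0 * 100.0

# Hypothetical example: 50 mg/L in, 1.5 mg/L left -> 97.0% removed
print(removal_efficiency(c0=50.0, ce=1.5))
```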

Journal ArticleDOI
TL;DR: The OOS (Object Oriented System) using parameterized constructors in C++ programs is more reusable only up to some extent.
Abstract: Background/Objectives: In the 21st century, reusability provides powerful tools for the software industry. More or less 80% of code is reused in a new project. Evaluating metrics from software code, and determining what percentage of code is reused from the existing one, is now a challenging task. This can be achieved by using the CK (Chidamber and Kemerer) metrics. Methods/Statistical Analysis: Numerous metrics are defined which characterize the actual object. The proposed new metrics are combinations based on the CK metrics suite and calculate the reusable code in an object-oriented program. Findings: In inheritance, if we take the maximum depth of a class in the hierarchy, there is more chance for reusability of the inherited methods, so DIT (Depth of Inheritance Tree) has a positive effect on the reusability of a class. A reasonable value for the number of children means more scope for reuse of the class. If we have a larger number of methods in a class, the impact on the child classes will be greater, restricting the possibility of reuse. Conclusion: The OOS (Object Oriented System) using parameterized constructors in C++ programs is more reusable only up to some extent. When we get larger values of the proposed Metrics-2 and -3, they definitely have a negative impact on reusability. So constructors having parameters (parameterized constructors) have a negative impact on the reusability of the classes.
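A minimal sketch of two of the CK metrics discussed, DIT and NOC, computed for Python classes (the paper targets C++ programs; this only illustrates the metric definitions, not the authors' tool or their proposed metrics):

```python
# DIT and NOC from the CK metrics suite, illustrated on Python classes.

def dit(cls):
    """Depth of Inheritance Tree: longest path from cls up to the root."""
    bases = [b for b in cls.__bases__ if b is not object]
    return 0 if not bases else 1 + max(dit(b) for b in bases)

def noc(cls):
    """Number of Children: count of immediate subclasses."""
    return len(cls.__subclasses__())

class Vehicle: pass
class Car(Vehicle): pass
class SportsCar(Car): pass
class Truck(Vehicle): pass

print(dit(SportsCar), dit(Vehicle))  # 2 0
print(noc(Vehicle))                  # 2 (Car and Truck)
```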

Journal ArticleDOI
TL;DR: The survey explores existing research highlights and finds that prediction of academic performance has progressed considerably while employability prediction is yet to mature, and it suggests a few parameters that have not been considered so far in predicting performance or employability.
Abstract: Objective: To systematically review the work done in the field of academic performance prediction and employability prediction of students in higher education. Methods: The survey first explains how higher education has become an exciting field of research and why the prediction of academic performance and employability is beneficial for institutions. We also explain briefly the many ways in which higher education is being provided worldwide. We then discuss the work done in both areas of prediction. Findings: The survey explores existing research highlights and finds that prediction of academic performance has progressed considerably while employability prediction is yet to mature. Application: It further suggests a few parameters that have not been considered so far in predicting performance or employability.

Journal ArticleDOI
TL;DR: In this article, the authors reviewed the available experimental studies using granite slurry waste and showed that the use of this waste in place of cement will reduce energy demand, CO2 emission and consumption of natural resources.
Abstract: Objective: The huge quantity of granite production leads to the collection of an enormous quantity of slurry. Random disposal of this generated waste degrades the environment in numerous ways, while its utilization may solve both the problem of waste generation and the problem of scarcity of natural resources. Methods/Analysis: Granite slurry waste does not contain silt or organic impurities. Also, the use of this waste in place of cement will reduce energy demand, CO2 emission and consumption of natural resources. Sufficient literature is available, and it indicates that this waste can be used in place of Fine Aggregate (F.A.) or cement. In this paper, the salient available experimental studies using granite slurry waste have been reviewed. Findings: It is shown that granite slurry waste reduces workability, whereas the compressive strength of granite concrete is improved when fine aggregate or cement is partially replaced. It has also been shown that the modified concrete performed well when the replacement level of F.A. by granite slurry waste is up to 15%. Conclusion/Application: Utilization of this waste will reduce the cost of concrete, environmental pollution, consumption of natural resources and energy demand.