
Showing papers in "International Journal of Information Engineering and Electronic Business in 2019"


Journal ArticleDOI
TL;DR: This work proposes a strategy for the diagnosis of diabetes using a deep neural network trained in five-fold and ten-fold cross-validation fashion; the results show that the proposed system is most promising under five-fold cross-validation.
Abstract: Nowadays, diabetes is one of the most common and severe diseases in Bangladesh and throughout the world. It not only harms the blood but also leads to other conditions such as blindness, renal disease, kidney problems and heart disease, causing many deaths each year. There is therefore a pressing need for a system that can effectively diagnose diabetic patients from their medical details. We propose a strategy for the diagnosis of diabetes using a deep neural network, trained on the patient attributes in five-fold and ten-fold cross-validation fashion. The Pima Indian Diabetes (PID) dataset is retrieved from the UCI machine learning repository. The results on the PID dataset demonstrate that the deep learning approach yields a promising system for the prediction of diabetes, with a prediction accuracy of 98.35%, an F1 score of 98, and an MCC of 97 for five-fold cross-validation. Additionally, an accuracy of 97.11%, sensitivity of 96.25%, and specificity of 98.80% are obtained for ten-fold cross-validation. The experimental results show that the proposed system performs best under five-fold cross-validation.
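The k-fold protocol described in the abstract can be sketched as follows. This is a minimal illustration with synthetic data and a trivial nearest-centroid classifier standing in for the paper's deep network; the PID attributes, network architecture and reported scores are not reproduced here.

```python
import numpy as np

def kfold_indices(n, k, seed=0):
    """Shuffle indices and split them into k roughly equal folds."""
    rng = np.random.default_rng(seed)
    return np.array_split(rng.permutation(n), k)

def nearest_centroid_predict(Xtr, ytr, Xte):
    """Toy stand-in for the deep network: classify by nearest class centroid."""
    centroids = {c: Xtr[ytr == c].mean(axis=0) for c in np.unique(ytr)}
    classes = np.array(sorted(centroids))
    dists = np.stack([np.linalg.norm(Xte - centroids[c], axis=1) for c in classes])
    return classes[dists.argmin(axis=0)]

def cross_validate(X, y, k=5):
    """Average accuracy over k train/test splits, as in k-fold cross-validation."""
    folds = kfold_indices(len(X), k)
    accs = []
    for i in range(k):
        test = folds[i]
        train = np.concatenate([folds[j] for j in range(k) if j != i])
        pred = nearest_centroid_predict(X[train], y[train], X[test])
        accs.append((pred == y[test]).mean())
    return float(np.mean(accs))

# Synthetic two-class data standing in for the 8 PID attributes
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 1, (100, 8)), rng.normal(2, 1, (100, 8))])
y = np.array([0] * 100 + [1] * 100)
print(round(cross_validate(X, y, k=5), 3))
```

Swapping k between 5 and 10 reproduces the five-fold/ten-fold comparison the paper reports.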

121 citations


Journal ArticleDOI
TL;DR: This paper focuses on extractive summarization using K-Means clustering with TF-IDF (Term Frequency-Inverse Document Frequency), and reflects on the idea of a true K, which partitions the sentences of the input document to produce the final summary.
Abstract: The quantity of information on the internet is increasing massively, and a gigantic volume of openly accessible data with numerous compositions has become widespread. It is challenging nowadays for a user to extract information efficiently and smoothly. As one method to tackle this challenge, text summarization diminishes redundant information and retrieves the useful and relevant information from a text document to form a compressed, shorter version that is easy to understand and time-saving while reflecting the main idea of the topic discussed in the document. Approaches to automatic text summarization have earned keen interest within the Text Mining and NLP (Natural Language Processing) communities because manually summarizing a text document is a laborious job. There are two main types of text summarization, namely extractive and abstractive. This paper focuses on extractive summarization using K-Means clustering with TF-IDF (Term Frequency-Inverse Document Frequency). The paper also reflects on the idea of a true K and uses that value of K to partition the sentences of the input document and present the final summary. Furthermore, we combine K-Means and TF-IDF with the choice of the K value and produce a system summary that shows comparatively good results.
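The extractive pipeline described here (TF-IDF vectors, K-Means clusters, one representative sentence per cluster) can be sketched as below. The TF-IDF weighting, the toy document and the choice of K are illustrative assumptions, not the paper's implementation.

```python
import math
import re
from collections import Counter

import numpy as np

def tfidf_matrix(sentences):
    """Build a sentence-by-term TF-IDF matrix from scratch."""
    docs = [re.findall(r"[a-z']+", s.lower()) for s in sentences]
    vocab = sorted({w for d in docs for w in d})
    idf = {w: math.log(len(docs) / sum(w in d for d in docs)) for w in vocab}
    M = np.zeros((len(docs), len(vocab)))
    for i, d in enumerate(docs):
        counts = Counter(d)
        for j, w in enumerate(vocab):
            M[i, j] = counts[w] / len(d) * idf[w]
    return M

def kmeans(M, k, iters=20, seed=0):
    """Plain Lloyd's algorithm on the TF-IDF rows."""
    rng = np.random.default_rng(seed)
    centers = M[rng.choice(len(M), k, replace=False)]
    for _ in range(iters):
        labels = np.linalg.norm(M[:, None] - centers[None], axis=2).argmin(axis=1)
        centers = np.stack([M[labels == c].mean(axis=0) if (labels == c).any()
                            else centers[c] for c in range(k)])
    return labels, centers

def summarize(sentences, k=2):
    """Pick, per cluster, the sentence closest to the centroid, in original order."""
    M = tfidf_matrix(sentences)
    labels, centers = kmeans(M, k)
    picks = set()
    for c in range(k):
        members = np.where(labels == c)[0]
        if len(members) == 0:
            continue
        best = members[np.linalg.norm(M[members] - centers[c], axis=1).argmin()]
        picks.add(int(best))
    return [sentences[i] for i in sorted(picks)]

doc = [
    "Text summarization shortens a document while keeping its main idea.",
    "Extractive methods select existing sentences instead of generating new ones.",
    "K-Means groups similar sentences using their TF-IDF vectors.",
    "The sentence nearest each cluster centroid joins the final summary.",
]
print(summarize(doc, k=2))
```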

37 citations


Journal ArticleDOI
TL;DR: In this paper, the influence of organizational culture, work motivation and organizational citizenship behavior (OCB) to employee performance was examined and analyzed in Pamekasan Regency Government.
Abstract: The purpose of this research is to examine and analyze the influence of organizational culture, work motivation and organizational citizenship behavior (OCB) to employee performance. This research is conducted in Pamekasan Regency Government. The samples of this research are 116 respondents. The sample technique which used in this research is proportional stratified random sampling. This study employs questionnaire by a large number of answers based on a 5-point likert scale to collect the needed data and information. The result of this research reveals that the work motivation and organizational citizenship behavior (OCB) influence to employee performance significantly. Nevertheless, organizational culture has not significant effect on employee performance at Pamekasan Regency Government.

20 citations


Journal ArticleDOI
TL;DR: This paper aims to build a novel machine learning model based on Natural Language Processing (NLP) techniques for the detection of ‘fake news’ by using both content-based features and social features of news.
Abstract: The internet acts as the best medium for the proliferation and diffusion of fake news. Information quality on the internet is a very important issue, but web-scale data hinders experts' ability to correct much of the inaccurate or fake content present on these platforms. Thus, a new system of safeguards is needed. Traditional fake news detection systems are based on content-based features of the news (i.e. analyzing the content of the news), whereas most recent models focus on its social features (i.e. how the news diffuses through the network). This paper aims to build a novel machine learning model based on Natural Language Processing (NLP) techniques for the detection of 'fake news' using both content-based features and social features of news. The proposed model has shown remarkable results, achieving an average accuracy of 90.62% with an F1 score of 90.33% on a standard dataset.
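Combining content-based and social features, as proposed above, amounts to concatenating the two feature families before training a single classifier. The sketch below illustrates this with synthetic features and a hand-rolled logistic regression; the feature names and data are invented and do not reflect the paper's model.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_logreg(X, y, lr=0.5, epochs=300):
    """Plain gradient-descent logistic regression on the combined features."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        p = sigmoid(X @ w + b)
        w -= lr * X.T @ (p - y) / len(y)
        b -= lr * (p - y).mean()
    return w, b

rng = np.random.default_rng(0)
n = 200
# Hypothetical content-based features (e.g. TF-IDF scores of sensational terms)...
content = rng.normal(0, 1, (n, 3))
# ...and hypothetical social features (e.g. share velocity, spreader bot-likeness).
social = rng.normal(0, 1, (n, 2))
X = np.hstack([content, social])  # both feature families, side by side
y = (X @ np.array([1.0, -0.5, 0.8, 1.2, -0.7]) + rng.normal(0, 0.3, n) > 0).astype(float)

w, b = train_logreg(X, y)
acc = float(((sigmoid(X @ w + b) > 0.5) == y).mean())
print(round(acc, 3))
```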

18 citations


Journal ArticleDOI
TL;DR: Diverse data mining classification methods, such as Decision tree classification, Naive Bayes classification, Support Vector Machine classification, and k-NN classification, are used for the determination of and safeguarding against heart disease.
Abstract: The heart is one of the most important organs of the human body. Any abnormality in the heart results in heart-related illness, obstructing blood vessels and causing heart attack, chest pain or stroke. The main goal is the care and improvement of health through the identification, prevention, and treatment of disease. To this end, various prediction and analysis methods are used whose job is to identify illness at a preliminary phase so that heart disease can be prevented and treated. This paper emphasizes the care of heart disease at an early phase so that it leads to a successful cure. In this paper, diverse data mining classification methods, such as Decision tree classification, Naive Bayes classification, Support Vector Machine classification, and k-NN classification, are used for the determination of and safeguarding against the disease.
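A minimal version of the four-classifier comparison described above, using scikit-learn on synthetic data in place of a real heart-disease dataset; the dataset shape and the scores it produces are illustrative assumptions, not the paper's results.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in: 13 features, like common public heart-disease datasets
X, y = make_classification(n_samples=300, n_features=13, n_informative=6,
                           random_state=0)

models = {
    "Decision tree": DecisionTreeClassifier(random_state=0),
    "Naive Bayes": GaussianNB(),
    "SVM": SVC(),
    "k-NN": KNeighborsClassifier(n_neighbors=5),
}
# 5-fold cross-validated accuracy for each classifier
scores = {name: cross_val_score(m, X, y, cv=5).mean() for name, m in models.items()}
for name, s in scores.items():
    print(f"{name}: {s:.3f}")
```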

15 citations


Journal ArticleDOI
TL;DR: Convolutional LSTM is applied to the model to take advantage of the spatiotemporal relation of the wind and the PM forecasting problem, and a CNN is added in parallel to extract temporal features of the dataset, to predict PM concentration more accurately.
Abstract: Particulate matter (PM) can harm human health by causing lung cancer, pneumonia, or cardiovascular disease, and there is growing awareness of dangerous PM among people and governments. To prepare for the risk, the prediction performance for PM is important, and much research is devoted to developing various prediction models. Among these models, LSTM-based models show the best results, using various auxiliary data, including spatial features, to improve performance. However, spatial features can be depreciated because all input data has to be unfolded into a 1D vector. In this paper, we apply Convolutional LSTM to our model to take advantage of the spatiotemporal relation of the wind and the PM forecasting problem. We also add a CNN in parallel to extract temporal features of the dataset, and combine the Convolutional LSTM and CNN branches to predict PM concentration more accurately. In the experiments, we compared this model with the LSTM and CNN-LSTM models of previous studies; the hybrid model showed the best performance.

13 citations


Journal ArticleDOI
TL;DR: In this article, some modern techniques are used to diagnose oral and dental diseases; a system was built using decision tree (ID3 and J48) and artificial neural network techniques.
Abstract: In this article, some modern techniques are used to diagnose oral and dental diseases. The symptoms and causes of such diseases, which may lead to many other serious conditions, have been studied. Many cases were reviewed through patients' records, and the causes of oral and dental disease were investigated to help design a system that diagnoses oral diseases and classifies them. The system was built using decision tree (ID3 and J48) and artificial neural network techniques. Samples of oral and dental diseases were collected together with their symptoms to form a database for constructing the diagnostic system. A graphical interface was developed in C# to facilitate the user's diagnosis process: the patient selects the symptoms he suffers from through the interface, these are analyzed using the classification techniques, and the diagnosed disease is then reported back to the user.

11 citations


Journal ArticleDOI
TL;DR: This study presents various issues related to healthcare and the various machine learning algorithms that must be weighed to provide the best possible output.
Abstract: Medical data mining has become one of the prominent issues in the field of data mining due to the delicate lifestyles adopted by people, which are leading them toward various chronic diseases. Heart disease is one of the most conspicuous public health concerns worldwide. Since clinical data is growing rapidly owing to deficient health awareness, various techniques and scientific methods are adopted for analyzing this huge volume of data. Several data mining techniques such as Support Vector Machine (SVM), K-Nearest Neighbor (KNN), Decision tree, Naive Bayes and Artificial Neural Network (ANN) have been introduced for the prediction of heart disease. These techniques help to mine the relevant and useful data from medical datasets, providing beneficial information to medical institutions. This study presents various issues related to healthcare and the various machine learning algorithms involved. A comprehensive review of the literature is summarized to shed light on previous work in this field.

9 citations


Journal ArticleDOI
TL;DR: The results of evaluating the SentDesk system with humans show that the system performed as well as humans; while annotating emotions and sentiments in the datasets, the annotators' own emotions influenced their perception of emotions.
Abstract: Since organizational decisions are vital to organizational development, customers' views and feedback are equally important to inform good decisions. Given this relevance, this paper seeks to automate a sentiment analysis system, SentDesk, that can aid tracking sentiments in customers' reviews and feedback. The study was contextualised in some business organisations in Ghana. Three organizational marketers were asked to annotate emotions and tag sentiments on each instance in the corpora. Kappa and Krippendorff coefficients were computed to obtain the annotation agreement in the corpora. The SentDesk system was evaluated in this setting by comparing its output to the average sentiments tagged by the marketers, and it was also evaluated by the selected marketers after they had tested the platform. Averaging the kappa values over the corpora (CFR + ISEAR), the kappa coefficient was found to be 0.40 (40%). The results of evaluating the SentDesk system with humans show that the system performed as well as humans. The study also revealed that, while annotating emotions and sentiments in the datasets, the annotators' own emotions influenced their perception of emotions.

9 citations


Journal ArticleDOI
TL;DR: The study supports commodity suppliers in attending to the determinant factors and working towards maintaining quality, and supports decision-making activities in the area of the Ethiopia Commodity Exchange.
Abstract: In this paper, we focus on applying data mining to market data to establish meaningful relationships or patterns and determine the critical determinant factors of commodity price. The data were taken from the Ethiopia Commodity Exchange, and 18141 records were used; the dataset contains all the main information. A hybrid methodology is followed to explore the application of data mining to the market dataset. Data cleaning and data transformation were used to preprocess the data, and classification algorithms in the WEKA 3.8.1 data mining tool were applied to address the research problem. The classification task used the J48 decision tree algorithm, and different experiments were conducted with pruning and without pruning on all attributes. The developed models were evaluated using the standard metrics of accuracy and ROC area. The most effective model for determining the critical determinant factors of a commodity has an accuracy of 88.35%, which is a good experimental result. The output of this study helps support decision-making activities in the area of the Ethiopia Commodity Exchange. The study supports commodity suppliers in attending to the determinant factors and working towards maintaining quality. The Ethiopia Commodity Exchange (ECX), as the main facilitator of commodity exchanges, can also use the model for setting price ranges and regulations.
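J48 is WEKA's implementation of the C4.5 decision tree. The pruned-versus-unpruned comparison described above can be sketched with scikit-learn's CART trees as a stand-in, where cost-complexity pruning via `ccp_alpha` plays the role of J48's pruning; the data are synthetic, not the ECX dataset.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for the commodity dataset
X, y = make_classification(n_samples=500, n_features=10, random_state=0)
Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, random_state=0)

# Unpruned tree grows until leaves are pure; pruned tree trades leaves for generality
unpruned = DecisionTreeClassifier(random_state=0).fit(Xtr, ytr)
pruned = DecisionTreeClassifier(random_state=0, ccp_alpha=0.01).fit(Xtr, ytr)

print("unpruned leaves:", unpruned.get_n_leaves(),
      "acc:", round(unpruned.score(Xte, yte), 3))
print("pruned leaves:", pruned.get_n_leaves(),
      "acc:", round(pruned.score(Xte, yte), 3))
```

Comparing held-out accuracy and leaf counts across pruning settings mirrors the paper's pruning/unpruning experiments.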

8 citations


Journal ArticleDOI
TL;DR: The results of the proposed algorithm were compared with the visual assessment performed by three clinicians in this field using various statistical techniques like Confidence Interval (CI), paired sample t-test and Bland-Altman plot and the agreement between the proposed method and the clinicians’ evaluation is strong.
Abstract: The most widely accepted method of monitoring the fetal heart rate and uterine activity of the mother is using Cardiotocograph (CTG). It simultaneously captures these two signals and correlate them to find the status of the fetus. This method is preferred by obstetricians since it is non-invasive as well as cost-effective. Though used widely, the specificity and predictive precision has not been undisputable. The main reason behind this is due to the contradiction in clinicians opinions. The two main components of CTG are Baseline and Variability which provide a thorough idea about the state of the fetal-health when CTG signals are inspected visually. These parameters are indicative of the oxygen saturation level in the fetal blood. Automated detection and analysis of these parameters is necessary for early and accurate detection of hypoxia, thus avoiding further compromise. Results of the proposed algorithm were compared with the visual assessment performed by three clinicians in this field using various statistical techniques like Confidence Interval (CI), paired sample t-test and Bland-Altman plot. The agreement between the proposed method and the clinicians’ evaluation is strong.

Journal ArticleDOI
TL;DR: A smart and cost-effective fire detection system based on the IoT that can detect the sudden uncertain fire in a quick succession to reduce the significant loss.
Abstract: Disaster caused by sudden uncertain fire is one of the main reasons for a great loss of properties and human lives. In our paper, we have developed a smart and cost-effective fire detection system based on the IoT that can detect the sudden uncertain fire in a quick succession to reduce the significant loss. The device houses a sensor-based smoke detection system and a camera which could be accessed by the user from anywhere through the use of internet for taking necessary preventive actions based on the reliable assessment. The notification system takes advantage of an online short message service which is connected to the Raspberry Pi module that gets triggered when the smoke sensors detect the smoke and informs the users about the predicament. The device also has a buzzer connected to central module to notify the nearby users.

Journal ArticleDOI
TL;DR: The results on the test systems show that, for practical power systems, GWO is a better option for solving ED problems; both the optimality of the solutions and the convergence speed of the GWO algorithm are promising.
Abstract: This paper presents the newly developed Grey Wolf Optimization (GWO) method to solve the Economic Dispatch (ED) problem with multiple fuels. The GWO method imitates the leadership hierarchy and hunting mechanism of grey wolves in nature. The hierarchy is simulated with four ranks: alpha, beta, delta and omega. To hunt prey, grey wolves follow three steps, carried out in order to perform optimization: searching, encircling and attacking. While searching for a better solution, GWO does not require any information about the gradient of the fitness function. The intention of ED is to curtail the fuel cost for any viable load demand and at the same time determine the optimal power generation. The ED is modeled as a complex problem by considering multiple fuels, valve-point loading and transmission losses. The potency of the GWO method has been examined on a ten-unit system with four different load demands across four case studies. The results on the test systems show that, for practical power systems, GWO is a better option for solving ED problems; both the optimality of the solutions and the convergence speed of the GWO algorithm are promising.
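The searching/encircling/attacking mechanics of GWO can be sketched as follows. This is a generic minimizer run on a smooth surrogate cost function, not the paper's multi-fuel ED model with valve-point loading and transmission losses.

```python
import numpy as np

def gwo(f, dim, n_wolves=20, iters=200, lb=-10, ub=10, seed=0):
    """Minimal Grey Wolf Optimizer: alpha/beta/delta lead, the rest follow."""
    rng = np.random.default_rng(seed)
    X = rng.uniform(lb, ub, (n_wolves, dim))
    for t in range(iters):
        fit = np.apply_along_axis(f, 1, X)
        order = fit.argsort()
        alpha, beta, delta = X[order[0]], X[order[1]], X[order[2]]
        a = 2 - 2 * t / iters                    # encircling coefficient decays 2 -> 0
        for i in range(n_wolves):
            moves = []
            for leader in (alpha, beta, delta):
                A = 2 * a * rng.random(dim) - a  # exploration/exploitation balance
                C = 2 * rng.random(dim)
                moves.append(leader - A * np.abs(C * leader - X[i]))
            X[i] = np.clip(np.mean(moves, axis=0), lb, ub)
    fit = np.apply_along_axis(f, 1, X)
    return X[fit.argmin()], float(fit.min())

# Stand-in cost: a smooth convex "fuel cost" surrogate, not a real ED model
best_x, best_f = gwo(lambda x: float(np.sum(x ** 2)), dim=5)
print(round(best_f, 6))
```

A real ED objective would replace the surrogate with the per-unit fuel-cost curves plus penalty terms for the power-balance and generation-limit constraints.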

Journal ArticleDOI
TL;DR: The study recommends proper enlightenment of financial institution users so that they stay abreast of the possible security challenges associated with banking transaction processes and can mitigate possible exploits.
Abstract: In this paper, research was carried out to evaluate security risk analysis and management in a banking company through the use of a questionnaire, to determine the level of risk that customers of the financial institution are likely to encounter. It was discovered that although the majority of financial institution users are familiar with the possible risks associated with some banking transactions, some aspects remain with which users are not familiar, and these serve as vulnerable points that could be exploited. The study recommends proper enlightenment of financial institution users so that they stay abreast of the possible security challenges associated with banking transaction processes and can mitigate possible exploits.

Journal ArticleDOI
TL;DR: In this paper, the accessibility and usability of Nigerian banking websites were evaluated using automated tools and a manual inspection method against the Web Accessibility Initiative (WAI) standard. The results indicate that some of the websites do not conform to the expected standard.
Abstract: The growing need for accessible websites cannot be overemphasized, as it has posed a major challenge in the world of Information and Communication Technology (ICT). Most businesses have gone online in order to improve their market value; the banking sector is not an exception. In an attempt to satisfy customers, website developers have violated many website standards. The banking sector is one area that carries out most of its activities online. Therefore, it is important that its websites be accessible to all, especially people with visual impairments, regardless of the browsing technology being used. This study evaluates the accessibility and usability of Nigerian banking websites using automated tools and a manual inspection method, in order to determine the conformance of the banking websites with the standard specified by the Web Accessibility Initiative (WAI). Results from the study indicate that some of the websites do not conform to the expected standard. Hence, there is a need for substantial improvements on most bank websites in Nigeria.

Journal ArticleDOI
TL;DR: How easily forecasting and analysis can be done through Tableau is explained by taking the dataset of a superstore and predicting the sales and profit for the four quarters of the forthcoming year.
Abstract: The current era is generally treated as the era of data: users of computers are increasing day by day, and vast amounts of data are generated across multiple domains such as healthcare and business. The term Business Intelligence (BI) generally refers to the different technologies, applications and practices used for the collection, integration, analysis, and presentation of business information. The main motive of business intelligence and analytics is to help in the decision-making process and to enhance the profit of the organisation. Various business tools are used to analyze and visualize the different types of data that are generated frequently. Tableau made its mark on the field of BI by being one of the first companies to give business customers the ability to achieve fairly arduous data visualization in a very engaging, drag-and-drop manner. Tableau enhances decision making, adds operational awareness, and increases performance throughout the organization. This paper describes the different tools used in the business intelligence field and provides in-depth knowledge of the Tableau tool. It also describes why Tableau is increasingly used for data visualization in different organizations. The main aim of this paper is to describe how easily forecasting and analysis can be done using this tool: it explains how prediction can be done through Tableau by taking the dataset of a superstore and predicting the sales and profit for the four quarters of the forthcoming year. The collected dataset gives the sales and profit details of different categories of goods; using the forecasting method on the Tableau platform, these two measures are calculated for the forthcoming year and presented in a useful way.
Finally, the paper compares the frameworks used for business intelligence and analytics on the basis of various parameters such as complexity and speed.

Journal ArticleDOI
TL;DR: The aim of this research is to build an Arabic sentiment lexicon using a corpus-based approach; the main contribution of the work comes from the empirical evaluation of different similarity measures to assign the best sentiment scores to terms in the co-occurrence graph.
Abstract: Sentiment analysis is an application of artificial intelligence that determines the sentiment associated with a piece of text. It provides an easy alternative for a brand or company to receive customers' opinions about its products through user-generated content such as social media posts. Training a machine learning model for sentiment analysis requires the availability of resources such as labeled corpora and sentiment lexicons. While such resources are easily available for English, it is hard to find them for other languages such as Arabic. The aim of this research is to build an Arabic sentiment lexicon using a corpus-based approach. Sentiment scores were propagated from a small, manually labeled seed list to other terms in a term co-occurrence graph. To achieve this, we proposed a graph propagation algorithm and compared different similarity measures. The lexicon was evaluated using a manually annotated list of terms. The use of similarity measures rests on the assumption that words appearing in the same context will have similar polarity. The main contribution of the work comes from the empirical evaluation of different similarity measures to assign the best sentiment scores to terms in the co-occurrence graph.
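The seed-propagation idea can be sketched as below. The co-occurrence graph, its weights, the damping factor and the seed list are all invented for illustration; the similarity measures compared in the paper are not reproduced.

```python
def propagate(graph, seeds, iters=50, damping=0.85):
    """Spread sentiment from seeds over a co-occurrence graph.

    graph: term -> {neighbor: co-occurrence weight}
    seeds: term -> fixed polarity score in [-1, 1]
    """
    scores = {t: seeds.get(t, 0.0) for t in graph}
    for _ in range(iters):
        new = {}
        for term, nbrs in graph.items():
            if term in seeds:            # seed polarities stay fixed
                new[term] = seeds[term]
                continue
            total = sum(nbrs.values())
            # Weighted average of neighbor scores, damped to keep values bounded
            new[term] = damping * sum(w * scores[n] for n, w in nbrs.items()) / total
        scores = new
    return scores

# Toy co-occurrence graph (weights = co-occurrence counts, invented)
graph = {
    "good": {"excellent": 3, "service": 2},
    "excellent": {"good": 3},
    "bad": {"terrible": 2, "service": 1},
    "terrible": {"bad": 2},
    "service": {"good": 2, "bad": 1},
}
seeds = {"good": 1.0, "bad": -1.0}
scores = propagate(graph, seeds)
print({t: round(s, 2) for t, s in scores.items()})
```

Unlabeled terms inherit the polarity of the seeds they co-occur with most: "excellent" ends positive, "terrible" negative, and "service" mildly positive because it co-occurs with "good" more than "bad".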

Journal ArticleDOI
TL;DR: The paper proposes online and offline distribution modes, which promote flexible distribution of terminal-side quantum keys and realize a low-cost application.
Abstract: The power distribution and utilization network is an important link between the grid and the user, and data privacy protection is related to the secure and stable communication of the whole network. Based on quantum secure communication and combined with existing power security protection measures, this paper proposes a quantum key application solution. The system architecture contains a primary station layer, an access layer and a terminal layer. Through the quantum key distribution network, the key generation devices on both sides of the communication negotiate to generate the quantum key, which is used to encrypt the transmitted data. For quantum key distribution, the paper proposes both online and offline distribution modes, which promote flexible distribution of terminal-side quantum keys and realize a low-cost application. The feasibility of the scheme was verified through a pilot application.

Journal ArticleDOI
TL;DR: This paper uses multi-objective optimization on the basis of ratio analysis (MOORA) method to evaluate mushroom cultivation options in Vietnam.
Abstract: The role of materials in the proper design and operation of products is well acknowledged: an incorrectly selected material for a certain product may cause premature failure of the final product, so the right choice among available materials is very important to the success and competitiveness of manufacturing organizations. In Vietnam, tropical monsoon climate conditions greatly affect mushroom cultivation. The raw materials, the additives and the ratio between them also affect the quality and yield of mushrooms. Therefore, selecting options for growing mushrooms, or choosing good materials to grow mushrooms effectively, is a matter of concern, and it is one of many decision-making problems. In this paper we apply the multi-objective optimization on the basis of ratio analysis (MOORA) method to evaluate mushroom cultivation options in Vietnam.
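The MOORA ratio system proceeds in two steps: vector-normalize each criterion column of the decision matrix, then score each alternative as the weighted sum of its benefit criteria minus its cost criteria. A sketch with invented substrate options and criteria follows; the paper's actual criteria, weights and data are not reproduced.

```python
import numpy as np

def moora(matrix, weights, benefit):
    """MOORA ratio system: vector-normalize, then benefit-minus-cost score."""
    X = np.asarray(matrix, dtype=float)
    norm = X / np.sqrt((X ** 2).sum(axis=0))  # vector normalization per criterion
    weighted = norm * weights
    sign = np.where(benefit, 1.0, -1.0)       # add benefit columns, subtract cost
    return (weighted * sign).sum(axis=1)

# Hypothetical substrate options scored on yield and quality (benefit criteria)
# and material cost (cost criterion); all numbers are illustrative only.
matrix = [
    [250, 8, 40],  # option A
    [300, 7, 55],  # option B
    [220, 9, 35],  # option C
]
weights = np.array([0.5, 0.3, 0.2])
benefit = np.array([True, True, False])
scores = moora(matrix, weights, benefit)
best = int(np.argmax(scores))
print(scores.round(4), "best option index:", best)
```

The alternative with the highest net score ranks first; here the cheap, high-quality option C edges out the high-yield but costly option B.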

Journal ArticleDOI
TL;DR: Through mathematical induction, it is shown that a smaller divisor-ratio places the small divisor closer to the square root of an RSA modulus.
Abstract: The article investigates the detailed distribution of an RSA modulus' small divisor in the T3 tree in terms of the divisor-ratio. It proves that the distribution of the small divisor in the T3 tree is completely determined by the divisor-ratio and the parity of the level on which the RSA modulus lies: the small divisor of an RSA modulus lying on an even level lies on the same level as the square root of the modulus, whereas that of a modulus on an odd level lies either on the same level as the square root or on the adjacent higher level. Through mathematical induction, it is shown that a smaller divisor-ratio places the small divisor closer to the square root of an RSA modulus.


Journal ArticleDOI
TL;DR: To design the collaborative approach, knowledge was acquired from document analysis and domain expert interviews, and hidden knowledge was extracted from the Ethiopia national meteorology agency weather dataset and the central statistics agency crop production dataset using machine learning algorithms.
Abstract: Selecting proper crops for farmland involves a sequence of activities, and the entire process of farming requires the help of expert knowledge. However, there is a shortage of skilled experts who can advise farmers at the district level in developing countries. This study proposes designing a knowledge-based solution through the collaboration of experts' knowledge with a machine-learning knowledge base, to recommend suitable agricultural crops for a farm. To design the collaborative approach, knowledge was acquired from document analysis and domain expert interviews, and hidden knowledge was extracted from the Ethiopia national meteorology agency weather dataset and the central statistics agency crop production dataset using machine learning algorithms. The study follows the design science research methodology, with the CommonKADS and HYBRID models; WEKA, SWI-Prolog 7.32 and Java NetBeans were used for extracting knowledge, developing the knowledge base and developing the graphical user interface, respectively. On the objective measurement, PART rule induction was the best classifier algorithm, correctly classifying 82.6087% of the 9867 instances. The designed collaborative approach of experts' knowledge with knowledge discovery for agricultural crop selection achieved overall performance of 95.23%, 82.2% and 88.5% in the evaluations by domain experts, farmers and agricultural extension workers, respectively.

Journal ArticleDOI
TL;DR: After extensive experiments on real and synthetic data, the efficiency of the proposed recommender system is demonstrated, making it easy for a user to decide which place would be better for him or her.
Abstract: Recommending appropriate things to the user by analyzing available data is becoming popular day by day, yet there is insufficient research on real-estate recommendation with historical data and surrounding environments. We have collected real-estate, historical and point-of-interest (POI) data from various sources. In this research, a hybrid filtering technique combining collaborative and content-based filtering is used for recommending real estate. Generally, websites collect user ratings for recommendation, but we instead consider the historical data and surrounding environment of a real-estate location, making it easy for a user to decide which place would be better for him or her. If a user requests a specific location, the system finds the POI data using the Google Maps API and then considers the historical data of that area, obtained from trusted sources. Considering the minimum price and optimal facilities, our system recommends the top-k real-estate options. After extensive experiments on real and synthetic data, we demonstrate the efficiency of our proposed recommender system.

Journal ArticleDOI
TL;DR: This paper presents the simulation of the control of a three-phase Brushless Direct Current (BLDC) motor in all four quadrants with PI and Fuzzy Logic controllers (FLC).
Abstract: This paper presents the simulation of the control of a three-phase Brushless Direct Current (BLDC) motor in all four quadrants with PI and Fuzzy Logic controllers (FLC). Traditionally, the speed control of motors is carried out by conventional controllers using P, PI, PID and other control techniques [5]. However, these leave room for nonlinearity and uncertainties that cause internal and external parameter errors. Efficient speed control in four-quadrant operation can be achieved by using a fuzzy logic controller. The improvement of the brushless DC motor drive through a fuzzy logic controller in all four quadrants is carried out using Simulink/MATLAB [7].

Journal ArticleDOI
TL;DR: This paper studies the frequent-itemset mining approach for finding the most important attributes, to overcome existing problems in extracting relevant information from huge datasets using data mining approaches.
Abstract: This paper studies the frequent-itemset mining approach for finding the most important attributes, to overcome existing problems in extracting relevant information from a huge dataset using data mining approaches. First, a state-of-the-art diagram for prediction is designed and data mining classifiers such as naive Bayes, support vector machine, decision tree and k-nearest neighbour are compared; then the proposed methodology with new techniques is presented. Moreover, a new attribute-filtering association frequent-itemset mining algorithm is introduced. Then, by analyzing the feasibility of the proposed algorithm, the data mining classifiers are compared. As a result, SVM produces the best results among all the classifiers, both with and without attribute filtering, and the attribute-filtering algorithm enhances the accuracy of all the other classifiers.
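Frequent-itemset mining of the kind this paper builds on can be sketched with a level-wise Apriori pass: grow itemsets one item at a time and prune any candidate whose support falls below the threshold. The transactions and support threshold below are illustrative, not the paper's data.

```python
from itertools import combinations

def frequent_itemsets(transactions, min_support):
    """Level-wise Apriori: keep only itemsets whose support meets the threshold."""
    n = len(transactions)
    tsets = [set(t) for t in transactions]

    def support(items):
        return sum(items <= t for t in tsets) / n

    items = sorted({i for t in tsets for i in t})
    level = [frozenset([i]) for i in items if support(frozenset([i])) >= min_support]
    result = {s: support(s) for s in level}
    k = 2
    while level:
        # Join step: merge frequent (k-1)-itemsets into k-item candidates
        candidates = {a | b for a in level for b in level if len(a | b) == k}
        # Prune step: every (k-1)-subset must itself be frequent
        level = [c for c in candidates
                 if support(c) >= min_support
                 and all(frozenset(sub) in result for sub in combinations(c, k - 1))]
        result.update({c: support(c) for c in level})
        k += 1
    return result

transactions = [
    {"bread", "milk"},
    {"bread", "diapers", "beer"},
    {"milk", "diapers", "beer"},
    {"bread", "milk", "diapers"},
    {"bread", "milk", "beer"},
]
freq = frequent_itemsets(transactions, min_support=0.6)
print(sorted((sorted(s), round(v, 2)) for s, v in freq.items()))
```

With a 0.6 support threshold, only the four single items and the pair {bread, milk} survive; an attribute-filtering step would then keep only attributes appearing in such frequent sets.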

Journal ArticleDOI
TL;DR: The paper suggests a new keyword-based semantic retrieval scheme for the Google search engine that considerably accelerates the searching process with the help of a domain-specific knowledge extraction process along with inference and rules.
Abstract: Remarkable growth in the electronics and communication field provides ubiquitous services and makes it possible to store a huge number of documents on the web. As a result, it is very difficult to find specific, desired information on the Internet. Classical search engines were unable to investigate web content intelligently: traditional search results contain a lot of immaterial information alongside what the user's query actually asks for. To overcome this problem, many modifications have been made to traditional search engines to make them intelligent, so that they analyze the stored data and return only the content appropriate to the user's query. The Semantic Web is an emerging and efficient approach to handling such queries. It gathers appropriate information from the web based on logical reasoning and incorporates a rule-based system. The Semantic Web scrutinizes web content using ontologies; the ontology learning process not only analyzes web content intelligently but also improves the search engine's scrutinizing process. This paper suggests a new keyword-based semantic retrieval scheme for the Google search engine. The scheme considerably accelerates the searching process with the help of a domain-specific knowledge extraction process along with inference and rules. For this purpose, prefix keywords and their semantic associations are pre-stored in the ontology. The proposed framework improves the efficiency of content searching on the Google search engine without any additional burden on end users.
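The core mechanism described, pre-stored prefix keywords with semantic associations used to enrich a query before it reaches the search engine, can be sketched with a toy ontology lookup. The ontology entries, domains, and expansion terms below are illustrative placeholders, not the paper's knowledge base.

```python
# Pre-stored keywords and their domain-specific semantic associations
# (a stand-in for the paper's ontology; entries are invented examples).
ONTOLOGY = {
    "jaguar": {"vehicle": ["car", "automobile"],
               "animal": ["big cat", "wildlife"]},
    "python": {"language": ["programming", "interpreter"],
               "animal": ["snake"]},
}

def expand_query(query, domain):
    """Expand each known keyword with its domain-specific associations."""
    terms = []
    for word in query.lower().split():
        terms.append(word)
        # Look up the pre-stored semantic associations for this domain.
        terms.extend(ONTOLOGY.get(word, {}).get(domain, []))
    return " ".join(terms)

print(expand_query("python tutorial", domain="language"))
```

The expanded query string would then be submitted to the underlying search engine, which is how domain knowledge can sharpen results without the end user doing anything extra.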

Journal ArticleDOI
TL;DR: The system will assist in providing timely, efficient, accurate and comprehensive information about hypertension, which is useful for Doctors and patients in detecting, diagnosing, classifying and managing hypertension and its risk.
Abstract: Hypertension is a silent killer that gives no warning signs to alert a patient and can only be detected through regular blood pressure checkups. Uncontrolled and unmonitored hypertension contributes to stroke, chronic kidney disease, eye problems, and heart failure, and is an ongoing challenge to health care systems worldwide. Early detection of hypertension and awareness creation will greatly reduce the effect of hypertension and its related diseases. A mobile-based system also helps patients know their status, interact with a Doctor, and receive a quick response from the Doctor on the effect of the hypertension diagnosis on their health. The mobile application helps monitor patients anytime, anywhere, and provides services for each patient based on their personal health condition. The application was designed using the Unified Modeling Language and implemented using Extensible Markup Language and the Java programming language for the mobile layout and content, while JavaScript Object Notation was used to implement the data storage and retrieval mechanism of the system. The system was tested using data collected from a hospital, which yielded an accuracy of 100%. In conclusion, the system will assist in providing timely, efficient, accurate, and comprehensive information about hypertension, which is useful for Doctors and patients in detecting, diagnosing, classifying, and managing hypertension and its risk.
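The abstract does not state which guideline the classification step follows, but a rule-based classifier of the kind such a mobile system needs can be sketched using the widely cited ACC/AHA blood-pressure categories as illustrative thresholds.

```python
def classify_bp(systolic, diastolic):
    """Map a blood-pressure reading (mmHg) to a category label,
    using the 2017 ACC/AHA category thresholds as an example."""
    if systolic > 180 or diastolic > 120:
        return "Hypertensive crisis"
    if systolic >= 140 or diastolic >= 90:
        return "Stage 2 hypertension"
    if systolic >= 130 or diastolic >= 80:
        return "Stage 1 hypertension"
    if systolic >= 120:
        return "Elevated"
    return "Normal"

print(classify_bp(118, 76))
print(classify_bp(150, 95))
```

A mobile app would run each new reading through such a rule table and only escalate to the Doctor when the category worsens, which keeps the "anytime, anywhere" monitoring lightweight.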

Journal ArticleDOI
TL;DR: The results show that ESWSA is a very efficient process for achieving the desired radiation pattern, while amplitude-only control performed better than the other two control schemes for all benchmark problems.
Abstract: Linear antenna array pattern synthesis using computational methods is an important task for electronics engineers and researchers, and suitable optimization techniques are required to solve this kind of problem. In this work, the Elephant Swarm Water Search Algorithm (ESWSA) has been used for the efficient and accurate design of linear antenna arrays that generate a desired far-field radiation pattern by optimizing the amplitude, phase, and position of the antenna elements. ESWSA is inspired by the water search procedure of elephants during drought. Two fitness functions for two different benchmark linear antenna array problems have been tested to validate the proposed methodology. During optimization, three types of synthesis have been used, namely amplitude-only, phase-only, and position-only control, for all antenna array cases. The results show that ESWSA is a very efficient process for achieving the desired radiation pattern, while amplitude-only control performed better than the other two control schemes for all benchmark problems.
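Whatever the optimizer, the fitness evaluation rests on the standard linear-array factor, AF(θ) = Σₙ aₙ·exp(j(2π·dₙ·cosθ + φₙ)) with positions dₙ in wavelengths. A minimal sketch of that evaluation (the uniform 4-element array is illustrative; ESWSA itself would be the loop adjusting aₙ, φₙ, or dₙ):

```python
import cmath, math

def array_factor(theta, amps, phases, positions):
    """Far-field array factor of a linear array at angle theta (radians);
    element positions are given in wavelengths."""
    k = 2 * math.pi  # wavenumber for positions expressed in wavelengths
    return sum(a * cmath.exp(1j * (k * d * math.cos(theta) + p))
               for a, p, d in zip(amps, phases, positions))

# Uniform broadside array: 4 elements, half-wavelength spacing.
amps = [1.0] * 4
phases = [0.0] * 4
pos = [0.5 * n for n in range(4)]

# At broadside (theta = 90 deg) all elements add in phase, so |AF| = 4;
# along the array axis (theta = 0) alternate elements cancel.
print(round(abs(array_factor(math.pi / 2, amps, phases, pos)), 3))
print(round(abs(array_factor(0.0, amps, phases, pos)), 3))
```

Amplitude-only, phase-only, and position-only synthesis correspond to letting the optimizer vary exactly one of the three argument lists while a fitness function penalizes the mismatch between |AF(θ)| and the desired pattern.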

Journal ArticleDOI
TL;DR: The study shows that the TC-EGC gives lower OP and PT values than conventional EGC and TC, with reduced hardware complexity, and can be used to enhance the performance of wireless communication systems.
Abstract: Wireless communication systems are of paramount importance in telecommunication infrastructure and are expected to play a leading role in the development of a nation. However, such systems are characterized by multipath propagation effects that cause variability of the received signal and degrade performance. The Equal Gain Combiner (EGC) used to address this problem suffers from hardware complexity that results in long processing time, while the Threshold Combiner (TC), with low processing time, has poor performance. Hence, in this paper, a hybridized Diversity Combiner (DC) consisting of EGC and TC (TC-EGC), with a closed-form expression over the Nakagami fading channel, is developed. TC-EGC is derived using the conventional EGC and TC at the receiver. Randomly generated bits used as source data are modulated using M-ary Quadrature Amplitude Modulation (M-QAM) and transmitted over the Nakagami channel after filtering. The faded signals generated over varying numbers of paths L (2, 3, 4) are scanned by the TC to select the strongest paths. The outputs from the three TCs are combined by the EGC to obtain the received signal, which is converted to baseband through demodulation. A mathematical expression for the Outage Probability (OP), using the Probability Density Function (PDF) of the Nakagami fading channel at varying paths L, is also derived. The technique is simulated using MATLAB (version 7.2). The performance is evaluated using Signal-to-Noise Ratio (SNR), Outage Probability (OP), and Processing Time (PT). The study shows that the TC-EGC gives lower OP and PT values than conventional EGC and TC, with reduced hardware complexity. The TC-EGC developed can be used to enhance the performance of wireless communication systems.
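The diversity idea behind TC-EGC can be illustrated with a small Monte-Carlo sketch: branch amplitudes are drawn from a Nakagami-m distribution (the square root of a Gamma variate), a threshold stage discards weak branches, and the survivors are co-phased and summed as in EGC. The fading parameter m, threshold, and target level below are illustrative assumptions, not the paper's settings.

```python
import math, random

random.seed(1)

def nakagami(m, omega):
    """One Nakagami-m fading amplitude: sqrt of a Gamma(m, omega/m) draw."""
    return math.sqrt(random.gammavariate(m, omega / m))

def tc_egc(L, m=1.5, omega=1.0, threshold=0.5):
    """Threshold-then-equal-gain combined amplitude over L branches."""
    branches = [nakagami(m, omega) for _ in range(L)]
    strong = [r for r in branches if r >= threshold]  # threshold stage (TC)
    # EGC stage: co-phased sum; fall back to the best branch if none pass.
    return sum(strong) if strong else max(branches)

def outage(L, target=1.0, trials=20000):
    """Fraction of trials whose combined amplitude falls below a target;
    adding diversity branches should drive this down."""
    return sum(tc_egc(L) < target for _ in range(trials)) / trials

print(outage(2), outage(4))  # outage drops as the number of paths L grows
```

This mirrors the paper's evaluation qualitatively: the outage probability falls as L increases from 2 to 4, while the threshold stage spares the EGC from processing every branch, which is where the processing-time saving comes from.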

Journal ArticleDOI
TL;DR: A Fitbit sensor was used to take readings from four stroke patients at the Federal Teaching Hospital, Gombe State; the vital readings recorded over a period of four weeks were heart rate, sleep rate, and the number of steps taken, and the results show that the developed system performed better than the existing manual method for monitoring stroke patients.
Abstract: This paper presents an intelligent health monitoring system for the post-management of stroke. A Fitbit sensor was used to take readings from four stroke patients at the Federal Teaching Hospital, Gombe State; the vital readings recorded over a period of four weeks were heart rate, sleep rate, and the number of steps taken. The developed AppFabric, web service, and AppFeedBack synchronized the operation of the sensor, the user's mobile device, and the medical diagnostic platform. The readings taken by the sensor were made available to the medical experts and the monitoring team through the web service. The evaluation of the system in terms of efficiency and reliability using the t-Test yielded (82.3, 85.9) and (1.729133, 2.093024) respectively. The results show that the developed system performed better than the existing manual method for monitoring stroke patients.
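The figures 1.729133 and 2.093024 match the one-tailed and two-tailed 5% critical values of the t distribution at 19 degrees of freedom, which suggests a paired-sample comparison against the manual method. A minimal sketch of such a paired t-test (the readings below are invented for illustration, not the study's data):

```python
import math, statistics

def paired_t(before, after):
    """Paired-sample t statistic: mean difference / its standard error."""
    diffs = [a - b for a, b in zip(after, before)]
    n = len(diffs)
    return statistics.mean(diffs) / (statistics.stdev(diffs) / math.sqrt(n))

manual = [70, 72, 68, 75, 71]   # e.g. heart-rate readings, manual method
system = [74, 75, 73, 78, 76]   # same patients, sensor-based system
t = paired_t(manual, system)
# Compare |t| against the two-tailed 5% critical value for df = n - 1 = 4.
print(round(t, 3), abs(t) > 2.776)
```

If the computed statistic exceeds the critical value, the difference between the two monitoring methods is declared significant at the 5% level, which is the form of conclusion the abstract reports.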