
Showing papers in "International Journal of Computing in 2016"


Journal ArticleDOI
TL;DR: Considering that graph structures can be used to describe the crossing of edges in an m-polar fuzzy multi-graph with a certain amount of m-polar fuzzy planarity value, the notion of an m-polar fuzzy multi-set is first introduced.
Abstract: In many practical applications with a graph structure, there may exist crossings between edges, which are not allowed in a crisp planar graph. Considering that graph structures can be used to describe the crossing of edges in an m-polar fuzzy multi-graph with a certain amount of m-polar fuzzy planarity value, the notion of an m-polar fuzzy multi-set is first introduced. Then m-polar fuzzy multi-graphs, m-polar fuzzy planar graphs and m-polar fuzzy strong edges are defined. Several properties of m-polar fuzzy planar graphs are also studied.

44 citations


Journal ArticleDOI
TL;DR: The role that LFG has played in resolving various NLP issues is addressed, and new trends identified while conducting this survey are presented as directions for further research.
Abstract: Lexical Functional Grammar (LFG) plays a vital role in the area of Natural Language Processing (NLP). LFG is considered a constraint-based theory of grammar. C-structure and F-structure are the two basic forms of LFG. We observed from the existing literature that LFG has not been studied in detail, which encouraged us to undertake this study. This study highlights the brief history of LFG along with its architecture. The Arabic language, along with its parsing techniques, is also discussed. Moreover, this study addresses the role that LFG has played in resolving various NLP issues. New trends have emerged while conducting this survey and are presented as directions for further research.

36 citations


Journal ArticleDOI
TL;DR: The use of two important non-parametric statistical tests is proposed, namely, the Wilcoxon signed rank test for comparison of two classifiers and the Friedman test with the corresponding post-hoc tests for comparison of multiple classifiers over multiple datasets.
Abstract: In machine learning, generation of new algorithms or, in most cases, minor amendment of the existing ones is a common task. In such cases, a rigorous and correct statistical analysis of the results of different algorithms is necessary in order to select the exact techniques depending on the problem to be solved. The main inconvenience related to this necessity is the absence of a proper compilation of statistical techniques. In this paper, we propose the use of two important non-parametric statistical tests, namely, the Wilcoxon signed rank test for comparison of two classifiers and the Friedman test with the corresponding post-hoc tests for comparison of multiple classifiers over multiple datasets. We also introduce a new variant of non-parametric test known as Scheffe's test for locating unequal pairs of means of performances of multiple classifiers when the given datasets are of unequal sizes. The parametric tests, which were previously being used for comparing multiple classifiers, are also described in brief. The proposed non-parametric tests have also been applied to the classification results on ten real-problem datasets taken from the UCI Machine Learning Database Repository (http://www.ics.uci.edu/mlearn) (Valdovinos and Sanchez, 2009) as case studies.
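
As a rough illustration of the two non-parametric tests discussed above (not the authors' own implementation), the following Python sketch applies SciPy's Wilcoxon signed-rank and Friedman tests to hypothetical per-dataset accuracy scores; the accuracy values are invented for demonstration only.

```python
import numpy as np
from scipy import stats

# Hypothetical accuracies of three classifiers on the same ten datasets
# (values are illustrative only, not results from the paper).
clf_a = np.array([0.81, 0.74, 0.90, 0.66, 0.78, 0.85, 0.72, 0.88, 0.69, 0.80])
clf_b = np.array([0.79, 0.70, 0.91, 0.63, 0.75, 0.84, 0.70, 0.86, 0.65, 0.78])
clf_c = np.array([0.75, 0.68, 0.87, 0.60, 0.71, 0.80, 0.66, 0.83, 0.62, 0.74])

# Wilcoxon signed-rank test: pairwise comparison of two classifiers
# over the same datasets (paired samples).
w_stat, w_p = stats.wilcoxon(clf_a, clf_b)
print(f"Wilcoxon: statistic={w_stat:.3f}, p-value={w_p:.4f}")

# Friedman test: simultaneous comparison of multiple classifiers over
# multiple datasets; a low p-value would justify running post-hoc tests.
f_stat, f_p = stats.friedmanchisquare(clf_a, clf_b, clf_c)
print(f"Friedman: chi-square={f_stat:.3f}, p-value={f_p:.4f}")
```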

33 citations


Journal ArticleDOI
TL;DR: This paper presents and discusses a method for classifying Android applications with the purpose of malware detection, and proposes an “antivirus” system designed especially for Android that can detect and block undesirable and malicious applications.
Abstract: This paper presents and discusses a method for classifying Android applications with the purpose of malware detection. Based on the application of an Artificial Immune System and Artificial Neural Networks, we propose an “antivirus” system especially for the Android platform that can detect and block undesirable and malicious applications. This system is characterized by self-adaptation and self-evolution and can detect even unknown and previously unseen malicious applications. The proposed system is part of our team's larger project named “Intelligent Cyber Defense System”, which includes a malware detection and classification module, an intrusion detection and classification module, a cloud security module and a personal cryptography module. This paper contains the extended research that was presented during the IEEE 8th International Conference on Intelligent Data Acquisition and Advanced Computing Systems: Technology and Applications (IDAACS’2015) [1].

27 citations


Journal ArticleDOI
TL;DR: This paper deals with a simulation model of slip displacement sensors for registering object slip signals in an adaptive robot gripper, and presents an analysis of different methods for slip displacement signal detection, as well as the authors’ solutions.
Abstract: This paper deals with a simulation model of slip displacement sensors for registering object slip signals in an adaptive robot gripper. The study presents an analysis of different methods for slip displacement signal detection, as well as the authors’ solutions. Special attention is paid to the investigation of the developed sensor with a resistive registration element in a rod-type structure of sensitive elements, which is able to operate in harsh and corrosive environments. A sensing system for registering object slip signals in the adaptive robot gripper with clamping force correction is developed for the proposed slip displacement sensor with multi-component resistive registration elements. The hardware implementation of the sensing system for slip signal registration and the obtained results are considered in detail. The simulation model of the proposed slip displacement sensor based on polytypic conductive rubber is built in the Proteus software. Intelligent approaches to the design of the sensing system, using a field programmable gate array (FPGA) and a VHDL model, make it possible to determine the slippage direction in the slip displacement sensor based on resistive registration elements. Thus, this expands the functionality of the developed sensor.

21 citations


Journal ArticleDOI
TL;DR: It is shown that the proposed multiple coarse grid correction strategy makes it possible not only to make the smoother's task the least demanding one, but also to avoid load imbalance and limit the communication overhead.
Abstract: The paper describes a practical approach to minimising the parallelisation overhead when solving boundary value problems by geometric multigrid methods. It is shown that the proposed multiple coarse grid correction strategy makes it possible not only to make the smoother's task the least demanding one, but also to avoid load imbalance and limit the communication overhead. Estimates of the maximum speedup and efficiency of the parallel robust multigrid technique and the parallel V-cycle are given.

19 citations


Journal ArticleDOI
TL;DR: The hardware heterogeneity of the robotic swarm and its challenges are discussed; another issue addressed in the paper is the active power management of the robotic agents.
Abstract: In this work we present the hardware architecture of a mobile heterogeneous robot swarm, designed and implemented at the Interdisciplinary Robotics, Intelligent Sensing and Control (RISC) Laboratory, University of Bridgeport. Most of the recent advances in swarm robotics have mainly focused on homogeneous robot swarms and their applications. Developing and coordinating a multi-agent robot system with heterogeneity and a larger behavioral repertoire is a great challenge. To give the swarm hardware heterogeneity, we have equipped each swarm robot with a different set of sensors, actuators, control and communication units, power supply, and an interconnection mechanism. This paper discusses the hardware heterogeneity of the robotic swarm and its challenges. Another issue addressed in the paper is the active power management of the robotic agents. The power consumption of each robot in the UB robot swarm is calculated, and the power management technique is also explained in this paper. We applied this heterogeneous robot swarm to perform three sample tasks: a mapping task, a human rescue task, and a wall painting task.

18 citations


Journal ArticleDOI
TL;DR: This work proposes an automatic system for Arabic handwritten word extraction and recognition based on localizing and segmenting touching characters, extracting real sub-words and structural features from word images, and recognizing them with a Markovian classifier.
Abstract: Segmenting Arabic manuscripts into text-lines and words is an important step in making recognition systems more efficient and accurate. The major problem making this task crucial is the word extraction process: first, words are often a succession of sub-words where the spacing between these sub-words does not follow any rules. Second, the presence of connections even between non-adjacent sub-words in the same text-line makes identifying word parts and extracting the entire word difficult. This work proposes an automatic system for Arabic handwritten word extraction and recognition based on 1) localizing and segmenting touching characters, 2) extracting real sub-words and structural features from word images and 3) recognizing them with a Markovian classifier. The performance of the proposed system is tested using samples extracted from historical handwritten documents. The obtained results are encouraging: we achieved an average recognition rate of 87%.

17 citations


Journal ArticleDOI
TL;DR: An overview of HSM and TLS solutions, along with sample implementations, is presented, and some recommendations on how to combine the two are shared.
Abstract: The Transport Layer Security (TLS) protocol is a well-established standard for securing communication over insecure communication links, offering layer-4 VPN functionality. TLS is widely used in the classical Internet. With the advances of the Internet of Things (IoT) there is an increasing need to secure communication on resource-constrained embedded devices. On these devices, computation of complex cryptographic algorithms is difficult. Additionally, sensor nodes are physically exposed to attackers. Cryptographic acceleration and secure hardware security modules (HSMs) are possible solutions to these challenges. The usage of specialized cryptographic modules for TLS is not a new phenomenon. However, there are still few hardware security modules suitable for use on microcontrollers in sensor networks. We therefore present an overview of HSM and TLS solutions along with sample implementations and share some recommendations on how to combine the two.
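
The abstract concerns TLS on resource-constrained devices backed by hardware security modules; as a minimal, hedged illustration of plain layer-4 TLS usage on a conventional host (not an HSM-backed embedded implementation), Python's standard ssl module can wrap a TCP socket as follows. The host name example.com is only a placeholder.

```python
import socket
import ssl

context = ssl.create_default_context()  # verifies the server certificate against system CAs

with socket.create_connection(("example.com", 443)) as sock:
    with context.wrap_socket(sock, server_hostname="example.com") as tls:
        print(tls.version())                      # negotiated protocol, e.g. 'TLSv1.3'
        print(tls.getpeercert()["subject"])       # verified peer identity
        tls.sendall(b"GET / HTTP/1.1\r\nHost: example.com\r\nConnection: close\r\n\r\n")
        print(tls.recv(200))                      # first bytes of the encrypted response
```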

16 citations


Journal ArticleDOI
TL;DR: The up-to-date progress in data collection, feature extraction, feature selection, and signal change detection is presented; these are essential phases of this work on detecting falls in independent living apartments using accelerometers concealed under tiles.
Abstract: Automatic fall detection is a major issue in taking care of the health of elderly people and has the potential of increasing autonomy and independence while minimizing the risks of living alone. It has been an active research area due to the large demand from the healthcare sector for fall detection products. Fortunately, due to the recent fast progress in sensing technologies, fall detection systems have become feasible. They permit monitoring of the elderly and detection of their falls, and consequently provide emergency support whenever needed. This paper describes the current work on detecting falls in independent living apartments using accelerometers concealed under tiles. We present the up-to-date progress in data collection, feature extraction, feature selection, and signal change detection, which are essential phases of this work.
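
The paper's sensing setup (accelerometers concealed under tiles) is not reproduced here, but as a hedged sketch of the kind of signal-change detection such a pipeline ends with, the snippet below flags candidate fall events when the acceleration magnitude deviates sharply from gravity; the sampling rate and threshold are assumed values, not the authors' parameters.

```python
import numpy as np

FS = 50.0          # assumed sampling rate in Hz
THRESHOLD = 2.5    # assumed impact threshold in g; tuned per sensor in practice

def detect_fall_candidates(ax, ay, az):
    """Return sample indices where the acceleration magnitude exceeds the threshold.

    ax, ay, az are equal-length arrays of accelerations in g.
    """
    magnitude = np.sqrt(np.asarray(ax) ** 2 + np.asarray(ay) ** 2 + np.asarray(az) ** 2)
    return np.flatnonzero(magnitude > THRESHOLD)

# Synthetic one-second trace: quiet signal around 1 g with a simulated impact spike.
t = np.arange(0, 1, 1 / FS)
az = np.ones_like(t) + 0.05 * np.random.randn(t.size)
az[30] = 3.2                              # injected spike standing in for an impact
candidates = detect_fall_candidates(np.zeros_like(t), np.zeros_like(t), az)
print("candidate fall samples:", candidates, "at times", candidates / FS)
```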

14 citations


Journal ArticleDOI
TL;DR: A technique for Twitter Arabic sentiment analysis consisting of a semantic approach and Arabic linguistic features is proposed and implemented, along with a classification technique that uses both Arabic and English sentiment lexicons to classify Arabic tweets into three sentiment categories.
Abstract: Sentiment analysis has grown to be one of the most active research areas in natural language processing and text mining. Many researchers have investigated sentiment analysis and opinion mining from different classification approaches. However, limited research has been conducted on Arabic sentiment analysis compared to the English language. In this paper, we have proposed and implemented a technique for Twitter Arabic sentiment analysis consisting of a semantic approach and Arabic linguistic features. Hence, we introduced a mechanism for preprocessing Arabic tweets, and for the sentiment classification methodology we used a semantic approach. Also, we proposed a classification technique which uses both Arabic and English sentiment lexicons to classify Arabic tweets into three sentiment categories (positive, negative, or neutral). Our experiments show that many issues were encountered when we used the Arabic SentiWordNet facility to classify Arabic tweets directly; these issues are basically related to Arabic text processing. The Arabic lexicons and Arabic tools must be improved or built from scratch in order to improve Arabic sentiment analysis using the semantic approach. The improvement in results, which is due to our contribution in the form of enhanced Arabic lexicons and amended Arabic tools, demonstrates this need.
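
As a simplified, hypothetical sketch of lexicon-based three-way classification (the actual paper combines Arabic and English lexicons with a semantic approach after Arabic-specific preprocessing), the snippet below sums word polarities from a toy lexicon; the lexicon entries and the whitespace tokeniser are placeholders, not the resources used by the authors.

```python
# Toy polarity lexicon; a real system would load Arabic/English sentiment lexicons.
LEXICON = {"good": 1.0, "great": 1.5, "bad": -1.0, "terrible": -1.5}

def classify(tweet: str) -> str:
    """Classify a tweet as positive, negative or neutral from summed word polarities."""
    tokens = tweet.lower().split()   # placeholder tokeniser; Arabic needs real preprocessing
    score = sum(LEXICON.get(tok, 0.0) for tok in tokens)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

print(classify("the service was great"))   # -> positive
print(classify("terrible experience"))     # -> negative
print(classify("it is a phone"))           # -> neutral
```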

Journal ArticleDOI
TL;DR: This paper compares eight proposed methods that use steganography of Arabic language texts with different search algorithms for a secret key, and selects the method that provides the best solution for hiding Arabic language texts.
Abstract: This paper compares eight proposed methods that use steganography of Arabic language texts with different search algorithms for a secret key. All methods use random numbers to generate the secret key. The objectives are to evaluate each method and to select the best method that provides the most suitable solution for hiding Arabic language texts. Secret sharing is the fourth-best method in security. Linear regression is the best method for transparency and capacity of secret message hiding, whereas singular value decomposition is the best method in terms of security and robustness. Huffman code provides secret message compression, security and transparency, and steganography in Microsoft Word documents uses the protocol in layer one of single–double quote, which is weak in security. Conversely, the random subtraction of two images method is the best algorithm in terms of security, robustness, and capacity, while Kashida and single–double quote are the best methods for security, transparency, and robustness, and steganography of twice secret messages in layer one is the best method for security and robustness. Of all the aforementioned methods, secret sharing is the best overall security method.

Journal ArticleDOI
TL;DR: A new approach for Arabic word disambiguation is introduced that utilizes Wikipedia as the lexical resource for disambiguation, using the Vector Space Model and cosine similarity between the word’s context and the senses retrieved from Wikipedia.
Abstract: In this research we introduce a new approach for Arabic word disambiguation that utilizes Wikipedia as the lexical resource for disambiguation. The nearest context for an ambiguous word is selected using the Vector Space Model and cosine similarity between the word’s context and the senses retrieved from Wikipedia. Three experiments have been conducted to evaluate the proposed approach: two experiments use the first retrieved sentence for each sense from Wikipedia but with different Vector Space Models, while the third experiment uses the first retrieved paragraph for each sense. The experiments show that using the first retrieved paragraph is better than the first retrieved sentence, and that using the Tf-Idf VSM is better than using raw frequency.
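
As a hedged sketch of the Tf-Idf vector space comparison described above (using scikit-learn rather than whatever implementation the authors used, and English placeholder glosses instead of Arabic Wikipedia text), each candidate sense gloss is vectorised together with the ambiguous word's context, and the sense with the highest cosine similarity wins.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Placeholder glosses standing in for sentences/paragraphs retrieved from Wikipedia.
senses = {
    "bank_river": "land alongside a river or stream sloping down to the water",
    "bank_money": "financial institution that accepts deposits and makes loans",
}
context = "he sat on the grassy bank watching the river flow past"

vectorizer = TfidfVectorizer()
matrix = vectorizer.fit_transform([context] + list(senses.values()))  # row 0 = context
similarities = cosine_similarity(matrix[0], matrix[1:]).ravel()

best_sense, best_score = max(zip(senses, similarities), key=lambda pair: pair[1])
print(dict(zip(senses, similarities.round(3))))
print("selected sense:", best_sense)
```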

Journal ArticleDOI
TL;DR: This work presents computational intelligence techniques applied to biometrics, from both a theoretical and an application point of view.
Abstract: Biometric systems consist of devices, procedures, and algorithms used to recognize people based on their physiological or behavioral features, known as biometric traits. Computational intelligence (CI) approaches are widely adopted in establishing identity based on biometrics and also in overcoming non-idealities typically present in the samples. Typical areas include sample enhancement, feature extraction, classification, indexing, fusion, normalization, and anti-spoofing. In this context, computational intelligence plays an important role in performing complex non-linear computations by creating models from the training data. These approaches are based on supervised as well as unsupervised training techniques. This work presents computational intelligence techniques applied to biometrics, from both a theoretical and an application point of view.

Journal ArticleDOI
TL;DR: In this paper, it is shown that the Fibonacci Lagrange Interpolation Polynomial (FLIP) can be obtained both recursively and implicitly from the first n + 1 terms of the Fibonacci sequence.
Abstract: The Fibonacci sequence is one of the most common sequences in mathematics. It was first introduced by Leonardo of Pisa in his book Liber Abaci (1202). From the first n + 1 terms of the Fibonacci sequence, a polynomial of degree at most n can be constructed using Lagrange interpolation. In this paper, we show that this Fibonacci Lagrange Interpolation Polynomial (FLIP) can be obtained both recursively and implicitly.
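
A minimal sketch of the construction described above, using SciPy's Lagrange interpolation over the first n + 1 Fibonacci terms; the node placement at 0, 1, ..., n and the starting terms 1, 1 are assumptions, since the paper's exact indexing convention is not shown here.

```python
import numpy as np
from scipy.interpolate import lagrange

def fibonacci(k):
    """First k terms of the Fibonacci sequence: 1, 1, 2, 3, 5, ... (assumed convention)."""
    terms = [1, 1]
    while len(terms) < k:
        terms.append(terms[-1] + terms[-2])
    return terms[:k]

n = 7                                   # interpolate the first n + 1 terms
y = fibonacci(n + 1)
x = np.arange(n + 1)                    # interpolation nodes 0, 1, ..., n
flip = lagrange(x, y)                   # polynomial of degree at most n

print(flip)                             # the interpolating polynomial
print([round(flip(i), 6) for i in x])   # reproduces the Fibonacci terms at the nodes
print(flip(n + 1))                      # extrapolation generally does NOT give the next term
```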

Journal ArticleDOI
TL;DR: A lexical analysis technique for Textual Entailment is adopted to study the suitability of this technique for the Arabic language, and a semantic matching approach is added in order to enhance the precision of the proposed entailment system.
Abstract: Textual Entailment is one of the recent natural language processing challenges, where the meaning of an expression, the “Hypothesis”, can be entailed by the meaning of another expression, the “Text”. In comparison with the English language, the Arabic language poses more challenges in determining the entailment judgment, due to lexical and syntactic ambiguity. The work proposed in this paper adopts a lexical analysis technique for Textual Entailment to study the suitability of this technique for the Arabic language. In addition, a semantic matching approach was added in order to enhance the precision of the proposed entailment system. The lexical analysis is based on calculating word overlap and on bigram extraction and matching. The semantic matching has been incorporated with word overlap to increase the accuracy of word matching. The system has been evaluated by measuring precision and recall; these two metrics are the main evaluation measures used in the Recognizing Textual Entailment challenge number 2 (RTE-2) to evaluate participating systems. The system achieved precisions of 68% and 58% for Entails and NotEntails respectively, with an overall recall of 61%.
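
As a simplified sketch of the lexical layer described above (word overlap plus bigram extraction and matching), the following snippet scores a Text–Hypothesis pair and applies a threshold; the threshold, tokenisation and the omission of the semantic-matching stage are all assumptions for illustration, not the authors' settings.

```python
def bigrams(tokens):
    """Adjacent token pairs of a token list."""
    return set(zip(tokens, tokens[1:]))

def entailment_score(text: str, hypothesis: str) -> float:
    """Average of word-overlap and bigram-overlap coverage of the hypothesis by the text."""
    t_tokens, h_tokens = text.lower().split(), hypothesis.lower().split()
    word_overlap = len(set(h_tokens) & set(t_tokens)) / len(set(h_tokens))
    h_bi = bigrams(h_tokens)
    bigram_overlap = len(h_bi & bigrams(t_tokens)) / len(h_bi) if h_bi else 0.0
    return (word_overlap + bigram_overlap) / 2

THRESHOLD = 0.5   # assumed decision threshold
text = "the company announced record profits for the third quarter"
hyp = "the company announced record profits"
score = entailment_score(text, hyp)
print(score, "Entails" if score >= THRESHOLD else "NotEntails")
```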

Journal ArticleDOI
TL;DR: A differential fruit fly optimisation algorithm (DFOA) is proposed to solve the problem of QoS-aware service composition; it performs a local search in the neighbourhood of the global optimisation solution based on fruit fly swarm differential mutation and crossover operations, thereby obtaining the optimal service composition.
Abstract: With the emergence of a large number of web services with similar functional but different non-functional attributes, how to select appropriate web services from massive candidate services and assemble them into a service composition which can complete a complex value-added business process is of great concern. The problem of QoS-aware service composition belongs to the class of multi-objective decision optimisation problems. Its goal is to optimise the QoS of the entire service composition. To solve this problem, this paper presents a differential fruit fly optimisation algorithm, DFOA. Firstly, a fast global optimisation is carried out through FOA. Then, a local search is performed in the neighbourhood of the global optimisation solution based on fruit fly swarm differential mutation and crossover operations. Finally, the global optimal solution is updated according to the two optimisation results, thereby obtaining the optimal service composition. The experiments verify the feasibility of the algorithm.
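
As a loose, hypothetical sketch of the local refinement step described above (differential mutation and crossover applied around the global best found by the fly swarm), the snippet below uses standard DE-style operators on a toy continuous objective; it is not the authors' DFOA, and the mutation factor, crossover rate and objective are placeholder choices.

```python
import numpy as np

rng = np.random.default_rng(0)
F, CR, DIM, POP = 0.5, 0.9, 5, 20          # assumed DE parameters and problem size

def objective(x):                           # toy stand-in for an aggregated QoS cost
    return np.sum(x ** 2)

swarm = rng.uniform(-5, 5, size=(POP, DIM))
best = min(swarm, key=objective).copy()     # pretend this came from the FOA global phase

for _ in range(100):                        # local search around the current best
    for i in range(POP):
        a, b = swarm[rng.choice(POP, size=2, replace=False)]
        mutant = best + F * (a - b)                        # differential mutation
        mask = rng.random(DIM) < CR                        # binomial crossover
        trial = np.where(mask, mutant, swarm[i])
        if objective(trial) < objective(swarm[i]):         # greedy selection
            swarm[i] = trial
    candidate = min(swarm, key=objective)
    if objective(candidate) < objective(best):             # update the global best
        best = candidate.copy()

print("best cost found:", objective(best))
```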

Journal ArticleDOI
TL;DR: An improved multi-strategy artificial bee colony algorithm (MSABC) is proposed, which performs significantly better than several recently proposed similar algorithms in terms of convergence speed and solution accuracy.
Abstract: The artificial bee colony (ABC) algorithm is a nature-inspired metaheuristic based on imitating the foraging behaviour of bees, which is widely used in solving complex multi-dimensional optimisation problems. In order to overcome the drawbacks of standard ABC, such as slow convergence and low solution accuracy, we propose an improved multi-strategy artificial bee colony algorithm (MSABC). According to the type of position information in ABC, three basic search mechanisms are summarised; these mechanisms are searching around the individual, around a random neighbour, and around the global best solution. Then, the basic search mechanisms are improved to obtain three search strategies. Each bee randomly selects a search strategy to produce a candidate solution with the same probability in each iteration. Thus these strategies can strike a good balance between exploration and exploitation. Finally, experiments are conducted on eight classical functions. Results show that our algorithm performs significantly better than several recently proposed similar algorithms in terms of convergence speed and solution accuracy.
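
As a hedged illustration of the three position-information-based search mechanisms summarised above (around the individual itself, around a random neighbour, and around the global best), the snippet below generates one candidate solution for each mechanism using the standard ABC-style perturbation v_j = c_j + φ·(c_j − o_j) with φ drawn from [−1, 1]; the mapping of the mechanisms onto this formula is my interpretation, and the exact improved strategies of MSABC are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(1)
DIM, POP = 4, 10
colony = rng.uniform(-1, 1, size=(POP, DIM))          # current food-source positions
fitness = -np.sum(colony ** 2, axis=1)                # toy fitness (higher is better)
global_best = colony[np.argmax(fitness)]

def candidate(centre, other):
    """ABC-style move: perturb one dimension of `centre` using its difference to `other`."""
    v = centre.copy()
    j = rng.integers(DIM)                              # single dimension to modify
    v[j] = centre[j] + rng.uniform(-1, 1) * (centre[j] - other[j])
    return v

i = 0
k = rng.integers(POP)                                  # index of a random neighbour
print("around the individual:    ", candidate(colony[i], colony[k]))
print("around a random neighbour:", candidate(colony[k], colony[i]))
print("around the global best:   ", candidate(global_best, colony[i]))
```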

Journal ArticleDOI
TL;DR: Different software complexity models are critically studied and compared, and the software complexity of each program is found using four popular models: LOC, McCabe, Halstead and Cognitive.
Abstract: One of the main problems in software engineering is inherent complexity. Complexity metrics are used to estimate various parameters such as software development cost, the amount of time needed for implementation and the number of tests required. In this paper, different software complexity models are critically studied and compared. As an application, the quicksort algorithm is considered. The programs are written in three object-oriented languages: C++, Visual Basic and Java. The software complexity of each program is found using four popular models: LOC, McCabe, Halstead and Cognitive. The results are compared.
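
As a hedged illustration of how two of the four models can be computed from simple program counts (LOC is just a line count; the Cognitive model is omitted here), the snippet below evaluates McCabe's cyclomatic complexity from decision points and the basic Halstead measures from operator/operand counts; the example counts are invented, not taken from the paper's quicksort programs.

```python
import math

def mccabe(decision_points: int) -> int:
    """Cyclomatic complexity for a single-entry, single-exit routine: V(G) = d + 1."""
    return decision_points + 1

def halstead(n1: int, n2: int, N1: int, N2: int) -> dict:
    """Basic Halstead measures from distinct (n1, n2) and total (N1, N2) operator/operand counts."""
    vocabulary = n1 + n2
    length = N1 + N2
    volume = length * math.log2(vocabulary)
    difficulty = (n1 / 2) * (N2 / n2)
    return {"vocabulary": vocabulary, "length": length,
            "volume": round(volume, 2), "difficulty": round(difficulty, 2),
            "effort": round(difficulty * volume, 2)}

# Invented counts for a small quicksort implementation, for illustration only.
print("LOC:", 25)
print("McCabe V(G):", mccabe(decision_points=4))       # e.g. two ifs plus two loop conditions
print("Halstead:", halstead(n1=12, n2=9, N1=40, N2=32))
```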

Journal ArticleDOI
TL;DR: The outcome of this study is the realization of a neuronal-axon network simulator that exhibits small-world characteristics of clustering with a logarithmic degree of separation between nodes without the need for long-range communication edges.
Abstract: The small-world phenomenon exhibits highly localized clustering and short-cut paths between vertices in a graph, reflecting observed properties of social networks, epidemiological models and other real-world networks. Small-world models rely on the application of constraint-based randomness, or the derivation of constraints on randomness, to simulate the desired network complexities and their associated network connection properties. In this paper, rather than exploring the random properties of small-world networks, we employ deterministic strategies in the design of a computationally efficient distributed neuronal-axon network simulator that results in a small-world network. These strategies are derived by addressing the parallel complexities of the proposed neuronal-axon network simulator, and also from physical constraints imposed by resource limitations of the distributed simulation architecture. The outcome of this study is the realization of a neuronal-axon network simulator that exhibits small-world characteristics of clustering with a logarithmic degree of separation between nodes without the need for long-range communication edges. The importance of this result is the deterministic application of reasoned optimization rules from which the small-world network emerges.

Journal ArticleDOI
TL;DR: In this article, the authors identified 20 factors that are important for social media adoption and short-listed them to nine, based on the criterion that three or more researchers identified them as important in their studies.
Abstract: Millions of people around the globe interact daily with each other and with organizations through social media platforms, which are a very convenient way of communication. Accordingly, public sector organizations around the world have realized the importance of social media and are thus increasingly using it for communicating with their citizens. Although social media provides various benefits, risks also exist; these are not only related to losses of time, money, and effort, but also extend to include risks such as reputation and trust losses. Hence, it is critical for organizations to understand which factors impact social media adoption the most. Based on the literature review, this study initially identifies 20 factors that are important for social media adoption. We short-listed the identified factors to nine, based on the criterion that three or more researchers identified them as important in their studies. Then we asked experts in social media in Oman to rank those factors in terms of their importance. We used the Analytic Hierarchy Process (AHP) method to rank those nine factors. Results of the AHP show that some factors such as Social Media Strategy, Training and Experience of Staff, Community Influence and Top Management Support are more important in a developing country as compared to developed countries.
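
As a minimal sketch of the AHP step described above (not the authors' questionnaire data), the snippet below derives a priority vector from a hypothetical 3×3 pairwise comparison matrix via the principal eigenvector and checks Saaty's consistency ratio; the matrix entries are made up for illustration.

```python
import numpy as np

# Hypothetical pairwise comparison matrix for three factors on Saaty's 1-9 scale,
# e.g. Social Media Strategy vs. Staff Training vs. Top Management Support.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigenvalues, eigenvectors = np.linalg.eig(A)
k = np.argmax(eigenvalues.real)
weights = np.abs(eigenvectors[:, k].real)
weights /= weights.sum()                         # priority vector (sums to 1)

n = A.shape[0]
lambda_max = eigenvalues.real[k]
ci = (lambda_max - n) / (n - 1)                  # consistency index
ri = {3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32, 8: 1.41, 9: 1.45}[n]  # Saaty's random index
print("priorities:", weights.round(3))
print("consistency ratio:", round(ci / ri, 3))   # < 0.10 is conventionally acceptable
```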

Journal ArticleDOI
TL;DR: A novel Adaptive Spider Net Search Algorithm (ASNS) is presented and used to optimize the conventional control scheme used in a shunt active power filter; the results unmistakably prove the usefulness of the proposed algorithm under balanced, unbalanced and distorted supply conditions.
Abstract: In this paper, a novel Adaptive Spider Net Search Algorithm (ASNS) is presented and used to optimize the conventional control scheme used in a shunt active power filter. The effectiveness of the proposed algorithm is proved by applying it under balanced, unbalanced and distorted supply conditions. The conventional sinusoidal current control technique has been used, and soft computing algorithms have been applied to obtain optimum results. The superiority of the ASNS algorithm over existing Genetic Algorithm results is demonstrated by analyzing the THD and compensation time of both algorithms. The simulation results using a MATLAB model confirm that the algorithm has optimized the control technique, which unmistakably proves the usefulness of the proposed algorithm for balanced, unbalanced and distorted supply systems.
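
The ASNS algorithm itself is not reproduced here, but since the comparison above is based on THD, the following snippet shows one standard way to estimate total harmonic distortion of a sampled current waveform from its FFT; the waveform, fundamental frequency and harmonic content are synthetic placeholders.

```python
import numpy as np

FS, F0 = 10_000, 50                      # sampling rate (Hz) and fundamental (Hz), assumed
t = np.arange(0, 0.2, 1 / FS)
# Synthetic distorted current: fundamental plus 5th and 7th harmonics.
i_t = (np.sin(2 * np.pi * F0 * t)
       + 0.10 * np.sin(2 * np.pi * 5 * F0 * t)
       + 0.05 * np.sin(2 * np.pi * 7 * F0 * t))

spectrum = np.abs(np.fft.rfft(i_t))
freqs = np.fft.rfftfreq(t.size, 1 / FS)
fundamental = spectrum[np.argmin(np.abs(freqs - F0))]
harmonics = [spectrum[np.argmin(np.abs(freqs - h * F0))] for h in range(2, 21)]

thd = np.sqrt(np.sum(np.square(harmonics))) / fundamental
print(f"THD = {100 * thd:.2f} %")        # close to sqrt(0.10^2 + 0.05^2), i.e. about 11.2 %
```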

Journal ArticleDOI
TL;DR: A historical overview of design choices for data acquisition and control systems, from the first developments in CAMAC, through the evolution of their designs operating in VMEbus, Firewire and USB, to the latest developments concerning distributed systems using, in particular, wireless protocols and time-triggered architecture.
Abstract: The objective of this paper is to present a historical overview of design choices for data acquisition and control systems, from the first developments in CAMAC, through the evolution of their designs operating in VMEbus, Firewire and USB, to the latest developments concerning distributed systems using, in particular, wireless protocols and time-triggered architecture. The first part of the overview focuses on connectivity aspects, including buses and interconnects, as well as their standardization. More sophisticated designs and a number of challenges are addressed in the second part, among them bus performance, bus safety and security, and others.

Journal ArticleDOI
TL;DR: In this paper, a modification of the image contour segmentation method based on the Canny method is presented, which makes it possible to obtain a sequence of hierarchical object contour preparations with adjustable detail.
Abstract: The paper presents a modification of the image contour segmentation method based on the Canny method. A distinctive feature of the proposed approach is the use of wavelet functions as the underlying transformation. This makes it possible to obtain a sequence of hierarchical object contour preparations with adjustable detail. The approach is part of an object recognition information technology that significantly reduces the amount of information processed by a video surveillance system.

Journal ArticleDOI
TL;DR: The aim is to implement a scalable and extensible platform for automatically retrieving the diacritic marks for undiacritized dialectal Arabic texts with different rule-based and statistical techniques.
Abstract: In this paper, the problem of missing diacritic marks in most dialectal Arabic written resources is addressed. Our aim is to implement a scalable and extensible platform for automatically retrieving the diacritic marks for undiacritized dialectal Arabic texts. Different rule-based and statistical techniques are proposed, including maximum likelihood estimation and statistical n-gram models. The proposed platform includes helper tools for text pre-processing and encoding conversion. The diacritization accuracy of each technique is evaluated in terms of Diacritic Error Rate (DER) and Word Error Rate (WER). The approach trains several n-gram models on different lexical units. A data pool of both Modern Standard Arabic (MSA) data and Dialectal Arabic data was used to train the models.
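
As a hedged sketch of the maximum likelihood estimate baseline mentioned above (each undiacritized word is restored to its most frequent diacritized form seen in training), together with a simple word error rate computation, the snippet below uses transliterated placeholder strings instead of real Arabic script and training data.

```python
from collections import Counter, defaultdict

# Toy "training" pairs of (undiacritized, diacritized) words; placeholders for real Arabic data.
training = [("ktb", "kataba"), ("ktb", "kutub"), ("ktb", "kataba"), ("qlm", "qalam")]

counts = defaultdict(Counter)
for bare, full in training:
    counts[bare][full] += 1
mle = {bare: forms.most_common(1)[0][0] for bare, forms in counts.items()}  # most frequent form

def diacritize(words):
    """Restore each word to its most frequent diacritized form; back off to the input itself."""
    return [mle.get(w, w) for w in words]

def wer(reference, hypothesis):
    """Word error rate counting only substitutions (equal-length word sequences assumed here)."""
    errors = sum(r != h for r, h in zip(reference, hypothesis))
    return errors / len(reference)

hyp = diacritize(["ktb", "qlm", "bnt"])
print(hyp)                                              # ['kataba', 'qalam', 'bnt']
print("WER:", wer(["kataba", "qalam", "bint"], hyp))    # one error out of three words
```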

Journal ArticleDOI
TL;DR: In this article, the Ensemble Kalman Filter is used to predict water levels and water flow when the water reaches the barrage, and the results obtained from this method are then used as input for controlling the floodgates.
Abstract: One of the flood control structures, especially in downstream areas, is the barrage. It is optimized here using Ensemble Kalman Filter based nonlinear model predictive control. The Ensemble Kalman Filter is used to predict water levels and water flow when the water reaches the barrage. The results obtained from this method are then used as input for controlling the floodgates. Simulations are performed for three circumstances, namely normal flow, flooding and drought. For normal flow, the optimum quantities obtained from the NMPC are used to operate the floodgate openings. Simulations were performed for 100 hours, with a gap of 5 hours between observations. The EnKF yields system accuracy, with estimation RMSE values of less than 1: the RMSE for the discharge is 0.5346 and the RMSE for the water level is 0.2716. Furthermore, the gate opening operation achieves its optimum value, with movement between 40 and 65 per cent and an average difference of movement of 0.10065 per cent. Under flood conditions, with a water flow of 2,000 m3/s and a water level of 10 m, the gate opening ranges between 98 and 100 per cent and the difference in gate opening is 0.028835; the RMSE of the flow rate estimate is 1.5835, while that of the water level is 0.3145. Under dry flow conditions, with a water flow of 10 m3/s and a water level of 1 m, the gate opening ranges between 0 and 1 per cent and the difference in gate opening is 0.41289 per cent; the RMSE of the flow rate estimate is 0.0826, while that of the water level is 0.0677.
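
As a stripped-down, hypothetical sketch of the Ensemble Kalman Filter analysis step used for state estimation in such a setup (a two-dimensional state of discharge and water level, with a linear observation of the level only), the snippet below implements the standard perturbed-observation ensemble update; the noise levels and numbers are invented, not the paper's barrage model.

```python
import numpy as np

rng = np.random.default_rng(42)
N = 50                                        # ensemble size (assumed)
H = np.array([[0.0, 1.0]])                    # observe only the water level
R = np.array([[0.05 ** 2]])                   # observation noise covariance (assumed)

# Forecast ensemble of states [discharge (m3/s), water level (m)], columns = members.
Xf = np.array([[120.0], [2.0]]) + rng.normal(0, [[10.0], [0.3]], size=(2, N))
y = np.array([2.4])                           # incoming water-level measurement

# Ensemble covariance from anomalies, then the Kalman gain.
mean = Xf.mean(axis=1, keepdims=True)
A = Xf - mean
P = A @ A.T / (N - 1)
K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)

# Perturbed-observation update: each member assimilates a noisy copy of y.
Y = y[:, None] + rng.normal(0, np.sqrt(R[0, 0]), size=(1, N))
Xa = Xf + K @ (Y - H @ Xf)
print("analysis mean state:", Xa.mean(axis=1).round(3))   # would feed the NMPC gate controller
```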

Journal ArticleDOI
TL;DR: In this article, the development of genetic algorithms (GA) used for analyzing and predicting regional economic growth (REG), with the agriculture share (SA) and the industry share (SI) as independent variables, is presented, covering 13 districts/cities in East Kalimantan Province with 2002-2012 datasets.
Abstract: This paper outlines and presents the development of genetic algorithms (GA) used for analyzing and predicting Regional Economic Growth (REG), with the agriculture share (SA) and the industry share (SI) as independent variables, covering 13 districts/cities in East Kalimantan Province with 2002-2012 datasets. The genetic algorithm (GA) was used for modeling the REG datasets. The experimental results show that the GA produced a prediction value of 92.389. These results indicate that the average price fluctuation decreased in 2012, while the southern region increased significantly in 2013. The results indicate that GA is a good algorithm for the prediction of REG. The paper concludes by recommending some future work that can be carried out in order to improve the prediction accuracy.
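
As a loose, hypothetical sketch of using a GA to fit a predictive model of REG from the two shares (the paper's encoding, fitness function and data are not available here), the snippet below evolves the coefficients of a linear model REG ≈ b0 + b1·SA + b2·SI against a tiny synthetic dataset; every numeric value is invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(7)

# Tiny synthetic dataset: columns are SA, SI; the target is REG (all values invented).
X = np.array([[0.30, 0.40], [0.25, 0.50], [0.40, 0.30], [0.20, 0.55], [0.35, 0.35]])
y = np.array([5.1, 5.8, 4.6, 6.0, 4.9])

def fitness(coeffs):
    """Negative mean squared error of REG = b0 + b1*SA + b2*SI for each individual."""
    pred = coeffs[:, [0]] + coeffs[:, 1:] @ X.T              # shape (population, samples)
    return -np.mean((pred - y) ** 2, axis=1)

POP, GENS = 40, 200
population = rng.uniform(-10, 10, size=(POP, 3))             # [b0, b1, b2] per individual
for _ in range(GENS):
    scores = fitness(population)
    parents = population[np.argsort(scores)[-POP // 2:]]      # truncation selection
    # Uniform crossover between random parent pairs, then Gaussian mutation.
    pa = parents[rng.integers(len(parents), size=POP - len(parents))]
    pb = parents[rng.integers(len(parents), size=POP - len(parents))]
    mask = rng.random(pa.shape) < 0.5
    children = np.where(mask, pa, pb) + rng.normal(0, 0.1, size=pa.shape)
    population = np.vstack([parents, children])

best = population[np.argmax(fitness(population))]
print("fitted coefficients [b0, b1, b2]:", best.round(3))
```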

Journal ArticleDOI
TL;DR: A new probability distribution called the serial Weibull Rayleigh distribution has been proposed and its mathematical properties derived and it is revealed that the new distribution provides a better fit compared to other candidate distributions.
Abstract: In this study, a new probability distribution called the serial Weibull Rayleigh distribution has been proposed and its mathematical properties derived. An application of the new distribution using a real lifetime dataset revealed that the new distribution provides a better fit compared to other candidate distributions.

Journal ArticleDOI
TL;DR: This paper has described and implemented Remote Authentication Dial-In User Service (RADIUS) protocol and Authentication, Authorization and Accounting (AAA) server integrated with Mikrotik for bandwidth management in Universitas Mulawarman.
Abstract: An internet traffic management mechanism that includes monitoring and network security is indispensable. The main purposes of network monitoring are bandwidth optimization and maintaining network security. This paper describes and implements the Remote Authentication Dial-In User Service (RADIUS) protocol and an Authentication, Authorization and Accounting (AAA) server integrated with Mikrotik. This article deals with the implementation of bandwidth management, covering both the LAN (Local Area Network) and Wi-Fi (Wireless Fidelity) at Universitas Mulawarman. Based on the experiments, the system is simple and easy to use; it controls and allocates bandwidth to users (lecturers, staff, and students) as they authenticate on the LAN and Wi-Fi. Furthermore, from a network security perspective, users who are not registered to use the internet at Universitas Mulawarman can be controlled as well.

Journal ArticleDOI
TL;DR: In this article, it is shown that charge carriers in semiconductors are electrons and holes and that their numbers are controlled by the concentrations of impurity elements, i.e. doping concentration; for that reason doping concentration has great influence on carrier mobility.
Abstract: The term carrier mobility generally alludes to both electron and hole mobility in semiconductors. These parameters characterize how quickly an electron and/or hole moves through a metal or semiconductor under the influence of an electric field. Most studied mobility models only take into account the influence of temperature and doping concentration, which provides less accurate but faster simulation and allows preliminary device description adjustments and analysis. However, complete models, like Klaassen, Shirahata or some allowed model combinations, give results that better fit experimental curves. This work focuses on such possibilities and shows that, as carriers are accelerated in an electric field, their velocity begins to saturate when the electric field magnitude becomes significant. Such effects are observed in low-field, high-field and inversion mobility models simulated in strained-silicon devices. These effects are accounted for by reducing the effective mobility. Furthermore, it is shown that the charge carriers in semiconductors are electrons and holes and that their numbers are controlled by the concentrations of impurity elements, i.e. the doping concentration; for that reason the doping concentration has a great influence on carrier mobility. Carriers are able to flow more quickly in materials with higher mobility; since the speed of an embedded device is limited by the time it takes a carrier to move from one side to the other, devices composed of materials with higher mobility are able to achieve higher speeds.
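
As a hedged numerical illustration of the velocity-saturation effect described above (not the Klaassen or Shirahata models, which are considerably more involved), the snippet below evaluates a widely used Caughey-Thomas-style field-dependent mobility and the resulting drift velocity for electrons in silicon; the low-field mobility, saturation velocity and exponent are typical textbook values, not fitted parameters from the paper.

```python
import numpy as np

MU_LOW = 1400.0    # low-field electron mobility in lightly doped Si, cm^2/(V*s) (typical value)
V_SAT = 1.0e7      # electron saturation velocity in Si, cm/s (typical value)
BETA = 2.0         # Caughey-Thomas exponent commonly used for electrons

def effective_mobility(E):
    """Field-dependent mobility: mu_low / (1 + (mu_low*E/v_sat)^beta)^(1/beta)."""
    E = np.asarray(E, dtype=float)
    return MU_LOW / (1.0 + (MU_LOW * E / V_SAT) ** BETA) ** (1.0 / BETA)

fields = np.array([1e2, 1e3, 1e4, 1e5])           # electric field in V/cm
mu = effective_mobility(fields)
velocity = mu * fields                            # drift velocity in cm/s
for E, m, v in zip(fields, mu, velocity):
    print(f"E = {E:8.0f} V/cm   mu_eff = {m:7.1f} cm^2/Vs   v = {v:.2e} cm/s")
# At high fields the drift velocity flattens out near V_SAT, i.e. the effective mobility is reduced.
```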