
Showing papers in "International Journal of Digital Information and Wireless Communications in 2014"


Journal ArticleDOI
TL;DR: A study of the GA Playground, a tool based on Genetic Algorithms, demonstrates solving the Knapsack Problem with a fitness function, together with a case study on how images can be reproduced using optimal parameters.
Abstract: In today's world, optimal and intelligent problem-solving approaches are required in every field, regardless of whether the problems are simple or complex. Researchers and developers are trying to make machines and software more efficient and intelligent. This is where Artificial Intelligence plays its role, in developing efficient and optimal search algorithms. The genetic algorithm is one of the most pervasive and advanced heuristic search techniques in AI. A genetic algorithm (GA) finds the most optimized solution for a given problem using inheritance, mutation, selection and other techniques. Genetic algorithms have proved to be powerful unbiased optimization techniques for sampling a large solution space. In this paper, we use a GA for image optimization and Knapsack Problems, which are commonly found in real-world scenarios. Furthermore, we study a tool based on Genetic Algorithms, called the GA Playground, to demonstrate the capability of solving the Knapsack Problem with a fitness function, together with a case study on how images can be reproduced using optimal parameters. Lastly, a few methods, such as hash tables and the Taguchi Method, are suggested to improve the performance of the Genetic Algorithm.

89 citations
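The GA workflow the abstract describes (fitness, selection, crossover, mutation) can be sketched for the 0/1 Knapsack Problem. This is a minimal illustration rather than the GA Playground's implementation; the item list, capacity, and operator choices (truncation selection, single-point crossover, bit-flip mutation) are all assumptions made for the example:

```python
import random

random.seed(7)  # fixed seed so the sketch is reproducible

# Hypothetical 0/1 knapsack instance: (value, weight) pairs and a capacity.
ITEMS = [(60, 10), (100, 20), (120, 30), (40, 15), (70, 25)]
CAPACITY = 50

def fitness(chromosome):
    """Total value of the selected items; overweight solutions score 0."""
    value = sum(v for gene, (v, _) in zip(chromosome, ITEMS) if gene)
    weight = sum(w for gene, (_, w) in zip(chromosome, ITEMS) if gene)
    return value if weight <= CAPACITY else 0

def crossover(a, b):
    """Single-point crossover of two parent bit strings."""
    point = random.randrange(1, len(a))
    return a[:point] + b[point:]

def mutate(chromosome, rate=0.1):
    """Independent bit-flip mutation."""
    return [1 - g if random.random() < rate else g for g in chromosome]

def solve(pop_size=30, generations=100):
    pop = [[random.randint(0, 1) for _ in ITEMS] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[:pop_size // 2]  # truncation selection keeps the best half
        children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                    for _ in range(pop_size - len(parents))]
        pop = parents + children
    return max(pop, key=fitness)

best = solve()
print(fitness(best))
```

Here infeasible (overweight) selections are simply zeroed out by the fitness function; penalty-based fitness is a common alternative when feasible solutions are rare.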


Journal ArticleDOI
TL;DR: A systematic literature review identifies the factors that generate risks during the requirement engineering process in the global software development paradigm, together with the risks those factors may generate; the resulting list supports progressive enhancement of requirement engineering activities in global software development.
Abstract: The challenges of Requirements Engineering become more acute when it is performed in the global software development paradigm. There can be many reasons for this challenging nature. Risk can be one of them, as there is more risk exposure in the global development paradigm, so it may be one of the main reasons that Requirements Engineering is more challenging. First, there is a need to identify the factors which actually generate these risks. This paper therefore identifies not only the factors but also the risks which these factors may generate. A systematic literature review is performed to identify these factors and the risks which may occur during the requirement engineering process in the global software development paradigm. The resulting list supports progressive enhancement of requirement engineering activities in the global software development paradigm. This work is especially useful for less experienced people working in global software development.

31 citations


Journal ArticleDOI
TL;DR: A general model to assess information credibility on different UGC platforms, including Twitter, is proposed. It employs a contextual credibility approach that examines the effect of culture, situation, topic variations, and language on assessing credibility, using the Arabic context as an example.
Abstract: Due to the growing dependence on WWW User-Generated Content (UGC) as a primary source for information and news, research on web credibility is becoming more important than ever before. In this paper we review previous efforts to evaluate information credibility, focusing specifically on microblogging. In particular, we provide a comparison of different systems for automatic assessment of information credibility based on the techniques and features they use, and we classify the Twitter credibility surveys based on the features considered. We then propose a general model to assess information credibility on different UGC platforms, including Twitter, which employs a contextual credibility approach that examines the effect of culture, situation, topic variations, and language on assessing credibility, using the Arabic context as an example. We identify several factors that users may consider in determining credibility, and argue that the importance of each factor may vary with the context. Future work will include both a user study and machine learning techniques to evaluate the effectiveness of various factors for information credibility classification in different contexts.

28 citations


Journal ArticleDOI
TL;DR: The performance of the designed controller is shown to follow a desired position, velocity, acceleration and heading angle of the quadrotor despite the fully unknown parameters and measurement noise.
Abstract: A robust nonlinear composite adaptive control algorithm is developed for a 6-DOF quadrotor system. The system is considered to suffer from parametric uncertainty and a noise signal. The under-actuated system is split into two subsystems using dynamic inversion. A sliding mode controller controls the internal dynamics, while the adaptive controller controls the fully actuated subsystem. All the plant parameters, such as mass, system inertia, and thrust and drag factors, are considered fully unknown and time-varying. The composite adaptive control is driven by information from two errors: the tracking error and the prediction error. The adaptive control is enhanced with a robust technique to reject the noise. The stability of the closed-loop system is derived in the flight region of interest. The performance of the designed controller is also shown to follow a desired position, velocity, acceleration and heading angle of the quadrotor despite the fully unknown parameters and measurement noise.

26 citations


Journal ArticleDOI
TL;DR: The main focus of this paper is to investigate the impacts of Knowledge Management (KM) on Risk Management (RM) in IT project implementation process.
Abstract: IT project management is not free from risks, which arise from various sources in the environment. Thus, a comprehensive understanding of these possible risks, and strategic policies to confront them, is a fundamental requirement for the successful implementation of IT projects. The risks faced during the implementation of IT projects are not just financial. IT project managers must embrace these fundamental issues with a more holistic view, rather than merely focusing on financial matters. In order to prevent potential problems from arising or escalating to a larger magnitude, serious attention must be given to them before the implementation of any IT project. The main focus of this paper is to investigate the impact of Knowledge Management (KM) on Risk Management (RM) in the IT project implementation process.

23 citations


Journal ArticleDOI
TL;DR: A general instructional design model is proposed to teach students in faculties of education, especially in departments of educational technology, how to design and develop online virtual labs in a common way, guiding the students in refining future learning environments using recent technology.
Abstract: The purpose of this work is to propose a general instructional design model to teach students in faculties of education, especially in departments of educational technology, how to design and develop online virtual labs in a common way. We analysed previous instructional design models and related studies on virtual labs to specify the distinctive features of the proposed model. It was found that online virtual labs have no conventional instructional design model, especially for the design and development stages, and no common shape and components. Based on these results, we arrived at a new proposed model which guides the students in refining future learning environments using recent technology. In this paper, we also present a list of criteria for designing and developing online virtual labs, as modern principles for directing the design process of online virtual lab environments so that they become instructional products. We derived these criteria from previous studies on virtual labs, e-Learning technologies and miscellaneous technological resources in educational technology. These criteria would provide students with educational and technological guidelines to produce online virtual labs with high quality and efficiency.

17 citations


Journal ArticleDOI
TL;DR: A simple yet efficient data scheduling scheme is proposed in this paper to manage incoming data packets from the system based on their application types and priorities. It shows a promising solution for guaranteeing higher throughput to high-priority data while giving sufficient access to low-priority data without introducing much delay impact.
Abstract: Smart Home and Ambient Assisted Living (SHAAL) systems utilize advanced and ubiquitous technologies, including sensors and other devices that are integrated in the residential infrastructure or are wearable, to capture data describing activities of daily living and health-related events. However, with the introduction of these technology-orientated services come a number of challenges, which to date are still largely unsolved. The management and processing of the large quantities of data generated from multiple sensors is recognized as one of the most significant challenges. Therefore, a simple yet efficient data scheduling scheme is proposed in this paper to manage incoming data packets from the system based on their application types and priorities. The performance of this lightweight context-aware scheme is investigated in a real SHAAL setting under two scenarios: centralized and distributed setups. The experimental results show that the proposed scheme offers a promising solution for guaranteeing higher throughput to high-priority data while giving sufficient access to low-priority data without introducing much delay impact.

12 citations
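The kind of priority-based packet scheduling the abstract describes can be sketched with a heap. The application classes, their priority ordering, and the payloads below are hypothetical, not the paper's actual scheme:

```python
import heapq
import itertools

# Hypothetical SHAAL traffic classes: lower number = higher priority.
PRIORITY = {"emergency": 0, "health": 1, "activity": 2, "ambient": 3}

class DataScheduler:
    """Sketch of a priority scheduler: packets dequeue by application class,
    with a FIFO tie-break so low-priority data is still served in order."""

    def __init__(self):
        self._heap = []
        self._counter = itertools.count()  # preserves FIFO order within a class

    def enqueue(self, app_type, payload):
        heapq.heappush(self._heap, (PRIORITY[app_type], next(self._counter), payload))

    def dequeue(self):
        _, _, payload = heapq.heappop(self._heap)
        return payload

s = DataScheduler()
s.enqueue("ambient", "temp=21C")
s.enqueue("emergency", "fall detected")
s.enqueue("health", "hr=88")
print(s.dequeue())  # prints "fall detected": the emergency packet is served first
```

The sequence counter in each heap entry both breaks ties and keeps payloads from ever being compared directly.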


Journal ArticleDOI
TL;DR: The SH-SLA model is proposed to generate a hierarchical self-healing SLA in cloud computing. Each SLA can check its QoS, notify dependent SLAs of its recent status, and prevent or propagate the notified violations through an urgent reaction.
Abstract: The service level agreement (SLA) is a mutual contract between the service provider and the consumer which determines the agreed service level objectives (SLOs). The common SLA is a plain documental agreement without any relation to other dependent SLAs across the different layers of cloud computing. Hence, the cloud computing environment needs hierarchical and autonomic SLAs. This paper proposes the SH-SLA model for generating a hierarchical self-healing SLA in cloud computing. The self-healing ability comprises SLA monitoring, violation detection and violation reaction processes. In SH-SLA, related SLAs communicate with each other hierarchically. Each SLA can check its QoS and notify dependent SLAs of its recent status. Furthermore, SH-SLA can prevent the notified violations, or propagate them, through an urgent reaction. Consequently, service providers have a good chance of preventing an SLA violation before end users sense it. The SH-SLA model is simulated, and the experimental results demonstrate the violation detection and reaction abilities of the proposed model in cloud computing. Moreover, end users experience fewer violations with SH-SLA than with the common SLA.

11 citations


Journal ArticleDOI
TL;DR: This work shows a new concept of adding structure to highly unstructured, heterogeneous data, which will greatly improve the overall process of storing and effectively retrieving it, and uses Human-Computer Interaction patterns as a case study to present the proof of concept.
Abstract: Structured data has inherently great automation value: it readily lends itself to software tools that help store, organize and search effectively. With the growing dependence on data, we face many new problems. While software applications replace each other, and older software rapidly diminishes in value, data has the extreme opposite nature, which we can call the "cumulative effect". Unlike software, we see new data as an addition to the old, so we tend to accumulate data continuously without deleting anything. Even older data is often archived for its value, for legal reasons, or just because we can never be sure whether we will need it again. This would not be a problem if we had structured data, as we could automate the storing and retrieval process. However, most valuable information lies inside unstructured data, which is extremely difficult to store and retrieve at large scale. Our work shows a new concept of adding structure to highly unstructured, heterogeneous data, which will greatly improve the overall process of storing and effectively retrieving it. We use Human-Computer Interaction (HCI) patterns as a case study to present our proof of concept.

10 citations


Journal ArticleDOI
TL;DR: Empirical results show that the proposed approach for Persian/Arabic handwritten digit recognition has very good generalization accuracy, the best among state-of-the-art methods.
Abstract: Automated handwritten character recognition seems necessary due to the increasing number of Persian/Arabic handwritten documents. A new approach for Persian/Arabic handwritten digit recognition is proposed in this paper. This approach employs the Local Binary Pattern (LBP) operator as the base feature extraction method. Although this operator has shown great performance in research areas such as context and object recognition, it has not been used for the Persian/Arabic handwritten digit recognition problem. The first step in the proposed approach involves smoothing, converting the black-and-white input image to a grayscale intensity image, and resizing it to a fixed size. In the next step, the input image is divided into several blocks, and the LBP operator is applied to each block to extract features. Finally, these features are used to train a multi-layer perceptron neural network with a circular approach. Empirical results show that the proposed approach has very good generalization accuracy (99.72%) on the Hoda dataset, with 60,000 training and 20,000 test samples. This accuracy is the best among state-of-the-art methods.

10 citations
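The basic 3x3 LBP operator the abstract builds on can be sketched as follows. The tiny test image, neighbour ordering, and per-image histogram are illustrative, not the paper's exact block configuration:

```python
# Toy 8-neighbour LBP on a grayscale image stored as a list of rows.
def lbp_code(img, r, c):
    """Standard 3x3 LBP: threshold the 8 neighbours against the centre pixel
    and pack the results into an 8-bit code."""
    centre = img[r][c]
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    code = 0
    for bit, (dr, dc) in enumerate(offsets):
        if img[r + dr][c + dc] >= centre:
            code |= 1 << bit
    return code

def lbp_histogram(img):
    """256-bin histogram of LBP codes over the interior pixels - the kind of
    per-block feature vector that would be fed to an MLP classifier."""
    hist = [0] * 256
    for r in range(1, len(img) - 1):
        for c in range(1, len(img[0]) - 1):
            hist[lbp_code(img, r, c)] += 1
    return hist

img = [[10, 20, 30],
       [40, 50, 60],
       [70, 80, 90]]
print(lbp_code(img, 1, 1))  # prints 120: only the four brighter neighbours set bits
```

In practice the histograms of all blocks are concatenated into a single feature vector per digit image.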


Journal ArticleDOI
TL;DR: The quantitative analysis of the survey results shows that more than 50% of the students view collaborative learning to have a large benefit in design-based learning.
Abstract: The aim of this paper is to analyse and present cloud-linked as well as campus-linked students' perceptions of collaborative learning and design-based learning in engineering. Project-oriented design-based learning (PODBL) is a learning and teaching approach in which students learn through design activities while being driven by projects. PODBL enhances cloud-linked and campus-linked students' ability to acquire career-essential skills that fulfil future industry needs. A paper-based survey is used to capture a cohort of students' experience of collaborative learning and design-based learning in engineering. The paper-based survey was given to 30 students from an engineering discipline. The quantitative analysis of the survey results shows that more than 50% of the students view collaborative learning as having a large benefit in design-based learning.

Journal ArticleDOI
TL;DR: This paper presents comprehensive study on the need for Usability Risk Assessment Model to reduce usability problems in software products.
Abstract: Usability is an important factor in ensuring the development of quality, usable software products. Ignorance and unawareness of the concept of usability, and failure to address usability during the software development process, have led to usability problems in software products. Many efforts have been suggested in the literature to overcome usability problems in software products, but current practices face challenges in reducing them. Alternatively, the concept of risk management can be used to control usability problems, even though these problems cannot be eliminated totally. The concept of risk management is important for dealing with a usability problem before it occurs. Unfortunately, there is still a lack of a proper definition of usability risk, and of a proper model to identify, analyze and prioritize potential usability risks during the Software Development Lifecycle (SDLC). This paper presents a comprehensive study of the need for a Usability Risk Assessment Model to reduce usability problems in software products.

Journal ArticleDOI
TL;DR: The enhanced AHP method used is the Guided Ranked AHP (GRAHP), a technique where decision matrix tables are automatically filled in based on ranked data and can still be altered by following the guidelines which serves the purpose of improving the consistency of the decision matrix table.
Abstract: The application of a model base in group decision making, making up a Group Decision Support System (GDSS), is of paramount importance. The Analytic Hierarchy Process (AHP) is a multi-criteria decision making (MCDM) method that has been applied in GDSS. In order to be used effectively in a GDSS, AHP needs to be customized so that it is more user-friendly, with ease-of-use features. In this paper, we propose an enhanced AHP model for GDSS tendering. The enhanced AHP method used is the Guided Ranked AHP (GRAHP). It is a technique in which decision matrix tables are automatically filled in based on ranked data. However, the generated values in the decision matrix tables can still be altered by following the guidelines, which in turn serves to improve the consistency of the decision matrix table. This process is transparent to decision makers because the degree of data inconsistency is visible. A prototype system based on the tendering process has been developed to test the GRAHP model in terms of its applicability and robustness.
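A core ingredient of any AHP variant, GRAHP included, is measuring the consistency of a pairwise-comparison decision matrix. A sketch of Saaty's consistency ratio follows, with a hypothetical 3-criteria matrix; the paper's guided-ranking step itself is not reproduced here:

```python
# Hypothetical 3-criteria pairwise comparison matrix on Saaty's 1-9 scale.
A = [[1,     3,     5],
     [1 / 3, 1,     3],
     [1 / 5, 1 / 3, 1]]

def lambda_max(m, iters=200):
    """Principal eigenvalue of a positive matrix via power iteration."""
    n = len(m)
    v = [1.0] * n
    for _ in range(iters):
        w = [sum(m[i][j] * v[j] for j in range(n)) for i in range(n)]
        s = sum(w)
        v = [x / s for x in w]
    w = [sum(m[i][j] * v[j] for j in range(n)) for i in range(n)]
    return sum(w[i] / v[i] for i in range(n)) / n  # average eigenvalue ratio

def consistency_ratio(m):
    """CR = CI / RI, where CI = (lambda_max - n) / (n - 1); CR < 0.1 is
    conventionally taken as acceptably consistent."""
    n = len(m)
    RI = {3: 0.58, 4: 0.90, 5: 1.12}[n]  # Saaty's random index
    ci = (lambda_max(m) - n) / (n - 1)
    return ci / RI

print(round(consistency_ratio(A), 3))
```

Improving this ratio is exactly what GRAHP's guideline-driven adjustments of the generated matrix values aim at.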

Journal ArticleDOI
TL;DR: Some processes in the DSS domain of COBIT 5 are introduced and mapped onto different areas of the eTOM Operations phase, to enrich the Operations-phase processes of service assurance and fulfillment, and to increase customer retention, loyalty and SLA fulfillment.
Abstract: eTOM is a standard framework defined for the telecommunication business process area. It consists of three phases: 1. Operations; 2. Strategy, Infrastructure and Product; 3. Enterprise Management. The goal of this paper is to enrich the processes of the Operations phase, including service assurance and fulfillment, and to increase customer retention, loyalty and SLA fulfillment. For this purpose, some processes in the DSS domain of COBIT 5 are introduced and mapped onto different areas of the eTOM Operations phase.

Journal ArticleDOI
TL;DR: A statistical simulation model for spectrum sensing of cognitive radio and the associated interference probability calculation methodologies is investigated and the capability to simulate multiple cognitive radio systems with issues of complex range of spectrum engineering and radio compatibility are explored.
Abstract: Conventional fixed spectrum allocation results in a large part of frequency band remaining underutilized. Channels that are dedicated to licensed (primary) users are out of reach of unlicensed users, while the licensed users do not occupy the channel completely, at all times. Cognitive Radio is an attractive concept and solution, capable of addressing this issue. This paper investigates a statistical simulation model for spectrum sensing of cognitive radio and the associated interference probability calculation methodologies. The capability to simulate multiple cognitive radio systems with issues of complex range of spectrum engineering and radio compatibility are explored. Simulations were carried out and output parameters such as sensing received signal strength, cell capacity and achieved bitrate were obtained and analyzed. The results were obtained for different conditions with particular emphasis on CDMA systems and OFDMA systems. The article also highlights the results obtained for studying detection threshold and the associated interference probability.

Journal ArticleDOI
TL;DR: Experimental results show that this method can detect falls effectively; in addition, it is more portable than other devices.
Abstract: Fall detection is one of the major issues in the health care field. Falls can cause serious injury, both physiological and psychological, especially to old people. A reliable fall detector can provide rapid emergency medical care to people who have fallen. Thus, a reliable and effective fall detection system is necessary. In this paper, we propose a system which utilizes a mobile phone as a fall detector. When a fall accident occurs, the system has three response procedures for help. The first procedure transmits an emergency message to related people. The second procedure shows the user's status and location on a web-page map, according to the user's GPS location and status. The third procedure sounds an alarm, so that people near the user can notice that the user needs help. First, a waist-mounted mobile phone captures accelerometer data from the human body, and the Discrete Cosine Transform (DCT) is adopted to analyze the accelerometer values in order to distinguish activities of daily living (ADL) from falls. ADL consist of walking, standing and sitting. We utilize the tri-axial accelerometer in the mobile phone to capture the signal and transmit it to the server via the Internet. Two judgments are made on the server: the first is based on an adaptive threshold for the energy detected by the DCT, where the setting of the adaptive threshold includes height, weight and gender; the second is based on the tilt of the smartphone. Experimental results show that this method can detect falls effectively; in addition, it is more portable than other devices.
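The DCT-energy judgment described above can be sketched as follows. The window length, threshold value, and sample signals are invented for illustration, and the fixed threshold stands in for the paper's adaptive one (which also factors in height, weight and gender):

```python
import math

def dct(signal):
    """Unnormalized DCT-II of a 1-D accelerometer magnitude window."""
    N = len(signal)
    return [sum(x * math.cos(math.pi * k * (2 * n + 1) / (2 * N))
                for n, x in enumerate(signal))
            for k in range(N)]

def is_fall(window, threshold):
    """Flag a fall when the energy in the non-DC DCT coefficients
    exceeds the threshold."""
    coeffs = dct(window)
    energy = sum(c * c for c in coeffs[1:])  # skip the DC term (gravity)
    return energy > threshold

still = [1.0] * 16                 # roughly 1 g while standing still
fall = [1.0] * 8 + [3.5, 0.2] * 4  # an impact spike followed by sharp dips
print(is_fall(still, threshold=5.0), is_fall(fall, threshold=5.0))
# prints: False True
```

A constant window contributes only to the DC coefficient, so its AC energy is (numerically) zero, while the impact pattern concentrates large energy in the higher coefficients.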

Journal ArticleDOI
TL;DR: The paper describes the process of modelling the domain knowledge of Information Science (IS) by creating an Ontology of the Information Science domain (OIS), and reports on the life cycle of the ontology building process using Methontology, based on the IEEE standard for developing software life cycle processes.
Abstract: The paper describes the process of modelling the domain knowledge of Information Science (IS) by creating an Ontology of the Information Science domain (OIS). It also reports on the life cycle of the ontology building process using Methontology, based on the IEEE standard for developing software life cycle processes, which mainly consists of: specification, conceptualization, formalization, implementation, maintenance and evaluation. The information resources used in acquisition and evaluation were obtained from Information Science. The conceptualization consists of identifying IS concepts and grouping them into a hierarchy tree based on a faceted classification scheme. The OIS ontology is formalized by using the ontology editor Protege to generate the ontology code. The achieved result is the OIS ontology, which has fourteen facets: Actors, Method, Practice, Studies, Mediator, Kinds, Domains, Resources, Legislation, Philosophy & Theories, Societal, Tool, Time and Space. The model is evaluated using ontology quality criteria to check the ontology's usefulness, and how it could be transferred into an application ontology for Information Science education.

Journal ArticleDOI
TL;DR: The history of this tool is reviewed, and stock is taken of the rich variety of computational problems and applications that have been tackled with it during the past thirty years.
Abstract: A paper published in 1983 established that the rotating calipers paradigm provides an elegant, simple, and yet powerful computational tool for solving several two-dimensional geometric problems. Since then the rotating calipers have been extended to three dimensions, and have been applied to many new problems. In the present paper the history of this tool is reviewed, and stock is taken of the rich variety of computational problems and applications that have been tackled with it during the past thirty years.
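A classic instance of the paradigm, computing the diameter of a convex polygon by rotating a caliper over its antipodal vertex pairs, can be sketched as follows. This is a textbook formulation for illustration, not the 1983 paper's own pseudocode:

```python
def cross(o, a, b):
    """Twice the signed area of triangle (o, a, b)."""
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def diameter_squared(poly):
    """Rotating calipers on a convex polygon given in counter-clockwise order:
    for each edge, advance the opposite vertex while it moves farther from the
    edge, and record the antipodal candidate distances."""
    n = len(poly)
    if n == 2:
        return (poly[0][0] - poly[1][0]) ** 2 + (poly[0][1] - poly[1][1]) ** 2
    best = 0
    j = 1
    for i in range(n):
        nxt = (i + 1) % n
        # rotate the opposite caliper while the triangle area keeps growing
        while (cross(poly[i], poly[nxt], poly[(j + 1) % n]) >
               cross(poly[i], poly[nxt], poly[j])):
            j = (j + 1) % n
        for p in (poly[j], poly[(j + 1) % n]):
            d = (poly[i][0] - p[0]) ** 2 + (poly[i][1] - p[1]) ** 2
            best = max(best, d)
    return best

square = [(0, 0), (2, 0), (2, 2), (0, 2)]  # CCW convex polygon
print(diameter_squared(square))  # prints 8: the squared diagonal
```

Because the opposite vertex only ever advances, the whole scan is linear in the number of vertices, which is the hallmark of the rotating calipers technique.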

Journal ArticleDOI
TL;DR: The purpose of this paper is to provide a novel architecture for cloud environment, based on recent best practices and frameworks and other cloud reference architecture, and a new service model has been introduced in this proposed architecture.
Abstract: Today, across various businesses, administrators and senior managers seek new technologies and approaches which they can utilize more easily and affordably, thereby raising their competitive profit and utility. Information and Communications Technology (ICT) is no exception to this principle. The cloud computing concept and technology, with its inherent advantages, has created a new ecosystem in the world of computing and is driving the ICT industry a step forward. This technology can play an important role in an organization's durability and IT strategies. Nowadays, due to the progress and global popularity of cloud environments, many organizations are moving to the cloud, and some well-known IT solution providers, such as IBM and Oracle, have introduced specific architectures to be deployed in cloud environments. On the other hand, the use of IT frameworks can be the best way to integrate business processes with other processes. The purpose of this paper is to provide a novel architecture for the cloud environment, based on recent best practices and frameworks and on other cloud reference architectures. Meanwhile, a new service model is introduced in this proposed architecture. This architecture is finally compared with a few other architectures, in the form of statistical graphs, to show its benefits.

Journal ArticleDOI
TL;DR: A time-slotted spectrum sharing protocol named the Channel Usage and Collision Based MAC protocol (CUCB-MAC) is proposed. It is a modification of a MAC protocol named the Collision-Based MAC protocol (CB-MAC), and it depends on three parameters for the allocation of channels to the SUs.
Abstract: In Cognitive Radio Networks (CRNs), the unlicensed users, named Secondary Users (SUs), are allowed to share the licensed wireless spectrum band with its licensed users, named Primary Users (PUs), but without degradation of the PUs' Quality of Service (QoS). As all the SUs have the same priority to access the licensed spectrum band, a spectrum sharing protocol is needed to divide the available spectrum band fairly among these SUs. Spectrum sharing protocols in CRNs are similar to Medium Access Control (MAC) protocols in regular networks. In this paper, a time-slotted spectrum sharing protocol named the Channel Usage and Collision Based MAC protocol (CUCB-MAC) is proposed. It is a modification of a MAC protocol named the Collision-Based MAC protocol (CB-MAC), and it depends on three parameters for the allocation of channels to the SUs: 1) counting the number of collisions for each SU, 2) predicting the availability probability of all the available channels, and then 3) excluding some of the available channels. It is shown that the proposed CUCB-MAC protocol outperforms the original CB-MAC protocol on all the measured performance metrics.


Journal ArticleDOI
TL;DR: This paper compares the performance of popular AI techniques in approaching the solution of an N-Puzzle of size 8, on a 3x3 matrix board, and the solution of the classic 8 Queens Puzzle.
Abstract: This paper compares the performance of popular AI techniques, namely Breadth First Search, Depth First Search, A* Search, Greedy Best First Search and Hill Climbing Search, in approaching the solution of an N-Puzzle of size 8, on a 3x3 matrix board, and the solution of the classic 8 Queens Puzzle. It looks at the complexity of each algorithm as it approaches the solution, in order to evaluate the operation of each technique and identify the better-functioning one in various cases. The N-Puzzle and the 8 Queens Puzzle are used as the test scenarios. An application was created implementing each of the algorithms to extract results for each of the cases. The paper also depicts the search effort each algorithm goes through while processing the solution, and hence helps to clarify the specific cases in which one technique may be preferred over another.
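The first of the compared techniques, Breadth First Search on the 8-puzzle, can be sketched as follows; the board encoding and goal state are illustrative choices, not taken from the paper's application:

```python
from collections import deque

GOAL = (1, 2, 3, 4, 5, 6, 7, 8, 0)  # 0 marks the blank tile

def neighbours(state):
    """States reachable by sliding one adjacent tile into the blank."""
    i = state.index(0)
    r, c = divmod(i, 3)
    for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
        nr, nc = r + dr, c + dc
        if 0 <= nr < 3 and 0 <= nc < 3:
            j = nr * 3 + nc
            s = list(state)
            s[i], s[j] = s[j], s[i]
            yield tuple(s)

def bfs_moves(start):
    """Breadth-first search: returns the minimal number of moves to the goal,
    or -1 for the unsolvable half of the state space."""
    frontier = deque([(start, 0)])
    seen = {start}
    while frontier:
        state, depth = frontier.popleft()
        if state == GOAL:
            return depth
        for nxt in neighbours(state):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append((nxt, depth + 1))
    return -1

start = (1, 2, 3, 4, 5, 6, 7, 0, 8)  # one slide away from the goal
print(bfs_moves(start))  # prints 1
```

BFS guarantees the shortest solution but explores many states; the informed searches the paper compares (A*, Greedy Best First) trade that exhaustiveness for heuristic guidance.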

Journal ArticleDOI
TL;DR: An analysis of the current engineering educational context is presented, and an interactive content authoring system as well as a virtual professor are proposed, in an attempt to make the learning experience richer and more motivating for students.
Abstract: Heated debates involving reforms in the educational system are becoming more and more frequent in recent years. This is due to the increasingly evident shortcomings in the educational system and its difficulty in evolving at the same pace as technological development. The aim of this work is to present an analysis of the current engineering educational context and to propose an interactive content authoring system as well as a virtual professor, in an attempt to make the learning experience richer and more motivating for students.

Journal ArticleDOI
TL;DR: This paper critically reviews the domain of standards and models, identifies an initial list of situational factors from them, and adopts the constant comparison and memoing techniques of Grounded Theory to identify a unique initial list of situational factors.
Abstract: A clear description of the requirements engineering (RE) process can be an important factor in guiding the team members involved in the RE process, which may help organizations not to exceed the estimated schedule and budget for a software project. There can be many reasons for not having an efficient RE process, such as changing situations among the organizations involved in the RE process; this is certainly one of the various reasons for a non-efficient RE process. Due to these changing situations, the RE process should be customized accordingly. As part of our current research project, in which we are investigating situational RE in Global Software Development (GSD), we first need to identify the factors which may result in changing situations. In this paper we critically review the domain of standards and models, and identify an initial list of situational factors from them. These standards and models are related to software engineering and directly or indirectly discuss the RE process. We adopted the constant comparison and memoing techniques of Grounded Theory to identify a unique initial list of situational factors. This initial list is a significant step toward a comprehensive list of situational factors affecting the RE process in GSD.

Journal ArticleDOI
TL;DR: The simulation results show that the proposed cooperative scheme achieves lower signal-to-noise ratio (SNR) values for a desired bit-error rate (BER), and higher spectral efficiency, compared to ACM direct transmission and ACM cooperation with a single relay.
Abstract: In this paper, the performance of a cooperative wireless communication system based on combined best relay selection (BRS) and adaptive LDPC coded modulation (ACM) is investigated. These investigations focus on evaluating the performance of the proposed cooperative wireless communication system over independent, non-identical Rayleigh fading channels in terms of bit-error rate (BER), using MATLAB® computer simulations, and on comparing the system performance with ACM direct transmission and ACM cooperation with a single relay. The simulation results show that the proposed cooperative scheme achieves lower signal-to-noise ratio (SNR) values for a desired BER, and higher spectral efficiency, compared to ACM direct transmission and ACM cooperation with a single relay.

Journal ArticleDOI
TL;DR: This paper presents a new approach to modeling a wireless channel using finite mixture models (FMM), and results indicate that models composed of a mixture of Rayleigh and Lognormal distributions consistently provide good fits for most of the impulses of the UWB channel.
Abstract: This paper presents a new approach to modeling a wireless channel using finite mixture models (FMMs). Instead of the conventional approach of using non-mixture (single) probability distribution functions, FMMs are used here to model the channel impulse response amplitude statistics. To demonstrate this, an FMM-based model of Ultra-wideband (UWB) channel amplitude statistics is developed. In this research, finite mixture models composed of combinations of constituent PDFs such as Rayleigh, Lognormal, Weibull, Rice, and Nakagami are used to model the channel amplitude statistics. The use of FMMs is relevant because of their ability to characterize multimodality in the data. The stochastic expectation maximization (SEM) technique is used to estimate the parameters of the FMMs. The resulting FMMs are then compared to one another and to non-mixture models using model selection techniques such as Akaike's Information Criterion (AIC). Results indicate that models composed of a mixture of Rayleigh and Lognormal distributions consistently provide good fits for most of the impulses of the UWB channel. Other model selection techniques, such as Minimum Description Length (MDL) and Accumulative Predictive Error (APE), confirmed this finding. The selection of an FMM based on Rayleigh and Lognormal distributions holds for both the industrial and the university environment channel data.
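The AIC-based model selection step can be sketched in a few lines. The sketch below fits two single (non-mixture) candidates to synthetic Rayleigh amplitudes by maximum likelihood and compares their AIC scores; the SEM fitting of the actual Rayleigh-Lognormal mixture is omitted, and the data are synthetic, not the paper's UWB measurements.

```python
import math
import random

random.seed(7)

# synthetic amplitudes from a unit-power Rayleigh channel: |h| = sqrt(2E), E ~ Exp(1)
data = [math.sqrt(2 * random.expovariate(1.0)) for _ in range(2000)]
n = len(data)

# candidate 1: Rayleigh, MLE sigma^2 = mean(x^2)/2  (k = 1 parameter)
sigma2 = sum(x * x for x in data) / (2 * n)
ll_ray = sum(math.log(x / sigma2) - x * x / (2 * sigma2) for x in data)

# candidate 2: Lognormal, MLE on the log-amplitudes  (k = 2 parameters)
logs = [math.log(x) for x in data]
mu = sum(logs) / n
s2 = sum((l - mu) ** 2 for l in logs) / n
ll_log = sum(-l - 0.5 * math.log(2 * math.pi * s2)
             - (l - mu) ** 2 / (2 * s2) for l in logs)

def aic(loglik, k):
    # Akaike's Information Criterion: 2k - 2 ln L, lower is better
    return 2 * k - 2 * loglik

print("AIC Rayleigh :", round(aic(ll_ray, 1), 1))
print("AIC Lognormal:", round(aic(ll_log, 2), 1))
```

Because the data are genuinely Rayleigh, the Rayleigh candidate attains the lower AIC; in the paper the same criterion is applied to fitted mixtures, where the Rayleigh-Lognormal combination wins.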

Journal ArticleDOI
TL;DR: The paper presents the complete System Development Life Cycle (SDLC) of the firing practice system and associated WSN, which provides an automatic bullet-impact count during firing training sessions and is modular and scalable in design, supporting multiples of eight concurrent shooters.
Abstract: The critical importance of an efficient infantryman in special operations forces, tactical paramilitary units, and Law Enforcement Agencies (LEAs) cannot be overstated. One of the many vital attributes of an effective soldier is excellent marksmanship, which requires extensive training at sophisticated firing ranges. Modern firing ranges are supported by Automatic Firing Practice Systems (AFPS), and this paper presents the design and development of such a system based on a WSN. The AFPS provides an automatic bullet-impact count during firing training sessions and is modular and scalable in design, supporting multiples of eight concurrent shooters. The system is versatile and flexible, allowing for different small arms and firing training modes, and supports night firing exercises. The AFPS comprises two major components: the automatic target-box and the commander console. The automatic target-box has a motor and gear assembly, target sheet, bullet-impact sensor, control board, and WiFi communication module. The commander console is a ruggedized, sunlight-readable 10.4" Tablet PC which, with its built-in WiFi, acts as an access point. The automatic target-boxes, equipped with embedded WiFi modules, form the sensor nodes of a WSN. The paper presents the complete System Development Life Cycle (SDLC) of the firing practice system and the associated WSN. The AFPS and its bullet-impact sensor were extensively tested on firing ranges for accuracy of the bullet-impact count. The results showed a bullet-impact count accuracy of over 97 percent.
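The data path from target-box to commander console can be sketched as a simple aggregation loop. The message format, field names, and de-duplication rule below are our own illustrative assumptions, not the protocol described in the paper.

```python
import json
from collections import defaultdict

# Hypothetical message format: each target-box reports one JSON record
# per detected bullet impact, e.g. {"box_id": 3, "seq": 17, "ts": 1700000000.5}

class CommanderConsole:
    """Aggregates impact reports from the target-boxes of a lane group,
    de-duplicating WiFi retransmissions by the (box_id, seq) pair."""

    def __init__(self):
        self.seen = set()                 # (box_id, seq) pairs already counted
        self.counts = defaultdict(int)    # per-box bullet-impact count

    def on_message(self, raw):
        msg = json.loads(raw)
        key = (msg["box_id"], msg["seq"])
        if key in self.seen:              # duplicate retransmission: ignore
            return False
        self.seen.add(key)
        self.counts[msg["box_id"]] += 1
        return True

console = CommanderConsole()
for raw in (
    '{"box_id": 1, "seq": 1, "ts": 0.1}',
    '{"box_id": 1, "seq": 1, "ts": 0.1}',   # retransmitted duplicate
    '{"box_id": 1, "seq": 2, "ts": 0.4}',
    '{"box_id": 2, "seq": 1, "ts": 0.6}',
):
    console.on_message(raw)
print(dict(console.counts))   # box 1 -> 2 impacts, box 2 -> 1 impact
```

Sequence-number de-duplication of this kind is one plausible way a WiFi-based count could stay accurate despite link-layer retransmissions.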

Journal ArticleDOI
TL;DR: The main aim of the literature review was to find out the latest scientific researches around IT governance, IT management and IT service management within enterprise architecture.
Abstract: IT decision making is supported by various frameworks for IT governance, IT management, and enterprise architecture. Organizations are adopting Enterprise Architecture (EA) frameworks to improve the interoperability of the information systems used in the production of services. Therefore, the main aim of our literature review (212 hits, 20 appraised hits) was to find out the latest scientific research on IT governance, IT management, and IT service management within enterprise architecture. Commitment to matters of various kinds can be achieved through assessments. Accordingly, commitment to enterprise architecture work and ICT governance was assessed in the spring of 2014 in Northern Savonia with an online survey. Our survey on ICT governance had 25 questions; there were 331 potential respondents and we received 136 answers. In this paper, we report the answers to five questions that position ICT governance in Northern Savonia.

Journal ArticleDOI
TL;DR: A review of statements for information system modernization shows that the concept of modernization is unestablished, and that if the authors want to share similar notions about information system modernization, they have to be transparent about where the changes have effect.
Abstract: Changes in organizational purpose or function usually concern information systems, which then have to be modernized. However, making information system modernization decisions is difficult because suppliers and clients may have different notions of modernization. In this paper, we present a review of statements about information system modernization. The objective of this review is to describe currently reported knowledge in terms of what kind of modernization is defined and what the domain of modernization is. We found 42 statements about modernization in 36 papers. These findings show that the concept of modernization is unestablished. However, if we want to share similar notions about information system modernization, then we have to be transparent about where the changes have effect. An analysis of relationships between the entities of enterprise architecture is presented to help in making information system modernization decisions. The case-based lessons concern the semantic assets of Finnish social welfare. The suppliers of three client information systems (CIS) were evaluated, and they answered that 2-58% of the semantic assets are unknown. Furthermore, the main data groups and logical data stores that are not allowed in the TOGAF content metamodel are described.

Journal ArticleDOI
TL;DR: This approach focuses on generating a tree based communication architecture that is made up of three different types of sensors and ensures that at least one sensor of each type is connected to the communication backbone; thereby forming a MULTI-TREE.
Abstract: A wireless sensor network consisting of heterogeneous sensors is capable of dealing with more than one type of information about a region at a time. Sensor applications that cater to disaster relief require faster data delivery and more effective connectivity than other monitoring applications; moreover, sensors usually collect a single type of data. Our approach focuses on generating a tree-based communication architecture that is made up of three different types of sensors and ensures that at least one sensor of each type is connected to the communication backbone, thereby forming a MULTI-TREE. By responding to a query initiated by the sink for one type of data at a time, the heterogeneous sensors reduce the carried load and the latency of data transmission. Our strategy, implemented for a structural health monitoring application, performs better than a minimum spanning tree under the same simulation constraints, as it reduces the carried load and delay to one third under the existing set of ad hoc protocols.
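The coverage property that distinguishes a MULTI-TREE from a plain spanning tree (at least one sensor of each type attached to the backbone) can be sketched as a simple check. The topology, sensor types, and BFS-based tree construction below are illustrative assumptions; the abstract does not specify the paper's actual construction algorithm.

```python
from collections import deque

# Hypothetical topology: node -> (sensor_type, neighbours). The sink has no
# sensor type. None of these names are taken from the paper.
nodes = {
    "sink": (None, ["a", "b"]),
    "a": ("temperature", ["sink", "c"]),
    "b": ("strain", ["sink", "d"]),
    "c": ("vibration", ["a"]),
    "d": ("temperature", ["b"]),
}

def bfs_tree(root):
    """Return the parent map of a BFS tree rooted at the sink."""
    parent, queue = {root: None}, deque([root])
    while queue:
        u = queue.popleft()
        for v in nodes[u][1]:
            if v not in parent:
                parent[v] = u
                queue.append(v)
    return parent

def covers_all_types(tree, required):
    """Check the MULTI-TREE property: every required sensor type
    has at least one node attached to the backbone."""
    attached = {nodes[n][0] for n in tree if nodes[n][0] is not None}
    return required <= attached

tree = bfs_tree("sink")
print(covers_all_types(tree, {"temperature", "strain", "vibration"}))  # True
```

A real construction would also weigh link costs when choosing parents; the check above only captures the type-coverage constraint that the backbone must satisfy.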