
Showing papers in "International Journal of Computer Science and Business Informatics in 2014"


Journal Article
TL;DR: Smart phones now represent a supreme objective for malware writers, and the evolution, propagation, detection and control of mobile malware need to be explored.
Abstract: These days, mobile devices are an inseparable part of our everyday lives, and their usage has grown exponentially in recent years. With the functionality upgrades of mobile phones, the malware threat for mobile phones is expected to increase. This threat is made worse by fast internet access, which provides attractive targets for malware developers. The growth of smart phone usage can also be directly linked to the capability to support third-party applications offered through online application markets. Smart phones therefore now represent a supreme objective for malware writers. The evolution, propagation, detection and control of mobile malware need to be explored.

11 citations


Journal Article
TL;DR: Simulation results signify that the proposed protocol performs satisfactorily in secure routing and is robust against both single and cooperative Black Hole attacks in a dynamic environment.
Abstract: In a Wireless Sensor Network (WSN), security remains a major challenge due to its dynamic topology, open wireless medium, lack of centralized infrastructure, intermittent connectivity and resource-constrained sensor nodes. These weaknesses make a WSN easily compromised by an adversary to devise abundant attacks with disastrous consequences. The Black Hole attack is one of them: a malicious node exploits the trustworthiness of the network by promising to route data packets to the destination, claiming the shortest path, but in reality drops all packets and consequently threatens reliability. To accomplish secure packet transmission, an efficient trust-based secure protocol is proposed to defend against single and cooperative Black Hole attacks. The proposed protocol incorporates trust metric estimation to determine the honesty of nodes during secure path formation. The proposed system builds a Hierarchical Cluster Topology and is experimentally evaluated to demonstrate its effectiveness in detecting and preventing Black Hole attacks. Comparison with an existing approach [9] shows that the proposed system efficiently reduces the possibility of misbehaving nodes taking part in the network communication process and achieves a better packet delivery ratio, higher throughput and lower end-to-end delay. Simulation results signify that the proposed protocol performs satisfactorily in secure routing and is robust against both single and cooperative Black Hole attacks in a dynamic environment.
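As an illustration of the trust-estimation idea, here is a minimal Python sketch of a forwarding-ratio trust metric. The abstract does not give the paper's exact formula, so the ratio metric, the optimistic default for unseen nodes and the threshold are all illustrative assumptions.

```python
# Sketch of a forwarding-ratio trust metric for black hole detection.
# The update rule and threshold are assumptions, not the paper's method.
class NodeTrust:
    def __init__(self, trust_threshold=0.5):
        self.forwarded = {}   # node_id -> packets observed forwarded
        self.sent = {}        # node_id -> packets handed to that node
        self.trust_threshold = trust_threshold

    def record(self, node_id, forwarded_ok):
        self.sent[node_id] = self.sent.get(node_id, 0) + 1
        if forwarded_ok:
            self.forwarded[node_id] = self.forwarded.get(node_id, 0) + 1

    def trust(self, node_id):
        sent = self.sent.get(node_id, 0)
        if sent == 0:
            return 1.0  # no evidence yet: optimistic default
        return self.forwarded.get(node_id, 0) / sent

    def is_suspect(self, node_id):
        # A node that advertises routes but drops most packets
        # (black hole behaviour) falls below the trust threshold.
        return self.trust(node_id) < self.trust_threshold
```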

10 citations


Journal Article
TL;DR: A lightweight authentication protocol for mobile cloud environment is proposed that has many advantages such as: supporting user anonymity, local authentication and also resistance against related attacks such as replay attack, stolen verifier attack, modification attack, server spoofing attack and so on.
Abstract: ABI Research estimates that the number of mobile cloud computing users will grow from 42.8 million (1.1% of total mobile users) in 2008 to 998 million (19% of total mobile users) in 2014. Security risks have become a hurdle to the rapid adoption of mobile cloud computing technology, and significant efforts have been devoted by research organizations and academia to securing it. In this paper we propose a lightweight authentication protocol for the mobile cloud environment. Our proposed protocol has many advantages, such as supporting user anonymity and local authentication, as well as resistance against related attacks such as the replay attack, stolen verifier attack, modification attack, server spoofing attack and so on.
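The paper's actual message flow is not given in the abstract; as a hedged illustration of the kind of primitive such lightweight protocols build on, here is a nonce-based challenge-response step in Python. The key setup and identifiers are invented for the example.

```python
# Illustrative challenge-response sketch (not the paper's protocol):
# a fresh server nonce defeats replayed responses.
import hmac, hashlib, os

shared_key = os.urandom(16)            # assumed established at registration

nonce = os.urandom(16)                 # server issues a fresh challenge
response = hmac.new(shared_key, nonce, hashlib.sha256).digest()

# Server verifies with its own copy of the key; a replayed response
# fails because the nonce changes every session.
expected = hmac.new(shared_key, nonce, hashlib.sha256).digest()
print(hmac.compare_digest(response, expected))   # True
```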

8 citations


Journal Article
TL;DR: The contribution of this work is twofold: first, to detect spam content in Twitter and prevent it from being displayed; second, to design a new classifier that detects spammers accurately.
Abstract: Social networking sites have become popular in recent years; among these, Twitter is one of the fastest growing, playing the dual role of Online Social Network (OSN) and micro-blogging service. Spammers invade Twitter trending topics (popular topics discussed by Twitter users) to pollute the useful content. Social spamming is more successful than email spamming because it exploits the social relationships between users. Spam detection is important because Twitter is widely used for commercial advertisement, and spammers invade users' private information and damage their reputation. Spammers can be detected using content-based and user-based attributes, with traditional classifiers applied for spam detection. The contribution of this work is twofold: first, to detect spam content in Twitter and prevent it from being displayed; second, to design a new classifier that detects spammers accurately.
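A minimal sketch of training a classifier on content- and user-based attributes, as the abstract describes. The feature names, toy data and choice of random forest are illustrative assumptions; the abstract does not specify the paper's feature set or its new classifier.

```python
# Hedged sketch: spammer detection from content/user features.
from sklearn.ensemble import RandomForestClassifier

# Each row: [urls_per_tweet, hashtags_per_tweet, follower_ratio,
#            account_age_days]; label 1 = spammer, 0 = legitimate.
X = [
    [0.9, 4.0, 0.05,  12],   # many URLs/hashtags, young account
    [0.1, 0.5, 1.80, 900],
    [0.8, 3.2, 0.10,  30],
    [0.0, 0.2, 2.50, 1500],
]
y = [1, 0, 1, 0]

clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)
print(clf.predict([[0.7, 3.5, 0.08, 20]]))  # -> likely [1] (spammer)
```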

8 citations


Journal Article
TL;DR: In this paper, the authors compared various cloning detection techniques with respect to their time complexities and robustness of detection against various post processing operations such as cropping, brightness and contrast adjustments.
Abstract: In recent years, tampering of digital images has become a general habit among people and professionals. As a result, establishing image authenticity has become a key issue in fields that make use of digital images. Image authentication involves separating original camera outputs from their tampered or stego counterparts. Digital image cloning being a popular type of image tampering, this paper analyses and compares various cloning detection techniques with respect to their time complexities and their robustness of detection against various post-processing operations such as cropping, brightness and contrast adjustments.
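A minimal sketch of block-based copy-move (clone) detection, one of the technique families such comparisons typically cover. The block size and exact-match criterion are simplifying assumptions; practical detectors use DCT or PCA block features precisely so that detection survives the post-processing operations the paper evaluates.

```python
# Exact-match block hashing for clone detection (illustrative only).
import numpy as np

def find_cloned_blocks(img, block=8):
    """Return pairs of top-left coordinates of identical blocks."""
    h, w = img.shape
    seen, matches = {}, []
    for y in range(0, h - block + 1):
        for x in range(0, w - block + 1):
            key = img[y:y+block, x:x+block].tobytes()
            if key in seen:
                matches.append((seen[key], (y, x)))
            else:
                seen[key] = (y, x)
    return matches

rng = np.random.default_rng(0)
img = rng.integers(0, 256, (32, 32), dtype=np.uint8)
img[18:26, 18:26] = img[2:10, 2:10]   # clone an 8x8 patch elsewhere
print(find_cloned_blocks(img))         # reports the cloned pair
```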

6 citations


Journal Article
Sharad Gangele
TL;DR: An approximate methodology to estimate the bounded area using the Trapezoidal rule of numerical quadrature is presented; the bounded area is found to be directly proportional to customer choice and network blocking.

Abstract: The problem of internet traffic sharing between two operators was discussed by Naldi (2002), who developed a mathematical relationship between traffic share and network blocking probability. This relationship generates a probability-based quadratic function with a definite bounded area. This area is a function of many parameters and needs to be estimated, but it is difficult to solve by direct integration methods. This paper presents an approximate methodology to estimate the bounded area using the Trapezoidal rule of numerical quadrature. It is found that the bounded area is directly proportional to customer choice and network blocking, which helps to explain the relationship between traffic share and computer network parameters.
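A short sketch of the composite trapezoidal rule the paper applies. The integrand below is a stand-in; the paper's actual quadratic in the blocking probability is not reproduced in the abstract.

```python
# Composite trapezoidal rule for approximating a bounded area.
def trapezoid(f, a, b, n=1000):
    """Approximate the integral of f over [a, b] with n sub-intervals."""
    h = (b - a) / n
    area = 0.5 * (f(a) + f(b))
    for i in range(1, n):
        area += f(a + i * h)
    return area * h

# Illustrative quadratic in the blocking probability L on [0, 1].
f = lambda L: (1 - L) ** 2
print(trapezoid(f, 0.0, 1.0))   # ~0.3333; the exact value is 1/3
```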

6 citations


Journal Article
TL;DR: The goal of this paper is to increase the awareness about the importance of nonfunctional requirements and to analyze the various techniques that are used to prioritize the NFRs.
Abstract: Nonfunctional Requirements (NFRs) are as important as functional requirements, but they have often been neglected, poorly understood and not considered adequately in the software development process. If the NFRs are not met properly, customers will be dissatisfied. NFRs may be even more critical than functional requirements, as there can be mutual dependencies among them which affect the completion of the project. Hence it is necessary to prioritize the NFRs effectively, yet prioritizing NFRs is a challenging task in software development. Many techniques are used to prioritize requirements along various dimensions, and it is important to choose the appropriate prioritization technique for a particular software development process. One can select the appropriate technique based on various factors such as the stakeholders involved, the available resources, the product being developed, and so on. The goal of this paper is to increase awareness of the importance of NFRs and to analyze the various techniques that are used to prioritize them.

6 citations


Journal Article
TL;DR: This paper proposes clustering techniques to cluster web log data sets and uses a popularity- and similarity-based PageRank algorithm to make predictions when ambiguous results are found.
Abstract: Predicting a user's web page access is a challenging task that continues to gain importance as the web grows. Understanding users' next page access helps in formulating guidelines for web site personalization. Server-side log files provide information that enables reconstruction of user navigation sessions within the web site, where a session consists of a sequence of web pages viewed by a user within a given time. Web navigation behavior is helpful in understanding what information online users demand. In this paper, we present a system that focuses on improving the prediction of web page access. We propose to use clustering techniques to cluster the web log data sets; as a result, a more accurate Markov model is built on each group rather than on the whole data set. Markov models are commonly used to identify the next page to be accessed by the user based on the previously accessed pages. We then use a popularity- and similarity-based PageRank algorithm to make predictions when ambiguous results are found. PageRank is a numeric value that represents how important a page is on the web: when one page links to another page, it is effectively casting a vote for the other page, and the more votes a page receives, the more important it must be.
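A minimal sketch of a first-order Markov next-page predictor built from sessions, with a fallback when the prediction is ambiguous. The session data and the simple popularity tie-break are illustrative; the paper resolves ambiguity with its popularity- and similarity-based PageRank.

```python
# First-order Markov model over page transitions (illustrative data).
from collections import Counter, defaultdict

sessions = [["home", "news", "sports"],
            ["home", "news", "weather"],
            ["home", "shop", "cart"]]

transitions = defaultdict(Counter)
for s in sessions:
    for cur, nxt in zip(s, s[1:]):
        transitions[cur][nxt] += 1

def predict_next(page, popularity=None):
    counts = transitions[page]
    if not counts:
        return None
    best = counts.most_common()
    # Ambiguity: several candidates tie on transition count; the paper
    # would fall back to its popularity/similarity-based PageRank here.
    top = [p for p, c in best if c == best[0][1]]
    if len(top) > 1 and popularity:
        return max(top, key=lambda p: popularity.get(p, 0))
    return top[0]

print(predict_next("home"))                      # "news" (2 of 3 sessions)
print(predict_next("news", {"weather": 0.9}))    # tie broken by popularity
```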

5 citations


Journal Article
TL;DR: This paper presents a transformation approach that consists of a source metamodel for UML 2 sequence diagrams, a target metamodel for Petri Nets and transformation rules, implemented using the Atlas Transformation Language (ATL).
Abstract: UML 2 sequence diagrams are a well-known graphical language widely used to specify the dynamic behaviors of transaction-oriented systems. However, sequence diagrams are expressed in a semi-formal modeling language and need a well-defined formal semantic base for their notations. Such formalization enables analysis and verification tasks. Many efforts have been made to transform sequence diagrams into formal representations, including Petri Nets. Petri Nets are a mathematical tool allowing formal specification of system dynamics, and they are commonly used in model checking. In this paper, we present a transformation approach that consists of a source metamodel for UML 2 sequence diagrams, a target metamodel for Petri Nets, and transformation rules. This approach has been implemented using the Atlas Transformation Language (ATL). A Cellular Phone System is considered as a case study.
Keywords: UML 2, Sequence diagrams, Petri Nets, Model checking, Model transformation, Metamodeling, Transformation rules, ATL
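As a rough illustration of the core mapping idea (the paper implements it as ATL rules over metamodels; the Python rendering, the message list and the place-naming convention below are assumptions for illustration only), each sequence-diagram message can be turned into a Petri net transition with input and output places:

```python
# Toy message-to-transition mapping (not the paper's ATL rules).
messages = [("User", "Phone", "dial"), ("Phone", "Network", "connect")]

places, transitions, arcs = set(), [], []
for i, (src, dst, name) in enumerate(messages):
    p_in, p_out = f"p{i}_{src}", f"p{i}_{dst}"   # sender/receiver states
    places.update([p_in, p_out])
    transitions.append(name)                      # message fires a transition
    arcs += [(p_in, name), (name, p_out)]

print(transitions)   # ['dial', 'connect']
print(sorted(arcs))
```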

5 citations


Journal Article
TL;DR: The proposed framework uses a multi-criteria optimization technique to identify non-dominated sets among feasible service providers and select the best Cloud Service Provider from the requirement set defined by the cloud user.
Abstract: Cloud computing is a rapidly evolving area that offers large potential for organizations of all sizes to increase efficiency. A cloud broker acts as a mediator between cloud users and cloud service providers. The main functionality of the cloud broker lies in selecting the best Cloud Service Provider (CSP) from the requirement set defined by the cloud user: requests from cloud users are processed by the cloud broker, and suitable providers are allocated to them. The proposed framework uses a multi-criteria optimization technique to identify non-dominated sets among the feasible service providers.
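A minimal sketch of extracting the non-dominated (Pareto) set of providers over two criteria, cost (lower is better) and performance (higher is better). The provider names and scores are illustrative assumptions.

```python
# Pareto filtering of candidate cloud service providers.
providers = {"CSP-A": (0.10, 0.70),   # (cost per hour, performance score)
             "CSP-B": (0.12, 0.95),
             "CSP-C": (0.15, 0.60),   # dominated by A and B
             "CSP-D": (0.08, 0.55)}

def dominates(p, q):
    """p dominates q: no worse on both criteria, better on at least one."""
    return (p[0] <= q[0] and p[1] >= q[1]) and (p[0] < q[0] or p[1] > q[1])

pareto = [name for name, p in providers.items()
          if not any(dominates(q, p) for other, q in providers.items()
                     if other != name)]
print(pareto)   # ['CSP-A', 'CSP-B', 'CSP-D']
```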

5 citations


Journal Article
TL;DR: Five different image fusion algorithms, SWT, Fuzzy, Neuro-Fuzzy, Fuzzylet and Neuro-Fuzzylet, are discussed and tested with two datasets (mono-spectral and multi-spectral).
Abstract: Image fusion integrates images obtained from different sensors and outputs a single image containing all relevant data from the source images. Five different image fusion algorithms, SWT, Fuzzy, Neuro-Fuzzy, Fuzzylet and Neuro-Fuzzylet, are discussed and tested with two datasets (mono-spectral and multi-spectral). The results are compared using fusion quality performance evaluation metrics. It was observed that Neuro-Fuzzy gives better results than Fuzzy and SWT. Fuzzylet and Neuro-Fuzzylet were obtained by combining Fuzzy and Neuro-Fuzzy, respectively, with SWT. Fuzzylet gives better results for mono-spectral images, while Neuro-Fuzzylet gives better results for multi-spectral images at the cost of execution time.

Journal Article
TL;DR: The focus of this study was to establish the existence of women money saving clubs in rural Zimbabwe, explore their operations and avail data which highlight the need for implementing tailor made Mobile Money Transfer (MMT) for all.
Abstract: The focus of this study was to establish the existence of women's money-saving clubs in rural Zimbabwe, explore their operations and provide data highlighting the need for implementing tailor-made Mobile Money Transfer (MMT) for all. The researchers employed focus group discussions and survey questionnaires to extract information from the research participants. Questionnaire participants were women actively participating in money-saving clubs in rural Zimbabwe, while the focus group discussion participants included leaders of these clubs. This study contributes novel information to the body of knowledge which is imperative for MMT operators in Zimbabwe; MMT operators and banks in Zimbabwe should therefore embrace it.

Journal Article
TL;DR: The analysis used both descriptive techniques and the Pearson Correlation model to establish different facts to evaluate challenges associated with cyber crimes in mobile money services in Tanzania.
Abstract: This paper investigates the trend of cyber crimes in Tanzania. The purpose is to evaluate challenges associated with cyber crimes in mobile money services. The study acknowledges the provision of mobile money services by both telecommunication companies and local banks, a fact which poses a threat to the old way of addressing crimes. Data were collected from the Forensic Section of the Tanzania Police Force and from users of the mobile money services. The analysis used both descriptive techniques and the Pearson Correlation model to establish different facts. The conclusion is based on observed evidence and is placed in the last section of the paper.

Journal Article
TL;DR: This paper establishes the correlation between the extent of information sharing and factors such as accessibility, understandability, usability and reliability, and shows how information sharing can be enhanced through e-transparency systems for public service delivery in an open society.

Abstract: This paper determines the extent of information sharing in government institutions through e-transparency tools. First, the basis for the study is set through the background, problem statement and objectives. The discussion then proceeds by focusing on ICT tools for information sharing. An information sharing model is proposed and the extent of information sharing in the public sector of Tanzania through online media is discussed; furthermore, the correlation that exists between the extent of information sharing and factors such as accessibility, understandability, usability and reliability is established. The paper concludes by providing recommendations on information sharing and how it can be enhanced through e-transparency systems for public service delivery in an open society.

Journal Article
TL;DR: An integrated procedure using data envelopment analysis (DEA), ant colony optimization (ACO) for continuous domains and gene expression programming (GEP) is proposed and can be considered a feasible and effective tool for making outstanding investment plans.
Abstract: The portfolio optimization problem is an important issue in the field of investment/financial decision-making and is currently receiving considerable attention from both researchers and practitioners. The problem becomes much more difficult if the number of assets is increased or if additional constraints, such as cardinality constraints, bounding constraints or other real-world requirements, are considered. Therefore, various heuristic approaches have been proposed to deal with the portfolio optimization problem, which is difficult to resolve using the traditional mathematical programming technique. In this study, an integrated procedure using data envelopment analysis (DEA), ant colony optimization (ACO) for continuous domains and gene expression programming (GEP) is proposed. The procedure is evaluated through a case study on investing in stocks in the semiconductor sub-section of the Taiwan stock market. By providing a potential average return of 13.12% on six-month investments from November 1, 2007 to July 8, 2011, the experimental results show that the proposed procedure can be considered a feasible and effective tool for making outstanding investment plans. Moreover, it is a strategy that can help investors make profits even though the overall stock market suffers a loss.

Journal Article
TL;DR: The flower classification system proposed in this paper uses a novel concept of developing a visual vocabulary to simplify the complex task of classifying flower images, and shows efficient performance.
Abstract: In today's world, automatic recognition of flowers using computer technology is of great social benefit. Classification of flowers has various applications such as floriculture, flower searching for patent analysis and much more. The floriculture industry consists of flower trade, nursery and potted plants, seed and bulb production, micro-propagation and extraction of essential oils from flowers. For all of the above, automation of flower classification is an essential step. However, classifying flowers is not an easy task due to difficulties such as deformation of petals, inter- and intra-class variability, illumination variation and many more. The flower classification system proposed in this paper uses a novel concept of developing a visual vocabulary to simplify the complex task of classifying flower images. Separate vocabularies for color, shape and texture features are created and then combined into a final classifier. In this process, an image is first segmented using the GrabCut method. Secondly, features are extracted using appropriate algorithms: SIFT descriptors for shape, the HSV model for color and the MR8 filter bank for texture. Finally, classification is done with the MultiBoost classifier. Results are reported on 17 categories of flower species and show efficient performance.
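A hedged sketch of the visual-vocabulary idea: cluster local descriptors with k-means into "visual words", then represent each image as a word histogram for the classifier. Random vectors stand in for real SIFT/HSV/MR8 descriptors, and the vocabulary size is an arbitrary choice.

```python
# Bag-of-visual-words sketch (descriptors are random stand-ins).
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
descriptors = rng.random((200, 16))      # stand-in for SIFT descriptors

vocab = KMeans(n_clusters=8, n_init=10, random_state=0).fit(descriptors)

def bag_of_words(image_descriptors):
    """Histogram of visual-word assignments for one image."""
    words = vocab.predict(image_descriptors)
    hist = np.bincount(words, minlength=8).astype(float)
    return hist / hist.sum()             # normalise so images compare fairly

print(bag_of_words(rng.random((40, 16))))
```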

Journal Article
TL;DR: In this paper, the authors used a survey to analyze the behavioral pattern of Internet usage of university students and found that most students used Internet as a support tool for their academic and research work.
Abstract: This study uses a survey to analyze the behavioral pattern of Internet usage of university students. The results show that most students used Internet as a support tool for their academic and research work. The students accessed Internet mostly from their personal computers and had multiple years of experience using Internet. Ease of work and time saving were the most cited reasons for Internet use. The findings of the study provide significant implications for the academicians, practitioners, and government policy makers.

Journal Article
TL;DR: The proposed work aims to develop an efficient tool, which detects the inconsistency in the given UML models, and calculates error efficiency and design efficiency from the inconsistency report.
Abstract: The quality of any software developed is evaluated based on its design. Poor process design leads to a high failure rate of software, and design is one of the most important phases in the software life cycle. Various traditional and UML models are widely used to design software, and many tools exist to design UML models as per user requirements. However, these tools do not support validation of UML models, which ultimately leads to design errors. Most testing tools check for consistency of UML models; some check for inconsistency of UML models that do not follow the consistency rules required for UML models. The proposed work aims to develop an efficient tool which detects inconsistency in given UML models. Parsing techniques are applied to extract the XML tags, which contain relevant details such as class names, attribute names, operation names and associations, with their corresponding names in the class diagram, in metamodel format. Applying the consistency rules to the given input UML model, inconsistency is detected and a report is generated. From the inconsistency report, error efficiency and design efficiency are computed.
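A minimal sketch of the parsing step: extracting class, attribute and operation names from an XML export of a UML model. The tag layout below is a simplified assumption; real XMI exports vary by modeling tool.

```python
# Extracting UML model details from XML (simplified tag layout).
import xml.etree.ElementTree as ET

xmi = """<model>
  <class name="Account">
    <attribute name="balance"/>
    <operation name="withdraw"/>
  </class>
</model>"""

root = ET.fromstring(xmi)
for cls in root.iter("class"):
    print("class:", cls.get("name"))
    for attr in cls.iter("attribute"):
        print("  attribute:", attr.get("name"))
    for op in cls.iter("operation"):
        print("  operation:", op.get("name"))
```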

Journal Article
TL;DR: In this paper, the authors present an effective method for detecting spatial domain steganography in digital images.
Abstract: Digitization of images was a revolutionary step for the fields of photography and image processing, as it made the editing of images much more effortless. Image editing was not an issue while it was limited to corrective procedures used to enhance the quality of an image, such as contrast stretching, noise filtering and sharpening. But it became a headache for many fields when image editing turned manipulative. Digital images have become an easy target of tampering and forgery during the last few decades. Today, users and editing specialists, equipped with easily available image editing software, manipulate digital images with varied goals. Photo journalists tamper with photographs to give dramatic effect to their stories; scientists and researchers use such tricks to get their work published; patients' diagnoses are misrepresented by manipulating medical imagery; lawyers and politicians use tampered images to sway the opinion of people or courts in their favor; and terrorists and anti-social groups use manipulated stego images for secret communication. In this paper we present an effective method for detecting spatial domain steganography.
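The abstract does not detail the paper's detector, so as an illustrative stand-in, here is a sketch of one classic spatial-domain steganalysis idea, the chi-square attack on LSB replacement: after random LSB embedding, histogram bins 2k and 2k+1 tend to equalise. The synthetic cover image is a deliberately jagged-histogram assumption.

```python
# Chi-square pair-of-values statistic for LSB steganalysis (illustrative).
import numpy as np

def lsb_chi_square(pixels):
    """Small values mean the (2k, 2k+1) pairs are equalised,
    as random LSB embedding tends to produce."""
    hist = np.bincount(pixels.ravel(), minlength=256)
    even, odd = hist[0::2].astype(float), hist[1::2].astype(float)
    expected = (even + odd) / 2.0
    mask = expected > 5                   # keep well-populated pairs only
    return float(np.sum((even[mask] - expected[mask]) ** 2 / expected[mask]))

rng = np.random.default_rng(1)
# Synthetic "cover" with unequal pair counts (even values only).
cover = (2 * rng.integers(0, 128, (64, 64))).astype(np.uint8)
stego = cover | rng.integers(0, 2, cover.shape, dtype=np.uint8)  # random LSBs
print(lsb_chi_square(cover) > lsb_chi_square(stego))  # True: stego equalised
```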

Journal Article
Sunghae Jun
TL;DR: This paper studies an efficient connection between statistical software and a database management system (DBMS) and carries out a case study using a real application.

Abstract: In the big data era, we need to manipulate and analyze big data. As a first step of big data manipulation, we can consider a traditional database management system. To discover novel knowledge from the big data environment, we must analyze the big data, and many statistical methods have been applied to this end. Most statistical analysis work depends on diverse statistical software such as SAS, SPSS or R, while a considerable portion of big data is stored in diverse database systems. However, the data types of general statistical software differ from those of database systems such as Oracle or MySQL, so many approaches for connecting statistical software to database management systems (DBMS) have been introduced. In this paper, we study an efficient connection between statistical software and a DBMS. To demonstrate its performance, we carry out a case study using a real application.
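The paper connects statistical software such as R to a DBMS; as an analogous illustration in Python, this pulls a table from a database straight into an analysis-ready data frame. The table and column names are invented, and sqlite3 stands in for Oracle or MySQL.

```python
# DBMS-to-statistics-environment round trip (illustrative schema).
import sqlite3
import pandas as pd

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("east", 120.0), ("west", 80.0), ("east", 60.0)])

# One query moves data from the DBMS into the analysis environment.
df = pd.read_sql_query("SELECT region, SUM(amount) AS total "
                       "FROM sales GROUP BY region", conn)
print(df)
conn.close()
```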

Journal Article
TL;DR: The intent of this review paper is to give readers an overview of the basic visual cryptography scheme constructions as well as extended work in the area.
Abstract: Cryptography is the study of transforming information in order to make it secure from unintended recipients or use. A Visual Cryptography Scheme (VCS) is a cryptographic method that encrypts visual information (pictures, printed text, handwritten notes) such that decryption can be performed by the human visual system. The idea is to convert this visual information into an image and encipher the image into n different shares (known as sheets); deciphering only requires selecting some of the n shares. The intent of this review paper is to give readers an overview of the basic visual cryptography scheme constructions as well as extended work in the area. In addition, we review some applications that take advantage of such a secure system.
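A minimal sketch of the classic (2, 2) scheme the basic constructions build on: each secret pixel is expanded into two sub-pixels per share, and stacking the shares (a logical OR of black) reveals the secret. The toy one-dimensional "image" is an illustrative simplification.

```python
# (2, 2) visual cryptography share generation (toy 1-D secret).
import random

def make_shares(secret_bits):
    """secret_bits: list of 0 (white) / 1 (black) pixels."""
    s1, s2 = [], []
    for bit in secret_bits:
        pattern = random.choice([(0, 1), (1, 0)])
        s1.extend(pattern)
        # White pixel: identical patterns (overlay is half black).
        # Black pixel: complementary patterns (overlay is fully black).
        s2.extend(pattern if bit == 0 else (1 - pattern[0], 1 - pattern[1]))
    return s1, s2

secret = [1, 0, 1, 1]
a, b = make_shares(secret)
overlay = [x | y for x, y in zip(a, b)]
print(overlay)  # black pixels -> (1, 1); white pixels -> one 1, one 0
```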

Journal Article
TL;DR: An in-depth comparison of the LTE and WiMAX standards is performed, delving into the intricacies of each.
Abstract: Two up-and-coming technologies are 3GPP LTE (Third Generation Partnership Project Long Term Evolution) and IEEE 802.16 WiMAX (Worldwide Interoperability for Microwave Access). The main aim of both technologies is to provide mobile data transmission, voice communication and video services by promoting low-cost deployment and service models through Internet-friendly architectures and protocols. Both are also being considered as candidates for the fourth generation (4G) of mobile networks. This paper performs an in-depth comparison of the LTE and WiMAX standards and delves into the intricacies of each.

Journal Article
TL;DR: A novel algorithm is provided which stores the details of the clients who have downloaded each file, looks up that table when a new request comes in, and sends the data from one of those clients to the requestor, thus saving precious CPU time of the server by harnessing the computing power of the clients.
Abstract: When an e-Learning system is installed on a server, numerous learners make use of it and download various learning objects or files from the server. Most of the time, requests are for the same files, which results in the server performing the same repetitive task of locating a file and sending it across to the requestor. This paper provides a novel algorithm in which the server stores the details of the various clients who have downloaded each file, looks up that table when a new request comes in, and sends the file from one of those clients to the requestor, thus saving precious CPU time of the server by harnessing the computing power of the clients.
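A minimal sketch of the idea: the server remembers which clients hold each file and redirects new requests to one of them instead of serving the file itself. The data structures, names and first-holder selection policy are illustrative assumptions.

```python
# Peer-assisted download registry (illustrative policy and names).
from collections import defaultdict

class DownloadRegistry:
    def __init__(self):
        self.holders = defaultdict(list)   # file -> clients that have it

    def handle_request(self, filename, client):
        if self.holders[filename]:
            source = self.holders[filename][0]   # could pick by load/proximity
            action = f"redirect {client} to peer {source}"
        else:
            action = f"server sends {filename} to {client}"
        self.holders[filename].append(client)    # client now holds a copy
        return action

reg = DownloadRegistry()
print(reg.handle_request("lecture1.pdf", "client-A"))  # served by server
print(reg.handle_request("lecture1.pdf", "client-B"))  # redirected to client-A
```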

Journal Article
TL;DR: A review of existing routing protocols for WSNs considering energy efficiency and QoS, focusing on the main motivation behind the development of each protocol and explaining the operation of the different protocols in detail.
Abstract: WSNs (Wireless Sensor Networks) are large collections of sensor nodes with limited battery power and limited computational capacity. The power limitation causes nodes to die prematurely, so node power should be used efficiently to prolong the network lifetime. In time-critical applications, the data should reach the destination within a deadline and without any packet loss, which means QoS metrics such as reliability and delay are essential for delivering data to the destination. One of the vital challenges for research in wireless sensor networks is the implementation of routing protocols which achieve both Quality of Service (QoS) and energy efficiency. The main task of a routing protocol is to discover and maintain routes to transmit data over the network. At present, to increase network performance, achieve load balancing and provide fault tolerance, multipath routing techniques are widely used rather than single-path routing. We present a review of the existing routing protocols for WSNs considering energy efficiency and QoS. We focus on the main motivation behind the development of each protocol and explain the operation of the different protocols in detail. We compare the protocols based on energy efficiency and QoS metrics, and conclude the study by giving future research directions.

Journal Article
Abstract: In this paper, a concise outline for improving the throughput and average end-to-end delay of information gathered from the agriculture field for precision agriculture, using a distributed clustering mechanism, is presented. The algorithm offers a throughput of 180 bits/second. Besides delivering water level information packets/signals to the base station, it also computes a threshold and calculates values based on transmission range. This overall computational mechanism helps to build a robust mechanism for delivery of information to the base station, thus reducing packet loss. A wireless sensor network is a system consisting of sensor nodes, each incorporating a radio frequency (RF) transceiver, sensor, microcontroller and power source. Recent advances in wireless sensor networking technology have led to the development of low-cost, low-power, multifunctional sensor nodes. Sensor nodes facilitate environment sensing together with data processing, and are able to network with other sensor systems and exchange data with external users. Sensor networks are used for a variety of applications including wireless data acquisition, environmental monitoring, irrigation management, safety management and many other areas. In this paper, a review of incorporating a distributed clustering algorithm for an agricultural application is elaborated.

Journal Article
TL;DR: An IT service management model is proposed for Zimbabwean universities as a holistic approach integrating Operational Level Agreements (OLAs), Service Level Agreements (SLAs) and IT Service Catalogues (ITSCs).
Abstract: Several IT service management (ITSM) frameworks have been deployed and are being adopted by companies and institutes without redefining the framework into a model which suits their IT department's operating environment and requirements. An IT service management model is proposed for Zimbabwean universities as a holistic approach integrating Operational Level Agreements (OLAs), Service Level Agreements (SLAs) and IT Service Catalogues (ITSCs). The OLA is considered the domain for describing IT service management, and its attainment is driven by organisational management and IT section personnel in alignment with the mission, vision and values of the organisation. Explicitly defining OLAs will aid management in identifying key services and processes in both qualitative and quantitative form (SLAs). After defining SLAs, ITSCs can be formulated, a measure which is both customer and IT service provider centric and acts as the nucleus of the model. Redefining IT service management from this perspective will result in deriving value from IT service management frameworks and in customer satisfaction.

Journal Article
TL;DR: In this framework, a novel palm print representation method, namely orthogonal line ordinal features, is proposed and the palm print registration, feature extraction, palm print verification and palm print recognition modules are designed.
Abstract: Personal identification is one of the most important requirements in all e-commerce and criminal detection applications. In this framework, a novel palm print representation method, namely orthogonal line ordinal features, is proposed. The palm print registration, feature extraction, verification and recognition modules are designed to manage the palm prints, and a palm print database module is designed to store the palm prints and person details. The feature extraction module extracts the ordinal measurements of the palm prints; the verification module verifies a palm print against a personal identification record; and the recognition module finds the person associated with a given palm print image. The proposed palm print recognition scheme uses intensity and brightness to compute the ordinal measures, which are estimated over 4 x 4 regions of the palm print images.
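A hedged sketch of the ordinal-measure idea behind orthogonal line ordinal features: at each location, compare the average intensity along two orthogonal line segments and keep only the sign. The window size and the plain averaging are simplifying assumptions; the original formulation uses orthogonal 2-D Gaussian line filters.

```python
# Ordinal comparison of orthogonal line regions (simplified).
import numpy as np

def ordinal_code(img, y, x, half_len=3):
    horiz = img[y, x - half_len:x + half_len + 1].mean()
    vert = img[y - half_len:y + half_len + 1, x].mean()
    return 1 if horiz > vert else 0    # keep the ordinal relationship only

rng = np.random.default_rng(0)
palm = rng.random((16, 16))            # stand-in for a palm print patch
codes = [[ordinal_code(palm, y, x) for x in range(3, 13)] for y in range(3, 13)]
print(np.array(codes))                 # binary feature map used for matching
```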

Journal Article
TL;DR: A variety of MAC protocols for WSNs are surveyed, with a special focus on traffic classification and priority assignment, and a comparison of different MAC protocols with various parameters and future research directions are included.
Abstract: Wireless Sensor Networks (WSNs) consist of multiple sensor nodes, deployed randomly, which collect periodic data, process the data and forward it to the sink node. The main challenges that WSNs face are severe energy constraints, unpredictable environmental conditions, robustness, responsiveness, self-configuration, etc., among which the main challenge is energy efficiency. To tackle these challenges, new protocols need to be designed in all layers of the communication stack. Designing a MAC protocol is of crucial importance because it influences the transceiver unit of the sensor node. Quality of Service (QoS) at the MAC layer matters as it governs medium sharing and supports reliable communication. In WSNs, nodes generate heterogeneous traffic with different QoS requirements, such as reliability and delay deadlines, and with different priority requirements that vary according to the application. In this work, a variety of MAC protocols for WSNs are surveyed, with a special focus on traffic classification and priority assignment. A comparison of different MAC protocols over various parameters and future research directions are also included.

Journal Article
TL;DR: A multi-criteria analysis is done to select the access network and the proposed system yields better results in terms of Throughput, delay and Packet Loss Ratio (PLR).
Abstract: Seamless service delivery in a heterogeneous wireless network environment demands selection of an optimal access network; selecting a non-promising network results in higher costs and poor service. In heterogeneous networks, network selection schemes are indispensable to ensure Quality of Service (QoS). The factors that impact network selection include throughput, delay, jitter, cost and signal strength. In this paper, a multi-criteria analysis is done to select the access network. The proposed approach involves two schemes. In the first, a dynamic Analytic Hierarchy Process (AHP) is applied to decide the relative weights of the evaluative criteria set based on user preferences and service applications. The second scheme adopts Modified Grey Relational Analysis (MGRA) to rank the network alternatives with a faster and simpler implementation. The proposed system yields better results in terms of throughput, delay and Packet Loss Ratio (PLR).
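A short sketch of the AHP weighting step: deriving criteria weights from a pairwise comparison matrix via its principal eigenvector. The comparison values below (throughput vs delay vs cost) are illustrative; the paper derives them dynamically from user preferences and the service class.

```python
# AHP criteria weights from a pairwise comparison matrix.
import numpy as np

# A[i][j]: how much more important criterion i is than criterion j.
A = np.array([[1.0, 3.0, 5.0],    # throughput
              [1/3, 1.0, 2.0],    # delay
              [1/5, 1/2, 1.0]])   # cost

eigvals, eigvecs = np.linalg.eig(A)
w = np.real(eigvecs[:, np.argmax(np.real(eigvals))])  # principal eigenvector
w = w / w.sum()                                       # normalise to weights
print(dict(zip(["throughput", "delay", "cost"], w.round(3))))
```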

Journal Article
TL;DR: The proposed approach significantly improves the precision and recall of the retrieval system, achieving low retrieval time with high accuracy by using the Particle Swarm Optimization algorithm.
Abstract: Storage and retrieval of images over a large database is an important issue, for which Content Based Image Retrieval systems provide a solution. In Content Based Image Retrieval (CBIR), similar images are retrieved using low-level features such as color, texture and edges that are extracted both from the query image and from the database. In CBIR, low retrieval time with high accuracy is a desired property, which the proposed system achieves by using the Particle Swarm Optimization algorithm. The proposed system consists of the following phases: (i) color feature extraction using the YUV (luminance, blue chrominance, red chrominance) method; (ii) texture feature extraction using the Grey Level Co-occurrence Matrix; (iii) edge feature extraction using the Edge Histogram Descriptor; (iv) measurement of similarity between the query image and database images using Euclidean distance; and (v) optimization of the retrieved results using Particle Swarm Optimization. In comparison with the existing approach, the proposed approach significantly improves the precision and recall of the retrieval system.
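A minimal sketch of the similarity step (iv): rank database images by Euclidean distance between concatenated feature vectors. The feature values are stand-ins; PSO (step v) would then refine the retrieved result set.

```python
# Euclidean-distance ranking of database images (illustrative features).
import numpy as np

query = np.array([0.2, 0.7, 0.1, 0.5])          # concatenated features
database = {"img1": np.array([0.9, 0.1, 0.8, 0.2]),
            "img2": np.array([0.25, 0.65, 0.15, 0.45]),
            "img3": np.array([0.5, 0.5, 0.5, 0.5])}

ranked = sorted(database, key=lambda k: np.linalg.norm(query - database[k]))
print(ranked)   # img2 first: smallest Euclidean distance to the query
```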