
Showing papers in "International journal of engineering and technology in 2013"


Journal Article
TL;DR: The paper seeks an efficient way of storing unstructured data and an appropriate approach to fetching it; the unstructured data this work sets out to organize is the public tweets of Twitter.
Abstract: Nowadays, most of the information saved in companies is in unstructured form. Retrieval and extraction of this information are essential tasks of growing importance in the semantic web area, and many of these requirements depend on unstructured data analysis. More than 80% of all potentially useful business information is unstructured data, in the form of sensor readings, console logs and so on. The volume and complexity of unstructured data open up many new possibilities for the analyst. Text mining and natural language processing are two techniques, with their associated methods, for knowledge discovery from the textual content of documents. This work is an approach to organizing complex unstructured data and retrieving the necessary information from it. The paper aims to find an efficient way of storing unstructured data and an appropriate approach to fetching it; the unstructured data targeted for organization in this work is the public tweets of Twitter. The pragmatic approach of this project is to build a Big Data application that takes the stream of public tweets from Twitter, stores it in HBase on a Hadoop cluster, and then analyzes the data retrieved from HBase through REST calls. Keyword: Unstructured Data, Hadoop, HBase, Data Mining
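As a concrete illustration of storing tweets in an HBase-style layout, the sketch below maps one tweet onto a row key and column-family cells. All names (column families `meta`/`data`, the key format, the field names) are illustrative assumptions, not taken from the paper; the reversed-timestamp key is a common HBase idiom, shown here in pure Python.

```python
# Hypothetical sketch: how a tweet could map onto an HBase-style
# row-key / column-family layout (names are illustrative, not from the paper).
import json

def tweet_to_hbase_row(tweet):
    """Build a row key and column-family cells for one tweet.

    The row key combines the user id with a reversed timestamp so that a
    scan returns each user's newest tweets first (a common HBase idiom).
    """
    reversed_ts = 2**63 - 1 - tweet["timestamp_ms"]
    row_key = f"{tweet['user_id']}#{reversed_ts:019d}"
    cells = {
        "meta:user": tweet["user_name"],
        "meta:lang": tweet.get("lang", "und"),
        "data:text": tweet["text"],
        "data:raw": json.dumps(tweet),  # keep the full unstructured record
    }
    return row_key, cells

row_key, cells = tweet_to_hbase_row({
    "user_id": 42, "user_name": "alice",
    "timestamp_ms": 1385000000000, "text": "hello"})
print(row_key.split("#")[0])  # "42"
```

In a real deployment the `(row_key, cells)` pair would be handed to an HBase client `put`; the point here is only the key design, which makes per-user, newest-first scans cheap.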

98 citations


Journal Article
TL;DR: A wind turbine prototype with pitch angle control based on fuzzy logic is built and demonstrated, showing that the fuzzy logic controller can maximize the average output power.
Abstract: Pitch angle control of wind turbines has been used widely to reduce torque and output power variation in high rated wind speed areas. It is a challenge to maximize the available energy in low rated wind speed areas. In this paper, a wind turbine prototype with a pitch angle control based on fuzzy logic to maximize the output power is built and demonstrated. In varying low rated wind speeds of 4-6 m/s, the fuzzy logic controller raises the average output power to 145 W, compared to 140 W at a fixed pitch angle of the blade. Implementation of fuzzy logic-based pitch angle control in the wind turbine is suitable for low rated wind speed areas.
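The kind of fuzzy pitch controller described above can be sketched in a few lines. The membership functions, rule outputs and wind-speed ranges below are invented for illustration, not taken from the paper's prototype; the structure (fuzzify, fire rules, defuzzify by weighted average) is the generic one.

```python
# Illustrative-only sketch of a fuzzy rule base for pitch angle control.
# Membership functions and rule outputs are invented for the example.

def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def pitch_angle(wind):
    """Weighted-average (Sugeno-style) defuzzification over three rules."""
    rules = [
        (tri(wind, 2.0, 4.0, 5.0), 12.0),  # LOW wind  -> large pitch
        (tri(wind, 4.0, 5.0, 6.0),  6.0),  # MED wind  -> medium pitch
        (tri(wind, 5.0, 6.0, 8.0),  2.0),  # HIGH wind -> small pitch
    ]
    num = sum(mu * out for mu, out in rules)
    den = sum(mu for mu, _ in rules)
    return num / den if den else 0.0

print(round(pitch_angle(4.0), 1))  # only LOW fires fully -> 12.0
```

Between the rule peaks the output blends smoothly, which is the practical advantage of fuzzy control over a fixed pitch angle in varying low-speed wind.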

52 citations



Journal ArticleDOI
TL;DR: The aim of this paper is to provide an up-to-date survey of the ABC algorithm and its applications.
Abstract: In recent years a large number of algorithms based on swarm intelligence have been proposed by various researchers. The Artificial Bee Colony (ABC) algorithm is one of the most popular stochastic, swarm-based algorithms, proposed by Karaboga in 2005 and inspired by the foraging behavior of honey bees. In a short span of time, the ABC algorithm has gained wide popularity among researchers due to its simplicity, ease of implementation and few control parameters. A large number of problems have been solved using the ABC algorithm, such as the travelling salesman problem, clustering, routing and scheduling. The aim of this paper is to provide an up-to-date survey of the ABC algorithm and its applications.
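The foraging metaphor translates into a short program. The following minimal ABC sketch shows the three phases (employed bees, fitness-proportional onlookers, scouts that abandon exhausted sources); the colony size, abandonment limit, bounds and the sphere test function are illustrative choices, not from any particular surveyed paper.

```python
# A compact sketch of the ABC metaheuristic minimizing the sphere function.
# Parameter choices (colony size, limit, iterations) are illustrative.
import random

def abc_minimize(f, dim=2, n_food=10, limit=20, iters=200,
                 lo=-5.0, hi=5.0, seed=1):
    rng = random.Random(seed)
    foods = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_food)]
    trials = [0] * n_food
    fit = [f(x) for x in foods]

    def neighbor(i):
        """Perturb one dimension of source i toward a random partner."""
        k, j = rng.randrange(n_food), rng.randrange(dim)
        cand = foods[i][:]
        cand[j] += rng.uniform(-1, 1) * (foods[i][j] - foods[k][j])
        cand[j] = min(hi, max(lo, cand[j]))
        return cand

    def try_improve(i):
        cand = neighbor(i)
        fc = f(cand)
        if fc < fit[i]:
            foods[i], fit[i], trials[i] = cand, fc, 0
        else:
            trials[i] += 1

    for _ in range(iters):
        for i in range(n_food):            # employed bee phase
            try_improve(i)
        total = sum(1.0 / (1.0 + v) for v in fit)
        for _ in range(n_food):            # onlooker phase (roulette wheel)
            r, acc, i = rng.uniform(0, total), 0.0, 0
            for i, v in enumerate(fit):
                acc += 1.0 / (1.0 + v)
                if acc >= r:
                    break
            try_improve(i)
        for i in range(n_food):            # scout phase
            if trials[i] > limit:
                foods[i] = [rng.uniform(lo, hi) for _ in range(dim)]
                fit[i], trials[i] = f(foods[i]), 0

    best = min(range(n_food), key=lambda i: fit[i])
    return foods[best], fit[best]

sphere = lambda x: sum(v * v for v in x)
x_best, f_best = abc_minimize(sphere)
print("best value:", f_best)
```

The few control parameters visible here (`n_food`, `limit`, `iters`) are exactly the simplicity the abstract credits for the algorithm's popularity.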

40 citations


Journal Article
TL;DR: An efficient, robust, scalable and easy-to-use web authentication system called '2CAuth' that combines ownership and knowledge factors with OTP and challenge-response methods, without requiring any time synchronization between the user's mobile phone and the service provider's server.
Abstract: Password-based schemes have been the standard means of authentication for decades. Enhancements use entities like ownership (something one possesses), knowledge (something one knows), and inherence (something one is) as the first factor and mobile phones as a tokenless second factor, in combinations, to offer different levels of security assurance while trading off usability. In this paper we present '2CAuth', a new two-factor authentication scheme that enhances secure usage of application information and preserves usability without sacrificing the user's privacy. A significant feature of the scheme is that it does not call for any synchronization between the Mobile Network Operator (MNO) and users. The analysis of the scheme clearly brings out its effectiveness in terms of usability, even at times of peak load on mobile networks. The scheme has the dual advantage of verifying both transactions that involve the physical presence of the user and those to be carried out in his absence. Many of the services we use daily, for example banking, have been transformed from traditional customer services into Internet services. As services that contain sensitive data move to the Internet, strong authentication is required to provide a sufficient level of security and privacy. With computing becoming pervasive, people increasingly rely on public computers to do business over the Internet, making it a preferred environment for a multitude of e-services like e-commerce and e-banking; security for these applications is an important enabler. In general, the password-based authentication mechanism provides the basic capability to prevent unauthorized access, and one-time passwords make it more difficult to gain unauthorized access to restricted resources. Many researchers have devoted efforts to implementing various OTP schemes using smartcards, time-synchronized tokens, SMS, etc. Security risks become more pressing as attacks grow more daring.
This makes systems that rely on single-factor authentication more vulnerable, calling for authentication using multiple factors. In this paper, we introduce, implement and analyze an efficient, robust, scalable and easy-to-use web authentication system called '2CAuth'. The proposed method is based on ownership (smart code and mobile phone) and knowledge (security code) factors. It combines these factors with OTP and challenge-response methods without requiring any time synchronization between the user's mobile phone and the service provider's server. The rest of the paper is organized as follows: Section 2 presents a literature review; Section 3 describes the motivation for the work and the contributions of this paper; Section 4 discusses the proposed scheme and its details; Section 5 presents the analysis of the scheme; and finally Section 6 presents the conclusions.
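To make the "challenge-response without time synchronization" idea concrete, here is a generic HMAC-based sketch. This is NOT the paper's 2CAuth protocol, only the standard building block it relies on: server and phone share a provisioned secret, and the phone answers a fresh random challenge, so no clocks need to agree.

```python
# Generic challenge-response sketch (not the 2CAuth protocol itself):
# a shared secret plus a fresh nonce replaces time-synchronized OTPs.
import hmac, hashlib, secrets

SHARED_KEY = b"provisioned-during-registration"  # hypothetical secret

def server_issue_challenge():
    return secrets.token_hex(16)          # fresh nonce per login attempt

def phone_response(challenge, key=SHARED_KEY):
    return hmac.new(key, challenge.encode(), hashlib.sha256).hexdigest()

def server_verify(challenge, response, key=SHARED_KEY):
    expected = hmac.new(key, challenge.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, response)  # constant-time compare

c = server_issue_challenge()
print(server_verify(c, phone_response(c)))        # True
print(server_verify(c, phone_response(c, b"x")))  # False: wrong key
```

Because the challenge is random per attempt, replaying an old response fails, which is the same replay resistance time-synchronized OTPs provide, obtained without synchronization.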

38 citations



Journal Article
TL;DR: In this article, the authors investigate how the air temperature is affected by the urban design and how it can be modified to improve the thermal comfort in the city of Rome, and present design guidance on how to form urban passive cooling systems.
Abstract: The aim of this study is to investigate how the air temperature is affected by urban design and how it can be modified to improve thermal comfort in the city of Rome. The physical and geometrical properties of the buildings and the presence of green areas have a large impact on the urban climate and on the thermal conditions of the people who use open spaces; it is clear how important the role of an urban planner is in reducing thermal stress and designing comfortable outdoor spaces for people. In this study, several numerical simulations using ENVI-met have been performed to evaluate the impact of urban morphology on the microclimate within a city center in summer. Although some very hot conditions were recorded, there were evident examples of more acceptable comfort conditions and cooling potential for some orientations and degrees of urban compactness, due to the clustered form with green cool islands and wind flow through the main canyons. Some design guidance on how to form urban passive cooling systems is presented.

29 citations


Journal ArticleDOI
TL;DR: In this article, various VOD measurement techniques, such as electric, non-electric and fibre optic, are discussed, and a comparison of some commercially available VOD meters is also presented.
Abstract: Velocity of Detonation (VOD) is an important characteristic parameter of an explosive material. The performance of an explosive invariably depends on its velocity of detonation: the power or strength of the explosive to fragment a solid structure determines the efficiency of the blast performed. It is an established fact that measuring the velocity of detonation gives a good indication of the strength, and hence the performance, of the explosive. In this survey, various VOD measurement techniques, such as electric, non-electric and fibre optic, are discussed. To aid the discussion, a comparison of some commercially available VOD meters is also presented. After a review of the commercially available units and a study of their respective merits and demerits, the features of an ideal system are proposed.

22 citations


Journal ArticleDOI
TL;DR: In this article, the compressive strength of heavyweight HPCs plays an important role in enhancing the attenuation of γ-rays and the mass attenuation coefficients were also compared with the values obtained by the United States National Institute of Standards and Technology (NIST).
Abstract: It was found that the compressive strength of heavyweight HPCs plays an important role in enhancing the attenuation of γ-rays; the compressive strength and the attenuation of γ-rays have a near-linear relation. In normal concrete, by contrast, the strength has no effect on the attenuation of γ-radiation. The mass attenuation coefficients were also compared with the values obtained by the United States National Institute of Standards and Technology (NIST), and the comparison showed reasonable agreement. It was observed that the attenuation of γ-rays is considerably affected by concrete density.

22 citations


Journal Article
TL;DR: A new algorithm called RFPID (Regular Frequent Pattern Mining in Incremental Databases) is proposed to mine regular frequent patterns in incremental transactional databases using vertical data format which requires only one database scan.
Abstract: In the real world, databases are updated continuously in several online applications, like supermarkets, network monitoring, web administration, the stock market, etc. Frequent pattern mining is a fundamental and essential area of data mining research. Not only the occurrence frequency of a pattern but also its occurrence behaviour may be treated as an important criterion to measure the interestingness of a pattern: a frequent pattern is said to be regular frequent if its occurrence behaviour is less than or equal to a user-given regularity threshold. In incremental transactional databases, the occurrence frequency and the occurrence behaviour of a pattern change whenever a small set of new transactions is added to the database, and it is undesirable to mine regular frequent patterns from scratch. This paper therefore proposes a new algorithm called RFPID (Regular Frequent Pattern Mining in Incremental Databases) to mine regular frequent patterns in incremental transactional databases using the vertical data format, which requires only one database scan. The experimental results show that our algorithm is efficient in both memory utilization and execution time. Keyword-Frequent patterns, Regular patterns, Transactional database, Incremental database, vertical data format.
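The two criteria involved, support and regularity, are easy to state on a vertical (item to tid-list) layout. The sketch below is a generic illustration of that test, not the RFPID algorithm itself; the tiny database and thresholds are invented, and regularity is taken as the largest gap between consecutive occurrences (boundaries included), a common definition in the regular-pattern literature.

```python
# Sketch of the "regular frequent" test on a vertical (item -> tid-list)
# layout; the tiny database and thresholds are illustrative.

def tid_list(db, item):
    """Vertical representation: ids of transactions containing the item."""
    return [tid for tid, items in enumerate(db, start=1) if item in items]

def regularity(tids, n_trans):
    """Largest gap between consecutive occurrences, boundaries included."""
    if not tids:
        return n_trans
    gaps = [tids[0]]                          # gap before first occurrence
    gaps += [b - a for a, b in zip(tids, tids[1:])]
    gaps.append(n_trans - tids[-1])           # tail gap after last occurrence
    return max(gaps)

def regular_frequent(db, item, min_sup, max_reg):
    tids = tid_list(db, item)
    return len(tids) >= min_sup and regularity(tids, len(db)) <= max_reg

db = [{"a", "b"}, {"a"}, {"b", "c"}, {"a", "b"}, {"a"}, {"b"}]
print(regular_frequent(db, "a", min_sup=3, max_reg=2))  # True: frequent and regular
print(regular_frequent(db, "c", min_sup=2, max_reg=2))  # False: too rare
```

The vertical layout is what makes incremental updates cheap: appending new transactions only extends each tid-list, so support and regularity can be refreshed without rescanning the old database.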

21 citations


Journal ArticleDOI
TL;DR: The applications of SVM have been grouped and summarized in the different areas of the exploration phase, which can be used as a guide to assess the effectiveness of SVM over other data mining algorithms.
Abstract: This paper presents an overview of support vector machines (SVM) as one of the most promising intelligent techniques for data analysis found in the published literature, as theoretical approaches and sophisticated applications developed for various research areas and problem domains. This work is an attempt to provide a survey of the applications of SVM for oil and gas exploration to professionals, researchers and academics involved with the hydrocarbons industry. The applications of SVM have been grouped and summarized in the different areas of the exploration phase, which can be used as a guide to assess the effectiveness of SVM over other data mining algorithms. It also provides a better understanding of the various applications that have been developed for an area that offers a glimpse of innovative applications in other domains of the industry.



Journal ArticleDOI
TL;DR: In this paper, a computer vision technique based on charge coupled device (CCD) images was proposed to automatically detect cracks in concrete structures. The experimental results indicated that optimal accuracies of 90% and 84% could be achieved for the training and testing samples, respectively.
Abstract: Most important civil infrastructure is made of concrete, so accurate information from routine inspection is necessary for structure maintenance. Sometimes temporarily erected scaffolding is needed for infrastructure inspection. In bridge inspection, for example, the inspectors must stand on a platform to examine the underside of the bridge, but such a procedure is risky. At present, several inspection systems coupled with Charge Coupled Device (CCD) cameras have been developed and applied to infrastructure inspection in order to reduce the danger of accidents to human inspectors. This paper proposes a computer vision technique based on CCD images to automatically detect cracks in concrete structures. The experimental results indicate that optimal accuracies of 90% and 84% could be achieved for the training and testing samples, respectively.

Journal Article
TL;DR: This research aims to overcome a shortcoming of the ID3 algorithm by using gain ratio (instead of information gain) as well as by giving weights to each attribute at every decision-making point.
Abstract: The ability to predict the performance of students is very crucial in our present education system, and data mining concepts can be used for this purpose. The ID3 algorithm is one of the best-known algorithms for generating decision trees, but it has the shortcoming that it is inclined towards attributes with many values. This research aims to overcome this shortcoming of the algorithm by using gain ratio (instead of information gain) as well as by giving weights to each attribute at every decision-making point. Several other algorithms, like J48 and the Naive Bayes classification algorithm, are also applied to the dataset; the WEKA tool was used for the analysis of the J48 and Naive Bayes algorithms. The results are compared and presented. The dataset used in our study is taken from the School of Computing Sciences and Engineering (SCSE), VIT University. Keyword- data mining, educational data mining (EDM), decision tree, gain ratio, weighted ID3
I. INTRODUCTION
Educational Data Mining (EDM) is attracting many researchers to developing methods, from educational institutions' data, that can be used to improve the quality of higher education. EDM uses the large amounts of information present in educational institutes' databases about teaching-learning practices to develop models that are beneficial to all participants in the educational process. The prediction of students' performance has become one of the most important needs of any higher institute seeking to improve the quality of its teaching process. Through this process we get to know the needs of the students, and hence we can fulfil those needs to get better results; students who need special attention from the teachers can also be identified. A number of algorithms are available for predicting the performance of students, among them Artificial Neural Networks (ANN), decision trees, clustering and the Naive Bayes algorithm, with decision trees being the most commonly used.
II. RELATED WORK
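The quantity this work substitutes into ID3 can be shown with a worked example. The toy dataset below is invented: attribute 0 is an "id"-like attribute with a distinct value per row (exactly the kind plain information gain unduly favours), while attribute 1 is binary; gain ratio penalises the former by dividing by the split information.

```python
# Worked sketch of gain ratio (information gain / split information),
# the ID3 modification discussed above; the toy dataset is invented.
from math import log2
from collections import Counter

def entropy(labels):
    n = len(labels)
    return -sum(c / n * log2(c / n) for c in Counter(labels).values())

def gain_ratio(rows, attr_idx, labels):
    """Information gain divided by split information for one attribute."""
    n = len(rows)
    groups = {}
    for row, y in zip(rows, labels):
        groups.setdefault(row[attr_idx], []).append(y)
    cond = sum(len(g) / n * entropy(g) for g in groups.values())
    gain = entropy(labels) - cond
    split_info = entropy([row[attr_idx] for row in rows])
    return gain / split_info if split_info else 0.0

rows = [("r1", "hi"), ("r2", "hi"), ("r3", "lo"), ("r4", "lo")]
labels = ["pass", "pass", "fail", "fail"]
print(gain_ratio(rows, 0, labels))  # 0.5  (gain 1.0 / split_info 2.0)
print(gain_ratio(rows, 1, labels))  # 1.0  (gain 1.0 / split_info 1.0)
```

Both attributes separate the classes perfectly (gain 1.0), but the many-valued attribute is scored lower, which is precisely the bias correction the abstract describes.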

Journal ArticleDOI
TL;DR: In this paper, the use of fiber reinforced polymers (FRP) to strengthen slab-column connections subjected to punching shear is investigated, presenting the results of an experimental investigation of four half-scale two-way slab-column interior connections constructed and tested under a centric vertical load.
Abstract: This study aims to determine the efficiency of using Fiber Reinforced Polymer (FRP) systems to strengthen slab-column connections subjected to punching shear. The strengthening systems used consisted of external FRP stirrups made from glass and carbon fibers, installed around the column; external steel links were also used as a conventional strengthening method for comparison. Over the last few years, the use of FRP for strengthening concrete structures has been investigated by many researchers, much of the work concerning the strengthening of reinforced concrete slabs, beams and columns. FRP is used to strengthen concrete slabs in flexure by bonding it to the tension face of the slabs; its use for strengthening flat slabs against punching shear can be considered a new application. This research presents the results obtained from an experimental investigation of four half-scale two-way slab-column interior connections, which were constructed and tested under punching shear caused by a centric vertical load. The research included one unstrengthened specimen, which served as the control, one specimen strengthened with steel links, one strengthened with external stirrups made from Glass Fiber Reinforced Polymer (GFRP), and one strengthened with external stirrups made from Carbon Fiber Reinforced Polymer (CFRP); the type of strengthening material is thus the basic parameter of this study. The experimental results showed a noticeable increase in punching shear resistance and flexural stiffness for the strengthened specimens compared to the control specimen, and the strengthened slabs also showed a relative ductility enhancement. Finally, equations for predicting the punching shear strength of slab-column connections strengthened using the different materials (steel, GFRP and CFRP) were applied and compared with the experimental results.

Journal ArticleDOI
TL;DR: In this paper, the authors established correlations between the moisture content and density of a soil and its electrical resistivity, showing that the electrical resistivity decreased in a curvilinear manner with increasing moisture content.
Abstract: Natural soils are an intimate mixture of solid, liquid and gas phases. This study establishes correlations between the moisture content and density of a soil and its electrical resistivity. In the past, most conventional geotechnical site investigations required bulky and heavy equipment to determine the geotechnical parameters necessary for design and construction purposes. Consequently, the time and cost of a project are increased, especially when dealing with difficult sites such as mountainous terrain. This study is based on laboratory soil-box resistivity meter observations made on soils mixed with consistent increments of 1-5% of water added to 1500 grams of remolded soil in loose condition. At least 24 repetitive resistivity test observations were made, and the moisture content and soil density were determined concurrently for each test. The observations showed that the electrical resistivity decreased in a curvilinear manner with increasing moisture content. The regression equation and coefficient of determination R² for moisture content against soil electrical resistivity were w = 152.87ρ^(-0.312) (ρ = soil electrical resistivity) with R² = 0.7718, while those for bulk density versus soil electrical resistivity were ρ_bulk = -0.107 ln(ρ) + 1.7249 with R² = 0.7016. Hence, a viable method is demonstrated in which the electrical resistivity value is applicable and has great potential for the prediction of geotechnical parameters such as moisture content and soil density.
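The two fitted curves from the abstract can be evaluated directly (exponent restored as w = 152.87 ρ^(-0.312)). The resistivity values below are arbitrary sample inputs; the unit assumptions (ohm-m for ρ, % for w, Mg/m³ for density) are ours, as the abstract does not state them.

```python
# Evaluating the paper's two regression curves; sample inputs and unit
# labels are assumptions made for illustration.
from math import log

def moisture_content(rho):
    """Moisture content (%) from soil electrical resistivity: 152.87*rho^-0.312."""
    return 152.87 * rho ** -0.312

def bulk_density(rho):
    """Bulk density from soil electrical resistivity: -0.107*ln(rho) + 1.7249."""
    return -0.107 * log(rho) + 1.7249

for rho in (50.0, 200.0, 1000.0):
    print(f"rho={rho:7.1f}  w={moisture_content(rho):5.1f}%  "
          f"density={bulk_density(rho):.3f}")
```

Both curves decrease with resistivity, matching the observed curvilinear trend: wetter, denser soil conducts better, so low resistivity predicts high moisture and high density.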

Journal ArticleDOI
TL;DR: In this paper, the sound absorption coefficient of perforated facings backed by porous materials is studied under high sound intensities in the absence of mean flow, and theoretical considerations are based on the equivalent fluid following the Johnson-Champoux-Allard approach and the use of the transfer matrix method.
Abstract: The sound absorption coefficient of perforated facings backed by porous materials is studied under high sound intensities in the absence of mean flow. The theoretical considerations are based on the equivalent fluid following the Johnson-Champoux-Allard approach and the use of the transfer matrix method. To take into account the high sound levels effects, the air flow resistivity of each layer is modified following the Forchheimer law. Two specimens of perforated plate are built and tested when backed by a polymeric foam and a fibrous material. A specific impedance tube setup is developed for the measurement of the surface acoustic impedance for sound pressure levels ranging from 90 dB to 150 dB at the surface of the perforated facing. To corroborate the validity of the presented method, two considerations are particularly depicted in the experimental results: first, the case where the perforated facing and the porous material are both directly backed by a rigid wall and the case where there is an air cavity between the porous material and the rigid wall. Good agreement is observed between the simulation and the experimental results.

Journal ArticleDOI
TL;DR: In this paper, the authors analyzed 1381 projects that have been certified in LEED-Existing Building versions 2008 and 2009 using data mining techniques to discover hidden inter-relationships and the effects on high-scoring sustainable design strategies.
Abstract: LEED (Leadership in Energy and Environmental Design) is a credit-based green building rating system. Considering that a better understanding of the relationships between credits would help managers better achieve green building certification, this study analyzed 1381 projects that have been certified in LEED-Existing Building versions 2008 and 2009. The credits achieved by those projects were analyzed using data mining techniques to discover hidden inter-relationships and the effects on high-scoring sustainable design strategies. The data mining results were compared with the credit pairs provided by LEED AP consultants from the engineering perspective. Additional hidden credit pairs were also discovered.
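A minimal version of the credit-pair analysis described above is a support/confidence computation over project credit sets. The study's actual data mining techniques are not specified in the abstract, so the sketch below is a generic co-occurrence approach, and the credit codes and project records are made up for illustration.

```python
# Illustrative support/confidence computation over credit sets; the
# project records and credit codes below are invented, not LEED data.
from itertools import combinations
from collections import Counter

projects = [
    {"EA1", "WE2", "MR3"},
    {"EA1", "WE2"},
    {"EA1", "MR3"},
    {"WE2", "MR3"},
    {"EA1", "WE2", "SS4"},
]

pair_counts = Counter()
for creds in projects:
    for pair in combinations(sorted(creds), 2):
        pair_counts[pair] += 1

n = len(projects)
for (a, b), cnt in pair_counts.most_common(2):
    support = cnt / n
    confidence = cnt / sum(1 for p in projects if a in p)  # P(b achieved | a achieved)
    print(f"{a} & {b}: support={support:.2f}, confidence={confidence:.2f}")
```

Pairs with high support and confidence are candidate "hidden inter-relationships": achieving one credit makes the other likely, which is the kind of pairing a certification manager can plan around.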

Journal ArticleDOI
TL;DR: The aim of this project is to predict the service range of the RAP for downlink transmission using ROF technology; the RF front-end components are modelled using S-parameter data measured at the factory.
Abstract: ROF (radio over fiber) technology has been proposed as a promising, cost-effective solution. In this network, a central station (CS) is connected to numerous RAPs using an optical fiber to an indoor connection. The aim of this project is to predict the service range of the RAP for downlink transmission. The indoor RF front end consists of a photodiode, a bandpass filter (BPF), a power amplifier (PA) and an antenna, operating in the 2.4 GHz band. The BPF is needed to remove frequency components arising from the nonlinear effects of the fiber and to pass the signal at the operating frequency. The RF front-end components are modelled using S-parameter data measured at the factory. Indoor picocells use power lower than 1 W (30 dBm) and achieve a service range of more than 100 meters.

Journal ArticleDOI
TL;DR: An approach to a watermarking scheme to protect the copyright of digital images with the aid of a combined Lifting Wavelet Transform (LWT) and Discrete Cosine Transform (DCT), focusing on invisible watermark embedding, the imperceptibility of the watermarked image and performance evaluation metrics.
Abstract: As a potential solution to defend digital multimedia objects against unauthorized replication, digital watermarking technology is now attracting significant attention. With the aid of a combined Lifting Wavelet Transform (LWT) and Discrete Cosine Transform (DCT), an approach to a watermarking scheme to protect the copyright of digital images is presented in this paper. The lifting wavelet transform is applied to decompose the original image into four sub-band images. The discrete cosine transform is then computed on the selected sub-band of the LWT coefficients, and the watermark is embedded in the DCT transform of the selected LWT sub-band of the cover image. The proposed system focuses on invisible watermark embedding, the imperceptibility of the watermarked image and performance evaluation metrics. The presented algorithm is realized in MATLAB.

Journal ArticleDOI
TL;DR: In this paper, a self-reported survey was used to ask the participants where and how they like to access Facebook, the people whom they would not want to see their Facebook profile, and the number of Facebook friends they have.
Abstract: This study, undertaken as part of a wider study of Facebook usage in Saudi Arabia, uses a self-report survey and includes a thorough analysis of some aspects of Facebook usage by Saudi university students. The participants were 372 students (188 male and 184 female) at one university in Saudi Arabia. The survey asked the participants where and how they like to access Facebook, the people whom they would not want to see their Facebook profile, and the number of Facebook friends they have. In addition, this study measures the differences between male and female university students on these variables. The study has revealed several significant results that contribute to the current knowledge of social network sites.

Journal ArticleDOI
TL;DR: The genetic algorithm is more efficient than the simulation model based on the FIFS rule in terms of reducing the number of errors in the simulation.
Abstract: based on the genetic algorithm is more efficient than the simulation model based on the FIFS rule.

Journal ArticleDOI
TL;DR: In this article, the design of post-tensioned flat slabs is carried out using the load balancing and equivalent frame methods, which yield an economical and safe design.
Abstract: The use of the post-tensioning method is nowadays increasing widely due to its range of application. By using the post-tensioning method one can produce the most economical and safe design, although more precautions have to be taken over the shear and deflection criteria for the slabs. The design of a post-tensioned flat slab can be carried out using the load balancing and equivalent frame methods. To apply the design procedure, an office building is considered as a case study. The plan of the office building (G+4) is considered, and the building is designed for four cases with different floor systems. The quantities of reinforcing steel, prestressing steel and concrete required for the slabs, beams and columns are calculated for each case and presented in tabular form. The total cost of the building per square meter is also found, and the four cases are compared with respect to cost.
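The load balancing method mentioned above has a compact core formula for a parabolic tendon profile: the uniform upward load balanced by a prestress force P with drape e over span L is w_bal = 8Pe/L². This formula is standard prestressed-concrete theory rather than something stated in the abstract, and the span, drape and force values below are invented round numbers.

```python
# Load-balancing check for a parabolic tendon: w_bal = 8*P*e / L**2.
# The formula is standard PT theory; the input values are invented.

def balanced_load(P_kN, e_m, L_m):
    """Uniform upward load (kN/m) balanced by prestress force P (kN)
    with tendon drape e (m) over span L (m), parabolic profile."""
    return 8.0 * P_kN * e_m / L_m ** 2

P = 1200.0   # effective prestress force, kN
e = 0.15     # tendon drape, m
L = 9.0      # span, m
print(f"balanced load = {balanced_load(P, e, L):.2f} kN/m")  # 8*1200*0.15/81 = 17.78
```

In design, P and e are chosen so that w_bal offsets a target fraction of the sustained load, leaving the slab nearly deflection-free under that load; the remaining unbalanced load is then carried to the equivalent frame analysis.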

Journal ArticleDOI
TL;DR: The research objective of this study is to connect BIM software with a human behavior simulation engine using an object-oriented programming language, and to show that implementing occupant simulation helps improve building design.
Abstract: Human behavior needs to be considered when designing buildings and infrastructure. In recent decades, building information modeling (BIM) has been increasingly adopted as a computer-aided design methodology for architectural design, engineering design simulation and evaluation, and 4D constructability analysis. BIM models can be used to conduct engineering analyses, but human behavior simulation using BIM models is still lacking. The research objective of this study is to connect BIM software with a human behavior simulation engine using an object-oriented programming language. A behavioral modeling engine is also developed in this study based on the agent-based modeling (ABM) approach. Finally, this paper presents a demonstrative example with four scenarios, and the result shows that implementing occupant simulation helps improve building design.

Journal ArticleDOI
TL;DR: In this article, a damped interconnection-based mitigation solution based on the incorporation of pressurized fluid-viscous dissipaters across the inadequate separation gaps, is presented, and the benefits provided by the retrofit intervention, and some of its technical installation details, are finally offered.
Abstract: A viscoelastic model for the numerical time-history analysis of the dynamic impact problem is proposed and implemented in the finite element model of the buildings. The results of the assessment enquiries carried out under current conditions, and a damped interconnection-based mitigation solution based on the incorporation of pressurized fluid-viscous dissipaters across the inadequate separation gaps, are presented. Evaluations of the benefits provided by the retrofit intervention, and some of its technical installation details, are finally offered.

Journal Article
TL;DR: The coverage area of an LTE-A cellular network is analyzed by taking into account first-tier interference and frequency reuse planning, and the mathematical model is verified using the ATDI simulator for LTE radio planning, which deals with real digital cartography and contains standard formats for propagation loss.
Abstract: In this paper we analyze the coverage area of an LTE-A cellular network, taking into account first-tier interference and frequency reuse planning. We use numerical calculations and simulation results to measure the received signal strength at the users for downlink and uplink performance. The results show a degradation in received signal strength from -34 dBm at the cell center to -91 dBm at the cell boundaries, with spectral efficiency falling from 4.3 to 0.5 bps/Hz at the cell edge. We verify the mathematical model using the ATDI simulator for LTE radio planning, which deals with real digital cartography and contains standard formats for propagation loss. Keyword- LTE-A, RSSI, Coverage, UE

Journal ArticleDOI
TL;DR: The potential tradeoffs between usability and security in the software development process are discussed by proposing a guideline, and case studies are carried out as a qualitative approach.
Abstract: Usability and security have become core issues in the design of modern computer software. Studies have been conducted on various combinations of these subjects; however, there is still room to improve how these features are deployed together in software applications. In this paper we discuss the potential tradeoffs between usability and security in the software development process by proposing a guideline. Case studies are carefully carried out as a qualitative approach.

Journal ArticleDOI
TL;DR: In this article, the authors identify the gaps in scope among different green building assessment standards and discuss the future trends of green buildings for better design and certification planning, focusing on the LEED standard in US, BREEAM in UK, BEAM in Hong Kong, Green Mark in Singapore, and Green Star in Australia.
Abstract: The number of green buildings has steadily increased in recent years, during which various green building assessment standards have evolved to complement the development of green buildings. This study identifies the gaps in scope among different green building assessment standards and discusses the future trends of green buildings for better design and certification planning. This study focuses on the LEED standard in the US, BREEAM in the UK, BEAM in Hong Kong, Green Mark in Singapore, and Green Star in Australia. After a brief overview of the selected standards, this research analyzes the shift in scope of the different standards since their establishment, and compares the differences and trends among them. Construction has been accused of causing a variety of environmental problems, ranging from excessive consumption of global resources, both in construction and in building operation, to pollution of the surrounding environment (1). Research on green building design and materials is already well established, and different organizations and research groups have contributed to the development of separate green building assessment standards to evaluate the environmental friendliness of building facilities. This study aims at comparing the scope of prominent and developing green building assessment standards to analyze any gaps and to identify future trends. The comparison will help planners make informed decisions during the design and certification stages of a green building project. Considering the diversity in climate, geography, government policies and building stocks, the following five assessment standards were selected and compared: (1) LEED in the United States, (2) BREEAM in the United Kingdom, (3) BEAM in Hong Kong, (4) Green Mark in Singapore, and (5) Green Star in Australia.

Journal Article
TL;DR: A simulation study of an MPEG-4 video encoding scheme based on an experimental model was carried out to determine conformance with IEEE 802.15.4 requirements, and shows an optimal selection of parameter values that enhances video transmission over WSN.
Abstract: Nowadays, video streaming applications are widely used in wired and wireless environments. Extending such applications to Wireless Sensor Network (WSN) deployments, featuring low data rate transmission, low energy consumption, ease of deployment and low cost, has attracted a lot of attention in the research community. However, video transmission over such a network is challenging because of the large amount of bandwidth required. To address this problem, video compression is of utmost importance to decrease the bandwidth required over the WSN. The MPEG-4 video codec is one of the compression schemes identified as suitable for the WSN environment. In this paper, a simulation study of an MPEG-4 video encoding scheme based on an experimental model was carried out to determine conformance with IEEE 802.15.4 requirements. The results obtained in this paper can be used as a benchmark for configuring the video encoding scheme for WSN applications. There are three parameters of concern in this experiment: the quantization scale, the group of pictures (GOP) and the frame rate (fps). The results of the simulation study show an optimal selection of parameter values that enhances video transmission over the WSN.