
Showing papers in "International Journal of Innovative Research in Computer and Communication Engineering in 2015"


Journal Article
TL;DR: In this article, the proposed work plan is to eliminate the concerns regarding data privacy by using cryptographic algorithms to enhance security in the cloud from the different perspectives of cloud customers. Cloud Computing is a set of IT services, for example network, software, storage, hardware and other resources, which are provided to a customer over a network.
Abstract: Cloud Computing is a set of IT services, for example network, software, storage, hardware and other resources, which are provided to a customer over a network. The IT services of Cloud Computing are delivered by a third-party provider who owns the infrastructure. The benefits of cloud storage are easy access (access to your data anywhere, anytime), scalability, resilience, cost efficiency and high reliability of the data. Because of these benefits, nearly every organization is moving its data to the cloud, meaning it uses the storage service provided by the cloud provider. There is therefore a need to protect that data against unauthorized access, modification, denial of service and so on. Securing the cloud means securing both the processing (calculations) and the storage (databases hosted by the cloud provider). In this research paper, the proposed work plan is to eliminate the concerns regarding data privacy by using cryptographic algorithms to enhance security in the cloud from the different perspectives of cloud customers.
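The abstract does not name the specific cryptographic algorithms used; as a hedged illustration only, the following sketch (assuming Python and the `cryptography` package, with AES-GCM) shows the general idea of encrypting data on the client side before it is handed to a cloud storage provider, so the provider only ever sees ciphertext.

```python
# Minimal sketch (not the paper's exact scheme): encrypt data client-side with
# AES-GCM before uploading, so the cloud provider only stores ciphertext.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def encrypt_for_upload(plaintext: bytes, key: bytes) -> bytes:
    nonce = os.urandom(12)                      # 96-bit nonce recommended for GCM
    ciphertext = AESGCM(key).encrypt(nonce, plaintext, None)
    return nonce + ciphertext                   # store nonce alongside ciphertext

def decrypt_after_download(blob: bytes, key: bytes) -> bytes:
    nonce, ciphertext = blob[:12], blob[12:]
    return AESGCM(key).decrypt(nonce, ciphertext, None)

key = AESGCM.generate_key(bit_length=256)       # kept by the data owner, never the provider
blob = encrypt_for_upload(b"confidential customer record", key)
assert decrypt_after_download(blob, key) == b"confidential customer record"
```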

44 citations


Journal ArticleDOI
TL;DR: An approach for motion detection that utilizes an analysis-based radial basis function network as its principal element is proposed; results indicate it to be highly effective in variable bit-rate video streams over real-world bandwidth-limited networks.
Abstract: Automated motion detection technology is an integral element of intelligent transportation systems, and is especially essential for the management of traffic and the maintenance of traffic surveillance systems. Traffic surveillance systems using video communication over real-world networks with limited bandwidth often encounter difficulties due to network congestion and/or unstable bandwidth. This is especially problematic in wireless video communication. This has necessitated the development of a rate control scheme that alters the bit-rate to match the available network bandwidth, thereby producing variable bit-rate video streams. However, complete and accurate detection of moving objects in variable bit-rate video streams is a very difficult task. In this paper, we propose an approach for motion detection that utilizes an analysis-based radial basis function network as its principal element. This approach is applicable not only to high bit-rate video streams, but to low bit-rate video streams as well. The proposed approach consists of a varied background generation stage and a moving object detection stage. During the varied background generation stage, the lower-dimensional eigen-patterns and the adaptive background model are established in variable bit-rate video streams by using the proposed approach in order to accommodate the properties of variable bit-rate video streams. During the moving object detection stage, moving objects are extracted via the proposed approach in both low bit-rate and high bit-rate video streams; detection results are then generated through the output value of the proposed approach. The detection results produced through our approach indicate it to be highly effective in variable bit-rate video streams over real-world bandwidth-limited networks. Additionally, the proposed method can easily be applied in real-time applications. Quantitative and qualitative evaluations demonstrate that it offers advantages over other state-of-the-art methods. For example, accuracy rates produced via the proposed approach were up to 86.38% and 89.88% higher than those produced via the other compared methods.
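As a rough illustration of moving-object detection on a video stream (a generic background-subtraction baseline, not the paper's RBF-network method), the following Python/OpenCV sketch uses the standard MOG2 subtractor; the input file name is a placeholder.

```python
# Illustrative baseline only: OpenCV's MOG2 background subtractor applied to a
# (hypothetical) traffic video, not the paper's radial-basis-function approach.
import cv2

cap = cv2.VideoCapture("traffic.mp4")           # assumed input file
subtractor = cv2.createBackgroundSubtractorMOG2(history=500, varThreshold=16)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    mask = subtractor.apply(frame)              # foreground (moving object) mask
    mask = cv2.medianBlur(mask, 5)              # suppress compression noise
    cv2.imshow("moving objects", mask)
    if cv2.waitKey(1) == 27:                    # Esc to quit
        break
cap.release()
cv2.destroyAllWindows()
```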

41 citations


Journal ArticleDOI
TL;DR: The proposed work plan is to eliminate the concerns regarding data privacy using cryptographic algorithms to enhance the security in cloud as per different perspective of cloud customers.
Abstract: Cloud Computing is a set of IT Services, for example network, software system, storage, hardware, software, and resources and these services are provided to a customer over a network. The IT services of Cloud Computing are delivered by third party provider who owns the infrastructure. Benefits of cloud storage are easy access means access to your knowledge anyplace, anyhow, anytime, scalability, resilience, cost efficiency, and high reliability of the data. Because of these benefits each and every organization is moving its data to the cloud, means it uses the storage service provided by the cloud provider. So there is a need to protect that data against unauthorized access, modification or denial of services etc. To secure the Cloud means secure the treatments (calculations) and storage (databases hosted by the Cloud provider). In this research paper, the proposed work plan is to eliminate the concerns regarding data privacy using cryptographic algorithms to enhance the security in cloud as per different perspective of cloud customers.

38 citations


Journal ArticleDOI
TL;DR: This paper provides a survey of routing protocols for VANET, covering application areas, challenges and security issues prevailing in VANETs.
Abstract: A Vehicular Ad-Hoc Network or VANET is a sub-form of Mobile Ad-Hoc Network or MANET that provides communication between vehicles and between vehicles and road-side base stations with the aim of providing efficient and safe transportation. A vehicle in a VANET is considered to be an intelligent mobile node capable of communicating with its neighbours and other vehicles in the network. VANET introduces more challenging aspects compared to MANET because of the high mobility of nodes and fast topology changes. Various routing protocols have been designed and presented by researchers after considering the major challenges involved in VANETs. This paper provides a survey of routing protocols for VANET. It covers application areas, challenges and security issues prevailing in VANETs.

28 citations


Journal ArticleDOI
TL;DR: An Eye Blink Monitoring System (EBM) is presented that alerts the subject during a state of drowsiness: when an abnormal eye blinking rate is detected, an alarm is initiated to wake the subject.
Abstract: Fatal road accidents can be avoided by understanding the psychological state of drivers. The majority of road accidents occur during night driving due to the drowsiness of vehicle drivers (subjects). This paper presents an Eye Blink Monitoring System (EBM) that alerts the subject during a state of drowsiness. An embedded system that assesses the psychological state of the subject by monitoring eye movements and head movements is useful for warning drivers during the initial sleep-cycle phase of drowsiness. The physiological sleep state of the subject is determined by monitoring the subject's eye-blink rate using an IR sensor and head movement using an accelerometer. A normal eye blink rate has no effect on the output of the system. However, if the subject is in an extreme state of the sleep cycle, the IR sensor detects an abnormal eye blinking rate and an alarm is initiated to wake the subject. Internet of Things (IoT) enabled sensors are used to transmit the collected data over a smart grid network so that a quick response team can take action under emergency conditions.
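A minimal sketch of the alerting logic described above, assuming hypothetical hardware hooks `read_ir_blink()` and `trigger_alarm()` and illustrative thresholds (the paper's actual thresholds and sensor interface are not given here):

```python
# Hedged sketch of the alerting logic only: read_ir_blink() and trigger_alarm()
# are hypothetical hardware hooks; the blink-rate threshold is illustrative.
import time

BLINKS_PER_MINUTE_MIN = 10      # below this rate the driver is assumed drowsy
WINDOW_SECONDS = 60

def monitor(read_ir_blink, trigger_alarm):
    blinks, window_start = 0, time.time()
    while True:
        if read_ir_blink():                     # True when the IR sensor detects a blink
            blinks += 1
        if time.time() - window_start >= WINDOW_SECONDS:
            if blinks < BLINKS_PER_MINUTE_MIN:  # abnormal blink rate -> drowsiness
                trigger_alarm()
            blinks, window_start = 0, time.time()
        time.sleep(0.01)
```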

28 citations


Journal ArticleDOI
TL;DR: The main aim is to design an easy but efficient algorithm that would be useful for the maximum number of currencies, because all currencies have different security features, making it a tough job to design one algorithm that could be used for recognition of all available currencies.
Abstract: There are more than 200 different currencies used in countries around the world. The technology of currency recognition aims to search for and extract the visible as well as hidden marks on paper currency for efficient classification. A currency recognition and conversion system is implemented to reduce human effort by automatically recognizing the monetary value of a currency note and converting it into other currencies without human supervision. The software interface that we propose here could be used for various currencies (we use four in our project). Many times, currency notes are blurry or damaged, and many of them have complex designs to enhance security. This makes the task of currency recognition very difficult, so it becomes very important to select the right features and a proper algorithm for this purpose. The basic requirements for an algorithm to be considered practically implementable are simplicity, low complexity, high speed and efficiency. Our main aim is to design an easy but efficient algorithm that would be useful for the maximum number of currencies, because all currencies have different security features, making it a tough job to design one algorithm that could be used for recognition of all available currencies; writing different programs for each is also tedious. The aim of the project is to recognize currencies, not to authenticate them.
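As a hedged sketch of one possible recognition pipeline (not the paper's exact algorithm), the following Python/OpenCV code matches ORB features of a query note against a few reference templates; the template labels and file names are placeholders.

```python
# Illustrative sketch: match ORB features of a query note against reference
# templates and return the best-matching label. Files and labels are placeholders.
import cv2

TEMPLATES = {"INR_100": "inr_100.jpg", "USD_10": "usd_10.jpg"}   # hypothetical references

def recognise(query_path: str) -> str:
    orb = cv2.ORB_create(nfeatures=1000)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    query = cv2.imread(query_path, cv2.IMREAD_GRAYSCALE)
    _, q_desc = orb.detectAndCompute(query, None)
    best_label, best_score = None, -1
    for label, path in TEMPLATES.items():
        ref = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
        _, r_desc = orb.detectAndCompute(ref, None)
        matches = matcher.match(q_desc, r_desc)
        score = sum(1 for m in matches if m.distance < 40)   # count strong matches
        if score > best_score:
            best_label, best_score = label, score
    return best_label

print(recognise("unknown_note.jpg"))
```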

24 citations


Journal ArticleDOI
TL;DR: This issue brief is intended to help policymakers and administrators understand how analytics and data mining have been—and can be—applied for educational improvement.
Abstract: Educational data mining and learning analytics are used to research and build models in several areas that can influence learning systems. Higher education institutions are beginning to use analytics for improving the services they provide and for increasing student grades and retention. With analytics and data mining experiments in education starting to proliferate, sorting out fact from fiction and identifying research possibilities and practical applications are not easy. This issue brief is intended to help policymakers and administrators understand how analytics and data mining have been—and can be—applied for educational improvement. At present, educational data mining tends to focus on developing new tools for discovering patterns in data; these patterns are generally about the micro-concepts involved in learning, whereas learning analytics tends to apply known tools and techniques at the larger scale of courses and institutions.

22 citations


Journal ArticleDOI
TL;DR: This study proposes a model for the spiral development process with the use of a simulator (Simphony.NET), which helps the project manager determine how to increase the productivity of a software firm with the use of minimum resources (expert team members).
Abstract: Software engineering provides methodologies, concepts and practices which are used for analysing, designing, building and maintaining information in the software industry. A Software Development Life Cycle (SDLC) model is an approach used in the software industry for the development of projects of various sizes: small-scale, medium-scale and large-scale projects. A software project of any size is developed with the co-ordination of a development team, so it is important for the project manager to assign resources intelligently to the different phases of the software project. This study proposes a model for the spiral development process with the use of a simulator (Simphony.NET), which helps the project manager determine how to increase the productivity of a software firm with the use of minimum resources (expert team members). This model increases the utilization of the different development processes by keeping all development team members busy, which helps decrease idle and wasted time. As future work, many other SDLC models, such as the incremental and prototype models, will be simulated in order to find which model is best for a software firm.

21 citations


Journal ArticleDOI
TL;DR: Various fusion techniques used in multimodal biometrics are discussed; multimodal biometrics can be achieved through a fusion of two or more images, where the resultant fused image is more secure.
Abstract: Biometrics is the science and technology of measuring and analyzing biological data of the human body, extracting a feature set from the acquired data and comparing this set against the template set in the database. Biometric systems based on a single source of information are called unimodal biometric systems. The performance of a unimodal system is affected by noisy sensor data and non-universality [1]. Problems arising in unimodal systems can be resolved using multimodal biometrics. Multimodal biometrics can be achieved through a fusion of two or more images, where the resultant fused image is more secure. This paper discusses various fusion techniques that are used in multimodal biometrics.
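One commonly used fusion strategy is score-level fusion. The following minimal Python sketch (with assumed score ranges, weights and threshold, not values from the paper) combines min-max normalised match scores from two modalities with a weighted sum.

```python
# Minimal sketch of score-level fusion: normalise each modality's match score
# and combine with fixed weights. Ranges, weights and threshold are assumptions.
def min_max(score, lo, hi):
    return (score - lo) / (hi - lo)

def fuse(face_score, iris_score, threshold=0.5):
    # assumed score ranges and weights; in practice these come from training data
    face_n = min_max(face_score, lo=0.0, hi=100.0)
    iris_n = min_max(iris_score, lo=0.0, hi=1.0)
    fused = 0.4 * face_n + 0.6 * iris_n
    return "accept" if fused >= threshold else "reject"

print(fuse(face_score=82.0, iris_score=0.71))   # -> accept
```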

20 citations


Journal ArticleDOI
TL;DR: A Naive Bayes (NB) approach is presented to predict graduating cumulative Grade Point Average based on applicant data collected from surveys conducted during the summer semester of the academic year 2010-2011 at the Faculty of Economics, University of Tuzla, and on data taken during enrolment.
Abstract: In recent years data mining has been successfully implemented in the business world. Evaluating students' academic success is becoming increasingly challenging, and data mining is intended for the identification and extraction of new and potentially valuable knowledge from data. Predicting educational outcomes is a practical alternative in such a heterogeneous environment. Performance prediction models can be built by applying data mining techniques to enrolment data. In this paper we present a Naive Bayes (NB) approach to predict graduating cumulative Grade Point Average based on applicant data collected from surveys conducted among first-year students during the summer semester of the academic year 2010-2011 at the Faculty of Economics, University of Tuzla, and on data taken during enrolment. The Naive Bayes algorithm is used to discover the most suitable way to predict student success.
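A minimal, hedged sketch of the idea using scikit-learn's Gaussian Naive Bayes on made-up toy enrolment attributes (the paper's real survey features and labels are not reproduced here):

```python
# Hedged sketch with toy data: predict a GPA band from enrolment-style
# attributes using Gaussian Naive Bayes (scikit-learn).
import numpy as np
from sklearn.naive_bayes import GaussianNB

# columns: entrance score, high-school average, weekly study hours (toy values)
X = np.array([[72, 4.2, 10], [55, 3.1, 4], [90, 4.8, 15],
              [60, 3.4, 6],  [85, 4.5, 12], [48, 2.9, 3]])
y = np.array(["high", "low", "high", "low", "high", "low"])   # GPA band labels

model = GaussianNB().fit(X, y)
print(model.predict([[70, 4.0, 8]]))    # predicted GPA band for a new applicant
```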

19 citations


Journal ArticleDOI
TL;DR: A robot is designed that can be controlled using an application running on an Android phone. The application sends control commands via Bluetooth and has features such as controlling the speed of the motors, and sensing and sharing with the phone information about the direction and distance of the robot from the nearest obstacle.
Abstract: Nowadays Android smartphones are the most popular gadgets. There are multiple applications on the internet that exploit the inbuilt hardware in these mobile phones, such as Bluetooth, Wi-Fi and ZigBee technology, to control other devices. With the development of modern technology and Android smartphones, Bluetooth technology aims to exchange data wirelessly over a short distance using radio wave transmission, offering ease of use, perception and controllability. In this paper we have designed a robot that can be controlled using an application running on an Android phone. The application sends control commands via Bluetooth and has features such as controlling the speed of the motors, and sensing and sharing with the phone information about the direction and distance of the robot from the nearest obstacle.
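A hedged sketch of the controlling side only, assuming a serial Bluetooth link (for example an HC-05 module exposed as `/dev/rfcomm0`) and a single-character command protocol; both the port name and the command set are assumptions, not details from the paper.

```python
# Sketch of the controller side: send single-character drive commands over a
# serial Bluetooth link and read back an obstacle distance. All names assumed.
import serial   # pyserial

COMMANDS = {"forward": b"F", "back": b"B", "left": b"L", "right": b"R", "stop": b"S"}

with serial.Serial("/dev/rfcomm0", baudrate=9600, timeout=1) as link:
    link.write(COMMANDS["forward"])               # drive forward
    distance = link.readline().decode().strip()   # robot reports obstacle distance (cm)
    if distance and float(distance) < 20.0:       # closer than 20 cm -> stop
        link.write(COMMANDS["stop"])
```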

Journal ArticleDOI
TL;DR: Different image scaling techniques are reviewed, with the intent that the review be useful to researchers and practitioners interested in image scaling.
Abstract: The growing interest in image scaling is mainly due to the availability of digital imaging devices such as digital cameras, digital camcorders, 3G mobile handsets and high-definition monitors. Scaling a digital image is a demanding and very important area of research. Image scaling is an important image processing operation applied in diverse areas of computer graphics. Image scaling can be especially useful when one needs to reduce image file size for email and web documents, or increase image size for printing, GIS observation, medical diagnostics, etc. With the recent advances in imaging technology, digital images have become an important component of media distribution. In addition, a variety of displays can be used for image viewing, ranging from high-resolution computer monitors to TV screens and low-resolution mobile devices. This paper focuses on different image scaling techniques, with the intent that the review be useful to researchers and practitioners interested in image scaling.
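As a small illustration of one scaling technique covered by such surveys, the following Python/Pillow sketch applies bilinear interpolation for both enlarging and shrinking; the file names are placeholders.

```python
# Minimal sketch of one scaling technique (bilinear interpolation) with Pillow.
from PIL import Image

img = Image.open("input.png")                                       # placeholder file
w, h = img.size
upscaled   = img.resize((2 * w, 2 * h), resample=Image.BILINEAR)    # enlarge for print
downscaled = img.resize((w // 2, h // 2), resample=Image.BILINEAR)  # shrink for web/email
upscaled.save("input_2x.png")
downscaled.save("input_half.png")
```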

Journal ArticleDOI
TL;DR: The analysis results show that the Autonomous System surpasses existing approaches and schemes in failure recovery; it increases the throughput and mean success rate of the overall system, and meets the bandwidth demands of varied applications within the network.
Abstract: Multi-hop wireless mesh networks typically experience frequent link failures owing to causes such as channel interference, dynamic obstacles occurring in the network, and the bandwidth demands of applications. This paper presents a new Autonomous System for wireless mesh networks (WMNs). The main objective of the Autonomous System for WMNs is to reduce the manual configuration of the network involved in the maintenance of a WMN. The Autonomous System for WMNs is evaluated extensively through ns-2-based simulation. The analysis results show that the Autonomous System surpasses existing approaches and schemes in failure recovery. The technique helps improve channel efficiency by over 93%. It increases the throughput and mean success rate of the overall system, and meets the bandwidth demands of varied applications within the network.

Journal Article
TL;DR: The aim of this paper is to study different path loss propagation models in radio communication at different frequency bands, such as the SUI model, Hata model, Okumura model, COST-231 model, ECC-33 model and W-I model.
Abstract: Radio propagation models focus on estimating the path loss, with the supplementary task of predicting the coverage area of a radio transmitter. Radio propagation models are empirical in nature and are developed from large collections of data for specific scenarios. The aim of this paper is to study different path loss propagation models in radio communication at different frequency bands, such as the SUI model, Hata model, Okumura model, COST-231 model, ECC-33 model and W-I model.
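As an example of one of the surveyed models, the widely cited Okumura-Hata urban formula (with the small/medium-city mobile antenna correction) can be written directly as a small Python function; the sample parameter values are illustrative only.

```python
# Sketch of one of the surveyed models: the Okumura-Hata urban path-loss
# formula for a small/medium city (f in MHz, antenna heights in metres, d in km).
import math

def hata_urban_path_loss(f_mhz, h_base_m, h_mobile_m, d_km):
    a_hm = (1.1 * math.log10(f_mhz) - 0.7) * h_mobile_m \
           - (1.56 * math.log10(f_mhz) - 0.8)            # mobile antenna correction
    return (69.55 + 26.16 * math.log10(f_mhz)
            - 13.82 * math.log10(h_base_m) - a_hm
            + (44.9 - 6.55 * math.log10(h_base_m)) * math.log10(d_km))

print(round(hata_urban_path_loss(900, 50, 1.5, 5), 1), "dB")   # illustrative values
```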

Journal ArticleDOI
TL;DR: A thought and a way to eliminate buttons and joysticks and replace them with a more intuitive technique, namely controlling the complete robotic arm by the operator's hand gestures, is presented.
Abstract: In today's world, in most sectors the work is done by robots or robotic arms having different numbers of degrees of freedom (DOFs) as per the requirement. The idea is to change the perception of remote controls for actuating a manually operated robotic arm. This paper presents a thought and a way to eliminate buttons and joysticks and replace them with a more intuitive technique, namely controlling the complete robotic arm by the operator's hand gestures. The proposed electronic system recognizes a particular hand gesture performed in front of a webcam and transmits the corresponding signals wirelessly through an RF module. Depending on the received signals, the robotic arm, driven by an AVR microcontroller, performs the corresponding motions at the receiver section.

Journal ArticleDOI
TL;DR: A brief review of Cloud Computing is presented, which includes a comparative study between Cloud Computing and Distributed, Utility, Cluster and Grid Computing, as well as the deployment models and service models.
Abstract: These days, Cloud Computing is probably the most significant technology in the IT sector. It is broadly used to deliver services over the internet for both economic and technical reasons. Cloud Computing supports virtualization techniques over the internet to meet the elastic demands of users with less interaction with the service provider. For that reason users are increasingly moving toward this trend. In this paper we present a brief review of Cloud Computing; it includes a comparative study between Cloud Computing and Distributed, Utility, Cluster and Grid Computing, as well as the deployment models and service models. The paper also lays out a brief discussion of the main characteristics of cloud computing, its benefits, challenges and applications.

Journal ArticleDOI
TL;DR: A model for early stage cancerous cell detection is proposed and the phases involved are reviewed and summarized, namely, Image Pre-processing, Image Segmentation, Feature Extraction and Classification.
Abstract: Histopathology refers to the examination of biopsy samples by a pathologist using a microscope for analysing and classifying diseases. In order to study the manifestations of a disease, the analysis of histopathological images is done manually by a pathologist, and therefore the diagnosis is subjective and greatly dependent on the level of expertise of the professional. In order to overcome this problem of possible erroneous diagnosis and to enable early-stage detection, an automated computerized image processing system is needed for quantitative diagnosis of biopsy samples. In this paper we have proposed a model for early stage cancerous cell detection and have reviewed and summarized the phases involved, namely, Image Pre-processing, Image Segmentation, Feature Extraction and Classification.
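A hedged, toy-scale sketch of those four phases in Python with OpenCV and scikit-learn (file names, features and labels are placeholders, not the paper's actual pipeline or model):

```python
# Hedged end-to-end sketch of the four phases on single images:
# pre-processing, segmentation, feature extraction and classification.
import cv2
import numpy as np
from sklearn.svm import SVC

def extract_features(path):
    img = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    img = cv2.GaussianBlur(img, (5, 5), 0)                            # pre-processing
    _, mask = cv2.threshold(img, 0, 255,
                            cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)  # segmentation
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    areas = [cv2.contourArea(c) for c in contours] or [0.0]
    # feature vector: cell count, mean area, largest area (illustrative only)
    return [len(contours), float(np.mean(areas)), float(max(areas))]

# classification on toy labelled samples (paths and labels are placeholders)
X = [extract_features(p) for p in ["benign1.png", "benign2.png", "malig1.png"]]
y = ["benign", "benign", "malignant"]
clf = SVC(kernel="rbf").fit(X, y)
print(clf.predict([extract_features("new_biopsy.png")]))
```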

Journal ArticleDOI
TL;DR: This paper tries to elucidate the basic concepts of steganography, its various types and techniques, and dual steganography.
Abstract: In the last few years communication technology has improved, which increases the need for secure data communication. For this, many researchers have devoted much of their time and effort to finding suitable ways of hiding data. Steganography is a technique used for hiding important information imperceptibly; it is the art of hiding information in such a way that the detection of hidden messages is prevented. The process of using steganography in conjunction with cryptography is called dual steganography. This paper tries to elucidate the basic concepts of steganography, its various types and techniques, and dual steganography. Some of the research work done in the steganography field in the past few years is also presented.
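As an illustration of one basic spatial-domain technique (least-significant-bit embedding), not necessarily one treated in depth by the paper, the following Python sketch hides and recovers a short message; the image file names are placeholders.

```python
# Minimal sketch of LSB embedding with NumPy and Pillow; cover.png is a
# placeholder cover image and the message is illustrative.
import numpy as np
from PIL import Image

def embed(cover_path, message, out_path):
    pixels = np.array(Image.open(cover_path).convert("L"), dtype=np.uint8)
    bits = [int(b) for byte in message.encode() for b in format(byte, "08b")]
    flat = pixels.flatten()
    flat[:len(bits)] = (flat[:len(bits)] & 0xFE) | np.array(bits, dtype=np.uint8)
    Image.fromarray(flat.reshape(pixels.shape)).save(out_path)

def extract(stego_path, n_chars):
    flat = np.array(Image.open(stego_path).convert("L"), dtype=np.uint8).flatten()
    bits = flat[:n_chars * 8] & 1
    data = bytes(int("".join(map(str, bits[i:i + 8])), 2) for i in range(0, len(bits), 8))
    return data.decode()

embed("cover.png", "meet at dawn", "stego.png")
print(extract("stego.png", len("meet at dawn")))
```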

Journal ArticleDOI
TL;DR: Various image processing techniques are reviewed, particularly for brain tumor detection in magnetic resonance imaging, along with the neural network techniques used to improve the performance of detecting and classifying brain tumors in MRI images.
Abstract: Medical image processing is a fast-growing and challenging field nowadays. Medical imaging techniques are used for medical diagnosis. A brain tumor is a serious, life-threatening disease. Detecting a brain tumor using image processing techniques involves four stages, namely image pre-processing, image segmentation, feature extraction and classification. Image processing and neural network techniques are used to improve the performance of detecting and classifying brain tumors in MRI images. In this survey various image processing techniques are reviewed, particularly for brain tumor detection in magnetic resonance imaging. More than twenty-five research papers on image processing techniques are reviewed.

Journal Article
TL;DR: This paper proposes several increasingly strict levels of policy consistency constraints and presents different enforcement approaches to guarantee the trustworthiness of transactions executing on cloud servers.
Abstract: In distributed transactional database systems deployed over cloud servers, entities cooperate to form proofs of authorization that are justified by collections of certified credentials. These proofs and credentials may be evaluated and collected over extended time periods, with the risk that the underlying authorization policies or the user credentials are in inconsistent states. It therefore becomes possible for policy-based authorization systems to make unsafe decisions that might threaten sensitive resources. In this paper, we highlight the criticality of the problem. We then define the notion of trusted transactions when dealing with proofs of authorization. Consequently, we propose several increasingly strict levels of policy consistency constraints and present different enforcement approaches to guarantee the trustworthiness of transactions executing on cloud servers. We propose a Two-Phase Validation Commit protocol as a solution, which is a modified version of the basic Two-Phase Commit protocol. We finally analyse the different approaches presented, using both an analytical evaluation of the overheads and simulations, to guide decision makers as to which approach to use.
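A toy coordinator sketch in the spirit of a Two-Phase Validation Commit, where participants must both vote to commit and re-validate their authorization proofs before the global commit; all participant methods here are hypothetical stubs, not the paper's protocol details.

```python
# Toy sketch: classic two-phase commit extended with a per-participant policy
# re-validation step before the global decision. All methods are hypothetical.
def two_phase_validation_commit(participants, transaction):
    # Phase 1: prepare + authorization validation
    votes = []
    for p in participants:
        prepared = p.prepare(transaction)                   # classic 2PC vote
        proof_ok = p.validate_authorization(transaction)    # proof still consistent?
        votes.append(prepared and proof_ok)

    # Phase 2: commit only if every participant is prepared and still authorized
    decision = all(votes)
    for p in participants:
        p.commit(transaction) if decision else p.abort(transaction)
    return decision
```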

Journal ArticleDOI
TL;DR: This paper reveals temporal variations in videos that are imperceptible to the naked eye. The method takes a standard video sequence as input and applies spatial decomposition followed by temporal filtering to the frames, which is called Eulerian Video Magnification.
Abstract: This paper reviews computational techniques to efficiently represent, analyse and visualise both short-term and long-term temporal variation in video and image sequences. The paper reveals temporal variations in videos that are imperceptible to the naked eye. The method takes a standard video sequence as input and applies spatial decomposition followed by temporal filtering to the frames, which is called "Eulerian Video Magnification". The resulting signal is then amplified to reveal the hidden information. The paper covers four techniques: 1. the linear approximation method, 2. phase-based video processing, 3. the Riesz pyramid for fast phase-based video processing, and 4. enhanced Eulerian video magnification. Using the above methods, one is able to amplify and visualize small motions and temporal colour changes.
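A greatly simplified sketch in the spirit of the linear approximation method: a crude temporal band-pass signal (the difference of two running averages) is amplified and added back to each frame. The file name and constants are assumptions, and real implementations use a spatial pyramid and proper band-pass filters.

```python
# Simplified Eulerian-style magnification: amplify a crude temporal band-pass
# component of each pixel. Input file and coefficients are assumptions.
import cv2
import numpy as np

ALPHA, FAST, SLOW = 20.0, 0.4, 0.05     # amplification and IIR coefficients

cap = cv2.VideoCapture("input.mp4")
ok, frame = cap.read()
fast = slow = frame.astype(np.float32)
while ok:
    f = frame.astype(np.float32)
    fast = fast + FAST * (f - fast)     # quick-moving running average
    slow = slow + SLOW * (f - slow)     # slow-moving running average
    band = fast - slow                  # temporal band-pass component
    out = np.clip(f + ALPHA * band, 0, 255).astype(np.uint8)
    cv2.imshow("magnified", out)
    if cv2.waitKey(1) == 27:
        break
    ok, frame = cap.read()
cap.release()
cv2.destroyAllWindows()
```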

Journal ArticleDOI
TL;DR: A hybrid intrusion detection method that uses a combination of supervised and outlier-based methods is proposed for improving the efficiency of detection of new and old attacks.
Abstract: With the growth of networked computers and associated applications, intrusion detection has become essential to keeping networks secure. A number of intrusion detection methods have been developed for protecting computers and networks using conventional statistical methods as well as data mining methods. It is necessary that the capabilities of intrusion detection methods be updated with the creation of new attacks. This paper proposes a hybrid intrusion detection method that uses a combination of supervised and outlier-based methods for improving the efficiency of detection of new and old attacks. The method is evaluated on the benchmark Knowledge Discovery and Data Mining Cup 1999 intrusion dataset and on the newer version of KDD (NSL-KDD), and its performance is found to be very good.
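A hedged sketch of the hybrid idea with scikit-learn, combining a supervised classifier for known attack patterns with an unsupervised outlier detector for novel ones; the data here is random stand-in data, not KDD/NSL-KDD.

```python
# Hedged sketch: supervised model for known attacks plus an outlier detector
# trained on normal traffic to flag possible new attacks. Toy data only.
import numpy as np
from sklearn.ensemble import RandomForestClassifier, IsolationForest

rng = np.random.default_rng(0)
X_train = rng.normal(size=(200, 10))                  # stand-in for KDD-style features
y_train = rng.integers(0, 2, size=200)                # 0 = normal, 1 = known attack

clf = RandomForestClassifier(n_estimators=100).fit(X_train, y_train)
outlier = IsolationForest(contamination=0.05).fit(X_train[y_train == 0])

def detect(x):
    if clf.predict([x])[0] == 1:
        return "known attack"
    if outlier.predict([x])[0] == -1:                 # -1 marks an outlier
        return "possible new attack"
    return "normal"

print(detect(rng.normal(size=10)))
```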

Journal ArticleDOI
TL;DR: Various load testing tools that are useful for testing the performance of a system under heavy loads are discussed, along with the parameters, such as response time, memory utilization and hits per second, used to compare them.
Abstract: Software testing is an essential part of delivering quality-assured and reliable software in the SDLC model. Nowadays, most software applications are web applications that run in a web browser. As there is exponential growth of web applications, it is important to test these applications to ensure that they can perform well under heavy loads. Load testing, as a part of software testing, is used for monitoring the performance of web applications. It is used to define the maximum amount of work a system can handle without performance degradation. It designs and simulates user traffic which can be used to test your application infrastructure for performance, reliability and scalability. In this paper the main focus is on discussing various load testing tools that are useful for testing the performance of a system under heavy loads. To choose the best tool for analyzing system performance, various parameters such as response time, memory utilization, hits per second, etc. are used.
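A minimal sketch of the kind of measurement such tools report (hits per second, mean response time), using plain Python threads and the `requests` library; the target URL and load levels are assumptions.

```python
# Tiny load-generation sketch: spawn concurrent users, record per-request
# latency, then report hits/second and mean response time. URL/load assumed.
import time
import threading
import requests

URL, USERS, REQUESTS_PER_USER = "http://localhost:8080/", 20, 25
latencies, lock = [], threading.Lock()

def user():
    for _ in range(REQUESTS_PER_USER):
        start = time.time()
        requests.get(URL, timeout=10)
        with lock:
            latencies.append(time.time() - start)

threads = [threading.Thread(target=user) for _ in range(USERS)]
t0 = time.time()
for t in threads: t.start()
for t in threads: t.join()
elapsed = time.time() - t0

print(f"hits/second: {len(latencies) / elapsed:.1f}")
print(f"mean response time: {1000 * sum(latencies) / len(latencies):.1f} ms")
```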

Journal ArticleDOI
TL;DR: This paper gives an overview of various feature extraction techniques for budding researchers and provides a comprehensive review of feature extraction approaches to improve classification accuracy.
Abstract: Data Mining (DM) techniques are able to process high volumes of data. Data mining applications contain datasets with high dimensionality. Due to this high dimensionality, the performance of machine learning algorithms degrades, and this problem is resolved using a technique called Dimensionality Reduction (DR). DR is an essential preprocessing technique in DM to reduce high dimensionality. Feature extraction is one of the important techniques in DR, used to extract the most important features. The goal of this survey is to provide a comprehensive review of various feature extraction approaches to improve classification accuracy. This paper gives an overview of various feature extraction techniques that will be useful to budding researchers.
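A minimal sketch of one feature-extraction technique (PCA) feeding a classifier, using scikit-learn's built-in digits dataset purely for illustration of how reduced features enter classification.

```python
# Minimal sketch: PCA feature extraction (64 -> 16 dimensions) followed by a
# classifier, evaluated on a held-out split of the built-in digits dataset.
from sklearn.datasets import load_digits
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline

X, y = load_digits(return_X_y=True)                  # 64-dimensional inputs
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

model = make_pipeline(PCA(n_components=16),          # extract 16 features
                      LogisticRegression(max_iter=2000))
model.fit(X_tr, y_tr)
print("accuracy with 16 extracted features:", round(model.score(X_te, y_te), 3))
```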

Journal ArticleDOI
TL;DR: The RFID system uses an RFID tag and an RFID reader, which collect information about vehicles passing through the toll plaza and automatically debit the toll amount from the prepaid account of the vehicle owner, which in turn reduces traffic congestion and human errors.
Abstract: This paper is based on RFID technology. The RFID system uses an RFID tag and an RFID reader, which collect information about vehicles passing through the toll plaza and automatically debit the toll amount from the prepaid account of the vehicle owner, which in turn reduces traffic congestion and human errors. The vehicle owner has to register his vehicle with the provided RFID tag, creating a rechargeable account. When the vehicle passes through the toll gate, the toll amount is automatically deducted from its account. The system produced is a microcontroller-based system programmed in embedded C, and the hardware is interfaced with Java-based code. The software used includes NetBeans and the JDK for the hardware interface, MySQL for the database, and mikroC for programming the microcontroller. The basic advantages of the system are that travelling time is decreased, the network is congestion free, there are fewer emissions in the toll area and no infrastructure cost is required. This gives a win-win situation for both toll authorities and toll customers.

Journal Article
TL;DR: A joint treatment of load balancing and pricing is considered, and it is found that there exists an optimal price that maximizes the revenue in a multi-user environment.
Abstract: Managing resources and smartly pricing them on computing systems is a challenging task. Resource sharing demands careful load balancing and often strives to achieve a win-win situation between resource providers and users. Toward this goal, we consider a joint treatment of load balancing and pricing. We do not assume static pricing to determine load balancing, or the other way around. Instead, we study the relationship between the price that a computing node charges and the load and revenue that it receives. We find that there exists an optimal price that maximizes the revenue. We then consider a multi-user environment and explore how the load from a user is balanced on processors with existing loads. Finally, we derive an optimal price that maximizes the revenue in the multi-user environment. We evaluate the performance of the proposed algorithms through simulations.

Journal ArticleDOI
TL;DR: The main objective of this paper is to study these two models and to reach a conclusion on the impact of risk assessment and its success factors on complex software development projects.
Abstract: Software companies face a lot of difficulties in choosing a suitable development model when projects have to confront many risk factors ranging from low to high. When software development becomes complex in nature due to risk, the usual practice of companies is to switch over to the only traditional risk-driven model – the Spiral. The Spiral model is considered to be the best conventional model for risk analysis. But in the modern development style, all development revolves around a single trend – Agile. The main objective of this paper is to study these two models and to reach a conclusion on the impact of risk assessment and its success factors on complex software development projects.

Journal ArticleDOI
TL;DR: A rectangular microstrip patch antenna has been investigated and its performance analyzed with the aid of Ansoft HFSS version 13; a stacked patch and a double U-slot on the patch of the microstrip antenna are used to reduce return loss.
Abstract: An antenna is one of the essential parts of microwave communication, since it helps in both transmitting and receiving information. An antenna is a device that is made to efficiently radiate and receive electromagnetic waves. Microstrip antennas have attractive features such as a low profile, small size and weight, low cost, the ability to be printed directly on a circuit board, and ease of analysis and fabrication. For reduction of return loss, a stacked patch and a double U-slot on the patch of the microstrip antenna are used. A rectangular microstrip patch antenna has been investigated and its performance has been analyzed with the aid of Ansoft HFSS version 13. A microstrip patch antenna, a stacked patch microstrip antenna and a double U-slot stacked patch microstrip antenna have been designed for the S-band to reduce the return loss of the antenna.
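For context, the standard transmission-line design equations for a plain rectangular patch (not the stacked or double U-slot geometry studied in the paper) can be coded directly; the substrate values below are illustrative, not the paper's.

```python
# Sketch of the classical transmission-line design equations for a rectangular
# patch: width, effective permittivity, fringing extension and length.
import math

C = 3e8   # speed of light, m/s

def patch_dimensions(f0_hz, eps_r, h_m):
    w = C / (2 * f0_hz) * math.sqrt(2 / (eps_r + 1))                    # patch width
    eps_eff = (eps_r + 1) / 2 + (eps_r - 1) / 2 * (1 + 12 * h_m / w) ** -0.5
    dl = 0.412 * h_m * ((eps_eff + 0.3) * (w / h_m + 0.264)
                        / ((eps_eff - 0.258) * (w / h_m + 0.8)))        # fringing extension
    l = C / (2 * f0_hz * math.sqrt(eps_eff)) - 2 * dl                   # patch length
    return w, l

w, l = patch_dimensions(2.4e9, eps_r=4.4, h_m=1.6e-3)   # illustrative S-band, FR4-like substrate
print(f"W = {w*1000:.1f} mm, L = {l*1000:.1f} mm")
```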

Journal ArticleDOI
TL;DR: The design and modelling of a GSM-based energy recharge system for prepaid metering is presented. It will enable the user to recharge his/her electricity account from home, and many add-ons such as energy demand prediction and real-time dynamic tariffs as a function of demand and supply can be implemented.
Abstract: This paper presents the design and modelling of a GSM-based energy recharge system for prepaid metering. The present system of energy billing in India is error prone and also time- and labour-consuming. Errors get introduced at every stage of energy billing, such as errors with electro-mechanical meters, human errors and processing errors. The aim of the project is to minimize these errors by introducing a new system of prepaid energy metering using GSM. The GSM module provides a mode of communication between the user and the provider. This will enable the user to recharge his/her electricity account from home. Many add-ons can easily be implemented, such as energy demand prediction and real-time dynamic tariffs as a function of demand and supply.

Journal ArticleDOI
TL;DR: This study provides an overall survey of OM related to product reviews and of the classification algorithms used for sentiment classification; OM is concerned with identifying the polarity of sentiment expressed in data.
Abstract: Opinions and reviews on products and services are expressed on the Web through blogs and feedback forms, and it is essential to develop methods to automatically classify and gauge them to identify the underlying sentiment about the product. Analyzing the polarity of sentiment expressed in data is Opinion Mining (OM): a system that identifies and classifies opinion/sentiment as represented in electronic text. Economic and marketing research depends heavily on accurate methods to predict the sentiments of opinions extracted from the internet and to predict online customers' preferences. OM has many steps, and techniques for each step. This study provides an overall survey of OM related to product reviews and of the classification algorithms used for sentiment classification.
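A minimal, hedged sketch of sentiment classification of product reviews with TF-IDF features and a linear classifier in scikit-learn; the tiny labelled set is purely illustrative and stands in for a real review corpus.

```python
# Minimal sketch: classify review polarity with TF-IDF features and logistic
# regression. The four labelled reviews are toy examples only.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

reviews = ["great battery life, totally worth it",
           "stopped working after a week, very disappointed",
           "excellent build quality and fast delivery",
           "terrible customer service, would not buy again"]
labels = ["positive", "negative", "positive", "negative"]

model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(reviews, labels)
print(model.predict(["the screen is great but delivery was terrible"]))
```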