
Showing papers in "International Journal of Engineering Research and Technology" in 2012


Journal Article
TL;DR: The design of a high-performance MIPS cryptography processor based on the triple data encryption standard is described, with the pipeline stages organized so that the pipeline can be clocked at a high frequency, along with the small adjustments and minor improvements made to the MIPS pipelined architecture design.
Abstract: The paper describes the design of a high-performance MIPS cryptography processor based on the triple data encryption standard. The pipeline stages are organized so that the pipeline can be clocked at a high frequency. The encryption and decryption blocks of the triple data encryption standard (T-DES) cryptosystem and the dependencies among them are explained in detail with the help of a block diagram. In order to increase the processor's functionality and performance, especially for security applications, we include three new 32-bit instructions: LKLW, LKUW and CRYPT. The design has been synthesized at 40 nm process technology targeting a Xilinx Virtex-6 device. The overall MIPS crypto processor works at 209 MHz. Keywords: ALU, register file, pipeline, memory, T-DES, throughput. 1. INTRODUCTION: In today's digital world, cryptography is the art and science that deals with the principles and methods for keeping messages secure. Encryption is emerging as an integral part of all communication networks and information processing systems that involve the transmission of data. Encryption is the transformation of plain data (known as plaintext) into unintelligible data (known as ciphertext) through an algorithm referred to as a cipher. The MIPS architecture is employed in a wide range of applications. The architecture remains the same for all MIPS-based processors, while the implementations may differ [1]. The proposed design features a 32-bit asymmetric and symmetric cryptography system as a security application. A 16-bit RSA cryptography MIPS cryptosystem has been designed previously [2]. Small adjustments and minor improvements are made to the MIPS pipelined architecture to protect data transmission over an insecure medium using authenticating schemes such as the data encryption standard (DES), Triple-DES and the advanced encryption standard (AES) [3]. These cryptographic schemes use an identical key on the receiver side and the sender side. Our design mainly integrates the symmetric cryptosystem into the MIPS pipeline stages, which is suitable for encrypting large amounts of data at high speed. MIPS is commonly expanded as millions of instructions per second, and the MIPS processor is one of the best RISC (Reduced Instruction Set Computer) processors ever designed. The high-speed MIPS processor uses a pipelined architecture to speed up processing and to increase the frequency and performance of the processor. A MIPS-based RISC processor was described in [4]. It consists of the five basic pipeline stages shown in Fig. 1: instruction fetch, instruction decode, instruction execution, memory access and write back. These five pipeline stages introduce a processing delay of five clock cycles and several hazards during operation [2]. These pipeline hazards are eliminated by inserting NOP (no operation performed) instructions, which introduce the delays needed for the correct execution of instructions [4]. Pipeline hazards are of three types: data, structural and control hazards. These hazards are handled in the MIPS processor by implementing a forwarding unit, a pre-fetching or hazard detection unit, and a branch and jump prediction unit [2]. The forwarding unit prevents data hazards by detecting dependencies and forwarding the required data from the running instruction to the dependent instructions [5]. Stalls occur in the pipelined architecture when consecutive instructions use the same operand, which requires more clock cycles for execution and reduces performance.
To overcome this situation, an instruction pre-fetching unit is used, which reduces stalls and improves performance. Control hazards occur when a branch prediction is mistaken or, in general, when the system has no mechanism for handling them [5]. Control hazards are handled by two mechanisms: a flush mechanism and a delayed jump mechanism. The branch and jump prediction unit uses these two mechanisms to prevent control hazards. The flush mechanism runs the instructions after a branch and flushes the pipe on a misprediction [5]. Frequent flushing may increase the clock cycles and reduce performance. In the delayed jump mechanism, the control hazard is handled by filling the pipe after the jump instruction with a specific number of NOPs [5]. The placement of the branch and jump prediction unit in the pipeline architecture may affect the critical (longest) path. Detecting the longest path and improving that hardware to minimize the clock period is the standard method of increasing processor performance. To further speed up the processor and minimize the clock period, the design incorporates a high-speed hybrid adder, which employs both carry-skip and carry-select techniques, within the ALU to handle additions. This paper is organized as follows. The system architecture, hardware design and implementation are explained in Section II; the MIPS instruction set, including the new instructions, is detailed with corresponding diagrams in its subsections. The hardware implementation design methodology is explained in Section III. The experimental results for the pipeline stages are shown in Section IV; the simulation results of the encrypted MIPS pipeline processor and their verification and synthesis reports are described in its subsections. The conclusions of the paper are presented in Section V.
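As a rough illustration of the data-hazard handling summarized above, the sketch below shows how a forwarding unit can decide where each operand should come from in a classic five-stage pipeline. The register-field names and the Python modelling are assumptions for exposition, not the paper's hardware description.

```python
# Minimal sketch of a data-hazard forwarding check for a classic five-stage
# pipeline (illustrative only; signal names are assumptions, not the paper's RTL).

def forward_select(id_ex_rs, id_ex_rt, ex_mem_rd, ex_mem_regwrite,
                   mem_wb_rd, mem_wb_regwrite):
    """Return forwarding selections ('EX/MEM', 'MEM/WB' or 'REG') for rs and rt."""
    def pick(src):
        if ex_mem_regwrite and ex_mem_rd != 0 and ex_mem_rd == src:
            return "EX/MEM"          # forward the ALU result of the previous instruction
        if mem_wb_regwrite and mem_wb_rd != 0 and mem_wb_rd == src:
            return "MEM/WB"          # forward the value currently being written back
        return "REG"                 # no hazard: read the register file normally
    return pick(id_ex_rs), pick(id_ex_rt)

# Example: the instruction in EX/MEM writes register 9, which the next one reads.
print(forward_select(id_ex_rs=9, id_ex_rt=10,
                     ex_mem_rd=9, ex_mem_regwrite=True,
                     mem_wb_rd=0, mem_wb_regwrite=False))
# -> ('EX/MEM', 'REG')
```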

167 citations


Journal Article
TL;DR: The observations reveal that neural networks with 15 attributes have outperformed all other data mining techniques for heart disease prediction, and that decision trees have also shown good accuracy with the help of a genetic algorithm and feature subset selection.
Abstract: Heart disease is a term that refers to a large number of medical conditions related to the heart. These medical conditions describe the abnormal health conditions that directly influence the heart and all its parts. Heart disease is a major health problem today. This paper aims at analyzing the various data mining techniques introduced in recent years for heart disease prediction. The observations reveal that neural networks with 15 attributes have outperformed all other data mining techniques. Another conclusion from the analysis is that the decision tree has also shown good accuracy with the help of a genetic algorithm and feature subset selection.
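As a toy illustration of the pattern the survey highlights, the sketch below pairs a decision tree with a simple feature-subset selection step; the synthetic data and the scikit-learn choices are assumptions for exposition, not any of the surveyed systems.

```python
# Illustrative sketch: a decision tree combined with feature subset selection,
# in the spirit of the techniques compared above (not the surveyed systems).
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for a heart-disease table with 15 candidate attributes.
X, y = make_classification(n_samples=300, n_features=15, n_informative=6,
                           random_state=0)

pipe = make_pipeline(SelectKBest(f_classif, k=8),        # keep a feature subset
                     DecisionTreeClassifier(max_depth=4, random_state=0))
print("CV accuracy: %.3f" % cross_val_score(pipe, X, y, cv=5).mean())
```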

141 citations


Journal Article
TL;DR: An automatic system for the extraction of normal and abnormal features in color retinal images could assist ophthalmologists in detecting the signs of diabetic retinopathy at an early stage, leading to a better treatment plan and improved vision-related quality of life.
Abstract: Diabetic retinopathy is one of the serious eye diseases that can cause blindness and vision loss. Diabetes mellitus, a metabolic disorder, has become one of the rapidly increasing health threats both in India and worldwide. Diabetic retinopathy is the complication of diabetes associated with the retina of the eye. A patient with the disease has to undergo periodic screening of the eye. For the diagnosis, ophthalmologists use color retinal images of a patient acquired from a digital fundus camera. The present study is aimed at developing an automatic system for the extraction of normal and abnormal features in color retinal images. Prolonged diabetes causes micro-vascular leakage and micro-vascular blockage within the retinal blood vessels. A filter-based approach with morphological filters is used to segment the vessels. The morphological filters are tuned to match the part of the vessel to be extracted in the green-channel image. To classify pixels into vessel and non-vessel classes, local thresholding based on the gray-level co-occurrence matrix is applied. The performance of the method is evaluated on two publicly available retinal databases with hand-labeled ground truths. On the DRIVE database the method achieves a vessel sensitivity of 86.39% with a specificity of 91.2%, while on the STARE database the proposed method achieves a sensitivity of 92.15% and a specificity of 84.46%. The system could assist ophthalmologists in detecting the signs of diabetic retinopathy at an early stage, leading to a better treatment plan and improved vision-related quality of life. Keywords: vessel segmentation, morphological filter, image processing, diabetic retinopathy.
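A rough sketch of the vessel-enhancement pipeline described above is given below. The paper's GLCM-based local thresholding is replaced here by Otsu's global method purely for brevity, and the file name and structuring-element size are assumptions.

```python
# Rough sketch of green-channel vessel enhancement with a morphological filter.
# Otsu thresholding stands in for the paper's co-occurrence-based local threshold.
import cv2

img = cv2.imread("fundus.png")              # hypothetical retinal image path
green = img[:, :, 1]                        # vessels show best contrast in green

# Black-hat with a disk-like kernel highlights thin dark vessels on a brighter background.
kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (15, 15))
enhanced = cv2.morphologyEx(green, cv2.MORPH_BLACKHAT, kernel)

# Simplified global threshold; the paper applies GLCM-based local thresholding instead.
_, vessels = cv2.threshold(enhanced, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
cv2.imwrite("vessel_mask.png", vessels)
```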

132 citations



Journal Article
TL;DR: This paper is an introductory paper on different techniques used for classification and feature selection.
Abstract: Data mining is a form of knowledge discovery essential for solving problems in a specific domain. Classification is a technique used for discovering classes of unknown data. Various methods for classification exist, such as Bayesian classifiers, decision trees, rule-based methods and neural networks. Before applying any mining technique, irrelevant attributes need to be filtered out. Filtering is done using different feature selection techniques such as wrapper, filter and embedded techniques. This paper is an introductory paper on the different techniques used for classification and feature selection.

126 citations


Journal Article
TL;DR: The paper concludes that MMBCR gives a longer network lifetime by selecting the route with the maximum battery capacity, thereby outperforming DSR and the Ad hoc On-Demand Distance Vector (AODV) routing protocol.
Abstract: In a mobile ad hoc network, nodes have limited battery power. If a node is used frequently for the transmission or overhearing of data packets, more energy is consumed by that node, and after a certain amount of time the energy level may not be sufficient for data transmission, resulting in connection failure. In this paper, we consider three routing protocols, Dynamic Source Routing (DSR), Minimum Maximum Battery Cost Routing (MMBCR) and the Ad hoc On-Demand Distance Vector (AODV) routing protocol, and study their performance in terms of network lifetime for the same network scenario. Simulations are carried out using NS2. Finally, from the simulation results we conclude that MMBCR gives a longer network lifetime by selecting the route with the maximum battery capacity, thereby outperforming DSR.
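The route-selection idea behind MMBCR can be sketched as follows: among the candidate routes, pick the one whose most depleted node still has the most residual battery (equivalently, minimize the maximum battery cost). The topology and energy values below are illustrative only.

```python
# Minimal sketch of MMBCR-style route selection; node ids and residual energies
# are invented for illustration, not the paper's NS2 scenario.

def mmbcr_select(routes, battery):
    """routes: list of node-id lists; battery: dict node_id -> residual energy (J)."""
    def route_cost(route):
        # cost of a route = battery cost (1/energy) of its most depleted node
        return max(1.0 / battery[n] for n in route)
    return min(routes, key=route_cost)

battery = {"A": 40.0, "B": 5.0, "C": 25.0, "D": 30.0, "E": 18.0}
routes = [["A", "B", "D"],       # contains the nearly drained node B
          ["A", "C", "E", "D"]]  # longer, but its weakest node still has 18 J
print(mmbcr_select(routes, battery))   # -> ['A', 'C', 'E', 'D']
```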

79 citations



Journal Article
TL;DR: A comparison between the two leading types of database storage prevailing in the industry is reported, with the results indicating that non-relational databases show true signs of usability for modern applications whose data is huge and generally unstructured.
Abstract: We report a comparison between the two leading types of database storage prevailing in the industry. A database is largely concerned with managing massive amounts of data in a consistent, stable, repeatable and quick manner. The prominent features of both relational and non-relational databases are specified, and these form the basis of the comparison between the two types of database. The relational model is based on mathematical theory (set theory, relational theory), whereas non-relational databases may or may not have a single underlying mathematical foundation. The relational model is beneficial when it comes to reliability, flexibility, robustness and scalability requirements, but to cater to the needs of modern applications where the data is huge and generally unstructured, non-relational databases show true signs of usability. Based on these characteristics, commonly used relational and non-relational database tools are mentioned along with a brief introduction to each. A comparison is made between the tools to highlight the distinctive features of relational and non-relational databases. Conclusive remarks about the two categories of database are given.

65 citations


Journal Article
TL;DR: This paper describes a new symmetric key algorithm that uses a basic additive cipher and some new components to encrypt information, and provides algorithms for both encryption and decryption.
Abstract: Any general communication carried out between humans for exchanging thoughts and knowledge can be understood by anyone who knows the language; this is called plaintext. We therefore need a scheme that converts this general message into a coded message that can be understood only by authorized people, so that the information contained in the general message is hidden from those for whom it is not intended, even if they can see the coded data. Cryptography can be stated as the art and science of transforming messages into coded form to make them secure and immune to attacks that would reveal their secrecy. Cryptography is closely tied to information security and to computer system security and engineering, and it is used in all kinds of advanced applications such as ATMs, online banking and online trading. Cryptography has two basic forms: symmetric key cryptography and asymmetric key cryptography. An asymmetric key algorithm uses two keys, one public and one private, both used in encryption and decryption. Symmetric key algorithms are fast and the most commonly used; in symmetric key algorithms a single key, also called the secret key, is used for both encryption and decryption and is shared between the sender and the receiver. Some common symmetric key algorithms are DES, RC2, RC4 and IDEA. This paper describes a new symmetric key algorithm that uses a basic additive cipher and some new components to encrypt information. We provide algorithms for both encryption and decryption. The advantage of this algorithm over others is also explained. Categories & subject descriptors [Cryptography & Steganography]: A New Algorithm. General terms: algorithms, design, security.
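For reference, the textbook additive (shift) cipher that the proposed algorithm builds on looks like the sketch below; this shows only the basic component shared between sender and receiver, not the paper's full scheme.

```python
# Toy additive (shift) cipher: encryption adds the shared secret key to each
# letter modulo the alphabet size, decryption subtracts it.

ALPHABET_SIZE = 26

def encrypt(plaintext, key):
    out = []
    for ch in plaintext:
        if ch.isalpha():
            base = ord('A') if ch.isupper() else ord('a')
            out.append(chr((ord(ch) - base + key) % ALPHABET_SIZE + base))
        else:
            out.append(ch)                 # leave spaces and punctuation untouched
    return "".join(out)

def decrypt(ciphertext, key):
    # decryption is the additive inverse with the same shared secret key
    return encrypt(ciphertext, -key)

msg = "attack at dawn"
ct = encrypt(msg, 7)
print(ct)                  # 'haahjr ha khdu'
print(decrypt(ct, 7))      # 'attack at dawn'
```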

61 citations


Journal Article
TL;DR: The objective of this paper is to support the customer decision-making process for recommending a membership card by comparing the results of both algorithms and recommending the card to new customers with similar characteristics.
Abstract: Data mining is a useful tool for discovering knowledge from large data sets. Different methods and algorithms are available in data mining. Classification is the most common method used for mining rules from a large database. The decision tree method is generally used for classification because its simple hierarchical structure aids user understanding and decision making. Various data mining algorithms are available for classification based on artificial neural networks, the nearest neighbour rule and Bayesian classifiers, but decision tree mining is the simplest. The objective of this paper is to support the customer decision-making process for recommending a membership card. Here the C5.0 and CART algorithms are applied to a customer database for classification. Both algorithms are first applied to the training data set to create decision trees; a pruning method is used to reduce complexity, and rule sets are then derived from the decision trees. The same rules are then applied to the evaluation data set. The results of both algorithms are compared and the card is recommended to new customers having similar characteristics. Keywords: data mining, classification algorithm, decision tree, regression tree, membership card.
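A hedged sketch of the CART half of the workflow above is given below using scikit-learn (C5.0 is a separate commercial/R tool, so only CART is shown); the customer attributes and synthetic data are illustrative, not the paper's data set.

```python
# Sketch: build a pruned CART tree on a training split, inspect the rules,
# then apply them to an evaluation split, mirroring the workflow above.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(0)
# Hypothetical customer table: [age, annual_income_kUSD, visits_per_month]
X = rng.normal(loc=[40, 55, 4], scale=[12, 20, 2], size=(400, 3))
y = (X[:, 1] + 5 * X[:, 2] > 80).astype(int)      # 1 = recommend membership card

X_train, X_eval, y_train, y_eval = train_test_split(X, y, random_state=0)

# ccp_alpha applies cost-complexity pruning, analogous to the pruning step described.
tree = DecisionTreeClassifier(ccp_alpha=0.01, random_state=0).fit(X_train, y_train)
print(export_text(tree, feature_names=["age", "income", "visits"]))
print("evaluation accuracy:", tree.score(X_eval, y_eval))
```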

53 citations



Journal Article
TL;DR: A comprehensive survey of existing and most recent watermarking techniques according to various categories such as host signal, perceptivity, robustness, watermark type, necessary data for extraction, processing domain, and applications.
Abstract: Digital watermarking techniques have been developed to protect the copyright of digital media. This paper aims to provide a detailed review of and background on the watermarking definition, concept and main contributions in this field. It begins with a digital watermarking overview, general framework, attacks and applications, and finally presents a comprehensive survey of existing and the most recent watermarking techniques. We classify the techniques according to various categories such as host signal, perceptivity, robustness, watermark type, data necessary for extraction, processing domain and applications. In this survey our main concern is image watermarking only.

Journal Article
TL;DR: This paper discusses the load balancing approach: the process of distributing load over different nodes, which provides good resource utilization when nodes are overloaded with jobs.
Abstract: At present, cloud computing is one of the leading platforms, providing data storage at very low cost and available at all times over the internet. However, it has critical issues such as security, load management and fault tolerance. In this paper we discuss the load balancing approach. Many types of load are of concern in the cloud, such as memory load, CPU load and network load. Load balancing is the process of distributing load over the different nodes, which provides good resource utilization when nodes are overloaded with jobs. Load balancing has to handle the load when a node is overloaded; when a node is overloaded, the load is distributed over the other idle nodes. Many algorithms are available for load balancing, such as static load balancing and dynamic load balancing. Keywords: cloud computing, load balancing, virtualization.
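To make the static versus dynamic distinction above concrete, the sketch below contrasts a round-robin dispatcher (static, load-agnostic) with a least-loaded dispatcher (dynamic). Node names and job costs are illustrative only.

```python
# Simple sketch contrasting static and dynamic load balancing policies.
import itertools

nodes = {"node1": 0.0, "node2": 0.0, "node3": 0.0}   # current load per node
rr = itertools.cycle(nodes)

def dispatch_static(job_cost):
    node = next(rr)                      # static: fixed rotation, ignores current load
    nodes[node] += job_cost
    return node

def dispatch_dynamic(job_cost):
    node = min(nodes, key=nodes.get)     # dynamic: pick the least-loaded node
    nodes[node] += job_cost
    return node

for cost in [5, 1, 1, 7, 2]:
    print(cost, "->", dispatch_dynamic(cost), nodes)
```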

Journal Article
TL;DR: In this paper, an offline signature verification scheme based on Convolutional Neural Network (CNN) is proposed and the simulation results reveal the efficiency of the suggested algorithm.
Abstract: The style of people’s handwritten signature is a biometric feature used in person authentication. In this paper, an offline signature verification scheme based on Convolutional Neural Network (CNN) is proposed. CNN focuses on the problems of feature extraction without prior knowledge on the data. The classification task is performed by Multilayer perceptron network (MLP). This method is not only capable of extracting features relevant to a given signature, but also robust with regard to signature location changes and scale variations when compared to classical methods. The proposed method is evaluated on a dataset of Persian signatures gathered originally from 22 people. The simulation results reveal the efficiency of the suggested algorithm.
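A hedged sketch of the CNN-plus-MLP idea described above is given below using Keras; the layer sizes, input resolution and 22-class output head are assumptions for illustration, not the paper's exact architecture.

```python
# Sketch: convolutional layers extract signature features without hand-crafted
# descriptors, and a small MLP head classifies among the 22 signers.
import tensorflow as tf
from tensorflow.keras import layers

model = tf.keras.Sequential([
    layers.Input(shape=(64, 64, 1)),              # grayscale signature image
    layers.Conv2D(16, 5, activation="relu"), layers.MaxPooling2D(),
    layers.Conv2D(32, 5, activation="relu"), layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(128, activation="relu"),         # MLP classification head
    layers.Dense(22, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
# Training would follow with model.fit(images, signer_ids, epochs=...).
```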


Journal Article
TL;DR: In this paper, the authors performed a sensitivity analysis for weight reduction of the automotive chassis with constraints of maximum shear stress, equivalent stress and deflection of chassis under maximum load.
Abstract: The automotive chassis is an important part of an automobile. The chassis serves as a framework supporting the body and the different parts of the automobile. It should also be rigid enough to withstand shock, twist, vibration and other stresses. Along with strength, an important consideration in chassis design is adequate bending stiffness for better handling characteristics. So maximum stress, maximum equivalent stress and deflection are important criteria for the design of the chassis. This report presents the work performed towards the optimization of the automotive chassis with constraints on the maximum shear stress, equivalent stress and deflection of the chassis under maximum load. Structural systems like the chassis can be readily analyzed using finite element techniques, so a proper finite element model of the chassis is developed. A sensitivity analysis is carried out for weight reduction. The chassis is modeled in PRO-E, and FEA is performed on the modeled chassis using ANSYS Workbench.

Journal Article
TL;DR: In this article, the authors make an attempt in identifying the factors affecting learning speaking skills and outlines feasible solutions to solve the problems in teaching and learning process of English language in India, where lack of proper official data regarding how many people speak English, the proficiency levels of Indian teachers and learners leads to the difficulties in planning and implementing development in English language and its teaching/learning process.
Abstract: In today’s globalized world, the language used most often is English. English has become the lingua franca for communication, business, education and opportunity in general. English occupies a place of prestige in our country. Even decades after colonial rule, no indigenous language has come up to replace English, either as a medium of communication or as an official language. India is a multi-lingual country where people speak more than 350 languages and dialects. Indians are used to particular patterns of pronunciation, intonation, stress and phonology in their mother tongue; when they start learning English, with its own set of patterns and rules, the confusion begins. This results in problems in the teaching and learning of English. Moreover, the lack of proper official data on how many people speak English and on the proficiency levels of Indian teachers and learners leads to difficulties in planning and implementing improvements in the English language and its teaching and learning process. This paper makes an attempt to identify the factors affecting the learning of speaking skills and outlines feasible solutions.

Journal Article
TL;DR: A 3D gear model and finite element analysis are used to conduct RBS and SCS calculations for mating involute spur gears; the Lewis formula and Hertz equation are found suitable for quick gear stress calculation, whereas the AGMA standards and FEM are used for detailed gear stress calculation.
Abstract: This paper presents an analysis of the bending stress and contact stress of involute spur gear teeth in mesh. Several kinds of stresses are present in loaded, rotating gear teeth. Bending stress and contact stress (Hertz stress) calculation is the basis of stress analysis. It is difficult to obtain a correct answer for gear tooth stress by applying only the fundamental stress equations, such as the Lewis formula for bending stress and the Hertz equation for contact stress. Detailed gear stressing is the key contribution of this paper. The design of an effective and reliable gearing system includes its ability to withstand RBS (root bending stress) and SCS (surface contact stress). Various theoretical, numerical and experimental research methods have been applied over the years. We primarily prefer theoretical and numerical methods because experimental testing can be expensive, and many researchers have therefore utilized FEM to predict RBS and SCS. In this study we use a 3D gear model and finite element analysis to conduct RBS and SCS calculations for mating involute spur gears. A pair of involute spur gears without tooth modification and transmission error is defined in a CAD system (CATIA V5, AUTODESK INVENTOR, etc.) and FEA is performed using the finite element software ANSYS. The obtained FEA results are comparable with theoretical and AGMA standard values. It is found that the Lewis formula and Hertz equation are suited to quick gear stress calculation, whereas the AGMA standards and FEM are used for detailed stress calculation for a pair of involute spur gears.
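The two textbook formulas named above can be used for the quick estimates the abstract mentions. The sketch below evaluates the Lewis bending stress and the Hertzian line-contact stress; the numerical gear data are illustrative values, not the paper's case study.

```python
# Quick-estimate sketch of the Lewis bending stress and Hertz contact stress.
import math

def lewis_bending_stress(W_t, face_width, module, Y):
    """Lewis formula: sigma_b = W_t / (b * m * Y); MPa if inputs are in N and mm."""
    return W_t / (face_width * module * Y)

def hertz_contact_stress(F, face_width, r1, r2, E1, E2, nu1=0.3, nu2=0.3):
    """Maximum Hertzian pressure for two cylinders in line contact (N, mm, MPa)."""
    curvature = 1.0 / r1 + 1.0 / r2
    elasticity = (1 - nu1**2) / E1 + (1 - nu2**2) / E2
    return math.sqrt(F * curvature / (math.pi * face_width * elasticity))

# Illustrative spur-gear pair: 1 kN tangential load, 20 mm face width, module 4 mm.
print("Lewis bending stress [MPa]:", lewis_bending_stress(1000, 20, 4, Y=0.32))
print("Hertz contact stress [MPa]:", hertz_contact_stress(1000, 20, 30, 90,
                                                          E1=2.1e5, E2=2.1e5))
```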

Journal Article
TL;DR: In this article, routing protocols have been categorized on the basis of their homogeneity and heterogeneity of sensor nodes followed by the criteria of clustered and non-clustered among both.
Abstract: The wireless sensor network (WSN) has emerged as a useful supplement to modern wireless communication networks. Optimal selection of paths for data transfer results in savings in energy consumption and an increase in the network lifetime of wireless sensor networks. Many routing, power management and data dissemination protocols have been specifically designed for WSNs, where energy awareness is an essential design issue. Routing protocols in WSNs may differ depending on the application and network architecture, as there is still no consensus on a fixed communication stack for WSNs. Newer routing protocols are required to cater to the needs of ubiquitous and pervasive computing. In this paper, WSN routing protocols are classified in four ways, i.e., by routing path establishment, network structure, protocol operation and initiator of communication. Further, routing protocols are categorized on the basis of the homogeneity or heterogeneity of sensor nodes, followed by the criterion of clustered versus non-clustered within both. Data aggregation, support for queries and scalability of the network for these routing protocols are also discussed.

Journal Article
TL;DR: RSVP was intended to provide IP networks with the capability to support the divergent performance requirements of differing application types.
Abstract: RSVP allows applications to obtain differing qualities of service (QoS) for their data flows. Such a capability recognizes that different applications have different network performance requirements. Some applications, including the more traditional interactive and batch applications, require reliable delivery of data but do not impose any stringent requirements on the timeliness of delivery. Newer application types, including videoconferencing, IP telephony and other forms of multimedia communications, require almost the exact opposite: data delivery must be timely but not necessarily reliable. Thus, RSVP was intended to provide IP networks with the capability to support the divergent performance requirements of differing application types.

Journal Article
TL;DR: In this paper, the authors use a Taguchi approach to capture the effects of the signal-to-noise ratio of the experiments based on the orthogonal arrays used; an analysis of variance is performed and the optimum conditions are found.
Abstract: The purpose of this paper is to optimize the sand casting process parameters of castings manufactured in an iron foundry by maximizing the signal-to-noise ratios and minimizing the noise factors using the Taguchi method. A Taguchi approach is used to capture the effects of the signal-to-noise ratio of the experiments based on the orthogonal arrays used; an analysis of variance is performed and the optimum conditions are found. This paper demonstrates a robust method for formulating a strategy to find the optimum process factors and interactions with a small number of experiments. The process parameters considered are moisture, sand particle size, green compression strength, mould hardness, permeability, pouring temperature, pouring time and pressure test. The results indicate that the selected process parameters significantly affect the casting defects in the foundry. The expected improvement in the reduction of casting defects is found to be 37.66 percent.
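As a small worked example of the signal-to-noise ratio central to the approach above, the sketch below evaluates the Taguchi "smaller-the-better" S/N ratio, which is the natural choice when the response being minimized is a defect level; the replicate values are invented for illustration, not the foundry's measurements.

```python
# Taguchi "smaller-the-better" signal-to-noise ratio: S/N = -10 * log10(mean(y^2)).
import math

def sn_smaller_is_better(values):
    """Larger S/N means lower and more consistent defect levels."""
    return -10.0 * math.log10(sum(v * v for v in values) / len(values))

# Defect percentages observed for two hypothetical factor-level combinations.
trial_a = [4.2, 3.8, 4.5]
trial_b = [2.1, 2.6, 2.3]
print("S/N trial A:", round(sn_smaller_is_better(trial_a), 2))
print("S/N trial B:", round(sn_smaller_is_better(trial_b), 2))
# The factor-level combination with the higher S/N ratio is preferred.
```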



Journal Article
TL;DR: In this article, the impact of age group, educational qualification, experience and extent of participation of faculty members, and the impact of career development programs, on the quality of teaching in engineering education is investigated.
Abstract: Engineering education in Kerala is undergoing dramatic changes. Around fifteen years back only the cream of students went for engineering studies. At present students are joining the course not out of a passion for the engineering stream but as a matter of prestige and pressure from the external environment. The scenario is not different among the faculty community in terms of quality erosion. The present study aimed to find out the impact of age group, educational qualification, experience and extent of participation of faculty members, and the impact of career development programs, on the quality of teaching. The most popular career development tool utilized by faculty members in the field of engineering is attending seminars, possibly because it is an easy task. The research reveals that the majority of faculty members in the engineering discipline are around thirty-five years of age and their basic qualification is a B.Tech. Apart from that, attending workshops and faculty development programs and participating in national and international conferences help a faculty member to increase the quality of teaching. It is observed that there is reluctance on the part of faculty members to participate in career development programmes. The history of education in India: the history of education in the Indian subcontinent began with the teaching of traditional elements such as Indian religions, Indian mathematics and Indian logic at early Hindu and Buddhist centres of learning such as Taxila (in modern-day Pakistan) and Nalanda (in India) before the common era. Islamic education became ingrained with the establishment of the Islamic empires in the Indian subcontinent in the Middle Ages, while the coming of the Europeans later brought western education to colonial India. A series of measures continuing throughout the early half of the 20th century ultimately laid the foundation of education in the Republic of India, education in Pakistan and much of South Asia. Early education in India commenced under the supervision of a guru. Initially, education was open to all and seen as one of the methods to achieve Moksha, or enlightenment. As time progressed, due to superiority complexes, education was imparted on the basis of caste and the related duties that one had to perform as a member of a specific caste. The Brahmans learned about scriptures and religion while the Kshatriya were educated in the various aspects of warfare. The Vaishya caste learned commerce and other specific vocational courses while education was largely denied to the Shudras, the lowest caste. The earliest venues of education in India were often secluded from the main population. Students were expected to follow strict monastic guidelines prescribed by the guru and to stay away from cities in ashrams. However, as the population increased under the Gupta empire, centres of urban learning became increasingly common, and cities such as Varanasi and the Buddhist centre at Nalanda became increasingly visible. Education in India in its traditional form was closely related to religion. Among the heterodox schools of belief were the Jain and Buddhist schools. Heterodox Buddhist education was more inclusive, and aside from the monastic orders the Buddhist education centres were urban institutes of learning, such as Taxila and Nalanda, where grammar, medicine, philosophy, logic, metaphysics, arts and crafts, etc. were also taught.
Early secular Buddhist institutions of higher learning like Taxila and Nalanda continued to function well into the common era and were attended by students from China and Central Asia. On the subject of education for the nobility, Joseph Prabhu writes: "Outside the religious framework, kings and princes were educated in the arts and sciences related to government: politics (danda-niti), economics (vartta), philosophy (anviksiki), and historical traditions (itihasa). Here the authoritative source was Kautilya's …"
The Changing Face of Engineering Education in Kerala - An Empirical Study at Engineering Colleges in Kerala

Journal Article
TL;DR: In this article, the authors carry out a parametric analysis of the helically coiled heat exchanger using various correlations given by different researchers for specific conditions, and present this analysis for specific data.
Abstract: Heat exchangers are the important engineering systems with wide variety of applications including power plants, nuclear reactors, refrigeration and air-conditioning systems, heat recovery systems, chemical processing and food industries. Helical coil configuration is very effective for heat exchangers and chemical reactors because they can accommodate a large heat transfer area in a small space, with high heat transfer coefficients. This paper deals with the parametric analysis of the helical coiled heat exchanger with various correlations given by different researchers for specific conditions. The parametric analysis of these various correlations with specific data is presented in this paper.
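As an example of the kind of parametric calculation such correlation studies start from, the sketch below computes the Reynolds and Dean numbers of a helical coil over a range of flow rates; these are the dimensionless groups that coil correlations typically take as input. The fluid properties and geometry are illustrative values for water in a small coil, not the paper's data.

```python
# Parametric sweep of Reynolds and Dean numbers for a helically coiled tube.
import math

rho, mu = 998.0, 1.0e-3        # water density (kg/m^3) and viscosity (Pa*s)
d_tube, D_coil = 0.010, 0.100  # tube inner diameter and coil diameter (m)

def reynolds(m_dot):
    """Re = 4*m_dot / (pi * d * mu) for flow in a circular tube."""
    return 4.0 * m_dot / (math.pi * d_tube * mu)

def dean(m_dot):
    """De = Re * sqrt(d/D): measures the secondary-flow effect of coil curvature."""
    return reynolds(m_dot) * math.sqrt(d_tube / D_coil)

for m_dot in (0.01, 0.02, 0.05):          # mass flow rates in kg/s
    print(f"m_dot={m_dot:.2f} kg/s  Re={reynolds(m_dot):.0f}  De={dean(m_dot):.0f}")
```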




Journal Article
TL;DR: This paper aims to understand how RFID technology can help reduce accidents on Indian roads, and proposes a novel method to tackle transport-related issues.
Abstract: In the era of embedded systems, time and efficiency are a matter of priority. RFID (Radio Frequency Identification) has emerged as one of the converging technologies. Transportation plays an important role in urbanization, and RFID is one of the key catalysts playing a significant role in it. RFID plays a major role in auto-ID applications such as the contactless smart cards used by bus riders, and in supermarkets, textiles and logistics chain management. This paper aims to understand how RFID technology can help reduce accidents on Indian roads. The Global System for Mobile Communications (GSM) has been a great success in providing both voice and low-speed data services. Enhanced Circuit Switched Data on GSM (ECSD) is one of the major evolutionary steps towards serving real-time, high-speed data services. Population explosion is the source of many issues, one of which is transport. In this paper, we propose a novel method to tackle transport-related issues. Applications such as vehicle tracking and accident alerting are explained in this paper.

Journal Article
TL;DR: An attempt has been made to consolidate the various security threats in a classified manner and to illustrate how cloud and virtualization vulnerabilities affect the different cloud service models.
Abstract: Cloud computing provides an efficient and flexible way for services to meet escalating business needs. The cloud's shared infrastructure and associated services make it a cost-effective alternative to traditional approaches. However, they may also introduce security breaches and privacy issues. As more cloud-based applications keep evolving, the associated security threats are also growing. Many research works on cloud security exist in partial forms, focusing either specifically on cloud issues or on virtualization-related security issues. In this paper, an attempt has been made to consolidate the various security threats in a classified manner and to illustrate how cloud and virtualization vulnerabilities affect the different cloud service models.