Posted Content

Cloud Broker: A Systematic Mapping Study.

TL;DR: In this article, a comprehensive 3-tier search strategy (manual search, backward snowballing, and database search) is used to map the cloud broker field: its most important and hottest topics, existing trends and issues, active researchers and countries, the techniques commonly used to build cloud brokers, the evaluation methods employed, the amount of research by year and place of publication, and the most important active search spaces.
Abstract: The current systematic review follows a comprehensive 3-tier search strategy (manual search, backward snowballing, and database search). The accuracy of the search methodology has been analyzed in terms of extracting related studies and collecting comprehensive and complete information in a supplementary file. In the search methodology, quality criteria have been defined to select the highest-quality and most relevant studies across all search spaces. In addition, queries have been built from the field's key terms in order to find studies related to the cloud broker. Out of 1928 extracted search spaces, 171 have been selected based on the defined quality criteria. Then, 1298 studies have been extracted from the selected search spaces, from which 496 high-quality papers published in prestigious journals, conferences, and workshops between 2009 and the end of 2019 have been selected. In this Systematic Mapping Study (SMS), 8 research questions have been designed to identify the most important and hottest topics in the cloud broker field, existing trends and issues, active researchers and countries, the techniques commonly used to build cloud brokers, the evaluation methods employed, the amount of research conducted by year and place of publication, and the most important active search spaces.
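To make the selection step concrete, the following minimal Python sketch (not the authors' tooling; the criteria names, threshold, and query string are illustrative assumptions) filters extracted studies by the 2009-2019 window and an assumed aggregate quality score, and shows the kind of keyword query run against the search spaces.

from dataclasses import dataclass

@dataclass
class Study:
    title: str
    venue: str            # journal, conference, or workshop
    year: int
    quality_score: float  # assumed aggregate of the defined quality criteria

def select_studies(studies, min_quality=0.7, year_range=(2009, 2019)):
    """Keep studies inside the review window that pass the quality threshold."""
    return [
        s for s in studies
        if year_range[0] <= s.year <= year_range[1] and s.quality_score >= min_quality
    ]

# Illustrative query built from the field's keywords:
QUERY = '("cloud broker" OR "cloud brokerage") AND (scheduling OR "service selection")'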
Citations
Journal ArticleDOI
TL;DR: In this paper, the authors propose a fuzzy clustering algorithm (FCA) to classify the locations of customers and services, and an iterative adaptive neural-fuzzy algorithm (IANFRA) that identifies suitable services based on the locations clustered by FCA and the demands of customers (experienced and inexperienced).
Abstract: Nowadays, cloud customers increasingly use cloud services to satisfy their demands. A significant number of customers are inexperienced and cannot express their needs accurately and numerically; they usually express their needs verbally, in the form of linguistic terms. Experienced customers, on the other hand, express their needs numerically and accurately. In this situation, a recommendation system is one of the most useful ways to support all types of customers. However, current recommendation systems (e.g., collaborative filtering based recommendations) handle only customer requests that are expressed accurately and numerically. To support all types of customers, it is vital to construct a strong recommendation system that analyzes the demands expressed by customers (experienced and inexperienced) and recommends suitable services. In addition, cloud customers and services are geographically distributed, and identifying the location of customers and services has a significant effect on the quality of the services offered. Therefore, the recommendation system should consider the location of customers and services in order to provide better services. In this paper, we introduce an efficient method to construct a powerful recommendation system that can provide suitable services considering the preferences of the customer and their location. The proposed recommendation system comprises two algorithms. The first is a fuzzy clustering algorithm, named FCA, that classifies the locations of customers and services. The second is an iterative adaptive neural-fuzzy algorithm, named IANFRA, which receives the preferences of the customer along with their location and identifies suitable services based on the locations clustered by FCA and the demands of customers (experienced and inexperienced). Finally, the feasibility of the proposed method has been validated in terms of accuracy and scalability through extensive experiments on WS-DREAM, a real distributed service-quality dataset. The evaluation results show that both the accuracy of quality-of-service prediction for service recommendation and the scalability on large dataset volumes are improved.
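As a rough illustration of the location-clustering idea behind FCA, the minimal numpy sketch below runs plain fuzzy c-means on 2-D customer/service coordinates; the actual FCA and IANFRA algorithms are more elaborate, and all names, parameters, and coordinates here are illustrative assumptions rather than the paper's implementation.

import numpy as np

def fuzzy_c_means(points, n_clusters=3, m=2.0, max_iter=100, tol=1e-5, seed=0):
    """Cluster 2-D points; return soft memberships (n, c) and cluster centers (c, 2)."""
    rng = np.random.default_rng(seed)
    u = rng.random((len(points), n_clusters))
    u /= u.sum(axis=1, keepdims=True)                      # random initial fuzzy memberships
    for _ in range(max_iter):
        um = u ** m
        centers = (um.T @ points) / um.sum(axis=0)[:, None]
        dist = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        new_u = 1.0 / dist ** (2.0 / (m - 1.0))            # standard FCM membership update
        new_u /= new_u.sum(axis=1, keepdims=True)
        if np.abs(new_u - u).max() < tol:
            u = new_u
            break
        u = new_u
    return u, centers

# Toy usage: cluster a few customer coordinates; a recommender (IANFRA in the
# paper) would then score services within the customer's own location cluster.
locations = np.array([[35.7, 51.4], [35.8, 51.3], [40.7, -74.0], [40.6, -73.9]])
memberships, centers = fuzzy_c_means(locations, n_clusters=2)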

5 citations

References
Journal ArticleDOI
TL;DR: This article clears the clouds away from the true potential and obstacles posed by cloud computing.
Abstract: Clearing the clouds away from the true potential and obstacles posed by this computing capability.

9,282 citations

Journal ArticleDOI
TL;DR: The series of cost estimation SLRs demonstrates the potential value of EBSE for synthesising evidence and making it available to practitioners; European researchers appear to be the leading exponents of systematic literature reviews.
Abstract: Background: In 2004 the concept of evidence-based software engineering (EBSE) was introduced at the ICSE04 conference. Aims: This study assesses the impact of systematic literature reviews (SLRs) which are the recommended EBSE method for aggregating evidence. Method: We used the standard systematic literature review method employing a manual search of 10 journals and 4 conference proceedings. Results: Of 20 relevant studies, eight addressed research trends rather than technique evaluation. Seven SLRs addressed cost estimation. The quality of SLRs was fair with only three scoring less than 2 out of 4. Conclusions: Currently, the topic areas covered by SLRs are limited. European researchers, particularly those at the Simula Laboratory appear to be the leading exponents of systematic literature reviews. The series of cost estimation SLRs demonstrate the potential value of EBSE for synthesising evidence and making it available to practitioners.

2,843 citations

Journal ArticleDOI
Thomas Hofmann
TL;DR: This paper proposes a temperature controlled version of the Expectation Maximization algorithm for model fitting, which has shown excellent performance in practice; the underlying generative latent class model results in a more principled approach with a solid foundation in statistical inference.
Abstract: This paper presents a novel statistical method for factor analysis of binary and count data which is closely related to a technique known as Latent Semantic Analysis. In contrast to the latter method which stems from linear algebra and performs a Singular Value Decomposition of co-occurrence tables, the proposed technique uses a generative latent class model to perform a probabilistic mixture decomposition. This results in a more principled approach with a solid foundation in statistical inference. More precisely, we propose to make use of a temperature controlled version of the Expectation Maximization algorithm for model fitting, which has shown excellent performance in practice. Probabilistic Latent Semantic Analysis has many applications, most prominently in information retrieval, natural language processing, machine learning from text, and in related areas. The paper presents perplexity results for different types of text and linguistic data collections and discusses an application in automated document indexing. The experiments indicate substantial and consistent improvements of the probabilistic method over standard Latent Semantic Analysis.
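To make the tempered EM idea concrete, here is a compact Python/numpy sketch of pLSA fitted with a tempered E-step, where an inverse temperature beta < 1 flattens the topic posteriors. The variable names, the fixed beta, and the update schedule are simplifying assumptions for illustration, not the paper's exact procedure (which anneals the temperature using held-out data).

import numpy as np

def plsa_tempered_em(counts, n_topics=4, beta=0.8, n_iter=50, seed=0):
    """counts: (n_docs, n_words) co-occurrence matrix. Returns P(z|d) and P(w|z)."""
    rng = np.random.default_rng(seed)
    n_docs, n_words = counts.shape
    p_z_d = rng.random((n_docs, n_topics)); p_z_d /= p_z_d.sum(axis=1, keepdims=True)
    p_w_z = rng.random((n_topics, n_words)); p_w_z /= p_w_z.sum(axis=1, keepdims=True)
    for _ in range(n_iter):
        # Tempered E-step: posterior over topics for every (doc, word) pair,
        # raised to the power beta and renormalized (beta = 1 recovers plain EM).
        post = (p_z_d[:, :, None] * p_w_z[None, :, :]) ** beta      # shape (d, z, w)
        post /= post.sum(axis=1, keepdims=True) + 1e-12
        # M-step: re-estimate the multinomials from expected counts.
        weighted = counts[:, None, :] * post                        # shape (d, z, w)
        p_w_z = weighted.sum(axis=0)
        p_w_z /= p_w_z.sum(axis=1, keepdims=True) + 1e-12
        p_z_d = weighted.sum(axis=2)
        p_z_d /= p_z_d.sum(axis=1, keepdims=True) + 1e-12
    return p_z_d, p_w_z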

2,574 citations

Journal ArticleDOI
TL;DR: The authors identify a need to update the guidance on how to conduct systematic mapping studies, drawing on lessons learned from existing systematic maps and SLR guidelines, and propose consolidated guidelines accordingly.
Abstract: Context: Systematic mapping studies are used to structure a research area, while systematic reviews are focused on gathering and synthesizing evidence. The most recent guidelines for systematic mapping are from 2008. Since that time, many suggestions have been made on how to improve systematic literature reviews (SLRs). There is a need to evaluate how researchers conduct the systematic mapping process and identify how the guidelines should be updated based on the lessons learned from existing systematic maps and SLR guidelines. Objective: To identify how the systematic mapping process is conducted (including search, study selection, analysis and presentation of data, etc.), to identify improvement potential in conducting the systematic mapping process, and to update the guidelines accordingly. Method: We conducted a systematic mapping study of systematic maps, also considering some practices from systematic review guidelines (in particular in relation to defining the search and conducting a quality assessment). Results: In a large number of studies, multiple guidelines are used and combined, which leads to different ways of conducting mapping studies. The reason for combining guidelines was that they differed in the recommendations given. Conclusion: The most frequently followed guidelines are not sufficient alone. Hence, there was a need to provide an update on how to conduct systematic mapping studies. New guidelines have been proposed that consolidate existing findings.

1,598 citations

Journal ArticleDOI
TL;DR: Mapping studies can save time and effort for researchers and provide baselines to assist new research efforts; however, they must be of high quality in terms of completeness and rigour if they are to be a reliable basis for follow-on research.
Abstract: Context: We are strong advocates of evidence-based software engineering (EBSE) in general and systematic literature reviews (SLRs) in particular. We believe it is essential that the SLR methodology is used constructively to support software engineering research. Objective: This study aims to assess the value of mapping studies which are a form of SLR that aims to identify and categorise the available research on a broad software engineering topic. Method: We used a multi-case, participant-observer case study using five examples of studies that were based on preceding mapping studies. We also validated our results by contacting two other researchers who had undertaken studies based on preceding mapping studies and by assessing review comments related to our follow-on studies. Results: Our original case study identified 11 unique benefits that can accrue from basing research on a preceding mapping study of which only two were case specific. We also identified nine problems associated with using preceding mapping studies of which two were case specific. These results were consistent with the information obtained from the validation activities. We did not find an example of an independent research group making use of a mapping study produced by other researchers. Conclusion: Mapping studies can save time and effort for researchers and provide baselines to assist new research efforts. However, they must be of high quality in terms of completeness and rigour if they are to be a reliable basis for follow-on research.

532 citations