
Showing papers on "Web service published in 2023"


Journal ArticleDOI
TL;DR: In this article, a multiple case study of 11 manufacturing companies is presented to enhance knowledge of how to design new revenue models for digital services; the authors reveal a highly customer-centric, iterative, and agile process in which close collaboration with key customers during the early stages guides the framing of revenue models.
Abstract: Manufacturing companies are currently undergoing a digitalization transformation in which digitally enabled, new, and innovative advanced service offerings are being launched. These so-called “digital services” represent a shift in the business logic of manufacturing firms, from up-front product sales to advanced service contracts. This business model shift has profound implications for cost structures, risk management, and revenue streams, confronting manufacturing companies with the key challenge of rethinking how to capture value. Using a multiple case study of 11 companies, the purpose of this article is to enhance knowledge of how to design new revenue models for digital services. Results reveal a revenue model design framework of key phases and activities that carries implications for the emerging literature on digital servitization, as well as the business model innovation literature. The findings reveal a highly customer-centric, iterative, and agile process in which close collaboration with key customers during the early stages guides the framing of revenue models for digital services. For practitioners, it provides hands-on advice on how to implement the design, development, and scaling processes for revenue models in the context of new digital services.

10 citations


Journal ArticleDOI
TL;DR: Wang et al. as discussed by the authors proposed a popularity-aware and diverse web API composition recommendation method (PD-WACR), which models web APIs' functions, popularity, and compatibility with an API correlation graph.
Abstract: The ever-increasing web application programming interfaces (APIs) in various service-sharing communities (e.g., ProgrammableWeb.com and Mashape.com) have enabled software developers to quickly create their desired mashups conveniently and economically. However, the large volume of candidate web APIs and their differences often make it hard for software developers to discover a set of appropriate web APIs for mashup creation while considering API functions and API quality performances (e.g., popularity, compatibility, and diversity) simultaneously. These challenges significantly decrease the mashup development success rate and mashup developers’ satisfaction. In view of these challenges, a novel web API recommendation method named the popularity-aware and diverse method of web API compositions’ recommendation (PD-WACR) is proposed in this article. Concretely, we model web APIs’ functions, popularity, and compatibility with an API correlation graph. Afterward, correlation graph-based web API recommendation is performed with popularity and compatibility guarantees. Moreover, a top-k strategy is adopted in the recommendation process so as to diversify the final recommendation results. Finally, extensive experiments are carried out on a real-world web API dataset crawled from ProgrammableWeb.com. Experimental comparisons with related methods show the advantages and innovations of the proposed PD-WACR method.
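As an illustration of the compatibility-guaranteed, popularity-aware selection described above, here is a minimal greedy sketch; the APIs, popularity scores, and correlation edges are invented examples, and the greedy loop is a stand-in for, not a reproduction of, the actual PD-WACR algorithm.

```python
# Minimal greedy sketch of popularity-aware, compatibility-guaranteed top-k
# API selection over a correlation graph. APIs, scores, and edges are
# invented examples; this greedy loop is not the actual PD-WACR algorithm.

def recommend_apis(candidates, popularity, compatible, k=3):
    """Pick up to k APIs, most popular first, each compatible (linked in
    the correlation graph) with every API already selected."""
    selected = []
    for api in sorted(candidates, key=lambda a: popularity[a], reverse=True):
        if len(selected) == k:
            break
        if all(api in compatible.get(s, set()) for s in selected):
            selected.append(api)
    return selected

popularity = {"maps": 9, "geocode": 7, "weather": 6, "maps2": 8}
compatible = {
    "maps": {"geocode", "weather"},
    "geocode": {"maps", "weather"},
    "weather": {"maps", "geocode"},
    "maps2": set(),  # popular but incompatible with everything else
}
print(recommend_apis(list(popularity), popularity, compatible, k=3))
# ['maps', 'geocode', 'weather'] -- 'maps2' is skipped despite popularity
```

The compatibility test against every already-selected API is what distinguishes this from a plain popularity ranking.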

5 citations


Journal ArticleDOI
TL;DR: In this article, a new service composition algorithm based on the hunting behavior of micro-bats is proposed; it determines the optimal combination of web services to satisfy complex user needs.
Abstract: Web services are provided as reusable software components in service-oriented architecture. More complex composite services can be composed from these components to satisfy user requirements, represented as a workflow with specified Quality of Service (QoS) limitations. The workflow consists of tasks, where many candidate services can be considered for each task. Searching for the optimal service combination while optimizing the overall QoS limitations is a Non-deterministic Polynomial (NP)-hard problem. This work focuses on the Web Service Composition (WSC) problem and proposes a new service composition algorithm based on the hunting behavior of micro-bats. The proposed algorithm determines the optimal combination of web services to satisfy complex user needs. It also addresses the Bat Algorithm (BA) shortcomings, such as the tradeoff between the exploration and exploitation search mechanisms, local optima, and the convergence rate. The proposed enhancement includes a cooperative and adaptive population initialization mechanism. An elitist mechanism is utilized to address the BA convergence rate. The tradeoff between exploration and exploitation is handled through a neighborhood search mechanism. Several benchmark datasets are selected to evaluate the proposed bat algorithm’s performance. The simulation results are estimated using the average fitness value, the standard deviation of the fitness value, and the average execution time, and are compared with four bat-inspired algorithms. It is observed from the simulation results that the introduced enhancements obtain significant results.
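To make the optimization target concrete, the sketch below computes a weighted QoS fitness for a composition (one service per task) and finds the best combination exhaustively; the QoS values, weights, and aggregation rules are illustrative assumptions, not the paper's benchmark setup, and a metaheuristic such as the bat algorithm replaces the exhaustive loop when the search space is large.

```python
import math

# Illustrative sketch of the fitness that a QoS-aware composition search
# (such as a bat-inspired metaheuristic) optimizes. Tasks, candidate
# services, QoS values, and weights are invented examples.

def composition_fitness(services, w_time=0.5, w_avail=0.3, w_cost=0.2):
    """One service per workflow task: times and costs add up along the
    workflow, availabilities multiply; higher fitness is better."""
    total_time = sum(s["time"] for s in services)
    total_cost = sum(s["cost"] for s in services)
    avail = math.prod(s["avail"] for s in services)
    return w_avail * avail - w_time * total_time - w_cost * total_cost

candidates = [  # one candidate list per workflow task
    [{"time": 0.2, "cost": 1.0, "avail": 0.99},
     {"time": 0.1, "cost": 3.0, "avail": 0.95}],
    [{"time": 0.4, "cost": 0.5, "avail": 0.98}],
]
# Exhaustive baseline; a metaheuristic replaces this loop for large spaces,
# where enumerating all combinations is NP-hard.
best = max(([a, b] for a in candidates[0] for b in candidates[1]),
           key=composition_fitness)
print([s["cost"] for s in best])  # [1.0, 0.5]: cheaper, more available pick
```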

2 citations


Journal ArticleDOI
TL;DR: Wang et al. as mentioned in this paper proposed a dynamic service computing model (DSCM) for monitoring land cover change, with three dynamic computation strategies proposed according to different users' change detection requirements.
Abstract: Land cover change (LCC) is increasingly affecting global climate change, the energy cycle, the carbon cycle, and the water cycle, with far-reaching consequences for human well-being. Web service-based online change detection applications have bloomed over the past decade for monitoring land cover change. Currently, massive numbers of processing services and data services have been published and used over the internet. However, few studies consider both service integration and resource sharing in the land cover domain, so end-users can rarely acquire LCC information in a timely manner. The behavioral interaction between services is also growing more complex due to the increasing use of web service composition technology, making it challenging for static web services to provide collaboration and matching between diverse web services. To address the above challenges, a Dynamic Service Computing Model (DSCM) was proposed for monitoring LCC. Three dynamic computation strategies were proposed according to different users’ change detection requirements. WMS-LCC was first developed by extending the existing WMS for ready-to-use LCC data access. Spatial relation-based LCC data integration was then proposed for extracting LCC information from multi-temporal land cover data. Processing service encapsulation and service composition methods were also developed for chaining various land cover services into a complex service chain. Finally, a prototype system was implemented to evaluate the validity and feasibility of the proposed DSCM. Two walk-through examples were performed with GlobeLand30 datasets and multi-temporal Landsat imagery, respectively. The experimental results indicate that the proposed DSCM approach is effective and applicable to a wide range of issues in land cover change detection.
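The pixel-level comparison underlying multi-temporal LCC extraction can be sketched as follows; the class labels and tiny grids are invented, and a real service would operate on georeferenced rasters rather than nested lists.

```python
# Minimal sketch of pixel-level land cover change (LCC) extraction from
# two co-registered categorical grids, the kind of operation the spatial
# integration service above performs. Class codes are invented examples.

def detect_change(cover_t1, cover_t2):
    """Return a boolean change map plus per-transition counts
    (e.g. how many pixels went from forest to cropland)."""
    changes = {}
    change_map = []
    for row1, row2 in zip(cover_t1, cover_t2):
        change_map.append([c1 != c2 for c1, c2 in zip(row1, row2)])
        for c1, c2 in zip(row1, row2):
            if c1 != c2:
                changes[(c1, c2)] = changes.get((c1, c2), 0) + 1
    return change_map, changes

t1 = [["forest", "forest"], ["water", "crop"]]
t2 = [["forest", "crop"], ["water", "crop"]]
cmap, stats = detect_change(t1, t2)
print(stats)  # {('forest', 'crop'): 1}
```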

1 citation


Journal ArticleDOI
TL;DR: In this paper, the authors conduct a large-scale empirical study of 20,047 web APIs published at two popular and publicly accessible web API registries, ProgrammableWeb and APIs.guru, and conduct a user survey to investigate the features of web APIs that users often consider when shortlisting a web API for testing before they adopt it.
Abstract: With the increasing adoption of services-oriented computing and cloud computing technologies, web APIs have become the fundamental building blocks for constructing software applications. Web APIs are developed and published on the internet. The functionality of web APIs can be used to facilitate the development of software applications. There are numerous studies on retrieving and recommending candidate web APIs based on user requirements from a large set of web APIs. However, there are very limited studies on the features of web APIs that make them more likely to be used and the issues of using web APIs in practice. Moreover, users’ expectations on the development and management of web APIs are rarely investigated. In this paper, we conduct a large-scale empirical study of 20,047 web APIs published at two popular and publicly accessible web API registries: ProgrammableWeb and APIs.guru. We first extract the questions posted on Stack Overflow (SO) that are relevant to the web APIs. We then manually analyze 1,885 randomly sampled SO questions and identify 24 web API issue types (e.g., authorization error) that are encountered by users. Afterwards, we conduct a user survey to investigate the features of web APIs that users often consider when shortlisting a web API for testing before they adopt it, validate the identified types of web API issues, and understand users’ expectations on the development and management of web APIs. From the 191 received responses, we extract 14 important features for users to decide whether to use a web API (e.g., well-organized documentation). We also gain a better understanding of web API issue types and summarize 11 categories of user expectations on web APIs (e.g., documentation and SDK/library). As a result of our study, we provide guidelines for web API developers and registry managers to improve web APIs and promote the use of web APIs.

1 citation


Journal ArticleDOI
TL;DR: Wang et al. as mentioned in this paper proposed a web service classification method based on HIN and generative adversarial networks (GAN), named SC-GAN, which first constructs a HIN from the structural relationships between web services and their attribute information; after feature embeddings of the services are obtained via meta-path random walks, they are fed into a relationship-aware GAN model for adversarial training to obtain high-quality negative samples for optimizing the embeddings.
Abstract: With the rapid development of service computing and software technologies, it is necessary to correctly and efficiently classify web services to promote their discovery and application. Existing service classification methods based on heterogeneous information networks (HIN) achieve relatively good classification performance. However, such methods use negative sampling to randomly select nodes and do not learn the underlying distribution to obtain a robust representation of the nodes. This paper proposes a web service classification method based on HIN and generative adversarial networks (GAN), named SC-GAN. The authors first construct a HIN using the structural relationships between web services and their attribute information. After the feature embeddings of the services are obtained based on meta-path random walks, they are fed into a relationship-aware GAN model for adversarial training to obtain high-quality negative samples for optimizing the embeddings. Experimental results on real datasets show that SC-GAN improves classification accuracy significantly over state-of-the-art methods.
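The meta-path random walk step that produces the input embeddings can be illustrated on a toy graph; the tiny HIN and the service-tag-service meta-path are assumptions for the example, not the paper's dataset or walk scheme.

```python
import random

# Toy sketch of a meta-path random walk over a service HIN, the embedding
# input step described for SC-GAN. The tiny graph and the
# service-tag-service meta-path are illustrative assumptions.

def meta_path_walk(graph, start, meta_path, length, rng):
    """Walk `length` steps, only following edges whose target matches the
    node type demanded by the repeating meta-path pattern."""
    walk = [start]
    for i in range(length):
        wanted = meta_path[(i + 1) % len(meta_path)]
        neighbors = [n for n in graph[walk[-1]] if n[0] == wanted]
        if not neighbors:
            break  # dead end for this meta-path
        walk.append(rng.choice(neighbors))
    return walk

# Node names encode their type in the first character: 's' service, 't' tag.
graph = {
    "s1": ["t1"], "s2": ["t1", "t2"], "s3": ["t2"],
    "t1": ["s1", "s2"], "t2": ["s2", "s3"],
}
walk = meta_path_walk(graph, "s1", meta_path=["s", "t"], length=4,
                      rng=random.Random(0))
print(walk)  # alternates service and tag nodes, starting from s1
```

In a full pipeline, many such walks would be fed to a skip-gram-style model to produce the node embeddings.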

1 citation


Journal ArticleDOI
01 Mar 2023-Sensors
TL;DR: In this article, a formal method is presented for representing the components of trust-based service management in the IoT using higher-order logic (HOL) and for verifying the different behaviors of the trust system and the trust value computation processes.
Abstract: The exponential growth in the number of smart devices connected to the Internet of Things (IoT) that are associated with various IoT-based smart applications and services raises interoperability challenges. Service-oriented architecture for IoT (SOA-IoT) solutions have been introduced to deal with these interoperability challenges by integrating web services into sensor networks via IoT-optimized gateways to fill the gap between devices, networks, and access terminals. The main aim of service composition is to transform user requirements into a composite service execution. Different methods have been used to perform service composition, which can be classified as trust-based or non-trust-based. The existing studies in this field have reported that trust-based approaches outperform non-trust-based ones. Trust-based service composition approaches use the trust and reputation system as a brain to select appropriate service providers (SPs) for the service composition plan. The trust and reputation system computes each candidate SP’s trust value and selects the SP with the highest trust value for the service composition plan. The trust system computes the trust value from the self-observation of the service requestor (SR) and other service consumers’ (SCs) recommendations. Several experimental solutions have been proposed to deal with trust-based service composition in the IoT; however, a formal method for trust-based service composition in the IoT has been lacking. In this study, we use a formal method, based on higher-order logic (HOL), to represent the components of trust-based service management in the IoT and to verify the different behaviors of the trust system and the trust value computation processes. Our findings show that the presence of malicious nodes performing trust attacks leads to biased trust value computation, which results in inappropriate SP selection during the service composition. The formal analysis has given us a clear insight and complete understanding, which will assist in the development of a robust trust system.
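A back-of-the-envelope sketch of the trust computation analyzed above, blending the requestor's self-observation with consumers' recommendations; the weight alpha and all ratings are invented, and the second call shows how dishonest recommendations bias the result.

```python
# Back-of-the-envelope sketch of the trust computation the HOL model
# analyzes: a service requestor blends its own observation of a provider
# with other consumers' recommendations. Weights and ratings are invented.

def trust_value(self_obs, recommendations, alpha=0.6):
    """alpha weights direct experience; the rest averages recommendations."""
    rec_avg = sum(recommendations) / len(recommendations)
    return alpha * self_obs + (1 - alpha) * rec_avg

honest = trust_value(0.9, [0.85, 0.9, 0.88])
# Bad-mouthing attack: malicious consumers submit dishonestly low ratings,
# biasing the computed trust value downward and skewing SP selection.
attacked = trust_value(0.9, [0.85, 0.1, 0.1])
print(round(honest, 3), round(attacked, 3))  # 0.891 0.68
```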

1 citation


Proceedings ArticleDOI
27 Jan 2023
TL;DR: In this paper , a runtime-based Semi-FaaS execution model is proposed, which dynamically extracts time-consuming code snippets from applications and offloads them to FaaS platforms for execution.
Abstract: Function-as-a-service (FaaS), an emerging cloud computing paradigm, is expected to provide strong elasticity due to its promise to auto-scale fine-grained functions rapidly. Although appealing for applications with good parallelism and dynamic workload, this paper shows that it is non-trivial to adapt existing monolithic applications (like web services) to FaaS due to their complexity. To bridge the gap between complicated web services and FaaS, this paper proposes a runtime-based Semi-FaaS execution model, which dynamically extracts time-consuming code snippets (closures) from applications and offloads them to FaaS platforms for execution. It further proposes BeeHive, an offloading framework for Semi-FaaS, which relies on the managed runtime to provide a fallback-based execution model and addresses the performance issues in traditional offloading mechanisms for FaaS. Meanwhile, the runtime system of BeeHive selects offloading candidates in a user-transparent way and supports efficient object sharing, memory management, and failure recovery in a distributed environment. The evaluation using various web applications suggests that the Semi-FaaS execution supported by BeeHive can reach sub-second resource provisioning on commercialized FaaS platforms like AWS Lambda, which is up to two orders of magnitude better than other alternative scaling approaches in cloud computing.
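A user-level Python analogy of the Semi-FaaS idea described above (the real BeeHive system does this transparently inside the managed runtime): profile a function once and, if it proves time-consuming, route later calls to a remote invoker. `invoke_remotely` here is a stub, not an actual FaaS client.

```python
import time
from functools import wraps

# User-level analogy (an assumption-laden sketch) of Semi-FaaS offloading:
# profile a function and, once it proves time-consuming, route later calls
# to a remote invoker. BeeHive does this transparently in the runtime;
# `invoke_remotely` here is a stand-in stub, not a real FaaS client.

def semi_faas(threshold_s=0.5, invoke_remotely=None):
    def decorator(fn):
        state = {"offload": False}

        @wraps(fn)
        def wrapper(*args, **kwargs):
            if state["offload"] and invoke_remotely is not None:
                return invoke_remotely(fn.__name__, args, kwargs)
            start = time.perf_counter()
            result = fn(*args, **kwargs)
            if time.perf_counter() - start > threshold_s:
                state["offload"] = True  # offload this closure from now on
            return result

        return wrapper
    return decorator

calls = []

@semi_faas(threshold_s=0.0,  # zero threshold: everything counts as slow
           invoke_remotely=lambda name, args, kwargs: calls.append(name) or -1)
def heavy(n):
    return sum(i * i for i in range(n))

local = heavy(10_000)   # first call runs locally and is profiled
remote = heavy(10_000)  # later calls are routed to the stub remote invoker
print(local > 0, remote, calls)  # True -1 ['heavy']
```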

1 citation


Journal ArticleDOI
TL;DR: In this paper, the authors present a taxonomy of service adaptation in Web, cloud, and big data environments, and discuss adaptation solutions in emerging service models, such as cloud services and big services.

1 citation


Journal ArticleDOI
TL;DR: Wang et al. as discussed by the authors proposed a service-keyword correlation graph (SKCG) to capture the relationship between services and keywords, and the compatibility among services, and proposed keyword-based deep reinforced Steiner tree search (K-DRSTS) to recommend services for mashup creation.
Abstract: Developers need to reuse web services and create mashups suitable for various scenarios. Currently, finding services and verifying their compatibility relies on the developer having adequate domain knowledge. Although service recommendation systems already exist to assist them, inexperienced developers may not be able to adequately express their requirements, resulting in inappropriate and incompatible recommendations. To tackle this problem, we define a service-keyword correlation graph (SKCG) to capture the relationships between services and keywords, and the compatibility among services. Then, we propose keyword-based deep reinforced Steiner tree search (K-DRSTS) to recommend services for mashup creation. K-DRSTS models the task of service discovery as a Steiner tree search problem over the SKCG. Leveraging deep reinforcement learning, K-DRSTS provides an efficient solution for solving the NP-hard Steiner tree search problem. Extensive experiments on real-world data sets have shown the effectiveness of K-DRSTS.
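The search target of K-DRSTS can be pictured with a naive heuristic: find a small connected subgraph of the service-keyword graph covering all requirement keywords (an approximate Steiner tree). The path-merging below is a stand-in for the paper's deep-RL search, and the graph is invented.

```python
from collections import deque

# Naive sketch of the K-DRSTS search target: a small connected subgraph of
# the service-keyword correlation graph that covers all requirement
# keywords (an approximate Steiner tree). The deep-RL search in the paper
# replaces this path-merging heuristic; the graph is a made-up example.

def bfs_path(graph, src, dst):
    """Shortest (fewest-hop) path from src to dst, or None."""
    prev = {src: None}
    queue = deque([src])
    while queue:
        node = queue.popleft()
        if node == dst:
            path = []
            while node is not None:
                path.append(node)
                node = prev[node]
            return path[::-1]
        for nxt in graph[node]:
            if nxt not in prev:
                prev[nxt] = node
                queue.append(nxt)
    return None

def approx_steiner_nodes(graph, terminals):
    """Union of shortest paths from the first keyword to the others."""
    covered = set()
    for t in terminals[1:]:
        covered.update(bfs_path(graph, terminals[0], t))
    return covered

# Node names encode their type: 'k:' keyword terminal, 's:' service.
graph = {
    "k:map": ["s:gmaps"], "k:route": ["s:gmaps", "s:osrm"],
    "s:gmaps": ["k:map", "k:route"], "s:osrm": ["k:route"],
}
print(sorted(approx_steiner_nodes(graph, ["k:map", "k:route"])))
# ['k:map', 'k:route', 's:gmaps'] -- one service covers both keywords
```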

1 citation


Journal ArticleDOI
TL;DR: In this article, the authors propose an approach to specify APIs' functional and security requirements with the Structured-Object-Oriented Formal Language (SOFL) and provide a generic methodology for designing security-aware APIs by utilizing concepts of domain models, domain primitives, the Ecore metamodel, and SOFL.
Abstract: RESTful web APIs have become ubiquitous, with most modern web applications embracing the micro-service architecture. A RESTful API provides data over the network using HTTP, possibly interacting with databases and other services, and must preserve its security properties. However, REST is not a protocol but rather a set of guidelines on how to design resources accessed over HTTP endpoints. There are guidelines on how related resources should be structured with hierarchical URIs, as well as how the different HTTP verbs should be used to represent well-defined actions on those resources. Whereas security has always been critical in the design of RESTful APIs, there are few or no clear model-driven engineering techniques utilizing a secure-by-design approach that interweaves both the functional and security requirements. We therefore propose an approach to specifying APIs’ functional and security requirements with the practical Structured-Object-oriented Formal Language (SOFL). Our proposed approach provides a generic methodology for designing security-aware APIs by utilizing the concepts of domain models, domain primitives, the Ecore metamodel, and SOFL. We also describe a case study to evaluate the effectiveness of our approach and discuss important issues in relation to the practical applicability of our method.

Journal ArticleDOI
TL;DR: In this article, an experimental evaluation of Dynamic Application Rotational Environment (DARE) and Dare IMproved (DIM) is presented; DIM is an enhanced version of DARE that leverages a host-based firewall to rotate between web servers located on the same host.
Abstract: Web servers are targets for cyberattacks because they contain valuable information, which could facilitate interactions with another system or damage an organization’s reputation. In the last two decades, Moving Target Defense (MTD) research has gained attention as a cyber resilient technique to mitigate cyber threats. However, most MTD work focuses on the network layer, and there is not much work to support the service layer. This research is an experimental evaluation of Dynamic Application Rotational Environment (DARE) and Dare IMproved (DIM). DIM is an enhanced version of DARE that leverages a host-based firewall to rotate between web servers located on the same host. The main contribution of this work is furthering the understanding of implementing a centralized host-based MTD architecture for web servers. Results show that DIM can maintain availability while thwarting attacks, whereas DARE limits the availability of the web server.
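The rotation at the heart of DARE/DIM can be modeled as a scheduler that re-points the exposed endpoint at a different local web server; `apply_rule` below is a hypothetical stub where a real deployment would rewrite host-firewall rules, and the backend names are invented.

```python
import itertools

# Toy model of MTD rotation as in DARE/DIM: a host-based firewall
# periodically re-points the exposed port at a different local web server,
# so an attacker's reconnaissance goes stale. `apply_rule` is a
# hypothetical stub; a real deployment would rewrite firewall rules.

class Rotator:
    def __init__(self, backends, apply_rule):
        self._cycle = itertools.cycle(backends)
        self._apply_rule = apply_rule  # e.g. update the redirect target
        self.active = None

    def rotate(self):
        """Advance to the next backend and push the new firewall rule."""
        self.active = next(self._cycle)
        self._apply_rule(self.active)
        return self.active

applied = []
rotator = Rotator(["apache:8081", "nginx:8082", "lighttpd:8083"],
                  apply_rule=applied.append)
for _ in range(4):
    rotator.rotate()
print(applied)  # wraps around: apache, nginx, lighttpd, apache
```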

Book ChapterDOI
01 Jan 2023
TL;DR: In this paper, the authors identify discrepancies in the sole use of classic QoS calculation to determine the quality of a service within the Arrowhead framework, as a representative of SOA-based system-of-systems architectures.
Abstract: This paper identifies discrepancies in the sole use of classic QoS calculation to determine the quality of a service within the Arrowhead framework, as a representative of SOA-based system-of-systems (SoS) architectures. To substantiate the research, several scenarios from an industrial manufacturing environment are used. New key requirements for a QoS management system in SOA-based SoS architectures are identified, and based on these, a novel concept for an enhanced QoS management system within such architectures is proposed.

Journal ArticleDOI
TL;DR: The ePrivo service as mentioned in this paper combines different state-of-the-art tracking detection and classification methods, including TrackSign, to discover both previously known and zero-day tracking methods.
Abstract: Given the pervasiveness of web tracking practices on the Internet, many countries are developing and enforcing new privacy regulations to ensure the rights of their citizens. However, discovering websites that do not comply with those regulations is becoming very challenging, given the dynamic nature of the web and the use of obfuscation techniques. This work presents ePrivo, a new online service that can help Internet users, website owners, and regulators inspect how privacy-friendly a given website is. The system explores all the content of the website, including traffic from third parties and dynamically modified content. The ePrivo service combines different state-of-the-art tracking detection and classification methods, including TrackSign, to discover both previously known and zero-day tracking methods. After 6 months of service, ePrivo detected the largest browsing-history trackers and more than 40k domains setting cookies with a lifespan longer than one year, which is forbidden in some countries.
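One ePrivo-style check, flagging cookies that outlive one year, can be sketched from the Max-Age attribute of a Set-Cookie header; the real service also handles Expires dates and many other signals, and the header values here are invented.

```python
# Small sketch of one ePrivo-style check: flag cookies whose lifetime
# exceeds one year, based on the Max-Age attribute of a Set-Cookie header.
# The real service also parses Expires dates and many other signals.

ONE_YEAR_S = 365 * 24 * 3600

def long_lived(set_cookie_header):
    """True if the cookie's Max-Age exceeds one year."""
    for part in set_cookie_header.split(";"):
        name, _, value = part.strip().partition("=")
        if name.lower() == "max-age" and value.isdigit():
            return int(value) > ONE_YEAR_S
    return False  # session cookie or no Max-Age attribute

print(long_lived("uid=abc123; Path=/; Max-Age=63072000"))  # two years: True
print(long_lived("sess=xyz; Path=/; HttpOnly"))            # no Max-Age: False
```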

Book ChapterDOI
01 Jan 2023
TL;DR: In this paper, the BI-CSem model is proposed and tested against multiple baseline models using real-world Web service datasets in order to determine proper recommendations, and the precision, accuracy, recall, F-measure, and FDR of the Web service recommendation system are calculated.
Abstract: Web services are products of the era of service-oriented computing and cloud computing. As the number of web services on the Internet grows, selecting and recommending them becomes more important. Consequently, in the realm of service computing, how to recommend the most suitable Web services is now a popular research topic. To determine proper recommendations, the BI-CSem model was proposed and tested against multiple baseline models using real-world Web service datasets in this research. Aside from that, a thesaurus is built using Web service keywords gathered from Web service repositories such as UDDI and WSDI, and from the World Wide Web Cloud. The extracted terms are then subjected to semantic similarity analysis, computed using SemantoSim, concept similarity, and the KL divergence measure, and the terms from the user, such as the query, user clicks, and previous historical data, are pre-processed. The terms from the semantic alignment are then classified using XGBoost, while the Web service dataset is classified using XGBoost and GRU. Semantic similarity is determined using just SemantoSim, based on the intersection of the top 75 percent of words from the two classifiers and features from the intermediate term tree generated using STM. Finally, terms are re-ranked and recommended to the user, and the precision, accuracy, recall, F-measure, and FDR of the Web service recommendation system are calculated; the BI-CSem model is found to have an excellent precision of 94.37% and the lowest FDR of 0.06.

Journal ArticleDOI
TL;DR: Wang et al. as discussed by the authors proposed a multi-source information graph-based Web service recommendation framework (MGASR), which can automatically and efficiently extract multifaceted knowledge from the heterogeneous Web service ecosystem.
Abstract: Web service recommendation remains a highly demanding yet challenging task in the field of services computing. In recent years, researchers have started to employ side information contained in a heterogeneous Web service ecosystem to address the issues of data sparsity and cold start in Web service recommendation. Some recent works have exploited deep learning techniques to learn user/Web service representations accumulating information from multiple sources. However, we argue that they still struggle to utilize multi-source information in a discriminating, unified, and flexible manner. To tackle this problem, this paper presents a novel multi-source information graph-based Web service recommendation framework (MGASR), which can automatically and efficiently extract multifaceted knowledge from the heterogeneous Web service ecosystem. Specifically, different node-type- and edge-type-dependent parameters are designed to model the corresponding types of objects (nodes) and relations (edges) in the Web service ecosystem. We then leverage graph neural networks (GNNs) with an attention mechanism to construct a multi-source information neural network (MIN) layer for mining diverse significant dependencies among nodes. By stacking multiple MIN layers, each node can be characterized by a highly contextualized representation due to capturing high-order multi-source information. As such, MGASR can generate representations with rich semantic information toward supporting Web service recommendation tasks. Extensive experiments conducted over three real-world Web service datasets demonstrate the superior performance of our proposed MGASR as compared to various baseline methods.

Book ChapterDOI
01 Jan 2023
TL;DR: In this paper, the authors explain why security cannot be reliably enforced for single page applications using the security constraints defined in WEB-INF/web.xml, which are tailored to HTTP requests and responses of web pages, not JSON snippets.
Abstract: With SPAs (single page web applications), it's impossible to reliably enforce security using the security constraints defined in WEB-INF/web.xml. The reason for this is that these security constraints have been especially tailored to HTTP requests and responses of web pages, not JSON snippets. Also, although annotations exist to mark REST classes and methods as subject to security checks, REST implementations are not required to actually perform those checks. The JAX-RS specification for RESTful services just does not handle security.
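The point above, enforcing authorization in endpoint code rather than relying on container-level declarative constraints, can be illustrated language-neutrally; this Python decorator is only an analogy for annotations such as JAX-RS `@RolesAllowed`, and every name in it is invented.

```python
from functools import wraps

# Language-neutral illustration (in Python) of enforcing authorization in
# the endpoint code itself, rather than via declarative container config
# like web.xml. An analogy for JAX-RS @RolesAllowed; names are invented.

class Forbidden(Exception):
    pass

def roles_allowed(*roles):
    """Reject the call unless the user holds at least one allowed role."""
    def decorator(fn):
        @wraps(fn)
        def wrapper(user, *args, **kwargs):
            if not set(roles) & set(user.get("roles", [])):
                raise Forbidden(fn.__name__)
            return fn(user, *args, **kwargs)
        return wrapper
    return decorator

@roles_allowed("admin")
def delete_account(user, account_id):
    # a JSON endpoint: the check above runs on every invocation
    return {"deleted": account_id}

print(delete_account({"roles": ["admin"]}, 7))  # {'deleted': 7}
try:
    delete_account({"roles": ["viewer"]}, 7)
except Forbidden as e:
    print("forbidden:", e)
```

Because the check lives with the endpoint, it applies uniformly to JSON responses, unlike URL-pattern constraints aimed at whole pages.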

Posted ContentDOI
15 May 2023
TL;DR: Based on the standardized model encapsulation technology proposed by the OpenGMS group, this paper presents a grid-service method tailored to the specific requirements of open geographic model integration applications; the research work is carried out in three areas, beginning with the basic strategy of grid servitization.
Abstract: Integrated application of geo-analysis models is critical for geo-process research. Due to the continuity of the real world, a geo-analysis model cannot be applied directly over the entire space. To date, the method of regarding space as a sequence of computing units (i.e., a grid) has been widely used in geographic study. However, differences between the models' division algorithms result in distinct grid data structures. First, researchers must install and configure various software packages to generate the structure-specific grid data required by the models; this localized processing is inconvenient and inefficient. Second, in order to integrate models that use grid data with different structures, researchers need to design a specific conversion method for each integration scenario, and because researchers' development habits differ, it is difficult to reuse a conversion method in another runtime environment. The open and cross-platform character of web services enables users to generate data without installing software programs, and has the potential to replace the present time-consuming process of grid generation and conversion, hence increasing efficiency. Based on the standardized model encapsulation technology proposed by the OpenGMS group, this paper presents a grid-service method tailored to the specific requirements of open geographic model integration applications. The research work covers three areas. (1) The basic strategy of grid servitization: the heterogeneity of grid generation methods is a major factor preventing them from being invoked in a unified way by web services, so this study proposes a standardized description method based on the Model Description Language (MDL). (2) A method for constructing a grid data generation service: a unified representation approach for grid data is proposed in order to standardize the description of heterogeneous grid data; an encapsulation method for grid generation algorithms is proposed; and the grid service is realized by merging these with the main idea of grid servitization. (3) A method for constructing a grid data conversion service: a box-type grid indexing approach is provided to facilitate the retrieval of grid cells from large data volumes; two conversion types, topologically similar and topologically inaccessible grid data conversion, are summarized, along with the related conversion procedures; on this foundation, a grid conversion engine is built using the grid-service strategy as a theoretical guide, integrated with the grid conversion strategy. Based on the grid-service approach proposed in this paper, researchers can generate and convert grid data without tedious steps for downloading and installing programs, leaving more time for solving geographic problems and hence increasing efficiency.
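The box-type grid index mentioned above can be sketched as a coarse spatial hash: cells are bucketed by a box key so that a query box only touches overlapping buckets; the box size and cell layout are invented for the example.

```python
# Illustrative sketch of a "box-type" grid index: bucket grid cells by a
# coarse bounding-box key so that cells overlapping a query box can be
# retrieved without scanning the whole dataset. Box size and cell
# coordinates are invented for the example.

class BoxIndex:
    def __init__(self, box_size):
        self.box_size = box_size
        self.buckets = {}

    def _key(self, x, y):
        return (int(x // self.box_size), int(y // self.box_size))

    def insert(self, cell_id, x, y):
        """Register a grid cell by a representative coordinate."""
        self.buckets.setdefault(self._key(x, y), []).append(cell_id)

    def query(self, xmin, ymin, xmax, ymax):
        """Collect cells from every coarse box overlapping the query box."""
        found = []
        kx0, ky0 = self._key(xmin, ymin)
        kx1, ky1 = self._key(xmax, ymax)
        for kx in range(kx0, kx1 + 1):
            for ky in range(ky0, ky1 + 1):
                found.extend(self.buckets.get((kx, ky), []))
        return found

index = BoxIndex(box_size=10.0)
index.insert("cell-a", 3.0, 4.0)
index.insert("cell-b", 12.0, 4.0)
index.insert("cell-c", 55.0, 60.0)
print(index.query(0, 0, 15, 9))  # ['cell-a', 'cell-b']
```

Only the boxes intersecting the query rectangle are visited, which is what makes retrieval cheap for large cell collections.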


Journal ArticleDOI
TL;DR: PAREI, as discussed by the authors, is a Web API recommendation approach that combines both explicit and implicit information to progressively optimize the recommendation results, helping Mashup developers find the Web APIs they need rapidly and accurately.
Abstract: A Mashup is an application that provides specific functions by combining Web APIs offering services or data on the Internet, thus avoiding reinventing the wheel. Recommending suitable Web APIs for Mashup developers from the vast number of Web APIs on the Internet has become a challenging problem. Previous studies often fail to fully exploit and effectively synthesize the various types of information between Web APIs and Mashups. This work proposes a Web API recommendation approach, PAREI, that combines both explicit and implicit information to progressively optimize the recommendation results. First, PAREI uses the explicit structural information between Mashups and Web APIs to construct the Call Relationship Network (CRN). Second, PAREI calculates explicit semantic similarities between the developer’s requirement and Mashups to obtain candidate Mashup nodes in the CRN. Then PAREI further mines the implicit structural information between Mashups. A combined similarity score for each Mashup node is calculated. Finally, PAREI uses the CRN to obtain candidate Web APIs related to the candidate Mashup nodes, and integrates the implicit semantic information of Web APIs with the combined scores of the corresponding Mashups, so as to obtain the Top-K Web APIs. Comparison experiments show that PAREI significantly improves the Recall, Precision, and MAP metrics compared with other approaches. Ablation experiments show that different types of information play various roles in Web API recommendation, and different combination modes have different effects on the recommendation results. This work constructs the PAREI model, which combines explicit and implicit information to obtain Web API recommendation results through a progressive strategy. According to the experimental results, we believe that the PAREI approach can help Mashup developers find the Web APIs they need rapidly and accurately.

Journal ArticleDOI
TL;DR: In this paper , the authors model the problem of collectively handling multiple service composition requests as a new multi-tasking service composition problem and propose a new Permutation-based Multi-factorial Evolutionary Algorithm based on an Estimation of Distribution Algorithm (EDA), named PMFEA-EDA, to effectively and efficiently solve this problem.
Abstract: Web service composition composes existing web services to accommodate users’ requests for required functionalities with the best possible quality of services (QoS). Due to the computational complexity of this problem, evolutionary computation (EC) techniques have been employed to efficiently find composite services with near-optimal functional quality (i.e., quality of semantic matchmaking, QoSM for short) or non-functional quality (i.e., QoS) for each composition request individually. With a rapid increase in composition requests from a growing number of users, solving one composition request at a time can hardly meet the efficiency target anymore. Driven by the idea that the solutions obtained from solving one request can be highly useful for tackling other related requests, multitasking service composition approaches have been proposed to efficiently deal with multiple composition requests concurrently. However, existing attempts have not been effective in learning and sharing knowledge among solutions for multiple requests. In this paper, we model the problem of collectively handling multiple service composition requests as a new multi-tasking service composition problem and propose a new Permutation-based Multi-factorial Evolutionary Algorithm based on an Estimation of Distribution Algorithm (EDA), named PMFEA-EDA, to effectively and efficiently solve this problem. In particular, we introduce a novel method for effective knowledge sharing across different service composition requests. For that, we develop a new sampling mechanism to increase the chance of identifying high-quality service compositions in both the single-tasking and multitasking contexts. Our experiment shows that our proposed approach, PMFEA-EDA, takes much less time than existing approaches that process each service request separately, and also outperforms them in terms of both QoSM and QoS.
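PMFEA-EDA itself is not detailed in the abstract; as a rough illustration of the underlying idea, the sketch below learns a position-wise distribution over services from elite permutations and samples a new candidate permutation from it — a simplified univariate EDA over permutations, not the paper's actual algorithm:

```python
import random

def learn_model(elite_perms):
    """Estimate P(service s at position i) from elite permutations."""
    n = len(elite_perms[0])
    counts = [{} for _ in range(n)]
    for perm in elite_perms:
        for i, s in enumerate(perm):
            counts[i][s] = counts[i].get(s, 0) + 1
    return counts

def sample_permutation(counts, services, rng=random):
    """Sample position by position, renormalizing over unused services."""
    remaining = set(services)
    perm = []
    for pos_counts in counts:
        candidates = sorted(remaining)
        # +1 smoothing keeps every remaining service reachable
        weights = [pos_counts.get(s, 0) + 1 for s in candidates]
        choice = rng.choices(candidates, weights=weights)[0]
        perm.append(choice)
        remaining.remove(choice)
    return perm

elites = [["A", "B", "C", "D"], ["A", "C", "B", "D"], ["B", "A", "C", "D"]]
model = learn_model(elites)
new_perm = sample_permutation(model, ["A", "B", "C", "D"],
                              rng=random.Random(0))
print(new_perm)  # a valid permutation biased toward elite orderings
```

In a multitasking setting such as the paper's, the distributions learned for one composition request could seed or bias the sampling for a related request, which is the kind of knowledge sharing the abstract describes.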

Posted ContentDOI
17 Jan 2023
TL;DR: In this article, the PAM process is conducted: the pesticide registration data provided via a REST API is crawled, the data is mapped into an ontology using rmlmapper, and the result is made available in a machine-readable, application-independent form via a SPARQL endpoint.
Abstract: When applying chemical pesticides, distance requirements have to be observed. Currently, however, farmers have to determine and account for these requirements manually. To support farmers, the Pesticide Application Manager (PAM) projects were conducted and a decision support system was developed. One part of this system, the distance requirements service, was developed at the KTBL. To provide this service, the PAM process is conducted: the pesticide registration data provided via a REST API is crawled, then the data is mapped into an ontology using rmlmapper and made available in a machine-readable and application-independent form via a SPARQL endpoint.
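The abstract does not show the vocabulary exposed by the SPARQL endpoint; a hypothetical sketch of how a client might assemble a query for the distance requirements of a given pesticide (the `pam:` prefix and all property names below are invented for illustration, not the real PAM ontology terms):

```python
def build_distance_query(pesticide_name: str) -> str:
    """Build a SPARQL SELECT query for distance requirements.

    The `pam:` prefix and property names are hypothetical --
    the actual ontology terms are not given in the abstract.
    """
    return f"""
PREFIX pam: <http://example.org/pam#>
SELECT ?requirement ?distance WHERE {{
  ?pesticide pam:name "{pesticide_name}" ;
             pam:hasDistanceRequirement ?requirement .
  ?requirement pam:distanceInMeters ?distance .
}}
""".strip()

query = build_distance_query("ExampleProduct")
print(query)
```

The resulting string would be POSTed to the SPARQL endpoint; the application-independent form means any client able to speak SPARQL can consume the registration data.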

Journal ArticleDOI
TL;DR: In this paper, a hybrid Firefly and Bee Colony Optimization Algorithm is introduced for the NP-hard QoS-aware service composition problem in a multi-cloud environment.
Abstract: Web services have succeeded as an architecture for distribution, service consumption, and discovery. The components of SOA are widely used in e-business, process control, multimedia services, and several other disciplines. QoS attaches non-functional properties to Web services and serves as a key point of differentiation between service providers. In a multi-cloud environment (MCE), atomic web services with the same functionality published by different cloud providers differ in price and quality of service (QoS). Service discovery and composition are the key challenges in web services development: composing services distributed across multiple clouds becomes more expensive and slower as the number of services, providers, and clouds grows. To overcome these challenges, this work presents QoS-aware multi-cloud web service composition, introducing a Hybrid Firefly and Bee Colony Optimization Algorithm for the NP-hard service composition problem in a multi-cloud environment. Keywords: Services Composition, Firefly, Bee Colony Algorithm, QoS Metrics.
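The algorithmic details are not included in the abstract, but the QoS-aware objective such approaches optimize can be illustrated: for a sequential composition, cost and response time typically aggregate by summation and availability by product, and the aggregates are combined into a weighted utility. A minimal sketch (the weights, normalization bounds, and service values below are assumptions for illustration):

```python
def composition_qos(services):
    """Aggregate QoS of a sequential composition.

    Each service is a dict with cost, response_time, availability.
    """
    availability = 1.0
    for s in services:
        availability *= s["availability"]
    return {
        "cost": sum(s["cost"] for s in services),
        "response_time": sum(s["response_time"] for s in services),
        "availability": availability,
    }

def utility(qos, w_cost=0.4, w_time=0.3, w_avail=0.3,
            max_cost=10.0, max_time=5.0):
    """Weighted utility: lower cost/time and higher availability are better."""
    return (w_cost * (1 - qos["cost"] / max_cost)
            + w_time * (1 - qos["response_time"] / max_time)
            + w_avail * qos["availability"])

plan = [
    {"cost": 2.0, "response_time": 0.5, "availability": 0.99},
    {"cost": 3.0, "response_time": 1.0, "availability": 0.95},
]
agg = composition_qos(plan)
print(agg, round(utility(agg), 3))
```

A metaheuristic such as the hybrid Firefly/Bee Colony algorithm would search the space of candidate service selections for the plan maximizing this kind of utility.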

Proceedings ArticleDOI
01 Mar 2023
TL;DR: In this article, the authors examine the quality-of-service parameters used in current, widely adopted middleware works for heterogeneous devices and advise using these parameters to create effective and secure middleware solutions for dependable distributed systems.
Abstract: The use of distributed computing to create middleware-based applications is growing quickly in industry, and quality parameters are critical in these applications. Heterogeneous devices are devices with different processing, storage, and output viewing capabilities; response time and security are two important challenges in middleware solutions for such devices. Web services hold device resources to supply inputs to the services being executed in a distributed context. This research examines a few key quality-of-service factors for middleware targeting heterogeneous devices with constrained capabilities, surveying the quality-of-service parameters used in current, widely adopted middleware works for such devices. Response time and security are among the most frequently used quality-of-service criteria, and the survey and analysis show that they deserve attention and further study. Based on this analysis, the paper advises using these characteristics to create effective and secure middleware solutions for dependable distributed systems.

Posted ContentDOI
02 May 2023
TL;DR: In this article, an elastic web application is built that can automatically scale out and scale in, on demand and cost-effectively, by utilizing cloud resources, specifically from Amazon Web Services (AWS).
Abstract: This project aims to build an elastic web application that can automatically scale out and scale in on-demand and cost-effectively by utilizing cloud resources, specifically from Amazon Web Services (AWS). The application is an image classification service exposed as a RESTful web service for clients to access. The infrastructure is divided into two tiers, the Web-Tier and the Application-Tier, with the former providing a user interface for uploading images, while the latter contains core functionality for image classification, business logic, and database manipulation functions. AWS EC2, SQS, and S3 resources are utilized to create this infrastructure, and scaling in and out of resources is determined by the number of incoming images. The project successfully demonstrated the implementation of an image classification application using AWS, which can be used in various industries, such as medical diagnosis, agriculture, retail, security, environmental monitoring, and manufacturing. However, the evaluation of the system based on metrics of response time, boot time, and accuracy highlighted some issues that need to be addressed to improve the application's performance. Overall, the scalability and cost-effectiveness of the infrastructure make it a suitable choice for developing image classification applications.
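The abstract states that scaling in and out is driven by the number of incoming images; a minimal sketch of such a decision rule, where the desired Application-Tier instance count is derived from the SQS queue depth (the per-instance capacity and the min/max bounds are assumptions, not figures from the project):

```python
import math

def desired_instances(queue_length: int, images_per_instance: int = 10,
                      min_instances: int = 1, max_instances: int = 20) -> int:
    """Scale the App-Tier with the request backlog.

    One instance is assumed to work through `images_per_instance`
    queued images at a time; the bounds cap cost and keep the
    service available even with an empty queue.
    """
    wanted = math.ceil(queue_length / images_per_instance)
    return max(min_instances, min(max_instances, wanted))

for backlog in (0, 7, 55, 500):
    print(backlog, "->", desired_instances(backlog))
# 0 -> 1, 7 -> 1, 55 -> 6, 500 -> 20
```

In an AWS deployment, the backlog would come from the SQS queue's approximate message count, and the controller would launch or terminate EC2 instances to match the computed target.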

Book ChapterDOI
01 Jan 2023
TL;DR: In this article, a case study is presented that integrates blockchain (BC) and Web 3.0 with SOA architecture in healthcare ecosystems; the tutorial aims to give readers valuable insights into SOA integration into web-based application frameworks.
Abstract: In modern web applications, service-oriented architecture (SOA) allows enterprises to use reusable functional components and form interoperable services. From an applicative viewpoint, the services use standard interfaces and communication protocols, where the associated service and operations are decoupled via microservices. This removes the redundancy in task development and provides interoperability with back-end legacy frameworks. With the advent of Web 3.0, the requirement is even more critical as services communicate over open wireless channels, and an adversary may gain access to confidential information through associated application programming interface (API) points. Until now, limited research has been carried out to understand the critical visions of SOA architecture and its associated enablers. Thus, motivated by the research gap, we discuss the SOA vision, its key components, and enabling technologies in this article. We present a discussion of SOA with the web and the critical communication protocols to support the case. Next, we discuss the security viewpoint of SOA and address the critical security principles. Research challenges are suggested, and a case study is presented that integrates blockchain (BC) and Web 3.0 with SOA architecture in healthcare ecosystems. The tutorial aims to let the readers gain valuable insights into SOA integration into web-based applicative frameworks.

Book ChapterDOI
TL;DR: In this paper, the authors give an overview of how quantum algorithms can be converted into web services, deployed using the Amazon Braket quantum computing platform, and invoked through classical web service endpoints.
Abstract: Quantum computing has gained attention from the scientific community and industry, resulting in the development of increasingly powerful quantum computers and supporting technology. Major computer companies have created functional quantum computers, programming languages, and simulators that can be used by developers. This infrastructure is available through the cloud, similar to Infrastructure as a Service. However, utilizing these computers requires a deep understanding of quantum programming and hardware, which is different from traditional cloud computing. To enable a coexistence between quantum and classical computing, we believe that a transition period is necessary. One solution for coexistence is through web services. This tutorial will provide an overview of how quantum algorithms can be converted into web services, and how they can be deployed using the Amazon Braket platform for quantum computing and invoked through classical web service endpoints. Finally, we will propose a process for creating and deploying quantum services using an extension of OpenAPI and GitHub Actions. This extension allows developers to use the same methodology that they are used to for classical services.
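The proposed OpenAPI extension is not reproduced in the abstract; a hypothetical fragment illustrating the general idea of annotating an endpoint so that a pipeline (e.g., via GitHub Actions) could deploy it as a quantum service — the `x-quantum` vendor extension and its fields are invented here for illustration:

```yaml
openapi: "3.0.3"
info:
  title: Random-bit quantum service
  version: "1.0.0"
paths:
  /random-bit:
    get:
      summary: Sample one bit from a single-qubit superposition circuit
      # Hypothetical vendor extension consumed by a deployment pipeline
      x-quantum:
        provider: aws-braket
        device: simulator
        shots: 100
      responses:
        "200":
          description: Measured bit counts
```

Clients would call `/random-bit` like any classical REST endpoint, while the generated service submits the circuit to the quantum backend behind the scenes, which is the coexistence model the tutorial argues for.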

Journal ArticleDOI
TL;DR: In this paper, the authors present KVFinder-web, an open-source web-based application for cavity detection and characterization of biomolecular structures, which has two independent components: a RESTful web service and a web graphical portal.
Abstract: Molecular interactions that modulate catalytic processes occur mainly in cavities throughout the molecular surface. Such interactions occur with specific small molecules due to geometric and physicochemical complementarity with the receptor. In this scenario, we present KVFinder-web, an open-source web-based application of the parKVFinder software for cavity detection and characterization of biomolecular structures. KVFinder-web has two independent components: a RESTful web service and a web graphical portal. Our web service, the KVFinder-web service, handles client requests, manages accepted jobs, and performs cavity detection and characterization on accepted jobs. Our graphical web portal, the KVFinder-web portal, provides a simple and straightforward page for cavity analysis, which customizes detection parameters, submits jobs to the web service component, and displays cavities and characterizations. We provide a publicly available KVFinder-web at https://kvfinder-web.cnpem.br, running in a cloud environment as docker containers. Further, this deployment type allows KVFinder-web components to be configured locally and customized according to user demand. Hence, users may run jobs on a locally configured service or on our public KVFinder-web.

Journal ArticleDOI
TL;DR: In this paper, the authors propose a Web service recommendation method that integrates heterogeneous graph attention network representation with FiBiNET (Feature Importance and Bilinear feature Interaction NETwork) score prediction.
Abstract: The rapid growth in the number and diversity of Web services, coupled with the myriad of functionally similar Web services, makes it challenging to find the most suitable Web services for users to accelerate and accomplish Mashup development. Therefore, this paper proposes a Web service recommendation method that integrates heterogeneous graph attention network representation with FiBiNET (Feature Importance and Bilinear feature Interaction NETwork) score prediction. In this method, firstly, a heterogeneous service information network is constructed using composite service information, atomic service information, and their respective attribute information. Secondly, meta-paths are defined according to different semantic information, and service similarity matrices are built using commuting matrices and meta-path-based similarity measurement. A two-layer attention model is designed to calculate node-level and meta-path-level attention for the services and to generate the feature representations of the Web services. Thirdly, for the Web services in the service cluster, their feature representations are combined with multi-dimensional QoS attributes, and FiBiNET is exploited to dynamically learn the importance of features and complex feature interactions and to predict the scores of Web services. Finally, experiments are performed on a real Web service dataset. The experimental results show that the proposed method outperforms nine other methods in terms of accuracy, recall, F1, and AUC, and achieves better classification and recommendation quality.
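The second step above derives service similarities from commuting matrices over meta-paths. A small pure-Python sketch of that idea, using the PathSim measure (similarity from meta-path instance counts) over a toy Mashup-invokes-API graph — the data and the choice of PathSim as the similarity measure are illustrative, since the paper's exact measure is not given in the abstract:

```python
def matmul(a, b):
    """Multiply two matrices given as lists of lists."""
    return [[sum(a[i][k] * b[k][j] for k in range(len(b)))
             for j in range(len(b[0]))] for i in range(len(a))]

def transpose(m):
    return [list(row) for row in zip(*m)]

def pathsim(commuting):
    """PathSim: s(i, j) = 2*M[i][j] / (M[i][i] + M[j][j])."""
    n = len(commuting)
    return [[2 * commuting[i][j] / (commuting[i][i] + commuting[j][j])
             if commuting[i][i] + commuting[j][j] else 0.0
             for j in range(n)] for i in range(n)]

# Rows: APIs, columns: Mashups; entry 1 means the Mashup invokes the API.
invokes = [
    [1, 1, 0],  # API0 used by Mashup0 and Mashup1
    [1, 1, 1],  # API1 used by all three Mashups
    [0, 0, 1],  # API2 used by Mashup2 only
]
# Commuting matrix for the meta-path API -> Mashup -> API
commuting = matmul(invokes, transpose(invokes))
sim = pathsim(commuting)
print([round(v, 2) for v in sim[0]])  # -> [1.0, 0.8, 0.0]
```

Each meta-path yields one such similarity matrix; the paper's two-layer attention model then weighs the node-level and meta-path-level contributions when producing the final service representations.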