
Showing papers on "Web service published in 2003"


Book
01 Jan 2003
TL;DR: Based on their academic and industrial experience with middleware and enterprise application integration systems, Alonso and his co-authors describe the fundamental concepts behind the notion of Web services and present them as the natural evolution of conventional middleware, necessary to meet the challenges of the Web and of B2B application integration.
Abstract: Like many other incipient technologies, Web services are still surrounded by a substantial level of noise. This noise results from the always dangerous combination of wishful thinking on the part of research and industry and of a lack of clear understanding of how Web services came to be. On the one hand, multiple contradictory interpretations are created by the many attempts to realign existing technology and strategies with Web services. On the other hand, the emphasis on what could be done with Web services in the future often makes us lose track of what can be really done with Web services today and in the short term. These factors make it extremely difficult to get a coherent picture of what Web services are, what they contribute, and where they will be applied. Alonso and his co-authors deliberately take a step back. Based on their academic and industrial experience with middleware and enterprise application integration systems, they describe the fundamental concepts behind the notion of Web services and present them as the natural evolution of conventional middleware, necessary to meet the challenges of the Web and of B2B application integration. Rather than providing a reference guide or a "how to write your first Web service" kind of book, they discuss the main objectives of Web services, the challenges that must be faced to achieve them, and the opportunities that this novel technology provides. Established, as well as recently proposed, standards and techniques (e.g., WSDL, UDDI, SOAP, WS-Coordination, WS-Transactions, and BPEL) are then examined in the context of this discussion in order to emphasize their scope, benefits, and shortcomings. Thus, the book is ideally suited both for professionals considering the development of application integration solutions and for researchers and students interested in understanding and contributing to the evolution of enterprise application technologies.

2,082 citations


Book
01 Jan 2003
TL;DR: Enterprise Integration Patterns provides an invaluable catalog of sixty-five patterns, with real-world solutions that demonstrate the formidable power of messaging and help you to design effective messaging solutions for your enterprise.
Abstract: Would you like to use a consistent visual notation for drawing integration solutions? Look inside the front cover. Do you want to harness the power of asynchronous systems without getting caught in the pitfalls? See "Thinking Asynchronously" in the Introduction. Do you want to know which style of application integration is best for your purposes? See Chapter 2, Integration Styles. Do you want to learn techniques for processing messages concurrently? See Chapter 10, Competing Consumers and Message Dispatcher. Do you want to learn how you can track asynchronous messages as they flow across distributed systems? See Chapter 11, Message History and Message Store. Do you want to understand how a system designed using integration patterns can be implemented using Java Web services, .NET message queuing, and a TIBCO-based publish-subscribe architecture? See Chapter 9, Interlude: Composed Messaging. Utilizing years of practical experience, seasoned experts Gregor Hohpe and Bobby Woolf show how asynchronous messaging has proven to be the best strategy for enterprise integration success. However, building and deploying messaging solutions presents a number of problems for developers. Enterprise Integration Patterns provides an invaluable catalog of sixty-five patterns, with real-world solutions that demonstrate the formidable power of messaging and help you to design effective messaging solutions for your enterprise. The authors also include examples covering a variety of different integration technologies, such as JMS, MSMQ, TIBCO ActiveEnterprise, Microsoft BizTalk, SOAP, and XSL. A case study describing a bond trading system illustrates the patterns in practice, and the book offers a look at emerging standards, as well as insights into what the future of enterprise integration might hold. This book provides a consistent vocabulary and visual notation framework to describe large-scale integration solutions across many technologies. It also explores in detail the advantages and limitations of asynchronous messaging architectures. The authors present practical advice on designing code that connects an application to a messaging system, and provide extensive information to help you determine when to send a message, how to route it to the proper destination, and how to monitor the health of a messaging system. If you want to know how to manage, monitor, and maintain a messaging system once it is in use, get this book.
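
To make one of the patterns referenced above concrete, here is a minimal sketch of Competing Consumers using only Python's standard library; the channel, worker names, and messages are invented for illustration and are not taken from the book.

```python
# Minimal sketch of the Competing Consumers pattern: several workers pull messages
# from one channel so that messages are processed concurrently.
import queue, threading

channel = queue.Queue()

def consumer(name):
    while True:
        message = channel.get()
        if message is None:          # poison pill: shut this consumer down
            channel.task_done()
            return
        print(f"{name} processed {message!r}")
        channel.task_done()

workers = [threading.Thread(target=consumer, args=(f"worker-{i}",)) for i in range(3)]
for w in workers:
    w.start()

for order_id in range(10):           # producer side: publish messages to the channel
    channel.put(f"order-{order_id}")
for _ in workers:                     # one poison pill per consumer
    channel.put(None)

channel.join()
for w in workers:
    w.join()
```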

1,374 citations


Journal ArticleDOI
TL;DR: Combining Web services to create higher level, cross-organizational business processes requires standards to model the interactions.
Abstract: Combining Web services to create higher level, cross-organizational business processes requires standards to model the interactions. Several standards are working their way through industry channels and into vendor products.

1,291 citations


Proceedings ArticleDOI
20 May 2003
TL;DR: This paper proposes a global planning approach to optimally select component services during the execution of a composite service, and experimental results show that this global planning approach outperforms approaches in which the component services are selected individually for each task in a composite service.
Abstract: The process-driven composition of Web services is emerging as a promising approach to integrate business applications within and across organizational boundaries. In this approach, individual Web services are federated into composite Web services whose business logic is expressed as a process model. The tasks of this process model are essentially invocations to functionalities offered by the underlying component services. Usually, several component services are able to execute a given task, although with different levels of pricing and quality. In this paper, we advocate that the selection of component services should be carried out during the execution of a composite service, rather than at design-time. In addition, this selection should consider multiple criteria (e.g., price, duration, reliability), and it should take into account global constraints and preferences set by the user (e.g., budget constraints). Accordingly, the paper proposes a global planning approach to optimally select component services during the execution of a composite service. Service selection is formulated as an optimization problem which can be solved using efficient linear programming methods. Experimental results show that this global planning approach outperforms approaches in which the component services are selected individually for each task in a composite service.
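
The paper formulates selection as a linear program; the sketch below is only a brute-force illustration of why global selection can beat per-task local selection, with invented candidate services, prices, reliabilities, and budget.

```python
# Illustrative sketch (not the paper's algorithm): global vs. local selection of
# candidate services for a 3-task composite service under a total budget.
from itertools import product

# Hypothetical candidates per task: (name, price, reliability)
candidates = {
    "flight":  [("F1", 40, 0.90), ("F2", 60, 0.99)],
    "hotel":   [("H1", 30, 0.85), ("H2", 50, 0.97)],
    "payment": [("P1", 10, 0.95), ("P2", 25, 0.999)],
}
BUDGET = 110

def score(plan):
    """Aggregate quality: product of reliabilities (higher is better)."""
    r = 1.0
    for _, _, rel in plan:
        r *= rel
    return r

def global_selection():
    """Enumerate all assignments and keep the best one that meets the budget."""
    best = None
    for plan in product(*candidates.values()):
        if sum(p for _, p, _ in plan) <= BUDGET:
            if best is None or score(plan) > score(best):
                best = plan
    return best

def local_selection():
    """Pick the best candidate per task in isolation (may violate the budget)."""
    return tuple(max(c, key=lambda s: s[2]) for c in candidates.values())

print("global:", global_selection())   # respects the end-to-end budget
print("local :", local_selection())    # ignores the global constraint
```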

1,229 citations


Journal ArticleDOI
TL;DR: A new Web services discovery model is proposed in which the functional and non-functional requirements are taken into account for the service discovery and should give Web services consumers some confidence about the quality of service of the discovered Web services.
Abstract: Web services technology has generated a lot of interest, but its adoption rate has been slow. This paper discusses issues related to this slow take-up and argues that quality of services is one of the contributing factors. The paper proposes a new Web services discovery model in which the functional and non-functional requirements (i.e. quality of services) are taken into account for the service discovery. The proposed model should give Web services consumers some confidence about the quality of service of the discovered Web services.
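
As a rough illustration of such a discovery model (not the paper's actual model), the sketch below filters an invented registry on both a functional category and minimum QoS values.

```python
# Minimal sketch of the idea: discovery that filters on both functional category
# and advertised QoS. The service entries and QoS attributes are invented.
services = [
    {"name": "QuoteA", "category": "stock-quote", "availability": 0.999, "latency_ms": 120},
    {"name": "QuoteB", "category": "stock-quote", "availability": 0.950, "latency_ms": 40},
    {"name": "FxC",    "category": "fx-rate",     "availability": 0.990, "latency_ms": 80},
]

def discover(category, min_availability=0.0, max_latency_ms=float("inf")):
    """Return services that match the functional request and the QoS constraints."""
    return [s for s in services
            if s["category"] == category
            and s["availability"] >= min_availability
            and s["latency_ms"] <= max_latency_ms]

print(discover("stock-quote", min_availability=0.99))   # only QuoteA satisfies both constraints
```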

1,081 citations


Journal ArticleDOI
Alexander Keller, Heiko Ludwig
TL;DR: A novel framework for specifying and monitoring Service Level Agreements (SLA) for Web Services, designed for a Web Services environment, that is applicable as well to any inter-domain management scenario, such as business process and service management, or the management of networks, systems and applications in general.
Abstract: We describe a novel framework for specifying and monitoring Service Level Agreements (SLA) for Web Services. SLA monitoring and enforcement become increasingly important in a Web Service environment where enterprise applications and services rely on services that may be subscribed dynamically and on-demand. For economic and practical reasons, we want an automated provisioning process for both the service itself as well as the SLA management system that measures and monitors the QoS parameters, checks the agreed-upon service levels, and reports violations to the authorized parties involved in the SLA management process. Our approach to these issues is presented in this paper. The Web Service Level Agreement (WSLA) framework is targeted at defining and monitoring SLAs for Web Services. Although WSLA has been designed for a Web Services environment, it is applicable as well to any inter-domain management scenario, such as business process and service management, or the management of networks, systems and applications in general. The WSLA framework consists of a flexible and extensible language based on XML Schema and a runtime architecture comprising several SLA monitoring services, which may be outsourced to third parties to ensure a maximum of objectivity. WSLA enables service customers and providers to unambiguously define a wide variety of SLAs, specify the SLA parameters and the way they are measured, and relate them to managed resource instrumentations. Upon receipt of an SLA specification, the WSLA monitoring services are automatically configured to enforce the SLA. An implementation of the WSLA framework, termed SLA Compliance Monitor, is publicly available as part of the IBM Web Services Toolkit.
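
The real WSLA framework is defined by an XML Schema based language; the following Python sketch only illustrates the monitoring idea of comparing measured QoS parameters against agreed thresholds and reporting violations, with invented parameter names and values.

```python
# Hedged sketch of SLA monitoring in the spirit of WSLA; the objectives,
# measurements, and names below are invented.
from dataclasses import dataclass

@dataclass
class ServiceLevelObjective:
    parameter: str      # e.g. "availability" or "avg_response_ms"
    operator: str       # "<=" or ">="
    threshold: float

def violated(slo, measured):
    """True if the measured value breaks the agreed service level."""
    if slo.operator == "<=":
        return not (measured <= slo.threshold)
    if slo.operator == ">=":
        return not (measured >= slo.threshold)
    raise ValueError(f"unknown operator {slo.operator!r}")

def check_sla(slos, measurements, notify):
    """Compare each measurement against its objective and report violations."""
    for slo in slos:
        value = measurements.get(slo.parameter)
        if value is not None and violated(slo, value):
            notify(f"SLA violation: {slo.parameter}={value} "
                   f"(agreed {slo.operator} {slo.threshold})")

sla = [ServiceLevelObjective("availability", ">=", 0.995),
       ServiceLevelObjective("avg_response_ms", "<=", 500)]
check_sla(sla, {"availability": 0.97, "avg_response_ms": 320}, notify=print)
```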

1,036 citations


Proceedings ArticleDOI
20 May 2003
TL;DR: This paper investigates how Semantic and Web Services technologies can be used to support service advertisement and discovery in e-commerce with the design and implementation of a service matchmaking prototype which uses a DAML-S based ontology and a Description Logic reasoner to compare ontology based service descriptions.
Abstract: An important objective of the Semantic Web is to make Electronic Commerce interactions more flexible and automated. To achieve this, standardization of ontologies, message content and message protocols will be necessary. In this paper we investigate how Semantic and Web Services technologies can be used to support service advertisement and discovery in e-commerce. In particular, we describe the design and implementation of a service matchmaking prototype which uses a DAML-S based ontology and a Description Logic reasoner to compare ontology based service descriptions. We also present the results of initial experiments testing the performance of this prototype implementation in a realistic agent based e-commerce scenario.
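
A common way to present this kind of matchmaking is in terms of degrees of match; the sketch below illustrates that idea over a toy concept hierarchy instead of a DAML-S ontology and a Description Logic reasoner, so the concepts and ranking are assumptions, not the paper's implementation.

```python
# Illustrative matchmaking sketch: degrees of match (exact / plug-in / subsumes / fail)
# computed over a toy concept hierarchy instead of a Description Logic reasoner.
subclass_of = {            # child -> parent (invented mini-ontology)
    "SportsCar": "Car",
    "Car": "Vehicle",
    "Bicycle": "Vehicle",
}

def ancestors(concept):
    seen = []
    while concept in subclass_of:
        concept = subclass_of[concept]
        seen.append(concept)
    return seen

def degree_of_match(advertised, requested):
    """Rank how well an advertised output concept satisfies a requested one."""
    if advertised == requested:
        return "exact"
    if requested in ancestors(advertised):
        return "plug-in"    # advertised is more specific than requested
    if advertised in ancestors(requested):
        return "subsumes"   # advertised is more general than requested
    return "fail"

print(degree_of_match("SportsCar", "Vehicle"))  # plug-in
print(degree_of_match("Vehicle", "SportsCar"))  # subsumes
print(degree_of_match("Bicycle", "SportsCar"))  # fail
```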

833 citations


Proceedings ArticleDOI
20 May 2003
TL;DR: An application called 'Semantic Search' is presented which is built on these supporting technologies and is designed to improve traditional web searching; an overview of TAP, the application framework upon which Semantic Search is built, is also provided.
Abstract: Activities such as Web Services and the Semantic Web are working to create a web of distributed machine understandable data. In this paper we present an application called 'Semantic Search' which is built on these supporting technologies and is designed to improve traditional web searching. We provide an overview of TAP, the application framework upon which the Semantic Search is built. We describe two implemented Semantic Search systems which, based on the denotation of the search query, augment traditional search results with relevant data aggregated from distributed sources. We also discuss some general issues related to searching and the Semantic Web and outline how an understanding of the semantics of the search terms can be used to provide better results.
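
As a very rough illustration of the augmentation idea (not TAP itself), the sketch below maps a query string to a node in a tiny invented graph and attaches that node's facts to ordinary keyword results.

```python
# Illustrative sketch: map the query to its denotation in a semantic graph and
# attach structured facts to the keyword results. Graph and results are invented.
graph = {
    "Yo-Yo Ma": {"type": "Cellist", "born": "1955", "instrument": "Cello"},
    "Paris":    {"type": "City", "country": "France"},
}

def keyword_search(query):
    return [f"Top web hit {i} for '{query}'" for i in (1, 2)]   # stand-in for a search engine

def semantic_search(query):
    results = {"hits": keyword_search(query)}
    denotation = graph.get(query)            # naive denotation lookup by exact name
    if denotation:
        results["about"] = denotation        # aggregated structured data
    return results

print(semantic_search("Yo-Yo Ma"))
```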

817 citations


Patent
Ray Y Lai
18 Aug 2003
TL;DR: In this paper, the authors present a system and method for designing and implementing Web Services according to a structured methodology and design patterns, which may be used in creating end-to-end solutions based on past experience and best practices.
Abstract: System and method for designing and implementing Web Services according to a structured methodology and design patterns. Embodiments may incorporate a structured methodology, best practices and design patterns that address reliability, availability and scalability of Web Services architecture. Embodiments may provide mechanisms for integrating heterogeneous technology components into Web Services. Embodiments may provide a vendor-independent Web Services architecture framework and reusable Web Services design patterns, which may be used in creating end-to-end solutions based on past experience and best practices. Embodiments may include design patterns and best practices for delivering Web Services solutions with Quality of Services. One embodiment may provide a Business-to-Business Integration (B2Bi) integration framework for Web Services. Embodiments may provide a Web Security framework and design patterns for designing end-to-end Web Services security.

793 citations


Proceedings Article
01 Jan 2003
TL;DR: A Petri net-based algebra for modeling control flows is proposed as a necessary constituent of a reliable Web service composition process; the algebra is expressive enough to capture the semantics of complex Web service combinations.
Abstract: The Internet is going through several major changes. It has become a vehicle of Web services rather than just a repository of information. Many organizations are putting their core business competencies on the Internet as a collection of Web services. An important challenge is to integrate them to create new value-added Web services in ways that could never be foreseen forming what is known as Business-to-Business (B2B) services. Therefore, there is a need for modeling techniques and tools for reliable Web service composition. In this paper, we propose a Petri net-based algebra, used to model control flows, as a necessary constituent of reliable Web service composition process. This algebra is expressive enough to capture the semantics of complex Web service combinations.
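
The following is a minimal sketch, not the paper's algebra: each service is modeled as a net with one input and one output place, and a sequence operator fuses the producer's output place with the consumer's input place; all names are invented.

```python
# Minimal sketch of the idea behind a Petri net algebra for composition.
class ServiceNet:
    def __init__(self, name):
        self.input, self.output = f"{name}.in", f"{name}.out"
        # transitions: (label, consumed place, produced place)
        self.transitions = [(name, self.input, self.output)]

    def fire(self, marking):
        """Fire every transition enabled in the current marking once (simplified token game)."""
        enabled = [t for t in self.transitions if marking.get(t[1], 0) > 0]
        for label, pre, post in enabled:
            marking[pre] -= 1
            marking[post] = marking.get(post, 0) + 1
        return marking

def sequence(a, b):
    """Compose a ; b by identifying a's output place with b's input place."""
    composed = ServiceNet.__new__(ServiceNet)
    composed.input, composed.output = a.input, b.output
    rename = lambda p: a.output if p == b.input else p
    composed.transitions = a.transitions + [(l, rename(pre), rename(post))
                                            for l, pre, post in b.transitions]
    return composed

net = sequence(ServiceNet("reserveFlight"), ServiceNet("bookHotel"))
marking = {net.input: 1}
marking = net.fire(marking)   # fires reserveFlight, token moves to the shared place
marking = net.fire(marking)   # fires bookHotel, token reaches the composite output
print(marking)
```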

705 citations


Journal ArticleDOI
01 Nov 2003
TL;DR: This paper proposes an ontology-based framework for the automatic composition of Web services with formal safeguards for meaningful composition through the use of composability rules, and provides an implementation using an E-government application offering customized services to indigent citizens.
Abstract: Service composition is gaining momentum as the potential silver bullet for the envisioned Semantic Web. It purports to take the Web to unexplored efficiencies and provide a flexible approach for promoting all types of activities in tomorrow’s Web. Applications expected to heavily take advantage of Web service composition include B2B E-commerce and E-government. To date, enabling composite services has largely been an ad hoc, time-consuming, and error-prone process involving repetitive low-level programming. In this paper, we propose an ontology-based framework for the automatic composition of Web services. We present a technique to generate composite services from high-level declarative descriptions. We define formal safeguards for meaningful composition through the use of composability rules. These rules compare the syntactic and semantic features of Web services to determine whether two services are composable. We provide an implementation using an E-government application offering customized services to indigent citizens. Finally, we present an exhaustive performance experiment to assess the scalability of our approach.
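
To illustrate the flavor of such composability rules (the rules and service descriptions below are invented, not the paper's), a check might compare operation modes, message data types, and ontology concepts before allowing two services to be chained:

```python
# Hedged sketch of composability rules: compare syntactic features (mode, data types)
# and a semantic category before chaining two services.
tax_filing = {
    "operation": "computeBenefit",
    "mode": "in-out",
    "output": {"type": "xsd:decimal", "concept": "MonthlyBenefitAmount"},
}
payment = {
    "operation": "schedulePayment",
    "mode": "in-out",
    "input": {"type": "xsd:decimal", "concept": "MonetaryAmount"},
}

# Toy concept hierarchy standing in for an ontology.
is_a = {"MonthlyBenefitAmount": "MonetaryAmount"}

def semantically_compatible(concept, expected):
    while concept is not None:
        if concept == expected:
            return True
        concept = is_a.get(concept)
    return False

def composable(producer, consumer):
    """Apply simple syntactic and semantic composability rules."""
    syntactic = (producer["mode"] == "in-out"
                 and producer["output"]["type"] == consumer["input"]["type"])
    semantic = semantically_compatible(producer["output"]["concept"],
                                       consumer["input"]["concept"])
    return syntactic and semantic

print(composable(tax_filing, payment))  # True
```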

Journal ArticleDOI
TL;DR: A vision for Semantic Web Services, which combine the growing Web services architecture and the Semantic Web, is introduced and DAML-S is proposed as a prototypical example of an ontology for describing Semantic Web services.


Biplav Srivastava, Jana Koehler
01 Jan 2003
TL;DR: This work compares the business-world and Semantic Web approaches to the problems of modeling, composing, executing, and verifying Web services, discusses what makes Web service composition so special, and derives challenges for the AI planning community.
Abstract: Composition of Web services has received much interest to support business-to-business or enterprise application integration. On the one side, the business world has developed a number of XML-based standards to formalize the specification of Web services, their flow composition and execution. This approach is primarily syntactical: Web service interfaces are like remote procedure call and the interaction protocols are manually written. On the other side, the Semantic Web community focuses on reasoning about web resources by explicitly declaring their preconditions and effects with terms precisely defined in ontologies. For the composition of Web services, they draw on the goal-oriented inferencing from planning. So far, both approaches have been developed rather independently from each other. We compare these approaches and discuss their solutions to the problems of modeling, composing, executing, and verifying Web services. We discuss what makes the Web service composition so special and derive challenges for the AI planning community.

Journal ArticleDOI
TL;DR: This paper considers how the mechanism for composing services in Self-Serv is based on two major concepts: the composite service and the service container.
Abstract: Self-Serv aims to enable the declarative composition of new services from existing ones, the multiattribute dynamic selection of services within a composition, and peer-to-peer orchestration of composite service executions. Self-Serv adopts the principle that every service, whether elementary or composite, should provide a programmatic interface based on SOAP and the Web Service Definition Language. This does not exclude the possibility of integrating legacy applications, such as those written in CORBA, into the service's business logic. To integrate such applications, however, first requires the development of appropriate adapters. The paper considers how the mechanism for composing services in Self-Serv is based on two major concepts: the composite service and the service container.

Journal ArticleDOI
TL;DR: The software as a service model composes services dynamically, as needed, by binding several lower-level services-thus overcoming many limitations that constrain traditional software use, deployment, and evolution.
Abstract: The software as a service model composes services dynamically, as needed, by binding several lower-level services-thus overcoming many limitations that constrain traditional software use, deployment, and evolution.

Proceedings ArticleDOI
24 Aug 2003
TL;DR: The experimental results show that the proposed technique outperforms existing techniques substantially, and is able to mine both contiguous and non-contiguous data records.
Abstract: A large amount of information on the Web is contained in regularly structured objects, which we call data records. Such data records are important because they often present the essential information of their host pages, e.g., lists of products or services. It is useful to mine such data records in order to extract information from them to provide value-added services. Existing automatic techniques are not satisfactory because of their poor accuracies. In this paper, we propose a more effective technique to perform the task. The technique is based on two observations about data records on the Web and a string matching algorithm. The proposed technique is able to mine both contiguous and non-contiguous data records. Our experimental results show that the proposed technique outperforms existing techniques substantially.
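
As a rough sketch of the underlying idea (not the paper's exact algorithm), adjacent page segments whose tag sequences are highly similar under a string-similarity measure can be grouped into one list of data records; difflib's ratio stands in here for the edit-distance computation, and the segments and threshold are invented.

```python
# Illustrative sketch: group sibling page segments into data records when their tag
# strings are very similar.
from difflib import SequenceMatcher

segments = [                      # tag sequences of sibling regions on a page
    ["tr", "td", "img", "td", "a", "td", "b"],
    ["tr", "td", "img", "td", "a", "td", "b"],
    ["tr", "td", "img", "td", "a", "td"],      # record with an optional field missing
    ["table", "tr", "th", "th"],               # unrelated header region
]

def similarity(a, b):
    """Ratio in [0, 1]; 1.0 means identical tag sequences."""
    return SequenceMatcher(None, a, b).ratio()

def group_records(segs, threshold=0.8):
    groups, current = [], [segs[0]]
    for prev, seg in zip(segs, segs[1:]):
        if similarity(prev, seg) >= threshold:
            current.append(seg)
        else:
            groups.append(current)
            current = [seg]
    groups.append(current)
    return groups

for g in group_records(segments):
    print(len(g), "segment(s):", g[0])
```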

Proceedings ArticleDOI
22 Jun 2003
TL;DR: This work describes new approaches developed to support the Globus Toolkit version 3 (GT3) implementation of the Open Grid Services Architecture, an initiative that is recasting Grid concepts within a service-oriented framework based on Web services.
Abstract: Grid computing is concerned with the sharing and coordinated use of diverse resources in distributed "virtual organizations." The dynamic and multi-institutional nature of these environments introduces challenging security issues that demand new technical approaches. In particular, one must deal with diverse local mechanisms, support dynamic creation of services, and enable dynamic creation of trust domains. We describe how these issues are addressed in two generations of the Globus Toolkit®. First, we review the Globus Toolkit version 2 (GT2) approach; then we describe new approaches developed to support the Globus Toolkit version 3 (GT3) implementation of the Open Grid Services Architecture, an initiative that is recasting Grid concepts within a service-oriented framework based on Web services. GT3's security implementation uses Web services security mechanisms for credential exchange and other purposes, and introduces a tight least-privilege model that avoids the need for any privileged network service.

Posted Content
TL;DR: The authors review the security approach of the Globus Toolkit version 2 (GT2) and describe new approaches developed to support the Globus Toolkit version 3 (GT3) implementation of the Open Grid Services Architecture, an initiative that recasts Grid concepts within a service-oriented framework based on Web services.
Abstract: Grid computing is concerned with the sharing and coordinated use of diverse resources in distributed "virtual organizations." The dynamic and multi-institutional nature of these environments introduces challenging security issues that demand new technical approaches. In particular, one must deal with diverse local mechanisms, support dynamic creation of services, and enable dynamic creation of trust domains. We describe how these issues are addressed in two generations of the Globus Toolkit. First, we review the Globus Toolkit version 2 (GT2) approach; then, we describe new approaches developed to support the Globus Toolkit version 3 (GT3) implementation of the Open Grid Services Architecture, an initiative that is recasting Grid concepts within a service oriented framework based on Web services. GT3's security implementation uses Web services security mechanisms for credential exchange and other purposes, and introduces a tight least-privilege model that avoids the need for any privileged network service.

Journal ArticleDOI
Francisco Curbera, Rania Khalaf, Nirmal K. Mukhi, Stefan Tai, Sanjiva Weerawarana
TL;DR: Explains how three specifications support creating robust service compositions and how to incorporate them.
Abstract: How three specifications support creating robust service compositions.

Journal Article
TL;DR: The METEOR-S Web Service Discovery Infrastructure as mentioned in this paper uses an ontology-based approach to organize registries into domains, enabling domain based classification of all Web services, and each of these registries supports semantic publication and discovery of Web services.
Abstract: Web services are the new paradigm for distributed computing. They have much to offer towards interoperability of applications and integration of large scale distributed systems. To make Web services accessible to users, service providers use Web service registries to publish them. Current infrastructure of registries requires replication of all Web service publications in all Universal Business Registries. Large growth in number of Web services as well as the growth in the number of registries would make this replication impractical. In addition, the current Web service discovery mechanism is inefficient, as it does not support discovery based on the capabilities of the services, leading to a lot of irrelevant matches. Semantic discovery or matching of services is a promising approach to address this challenge. In this paper, we present a scalable, high performance environment for Web service publication and discovery among multiple registries. This work uses an ontology-based approach to organize registries into domains, enabling domain based classification of all Web services. Each of these registries supports semantic publication and discovery of Web services. We believe that the semantic approach suggested in this paper will significantly improve Web service publication and discovery involving a large number of registries. This paper describes the implementation and architecture of the METEOR-S Web Service Discovery Infrastructure, which leverages peer-to-peer computing as a scalable solution.
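
A minimal sketch of the routing idea, with invented domains and registry addresses: the domain ontology determines which registry receives a publication, so nothing needs to be replicated across all registries.

```python
# Minimal sketch of domain-based registry federation. Domain names and registry
# addresses are invented; the real METEOR-S infrastructure is far richer.
domain_parent = {"AirlineBooking": "Travel", "HotelBooking": "Travel", "Travel": None,
                 "Loans": "Finance", "Finance": None}
registry_for_domain = {"Travel": "registry-travel.example.org",
                       "Finance": "registry-finance.example.org"}

def responsible_registry(category):
    """Walk up the domain ontology until a registry is assigned."""
    while category is not None:
        if category in registry_for_domain:
            return registry_for_domain[category]
        category = domain_parent.get(category)
    raise LookupError("no registry covers this domain")

def publish(service_name, category, store):
    store.setdefault(responsible_registry(category), []).append(service_name)

registries = {}
publish("CheapFlightsService", "AirlineBooking", registries)
publish("MortgageQuoteService", "Loans", registries)
print(registries)
```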

01 Jan 2003
TL;DR: A prototype is presented that guides a user in the dynamic composition of web services; this semi-automatic process includes presenting matching services to the user at each step of a composition and filtering the possibilities by using semantic descriptions of the services.
Abstract: As web services become more prevalent, tools will be needed to help users find, filter and integrate these services. Composing existing services to obtain new functionality will prove to be essential for both business-to-business and business-to-consumer applications. We have developed a prototype that guides a user in the dynamic composition of web services. Our semi-automatic process includes presenting matching services to the user at each step of a composition, filtering the possibilities by using semantic descriptions of the services. The generated composition is then directly executable through the WSDL grounding of the services. We tested our system by generating semantic descriptions for some of the common services available on the web such as translator, dictionary and map services. We also applied our approach to a prototype sensor network environment where each sensor provides its data as
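
The sketch below illustrates the filtering step in isolation: at each point, only services whose semantically typed inputs are covered by the data produced so far are suggested. The service descriptions are invented, and the real prototype works over WSDL-grounded semantic descriptions rather than plain sets.

```python
# Hedged sketch of the filtering step in semi-automatic composition.
services = {
    "Geocoder":   {"inputs": {"Address"},            "outputs": {"Coordinates"}},
    "MapDrawer":  {"inputs": {"Coordinates"},        "outputs": {"MapImage"}},
    "Translator": {"inputs": {"Text", "Language"},   "outputs": {"Text"}},
}

def candidate_next_steps(available_types):
    """Services whose required inputs are all covered by the available data types."""
    return [name for name, desc in services.items()
            if desc["inputs"] <= available_types]

available = {"Address"}
step1 = candidate_next_steps(available)          # ['Geocoder']
available |= services[step1[0]]["outputs"]       # user picks Geocoder
print(candidate_next_steps(available))           # ['Geocoder', 'MapDrawer']
```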

Proceedings ArticleDOI
06 Oct 2003
TL;DR: A model-based approach to verifying Web service compositions for Web service implementations is presented; it supports verification against specification models and assigns semantics to the behavior of the implementation model so as to confirm expected results for both the designer and implementer.
Abstract: In this paper, we discuss a model-based approach to verifying Web service compositions for Web service implementations. The approach supports verification against specification models and assigns semantics to the behavior of implementation model so as to confirm expected results for both the designer and implementer. Specifications of the design are modeled in UML (Unified Modeling Language), in the form of message sequence charts (MSC), and mechanically compiled into the finite state process notation (FSP) to concisely describe and reason about the concurrent programs. Implementations are mechanically translated to FSP to allow a trace equivalence verification process to be performed. By providing early design verification, the implementation, testing, and deployment of Web service compositions can be eased through the understanding of the differences, limitations and undesirable traces allowed by the composition. The approach is supported by a suite of cooperating tools for specification, formal modeling and trace animation of the composition workflow.
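
As a toy illustration of the trace-comparison idea (the paper compiles MSC specifications and BPEL implementations into FSP and uses tool support), two hand-written labelled transition systems stand in for the specification and implementation models, and the extra traces of the implementation are reported:

```python
# Illustrative sketch of trace comparison between a specification model and an
# implementation model, both written by hand as tiny labelled transition systems.
def traces(lts, state, depth):
    """All action sequences of length <= depth starting from `state`."""
    result = {()}
    if depth == 0:
        return result
    for action, nxt in lts.get(state, []):
        for t in traces(lts, nxt, depth - 1):
            result.add((action,) + t)
    return result

spec = {0: [("receiveOrder", 1)], 1: [("checkStock", 2)], 2: [("confirm", 3)]}
impl = {0: [("receiveOrder", 1)], 1: [("checkStock", 2)],
        2: [("confirm", 3), ("cancel", 3)]}          # extra behaviour

extra = traces(impl, 0, 3) - traces(spec, 0, 3)
print("traces allowed by the implementation but not the specification:")
for t in sorted(extra):
    print("  ", " -> ".join(t))
```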

Book ChapterDOI
20 Oct 2003
TL;DR: This work has proven the correspondence between the semantics of SHOP2 and the situation calculus semantics of the Process Model, and implemented a system which soundly and completely plans over sets of DAML-S descriptions using a SHOP2 planner, and then executes the resulting plans over the Web.
Abstract: The DAML-S Process Model is designed to support the application of AI planning techniques to the automated composition of Web services. SHOP2 is an Hierarchical Task Network (HTN) planner well-suited for working with the Process Model. We have proven the correspondence between the semantics of SHOP2 and the situation calculus semantics of the Process Model. We have also implemented a system which soundly and completely plans over sets of DAML-S descriptions using a SHOP2 planner, and then executes the resulting plans over the Web. We discuss the challenges and difficulties of using SHOP2 in the information-rich and human-oriented context of Web services.
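
A heavily simplified sketch of Hierarchical Task Network decomposition follows; SHOP2 additionally evaluates preconditions against a world state and chooses among alternative methods, none of which is modeled here, and the task names are invented.

```python
# Hedged sketch of HTN decomposition, the planning style SHOP2 uses (greatly
# simplified: no preconditions or state, one method per compound task).
methods = {   # compound task -> ordered list of subtasks
    "arrange_trip": ["book_flight", "arrange_lodging"],
    "arrange_lodging": ["find_hotel", "book_hotel"],
}
primitive = {"book_flight", "find_hotel", "book_hotel"}   # directly executable services

def decompose(task):
    """Recursively expand compound tasks into a sequence of primitive tasks."""
    if task in primitive:
        return [task]
    plan = []
    for subtask in methods[task]:
        plan.extend(decompose(subtask))
    return plan

print(decompose("arrange_trip"))
# ['book_flight', 'find_hotel', 'book_hotel'] -- each step could then be bound to a
# Web service invocation described in DAML-S
```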

Book ChapterDOI
30 May 2003
TL;DR: This presentation complements an earlier foundational article, “The Anatomy of the Grid,” by describing how Grid mechanisms can implement a service-oriented architecture, explaining how Grid functionality can be incorporated into a Web services framework, and illustrating how the architecture can be applied within commercial computing as a basis for distributed system integration.
Abstract: In both e-business and e-science, we often need to integrate services across distributed, heterogeneous, dynamic “virtual organizations” formed from the disparate resources within a single enterprise and/or from external resource sharing and service provider relationships. This integration can be technically challenging because of the need to achieve various qualities of service when running on top of different native platforms. We present an Open Grid Services Architecture that addresses these challenges. Building on concepts and technologies from the Grid and Web services communities, this architecture defines a uniform exposed service semantics (the Grid service); defines standard mechanisms for creating, naming, and discovering transient Grid service instances; provides location transparency and multiple protocol bindings for service instances; and supports integration with underlying native platform facilities. The Open Grid Services Architecture also defines, in terms of Web Services Description Language (WSDL) interfaces and associated conventions, mechanisms required for creating and composing sophisticated distributed systems, including lifetime management, change management, and notification. Service bindings can support reliable invocation, authentication, authorization, and delegation, if required. Our presentation complements an earlier foundational article, “The Anatomy of the Grid,” by describing how Grid mechanisms can implement a service-oriented architecture, explaining how Grid functionality can be incorporated into a Web services framework, and illustrating how our architecture can be applied within commercial computing as a basis for distributed system integration—within and across organizational domains.

Proceedings Article
01 Jan 2003
TL;DR: This work discusses one such approach that involves adding semantics to WSDL using DAML+OIL ontologies and uses UDDI to store these semantic annotations and search for Web services based on them.
Abstract: With the increasing growth in popularity of Web services, discovery of relevant Web services becomes a significant challenge. One approach is to develop semantic Web services whereby the Web services are annotated based on shared ontologies, and use these annotations for semantics-based discovery of relevant Web services. We discuss one such approach that involves adding semantics to WSDL using DAML+OIL ontologies. Our approach also uses UDDI to store these semantic annotations and search for Web services based on them. We compare our approach with another initiative to add semantics to support Web service discovery, and show how our approach may fit the current standards-based industry approach better.
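
The sketch below illustrates the general annotation idea only: operations tagged with ontology concept URIs can be discovered by concept rather than by name. The URIs, operations, and exact-match policy are invented and are not the paper's mapping to UDDI tModels.

```python
# Minimal sketch: WSDL operations tagged with ontology concept URIs, with discovery
# matching on those concepts. All identifiers below are invented examples.
annotated_services = [
    {"service": "AcmeQuotes",
     "operation": "getPrice",
     "input_concept":  "http://example.org/onto#TickerSymbol",
     "output_concept": "http://example.org/onto#StockQuote"},
    {"service": "WeatherNow",
     "operation": "lookup",
     "input_concept":  "http://example.org/onto#City",
     "output_concept": "http://example.org/onto#Forecast"},
]

def semantic_discovery(wanted_input, wanted_output):
    """Find operations whose annotations match the requested concepts."""
    return [s for s in annotated_services
            if s["input_concept"] == wanted_input
            and s["output_concept"] == wanted_output]

print(semantic_discovery("http://example.org/onto#TickerSymbol",
                         "http://example.org/onto#StockQuote"))
```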

Journal ArticleDOI
01 Nov 2003
TL;DR: A solution within the context of the emerging Semantic Web that includes the use of ontologies to overcome some of the problems of interoperability of heterogeneous Web services is presented.
Abstract: Systems and infrastructures are currently being developed to support Web services. The main idea is to encapsulate an organization's functionality within an appropriate interface and advertise it as Web services. While in some cases Web services may be utilized in an isolated form, it is normal to expect Web services to be integrated as part of workflow processes. The composition of workflow processes that model e-service applications differs from the design of traditional workflows, in terms of the number of tasks (Web services) available to the composition process, in their heterogeneity, and in their autonomy. Therefore, two problems need to be solved: how to efficiently discover Web services—based on functional and operational requirements—and how to facilitate the interoperability of heterogeneous Web services. In this paper, we present a solution within the context of the emerging Semantic Web that includes the use of ontologies to overcome some of these problems. We describe a prototype that has been implemented to illustrate how discovery and interoperability functions are achieved more efficiently.

Proceedings ArticleDOI
20 May 2003
TL;DR: The design of Web application security assessment mechanisms is analyzed in order to identify poor coding practices that render Web applications vulnerable to attacks such as SQL injection and cross-site scripting.
Abstract: As a large and complex application platform, the World Wide Web is capable of delivering a broad range of sophisticated applications. However, many Web applications go through rapid development phases with extremely short turnaround time, making it difficult to eliminate vulnerabilities. Here we analyze the design of Web application security assessment mechanisms in order to identify poor coding practices that render Web applications vulnerable to attacks such as SQL injection and cross-site scripting. We describe the use of a number of software-testing techniques (including dynamic analysis, black-box testing, fault injection, and behavior monitoring), and suggest mechanisms for applying these techniques to Web applications. Real-world situations are used to test a tool we named the Web Application Vulnerability and Error Scanner (WAVES, an open-source project available at http://waves.sourceforge.net) and to compare it with other tools. Our results show that WAVES is a feasible platform for assessing Web application security.
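
A minimal sketch of the black-box fault-injection idea follows; `submit_form` is a placeholder for real HTTP submission, and the payloads and error signatures are illustrative only, not WAVES's actual test suite.

```python
# Hedged sketch of black-box fault injection: submit malformed values for each form
# parameter and look for error signatures in the response.
INJECTION_PATTERNS = ["'", "' OR '1'='1", "<script>alert(1)</script>"]
ERROR_SIGNATURES = ["SQL syntax", "ODBC", "unterminated string", "<script>alert(1)</script>"]

def submit_form(url, fields):
    """Placeholder: a real scanner would POST `fields` to `url` and return the HTML."""
    # Simulated vulnerable behaviour for demonstration only.
    if "'" in fields.get("username", ""):
        return "500: You have an error in your SQL syntax"
    return "200: welcome page"

def assess(url, field_names):
    findings = []
    for field in field_names:
        for payload in INJECTION_PATTERNS:
            response = submit_form(url, {field: payload})
            if any(sig in response for sig in ERROR_SIGNATURES):
                findings.append((field, payload, response[:40]))
    return findings

for field, payload, evidence in assess("http://shop.example/login", ["username", "comment"]):
    print(f"possible vulnerability in '{field}' with payload {payload!r}: {evidence}")
```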

Proceedings ArticleDOI
20 May 2003
TL;DR: A system called DeLa is described, which reconstructs (part of) a "hidden" back-end web database by sending queries through HTML forms, automatically generating regular expression wrappers to extract data objects from the result pages and restoring the retrieved data into an annotated (labelled) table.
Abstract: Many tools have been developed to help users query, extract and integrate data from web pages generated dynamically from databases, i.e., from the Hidden Web. A key prerequisite for such tools is to obtain the schema of the attributes of the retrieved data. In this paper, we describe a system called, DeLa, which reconstructs (part of) a "hidden" back-end web database. It does this by sending queries through HTML forms, automatically generating regular expression wrappers to extract data objects from the result pages and restoring the retrieved data into an annotated (labelled) table. The whole process needs no human involvement and proves to be fast (less than one minute for wrapper induction for each site) and accurate (over 90% correctness for data extraction and around 80% correctness for label assignment).
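
Once the repeated structure of a result page is known, a regular-expression wrapper can pull each data object into a row of a table; the HTML fragment and wrapper below are hand-written for illustration, whereas DeLa induces such wrappers automatically.

```python
# Illustrative sketch of a regular expression wrapper extracting data objects from a
# result page into an annotated table. The fragment and labels are invented.
import re

result_page = """
<tr><td>Database Systems</td><td>Garcia-Molina</td><td>$59.00</td></tr>
<tr><td>Compilers</td><td>Aho</td><td>$72.50</td></tr>
"""

wrapper = re.compile(r"<tr><td>(.*?)</td><td>(.*?)</td><td>\$(.*?)</td></tr>")

rows = wrapper.findall(result_page)
table = [dict(zip(("title", "author", "price_usd"), row)) for row in rows]
for record in table:
    print(record)
```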

Journal Article
TL;DR: eXist, as discussed by the authors, is an Open Source native XML database system that supports keyword search on element and attribute contents; an enhanced indexing scheme at the architecture's core supports quick identification of structural node relationships.
Abstract: With the advent of native and XML enabled database systems, techniques for efficiently storing, indexing and querying large collections of XML documents have become an important research topic. This paper presents the storage, indexing and query processing architecture of eXist, an Open Source native XML database system. eXist is tightly integrated with existing tools and covers most of the native XML database features. An enhanced indexing scheme at the architecture's core supports quick identification of structural node relationships. Based on this scheme, we extend the application of path join algorithms to implement most parts of the XPath query language specification and add support for keyword search on element and attribute contents.
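
The sketch below illustrates the general idea behind such numbering schemes, using a Dewey-style identifier as a stand-in for eXist's level-order numbering: structural relationships can be decided from the identifiers alone, without traversing the document.

```python
# Hedged sketch: hierarchical node identifiers let ancestor/descendant and
# parent/child tests be answered from the IDs alone. A Dewey-style ID is used here
# purely for illustration; eXist's actual scheme is a level-order numbering.
def is_ancestor(a, b):
    """True if node `a` is a proper ancestor of node `b` (IDs like (1, 2, 3))."""
    return len(a) < len(b) and b[:len(a)] == a

def is_parent(a, b):
    return len(b) == len(a) + 1 and b[:len(a)] == a

book      = (1,)
chapter2  = (1, 2)
section21 = (1, 2, 1)

print(is_ancestor(book, section21))   # True, decided without touching the document
print(is_parent(book, section21))     # False
print(is_parent(chapter2, section21)) # True
```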