
Showing papers on "Semantic interoperability" published in 2000


ReportDOI
01 Jan 2000
TL;DR: The SHOE language is presented, which the authors feel has many of the features necessary to enable a semantic web, and an existing set of tools that make it easy to use the language are described.
Abstract: XML will have a profound impact on the way data is exchanged on the Internet. An important feature of this language is the separation of content from presentation, which makes it easier to select and/or reformat the data. However, due to the likelihood of numerous industry and domain specific DTDs, those who wish to integrate information will still be faced with the problem of semantic interoperability. In this paper we discuss why this problem is not solved by XML, and then discuss why the Resource Description Framework is only a partial solution. We then present the SHOE language, which we feel has many of the features necessary to enable a semantic web, and describe an existing set of tools that make it easy to use the language. Jeff Heflin, University of Maryland; James Hendler, University of Maryland.
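
To make the gap concrete: two DTDs can name the same concept differently, and XML alone provides no way to relate them. A minimal Python sketch (tag names and the ontology mapping are invented for illustration, in the spirit of SHOE's shared ontologies):

```python
import xml.etree.ElementTree as ET

# Two hypothetical DTD-conformant documents naming the same concept differently.
doc_a = ET.fromstring("<staff><lecturer>Jeff Heflin</lecturer></staff>")
doc_b = ET.fromstring("<dept><faculty>James Hendler</faculty></dept>")

# Pure XML matching fails: tag names alone carry no shared semantics.
print(doc_a.find("faculty"))  # None

# A shared ontology (mapping invented for illustration) declares that
# both local tags denote the same concept.
ontology = {"lecturer": "cs:Faculty", "faculty": "cs:Faculty"}

def instances_of(doc, concept):
    """Collect the text of every element whose tag maps to `concept`."""
    return [el.text for el in doc.iter() if ontology.get(el.tag) == concept]

print(instances_of(doc_a, "cs:Faculty") + instances_of(doc_b, "cs:Faculty"))
# ['Jeff Heflin', 'James Hendler']
```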

166 citations


Journal ArticleDOI
TL;DR: The authors review selected activities of the Health Level 7 (HL7) Vocabulary Technical Committee related to vocabulary domain specification for HL7 coded data elements, aimed at realizing semantic interoperability in the context of the HL7 Message Development Framework.

58 citations



Journal Article
TL;DR: Secure and Open Mobile Agent (SOMA) as mentioned in this paper is a secure and open mobile agent programming environment that is based on a thorough security model and provides a wide range of mechanisms and tools to build and enforce flexible security policies.
Abstract: Mobile agent technology helps in the development of applications in open, distributed, and heterogeneous environments such as the Internet and the Web, but it has to meet requirements of security and interoperability to achieve wide acceptance. The paper focuses on security and interoperability, and describes a Secure and Open Mobile Agent (SOMA) programming environment where both requirements are main design objectives. On the one hand, SOMA is based on a thorough security model and provides a wide range of mechanisms and tools to build and enforce flexible security policies. On the other hand, the SOMA framework permits interoperation with application components designed in different programming styles. SOMA ensures interoperability through close compliance with the OMG CORBA and MASIF standards. SOMA has already shown the feasibility and effectiveness of the approach for the development of flexible and adaptive applications in several areas, particularly in network and systems management.

37 citations


Book ChapterDOI
01 Jan 2000
TL;DR: The role of domain-specific standards for managing semantic heterogeneity among dissimilar information sources is discussed, whereby standards play a central role for ‘initiating’ top-down processes by means of defining common data models for the involved information sources.
Abstract: For integrating heterogeneous information systems, semantic interoperability is necessary to ensure that exchange of information makes sense — that the provider and requester of information have a common understanding of the ‘meaning’ of the requested services and data. Effective exchange of information between heterogeneous systems must therefore be based on a common understanding of the transferred data. This paper discusses the role of domain-specific standards for managing semantic heterogeneity among dissimilar information sources. The process of integrating such heterogeneous information systems is also discussed in this context, whereby standards play a central role in ‘initiating’ top-down processes by defining common data models for the involved information sources.
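
The top-down role of a standard can be sketched in a few lines of Python: the standard fixes a common data model, and each source declares a mapping into it once (all field names below are invented):

```python
# Hypothetical common data model fixed by a domain-specific standard.
COMMON_FIELDS = ("id", "name", "unit_price")

def to_common(record, mapping):
    """Project a source record onto the standard's common data model."""
    return {std: record[src] for std, src in mapping.items()}

# Each heterogeneous source declares its own mapping once.
source_a = {"ArtikelNr": 7, "Bezeichnung": "valve", "Preis": 12.5}
source_b = {"pid": 7, "label": "valve", "price_eur": 12.5}

map_a = {"id": "ArtikelNr", "name": "Bezeichnung", "unit_price": "Preis"}
map_b = {"id": "pid", "name": "label", "unit_price": "price_eur"}

# Both sources now agree at the common-model level.
assert to_common(source_a, map_a) == to_common(source_b, map_b)
```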

33 citations


Proceedings ArticleDOI
03 Apr 2000
TL;DR: A new approach to the integration of software tools used in an engineering process is shown, the supporting infrastructure is described, and the background model-integrated generation technology is discussed.
Abstract: The integration of software tools used in an engineering process is a problem that arises frequently in large-scale engineering projects. Traditional approaches are insufficient for complex engineering tools and processes. The solution must also account for the evolution of the system, as tools and processes change over time. This paper shows a new approach to the problem, describes the supporting infrastructure, and discusses the background model-integrated generation technology.

30 citations


01 Jan 2000
TL;DR: This paper seeks to express the concepts and constraints of the RDF model in first-order logic, providing RDF with a formalization that allows its full exploitation as a key ingredient of the evolving Semantic Web.
Abstract: The Resource Description Framework (RDF) is intended to be used to capture and express the conceptual structure of information offered in the Web. Interoperability is considered to be an important enabler of future web applications. While XML supports syntactic interoperability, RDF is aimed at semantic interoperability. Interoperability is only given if different users/agents interpret an RDF data model in the same way. Important aspects of the RDF model are, however, expressed in prose which may lead to misunderstandings. To avoid this, capturing the intended semantics of RDF in first-order logic might be a valuable contribution and may provide RDF with a formalization allowing its full exploitation as a key ingredient of the evolving Semantic Web. This paper seeks to express the concepts and constraints of the RDF model in first-order logic.
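
The flavor of such a formalization can be suggested with a few first-order axioms (an illustrative rendering, not the paper's actual axiomatization):

```latex
% RDF statements as atoms of a ternary predicate:
\forall s\,\forall p\,\forall o\;\bigl(\mathit{Triple}(s,p,o)\rightarrow \mathit{Resource}(s)\land\mathit{Property}(p)\bigr)

% rdf:type read as class membership:
\forall r\,\forall c\;\bigl(\mathit{Triple}(r,\mathrm{type},c)\leftrightarrow\mathit{Instance}(r,c)\bigr)

% rdfs:subClassOf is transitive and propagates membership:
\forall c\,\forall d\,\forall e\;\bigl(\mathit{Sub}(c,d)\land\mathit{Sub}(d,e)\rightarrow\mathit{Sub}(c,e)\bigr)
\forall r\,\forall c\,\forall d\;\bigl(\mathit{Instance}(r,c)\land\mathit{Sub}(c,d)\rightarrow\mathit{Instance}(r,d)\bigr)
```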

26 citations


Journal ArticleDOI
Scott Jarvis

23 citations


Proceedings Article
Asaf Adi, David Botzer, Opher Etzion
01 Jan 2000
TL;DR: This research aims to provide the system designer with a tool to define and describe events and their relationships to other events, objects, and tasks by applying the semantic data modeling approach to events.
Abstract: Events are at the core of reactive applications, which have become popular in many domains. Contemporary modeling tools lack the capability to express event semantics and relationships to other entities. This research aims to provide the system designer with a tool to define and describe events and their relationships to other events, objects, and tasks. It follows the semantic data modeling approach and applies it to events, using the classification, aggregation, generalization, and association abstractions in the event world. The model employs conditional generalizations that are specific to the event domain and determine conditions under which an event classified into a lower-level class is considered a member of a higher-level event class for the purpose of reacting to the event. The paper describes the event model, its knowledge representation scheme, and its properties, and demonstrates these properties through a comprehensive example.
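
A conditional generalization of the kind described might look like the following Python sketch (the event classes and guard condition are hypothetical):

```python
from dataclasses import dataclass

@dataclass
class Event:
    name: str
    attrs: dict

# Conditional generalization: an event counts as a member of a
# higher-level class only when its guard condition holds.
GENERALIZATIONS = [
    # (lower class, higher class, condition on the event's attributes)
    ("stock-trade", "significant-trade", lambda a: a.get("volume", 0) > 10_000),
]

def classes_of(event):
    """All event classes the event belongs to, for reaction purposes."""
    result = {event.name}
    for lower, higher, cond in GENERALIZATIONS:
        if event.name == lower and cond(event.attrs):
            result.add(higher)
    return result

print(classes_of(Event("stock-trade", {"volume": 50_000})))
# {'stock-trade', 'significant-trade'}
```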

19 citations


Proceedings ArticleDOI
04 Jul 2000
TL;DR: The design issues of an interoperable CSCW system in a distributed health-care environment are presented through an illustrative study in the area of telecardiology.
Abstract: Computer supported cooperative work (CSCW) provides a fusion of the understanding of organisational processes with communication technologies. Telemedicine involves an integration of networking technologies with health-care processes. Since different modalities of patient care require applications running on heterogeneous computing environments, interoperability is a major issue in telemedicine. Software interoperability provides two distinct classes of benefits: benefits for the users of the system and benefits for the development and maintenance of the system. Software interoperability between different applications can be modeled at different levels of abstraction, such as physical interoperability and semantic interoperability, and various mechanisms exist to resolve the problem at each level. This paper presents the design issues of an interoperable CSCW system in a distributed health-care environment through an illustrative study in the area of telecardiology.

19 citations


Journal ArticleDOI
TL;DR: This research has demonstrated the feasibility of the development of agent-based interoperable telemedicine systems using a popular object-oriented software development methodology - unified modeling language (UML).
Abstract: Telemedicine involves the integration of information, human-machine, and healthcare technologies. Because different modalities of patient care require applications running on heterogeneous computing environments, software interoperability is a major issue in telemedicine. Software agent technology provides a range of promising techniques to solve this problem. This article discusses the development of a methodology for the design of interoperable telemedicine systems (illustrated with a tele-electrocardiography application). Software interoperability between different applications can be modeled at different levels of abstraction, such as physical interoperability, data-type interoperability, specification-level interoperability, and semantic interoperability. Software agents address the issue of software interoperability at the semantic level. A popular object-oriented software development methodology, the Unified Modeling Language (UML), has been used for this development. This research has demonstrated the feasibility of developing agent-based interoperable telemedicine systems. More research is needed before widespread deployment of such systems can take place.

Patent
13 Jul 2000
TL;DR: In this paper, a method is disclosed where distributed repositories achieve semantic interoperability through the exchange of examples and, optionally, classifiers, which can be used to determine whether common labels are referring to the same semantic meaning.
Abstract: Distributed resource discovery is an essential step for information retrieval and/or providing information services. This step is usually used for determining the location of an information or data repository which has relevant information. The most fundamental challenge is the usual lack of semantic interoperability of the requested resource. In accordance with the invention, a method is disclosed where distributed repositories achieve semantic interoperability through the exchange of examples and, optionally, classifiers. The outcome of the inventive method can be used to determine whether common labels are referring to the same semantic meaning.
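
The exchange-of-examples idea reduces to an agreement test: one repository sends examples filed under its label, the other classifies them, and high agreement suggests the labels co-refer. A toy Python sketch (data and threshold invented, not the patent's actual procedure):

```python
def labels_corefer(examples_a, classifier_b, label_b, threshold=0.9):
    """Repository A sends examples filed under its label; repository B
    classifies them.  High agreement suggests the two labels refer to
    the same semantic meaning."""
    agree = sum(classifier_b(x) == label_b for x in examples_a)
    return agree / len(examples_a) >= threshold

# Toy repositories: A labels documents "cardiology", B labels them "heart".
examples_a = ["ecg report", "angiogram", "stent note", "ecg strip"]
classifier_b = lambda doc: ("heart" if any(k in doc for k in
                            ("ecg", "angio", "stent")) else "other")

print(labels_corefer(examples_a, classifier_b, "heart", threshold=0.75))
# True -- the labels plausibly share a meaning
```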

Proceedings ArticleDOI
19 Jun 2000
TL;DR: This paper demonstrates that intermediaries defining service semantics in PSL can automatically integrate multiple suppliers' PCL catalogs for their agent-mediated services.
Abstract: Internet commerce is increasing the demands of service integration by sharing XML-based catalogs. We propose the PCO (Portable Compound Object) data model supporting semantic inheritance to ensure the synonymy of heterogeneous semantics among distributed schemas that different authors define independently. Also, the PCO model makes semantic relationships independent of an initial class hierarchy, and it enables rapid schema evolution across the entire Internet business. This preserves semantic interoperability without changing pre-defined classes. We have also encoded the PCO model into two XML-based languages: the PCO Specification Language (PSL) and the Portable Composite Language (PCL). This paper demonstrates that intermediaries defining service semantics in PSL can automatically integrate multiple suppliers' PCL catalogs for their agent-mediated services.
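
The separation the PCO model aims at, keeping semantic links apart from any one class hierarchy, can be illustrated with a toy Python sketch (class and synonym names invented): synonymy lives in its own table, so either hierarchy can evolve without invalidating it.

```python
# Class hierarchies defined independently by two catalog authors.
hierarchy_a = {"LaserPrinter": "Printer", "Printer": "Device"}
hierarchy_b = {"PrintingMachine": "OfficeEquipment"}

# Semantic (synonymy) links kept separate from either hierarchy, so a
# schema can add or rename classes without breaking the semantic layer.
synonyms = {("LaserPrinter", "PrintingMachine")}

def same_concept(a, b):
    """True if the two class names denote the same concept."""
    return a == b or (a, b) in synonyms or (b, a) in synonyms

print(same_concept("LaserPrinter", "PrintingMachine"))  # True
```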

Proceedings ArticleDOI
01 Sep 2000
TL;DR: This paper focuses on an application’s structural requirements as reflected in values of particular architectural characteristics, describing how they can be incompatible with individual component system properties.
Abstract: Interoperability problems arise when complex software systems are constructed by integrating distinct, and often heterogeneous, components. By performing interoperability analysis on the software architecture design of the system and its components, potential incompatibilities can be anticipated early in the design process. In this paper, we focus on an application’s structural requirements as reflected in values of particular architectural characteristics, describing how they can be incompatible with individual component system properties.

Proceedings Article
01 Jan 2000
TL;DR: A required infrastructure of subject mediators aimed at semantic interoperability of heterogeneous digital library collections is presented, and the diversity of information models that should be uniformly represented at the canonical level is analyzed.
Abstract: This work is oriented toward the creation of integrated virtual digital libraries mediating heterogeneous distributed digital collections of scientific information. A required infrastructure of subject mediators aimed at semantic interoperability of heterogeneous digital library collections is presented. The diversity of information models that should be uniformly represented at the canonical level is analyzed. The canonical model and an approach for homogenizing various information models in the canonical paradigm are introduced. The mediator’s infrastructure is defined as a set of functionally oriented frameworks: a collection registration framework, an information extraction framework, and a personalization framework. Mediator scalability measures are discussed.

Proceedings ArticleDOI
25 Jun 2000
TL;DR: This paper proposes a systematic approach for interoperability verification using a reference ORB and industry-standard test suites, and extends the interoperability testing framework to performance verification, where the scope of the tests is detailed and the design strategy is analyzed.
Abstract: The rapid changes in the telecommunications environment demand that new services are developed and deployed quickly, and that legacy systems are integrated seamlessly. While emerging technologies and standards such as CORBA allow rapid solution integration via shared reusable components, new issues arise for interoperability and performance assurance in highly heterogeneous environments. In this paper, we first identify these issues in a multi-vendor and multi-ORB telecommunications environment. We then propose a systematic approach for interoperability verification using a reference ORB and industry-standard test suites. We extend our interoperability testing framework to performance verification, where the scope of the tests is detailed and the design strategy is analyzed.
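
The proposed verification reduces to a pairwise matrix: each vendor ORB is exercised against the reference ORB on every standard test. A skeletal Python sketch (ORB names and the test runner are placeholders, not a real harness):

```python
import itertools

ORBS = ["reference-orb", "vendor-a-orb", "vendor-b-orb"]
TEST_SUITE = ["echo-struct", "oneway-call", "exception-raise"]

def run_test(client_orb, server_orb, test):
    """Placeholder: invoke the standard test with the given client/server
    pairing and report pass/fail.  A real harness would launch processes."""
    return True  # assumed pass, for the sketch only

# Interoperability matrix: every vendor ORB paired with the reference ORB,
# in both client and server roles.
for client, server in itertools.product(ORBS, repeat=2):
    if "reference-orb" in (client, server) and client != server:
        for test in TEST_SUITE:
            ok = run_test(client, server, test)
            print(f"{client} -> {server}: {test}: {'PASS' if ok else 'FAIL'}")
```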

Journal ArticleDOI
TL;DR: The fundamental clash between flexibility and interoperability in CORBA security, and in distributed object systems security in general, is discussed, and areas where CORBA security has unnecessary flexibility or interoperability limitations are identified.

23 Nov 2000
TL;DR: This paper reviews methods for the formal description of spatial relations, integrates them in a categorical view, and gives examples of image-schematic specifications for large-scale (PATH) and small-scale (CONTAINER, SURFACE) space.
Abstract: The formal specification of spatial objects and spatial relations is at the core of geographic data exchange and interoperability for GIS. It is necessary that the representation of such objects and relations comes close to how people use them in their everyday lives, i.e., that these specifications are built upon elements of human spatial cognition. Image schemata have been suggested as highly abstract and structured mental patterns to capture spatial and similar physical as well as metaphorical relations between objects in the experiential world. We assume that image-schematic details for large-scale (geographic) space are potentially different from image-schematic details for small-scale (table-top) space. This paper reviews methods for the formal description of spatial relations and integrates them in a categorical view. We give examples of image-schematic specifications for large-scale (PATH) and small-scale (CONTAINER, SURFACE) space. Such specifications should provide a foundation for further research on formalizing elements of human spatial cognition for interoperability in GIS.

Proceedings ArticleDOI
04 Jan 2000
TL;DR: A workflow repository is defined that enables extensible communication between WfMS, demonstrating the expressiveness and extensibility of the interoperability approach.
Abstract: Workflow management systems (WfMS) are designed to enable the enactment of business processes, i.e. to extract and maintain the process logic between business applications. Within a workflow management solution, heterogeneous applications are connected through a given workflow definition. Future development efforts must aim to establish interoperability between different WfMS. This leads straight to interoperability problems of two types: from a technical point of view, one WfMS has to be able to abstract from the concrete implementation (programming language, hardware, etc.) of another WfMS; from a semantic point of view, WfMS have to agree on the execution semantics of a workflow definition. We try to cope with the semantic interoperability problem by defining a workflow repository that enables extensible communication between WfMS. To explain the notion and central function of repositories, a short history of metadata systems is given. Based on a modular architecture of WfMS, a repository is developed and sketched in a CORBA-compliant style. This repository shows the expressiveness and the idea of extensibility of our interoperability approach.
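
One way to picture such a repository is as a shared, extensible map from workflow constructs to agreed execution-semantics descriptors; a toy Python sketch (the descriptor vocabulary is invented):

```python
# Toy repository: workflow constructs mapped to agreed execution semantics,
# extensible at run time as new constructs are registered.
repository = {
    "and-split": {"outgoing": "all-branches", "blocking": False},
    "xor-split": {"outgoing": "one-branch", "blocking": False},
}

def register(construct, semantics):
    """Extensibility: a WfMS publishes the semantics of a new construct."""
    repository[construct] = semantics

def semantics_of(construct):
    """Both WfMS consult the shared repository, so they enact the same
    workflow definition under the same execution semantics."""
    return repository[construct]

register("or-join", {"incoming": "first-arrival", "blocking": True})
print(semantics_of("xor-split"))  # {'outgoing': 'one-branch', ...}
```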

ReportDOI
01 Jan 2000
TL;DR: This paper will address the critical role of software architecture in achieving large-scale system interoperability as well as initiatives underway to promote architectural-based interoperability solutions for the Unified Commands.
Abstract: This paper will address the critical role of software architecture in achieving large-scale system interoperability as well as initiatives underway to promote architectural-based interoperability solutions for the Unified Commands. Software architecture is the means to define systems composed of systems. This definition is critical to achieving interoperability. Joint Publication 1-02 defines interoperability as the ability of systems, units or forces to provide services to and accept services from other systems, units or forces and use the services to enable them to operate effectively together [JP 1-02, 1994]. In order to achieve interoperability, compatible systems, doctrine and policy must exist. The technical challenges to interoperability can be daunting -- particularly when a new requirement is established that requires existing (legacy) systems to interoperate. Military forces do not operate as a fully connected graph; modern warfare does not require every system to interoperate. Joint doctrine is the key to determining interoperability requirements. Doctrine tells us how to fight and how we fight determines interoperability requirements. Policy sets the bounds on acceptable doctrine.

Proceedings ArticleDOI
22 May 2000
TL;DR: This paper shows how semantic mismatches are resolved between two well-established and widely used metadata standards; the ANZLIC metadata standard and the DIF (Directory Interchange Format) metadata standard.
Abstract: The World Wide Web (WWW) provides geo-spatial data custodians with an environment in which to advertise their datasets via on-line catalogue systems. The geo-spatial community is diverse and its members are spread around the world. The user's task of searching for and locating datasets of interest, would be greatly simplified if a single point of entry could search for datasets from multiple, independent catalogue systems-an interoperable catalogue system. Advances in middleware for distributed systems now make it straightforward to invoke remote catalogues. However semantic interoperability is still a challenge due to different catalogue systems employing different metadata standards. A metadata standard is a collection of attributes used to describe datasets. This paper employs ontologies as a framework to classify semantic mismatches between metadata standards before resolving them. Using ontologies provides a deeper understanding of the semantic mismatches. In addition, both an architecture for an interoperable catalogue system and the implementation of mediators, the component which resolves semantic mismatches, are described. The implementation uses emerging Web technologies such as XML and XSLT. To demonstrate the appropriateness of the approach, this paper shows how semantic mismatches are resolved between two well-established and widely used metadata standards; the ANZLIC (Australia and New Zealand Land information Council) metadata standard and the DIF (Directory Interchange Format) metadata standard.
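
A mediator of this kind is, at its core, a crosswalk between attribute vocabularies. The paper implements it in XSLT; the same idea in a few lines of Python (the element names below are drastically simplified, not the real ANZLIC or DIF schemas):

```python
import xml.etree.ElementTree as ET

# Simplified crosswalk between two metadata vocabularies (illustrative
# element names only -- the real ANZLIC and DIF schemas are far larger).
ANZLIC_TO_DIF = {"citation_title": "Entry_Title", "abstract": "Summary"}

def mediate(anzlic_xml):
    """Rewrite an ANZLIC-style record into a DIF-style record."""
    src = ET.fromstring(anzlic_xml)
    dst = ET.Element("DIF")
    for el in src:
        mapped = ANZLIC_TO_DIF.get(el.tag)
        if mapped:  # drop attributes with no semantic counterpart
            ET.SubElement(dst, mapped).text = el.text
    return ET.tostring(dst, encoding="unicode")

record = "<anzlic><citation_title>Coastal DEM</citation_title></anzlic>"
print(mediate(record))  # <DIF><Entry_Title>Coastal DEM</Entry_Title></DIF>
```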

Proceedings ArticleDOI
01 Mar 2000
TL;DR: An overview of COIL and a stylized example showing an application of the COIL language and its components are presented.
Abstract: Megaprogramming, module interconnection languages, and mediator-based approaches built on standard distributed object technologies such as CORBA and DCOM have provided promising advances in enterprise-level data integration. These distributed object technologies, however, still require relatively low-level, technology-dependent implementations to achieve object (or module) interconnection, communication, and coordination. The Component Object Interconnection Language (COIL) is a language designed specifically to facilitate rapid and flexible data integration through high-level object interconnections. COIL is an extensible language, with both declarative and imperative aspects. It has been designed to provide broad functionality in declaring and controlling structural and semantic data integration, and includes constructs for specifying updates, constraint enforcement, and update propagations. This paper presents an overview of COIL and a stylized example showing an application of the COIL language and its components.

Book ChapterDOI
01 Jan 2000
TL;DR: Some achievements in the application of intelligent systems in tourism obtained with the project “Intelligent decision support systems in management of complex systems” are illustrated.
Abstract: In the coming years the quantity of information will grow exponentially, creating real labyrinths in which tourists and tourism professionals will have to solve complex problems quickly. In the most far-sighted visions, intelligent systems will have an important role [4, 18]. They will inevitably be used as research tools in tourism, but also as integrators of existing technologies and performers of tasks that until now have only been executed by people. The article illustrates some achievements in the application of intelligent systems in tourism obtained within the project “Intelligent decision support systems in management of complex systems”. The aim of the project is to research and create a methodology for developing the second generation of intelligent systems [9, 10]. This second generation, according to F. Hayes-Roth, should overcome the limitations of today’s intelligent systems by introducing new characteristics: “context reusable building blocks, reusable knowledge, high value component functions, composite architecture for multi-task systems, architecture for semantic interoperability, and a small number of domains, which should be attacked persistently [2]”.

15 Oct 2000
TL;DR: The Object Oriented Data Technology task at the Jet Propulsion Laboratory is working on building a distributed product server as part of a distributed component framework to allow heterogeneous data systems to communicate and share scientific results.
Abstract: Correlation of science results from multi-disciplinary communities is a difficult task. Traditionally data from science missions is archived in proprietary data systems that are not interoperable. The Object Oriented Data Technology (OODT) task at the Jet Propulsion Laboratory is working on building a distributed product server as part of a distributed component framework to allow heterogeneous data systems to communicate and share scientific results.

01 Jul 2000
TL;DR: This five-section report describes fundamental interoperability concepts, develops a terminology for describing interoperability issues and needs, and addresses the vital issue of interoperability security.
Abstract: Interoperability is the ability to use resources from diverse origins as if they had been designed as parts of a single system. Over time, individual interoperability problems tend to disappear as the resources involved literally become part of one system through integration and standardization, but the overall problem of interoperability itself never disappears. Instead, it simply moves up to a new level of complexity that accepts earlier integrations as a given. Interoperability is especially critical for military systems, where international politics can lead to abrupt realignments where yesterday's foe becomes today's coalition partner. This report on interoperability has five sections. The first section is an introduction to the interoperability problem, and the second section describes fundamental interoperability concepts and develops a terminology for describing interoperability issues and needs. The second section also addresses the vital issue of interoperability security. The third section is about the processes by which interoperability technologies are standardized, including comparisons of the interoperability benefits of different processes. The fourth section is an overview of a number of emerging information technologies relevant to interoperability, and the fifth section suggests opportunities for further action.

Journal ArticleDOI
TL;DR: The document then addresses interoperability between encryptors and decryptors that use the key encapsulation method of key recovery, with the Common Key Recovery Block as a cornerstone for interoperability.
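
Key encapsulation follows a simple pattern: the session key protecting the traffic is additionally wrapped under a recovery agent's public key and attached as a recovery block. A hedged Python sketch using the `cryptography` package (the block layout here is invented; the actual Common Key Recovery Block format is defined by the document's specification):

```python
import os
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes

# Recovery agent's keypair (held by the key recovery agent in practice).
agent_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

session_key = os.urandom(32)  # key that actually encrypts the traffic

# Encryptor: wrap the session key under the agent's public key and
# attach the result to the message as a (simplified) recovery block.
oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)
recovery_block = agent_key.public_key().encrypt(session_key, oaep)

# Authorized recovery: the agent unwraps the session key from the block.
assert agent_key.decrypt(recovery_block, oaep) == session_key
```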

Journal Article
TL;DR: Based on client/server, object-oriented, component, and driver technology, an architecture for GIS interoperability is presented, covering key aspects such as communication protocol, specification, requests and answers, API, and driver.
Abstract: There are two ways to achieve geo-information sharing: data transformation and GIS interoperability. Data transformation is an approach that mainly focuses on data integration, but GIS interoperability implies that both data with different formats and data structures and processing resources related to the data can be shared. Mutual access to data and processing resources is the basic characteristic of GIS interoperability, which stresses semantic interoperability. Based on client/server, object-oriented, component, and driver technology, an architecture for GIS interoperability is presented. This architecture has several key aspects: communication protocol, specification, requests and answers, API, and driver.


01 Jan 2000
TL;DR: This paper describes how the syllepsis appeared as a key figure accounting for medical knowledge acquisition in the domain of organ failure and transplantation.
Abstract: The Etablissement francais des Greffes (EfG) is a public health agency in charge of organ, tissue, and cell transplantation in France. Among EfG's missions is the evaluation of organ retrieval and transplantation activities, which relies on a national information system (IS). In order to facilitate data recording, to improve information quality and homogeneity, and to allow data interchange and semantic interoperability with hospital information systems and other registries, a specific effort has been initiated dealing with the ontological foundations of medical terminology in the domain of organ failure and transplantation. The aim of this paper is to describe how the syllepsis appeared to us as a key figure that accounts for medical knowledge acquisition.

Journal ArticleDOI
TL;DR: An interoperability scheme is proposed for a heterogeneous industrial environment, based on a behavioral shell built from a common set of objects and messages; its object models establish a semantic framework in which different control networks and corporate offices may operate transparently as a single system.