Author

Stefan Decker

Other affiliations: Fraunhofer Society, Siemens, University College London
Bio: Stefan Decker is an academic researcher at RWTH Aachen University. His research focuses on the Semantic Web and the Social Semantic Web. He has an h-index of 64 and has co-authored 355 publications, which have received 18,961 citations. His previous affiliations include the Fraunhofer Society and Siemens.


Papers
Journal ArticleDOI
TL;DR: The authors describe how Protégé-2000, a tool for ontology development and knowledge acquisition, can be adapted for editing models in different Semantic Web languages.
Abstract: As researchers continue to create new languages in the hope of developing a Semantic Web, they still lack consensus on a standard. The authors describe how Protégé-2000, a tool for ontology development and knowledge acquisition, can be adapted for editing models in different Semantic Web languages.

1,092 citations

Proceedings ArticleDOI
20 May 2003
TL;DR: It is shown how to interoperate, semantically and inferentially, between the leading Semantic Web approaches to rules and ontologies by analyzing their expressive intersection, and a new intermediate knowledge representation contained within that intersection is defined: Description Logic Programs (DLP), together with the closely related Description Horn Logic (DHL).
Abstract: We show how to interoperate, semantically and inferentially, between the leading Semantic Web approaches to rules (RuleML Logic Programs) and ontologies (OWL/DAML+OIL Description Logic) via analyzing their expressive intersection. To do so, we define a new intermediate knowledge representation (KR) contained within this intersection: Description Logic Programs (DLP), and the closely related Description Horn Logic (DHL) which is an expressive fragment of first-order logic (FOL). DLP provides a significant degree of expressiveness, substantially greater than the RDF-Schema fragment of Description Logic. We show how to perform DLP-fusion: the bidirectional translation of premises and inferences (including typical kinds of queries) from the DLP fragment of DL to LP, and vice versa from the DLP fragment of LP to DL. In particular, this translation enables one to "build rules on top of ontologies": it enables the rule KR to have access to DL ontological definitions for vocabulary primitives (e.g., predicates and individual constants) used by the rules. Conversely, the DLP-fusion technique likewise enables one to "build ontologies on top of rules": it enables ontological definitions to be supplemented by rules, or imported into DL from rules. It also enables available efficient LP inferencing algorithms/implementations to be exploited for reasoning over large-scale DL ontologies.
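
To make the DL-to-LP direction concrete, here is a minimal sketch of the kind of mapping the DLP fragment permits. The paper's full translation covers more constructors; these two clauses are representative, and the notation is ours:

    \begin{align*}
    C \sqsubseteq D            &\;\rightsquigarrow\; D(x) \leftarrow C(x) \\
    \exists R.C \sqsubseteq D  &\;\rightsquigarrow\; D(x) \leftarrow R(x,y) \wedge C(y)
    \end{align*}

Note the asymmetry in the second clause: an existential restriction is translatable when it appears on the subclass side, where it lands in the rule body, but not on the superclass side, where it would require an existential in the rule head that Horn logic does not provide. Asymmetries of this kind are what delimit the DLP fragment.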

939 citations

Proceedings ArticleDOI
07 May 2002
TL;DR: The authors discuss the open-source project Edutella, which builds upon metadata standards defined for the WWW and aims to provide an RDF-based metadata infrastructure for P2P applications on top of the JXTA framework.
Abstract: Metadata for the World Wide Web is important, but metadata for Peer-to-Peer (P2P) networks is absolutely crucial. In this paper we discuss the open source project Edutella, which builds upon metadata standards defined for the WWW and aims to provide an RDF-based metadata infrastructure for P2P applications, building on the recently announced JXTA Framework. We describe the goals and main services this infrastructure will provide, and the architecture to connect Edutella peers based on the exchange of RDF metadata. As the query service is one of the core services of Edutella, upon which other services are built, we specify in detail the Edutella Common Data Model (ECDM) as the basis for the Edutella query exchange language (RDF-QEL-i) and format implementing distributed queries over the Edutella network. Finally, we briefly discuss registration and mediation services, and introduce the prototype and application scenario for our current Edutella-aware peers.
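
As a rough illustration of what RDF metadata for a resource on such a network looks like and how it can be queried, here is a small Python sketch using rdflib. The namespace, resource names, and the use of SPARQL are our assumptions for illustration only; Edutella itself defined its own query exchange language (RDF-QEL-i), which predates SPARQL.

    # Sketch: RDF metadata for a learning resource, queried locally.
    # Hypothetical example; Edutella's actual wire format was RDF-QEL-i.
    from rdflib import Graph, Literal, Namespace
    from rdflib.namespace import DC

    EX = Namespace("http://example.org/resources/")  # made-up namespace

    g = Graph()
    g.add((EX["course42"], DC.title, Literal("Introduction to RDF")))
    g.add((EX["course42"], DC.subject, Literal("Semantic Web")))

    # A distributed Edutella query would be shipped to peers; locally,
    # the same selection can be expressed over the graph like this:
    results = g.query("""
        PREFIX dc: <http://purl.org/dc/elements/1.1/>
        SELECT ?res ?title WHERE {
            ?res dc:subject "Semantic Web" ;
                 dc:title   ?title .
        }
    """)
    for row in results:
        print(row.res, row.title)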

939 citations

Journal ArticleDOI
TL;DR: In this paper (the journal version of the work above), the authors define a new intermediate knowledge representation (KR) contained within the expressive intersection of Description Logic and Logic Programs: Description Logic Programs (DLP), and the closely related Description Horn Logic (DHL), an expressive fragment of first-order logic (FOL).

843 citations

Journal ArticleDOI
TL;DR: It is argued that a further representation and inference layer is needed on top of the Web's current layers, and to establish such a layer, a general method for encoding ontology representation languages into RDF/RDF Schema is proposed.
Abstract: XML and RDF are the current standards for establishing semantic interoperability on the Web, but XML addresses only document structure. RDF better facilitates interoperation because it provides a data model that can be extended to address sophisticated ontology representation techniques. We explain the role of ontologies in the architecture of the Semantic Web. We then briefly summarize key elements of XML and RDF, showing why using XML as a tool for semantic interoperability will be ineffective in the long run. We argue that a further representation and inference layer is needed on top of the Web's current layers, and to establish such a layer, we propose a general method for encoding ontology representation languages into RDF/RDF Schema. We illustrate the extension method by applying it to the Ontology Interchange Language (OIL), an ontology representation and inference language.
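
A hedged sketch of the encoding idea using rdflib follows; the namespaces and the TransitiveProperty primitive are illustrative stand-ins, not the paper's actual OIL encoding. The point is that the primitives of the richer language are themselves published as RDF Schema vocabulary, so ontologies written in that language remain ordinary RDF triples that an RDF-only agent can still parse.

    # Sketch: encoding an ontology-language primitive in RDF Schema.
    # Namespaces and the TransitiveProperty primitive are hypothetical.
    from rdflib import Graph, Namespace
    from rdflib.namespace import RDF, RDFS

    OIL = Namespace("http://example.org/oil#")   # stand-in namespace
    EX = Namespace("http://example.org/vocab#")

    g = Graph()

    # The new language primitive is itself defined as RDFS vocabulary:
    # a class whose instances are properties.
    g.add((OIL.TransitiveProperty, RDF.type, RDFS.Class))
    g.add((OIL.TransitiveProperty, RDFS.subClassOf, RDF.Property))

    # An ontology using the richer language is still plain RDF triples;
    # agents that only know RDF Schema see an ordinary property.
    g.add((EX.ancestorOf, RDF.type, OIL.TransitiveProperty))
    g.add((EX.ancestorOf, RDFS.domain, EX.Person))
    g.add((EX.ancestorOf, RDFS.range, EX.Person))

    print(g.serialize(format="turtle"))

Agents that understand the extension layer give the TransitiveProperty primitive its extra semantics; all other agents degrade gracefully to plain RDF Schema, which is precisely the layering argument the abstract makes.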

795 citations


Cited by
Book
01 Jan 1995
TL;DR: Nonaka and Takeuchi argue that Japanese firms are successful precisely because they are innovative: they create new knowledge and use it to produce successful products and technologies. The book reveals how Japanese companies translate tacit knowledge into explicit knowledge.
Abstract: How has Japan become a major economic power, a world leader in the automotive and electronics industries? What is the secret of their success? The consensus has been that, though the Japanese are not particularly innovative, they are exceptionally skilful at imitation, at improving products that already exist. But now two leading Japanese business experts, Ikujiro Nonaka and Hirotaka Takeuchi, turn this conventional wisdom on its head: Japanese firms are successful, they contend, precisely because they are innovative, because they create new knowledge and use it to produce successful products and technologies. Examining case studies drawn from such firms as Honda, Canon, Matsushita, NEC, 3M, GE, and the U.S. Marines, this book reveals how Japanese companies translate tacit into explicit knowledge and use it to produce new processes, products, and services.

7,448 citations

Proceedings ArticleDOI
22 Jan 2006
TL;DR: Some of the major results in random graphs and some of the more challenging open problems are reviewed, along with newer models, including those related to the WWW.
Abstract: We will review some of the major results in random graphs and some of the more challenging open problems. We will cover algorithmic and structural questions. We will touch on newer models, including those related to the WWW.

7,116 citations

Journal ArticleDOI
TL;DR: The authors describe progress to date in publishing Linked Data on the Web, review applications that have been developed to exploit the Web of Data, and map out a research agenda for the Linked Data community as it moves forward.
Abstract: The term "Linked Data" refers to a set of best practices for publishing and connecting structured data on the Web. These best practices have been adopted by an increasing number of data providers over the last three years, leading to the creation of a global data space containing billions of assertions: the Web of Data. In this article, the authors present the concept and technical principles of Linked Data, and situate these within the broader context of related technological developments. They describe progress to date in publishing Linked Data on the Web, review applications that have been developed to exploit the Web of Data, and map out a research agenda for the Linked Data community as it moves forward.
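
The best practices amount to using dereferenceable HTTP URIs as names, returning RDF when those URIs are dereferenced, and setting RDF links into other datasets. A minimal sketch of consuming Linked Data with rdflib follows; it assumes network access and that DBpedia still serves content-negotiated RDF at this URI:

    # Sketch: dereferencing a Linked Data URI and following RDF links.
    # Assumes network access and that DBpedia serves RDF for this URI.
    from rdflib import Graph
    from rdflib.namespace import OWL

    g = Graph()
    # HTTP content negotiation returns an RDF description of the entity.
    g.parse("http://dbpedia.org/resource/Berlin")

    # owl:sameAs links point at descriptions of the same entity in other
    # datasets; following such links is what weaves isolated datasets
    # into a global Web of Data.
    for _, _, other in g.triples((None, OWL.sameAs, None)):
        print(other)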

5,113 citations

Book
01 Jan 2008
TL;DR: Nonaka and Takeuchi argue that there are two types of knowledge: explicit knowledge, contained in manuals and procedures, and tacit knowledge, learned only by experience and communicated only indirectly, through metaphor and analogy.
Abstract: How have Japanese companies become world leaders in the automotive and electronics industries, among others? What is the secret of their success? Two leading Japanese business experts, Ikujiro Nonaka and Hirotaka Takeuchi, are the first to tie the success of Japanese companies to their ability to create new knowledge and use it to produce successful products and technologies. In The Knowledge-Creating Company, Nonaka and Takeuchi provide an inside look at how Japanese companies go about creating this new knowledge organizationally. The authors point out that there are two types of knowledge: explicit knowledge, contained in manuals and procedures, and tacit knowledge, learned only by experience, and communicated only indirectly, through metaphor and analogy. U.S. managers focus on explicit knowledge. The Japanese, on the other hand, focus on tacit knowledge. And this, the authors argue, is the key to their success--the Japanese have learned how to transform tacit into explicit knowledge. To explain how this is done--and illuminate Japanese business practices as they do so--the authors range from Greek philosophy to Zen Buddhism, from classical economists to modern management gurus, illustrating the theory of organizational knowledge creation with case studies drawn from such firms as Honda, Canon, Matsushita, NEC, Nissan, 3M, GE, and even the U.S. Marines. For instance, using Matsushita's development of the Home Bakery (the world's first fully automated bread-baking machine for home use), they show how tacit knowledge can be converted to explicit knowledge: when the designers couldn't perfect the dough kneading mechanism, a software programmer apprenticed herself with the master baker at Osaka International Hotel, gained a tacit understanding of kneading, and then conveyed this information to the engineers. In addition, the authors show that, to create knowledge, the best management style is neither top-down nor bottom-up, but rather what they call "middle-up-down," in which the middle managers form a bridge between the ideals of top management and the chaotic realities of the frontline. As we make the turn into the 21st century, a new society is emerging. Peter Drucker calls it the "knowledge society," one that is drastically different from the "industrial society," and one in which acquiring and applying knowledge will become key competitive factors. Nonaka and Takeuchi go a step further, arguing that creating knowledge will become the key to sustaining a competitive advantage in the future. Because the competitive environment and customer preferences change constantly, knowledge perishes quickly. With The Knowledge-Creating Company, managers have at their fingertips years of insight from Japanese firms that reveal how to create knowledge continuously, and how to exploit it to make successful new products, services, and systems.

3,668 citations