
Showing papers in "D-lib Magazine in 1998"


Journal ArticleDOI
TL;DR: The article argues for the concept of the hybrid library as a logical follow-on from current developments; institutions should remain an important focus for digital library activities, and users in those institutions require the sort of integration of digital library services which the hybrid library promises.
Abstract: Achievements in research programs are seldom what was envisaged at the start. It is still too early to judge the success or failure of the first phases of the UK Electronic Libraries Program (eLib), as several projects are still not completed. Despite this, some lessons learned are becoming apparent. In this article I reflect on a little of what has been learned and explore some of the implications. The eLib program has a heavy evaluation component, but the views expressed here are personal, deriving as much from living and working with the projects for the last three years as from any formal evaluations at this stage. From these reflections I look forward briefly to the latest phase of eLib. My aim is to argue for the concept of the hybrid library as a logical follow-on from current developments. Institutions should remain an important focus for digital library activities, and users in those institutions require the sort of integration of digital library services which the hybrid library promises.

161 citations


Journal ArticleDOI

98 citations



Journal ArticleDOI
TL;DR: The Resource Description Framework is an infrastructure that enables the encoding, exchange and reuse of structured metadata and provides means for publishing both human-readable and machine-processable vocabularies designed to encourage the reuse of metadata semantics among disparate information communities.
Abstract: The Resource Description Framework (RDF) is an infrastructure that enables the encoding, exchange, and reuse of structured metadata. RDF is an application of XML that imposes needed structural constraints to provide unambiguous methods of expressing semantics. RDF additionally provides means for publishing both human-readable and machine-processable vocabularies designed to encourage the reuse and extension of metadata semantics among disparate information communities. The deployment of these constructs allows the vast unstructured mass of information on the Web to be transformed into something more manageable, and thus something far more useful.
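
As a rough illustration of the kind of structured, reusable metadata the article describes, the sketch below uses the rdflib Python library (an assumption; it is not mentioned in the article and postdates it) to describe a hypothetical document with Dublin Core properties and serialize it as RDF/XML. The resource URI, title, creator, and date are invented for the example.

```python
# Minimal RDF sketch using rdflib (an assumption, not from the article).
# The resource URI and property values below are hypothetical.
from rdflib import Graph, Literal, URIRef
from rdflib.namespace import DC

g = Graph()
g.bind("dc", DC)  # reuse the Dublin Core vocabulary rather than invent new terms

doc = URIRef("http://example.org/articles/rdf-intro")  # hypothetical resource
g.add((doc, DC.title, Literal("An Introduction to RDF")))
g.add((doc, DC.creator, Literal("A. Author")))
g.add((doc, DC.date, Literal("1998-05")))

# Serialize as RDF/XML: machine-processable, yet still human-readable.
print(g.serialize(format="xml"))
```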

76 citations



Journal ArticleDOI
TL;DR: A scalable system for searching heterogeneous multilingual collections on the World Wide Web is described, including a markup language for describing the characteristics of a search engine and its interface, and a protocol for requesting word translations between languages.
Abstract: This article describes a scalable system for searching heterogeneous multilingual collections on the World Wide Web. It details a markup language for describing the characteristics of a search engine and its interface, and a protocol for requesting word translations between languages.
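
The article's actual markup language and translation protocol are not reproduced here. The sketch below only illustrates the general idea in Python: a descriptor capturing a search engine's interface characteristics, plus a stub word-translation lookup standing in for a translation-request protocol. All field names, URLs, and dictionary entries are invented.

```python
# Illustrative sketch only: a hypothetical search-engine descriptor and a toy
# word-translation lookup. The article defines its own markup language and
# protocol, which are not reproduced here.
from dataclasses import dataclass, field

@dataclass
class SearchEngineDescriptor:
    name: str
    query_url: str                 # URL template with a {query} placeholder
    charset: str = "utf-8"
    languages: list = field(default_factory=list)

    def build_query(self, term: str) -> str:
        return self.query_url.format(query=term)

# Toy bilingual table standing in for a translation-request protocol.
TRANSLATIONS = {("en", "fr"): {"library": "bibliothèque", "search": "recherche"}}

def translate(word: str, source: str, target: str) -> str:
    return TRANSLATIONS.get((source, target), {}).get(word, word)

engine = SearchEngineDescriptor(
    name="ExampleFrenchCatalog",
    query_url="http://example.org/cgi-bin/search?q={query}",
    languages=["fr"],
)
print(engine.build_query(translate("library", "en", "fr")))
```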

70 citations



Journal ArticleDOI
TL;DR: Interpretation, and re-interpretation, of primary and secondary sources is the foundation of much humanistic scholarship, and constructing a convincing argument depends on an evaluation of the authenticity of source materials.
Abstract: Interpretation, and re-interpretation, of primary and secondary sources is the foundation of much humanistic scholarship. Construction of a convincing argument depends on an evaluation of the authenticity of source materials. Judgments about authenticity are based on assessments of the origins, completeness and internal integrity of a document. They may also draw from the consistency and coherence that exists between a particular source and others in the same context or of the same type.
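
In digital collections, one common (though partial) technical counterpart to these judgments about internal integrity is a fixity check. The sketch below goes beyond anything the article itself describes: it simply compares a stored document's SHA-256 digest against a digest recorded when the item entered the collection; the file path and digest shown are hypothetical.

```python
# Minimal fixity-check sketch (not from the article): internal integrity of a
# stored file is assessed by comparing its SHA-256 digest against a value
# recorded when the document was accessioned.
import hashlib

def file_digest(path: str) -> str:
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_integrity(path: str, recorded_digest: str) -> bool:
    return file_digest(path) == recorded_digest

# Hypothetical usage:
# verify_integrity("sources/letter_1848.tif", "9f86d081884c7d65...")
```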

59 citations


Journal ArticleDOI
TL;DR: Based on the recognized achievements of DLI and the promise of additional Federal investment in digital libraries, a follow-up program was announced in the spring of 1998, and in the new program, "Digital Libraries Initiative—Phase 2," NSF, DARPA, and NASA are joined by the National.
Abstract: The Digital Libraries Initiative (DLI) was the result of a community-based process which began in the late 1980s with informal discussions between researchers and agency program managers. These discussions progressed to planning workshops designed to develop research values and agendas and culminated in the National Science Foundation (NSF)/Defense Advanced Research Projects Agency (DARPA)/National Aeronautics and Space Administration (NASA) Research in Digital Libraries Initiative announced in late 1993. With the selection and funding of the six DLI projects [http://www.cise.nsf.gov/iis/dli_home.html], interest and activities related to digital libraries accelerated rapidly. The six DLI projects became highly visible and influential efforts and grew in scope, participation, and influence. NSF and DARPA funded additional workshops as part of the DLI to develop consensus on specific digital libraries topical areas and boundaries, to bring together researchers to stimulate cross-disciplinary interaction, and to ponder together how best to adapt to a rapidly changing global information environment. By now, researchers and practitioners from many disciplines have been drawn into digital libraries research and related activities, from subject domains reaching far beyond the sciences into the arts and humanities. Based on the recognized achievements of DLI and the promise of additional Federal investment in digital libraries, a follow-up program was announced in the spring of 1998. In the new program, "Digital Libraries Initiative—Phase 2," NSF, DARPA, and NASA are joined by the National

58 citations




Journal ArticleDOI
TL;DR: The article assesses the achievements of the Open Journal project, considers some of the difficulties it faced, reports on the different approaches to linking that the project developed, and summarises the important user responses that indicate what works and what does not.
Abstract: The Open Journal project has completed its three year period of funding by the UK Electronic Libraries (eLib) programme. During that time, the number of journals that are available electronically leapt from a few tens to a few thousand. Some of these journals are now developing the sort of features the project has been advocating, in particular the use of links within journals, between different primary journals, with secondary journals data, and to non-journal sources. Assessing the achievements of the project and considering some of the difficulties it faced, we report on the different approaches to linking that the project developed, and summarise the important user responses that indicate what works and what does not. Looking ahead, there are signs of change, not just to simple linking within journals but to schemes in which links are the basis of "distributed" journals, where information may be shared and documents built from different sources. The significance has yet to be appreciated, but this would be a major change from printed journals. If projects such as this and others have provided the initial impetus, the motivation for distributed journals comes, perhaps surprisingly, from within certain parts of the industry, as the paper shows.
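
As a loose illustration of reference linking of the kind discussed (not the project's actual implementation), the sketch below resolves a parsed citation to the URL of an electronic version through a hypothetical link database; the journal name, volume, page, and URL are invented.

```python
# Illustrative sketch of reference linking, not the Open Journal project's
# actual mechanism: a parsed citation is looked up in a hypothetical link
# database and resolved to the electronic full text, if one is known.
from typing import NamedTuple, Optional

class Citation(NamedTuple):
    journal: str
    volume: int
    first_page: int

# Hypothetical link database mapping citations to electronic locations.
LINK_DB = {
    Citation("J. Example Studies", 12, 101): "http://example.org/jes/12/101",
}

def resolve(citation: Citation) -> Optional[str]:
    """Return a URL for the cited article, or None if no link is known."""
    return LINK_DB.get(citation)

url = resolve(Citation("J. Example Studies", 12, 101))
print(url or "no electronic version linked")
```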










Journal ArticleDOI
TL;DR: DASHER’s Information Space Analysis Tools are unique in combining multiple methods to assist in this task, which makes the suite particularly well-suited to integrating additional technologies in order to create specialized systems.
Abstract: The DASHER Project at USC/ISI has focused upon helping organizations with rapid-response mission requirements. Such organizations need to be able to quickly stand up tiger teams backed by the information, materiel, and support services they need to do their job. To do so, they need to find and assess sources of those services that are potential participants in the tiger team. To support this very initial phase of team development, the project has developed information analysis tools that help make sense of sets of data sources in an intranet or internet: characterizing them, partitioning them, sorting and filtering them. These tools focus on three key issues in forming a collaborative team: helping the individuals responsible for forming the team understand what is available, helping them structure and categorize the information available to them in a manner specifically suited to the task at hand, and helping them understand the mappings between their organization of the information and those used by others who might participate. DASHER’s Information Space Analysis Tools are unique in combining multiple methods to assist in this task. This makes the suite particularly well-suited to integrating additional technologies in order to create specialized systems.
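
The tools themselves are not described here in enough detail to reproduce. As a loose illustration of characterizing and partitioning data sources, the sketch below groups sources by the overlap of their term sets using Jaccard similarity; this is an assumed stand-in, not DASHER's actual method, and the source names and terms are invented.

```python
# Loose illustration only (not DASHER's actual algorithms): characterize each
# data source by a set of extracted terms and greedily partition sources into
# groups whose pairwise Jaccard similarity exceeds a threshold.
def jaccard(a: set, b: set) -> float:
    return len(a & b) / len(a | b) if (a | b) else 0.0

def partition(sources: dict, threshold: float = 0.3) -> list:
    """Greedily group sources whose term sets overlap strongly."""
    groups = []
    for name, terms in sources.items():
        for group in groups:
            if any(jaccard(terms, sources[other]) >= threshold for other in group):
                group.append(name)
                break
        else:
            groups.append([name])
    return groups

# Hypothetical sources characterized by extracted terms.
sources = {
    "logistics_db": {"supply", "transport", "inventory"},
    "fleet_reports": {"transport", "vehicle", "inventory"},
    "med_handbook": {"triage", "medicine", "vaccine"},
}
print(partition(sources))  # e.g. [['logistics_db', 'fleet_reports'], ['med_handbook']]
```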
