
Showing papers in "Journal of Universal Computer Science in 1998"


BookDOI
Uwe M. Borghoff1, Remo Pareschi1
TL;DR: This special issue of the Journal of Universal Computer Science contains a selection of papers from the First Conference on Practical Applications of Knowledge Management, where each paper describes a specific type of information technology suitable for the support of different aspects of knowledge management.
Abstract: Knowledge has been lately recognized as one of the most important assets of organizations. Can information technology help the growth and the sustainment of organizational knowledge? The answer is yes, if care is taken to remember that IT here is just a part of the story (corporate culture and work practices being equally relevant) and that the information technologies best suited for this purpose should be expressly designed with knowledge management in view. This special issue of the Journal of Universal Computer Science contains a selection of papers from the First Conference on Practical Applications of Knowledge Management. Each paper describes a specific type of information technology suitable for the support of different aspects of knowledge management.

391 citations


Book ChapterDOI
TL;DR: This chapter compares and summarizes the experiences from three case studies on Corporate Memories for supporting various aspects in the product life-cycles of three European corporations and sketches a general framework for the development methodology, architecture, and technical realization of a Corporate Memory.
Abstract: A core concept in discussions about technological support for knowledge management is the Corporate Memory. A Corporate or Organizational Memory can be characterized as a comprehensive computer system which captures a company’s accumulated know-how and other knowledge assets and makes them available to enhance the efficiency and effectiveness of knowledge-intensive work processes. The successful development of such a system requires a careful analysis of established work practices and available information-technology (IT) infrastructure. This is essential for providing a cost-effective solution which will be accepted by the users and can be evolved in the future. This chapter compares and summarizes our experiences from three case studies on Corporate Memories for supporting various aspects in the product life-cycles of three European corporations. Based on the conducted analyses and prototypical implementations, we sketch a general framework for the development methodology, architecture, and technical realization of a Corporate Memory.

156 citations


Journal Article
TL;DR: Adaptive link annotation is a new direction within the field of user-model based interfaces as mentioned in this paper, which aims to help users find an appropriate path in a learning and information space by adapting link presentation to the goals, knowledge, and other characteristics of an individual user.
Abstract: Adaptive link annotation is a new direction within the field of user-model based interfaces. It is a specific technique in Adaptive Navigation Support (ANS) whose aim is to help users find an appropriate path in a learning and information space by adapting link presentation to the goals, knowledge, and other characteristics of an individual user. More specifically, ANS has been implemented on the WWW in the InterBook system as link annotation indicating several states such as visited, ready to be learned, or not ready to be learned. These states represent an expert's suggested path for an individual user through a learning space according to both a history-based annotation (tracking where the user has been) and a prerequisite-based annotation (indexing of content as a set of domain model concepts). This particular process has been more fully described elsewhere (Brusilovsky, Eklund & Schwarz 1998). This paper details results from an investigation to determine the effectiveness of user-model based link annotation, in a real-world teaching and learning context, on learning outcomes for a group of twenty-five second-year education students in their study of databases and spreadsheets. Using sections of a textbook on ClarisWorks databases and spreadsheets, which had been authored into the InterBook program, students received sections of the text both with and without the adaptive link annotation. Through the use of audit trails, questionnaires and test results, we show that while this particular form of ANS implemented in InterBook initially had a negative effect on the learning of the group, it appears to have been beneficial to the learning of those particular students who tended to accept the navigation advice, particularly initially when they were unfamiliar with a complex interface. We also show that ANS provided learners with the confidence to adopt less sequential paths through the learning space. Considering that ANS tools comprised a minimal part of the interface in the experiment, we show that they functioned reliably well. Discussion and suggestions for further research are provided.
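For illustration only, here is a minimal sketch of how a prerequisite-based link annotation of the kind described above could be computed; the data model, concept names and function names are assumptions for this example, not the actual InterBook implementation.

# Hypothetical sketch of prerequisite-based link annotation in the spirit of
# InterBook's Adaptive Navigation Support; data model and names are assumed.

def annotate_link(page, user_model, prerequisites, visited):
    """Return an annotation state for a link to `page`.

    user_model    -- set of domain-model concepts the user is assumed to know
    prerequisites -- dict mapping page -> set of concepts required beforehand
    visited       -- set of pages the user has already seen (history-based part)
    """
    if page in visited:
        return "visited"
    if prerequisites.get(page, set()) <= user_model:
        return "ready to be learned"
    return "not ready to be learned"

# Example: one page requires the 'field' concept, which the user has not learned.
prereqs = {"sorting-records": {"field", "record"}, "intro-databases": set()}
user_knows = {"record"}
seen = {"intro-databases"}
for p in prereqs:
    print(p, "->", annotate_link(p, user_knows, prereqs, seen))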

152 citations


Journal Article
TL;DR: This work compares transition super-cell systems with classic mechanisms in formal language theory, context-free and matrix grammars, E0L and ET0L systems, interpreted as generating mechanisms of number relations (the authors take the Parikh image of the usual language generated by these mechanisms rather than the language).
Abstract: We continue the investigation of the power of the computability models introduced in [Gh. Paun, Computing with membranes, TUCS Report 208, November 1998] under the name of transition super-cell systems. We compare these systems with classic mechanisms in formal language theory, context-free and matrix grammars, E0L and ET0L systems, interpreted as generating mechanisms of number relations (we take the Parikh image of the usual language generated by these mechanisms rather than the language). Several open problems are also formulated.
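As a reminder of the notation behind "generating mechanisms of number relations": the Parikh image replaces each word by its vector of symbol counts. The definition below is the standard one, given here only for orientation, not quoted from the paper.

\[
  \Psi_V(w) = \bigl( |w|_{a_1}, \dots, |w|_{a_k} \bigr)
  \quad \text{for } V = \{a_1, \dots, a_k\},
  \qquad
  \Psi_V(L) = \{\, \Psi_V(w) \mid w \in L \,\}.
\]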

96 citations


Journal Article
TL;DR: A generic tableau prover has been implemented and integrated with Isabelle (Paulson, 1994); compared with classical first-order logic provers, it has numerous extensions that allow it to reason with any supplied set of tableau rules.
Abstract: A generic tableau prover has been implemented and integrated with Isabelle (Paulson, 1994). Compared with classical first-order logic provers, it has numerous extensions that allow it to reason with any supplied set of tableau rules. It has a higher-order syntax in order to support user-defined binding operators, such as those of set theory. The unification algorithm is first-order instead of higher-order, but it includes modifications to handle bound variables. The proof, when found, is returned to Isabelle as a list of tactics. Because Isabelle verifies the proof, the prover can cut corners for efficiency's sake without compromising soundness. For example, the prover can use type information to guide the search without storing type information in full. Categories: F.4, I.1
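To give a feel for the tableau method itself (not for the Isabelle prover described above, which handles arbitrary rule sets, higher-order syntax and bound variables), here is a minimal propositional tableau satisfiability check; the formula encoding is an assumption of this sketch.

# Minimal propositional tableau procedure, for illustration only.
# Formulas: atoms are strings; compound formulas are tuples
# ('not', f), ('and', f, g), ('or', f, g).

def closed(branch):
    """A branch closes when it contains both f and ('not', f) for some formula f."""
    return any(('not', f) in branch for f in branch)

def satisfiable(branch):
    """Expand the tableau; return True iff some open, fully expanded branch exists."""
    if closed(branch):
        return False
    for f in branch:
        if isinstance(f, tuple):
            rest = branch - {f}
            if f[0] == 'and':                      # alpha rule: keep both conjuncts
                return satisfiable(rest | {f[1], f[2]})
            if f[0] == 'or':                       # beta rule: split the branch
                return satisfiable(rest | {f[1]}) or satisfiable(rest | {f[2]})
            if f[0] == 'not' and isinstance(f[1], tuple):
                g = f[1]
                if g[0] == 'not':                  # double negation
                    return satisfiable(rest | {g[1]})
                if g[0] == 'and':                  # de Morgan: not(and) -> split on negations
                    return satisfiable(rest | {('not', g[1])}) or \
                           satisfiable(rest | {('not', g[2])})
                if g[0] == 'or':                   # de Morgan: not(or) -> both negations
                    return satisfiable(rest | {('not', g[1]), ('not', g[2])})
    return True                                    # only literals left: open branch

# (p or q) and not p is satisfiable (take q); p and not p is not.
print(satisfiable({('and', ('or', 'p', 'q'), ('not', 'p'))}))   # True
print(satisfiable({('and', 'p', ('not', 'p'))}))                # False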

92 citations


Journal Article
TL;DR: It is shown that a compact NSS, defined as an NSS that meets both a combinatorial bound and an entropy-type bound on the size of shares with equality, has a special access hierarchy and is closely related to a matroid.
Abstract: Nonperfect secret sharing schemes (NSSs) have an advantage in that the size of shares can be shorter than that of perfect secret sharing schemes. This paper shows some basic properties of general NSSs. First, we present a necessary and sufficient condition on the existence of an NSS. Next, we show two bounds on the size of shares, a combinatorial type bound and an entropy type bound. Further, we define a compact NSS as an NSS which meets the equalities of both our bounds. Then we show that a compact NSS has some special access hierarchy and it is closely related to a matroid. Verifiable nonperfect secret sharing schemes are also presented.

52 citations


Journal Article
TL;DR: A system called Pat is presented for localizing instances of structural design patterns in existing C++ software, relying extensively on a commercial CASE tool and a PROLOG interpreter, resulting in a simple and robust architecture that cannot solve the problem completely, but is industrial-strength; it avoids the brittleness that many reverse engineering tools exhibit when applied to realistic software.
Abstract: The object-oriented design community has recently begun to collect so-called software design patterns: descriptions of proven solutions to common software design problems, packaged in a description that includes a problem, a context, a solution, and its properties. Design pattern information can improve the maintainability of software, but is often absent in program documentation. We present a system called Pat for localizing instances of structural design patterns in existing C++ software. It relies extensively on a commercial CASE tool and a PROLOG interpreter, resulting in a simple and robust architecture that cannot solve the problem completely, but is industrial-strength; it avoids the brittleness that many reverse engineering tools exhibit when applied to realistic software. To evaluate Pat, we quantify its performance in terms of precision and recall. We examine four applications, including the popular class libraries zApp and LEDA. Within Pat's restrictions all pattern instances are found, the precision is about 40 percent, and manual filtering of the false positives is relatively easy. Therefore, we consider Pat a good compromise: modest functionality, but high practical stability for recovering design information. CR classification: D.2.2 CASE, D.2.6, D.2.7 documentation, D.2.10 representation, I.5.5 General terms: Algorithms, Design, Measurement.
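Schematically, Pat's approach amounts to querying extracted design facts with pattern rules. The toy sketch below imitates this in Python with made-up fact names and a Composite-like rule; the real system works on C++ headers via a CASE tool and a PROLOG interpreter, which is not reproduced here.

# Toy illustration of rule-based pattern search over extracted design facts.
# Fact names and the example design are invented for this sketch.

# Extracted "facts": (subclass, superclass) and (owner, type of aggregated member)
inherits   = {("Picture", "Graphic"), ("Line", "Graphic"), ("Text", "Graphic")}
aggregates = {("Picture", "Graphic"), ("Document", "Page")}

def composite_instances():
    """Composite-like rule: C inherits from P and also aggregates objects of type P."""
    return sorted({(c, p) for (c, p) in inherits if (c, p) in aggregates})

print(composite_instances())   # [('Picture', 'Graphic')] -> candidate Composite pattern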

46 citations


Journal Article
TL;DR: It is proved that almost every Boolean function (almost every balanced Boolean function) satisfies all of the above-mentioned criteria at levels very close to optimal and can therefore be considered cryptographically strong.
Abstract: Boolean functions used in cryptographic applications have to satisfy various cryptographic criteria. Although the choice of the criteria depends on the cryptosystem in which they are used, there are some properties (balancedness, nonlinearity, high algebraic degree, correlation immunity, propagation criteria) which a cryptographically strong Boolean function ought to have. We study the above-mentioned properties in the set of all Boolean functions (all balanced Boolean functions) and prove that almost every Boolean function (almost every balanced Boolean function) satisfies all of these criteria at levels very close to optimal and can therefore be considered cryptographically strong.
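As a concrete illustration of two of these criteria, the sketch below checks balancedness and computes nonlinearity of a small example function via its Walsh-Hadamard spectrum; the example function is not taken from the paper.

# Checking balancedness and nonlinearity of a small Boolean function.
from itertools import product

def walsh(f, n):
    """Walsh-Hadamard spectrum W_f(a) = sum_x (-1)^(f(x) xor a.x)."""
    spectrum = []
    for a in product((0, 1), repeat=n):
        s = 0
        for x in product((0, 1), repeat=n):
            dot = sum(ai * xi for ai, xi in zip(a, x)) % 2
            s += (-1) ** (f(x) ^ dot)
        spectrum.append(s)
    return spectrum

def analyse(f, n):
    values = [f(x) for x in product((0, 1), repeat=n)]
    balanced = values.count(1) == len(values) // 2
    nonlinearity = 2 ** (n - 1) - max(abs(w) for w in walsh(f, n)) // 2
    return balanced, nonlinearity

# Example: the 3-variable majority function x1*x2 ^ x1*x3 ^ x2*x3
f = lambda x: (x[0] & x[1]) ^ (x[0] & x[2]) ^ (x[1] & x[2])
print(analyse(f, 3))   # (True, 2): balanced, nonlinearity 2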

42 citations


Journal Article
TL;DR: In this paper, the structure of documents and the link topology can be exploited to perform categorization by context, where the context surrounding a link in an HTML document is used for categorizing the document referred by the link.
Abstract: The traditional approach to document categorization is categorization by content, since the information for categorizing a document is extracted from the document itself. In a hypertext environment like the Web, the structure of documents and the link topology can be exploited to perform what we call categorization by context [Attardi 98]: the context surrounding a link in an HTML document is used for categorizing the document referred to by the link. Categorization by context can also deal with multimedia material, since it does not rely on the ability to analyze the content of documents. Categorization by context leverages the categorization activity implicitly performed when someone places or refers to a document on the Web. By focusing the analysis on the documents used by a group of people, one can build a catalogue tuned to the needs of that group. Categorization by context is based on the following assumptions:
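A rough sketch of the harvesting step behind categorization by context: collect, for every link in a page, the anchor text plus the surrounding words, so the linked document can be categorized without inspecting its own content. The window size and the HTML snippet below are assumptions for illustration.

# Extracting link contexts from HTML with the standard-library parser.
from html.parser import HTMLParser

class LinkContextParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.tokens = []          # flat stream of words seen so far
        self.links = []           # (href, position of the link in that stream)

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append((href, len(self.tokens)))

    def handle_data(self, data):
        self.tokens.extend(data.split())

    def contexts(self, window=5):
        """Return (href, surrounding words) pairs for each link."""
        return [(href, " ".join(self.tokens[max(0, pos - window):pos + window]))
                for href, pos in self.links]

parser = LinkContextParser()
parser.feed('<p>A gentle introduction to <a href="p-systems.html">membrane '
            'computing</a> with many worked examples.</p>')
print(parser.contexts())
# [('p-systems.html', 'A gentle introduction to membrane computing with many worked')]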

42 citations


Journal ArticleDOI
TL;DR: Using the CADNA library, which makes it possible to estimate on computers the round-off error affecting any computed result, the optimal value of $n$ for approximating $I$ can be computed dynamically, with the guarantee that the exact significant digits of $I_n$ are in common with the significant digits of $I$.
Abstract: If $I_n$ is the approximation of a definite integral $\int_{a}^{b}f(x)\,dx$ with step $\frac{b-a}{2^n}$ using the trapezoidal rule (respectively Simpson's rule), and if $C_{a,b}$ denotes the number of significant digits common to $a$ and $b$, we show in this paper that $C_{I_{n},I_{n+1}} = C_{I_{n},I}+\log_{10}(\frac{4}{3})+\mathcal{O}(\frac{1}{4^n})$ (respectively $C_{I_{n},I_{n+1}} = C_{I_{n},I}+\log_{10}(\frac{16}{15})+\mathcal{O}(\frac{1}{16^n})$). According to these theorems, using the CADNA library, which makes it possible to estimate on computers the round-off error affecting any computed result, we can compute dynamically the optimal value of $n$ to approximate $I$, and we are sure that the exact significant digits of $I_n$ are in common with the significant digits of $I$.
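The trapezoidal-rule statement can be checked numerically in ordinary floating-point arithmetic (without CADNA). The sketch below uses one common convention for $C_{a,b}$, namely $\log_{10}\frac{|a+b|}{2|a-b|}$, which is an assumption of this illustration; the printed differences should hover around $\log_{10}(4/3) \approx 0.1249$.

# Numerical check of the trapezoidal-rule theorem above (illustrative only).
import math

def trapezoid(f, a, b, n):
    h = (b - a) / 2 ** n
    xs = [a + i * h for i in range(2 ** n + 1)]
    return h * (sum(f(x) for x in xs) - (f(a) + f(b)) / 2)

def common_digits(a, b):
    return math.log10(abs((a + b) / (2 * (a - b))))

f, a, b = math.sin, 0.0, 1.0
I = 1 - math.cos(1.0)                     # exact value of the integral
for n in range(3, 8):
    In, In1 = trapezoid(f, a, b, n), trapezoid(f, a, b, n + 1)
    print(n, round(common_digits(In, In1) - common_digits(In, I), 4))
# each printed value should be close to log10(4/3) ~ 0.1249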

42 citations


Journal Article
TL;DR: The efficiency of global optimization methods based on interval arithmetic, associated with various branch and bound methods, makes it possible to determine the global optimum and the corresponding optimizers with certainty and arbitrary accuracy.
Abstract: The efficiency of global optimization methods based on interval arithmetic no longer needs to be demonstrated. They make it possible to determine the global optimum and the corresponding optimizers with certainty and arbitrary accuracy. One of the main features of these algorithms is to deliver an enclosure of a function defined on a box (right parallelepiped). The studied method provides a lower bound or upper bound of a function on that box through two different strategies. As we shall see, these algorithms, associated with various branch and bound methods, lead to accelerated convergence and make it possible to avoid the cluster problem.
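A very small one-dimensional interval branch-and-bound in the spirit of such methods is sketched below; the objective function, the tolerance and the absence of directed rounding are simplifications of this illustration, not features of the paper's method.

# Tiny interval branch-and-bound for global minimisation (illustrative only).

class Interval:
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi
    def __add__(self, o):  return Interval(self.lo + o.lo, self.hi + o.hi)
    def __sub__(self, o):  return Interval(self.lo - o.hi, self.hi - o.lo)
    def __mul__(self, o):
        ps = [self.lo * o.lo, self.lo * o.hi, self.hi * o.lo, self.hi * o.hi]
        return Interval(min(ps), max(ps))
    def width(self): return self.hi - self.lo
    def mid(self):   return (self.lo + self.hi) / 2

def F(X):                      # natural interval extension of f(x) = x^4 - 2x^2
    return X * X * X * X - Interval(2, 2) * X * X

def f(x):                      # point evaluation, used for the upper bound
    return x ** 4 - 2 * x ** 2

def minimise(X0, tol=1e-6):
    best_ub = f(X0.mid())
    work = [X0]
    while work:
        X = work.pop()
        FX = F(X)
        if FX.lo > best_ub:                    # box cannot contain the minimum
            continue
        best_ub = min(best_ub, f(X.mid()))     # improve the upper bound
        if X.width() < tol:
            yield X, FX.lo                     # candidate box
        else:                                  # bisect and keep both halves
            m = X.mid()
            work += [Interval(X.lo, m), Interval(m, X.hi)]

boxes = list(minimise(Interval(-2.0, 2.0)))
print(min(lb for _, lb in boxes))              # lower bound close to the true minimum -1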

Journal Article
TL;DR: This work introduces and studies abstract algebraic systems, such as quasimodules and quasilinear systems, generalizing the arithmetic systems of intervals and of convex bodies equipped with Minkowski operations.
Abstract: We introduce and study abstract algebraic systems, such as quasimodules and quasilinear systems, generalizing the arithmetic systems of intervals and of convex bodies equipped with Minkowski operations. Embedding theorems are proved and computational rules for algebraic transformations are given.
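The need for such "quasi" structures can be seen in two lines: under Minkowski (set-wise) addition a nondegenerate interval has no additive inverse, so intervals do not form a module. A minimal illustration:

def mink_add(a, b):          # Minkowski sum of intervals [a0,a1] + [b0,b1]
    return (a[0] + b[0], a[1] + b[1])

A = (1.0, 3.0)
neg_A = (-3.0, -1.0)         # the pointwise negation of A
print(mink_add(A, neg_A))    # (-2.0, 2.0), not the neutral element (0.0, 0.0)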

Journal Article
TL;DR: HyperAT, a hypertext research authoring tool developed to help designers build usable web documents on the World Wide Web without getting \lost, is presented.
Abstract: Users tend to lose their way in the maze of information within hypertext. Much work done to address the \\lost in hyperspace\" problem is reactive, that is, doing remedial work to correct the de ciencies within hypertexts because they are (or were) poorly designed and built. What if solutions are sought to avoid the problem? What if we do things well from the start? This paper reviews the \\lost in hyperspace\" problem, and suggests a framework to understand the design and usability issues. The issues cannot be seen as purely psychological or purely computing, they are multi-disciplinary. Our proactive, multi-disciplinary approach is drawn from current technologies in sub-disciplines of hypertext, humancomputer interaction, cognitive psychology and software engineering. To demonstrate these ideas, this paper presents HyperAT, a hypertext research authoring tool, developed to help designers build usable web documents on the World Wide Web without getting \\lost.\

Journal ArticleDOI
TL;DR: This special issue of the Journal of Universal Computer Science focuses on assessment and evaluation practices, clustered around three major issues: pragmatics (cost estimations and product reviews), measuring the effectiveness of theory-driven design, and extending paradigms for capturing a more profound understanding of variables and outcomes.
Abstract: Enormous sums of money and human effort have gone into educational technologies over the past decade. Yet nagging questions surface as to whether this tremendous investment produces advantageous results. While we intuitively feel that the influence of technology should be substantial, little sound guidance exists as to what is effective and why or how to use it. We seem to have cleared several of the hurdles for building a computer-aided instruction infrastructure; now we must turn our attention to richer understandings of research into the impact of technologies in the classroom. This special issue of the Journal of Universal Computer Science focuses on assessment and evaluation practices. The six articles in this collection have been clustered around three major issues: (1) pragmatics -- cost estimations and product reviews, (2) measuring the effectiveness of theory-driven design, (3) extending paradigms for capturing more profound understanding of variables and outcomes. Category: K.3 - Computers and Education

Book ChapterDOI
TL;DR: This chapter discusses the major steps in realizing this approach to documents: Knowledge acquisition, knowledge representation, and techniques to automatically generate multilingual documents from knowledge bases.
Abstract: A great part of the product knowledge in manufacturing enterprises is only available in the form of natural language documents. The know-how recorded in these documents is an essential resource for successful competition in the market. From the viewpoint of knowledge management, however, documents have a severe limitation: they do not capture the full wealth of the underlying knowledge, since not all of it is spelled out on the linguistic surface. In order to overcome this limitation, the notion of a document as a particular kind of realization of (or view on) the underlying knowledge is introduced. This chapter discusses the major steps in realizing this approach to documents: knowledge acquisition, knowledge representation, and techniques to automatically generate multilingual documents from knowledge bases. Further, we describe how the required product knowledge can be represented in a sharable and reusable way.
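The "document as a view on knowledge" idea can be caricatured as follows; real systems of the kind described use proper knowledge representation and natural-language generation rather than string templates, and the product record and sentences below are invented.

# One structured record, several language-specific renderings (toy sketch).
product = {"name": "TX-200", "voltage": "230 V", "weight": "4.2 kg"}

templates = {
    "en": "The {name} operates at {voltage} and weighs {weight}.",
    "de": "Der {name} arbeitet mit {voltage} und wiegt {weight}.",
}

def generate(record, lang):
    """Render one sentence of a product document in the requested language."""
    return templates[lang].format(**record)

for lang in templates:
    print(generate(product, lang))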


Journal Article
TL;DR: A system is provided for forming small, accurately spherical objects that are supported at the opening of a conduit on the updraft of hot gas emitted from the opening, so that surface tension forms a precise sphere.
Abstract: A system is provided for forming small, accurately spherical objects. Preformed, largely spherical objects (18) are supported at the opening of a conduit (16) on the updraft of hot gas emitted from the opening, so that the object is kept in a molten state. The conduit is suddenly jerked away at a downward incline to allow the molten object to drop in free fall, so that surface tension forms a precise sphere. The conduit portion that has the opening lies in a moderate-vacuum chamber (40), and the falling sphere passes through the chamber and through a briefly opened valve (30) into a tall drop tower (32) that contains a lower pressure, to allow the sphere to cool without the deformation caused by falling through air.


Journal Article
TL;DR: Basic pedagogical principles that may serve as starting points and guidelines in the evaluation of hypermedia-based learning environments are discussed, and two existing hypermedia learning environments are introduced and evaluated on the basis of the pedagogical principles presented.
Abstract: This paper discusses evaluation of hypermedia-based learning environments mainly from the point of view of the learner or student. The evaluation of a learning environment should be based on modern learning theories. These emphasise the importance of constructivism and the learner's activity in building mental models of the mathematical knowledge. The environment should also support conversational and collaborative learning. From the point of view of the learner it should be intentional and provide real life situations and contexts to motivate the study of abstract mathematical contents. Also, it should give sufficient feedback and be able to adapt to the needs of various learners. The purpose of the paper is to discuss basic pedagogical principles that may serve as starting points and guidelines in the evaluation of hypermedia-based learning environments. Two existing hypermedia learning environments will be introduced and evaluated on the basis of the pedagogical principles presented.


Journal Article
TL;DR: A formalism is proposed for a kind of branch and prune algorithm implementing symbolic and numerical methods to reduce the systems with respect to a relation defined from both inclusion of variable domains and inclusion of sets of constraints.
Abstract: This paper discusses the processing of non-linear polynomial systems using a branch and prune algorithm within the framework of constraint programming. We propose a formalism for a kind of branch and prune algorithm implementing symbolic and numerical methods to reduce the systems with respect to a relation defined from both inclusion of variable domains and inclusion of sets of constraints. The second part of the paper presents an instantiation of this general scheme. The pruning step is implemented as a cooperation of factorizations, substitutions and partial computations of Gröbner bases to simplify the systems, and interval Newton methods address the numerical, approximate solving. The branching step creates a partition of domains or generates disjunctive constraints from equations in factorized form. Experimental results from a prototype show that interval methods generally benefit from the symbolic processing of the initial constraints.
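The numerical half of such a pruning step can be illustrated by a single one-dimensional interval Newton contraction (the symbolic half, factorizations and Gröbner bases, is not shown); the example equation and the lack of outward rounding are simplifications.

# One interval Newton contraction step in one variable (illustrative only).

def interval_newton_step(f, df_enclosure, X):
    """Contract X = (lo, hi) towards the roots of f; assumes 0 not in f'(X)."""
    lo, hi = X
    m = (lo + hi) / 2
    dlo, dhi = df_enclosure(X)              # interval enclosing f'(x) on X
    q_lo, q_hi = sorted((f(m) / dlo, f(m) / dhi))
    n_lo, n_hi = m - q_hi, m - q_lo         # N(X) = m - f(m) / f'(X)
    return max(lo, n_lo), min(hi, n_hi)     # X intersected with N(X)

f  = lambda x: x * x - 2
dF = lambda X: (2 * X[0], 2 * X[1])         # f'(x) = 2x is increasing on [1, 2]

X = (1.0, 2.0)
for _ in range(4):
    X = interval_newton_step(f, dF, X)
    print(X)                                # shrinks quickly around sqrt(2)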

Journal Article
TL;DR: A new enclosure method for initial value problems in systems of ordinary differential equations is introduced; based on Taylor expansion, this implicit method sometimes yields much tighter bounds than the common explicit methods.
Abstract: We introduce an implicit single-step method for enclosing solutions of initial value problems in systems of ordinary differential equations.
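For orientation, a classical explicit enclosure test that such methods start from is the constant-enclosure (Picard) condition below; this is standard background, not the implicit Taylor-based method contributed by the paper.

\[
  y_0 + [0, h]\, f([Y]) \subseteq [Y]
  \;\Longrightarrow\;
  y(t) \in [Y] \ \text{ for all } t \in [t_0, t_0 + h],
  \quad \text{where } y' = f(y),\ y(t_0) = y_0 .
\]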

Journal Article
TL;DR: It is proposed that the most important role the computer may play in education could be contributing to the ubiquitous use of assessment for the improvement of instruction, and newly emerging WWW-based learning systems should support a very wide range of embedded assessment features.
Abstract: This paper proposes that the most important role the computer may play in education could be contributing to the ubiquitous use of assessment for the improvement of instruction. In order to realize this potential, newly emerging WWW-based learning systems should support a very wide range of embedded assessment features. These systems require architectures with a core of reliable integrated management tools, one or more modules with instruction and assessment, standard database connectivity, and an acceptable level of attention to permissions and security. No company will adequately address all of the possibilities for assessment in WWW-based learning systems, so it is critical that WWW based learning systems have “open system” architectures and company policies for cooperating with other companies to support interoperable modules. The point is raised that some products with similar types of assessment features can have very different architectures and policies for supporting interoperable modules. It is recommended that “checklists” for comparing assessment capabilities should be viewed with skepticism, because they can favor products with weaker architectures and policies for accommodating assessment capabilities.

Journal Article
TL;DR: The ’City Game’ is described, a computer-generated, virtual 3D city developed for teaching spatial orientation in elementary schools, and an informal study to see first reactions of children and adults when using the ’ city game’.
Abstract: Spatial orientation is an important ability, which should be facilitated in geometry courses of elementary schools. A preferred approach (in Germany) typically involves navigation and orientation tasks with pictures of a town and city maps depicted in a book. Because of the increasing use of computer systems in schools, it is very interesting to explore the value a virtual environment possesses for teaching spatial orientation. This article describes the ’City Game’, a computer-generated, virtual 3D city developed for teaching spatial orientation in elementary schools, and an informal study of the first reactions of children and adults when using the ’City Game’.

Journal Article
TL;DR: This work will report the results from a design experiment with a multimedia program developed at the NASA Classroom of the Future, and examine the methodologies that were used in the evaluation.
Abstract: Researchers at the NASA Classroom of the Future have been using the design experiment framework to conduct evaluations of multimedia curricula. This method stands in contrast to more traditional, controlled experimental methods of evaluating curricular reforms. The methodology presented here is integrated with Walter Doyle's (1983) notion of using academic tasks to describe how classroom activities impact student learning. We will report the results from a design experiment with a multimedia program developed at the NASA Classroom of the Future, and we will examine the methodologies that were used in the evaluation.

Journal Article
TL;DR: It is shown that at least a quarter of the total elapsed time is spent on establishing TCP connections with HTTP/1.0, and that a single stand-alone Squid proxy cache does not always reduce response time for the authors' workloads.
Abstract: It is critical to understand WWW latency in order to design better HTTP protocols. In this study we characterize Web response time and examine the effects of proxy caching, network bandwidth, traffic load, persistent connections for a page, and periodicity. Based on studies with four workloads, we show that at least a quarter of the total elapsed time is spent on establishing TCP connections with HTTP/1.0. The distributions of connection time and elapsed time can be modeled using Pearson, Weibull, or log-logistic distributions. Response times display strong daily and weekly patterns. We also characterize the effect of a user's network bandwidth on response time. Average connection time from a client via a 33.6K modem is two times longer than that from a client via switched Ethernet. We estimate the elapsed time savings from using persistent connections for a page to vary from about a quarter to a half. This study finds that a proxy caching server is sensitive to traffic loads. Contrary to the typical thought about Web proxy caching, this study also finds that a single stand-alone Squid proxy cache does not always reduce response time for our workloads. Implications of these results for future versions of the HTTP protocol and for Web application design are discussed.
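The decomposition of response time into TCP connection set-up and the rest can be reproduced on a small scale as below; the host name is a placeholder, and a real study needs many samples over realistic workloads and careful clock handling.

# Timing TCP connection set-up separately from the rest of an HTTP/1.0-style fetch.
import socket
import time

def fetch_with_timing(host, path="/", port=80):
    t0 = time.perf_counter()
    sock = socket.create_connection((host, port), timeout=10)
    t_connect = time.perf_counter() - t0                     # TCP handshake time
    try:
        request = f"GET {path} HTTP/1.0\r\nHost: {host}\r\n\r\n"
        sock.sendall(request.encode("ascii"))
        while sock.recv(4096):                               # drain the response
            pass
    finally:
        sock.close()
    t_total = time.perf_counter() - t0
    return t_connect, t_total

connect, total = fetch_with_timing("example.org")
print(f"connection: {connect:.3f}s  total: {total:.3f}s  "
      f"connect share: {connect / total:.0%}")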

Journal Article
TL;DR: It is shown that each P-selective set can be accepted by a polynomial-time nondeterministic machine using linear advice and linear nondeterminism.
Abstract: Hemaspaandra and Torenvliet showed that each P-selective set can be accepted by a polynomial-time nondeterministic machine using linear advice and quasilinear nondeterminism. We show that each P-selective set can be accepted by a polynomial-time nondeterministic machine using linear advice and linear nondeterminism.
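For reference, the standard definition of P-selectivity (due to Selman; restated here for orientation, not quoted from the abstract):

\[
  A \text{ is P-selective} \iff \exists f \in \mathrm{FP}\ \forall x, y :\;
  f(x, y) \in \{x, y\}
  \ \text{ and } \
  \bigl( \{x, y\} \cap A \neq \emptyset \Rightarrow f(x, y) \in A \bigr).
\]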

Journal Article
TL;DR: The Special Issue “Recent Progress in Electromagnetic Theory and its Applications” is presented, an outcome of the COST (European Cooperation in Science and Technology) Action TU1208 “Civil engineering applications of Ground Penetrating Radar”.
Abstract: We are very pleased to present the Special Issue “Recent Progress in Electromagnetic Theory and its Applications”, an outcome of the COST (European Cooperation in Science and Technology) Action TU1208 “Civil engineering applications of Ground Penetrating Radar”. The Special Issue comprises two parts: Part I includes eight papers on Ground Penetrating Radar (GPR) technology, methodology and applications; Part II contains six papers dealing with other applications of electromagnetic fields. Overall, the papers are authored by scientists from nineteen institutes in nine countries (Armenia, France, Germany, India, Ireland, Italy, Poland, Russia, and United Kingdom).

Journal Article
TL;DR: All solutions of the nonlinear system of equations describing equilibrium conditions of the "high polymer liquid system", which is a well-known ill-conditioned system of equations, are identified by the method.
Abstract: A linear programming-based method is presented for finding all solutions of nonlinear systems of equations with guaranteed accuracy. In this method, a new effective linear programming-based method is used to delete regions in which solutions do not exist. On the other hand, Krawczyk's method is used to find regions in which solutions exist. As an illustrative example, all solutions of the nonlinear system of equations describing equilibrium conditions of the "high polymer liquid system", which is a well-known ill-conditioned system of equations, are identified by the method.
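The existence test provided by Krawczyk's method can be shown in one dimension as below; the paper applies it to systems of equations together with the LP-based deletion test, which is not reproduced here, and directed rounding is omitted, so this is illustrative rather than rigorous.

# One-dimensional Krawczyk operator: K(X) within X proves a root exists in X.

def imul(a, b):                               # interval product
    ps = [a[0] * b[0], a[0] * b[1], a[1] * b[0], a[1] * b[1]]
    return min(ps), max(ps)

def krawczyk(f, df, dF, X):
    """K(X) = m - c f(m) + (1 - c F'(X)) (X - m), with c = 1/f'(m)."""
    lo, hi = X
    m = (lo + hi) / 2
    c = 1.0 / df(m)                           # preconditioner: inverse of f'(m)
    g_lo, g_hi = dF(X)                        # enclosure of f' over X
    slope = tuple(sorted((1 - c * g_lo, 1 - c * g_hi)))
    p_lo, p_hi = imul(slope, (lo - m, hi - m))
    centre = m - c * f(m)
    return centre + p_lo, centre + p_hi

f  = lambda x: x * x - 2
df = lambda x: 2 * x
dF = lambda X: (2 * X[0], 2 * X[1])           # f' = 2x is increasing on positive boxes

X = (1.0, 2.0)
K = krawczyk(f, df, dF, X)
print(K, "inside X -> a root exists in X:", X[0] <= K[0] and K[1] <= X[1])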

Journal Article
TL;DR: The general type of pseudo-linear interval equations in the extended interval arithmetic is given, and some examples leading to the algebraic solution of the equations under consideration are discussed.
Abstract: The arithmetic on the extended set of proper and improper intervals is an algebraic completion of the conventional interval arithmetic and thus facilitates the explicit solution of certain interval algebraic problems. Due to the existence of inverse elements with respect to the addition and multiplication operations, certain interval algebraic equations can be solved by elementary algebraic transformations. The conditionally distributive relations between extended intervals allow complicated interval algebraic equations, multi-incident on the unknown variable, to be reduced to simpler ones. In this paper we give the general type of pseudo-linear interval equations in the extended interval arithmetic. The algebraic solutions to a pseudo-linear interval equation in one variable are studied. All numeric and parametric algebraic solutions, as well as the conditions for non-existence of the algebraic solution to some basic types of pseudo-linear interval equations in one variable, are found. Some examples leading to the algebraic solution of the equations under consideration, and the extra functionalities for performing true symbolic algebraic manipulations on interval formulae in a Mathematica package, are discussed.
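As a concrete hint at what solving "by elementary algebraic transformations" means, the simplest additive case is worked out below in the usual endpoint notation for extended (Kaucher) intervals, where addition acts endpoint-wise on proper and improper intervals alike; the pseudo-linear equation types classified in the paper are more general.

\[
  A + X = B \;\Longrightarrow\; X = B + \mathrm{opp}(A) = [\, b^- - a^-,\; b^+ - a^+ \,],
  \qquad \mathrm{opp}([a^-, a^+]) = [-a^-, -a^+].
\]
\[
  \text{E.g. } A = [1, 2],\ B = [3, 7] \;\Rightarrow\; X = [2, 5];
  \qquad
  A = [0, 2],\ B = [1, 2] \;\Rightarrow\; X = [1, 0] \text{ (improper).}
\]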