
Showing papers on "Domain (software engineering) published in 1987"


Journal ArticleDOI
TL;DR: Cognitive skills are encoded by a set of productions organized according to a hierarchical goal structure; the theory implies that all varieties of skill acquisition, including those typically regarded as inductive, conform to this characterization.
Abstract: Cognitive skills are encoded by a set of productions, which are organized according to a hierarchical goal structure. People solve problems in new domains by applying weak problem-solving procedures to declarative knowledge they have about the domain. From these initial problem solutions, production rules are compiled which are specific to that domain and that use of the knowledge. Numerous experimental results follow from this conception of skill organization and skill acquisition. These experiments include predictions about transfer among skills, differential improvement on problem types, effects of working memory limitations, and applications to instruction. The theory implies that all varieties of skill acquisition, including those typically regarded as inductive, conform to this characterization. Keywords: Artificial intelligence; Cognitive science; Skill acquisition; Production system; LISP.
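The production-rule organization described above can be caricatured as a toy interpreter; the goal structure and rule contents below are invented for illustration and are not the paper's actual productions:

```python
# Toy production-system sketch (hypothetical rules, not the paper's own).
# Each production pairs a goal with a condition on working memory and an action.
productions = [
    # (goal, condition, action)
    ("add-column", lambda wm: wm.get("carry", 0) > 0, "add-carry-then-write-sum"),
    ("add-column", lambda wm: wm.get("carry", 0) == 0, "write-sum"),
]

def fire(goal, working_memory):
    """Fire the first production whose goal and condition both match."""
    for g, condition, action in productions:
        if g == goal and condition(working_memory):
            return action
    return None  # no domain-specific production applies; fall back to weak methods
```

The `None` branch loosely mirrors the theory's claim that novices fall back on weak, general problem-solving procedures until domain-specific productions have been compiled.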

987 citations


Journal ArticleDOI
TL;DR: An initial domain for defining flexibility is established, specific measures are suggested, and sampling issues are discussed and an agenda for conducting rigorous research into the flexibility of manufacturing processes is outlined.
Abstract: This paper outlines an agenda for conducting rigorous research into the flexibility of manufacturing processes. Its underlying premise is that current efforts are being impeded by the absence of operational measures for the concept. An initial domain for defining flexibility is established, specific measures are suggested, and sampling issues are discussed. There is also a discussion of relevant research problems that can be addressed once valid and reliable operationalisations exist.

465 citations


Journal Article
TL;DR: Access to the journal's archives implies agreement with the general conditions of use (http://www.numdam.org/legal.php).
Abstract: Access to the archives of the journal "Annali della Scuola Normale Superiore di Pisa, Classe di Scienze" (http://www.sns.it/it/edizioni/riviste/annaliscienze/) implies agreement with the general conditions of use (http://www.numdam.org/legal.php). Any commercial use or systematic printing constitutes a criminal offence. Any copy or printing of this file must include this copyright notice.

245 citations


Book ChapterDOI
01 Oct 1987
TL;DR: This paper shows how contexts can be represented using the notion of projection from domain theory, which means that recursive context equations can be solved using standard fixpoint techniques, instead of the algebraic manipulation previously used.
Abstract: Contexts have been proposed as a means of performing strictness analysis on non-flat domains. Roughly speaking, a context describes how much a sub-expression will be evaluated by the surrounding program. This paper shows how contexts can be represented using the notion of projection from domain theory. This is clearer than the previous explanation of contexts in terms of continuations. In addition, this paper describes finite domains of contexts over the non-flat list domain. This means that recursive context equations can be solved using standard fixpoint techniques, instead of the algebraic manipulation previously used.
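Because the context domains are finite, "standard fixpoint techniques" here amounts to Kleene iteration from the bottom element, which is guaranteed to terminate. A minimal sketch, with an invented recursive equation over a finite powerset lattice standing in for a context equation:

```python
def lfp(f, bottom):
    """Least fixpoint of a monotone function f by Kleene iteration;
    terminates because the underlying domain is finite."""
    x = bottom
    while f(x) != x:
        x = f(x)
    return x

# Illustrative recursive equation over a finite powerset lattice:
#   S = {0} ∪ {n + 1 | n ∈ S, n < 3}
step = lambda s: frozenset({0}) | frozenset(n + 1 for n in s if n < 3)
solution = lfp(step, frozenset())  # iterates ∅, {0}, {0,1}, ... until stable
```

Solving a recursive context equation over the finite list-context domain of the paper works the same way: start at the least context and apply the defining equation until nothing changes.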

199 citations


Book
01 Nov 1987
TL;DR: The main purpose of this book is to show how the traditions in foreign language teaching and in curriculum renewal should converge, so that the importance of modern language teaching in the educational domain is fully realized.
Abstract: The main purpose of this book is to show how the traditions in foreign language teaching and in curriculum renewal should converge, so that the importance of modern language teaching in the educational domain is fully realized.

150 citations


Journal ArticleDOI
TL;DR: In this article, the authors investigated whether domain-specific expertise could compensate for low overall aptitude on certain domain-related cognitive processing tasks and found that performance was a function of level of expertise in the domain.
Abstract: Two experiments investigated whether domain-specific expertise could compensate for low overall aptitude on certain domain-related cognitive processing tasks. It was hypothesized that the performance of low-aptitude individuals on a task requiring them to acquire new information in a domain would be a function of domain expertise rather than overall aptitude level. In Experiment 1, two low-aptitude groups, one with high domain knowledge and one with low domain knowledge, were presented with a baseball passage. On both recall and recognition tests, performance was a function of level of expertise in the domain. In Experiment 2, both level of baseball knowledge and overall aptitude were varied in a factorial design. Again, performance was a function of baseball knowledge rather than aptitude level. Low-aptitude/high-knowledge participants recalled more information than high-aptitude/low-knowledge participants. In addition, the performance of the low-aptitude/high-knowledge group was similar to the high-apti...

131 citations


Journal ArticleDOI
TL;DR: Multistep algorithms, introduced in the Soviet literature, are generalized and systematically studied; they are shown to provide significant improvements over the classical (one-step) methods for the purpose of tracking.
Abstract: The design of adaptive algorithms for the purpose of tracking slowly time-varying systems is investigated. A criterion for measuring the tracking capability of an algorithm in this situation was introduced in an earlier work; the domain of validity of this criterion is shown to be much wider than expected before. On the other hand, multistep algorithms, introduced in the Soviet literature, are generalized and systematically studied; they are shown to provide significant improvements over the classical (one-step) methods for the purpose of tracking. Finally, a complete design methodology for adaptive algorithms used on time-varying systems is given.

128 citations


Journal ArticleDOI
01 Jul 1987
TL;DR: A research program is described that explores issues within the context of the Georgia Tech-Multisatellite Operations Control Center (GT-MSOCC), a real-time interactive simulation of the operator interface to a NASA ground control system for unmanned Earth-orbiting satellites.
Abstract: Modeling and aiding operators in supervisory control environments are necessary prerequisites to the effective use of automation in complex dynamic systems. A research program is described that explores these issues within the context of the Georgia Tech-Multisatellite Operations Control Center (GT-MSOCC). GT-MSOCC is a real-time interactive simulation of the operator interface to a NASA ground control system for unmanned Earth-orbiting satellites. GT-MSOCC is a high fidelity domain in which a range of modeling, decision aiding, and workstation design issues addressing human-computer interaction may be explored. GT-MSOCC is described in detail. The use of high-fidelity domains as research tools is also discussed, and the validity and generalizability of such results to other domains are examined. In addition to a description of GT-MSOCC, several other related parts are included. A GT-MSOCC operator function model (OFM) is presented. The GT-MSOCC model illustrates an enhancement to the general operator function modeling methodology that extends the model's utility for design applications. The proposed methodology represents operator actions as the lowest level discrete control network nodes. Actions may be cognitive or manual; operator action nodes are linked to information needs or system reconfiguration commands. Thus augmented, the operator function model provides a formal representation of operator interaction with the controlled system and serves as the foundation for subsequent theoretical and empirical research. A brief overview of experimental and methodological studies using GT-MSOCC is also given.

128 citations


Journal ArticleDOI
TL;DR: The goal is to build cooperative computer systems to augment human intelligence by replacing human-computer communication with human problem-domain communication, which allows users to concentrate on the problems of their domain and to ignore the fact that they are using a computer tool.
Abstract: Our goal is to build cooperative computer systems to augment human intelligence. In these systems, the communication between the user and the computer plays a crucial role. To provide the user with the appropriate level of control and a better understanding, we have to replace human-computer communication with human problem-domain communication, which allows users to concentrate on the problems of their domain and to ignore the fact that they are using a computer tool. Construction kits and design environments are tools that represent steps toward human problem-domain communication. A construction kit is a set of building blocks that models a problem domain. The building blocks define a design space (the set of all possible designs that can be created by combining these blocks). Design environments go beyond construction kits in that they bring to bear general knowledge about design (e.g., which meaningful artifacts can be constructed, how and which blocks can be combined with each other) that is useful for the designer. Prototypical examples of these systems (especially in the area of user interface design) are described in detail, and the feasibility of this approach is evaluated.

117 citations


Proceedings Article
01 Oct 1987
TL;DR: In this article, a genetic algorithm is adapted to manipulate Lisp S-expressions and the traditional genetic operators of crossover, inversion, and mutation are modified for the Lisp domain.
Abstract: The genetic algorithm is adapted to manipulate Lisp S-expressions. The traditional genetic operators of crossover, inversion, and mutation are modified for the Lisp domain. The process is tested using the Prisoner's Dilemma. The genetic algorithm produces solutions to the Prisoner's Dilemma as Lisp S-expressions and these results are compared to other published solutions.
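The adaptation of crossover to the Lisp domain amounts to swapping randomly chosen subtrees between two S-expressions. A rough sketch representing S-expressions as nested Python lists (the representation and helper names are my own, not the paper's):

```python
import random

def subtrees(expr, path=()):
    """Yield (path, subtree) pairs; children of a list follow its operator head."""
    yield path, expr
    if isinstance(expr, list):
        for i in range(1, len(expr)):
            yield from subtrees(expr[i], path + (i,))

def replace(expr, path, new):
    """Return a copy of expr with the subtree at path replaced by new."""
    if not path:
        return new
    i = path[0]
    return expr[:i] + [replace(expr[i], path[1:], new)] + expr[i + 1:]

def crossover(a, b, rng=random):
    """Swap one randomly chosen subtree of a with one of b."""
    pa, sa = rng.choice(list(subtrees(a)))
    pb, sb = rng.choice(list(subtrees(b)))
    return replace(a, pa, sb), replace(b, pb, sa)
```

Because any subtree of an S-expression is itself a valid S-expression, the offspring remain syntactically valid, which is what makes the Lisp representation attractive for genetic manipulation.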

108 citations


Proceedings ArticleDOI
01 Mar 1987
TL;DR: A fundamental issue is raised that follows from a comparison of the respective approaches to process modelling taken by Osterweil and the author, reflecting the author's current understanding of Osterweil's views on Process Programs and Process Programming and his reaction to what he believes Osterweil will present.
Abstract: One way of responding to a keynote speaker is to put the expressed views into context, pointing to highlights in the address, suggesting areas where alternative viewpoints might have been presented, exposing any chinks in the armour of the otherwise solid structure erected by the speaker. Logistics have made it impossible for this respondent to see the paper to be presented to ICSE9 by Professor L. Osterweil before generating his own written response. The above approach cannot, therefore, be taken. Instead, I raise a fundamental issue that follows from a comparison of the respective approaches to process modelling taken by Osterweil and myself. What is expressed here reflects my current understanding of his views on Process Programs and Process Programming, my reaction to what I believe he will present. I can only hope that this will not do too much violence to the views to be expressed in his Proceedings paper or in the Keynote lecture itself. To set the scene and to provide a basis and framework for discussion, let me first summarize my view of studies of the software development process in terms of my own involvement in them. To the best of my knowledge, the first such study was a 1956 paper by Benington [BEN56]. In this, a process model with the basic characteristics of that subsequently termed the 'Waterfall Model' was first presented. Current interest in the software development process makes it most appropriate that this historic paper is to be re-presented at this conference. In 1968/9, totally unaware of the earlier paper, I engaged in a study whose conclusions were presented in a confidential report entitled 'The Programming Process' [LEH69]. This has now become available in the open literature [LEH85, chapter 3] and is, I believe, as relevant today as at the time it was written.
It was this study and the continuing research it triggered that subsequently led my colleagues and me to the concepts of process models, evolution dynamics, program evolution and support environments. Our earliest process models reflected the dynamics of the process [LEH85, chs. 5-9, 14, 16, 19]. By the mid-1970s, at about the time that Barry Boehm [BOE76] popularized the Waterfall model first proposed by Royce [ROY70], my studies had led to a search for a better understanding of the total process of software development. This total process was seen as extending from initial verbalization of the problem to be solved or computer application to be implemented, through delivery of the product and over its subsequent evolution. The search was expressed through the development and refinement of a sequence of Process Models [LEH85, chs. 3, 7, 14, 20, 21, 2]. It was directed towards first formulating a model of an ideal process ('ideal', though unachievable, in the sense of the 'ideal' cycle of thermodynamics). Such a model would constitute a general paradigm. A practical process would be obtained by instantiation in terms of relevant concepts, available technologies, specific implementation environments, process constraints and so on. This development of process models culminated in the LST model [LEH84] and its subsequent analysis and application as presented at the first two Process Workshops [SPW84, 86]. The importance of that model is not only in the process it depicts. It is a canonical model of software development and of development steps. What has all this to do with process programs? Process programs, as described by Osterweil, are also process models. They are models constructed from linguistic elements expressed and structured in programmatic form. They are intended to define a procedure for achieving some desired end from an initial starting point, and are expressed in terms of expressions in a natural or formal language.
The procedure is implemented by executing the primitive actions named in the program. A process program to describe a process that, if followed, will permit execution of some specific task in its environment can be systematically developed, top-down, in a manner equivalent to top-down development of a procedural program. The Osterweil approach is essentially equivalent, in the context of process modelling, to the use of procedural programming (in contrast to styles such as functional, imperative and so on). Its power is defined by the properties of the language used in relation to available execution mechanisms. In fact, a process program is precisely that: a procedural program whose value depends on the constructability of a mechanism that can execute it mechanically, human intervention being restricted primarily to the provision of information. This is a view that Osterweil will not dispute; in the papers that I have seen, the algorithmic nature of process programs is repeatedly stressed. And therein lies the rub. The approach is fine, almost certainly useful, when comprehensive models of the phenomenon, the domain and the system that are the subject of the program are known and understood, when strategies and algorithms for achieving the desired ends are known a priori, and when computational, managerial and administrative practices are fully defined. It is useless, indeed meaningless, if such phenomenological and algorithmic models do not exist [TUR86], if progress in definition (and execution) of the process is a function of the process itself.

Journal ArticleDOI
TL;DR: This paper describes a formalism, which is based on the First Order Predicate Calculus, for representing the knowledge structure associated with a domain and develops a theory of Constrained Semantic Transference which allows the terms and the structural relationships of the source domain to be transferred coherently across to the target domain.

Proceedings Article
23 Aug 1987
TL;DR: Research is directed towards the construction of a system that will detect and correct problems with domain theories that will enable knowledge-based systems to operate with imperfect domain theories and automatically correct the imperfections whenever they pose problems.
Abstract: In recent years knowledge-based techniques like explanation-based learning, qualitative reasoning and case-based reasoning have been gaining considerable popularity in AI. Such knowledge-based methods face two difficult problems: 1) the performance of the system is fundamentally limited by the knowledge initially encoded into its domain theory; 2) the encoding of just the right knowledge to enable the system to function properly over a wide range of tasks and situations is virtually impossible for a complex domain. This paper describes research directed towards the construction of a system that will detect and correct problems with domain theories. This will enable knowledge-based systems to operate with imperfect domain theories and automatically correct the imperfections whenever they pose problems. This paper discusses the classification of imperfect theory problems, strategies for their detection and an approach based on experiment design to handle different types of imperfect theory problems.

Proceedings ArticleDOI
01 Dec 1987
TL;DR: This paper discusses the organization of a case law knowledge base in terms of three interacting components: a domain knowledge model defines the basic concepts of a case law domain; individual case descriptors describe the particular constellation of concepts that pertain to each case, organized into a frame-based superstructure according to the legal roles they fill.
Abstract: Conceptual retrieval requires the computer to have knowledge of legal concepts and issues, and their relationship to the case law collection. This paper discusses the organization of a case law knowledge base in terms of three interacting components: a domain knowledge model defines the basic concepts of a case law domain; individual case descriptors describe the particular constellation of concepts that pertain to each case, organized into a frame-based superstructure according to the legal roles they fill; and issue/case discrimination trees represent the significance of each case relative to a model of the normative relationships of the legal domain. Each of these components is described and justified by showing its contribution to the goal of conceptual retrieval.

Journal ArticleDOI
TL;DR: This research demonstrates that implementation independent domain modelling is feasible and useful within the context of a methodology which aims at supporting good software engineering principles as applied to Expert Systems.
Abstract: Traditional approaches to Expert Systems development have emphasized the use of exploratory programming techniques and moving as quickly as possible from conceptualisation to code. Our research demonstrates that implementation independent domain modelling is feasible and useful within the context of a methodology which aims at supporting good software engineering principles as applied to Expert Systems.

Book ChapterDOI
01 Jan 1987
TL;DR: In this chapter, techniques for discovering organization in an expert’s domain knowledge are described in the domain of “domestic gas-fired hot water and central heating systems,” which possesses technical properties seen as relevant to larger-scale domains.
Abstract: In this chapter we describe techniques for discovering organization in an expert’s domain knowledge. These techniques are illustrated in the domain of “domestic gas-fired hot water and central heating systems,” which possesses technical properties seen as relevant to larger-scale domains. The informant was not a recognized expert on central heating but a scientist with an interest in the domain.

Journal ArticleDOI
TL;DR: In this paper, the authors propose a model constructor that can be used to synthesize models that explain and generate expert behavior in the problem domain of business resource planning, where reasoning is based on models that evolve in response to changing external conditions or internal policies.
Abstract: Flexible representations are required in order to understand and generate expert behavior. Although production rules with quantifiers can encode experiential knowledge, they often have assumptions implicit in them, making them brittle in problem scenarios where these assumptions do not hold. Qualitative models achieve flexibility by representing the domain entities and their interrelationships explicitly. However, in problem domains where assumptions underlying such models change periodically, it is necessary to be able to synthesize and maintain qualitative models in response to the changing assumptions. In this paper we argue for a representation that contains partial model components that are synthesized into qualitative models containing entities and relationships relevant to the domain. The model components can be replaced and rearranged in response to changes in the task environment. We have found this "model constructor" to be useful in synthesizing models that explain and generate expert behavior, and have explored its ability to support decision making in the problem domain of business resource planning, where reasoning is based on models that evolve in response to changing external conditions or internal policies.

Patent
05 Oct 1987
TL;DR: In this article, two RS decoders, one a time domain decoder and the other a transform domain decoder, use the same first part to develop an errata locator polynomial τ(x) and an errata evaluator polynomial A(x).
Abstract: Two pipeline (255,223) RS decoders, one a time domain decoder and the other a transform domain decoder, use the same first part to develop an errata locator polynomial τ(x) and an errata evaluator polynomial A(x). Both the time domain decoder and the transform domain decoder have a modified GCD that uses an input multiplexer and an output demultiplexer to reduce the number of GCD cells required. The time domain decoder uses a Chien search and polynomial evaluator on the GCD outputs τ(x) and A(x) for the final decoding steps, while the transform domain decoder uses a transform error pattern algorithm operating on τ(x) and the initial syndrome computation S(x), followed by an inverse transform algorithm in sequence for the final decoding steps prior to adding the received RS coded message to produce a decoded output message.

01 Nov 1987
TL;DR: This RFC provides guidelines for domain administrators in operating a domain server and maintaining their portion of the hierarchical database.
Abstract: This RFC provides guidelines for domain administrators in operating a domain server and maintaining their portion of the hierarchical database. Familiarity with the domain system is assumed (see RFCs 1031, 1032, 1034, and 1035).
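For concreteness, the "portion of the hierarchical database" a domain administrator maintains might look like the following master-file fragment; the names, addresses, and timer values below are invented for illustration, with the syntax following RFC 1035:

```
; Hypothetical zone file for an example second-level domain.
$ORIGIN example.com.
@       IN  SOA  ns1.example.com. hostmaster.example.com. (
                 1987110101 ; serial
                 86400      ; refresh (1 day)
                 7200       ; retry (2 hours)
                 2592000    ; expire (30 days)
                 86400 )    ; minimum TTL
        IN  NS   ns1.example.com.
        IN  NS   ns2.example.com.
ns1     IN  A    192.0.2.1
ns2     IN  A    192.0.2.2
host1   IN  A    192.0.2.10
```

Keeping such data accurate, bumping the serial on every change, and arranging a secondary server to match is the day-to-day substance of the guidelines the RFC describes.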

Journal ArticleDOI
TL;DR: The concepts which allow an expert system to be used for both design diagnosis and design synthesis are described and an example of the implementation of these concepts is presented in the domain of preliminary design of domestic kitchens in the expert system PREDIKT.
Abstract: This paper describes the concepts which allow an expert system to be used for both design diagnosis and design synthesis. An example of the implementation of these concepts is presented in the domain of preliminary design of domestic kitchens in the expert system PREDIKT. PREDIKT carries out both design diagnosis and design synthesis using the same knowledge base, and utilises an existing expert system shell which has forward- and backward-chaining capabilities. The significance of graphical interaction with expert systems in design domains is demonstrated.

01 Jan 1987
TL;DR: Finding the optimal mapping is an intractable combinatorial optimization problem, for which a satisfactory approximate solution is obtained here by analogy to a method used in statistical mechanics for simulating the annealing process in solids.
Abstract: Mapping the solution domain of n finite elements into N subdomains that may be processed in parallel by N processors is optimal if the subdomain decomposition results in a well-balanced workload distribution among the processors. The problem is discussed in the context of irregular finite element domains as an important aspect of the efficient utilization of the capabilities of emerging multiprocessor computers. Finding the optimal mapping is an intractable combinatorial optimization problem, for which a satisfactory approximate solution is obtained here by analogy to a method used in statistical mechanics for simulating the annealing process in solids. The simulated annealing analogy and algorithm are described, and numerical results are given for mapping an irregular two-dimensional finite element domain containing a singularity onto the Hypercube computer.
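The annealing idea can be illustrated on a stripped-down version of the problem. The sketch below only balances processor loads (minimizing the sum of squared loads); the paper's actual cost function also accounts for communication between adjacent subdomains, which is omitted here, and all parameter values are invented:

```python
import math
import random

def anneal_partition(weights, n_procs, steps=20000, t0=1.0, alpha=0.999, seed=0):
    """Assign weighted elements to processors by simulated annealing,
    minimizing load imbalance measured as the sum of squared loads."""
    rng = random.Random(seed)
    assign = [rng.randrange(n_procs) for _ in weights]
    load = [0.0] * n_procs
    for w, p in zip(weights, assign):
        load[p] += w
    t = t0
    for _ in range(steps):
        i = rng.randrange(len(weights))
        old, new = assign[i], rng.randrange(n_procs)
        if new == old:
            continue
        w = weights[i]
        # Change in sum of squared loads if element i moves old -> new.
        delta = ((load[new] + w) ** 2 - load[new] ** 2
                 + (load[old] - w) ** 2 - load[old] ** 2)
        # Accept improvements always; accept worsenings with Boltzmann probability.
        if delta < 0 or rng.random() < math.exp(-delta / t):
            assign[i] = new
            load[old] -= w
            load[new] += w
        t *= alpha  # cooling schedule
    return assign, load
```

As the temperature decays the walk becomes effectively greedy, freezing into a near-balanced assignment rather than getting trapped by an early bad move.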

Journal ArticleDOI
TL;DR: In this article, it is argued that current theories of the domain of liaison are inadequate and that 'domain' is a variable intralinguistic constraint on the (variable) rule of liaison.
Abstract: In this paper it is argued that current theories of the domain of liaison are inadequate. It is shown that 'domain' is a variable intralinguistic constraint on the (variable) rule of liaison. The variable nature of liaison is also apparent from the fact that its rate of application covaries with extralinguistic factors such as style and social class. These results also show that rules of sentence phonology should be based on reliable corpora of spoken language.

Journal ArticleDOI
TL;DR: A rule-based approach to develop a financial marketing system proved inadequate as a vehicle for "scaling up" to a production system providing all the capabilities required.
Abstract: Financial marketing represents a new and challenging domain for expert system applications. The term financial marketing denotes financial decision processes used in marketing products and services of such a scale as to significantly impact a company's financial status. Customers interested in major computing technology acquisitions usually want financing plans to be safe, sound, and attractive. When making large equipment proposals, financial considerations often become as important as computing considerations. Typically, no "best" answer exists when developing appropriate acquisition plans that satisfy customer capacity requirements while meeting applicable financial objectives. The question is not simply to optimize some financial objective. Of necessity, any proposed solution will be the result of balancing competing goals. Furthermore, it would be insufficient to merely propose a plan; rather, we need explanations persuading decision makers that our proposals are reasonable: convincing financial explanations that "sell" our solution (specifically, an acquisition and financing plan). Consider the following simple example: A customer currently has an IBM 3081 KX3 processor installed, and is close to exceeding that machine's capacity. He can select a larger processor from the same product family (such as an IBM 3084 QX6), or one from a product family representing newer technology (such as an IBM 3090 200). The 3081 KX3 is still on lease, so lease termination penalties apply. The leasing firm offers a no-penalty lease termination if the 3081 KX3 is replaced by a 3084 QX6 leased from the same firm. On the other hand, the 3090 200 has higher residual value plus other financial incentives. In such a scenario, it is important to know who will make the acquisition decision and whether that person is measured on budget, earnings per share, or some other financial criterion.
An earlier article described a rule-based approach to develop a financial marketing system. While that effort demonstrated the feasibility of a knowledge-based approach, it proved inadequate as a vehicle for "scaling up" to a production system providing all the capabilities required. Financial marketing domain knowledge is considerable, but structured. Representing it in that initial version required many rules, but hid much

Proceedings Article
01 Sep 1987
TL;DR: An improved algorithm, the RQA/FQI strategy, is proposed which is complete over the domain of function-free Horn clauses and uses a two-step approach (recursive expansion plus an efficient variant of LFP iteration) to evaluate recursive queries.
Abstract: In this paper we will discuss several methods for recursive query processing using a recursive control structure. We will describe the QSQR method, introduced in [Vie86], and show that it fails to produce all answers in certain cases. After analyzing the causes of this failure, we propose an improved algorithm, the RQA/FQI strategy, which is complete over the domain of function-free Horn clauses. The new method uses a two-step approach (recursive expansion + an efficient variant of LFP iteration) to evaluate recursive queries. A short comparison of these methods shows the efficiency of RQA/FQI.
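The "efficient variant of LFP iteration" mentioned here is in the spirit of semi-naive fixpoint evaluation, which joins each round only against the newly derived tuples. A minimal illustration on the classic ancestor query (a generic sketch, not the RQA/FQI algorithm itself):

```python
def transitive_closure(par):
    """Semi-naive least-fixpoint evaluation of:
       anc(X,Y) :- par(X,Y).
       anc(X,Y) :- par(X,Z), anc(Z,Y).
    Each round joins par only against delta, the tuples new last round."""
    anc = set(par)
    delta = set(par)
    while delta:
        derived = {(x, y) for (x, z) in par for (z2, y) in delta if z == z2}
        delta = derived - anc  # keep only genuinely new facts
        anc |= delta
    return anc
```

Restricting the join to `delta` avoids rederiving the same tuples every iteration, which is the efficiency point of LFP variants over the naive fixpoint.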

Proceedings Article
13 Jul 1987
TL;DR: It is shown how localized domain descriptions can alleviate aspects of the frame problem and serve as the foundation of a planning technique based on localized planning spaces because domain constraints and properties are localized.
Abstract: This paper presents a general method for structuring domains that is based on the notion of locality. We consider a localized domain description to be one that is partitioned into regions of activity, each of which has some independent significance. The use of locality can be very beneficial for domain representation and reasoning, especially for parallel, multiagent domains. We show how localized domain descriptions can alleviate aspects of the frame problem and serve as the foundation of a planning technique based on localized planning spaces. Because domain constraints and properties are localized, potential interactions among these search spaces are fewer and more easily identified.

Book ChapterDOI
TL;DR: This work proposes a new approach to synthesis of engineering systems called EDESYN (Engineering DEsign SYNthesis), developed using a frame-based reasoning tool, which provides a domain independent environment for building design expert systems.
Abstract: The knowledge used in the synthesis of engineering systems includes understanding systems and their components and the implications of design decisions on other decisions and on further problem decomposition. In using a knowledge based approach to synthesis, design knowledge can be characterized as planning knowledge, including design goals and planning or ordering of goals, and design knowledge, including alternative solutions for each goal and constraints on the selection of a solution for a given goal. This approach is implemented in a program called EDESYN (Engineering DEsign SYNthesis), developed using a frame-based reasoning tool. EDESYN provides a domain independent environment for building design expert systems.

Proceedings Article
13 Jul 1987
TL;DR: The Machinist program is modeled after the behavior of human machinists, and makes plans for fabricating metal parts using machine tools; it contains about 180 OPS5 rules and has been judged to make plans that are, on the average, better than those of a 5-year journeyman.
Abstract: The Machinist program extends domain-dependent planning technology. It is modeled after the behavior of human machinists, and makes plans for fabricating metal parts using machine tools. Many existing planning programs rely on a problem-solving strategy that involves fixing problems in plans only after they occur. The result is that planning time may be wasted when a bad plan is unnecessarily generated and must be thrown out or modified. The Machinist program improves on these methods by looking for cues in the problem specification that may indicate potential difficulties or conflicting goal interactions, before generating any plans. It plans around those difficulties, greatly increasing the probability of producing a good plan on the first try. Planning efficiency is greatly increased when false starts can be eliminated. The Machinist program contains about 180 OPS5 rules, and has been judged by experienced machinists to make plans that are, on the average, better than those of a 5-year journeyman. The knowledge that makes the technique effective is domain dependent, but the technique itself can be used in other domains.

Journal ArticleDOI
01 Aug 1987
TL;DR: Various definitions of task-space tolerances, using geometric pseudoenvelopes, are introduced, and new approaches to "direct" and "inverse" joint-error analysis of robots are formulated, leading to a "feasible joint-tolerance domain" concept for use in computer-aided design of robots.
Abstract: Various definitions of task-space tolerances, using geometric pseudoenvelopes, are introduced, and new approaches to "direct" and "inverse" joint-error analysis of robots are formulated. These error analyses lead to the development of a "feasible joint-tolerance domain" concept for use in computer-aided design of robots.

Journal ArticleDOI
TL;DR: A review of the domains in KH2PO4 crystal is given in this paper, which includes a description of the domain and cell orientations correlated to the texture, the existence of quasidislocations and their role at the domain tips, and a study of lateral displacements of the walls and domain freezing.
Abstract: A review of the domains in KH2PO4 crystal is given. This includes a description of the domains and the cell orientations correlated to the texture, the existence of quasidislocations and their role at the domain tips, and a study of lateral displacements of the walls and domain freezing.

01 Nov 1987
TL;DR: This memo describes procedures for registering a domain with the Network Information Center (NIC) of Defense Data Network (DDN), and offers guidelines on the establishment and administration of a domain in accordance with the requirements specified in RFC-920.
Abstract: Domains are administrative entities that provide decentralized management of host naming and addressing. The domain-naming system is distributed and hierarchical. This memo describes procedures for registering a domain with the Network Information Center (NIC) of the Defense Data Network (DDN), and offers guidelines on the establishment and administration of a domain in accordance with the requirements specified in RFC-920. It is recommended that the guidelines described in this document be used by domain administrators in the establishment and control of second-level domains. The role of the domain administrator (DA) is that of coordinator, manager, and technician. If his domain is established at the second level or lower in the tree, the domain administrator must register by interacting with the management of the domain directly above his.