
Showing papers in "International Journal of Human-Computer Studies / International Journal of Man-Machine Studies in 1981"


Journal ArticleDOI
Thomas P. Moran
TL;DR: This article introduces and discusses a specific grammatical structure—the Command Language Grammar (CLG)—as a representational framework for describing the user interface aspects of interactive computer systems.
Abstract: This article introduces and discusses a specific grammatical structure—the Command Language Grammar (CLG)—as a representational framework for describing the user interface aspects of interactive computer systems. CLG partitions a system into a Conceptual Component (tasks and abstract concepts), a Communication Component (command language), and a Physical Component (display, keyboard, etc.). The components are further stratified into distinct Levels—a Task Level, a Semantic Level, a Syntactic Level, and an Interaction Level, each Level being a complete description of the system at its level of abstraction. Each Level's description contains procedures for accomplishing the tasks addressed by the system in terms of the actions available at that Level. That is, the system is described by progressive refinement. An extensive example, a small message-processing system, is described at all Levels in the CLG notation. CLG is discussed from three points of view: the Linguistic View sees CLG as elaborating the structure of the system's user interface and of the communication between the user and the system. The principal goal of CLG in this view is to lay out the space of command language systems. The Psychological View sees CLG as describing the user's mental model of the system. The main concern in this view is with the psychological validity of the CLG description. The Design View sees CLG as a series of representations for specifying the design of a system. CLG proposes a top-down design process in which the conceptual model of the system is first specified and then a command language is created to communicate with it.
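To make the idea of stratified, progressively refined descriptions concrete, here is a minimal sketch in Python, assuming hypothetical level names and a toy message-processing task; it mimics only the partitioning the abstract describes, not Moran's actual CLG notation.

```python
from dataclasses import dataclass, field

# A minimal sketch of a CLG-style stratified description (hypothetical names,
# not Moran's notation): each Level is a complete description of the same
# system at its own grain of abstraction, and each task has a procedure
# expressed in that Level's actions -- i.e. progressive refinement.

@dataclass
class Level:
    name: str
    actions: list[str] = field(default_factory=list)
    procedures: dict[str, list[str]] = field(default_factory=dict)  # task -> action sequence

message_system = [
    Level("Task",        ["dispose of incoming mail"],
          {"handle mail": ["dispose of incoming mail"]}),
    Level("Semantic",    ["read message", "file message"],
          {"handle mail": ["read message", "file message"]}),
    Level("Syntactic",   ["READ <id>", "FILE <id> <folder>"],
          {"handle mail": ["READ 12", "FILE 12 archive"]}),
    Level("Interaction", ["type characters", "press RETURN"],
          {"handle mail": ["type 'READ 12'", "press RETURN",
                           "type 'FILE 12 archive'", "press RETURN"]}),
]

for level in message_system:
    print(level.name, "->", level.procedures["handle mail"])
```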

542 citations


Journal ArticleDOI
TL;DR: An extension and refinement of the author's theory for human visual information processing is presented, which is then applied to the problem of human facial recognition, and pertains to Gestalt recognition of any class of familiar objects or scenes.
Abstract: This paper presents an extension and refinement of the author's theory for human visual information processing, which is then applied to the problem of human facial recognition. Several fundamental processes are implicated: encoding of visual images into neural patterns, detection of simple facial features, size standardization, reduction of the neural patterns in dimensionality, and finally correlation of the resulting sequence of patterns with all visual patterns already stored in memory. In the theory presented here, this entire process is automatically "driven" by the storage system in what amounts to an hypothesis verification paradigm. Neural networks for carrying out these processes are presented and syndromes resulting from damage to the proposed system are analyzed. A correspondence between system component and brain anatomy is suggested, with particular emphasis on the role of the primary visual cortex in this process. The correspondence is supported by structural and electrophysiological properties of the primary visual cortex and other related structures. The logical (computational) role suggested for the primary visual cortex has several components: size standardization, size reduction, and object extraction. The result of processing by the primary visual cortex, it is suggested, is a neural encoding of the visual pattern at a size suitable for storage. (In this context, object extraction is the isolation of regions in the visual field having the same color, texture, or spatial extent.) It is shown in detail how the topology of the mapping from retina to cortex, the connections between retina, lateral geniculate bodies and primary visual cortex, and the local structure of the cortex itself may combine to encode the visual patterns. Aspects of this theory are illustrated graphically with human faces as the primary stimulus. However, the theory is not limited to facial recognition but pertains to Gestalt recognition of any class of familiar objects or scenes.

213 citations


Journal ArticleDOI
TL;DR: Three designs of pocket calculator are analysed for the models implied by their behaviour and three applications of mapping models are discussed, the analysis in each case yielding empirically testable consequences for the user's behaviour.
Abstract: For an interactive device to be satisfactory, its intended users must be able to form a “conceptual model” of the device which can guide their actions and help them interpret its behaviour. Three designs of pocket calculator are analysed for the models implied by their behaviour. Two different kinds of model are illustrated, oriented primarily to answering different questions about the calculator's behaviour. Implied register models, aimed at predicting how the calculator will respond to a given sequence of inputs, provide a simple “cover story” for how the calculator works. Task/action mapping models, aimed more at deriving an appropriate input sequence to achieve a given task, focus on the relations between the actions a user performs and the task the calculator carries out. The “core” of these relations acts as the conceptual model. Complexities in the model, for example, give rise to corresponding difficulties in the use of the calculator. Three applications of mapping models are discussed, the analysis in each case yielding empirically testable consequences for the user's behaviour.
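An implied register model of this kind can be made concrete as a tiny simulator that predicts the display for a given key sequence; the single-accumulator design below is a hypothetical illustration, not one of the three calculators analysed in the paper.

```python
# A toy "implied register model": one accumulator and one pending operator.
# The design is hypothetical and serves only to show how such a model predicts
# the calculator's response to an input sequence; it is not one of the three
# calculators analysed in the paper.

def predict_display(keys):
    """Return the predicted display contents after each key press."""
    ops = {"+": lambda a, b: a + b, "-": lambda a, b: a - b,
           "*": lambda a, b: a * b, "/": lambda a, b: a / b}
    acc, pending, entry, shown = 0.0, None, "", []

    def commit():
        nonlocal acc, entry, pending
        value = float(entry) if entry else acc
        acc = ops[pending](acc, value) if pending else value
        entry, pending = "", None

    for key in keys:
        if key.isdigit() or key == ".":
            entry += key
            shown.append(entry)
        elif key in ops:
            commit()
            pending = key
            shown.append(str(acc))
        elif key == "=":
            commit()
            shown.append(str(acc))
    return shown

print(predict_display("12+3="))   # ['1', '12', '12.0', '3', '15.0']
```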

150 citations


Journal ArticleDOI
Ellen Hisdal
TL;DR: It is concluded that increased fuzziness in a description means increased ability to handle inexact information in a logically correct manner.
Abstract: A new fuzzy relation which represents an IF THEN ELSE (abbreviated to “ITE”) statement is constructed. It is shown that a relation which (a) always gives correct inference and (b) does not contain false information which is not present in the ITE statement, must be of a higher degree of fuzziness than the antecedents and consequents of the ITE statement. Three different ways of increasing the fuzziness of the relation are used here: (1) the fuzzy relation is of higher type; (2) it is interval-valued; (3) it contains a BLANK or “don't know” component. These three types of fuzziness come about naturally because the relation is a restriction of an initial relation which represents the second approximation to the state of complete ignorance. There exist successive approximations to the state of complete ignorance, each of them being an interval-valued fuzzy set BLANK of one type higher than the previous one. Similar representations of the zeroth and first approximation to the state of ignorance have been used in the theory of probability, though in a rather heuristic fashion. The assignment of a value to a variable is represented as a complete restriction of the BLANK state of type N; the “value” being any pure (non-interval-valued) fuzzy set of type N. With the new relation, the inferred set is a superposition of the consequents of the ITE statement and of the BLANK state, each of these components being multiplied by an interval-valued coefficient. In the case of modus ponens inference, the component with the highest coefficient (determined from a specially defined ordering relation for interval-values) is always the correct consequent, provided that the original ITE statement is logically consistent. A mathematical test for logical consistency of the (possibly fuzzy) ITE statement is given. Disjointness of fuzzy sets is defined and connected up with logical consistency. The paradoxes of the implication of mathematical logic disappear in the fuzzy set treatment of the ITE statement. Subnormal fuzzy singletons find their natural interpretation. When used as an antecedent in an ITE statement, such a singleton does not have enough strength to induce the consequent with complete certainty. Instead it induces a superposition of an interval-valued fuzzy set and the BLANK state. New definitions for union, intersection and complementation of fuzzy sets of higher type are suggested. A new interpretation of an interval-valued fuzzy set of type N as a collection of fuzzy sets of type N, not as a type N+1 set, is given. There exist two types of union, intersection and complementation for interval-valued fuzzy sets. They are called the fuzzy and the crisp operations, respectively. It is suggested that the negation be represented by an interval-valued fuzzy set. We conclude that increased fuzziness in a description means increased ability to handle inexact information in a logically correct manner.
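For orientation, the ordinary type-1 compositional rule of inference that this construction generalizes can be sketched as follows; this is the standard max-min relation, shown only as a baseline, and deliberately lacks the higher-type, interval-valued and BLANK components the paper introduces.

```python
# Standard (type-1) fuzzy modus ponens by max-min composition:
#   R(x, y) = min(A(x), B(y))            -- one common choice of relation
#   B'(y)   = max_x min(A'(x), R(x, y))  -- compositional rule of inference
# This is only the familiar baseline; Hisdal's relation is of higher type,
# interval-valued, and carries a BLANK ("don't know") component.

def infer(A, B, A_prime):
    """A, B, A_prime map universe elements to membership grades in [0, 1]."""
    return {
        y: max(min(A_prime[x], min(A[x], B[y])) for x in A)
        for y in B
    }

# "IF temperature is high THEN valve is open"; observed: temperature is fairly high.
high        = {"20C": 0.0, "30C": 0.6, "40C": 1.0}
open_valve  = {"shut": 0.0, "half": 0.5, "full": 1.0}
fairly_high = {"20C": 0.1, "30C": 0.8, "40C": 0.9}

print(infer(high, open_valve, fairly_high))
# {'shut': 0.0, 'half': 0.5, 'full': 0.9}
```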

135 citations


Journal ArticleDOI
Philip J. Barnard, N.V. Hammond, John Morton, John Long, I.A. Clark
TL;DR: A methodological approach to this fusion is outlined as a background for three studies of structured human-computer dialogue involving a task in which secret messages were decoded in a number of discrete steps corresponding to computer commands.
Abstract: To tackle problems of human-computer interaction the traditional scope of human-machine studies needs extending to include the complex cognitive skills of understanding, communication and problem solving. This extension requires a fusion of the conceptual and empirical tools of human factors with those of cognitive psychology. A methodological approach to this fusion is outlined as a background for three studies of structured human-computer dialogue. The studies involved a task in which secret messages were decoded in a number of discrete steps corresponding to computer commands. Each “command” required two numeric arguments. The study investigated underlying variables using questionnaire techniques in addition to user performance in an interactive version of the task. Three factors concerning the order of arguments in a command string were investigated: the consistent positioning of a recurrent argument, the relationship between argument entry order and their order in natural language, and the relationship between argument entry order and the position of argument values on a VDU. In Study I software specialists were asked to design command structures for the task and to give reasons for their choices. In Study II naive subjects were asked to choose between telegrams in which alternative argument orders were expressed in terms of alternative word orders. In the interactive version of the task, used in Study III, positionally consistent systems were most readily learned, but this depended on having the recurrent argument in the first position. With positionally inconsistent systems there were reliable effects due to the position of the direct object of individual command verbs.

107 citations


Journal ArticleDOI
TL;DR: This paper expresses the view that the programming of interactive dialogue is an important technology for software engineering in its own right; it develops systematically a set of “rules” for dialogue programming and discusses them in terms of user psychology.
Abstract: The availability of low-cost interactive computing for commercial applications makes it attractive to give end-users direct access to computers in a “conversational” mode. The need to minimize typing by the user leads to conversational dialogues consisting of a series of “prompts” and responses. Such dialogue sequences are now being programmed into a wide range of systems, but there are currently available only limited guidelines on appropriate programming techniques. This paper expresses the view that the programming of interactive dialogue is an important technology for software engineering in its own right. It develops systematically a set of “rules” for dialogue programming and discusses them in terms of user psychology. It is hoped that this definitive approach will lead to additions, extensions, and refinements to these rules, eventually generating a recognized technology for dialogue engineering.
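The flavour of such prompt-and-response programming can be sketched as below. The particular rules embodied (validate each reply, re-prompt with help after an error, confirm an irreversible action) are generic illustrations of the genre, not the specific numbered rules proposed in the paper.

```python
# A minimal prompt/response dialogue loop illustrating a few generic
# dialogue-programming rules: validate every reply, re-prompt with help on
# error, and confirm destructive actions. These particular rules are
# illustrative; they are not quoted from the paper.

def ask(prompt, is_valid, help_text):
    """Prompt until `is_valid(reply)` holds; show help after a bad reply."""
    while True:
        reply = input(f"{prompt} ").strip().upper()
        if is_valid(reply):
            return reply
        print(f"SORRY, I DID NOT UNDERSTAND THAT. {help_text}")

def delete_record_dialogue():
    record = ask("RECORD NUMBER (1-999)?",
                 lambda r: r.isdigit() and 1 <= int(r) <= 999,
                 "TYPE THE NUMBER PRINTED ON THE RECORD CARD.")
    sure = ask(f"DELETE RECORD {record} - ARE YOU SURE (Y/N)?",
               lambda r: r in ("Y", "N"),
               "Y DELETES THE RECORD PERMANENTLY; N CANCELS.")
    print(f"RECORD {record} DELETED" if sure == "Y" else "NOTHING DELETED")

if __name__ == "__main__":
    delete_record_dialogue()
```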

82 citations


Journal ArticleDOI
TL;DR: The way in which originally Chris Evans worked with one of us to examine the possibility of using computer interrogation in this country is described; the position which the technique has reached today; and the possible future the authors envisage for it.
Abstract: This paper describes the way in which originally Chris Evans (C.R.E.) worked with one of us (W.I.C.) to examine the possibility of using computer interrogation in this country; the position which the technique has reached today; and the possible future we envisage for it. Computer interrogation has to be distinguished from history taking by the doctor since the computer is denied nearly all the non-verbal information available from the patient, and since the questions available to the computer must be pre-defined and relatively small in number. The development of this technique has required study of the patient-computer interface, while its performance has been assessed by measurement of its accuracy, its acceptability to the patient and its cost. The power of the computer is only exploited when, within a limited field of disease, data are collected by questioning the patient which make diagnostic inference possible and hence the identification of the most likely disease. Computer interrogation is not yet fully exploited either technically by, for example, the presentation of questions in audio mode, or mathematically, in the identification of the most powerful sets of questions to be asked. The extension of the technique to questioning in, for example, Urdu or Punjabi is largely unexplored. We can envisage a therapeutic role for the computer whereby the patient, perhaps an alcoholic, gains valuable self-knowledge from the interview. The computer is not a rival to the doctor but a partner, and our duty is to develop a partnership which uses the great qualities of each to the full.

34 citations


Journal ArticleDOI
TL;DR: The aim of this paper is to present some problems which may appear in the initial stage of design of a decision-making algorithm and to discuss a method of their formulation.
Abstract: The fuzzy set theory established by L. A. Zadeh has started a new period in formalizing decision-making processes in ill-defined systems where a human being (operator) is an important element in the control loop. The control algorithm used here is based on a generalized fuzzy version of modus ponens (the compositional rule of inference), where a set of decision-making rules forming a control algorithm is given. The aim of this paper is to present some problems which may appear in the initial stage of design of a decision-making algorithm and to discuss a method of their formulation. We introduce notions such as the completeness of the algorithm and the interactivity and competitivity of control rules, and consider indices illustrating the presented design aspects.
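One of the design questions raised, the completeness of the rule set, can be pictured as a simple numeric check: every input in the operating range should fire at least one rule's antecedent to a reasonable degree. The membership functions and the worst-case index below are illustrative assumptions, not the indices defined in the paper.

```python
# A sketch of a completeness check for a fuzzy control rule base: every input
# value in the operating range should fire at least one rule's antecedent to a
# reasonable degree. The triangular membership functions, the rules and the
# worst-case index below are illustrative assumptions, not the paper's
# definitions.

def tri(a, b, c):
    """Triangular membership function rising over [a, b] and falling over [b, c]."""
    return lambda x: max(0.0, min((x - a) / (b - a) if x <= b else (c - x) / (c - b), 1.0))

# Antecedent fuzzy sets over an error signal ranging from -10 to 10.
negative = tri(-15, -10, 0)
zero     = tri(-10, 0, 10)
positive = tri(0, 10, 15)
rules = [negative, zero, positive]          # one antecedent per rule, for simplicity

def completeness_index(rules, lo=-10.0, hi=10.0, steps=200):
    """Worst case, over the input range, of the best firing degree of any rule."""
    xs = [lo + i * (hi - lo) / steps for i in range(steps + 1)]
    return min(max(rule(x) for rule in rules) for x in xs)

print(completeness_index(rules))   # 0.5: coverage is thinnest between neighbouring sets
```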

29 citations


Journal ArticleDOI
TL;DR: This paper investigates the effect of different presentation speeds on performance in a learning task and finds that performance deteriorated at speeds similar to, or faster than, reading speed.
Abstract: When a computer generates text for a Visual Display Unit, it is usually presented at the fastest speed available. The experiments described in this paper investigate the effect of different presentation speeds on performance in a learning task. It was found that performance deteriorated at speeds similar to, or faster than, reading speed. If understanding and retention of textual material are important, the optimum presentation speed is in the range 10-15 characters per second.
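Pacing output at the recommended rate rather than at the device's maximum speed amounts to a one-line delay loop; the 12 characters per second used below is simply a point inside the 10-15 cps range reported.

```python
import sys
import time

# Present text at a fixed character rate instead of the device's maximum speed.
# 12 cps is an arbitrary point inside the 10-15 characters-per-second range
# that the experiments found best for understanding and retention.

def present(text, chars_per_second=12):
    delay = 1.0 / chars_per_second
    for ch in text:
        sys.stdout.write(ch)
        sys.stdout.flush()
        time.sleep(delay)

present("Text paced for reading, not for the terminal.\n")
```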

28 citations


Journal ArticleDOI
John Fox
TL;DR: It is argued that recognition of various different roles for fuzzy logics strengthens the pragmatic case for their development but that their formal justification remains somewhat exposed to Haack's arguments.
Abstract: Haack (1979) has questioned the need for fuzzy logic on methodological and linguistic grounds. However, three possible roles for fuzzy logic should be distinguished; as a requisite apparatus—because the world poses fuzzy problems; as a prescriptive apparatus—the only proper calculus for the manipulation of fuzzy data; as a descriptive apparatus—some existing inference system demands description in fuzzy terms. Haack does not examine these distinctions. It is argued that recognition of various different roles for fuzzy logics strengthens the pragmatic case for their development but that their formal justification remains somewhat exposed to Haack's arguments. An attempt is made to reconcile pragmatic pressures and theoretical issues by introducing the idea that fuzzy operations should be carried out on subjective statements about the world, leaving standard logic as the proper basis for objective computations.

24 citations


Journal ArticleDOI
TL;DR: This paper discusses the contribution which personal construct psychology can make to recent developments in educational technology, and in particular the practical role of the PEGASUS and SOCIOGRIDS programs for construct elicitation and analysis.
Abstract: Recently educational technology has undergone a change of emphasis in the methods and means of teaching: from mass instruction through individualized instruction to group learning. This re-orientation parallels developments within education itself of the three stages of dependent, independent and interdependent learning. This paper discusses the contribution which can be made to this development by personal construct psychology, and in particular the practical role in it of the PEGASUS and SOCIOGRIDS programs for construct elicitation and analysis.

Journal ArticleDOI
TL;DR: An empirical comparison between two methods of computer programming, both based on principles of structured programming, finds that the type of approach has a clear effect on the global program structure and on certain categories of errors.
Abstract: The results of an empirical comparison between two methods of computer programming are presented, both of which are based on principles of structured programming. Their principal difference is in the approach proposed for problem solving. One approach is prospective: the program structure is derived from the data structure. The other is retrospective: one starts, on the contrary, from the structure of the results. A relatively complex management program produced by students trained in each of the methods (29 and 17 subjects, respectively) was analysed. Three main results were observed: (a) the type of approach has quite a clear effect on the global program structure; (b) it also affects certain categories of errors; and (c) subjects tended to adopt a prospective approach (regardless of the method learned) to construct a difficult component of the program. We conclude with a recommendation concerning the design of programming methods.

Journal ArticleDOI
TL;DR: The use of the simple operations of fuzzy logic allows retrieval of documents with the highest grades of formal relevance (in a given information system).
Abstract: An information retrieval method based on fuzzy logic is presented. The method described takes into account in a straightforward way the varying importance of descriptors which reflect the content of the information system documents as well as the varying formal relevance grades of documents in relation to a given query. The use of the simple operations of fuzzy logic allows retrieval of documents with the highest grades of formal relevance (in a given information system).
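A minimal reading of such a scheme might look like the sketch below, where documents carry graded descriptors, queries carry importance weights, and the min/max operators shown are assumed for illustration; the paper's exact operations may differ.

```python
# A sketch of fuzzy-logic document retrieval. Each document grades each
# descriptor in [0, 1] (how well the descriptor reflects its content), and a
# query assigns an importance weight to each descriptor it mentions.
# The combination rules below (min for AND-like queries, max for OR-like ones)
# are assumed for illustration; the paper's exact operators may differ.

documents = {
    "doc1": {"fuzzy sets": 0.9, "retrieval": 0.4},
    "doc2": {"fuzzy sets": 0.3, "retrieval": 0.8, "databases": 0.7},
    "doc3": {"databases": 0.9},
}

def relevance(doc_grades, query, mode="and"):
    grades = [min(weight, doc_grades.get(term, 0.0)) for term, weight in query.items()]
    return min(grades) if mode == "and" else max(grades)

query = {"fuzzy sets": 1.0, "retrieval": 0.6}   # descriptor -> importance weight
ranked = sorted(documents, key=lambda d: relevance(documents[d], query), reverse=True)
for d in ranked:
    print(d, round(relevance(documents[d], query), 2))
# doc1 0.4, doc2 0.3, doc3 0.0  (highest grades of formal relevance first)
```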

Journal ArticleDOI
Masatoshi Sakawa
TL;DR: In this paper, based on the algorithm of SPOT, a computer program for multiobjective decision making with interactive procedures is presented and called ISPOT, especially designed to facilitate the interactive processes for computer-aided decision making.
Abstract: A new interactive multiobjective decision making technique, which is called the sequential proxy optimization technique (SPOT), has been proposed by the author. Using this technique, the preferred solution for the decision maker can be derived efficiently from among a Pareto optimal solution set by assessing his marginal rates of substitution and maximizing the local proxy preference functions sequentially. In this paper, based on the algorithm of SPOT, a computer program for multiobjective decision making with interactive procedures is presented and called ISPOT. The program is especially designed to facilitate the interactive processes for computer-aided decision making. After a brief description of the theoretical framework of SPOT, the computer program ISPOT is presented. The commands in this program and major prompt messages are also explained. An illustrative numerical example for the interactive processes is demonstrated and numerous insights are obtained.

Journal ArticleDOI
TL;DR: The National Physical Laboratory has developed a microprocessor system for interviewing patients to obtain their medical histories; the doctor specifies the interview as a numbered flowchart, and when presented to the patient it emulates a friendly doctor asking questions requiring simple YES or NO answers.
Abstract: The National Physical Laboratory has developed a microprocessor system for interviewing patients to obtain their medical histories. The doctor specifies the interview in the form of a numbered flowchart. When presented to the patient this emulates a friendly doctor asking questions requiring simple YES or NO answers. The system is easy to use, and collects accurate information.
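The numbered-flowchart idea can be pictured as a small branching table driving a YES/NO dialogue; the questions and numbering below are invented for illustration and are not taken from the NPL system.

```python
# A toy version of a doctor-specified, numbered interviewing flowchart.
# Each node holds a question and the node to visit after a YES or NO answer
# (None ends the interview). The questions and numbering are invented here;
# they are not the NPL system's actual flowchart.

flowchart = {
    1: ("Do you have a cough?",               2, 3),
    2: ("Have you had it for over a week?",   4, 3),
    3: ("Do you have any chest pain?",        4, None),
    4: ("Have you seen a doctor about this?", None, None),
}

def interview(start=1):
    answers, node = {}, start
    while node is not None:
        question, if_yes, if_no = flowchart[node]
        reply = ""
        while reply not in ("YES", "NO"):
            reply = input(f"{question} (YES/NO) ").strip().upper()
        answers[node] = reply
        node = if_yes if reply == "YES" else if_no
    return answers   # the collected history, keyed by question number

if __name__ == "__main__":
    print(interview())
```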

Journal ArticleDOI
TL;DR: A study of the way in which university staff and research students make use of the central university computing service in the course of their day-to-day work found that users experienced in computing appeared to be more ambitious in what they were attempting, and made greater use of their own programs.
Abstract: This article describes a study of the way in which university staff and research students make use of the central university computing service in the course of their day-to-day work. A sample of computer users was selected, stratified according to their academic discipline, the length of time for which they had been computing, and the rate at which they ran jobs on the computer. These users were asked to fill in a questionnaire reporting on the results of every computer job they had run. It was found that three-quarters of all jobs were intended as “production”, rather than developing new programs or finding the cause of errors. Slightly over half of the jobs involved the use of the users' own programs rather than of pre-supplied program packages. Just over a half of all jobs were completely successful, whilst one in five were a total failure. Users were more successful when they ran program packages than when they ran their own programs. Overall, computer-novices were more successful at what they attempted. Users experienced in computing appeared to be more ambitious in what they were attempting, and made greater use of their own programs; their proportion of successful jobs was hence lower than for the computer novices. There was also some evidence that users from the physical sciences were more successful than users from the social sciences, medicine and the humanities (but this effect was less significant than the split between computer novices and experts). Upon analysing their output, users believed that 70% of their errors had been definitely located, whilst 6% remained totally obscure and needed further runs on the computer. Just less than a third of all errors were classified as trivial, whilst about one in ten involved an error in concept. Many of the trivial errors were located in the users' data rather than in their programs. Slightly more errors were attributed to problems in the users' own discipline than to the use of the computer itself. Users tended to accept nearly all the blame for errors, except that 3% were attributed to misleading documentation.

Journal ArticleDOI
TL;DR: The results showed a relatively close correspondence between scanning time and the rating data of the same color combinations in an earlier study, and showed that the overall error rate was extremely low.
Abstract: An experiment is reported which employed a method of scanning matrices of letters for specific targets. The color of the letters and the color of the background varied. Scanning time was one dependent variable and accuracy in detecting the correct number of targets in each matrix was another. The results showed a relatively close correspondence between scanning time and the rating data of the same color combinations in an earlier study. With respect to accuracy, the present experiment showed that the overall error rate was extremely low.

Journal ArticleDOI
TL;DR: The nature of the social research package user is discussed, and features of the conversational package SCSS are examined to point out the important lines of package development.
Abstract: The general purpose data analysis package is characterized by its user-oriented interface. This paper discusses the nature of the social research package user, and examines features of the conversational package SCSS to point out the important lines of package development. A discussion of these developments covers the areas of data structure flexibility, facilities for model manipulation and testing, self-documentation, adaptability to both expert and novice package users, as well as the facilities of table, graphics and statistics generation systems. Not all of these developments are equally feasible, because they put possibly conflicting pressures on the form of the user interface. The growing number of users of data analysis packages will continue to make this an interesting area of user-interface development.

Journal ArticleDOI
TL;DR: A critical evaluation of Atkin's Q-Analysis in its application to problems of medical diagnosis is given and it is shown that Q-analysis is not a suitable method for research in diagnostics.
Abstract: This paper gives a critical evaluation of Atkin's Q-Analysis in its application to problems of medical diagnosis. The basic procedure of Q-analysis is first explained in simple terms. Then the theory of classification by binary features is explained. It is done in a generally accessible way for the benefit of those likely to use the method for diagnostics. The main points are illustrated on a small Q-analysis on artificial data. It is shown that Q-analysis is not a suitable method for research in diagnostics.
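The core computation being criticised, grouping items by counts of shared binary features (q-nearness) and chaining those counts into q-connected components, can be sketched on artificial data as follows; the data are invented for illustration.

```python
from itertools import combinations

# Q-analysis on artificial binary data: items are simplices whose vertices are
# the features they possess. Two items are q-near if they share at least q+1
# features; q-connected components chain q-nearness transitively. The items
# and features below are invented for illustration.

items = {
    "A": {"f1", "f2", "f3"},
    "B": {"f2", "f3", "f4"},
    "C": {"f4", "f5"},
    "D": {"f6"},
}

def q_connected_components(items, q):
    """Group items into components whose consecutive members share >= q+1 features."""
    parent = {name: name for name in items}

    def find(x):
        while parent[x] != x:
            x = parent[x]
        return x

    for a, b in combinations(items, 2):
        if len(items[a] & items[b]) >= q + 1:
            parent[find(a)] = find(b)

    components = {}
    for name in items:
        components.setdefault(find(name), set()).add(name)
    return list(components.values())

print(q_connected_components(items, q=0))  # e.g. [{'A', 'B', 'C'}, {'D'}]
print(q_connected_components(items, q=1))  # e.g. [{'A', 'B'}, {'C'}, {'D'}]
```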

Journal ArticleDOI
TL;DR: A study of the attitudes of users in eight British universities to computing, and particularly to guidance services for operating systems and applications software, is described.
Abstract: Computer-users in universities need computing facilities to aid them in their research and teaching; they are not computer professionals nor, for them, is the writing of programs an end in itself. To enable them to make good use of the facilities, guidance is provided in the form of documentation, courses, face-to-face advice and computer-based HELP systems. A study of the attitudes of users in eight British universities to computing, and particularly to guidance services for operating systems and applications software, is described. The overall evaluation of the guidance given was extremely favourable, although approval ratings for documentation were lower than for face-to-face advice for all kinds of software. Very few users had attended any kind of formal course, apart from those who had studied computing as undergraduates. As far as diagnostic advice was concerned, in general users chose as their first source of help either their colleagues or members of computer centre staff in about equal proportions. This was subject to variation depending on both the discipline of the user and the nature of the software concerned. As a general rule, users were more likely to turn to the computer centre for help in areas where the software was not in widespread use among their colleagues. This contrasts with the widely-held view that advisory services are used disproportionately by those in the social sciences, humanities and medicine. Areas needing further study include the role of the group in aiding problem-solving, and the identification of methods of guidance most appropriate to particular problem areas.

Journal ArticleDOI
TL;DR: The architecture of the GUHA-DBS Data Base System is described and the features of its principal components are presented.
Abstract: This paper presents the concept of the GUHA-DBS Data Base System intended for the users of the GUHA method and for GUHA procedure programmers. The architecture of the system as a whole is described and the features of its principal components are presented.

Journal ArticleDOI
TL;DR: This paper describes some techniques employed at the National Physical Laboratory in developing a practical system capable of recognizing human speech and segmenting and recognizing continuous speech such as strings of numerals.
Abstract: This paper describes some techniques employed at the National Physical Laboratory in developing a practical system capable of recognizing human speech. The system, which is currently being evaluated in an extended series of trials, is capable of performing two main tasks: (1) recognizing key words embedded in continuous speech and (2) segmenting and recognizing continuous speech such as strings of numerals.

Journal ArticleDOI
TL;DR: Some formal systems corresponding to current theories of statistical inference and oriented towards mechanized statistical inference as considered in GUHA methods of mechanized hypothesis formation are discussed.
Abstract: In the present paper we try to discuss some formal systems corresponding to current theories of statistical inference and oriented towards mechanized statistical inference as considered, for example, in GUHA methods of mechanized hypothesis formation. The syntax and semantics of theoretical sentences are shown in relation to functor calculi and corresponding observational sentences evaluable at data.

Journal ArticleDOI
TL;DR: The present state of software realizing the GUHA method of mechanized hypothesis formation in exploratory data analysis of categorical data is presented.
Abstract: The present state of software realizing the GUHA method of mechanized hypothesis formation in exploratory data analysis of categorical data is presented. The paper is related to the paper of Hajek & Havranek (1978b) describing the basic principles of the GUHA method.

Journal ArticleDOI
TL;DR: Main features of the database system GUHA-DBS as a means for solving the problems mentioned are outlined and problems to be solved are discussed.
Abstract: The first part of this article concerns certain assumed trends in the development of the GUHA method. Particular attention is paid to the calculus of open observational formulae as a means of making it possible to declare derived quantities as parameters of GUHA procedures. In the second part of the article, problems are discussed which must be solved to provide the software and to develop the GUHA method along the mentioned trends. Main features of the database system GUHA-DBS as a means for solving the problems mentioned are outlined. The database system GUHA-DBS is described in more detail in Pokorný & Rauch (1981). It is assumed that the reader knows the works of Hajek & Havranek (1977, 1978) and Rauch (1978).

Journal ArticleDOI
Petr Hájek
TL;DR: It is shown that various logical calculi corresponding to the theoretical level of statistical inference may be described as some generalized monadic modal predicate calculi that are undecidable when endowed with general semantics but become decidable when semantics is restricted to identically independently distributed structures.
Abstract: Logical calculi corresponding to the theoretical level of statistical inference (as understood, for example, in foundations of GUHA-style hypothesis formation) may be described as some generalized monadic modal predicate calculi. It is shown that various such calculi are undecidable when endowed with general semantics (arbitrary probabilistic structures) but, roughly, all reasonable such calculi become decidable when semantics is restricted to identically independently distributed structures.

Journal ArticleDOI
TL;DR: In the beginning, long before there had arisen a creature which could be considered as human, there existed both society and the environment as discussed by the authors, and the relationship between society and its environment is the most basic of all the relationships.
Abstract: In the beginning, long before there had arisen a creature which could be considered as human, there existed both society and the environment. The society was a hominid society, derived from a more ancient primate form of social organisation. The environment, although variable, had always existed, and provided those selective pressures which determined the nature of this particular form of animal social organisation. The relationship between society and its environment is the most basic of all the relationships. It is animal. It is primal.

Journal ArticleDOI
TL;DR: Insofar as Pinkava's criticisms are directed against the computer algorithm called “Q-analysis”, it is correct to say that a blind application of it alone will not automatically give useful q-connected components for the purpose of classification.
Abstract: The limitations of Q-analysis in the field of psychology suggested in Pinkava's paper seem to boil down to the following: (A) an innate inability to consider “negative features”; (B) that a Q-analysis is based on numbers of shared features as opposed to the features themselves; (C) that a definition of “classification” stated in the language of algebraic logic eludes Q-analysis; and (D) an example of Q-analysis failing to discover a “hidden” classification. In this reply a review of the concept of anti-vertex answers (A), the distinction between the procedure “Q-analysis” and the “Methodology of Q-analysis” illuminates (B), a construction using anti-vertices answers (C), while a more relevant Q-analysis answers (D). Insofar as Pinkava's criticisms are directed against the computer algorithm called “Q-analysis”, it is correct to say that a blind application of it alone will not automatically give useful q-connected components for the purpose of classification. To exploit the proven use of Q-analysis in set definition and classification it is necessary to appeal to the wider “Methodology of Q-analysis”. The latter is concerned with all the kinematics associated with a topological representation of a relation (between finite sets), whilst the former is a technique for finding some of the global properties of such a representation.

Journal ArticleDOI
TL;DR: A short description of Modula, a high-level language for real-time parallel programming, concentrates on its distinctive features compared with Pascal; in particular the process, signal and three types of module are considered.
Abstract: A short description of Modula, the high-level language for real-time parallel programming, concentrates on its distinctive features as compared with Pascal; in particular the process, signal and three types of module are considered. VRW, a vision laboratory control program written in Modula, is introduced. Its complete module and process structure is presented in support of the argument that Modula allows a most attractive program architecture which matches that of the laboratory and the experimental control problem. Detailed fragments of VRW are presented to illustrate the capabilities of Modula with special attention to device handling. Further benefits of the Modula discipline such as the inherent confidence possible in solutions and the merits of the module as a unit for software construction are discussed. In examining means of control over the use of machine-store, scalar types and, more particularly, the timing of events, weaknesses in Modula are noted and discussed. But these do not prevent the conclusion that it is a most capable and attractive language for laboratory control.
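For readers without Modula to hand, the interface-module/process/signal discipline can be loosely imitated with threads and a blocking buffer; the Python sketch below is only an analogy to the pattern described, not Modula syntax and not code from VRW.

```python
import threading
import queue
import time

# A loose Python analogy to Modula's interface module + processes + signals:
# a "device module" hides a shared buffer behind send/receive procedures, and
# two processes synchronise through it. This imitates the pattern only; it is
# not Modula syntax and not taken from the VRW control program.

class DeviceModule:
    """Plays the role of an interface module guarding a device buffer."""
    def __init__(self):
        self._buffer = queue.Queue(maxsize=1)   # waiting/signalling handled internally

    def send(self, reading):
        self._buffer.put(reading)               # blocks if the consumer is behind

    def receive(self):
        return self._buffer.get()               # blocks until a reading arrives

def sampler(dev):          # "process" reading a laboratory instrument
    for i in range(3):
        dev.send(f"sample {i}")
        time.sleep(0.1)
    dev.send(None)         # end-of-run marker

def logger(dev):           # "process" consuming and recording readings
    while (reading := dev.receive()) is not None:
        print("logged", reading)

dev = DeviceModule()
threads = [threading.Thread(target=sampler, args=(dev,)),
           threading.Thread(target=logger, args=(dev,))]
for t in threads: t.start()
for t in threads: t.join()
```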

Journal ArticleDOI
TL;DR: It is reported that during a computer interview context-free Encouragement and Chattiness, used randomly and to a moderate extent, seem to provide an optimum format for acceptability, but in human interviews random Encouragement and Chattiness seem to have no effect on people's acceptance, while random Encouragement might even have a negative effect.
Abstract: This paper reports an analysis and three experiments in the field of man-computer interviewing. To explore the importance of the linguistic format of the questions in a computer interview, four history-taking interviewing programs were analysed. These programs had already been used successfully in a computer system to carry out friendly and natural interviews. In this analysis the major variations in phraseology were found to be represented by two variables, namely Encouragement and Chattiness. Three experiments are then described aiming to test the usefulness of these two variables in a man-computer interviewing situation and to compare this with the man-man interviewing situation. The conclusions from these experiments are: (a) during a computer interview context-free Encouragement and Chattiness, used randomly and to a moderate extent, seem to provide an optimum format for acceptability, but (b) in human interviews random Encouragement and Chattiness seem to have no effect on people's acceptance, while the random Encouragement might even have a negative effect.