
Showing papers in "International Journal of Human-computer Studies / International Journal of Man-machine Studies in 1977"


Journal ArticleDOI
Ronald R. Yager
TL;DR: This paper presents a model for solving multi-objective decision problems when the objectives are of varying degrees of importance by assigning to each objective a power indicative of its importance and then raising each fuzzy set to its appropriate power.
Abstract: One of the most useful aspects of fuzzy set theory is its ability to represent multi-objective decision problems involving vague or fuzzy objectives. This paper presents a model for solving multi-objective decision problems when the objectives are of varying degrees of importance. This is done by assigning to each objective a power indicative of its importance and then raising each fuzzy set to its appropriate power. These powers are obtained as the eigenvector associated with the maximum eigenvalue of a matrix of paired comparisons of the objectives.

452 citations
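As an illustration of Yager's scheme, here is a minimal Python sketch; the paired-comparison matrix, membership grades and normalization below are invented for the example.

```python
import numpy as np

# Saaty-style paired-comparison matrix of objective importances:
# entry [i, j] says how much more important objective i is than j.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

# Importance powers = eigenvector of the maximum eigenvalue.
vals, vecs = np.linalg.eig(A)
w = np.abs(np.real(vecs[:, np.argmax(np.real(vals))]))
w = w / w.sum() * len(w)        # scale so the powers average to 1

# Membership of four alternatives in each of the three fuzzy objectives.
mu = np.array([[0.8, 0.6, 0.9],
               [0.7, 0.9, 0.5],
               [0.6, 0.7, 0.8],
               [0.9, 0.4, 0.7]])

# Raise each fuzzy set to its power, intersect (min), pick the best.
decision = (mu ** w).min(axis=1)
print(decision, "-> choose alternative", decision.argmax())
```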


Journal ArticleDOI
TL;DR: This paper examines the fundamentals of network notation, in order to understand why the “formalism” has not been the panacea it was once hoped to be, and emphasizes the importance of considering an “epistemological foundation” on which to consistently build representations for complex concepts.
Abstract: Semantic networks constitute one of the many attempts to capture human knowledge in an abstraction suitable for processing by computer program. While semantic nets enjoy widespread popularity, they seem never to live up to their authors' expectations of expressive power and ease of construction. This paper examines the fundamentals of network notation, in order to understand why the “formalism” has not been the panacea it was once hoped to be. We focus here on “concepts”—what net-authors think they are, and how network nodes might represent them. The simplistic view of concept nodes as representing extensional sets is examined, and found wanting in several respects. In an effort to solve the foundational problems exposed, we emphasize the importance of considering an “epistemological foundation” on which to consistently build representations for complex concepts. A level of representation above that of completely uniform nodes and links, but below the level of conceptual knowledge itself, seems to be the key to using previously learned concepts to interpret and structure new ones. A particular foundation is proposed here, based on the notion of a set of functional roles bound together by a structuring interrelationship. Procedures for using this foundation to automatically build instances and conceptual modifications are presented. In addition, the intensional nature of such a representation and its implications are discussed.

244 citations
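The proposed foundation (a set of functional roles bound together by a structuring interrelationship) can be suggested in a minimal Python sketch; the ARCH concept, role names and constraint below are invented for illustration, not taken from the paper.

```python
# A concept as functional roles plus a structuring interrelationship.
concept = {
    "name": "ARCH",
    "roles": {"upright1", "upright2", "lintel"},
    "structure": "lintel supported-by (upright1, upright2)",
}

def instantiate(concept, fillers):
    """Build an instance by binding every functional role to a filler."""
    missing = concept["roles"] - fillers.keys()
    if missing:
        raise ValueError(f"unfilled roles: {missing}")
    return {"isa": concept["name"], "constraint": concept["structure"], **fillers}

print(instantiate(concept, {"upright1": "brickA",
                            "upright2": "brickB",
                            "lintel": "boardC"}))
```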


Journal ArticleDOI
TL;DR: The main part of the paper consists of a bibliography of some 1150 items, each keyword-indexed, with some 750 classified as concerned with fuzzy system theory and its applications.
Abstract: The main part of the paper consists of a bibliography of some 1150 items, each keyword-indexed, with some 750 classified as concerned with fuzzy system theory and its applications. The remaining items are concerned with closely related topics in many-valued logic, linguistics, the philosophy of vagueness, etc. These background references are annotated in an initial section that outlines the relationship of fuzzy system theory to other developments and provides pointers to various possibly fruitful interrelationships. Topics covered include: the philosophy and logic of imprecision and vagueness; other non-standard logics; foundations of set theory; probability theory; fuzzification of mathematical systems; linguistics and psychology; and applications.

152 citations


Journal ArticleDOI
TL;DR: Experimental evidence is offered which supports the use of memorization/recall tasks as a further basis for judging program quality and programmer comprehension, and a syntactic/semantic model of programmer behavior is presented.
Abstract: Judging the relative quality of computer programs and a programmer's comprehension of a given program has proven to be a difficult task. Ability to debug, modify, hand simulate execution or respond to questions about the program all have their weaknesses as comprehension metrics. This paper offers experimental evidence which supports the use of memorization/recall tasks as a further basis for judging program quality and programmer comprehension. A syntactic/semantic model of programmer behavior is presented to provide a basis for this hypothesis.

149 citations


Journal ArticleDOI
TL;DR: It was previously reported that it was easier for non-programmers to learn to use nested conditional constructions than jumping, or branch-to-label, constructions; the present study extends the comparison to the more general case where nesting requires “scope markers” to disambiguate the syntax, and shows why scope markers carrying redundant condition information (NEST-INE) are so much superior.
Abstract: In a previous paper the authors reported that it was easier for non-programmers to learn to use nested conditional constructions than jumping, or branch-to-label, constructions; however, as only single situations were studied, the conclusions were necessarily restricted. The present study extends the comparison to the more general case where nesting requires “scope markers” to disambiguate the syntax. The results showed that if the scope markers were simply the begin and end of ALGOL 60 (abbreviated NEST-BE) then the advantage of nesting over jumping was weakened; but if the scope markers carried redundant information about the conditional tested (NEST-INE) performance was excellent, particularly at debugging. It seems necessary to distinguish sequence information in a program, which describes the order in which things are done, from taxon information, which describes the conditions under which a given action is performed. Conventional programming languages obscure the taxon information. The advantage of nesting over jumping, we speculate, is in clarifying the sequence information by redundant re-coding in spatial terms; the added advantage of NEST-INE over NEST-BE is that it clarifies the taxon information. It is because debugging requires taxon information that NEST-INE is so much superior. On this view one would expect that in decision table and production system languages, where the taxon information is explicit but the sequence information is obscured, the reverse phenomena should occur. Because debugging requires sequence information as well as taxon information, a device that clarified the sequence would greatly improve such languages.

130 citations
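The notational contrast can be suggested in a short sketch. The experimental micro-languages were ALGOL-like, not Python, and the conditions here are invented; the closing scope markers are shown as comments.

```python
# NEST-BE closes a scope with a bare marker ("end"); NEST-INE repeats the
# condition tested, making the taxon information (under what conditions
# an action runs) explicit at the point of closure.

def advice(windy, raining):
    if windy:
        if raining:
            action = "stay indoors"
        else:
            action = "fly a kite"
        # end raining           <- NEST-INE: the close names its condition
    else:
        action = "go swimming"
    # end windy                 <- NEST-BE would say only "end" here
    return action

print(advice(True, False))      # -> fly a kite
```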


Journal ArticleDOI
TL;DR: In this article, the authors argue that the mismatch between the user's conceptualization of the data and its actual structure must be consciously resolved in a single process, and that the use of natural language instead of an artificial query language is the only means by which this can be done.
Abstract: Now that large data bases of valuable information are in existence, we must face the problem of putting this information in the hands of the end user. This is not a task to be taken lightly, since it is a very difficult problem due to the mismatch between the user's conceptualization of the data and its actual structure. By analyzing this problem in some detail we argue that this mismatch must be consciously resolved in a single process, and that the use of natural language instead of an artificial query language is the only means by which this can be done. The technical feasibility of this approach is demonstrated by the ROBOT natural language query processor. Actual dialogs and end user experiences are used to show the resulting increase in end user orientation. The impact of natural language processing on the various DBMS architectures is also discussed.

103 citations


Journal ArticleDOI
TL;DR: The development and evaluation of two scales for the measurement of patients' attitudes are described, one being of the traditional “Thurstone” type based on attitude statements and the other being a Semantic Differential scale. Male patients had more favourable attitudes than female patients; younger patients were more favourable toward the computer than older patients; and manual workers were more favourable toward the computer than non-manual workers.
Abstract: In the evaluation of techniques for the automatic acquisition of medical history data, the attitudes of patients, the “users”, are of the utmost importance. While patients' opinions have been sought in several studies, there has previously been no objective measurement of attitudes in this field. In the present paper, the development and evaluation of two scales for the measurement of patients' attitudes are described, one being of the traditional “Thurstone” type based on attitude statements, and the other being a Semantic Differential scale. A reliability coefficient of 0·90 was obtained for the “Thurstone” scale, and the results from this and the Semantic Differential were found to correlate well with each other (r = 0·70), supporting the validity of the measures. In a study of 75 patients, each was interrogated by computer and then asked to take home a questionnaire containing the attitude scales to complete and return anonymously through the post. Of the 67 patients who returned their questionnaires, 82% had favourable attitudes toward computer interrogation, and 49% had more favourable attitudes toward medical interviews with a computer than toward medical interviews with a doctor. Male patients had more favourable attitudes than female patients; younger patients were more favourable toward the computer than older patients; and manual workers were more favourable toward the computer than non-manual workers.

74 citations


Journal ArticleDOI
TL;DR: Significantly more error-free programs were written in the Procedural condition than in the Plain condition, showing that explicit procedures can improve programming success, at least in these conditions.
Abstract: Ways to reduce careless programming errors were investigated. Non-programmers learnt to write nested conditional programs in one of three conditions: an Automatic syntax condition, in which syntactic errors were impossible because programs were made up from whole syntactic constructions rather than from single words; a Procedural condition, in which programs were written word by word as usual, but a well-defined procedure was prescribed to help subjects write nested conditionals correctly; and a Plain condition resembling standard programming tuition, in which subjects were told the structure of the language but were given no guide to help in writing. Significantly more error-free programs were written in the Procedural condition than in the Plain condition, showing that explicit procedures can improve programming success, at least in these conditions. In the Automatic condition the success rate was still higher, showing that the procedure we used could still be improved. These results, and the outcomes of further analyses, bear on recommendations by the “structured programming” school to follow explicit procedures when writing programs, and also on previous work on the design of easily-used programming languages.

61 citations


Journal ArticleDOI
TL;DR: It is shown that a programming language is progressively interiorized by a subject in the form of a “Système de Représentation et de Traitement” (S.R.T.) or “Representation and Processing System”, in which the experienced programmer can analyse problems.
Abstract: A theoretical framework has been defined to elucidate the problems raised in the training of analyst-programmers, and a beginning made in validating it in a preliminary experiment. This experiment showed that a programming language is progressively interiorized by a subject in the form of a “Système de Représentation et de Traitement” (S.R.T.), or “Representation and Processing System”, in which the experienced programmer can analyse problems. Prior to this, however, he must have made his analysis in other S.R.T.s that are more or less compatible with the programming language concerned. Nineteen subjects at various levels of training were asked to construct a COBOL flowchart for a Metro ticket-machine control problem. An analysis of errors was made, and the strategies used were described with the aid of 22 variables, in order to determine the three principal steps involved in learning a programming language.

61 citations


Journal ArticleDOI
TL;DR: The results indicate that, at least for the kinds of problems tested here, it is possible to develop vocabularies of limited size that can be used effectively in man-computer communications.
Abstract: Two-person teams of subjects worked at realistic problem-solving tasks by communicating through a teletypewriter system. One-third of the teams had to limit their vocabulary to words on lists of 300 words, one-third were required to use words on lists of 500 words, and one-third of the teams worked with no vocabulary restrictions. Each team solved a different problem on each of three successive days. Dependent measures were taken on four classes of variables: (1) time to solve the problem, (2) measures of overt behavior, (3) measures of verbal output, and (4) errors made by subjects who used the restricted vocabularies. The main finding of the experiment was that subjects who worked with the restricted vocabularies interacted and solved problems as successfully as their counterparts who worked with no vocabulary restrictions. The results indicate that, at least for the kinds of problems tested here, it is possible to develop vocabularies of limited size that can be used effectively in man-computer communications.

53 citations


Journal ArticleDOI
TL;DR: In the first part a logic of induction is developed, based on formal notions of observational and theoretical calculi and inductive inference rules, and GUHA methods of mechanized formation of hypotheses are formally investigated.
Abstract: A logic of discovery developed as a theoretical background of certain methods of mechanized formation of inductive hypotheses is surveyed. In the first part a logic of induction is developed, based on formal notions of observational and theoretical calculi and inductive inference rules. A logic of suggestion is presented in the second part, and GUHA methods of mechanized formation of hypotheses are formally investigated. Logical, statistical and computational aspects are emphasized.
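The flavour of mechanized hypothesis formation can be conveyed by a minimal sketch. GUHA's inductive inference rules use statistically founded quantifiers; the simple support/confidence-style quantifier, the 0/1 data and the thresholds below are invented stand-ins.

```python
from itertools import combinations

# Hypothetical observational data; columns are unary predicates.
rows = [
    {"P1": 1, "P2": 0, "P3": 1},
    {"P1": 1, "P2": 1, "P3": 1},
    {"P1": 0, "P2": 1, "P3": 0},
    {"P1": 1, "P2": 0, "P3": 1},
]

def supported(antecedent, succedent, min_conf=0.9, min_base=2):
    """Crude quantifier: enough cases satisfy A, and most also satisfy S."""
    base = [r for r in rows if all(r[p] for p in antecedent)]
    hits = [r for r in base if r[succedent]]
    return len(base) >= min_base and len(hits) / len(base) >= min_conf

predicates = ["P1", "P2", "P3"]
for size in (1, 2):
    for ante in combinations(predicates, size):
        for succ in predicates:
            if succ not in ante and supported(ante, succ):
                print(" & ".join(ante), "=>", succ)   # surviving hypotheses
```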

Journal ArticleDOI
TL;DR: An interactive aid for musicians, together with its applications to music teaching, music composition, real-time performance and music typesetting, and a performance-oriented music notation are described.
Abstract: An interactive aid for musicians is described, together with its applications to music teaching, music composition, real-time performance and music typesetting. The system permits the input of sounded music and its subsequent playback using an electronic organ. Facilities for transcription, display and editing are provided using a graphical display unit. A digital synthesizer is incorporated for sophisticated sound generation. Problems encountered with transcription, display and editing of sounded music are identified. A performance-oriented music notation is described, and compared with conventional music notation. The requirements of both notations are assessed with respect to the transcription, display and editing tasks. The value of an interactive rather than fully automatic system is emphasized. Software and data-base organization of a transcription system permitting the display and editing of both notations is described. Results are presented of practical uses of the system.

Journal ArticleDOI
TL;DR: It is found that if the material to be presented can be represented as a transition diagram, then much of the process of providing error feedback and assistance can be automated.
Abstract: The creation of CAI material is generally a formidable task, since all possible branching paths, corresponding to all possible user inputs both correct and incorrect, must be anticipated and receive an appropriate response. A system which can be given only the set of correct paths through the material and which can automatically generate appropriate responses to user errors and requests for assistance would greatly simplify this task. We have found that if the material to be presented can be represented as a transition diagram, then much of the process of providing error feedback and assistance can be automated. In addition, transition diagrams have been found capable of modeling a wide enough class of concepts to be useful for many applications.
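A minimal sketch of the idea, with an invented two-step lesson: the author supplies only the correct paths of the transition diagram, and feedback for erroneous input is generated automatically from the diagram itself.

```python
lesson = {                       # state -> {accepted input: next state}
    "start": {"2+2=4": "area"},
    "area":  {"3*4=12": "done", "4*3=12": "done"},
    "done":  {},
}

def respond(state, user_input):
    edges = lesson[state]
    if user_input in edges:
        return edges[user_input], "Correct."
    expected = " or ".join(edges) if edges else "(lesson complete)"
    return state, f"Not quite; expected {expected}."   # generated feedback

state = "start"
for attempt in ["2+2=5", "2+2=4", "3*4=12"]:
    state, feedback = respond(state, attempt)
    print(attempt, "->", feedback)
```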

Journal ArticleDOI
TL;DR: The Telephone Enquiry Service is a computer system which allows interactive information retrieval from an ordinary touch-tone telephone, and an unusual feature of the system is that the speech is generated by rule from a phonetic representation.
Abstract: The Telephone Enquiry Service is a computer system which allows interactive information retrieval from an ordinary touch-tone telephone. For input, the caller employs the touch-tone keypad, and the computer replies with a synthetic voice response. The service has been in fairly continuous operation for around one year, using a small time-shared computer in conjunction with an internal 200-line telephone exchange, and has been widely used by people with no special interest in synthetic speech. An unusual feature of the system is that the speech is generated by rule from a phonetic representation. A satellite computer, acting as a peripheral to the main machine, performs this task in real time, and controls the parameters of an analogue speech synthesizer. This constitutes an extremely economical and flexible method of speech storage, whose only real disadvantage is the low quality of articulation of the output. A major conclusion of the work is that even low-quality speech is acceptable to casual users, if the service is sufficiently interesting and useful to them.

Journal ArticleDOI
TL;DR: Results of some new simulation experiments regarding the structure identification procedure are described and discussed in the paper; they should help the user (modeller) to interpret properly the various parameters associated with the procedure.
Abstract: This paper summarizes an approach to the problem of structure identification which was previously suggested by the authors and elaborated into a fully computerized procedure (Klir, 1976; Klir and Uyttenhove, 1976). A structure system is viewed as a set of coupled relations (probabilistic, in the general case) which are projections of an overall relation. The term “structure identification” refers to the problem of finding, at each level of structure refinement, the structure systems which conform best to the given relation. Results of some new simulation experiments regarding the structure identification procedure are described and discussed. These results should help the user (modeller) to interpret properly the various parameters associated with the procedure. The use of the experimental results is illustrated by examples taken from the areas of computer performance evaluation and social research.
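The view of a structure system as projections of an overall relation can be illustrated with a minimal sketch; the random joint distribution, the candidate structure and the crude fit score below are chosen for illustration, not taken from the procedure.

```python
import numpy as np

np.random.seed(0)
p = np.random.dirichlet(np.ones(8)).reshape(2, 2, 2)   # joint p(x, y, z)

# Candidate structure {xy, yz}: keep only two coupled projections.
p_xy = p.sum(axis=2)
p_yz = p.sum(axis=0)
p_y = p.sum(axis=(0, 2))

# Reconstruct the overall relation implied by the structure (x and z
# independent given y) and score how well the structure conforms.
recon = p_xy[:, :, None] * p_yz[None, :, :] / p_y[None, :, None]
print("reconstruction error:", np.abs(p - recon).sum())
```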

Journal ArticleDOI
TL;DR: The technical details of a psychological instrument called the Personal Relations Index (PRI), which produces a graphical representation of any two-person relationship as seen by one member of the interaction, are outlined.
Abstract: This paper outlines the technical details of a psychological instrument called the Personal Relations Index (PRI). The instrument produces a graphical representation of any two-person relationship as seen by one member of the interaction. Several stages are involved. (1) During a free discussion the respondent is asked to describe himself and the other person in the context of their relationship. These descriptions include attitudes, feelings or behavioural states which are referred to as elements. (2) The elements are fed into a computer which produces a questionnaire unique to the respondent. In answering the questionnaire the respondent imposes order on his impressions by indicating how he is most likely to react to each of the other person's elements and how the other person is most likely to react to his (the respondent's) elements. (3) A graphical representation showing the dynamic interaction between the elements and therefore the two people, is constructed by interconnecting each element with its most likely outcome(s). (4) This graphical representation is presented to the respondents in a readily comprehensible form and enables them to understand their own problem situation and hence aids them in resolving it. (5) Successive graphical representations of later states of affairs are also fed back to the respondents so that the improvements (in what is a complex dynamic situation) are apparent to them. The logic of the method, the syntactical structure of the questionnaire and the process of automation are described. Details of the graphs and their properties are given which emphasize the capacity of the PRI to accommodate great variability. Operational definitions and evidence of both validity and reliability are given. The penultimate section of the paper presents three examples of the use of the PRI. The paper concludes with a discussion of the feasibility of implementing the PRI in the clinical setting, details of ongoing work, and the suggestion that automated techniques provide the means of producing idiographic psychological assessments.
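Step (3) can be suggested by a minimal sketch with invented elements: interconnecting each element with its most likely outcome makes the dynamics of the relationship, including self-sustaining loops, visible.

```python
most_likely_outcome = {              # respondent's answers, invented here
    "I criticise":   "she withdraws",
    "she withdraws": "I feel guilty",
    "I feel guilty": "I criticise",  # the graph exposes a loop
}
for element, outcome in most_likely_outcome.items():
    print(f"{element} --> {outcome}")
```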

Journal ArticleDOI
John Fox
TL;DR: The current state of medical computing is surveyed and an overview of the broad classes of problems that can be created for the user when the new techniques are introduced is provided.
Abstract: It is argued that human-factors research could contribute to the successful development of medical computer technology, but that an initiative will be required from the human-factors professions. The paper attempts to contribute to such an initiative by surveying the current state of medical computing and by providing an overview of the broad classes of problems that can be created for the user when the new techniques are introduced.

Journal ArticleDOI
TL;DR: A formalism PSGL for writing natural language grammars has been developed, along with a computer system for interpreting it and parsing sentences; it allows one to write compact and linguistically apt grammars.
Abstract: A formalism PSGL for writing natural language grammars has been developed, along with a computer system for interpreting it and parsing sentences. PSGL combines ideas from augmented transition network theory, systemic grammar, and Chomsky's recent trace theory, in a way that allows one to write compact and linguistically apt grammars.

Journal ArticleDOI
TL;DR: The conclusion was drawn that position of pitch rise, under the conditions of the experiment, was perceived categorically, one category being early in the syllable and the other late, and there was some evidence for the existence of two further categories.
Abstract: An experiment was run in which listeners heard pairs of nonsense words exhibiting the same segmental structure, but differing in the form of pitch variation imposed. In each pair, the first word bore a pitch rise over 100 ms superimposed upon a generally declining pitch frequency, while the second word carried a similar variation, but with the rise occurring later. Listeners made a forced choice response of “SAME” or “DIFFERENT”. The null hypothesis, that listeners' ability to discriminate pairs as different would be independent of the mean position of the pitch rises, was rejected with great confidence and, subject to several caveats, the conclusion was drawn that position of pitch rise, under the conditions of the experiment, was perceived categorically, one category being early in the syllable and the other late. There was some evidence for the existence of two further categories. The generalization and extension of the work will provide a continuing challenge.

Journal ArticleDOI
TL;DR: This work is of interest in the field of computer learning inasmuch as it provides an example of an adaptive system that, rather than tuning numeric weights, actually varies its primary structural element, namely the grammar that defines its current language.
Abstract: Natural language acquisition deals with two very difficult problems in artificial intelligence: computer learning and natural language processing. This system focuses on the problems involved in the acquisition of primitive linguistic capability, that is, when words are first correlated with concepts and when the ordering of the words of an utterance first becomes important. From these beginnings the techniques developed herein eventually acquire the capability to deal with nested dependent clauses. This work is of interest in the field of computer learning inasmuch as it provides an example of an adaptive system that, rather than tuning numeric weights, actually varies its primary structural element, namely the grammar that defines its current language. This work is of interest in the field of natural language processing inasmuch as it requires the development of a parsing algorithm robust enough to deal with grammars and dictionaries that vary with time. The ability to automatically extend the grammar to include new sentence forms is also requisite for language acquisition.
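A minimal sketch of an adaptive system whose varying structural element is the grammar itself; the lexicon and sentence forms are invented toys.

```python
lexicon = {"dog": "N", "cat": "N", "bites": "V", "sees": "V"}
grammar = {("N", "V", "N")}             # currently known sentence forms

def parse(sentence):
    cats = tuple(lexicon[w] for w in sentence.split())
    return cats in grammar, cats

for s in ["dog bites cat", "cat sees", "dog sees"]:
    ok, form = parse(s)
    if ok:
        print(f"parsed {s!r} with known form {form}")
    else:
        grammar.add(form)               # extend the grammar: acquisition
        print(f"learned new form {form} from {s!r}")
```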

Journal ArticleDOI
TL;DR: A survey of some of the modern techniques and ideas of machine understanding of natural language shows them, although superficially different, to be fundamentally similar.
Abstract: The development of machine understanding of natural language is briefly traced from the early years of machine translation to today's question answering and translation systems. This survey of some of the modern techniques and ideas shows them, although superficially different, to be fundamentally similar.

Journal ArticleDOI
Charles C. Tappert
TL;DR: A Markov-model acoustic-phonetic component is constructed for the synthesis of standard acoustic representations of connected speech; it permits matching between actual acoustic data and internally modeled acoustic data, and can be employed in various ways, for example to label speech automatically.
Abstract: A Markov-model acoustic-phonetic component is constructed for the synthesis of standard acoustic representations of connected speech. The primary building blocks are phones, with Markov models structured so that phone length, spectral power and fundamental frequency are parametrically controlled. The model generates acoustic parameter outputs at 10-ms time steps. The acoustic-phonetic component permits matching between actual acoustic data and internally modeled acoustic data, and can be employed in various ways: to label speech automatically, as a phone decoder to obtain estimated phone strings, and in speech recognizers which match at the acoustic level.
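A minimal sketch of a parametrically controlled Markov phone model emitting acoustic parameter frames every 10 ms; all parameter values below are invented, and the paper's models are considerably richer.

```python
import random

def phone_frames(stay=0.8, power=60.0, f0=120.0):
    """Yield (spectral power, fundamental frequency) frames for one phone.
    Expected phone length is 1 / (1 - stay) frames (geometric)."""
    while True:
        yield (random.gauss(power, 2.0), random.gauss(f0, 5.0))
        if random.random() > stay:      # leave the phone's Markov state
            return

random.seed(0)
for phone, f0 in [("ah", 130.0), ("n", 110.0)]:
    frames = list(phone_frames(f0=f0))
    print(phone, f"{len(frames)} frames ~ {10 * len(frames)} ms")
```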

Journal ArticleDOI
TL;DR: The applicability of the GUHA method of mechanized hypothesis formation (General Unary Hypotheses Automaton) in data analysis is tested by processing concrete sociological data.
Abstract: The applicability of the GUHA method of mechanized hypothesis formation (General Unary Hypotheses Automaton) in data analysis is tested by processing concrete sociological data. For the sake of comparison, some usual statistical methods are applied to the same data sample. The advantages of the joint use of mechanized hypothesis formation systems and statistical programs in data analysis are briefly discussed.

Journal ArticleDOI
TL;DR: The paper makes explicit the nature of some logical paradoxes by representing them as logical nets, i.e. simple finite automata expressed in the structural language, both binary and non-classical multivalued ones.
Abstract: The paper makes explicit the nature of some logical paradoxes by representing them in the form of logical nets, i.e. simple finite automata expressed in the structural language, both binary and non-classical multivalued ones. In this representation the structure of the problems that eventually turn into paradoxes is expressed by the structure of the respective logical nets, and the course of reasoning about the problems by the behaviour of these nets. All the nets in question have memory, reflecting the fact that they depend on self-reference. It is shown, however, that self-reference is not by itself a sufficient and necessary condition for a problem of this class to turn into a paradox.
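The liar paradox gives the simplest instance: as a one-element binary net with memory, the self-referential sentence becomes the update x(t+1) = NOT x(t), which has no fixed point. A minimal sketch:

```python
# "This sentence is false" as a one-element logical net with feedback.
# (In a multivalued net the update x = 1 - x would have the
# non-classical fixed point 1/2.)
x, trace = True, []
for _ in range(6):
    trace.append(x)
    x = not x                 # next state contradicts the present one
print(trace)                  # alternates forever: no fixed point
```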

Journal ArticleDOI
TL;DR: This paper presents a self-contained introduction to, and implementation description of, a simulation system for modeling simultaneous actions and continuous processes (Hendrix, 1973), together with a new methodology for the construction of world models.
Abstract: This paper presents a self-contained introduction to, and implementation description of, a simulation system for modeling simultaneous actions and continuous processes (Hendrix, 1973). The essence of the system is described by a portion of its abstract: A new methodology for the construction of world models is presented. The central feature of this methodology is a mechanism which makes possible the modeling of (1) simultaneous, interactive processes, (2) processes characterized by a continuum of gradual change, (3) involuntarily activated processes (such as the growing of grass) and (4) time as a continuous phenomenon; and by a recent review, Gaines (1975): This is a fascinating paper that will be of interest outside the “artificial intelligence” (AI) context in which it is written, from those concerned with simulating and controlling multi-element systems to those interested in operational definitions of concepts such as “causality.” Three robot world models are incrementally developed, each introducing a new modeling concept. World models, including a robot world (with sample output), an electrical world, and a Turing world, are also presented. The interactive operating environment presented permits the user to inspect and alter the run-time structure. A detailed account of the implementation is presented.
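A minimal sketch of the central mechanism: simultaneous, interactive processes undergoing gradual change over (approximately) continuous time. The world, the processes and their rates below are invented.

```python
world = {"grass": 0.0, "water": 10.0}

def grass_grows(w, dt):          # involuntarily activated process
    if w["water"] > 0:
        w["grass"] += 0.5 * dt   # continuum of gradual change

def sun_evaporates(w, dt):       # a second, concurrent process
    w["water"] = max(0.0, w["water"] - 0.2 * dt)

t, dt = 0.0, 0.1                 # small steps approximate continuous time
while t < 5.0:
    for process in (grass_grows, sun_evaporates):   # interleaving
        process(world, dt)
    t += dt
print(world)
```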

Journal ArticleDOI
TL;DR: The purpose of each of the three phases is to facilitate control of the system (or environment)—prudent exploration accelerates modelling, successful modelling assists control.
Abstract: Imagine being given a new system to control, whose structure is unknown and can only be ascertained by input-output experiments. The process of interacting with it can be broken down into three phases which, although they are distinct conceptually, usually overlap in practice. The breakdown seems to correspond with what we do intuitively when presented with a strange device that we cannot take apart—like, for example, a terminal attached to an unfamiliar computer system. Firstly, input sequences must be synthesized which force the system to exhibit interesting behaviour—the exploration problem. Secondly, the input-output behaviour of the system must be modelled. Finally, one must learn how to control the system by generating sequences of inputs which drive it into desirable states. The purpose of each of the three phases is to facilitate control of the system (or, as we shall say, environment)—prudent exploration accelerates modelling, successful modelling assists control. This paper discusses these three components of the learning control problem, and summarizes results and techniques that bear on each of them.
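The three phases can be made concrete with a minimal sketch against a toy environment; the hidden dynamics, inputs and goal are all invented here.

```python
import random
from collections import deque

hidden = {(0, 'a'): 1, (0, 'b'): 0,     # dynamics unknown to the learner
          (1, 'a'): 2, (1, 'b'): 0,
          (2, 'a'): 2, (2, 'b'): 1}

# 1. Exploration: synthesize inputs that make the system exhibit behaviour.
random.seed(0)
state, log = 0, []
for _ in range(200):
    u = random.choice('ab')
    log.append((state, u, hidden[(state, u)]))
    state = hidden[(state, u)]

# 2. Modelling: fit the observed input-output behaviour.
model = {(s, u): nxt for s, u, nxt in log}

# 3. Control: search the model for inputs driving it to a desirable state.
def plan(start, goal):
    queue, seen = deque([(start, "")]), {start}
    while queue:
        s, path = queue.popleft()
        if s == goal:
            return path
        for u in 'ab':
            nxt = model.get((s, u))
            if nxt is not None and nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, path + u))

print(plan(0, 2))   # e.g. 'aa'
```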

Journal ArticleDOI
Naomichi Furutani
TL;DR: The main psychological state (w-sense) and four kinds of psychological factor which affect the w-sense are defined, and the desire for steering action is explained by a psychological balance between the driving goal and the current driving state.
Abstract: Steering behaviour of a car on both straight and curved roads is studied by use of the butterfly catastrophe. Adopting the same model framework and treatment of psychological aspects as Part I, the main psychological state (w-sense) and four kinds of psychological factor which affect the w-sense are defined. The w-sense is interpreted as a dynamic sense of lateral location on the road associated with steering behaviour. The desire for steering action is explained by a psychological balance between the driving goal and the current driving state. The steering behaviour model combines the submodels of two kinds of world (implicit and explicit) and the interfacial field. In the case of the straight road, two types of steering are defined according to the property of the driving goal: severe steering and easy steering. A simulation of zigzag behaviour on a straight road is carried out using the obtained model. In the case of the curved road, the curved-stable run and the driving programme at the entrance of the curved road are discussed.
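For reference, the canonical unfolding of the butterfly catastrophe has one behaviour variable x and four control parameters, which presumably play the roles of the w-sense and the four psychological factors respectively:

$$V(x) = x^{6} + a\,x^{4} + b\,x^{3} + c\,x^{2} + d\,x$$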

Journal ArticleDOI
TL;DR: A procedural means is described for designing computer-assisted instructional systems, called Unobtrusive Problem Solving Monitors (UPSMs), which (1) monitor students solving multi-step problems and (2) offer useful, pertinent advice to the student based on the prior student input history.
Abstract: This paper describes a procedural means for designing computer-assisted instructional (CAI) systems called Unobtrusive Problem Solving Monitors (UPSMs) which (1) monitor students solving multi-step problems and (2) offer useful, pertinent advice to the student based on the prior student input history. These systems represent a conceptual advance over earlier types of systems in that they allow students to enter intermediate results of their problem solving activity, yet also allow them to discover and correct any errors in these results without unsolicited prompting from the system. At the same time, they can offer specific advice generated from a student's intermediate results whenever requested. These features allow students to obtain the benefits of problem solving in an unhindered, unguided fashion, yet allow them to request help when it is needed. This paper also outlines a prototype UPSM system that has been implemented in the area of plane geometry. This implementation incorporates a problem solver that solves portions of the problems under consideration while generating structured, verbalized descriptions of the solution process which are used as sources of incremental advice for the student. Because the problem solver attempts to extend lines of reasoning started by the student, the advice that is generated allows the geometry UPSM to adapt its responses to a variety of student input histories.
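A minimal sketch of the unobtrusive pattern, with one toy inference rule and invented geometry facts: the student enters intermediate results freely, and advice is produced only on request, by extending the student's own line of reasoning.

```python
given = {"AB = DC"}                     # hypothetical given facts
rules = {("AB = DC", "BC = CB"): "triangle ABC = triangle DCB"}

def advise(student_steps):
    facts = given | set(student_steps)  # build on the student's work
    for premises, conclusion in rules.items():
        if set(premises) <= facts and conclusion not in facts:
            return f"Consider deriving: {conclusion}"
    return "No suggestion; keep going."

steps = ["BC = CB"]                     # unprompted intermediate entry
print(advise(steps))                    # advice only when asked
```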