
Showing papers in "Behavior Research Methods Instruments & Computers in 1986"


Journal ArticleDOI
TL;DR: The general natures of psychometric functions and of thresholds are reviewed, various methods for estimating sensory thresholds are summarized, and the method that is most efficient in principle, maximum-likelihood threshold estimation, is examined in detail.
Abstract: Laboratory computers permit detection and discrimination thresholds to be measured rapidly, efficiently, and accurately. In this paper, the general natures of psychometric functions and of thresholds are reviewed, and various methods for estimating sensory thresholds are summarized. The most efficient method, in principle, using maximum-likelihood threshold estimations, is examined in detail. Four techniques are discussed that minimize the reported problems found with the maximum-likelihood method. A package of FORTRAN subroutines, ML-TEST, which implements the maximum-likelihood method, is described. These subroutines are available on request from the author.
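The maximum-likelihood approach the paper describes can be sketched in outline: given trial-by-trial responses, choose the threshold that maximizes the likelihood of the data under an assumed psychometric function. The following Python sketch is illustrative only, not the ML-TEST FORTRAN package; the logistic form, slope, guessing rate, and synthetic data are all assumptions.

```python
import math

def psychometric(x, threshold, slope=1.0, gamma=0.5):
    """Probability correct in a 2AFC task: guessing floor gamma, logistic rise."""
    return gamma + (1.0 - gamma) / (1.0 + math.exp(-slope * (x - threshold)))

def ml_threshold(trials, candidates, slope=1.0):
    """Return the candidate threshold with maximum likelihood for the data.

    trials: list of (stimulus_level, correct) pairs, correct in {0, 1}.
    candidates: iterable of threshold values to evaluate (a simple grid here).
    """
    def log_likelihood(t):
        ll = 0.0
        for x, correct in trials:
            p = psychometric(x, t, slope)
            ll += math.log(p if correct else 1.0 - p)
        return ll
    return max(candidates, key=log_likelihood)

# Synthetic observer whose true threshold is near 5 (invented data).
trials = [(2, 0), (3, 1), (3, 0), (4, 0), (4, 1), (5, 1), (6, 1), (7, 1), (8, 1)]
estimate = ml_threshold(trials, [i * 0.5 for i in range(21)])  # grid 0.0 .. 10.0
```

In adaptive use, the same likelihood computation is repeated after every trial, and the next stimulus is placed at the current threshold estimate.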

265 citations


Journal ArticleDOI
TL;DR: This paper found that females tended to use more extreme ratings than did males when rating words on the pleasantness scale, while males tended to rate words higher on the imagery and familiarity scales.
Abstract: No catalog of words currently available contains normative data for large numbers of words rated low or high in affect. A preliminary sample of 1,545 words was rated for pleasantness by 26–33 college students. Of these words, 274 were selected on the basis of their high or low ratings. These words, along with 125 others (Rubin, 1981), were then rated by additional groups of 62–76 college students on 5-point rating scales for the dimensions of pleasantness, imagery, and familiarity. The resulting mean ratings were highly correlated with the ratings obtained by other investigators using some of the same words. However, systematic differences in the ratings were found for male versus female raters. Females tended to use more extreme ratings than did males when rating words on the pleasantness scale. Also, females tended to rate words higher on the imagery and familiarity scales. Whether these sex differences in ratings represent cognitive differences between the sexes or merely differences in response style is a question that can be determined only by further research.

168 citations


Journal ArticleDOI
TL;DR: Preliminary data from 26 disabled readers indicate that there are significant benefits of speech feedback for reading comprehension and word recognition, and that children enjoy reading with the system.
Abstract: In this paper, we describe the application of new computer and speech synthesis technologies for reading instruction. Stories are presented on the computer screen, and readers may designate words or parts of words that they cannot read for immediate speech feedback. The important contingency between speech sounds and their corresponding letter patterns is emphasized by displaying the letter patterns in reverse video as they are spoken. Speech feedback is provided by an advanced text-to-speech synthesizer (DECtalk). Intelligibility data are presented, showing that DECtalk can be understood almost as well as natural human speech by both normal adults and reading disabled children. Preliminary data from 26 disabled readers indicate that there are significant benefits of speech feedback for reading comprehension and word recognition, and that children enjoy reading with the system.

99 citations


Journal ArticleDOI
TL;DR: The overall performance of the best synthesis system, DECtalk-Paul, was equivalent to natural speech only in terms of performance on initial consonants, and suggestions for future research on improving the quality of synthetic speech are considered.
Abstract: We present the results of studies designed to measure the segmental intelligibility of eight text-to-speech systems and a natural speech control, using the Modified Rhyme Test (MRT). Results indicated that the voices tested could be grouped into four categories: natural speech, high-quality synthetic speech, moderate-quality synthetic speech, and low-quality synthetic speech. The overall performance of the best synthesis system, DECtalk-Paul, was equivalent to natural speech only in terms of performance on initial consonants. The findings are discussed in terms of recent work investigating the perception of synthetic speech under more severe conditions. Suggestions for future research on improving the quality of synthetic speech are also considered.

91 citations


Journal ArticleDOI
TL;DR: A number of basic methods used in the lab for generating psychophysical displays with an Adage RDS-3000 raster display system are described, and the syntax of the software routines involved is shown.
Abstract: For the study of human and animal vision, the video framebuffer is the only technology that is capable of displaying two-dimensional images with precise control of contrast, luminance, and display timing. The video framebuffer also allows precise control of color. However, this device is not designed for precise psychophysical displays, and techniques must be developed to use them in this role. In order to be concrete, an RDS-3000 raster display system (Adage, 1982) is used which is hosted by a PDP-11/73 under the Venix operating system. The principles generalize to other machines. Where it clarifies the issues, the syntax of the software routines involved is shown.

77 citations


Journal ArticleDOI
TL;DR: Results suggest that this aerosol nasal-spray method of presenting nicotine provides the measured doses necessary for quantification of nicotine’s effects.
Abstract: For studies of the behavioral and physiological effects of nicotine in smokers, delivery of nicotine via cigarette smoking is highly variable and difficult to control. A more precise method of delivery is needed in order to accurately manipulate the amount of nicotine being presented and, thus, to determine its quantitative effects. The objective of the study reported here was to test an aerosol nasal-spray method of delivering measured doses of nicotine. Eleven healthy males were presented 0 mg (placebo), 0.5 mg, 1.0 mg, and 2.0 mg of nicotine over 5 min during four separate sessions, and changes were observed in plasma nicotine concentration and cardiovascular activity. Dose-response relationships were observed between nicotine presented via this method and plasma nicotine, heart rate, systolic blood pressure, and, to a lesser extent, diastolic blood pressure. These results suggest that this aerosol spray method of presenting nicotine provides the measured doses necessary for quantification of nicotine’s effects.

59 citations


Journal ArticleDOI
TL;DR: In this paper, a general logical model of the properties of suppressor variables is proposed; consistent exploration of possible manifestations of suppressors within this theoretical framework accounts for extant classifications of suppressor variables into the classical, net, and cooperative categories and suggests the existence of new subcategories.
Abstract: A general logical model of properties of suppressor variables is proposed. Consistent exploration of possible manifestations of suppressor variables within this theoretical framework accounts for extant classifications of suppressor variables into the classical, net, and cooperative categories and suggests the existence of new subcategories not detected previously. The model discussed leads to consistent identification and classification of suppressor variables and facilitates computer simulation.
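The classical suppressor case covered by such classifications can be illustrated numerically: a predictor uncorrelated with the criterion can still raise the squared multiple correlation by removing criterion-irrelevant variance from another predictor. A minimal Python illustration using the standard two-predictor R-squared formula; the correlation values are invented.

```python
def r_squared_two(r_y1, r_y2, r_12):
    """Squared multiple correlation of Y with two standardized predictors,
    given the two validities (r_y1, r_y2) and the predictor intercorrelation r_12."""
    return (r_y1 ** 2 + r_y2 ** 2 - 2.0 * r_y1 * r_y2 * r_12) / (1.0 - r_12 ** 2)

# Classical suppression: X2 is uncorrelated with the criterion (r_y2 = 0),
# yet including it raises R^2 because it suppresses criterion-irrelevant
# variance in X1 (r_12 = .5).  All correlation values are made up.
r2_with = r_squared_two(0.4, 0.0, 0.5)   # about .213
r2_alone = 0.4 ** 2                      # .16 with X1 alone
```

Simulating many (r_y1, r_y2, r_12) configurations with a function like this is one way to map out the subcategories the model predicts.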

59 citations


Journal ArticleDOI
TL;DR: This report describes ARITHPRO, a computer simulation of children’s arithmetic word-problem solving, along with its architecture and knowledge base; the program is an instantiation of a recently proposed cognitive model of the knowledge and procedures required to solve such problems.
Abstract: ARITHPRO is a computer simulation of children’s arithmetic word-problem solving behavior. It is an instantiation of a recently proposed cognitive model of the knowledge and procedures required to solve such problems. The program solves word problems by (1) comprehending the story text in which the problem is embedded, (2) comprehending numerical information as sets of objects, (3) building superstructures from these sets, thereby specifying their logical relations, and (4) using a counting procedure to derive the answer to the problem. This report describes ARITHPRO and its architecture and knowledge base. A few comparisons of ARITHPRO’s performance with that of children are also provided.

55 citations


Journal ArticleDOI
TL;DR: A short computer program is provided to calculate the criterion value β and the sensitivity index d’ from hit and false-alarm rates in signal detection theory.
Abstract: Gardner, Dalsing, Reyes, and Brake (1984) supplied a table of criterion values (β) related to hit and false-alarm rates in signal detection theory. Other methods of calculating β are suggested as more accurate alternatives to using that table. A short computer program is provided to calculate β and the sensitivity index d’.
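For readers who want the computation rather than a table lookup, d’ and β follow directly from the inverse-normal transform of the hit and false-alarm rates. A Python sketch of the standard formulas (this is not the program listed in the article):

```python
from statistics import NormalDist

def dprime_beta(hit_rate, fa_rate):
    """Sensitivity d' and likelihood-ratio criterion beta from a hit rate
    and a false-alarm rate (both must lie strictly between 0 and 1)."""
    nd = NormalDist()
    z_hit, z_fa = nd.inv_cdf(hit_rate), nd.inv_cdf(fa_rate)
    d_prime = z_hit - z_fa
    # beta is the ratio of the signal and noise densities at the criterion.
    beta = nd.pdf(z_hit) / nd.pdf(z_fa)
    return d_prime, beta

d, beta = dprime_beta(0.84, 0.16)   # symmetric case: beta = 1, d' about 2
```

Rates of exactly 0 or 1 must be adjusted before the transform (a standard correction), since the inverse normal is undefined there.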

46 citations


Journal ArticleDOI
TL;DR: RSCORE-J computes jackknife estimates of ROC parameters and their standard errors from rating-method data pooled over a group of observers.
Abstract: RSCORE-J, a computer program for a signal-detection analysis of pooled rating-method data, is listed and described. RSCORE-J computes jackknife estimates of ROC parameters and their standard errors from rating-method data pooled over a group of observers.
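The jackknife idea that RSCORE-J applies to ROC parameters is general: recompute the statistic with each observer left out, then combine the leave-one-out values into a bias-corrected estimate and a standard error. A generic Python sketch, illustrated with the mean rather than with ROC fitting; the data are invented.

```python
import math

def jackknife(stat, data):
    """Leave-one-out jackknife estimate and standard error of `stat` on `data`."""
    n = len(data)
    loo = [stat(data[:i] + data[i + 1:]) for i in range(n)]   # leave-one-out values
    mean_loo = sum(loo) / n
    full = stat(data)
    estimate = n * full - (n - 1) * mean_loo                  # bias-corrected estimate
    se = math.sqrt((n - 1) / n * sum((v - mean_loo) ** 2 for v in loo))
    return estimate, se

mean = lambda xs: sum(xs) / len(xs)
est, se = jackknife(mean, [2.0, 4.0, 6.0, 8.0])
```

For the mean, the jackknife reproduces the ordinary standard error of the mean exactly; its value lies in applying the same recipe to statistics, such as fitted ROC parameters, that lack a closed-form standard error.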

43 citations


Journal ArticleDOI
TL;DR: The article reviews existing programs that function as funnels, inventors, and therapists and concludes with a discussion of the potential efficacy of such programs in solving the major problems associated with planning and translating.
Abstract: This article details three difficulties encountered during the prewriting and drafting stages of document preparation and describes computer aids designed for each difficulty. Writers experience problems in planning ideas and translating ideas into text because of attentional overload, inability to generate useful ideas, and affective interference. Idea processors are programs that perform various functions to assist with generating and organizing ideas so they can be communicated successfully in a written document. Among other things, an idea processor can serve as a funnel for attention, an inventor of ideas, or a therapist for emotional hindrance. The article reviews existing programs that function as funnels, inventors, and therapists and concludes with a discussion of the potential efficacy of such programs in solving the major problems associated with planning and translating.

Journal ArticleDOI
TL;DR: Sentence-completion norms using a multiple production measure are presented in this paper; a subset of the items was taken from the Bloom and Fischler (1980) sentence-completion norms in order to compare the Cloze measure with the present multiple production measure.
Abstract: Sentence-completion norms for sentences using a multiple production measure are presented. A subset of these items were taken from the Bloom and Fischler (1980) sentence-completion norms in order to compare the Cloze measure with the present multiple production measure. For both measures, the sentence constraint correlated negatively with the number of responses generated across subjects. Although the Cloze measure and the multiple production measure were highly correlated, sentence predictability was higher when the multiple production measure was used. These sentence norms provide an alternative to norms derived using the Cloze procedure.

Journal ArticleDOI
TL;DR: In this article, the authors present an assembly-language routine for synchronizing experimental timing with display presentation on the Apple Macintosh, which can be readily adapted to any of the Macintosh-based languages.
Abstract: Many of the timed functions that concern psychologists, such as perceptual presentations and reaction time, are sensitive to a maximum variability in display timing caused by screen-refresh characteristics. For the Apple Macintosh, the screen operating speed is 60 Hz, which translates to an average of 8.33-msec variability. For microcomputers other than the Macintosh, a variety of hardware and software modifications to generate millisecond timing have become standard (e.g., Reed, 1979). Other than Reed College’s (1985) implementation in Rascal, which requires the Rascal development language, there has been no method of which we were aware to synchronize experimental timing with display presentation on the Macintosh. This limitation in the usefulness of the Macintosh as an otherwise excellent research tool can be overcome using Drexel University’s MilliTimer. The assembler code which follows should be considered in the public domain and can be readily adapted to any of the Macintosh-based languages.
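The 8.33-msec figure follows from the refresh arithmetic: at 60 Hz a frame lasts 16.67 msec, and an unsynchronized display request waits a uniformly distributed fraction of a frame for the next retrace, 8.33 msec on average. A small Python sketch of that arithmetic (illustrative only; the original MilliTimer is Macintosh assembler code):

```python
FRAME_HZ = 60.0
FRAME_MS = 1000.0 / FRAME_HZ        # 16.67 ms between vertical retraces

def onset_delay(request_ms):
    """Delay until the next retrace for a change requested at time request_ms,
    measured from an arbitrary retrace at time 0."""
    return (-request_ms) % FRAME_MS

# Unsynchronized requests land anywhere within a frame, so the delay is
# uniform on [0, FRAME_MS) with mean FRAME_MS / 2, about 8.33 ms.
mean_delay_ms = FRAME_MS / 2.0
```

Synchronizing to the retrace, as MilliTimer does, removes this variability by making the request coincide with a frame boundary.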

Journal ArticleDOI
TL;DR: The three-dimensional numeric and graphic analyses of movement made possible by this system allow new studies into the nature of the neural control of movement.
Abstract: A system is presented that allows automated, three-dimensional tracking of hand and arm movements. The system incorporates commercially available optoelectronic cameras and provides portable and affordable, yet accurate, three-dimensional monitoring of multiple joints of the hands and arms. Special-purpose hardware components were developed, as was software for data acquisition, data processing, and graphic display. The hardware and software are described, along with such necessary procedures as system calibration and transformation of coordinate system frames of reference. Testing of the system revealed highly accurate three-dimensional spatial tracking. The three-dimensional numeric and graphic analyses of movement made possible by this system allow new studies into the nature of the neural control of movement.
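The coordinate-frame transformations mentioned above amount to a rigid rotation plus translation applied to each tracked point. A minimal Python sketch of one such mapping; the camera angle and offset are invented for illustration, and a real calibration would estimate a full rotation matrix rather than a single axis angle.

```python
import math

def rotate_z(point, theta):
    """Rotate a 3-D point about the z axis by theta radians."""
    x, y, z = point
    c, s = math.cos(theta), math.sin(theta)
    return (c * x - s * y, s * x + c * y, z)

def camera_to_lab(point, theta, origin):
    """Map a point from a camera frame into the lab frame: rotate about z,
    then translate by the camera's position in the lab (a rigid transform)."""
    rx, ry, rz = rotate_z(point, theta)
    ox, oy, oz = origin
    return (rx + ox, ry + oy, rz + oz)

# A camera rotated 90 degrees and offset 10 units along the lab x axis.
p = camera_to_lab((1.0, 0.0, 0.0), math.pi / 2, (10.0, 0.0, 0.0))
```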

Journal ArticleDOI
TL;DR: UNIX|STAT is a statistical package developed at the University of California, San Diego, and at the Wang Institute of Graduate Studies; over 20 programs allow the manipulation and analysis of data and are complemented by a detailed tutorial, a manual entry for each program, and a quick reference sheet.
Abstract: UNIX|STAT is a statistical package developed at the University of California, San Diego, and at the Wang Institute of Graduate Studies. Over 20 programs allow the manipulation and analysis of data and are complemented by a detailed tutorial, a manual entry for each program, and a quick reference sheet. The package was first introduced as 4 programs (Perlman, 1980). Since then, many programs have been added, and since 1980, the package has been distributed to over 400 UNIX sites. The portability of the package, written in C, was demonstrated when it was ported from UNIX to MSDOS at Cornell University on an IBM PC using the Lattice C compiler. The added capabilities of the package, combined with the availability on the popular MSDOS, used on IBM, AT&T, Wang, and other personal computers, make the programs even more useful for psychological research and instruction.

Journal ArticleDOI
TL;DR: A system is described for analyzing recorded natural speech in real time using a microcomputer; although developed primarily for clinical research, it has applicability to other research areas involving speech.
Abstract: A system is described for analyzing recorded natural speech in real time using a microcomputer. Recordings up to 15 min in length can be analyzed in terms of fundamental frequency, amplitude, length of utterances, and pauses. Although primarily developed for clinical research, the system has applicability to other research areas involving speech.
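Utterance and pause lengths of the kind this system reports can be derived from a sampled amplitude envelope by simple thresholding. A Python sketch of that segmentation step; the threshold, minimum-pause length, and envelope values are invented, and this is not the authors' implementation.

```python
def utterances(envelope, threshold, min_pause):
    """Return (start, end) index pairs of utterances in a sampled amplitude
    envelope, where a pause is a run of at least min_pause consecutive
    samples at or below threshold.  end is one past the last loud sample."""
    segments = []
    start = last_loud = None
    for i, a in enumerate(envelope):
        if a > threshold:
            if start is None:
                start = i                       # utterance begins
            last_loud = i
        elif start is not None and i - last_loud >= min_pause:
            segments.append((start, last_loud + 1))   # pause confirmed
            start = None
    if start is not None:                       # utterance ran to the end
        segments.append((start, last_loud + 1))
    return segments

env = [0, 5, 6, 0, 0, 0, 7, 8, 0]
found = utterances(env, threshold=1, min_pause=2)
```

With a known sampling rate, the index pairs convert directly into utterance and pause durations.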

Journal ArticleDOI
TL;DR: Techniques for circumventing some of the current limitations of digital image-processing systems combined with color raster television technology are presented, especially with regard to the issue of real-time variation of visual images.
Abstract: Digital image-processing systems combined with color raster television technology are opening new worlds for the visual psychophysicist. Some of the inherent limitations and practical hazards of these technologies are explored. Techniques for circumventing some of the current limitations are presented, especially with regard to the issue of real-time variation of visual images.
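One standard technique for real-time variation on such hardware is rewriting the color lookup table (LUT) each frame rather than redrawing the image itself. A Python sketch of a contrast-scaling LUT; the 256-level assumption and the mean-luminance value are illustrative, not tied to any particular framebuffer.

```python
def lut_for_contrast(contrast, levels=256, mean=127.5):
    """Lookup table that rescales pixel values around the mean luminance.
    Rewriting this table each frame varies image contrast in real time
    without touching the frame store."""
    return [round(mean + contrast * (i - mean)) for i in range(levels)]

lut = lut_for_contrast(0.5)     # half contrast: values compressed toward mean
```

At contrast 1.0 the table is the identity mapping; a sequence of tables stepped over frames produces smooth contrast modulation at the refresh rate.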

Journal ArticleDOI
TL;DR: The purpose of this article is to introduce Circumgrids, a computer grid package for personal computers that is written in Turbo Pascal for the IBM PC system, is user friendly, and provides several types of analysis.
Abstract: Personal construct theory (Kelly, 1955) is based on the assumption that a person's behavior is shaped by his/her constructs. Because of the importance placed on constructs, the major task for construct psychologists is the assessment of the individual's system of constructs. Nearly a thousand publications using construct theory have emerged since the theory's creation, with about 90% of these using scaling techniques called repertory grids. There are many forms of repertory grids (Beail, 1985; Fransella & Bannister, 1977), with each providing some numeric or geometric representation of the individual's construct system. In completing a grid, the person typically rates or ranks a set of people or events along a number of construct dimensions, such as intelligent, friendly, and so forth. This rating/ranking produces matrices of numbers that are analyzed, per individual, by one or more of numerous types of analysis. Results from these separate analyses are then either interpreted per individual, as in clinical assessment, or concatenated with results from other subjects' grids to be further analyzed. Research with grids can be very time-consuming when analysis proceeds by hand. For a number of years, the British government provided British researchers and clinicians free access to a principal components grid package. This service led to a dramatic increase in publications employing repertory grids. The service has since been discontinued. At present there are several commercially available computer packages for grid analysis. The packages generally provide only one type of analysis, are expensive, and require both programming skills and access to mainframe computers. The average researcher, clinician, or student hoping to use grid techniques today faces formidable obstacles. The purpose of this article is to introduce a computer grid package written for personal computers. The program, Circumgrids, is written in Turbo Pascal for the IBM PC system, is user friendly, and provides several types of analysis. The name Circumgrids is derived from the word circumspection, a technical term

Journal ArticleDOI
TL;DR: The MacLab at Drexel University as discussed by the authors is a series of "take-home" programs that convert a student's personal computer into a piece of experimental psychology equipment, which can be used for research.
Abstract: Every student at Drexel University is required, on admission, to purchase a Macintosh computer. Consequently, there is an understandable demand to effectively utilize this resource in the undergraduate curriculum. We have developed what amounts to a series of “take-home” programs that convert the Macintosh into a number of “pieces” of experimental psychology equipment. Providing each student with a personal psychology “MacLaboratory” has apparent pedagogical and practical benefits, from creative hands-on experience to ease of independent research. This paper summarizes details of the program to date, its development process, supporting materials, and our experience when every student has a personal computer.

Journal ArticleDOI
TL;DR: In this article, a method is described through which pigeons learn odd-item search rapidly and perform with high accuracy despite the appearance of each form as a target on some trials and as a distractor on others.
Abstract: In odd-item visual search, subjects confront a display on which a number of stimulus items appear. All but one of these items are identical; the subject must respond to the one item (the target) that in some way differs from all the others (the distractors). The time required to find the target reflects the similarity between the target form and the distractor form. A matrix of search times for all possible pairs of a set of 20 or more items can be obtained in a single session. Such similarity matrices may reflect stimulus features, dimensions, and categories, among other things. A method is described through which pigeons learn odd-item search rapidly and perform with high accuracy despite the appearance of each form as a target on some trials and as a distractor on others. The paper also describes the essential apparatus and exemplifies displays and data.

Journal ArticleDOI
TL;DR: Application of the procedures for the production of visually degraded picture, letter, and word stimuli, and of visual stimuli common to neuropsychological investigations, are discussed.
Abstract: In this article, we describe procedures, materials, and some representative results of a microcomputer-based approach to the degradation of visual stimuli for the investigation of perceptual identification. We discuss application of the procedures for the production of visually degraded picture, letter, and word stimuli, and of visual stimuli common to neuropsychological investigations.
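One common way to produce such degraded stimuli is to delete a fixed proportion of the "on" pixels in a binary bitmap. A Python sketch of that step; the glyph, proportion, and random seed are invented, and this is a generic illustration rather than the authors' procedure.

```python
import random

def degrade(bitmap, proportion, rng):
    """Return a copy of a binary bitmap with the given proportion of its
    'on' pixels deleted (set to 0) at random positions."""
    on = [(r, c) for r, row in enumerate(bitmap) for c, v in enumerate(row) if v]
    removed = rng.sample(on, int(round(proportion * len(on))))
    out = [row[:] for row in bitmap]            # leave the original intact
    for r, c in removed:
        out[r][c] = 0
    return out

rng = random.Random(42)                         # fixed seed: reproducible stimuli
glyph = [[1, 1, 1, 1],
         [1, 0, 0, 1],
         [1, 1, 1, 1]]
damaged = degrade(glyph, 0.5, rng)
```

Seeding the generator makes each degraded stimulus reproducible across subjects and sessions.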

Journal ArticleDOI
TL;DR: In this article, the transit-signal method was combined with nonaging intervals for double-stimulation experiments on reaction time (RT) to study the attentional demands of movements and the influence of level of momentary probability on basic RT effects.
Abstract: The nonaging-intervals procedure, in which momentary probability of stimulus occurrence remains constant, is preferable in principle to varied or constant intervals for double-stimulation experiments on reaction time (RT). However, elevation of RT has uniformly been found at short waiting intervals on single-stimulation baseline tasks. Effects attributable to the first stimulus on double-stimulation tasks would thus be confounded. The required level baselines were obtained for both simple and choice reactions by combining the transit-signal method with nonaging intervals. Possible reasons for this success were the elimination of timekeeping error and psychological refractoriness. Results with precued full response information show the expected decline of RT with increase of precue-to-stimulus interval. Suggestions are given for the use of nonaging intervals for studying the attentional demands of movements and the influence of level of momentary probability on basic RT effects.
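Nonaging foreperiods keep the momentary probability of stimulus onset constant, which is the defining property of the exponential waiting-time distribution. A Python sketch of sampling such intervals; the mean and sample size are illustrative, not taken from the study.

```python
import random

def nonaging_interval(mean_ms, minimum_ms=0.0, rng=random):
    """Sample a foreperiod whose momentary probability of ending is constant:
    an exponential waiting time (constant hazard), optionally offset by a
    minimum interval."""
    return minimum_ms + rng.expovariate(1.0 / mean_ms)

random.seed(1)
samples = [nonaging_interval(1000.0) for _ in range(10000)]
mean_sample = sum(samples) / len(samples)       # close to the 1000-ms mean
```

Because the hazard never rises as the wait lengthens, subjects cannot improve their readiness by timekeeping, which is the methodological point of the procedure.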

Journal ArticleDOI
TL;DR: A simple algorithm is presented that has varied applications in the analysis of eye-movement scan-path records; it is based in part on techniques developed for use in ethological and taxonomic studies of behavior.
Abstract: Progress in the development of techniques and equipment to record eye movements (Young & Sheena, 1975) has not been matched by progress or innovation in analytic techniques for assessing eye-movement data itself. By and large, studies that employ dynamic eye-movement records as a physiological measure of attention or of cognitive processing focus on fixations or saccade amplitudes as key variables. Yet the more global feature of eye-movement records, the overall distribution of fixations on the visual two-space (the actual stimulus display), has not received as much attention. An appeal to topographic, graph-theoretic, or spatial-point types of analysis can provide insights into the distributional structure of such global dimensions of eye-movement data. A visual scene can be regarded as a two-dimensional space that is characterized by regions that differ along a number of dimensions: informative value, texture, luminosity, and so forth. We can ask whether the successive displacement of eye positions, reflecting the global scan patterns of subjects, is regular, random, or structured in patterns of clusters, pairs, or single fixations over the visual field. Given a technique for reliably answering such questions for large data sets, we can then develop a number of indexes to characterize the ocular behavior of subjects in ways that go beyond the analysis of typically static variables, such as fixation duration and saccade amplitude. We present here a simple algorithm that has varied applications in the analysis of eye-movement scan-path records. This algorithm is based in part on techniques developed for use in ethological and taxonomic studies of behavior. The algorithm described here departs from such techniques in one important respect. Although clustering schemes in the past have not been particularly concerned with the order of arrival of elements in a cluster, this is
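An order-respecting clustering of this kind, where membership depends on when a fixation arrives and not just where it lands, can be sketched simply. The centroid-radius rule below is a plausible stand-in for illustration, not the authors' exact algorithm; the fixation coordinates and radius are invented.

```python
import math

def scanpath_clusters(fixations, radius):
    """Group successive fixations into clusters, respecting order of arrival:
    a fixation joins the current (most recent) cluster if it lies within
    `radius` of that cluster's running centroid; otherwise it starts a new
    cluster.  Returning to an old region therefore opens a new cluster."""
    clusters = []
    for point in fixations:
        if clusters:
            cx = sum(p[0] for p in clusters[-1]) / len(clusters[-1])
            cy = sum(p[1] for p in clusters[-1]) / len(clusters[-1])
            if math.dist(point, (cx, cy)) <= radius:
                clusters[-1].append(point)
                continue
        clusters.append([point])
    return clusters

path = [(0, 0), (1, 1), (0, 1), (10, 10), (11, 10), (0, 0)]
groups = scanpath_clusters(path, radius=3.0)    # three clusters, in scan order
```

Cluster counts, sizes, and revisit patterns computed from such groupings are exactly the kind of global index the passage calls for.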

Journal ArticleDOI
TL;DR: In this paper, the authors present preliminary arguments for the information-preservation view and introduce a new technique, that of using simulated projection surfaces, whose use in experimental situations suggests that Euclidean rectification is not necessary.
Abstract: For the film goer who sits to the front and side of a movie theater, the virtual space “behind” the screen undergoes affine and perspective transformations. These transformations should, one would think, make the rigidity of objects on the screen very difficult to discern. Despite the fact that it has long been known that viewers are not very sensitive to such distortions, a phenomenon I call La Gournerie’s paradox, the effect is without a good theoretical account. Two possibilities are: (1) that viewers rectify the distortions of Euclidean space through the use of information about screen slant, and (2) that sufficient information is preserved under these transformations so that perception may be unperturbed. This paper presents preliminary arguments for the information-preservation view and introduces a new technique, that of using simulated projection surfaces, whose use in experimental situations suggests that Euclidean rectification is not necessary.

Journal ArticleDOI
TL;DR: Fieller's Theorem, as discussed by the authors, describes the distribution of the ratio of two normal variables, such as the dose of a compound that is lethal to 50% of subjects tested (LD50) divided by the dose that is effective for 50% of treated subjects (ED50).
Abstract: Occasionally, one has need for confidence intervals or tests of significance for ratios of normal variables. For instance, tests of significance on the therapeutic safety ratio are needed in pharmacology. This ratio is the dose of a compound that is "lethal" to 50% of subjects tested (LD50) divided by the dose that is "effective" for 50% of treated subjects (ED50). The bigger this ratio is, the larger the lethal dose is compared with the effective dose and, therefore, the safer the drug. Although it is probably safe to assume that estimates such as ED50 and LD50 have approximately normal distributions, the distribution of the ratio of two normal variables is quite complex. This distribution was investigated first by Geary (1930), then more extensively by Fieller (1932), and more recently by a number of researchers (see, e.g., Hinkley, 1969; Marsaglia, 1965; Paulson, 1942; Shanmugalingam, 1982). Because of Fieller's extensive early work, the general theory of the distribution of the ratio of two normal variables is called "Fieller's Theorem" by some authors (see, e.g., Davies, 1961). Fieller's Theorem: a good approximation to the confidence interval for a ratio of normal variables can be solved in the following manner. For normal variables y and x, call their ratio v = y/x; assume that x and y are normally distributed, which is certainly tenable when x and y are means (the effects of nonnormality on the distribution of a ratio have not, to our knowledge, been investigated). Because d in Equation 2 is normally distributed, Equation 2 divided by the square root of Equation 3 is distributed as Student's t (see Kendall & Stuart, 1979, pp. 137-138, for a derivation). Squaring both sides we can write
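Fieller's confidence limits for a ratio v = y/x of normal estimates are the roots of a quadratic in v. A Python sketch for the simplest case of independent numerator and denominator; the input values are invented, and the correlated case adds a covariance term not shown here.

```python
import math

def fieller_interval(y_mean, x_mean, se_y, se_x, t_crit):
    """Fieller confidence limits for the ratio y_mean / x_mean, assuming
    independent, approximately normal numerator and denominator.
    Solves (y - v*x)^2 = t^2 * (se_y^2 + v^2 * se_x^2) for v."""
    a = x_mean ** 2 - (t_crit * se_x) ** 2
    b = -2.0 * x_mean * y_mean
    c = y_mean ** 2 - (t_crit * se_y) ** 2
    disc = b * b - 4.0 * a * c
    if a <= 0 or disc < 0:
        # Denominator not significantly different from zero:
        # the interval is unbounded (the "exclusive" Fieller case).
        return None
    root = math.sqrt(disc)
    return ((-b - root) / (2.0 * a), (-b + root) / (2.0 * a))

lo, hi = fieller_interval(y_mean=10.0, x_mean=5.0, se_y=1.0, se_x=0.5, t_crit=2.0)
```

Note that the interval is not symmetric about the point estimate y_mean / x_mean, which is exactly why a naive "ratio plus or minus t standard errors" approach fails.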

Journal ArticleDOI
TL;DR: The advantages and limitations of using computer-animated stimuli in studying motion perception are discussed and it is suggested that findings with computer-generated stimuli will generalize to natural events.
Abstract: The advantages and limitations of using computer-animated stimuli in studying motion perception are discussed. Most current programs of motion perception research could not be pursued without the use of computer graphics animation. Computer-generated displays afford latitudes of freedom and control that are almost impossible to attain through conventional methods. There are, however, limitations to this presentational medium. At present, computer-generated displays present simplified approximations of the dynamics in natural events. We know very little about how the differences between natural events and computer simulations influence perceptual processing. In practice, we tend to assume that the differences are irrelevant to the questions under study and that findings with computer-generated stimuli will generalize to natural events.

Journal ArticleDOI
TL;DR: This paper focuses on the evolution of one theory of orientation selection, showing how the author was led to differential geometry from “line detectors”; how parallel, distributed computational modeling led to novel proposals regarding curvature estimation; and how these proposals predicted psychophysical sensitivity to discontinuities.
Abstract: It is widely accepted that computer implementations can play a role in verifying psychological theories. In this paper, I argue for a much broader and more abstract role for computation, in particular, one that includes formulation as well as verification. Consideration of issues of abstract computation (what should be computed and how) provides a level of analysis between ecological issues at the problem level and realization issues at the physiological level. This is the computational connection. The paper reflects my personal experience so that my argument can be made concretely. I concentrate on the evolution of one theory of orientation selection, and I show how we were led to differential geometry from “line detectors”; how parallel, distributed computational modeling led to novel proposals regarding curvature estimation; and how these proposals predicted psychophysical sensitivity to discontinuities.

Journal ArticleDOI
TL;DR: In this article, a representative sample of arguments for and against the computational paradigm is presented and evaluated; the conclusion is that the idea of computation is productive for achieving a functionalist description of how we perceive and act.
Abstract: My concern is with the computer as a metaphor for explaining perception and action. A representative sample of arguments for and against the paradigm are presented and evaluated. The conclusion is that the idea of computation is productive for achieving a functionalist description of how we perceive and act. This level of description can contribute to our understanding independently of description achieved at the levels of neurophysiology and phenomenology. Some of the perceived limitations in the computational method rest on the assumption that the symbolic level must be discrete and abstract. In fact, worthwhile explanations within the information processing framework utilize continuous, modality-specific processes and representations as explanatory devices. One suggestion for a movement from the discrete to the continuous mode is advised to bring computational theories in line with the psychological phenomena they describe. Various alternatives to the computational framework are considered and found to be inadequate substitutes. An example of research is used to demonstrate the value of the continuous mode and the computational level of explanation.

Journal ArticleDOI
TL;DR: In this article, a scale-invariant segregation of image structure solely on the basis of orientation content is described, which is complementary to the well-explored method of filtering in spatial frequency bands; the latter technique is rotation invariant, whereas the former technique is scale invariant.
Abstract: A method is described for scale-invariant segregation of image structure solely on the basis of orientation content. This kind of image decomposition is an unexplored image-processing method that is complementary to the well-explored method of filtering in spatial frequency bands; the latter technique is rotation-invariant, whereas the former technique is scale-invariant. The complementarity of these two approaches is explicit in the fact that orientation and spatial frequency are orthogonal variables in the two-dimensional Fourier plane, and the filters employed in the one method depend only on the radial variable, whereas those employed in the other method depend only on the angular variable. The biological significance of multiscale (spatial frequency selective) image analysis has been well-recognized and often cited, yet orientation selectivity is a far more striking property of neural architecture in cortical visual areas. In the present paper, we begin to explore some coding properties of the scale-invariant orientation variable, paying particular attention to its perceptual significance in texture segmentation and compact image coding. Examples of orientation-coded pictures are presented with data compression to 0.3 bits per pixel.
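The complementarity described above can be made concrete: a spatial-frequency band-pass filter keeps an annulus of the 2-D Fourier plane (a function of the radial variable only), whereas an orientation filter keeps a wedge (a function of the angular variable only). The following is a minimal sketch of the wedge case in NumPy; the function name, wedge parameterization, and image are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def orientation_band(image, center_deg, width_deg):
    """Keep Fourier components whose orientation falls within half-width
    width_deg of center_deg, taken modulo 180 degrees (the spectrum of a
    real image is conjugate-symmetric, so a wedge and its 180-degree
    opposite are selected together)."""
    rows, cols = image.shape
    fy = np.fft.fftfreq(rows)[:, None]   # vertical frequency of each bin
    fx = np.fft.fftfreq(cols)[None, :]   # horizontal frequency of each bin
    angle = np.degrees(np.arctan2(fy, fx))
    # Half-open wedge [center - width, center + width) modulo 180, so a
    # set of wedges can tile all orientations with no double counting.
    d = (angle - (center_deg - width_deg)) % 180.0
    mask = d < 2.0 * width_deg
    return np.real(np.fft.ifft2(np.fft.fft2(image) * mask))

# Three wedges of half-width 30 degrees partition the full 180 degrees
# of orientation, so the bands sum back to the original image.
rng = np.random.default_rng(0)
img = rng.random((64, 64))
bands = [orientation_band(img, c, 30.0) for c in (0.0, 60.0, 120.0)]
recon = sum(bands)
print(np.allclose(recon, img))  # True: the wedges partition the plane
```

Note that the mask depends only on the angular variable, never on the radial one, so the same wedge passes every spatial scale of a given orientation. This is the sense in which the decomposition is scale-invariant, just as a radial band-pass filter is rotation-invariant.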

Journal ArticleDOI
TL;DR: Some of the institutional and curricular implications of 100% student access to personal computing are described, and the new perspective which these conditions create on the availability of courseware, on the process of courseware development, and on future directions in courseware development is explored.
Abstract: In this paper, I describe some of the institutional and curricular implications of 100% student access to personal computing. I then explore the new perspective which these conditions create on the availability of courseware, on the process of courseware development, and on future directions in courseware development. A new challenge and a new opportunity are created by the machine-rich environment that exists when every student has a computer. The challenge is in reexamining the contents of the curriculum and looking for things worth doing with the computer. The opportunity lies in introducing a computer-based component into the entire psychology curriculum, tying together the entire curriculum in new ways which introduce a new kind and level of experience for faculty and students. The personal computer allows one to create a structured learning environment, not only in the lecture hall and the tutorial laboratory, but also wherever the student keeps the machine. Freedom of access helps to free instructional computing from the capture effect of the computer itself. In a machine-rich environment, it is no longer possible to capitalize on whatever intrinsic motivation effect is created by providing students with access to a scarce resource. In addition, ease of access creates new opportunities for the use of microcomputer application programs (e.g., Hewett, 1985) and for courseware development. Several ideas presented here were first developed while the author was the recipient of faculty development and courseware development minigrants funded from a grant to Drexel University by the Pew Memorial Trust. In working out some of these ideas, it has been particularly useful to have the opportunity to view and discuss with colleagues at Drexel the wide variety of courseware materials which they have developed, or are currently developing. The final draft of this paper has benefited considerably from the reactions and feedback provided by anonymous reviewers and by colleagues at Drexel who read and commented on earlier drafts.