Author

Susan J. Lederman

Other affiliations: University of Toronto
Bio: Susan J. Lederman is an academic researcher from Queen's University. The author has contributed to research in topics: Haptic technology & Perception. The author has an h-index of 56, has co-authored 155 publications, and has received 12,669 citations. Previous affiliations of Susan J. Lederman include the University of Toronto.
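For readers unfamiliar with the metric, the h-index is the largest number h such that h of an author's papers have each received at least h citations. A minimal sketch of that calculation in Python (the citation counts below are made up for illustration, not Lederman's actual record):

    def h_index(citations):
        # Largest h such that at least h papers have h or more citations each.
        ranked = sorted(citations, reverse=True)
        h = 0
        for rank, cites in enumerate(ranked, start=1):
            if cites >= rank:
                h = rank
            else:
                break
        return h

    # Hypothetical citation counts for six papers; prints 4.
    print(h_index([120, 80, 40, 10, 3, 1]))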


Papers
Journal ArticleDOI
TL;DR: Two experiments establish links between desired knowledge about objects and hand movements during haptic object exploration, and show that in free exploration a procedure is generally used to acquire information about an object property not merely because it is sufficient, but because it is optimal or even necessary.

1,723 citations

Journal ArticleDOI
TL;DR: This tutorial focuses on the sense of touch within the context of a fully active human observer and describes an extensive body of research on “what” and “where” channels, the former dealing with haptic perception of objects, surfaces, and their properties, and the latter with perception of spatial layout on the skin and in external space relative to the perceiver.
Abstract: This tutorial focuses on the sense of touch within the context of a fully active human observer. It is intended for graduate students and researchers outside the discipline who seek an introduction to the rapidly evolving field of human haptics. The tutorial begins with a review of peripheral sensory receptors in skin, muscles, tendons, and joints. We then describe an extensive body of research on “what” and “where” channels, the former dealing with haptic perception of objects, surfaces, and their properties, and the latter with perception of spatial layout on the skin and in external space relative to the perceiver. We conclude with a brief discussion of other significant issues in the field, including vision-touch interactions, affective touch, neural plasticity, and applications.

822 citations

Book
20 Apr 2006
TL;DR: This book discusses the evolutionary development and anatomy of the hand, sensory neurophysiology, tactile and active haptic sensing, prehension, hand function across the lifespan, and current applications.
Abstract (table of contents):
1. Historical overview and general introduction
2. Evolutionary development and anatomy of the hand
3. Sensory neurophysiology
4. Tactile sensing
5. Active haptic sensing
6. Prehension
7. Non-prehensile skilled movements
8. End-effector constraints
9. Hand function across the lifespan
10. Applications
11. Summary, conclusions and future directions

544 citations

Journal ArticleDOI
TL;DR: The present study provides a baseline measure of recognition under those circumstances, and it indicates that haptic object recognition can be both rapid and accurate.
Abstract: How good are we at recognizing objects by touch? Intuition may suggest that the haptic system is a poor recognition device, and previous research with nonsense shapes and tangible-graphics displays supports this opinion. We argue that the recognition capabilities of touch are best assessed with three-dimensional, familiar objects. The present study provides a baseline measure of recognition under those circumstances, and it indicates that haptic object recognition can be both rapid and accurate.

522 citations

Journal ArticleDOI
TL;DR: In this article, the availability and saliency of object attributes under haptic exploration, with and without vision, were assessed by two tasks in which subjects sorted objects that varied factorially in size, shape, texture, and hardness.
Abstract: The availability and salience of object attributes under haptic exploration, with and without vision, were assessed by two tasks in which subjects sorted objects that varied factorially in size, shape, texture, and hardness. In the directed-discrimination task, subjects were instructed to sort along a particular dimension. Although levels on all dimensions were easily discriminated, shape was relatively less so for haptic explorers without vision, as was hardness for those using vision and haptics. Size was least discriminable for both groups. In the free-sorting task, subjects were to sort objects by similarity. Three groups used haptic exploration only; these were differentiated by the experimenters' definition of object similarity: unbiased haptics (no particular definition of similarity), haptically biased haptics (similarity = objects feel similar), and haptics plus visual imagery (similarity = objects' visual images are similar). A fourth group used vision as well as haptics, with instructions like those of the unbiased haptics group. Dimensional salience was measured by the extent to which levels on a dimension were differentiated in free sorting (more differentiation indicating higher salience). The unbiased haptics and haptically biased haptics groups were highly similar; both found the substance dimensions (hardness and texture) relatively salient. The haptics plus visual imagery group showed shape to be overwhelmingly salient, even more so when they were instructed to use two hands, but less so when they had just seen the objects. The haptics plus vision group showed salience to be more evenly distributed over the dimensions. Exploratory hand movements were videotaped and scored into four categories of exploratory procedure (Lederman & Klatzky, 1987): lateral motion, pressure, contour following, and enclosure (related to texture, hardness, shape, and size, respectively). The distribution of exploratory procedures was found to be directly related to both the designated dimension in the directed-discrimination task and the salient dimension in the free-sorting task. The results support our contention that the haptic and visual systems have distinct encoding pathways, with haptics oriented toward the encoding of substance rather than shape. This may reflect a direct influence of haptic exploratory procedures: the procedures that are executed under unbiased haptic encoding are those that are generally found to be rapid and accurate (high "ease of encoding"), and the execution of these procedures determines which object properties become salient.

418 citations


Cited by
Journal ArticleDOI
06 Jun 1986-JAMA
TL;DR: The editors have done a masterful job of weaving together the biologic, the behavioral, and the clinical sciences into a single tapestry in which everyone from the molecular biologist to the practicing psychiatrist can find and appreciate his or her own research.
Abstract: I have developed "tennis elbow" from lugging this book around the past four weeks, but it is worth the pain, the effort, and the aspirin. It is also worth the (relatively speaking) bargain price. Including appendixes, this book contains 894 pages of text. The entire panorama of the neural sciences is surveyed and examined, and it is comprehensive in its scope, from genomes to social behaviors. The editors explicitly state that the book is designed as "an introductory text for students of biology, behavior, and medicine," but it is hard to imagine any audience, interested in any fragment of neuroscience at any level of sophistication, that would not enjoy this book. The editors have done a masterful job of weaving together the biologic, the behavioral, and the clinical sciences into a single tapestry in which everyone from the molecular biologist to the practicing psychiatrist can find and appreciate his or her own research.

7,563 citations

Journal ArticleDOI
24 Jan 2002-Nature
TL;DR: The nervous system seems to combine visual and haptic information in a fashion that is similar to a maximum-likelihood integrator, and this model behaved very similarly to humans in a visual–haptic task.
Abstract: When a person looks at an object while exploring it with their hand, vision and touch both provide information for estimating the properties of the object. Vision frequently dominates the integrated visual-haptic percept, for example when judging size, shape or position, but in some circumstances the percept is clearly affected by haptics. Here we propose that a general principle, which minimizes variance in the final estimate, determines the degree to which vision or haptics dominates. This principle is realized by using maximum-likelihood estimation to combine the inputs. To investigate cue combination quantitatively, we first measured the variances associated with visual and haptic estimation of height. We then used these measurements to construct a maximum-likelihood integrator. This model behaved very similarly to humans in a visual-haptic task. Thus, the nervous system seems to combine visual and haptic information in a fashion that is similar to a maximum-likelihood integrator. Visual dominance occurs when the variance associated with visual estimation is lower than that associated with haptic estimation.
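As a rough illustration of the maximum-likelihood rule the abstract describes, the combined estimate is an inverse-variance-weighted average of the visual and haptic estimates, and its variance is never larger than that of either cue alone. The Python sketch below uses hypothetical heights and variances (not values from the study) to show why the less variable cue dominates:

    def ml_combine(vis_est, vis_var, hap_est, hap_var):
        # Weight each cue inversely to its variance (maximum-likelihood
        # combination for independent Gaussian noise).
        w_vis = (1.0 / vis_var) / (1.0 / vis_var + 1.0 / hap_var)
        w_hap = 1.0 - w_vis
        combined_est = w_vis * vis_est + w_hap * hap_est
        # The combined variance is smaller than either single-cue variance.
        combined_var = 1.0 / (1.0 / vis_var + 1.0 / hap_var)
        return combined_est, combined_var

    # Vision is the more reliable cue here, so the percept is pulled toward it:
    # prints a combined estimate of 54.4 with variance 0.8.
    print(ml_combine(vis_est=55.0, vis_var=1.0, hap_est=52.0, hap_var=4.0))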

4,142 citations

Journal ArticleDOI
The Perception of the Visual World (book; no abstract available).

2,250 citations

Journal ArticleDOI
TL;DR: A mechanism is proposed that encodes the desired goal of the action and applies to different levels of representational organization; the role of posterior parietal and premotor cortical areas in schema instantiation is also examined.
Abstract: This paper concerns how motor actions are neurally represented and coded. Action planning and motor preparation can be studied using a specific type of representational activity, motor imagery. A close functional equivalence between motor imagery and motor preparation is suggested by the positive effects of imagining movements on motor learning, the similarity between the neural structures involved, and the similar physiological correlates observed in both imaging and preparing. The content of motor representations can be inferred from motor images at a macroscopic level, based on global aspects of the action (the duration and amount of effort involved) and the motor rules and constraints which predict the spatial path and kinematics of movements. A more microscopic neural account calls for a representation of object-oriented action. Object attributes are processed in different neural pathways depending on the kind of task the subject is performing. During object-oriented action, a pragmatic representation is activated in which object affordances are transformed into specific motor schemas (independently of other tasks such as object recognition). Animal as well as human clinical data implicate the posterior parietal and premotor cortical areas in schema instantiation. A mechanism is proposed that is able to encode the desired goal of the action and is applicable to different levels of representational organization.

2,154 citations