scispace - formally typeset

Takanori Kochiyama

Researcher at Primate Research Institute

Publications: 157
Citations: 5,373

Takanori Kochiyama is an academic researcher at the Primate Research Institute. He has contributed to research on topics including functional magnetic resonance imaging and facial expression. He has an h-index of 35 and has co-authored 147 publications receiving 4,747 citations. His previous affiliations include Queen's University and Kyoto University.

Papers
Journal Article (DOI)

Prefrontal and premotor cortices are involved in adapting walking and running speed on the treadmill: an optical imaging study

TL;DR: Results indicate that the prefrontal and premotor cortices are involved in adapting to locomotor speed on the treadmill, and these areas might predominantly participate in the control of running rather than walking.
Journal Article (DOI)

Enhanced neural activity in response to dynamic facial expressions of emotion: an fMRI study

TL;DR: A broad region of the occipital and temporal cortices, especially in the right hemisphere, showed higher activation during viewing of dynamic facial expressions than during viewing of either control stimulus; this effect was common to both expressions.

Research Report: Enhanced neural activity in response to dynamic facial expressions of emotion: an fMRI study

TL;DR: This paper found that the left amygdala showed heightened activation in response to dynamic facial expressions relative to both control stimuli, although not in the case of happy expressions; the right ventral premotor cortex was also activated.
Journal Article (DOI)

Internally simulated movement sensations during motor imagery activate cortical motor areas and the cerebellum.

TL;DR: It is concluded that kinesthetic sensation associated with imagined movement is internally simulated during motor imagery by recruiting multiple motor areas in the absence of overt movement.
Journal Article (DOI)

Emotional expression boosts early visual processing of the face: ERP recording and its decomposition by independent component analysis.

TL;DR: These findings confirm that emotional signals boost early visual processing of face stimuli; this effect might be implemented by re-entrant projections from the amygdala.