Author

Tiina Parviainen

Bio: Tiina Parviainen is an academic researcher from the University of Jyväskylä. The author has contributed to research in the topics of Magnetoencephalography and Psychology. The author has an h-index of 16 and has co-authored 50 publications receiving 930 citations. Previous affiliations of Tiina Parviainen include Aalto University and Helsinki University of Technology.


Papers
Journal ArticleDOI
TL;DR: Action images, regardless of whether verbs or nouns were named, evoked stronger activation than object-only images in the posterior middle temporal cortex bilaterally, the left temporo-parietal junction, and the left frontal cortex, a network previously identified in the processing of action knowledge.

105 citations

Journal ArticleDOI
01 Feb 2017 - PLOS ONE
TL;DR: It is proposed that children, when learning new word forms in either native or foreign language, are not yet constrained by left-hemispheric segmental processing and established sublexical native-language representations. Instead, they may rely more on supra-segmental contours and prosody.
Abstract: It is commonly thought that phonological learning is different in young children compared to adults, possibly due to the speech processing system not yet having reached full native-language specialization. However, the neurocognitive mechanisms of phonological learning in children are poorly understood. We employed magnetoencephalography (MEG) to track cortical correlates of incidental learning of meaningless word forms over two days as 6-8-year-olds overtly repeated them. Native (Finnish) pseudowords were compared with words of foreign sound structure (Korean) to investigate whether the cortical learning effects would be more dependent on previous proficiency in the language rather than maturational factors. Half of the items were encountered four times on the first day and once more on the following day. Incidental learning of these recurring word forms manifested as improved repetition accuracy and a correlated reduction of activation in the right superior temporal cortex, similarly for both languages and on both experimental days, and in contrast to a salient left-hemisphere emphasis previously reported in adults. We propose that children, when learning new word forms in either native or foreign language, are not yet constrained by left-hemispheric segmental processing and established sublexical native-language representations. Instead, they may rely more on supra-segmental contours and prosody.
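To make the reported brain-behaviour link concrete, the short sketch below correlates a per-child gain in repetition accuracy with a reduction in right superior temporal activation. It is a minimal illustration with simulated placeholder values; the variable names, sample size, and numbers are assumptions and not the study's data or analysis code.

    import numpy as np

    # Hypothetical per-participant values (placeholders, not the study's data):
    # gain in repetition accuracy (%) and reduction in right superior temporal
    # activation (arbitrary source-amplitude units) for the recurring word forms.
    rng = np.random.default_rng(42)
    accuracy_gain = rng.normal(loc=5.0, scale=2.0, size=20)
    activation_drop = 0.8 * accuracy_gain + rng.normal(scale=1.5, size=20)

    # Pearson correlation between the behavioural change and the neural change.
    r = np.corrcoef(accuracy_gain, activation_drop)[0, 1]
    print(f"accuracy gain vs. activation reduction: r = {r:.2f}")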

93 citations

Journal ArticleDOI
TL;DR: Tracking the neural time course of syllable processing with magnetoencephalography shows that this continuous construction of meaning-based representations is aided by both top-down and bottom-up cues in the speech signal.
Abstract: In speech perception, extraction of meaning from complex streams of sounds is surprisingly fast and efficient. By tracking the neural time course of syllable processing with magnetoencephalography we show that this continuous construction of meaning-based representations is aided by both top-down (context-based) expectations and bottom-up (acoustic-phonetic) cues in the speech signal. Syllables elicited a sustained response at 200-600 ms (N400m) which became most similar to that evoked by words when the expectation for meaningful speech was increased by presenting the syllables among words and sentences or by using sentence-initial syllables. This word-like cortical processing of meaningless syllables emerged at the build-up of the N400m response, 200-300 ms after speech onset, during the transition from perceptual to lexical-semantic analysis. These findings show that the efficiency of meaning-based analysis of speech is subserved by a cortical system finely tuned to lexically relevant acoustic-phonetic and contextual cues.
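As an illustration of the time-window logic described here, the sketch below compares the mean amplitude in the 200-600 ms (N400m) window between two evoked waveforms. The sampling rate, epoch limits, and the random placeholder waveforms are assumptions for illustration, not the study's pipeline.

    import numpy as np

    # Assumed epoch: -200 to 800 ms relative to speech onset, sampled at 1 kHz.
    sfreq = 1000.0
    times = np.arange(-0.2, 0.8, 1.0 / sfreq)

    # Placeholder waveforms standing in for word- and syllable-evoked responses.
    rng = np.random.default_rng(0)
    evoked_word = rng.normal(size=times.size)
    evoked_syllable = rng.normal(size=times.size)

    # Mean amplitude within the 200-600 ms (N400m) window for each condition.
    win = (times >= 0.2) & (times <= 0.6)
    print(f"words: {evoked_word[win].mean():.3f}, syllables: {evoked_syllable[win].mean():.3f}")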

79 citations

Journal ArticleDOI
TL;DR: Dyslexic individuals seem to have an unusual cortical organization of general auditory processing in the time window of speech-sensitive analysis; in this window, the leftward shift of hemispheric balance for speech sounds is likely to reflect analysis at the phonetic level.
Abstract: Neurophysiological measures indicate cortical sensitivity to speech sounds by 150 ms after stimulus onset. In this time window dyslexic subjects start to show abnormal cortical processing. We investigated whether phonetic analysis is reflected in the robust auditory cortical activation at approximately 100 ms (N100m), and whether dyslexic subjects show abnormal N100m responses to speech or nonspeech sounds. We used magnetoencephalography to record auditory responses of 10 normally reading and 10 dyslexic adults. The speech stimuli were synthetic Finnish speech sounds (/a/, /u/, /pa/, /ka/). The nonspeech stimuli were complex nonspeech sounds and simple sine wave tones, composed of the F1+F2+F3 and F2 formant frequencies of the speech sounds, respectively. All sounds evoked a prominent N100m response in the bilateral auditory cortices. The N100m activation was stronger to speech than nonspeech sounds in the left but not in the right auditory cortex, in both subject groups. The leftward shift of hemispheric balance for speech sounds is likely to reflect analysis at the phonetic level. In dyslexic subjects the overall interhemispheric amplitude balance and timing were altered for all sound types alike. Dyslexic individuals thus seem to have an unusual cortical organization of general auditory processing in the time window of speech-sensitive analysis.
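The hemispheric comparison can be illustrated with a small sketch that picks an N100m peak in each hemisphere and computes a laterality index, (L - R)/(L + R). The search window, sampling rate, and placeholder waveforms are assumptions, and the index is a common convention rather than necessarily the measure used in the paper.

    import numpy as np

    # Assumed epoch: -100 to 400 ms relative to sound onset, sampled at 600 Hz.
    sfreq = 600.0
    times = np.arange(-0.1, 0.4, 1.0 / sfreq)

    # Placeholder left/right auditory-cortex source waveforms (not real data).
    rng = np.random.default_rng(1)
    left_ac = rng.normal(size=times.size)
    right_ac = rng.normal(size=times.size)

    # Peak amplitude in an 80-130 ms window, a typical N100m latency range.
    win = (times >= 0.08) & (times <= 0.13)
    peak_left = np.abs(left_ac[win]).max()
    peak_right = np.abs(right_ac[win]).max()

    # Laterality index: positive values indicate a stronger left-hemisphere response.
    li = (peak_left - peak_right) / (peak_left + peak_right)
    print(f"left={peak_left:.3f}, right={peak_right:.3f}, laterality index={li:+.3f}")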

73 citations

Journal ArticleDOI
TL;DR: The present data indicate involvement of the middle superior temporal cortex in semantic processing from ∼300 ms onwards, regardless of input modality.
Abstract: Retrieval of word meaning from the semantic system and its integration with context are often assumed to be shared by spoken and written words. How is modality-independent semantic processing manifested in the brain, spatially and temporally? Time-sensitive neuroimaging allows tracking of neural activation sequences. Use of semantically related versus unrelated word pairs, or sentences ending with a semantically highly or less plausible word, in separate studies of the auditory and visual modality, has associated lexical-semantic analysis with sustained activation at ∼200–800 ms. Magnetoencephalography (MEG) studies have further identified the superior temporal cortex as a main locus of the semantic effect. Nevertheless, a direct comparison of the spatiotemporal neural correlates of visual and auditory word comprehension in the same brain is lacking. We used MEG to compare lexical-semantic analysis in the visual and auditory domain in the same individuals, and contrasted it with phonological analysis that, according to models of language perception, should occur at a different time relative to semantic analysis in reading and speech perception. The stimuli were lists of four words that were either semantically or phonologically related, or had the final word unrelated to the preceding context. Superior temporal activation reflecting semantic processing occurred similarly in the two modalities, left-lateralized at 300–450 ms and bilateral thereafter, generated in nearby areas. The effect of phonology preceded the semantic effect in speech perception but not in reading. The present data indicate involvement of the middle superior temporal cortex in semantic processing from ∼300 ms onwards, regardless of input modality.

64 citations


Cited by
01 Jan 2016
An Introduction to the Event-Related Potential Technique.

2,445 citations

Journal ArticleDOI
TL;DR: It is shown that evidence bearing on where the N400 response is generated provides key insights into what it reflects, and this has important consequences for neural models of language comprehension.
Abstract: Measuring event-related potentials (ERPs) has been fundamental to our understanding of how language is encoded in the brain. One particular ERP response, the N400 response, has been especially influential as an index of lexical and semantic processing. However, there remains a lack of consensus on the interpretation of this component. Resolving this issue has important consequences for neural models of language comprehension. Here we show that evidence bearing on where the N400 response is generated provides key insights into what it reflects. A neuroanatomical model of semantic processing is used as a guide to interpret the pattern of activated regions in functional MRI, magnetoencephalography and intracranial recordings that are associated with contextual semantic manipulations that lead to N400 effects.

1,392 citations

Journal ArticleDOI
TL;DR: Anatomical and functional connectivity studies are now required to identify the processing pathways that integrate these areas to support language.
Abstract: In this review of 100 fMRI studies of speech comprehension and production, published in 2009, activation is reported for: prelexical speech perception in bilateral superior temporal gyri; meaningful speech in middle and inferior temporal cortex; semantic retrieval in the left angular gyrus and pars orbitalis; and sentence comprehension in bilateral superior temporal sulci. For incomprehensible sentences, activation increases in four inferior frontal regions, posterior planum temporale, and ventral supramarginal gyrus. These effects are associated with the use of prior knowledge of semantic associations, word sequences, and articulation that predict the content of the sentence. Speech production activates the same set of regions as speech comprehension but in addition, activation is reported for: word retrieval in left middle frontal cortex; articulatory planning in the left anterior insula; the initiation and execution of speech in left putamen, pre-SMA, SMA, and motor cortex; and for suppressing unintended responses in the anterior cingulate and bilateral head of caudate nuclei. Anatomical and functional connectivity studies are now required to identify the processing pathways that integrate these areas to support language.

1,204 citations