scispace - formally typeset
Author

Orsolya Beatrix Kolozsvari

Other affiliations: Hungarian Academy of Sciences
Bio: Orsolya Beatrix Kolozsvari is an academic researcher from the University of Jyväskylä. The author has contributed to research in topics: Temporal cortex & Speech perception. The author has an h-index of 5 and has co-authored 9 publications receiving 82 citations. Previous affiliations of Orsolya Beatrix Kolozsvari include the Hungarian Academy of Sciences.

Papers
Journal ArticleDOI
TL;DR: This study evaluated how various combinations of viewpoints and reference frames affect subjects' performance when they navigated in a bounded virtual environment without landmarks and provided evidence that there are inherent associations between visual perspectives and cognitive reference frames.
Abstract: Spatial navigation in the mammalian brain relies on a cognitive map of the environment. Such cognitive maps enable us, for example, to take the optimal route from a given location to a known target. The formation of these maps is naturally influenced by our perception of the environment, meaning it is dependent on factors such as our viewpoint and choice of reference frame. Yet, it is unknown how these factors influence the construction of cognitive maps. Here, we evaluated how various combinations of viewpoints and reference frames affect subjects' performance when they navigated in a bounded virtual environment without landmarks. We measured both their path length and time efficiency and found that (1) the ground perspective was associated with an egocentric frame of reference, (2) the aerial perspective was associated with an allocentric frame of reference, (3) there was no appreciable performance difference between first- and third-person egocentric viewing positions, and (4) while none of these effects depended on gender, males tended to perform better overall. Our study provides evidence that there are inherent associations between visual perspectives and cognitive reference frames. This result has implications for the mechanisms of path integration in the human brain and may also inspire the design of virtual reality applications. Lastly, we demonstrated the effective use of a tablet PC and spatial navigation tasks for studying spatial and cognitive aspects of human memory.
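The path-length measure described above can be summarized as an efficiency ratio: the straight-line distance to the target divided by the distance actually walked. The sketch below is our own illustration of such a metric (function names and the exact ratio definition are assumptions, not the paper's analysis code):

```python
import math

def path_length(points):
    """Total length of a path given as a list of (x, y) waypoints."""
    return sum(math.dist(a, b) for a, b in zip(points, points[1:]))

def path_efficiency(points, start, target):
    """Ratio of the straight-line (optimal) distance to the walked
    path length; 1.0 means a perfectly direct route."""
    walked = path_length(points)
    return math.dist(start, target) / walked if walked else 0.0

# A subject taking a detour from (0, 0) to (4, 0):
route = [(0, 0), (2, 2), (4, 0)]
print(path_efficiency(route, (0, 0), (4, 0)))  # ≈ 0.707
```

Time efficiency could be defined analogously, as optimal travel time over measured travel time.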

27 citations

Journal ArticleDOI
TL;DR: The results suggest that stress as a phonological feature is processed based on long-term representations, and listeners show a specific sensitivity to segmental and suprasegmental cues signaling the prosodic boundaries of words.

22 citations

Journal ArticleDOI
TL;DR: This study found dynamic changes in brain responses related to multisensory processing as grapheme-phoneme associations were learned, with changes observed in the responses to the novel letters during the learning process.

18 citations

Journal ArticleDOI
TL;DR: The current MEG study indicated that learning of logographic languages has a large impact on the audiovisual integration of written characters with some distinct features compared to previous results on alphabetic languages.
Abstract: Learning to associate written letters/characters with speech sounds is crucial for reading acquisition. Most previous studies have focused on audiovisual integration in alphabetic languages. Less is known about logographic languages such as Chinese characters, which map onto mostly syllable-based morphemes in the spoken language. Here we investigated how long-term exposure to native language affects the underlying neural mechanisms of audiovisual integration in a logographic language using magnetoencephalography (MEG). MEG sensor and source data from 12 adult native Chinese speakers and a control group of 13 adult Finnish speakers were analyzed for audiovisual suppression (bimodal responses vs. sum of unimodal responses) and congruency (bimodal incongruent responses vs. bimodal congruent responses) effects. The suppressive integration effect was found in the left angular and supramarginal gyri (205-365 ms), left inferior frontal and left temporal cortices (575-800 ms) in the Chinese group. The Finnish group showed a distinct suppression effect only in the right parietal and occipital cortices at a relatively early time window (285-460 ms). The congruency effect was only observed in the Chinese group in left inferior frontal and superior temporal cortex in a late time window (about 500-800 ms) probably related to modulatory feedback from multi-sensory regions and semantic processing. The audiovisual integration in a logographic language showed a clear resemblance to that in alphabetic languages in the left superior temporal cortex, but with activation specific to the logographic stimuli observed in the left inferior frontal cortex. The current MEG study indicated that learning of logographic languages has a large impact on the audiovisual integration of written characters with some distinct features compared to previous results on alphabetic languages.
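The suppression contrast used above (bimodal responses vs. the sum of unimodal responses) can be sketched on synthetic waveforms. Everything below is illustrative; the study's actual pipeline operated on MEG sensor and source estimates, and the variable names and response shapes are our own assumptions:

```python
import numpy as np

# Hypothetical evoked responses in arbitrary units, sampled at 1 kHz
# over 800 ms. These Gaussian bumps stand in for source waveforms.
t = np.arange(0, 0.8, 0.001)
auditory = np.exp(-((t - 0.10) / 0.05) ** 2)     # unimodal A response
visual = np.exp(-((t - 0.15) / 0.05) ** 2)       # unimodal V response
audiovisual = 0.7 * (auditory + visual)          # bimodal AV response

# Suppressive integration: AV response smaller than A + V
# within some time window (e.g. 205-365 ms, as in the Chinese group).
suppression = (auditory + visual) - audiovisual
window = (t >= 0.205) & (t <= 0.365)
print(suppression[window].mean() > 0)            # positive mean -> suppression
```

The congruency effect would be computed the same way, but contrasting bimodal incongruent against bimodal congruent responses.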

18 citations

Journal ArticleDOI
TL;DR: This article measured brain responses to Finnish letters and speech sounds from 29 typically developing Finnish children in a child-friendly audiovisual integration experiment using magnetoencephalography and found that the auditory late response around 400 ms showed the largest association with phonological processing and rapid automatized naming abilities.
Abstract: Letter-speech sound (LSS) integration is crucial for the initial stages of reading acquisition. However, the relationship between the cortical organization supporting LSS integration, including unimodal and multimodal processes, and reading skills in early readers remains unclear. In the present study, we measured brain responses to Finnish letters and speech sounds from 29 typically developing Finnish children in a child-friendly audiovisual integration experiment using magnetoencephalography. Brain source activations in response to auditory, visual and audiovisual stimuli, as well as the audiovisual integration response, were correlated with reading skills and cognitive skills predictive of reading development after controlling for the effect of age. Regression analysis showed that, of the brain measures, the auditory late response around 400 ms showed the largest association with phonological processing and rapid automatized naming abilities. In addition, the audiovisual integration effect was most pronounced in the left and right temporoparietal regions, and activity in several of these temporoparietal regions correlated with reading and writing skills. Our findings indicate the important role of temporoparietal regions in the early phase of learning to read and their unique contribution to reading skills.

14 citations


Cited by
01 Jan 2016
An Introduction to the Event-Related Potential Technique

2,445 citations

Book ChapterDOI
01 Jan 1980
TL;DR: A computer program is a series of coded instructions that the computer obeys, representing a method of processing data; programs are written in a programming language and translated into the electronic pulses needed to make the computer work.
Abstract: A computer program is a series of coded instructions for the computer to obey and represents a method of processing data. Programs can't be written in English. They must first be written using a special language called a programming language. A PROGRAMMING LANGUAGE (e.g. BASIC, PASCAL, and C++) consists of a set of codes and rules which can be used to construct commands for the computer. These commands are read and translated into the electronic pulses needed to make the computer work. Programs are written by programmers. A computer language is a set of instructions used for writing computer programs. There are THREE (3) levels of languages:
1. MACHINE LANGUAGE – this was the first language available for programming. It varies from one computer to another, but the basic principles are the same. MACHINE LANGUAGE PROGRAMS are written using a series of 0's and 1's, i.e. using a BINARY SYSTEM. All programs written today must be translated into machine language before they can be executed (used) by the computer. EXAMPLE: 110110001
2. ASSEMBLY LANGUAGE / LOW LEVEL LANGUAGE – these were developed to replace the 0's and 1's of machine language with symbols that are easier to understand and remember. As with machine language, assembly language varies from one make of computer to another, so that a program written in one assembly language will not run on another make of computer. EXAMPLE: LDA 300 ADD 400 STA 500
3. HIGH LEVEL LANGUAGE – these differ from low level languages in that they require less coding detail and make programs easier to write. High level languages are designed for the solution of problems in one or more areas of application and are commonly described as application-oriented or problem-oriented languages. High level languages are not machine dependent. Programs written in a high level language must be translated to a form which can be accepted by that computer, i.e.
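The assembly snippet above (LDA 300 / ADD 400 / STA 500: load the value at address 300, add the value at address 400, store the result at address 500) can be made concrete with a toy interpreter. This sketch is our own illustration of the accumulator model the chapter describes, not code from the chapter:

```python
# Toy memory and accumulator; addresses and values are hypothetical.
memory = {300: 2, 400: 3, 500: 0}
accumulator = 0

program = [("LDA", 300), ("ADD", 400), ("STA", 500)]

for op, addr in program:
    if op == "LDA":        # load accumulator from memory
        accumulator = memory[addr]
    elif op == "ADD":      # add a memory value to the accumulator
        accumulator += memory[addr]
    elif op == "STA":      # store the accumulator back to memory
        memory[addr] = accumulator

print(memory[500])  # 5
```

The high-level-language equivalent is a single statement, `memory[500] = memory[300] + memory[400]`, which illustrates why high-level languages require less coding detail.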

489 citations

Journal ArticleDOI
TL;DR: In 2012, I left radiology residency to pursue internal medicine training, and as a medical student, I was drawn to radiology’s diagnostic challenges and its attention to detail, but it wasn’t long after beginning radiology training that I realized I missed treating patients.
Abstract: In 2012, I left radiology residency to pursue internal medicine training. As a medical student, I had been drawn to radiology’s diagnostic challenges and its attention to detail. But it wasn’t long after beginning radiology training that I realized I missed treating patients. When I broke news of my decision to leave, my colleagues were skeptical: after more than a year away from clinical medicine, was I sure I wanted to do this? Nevertheless, my first morning back as a senior medical resident went well. I had spent the night before excitedly reading about the patients I’d be caring for. Morning rounds felt at once novel and suddenly familiar—the easy cadence of conversation as our team walked down the hospital corridor; the expectant look on patients’ faces as we entered their rooms. Around early evening, after my attending (and, it seemed, everyone else in the hospital) had left for the day, doubt crept in. Less than 3 weeks prior, I had been comfortably reading X-rays and CTs. Now I was on call and supervising two newly minted interns. My pager beeped interminably. Sure, I could interpret a stat portable chest radiograph…but a 12-lead EKG in a patient with atypical chest pain? Or an arterial blood gas?
Before, from the relative isolation of a leather office chair and a darkened room, I had a faint sense of what it meant to be a clinician. This—being a medical resident on call—was something else entirely. I thought of the final lines of Sinclair Lewis’ Arrowsmith, which so ably captured that feeling that all physicians, facing a seemingly insurmountable body of knowledge, must know well: I feel as if I were really beginning to work now. In that moment, however, Lewis’ words took on an ominous tone. Mr H was my first admission that night. He had come to the emergency room because over the preceding few months, he had noticed a steady decline in his exercise tolerance. He had always been active (he was a roofer by trade), but now, walking more than a block required him to pause and hunch over, hands on his knees, to catch his breath. Mr H hadn’t seen a doctor since he was in the military. Hadn’t had to, he said. “I’ve always been healthy.” The ability to discern “sick” from “not sick,” I had heard so many times, is among the more valuable things that one learns during internship and residency. In the case of Mr H, however, he didn’t look sick—not in the way that I had grown accustomed to seeing: on a stretcher, being wheeled into the scanner for a trauma survey. Instead, he sat upright and proudly recounted how he had never smoked a day in his life, and how last year he and his wife had joined a gym. But Mr H’s placid exterior belied ominous findings—namely, his oxygen saturation, which was 93% at rest and dropped to below 80% when I walked with him up and down the corridor. Breathing heavily, he steadied himself against the wall about ten feet from his room. He looked scared. If there had been any doubt about whether Mr H was sick, it had since evaporated. I pored over the chest X-ray he had received in the ED. Was that a nodule? Were those reticular markings at the bases?
I agonized over what to do next. The list of possible diagnostic tests seemed endless. I finally settled on a CT, convincing myself that if Mr H had something catastrophic in his chest, a CT would surely pick it up. By the time Mr H received his scan, it was well past midnight. I scrolled through the images, afraid of what the next slice would reveal. My mind raced through the possibilities. Sarcoid. Interstitial lung disease. Lymphoma. Finally, after what seemed like an eternity staring at the screen (and after calling the nighthawk radiologist for reassurance), I sat back in my chair and rubbed my eyes. Nothing. His scan was clean. It was late. Rounds were in a few hours. I walked back into Mr H’s room to give him the results of his CT. “Well, Doc?” he asked with a wan smile. He hadn’t slept. I froze, struggling to find a way to tell him that about a thousand dollars and 20 millisieverts of radiation later, I was no closer to figuring out what was wrong. I mumbled something about waiting on more results and left his room. His gaze trailed after me. In the hall, I steadied myself against the same wall that Mr H had leaned on hours earlier. The next few days would reveal that he suffered from pulmonary arterial hypertension. He received an echocardiogram and underwent right heart catheterization. His disease was pretty advanced. I kept telling myself that his was an illness that no conventional radiologic study could reliably detect, and that another resident would have done the same thing overnight. But truthfully, during those next few days, I was in a fog. While the pulmonologists exchanged terms like “endothelin” and “nitric oxide,” I perseverated on the events of the night I met Mr H, replaying them in a continuous loop: my less-than-thorough social history (had I asked about pet birds?), my (no doubt incomplete) physical exam. I had re-entered medical residency with rusty clinical skills, to be sure; but I thought this deficit might be counterbalanced by my ability to read images. Yet, had my reliance on this ability caused me to overlook Mr H’s diagnosis?
When I was a radiology resident, my attendings would sometimes tell me that becoming adept at interpreting images required learning how to read all over again. For instance, when looking at a chest X-ray, forget what you learned in medical school—i.e., “A is for airway, B is for bones”—instead, look at the study as a whole. The scan comprises a map, or better yet, words on a page; it’s up to you to glean their meaning. Now, as I approach the end of internal medicine residency, I know that the same can be said of patients, their stories, their faces. Looking back, I think I know why: to acquire that sine qua non characteristic that the best medicine doctors possess—the ability to elicit a patient’s history and arrive at a diagnosis—one has to learn to read patients with the same level of nuance. The etiology of patients’ illnesses can be found in poring through their images, but it’s also buried in their histories and etched in their expressions. Now and then I’ll hear a bit about how Mr H is doing. He wears continuous oxygen. The word “transplant” is frequently mentioned. Hearing these updates always grounds me, reminds me how far I’ve come since that night—and also how far there is to go. It’s all part of learning to read.

137 citations

18 Jun 2006
TL;DR: The present event-related fMRI study was designed to address two questions that could not be directly addressed in the previous studies, due to their passive nature and blocked design: whether the enhancement/suppression of auditory cortex responses reflects true multisensory integration effects or can be explained by different attention levels during congruent/incongruent blocks.
Abstract: In alphabetic scripts, letters and speech sounds are the basic elements of correspondence between spoken and written language. In two previous fMRI studies using blocked stimulus presentation and passive perception, we found a cross-modal modulation of the response to speech sounds in the auditory association cortex by letters, expressed as response enhancement by congruent letters and suppression by incongruent letters. Interestingly, temporal proximity was critical for this congruency effect to occur. In the present study, we used fMRI to investigate the effect of stimulus presentation mode (blocked vs. event-related) and task instruction (passive perception vs. active matching) on the neural integration of letters and speech sounds. The principal findings are 1) a replication of the previous results on passive integration using event-related fMRI, and 2) the absence of the effects of congruency and temporal proximity in the auditory association cortex during active matching. Finding 1 shows the suitability of event-related fMRI for studying letter-sound integration. Finding 2 indicates that the task demands overruled the automatic multisensory responses to letters and speech sounds, most likely because the task changed the behavioral relevance of the stimuli.

107 citations