Topic

Attentional blink

About: Attentional blink is a research topic. Over the lifetime, 1346 publications have been published within this topic receiving 53064 citations. The topic is also known as: Attentional blinks.


Papers
Journal ArticleDOI
02 Feb 2011-PLOS ONE
TL;DR: It was shown that categorical similarity between working memory content and the target stimuli pertaining to the attentional task (both digits) increased attentional blink magnitude compared to a condition in which this similarity was absent (colors and digits).
Abstract: Three experiments were conducted to investigate the effects of working memory content on temporal attention in a rapid serial visual presentation attentional blink paradigm. It was shown that categorical similarity between working memory content and the target stimuli pertaining to the attentional task (both digits) increased attentional blink magnitude compared to a condition in which this similarity was absent (colors and digits, respectively). This effect was only observed when the items in working memory were not presented as conjunctions of the involved categories (i.e., colored digits). This suggested that storage and retrieval from working memory was at least preferentially conjunctive in this case. It was furthermore shown that the content of working memory enhanced the identification rate of the second target, by means of repetition priming, when inter-target lag was short and the attentional blink was in effect. The results are incompatible with theories of temporal attention that assume working memory has no causal role in the attentional blink and support theories that do.

5 citations
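The attentional blink magnitude described above is conventionally computed as the drop in T2|T1 report accuracy at short inter-target lags relative to long lags in an RSVP stream. The following is a minimal sketch of that computation; the lag choices and all accuracy numbers are hypothetical illustrations, not data from the paper.

```python
# Minimal sketch: attentional blink (AB) magnitude from hypothetical
# T2|T1 report accuracies in an RSVP paradigm. Numbers are illustrative.

def ab_magnitude(acc_by_lag, short_lags=(2, 3), long_lags=(7, 8)):
    """AB magnitude = mean T2|T1 accuracy at long lags minus short lags.

    acc_by_lag maps inter-target lag (in serial positions) to the
    proportion of trials on which T2 was reported given T1 was correct.
    """
    short = sum(acc_by_lag[lag] for lag in short_lags) / len(short_lags)
    long_ = sum(acc_by_lag[lag] for lag in long_lags) / len(long_lags)
    return long_ - short

# Hypothetical condition data: categorical similarity between working
# memory content and the targets deepens the blink (lower short-lag
# accuracy), in the direction the study reports.
similar = {2: 0.55, 3: 0.60, 7: 0.90, 8: 0.92}
dissimilar = {2: 0.70, 3: 0.75, 7: 0.90, 8: 0.92}

print(ab_magnitude(similar) > ab_magnitude(dissimilar))  # larger AB when categories match
```

A larger value indicates a deeper blink; comparing the two conditions mirrors the study's contrast between matched (digits/digits) and unmatched (colors/digits) working memory content.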

Journal ArticleDOI
TL;DR: This study reports significant differences in the processing speed of divided attention and in error rates on the AB test between congenitally hearing-impaired students and those with normal hearing, with the hearing-impaired group showing slower processing speeds but making fewer errors on the AB test.
Abstract: Background: Enhanced visual attention is one of the best-documented effects of auditory deprivation. In parallel, however, the congenitally deaf have been shown to exhibit deficits in temporal processing. Aim: In this study, we aimed to characterize parameters of visual attention. Materials and Methods: We measured the processing speed of divided attention (using the symbol digit modality test [SDMT]) and central attention with an attentional blink (AB) task, using a commercially available app, the BrainBaseline App. We tested these parameters of visual attention in congenitally hearing-impaired students and in students with normal hearing. Results: The measure of visual attention (error scores) on the SDMT did not differ significantly between the congenitally hearing-impaired students and those with normal hearing. However, we report significant differences in the processing speed of divided attention and in error rates on the AB test between the two groups, with the hearing-impaired group showing slower processing speeds but making fewer errors on the AB test. Conclusion: This finding likely reflects a redistribution or reallocation of available brain resources as a result of sensory deprivation.

5 citations

Journal ArticleDOI
TL;DR: In this paper, the authors extended current knowledge by evaluating four visual attention paradigms used in this research (visual attention span, attentional blink, visual search, and visuospatial attention) in a single study.

5 citations

Journal ArticleDOI
TL;DR: In this article, Xu et al. investigated whether the cross-modal benefit originates from audiovisual interactions or sound-induced alertness, and whether the semantic congruency effect is contingent on audiovisual temporal synchrony.
Abstract: The visual attentional blink can be substantially reduced by delivering a task‐irrelevant sound synchronously with the second visual target (T2), and this effect is further modulated by the semantic congruency between the sound and T2. However, whether the cross‐modal benefit originates from audiovisual interactions or sound‐induced alertness remains controversial, and whether the semantic congruency effect is contingent on audiovisual temporal synchrony needs further investigation. The current study investigated these questions by recording event‐related potentials (ERPs) in a visual attentional blink task wherein a sound could either synchronize with T2, precede T2 by 200 ms, be delayed by 100 ms, or be absent, and could be either semantically congruent or incongruent with T2 when delivered. The behavioral data showed that both the cross‐modal boost of T2 discrimination and the further semantic modulation were the largest when the sound synchronized with T2. In parallel, the ERP data yielded that both the early occipital cross‐modal P195 component (192–228 ms after T2 onset) and late parietal cross‐modal N440 component (424–448 ms) were prominent only when the sound synchronized with T2, with the former being elicited solely when the sound was further semantically congruent whereas the latter occurring only when that sound was incongruent. These findings demonstrate not only that the cross‐modal boost of T2 discrimination during the attentional blink stems from early audiovisual interactions and the semantic congruency effect depends on audiovisual temporal synchrony, but also that the semantic modulation can unfold at the early stage of visual discrimination processing.

5 citations

Journal ArticleDOI
TL;DR: It is hypothesized that the dependence of involuntary attention on top-down attention interacts with the presence or absence of target-location uncertainty and distractor interference; the authors found that when attentional resources were depleted, involuntary attention did not affect the perception of a single target stimulus.

5 citations


Network Information
Related Topics (5)
Visual perception: 20.8K papers, 997.2K citations, 89% related
Working memory: 26.5K papers, 1.6M citations, 87% related
Visual cortex: 18.8K papers, 1.2M citations, 83% related
Functional magnetic resonance imaging: 15.4K papers, 1.1M citations, 81% related
Prefrontal cortex: 24K papers, 1.9M citations, 80% related
Performance Metrics
No. of papers in the topic in previous years
Year  Papers
2023  12
2022  66
2021  48
2020  43
2019  45
2018  40