scispace - formally typeset

Attentional blink

About: Attentional blink is a research topic. Over its lifetime, 1,346 publications on this topic have been published, receiving 53,064 citations. The topic is also known as: attentional blinks.


Papers
Posted Content · DOI
16 Dec 2022
TL;DR: The eye-blink patterns of 13 of 35 participants were coupled with the speech pauses in the attended speech stream, and, contrary to prediction, participants did not inhibit their blinking preceding a pause in the attended speech stream.
Abstract: Eye blinks not only serve to maintain the tear film of the eye but also seem to have a functional role in information processing. People tend to inhibit an eye blink when they expect relevant information to occur in their visual environment and blink more often once the information has been processed. Recent studies have shown that this relation also holds for auditory information processing. Yet so far, only artificial auditory stimuli such as tones or controlled lists of words have been used. In the current study, we tested whether there is a temporal association between the pauses in a continuous speech stream and the listener’s eye blinks. To this end, we analyzed the eye blinks of 35 participants who were instructed to attend to one of two simultaneously presented audio books. We found that the blink patterns of 13 participants were coupled with the speech pauses in the attended speech stream. These participants blinked more often during the pauses in the attended speech stream. Contrary to our prediction, participants did not inhibit their blinking preceding a pause in the attended speech stream. As expected, there was no evidence that the listeners’ blink patterns were coupled to the pauses in the ignored speech stream. Thus, we conclude that listeners’ blink patterns can reflect attention to continuous speech.
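A temporal association of this kind is commonly tested with a circular-shift permutation test: compare the fraction of blinks falling inside pause windows against surrogates in which every blink time is shifted by the same random offset. The sketch below is a hypothetical illustration of that general idea, not the authors’ actual analysis; the function names and toy data are assumptions.

```python
import random

def coupling_score(blink_times, pauses):
    # Fraction of blinks that fall inside any pause window
    return sum(any(s <= b <= e for s, e in pauses) for b in blink_times) / len(blink_times)

def permutation_test(blink_times, pauses, duration, n_perm=500, seed=0):
    # Circular-shift surrogates: shifting every blink by the same random offset
    # preserves the inter-blink structure but breaks alignment with the pauses.
    rng = random.Random(seed)
    observed = coupling_score(blink_times, pauses)
    exceed = 0
    for _ in range(n_perm):
        shift = rng.uniform(0, duration)
        shifted = [(b + shift) % duration for b in blink_times]
        if coupling_score(shifted, pauses) >= observed:
            exceed += 1
    return observed, (exceed + 1) / (n_perm + 1)  # permutation p-value

# Toy data (hypothetical): 10 blinks over 60 s, mostly inside three pauses
pauses = [(5, 7), (23, 25), (48, 50)]
blinks = [5.5, 6.0, 6.5, 15.0, 23.5, 24.0, 24.5, 40.0, 48.5, 49.0]
observed, p_val = permutation_test(blinks, pauses, duration=60)
print(observed, p_val)
```

Because the shift is circular over the recording, the surrogate blink trains keep their internal timing, which makes the resulting null distribution a fair baseline for "blinks cluster in pauses."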

1 citation

Journal Article · DOI
TL;DR: The results suggest that the quality of T2 representations gradually becomes more precise as attentional resources are made available over time, and that discrete or continuous processing depends on the level in which attentional capacity is taxed.
Abstract: A debate regarding the discrete or continuous nature of visual awareness has revolved around the attentional blink (AB). The AB is a transient limitation in the ability to perceive the second (T2) of two masked targets 200–500 ms after the attentional processing of the first target (T1). The AB is considered to represent both a central limitation of attention and limitations within earlier visual stages of information processing (Dux & Marois, 2009). Central to this debate is whether the failure to report T2 is due to a discrete failure of information reaching post-perceptual stages of processing, or whether the graded quality of information produces a weak conscious representation. Asplund et al. (2014) applied a mixture-modeling analysis (Zhang & Luck, 2008) to errors in T2 responses to estimate (1) the width of the error distribution, measuring the precision of the T2 percept, and (2) the probability of successful T2 encoding, assuming an all-or-none representation of T2. They found evidence in favor of discrete failures within central stages of information processing. Here, we tested whether the AB necessarily leads to discrete failures of perception, or whether weak representations can occur once attention is taxed within the same early visual processing channel (Awh et al., 2004), or feature dimension of orientation. Observers were presented with two oriented gratings, which appeared in a rapid serial visual presentation of random noise. Participants performed two task types: report the orientation of both targets, or of T2 alone. Mixture-modeling analysis revealed an AB, defined by the interaction between task and time, in precision but not in the probability of successful encoding. These results suggest that the quality of T2 representations gradually becomes more precise as attentional resources are made available over time, and that discrete or continuous processing depends on the level at which attentional capacity is taxed.
Meeting abstract presented at VSS 2015.
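In a Zhang & Luck (2008)-style mixture model, report errors on encoded trials follow a von Mises distribution (whose concentration κ indexes precision) mixed with a uniform guessing distribution (whose weight indexes encoding probability). The sketch below recovers both parameters by maximum likelihood over a coarse grid on simulated data; it is a minimal illustration of the technique, not the analysis code behind this abstract.

```python
import math
import random

def bessel_i0(x):
    # Modified Bessel function of the first kind, order 0 (power-series expansion)
    total, term, k = 1.0, 1.0, 0
    while term > 1e-12 * total:
        k += 1
        term *= (x / (2 * k)) ** 2
        total += term
    return total

def mixture_loglik(errors, p, kappa):
    # Log-likelihood: with prob. p the error is von Mises(0, kappa) (encoded),
    # otherwise uniform on [-pi, pi) (random guess)
    norm = 2 * math.pi * bessel_i0(kappa)
    ll = 0.0
    for e in errors:
        vm = math.exp(kappa * math.cos(e)) / norm
        ll += math.log(p * vm + (1 - p) / (2 * math.pi))
    return ll

def fit_mixture(errors):
    # Coarse grid search for the maximum-likelihood (p, kappa)
    best = (None, None, -float("inf"))
    for p in [i / 20 for i in range(21)]:
        for kappa in [0.5, 1, 2, 4, 6, 8, 12, 16]:
            ll = mixture_loglik(errors, p, kappa)
            if ll > best[2]:
                best = (p, kappa, ll)
    return best[0], best[1]

# Simulate T2 report errors: 70% encoded trials (precise), 30% pure guesses
random.seed(1)
errors = []
for _ in range(2000):
    if random.random() < 0.7:
        e = random.vonmisesvariate(0.0, 8.0)              # encoded: error near 0
        errors.append((e + math.pi) % (2 * math.pi) - math.pi)  # wrap to [-pi, pi)
    else:
        errors.append(random.uniform(-math.pi, math.pi))  # guess: uniform

p_hat, kappa_hat = fit_mixture(errors)
print(p_hat, kappa_hat)
```

On this logic, a "discrete" AB shows up as a drop in the fitted encoding probability with a stable κ, whereas a "graded" AB shows up as a drop in κ (wider error distribution) with a stable encoding probability — the contrast at issue in the abstract.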

1 citation

01 Apr 2010
TL;DR: The model integrates a winner-take-all neural network with Bundesen’s (1990) well-established mathematical theory of visual attention, while also offering a neural interpretation of how objects are consolidated in VSTM.
Abstract: In this paper a neural network model of visual short-term memory (VSTM) is presented. The model aims at integrating a winner-take-all type of neural network (Usher & Cohen, 1999) with Bundesen’s (1990) well-established mathematical theory of visual attention. We evaluate the model’s ability to fit experimental data from a classical whole- and partial-report study. Previous statistical models have successfully assessed the spatial distribution of visual attention; our neural network meets this standard and at the same time offers a neural interpretation of how objects are consolidated in VSTM. We hope that in the future, the model will be developed to fit temporally dependent phenomena like the attentional blink effect, lag-1 sparing, and attentional dwell time.
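The core winner-take-all mechanism referenced here can be sketched as a handful of units that each receive input and suppress one another through shared inhibition until one dominates — roughly in the spirit of Usher & Cohen-style competing accumulators, though this toy (all parameter values assumed) is not the paper’s actual model:

```python
import random

def wta_race(inputs, steps=200, leak=0.1, inhib=0.2, noise=0.05, seed=0):
    # Each unit integrates its input, leaks, and receives inhibition proportional
    # to the summed activity of the other units. Because inhibition outweighs
    # leak, competition is unstable: the strongest unit grows while the rest
    # are driven to zero (activations are rectified at 0) — winner-take-all.
    rng = random.Random(seed)
    x = [0.0] * len(inputs)
    for _ in range(steps):
        total = sum(x)
        for i, inp in enumerate(inputs):
            dx = inp - leak * x[i] - inhib * (total - x[i]) + rng.gauss(0, noise)
            x[i] = max(0.0, x[i] + 0.1 * dx)  # Euler step, dt = 0.1, rectified
    return x.index(max(x)), x

# Three items compete for consolidation in VSTM; the strongest input wins
winner, final = wta_race([1.0, 0.6, 0.4])
print(winner)
```

Under these dynamics the losing units are silenced rather than merely attenuated, which is what makes a network of this type a candidate neural account of all-or-none VSTM consolidation.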

1 citation


Network Information
Related Topics (5)
Visual perception
20.8K papers, 997.2K citations
89% related
Working memory
26.5K papers, 1.6M citations
87% related
Visual cortex
18.8K papers, 1.2M citations
83% related
Functional magnetic resonance imaging
15.4K papers, 1.1M citations
81% related
Prefrontal cortex
24K papers, 1.9M citations
80% related
Performance Metrics
No. of papers in the topic in previous years
Year  Papers
2023  12
2022  66
2021  48
2020  43
2019  45
2018  40