Open Access · Journal Article (DOI)

Learning During Processing: Word Learning Doesn't Wait for Word Recognition to Finish

TL;DR
In three experiments using word learning as a model domain, evidence is provided that learning reflects the ongoing dynamics of auditory and visual processing during a learning event and shows that learning can occur before stimulus recognition processes are complete.
About
This article was published in Cognitive Science on 2017-04-01 and is currently open access. It has received 16 citations to date. The article focuses on the topics: Associative learning and Sequence learning.


Citations
Journal Article (DOI)

Of mice and men: Speech sound acquisition as discriminative learning from prediction error, not just statistical tracking.

TL;DR: Together, these results show that existing knowledge of acoustic cues can block later learning of new cues, and that speech sound acquisition depends on the predictive structure of learning events, which may have considerable implications for the field of speech acquisition.
Journal Article (DOI)

A real-time mechanism underlying lexical deficits in developmental language disorder: Between-word inhibition.

TL;DR: This paper examined lexical inhibition, the ability of more active words to suppress competitors, in 79 adolescents with and without DLD, and found evidence of inhibition with greater interference for stimuli that briefly activated a competitor word.
Proceedings Article (DOI)

Speech segmentation with a neural encoder model of working memory

TL;DR: This work presents the first unsupervised LSTM speech segmenter as a cognitive model of the acquisition of words from unsegmented input, and it is the first fully unsupervised system able to segment both symbolic and acoustic representations of speech.

Developmental and Computational Neuroscience Approaches to Cognition: The Case of Generalization (Special issue: The emergence of higher cognitive functions and connectionist models)

TL;DR: The authors argue that people's successes and failures in generalization are well characterized by neural network models and that with sufficient experience and appropriate architectural properties, such models can develop abstract representations that support good generalization.
Journal Article (DOI)

Partial Knowledge in the Development of Number Word Understanding

TL;DR: Results support the idea of graded representations (Munakata, 2001) in number word development and suggest traditional approaches to coding the give-N task may not completely capture children's knowledge.
References
Journal Article (DOI)

Learning representations by back-propagating errors

TL;DR: Back-propagation repeatedly adjusts the weights of the connections in the network so as to minimize a measure of the difference between the actual output vector of the net and the desired output vector; as a result of the weight adjustments, internal hidden units come to represent important features of the task domain.
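The weight-adjustment loop described above can be sketched in a few lines of Python. This is a minimal illustration only: the 2-2-1 network shape, the OR task, the learning rate, and the squared-error measure are assumptions for the sketch, not details from the paper.

```python
import math
import random

random.seed(0)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Tiny 2-input -> 2-hidden -> 1-output network; each weight row
# includes a bias column (inputs are padded with a constant 1).
W1 = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(2)]  # input -> hidden
W2 = [random.uniform(-1, 1) for _ in range(3)]                      # hidden -> output
LR = 0.5

# Training pairs for logical OR (an illustrative task).
data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 1)]

def forward(x):
    xb = list(x) + [1.0]                       # add bias input
    h = [sigmoid(sum(w * v for w, v in zip(row, xb))) for row in W1]
    hb = h + [1.0]                             # add bias unit
    y = sigmoid(sum(w * v for w, v in zip(W2, hb)))
    return xb, hb, y

def total_error():
    # The "measure of the difference" minimized here: summed squared error.
    return sum((forward(x)[2] - t) ** 2 for x, t in data)

error_before = total_error()
for _ in range(2000):
    for x, t in data:
        xb, hb, y = forward(x)
        # Error signal at the output (squared-error gradient times sigmoid slope).
        dy = (y - t) * y * (1 - y)
        # Back-propagate the error to the hidden units through W2.
        dh = [dy * W2[j] * hb[j] * (1 - hb[j]) for j in range(2)]
        for j in range(3):
            W2[j] -= LR * dy * hb[j]
        for j in range(2):
            for i in range(3):
                W1[j][i] -= LR * dh[j] * xb[i]

error_after = total_error()
```

After training, `error_after` should be well below `error_before`, showing the repeated weight adjustments shrinking the output error.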

Linear Mixed-Effects Models using 'Eigen' and S4

TL;DR: The core computational algorithms are implemented using the Eigen C++ library for numerical linear algebra and the RcppEigen "glue" package.
Journal Article (DOI)

Mixed-effects modeling with crossed random effects for subjects and items

TL;DR: In this article, the authors provide an introduction to mixed-effects models for the analysis of repeated measurement data with subjects and items as crossed random effects, together with a worked example of how to use recent software for mixed-effects modeling.
Journal Article (DOI)

A Distributed, Developmental Model of Word Recognition and Naming.

TL;DR: A parallel distributed processing model of visual word recognition and pronunciation is described, which consists of sets of orthographic and phonological units and an interlevel of hidden units, and whose behavior early in the learning phase corresponds to that of children acquiring word recognition skills.