scispace - formally typeset

Showing papers on "Encoding (memory) published in 2012"


Journal ArticleDOI
TL;DR: A new computational model for the complex-span task, the most popular task for studying working memory, is introduced; it accounts for benchmark findings in four areas, including effects of processing pace, processing difficulty, and number of processing steps.
Abstract: This article introduces a new computational model for the complex-span task, the most popular task for studying working memory. SOB-CS is a two-layer neural network that associates distributed item representations with distributed, overlapping position markers. Memory capacity limits are explained by interference from a superposition of associations. Concurrent processing interferes with memory through involuntary encoding of distractors. Free time in-between distractors is used to remove irrelevant representations, thereby reducing interference. The model accounts for benchmark findings in four areas: (1) effects of processing pace, processing difficulty, and number of processing steps; (2) effects of serial position and error patterns; (3) effects of different kinds of item-distractor similarity; and (4) correlations between span tasks. The model makes several new predictions in these areas, which were confirmed experimentally.
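The core encoding mechanism described here, superimposing associations between distributed item vectors and overlapping position markers in one weight matrix, can be sketched in a few lines. Everything below (dimensionalities, Gaussian vectors, the degree of marker overlap) is an illustrative assumption rather than the published SOB-CS parameterization; the point is only that superposition produces interference at retrieval.

```python
import numpy as np

rng = np.random.default_rng(0)
n_feat, n_list = 64, 6   # feature dimensionality and list length (assumptions)

# Distributed item vectors and overlapping position markers: each marker
# shares structure with its successor, so neighboring positions overlap.
items = rng.standard_normal((n_list, n_feat))
base  = rng.standard_normal((n_list, n_feat))
positions = np.array([base[i] + 0.5 * base[(i + 1) % n_list]
                      for i in range(n_list)])

# Encoding: superimpose all item-position outer products in one matrix.
W = sum(np.outer(items[i], positions[i]) for i in range(n_list))

# Retrieval: cue with position 0. Because all associations share one
# matrix, the result is the target blended with its neighbors -- the
# interference that limits capacity in superposition models.
retrieved = W @ positions[0]
sims = items @ retrieved / (np.linalg.norm(items, axis=1)
                            * np.linalg.norm(retrieved))
print(sims.round(2))   # item 0 should dominate; overlapping neighbors intrude
```

The same machinery makes the model's distractor story concrete: an involuntarily encoded distractor is just one more outer product added to `W`, and "removing" it during free time amounts to subtracting that association back out.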

282 citations



Journal ArticleDOI
TL;DR: Results provide the first direct evidence for inhibition of competing memories during episodic memory retrieval and suggest that competitive retrieval is governed by inhibitory mechanisms similar to those employed in selective attention.
Abstract: Selective retrieval of a specific target memory often leads to the forgetting of related but irrelevant memories. Current cognitive theory states that such retrieval-induced forgetting arises due to inhibition of competing memory traces. To date, however, direct neural evidence for this claim has not been forthcoming. Studies on selective attention suggest that cortical inhibition is mediated by increased brain oscillatory activity in the alpha/beta frequency band. The present study, testing 18 human subjects, investigated whether these mechanisms can be generalized to selective memory retrieval in which competing memories interfere with the retrieval of a target memory. Our experiment was designed so that each cue used to search memory was associated with a target memory and a competitor memory stored in separate brain hemispheres. Retrieval-induced forgetting was observed in a condition in which the competitor memory interfered with target retrieval. Increased oscillatory alpha/beta power was observed over the hemisphere housing the sensory representation of the competitor memory trace and predicted the amount of retrieval-induced forgetting in the subsequent memory test. These results provide the first direct evidence for inhibition of competing memories during episodic memory retrieval and suggest that competitive retrieval is governed by inhibitory mechanisms similar to those employed in selective attention.

162 citations


Journal ArticleDOI
TL;DR: The findings of eight fMRI studies from the laboratory support the proposal that hippocampal activity co-varies with the amount of contextual information about a study episode that is encoded or retrieved, and not with the strength of an undifferentiated memory signal.

132 citations


Journal ArticleDOI
27 Jul 2012 - Science
TL;DR: How people's memory formation and decisions are influenced by their recent engagement in episodic encoding and retrieval is measured, finding that the recent encoding of novel objects improved subsequent identification of subtle changes, a task thought to rely on pattern separation.
Abstract: How do we decide if the people we meet and the things we see are familiar or new? If something is new, we need to encode it as a memory distinct from already stored episodes, using a process known as pattern separation. If familiar, it can be used to reactivate a previously stored memory, by a process known as pattern completion. To orchestrate these conflicting processes, current models propose that the episodic memory system uses environmental cues to establish processing biases that favor either pattern separation during encoding or pattern completion during retrieval. To assess this theory, we measured how people's memory formation and decisions are influenced by their recent engagement in episodic encoding and retrieval. We found that the recent encoding of novel objects improved subsequent identification of subtle changes, a task thought to rely on pattern separation. Conversely, recent retrieval of old objects increased the subsequent integration of stored information into new memories, a process thought to rely on pattern completion. These experiments provide behavioral evidence that episodic encoding and retrieval evoke lingering biases that influence subsequent mnemonic processing.

130 citations


Journal ArticleDOI
TL;DR: Results of two experiments confirmed previous evidence that an irrelevant attentional load interferes equally with memory for features and memory for feature bindings and suggested that different measures of recognition memory performance give a converging picture of main effects, but are less consistent in detecting interactions.
Abstract: We aimed to resolve an apparent contradiction between previous experiments from different laboratories, using dual-task methodology to compare effects of a concurrent executive load on immediate recognition memory for colours or shapes of items or their colour–shape combinations. Results of two experiments confirmed previous evidence that an irrelevant attentional load interferes equally with memory for features and memory for feature bindings. Detailed analyses suggested that previous contradictory evidence arose from limitations in the way recognition memory was measured. The present findings are inconsistent with an earlier suggestion that feature binding takes place within a multimodal episodic buffer (Baddeley, 2000) and support a subsequent account in which binding takes place automatically prior to information entering the episodic buffer (Baddeley, Allen, & Hitch, 2011). Methodologically, the results suggest that different measures of recognition memory performance (A′, d′, corrected recognition) give a converging picture of main effects but are less consistent in detecting interactions.

112 citations


Patent
17 Dec 2012
TL;DR: Phase change memory cells including a phase change medium can be encoded using a source of energy that is not integral with the memory cell, such as thermal heads (as used in thermal transfer printing) or sources of electromagnetic radiation such as lasers.
Abstract: Phase change memory cells including a phase change medium can be encoded using a source of energy that is not integral with the memory cell. External sources of energy include thermal heads, such as those used in direct thermal printing or thermal transfer printing, and sources of electromagnetic radiation, such as lasers. Such types of phase change memory devices can be associated with substrates that include thermochromic materials or are suitable for thermal transfer printing, so that the memory cells can be encoded and print media applied to the substrate using the same source of thermal energy.

100 citations


Journal ArticleDOI
TL;DR: A neural model that learns episodic traces from a continuous stream of sensory input and environmental feedback is presented; it achieves a high level of memory performance and robustness while controlling memory consumption over time.
Abstract: This paper presents a neural model that learns episodic traces in response to a continuous stream of sensory input and feedback received from the environment. The proposed model, based on fusion adaptive resonance theory (ART) network, extracts key events and encodes spatio-temporal relations between events by creating cognitive nodes dynamically. The model further incorporates a novel memory search procedure, which performs a continuous parallel search of stored episodic traces. Combined with a mechanism of gradual forgetting, the model is able to achieve a high level of memory performance and robustness, while controlling memory consumption over time. We present experimental studies, where the proposed episodic memory model is evaluated based on the memory consumption for encoding events and episodes as well as recall accuracy using partial and erroneous cues. Our experimental results show that: 1) the model produces highly robust performance in encoding and recalling events and episodes even with incomplete and noisy cues; 2) the model provides enhanced performance in a noisy environment due to the process of forgetting; and 3) compared with prior models of spatio-temporal memory, our model shows a higher tolerance toward noise and errors in the retrieval cues.

90 citations


Journal ArticleDOI
TL;DR: A model of temporal processing in audition and speech is developed that involves a division of labor between the cerebellum and the basal ganglia in tracing acoustic events in time, and it is proposed that spectrotemporal predictive processes may be facilitated by subcortical coding of relevant changes in sound energy as temporal event markers.

78 citations


Journal ArticleDOI
TL;DR: Different neuro-cognitive processes have been linked to the formation of true and false memories; the regions most consistently implicated are the medial temporal lobe and the medial and lateral prefrontal cortex.
Abstract: Perception and memory are imperfect reconstructions of reality. These reconstructions are prone to be influenced by several factors, which may result in false memories. A false memory is the recollection of an event, or of details of an episode, that did not actually occur. Memory formation comprises at least three sub-processes: encoding, consolidation, and retrieval of the learned material. Each of these sub-processes is vulnerable to specific errors and may consequently produce false memories. Whereas processes such as imagery, self-referential encoding, or spreading activation can lead to the formation of false memories at encoding, semantic generalization during sleep and updating driven by misleading post-event information are particularly relevant at the consolidation stage. Finally, at the retrieval stage, monitoring processes, which are assumed to be essential for rejecting false memories, are of particular importance. Different neuro-cognitive processes have been linked to the formation of true and false memories; the medial temporal lobe and the medial and lateral prefrontal cortex are the regions most consistently reported. Although all phases of memory formation, consolidation, and retrieval are relevant to the forming of false memories, most studies have focused on either encoding or retrieval. Future studies should therefore integrate data from all phases to give a more comprehensive view of systematic memory distortions. An initial outline connecting the different memory stages and research strategies is developed within this review.

74 citations


Journal ArticleDOI
TL;DR: It is argued that memory alterations arise from the inherent predictive function of memory, and it is suggested that systems consolidation and sleep contribute to the formation of abstract knowledge, and that abstract knowledge can function as pre-existing schemas for the encoding of novel memories.

Patent
10 Feb 2012
TL;DR: A memory system and method using at least one memory device die stacked with, and coupled to, a logic die by interconnects, such as through-silicon vias, is described.
Abstract: A memory system and method using at least one memory device die stacked with and coupled to a logic die by interconnects, such as through-silicon vias. One such logic die includes an ECC system generating error checking and correcting ("ECC") bits corresponding to write data. The write data are transmitted to the memory device dice in a packet containing a serial burst of a plurality of parallel data bits. The ECC bits are transmitted to the memory device dice using through-silicon vias that are different from the vias through which data are coupled. Such a logic die could also include a data bus inversion ("DBI") system encoding the write data using a DBI algorithm and transmitting to the memory device dice DBI bits indicating whether the write data have been inverted. The DBI bits are transmitted using through-silicon vias that are shared with the ECC bits when those vias are unused for transferring the ECC bits.
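The DBI idea referenced in the abstract, invert a word whenever sending it as-is would toggle more than half the bus wires, carrying one flag bit per word so the receiver can undo the inversion, can be sketched generically. This is a minimal illustration of the common minimum-transition flavor of DBI, not the specific circuit claimed in the patent.

```python
def dbi_encode(words, width=8):
    """Minimum-transition DBI: invert a word when transmitting it as-is
    would toggle more than half the bus wires relative to the previous
    word, and record a flag bit so the receiver can undo the inversion."""
    prev, out = 0, []
    for w in words:
        if bin(prev ^ w).count("1") > width // 2:
            w ^= (1 << width) - 1          # inverted form toggles fewer wires
            out.append((w, 1))
        else:
            out.append((w, 0))
        prev = w                           # the bus now holds the sent value
    return out

def dbi_decode(encoded, width=8):
    mask = (1 << width) - 1
    return [w ^ mask if flag else w for w, flag in encoded]

data = [0x00, 0xFF, 0x0F, 0xF0]
sent = dbi_encode(data)
assert dbi_decode(sent) == data
assert sent[1] == (0x00, 1)   # 0xFF after 0x00 is sent inverted
```

Besides cutting switching power, capping transitions at half the bus width is what lets the flag bit share wiring budget with other sideband signals, as the patent's shared-via scheme suggests.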

Journal ArticleDOI
TL;DR: It appears that storing, or merely attending to, one feature of an object is sufficient to promote automatic encoding of all its features, depleting VWM resources.
Abstract: Selective attention is often considered the "gateway" to visual working memory (VWM). However, the extent to which we can voluntarily control which of an object's features enter memory remains subject to debate. Recent research has converged on the concept of VWM as a limited commodity distributed between elements of a visual scene. Consequently, as memory load increases, the fidelity with which each visual feature is stored decreases. Here we used changes in recall precision to probe whether task-irrelevant features were encoded into VWM when individuals were asked to store specific feature dimensions. Recall precision for both color and orientation was significantly enhanced when task-irrelevant features were removed, but knowledge of which features would be probed provided no advantage over having to memorize both features of all items. Next, we assessed the effect an interpolated orientation- or color-matching task had on the resolution with which orientations in a memory array were stored. We found that the presence of orientation information in the second array disrupted memory of the first array. The cost to recall precision was identical whether the interfering features had to be remembered, attended to, or could be ignored. Therefore, it appears that storing, or merely attending to, one feature of an object is sufficient to promote automatic encoding of all its features, depleting VWM resources. However, the precision cost was abolished when the match task preceded the memory array. So, while encoding is automatic, maintenance is voluntary, allowing resources to be reallocated to store new visual information.

Journal ArticleDOI
TL;DR: There was a significant, positive correlation between working memory capacity (WMC) and increase in memory performance after sleep but not after a period of wakefulness, suggesting that the relationship is specific to change in memory due to sleep.
Abstract: Decades of research have established that "online" cognitive processes, which operate during conscious encoding and retrieval of information, contribute substantially to individual differences in memory. Furthermore, it is widely accepted that "offline" processes during sleep also contribute to memory performance. However, the question of whether individual differences in these two types of processes are related to one another remains unanswered. We investigated whether working memory capacity (WMC), a factor believed to contribute substantially to individual differences in online processing, was related to sleep-dependent declarative memory consolidation. Consistent with previous studies, memory for word pairs reliably improved after a period of sleep, whereas performance did not improve after an equal interval of wakefulness. More important, there was a significant, positive correlation between WMC and increase in memory performance after sleep but not after a period of wakefulness. The correlation between WMC and performance during initial test was not significant, suggesting that the relationship is specific to change in memory due to sleep. This suggests a fundamental underlying ability that may distinguish individuals with high memory capacity.

Journal ArticleDOI
TL;DR: A novel paradigm is used to investigate how control influences memory encoding and, conversely, how memory measures can provide new insight into flexible cognitive control, illustrating how cognitive control and bottom-up factors interact to simultaneously influence both current performance and future memory.
Abstract: Cognitive control and memory are fundamentally intertwined, but interactions between the two have only recently received sustained research interest. In the study reported here, we used a novel paradigm to investigate how control influences memory encoding and, conversely, how memory measures can provide new insight into flexible cognitive control. Participants switched between classifying objects and words, then were tested for their recognition memory of items presented in this task-switching phase. Task switching impaired memory for task-relevant information but actually improved memory for task-irrelevant information, which indicates that control demands reduced the selectivity of memory encoding rather than causing a general memory decline. Recognition memory strength provided a robust trial-by-trial measure of the effectiveness of cognitive control that "predicted" earlier task-switching performance. It also revealed a substantial influence of bottom-up factors on between-task competition, but only on trials in which participants had to switch from one type of classification to the other. Collectively, our findings illustrate how cognitive control and bottom-up factors interact to simultaneously influence both current performance and future memory.

Proceedings ArticleDOI
13 Oct 2012
TL;DR: The proposed projector-camera system reconstructs a shape from a single image in which a static pattern is cast by a projector; such a method is ideal for acquisition of moving objects at a high frame rate.
Abstract: In this paper, we propose a method to reconstruct the shapes of moving objects. The proposed method is a projector-camera system that reconstructs a shape from a single image in which a static pattern is cast by a projector; such a method is ideal for acquisition of moving objects at a high frame rate. The issues tackled in this paper are as follows: 1) realize one-shot 3D reconstruction with a single-colored pattern, and 2) obtain accurate shapes by finding correspondences with sub-pixel accuracy. To achieve these goals, we propose the following: 1) implicit encoding of projector information by a grid of wave lines, 2) grid-based stereo between the projector pattern and camera images to determine unique correspondences, and 3) (quasi-)pixel-wise interpolations and optimizations to reconstruct dense shapes. In addition, the single-colored pattern simplifies pattern-projecting devices compared with color-coded methods. In our experiments, we show that the proposed method effectively solves the issues above.

Proceedings ArticleDOI
01 Jul 2012
TL;DR: Fundamental information-theoretic bounds are provided on the required circuit wiring complexity and power consumption for encoding and decoding of error-correcting codes; the bounds imply a fundamental tradeoff between transmit and encoding/decoding power, even for bounded transmit-power schemes.
Abstract: We provide fundamental information-theoretic bounds on the required circuit wiring complexity and power consumption for encoding and decoding of error-correcting codes. These bounds hold for all codes and all encoding and decoding algorithms implemented within the paradigm of our VLSI model. This model essentially views computation on a 2-D VLSI circuit as a computation on a network of connected nodes. The bounds are derived based on analyzing information flow in the circuit. They are then used to show that there is a fundamental tradeoff between the transmit and encoding/decoding power, and that the total (transmit + encoding + decoding) power must diverge to infinity at least as fast as the cube root of log(1/P_e), where P_e is the average block-error probability. On the other hand, for bounded transmit-power schemes, the total power must diverge to infinity at least as fast as the square root of log(1/P_e) due to the burden of encoding/decoding.
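The two scaling claims can be restated compactly. This is a sketch only: c_1 and c_2 are unspecified positive constants absorbing the details of the VLSI model, and P_e is the average block-error probability as in the abstract.

```latex
P_{\mathrm{total}} = P_{\mathrm{tx}} + P_{\mathrm{enc}} + P_{\mathrm{dec}}
  \;\ge\; c_1 \left( \log \frac{1}{P_e} \right)^{1/3},
\qquad\text{and, when } P_{\mathrm{tx}} \text{ is bounded,}\qquad
P_{\mathrm{total}} \;\ge\; c_2 \left( \log \frac{1}{P_e} \right)^{1/2}.
```

In both regimes, driving P_e toward zero forces total power to diverge; the bounded-transmit-power case diverges faster because the encoder and decoder must shoulder the entire reliability burden.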

Journal ArticleDOI
TL;DR: Results showed that when possible, participants constrained recall to the solicited targets by reinstating the original encoding operations on the recall cues, which improved the quality of the information that came to mind and enhanced actual recall performance.
Abstract: Research on the strategic regulation of memory accuracy has focused primarily on monitoring and control processes used to edit out incorrect information after it is retrieved (back-end control). Recent studies, however, suggest that rememberers also enhance accuracy by preventing the retrieval of incorrect information in the first place (front-end control). The present study put forward and examined a mechanism called source-constrained recall (cf. Jacoby, Shimizu, Velanova, & Rhodes, 2005) by which rememberers process and use recall cues in qualitatively different ways, depending on the manner of original encoding. Results of 2 experiments in which information about source encoding depth was made available at test showed that when possible, participants constrained recall to the solicited targets by reinstating the original encoding operations on the recall cues. This reinstatement improved the quality of the information that came to mind, which, together with improved postretrieval monitoring, enhanced actual recall performance.

Journal ArticleDOI
31 May 2012 - PLOS ONE
TL;DR: Intriguingly, activity at the time of study in the left precuneus was modulated by the self-reported quality (vividness) of the generated mental images, with greater activity for trials given higher quality ratings, suggesting that regions of the brain support memory in accord with the encoding operations engaged at the time of study.
Abstract: Previous behavioral evidence suggests that instructed strategy use benefits associative memory formation in paired associate tasks. Two such effective encoding strategies–visual imagery and sentence generation–facilitate memory through the production of different types of mediators (e.g., mental images and sentences). Neuroimaging evidence suggests that regions of the brain support memory reflecting the mental operations engaged at the time of study. That work, however, has not taken into account self-reported encoding task success (i.e., whether participants successfully generated a mediator). It is unknown, therefore, whether task-selective memory effects specific to each strategy might be found when encoding strategies are successfully implemented. In this experiment, participants studied pairs of abstract nouns under either visual imagery or sentence generation encoding instructions. At the time of study, participants reported their success at generating a mediator. Outside of the scanner, participants further reported the quality of the generated mediator (e.g., images, sentences) for each word pair. We observed task-selective memory effects for visual imagery in the left middle occipital gyrus, the left precuneus, and the lingual gyrus. No such task-selective effects were observed for sentence generation. Intriguingly, activity at the time of study in the left precuneus was modulated by the self-reported quality (vividness) of the generated mental images with greater activity for trials given higher ratings of quality. These data suggest that regions of the brain support memory in accord with the encoding operations engaged at the time of study.

Journal ArticleDOI
TL;DR: Young and older adults' mediator-based strategy use on source-monitoring tasks was examined; the results underscore the importance of assessing encoding strategy use for understanding individual differences in source memory.
Abstract: Past research has examined the contribution of mediator-based encoding strategies (interactive imagery and sentence generation) to individual (particularly age-related) differences in associative memory exclusively in the paired-associates paradigm. In the present study, we examined young and older adults' mediator-based strategy use on source-monitoring tasks. Participants spontaneously used mediator-based strategies to encode about 30% to 40% of word-source pairs and were able to follow instructions to use the specific mediator-based strategy of interactive imagery; mediator-based strategy use was associated with higher source memory and explained variance in source memory. There were no age-related differences in the patterns of mediator-based strategy production and utilization. Age-related differences in source memory were explained by age-related declines in the ability to bind information in memory (incidental memory for digit-symbol associations) but not by encoding strategy production. Results underscore the importance of assessing encoding strategy use for understanding individual differences in source memory.

Journal ArticleDOI
TL;DR: The results revealed that the degree of prior knowledge positively predicted memory for source-specifying contextual details, and suggest that a priori knowledge within a specific domain allows attentional resources to be allocated toward the encoding of contextual details.
Abstract: A positive relationship between prior knowledge and item memory is a consistent finding in the literature. In the present study, we sought to determine whether this relationship extends to episodic details that are present at the time of encoding, namely source memory. Using a novel experimental design, we were able to show both between- and within-subjects effects of prior knowledge on source memory. Specifically, the results revealed that the degree of prior knowledge positively predicted memory for source-specifying contextual details. In addition, by including two conditions in which attention was divided either at encoding or retrieval, we were able to show that prior knowledge influences memory by affecting encoding processes. Overall, the data suggest that a priori knowledge within a specific domain allows attentional resources to be allocated toward the encoding of contextual details.

Journal ArticleDOI
TL;DR: The high-value advantage holds for implicit and explicit memory, but comes with a side effect: high-value items are more difficult to relearn in a new context.
Abstract: Learning through reward is central to adaptive behavior. Indeed, items are remembered better if they are experienced while participants expect a reward, and people can deliberately prioritize memory for high- over low-valued items. Do memory advantages for high-valued items only emerge after deliberate prioritization in encoding? Or do reward-based memory enhancements also apply to unrewarded memory tests and to implicit memory? First, we tested for a high-value memory advantage in unrewarded implicit and explicit tests (Experiment 1). Participants first learned high or low reward values of 36 words, followed by unrewarded lexical decision and free-recall tests. High-value words were judged faster in lexical decision and more often recalled in free recall. These two memory advantages for high-value words were negatively correlated, suggesting at least two mechanisms by which reward value can influence later item memorability. The ease with which the values were originally acquired explained the negative correlation: people who learned values earlier showed reward effects in implicit memory, whereas people who learned values later showed reward effects in explicit memory. We then asked whether a high-value advantage would persist if trained items were linked to a new context (Experiments 2a and 2b). Following the same value training as in Experiment 1, participants learned lists composed of previously trained words mixed with new words, each followed by free recall. Thus, participants had to retrieve words only from the most recent list, irrespective of their values. High- and low-value words were recalled equally, but low-value words were recalled earlier than high-value words, and high-value words were more often intruded (proactive interference). Thus, the high-value advantage holds for implicit and explicit memory, but comes with a side effect: high-value items are more difficult to relearn in a new context.
Similar to emotional arousal, reward value can both enhance and impair memory.

Journal ArticleDOI
TL;DR: This paper presents a new construction of a single-error-correcting WOM-code with a better rate, and shows two constructions that can be combined to correct an arbitrary number of errors.
Abstract: A Write Once Memory (WOM) is a storage medium with binary memory elements, called cells, that can change from the zero state to the one state only once. Examples of WOMs include punch cards and optical disks. WOM-codes, introduced by Rivest and Shamir, permit the reuse of a WOM by taking into account the location of cells that have already been changed to the one state. The objective in designing WOM-codes is to use the fewest cells to store a specified number of information bits in each of several reuses of the memory. An [n,k,t] WOM-code C is a coding scheme for storing k information bits in n cells t times. At each write, the state of each cell can be changed, provided that the cell is changed from the zero state to the one state. The rate of C, defined by R(C) = kt/n, indicates the total amount of information that is possible to store in a cell in t writes. Two WOM-code constructions correcting a single cell-error were presented by Zemor and Cohen. In this paper, we present another construction of a single-error-correcting WOM-code with a better rate. Our construction can be adapted also for single-error-detection, double-error-correction, and triple-error-correction. For the last case, we use triple-error-correcting BCH-like codes, which were presented by Kasami and more recently described again by Bracken and Helleseth. Finally, we show two constructions that can be combined for the correction of an arbitrary number of errors.
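For background, the classic Rivest–Shamir [3,2,2] WOM-code that this line of work builds on stores k = 2 bits t = 2 times in n = 3 write-once cells, for rate kt/n = 4/3. A minimal sketch of that base code (without the paper's error-correction machinery):

```python
# Rivest-Shamir [3,2,2] WOM-code: 2 message bits, written twice, in 3
# write-once cells (cells may only change 0 -> 1).
FIRST  = {0b00: 0b000, 0b01: 0b100, 0b10: 0b010, 0b11: 0b001}  # weight <= 1
SECOND = {0b00: 0b111, 0b01: 0b011, 0b10: 0b101, 0b11: 0b110}  # weight >= 2

def decode(state):
    """The cell weight tells the decoder which generation it is reading."""
    gen = FIRST if bin(state).count("1") <= 1 else SECOND
    return next(msg for msg, cells in gen.items() if cells == state)

def rewrite(state, msg):
    """Second write: move to the second-generation codeword, or stay put
    if the stored message is unchanged. Only 0 -> 1 cell changes occur."""
    if decode(state) == msg:
        return state
    new = SECOND[msg]
    assert state & ~new == 0, "would need a forbidden 1 -> 0 change"
    return new

s = FIRST[0b01]        # first write: store 01 -> cells 100
s = rewrite(s, 0b10)   # second write: store 10 -> cells 101
assert decode(s) == 0b10
```

The mapping is chosen so that every first-generation codeword can reach every needed second-generation codeword by setting cells only, which is exactly the constraint the WOM model imposes.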

Proceedings ArticleDOI
24 Sep 2012
TL;DR: A novel approach, Memory Exploration and Encoding (ME2), is presented: it first identifies useful pages and then uses the Run-Length Encoding algorithm to quickly encode memory, efficiently decreasing the total transferred data, total migration time, and downtime.
Abstract: Live migration of virtual machines plays an important role in data centers: it can migrate a virtual machine from one physical machine to another with only slight influence on the upper workload. It can be used to facilitate hardware maintenance, load balancing, fault tolerance, and power saving, especially in cloud computing data centers. Although pre-copy is the prevailing approach, it cannot distinguish which memory pages are actually in use, and so transfers large amounts of useless memory pages. This paper presents a novel approach, Memory Exploration and Encoding (ME2), which first identifies useful pages and then uses the Run-Length Encoding algorithm to quickly encode memory, efficiently decreasing the total transferred data, total migration time, and downtime. Experiments demonstrate that, compared with Xen's pre-copy algorithm, ME2 reduces total transferred data by 50.5%, total migration time by 48.2%, and downtime by 47.6% on average.
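The encoding step can be illustrated with a toy run-length encoder: repetitive pages (e.g., zero-filled ones) collapse to a handful of (byte, length) pairs, which is the property such migration schemes exploit. This is a sketch of plain RLE under assumed conventions (255-byte run cap, per-page encoding), not the paper's exact on-wire format or its page-identification logic.

```python
def rle_encode(page: bytes):
    """Run-length encode a memory page as (byte, run_length) pairs,
    capping runs at 255 so each pair fits in two bytes. Zero-filled or
    otherwise repetitive pages shrink dramatically."""
    runs, i = [], 0
    while i < len(page):
        j = i
        while j < len(page) and page[j] == page[i] and j - i < 255:
            j += 1
        runs.append((page[i], j - i))
        i = j
    return runs

def rle_decode(runs):
    return bytes(b for b, n in runs for _ in range(n))

zero_page = bytes(4096)            # a freshly ballooned or unused page
runs = rle_encode(zero_page)
assert rle_decode(runs) == zero_page
assert len(runs) == 17             # 4096 = 16 * 255 + 16
```

A 4 KiB zero page thus shrinks from 4096 bytes to 17 two-byte pairs before hitting the network, which is where the reported reductions in transferred data come from.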

Patent
30 Nov 2012
TL;DR: Systems and methods of encoding and decoding shaped data determine a soft metric for a bit read from a non-volatile memory of a data storage device, at least partially based on the amount of shaping of the data.
Abstract: Systems and methods of encoding and decoding shaped data include determining a bit representation corresponding to a bit in a representation of a codeword that is read from a non-volatile memory of a data storage device. A soft metric corresponding to the bit representation is determined at least partially based on an amount of shaping of data.

Proceedings Article
01 Jan 2012
TL;DR: An integer linear programming method is developed and dynamic programming is employed to produce codes for uniformly distributed data, and data-aware coding schemes are introduced to efficiently address the energy minimization problem for stochastic data.
Abstract: We devise new coding methods to minimize Phase Change Memory write energy. Our method minimizes the energy required for memory rewrites by utilizing the differences between PCM read, set, and reset energies. We develop an integer linear programming method and employ dynamic programming to produce codes for uniformly distributed data. We also introduce data-aware coding schemes to efficiently address the energy minimization problem for stochastic data. Our evaluations show that the proposed methods yield up to 32% and 44% reductions in memory energy consumption for uniform and stochastic data, respectively.
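
The asymmetry being exploited can be sketched with a simple flip-style encoder (in the spirit of Flip-N-Write, not the paper's ILP or dynamic-programming construction; the per-bit energies below are hypothetical): when reset transitions cost more than set transitions, storing the inverted word plus a flag bit is sometimes cheaper to write.

```python
E_SET, E_RESET = 1.0, 2.5  # hypothetical per-bit write energies (reset costs more)

def write_energy(old, new):
    """Energy to move the cells from word `old` to word `new`."""
    e = 0.0
    for o, n in zip(old, new):
        if (o, n) == (0, 1):
            e += E_SET     # 0 -> 1 transition: set
        elif (o, n) == (1, 0):
            e += E_RESET   # 1 -> 0 transition: reset
    return e

def encode(old, new):
    """Store `new` directly or inverted, whichever is cheaper to write.
    A one-bit flag records the choice so the word can be decoded later."""
    inv = [1 - b for b in new]
    if write_energy(old, new) <= write_energy(old, inv):
        return new, 0
    return inv, 1
</```

When the stored word equals the complement of the incoming data, the inverted encoding touches no cells at all, which is where the largest savings come from.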

Journal ArticleDOI
TL;DR: The results indicate that the source memory advantage for the emotional context information is not always accompanied by enhanced recollection of the specific details of the learning episode and might rather reflect unspecific memory for categorical emotional information.
Abstract: Two experiments designed to examine the specificity of emotional source memory are reported. In the encoding phase, participants saw faces along with emotional context information, that is, descriptions of cheating, trustworthy, or irrelevant behavior. In the test phase, participants were required to complete a source classification test and a cued recall test. In both experiments, the source memory advantage for faces characterized by negative context information (cheating) was replicated. Extending previous research, a multinomial source-monitoring model was applied to distinguish between specific source memory for individual behavior descriptions and partial source memory in the sense of only a rough classification of the behavior as belonging to a particular emotional category--cheating, trustworthy, or neither of these. The results indicate that the source memory advantage for the emotional context information is not always accompanied by enhanced recollection of the specific details of the learning episode and might rather reflect unspecific memory for categorical emotional information.

Journal ArticleDOI
29 Aug 2012-PLOS ONE
TL;DR: The mechanisms underlying working memory capacity are investigated by means of a biophysically-realistic attractor network with spiking neurons while accounting for two recent experimental observations: the presence of a visually salient item reduces the number of items that can be held in working memory, and visually salient items are commonly kept in memory at the cost of not keeping as many non-salient items.
Abstract: The study of working memory capacity is of utmost importance in cognitive psychology as working memory is at the basis of general cognitive function. Although the working memory capacity limit has been thoroughly studied, its origin still remains a matter of strong debate. Only recently has the role of visual saliency in modulating working memory storage capacity been assessed experimentally and proved to provide valuable insights into working memory function. In the computational arena, attractor networks have successfully accounted for psychophysical and neurophysiological data in numerous working memory tasks given their ability to produce a sustained elevated firing rate during a delay period. Here we investigate the mechanisms underlying working memory capacity by means of a biophysically-realistic attractor network with spiking neurons while accounting for two recent experimental observations: 1) the presence of a visually salient item reduces the number of items that can be held in working memory, and 2) visually salient items are commonly kept in memory at the cost of not keeping as many non-salient items. Our model suggests that working memory capacity is determined by two fundamental processes: encoding of visual items into working memory and maintenance of the encoded items upon their removal from the visual display. While maintenance critically depends on the constraints that lateral inhibition imposes on the mnemonic activity, encoding is limited by the ability of the stimulated neural assemblies to reach a sufficiently high level of excitation, a process governed by the dynamics of competition and cooperation among neuronal pools. Encoding is therefore contingent upon the visual working memory task and has led us to introduce the concept of effective working memory capacity (eWMC) in contrast to the maximal upper capacity limit only reached under ideal conditions.
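
The capacity-limiting interplay of self-excitation and global inhibition can be illustrated with a drastically simplified rate model (a toy sketch, not the paper's biophysical spiking network; every parameter here is invented for illustration): each pool codes one item, recurrent excitation sustains it after stimulus offset, and shared inhibition caps how many pools can survive the delay.

```python
import numpy as np

def simulate(n_items, n_pools=8, w_self=2.0, w_inh=0.3,
             dt=0.1, stim_steps=100, delay_steps=200):
    """Toy rate-model attractor network; returns items still active after the delay."""
    r = np.zeros(n_pools)                        # firing rate of each item pool
    f = lambda x: np.clip(x, 0.0, 1.0)           # saturating gain function
    stim = np.zeros(n_pools)
    stim[:n_items] = 1.5                         # external drive encodes n_items items
    for step in range(stim_steps + delay_steps):
        drive = w_self * r - w_inh * r.sum()     # self-excitation + global inhibition
        if step < stim_steps:
            drive = drive + stim                 # stimulus present, then removed
        r = r + dt * (-r + f(drive))             # Euler integration of rate dynamics
    return int((r > 0.5).sum())                  # count pools in the elevated state
```

With these toy parameters, a pool stays in its elevated state only while w_self exceeds the pooled inhibition w_inh times the number of active pools, so loading too many items collapses the delay activity, a crude analogue of the capacity limit discussed above.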

Patent
Tokumasa Hara1, Osamu Torii1
24 Aug 2012
TL;DR: In this article, a memory controller that controls a non-volatile semiconductor memory including a memory cell of 3 bits/cell includes a controller that extracts bits which becomes an error caused by the movement to the adjacent threshold voltage distribution from a first bit and a second bit of data to be written in each of the memory cells to generate a virtual page and an encoding unit that generate an error correcting code for the virtual page.
Abstract: According to one embodiment, a memory controller that controls a non-volatile semiconductor memory including memory cells of 3 bits/cell includes a controller that extracts, from a first bit and a second bit of the data to be written in each of the memory cells, the bits that may become errors due to shifts to adjacent threshold voltage distributions, thereby generating a virtual page, and an encoding unit that generates an error-correcting code for the virtual page and writes the data for three pages and the error-correcting code to the non-volatile semiconductor memory.

Book ChapterDOI
TL;DR: The concept of distinctive processing offers an encouraging opening to the development of a broader theory of precision in memory.
Abstract: Memory allows prior experience to influence current processing, and through that function memory is arguably fundamental to cognitive processes from perception to reasoning. Yet, both formal and anecdotal evidence suggests that encoding to memory is rarely intentional. Thus, the challenge for memory theory is to capture the operation of a powerful and sensitive process that nonetheless operates incidentally to perception and comprehension. This chapter describes the long-term development of a framework for such a theory. The framework emerged from a convergence of empirical work on organization and levels of processing in memory and has been conceptually guided by theories of similarity judgment. Of particular relevance is structural mapping theory (e.g., Gentner, 1983; Medin, Goldstone, & Gentner, 1990), which has fostered a definition of distinctive processing as the processing of difference in the context of similarity. Research has shown near-perfect memory for a substantial amount of material following such processing. Moreover, distinctive processing has been shown to reduce forgetting as well as false memory. In sum, the concept of distinctive processing offers an encouraging opening to the development of a broader theory of precision in memory.