Topic

Encoding (memory)

About: Encoding (memory) is a research topic. Over its lifetime, 7,547 publications have been published within this topic, receiving 120,214 citations. The topic is also known as: memory encoding & encoding of memories.


Papers
Patent
25 Apr 1995
TL;DR: In this patent, a motion vector detected by a motion detector, together with a quantizing parameter and a frame structure determined by a controller, is stored in a memory; the stored data is then supplied to an encoder, which carries out the encoding corresponding to that data.
Abstract: A motion vector detected by a motion detector (2) and a quantizing parameter and a frame structure determined by a controller (4) are stored in a memory (5). The data thus stored is supplied to an encoder (3), which carries out the encoding corresponding to the stored data. Thus, the data are coded via multiple paths, which can reduce restrictions from the standpoint of time and also reduce the scale of the hardware needed for encoding.
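The stored-parameters idea lends itself to a compact illustration. Below is a minimal Python sketch of one common reading of the scheme (analysis results computed in a first pass, stored, and replayed into the encoder in a second pass); FrameParams, analyse_frame, and encode_frame are hypothetical stand-ins, not any real codec API.

```python
# Minimal sketch of encoding with stored per-frame parameters.
# FrameParams, analyse_frame() and encode_frame() are hypothetical stand-ins,
# not part of any real codec API.
from dataclasses import dataclass

@dataclass
class FrameParams:
    motion_vector: tuple   # detected by the motion detector (first pass)
    quant_param: int       # quantizing parameter chosen by the controller
    frame_type: str        # frame structure, e.g. "I", "P", or "B"

def analyse_frame(frame) -> FrameParams:
    """First pass: stand-in for motion detection and controller decisions."""
    return FrameParams(motion_vector=(0, 0), quant_param=28, frame_type="P")

def encode_frame(frame, params: FrameParams) -> bytes:
    """Second pass: stand-in encoder that consumes the stored parameters."""
    return bytes([params.quant_param])  # placeholder for the real bitstream

def encode_sequence(frames):
    stored = [analyse_frame(f) for f in frames]                  # parameters held in memory
    return [encode_frame(f, p) for f, p in zip(frames, stored)]  # encode from stored data

print(encode_sequence(["frame0", "frame1"]))
```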

53 citations

Patent
12 Jun 2006
TL;DR: In this patent, for an encoder that encodes symbols of data on a computing device with memory constraints, a transformation is performed by loading a source block into the device's memory, performing an intermediate transformation of less than all of the source block, replacing part of the source block with the intermediate results in that memory, and then completing the transformation so that the output symbols stored in memory form a set of encoded symbols.
Abstract: In an encoder for encoding symbols of data using a computing device having memory constraints, a method of performing a transformation comprises loading a source block into memory of the computing device, performing an intermediate transformation of less than all of the source block, then replacing a part of the source block with intermediate results in the memory, and then completing the transformation such that output symbols stored in the memory form a set of encoded symbols. A decoder can perform decoding steps in an order that allows substantially the same memory to be used for storing the received data and the decoded source block, performing the steps as in-place transformations. Using an in-place transformation, a large portion of the memory set aside for received data can be overwritten as that received data is transformed into decoded source data, without requiring a similarly sized large portion of memory for the decoded source data.
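As an illustration of what such an in-place transformation looks like in code, here is a minimal Python sketch using a trivial invertible transform (an XOR prefix sum). The transform itself is invented for illustration and is not the patent's code construction; the point is that the encoded symbols (and, on the decoder side, the decoded source block) overwrite the same buffer that held the source or received data.

```python
# Minimal sketch of an in-place transformation over a single buffer.
# The XOR prefix transform is illustrative only, not the patent's construction.

def encode_in_place(block: bytearray) -> bytearray:
    half = len(block) // 2
    # Intermediate transformation over part of the source block:
    # intermediate results replace the source data in the same memory.
    for i in range(1, half):
        block[i] ^= block[i - 1]
    # Complete the transformation over the rest of the same buffer.
    for i in range(max(half, 1), len(block)):
        block[i] ^= block[i - 1]
    return block  # the buffer now holds the encoded symbols

def decode_in_place(block: bytearray) -> bytearray:
    # Inverse transform, reusing the memory that held the received data.
    for i in range(len(block) - 1, 0, -1):
        block[i] ^= block[i - 1]
    return block  # the buffer now holds the decoded source block

data = bytearray(b"source block")
assert decode_in_place(encode_in_place(bytearray(data))) == data
```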

53 citations

Patent
23 Oct 1997
TL;DR: An image decoder that can decode an encoded bit stream produced by a different encoding system has an encoding-system judging unit, which judges the encoding system of the encoded bit stream in accordance with multiplexed encoding-system identification information.
Abstract: An image decoder which can decode an encoded bit stream encoded by a different encoding system has: an encoding-system judging unit, which judges the encoding system of the encoded bit stream in accordance with multiplexed encoding-system identification information; a setting means, which sets the header information of a 2nd encoding system in accordance with the header information of a 1st encoding system; and a decoding means, which decodes the encoded image data of the 1st encoding system in accordance with the header information of the set 2nd encoding system.
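The decoder's flow can be pictured with a short Python sketch: read a multiplexed system identifier, and when the stream comes from the 1st encoding system, derive the 2nd system's header from the 1st system's header before decoding. The field names and system identifiers below are invented for illustration and are not taken from the patent.

```python
# Minimal sketch of decoding a bit stream from a different encoding system.
# All identifiers and header fields are hypothetical placeholders.

def judge_encoding_system(stream: dict) -> str:
    # Judging unit: reads the multiplexed encoding-system identification info.
    return stream["system_id"]

def set_second_header(first_header: dict) -> dict:
    # Setting means: builds the 2nd system's header from the 1st system's header.
    return {
        "width": first_header["horizontal_size"],
        "height": first_header["vertical_size"],
        "frame_rate": first_header.get("frame_rate", 30),
    }

def decode_image_data(data, header: dict) -> dict:
    # Decoding means: stand-in for the actual image decoding step.
    return {"header": header, "pixels": data}

def decode(stream: dict) -> dict:
    if judge_encoding_system(stream) == "system-1":
        header = set_second_header(stream["header"])   # translate the header first
    else:
        header = stream["header"]                      # already in the 2nd system's format
    return decode_image_data(stream["data"], header)

example = {"system_id": "system-1",
           "header": {"horizontal_size": 352, "vertical_size": 288},
           "data": b"..."}
print(decode(example))
```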

53 citations

Journal Article
TL;DR: A comparison of the subjective likelihood model (SLiM; McClelland & Chappell) and the retrieving effectively from memory model (REM), with a brief tutorial on each model and simulations showing cases where they diverge.

53 citations

Journal Article
TL;DR: The authors found that free recall of conceptually unrelated, randomly arranged, and properly arranged hierarchies of words all increased markedly with generative instructions, with the greatest gain occurring for the randomly arranged hierarchy.
Abstract: Ninety individually run subjects learned and were tested for their free recall of a conceptually unrelated hierarchy of words, a randomly arranged conceptual hierarchy, or a properly arranged conceptual hierarchy, under instructions to process the words either by generating hierarchical associations among them or by copying them. As predicted, recall of every type of hierarchy increased markedly with generative instructions (p < .001), with the greatest gain occurring for the randomly arranged hierarchy. Type of hierarchy also affected recall (p < .001). The results tend to support the generative model of encoding, a model which emphasizes the active construction of distinctive as well as semantic associations.

The nature of the information encoded in long-term memory is a central issue in multistage models of memory and in levels-of-processing models of memory. Multistage models often contrast acoustic or phonetic processing with semantic processing to describe how information is encoded into short-term memory and long-term memory, respectively. Across experiments with multistage models, semantic processing varies in meaning from a circular definition of it as any processing that results in long-term recall, to carefully detailed statements about the abstract semantic attributes and markers, networks of lexical meanings, taxonomical organizations, and hierarchical retrieval cues involved in long-term recall. However, in nearly all of these definitions, the processing of abstract meanings is of primary concern. Models that rely primarily on the processing of abstract categories of meaning to account for long-term memory may be unnecessarily limited in the range of encoding variables they include. Levels-of-processing models, such as the generative model presented below, omit the stages

53 citations


Network Information
Related Topics (5)
Artificial neural network: 207K papers, 4.5M citations, 83% related
Deep learning: 79.8K papers, 2.1M citations, 83% related
Feature extraction: 111.8K papers, 2.1M citations, 82% related
Convolutional neural network: 74.7K papers, 2M citations, 81% related
Cluster analysis: 146.5K papers, 2.9M citations, 81% related
Performance Metrics
No. of papers in the topic in previous years:
Year	Papers
2023	1,083
2022	2,253
2021	450
2020	378
2019	358
2018	363