scispace - formally typeset

Cache pollution

About: Cache pollution is a research topic. Over its lifetime, 11,353 publications have been published within this topic, receiving 262,139 citations.


Papers
Journal ArticleDOI
TL;DR: An approximate analytical model for the performance of multiprocessors with private cache memories and a single shared main memory is presented; its accuracy is found to be very good over a broad range of parameters.
Abstract: This paper presents an approximate analytical model for the performance of multiprocessors with private cache memories and a single shared main memory. The accuracy of the model is compared with simulation results and is found to be very good over a broad range of parameters. The parameters of the model are the size of the multiprocessor, the size and type of the interconnection network, the cache miss-ratio, and the cache block transfer time. The analysis is extended to include several different read/write policies such as write-through, load-through, and buffered write-back. The analytical technique presented is also applicable to the performance of interconnection networks under block transfer mode.
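The abstract describes a model parameterized by the cache miss ratio and block transfer time. As a rough illustration of that style of analysis (not the paper's actual equations, which also account for the interconnection network and multiprocessor size), a minimal miss-ratio-based estimate of per-processor efficiency might look like:

```python
def processor_efficiency(miss_ratio, block_transfer_time, hit_time=1.0):
    """Approximate fraction of time a processor does useful work,
    assuming each memory reference costs hit_time cycles on a hit and
    hit_time + block_transfer_time cycles on a miss, with no memory
    contention. Parameter names are illustrative, not from the paper."""
    avg_time_per_reference = hit_time + miss_ratio * block_transfer_time
    return hit_time / avg_time_per_reference

# e.g. a 5% miss ratio with a 20-cycle block transfer halves throughput:
print(processor_efficiency(0.05, 20))  # 0.5
```

Contention in a real shared-memory system makes the true efficiency lower than this idealized bound, which is precisely what the paper's model captures.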

78 citations

Patent
03 Sep 2007
TL;DR: In this article, a cache maintains rows of data accessed by different user applications on corresponding connections; the cache restricts access to rows populated by other applications when processing requests directed to the source database system.
Abstract: Enhanced access data available in a cache. In one embodiment, a cache maintaining copies of source data is formed as a volatile memory. On receiving a request directed to the cache for a copy of a data element, the requested copy maintained in the cache is sent as a response to the request. In another embodiment used in the context of applications accessing databases in a navigational model, a cache maintains rows of data accessed by different user applications on corresponding connections. Applications may send requests directed to the cache to retrieve copies of the rows, populated potentially by other applications, while the cache restricts access to rows populated by other applications when processing requests directed to the source database system. In another embodiment, an application may direct requests to retrieve data elements caused to be populated by activity on different connections established by the same application.
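The visibility rule in the abstract (direct cache reads may see rows populated by other applications, while source-directed requests are restricted to the caller's own rows) can be sketched with a toy row cache. All class and method names here are illustrative, not from the patent:

```python
class ConnectionCache:
    """Toy sketch of the row-cache behaviour described above: each cached
    row remembers which connection populated it. Direct cache reads see
    every row; reads routed to the source database see only rows the
    requesting connection populated. Names are illustrative."""

    def __init__(self):
        self._rows = {}  # row_id -> (value, populating_connection)

    def populate(self, conn, row_id, value):
        self._rows[row_id] = (value, conn)

    def get_from_cache(self, conn, row_id):
        # Requests directed at the cache may return rows populated
        # by other applications/connections.
        entry = self._rows.get(row_id)
        return entry[0] if entry else None

    def get_via_source(self, conn, row_id):
        # Requests directed at the source database are restricted
        # to rows populated on the requesting connection.
        entry = self._rows.get(row_id)
        if entry and entry[1] == conn:
            return entry[0]
        return None

cache = ConnectionCache()
cache.populate("conn_a", 1, "row-1")
print(cache.get_from_cache("conn_b", 1))  # row-1: cache reads see all rows
print(cache.get_via_source("conn_b", 1))  # None: source reads are restricted
```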

78 citations

Patent
Naoki Moritoki1
02 Jan 2008
TL;DR: In this paper, the authors present a storage apparatus and a data management method capable of preventing the loss of data retained in a volatile cache memory even during an unexpected power shutdown; the storage apparatus includes a cache memory built from both volatile and nonvolatile memory.
Abstract: Provided are a storage apparatus and its data management method capable of preventing the loss of data retained in a volatile cache memory even during an unexpected power shutdown. This storage apparatus includes a cache memory configured from a volatile and nonvolatile memory. The volatile cache memory caches data according to a write request from a host system and data staged from a disk drive, and the nonvolatile cache memory only caches data staged from a disk drive. Upon an unexpected power shutdown, the storage apparatus immediately backs up the dirty data and other information cached in the volatile cache memory to the nonvolatile cache memory.
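The split described above (host writes land in the volatile tier and are tracked as dirty; on power failure, only the dirty data is backed up to the nonvolatile tier) can be sketched as follows. This is a minimal illustrative model, not the patent's implementation:

```python
class HybridCache:
    """Minimal sketch of the volatile/nonvolatile cache split described
    above. Host writes go to the volatile tier and are marked dirty;
    data staged from disk is already safe on disk, so only dirty data
    needs backing up when power fails. Illustrative names throughout."""

    def __init__(self):
        self.volatile = {}      # contents lost on power failure
        self.dirty = set()      # host-written keys not yet persisted
        self.nonvolatile = {}   # survives power loss

    def host_write(self, key, value):
        self.volatile[key] = value
        self.dirty.add(key)

    def stage_from_disk(self, key, value):
        # Staged (clean) data: a copy already exists on disk.
        self.nonvolatile[key] = value

    def on_power_failure(self):
        # Immediately back up dirty data before the volatile tier is lost.
        for key in self.dirty:
            self.nonvolatile[key] = self.volatile[key]
        self.volatile.clear()
        self.dirty.clear()
```

Backing up only the dirty set keeps the emergency copy small, which matters when the backup must complete on residual power.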

78 citations

Proceedings ArticleDOI
06 Mar 2009
TL;DR: A 3D chip design that stacks SRAM and DRAM upon processing cores and employs OS-based page coloring to minimize horizontal communication of cache data is postulated and it is shown that a tree topology is an ideal fit that significantly reduces the power and latency requirements of the on-chip network.
Abstract: Cache hierarchies in future many-core processors are expected to grow in size and contribute a large fraction of overall processor power and performance. In this paper, we postulate a 3D chip design that stacks SRAM and DRAM upon processing cores and employs OS-based page coloring to minimize horizontal communication of cache data. We then propose a heterogeneous reconfigurable cache design that takes advantage of the high density of DRAM and the superior power/delay characteristics of SRAM to efficiently meet the working set demands of each individual core. Finally, we analyze the communication patterns for such a processor and show that a tree topology is an ideal fit that significantly reduces the power and latency requirements of the on-chip network. The above proposals are synergistic: each proposal is made more compelling because of its combination with the other innovations described in this paper. The proposed reconfigurable cache model improves performance by up to 19% along with 48% savings in network power.
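OS-based page coloring, which the paper uses to keep a core's data in the cache bank stacked above it, can be sketched in its simplest generic form: the low-order bits of the physical page number select a cache region ("color"), and the allocator prefers pages whose color matches the requesting core. The scheme below is a generic illustration, not the paper's specific policy:

```python
def page_color(physical_page_number, num_colors):
    """Generic page coloring: low-order page-number bits pick the cache
    region the page maps to. A sketch, not the paper's exact scheme."""
    return physical_page_number % num_colors

def allocate_page(free_pages, core_id, num_colors):
    """Prefer a free page whose color matches the requesting core, so its
    data stays in the local (stacked) cache bank; fall back to any page."""
    for p in free_pages:
        if page_color(p, num_colors) == core_id % num_colors:
            free_pages.remove(p)
            return p
    return free_pages.pop(0) if free_pages else None

free = [3, 4, 5]
print(allocate_page(free, 1, 4))  # 5 (color 1 matches core 1)
```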

78 citations

Patent
25 Aug 1997
TL;DR: In this article, a reconfigurable cache optimized for texture mapping is proposed; it provides two banks of memory in one mode of operation and a palettized map in a second mode of operation.
Abstract: A reconfigurable cache in a signal processor provides a cache optimized for texture mapping. In particular, the reconfigurable cache provides two banks of memory during one mode of operation and a palettized map under a second mode of operation. In one implementation, the reconfigurable cache optimizes mip-mapping by assigning one texture map to one of the memory banks and a second texture map of a different resolution to the other memory bank. A special mapping pattern ("supertiling") from a graphical image to cache lines minimizes cache misses in texture mapping operations.

78 citations


Network Information
Related Topics (5)
Cache
59.1K papers, 976.6K citations
93% related
Compiler
26.3K papers, 578.5K citations
89% related
Scalability
50.9K papers, 931.6K citations
87% related
Server
79.5K papers, 1.4M citations
86% related
Static routing
25.7K papers, 576.7K citations
84% related
Performance Metrics
No. of papers in the topic in previous years
Year    Papers
2023    42
2022    110
2021    12
2020    20
2019    15
2018    30