Topic

Smart Cache

About: Smart Cache is a research topic. Over its lifetime, 7,680 publications have been published within this topic, receiving 180,618 citations.


Papers
Patent
26 Jun 2013
TL;DR: In this article, the authors present a computer implemented method, system, and computer program product for cache management comprising recording metadata of IO sent from the server to a storage array, calculating a distribution of a server cache based on the metadata, receiving an IO directed to the storage array, and revising an allocation of the server cache to a plurality of storage mediums based on the calculated distribution and the IO.
Abstract: A computer implemented method, system, and computer program product for cache management comprising recording metadata of IO sent from the server to a storage array, calculating a distribution of a server cache based on the metadata, receiving an IO directed to the storage array, and revising an allocation of the server cache to the plurality of storage mediums based on the calculated distribution and the IO.
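A minimal sketch of the kind of allocation step the abstract describes, assuming a hypothetical reallocate() helper and a simple proportional policy (the patent does not specify the formula): the server records which medium each IO targeted, then splits a fixed cache budget in proportion to the observed per-medium IO counts.

```python
from collections import Counter

def reallocate(io_metadata, cache_size_blocks):
    """Split a fixed server-side cache among storage mediums in proportion
    to the number of IOs observed for each medium (illustrative policy)."""
    counts = Counter(medium for medium, _lba in io_metadata)
    total = sum(counts.values()) or 1
    return {m: cache_size_blocks * n // total for m, n in counts.items()}

# Record metadata of IOs sent to the array, then revise the allocation.
observed = [("ssd", 100), ("hdd", 7), ("ssd", 101), ("hdd", 8), ("ssd", 102)]
print(reallocate(observed, cache_size_blocks=1024))  # {'ssd': 614, 'hdd': 409}
```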

73 citations

Journal ArticleDOI
17 Jan 1976
TL;DR: The concept of cache memory is introduced together with its major organizational parameters: size, associativity, block size, replacement algorithm, and write strategy, and simulation results are given showing how the performance of the cache varies with changes in these parameters.
Abstract: This paper gives a summary of the research which led to the design of the cache memory in the DEC PDP-11/70. The concept of cache memory is introduced together with its major organizational parameters: size, associativity, block size, replacement algorithm, and write strategy. Simulation results are given showing how the performance of the cache varies with changes in these parameters. Based on these simulation results the design of the 11/70 cache is justified.
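The effect of the parameters listed above can be reproduced with a few lines of trace-driven simulation. The sketch below is illustrative only (it is neither the 11/70 design nor the paper's simulator): an LRU, set-associative cache whose hit rate is measured over an address trace that thrashes a direct-mapped configuration but fits once two ways per set are available.

```python
def simulate(trace, size=1024, assoc=2, block=16):
    """Trace-driven hit-rate simulation of an LRU set-associative cache.
    size and block are in bytes; assoc is the number of ways per set."""
    n_sets = size // (assoc * block)
    sets = [[] for _ in range(n_sets)]          # each set holds tags, MRU last
    hits = 0
    for addr in trace:
        block_no = addr // block
        idx, tag = block_no % n_sets, block_no // n_sets
        way = sets[idx]
        if tag in way:
            hits += 1
            way.remove(tag)                      # refresh LRU position
        elif len(way) == assoc:
            way.pop(0)                           # evict the least recently used tag
        way.append(tag)
    return hits / len(trace)

# Two blocks that collide in a direct-mapped cache but coexist in a 2-way one.
trace = [0, 1024] * 5000
for ways in (1, 2, 4):
    print(ways, "way(s):", round(simulate(trace, assoc=ways), 3))
```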

73 citations

Patent
28 Jun 1996
TL;DR: In this article, the authors propose an apparatus and method for synchronizing the cache mode in a cache memory system in order to protect cache operations; the cache mode is stored as metadata in the cache modules and is read by the first controller to determine the current mode.
Abstract: An apparatus and method for synchronizing a cache mode in a cache memory system in a computer to protect cache operations. The cache memory system has a first controller and a second controller and two cache modules and operates in a plurality of cache modes. The cache mode is stored as metadata in the cache modules and is detected by the first controller to determine the cache mode. Lock signals in the first controller are set in accordance with the cache mode detected to set the cache mode state in the first controller. The second controller copies the cache mode state from the first controller to synchronize both controllers in the same cache mode state. After a failure of the second controller, the first controller may lock access to both caches to recover data previously accessed by the second controller. The second controller restarts and copies the cache mode state from the first controller, so that both controllers return to the cache mode state prior to the failure of the second controller.
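A rough, illustrative model of the synchronization step (class and field names are invented for the sketch and are not the patent's terminology): the first controller reads the mode stored as metadata in a cache module and sets its lock state, and the second controller simply copies that state so both controllers end up in the same mode.

```python
from dataclasses import dataclass, field

@dataclass
class CacheModule:
    metadata: dict = field(default_factory=lambda: {"mode": "write-back"})

@dataclass
class Controller:
    name: str
    mode: str = ""
    locks: dict = field(default_factory=dict)

    def detect_mode(self, modules):
        """Read the cache mode stored as metadata and set lock signals."""
        self.mode = modules[0].metadata["mode"]
        self.locks = {"write_lock": self.mode == "write-back"}

    def copy_mode_from(self, peer):
        """Synchronize by copying the peer controller's mode state."""
        self.mode, self.locks = peer.mode, dict(peer.locks)

modules = [CacheModule(), CacheModule()]
first, second = Controller("first"), Controller("second")
first.detect_mode(modules)      # first controller detects the stored mode
second.copy_mode_from(first)    # second controller mirrors it (also after a restart)
assert first.mode == second.mode == "write-back"
```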

73 citations

01 Jan 2004
TL;DR: This paper proposes a dynamic cache partitioning method for simultaneous multithreading systems that collects the miss-rate characteristics of simultaneously executing threads at runtime, and partitions the cache among the executing threads.
Abstract: This paper proposes a dynamic cache partitioning method for simultaneous multithreading systems. We present a general partitioning scheme that can be applied to set-associative caches at any partition granularity. Furthermore, in our scheme threads can have overlapping partitions, which provides more degrees of freedom when partitioning caches with low associativity. Since memory reference characteristics of threads can change very quickly, our method collects the miss-rate characteristics of simultaneously executing threads at runtime, and partitions the cache among the executing threads. Partition sizes are varied dynamically to improve hit rates. Trace-driven simulation results show a relative improvement in the L2 hit-rate of up to 40.5% over those generated by the standard least recently used replacement policy, and IPC improvements of up to 17%. Our results show that smart cache management and scheduling is important for SMT systems to achieve high performance.
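One common way to drive such a partition is a greedy marginal-gain pass over per-thread hit curves gathered at runtime. The sketch below is that generic heuristic, not the paper's exact scheme (which also allows overlapping partitions); the curve values are invented for illustration.

```python
def partition_ways(extra_hits, total_ways):
    """Greedily hand out cache ways one at a time, each time to the thread
    whose measured gain from one more way is largest (generic heuristic).

    extra_hits[t][k] = additional hits thread t would get from its (k+1)-th way.
    """
    alloc = [0] * len(extra_hits)
    for _ in range(total_ways):
        gains = [curve[alloc[t]] if alloc[t] < len(curve) else 0
                 for t, curve in enumerate(extra_hits)]
        winner = max(range(len(gains)), key=gains.__getitem__)
        alloc[winner] += 1
    return alloc

# Thread 0 gains a lot from its first few ways; thread 1 gains more gradually.
curves = [[900, 500, 200, 50, 10, 5, 2, 1],
          [100, 80, 60, 40, 20, 10, 5, 1]]
print(partition_ways(curves, total_ways=8))  # [4, 4]
```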

73 citations

Patent
17 Mar 1997
TL;DR: In this article, the cache controller determines whether a requested data object is to be cached or is exempt from caching; if the object is exempt, it is loaded directly into a local memory and is not stored in the cache.
Abstract: A method for selectively caching data in a computer network. Initially, data objects that are anticipated to be accessed only once or seldom are designated as being exempt from being cached. When a read request is generated, the cache controller reads the requested data object from the cache memory if it currently resides in the cache memory. However, if the requested data object cannot be found in the cache memory, it is read from a mass storage device. Thereupon, the cache controller determines whether the requested data object is to be cached or is exempt from being cached. If the data object is exempt from being cached, it is loaded directly into a local memory and is not stored in the cache. This provides improved cache utilization because only objects that are used multiple times are entered in the cache. Furthermore, processing overhead is minimized by reducing unnecessary cache insertion and purging operations. In addition, I/O operations are minimized by increasing the likelihood that hot objects are retained in the cache longer at the expense of infrequently used objects.
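A small sketch of the read path the abstract describes, with invented names (SelectiveCache, exempt_ids) standing in for the patent's components: objects on the exempt list bypass the cache and go straight to local memory, while everything else is inserted into the cache on a miss.

```python
class SelectiveCache:
    """Read path that bypasses the cache for objects designated as exempt
    (expected to be accessed only once or rarely). Illustrative only."""

    def __init__(self, backing_store, exempt_ids):
        self.backing_store = backing_store   # stands in for the mass storage device
        self.exempt_ids = set(exempt_ids)
        self.cache = {}                      # cached objects
        self.local = {}                      # local memory for exempt objects

    def read(self, obj_id):
        if obj_id in self.cache:             # cache hit
            return self.cache[obj_id]
        data = self.backing_store[obj_id]    # miss: read from mass storage
        if obj_id in self.exempt_ids:
            self.local[obj_id] = data        # exempt: load directly into local memory
        else:
            self.cache[obj_id] = data        # cacheable: insert into the cache
        return data

store = {"hot": b"reused", "one_shot": b"scanned once"}
c = SelectiveCache(store, exempt_ids={"one_shot"})
c.read("hot"); c.read("one_shot")
print(sorted(c.cache), sorted(c.local))      # ['hot'] ['one_shot']
```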

73 citations


Network Information
Related Topics (5)
Cache: 59.1K papers, 976.6K citations, 92% related
Server: 79.5K papers, 1.4M citations, 88% related
Scalability: 50.9K papers, 931.6K citations, 88% related
Network packet: 159.7K papers, 2.2M citations, 85% related
Quality of service: 77.1K papers, 996.6K citations, 84% related
Performance
Metrics
No. of papers in the topic in previous years
Year    Papers
2023    50
2022    114
2021    5
2020    1
2019    8
2018    18