Topic

Smart Cache

About: Smart Cache is a research topic. Over the lifetime, 7680 publications have been published within this topic receiving 180618 citations.


Papers
Proceedings ArticleDOI
13 Jun 1983
TL;DR: In designing a VLSI instruction cache for a RISC microprocessor, the authors uncovered four ideas potentially applicable to other VLSI machines that provide expansible cache memory, increased cache speed, reduced program code size, and decreased manufacturing costs.
Abstract: A cache was first used in a commercial computer in 1968,1 and researchers have spent the last 15 years analyzing caches and suggesting improvements. In designing a VLSI instruction cache for a RISC microprocessor we have uncovered four ideas potentially applicable to other VLSI machines. These ideas provide expansible cache memory, increased cache speed, reduced program code size, and decreased manufacturing costs. These improvements blur the habitual distinction between an instruction cache and an instruction fetch unit. The next four sections present the four architectural ideas, followed by a section on performance evaluation of each idea. We then describe the implementation of the cache and finally summarize the results.

66 citations

Patent
09 Dec 1993
TL;DR: In this paper, a microprocessor is provided with an integral, two-level cache memory architecture: when the first-level cache misses, the discarded first-level entry is stored in a replacement cache, which is checked for the desired entry before main memory is accessed.
Abstract: A microprocessor is provided with an integral, two level cache memory architecture. The microprocessor includes a microprocessor core and a set associative first level cache both located on a common semiconductor die. A replacement cache, which is at least as large as approximately one half the size of the first level cache, is situated on the same semiconductor die and is coupled to the first level cache. In the event of a first level cache miss, a first level entry is discarded and stored in the replacement cache. When such a first level cache miss occurs, the replacement cache is checked to see if the desired entry is stored therein. If a replacement cache hit occurs, then the hit entry is forwarded to the first level cache and stored therein. If a cache miss occurs in both the first level cache and the replacement cache, then a main memory access is commenced to retrieve the desired entry. In that event, the desired entry retrieved from main memory is forwarded to the first level cache and stored therein. When a replacement cache entry is removed from the replacement cache by the replacement algorithm associated therewith, that entry is written back to main memory if that entry was modified. Otherwise the entry is discarded.
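The miss-handling flow the abstract describes can be sketched as a small victim-cache simulation. All names, sizes, and the LRU ordering below are illustrative assumptions for demonstration, not details taken from the patent:

```python
from collections import OrderedDict

class TwoLevelCache:
    """Toy model of an L1 cache backed by a same-die replacement (victim) cache."""

    def __init__(self, l1_size=4, victim_size=2):
        self.l1 = OrderedDict()       # address -> (value, dirty), LRU order
        self.victim = OrderedDict()   # entries discarded from L1
        self.l1_size, self.victim_size = l1_size, victim_size
        self.memory = {}              # stand-in for main memory
        self.writebacks = []          # modified entries written back to memory

    def _evict_to_victim(self):
        addr, (value, dirty) = self.l1.popitem(last=False)  # oldest L1 entry
        if len(self.victim) >= self.victim_size:
            v_addr, (v_val, v_dirty) = self.victim.popitem(last=False)
            if v_dirty:                        # write back only if modified
                self.memory[v_addr] = v_val
                self.writebacks.append(v_addr)
        self.victim[addr] = (value, dirty)

    def read(self, addr):
        if addr in self.l1:                    # L1 hit
            self.l1.move_to_end(addr)
            return self.l1[addr][0]
        if addr in self.victim:                # replacement-cache hit
            entry = self.victim.pop(addr)
        else:                                  # miss in both: fetch from memory
            entry = (self.memory.get(addr, 0), False)
        if len(self.l1) >= self.l1_size:
            self._evict_to_victim()            # discarded entry goes to victim
        self.l1[addr] = entry                  # forward the entry into L1
        return entry[0]
```

The key point of the scheme is visible in `read`: an entry discarded from L1 gets a second chance in the replacement cache, and only on eviction from there is a modified entry written back to main memory.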

65 citations

Patent
Le Trong Nguyen1
10 Oct 2000
TL;DR: In this paper, the authors propose an integrated digital signal processor architecture that combines a general-purpose processor and a vector processor with a cache subsystem, a first bus, and a second bus, where the cache subsystem provides caching and data routing for the processors and buses.
Abstract: To achieve high performance at low cost, an integrated digital signal processor uses an architecture which includes both a general purpose processor and a vector processor. The integrated digital signal processor also includes a cache subsystem, a first bus and a second bus. The cache subsystem provides caching and data routing for the processors and buses. Multiple simultaneous communication paths can be used in the cache subsystem for the processors and buses. Furthermore, simultaneous reads and writes are supported to a cache memory in the cache subsystem.

65 citations

Proceedings ArticleDOI
23 Jun 1997
TL;DR: It is proposed that Soft Caching, where an image can be cached at one of a set of levels of resolutions, can benefit the overall performance when combined with cache management strategies that estimate, for each object, both the bandwidth to the server where the object is stored and the appropriate resolution level demanded by the user.
Abstract: The vast majority of current Internet traffic is generated by web browsing applications. Proxy caching, which allows some of the most popular web objects to be cached at intermediate nodes within the network, has been shown to provide substantial performance improvements. In this paper we argue that image-specific caching strategies are desirable and will result in improved performance over approaches treating all objects alike. We propose that Soft Caching, where an image can be cached at one of a set of levels of resolutions, can benefit the overall performance when combined with cache management strategies that estimate, for each object, both the bandwidth to the server where the object is stored and the appropriate resolution level demanded by the user. We formalize the cache management problem under these conditions and describe an experimental system to test these techniques.

65 citations

Proceedings ArticleDOI
22 Sep 1997
TL;DR: This paper uses log files from four Web servers and proposes and evaluates static caching, a novel cache policy for Web servers that incur no CPU overhead and does not suffer from memory fragmentation.
Abstract: This paper studies caching in primary Web servers. We use log files from four Web servers to analyze the performance of various proposed cache policies for Web servers: LRU-threshold, LFU, LRU-SIZE, LRU-MIN, LRU-k-threshold and the Pitkow/Recker (1994) policy. Web document access patterns change very slowly. Based on this fact, we propose and evaluate static caching, a novel cache policy for Web servers. In static caching, the set of documents kept in the cache is determined periodically by analyzing the request log file for the previous period. The cache is filled with documents to maximize cache performance provided document access patterns do not change. The set of cached documents remains constant during the period. Surprisingly, this simple policy results in high cache performance, especially for small cache sizes. Unlike other policies, static caching incurs no CPU overhead and does not suffer from memory fragmentation.
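The periodic selection step of static caching can be sketched as a simple log analysis. The greedy requests-per-byte ordering below is an illustrative knapsack heuristic under the assumption that access patterns hold steady; it is not necessarily the exact selection rule the paper uses:

```python
from collections import Counter

def select_static_cache(log, doc_sizes, cache_bytes):
    """Choose the fixed document set to cache for the next period.

    log         -- iterable of document names requested in the last period
    doc_sizes   -- mapping: document name -> size in bytes
    cache_bytes -- total cache capacity in bytes
    """
    freq = Counter(log)
    # Most valuable documents first: requests served per byte of cache used.
    ranked = sorted(freq, key=lambda d: freq[d] / doc_sizes[d], reverse=True)
    chosen, used = set(), 0
    for doc in ranked:
        if used + doc_sizes[doc] <= cache_bytes:
            chosen.add(doc)
            used += doc_sizes[doc]
    return chosen
```

Because this runs once per period (offline, on the previous log), the server does no eviction bookkeeping on the request path, which is why the policy incurs no per-request CPU overhead and avoids memory fragmentation.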

65 citations


Network Information
Related Topics (5)
- Cache: 59.1K papers, 976.6K citations, 92% related
- Server: 79.5K papers, 1.4M citations, 88% related
- Scalability: 50.9K papers, 931.6K citations, 88% related
- Network packet: 159.7K papers, 2.2M citations, 85% related
- Quality of service: 77.1K papers, 996.6K citations, 84% related
Performance Metrics
No. of papers in the topic in previous years

Year  Papers
2023  50
2022  114
2021  5
2020  1
2019  8
2018  18