scispace - formally typeset

Cache invalidation

About: Cache invalidation is a research topic. Over its lifetime, 10,539 publications have been published within this topic, receiving 245,409 citations.


Papers
Patent
Rabindranath Dutta
26 Aug 1999
TL;DR: In this patent, only the first page of a large requested file is stored in the local cache; the caching agent begins transferring this partial file to the client while simultaneously retrieving the remaining portion across the Internet, and storing more than one page creates a safety margin.
Abstract: A system, method and program stores, in a local cache, only a small part of a large file that is being requested over a network such as the Internet. In a preferred embodiment, the caching agent starts transferring this partial file to the client while it is simultaneously retrieving the remaining portion of the file across the Internet. A preferred embodiment of the invention stores a first page of the browser display in the cache. Other embodiments store more than the first page, or a part of the full file or document, creating a safety margin by keeping more than one page. Another preferred embodiment initially stores the full file or document, and if cache replacement becomes necessary, the cached content is evicted until only the first page remains. As such, the cache space requirements are minimized for large documents retrieved over the World Wide Web.

98 citations
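The first-page caching scheme described above can be sketched in a few lines: keep only the leading chunk of each file locally, serve it immediately, and stream the remainder from the origin. This is an illustrative sketch of the idea, not the patented implementation; `PartialCache`, `CHUNK`, and the fetch callback are invented names.

```python
CHUNK = 4  # first-"page" size in bytes (tiny here, purely for illustration)

class PartialCache:
    def __init__(self, fetch_origin):
        self.fetch_origin = fetch_origin  # callable: key -> full file bytes
        self.store = {}                   # key -> first CHUNK bytes only

    def get(self, key):
        """Yield the first page at once, then stream the remainder from origin."""
        if key not in self.store:
            data = self.fetch_origin(key)
            self.store[key] = data[:CHUNK]   # cache only the first page
            yield data[:CHUNK]
            yield data[CHUNK:]
            return
        yield self.store[key]                    # served from the local cache
        yield self.fetch_origin(key)[CHUNK:]     # remainder fetched from origin

origin = {"doc": b"HEADER-body-of-a-large-document"}
cache = PartialCache(lambda k: origin[k])
first = b"".join(cache.get("doc"))    # fills the cache with the first page
second = b"".join(cache.get("doc"))   # first page from cache, rest from origin
```

In a real agent the remainder would be fetched concurrently while the first page is already being sent to the client; the generator here only models the ordering, not the parallelism.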

Patent
23 Dec 1994
TL;DR: In this patent, a column-associative cache is proposed that reduces conflict misses, increases the hit rate, and maintains a minimum hit access time; the cache lines represent a column of sets indexed through hash and rehash functions.
Abstract: A column-associative cache that reduces conflict misses, increases the hit rate and maintains a minimum hit access time. The column-associative cache indexes data from a main memory into a plurality of cache lines according to a tag and index field through hash and rehash functions. The cache lines represent a column of sets. Each cache line contains a rehash block indicating whether the set is a rehash location. To increase the performance of the column-associative cache, a content addressable memory (CAM) is used to predict future conflict misses.

98 citations
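The hash/rehash probing the abstract describes can be illustrated with a toy model: a direct-mapped array probed first with a hash index and then, on a miss, with a rehash index (here, flipping the high index bit), where each line carries a flag marking it as a rehash location. This is a simplified sketch under invented names, not the patented design (the CAM-based prediction is omitted).

```python
NSETS = 8  # number of sets; must be a power of two for the bit-flip rehash

class ColumnAssociativeCache:
    def __init__(self):
        self.lines = [None] * NSETS  # each entry: (tag, value, rehashed_flag)

    def _h1(self, addr):
        return addr % NSETS                   # primary hash index

    def _h2(self, addr):
        return self._h1(addr) ^ (NSETS >> 1)  # rehash: flip the high index bit

    def access(self, addr, fill):
        """Probe h1 then h2; on a miss, fill a line. Returns (value, hit)."""
        for idx in (self._h1(addr), self._h2(addr)):
            line = self.lines[idx]
            if line is not None and line[0] == addr:
                return line[1], True
        # miss: prefer the primary slot, fall back to the rehash slot
        i1, i2 = self._h1(addr), self._h2(addr)
        idx = i1 if self.lines[i1] is None else i2
        self.lines[idx] = (addr, fill(addr), idx == i2)
        return self.lines[idx][1], False
```

Two addresses that collide under `_h1` (e.g. 3 and 11 with 8 sets) can now coexist, one in its primary set and one in its rehash set, which is how the scheme reduces conflict misses relative to a direct-mapped cache.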

Patent
15 Apr 1996
TL;DR: In this patent, the authors propose a separate region conversion system that maintains a high cache hit rate while reducing processing time by simplifying the cache status, thereby improving the execution efficiency of the application program.
Abstract: The aim is to reduce processing time by simplifying the cache status and to improve the execution efficiency of the application program in a separate region conversion system capable of maintaining a high cache hit rate. When an access request is made for an object and the object is not stored in the object cache, the page containing the object is read from the database and stored in the page cache, and the object is then read from that page and stored in the object cache. The page cache status, describing the state of each page stored in the page cache, is held in a page status storage device, while the object cache status, describing the state of each object stored in the object cache, is held in an object status storage device. A relationship is maintained between the two, and whenever the page cache status and the corresponding object cache status become inconsistent, a status synchronizing device executes a synchronization process to make them consistent again.

98 citations
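The page-cache/object-cache relationship above can be sketched as two dictionaries with per-entry status, plus a synchronization pass that reconciles them when they diverge. All class and method names here are illustrative assumptions, not taken from the patent.

```python
class TwoLevelCache:
    def __init__(self, database):
        self.db = database            # page_id -> {obj_id: value} (backing store)
        self.page_cache = {}          # page_id -> dict of objects on that page
        self.page_status = {}         # page_id -> "clean" | "dirty"
        self.object_cache = {}        # obj_id -> value
        self.object_status = {}       # obj_id -> ("clean" | "dirty", page_id)

    def get(self, page_id, obj_id):
        """Read an object, faulting its page in from the database if needed."""
        if obj_id not in self.object_cache:
            if page_id not in self.page_cache:
                self.page_cache[page_id] = dict(self.db[page_id])
                self.page_status[page_id] = "clean"
            self.object_cache[obj_id] = self.page_cache[page_id][obj_id]
            self.object_status[obj_id] = ("clean", page_id)
        return self.object_cache[obj_id]

    def put(self, obj_id, value):
        """Update a cached object; its status now disagrees with the page's."""
        _, page_id = self.object_status[obj_id]
        self.object_cache[obj_id] = value
        self.object_status[obj_id] = ("dirty", page_id)

    def synchronize(self):
        """Propagate dirty objects into their pages so both statuses agree."""
        for obj_id, (status, page_id) in self.object_status.items():
            if status == "dirty":
                self.page_cache[page_id][obj_id] = self.object_cache[obj_id]
                self.page_status[page_id] = "dirty"
                self.object_status[obj_id] = ("clean", page_id)
```

The sketch shows only the status-consistency mechanism; writing dirty pages back to the database would be a further step.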

Patent
09 Oct 2000
TL;DR: In this patent, a branch cache module stores at least a portion of the data from one or more pipeline stages in response to a cacheable branch instruction that misses the cache, and a branch cache control unit restores saved data to the pipeline stages in response to a cache hit.
Abstract: A pipelined processor includes a branch acceleration technique which is based on an improved branch cache architecture. In one exemplary embodiment, the present invention has an instruction pipeline comprising a plurality of stages, each stage initially containing first data. A branch cache module stores at least a portion of the first data from one or more of the pipeline stages, and a branch cache control unit causes the branch cache module to store at least a portion of the first data from one or more of the pipeline stages in response to execution of a cacheable branch instruction which triggers a cache miss, and causes the branch cache module to restore second data to one or more of the pipeline stages in response to a cache hit. The present invention also discloses methods for controlling the branch cache and for reducing pipeline stalls caused by branching.

97 citations
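The store-on-miss / restore-on-hit behavior of the branch cache can be modeled minimally: snapshot the pipeline-stage contents the first time a cacheable branch misses, and restore that snapshot on a later hit so the pipeline need not refill. A toy sketch with invented names, not the patented hardware:

```python
class BranchCache:
    def __init__(self):
        self.entries = {}  # branch PC -> saved pipeline-stage contents

    def on_branch(self, pc, stages):
        """Return (stages_to_use, hit).

        On a miss the current stage contents are snapshotted (the "first
        data"); on a hit the saved contents are restored, avoiding the
        stall of refilling the pipeline after the branch.
        """
        if pc in self.entries:
            return list(self.entries[pc]), True   # restore saved data
        self.entries[pc] = list(stages)           # store data on the miss
        return stages, False
```

A real branch cache would also track validity and handle self-modifying code or context switches by invalidating entries; the sketch shows only the core store/restore cycle.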

Patent
27 Aug 2001
TL;DR: In this patent, a caching input/output hub combines a write cache for device-initiated writes, at least one separate read cache that provides a low-latency copy of the data most likely to be used, and a cache directory that tracks cache lines in both caches.
Abstract: A caching input/output hub includes a host interface to connect with a host. At least one input/output interface is provided to connect with an input/output device. A write cache manages memory writes initiated by the input/output device. At least one read cache, separate from the write cache, provides a low-latency copy of data that is most likely to be used. The at least one read cache is in communication with the write cache. A cache directory is also provided to track cache lines in the write cache and the at least one read cache. The cache directory is in communication with the write cache and the at least one read cache.

97 citations
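The hub's structure can be sketched as three maps: a write cache, a read cache, and a directory recording which cache currently owns each line, so that a device read always sees the freshest copy. This is an assumed, simplified model; the names and the dictionary representation are illustrative only.

```python
class CachingIOHub:
    def __init__(self, memory):
        self.memory = memory      # line address -> data (host-side memory)
        self.write_cache = {}     # lines written by an I/O device
        self.read_cache = {}      # low-latency copies of likely-used lines
        self.directory = {}       # line -> "write" | "read" (owning cache)

    def io_write(self, line, data):
        """Device-initiated write lands in the write cache; the directory
        entry now points there, superseding any stale read-cache copy."""
        self.write_cache[line] = data
        self.directory[line] = "write"

    def io_read(self, line):
        """Serve from whichever cache the directory names; fill on a miss."""
        owner = self.directory.get(line)
        if owner == "write":
            return self.write_cache[line]   # freshest copy
        if owner == "read":
            return self.read_cache[line]    # low-latency cached copy
        data = self.memory[line]            # miss: fill the read cache
        self.read_cache[line] = data
        self.directory[line] = "read"
        return data
```

The directory is what keeps the separate read and write caches coherent in this sketch: after a write, lookups are steered to the write cache even if an older copy still sits in the read cache.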


Network Information
Related Topics (5)
Cache: 59.1K papers, 976.6K citations (93% related)
Scalability: 50.9K papers, 931.6K citations (88% related)
Server: 79.5K papers, 1.4M citations (88% related)
Network packet: 159.7K papers, 2.2M citations (83% related)
Dynamic Source Routing: 32.2K papers, 695.7K citations (83% related)
Performance Metrics
No. of papers in the topic in previous years:

Year    Papers
2023    44
2022    117
2021    4
2020    8
2019    7
2018    20