
Showing papers on "Cache coloring published in 1974"


Patent
17 Jan 1974
TL;DR: In this article, the cache store is operated in parallel to the request for data information from the main memory store, and a successful retrieval from the cache store aborts the retrieval from the main memory.
Abstract: A cache store located in the processor provides a fast access look-aside store to blocks of data information previously fetched from the main memory store. The request to the cache store is operated in parallel to the request for data information from the main memory store. A successful retrieval from the cache store aborts the retrieval from the main memory. Block loading of the cache store is performed autonomously from the processor operations. The cache store is cleared on cycles, such as interrupts, which require the processor to shift program execution. In the store-aside configuration, the processor overlaps the backing-store cycle on a store operand cycle with the cache store checking operations, so that the two cycles are performed simultaneously.
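The look-aside behavior described above can be sketched as follows. This is a minimal illustrative model, not the patented circuit: the class and method names are hypothetical, and the parallel cache/memory requests are modeled sequentially, with a hit simply skipping (i.e. "aborting") the slower main-memory access.

```python
class LookAsideCache:
    """Fast look-aside store holding blocks previously fetched from main memory."""

    def __init__(self):
        self.blocks = {}  # block address -> data block

    def load_block(self, addr, block):
        # Block loading is autonomous from processor operations.
        self.blocks[addr] = block

    def clear(self):
        # Cleared on cycles such as interrupts that shift program execution.
        self.blocks.clear()


class Processor:
    def __init__(self, main_memory):
        self.cache = LookAsideCache()
        self.memory = main_memory  # dict standing in for the main memory store

    def fetch(self, addr):
        # Conceptually both requests start together; a successful cache
        # retrieval aborts the main-memory retrieval (modeled here by
        # never completing the slower access).
        if addr in self.cache.blocks:
            return self.cache.blocks[addr], "memory access aborted"
        data = self.memory[addr]
        self.cache.load_block(addr, data)  # fill the cache on a miss
        return data, "fetched from main memory"
```

A first fetch of an address completes from main memory and fills the cache; a repeated fetch hits the cache and the main-memory access is abandoned.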

77 citations


Journal ArticleDOI
TL;DR: An investigation of the various cache schemes that are practical for a minicomputer has been found to provide considerable insight into cache organization.
Abstract: An investigation of the various cache schemes that are practical for a minicomputer has been found to provide considerable insight into cache organization. Simulations are used to obtain data on the performance and sensitivity of organizational parameters of various writeback and lookahead schemes. Hardware considerations in the construction of an actual cache for a minicomputer are also noted, and a simple cost/performance analysis is presented.
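A trace-driven simulation of the kind described can be sketched in a few lines. This is a toy LRU model with an assumed 4-word line size, intended only to show how hit ratio is measured against an organizational parameter (cache size); the actual study's writeback and lookahead machinery is not modeled.

```python
def simulate(trace, cache_lines, words_per_line=4):
    """Return the hit ratio of an LRU cache on an address trace.

    trace: iterable of word addresses; cache_lines: capacity in lines.
    """
    cache = []  # lines ordered LRU (front) to MRU (back)
    hits = 0
    for addr in trace:
        line = addr // words_per_line
        if line in cache:
            hits += 1
            cache.remove(line)       # hit: move line to MRU position
        elif len(cache) >= cache_lines:
            cache.pop(0)             # miss with full cache: evict LRU line
        cache.append(line)
    return hits / len(trace)
```

Sweeping `cache_lines` over a real program trace yields the performance-versus-size curves that feed a cost/performance comparison.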

63 citations


Patent
10 Apr 1974
TL;DR: In this paper, the cache store is selectively cleared of the information from the page whose data information is no longer needed by addressing each level of an associative tag directory to the cache.
Abstract: In a data processing system that uses segmentation and paging to access data information, such as a virtual memory machine, the cache store need not be entirely cleared each time an I/O operation is performed or each time the data in the cache has a possibility of being incorrect. With segmentation and paging, only a portion of the cache store need be cleared when a new page is obtained from the virtual memory. The entire cache store is cleared only when a new segment is indicated by the instruction. The cache store is selectively cleared of the information from the page whose data information is no longer needed by addressing each level of an associative tag directory to the cache store. The columns of each level are compared to the page address, and if a comparison is signaled, that column of the addressed level is cleared by resetting the flag indicating the full status of the column. Each level of the tag directory is addressed in turn.
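The selective-clear mechanism can be sketched as below, assuming a tag directory organized as levels (ways) of columns; all names are illustrative, not taken from the patent. Each level is addressed in turn, every column's tag is compared against the stale page's address, and matching columns are emptied by resetting their full-status flag rather than flushing the whole cache.

```python
class TagDirectory:
    """Associative tag directory: `levels` ways, each with `columns` entries."""

    def __init__(self, levels, columns):
        # Each entry is [page_tag, full_flag].
        self.dir = [[[None, False] for _ in range(columns)]
                    for _ in range(levels)]

    def fill(self, level, column, page):
        self.dir[level][column] = [page, True]

    def clear_page(self, page):
        """Clear only the entries belonging to the stale page."""
        for level in self.dir:               # address each level in turn
            for entry in level:              # compare every column's tag
                if entry[1] and entry[0] == page:
                    entry[1] = False         # reset the full-status flag

    def valid_count(self):
        return sum(entry[1] for level in self.dir for entry in level)
```

Entries for other pages keep their full flags set, so only the superseded page's data is invalidated.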

51 citations


Journal ArticleDOI
TL;DR: The method is independent of cache memory techniques, although aimed at the same problem, and could be combined with use of a cache memory to obtain still more speedup of processor execution.
Abstract: This paper discusses potential techniques for the dynamic generation of instructions in the instruction fetch unit of a processor. The chief advantage is the increased effective bandwidth in transfer of compressed program information from memory to the processor unit, which allows higher processor speed for a given memory access rate. The method is independent of cache memory techniques, although aimed at the same problem, and could be combined with use of a cache memory to obtain still more speedup of processor execution. The paper is largely at a conceptual level; work is planned to obtain data to facilitate design and simulation of a prototype machine.

2 citations