Patent

Integrated cache memory system with primary and secondary cache memories

TLDR
In this article, data accessed from the appropriate one of the arrays (80)-(86) is transferred through the primary cache (26) via transfer circuits (96), (98), (100) and (102) to the data bus (32), and simultaneously stored in an appropriate one of the arrays (88)-(94).
Abstract
A central processing unit (10) has a cache memory system (24) associated therewith for interfacing with a main memory system (23). The cache memory system (24) includes a primary cache (26) comprised of SRAMs and a secondary cache (28) comprised of DRAM. The primary cache (26) has a faster access than the secondary cache (28). When it is determined that the requested data is stored in the primary cache (26), it is transferred immediately to the central processing unit (10). When it is determined that the data resides only in the secondary cache (28), the data is accessed therefrom and routed to the central processing unit (10) and simultaneously stored in the primary cache (26). If a hit occurs in the primary cache (26), it is accessed and output to a local data bus (32). If only the secondary cache (28) indicates a hit, data is accessed from the appropriate one of the arrays (80)-(86) and transferred through the primary cache (26) via transfer circuits (96), (98), (100) and (102) to the data bus (32). Simultaneously therewith, the data is stored in an appropriate one of the arrays (88)-(94). When a hit does not occur in either the secondary cache (28) or the primary cache (26), data is retrieved from the main system memory (23) through a buffer/multiplexer circuit on one side of the secondary cache (28) and passed through both the secondary cache (28) and the primary cache (26) and stored therein in a single operation due to the line-for-line transfer provided by the transfer circuits (96)-(102).
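To make the lookup policy concrete, the following is a minimal C sketch of the flow the abstract describes: probe the fast SRAM primary cache first, fall back to the slower DRAM secondary cache and fill the primary on a secondary hit, and on a miss in both fetch the line from main memory and install it in both caches in one pass. The direct-mapped organization, the sizes, and all identifiers are illustrative assumptions rather than details taken from the patent.

#include <stdbool.h>
#include <stdint.h>
#include <stdio.h>

#define PRIMARY_LINES   64      /* small, fast SRAM-like primary cache      */
#define SECONDARY_LINES 1024    /* larger, slower DRAM-like secondary cache */

typedef struct {
    bool     valid;
    uint32_t tag;    /* full address used as the tag, for simplicity */
    uint32_t data;
} line_t;

static line_t primary[PRIMARY_LINES];
static line_t secondary[SECONDARY_LINES];

/* Stand-in for a slow main-memory access. */
static uint32_t main_memory_read(uint32_t addr) {
    return addr ^ 0xDEADBEEF;
}

static void install(line_t *set, uint32_t index, uint32_t tag, uint32_t data) {
    set[index] = (line_t){ .valid = true, .tag = tag, .data = data };
}

uint32_t cache_read(uint32_t addr) {
    uint32_t p_idx = addr % PRIMARY_LINES;
    uint32_t s_idx = addr % SECONDARY_LINES;

    /* Primary hit: data goes straight back to the requester. */
    if (primary[p_idx].valid && primary[p_idx].tag == addr)
        return primary[p_idx].data;

    /* Secondary hit only: route the data out and store it in the
     * primary cache at the same time. */
    if (secondary[s_idx].valid && secondary[s_idx].tag == addr) {
        uint32_t data = secondary[s_idx].data;
        install(primary, p_idx, addr, data);
        return data;
    }

    /* Miss in both: fetch from main memory and fill both levels. */
    uint32_t data = main_memory_read(addr);
    install(secondary, s_idx, addr, data);
    install(primary, p_idx, addr, data);
    return data;
}

int main(void) {
    printf("%08x\n", (unsigned)cache_read(0x1234));  /* misses both, fills both */
    printf("%08x\n", (unsigned)cache_read(0x1234));  /* now hits in the primary */
    return 0;
}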


Citations
Patent

Method and apparatus for monitoring traffic in a network

TL;DR: In this article, a monitor for, and a method of, examining packets passing through a connection point on a computer network are presented. The method includes receiving a packet from a packet acquisition device and performing parsing/extraction operations on the packet to create a parser record comprising a function of selected portions of the packet.
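A rough C illustration of the parse/extract step that summary describes: selected portions of a received packet are copied out and combined into a small parser record. The IPv4-style field offsets, the record layout, and the mixing function are assumed placeholders, not the patent's actual format.

#include <stddef.h>
#include <stdint.h>
#include <stdio.h>
#include <string.h>

typedef struct {
    uint8_t  protocol;
    uint32_t src_addr;
    uint32_t dst_addr;
    uint32_t signature;   /* a function of the selected portions */
} parser_record_t;

parser_record_t parse_packet(const uint8_t *pkt, size_t len) {
    parser_record_t rec = {0};
    if (len < 20)                     /* too short for an IPv4-style header */
        return rec;

    rec.protocol = pkt[9];            /* assumed header layout */
    memcpy(&rec.src_addr, pkt + 12, 4);
    memcpy(&rec.dst_addr, pkt + 16, 4);

    /* Simple mixing function over the extracted fields. */
    rec.signature = rec.protocol ^ rec.src_addr ^ (rec.dst_addr << 1);
    return rec;
}

int main(void) {
    uint8_t pkt[20] = {0};
    pkt[9] = 6;                                      /* e.g. TCP */
    parser_record_t rec = parse_packet(pkt, sizeof pkt);
    printf("proto=%u signature=%08x\n", (unsigned)rec.protocol, (unsigned)rec.signature);
    return 0;
}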
Patent

Method and apparatus for improving the performance of a database management system through a central cache mechanism

TL;DR: In this article, the cache directory structure is used for defining the name of each configured central cache system and for providing an index value identifying the particular set of descriptors associated therewith.
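A small C sketch of the kind of directory that summary describes: each configured central cache's name maps to an index value identifying its associated descriptor set. The entry layout, the example names, and the linear search are assumptions for illustration only.

#include <stdio.h>
#include <string.h>

#define MAX_CACHES 8

typedef struct {
    char name[32];        /* name of a configured central cache        */
    int  descriptor_idx;  /* index identifying its set of descriptors  */
} directory_entry_t;

static const directory_entry_t directory[MAX_CACHES] = {
    { "ORDERS_CACHE",   0 },
    { "CUSTOMER_CACHE", 1 },
};

/* Returns the descriptor-set index for a configured cache, or -1 if unknown. */
int lookup_descriptor_index(const char *name) {
    for (int i = 0; i < MAX_CACHES; i++)
        if (directory[i].name[0] != '\0' && strcmp(directory[i].name, name) == 0)
            return directory[i].descriptor_idx;
    return -1;
}

int main(void) {
    printf("%d\n", lookup_descriptor_index("CUSTOMER_CACHE"));  /* 1  */
    printf("%d\n", lookup_descriptor_index("NOT_CONFIGURED"));  /* -1 */
    return 0;
}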
Patent

State processor for pattern matching in a network monitor device

TL;DR: In this paper, a processor for processing contents of packets passing through a connection point on a computer network is presented, which includes a searching apparatus having one or more comparators for searching for a reference string in the contents of a packet.
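As a software stand-in for the comparators mentioned in that summary, the following C sketch scans a packet's contents for a reference string. It illustrates only the searching step, not the patented state-processor hardware.

#include <stdbool.h>
#include <stddef.h>
#include <stdio.h>
#include <string.h>

bool contains_reference(const unsigned char *pkt, size_t pkt_len,
                        const unsigned char *ref, size_t ref_len) {
    if (ref_len == 0 || ref_len > pkt_len)
        return false;
    for (size_t i = 0; i + ref_len <= pkt_len; i++)
        if (memcmp(pkt + i, ref, ref_len) == 0)   /* one comparator check */
            return true;
    return false;
}

int main(void) {
    const unsigned char pkt[] = "GET /index.html HTTP/1.1";
    printf("%d\n", contains_reference(pkt, sizeof pkt - 1,
                                      (const unsigned char *)"HTTP", 4));  /* 1 */
    return 0;
}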
Patent

Dynamic arrays and overlays with bounds policies

TL;DR: In this paper, a method for accessing a memory array is presented, where the data is provided within a one-dimensional array of allocated memory, and data is accessed from within the block of statements as a dimensional indexed array using the array attribute storage object.
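The following C sketch illustrates the idea in that summary under stated assumptions: the data lives in one flat allocated block, and a small descriptor (standing in for the array attribute storage object) lets code index it as a two-dimensional array with a bounds policy enforced on every access.

#include <assert.h>
#include <stdio.h>
#include <stdlib.h>

typedef struct {
    double *base;   /* one-dimensional block of allocated memory */
    size_t  rows;
    size_t  cols;
} array2d_t;

array2d_t array2d_alloc(size_t rows, size_t cols) {
    return (array2d_t){ calloc(rows * cols, sizeof(double)), rows, cols };
}

/* Bounds policy: abort on an out-of-range index. */
double *array2d_at(array2d_t *a, size_t r, size_t c) {
    assert(r < a->rows && c < a->cols);
    return &a->base[r * a->cols + c];
}

int main(void) {
    array2d_t a = array2d_alloc(3, 4);
    if (a.base == NULL)
        return 1;
    *array2d_at(&a, 2, 3) = 42.0;
    printf("%g\n", *array2d_at(&a, 2, 3));
    free(a.base);
    return 0;
}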
Patent

Integrated processing and L2 DRAM cache

TL;DR: In this article, an integrated processor and level two (L2) dynamic random access memory (DRAM) are fabricated on a single chip, and the L2 DRAM cache is placed on the same chip as the processor to reduce the time needed for two chip-to-chip crossings.
References
Patent

Store queue for a tightly coupled multiple processor configuration with two-level cache buffer storage

TL;DR: In this paper, a hierarchical first-level and second-level memory system includes a first-level store queue (18B1) for storing instructions and/or data from a processor (20B) of the multiprocessor system prior to storage in the first-level cache (18A2), and a second-level store queue (26A2).
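A minimal C sketch of a store queue sitting between a processor and its first-level cache, in the spirit of that summary: stores are enqueued first and drained into the cache later. The FIFO depth and the drain policy are illustrative assumptions, not the organization claimed in the patent.

#include <stdbool.h>
#include <stdint.h>
#include <stdio.h>

#define SQ_DEPTH 8

typedef struct { uint32_t addr, data; } store_t;

typedef struct {
    store_t entries[SQ_DEPTH];
    int     head, tail, count;
} store_queue_t;

/* Enqueue a store from the processor; a full queue would stall the pipeline. */
bool sq_push(store_queue_t *q, uint32_t addr, uint32_t data) {
    if (q->count == SQ_DEPTH)
        return false;
    q->entries[q->tail] = (store_t){ addr, data };
    q->tail = (q->tail + 1) % SQ_DEPTH;
    q->count++;
    return true;
}

/* Drain the oldest store toward the first-level cache. */
bool sq_drain_one(store_queue_t *q, store_t *out) {
    if (q->count == 0)
        return false;
    *out = q->entries[q->head];
    q->head = (q->head + 1) % SQ_DEPTH;
    q->count--;
    return true;
}

int main(void) {
    store_queue_t q = {0};
    sq_push(&q, 0x100, 0xAB);
    sq_push(&q, 0x104, 0xCD);
    store_t s;
    while (sq_drain_one(&q, &s))
        printf("store %02x -> L1 cache @ %03x\n", (unsigned)s.data, (unsigned)s.addr);
    return 0;
}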
Patent

Partitioned cache memory with partition look-aside table (PLAT) for early partition assignment identification

TL;DR: Each entry of the Partition Look-Aside Table (PLAT), as discussed by the authors, holds a cache partition identifier, a control field, and a congruence-class address for locating associated data in the identified partition.
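A hedged C sketch of one PLAT entry as that summary describes it: a cache partition identifier, a control field, and a congruence-class address, consulted for an early partition assignment before the cache itself is probed. The field widths, the indexing scheme, and all names are assumptions for illustration.

#include <stdbool.h>
#include <stdint.h>
#include <stdio.h>

#define PLAT_ENTRIES 16

typedef struct {
    bool     valid;       /* control field, simplified to a valid bit      */
    uint8_t  partition;   /* cache partition identifier                    */
    uint32_t congruence;  /* congruence-class address within the partition */
} plat_entry_t;

static plat_entry_t plat[PLAT_ENTRIES];

/* Returns the partition to probe, or -1 if the PLAT gives no early answer. */
int plat_lookup(uint32_t addr) {
    plat_entry_t *e = &plat[addr % PLAT_ENTRIES];
    if (e->valid && e->congruence == addr / PLAT_ENTRIES)
        return e->partition;
    return -1;
}

int main(void) {
    plat[0x34 % PLAT_ENTRIES] = (plat_entry_t){ true, 3, 0x34 / PLAT_ENTRIES };
    printf("%d\n", plat_lookup(0x34));   /* 3: early partition assignment  */
    printf("%d\n", plat_lookup(0x99));   /* -1: fall back to a full search */
    return 0;
}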
Patent

Variable address mode cache

TL;DR: In this paper, a common directory and an L1 control array (L1CA) are provided for the CPU to access both the L1 and L2 caches; each directory address is either a real/absolute address or a virtual address, according to whichever address mode the CPU is in.
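The following C sketch illustrates the variable address mode idea under simplifying assumptions: a directory entry is tagged as holding either a real/absolute or a virtual address, and a lookup hits only when the entry's address kind matches the CPU's current address mode. The single-entry directory and the enum names are placeholders.

#include <stdbool.h>
#include <stdint.h>
#include <stdio.h>

typedef enum { MODE_REAL, MODE_VIRTUAL } addr_mode_t;

typedef struct {
    bool        valid;
    addr_mode_t mode;   /* which kind of address this entry was tagged with */
    uint32_t    tag;
} dir_entry_t;

/* An entry only hits if it was tagged in the mode the CPU is currently in. */
bool directory_hit(const dir_entry_t *e, addr_mode_t cpu_mode, uint32_t addr) {
    return e->valid && e->mode == cpu_mode && e->tag == addr;
}

int main(void) {
    dir_entry_t e = { true, MODE_VIRTUAL, 0x4000 };
    printf("%d\n", directory_hit(&e, MODE_VIRTUAL, 0x4000));  /* 1 */
    printf("%d\n", directory_hit(&e, MODE_REAL,    0x4000));  /* 0 */
    return 0;
}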
Patent

Multi-processor system with cache memories

TL;DR: In this paper, a system is described wherein a CPU, a main memory means, and a bus means are provided, together with means to couple the CPU to the bus means and means to indicate the status of a data unit stored within the cache memory means.
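A small C sketch of "means to indicate the status of a data unit" in a per-CPU cache: each line carries a status field, and a write observed from another processor invalidates the stale local copy. The MESI-like state names are a common convention used here for illustration, not the status scheme claimed in the patent.

#include <stdint.h>
#include <stdio.h>

typedef enum { LINE_INVALID, LINE_SHARED, LINE_MODIFIED } status_t;

typedef struct {
    uint32_t addr;
    uint32_t data;
    status_t status;   /* indicates the status of the cached data unit */
} cache_line_t;

/* Another CPU wrote this address on the shared bus: drop our stale copy. */
void snoop_remote_write(cache_line_t *line, uint32_t addr) {
    if (line->status != LINE_INVALID && line->addr == addr)
        line->status = LINE_INVALID;
}

int main(void) {
    cache_line_t line = { 0x200, 7, LINE_SHARED };
    snoop_remote_write(&line, 0x200);
    printf("%d\n", line.status);   /* 0: LINE_INVALID after the remote write */
    return 0;
}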
Patent

Apparatus for maintaining consistency of a cache memory with a primary memory

TL;DR: In this article, a microprocessor system is described with a high-speed system bus for coupling system elements, and having a dual-bus microprocessor with separate ultra-high-speed instruction and data cache-MMU interfaces coupled to an independently operable instruction cache-MMU and data cache-MMU, respectively.
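That summary is mostly about bus structure, but the entry's title concerns keeping a cache consistent with primary memory; one common way to obtain that property is a write-through policy, sketched below in generic C. This illustrates the general goal only and is not the patent's apparatus.

#include <stdint.h>
#include <stdio.h>

#define MEM_WORDS   256
#define CACHE_WORDS 16

static uint32_t primary_memory[MEM_WORDS];
static uint32_t cache_data[CACHE_WORDS];
static uint32_t cache_addr[CACHE_WORDS];

/* Write-through: every store updates both the cache and primary memory,
 * so the cache never holds data newer than primary memory. */
void write_through(uint32_t addr, uint32_t value) {
    uint32_t idx = addr % CACHE_WORDS;
    cache_addr[idx] = addr;
    cache_data[idx] = value;
    primary_memory[addr % MEM_WORDS] = value;
}

int main(void) {
    write_through(42, 0x55);
    printf("%u %u\n", (unsigned)cache_data[42 % CACHE_WORDS],
                      (unsigned)primary_memory[42]);   /* both print 85 */
    return 0;
}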