
KR20100005539A - Cache memory system and prefetching method thereof - Google Patents

Cache memory system and prefetching method thereof

Info

Publication number
KR20100005539A
KR20100005539A
Authority
KR
South Korea
Prior art keywords
cache
data
address
main memory
requested
Prior art date
Application number
KR1020080065625A
Other languages
Korean (ko)
Inventor
이용석 (Lee Yong-seok)
Original Assignee
삼성전자주식회사 (Samsung Electronics Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 삼성전자주식회사 (Samsung Electronics Co., Ltd.)
Priority to KR1020080065625A priority Critical patent/KR20100005539A/en
Publication of KR20100005539A publication Critical patent/KR20100005539A/en

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F12/00Accessing, addressing or allocating within memory systems or architectures
    • G06F12/02Addressing or allocation; Relocation
    • G06F12/08Addressing or allocation; Relocation in hierarchically structured memory systems, e.g. virtual memory systems
    • G06F12/0802Addressing of a memory level in which the access to the desired data or data block requires associative addressing means, e.g. caches
    • G06F12/0804Addressing of a memory level in which the access to the desired data or data block requires associative addressing means, e.g. caches with main memory updating
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F12/00Accessing, addressing or allocating within memory systems or architectures
    • G06F12/02Addressing or allocation; Relocation
    • G06F12/08Addressing or allocation; Relocation in hierarchically structured memory systems, e.g. virtual memory systems
    • G06F12/0802Addressing of a memory level in which the access to the desired data or data block requires associative addressing means, e.g. caches
    • G06F12/0862Addressing of a memory level in which the access to the desired data or data block requires associative addressing means, e.g. caches with prefetch

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Memory System Of A Hierarchy Structure (AREA)

Abstract

Disclosed are a cache memory system with an improved hit rate and a method of prefetching a cache, which efficiently determine the address of data to be prefetched from main memory into the cache. The cache memory system according to the present invention includes a cache that stores data prefetched from the main memory; an address difference value calculator that calculates difference values between addresses of the main memory requested by a processing device; an address difference value storage unit that stores those difference values; and a cache controller that determines, based on a stored difference value, the address of data to be prefetched from the main memory and controls the data at the determined address to be prefetched into the cache.

Description

Cache memory system and prefetching method

TECHNICAL FIELD The present invention relates to cache memory, and more particularly to a cache memory system and a cache prefetching method that improve the hit rate of cache accesses by efficiently determining the address of data to be prefetched from main memory into a cache, using difference values between previously requested main-memory addresses.

A cache is a small, fast memory that temporarily stores part of the main memory, exploiting the locality observed in the main-memory addresses accessed by a processing device such as a CPU. Locality is broadly classified into spatial locality and temporal locality. Although the CPU accesses main-memory addresses non-uniformly, its accesses are often concentrated in the same address range for some period of time. This property, called temporal locality, reflects the fact that information to be used in the near future is mostly information currently in use; it arises mainly in program loops. In addition, the main-memory addresses accessed by the CPU are often concentrated near one another. This characteristic is called spatial locality.

Exploiting this spatial and temporal locality, the principle of a cache is to place a small amount of very fast storage between the CPU and the main memory and to keep frequently used regions of the main memory there. To increase the efficiency of the cache, it is necessary to maximize the probability that requested data is found in the cache (that is, to maximize the hit ratio), to minimize the access time of the cache itself, and to minimize the latency incurred on a miss. In particular, prefetching is commonly applied to caches to improve the hit rate. Prefetching means predicting future requests and loading into the cache, in advance, the regions of data in memory that are likely to be accessed. However, while prefetching is efficient when previously accessed addresses are accessed repeatedly, as in a program's do-loop, it can increase the miss rate of cache accesses, and thus become inefficient, when the spatial and temporal locality of the data is weak.
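To make the limitation above concrete, consider a simple next-line prefetcher, a common baseline that is not part of the claimed invention: it always predicts that the block immediately following the current access will be needed next. This succeeds for sequential accesses but mispredicts every access of a strided pattern. A minimal Python sketch, with all names and the block size chosen as illustrative assumptions:

```python
def next_line_prefetch(addr, block_size=64):
    """Baseline next-line prefetcher: always predict the following block."""
    return addr + block_size

# Sequential accesses: every prediction is correct.
seq = [0, 64, 128, 192]
hits = sum(next_line_prefetch(a) == b for a, b in zip(seq, seq[1:]))

# Strided accesses (e.g. stride 1920, one access per frame line):
# every prediction is wrong, which is the weak-locality case in the text.
strided = [0, 1920, 3840, 5760]
misses = sum(next_line_prefetch(a) != b for a, b in zip(strided, strided[1:]))
```

The strided pattern defeats the baseline, which motivates predicting from address difference values instead.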

The problem to be solved by the present invention is to provide a cache memory system and a prefetching method that improve the data-access hit ratio of the cache and reduce memory latency, by efficiently determining the address of data to be prefetched based on the history of addresses previously requested by the processing device, even for data with weak locality.

In order to solve the above problems, a cache memory system according to the present invention comprises: a cache for storing data prefetched from the main memory; an address difference value calculator for calculating difference values between addresses of the main memory requested by a processing device; an address difference value storage unit for storing the difference values; and a cache controller configured to determine, based on a difference value, an address of data to be prefetched from the main memory, and to control the data at the determined address to be prefetched into the cache.

The address difference value calculator preferably calculates the difference between a first address of the main memory requested by the processing device at a first time point and a second address of the main memory requested at a second time point preceding the first time point.

The cache controller may control prefetching so as to read the data stored at the address obtained by adding the difference value to the address of the main memory most recently requested by the processing device, and to store that data in the cache.

The cache controller may determine the address of data to be prefetched from the main memory using the most recently stored difference value among the plurality of difference values stored in the address difference value storage unit.

The cache controller may instead determine the address of data to be prefetched from the main memory using the difference value with the highest frequency of occurrence among the plurality of difference values stored in the address difference value storage unit.

The cache controller preferably controls the data at the determined address to be prefetched into the cache when there is no request from the processing device to read data from, or write data to, the main memory device.

Preferably, the cache controller stores, in a predetermined region of each subline stored in the cache, a predetermined identifier indicating whether the data of that subline was present in the cache (i.e., was hit) when requested by the processing device.

When there is no empty space in the cache, the cache controller preferably uses the identifiers to find a cache line in which the data of every subline has been hit, and controls the data of that cache line to be replaced with data prefetched from the main memory.

When there is no empty space in the cache, the cache controller preferably controls the data of the cache line storing the data of the address farthest from the address most recently requested by the processing device to be replaced with data prefetched from the main memory.

When data at a certain address is requested by the processing device during a prefetch operation, the cache controller determines whether the data at the requested address exists in the cache. If it does, the cache controller provides the data at that address to the processing device and then resumes the prefetch operation. If the data at the requested address does not exist in the cache but the address being prefetched is the same as the requested address, the cache controller preferably continues the prefetch operation and provides the prefetched data to the processing device.

Also, a method of prefetching data into a cache according to an embodiment of the present invention includes: calculating and storing difference values between addresses of the main memory previously requested by a processing device; determining, based on a difference value, an address of data to be prefetched from the main memory; and controlling the data at the determined address to be read from the main memory and stored in the cache.

According to the present invention, the hit ratio of the cache can be improved. In addition, the overall performance of a system using the cache can be improved by hiding memory latency.

Hereinafter, preferred embodiments of the present invention will be described in detail with reference to the accompanying drawings.

FIG. 1 is a block diagram illustrating the configuration of a cache memory system according to an exemplary embodiment of the present invention. The cache memory system 130 loads and stores data from the main memory 110 in response to access requests from a processing device 120, such as a CPU, transferred through the bus 125, and transmits data to the main memory 110. Although FIG. 1 illustrates, for convenience of description, the case in which the cache memory system 130 is located separately from the processing device 120, the cache memory system 130 may be integrated on the same chip as the processing device 120. Those skilled in the art to which the present invention pertains will also clearly understand that the main memory 110 is not limited to ordinary memory and may be, for example, an L2 cache.

Referring to FIG. 1, the cache memory system 130 according to the present invention includes a cache 131, a cache controller 132, an address storage unit 133, an address difference value calculator 134, and an address difference value storage unit 135.

When the processing device 120 requests access to a first address of the main memory 110 at a certain first time point, the address information of the requested main memory 110 is stored in the address storage unit 133. When the processing device 120 then requests access to a second address of the main memory 110 at a second time point after the first, the address difference value calculator 134 calculates the difference between the first address stored in the address storage unit 133 and the second address, that is, the difference between the addresses that the processing device 120 requested at the adjacent first and second time points.

FIG. 2 is a reference diagram sequentially showing the addresses of the main memory 110 requested by the processing device 120, and FIG. 3 is a reference diagram showing the address difference values calculated by the address difference value calculator 134. Referring to FIGS. 2 and 3, when the processing device 120 sequentially requests access to the address region from A_0 to A_n (n is a positive integer) of the main memory 110, the address difference value calculator 134 sequentially calculates the difference values d_i (i is an integer from 0 to n) between the requested addresses. The address difference values d_i calculated in this way are stored in the address difference value storage unit 135.
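The calculation performed by the address difference value calculator 134 can be sketched in Python as follows; the function name and the plain-list representation of the request history are illustrative assumptions, not part of the patent:

```python
def address_deltas(requested_addresses):
    """Compute the difference d_i between each pair of consecutively
    requested main-memory addresses, mirroring the role of the
    address difference value calculator 134."""
    return [later - earlier
            for earlier, later in zip(requested_addresses, requested_addresses[1:])]

# Example: six consecutive requests to the main memory 110.
deltas = address_deltas([100, 105, 135, 145, 175, 205])
# These deltas would be kept in the address difference value storage unit 135.
```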

The cache controller 132 uses the difference values stored in the address difference value storage unit 135 to predict the address of the region, among the data regions stored in the main memory 110, that is likely to be accessed in the future, and performs a prefetch operation that loads the data stored at the predicted address into the cache 131. In detail, the cache controller 132 determines the target address to be prefetched by adding an address difference value stored in the address difference value storage unit 135 to the address of the main memory 110 most recently requested by the processing device 120. Here, the cache controller 132 may use the most recently stored difference value among the plurality of difference values stored in the address difference value storage unit 135. For example, if the address of the main memory 110 most recently requested by the processing device 120 is A_recent and the difference value most recently stored in the address difference value storage unit 135 is d_recent, the cache controller 132 determines A_recent + d_recent as the target address of the main memory 110 to be prefetched. Alternatively, the cache controller 132 may determine the target address using the difference value that occurs most frequently among the stored difference values. For example, assume that five difference values d_1 to d_5 are stored in the address difference value storage unit 135, with d_1 = 5, d_2 = 30, d_3 = 10, d_4 = 30, and d_5 = 30. In this case, the cache controller 132 adds the most frequent difference value, 30, to the address A_recent of the main memory 110 most recently requested by the processing device 120, and determines the resulting value A_recent + 30 as the target address of the main memory 110 to be prefetched.
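The two target-selection policies just described, using the most recently stored difference value or the most frequent one, can be sketched as follows; the function name and policy labels are illustrative assumptions:

```python
from collections import Counter

def prefetch_target(a_recent, stored_deltas, policy="frequency"):
    """Return the prefetch target A_recent + d, where d is either the most
    recently stored difference value or the most frequently occurring one."""
    if policy == "recency":
        d = stored_deltas[-1]
    else:  # "frequency": pick the difference value that occurs most often
        d = Counter(stored_deltas).most_common(1)[0][0]
    return a_recent + d

# With the example d_1..d_5 = 5, 30, 10, 30, 30 the most frequent value
# is 30, so the target address is A_recent + 30.
target = prefetch_target(1000, [5, 30, 10, 30, 30])
```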

When the target address of the main memory 110 to be prefetched has been determined in this way, using difference values between previously requested addresses, the cache controller 132 loads the data of the target address region of the main memory 110 into the cache 131 in advance. In general, the data of each frame of video data is stored and processed in the main memory 110 in units of lines. Although the addresses of frame data stored line by line in the main memory 110 exhibit little spatial locality themselves, the difference values between the addresses of successive lines of the frame data can exhibit strong locality. Accordingly, in the present invention, even for data with weak locality such as video data, the difference values between addresses may have strong locality, so the target address can be determined using a difference value and a prefetch operation can be performed in advance to load the data of the main memory 110 stored at the target address into the cache 131.

Meanwhile, the cache controller 132 controls prefetching to be performed when there is no access request from the processing device 120 to the main memory 110. That is, the cache controller 132 performs prefetching, taking into account the current state of the cache 131, after the process of receiving and transmitting data in response to an access request from the processing device 120 to the main memory 110 has been completed. If data at a certain address is requested by the processing device 120 during the prefetch operation, the cache controller 132 determines whether the data at the requested address exists in the cache 131. If it does, the cache controller 132 provides the data to the processing device 120 and resumes the prefetch operation. If the data at the requested address is not in the cache 131 but the address being prefetched is the same as the requested address, the prefetch operation is continued and the prefetched data is provided to the processing device 120.

FIG. 4 is a diagram illustrating the structure of a cache according to the present invention.

Referring to FIG. 4, the address 40 of the main memory 110 provided by the processing device 120 consists of a tag (TAG), an index (INDEX), and a line offset. A comparator (not shown) compares the tag of the address 40 of the main memory 110 with the tag of the cache line 46 of the cache 45 indicated by the index, and determines a hit or a miss. A hit is the case where the tag of the address 40 of the main memory 110 and the tag of the cache line 46 indicated by the index match, that is, the data at the address of the main memory 110 requested by the processing device 120 is present in the cache line 46. A miss is the case where the tag of the address 40 of the main memory 110 and the tag of the cache line 46 indicated by the index do not match, that is, the data at the address of the main memory 110 requested by the processing device 120 is not present in the cache line 46.

The cache controller 132 according to the present invention separately stores a predetermined binary identifier (hit) in a predetermined region of each subline of the cache lines stored in the cache, indicating whether the data of that subline was present in the cache when requested by the processing device 120. This identifier is used to determine which cache line to replace with prefetched data when there is no empty space in the cache 45: a cache line in which the data of every subline has been hit is relatively unlikely to be accessed again, because the processing device 120 has presumably completed its processing of that data. Accordingly, the cache controller 132 uses the aforementioned identifier to find a cache line in which the data of all sublines has been hit, and performs an operation of replacing the data of that cache line with the prefetched data.

FIG. 5 is a reference diagram illustrating a specific configuration of a cache according to the present invention.

Referring to FIG. 5, whenever there is a request for access to the main memory 110 from the processing device 120, the cache controller 132 determines whether the data at the requested address of the main memory 110 exists in the cache 131, sets the hit identifier of subline data that is hit to '1', and sets the hit identifier of subline data that is missed to '0'. Thereafter, when there is no access request from the processing device 120 to the main memory 110, the cache controller 132 controls the prefetch operation to be performed as described above; if there is no empty space in the cache 50, it reads the hit identifiers of the sublines of each cache line, finds a cache line whose hit identifiers are all '1', and replaces the data stored in that cache line with the prefetched data. For example, in the case of FIG. 5, the hit identifiers of the sublines of the second cache line 52 are all '1', so when there is no empty space in the cache 50, the newly prefetched data is stored in the second cache line 52. If there are several cache lines whose sublines' hit identifiers are all '1', or none at all, the cache controller 132 controls the data of the cache line holding the address farthest from the most recently hit cache line to be replaced with the prefetched data.
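The victim-selection step described for FIG. 5 can be sketched as follows. As an assumption for illustration, each cache line is represented only by the list of its per-subline hit identifiers:

```python
def choose_victim(hit_bits_per_line):
    """Return the index of a cache line whose sublines' hit identifiers are
    all '1' (its data has been fully consumed), or None if no line qualifies;
    the fallback policy from the text would then apply."""
    for index, hit_bits in enumerate(hit_bits_per_line):
        if all(bit == 1 for bit in hit_bits):
            return index
    return None

# Mirroring FIG. 5: the second cache line (index 1) has every subline hit,
# so it is chosen to receive the newly prefetched data.
victim = choose_victim([[1, 0, 1, 1], [1, 1, 1, 1], [0, 0, 1, 0]])
```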

FIG. 6 is a flowchart illustrating a method of prefetching a cache according to the present invention.

Referring to FIG. 6, in step 610 it is determined whether there is an access request from the processing device to the main memory. As described above, the present invention starts prefetching when there is no access request from the processing device. If there is an access request, it is determined in step 615 whether the data at the requested address of the main memory exists in the cache; if the data is hit, it is read from the cache and provided to the processing device, and if it is missed, it is read from the main memory and provided to the processing device.

In step 620, difference values between the addresses of the main memory previously requested by the processing device are calculated and stored. That is, the difference between a first address of the main memory requested by the processing device at a certain first time point and a second address of the main memory requested at a second time point after the first is calculated and stored.

In step 625, the address of the region with a high probability of being accessed in the future, among the data regions stored in the main memory, is determined using an address difference value. As described above, the target address of the main memory to be prefetched is determined by adding the address difference value to the address of the main memory most recently requested by the processing device. The difference value used here may be the most recently stored one, or the one with the highest frequency of occurrence.

In step 630, when the target address has been determined, a prefetch operation is performed that loads the data in the target address region of the main memory and stores it in the cache in advance.

As described above, the prefetch operation of the cache according to the present invention is preferably performed when there is no request for access to the main memory from the processing device. If an access request from the processing device to the main memory arrives after the prefetch operation has started, however, it must be handled as described below.

FIG. 7 is a flowchart illustrating a process performed when there is an access request during the prefetch operation according to the present invention.

Referring to FIG. 7, in step 710 it is determined whether there is a request for access to the main memory from the processing device during the prefetch operation. If there is, it is determined in step 720 whether the data at the requested address of the main memory exists in the cache. If it does, in step 730 the data at the requested address is provided to the processing device, and the prefetch operation is then resumed.

If, as a result of the determination in step 720, the data at the requested address does not exist in the cache, then in step 740 it is determined whether the address (a_prefetch) of the main memory being prefetched by the ongoing prefetch operation matches the address (a_req) of the main memory requested by the processing device. If a_prefetch and a_req match, the ongoing prefetch operation is continued in step 750, and the prefetched data is provided to the processing device. If a_prefetch and a_req do not match, the ongoing prefetch operation is stopped in step 760, and a cache service is provided that reads the data stored at the requested address a_req of the main memory and provides it to the processing device.
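The decision procedure of FIG. 7 can be sketched as follows; the dictionary-based cache model and the returned action labels are illustrative assumptions, not terms from the patent:

```python
def handle_request_during_prefetch(a_req, cache_contents, a_prefetch):
    """Decide how to service a request arriving mid-prefetch.
    cache_contents maps main-memory addresses to cached data."""
    if a_req in cache_contents:            # steps 720/730: hit in the cache
        return "serve_from_cache"
    if a_req == a_prefetch:                # steps 740/750: prefetch covers it
        return "finish_prefetch_and_forward"
    return "abort_prefetch_and_fetch"      # step 760: ordinary cache service

# The in-flight prefetch of 0x200 happens to match the new request,
# so the prefetch is allowed to finish and its data is forwarded.
action = handle_request_during_prefetch(0x200, {0x100: b"data"}, 0x200)
```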

Although the present invention has been described above with reference to limited embodiments and drawings, the present invention is not limited to the above embodiments, and various modifications and variations are possible by those of ordinary skill in the art to which the present invention pertains. Accordingly, the scope of the invention should be understood only by the claims set forth below, and all equivalent modifications will fall within the scope of the invention. In addition, the system according to the present invention can be embodied as computer-readable code on a computer-readable recording medium. The computer-readable recording medium includes all kinds of recording devices in which data readable by a computer system is stored; examples include ROM, RAM, CD-ROM, magnetic tape, floppy disks, and optical data storage devices, and also carrier waves (for example, transmission over the Internet). The computer-readable recording medium can also be distributed over network-coupled computer systems so that the computer-readable code is stored and executed in a distributed fashion.

FIG. 1 is a block diagram illustrating the configuration of a cache memory system according to an exemplary embodiment of the present invention.

FIG. 2 is a reference diagram sequentially illustrating the addresses of the main memory 110 requested by the processing device 120.

FIG. 3 is a reference diagram illustrating the address difference values calculated by the address difference value calculator 134.

FIG. 4 is a diagram illustrating the structure of a cache according to the present invention.

FIG. 5 is a reference diagram illustrating a specific configuration of a cache according to the present invention.

FIG. 6 is a flowchart illustrating a method of prefetching a cache according to the present invention.

FIG. 7 is a flowchart illustrating a process performed when there is an access request during the prefetch operation according to the present invention.

Claims (20)

In a cache memory system for temporarily storing data between the processing unit and the main memory, A cache for storing data prefetched from the main memory; An address difference value calculator for calculating a difference value between addresses of the main memory requested from the processing apparatus; An address difference value storage unit for storing a difference value between the addresses; And And a cache controller configured to determine an address of data to be prefetched from the main memory based on the difference value, and to control the data of the determined address to be prefetched into the cache. The method of claim 1, The address difference value calculator is configured to provide a difference between a first address of the main memory requested at the first time point from the processing apparatus and a second address of the main memory requested at the second time after the first time point. A cache memory system, characterized by calculating a value. The method of claim 1, The cache controller may perform a prefetch operation to read data stored at an address obtained by adding the difference value to an address of the main memory most recently requested from the processing apparatus, and to store the data in the cache. Memory system. The method of claim 1, And the cache controller determines an address of data to be prefetched from the main memory by using a difference value most recently stored among a plurality of difference values stored in the address difference value storage unit. The method of claim 1, And the cache controller determines an address of data to be prefetched from the main memory using a difference value having the highest frequency among a plurality of difference values stored in the address difference value storage unit. 
The method of claim 1, The cache controller controls the data of the determined address to be prefetched into the cache when there is no request to read data of the main memory device from the processing device or to write data to the main memory device. Cache memory system. The method of claim 1, And the cache controller controls to store, in a predetermined region of each subline stored in the cache, a predetermined identifier indicating whether data of the subline was present in the cache when requested by the processing apparatus. Cache memory system. The method of claim 7, wherein When there is no empty space in the cache, the cache controller identifies a cache line to which data of all sublines is hit by using the identifier, and replaces the data of the identified cache line with data prefetched from the main memory. Cache memory system, characterized in that the control to save. The method of claim 1, If there is no empty space in the cache, the cache controller controls to replace the data of the cache line storing the data of the address farthest from the address most recently requested from the processing device with data prefetched from the main memory. Cache memory system, characterized in that. The method of claim 1, The cache controller determines whether data of the requested address exists in the cache when data of a predetermined address is requested from the processing device during the prefetch operation, and the data of the requested address exists in the cache. When the data of the corresponding address is provided to the processing apparatus, the prefetch operation is performed again, and although the data of the requested address does not exist in the cache, the data of the address to be prefetched is the same as the requested address. And continue to perform the prefetch operation to provide the prefetched data to the processing device. 
A method of prefetching data into a cache that temporarily stores data between a processing device and a main memory, the method comprising: Calculating and storing difference values between addresses of the main memory previously requested from the processing apparatus; Determining an address of data to be prefetched from the main memory based on the difference value; And And controlling the data of the determined address to be read from the main memory and to be stored in the cache. The method of claim 11, Computing and storing the difference value between the addresses Calculates a difference value between a first address of the main memory requested at the first predetermined time point from the processing apparatus and a second address of the main memory requested from the processing apparatus at a second time after the first time point Prefetching method of the cache, characterized in that for storing. The method of claim 11, Controlling to be stored in the cache And preserving data in the cache by reading data existing at an address obtained by adding the difference value to the address of the main memory most recently requested from the processing apparatus. The method of claim 11, Controlling to be stored in the cache And determining an address of data to be prefetched from the main memory using the most recently stored difference value among the difference values. The method of claim 11, Controlling to be stored in the cache And determining an address of data to be prefetched from the main memory by using a difference value having the highest frequency among the difference values. The method of claim 11, Controlling to be stored in the cache A cache for controlling the data of the determined address to be prefetched into the cache when there is no request to read data of the main memory device from the processing device or to write data to the main memory device Prefetch method. 
17. The method of claim 11, further comprising storing, in a predetermined region of each subline stored in the cache, a predetermined identifier indicating whether the data of the subline was present in the cache when requested by the processing device.

18. The method of claim 17, further comprising, when there is no empty space in the cache, identifying a cache line in which the data of every subline has been hit using the identifier, and controlling the data of the identified cache line to be replaced with data prefetched from the main memory.

19. The method of claim 11, further comprising, when there is no empty space in the cache, controlling the data of the cache line storing the data of the address farthest from the address most recently requested by the processing device to be replaced with data prefetched from the main memory.

20. The method of claim 11, further comprising: when data of a predetermined address is requested by the processing device during the prefetch operation, determining whether the data of the requested address exists in the cache; when the data of the requested address exists in the cache, providing the data to the processing device and then resuming the prefetch operation; and when the data of the requested address does not exist in the cache but the address being prefetched is the same as the requested address, continuing the prefetch operation and providing the prefetched data to the processing device.
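The replacement rule of claims 9 and 19 evicts the line whose address lies farthest from the most recent request. A minimal sketch under the assumption that "farthest" means the largest absolute address distance (the function name is illustrative):

```python
# Sketch of the farthest-address replacement rule (claims 9 and 19): when the
# cache is full, evict the line whose stored address is farthest from the most
# recently requested address, on the premise that nearby addresses are more
# likely to be requested again soon.

def pick_farthest_victim(line_addrs, last_requested):
    """Return the index of the line whose address is farthest (by absolute
    distance) from the most recently requested address."""
    distances = [abs(addr - last_requested) for addr in line_addrs]
    return distances.index(max(distances))

# Lines hold data for addresses 0x100, 0x500, 0x180; the last request was 0x110,
# so the line at 0x500 is the farthest and becomes the victim.
assert pick_farthest_victim([0x100, 0x500, 0x180], 0x110) == 1
```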
KR1020080065625A 2008-07-07 2008-07-07 Cache memory system and prefetching method thereof KR20100005539A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020080065625A KR20100005539A (en) 2008-07-07 2008-07-07 Cache memory system and prefetching method thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020080065625A KR20100005539A (en) 2008-07-07 2008-07-07 Cache memory system and prefetching method thereof

Publications (1)

Publication Number Publication Date
KR20100005539A true KR20100005539A (en) 2010-01-15

Family

ID=41814910

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020080065625A KR20100005539A (en) 2008-07-07 2008-07-07 Cache memory system and prefetching method thereof

Country Status (1)

Country Link
KR (1) KR20100005539A (en)


Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2012030466A3 (en) * 2010-08-30 2012-06-07 Intel Corporation Method and apparatus for fuzzy stride prefetch
US8433852B2 (en) 2010-08-30 2013-04-30 Intel Corporation Method and apparatus for fuzzy stride prefetch
WO2012061962A1 (en) * 2010-11-12 2012-05-18 蓝天电脑股份有限公司 Plate-free printing transfer film
WO2012091234A1 (en) * 2010-12-31 2012-07-05 세종대학교산학협력단 Memory system including a nonvolatile memory and a volatile memory, and processing method using the memory system
US9411719B2 (en) 2010-12-31 2016-08-09 Seong University Industry Academy Cooperation Foundation Memory system including nonvolatile and volatile memory and operating method thereof
US10140060B2 (en) 2010-12-31 2018-11-27 Sejong University Industry Academy Cooperation Foundation Memory system including a nonvolatile memory and a volatile memory, and processing method using the memory system
US10558395B2 (en) 2010-12-31 2020-02-11 Sejong University Industry Academy Cooperation Foundation Memory system including a nonvolatile memory and a volatile memory, and processing method using the memory system
US11188262B2 (en) 2010-12-31 2021-11-30 SK Hynix Inc. Memory system including a nonvolatile memory and a volatile memory, and processing method using the memory system
US9880940B2 (en) 2013-03-11 2018-01-30 Samsung Electronics Co., Ltd. System-on-chip and method of operating the same
WO2014178683A1 (en) * 2013-05-03 2014-11-06 삼성전자 주식회사 Cache control device for prefetching and prefetching method using cache control device
US9886384B2 (en) 2013-05-03 2018-02-06 Samsung Electronics Co., Ltd. Cache control device for prefetching using pattern analysis processor and prefetch instruction and prefetching method using cache control device
US9645934B2 (en) 2013-09-13 2017-05-09 Samsung Electronics Co., Ltd. System-on-chip and address translation method thereof using a translation lookaside buffer and a prefetch buffer

Similar Documents

Publication Publication Date Title
US10740261B2 (en) System and method for early data pipeline lookup in large cache design
KR102369500B1 (en) Adaptive prefetching in a data processing apparatus
KR102470184B1 (en) Cache aging policy selection for prefetch based on cache test region
US8433852B2 (en) Method and apparatus for fuzzy stride prefetch
CN109478165B (en) Method for selecting cache transfer strategy for prefetched data based on cache test area and processor
US11803484B2 (en) Dynamic application of software data caching hints based on cache test regions
US20080086599A1 (en) Method to retain critical data in a cache in order to increase application performance
US20180300258A1 (en) Access rank aware cache replacement policy
CN113407119B (en) Data prefetching method, data prefetching device and processor
US11188256B2 (en) Enhanced read-ahead capability for storage devices
US20090063777A1 (en) Cache system
US20150143045A1 (en) Cache control apparatus and method
KR20100005539A (en) Cache memory system and prefetching method thereof
US20120124291A1 (en) Secondary Cache Memory With A Counter For Determining Whether to Replace Cached Data
WO2023173991A1 (en) Cache line compression prediction and adaptive compression
US20120066456A1 (en) Direct memory access cache prefetching
US8356141B2 (en) Identifying replacement memory pages from three page record lists
US10997077B2 (en) Increasing the lookahead amount for prefetching
WO2023173995A1 (en) Cache line compression prediction and adaptive compression
KR102692838B1 (en) Enhanced read-ahead capability for storage devices
US11449428B2 (en) Enhanced read-ahead capability for storage devices
US10776043B2 (en) Storage circuitry request tracking
US11048637B2 (en) High-frequency and low-power L1 cache and associated access technique
CN117120989A (en) Method and apparatus for DRAM cache tag prefetcher
WO2021059198A1 (en) Circuitry and method

Legal Events

Date Code Title Description
WITN Withdrawal due to no request for examination