CN109684235A - Method, device and apparatus for computer system application caching

Method, device and apparatus for computer system application caching

Info

Publication number
CN109684235A
Authority
CN
China
Prior art keywords
application
cache
cache module
module
computer system
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
CN201811615556.6A
Other languages
Chinese (zh)
Inventor
张经宇
钟思琪
王进
李文军
王斐
何施茗
邝利丹
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Changsha University of Science and Technology
Original Assignee
Changsha University of Science and Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Changsha University of Science and Technology
Priority to CN201811615556.6A
Publication of CN109684235A
Legal status: Withdrawn

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F12/00Accessing, addressing or allocating within memory systems or architectures
    • G06F12/02Addressing or allocation; Relocation
    • G06F12/08Addressing or allocation; Relocation in hierarchically structured memory systems, e.g. virtual memory systems
    • G06F12/0802Addressing of a memory level in which the access to the desired data or data block requires associative addressing means, e.g. caches
    • G06F12/0866Addressing of a memory level in which the access to the desired data or data block requires associative addressing means, e.g. caches for peripheral storage systems, e.g. disk cache
    • G06F12/0871Allocation or management of cache space
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/30Monitoring
    • G06F11/34Recording or statistical evaluation of computer activity, e.g. of down time, of input/output operation ; Recording or statistical evaluation of user activity, e.g. usability assessment
    • G06F11/3452Performance evaluation by statistical analysis

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • Probability & Statistics with Applications (AREA)
  • Computer Hardware Design (AREA)
  • Quality & Reliability (AREA)
  • Memory System Of A Hierarchy Structure (AREA)

Abstract

The invention discloses a method, device, apparatus and computer-readable storage medium for computer system application caching, comprising: obtaining an application currently loaded by a computer system, performing cache detection on the application under a first cache module, and collecting first performance statistics of the application; performing cache detection on the application under a second cache module, and collecting second performance statistics of the application; and identifying the type of the application according to the first performance statistics and the second performance statistics, and selecting the cache module corresponding to the type of the application to optimize the running efficiency of the application. The method, device, apparatus and computer-readable storage medium provided by the present invention perceive the load type of the application currently loaded by the computer system and select the cache module corresponding to that load to improve its running performance.

Description

Method, device and apparatus for computer system application caching
Technical field
The present invention relates to the field of computer technology, and in particular to a method, device, apparatus and computer-readable storage medium for computer system application caching.
Background art
Cache technology is currently widely used in computer architecture and is an indispensable component of computer systems. To improve the running efficiency of computers, caching techniques for data and instruction accesses have emerged; they not only increase the access speed of data and instructions but also reduce reads and writes to memory and disk.
In the prior art, a computer system can use an SRAM (Static Random-Access Memory) cache to optimize the performance of compute-intensive applications, or use a state-of-the-art DRAM (Dynamic Random Access Memory) cache to optimize the performance of memory-intensive applications. However, existing cache systems can only optimize a single type of load and lack flexible optimization and adaptability for multiprogrammed mixed loads. Once an SRAM cache system receives a memory-intensive application load, it cannot obtain any performance improvement and may even degrade system performance; likewise, once a DRAM cache system receives a compute-intensive application load, it cannot obtain any performance improvement and may even degrade system performance.
When the load of a computer system is a multiprogrammed mixed load, it may contain both memory-intensive and compute-intensive applications at the same time, and a single unified policy cannot optimize the running of all programs. As a result, new technologies such as DRAM caches and SRAM caches cannot deliver their optimization value, and neither the degree of optimization nor program performance can be guaranteed.
In summary, how to optimize the running efficiency of computer system applications under multiprogrammed mixed loads is a problem that currently needs to be solved.
Summary of the invention
The object of the present invention is to provide a method, device, apparatus and computer-readable storage medium for computer system application caching, so as to solve the prior-art problem that a single cache system cannot optimize the running efficiency of computer system applications under multiprogrammed mixed loads.
To solve the above technical problem, the present invention provides a method of computer system application caching, comprising: obtaining an application currently loaded by the computer system, performing cache detection on the application under a first cache module, and collecting first performance statistics of the application; after the collection of the first performance statistics is completed, performing cache detection on the application under a second cache module, and collecting second performance statistics of the application; and identifying the type of the application according to the first performance statistics and the second performance statistics, and selecting the cache module corresponding to the type of the application to optimize the running efficiency of the application.
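As an illustration only, the two-phase detection and selection flow described above can be sketched as follows; the cache-module interface (attach, run, hit_rate) and the comparison rule are assumptions introduced here for clarity and are not defined by the patent.

```python
# Minimal sketch of the two-phase profiling and cache-module selection
# described above; all cache-module methods are hypothetical interfaces.

def profile(app, cache_module, duration_s=5.0):
    """Run the application under the given cache module for a sampling
    window and return its cache hit rate as the performance statistic."""
    cache_module.attach(app)           # route the application's accesses through this module
    cache_module.run(app, duration_s)  # execute for a fixed sampling window
    return cache_module.hit_rate()     # hits / total accesses in the window

def select_cache_module(app, dram_module, sram_module):
    first_stats = profile(app, dram_module)   # first performance statistics
    second_stats = profile(app, sram_module)  # second performance statistics
    # If the application benefits more from the large DRAM cache, treat it as
    # memory-intensive; otherwise treat it as compute-intensive.
    if first_stats > second_stats:
        return "memory-intensive", dram_module
    return "compute-intensive", sram_module
```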
Preferably, obtaining the application currently loaded by the computer system, performing cache detection on the application under the first cache module, and collecting the first performance statistics of the application comprises:
obtaining the application mcf currently loaded by the computer system, and performing cache detection on the application mcf under the initial cache module of the computer system, wherein the initial cache module of the computer system is a DRAM cache module; and
collecting the first performance statistics of the application mcf according to the hit rate of the DRAM cache module.
Preferably, performing cache detection on the application under the second cache module and collecting the second performance statistics of the application comprises:
performing cache detection on the application mcf under an SRAM cache module; and
collecting the second performance statistics of the application mcf according to the hit rate of the SRAM cache module.
Preferably, identifying the type of the application according to the first performance statistics and the second performance statistics, and selecting the cache module corresponding to the type of the application to optimize the running efficiency of the application comprises:
comparing and evaluating the first performance statistics and the second performance statistics, and determining, according to the comparison result and the evaluation result, that the type of the application mcf is a memory-intensive application; and
switching the current cache module from the SRAM cache module to the DRAM cache module, thereby optimizing the running efficiency of the application mcf.
Preferably, switching the current cache module from the SRAM cache module to the DRAM cache module comprises:
when replacing the current cache module, pausing the running of the application mcf, activating the DRAM cache module, migrating the data in the SRAM cache module to the DRAM cache module, and resuming the running of the application mcf.
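As a sketch of the switch procedure just described, the following outlines the pause-activate-migrate-resume sequence; the methods on the application and cache-module objects are assumed interfaces, not part of the patent.

```python
# Minimal sketch of the cache-module switch described above; every method
# used here is a hypothetical interface introduced for illustration.

def switch_cache_module(app, src_module, dst_module):
    app.pause()                          # stop the application while the caches are swapped
    dst_module.activate()                # bring the target cache module online
    dst_module.load(src_module.data())   # migrate the cached data to the target module
    src_module.deactivate()              # the source module is no longer used
    app.resume()                         # continue running under the new cache module
```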
The present invention also provides a device of computer system application caching, comprising:
a first detection module, configured to obtain the application currently loaded by the computer system, perform cache detection on the application under a first cache module, and collect the first performance statistics of the application;
a second detection module, configured to, after the collection of the first performance statistics is completed, perform cache detection on the application under a second cache module and collect the second performance statistics of the application; and
a selection module, configured to identify the type of the application according to the first performance statistics and the second performance statistics, and select the cache module corresponding to the type of the application to optimize the running efficiency of the application.
Preferably, the first detection module is specifically configured to:
obtain the application mcf currently loaded by the computer system, and perform cache detection on the application mcf under the initial cache module of the computer system, wherein the initial cache module of the computer system is a DRAM cache module; and
collect the first performance statistics of the application mcf according to the hit rate of the DRAM cache module.
Preferably, the second detection module is specifically configured to:
perform cache detection on the application mcf under an SRAM cache module; and
collect the second performance statistics of the application mcf according to the hit rate of the SRAM cache module.
The present invention also provides an apparatus of computer system application caching, comprising:
a hybrid cache, configured to provide high-speed cached data access to the CPU;
a memory, configured to store computer program instructions and data; and a CPU, configured to implement the steps of the above method of computer system application caching when executing the computer program instructions and data.
The present invention also provides a computer-readable storage medium on which computer program instructions and data are stored, the computer program instructions and data, when executed by a processor, implementing the steps of the above method of computer system application caching.
In the method of computer system application caching provided by the present invention, for a computer system bearing a multiprogrammed mixed load, each time a new application is loaded the cache system uses the first cache module and then the second cache module to perform cache detection on the currently loaded application. The first performance statistics collected while the application runs under the first cache module and the second performance statistics collected while it runs under the second cache module are compared and evaluated to determine the type of the application, and the cache module corresponding to that application type is selected to optimize the running efficiency of the application. By perceiving the load type of the application currently loaded by the computer system and selecting the corresponding cache module, the provided method improves the running performance of the currently loaded application. It solves the prior-art problem that a computer system bearing a multiprogrammed mixed load can only use a single kind of cache module, which cannot optimize the running efficiency of all applications in the system and may even reduce the running efficiency of some applications.
Correspondingly, the device, apparatus and computer-readable storage medium of computer system application caching provided by the present invention all have the above beneficial effects.
Description of the drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present invention or in the prior art, the drawings required in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description are only some embodiments of the present invention; for those of ordinary skill in the art, other drawings may be obtained from these drawings without creative effort.
Fig. 1 is a flowchart of a first specific embodiment of the method of computer system application caching provided by the present invention;
Fig. 2 is a flowchart of a second specific embodiment of the method of computer system application caching provided by the present invention;
Fig. 3 is a flowchart of a third specific embodiment of the method of computer system application caching provided by the present invention;
Fig. 4 is a structural block diagram of a device of computer system application caching provided by an embodiment of the present invention.
Detailed description of the embodiments
The core of the present invention is to provide a method, device, apparatus and computer-readable storage medium for computer system application caching, which perceive the load type of the application currently loaded by the computer system and select the cache module corresponding to the currently loaded application to improve its running performance.
In order to enable those skilled in the art to better understand the solution of the present invention, the present invention is described in further detail below with reference to the drawings and specific embodiments. Obviously, the described embodiments are only a part of the embodiments of the present invention rather than all of them. Based on the embodiments of the present invention, all other embodiments obtained by those of ordinary skill in the art without creative effort shall fall within the protection scope of the present invention.
Referring to Fig. 1, Fig. 1 is a flowchart of the first specific embodiment of the method of computer system application caching provided by the present invention; the specific steps are as follows:
Step S101: obtain the application currently loaded by the computer system, perform cache detection on the application under the first cache module, and collect the first performance statistics of the application.
When the initial cache of the computer system is a DRAM cache module, the DRAM cache module is used as the first cache module. When the initial cache of the computer system is an SRAM cache module, the SRAM cache module is used as the first cache module.
In this embodiment, the DRAM cache module is selected as the first cache module of the computer system. The application currently loaded by the computer system is obtained, cache detection is performed on the application under the DRAM cache module, and the first performance data is recorded.
Since the application goes through a slow ramp-up phase just after it is added to the system, during which the cache hit rate is extremely unstable and the performance data may lack guidance, the present invention sets a corresponding threshold to measure the variation of the hit rate and collects the performance statistics of the application only in the stage where the cache hit rate has stabilized. In this embodiment, the start-up duration of the application may be recorded; when the start-up duration of the application is greater than or equal to a preset duration threshold, the hit rate of the cache is considered to have stabilized. Whether the hit rate of the cache has stabilized may also be judged from the rate of change of the hit rate.
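A minimal sketch of this warm-up check is given below, assuming the hit rate can be sampled periodically; the duration threshold and change-rate threshold are illustrative values, not values taken from the patent.

```python
# Minimal sketch of the stability check described above: statistics collection
# begins only once the application has run past a preset duration or once
# consecutive hit-rate samples stop changing noticeably.

import time

def wait_until_stable(sample_hit_rate, min_runtime_s=2.0, max_delta=0.01,
                      interval_s=0.1):
    """Block until the cache hit rate is considered stable, then return it."""
    start = time.monotonic()
    previous = sample_hit_rate()
    while True:
        time.sleep(interval_s)
        current = sample_hit_rate()
        elapsed = time.monotonic() - start
        if elapsed >= min_runtime_s or abs(current - previous) < max_delta:
            return current            # stable: statistics collection can begin
        previous = current
```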
Step S102: after the collection of the first performance statistics is completed, perform cache detection on the application under the second cache module and collect the second performance statistics of the application.
After the test of the application under the DRAM (or SRAM) cache module is completed, the SRAM (or DRAM) cache module is used for the next test.
Step S103: identify the type of the application according to the first performance statistics and the second performance statistics, and select the cache module corresponding to the type of the application to optimize the running efficiency of the application.
In this embodiment, the type of the currently loaded application is perceived from the evaluation and comparison of the first performance statistics and the second performance statistics obtained by testing the application currently loaded by the computer system under the DRAM cache module and the SRAM cache module, respectively. When the evaluation of the cache statistics of the application under the DRAM cache module is better than the evaluation of the cache statistics under the SRAM cache module, the application is judged to be a memory-intensive application, and the DRAM cache module is selected to optimize the running efficiency of the application.
Based on the above embodiment, in this embodiment, after the type of the currently loaded application is determined, the cache module corresponding to the application is selected, and it is judged whether the current cache module of the computer system needs to be updated. Referring to Fig. 2, Fig. 2 is a flowchart of the second specific embodiment of the method of computer system application caching provided by the present invention; the specific steps are as follows:
Step S201: obtain the application currently loaded by the computer system, perform cache detection on the application under the initial cache module of the computer system, and collect the first performance statistics of the application, wherein the initial cache module of the computer system is a DRAM cache module.
Step S202: after the collection of the first performance statistics is completed, perform cache detection on the application under an SRAM cache module and collect the second performance statistics of the application.
Step S203: compare and evaluate the first performance statistics and the second performance statistics, and determine the type of the application according to the comparison result and the evaluation result.
Step S204: judge, according to the type of the application, whether the cache module currently used by the computer system needs to be replaced.
Step S205: if the application is a memory-intensive application, switch the current cache module from the SRAM cache module to the DRAM cache module, thereby optimizing the running efficiency of the application.
Step S206: if the application is a compute-intensive application, the current cache module does not need to be replaced; proceed to step S207.
A compute-intensive application consumes a large amount of computing resources during its running and accesses memory infrequently. A memory-intensive application needs to access memory frequently during its running to obtain the corresponding data, and its consumption of computing resources is small.
Step S207: monitor whether a new application is loaded into the computer system; if so, update the application currently loaded by the computer system and return to step S201.
In this embodiment, the DRAM cache module and the SRAM cache module exist simultaneously in the cache system; the performance statistics of the application collected under the two cache modules are analyzed and evaluated to determine the current application type, and the cache modules are scheduled with different strategies according to the different application types.
In this embodiment, after the type of each application has been perceived, the type of the loaded application and the statistics of the selected cache module can be recorded and stored in a database. When a newly added application is one that has already been analyzed, its previously recorded information is found by querying the database, so that the cache module can be selected quickly and the goal of performance optimization is achieved.
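A minimal sketch of the record-and-lookup step described above is shown below; the dictionary-based store and the profiling fallback are illustrative assumptions rather than the patent's concrete data structures.

```python
# Minimal sketch of the database lookup described above: applications that
# have been analyzed before are resolved from a record store instead of
# being re-profiled under both cache modules.

profile_db = {}   # application name -> (type, statistics)

def resolve_app_type(app_name, profile_fn):
    """Return the recorded type of a known application, or profile it once
    and remember the result for subsequent loads."""
    if app_name in profile_db:
        return profile_db[app_name][0]        # fast path: no re-profiling needed
    app_type, stats = profile_fn(app_name)    # two-phase detection as in Fig. 1 and Fig. 2
    profile_db[app_name] = (app_type, stats)
    return app_type
```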
Based on the above embodiments, in this embodiment the present invention is described in further detail by taking the running of the application mcf, selected from the SPEC CPU 2006 performance evaluation suite, as an example. Referring to Fig. 3, Fig. 3 is a flowchart of the third specific embodiment of the method of computer system application caching provided by the present invention; the specific steps are as follows:
Step S301: obtain the application mcf currently loaded by the computer system, and perform cache detection on the application mcf under the initial cache module of the computer system, wherein the initial cache module of the computer system is a DRAM cache module.
Step S302: collect the first performance statistics of the application mcf according to the hit rate of the DRAM cache module.
Step S303: after the collection of the first performance statistics is completed, perform cache detection on the application mcf under an SRAM cache module.
Step S304: collect the second performance statistics of the application mcf according to the hit rate of the SRAM cache module.
Step S305: compare and evaluate the first performance statistics and the second performance statistics, and determine, according to the comparison result and the evaluation result, that the type of the application mcf is a memory-intensive application.
Step S306: pause the running of the application mcf and activate the DRAM cache module.
In this embodiment, the application mcf is a memory-intensive application, so the DRAM cache module should be selected to optimize the running performance of the application mcf. Since the currently running cache is the SRAM cache module, the currently running cache needs to be switched to the DRAM cache module.
When the cache module is replaced, the running of the application mcf is paused and the data in the cache is migrated. Although the running of the application mcf is paused at this time, the data of the application mcf saved in the cache is not deleted; instead it is migrated according to different strategies, and the migrated data is transferred over a high-speed bus such as PCIe (PCI Express, Peripheral Component Interconnect Express).
PCIe is a high-speed serial computer expansion bus standard.
Step S307: migrate the data in the SRAM cache module to the DRAM cache module, close the SRAM cache module, and resume the running of the application mcf.
Step S308: monitor whether a new application is loaded into the computer system; if so, update the application currently loaded by the computer system and perceive the type of the updated application, so as to select the cache module corresponding to the type of the updated application and optimize the running efficiency of the updated application.
In this embodiment, different migration strategies are used for different modules. If the currently running module is the SRAM cache module, the DRAM cache module needs to be activated and the data migrated to it; since the capacity of the SRAM cache module is much smaller than that of the DRAM cache module, the data is copied directly into the DRAM cache module. Conversely, when the currently running module is the DRAM cache module and the SRAM cache module is activated, only the modified data is migrated into the SRAM cache module because of its smaller capacity; if the data overflows, the overflowed data is written back to main memory.
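A minimal sketch of the two direction-dependent migration strategies described above follows; the cache objects, their dirty-line tracking and the write-back path to main memory are assumed interfaces introduced only for illustration.

```python
# Minimal sketch of the direction-dependent migration described above:
# small-to-large (SRAM -> DRAM) copies everything, large-to-small
# (DRAM -> SRAM) copies only modified lines and spills the overflow to
# main memory.

def migrate(src_cache, dst_cache, main_memory):
    if src_cache.capacity <= dst_cache.capacity:
        lines = src_cache.all_lines()      # SRAM -> DRAM: a full copy fits
    else:
        lines = src_cache.dirty_lines()    # DRAM -> SRAM: modified lines only
    for line in lines:
        if dst_cache.has_free_slot():
            dst_cache.insert(line)
        else:
            main_memory.write_back(line)   # overflowed data goes to main memory
```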
Referring to Fig. 4, Fig. 4 is a structural block diagram of a device of computer system application caching provided by an embodiment of the present invention; the device may specifically include:
a first detection module 100, configured to obtain the application currently loaded by the computer system, perform cache detection on the application under a first cache module, and collect the first performance statistics of the application;
a second detection module 200, configured to, after the collection of the first performance statistics is completed, perform cache detection on the application under a second cache module and collect the second performance statistics of the application; and
a selection module 300, configured to identify the type of the application according to the first performance statistics and the second performance statistics, and select the cache module corresponding to the type of the application to optimize the running efficiency of the application.
The device of computer system application caching of this embodiment is used to implement the foregoing method of computer system application caching, so the specific implementation of the device may refer to the foregoing embodiments of the method. For example, the first detection module 100, the second detection module 200 and the selection module 300 are respectively used to implement steps S101, S102 and S103 of the above method of computer system application caching, so their specific implementation may refer to the description of the corresponding embodiments and is not repeated here.
A specific embodiment of the present invention further provides an apparatus of computer system application caching, comprising: a hybrid cache, configured to provide high-speed cached data access to the CPU; a memory, configured to store computer program instructions and data; and a CPU, configured to implement the steps of the above method of computer system application caching when executing the computer program instructions and data.
A specific embodiment of the present invention further provides a computer-readable storage medium on which computer program instructions and data are stored, the computer program instructions and data, when executed by a processor, implementing the steps of the above method of computer system application caching.
The embodiments in this specification are described in a progressive manner; each embodiment focuses on its differences from the other embodiments, and the same or similar parts of the embodiments may be referred to one another. Since the device disclosed in the embodiments corresponds to the method disclosed in the embodiments, its description is relatively simple, and the relevant parts may refer to the description of the method.
Those skilled in the art may further appreciate that the units and algorithm steps of the examples described in connection with the embodiments disclosed herein can be implemented in electronic hardware, computer software, or a combination of the two. To clearly illustrate the interchangeability of hardware and software, the composition and steps of each example have been described above generally in terms of function. Whether these functions are implemented in hardware or software depends on the specific application and design constraints of the technical solution. Skilled artisans may use different methods to implement the described functions for each specific application, but such implementations should not be considered to be beyond the scope of the present invention.
The steps of the method or algorithm described in connection with the embodiments disclosed herein may be implemented directly in hardware, in a software module executed by a processor, or in a combination of the two. The software module may reside in random access memory (RAM), memory, read-only memory (ROM), electrically programmable ROM, electrically erasable programmable ROM, registers, a hard disk, a removable disk, a CD-ROM, or any other form of storage medium well known in the art.
The method, device, apparatus and computer-readable storage medium of computer system application caching provided by the present invention have been described in detail above. Specific examples are used herein to illustrate the principles and embodiments of the present invention, and the above description of the embodiments is only intended to help understand the method of the present invention and its core idea. It should be noted that, for those of ordinary skill in the art, several improvements and modifications can be made to the present invention without departing from the principles of the present invention, and such improvements and modifications also fall within the protection scope of the claims of the present invention.

Claims (10)

1. A method of computer system application caching, characterized by comprising:
obtaining an application currently loaded by a computer system, performing cache detection on the application under a first cache module, and collecting first performance statistics of the application;
after the collection of the first performance statistics is completed, performing cache detection on the application under a second cache module, and collecting second performance statistics of the application; and
identifying the type of the application according to the first performance statistics and the second performance statistics, and selecting the cache module corresponding to the type of the application to optimize the running efficiency of the application.
2. The method according to claim 1, characterized in that obtaining the application currently loaded by the computer system, performing cache detection on the application under the first cache module, and collecting the first performance statistics of the application comprises:
obtaining the application mcf currently loaded by the computer system, and performing cache detection on the application mcf under the initial cache module of the computer system, wherein the initial cache module of the computer system is a DRAM cache module; and
collecting the first performance statistics of the application mcf according to the hit rate of the DRAM cache module.
3. The method according to claim 2, characterized in that performing cache detection on the application under the second cache module and collecting the second performance statistics of the application comprises:
performing cache detection on the application mcf under an SRAM cache module; and
collecting the second performance statistics of the application mcf according to the hit rate of the SRAM cache module.
4. The method according to claim 3, characterized in that identifying the type of the application according to the first performance statistics and the second performance statistics, and selecting the cache module corresponding to the type of the application to optimize the running efficiency of the application comprises:
comparing and evaluating the first performance statistics and the second performance statistics, and determining, according to the comparison result and the evaluation result, that the type of the application mcf is a memory-intensive application; and
switching the current cache module from the SRAM cache module to the DRAM cache module, thereby optimizing the running efficiency of the application mcf.
5. The method according to claim 4, characterized in that switching the current cache module from the SRAM cache module to the DRAM cache module comprises:
when replacing the current cache module, pausing the running of the application mcf, activating the DRAM cache module, migrating the data in the SRAM cache module to the DRAM cache module, and resuming the running of the application mcf.
6. A device of computer system application caching, characterized by comprising:
a first detection module, configured to obtain an application currently loaded by a computer system, perform cache detection on the application under a first cache module, and collect first performance statistics of the application;
a second detection module, configured to, after the collection of the first performance statistics is completed, perform cache detection on the application under a second cache module and collect second performance statistics of the application; and
a selection module, configured to identify the type of the application according to the first performance statistics and the second performance statistics, and select the cache module corresponding to the type of the application to optimize the running efficiency of the application.
7. The device according to claim 6, characterized in that the first detection module is specifically configured to:
obtain the application mcf currently loaded by the computer system, and perform cache detection on the application mcf under the initial cache module of the computer system, wherein the initial cache module of the computer system is a DRAM cache module; and
collect the first performance statistics of the application mcf according to the hit rate of the DRAM cache module.
8. The device according to claim 7, characterized in that the second detection module is specifically configured to:
perform cache detection on the application mcf under an SRAM cache module; and
collect the second performance statistics of the application mcf according to the hit rate of the SRAM cache module.
9. An apparatus of computer system application caching, characterized by comprising:
a hybrid cache, configured to provide high-speed cached data access to a CPU;
a memory, configured to store computer program instructions and data; and
a CPU, configured to implement the steps of the method of computer system application caching according to any one of claims 1 to 5 when executing the computer program instructions and data.
10. A computer-readable storage medium, characterized in that computer program instructions and data are stored on the computer-readable storage medium, and the computer program instructions and data, when executed by a processor, implement the steps of the method of computer system application caching according to any one of claims 1 to 5.
CN201811615556.6A 2018-12-27 2018-12-27 Method, device and apparatus for computer system application caching Withdrawn CN109684235A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811615556.6A CN109684235A (en) Method, device and apparatus for computer system application caching

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811615556.6A CN109684235A (en) Method, device and apparatus for computer system application caching

Publications (1)

Publication Number Publication Date
CN109684235A true CN109684235A (en) 2019-04-26

Family

ID=66190740

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811615556.6A Withdrawn CN109684235A (en) Method, device and apparatus for computer system application caching

Country Status (1)

Country Link
CN (1) CN109684235A (en)

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107851066A (en) * 2015-07-16 2018-03-27 高通股份有限公司 Hardware counter and the offline adaptable caching architecture for establishing profile to application during based on operation
CN109032964A (en) * 2018-07-02 2018-12-18 京东方科技集团股份有限公司 Buffer replacing method and its device, heterogeneous multi-core system

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
ZHANG, JINGYU: "HSCS: a hybrid shared cache scheduling scheme for multiprogrammed workloads" *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113973502A (en) * 2020-05-25 2022-01-25 华为技术有限公司 Cache collision processing method and device
CN113973502B (en) * 2020-05-25 2023-11-17 华为技术有限公司 Cache collision processing method and device

Similar Documents

Publication Publication Date Title
US11106579B2 (en) System and method to manage and share managed runtime memory for java virtual machine
CN105205014B (en) A kind of date storage method and device
US9804803B2 (en) Data access in hybrid main memory systems
US9182927B2 (en) Techniques for implementing hybrid flash/HDD-based virtual disk files
US10621093B2 (en) Heterogeneous computing system configured to adaptively control cache coherency
US10025504B2 (en) Information processing method, information processing apparatus and non-transitory computer readable medium
US9280300B2 (en) Techniques for dynamically relocating virtual disk file blocks between flash storage and HDD-based storage
US20130185229A1 (en) Apparatus and method for managing storage of data blocks
US20170277551A1 (en) Interception of a function call, selecting a function from available functions and rerouting the function call
US9286199B2 (en) Modifying memory space allocation for inactive tasks
US20170277246A1 (en) Workload placement based on heterogeneous compute performance per watt
CN116954929B (en) Dynamic GPU scheduling method and system for live migration
US10860499B2 (en) Dynamic memory management in workload acceleration
CN109582649A (en) A kind of metadata storing method, device, equipment and readable storage medium storing program for executing
CN109684235A (en) A kind of method, device and equipment of computer system application cache
US20150347303A1 (en) Adjusting allocation of storage devices
Park et al. MH cache: A multi-retention STT-RAM-based low-power last-level cache for mobile hardware rendering systems
CN107018163B (en) Resource allocation method and device
CN112445794B (en) Caching method of big data system
CN106469020B (en) Cache element and control method and its application system
CN108959499A (en) Distributed file system performance analysis method, device, equipment and storage medium
US9037622B1 (en) System and method for managing spool space in a mixed SSD and HDD storage environment
US20230418688A1 (en) Energy efficient computing workload placement
CN107329705B (en) Shuffle method for heterogeneous storage
CN105740167A (en) File system cache deletion method and system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WW01 Invention patent application withdrawn after publication

Application publication date: 20190426