KR20160126472A - Application launching time optimization apparatus and method using non-volatile memory - Google Patents

Application launching time optimization apparatus and method using non-volatile memory

Info

Publication number
KR20160126472A
Authority
KR
South Korea
Prior art keywords
start code
target program
file
code
block
Prior art date
Application number
KR1020150057504A
Other languages
Korean (ko)
Other versions
KR101680966B1 (en)
Inventor
조용운
김태석
Original Assignee
광운대학교 산학협력단
Priority date
Filing date
Publication date
Application filed by 광운대학교 산학협력단
Priority to KR1020150057504A
Publication of KR20160126472A
Application granted
Publication of KR101680966B1

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 12/00 - Accessing, addressing or allocating within memory systems or architectures
    • G06F 12/02 - Addressing or allocation; Relocation
    • G06F 12/08 - Addressing or allocation; Relocation in hierarchically structured memory systems, e.g. virtual memory systems
    • G06F 12/0802 - Addressing of a memory level in which the access to the desired data or data block requires associative addressing means, e.g. caches
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 12/00 - Accessing, addressing or allocating within memory systems or architectures
    • G06F 12/02 - Addressing or allocation; Relocation
    • G06F 12/0223 - User address space allocation, e.g. contiguous or non contiguous base addressing
    • G06F 12/023 - Free address space management
    • G06F 12/0238 - Memory management in non-volatile memory, e.g. resistive RAM or ferroelectric memory
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 12/00 - Accessing, addressing or allocating within memory systems or architectures
    • G06F 12/02 - Addressing or allocation; Relocation
    • G06F 12/08 - Addressing or allocation; Relocation in hierarchically structured memory systems, e.g. virtual memory systems
    • G06F 12/0802 - Addressing of a memory level in which the access to the desired data or data block requires associative addressing means, e.g. caches
    • G06F 12/0893 - Caches characterised by their organisation or structure
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00 - Arrangements for program control, e.g. control units
    • G06F 9/06 - Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/30 - Arrangements for executing machine instructions, e.g. instruction decode
    • G06F 9/30145 - Instruction analysis, e.g. decoding, instruction word fields
    • G06F 9/30149 - Instruction analysis, e.g. decoding, instruction word fields of variable length instructions
    • G06F 9/30152 - Determining start or end of instruction; determining instruction length

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • Memory System Of A Hierarchy Structure (AREA)

Abstract

Disclosed are an apparatus and a method for optimizing the start time of a program using nonvolatile memory. The method may include: detecting, on a file basis, the code necessary for starting a target program in the target program; and caching the detected file-unit start code in the nonvolatile memory.

Description

BACKGROUND OF THE INVENTION

1. Field of the Invention

The present invention relates to an apparatus and a method for optimizing the start time of a program using nonvolatile memory.

For a program to start, the start code necessary for starting the program must be loaded from the storage medium in which the program is stored, and the start time includes the time for the CPU to process the loaded start code.

However, the longer the start time, the longer the user must wait before the program can be used, which can be inconvenient.

Therefore, there is a demand for a method of minimizing this inconvenience by optimizing, and thereby shortening, the start time of a program.

The present invention provides an apparatus and a method for shortening the input/output time of the start code necessary for starting a target program by caching that start code in a nonvolatile memory having a faster access speed than the nonvolatile memory in which the target program is stored.

In addition, the present invention can provide an apparatus and a method for shortening the input/output time of the start code necessary for starting a target program by caching that start code and prefetching the cached start code.

The present invention can also provide an apparatus and a method that shorten the input/output time of the start code necessary for starting a target program by caching that start code in a nonvolatile memory that has a faster access speed than the nonvolatile memory in which the target program is stored and to which a buffer cache is mapped.

A method for optimizing a start time of a program according to an exemplary embodiment of the present invention includes: detecting, on a file basis, a code necessary for starting a target program in the target program; and caching the detected file-unit start code in a nonvolatile memory.

In the method according to an embodiment of the present invention, the detecting may extract the code necessary for starting the target program according to a system call requested for starting the target program.

The method according to an embodiment of the present invention may further comprise: analyzing the file-unit start code cached in the first nonvolatile memory and detecting, on a block basis, the code necessary for starting the target program; and setting the detected block-unit start code to be prefetched for the target program.

The method according to an embodiment of the present invention may further comprise mapping block information corresponding to the detected block-unit start code to file information corresponding to the file-unit start code, and the setting may set the block-unit start code corresponding to the block information to be prefetched for the target program.

The block information of the program start time optimization method according to an embodiment of the present invention may include at least one of a logical block address and a block size of a start code of the detected block unit.

The file information of the program start time optimization method according to an embodiment of the present invention may include at least one of a file name, a start address, and a file size of a start code of the detected file unit.

A method for optimizing a start time of a program according to an embodiment of the present invention may further include: evaluating a value of the file-unit start code cached in the nonvolatile memory; and managing the cached file-unit start code based on the evaluated value.

A method for optimizing a start time of a program according to an embodiment of the present invention includes: detecting, on a file basis, a code necessary for starting a target program in the target program; analyzing the detected file-unit start code and detecting a block-unit start code necessary for starting the target program; and caching the detected block-unit start code in a nonvolatile memory.

A method for optimizing a start time of a program according to an embodiment of the present invention may further include: evaluating a value of the block-unit start code cached in the nonvolatile memory; and managing the cached block-unit start code based on the evaluated value.

A method for optimizing a start time of a program according to an embodiment of the present invention includes: prefetching a start code from a nonvolatile memory, in which the start code necessary for starting a target program is cached, into a volatile memory; and starting the target program using the prefetched start code in the volatile memory.

In the program start time optimization method according to an embodiment of the present invention, the start code may be a file-unit start code detected in the target program or a block-unit start code detected by analyzing the detected file-unit start code.

According to an embodiment of the present invention, the start code necessary for starting the target program is cached in a nonvolatile memory having a faster access speed than the nonvolatile memory in which the target program is stored, so that the input/output time of that start code can be shortened.

According to an embodiment of the present invention, the input/output time of the start code necessary for starting the target program can also be shortened by caching that start code and prefetching the cached start code.

According to an embodiment of the present invention, the start code necessary for starting the target program is cached in a nonvolatile memory that has a faster access speed than the nonvolatile memory in which the target program is stored and to which a buffer cache is mapped, so that the input/output time of that start code can be shortened.

FIG. 1 is a diagram illustrating a start time optimization apparatus according to an embodiment of the present invention.
FIG. 2 is an example of the operation of the start time optimization apparatus according to an embodiment of the present invention.
FIG. 3 is an example of the operation of the processor of the start time optimization apparatus according to an embodiment of the present invention.
FIG. 4 is an example of the start time of a target program according to an embodiment of the present invention.
FIG. 5 is a flowchart illustrating the caching process of the start time optimization method according to the first embodiment of the present invention.
FIG. 6 is a flowchart showing the target program starting process of the start time optimization method according to the first embodiment of the present invention.
FIG. 7 is a flowchart illustrating the caching process of the start time optimization method according to the second embodiment of the present invention.
FIG. 8 is a flowchart showing the target program starting process of the start time optimization method according to the second embodiment of the present invention.

DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS

Hereinafter, embodiments of the present invention will be described in detail with reference to the accompanying drawings. The start time optimization method according to an embodiment of the present invention may be performed by a start time optimization apparatus.

FIG. 1 is a diagram illustrating a start time optimization apparatus according to an embodiment of the present invention.

The start time optimization apparatus may include a first nonvolatile memory 110, a second nonvolatile memory 120, a volatile memory 130, and a processor 140, as shown in FIG. 1.

The first non-volatile memory 110 may be a storage medium having a faster information access speed than the second non-volatile memory 120. For example, the second nonvolatile memory 120 may be an eMMC (embedded Multi Media Card) type memory using flash memory, or a solid state drive (SSD). Also, the first non-volatile memory 110 may be a Non-Volatile RAM (NVRAM).

At this time, the first nonvolatile memory 110 may cache the code necessary for starting the target program stored in the second nonvolatile memory 120.

The second non-volatile memory 120 may be a storage medium for storing and managing a target program.

The volatile memory 130 may include a buffer cache in which information that the processor 140 uses to perform the target program is temporarily stored. For example, the volatile memory 130 may be a dynamic random access memory (DRAM) used as the main memory of the processor 140.

The processor 140 may detect the code necessary for starting the target program in the target program stored in the second nonvolatile memory 120 and cache it in the first nonvolatile memory 110.

For example, the processor 140 may detect the code necessary for starting the target program on a file basis. At this time, the processor 140 may extract a code necessary for starting according to a system call requested by the target program for starting the target program.

For example, the processor 140 may use a system-call tracing tool such as strace to detect the list of files needed to start the target program. At this time, the file list may include at least one of an executable file, a library, and a configuration file. The processor 140 may treat the files included in the detected file list as the file-unit start code and cache them in the first nonvolatile memory 110.
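The following Python sketch illustrates this file-unit detection step. It runs the target program under strace and collects the files opened during startup; the command path, the strace output parsing, and the assumption that the traced run terminates are illustrative and not part of the disclosed method.

    import os
    import re
    import subprocess

    def detect_startup_files(cmd):
        """Run the target program under strace and collect the files it opens
        during startup (executable, libraries, configuration files)."""
        trace = subprocess.run(
            ["strace", "-f", "-e", "trace=openat,open"] + cmd,
            capture_output=True, text=True,     # strace prints syscalls on stderr
        )
        files = []
        # Example strace line:
        #   openat(AT_FDCWD, "/usr/lib/libc.so.6", O_RDONLY|O_CLOEXEC) = 3
        for line in trace.stderr.splitlines():
            m = re.search(r'open(?:at)?\([^"]*"([^"]+)"', line)
            if m and "ENOENT" not in line:      # skip files that failed to open
                files.append(m.group(1))
        return list(dict.fromkeys(files))       # deduplicate, keep first-open order

    # File information (file name and file size) for each detected file-unit start code
    start_files = detect_startup_files(["./target_program"])   # hypothetical command
    file_info = [{"name": f, "size": os.path.getsize(f)}
                 for f in start_files if os.path.isfile(f)]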

That is, by caching the files necessary for starting the target program, which are stored in the second nonvolatile memory 120, in the first nonvolatile memory 110, the processor 140 can improve the access speed to those files and thereby reduce the input/output time that occurs while the target program starts.
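The detected files can then be copied into the faster nonvolatile memory. The sketch below assumes the first nonvolatile memory is exposed as a filesystem mounted at /mnt/nvram; that mount point and the naming scheme are hypothetical.

    import os
    import shutil

    NVRAM_CACHE_DIR = "/mnt/nvram/start_code_cache"   # hypothetical NVRAM mount point

    def cache_file_unit_start_code(files, cache_dir=NVRAM_CACHE_DIR):
        """Copy the detected file-unit start code into the faster nonvolatile
        memory so later startups read it from there instead of eMMC/SSD."""
        os.makedirs(cache_dir, exist_ok=True)
        cached = {}
        for path in files:
            dest = os.path.join(cache_dir, path.lstrip("/").replace("/", "_"))
            shutil.copy2(path, dest)    # copy2 preserves size and timestamps
            cached[path] = dest
        return cached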

Then, the processor 140 may generate file information including at least one of a file name, a start address, and a file size of the file-unit start code. Since the file information describes the file-unit start code detected from the target program, it is information at the file system layer.

The processor 140 may cache the detected file-unit start code in the first nonvolatile memory 110. Next, the processor 140 may analyze the file-unit start code cached in the first nonvolatile memory 110 and detect the code necessary for starting the target program on a block-by-block basis. Then, the processor 140 can set the detected block-unit start code to be prefetched for the target program.

For example, the processor 140 may detect the block-unit start code required when the target program is executed by using a tool that tracks input/output requests at the block level, such as blktrace. At this time, the processor 140 may generate block information including at least one of a logical block address (LBA) and a block size of the block-unit start code. Since the block information describes the block-unit start code detected by tracking the input/output requests of the target program, it is information at the input/output layer.
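The following sketch illustrates block-level detection with blktrace piped through blkparse. The device path, the capture duration, the requirement for root privileges, and the parsing of the blkparse output fields are assumptions made for illustration.

    import subprocess

    def trace_block_reads(device, duration_s=10, program_name=None):
        """Capture block-level read requests issued while the target program
        starts. Returns a list of (logical block address, sector count) tuples."""
        cmd = f"timeout {duration_s} blktrace -d {device} -o - | blkparse -i -"
        out = subprocess.run(cmd, shell=True, capture_output=True, text=True)
        blocks = []
        for line in out.stdout.splitlines():
            fields = line.split()
            # blkparse lines look like:
            #   8,0  0  1  0.000000000  1234  Q  R  123456 + 8 [target_program]
            if len(fields) >= 10 and fields[5] == "Q" and "R" in fields[6]:
                if program_name and f"[{program_name}]" not in line:
                    continue
                lba, nsectors = int(fields[7]), int(fields[9])
                blocks.append((lba, nsectors))
        return blocks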

At this time, the processor 140 may map the block information corresponding to the block-unit start code to the file information corresponding to the file-unit start code. Then, the processor 140 can set the block-unit start code corresponding to the block information to be prefetched for the target program.

Specifically, by mapping block information such as a logical block address and a block size to file information such as a file name, a start address, and a file size, the processor 140 can produce a prefetching program that pre-reads the code necessary for starting the target program.

For example, the processor 140 may produce a prefetching program that includes a function for pre-reading the block-unit start code based on the block information; the function that pre-reads the block-unit start code may be posix_fadvise. The prefetching program may then be executed together with the start of the target program to copy the block-unit start code necessary for starting the target program from the first nonvolatile memory 110 to the volatile memory 130. That is, by having the prefetching program perform the input/output of the target program, the input/output and the CPU computation of the target program are parallelized, and the execution time of the target program can be improved.
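The sketch below illustrates, under stated assumptions, how traced block ranges might be mapped back to the cached files and then read ahead with posix_fadvise (POSIX_FADV_WILLNEED), analogous to the prefetching program described above. The start_lba field of each file entry, and how it would be obtained (for example via FIEMAP), are assumptions rather than part of the disclosure.

    import os

    def generate_prefetch_list(block_info, file_info):
        """Map each traced block range onto the file it belongs to, producing
        (file path, byte offset, byte length) entries for the prefetcher.
        Each file_info entry is assumed to carry name, size, and start_lba."""
        prefetch = []
        for lba, nsectors in block_info:
            for f in file_info:
                start = f["start_lba"]
                end = start + f["size"] // 512
                if start <= lba < end:
                    prefetch.append((f["name"], (lba - start) * 512, nsectors * 512))
                    break
        return prefetch

    def prefetch(entries):
        """Ask the kernel to read the needed file regions ahead of time."""
        for path, offset, length in entries:
            fd = os.open(path, os.O_RDONLY)
            try:
                os.posix_fadvise(fd, offset, length, os.POSIX_FADV_WILLNEED)
            finally:
                os.close(fd)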

In addition, the processor 140 may evaluate the value of the file-unit start code cached in the first nonvolatile memory 110. For example, the processor 140 may evaluate that value using at least one of the execution frequency and recency of the target program from which the code was extracted and the size of the file-unit start code cached in the first nonvolatile memory 110.

Then, the processor 140 can manage the start code in file units cached in the first nonvolatile memory 110 based on the value of the evaluated code.

For example, if there is insufficient space to store the code in the first nonvolatile memory 110, the processor 140 may delete the code with the lowest value. That is, the lower the execution frequency of the target program from which the code was extracted, the lower the processor 140 may evaluate the code's value and the earlier it may delete that code. Likewise, the longer it has been since that target program was last executed, or the larger the code stored in the first nonvolatile memory 110, the lower the processor 140 may evaluate the code's value and the earlier it may delete it.
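A minimal sketch of such value-based management is shown below. The scoring formula and its weights are illustrative assumptions, since the disclosure only states that frequency, recency, and size may be considered.

    import time

    def value(entry, w_freq=1.0, w_recency=1.0, w_size=1.0):
        """Score a cached start-code entry; higher means more worth keeping.
        entry is assumed to carry launch_count, last_launch, and size."""
        age = time.time() - entry["last_launch"]           # seconds since last launch
        return (w_freq * entry["launch_count"]
                - w_recency * age / 3600.0                 # penalize stale programs
                - w_size * entry["size"] / (1 << 20))      # penalize large entries (MiB)

    def evict_until_fits(cache_entries, needed_bytes, free_bytes):
        """Delete lowest-value entries until needed_bytes fits in the cache."""
        for entry in sorted(cache_entries, key=value):
            if free_bytes >= needed_bytes:
                break
            free_bytes += entry["size"]
            cache_entries.remove(entry)                    # evict from the NVRAM cache
        return cache_entries, free_bytes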

The first non-volatile memory 110 has a faster information access speed than the second non-volatile memory 120, but may have a smaller storage space. The processor 140 may increase the space efficiency of the first nonvolatile memory 110 by caching the code necessary for starting the target program in the first nonvolatile memory 110 on a block basis.

Specifically, the processor 140 can detect, on a file-by-file basis, the code necessary for starting the target program in the target program. The processor 140 may analyze the start code of the detected file unit and detect the start code of the block unit necessary for starting the target program. Next, the processor 140 may cache the detected start code in block units in the first nonvolatile memory 110 in a chunk form.
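The following sketch illustrates chunk-form caching under the assumption that the traced block ranges are read directly from the block device and packed into a single chunk file together with a small index; the chunk layout is an assumption made for illustration.

    SECTOR = 512

    def cache_block_chunks(device, block_info, chunk_path):
        """Read only the traced block ranges from the slower device and pack
        them contiguously into one chunk file in the faster nonvolatile memory."""
        index = []
        with open(device, "rb") as dev, open(chunk_path, "wb") as chunk:
            for lba, nsectors in block_info:
                dev.seek(lba * SECTOR)
                data = dev.read(nsectors * SECTOR)
                index.append((lba, len(data), chunk.tell()))
                chunk.write(data)
        return index   # lets the prefetcher locate each block range inside the chunk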

At this time, the processor 140 may evaluate the value of each chunk, that is, of the block-unit start code cached in the first nonvolatile memory 110. For example, the processor 140 may evaluate the value of a cached chunk using at least one of the execution frequency and recency of the target program from which the code was extracted and the size of the chunk cached in the first nonvolatile memory 110.

The processor 140 may then manage the cached chunk in the first non-volatile memory 110 based on the value of the evaluated code.

For example, if there is insufficient space to store the code in the first nonvolatile memory 110, the processor 140 may delete the chunk having the lowest value. That is, the lower the execution frequency of the target program from which the code was extracted, the lower the processor 140 may evaluate the chunk's value and the earlier it may delete that chunk. In addition, the longer it has been since that target program was last executed, or the larger the chunk stored in the first nonvolatile memory 110, the lower the processor 140 may evaluate the chunk's value and the earlier it may delete it.

The processor 140 may also have the operating system map the buffer cache to the first nonvolatile memory 110. In this case, the processor 140 may use the first nonvolatile memory 110 instead of the volatile memory 130. The processor 140 may cache, in that buffer cache, the file-unit start code or the block-unit start code detected from the target program stored in the second nonvolatile memory 120. Thus, the processor 140 may use the start code cached in the first nonvolatile memory 110 to start the target program without copying it to the volatile memory 130.

FIG. 2 is an example of the operation of the start time optimization apparatus according to an embodiment of the present invention.

FIG. 2 shows an example of operation in which the block-unit start code is cached in the first nonvolatile memory 110 in chunk form.

First, the processor 140 may detect, on a file-by-file basis, the code necessary for starting the target program in the target program stored in the second nonvolatile memory 120. For example, the processor 140 may detect the file-unit start codes C1, C2, and C3 (211) in a program c (210), which is the target program. In addition, the processor 140 may detect the file-unit start code L4 (221) in the library 220.

Next, the processor 140 may analyze the detected file-unit start code and detect the block-unit start code necessary for starting the target program. For example, the processor 140 may detect a block 230 containing the start codes C1, C2, and C3 (211) and the file start code L4 (221) as the block-unit start code necessary for starting the program c (210).

Next, the processor 140 may cache the detected block-unit start code in the first nonvolatile memory 110 in chunk form. For example, as shown in FIG. 2, the processor 140 may cache the block 230 containing the start codes C1, C2, and C3 (211) and the file start code L4 (221) in the first nonvolatile memory 110 as a chunk.

When the program c (210) is started, the processor 140 prefetches the block 230 into the volatile memory 130 as shown in FIG. 2, so that the input/output time of the code necessary for starting the program can be minimized.

FIG. 3 is an example of the operation of the processor of the start time optimization apparatus according to an embodiment of the present invention.

The processor 140 may include a system call profiler 310, a first nonvolatile memory manager 320, an input/output profiler 330, a mapper 340, a prefetching program generator 350, and a program start manager 360.

The system call profiler 310 can detect a list of files necessary for starting the target program 301 by using a tool for tracking system calls such as strace.

The system call profiler 310 can output the files included in the detected file list as the file-unit start code 311. At this time, the system call profiler 310 can generate file information including at least one of the file name, start address, and file size of the file-unit start code 311 and output the file information together with the start code 311.

The first nonvolatile memory manager 320 may cache the file-unit start code 311 output from the system call profiler 310 in the first nonvolatile memory 110.

For example, the input/output profiler 330 can detect the block-unit start code 331 required for executing the target program 301 by using a tool that tracks input/output requests at the block level, such as blktrace. At this time, the input/output profiler 330 can generate block information including at least one of a logical block address (LBA) and a block size of the block-unit start code 331 and output it together with the start code 331.

The mapper 340 may output the mapping information 341 by mapping the block information output by the input/output profiler 330 to the file information output by the system call profiler 310.

Based on the mapping information 341, which maps block information such as a logical block address and a block size to file information such as a file name, a start address, and a file size, the prefetching program generator 350 can produce a prefetching program 302 that prefetches the code necessary for starting.

When starting the target program 301, the program start manager 360 executes the prefetching program 302 so that the block-unit start code necessary for starting the target program 301 is copied from the first nonvolatile memory 110 to the volatile memory 130, as shown in FIG. 3. That is, by having the prefetching program 302 perform the code input/output of the target program 301, the input/output and the CPU computation of the target program 301 are parallelized, and the execution time of the target program 301 can be improved.

FIG. 4 is an example of the start time of a target program according to an embodiment of the present invention.

The start time 410 of the target program may be the start time when the start time optimization apparatus 100 is not used. In this case, the start time 410 consists of an input/output time (I/O) 401 for loading the start code necessary for starting the target program from the second nonvolatile memory 120 in which the target program is stored, and a CPU processing time 402 for the CPU to process the code loaded from the second nonvolatile memory 120.

The start time 420 of the target program may be the start time when the start time optimization apparatus 100 maps the buffer cache to the first nonvolatile memory 110 at the operating system level. In this case, the start code that would be loaded from the second nonvolatile memory 120 during the input/output time (I/O) 401 is already cached in the buffer cache of the first nonvolatile memory 110, and the CPU can process the start code contained in the buffer cache mapped to the first nonvolatile memory 110. That is, since the CPU can use the start code cached in the first nonvolatile memory 110 to start the target program without copying it into the volatile memory 130, the start time 420 can omit the input/output time (I/O) 401 and consist of the CPU processing time 402 only.

Therefore, the start time 420, which consists only of the CPU processing time 402, may be shorter than the start time 410 of the target program.

The start time 430 of the target program may be the start time when the start time optimization apparatus 100 has cached the start code in the first nonvolatile memory 110. Since the information access speed of the first nonvolatile memory 110 is faster than that of the second nonvolatile memory 120, the input/output time (I/O) 431 for loading the start code from the first nonvolatile memory 110 may be shorter than the input/output time (I/O) 401 for loading the start code from the second nonvolatile memory 120.

Therefore, the start time 430, which consists of the input/output time (I/O) 431 followed by the CPU processing time 402, may be shorter than the start time 410 of the target program.

The start time 440 of the target program may be the start time when the start time optimization apparatus 100 caches the start code in the first nonvolatile memory 110 and generates the prefetching program. In this case, as shown in FIG. 4, the prefetching program generated by the start time optimization apparatus 100 can perform prefetching 441, composed of the input/output time (I/O) 431 for loading the start code from the first nonvolatile memory 110, in parallel with the CPU processing time 402 of the target program.

For instance, while the program processes the start code prefetched by I/O 1 during CPU 1, the prefetching program performs I/O 2 in parallel, and during CPU 2 the program processes the start code prefetched by I/O 2.

That is, since only the CPU processing time 402 runs continuously, as shown in FIG. 4, the time 442 taken to start the target program may be shorter than the start time 410 of the target program.
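As a purely illustrative calculation (the durations below are assumed, not taken from the disclosure), the four start times of FIG. 4 would compare roughly as follows:

    # Illustrative timings in seconds; these values are assumptions, not measurements.
    io_slow, io_fast, cpu = 0.200, 0.050, 0.300

    t_410 = io_slow + cpu        # 0.500 s: load from the second nonvolatile memory, then process
    t_430 = io_fast + cpu        # 0.350 s: load from the first nonvolatile memory, then process
    t_440 = max(io_fast, cpu)    # ~0.300 s: prefetching overlaps the I/O with CPU work (idealized)
    t_420 = cpu                  # 0.300 s: buffer cache mapped to NVRAM, no copy to volatile memory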

FIG. 5 is a flowchart illustrating the caching process of the start time optimization method according to the first embodiment of the present invention.

The first embodiment of the present invention is an embodiment in which the start time optimization apparatus 100 caches the start code in the first nonvolatile memory 110, as in the start time 430 of the target program in FIG. 4.

At step 510, the processor 140 may detect the start code required for the start of the target program in the target program.

At this time, the processor 140 may detect the code necessary for starting the target program as a start code of the file unit. In addition, the processor 140 may analyze the detected start code of each file unit to detect a start code of a block unit necessary for starting the target program.

In step 520, the processor 140 may cache the start code detected in step 510 in the first nonvolatile memory 110. At this time, the processor 140 may cache the file-unit start code in the first nonvolatile memory 110. In addition, the processor 140 may cache the block-unit start code in the first nonvolatile memory 110 in chunk form.

FIG. 6 is a flowchart showing the target program starting process of the start time optimization method according to the first embodiment of the present invention.

In step 610, the processor 140 may load the start code from the first nonvolatile memory 110, in which the start code necessary for starting the program is cached, into the volatile memory 130. In this case, the start code may be the file-unit start code detected in the target program or the block-unit start code detected by analyzing the detected file-unit start code.

In step 620, the processor 140 may start the target program using the start code loaded in step 610.

FIG. 7 is a flowchart illustrating a caching process of the start time optimization method according to the second embodiment of the present invention.

The second embodiment of the present invention is an embodiment in which the start time optimization apparatus 100 caches the start code in the first nonvolatile memory 110 and generates the prefetching program, as in the start time 440 of the target program in FIG. 4.

In step 710, the processor 140 may detect, on a file basis, the code necessary for starting the target program. At this time, the processor 140 may extract the code necessary for starting based on the system calls requested when the target program starts. In addition, the processor 140 may generate file information including at least one of a file name, a start address, and a file size of the file-unit start code. Since the file information describes the file-unit start code detected from the target program, it is information at the file system layer.

In step 720, the processor 140 may cache the file-unit start code detected in step 710 in the first nonvolatile memory 110.

In step 730, the processor 140 may analyze the file-unit start code cached in the first nonvolatile memory 110 in step 720 and detect, on a block-by-block basis, the code necessary for starting the target program. For example, the processor 140 may detect the block-unit start code required for executing the target program using a tool that tracks input/output requests at the block level. At this time, the processor 140 may generate block information including at least one of a logical block address (LBA) and a block size of the block-unit start code.

In step 740, the processor 140 may map the block information corresponding to the start code of the block unit to the file information corresponding to the start code of the file unit.

In step 750, the processor 140 may set the block-unit start code corresponding to the block information to be prefetched for the target program. Specifically, by mapping block information such as a logical block address and a block size to file information such as a file name, a start address, and a file size, the processor 140 can produce a prefetching program that includes a function for pre-reading the block-unit start code.

FIG. 8 is a flowchart showing the target program starting process of the start time optimization method according to the second embodiment of the present invention.

In step 810, the processor 140 may prefetch the start code from the first nonvolatile memory 110, in which the start code necessary for starting the target program is cached, into the volatile memory 130. In this case, the start code may be the file-unit start code detected in the target program or the block-unit start code detected by analyzing the detected file-unit start code.

In step 820, the processor 140 may start the target program using the prefetched start code in the volatile memory 130.

At this time, step 810 may be performed in parallel with step 820 to copy the start code necessary for starting the target program from the first nonvolatile memory 110 to the volatile memory 130. That is, by having the prefetching program perform the input/output of the target program, the input/output and the CPU computation of the target program are parallelized, and the execution time of the target program can be improved.
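A minimal sketch of this parallelization is shown below: a prefetching thread issues posix_fadvise requests while the target program is launched, so the input/output of step 810 overlaps with the start of step 820. The target command and the prefetch entries are illustrative assumptions.

    import os
    import subprocess
    import threading

    def prefetch_worker(entries):
        """Read ahead the cached start code so the needed pages are resident
        while the target program's CPU work proceeds in parallel."""
        for path, offset, length in entries:
            fd = os.open(path, os.O_RDONLY)
            try:
                os.posix_fadvise(fd, offset, length, os.POSIX_FADV_WILLNEED)
            finally:
                os.close(fd)

    def start_with_prefetch(target_cmd, prefetch_entries):
        """Launch the prefetching work and the target program together,
        mirroring steps 810 and 820 being performed in parallel."""
        t = threading.Thread(target=prefetch_worker, args=(prefetch_entries,))
        t.start()                            # step 810: prefetch into memory
        proc = subprocess.Popen(target_cmd)  # step 820: start the target program
        t.join()
        return proc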

The present invention can shorten the input/output time of the start code necessary for starting the target program by caching that start code in a nonvolatile memory having a faster access speed than the nonvolatile memory in which the target program is stored.

In addition, the present invention can shorten the input/output time of the start code necessary for starting the target program by caching that start code and prefetching the cached start code.

The present invention can also shorten the input/output time of the start code necessary for starting the target program by caching that start code in a nonvolatile memory that has a faster access speed than the nonvolatile memory in which the target program is stored and to which a buffer cache is mapped.

The method according to an embodiment may be implemented in the form of program instructions that can be executed through various computer means and recorded in a computer-readable medium. The computer-readable medium may include program instructions, data files, data structures, and the like, alone or in combination. The program instructions recorded on the medium may be those specially designed and configured for the embodiments or may be those known and available to those skilled in the art of computer software. Examples of computer-readable media include magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD-ROMs and DVDs; magneto-optical media; and hardware devices specifically configured to store and execute program instructions, such as ROM, RAM, and flash memory. Examples of program instructions include machine language code such as that produced by a compiler, as well as high-level language code that can be executed by a computer using an interpreter or the like. The hardware devices described above may be configured to operate as one or more software modules to perform the operations of the embodiments, and vice versa.

While the invention has been shown and described with reference to certain preferred embodiments thereof, it will be understood by those of ordinary skill in the art that various changes in form and detail may be made therein without departing from the spirit and scope of the invention as defined by the appended claims.

Therefore, the scope of the present invention should not be limited to the described embodiments, but should be determined not only by the claims but also by their equivalents.

110: first nonvolatile memory
120: second nonvolatile memory
140: Processor

Claims (11)

1. A method for optimizing a start time of a program, the method comprising:
detecting, on a file basis, a code necessary for starting a target program in the target program; and
caching the detected file-unit start code in a nonvolatile memory.

2. The method according to claim 1, wherein the detecting extracts the code necessary for starting the target program according to a request for starting the target program.

3. The method according to claim 1, further comprising:
analyzing the file-unit start code cached in the nonvolatile memory and detecting, on a block basis, the code necessary for starting the target program; and
setting the detected block-unit start code to be prefetched for the target program.

4. The method according to claim 3, further comprising mapping block information corresponding to the detected block-unit start code to file information corresponding to the file-unit start code,
wherein the setting sets the block-unit start code corresponding to the block information to be prefetched for the target program.

5. The method according to claim 4, wherein the block information includes at least one of a logical block address and a block size of the detected block-unit start code.

6. The method according to claim 4, wherein the file information includes at least one of a file name, a start address, and a file size of the detected file-unit start code.

7. The method according to claim 1, further comprising:
evaluating a value of the file-unit start code cached in the nonvolatile memory; and
managing the cached file-unit start code based on the evaluated value.

8. A method for optimizing a start time of a program, the method comprising:
detecting, on a file basis, a code necessary for starting a target program in the target program;
analyzing the detected file-unit start code and detecting a block-unit start code necessary for starting the target program; and
caching the detected block-unit start code in a nonvolatile memory.

9. The method according to claim 8, further comprising:
evaluating a value of the block-unit start code cached in the nonvolatile memory; and
managing the cached block-unit start code based on the evaluated value.

10. A method for optimizing a start time of a program, the method comprising:
prefetching a start code from a nonvolatile memory, in which the start code necessary for starting a target program is cached, into a volatile memory; and
starting the target program using the prefetched start code in the volatile memory.

11. The method according to claim 10, wherein the start code is a file-unit start code detected in the target program or a block-unit start code detected by analyzing the detected file-unit start code.
KR1020150057504A 2015-04-23 2015-04-23 Application launching time optimization apparatus and method using non-volatile memory KR101680966B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020150057504A KR101680966B1 (en) 2015-04-23 2015-04-23 Application launching time optimization apparatus and method using non-volatile memory

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020150057504A KR101680966B1 (en) 2015-04-23 2015-04-23 Application launching time optimization apparatus and method using non-volatile memory

Publications (2)

Publication Number Publication Date
KR20160126472A (en) 2016-11-02
KR101680966B1 (en) 2016-11-29

Family

ID=57518740

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020150057504A KR101680966B1 (en) 2015-04-23 2015-04-23 Application launching time optimization apparatus and method using non-volatile memory

Country Status (1)

Country Link
KR (1) KR101680966B1 (en)

Also Published As

Publication number Publication date
KR101680966B1 (en) 2016-11-29


Legal Events

Code Title
A201 Request for examination
GRNT Written decision to grant
FPAY Annual fee payment (payment date: 20191104; year of fee payment: 4)