CN110019361B - Data caching method and device


Publication number
CN110019361B
Authority
CN
China
Prior art keywords
data
cached
cache
priority
classified
Prior art date
Legal status
Active
Application number
CN201711052725.5A
Other languages
Chinese (zh)
Other versions
CN110019361A (en)
Inventor
焦张波
Current Assignee
Beijing Gridsum Technology Co Ltd
Original Assignee
Beijing Gridsum Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Gridsum Technology Co Ltd
Priority to CN201711052725.5A
Publication of CN110019361A
Application granted
Publication of CN110019361B
Legal status: Active

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 - Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20 - Information retrieval of structured data, e.g. relational data
    • G06F16/24 - Querying
    • G06F16/245 - Query processing
    • G06F16/2455 - Query execution
    • G06F16/24552 - Database cache management

Abstract

The invention discloses a data caching method and a data caching device, and relates to the field of computer technologies. Its main aims are to make full use of the storage resources of a cache region and to improve data query performance. The main technical scheme of the invention is as follows: acquiring data to be cached; classifying the data to be cached according to the data characteristics corresponding to the data to be cached to obtain classified data to be cached; determining the cache priority corresponding to the classified data to be cached; and placing the classified data to be cached into a cache region in order of cache priority from high to low. The invention is mainly used for caching data.

Description

Data caching method and device
Technical Field
The present invention relates to the field of computer technologies, and in particular, to a data caching method and apparatus.
Background
Caching is a common technique for data access in computer systems. When a computer system accesses data, the recently accessed data can be stored in a buffer area, so that when the system needs that data again it is first queried from the buffer area; if it is found there, it is used directly, and if not, it is queried from memory.
In existing data caching methods, recently accessed data is continuously stored into the cache region. However, as the application fields of data keep widening, the data in the cache region keeps growing and may exceed the space of the cache region, causing the cache region to overflow. At that point, new data overwrites the data in the cache region that has gone unaccessed for the longest time, so that when the computer queries recently accessed data, it can still be found preferentially in the cache region.
Although this caching method ensures that the most recently accessed data is stored in the cache region, the limited space of the cache region means that more and more data has to be overwritten. The data stored in the cache region is meant to make subsequent repeated access convenient and faster, so if too much important data is overwritten, the storage resources of the cache region are wasted and data query performance is reduced.
Disclosure of Invention
In view of this, the present invention provides a data caching method and apparatus, and mainly aims to fully utilize storage resources of a cache region and improve query performance of data.
In order to solve the above problems, the present invention mainly provides the following technical solutions:
in one aspect, an embodiment of the present invention provides a data caching method, including:
acquiring data to be cached;
classifying the data to be cached according to the data characteristics corresponding to the data to be cached to obtain the classified data to be cached;
determining the cache priority corresponding to the classified data to be cached;
and putting the classified data to be cached into a cache region according to the sequence of the cache priority from high to low.
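The four steps above can be read as a simple pipeline. The following is a minimal sketch in Python under stated assumptions: the helper names (classify, determine_priority) and the cache_region.put interface are hypothetical and are introduced only to illustrate the ordering of the steps; concrete versions of the classification and priority calculation are sketched in the detailed description below.

```python
# Hypothetical sketch of the four-step method above; classify(),
# determine_priority() and cache_region.put() are assumed interfaces,
# not part of the patent itself.
def cache_data(items_to_cache, classify, determine_priority, cache_region):
    # Step 1: acquire the data to be cached (items_to_cache is assumed given).
    # Step 2: classify each item according to its data characteristics.
    classified = [classify(item) for item in items_to_cache]

    # Step 3: determine the cache priority corresponding to each classified item.
    prioritized = [(determine_priority(item), item) for item in classified]

    # Step 4: place the classified data into the cache region in order of
    # cache priority from high to low; a smaller numeric value is assumed
    # to mean a higher priority here.
    for _, item in sorted(prioritized, key=lambda pair: pair[0]):
        cache_region.put(item)
```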
Further, the data characteristics include multiple dimensions, and classifying the data to be cached according to the data characteristics corresponding to the data to be cached includes:
according to the type of the storage equipment corresponding to the data to be cached, performing first dimension classification on the data to be cached;
according to different display positions of the data to be cached in the storage device, carrying out second dimension classification on the data to be cached;
performing third-dimension classification on the data to be cached according to the page attribute characteristics corresponding to the data to be cached;
and according to the service module characteristics corresponding to the data to be cached, carrying out fourth dimensional classification on the data to be cached.
Further, the determining the cache priority corresponding to the classified data to be cached includes:
when the data to be cached carries indication information of a specified caching priority, determining the caching priority for the data to be cached according to the indication information;
when the data to be cached does not carry indication information specifying the caching priority, determining the caching priority for the data to be cached according to an importance evaluation value of the data to be cached, wherein the importance evaluation value is calculated from a weight value and an evaluation score corresponding to each dimension classification.
Further, before determining the caching priority for the data to be cached according to its importance evaluation value, the method further includes:
pre-configuring a weight value and an evaluation score of the data to be cached in each dimension classification;
obtaining a historical query frequency corresponding to the data to be cached by querying a historical cache record of a cache region;
and determining the importance evaluation value of the data to be cached according to the weight value and the evaluation score of the data to be cached in each dimension classification and the historical query frequency corresponding to the data to be cached.
Further, after the data to be cached is classified according to the data characteristics corresponding to the data to be cached to obtain the classified data to be cached, the method further includes:
respectively determining the classified data to be cached and the capacity of the cache region;
if the classified data to be cached is larger than the capacity of the cache region, receiving an input data screening condition, screening out at least part of cache data from the classified data to be cached according to the data screening condition, and performing cache priority ordering on the screened at least part of cache data;
after the classified data to be cached is placed in a cache region according to the order of the cache priority from high to low, the method further comprises the following steps:
determining the cache priority of the updated data according to the updating condition of the data to be cached;
and updating the cache data in the cache region according to the cache priority of the updated data.
In order to achieve the above object, according to another aspect of the present invention, a storage medium is provided, where the storage medium includes a stored program, and when the program runs, a device on which the storage medium is located is controlled to execute the above data caching method.
In order to achieve the above object, according to another aspect of the present invention, there is provided a processor for executing a program, wherein the program executes the above data caching method.
On the other hand, an embodiment of the present invention further provides a data caching device, including:
the acquisition unit is used for acquiring data to be cached;
the classification unit is used for classifying the data to be cached according to the data characteristics corresponding to the data to be cached to obtain the classified data to be cached;
the first determining unit is used for determining the cache priority corresponding to the classified data to be cached;
and the cache unit is used for placing the classified data to be cached into a cache region according to the sequence of the cache priority from high to low.
Further, the data features include a plurality of dimensions, and the classification unit includes:
the first classification module is used for performing first dimension classification on the data to be cached according to the type of the storage equipment corresponding to the data to be cached;
the second classification module is used for performing second dimension classification on the data to be cached according to different display positions of the data to be cached in the storage equipment;
the third classification module is used for carrying out third dimension classification on the data to be cached according to the page attribute characteristics corresponding to the data to be cached;
and the fourth classification module is used for performing fourth dimension classification on the data to be cached according to the service module characteristics corresponding to the data to be cached.
Further, the first determining unit is specifically configured to determine a cache priority for the data to be cached according to indication information when the data to be cached carries the indication information specifying the cache priority;
the first determining unit is further specifically configured to determine a cache priority for the data to be cached according to an importance evaluation value of the data to be cached when the data to be cached does not carry indication information that specifies the cache priority, where the importance evaluation value is calculated according to a weight value and an evaluation score corresponding to each dimension classification.
Further, the apparatus further comprises:
the configuration unit is used for pre-configuring a weight value and an evaluation score of the data to be cached in each dimension classification;
the query unit is used for obtaining the historical query frequency corresponding to the data to be cached by querying the historical cache records of the cache region;
and the second determining unit is used for determining the importance evaluation value of the data to be cached according to the weight value and the evaluation score of the data to be cached in each dimension classification and the historical query frequency corresponding to the data to be cached.
Further, the apparatus further comprises:
a third determining unit, configured to determine the classified data to be cached and the capacity of the cache area respectively;
the screening unit is used for receiving an input data screening condition if the classified data to be cached is larger than the capacity of the cache region, screening at least part of cache data from the classified data to be cached according to the data screening condition, and performing cache priority ordering on the screened at least part of cache data;
a fourth determining unit, configured to determine, after the classified data to be cached is placed in the cache region according to the order from high to low of the cache priority, a cache priority of the updated data according to an update condition of the data to be cached;
and the updating unit is used for updating the cache data in the cache region according to the cache priority of the updated data.
By the technical scheme, the technical scheme provided by the embodiment of the invention at least has the following advantages:
according to the data caching method and device provided by the embodiment of the invention, the data to be cached is classified according to the data characteristics corresponding to the data to be cached to obtain the classified data to be cached, so that the caching priority corresponding to the classified data to be cached is determined according to the data to be cached of different classifications, and the classified data to be cached is placed in the caching area according to the sequence of the caching priority from high to low. Compared with the method for directly putting the data to be cached into the storage area in the prior art, the embodiment of the invention is convenient for subsequent repeated access to the data in the cache area, classifies the data to be cached according to the data characteristics corresponding to the data to be cached, puts the classified data to be cached into the cache area according to the sequence of the cache priority from high to low before putting the data to be cached into the storage area, ensures that the data resources in the cache area are fully utilized in the subsequent data query process, so that when a user queries the data in the subsequent process, the required data can be queried from the cache area preferentially, and the data query performance is improved.
The foregoing description is only an overview of the technical solutions of the present invention, and the embodiments of the present invention are described below in order to make the technical means of the present invention more clearly understood and to make the above and other objects, features, and advantages of the present invention more clearly understandable.
Drawings
Various other advantages and benefits will become apparent to those of ordinary skill in the art upon reading the following detailed description of the preferred embodiments. The drawings are only for purposes of illustrating the preferred embodiments and are not to be construed as limiting the invention. Also, like reference numerals are used to refer to like parts throughout the drawings. In the drawings:
fig. 1 is a flowchart of a data caching method according to an embodiment of the present invention;
fig. 2 is a flowchart of another data caching method according to an embodiment of the present invention;
fig. 3 is a block diagram illustrating a data caching apparatus according to an embodiment of the present invention;
fig. 4 is a block diagram of another data caching apparatus according to an embodiment of the present invention.
Detailed Description
Exemplary embodiments of the present invention will be described in more detail below with reference to the accompanying drawings. While exemplary embodiments of the invention are shown in the drawings, it should be understood that the invention can be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the invention to those skilled in the art.
The embodiment of the present invention provides a data caching method. As shown in fig. 1, the method classifies data to be cached according to the data characteristics corresponding to the data to be cached and places the classified data into a cache region in order of cache priority from high to low, so as to make full use of the storage resources of the cache region and improve data query performance. To this end, the embodiment of the present invention provides the following specific steps:
101. Acquiring data to be cached.
The data to be cached is data that needs to be stored in the cache region, such as web browsing data, user search data, or video playing data; the embodiment of the present invention is not limited in this respect.
The buffer here mainly plays three roles. The first is pre-reading: when the hard disk starts to read data under a CPU instruction, the control chip on the hard disk directs the head to read the next cluster or clusters following the cluster being read into the cache, so that when that data is needed it does not have to be read from the disk again but is transmitted to memory directly from the cache; because the cache is far faster than the read-write speed of the head, performance improves noticeably. The second is caching write actions: when the hard disk receives a write instruction, the data is not written to the disk immediately but is first stored temporarily in the cache, after which a "data written" signal is sent to the system; the system then considers the write complete and continues with its work, and the hard disk writes the cached data to the disk when it is idle. The third is temporarily storing recently accessed data: some data is accessed frequently, so the cache inside the hard disk keeps frequently read data, and when that data is read again it can be transmitted directly from the cache.
102. Classifying the data to be cached according to the data characteristics corresponding to the data to be cached to obtain the classified data to be cached.
For the embodiment of the invention, page browsing data may be stored in a cache region at the mobile phone terminal or in a cache region at the server side, and user search data may be stored in the cache region as a default page or as a query page: the default page usually displays the most recent day of data by default, while the query page is a user-defined query page that returns no result until the user enters query conditions. Different cached data corresponds to different data characteristics. The data to be cached is therefore classified according to the data characteristics corresponding to it, yielding the classified data to be cached.
According to the embodiment of the invention, data to be cached with different data characteristics is classified, and the data is organized in advance before being placed into the cache region, so that it can be cached in a way that matches subsequent data queries.
103. Determining the cache priority corresponding to the classified data to be cached.
Data to be cached with different data characteristics differs in how it will subsequently be queried from the cache region; for example, video viewing data is less likely to be queried at the mobile phone terminal and more likely to be queried at the external interface terminal.
Because data to be cached with a higher query probability is called more frequently from the cache region, the embodiment of the invention determines the cache priority corresponding to the classified data to be cached: a higher cache priority is determined for data with a higher query probability and a lower cache priority for data with a lower query probability. In this way the storage resources of the cache region are fully utilized and important data to be cached receives a higher cache priority, so that when a user subsequently queries data, the data in higher demand is guaranteed to be in the cache region and the data query speed is improved.
104. Placing the classified data to be cached into a cache region in order of cache priority from high to low.
The cache priority corresponds to the order in which the data to be cached is stored into the cache region. It should be noted that when the storage space in the cache region is greater than or equal to the data to be cached, all of the classified data to be cached is placed into the cache region in order of cache priority from high to low; when the storage space in the cache region is smaller than the data to be cached, the data with high priority is stored into the cache region preferentially and the data with low priority is not stored. This ensures the quality of the data stored in the cache region, stores the data with a high access probability preferentially, and avoids wasting the data resources of the cache region.
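As a concrete illustration of the capacity handling just described, the sketch below fills a bounded cache region in priority order and simply skips the data that no longer fits. The dictionary-based cache region, the item.key attribute, and the convention that a smaller numeric priority value means a higher cache priority are assumptions made for illustration only.

```python
# Minimal sketch: fill a bounded cache region in order of cache priority.
# `priorities` maps item keys to numeric priorities where a smaller value
# is assumed to mean a higher cache priority; `capacity` is expressed as a
# number of items for simplicity (sizes in bytes would work the same way).
def fill_cache_region(classified_items, priorities, capacity):
    ordered = sorted(classified_items, key=lambda item: priorities[item.key])
    cache_region = {}
    for item in ordered:
        if len(cache_region) >= capacity:
            break  # lower-priority data to be cached is not stored
        cache_region[item.key] = item
    return cache_region
```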
In this way, the data to be cached is classified according to the data characteristics corresponding to it to obtain classified data to be cached, the cache priority corresponding to the classified data is determined for data of different classifications, and the classified data is placed into the cache region in order of cache priority from high to low. Compared with the prior-art approach of putting the data to be cached directly into the storage area, the embodiment of the invention makes the data in the cache region convenient for subsequent repeated access: before the data is placed into the storage area, it is classified according to its data characteristics and placed into the cache region in order of cache priority from high to low. This ensures that the data resources of the cache region are fully utilized in subsequent data queries, so that when a user later queries data, the required data can be found preferentially in the cache region, and data query performance is improved.
To describe the data caching method proposed by the present invention in more detail, in particular the steps of classifying the data to be cached according to its data characteristics and determining the cache priority corresponding to the classified data to be cached, an embodiment of the present invention further provides another data caching method. As shown in fig. 2, the specific steps of the method include:
201. Acquiring data to be cached.
When the data to be cached is web browsing data and a user requests that data for the first time, the cache region contains no cached data corresponding to the request, so the web browsing data has to be requested from the server; after the server returns it, the requested web browsing data is placed into the cache region, so that when the user requests the same web browsing data again, it can be served directly from the cache region.
For the embodiment of the invention, an embedded monitoring script in the system or application program can capture data that is accessed for the first time and may be reused by the system or application; such data is taken as data to be cached and placed into the cache region, so that when a user accesses the data again later it can be read directly and quickly from the cache region.
202. Classifying the data to be cached according to the data characteristics corresponding to the data to be cached to obtain the classified data to be cached.
The data characteristics may include multiple dimensions, each of which reflects the data to be cached from characteristics of different aspects, where the dimensions may be a storage device type corresponding to the data to be cached, a display position of the data to be cached in the storage device, a page attribute characteristic corresponding to the data to be cached, or a service module characteristic corresponding to the data to be cached, which is not limited in the embodiment of the present invention.
For the embodiment of the present invention, when the data feature is the type of the storage device corresponding to the data to be cached, the data to be cached may be classified in a first dimension according to the type of the storage device corresponding to the data to be cached, and the type of the storage device corresponding to the data to be cached may be specifically classified into a mobile phone terminal, a PC terminal, an external interface terminal, and the like; when the data features are the display positions of the data to be cached in the storage device, the data to be cached can be classified in a second dimension according to the display positions of the data to be cached in the storage device, and the display positions of the data to be cached in the storage device can be specifically divided into a first page, a second page or a third page of a PC (personal computer) end, a first screen or a second screen of a mobile phone terminal and the like; when the data features are page attribute features corresponding to the data to be cached, performing third-dimension classification on the data to be cached according to the page attribute features corresponding to the data to be cached, wherein the page attribute features corresponding to the data to be cached can be specifically divided into default pages for displaying latest data by default, query pages for user-defined query and the like; when the data characteristics are service module characteristics corresponding to the data to be cached, the data to be cached can be classified according to the fourth dimension according to the service module characteristics corresponding to the data to be cached, and the service module characteristics corresponding to the data to be cached can be specifically classified into a leading cockpit module, a summary module and the like.
It should be noted that the labels first, second, third, and fourth dimension do not imply any particular order, and dimensions may be added or removed according to user requirements; the embodiment of the present invention is not limited in this respect.
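To make the four dimensions concrete, the sketch below tags one piece of data to be cached with a category per dimension. The field names and category values are assumptions drawn from the examples above (mobile phone terminal, PC terminal, first page, default page, summary module, and so on), not a data model prescribed by the patent.

```python
# Hypothetical representation of classified data to be cached: one category
# per classification dimension. The category values mirror the examples in
# the text and are not an exhaustive list.
from dataclasses import dataclass

@dataclass
class ClassifiedItem:
    key: str               # identifier of the data to be cached
    device_type: str       # dimension 1: "mobile", "pc", "external_interface", ...
    display_position: str  # dimension 2: "page_1", "page_2", "screen_1", "screen_2", ...
    page_attribute: str    # dimension 3: "default_page" or "query_page"
    business_module: str   # dimension 4: "summary", "leading_cockpit", ...

def classify(key, features):
    """Map the raw data characteristics of an item onto the four dimensions."""
    return ClassifiedItem(
        key=key,
        device_type=features["device_type"],
        display_position=features["display_position"],
        page_attribute=features["page_attribute"],
        business_module=features["business_module"],
    )
```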
203. Pre-configuring a weight value and an evaluation score of the data to be cached in each dimension classification.
Because the data to be cached in different classifications correspond to different data characteristics, in the embodiment of the invention, before the data to be cached is put into the cache, the importance degree of the data to be cached with different data characteristics is divided by pre-configuring the weight value and the evaluation score of the data to be cached in each dimension classification.
It should be noted that, the sum of the weight values of the data to be cached in each dimension classification is 1, and if the data to be cached includes 4 dimensions, the type of the storage device corresponding to the data to be cached, the display position of the data to be cached in the storage device, the page attribute feature corresponding to the data to be cached, and the service module feature corresponding to the data to be cached, the sum of the weight values of the data to be cached in the 4 dimension classifications is 1.
Illustratively, with 100 points as the full evaluation score and the values configured according to business requirements: in the first dimension classification, the evaluation score for the mobile phone terminal is 60, for the PC terminal 50, and for the third-party interface terminal 20; in the second dimension classification, the evaluation score for a display position on the first page is 80, on the first screen 60, on the second page 60, and on the second screen 70; in the third dimension classification, the evaluation score for "yes" under whether default data is displayed is 80 and for "no" is 10; and in the fourth dimension classification, the evaluation score for the summary module is 90 and for the leading cockpit module is 85.
It should be understood that, the weighted value and the evaluation score of the data to be cached in each dimension classification may be configured in advance by an administrator, or may be adjusted in real time according to actual service requirements, which is not limited in the embodiment of the present invention.
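The pre-configured weights and evaluation scores can be kept in a simple table. In the sketch below, the scores follow the illustrative figures in the preceding paragraphs; the weight values, which the text only requires to sum to 1 across the four dimensions, are invented for illustration and would in practice be set by an administrator.

```python
# Assumed dimension weights (summing to 1 across the four dimensions) and
# per-category evaluation scores out of 100, following the example above.
DIMENSION_WEIGHTS = {
    "device_type": 0.3,
    "display_position": 0.3,
    "page_attribute": 0.2,
    "business_module": 0.2,
}

EVALUATION_SCORES = {
    "device_type": {"mobile": 60, "pc": 50, "external_interface": 20},
    "display_position": {"page_1": 80, "page_2": 60, "screen_1": 60, "screen_2": 70},
    "page_attribute": {"default_page": 80, "query_page": 10},
    "business_module": {"summary": 90, "leading_cockpit": 85},
}
```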
204. Obtaining the historical query frequency corresponding to the data to be cached by querying the historical cache records of the cache region.
Different query frequencies reflect how much the data to be cached is needed. If the historical query frequency corresponding to the data to be cached is high, the demand for users to access that data recently has been high, and caching it makes full use of the data resources of the cache region; conversely, if the historical query frequency is low, recent demand for the data is low, and caching it would not make full use of the data resources of the cache region.
For the embodiment of the present invention, each time a user queries data from the cache region, the cache region stores a query record of the user, for example, the number of times that the cache data a is recently accessed by the user is 0, and the number of times that the cache data b is recently accessed by the user is 10.
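A minimal sketch of how the cache region might keep the historical query records described above; the class and method names are assumptions for illustration only.

```python
# Hypothetical bookkeeping for historical query frequency: every query a
# user makes against the cache region increments a counter for that item.
from collections import defaultdict

class QueryHistory:
    def __init__(self):
        self._counts = defaultdict(int)

    def record_query(self, key):
        # Called each time a user queries an item from the cache region.
        self._counts[key] += 1

    def frequency(self, key):
        # Historical query frequency; e.g. cached data "a" -> 0 and "b" -> 10
        # in the example above.
        return self._counts[key]
```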
205. Determining the importance evaluation value of the data to be cached according to the weight value and the evaluation score of the data to be cached in each dimension classification and the historical query frequency corresponding to the data to be cached.
For the embodiment of the present invention, the weight value and evaluation score of the data to be cached in each dimension classification and the historical query frequency corresponding to the data can all indicate how much the data to be cached is needed, so the importance evaluation value can be determined from these quantities. Specifically, the importance evaluation value of the data to be cached can be calculated as the sum, over the dimension classifications, of the product of the weight value and the evaluation score in each dimension classification, plus the product of a configured data query frequency weight and the historical query frequency corresponding to the data to be cached.
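One plausible reading of this calculation, reusing the DIMENSION_WEIGHTS, EVALUATION_SCORES, ClassifiedItem, and QueryHistory sketches above: the importance evaluation value is the sum of weight times score over the four dimensions, plus a configured query-frequency weight times the historical query frequency. The value of the frequency weight is an assumption; the patent leaves it to configuration.

```python
QUERY_FREQUENCY_WEIGHT = 0.5  # assumed configuration value

def importance_evaluation_value(item, history):
    """Weighted sum of per-dimension scores plus weighted historical query frequency."""
    value = 0.0
    for dimension, weight in DIMENSION_WEIGHTS.items():
        category = getattr(item, dimension)
        value += weight * EVALUATION_SCORES[dimension][category]
    value += QUERY_FREQUENCY_WEIGHT * history.frequency(item.key)
    return value
```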
206a, when the data to be cached carries indication information of a designated caching priority, determining the caching priority for the data to be cached according to the indication information.
It should be noted that when the data to be cached carries indication information specifying the cache priority, the user already knew, when querying the data, that the queried data may be used repeatedly in subsequent operations, so the cache priority is determined for the data to be cached according to the indication information sent by the user at query time.
The cache priority may be represented as a numerical value, where a larger value means a lower priority; for example, a priority value of 10 indicates a higher cache priority than a value of 155. For the embodiment of the present invention, the cache priority may be set to a value from 1 to 1000, serving as a way for the user to manually and individually mark the importance of data to be cached, and every piece of data to be cached must have a numerical value corresponding to its cache priority.
206b, executed in parallel with step 206a: when the data to be cached does not carry indication information specifying a caching priority, determining the caching priority for the data to be cached according to the importance evaluation value of the data to be cached.
It should be noted that when the data to be cached does not carry indication information specifying the caching priority, the user did not know, when querying the data, that the queried data may be used repeatedly in subsequent operations, so the caching priority is determined for the data to be cached according to its importance evaluation value.
If the user does not manually select the numerical value of the cache priority, the embodiment of the invention determines the cache priority for the data to be cached according to the calculated importance evaluation value of the data to be cached.
It should be noted that, in general, a cache priority given by user indication information is higher than a calculated cache priority, and the cache priority is recorded in the data table in real time along with the data to be cached. Because the amount of user-indicated data to be cached in each data table is not fixed, a certain number of entries for user-indicated data is reserved in each data table, for example 100 or 1000, without specific limitation. A calculated cache priority, on the other hand, is updated in real time as attributes of the data to be cached change; for example, the cache priority of a piece of data to be cached may change after its historical query frequency changes.
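A sketch combining steps 206a and 206b under the numeric convention described above (values 1 to 1000, smaller meaning higher priority), reusing importance_evaluation_value from the sketch above. The linear mapping from the importance evaluation value onto that scale is an assumption made for illustration; the text only states that the priority is determined from the evaluation value and that a user-indicated priority generally ranks higher than a calculated one.

```python
def determine_cache_priority(item, history, indicated_priority=None):
    """Return a cache priority in 1..1000, where a smaller value means a higher priority."""
    if indicated_priority is not None:
        # Step 206a: the data to be cached carries indication information
        # specifying the cache priority; use it directly.
        return max(1, min(1000, indicated_priority))

    # Step 206b: derive the priority from the importance evaluation value.
    # Higher evaluation values map to smaller (higher-priority) numbers;
    # this particular mapping is illustrative only.
    value = importance_evaluation_value(item, history)
    return max(1, min(1000, int(round(1000 - value))))
```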
207. Respectively determining the classified data to be cached and the capacity of the cache region.
Since the cache region is only used to store data that a user may access frequently, its capacity is usually small in order to guarantee the user's data query speed, and if the volume of data to be cached is too large, not all of it can be placed into the cache region. Therefore, before the data to be cached is stored into the cache region, the classified data to be cached and the capacity of the cache region are usually determined separately, so that how much data to place into the cache region can be decided according to the capacity of the cache region.
208. If the classified data to be cached is larger than the capacity of the cache region, receiving an input data screening condition, screening out at least part of the cache data from the classified data to be cached according to the data screening condition, and performing cache priority ordering on the screened at least part of the cache data.
For the embodiment of the invention, if the classified data to be cached is larger than the capacity of the cache region, an input data screening condition is received. The screening condition can be defined by the user; when there is a great deal of content to be cached, the data screening condition entered by the user can screen out at least the part of the data the user cares about most, and the screened data to be cached is stored into the cache region after being ranked by cache priority, so that the limited space of the cache region is fully utilized.
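A sketch of step 208, assuming the data screening condition arrives as a predicate over a classified item (for example, keep only mobile-terminal data); the survivors are then ranked by cache priority as in the sketches above.

```python
def screen_and_rank(classified_items, priorities, capacity, screening_condition):
    """If the classified data exceeds the cache capacity, apply the user's
    screening condition and rank the remaining items by cache priority
    (smaller numeric value first, i.e. higher priority first)."""
    if len(classified_items) <= capacity:
        return sorted(classified_items, key=lambda item: priorities[item.key])
    screened = [item for item in classified_items if screening_condition(item)]
    return sorted(screened, key=lambda item: priorities[item.key])

# Example of a hypothetical user-supplied screening condition:
# keep_mobile = lambda item: item.device_type == "mobile"
```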
209. Placing the classified data to be cached into the cache region in order of cache priority from high to low.
The cache priority corresponds to the order in which the data to be cached is stored into the cache region. It should be noted that when the storage space in the cache region is greater than or equal to the data to be cached, all of the classified data to be cached is placed into the cache region in order of cache priority from high to low; when the storage space in the cache region is smaller than the data to be cached, the data with high priority is stored into the cache region preferentially and the data with low priority is not stored. This ensures the quality of the data stored in the cache region, stores the data with a high access probability preferentially, and avoids wasting the data resources of the cache region.
210. Determining the cache priority of the updated data according to the updating condition of the data to be cached.
For the embodiment of the invention, after the data to be cached has been stored in the cache region, that data continues to be updated in real time, and the cache priority corresponding to it is continuously updated as well; the cache priority of the updated data is then determined, so that the state of the data to be cached is known in real time.
211. Updating the cache data in the cache region according to the cache priority of the updated data.
It should be noted that if some data to be cached has a higher cache priority than cache data already in the cache region, its utilization is higher than that of the data in the cache region. The cache data in the cache region is then updated according to the cache priority of the updated data, and cache data with a lower cache priority in the cache region is replaced by data to be cached with a higher cache priority, thereby making full use of the data resources in the cache region and improving data query performance.
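A sketch of steps 210 and 211 under the same assumptions: when the priority of a piece of data is recalculated (for example because its historical query frequency changed), it may replace the lowest-priority entry currently held in a full cache region. The replacement rule shown is one straightforward reading of the description, not the only possible one.

```python
def update_cache_region(cache_region, priorities, item, new_priority, capacity):
    """Insert or keep `item` according to its updated cache priority.
    Smaller priority values are assumed to mean a higher cache priority."""
    priorities[item.key] = new_priority
    if item.key in cache_region or len(cache_region) < capacity:
        cache_region[item.key] = item
        return True

    # The cache region is full: find the cached entry with the lowest
    # cache priority (largest numeric value) and replace it if the
    # updated data outranks it.
    worst_key = max(cache_region, key=lambda k: priorities[k])
    if new_priority < priorities[worst_key]:
        del cache_region[worst_key]
        cache_region[item.key] = item
        return True
    return False  # the updated data does not outrank anything already cached
```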
According to the other data caching method provided by the embodiment of the invention, the data to be cached is classified according to the data characteristics corresponding to it to obtain classified data to be cached, the cache priority corresponding to the classified data is determined for data of different classifications, and the classified data is placed into the cache region in order of cache priority from high to low. Compared with the prior-art approach of putting the data to be cached directly into the storage area, the embodiment of the invention makes the data in the cache region convenient for subsequent repeated access: before the data is placed into the storage area, it is classified according to its data characteristics and placed into the cache region in order of cache priority from high to low. This ensures that the data resources of the cache region are fully utilized in subsequent data queries, so that when a user later queries data, the required data can be found preferentially in the cache region, and data query performance is improved.
In order to achieve the above object, according to another aspect of the present invention, an embodiment of the present invention further provides a storage medium, where the storage medium includes a stored program, and when the program runs, a device on which the storage medium is located is controlled to execute the above data caching method.
In order to achieve the above object, according to another aspect of the present invention, an embodiment of the present invention further provides a processor, where the processor is configured to execute a program, where the program executes the above data caching method.
Further, as an implementation of the method shown in fig. 1 and fig. 2, another embodiment of the present invention further provides a data caching apparatus. The embodiment of the apparatus corresponds to the embodiment of the foregoing method, and for convenience of reading, details in the embodiment of the apparatus are not repeated one by one, but it should be clear that the apparatus in the embodiment can fully utilize the storage resource of the cache region to improve the query performance of data, and as shown in fig. 3 specifically, the apparatus includes:
an obtaining unit 301, configured to obtain data to be cached;
the classifying unit 302 may be configured to classify the data to be cached according to data characteristics corresponding to the data to be cached, so as to obtain the classified data to be cached;
a first determining unit 303, configured to determine a caching priority corresponding to the classified data to be cached;
the buffering unit 304 may be configured to put the classified data to be buffered into a buffer area according to the order of the buffering priority from high to low.
The embodiment of the invention provides a data caching device, which classifies the data to be cached according to the data characteristics corresponding to it to obtain classified data to be cached, determines the cache priority corresponding to the classified data for data of different classifications, and places the classified data into the cache region in order of cache priority from high to low. Compared with the prior-art approach of putting the data to be cached directly into the storage area, the embodiment of the invention makes the data in the cache region convenient for subsequent repeated access: before the data is placed into the storage area, it is classified according to its data characteristics and placed into the cache region in order of cache priority from high to low. This ensures that the data resources of the cache region are fully utilized in subsequent data queries, so that when a user later queries data, the required data can be found preferentially in the cache region, and data query performance is improved.
Further, as shown in fig. 4, the apparatus further includes:
a configuration unit 305, configured to pre-configure a weight value and an evaluation score of the data to be cached in each dimension classification;
the query unit 306 may be configured to obtain a historical query frequency corresponding to the data to be cached by querying a historical cache record of a cache region;
the second determining unit 307 may be configured to determine an importance evaluation value of the data to be cached according to a weight value and an evaluation score of the data to be cached in each dimension classification and a history query frequency corresponding to the data to be cached.
A third determining unit 308, configured to determine the classified data to be cached and the capacity of the cache region respectively;
the screening unit 309 may be configured to receive an input data screening condition if the classified data to be cached is greater than the capacity of the cache region, screen at least part of the cached data from the classified data to be cached according to the data screening condition, and perform cache priority ordering on the screened at least part of the cached data;
a fourth determining unit 310, configured to determine, after the classified data to be cached is placed in a cache region according to the order from high to low of the cache priority, a cache priority of the updated data according to an update condition of the data to be cached;
the updating unit 311 may be configured to update the cache data in the cache region according to the cache priority of the updated data.
Further, the data features include a plurality of dimensions, and the classification unit 302 includes:
the first classification module 3021 may be configured to perform first dimension classification on the data to be cached according to a storage device type corresponding to the data to be cached;
the second classification module 3022 may be configured to perform second-dimension classification on the data to be cached according to different display positions of the data to be cached in the storage device;
a third classification module 3023, configured to perform third-dimension classification on the data to be cached according to a page attribute feature corresponding to the data to be cached;
the fourth classifying module 3024 may be configured to perform fourth dimension classification on the data to be cached according to the service module characteristic corresponding to the data to be cached.
Further, the first determining unit 303 may be specifically configured to determine, when the data to be cached carries indication information that specifies a cache priority, the cache priority for the data to be cached according to the indication information;
the first determining unit 303 may be further specifically configured to determine, when the to-be-cached data does not carry indication information that specifies a cache priority, a cache priority for the to-be-cached data according to an importance evaluation value of the to-be-cached data, where the importance evaluation value is calculated according to a weight value and an evaluation score corresponding to each dimension classification.
According to the other data caching device provided by the embodiment of the invention, the data to be cached is classified according to the data characteristics corresponding to it to obtain classified data to be cached, the cache priority corresponding to the classified data is determined for data of different classifications, and the classified data is placed into the cache region in order of cache priority from high to low. Compared with the prior-art approach of putting the data to be cached directly into the storage area, the embodiment of the invention makes the data in the cache region convenient for subsequent repeated access: before the data is placed into the storage area, it is classified according to its data characteristics and placed into the cache region in order of cache priority from high to low. This ensures that the data resources of the cache region are fully utilized in subsequent data queries, so that when a user later queries data, the required data can be found preferentially in the cache region, and data query performance is improved.
The data caching device comprises a processor and a memory, the acquiring unit 301, the classifying unit 302, the first determining unit 303, the caching unit 304 and the like are stored in the memory as program units, and the processor executes the program units stored in the memory to realize corresponding functions.
The processor comprises a kernel, and the kernel calls the corresponding program unit from the memory. There may be one or more kernels; by adjusting kernel parameters, the storage resources of the cache region are fully utilized and the query performance of data is improved.
The memory may include forms of volatile memory in a computer readable medium, such as random access memory (RAM), and/or non-volatile memory, such as read-only memory (ROM) or flash memory (flash RAM); the memory includes at least one memory chip.
An embodiment of the present invention provides a storage medium on which a program is stored, and the program, when executed by a processor, implements the data caching method described above.
The embodiment of the invention provides a processor, which is used for running a program, wherein the data caching method is executed when the program runs.
The embodiment of the invention provides equipment, which comprises a processor, a memory and a program which is stored on the memory and can run on the processor, wherein the processor executes the program and realizes the following steps:
a data caching method comprises the following steps: acquiring data to be cached; classifying the data to be cached according to the data characteristics corresponding to the data to be cached to obtain the classified data to be cached; determining the cache priority corresponding to the classified data to be cached; and putting the classified data to be cached into a cache region according to the sequence of the cache priority from high to low.
Further, the data characteristics include multiple dimensions, and classifying the data to be cached according to the data characteristics corresponding to the data to be cached includes: according to the type of the storage equipment corresponding to the data to be cached, performing first dimension classification on the data to be cached; according to different display positions of the data to be cached in the storage device, carrying out second dimension classification on the data to be cached; performing third-dimension classification on the data to be cached according to the page attribute characteristics corresponding to the data to be cached; and according to the service module characteristics corresponding to the data to be cached, carrying out fourth dimensional classification on the data to be cached.
Further, the determining the cache priority corresponding to the classified data to be cached includes: when the data to be cached carries indication information of a specified caching priority, determining the caching priority for the data to be cached according to the indication information; when the data to be cached does not carry indication information for designating the caching priority, determining the caching priority for the cached data according to an importance evaluation value of the data to be cached, wherein the importance evaluation value is obtained by calculating a weight value and an evaluation score corresponding to each dimension classification.
Further, before determining the caching priority for the cached data according to the importance evaluation value of the data to be cached, the method further includes: pre-configuring a weight value and an evaluation score of the data to be cached in each dimension classification; obtaining a historical query frequency corresponding to the data to be cached by querying a historical cache record of a cache region; and determining the importance evaluation value of the data to be cached according to the weight value and the evaluation score of the data to be cached in each dimension classification and the historical query frequency corresponding to the data to be cached.
Further, after the data to be cached is classified according to the data characteristics corresponding to the data to be cached to obtain the classified data to be cached, the method further includes: respectively determining the classified data to be cached and the capacity of the cache region; if the classified data to be cached is larger than the capacity of the cache region, receiving an input data screening condition, screening out at least part of cache data from the classified data to be cached according to the data screening condition, and performing cache priority ordering on the screened at least part of cache data; after the classified data to be cached is placed in a cache region according to the order of the cache priority from high to low, the method further comprises the following steps: determining the cache priority of the updated data according to the updating condition of the data to be cached; and updating the cache data in the cache region according to the cache priority of the updated data.
The device herein may be a server, a PC, a PAD, a mobile phone, etc.
The present application further provides a computer program product adapted to perform program code for initializing the following method steps when executed on a data processing device: acquiring data to be cached; classifying the data to be cached according to the data characteristics corresponding to the data to be cached to obtain the classified data to be cached; determining the cache priority corresponding to the classified data to be cached; and putting the classified data to be cached into a cache region according to the sequence of the cache priority from high to low.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include forms of volatile memory in a computer readable medium, such as random access memory (RAM), and/or non-volatile memory, such as read-only memory (ROM) or flash memory (flash RAM). The memory is an example of a computer-readable medium.
Computer-readable media, including both permanent and non-permanent, removable and non-removable media, may implement information storage by any method or technology. The information may be computer readable instructions, data structures, modules of a program, or other data. Examples of computer storage media include, but are not limited to, phase change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information that can be accessed by a computing device. As defined herein, computer-readable media does not include transitory computer-readable media such as modulated data signals and carrier waves.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
The above are merely examples of the present application and are not intended to limit the present application. Various modifications and changes may occur to those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present application should be included in the scope of the claims of the present application.

Claims (10)

1. A method for caching data, comprising:
acquiring data to be cached;
classifying the data to be cached according to the data characteristics corresponding to the data to be cached to obtain the classified data to be cached;
respectively determining the size of the classified data to be cached and the capacity of a cache region;
if the size of the classified data to be cached is larger than the capacity of the cache region, receiving an input data screening condition, screening out at least part of the cache data from the classified data to be cached according to the data screening condition, and performing cache priority ordering on the screened-out cache data;
determining the cache priority corresponding to the classified data to be cached;
and when the available storage space in the cache region is smaller than the size of the data to be cached, putting the classified data to be cached into the cache region in order of cache priority from high to low.
2. The method according to claim 1, wherein the data characteristics include a plurality of dimensions, and the classifying the data to be cached according to the data characteristics corresponding to the data to be cached includes:
performing first dimension classification on the data to be cached according to the type of the storage device corresponding to the data to be cached;
performing second dimension classification on the data to be cached according to the display position of the data to be cached in the storage device;
performing third dimension classification on the data to be cached according to the page attribute characteristics corresponding to the data to be cached;
and performing fourth dimension classification on the data to be cached according to the service module characteristics corresponding to the data to be cached.
3. The method according to claim 1, wherein the determining the caching priority corresponding to the classified data to be cached comprises:
when the data to be cached carries indication information specifying a caching priority, determining the caching priority for the data to be cached according to the indication information;
when the data to be cached does not carry indication information specifying a caching priority, determining the caching priority for the data to be cached according to an importance evaluation value of the data to be cached, wherein the importance evaluation value is calculated from a weight value and an evaluation score corresponding to each dimension classification.
4. The method according to claim 3, wherein before determining the caching priority for the data to be cached according to the importance evaluation value of the data to be cached, the method further comprises:
pre-configuring a weight value and an evaluation score of the data to be cached in each dimension classification;
obtaining a historical query frequency corresponding to the data to be cached by querying a historical cache record of a cache region;
and determining the importance evaluation value of the data to be cached according to the weight value and the evaluation score of the data to be cached in each dimension classification and the historical query frequency corresponding to the data to be cached.
5. The method according to any one of claims 1-4, wherein after placing the classified data to be cached into the cache region in order of cache priority from high to low, the method further comprises:
determining the cache priority of the updated data according to the updating condition of the data to be cached;
and updating the cache data in the cache region according to the cache priority of the updated data.
6. An apparatus for caching data, comprising:
the acquisition unit is used for acquiring data to be cached;
the classification unit is used for classifying the data to be cached according to the data characteristics corresponding to the data to be cached to obtain the classified data to be cached;
the first determining unit is used for determining the cache priority corresponding to the classified data to be cached;
the cache unit is used for placing the classified data to be cached into a cache region in order of cache priority from high to low when the available storage space in the cache region is smaller than the size of the data to be cached;
a third determining unit, configured to determine the size of the classified data to be cached and the capacity of the cache region, respectively;
and the screening unit is used for receiving an input data screening condition if the size of the classified data to be cached is larger than the capacity of the cache region, screening out at least part of the cache data from the classified data to be cached according to the data screening condition, and performing cache priority ordering on the screened-out cache data.
7. The apparatus of claim 6, wherein the data features comprise a plurality of dimensions, and wherein the classification unit comprises:
the first classification module is used for performing first dimension classification on the data to be cached according to the type of the storage device corresponding to the data to be cached;
the second classification module is used for performing second dimension classification on the data to be cached according to the display position of the data to be cached in the storage device;
the third classification module is used for carrying out third dimension classification on the data to be cached according to the page attribute characteristics corresponding to the data to be cached;
and the fourth classification module is used for performing fourth dimension classification on the data to be cached according to the service module characteristics corresponding to the data to be cached.
8. The apparatus of claim 7,
the first determining unit is specifically configured to determine a cache priority for the data to be cached according to indication information when the data to be cached carries the indication information specifying the cache priority;
the first determining unit is further specifically configured to determine a cache priority for the data to be cached according to an importance evaluation value of the data to be cached when the data to be cached does not carry indication information specifying the cache priority, wherein the importance evaluation value is calculated from a weight value and an evaluation score corresponding to each dimension classification.
9. A storage medium, characterized in that the storage medium comprises a stored program, wherein when the program runs, a device in which the storage medium is located is controlled to execute the data caching method of any one of claims 1 to 5.
10. A processor, configured to execute a program, wherein the program executes a method for caching data according to any one of claims 1 to 5.
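Read together, claims 1 and 2 describe a pipeline: classify the data to be cached along four feature dimensions, compare its total size with the capacity of the cache region, screen it with an externally supplied condition when it does not fit, and fill the cache region from the highest cache priority downward. The Python sketch below is only an illustration of that reading; the Item fields, the concrete dimension values, and the classify/fill_cache names are assumptions for the example, not anything defined by the patent.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List, Tuple

@dataclass
class Item:
    key: str
    size: int                       # size of the cached value, in bytes
    priority: float = 0.0           # cache priority (see the sketch for claims 3-4)
    # Illustrative values for the four feature dimensions of claim 2:
    device_type: str = "pc"         # first dimension: storage-device type
    display_position: str = "home"  # second dimension: display position
    page_attribute: str = "report"  # third dimension: page attribute
    service_module: str = "query"   # fourth dimension: service module

def classify(items: List[Item]) -> Dict[Tuple[str, str, str, str], List[Item]]:
    """Claim 2: group the data to be cached along its four feature dimensions."""
    groups: Dict[Tuple[str, str, str, str], List[Item]] = {}
    for it in items:
        key = (it.device_type, it.display_position, it.page_attribute, it.service_module)
        groups.setdefault(key, []).append(it)
    return groups

def fill_cache(items: List[Item], capacity: int, free_space: int,
               screen: Callable[[Item], bool]) -> Dict[str, Item]:
    """Claim 1: screen the data when it exceeds the cache capacity, then admit it
    from the highest cache priority to the lowest while free space remains."""
    if sum(it.size for it in items) > capacity:
        items = [it for it in items if screen(it)]   # externally supplied screening condition
    cached: Dict[str, Item] = {}
    for it in sorted(items, key=lambda x: x.priority, reverse=True):
        if it.size > free_space:
            break                                    # stop once the next item no longer fits
        cached[it.key] = it
        free_space -= it.size
    return cached
```

classify() is kept separate from fill_cache() in this sketch because the grouping is what the per-dimension weight and score configuration of claims 3 and 4 would key on.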
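Claims 3 and 4 derive the cache priority either from explicit indication information or from an importance evaluation value built from per-dimension weight values, evaluation scores, and the historical query frequency. The claims do not fix how these factors are combined, so the weighted-sum-scaled-by-frequency formula below, along with the WEIGHTS and SCORES tables, is only one plausible illustration.

```python
from typing import Dict, Optional

# Pre-configured weight and evaluation score per dimension value (claim 4, first step).
# These tables are invented for the example; a deployment would configure its own.
WEIGHTS = {"device_type": 0.4, "display_position": 0.3,
           "page_attribute": 0.2, "service_module": 0.1}
SCORES = {
    "device_type":      {"pc": 80, "mobile": 60},
    "display_position": {"home": 90, "detail": 50},
    "page_attribute":   {"report": 70, "static": 40},
    "service_module":   {"query": 85, "export": 30},
}

def importance(features: Dict[str, str], history_hits: int) -> float:
    """Claim 4: combine per-dimension weight x score with the historical query
    frequency read from the cache region's history records."""
    weighted = sum(WEIGHTS[d] * SCORES[d].get(v, 0) for d, v in features.items())
    return weighted * (1 + history_hits)   # one plausible way to fold in frequency

def cache_priority(indicated: Optional[int], features: Dict[str, str],
                   history_hits: int) -> float:
    """Claim 3: an explicit indication wins; otherwise fall back to the
    importance evaluation value."""
    if indicated is not None:
        return float(indicated)
    return importance(features, history_hits)

# Example: no explicit indication, so the importance evaluation value decides.
p = cache_priority(None,
                   {"device_type": "pc", "display_position": "home",
                    "page_attribute": "report", "service_module": "query"},
                   history_hits=12)
```

With these sample tables the call evaluates to 81.5 x 13 = 1059.5, so frequently queried report data served to the PC home page would land near the top of the priority ordering.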
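Claim 5 keeps the cache region consistent after cached data is updated: the cache priority of the updated data is re-derived and the stored copy is refreshed. A minimal sketch, assuming a dict-based cache region and a lowest-priority-first eviction rule that the claim itself does not prescribe:

```python
from dataclasses import dataclass
from typing import Dict

@dataclass
class Entry:
    key: str
    size: int
    priority: float   # recomputed upstream from the update condition (claim 5, first step)

def refresh_on_update(cache: Dict[str, Entry], updated: Entry, free_space: int) -> int:
    """Claim 5 sketch: refresh the cache region according to the updated data's
    new cache priority, evicting lower-priority entries only if space runs short."""
    old = cache.pop(updated.key, None)
    if old is not None:
        free_space += old.size                       # reclaim the stale copy's space
    while updated.size > free_space and cache:
        victim = min(cache.values(), key=lambda e: e.priority)
        if victim.priority >= updated.priority:
            break                                    # never evict higher-priority data
        free_space += cache.pop(victim.key).size
    if updated.size <= free_space:                   # re-admit only if it still fits
        cache[updated.key] = updated
        free_space -= updated.size
    return free_space
```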
CN201711052725.5A 2017-10-30 2017-10-30 Data caching method and device Active CN110019361B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711052725.5A CN110019361B (en) 2017-10-30 2017-10-30 Data caching method and device

Publications (2)

Publication Number Publication Date
CN110019361A (en) 2019-07-16
CN110019361B (en) 2021-10-15

Family

ID=67186731

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711052725.5A Active CN110019361B (en) 2017-10-30 2017-10-30 Data caching method and device

Country Status (1)

Country Link
CN (1) CN110019361B (en)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110674432B (en) * 2019-09-09 2023-11-21 中国平安财产保险股份有限公司 Second-level caching method, device and computer readable storage medium
CN110716885B (en) * 2019-10-23 2022-02-18 北京字节跳动网络技术有限公司 Data management method and device, electronic equipment and storage medium
CN111159232A (en) * 2019-12-16 2020-05-15 浙江中控技术股份有限公司 Data caching method and system
CN111090653B (en) * 2019-12-20 2023-12-15 东软集团股份有限公司 Data caching method and device and related products
CN111125083B (en) * 2019-12-31 2020-11-10 北京金堤科技有限公司 Historical record screening method and device
TWI746261B (en) 2020-11-12 2021-11-11 財團法人工業技術研究院 Cache managing method and system based on session type
CN112702433B (en) * 2020-12-23 2022-08-02 南方电网电力科技股份有限公司 Data scheduling method and device for intelligent electric meter, intelligent electric meter and storage medium
CN114677782A (en) * 2020-12-24 2022-06-28 北京百度网讯科技有限公司 Information processing method, device, electronic equipment and storage medium

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102479249A (en) * 2010-11-26 2012-05-30 中国科学院声学研究所 Method for eliminating cache data of memory of embedded browser
CN102821113A (en) * 2011-06-07 2012-12-12 阿里巴巴集团控股有限公司 Cache method and system
CN103744367A (en) * 2013-12-20 2014-04-23 武汉钢铁(集团)公司 Production-line actual-performance information processing method, device and system
CN105279163A (en) * 2014-06-16 2016-01-27 Tcl集团股份有限公司 Buffer memory data update and storage method and system
CN105893173A (en) * 2015-12-10 2016-08-24 乐视网信息技术(北京)股份有限公司 Caching data processing method and device
CN106021126A (en) * 2016-05-31 2016-10-12 腾讯科技(深圳)有限公司 Cache data processing method, server and configuration device
CN106341467A (en) * 2016-08-30 2017-01-18 国网江苏省电力公司电力科学研究院 State analysis method of power utilization information collector based on big data parallel computing

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1505506A1 (en) * 2003-08-05 2005-02-09 Sap Ag A method of data caching
US8949368B2 (en) * 2006-05-12 2015-02-03 Citrix Systems, Inc. Method for cache object aggregation
CN102799583B (en) * 2011-05-23 2015-01-14 上海爱数软件有限公司 Object-oriented access method and system
US9031975B2 (en) * 2012-11-06 2015-05-12 Rockwell Automation Technologies, Inc. Content management
IN2013MU04016A (en) * 2013-12-23 2015-08-07 Tata Consultancy Services Ltd
CN106201917B (en) * 2016-07-08 2019-03-15 苏州华元世纪科技发展有限公司 A kind of data processing system and method

Also Published As

Publication number Publication date
CN110019361A (en) 2019-07-16

Similar Documents

Publication Publication Date Title
CN110019361B (en) Data caching method and device
US9020892B2 (en) Efficient metadata storage
US8782324B1 (en) Techniques for managing placement of extents based on a history of active extents
US9201810B2 (en) Memory page eviction priority in mobile computing devices
US10409728B2 (en) File access predication using counter based eviction policies at the file and page level
US11429630B2 (en) Tiered storage for data processing
US9495396B2 (en) Increased database performance via migration of data to faster storage
CN107430551B (en) Data caching method, storage control device and storage equipment
US9727479B1 (en) Compressing portions of a buffer cache using an LRU queue
US11288287B2 (en) Methods and apparatus to partition a database
CN108959510B (en) Partition level connection method and device for distributed database
JP2021501389A (en) Data hierarchy storage and hierarchy search method and device
US20210349918A1 (en) Methods and apparatus to partition a database
US9380126B2 (en) Data collection and distribution management
US10585802B1 (en) Method and system for caching directories in a storage system
CN107193754B (en) Method and apparatus for data storage for searching
US10482012B1 (en) Storage system and method of operating thereof
CN109783006B (en) Computing system and method of operating a computing system
CN115509437A (en) Storage system, network card, processor, data access method, device and system
US20100077147A1 (en) Methods for caching directory structure of a file system
US11468417B2 (en) Aggregated storage file service
CN113297267A (en) Data caching and task processing method, device, equipment and storage medium
CN111143711A (en) Object searching method and system
US20190227734A1 (en) Tracking information related to free space of containers
CN110727405A (en) Data processing method and device, electronic equipment and computer readable medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: 100083 No. 401, 4th Floor, Haitai Building, 229 North Fourth Ring Road, Haidian District, Beijing

Applicant after: Beijing Guoshuang Technology Co.,Ltd.

Address before: 100086 Floor 8, Block A, Cuigong Hotel, No. 76 Zhichun Road, Shuangyushu Area, Haidian District, Beijing

Applicant before: Beijing Guoshuang Technology Co.,Ltd.

GR01 Patent grant