CN104166649A - Caching method and device for search engine - Google Patents

Caching method and device for search engine

Info

Publication number
CN104166649A
Authority
CN
China
Prior art keywords
query
cache
search engine
list
score
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201310182204.7A
Other languages
Chinese (zh)
Other versions
CN104166649B (en)
Inventor
宋华青
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Alibaba (Shanghai) Co., Ltd
Original Assignee
Alibaba Group Holding Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Alibaba Group Holding Ltd filed Critical Alibaba Group Holding Ltd
Priority to CN201310182204.7A priority Critical patent/CN104166649B/en
Publication of CN104166649A publication Critical patent/CN104166649A/en
Application granted granted Critical
Publication of CN104166649B publication Critical patent/CN104166649B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/90 Details of database functions independent of the retrieved data types
    • G06F16/95 Retrieval from the web
    • G06F16/951 Indexing; Web crawling techniques

Abstract

The invention provides a caching method and device for a search engine. The method comprises the steps of: receiving a query and the elements obtained according to the query; generating a key value for the query based on the query; finding the list corresponding to the key value in a cache; scoring and saving the elements; and updating the cache. By caching the score and state information of each element, the technical scheme reduces the number of calls to the MLR module, thereby reducing the engine's computation load and improving its timeliness. By refining the composition of each query and normalizing the query, the cache hit rate is increased.

Description

Caching method and device for a search engine
Technical field
The present application relates to the field of search engines, and in particular to a caching method and device for a search engine.
Background technology
A search engine is a system that, according to a certain strategy and using specific computer programs, gathers information from the Internet, organizes and processes it, provides retrieval services for users, and displays the information relevant to a user's search to the user.
As search engines develop, the amount of data they handle keeps growing and business requirements become increasingly complex. Accordingly, the engine's computation module, namely the machine learning ranking module (Machine Learning Ranking, MLR), also becomes increasingly complex. This module often involves a variety of algorithmic models, requires a large amount of computation, consumes substantial CPU resources across many servers, and its performance problems become more prominent by the day.
The data referred to here are also called the retrieved elements (documents); in an actual engine, an element may be a basic unit such as a web page or a product listing.
Therefore, to improve query efficiency and reduce computation, existing search engines cache the result of a user's access, so that when an identical access arrives again, the corresponding result is fetched directly from the cache and returned to the user, reducing the load on the engine and improving the response time.
However, the cache hit rate of this approach is relatively low: if the user changes the composition of the query even slightly, there is no hit. In addition, this caching approach sacrifices a certain amount of freshness, so a result that hits the cache may differ from the current truth.
Summary of the invention
The main purpose of the present application is to provide a new caching scheme for a search engine, so as to solve the above problems of the prior art, wherein:
According to a first aspect of the present application, a caching method for a search engine is provided, comprising the steps of: receiving a query and the elements obtained according to the query; generating a key value for the query based on the query; finding the list corresponding to this key value in a cache; scoring and saving the elements; and updating the cache.
According to a second aspect of the present application, a caching device for a search engine is provided, comprising: receiving means for receiving a query and the elements obtained according to the query; generating means for generating a key value for the query based on the query; lookup means for finding the list corresponding to this key value in a cache; scoring means for scoring and saving the elements; and updating means for updating the cache.
Compared with the prior art, the technical scheme of the present application caches the score and state information of each element, which reduces the number of calls to the MLR module, thereby reducing the engine's computation load and improving its timeliness. The present application also refines the composition of each query and normalizes the query, which improves the cache hit rate.
Accompanying drawing explanation
The drawings described here are provided for a further understanding of the present application and form a part of it; the schematic embodiments of the present application and their description are used to explain the application and do not constitute an improper limitation of it. In the drawings:
Fig. 1 schematically shows an overall flowchart of the caching method for a search engine proposed by the present application;
Fig. 2 schematically shows the data structure in the cache according to an embodiment of the present application;
Fig. 3 schematically shows a detailed flowchart of the step of scoring and saving an element according to an embodiment of the present application;
Fig. 4 is a schematic diagram illustrating the technical effect of the method of the present application;
Fig. 5 schematically shows a structural block diagram of the caching device for a search engine according to an embodiment of the present application.
In these drawings, identical reference numbers denote identical or similar parts.
Embodiment
To make the purpose, technical scheme and advantages of the present application clearer, the application is described in further detail below with reference to the drawings and specific embodiments.
In a typical configuration, a computing device comprises one or more processors (CPUs), input/output interfaces, network interfaces and memory.
The memory may include volatile memory in a computer-readable medium, such as random access memory (RAM), and/or non-volatile memory, such as read-only memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media include permanent and non-permanent, removable and non-removable media, and can store information by any method or technology. The information may be computer-readable instructions, data structures, program modules or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technologies, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic tape cassettes, magnetic tape or magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory media, such as modulated data signals and carrier waves.
In the following description, references to "one embodiment", "an embodiment", "an example", "for example" and the like indicate that the embodiment or example so described may include a particular feature, structure, characteristic, property, element or limitation, but not every embodiment or example necessarily includes it. In addition, repeated use of the phrase "according to an embodiment of the present application" does not necessarily refer to the same embodiment, although it may.
For brevity, some technical features well known to those skilled in the art are omitted from the description below.
Fig. 1 schematically shows an overall flowchart of the caching method 100 for a search engine proposed by the present application.
In step 101, a query and the elements obtained according to the query are received. The query here refers to the query after intent recognition by some algorithm, used to distinguish the demands of different users. The elements (documents) here are the elements that the search engine has found to be relevant to the query.
In step 102, a key value for the query is generated based on the query. Specifically, step 102 further comprises the following steps:
Refining the query. A predefined configuration is loaded to identify the parameters relevant to scoring; the configuration contains the names of the scoring-relevant parameters. For example, consider the following three queries:
Query1:keyword=mp3&addr=Beijing&tab=pop&stat=cat
Query2:tab=normal&keyword=MP3&addr=Hangzhou&stat=attr
Query3:tab=pop&keyword=mP3
Here, keyword denotes the query term, addr the address filter, tab whether the search is a normal search or a popularity search, and stat the statistics item.
The parameters in the predefined configuration are keyword and tab. In this example, keyword and tab therefore serve as the parameters relevant to scoring.
Refining these three queries yields:
Query1:keyword=mp3&tab=pop
Query2:tab=normal&keyword=MP3
Query3:tab=pop&keyword=mP3
Normalizing. The parameters obtained by refinement are case-converted, sorted and combined into a new query. For the three refined queries above, the normalized new queries are:
Query1:keyword=mp3&tab=pop
Query2:keyword=mp3&tab=normal
Query3:keyword=mp3&tab=pop
Finally, the new query is signed to generate a cache key. Many signature algorithms are possible; using the MD5 algorithm here, the cache key obtained for each query is as follows:
Query1:8c2a5244cd5e650b9cb259de4351a887
Query2:e9aee0a751b60863f67a80b3b9f323b8
Query3:8c2a5244cd5e650b9cb259de4351a887
By refining and normalizing the composition of the query in this way, the cache hit rate can be improved.
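The description leaves the implementation open; a minimal sketch of this refine-normalize-sign step in Python, under the assumptions that the query arrives as a URL-style parameter string, that SCORING_PARAMS stands in for the predefined configuration, and that parameters are serialized as name=value pairs joined by "&" (so the exact digests need not match the values listed above, though Query1 and Query3 still share one key), could look like this:

import hashlib

# Hypothetical stand-in for the predefined configuration of scoring-relevant parameter names.
SCORING_PARAMS = {"keyword", "tab"}

def cache_key(query: str) -> str:
    # Parse a "name=value&name=value" query string into parameter pairs.
    pairs = (p.split("=", 1) for p in query.split("&") if "=" in p)
    # Refinement: keep only the parameters named in the configuration.
    kept = {name: value for name, value in pairs if name in SCORING_PARAMS}
    # Normalization: convert case and sort, then splice into a new query.
    normalized = "&".join(f"{name}={kept[name].lower()}" for name in sorted(kept))
    # Signature: hash the normalized query with MD5 to obtain the cache key.
    return hashlib.md5(normalized.encode("utf-8")).hexdigest()

# Query1 and Query3 normalize to the same string and therefore share a cache key.
assert cache_key("keyword=mp3&addr=Beijing&tab=pop&stat=cat") == cache_key("tab=pop&keyword=mP3")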
In step 103, the list corresponding to this key value is found in the cache. Fig. 2 schematically shows the data structure in the cache according to an embodiment of the present application. As can be seen from Fig. 2, the storage structure of the queried data in the cache of the present application uses a two-level index. In the cache 200, a user query corresponds to exactly one key value 201; through this key value 201 a list 202 of items can be found. The list 202 contains a number of items 203, 204, and each item comprises an element identifier, an element score and state information. The identifier is the unique label of an element; the element score is the score given to the element by the machine learning ranking module (MLR); and the state information records certain state at the time the element was scored. The state information can be summarized as two kinds of information: first, the state of the element, which can be understood as the element's update time when it entered the cache; second, information about the item, which can be understood as the time the item was placed into the cache. The number of items each list can hold has an upper limit. According to an embodiment of the present invention, each element may carry a timestamp recording the time the element was updated. If an element was updated at time t1 and not updated afterwards, and a query hit this element at time t2, then t1 is the element's update time when it entered the cache, and t2 is the time the corresponding item was placed into the cache. The state information of this item can therefore be stored as (t1, t2).
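One way to model this two-level index is sketched below; the names CacheItem, SearchCache and max_items_per_list are illustrative rather than taken from the description, and the outer mapping is kept in recency order to support the least-recently-used eviction described later in step 105.

from collections import OrderedDict
from dataclasses import dataclass
from typing import Dict

@dataclass
class CacheItem:
    doc_id: str      # unique element identifier (items 203, 204 in Fig. 2)
    score: float     # score assigned by the MLR module
    t1: float        # element's update time when it was scored into the cache
    t2: float        # time this item was placed into the cache

class SearchCache:
    # First-level index: cache key -> item list; second-level index: doc_id -> item.
    def __init__(self, max_items_per_list: int = 100):
        self.max_items = max_items_per_list                          # per-list upper limit
        self.lists: "OrderedDict[str, Dict[str, CacheItem]]" = OrderedDict()

    def find_list(self, key: str) -> Dict[str, CacheItem]:
        # Step 103: find the list corresponding to the cache key, creating an
        # empty one if this query has never been cached before.
        items = self.lists.setdefault(key, {})
        self.lists.move_to_end(key)      # mark the list as most recently used
        return items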
In step 104, the elements are scored and saved. Fig. 3 schematically shows a detailed flowchart of the step of scoring and saving an element according to an embodiment of the present application; with reference to Fig. 3, step 104 may further comprise the following steps 301-303.
In step 301, it is judged whether the element is in the item list. If it is, the method proceeds to step 302; otherwise it goes to step 303.
In step 302, it is checked whether the element's score is valid. If it is, the element is not re-scored, i.e. its original score is kept unchanged and the method of Fig. 3 ends directly; otherwise, the method proceeds to step 303. According to an embodiment of the present invention, suppose an access at time t3 hits an element whose item state information is (t1, t2); the following two checks are performed:
1) First, the element's update time is compared with the element state recorded in the item; if both are t1, the element has not been updated in the period from t1 to t3;
2) Then it is checked whether the interval between the time t2 in the item's state information and the time t3 exceeds a predefined threshold; if it does not, the item is considered valid.
If the element has not been updated in the period from t1 to t3 and the item is valid, the element's score is considered valid.
In step 303, the machine learning ranking module (MLR) is called to re-score the element, and the score is saved. The method of Fig. 3 then ends.
It should be noted that each of the elements obtained according to the query can be judged, scored and saved according to the steps described here.
As the flow of Fig. 3 shows, when scoring elements the present application scores only those elements that are not currently in the item list or whose state information is invalid; for elements that are in the item list with valid state information, the CPU-intensive MLR module is not called. This reduces the number of times the MLR module is called, lowers the CPU load and improves the response speed.
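A minimal sketch of steps 301-303, assuming items is a dictionary mapping doc_id to entries of the form {"score", "t1", "t2"}, call_mlr stands for a call into the MLR module (its real interface is not given in the description), and STALENESS_THRESHOLD is an illustrative value for the predefined threshold on t3 - t2:

import time

STALENESS_THRESHOLD = 300.0   # assumed threshold in seconds; the description does not fix a value

def score_is_valid(item: dict, doc_updated: float, now: float) -> bool:
    # Check 1: the element has not been updated since it was scored,
    # i.e. its current update time still equals the recorded t1.
    not_updated = (doc_updated == item["t1"])
    # Check 2: the item has not been in the cache longer than the threshold (t3 - t2).
    fresh = (now - item["t2"]) <= STALENESS_THRESHOLD
    return not_updated and fresh

def score_element(items: dict, doc_id: str, doc_updated: float, call_mlr) -> float:
    # Steps 301-303: reuse a valid cached score, otherwise call the MLR module and save.
    now = time.time()
    item = items.get(doc_id)                                   # step 301: is the element in the list?
    if item is not None and score_is_valid(item, doc_updated, now):
        return item["score"]                                   # step 302: keep the original score
    score = call_mlr(doc_id)                                   # step 303: re-score via the MLR
    items[doc_id] = {"score": score, "t1": doc_updated, "t2": now}
    return score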
In step 105, the cache is updated. Step 105 further comprises the following sub-steps (a sketch combining them is given after this list):
Updating the state information of the elements. For items that have been re-scored, their state information also needs to be updated and saved accordingly.
Retaining the items with high element scores in the list, according to the upper limit on the number of items the list can hold. Because the number of elements under an item list is large, not all elements can be kept in the cache; therefore, according to the latest scores and this upper limit, the elements with higher scores are retained in the list and the elements with lower scores are removed from it. For example, if the upper limit is 100, only the 100 highest-scoring elements are retained and the other elements are removed from the list.
Determining whether a list is retained in the cache according to the least-recently-used (LRU) rule. After the items in the corresponding list have been updated, the least recently used lists are evicted from the cache according to the LRU rule.
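Continuing the sketch, and assuming the same dictionary layout as above, an illustrative upper limit of 100 items per list and an assumed ceiling of 10000 lists for the LRU rule (the description only states that such limits exist), the update step might look like this:

from collections import OrderedDict

def update_cache(lists: "OrderedDict[str, dict]", key: str, items: dict,
                 max_items_per_list: int = 100, max_lists: int = 10000) -> None:
    # Keep only the highest-scoring items, up to the per-list upper limit;
    # re-scored items are assumed to carry their refreshed state (t1, t2) already.
    top = sorted(items.items(), key=lambda kv: kv[1]["score"], reverse=True)
    lists[key] = dict(top[:max_items_per_list])
    lists.move_to_end(key)                  # this list was just used, so it is most recent
    # Evict least-recently-used lists once the cache holds too many of them.
    while len(lists) > max_lists:
        lists.popitem(last=False)           # the oldest (least recently used) list is removed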
Fig. 4 is a schematic diagram illustrating the technical effect of the method of the present application; this effect is explained below with reference to Fig. 4.
In the figure, query=mobile phone indicates that the user's query is "mobile phone"; doc1 and doc2 are the two elements obtained according to the query, and the numbers following doc1 and doc2 (900, 1000, 1100) are the current scores of those elements.
As shown in Fig. 4A, when the user queries "mobile phone" for the first time, the scores of doc1 and doc2 are obtained by calling the computation module MLR, cached, and the computed results are returned to the user. This query calls the MLR twice.
As shown in Fig. 4B, when the user queries "mobile phone" for the second time, there is no need to call the MLR to obtain the scores of doc1 and doc2; they are read directly from the cache and returned to the user. This query does not call the MLR at all.
As shown in Fig. 4C, if an external module updates the content of doc2 after the second "mobile phone" query and before the third, the caching method of the present application will not call the MLR to recompute doc2's score, because no new "mobile phone" query has arrived yet. That is, the change to doc2 is not reflected in the cache immediately: before the next query arrives, the method keeps the element's score unchanged even if the content of the element corresponding to the previous query has been updated.
As shown in Fig. 4D, when the user queries "mobile phone" again after the external module has updated the content of doc2, the cache determines from doc2's state information that doc2 has been updated, so the MLR must be called to recompute and save its score; the state information of doc1 has not changed and doc1 is still in the item list, so the MLR is not called for it and its cached score is returned to the user directly. For this query the MLR is therefore called only once, saving one computation.
The present application also provides a caching device for a search engine. Fig. 5 schematically shows a structural block diagram of the caching device 500 for a search engine according to an embodiment of the present application. According to an embodiment of the present application, the device 500 may comprise: receiving means 501 for receiving a query and the elements obtained according to the query; generating means 502 for generating a key value for the query based on the query; lookup means 503 for finding the list corresponding to this key value in the cache; scoring means 504 for scoring and saving the elements; and updating means 505 for updating the cache.
According to an embodiment of the present application, the generating means 502 may further comprise: filtering means for filtering out the scoring-relevant parameters of the query by loading a predefined configuration; splicing means for combining all the scoring-relevant parameters into a new query; and signing means for signing the new query to generate its key value.
According to another embodiment of the present application, the scoring means 504 may further comprise: element position checking means for checking whether the element is in the list; first re-scoring means for re-scoring and saving the elements that are not in the list; score validity checking means for checking whether the element's score is valid; and second re-scoring means for re-scoring and saving the elements whose scores are invalid.
According to an embodiment of the present application, the list comprises a plurality of items, and an item comprises an element identifier, an element score and state information.
According to an embodiment of the present application, the number of items the list can hold has an upper limit.
According to another embodiment of the present application, the updating means 505 may further comprise: state updating means for updating the state information of the elements; element retaining means for retaining the items with high element scores in the list according to the upper limit; and list retaining means for determining whether a list is retained in the cache according to the least-recently-used rule.
According to an embodiment of the present application, before the next query arrives, the scores of the elements obtained for this query are kept unchanged.
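Taken together, units 501-505 correspond to the five steps of Fig. 1. The skeleton below is only an illustration of that correspondence; the method names are invented here, and each stub could be filled in along the lines of the earlier sketches.

class SearchEngineCacheDevice:
    # One method per unit of device 500; the bodies are deliberately left as stubs.
    def process(self, query: str, elements: list) -> dict:
        query, elements = self.receive(query, elements)      # receiving means 501 (step 101)
        key = self.generate_key(query)                       # generating means 502 (step 102)
        items = self.find_list(key)                          # lookup means 503   (step 103)
        scores = self.score_and_save(elements, items)        # scoring means 504  (step 104)
        self.update_cache(key, items)                        # updating means 505 (step 105)
        return scores

    def receive(self, query, elements):
        return query, elements

    def generate_key(self, query):
        raise NotImplementedError    # e.g. refine, normalize and MD5-sign the query

    def find_list(self, key):
        raise NotImplementedError    # e.g. two-level lookup as in the SearchCache sketch

    def score_and_save(self, elements, items):
        raise NotImplementedError    # e.g. per-element reuse-or-call-MLR decision

    def update_cache(self, key, items):
        raise NotImplementedError    # e.g. refresh state, keep top-scoring items, LRU-evict lists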
Those skilled in the art should understand that embodiments of the present application may be provided as a method, a system or a computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including but not limited to disk storage, CD-ROM and optical storage) containing computer-usable program code.
The foregoing is merely embodiments of the present application and is not intended to limit it; various modifications and variations will occur to those skilled in the art. Any modification, equivalent replacement or improvement made within the spirit and principles of the present application shall fall within the scope of the claims of the present application.

Claims (14)

1. A caching method for a search engine, characterized by comprising the steps of:
receiving a query and the elements obtained according to the query;
generating a key value for the query based on the query;
finding the list corresponding to this key value in a cache;
scoring and saving the elements; and
updating the cache.
2. The caching method for a search engine according to claim 1, characterized in that the step of generating the key value for the query further comprises:
filtering out the scoring-relevant parameters of the query by loading a predefined configuration;
combining all the scoring-relevant parameters into a new query; and
signing the new query to generate its key value.
3. The caching method for a search engine according to claim 1, characterized in that the step of scoring and saving the elements further comprises:
checking whether the element is in the list; and
re-scoring and saving the elements that are not in the list;
checking whether the element's score is valid; and
re-scoring and saving the elements whose scores are invalid.
4. The caching method for a search engine according to any one of claims 1-3, characterized in that the list comprises a plurality of items, and the items comprise the element identifier, element score and state information.
5. The caching method for a search engine according to claim 4, characterized in that the number of items the list can hold has an upper limit.
6. The caching method for a search engine according to claim 5, characterized in that the step of updating the cache further comprises:
updating the state information of the elements;
retaining the items with high element scores in the list according to the upper limit; and
determining whether the list is retained in the cache according to a least-recently-used rule.
7. The caching method for a search engine according to claim 1, characterized in that before the next query arrives, the scores of the elements obtained for this query are kept unchanged.
8. A caching device for a search engine, characterized by comprising:
receiving means for receiving a query and the elements obtained according to the query;
generating means for generating a key value for the query based on the query;
lookup means for finding the list corresponding to this key value in a cache;
scoring means for scoring and saving the elements; and
updating means for updating the cache.
9. The caching device for a search engine according to claim 8, characterized in that the generating means further comprises:
filtering means for filtering out the scoring-relevant parameters of the query by loading a predefined configuration;
splicing means for combining all the scoring-relevant parameters into a new query; and
signing means for signing the new query to generate its key value.
10. The caching device for a search engine according to claim 8, characterized in that the scoring means further comprises:
element position checking means for checking whether the element is in the list; and
first re-scoring means for re-scoring and saving the elements that are not in the list;
score validity checking means for checking whether the element's score is valid; and
second re-scoring means for re-scoring and saving the elements whose scores are invalid.
11. The caching device for a search engine according to any one of claims 8-10, characterized in that the list comprises a plurality of items, and the items comprise the element identifier, element score and state information.
12. The caching device for a search engine according to claim 11, characterized in that the number of items the list can hold has an upper limit.
13. The caching device for a search engine according to claim 12, characterized in that the updating means further comprises:
state updating means for updating the state information of the elements;
element retaining means for retaining the items with high element scores in the list according to the upper limit; and
list retaining means for determining whether the list is retained in the cache according to a least-recently-used rule.
14. The caching device for a search engine according to claim 8, characterized in that before the next query arrives, the scores of the elements obtained for this query are kept unchanged.
CN201310182204.7A 2013-05-16 2013-05-16 Caching method and equipment for search engine Active CN104166649B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201310182204.7A CN104166649B (en) 2013-05-16 2013-05-16 Caching method and equipment for search engine

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201310182204.7A CN104166649B (en) 2013-05-16 2013-05-16 Caching method and equipment for search engine

Publications (2)

Publication Number Publication Date
CN104166649A true CN104166649A (en) 2014-11-26
CN104166649B CN104166649B (en) 2020-03-20

Family

ID=51910468

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201310182204.7A Active CN104166649B (en) 2013-05-16 2013-05-16 Caching method and equipment for search engine

Country Status (1)

Country Link
CN (1) CN104166649B (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107145495A (en) * 2016-03-01 2017-09-08 阿里巴巴集团控股有限公司 The method and device of dynamically-adjusting parameter rule
CN110046286A (en) * 2018-01-16 2019-07-23 马维尔以色列(M.I.S.L.)有限公司 Method and apparatus for search engine caching
CN112507199A (en) * 2020-12-22 2021-03-16 北京百度网讯科技有限公司 Method and apparatus for optimizing a search system
CN115905323A (en) * 2023-01-09 2023-04-04 北京创新乐知网络技术有限公司 Searching method, device, equipment and medium suitable for multiple searching strategies
CN116910100A (en) * 2023-09-08 2023-10-20 湖南立人科技有限公司 Cache data processing method for low-code platform

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120110015A1 (en) * 2010-10-29 2012-05-03 Microsoft Corporation Search cache for document search
CN102479207A (en) * 2010-11-29 2012-05-30 阿里巴巴集团控股有限公司 Information search method, system and device
CN102930054A (en) * 2012-11-19 2013-02-13 北京奇虎科技有限公司 Data search method and data search system

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120110015A1 (en) * 2010-10-29 2012-05-03 Microsoft Corporation Search cache for document search
CN102479207A (en) * 2010-11-29 2012-05-30 阿里巴巴集团控股有限公司 Information search method, system and device
CN102930054A (en) * 2012-11-19 2013-02-13 北京奇虎科技有限公司 Data search method and data search system

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107145495A (en) * 2016-03-01 2017-09-08 阿里巴巴集团控股有限公司 The method and device of dynamically-adjusting parameter rule
CN110046286A (en) * 2018-01-16 2019-07-23 马维尔以色列(M.I.S.L.)有限公司 Method and apparatus for search engine caching
CN112507199A (en) * 2020-12-22 2021-03-16 北京百度网讯科技有限公司 Method and apparatus for optimizing a search system
CN112507199B (en) * 2020-12-22 2022-02-25 北京百度网讯科技有限公司 Method and apparatus for optimizing a search system
CN115905323A (en) * 2023-01-09 2023-04-04 北京创新乐知网络技术有限公司 Searching method, device, equipment and medium suitable for multiple searching strategies
CN115905323B (en) * 2023-01-09 2023-08-18 北京创新乐知网络技术有限公司 Searching method, device, equipment and medium suitable for various searching strategies
CN116910100A (en) * 2023-09-08 2023-10-20 湖南立人科技有限公司 Cache data processing method for low-code platform
CN116910100B (en) * 2023-09-08 2023-11-28 湖南立人科技有限公司 Cache data processing method for low-code platform

Also Published As

Publication number Publication date
CN104166649B (en) 2020-03-20

Similar Documents

Publication Publication Date Title
US10878052B2 (en) Blockchain-based cross-chain data operation method and apparatus
CN105718455A (en) Data query method and apparatus
CN106407303A (en) Data storage method and apparatus, and data query method and apparatus
CN111352902A (en) Log processing method and device, terminal equipment and storage medium
US8887127B2 (en) Web browsing apparatus and method through storing and optimizing JAVASCRIPT® code
CN104166649A (en) Caching method and device for search engine
US11113195B2 (en) Method, device and computer program product for cache-based index mapping and data access
CN112395322B (en) List data display method and device based on hierarchical cache and terminal equipment
CN105302840A (en) Cache management method and device
CN105468623A (en) Data processing method and apparatus
CN112148217A (en) Caching method, device and medium for deduplication metadata of full flash storage system
CN105138649A (en) Data search method and device and terminal
US7676457B2 (en) Automatic index based query optimization
CN106599247A (en) Method and device for merging data file in LSM-tree structure
CN108470043A (en) A kind of acquisition methods and device of business result
CN111858612B (en) Data accelerated access method and device based on graph database and storage medium
US20230342395A1 (en) Network key value indexing design
CN106407347A (en) Data caching method and apparatus
CN111831691A (en) Data reading and writing method and device, electronic equipment and storage medium
CN110955658A (en) Data organization and storage method based on Java intelligent contract
US11645283B2 (en) Predictive query processing
CN114840487A (en) Metadata management method and device for distributed file system
CN108197157B (en) Method, apparatus and computer-readable storage medium for managing data to be stored
AU2016277745A1 (en) Linked-list-based method and device for application caching management
CN113411395B (en) Access request routing method, device, computer equipment and storage medium

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20211116

Address after: Room 3921, floor 3, No. 2879, Longteng Avenue, Xuhui District, Shanghai

Patentee after: Alibaba (Shanghai) Co., Ltd

Address before: P.O. Box 847, 4th floor, Grand Cayman capital building, British Cayman Islands

Patentee before: Alibaba Group Holdings Limited

TR01 Transfer of patent right