CN108090153A - Search method, apparatus, electronic device and storage medium - Google Patents

Search method, apparatus, electronic device and storage medium

Info

Publication number
CN108090153A
Authority
CN
China
Prior art keywords
slice
caching
search
request
query request
Prior art date
Application number
CN201711309502.2A
Other languages
Chinese (zh)
Inventor
彭程
田第鸿
石小华
彭齐荣
Original Assignee
深圳云天励飞技术有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳云天励飞技术有限公司
Priority to CN201711309502.2A
Publication of CN108090153A


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/90 Details of database functions independent of the retrieved data types
    • G06F 16/95 Retrieval from the web
    • G06F 16/953 Querying, e.g. by the use of web search engines
    • G06F 16/9535 Search customisation based on user profiles and personalisation
    • G06F 16/957 Browsing optimisation, e.g. caching or content distillation
    • G06F 16/9574 Browsing optimisation of access to content, e.g. by caching

Abstract

A search method is provided, including: receiving a search request from a user; slicing the search request according to a preset slicing rule to obtain multiple slice query requests; determining whether cached results corresponding to each slice query request exist in a cache, where the cached results in the cache have been sliced in advance into multiple slice cached results according to the same preset slicing rule; and, when cached results corresponding to each slice query request exist in the cache, retrieving the cached results corresponding to each slice query request from the cache and returning them to the user; or, when cached results corresponding to each slice query request do not exist in the cache, searching a search server for the search results corresponding to the search request and returning them to the user. The present invention also provides a search apparatus, a server and a storage medium. The present invention can improve the cache hit rate and reduce the search response time.

Description

Search method, apparatus, electronic device and storage medium

Technical field

The present invention relates to the field of search technology, and in particular to a search method, apparatus, electronic device and storage medium.

Background

With the development of Internet technology, searching for information online has become commonplace. In general, a website service system receives a search request from a user and returns the corresponding search results. Caching is an important component of distributed systems: it addresses the performance problems of high concurrency and hot-data access under big-data workloads and provides high-performance, fast data access. Common enterprise-level search servers such as Solr and Elasticsearch implement caches at various levels, but their caching mechanisms require the query conditions to be identical, byte for byte, before a cached entry can be reused. When a search server receives a search request, querying the data within the search range from the full data set is itself a time-consuming step and accounts for a significant share of the overall retrieval time. When the full data set reaches one hundred million records, the query time can reach one to three seconds.

Moreover, in most of today's enterprise application scenarios the query conditions vary constantly. For example, when one search covers only the last 30 days and the next covers the last 31 days, the built-in caching mechanisms of common enterprise search servers such as Solr and Elasticsearch cannot score a hit. In other words, because the time range differs slightly on every retrieval, time and server resources must be spent querying the data in the search range again. Current search servers are therefore poorly suited to scenarios with changing query conditions: search times are long and the user's search experience suffers.

Summary of the invention

In view of the above, it is necessary to provide a search method, apparatus, electronic device and storage medium that cache data using a slicing mechanism, so as to improve the cache hit rate, reduce the search response time, relieve the access pressure on the search server, shorten the search processing time and improve the user experience.

The first aspect of the present invention provides a search method, the method including:

receiving a search request from a user;

slicing the search request according to a preset slicing rule to obtain multiple slice query requests;

determining whether cached results corresponding to each slice query request exist in a cache, wherein the cached results in the cache have been sliced in advance into multiple slice cached results according to the preset slicing rule; and

when cached results corresponding to each slice query request exist in the cache, retrieving the cached results corresponding to each slice query request from the cache and returning them to the user; or

when cached results corresponding to each slice query request do not exist in the cache, searching a search server for the search results corresponding to the search request and returning them to the user.

According to a preferred embodiment of the present invention,

the search request includes: a search keyword, a start time of the search results to be returned, and an end time of the search results to be returned;

the preset slicing rule divides the time period in the search request by a preset unit, taking the integer part and the remainder, where the time period in the search request is the difference between the end time of the search results to be returned and the start time of the search results to be returned;

each slice query request includes: the search keyword, a start time of the cached results to be returned, and an end time of the cached results to be returned; and

each slice cached result includes: a cache keyword, a cache start time and a cache end time, where the difference between the cache start time and the cache end time equals the preset unit.
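For illustration, the two records defined above can be represented roughly as follows; this is a minimal Python sketch, and all class and field names are illustrative rather than taken from the patent.

    from dataclasses import dataclass
    from datetime import datetime

    @dataclass(frozen=True)
    class SliceQuery:          # one slice query request
        keyword: str           # the search keyword
        start: datetime        # start time of the cached results to be returned
        end: datetime          # end time; end - start equals the preset unit

    @dataclass
    class SliceCacheEntry:     # one slice cached result
        keyword: str           # the cache keyword
        start: datetime        # the cache start time
        end: datetime          # the cache end time; end - start equals the preset unit
        results: list          # the cached search results for this slice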

According to a preferred embodiment of the present invention, after slicing the search request according to the preset slicing rule to obtain the multiple slice query requests, the method further includes:

determining whether the multiple slice query requests include a slice query request of a non-preset unit; and

when a slice query request of a non-preset unit exists, obtaining from the search server the search results corresponding to the slice query request of the non-preset unit and returning them to the user.

According to a preferred embodiment of the present invention, after searching the search server for the search results corresponding to the search request and returning them to the user, the method further includes:

associating the search request with the search results obtained from the search server and storing them in the cache.

According to a preferred embodiment of the present invention, determining whether cached results corresponding to each slice query request exist in the cache includes:

matching the search keyword in each slice query request one by one against the cache keywords in the slice cached results;

matching the start time of the cached results to be returned in each slice query request one by one against the cache start times in the slice cached results; and

matching the end time of the cached results to be returned in each slice query request one by one against the cache end times in the slice cached results.

According to a preferred embodiment of the present invention, the method further includes:

sorting the multiple slice cached results in advance by time.

According to a preferred embodiment of the present invention, the method further includes:

sorting the multiple slice query requests by time.

The second aspect of the present invention provides a search apparatus, the apparatus including:

a receiving module, for receiving a search request from a user;

a slicing module, for slicing the search request according to a preset slicing rule to obtain multiple slice query requests;

a judgment module, for determining whether cached results corresponding to each slice query request exist in a cache, wherein the cached results in the cache have been sliced in advance into multiple slice cached results according to the preset slicing rule;

an acquisition module, for retrieving the cached results corresponding to each slice query request from the cache and returning them to the user when the judgment module determines that such cached results exist in the cache; and

a search module, for searching a search server for the search results corresponding to the search request and returning them to the user when the judgment module determines that cached results corresponding to each slice query request do not exist in the cache.

The third aspect of the present application provides an electronic device, the electronic device including a processor, and the processor implements the search method when executing a computer program stored in a memory.

The fourth aspect of the present application provides a computer-readable storage medium on which a computer program is stored, the computer program implementing the search method when executed by a processor.

The search method, apparatus, server and storage medium of the present invention cache and query data using a slicing mechanism, which improves the cache hit rate, reduces the search response time and relieves the access pressure on the search server. Moreover, once the user's search request has been sliced, the multiple slice query requests can be searched concurrently, greatly shortening the search processing time and improving the user experience. Further, the search request is associated with the search results obtained from the search server, and the search request and the corresponding search results are stored in the cache, so that when the user searches for the same keyword again the results can be extracted directly from the cache, further improving the cache hit rate.

Description of the drawings

To describe the technical solutions in the embodiments of the present invention or in the prior art more clearly, the accompanying drawings required for describing the embodiments or the prior art are briefly introduced below. Obviously, the accompanying drawings in the following description show only embodiments of the present invention, and those of ordinary skill in the art can derive other drawings from them without creative effort.

Fig. 1 is a flowchart of the search method provided by Embodiment 1 of the present invention.

Fig. 2 is a structural diagram of the search apparatus provided by Embodiment 2 of the present invention.

Fig. 3 is a schematic diagram of the server provided by Embodiment 3 of the present invention.

The following specific embodiments further illustrate the present invention with reference to the above drawings.

Detailed description of the embodiments

To make the objects, features and advantages of the present invention clearer and easier to understand, the present invention is described in detail below with reference to the accompanying drawings and specific embodiments. It should be noted that, provided there is no conflict, the embodiments of the present application and the features in the embodiments can be combined with one another.

Many specific details are set forth in the following description to facilitate a thorough understanding of the present invention. The described embodiments are only some, rather than all, of the embodiments of the present invention. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the present invention without creative effort fall within the protection scope of the present invention.

Unless otherwise defined, all technical and scientific terms used herein have the same meanings as commonly understood by those skilled in the technical field of the present invention. The terms used in the description of the present invention are intended only to describe specific embodiments and are not intended to limit the present invention.

Preferably, the search method of the present invention is applied to one or more servers. A server is a device capable of automatically performing numerical computation and/or information processing according to preset or stored instructions; its hardware includes, but is not limited to, a microprocessor, an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA), a digital signal processor (DSP), an embedded device and the like.

The server may be, but is not limited to, a computing device such as a desktop computer, a notebook computer, a palmtop computer or a cloud server. The server can interact with the user through a keyboard, a mouse, a remote control, a touch pad, a voice-control device or the like.

The search method can also be applied to a hardware environment composed of a terminal and a server connected to the terminal through a network. The network includes, but is not limited to, a wide area network, a metropolitan area network or a local area network. The search method of the embodiments of the present invention can be performed by the server alone, or jointly by the server and the terminal. The present application imposes no limitation in this respect.

For a server that needs to perform searches, the method of the present application can be integrated directly on the server to provide the search function. Alternatively, the method provided by the present application can run on devices such as servers in the form of a software development kit (SDK), which exposes a search interface; the server or other devices can then accelerate searches through the provided interface.

Embodiment one

Fig. 1 is a flowchart of the search method provided by Embodiment 1 of the present invention. The execution order of the steps in the flowchart can be changed and some steps can be omitted according to different requirements.

101: Receive a search request from a user.

In this preferred embodiment, the user can input a keyword through a browser installed on a terminal and request that the server search out the results corresponding to the keyword. The terminal sends the keyword to the server, and the server receives the search request containing the keyword. The terminal can include, but is not limited to, any device with a browser installed, such as a desktop computer, an all-in-one machine, a notebook computer, a palmtop computer, a smart phone or a tablet computer. The server can include: a web server, an Internet server, a cache server and so on.

In this preferred embodiment, the search request includes: a search keyword, a start time of the search results to be returned, and an end time of the search results to be returned.

102: Slice the search request according to a preset slicing rule to obtain multiple slice query requests.

In this preferred embodiment, a slicing rule is set in advance. The preset slicing rule divides the time period in the search request by a preset unit, taking the integer part and the remainder. The time period in the search request is the difference between the end time of the search results to be returned and the start time of the search results to be returned.

In this preferred embodiment, the preset unit can be a day, a week, a month or a year. The preset unit is not limited to those listed above; any time period, such as three days or half a month, can also be used as the unit, and the present invention is not limited in this respect.

A preferred technical solution uses a day as the preset unit, because the time period in the search request can then be divided into whole units, yielding multiple whole-unit slice query requests and effectively shortening the search processing time. With a month or a year as the unit, the time period in the search request may not be divisible at all and may only leave a remainder, yielding no whole-unit slice query request. In addition, slicing the time period in the search request by day better matches practical application scenarios.

For example, suppose the user searches for "face recognition" over the last one and a half months. Dividing one and a half months by a unit of one day yields 45 whole-unit slice query requests; dividing it by a unit of one month yields only one whole-unit slice query request; and dividing it by a unit of one year yields no whole-unit slice query request, that is, one and a half months cannot be divided into whole years.

In this preferred embodiment, each slice query request includes: the search keyword, a start time of the cached results to be returned, and an end time of the cached results to be returned. The search keyword in each slice query request is the same as the search keyword in the search request, and the difference between the end time and the start time of the cached results to be returned equals the preset unit.

Further, slicing the search request according to the preset slicing rule to obtain the multiple slice query requests can also include: sorting the multiple slice query requests by time, for example in chronological order or in reverse chronological order.

A preferred technical solution sorts the multiple slice query requests in reverse chronological order. That is, the first slice query request is closest to the end time of the search results the user requests and farthest from their start time; the second slice query request is the second closest to the end time; and so on; the last slice query request is farthest from the end time of the requested search results and closest to their start time. In addition, for two adjacent slice query requests, the start time of the cached results to be returned in the earlier-sorted slice query request equals the end time of the cached results to be returned in the later-sorted slice query request.

For example, assuming the preset unit is one day, the first slice query request can be expressed as "Q=face recognition", and the second slice query request can be expressed as "Q=face recognition&T1=20171129000000&T2=20171129240000", where Q is the search keyword, T1 is the start time (date, hour, minute and second) of the cached results to be returned, and T2 is the end time (date, hour, minute and second) of the cached results to be returned.
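A minimal Python sketch of this slicing step follows, assuming a preset unit of one day and reverse chronological ordering as described above; the function and variable names are illustrative, not part of the patent.

    from datetime import datetime, timedelta

    def slice_request(keyword, start, end, unit=timedelta(days=1)):
        """Cut the period [start, end) into whole-unit slice query requests (newest first),
        plus one remainder shorter than the preset unit when the period does not divide exactly."""
        whole, cursor = [], end
        while cursor - unit >= start:
            whole.append((keyword, cursor - unit, cursor))   # one slice query request per preset unit
            cursor -= unit
        remainder = (keyword, start, cursor) if cursor > start else None
        return whole, remainder

    # A 45-day period such as the "face recognition" example above yields 45 day-sized
    # slice query requests and no remainder.
    slices, rest = slice_request("face recognition",
                                 datetime(2017, 10, 16), datetime(2017, 11, 30))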

103: Determine whether cached results corresponding to each slice query request exist in the cache.

In this preferred embodiment, a cache database (referred to herein simply as the cache) is set up in advance and is dedicated to storing frequently used data, so that intermediate or final results can be kept temporarily in the cache for later use. In other embodiments, a separate cache server can also be set up, dedicated to caching frequently used data.

In this preferred embodiment, the cached results in the cache have been sliced in advance into multiple slice cached results according to the preset unit.

In this preferred embodiment, each slice cached result corresponds to a cache keyword, a cache start time and a cache end time, where the difference between the cache start time and the cache end time equals the preset unit.

In this preferred embodiment, each slice query request is matched one by one against each slice cached result to determine whether a slice cached result corresponding to the slice query request exists in the cache. Matching each slice query request one by one against each slice cached result specifically includes: matching the search keyword in each slice query request against the cache keyword in the slice cached results; matching the start time of the cached results to be returned in each slice query request against the cache start time in the slice cached results; and matching the end time of the cached results to be returned in each slice query request against the cache end time in the slice cached results.
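A minimal Python sketch of this lookup is given below; it assumes the slice cache is held in a dictionary keyed by (keyword, start time, end time), which is an implementation choice rather than the patent's wording.

    def lookup_slices(cache, slice_queries):
        """Return the combined cached results when every slice query request hits, else None.
        `cache` maps (keyword, start, end) tuples to lists of results."""
        hits = []
        for keyword, start, end in slice_queries:
            entry = cache.get((keyword, start, end))   # keyword, start time and end time must all match
            if entry is None:
                return None                            # a single miss means the cache cannot answer
            hits.append((start, entry))
        hits.sort(key=lambda pair: pair[0])            # combine the per-slice results in time order
        return [result for _, results in hits for result in results]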

When every match succeeds, that is, when the cached results are hit, it is determined that cached results corresponding to each slice query request exist in the cache, and step 104 is performed; when a match fails, that is, when the cached results are missed, it is determined that cached results corresponding to each slice query request do not exist in the cache, and step 105 is performed.

104: Retrieve the cached results corresponding to each slice query request from the cache and return them to the user.

In this preferred embodiment, when cached results corresponding to each slice query request exist in the cache, the corresponding cached results are extracted from the cache, sorted in time order, combined, and then returned to the user.

105: Search the search server for the search results corresponding to the search request and return them to the user.

In this preferred embodiment, when cached results corresponding to each slice query request do not exist in the cache, the query request is treated as a first-time query, that is, the search entered by the user is treated as a first-time search request. For a first-time search request, the search server is searched according to the search request to obtain the search results; an existing search technique may be used here, and the details are not repeated in the present invention. The search server is also referred to as a search engine.

Further, the search method can also include: sorting the multiple slice cached results in advance by time, for example in chronological order or in reverse chronological order.

A preferred technical solution sorts the multiple slice cached results in reverse chronological order, so that the cached results with the most recent times are always at the top of the cache database. That is, the cache start time and cache end time of the first slice cached result are closest to the current time; those of the second slice cached result are the second closest to the current time; and so on; those of the last slice cached result are farthest from the current time. In addition, for two adjacent slice cached results, the cache start time of the earlier-sorted slice cached result equals the cache end time of the later-sorted slice cached result.

In this preferred embodiment, the multiple slice query requests are sorted by time and the multiple slice cached results are sorted by time. This avoids matching every slice query request against every slice cached result one by one: once a single slice query request has been matched to a single slice cached result, the remaining slice query requests and slice cached results can be matched in sequence, which not only reduces the matching complexity but also saves matching time and shortens the query processing time.
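The following Python sketch is one way to read this optimisation: with both sides sorted in reverse time order, a single aligned pass replaces the one-by-one comparison. It is an interpretation under the stated sorting assumption, not the patent's literal algorithm.

    def match_sorted(slice_queries, cached_keys):
        """Both arguments are lists of (keyword, start, end) tuples in reverse time order;
        return the matching window of cached keys, or None when the cache cannot answer."""
        try:
            offset = cached_keys.index(slice_queries[0])    # align on the newest slice query
        except ValueError:
            return None                                     # the newest slice is not cached
        window = cached_keys[offset:offset + len(slice_queries)]
        # because both lists are sorted the same way, the remaining slices must line up in sequence
        return window if window == slice_queries else None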

Further, after step 105, the search method can also include: associating the search request with the search results obtained from the search server and storing them in the cache.

The search request is associated, as a query request, with the cached results obtained from the search server and stored in the cache, so that when the user searches for the keyword again the results can be extracted directly from the cache, thereby improving the cache hit rate. In addition, reading data from the cache is much faster than querying data from the search server, which relieves the access pressure on the search server, shortens the search processing time and improves the user experience.
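A minimal sketch of this write-back step, assuming the results returned by the search server are first split per preset unit so that they mirror the slicing rule; the mapping format is an assumption made for illustration.

    def store_result(cache, keyword, sliced_results):
        """`sliced_results` maps each (start, end) unit period of the original request to the
        results the search server returned for it; writing them back lets a repeat query hit."""
        for (start, end), results in sliced_results.items():
            cache[(keyword, start, end)] = results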

It should be understood that, in practical applications, dividing the time period in the search request by the preset unit falls into two cases: either the period divides exactly, or there is a remainder. The resulting multiple slice query requests are correspondingly divided into two kinds. When the period divides exactly, the multiple slice query requests obtained by dividing the time period in the search request by the preset unit are all slice query requests of the preset unit. When there is a remainder, the part of the time period in the search request that cannot be divided by the preset unit yields a slice query request of a non-preset unit. For a slice query request of a non-preset unit, no corresponding cached result can be matched in the cache, so the slice query request of the non-preset unit is regarded as a first-time search request. For a first-time search request, the search server is searched according to the search request to obtain the search results; an existing search technique may be used here, and the details are not repeated in the present invention. Accordingly, after step 102, the search method can further include: determining whether the multiple slice query requests include a slice query request of a non-preset unit; when a slice query request of a non-preset unit exists among the multiple slice query requests, searching the search server for the search results corresponding to the slice query request of the non-preset unit and returning them to the user; and when no slice query request of a non-preset unit exists among the multiple slice query requests, performing step 103 above, that is, continuing to determine whether cached results corresponding to each slice query request exist in the cache.
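Putting the pieces together, the routing described in this paragraph might look roughly as follows. The sketch reuses the slice_request and lookup_slices helpers from the earlier sketches, and search_server_query is a hypothetical placeholder for whatever the search backend exposes.

    from datetime import timedelta

    def handle_request(cache, keyword, start, end, search_server_query,
                       unit=timedelta(days=1)):
        whole, remainder = slice_request(keyword, start, end, unit)   # whole-unit slices + leftover
        answers = lookup_slices(cache, whole)
        if answers is None:                                           # a miss on any whole-unit slice
            return search_server_query(keyword, start, end)           # fall back to a full search
        if remainder is not None:                                     # non-preset-unit leftover:
            answers = answers + search_server_query(*remainder)       # always sent to the search server
        return answers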

In conclusion the method for the present invention for scanning for accelerating using query caching of cutting into slices, receives the search of user Request;Described search request is cut into slices to obtain multiple section inquiry requests according to default section rule;It is in judgement caching It is no to there are buffered results corresponding with each section inquiry request;When determine caching in exist and each section inquiry request During corresponding buffered results, buffered results corresponding with each section inquiry request are obtained from the caching and return to use Family;When determining that buffered results corresponding with each section inquiry request are not present in caching, searched for from search server With the described search corresponding search result of request and returning to user.It is cached and is inquired about using caching mechanism of cutting into slices, it can Buffered results hit rate is improved, the search response time is reduced, mitigates the access pressure to search server.Searching user simultaneously After rope request is cut into slices, concurrently multiple section inquiry requests can be scanned for, the search process time is greatly shortened, carry Rise user experience.

Such as by the full dose data in caching, it is data cached to be cut into slices to obtain the section of multiple units in units of day, When user asks the time of search to be any time period, if the time that ask search can be cut into slices to obtain in units of day The section inquiry request of multiple units is then hit cutting for multiple units by the section inquiry request of multiple units from the caching Piece buffered results, the result in caching can be recycled well, save the Internet resources of search server, alleviated network and gathered around It is stifled.For there is the section inquiry request of non-default unit in the section inquiry request cut into slices in units of day, then according to The section inquiry request of the non-default unit is scanned for according to existing search technique, since the section of non-default unit is looked into The search time unit ask in request is less than 1 day, therefore while being scanned for according to existing search technique takes also smaller, thus energy Effectively shorten search time.

Secondly, it is of the present invention using cut into slices query caching scan for accelerate method, association described search request with The described search result searched for from described search server;And described search request and corresponding described search result are stored Into the caching, directly extracted when can be scanned for again to the keyword convenient for user from the caching.Finally, Multiple section buffered results in the multiple section inquiry request and caching are ranked up according to time sequencing, it can be to avoid Each section inquiry request with each section buffered results is matched one by one, and then further shortens caching and looks into Ask the time.

Finally it should be noted that present invention section caching be not limited to it is above-mentioned enumerate using the default unit interval to cut into slices Foundation can be applicable to the section of other forms, for example, quantity etc..Certainly, enumerated once present invention is also not necessarily limited to above-mentioned Section caching can also be multiple section caching, for example, pre-setting two cache servers, one of buffer service Device is cut into slices according to the first default section unit, another cache server is cut into slices according to the second default section unit, Described first default section unit is more than the described second default section unit, and the present invention no longer elaborates herein.

Embodiment two

The above are only specific embodiments of the present invention, but the protection scope of the present invention is not limited thereto. Those of ordinary skill in the art can also make improvements without departing from the concept of the present invention, and these improvements all fall within the protection scope of the present invention.

With reference to Figs. 2 and 3, the function modules and the hardware structure of the server that implements the search method are described below.

It should be understood that this embodiment is for illustration only and the scope of the patent claims is not limited by this structure.

As shown in Fig. 2, it is a functional block diagram of a preferred embodiment of the search apparatus 20 of the present invention.

In some embodiments, the search apparatus 20 runs in the server 3. The search apparatus 20 can include multiple function modules composed of program code segments. The program code of each program segment in the search apparatus 20 can be stored in a memory and executed by at least one processor to perform accelerated searching (see the description of Fig. 1).

In this embodiment, the search apparatus 20 of the server 3 can be divided into multiple function modules according to the functions it performs. The function modules can include: a receiving module 201, a slicing module 202, a judgment module 203, an acquisition module 204, a search module 205, a sorting module 206 and an association module 207. A module in the present invention refers to a series of computer program segments that can be executed by at least one processor, can perform a fixed function and are stored in a memory. The functions of the modules are described in detail in the following embodiments.

A receiving module 201, for receiving a search request from a user.

In this preferred embodiment, the user can input a keyword through a browser installed on a terminal and request that the server search out the results corresponding to the keyword. The terminal sends the keyword to the server, and the server receives the search request containing the keyword. The terminal can include, but is not limited to, any device with a browser installed, such as a desktop computer, an all-in-one machine, a notebook computer, a palmtop computer, a smart phone or a tablet computer. The server can include: a web server, an Internet server, a cache server and so on.

In this preferred embodiment, the search request includes: a search keyword, a start time of the search results to be returned, and an end time of the search results to be returned.

A slicing module 202, for slicing the search request according to a preset slicing rule to obtain multiple slice query requests.

In this preferred embodiment, a slicing rule is set in advance. The preset slicing rule divides the time period in the search request by a preset unit, taking the integer part and the remainder. The time period in the search request is the difference between the end time of the search results to be returned and the start time of the search results to be returned.

In this preferred embodiment, the preset unit can be a day, a week, a month or a year. The preset unit is not limited to those listed above; any time period, such as three days or half a month, can also be used as the unit, and the present invention is not limited in this respect.

A preferred technical solution uses a day as the preset unit, because the time period in the search request can then be divided into whole units, yielding multiple whole-unit slice query requests and effectively shortening the search processing time. With a month or a year as the unit, the time period in the search request may not be divisible at all and may only leave a remainder, yielding no whole-unit slice query request. In addition, slicing the time period in the search request by day better matches practical application scenarios.

For example, suppose the user searches for "face recognition" over the last one and a half months. Dividing one and a half months by a unit of one day yields 45 whole-unit slice query requests; dividing it by a unit of one month yields only one whole-unit slice query request; and dividing it by a unit of one year yields no whole-unit slice query request, that is, one and a half months cannot be divided into whole years.

In this preferred embodiment, each slice query request includes: the search keyword, a start time of the cached results to be returned, and an end time of the cached results to be returned. The search keyword in each slice query request is the same as the search keyword in the search request, and the difference between the end time and the start time of the cached results to be returned equals the preset unit.

Further, slicing the search request according to the preset slicing rule to obtain the multiple slice query requests can also include: sorting the multiple slice query requests by time, for example in chronological order or in reverse chronological order.

A preferred technical solution sorts the multiple slice query requests in reverse chronological order. That is, the first slice query request is closest to the end time of the search results the user requests and farthest from their start time; the second slice query request is the second closest to the end time; and so on; the last slice query request is farthest from the end time of the requested search results and closest to their start time. In addition, for two adjacent slice query requests, the start time of the cached results to be returned in the earlier-sorted slice query request equals the end time of the cached results to be returned in the later-sorted slice query request.

For example, assuming the preset unit is one day, the first slice query request can be expressed as "Q=face recognition", and the second slice query request can be expressed as "Q=face recognition&T1=20171129000000&T2=20171129240000", where Q is the search keyword, T1 is the start time (date, hour, minute and second) of the cached results to be returned, and T2 is the end time (date, hour, minute and second) of the cached results to be returned.

A judgment module 203, for determining whether cached results corresponding to each slice query request exist in the cache.

In this preferred embodiment, a cache database (referred to herein simply as the cache) is set up in advance and is dedicated to storing frequently used data, so that intermediate or final results can be kept temporarily in the cache for later use. In other embodiments, a separate cache server can also be set up, dedicated to caching frequently used data.

In this preferred embodiment, the cached results in the cache have been sliced in advance into multiple slice cached results according to the preset unit.

In this preferred embodiment, each slice cached result corresponds to a cache keyword, a cache start time and a cache end time, where the difference between the cache start time and the cache end time equals the preset unit.

In this preferred embodiment, each slice query request is matched one by one against each slice cached result to determine whether a slice cached result corresponding to the slice query request exists in the cache. Matching each slice query request one by one against each slice cached result specifically includes: matching the search keyword in each slice query request against the cache keyword in the slice cached results; matching the start time of the cached results to be returned in each slice query request against the cache start time in the slice cached results; and matching the end time of the cached results to be returned in each slice query request against the cache end time in the slice cached results.

An acquisition module 204, for retrieving the cached results corresponding to each slice query request from the cache and returning them to the user when the judgment module 203 determines that such cached results exist in the cache.

In this preferred embodiment, when cached results corresponding to each slice query request exist in the cache, the corresponding cached results are extracted from the cache, sorted in time order, combined, and then returned to the user.

A search module 205, for searching a search server for the search results corresponding to the search request and returning them to the user when the judgment module 203 determines that cached results corresponding to each slice query request do not exist in the cache.

In this preferred embodiment, when cached results corresponding to each slice query request do not exist in the cache, the query request is treated as a first-time query, that is, the search entered by the user is treated as a first-time search request. For a first-time search request, the search server is searched according to the search request to obtain the search results; an existing search technique may be used here, and the details are not repeated in the present invention. The search server is also referred to as a search engine.

Further, the search apparatus 20 can also include a sorting module 206, for sorting the multiple slice cached results in advance by time, for example in chronological order or in reverse chronological order.

A preferred technical solution sorts the multiple slice cached results in reverse chronological order, so that the cached results with the most recent times are always at the top of the cache database. That is, the cache start time and cache end time of the first slice cached result are closest to the current time; those of the second slice cached result are the second closest to the current time; and so on; those of the last slice cached result are farthest from the current time. In addition, for two adjacent slice cached results, the cache start time of the earlier-sorted slice cached result equals the cache end time of the later-sorted slice cached result.

In this preferred embodiment, the sorting module 206 sorts the multiple slice query requests by time and sorts the multiple slice cached results by time. This avoids matching every slice query request against every slice cached result one by one: once a single slice query request has been matched to a single slice cached result, the remaining slice query requests and slice cached results can be matched in sequence, which not only reduces the matching complexity but also saves matching time and shortens the query processing time.

Further, the search apparatus 20 can also include an association module 207, for associating the search request with the search results obtained from the search server and storing them in the cache.

The search request is associated, as a query request, with the cached results obtained from the search server and stored in the cache, so that when the user searches for the keyword again the results can be extracted directly from the cache, thereby improving the cache hit rate. In addition, reading data from the cache is much faster than querying data from the search server, which relieves the access pressure on the search server, shortens the search processing time and improves the user experience.

It should be understood that, in practical applications, dividing the time period in the search request by the preset unit falls into two cases: either the period divides exactly, or there is a remainder. The resulting multiple slice query requests are correspondingly divided into two kinds. When the period divides exactly, the multiple slice query requests obtained by dividing the time period in the search request by the preset unit are all slice query requests of the preset unit. When there is a remainder, the part of the time period in the search request that cannot be divided by the preset unit yields a slice query request of a non-preset unit. For a slice query request of a non-preset unit, no corresponding cached result can be matched in the cache, so the slice query request of the non-preset unit is regarded as a first-time search request. For a first-time search request, the search server is searched according to the search request to obtain the search results; an existing search technique may be used here, and the details are not repeated in the present invention. Accordingly, the judgment module 203 is also used to determine whether the multiple slice query requests include a slice query request of a non-preset unit. When the judgment module 203 determines that a slice query request of a non-preset unit exists among the multiple slice query requests, the search module 205 searches the search server for the search results corresponding to the slice query request of the non-preset unit and returns them to the user; when the judgment module 203 determines that no slice query request of a non-preset unit exists among the multiple slice query requests, the judgment module 203 continues to determine whether cached results corresponding to each slice query request exist in the cache.

In conclusion the device 20 of the present invention for scanning for accelerating using query caching of cutting into slices, the receiving module 201 receive the searching request of user;The section module 202 according to default section rule cut into slices to described search request To multiple section inquiry requests;The judgment module 203 judges to whether there is in caching corresponding with each section inquiry request Buffered results;When the judgment module 203 determines there are buffered results corresponding with each section inquiry request in caching When, the acquisition module 204 obtains buffered results corresponding with each section inquiry request and is returned to from the caching User;When the judgment module 203 determines that buffered results corresponding with each section inquiry request are not present in caching, institute Search module 205 is stated to search for from search server and the described search corresponding search result of request and return to user.It utilizes Section caching mechanism is cached and inquired about, and can improve buffered results hit rate, reduces the search response time, is mitigated to search The access pressure of server.It, can be concurrently to multiple section inquiry requests after the searching request of user is cut into slices simultaneously It scans for, greatly shortens the search process time, promote user experience.

Such as by the full dose data in caching, it is data cached to be cut into slices to obtain the section of multiple units in units of day, When user asks the time of search to be any time period, if the time that ask search can be cut into slices to obtain in units of day The section inquiry request of multiple units is then hit cutting for multiple units by the section inquiry request of multiple units from the caching Piece is data cached, and the data in caching can be recycled well, saves the Internet resources of search server, alleviates network and gathers around It is stifled.For there is the section inquiry request of non-default unit in the section inquiry request cut into slices in units of day, then according to The section inquiry request of the non-default unit is scanned for according to existing search technique, since the section of non-default unit is looked into The search time unit ask in request is less than 1 day, therefore while being scanned for according to existing search technique takes also smaller, thus energy Effectively shorten search time.

Secondly, the device 20 of the present invention for scanning for accelerating using query caching of cutting into slices, association described search request With the described search result searched for from described search server;And described search request and corresponding described search result are deposited It stores up in the caching, is directly extracted when can be scanned for again to the keyword convenient for user from the caching.Most Afterwards, multiple section buffered results in the multiple section inquiry request and caching are ranked up according to time sequencing, it can be with It avoids matching each section inquiry request one by one with each section buffered results, and then further shortens slow Deposit query time.

Finally it should be noted that present invention section caching be not limited to it is above-mentioned enumerate using the default unit interval to cut into slices Foundation can be applicable to the section of other forms, for example, quantity etc..Certainly, enumerated once present invention is also not necessarily limited to above-mentioned Section caching can also be multiple section caching, for example, pre-setting two cache servers, one of buffer service Device is cut into slices according to the first default section unit, another cache server is cut into slices according to the second default section unit, Described first default section unit is more than the described second default section unit, and the present invention no longer elaborates herein.

The integrated unit implemented in the form of a software function module can be stored in a computer-readable storage medium. The software function module is stored in a storage medium and includes several instructions for causing a computer device (which can be a personal computer, a dual-screen device, a network device or the like) or a processor to perform part of the method of each embodiment of the present invention.

As shown in Fig. 3, it is a hardware architecture diagram of the server that implements the search method of the present invention.

In a preferred embodiment of the present invention, the server 3 includes a memory 31, at least one processor 32 and at least one communication bus 33.

Those skilled in the art will understand that the structure of the device shown in Fig. 3 does not limit the embodiments of the present invention; it can be a bus-type structure or a star structure, and the server 3 can also include more or fewer hardware or software components than illustrated, or a different arrangement of components.

In some embodiments, the server 3 is a device capable of automatically performing numerical computation and/or information processing according to preset or stored instructions; its hardware includes, but is not limited to, a microprocessor, an application-specific integrated circuit, a programmable gate array, a digital signal processor, an embedded device and the like. The server 3 can also include a user device, which includes, but is not limited to, any electronic product that can interact with a user through a keyboard, a mouse, a remote control, a touch pad, a voice-control device or the like, for example a personal computer, a tablet computer, a smart phone, a personal digital assistant (PDA) or any other electronic product with a dual screen.

It should be noted that the server 3 is only an example; other electronic products that exist now or may appear in the future and that can be adapted to the present invention should also fall within the protection scope of the present invention and are incorporated herein by reference.

In some embodiments, the memory 31 is used to store program code and various data, such as mounted on described Information in server 3 preserves system 10, and realizes high speed in the operational process of server 3, is automatically completed program or number According to access.The memory 31 includes read-only memory (Read-Only Memory, ROM), random access memory (Random Access Memory, RAM), it is programmable read only memory (Programmable Read-Only Memory, PROM), erasable Except programmable read only memory (Erasable Programmable Read-Only Memory, EPROM), disposable programmable only Reading memory (One-time Programmable Read-Only Memory, OTPROM), electronics erasing type can be made carbon copies read-only Memory (Electrically-Erasable Programmable Read-Only Memory, EEPROM), read-only optical disc (Compact Disc Read-Only Memory, CD-ROM) or other disk storages, magnetic disk storage, magnetic tape storage, Or it can be used in carrying or store any other computer-readable medium of data.

In some embodiments, at least one processor 32 can be made of integrated circuit, such as can be by single The integrated circuit of encapsulation is formed or is made of the integrated circuit that multiple identical functions or difference in functionality encapsulate, and is wrapped Include one or more central processing unit (Central Processing unit, CPU), microprocessor, digital processing chip, Combination of graphics processor and various control chips etc..At least one processor 32 is the control core of the server 3 (Control Unit), using various interfaces and all parts of the entire server 3 of connection, by running or performing storage Program or module and calling in the memory 31 are stored in the data in the memory 31, to perform service The various functions of device 3 and processing data, such as perform searcher 20.

In some embodiments, the at least one communication bus 33 is arranged to realize connection and communication between the memory 31, the at least one processor 32, and the like.

Although not shown, the server 3 may also include a power supply (such as a battery) for supplying power to the various components. Preferably, the power supply may be logically connected to the at least one processor 32 through a power-management system, so that functions such as charge management, discharge management, and power-consumption management are realized through the power-management system. The power supply may also include one or more DC or AC power sources, a recharging system, a power-failure detection circuit, a power converter or inverter, a power status indicator, and other arbitrary components. The server 3 may also include multiple sensors, a Bluetooth module, a Wi-Fi module, a camera, and the like, which are not described in detail here.

It should be understood that the described embodiment is for illustration only, and that the scope of the patent claims is not limited by this structure.

In a further embodiment, with reference to Fig. 1, the at least one processor 32 can execute the operating system of the server 3 as well as various installed application programs (such as the searcher 20 mentioned above), program code, and the like, for example the above-mentioned modules, including: a receiving module 201, a slicing module 202, a judgment module 203, an acquisition module 204, a search module 205, a sorting module 206, and an association module 207.
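Purely for illustration, the cooperation of these modules can be sketched in Python as follows; the class, method, and parameter names (SearchApparatus, receive, slice, and so on) are assumptions introduced for this sketch and are not prescribed by the embodiment, and the slicing and matching details of the preset rule are sketched separately after the claims.

```python
# A minimal sketch, assuming a Python implementation; all identifiers are illustrative.
class SearchApparatus:
    def __init__(self, cache, search_server, unit=3600):
        self.cache = cache                    # mapping: (keyword, start, end) -> list of results
        self.search_server = search_server    # callable performing the full-volume search
        self.unit = unit                      # assumed "preset unit" for slicing, in seconds

    def receive(self, keyword, start, end):                    # receiving module 201
        slices = sorted(self.slice(keyword, start, end),       # slicing module 202
                        key=lambda s: s[1])                    # sorting module 206 (time order)
        if all(s in self.cache for s in slices):               # judgment module 203
            return [r for s in slices for r in self.cache[s]]  # acquisition module 204
        results = self.search_server(keyword, start, end)      # search module 205
        self.cache[(keyword, start, end)] = results            # association module 207
        return results

    def slice(self, keyword, start, end):
        # one slice query per preset unit of the requested period; for brevity the
        # slices are not aligned to unit boundaries here (the rounding/modulo rule
        # is sketched after the claims)
        return [(keyword, t, min(t + self.unit, end))
                for t in range(start, end, self.unit)]
```

In this sketch the judgment module reports a hit only when every slice query is already present in the cache; otherwise the whole request falls through to the search server and is then associated with its result and stored.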

Specifically, for the concrete way in which the at least one processor 32 implements the above instructions, reference may be made to the description of the corresponding steps in the embodiment corresponding to Fig. 1, which is not repeated here.

Although the above embodiments implement the present solution on an independent component such as a search server, with the conversion from a search request into slice queries, the lookup of the cache, and the extraction of cached results all completed by the search server as a separate server, the present application imposes no restriction in this respect: the solution may also be implemented as one component module within a complete system composed of the various server devices of a website service system, such as a Web server, a mobile client server, a search server, and a cache server. In addition, besides a mobile internet server and a Web server, the server system may also include any other servers known in the art or developed in the future. Furthermore, although the cache database is provided within the search server shown above, a cache server may also be provided as a standalone module or component and combined with the search server into one system.
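To make the deployment choice above concrete, a minimal cache abstraction is sketched below, again assuming a Python implementation; the interface, the class names, and the transport callable are all assumptions introduced for illustration, not anything prescribed by the embodiment.

```python
# A minimal sketch, assuming Python; the same searcher logic can use either variant.
from abc import ABC, abstractmethod

class SliceCache(ABC):
    """Common interface, whether the cache lives inside the search server or not."""
    @abstractmethod
    def get(self, key):
        """Return the cached results stored under key, or None."""
    @abstractmethod
    def put(self, key, results):
        """Store results under key."""

class InProcessCache(SliceCache):
    """Cache database kept inside the search server process itself."""
    def __init__(self):
        self._data = {}
    def get(self, key):
        return self._data.get(key)
    def put(self, key, results):
        self._data[key] = results

class RemoteCache(SliceCache):
    """Standalone cache server reached over the network; `transport` stands in for
    whatever client the actual deployment uses and is purely a placeholder."""
    def __init__(self, transport):
        self._transport = transport
    def get(self, key):
        return self._transport("GET", key)
    def put(self, key, results):
        return self._transport("PUT", key, results)
```

Behind such an interface, the slicing and lookup logic stays the same whether the cache database sits inside the search server or is moved to a separate cache server combined with it into one system.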

In the several embodiments provided by the present invention, it should be understood that the disclosed system, apparatus, and method may be implemented in other ways. For example, the apparatus embodiments described above are merely exemplary; for instance, the division into modules is only a division by logical function, and other divisions are possible in actual implementation.

The modules described as separate components may or may not be physically separate, and the components shown as modules may or may not be physical units; they may be located in one place or distributed over multiple network elements. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment.

In addition, the functional modules in the embodiments of the present invention may be integrated into one processing unit, or each unit may physically exist alone, or two or more units may be integrated into one unit. The above integrated unit may be implemented in the form of hardware, or in the form of hardware plus software functional modules.

It is obvious to those skilled in the art that the present invention is not limited to the details of the above exemplary embodiments, and that the present invention can be realized in other specific forms without departing from the spirit or essential attributes of the invention. Therefore, from whatever point of view, the embodiments are to be regarded as illustrative and not restrictive, and the scope of the present invention is defined by the appended claims rather than by the above description; all changes falling within the meaning and range of equivalency of the claims are therefore intended to be embraced by the present invention. Any reference sign in a claim should not be construed as limiting the claim concerned. Furthermore, it is clear that the word "comprising" does not exclude other units or steps, and the singular does not exclude the plural. Multiple units or devices recited in the system claims may also be implemented by one unit or device through software or hardware. Words such as "first" and "second" are used to denote names and do not denote any particular order.

Finally, it should be noted that the above embodiments are merely intended to illustrate, and not to limit, the technical solution of the present invention. Although the present invention has been described in detail with reference to the preferred embodiments, those of ordinary skill in the art should understand that modifications or equivalent substitutions can be made to the technical solution of the present invention without departing from the spirit and scope of the technical solution of the present invention.

Claims (10)

1. A searching method, characterized in that the method comprises:
receiving a search request of a user;
slicing the search request according to a preset slicing rule to obtain multiple slice queries;
judging whether cached results corresponding to each slice query exist in a cache, wherein the cached results in the cache have been sliced in advance according to the preset slicing rule to obtain multiple slice cached results; and
when it is determined that cached results corresponding to each slice query exist in the cache, obtaining the cached results corresponding to each slice query from the cache and returning them to the user; or
when it is determined that cached results corresponding to each slice query do not exist in the cache, searching a search server for a search result corresponding to the search request and returning it to the user.
2. The method according to claim 1, characterized in that
the search request comprises: a search keyword, a start time of the search result to be returned, and an end time of the search result to be returned;
the preset slicing rule refers to performing rounding and modulo operations, according to a preset unit, on the time period in the search request, the time period in the search request being the difference between the end time of the search result to be returned and the start time of the search result to be returned;
the slice query comprises: the search keyword, a start time of the cached results to be returned, and an end time of the cached results to be returned; and
the slice cached result comprises: a cache keyword, a cache start time, and a cache end time, the difference between the cache start time and the cache end time being equal to the preset unit.
3. The method according to claim 2, characterized in that after slicing the search request according to the preset slicing rule to obtain the multiple slice queries, the method further comprises:
judging whether a slice query of a non-preset unit exists among the multiple slice queries; and
when it is determined that a slice query of a non-preset unit exists, obtaining a search result corresponding to the slice query of the non-preset unit from the search server and returning it to the user.
4. The method according to claim 3, characterized in that after searching the search server for the search result corresponding to the search request and returning it to the user, the method further comprises:
associating the search request with the search result found from the search server and storing them in the cache.
5. The method according to claim 2, characterized in that judging whether cached results corresponding to each slice query exist in the cache comprises:
matching, one by one, the search keyword in each slice query against the cache keyword in the slice cached results;
matching, one by one, the start time of the cached results to be returned in each slice query against the cache start time in the slice cached results; and
matching, one by one, the end time of the cached results to be returned in each slice query against the cache end time in the slice cached results.
6. The method according to claim 5, characterized in that the method further comprises:
sorting the multiple slice cached results in advance according to time order.
7. The method according to any one of claims 1 to 6, characterized in that the method further comprises:
sorting the multiple slice queries according to time order.
8. A searching apparatus, characterized in that the apparatus comprises:
a receiving module, configured to receive a search request of a user;
a slicing module, configured to slice the search request according to a preset slicing rule to obtain multiple slice queries;
a judgment module, configured to judge whether cached results corresponding to each slice query exist in a cache;
an acquisition module, configured to obtain the cached results corresponding to each slice query from the cache and return them to the user when the judgment module determines that cached results corresponding to each slice query exist in the cache; and
a search module, configured to search a search server for a search result corresponding to the search request and return it to the user when the judgment module determines that cached results corresponding to each slice query do not exist in the cache.
9. An electronic device, characterized in that the electronic device comprises a processor, and the processor is configured to execute a computer program stored in a memory to implement the searching method according to any one of claims 1 to 7.
10. A computer-readable storage medium on which a computer program is stored, characterized in that the computer program, when executed by a processor, implements the searching method according to any one of claims 1 to 7.
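Purely as an illustration of the slicing rule and matching recited in claims 2 to 5, one possible reading is sketched below in Python; the data classes, the one-hour unit, and every identifier are assumptions made for this sketch, and where the claims leave details open (for example how a stored full-request result is later re-sliced) the sketch simply omits them.

```python
# Illustrative sketch of claims 2-5 under assumed names; not the patented implementation.
from dataclasses import dataclass

UNIT = 3600  # assumed value of the "preset unit" of claim 2: one hour, in seconds

@dataclass(frozen=True)
class SliceQuery:            # claim 2: search keyword + start/end of results to return
    keyword: str
    start: int
    end: int

@dataclass(frozen=True)
class SliceCachedResult:     # claim 2: cache keyword + cache start/end (one UNIT apart when pre-sliced)
    keyword: str
    start: int
    end: int
    results: tuple

def slice_request(keyword, start, end):
    """Claim 2: cut the requested period at UNIT boundaries (rounding), keeping any
    shorter leading or trailing remainder (modulo) as an extra, non-unit slice."""
    slices, t = [], start
    while t < end:
        boundary = (t // UNIT + 1) * UNIT          # next UNIT boundary after t
        slices.append(SliceQuery(keyword, t, min(boundary, end)))
        t = min(boundary, end)
    return slices

def is_unit_slice(q):
    """Claim 3 distinguishes slices whose span is not exactly one UNIT; such slices
    are not looked up in the cache."""
    return q.end - q.start == UNIT

def match(q, cached):
    """Claim 5: match the keyword, the start time, and the end time one by one."""
    for c in cached:
        if c.keyword == q.keyword and c.start == q.start and c.end == q.end:
            return c
    return None

def handle(keyword, start, end, cached, search_server):
    """Claim 1 flow with claim 4 storing; `cached` is a list of SliceCachedResult.
    For brevity, any miss (including non-unit slices, cf. claim 3) falls back to a
    single full search instead of per-slice fetches."""
    hits = []
    for q in sorted(slice_request(keyword, start, end), key=lambda s: s.start):
        c = match(q, cached) if is_unit_slice(q) else None
        if c is None:
            results = search_server(keyword, start, end)                           # fallback path
            cached.append(SliceCachedResult(keyword, start, end, tuple(results)))  # claim 4
            return results
        hits.append(c)
    return [r for c in hits for r in c.results]                                    # cached-result path
```

Aligning the slices to UNIT boundaries is what lets different search requests over overlapping periods produce identical slice queries, which is how the sliced cache raises the hit rate compared with caching only byte-identical query strings.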
CN201711309502.2A 2017-12-11 2017-12-11 A kind of searching method, device, electronic equipment and storage medium CN108090153A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711309502.2A CN108090153A (en) 2017-12-11 2017-12-11 A kind of searching method, device, electronic equipment and storage medium

Publications (1)

Publication Number Publication Date
CN108090153A true CN108090153A (en) 2018-05-29

Family

ID=62174773

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711309502.2A CN108090153A (en) 2017-12-11 2017-12-11 A kind of searching method, device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN108090153A (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104424199A (en) * 2013-08-21 2015-03-18 阿里巴巴集团控股有限公司 Search method and device
CN105808661A (en) * 2016-02-29 2016-07-27 浪潮通信信息系统有限公司 Data query method and device
CN106598717A (en) * 2016-12-07 2017-04-26 陕西尚品信息科技有限公司 Time slice-based task scheduling method
CN106776632A (en) * 2015-11-23 2017-05-31 北京国双科技有限公司 Data query method and device

Similar Documents

Publication Publication Date Title
US8271471B1 (en) Anticipated query generation and processing in a search engine
JP6171111B2 (en) Blending search results on online social networks
Chen et al. Big data: A survey
US20080109285A1 (en) Techniques for determining relevant advertisements in response to queries
JP2013519150A (en) Information retrieval system with real-time feedback
CN102300012B (en) One-to-one matching in contact center
CN101932996B (en) Parallel processing computer systems with reduced power consumption and methods for providing the same
US8972245B2 (en) Text prediction using environment hints
US9552350B2 (en) Virtual assistant conversations for ambiguous user input and goals
US20110016421A1 (en) Task oriented user interface platform
US8214361B1 (en) Organizing search results in a topic hierarchy
US9208240B1 (en) Implementation of a web scale data fabric
CN103748555B (en) Multi-dimensional user request pattern fast supply virtual machine is based in cloud
US20100318538A1 (en) Predictive searching and associated cache management
US20180129702A1 (en) Age-based policies for determining database cache hits
WO2015030796A1 (en) Extensible context-aware natural language interactions for virtual personal assistants
CN104160381A (en) Managing tenant-specific data sets in a multi-tenant environment
US20160277515A1 (en) Server side data cache system
CN104903894A (en) System and method for distributed database query engines
CN102713909A (en) Dynamic community-based cache for mobile search
CN102929950A (en) Contend and member recommended by social network is used for personalized search result
CN104969184A (en) Personalized real-time recommendation system
US8140512B2 (en) Consolidated information retrieval results
CN103597482A (en) Storing data on storage nodes
CN102915380A (en) Method and system for carrying out searching on data

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination