Disclosure of Invention
The present invention is directed to a data retrieval method, a data retrieval apparatus, a computer-readable storage medium, and an electronic device, which overcome, at least to some extent, the problem of low accuracy of retrieval results caused by the limitations and disadvantages of the related art.
According to an aspect of the present disclosure, there is provided a data retrieval method including:
inputting data to be retrieved into a sentence vector encoder to obtain a plurality of sentence vectors, and obtaining a vector matrix to be retrieved according to each sentence vector;
respectively inputting the sentence vectors of each row in the vector matrix to be retrieved into a database with a vector index for retrieval to obtain a plurality of candidate retrieval results;
calculating a Euclidean distance between each sentence vector and its corresponding candidate retrieval result, and a weight of each sentence vector in the vector matrix to be retrieved;
and obtaining a plurality of target retrieval results according to the candidate retrieval results, the Euclidean distances and the weights, and displaying the target retrieval results according to a word shift distance (Word Mover's Distance, WMD) between each target retrieval result and the data to be retrieved.
In an exemplary embodiment of the present disclosure, inputting data to be retrieved into a sentence vector encoder to obtain a plurality of sentence vectors includes:
performing word segmentation processing on the data to be retrieved to obtain a plurality of word groups, and inputting each word group into a sentence vector encoder to obtain a plurality of sentence vectors;
wherein, the sentence vector encoder is a supervised sentence embedding model.
In an exemplary embodiment of the present disclosure, the data retrieval method further includes:
calculating the length of each sentence vector;
and padding the sentence vector when it is determined that the length of the sentence vector does not reach a preset length.
In an exemplary embodiment of the present disclosure, calculating the weight of each sentence vector in the vector matrix to be retrieved includes:
calculating the number of times each sentence vector appears in the vector matrix to be retrieved and the total number of sentence vectors in the vector matrix to be retrieved;
and calculating the weight of each sentence vector in the vector matrix to be retrieved according to the number of times the sentence vector appears in the vector matrix to be retrieved and the total number of sentence vectors in the vector matrix to be retrieved.
In an exemplary embodiment of the present disclosure, calculating the euclidean distance between the candidate retrieval result corresponding to each sentence vector and the sentence vector includes:
performing a difference operation on the candidate retrieval result corresponding to each sentence vector and the sentence vector to obtain a plurality of difference operation results;
and performing a summation operation on the squares of the difference operation results, and taking the square root of the result of the summation operation to obtain the Euclidean distance between the candidate retrieval result corresponding to each sentence vector and the sentence vector.
In an exemplary embodiment of the present disclosure, the data retrieval method further includes:
obtaining a plurality of weight vectors according to the weight of each sentence vector in the vector matrix to be retrieved and the weight of the candidate retrieval result corresponding to each sentence vector in the target retrieval result;
and obtaining a word shift distance between the target retrieval result and the data to be retrieved according to the Euclidean distance between the candidate retrieval result corresponding to each sentence vector and the sentence vector and the weight vector corresponding to each sentence vector.
In an exemplary embodiment of the present disclosure, obtaining the word shift distance between the target retrieval result and the data to be retrieved according to the Euclidean distance between the candidate retrieval result corresponding to each sentence vector and the sentence vector and the weight vector corresponding to each sentence vector includes:
performing a product operation on the Euclidean distance between the candidate retrieval result corresponding to each sentence vector and the sentence vector and the weight vector corresponding to each sentence vector to obtain a plurality of product operation results;
and performing a summation operation on the product operation results to obtain a plurality of sum operation results, and performing a minimization operation on the sum operation results to obtain the word shift distance between each target retrieval result and the data to be retrieved.
According to an aspect of the present disclosure, there is provided a data retrieval apparatus including:
the first processing module is used for inputting the data to be retrieved into the sentence vector encoder to obtain a plurality of sentence vectors and obtaining a vector matrix to be retrieved according to each sentence vector;
the second processing module is used for respectively inputting the sentence vectors of each row in the vector matrix to be retrieved into a database with vector indexes for retrieval to obtain a plurality of candidate retrieval results;
the first calculation module is used for calculating Euclidean distances between candidate retrieval results corresponding to the sentence vectors and weights of the sentence vectors in the vector matrix to be retrieved;
and the third processing module is used for obtaining a plurality of target retrieval results according to the candidate retrieval results, the Euclidean distances and the weights, and displaying the target retrieval results according to the word shift distance between the target retrieval results and the data to be retrieved.
According to an aspect of the present disclosure, there is provided a computer readable storage medium having stored thereon a computer program which, when executed by a processor, implements a data retrieval method as recited in any one of the above.
According to an aspect of the present disclosure, there is provided an electronic device including:
a processor; and
a memory for storing executable instructions of the processor;
wherein the processor is configured to perform any of the data retrieval methods described above via execution of the executable instructions.
On one hand, a plurality of sentence vectors are obtained by inputting the data to be retrieved into a sentence vector encoder, and a vector matrix to be retrieved is obtained according to the sentence vectors; the sentence vectors of each row in the vector matrix to be retrieved are then respectively input into a database with a vector index for retrieval to obtain a plurality of candidate retrieval results; the Euclidean distance between each sentence vector and its corresponding candidate retrieval result and the weight of each sentence vector in the vector matrix to be retrieved are then calculated; finally, a plurality of target retrieval results are obtained according to the candidate retrieval results, the Euclidean distances and the weights, and the target retrieval results are displayed according to the word shift distance between each target retrieval result and the data to be retrieved. This solves the prior-art problem that retrieval results with different literal forms but similar semantics cannot be recalled, which leads to low accuracy, and thereby improves the accuracy of the retrieval results. On another hand, because the target retrieval results are obtained according to the candidate retrieval results, the Euclidean distances and the weights, and are displayed according to the word shift distance between each target retrieval result and the data to be retrieved, the problem that overly broad recalled results easily cause semantic deviation is avoided, further improving the accuracy of the retrieval results. On yet another hand, inputting the sentence vectors of each row in the vector matrix to be retrieved into the database with the vector index for retrieval to obtain the candidate retrieval results improves retrieval efficiency.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the invention, as claimed.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. Example embodiments may, however, be embodied in many different forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of example embodiments to those skilled in the art. The described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to give a thorough understanding of embodiments of the invention. One skilled in the relevant art will recognize, however, that the invention may be practiced without one or more of the specific details, or with other methods, components, devices, steps, and so forth. In other instances, well-known technical solutions have not been shown or described in detail to avoid obscuring aspects of the invention.
Furthermore, the drawings are merely schematic illustrations of the invention and are not necessarily drawn to scale. The same reference numerals in the drawings denote the same or similar parts, and thus their repetitive description will be omitted. Some of the block diagrams shown in the figures are functional entities and do not necessarily correspond to physically or logically separate entities. These functional entities may be implemented in the form of software, or in one or more hardware modules or integrated circuits, or in different networks and/or processor devices and/or microcontroller devices.
Information retrieval is very widely used in the field of knowledge question answering. The general process of a retrieval-based question answering system may include: first, constructing a knowledge base of candidate answers; second, when a user inputs a question, finding the stored question closest to it based on a similarity calculation, and then returning the corresponding answer. The method mainly includes the following steps:
1) constructing a candidate answer index set; 2) after receiving a query, preliminarily selecting some candidate answers; 3) matching the query against the answers, and then ranking them; 4) finally, returning the top-k (first k) answers.
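The four steps above can be sketched as a minimal toy pipeline. The data, the index structure, and the word-overlap score below are illustrative assumptions only, not the similarity calculation used by the invention:

```python
# Minimal sketch of the four steps: build a candidate-answer index,
# pre-select candidates for a query, match and rank them, return top-k.
# The toy data and word-overlap score are illustrative assumptions.

def build_index(qa_pairs):
    # Step 1: construct the candidate answer index set (question -> answer).
    return {q: a for q, a in qa_pairs}

def preselect(index, query):
    # Step 2: preliminarily select candidates sharing any word with the query.
    words = set(query.split())
    return [(q, a) for q, a in index.items() if words & set(q.split())]

def rank(candidates, query, k=2):
    # Steps 3-4: match (score by word overlap), rank, and return top-k answers.
    words = set(query.split())
    ranked = sorted(candidates,
                    key=lambda qa: len(words & set(qa[0].split())),
                    reverse=True)
    return [a for _, a in ranked[:k]]

index = build_index([
    ("what is annoy", "an approximate nearest neighbor library"),
    ("what is wmd", "a distance between documents"),
])
print(rank(preselect(index, "what is wmd"), "what is wmd", k=1))
# ['a distance between documents']
```

As the rest of the description explains, the invention replaces the literal word-overlap match sketched here with sentence-vector similarity, precisely because literal matching cannot recall semantically similar questions with different wording.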
However, traditional retrieval is based on inverted word indexes and cannot recall results with different literal forms but similar semantics; in addition, current retrieval methods based on semantic vectors can recall results with similar semantics, but the recalled results are too broad and are prone to semantic deviation.
In the present exemplary embodiment, a data retrieval method is first provided. The method may run on a server, a server cluster, a cloud server, or the like, and may also run on a device terminal; of course, those skilled in the art may also run the method of the present invention on other platforms as needed, which is not specially limited in this exemplary embodiment. Referring to fig. 1, the data retrieval method includes the following steps:
step S110, inputting data to be retrieved into a sentence vector encoder to obtain a plurality of sentence vectors, and obtaining a vector matrix to be retrieved according to the sentence vectors.
Step S120, respectively inputting the sentence vectors of each row in the vector matrix to be retrieved into a database with a vector index for retrieval to obtain a plurality of candidate retrieval results.
Step S130, calculating Euclidean distances between candidate retrieval results corresponding to the sentence vectors and weights of the sentence vectors in the vector matrix to be retrieved.
Step S140, a plurality of target retrieval results are obtained according to each candidate retrieval result, each Euclidean distance and each weight, and each target retrieval result is displayed according to the word shift distance between each target retrieval result and the data to be retrieved.
In the data retrieval method, on one hand, a plurality of sentence vectors are obtained by inputting the data to be retrieved into a sentence vector encoder, and a vector matrix to be retrieved is obtained according to the sentence vectors; the sentence vectors of each row in the vector matrix to be retrieved are then respectively input into a database with a vector index for retrieval to obtain a plurality of candidate retrieval results; the Euclidean distance between each sentence vector and its corresponding candidate retrieval result and the weight of each sentence vector in the vector matrix to be retrieved are then calculated; finally, a plurality of target retrieval results are obtained according to the candidate retrieval results, the Euclidean distances and the weights, and the target retrieval results are displayed according to the word shift distance between each target retrieval result and the data to be retrieved. This solves the prior-art problem that retrieval results with different literal forms but similar semantics cannot be recalled, which leads to low accuracy, and thereby improves the accuracy of the retrieval results. On another hand, because the target retrieval results are obtained according to the candidate retrieval results, the Euclidean distances and the weights, and are displayed according to the word shift distance between each target retrieval result and the data to be retrieved, the problem that overly broad recalled results easily cause semantic deviation is avoided, further improving the accuracy of the retrieval results. On yet another hand, inputting the sentence vectors of each row in the vector matrix to be retrieved into the database with the vector index for retrieval to obtain the candidate retrieval results improves retrieval efficiency.
Hereinafter, each step involved in the data retrieval method of the exemplary embodiment of the present invention will be explained and described in detail with reference to the drawings.
In step S110, the data to be retrieved is input to a sentence vector encoder to obtain a plurality of sentence vectors, and a vector matrix to be retrieved is obtained according to each sentence vector.
In this example embodiment, first, a word segmentation process is performed on the data to be retrieved to obtain a plurality of word groups, and then, each of the word groups is input to a sentence vector encoder to obtain a plurality of sentence vectors; wherein, the sentence vector encoder is a supervised sentence embedding model. In detail:
First, the sentence vector encoder is explained. Referring to fig. 2, the sentence vector encoder may be, for example, a supervised sentence embedding model (an InferSent model), whose main structure may include a BiLSTM and Max-Pooling. The model may include a first input (e.g., a sentence encoder with a premise input) 201, a second input (a sentence encoder with a hypothesis input) 202, an encoding layer 203, a fully connected layer 204, a pooling layer 205, and an output 206. Specifically, from the embedding encoding of each token in the BiLSTM layer, an encoding matrix of shape [max_seq_len, embedding_size] is obtained to represent each sentence. Sentences can thus be effectively represented, avoiding the semantic loss in retrieval that would be caused by using a single sentence embedding. It should be noted that the first input and the second input are paired inputs, and the second input may be a sentence similar to the first input or a sentence opposite to the first input.
Further, the InferSent model can be trained using sentence pairs composed from an existing synonym forest (or antonym forest), a common synonym dictionary (or antonym dictionary) obtained by collating and converting encyclopedia entries, or other common Chinese synonym dictionaries (or antonym dictionaries), so as to obtain the trained supervised sentence embedding model.
Next, step S110 described above is explained. First, a word segmentation tool (any word segmentation tool may be chosen, which is not specially limited in this example) can be used to segment the data to be retrieved to obtain a plurality of word groups; then, each word group is input into the sentence vector encoder to obtain a plurality of sentence vectors. It should be added here that, in order to prevent overly short sentence vectors from affecting subsequent retrieval results, the method may further include: calculating the length of each sentence vector; and padding a sentence vector when its length does not reach a preset length. For example, the length of a sentence vector may be set to a fixed length M, and a sentence vector that does not reach this length may be padded. Specifically, the padding may be performed with 0 or with other characters, which is not specially limited in this example. It should be noted that, since the sentence vector encoder is trained using an existing synonym forest (or antonym forest), a common synonym dictionary (or antonym dictionary) obtained by collating and converting encyclopedia entries, or other common Chinese synonym dictionaries (or antonym dictionaries), the obtained sentence vectors may include sentence vectors that are synonymous with the word groups in the data to be retrieved. This solves the prior-art problem that retrieval results with different literal forms but similar semantics cannot be recalled, leading to low accuracy of the retrieval results.
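The padding step described above can be sketched as follows; the preset length M = 5 and the zero padding value are illustrative assumptions (the text also allows other padding characters):

```python
def pad_sentence_vector(vec, preset_len=5, pad_value=0.0):
    # If the sentence vector is shorter than the preset length M,
    # fill the missing positions with the padding value (0 by default).
    if len(vec) >= preset_len:
        return list(vec)
    return list(vec) + [pad_value] * (preset_len - len(vec))

print(pad_sentence_vector([0.2, 0.7]))       # padded up to length 5
print(pad_sentence_vector([1, 2, 3, 4, 5]))  # already long enough: unchanged
```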
Further, after sentence vectors are obtained, the sentence vectors can be combined to obtain a vector matrix to be retrieved.
In step S120, the sentence vectors in each row of the vector matrix to be retrieved are respectively input into the database with vector indexes for retrieval, so as to obtain a plurality of candidate retrieval results.
In the present exemplary embodiment, the database with a vector index is first explained. Referring to fig. 3, the database with a vector index may be, for example, an index of the sentence vectors built with Annoy. Annoy is an open-source C++/Python tool from Spotify for approximate nearest neighbor queries; it optimizes memory usage, allows indexes to be saved to and loaded from disk, and guarantees query efficiency. Annoy is an open-source library for approximate nearest neighbors in high-dimensional space. Annoy constructs a binary tree (see fig. 3) whose leaf nodes may represent the set of vectors assigned to that node; the query time is O(log n).
Further, the sentence vectors of each row in the vector matrix to be retrieved may be respectively input into the database with the vector index for retrieval to obtain a plurality of candidate retrieval results. It should be added here that, since the leaf nodes of the binary tree may represent the vector set assigned to each node, matching can proceed from the root node for each input row's sentence vector until the matching leaf nodes are reached, and the set of candidate retrieval results for each row's sentence vector is then returned as the above candidate retrieval results. In this way, query efficiency can be improved, and the problem that results with different literal forms but similar semantics cannot be recalled, leading to low accuracy of the retrieval results, can be avoided.
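In practice this step would query the Annoy index (via its nearest-neighbor lookup); the sketch below substitutes an exact brute-force search so it stays self-contained. The toy database and query matrix are assumptions for illustration only:

```python
import math

def euclidean(a, b):
    # Euclidean distance between two vectors of equal dimension.
    return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))

def query_index(index_vectors, row_vector, top_n=2):
    # Stand-in for an approximate-nearest-neighbor query: return the ids
    # of the top_n indexed vectors closest to the row vector.
    ranked = sorted(range(len(index_vectors)),
                    key=lambda i: euclidean(index_vectors[i], row_vector))
    return ranked[:top_n]

def retrieve_candidates(index_vectors, matrix, top_n=2):
    # One query per row of the vector matrix to be retrieved; each row
    # yields its own list of candidate retrieval result ids.
    return [query_index(index_vectors, row, top_n) for row in matrix]

db = [[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [5.0, 5.0]]   # indexed vectors
matrix = [[0.1, 0.0], [4.9, 5.1]]                        # matrix to retrieve
print(retrieve_candidates(db, matrix, top_n=1))          # [[0], [3]]
```

An Annoy-backed version would replace `query_index` with an index lookup; the exact search here returns the same neighbors, only without Annoy's logarithmic query time.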
In step S130, the Euclidean distance between the candidate retrieval result corresponding to each sentence vector and the sentence vector and the weight of each sentence vector in the vector matrix to be retrieved are calculated.
In this exemplary embodiment, first, calculating the Euclidean distance between the candidate retrieval result corresponding to each sentence vector and the sentence vector may specifically include: first, performing a difference operation on the candidate retrieval result corresponding to each sentence vector and the sentence vector to obtain a plurality of difference operation results; second, performing a summation operation on the squares of the difference operation results, and taking the square root of the result of the summation operation to obtain the Euclidean distance between the candidate retrieval result corresponding to each sentence vector and the sentence vector.
dist_i = sqrt((y_1 − x_1)² + (y_2 − x_2)² + ... + (y_n − x_n)²)

wherein y_1, y_2, ..., y_n is the set of vector components included in the candidate retrieval result; x_1, x_2, ..., x_n is the set of vector components included in the sentence vector; and dist_i is the Euclidean distance between the i-th sentence vector and its corresponding candidate retrieval result.
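The difference, squaring, summation, and square-root operations above map directly onto a few lines of code (the example vectors are arbitrary):

```python
import math

def euclidean_distance(candidate, sentence):
    # Difference operation on corresponding components...
    diffs = [y - x for y, x in zip(candidate, sentence)]
    # ...sum the squares of the difference operation results...
    total = sum(d ** 2 for d in diffs)
    # ...and take the square root to obtain the Euclidean distance.
    return math.sqrt(total)

print(euclidean_distance([3.0, 4.0], [0.0, 0.0]))  # 5.0
```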
Second, calculating the weight of each sentence vector in the vector matrix to be retrieved specifically includes: calculating the number of times each sentence vector appears in the vector matrix to be retrieved and the total number of sentence vectors in the vector matrix to be retrieved; and calculating the weight of each sentence vector in the vector matrix to be retrieved according to the number of times the sentence vector appears in the vector matrix to be retrieved and the total number of sentence vectors in the vector matrix to be retrieved.
d_i = c_i / N

wherein d_i is the weight of the i-th sentence vector in the vector matrix to be retrieved; c_i is the number of times the i-th sentence vector appears in the vector matrix to be retrieved; and N is the total number of sentence vectors in the vector matrix to be retrieved.
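Under this definition the weight is the relative frequency d_i = c_i / N. A minimal sketch (the toy matrix is an illustrative assumption):

```python
from collections import Counter

def sentence_vector_weights(matrix):
    # c_i: occurrences of each distinct sentence vector; N: total rows.
    rows = [tuple(row) for row in matrix]   # tuples are hashable
    counts = Counter(rows)
    n = len(rows)
    # Weight of each row's sentence vector is c_i / N.
    return [counts[row] / n for row in rows]

matrix = [[1.0, 0.0], [0.0, 1.0], [1.0, 0.0], [1.0, 0.0]]
print(sentence_vector_weights(matrix))  # [0.75, 0.25, 0.75, 0.75]
```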
In step S140, a plurality of target search results are obtained according to each candidate search result, each euclidean distance, and each weight, and each target search result is displayed according to a word shift distance between each target search result and the data to be searched.
In this exemplary embodiment, after the Euclidean distances and the weights are obtained, the candidate retrieval results may be combined according to the order in which each sentence vector appears in the data to be retrieved, the candidate retrieval result corresponding to each sentence vector, the Euclidean distance between each sentence vector and its corresponding candidate retrieval result, and the weight of each sentence vector in the vector matrix to be retrieved, so as to obtain a plurality of target retrieval results. Then, each target retrieval result is displayed according to the word shift distance between it and the data to be retrieved; for example, the smaller the word shift distance, the higher the result is displayed.
It should be added that the target retrieval results may be sorted according to the word shift distance and then displayed in order, up to the set number of retrieval results that can be displayed, so as to be convenient to view.
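The sort-and-truncate display logic can be sketched as follows (the result names, scores, and display limit are illustrative assumptions):

```python
def display_order(results, wmd_scores, max_display=3):
    # Sort target retrieval results by word shift distance, ascending:
    # the smaller the distance, the higher the result is shown. Then keep
    # only as many results as the display limit allows.
    ranked = sorted(zip(results, wmd_scores), key=lambda pair: pair[1])
    return [r for r, _ in ranked[:max_display]]

results = ["doc_a", "doc_b", "doc_c", "doc_d"]
scores = [0.8, 0.2, 0.5, 0.9]   # hypothetical word shift distances
print(display_order(results, scores, max_display=2))  # ['doc_b', 'doc_c']
```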
Fig. 4 schematically shows another data retrieval method according to an exemplary embodiment of the present invention. Referring to fig. 4, the data retrieving method may further include step S410 and step S420, which will be described in detail below.
In step S410, a plurality of weight vectors are obtained according to the weight of each sentence vector in the vector matrix to be retrieved and the weight of the candidate retrieval result corresponding to each sentence vector in the target retrieval result.
In this exemplary embodiment, weights of each sentence vector in a vector matrix to be retrieved and weights of candidate retrieval results corresponding to each sentence vector in a target retrieval result may be directly combined to obtain a plurality of weight vectors; other ways are also possible, such as weighted combination or concatenation, etc., which are not specifically limited by this example.
In step S420, a word shift distance between the target retrieval result and the data to be retrieved is obtained according to the Euclidean distance between the candidate retrieval result corresponding to each sentence vector and the sentence vector and the weight vector corresponding to each sentence vector.
In this exemplary embodiment, first, a product operation is performed on the euclidean distance between the candidate search result corresponding to each sentence vector and the sentence vector, and the weight vector corresponding to each sentence vector to obtain a plurality of product operation results; and secondly, performing summation operation on each product operation result to obtain a plurality of sum operation results, and performing minimization operation on each sum operation result to obtain a word shift distance between each target retrieval result and the data to be retrieved. In detail:
WMD = min Σ_i Σ_j T_ij × dist_ij, subject to Σ_j T_ij = d_i and Σ_i T_ij = d'_j

wherein T_ij is the weight vector (the amount of weight transported between the i-th and j-th words); d_i and d'_j are, respectively, the weight of each word in the sentence vector and the weight of each word in the candidate retrieval result corresponding to the sentence vector; d'_j is calculated in the same way as d_i and is not described again here. Furthermore, the scores of all the results are subjected to a secondary normalization operation (in the query dimension and the returned-result dimension) so that the constraint conditions of the above formula are satisfied; the weights T and the scores of all returned results are then used to compute the final WMD score, and the final results are returned sorted by score. This calculation method reuses the scores returned by retrieval, avoids the optimization procedure of the traditional WMD solution, and greatly improves the computational efficiency of the WMD algorithm.
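The final scoring step — multiply each distance by its weight, sum the products, and minimize over the sums — can be loosely sketched as below. The candidate alignments and their weights are assumed already given; this is not a full optimal-transport WMD solver, which is exactly the optimization the text says is avoided:

```python
def wmd_score(dist_rows, weight_rows):
    # For each candidate alignment, multiply each Euclidean distance by
    # its weight (the product operation), sum the products (the summation
    # operation), then take the minimum over all sums (the minimization
    # operation) to get the word shift distance for one target result.
    sums = [sum(d * w for d, w in zip(dists, weights))
            for dists, weights in zip(dist_rows, weight_rows)]
    return min(sums)

# Two hypothetical alignments between a query and one target result:
dists = [[0.2, 0.4], [0.1, 0.9]]     # Euclidean distances per alignment
weights = [[0.5, 0.5], [0.5, 0.5]]   # normalized weights per alignment
print(wmd_score(dists, weights))
```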
The present disclosure also provides a data retrieval device. Referring to fig. 5, the data retrieval apparatus may include a first processing module 510, a second processing module 520, a first calculation module 530, and a third processing module 540. Wherein:
the first processing module 510 may be configured to input data to be retrieved into a sentence vector encoder to obtain a plurality of sentence vectors, and obtain a vector matrix to be retrieved according to each sentence vector.
The second processing module 520 may be configured to respectively input the sentence vectors of each row in the vector matrix to be retrieved into a database with a vector index for retrieval to obtain a plurality of candidate retrieval results.
The first calculating module 530 may be configured to calculate a euclidean distance between a candidate search result corresponding to each sentence vector and the sentence vector, and a weight of each sentence vector in the vector matrix to be searched.
The third processing module 540 may be configured to obtain a plurality of target search results according to each candidate search result, each euclidean distance, and each weight, and display each target search result according to a word shift distance between each target search result and the data to be searched.
In an exemplary embodiment of the present disclosure, inputting data to be retrieved into a sentence vector encoder to obtain a plurality of sentence vectors includes:
performing word segmentation processing on the data to be retrieved to obtain a plurality of word groups, and inputting each word group into a sentence vector encoder to obtain a plurality of sentence vectors; wherein, the sentence vector encoder is a supervised sentence embedding model.
In an exemplary embodiment of the present disclosure, the data retrieval apparatus further includes:
and the second calculation module can be used for calculating the length of each sentence vector.
The filling module may be configured to fill the sentence vector when it is determined that the length of the sentence vector does not reach a preset length.
In an exemplary embodiment of the present disclosure, calculating the weight of each sentence vector in the vector matrix to be retrieved includes:
calculating the occurrence frequency of each sentence vector in the vector matrix to be retrieved and the total number of sentence vectors in the vector matrix to be retrieved;
and calculating the weight of each sentence vector in the vector matrix to be retrieved according to the number of times the sentence vector appears in the vector matrix to be retrieved and the total number of sentence vectors in the vector matrix to be retrieved.
In an exemplary embodiment of the present disclosure, calculating the euclidean distance between the candidate retrieval result corresponding to each sentence vector and the sentence vector includes:
performing a difference operation on the candidate retrieval result corresponding to each sentence vector and the sentence vector to obtain a plurality of difference operation results;
and performing a summation operation on the squares of the difference operation results, and taking the square root of the result of the summation operation to obtain the Euclidean distance between the candidate retrieval result corresponding to each sentence vector and the sentence vector.
In an exemplary embodiment of the present disclosure, the data retrieval apparatus further includes:
the weight vector calculation module may be configured to obtain a plurality of weight vectors according to the weight of each sentence vector in the vector matrix to be retrieved and the weight of the candidate retrieval result corresponding to each sentence vector in the target retrieval result;
the word shift distance calculation module may be configured to obtain a word shift distance between the target search result and the data to be searched according to an euclidean distance between the candidate search result corresponding to each sentence vector and the sentence vector and a weight vector corresponding to each sentence vector.
In an exemplary embodiment of the present disclosure, obtaining the word shift distance between the target retrieval result and the data to be retrieved according to the Euclidean distance between the candidate retrieval result corresponding to each sentence vector and the sentence vector and the weight vector corresponding to each sentence vector includes:
performing a product operation on the Euclidean distance between the candidate retrieval result corresponding to each sentence vector and the sentence vector and the weight vector corresponding to each sentence vector to obtain a plurality of product operation results;
and performing a summation operation on the product operation results to obtain a plurality of sum operation results, and performing a minimization operation on the sum operation results to obtain the word shift distance between each target retrieval result and the data to be retrieved.
The specific details of each module in the data retrieval device have been described in detail in the corresponding data retrieval method, and therefore are not described herein again.
It should be noted that although in the above detailed description several modules or units of the device for action execution are mentioned, such a division is not mandatory. Indeed, the features and functionality of two or more modules or units described above may be embodied in one module or unit, according to embodiments of the invention. Conversely, the features and functions of one module or unit described above may be further divided into embodiments by a plurality of modules or units.
Moreover, although the steps of the methods of the present invention are depicted in the drawings in a particular order, this does not require or imply that the steps must be performed in this particular order, or that all of the depicted steps must be performed, to achieve desirable results. Additionally or alternatively, certain steps may be omitted, multiple steps combined into one step execution, and/or one step broken down into multiple step executions, etc.
Through the above description of the embodiments, those skilled in the art will readily understand that the exemplary embodiments described herein may be implemented by software, and may also be implemented by software in combination with necessary hardware. Therefore, the technical solution according to the embodiment of the present invention can be embodied in the form of a software product, which can be stored in a non-volatile storage medium (which can be a CD-ROM, a USB disk, a removable hard disk, etc.) or on a network, and includes several instructions to make a computing device (which can be a personal computer, a server, a mobile terminal, or a network device, etc.) execute the method according to the embodiment of the present invention.
In an exemplary embodiment of the present invention, there is also provided an electronic device capable of implementing the above method.
As will be appreciated by one skilled in the art, aspects of the present invention may be embodied as a system, method, or program product. Thus, various aspects of the invention may be embodied in the form of: an entirely hardware embodiment, an entirely software embodiment (including firmware, microcode, etc.), or an embodiment combining hardware and software aspects that may all generally be referred to herein as a "circuit," "module," or "system."
An electronic device 600 according to this embodiment of the invention is described below with reference to fig. 6. The electronic device 600 shown in fig. 6 is only an example and should not impose any limitation on the functions and scope of use of the embodiments of the present invention.
As shown in fig. 6, the electronic device 600 is embodied in the form of a general-purpose computing device. The components of the electronic device 600 may include, but are not limited to: at least one processing unit 610, at least one storage unit 620, and a bus 630 that couples the various system components, including the storage unit 620 and the processing unit 610.
Wherein the storage unit stores program code that is executable by the processing unit 610 to cause the processing unit 610 to perform steps according to various exemplary embodiments of the present invention as described in the above section "exemplary methods" of the present specification. For example, the processing unit 610 may perform step S110 as shown in fig. 1: inputting data to be retrieved into a sentence vector encoder to obtain a plurality of sentence vectors, and obtaining a vector matrix to be retrieved according to each sentence vector; step S120: respectively inputting sentence vectors of each row in the vector matrix to be retrieved into a database with vector indexes for retrieval to obtain a plurality of candidate retrieval results; step S130: calculating Euclidean distances between candidate retrieval results corresponding to the sentence vectors and weights of the sentence vectors in the vector matrix to be retrieved; step S140: and obtaining a plurality of target retrieval results according to the candidate retrieval results, the Euclidean distances and the weights, and displaying the target retrieval results according to the word shift distance between the target retrieval results and the data to be retrieved.
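As a rough sketch, steps S110 to S140 can be illustrated as follows. The brute-force index, the uniform weights, and the function names below are stand-ins chosen for illustration; the actual encoder, vector index, and weighting scheme are as described in the method above, not as coded here:

```python
import numpy as np

def retrieve(query_matrix, index, top_k=2):
    # S120/S130 stand-in: for each row (sentence vector) of the matrix
    # to be retrieved, find the top_k nearest index entries by Euclidean
    # distance, using brute force in place of a real vector index.
    dists = np.linalg.norm(index[None, :, :] - query_matrix[:, None, :], axis=2)
    candidates = np.argsort(dists, axis=1)[:, :top_k]
    return candidates, np.take_along_axis(dists, candidates, axis=1)

def rank(candidates, dists, weights):
    # S140 stand-in: score each candidate by the weight-scaled distances
    # it accumulates across sentence vectors, then sort ascending so the
    # closest results come first.
    scores = {}
    for row in range(len(candidates)):
        for cand, dist in zip(candidates[row], dists[row]):
            scores[int(cand)] = scores.get(int(cand), 0.0) + weights[row] * dist
    return sorted(scores, key=scores.get)

# Toy index of three vectors and a two-row matrix to be retrieved.
index = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
query = np.array([[0.1, 0.0], [0.0, 0.9]])
cands, d = retrieve(query, index, top_k=1)
ranked = rank(cands, d, np.array([0.5, 0.5]))
```

In this toy run, the first query row retrieves index entry 0 and the second retrieves entry 2, and both survive into the ranked target results.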
The storage unit 620 may include readable media in the form of volatile memory, such as a random access memory (RAM) 6201 and/or a cache memory 6202, and may further include a read-only memory (ROM) 6203.
The storage unit 620 may also include a program/utility 6204 having a set (at least one) of program modules 6205, such program modules 6205 including, but not limited to: an operating system, one or more application programs, other program modules, and program data; each of these examples, or some combination thereof, may comprise an implementation of a network environment.
The bus 630 may represent one or more of several types of bus structures, including a storage unit bus or storage unit controller, a peripheral bus, an accelerated graphics port, and a processor or local bus using any of a variety of bus architectures.
The electronic device 600 may also communicate with one or more external devices 700 (e.g., keyboard, pointing device, bluetooth device, etc.), with one or more devices that enable a user to interact with the electronic device 600, and/or with any devices (e.g., router, modem, etc.) that enable the electronic device 600 to communicate with one or more other computing devices. Such communication may occur via input/output (I/O) interface 650. Also, the electronic device 600 may communicate with one or more networks (e.g., a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network such as the Internet) via the network adapter 660. As shown, the network adapter 660 communicates with the other modules of the electronic device 600 over the bus 630. It should be appreciated that although not shown in the figures, other hardware and/or software modules may be used in conjunction with the electronic device 600, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage systems, among others.
Through the above description of the embodiments, those skilled in the art will readily understand that the exemplary embodiments described herein may be implemented by software, or by software in combination with necessary hardware. Therefore, the technical solution according to the embodiment of the present invention can be embodied in the form of a software product, which can be stored in a non-volatile storage medium (which can be a CD-ROM, a USB disk, a removable hard disk, etc.) or on a network, and includes several instructions to make a computing device (which can be a personal computer, a server, a terminal device, or a network device, etc.) execute the method according to the embodiment of the present invention.
In an exemplary embodiment of the present invention, there is also provided a computer-readable storage medium having stored thereon a program product capable of implementing the above-described method of the present specification. In some possible embodiments, the various aspects of the invention may also be implemented in the form of a program product comprising program code means for causing a terminal device to carry out the steps according to various exemplary embodiments of the invention described in the above section "exemplary method" of this description, when said program product is run on said terminal device.
The program product for implementing the above method according to the embodiment of the present invention may employ a portable compact disc read-only memory (CD-ROM) and include program codes, and may be run on a terminal device, such as a personal computer. However, the program product of the present invention is not limited in this regard; in this document, a readable storage medium may be any tangible medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. The readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
A computer readable signal medium may include a propagated data signal with readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electromagnetic, optical, or any suitable combination thereof. A readable signal medium may also be any readable medium that is not a readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object-oriented programming language such as Java, C++, or the like, and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server. In the case of a remote computing device, the remote computing device may be connected to the user computing device through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computing device (e.g., through the internet using an internet service provider).
Furthermore, the above-described figures are merely schematic illustrations of processes involved in methods according to exemplary embodiments of the invention, and are not intended to be limiting. It will be readily understood that the processes shown in the above figures are not intended to indicate or limit the chronological order of the processes. In addition, it is also readily understood that these processes may be performed synchronously or asynchronously, e.g., in multiple modules.
Other embodiments of the invention will be apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein. This application is intended to cover any variations, uses, or adaptations of the invention following, in general, the principles of the invention and including such departures from the present disclosure as come within known or customary practice within the art to which the invention pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the invention being indicated by the following claims.