US12373504B2 - Method for recommending a search term, method for training a target model and electronic device - Google Patents
Method for recommending a search term, method for training a target model and electronic device
- Publication number
- US12373504B2 (application US 17/398,134)
- Authority
- US
- United States
- Prior art keywords
- semantic
- representation
- candidate search
- search term
- search terms
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active, expires
Classifications
- G06F16/9024—Graphs; Linked lists
- G06F16/90324—Query formulation using system suggestions
- G06F16/951—Indexing; Web crawling techniques
- G06F16/9532—Query formulation
- G06F16/9535—Search customisation based on user profiles and personalisation
- Y02D10/00—Energy efficient computing, e.g. low power processors, power management or thermal management
Definitions
- the present disclosure relates to the technical field of computers, and in particular to the field of information processing.
- the present disclosure provides a method for recommending a search term, a method for training a target model, an apparatus, an electronic device, a storage medium and a product.
- an apparatus for recommending a search term which includes:
- FIG. 1 is a flowchart for a method for recommending a search term according to an embodiment of the present disclosure
- determining the neighbor node of the current node and the relationship between the current node and the neighbor node, by taking the target search term as the current node may be realized in at least one of the following manners:
- a good training sample can be obtained by constructing a text graph according to a plurality of nodes and the relationships between the respective nodes in the plurality of nodes, so that the trained target model can output a more accurate semantic aggregation representation.
- an intention of one semantic cluster can be obtained by mining nodes and the relationship between the nodes; the text graph constructed according to the nodes and the relationship between the nodes is taken as a training sample, so that the trained target model can capture semantic information of a node itself and surrounding nodes thereof, and acquire semantic aggregation representation of each node.
- the representation of the nodes is a representation of a cluster aggregated by the nodes themselves and the neighbor nodes, and a more accurate semantic aggregation representation can be output.
- the method of training a target model may include:
- a training sample needs to be constructed on the basis of a constructed text graph, and the preset model is trained using a technique combining graph learning and semantic representation.
- positive and negative samples are respectively as follows:
- FIG. 5 shows a schematic diagram of acquiring a semantic aggregation representation.
- learning the composition relationships of a text graph is equivalent to acquiring the intention of a semantic cluster.
- Each node captures semantic information of itself and surrounding nodes thereof simultaneously, and acquires a semantic representation vector of each node.
- the semantic representation of the node is a representation which aggregates the semantic representations of the node itself and the neighbor nodes, i.e., a semantic aggregation representation.
- the semantic representation vector of each node can be acquired by executing a prediction process by the trained target model.
- the semantic representation of the node is a semantic representation of the cluster formed by the node itself and the neighbor nodes. Therefore, a user search intention can be captured more accurately by performing the index recall with the aggregated semantic representation, thereby recalling expanded search terms related to the user search intention.
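As a minimal sketch of this idea in Python: the aggregated representation of a node can be formed from its own vector together with its neighbors' vectors. The element-wise mean used below is an illustrative assumption; the disclosure does not fix a specific aggregation function.

```python
# Toy sketch (not the patented model): aggregate a node's own semantic
# vector with its neighbors' vectors by element-wise mean, so the result
# represents the cluster formed by the node and its neighborhood.
def aggregate(node_vec, neighbor_vecs):
    """Element-wise mean over the node vector and all neighbor vectors."""
    vecs = [node_vec] + list(neighbor_vecs)
    dim = len(node_vec)
    return [sum(v[i] for v in vecs) / len(vecs) for i in range(dim)]

node = [1.0, 0.0]
neighbors = [[0.0, 1.0], [2.0, 1.0]]
print(aggregate(node, neighbors))  # → [1.0, 0.6666666666666666]
```

Because the output mixes the node with its neighborhood, two terms with different surface text but overlapping neighborhoods end up with similar aggregated vectors.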
- the search frequency refers to the number of searches for one or more search terms within a certain period of time. For example, in a case where the number of searches for “AAA” is 100 within one day, the search frequency of the search term “AAA” is 100.
- the result page display quantity refers to the result page display quantity of a search term on a preset website, and the preset website includes websites for which statistics are available, such as official websites, reputable non-official websites and the like.
- the result page display quantity may refer to the number of displays of an article on an official website or by a high-quality author.
- the clicks refer to the number of times a certain search term or certain search terms are clicked within a certain period of time.
- the ANN index library is established based on the semantic aggregation representations of all the candidate search terms, to support recalling the candidate search terms to be recommended based on the approximate nearest neighbor service, so that the recall speed is increased and the search experience of a user is improved.
- the recalling the candidate search term to be recommended from the set of candidate search terms, by taking the semantic aggregation representation of the target search term as the index in S15 includes:
- FIG. 9 shows a schematic diagram of an architecture of search term recommendation.
- the architecture mainly includes two parts, semantic aggregation representation learning and a recall system.
- the part of semantic aggregation representation learning is mainly responsible for training a preset model according to a training sample, to obtain the semantic aggregation representation of a search term by the trained target model.
- the part of recall system is mainly responsible for recalling a candidate search term with a high similarity with the semantic aggregation representation of the search term from the candidate term set based on the Approximate Nearest Neighbor (ANN), and finally displaying the candidate search term to a user terminal.
- FIG. 9 is an alternative implementation, and that various obvious changes and/or substitutions may be made by those skilled in the art based on the instance of FIG. 9 , while the obtained technical solution still remains within the scope of the embodiments of the present disclosure.
- FIG. 10 shows a schematic diagram of an apparatus for recommending a search term. As shown in FIG. 10 , the apparatus includes:
- the apparatus further includes:
- the recalling module 1040 is configured for:
- the apparatus for recommending a search term determines a neighbor node of a current node by taking the target search term as the current node, determines a semantic aggregation representation of the target search term based on the first text information of the current node and the second text information of the neighbor node, and recalls a candidate search term to be recommended from a set of candidate search terms, by taking the semantic aggregation representation as an index.
- the apparatus further includes:
- a first information determination module 1310 , a second information determination module 1320 , a third information determination module 1330 and a training module 1340 shown in FIG. 13 are the same as or similar to the first information determination module 1210 , the second information determination module 1220 , the third information determination module 1230 and the training module 1240 shown in FIG. 12 , respectively.
- the sample collection module 1350 is specifically configured for:
- the present disclosure also provides an electronic device, a readable storage medium and a computer program product.
- FIG. 14 is a block diagram of an electronic device for implementing the method for recommending a search term according to an embodiment of the present disclosure.
- the electronic device is intended to represent various forms of digital computers, such as laptop computers, desktop computers, workstations, personal digital assistants, servers, blade servers, mainframe computers, and other suitable computers.
- the electronic device may also represent various forms of mobile devices, such as a personal digital assistant, a cellular telephone, a smart phone, a wearable device, and other similar computing devices.
- the components shown herein, their connections and relationships, and their functions are by way of example only and are not intended to limit the implementations of the application described and/or claimed herein.
- the electronic device may include one or more processors 1401 , a memory 1402 , and interfaces for connecting components, including high-speed interfaces and low-speed interfaces.
- the respective components are interconnected by different buses and may be mounted on a common main-board or otherwise as desired.
- the processor may process instructions executed within the electronic device, including instructions stored in or on the memory to display graphical information of a graphical user interface (GUI) on an external input/output device, such as a display device coupled to the interface.
- a plurality of processors and/or buses may be used with a plurality of memories, if necessary.
- a plurality of electronic devices may be connected, each providing some of the necessary operations (e.g., as an array of servers, a set of blade servers, or a multiprocessor system).
- An example of a processor 1401 is shown in FIG. 14 .
- the memory 1402 is a non-transitory computer-readable storage medium provided herein.
- the memory stores instructions executable by at least one processor to cause the at least one processor to execute the method for recommending a search term provided herein.
- the non-transitory computer-readable storage medium of the present disclosure stores computer instructions for enabling a computer to execute the method for recommending a search term provided herein.
- the input device 1403 may receive input of digital or character information, and generate a key signal input related to a user setting and a functional control of the electronic device for implementing the method for recommending a search term.
- the input device may be a touch screen, a keypad, a mouse, a track pad, a touch pad, a pointer stick, one or more mouse buttons, a track ball, a joystick, and other input devices.
- the output device 1404 may include a display device, an auxiliary lighting device (e.g., a light emitting diode (LED)), a tactile feedback device (e.g., a vibrating motor), etc.
- the display device may include, but is not limited to, a liquid crystal display (LCD), an LED display, and a plasma display. In some embodiments, the display device may be a touch screen.
- the functions and implementations of the processor and memory of the electronic device may refer to the above description of the processor and memory in the embodiment of the electronic device.
- Various implementations of the systems and techniques described herein may be implemented in a digital electronic circuit system, an integrated circuit system, an application specific integrated circuit (ASIC), computer hardware, firmware, software, and/or a combination thereof.
- These various implementations may include an implementation in one or more computer programs, which can be executed and/or interpreted on a programmable system including at least one programmable processor; the programmable processor may be a dedicated or general-purpose programmable processor and capable of receiving and transmitting data and instructions from and to a storage system, at least one input device, and at least one output device.
- the systems and techniques described herein may be implemented in a computing system (e.g., as a data server) that may include a background component, or a computing system (e.g., an application server) that may include a middleware component, or a computing system (e.g., a user computer having a graphical user interface or a web browser through which a user may interact with embodiments of the systems and techniques described herein) that may include a front-end component, or a computing system that may include any combination of such background components, middleware components, or front-end components.
- the components of the system may be connected to each other through a digital data communication in any form or medium (e.g., a communication network). Examples of the communication network may include a local area network (LAN), a wide area network (WAN), and the Internet.
Landscapes
- Engineering & Computer Science (AREA)
- Databases & Information Systems (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- Data Mining & Analysis (AREA)
- General Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Mathematical Physics (AREA)
- Computational Linguistics (AREA)
- Software Systems (AREA)
- Information Retrieval, Db Structures And Fs Structures Therefor (AREA)
Abstract
Description
-
- acquiring an input target search term;
- determining a neighbor node of a current node and a relationship between the current node and the neighbor node, by taking the target search term as the current node;
- performing semantic representation processing on first text information of the current node and second text information of the neighbor node respectively, to obtain a first semantic representation of the current node and a second semantic representation of the neighbor node;
- determining a semantic aggregation representation of the target search term, based on the first semantic representation, the second semantic representation and the relationship between the current node and the neighbor node; and
- recalling a candidate search term to be recommended from a set of candidate search terms, by taking the semantic aggregation representation of the target search term as an index.
-
- inputting a sample search term in a training sample into a preset first network model of a preset model, to obtain a neighbor node taking the sample search term as a current node, and a relationship between the current node and the neighbor node, which are output by the preset first network model of the preset model;
- inputting first text information of the current node and second text information of the neighbor node into a preset second network model of the preset model, to obtain a first semantic representation of the current node and a second semantic representation of the neighbor node which are output by the preset second network model;
- inputting the first semantic representation of the current node and the second semantic representation of the neighbor node into a preset third network model of the preset model, to obtain a semantic aggregation representation of the sample search term output by the preset third network model;
- determining a loss function, based on the neighbor node taking the sample search term as the current node, the relationship between the current node and the neighbor node, the semantic aggregation representation of the sample search term, semantic aggregation representation labels of respective nodes in the training sample and labels of relationships between the respective nodes; and
- updating the preset model by performing a reverse conduction according to the loss function, to obtain the target model.
-
- an acquisition module configured for acquiring an input target search term;
- a first determination module configured for determining a neighbor node of a current node and a relationship between the current node and the neighbor node, by taking the target search term as the current node;
- a second determination module configured for performing semantic representation processing on first text information of the current node and second text information of the neighbor node respectively, to obtain a first semantic representation of the current node and a second semantic representation of the neighbor node;
- an aggregation representation module configured for determining a semantic aggregation representation of the target search term, based on the first semantic representation, the second semantic representation and the relationship between the current node and the neighbor node; and
- a recalling module configured for recalling a candidate search term to be recommended from a set of candidate search terms, by taking the semantic aggregation representation as an index.
-
- a first information determination module configured for inputting a sample search term in a training sample into a preset first network model of a preset model, to obtain a neighbor node taking the sample search term as a current node, and a relationship between the current node and the neighbor node, which are output by the preset first network model of the preset model;
- a second information determination module configured for inputting first text information of the current node and second text information of the neighbor node into a preset second network model of the preset model, to obtain a first semantic representation of the current node and a second semantic representation of the neighbor node which are output by the preset second network model;
- a third information determination module configured for inputting the first semantic representation of the current node and the second semantic representation of the neighbor node into a preset third network model of the preset model, to obtain a semantic aggregation representation of the sample search term output by the preset third network model; and
- a training module configured for determining a loss function, based on the neighbor node taking the sample search term as the current node, the relationship between the current node and the neighbor node, the semantic aggregation representation of the sample search term, semantic aggregation representation labels of respective nodes in the training sample and labels of relationships between the respective nodes; and updating the preset model by performing a reverse conduction according to the loss function, to obtain the target model.
-
- at least one processor; and
- a memory communicatively connected with the at least one processor, wherein
- the memory stores instructions executable by the at least one processor, the instructions being executed by the at least one processor to enable the at least one processor to perform the method according to any one of the above mentioned aspects.
-
- S11, acquiring an input target search term;
- S12, determining a neighbor node of a current node and a relationship between the current node and the neighbor node, by taking the target search term as the current node;
- S13, performing semantic representation processing on first text information of the current node and second text information of the neighbor node respectively, to obtain a first semantic representation of the current node and a second semantic representation of the neighbor node;
- S14, determining a semantic aggregation representation of the target search term, based on the first semantic representation, the second semantic representation and the relationship between the current node and the neighbor node; and
- S15, recalling a candidate search term to be recommended from a set of candidate search terms, by taking the semantic aggregation representation of the target search term as an index.
-
- acquiring a historical search record of a user, and determining a historical search term in the historical search record as a neighbor node;
- acquiring display webpage information during searching, and determining a webpage title in the display webpage information as a neighbor node; or
- acquiring co-occurrence information of historical search terms in a historical search log, establishing a dictionary in a key value pair format with the historical search terms as keys and the co-occurrence information as values, and acquiring the neighbor node in a word searching mode.
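The third manner above — a key-value dictionary with historical search terms as keys and co-occurrence information as values — can be sketched as follows. Grouping terms by search session is an assumption made for illustration; the disclosure only requires that co-occurrence be derived from the historical search log.

```python
from collections import defaultdict

# Sketch of the dictionary-based manner: map each historical search term
# to the set of terms that co-occur with it in the same session, then
# acquire neighbor nodes by a simple key lookup.
def build_cooccurrence_dict(sessions):
    cooc = defaultdict(set)
    for session in sessions:
        for term in session:
            cooc[term].update(t for t in session if t != term)
    return cooc

sessions = [["flu symptoms", "flu treatment"],
            ["flu treatment", "cold remedies"]]
cooc = build_cooccurrence_dict(sessions)
print(sorted(cooc["flu treatment"]))  # → ['cold remedies', 'flu symptoms']
```

Looking up a target search term in this dictionary directly yields its neighbor nodes without re-scanning the log.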
-
- S21, mining a plurality of nodes and the relationships between the respective nodes in the plurality of nodes from a search log, based on user interaction behaviors, wherein the plurality of nodes includes at least one type of node among sample search terms and sample webpage titles;
- S22, constructing a text graph, according to the plurality of nodes and the relationships between the respective nodes in the plurality of nodes; and
- S23, generating a training sample based on the text graph.
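Steps S21–S23 can be sketched with a toy search log and an adjacency-list text graph. The log shape (search term, clicked title) and the "click" relationship label are illustrative assumptions, not the patent's data format.

```python
# Sketch of S21–S23: mine nodes (search terms and webpage titles) and
# their relationships from a toy search log, then store them as an
# adjacency list that can serve as the text graph for training samples.
def build_text_graph(log):
    """log: iterable of (search_term, clicked_title) pairs."""
    graph = {}  # node -> list of (neighbor, relation)
    def add_edge(a, b, rel):
        graph.setdefault(a, []).append((b, rel))
        graph.setdefault(b, []).append((a, rel))
    for term, title in log:
        add_edge(term, title, "click")
    return graph

log = [("flu symptoms", "Common flu symptoms explained"),
       ("flu treatment", "Common flu symptoms explained")]
g = build_text_graph(log)
print(g["Common flu symptoms explained"])
# → [('flu symptoms', 'click'), ('flu treatment', 'click')]
```

Note how the shared clicked title links the two search terms into one semantic cluster even though their texts differ.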
-
- determining a first type of relationship between the sample search terms;
- determining a second type of relationship between the sample search term and the sample webpage title; and
- determining a third type of relationship between the sample webpage titles.
-
- S41, inputting a sample search term in a training sample into a preset first network model of a preset model, to obtain a neighbor node taking the sample search term as a current node, and a relationship between the current node and the neighbor node, which are output by the preset first network model of the preset model;
- S42, inputting first text information of the current node and second text information of the neighbor node into a preset second network model of the preset model, to obtain a first semantic representation of the current node and a second semantic representation of the neighbor node which are output by the preset second network model;
- S43, inputting the first semantic representation of the current node and the second semantic representation of the neighbor node into a preset third network model of the preset model, to obtain a semantic aggregation representation of the sample search term output by the preset third network model;
- S44, determining a loss function, based on the neighbor node taking the sample search term as the current node, the relationship between the current node and the neighbor node, the semantic aggregation representation of the sample search term, semantic aggregation representation labels of respective nodes in the training sample and labels of relationships between the respective nodes; and
- S45, updating the preset model by performing a reverse conduction according to the loss function, to obtain the target model.
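The training loop above can be illustrated with a heavily simplified stand-in: node embeddings are updated by gradient descent ("reverse conduction") so that the aggregated representation of a sample term and its neighbor matches a label representation. The mean aggregator, squared-error loss and learning rate are all assumptions for this toy; the patent's preset model is a multi-network architecture, not shown here.

```python
# Toy stand-in for S41–S45 (not the patented network): embeddings are
# trained so that the mean-aggregated representation of a node and its
# neighbors matches a target (label) representation.
def train(emb, graph, labels, lr=0.1, steps=200):
    for _ in range(steps):
        for node, label in labels.items():
            members = [node] + graph.get(node, [])
            k = len(members)
            dim = len(label)
            rep = [sum(emb[m][d] for m in members) / k for d in range(dim)]
            # "Reverse conduction": move each member embedding along the
            # negative gradient of the squared error ||rep - label||^2.
            for m in members:
                for d in range(dim):
                    emb[m][d] -= lr * 2.0 * (rep[d] - label[d]) / k
    return emb

emb = {"q": [0.0, 0.0], "t": [0.0, 0.0]}
graph = {"q": ["t"]}          # "t" is the neighbor node of sample term "q"
labels = {"q": [1.0, -1.0]}   # semantic aggregation representation label
train(emb, graph, labels)
rep = [(emb["q"][d] + emb["t"][d]) / 2 for d in range(2)]
print(rep)  # ≈ [1.0, -1.0]
```

After training, the aggregated representation converges to the label, which is the behavior the loss in S44 is designed to enforce.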
-
- S61, screening the candidate search terms, according to statistical characteristics of search frequency of the target search term, result page display quantity and clicks; and
- S62, establishing the set of candidate search terms according to the screened candidate search terms.
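Steps S61–S62 amount to a threshold filter over per-term statistics. The threshold values below are illustrative assumptions; the disclosure does not specify concrete cut-offs.

```python
# Sketch of S61–S62: keep only candidate terms whose search frequency,
# result-page display quantity and clicks all clear minimum thresholds,
# then the surviving terms form the set of candidate search terms.
def screen_candidates(stats, min_freq=10, min_displays=5, min_clicks=1):
    """stats: dict of term -> (search_frequency, displays, clicks)."""
    return {term for term, (freq, displays, clicks) in stats.items()
            if freq >= min_freq and displays >= min_displays
            and clicks >= min_clicks}

stats = {"flu treatment": (100, 40, 12),
         "flq treatmnt": (2, 0, 0)}  # a rare misspelling is screened out
print(screen_candidates(stats))  # → {'flu treatment'}
```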
-
- S71, predicting semantic aggregation representations of all the candidate search terms by a target model, for all the screened candidate search terms; and
- S72, establishing the ANN index library, based on the semantic aggregation representations of all the candidate search terms.
-
- S81, determining similarities between the semantic aggregation representations of all the candidate search terms and a semantic aggregation representation in the ANN index library, by taking the semantic aggregation representation of the target search term as the index;
- S82, determining top N semantic aggregation representations in similarity ranking, as target semantic aggregation representations, wherein N is a positive integer; and
- S83, recalling candidate search terms corresponding to the target semantic aggregation representations as the candidate search terms to be recommended.
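Steps S81–S83 can be sketched with brute-force cosine similarity standing in for the approximate-nearest-neighbor service. A production system would use an ANN library over the index established in S72; exact search is used here only to keep the example self-contained.

```python
import math

# Sketch of S81–S83: rank candidate terms by similarity between their
# semantic aggregation representations and the target's representation,
# then recall the top-N candidates.
def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def recall_top_n(query_rep, index, n=2):
    """index: dict of candidate term -> semantic aggregation representation."""
    ranked = sorted(index, key=lambda t: cosine(query_rep, index[t]),
                    reverse=True)
    return ranked[:n]

index = {"flu treatment": [1.0, 0.1],
         "cold remedies": [0.9, 0.3],
         "car insurance": [-0.2, 1.0]}
print(recall_top_n([1.0, 0.0], index, n=2))
# → ['flu treatment', 'cold remedies']
```

Swapping the exhaustive `sorted` for an ANN query is what turns this into the fast recall path the disclosure describes.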
-
- an acquisition module 1010 configured for acquiring an input target search term;
- a first determination module 1020 configured for determining a neighbor node of a current node and a relationship between the current node and the neighbor node, by taking the target search term as the current node;
- a second determination module 1030 configured for performing semantic representation processing on first text information of the current node and second text information of the neighbor node respectively, to obtain a first semantic representation of the current node and a second semantic representation of the neighbor node;
- an aggregation representation module 1040 configured for determining a semantic aggregation representation of the target search term, based on the first semantic representation, the second semantic representation and the relationship between the current node and the neighbor node; and
- a recalling module 1050 configured for recalling a candidate search term to be recommended from a set of candidate search terms, by taking the semantic aggregation representation as an index.
-
- a set establishing module 1160 configured for:
- screening the candidate search terms, according to statistical characteristics of search frequency of the target search term, result page display quantity and clicks; and
- establishing the set of candidate search terms according to the screened candidate search terms.
-
- predicting semantic aggregation representations of all the candidate search terms by a target model, for all the screened candidate search terms; and
- establishing the ANN index library, based on the semantic aggregation representations of all the candidate search terms.
-
- determining similarities between the semantic aggregation representations and a semantic aggregation representation in the ANN index library, by taking the semantic aggregation representation as the index;
- determining top N semantic aggregation representations in similarity ranking, as target semantic aggregation representations, wherein N is a positive integer; and
- recalling candidate search terms corresponding to the target semantic aggregation representations as the candidate search terms to be recommended.
-
- a first information determination module 1210 configured for inputting a sample search term in a training sample into a preset first network model of a preset model, to obtain a neighbor node taking the sample search term as a current node, and a relationship between the current node and the neighbor node, which are output by the preset first network model of the preset model;
- a second information determination module 1220 configured for inputting first text information of the current node and second text information of the neighbor node into a preset second network model of the preset model, to obtain a first semantic representation of the current node and a second semantic representation of the neighbor node which are output by the preset second network model;
- a third information determination module 1230 configured for inputting the first semantic representation of the current node and the second semantic representation of the neighbor node into a preset third network model of the preset model, to obtain a semantic aggregation representation of the sample search term output by the preset third network model; and
- a training module 1240 configured for determining a loss function, based on the neighbor node taking the sample search term as the current node, the relationship between the current node and the neighbor node, the semantic aggregation representation of the sample search term, semantic aggregation representation labels of respective nodes in the training sample and labels of relationships between the respective nodes; and updating the preset model by performing a reverse conduction according to the loss function, to obtain the target model.
-
- a sample collection module 1350 configured for:
- mining a plurality of nodes and the relationships between the respective nodes in the plurality of nodes from a search log, based on user interaction behaviors, wherein the plurality of nodes includes at least one type of node among sample search terms and sample webpage titles;
- constructing a text graph, according to the plurality of nodes and the relationships between the respective nodes in the plurality of nodes; and
- generating a training sample based on the text graph.
-
- determining a first type of relationship between the sample search terms;
- determining a second type of relationship between the sample search term and the sample webpage title; and
- determining a third type of relationship between the sample webpage titles.
-
- determining the relationships between the respective nodes in the plurality of nodes, which includes at least one of:
- determining at least one of a co-occurrence relationship in which two sample search terms simultaneously appear in one search time domain, a co-display relationship in which two sample search terms display a same sample webpage title together, and a concurrent relationship in which two sample search terms click a same sample webpage title together, as the first type of relationship between the sample search terms;
- determining at least one of a display relationship in which a sample webpage title is recalled and displayed in a scenario of searching a search term, a click relationship in which a webpage title is clicked in a scenario of searching a sample search term, and a text matching relationship in which a sample search term and a webpage title have a text containing relationship, as the second type of relationship between the sample search term and the sample webpage title; and
- determining at least one of a co-display relationship in which two sample webpage titles are displayed simultaneously in a same search and a concurrent relationship in which two sample webpage titles are clicked simultaneously in a same search, as the third type of relationship between the sample webpage titles.
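Two of the relationship types listed above can be mined from a toy log as follows: a first-type co-occurrence edge between two search terms issued in the same session, and a second-type click edge between a search term and its clicked webpage title. The log shape is an illustrative assumption; the other relationship types (co-display, text matching, and the third type between titles) would be mined analogously from the corresponding log fields.

```python
# Sketch of relationship mining: extract first-type (term–term
# co-occurrence) and second-type (term–title click) edges from sessions.
def mine_relationships(sessions):
    """sessions: list of lists of (search_term, clicked_title_or_None)."""
    first, second = set(), set()
    for session in sessions:
        terms = [t for t, _ in session]
        for i in range(len(terms)):
            for j in range(i + 1, len(terms)):
                first.add((terms[i], terms[j], "co-occurrence"))
        for term, title in session:
            if title is not None:
                second.add((term, title, "click"))
    return first, second

sessions = [[("flu symptoms", "Flu guide"), ("flu treatment", None)]]
first, second = mine_relationships(sessions)
print(first)   # → {('flu symptoms', 'flu treatment', 'co-occurrence')}
print(second)  # → {('flu symptoms', 'Flu guide', 'click')}
```

Each typed edge then becomes one relationship in the text graph used to generate training samples.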
-
- one or more processors; and
- a storage storing one or more programs which, when executed by the one or more processors, cause the one or more processors to implement the method in the above method embodiment.
Claims (12)
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202011563137.XA CN112650907B (en) | 2020-12-25 | 2020-12-25 | Search word recommendation method, target model training method, device and equipment |
| CN202011563137.X | 2020-12-25 |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| US20210365515A1 (en) | 2021-11-25 |
| US12373504B2 (en) | 2025-07-29 |
Family
ID=75363001
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US17/398,134 (US12373504B2, Active, adjusted expiration 2043-04-23) | Method for recommending a search term, method for training a target model and electronic device | 2020-12-25 | 2021-08-10 |
Country Status (5)
| Country | Link |
|---|---|
| US (1) | US12373504B2 (en) |
| EP (1) | EP3876114A3 (en) |
| JP (1) | JP7369740B2 (en) |
| KR (1) | KR102781095B1 (en) |
| CN (1) | CN112650907B (en) |
Families Citing this family (29)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US12386621B2 (en) * | 2020-12-14 | 2025-08-12 | Cognitive Science & Solutions, Inc. | AI synaptic coprocessor |
| CN113761359B (en) * | 2021-05-13 | 2024-02-02 | 腾讯科技(深圳)有限公司 | Recommended methods, devices, electronic equipment and storage media for data packets |
| CN113609378B (en) * | 2021-07-02 | 2022-11-22 | 清华大学 | Information recommendation method and device, electronic equipment and storage medium |
| CN113961665B (en) * | 2021-09-15 | 2024-12-03 | 北京三快在线科技有限公司 | Network object processing method, device, electronic device and readable storage medium |
| CN115827958B (en) * | 2021-09-16 | 2025-08-08 | 腾讯科技(深圳)有限公司 | Information recommendation method, device and storage medium |
| CN113902005B (en) * | 2021-09-30 | 2025-03-25 | 北京百度网讯科技有限公司 | Language model pre-training method, device, equipment and storage medium |
| CN113934933B (en) * | 2021-10-15 | 2024-04-05 | 北京智融云河科技有限公司 | Message forwarding method, device and storage medium with low delay |
| CN113961765B (en) * | 2021-10-21 | 2023-12-19 | 北京百度网讯科技有限公司 | Search methods, devices, equipment and media based on neural network models |
| CN113704507B (en) * | 2021-10-26 | 2022-02-11 | 腾讯科技(深圳)有限公司 | Data processing method, computer device and readable storage medium |
| CN113987271A (en) * | 2021-10-27 | 2022-01-28 | 北京百度网讯科技有限公司 | Video query method and device, electronic equipment and storage medium |
| CN114036373B (en) | 2021-11-05 | 2023-09-29 | 北京百度网讯科技有限公司 | Searching method and device, electronic equipment and storage medium |
| CN114036322B (en) * | 2021-11-05 | 2025-05-16 | 北京百度网讯科技有限公司 | Training method, electronic device and storage medium for search system |
| CN113987358B (en) * | 2021-11-15 | 2025-05-27 | 中国科学技术大学 | A recommendation model training method, recommendation method and recommendation system |
| CN114461822A (en) * | 2021-12-20 | 2022-05-10 | 北京达佳互联信息技术有限公司 | Resource processing method, device, equipment and storage medium |
| CN114428902B (en) * | 2021-12-31 | 2023-11-14 | 北京百度网讯科技有限公司 | Information search method, device, electronic equipment and storage medium |
| CN116578767B (en) * | 2022-01-29 | 2024-07-30 | 腾讯科技(深圳)有限公司 | Semantic data processing and content recommending method and device and computer equipment |
| CN114996567A (en) * | 2022-05-06 | 2022-09-02 | 北京化工大学 | An API recommendation method based on context and graph learning |
| CN117078977A (en) * | 2022-05-06 | 2023-11-17 | 墨奇科技(北京)有限公司 | Task processing methods, neural network training methods, devices, equipment and media |
| WO2023213233A1 (en) * | 2022-05-06 | 2023-11-09 | 墨奇科技(北京)有限公司 | Task processing method, neural network training method, apparatus, device, and medium |
| CN114780867B (en) * | 2022-05-10 | 2023-11-03 | 杭州网易云音乐科技有限公司 | Recommendation method, medium, device and computing equipment |
| CN114926223A (en) * | 2022-06-07 | 2022-08-19 | 北京百度网讯科技有限公司 | Landing page feature generation method, landing page search method and related device |
| CN115129922B (en) * | 2022-07-08 | 2025-08-08 | 杭州网易云音乐科技有限公司 | Search term generation method, model training method, medium, device and equipment |
| CN115204291B (en) * | 2022-07-14 | 2025-11-07 | 广东三维家信息科技有限公司 | Household collocation scheme generation method and device, electronic equipment and storage medium |
| CN115248847B (en) * | 2022-09-22 | 2022-12-16 | 竹间智慧科技(北京)有限公司 | Search data set construction method and device, electronic equipment and storage medium |
| CN115510103B (en) * | 2022-09-23 | 2026-01-02 | 北京百度网讯科技有限公司 | Processing method, device, equipment, medium and product of search flow |
| CN115577154B (en) * | 2022-10-09 | 2025-04-11 | 北京字跳网络技术有限公司 | A method, device, computer equipment and storage medium for determining recommended words |
| CN116010681A (en) * | 2022-12-30 | 2023-04-25 | 拉扎斯网络科技(上海)有限公司 | Recall model training and retrieval method, device and electronic equipment |
| CN116821278A (en) * | 2023-05-24 | 2023-09-29 | 北京字跳网络技术有限公司 | Display method and device of search information, medium and electronic equipment |
| CN117093706B (en) * | 2023-10-19 | 2024-01-09 | 杭州烛微智能科技有限责任公司 | A test paper generation method, system, media and electronic equipment |
Citations (12)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20040225498A1 (en) * | 2003-03-26 | 2004-11-11 | Ryan Rifkin | Speaker recognition using local models |
| JP2005302043A (en) | 2004-04-15 | 2005-10-27 | Microsoft Corp | Reinforced clustering of multi-type data object for search term suggestion |
| JP2011103020A (en) | 2009-11-10 | 2011-05-26 | Nippon Telegr & Teleph Corp <Ntt> | Device, method, and program for recommending retrieval condition |
| JP2012133520A (en) | 2010-12-21 | 2012-07-12 | Nippon Telegr & Teleph Corp <Ntt> | Stochastic information retrieval processing apparatus, stochastic information retrieval processing method and stochastic information retrieval processing program |
| US20150161201A1 (en) | 2009-11-02 | 2015-06-11 | Google Inc. | Clustering query refinements by inferred user intent |
| US20180181648A1 (en) | 2016-12-27 | 2018-06-28 | Beijing Baidu Netcom Science And Technology Co., Ltd. | Method and device for clarifying questions on deep question and answer |
| US20190057159A1 (en) | 2017-08-15 | 2019-02-21 | Beijing Baidu Netcom Science And Technology Co., Ltd. | Method, apparatus, server, and storage medium for recalling for search |
| US20190384831A1 (en) * | 2018-06-14 | 2019-12-19 | Microsoft Technology Licensing, Llc | Providing query recommendations |
| CN110619076A (en) | 2018-12-25 | 2019-12-27 | 北京时光荏苒科技有限公司 | Search term recommendation method and device, computer and storage medium |
| CN110795612A (en) | 2019-10-28 | 2020-02-14 | 北京字节跳动网络技术有限公司 | Search word recommendation method and device, electronic equipment and computer-readable storage medium |
| US20200286146A1 (en) * | 2019-03-07 | 2020-09-10 | Beijing Jingdong Shangke Information Technology Co., Ltd. | System and method for intelligent guided shopping |
| US20200302018A1 (en) | 2019-03-22 | 2020-09-24 | Servicenow, Inc. | Determining semantic similarity of texts based on sub-sections thereof |
Family Cites Families (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN102646103B (en) * | 2011-02-18 | 2016-03-16 | 腾讯科技(深圳)有限公司 | The clustering method of term and device |
| CN106708886B (en) * | 2015-11-17 | 2020-08-11 | 北京国双科技有限公司 | Display method and device for in-site search words |
| CN106294618A (en) * | 2016-08-01 | 2017-01-04 | 北京百度网讯科技有限公司 | Searching method and device |
- 2020
  - 2020-12-25 CN CN202011563137.XA patent/CN112650907B/en active Active
- 2021
  - 2021-07-14 JP JP2021116392 patent/JP7369740B2/en active Active
  - 2021-07-22 EP EP21187193.4A patent/EP3876114A3/en not_active Ceased
  - 2021-08-10 US US17/398,134 patent/US12373504B2/en active Active
  - 2021-11-25 KR KR1020210164526 patent/KR102781095B1/en active Active
Non-Patent Citations (3)
| Title |
|---|
| Extended European Search Report EP 21187193.4 (Jan. 14, 2022) (12 pages). |
| Notice of Preliminary Rejection, issued in corresponding Korean patent application No. 10-2021-0164526, dated May 29, 2024, 19 pages. |
| Notice of Reasons for Refusal JP 2021-116392 (Sep. 1, 2022) (8 pages). |
Also Published As
| Publication number | Publication date |
|---|---|
| KR20210151728A (en) | 2021-12-14 |
| JP7369740B2 (en) | 2023-10-26 |
| EP3876114A2 (en) | 2021-09-08 |
| US20210365515A1 (en) | 2021-11-25 |
| JP2021166098A (en) | 2021-10-14 |
| EP3876114A3 (en) | 2022-02-16 |
| CN112650907A (en) | 2021-04-13 |
| CN112650907B (en) | 2023-07-14 |
| KR102781095B1 (en) | 2025-03-18 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US12373504B2 (en) | Method for recommending a search term, method for training a target model and electronic device | |
| US11714816B2 (en) | Information search method and apparatus, device and storage medium | |
| CN111831821B (en) | Training sample generation method and device of text classification model and electronic equipment | |
| US11847150B2 (en) | Method and apparatus for training retrieval model, device and computer storage medium | |
| CN112084150B (en) | Model training, data retrieval method, device, equipment and storage medium | |
| US20210200813A1 (en) | Human-machine interaction method, electronic device, and storage medium | |
| CN112559870B (en) | Multi-model fusion method, device, electronic device and storage medium | |
| WO2021139209A1 (en) | Query auto-completion method, apparatus and device, and computer storage medium | |
| CN113746874A (en) | Voice packet recommendation method, device, equipment and storage medium | |
| CN111708934A (en) | Evaluation method, device, electronic device and storage medium of knowledge content | |
| CN111291184B (en) | Expression recommendation method, device, equipment and storage medium | |
| CN111563198B (en) | Material recall method, device, equipment and storage medium | |
| CN111091006A (en) | Entity intention system establishing method, device, equipment and medium | |
| CN111814077A (en) | Information point query method, device, equipment and medium | |
| CN110555486B (en) | Model structure delay prediction method and device and electronic equipment | |
| CN111125176A (en) | Service data searching method and device, electronic equipment and storage medium | |
| CN111666292A (en) | Similarity model establishing method and device for retrieving geographic positions | |
| CN111523019B (en) | Method, apparatus, device and storage medium for outputting information | |
| CN111460289A (en) | News information push method and device | |
| CN111241225B (en) | Method, device, equipment and storage medium for judging change of resident area | |
| CN112052410A (en) | Map point of interest update method and device | |
| CN112052397A (en) | User feature generation method and device, electronic equipment and storage medium | |
| CN111881255B (en) | Synonymous text acquisition method and device, electronic equipment and storage medium | |
| CN111460257B (en) | Topic generation method, device, electronic equipment and storage medium | |
| JP7204903B2 (en) | INFORMATION PUSH METHOD, DEVICE, DEVICE AND STORAGE MEDIUM |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | FEPP | Fee payment procedure | Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
| | AS | Assignment | Owner name: BEIJING BAIDU NETCOM SCIENCE AND TECHNOLOGY CO., LTD., UNITED STATES; Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:JIANG, FUCHUN;REEL/FRAME:057257/0383; Effective date: 20210301 |
| | AS | Assignment | Owner name: BAIDU ONLINE NETWORK TECHNOLOGY (BEIJING) CO., LTD., CHINA; Free format text: CORRECTIVE ASSIGNMENT TO CORRECT THE RECEIVING PARTY DATA PREVIOUSLY RECORDED AT REEL: 057257 FRAME: 0383. ASSIGNOR(S) HEREBY CONFIRMS THE ASSIGNMENT;ASSIGNOR:JIANG, FUCHUN;REEL/FRAME:058234/0116; Effective date: 20210301 |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE AFTER FINAL ACTION FORWARDED TO EXAMINER |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: ADVISORY ACTION MAILED |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
| | STCF | Information on status: patent grant | Free format text: PATENTED CASE |