CN113010666A - Abstract generation method, device, computer system and readable storage medium


Info

Publication number: CN113010666A
Application number: CN202110293478.8A
Authority: CN (China)
Prior art keywords: hidden layer, layer sequence, abstract, description text, probability
Legal status: Granted; currently Active
Other languages: Chinese (zh)
Other versions: CN113010666B (granted publication)
Inventors: 袁鹏 (Yuan Peng), 李浩然 (Li Haoran), 徐松 (Xu Song)
Assignee (original and current): JD Digital Technology Holdings Co Ltd
Application filed by JD Digital Technology Holdings Co Ltd; priority to CN202110293478.8A

Classifications

    • G06F16/345 Summarisation for human users (information retrieval; browsing/visualisation of unstructured textual data)
    • G06F16/367 Ontology (creation of semantic tools, e.g. ontology or thesauri)
    • G06F40/126 Character encoding (text processing; use of codes for handling textual entities)
    • G06F40/216 Parsing using statistical methods (natural language analysis)
    • G06F40/289 Phrasal analysis, e.g. finite state techniques or chunking (recognition of textual entities)
    • G06N3/045 Combinations of networks (neural network architectures)
    • G06N3/08 Learning methods (computing arrangements based on biological models; neural networks)

Abstract

The present disclosure provides an abstract generation method, including: acquiring text data for describing a target object, wherein the text data comprises a structured knowledge graph and an unstructured description text; encoding the structured knowledge graph and the unstructured description text respectively to generate a first encoder hidden layer sequence corresponding to the structured knowledge graph and a second encoder hidden layer sequence corresponding to the unstructured description text; and generating an abstract of the text data according to the first encoder hidden layer sequence and the second encoder hidden layer sequence. The disclosure also provides an abstract generation apparatus, a computer system, a readable storage medium, and a computer program product.

Description

Abstract generation method, device, computer system and readable storage medium
Technical Field
The present disclosure relates to the field of computer technologies, and in particular to an abstract generation method, apparatus, computer system, readable storage medium, and computer program product.
Background
Abstract generation techniques aim to condense a large amount of information into short, refined text, so that a user can grasp the meaning of the original information by reading the abstract alone. These techniques touch many aspects of daily life, such as extracting keywords from news, optimizing search-engine results, and recommending products on shopping platforms. With abstract generation, readers can quickly obtain the effective information, saving time and improving efficiency.
In implementing the disclosed concept, the inventors found at least the following problem in the related art: existing abstract generation methods do not fully mine and reference the original information, so the generated abstracts are of low quality.
Disclosure of Invention
In view of the above, the present disclosure provides an abstract generation method, apparatus, computer system, readable storage medium, and computer program product.
One aspect of the present disclosure provides an abstract generation method, including:
acquiring text data for describing a target object, wherein the text data comprises a structured knowledge graph and an unstructured description text;
encoding the structured knowledge graph and the unstructured description text respectively to generate a first encoder hidden layer sequence corresponding to the structured knowledge graph and a second encoder hidden layer sequence corresponding to the unstructured description text; and
generating an abstract of the text data according to the first encoder hidden layer sequence and the second encoder hidden layer sequence.
According to an embodiment of the present disclosure, wherein generating the summary of the text data according to the first encoder hidden layer sequence and the second encoder hidden layer sequence comprises:
decoding the first encoder hidden layer sequence to generate a copy probability of first abstract words of the structured knowledge graph;
decoding the second encoder hidden layer sequence to generate a copy probability of second abstract words of the unstructured description text and a generation probability of third abstract words corresponding to the unstructured description text;
obtaining a fusion probability based on the copy probability of the first abstract words, the copy probability of the second abstract words, and the generation probability of the third abstract words; and
generating the abstract of the text data according to the fusion probability.
According to an embodiment of the present disclosure, decoding the second encoder hidden layer sequence to generate the copy probability of the second abstract words of the unstructured description text and the generation probability of the third abstract words corresponding to the unstructured description text includes:
processing the hidden layer sequence of the second encoder to generate a hidden layer sequence of a decoder and a context vector sequence;
generating a generation probability of a third abstract word corresponding to the unstructured description text based on the decoder hidden layer sequence and the context vector sequence;
generating attention weights of second abstract words of the unstructured description text based on the decoder hidden layer sequence and the context vector sequence; and
and generating the duplication probability of the second abstract words based on the attention weight of the second abstract words.
According to an embodiment of the present disclosure, the structured knowledge graph comprises attribute identifiers and attribute values;
the first encoder hidden layer sequence includes an attribute identifier hidden layer sequence and an attribute value hidden layer sequence.
According to an embodiment of the present disclosure, decoding the first encoder hidden layer sequence to generate the copy probability of the first abstract words of the structured knowledge graph comprises:
processing the attribute identifier hidden layer sequence and the attribute value hidden layer sequence respectively, based on the decoder hidden layer sequence and the context vector sequence, to generate an attribute identifier attention weight corresponding to the attribute identifier semantic vector and an attribute value attention weight corresponding to the attribute value semantic vector; and
generating the copy probability of the first abstract words based on the attribute identifier attention weight and the attribute value attention weight.
According to an embodiment of the present disclosure, before encoding the structured knowledge graph and the unstructured description text respectively to generate the first encoder hidden layer sequence corresponding to the structured knowledge graph and the second encoder hidden layer sequence corresponding to the unstructured description text, the method further includes:
performing word segmentation processing on the structured knowledge graph and the unstructured description text so that each can be encoded respectively.
Yet another aspect of the present disclosure provides an abstract generation apparatus, including:
an obtaining module for obtaining text data for describing a target object, wherein the text data comprises a structured knowledge graph and an unstructured description text;
an encoding module for encoding the structured knowledge graph and the unstructured description text respectively to generate a first encoder hidden layer sequence corresponding to the structured knowledge graph and a second encoder hidden layer sequence corresponding to the unstructured description text; and
a generating module for generating an abstract of the text data according to the first encoder hidden layer sequence and the second encoder hidden layer sequence.
Yet another aspect of the present disclosure provides a computer system comprising:
one or more processors;
a memory for storing one or more programs,
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the abstract generation method described above.
Yet another aspect of the present disclosure provides a computer-readable storage medium having stored thereon executable instructions that, when executed by a processor, cause the processor to implement the abstract generation method described above.
Yet another aspect of the present disclosure provides a computer program product comprising a computer program containing computer-executable instructions that, when executed, implement the abstract generation method described above.
According to embodiments of the present disclosure, text data describing a target object is acquired, where the text data comprises a structured knowledge graph and an unstructured description text; the structured knowledge graph and the unstructured description text are encoded respectively to generate a first encoder hidden layer sequence corresponding to the structured knowledge graph and a second encoder hidden layer sequence corresponding to the unstructured description text; and the abstract of the text data is generated according to the two hidden layer sequences. Because the structured knowledge graph and the unstructured description text are combined both in encoding and in generating the abstract, the technical problem in the related art that the original information is not fully mined and referenced, leading to low-quality abstracts, is at least partially solved, and the technical effect of fully mining and referencing the original information to improve the completeness and quality of the generated abstract is achieved.
Drawings
The above and other objects, features and advantages of the present disclosure will become more apparent from the following description of embodiments of the present disclosure with reference to the accompanying drawings, in which:
FIG. 1 schematically illustrates an exemplary system architecture to which the abstract generation method and apparatus of the present disclosure may be applied;
FIG. 2 schematically illustrates a flowchart of an abstract generation method according to an embodiment of the present disclosure;
FIG. 3 schematically illustrates a flowchart of an abstract generation method according to another embodiment of the present disclosure;
FIG. 4 schematically shows a block diagram of an abstract generation apparatus according to an embodiment of the present disclosure; and
FIG. 5 schematically illustrates a block diagram of a computer system suitable for implementing an abstract generation method according to an embodiment of the present disclosure.
Detailed Description
Hereinafter, embodiments of the present disclosure will be described with reference to the accompanying drawings. It should be understood that the description is illustrative only and is not intended to limit the scope of the present disclosure. In the following detailed description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the embodiments of the disclosure. It may be evident, however, that one or more embodiments may be practiced without these specific details. Moreover, in the following description, descriptions of well-known structures and techniques are omitted so as to not unnecessarily obscure the concepts of the present disclosure.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the disclosure. The terms "comprises," "comprising," and the like, as used herein, specify the presence of stated features, steps, operations, and/or components, but do not preclude the presence or addition of one or more other features, steps, operations, or components.
All terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art unless otherwise defined. It is noted that the terms used herein should be interpreted as having a meaning that is consistent with the context of this specification and should not be interpreted in an idealized or overly formal sense.
Where a convention analogous to "at least one of A, B and C, etc." is used, such a construction is generally intended in the sense one having skill in the art would understand the convention (e.g., "a system having at least one of A, B and C" would include but not be limited to systems that have A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B and C together, etc.). Where a convention analogous to "at least one of A, B or C, etc." is used, such a construction is generally intended in the sense one having skill in the art would understand the convention (e.g., "a system having at least one of A, B or C" would include but not be limited to systems that have A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B and C together, etc.).
The embodiments of the present disclosure provide an abstract generation method. The method comprises the following operations: acquiring text data for describing a target object, wherein the text data comprises a structured knowledge graph and an unstructured description text; encoding the structured knowledge graph and the unstructured description text respectively to generate a first encoder hidden layer sequence corresponding to the structured knowledge graph and a second encoder hidden layer sequence corresponding to the unstructured description text; and generating the abstract of the text data according to the first encoder hidden layer sequence and the second encoder hidden layer sequence.
Fig. 1 schematically illustrates an exemplary system architecture 100 to which the abstract generation method and apparatus may be applied, according to an embodiment of the present disclosure. It should be noted that fig. 1 is only an example of a system architecture to which the embodiments of the present disclosure may be applied, intended to help those skilled in the art understand the technical content of the present disclosure; it does not mean that the embodiments of the present disclosure cannot be applied to other devices, systems, environments, or scenarios.
As shown in fig. 1, the system architecture 100 according to this embodiment may include terminal devices 101, 102, 103, a network 104 and a server 105. The network 104 serves as a medium for providing communication links between the terminal devices 101, 102, 103 and the server 105. Network 104 may include various connection types, such as wired and/or wireless communication links, and so forth.
Users may use the terminal devices 101, 102, 103 to interact with the server 105 via the network 104, to receive or send messages and the like. Various communication client applications may be installed on the terminal devices 101, 102, 103, such as shopping applications, web browser applications, search applications, instant messaging tools, mailbox clients, and/or social platform software (by way of example only).
The terminal devices 101, 102, 103 may be various electronic devices having a display screen and supporting web browsing, including but not limited to smart phones, tablet computers, laptop portable computers, desktop computers, and the like.
The server 105 may be a server providing various services, such as a background management server (for example only) providing support for websites browsed by users using the terminal devices 101, 102, 103. The background management server may analyze and perform other processing on the received data such as the user request, and feed back a processing result (e.g., a webpage, information, or data obtained or generated according to the user request) to the terminal device.
It should be noted that the abstract generation method provided by the embodiments of the present disclosure may generally be executed by the server 105. Accordingly, the abstract generation apparatus provided by the embodiments of the present disclosure may generally be disposed in the server 105. The abstract generation method provided by the embodiments of the present disclosure may also be performed by a server or server cluster that is different from the server 105 and capable of communicating with the terminal devices 101, 102, 103 and/or the server 105. Accordingly, the abstract generation apparatus provided by the embodiments of the present disclosure may also be disposed in a server or server cluster different from the server 105 and capable of communicating with the terminal devices 101, 102, 103 and/or the server 105. Alternatively, the abstract generation method provided by the embodiments of the present disclosure may also be executed by the terminal device 101, 102, or 103, or by another terminal device different from the terminal devices 101, 102, and 103. Accordingly, the abstract generation apparatus provided in the embodiments of the present disclosure may also be disposed in the terminal device 101, 102, or 103, or in another terminal device different from the terminal devices 101, 102, and 103.
For example, the text data describing the target object may be originally stored in any one of the terminal devices 101, 102, or 103 (for example, but not limited to, the terminal device 101), or stored on an external storage device from which it may be imported into the terminal device 101. The terminal device 101 may then transmit the text data describing the target object to another terminal device, server, or server cluster, and the abstract generation method provided by the embodiments of the present disclosure may be executed by the other terminal device, server, or server cluster that receives the text data.
It should be understood that the number of terminal devices, networks, and servers in fig. 1 is merely illustrative. There may be any number of terminal devices, networks, and servers, as desired for implementation.
Fig. 2 schematically shows a flowchart of an abstract generation method according to an embodiment of the present disclosure.
As shown in fig. 2, the method includes operations S210 to S230.
In operation S210, text data for describing a target object is acquired, wherein the text data includes a structured knowledge graph and an unstructured description text;
in operation S220, encoding the structured knowledge map and the unstructured description text, respectively, to generate a first encoder hidden layer sequence corresponding to the structured knowledge map and a second encoder hidden layer sequence corresponding to the unstructured description text; and
in operation S230, a summary of the text data is generated according to the first and second encoder hidden layer sequences.
According to an embodiment of the present disclosure, the text data describing the target object may be, for example, heterogeneous product data describing a certain target product, which includes a structured knowledge graph and an unstructured description text. The structured knowledge graph may be, for example, a structured product knowledge graph, such as tabular data listing product attribute identifiers and the product attribute values corresponding to those identifiers; the unstructured description text may be, for example, a detailed product description, such as text describing the product's performance, specifications, and the like.
According to embodiments of the present disclosure, the structured knowledge graph and the unstructured description text are encoded respectively, and the abstract of the text data is generated based on the first encoder hidden layer sequence and the second encoder hidden layer sequence obtained from the encoding. This achieves the technical effect of fully mining and referencing the original information when generating the abstract, improving the completeness and quality of abstract generation.
The method of fig. 2 is further described with reference to fig. 3 in conjunction with specific embodiments.
Fig. 3 schematically shows a flowchart of an abstract generation method according to another embodiment of the present disclosure.
As shown in fig. 3, the abstract generation method includes operations S310 to S320, S331, S332, and S340 to S350.
In operation S310, text data for describing a target object is acquired, wherein the text data includes a structured knowledge graph and an unstructured description text.
According to an embodiment of the present disclosure, the unstructured description text may be the detailed product description text of the target product, represented by the sequence x = {x_1, x_2, ..., x_n}, where each x_i is a word in the unstructured description text.
According to an alternative embodiment of the present disclosure, the structured knowledge-graph may include attribute identifications and attribute values.
Table 1 is a structured knowledge graph of a target product according to embodiments of the present disclosure. As shown in Table 1, the attribute identifiers describe the attributes that characterize the target product, for example color, height, and capacity; the attribute values are the specific parameters corresponding to the attribute identifiers, for example the specific color white, the specific height 2 meters, and the specific capacity 4 liters.
TABLE 1
Product name  Color  Height    Capacity
Bottle        White  2 meters  4 liters
According to embodiments of the present disclosure, the attribute identifiers in the structured knowledge graph (such as color, height, and capacity) can be represented by the sequence k = {k_1, k_2, ..., k_m}; the attribute values (such as white, 2 meters, and 4 liters) can be represented by the sequence v = {v_1, v_2, ..., v_m}.
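To make the data layout concrete, the following minimal sketch (Python; the tokens are hypothetical, taken from Table 1) shows the three sequences the method consumes:

```python
# Hypothetical token sequences for the product in Table 1.
k = ["name", "color", "height", "capacity"]      # attribute identifiers k = {k_1, ..., k_m}
v = ["bottle", "white", "2 meters", "4 liters"]  # attribute values      v = {v_1, ..., v_m}
x = "this white bottle is 2 meters tall and holds 4 liters".split()
# x = {x_1, ..., x_n}: one entry per word of the unstructured description text
```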
According to an optional embodiment of the present disclosure, before the structured knowledge graph and the unstructured description text are encoded respectively to generate the first encoder hidden layer sequence corresponding to the structured knowledge graph and the second encoder hidden layer sequence corresponding to the unstructured description text, a word segmentation operation may be performed on the structured knowledge graph and the unstructured description text so that each can be encoded respectively.
According to embodiments of the present disclosure, performing word segmentation first on the structured knowledge graph and the unstructured description text helps obtain the sequences x, k, and v for the subsequent encoding. In the embodiments of the present disclosure, the word segmentation operation may be completed with existing word segmentation techniques, which are not described in detail here; a sketch follows.
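As one possible realization of this step, the sketch below uses the jieba library, a common Chinese word segmenter; the patent does not prescribe a particular segmentation tool, so this choice is an assumption:

```python
# Word-segmentation sketch using jieba (an assumed tool choice).
import jieba

description = "白色水瓶，高2米，容量4升"  # hypothetical product description
x = jieba.lcut(description)              # list of words, e.g. ['白色', '水瓶', ...]
print(x)
```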
In operation S320, the structured knowledge graph and the unstructured description text are encoded respectively, generating a first encoder hidden layer sequence corresponding to the structured knowledge graph and a second encoder hidden layer sequence corresponding to the unstructured description text.
According to embodiments of the present disclosure, the structured knowledge graph and the unstructured description text may be encoded separately using bidirectional LSTM encoders (BiLSTM); this is not limiting, however, and a Transformer network may also be used as the encoder.
According to an optional embodiment of the present disclosure, encoding the structured knowledge graph and the unstructured description text separately with bidirectional LSTM encoders (BiLSTM) makes it possible to train encoders with good encoding performance even when the amount of training data is small.
According to an embodiment of the present disclosure, the first encoder hidden layer sequence may include an attribute identifier hidden layer sequence and an attribute value hidden layer sequence.
According to an alternative embodiment of the present disclosure, the sequences k and v are encoded by the bidirectional LSTM encoder BiLSTM, generating the attribute identifier hidden layer sequence h^k and the attribute value hidden layer sequence h^v respectively, as shown in formulas (1) and (2):

h_i^k = BiLSTM(k_i, h_{i-1}^k)    (1)
h_i^v = BiLSTM(v_i, h_{i-1}^v)    (2)

where i denotes the i-th hidden state.
According to an alternative embodiment of the present disclosure, the sequence x is encoded using a bidirectional LSTM encoder BiLSTM, generating the second encoder hidden layer sequence h^x, as shown in formula (3):

h_i^x = BiLSTM(x_i, h_{i-1}^x)    (3)

where i denotes the i-th hidden state.
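A minimal sketch of this dual-encoder step (formulas (1) to (3)), assuming PyTorch and toy dimensions; the patent fixes neither the layer sizes nor whether k and v share one encoder, so sharing is assumed here:

```python
import torch
import torch.nn as nn

emb_dim, hid_dim, vocab = 64, 128, 1000
embed  = nn.Embedding(vocab, emb_dim)
enc_kv = nn.LSTM(emb_dim, hid_dim, bidirectional=True, batch_first=True)  # knowledge-graph encoder
enc_x  = nn.LSTM(emb_dim, hid_dim, bidirectional=True, batch_first=True)  # description-text encoder

k_ids = torch.randint(0, vocab, (1, 4))   # attribute-identifier token ids
v_ids = torch.randint(0, vocab, (1, 4))   # attribute-value token ids
x_ids = torch.randint(0, vocab, (1, 20))  # description-text token ids

h_k, _ = enc_kv(embed(k_ids))  # attribute-identifier hidden sequence, formula (1)
h_v, _ = enc_kv(embed(v_ids))  # attribute-value hidden sequence, formula (2)
h_x, _ = enc_x(embed(x_ids))   # second encoder hidden sequence, formula (3)
```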
In operation S331, the first encoder hidden layer sequence is decoded to generate the copy probability of the first abstract words of the structured knowledge graph.
In operation S332, the second encoder hidden layer sequence is decoded to generate the copy probability of the second abstract words of the unstructured description text and the generation probability of the third abstract words corresponding to the unstructured description text.
In operation S340, a fusion probability is obtained based on the copy probability of the first abstract words, the copy probability of the second abstract words, and the generation probability of the third abstract words.
In operation S350, the abstract of the text data is generated according to the fusion probability.
According to embodiments of the present disclosure, the abstract generation method not only encodes the structured knowledge graph and the unstructured description text separately with two encoders; the final decoding also jointly considers the copy probability of the first abstract words appearing in the structured knowledge graph, the copy probability of the second abstract words appearing in the unstructured description text, and the generation probability of the third abstract words produced from the semantics of the unstructured description text. This makes the generated abstract more accurate while fully mining the original information.
According to an embodiment of the present disclosure, decoding the second encoder hidden layer sequence to generate the copy probability of the second abstract words of the unstructured description text and the generation probability of the third abstract words corresponding to the unstructured description text may include the following operations:
processing the second encoder hidden layer sequence to generate a decoder hidden layer sequence and a context vector sequence;
generating the generation probability of the third abstract words corresponding to the unstructured description text based on the decoder hidden layer sequence and the context vector sequence;
generating attention weights of the second abstract words of the unstructured description text based on the decoder hidden layer sequence and the context vector sequence; and
generating the copy probability of the second abstract words based on the attention weights of the second abstract words.
According to an alternative embodiment of the present disclosure, the second encoder hidden layer sequence may be decoded with a unidirectional LSTM decoder (UniLSTM), and with an attention mechanism, the decoder hidden layer sequence and the context vector sequence are finally generated.
According to embodiments of the present disclosure, the second encoder hidden layer sequence is processed to generate the decoder hidden layer sequence and the context vector sequence, as shown in formulas (4) to (7):

s_t = LSTM(y_{t-1}, s_{t-1}, c_{t-1})    (4)
e_{t,i} = u_a^T tanh(W_a s_t + V_a h_i^x)    (5)
α_{t,i} = exp(e_{t,i}) / Σ_j exp(e_{t,j})    (6)
c_t = Σ_i α_{t,i} h_i^x    (7)

where s_t is the decoder hidden state at time t, y_{t-1} is the output at time t-1, c_t is the context vector at time t, and u_a, W_a, V_a are model parameter matrices.
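A sketch of one decoder step with the attention of formulas (4) to (7), assuming PyTorch; the exact placement of u_a, W_a, V_a follows the standard additive-attention form rather than the unrecoverable patent drawings:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

emb, hid = 64, 128
dec_cell = nn.LSTMCell(emb + 2 * hid, hid)  # input: [y_{t-1} embedding; c_{t-1}]
W_a = nn.Linear(hid, hid, bias=False)       # applied to the decoder state s_t
V_a = nn.Linear(2 * hid, hid, bias=False)   # applied to the encoder states h_i
u_a = nn.Linear(hid, 1, bias=False)

def decoder_step(y_prev_emb, c_prev, state, h_enc):
    # formula (4): advance the unidirectional LSTM decoder
    s_t, cell = dec_cell(torch.cat([y_prev_emb, c_prev], dim=-1), state)
    # formula (5): alignment scores e_{t,i} = u_a^T tanh(W_a s_t + V_a h_i)
    e = u_a(torch.tanh(W_a(s_t).unsqueeze(1) + V_a(h_enc))).squeeze(-1)
    # formula (6): attention weights via softmax over source positions
    alpha = F.softmax(e, dim=-1)
    # formula (7): context vector c_t as the attention-weighted sum of h_i
    c_t = torch.bmm(alpha.unsqueeze(1), h_enc).squeeze(1)
    return s_t, (s_t, cell), alpha, c_t

# Usage with the toy shapes from the encoder sketch above:
h_x   = torch.randn(1, 20, 2 * hid)
state = (torch.zeros(1, hid), torch.zeros(1, hid))
s_t, state, alpha, c_t = decoder_step(torch.randn(1, emb), torch.zeros(1, 2 * hid), state, h_x)
```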
According to an embodiment of the present disclosure, the generation probability of the third abstract words corresponding to the unstructured description text, based on the decoder hidden layer sequence and the context vector sequence, may be as shown in formula (8):

P_gen = softmax(V_c tanh(W_c [s_t; c_t]))    (8)

where W_c and V_c are model parameter matrices, c_t is the context vector at time t, and s_t is the decoder hidden state at time t.
According to embodiments of the present disclosure, keywords in the unstructured description text can be copied directly as abstract words. The probability of directly copying a word w of the unstructured description text as a second abstract word may be as shown in formula (9):

P_copy-x(w) = Σ_{i: x_i = w} α_{t,i}    (9)

where α_{t,i} is the attention weight of the second abstract word.
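A sketch of formulas (8) and (9) under the same assumptions: the generation distribution projects [s_t; c_t] onto the vocabulary, and the copy distribution scatter-adds attention weights onto the vocabulary ids of the source words, as in a standard pointer-generator:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

hid, vocab = 128, 1000
W_c = nn.Linear(3 * hid, hid)  # mixes [s_t; c_t] (s_t: hid, c_t: 2*hid)
V_c = nn.Linear(hid, vocab)    # projects to the vocabulary

def gen_and_copy_probs(s_t, c_t, alpha, x_ids):
    # formula (8): P_gen = softmax(V_c tanh(W_c [s_t; c_t]))
    p_gen = F.softmax(V_c(torch.tanh(W_c(torch.cat([s_t, c_t], dim=-1)))), dim=-1)
    # formula (9): P_copy-x(w) sums the attention weight of every source
    # position i where the description word x_i equals w
    p_copy_x = torch.zeros_like(p_gen).scatter_add_(1, x_ids, alpha)
    return p_gen, p_copy_x

s_t, c_t = torch.randn(1, hid), torch.randn(1, 2 * hid)
alpha    = F.softmax(torch.randn(1, 20), dim=-1)
p_gen, p_copy_x = gen_and_copy_probs(s_t, c_t, alpha, torch.randint(0, vocab, (1, 20)))
```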
According to embodiments of the present disclosure, the abstract generation method not only considers generating abstract words from the semantics of the unstructured description text, but also considers directly copying keywords of the unstructured description text as abstract words.
According to an alternative embodiment of the present disclosure, keywords in the structured knowledge graph may likewise be copied directly as abstract words; the copy probability of the first abstract words is the probability of directly copying keywords from the structured knowledge graph.
According to an embodiment of the present disclosure, decoding the first encoder hidden layer sequence to generate the copy probability of the first abstract words of the structured knowledge graph may include the following operations:
processing the attribute identifier hidden layer sequence and the attribute value hidden layer sequence respectively to generate an attribute identifier attention weight corresponding to the attribute identifier hidden layer sequence and an attribute value attention weight corresponding to the attribute value hidden layer sequence; and
generating the copy probability of the first abstract words based on the attribute identifier attention weight and the attribute value attention weight.
According to an embodiment of the present disclosure, the copy probability of the first abstract words may be as shown in formula (10); the attribute identifier attention weight may be as shown in formula (11), and the attribute value attention weight as shown in formula (12):

P_copy-kv(w) = Σ_{i: v_i = w} β_{t,i} γ_{t,i}    (10)
β_{t,i} = softmax_i(u_a^T tanh(W_a s_t + V_a h_i^k))    (11)
γ_{t,i} = softmax_i(u_a^T tanh(W_a s_t + V_a h_i^v))    (12)

where β_{t,i} is the attribute identifier attention weight, γ_{t,i} is the attribute value attention weight, s_t is the decoder hidden state at time t, and u_a, W_a, V_a are model parameter matrices.
According to embodiments of the present disclosure, obtaining the copy probability of the first abstract words through this secondary attention weighting, that is, by combining the attribute identifier attention weight with the attribute value attention weight, fully accounts for the recognition of low-frequency keywords, so that the generated abstract sufficiently captures the key information.
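A sketch of the secondary attention weighting of formulas (10) to (12); combining the two weights as a per-slot product and scattering onto the attribute-value words is an assumed reading of the two-level scheme, not the patent's verbatim formula:

```python
import torch
import torch.nn.functional as F

def kg_copy_prob(beta, gamma, v_ids, vocab_size):
    # beta:  attribute-identifier attention weights, formula (11), shape (1, m)
    # gamma: attribute-value attention weights, formula (12), shape (1, m)
    weights = beta * gamma  # secondary (two-level) attention weighting
    # formula (10): accumulate the weight of every slot whose value word is w
    return torch.zeros(1, vocab_size).scatter_add_(1, v_ids, weights)

m, vocab = 4, 1000
beta  = F.softmax(torch.randn(1, m), dim=-1)
gamma = F.softmax(torch.randn(1, m), dim=-1)
p_copy_kv = kg_copy_prob(beta, gamma, torch.randint(0, vocab, (1, m)), vocab)
```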
According to embodiments of the present disclosure, the final fusion probability integrates the copy probability of the first abstract words of the structured knowledge graph, the copy probability of the second abstract words of the unstructured description text, and the generation probability of the third abstract words corresponding to the unstructured description text; that is, it fuses the semantic generation probability with the keyword copy probabilities. It may be expressed as

P(w) = λ_1 P_gen(w) + λ_2 P_copy-x(w) + λ_3 P_copy-kv(w)

where λ_1, λ_2, and λ_3 are fusion weights.
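A sketch of the fusion step; mixing the three distributions with gates λ that sum to one (here via a softmax over learned logits) is an assumption about the exact fusion form:

```python
import torch
import torch.nn.functional as F

def fuse(p_gen, p_copy_x, p_copy_kv, gate_logits):
    lam = F.softmax(gate_logits, dim=-1)  # λ_1 + λ_2 + λ_3 = 1
    return lam[0] * p_gen + lam[1] * p_copy_x + lam[2] * p_copy_kv

vocab = 1000
p = fuse(F.softmax(torch.randn(1, vocab), dim=-1),
         F.softmax(torch.randn(1, vocab), dim=-1),
         F.softmax(torch.randn(1, vocab), dim=-1),
         torch.randn(3))
```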
In summary, with the abstract generation method of the embodiments of the present disclosure: first, two encoders are used to encode the structured knowledge graph and the unstructured description text respectively, so the original information is fully mined; second, a dual copy mechanism is matched to this encoding, integrating the copying of keyword information from both the structured knowledge graph and the unstructured description text, so the generated abstract is highly complete; and third, secondary attention weighting is adopted when computing the copy probability for the structured knowledge graph, improving the recognition of low-frequency keywords and making the generated abstract more accurate.
According to other embodiments of the present disclosure, it should be noted that the abstract generation method of the embodiments of the present disclosure may generate the abstract by constructing an encoding-decoding model. For example, the encoding-decoding model includes two bidirectional LSTM encoders and one unidirectional LSTM decoder, and employs an attention mechanism and a copy mechanism. The encoding-decoding model is trained by maximum likelihood, with a loss function as shown in formula (13):

loss = -(1/T) Σ_{t=1}^{T} log P(y_t)    (13)

where T is the number of words in the text data.
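A sketch of the maximum-likelihood objective of formula (13): the average negative log-probability that the fused distribution assigns to each reference abstract word:

```python
import torch

def mle_loss(fused_probs, target_ids):
    # fused_probs: (T, vocab) fused distribution at each decoding step t
    # target_ids:  (T,) reference abstract token ids y_t
    step_probs = fused_probs.gather(1, target_ids.unsqueeze(1)).squeeze(1)
    return -torch.log(step_probs + 1e-12).mean()  # formula (13)

T, vocab = 5, 1000
loss = mle_loss(torch.softmax(torch.randn(T, vocab), dim=-1),
                torch.randint(0, vocab, (T,)))
```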
The model is then trained using text data that contains both a structured knowledge graph and an unstructured description text as training samples.
According to embodiments of the present disclosure, using such text data as training samples and training with maximum likelihood makes the abstracts generated by the trained encoding-decoding model refined, accurate, and highly complete.
Fig. 4 schematically shows a block diagram of an abstract generation apparatus according to an embodiment of the present disclosure.
As shown in fig. 4, the abstract generation apparatus 400 includes an obtaining module 410, an encoding module 420, and a generating module 430.
The obtaining module 410 is configured to obtain text data for describing a target object, wherein the text data comprises a structured knowledge graph and an unstructured description text;
the encoding module 420 is configured to encode the structured knowledge graph and the unstructured description text respectively to generate a first encoder hidden layer sequence corresponding to the structured knowledge graph and a second encoder hidden layer sequence corresponding to the unstructured description text; and
the generating module 430 is configured to generate the abstract of the text data according to the first encoder hidden layer sequence and the second encoder hidden layer sequence.
According to embodiments of the present disclosure, the structured knowledge graph and the unstructured description text are encoded respectively, and the abstract of the text data is generated based on the first encoder hidden layer sequence and the second encoder hidden layer sequence obtained from the encoding, achieving the technical effect of fully mining and referencing the original information and improving the completeness and quality of abstract generation.
According to an embodiment of the present disclosure, the generating module 430 includes a first decoding unit, a second decoding unit, an obtaining unit, and a generating unit.
The first decoding unit is configured to decode the first encoder hidden layer sequence to generate the copy probability of the first abstract words of the structured knowledge graph;
the second decoding unit is configured to decode the second encoder hidden layer sequence to generate the copy probability of the second abstract words of the unstructured description text and the generation probability of the third abstract words corresponding to the unstructured description text;
the obtaining unit is configured to obtain a fusion probability based on the copy probability of the first abstract words, the copy probability of the second abstract words, and the generation probability of the third abstract words; and
the generating unit is configured to generate the abstract of the text data according to the fusion probability.
According to an embodiment of the present disclosure, the second decoding unit includes a first generating subunit, a second generating subunit, a third generating subunit, and a fourth generating subunit.
The first generating subunit is configured to process the second encoder hidden layer sequence to generate a decoder hidden layer sequence and a context vector sequence;
the second generating subunit is configured to generate the generation probability of the third abstract words corresponding to the unstructured description text based on the decoder hidden layer sequence and the context vector sequence;
the third generating subunit is configured to generate attention weights of the second abstract words of the unstructured description text based on the decoder hidden layer sequence and the context vector sequence; and
the fourth generating subunit is configured to generate the copy probability of the second abstract words based on the attention weights of the second abstract words.
According to an embodiment of the present disclosure, the structured knowledge graph comprises attribute identifiers and attribute values;
the first encoder hidden layer sequence includes an attribute identifier hidden layer sequence and an attribute value hidden layer sequence.
According to an embodiment of the present disclosure, the first decoding unit includes a fifth generating subunit and a sixth generating subunit.
The fifth generating subunit is configured to process the attribute identifier hidden layer sequence and the attribute value hidden layer sequence respectively, based on the decoder hidden layer sequence and the context vector sequence, to generate an attribute identifier attention weight corresponding to the attribute identifier semantic vector and an attribute value attention weight corresponding to the attribute value semantic vector; and
the sixth generating subunit is configured to generate the copy probability of the first abstract words based on the attribute identifier attention weight and the attribute value attention weight.
According to an embodiment of the present disclosure, the abstract generation apparatus 400 further includes a word segmentation module.
The word segmentation module is configured to perform word segmentation processing on the structured knowledge graph and the unstructured description text so that each can be encoded respectively.
Any number of modules, sub-modules, units, sub-units, or at least part of the functionality of any number thereof according to embodiments of the present disclosure may be implemented in one module. Any one or more of the modules, sub-modules, units, and sub-units according to the embodiments of the present disclosure may be implemented by being split into a plurality of modules. Any one or more of the modules, sub-modules, units, sub-units according to embodiments of the present disclosure may be implemented at least in part as a hardware circuit, such as a Field Programmable Gate Array (FPGA), a Programmable Logic Array (PLA), a system on a chip, a system on a substrate, a system on a package, an Application Specific Integrated Circuit (ASIC), or may be implemented in any other reasonable manner of hardware or firmware by integrating or packaging a circuit, or in any one of or a suitable combination of software, hardware, and firmware implementations. Alternatively, one or more of the modules, sub-modules, units, sub-units according to embodiments of the disclosure may be at least partially implemented as a computer program module, which when executed may perform the corresponding functions.
For example, any number of the obtaining module 410, the encoding module 420, and the generating module 430 may be combined and implemented in one module/unit/sub-unit, or any one of the modules/units/sub-units may be split into a plurality of modules/units/sub-units. Alternatively, at least part of the functionality of one or more of these modules/units/sub-units may be combined with at least part of the functionality of other modules/units/sub-units and implemented in one module/unit/sub-unit. According to an embodiment of the present disclosure, at least one of the obtaining module 410, the encoding module 420, and the generating module 430 may be implemented at least in part as a hardware circuit, such as a Field Programmable Gate Array (FPGA), a Programmable Logic Array (PLA), a system on a chip, a system on a substrate, a system on a package, or an Application Specific Integrated Circuit (ASIC), or may be implemented in any other reasonable manner of integrating or packaging a circuit in hardware or firmware, or in any one of, or a suitable combination of, the three implementations of software, hardware, and firmware. Alternatively, at least one of the obtaining module 410, the encoding module 420, and the generating module 430 may be implemented at least in part as a computer program module that, when executed, performs the corresponding functions.
It should be noted that the abstract generation apparatus portion of the embodiments of the present disclosure corresponds to the abstract generation method portion; for details of the apparatus, refer to the description of the method, which is not repeated here.
Fig. 5 schematically illustrates a block diagram of a computer system suitable for implementing the above-described method according to an embodiment of the present disclosure. The computer system illustrated in fig. 5 is only an example and should not impose any limitation on the scope of use or functionality of the embodiments of the present disclosure.
As shown in fig. 5, a computer system 500 according to an embodiment of the present disclosure includes a processor 501, which can perform various appropriate actions and processes according to a program stored in a Read Only Memory (ROM)502 or a program loaded from a storage section 508 into a Random Access Memory (RAM) 503. The processor 501 may comprise, for example, a general purpose microprocessor (e.g., a CPU), an instruction set processor and/or associated chipset, and/or a special purpose microprocessor (e.g., an Application Specific Integrated Circuit (ASIC)), among others. The processor 501 may also include onboard memory for caching purposes. Processor 501 may include a single processing unit or multiple processing units for performing different actions of a method flow according to embodiments of the disclosure.
In the RAM 503, various programs and data necessary for the operation of the system 500 are stored. The processor 501, the ROM 502, and the RAM 503 are connected to each other by a bus 504. The processor 501 performs various operations of the method flows according to the embodiments of the present disclosure by executing programs in the ROM 502 and/or the RAM 503. Note that the programs may also be stored in one or more memories other than the ROM 502 and the RAM 503. The processor 501 may also perform various operations of method flows according to embodiments of the present disclosure by executing programs stored in the one or more memories.
According to an embodiment of the present disclosure, system 500 may also include an input/output (I/O) interface 505, input/output (I/O) interface 505 also being connected to bus 504. The system 500 may also include one or more of the following components connected to the I/O interface 505: an input portion 506 including a keyboard, a mouse, and the like; an output portion 507 including a display such as a Cathode Ray Tube (CRT), a Liquid Crystal Display (LCD), and the like, and a speaker; a storage portion 508 including a hard disk and the like; and a communication section 509 including a network interface card such as a LAN card, a modem, or the like. The communication section 509 performs communication processing via a network such as the internet. The driver 510 is also connected to the I/O interface 505 as necessary. A removable medium 511 such as a magnetic disk, an optical disk, a magneto-optical disk, a semiconductor memory, or the like is mounted on the drive 510 as necessary, so that a computer program read out therefrom is mounted into the storage section 508 as necessary.
According to embodiments of the present disclosure, method flows according to embodiments of the present disclosure may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable storage medium, the computer program containing program code for performing the method illustrated by the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network through the communication section 509, and/or installed from the removable medium 511. The computer program, when executed by the processor 501, performs the above-described functions defined in the system of the embodiments of the present disclosure. The systems, devices, apparatuses, modules, units, etc. described above may be implemented by computer program modules according to embodiments of the present disclosure.
The present disclosure also provides a computer-readable storage medium, which may be contained in the apparatus/device/system described in the above embodiments; or may exist separately and not be assembled into the device/apparatus/system. The computer-readable storage medium carries one or more programs which, when executed, implement the method according to an embodiment of the disclosure.
According to an embodiment of the present disclosure, the computer-readable storage medium may be a non-volatile computer-readable storage medium. Examples may include, but are not limited to: a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
For example, according to embodiments of the present disclosure, a computer-readable storage medium may include ROM 502 and/or RAM 503 and/or one or more memories other than ROM 502 and RAM 503 described above.
Embodiments of the present disclosure also include a computer program product comprising a computer program containing program code for performing the method provided by the embodiments of the present disclosure, when the computer program product is run on an electronic device, the program code being configured to cause the electronic device to implement the summary generation method provided by the embodiments of the present disclosure.
The computer program, when executed by the processor 501, performs the above-described functions defined in the system/apparatus of the embodiments of the present disclosure. The systems, apparatuses, modules, units, etc. described above may be implemented by computer program modules according to embodiments of the present disclosure.
In one embodiment, the computer program may be hosted on a tangible storage medium such as an optical storage device, a magnetic storage device, or the like. In another embodiment, the computer program may also be transmitted, distributed in the form of a signal on a network medium, downloaded and installed through the communication section 509, and/or installed from the removable medium 511. The computer program containing program code may be transmitted using any suitable network medium, including but not limited to: wireless, wired, etc., or any suitable combination of the foregoing.
In accordance with embodiments of the present disclosure, program code for executing the computer programs provided by the embodiments of the present disclosure may be written in any combination of one or more programming languages; in particular, these computer programs may be implemented using high-level procedural and/or object-oriented programming languages, and/or assembly/machine languages. Programming languages include, but are not limited to, Java, C++, Python, the "C" language, and the like. The program code may execute entirely on the user computing device, partly on the user device, partly on a remote computing device, or entirely on the remote computing device or server. In the latter case, the remote computing device may be connected to the user computing device through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computing device (for example, through the Internet using an Internet service provider).
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods, and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the blocks may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams or flowchart illustration, and combinations of blocks in the block diagrams or flowchart illustration, can be implemented by special-purpose hardware-based systems that perform the specified functions or acts, or by combinations of special-purpose hardware and computer instructions.
Those skilled in the art will appreciate that the features recited in the various embodiments and/or claims of the present disclosure can be combined in various ways, even if such combinations are not expressly recited in the present disclosure. In particular, such combinations may be made without departing from the spirit and teaching of the present disclosure, and all such combinations fall within the scope of the present disclosure.
The embodiments of the present disclosure have been described above. However, these examples are for illustrative purposes only and are not intended to limit the scope of the present disclosure. Although the embodiments are described separately above, this does not mean that the measures in the embodiments cannot be used in advantageous combination. The scope of the disclosure is defined by the appended claims and equivalents thereof. Various alternatives and modifications can be devised by those skilled in the art without departing from the scope of the present disclosure, and such alternatives and modifications are intended to be within the scope of the present disclosure.

Claims (10)

1. An abstract generation method, comprising:
acquiring text data for describing a target object, wherein the text data comprises a structured knowledge graph and an unstructured description text;
encoding the structured knowledge graph and the unstructured description text respectively to generate a first encoder hidden layer sequence corresponding to the structured knowledge graph and a second encoder hidden layer sequence corresponding to the unstructured description text; and
generating an abstract of the text data according to the first encoder hidden layer sequence and the second encoder hidden layer sequence.
2. The method of claim 1, wherein generating the abstract of the text data according to the first encoder hidden layer sequence and the second encoder hidden layer sequence comprises:
decoding the first encoder hidden layer sequence to generate a copy probability of a first abstract word of the structured knowledge graph;
decoding the second encoder hidden layer sequence to generate a copy probability of a second abstract word of the unstructured description text and a generation probability of a third abstract word corresponding to the unstructured description text;
obtaining a fusion probability based on the copy probability of the first abstract word, the copy probability of the second abstract word, and the generation probability of the third abstract word; and
generating the abstract of the text data according to the fusion probability (the fusion is sketched after this claim).
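For illustration only: a pointer-generator-style fusion consistent with claim 2, in which three gate values weight the knowledge-graph copy distribution, the description-text copy distribution, and the vocabulary generation distribution. The gates, shapes, and scatter-based mixing are assumptions, not the patent's stated formula.

    import torch

    def fuse(p_vocab, kg_attn, kg_ids, txt_attn, txt_ids, gates):
        # gates = (g_gen, g_kg, g_txt); each (batch, 1), summing to 1 per example.
        g_gen, g_kg, g_txt = gates
        p_final = g_gen * p_vocab                                    # generation probability
        p_final = p_final.scatter_add(1, kg_ids, g_kg * kg_attn)     # copy from knowledge graph
        p_final = p_final.scatter_add(1, txt_ids, g_txt * txt_attn)  # copy from description text
        return p_final                                               # the fusion probability

    vocab = 10000
    p_vocab = torch.softmax(torch.randn(1, vocab), dim=1)
    kg_attn = torch.softmax(torch.randn(1, 6), dim=1)    # copy probabilities over 6 KG tokens
    txt_attn = torch.softmax(torch.randn(1, 12), dim=1)  # copy probabilities over 12 text tokens
    gates = tuple(g.unsqueeze(1) for g in torch.softmax(torch.randn(1, 3), dim=1).unbind(dim=1))
    p = fuse(p_vocab, kg_attn, torch.randint(0, vocab, (1, 6)),
             txt_attn, torch.randint(0, vocab, (1, 12)), gates)
    print(float(p.sum()))  # ~1.0: a valid distribution over the vocabulary

The abstract is then produced by picking words from this fused distribution at each decoding step, e.g. greedily or with beam search.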
3. The method of claim 2, wherein decoding the second encoder hidden layer sequence to generate the copy probability of the second abstract word of the unstructured description text and the generation probability of the third abstract word corresponding to the unstructured description text comprises:
processing the second encoder hidden layer sequence to generate a decoder hidden layer sequence and a context vector sequence;
generating the generation probability of the third abstract word corresponding to the unstructured description text based on the decoder hidden layer sequence and the context vector sequence;
generating an attention weight for the second abstract word of the unstructured description text based on the decoder hidden layer sequence and the context vector sequence; and
generating the copy probability of the second abstract word based on the attention weight of the second abstract word (a single decoding step is sketched below).
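For illustration only: one decoder step matching the structure of claim 3, using simple dot-product attention; the attention weights over the second encoder hidden layer sequence serve directly as the copy probabilities of the description-text tokens, and the context vector feeds the vocabulary ("third abstract word") distribution. Dimensions and the scoring function are hypothetical.

    import torch
    import torch.nn as nn

    hid = 256
    txt_hidden = torch.randn(1, 12, hid)  # second encoder hidden layer sequence
    dec_state = torch.randn(1, hid)       # one element of the decoder hidden layer sequence

    # Score each source position against the decoder state (dot-product attention).
    scores = torch.bmm(txt_hidden, dec_state.unsqueeze(2)).squeeze(2)      # (1, 12)
    attn_weights = torch.softmax(scores, dim=1)                            # copy probabilities
    context = torch.bmm(attn_weights.unsqueeze(1), txt_hidden).squeeze(1)  # context vector, (1, 256)

    # Generation probability of the "third abstract word" from state + context.
    proj = nn.Linear(2 * hid, 10000)
    p_vocab = torch.softmax(proj(torch.cat([dec_state, context], dim=1)), dim=1)
    print(attn_weights.shape, context.shape, p_vocab.shape)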
4. The method of claim 3, wherein the structured knowledge graph comprises attribute identifications and attribute values; and
the first encoder hidden layer sequence comprises an attribute identification hidden layer sequence and an attribute value hidden layer sequence.
5. The method of claim 4, wherein decoding the first encoder hidden layer sequence to generate the copy probability of the first abstract word of the structured knowledge graph comprises:
processing the attribute identification hidden layer sequence and the attribute value hidden layer sequence respectively, based on the decoder hidden layer sequence and the context vector sequence, to generate an attribute identification attention weight corresponding to the attribute identification semantic vector and an attribute value attention weight corresponding to the attribute value semantic vector; and
generating the copy probability of the first abstract word based on the attribute identification attention weight and the attribute value attention weight (sketched below).
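For illustration only: the knowledge-graph side of claims 4-5 keeps two aligned hidden sequences, one per attribute identification and one per attribute value; each receives its own attention weights against the decoder state, and the two weights are then combined (here simply averaged, an assumption) into the copy probability of the first abstract word.

    import torch

    hid, n_attr = 256, 6
    attr_id_hidden = torch.randn(1, n_attr, hid)   # attribute identification hidden layer sequence
    attr_val_hidden = torch.randn(1, n_attr, hid)  # attribute value hidden layer sequence
    dec_state = torch.randn(1, hid)

    def attend(seq, state):
        scores = torch.bmm(seq, state.unsqueeze(2)).squeeze(2)
        return torch.softmax(scores, dim=1)

    id_attn = attend(attr_id_hidden, dec_state)    # attribute identification attention weight
    val_attn = attend(attr_val_hidden, dec_state)  # attribute value attention weight
    copy_prob_kg = 0.5 * (id_attn + val_attn)      # one possible combination of the two weights
    print(copy_prob_kg)  # a distribution over the n_attr knowledge-graph entries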
6. The method of claim 1, wherein, before the structured knowledge graph and the unstructured description text are respectively encoded to generate the first encoder hidden layer sequence corresponding to the structured knowledge graph and the second encoder hidden layer sequence corresponding to the unstructured description text, the method further comprises:
performing word segmentation processing on the structured knowledge graph and the unstructured description text, so that the structured knowledge graph and the unstructured description text can each be encoded (one possible segmentation is sketched below).
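For illustration only: one plausible word-segmentation step for claim 6. The patent does not name a tokenizer; jieba is used here purely as an example for Chinese text, and flattening the knowledge graph into identification/value token runs is an assumption.

    import jieba  # third-party Chinese tokenizer, shown only as an example

    description_text = "这款手机拥有超长续航和轻薄机身"
    knowledge_graph = {"品牌": "某品牌", "材质": "金属"}

    # Segment the unstructured description text.
    text_tokens = jieba.lcut(description_text)

    # Segment each attribute identification and attribute value of the knowledge graph.
    kg_tokens = []
    for attr_id, attr_val in knowledge_graph.items():
        kg_tokens.extend(jieba.lcut(attr_id))
        kg_tokens.extend(jieba.lcut(attr_val))

    print(text_tokens)
    print(kg_tokens)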
7. An abstract generation apparatus, comprising:
an acquisition module configured to acquire text data describing a target object, wherein the text data comprises a structured knowledge graph and an unstructured description text;
an encoding module configured to encode the structured knowledge graph and the unstructured description text respectively, so as to generate a first encoder hidden layer sequence corresponding to the structured knowledge graph and a second encoder hidden layer sequence corresponding to the unstructured description text; and
a generation module configured to generate an abstract of the text data according to the first encoder hidden layer sequence and the second encoder hidden layer sequence (the module wiring is sketched after this claim).
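For illustration only: the three-module decomposition of claim 7, mirrored as plain composition; all class, attribute, and method names are hypothetical.

    class AbstractGenerationApparatus:
        def __init__(self, acquire, encode, generate):
            self.acquire = acquire    # acquisition module: fetch KG + description text
            self.encode = encode      # encoding module: produce the two hidden layer sequences
            self.generate = generate  # generation module: decode the sequences into an abstract

        def summarize(self, target_object):
            kg, text = self.acquire(target_object)
            first_seq, second_seq = self.encode(kg, text)
            return self.generate(first_seq, second_seq)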
8. A computer system, comprising:
one or more processors;
a memory for storing one or more programs,
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method of any one of claims 1 to 6.
9. A computer readable storage medium having stored thereon executable instructions which, when executed by a processor, cause the processor to carry out the method of any one of claims 1 to 6.
10. A computer program product comprising a computer program, the computer program comprising computer-executable instructions which, when executed, implement the method of any one of claims 1 to 6.
CN202110293478.8A 2021-03-18 2021-03-18 Digest generation method, digest generation device, computer system, and readable storage medium Active CN113010666B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110293478.8A CN113010666B (en) 2021-03-18 2021-03-18 Digest generation method, digest generation device, computer system, and readable storage medium

Publications (2)

Publication Number Publication Date
CN113010666A true CN113010666A (en) 2021-06-22
CN113010666B CN113010666B (en) 2023-12-08

Family

ID=76402760

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110293478.8A Active CN113010666B (en) 2021-03-18 2021-03-18 Digest generation method, digest generation device, computer system, and readable storage medium

Country Status (1)

Country Link
CN (1) CN113010666B (en)

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9619601B1 (en) * 2015-01-22 2017-04-11 Xilinx, Inc. Control and data flow graph generation for hardware description languages
US10402495B1 (en) * 2016-09-01 2019-09-03 Facebook, Inc. Abstractive sentence summarization
JP2018067199A (en) * 2016-10-20 2018-04-26 日本電信電話株式会社 Abstract generating device, text converting device, and methods and programs therefor
CN108509413A (en) * 2018-03-08 2018-09-07 平安科技(深圳)有限公司 Digest extraction method, device, computer equipment and storage medium
US20200250283A1 (en) * 2018-12-11 2020-08-06 block.one Digital identity social graph
JP2021033995A (en) * 2019-08-16 2021-03-01 株式会社Nttドコモ Text processing apparatus, method, device, and computer-readable storage medium
CN111460135A (en) * 2020-03-31 2020-07-28 北京百度网讯科技有限公司 Method and device for generating text abstract
CN111666418A (en) * 2020-04-23 2020-09-15 北京三快在线科技有限公司 Text regeneration method and device, electronic equipment and computer readable medium

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
WANG Renwu; YUAN Yi; YUAN Xuping: "Exploratory research on building a Chinese business knowledge graph based on deep learning and a graph database", Library and Information, no. 01 *
CHEN Xuewen: "A deep-learning abstract generation method based on subword units", Computer Applications and Software, no. 03 *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114218932A (en) * 2021-11-26 2022-03-22 中国航空综合技术研究所 Aviation fault text abstract generation method and device based on fault cause and effect map
CN114218932B (en) * 2021-11-26 2024-02-20 中国航空综合技术研究所 Aviation fault text abstract generation method and device based on fault causal map

Also Published As

Publication number Publication date
CN113010666B (en) 2023-12-08

Similar Documents

Publication Publication Date Title
CN109522483B (en) Method and device for pushing information
US11758088B2 (en) Method and apparatus for aligning paragraph and video
CN111314388B (en) Method and apparatus for detecting SQL injection
CN114840252A (en) Code annotation generation method, model training method, device and equipment
CN113507419B (en) Training method of traffic distribution model, traffic distribution method and device
CN112330382A (en) Item recommendation method and device, computing equipment and medium
CN113010666B (en) Digest generation method, digest generation device, computer system, and readable storage medium
CN116155628B (en) Network security detection method, training device, electronic equipment and medium
WO2024006007A1 (en) Privacy-sensitive neural network training
CN116560661A (en) Code optimization method, device, equipment and storage medium
CN115759292A (en) Model training method and device, semantic recognition method and device, and electronic device
US20230041339A1 (en) Method, device, and computer program product for user behavior prediction
CN114691850A (en) Method for generating question-answer pairs, training method and device of neural network model
CN113391988A (en) Method and device for losing user retention, electronic equipment and storage medium
US9251125B2 (en) Managing text in documents based on a log of research corresponding to the text
CN111767391B (en) Target text generation method, device, computer system and medium
CN113935334A (en) Text information processing method, device, equipment and medium
CN117493519A (en) Training method of text encoder, text generation method, device and storage medium
CN115292503A (en) Logistics information identification method, device, equipment and medium
CN114328891A (en) Training method of information recommendation model, information recommendation method and device
CN114386484A (en) Text matching method, training method, device, equipment and medium
CN116721426A (en) Text recognition method, device, equipment and storage medium
CN115828019A (en) Page generation method, device, equipment and storage medium
CN114707486A (en) Text processing method, device, equipment, medium and program product
CN115757473A (en) Information processing method and device, electronic equipment and computer readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information
Address after: Room 221, 2/F, Block C, 18 Kechuang 11th Street, Daxing District, Beijing, 100176
Applicant after: Jingdong Technology Holding Co.,Ltd.
Address before: Room 221, 2/F, Block C, 18 Kechuang 11th Street, Daxing District, Beijing, 100176
Applicant before: Jingdong Digital Technology Holding Co.,Ltd.
GR01 Patent grant