CN112270170B - Implicit expression statement analysis method and device, medium and electronic equipment - Google Patents

Implicit expression statement analysis method and device, medium and electronic equipment

Info

Publication number
CN112270170B
Authority
CN
China
Prior art keywords
expression
unit
vector
evaluation object
emotion
Prior art date
Legal status
Active
Application number
CN202011116326.2A
Other languages
Chinese (zh)
Other versions
CN112270170A (en)
Inventor
郑志军
程国艮
Current Assignee
Glabal Tone Communication Technology Co ltd
Original Assignee
Glabal Tone Communication Technology Co ltd
Filing date
Publication date
Application filed by Glabal Tone Communication Technology Co ltd
Priority to CN202011116326.2A
Publication of CN112270170A
Application granted
Publication of CN112270170B
Legal status: Active

Abstract

The disclosure provides an implicit expression statement analysis method and device, a medium, and electronic equipment. The method divides an implicit expression sentence into a plurality of expression units, uses the expression units to generate evaluation object feature information that represents the semantic features of the sentence, and imports the evaluation object feature information into a trained object analysis model to obtain the evaluation object type of the implicit expression sentence. The evaluation object in an implicit expression statement can thereby be determined effectively.

Description

Implicit expression statement analysis method and device, medium and electronic equipment
Technical Field
The disclosure relates to the technical field of computers, and in particular relates to an analysis method, an analysis device, a medium and electronic equipment of an implicit expression statement.
Background
Commodity evaluations contain users' specific comments on product quality and service, and collecting commodity evaluations is important for improving product quality and service.
Because of the complexity and diversity of language, different users often use different modes of expression for the same evaluation object. Statements can be classified into explicit expression sentences and implicit expression sentences. An explicit expression sentence names the evaluation object it expresses, for example: "This handset is inexpensive; 1000 yuan and it's yours." The sentence names "price" as the evaluation object of the expression, and it is easy to determine that the expression concerns only price and that its emotional bias is positive. An implicit expression sentence does not name the evaluation object it expresses, for example: "手机发烧了" ("the phone runs hot"). Although the sentence takes "heat dissipation" as its evaluation object, the mode of expression is indirect and abstract, and when facing such expressions, conventional analysis methods cannot effectively determine the evaluation object in the implicit expression statement or the expresser's emotion toward that evaluation object.
Disclosure of Invention
This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the detailed description. This summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter.
The disclosure aims to provide an analysis method, an analysis device, a medium and electronic equipment for implicit expression sentences, which can solve at least one technical problem. The specific scheme is as follows:
according to a specific embodiment of the present disclosure, in a first aspect, the present disclosure provides a method for analyzing an implicit expression sentence, including:
acquiring each expression unit in the implicit expression statement;
generating evaluation object feature information of the implicit expression statement based on the expression units;
and importing the evaluation object feature information into a trained object analysis model to obtain the evaluation object type of the implicit expression statement.
According to a second aspect of the present disclosure, there is provided an apparatus for analyzing an implicit expression sentence, comprising:
an acquisition unit configured to acquire each expression unit in the implicit expression sentence;
An evaluation object feature information generating unit configured to generate evaluation object feature information of the implicit expression sentence based on the expression unit;
And the object classification unit is used for importing the characteristic information of the evaluation object into the trained object analysis model to obtain the evaluation object type of the implicit expression statement.
According to a third aspect of the disclosure, there is provided a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the method of analyzing an implicit expression sentence according to any one of the first aspects.
According to a fourth aspect of the present disclosure, there is provided an electronic device comprising: one or more processors; storage means for storing one or more programs which, when executed by the one or more processors, cause the one or more processors to implement the method of analysis of an implicit expression statement as claimed in any of the first aspects.
Compared with the prior art, the scheme of the embodiment of the disclosure has at least the following beneficial effects:
The disclosure provides an implicit expression statement analysis method, an implicit expression statement analysis device, a medium and electronic equipment.
The method divides the implicit expression sentence into a plurality of expression units, generates evaluation object characteristic information capable of representing semantic characteristics of the implicit expression sentence by using the expression units, and imports the evaluation object characteristic information into a trained object analysis model to acquire an evaluation object type of the implicit expression sentence. Therefore, the evaluation object in the implicit expression statement can be effectively determined.
According to the method and the device, the evaluation object feature information and the expression units used to determine the evaluation object type of the implicit expression statement also serve as the basic data for determining its emotion type. This concentrates attention on the evaluation object of the implicit expression statement: the evaluation object feature information is added into the emotion feature information, which improves the accuracy of determining the emotion type in the implicit expression statement.
Meanwhile, when the losses of the object analysis model and the emotion analysis model are calculated, the outputs of both models are taken into account. Therefore, the jointly trained object analysis model and emotion analysis model can accurately determine the evaluation object type and the emotion type in the implicit expression statement.
Drawings
The above and other features, advantages, and aspects of embodiments of the present disclosure will become more apparent by reference to the following detailed description when taken in conjunction with the accompanying drawings. The same or similar reference numbers will be used throughout the drawings to refer to the same or like elements. It should be understood that the figures are schematic and that elements and components are not necessarily drawn to scale. In the drawings:
FIG. 1 illustrates a flow chart of a method of analysis of an implicit expression statement in accordance with an embodiment of the disclosure;
FIG. 2 shows a block diagram of the elements of an analysis device of an implicit expression statement in accordance with an embodiment of the disclosure;
Fig. 3 illustrates a schematic diagram of an electronic device connection structure according to an embodiment of the present disclosure.
Detailed Description
Embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While certain embodiments of the present disclosure are shown in the accompanying drawings, it is to be understood that the present disclosure may be embodied in various forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided for a more thorough and complete understanding of the present disclosure. It should be understood that the drawings and embodiments of the present disclosure are for illustration purposes only and are not intended to limit the scope of the present disclosure.
It should be understood that the various steps recited in the method embodiments of the present disclosure may be performed in a different order and/or performed in parallel. Furthermore, method embodiments may include additional steps and/or omit performing the illustrated steps. The scope of the present disclosure is not limited in this respect.
The term "including" and variations thereof as used herein are open-ended, i.e., "including, but not limited to". The term "based on" means "based at least in part on". The term "one embodiment" means "at least one embodiment"; the term "another embodiment" means "at least one additional embodiment"; the term "some embodiments" means "at least some embodiments". Related definitions of other terms will be given in the description below.
It should be noted that the terms "first," "second," and the like in this disclosure are merely used to distinguish between different devices, modules, or units and are not used to define an order or interdependence of functions performed by the devices, modules, or units.
It should be noted that references to "a", "an", and "a plurality of" in this disclosure are illustrative rather than limiting, and those of ordinary skill in the art will appreciate that they should be understood as "one or more" unless the context clearly indicates otherwise.
The names of messages or information interacted between the various devices in the embodiments of the present disclosure are for illustrative purposes only and are not intended to limit the scope of such messages or information.
Alternative embodiments of the present disclosure are described in detail below with reference to the drawings.
The first embodiment provided in the present disclosure is an embodiment of an analysis method of an implicit expression sentence.
Embodiments of the present disclosure are described in detail below in conjunction with fig. 1.
An implicit expression statement does not specify the evaluation object that the statement expresses, so its mode of expression is more indirect and abstract. In order to effectively determine the evaluation object in an implicit expression sentence and the expresser's emotion toward that evaluation object, an embodiment of the present disclosure includes the following steps:
step S101, each expression unit in the implicit expression sentence is acquired.
The embodiments of the present disclosure are directed only at implicit expression statements, but in everyday scenarios the expression statements we face may be either explicit expression statements or implicit expression statements. Generally, keyword matching is used to judge the type of an expression sentence. For example, if the focus of an expression sentence is "price", some keywords (or rules) related to that focus are set, such as "price", "expensive", "cost", and "asking price"; if the expression sentence contains none of these keywords, it is judged to be an implicit expression sentence.
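The keyword-matching pre-check above can be sketched as follows. The keyword list and function name are illustrative stand-ins — the patent only requires that some focus-related keywords or rules be set — and a sentence containing no focus keyword is treated as implicit:

```python
# Illustrative sketch of the keyword-matching pre-check described above.
# The keyword list is a hypothetical example for the "price" focus.
PRICE_KEYWORDS = ["price", "expensive", "cost", "yuan"]

def is_implicit(sentence: str, keywords=PRICE_KEYWORDS) -> bool:
    """A sentence containing none of the focus keywords is judged implicit."""
    lowered = sentence.lower()
    return not any(kw in lowered for kw in keywords)

print(is_implicit("This handset is inexpensive, 1000 yuan."))  # False (explicit)
print(is_implicit("The phone runs hot."))                      # True (implicit)
```

In practice a keyword set of this kind would be maintained per evaluation focus (price, heat dissipation, battery, and so on).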
The expression unit is a basic element constituting an expression sentence, and the expression sentence expresses a complete meaning through its ordered expression units. The expression unit may be the minimum semantic unit of the expression sentence: in Chinese, the character is the minimum semantic unit; in English, the word is. Emoticons and punctuation marks in an expression sentence are not minimum semantic units and are usually ignored when the sentence is preprocessed.
For example, if the implicit expression statement is "手机发烧了" ("the phone runs hot"), then because it is Chinese, the expression units of the statement are the characters "手", "机", "发", "烧", and "了". If the implicit expression statement is "iphone发烧了" ("the iphone runs hot"), then because it mixes Chinese and English, the expression units of the statement are "iphone", "发", "烧", and "了".
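Splitting a sentence into minimum semantic units — single characters for Chinese, whole words for English, punctuation dropped — can be sketched as below. The regular expression is an assumption for illustration, not the patent's implementation:

```python
import re

def expression_units(sentence: str) -> list[str]:
    """Latin-letter runs become one unit each; each CJK character is its own
    unit. Punctuation and other symbols are dropped during preprocessing."""
    return re.findall(r"[A-Za-z]+|[\u4e00-\u9fff]", sentence)

print(expression_units("手机发烧了"))    # ['手', '机', '发', '烧', '了']
print(expression_units("iphone发烧了"))  # ['iphone', '发', '烧', '了']
```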
Step S102, generating the characteristic information of the evaluation object of the implicit expression statement based on the expression unit.
In the embodiment of the disclosure, the evaluation object feature information is feature information of an evaluation object in an implicit expression sentence, and is key information for determining the type of the evaluation object in the implicit expression sentence.
Optionally, the generating the evaluation object feature information of the implicit expression sentence based on the expression unit includes the following steps:
Step S102-1, mapping the expression units into expression unit vectors arranged in a corresponding sequence based on a preset mapping sequence.
The direction of the expression unit vectors is consistent with the direction of the preset mapping order. That is, the expression units are acquired from the implicit expression sentence in the preset mapping order and mapped to expression unit vectors, and the arrangement order of the expression unit vectors is the same as the preset mapping order. For example, take the implicit expression statement S = [c_1, c_2, …, c_i, …, c_n], where c_i is a minimum semantic unit of S. If the preset mapping order is the expression order of the expression units, the expression unit vectors generated after mapping are M = [a_1, a_2, …, a_i, …, a_n], where a_i is the expression unit vector generated by mapping c_i; the arrangement order of the a_i is the same as that of the c_i.
In order to ensure that the evaluation object feature information fully covers the features of the implicit expression statement, optionally, the preset mapping order includes both the expression order and the expression reverse order of the expression units in the implicit expression statement. For example, for the implicit expression statement "手机发烧了" ("the phone runs hot"), which is Chinese, the expression units in expression order are "手", "机", "发", "烧", "了", and in expression reverse order are "了", "烧", "发", "机", "手".
Specifically, the method comprises the following steps:
step S102-1-1, mapping the expression units into first expression unit vectors arranged based on the expression order of the expression units.
The direction of the first expression unit vector is consistent with the expression sequence direction.
And step S102-1-2, mapping the expression units, based on the expression reverse order of the expression units, into second expression unit vectors arranged in that reverse order.
The direction of the second expression unit vector is consistent with the reverse order expression direction.
According to the method and the device, the first expression unit vectors are generated in expression order and the second expression unit vectors are generated in expression reverse order, so that together they fully cover the evaluation object feature information of the implicit expression statement, ensuring the reliability of the generated evaluation object feature information. However, a large amount of repeated information exists between the first and second expression unit vectors, so to reduce the amount of calculation and improve analysis efficiency, optionally:
Step S102-1-1a, mapping the expression units into a preset number of first expression unit vectors arranged based on the expression sequence of the expression units.
Step S102-1-2a, mapping the expression units into a preset number of second expression unit vectors arranged in the expression reverse order of the expression units.
The preset number is smaller than the number of expression units. For example, for an implicit expression sentence S = [c_1, c_2, c_3, c_4, c_5], if the preset mapping order is the expression order of the expression units, the expression unit vectors generated after mapping are M = [a_1, a_2, a_3, a_4]; if the preset mapping order is the expression reverse order, the expression unit vectors generated after mapping are M = [a_5, a_4, a_3, a_2]. Thus full coverage of the expression units is ensured while excessive repeated information is reduced, improving analysis efficiency. Optionally, the preset number is 80% of the number of expression units.
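Steps S102-1-1a and S102-1-2a can be sketched as below. The toy embedding table and the helper name are hypothetical, and the list lookup stands in for whatever embedding the model actually uses:

```python
# Toy embedding table (hypothetical): each expression unit maps to a vector.
EMBED = {"手": [1, 0], "机": [0, 1], "发": [1, 1], "烧": [2, 0], "了": [0, 2]}

def mapped_sequences(units, ratio=0.8):
    """Map units in expression order and in expression reverse order, each
    sequence truncated to the preset number (here 80% of the unit count)."""
    k = max(1, int(len(units) * ratio))                 # preset number < len(units)
    forward = [EMBED[u] for u in units][:k]             # a1 .. ak
    backward = [EMBED[u] for u in reversed(units)][:k]  # an .. a(n-k+1)
    return forward, backward

fwd, bwd = mapped_sequences(["手", "机", "发", "烧", "了"])
print(len(fwd), len(bwd))  # 4 4
```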
Step S102-2, obtaining the characteristic unit vector of the preset arrangement position from the expression unit vector.
Specifically, the method comprises the following steps:
Step S102-2-1, obtaining a first characteristic unit vector of the last arrangement position from the first expression unit vector.
That is, from the first expression unit vectors arranged in expression order, the one arranged at the last position is taken as the first feature unit vector. For example, the first expression unit vector arranged at the last position in M = [a_1, a_2, a_3, a_4] is a_4, so a_4 is taken as the first feature unit vector.
Step S102-2-2, obtaining a second characteristic unit vector of the final arrangement position from the second expression unit vector.
That is, from the second expression unit vectors arranged in expression reverse order, the one arranged at the last position is taken as the second feature unit vector. For example, the second expression unit vector arranged at the last position in M = [a_5, a_4, a_3, a_2] is a_2, so a_2 is taken as the second feature unit vector.
And step S102-3, generating the characteristic vector of the evaluation object characterized by the characteristic information of the evaluation object based on the characteristic unit vector.
Specifically, in one embodiment, the method comprises the following steps:
and step S102-3a, splicing the first characteristic unit vector and the second characteristic unit vector to generate the characteristic vector of the evaluation object.
That is, the second feature unit vector is appended after the last value of the first feature unit vector, producing a new vector (the evaluation object feature vector). For example, if the first feature unit vector is 123 and the second feature unit vector is 456, the evaluation object feature vector is 123456.
In another embodiment, the method comprises the steps of:
and step S102-3b, calculating the sum of the first characteristic unit vector and the second characteristic unit vector, and generating the characteristic vector of the evaluation object.
For example, if the first feature unit vector is 123 and the second feature unit vector is 456, the evaluation object feature vector is 123 + 456 = 579.
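The two variants of step S102-3 — splicing and summing — can be sketched as below, reading the text's examples 123 and 456 as the vectors (1, 2, 3) and (4, 5, 6); the function names are illustrative:

```python
def splice(v1, v2):
    """Splicing: concatenate the two feature unit vectors end to end."""
    return v1 + v2

def vector_sum(v1, v2):
    """Summing: add the two feature unit vectors element-wise."""
    return [a + b for a, b in zip(v1, v2)]

first, second = [1, 2, 3], [4, 5, 6]
print(splice(first, second))      # [1, 2, 3, 4, 5, 6]
print(vector_sum(first, second))  # [5, 7, 9]
```

Splicing preserves both vectors at the cost of doubling the dimension; summing keeps the dimension fixed but mixes the two directions' information.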
And step S103, importing the characteristic information of the evaluation object into a trained object analysis model to obtain the evaluation object type of the implicit expression statement.
The object analysis model is a model trained to obtain the type of the evaluation object of the implicit expression sentence, and is obtained based on previous historical network data, for example, the object analysis model is trained by using the historical network data as a training sample. The process of performing evaluation object type analysis on network data according to the object analysis model is not described in detail in this embodiment, and may be implemented with reference to various implementations in the prior art. For example, the object analysis model is a Bi-GRU model.
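The embodiment leaves the model internals to existing implementations. Purely as a stand-in for "import the feature information, obtain a type" — not the Bi-GRU the patent names, and with made-up weights and type labels — a toy scorer might look like:

```python
# Hypothetical stand-in for the trained object analysis model: score each
# candidate evaluation object type and return the best one. The weights and
# type labels below are invented for illustration only.
OBJECT_TYPES = ["price", "heat dissipation", "battery"]
WEIGHTS = [[0.1, 0.9], [0.8, 0.2], [0.3, 0.3]]

def predict_object_type(feature_vec):
    scores = [sum(w * x for w, x in zip(row, feature_vec)) for row in WEIGHTS]
    return OBJECT_TYPES[scores.index(max(scores))]

print(predict_object_type([1.0, 0.0]))  # heat dissipation
```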
The embodiment of the disclosure divides the implicit expression sentence into a plurality of expression units, generates evaluation object characteristic information capable of representing semantic characteristics of the implicit expression sentence by using the expression units, and imports the evaluation object characteristic information into a trained object analysis model to acquire the evaluation object type of the implicit expression sentence. Therefore, the evaluation object in the implicit expression statement can be effectively determined.
Further, in order to effectively determine the emotion type of the expressive person to the evaluation object in the implicit expression sentence, the method according to the embodiment of the disclosure further includes the following steps:
Step S104, generating emotion characteristic information of the implicit expression statement based on the evaluation object characteristic information and the expression unit.
The embodiment of the disclosure takes the evaluation object characteristic information and the expression unit as basic data for determining the emotion type of the implicit expression statement.
The emotion characteristic information is characteristic information related to emotion in an implicit expression statement, and is key information for determining emotion type in the implicit expression statement.
Optionally, the generating the emotion feature information of the implicit expression sentence based on the evaluation object feature information and the expression unit includes the following steps:
step S104-1, mapping each expression unit into a corresponding emotion unit vector based on a preset mapping sequence.
The preset mapping order includes the expression order or the expression reverse order.
The direction of the emotion unit vector is consistent with the direction of a preset mapping sequence.
The method of acquiring the emotion unit vectors may be the same as the method, described above, of acquiring the expression unit vectors based on the expression order of the expression units. For example, for the implicit expression statement S = [c_1, c_2, …, c_i, …, c_n], where c_i is a minimum semantic unit of S, the emotion unit vectors generated after mapping based on the expression order are N = [b_1, b_2, …, b_i, …, b_n], where b_i is the emotion unit vector generated by mapping c_i.
And step S104-2, generating initial vectors corresponding to the emotion unit vectors based on the evaluation object feature vectors and the emotion unit vectors respectively.
For example, if the emotion unit vectors are N = [11, 22, 33, 44, 55] and the evaluation object feature vector is 32, calculating the sum of the evaluation object feature vector and each emotion unit vector gives the initial vectors P = [43, 54, 65, 76, 87].
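Step S104-2's example can be reproduced directly, treating the evaluation object feature as a number added to each emotion unit vector, as the text's example does:

```python
def initial_vectors(emotion_units, eval_feature):
    """Add the evaluation object feature to each emotion unit vector."""
    return [u + eval_feature for u in emotion_units]

print(initial_vectors([11, 22, 33, 44, 55], 32))  # [43, 54, 65, 76, 87]
```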
And step S104-3, all the initial vectors are encoded, and the encoded vectors are generated.
And step S104-4, generating the emotion feature vector characterized by the emotion feature information based on the coding vector and the evaluation object feature vector.
For example, denoting the coding vector by H and the evaluation object feature vector by h, the emotion feature vector is generated by combining H and h.
The emotion feature information includes the emotion information in the implicit expression statement and is the key information for determining the emotion type of the statement. In order to concentrate attention on the evaluation object of the implicit expression statement, the embodiment of the disclosure adds the evaluation object feature vector into the emotion feature vector (that is, adds the evaluation object feature information to the emotion feature information), so as to improve the accuracy of determining the emotion type in the implicit expression statement.
Specifically, in one embodiment, the generating the emotion feature vector characterized by the emotion feature information based on the encoding vector and the evaluation object feature vector includes the following steps:
And step S104-4a, splicing the coding vector and the characteristic vector of the evaluation object to generate the emotion characteristic vector.
That is, the evaluation object feature vector is appended after the last value of the coding vector, producing a new vector (the emotion feature vector). For example, if the coding vector is 78 and the evaluation object feature vector is 90, the emotion feature vector is 7890.
In another embodiment, the generating the emotion feature vector characterized by the emotion feature information based on the encoding vector and the evaluation object feature vector includes the steps of:
And step S104-4b, calculating the sum of the coding vector and the characteristic vector of the evaluation object, and generating the emotion characteristic vector.
For example, if the coding vector is 78 and the evaluation object feature vector is 90, the emotion feature vector is 78 + 90 = 168.
Step S105, importing the emotion characteristic information into a trained emotion analysis model to obtain the emotion type of the implicit expression statement.
The emotion analysis model is a model trained for obtaining the emotion type of the implicit expression statement, and is obtained based on previous historical network data, for example, the emotion analysis model is trained by taking the historical network data as a training sample. The process of emotion type analysis on network data according to emotion analysis model is not described in detail in this embodiment, and may be implemented with reference to various implementations in the prior art. For example, the emotion analysis model is a BERT model.
In order to obtain the evaluation object type and the emotion type of an implicit expression statement at the same time, the outputs of both models are taken into account when the losses of the object analysis model and the emotion analysis model are calculated. Therefore, the jointly trained object analysis model and emotion analysis model can accurately determine the evaluation object type and the emotion type in the implicit expression statement.
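The joint-training idea above — one loss that considers both models' outputs — can be sketched as a weighted combination. The weighting scheme is an assumption; the patent does not give the combination formula:

```python
def joint_loss(object_loss: float, emotion_loss: float, alpha: float = 0.5) -> float:
    """Combine the object analysis loss and the emotion analysis loss so that
    joint training optimizes both models together. The weight alpha is a
    hypothetical hyperparameter, not specified by the patent."""
    return alpha * object_loss + (1 - alpha) * emotion_loss

print(joint_loss(0.8, 0.4))  # 0.6 (up to floating point)
```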
According to the embodiment of the disclosure, the evaluation object feature information and the expression units used to determine the evaluation object type of the implicit expression statement also serve as the basic data for determining its emotion type. This concentrates attention on the evaluation object of the implicit expression statement: the evaluation object feature information is added into the emotion feature information, which improves the accuracy of determining the emotion type in the implicit expression statement.
Corresponding to the first embodiment provided by the present disclosure, the present disclosure also provides a second embodiment, i.e., an analysis device of an implicit expression sentence. Since the second embodiment is substantially similar to the first embodiment, the description is relatively simple, and the relevant portions will be referred to the corresponding descriptions of the first embodiment. The device embodiments described below are merely illustrative.
Fig. 2 shows an embodiment of an analysis apparatus of an implicit expression sentence provided by the present disclosure.
As shown in fig. 2, the present disclosure provides an analysis apparatus of an implicit expression sentence, including:
an acquisition unit 201 for acquiring each expression unit in the implicit expression sentence;
an evaluation object feature information generating unit 202, configured to generate the evaluation object feature information of the implicit expression sentence based on the expression units;
and the object classification unit 203 is configured to import the evaluation object feature information into a trained object analysis model, and obtain an evaluation object type of the implicit expression statement.
Optionally, the evaluation object feature information generating unit 202 includes:
A mapping expression unit vector subunit, configured to map the expression unit into expression unit vectors arranged in a corresponding order based on a preset mapping order, where a direction of the expression unit vectors is consistent with a direction of the preset mapping order;
the characteristic unit vector obtaining subunit is used for obtaining a characteristic unit vector of a preset arrangement position from the expression unit vector;
and an evaluation object feature vector generating subunit, configured to generate, based on the feature unit vectors, the evaluation object feature vector characterized by the evaluation object feature information.
Optionally, the mapping expression unit vector subunit includes:
A mapping first expression unit vector subunit, configured to map, based on an expression order of the expression units, the expression units into first expression unit vectors that are arranged based on the expression order, where a direction of the first expression unit vectors is consistent with a direction of the expression order;
And a mapping second expression unit vector subunit, configured to map, based on the expression reverse order of the expression units, the expression units into second expression unit vectors arranged based on the expression reverse order, where a direction of the second expression unit vectors is consistent with a direction of the expression reverse order.
Optionally, the obtaining feature unit vector subunit includes:
a first characteristic unit vector subunit is obtained and is used for obtaining a first characteristic unit vector of a final arrangement position from the first expression unit vector;
and the second characteristic unit vector subunit is used for acquiring a second characteristic unit vector of the final arrangement position from the second expression unit vector.
Optionally, the evaluation object feature vector generating subunit includes:
a splicing subunit, configured to splice the first feature unit vector and the second feature unit vector to generate the evaluation object feature vector;
or a summing subunit, configured to calculate the sum of the first feature unit vector and the second feature unit vector to generate the evaluation object feature vector.
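The forward/reverse mapping and the splice-or-sum choice can be sketched as below. The running-sum "encoder" is a deliberately simple assumption standing in for a learned bidirectional encoder; the final states of the forward and backward passes are the feature unit vectors at the final arrangement positions, and they are either concatenated (spliced) or summed.

```python
def encode(unit_vectors):
    """Toy sequential encoder: running element-wise sum over the sequence."""
    state = [0.0] * len(unit_vectors[0])
    states = []
    for v in unit_vectors:
        state = [s + x for s, x in zip(state, v)]
        states.append(state)
    return states

def evaluation_object_feature(unit_vectors, combine="splice"):
    forward = encode(unit_vectors)               # expression order
    backward = encode(unit_vectors[::-1])        # reverse expression order
    h_fwd, h_bwd = forward[-1], backward[-1]     # final arrangement positions
    if combine == "splice":
        return h_fwd + h_bwd                     # concatenate the two vectors
    return [a + b for a, b in zip(h_fwd, h_bwd)] # element-wise sum
```

Splicing preserves both directional features at double the dimensionality; summing keeps the dimensionality fixed. Either output serves as the evaluation object feature vector.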
Optionally, the apparatus further includes:
an emotion feature information generation unit, configured to generate emotion feature information of the implicit expression sentence based on the evaluation object feature information and the expression units;
and an emotion classification unit, configured to import the emotion feature information into a trained emotion analysis model to obtain the emotion type of the implicit expression sentence.
Optionally, the emotion feature information generation unit includes:
an emotion unit vector mapping subunit, configured to map each expression unit into a corresponding emotion unit vector based on a preset mapping order, where the direction of the emotion unit vectors is consistent with the direction of the preset mapping order;
an initial vector generation subunit, configured to generate an initial vector corresponding to each emotion unit vector based on the evaluation object feature vector and that emotion unit vector;
an encoding subunit, configured to encode all the initial vectors to generate a coding vector;
and an emotion feature vector generation subunit, configured to generate an emotion feature vector characterizing the emotion feature information based on the coding vector and the evaluation object feature vector.
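A minimal sketch of these four steps, under stated assumptions: the initial vectors are formed by concatenating each emotion unit vector with the evaluation object feature vector, and the "encoding" is an element-wise mean. Both choices are illustrative stand-ins; the patent does not fix a particular combination or encoder.

```python
def emotion_feature(eval_obj_vec, emotion_unit_vecs):
    # Initial vectors: each emotion unit vector combined with the evaluation
    # object feature vector (here by concatenation; an assumption).
    initial = [v + eval_obj_vec for v in emotion_unit_vecs]
    # Toy "encoding" of all initial vectors: their element-wise mean.
    n = len(initial)
    coding = [sum(col) / n for col in zip(*initial)]
    # Emotion feature vector: coding vector spliced with the evaluation
    # object feature vector.
    return coding + eval_obj_vec
```

Feeding the evaluation object feature vector into both the initial vectors and the final splice is what concentrates the emotion features on the evaluation object.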
In the embodiments of the present disclosure, the implicit expression sentence is divided into a plurality of expression units, the expression units are used to generate evaluation object feature information capable of representing the semantic features of the sentence, and the evaluation object feature information is imported into a trained object analysis model to obtain the evaluation object type of the sentence. In this way, the evaluation object in the implicit expression sentence can be effectively determined.
According to the embodiments of the present disclosure, the evaluation object feature information and the expression units used to determine the evaluation object type of the implicit expression sentence also serve as basic data for determining the emotion type of that sentence. The focus is thereby concentrated on the evaluation object of the implicit expression sentence; in other words, the evaluation object feature information is incorporated into the emotion feature information, which improves the accuracy of determining the emotion type of the implicit expression sentence.
Meanwhile, when the losses of the object analysis model and the emotion analysis model are calculated, the outputs of both models are considered. Therefore, the jointly trained object analysis model and emotion analysis model can accurately determine the evaluation object type and the emotion type in the implicit expression sentence.
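One common way to realize such joint training, sketched here as an assumption (the patent does not specify the combination rule or the weight), is a single objective that mixes the two models' losses:

```python
def joint_loss(object_loss, emotion_loss, alpha=0.5):
    """Single training objective combining both models' losses.

    The weighting scheme and the value of alpha are illustrative
    assumptions, not values taken from the patent text.
    """
    return alpha * object_loss + (1.0 - alpha) * emotion_loss
```

Backpropagating through this combined value updates both models together, so each model's parameters are influenced by the other's output.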
A third embodiment of the present disclosure provides an electronic device for the method of analyzing an implicit expression sentence. The electronic device includes: at least one processor; and a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of analyzing an implicit expression sentence described in the first embodiment.
A fourth embodiment of the present disclosure provides a computer storage medium storing computer-executable instructions that, when executed, perform the method of analyzing an implicit expression sentence described in the first embodiment.
Referring now to fig. 3, a schematic diagram of an electronic device suitable for use in implementing embodiments of the present disclosure is shown. The terminal devices in the embodiments of the present disclosure may include, but are not limited to, mobile terminals such as mobile phones, notebook computers, digital broadcast receivers, PDAs (personal digital assistants), PADs (tablet computers), PMPs (portable multimedia players), in-vehicle terminals (e.g., in-vehicle navigation terminals), and the like, and stationary terminals such as digital TVs, desktop computers, and the like. The electronic device shown in fig. 3 is merely an example and should not be construed to limit the functionality and scope of use of the disclosed embodiments.
As shown in fig. 3, the electronic device may include a processing means (e.g., a central processor, a graphics processor, etc.) 301 that may perform various appropriate actions and processes according to a program stored in a Read Only Memory (ROM) 302 or a program loaded from a storage means 308 into a Random Access Memory (RAM) 303. In the RAM 303, various programs and data required for the operation of the electronic device are also stored. The processing device 301, the ROM 302, and the RAM 303 are connected to each other via a bus 304. An input/output (I/O) interface 305 is also connected to bus 304.
In general, the following devices may be connected to the I/O interface 305: input devices 306 including, for example, a touch screen, touchpad, keyboard, mouse, camera, microphone, accelerometer, gyroscope, etc.; an output device 307 including, for example, a Liquid Crystal Display (LCD), a speaker, a vibrator, and the like; storage 308 including, for example, magnetic tape, hard disk, etc.; and communication means 309. The communication means 309 may allow the electronic device to communicate with other devices wirelessly or by wire to exchange data. While fig. 3 shows an electronic device having various means, it is to be understood that not all of the illustrated means are required to be implemented or provided. More or fewer devices may be implemented or provided instead.
In particular, according to embodiments of the present disclosure, the processes described above with reference to flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a non-transitory computer readable medium, the computer program comprising program code for performing the method shown in the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network via a communication device 309, or installed from a storage device 308, or installed from a ROM 302. The above-described functions defined in the methods of the embodiments of the present disclosure are performed when the computer program is executed by the processing means 301.
It should be noted that the computer readable medium described in the present disclosure may be a computer readable signal medium or a computer readable storage medium, or any combination of the two. The computer readable storage medium can be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples of the computer-readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this disclosure, a computer-readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In the present disclosure, however, the computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, with the computer-readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. 
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, fiber optic cables, RF (radio frequency), and the like, or any suitable combination of the foregoing.
In some embodiments, the clients and servers may communicate using any currently known or future developed network protocol, such as HTTP (HyperText Transfer Protocol), and may be interconnected with any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include a local area network ("LAN"), a wide area network ("WAN"), an internetwork (e.g., the Internet), and peer-to-peer networks (e.g., ad hoc peer-to-peer networks), as well as any currently known or future developed networks.
The computer readable medium may be contained in the electronic device; or may exist alone without being incorporated into the electronic device.
Computer program code for carrying out operations of the present disclosure may be written in one or more programming languages, including, but not limited to, object-oriented programming languages such as Java, Smalltalk, and C++, and conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computer (for example, through the Internet using an Internet service provider).
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units involved in the embodiments of the present disclosure may be implemented by means of software, or may be implemented by means of hardware. Wherein the names of the units do not constitute a limitation of the units themselves in some cases.
The functions described above herein may be performed, at least in part, by one or more hardware logic components. For example, without limitation, exemplary types of hardware logic components that may be used include: a Field Programmable Gate Array (FPGA), an Application Specific Integrated Circuit (ASIC), an Application Specific Standard Product (ASSP), a system on a chip (SOC), a Complex Programmable Logic Device (CPLD), and the like.
In the context of this disclosure, a machine-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. The machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
The foregoing description is only of the preferred embodiments of the present disclosure and an explanation of the principles of the technology employed. It will be appreciated by persons skilled in the art that the scope of the disclosure is not limited to the specific combinations of features described above, but also covers other embodiments formed by any combination of the above features or their equivalents without departing from the spirit of the disclosure, for example, embodiments formed by substituting the above features with technical features having similar functions disclosed in the present disclosure (but not limited thereto).
Moreover, although operations are depicted in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order. In certain circumstances, multitasking and parallel processing may be advantageous. Likewise, while several specific implementation details are included in the above discussion, these should not be construed as limiting the scope of the present disclosure. Certain features that are described in the context of separate embodiments can also be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment can also be implemented in multiple embodiments separately or in any suitable subcombination.
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are example forms of implementing the claims.

Claims (7)

1. A method of analyzing an implicit expression sentence, comprising:
Acquiring each expression unit in the implicit expression statement;
generating evaluation object feature information of the implicit expression statement based on the expression unit;
importing the characteristic information of the evaluation object into a trained object analysis model to obtain the evaluation object type of the implicit expression statement;
the generating the evaluation object feature information of the implicit expression statement based on the expression unit includes:
Mapping the expression units into expression unit vectors which are arranged in a corresponding sequence based on a preset mapping sequence, wherein the direction of the expression unit vectors is consistent with the direction of the preset mapping sequence;
acquiring a characteristic unit vector of a preset arrangement position from the expression unit vector;
generating an evaluation object feature vector characterizing the evaluation object feature information based on the feature unit vector;
The method further comprises the steps of: generating emotion feature information of the implicit expression statement based on the evaluation object feature information and the expression unit; importing the emotion characteristic information into a trained emotion analysis model to obtain the emotion type of the implicit expression statement;
The generating emotion feature information of the implicit expression sentence based on the evaluation object feature information and the expression unit includes:
mapping each expression unit into a corresponding emotion unit vector based on a preset mapping sequence, wherein the direction of the emotion unit vector is consistent with the direction of the preset mapping sequence;
generating initial vectors corresponding to the emotion unit vectors based on the evaluation object feature vectors and the emotion unit vectors respectively;
Encoding all the initial vectors to generate encoded vectors;
And generating an emotion feature vector characterizing the emotion feature information based on the coding vector and the evaluation object feature vector.
2. The method of claim 1, wherein mapping the expression units into the expression unit vectors arranged in the corresponding order based on the preset mapping order comprises:
Mapping the expression units into first expression unit vectors arranged based on the expression sequence, wherein the direction of the first expression unit vectors is consistent with the expression sequence direction;
And mapping the expression units into second expression unit vectors which are arranged based on the expression reverse order, wherein the direction of the second expression unit vectors is consistent with the expression reverse order direction.
3. The method according to claim 2, wherein the obtaining the feature unit vector of the preset arrangement position from the expression unit vectors comprises:
Acquiring a first feature unit vector of the final arrangement position from the first expression unit vectors;
And acquiring a second feature unit vector of the final arrangement position from the second expression unit vectors.
4. The method of claim 3, wherein the generating an evaluation object feature vector characterizing the evaluation object feature information based on the feature unit vector comprises:
Splicing the first feature unit vector and the second feature unit vector to generate the evaluation object feature vector;
or,
Calculating the sum of the first feature unit vector and the second feature unit vector to generate the evaluation object feature vector.
5. An apparatus for analyzing an implicit expression sentence, comprising:
an acquisition unit configured to acquire each expression unit in the implicit expression sentence;
An evaluation object feature information generating unit, configured to generate the evaluation object feature information of the implicit expression sentence based on the expression units; and further configured to generate emotion feature information of the implicit expression sentence based on the evaluation object feature information and the expression units, and to import the emotion feature information into a trained emotion analysis model to obtain the emotion type of the implicit expression sentence;
wherein the generating emotion feature information of the implicit expression sentence based on the evaluation object feature information and the expression units includes: mapping each expression unit into a corresponding emotion unit vector based on a preset mapping order, where the direction of the emotion unit vectors is consistent with the direction of the preset mapping order; generating an initial vector corresponding to each emotion unit vector based on the evaluation object feature vector and that emotion unit vector; encoding all the initial vectors to generate a coding vector; and generating an emotion feature vector characterizing the emotion feature information based on the coding vector and the evaluation object feature vector;
And the object classification unit is used for importing the characteristic information of the evaluation object into the trained object analysis model to obtain the evaluation object type of the implicit expression statement.
6. A computer readable storage medium, on which a computer program is stored, characterized in that the program, when being executed by a processor, implements the method according to any one of claims 1 to 4.
7. An electronic device, comprising:
one or more processors;
Storage means for storing one or more programs which when executed by the one or more processors cause the one or more processors to implement the method of any of claims 1 to 4.
CN202011116326.2A 2020-10-19 Implicit expression statement analysis method and device, medium and electronic equipment Active CN112270170B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011116326.2A CN112270170B (en) 2020-10-19 Implicit expression statement analysis method and device, medium and electronic equipment


Publications (2)

Publication Number Publication Date
CN112270170A CN112270170A (en) 2021-01-26
CN112270170B true CN112270170B (en) 2024-07-02


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Research on methods for extracting implicit evaluation objects of products; Qiu Yunfei; Ni Xuefeng; Shao Liangshan; Computer Engineering and Applications (Issue 19); 114-118 *


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant