CN114372458B - Emergency detection method based on government work order - Google Patents

Emergency detection method based on government work order

Info

Publication number
CN114372458B
Authority
CN
China
Prior art keywords
emergency
model
training
samples
sample
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202210063986.1A
Other languages
Chinese (zh)
Other versions
CN114372458A (en)
Inventor
郑文博
汤灏
包利安
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Zero Data Technology Co ltd
Beijing Zero Vision Network Technology Co ltd
Original Assignee
Beijing Zero Data Technology Co ltd
Beijing Zero Vision Network Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Zero Data Technology Co ltd, Beijing Zero Vision Network Technology Co ltd filed Critical Beijing Zero Data Technology Co ltd
Priority to CN202210063986.1A priority Critical patent/CN114372458B/en
Publication of CN114372458A publication Critical patent/CN114372458A/en
Application granted granted Critical
Publication of CN114372458B publication Critical patent/CN114372458B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 40/00 Handling natural language data
    • G06F 40/20 Natural language analysis
    • G06F 40/205 Parsing
    • G06F 40/216 Parsing using statistical methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 40/00 Handling natural language data
    • G06F 40/20 Natural language analysis
    • G06F 40/279 Recognition of textual entities
    • G06F 40/289 Phrasal analysis, e.g. finite state techniques or chunking
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 5/00 Computing arrangements using knowledge-based models
    • G06N 5/04 Inference or reasoning models
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 50/00 Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q 50/10 Services
    • G06Q 50/26 Government or public services

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Business, Economics & Management (AREA)
  • Health & Medical Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Computational Linguistics (AREA)
  • General Health & Medical Sciences (AREA)
  • General Engineering & Computer Science (AREA)
  • Tourism & Hospitality (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • Evolutionary Computation (AREA)
  • Probability & Statistics with Applications (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Data Mining & Analysis (AREA)
  • Development Economics (AREA)
  • Educational Administration (AREA)
  • Computing Systems (AREA)
  • Economics (AREA)
  • Human Resources & Organizations (AREA)
  • Marketing (AREA)
  • Primary Health Care (AREA)
  • Strategic Management (AREA)
  • General Business, Economics & Management (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

The application provides an emergency detection method based on a government work order, and relates to the technical field of artificial intelligence algorithms. The method comprises the following steps: acquiring training samples, wherein the training samples comprise positive samples and negative samples, the positive samples being text data with an emergency and the negative samples being text data without an emergency; and training an emergency model with the positive samples and negative samples as input and the probability values of all emergencies as output, to obtain the trained emergency model. In this way, the accuracy of identifying emergencies in government work orders can be improved.

Description

Emergency detection method based on government work order
Technical Field
The application relates to the technical field of artificial intelligence algorithms, and in particular to an emergency detection method based on a government work order.
Background
As data from different government services converge, the number of event types keeps growing, and the definitions and scopes of events overlap, border on one another, or are mutually exclusive. A technology is therefore needed that can accurately extract emergencies from government work orders and help business personnel immediately coordinate the corresponding units to handle them.
At present, most existing extraction methods manually define specific keywords and perform simple text matching. Because the rules and keywords are defined by hand, such methods lack generalization ability, transfer poorly to new data, and cannot analyze the semantic context of a text, so emergencies in government work orders cannot be identified accurately and efficiently.
Disclosure of Invention
In order to improve the accuracy of identifying emergencies in government work orders, the application provides an emergency detection method based on a government work order.
In a first aspect of the present application, a method for training an emergency model is provided, including:
acquiring training samples, wherein the training samples comprise positive samples and negative samples, the positive samples being text data with an emergency and the negative samples being text data without an emergency;
and training an emergency model with the positive samples and negative samples as input and the probability values of all emergencies as output, to obtain the trained emergency model.
Optionally, the method further includes: and optimizing the trained emergency model through a loss function to obtain the optimized emergency model.
Optionally, the method further includes: and acquiring a verification set, and verifying the optimized emergency model by using the verification set.
In a second aspect of the present application, there is provided an emergency model training apparatus, including:
an acquisition module, configured to acquire training samples, wherein the training samples comprise positive samples and negative samples, the positive samples being text data with an emergency and the negative samples being text data without an emergency;
and the training module is used for training an emergency model by taking the positive sample and the negative sample as input and taking the probability values of all the emergency as output to obtain the trained emergency model.
Optionally, the apparatus further comprises: and the optimization module is used for optimizing the trained emergency model through a loss function to obtain the optimized emergency model.
Optionally, the apparatus further comprises: and the verification module is used for acquiring a verification set and verifying the optimized emergency model by using the verification set.
In a third aspect of the present application, there is provided an emergency detection method based on a government work order, including:
acquiring a government work order;
and inputting the government work order into a trained emergency model to obtain the emergency in the government work order.
Optionally, the obtaining the emergency in the government work order includes:
calculating the probability values of all emergencies with the emergency model, and taking the emergency with the largest probability value as the final result.
In a fourth aspect of the present application, there is provided an electronic device comprising a memory having stored thereon a computer program and a processor that, when executing the program, performs the method of any one of the first aspect or the third aspect.
In a fifth aspect of the present application, a computer-readable storage medium is provided, on which a computer program is stored which, when executed by a processor, implements the method of the first aspect or of the third aspect.
By adopting this technical scheme, the acquired training samples are input into the emergency model for training, and the model is then optimized through a loss function to obtain the optimal emergency model, which improves the accuracy of the model's output. Government work order data are then fed into the trained emergency model, and the emergencies in the work order are obtained through model inference, thereby improving the accuracy of identifying emergencies in government work orders.
It should be understood that what is described in this summary section is not intended to limit key or critical features of the embodiments of the application, nor is it intended to limit the scope of the application. Other features of the present application will become apparent from the following description.
Drawings
The above and other features, advantages and aspects of various embodiments of the present application will become more apparent by referring to the following detailed description when taken in conjunction with the accompanying drawings. In the drawings, like or similar reference characters denote like or similar elements, and wherein:
FIG. 1 is a flowchart of a training method of an emergency model in an embodiment of the present application;
FIG. 2 is a block diagram of an emergency model training apparatus according to an embodiment of the present application;
FIG. 3 is a flowchart of an emergency detection method based on a government work order in an embodiment of the present application;
fig. 4 is a block diagram of an electronic device in an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions of the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application.
Fig. 1 shows a flowchart of a training method of an emergency model in an embodiment of the present application. Referring to fig. 1, the method comprises the steps of:
step S110: training samples are obtained.
Wherein the training samples comprise positive samples and negative samples. The positive samples are text data containing an emergency; the negative samples are text data containing no emergency. It should be noted that an emergency refers to an event that occurs suddenly and endangers people's safety; blasting, fires, and explosions, for example, all belong to emergencies. Furthermore, in the positive samples of the training set, the emergencies have already been labeled.
After the training samples are obtained, they are preprocessed and converted into feature vectors that can be input into the model. The preprocessing of the training samples is described in detail below.
First, word segmentation is performed on the labeled supervised-learning text data, and a [CLS] token is then prepended to each piece of text data. Specifically, word segmentation uses the Bert preprocessing model, which mainly comprises two tokenizers: the BasicTokenizer and the WordPieceTokenizer. The BasicTokenizer first performs a coarse split of the text data to obtain a token list, and the WordPieceTokenizer is then applied to each token to obtain the final word segmentation result.
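As a hedged sketch of the two-stage tokenization described above, the following toy implementation applies a greedy longest-match-first WordPiece split to tokens produced by a coarse whitespace split; the vocabulary, function name, and example text are illustrative, not taken from the patent:

```python
def wordpiece(token, vocab):
    """Greedy longest-match-first WordPiece split of one coarse token."""
    pieces, start = [], 0
    while start < len(token):
        end, piece = len(token), None
        while start < end:
            sub = token[start:end]
            if start > 0:
                sub = "##" + sub  # continuation pieces carry the ## prefix
            if sub in vocab:
                piece = sub
                break
            end -= 1
        if piece is None:
            return ["[UNK]"]  # no matching piece: fall back to the unknown token
        pieces.append(piece)
        start = end
    return pieces

# Coarse split (BasicTokenizer stage), WordPiece refinement, then a [CLS] prefix:
vocab = {"un", "##aff", "##able", "road", "fire"}
tokens = ["[CLS]"] + [p for t in "road fire".split() for p in wordpiece(t, vocab)]
print(tokens)  # ['[CLS]', 'road', 'fire']
```

In the real pipeline the BasicTokenizer also lowercases, strips accents, and splits punctuation, which this sketch omits.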
After word segmentation, the text data are given an embedding vectorized representation. That is, based on the Bert pre-training model, each segmented word is represented by a token feature vector, each sentence is given a segment embedding representation, and the relative position of each word is represented by a position-coding vector; the three feature vectors are then added. It should be noted that the Bert pre-training model used in the embodiment of the present application has L = 12 (layers), H = 768 (hidden size), and A = 12 (attention heads).
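The summation of the three representations can be sketched with random placeholder tables (the shapes follow the H = 768 configuration above; the arrays themselves are illustrative stand-ins, not trained weights):

```python
import numpy as np

seq_len, hidden = 6, 768               # 6 WordPiece tokens, BERT-base hidden size
rng = np.random.default_rng(0)

token_embeddings = rng.normal(size=(seq_len, hidden))     # one vector per token
segment_embeddings = np.zeros((seq_len, hidden))          # single sentence: segment 0
position_table = rng.normal(size=(512, hidden))           # learned position table
position_embeddings = position_table[np.arange(seq_len)]  # position vector per token

# The model input is the element-wise sum of the three embeddings:
input_representation = token_embeddings + segment_embeddings + position_embeddings
print(input_representation.shape)  # (6, 768)
```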
In some embodiments, the label data are encoded, and the sample label type data are automatically acquired and constructed according to the sample range of the training data.
Step S120: and training the emergency model by taking the positive sample and the negative sample as input and the probability values of all the emergency as output to obtain the trained emergency model.
In some embodiments, the method for training the emergency model further includes the following steps:
step S130: and optimizing the trained emergency model through a loss function to obtain an optimized emergency model.
Specifically, the trained emergency model is obtained by constructing a multi-label scene loss function on top of the pre-trained Bert model and performing fine-tuning. Bert training mainly adopts the encoder module of a bidirectional Transformer for vector feature extraction and is based on the self-attention mechanism, so it can automatically mine the semantic relations between the current word in a text and the other words in its context, and obtain a semantic vector representation of each word regardless of distance. Because the resulting word vectors fully account for semantic associations, the accuracy of identifying emergencies in government work orders is improved.
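The self-attention computation referred to above is conventionally written as follows (standard Transformer notation; Q, K, V, and d_k are not symbols from the patent):

```latex
\mathrm{Attention}(Q, K, V) = \mathrm{softmax}\!\left(\frac{QK^{\top}}{\sqrt{d_k}}\right)V
```

where Q, K, and V are the query, key, and value projections of the token vectors and d_k is the key dimension; the softmax weights let every token attend to every other token regardless of distance.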
The Transformer used in the embodiment of the application is set to 12 layers, and when the model is constructed, the [CLS] token vector of the last layer is taken as the input vector of the next layer.
In some embodiments, the following loss function is employed as the optimization objective:
loss = log(1 + Σ_{j∈N} e^{S_j}) + log(1 + Σ_{i∈P} e^{-S_i})

where N is the negative sample set, P is the positive sample set, S_i is the score of positive sample i, and S_j is the score of negative sample j.
It is worth mentioning that this loss function can model multi-labeled sample data.
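Assuming the loss takes the common multi-label form that pushes positive-label scores above zero and negative-label scores below zero (consistent with the zero threshold used during training; the function name is illustrative), a minimal sketch is:

```python
import numpy as np

def multilabel_loss(scores, labels):
    """Multi-label loss over per-label scores.

    scores: per-label scores S produced by the model
    labels: 0/1 array, 1 marking the true (positive) labels
    The loss is near zero when every positive score is well above 0
    and every negative score is well below 0.
    """
    scores = np.asarray(scores, dtype=float)
    labels = np.asarray(labels)
    pos = scores[labels == 1]  # S_i for i in the positive set P
    neg = scores[labels == 0]  # S_j for j in the negative set N
    return np.log1p(np.exp(-pos).sum()) + np.log1p(np.exp(neg).sum())

# Well-separated scores give a near-zero loss; swapped scores are heavily penalized.
print(multilabel_loss([6.0, -6.0, -6.0], [1, 0, 0]))  # small, close to 0
```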
In some embodiments, in training the incident model, the following settings are made:
epoch = 40;
batch size batch_size = 16;
maximum text truncation length maxlen = 410;
learning rate lr = 1e-5;
the multi-label threshold is 0.
It should be noted that an epoch refers to one complete pass of all the data through the network, i.e., one round of forward computation and back-propagation over the full training set. The batch size refers to the number of text samples taken in one training step.
In some embodiments, the method for training the emergency model further includes the following steps:
step S140: and acquiring a verification set, and verifying the optimized emergency model by using the verification set.
The validation set likewise includes text data with emergencies and text data without emergencies; that is, the text data in the validation set are labeled. The validation set is used to check whether the parameters of the trained emergency model are optimal; if not, optimization continues. Specifically, whenever a validation pass yields parameters better than those of the current emergency model, the current parameters are replaced with the better ones, and the cycle repeats until the optimal model parameters are found.
In some embodiments, during the continuous loop verification, when the parameters of the emergency model are not updated for a preset number of times, the training is terminated early.
It should be noted that the preset number is set manually. For example, with a preset number of 10, when the parameters of the emergency model have gone 10 consecutive epochs without being updated, subsequent training is terminated early, which saves GPU resources.
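The early-termination rule can be sketched as a small patience counter (class and method names are illustrative):

```python
class EarlyStopper:
    """Stop training once the metric has gone `patience` epochs without improving."""

    def __init__(self, patience=10):
        self.patience = patience
        self.best = float("-inf")
        self.stale_epochs = 0

    def should_stop(self, metric):
        if metric > self.best:   # better parameters found: keep them, reset counter
            self.best = metric
            self.stale_epochs = 0
        else:                    # no parameter update this epoch
            self.stale_epochs += 1
        return self.stale_epochs >= self.patience

stopper = EarlyStopper(patience=3)
history = [0.70, 0.75, 0.74, 0.74, 0.73]  # validation F1 per epoch (illustrative)
stops = [stopper.should_stop(m) for m in history]
print(stops)  # [False, False, False, False, True]
```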
In some embodiments, the optimized emergency model is evaluated using model evaluation metrics; the metrics adopted in this embodiment are accuracy and the F1 index. The F1 index is obtained by averaging the F1 values over the validation set, where an F1 value is the harmonic mean of precision and recall.
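The F1 value described above can be computed directly from precision and recall (the helper name and the numbers are illustrative):

```python
def f1_score(precision, recall):
    """Harmonic mean of precision and recall; defined as 0 when both are 0."""
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# The F1 index averages the F1 values over the validation set:
per_item = [(1.0, 1.0), (0.5, 0.5), (0.8, 0.4)]   # (precision, recall) pairs
macro_f1 = sum(f1_score(p, r) for p, r in per_item) / len(per_item)
print(round(macro_f1, 4))  # 0.6778
```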
Fig. 2 shows a block diagram of an emergency model training apparatus in an embodiment of the present application. Referring to fig. 2, the apparatus includes:
the obtaining module 210 is configured to obtain training samples, where the training samples include positive samples and negative samples, where the positive samples are text data with an emergency and the negative samples are text data with a non-emergency;
the training module 220 is configured to train an emergency model by using the positive samples and the negative samples as inputs and using the probability values of all the emergency as outputs, so as to obtain the trained emergency model.
In some embodiments, the apparatus further includes an optimization module 230, configured to optimize the trained emergency model through a loss function, so as to obtain an optimized emergency model.
In some embodiments, the apparatus further comprises a verification module 240 configured to obtain a verification set, and verify the optimized emergency model by using the verification set.
It can be clearly understood by those skilled in the art that, for convenience and brevity of description, the specific working process of the described module may refer to the corresponding process in the foregoing method embodiment, and is not described herein again.
Fig. 3 shows a flowchart of an emergency detection method based on a government work order in the embodiment of the present application. Referring to fig. 3, the emergency detection method includes the following steps:
step S310: and acquiring a government work order.
Step S320: and inputting the government work order into the trained emergency model to obtain the emergency in the government work order.
In some embodiments, step S320 includes: and calculating the probability values of all the emergency events by using the emergency event model, and taking the emergency event with the maximum probability value as the emergency event to be finally obtained.
It should be noted that a plurality of government work orders are obtained as the test set, and the emergencies in these work orders are not labeled.
Specifically, inference is performed with the optimal trained emergency model: for the text data in the input government work order, the probability values of all event labels are computed, and the emergency label with the largest probability value is selected; if that label is not empty, the event is an emergency.
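The selection rule in this inference step can be sketched as follows (label names and scores are illustrative; the zero threshold matches the multi-label threshold set during training):

```python
def detect_emergency(label_scores, threshold=0.0):
    """Return the highest-scoring emergency label, or None when nothing clears the threshold.

    label_scores: dict mapping an emergency label to the model's score for it
    """
    if not label_scores:
        return None
    label, score = max(label_scores.items(), key=lambda kv: kv[1])
    return label if score > threshold else None  # None: no emergency in the work order

print(detect_emergency({"fire": 2.3, "explosion": -0.7}))   # fire
print(detect_emergency({"fire": -1.2, "explosion": -0.7}))  # None
```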
In an embodiment of the present application, an electronic device is provided, and as shown in fig. 4, an electronic device 400 shown in fig. 4 includes: a processor 401 and a memory 403. Wherein the processor 401 is coupled to the memory 403, such as via a bus 402. Optionally, the electronic device 400 may also include a transceiver 404. It should be noted that the transceiver 404 is not limited to one in practical applications, and the structure of the electronic device 400 is not limited to the embodiment of the present application.
The Processor 401 may be a CPU (Central Processing Unit), a general-purpose processor, a DSP (Digital Signal Processor), an ASIC (Application Specific Integrated Circuit), an FPGA (Field Programmable Gate Array) or other programmable logic device, a transistor logic device, a hardware component, or any combination thereof. It may implement or execute the various illustrative logical blocks, modules, and circuits described in connection with this disclosure. The processor 401 may also be a combination of computing units, for example one or more microprocessors, or a combination of a DSP and a microprocessor.
Bus 402 may include a path that transfers information between the above components. The bus 402 may be a PCI (Peripheral Component Interconnect) bus, an EISA (Extended Industry Standard Architecture) bus, or the like. Bus 402 may be divided into an address bus, a data bus, a control bus, and the like. For ease of illustration, only one thick line is shown in fig. 4, but this does not indicate only one bus or one type of bus.
The Memory 403 may be a ROM (Read Only Memory) or other type of static storage device that can store static information and instructions, a RAM (Random Access Memory) or other type of dynamic storage device that can store information and instructions, an EEPROM (Electrically Erasable Programmable Read Only Memory), a CD-ROM (Compact Disc Read Only Memory) or other optical Disc storage, optical Disc storage (including Compact Disc, laser Disc, optical Disc, digital versatile Disc, blu-ray Disc, etc.), a magnetic Disc storage medium or other magnetic storage device, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer, but is not limited to these.
The memory 403 is used for storing application program codes for executing the scheme of the application, and the execution is controlled by the processor 401. Processor 401 is configured to execute application program code stored in memory 403 to implement the aspects illustrated in the foregoing method embodiments.
Among them, electronic devices include but are not limited to: mobile terminals such as mobile phones, notebook computers, digital broadcast receivers, PDAs (personal digital assistants), PADs (tablet computers), PMPs (portable multimedia players), in-vehicle terminals (e.g., car navigation terminals), and the like, and fixed terminals such as digital TVs, desktop computers, and the like. The electronic device shown in fig. 4 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present application.
As another aspect, the present application also provides a computer-readable storage medium, which may be contained in the electronic device described in the above embodiments, or may exist separately without being incorporated into the electronic device. The computer-readable storage medium stores one or more programs which, when executed by one or more processors, perform the method for training an emergency model described herein.
The foregoing description is only exemplary of the preferred embodiments of the application and is illustrative of the principles of the technology employed. It will be appreciated by those skilled in the art that the scope of the application referred to in the present application is not limited to the embodiments with a particular combination of the above-mentioned features, but also encompasses other embodiments with any combination of the above-mentioned features or their equivalents without departing from the concept of the above-mentioned application. For example, the above features may be interchanged with (but not limited to) features having similar functions as those claimed in the present application.

Claims (8)

1. A training method of an emergency model is characterized by comprising the following steps:
acquiring training samples, wherein the training samples comprise positive samples and negative samples, the positive samples are text data with labeled emergency events, and the negative samples are text data with non-emergency events;
training an emergency model by taking the positive sample and the negative sample as input and the probability values of all emergencies as output to obtain the trained emergency model;
further comprising:
optimizing the trained emergency model through a loss function to obtain the optimized emergency model;
the optimizing the trained emergency model through a loss function includes:
constructing a multi-label scene loss function of the trained emergency model based on a pre-trained Bert model to perform fine-tuning;
wherein the loss function is capable of modeling for multi-labeled sample data; the following loss function was used as the optimization objective:
loss = log(1 + Σ_{j∈N} e^{S_j}) + log(1 + Σ_{i∈P} e^{-S_i})

wherein N is the negative sample set, P is the positive sample set, S_i is the score of positive sample i, and S_j is the score of negative sample j;
after the obtaining of the training sample, further comprising:
firstly, segmenting the training sample with the BasicTokenizer in the Bert preprocessing model to obtain a token list, and applying the WordPieceTokenizer once to each token to obtain a final word segmentation result; then prepending a [CLS] token to each piece of segmented text; and then, based on the Bert pre-training model, representing each segmented word by a feature vector, giving each sentence an embedding vectorized representation, representing the relative position of each word by a position-coding vector, and adding the three feature vectors to obtain a feature vector that can be input into the emergency model.
2. The training method of claim 1, further comprising:
and acquiring a verification set, and verifying the optimized emergency model by using the verification set.
3. An emergency model training device, comprising:
the system comprises an acquisition module, a processing module and a display module, wherein the acquisition module is used for acquiring training samples, the training samples comprise positive samples and negative samples, the positive samples are text data with labeled emergency events, and the negative samples are text data with non-emergency events;
the training module is used for training an emergency model by taking the positive sample and the negative sample as input and taking probability values of all emergencies as output to obtain the trained emergency model;
further comprising:
the optimization module is used for optimizing the trained emergency model through a loss function to obtain the optimized emergency model;
the optimization module is specifically used for constructing a multi-label scene loss function of the trained emergency model based on a pre-training Bert model to perform finening training; wherein the loss function is capable of modeling for multi-labeled sample data; the following loss function was used as the optimization objective:
loss = log(1 + Σ_{j∈N} e^{S_j}) + log(1 + Σ_{i∈P} e^{-S_i})

wherein N is the negative sample set, P is the positive sample set, S_i is the score of positive sample i, and S_j is the score of negative sample j;
the preprocessing module is used for segmenting words of the training sample by using a word segmenter basic tokenizer in the Bert preprocessing model to obtain a token list, and performing Wordpiec tokenizer processing on each token once to obtain a final word segmentation result; then connecting [ CLS ] marks to the beginning of each word after word segmentation; and then representing each word after word segmentation by using a characteristic vector based on the Bert pre-training model, carrying out embedding vectorization representation on each sentence, representing a relative position coding vector of each word, and adding the three characteristic vectors to obtain a characteristic vector capable of being input into the emergency model.
4. The training device of claim 3, further comprising:
and the verification module is used for acquiring a verification set and verifying the optimized emergency model by using the verification set.
5. An emergency detection method based on a government work order is characterized by comprising the following steps:
acquiring a government work order;
inputting the government work order into the trained emergency model according to claim 1 or 2, and obtaining the emergency in the government work order.
6. The incident detection method of claim 5, wherein the obtaining the incident in the government work order comprises:
and calculating the probability values of all the emergency events by using the emergency event model, and taking the emergency event with the maximum probability value as the emergency event to be finally obtained.
7. An electronic device comprising a memory and a processor, the memory having stored thereon a computer program, wherein the processor, when executing the program, implements the method of claim 1, 2, 5 or 6.
8. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the method according to claim 1, 2, 5 or 6.
CN202210063986.1A 2022-01-20 2022-01-20 Emergency detection method based on government work order Active CN114372458B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210063986.1A CN114372458B (en) 2022-01-20 2022-01-20 Emergency detection method based on government work order


Publications (2)

Publication Number Publication Date
CN114372458A CN114372458A (en) 2022-04-19
CN114372458B true CN114372458B (en) 2023-04-07

Family

ID=81146457

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210063986.1A Active CN114372458B (en) 2022-01-20 2022-01-20 Emergency detection method based on government work order

Country Status (1)

Country Link
CN (1) CN114372458B (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109582785A (en) * 2018-10-31 2019-04-05 天津大学 Emergency event public sentiment evolution analysis method based on text vector and machine learning
CN111444404A (en) * 2020-03-19 2020-07-24 杭州叙简科技股份有限公司 Social public opinion monitoring system based on microblog and monitoring method thereof
WO2020174826A1 (en) * 2019-02-25 2020-09-03 日本電信電話株式会社 Answer generating device, answer learning device, answer generating method, and answer generating program
CN111858725A (en) * 2020-04-30 2020-10-30 北京嘀嘀无限科技发展有限公司 Event attribute determination method and system
CN112035668A (en) * 2020-09-02 2020-12-04 深圳前海微众银行股份有限公司 Event subject recognition model optimization method, device and equipment and readable storage medium
CN113190602A (en) * 2021-04-09 2021-07-30 桂林电子科技大学 Event joint extraction method integrating word features and deep learning
CN113392651A (en) * 2020-11-09 2021-09-14 腾讯科技(深圳)有限公司 Training word weight model, and method, device, equipment and medium for extracting core words
CN113486141A (en) * 2021-07-29 2021-10-08 宁波薄言信息技术有限公司 Text, resume and financing bulletin extraction method based on SegaBert pre-training model

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104113643B (en) * 2014-06-27 2017-01-04 国家电网公司 Customer service center field monitoring system and method
CN110110881B (en) * 2019-03-21 2021-10-26 贵州电网有限责任公司 Power customer demand prediction analysis method and system
CN111475649B (en) * 2020-04-02 2023-04-07 中国人民解放军国防科技大学 False news prediction method, system, device and medium based on deep learning
CN111783428B (en) * 2020-07-07 2024-01-23 杭州叙简科技股份有限公司 Emergency management objective question automatic generation system based on deep learning
CN112528031A (en) * 2021-02-09 2021-03-19 中关村科学城城市大脑股份有限公司 Work order intelligent distribution method and system
CN112989841B (en) * 2021-02-24 2021-09-21 中国搜索信息科技股份有限公司 Semi-supervised learning method for emergency news identification and classification
CN113901289A (en) * 2021-10-08 2022-01-07 新华智云科技有限公司 Unsupervised learning-based recommendation method and system

Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
Wang Song et al. Deep local feature descriptor learning with dual hard batch construction. IEEE Transactions on Image Processing. 2020, Vol. 29, 9572-9583. *
Zikang He et al. Sentiment analysis of agricultural product e-commerce review data based on deep learning. 2020 International Conference on Internet of Things and Intelligent Applications (ITIA). 2020, 1-7. *
Fu Yaling et al. An emergency event detection method based on key burst words. Application of Electronic Technique. 2020, Vol. 46, No. 11, 82-86. *
Wang Lu. Research on key technologies of data mining in manufacturing execution systems. China Masters' Theses Full-text Database, Information Science and Technology. 2018, No. 03, I138-1078. *
Lai Baisheng. Weakly supervised scene semantic understanding. China Doctoral Dissertations Full-text Database, Information Science and Technology. 2019, No. 01, I138-123. *

Similar Documents

Publication Publication Date Title
CN109918560B (en) Question and answer method and device based on search engine
CN111695352A (en) Grading method and device based on semantic analysis, terminal equipment and storage medium
CN111222305A (en) Information structuring method and device
CN112507704B (en) Multi-intention recognition method, device, equipment and storage medium
CN111522915A (en) Extraction method, device and equipment of Chinese event and storage medium
CN111814454A (en) Multi-modal network spoofing detection model on social network
US20200364216A1 (en) Method, apparatus and storage medium for updating model parameter
CN112085091B (en) Short text matching method, device, equipment and storage medium based on artificial intelligence
CN113282711A (en) Internet of vehicles text matching method and device, electronic equipment and storage medium
CN116882372A (en) Text generation method, device, electronic equipment and storage medium
US20230114673A1 (en) Method for recognizing token, electronic device and storage medium
CN112070506A (en) Risk user identification method, device, server and storage medium
CN115983271A (en) Named entity recognition method and named entity recognition model training method
CN112613293A (en) Abstract generation method and device, electronic equipment and storage medium
CN116450829A (en) Medical text classification method, device, equipment and medium
CN110852066A (en) Multi-language entity relation extraction method and system based on confrontation training mechanism
CN117278675A (en) Outbound method, device, equipment and medium based on intention classification
CN113536784A (en) Text processing method and device, computer equipment and storage medium
CN114372458B (en) Emergency detection method based on government work order
CN111797194A (en) Text risk detection method and device, electronic equipment and storage medium
WO2023137903A1 (en) Reply statement determination method and apparatus based on rough semantics, and electronic device
CN116092101A (en) Training method, image recognition method apparatus, device, and readable storage medium
CN116976341A (en) Entity identification method, entity identification device, electronic equipment, storage medium and program product
CN114090781A (en) Text data-based repulsion event detection method and device
CN115036022A (en) Health risk assessment method and system, computer device, and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant