WO2021246812A1 - Solution and device for analyzing the positivity level of news using a deep learning NLP model - Google Patents

Solution and device for analyzing the positivity level of news using a deep learning NLP model

Info

Publication number
WO2021246812A1
Authority
WO
WIPO (PCT)
Prior art keywords
sentence
score
news
sentences
word
Prior art date
Application number
PCT/KR2021/006968
Other languages
English (en)
Korean (ko)
Inventor
황규종
김도영
김민철
김준휘
김지원
도용남
안순득
전창환
정은철
최용승
Original Assignee
주식회사 웨이커
황규종
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from KR1020200067620A external-priority patent/KR102466428B1/ko
Priority claimed from KR1020200067621A external-priority patent/KR102443629B1/ko
Priority claimed from KR1020200067619A external-priority patent/KR102322899B1/ko
Application filed by 주식회사 웨이커, 황규종
Publication of WO2021246812A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/30Information retrieval; Database structures therefor; File system structures therefor of unstructured textual data
    • G06F16/31Indexing; Data structures therefor; Storage structures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/30Information retrieval; Database structures therefor; File system structures therefor of unstructured textual data
    • G06F16/33Querying
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/90Details of database functions independent of the retrieved data types
    • G06F16/95Retrieval from the web
    • G06F16/953Querying, e.g. by the use of web search engines
    • G06F16/9535Search customisation based on user profiles and personalisation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F40/00Handling natural language data
    • G06F40/20Natural language analysis
    • G06F40/258Heading extraction; Automatic titling; Numbering
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F40/00Handling natural language data
    • G06F40/20Natural language analysis
    • G06F40/279Recognition of textual entities
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q40/00Finance; Insurance; Tax strategies; Processing of corporate or income taxes
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/10Services

Definitions

  • the present disclosure relates to a method and apparatus for analyzing news articles. More specifically, it relates to a method and apparatus for analyzing a news article according to a news positivity analysis solution using a deep learning model.
  • a method and apparatus for analyzing a news article may be provided.
  • a method and apparatus for analyzing a news article using a plurality of news article analysis models may be provided.
  • a method of analyzing a news article, including the operations described herein, may be provided.
  • a memory for storing one or more instructions; and at least one processor executing the one or more instructions.
  • the at least one processor, by executing the one or more instructions, identifies sentences including a pre-stored financial word or person word in the news article, obtains a first news score by inputting some of the identified sentences into a neural network model trained to output a score for the news article, and obtains a second news score by inputting some of the identified sentences into a pre-learning model.
  • An apparatus for analyzing a news article may be provided that determines a comprehensive evaluation score regarding the financial propensity of the news article based on the first news score and the second news score.
  • a computer-readable recording medium recording a program for executing, on a computer, a method of analyzing a news article may be provided.
  • a method for training a news article analysis model including, may be provided.
  • the method for training the news article analysis model may further include training the news article analysis model based on the generated training data.
  • a memory for storing one or more instructions; and at least one processor executing the one or more instructions.
  • the at least one processor, by executing the one or more instructions, stores preset financial words and person words, obtains a news article from an external device to which the electronic device is connected, identifies sentences in the news article that include a financial word or a person word, extracts a preset number of sentences from among the identified sentences, and pre-processes the extracted sentences to generate training data for training the news article analysis model.
  • An electronic device that generates such training data may be provided.
  • the electronic device may train a news article analysis model based on the generated training data.
  • the steps of: storing preset financial words and person words; acquiring a news article from an external device to which the electronic device is connected; identifying sentences in the news article that include the financial word or the person word; extracting a preset number of sentences from among the identified sentences; and generating training data for training the news article analysis model by pre-processing the extracted sentences.
  • a computer-readable recording medium recording a program for executing the method on a computer may be provided.
  • obtaining, by an electronic device in a news article analysis system, a news article from at least one external device connected to the electronic device; transmitting, by the electronic device, the obtained news article to a server connected to the electronic device; identifying, by the server, sentences including a pre-stored financial word or person word in the news article; obtaining, by the server, a first news score from a neural network model by inputting some of the identified sentences into the neural network model, which is trained to output a score for the news article; obtaining, by the server, a second news score from a pre-learning model by inputting some of the identified sentences into the pre-learning model; and transmitting, by the server, the first news score and the second news score to the electronic device.
  • a method of analyzing a news article including the foregoing operations may be provided.
  • a news article analysis system may be provided, comprising: a server that identifies sentences including a pre-stored financial word or person word in a news article, obtains a first news score from a neural network model by inputting some of the identified sentences into the neural network model, which is trained to output a score for the news article, and obtains a second news score from a pre-learning model by inputting some of the identified sentences into the pre-learning model; and an electronic device that obtains a news article from an external device, transmits the obtained news article to the server, and obtains the first news score and the second news score from the server.
  • in the news article analysis system, the method may include: acquiring, by an electronic device, a news article from at least one external device connected to the electronic device; transmitting, by the electronic device, the obtained news article to a server connected to the electronic device; identifying, by the server, sentences including a pre-stored financial word or person word in the news article; obtaining, by the server, a first news score from a neural network model by inputting some of the identified sentences into the neural network model, which is trained to output a score for the news article; obtaining, by the server, a second news score from a pre-learning model by inputting some of the identified sentences into the pre-learning model; and transmitting, by the server, the first news score and the second news score to the electronic device.
  • a computer-readable recording medium recording a program for executing, on a computer, a method of analyzing a news article may be provided.
  • the news article may be objectively evaluated using a plurality of models for analyzing the news article.
  • FIG. 1 is a diagram schematically illustrating a method of analyzing a news article according to an exemplary embodiment.
  • FIG. 2 is a flowchart of a method of analyzing a news article according to an exemplary embodiment.
  • FIG. 3 is a diagram for explaining a financial word list and a person weight list that are previously stored in an electronic device, according to an embodiment.
  • FIG. 4 is a diagram for explaining a process in which an electronic device extracts a sentence from a news article and generates summary data by using the extracted sentence, according to an exemplary embodiment.
  • FIG. 5 is a diagram for explaining a method of obtaining, by an electronic device, a first news score using a neural network model, according to an embodiment.
  • FIG. 6 is a flowchart of a method of obtaining, by an electronic device, a second news score by using a pre-learning model, according to an exemplary embodiment.
  • FIG. 7 is a flowchart illustrating a process in which an electronic device pre-processes a preset number of sentences, according to an embodiment.
  • FIG. 8 is a diagram for describing sentence weight elements used by an electronic device to determine a sentence weight determined for each sentence, according to an embodiment.
  • FIG. 9 is a flowchart illustrating a method for an electronic device to determine a comprehensive evaluation score based on a first news score and a second news score, according to an embodiment.
  • FIG. 10 is a flowchart illustrating a method for an electronic device to analyze a news article according to another exemplary embodiment.
  • FIG. 11 is a diagram schematically illustrating a process of learning a news article analysis model according to an exemplary embodiment.
  • FIG. 12 is a flowchart of a method for training a news article analysis model according to an embodiment.
  • FIG. 13 is a diagram for describing a process in which an electronic device extracts a preset number of sentences from among sentences identified in a news article, according to an embodiment.
  • FIG. 14 is a flowchart illustrating a process in which an electronic device pre-processes a preset number of sentences, according to an embodiment.
  • FIG. 15 is a diagram for describing a process in which an electronic device generates learning data and verification data from a news article, according to an exemplary embodiment.
  • FIG. 16 is a flowchart of a method for analyzing a news article by a news article analysis system according to an exemplary embodiment.
  • FIG. 17 is a flowchart of a method of analyzing a news article by a news article analysis system according to another exemplary embodiment.
  • FIG. 18 is a flowchart illustrating a method of analyzing a news article by a news article analysis system according to another exemplary embodiment.
  • FIG. 19 is a block diagram of an electronic device according to an embodiment.
  • FIG. 20 is a block diagram of a server according to an embodiment.
  • a method of analyzing a news article comprising: identifying a sentence including a financial word or a person word stored in advance in the news article; obtaining a first news score from the neural network model by inputting some of the identified sentences into a neural network model trained to output a score for the news article; obtaining a second news score from the pre-learning model by inputting some of the identified sentences into the pre-learning model; and determining, based on the first news score and the second news score, a comprehensive evaluation score regarding the financial propensity of the news article;
  • a method may be provided, comprising:
  • an apparatus for analyzing a news article may include: a memory storing one or more instructions; and at least one processor executing the one or more instructions, wherein the at least one processor, by executing the one or more instructions, identifies sentences including a pre-stored financial word or person word in the news article, obtains a first news score by inputting some of the identified sentences into a neural network model trained to output a score for the news article, obtains a second news score by inputting some of the identified sentences into a pre-learning model, and determines a comprehensive evaluation score regarding the financial propensity of the news article based on the first news score and the second news score.
  • FIG. 1 is a diagram schematically illustrating a method of analyzing a news article according to an exemplary embodiment.
  • the electronic device 1000 may acquire the news article 101 from an external device connected to the electronic device, and determine a comprehensive evaluation score 112 for the acquired news article. For example, the electronic device 1000 may acquire text data, image data, or video data about the news article 101 from the external device, and may output a comprehensive evaluation score 112 for the news article by using the acquired data.
  • the electronic device 1000 may determine a comprehensive evaluation score for the news article 101 by using a news article analysis model for analyzing at least one news article.
  • the news article analysis model may include at least one of the neural network model 102 and the pre-learning model 104 .
  • the electronic device 1000 may determine the comprehensive evaluation score 110 for the news article 101 by using a first news score 106, output by the neural network model according to pre-trained weights, and a second news score 108, output by the pre-learning model according to pre-stored word scores.
  • the comprehensive evaluation score 110 may be a numerical value representing the financial propensity indicated by the news article, based on at least one of a financial word and a person word included in the news article.
  • the comprehensive evaluation score 110 may change when the layers in the neural network model 102 and the weights of the connection strengths between the layers are modified and updated, or when the word scores in the financial word list or the person weights in the person weight list used by the pre-learning model 104 are changed.
  • the electronic device 1000, which uses at least one news article analysis model, may be, but is not limited to, a smartphone, a tablet PC, a smart TV, a cell phone, a media player, a server, a micro server, or another mobile or non-mobile computing device equipped with an AI program for processing data related to news articles and a voice recognition function.
  • the electronic device 1000 may determine the comprehensive evaluation score 112 for the news article 101 by interworking with the server 2000 .
  • the electronic device 1000 may include a communication module capable of communicating with the server 2000 .
  • the server 2000 may include other computing devices that can transmit and receive data to and from the electronic device by being connected to the electronic device 1000 through a network.
  • the server device 2000 may be a Wearable Business Management Server (W-BMS) for managing the wearable device.
  • the network connecting the electronic device 1000 and the server 2000 is a data communication network in a comprehensive sense, including a local area network (LAN), a wide area network (WAN), a value added network (VAN), a mobile radio communication network, a satellite communication network, and combinations thereof, which enables each network constituent entity shown in FIG. 1 to communicate smoothly with the others, and may include the wired Internet, the wireless Internet, and mobile wireless communication networks.
  • FIG. 2 is a flowchart of a method of analyzing a news article according to an exemplary embodiment.
  • the electronic device 1000 may identify a sentence including a financial word or a person word stored in advance in a news article. According to an embodiment, the electronic device 1000 may generate a financial word list by matching a financial word and a word score for each financial word, and may store the generated financial word list. Also, the electronic device 1000 may generate a person weight list by matching the person word and the person word weight for each person word, and store the generated person weight list. According to an embodiment, the person word may be predetermined to be related to the field of finance. According to an embodiment, the person word may be determined based on whether the number of publications of the person word in the published financial field article is equal to or greater than a predetermined threshold during a preset period.
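The identification step above can be sketched as follows. This is a minimal illustration, not the patented implementation: the word lists, their scores, and the regex-based sentence splitting are hypothetical stand-ins for the pre-stored financial word list and person weight list.

```python
import re

# Hypothetical pre-stored lists: financial words matched with word scores,
# and person words matched with person weights (entries are illustrative).
financial_word_list = {"growth": 3.0, "competent": 2.0, "bankruptcy": -4.0}
person_weight_list = {"warren edward buffett": 1.5}

def identify_sentences(article):
    """Return the sentences that contain a pre-stored financial word or person word."""
    sentences = re.split(r"(?<=[.!?])\s+", article.strip())
    hits = []
    for sentence in sentences:
        lowered = sentence.lower()
        if (any(word in lowered for word in financial_word_list)
                or any(person in lowered for person in person_weight_list)):
            hits.append(sentence)
    return hits

article = ("The company showed strong growth this quarter. "
           "Weather was mild. Analysts fear bankruptcy next year.")
selected = identify_sentences(article)
```

Only the first and last sentences survive here, since the middle sentence contains neither a financial word nor a person word.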
  • the electronic device 1000 may obtain a first news score from the neural network model by inputting some of the identified sentences into the neural network model trained to output scores for news articles.
  • the electronic device 1000 may generate training data using news articles for a preset period, and train the neural network model in advance based on the generated training data.
  • the electronic device 1000 may acquire a pre-trained neural network model from the server.
  • the electronic device 1000 may obtain a second news score from the pre-learning model by inputting some of the identified sentences to the pre-learning model.
  • the pre-learning model may identify predetermined words in an article by using a dictionary, and output a second news score based on the word scores assigned to the identified words.
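The pre-learning model's scoring can be sketched as a simple lexicon lookup. The dictionary entries and the averaging policy below are assumptions for illustration; the description does not fix a particular aggregation rule.

```python
# Hypothetical dictionary of word scores (in practice, built from the expert
# evaluation scores in the financial word list).
word_scores = {"growth": 3.0, "loss": -2.0, "bankruptcy": -4.0}

def dictionary_score(sentences):
    """Average the scores of dictionary words found in the given sentences;
    averaging (rather than summing) is an illustrative assumption."""
    found = [word_scores[token]
             for sentence in sentences
             for token in sentence.lower().replace(".", "").split()
             if token in word_scores]
    return sum(found) / len(found) if found else 0.0

second_news_score = dictionary_score(["Strong growth reported.", "Bankruptcy risk remains."])
```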
  • the electronic device 1000 may determine a comprehensive evaluation score regarding the financial propensity of a news article based on the first news score and the second news score.
  • the electronic device 1000 may obtain a neutral index, which is a probability value related to the degree of confidence in the second news score, in addition to the second news score from the pre-learning model.
  • the electronic device 1000 determines an evaluation weight to be applied to the first news score and the second news score based on the neutral index value, and performs a comprehensive evaluation by weighting the first news score and the second news score according to the determined evaluation weight. score can be determined.
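The weighting step might look like the following sketch. Using the neutral index directly as the evaluation weight on the first news score is an assumption; the text only states that the evaluation weight is determined based on the neutral index value.

```python
def comprehensive_score(first_score, second_score, neutral_index):
    """Combine the two news scores with an evaluation weight derived from the
    neutral index (the confidence in the second score). Here a plain convex
    combination is assumed: a high neutral index shifts weight toward the
    neural-network score."""
    w = neutral_index  # assumed mapping: evaluation weight = neutral index
    return w * first_score + (1.0 - w) * second_score

score = comprehensive_score(0.8, 0.2, 0.75)  # weighted toward the first score
```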
  • FIG. 3 is a diagram for explaining a financial word list and a person weight list that are previously stored in an electronic device, according to an embodiment.
  • the electronic device 1000 may generate the financial word list 310 by matching the financial word and the word score for each financial word.
  • the electronic device 1000 may acquire expert evaluation scores of experts for preset financial words.
  • the electronic device 1000 may obtain an expert evaluation score for a specific financial word from a plurality of experts, and may generate a word score vector based on the obtained expert evaluation score.
  • the word score vector may include the evaluation score of each expert as a vector element.
  • the electronic device 1000 may identify the average and standard deviation of elements in the word score vector.
  • the electronic device 1000 may generate the financial word list 310 by converting the financial words into a headword form and matching the converted financial words into the headword form with word scores for each financial word.
  • for example, the electronic device 1000 may obtain scores of 2, 2, 3, 2, and 1, respectively, as expert evaluation scores for 'competent' from five financial experts, and may generate a word score vector {2, 2, 3, 2, 1} using each expert evaluation score obtained from the financial experts as a vector element.
  • the electronic device 1000 may identify the mean and standard deviation of the vector elements of the generated word score vector as 2 and 0.632456, respectively.
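The figures above can be reproduced with the standard library; note that 0.632456 corresponds to the population (not sample) standard deviation of {2, 2, 3, 2, 1}, i.e. the square root of 0.4.

```python
import statistics

# Expert evaluation scores for 'competent' from five financial experts.
expert_scores = [2, 2, 3, 2, 1]

mean = statistics.fmean(expert_scores)
std = statistics.pstdev(expert_scores)  # population standard deviation, sqrt(0.4)
```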
  • the expert evaluation score obtained by the electronic device 1000 may include a value between -5 and 5, but is not limited thereto.
  • the expert evaluation score obtained by the electronic device 1000 may be an index value indicating a positive or negative degree of each expert with respect to a corresponding word.
  • the expert evaluation score obtained by the electronic device 1000 may indicate the following positive or negative degrees.
  • an expert evaluation score between -5 and 5 may indicate a positive or negative degree of a financial word as follows (eg, -5: extremely bad, -4: very bad, -3: not good, -2: slightly bad, -1: vaguely bad, 0: moderate, 1: vaguely good, 2: slightly good, 3: good, 4: very good, 5: extremely good).
  • the present invention is not limited thereto, and the range of evaluation scores of experts for financial words may vary.
  • the electronic device 1000 may obtain an evaluation score for a predetermined person word from a plurality of financial experts.
  • the electronic device 1000 may generate a person word weight based on an evaluation score for the person word obtained from a plurality of financial experts.
  • a weight for each person word may be expressed in a vector form.
  • the electronic device 1000 may generate the person weight list 320 by matching the person word and the person word weight for each person word.
  • the electronic device 1000 may obtain evaluation scores for 'Warren Edward Buffet', the chairman of Berkshire Hathaway, from each of five financial experts, and may generate a person weight based on the obtained evaluation scores.
  • the expert evaluation score for the person word obtained by the electronic device 1000 may include a value between -5 and 5.
  • the present invention is not limited thereto, and the range of the expert evaluation score for the person word obtained by the electronic device 1000 may vary.
  • the electronic device 1000 may store the aforementioned financial word list 310 and person weight list 320 in a memory in the electronic device in advance.
  • FIG. 4 is a diagram for explaining a process in which an electronic device extracts a sentence from a news article and generates summary data by using the extracted sentence, according to an exemplary embodiment.
  • the electronic device 1000 may obtain a news article and identify each sentence in the obtained news article. For example, the electronic device 1000 may identify sentences including a preset financial word or person word in a news article. For example, the electronic device 1000 may acquire the news article 410 and identify sentences 1 to 6, each including at least one of a predetermined financial word or a person word, in the acquired news article 410.
  • the electronic device 1000 may identify the sentences included in a news article, and identify location information for each of the identified sentences.
  • the electronic device 1000 may identify the number of the identified sentence in the news article.
  • the electronic device 1000 may identify a first sentence in a news article and a last sentence in a news article based on location information of the identified sentence.
  • the electronic device 1000 may extract predetermined sentences from among the sentences identified in the news article. For example, the electronic device 1000 may extract sentences 1 to 4 and sentence 6 from among the sentences identified in the news article 410. The electronic device 1000 may generate a summary news article 420 by extracting some of the sentences that include at least one of a predetermined financial word or a person word in the news article and using the extracted partial sentences.
  • the electronic device 1000 may extract a preset number of sentences to be scored from among the sentences including a financial word or a person word, the first sentence in the news article, and the last sentence in the news article, and may generate a summary news article 420 using the extracted preset number of sentences.
  • the electronic device 1000 may generate the summary data 430 by encoding the summary news article.
  • the electronic device 1000 may generate text data for a summarized news article as the summary data 430 .
  • the electronic device 1000 may generate the summary data 430 by encoding the summary news article and converting the summary news article into binarized data.
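The extraction-and-encoding flow above can be sketched as follows. The selection policy (first sentence, identified sentences, last sentence, capped at a preset number) and the UTF-8 encoding are illustrative assumptions, as is the function name.

```python
def summarize(identified, first_sentence, last_sentence, n):
    """Keep up to n scoring-target sentences drawn from the article's first
    sentence, the identified sentences, and the last sentence, preserving
    order and dropping duplicates (selection policy is an assumption)."""
    candidates = []
    for s in [first_sentence, *identified, last_sentence]:
        if s not in candidates:
            candidates.append(s)
    return " ".join(candidates[:n])

summary = summarize(["Sentence 2.", "Sentence 3."], "Sentence 1.", "Sentence 6.", n=3)
summary_data = summary.encode("utf-8")  # binarized summary data
```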
  • FIG. 5 is a diagram for explaining a method of obtaining, by an electronic device, a first news score using a neural network model, according to an embodiment.
  • the electronic device 1000 may input the summary data 502, generated according to the process described above with reference to FIG. 4, into the neural network model 510, and obtain the first news score 512 output from the neural network model 510.
  • the neural network model 510 may include at least one layer including at least one node and a weight related to the connection strength of the layers.
  • the neural network model used by the electronic device 1000 may include a deep neural network (DNN), for example, a Convolutional Neural Network (CNN), a Recurrent Neural Network (RNN), a Restricted Boltzmann Machine (RBM), a Deep Belief Network (DBN), a Bidirectional Recurrent Deep Neural Network (BRDNN), or Deep Q-Networks, but is not limited thereto.
  • the electronic apparatus 1000 may obtain a news article from an external device connected to the electronic apparatus according to a preset period, and may generate learning data from the obtained news article. According to an embodiment, the electronic device 1000 may further generate verification data from the obtained news articles in addition to the learning data.
  • the electronic device 1000 may train the neural network model based on the training data, so that the difference (eg, a loss) between the output value that the neural network model produces when the learning data is input and the output value that it produces when the verification data is input becomes small.
  • the operation of the electronic device 1000 to train the neural network model may correspond to the operation of correcting and updating the layers in the neural network model and weights related to the connection strength of the layers.
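The "correcting and updating weights" operation can be illustrated with a toy one-parameter gradient-descent loop; this stands in for the real deep-learning update, which adjusts many layer weights at once, and all values here are hypothetical.

```python
def train_step(w, x, target, lr=0.1):
    """One gradient-descent update for a single-weight model y = w * x with
    squared-error loss; a stand-in for updating a network's layer weights."""
    pred = w * x
    grad = 2 * (pred - target) * x  # derivative of (w*x - target)**2 w.r.t. w
    return w - lr * grad

w = 0.0
for _ in range(50):
    w = train_step(w, x=1.0, target=0.7)
# repeated updates nudge the weight toward the target value 0.7
```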
  • the electronic device 1000 may acquire the first news score 512 by inputting summary data obtained from a predetermined news article into the trained neural network model 510 .
  • the electronic device 1000 may identify a plurality of sentences in a news article received from an external device, pre-process some sentences extracted from among the identified sentences, and obtain the first news score from the neural network model by inputting some of the pre-processed sentences into the trained neural network model.
  • FIG. 6 is a flowchart of a method of obtaining, by an electronic device, a second news score by using a pre-learning model, according to an exemplary embodiment.
  • the electronic device 1000 may pre-process some extracted sentences among predetermined sentences identified in the news article.
  • the operation of pre-processing the extracted partial sentences by the electronic device 1000 may correspond to the operation of pre-processing the sentences in the summary data described above with reference to FIG. 4 .
  • An operation in which the electronic device 1000 pre-processes some extracted sentences will be described in detail with reference to FIG. 7 to be described later.
  • the electronic device 1000 may determine a sentence weight for each of the pre-processed sentences based on at least one of negative phrases, adverbs, punctuation marks, emphasis phrases, negative words, and person words in the pre-processed sentence. For example, in S610 , the electronic device 1000 may determine a person word weight previously assigned to a person word included in the pre-processed sentence as a weight of a sentence including the person word.
  • when a negative phrase is included in the pre-processed sentence, the electronic device 1000 may multiply the sentence weight of the pre-processed sentence including the corresponding negative phrase by -1. That is, when a negative phrase such as 'not' or 'isn't' is included in the pre-processed sentence, the electronic device 1000 may determine the sentence weight of that sentence as a negative number.
  • the electronic device 1000 may determine a sentence weight for each pre-processed sentence based on the adverbs included in the pre-processed sentence. For example, when an affirmative adverb (eg, absolutely, incredibly) is included in the pre-processed sentence, the electronic device 1000 may add a predetermined weight value to the sentence weight of the sentence including the positive adverb, based on the number of positive adverbs included. According to an embodiment, when one affirmative adverb is included in the pre-processed sentence, the electronic device 1000 may increase the sentence weight of the corresponding sentence by 0.3, but the degree of increase in the sentence weight may vary. Also, according to an embodiment, when two positive adverbs are included in the pre-processed sentence, the electronic device 1000 may increase the weight by 0.6, but is not limited thereto.
  • when a negative adverb is included in the pre-processed sentence, the electronic device 1000 may subtract a predetermined weight value from the sentence weight of the sentence including the negative adverb, based on the number of negative adverbs included.
  • the electronic device 1000 may decrease the sentence weight of the corresponding sentence by 0.3, but is not limited thereto.
  • the electronic device 1000 may decrease the weight by 0.6, but is not limited thereto.
  • the electronic device 1000 may determine a sentence weight in a corresponding sentence based on the pre-processed punctuation marks in the sentence. For example, when the pre-processed sentence includes an exclamation point punctuation mark for emphasis, the electronic device 1000 may increase the sentence weight of the sentence including the exclamation point.
  • the operation of the electronic device increasing the sentence weight may correspond to the operation of scaling up the sentence weight by adding a predetermined weight value or by multiplying it by a scale value according to a predetermined scale.
  • the electronic device 1000 may reduce the sentence weight of the corresponding sentence.
  • the operation of the electronic device decreasing the sentence weight may correspond to the operation of scaling down the sentence weight by subtracting a predetermined weight value or by multiplying it by a scale value according to a predetermined scale.
  • the electronic device 1000 may scale up the sentence weight of 0.3 by 50% and determine it to be 0.45.
  • the electronic device 1000 may increase the weight of the pre-processed sentence.
  • the electronic device 1000 may increase or decrease the sentence weight of the pre-processed sentence including the negative word.
  • when the pre-processed sentence includes one affirmative adverb, ends with an exclamation mark, and includes the negative phrase not, the electronic device 1000 may increase the sentence weight by 0.3 due to the one positive adverb, scale the weight up to 0.45 due to the exclamation mark, and, since the negative phrase not is included, change the scaled-up sentence weight of 0.45 to -0.45. Accordingly, in this case, the electronic device 1000 may determine the sentence weight of the corresponding sentence to be -0.45. However, the extent to which the electronic device 1000 increases or decreases the sentence weight is not limited thereto.
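The weighting rules above can be sketched as follows. This is a minimal illustration using the example values from the text (+0.3 per positive adverb, 50% scale-up for an exclamation mark, sign flip for a negative phrase); the word lists and the function name are hypothetical, not part of the disclosed embodiment.

```python
def sentence_weight(tokens, ends_with_exclamation):
    """Sketch of the sentence-weight rules described above."""
    POSITIVE_ADVERBS = {"absolutely", "incredibly"}   # illustrative lists
    NEGATIVE_ADVERBS = {"scarcely", "hardly"}
    NEGATIVE_PHRASES = {"not", "isn't"}

    weight = 0.0
    weight += 0.3 * sum(t in POSITIVE_ADVERBS for t in tokens)  # +0.3 per positive adverb
    weight -= 0.3 * sum(t in NEGATIVE_ADVERBS for t in tokens)  # -0.3 per negative adverb
    if ends_with_exclamation:                                   # scale up by 50%
        weight *= 1.5
    if any(t in NEGATIVE_PHRASES for t in tokens):              # negation flips the sign
        weight *= -1
    return weight

# Worked example from the text: one positive adverb, exclamation mark, negation "not"
print(round(sentence_weight(["absolutely", "not", "good"], True), 2))  # -0.45
```

The order of operations (add, then scale, then flip sign) follows the worked example in the text; other orderings would give different values.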
  • the electronic device 1000 may determine the sentence score of each pre-processed sentence by applying the sentence weight determined for each of the pre-processed sentences to the word score of each financial word in the pre-processed sentence. For example, the electronic device 1000 may determine the sentence word score by identifying preset words in the pre-processed sentence and summing the word scores assigned to each of the identified words. The electronic device 1000 may finally determine the sentence score by multiplying the sentence word score by the sentence weight.
  • the sentence score of the pre-processed sentence may be determined to be -0.18.
  • the sum of scores of financial words included in one pre-processed sentence is 0.4, and the pre-processed sentence includes one positive adverb, ends with an exclamation mark, and does not include negative phrases.
  • the sentence score of the pre-processed sentence may be determined to be 0.18.
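The worked example above (a word-score sum of 0.4 multiplied by a sentence weight of ±0.45, giving ±0.18) can be reproduced with a minimal sketch; the individual word scores of 0.1 and 0.3 are hypothetical values chosen to sum to 0.4.

```python
def sentence_score(word_scores, weight):
    # Sum the word scores of the preset (financial) words found in the
    # sentence, then apply the sentence weight.
    return sum(word_scores) * weight

print(round(sentence_score([0.1, 0.3], 0.45), 2))   # 0.18
print(round(sentence_score([0.1, 0.3], -0.45), 2))  # -0.18
```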
  • the electronic device 1000 may also determine the sentence score of a given sentence by applying the sentence weight determined for each sentence to the word score of each financial word included in each of the pre-processed sentences, and then weighted-summing the word scores to which the sentence weight has been applied.
  • the electronic device 1000 may adjust the sentence score determined for each pre-processed sentence based on the identified location information. Also, according to an embodiment, the electronic device 1000 may adjust the sentence score determined for each pre-processed sentence based on location information of the pre-processed sentence and a conjunction included in the pre-processed sentence. According to an embodiment, the operation of the electronic device 1000 adjusting the determined sentence score for each pre-processed sentence based on the identified location information may correspond to an operation in which the electronic device 1000 adjusts the sentence weight determined for each of the pre-processed sentences based on the identified location information and applies the adjusted sentence weight to the sentence score determined for each of the pre-processed sentences.
  • the electronic device 1000 may identify location information regarding the location of the pre-processed sentence in the news article.
  • the electronic device 1000 may identify contexts between pre-processed sentences based on location information of the pre-processed sentences, and adjust sentence scores determined for each pre-processed sentence based on the identified contexts.
  • the context identified by the electronic device 1000 based on the location information of the pre-processed sentences may mean the 'context' between the pre-processed sentences.
  • the electronic device 1000 may identify whether the pre-processed sentence is a continuous sentence based on location information of the pre-processed sentence. When the pre-processed sentences are consecutive sentences, the electronic device 1000 may further assign additional weights to all sentence weights determined for the pre-processed sentences. According to an embodiment, when the pre-processed sentence is a continuous sentence, the electronic device 1000 may increase the sentence weights determined for each of the pre-processed sentences by 0.1, but is not limited thereto. The electronic device 1000 may adjust the sentence scores of each pre-processed sentence by applying the increased sentence weights to each of the pre-processed sentences.
  • the electronic device 1000 may adjust a sentence score for each of the pre-processed sentences based on location information of the pre-processed sentence and a conjunction included in the pre-processed sentence. For example, the electronic device 1000 identifies whether the pre-processed sentence is two consecutive sentences based on the location information of the pre-processed sentence, and when the two consecutive sentences are connected with a but conjunction, The sentence weight of the preceding pre-processed sentence may be decreased, and the sentence weight of the subsequent sentence connected after the but conjunction may be increased.
  • when two consecutive sentences are connected with the but conjunction, the electronic device 1000 may decrease the sentence weight of the preceding pre-processed sentence by 0.4 and increase the sentence weight of the following sentence connected after the but conjunction by 0.4.
  • the electronic device 1000 may adjust the sentence score of the pre-processed sentence before the but conjunction by applying a sentence weight reduced by 0.4 to the sum of scores of words in the pre-processed sentence before the but conjunction.
  • the electronic device 1000 may adjust the sentence score of the preprocessed sentence after the but conjunction by applying a sentence weight increased by 0.4 to the sum of scores of words in the preprocessed sentence after the but conjunction.
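The but-conjunction adjustment described above can be sketched as follows, using the example value of ±0.4 from the text. Detecting the conjunction by checking whether the following sentence begins with "but" is an assumption made for illustration; the disclosed embodiment only states that two consecutive sentences are connected with the but conjunction.

```python
def adjust_for_but(weights, sentences):
    """Sketch: when sentence i is connected to sentence i-1 with 'but',
    decrease the weight of sentence i-1 by 0.4 and increase the weight
    of sentence i by 0.4 (example values from the text)."""
    adjusted = list(weights)
    for i, s in enumerate(sentences):
        if i > 0 and s.lower().startswith("but"):
            adjusted[i - 1] -= 0.4
            adjusted[i] += 0.4
    return adjusted

print([round(w, 1) for w in
       adjust_for_but([0.5, 0.5], ["Profits fell.", "But the outlook improved!"])])
# [0.1, 0.9]
```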
  • the electronic device 1000 may obtain a second news score by using the adjusted sentence score. For example, when sentence scores are adjusted for each of the pre-processed sentences, the electronic device 1000 may acquire the second news score by adding the adjusted sentence scores.
  • FIG. 7 is a flowchart illustrating a process in which an electronic device pre-processes a preset number of sentences, according to an embodiment.
  • the electronic device 1000 may identify each word in the sentence by tokenizing some extracted sentences among the identified sentences in the news article. Although not shown in FIG. 7, the electronic device 1000 may identify words in the extracted sentences by further performing a process of tokenizing the tokenized words in the partial sentences. According to an embodiment, the electronic device 1000 may tokenize the sentences by decomposing the extracted sentences into grammatically indivisible units. Also, the electronic device 1000 may tokenize the extracted sentences based on at least one of punctuation marks and spaces in the sentences, and may identify words from the tokenized sentences.
  • the electronic device 1000 may remove words that are not used to calculate a sentence score from among the identified words. For example, the electronic device 1000 may generate a list of words that are not necessary for calculating the score of the sentence among the words identified by tokenizing the sentence and, when words included in the generated word list are identified, may remove the corresponding words.
  • the electronic device 1000 may remove words not used for calculating the sentence score and convert the remaining words in each sentence into a headword form. For example, when the remaining words after removal are not in the headword form described in the dictionary (eg, looks), the electronic device 1000 may convert the remaining words into the headword form described in the dictionary (eg, look).
  • the electronic device 1000 may determine a weight for each sentence by using a sentence including words converted into a headword form, and may determine a sentence score for each sentence based on the determined weight.
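The pre-processing steps above (tokenize, remove unused words, convert to headword form) can be sketched as follows. The stop-word list and headword map are hypothetical stand-ins for the word list and dictionary the text refers to; the text itself does not specify their contents beyond the looks/look example.

```python
import re

STOP_WORDS = {"the", "a", "an", "of", "to"}       # illustrative unused-word list
HEADWORDS = {"looks": "look", "looked": "look"}   # illustrative headword map

def preprocess(sentence):
    # 1) tokenize on punctuation marks and spaces
    tokens = re.findall(r"[A-Za-z']+", sentence.lower())
    # 2) remove words not used for calculating the sentence score
    tokens = [t for t in tokens if t not in STOP_WORDS]
    # 3) convert remaining words into their dictionary headword form
    return [HEADWORDS.get(t, t) for t in tokens]

print(preprocess("The market looks strong!"))  # ['market', 'look', 'strong']
```

In practice a lemmatizer (rather than a fixed map) would perform step 3, but the table-lookup form mirrors the dictionary-headword description in the text.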
  • FIG. 8 is a diagram for describing sentence weight elements used by an electronic device to determine a sentence weight determined for each sentence, according to an embodiment.
  • a sentence weight element 804 used by the electronic device 1000 to determine a sentence weight for each sentence and element details 806 of each sentence weight element are illustrated.
  • the sentence weight elements used by the electronic device 1000 may be stored in advance in a memory of the electronic device 1000 according to predetermined identification numbers 802 .
  • according to the pre-processing process of FIG. 7 , the electronic device 1000 may determine a sentence weight for each pre-processed sentence based on at least one of negative phrases, adverbs, punctuation marks, emphasis phrases, negative words, and person words in the pre-processed sentence.
  • the sentence weighting element 804 may include at least one of negative phrases, adverbs, punctuation marks, conjunctions, emphasis phrases, negative words, and person words.
  • the electronic device 1000 may determine a sentence weight for each sentence based on a sentence weight element included in the pre-processed sentence.
  • the sentence weight of another sentence adjacent to the sentence including a sentence weight element may also be changed. For example, when two consecutive sentences are connected by the but conjunction, which is a sentence weight element 804 (or when the later of two consecutive sentences includes the but conjunction, which is a sentence weight element), the sentence weight of the preceding pre-processed sentence may be decreased by 0.4, and the sentence weight of the sentence connected after the but conjunction may be increased by 0.4.
  • for negative phrases, the element details 806 may include not and isn't.
  • the adverb may include positive adverb and negative adverb, positive adverb may include absolutely or remarkably, and negative adverb may include at least one of scarcely or hardly.
  • the present invention is not limited thereto.
  • the punctuation mark may include an exclamation mark, a question mark, and a comma, and may further include other punctuation marks inserted at some positions in other sentences to emphasize or assist the meaning of the sentence.
  • the conjunction may include but, however, and the like, but is not limited thereto.
  • the emphasis phrase may include very, etc.
  • the negative word may include never, so, and without.
  • the person word may include the person's name shown in the person weight list shown in FIG. 3 .
  • FIG. 9 is a flowchart illustrating a method for an electronic device to determine a comprehensive evaluation score based on a first news score and a second news score, according to an embodiment.
  • the electronic device 1000 may normalize the second news score output from the pre-learning model.
  • the distribution pattern of the scores of the second news score obtainable from the pre-learning model by the electronic device 1000 may be different from the distribution pattern of the scores of the first news score obtainable from the neural network model.
  • the electronic device 1000 may normalize the second news score output from the pre-learning model according to the score distribution pattern of the first news score, thereby making the score distribution pattern similar.
  • the electronic device 1000 may normalize the second news score to a score between -1 and 1.
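The normalization step above can be sketched as follows. The text only states the target range of -1 to 1 and that the normalization follows the score distribution of the first news score; min-max scaling over an observed score range is one possible method, assumed here for illustration.

```python
def normalize(score, observed_min, observed_max):
    """Map a raw second-news-score onto [-1, 1] via min-max scaling
    (one possible normalization; the method itself is an assumption)."""
    if observed_max == observed_min:
        return 0.0
    return 2 * (score - observed_min) / (observed_max - observed_min) - 1

print(normalize(7.5, 0.0, 10.0))  # 0.5
```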
  • the electronic device 1000 may determine an evaluation weight to be applied to the first news score and the second news score based on the neutral index value obtained from the prior learning model. For example, the electronic device 1000 may further acquire a neutral index from the pre-learning model in addition to acquiring the second news score for the news article. According to another embodiment, in addition to acquiring the second news score, the electronic device 1000 may further acquire a neutral index, a negative index, and a positive index.
  • according to an embodiment, when the neutral index is less than a first threshold, the electronic device 1000 may decrease the first evaluation weight and increase the second evaluation weight. According to another embodiment, when the neutral index is greater than or equal to the second threshold, the electronic device 1000 may determine to increase the first evaluation weight and decrease the second evaluation weight.
  • the electronic device 1000 may determine the overall evaluation score by weighting the first news score and the normalized second news score according to the determined evaluation weights. According to an embodiment, the electronic device 1000 may determine different evaluation weights to be applied to the first news score and the second news score based on the range of the neutral index value output from the pre-learning model, and may apply the differently determined weights to the first news score and the second news score. Referring to FIG. 10 described later, a method in which the electronic device 1000 determines the evaluation weights to be applied to the first news score and the second news score based on the neutral index value and weights the first news score and the second news score according to the determined evaluation weights will be described in more detail.
  • the neutral index, negative index, and positive index obtained by the electronic device 1000 from the pre-learning model may be determined based on a distribution pattern of word scores indicated by words included in each of the partial sentences extracted from the news article.
  • the sum of the neutral index, the negative index, and the positive index may be 1, but is not limited thereto.
  • the positive index may be output large when words containing a positive word score (positive finance words) are distributed more in the news article than words containing a negative word score (negative finance words), and the negative index may be output large when there are more words including a negative word score (negative finance words) in the news article than words including a positive word score (positive finance words).
  • when the number of sentences having a positive sentence score in a news article is greater than the number of sentences having a negative sentence score, the positive index may be output large, and when the number of sentences having a negative sentence score in the news article is greater, the negative index may be output large.
  • the neutral index may be a value representing the degree of confidence in the second news score output from the pre-learning model as a probability value. For example, when the neutral index is lower than a preset threshold, the second news score may be reliable with high probability, but when the neutral index is greater than the preset threshold, the confidence probability of the second news score may be low.
  • when the distribution of sentence scores for each of the partial sentences (eg, sentences in summary data) extracted from a news article is distributed so that negative sentence scores and positive sentence scores can be clearly and easily classified, the neutral index may be determined to be low.
  • when the distribution of sentence scores for each of the partial sentences extracted from a news article is not clearly divided into negative sentence scores and positive sentence scores, the electronic device may determine the neutral index to be high. According to an embodiment, when the neutral index is large, the reliability of the second news score output from the pre-learning model may be low, and when the neutral index is small, the reliability of the second news score output from the pre-learning model may be high.
  • the neutral index may be determined based on some sentence units extracted from the news article.
  • the neutral index may be output small when a sentence unit contains more words having a positive score than words having a negative score, or more words having a negative score than words having a positive score. Conversely, the neutral index may be output large when the number of words having a positive score and the number of words having a negative score in a sentence unit are similar, or when there is no sentence including a preset financial word or person word.
  • FIG. 10 is a flowchart illustrating a method for an electronic device to analyze a news article according to another exemplary embodiment.
  • the electronic device 1000 may identify a sentence including a finance word or a person word stored in advance in a news article. Since S1010 may correspond to S210 of FIG. 2 , a detailed description thereof will be omitted.
  • the electronic device 1000 may obtain a first news score from the neural network model by inputting some of the identified sentences into the neural network model trained in advance to output a score for the news article. Since S1012 may correspond to S220, a detailed description thereof will be omitted.
  • the electronic device 1000 may obtain a second news score and a neutral index from the pre-learning model by inputting some of the identified sentences to the pre-learning model.
  • the electronic device 1000 may obtain a second news score, a positive index, a negative index, and a neutral index from the prior learning model by inputting some of the identified sentences into the prior learning model.
  • the operation of the electronic device 1000 to obtain the second news score using the pre-learning model may correspond to S610 to S650 of FIG. 6 , and thus a detailed description thereof will be omitted.
  • according to an embodiment, the comprehensive evaluation score may be determined as: comprehensive evaluation score = first evaluation weight * neutral index value * first news score + second evaluation weight * (1 - neutral index value) * normalized second news score.
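The combination rule above, with the first news score weighted by the neutral index and the normalized second news score by its complement, can be sketched as follows; the evaluation weights default to 1.0 here purely for illustration, since the text determines them from threshold comparisons on the neutral index.

```python
def comprehensive_score(first_score, second_score_norm, neutral,
                        w1=1.0, w2=1.0):
    # comprehensive = w1 * neutral * first + w2 * (1 - neutral) * normalized second
    return w1 * neutral * first_score + w2 * (1 - neutral) * second_score_norm

# A high neutral index lets the neural-network (first) news score dominate.
print(round(comprehensive_score(0.8, -0.2, 0.9), 2))  # 0.7
```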
  • FIG. 11 is a diagram schematically illustrating a process of learning a news article analysis model according to an exemplary embodiment.
  • the electronic apparatus 1000 may obtain the news article 101a from an external device connected to the electronic apparatus, and may generate learning data from the obtained news article.
  • the electronic device 1000 may obtain text data, image data, or video data regarding the news article 101a from an external device, and may generate the training data 106a by using the obtained data on the news article.
  • the electronic device 1000 may generate training data 106a and verification data 108a using the news article 101a obtained from the external device.
  • the electronic device 1000 may obtain a news article and generate the training data 106a and the verification data 108a at a preset ratio by using text data for the obtained news article. According to an embodiment, the electronic device 1000 may determine a first learning-verification ratio 102a or a second learning-verification ratio 104a indicating a generation ratio of the training data and the verification data, and may generate the training data and the verification data according to the determined learning-verification ratio.
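The ratio-based split described above can be sketched as follows. The 0.8 learning-verification ratio is an assumed example value; the text leaves the ratio configurable.

```python
def split_dataset(articles, train_ratio=0.8):
    """Split articles into training and verification data at a preset ratio."""
    cut = int(len(articles) * train_ratio)
    return articles[:cut], articles[cut:]

train, verify = split_dataset([f"article_{i}" for i in range(10)])
print(len(train), len(verify))  # 8 2
```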
  • the electronic device 1000 may train the news article analysis model 110a based on the generated training data. According to an embodiment, the electronic device 1000 may learn the news article analysis model 110a based on the training data and the verification data.
  • the news article analysis model 110a according to the present disclosure may include an artificial intelligence model that can be learned according to an artificial intelligence learning algorithm, or a neural network model.
  • the neural network model may include a deep neural network (DNN), for example, a Convolutional Neural Network (CNN), a Deep Neural Network (DNN), a Recurrent Neural Network (RNN), a Restricted Boltzmann Machine (RBM), a Deep Belief Network (DBN), a Bidirectional Recurrent Deep Neural Network (BRDNN), or deep Q-networks, but is not limited thereto.
  • the electronic device 1000 may automatically determine a score for a news article using the learned neural network model.
  • the electronic device 1000 is equipped with an AI program for learning at least one news article analysis model and processing data related to a news article using the learned news article analysis model, and may be a smartphone, tablet PC, smart TV, mobile phone, media player, server, micro server, or other mobile or non-mobile computing device having a voice recognition function, but is not limited thereto.
  • the electronic device 1000 may learn a news article analysis model by interworking with the server 2000, and may determine a comprehensive evaluation score for the news article 101a by using the learned news article analysis model.
  • the electronic device 1000 may include a network interface capable of communicating with the server 2000 .
  • the server 2000 may include other computing devices capable of transmitting and receiving data to and from the electronic device by being connected to the electronic device 1000 through a network interface.
  • the server device 2000 may be a Wearable Business Management Server (W-BMS) for managing the wearable device.
  • the network connecting the server 2000 and the electronic device 1000 is a data communication network in a comprehensive sense that includes a local area network (LAN), a wide area network (WAN), a value added network (VAN), a mobile radio communication network, a satellite communication network, and combinations thereof, enables each network constituent entity shown in FIG. 11 to communicate smoothly with each other, and may include the wired Internet, the wireless Internet, and a mobile wireless communication network.
  • FIG. 12 is a flowchart of a method for training a news article analysis model according to an embodiment.
  • the electronic device 1000 may store preset financial words and person words. According to an embodiment, the electronic device 1000 may generate a financial word list by matching a financial word and a word score for each financial word, and may store the generated financial word list. Also, the electronic device 1000 may generate a person weight list by matching the person word and the person word weight for each person word, and store the generated person weight list. According to an embodiment, the person word may be predetermined to be related to the field of finance. According to an embodiment, the person word may be determined based on whether the number of publications of the person word in the published financial field article is equal to or greater than a predetermined threshold during a preset period.
  • the electronic device 1000 may obtain a news article from an external device to which the electronic device is connected.
  • the electronic apparatus 1000 may obtain data about a news article from an external device or server connected to the electronic apparatus through the Internet.
  • the electronic device 1000 may identify a sentence including a financial word or a person word in a news article. For example, the electronic device 1000 may identify a plurality of sentences including at least one of a finance word in a pre-stored financial word list or a person word in a person weight list.
  • the electronic device 1000 may extract a preset number of sentences from among the identified plurality of sentences. For example, the electronic device 1000 may extract a predetermined sentence from among the identified sentences based on location information of the identified sentences. A detailed method in which the electronic device 1000 identifies a sentence including at least one of a pre-stored financial word or a person word and extracts a part of the identified sentence will be described with reference to FIG. 13 to be described later.
  • the electronic device 1000 may generate learning data for learning the news article analysis model by pre-processing some of the sentences extracted in step S240a. For example, the electronic device 1000 may generate learning data and verification data by using some of the extracted sentences. An operation of the electronic device 1000 to generate learning data by pre-processing some extracted sentences will be described in detail with reference to FIGS. 14 and 15 .
  • FIG. 13 is a diagram for describing a process in which an electronic device extracts a preset number of sentences from among sentences identified in a news article, according to an embodiment.
  • the electronic device 1000 may identify location information of each of the sentences included in the news article. For example, the electronic device 1000 may identify a sentence including at least one of a preset financial word or a person word in a news article, and may also identify location information regarding where the identified sentence is located in the news article. According to an embodiment, the electronic device 1000 may determine the location information by identifying the number of a sentence that includes at least one of a predetermined financial word or a person word in the obtained news article.
  • the electronic device 1000 may identify the first sentence in the news article and the last sentence in the news article based on the identified position information of the sentences. For example, the electronic device 1000 may identify where the sentences identified from the news article are located in the news article by using the location information. Accordingly, the electronic device 1000 may identify the first sentence and the last sentence of the news article.
  • the electronic device 1000 may extract a preset number of sentences to be scored from among the sentences including the financial word or the person word, the first sentence in the news article, and the last sentence in the news article.
  • the electronic device 1000 may preset the number of sentences to be extracted from a news article.
  • the electronic device 1000 may extract some of the identified sentences so that the number of sentences to be extracted including the first sentence and the last sentence in the news article among the identified sentences in the news article becomes 5.
  • the present invention is not limited thereto, and the number of sentences to be extracted from the news article by the electronic device 1000 may vary.
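The extraction described above (a preset number of sentences that always includes the article's first and last sentences) can be sketched as follows. Filling the remaining slots with the earliest identified sentences is an assumption for illustration; the text fixes only the count and the inclusion of the first and last sentences.

```python
def extract_sentences(identified, first, last, n=5):
    """Sketch: keep the first and last sentences, then fill up to n
    sentences with the earliest remaining identified sentences."""
    picked = [first]
    for s in identified:
        if len(picked) >= n - 1:
            break
        if s not in (first, last):
            picked.append(s)
    picked.append(last)
    return picked

sents = [f"s{i}" for i in range(1, 7)]                # sentences 1..6
print(extract_sentences(sents, sents[0], sents[-1]))  # ['s1', 's2', 's3', 's4', 's6']
```

With six identified sentences this selects sentences 1 to 4 and sentence 6, matching the summary example described with reference to FIG. 15.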
  • the electronic device 1000 may generate learning data by using a preset number of extracted sentences. For example, the electronic device 1000 may generate summary data for all news articles by encoding a preset number of extracted sentences, and may generate the generated summary data as training data. According to another embodiment, the electronic device 1000 may generate a preset number of sentences themselves as summary data, and use the generated summary data as learning data.
  • FIG. 14 is a flowchart illustrating a process in which an electronic device pre-processes a preset number of sentences, according to an embodiment.
  • the electronic device 1000 may generate learning data by extracting some sentences from among the identified sentences in the news article and pre-processing the extracted partial sentences.
  • the electronic device 1000 pre-processes some extracted sentences will be described in detail with reference to FIG. 14 .
  • the electronic device 1000 may identify each word in the sentence by tokenizing the extracted partial sentences according to the method illustrated in FIG. 13 .
  • the electronic device 1000 may identify words in the extracted sentences by further performing a process of tokenizing the tokenized words in the partial sentences.
  • the electronic device 1000 may tokenize the sentences by decomposing the extracted sentences into grammatically indivisible units.
  • the electronic device 1000 may tokenize the extracted sentences based on at least one of punctuation marks and spaces in the sentences, and may identify words from the tokenized sentences.
  • the electronic device 1000 may remove words not used for calculating sentence scores among the identified words from the extracted partial sentences. For example, the electronic device 1000 may generate a list of words that are not necessary for calculating the score of the sentence among the words identified by tokenizing the sentence and, when words included in the generated word list are identified, may remove the corresponding words.
  • the electronic device 1000 may remove words not used in calculating sentence scores and convert the remaining words in each sentence into a headword form. For example, when the remaining words after removal are not in the headword form described in the dictionary (eg, looks), the electronic device 1000 may convert the remaining words into the headword form described in the dictionary (eg, look).
  • the electronic device 1000 may generate learning data by using some sentences including words converted into a headword form. That is, the electronic device 1000 according to the present disclosure identifies a sentence including at least one of a financial word and a person word from a news article, extracts a preset number of sentences from among the identified sentences, and pre-processes the extracted sentences. By doing so, training data can be generated.
  • FIG. 15 is a diagram for describing a process in which an electronic device generates learning data and verification data from a news article, according to an exemplary embodiment.
• the electronic device 1000 may obtain a news article and identify each sentence in the obtained news article. For example, the electronic device 1000 may identify a sentence including a preset financial word and a person word from a news article. For example, the electronic device 1000 may obtain a news article 702a and identify sentences 1 to 6, each including at least one of a predetermined financial word or a person word, in the obtained news article 702a.
  • the electronic device 1000 may identify a sentence included in a news article, and identify location information in each of the identified sentences.
• the electronic device 1000 may identify the ordinal position of each identified sentence in the news article.
  • the electronic device 1000 may identify a first sentence in a news article and a last sentence in a news article based on location information of the identified sentence.
• the electronic device 1000 may extract predetermined sentences from among the sentences identified in the news article. For example, the electronic device 1000 may extract sentences 1 to 4 and sentence 6 from among the sentences identified in the news article 702a. The electronic device 1000 may generate a summary news article 706a by extracting some sentences from among the sentences including at least one of a predetermined financial word or a person word in the news article and combining the extracted partial sentences.
• the electronic device 1000 may extract a preset number of sentences to be scored from among the sentences including the financial word or the person word, the first sentence in the news article, and the last sentence in the news article, and may generate a summary news article 706a using the extracted preset number of sentences.
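As a rough sketch of this extraction step, assuming hypothetical keyword lists in place of the pre-stored financial word list and person weight list: the first and last sentences are always candidates, sentences containing a stored keyword are added, and the result is capped at the preset count.

```python
import re

# Hypothetical keyword lists; the actual lists are pre-stored by the device.
FINANCE_WORDS = {"stock", "earnings", "revenue"}
PERSON_WORDS = {"ceo", "analyst"}

def extract_summary(sentences: list[str], max_sentences: int = 5) -> list[str]:
    """Select the first and last sentences plus any sentence containing a
    financial or person word, keeping at most a preset number in order."""
    selected = set()
    if sentences:
        selected.add(0)                   # first sentence in the article
        selected.add(len(sentences) - 1)  # last sentence in the article
    for i, s in enumerate(sentences):
        words = set(re.findall(r"[a-z]+", s.lower()))
        if words & (FINANCE_WORDS | PERSON_WORDS):
            selected.add(i)
    # Keep at most the preset number of sentences, in original order.
    return [sentences[i] for i in sorted(selected)][:max_sentences]
```

The selected sentences would then form the summary article passed to pre-processing.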
• the electronic device 1000 may perform the pre-processing process described above with reference to FIG. 14 on the sentences in the summary news article 706a, and then determine a sentence weight for each pre-processed sentence by using the sentence weight factors described with reference to FIG. 14.
  • the electronic device 1000 may generate the learning data 708a by allocating the determined sentence weight to each pre-processed sentence. That is, the learning data 708a generated by the electronic device 1000 may include at least one pre-processed sentence, and a sentence weight for each sentence may be matched to each sentence and stored together.
  • the electronic device 1000 may generate the training data 708a from the news article and also generate verification data 710a for verifying the training data.
• the electronic device 1000 may generate learning data by using news articles published during a preset period, and may also generate verification data by using news articles published during a period different from the publication period of the news articles used to generate the learning data.
• the electronic device 1000 may train the news article analysis model so that the difference between the output value output from the news article analysis model trained based on the training data and the output value output from the news article analysis model when the verification data is input becomes smaller.
• the electronic device 1000 may define a loss function for the difference between the value output from the news article analysis model based on the training data and the output value output from the news article analysis model when the verification data is input, and may train the news article analysis model so that the defined loss function is minimized.
  • a news article used by the electronic device 1000 to generate the training data 708a and a news article used to generate the verification data 710a may be different news articles.
• the electronic device 1000 may generate the training data 708a from news articles published during a first period, and may generate the verification data 710a using news articles published during a second period different from the first period.
• FIG. 16 is a flowchart of a method of analyzing a news article by a news article analysis system, according to an exemplary embodiment.
  • the electronic apparatus 1000 may obtain a news article from an external device connected to the electronic apparatus 1000 .
  • the electronic device 1000 may acquire information about a news article in at least one of a text format, an image format, and a video format.
  • the electronic device 1000 may transmit the obtained news article to the server.
  • the server 2000 may identify a sentence including a finance word or a person word stored in advance in a news article obtained from the electronic device 1000 .
  • the server 2000 may generate a financial word list by matching the financial word and the word score for each financial word, and may store the generated financial word list.
  • the server 2000 may generate a person weight list by matching the person word and the person word weight for each person word, and store the generated person weight list.
  • the person word may be predetermined to be related to the field of finance.
  • the person word may be determined based on whether the number of publications of the person word in the published financial field article is equal to or greater than a predetermined threshold during a preset period.
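The selection criterion above can be sketched as counting, per candidate word, how many financial-field articles published during the period mention it; the candidate names and the threshold below are illustrative assumptions.

```python
from collections import Counter

def select_person_words(candidates: set[str], articles: list[str],
                        threshold: int = 2) -> set[str]:
    """Keep candidate person words mentioned in at least `threshold`
    published financial-field articles during the preset period."""
    counts = Counter()
    for article in articles:
        text = article.lower()
        for word in candidates:
            if word in text:
                counts[word] += 1  # one count per article mentioning the word
    return {w for w in candidates if counts[w] >= threshold}
```

A word dropped here would simply not appear in the person weight list for the next cycle.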
• the financial word list and the person weight list may be pre-stored in the electronic device 1000, and in this case, the electronic device 1000 may also directly identify a sentence including a financial word or a person word from a news article.
  • the server 2000 may obtain the first news score by inputting some of the identified sentences to the neural network model.
• the server 2000 may store a neural network model pre-trained to output a score for a news article, and may obtain a first news score for the news article by using the pre-trained neural network model.
  • the server 2000 may obtain a second news score by inputting some of the identified sentences to the pre-learning model.
• the pre-learning model may identify predetermined words in an article by using a dictionary, and output a second news score based on word scores assigned to the identified words.
• the server 2000 may determine a comprehensive evaluation score regarding the financial propensity of the news article based on the first news score and the second news score. For example, the server 2000 may obtain, from the pre-learning model, a neutral index, which is a probability value regarding the degree of confidence in the second news score, in addition to the second news score. The server 2000 may determine evaluation weights to be applied to the first news score and the second news score based on the neutral index value, and may determine the comprehensive evaluation score by weighted-summing the first news score and the second news score according to the determined evaluation weights.
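A minimal sketch of this blending step (the parameter names are assumptions, not terms from the disclosure): the neutral index shifts weight between the neural-network score and the pre-learning-model score.

```python
def comprehensive_score(first_score: float, second_score: float,
                        neutral_index: float,
                        w1: float = 1.0, w2: float = 1.0) -> float:
    """Blend the neural-network score (first) and the pre-learning-model
    score (second) by the neutral index; a high neutral index (low
    confidence in the second score) shifts weight toward the first.
    w1 and w2 stand in for the optional evaluation weights."""
    return (neutral_index * w1 * first_score
            + (1 - neutral_index) * w2 * second_score)
```

With a neutral index of 1 the comprehensive score reduces to the weighted first news score alone, matching the intent that a low-confidence dictionary score contributes less.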
  • the server 2000 may transmit information on the determined comprehensive evaluation score to the electronic device 1000 .
  • the electronic device 1000 may output a comprehensive evaluation score based on information about the comprehensive evaluation score received from the server.
• the electronic device 1000 may output the comprehensive evaluation score in an audio format, or may output it through a display (not shown) of an output unit (not shown) of the electronic device.
• FIG. 17 is a flowchart of a method of analyzing a news article by a news article analysis system, according to another exemplary embodiment.
• since S302b to S310b of FIG. 17 may correspond to S202b to S210b of FIG. 16, a detailed description thereof will be omitted.
  • the server 2000 may transmit information on the first news score and the second news score to the electronic device 1000 .
  • the electronic device 1000 may determine a comprehensive evaluation score regarding the financial propensity of the news article based on the first news score and the second news score.
• S312b may correspond to S212b of FIG. 16. That is, in the case of FIG. 16, the server 2000 determines the comprehensive evaluation score using the first news score output from the neural network model and the second news score output from the pre-learning model, whereas in FIG. 17, the electronic device 1000 may determine the comprehensive evaluation score using the information about the first news score and the second news score obtained from the server 2000. In S316b, the electronic device 1000 may output the comprehensive evaluation score determined for the news article.
  • FIG. 18 is a flowchart illustrating a method of analyzing a news article by a news article analysis system according to another exemplary embodiment.
  • the electronic device 1000 may store in advance a neural network model and a pre-learning model for analyzing a news article.
  • the server 2000 may store a neural network model corresponding to the neural network model stored in the electronic device.
  • the server 2000 may acquire the news article from the external device according to a preset period.
  • the server 2000 may generate learning data from the obtained news article.
  • the server 2000 may generate training data from periodically acquired news articles and validation data for validating a neural network model trained according to the training data.
  • the server 2000 may generate verification data by using a news article published in a period different from the news article used to generate the learning data.
• the server 2000 may train the neural network model based on the training data. For example, the server 2000 may modify and update weights in the neural network model based on the training data. According to an embodiment, the server 2000 may modify and update the weights of the neural network model so that the difference between the output value of the neural network model output based on the training data and the output value output from the neural network model when the verification data is input becomes smaller.
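One way to read this objective is a loss that combines the ordinary training error with a penalty on the gap between the model's outputs on the training and verification data. The sketch below uses a one-weight linear model and a numeric gradient purely for illustration; the actual model, loss, and optimizer are not specified at this level of detail in the disclosure.

```python
def loss(w: float, train, valid, lam: float = 0.5) -> float:
    """Training MSE plus a penalty on the gap between the model's mean
    output on the training data and on the verification data."""
    train_mse = sum((w * x - y) ** 2 for x, y in train) / len(train)
    mean_train = sum(w * x for x, _ in train) / len(train)
    mean_valid = sum(w * x for x, _ in valid) / len(valid)
    return train_mse + lam * (mean_train - mean_valid) ** 2

def step(w: float, train, valid, lr: float = 0.05, eps: float = 1e-6) -> float:
    """One gradient-descent update using a central-difference gradient."""
    g = (loss(w + eps, train, valid) - loss(w - eps, train, valid)) / (2 * eps)
    return w - lr * g

train = [(1.0, 2.0), (2.0, 4.0)]   # toy (input, target) pairs, target = 2x
valid = [(1.5, 3.0)]
w = 0.0
for _ in range(200):
    w = step(w, train, valid)
# w converges toward 2.0, the weight fitting both data sets
```

The same structure applies per-weight to a real neural network, with the difference penalty acting as a regularizer against diverging behavior on the verification set.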
  • the server 2000 may transmit information on the weight of the modified and updated neural network model to the electronic device 1000 .
  • the electronic device 1000 may update the neural network model previously stored in the electronic device based on the weight information received from the server 2000 .
  • the electronic device 1000 may convert the weights of the neural network model stored in advance in the electronic device based on information about the weights received from the server 2000 .
  • the electronic apparatus 1000 may obtain a new news article from an external device.
  • the electronic device 1000 may identify a sentence including a financial word or a person word stored in advance in the news article.
  • the electronic device 1000 may obtain a first news score by inputting some of the identified sentences to the neural network model.
• the electronic device 1000 may obtain a second news score by inputting some of the identified sentences to the pre-learning model.
  • the electronic device 1000 may determine a comprehensive evaluation score based on the first news score and the second news score.
  • the electronic device 1000 may output the determined overall evaluation score.
• FIG. 19 is a block diagram of an electronic device according to an embodiment.
• the electronic device 1000 may include a processor 1400 and a memory 1402. However, not all illustrated components are essential components; the electronic device 1000 may be implemented with more components than the illustrated components, or may be implemented with fewer components. According to an embodiment, the electronic device 1000 may further include a network interface (not shown) in addition to the processor 1400 and the memory 1402.
  • the processor 1400 generally controls the overall operation of the electronic device 1000 .
  • the processor 1400 executes programs stored in the memory 1402 to perform the functions of the electronic device 1000 described in FIGS. 1 to 18 .
• the processor 1400 may be composed of one or a plurality of processors, and the one or more processors may include a general-purpose processor such as a CPU, an AP, or a DSP (digital signal processor), a graphics-only processor such as a GPU, or an artificial intelligence (AI)-dedicated processor.
  • the processor 1400 may include other processing units that perform a function for analyzing news articles by executing instructions stored in the memory.
• when the processor 1400 includes a general-purpose processor, an artificial intelligence processor, and a graphics-only processor, the artificial intelligence processor may be implemented as a chip separate from the general-purpose processor or the graphics-only processor.
• when the processor 1400 is implemented as a plurality of processors, graphics-only processors, or artificial intelligence-only processors, at least some of them may be mounted on the electronic device 1000, on another electronic device connected to the electronic device 1000, or on a server.
• the processor 1400 may execute programs stored in the memory 1402 to identify a sentence including a financial word or a person word stored in advance in the news article, obtain a first news score from a neural network model by inputting some of the identified sentences into the neural network model trained to output a score for the news article, obtain a second news score from a pre-learning model by inputting some of the identified sentences into the pre-learning model, and determine a comprehensive evaluation score regarding the financial propensity of the news article based on the first news score and the second news score.
• the processor 1400 may generate a financial word list by matching the financial word and a word score for each financial word, generate a person weight list by matching the person word and a person word weight for each person word, and store the generated financial word list and person weight list.
• the processor 1400 may identify position information of each of the sentences included in the news article, identify the first sentence in the news article and the last sentence in the news article based on the position information of the identified sentences, and extract a preset number of sentences to be scored from among the sentences including the financial word or the person word, the first sentence, and the last sentence.
• the processor 1400 may pre-process the extracted preset number of sentences, input the pre-processed sentences to the neural network model, and obtain a first news score output from the neural network model.
• the processor 1400 may pre-process the extracted preset number of sentences, determine a sentence score of each of the pre-processed sentences based on a sentence weight determined for each of the pre-processed sentences, adjust the determined sentence score for each of the pre-processed sentences based on the identified position information, and obtain the second news score using the adjusted sentence scores.
• the processor 1400 may identify each word in a sentence by tokenizing the extracted preset number of sentences, remove words not used in calculating the sentence score from among the identified words, and convert the words remaining in each sentence into a headword form.
• the processor 1400 may execute instructions stored in the memory 1402 to store preset financial words and person words, obtain a news article from an external device to which the electronic device is connected, identify a sentence including the financial word or the person word in the news article, extract a preset number of sentences from among the identified sentences, and generate training data for training the news article analysis model by pre-processing the extracted sentences.
  • the processor 1400 may train the news article analysis model based on the generated training data.
• the processor 1400 may generate verification data from other news articles published in a period different from that of the news article, and may train the news article analysis model based on the generated training data and verification data.
• the processor 1400 may train the news article analysis model so that the difference between the output value output from the news article analysis model trained based on the learning data and the output value output from the news article analysis model when the verification data is input becomes smaller.
• the processor 1400 may generate a financial word list by matching the financial word and the word score for each financial word, generate a person weight list by matching the person word and the person word weight for each person word, store the generated financial word list and person weight list, and modify and update the financial word list and the person weight list at a preset cycle.
• the processor 1400 may identify position information of each of the sentences included in the news article, identify the first sentence in the news article and the last sentence in the news article based on the position information of the identified sentences, and extract a preset number of sentences to be scored from among the sentences including the financial word or the person word, the first sentence, and the last sentence.
  • the processor 1400 may pre-process some extracted sentences from among the identified sentences in the news article, and generate the learning data using the pre-processed sentences.
• the processor 1400 may identify each word in a sentence by tokenizing the extracted sentences, remove words not used in calculating the sentence score from among the identified words, and convert the words remaining in each sentence into a headword form.
• the processor 1400 may determine a sentence weight for each of the pre-processed sentences based on at least one of a negative phrase, an adverb, a punctuation mark, an emphasized phrase, a negative word, or the person word in the pre-processed sentence, and may generate the learning data by assigning the determined sentence weight to each of the pre-processed sentences.
  • the processor 1400 receives information on the weight of the neural network model from an external device connected to the electronic device through a network interface, and based on the received information on the weight of the neural network model, the electronic device It is also possible to modify and update the neural network model stored in .
  • the processor 1400 may acquire a news article from an external device connected to the electronic apparatus 1000 by executing programs stored in the memory 1402 . Also, the processor 1400 may transmit the obtained news article to a server connected to the electronic device.
• the processor 1400 may determine a comprehensive evaluation score regarding the financial propensity of the news article based on the first news score and the second news score received from the server 2000, and may output the determined comprehensive evaluation score through an output unit (not shown) of the electronic device 1000.
• the processor 1400 may independently identify a sentence including a financial word or a person word stored in advance in a news article, and input some of the identified sentences into a neural network model trained to output a score for the news article.
• when the financial word list including the financial word and the person weight list including the person word are modified and updated, the processor 1400 may transmit the modified and updated financial word list and person weight list to the server.
• the processor 1400 may generate a financial word list by matching the financial word and the word score for each financial word, generate a person weight list by matching the person word and the person word weight for each person word, and store the generated financial word list and person weight list.
• the processor 1400 may identify position information of each of the sentences included in the news article, identify the first sentence in the news article and the last sentence in the news article based on the position information of the identified sentences, and extract a preset number of sentences to be scored from among the sentences including the financial word or the person word, the first sentence, and the last sentence.
  • the processor 1400 may transmit a news article to the server 2000 and receive information about a preset number of sentences extracted from the news article from the server.
  • the processor 1400 may pre-process a preset number of extracted sentences, input the pre-processed sentences to the neural network model, and obtain a first news score output from the neural network model.
  • the processor 1400 may receive the first news score output from the server 2000 using the neural network model stored in the server 2000 .
• the processor 1400 may pre-process the extracted preset number of sentences, determine a sentence score of each of the pre-processed sentences based on a sentence weight determined for each of the pre-processed sentences, adjust the determined sentence score for each of the pre-processed sentences based on the identified position information, and obtain the second news score using the adjusted sentence scores.
  • the processor 1400 may receive the second news score output by the pre-learning model from the server 2000 .
• the processor 1400 may further receive, from the server 2000, information on the second news score, a neutral index, a positive index, or a negative index output from the pre-learning model.
• the processor 1400 may identify each word in a sentence by tokenizing the extracted preset number of sentences, remove words not used in calculating the sentence score from among the identified words, and convert the words remaining in each sentence into a headword form.
  • the memory 1402 may store a program for processing and control of the processor 1400 , and may also store data input to or output from the electronic device 1000 .
  • the memory 1402 may include a plurality of news article analysis models including a pre-trained model and a neural network model.
  • the memory 1402 may store information about the layers constituting the neural network, the nodes included in the layers, and weights related to the connection strength of the layers as a configuration of the neural network model.
  • the memory 1402 may further store the financial word list described above with reference to FIG. 3 and the financial word list including word scores for each financial word. Also, the memory 1402 may further store a person weight list including preset person words and person word weights for each person word.
  • the memory 1402 may store the modified and updated pre-learning model and the neural network model.
• the memory 1402 may include at least one type of storage medium among a flash memory type, a hard disk type, a multimedia card micro type, a card type memory (e.g., SD or XD memory), RAM (random access memory), SRAM (static random access memory), ROM (read-only memory), EEPROM (electrically erasable programmable read-only memory), PROM (programmable read-only memory), magnetic memory, a magnetic disk, and an optical disk.
• however, the present disclosure is not limited thereto, and the memory may be another storage medium for storing information on at least one artificial intelligence model for analyzing the news article and instructions for performing a method of analyzing other news articles.
  • a network interface may transmit data transmitted/received by the electronic device 1000 to/from an external device or server.
  • the electronic apparatus 1000 may obtain data about a news article from an external device through a network interface.
  • the electronic device 1000 may receive information on the neural network model or the pre-learning model from an external device through a network interface.
  • the electronic device 1000 may output the comprehensive evaluation score to an external device as a result of analysis on the news article through the network interface.
• FIG. 20 is a block diagram of a server according to an embodiment.
  • the server 2000 may include a network interface 2100 , a database 2200 , and a processor 2300 .
  • the network interface 2100 may correspond to the above-described network interface (not shown) of the electronic device 1000 .
  • the network interface 2100 may receive information about a neural network model or information about a pre-learning model from the electronic device 1000 .
  • the network interface 2100 may receive information about layers and nodes included in the layers of the artificial neural network or weight values related to connection strength of layers in the neural network.
• the network interface 2100 may receive data about a news article from the electronic device 1000 or from an external device different from the electronic device. In addition, the network interface 2100 may receive information about a news article from the electronic device, or may transmit information about the first news score and the second news score determined by the server 2000 or a comprehensive evaluation score for the news article. According to an embodiment, the network interface 2100 may transmit information on the news article analysis model or the neural network model learned in advance by the server 2000 to the electronic device 1000.
  • the database 2200 may correspond to the memory 1402 of the electronic device shown in FIG. 11 .
  • the database 2200 may store a program for processing and controlling the processor 2300 , and may also store data input to or output from the electronic device 1000 .
• the database 2200 may further store information about the layers constituting the artificial neural network, the nodes included in the layers, and weights related to the connection strength of the layers, as well as information about the pre-learning model.
  • the database 2200 may further store information on news articles received from an electronic device or an external device connected to the electronic device.
  • the processor 2300 may control overall operations of devices in the server 2000 . According to an embodiment, the processor 2300 may also perform at least some of the operations performed by the electronic device 1000 described with reference to FIGS. 1 to 18 .
• the processor 2300 may acquire a news article from the electronic device, identify a sentence including a financial word or a person word stored in advance in the acquired news article, and obtain a first news score from a neural network model by inputting some of the identified sentences into the neural network model trained to output a score for the news article.
• the processor 2300 may obtain a second news score from the pre-learning model by inputting some of the identified sentences to the pre-learning model, and may transmit the first news score and the second news score to the electronic device.
• the processor 2300 may determine a comprehensive evaluation score regarding the financial propensity of the news article received from the electronic device based on the first news score and the second news score, and may transmit information on the determined comprehensive evaluation score to the electronic device.
• the processor 2300 may generate a financial word list by matching a financial word and a word score for each financial word, generate a person weight list by matching a person word and a person word weight for each person word, and store the generated financial word list and person weight list.
• the processor 2300 may identify position information of each of the sentences included in the news article, identify the first sentence in the news article and the last sentence in the news article based on the position information of the identified sentences, and extract a predetermined number of sentences to be scored from among the sentences including the financial word or the person word, the first sentence, and the last sentence.
  • the processor 2300 may pre-process a preset number of extracted sentences, input the pre-processed sentences to the neural network model, and obtain a first news score output from the neural network model.
• the processor 2300 may pre-process the preset number of extracted sentences, determine the sentence score of each of the pre-processed sentences based on a sentence weight determined for each of the pre-processed sentences, adjust the determined sentence score for each of the pre-processed sentences based on the identified position information, and obtain the second news score using the adjusted sentence scores.
• the processor 2300 may identify each word in a sentence by tokenizing the extracted preset number of sentences, remove words not used for calculating the sentence score from among the identified words, and convert the words remaining in each sentence into a headword form.
• the processor 2300 may determine a sentence weight for each of the pre-processed sentences based on at least one of a negative phrase, an adverb, a punctuation mark, an emphasized phrase, a negative word, or the person word in the pre-processed sentence, and may determine the sentence score of each of the pre-processed sentences by applying the determined sentence weight to the word score of each financial word in the pre-processed sentence.
• the processor 2300 may identify the contexts between the pre-processed sentences based on the position information of the pre-processed sentences, and may adjust the sentence score determined for each of the pre-processed sentences based on the identified contexts.
• the processor 2300 may obtain, from the pre-learning model, a neutral index regarding the degree of confidence in the second news score, which varies according to a distribution pattern of the word scores of the words included in some of the identified sentences.
• the processor 2300 may determine the comprehensive evaluation score by adding a value obtained by multiplying the neutral index value by the first news score and a value obtained by multiplying the value of 1 minus the neutral index value by the normalized second news score.
• the processor 2300 may determine the comprehensive evaluation score by adding a value obtained by multiplying the neutral index value, a first evaluation weight less than 1, and the first news score, and a value obtained by multiplying the value of 1 minus the neutral index value, a second evaluation weight greater than 1, and the second news score.
• the processor 2300 may determine the comprehensive evaluation score by adding a value obtained by multiplying the neutral index value, a first evaluation weight greater than 1, and the first news score, and a value obtained by multiplying the value of 1 minus the neutral index value, a second evaluation weight less than 1, and the second news score.
  • the processor 2300 may determine the weight to be applied to a sentence including a negative phrase as a negative number; when a positive adverb is included in the pre-processed sentence, it may increase the weight of the sentence including the positive adverb, and when a negative adverb is included in the pre-processed sentence, it may decrease the weight of the sentence including the negative adverb.
  • when a punctuation mark for emphasis is included in the pre-processed sentence, the processor 2300 may increase the weight of the sentence including that punctuation mark, and when a question mark or a comma is included in the pre-processed sentence, it may decrease the weight of the sentence including that punctuation mark.
  • when an emphasis phrase is included in the pre-processed sentence, the processor 2300 may increase the weight of the sentence including the emphasis phrase, and when a negative word is included in the pre-processed sentence, it may increase or decrease the weight of the sentence including the negative word.
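Taken together, the weight rules in the three embodiments above can be sketched as a simple rule chain; the trigger words and multipliers below are illustrative placeholders, not values from the patent:

```python
def sentence_weight(sentence):
    # Hypothetical rule chain mirroring the embodiments: a negation
    # makes the weight negative, adverbs and emphasis punctuation raise
    # or lower its magnitude, and question marks or commas reduce it.
    s = sentence.lower()
    weight = 1.0
    if "not" in s.split() or "never" in s.split():
        weight = -weight        # negative phrase -> negative weight
    if "strongly" in s:
        weight *= 1.5           # positive adverb -> increase
    if "slightly" in s:
        weight *= 0.8           # weakening adverb -> decrease
    if "!" in s:
        weight *= 1.3           # emphasis punctuation -> increase
    if "?" in s or "," in s:
        weight *= 0.7           # question mark or comma -> decrease
    return weight

print(sentence_weight("Profits will not fall!"))
```

Here the negation flips the sign and the exclamation mark then scales the magnitude, so "Profits will not fall!" yields a negative, emphasized weight.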
  • the above-described method according to an embodiment may be implemented in the form of program instructions that can be executed through various computer means and recorded in a computer-readable medium.
  • the computer-readable medium may include program instructions, data files, data structures, etc. alone or in combination.
  • the program instructions recorded on the medium may be specially designed and configured for the present disclosure, or may be known and available to those skilled in the art of computer software.
  • a computer program apparatus including a recording medium storing a program for performing the above-described method is another example of the medium.
  • examples of the computer-readable recording medium include magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD-ROMs and DVDs; magneto-optical media such as floptical disks; and hardware devices specially configured to store and execute program instructions, such as ROM, RAM, and flash memory.
  • examples of the program instructions include not only machine code, such as that generated by a compiler, but also high-level language code that can be executed by the computer using an interpreter or the like.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Databases & Information Systems (AREA)
  • Health & Medical Sciences (AREA)
  • Data Mining & Analysis (AREA)
  • Computational Linguistics (AREA)
  • General Health & Medical Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Strategic Management (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • Marketing (AREA)
  • Software Systems (AREA)
  • Economics (AREA)
  • General Business, Economics & Management (AREA)
  • Tourism & Hospitality (AREA)
  • Evolutionary Computation (AREA)
  • Mathematical Physics (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Human Resources & Organizations (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Primary Health Care (AREA)
  • Accounting & Taxation (AREA)
  • Development Economics (AREA)
  • Finance (AREA)
  • Technology Law (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)
  • Machine Translation (AREA)

Abstract

The present disclosure relates to a method of analyzing a news article, and to an electronic device for performing the method. According to an embodiment, a method of analyzing a news article may include: identifying a sentence including a pre-stored financial word or person word in the news article; providing a partial sentence of the identified sentence as input to a neural network model trained to output a score of the news article, and obtaining a first news score from the neural network model; providing a partial sentence of the identified sentence as input to a pre-trained model, and obtaining a second news score from the pre-trained model; and determining, based on the first news score and the second news score, an overall evaluation score regarding a financial trend of the news article.
PCT/KR2021/006968 2020-06-04 2021-06-03 News positivity level analysis solution and device using a deep learning NLP model WO2021246812A1 (fr)

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
KR1020200067620A KR102466428B1 (ko) 2020-06-04 2020-06-04 Artificial neural network training model and device for news positivity analysis
KR10-2020-0067619 2020-06-04
KR10-2020-0067621 2020-06-04
KR1020200067621A KR102443629B1 (ko) 2020-06-04 2020-06-04 News positivity analysis solution and system using a deep learning NLP model
KR10-2020-0067620 2020-06-04
KR1020200067619A KR102322899B1 (ko) 2020-06-04 2020-06-04 News positivity analysis solution and device using a deep learning NLP model

Publications (1)

Publication Number Publication Date
WO2021246812A1 true WO2021246812A1 (fr) 2021-12-09

Family

ID=78829982

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2021/006968 WO2021246812A1 (fr) 2020-06-04 2021-06-03 News positivity level analysis solution and device using a deep learning NLP model

Country Status (1)

Country Link
WO (1) WO2021246812A1 (fr)


Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20120001053A (ko) * 2010-06-29 2012-01-04 (주)워드워즈 Document sentiment analysis system and method
KR20140133185A (ko) * 2013-05-10 2014-11-19 주식회사 코스콤 Stock price prediction method through analysis of social data, and stock price prediction system applying the same
KR101458004B1 (ko) * 2013-12-26 2014-11-04 주식회사 코스콤 System and method for predicting stock price fluctuations using an artificial neural network model
KR20190102905A (ko) * 2018-02-27 2019-09-04 울산과학기술원 Method for calculating a content rating

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
KIM SUNG-JIN, KIM GUN-WOO, LEE DONG-HO: "A Topic Related Word Extraction Method Using Deep Learning Based News Analysis", KOREA INFORMATION PROCESSING SOCIETY 2017 SPRING CONFERENCE 2017, KOREA INFORMATION PROCESSING SOCIETY, KOREA, 1 April 2017 (2017-04-01) - 27 April 2017 (2017-04-27), Korea, pages 873 - 876, XP055877073 *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114780712A (zh) * 2022-04-06 2022-07-22 科技日报社 News topic generation method and device based on quality evaluation
CN114780712B (zh) * 2022-04-06 2023-07-04 科技日报社 News topic generation method and device based on quality evaluation
CN115496062A (zh) * 2022-11-10 2022-12-20 杭州费尔斯通科技有限公司 Method, system, computer device, and storage medium for identifying enterprise site-selection intention
CN115496062B (zh) * 2022-11-10 2023-02-28 杭州费尔斯通科技有限公司 Method, system, computer device, and storage medium for identifying enterprise site-selection intention

Similar Documents

Publication Publication Date Title
WO2019117466A1 Electronic device for analyzing the meaning of speech, and operating method therefor
WO2020235712A1 Artificial intelligence device for generating text or speech having a content-based style, and method therefor
WO2021246812A1 News positivity level analysis solution and device using a deep learning NLP model
WO2020105856A1 Electronic apparatus for processing user utterance, and control method therefor
WO2018034426A1 Method for automatically correcting errors in a tagged corpus using kernel PDR rules
EP3545487A1 Electronic apparatus, control method therefor, and non-transitory computer-readable recording medium
WO2020130447A1 Persona-based sentence provision method, and electronic device supporting same
WO2021020877A1 System and method for registering a device for voice assistant service
WO2018097439A1 Electronic device for performing translation by sharing an utterance context, and operating method therefor
WO2020050509A1 Speech synthesis device
WO2022102937A1 Methods and systems for predicting non-default actions for unstructured utterances
WO2018074895A1 Device and method for providing recommended words for character input
WO2018056779A1 Method for translating a speech signal, and electronic device using same
WO2021029643A1 System and method for modifying a speech recognition result
EP3980991A1 System and method for recognizing a user's voice
WO2022164192A1 Device and method for providing recommended sentences related to a user's utterance input
WO2011068315A4 Apparatus for selecting an optimal database using a maximum conceptual-strength recognition technique, and method therefor
WO2020076086A1 System for processing a user utterance, and operating method thereof
WO2020141706A1 Method and apparatus for generating annotated natural language sentences
WO2023048537A1 Server and method for providing recommended content
WO2022177224A1 Electronic device and operating method thereof
WO2022092796A1 Electronic device and voice recognition method of electronic device
WO2024025039A1 System and method for analyzing a user's state in a metaverse
WO2020141643A1 Speech synthesis server and terminal
WO2022186435A1 Electronic device for correcting a user's voice input, and operating method therefor

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21816988

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as the address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 12.05.2023)

122 Ep: pct application non-entry in european phase

Ref document number: 21816988

Country of ref document: EP

Kind code of ref document: A1