WO2017073373A1 - Learning system, learning device, learning method, learning program, teacher data creation device, teacher data creation method, teacher data creation program, terminal device, and threshold changing device - Google Patents
- Publication number: WO2017073373A1 (PCT/JP2016/080558)
- Authority: WIPO (PCT)
- Prior art keywords: label, evaluation, unit, data, input data
Classifications
- G06N 3/08 — Computing arrangements based on specific computational models; computing arrangements based on biological models; neural networks; learning methods
- G06N 3/084 — Backpropagation, e.g. using gradient descent
- G06F 16/48 — Information retrieval of multimedia data, e.g. slideshows comprising image and additional audio data; retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
- G06F 18/2148 — Pattern recognition; generating training patterns; bootstrap methods, e.g. bagging or boosting, characterised by the process organisation or structure, e.g. boosting cascade
- G06F 18/2155 — Generating training patterns characterised by the incorporation of unlabelled data, e.g. multiple instance learning [MIL], semi-supervised techniques using expectation-maximisation [EM] or naïve labelling
- G06F 18/217 — Validation; performance evaluation; active pattern learning techniques
- G06N 20/00 — Machine learning
- G06N 3/04 — Neural networks; architecture, e.g. interconnection topology
Definitions
- The present disclosure relates to a learning system, a learning device, a learning method, a learning program, a teacher data creation device, a teacher data creation method, a teacher data creation program, a terminal device, and a threshold changing device.
- Patent Document 1 describes an apparatus that learns, by an error backpropagation method, a neural network that classifies recognition target data using a plurality of labels.
- This neural network includes an input layer, a plurality of intermediate layers (hidden layers), and an output layer.
- The input layer includes a plurality of artificial neurons.
- Each of the intermediate layers includes a plurality of artificial neurons.
- The output layer includes the same number of artificial neurons as there are labels.
- The learning device learns the neural network using teacher data that includes input data and evaluations of the labels.
- A label evaluation is either a "positive evaluation", indicating that the content of the data matches the label, or a "negative evaluation", indicating that the content of the data does not match the label.
- Each positive or negative evaluation is associated with a numerical value (a correct answer score) such as "0" or "1"; these numerical values are also referred to as correct answer values (ground truth).
- The learning device takes the input data at the input layer, performs calculations in the intermediate layers, and adjusts the calculation parameters of the intermediate layers so that the recognition score output from the output layer approaches the correct answer score of the evaluation.
- The labeling methods (classification methods) of a neural network include single-label classification, in which only one label selected from the plurality of labels is given to the recognition target data, and multi-label classification, in which a plurality of labels selected from the plurality of labels may be given to the recognition target data.
- In single-label classification, if one label is evaluated positively, all other labels are necessarily negative. In multi-label classification, a plurality of labels may be evaluated positively.
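- As an illustration of the two schemes, the correct answer score vectors might look like the following sketch (the label set and values are hypothetical, not taken from the patent):

```python
labels = ["dog", "person", "flower"]

# Single-label classification: exactly one positive evaluation;
# every other label is necessarily negative.
single_label_scores = [1, 0, 0]   # the image is a dog

# Multi-label classification: any subset of the labels may be positive.
multi_label_scores = [0, 1, 1]    # the image shows a person and a flower
```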
- Non-Patent Documents 1 to 3 describe, as an approach to such incomplete teacher data, automatically estimating the label evaluations by separate learning.
- A learning system according to one aspect includes a learning device and a teacher data creation device for the learning device.
- The learning device learns, by an error backpropagation method, a neural network that classifies recognition target data using a plurality of labels.
- The teacher data creation device creates teacher data for the learning device.
- The teacher data creation device includes an input data acquisition unit, an evaluation acquisition unit, and a teacher data creation unit.
- The input data acquisition unit acquires input data.
- The evaluation acquisition unit acquires, for the input data acquired by the input data acquisition unit and for each label, any one of a positive evaluation indicating that the content of the input data matches the label, a negative evaluation indicating that the content of the input data does not match the label, and an ignore evaluation indicating exclusion from the learning target labels.
- The teacher data creation unit creates teacher data by associating the input data acquired by the input data acquisition unit with the evaluation for each label acquired by the evaluation acquisition unit.
- The learning device includes a teacher data acquisition unit, an input layer, an intermediate layer, an output layer, and an error backpropagation unit.
- The teacher data acquisition unit acquires the teacher data created by the teacher data creation device.
- The input layer acquires the input data included in the teacher data acquired by the teacher data acquisition unit as scores.
- The intermediate layer calculates the scores acquired by the input layer using weighting coefficients.
- The output layer outputs a recognition score for each label using the scores calculated by the intermediate layer.
- The error backpropagation unit adjusts the weighting coefficients of the intermediate layer using the recognition score for each label output from the output layer and the correct answer score of the evaluation for each label.
- The error backpropagation unit adjusts the weighting coefficients of the intermediate layer so that the recognition score of a label with a positive or negative evaluation approaches the correct answer score of that evaluation, and prevents the recognition score of a label with an ignore evaluation from affecting the adjustment of the weighting coefficients of the intermediate layer.
- In this learning system, the teacher data creation device acquires, for each label, one of a positive evaluation, a negative evaluation, and an ignore evaluation, and creates the teacher data accordingly. That is, learning can be performed using teacher data that may include the new "ignore evaluation" in addition to "positive evaluation" and "negative evaluation".
- The learning device adjusts the weighting coefficients of the intermediate layer so that the recognition score of a positively or negatively evaluated label approaches the correct answer score of that evaluation, while the recognition score of an ignore-evaluation label does not affect the adjustment of the weighting coefficients. For this reason, the accuracy of the recognition unit can be improved with respect to positively or negatively evaluated labels, while the accuracy of the recognition unit is not affected with respect to ignore-evaluation labels.
- The error backpropagation unit may set the correct answer score of the ignore evaluation to the same value as the recognition score of the ignore-evaluation label, may change the difference between the correct answer score of the ignore evaluation and the recognition score of the ignore-evaluation label to 0, or may change the differential value of that difference to 0.
- In any of these cases, the weighting coefficients of the intermediate layer are not adjusted for the ignore-evaluation label. Accordingly, backpropagation related to an ignore-evaluation label can be invalidated without changing the configuration of the neural network or the backpropagation formula used by the error backpropagation unit.
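- A minimal NumPy sketch of these three options, assuming a squared-error loss and a boolean mask marking the ignore-evaluation labels (all names and values are hypothetical):

```python
import numpy as np

recognition = np.array([0.8, 0.3, 0.6])        # recognition scores B1..B3
correct     = np.array([1.0, 0.0, 0.0])        # correct answer scores Y1..Y3
ignored     = np.array([False, False, True])   # third label: ignore evaluation

# Option 1: set the correct answer score equal to the recognition score,
# so the per-label difference is exactly zero.
correct_1 = np.where(ignored, recognition, correct)
diff_1 = recognition - correct_1               # third component is 0.0

# Option 2: compute the difference normally, then force it to zero.
diff_2 = recognition - correct
diff_2[ignored] = 0.0

# Option 3: zero the derivative of the loss for ignored labels, which
# blocks any gradient from flowing back through them.
grad = 2.0 * (recognition - correct)           # derivative of squared error
grad[ignored] = 0.0
```

- In all three variants the ignored label contributes nothing to the weight update, which is exactly the behavior described above.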
- Alternatively, the error backpropagation unit may block the connections of the neural network related to the ignore-evaluation label. Thereby, the backpropagation itself can be directly invalidated.
- The teacher data creation unit may associate a label that could not be evaluated by the evaluation acquisition unit with an ignore evaluation. As a result, there is no need to forcibly set a positive or negative evaluation for an unevaluated label.
- The teacher data creation device may include a reception unit that receives a user operation specifying a label evaluation, and the evaluation acquisition unit may acquire the evaluation of the label specified by the user operation received by the reception unit.
- When the reception unit receives a user operation designating evaluations for only some of the labels of the input data, the teacher data creation unit may associate the evaluations of those labels acquired by the evaluation acquisition unit with the input data and set the evaluations of the remaining labels of the input data to ignore evaluations.
- The teacher data creation unit may set all label evaluations of the input data to ignore evaluations before associating the label evaluations acquired by the evaluation acquisition unit with the input data acquired by the input data acquisition unit. In this way, all labels are given an ignore evaluation as the default setting, and evaluations can then be changed from ignore to positive or negative. In other words, the annotator is saved the trouble of explicitly specifying ignore evaluations.
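- A sketch of that default-ignore convention (the asterisk marker follows the tables described later; the helper and field names are hypothetical):

```python
LABELS = ["dog", "person", "flower"]
IGNORE = "*"   # ignore evaluations are displayed as asterisks in FIG. 6(B)

def make_teacher_record(input_data, evaluations):
    """Create one teacher data record. Labels the annotator did not
    explicitly evaluate keep the default ignore evaluation."""
    record = {label: IGNORE for label in LABELS}  # default: ignore everything
    record.update(evaluations)                    # overwrite only what was given
    return {"input": input_data, "evaluation": record}

# The annotator marked only "dog" as positive; "person" and "flower"
# stay ignored without any explicit instruction.
t1 = make_teacher_record("image_T1.jpg", {"dog": 1})
```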
- A learning device according to another aspect is a learning device that learns, by an error backpropagation method, a neural network that classifies recognition target data using a plurality of labels.
- The learning device includes a teacher data acquisition unit, an input layer, an intermediate layer, an output layer, and an error backpropagation unit.
- The teacher data acquisition unit acquires teacher data including input data and an evaluation for each label associated in advance with the input data. For the input data, any one of a positive evaluation indicating that the content of the input data matches the label, a negative evaluation indicating that the content of the input data does not match the label, and an ignore evaluation indicating exclusion from the learning target labels is associated with each label.
- The input layer acquires the input data included in the teacher data acquired by the teacher data acquisition unit as scores.
- The intermediate layer calculates the scores acquired by the input layer using weighting coefficients.
- The output layer outputs a recognition score for each label using the scores calculated by the intermediate layer.
- The error backpropagation unit adjusts the weighting coefficients of the intermediate layer using the recognition score for each label output from the output layer and the correct answer score of the evaluation for each label.
- The error backpropagation unit adjusts the weighting coefficients of the intermediate layer so that the recognition score of a label with a positive or negative evaluation approaches the correct answer score of that evaluation, and prevents the recognition score of a label with an ignore evaluation from affecting the adjustment of the weighting coefficients of the intermediate layer.
- This learning device can learn using teacher data that may include the new "ignore evaluation" in addition to "positive evaluation" and "negative evaluation".
- The learning device adjusts the weighting coefficients of the intermediate layer so that the recognition score of a positively or negatively evaluated label approaches the correct answer score of that evaluation, while the recognition score of an ignore-evaluation label does not affect the adjustment of the weighting coefficients. For this reason, the accuracy of the recognition unit can be improved with respect to positively or negatively evaluated labels, while the accuracy of the recognition unit is not affected with respect to ignore-evaluation labels.
- Conventionally, an approach has been adopted in which an incomplete label evaluation is approximated to a complete label evaluation by estimation or the like.
- By introducing the new ignore evaluation, a new approach becomes possible in which learning uses only the correctly assigned evaluations among the incomplete evaluations, so that learning based on erroneous evaluations can be avoided.
- A teacher data creation device according to another aspect creates teacher data for a learning device that learns, by an error backpropagation method, a neural network that classifies recognition target data using a plurality of labels.
- This device includes an input data acquisition unit, an evaluation acquisition unit, and a teacher data creation unit.
- The input data acquisition unit acquires input data.
- The evaluation acquisition unit acquires, for the input data acquired by the input data acquisition unit and for each label, any one of a positive evaluation indicating that the content of the input data matches the label, a negative evaluation indicating that the content of the input data does not match the label, and an ignore evaluation indicating exclusion from the learning target labels.
- The teacher data creation unit creates teacher data by associating the input data acquired by the input data acquisition unit with the evaluation for each label acquired by the evaluation acquisition unit.
- Teacher data may be created by a person (an annotator). An annotator needs to evaluate labels to create teacher data regardless of whether the annotator is confident in the evaluations. For this reason, learning may end up being performed based on erroneous evaluations.
- With this device, teacher data is created by acquiring, for each label, one of a positive evaluation, a negative evaluation, and an ignore evaluation.
- The teacher data may therefore include the new "ignore evaluation" in addition to "positive evaluation" and "negative evaluation".
- By introducing the new ignore evaluation, a new approach becomes possible in which learning uses only the correctly assigned evaluations among the incomplete evaluations, so that learning based on erroneous evaluations can be avoided.
- A learning method according to another aspect is a learning method for learning, by an error backpropagation method, a neural network that classifies recognition target data using a plurality of labels.
- The learning method includes a teacher data acquisition step, an input step, a calculation step, an output step, and an error backpropagation step.
- In the teacher data acquisition step, teacher data including input data and an evaluation for each label associated in advance with the input data is acquired.
- In the input step, the input layer acquires the input data included in the teacher data acquired in the teacher data acquisition step as scores.
- In the calculation step, the intermediate layer calculates the scores acquired in the input step using weighting coefficients.
- In the output step, the output layer outputs a recognition score for each label using the scores calculated in the calculation step.
- In the error backpropagation step, the weighting coefficients of the intermediate layer are adjusted using the recognition score for each label output in the output step and the correct answer score of the evaluation for each label. For the input data, any one of a positive evaluation indicating that the content of the input data matches the label, a negative evaluation indicating that the content of the input data does not match the label, and an ignore evaluation indicating exclusion from the learning target labels is associated with each label.
- In the error backpropagation step, the weighting coefficients of the intermediate layer are adjusted so that the recognition score of a label with a positive or negative evaluation approaches the correct answer score of that evaluation, and the recognition score of a label with an ignore evaluation does not affect the adjustment of the weighting coefficients of the intermediate layer.
- A teacher data creation method according to another aspect is a method for creating teacher data for a learning device that learns, by an error backpropagation method, a neural network that classifies recognition target data using a plurality of labels.
- This method includes an input data acquisition step, an evaluation acquisition step, and a teacher data creation step.
- In the input data acquisition step, input data is acquired.
- In the evaluation acquisition step, for the input data acquired in the input data acquisition step and for each label, any one of a positive evaluation indicating that the content of the input data matches the label, a negative evaluation indicating that the content of the input data does not match the label, and an ignore evaluation indicating exclusion from the learning target labels is acquired.
- In the teacher data creation step, teacher data is created by associating the input data acquired in the input data acquisition step with the evaluation for each label acquired in the evaluation acquisition step.
- A learning program according to another aspect is a learning program for operating a computer to learn, by an error backpropagation method, a neural network that classifies recognition target data using a plurality of labels.
- The learning program causes the computer to function as a teacher data acquisition unit, an input layer, an intermediate layer, an output layer, and an error backpropagation unit.
- The teacher data acquisition unit acquires teacher data including input data and an evaluation for each label associated in advance with the input data. For the input data, any one of a positive evaluation indicating that the content of the input data matches the label, a negative evaluation indicating that the content of the input data does not match the label, and an ignore evaluation indicating exclusion from the learning target labels is associated with each label.
- The input layer acquires the input data included in the teacher data acquired by the teacher data acquisition unit as scores.
- The intermediate layer calculates the scores acquired by the input layer using weighting coefficients.
- The output layer outputs a recognition score for each label using the scores calculated by the intermediate layer.
- The error backpropagation unit adjusts the weighting coefficients of the intermediate layer using the recognition score for each label output from the output layer and the correct answer score of the evaluation for each label.
- The error backpropagation unit adjusts the weighting coefficients of the intermediate layer so that the recognition score of a label with a positive or negative evaluation approaches the correct answer score of that evaluation, and prevents the recognition score of a label with an ignore evaluation from affecting the adjustment of the weighting coefficients of the intermediate layer.
- A teacher data creation program according to another aspect is a program for operating a computer to create teacher data for a learning device that learns, by an error backpropagation method, a neural network that classifies recognition target data using a plurality of labels.
- The teacher data creation program causes the computer to function as an input data acquisition unit, an evaluation acquisition unit, and a teacher data creation unit.
- The input data acquisition unit acquires input data.
- The evaluation acquisition unit acquires, for the input data acquired by the input data acquisition unit and for each label, any one of a positive evaluation indicating that the content of the input data matches the label, a negative evaluation indicating that the content of the input data does not match the label, and an ignore evaluation indicating exclusion from the learning target labels.
- The teacher data creation unit creates teacher data by associating the input data acquired by the input data acquisition unit with the evaluation for each label acquired by the evaluation acquisition unit.
- A terminal device according to another aspect is a terminal device capable of communicating with the learning device described above. The terminal device includes a recognition target data acquisition unit that acquires recognition target data; a recognition unit that, using the parameters learned by the learning device, gives the recognition target data a label representing its content; an operation reception unit that receives a user operation determining a private label to be given to the recognition target data acquired by the recognition target data acquisition unit; and a label editing unit that gives the private label to the recognition target data based on the user operation received by the operation reception unit.
- This terminal device can give a label (a private label) other than the labels given based on the learning result of the learning device. For this reason, this terminal device can improve the user's convenience.
- The terminal device may include a label presentation unit that presents private labels to the user based on a reference date and time and the history of dates and times at which private labels were given by the label editing unit. With this configuration, the terminal device can present private labels to the user according to the user's behavior.
- The terminal device may include a label presentation unit that presents private labels to the user based on accompanying information given when the recognition target data was generated. With this configuration, the terminal device can present private labels to the user according to the situation at the time the recognition target data was generated.
- The terminal device may be configured such that the operation reception unit receives a user operation for adding a comment and sharing the recognition target data with another person, and may include a determination unit that determines the shared data based on the user operation received by the operation reception unit, an analysis unit that analyzes the content of the comments attached to the recognition target data determined by the determination unit, and a label presentation unit that presents private labels to the user based on the analysis result of the analysis unit.
- With this configuration, the terminal device can present private labels to the user in accordance with the comments given by the user.
- The terminal device may be configured to be communicable with a language server, and may include a list output unit that outputs a list of the given private labels to the language server, a relationship acquisition unit that acquires from the language server the relationship between a representative label and the given private labels, and a recommendation unit that recommends, based on the relationship acquired by the relationship acquisition unit, that the user correct a private label to the representative label.
- The language server may include a list acquisition unit that acquires the list, an aggregation unit that aggregates the private labels into groups based on the list acquired by the list acquisition unit, a representative label selection unit that selects a representative label for each group aggregated by the aggregation unit, and a representative label output unit that outputs the relationship between the representative label and the given private labels to the terminal device based on the selection result of the representative label selection unit.
- With this configuration, the terminal device can prompt the user to organize private labels.
- A threshold changing device according to another aspect is a device that changes a threshold value in a terminal device.
- The terminal device acquires recognition target data, outputs by a neural network a recognition score indicating the degree to which the content of the recognition target data matches a predetermined label, and, using the recognition score and a threshold value set in advance for the recognition score, outputs a recognition result indicating whether or not the content of the recognition target data matches the predetermined label.
- The threshold changing device includes an evaluation data acquisition unit, a terminal data acquisition unit, a recognition score acquisition unit, a calculation unit, and a changing unit.
- The evaluation data acquisition unit acquires evaluation data including input data and a correct answer evaluation associated with the input data for the predetermined label, the correct answer evaluation indicating whether the content of the input data is a positive evaluation matching the predetermined label or a negative evaluation not matching the predetermined label.
- The terminal data acquisition unit acquires the ratio of positive evaluations to negative evaluations of the data associated with the terminal device.
- The recognition score acquisition unit acquires, for the input data, a recognition score of the predetermined label from the neural network (for example, the neural network for recognition) or from a neural network having the same weighting coefficients (for example, the neural network for learning).
- The calculation unit uses the recognition score of the predetermined label acquired by the recognition score acquisition unit and the threshold value to calculate the number of data for which input data whose correct answer evaluation is positive is recognized as positive, and the number of data for which input data whose correct answer evaluation is negative is recognized as positive, and calculates the precision of the predetermined label using the calculated numbers of data.
- The changing unit changes the threshold value using the precision calculated by the calculation unit.
- The calculation unit corrects the number of data for which input data whose correct answer evaluation is negative is recognized as positive, using the ratio of positive to negative evaluations in the evaluation data and the ratio of positive to negative evaluations of the data associated with the terminal device, and calculates the precision using the corrected number of data.
- In this device, the number of negative-evaluation input data recognized as positive is corrected using the ratio of positive to negative evaluations in the evaluation data and the corresponding ratio of the data associated with the terminal device, and the threshold value used in the recognition performed by the terminal device is changed based on the precision of the predetermined label calculated using the corrected number of data. In this way, when the precision of the predetermined label is calculated, the number of negative-evaluation input data recognized as positive is corrected in consideration of the distribution of positive and negative data in the evaluation data and the distribution of positive and negative data in the terminal device.
- Accordingly, the bias of the positive and negative data in the terminal device can be reflected in the number of data after the bias of the positive and negative data in the evaluation data has been canceled. Therefore, even if there is a bias in the positive and negative data in the evaluation data that differs from the bias in the terminal device, this device can perform an evaluation appropriate to the terminal device. As a result, the threshold value can be changed appropriately for the terminal device.
- The calculation unit may calculate the recall and the precision for the predetermined label.
- The changing unit may change the threshold value to the recognition score that maximizes the harmonic mean of the recall and the precision.
- In this case, the device can change the threshold using the recall and the corrected precision.
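- The following sketch shows one plausible reading of this procedure: the false positives counted on the evaluation data are rescaled so that the evaluation data's positive:negative balance is replaced by the terminal device's balance, and the threshold is then moved to the recognition score maximizing the harmonic mean (F-measure) of recall and corrected precision. All names are hypothetical, and the exact correction factor is an assumption:

```python
import numpy as np

def choose_threshold(scores, truth, terminal_pos_ratio):
    """scores: recognition scores of the predetermined label on the
    evaluation data; truth: 1 for positive and 0 for negative correct
    answer evaluations; terminal_pos_ratio: positive:negative ratio of
    the data associated with the terminal device."""
    scores, truth = np.asarray(scores), np.asarray(truth, dtype=int)
    n_pos, n_neg = truth.sum(), (truth == 0).sum()
    # Cancel the evaluation data's own positive:negative bias and impose
    # the terminal device's bias instead (one possible correction factor).
    scale = (n_pos / n_neg) / terminal_pos_ratio
    best_f, best_t = -1.0, None
    for t in np.unique(scores):                # candidate thresholds
        tp = np.sum((scores >= t) & (truth == 1))
        fp = np.sum((scores >= t) & (truth == 0)) * scale  # corrected count
        recall = tp / n_pos
        precision = tp / (tp + fp) if (tp + fp) > 0 else 0.0
        if precision + recall > 0:
            f = 2 * precision * recall / (precision + recall)
            if f > best_f:
                best_f, best_t = f, float(t)
    return best_t
```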
- The terminal data acquisition unit may acquire the ratio of positive to negative evaluations of the data associated with the terminal device by calculating it from the recognition results of the neural network of the terminal device or from annotation results by the user of the terminal device. In this case, the device can obtain the bias of the positive and negative data in the terminal device from actual data.
- The terminal data acquisition unit may instead acquire the ratio of positive to negative evaluations of the data associated with the terminal device based on a user operation of the terminal device or on terminal information. In this case, the device can predict the bias of the positive and negative data in the terminal device.
- A threshold changing device according to another aspect is a device that changes a threshold value in a terminal device.
- The terminal device acquires recognition target data, outputs by a neural network a recognition score indicating the degree to which the content of the recognition target data matches a predetermined label, and, using the recognition score and a threshold value set in advance for the recognition score, outputs a recognition result indicating whether or not the content of the recognition target data matches the predetermined label.
- The threshold changing device includes a terminal data acquisition unit, a storage unit, and a changing unit.
- The terminal data acquisition unit acquires the ratio of positive evaluations to negative evaluations of the data associated with the terminal device.
- The storage unit stores the relationship between the ratio and the threshold value.
- The changing unit changes the threshold value using the relationship stored in the storage unit and the ratio acquired by the terminal data acquisition unit.
- In this device, the threshold value is changed using the relationship between the ratio and the threshold value stored in advance and the ratio acquired by the terminal data acquisition unit. Therefore, the calculation load for changing the threshold value can be reduced.
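- A minimal sketch of such a lookup, assuming the stored relationship is a table of ratio-threshold pairs that is interpolated (all values are hypothetical):

```python
import numpy as np

# Relationship between positive:negative ratio and threshold, stored in
# advance in the storage unit (values are illustrative only).
ratio_table     = np.array([0.1, 0.5, 1.0, 2.0])
threshold_table = np.array([0.80, 0.65, 0.50, 0.35])

def change_threshold(terminal_ratio):
    """Return the threshold for the ratio acquired from the terminal."""
    return float(np.interp(terminal_ratio, ratio_table, threshold_table))
```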
- Brief description of the drawings: FIG. 1 illustrates the recognition unit. FIG. 2 illustrates the neural network in the recognition unit. FIG. 3 illustrates the artificial neuron shown in FIG. 2. FIG. 4 is a functional block diagram of the learning system according to the embodiment.
- A learning system 100 (see FIG. 4) is a system that learns the parameters of a recognition unit 11 (see FIG. 1) that recognizes the content of recognition target data.
- The recognition target data is data whose content is to be recognized by a computer, such as image data, sound data, or text data.
- The parameters of the recognition unit 11 are values used in the recognition processing that recognizes the recognition target data.
- In the following description, the recognition target data is image data, and the recognition target is the content of an image (such as a person, an animal, an object, a landscape, or a room).
- FIG. 1 is a diagram illustrating the recognition unit 11.
- The recognition unit 11 is provided in the terminal device 10.
- The recognition unit 11 receives recognition target data G1, which is image data, and outputs a recognition result.
- The recognition target data G1 is image data of an image in which a dog is depicted.
- The recognition unit 11 takes the image data (more specifically, its pixel values) as input and, using the learned parameters, outputs a label representing the content of the image.
- A label is used to classify the content of the recognition target data and is information identifying a category set in advance by the system user.
- In this example, the recognition unit 11 outputs the label "dog" as the recognition result.
- The label is given to the recognition target data G1 by the recognition unit 11. "Giving" means associating; for example, only the relationship between the recognition target data G1 and the label may be recorded in a relation table or the like, or the label may be embedded in the recognition target data G1 itself. In general, giving a label to recognition target data is called annotation. Since the recognition unit 11 can take image data as input and assign labels, it can automatically classify image data or search for a desired image on the Web.
- The recognition target data G2 is image data of an image in which a person and a flower are depicted.
- In single-label classification, the recognition unit 11 gives, for example, the label "person" to the recognition target data G2.
- In multi-label classification, the recognition unit 11 gives the two labels "person" and "flower" to the recognition target data G2.
- FIG. 2 is a diagram for explaining the neural network in the recognition unit 11.
- The recognition unit 11 recognizes the labels corresponding to image data using a neural network.
- A neural network is an information processing system modeled on the brain's nervous system.
- The neural network of the recognition unit 11 is a so-called hierarchical neural network, in which many artificial neurons, indicated by circles, are connected in layers.
- A hierarchical neural network includes artificial neurons for input, artificial neurons for processing, and artificial neurons for output.
- The input artificial neurons acquire the recognition target data and distribute it to the processing artificial neurons.
- A signal exchanged within the neural network is called a score.
- A score is a numerical value.
- The input artificial neurons are arranged in parallel to form the input layer 111.
- A processing artificial neuron is connected to the input artificial neurons, processes its inputs according to the function of the artificial neuron, and transmits its output to other neurons.
- The processing artificial neurons are arranged in parallel to form the intermediate layer 112.
- There may be a plurality of intermediate layers 112.
- A neural network having three or more layers, including the intermediate layers 112, is called a deep neural network.
- The output artificial neurons output recognition scores to the outside.
- The same number of output artificial neurons as labels are prepared; that is, the neural network outputs a recognition score for each label.
- In this example, three output artificial neurons are prepared, corresponding to the three labels "dog", "person", and "flower".
- The output artificial neurons output a recognition score B1 corresponding to the label "dog", a recognition score B2 corresponding to the label "person", and a recognition score B3 corresponding to the label "flower".
- A recognition score is a score representing the certainty of the recognition.
- The higher the recognition score of a label, the more likely it is that the label represents the content of the image.
- The output artificial neurons are arranged in parallel to form the output layer 113.
- The recognition unit 11 determines the labels to give using the recognition scores output by the output layer 113. For example, the recognition unit 11 gives the recognition target data every label whose recognition score is equal to or greater than a predetermined value. Thereby, labels indicating the content are automatically given to the recognition target data. In the case of single-label processing, the recognition unit 11 gives the recognition target data only the label corresponding to the highest recognition score.
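- A small sketch of this decision rule (the function and label names are hypothetical):

```python
import numpy as np

def assign_labels(scores, labels, threshold=0.5, single_label=False):
    """Turn per-label recognition scores into assigned labels."""
    scores = np.asarray(scores)
    if single_label:
        # Single-label processing: only the highest-scoring label is given.
        return [labels[int(np.argmax(scores))]]
    # Multi-label processing: every label whose recognition score is equal
    # to or greater than the predetermined value is given.
    return [lab for lab, s in zip(labels, scores) if s >= threshold]

print(assign_labels([0.1, 0.9, 0.7], ["dog", "person", "flower"]))
# -> ['person', 'flower']
```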
- FIG. 3 is a diagram for explaining the artificial neuron shown in FIG. 2.
- The artificial neuron shown in FIG. 3(A) takes inputs $x_1, x_2, x_3$ and multiplies each by the corresponding weighting coefficient $w_1, w_2, w_3$.
- The artificial neuron calculates the sum of the products $(x_1 w_1, x_2 w_2, x_3 w_3)$ and the bias value $b$. This sum is substituted into the activation function and used as the output of the artificial neuron: $y = g(x_1 w_1 + x_2 w_2 + x_3 w_3 + b)$.
- Here, $g$ is an activation function, for example a sigmoid function.
- In the network shown in FIG. 3(B), the outputs $h_1^{(2)}, h_2^{(2)}, h_3^{(2)}$ of the artificial neurons located in the second layer are expressed by Equations 3 to 5 below:

$$h_1^{(2)} = g\Big(\sum_{j=1}^{n} w_{1j}^{(1)} x_j + b_1^{(1)}\Big) \qquad (3)$$

$$h_2^{(2)} = g\Big(\sum_{j=1}^{n} w_{2j}^{(1)} x_j + b_2^{(1)}\Big) \qquad (4)$$

$$h_3^{(2)} = g\Big(\sum_{j=1}^{n} w_{3j}^{(1)} x_j + b_3^{(1)}\Big) \qquad (5)$$

- Here, $n$ is the number of artificial neurons in the target layer, $w_{kj}^{(1)}$ is the weighting coefficient applied by the $k$-th second-layer artificial neuron to the $j$-th first-layer output, and $b_k^{(1)}$ ($k = 1, 2, 3$) is the $k$-th bias value of the first layer.
- Similarly, the output $h_1^{(3)}$ of the third-layer artificial neuron is expressed by Equation 6 below. Note that the bias value $b$ is not strictly required; the output may be calculated using only the products of the outputs of the preceding artificial neurons and the weighting coefficients.

$$h_1^{(3)} = g\Big(\sum_{j} w_{1j}^{(2)} h_j^{(2)} + b_1^{(2)}\Big) \qquad (6)$$
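- Equations 3 to 6 amount to the following forward pass, shown here as a NumPy sketch (the layer sizes and values are hypothetical):

```python
import numpy as np

def sigmoid(u):
    return 1.0 / (1.0 + np.exp(-u))

def layer_forward(x, W, b, g=sigmoid):
    """One hierarchical layer: h = g(W @ x + b), i.e. Equations 3 to 6."""
    return g(W @ x + b)

x  = np.array([0.2, 0.5, 0.1])        # scores from the input layer
W1 = np.full((3, 3), 0.1)             # weighting coefficients w_kj^(1)
b1 = np.zeros(3)                      # bias values b_k^(1)
W2 = np.full((1, 3), 0.1)             # weighting coefficients w_1j^(2)
b2 = np.zeros(1)

h2 = layer_forward(x, W1, b1)         # outputs h_1^(2)..h_3^(2)
h3 = layer_forward(h2, W2, b2)        # output h_1^(3), Equation 6
```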
- The artificial neuron is not limited to the form above and may be generalized.
- A general expression for the function of the $i$-th intermediate layer 112 is Equation 7 below:

$$v^{(i)} = f^{(i)}\big(x^{(i)};\, w^{(i)},\, b^{(i)}\big) \qquad (7)$$

- Here, $x^{(i)}$ is the input vector to the intermediate layer 112, $w^{(i)}$ is the weight parameter vector of the intermediate layer 112, $b^{(i)}$ is the bias vector, and $v^{(i)}$ is the output vector of the intermediate layer 112.
- Examples of intermediate layers 112 commonly used in image recognition are the fully connected layer and the convolutional layer. The output of the fully connected layer is expressed by Equation 8 below:

$$v_q^{(i)} = g\Big(\sum_{p} w_{p,q}^{(i)}\, x_p^{(i)} + b_q^{(i)}\Big) \qquad (8)$$

- Here, $x_p^{(i)}$ is the $p$-th component of the input to the $i$-th intermediate layer 112, $v_q^{(i)}$ is the $q$-th component of the output of the intermediate layer 112, and $w_{p,q}^{(i)}$ is the weighting coefficient of the intermediate layer 112 connecting the $p$-th input component to the $q$-th output component.
- The output of the convolutional layer is expressed by Equation 9 below:

$$v_{q,(r,s)}^{(i)} = g\Big(\sum_{p}\sum_{(r',s')} w_{p,q,(r',s')}^{(i)}\, x_{p,(r+r',\,s+s')}^{(i)} + b_q^{(i)}\Big) \qquad (9)$$

- Here, $x_{p,(r,s)}^{(i)}$ is the $(r,s)$ component of the $p$-th channel of the input to the $i$-th intermediate layer 112, $v_{q,(r,s)}^{(i)}$ is the $(r,s)$ component of the $q$-th channel of the output of the intermediate layer 112, and $w_{p,q,(r',s')}^{(i)}$ is a weighting coefficient of the convolution filter of the intermediate layer 112.
- The indices $r'$ and $s'$ range from 0 to (filter width − 1) and from 0 to (filter height − 1), respectively.
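- A deliberately naive sketch of Equation 9 (no padding or stride, a tanh activation standing in for $g$; all shapes are hypothetical):

```python
import numpy as np

def conv_layer(x, w, b, g=np.tanh):
    """x: input of shape (channels P, height H, width W);
    w: filters of shape (P, output channels Q, fh, fw); b: biases (Q,)."""
    P, H, W = x.shape
    _, Q, fh, fw = w.shape
    v = np.zeros((Q, H - fh + 1, W - fw + 1))
    for q in range(Q):
        for r in range(v.shape[1]):
            for s in range(v.shape[2]):
                acc = b[q]
                for p in range(P):
                    # r', s' sweep the filter: 0..fh-1 and 0..fw-1
                    acc += np.sum(w[p, q] * x[p, r:r + fh, s:s + fw])
                v[q, r, s] = g(acc)
    return v
```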
- The learning system 100 is a system that learns the weighting coefficients and bias values that associate the feature quantities of recognition target data with the labels indicating its content.
- In the following description, for simplicity, the learning system 100 is assumed to learn only the weighting coefficients.
- FIG. 4 is a functional block diagram of the learning system 100 according to the embodiment.
- The learning system 100 collects image data, performs learning, and provides the learned weighting coefficients and bias values to the terminal device 10.
- The learning system 100 is connected to a database 21 that stores image data, a camera 22 that generates image data, a Web site 23 from which image data can be downloaded, and the like, and can acquire image data that serves as input data for learning.
- The learning system 100 may also acquire image data from a connected external storage medium or receive image data via communication; the manner of acquiring image data is not limited.
- FIG. 5 is a block diagram showing the hardware configuration of the devices shown in FIG. 4.
- The terminal device 10 is physically configured as a normal computer system including a CPU (Central Processing Unit) 101, main storage devices such as a RAM (Random Access Memory) 102 and a ROM (Read Only Memory) 103, an input device 104 such as a touch panel or a keyboard, an output device 105, and an auxiliary storage device 106.
- The functions of the terminal device 10 are realized by loading predetermined computer software onto hardware such as the CPU 101, the RAM 102, and the ROM 103, operating the input device 104 and the output device 105 under the control of the CPU 101, and reading and writing data in the main storage devices and the auxiliary storage device 106.
- The hardware of the teacher data creation device 30 and the learning device 40 can be configured in the same way as the terminal device 10. That is, the teacher data creation device 30 is physically configured as a normal computer system including a CPU 301, main storage devices such as a RAM 302 and a ROM 303, an input device 304, an output device 305, an auxiliary storage device 306, and the like.
- The learning device 40 is physically configured as a normal computer system including a CPU 401, main storage devices such as a RAM 402 and a ROM 403, an input device 404, an output device 405, an auxiliary storage device 406, and the like.
- The learning system 100 includes the teacher data creation device 30 and the learning device 40.
- The teacher data creation device 30 creates the teacher data used when the learning device 40 learns, and provides it to the learning device 40.
- The teacher data includes data of the same type as the recognition target data, together with evaluations of the labels corresponding to that data.
- The teacher data creation device 30 includes an input data acquisition unit 31, an evaluation acquisition unit 32, a reception unit 33, and a teacher data creation unit 34.
- The input data acquisition unit 31 acquires input data.
- The input data acquisition unit 31 acquires input data for learning from the database 21, the camera 22, the Web site 23, an external storage medium, or the like.
- The input data is data used for learning and is of the same type as the recognition target data.
- In this embodiment, the input data is image data.
- The input data is data to be evaluated against a plurality of preset labels. The input data may already have been evaluated for some labels, or for none at all; that is, some labels may already have been given to the input data.
- The input data includes at least one label for which it has not yet been determined whether the label should be given.
- The evaluation acquisition unit 32 acquires, for each label, one of a "positive evaluation" indicating that the content of the input data matches the label, a "negative evaluation" indicating that the content of the input data does not match the label, and an "ignore evaluation" indicating exclusion from the learning target labels.
- If the evaluation of a label is a positive evaluation, the content of the input data belongs to the category indicated by the label; if the evaluation is a negative evaluation, the content of the input data does not belong to that category.
- If the evaluation of a label is an ignore evaluation, the label is excluded from the learning target labels.
- In other words, the evaluation acquisition unit 32 includes not only "positive evaluation" and "negative evaluation" but also "ignore evaluation" among the options it can acquire.
- The evaluation acquisition unit 32 acquires, as the label evaluation, the content determined by a human annotator.
- For this purpose, the teacher data creation device 30 includes the reception unit 33, which receives a user operation designating a label evaluation.
- Here, the user operation is an operation by the annotator.
- The user operation is, for example, an operation of selecting one of "positive evaluation", "negative evaluation", and "ignore evaluation" for a label, or an operation of excluding two of the evaluations for a label.
- The reception unit 33 outputs a signal related to the user operation to the evaluation acquisition unit 32.
- The evaluation acquisition unit 32 acquires the evaluation of the label designated by the user operation received by the reception unit 33.
- The user operation may include an operation by the user of the terminal device 10. For example, after the user actually operates the recognition unit 11, a user operation determining an evaluation is performed.
- In this case, the terminal device 10 transmits the user operation and the input data to the teacher data creation device 30.
- The evaluation acquisition unit 32 then determines the evaluations of the labels of the input data based on the acquired user operation.
- The evaluation acquisition unit 32 may also adopt, as it is, an evaluation of a label already associated with the input data. For example, if teacher data related to the label "dog" already exists, one of "positive evaluation", "negative evaluation", and "ignore evaluation" is already associated with the input data for the label "dog". If an evaluation of a label exists, the evaluation acquisition unit 32 may use that evaluation and accept the user operation described above only for the unevaluated labels to determine their evaluations. With this configuration, for example, new teacher data can easily be created from existing teacher data.
- The teacher data creation unit 34 creates teacher data by associating the input data acquired by the input data acquisition unit 31 with the evaluation for each label acquired by the evaluation acquisition unit 32.
- The teacher data creation unit 34 may combine the input data and the evaluations for each label into a single data item to form the teacher data, or may associate the input data with the evaluations for each label using a table.
- FIG. 6 shows an example of teacher data.
- A plurality of labels is associated with each of the input data T1 to TN (N is a natural number).
- In this example, three labels are associated with each of the input data T1 to TN.
- The first label L1 indicates that the content of the image is "dog", the second label L2 indicates that the content of the image is "person", and the third label L3 indicates that the content of the image is "flower".
- The teacher data creation unit 34 associates an evaluation of every label with each input data item. For example, assume that the input data T1 is an image of a dog in which no person appears. In this case, a positive evaluation, which is the evaluation of the first label L1, and a negative evaluation, which is the evaluation of the second label L2, are stored in the table, and an ignore evaluation, which is the evaluation of the third label L3, is stored in the table.
- In this way, an evaluation of each label is associated with each of the input data T1 to TN.
- A positive evaluation may be displayed as a score such as "1" and a negative evaluation as a score such as "0".
- A score representing such an evaluation of input data is referred to as a correct answer score.
- FIG. 6(B) shows the table of FIG. 6(A) displayed as scores; ignore evaluations are associated with the input data as asterisks.
- The teacher data creation unit 34 may set the evaluations of all labels of the input data to ignore evaluations before associating the label evaluations acquired by the evaluation acquisition unit 32 with the input data acquired by the input data acquisition unit 31. That is, the teacher data creation unit 34 sets all labels to ignore evaluations as the default setting and changes the labels whose evaluations have been acquired from ignore evaluations to positive or negative evaluations.
- The teacher data creation unit 34 provides the created teacher data to the learning device 40.
- The output of the teacher data creation unit 34 may be stored on a recording medium that is then read by the learning device 40, or it may be transmitted from the teacher data creation unit 34 to the learning device 40 via communication.
- The learning device 40 includes a teacher data acquisition unit 41, a learning recognition unit 42, and an error backpropagation unit 43.
- The teacher data acquisition unit 41 acquires the teacher data created by the teacher data creation device 30.
- The learning recognition unit 42 has the same configuration as the recognition unit 11 and includes the input layer 111, the intermediate layer 112, and the output layer 113.
- The input layer 111 acquires the input data included in the teacher data acquired by the teacher data acquisition unit 41 as scores.
- The intermediate layer 112 calculates the scores acquired by the input layer 111 using the weighting coefficients.
- The output layer 113 outputs a recognition score for each label using the scores calculated by the intermediate layer 112.
- The error backpropagation unit 43 adjusts the weighting coefficients of the intermediate layer 112 using the recognition score for each label output from the output layer 113 and the correct answer score of the evaluation for each label.
- FIG. 7 is a diagram for explaining the neural network in the learning recognition unit 42. As shown in FIG. 7, the error backpropagation unit 43 adjusts the weighting coefficients of the intermediate layer 112 so that the recognition scores B1 to B3 approach the correct answer scores Y1 to Y3 of the respective labels. For example, the error backpropagation unit 43 calculates the difference between the recognition scores B1 to B3 and the correct answer scores Y1 to Y3 for each label. The correct answer scores Y1 to Y3 are "1" for a positive evaluation and "0" for a negative evaluation.
- The error backpropagation unit 43 adjusts the weighting coefficients $w_1, w_2, w_3$ and the bias value $b$ described above so that the difference between the recognition scores B1 to B3 and the correct answer scores Y1 to Y3 of each label becomes small.
- Here, "the difference becomes small" means that the error becomes equal to or less than a predetermined value, or that the difference is smaller after the adjustment than before it.
- The error backpropagation unit 43 determines the weighting coefficients $w_1, w_2, w_3$ and the bias value $b$ that minimize the difference by a gradient method. Such a method is called the error backpropagation method.
- The error backpropagation unit 43 determines, for example, the minimum value of the squared error by the gradient method.
- The error backpropagation unit 43 repeatedly executes, as one iteration, the update of the weighting coefficients $w_1, w_2, w_3$ and the bias value $b$, the forward pass of the neural network from input to output, and the calculation of the squared error. When the variation of the squared error becomes equal to or less than a predetermined value, the iteration is terminated and the learning of that input data ends.
- The error backpropagation unit 43 does not necessarily have to use the difference between the recognition scores B1 to B3 and the correct answer scores Y1 to Y3 of each label. For example, a likelihood corresponding to the correct answer may be calculated; a larger likelihood means that the label is closer to the correct answer. In that case, the error backpropagation unit 43 adjusts the weighting coefficients in the direction that increases the likelihood.
- The general equation of error backpropagation is Equation 11 below, where $E(x)$ is the error function (squared error, log-likelihood function, or the like) calculated at the output layer 113:

$$\frac{\partial E}{\partial w_{jk}^{(i)}} = \frac{\partial E}{\partial h_j^{(i+1)}}\; g'^{(i+1)}\; h_k^{(i)} \qquad (11)$$

- Here, $w_{jk}^{(i)}$ is the weighting coefficient of the $i$-th layer connecting its $k$-th output to the $j$-th neuron of the next layer, $h_k^{(i)}$ is the $k$-th component of the output vector of the $i$-th intermediate layer 112, and $g'^{(i)}$ is the derivative of the activation function.
- The bias value $b$ can be calculated by the same method.
- The weighting coefficients of each intermediate layer 112 are updated as in Equation 12 below, where $\epsilon$ is the update step size of the gradient method:

$$w_{jk}^{(i)} \leftarrow w_{jk}^{(i)} - \epsilon\, \frac{\partial E}{\partial w_{jk}^{(i)}} \qquad (12)$$

- The term $\partial E / \partial h_j^{(i+1)}$ in Equation 11 cannot be calculated from the $i$-th layer alone; a calculation using the values of the $(i+1)$-th layer is necessary. Specifically, the following recursion is computed:

$$\frac{\partial E}{\partial h_k^{(i)}} = \sum_{j} \frac{\partial E}{\partial h_j^{(i+1)}}\; g'^{(i+1)}\; w_{jk}^{(i)}$$

- Because the calculation proceeds in this way, propagating the error from the calculation results on the side close to the output layer back toward the input side, the method is called error backpropagation.
- The error backpropagation unit 43 prevents the recognition score of an ignore-evaluation label from affecting the adjustment of the weighting coefficients of the intermediate layer. "Does not affect the adjustment of the weighting coefficients of the intermediate layer" means that the adjustment of the weighting coefficients is the same whether or not the recognition score of the ignore-evaluation label is input. For example, the error backpropagation unit 43 sets the correct answer score of the ignore evaluation to the same value as the recognition score of the ignore-evaluation label. Since the difference between the recognition score and the correct answer score is then 0, the weighting coefficients $w_1, w_2, w_3$ and the bias value $b$ are not changed with respect to the ignore-evaluation label.
- Alternatively, the error backpropagation unit 43 may change the differential value of the difference between the correct answer score of the ignore evaluation and the recognition score of the ignore-evaluation label to 0.
- In this case, the squared error takes its minimum value for the ignore-evaluation label in the squared-error evaluation formula.
- As a result, the weighting coefficients $w_1, w_2, w_3$ and the bias value $b$ are not changed with respect to the ignore-evaluation label.
- Alternatively, a separate layer may be provided in the neural network, and the connections of the neural network related to the ignore-evaluation label may be blocked. Thereby, the backpropagation itself can be directly invalidated.
- FIG. 8 is a flowchart of the teacher data creation method according to the embodiment. The flowchart shown in FIG. 8 is executed each time one teacher data item is created.
- The input data acquisition unit 31 of the teacher data creation device 30 acquires input data as the input data acquisition process (S10: input data acquisition step).
- The evaluation acquisition unit 32 of the teacher data creation device 30 acquires, as the evaluation acquisition process (S12: evaluation acquisition step), one of "positive evaluation", "negative evaluation", and "ignore evaluation" for each label of the input data T3 acquired in the input data acquisition process (S10).
- For example, the evaluation acquisition unit 32 acquires "negative evaluation" as the evaluation of the first label L1, which indicates that the content of the image is "dog", "ignore evaluation" as the evaluation of the second label L2, which indicates that the content of the image is "person", and "ignore evaluation" as the evaluation of the third label L3, which indicates that the content of the image is "flower".
- The evaluation acquisition unit 32 may acquire the evaluations using a user operation received by the reception unit 33.
- The teacher data creation unit 34 of the teacher data creation device 30 creates teacher data as the teacher data creation process (S14: teacher data creation step) by associating the input data acquired in the input data acquisition process (S10) with the evaluation for each label acquired in the evaluation acquisition process (S12).
- In the example above, the input data T3 is associated with "negative", "ignore", and "ignore", forming one teacher data item.
- FIG. 9 is a flowchart of the learning method according to the embodiment.
- the flowchart shown in FIG. 9 shows the flow of learning using one teacher data.
- the teacher data acquisition unit 41 of the learning device 40 acquires teacher data as teacher data acquisition processing (S20: teacher data acquisition step).
- the input layer 111 of the learning device 40 acquires input data included in the teacher data as input processing (S22: input step).
- the intermediate layer 112 of the learning device 40 performs a calculation based on the function of the artificial neuron as a calculation process (S24: calculation step).
- the output layer 113 of the learning device 40 outputs the recognition score for every label as the output process (S26: output step).
- the error back propagation unit 43 of the learning device 40 inputs the correct score for each label acquired in the teacher data acquisition process (S20) as the correct score input process (S28: correct score input step).
- the error back propagation unit 43 executes the processes from S301 to S304 as the back propagation process (S30: Error back propagation step).
- the error back propagation unit 43 determines, as the ignore evaluation determination process (S301), whether or not the correct scores input in the correct score input process (S28) include an ignore evaluation. For example, assume that the input data T1 shown in FIG. 6B is the learning target. As shown in FIG. 6B, the third label L3 corresponding to the input data T1 carries an "asterisk" indicating an ignore evaluation. In this case, the error back propagation unit 43 performs the invalidation process (S302). As the invalidation process (S302), the error back propagation unit 43 sets the recognition score output in the output process (S26) as the correct score of the ignore evaluation determined in the ignore evaluation determination process (S301). For example, when the input data T1 is the learning target, the recognition score B3 is substituted for the correct score of the third label L3.
- Subsequently, the error back propagation unit 43 performs the error calculation process (S303). As an example of the error calculation process (S303), the error back propagation unit 43 calculates the difference between the recognition score output in the output process (S26) and the correct score. Then, as the adjustment process (S304), the error back propagation unit 43 adjusts the weighting coefficients w_1, w_2, w_3 and the bias value b so as to minimize the error evaluation function.
- When the adjustment process (S304) ends, the learning process shown in FIG. 9 ends.
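- As a minimal sketch (assuming a single-layer network with sigmoid outputs and a squared-error objective, which simplifies the multi-layer network of the embodiment), the steps S22 to S304 can be lined up as follows:

```python
import numpy as np

def sigmoid(u):
    return 1.0 / (1.0 + np.exp(-u))

def train_step(x, correct, ignore_mask, W, b, lr=0.1):
    """One pass of S22-S304 for a single teacher datum; x is the input (S22)."""
    u = W @ x + b                    # S24: artificial-neuron calculation
    score = sigmoid(u)               # S26: recognition score per label
    # S28/S301/S302: for ignore-evaluation labels, substitute the
    # recognition score for the correct score (invalidation processing).
    target = np.where(ignore_mask, score, correct)
    err = score - target             # S303: error calculation
    grad_u = err * score * (1.0 - score)   # gradient through the sigmoid
    W -= lr * np.outer(grad_u, x)    # S304: adjust weighting coefficients
    b -= lr * grad_u                 # S304: adjust bias values
    return W, b
```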
- the teacher data creation program includes a main module, an input data acquisition module, an evaluation acquisition module, a reception module, and a teacher data creation module.
- the main module is a part that performs overall control of the apparatus.
- the functions realized by executing the input data acquisition module, the evaluation acquisition module, the reception module, and the teacher data creation module are the same as those of the input data acquisition unit 31, the evaluation acquisition unit 32, the reception unit 33, and the teacher data creation unit 34 of the teacher data creation device 30 described above.
- the learning program includes a main module, a teacher data acquisition module, a learning recognition module, and an error back propagation module.
- the main module is a part that performs overall control of the apparatus. The functions realized by executing the teacher data acquisition module, the learning recognition module, and the error back propagation module are the same as those of the teacher data acquisition unit 41, the learning recognition unit 42, and the error back propagation unit 43 of the learning device 40 described above.
- the teacher data creation program and the learning program are provided by a non-transitory recording medium such as a ROM or a semiconductor memory, for example. Moreover, the teacher data creation program and the learning program may be provided via communication such as a network.
- the teacher data creation device 30 acquires one of positive evaluation, negative evaluation, and ignore evaluation for each label as the label evaluation, and creates the teacher data. That is, in this learning system 100, learning can be performed using teacher data that can include the new evaluation "ignore evaluation" in addition to "positive evaluation" and "negative evaluation".
- the learning device 40 adjusts the weighting coefficients of the intermediate layer 112 so that the recognition score of a positive- or negative-evaluation label approaches the corresponding correct score, while the recognition score of an ignore-evaluation label does not affect the adjustment of the weighting coefficients of the intermediate layer 112.
- For the ignore-evaluation label, the weighting factors w_1, w_2, w_3 and the bias value b of the intermediate layer 112 are not adjusted. Therefore, the back propagation for the ignore-evaluation label can be invalidated without the error back propagation unit 43 changing the configuration of the neural network or the back propagation formula.
- In the learning system 100, learning can proceed even when some labels are unevaluated, without forcibly assigning them a positive or negative evaluation.
- In the learning system 100, not only can learning based on an erroneous evaluation be avoided, but, by adopting a configuration in which the user can change or add evaluations, learning can be performed based on correct evaluations; as a result, the accuracy of the recognition unit 11 can be improved.
- all labels are set as neglected evaluation as a default setting, and the evaluation can be changed from neglected evaluation to positive evaluation or negative evaluation. In other words, when the annotator works, it is possible to save the trouble of explicitly instructing the ignore evaluation.
- teacher data may be created by a person (annotator). Annotators need to evaluate labels to create teacher data, regardless of whether they are confident in the evaluation of labels. For this reason, learning may be performed based on an erroneous evaluation.
- the teacher data is created by acquiring, for each label, one of positive evaluation, negative evaluation, and ignore evaluation.
- a new evaluation of “ignore evaluation” can be included in the teacher data in addition to “positive evaluation” and “negative evaluation”.
- By introducing the new "ignore evaluation", a new approach becomes possible in which learning uses only the correctly assigned evaluations among incomplete evaluations, so that learning based on erroneous evaluations can be avoided.
- In the embodiment above, the recognition target data is image data, but the recognition target data may instead be voice data or character data. Even when such data is targeted, learning based on an erroneous evaluation can be avoided.
- the case where the positive evaluation is "1" and the negative evaluation is "0" has been described as an example, but arbitrary values can be set: for example, the positive evaluation may be "0" and the negative evaluation "1", or the positive evaluation may be "2" and the negative evaluation "1".
- the teacher data creation device 30 and the learning device 40 may be configured as one device.
- data to be preferentially added to the teacher data may be selected.
- the teacher data creation device 30 recognizes a plurality of images having an ignore-evaluation label, preferentially acquires annotation information for input data whose recognition score is an intermediate value that indicates neither positive nor negative evaluation, and uses it as teacher data. Since data that is difficult for the recognition unit 11 (and therefore carries a large amount of information) is prioritized, learning efficiency increases and, as a result, annotation efficiency can be increased.
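- A minimal sketch of this prioritization, assuming recognition scores lie in [0, 1] so that 0.5 is the most ambiguous value (the midpoint is an assumption):

```python
import numpy as np

def select_for_annotation(scores, k=10):
    """Return the indices of the k images whose recognition score is most
    intermediate, i.e. closest to 0.5 (neither clearly positive nor
    clearly negative), so they are annotated first."""
    scores = np.asarray(scores)
    return np.argsort(np.abs(scores - 0.5))[:k]
```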
- an evaluation obtained by averaging the evaluations of a plurality of annotators may be used as the label evaluation.
- the labels may be hierarchized.
- For example, suppose the major classification label is "A" and the minor classification labels are "A1", "A2", and "A3". When learning the minor classification labels, the learning device 40 may adopt the weighting factors learned for the label A as the initial values of the minor classification labels. In this case, the convergence efficiency of the gradient method increases, so the learning time can be shortened.
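- A sketch of this warm start, assuming learned weight vectors are kept in a dictionary keyed by label name (a hypothetical arrangement):

```python
import numpy as np

# Hypothetical learned weight vector for the major classification label "A".
weights = {"A": np.array([0.4, -0.1, 0.7])}

# Initialize each minor classification label from the major label "A",
# so the gradient method starts near a related solution and converges faster.
for sub in ("A1", "A2", "A3"):
    weights[sub] = weights["A"].copy()
```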
- the teacher data creation unit 34 may process the input data.
- the teacher data creation unit 34 may perform normalization processing (processing for resizing to a certain size in the case of an image) on the input data.
- the input layer 111 may process the input data.
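- For instance, the image normalization mentioned above could look like the following sketch; the target size of 224x224 pixels and the [0, 1] scaling are assumptions, not values given by the embodiment:

```python
from PIL import Image
import numpy as np

def normalize(path, size=(224, 224)):
    """Resize an input image to a fixed size and scale pixels to [0, 1]."""
    img = Image.open(path).convert("RGB").resize(size)
    return np.asarray(img, dtype=np.float32) / 255.0
```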
- the hardware configurations of the terminal device 10, the teacher data creation device 30, and the learning device 40 do not have to be physically integrated, and may be configured by a plurality of devices. That is, a plurality of devices may be connected via a network, and the terminal device 10, the teacher data creation device 30, and the learning device 40 may be configured virtually.
- In the first embodiment, the terminal device 10 assigns labels learned by the learning system 100.
- the label in the first embodiment is a label representing preset contents, and is common to the terminal device 10 and the learning system 100. Further, the label may be common among the plurality of terminal devices 10. That is, the label in the first embodiment is not a label freely set by a user or the like. In the second embodiment, the user is allowed to give a label freely.
- the label in the first embodiment is called a public label, and a label freely set by the user is called a private label. In the following, description of the contents already described in the first embodiment is omitted.
- FIG. 10 is a functional block diagram of the terminal device 50 according to the second embodiment.
- the terminal device 50 includes a data acquisition unit (recognition target data acquisition unit) 51, a recognition unit 52, a given label storage unit 53, a feedback unit 54, an operation reception unit 55, and a label editing unit 56.
- the hardware of the terminal device 50 is the same as that of the terminal device 10.
- the data acquisition unit 51 acquires recognition target data.
- the data acquisition unit 51 acquires image data stored in a storage medium provided in the terminal device 50.
- the data acquisition unit 51 may acquire image data via communication.
- the recognition unit 52 is the same as the recognition unit 11 in the above-described embodiment.
- the recognizing unit 52 uses the parameters learned by the learning device 40 to give a public label representing the content of the recognition target data to the recognition target data.
- the recognition unit 52 stores the recognition result in the assigned label storage unit 53.
- the assigned label storage unit 53 stores recognition target data and public labels in association with each other.
- FIG. 11 is an example of data stored in the assigned label storage unit 53.
- recognition target data and public labels are stored in association with each other.
- the recognition target data is configured to be identifiable by a recognition target data ID.
- the recognition target data ID is an identifier of the recognition target data.
- the recognition target data ID “1” and the public label “flower, outdoors” are stored in association with each other.
- the recognition target data ID “2” and the public label “person” are stored in association with each other.
- the recognition target data ID “3” and the public label “person, school, indoor” are stored in association with each other.
- the recognition target data ID “4” and the public label “indoor, dish, person” are stored in association with each other.
- the recognition unit 52 may further store accompanying information, which is information attached to the recognition target data, in the assigned label storage unit 53.
- the accompanying information is information indicating, for example, the generation status of recognition target data.
- the accompanying information is embedded in the recognition target data as part of the recognition target data, or is managed in association with the identifier of the recognition target data as data different from the recognition target data.
- the recognizing unit 52 acquires accompanying information based on the recognition target data and further stores it in the assigned label storage unit 53.
- the accompanying information includes, for example, the reliability of the public label, the position information at the time of data generation, the data generation date and time, and the like.
- the reliability of the public label means the certainty of the public label.
- the reliability of the public label is, for example, a score value when recognized by the recognition unit 52.
- the recognition target data is image data
- the accompanying information includes public label reliability, shooting position, shooting date, camera information, face recognition result, and the like.
- the shooting position is, for example, latitude and longitude information, and is GPS information as a specific example.
- the shooting date and time includes date and time, day of the week, season, and the like.
- the camera information includes focal length, exposure time, aperture, presence / absence of flash, and the like.
- the face recognition result is a recognition result of the face recognition function of the camera.
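- Putting the above together, one record of the assigned label storage unit 53 might look like the following sketch; every concrete value is hypothetical:

```python
record = {
    "recognition_target_id": 1,
    "public_labels": ["flower", "outdoors"],
    "private_labels": [],
    "accompanying": {
        "label_reliability": 0.87,        # score value from the recognition
        "gps": (35.68, 139.77),           # shooting position (lat, lon)
        "shot_at": "2016-04-02T10:15",    # shooting date and time
        "camera": {"focal_length_mm": 28, "exposure_s": 0.004,
                   "aperture_f": 5.6, "flash": False},
        "faces_detected": 0,              # face recognition result
    },
}
```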
- the assigned label storage unit 53 stores the relationship between the recognition target data and the label, and the relationship between the private label and the assigned time.
- the contents stored in the assigned label storage unit 53 are updated by the operation receiving unit 55 and the label editing unit 56.
- Update is a concept that includes addition, modification, overwriting, deletion, and the like.
- the operation accepting unit 55 accepts a user operation for determining a private label to be given to the recognition target data acquired by the data acquiring unit 51.
- the user operation is a terminal operation by the user of the terminal device 50.
- the user operation includes an operation for specifying recognition target data and an operation for specifying a private label.
- the operation for specifying the recognition target data is, for example, an operation for selecting one icon from a list of recognition target data icons displayed on a display device or the like.
- the operation for specifying a private label is, for example, an operation for inputting the label name of the private label, or an operation for selecting one label from a list of previously assigned private labels displayed on a display device or the like.
- the label editing unit 56 assigns a private label to the recognition target data based on the user operation received by the operation receiving unit 55.
- the label editing unit 56 causes the assigned label storage unit 53 to store the relationship between the recognition target data and the private label.
- the recognition target data ID “4” shown in FIG. 11A does not have a private label.
- the user performs a user operation to select the recognition target data ID “4” and the private labels “Chinese cuisine” and “friend”.
- the recognition target data ID “4” and the private labels “Chinese cuisine” and “friend” are stored in association with each other.
- the label editing unit 56 may also store the date and time when the private label was given in the given label storage unit 53.
- the label editing unit 56 may change the information stored in the assigned label storage unit 53. That is, the label editing unit 56 can also correct or delete the private label once given.
- the operation reception unit 55 receives a user operation for correcting or deleting the public label given to the recognition target data.
- the user operation includes an operation for specifying the recognition target data and an operation for correcting or deleting the public label.
- the operation for specifying the recognition target data is, for example, an operation for selecting one icon from a list of recognition target data icons displayed on a display device or the like.
- the operation for correcting or deleting the public label is, for example, an operation for inputting a label name of the public label, an operation for selecting a delete button, or the like.
- the label editing unit 56 corrects or deletes the public label based on the user operation received by the operation receiving unit 55.
- the feedback unit 54 outputs the correction content to the teacher data creation device 30.
- the reception unit 33 of the teacher data creation device 30 receives a user operation that is an operation of the terminal device 10 by a user.
- the receiving unit 33 may receive a user operation that specifies evaluation of a part of the labels of the input data. That is, the user need not evaluate all the labels of the input data.
- In that case, the teacher data creation unit 34 acquires the evaluations of the labels specified by the user and treats the evaluations of the remaining labels of the input data as ignore evaluations. In this way, the public label is corrected by the user, and re-learning with the corrected evaluation improves the learning efficiency of the learning system 100. The feedback unit 54 may be provided as needed.
- FIG. 12 is a flowchart showing a method for assigning a private label.
- the flowchart shown in FIG. 12 is executed, for example, when the label edit button is selected by the user.
- the operation reception unit 55 of the terminal device 50 receives a user operation for determining a private label to be given to the recognition target data as the operation reception process (S40).
- the label editing unit 56 of the terminal device 50 determines, as the private label assignment process (S42), a private label to be given to the recognition target data based on the user operation received in the process of S40, and gives the determined private label to the recognition target data.
- When the process of S42 ends, the private label assignment method ends.
- the terminal device 50 can give a private label other than the public label given based on the learning result of the learning device 40.
- the recognition target data can therefore be easily organized and accessed, so the terminal device 50 can improve the user's convenience.
- the terminal device 50A according to the third embodiment is different from the terminal device 50 according to the second embodiment in that a label presenting unit 57A is provided, and the others are the same.
- the terminal device 50A according to the third embodiment has an additional function of reducing the user's labor for labeling by presenting the private label to the user.
- In the third embodiment, the assigned label storage unit 53 stores at least one of the accompanying information and the private label assignment date and time.
- FIG. 13 is a functional block diagram of the terminal device 50A according to the third embodiment.
- the terminal device 50A includes a data acquisition unit 51, a recognition unit 52, a given label storage unit 53, a feedback unit 54, an operation reception unit 55, a label editing unit 56, and a label presentation unit 57A.
- the hardware of the terminal device 50A is the same as that of the terminal device 10.
- the label presenting unit 57A presents the private label to the user.
- the label presenting unit 57A presents the private label to the user based on the history of private label assignment date and time given by the label editing unit 56 and the reference date and time.
- Presentation means notifying the user.
- the presentation is to display characters or icons on the display device. Alternatively, sound may be output from a speaker or the like, or vibration may be operated.
- the label presenting unit 57A presents the private label to the user at the timing when the operation accepting unit 55 accepts the operation of the label edit button, for example.
- When the label presenting unit 57A acquires a signal indicating that the operation has been received from the operation receiving unit 55, the label presenting unit 57A refers to the assigned label storage unit 53.
- the assigned label storage unit 53 stores a history of the assignment dates and times of the private labels assigned by the label editing unit 56. That is, the label presenting unit 57A can acquire the history of private label assignment dates and times by referring to the assigned label storage unit 53. Then, the label presenting unit 57A acquires the reference date and time.
- the reference date / time is the date / time used for estimation of the private label.
- the label presentation unit 57A acquires the current date and time based on a real-time clock or the like and sets it as the reference date and time. Then, the label presenting unit 57A predicts the user's behavior based on the relationship between the date / time given to each private label and the reference date / time, and presents the private label.
- the label presenting unit 57A refers to the history of a past predetermined period (or a predetermined number of entries), calculates for each history entry the difference between the assignment date and time and the reference date and time, and determines a private label by a vote weighted with the inverse of each difference.
- FIG. 14 is a table for explaining private label selection processing.
- the private label “A” is associated with the assigned dates “19:30”, “19:30”, “19:42”, “19:53”, and “20:04”.
- the private label “B” is associated with the assignment dates “20:51” and “20:55”.
- information about time is described, and information about date is omitted.
- the reference date and time is “21:02”.
- the label presenting unit 57A calculates, for each history entry, the difference between the reference date and time and the assignment date and time. That is, the label presenting unit 57A calculates all the values in the difference column shown in FIG. 14. Then, the label presenting unit 57A calculates a weight from each difference and performs a weighted vote. In the example shown in FIG. 14, the number of votes for the private label "A" is "0.06597" and the number of votes for the private label "B" is "0.23377". When other private labels are included in the history of the predetermined period, the label presenting unit 57A calculates their numbers of votes in the same manner. Then, the label presenting unit 57A presents the private label with the largest number of votes to the user; in the example illustrated in FIG. 14, the label presenting unit 57A presents the private label "B".
- the label presenting unit 57A may present a plurality of private labels in descending order of the number of votes.
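- The vote of FIG. 14 can be reproduced with the sketch below; the calendar date is assumed (FIG. 14 omits it), and each past assignment votes with weight 1/(minutes elapsed before the reference time):

```python
from datetime import datetime
from collections import defaultdict

def present_label(history, reference):
    """Weighted vote over (label, assignment_datetime) history entries."""
    votes = defaultdict(float)
    for label, given_at in history:
        minutes = (reference - given_at).total_seconds() / 60.0
        votes[label] += 1.0 / minutes     # weight = inverse of the difference
    return max(votes, key=votes.get)

fmt, day = "%Y-%m-%d %H:%M", "2016-10-14 "      # hypothetical date
history = [("A", datetime.strptime(day + t, fmt))
           for t in ("19:30", "19:30", "19:42", "19:53", "20:04")]
history += [("B", datetime.strptime(day + t, fmt))
            for t in ("20:51", "20:55")]
reference = datetime.strptime(day + "21:02", fmt)

print(present_label(history, reference))   # "B" (0.23377 vs 0.06597 votes)
```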
- the other configurations of the terminal device 50A are the same as those of the terminal device 50.
- FIG. 15 is a flowchart showing a method for presenting a private label.
- the flowchart shown in FIG. 15 is executed, for example, when the label edit button is selected by the user.
- the label presenting unit 57A of the terminal device 50A refers to the assigned label storage unit 53 and acquires history information.
- the label presenting unit 57A of the terminal device 50A executes, for example, the process described using FIG. 14 as the label presenting process (S46), and determines the private label.
- the label presenting unit 57A presents the determined private label to the user.
- the public label may be presented simultaneously.
- the private label presentation method is terminated.
- the label editing unit 56 assigns a correct label or deletes an incorrect label by a user operation.
- the label presenting unit 57A may present a private label in consideration of past labeling contents, that is, including a corrected portion by a user operation.
- the terminal device 50A may present the private label based on the accompanying information.
- the label presenting unit 57A presents the private label to the user based on the accompanying information given when the recognition target data is generated. Further, the terminal device 50A may present the private label to the user using both the action history and the accompanying information.
- When the label presenting unit 57A acquires a signal indicating that the operation has been received from the operation receiving unit 55, the label presenting unit 57A refers to the assigned label storage unit 53. As illustrated in (A) or (B) of FIG. 11, the assigned label storage unit 53 stores accompanying information associated with the recognition target data. That is, the label presenting unit 57A can acquire the accompanying information by referring to the assigned label storage unit 53. Then, the label presenting unit 57A presents a private label based on the relationship between the accompanying information and the private labels assigned in the past.
- For example, the label presenting unit 57A identifies other recognition target data to which the same public label is assigned, and presents the private label assigned to that other recognition target data. When the accompanying information includes a shooting position, the label presenting unit 57A identifies other recognition target data shot at the same or a nearby shooting position, and presents the private label assigned to that other recognition target data.
- When the accompanying information includes a shooting date and time, the label presenting unit 57A identifies other recognition target data shot in the same period, and presents the private label assigned to that other recognition target data.
- When the accompanying information includes camera information, the label presenting unit 57A identifies other recognition target data captured with the same or similar camera information, and presents the private label assigned to that other recognition target data.
- When the accompanying information includes a face recognition result, the label presenting unit 57A identifies other recognition target data with the same face recognition result, and presents the private label assigned to that other recognition target data.
- When the recognition target data has a plurality of types of accompanying information, the label presenting unit 57A determines the private label to be presented by considering them comprehensively. For example, the label presenting unit 57A may determine the private label to be presented by weighted voting.
- the label presenting unit 57A may also determine the private label to be presented by using a predetermined relationship between the accompanying information and an assumed situation, without using the relationship between the accompanying information and the private labels given in the past. The predetermined relationship between the accompanying information and the assumed situation is stored in advance, for example in a database, before the process is executed, and may be derived from general rules or empirical rules.
- Such a predetermined relationship between the accompanying information and the assumed situation can be described using camera information as an example: when the focal length in the camera information is short, there is a high possibility that a still image or a portrait has been taken; when the focal length is long, there is a high possibility that a landscape has been shot.
- the label presenting unit 57A may present the private label based on the predetermined relationship between the accompanying information and the assumed situation.
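- A sketch of such a rule; the focal-length cut-offs are illustrative assumptions, not values from the embodiment:

```python
def suggest_from_camera(focal_length_mm, short_mm=35, long_mm=85):
    """Map camera information to an assumed situation (rule of thumb)."""
    if focal_length_mm <= short_mm:
        return "portrait"    # short focal length: still image / portrait
    if focal_length_mm >= long_mm:
        return "landscape"   # long focal length: landscape
    return None              # no confident suggestion in between
```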
- FIG. 16 is a flowchart showing a method for presenting a private label.
- the flowchart shown in FIG. 16 is executed, for example, when the label edit button is selected by the user.
- the label presenting unit 57A of the terminal device 50A refers to the assigned label storage unit 53 and acquires the accompanying information as the accompanying information acquisition process (S48).
- the label presenting unit 57A of the terminal device 50A determines a private label using accompanying information as a label presenting process (S50).
- the label presenting unit 57A presents the determined private label to the user.
- the private label presentation method ends.
- the terminal device 50A according to the third embodiment can present the private label to the user according to the user's behavior. Further, the terminal device 50A according to the third embodiment can present a private label to the user according to the situation at the time of generating the recognition target data. For this reason, the user's labor for labeling can be reduced.
- the terminal device 50B according to the fourth embodiment differs from the terminal device 50 according to the second embodiment in that it includes an operation reception unit 55B, an image determination unit (determination unit) 59, a comment analysis unit (analysis unit) 60, and a label presentation unit 57B; the others are the same.
- the terminal device 50B according to the fourth embodiment provides an additional function that reduces the user's labor for labeling by presenting a private label to the user using a comment generated when the user shares recognition target data.
- FIG. 17 is a functional block diagram of the terminal device 50B according to the fourth embodiment.
- the terminal device 50B includes a data acquisition unit 51, a recognition unit 52, a given label storage unit 53, a feedback unit 54, an operation reception unit 55B, a label editing unit 56, an image determination unit 59, a comment analysis unit 60, and a label presentation unit 57B.
- the hardware of the terminal device 50B is the same as that of the terminal device 10.
- the operation accepting unit 55B accepts a user operation for attaching a comment and sharing the recognition target data with another person.
- For example, the operation accepting unit 55B accepts a comment operation attached when image data is shared with others via the Internet. The comment need not be associated with the recognition target data in a database; any comment and recognition target data uploaded in the same period may be used.
- the image determination unit 59 determines the recognition target data based on the user operation for sharing the recognition target data with another person. Subsequently, the comment analysis unit 60 analyzes the content of the comment attached to the recognition target data specified by the image determination unit 59, using a well-known language-processing function: it extracts words from the sentences and outputs them as the analysis result.
- the label presentation unit 57B presents the private label to the user based on the analysis result of the comment analysis unit 60. Specifically, the label presentation unit 57B estimates a season and an event related to the extracted word and presents a private label to the user.
- the label presenting unit 57B may also refer to the assigned label storage unit 53 and present the private label to the user based on the relationship between the season or event related to the extracted word and the private labels assigned in the past.
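- A minimal sketch of this flow, with a hypothetical keyword table standing in for a full language-analysis function:

```python
# Hypothetical table mapping extracted words to seasons or events.
KEYWORDS = {"fireworks": "summer festival", "snow": "winter",
            "cherry": "hanami"}

def labels_from_comment(comment):
    """Extract words from a shared-image comment and look up related
    seasons or events to present as private-label candidates."""
    words = comment.lower().split()       # stand-in for real word extraction
    return [KEYWORDS[w] for w in words if w in KEYWORDS]

print(labels_from_comment("Amazing fireworks by the river"))
# ['summer festival']
```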
- the other configuration of the terminal device 50B is the same as that of the terminal device 50.
- FIG. 18 is a flowchart showing a method for presenting a private label.
- the flowchart shown in FIG. 18 is executed when, for example, the data sharing button is selected by the user.
- the image determination unit 59 of the terminal device 50B determines the recognition target data based on a user operation for sharing the recognition target data with another person as an image determination process (S52).
- the comment analysis unit 60 of the terminal device 50B analyzes the content of the comment attached to the recognition target data identified in the image determination process as the comment analysis process (S54).
- the label presentation unit 57B of the terminal device 50B presents a private label to the user based on the analysis result of the comment analysis unit 60 as a label presentation process (S56). When the process of S56 ends, the private label presentation method ends.
- the terminal device 50B according to the fourth embodiment can present the private label to the user according to the comment given by the user. For this reason, a label with relatively high accuracy can be presented with a simple configuration.
- the terminal device 50C according to the fifth embodiment differs from the terminal device 50 in that it includes a list output unit 62, a representative label acquisition unit (relation acquisition unit) 63, and a correction recommendation unit 64; the others are the same.
- the terminal device 50C according to the fifth embodiment has an additional function that makes it easier to organize already assigned private labels by pointing out notation fluctuations, misspellings, divergences of expression, and the like.
- the learning system 100A according to the fifth embodiment is different from the learning system according to the first embodiment in that it includes a language server 80, and the others are the same.
- FIG. 19 is a functional block diagram of the learning system 100A and the terminal device 50C according to the fifth embodiment.
- the terminal device 50C includes a data acquisition unit 51 (not shown), a recognition unit 52 (not shown), an assigned label storage unit 53, a feedback unit 54 (not shown), an operation reception unit 55 (not shown), a label editing unit 56 (not shown), a list output unit 62, a representative label acquisition unit 63, and a correction recommendation unit 64.
- the hardware of the terminal device 50C is the same as that of the terminal device 10.
- the learning system 100A includes a teacher data creation device 30 (not shown), a learning device 40 (not shown), and a language server 80.
- the hardware of the language server 80 is the same as that of the terminal device 10.
- the terminal device 50C is configured to be able to communicate with the language server 80.
- the list output unit 62 outputs a list of assigned private labels to the language server 80.
- the list output unit 62 refers to the assigned label storage unit 53, lists a predetermined range (predetermined number) of private labels, and outputs the list to the language server 80.
- This list may contain only the text information of the private labels among the data shown in (A) or (B) of FIG. 11. Of course, the list may also include information other than the text information of the private labels.
- For example, the list output unit 62 outputs a list including the private labels "Sakura", "Sagra", "Hanami", "Ohanami", and a variant notation of "Hanami".
- the representative label acquisition unit 63 acquires the relationship between the representative label and the assigned private label from the language server 80.
- the representative label is a label under which similar private labels are aggregated, or a label in which notation fluctuations or misspellings are corrected.
- For example, the representative label acquisition unit 63 acquires the representative label "Hanami" associated with the private labels "Hanami", "Ohanami", and the variant notation, and acquires the representative label "Sakura" associated with the private labels "Sakura" and "Sagra".
- the correction recommendation unit 64 recommends the user to correct the private label to the representative label based on the relationship acquired by the representative label acquisition unit 63. For example, the correction recommendation unit 64 displays the assigned private label and the representative label on the display device, and prompts the user to correct.
- the recommended display is not limited to the above. The user may be prompted by voice information using a device other than the display device, for example, a speaker.
- the language server 80 includes a list acquisition unit 81, an aggregation unit 82, a representative label selection unit 83, a representative label storage unit 84, and a representative label output unit 85.
- the list acquisition unit 81 acquires a list from one or a plurality of terminal devices 50C. As described above, the list includes text information of the private label.
- the aggregation unit 82 aggregates private labels into groups based on the list acquired by the list acquisition unit 81.
- the aggregation unit 82 groups the private labels of the list based on the similarity of meaning, the similarity of sound, and the like.
- For example, suppose the list includes the private labels "Sakura", "Sagra", "Hanami", "Ohanami", and a variant notation of "Hanami". In this case, "Sakura" and "Sagra" are collected as one group, and "Hanami", "Ohanami", and the variant notation are collected as another group.
- the representative label selection unit 83 selects a representative label for each group aggregated by the aggregation unit 82. For a group in which similar private labels are aggregated, the representative label selection unit 83 selects the word with the largest number of searches as the representative label, using an Internet search engine or the like. When notation fluctuations or misspellings are included, the representative label selection unit 83 selects a correct or appropriate word as the representative label by using a dictionary database or the like. As a specific example, the representative label selection unit 83 selects "Sakura", in which the misspelling is corrected, as the representative label for the group containing "Sakura" and "Sagra", and selects "Hanami", which has the largest number of search results, as the representative label for the group containing "Hanami", "Ohanami", and the variant notation.
- the representative label selection unit 83 may store the selected representative label in the representative label storage unit 84.
- the representative label selection unit 83 may compare the selected representative label with past representative labels by referring to the selection history in the representative label storage unit 84. By configuring it in this way, the representative label selected by the representative label selection unit 83 can be stabilized.
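- The aggregation and selection might be sketched as follows; string similarity stands in for the meaning/sound similarity of the aggregation unit 82, and a count dictionary stands in for live search-engine queries:

```python
import difflib

def aggregate(labels, threshold=0.6):
    """Group labels whose spellings are similar (crude similarity proxy)."""
    groups = []
    for label in labels:
        for g in groups:
            if difflib.SequenceMatcher(None, label, g[0]).ratio() >= threshold:
                g.append(label)
                break
        else:
            groups.append([label])
    return groups

def pick_representative(group, search_counts):
    """Choose the member with the most (hypothetical) search hits."""
    return max(group, key=lambda w: search_counts.get(w, 0))

groups = aggregate(["Sakura", "Sagra", "Hanami", "Ohanami"])
# e.g. [["Sakura", "Sagra"], ["Hanami", "Ohanami"]]
```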
- Based on the selection result of the representative label selection unit 83, the representative label output unit 85 outputs the relationship between the representative label and the assigned private labels to the terminal device 50C.
- FIG. 20 is a flowchart showing a private label correction recommendation method.
- the flowchart shown in FIG. 20 can be executed at a predetermined timing.
- the list output unit 62 of the terminal device 50C outputs the assigned private label list to the language server 80 as list output processing (S70).
- the list acquisition unit 81 of the language server 80 acquires a list as list acquisition processing (S72).
- the aggregation unit 82 of the language server 80 aggregates private labels into groups, as the aggregation process (S74), based on the list acquired by the list acquisition unit 81. Then, as the representative label selection process (S76), the representative label selection unit 83 of the language server 80 selects a representative label for each group aggregated by the aggregation unit 82. Then, as the representative label output process (S77), the representative label output unit 85 of the language server 80 outputs the relationship between the representative label and the assigned private labels to the terminal device 50C, based on the selection result of the representative label selection unit 83.
- the representative label acquisition unit 63 of the terminal device 50C acquires the relationship between the representative label and the assigned private label from the language server 80 as a representative label acquisition process (S78).
- the correction recommendation unit 64 of the terminal device 50C recommends the user to correct the private label to the representative label based on the relationship acquired by the representative label acquisition unit 63 as the recommendation process (S80).
- the private label correction recommendation method ends.
- the learning system 100A and the terminal device 50C according to the fifth embodiment can prompt the user to organize the private labels. For this reason, the already assigned private labels can be organized.
- the learning system 100B according to the sixth embodiment differs from the learning system 100 according to the first embodiment in that it includes a threshold setting unit 44 (an example of a threshold changing device), and is otherwise the same. The description below centers on the differences between the learning system 100B and the learning system 100, and overlapping description is omitted.
- the weighting factor learned by the learning device 40 is distributed to the terminal device 10.
- the terminal device 10 operates the recognition unit 11 using the distributed weight coefficient.
- the recognition unit 11 updates the neural network using the distributed weighting factor.
- the recognition unit 11 acquires recognition target data and, using the neural network, outputs a recognition score indicating the degree to which the content of the recognition target data corresponds to a predetermined label.
- the recognition unit 11 assigns a label corresponding to a recognition score equal to or greater than a predetermined value to the recognition target data.
- the recognition unit 11 outputs a recognition result indicating whether or not the content of the recognition target data matches a predetermined label using the recognition score and a threshold value preset for the recognition score.
- the predetermined value is a threshold for judging the recognition score and is set in advance for the recognition score. "Set in advance" means that the threshold is set before the recognition unit 11 performs the recognition process.
- the threshold value (predetermined value) may be set in advance at the time of initial setting, or may be calculated by evaluating using evaluation data during learning or after completion of learning.
- the threshold value is determined using the evaluation data. That is, the threshold value is calculated by evaluating the learning recognition unit 42 or the recognition unit 11 using the evaluation data during or after learning.
- the evaluation data is data that does not overlap with the teacher data, and includes the correct evaluation for the input data and the predetermined label.
- the correct evaluation is associated with the input data and indicates whether the content of the input data is a positive evaluation that matches a predetermined label or whether the content of the input data is a negative evaluation that does not match the predetermined label.
- the correct evaluation may include not only “positive evaluation” and “negative evaluation” but also “ignore evaluation”. However, the evaluation data to which “ignore evaluation” is assigned is not used for determining the threshold value.
- the learning system 100B inputs evaluation data to the neural network that is being learned or has been learned, and sets a threshold for the output of the learning recognition unit 42 or the recognition unit 11 using the output recognition score.
- FIG. 22 is a graph for explaining the threshold value of the recognition score.
- FIG. 22 shows the result of the learning recognition unit 42 or the recognition unit 11 recognizing evaluation data to which "positive evaluation" or "negative evaluation" has been given with respect to a predetermined label.
- the horizontal axis is the recognition score, and the vertical axis is the frequency.
- the recognition score is a score representing the probability of recognition.
- the frequency is the number of evaluation data.
- a threshold t_i for determining a positive evaluation or a negative evaluation from the recognition score is required.
- As shown in FIGS. 22A and 22B, evaluation using the evaluation data yields a distribution of positive evaluation data and a distribution of negative evaluation data.
- Based on general statistics, the learning system 100B sets a threshold t_i on the recognition score that separates these two distributions.
- In the learning system 100B, the threshold t_i on the recognition score is thus set using general statistics; as an example, the threshold is set using the F-measure, which is the harmonic mean of recall and precision. The setting of the threshold t_i will be described in detail later.
- FIG. 23 is a functional block diagram of the learning system and the terminal device according to the sixth embodiment.
- The learning system 100B shown in FIG. 23 delivers the threshold t_i to the terminal device 10B together with the learning result.
- the learning system 100B is different from the learning system 100 according to the first embodiment in the learning device 40B, and the others are the same.
- the learning device 40B is different from the learning device 40 according to the first embodiment in that the learning device 40B includes a threshold setting unit 44, and the others are the same.
- the threshold setting unit 44 includes an evaluation data acquisition unit 441, a terminal data acquisition unit 442, a recognition score acquisition unit 443, a calculation unit 444, and a change unit 445.
- the evaluation data acquisition unit 441 acquires evaluation data.
- the evaluation data is stored, for example, in the storage unit of the learning device 40B.
- the evaluation data includes a correct evaluation for a predetermined label (hereinafter, label i denotes the predetermined label). More specifically, the evaluation data is a set (data set) of image data (input data) to which correct labels are assigned.
- In the following, the set of positive evaluation data of label i included in the evaluation data is denoted G_i^+, the set of negative evaluation data of label i included in the evaluation data is denoted G_i^-, and the number of image data included in a set X is denoted #(X).
- the terminal data acquisition unit 442 acquires the ratio r_{i,a} between positive evaluation and negative evaluation for label i of the data associated with the terminal device 10B.
- the terminal device 10B is the same as the terminal device 10 according to the first embodiment.
- the data associated with the terminal device 10B is recognition target data related to the terminal device 10B and is recognized data.
- Examples of the data associated with the terminal device 10B include a set of recognized image data stored in the terminal device 10B or on an external recording medium, and a set of recognized image data associated with the terminal ID or user ID of the terminal device 10B.
- a more specific example is an album of images stored in the terminal device 10B.
- the ratio r_{i,a} between positive and negative evaluation for label i is the ratio of the number of positive evaluation data to the number of negative evaluation data among the recognized data, that is, the abundance ratio of positive and negative evaluation data. Among the recognized data associated with the terminal device 10B, the set of positive evaluation data is denoted G'_i^+ and the set of negative evaluation data is denoted G'_i^-. That is, the ratio r_{i,a} is the value #(G'_i^+) / #(G'_i^-) obtained by dividing the number of positive evaluation data by the number of negative evaluation data.
- the terminal data acquisition unit 442 can acquire the ratio r_{i,a} of positive to negative evaluation for label i using various methods.
- For example, the terminal data acquisition unit 442 acquires the ratio r_{i,a} based on the recognition result of the neural network of the terminal device 10B, that is, based on the recognition result of the recognition unit 11.
- Alternatively, the terminal data acquisition unit 442 may acquire the ratio r_{i,a} based on the result of annotation by the user of the terminal device 10B.
- Alternatively, the terminal data acquisition unit 442 can acquire the ratio r_{i,a} based on a user operation of the terminal device 10B or on terminal information. For example, the terminal data acquisition unit 442 estimates the ratio r_{i,a} based on a user input (user operation) regarding label i: it may ask the user for a degree of interest indicating how interested the user is in label i and estimate the ratio r_{i,a} from the answer, or it may ask the user for the ratio r_{i,a} directly.
- Alternatively, the terminal data acquisition unit 442 may estimate the ratio r_{i,a} based on the terminal information of the terminal device 10B. The terminal information is information stored in the terminal device 10B, such as regional data. For example, the terminal data acquisition unit 442 estimates the ratio r_{i,a} from a prestored correlation between regions and label i, together with the acquired regional data.
- the recognition score acquisition unit 443 acquires a recognition score of a predetermined label related to input data from a neural network (recognition unit 11) or a neural network (learning recognition unit 42) having the same weighting factor as the neural network. Since the weighting factor of the learning recognition unit 42 and the weighting factor of the recognition unit 11 are synchronized, the recognition score acquisition unit 443 may use either neural network.
- the recognition score acquisition unit 443 acquires the recognition score of the predetermined label related to the input data by causing the learning recognition unit 42 or the recognition unit 11 to read the evaluation data acquired by the evaluation data acquisition unit 441.
- In the following, let p_i denote the degree of probability (for example, the recognition score) for label i. Four data counts are used: the number of input data whose correct evaluation is positive and which the learning recognition unit 42 recognizes as positive (true positive); the number of input data whose correct evaluation is negative and which are recognized as negative (true negative); the number of input data whose correct evaluation is positive but which are recognized as negative (false negative); and the number of input data whose correct evaluation is negative but which are recognized as positive (false positive).
- the calculation unit 444 performs evaluation using at least the precision. The precision is the value obtained by dividing the number of data whose correct answer is "positive evaluation" among the data recognized as "positive evaluation" by the learning recognition unit 42 by the total number of data recognized as "positive evaluation".
- Let P_i^+ be the set of data whose recognition result is "positive evaluation" and P_i^- be the set of data whose recognition result is "negative evaluation". The number of data used for the evaluation can then be expressed as #(P_i^+) + #(P_i^-).
- Using this notation, with G_i^+ the set of positive evaluation data of label i included in the evaluation data and G_i^- the set of negative evaluation data, the numbers of "true positive", "true negative", "false negative", and "false positive" data for label i, and the precision defined from them, can be expressed as follows.
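- The expressions themselves appear as images in the original; in standard set notation they read:

```latex
\mathrm{TP}_i = \#(G_i^{+}\cap P_i^{+}),\qquad
\mathrm{TN}_i = \#(G_i^{-}\cap P_i^{-}),\qquad
\mathrm{FN}_i = \#(G_i^{+}\cap P_i^{-}),\qquad
\mathrm{FP}_i = \#(G_i^{-}\cap P_i^{+})
```

```latex
\mathrm{precision}_i
 = \frac{\mathrm{TP}_i}{\mathrm{TP}_i+\mathrm{FP}_i}
 = \frac{\#(G_i^{+}\cap P_i^{+})}{\#(P_i^{+})}
```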
- the calculation unit 444 may also perform evaluation using the recall. The recall is the value obtained by dividing the number of data recognized as "positive evaluation" by the learning recognition unit 42 among the data whose correct answer is "positive evaluation" by the number of "positive evaluation" data in the evaluation data. Specifically, it is expressed as follows.
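- In the same notation (the original formula image is not reproduced):

```latex
\mathrm{recall}_i
 = \frac{\mathrm{TP}_i}{\mathrm{TP}_i+\mathrm{FN}_i}
 = \frac{\#(G_i^{+}\cap P_i^{+})}{\#(G_i^{+})}
```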
- When using both the precision and the recall, the calculation unit 444 calculates their harmonic mean (the F-measure) as the evaluation value. The F-measure is an index that attends equally to recall and precision.
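- In standard form:

```latex
F_i = \frac{2\,\mathrm{precision}_i\,\mathrm{recall}_i}{\mathrm{precision}_i + \mathrm{recall}_i}
```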
- the evaluation value described above is influenced by the distribution of positive and negative evaluations in the evaluation data: when the ratio of positive to negative evaluations in the evaluation data is biased, the calculated evaluation value reflects that bias. Ideally, the difference between the data distribution of the evaluation data and that of the environment in which the user actually operates (the data distribution of the terminal device 10B) should be small. For this reason, the calculation unit 444 has a function of correcting the data counts so as to reduce this difference and of calculating the precision using the corrected counts. In particular, when "ignore evaluation" is included in the evaluation data, the difference can become significant.
- FIG. 24 is a diagram for explaining data bias in the evaluation data. The distribution shown in FIG. 24A is the true distribution of "positive evaluation" and "negative evaluation" (the distribution in the terminal device 10B). The distribution shown in FIG. 24B arises when the annotator completes "positive evaluation" tagging for the evaluation data, performs "negative evaluation" tagging on only some of the evaluation data, and sets the remaining evaluation data to "ignore evaluation". In this case, data that should be treated as "negative evaluation" becomes "ignore evaluation" data, so the data distribution of the evaluation data can differ greatly from the data distribution of the user environment.
- Therefore, the calculation unit 444 corrects the evaluation data so as to have the same abundance ratio as the ratio r_{i,a} in the terminal device 10B, and then performs the evaluation.
- The evaluation value based on such ideal data is expressed as follows. As described above, among the recognized data associated with the terminal device 10B, the set of positive evaluation data is G'_i^+ and the set of negative evaluation data is G'_i^-. Further, among the recognized data associated with the terminal device 10B, let P'_i^+ be the set of data whose recognition result is "positive evaluation" and P'_i^- be the set of data whose recognition result is "negative evaluation".
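- In this notation, the ideal evaluation value would be the precision measured on the terminal's own data (the original expression is an image; this is its natural reading):

```latex
\mathrm{precision}_i^{\,\mathrm{ideal}}
 = \frac{\#(G_i'^{+}\cap P_i'^{+})}{\#(P_i'^{+})}
```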
- In practice, the calculation unit 444 performs a correction that reduces the influence of fluctuations in the number of "false positives". Here, r_{i,test} denotes the ratio of "positive evaluation" to "negative evaluation" in the evaluation data. The calculation unit 444 corrects the number of "false positives" using the ratio r_{i,test} of the evaluation data and the ratio r_{i,a} of the terminal device 10B to obtain a corrected "false positive" count, and calculates the precision using the following Expression 13.
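- Expression 13 itself appears as an image in the original; one consistent reading of the correction just described is:

```latex
\mathrm{FP}_i^{\mathrm{corr}} = \mathrm{FP}_i\cdot\frac{r_{i,\mathrm{test}}}{r_{i,a}},
\qquad
\mathrm{precision}_i = \frac{\mathrm{TP}_i}{\mathrm{TP}_i + \mathrm{FP}_i^{\mathrm{corr}}}
\qquad (13)
```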
- the changing unit 445 changes the threshold value t i using the precision calculated by the calculating unit 444.
- the recognition score relevance ratio is the highest may be a threshold t i.
- the changing unit 445, the harmonic mean of the thresholds t i recall and precision rate may change to the recognition score of maximum.
- the changed threshold value t i is distributed to the terminal device 10B.
- FIG. 25 is a flowchart showing the threshold value changing process.
- the flowchart shown in FIG. 25 is executed, for example, at a predetermined timing during learning.
- the evaluation data acquisition unit 441 of the threshold setting unit 44 acquires evaluation data as the evaluation data acquisition process (S90).
- the evaluation data acquisition unit 441 acquires, for example, a data set of image data to each of which one of "positive evaluation", "negative evaluation", and "ignore evaluation" regarding label i is assigned as the correct evaluation.
- the terminal data acquisition unit 442 of the threshold setting unit 44 acquires terminal data as terminal data acquisition processing (S92).
- the terminal data acquisition unit 442 acquires, for example, the ratio r_{i,a} between positive and negative evaluation for label i of the data associated with the terminal device 10B.
- the recognition score acquisition unit 443 of the threshold setting unit 44 acquires a recognition score as a recognition score acquisition process (S94).
- the recognition score acquisition unit 443 acquires the recognition score of the predetermined label related to the input data by causing the learning recognition unit 42 to read the evaluation data acquired in the evaluation data acquisition process (S90).
- The calculation unit 444 of the threshold setting unit 44 calculates the precision as the calculation process (S96).
- For example, the calculation unit 444 calculates the precision using Expression 13 described above. Specifically, the calculation unit 444 counts the "true positives" and the "false positives" based on the recognition scores acquired in the recognition score acquisition process (S94). The calculation unit 444 then corrects the "false positive" count based on the ratio r_i,test of "positive evaluation" to "negative evaluation" in the evaluation data and the ratio r_i,a acquired in the terminal data acquisition process (S92). Finally, the calculation unit 444 calculates the precision using the "true positives" and the corrected "false positives" (Expression 13).
- The changing unit 445 of the threshold setting unit 44 changes the threshold t_i as the changing process (S98). The changing unit 445 changes the threshold t_i using the precision calculated in the calculation process (S96).
- the flowchart shown in FIG. 25 ends.
- As described above, in the threshold setting unit 44 according to the sixth embodiment, the number of negative-evaluation input data recognized as positive is corrected using the ratio r_i,test of positive to negative evaluations in the evaluation data and the ratio r_i,a of positive to negative evaluations in the data associated with the terminal device 10B. The threshold t_i used in the recognition performed by the terminal device 10B is then changed based on the precision for the label i calculated from the corrected count. In this way, when calculating the precision for the label i, the count of negative-evaluation input data recognized as positive is corrected in consideration of both the positive/negative data distribution of the evaluation data and the positive/negative data distribution in the terminal device 10B.
- In other words, by correcting the "false positive" count, the threshold setting unit 44 can effectively equalize the ratio r_i,test with the ratio r_i,a.
- The terminal device 10C according to the seventh embodiment differs from the terminal device 10B according to the sixth embodiment (or the terminal device 10 according to the first embodiment) in that it includes a threshold setting unit 44C (an example of a threshold changing device).
- The threshold t_i changing process described in the sixth embodiment requires a certain amount of evaluation data, so executing it on the terminal device 10B could take time.
- The terminal device 10C according to the seventh embodiment therefore stores in advance the relationship between the ratio r_i,a of positive to negative evaluations of the data associated with the terminal device 10C and the threshold t_i, making it possible to change the threshold t_i appropriately in response to environmental changes.
- FIG. 26 is a functional block diagram of the terminal device 10C according to the seventh embodiment. As illustrated in FIG. 26, the terminal device 10C includes a terminal data acquisition unit 446, a changing unit 447, and a storage unit 448.
- the terminal data acquisition unit 446 has the same function as the terminal data acquisition unit 442 according to the sixth embodiment.
- The storage unit 448 stores the relationship between the ratio r_i,a of positive to negative evaluations of the data associated with the terminal device 10C and the threshold t_i.
- For example, the storage unit 448 stores a function of the threshold t_i with the ratio r_i,a as a variable.
- The changing unit 447 changes the threshold t_i using the relationship stored in the storage unit 448 and the ratio r_i,a acquired by the terminal data acquisition unit 446.
- For example, the changing unit 447 substitutes the ratio r_i,a acquired by the terminal data acquisition unit 446 into the function of the threshold t_i (with r_i,a as its variable) stored in the storage unit 448, thereby obtaining the changed threshold t_i.
- Alternatively, the changing unit 447 obtains the changed threshold t_i by interpolation, using thresholds t_i stored discretely in the storage unit 448 and the ratio r_i,a acquired by the terminal data acquisition unit 446.
- For example, suppose the storage unit 448 stores thresholds for the ratios r_i,a = 0.1 and r_i,a = 0.2, and the ratio r_i,a acquired by the terminal data acquisition unit 446 is 0.15. In this case, the changing unit 447 sets (t_i(0.1) + t_i(0.2)) / 2 as the changed threshold t_i (linear interpolation). The changing unit 447 then replaces the current threshold with the changed threshold t_i.
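A minimal sketch of this interpolation, assuming the storage unit 448 is modeled as a table of (ratio, threshold) pairs at discrete ratios; the table contents below are hypothetical values chosen only to reproduce the example.

```python
def interpolate_threshold(stored: dict, r_a: float) -> float:
    """Linearly interpolate the threshold t_i for a ratio r_i,a that falls
    between ratios stored discretely (cf. storage unit 448)."""
    ratios = sorted(stored)
    if r_a <= ratios[0]:
        return stored[ratios[0]]
    if r_a >= ratios[-1]:
        return stored[ratios[-1]]
    for lo, hi in zip(ratios, ratios[1:]):
        if lo <= r_a <= hi:
            w = (r_a - lo) / (hi - lo)
            return (1 - w) * stored[lo] + w * stored[hi]

# With thresholds stored for ratios 0.1 and 0.2 and an acquired ratio of 0.15,
# this returns (t_i(0.1) + t_i(0.2)) / 2, matching the example above.
thresholds = {0.1: 0.62, 0.2: 0.48}  # hypothetical stored values
print(interpolate_threshold(thresholds, 0.15))  # -> 0.55
```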
- the other configuration of the terminal device 10C is the same as that of the terminal device 10B.
- FIG. 27 is a flowchart showing the threshold value changing process.
- the flowchart shown in FIG. 27 is executed, for example, when a threshold change process start button is selected by a user operation.
- The terminal data acquisition unit 446 of the terminal device 10C acquires terminal data as the terminal data acquisition process (S100). For example, the terminal data acquisition unit 446 acquires the ratio r_i,a of positive to negative evaluations for the label i of the data associated with the terminal device 10C.
- The changing unit 447 of the terminal device 10C acquires the changed threshold t_i as the threshold acquisition process (S102).
- For example, the changing unit 447 acquires the changed threshold t_i based on the relationship between the ratio r_i,a and the threshold t_i stored in the storage unit 448 and on the ratio r_i,a acquired in the terminal data acquisition process (S100).
- The changing unit 447 of the terminal device 10C then changes the threshold t_i as the changing process. The changing unit 447 replaces the current threshold t_i with the changed threshold t_i acquired in the threshold acquisition process (S102).
- the flowchart shown in FIG. 27 ends.
- As described above, the threshold setting unit 44C according to the seventh embodiment changes the threshold t_i using the pre-stored relationship between the ratio r_i,a and the threshold t_i together with the ratio r_i,a acquired by the terminal data acquisition unit 446. Using this pre-stored relationship reduces the computational load of changing the threshold. Further, the ratio r_i,a of positive to negative evaluations of the data associated with a terminal device differs from terminal device to terminal device, and the threshold setting unit 44C according to the seventh embodiment can change the threshold to the t_i best suited to the usage environment of the terminal device 10C.
- The present invention is not limited to the above embodiments and can be variously modified without departing from the gist thereof.
- FIG. 21 is a diagram showing a hierarchical structure of private labels.
- The hierarchized private labels have a "category" item for classifying the labels. For example, the labels "A", "B", and "C" are classified into the category "person name", the labels "D" and "E" into the category "place name", the label "F" into the category "time", and the label "G" into the category "other".
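The hierarchy of FIG. 21 can be pictured as a simple mapping from categories to the private labels under them; this sketch just restates the classification above in data form.

```python
# Hierarchized private labels: each category groups the labels under it (FIG. 21).
private_label_categories = {
    "person name": ["A", "B", "C"],
    "place name": ["D", "E"],
    "time": ["F"],
    "other": ["G"],
}
```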
- A private label may also be assigned automatically on the terminal device side. Further, when a private label satisfies a predetermined condition, the private label may be promoted to a public label. For example, when a certain number of users use the same private label, that private label may be changed to a public label. Alternatively, the learning system may total up the private labels assigned to the same public label and replace a private label with the public label according to the usage situation.
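As a sketch of the promotion rule just described, a private label might be promoted once the number of distinct users assigning it reaches some threshold; the count source and the threshold value here are assumptions, since the text leaves the "predetermined condition" open.

```python
def maybe_promote(private_label: str, user_counts: dict, min_users: int = 100) -> bool:
    """Promote a private label to a public label when enough distinct users
    have assigned the same private label (hypothetical condition)."""
    return user_counts.get(private_label, 0) >= min_users
```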
- Although the learning system 100B according to the sixth embodiment described above was described as a learning system that can use not only positive and negative evaluations but also the ignore evaluation, the ignore evaluation does not always have to be used. That is, the threshold setting unit 44 described in the sixth embodiment may be applied to a conventional learning system that makes determinations based only on positive and negative evaluations. Even in that case, the threshold can be changed appropriately for each terminal device.
- The threshold setting unit 44 according to the sixth embodiment described above may be provided in the terminal device 10B instead of the learning device 40B. Further, the terminal data acquisition process (S92) shown in FIG. 25 is not limited to being executed between the evaluation data acquisition process (S90) and the recognition score acquisition process (S94); it only needs to be executed before the calculation process (S96).
- The apparatuses according to the second to seventh embodiments described above may realize their functions by means of a program.
- Other forms of the second to seventh embodiments include a method corresponding to the operation of these apparatuses, a program providing the functions of the apparatuses, and a storage medium storing the program.
- …aggregating unit, 83…representative label selection unit, 84…representative label storage unit, 85…representative label output unit, 44, 44C…threshold setting unit, t_i…threshold, 100, 100A, 100B…learning system, 111…input layer, 112…intermediate layer, 113…output layer, 441…evaluation data acquisition unit, 442, 446…terminal data acquisition unit, 443…recognition score acquisition unit, 444…calculation unit, 445, 447…changing unit.
Description
The learning system 100 according to the embodiment (see FIG. 4) is a system that learns parameters of the recognition unit 11 (see FIG. 1), which recognizes the content of recognition target data. Recognition target data is data whose content a computer is made to recognize, for example image data, audio data, or text data. The parameters of the recognition unit 11 are values used in the recognition process that recognizes the recognition target data. In the following, as an example, the recognition target data is image data and the recognition target is the content of an image (a person, an animal, an object, scenery, a room interior, and so on).
In the first embodiment, the terminal device 10 assigns labels learned by the learning system 100. A label in the first embodiment represents preset content and is common to the terminal device 10 and the learning system 100. A label can also be common among a plurality of terminal devices 10. In other words, a label in the first embodiment is not a label set freely by a user or the like. The second embodiment allows users to assign labels freely. In the following, a label of the first embodiment is called a public label, and a label set freely by a user is called a private label. Descriptions of matters already explained in the first embodiment are omitted below.
The terminal device 50A according to the third embodiment differs from the terminal device 50 according to the second embodiment in that it includes a label presentation unit 57A, and is otherwise identical. The terminal device 50A according to the third embodiment has an additional function that reduces the user's labeling effort by presenting private labels to the user. While in the second embodiment the assigned-label storage unit 53 optionally stored accompanying information and the private-label assignment date and time, in the third embodiment the assigned-label storage unit 53 stores at least one of the accompanying information and the private-label assignment date and time.
The terminal device 50B according to the fourth embodiment differs from the terminal device 50 according to the second embodiment in that it includes an operation reception unit 55B, an image determination unit (determination unit) 59, a comment analysis unit (analysis unit) 60, and a label presentation unit 57B, and is otherwise identical. The terminal device 50B according to the fourth embodiment has an additional function that reduces the user's labeling effort by presenting private labels to the user based on comments generated when the user shares recognition target data.
The terminal device 50C according to the fifth embodiment differs from the terminal device 50 according to the second embodiment in that it includes a list output unit 62, a representative label acquisition unit (relation acquisition unit) 63, and a correction recommendation unit 64, and is otherwise identical. The terminal device 50C according to the fifth embodiment has an additional function that makes already-assigned private labels easier to organize by pointing out spelling variations, typographical errors, divergence, and the like among the private labels. The learning system 100A according to the fifth embodiment differs from the learning system according to the first embodiment in that it includes a language server 80, and is otherwise identical.
The learning system 100B according to the sixth embodiment differs from the learning system 100 according to the first embodiment in that it includes a threshold setting unit 44 (an example of a threshold changing device), and is otherwise identical. The following description focuses on the differences between the learning system 100B and the learning system 100, and duplicate descriptions are omitted.
The terminal device 10C according to the seventh embodiment differs from the terminal device 10B according to the sixth embodiment (or the terminal device 10 according to the first embodiment) in that it includes a threshold setting unit 44C (an example of a threshold changing device), and is otherwise identical. The following description focuses on the differences between the terminal device 10C and the terminal device 10B (or the terminal device 10), and duplicate descriptions are omitted.
Claims (23)
- A learning system comprising: a learning device that trains, by error backpropagation, a neural network that classifies recognition target data using a plurality of labels; and a training data creation device that creates training data for the learning device, wherein
the training data creation device comprises:
an input data acquisition unit that acquires input data;
an evaluation acquisition unit that acquires, for each label of the input data acquired by the input data acquisition unit, any one of a positive evaluation indicating that the content of the input data matches the label, a negative evaluation indicating that the content of the input data does not match the label, and an ignore evaluation indicating exclusion from the labels to be learned; and
a training data creation unit that creates the training data by associating the input data acquired by the input data acquisition unit with the evaluation for each label acquired by the evaluation acquisition unit,
and the learning device comprises:
a training data acquisition unit that acquires the training data created by the training data creation device;
an input layer that acquires, as scores, the input data included in the training data acquired by the training data acquisition unit;
an intermediate layer that operates on the scores acquired by the input layer using weight coefficients;
an output layer that outputs a recognition score for each label using the scores operated on by the intermediate layer; and
an error backpropagation unit that adjusts the weight coefficients of the intermediate layer using the recognition score for each label output by the output layer and the ground-truth score of the evaluation for each label,
wherein the error backpropagation unit adjusts the weight coefficients of the intermediate layer so that the recognition score of a positively or negatively evaluated label approaches the ground-truth score of the positive or negative evaluation, and prevents the recognition score of an ignore-evaluated label from affecting the adjustment of the weight coefficients of the intermediate layer.
- The learning system according to claim 1, wherein the error backpropagation unit sets the ground-truth score of the ignore evaluation to the same value as the recognition score of the ignore-evaluated label, changes the difference between the ground-truth score of the ignore evaluation and the recognition score of the ignore-evaluated label to 0, or changes the differential value of that difference to 0.
- The learning system according to claim 1, wherein the error backpropagation unit blocks the connections of the neural network relating to ignore-evaluated labels.
- The learning system according to any one of claims 1 to 3, wherein the training data creation unit associates the ignore evaluation with any label for which the evaluation acquisition unit cannot acquire an evaluation.
- The learning system according to any one of claims 1 to 4, wherein the training data creation device comprises a reception unit that receives a user operation specifying the evaluation of a label, and
the evaluation acquisition unit acquires the evaluation of the label specified by the user operation received by the reception unit.
- The learning system according to claim 5, wherein the reception unit receives the user operation specifying the evaluations of some of the labels of the input data, and
the training data creation unit associates the evaluations of those labels acquired by the evaluation acquisition unit with the input data acquired by the input data acquisition unit, and sets the evaluations of the remaining labels of the input data to the ignore evaluation.
- The learning system according to any one of claims 1 to 6, wherein the training data creation unit sets the evaluations of all labels of the input data to the ignore evaluation before associating the evaluations of the labels acquired by the evaluation acquisition unit with the input data acquired by the input data acquisition unit.
- A learning device that trains, by error backpropagation, a neural network that classifies recognition target data using a plurality of labels, the learning device comprising:
a training data acquisition unit that acquires training data including input data and an evaluation for each label associated in advance with the input data;
an input layer that acquires, as scores, the input data included in the training data acquired by the training data acquisition unit;
an intermediate layer that operates on the scores acquired by the input layer using weight coefficients;
an output layer that outputs a recognition score for each label using the scores operated on by the intermediate layer; and
an error backpropagation unit that adjusts the weight coefficients of the intermediate layer using the recognition score for each label output by the output layer and the ground-truth score of the evaluation for each label,
wherein each label of the input data is associated with any one of a positive evaluation indicating that the content of the input data matches the label, a negative evaluation indicating that the content of the input data does not match the label, and an ignore evaluation indicating exclusion from the labels to be learned, and
the error backpropagation unit adjusts the weight coefficients of the intermediate layer so that the recognition score of a positively or negatively evaluated label approaches the ground-truth score of the positive or negative evaluation, and prevents the recognition score of an ignore-evaluated label from affecting the adjustment of the weight coefficients of the intermediate layer.
- A training data creation device that creates training data for a learning device that trains, by error backpropagation, a neural network that classifies recognition target data using a plurality of labels, the training data creation device comprising:
an input data acquisition unit that acquires input data;
an evaluation acquisition unit that acquires, for each label of the input data acquired by the input data acquisition unit, any one of a positive evaluation indicating that the content of the input data matches the label, a negative evaluation indicating that the content of the input data does not match the label, and an ignore evaluation indicating exclusion from the labels to be learned; and
a training data creation unit that creates the training data by associating the input data acquired by the input data acquisition unit with the evaluation for each label acquired by the evaluation acquisition unit.
- A learning method for training, by error backpropagation, a neural network that classifies recognition target data using a plurality of labels, the method comprising:
a training data acquisition step of acquiring training data including input data and an evaluation for each label associated in advance with the input data;
an input step in which an input layer acquires, as scores, the input data included in the training data acquired in the training data acquisition step;
an operation step in which an intermediate layer operates on the scores acquired in the input step using weight coefficients;
an output step in which an output layer outputs a recognition score for each label using the scores operated on in the operation step; and
an error backpropagation step of adjusting the weight coefficients of the intermediate layer using the recognition score for each label output in the output step and the ground-truth score of the evaluation for each label,
wherein each label of the input data is associated with any one of a positive evaluation indicating that the content of the input data matches the label, a negative evaluation indicating that the content of the input data does not match the label, and an ignore evaluation indicating exclusion from the labels to be learned, and
in the error backpropagation step, the weight coefficients of the intermediate layer are adjusted so that the recognition score of a positively or negatively evaluated label approaches the ground-truth score of the positive or negative evaluation, while the recognition score of an ignore-evaluated label is prevented from affecting the adjustment of the weight coefficients of the intermediate layer.
- A training data creation method for creating training data for a learning device that trains, by error backpropagation, a neural network that classifies recognition target data using a plurality of labels, the method comprising:
an input data acquisition step of acquiring input data;
an evaluation acquisition step of acquiring, for each label of the input data acquired in the input data acquisition step, any one of a positive evaluation indicating that the content of the input data matches the label, a negative evaluation indicating that the content of the input data does not match the label, and an ignore evaluation indicating exclusion from the labels to be learned; and
a training data creation step of creating the training data by associating the input data acquired in the input data acquisition step with the evaluation for each label acquired in the evaluation acquisition step.
- A learning program that causes a computer to operate so as to train, by error backpropagation, a neural network that classifies recognition target data using a plurality of labels, the program causing the computer to function as:
a training data acquisition unit that acquires training data including input data and an evaluation for each label associated in advance with the input data;
an input layer that acquires the input data as scores;
an intermediate layer that operates on the scores acquired by the input layer using weight coefficients;
an output layer that outputs a recognition score for each label using the scores operated on by the intermediate layer; and
an error backpropagation unit that adjusts the weight coefficients of the intermediate layer using the recognition score for each label output by the output layer and the ground-truth score of the evaluation for each label,
wherein each label of the input data is associated with any one of a positive evaluation indicating that the content of the input data matches the label, a negative evaluation indicating that the content of the input data does not match the label, and an ignore evaluation indicating exclusion from the labels to be learned, and
the error backpropagation unit adjusts the weight coefficients of the intermediate layer so that the recognition score of a positively or negatively evaluated label approaches the ground-truth score of the positive or negative evaluation, and prevents the recognition score of an ignore-evaluated label from affecting the adjustment of the weight coefficients of the intermediate layer.
- A training data creation program that causes a computer to operate so as to create training data for a learning device that trains, by error backpropagation, a neural network that classifies recognition target data using a plurality of labels, the program causing the computer to function as:
an input data acquisition unit that acquires input data;
an evaluation acquisition unit that acquires, for each label of the input data acquired by the input data acquisition unit, any one of a positive evaluation indicating that the content of the input data matches the label, a negative evaluation indicating that the content of the input data does not match the label, and an ignore evaluation indicating exclusion from the labels to be learned; and
a training data creation unit that creates the training data by associating the input data acquired by the input data acquisition unit with the evaluation for each label acquired by the evaluation acquisition unit.
- A terminal device capable of communicating with the learning device according to claim 8, comprising:
a recognition target data acquisition unit that acquires the recognition target data;
a recognition unit that assigns, to the recognition target data, the label representing the content of the recognition target data, using the parameters learned by the learning device;
an operation reception unit that receives a user operation determining a private label to be assigned to the recognition target data acquired by the recognition target data acquisition unit; and
a label editing unit that assigns the private label to the recognition target data based on the user operation received by the operation reception unit.
- The terminal device according to claim 14, comprising a label presentation unit that presents the private label to the user based on a history of the assignment dates and times of the private labels assigned by the label editing unit and on a reference date and time.
- The terminal device according to claim 14, comprising a label presentation unit that presents the private label to the user based on accompanying information assigned when the recognition target data was generated.
- The terminal device according to claim 14, wherein the operation reception unit receives a user operation for sharing the recognition target data with others together with a comment, the terminal device comprising:
a determination unit that determines the recognition target data to be shared based on the user operation received by the operation reception unit;
an analysis unit that analyzes the content of the comment attached to the recognition target data determined by the determination unit; and
a label presentation unit that presents the private label to the user based on the analysis result of the analysis unit.
- The terminal device according to claim 14, configured to be able to communicate with a language server, and comprising:
a list output unit that outputs a list of the assigned private labels to the language server;
a relation acquisition unit that acquires, from the language server, relations between representative labels and the assigned private labels; and
a recommendation unit that recommends to the user, based on the relations acquired by the relation acquisition unit, correcting a private label to the representative label,
wherein the language server comprises:
a list acquisition unit that acquires the list from the terminal device;
an aggregation unit that aggregates the private labels into groups based on the list acquired by the list acquisition unit;
a representative label selection unit that selects the representative label for each group aggregated by the aggregation unit; and
a representative label output unit that outputs, to the terminal device, the relations between the representative labels and the assigned private labels based on the selection results of the representative label selection unit.
- A threshold changing device that changes the threshold in a terminal device that acquires recognition target data, outputs, by a neural network, a recognition score indicating the degree to which the content of the recognition target data matches a predetermined label, and outputs a recognition result indicating whether the content of the recognition target data matches the predetermined label using the recognition score and a threshold preset for the recognition score, the threshold changing device comprising:
an evaluation data acquisition unit that acquires evaluation data including input data and a ground-truth evaluation of the predetermined label associated with the input data, the ground-truth evaluation indicating whether the input data is a positive evaluation whose content matches the predetermined label or a negative evaluation whose content does not match the predetermined label;
a terminal data acquisition unit that acquires the ratio of the positive evaluations to the negative evaluations of the data associated with the terminal device;
a recognition score acquisition unit that acquires the recognition score of the predetermined label for the input data from the neural network or from a neural network having the same weight coefficients as the neural network;
a calculation unit that uses the recognition score of the predetermined label acquired by the recognition score acquisition unit and the threshold to calculate the number of data whose ground-truth evaluation is the positive evaluation that were recognized as positive and the number of data whose ground-truth evaluation is the negative evaluation that were recognized as positive, and that calculates the precision for the predetermined label using the calculated numbers; and
a changing unit that changes the threshold using the precision calculated by the calculation unit,
wherein the calculation unit corrects the number of data whose ground-truth evaluation is the negative evaluation that were recognized as positive, using the ratio of positive to negative evaluations of the evaluation data and the ratio of positive to negative evaluations of the data associated with the terminal device, and calculates the precision using the corrected number.
- The threshold changing device according to claim 19, wherein the calculation unit calculates the recall and the precision for the predetermined label, and
the changing unit changes the threshold to the recognition score at which the harmonic mean of the recall and the precision is maximized.
- The threshold changing device according to claim 19 or 20, wherein the terminal data acquisition unit acquires the ratio of the positive evaluations to the negative evaluations of the data associated with the terminal device based on recognition results of the neural network of the terminal device or on annotation results by a user of the terminal device.
- The threshold changing device according to any one of claims 19 to 21, wherein the terminal data acquisition unit acquires the ratio of the positive evaluations to the negative evaluations of the data associated with the terminal device based on an operation by a user of the terminal device or on terminal information.
- A threshold changing device that changes the threshold in a terminal device that acquires recognition target data, outputs, by a neural network, a recognition score indicating the degree to which the content of the recognition target data matches a predetermined label, and outputs a recognition result indicating whether the content of the recognition target data matches the predetermined label using the recognition score and a threshold preset for the recognition score, the threshold changing device comprising:
a terminal data acquisition unit that acquires the ratio of positive evaluations to negative evaluations of the data associated with the terminal device;
a storage unit that stores a relationship between the ratio and the threshold; and
a changing unit that changes the threshold using the relationship stored in the storage unit and the ratio acquired by the terminal data acquisition unit.
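The "ignore evaluation" handling recited in claims 1 to 3 can be pictured as masking ignored labels out of the error term used for backpropagation. The following NumPy sketch forces the difference between recognition score and ground-truth score to 0 for ignored labels, matching one of the options in claim 2, so those labels never influence the weight-coefficient adjustment; the squared-error loss and the +1/-1/0 encoding are illustrative assumptions.

```python
import numpy as np

def output_error(recognition_scores: np.ndarray,
                 truth_scores: np.ndarray,
                 evaluations: np.ndarray) -> np.ndarray:
    """Per-label error term fed to backpropagation.

    evaluations: +1 for positive, -1 for negative, 0 for "ignore evaluation".
    """
    diff = recognition_scores - truth_scores  # gradient of an assumed squared-error loss
    diff[evaluations == 0] = 0.0              # ignore-evaluated labels drop out (claim 2)
    return diff
```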
Priority Applications (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2017513559A JP6271085B2 (ja) | 2015-10-30 | 2016-10-14 | 学習システム、学習装置、学習方法、学習プログラム、教師データ作成装置、教師データ作成方法、教師データ作成プログラム、端末装置及び閾値変更装置 |
KR1020187015260A KR102114564B1 (ko) | 2015-10-30 | 2016-10-14 | 학습 시스템, 학습 장치, 학습 방법, 학습 프로그램, 교사 데이터 작성 장치, 교사 데이터 작성 방법, 교사 데이터 작성 프로그램, 단말 장치 및 임계치 변경 장치 |
CN201680062416.3A CN108351986B (zh) | 2015-10-30 | 2016-10-14 | 学习系统及装置和方法、训练数据生成装置及生成方法 |
US15/771,735 US11170262B2 (en) | 2015-10-30 | 2016-10-14 | Training system, training device, method for training, training data creation device, training data creation method, terminal device, and threshold value changing device |
EP16859603.9A EP3361423B1 (en) | 2015-10-30 | 2016-10-14 | Learning system, learning device, learning method, learning program, teacher data creation device, teacher data creation method, teacher data creation program, terminal device, and threshold value changing device |
US17/494,100 US20220101059A1 (en) | 2015-10-30 | 2021-10-05 | Learning system, learning device, learning method, learning program, teacher data creation device, teacher data creation method, teacher data creation program, terminal device, and threshold value changing device |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2015215057 | 2015-10-30 | ||
JP2015-215057 | 2015-10-30 | ||
JP2016141558 | 2016-07-19 | ||
JP2016-141558 | 2016-07-19 |
Related Child Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/771,735 A-371-Of-International US11170262B2 (en) | 2015-10-30 | 2016-10-14 | Training system, training device, method for training, training data creation device, training data creation method, terminal device, and threshold value changing device |
US17/494,100 Division US20220101059A1 (en) | 2015-10-30 | 2021-10-05 | Learning system, learning device, learning method, learning program, teacher data creation device, teacher data creation method, teacher data creation program, terminal device, and threshold value changing device |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2017073373A1 true WO2017073373A1 (ja) | 2017-05-04 |
Family
ID=58630045
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2016/080558 WO2017073373A1 (ja) | 2015-10-30 | 2016-10-14 | 学習システム、学習装置、学習方法、学習プログラム、教師データ作成装置、教師データ作成方法、教師データ作成プログラム、端末装置及び閾値変更装置 |
Country Status (6)
Country | Link |
---|---|
US (2) | US11170262B2 (ja) |
EP (1) | EP3361423B1 (ja) |
JP (2) | JP6271085B2 (ja) |
KR (1) | KR102114564B1 (ja) |
CN (1) | CN108351986B (ja) |
WO (1) | WO2017073373A1 (ja) |
Cited By (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107609084A (zh) * | 2017-09-06 | 2018-01-19 | 华中师范大学 | 一种基于群智汇聚收敛的资源关联方法 |
US20190034764A1 (en) * | 2017-07-31 | 2019-01-31 | Samsung Electronics Co., Ltd. | Method and apparatus for generating training data to train student model using teacher model |
JP2019046095A (ja) * | 2017-08-31 | 2019-03-22 | キヤノン株式会社 | 情報処理装置、情報処理装置の制御方法及びプログラム |
JP2019067299A (ja) * | 2017-10-04 | 2019-04-25 | 株式会社豊田中央研究所 | ラベル推定装置及びラベル推定プログラム |
WO2019102892A1 (ja) * | 2017-11-21 | 2019-05-31 | 千代田化工建設株式会社 | 検査支援システム、学習装置、及び判定装置 |
JP2019095898A (ja) * | 2017-11-20 | 2019-06-20 | 株式会社日立製作所 | インスタンス利用促進システム |
EP3502966A1 (en) * | 2017-12-25 | 2019-06-26 | Omron Corporation | Data generation apparatus, data generation method, and data generation program |
CN109978812A (zh) * | 2017-12-24 | 2019-07-05 | 奥林巴斯株式会社 | 摄像系统、学习装置、摄像装置和学习方法 |
JP2019144767A (ja) * | 2018-02-19 | 2019-08-29 | 富士通株式会社 | 学習プログラム、学習方法および学習装置 |
WO2019176806A1 (ja) * | 2018-03-16 | 2019-09-19 | 富士フイルム株式会社 | 機械学習装置および方法 |
JP2019197441A (ja) * | 2018-05-11 | 2019-11-14 | 株式会社 日立産業制御ソリューションズ | 学習装置、学習方法及び学習プログラム |
EP3582142A1 (en) * | 2018-06-15 | 2019-12-18 | Université de Liège | Image classification using neural networks |
CN110610169A (zh) * | 2019-09-20 | 2019-12-24 | 腾讯科技(深圳)有限公司 | 图片标注方法和装置、存储介质及电子装置 |
CN111507371A (zh) * | 2019-01-31 | 2020-08-07 | 斯特拉德视觉公司 | 方法和装置 |
JPWO2021044459A1 (ja) * | 2019-09-02 | 2021-03-11 | ||
JP2021119524A (ja) * | 2018-11-15 | 2021-08-12 | LeapMind株式会社 | ニューラルネットワークモデル、ニューラルネットワーク処理装置、およびニューラルネットワークの演算方法 |
JP2022506866A (ja) * | 2018-11-07 | 2022-01-17 | エレメント・エイ・アイ・インコーポレイテッド | トレーニングセットとして用いる文書からの機密データの除去 |
US11803615B2 (en) | 2019-03-04 | 2023-10-31 | Nec Corporation | Generating 3D training data from 2D images |
JP7427072B2 (ja) | 2021-12-29 | 2024-02-02 | 楽天グループ株式会社 | 情報処理装置、情報処理方法、及び記録媒体 |
Families Citing this family (26)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2017073373A1 (ja) | 2015-10-30 | 2017-05-04 | 株式会社モルフォ | 学習システム、学習装置、学習方法、学習プログラム、教師データ作成装置、教師データ作成方法、教師データ作成プログラム、端末装置及び閾値変更装置 |
CN108268938B (zh) * | 2018-01-24 | 2020-04-21 | 清华大学 | 神经网络及其信息处理方法、信息处理系统 |
US20210209466A1 (en) * | 2018-05-31 | 2021-07-08 | Sony Corporation | Information processing apparatus, information processing method, and program |
CN109348400B (zh) * | 2018-09-16 | 2020-08-04 | 台州昉创科技有限公司 | 一种3d音效的主体位姿预判方法 |
US20220044147A1 (en) * | 2018-10-05 | 2022-02-10 | Nec Corporation | Teaching data extending device, teaching data extending method, and program |
EP3867919A4 (en) * | 2018-10-19 | 2022-08-31 | F. Hoffmann-La Roche AG | DEFECT DETECTION IN LYOPHILIZED MEDICINAL PRODUCTS USING NEURAL CONVOLUTIONAL NETWORKS |
JP7135750B2 (ja) * | 2018-11-12 | 2022-09-13 | 富士通株式会社 | 学習プログラム、学習方法、学習装置、検知プログラム、検知方法及び検知装置 |
US11922314B1 (en) * | 2018-11-30 | 2024-03-05 | Ansys, Inc. | Systems and methods for building dynamic reduced order physical models |
US11087170B2 (en) * | 2018-12-03 | 2021-08-10 | Advanced Micro Devices, Inc. | Deliberate conditional poison training for generative models |
JP6632773B1 (ja) * | 2018-12-14 | 2020-01-22 | 三菱電機株式会社 | 学習識別装置、学習識別方法、及び、学習識別プログラム |
JP6989485B2 (ja) * | 2018-12-21 | 2022-01-05 | 株式会社 日立産業制御ソリューションズ | マルチラベルデータ学習支援装置、マルチラベルデータ学習支援方法およびマルチラベルデータ学習支援プログラム |
KR102189761B1 (ko) * | 2018-12-21 | 2020-12-11 | 주식회사 엘지씨엔에스 | 딥러닝 학습 방법 및 서버 |
US11373298B2 (en) * | 2019-03-28 | 2022-06-28 | Canon Medical Systems Corporation | Apparatus and method for training neural networks using small, heterogeneous cohorts of training data |
JP2020161086A (ja) * | 2019-03-28 | 2020-10-01 | 株式会社デンソーテン | 制御装置および補正方法 |
CN113557536B (zh) * | 2019-04-25 | 2024-05-31 | 欧姆龙株式会社 | 学习系统、数据生成装置、数据生成方法及存储介质 |
US11804070B2 (en) | 2019-05-02 | 2023-10-31 | Samsung Electronics Co., Ltd. | Method and apparatus with liveness detection |
CN110147852A (zh) * | 2019-05-29 | 2019-08-20 | 北京达佳互联信息技术有限公司 | 图像识别的方法、装置、设备及存储介质 |
KR20200144658A (ko) | 2019-06-19 | 2020-12-30 | 삼성전자주식회사 | 분류 장치 및 이의 동작 방법과 트레이닝 방법 |
JP7200851B2 (ja) * | 2019-06-27 | 2023-01-10 | トヨタ自動車株式会社 | 学習装置、リハビリ支援システム、方法、プログラム、及び学習済みモデル |
WO2021044671A1 (ja) * | 2019-09-03 | 2021-03-11 | 富士フイルム株式会社 | 学習装置、学習装置の作動方法、学習装置の作動プログラム |
CN114730309A (zh) * | 2019-11-20 | 2022-07-08 | Oppo广东移动通信有限公司 | 数据清洗设备、数据清洗方法和人脸验证方法 |
KR102131347B1 (ko) * | 2020-01-29 | 2020-07-07 | 주식회사 이글루시큐리티 | 머신 러닝 학습 데이터 생성 방법 및 그 시스템 |
JP7421363B2 (ja) * | 2020-02-14 | 2024-01-24 | 株式会社Screenホールディングス | パラメータ更新装置、分類装置、パラメータ更新プログラム、および、パラメータ更新方法 |
WO2021241173A1 (ja) * | 2020-05-27 | 2021-12-02 | コニカミノルタ株式会社 | 学習装置、学習方法及び学習プログラム、認識装置、認識方法及び認識プログラム並びに学習認識装置 |
CN113188715A (zh) * | 2021-03-17 | 2021-07-30 | 重庆大学 | 基于机器学习的多维力传感器静态校准数据处理方法 |
US12038980B2 (en) * | 2021-08-20 | 2024-07-16 | Optum Services (Ireland) Limited | Machine learning techniques for generating string-based database mapping prediction |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002008000A (ja) * | 2000-06-16 | 2002-01-11 | Nippon Telegr & Teleph Corp <Ntt> | データ分類学習方法、データ分類方法、データ分類学習装置、データ分類装置、データ分類学習プログラムを記録した記録媒体、データ分類プログラムを記録した記録媒体 |
JP2005215988A (ja) * | 2004-01-29 | 2005-08-11 | Canon Inc | パターン認識用学習方法、パターン認識用学習装置、画像入力装置、コンピュータプログラム、及びコンピュータ読み取り可能な記録媒体 |
JP2014238763A (ja) * | 2013-06-10 | 2014-12-18 | ヤフー株式会社 | 分類精度推定装置、分類精度推定方法、およびプログラム |
JP2015170281A (ja) * | 2014-03-10 | 2015-09-28 | 日本電信電話株式会社 | データ解析装置、方法、及びプログラム |
Family Cites Families (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2001109733A (ja) | 1999-10-12 | 2001-04-20 | Hitachi Ltd | 識別モデルの評価方法及び閾値調整方法 |
AU2002228700A1 (en) | 2000-11-02 | 2002-05-15 | Cybersource Corporation | Method and apparatus for evaluating fraud risk in an electronic commerce transaction |
US7925080B2 (en) * | 2006-01-13 | 2011-04-12 | New Jersey Institute Of Technology | Method for identifying marked images based at least in part on frequency domain coefficient differences |
US8352386B2 (en) * | 2009-07-02 | 2013-01-08 | International Business Machines Corporation | Identifying training documents for a content classifier |
US8423568B2 (en) * | 2009-09-16 | 2013-04-16 | Microsoft Corporation | Query classification using implicit labels |
US8774515B2 (en) | 2011-04-20 | 2014-07-08 | Xerox Corporation | Learning structured prediction models for interactive image labeling |
CN102298606B (zh) * | 2011-06-01 | 2013-07-17 | 清华大学 | 基于标签图模型随机游走的图像自动标注方法及装置 |
US9235799B2 (en) * | 2011-11-26 | 2016-01-12 | Microsoft Technology Licensing, Llc | Discriminative pretraining of deep neural networks |
CN108073948A (zh) | 2012-01-17 | 2018-05-25 | 华为技术有限公司 | 一种照片分类管理方法、服务器、装置及系统 |
US9536178B2 (en) * | 2012-06-15 | 2017-01-03 | Vufind, Inc. | System and method for structuring a large scale object recognition engine to maximize recognition accuracy and emulate human visual cortex |
US9122950B2 (en) * | 2013-03-01 | 2015-09-01 | Impac Medical Systems, Inc. | Method and apparatus for learning-enhanced atlas-based auto-segmentation |
US8923608B2 (en) * | 2013-03-04 | 2014-12-30 | Xerox Corporation | Pre-screening training data for classifiers |
JP6164639B2 (ja) | 2013-05-23 | 2017-07-19 | 国立研究開発法人情報通信研究機構 | ディープ・ニューラルネットワークの学習方法、及びコンピュータプログラム |
US9430460B2 (en) * | 2013-07-12 | 2016-08-30 | Microsoft Technology Licensing, Llc | Active featuring in computer-human interactive learning |
US10373047B2 (en) * | 2014-02-28 | 2019-08-06 | Educational Testing Service | Deep convolutional neural networks for automated scoring of constructed responses |
US9552549B1 (en) * | 2014-07-28 | 2017-01-24 | Google Inc. | Ranking approach to train deep neural nets for multilabel image annotation |
US9965704B2 (en) | 2014-10-31 | 2018-05-08 | Paypal, Inc. | Discovering visual concepts from weakly labeled image collections |
US9495619B2 (en) * | 2014-12-30 | 2016-11-15 | Facebook, Inc. | Systems and methods for image object recognition based on location information and object categories |
JP6182279B2 (ja) * | 2015-03-31 | 2017-08-16 | 株式会社Ubic | データ分析システム、データ分析方法、データ分析プログラム、および、記録媒体 |
WO2017073373A1 (ja) | 2015-10-30 | 2017-05-04 | 株式会社モルフォ | 学習システム、学習装置、学習方法、学習プログラム、教師データ作成装置、教師データ作成方法、教師データ作成プログラム、端末装置及び閾値変更装置 |
- 2016-10-14 WO PCT/JP2016/080558 patent/WO2017073373A1/ja active Application Filing
- 2016-10-14 US US15/771,735 patent/US11170262B2/en active Active
- 2016-10-14 CN CN201680062416.3A patent/CN108351986B/zh active Active
- 2016-10-14 JP JP2017513559A patent/JP6271085B2/ja active Active
- 2016-10-14 EP EP16859603.9A patent/EP3361423B1/en active Active
- 2016-10-14 KR KR1020187015260A patent/KR102114564B1/ko active IP Right Grant
- 2017-09-28 JP JP2017188820A patent/JP6453968B2/ja active Active
- 2021-10-05 US US17/494,100 patent/US20220101059A1/en not_active Abandoned
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002008000A (ja) * | 2000-06-16 | 2002-01-11 | Nippon Telegr & Teleph Corp <Ntt> | データ分類学習方法、データ分類方法、データ分類学習装置、データ分類装置、データ分類学習プログラムを記録した記録媒体、データ分類プログラムを記録した記録媒体 |
JP2005215988A (ja) * | 2004-01-29 | 2005-08-11 | Canon Inc | パターン認識用学習方法、パターン認識用学習装置、画像入力装置、コンピュータプログラム、及びコンピュータ読み取り可能な記録媒体 |
JP2014238763A (ja) * | 2013-06-10 | 2014-12-18 | ヤフー株式会社 | 分類精度推定装置、分類精度推定方法、およびプログラム |
JP2015170281A (ja) * | 2014-03-10 | 2015-09-28 | 日本電信電話株式会社 | データ解析装置、方法、及びプログラム |
Non-Patent Citations (1)
Title |
---|
See also references of EP3361423A4 * |
Cited By (41)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20190034764A1 (en) * | 2017-07-31 | 2019-01-31 | Samsung Electronics Co., Ltd. | Method and apparatus for generating training data to train student model using teacher model |
KR20190013011A (ko) * | 2017-07-31 | 2019-02-11 | 삼성전자주식회사 | 교사 모델로부터 학생 모델을 트레이닝하는데 사용되는 학습 데이터를 생성하는 장치 및 방법 |
US12039016B2 (en) | 2017-07-31 | 2024-07-16 | Samsung Electronics Co., Ltd. | Method and apparatus for generating training data to train student model using teacher model |
KR102570278B1 (ko) * | 2017-07-31 | 2023-08-24 | 삼성전자주식회사 | 교사 모델로부터 학생 모델을 트레이닝하는데 사용되는 학습 데이터를 생성하는 장치 및 방법 |
JP7197971B2 (ja) | 2017-08-31 | 2022-12-28 | キヤノン株式会社 | 情報処理装置、情報処理装置の制御方法及びプログラム |
JP2019046095A (ja) * | 2017-08-31 | 2019-03-22 | キヤノン株式会社 | 情報処理装置、情報処理装置の制御方法及びプログラム |
CN107609084B (zh) * | 2017-09-06 | 2021-01-26 | 华中师范大学 | 一种基于群智汇聚收敛的资源关联方法 |
CN107609084A (zh) * | 2017-09-06 | 2018-01-19 | 华中师范大学 | 一种基于群智汇聚收敛的资源关联方法 |
JP2019067299A (ja) * | 2017-10-04 | 2019-04-25 | 株式会社豊田中央研究所 | ラベル推定装置及びラベル推定プログラム |
JP2019095898A (ja) * | 2017-11-20 | 2019-06-20 | 株式会社日立製作所 | インスタンス利用促進システム |
JP7050470B2 (ja) | 2017-11-21 | 2022-04-08 | 千代田化工建設株式会社 | 検査支援システム、学習装置、及び判定装置 |
US11301976B2 (en) | 2017-11-21 | 2022-04-12 | Chiyoda Corporation | Inspection support system, learning device, and determination device |
WO2019102892A1 (ja) * | 2017-11-21 | 2019-05-31 | 千代田化工建設株式会社 | 検査支援システム、学習装置、及び判定装置 |
JP2019095247A (ja) * | 2017-11-21 | 2019-06-20 | 千代田化工建設株式会社 | 検査支援システム、学習装置、及び判定装置 |
CN109978812A (zh) * | 2017-12-24 | 2019-07-05 | 奥林巴斯株式会社 | 摄像系统、学习装置、摄像装置和学习方法 |
JP2019114243A (ja) * | 2017-12-24 | 2019-07-11 | オリンパス株式会社 | 撮像装置および学習方法 |
JP2019114116A (ja) * | 2017-12-25 | 2019-07-11 | オムロン株式会社 | データ生成装置、データ生成方法及びデータ生成プログラム |
US10878283B2 (en) | 2017-12-25 | 2020-12-29 | Omron Corporation | Data generation apparatus, data generation method, and data generation program |
EP3502966A1 (en) * | 2017-12-25 | 2019-06-26 | Omron Corporation | Data generation apparatus, data generation method, and data generation program |
JP2019144767A (ja) * | 2018-02-19 | 2019-08-29 | 富士通株式会社 | 学習プログラム、学習方法および学習装置 |
JP7040104B2 (ja) | 2018-02-19 | 2022-03-23 | 富士通株式会社 | 学習プログラム、学習方法および学習装置 |
US11823375B2 (en) | 2018-03-16 | 2023-11-21 | Fujifilm Corporation | Machine learning device and method |
WO2019176806A1 (ja) * | 2018-03-16 | 2019-09-19 | 富士フイルム株式会社 | 機械学習装置および方法 |
JPWO2019176806A1 (ja) * | 2018-03-16 | 2021-04-08 | 富士フイルム株式会社 | 機械学習装置および方法 |
JP2019197441A (ja) * | 2018-05-11 | 2019-11-14 | 株式会社 日立産業制御ソリューションズ | 学習装置、学習方法及び学習プログラム |
JP7025989B2 (ja) | 2018-05-11 | 2022-02-25 | 株式会社 日立産業制御ソリューションズ | 学習装置、学習方法及び学習プログラム |
EP3582142A1 (en) * | 2018-06-15 | 2019-12-18 | Université de Liège | Image classification using neural networks |
WO2019238976A1 (en) * | 2018-06-15 | 2019-12-19 | Université de Liège | Image classification using neural networks |
JP2022506866A (ja) * | 2018-11-07 | 2022-01-17 | エレメント・エイ・アイ・インコーポレイテッド | トレーニングセットとして用いる文書からの機密データの除去 |
JP7353366B2 (ja) | 2018-11-07 | 2023-09-29 | サービスナウ・カナダ・インコーポレイテッド | トレーニングセットとして用いる文書からの機密データの除去 |
JP2021119524A (ja) * | 2018-11-15 | 2021-08-12 | LeapMind株式会社 | ニューラルネットワークモデル、ニューラルネットワーク処理装置、およびニューラルネットワークの演算方法 |
JP7274180B2 (ja) | 2018-11-15 | 2023-05-16 | LeapMind株式会社 | プログラム、ニューラルネットワーク処理コンピュータ、ニューラルネットワーク処理装置、およびニューラルネットワークの演算方法 |
CN111507371B (zh) * | 2019-01-31 | 2023-12-19 | 斯特拉德视觉公司 | 自动评估对训练图像的标签可靠性的方法和装置 |
CN111507371A (zh) * | 2019-01-31 | 2020-08-07 | 斯特拉德视觉公司 | 方法和装置 |
US11803615B2 (en) | 2019-03-04 | 2023-10-31 | Nec Corporation | Generating 3D training data from 2D images |
JPWO2021044459A1 (ja) * | 2019-09-02 | 2021-03-11 | ||
JP7283548B2 (ja) | 2019-09-02 | 2023-05-30 | 日本電気株式会社 | 学習装置、予測システム、方法およびプログラム |
WO2021044459A1 (ja) * | 2019-09-02 | 2021-03-11 | 日本電気株式会社 | 学習装置、予測システム、方法およびプログラム |
CN110610169A (zh) * | 2019-09-20 | 2019-12-24 | 腾讯科技(深圳)有限公司 | 图片标注方法和装置、存储介质及电子装置 |
CN110610169B (zh) * | 2019-09-20 | 2023-12-15 | 腾讯科技(深圳)有限公司 | 图片标注方法和装置、存储介质及电子装置 |
JP7427072B2 (ja) | 2021-12-29 | 2024-02-02 | 楽天グループ株式会社 | 情報処理装置、情報処理方法、及び記録媒体 |
Also Published As
Publication number | Publication date |
---|---|
EP3361423B1 (en) | 2022-12-14 |
KR20180079391A (ko) | 2018-07-10 |
US11170262B2 (en) | 2021-11-09 |
JP2018018537A (ja) | 2018-02-01 |
EP3361423A1 (en) | 2018-08-15 |
CN108351986A (zh) | 2018-07-31 |
JP6453968B2 (ja) | 2019-01-16 |
CN108351986B (zh) | 2022-03-29 |
JPWO2017073373A1 (ja) | 2017-10-26 |
EP3361423A4 (en) | 2019-06-12 |
US20180307946A1 (en) | 2018-10-25 |
US20220101059A1 (en) | 2022-03-31 |
KR102114564B1 (ko) | 2020-05-22 |
JP6271085B2 (ja) | 2018-01-31 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6453968B2 (ja) | 閾値変更装置 | |
US9965717B2 (en) | Learning image representation by distilling from multi-task networks | |
US11501161B2 (en) | Method to explain factors influencing AI predictions with deep neural networks | |
CN109213864A (zh) | 基于深度学习的刑事案件预判系统及其构建和预判方法 | |
CN105164672A (zh) | 内容分类 | |
CN115858919A (zh) | 基于项目领域知识和用户评论的学习资源推荐方法及系统 | |
CN108304568A (zh) | 一种房地产公众预期大数据处理方法及系统 | |
WO2023164312A1 (en) | An apparatus for classifying candidates to postings and a method for its use | |
CN112069806A (zh) | 简历筛选方法、装置、电子设备及存储介质 | |
US20230018525A1 (en) | Artificial Intelligence (AI) Framework to Identify Object-Relational Mapping Issues in Real-Time | |
US12124352B1 (en) | Apparatus and method generating a path using classified distractions | |
CN117764536B (zh) | 一种基于人工智能的创新创业项目辅助管理系统 | |
CN113011551B (zh) | 一种基于用户情感反馈的机器人服务认知方法及系统 | |
CN116303376B (zh) | 一种基于资产大数据平台的资产管理优化方法及系统 | |
US20240248765A1 (en) | Integrated platform graphical user interface customization | |
US11829735B2 (en) | Artificial intelligence (AI) framework to identify object-relational mapping issues in real-time | |
Cuevas et al. | An improved evolutionary algorithm for reducing the number of function evaluations | |
CN118313446A (zh) | 面向冷启动场景的因果元学习多视角图学习方法及设备 | |
Mouakher et al. | Explainable evaluation framework for facial expression recognition in web-based learning environments | |
Pancini | Enhancing data preparation with adaptive learning | |
Sumithabhashini et al. | Introduction And Fundamental Concepts Of Machine Learning | |
Pancini | Enhancing Data Preparation with Adaptive Learning: A Contextual Bandit Approach for Recommender Systems | |
CN117688241A (zh) | 一种教育资源推荐方法及系统 | |
CN116911801A (zh) | 活动方案生成方法、装置、设备及存储介质 | |
CN115934561A (zh) | 测试结果分类模型的训练方法、装置和计算机设备 |
Legal Events
Code | Title | Description |
---|---|---|
ENP | Entry into the national phase | Ref document number: 2017513559; Country of ref document: JP; Kind code of ref document: A |
121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 16859603; Country of ref document: EP; Kind code of ref document: A1 |
WWE | Wipo information: entry into national phase | Ref document number: 15771735; Country of ref document: US |
NENP | Non-entry into the national phase | Ref country code: DE |
WWE | Wipo information: entry into national phase | Ref document number: 2016859603; Country of ref document: EP |
ENP | Entry into the national phase | Ref document number: 20187015260; Country of ref document: KR; Kind code of ref document: A |