WO2022030134A1 - Information processing device, information processing method, program, and recording medium - Google Patents


Info

Publication number
WO2022030134A1
Authority
WO
WIPO (PCT)
Prior art keywords
data
information
concept
unit
processing unit
Prior art date
Application number
PCT/JP2021/024124
Other languages
English (en)
Japanese (ja)
Inventor
由仁 宮内
Original Assignee
NEC Solution Innovators, Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by NEC Solution Innovators, Ltd.
Priority to JP2022541152A (granted as JP7381143B2)
Publication of WO2022030134A1

Links

Images

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 20/00 - Machine learning

Definitions

  • the present invention relates to an information processing apparatus, an information processing method, a program and a recording medium.
  • Deep learning uses a large amount of training data to make a multi-layer neural network learn features.
  • Patent Documents 1 to 3 disclose a neural network processing apparatus capable of constructing a neural network with a small amount of labor and arithmetic processing by defining a large-scale neural network as a combination of a plurality of subnetworks. Further, Patent Document 4 discloses a structure optimizing device that optimizes a neural network.
  • Patent Document 1: Japanese Unexamined Patent Publication No. 2001-051968; Patent Document 2: Japanese Patent Application Laid-Open No. 2002-251601
  • Patent Document 3: Japanese Patent Application Laid-Open No. 2003-317573; Patent Document 4: Japanese Unexamined Patent Publication No. 09-091263
  • An object of the present invention is to provide an information processing device, an information processing method, a program, and a recording medium capable of processing information with an algorithm closer to human thinking.
  • According to one aspect of the present invention, there is provided an information processing apparatus having an input unit that generates first data representing a situation grasped from received information, a processing unit that generates, based on the first data, second data representing a concept related to the situation, and an output unit that generates and outputs a word expressing the concept based on the second data, wherein the output unit is configured to output the generated word to the input unit as new information when the generated word does not match the solution corresponding to the information.
  • According to another aspect of the present invention, there is provided an information processing method having a first step of generating first data representing a situation grasped from received information, a second step of generating, based on the first data, second data representing a concept related to the situation, and a third step of generating and outputting a word expressing the concept based on the second data, wherein, when the generated word does not match the solution corresponding to the information, the generated word is used as new information and the processing is repeated from the first step.
  • According to yet another aspect of the present invention, there is provided a program that causes a computer to function as: means for generating first data representing a situation grasped from received information; means for generating, based on the first data, second data representing a concept related to the situation; and means for generating a word expressing the concept based on the second data and, when the generated word does not match the solution corresponding to the information, outputting the generated word as new information to the means for generating the first data.
  • FIG. 1 is a schematic diagram showing a configuration example of an information processing apparatus according to the first embodiment of the present invention.
  • FIG. 2 is a schematic diagram showing a configuration example of a data processing module in the information processing apparatus according to the first embodiment of the present invention.
  • FIG. 3 is a diagram illustrating the operation of the data processing module in the information processing apparatus according to the first embodiment of the present invention.
  • FIG. 4 is a flowchart showing an information processing method using the information processing apparatus according to the first embodiment of the present invention.
  • FIG. 5 is a sequence diagram showing a first application example of information processing using the information processing apparatus according to the first embodiment of the present invention.
  • FIG. 6 is a diagram showing preconditions in a second application example of information processing using the information processing apparatus according to the first embodiment of the present invention.
  • FIG. 7 is a sequence diagram showing a second application example of information processing using the information processing apparatus according to the first embodiment of the present invention.
  • FIG. 8 is a schematic diagram (No. 1) showing a hardware configuration example of the information processing apparatus according to the first embodiment of the present invention.
  • FIG. 9 is a schematic diagram (No. 2) showing a hardware configuration example of the information processing apparatus according to the first embodiment of the present invention.
  • FIG. 10 is a diagram illustrating a data processing processor in the information processing apparatus according to the first embodiment of the present invention.
  • FIG. 11A is a diagram (No. 1) illustrating a data processing processor in the information processing apparatus according to the first embodiment of the present invention.
  • FIG. 11B is a diagram (No. 2) illustrating a data processing processor in the information processing apparatus according to the first embodiment of the present invention.
  • FIG. 11C is a diagram (No. 3) illustrating a data processing processor in the information processing apparatus according to the first embodiment of the present invention.
  • FIG. 12 is a schematic diagram showing a configuration example of the information processing apparatus according to the second embodiment of the present invention.
  • FIG. 1 is a schematic diagram showing a configuration example of an information processing apparatus according to the present embodiment.
  • FIG. 2 is a schematic diagram showing a configuration example of a data processing module in the information processing apparatus according to the present embodiment.
  • FIG. 3 is a diagram illustrating the operation of the data processing module in the information processing apparatus according to the present embodiment.
  • the information processing apparatus 1000 may be composed of an input unit 100, a processing unit 200, and an output unit 300.
  • the input unit 100 has a function of generating status information data (first data) based on the received information.
  • the situation information data is data, in a predetermined format, representing the situation grasped from the received information.
  • the format of the situation information data is not particularly limited, but for example, bitmap format data or vector representation format data can be applied.
  • the information received by the input unit 100 includes information received from the outside of the information processing apparatus 1000 and information output by the output unit 300.
  • the information received by the input unit 100 from the outside is not particularly limited, and examples thereof include auditory information (words, sounds, etc.), visual information (images, etc.), and tactile information (sensible temperature, texture, etc.).
  • for example, when voice input or text input is received as information from the outside, the input unit 100 can convert the input data into situation information data by using a known word identification AI system or the like.
  • similarly, when an image is input as information from the outside, the input unit 100 can convert the input data into situation information data by using a known image identification AI system or the like.
  • the information received by the input unit 100 from the output unit 300 is information related to "words" (thoughts) generated by the output unit 300.
  • the processing unit 200 has a function of performing predetermined conceptual processing on the situation information data received from the input unit 100, generating conceptual information data (second data) representing a concept related to the situation represented by the situation information data, and outputting the generated conceptual information data to the output unit 300.
  • the processing unit 200 may be configured to include a recognition processing unit 210 and a concept processing unit 220.
  • the recognition processing unit 210 has a function of converting the situation information data received from the input unit 100 into conceptual information data (third data).
  • the conceptual information data generated by the recognition processing unit 210 is data that conceptualizes the situation represented by the situation information data received from the input unit 100.
  • the conceptual information data may be data in the same format as the situation information data received from the input unit 100, where "the same format" means the same data structure and size. However, the situation information data and the conceptual information data do not necessarily have to be in the same format, and may be in different formats as long as they at least partially overlap.
  • the recognition processing unit 210 performs conceptual processing such as identification and classification on the status information data received from the input unit 100, and generates conceptual information data.
  • the recognition processing unit 210 may include a plurality of types of recognition modules depending on the type of status information data.
  • the recognition processing unit 210 may include a word recognition module that processes status information data received from the word identification AI system, an image recognition module that processes status information data received from the image identification AI system, and the like.
  • the classification performed by the recognition processing unit 210 may include, for example, classification of objects, requirements, environments, and the like shown in the situation information data.
  • the concept processing unit 220 has a function of performing conceptual processing on the conceptual information data received from the recognition processing unit 210 and converting it into conceptual information data (second data) representing another concept.
  • the conceptual processing performed on the conceptual information data is a process of grasping the situation from the received conceptual information data, thinking about the grasped situation, and deriving the optimum solution for the grasped situation.
  • a proof assistant such as the Coq system can be applied to the concept processing unit 220. If the concept processing unit 220 yields no answer (solution), that is, if the processing is incomplete, the output data is fed back to the input unit 100 via the output unit 300 (described later), so that self-questioning and self-answering are repeated and a more appropriate answer can be obtained.
  • the output unit 300 has a function of generating "words" (thoughts) based on the solution (conceptual information data) output from the processing unit 200.
  • the "word” is not particularly limited, but as an example, a sentence including at least a subject and a predicate expressing an idea corresponding to the concept information data output from the concept processing unit 220 can be mentioned.
  • the "word” generated by the output unit 300 can be output as character information or voice information, and can be output to the input unit 100.
  • the method of converting a concept into a "word" is not particularly limited. As a simple method, for example, a sentence can be generated by inserting words extracted from the concept into a fixed-frame (template) sentence.
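As an illustration only (the frame strings, the dictionary-based concept representation, and the function name below are hypothetical and not taken from this publication), the fixed-frame method might be sketched as follows:

```python
# Minimal sketch of the fixed-frame method: words extracted from a
# concept are inserted into template sentences. The frames and the
# concept representation here are illustrative assumptions.
FRAMES = {
    "think": "Consider {topic} of {subject}.",
    "do": "Operate the {topic} of {subject}.",
}

def concept_to_word(concept: dict, frame_key: str) -> str:
    """Insert words extracted from the concept into a fixed frame."""
    return FRAMES[frame_key].format(
        subject=concept["subject"], topic=concept["topic"]
    )

sentence = concept_to_word(
    {"subject": "Mr. A", "topic": "license issuance"}, "think"
)
print(sentence)  # Consider license issuance of Mr. A.
```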
  • the output unit 300 may further include a function of generating an "action" based on the solution (conceptual information data) output from the processing unit 200.
  • the output unit 300 can be configured to output, for example, an instruction regarding the generated "action".
  • this instruction is not particularly limited, but may include, for example, a predetermined instruction for performing a task according to the generated "action", such as execution of an application.
  • the concept information data received by the output unit 300 may be the concept information data output by the recognition processing unit 210 or the concept information data output by the concept processing unit 220.
  • FIG. 2 is a diagram showing a schematic configuration of a data processing module in the information processing apparatus according to the present embodiment.
  • the same data processing module can also be applied to the input unit 100 and the output unit 300.
  • one functional block may include a plurality of data processing modules 400.
  • one functional block can be configured to use different data processing modules 400 depending on the input data.
  • the data processing module 400 may include a data acquisition unit 410, a storage unit 420, an identification unit 430, and a data output unit 440.
  • the data acquisition unit 410 has a function of acquiring status information data and conceptual information data.
  • the situation information data and the conceptual information data may be data in a bitmap format having a predetermined size, as described above.
  • the storage unit 420 stores a learning model including a plurality of models.
  • in each model, a pattern representing specific information or a concept is associated with a value representing other information or a concept corresponding to the pattern.
  • that is, each model may include a pattern that maps the relationship between a plurality of elements representing a particular situation or concept and their element values, and a value that maps the relationship between a plurality of elements representing other information or a concept corresponding to that particular situation or concept and their element values.
  • the identification unit 430 has a function of comparing the pattern of the data acquired by the data acquisition unit 410 with each pattern of the plurality of models stored in the storage unit 420 and selecting the value associated with the model having the best-matching pattern among the plurality of models.
  • the patterns and values possessed by each of the plurality of models may be data in the same format as the situation information data and the conceptual information data, or data in different formats, as in the relationship between the situation information data and the conceptual information data.
  • the value associated with a pattern does not necessarily have to be bitmap format data, and may be data including predetermined index information. In this case, the data corresponding to the index information can be configured to be acquired from the outside using that index information.
  • the identification unit 430 may have a learning function for the learning model. For example, when the learning model contains no model whose degree of conformity is equal to or higher than a predetermined threshold, a model corresponding to the data acquired by the data acquisition unit 410 can be added to the learning model, thereby updating the learning model.
  • the technique described in Japanese Patent Application No. 2019-144121 by the same applicant can be applied.
  • the data output unit 440 has a function of outputting the value (data in bitmap format) selected by the identification unit 430 as output data.
  • the data output unit 440 typically outputs the selected value to the next-stage processing unit, but may be configured to return the selected value to the data acquisition unit 410 so that recursive processing is performed within the data processing module 400.
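The acquire, identify, and output flow of the module, together with the threshold-based learning update described above, can be sketched as follows. The class name, the representation of bitmaps as flat 0/1 tuples, the scoring rule, and the threshold value are assumptions made for illustration, not details of the publication:

```python
class DataProcessingModule:
    """Sketch of the acquire -> identify -> output flow described above.

    Models are (pattern, value) pairs of equal-length 0/1 tuples; the
    match score is the normalized inner product, and a new model is
    added when no stored model reaches the threshold (learning update).
    """

    def __init__(self, threshold=0.5):
        self.models = []          # storage unit 420: list of (pattern, value)
        self.threshold = threshold

    @staticmethod
    def score(data, pattern):
        # inner product normalized by the number of 1-cells in the input
        ones = sum(data)
        if ones == 0:
            return 0.0
        return sum(d * p for d, p in zip(data, pattern)) / ones

    def identify(self, data):
        # identification unit 430: pick the best-matching model's value
        best = max(self.models, key=lambda m: self.score(data, m[0]),
                   default=None)
        if best is None or self.score(data, best[0]) < self.threshold:
            # learning function: add a model for unseen data
            self.models.append((data, data))
            return data
        return best[1]           # data output unit 440 emits this value
```

In this sketch an unseen pattern is stored with itself as its value; how a newly added model obtains its associated value is not specified here and would follow the applicant's earlier technique referenced above.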
  • FIG. 3 is a schematic diagram showing the operation of the data processing module 400 in the information processing apparatus according to the present embodiment.
  • the data acquisition unit 410 acquires status information data from the input unit 100. Alternatively, the data acquisition unit 410 acquires conceptual information data from the processing unit 200. The data acquisition unit 410 outputs the acquired data (input data) to the identification unit 430.
  • the identification unit 430 compares the pattern of the data received from the data acquisition unit 410 with each pattern of the plurality of models of the learning model stored in the storage unit 420. Then, the identification unit 430 extracts, from the plurality of models included in the learning model, the model having the pattern with the highest degree of conformity to the pattern of the received data. A model having a pattern with a high degree of conformity to the pattern of the received data can be said to be a model whose information distance to the received data is small.
  • the method of determining the goodness of fit between the input data (situation information data or conceptual information data) and the learning model is not particularly limited; one example is a method using the inner product of the input data pattern and the learning model pattern.
  • in the following, it is assumed that each of the situation information data, the conceptual information data, and the patterns and values of the learning model is a bitmap pattern consisting of nine cells arranged in a 3 × 3 matrix.
  • the value of each cell is 0 or 1.
  • the inner product of the input data pattern and the learning model pattern is calculated by multiplying the values of cells having the same coordinates and summing the products over all coordinates.
  • assume that the values of the cells constituting the input data pattern are A, B, C, D, E, F, G, H, I, and that the values of the cells constituting the pattern of the learning model to be compared are 1, 0, 0, 0, 1, 0, 0, 0, 1.
  • in this case, the inner product of the input data pattern and the learning model pattern is A × 1 + B × 0 + C × 0 + D × 0 + E × 1 + F × 0 + G × 0 + H × 0 + I × 1 = A + E + I.
  • the inner product value calculated in this way is normalized by dividing it by the number of cells having a value of 1 among the cells included in the input data.
  • the calculation and normalization of the inner product value for the input data are performed for each of the plurality of models included in the learning model.
  • the model with the maximum normalized inner product value is then extracted from the plurality of models of the learning model.
  • the normalized inner product value represents a likelihood: the larger the value, the higher the goodness of fit to the input data. Therefore, by extracting the model with the maximum normalized inner product value, the model with the highest goodness of fit to the input data can be extracted.
  • here, it is assumed that the model having the pattern whose cell values are 1, 0, 0, 0, 1, 0, 0, 0, 1 is extracted as the model with the highest goodness of fit to the input data.
  • the identification unit 430 outputs the value associated with the extracted model to the data output unit 440 as another conceptual information data converted from the input data.
  • the extracted model has a pattern whose cell values are 1, 0, 0, 0, 1, 0, 0, 0, 1 and an associated value whose cell values are 1, 1, 0, 0, 1, 0, 0, 1, 1.
  • therefore, the value whose cell values are 1, 1, 0, 0, 1, 0, 0, 1, 1 becomes the output data output to the data output unit 440.
  • the data output unit 440 outputs the data acquired from the identification unit 430 to the next-stage processing unit.
  • in this way, input data whose cell values are A, B, C, D, E, F, G, H, I can be converted into output data whose cell values are 1, 1, 0, 0, 1, 0, 0, 1, 1.
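The worked 3 × 3 example above can be reproduced directly in code. The concrete input chosen below (an input equal to the diagonal pattern, giving a perfect fit) is illustrative:

```python
# Reproduce the 3x3 worked example: normalized inner product matching.
def normalized_inner_product(data, pattern):
    """Inner product of two 0/1 bitmaps, divided by the number of
    1-valued cells in the input data (the likelihood described above)."""
    ones = sum(data)
    return sum(d * p for d, p in zip(data, pattern)) / ones

# Learning model: the (pattern, value) pair from the example.
model_pattern = (1, 0, 0, 0, 1, 0, 0, 0, 1)   # diagonal pattern
model_value   = (1, 1, 0, 0, 1, 0, 0, 1, 1)   # associated value

# Input A..I; here the input happens to equal the diagonal pattern.
input_data = (1, 0, 0, 0, 1, 0, 0, 0, 1)

score = normalized_inner_product(input_data, model_pattern)
print(score)        # 1.0 -> perfect fit, so this model is selected
print(model_value)  # the value output as the converted conceptual data
```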
  • by configuring the recognition processing unit 210 using such a data processing module 400, the situation information data received from the input unit 100 can be converted into conceptual information data. Further, by configuring the concept processing unit 220 using such a data processing module 400, the conceptual information data received from the recognition processing unit 210 can be converted into conceptual information data of another concept.
  • FIG. 4 is a flowchart showing an information processing method according to the present embodiment.
  • the input unit 100 acquires information from the outside (step S101).
  • the information acquired by the input unit 100 is not particularly limited, and examples thereof include auditory information (words, sounds, etc.), visual information (images, etc.), and tactile information (sensible temperature, texture, etc.).
  • the input unit 100 converts the information received from the outside into data in a predetermined format and generates status information data (step S102). For example, when voice input or text input is made to the input unit 100 as information from the outside, the input unit 100 converts the input data into the situation information data by using a known word identification AI system or the like. Alternatively, when an image is input to the input unit 100 as information from the outside, the input unit 100 converts the input data into situation information data using a known image identification AI system or the like.
  • the input unit 100 outputs the generated status information data to the recognition processing unit 210 of the processing unit 200.
  • the recognition processing unit 210 uses, for example, the above-mentioned data processing module 400 to convert the situation information data received from the input unit into the concept information data and output it to the concept processing unit 220 (step S103).
  • the concept processing unit 220 uses, for example, the above-mentioned data processing module 400 to convert the concept information data received from the recognition processing unit 210 into concept information data representing a concept different from the received concept information data (step S104).
  • the concept processing unit 220 outputs the generated concept information data to the output unit 300.
  • the output unit 300 generates "words” and "actions” based on the conceptual information data received from the processing unit 200 (step S105).
  • the output unit 300 determines whether or not the generated "word” is sufficient as an answer (solution) (step S106).
  • in step S106, if the generated "word" is insufficient (incomplete) as a solution ("No" in step S106), the process returns to step S102, the generated "word" is fed back to the input unit 100, and the processing of steps S102 to S106 is repeated. That is, self-questioning and self-answering are repeated using the generated "words".
  • the "word” generated by the output unit 300 represents, so to speak, a result based on past experience. Output unit 300 By processing the generated "words" again, it is possible to make a judgment based on past experience.
  • the output unit 300 outputs the generated "word" and/or "action" and ends the series of processing (step S107).
  • the output of "words” is not particularly limited, but for example, the generated “words” can be output as character information or voice information.
  • the output of the "behavior” is not particularly limited, but for example, a predetermined instruction for executing the task corresponding to the generated "behavior” can be output.
  • all the processing in the information processing apparatus according to the present embodiment is data-to-data conversion, that is, data-driven processing.
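The data-driven repetition of steps S102 to S106 can be sketched as a simple loop. Every function argument, the stop condition, and the round limit below are placeholders standing in for the units described above, not the publication's implementation:

```python
def process(information, input_unit, recognize, conceptualize,
            generate_word, is_sufficient, max_rounds=10):
    """Self-questioning / self-answering loop of steps S102 to S106.

    Each argument is a placeholder for the corresponding unit:
    input_unit produces situation data, recognize and conceptualize
    produce conceptual data, generate_word produces the "word", and
    is_sufficient is the step S106 check.
    """
    word = None
    for _ in range(max_rounds):          # guard against endless loops
        situation = input_unit(information)      # step S102
        concept = recognize(situation)           # step S103
        concept = conceptualize(concept)         # step S104
        word = generate_word(concept)            # step S105
        if is_sufficient(word):                  # step S106: "Yes"
            return word                          # step S107: output
        information = word   # "No": feed the word back as new information
    return word
```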
  • next, application examples of information processing using the information processing apparatus 1000 according to the present embodiment will be described with reference to FIGS. 5 to 7.
  • two application examples will be given to more specifically explain the information processing in the present embodiment.
  • the first application example is an application to a license issuing business process whose purpose is to acquire a license number in response to a license request from a user and report the acquired license number to the user. It is assumed that the information processing apparatus 1000 has the knowledge of acquiring a license number by operating a numbering ledger.
  • FIG. 5 is a sequence diagram showing an information processing method in the first application example.
  • the input unit 100 converts the input information into status information data and outputs it to the recognition processing unit 210.
  • the recognition processing unit 210 converts the status information data received from the input unit 100 into conceptual information data.
  • the input information is converted into conceptual information data including the concept (concept A) indicating the situation of "Mr. A”, “license”, and "issue”.
  • the recognition processing unit 210 outputs the converted conceptual information data to the output unit 300.
  • the output unit 300 generates "words” based on the conceptual information data received from the recognition processing unit 210. For example, for the concept A indicating the situation of "Mr. A”, “license”, and “issuance”, a word such as “think” is added to generate a "word”. As a result, for example, the "word” “Consider issuing Mr. A's license” is generated. The output unit 300 outputs the generated “word” to the input unit 100.
  • the input unit 100 converts the input "word” into status information data and outputs it to the recognition processing unit 210.
  • the recognition processing unit 210 converts the situation information data received from the input unit 100 into conceptual information data.
  • the data is converted into conceptual information data including the concept (concept B) indicating the problems of "Mr. A”, “license”, and “issuance” according to the word "thinking”.
  • the recognition processing unit 210 outputs the converted concept information data to the concept processing unit 220.
  • the concept processing unit 220 converts the input concept information data into another concept information data.
  • the data is converted into conceptual information data including a concept (concept C) indicating an action recalled from the problems of "Mr. A”, “license”, and “issuance”.
  • the method of "numbering ledger operation” is recalled from the information of "license” and "issuance”
  • the input conceptual information data is used as the concept of "Mr. A”, “license”, and “numbering ledger operation”. Convert to include conceptual information data.
  • the concept processing unit 220 outputs the converted concept information data to the output unit 300.
  • the output unit 300 generates "words” based on the concept information data received from the concept processing unit 220. For example, a word such as “do” is added to the concept C indicating the behavior of "Mr. A", “license”, and “numbering ledger operation” to generate “words”. As a result, for example, the "word” of "operate the license numbering ledger of Mr. A” is generated.
  • the output unit 300 outputs the generated “word” to the input unit 100.
  • the input unit 100 converts the input "word” into status information data and outputs it to the recognition processing unit 210.
  • the recognition processing unit 210 converts the situation information data received from the input unit 100 into conceptual information data.
  • the data is converted into conceptual information data including the concept (concept D) indicating the work for "Mr. A”, “license”, and “numbering ledger operation” according to the word "do".
  • the recognition processing unit 210 outputs the converted conceptual information data to the output unit 300.
  • the output unit 300 generates an "action" based on the conceptual information data received from the recognition processing unit 210.
  • for example, the output unit 300 executes, as the "action", a schedule application for issuing a license number, based on the information "license" and "numbering ledger operation" indicated in the conceptual information data.
  • the schedule application outputs the license number obtained by execution to the input unit 100.
  • the input unit 100 converts the license number received from the schedule application as it is or into status information data, and outputs it to the recognition processing unit 210.
  • the recognition processing unit 210 identifies that the received information is the result of the execution of the schedule application, and outputs the received information to the output unit 300.
  • the output unit 300 generates "words” based on the information received from the recognition processing unit 210. For example, a word such as “desu” is added to the license number "123-456-7890" output by the schedule application to generate a word. As a result, for example, the "word” "123-456-7890.” Is generated.
  • the output unit 300 reports the generated "words" to the user by displaying them on a display device, outputting them by voice, or the like, and ends a series of processes.
  • the second application example is an application to a schedule adjustment process whose purpose is to compare the schedules of two users (Mr. A and Mr. B) over the five days from Monday to Friday, find free time common to both users, and report it. Here, it is assumed that the schedules of Mr. A and Mr. B from Monday to Friday are as shown in FIG. 6. In the table of FIG. 6, the "morning" and "afternoon" time zones are working hours, and the "lunch break" is outside working hours. Further, "0" indicates a free time zone without a reservation, and "1", "2", and "3" indicate time zones with a reservation.
  • the values of "1", “2”, and “3” indicate the priority of the reservation, "1" is the reservation that requires attendance, “2" is the reservation that attendance is preferable, and “3” is the reservation that can be absent.
  • the information processing apparatus 1000 is provided with knowledge of adjusting the schedule during working hours, but is not provided with knowledge of "lunch break” and "priority" in the initial state.
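The initial knowledge (adjust within working hours, value 0 means free) can be sketched as a simple search. The schedule contents of FIG. 6 are not reproduced in this text, so the dictionaries below are made-up placeholders; only the 0 = free convention and the working-hours restriction come from the description above:

```python
# Sketch of the schedule adjustment: find working-hour slots where both
# users' values are 0 (free). Schedule contents are placeholders; slots
# missing from a dictionary are treated as free (value 0) here.
DAYS = ["Mon", "Tue", "Wed", "Thu", "Fri"]
SLOTS = ["morning", "lunch break", "afternoon"]
WORKING = {"morning", "afternoon"}   # lunch break is outside working hours

mr_a = {("Mon", "morning"): 1, ("Mon", "afternoon"): 0, ("Tue", "morning"): 0}
mr_b = {("Mon", "morning"): 0, ("Mon", "afternoon"): 0, ("Tue", "morning"): 2}

def free_slots(a, b):
    """Return working-hour slots with no reservation for either user."""
    return [(d, s) for d in DAYS for s in SLOTS
            if s in WORKING
            and a.get((d, s), 0) == 0 and b.get((d, s), 0) == 0]

print(free_slots(mr_a, mr_b))
```

Note that this initial sketch ignores "lunch break" and the priority values entirely, mirroring the stated initial state of the apparatus; that missing knowledge is exactly what the second application example exercises.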
  • FIG. 7 is a sequence diagram showing an information processing method in the second application example.
  • the input unit 100 converts the input information into status information data and outputs it to the recognition processing unit 210.
  • the recognition processing unit 210 converts the status information data received from the input unit 100 into conceptual information data.
  • the input information is converted into conceptual information data including the concept (concept A) indicating the situation of "Mr. A”, “Mr. B", and "schedule adjustment".
  • the recognition processing unit 210 outputs the converted conceptual information data to the output unit 300.
  • the output unit 300 generates "words” based on the conceptual information data received from the recognition processing unit 210. For example, a word such as “think” is added to the concept A indicating the situation of "Mr. A”, “Mr. B", and “schedule adjustment” to generate “words”. As a result, for example, the "word” “Consider adjusting the schedule of Mr. A and Mr. B.” is generated. The output unit 300 outputs the generated "word” to the input unit 100.
  • the input unit 100 converts the input "word” into status information data and outputs it to the recognition processing unit 210.
  • the recognition processing unit 210 converts the situation information data received from the input unit 100 into conceptual information data.
  • the data is converted into conceptual information data including the concept (concept B) indicating the tasks of "Mr. A”, “Mr. B", and “schedule adjustment” according to the word "thinking”.
  • the recognition processing unit 210 outputs the converted concept information data to the concept processing unit 220.
  • the concept processing unit 220 converts the input concept information data into another concept information data.
• the data is converted into conceptual information data including a concept (concept C) indicating an action recalled from the tasks of "Mr. A", "Mr. B", and "schedule adjustment".
• here, the methods of "working" and "searching with the schedule application" are recalled from the information "schedule adjustment", and the input conceptual information data is converted into conceptual information data including a concept indicating the actions "Mr. A", "Mr. B", "working", and "search with the schedule application".
  • the concept processing unit 220 outputs the converted concept information data to the output unit 300.
  • the output unit 300 generates "words” based on the concept information data received from the concept processing unit 220. For example, for the concept C indicating the behaviors of "Mr. A”, “Mr. B", “working”, and “searching with the schedule application”, words such as “do” are added to generate “words”. As a result, for example, the “word” “Search with Mr. A and Mr. B's working schedule application” is generated.
  • the output unit 300 outputs the generated “word” to the input unit 100.
  • the input unit 100 converts the input "word” into status information data and outputs it to the recognition processing unit 210.
• the recognition processing unit 210 converts the situation information data received from the input unit 100 into conceptual information data. Here, it is assumed that it is converted into conceptual information data including the concept (concept D) indicating the work of "Mr. A", "Mr. B", "working", and "searching with the schedule application" according to the word "do".
  • the recognition processing unit 210 outputs the converted conceptual information data to the output unit 300.
  • the output unit 300 generates an "action" based on the conceptual information data received from the recognition processing unit 210.
  • an instruction to execute a schedule application for adjusting the schedule during work of Mr. A and Mr. B is output.
  • the schedule application outputs the result obtained by execution to the input unit 100.
  • the input unit 100 converts the result received from the schedule application as it is or into status information data, and outputs it to the recognition processing unit 210.
  • the recognition processing unit 210 identifies that the received information is the result of the execution of the schedule application, and outputs the received information to the output unit 300.
  • the output unit 300 generates "words” based on the information received from the recognition processing unit 210. For example, as a result of output by the schedule application, a word such as “is” is added to “no space” to generate a word. As a result, for example, the "word” "There is no space” is generated.
  • the output unit 300 reports the generated "words" to the user by displaying them on a display device, outputting them by voice, or the like, and ends a series of processes.
  • the input unit 100 converts the input information into status information data and outputs it to the recognition processing unit 210.
  • the recognition processing unit 210 converts the status information data received from the input unit 100 into conceptual information data.
  • the input information is converted into conceptual information data including the concept (concept A) indicating the situation of "Mr. A”, “Mr. B”, “lunch break”, and "schedule adjustment”.
  • the recognition processing unit 210 outputs the converted conceptual information data to the output unit 300.
  • the output unit 300 generates "words” based on the conceptual information data received from the recognition processing unit 210. For example, for the concept A indicating the situation of "Mr. A”, “Mr. B", “lunch break”, and “schedule adjustment”, words such as "think” are added to generate “words”. As a result, for example, the "word” “Mr. A and Mr. B consider adjusting the lunch break schedule” is generated.
  • the output unit 300 outputs the generated "word” to the input unit 100.
  • the input unit 100 converts the input "word” into status information data and outputs it to the recognition processing unit 210.
  • the recognition processing unit 210 converts the situation information data received from the input unit 100 into conceptual information data.
  • the data is converted into conceptual information data including the concept (concept B) indicating the tasks of "Mr. A”, “Mr. B", “lunch break”, and "schedule adjustment” according to the word "thinking”.
  • the recognition processing unit 210 outputs the converted concept information data to the concept processing unit 220.
  • the concept processing unit 220 converts the input concept information data into another concept information data.
  • the data is converted into conceptual information data including a concept (concept C) indicating an action recalled from the tasks of "Mr. A”, “Mr. B”, “lunch break”, and “schedule adjustment”.
• although the information processing apparatus 1000 does not have "lunch break" as knowledge, the methods of "lunch break" and "search with the schedule application" are recalled from the information "schedule adjustment".
  • the input conceptual information data is converted into conceptual information data including concepts indicating actions such as “Mr. A”, “Mr. B", “lunch break", and “search with the schedule application”.
  • the concept processing unit 220 outputs the converted concept information data to the output unit 300.
  • the output unit 300 generates "words” based on the concept information data received from the concept processing unit 220. For example, to the concept C indicating the actions of "Mr. A”, “Mr. B", “lunch break”, and “search with the schedule application”, words such as “do” are added to generate “words”. As a result, for example, the "word” “Search with Mr. A and Mr. B's lunch break schedule application” is generated.
  • the output unit 300 outputs the generated "word” to the input unit 100.
  • the input unit 100 converts the input "word” into status information data and outputs it to the recognition processing unit 210.
• the recognition processing unit 210 converts the situation information data received from the input unit 100 into conceptual information data. Here, it is assumed that it is converted into conceptual information data including the concept (concept D) indicating the work of "Mr. A", "Mr. B", "lunch break", and "search with the schedule application" according to the word "do".
  • the recognition processing unit 210 outputs the converted conceptual information data to the output unit 300.
  • the output unit 300 generates an "action" based on the conceptual information data received from the recognition processing unit 210.
  • an instruction to execute a schedule application for adjusting the schedule of lunch breaks of Mr. A and Mr. B is output.
  • the schedule application outputs the result obtained by execution to the input unit 100.
  • the input unit 100 converts the result received from the schedule application as it is or into status information data, and outputs it to the recognition processing unit 210.
  • the recognition processing unit 210 identifies that the received information is the result of execution of the schedule application, and outputs the received information to the output unit 300.
• since free time slots common to the lunch breaks of Mr. A and Mr. B exist on Tuesday, Thursday, and Friday, the result "Tuesday, Thursday, Friday lunch break" is output.
  • the input unit 100 converts the result received from the schedule application as it is or into status information data, and outputs it to the recognition processing unit 210.
  • the recognition processing unit 210 identifies that the received information is the result of the execution of the schedule application, and outputs the received information to the output unit 300.
  • the output unit 300 generates "words” based on the information received from the recognition processing unit 210. For example, as a result of the schedule application output, a word such as “desu” is added to "Tue, Thu, Fri lunch break” to generate a word. This produces, for example, the "word” that says, “Tuesday, Thursday, and Friday lunch break.”
  • the output unit 300 reports the generated "words" to the user by displaying them on a display device or outputting them by voice.
  • the output unit 300 generates "words” based on the concept information data received from the recognition processing unit 210 in parallel with the generation of the "action" for the concept D. For example, to the concept D indicating the work of "Mr. A”, “Mr. B", “lunch break", and “search with the schedule application”, words such as "done” are added to generate “words”. As a result, for example, a “word” such as “Mr. A and Mr. B searched with the lunch break schedule application” is generated. The output unit 300 outputs the generated "word” to the input unit 100.
  • the input unit 100 converts the input "word” into status information data and outputs it to the recognition processing unit 210.
  • the recognition processing unit 210 converts the situation information data received from the input unit 100 into conceptual information data.
• the concept processing unit 220 trains the learning model stored in the storage unit 420 of the data processing module 400 so that "lunch break" is memorized as knowledge. With this, the series of processes is completed.
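The memorization step above can be sketched as a pattern-value pair being added to the model store, so that the next time the same situation arises the action is recalled directly rather than guessed from "schedule adjustment". The store layout, concept element sets, and function names below are illustrative assumptions, not the patent's actual data format.

```python
# Hypothetical in-memory stand-in for the learning model in storage unit 420:
# each model maps a situation pattern (a set of concept elements) to a value
# (the action concept recalled for that situation).
model_store = {
    frozenset({"schedule adjustment"}): ("working", "search with schedule app"),
}

def memorize(store, situation_elements, action):
    """Store a new pattern-value pair after a successful outcome."""
    store[frozenset(situation_elements)] = action

def recall(store, situation_elements):
    """Pick the stored pattern with the highest goodness of fit (overlap)."""
    query = set(situation_elements)
    best = max(store, key=lambda pattern: len(pattern & query))
    return store[best]

# Before learning, "lunch break, schedule adjustment" only matches the
# generic "schedule adjustment" model.
before = recall(model_store, {"lunch break", "schedule adjustment"})

# After the successful search, "lunch break" is memorized as knowledge
# and the more specific model wins the goodness-of-fit comparison.
memorize(model_store, {"lunch break", "schedule adjustment"},
         ("lunch break", "search with schedule app"))
after = recall(model_store, {"lunch break", "schedule adjustment"})
```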
  • the input unit 100 converts the input information into status information data and outputs it to the recognition processing unit 210.
  • the recognition processing unit 210 converts the status information data received from the input unit 100 into conceptual information data.
  • the input information is converted into conceptual information data including the concept (concept A) indicating the situation of "Mr. A”, “Mr. B", “Priority 3", and "Schedule adjustment”.
  • the recognition processing unit 210 outputs the converted conceptual information data to the output unit 300.
  • the output unit 300 generates "words” based on the conceptual information data received from the recognition processing unit 210. For example, for the concept A indicating the situation of "Mr. A”, “Mr. B", “Priority 3", and “Schedule adjustment”, words such as “think” are added to generate “words”. As a result, for example, the "word” “Mr. A and Mr. B consider the priority 3 schedule adjustment” is generated.
  • the output unit 300 outputs the generated "word” to the input unit 100.
  • the input unit 100 converts the input "word” into status information data and outputs it to the recognition processing unit 210.
• the recognition processing unit 210 converts the situation information data received from the input unit 100 into conceptual information data. Here, it is converted into conceptual information data including the concept (concept B) indicating the tasks of "Mr. A", "Mr. B", "Priority 3", and "Schedule adjustment" according to the word "thinking".
  • the recognition processing unit 210 outputs the converted concept information data to the concept processing unit 220.
  • the concept processing unit 220 converts the input concept information data into another concept information data.
  • the data is converted into conceptual information data including a concept (concept C) indicating an action recalled from the tasks of "Mr. A”, “Mr. B”, “Priority 3", and “Schedule adjustment”.
• although the information processing apparatus 1000 does not have "priority 3" as knowledge, the methods of "priority 3" and "search with the schedule application" are recalled from the information "schedule adjustment".
  • the input conceptual information data is converted into conceptual information data including concepts indicating actions such as “Mr. A”, “Mr. B", “Priority 3", and “Search by schedule application”.
  • the concept processing unit 220 outputs the converted concept information data to the output unit 300.
  • the output unit 300 generates "words” based on the concept information data received from the concept processing unit 220. For example, for the concept C indicating the actions of "Mr. A”, “Mr. B", “Priority 3", and “Search by schedule application”, words such as “do” are added to generate “words”. As a result, for example, the "word” “Search with Mr. A and Mr. B priority 3 schedule application” is generated.
  • the output unit 300 outputs the generated "word” to the input unit 100.
  • the input unit 100 converts the input "word” into status information data and outputs it to the recognition processing unit 210.
• the recognition processing unit 210 converts the situation information data received from the input unit 100 into conceptual information data. Here, it is assumed that it is converted into conceptual information data including the concept (concept D) indicating the work of "Mr. A", "Mr. B", "Priority 3", and "Search with the schedule application" according to the word "do".
  • the recognition processing unit 210 outputs the converted conceptual information data to the output unit 300.
  • the output unit 300 generates an "action" based on the conceptual information data received from the recognition processing unit 210.
  • an instruction to execute a schedule application for adjusting the schedule of priority 3 of Mr. A and Mr. B is output.
  • the schedule application outputs the result obtained by execution to the input unit 100.
  • the input unit 100 converts the result received from the schedule application as it is or into status information data, and outputs it to the recognition processing unit 210.
  • the recognition processing unit 210 identifies that the received information is the result of the execution of the schedule application, and outputs the received information to the output unit 300.
• since a time slot that is free or has priority 3 in common between the schedules of Mr. A and Mr. B exists on Monday afternoon and Friday morning, it is assumed that the result "Monday afternoon, Friday morning priority 3" is output.
  • the input unit 100 converts the result received from the schedule application as it is or into status information data, and outputs it to the recognition processing unit 210.
  • the recognition processing unit 210 identifies that the received information is the result of the execution of the schedule application, and outputs the received information to the output unit 300.
  • the output unit 300 generates "words” based on the information received from the recognition processing unit 210. For example, as a result output by the schedule application, a word such as “is” is added to "priority 3 on Monday afternoon and Friday morning” to generate a word. As a result, for example, the "word” “Monday afternoon, Friday morning has priority 3" is generated.
  • the output unit 300 reports the generated "words" to the user by displaying them on a display device or outputting them by voice.
• the output unit 300 generates "words" based on the concept information data received from the recognition processing unit 210 in parallel with the generation of the "action" for the concept D. For example, to the concept D indicating the work of "Mr. A", "Mr. B", "Priority 3", and "Search with the schedule application", words such as "done" are added to generate "words". As a result, for example, a "word" such as "Mr. A and Mr. B searched with the priority 3 schedule application" is generated. The output unit 300 outputs the generated "word" to the input unit 100.
  • the input unit 100 converts the input "word” into status information data and outputs it to the recognition processing unit 210.
  • the recognition processing unit 210 converts the situation information data received from the input unit 100 into conceptual information data.
• here, the result of the "action" is referred to according to the word "done", and "priority 3" is obtained from that result even though "priority 3" was not provided as knowledge.
• the concept processing unit 220 trains the learning model stored in the storage unit 420 of the data processing module 400 so that "priority 3" is memorized as knowledge. With this, the series of processes is completed.
  • FIGS. 8 to 11C are schematic views showing a hardware configuration example of the information processing apparatus according to the present embodiment.
  • FIG. 10 is a diagram illustrating a data processing processor in the information processing apparatus according to the present embodiment.
• FIGS. 11A, 11B, and 11C are diagrams showing pattern examples of input data and a learning model.
  • the information processing device 1000 can be realized by a hardware configuration similar to that of a general information processing device, as shown in FIG. 8, for example.
  • the information processing apparatus 1000 may include a CPU (Central Processing Unit) 500, a main storage unit 502, a communication unit 504, and an input / output interface unit 506.
• the CPU 500 is a control / arithmetic unit that performs overall control and arithmetic processing of the information processing apparatus 1000.
  • the main storage unit 502 is a storage unit used for a data work area or a data temporary save area, and may be configured by a memory such as a RAM (Random Access Memory).
  • the communication unit 504 is an interface for transmitting and receiving data via a network.
  • the input / output interface unit 506 is an interface for connecting to an external output device 510, an input device 512, a storage device 514, and the like to transmit / receive data.
  • the CPU 500, the main storage unit 502, the communication unit 504, and the input / output interface unit 506 are connected to each other by the system bus 508.
• the storage device 514 may be configured by, for example, a non-volatile memory such as a ROM (Read Only Memory), a hard disk device, a magnetic disk, or a semiconductor memory.
  • the main storage unit 502 can be used as a work area for executing an operation in the data processing module 400 or the like.
  • the CPU 500 can function as a control unit that controls arithmetic processing in the main storage unit 502.
  • the storage device 514 can be used as a storage unit 420 and can store a trained learning model.
  • the communication unit 504 is a communication interface based on standards such as Ethernet (registered trademark) and Wi-Fi (registered trademark), and is a module for communicating with other devices.
• the learning model stored in the storage device 514 may be received from another device via the communication unit 504. For example, a frequently used learning model can be stored in the storage device 514, and infrequently used learning cell information can be read from another device.
  • the output device 510 may include a display such as a liquid crystal display device.
  • the output device 510 can be used as a display device for presenting the processing result to the user.
  • the input device 512 is a keyboard, a mouse, a touch panel, or the like, and can be used for a user to input a predetermined instruction to the information processing device 1000.
• each part of the information processing apparatus 1000 can be realized in hardware by mounting circuit components such as an LSI (Large Scale Integration) in which a program is incorporated.
• alternatively, each part can be realized in software by storing a program providing the function in the storage device 514, loading the program into the main storage unit 502, and executing it on the CPU 500.
  • the configuration of the information processing device 1000 shown in FIG. 1 does not necessarily have to be configured as one independent device.
• a part of the input unit 100, the processing unit 200, and the output unit 300 (for example, the processing unit 200) may be arranged on the cloud, and an information processing system may be constructed by these.
  • the information processing apparatus 1000 can be configured as a data-driven data flow machine, for example, as shown in FIG.
  • the information processing apparatus 1000 may include a plurality of data processing processors 600 and an input / output interface unit 620.
  • the plurality of data processing processors 600 are connected in series.
  • the first-stage data processing processor 600 of the series connection of the data processing processor 600 is connected to the input / output interface unit 620 and receives data from the input / output interface unit 620.
  • the data processing processor 600 at the final stage of the series connection body of the data processing processor 600 is connected to the input / output interface unit 620 and outputs data to the input / output interface unit 620.
• the data processing processor 600 has a function of outputting the value associated with the pattern closest in information distance to the input.
  • the input / output interface unit 620 is an interface for connecting to an external output device 630, input device 640, storage device 650, or the like to transmit / receive data.
  • FIG. 9 shows one route from the input / output interface unit 620 to the input / output interface unit 620 via the plurality of data processing processors 600, but a plurality of routes may be provided in parallel. Further, a branching route or a merging route may be provided for a plurality of routes.
• each of the plurality of data processing processors 600 may be configured to include an input processing unit 602, a plurality of (for example, m) inner product units 604 1 to 604 m , a comparator 606, a selector 608, and an output processing unit 610.
  • Each of the plurality of data processing processors 600 has a function of receiving data, performing predetermined processing on input data, and outputting the processed data.
• the inner product units 604 1 to 604 m , the comparator 606, and the selector 608 may be configured by a logic gate circuit or the like.
• in the inner product units 604 1 to 604 m , the patterns (PAT1 to PATm) and values (Value1 to Valuem) of the learning data can be stored in registers.
  • the patterns (PAT1 to PATm) and values (Value1 to Valuem) of the learning data can be stored in the storage device 650 in advance. In this case, when the information processing device 1000 is started, it can be read from the storage device 650 through the input / output interface unit 620 and set in each data processing processor 600.
  • the data output from the input / output interface unit 620 is input to the input processing unit 602 of the first stage data processing processor 600.
  • the data input to the input processing unit 602 is the above-mentioned situation information data and conceptual information data.
• the input processing unit 602 outputs the input data in parallel to each of the plurality of inner product units 604 corresponding to the plurality of models constituting the learning model. For example, when the learning model includes m models, the input data is input in parallel to the m inner product units 604 1 to 604 m .
• each of the inner product units 604 1 to 604 m performs an inner product calculation of the pattern of the input data and the pattern of the learning model.
• it is assumed that the pattern of the input data is composed of the data as shown in FIG. 11A, that the pattern of the learning model corresponding to the inner product unit 604 1 is composed of the data as shown in FIG. 11B, and that the value as shown in FIG. 11C is associated with the pattern of the learning model of FIG. 11B.
• in each inner product unit, the inner product is calculated as the sum of the multiplications for each element of the two patterns.
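The element-wise multiply-and-sum just described is an ordinary inner product. A minimal sketch (the binary pattern contents are assumptions, since FIGS. 11A and 11B are not reproduced here):

```python
def inner_product(input_pattern, model_pattern):
    """Sum of the element-wise multiplications of the two patterns."""
    return sum(x * p for x, p in zip(input_pattern, model_pattern))

# Hypothetical bit patterns in the style of FIGS. 11A and 11B.
input_pattern = [1, 0, 1, 1, 0, 0, 1, 0]   # input data (FIG. 11A style)
model_pattern = [1, 0, 1, 0, 0, 0, 1, 0]   # learning model (FIG. 11B style)

score = inner_product(input_pattern, model_pattern)  # 3 matching elements
```

A larger score means the input pattern lies closer, in information distance, to that model's pattern.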
• the calculation results of the inner product units 604 1 to 604 m are input to the comparator 606.
• the inner product calculation process may be performed in multiple passes while swapping the patterns of the learning data supplied to each inner product unit 604.
  • the comparator 606 compares the output values of the inner product units 604 1 to 604 m , and outputs the learning model number (1 to m) indicating the largest inner product value with respect to the input data to the selector 608.
• the selector 608 selects, from the values associated with the patterns of the plurality of learning models, the value associated with the learning model corresponding to the number output from the comparator 606, and outputs it to the output processing unit 610.
  • the output processing unit 610 outputs the selected value to the next-stage data processing processor 600.
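Taken together, the inner product units, the comparator, and the selector implement a nearest-pattern lookup: the value associated with the best-matching pattern is forwarded to the next stage. A software sketch of one data processing processor follows; the concrete patterns, values, and m are illustrative assumptions.

```python
def data_processing_processor(input_pattern, patterns, values):
    """One processor stage: m inner product units, a comparator, a selector."""
    # Inner product units 604_1..604_m: score each stored pattern.
    scores = [sum(x * p for x, p in zip(input_pattern, pat)) for pat in patterns]
    # Comparator 606: number of the model with the largest inner product.
    best = max(range(len(scores)), key=scores.__getitem__)
    # Selector 608: the value associated with that model's pattern.
    return values[best]

# Hypothetical learning data: PAT1..PAT3 with Value1..Value3.
patterns = [
    [1, 1, 0, 0],  # PAT1
    [0, 1, 1, 0],  # PAT2
    [0, 0, 1, 1],  # PAT3
]
values = ["Value1", "Value2", "Value3"]

out = data_processing_processor([0, 0, 1, 1], patterns, values)  # best: PAT3
```

In the configuration of FIG. 9, several such processors are connected in series, each feeding its selected value to the next stage.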
• depending on the processing, data may be transmitted directly from the input processing unit 602 to the output processing unit 610. Further, when performing processing such as a state transition, the processing result by the data processing processor 600 may be returned from the output processing unit 610 to the input processing unit 602.
  • FIG. 12 is a schematic diagram showing a schematic configuration of an information processing apparatus according to the present embodiment.
  • the information processing apparatus 1000 has an input unit 100, a processing unit 200, and an output unit 300.
  • the input unit 100 has a function of generating first data representing a situation grasped from the information based on the received information.
  • the processing unit 200 has a function of generating second data representing a situation-related concept based on the first data received from the input unit 100.
  • the output unit 300 has a function of generating a word expressing the concept based on the second data received from the processing unit 200. Further, the output unit 300 is configured to output the generated words as new information to the input unit 100 when the generated words do not match the solution corresponding to the information.
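The cooperation of the three units and the feedback path can be sketched as a loop in which the generated word is fed back as new information until it matches the solution. The toy transition table and function names below are assumptions for illustration only, not the apparatus's actual processing.

```python
def information_processing_loop(information, input_unit, processing_unit,
                                output_unit, is_solution, max_steps=10):
    """Input unit -> processing unit -> output unit, with feedback."""
    word = information
    for _ in range(max_steps):
        first_data = input_unit(information)       # first data (situation)
        second_data = processing_unit(first_data)  # second data (concept)
        word = output_unit(second_data)            # generated "word"
        if is_solution(word):
            break
        information = word  # feed the word back as new information
    return word

# Toy stand-ins: each unit is a simple mapping over strings.
transitions = {
    "adjust schedule of Mr. A and Mr. B": "search with the schedule app",
    "search with the schedule app": "there is no space",
}
result = information_processing_loop(
    "adjust schedule of Mr. A and Mr. B",
    input_unit=lambda info: info,
    processing_unit=lambda data: transitions.get(data, data),
    output_unit=lambda concept: concept,
    is_solution=lambda w: w == "there is no space",
)
```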
• an example in which a partial configuration of any of the embodiments is added to another embodiment, or in which a partial configuration of an embodiment is replaced with that of another embodiment, is also an embodiment of the present invention.
• a processing method in which a program that operates the configuration of an embodiment so as to realize the functions of the above-described embodiments is recorded on a recording medium, the program recorded on the recording medium is read out as code, and the program is executed by a computer is also included in the scope of each embodiment. A computer-readable recording medium on which such a program is recorded is also included in the scope of each embodiment, and not only the recording medium on which the above-mentioned program is recorded but also the program itself is included in each embodiment.
  • the recording medium for example, a floppy (registered trademark) disk, a hard disk, an optical disk, a magneto-optical disk, a CD-ROM, a magnetic tape, a non-volatile memory card, or a ROM can be used.
• not only the program recorded on the recording medium that executes the processing by itself, but also a program that operates on an OS and executes the processing in cooperation with other software or with the functions of an expansion board is included in the category of each embodiment.
• (Appendix 1) An information processing apparatus comprising: an input unit that generates, based on received information, first data representing the situation grasped from the information; a processing unit that generates second data representing a concept related to the situation based on the first data; and an output unit that generates and outputs a word expressing the concept based on the second data, wherein the output unit is configured to output the generated word as new information to the input unit when the generated word does not match the solution corresponding to the information.
  • Appendix 2 The information processing apparatus according to Appendix 1, wherein the output unit is configured to further output an instruction regarding an action according to the concept represented by the second data.
• (Appendix 3) The processing unit has a first processing unit that generates third data conceptualizing the situation represented by the first data, and a second processing unit that generates the second data by converting the third data into another concept.
  • the first processing unit includes a first learning model and a first identification unit.
  • the first data is data that maps the relationship between a plurality of elements representing the situation and their element values.
  • the first learning model has a pattern that maps the relationship between a plurality of elements representing a specific situation and their element values, and a plurality of elements associated with the pattern and representing a concept corresponding to the specific situation. Contains multiple models, each containing a value that maps the relationship between and their element values,
• the first identification unit selects, as the third data, the value associated with the pattern having the highest goodness of fit to the first data among the plurality of models of the first learning model. The information processing apparatus according to Appendix 3.
  • the second processing unit includes a second learning model and a second identification unit.
  • the third data is data that maps the relationship between a plurality of elements representing the concept and their element values.
  • the second learning model represents a pattern that maps the relationship between a plurality of elements representing a specific concept and their element values, and another concept associated with the pattern and assumed from the specific concept. Contains multiple models, each containing a value that maps the relationship between multiple elements and their element values.
• the second identification unit selects, as the second data, the value associated with the pattern having the highest goodness of fit to the third data among the plurality of models of the second learning model. The information processing apparatus according to Appendix 3 or 4.
  • the second processing unit corresponds to the at least a part of the elements when the model corresponding to at least a part of the concept represented by the third data is not included in the second learning model.
• the processing unit is composed of a plurality of data processing processors connected in series, and each of the plurality of data processing processors is configured to output a value associated with the pattern closest in information distance to the input. The information processing apparatus according to any one of Appendices 1 to 6.
  • the second step includes a step of generating third data that conceptualizes the situation represented by the first data, and a step of converting the third data into another concept to generate the second data; the information processing method according to Appendix 8 or 9.
  • the first data is data that maps the relationship between a plurality of elements representing the situation and their element values.
  • the first learning model contains multiple models, each comprising a pattern that maps the relationship between a plurality of elements representing a specific situation and their element values, and a value, associated with the pattern, that maps the relationship between a plurality of elements representing a concept corresponding to the specific situation and their element values,
  • the value associated with the pattern having the highest goodness of fit to the first data, from among the plurality of models of the first learning model, is selected as the third data;
  • the information processing method according to Appendix 10.
  • the third data is data that maps the relationship between a plurality of elements representing the concept and their element values.
  • the second learning model contains multiple models, each comprising a pattern that maps the relationship between a plurality of elements representing a specific concept and their element values, and a value, associated with the pattern, that maps the relationship between a plurality of elements representing another concept assumed from the specific concept and their element values,
  • the value associated with the pattern having the highest goodness of fit to the third data, from among the plurality of models of the second learning model, is selected as the second data.
  • (Appendix 14) A program that causes a computer to function as a means for generating, based on received information, first data representing the situation grasped from the information.
  • (Appendix 15) A computer-readable recording medium on which the program according to Appendix 14 is recorded.
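The identification step described in the appendices above, where each model pairs a pattern (elements and element values for a specific situation or concept) with an associated value, and the value of the best-fitting pattern is selected, can be sketched as follows. This is a hypothetical illustration: the publication does not fix a concrete goodness-of-fit measure, so the fraction of matching element values is assumed here, and all function and variable names are the author's own.

```python
# Hypothetical sketch of the identification unit: each model pairs a "pattern"
# (elements -> element values for a specific situation) with a "value"
# (elements -> element values for the corresponding concept). The unit selects
# the value whose pattern best fits the input data. Goodness of fit is assumed
# to be the fraction of matching element values; the publication does not
# prescribe a specific measure.

def goodness_of_fit(pattern, data):
    """Fraction of pattern elements whose values match the input data."""
    if not pattern:
        return 0.0
    matches = sum(1 for elem, val in pattern.items() if data.get(elem) == val)
    return matches / len(pattern)

def identify(models, data):
    """Select the value associated with the best-fitting pattern."""
    best = max(models, key=lambda m: goodness_of_fit(m["pattern"], data))
    return best["value"]

# Example: two situation models; the input fits the first pattern best.
models = [
    {"pattern": {"sky": "dark", "humidity": "high"},
     "value": {"concept": "rain"}},
    {"pattern": {"sky": "clear", "humidity": "low"},
     "value": {"concept": "fine weather"}},
]
first_data = {"sky": "dark", "humidity": "high", "wind": "strong"}
print(identify(models, first_data))  # -> {'concept': 'rain'}
```

The same selection scheme applies symmetrically to the second identification unit, with the third data as input and concept-to-concept models.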

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Software Systems (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Medical Informatics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Artificial Intelligence (AREA)
  • Machine Translation (AREA)

Abstract

The invention relates to an information processing apparatus comprising: an input unit that, based on received information, generates first data representing a situation grasped from the information; a processing unit that, based on the first data, generates second data representing a concept related to the situation; and an output unit that, based on the second data, generates and outputs language representing the concept. The output unit is configured such that, when the generated language does not correspond to a solution in response to the information, the generated language is delivered to the input unit as new information.
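The feedback loop in the abstract, where generated language that is not yet a solution is fed back to the input unit as new information, can be sketched as a simple loop. This is a minimal illustration only; the unit functions, `is_solution`, and the round limit are all assumptions, not names or details from the publication.

```python
# Hypothetical sketch of the abstract's feedback loop: the output unit returns
# generated language to the input unit as new information whenever that
# language is not a solution to the received information. All names below are
# illustrative stand-ins for the input, processing and output units.

def answer(information, units, max_rounds=10):
    input_unit, processing_unit, output_unit, is_solution = units
    for _ in range(max_rounds):
        first_data = input_unit(information)       # situation grasped from the info
        second_data = processing_unit(first_data)  # concept related to the situation
        language = output_unit(second_data)        # language representing the concept
        if is_solution(language, information):
            return language
        information = language  # feed the generated language back as new input
    return None  # no solution within the allowed number of rounds

# Toy units: the "solution" is language ending in "-concept-concept",
# so exactly one feedback round is needed.
units = (
    lambda info: {"topic": info},
    lambda data: {"concept": data["topic"] + "-concept"},
    lambda data: data["concept"],
    lambda language, info: language.endswith("-concept-concept"),
)
print(answer("x", units))  # -> x-concept-concept
```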
PCT/JP2021/024124 2020-08-04 2021-06-25 Dispositif de traitement d'informations, procédé de traitement d'informations, programme et support d'enregistrement WO2022030134A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2022541152A JP7381143B2 (ja) 2020-08-04 2021-06-25 Information processing device, information processing method, program and recording medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020132276 2020-08-04
JP2020-132276 2020-08-04

Publications (1)

Publication Number Publication Date
WO2022030134A1 true WO2022030134A1 (fr) 2022-02-10

Family

ID=80119690

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/024124 WO2022030134A1 (fr) 2020-08-04 2021-06-25 Dispositif de traitement d'informations, procédé de traitement d'informations, programme et support d'enregistrement

Country Status (2)

Country Link
JP (1) JP7381143B2 (fr)
WO (1) WO2022030134A1 (fr)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2014222504A * 2014-05-24 2014-11-27 洋彰 宮崎 Autonomous thought pattern generator
US20200066277A1 (en) * 2018-08-24 2020-02-27 Bright Marbles, Inc Idea scoring for creativity tool selection

Also Published As

Publication number Publication date
JP7381143B2 (ja) 2023-11-15
JPWO2022030134A1 (fr) 2022-02-10

Similar Documents

Publication Publication Date Title
Nelson Foundations and methods of stochastic simulation
US20190034785A1 (en) System and method for program induction using probabilistic neural programs
CN112256886B (zh) 图谱中的概率计算方法、装置、计算机设备及存储介质
CN111985229A (zh) 一种序列标注方法、装置及计算机设备
CN110807566A (zh) 人工智能模型评测方法、装置、设备及存储介质
RU2670781C9 (ru) Система и способ для хранения и обработки данных
CN107807968A (zh) 基于贝叶斯网络的问答装置、方法及存储介质
CN109035028A (zh) 智能投顾策略生成方法及装置、电子设备、存储介质
Iqbal et al. Comparative investigation of learning algorithms for image classification with small dataset
US20240037345A1 (en) System and method for artificial intelligence cleaning transform
Luo et al. Diagnosing university student subject proficiency and predicting degree completion in vector space
Priore et al. Learning-based scheduling of flexible manufacturing systems using support vector machines
Nguyen et al. From black boxes to conversations: Incorporating xai in a conversational agent
WO2022030134A1 (fr) Dispositif de traitement d'informations, procédé de traitement d'informations, programme et support d'enregistrement
Carmichael A framework for a civil engineering systems BOK
US20210357791A1 (en) System and method for storing and processing data
Bryndin Ensembles of intelligent agents with expanding communication abilities
US11562126B2 (en) Coaching system and coaching method
CN113342988B (zh) 一种基于lda跨域的构建服务知识图谱实现服务组合优化的方法及系统
US11853270B2 (en) Method and apparatus for visualizing a process map
US20120084748A1 (en) System and a method for generating a domain-specific software solution
Li et al. Overconfidence in the face of ambiguity with adversarial data
KR20210148877A (ko) 전자 장치 및 이의 제어 방법
Sharma et al. Learning non-convex abstract concepts with regulated activation networks: A hybrid and evolving computational modeling approach
EP4439401A1 (fr) Procédé et système de recommandation d'une ligne de tuyau d'apprentissage automatique pour un cas d'utilisation industriel

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21854457

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2022541152

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21854457

Country of ref document: EP

Kind code of ref document: A1