WO2023286105A1 - Information processing device, information processing system, information processing program, and information processing method - Google Patents


Info

Publication number
WO2023286105A1
Authority
WO
WIPO (PCT)
Prior art keywords
information
sensor
target
unit
state
Prior art date
Application number
PCT/JP2021/026060
Other languages
English (en)
Japanese (ja)
Inventor
井上哲也
小栗佳祐
Original Assignee
株式会社ウフル
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社ウフル
Priority to PCT/JP2021/026060 (WO2023286105A1)
Priority to JP2023534430A (JPWO2023286105A1)
Publication of WO2023286105A1

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06N: COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 20/00: Machine learning

Definitions

  • the present invention relates to an information processing device, an information processing system, an information processing program, and an information processing method.
  • AI (artificial intelligence) is generated by machine learning using teacher data, which is a set of known inputs and outputs.
  • Patent Document 1 (Japanese Patent No. 6492880) proposes machine learning using feature information extracted from an image and user information input by a user who visually recognizes the image. Inference results produced by AI depend on the teacher data used for machine learning. Therefore, in the field of machine learning, a technique for generating teacher data suited to the intended use of the AI is desired.
  • An information processing apparatus includes an acquisition unit that acquires the detection result of a sensor that detects a target and that acquires, from a predetermined terminal associated with the target, state information that is obtained from information different from the detection result of the sensor and that represents the state of the target, and a generation unit that generates data used for machine learning using the detection result and the state information acquired by the acquisition unit.
  • An information processing system includes the information processing apparatus, a learning unit that executes machine learning using the data generated by the generation unit, and an estimation unit that estimates the state of the target by processing the detection result acquired by the acquisition unit using a model generated by the machine learning of the learning unit.
  • An information processing program causes a computer to acquire the detection result of a sensor that detects a target, to acquire, from a predetermined terminal associated with the target, state information that is obtained from information different from the detection result of the sensor and that represents the state of the target, and to generate data used for machine learning using the acquired detection result and state information.
  • An information processing method using one or more computers includes receiving, by a communication unit of the computer, the detection result of a sensor that detects a target; receiving, from a predetermined terminal associated with the target, state information that is obtained from information different from the detection result of the sensor and that represents the state of the target; and generating data used for machine learning using the received detection result and state information.
  • FIG. 1 is a diagram showing an information processing system according to a first embodiment.
  • FIG. 2 is a diagram showing processing related to a predetermined terminal.
  • FIG. 3 is a diagram showing association between sensors and terminals.
  • FIG. 4 is a diagram showing processing for deriving the detection result of a sensor corresponding to state information.
  • FIG. 5 is a diagram showing processing for executing machine learning.
  • FIG. 6 is a diagram showing an information processing method according to the first embodiment.
  • FIG. 7 is a diagram showing an information processing system according to a second embodiment.
  • FIG. 8 is a diagram showing an information processing system according to a third embodiment.
  • FIG. 9 is a diagram showing an information processing system according to a fourth embodiment.
  • FIG. 10 is a diagram showing a computer according to an embodiment.
  • FIG. 1 is a diagram showing an information processing system according to the first embodiment.
  • the information processing system 1 associates the first information and the second information regarding the target 2 to generate data used for machine learning.
  • This data includes teacher data for machine learning or data that is the source of the teacher data.
  • data used for machine learning is collectively referred to as teacher data as appropriate.
  • the first information corresponds to the result of detection of the target 2 by the sensor 10, and the second information corresponds to the result of the person 3 judging the state of the target 2.
  • the subject 2 is a person.
  • the target 2 will be referred to as the target person 2 as appropriate.
  • the subject 2 is, for example, an employee working in an office.
  • the person 3 is, for example, an evaluator who evaluates the working condition of the subject person 2 . Person 3 will be referred to as evaluator 3 as appropriate in the following description.
  • the information processing system 1 uses machine learning to generate an AI model that estimates the second information from the first information.
  • the above model will be referred to as an inference model as appropriate.
  • the information processing system 1 associates, for example, the detection result (first information) of the detection of the subject 2 by the sensor 10 with the evaluation result (second information) of the state of the subject 2 evaluated by the evaluator 3. to perform machine learning.
  • the information processing system 1 generates an AI inference model for estimating the state of the subject 2 from the detection result of the sensor 10 by machine learning.
  • the information processing system 1 estimates the state of the subject 2 from the detection results of the sensor 10 using AI in which this inference model is implemented. For example, the information processing system 1 estimates the working condition as the condition of the subject 2 .
  • the information processing system 1 provides the application 60 with the estimated work status of the subject person 2 .
  • the application 60 includes, for example, an application that manages attendance of the subject 2 .
  • the application 60 for example, accumulates the results of estimating the working condition of the subject 2 and creates a log.
  • The superior of the subject 2 (for example, the evaluator 3) can grasp the attendance of the subject 2 from the log generated by the application 60.
  • The AI in which the inference model is implemented estimates the state of the subject 2, so that the state of the subject 2 can be managed. For example, after the inference model is generated, the information processing system 1 reduces the time and effort the evaluator 3 spends on evaluation and helps to evaluate the state of the subject 2 objectively when managing that state.
  • the information processing system 1 includes a sensor 10 , a terminal 20 , an information processing device 30 , an information processing device 40 , and an information processing device 50 .
  • the sensor 10 detects the target 2.
  • Sensors 10 include, for example, sensors associated with subject 2 .
  • the sensor 10 may be a wearable device such as a smartwatch worn by the subject 2 .
  • the sensor 10 detects, for example, information representing the life activity of the subject 2 .
  • The information representing vital activity includes, for example, at least one of the body temperature, surface humidity, pulse, blood pressure, blood oxygen concentration, and respiratory rate of the subject 2.
  • the sensor 10 may detect the movement (eg, acceleration) of the part (eg, hand) of the subject 2 to which the sensor 10 is attached.
  • the sensor 10 may include smart glasses and may detect the line of sight of the subject 2 .
  • the terminal 20 includes the terminal of the evaluator 3 associated with the subject 2.
  • the status information includes, for example, information input to the terminal 20 by the evaluator 3 (working status of the subject 2, etc.).
  • the terminal 20 may be a portable device (eg, smart phone, tablet) possessed by the evaluator 3, or may be a stationary device (eg, personal computer).
  • FIG. 2 is a diagram showing an example of processing related to a terminal.
  • the terminal 20 is a smartphone, for example, and includes a display unit 21 .
  • The display unit 21 is overlaid with, for example, a transmissive touch pad, and the display unit 21 and the touch pad constitute a touch panel.
  • the terminal 20 displays information D1 to D4 on the display unit 21, for example.
  • the information D1 includes information indicating that the input of the condition of the subject 2 is accepted.
  • the information D1 includes the text "What is your status?" as information notifying that the status input is being accepted.
  • the information D1 may include information specifying the target 2 .
  • the information D1 includes text of the name of the target person 2 (eg, “Mr. A”) as information specifying the target 2 .
  • the information specifying the target 2 may include information different from the target person's name, and may include, for example, identification information (eg, employee number) of the target person 2 .
  • the information specifying the target 2 may be displayed separately from the information notifying that the status of the target is being received.
  • the information D1 may not include information specifying the target 2 . For example, when the number of subjects 2 corresponding to the terminal 20 is 1, the evaluator 3 can specify the subject whose state is to be input even if the information specifying the subject 2 is not displayed.
  • Information D2-D4 includes information corresponding to a plurality of candidates representing the state of target 2.
  • information D2 to information D4 are items (eg, icons) in the displayed image.
  • information D2, information D3, and information D4 will be referred to as icon D2, icon D3, and icon D4, respectively.
  • Icon D2 includes the text "at work”.
  • Icon D3 includes the text "resting”.
  • Icon D4 contains the text "Other”.
  • the texts "at work”, "resting", and “others” are information representing candidate states of the subject 2, respectively.
  • Information representing candidates for the state of the object 2 may be represented in a form different from text (for example, a pictogram).
  • the terminal 20 detects that information associated with the icon D2 (eg, "at work") has been input as information representing the state of the target 2. For example, the evaluator 3 visually recognizes the subject 2 and determines that the subject 2 is at work. In this case, when the evaluator 3 touches the icon D2, information corresponding to "at work" is input to the terminal 20 as the status information.
  • the "Other” icon D4 has a state of the target 2 different from the state candidates represented by the other icons (icon D2, icon D3), for example, when the state of the target 2 is "absent". is selected when it is determined that
  • the “other” icon D4 may be selected when the evaluator 3 cannot determine the condition of the subject 2, such as when the evaluator 3 is in a position where the subject 2 cannot be recognized.
  • the evaluator 3 may directly recognize the real object of the subject 2 and input the state of the subject 2 into the terminal 20 .
  • Direct recognition includes, for example, evaluator 3 obtaining information about subject 2 without going through a device.
  • the evaluator 3 may judge the state of the subject 2 by directly seeing the subject 2 or directly hearing the sound emitted by the subject 2 , and input the state of the subject 2 to the terminal 20 .
  • The evaluator 3 may ask the subject 2, using voice, text, or gestures, what state he or she is in, and may input the state of the subject 2 into the terminal 20 based on the response from the subject 2.
  • the evaluator 3 may indirectly recognize the subject 2 and input the state of the subject 2 to the terminal 20 .
  • Indirectly perceiving includes, for example, evaluator 3 obtaining information about subject 2 via a device.
  • the evaluator 3 inputs the state of the target 2 to the terminal 20 using the result of detecting the target 2 with a sensor (suitably referred to as a second sensor) different from the sensor 10 (suitably referred to as a first sensor).
  • the second sensor may differ from the first sensor in the type of information it detects.
  • For example, the first sensor may be a sensor that detects the vital activity of the subject 2, and the second sensor may be a camera that captures the appearance of the subject 2.
  • the evaluator 3 may input the state of the subject 2 to the terminal 20 by viewing an image (eg, moving image) of the subject 2 captured by a camera.
  • the positional relationship of the second sensor with respect to the target 2 may be different from that of the first sensor.
  • the second sensor may have a different distance from the target 2 than the first sensor.
  • the first sensor may be a sensor attached to the target 2, such as a wearable device, and the second sensor may be a sensor remote from the target 2, such as a surveillance camera.
  • the second sensor may be a sensor that detects the target 2 in a direction different from that of the first sensor.
  • For example, both the first sensor and the second sensor may include a camera, with the first sensor detecting (e.g., photographing) the target 2 from a first direction (e.g., the side) and the second sensor detecting (e.g., photographing) the target 2 from a second direction (e.g., the front) different from the first direction.
  • the information processing device 30 includes a communication unit 31, a processing unit 32, and a storage unit 33.
  • the communication unit 31 controls communication performed by the information processing device 30 with other devices.
  • the processing unit 32 processes information.
  • the processing executed by the processing unit 32 includes, for example, at least one of arithmetic processing, image processing, analysis processing, determination processing, inference processing, and control processing.
  • The storage unit 33 stores at least one of information acquired by the information processing device 30 from another device, information used for processing executed by the processing unit 32, and information generated by processing executed by the processing unit 32.
  • the storage unit 33 stores, for example, candidate information D6 as information used for processing executed by the processing unit 32 .
  • the candidate information D6 will be described later.
  • the processing unit 32 includes an acquisition unit 34, an identification unit 35, and a generation unit 36.
  • the acquisition unit 34 acquires the detection result of the sensor 10 that detects the subject 2 .
  • the acquisition unit 34 acquires the status information of the subject 2 from the terminal 20 associated with the subject 2 .
  • the state information is information obtained from information different from the detection result of the sensor 10 and includes information representing the state of the target 2 .
  • status information includes information (eg, at work, on break, or other) entered by subject 2's superior (evaluator 3).
  • the storage unit 33 stores candidate information D6.
  • The candidate information D6 includes information that associates candidates for the terminal 20 used for inputting the status information of the subject 2 with the subject 2.
  • the specifying unit 35 specifies the terminal 20 from candidates associated with the subject 2 in the candidate information stored in the storage unit 33 .
  • FIG. 3 is a diagram showing an example of association between sensors and terminals.
  • the subject 2 includes multiple subjects (2A, 2B).
  • the sensor 10 includes multiple sensors (10A, 10B).
  • the sensor 10A corresponds to the subject 2A and detects the subject 2A.
  • the sensor 10B corresponds to the subject 2B and detects the subject 2B.
  • evaluators 3 include multiple evaluators (3A, 3B).
  • evaluator 3A corresponds to each of the subject 2A and the subject 2B.
  • the evaluator 3A is, for example, the superior of the subject 2A and the subject 2B.
  • Evaluator 3B corresponds to subject 2B.
  • the evaluator 3B is, for example, the superior of the subject 2B.
  • the subject 2B may be an employee belonging to two departments, and the evaluators 3A and 3B may be managers of the respective departments to which the subject 2B belongs.
  • the terminal 20 includes multiple terminals (20A, 20B).
  • Terminal 20A is associated with evaluator 3A.
  • the terminal 20A receives the input of the condition information of the subject 2 from the evaluator 3A.
  • the evaluator 3A judges the condition of the subject 2A corresponding to the evaluator 3A, and inputs the condition information representing the judgment result to the terminal 20A.
  • the evaluator 3A judges the condition of the subject 2B corresponding to the evaluator 3A, and inputs the condition information indicating the judgment result to the terminal 20A.
  • Terminal 20B is associated with evaluator 3B.
  • the terminal 20B receives the input of the condition information of the subject 2 from the evaluator 3B.
  • the evaluator 3B determines the state of the subject 2B corresponding to the evaluator 3B, and inputs state information indicating the determination result to the terminal 20B.
  • reference D6 represents candidate information that associates the terminal 20 candidate with the target 2.
  • the candidate information D6 is, for example, table data.
  • Candidate information D6 includes, for example, items of sensor ID, target ID, terminal ID, and evaluator ID.
  • the sensor ID includes information (eg, identification information) that identifies each sensor. Sensor IDs are assigned so as not to duplicate among a plurality of sensors.
  • the sensor ID of the sensor 10A is "S10A" and the sensor ID of the sensor 10B is "S10B”.
  • the target ID includes information (eg, identification information) specifying each target 2 . Target IDs are assigned so as not to duplicate among multiple targets 2 .
  • the terminal ID includes information identifying each terminal 20 . Terminal IDs are assigned so as not to duplicate among a plurality of terminals 20 .
  • the terminal ID of the terminal 20A is "T20A” and the terminal ID of the terminal 20B is "T20B”.
  • the evaluator ID includes information (eg, identification information) that identifies each evaluator 3 . Evaluator IDs are assigned so as not to overlap among multiple evaluators.
  • The evaluator ID of the evaluator 3A is "E03A" and the evaluator ID of the evaluator 3B is "E03B".
  • the values of each item arranged on the same line are associated with each other.
  • the sensor ID "S10A” and the target ID "U02A” are arranged on the same line and have a correspondence relationship with each other.
  • the sensor 10A whose sensor ID is "S10A” has a corresponding relationship with the target person 2A whose target ID is "U02A”.
  • the sensor 10A detects the target person 2A in correspondence.
  • the sensor 10A is installed in a place where the subject 2A is assumed to be in an office.
  • the detection result output by the sensor 10A is treated as the result of detecting the subject 2A.
  • the sensor ID "S10A” and the terminal ID “T20A” are arranged on the same line and have a corresponding relationship.
  • the sensor 10A whose sensor ID is “S10A” has a corresponding relationship with the terminal 20A whose terminal ID is "T20A”.
  • the detection result of the sensor 10A is associated with information output from the corresponding terminal 20A and used as teacher data or its original data.
  • the sensor ID "S10A” and the evaluator ID “E03A” are arranged on the same line and have a correspondence relationship with each other.
  • the sensor 10A whose sensor ID is "S10A” has a corresponding relationship with the evaluator 3A whose evaluator ID is "E03A”.
  • the detection result of the sensor 10A is associated with state information input by the evaluator 3A, which has a corresponding relationship, and is used as teacher data or its original data.
  • the target ID "U02A” and the terminal ID “T20A” are arranged on the same line and have a corresponding relationship.
  • the target person 2A whose target ID is “U02A” has a corresponding relationship with the terminal 20A whose terminal ID is "T20A”.
  • The status information of the subject 2A is output from the corresponding terminal 20A. At least part of the information output from the terminal 20A is treated as the status information of the subject 2A.
  • the subject ID "U02A” and the evaluator ID "E03A” are arranged on the same line and have a correspondence relationship with each other.
  • the subject 2A whose subject ID is "U02A” has a corresponding relationship with the evaluator 3A whose evaluator ID is "E03A”.
  • the status information of the subject 2A is input by the evaluator 3A who has a corresponding relationship.
  • terminal ID "T20A” and the evaluator ID "E03A” are arranged on the same line and have a corresponding relationship.
  • Terminal 20A whose terminal ID is "T20A” is in correspondence with evaluator 3A whose evaluator ID is "E03A”.
  • the status information input to the terminal 20A is treated as input by the evaluator 3A.
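  • As an illustration of the candidate information D6 of FIG. 3, the following is a minimal sketch in Python. The column names and the `find_terminals_for_sensor` helper are hypothetical, and the third table row (terminal 20B and evaluator 3B for the subject 2B) is an assumption based on the correspondences described above.

```python
# Minimal sketch of the candidate information D6 (FIG. 3) as table data.
CANDIDATE_INFO_D6 = [
    {"sensor_id": "S10A", "target_id": "U02A", "terminal_id": "T20A", "evaluator_id": "E03A"},
    {"sensor_id": "S10B", "target_id": "U02B", "terminal_id": "T20A", "evaluator_id": "E03A"},
    {"sensor_id": "S10B", "target_id": "U02B", "terminal_id": "T20B", "evaluator_id": "E03B"},
]

def find_terminals_for_sensor(sensor_id: str) -> list[str]:
    """Return the terminal IDs associated with a sensor ID in the candidate information."""
    return [row["terminal_id"] for row in CANDIDATE_INFO_D6 if row["sensor_id"] == sensor_id]

if __name__ == "__main__":
    print(find_terminals_for_sensor("S10A"))  # ['T20A']
    print(find_terminals_for_sensor("S10B"))  # ['T20A', 'T20B']
```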
  • The sensor 10A provides the information processing device 30 with a detection result that includes, for example, a value obtained by detection (referred to as a sensor value as appropriate) or a result of processing the sensor value, time information indicating the timing at which the detection was performed (for example, a time stamp), and the sensor ID of the sensor itself.
  • The evaluator 3A judges the state of the subject 2A with whom the evaluator 3A has a corresponding relationship.
  • The evaluator 3A inputs information representing the state of the subject 2A to the terminal 20A with which the evaluator 3A has a corresponding relationship.
  • The terminal 20A provides the information processing device 30 with state information that includes, for example, the input value or a result of processing the input value, time information indicating the timing of the input (for example, a time stamp), and the terminal ID of the terminal itself.
  • state information output by the terminal 20A may include one or both of state information input to the terminal 20A and state information obtained by processing information (eg, input values) input to the terminal 20A.
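  • The following is a minimal sketch of the detection result and the state information described above. The field names and the `make_detection_result` / `make_state_information` helpers are hypothetical; they only reflect the elements named in the description (a sensor value or processed value, a time stamp, and a sensor ID, or an input value, a time stamp, and a terminal ID).

```python
import time

def make_detection_result(sensor_id: str, sensor_value: float) -> dict:
    # Detection result provided to the information processing device 30:
    # a sensor value (or a processing result of the sensor value),
    # time information (time stamp), and the sensor ID of the device itself.
    return {"sensor_id": sensor_id, "value": sensor_value, "timestamp": time.time()}

def make_state_information(terminal_id: str, status: str) -> dict:
    # State information provided by the terminal 20A:
    # the input value (or a processing result of it),
    # time information (time stamp), and the terminal ID of the terminal itself.
    return {"terminal_id": terminal_id, "status": status, "timestamp": time.time()}

# Example: the evaluator 3A taps the "at work" icon on the terminal 20A
# while the sensor 10A reports a value (e.g., a pulse reading).
detection = make_detection_result("S10A", 72.0)
state = make_state_information("T20A", "at work")
```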
  • the sensor 10A may detect at least one of a target other than the target person 2A, a person other than the target, and an object other than the target.
  • the sensor 10A may include a camera installed in the space where the subject 2A works.
  • the sensor 10A or a device external to the sensor 10 may analyze the image captured by the camera and detect the subject 2A appearing in the image.
  • the sensor 10A or the above-mentioned external device may perform image analysis on an image showing a plurality of subjects 2 and identify the subject 2A appearing in the image.
  • the target ID does not have to be associated with the sensor ID.
  • the target ID may be associated with feature information representing the external features of the target person 2A.
  • the sensor 10A or the external device may identify the target by comparing the detection result of the sensor 10A and the feature information, and associate the detection result of the specified target with the target ID associated with the feature information.
  • the sensor 10A may detect a plurality of targets 2 in chronological order according to a predetermined schedule.
  • the target 2 detected by the sensor 10A is specified using the timing at which the detection was performed and the above schedule.
  • the sensor 10A may detect a plurality of targets 2 by detecting the target 2 in a predetermined direction and switching the predetermined direction.
  • the target 2 detected by the sensor 10A is specified using information that associates the direction of detection with the target 2 in advance, and the direction of detection that has been performed.
  • the terminal 20A may output the information designating the subject 2 in the form of an image or voice when accepting the input of the status information.
  • For example, as shown in FIG. 2, the terminal 20A may display information specifying the target person 2 (for example, "Mr. A").
  • the terminal 20A may receive input of information specifying the subject 2 corresponding to the input state information.
  • the evaluator 3A may input information specifying the subject 2B (eg, name of the subject 2B, subject ID) and state information about the subject 2B into the terminal 20A.
  • the terminal 20A may provide the information processing device 30 with information including information specifying the subject 2B and state information.
  • the evaluator 3A may be a different person from the subject 2A, or may be the same person as the subject 2A.
  • For example, the subject 2A may be the same person as the evaluator 3A; in this case, the terminal 20A may be a terminal possessed by the subject 2A.
  • the subject 2A may input his/her status information to the terminal 20A.
  • the evaluator ID may not be associated with the subject ID.
  • the state information output from the terminal 20A may be associated with the detection result of the sensor 10A associated with the terminal 20A and used as teacher data.
  • the specifying unit 35 specifies the correspondence relationship between the information output from the sensor 10 and the information output from the terminal 20.
  • the specifying unit 35 specifies the state information acquired from the terminal 20 corresponding to the detection result acquired from the sensor 10 by referring to the candidate information D6.
  • the sensor ID included in the detection result obtained from the sensor 10 is "S10A.”
  • the identifying unit 35 identifies the sensor ID (here, “S10A”) from the information acquired from the sensor 10 .
  • the identifying unit 35 identifies the terminal ID (here, "T20A”) corresponding to "S10A" in the candidate information D6.
  • the identifying unit 35 identifies the state information acquired from the terminal 20A whose terminal ID is "T20A" as state information corresponding to the detection result of the sensor 10A whose sensor ID is "S10A".
  • the identifying unit 35 may identify the detection result obtained from the sensor 10 corresponding to the state information obtained from the terminal 20 .
  • the evaluator 3 inputs information specifying the subject 2 (eg, name, subject ID) when inputting the status information of the subject 2 .
  • Suppose that the terminal 20 is the terminal 20A in FIG. 3 and that, when outputting the state information, it outputs the terminal ID of its own device (here, "T20A") together with the information specifying the target 2 input by the evaluator 3 (for example, the target ID "U02B").
  • Based on the information acquired from the terminal 20A, the identifying unit 35 identifies the sensor ID (here, "S10B") associated with "T20A" and "U02B" in the candidate information D6.
  • The identifying unit 35 thereby identifies that the detection result of the sensor whose sensor ID is "S10B" corresponds to the state information about "U02B" acquired from the terminal 20A whose terminal ID is "T20A".
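  • A minimal sketch of these lookups by the identifying unit 35 follows. The function names are hypothetical, and the candidate table repeats (in abbreviated form) the rows of the earlier sketch so that the example is self-contained.

```python
from typing import Optional

# Abbreviated candidate information D6 (FIG. 3), used by the identifying unit 35.
CANDIDATE_INFO_D6 = [
    {"sensor_id": "S10A", "target_id": "U02A", "terminal_id": "T20A"},
    {"sensor_id": "S10B", "target_id": "U02B", "terminal_id": "T20A"},
    {"sensor_id": "S10B", "target_id": "U02B", "terminal_id": "T20B"},
]

def terminal_for_sensor(sensor_id: str) -> Optional[str]:
    # Forward direction: from the sensor ID in a detection result (e.g. "S10A"),
    # identify the terminal ID (e.g. "T20A") whose state information corresponds to it.
    for row in CANDIDATE_INFO_D6:
        if row["sensor_id"] == sensor_id:
            return row["terminal_id"]
    return None

def sensor_for_terminal_and_target(terminal_id: str, target_id: str) -> Optional[str]:
    # Reverse direction: from the terminal ID (e.g. "T20A") and the target ID input by
    # the evaluator (e.g. "U02B"), identify the associated sensor ID (e.g. "S10B").
    for row in CANDIDATE_INFO_D6:
        if row["terminal_id"] == terminal_id and row["target_id"] == target_id:
            return row["sensor_id"]
    return None
```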
  • the information processing device 30 may be the above-described external device and identify the target 2 from the detection result output from the sensor 10 .
  • the sensor 10A may include a camera that captures the subject 2A, and the information processing device 30 (eg, the identification unit 35) may identify the subject 2A by analyzing the image captured by the camera of the sensor 10A.
  • the identifying unit 35 identifies the terminal ID (here, “T20A”) associated in the candidate information D6 with the target ID (here, “U02A”) of the subject 2A identified from the detection result of the sensor 10A.
  • the identifying unit 35 may associate the detection result of the sensor 10A with the terminal ID (here, "T20A") identified using the detection result of the sensor 10A.
  • the sensor ID may not be associated with the target ID in the candidate information D6, and may not be included in the candidate information D6.
  • the generation unit 36 in FIG. 1 generates data by associating the detection result of the sensor 10 with the state information.
  • the detection result may be a sensor value of the sensor 10 or a value obtained by processing the sensor value of the sensor 10 (for example, a value derived as a detection result).
  • the generation unit 36 uses the detection result acquired by the acquisition unit 34 to derive the sensor detection result at the time when the state information was obtained, and associates the derived sensor detection result with the state information to generate data.
  • the state information may be information output by the terminal 20, or may be information obtained by processing this information (for example, information derived as state information).
  • the generating unit 36 generates data by associating, for example, the detection result derived from the sensor value of the sensor 10 with the state information specified by the specifying unit 35 as the state information corresponding to this detection result.
  • FIG. 4 is a diagram showing an example of the process of deriving the sensor detection results corresponding to the state information.
  • the generation unit 36 derives the timing at which the state information is acquired, and derives (eg, estimates) the sensor value at this timing.
  • the generating unit 36 uses the time indicated by the time stamp output by the terminal 20 together with the state information as the timing at which the state information was acquired.
  • the generation unit 36 derives the timing at which the state information was acquired by the process of specifying the value of the time stamp corresponding to the state information.
  • the sensor 10 performs detection operations at, for example, a predetermined sampling frequency.
  • the sensor 10 outputs a sensor value, for example, at predetermined time intervals.
  • symbol t is the time derived by the generator 36 as the time when the state information was obtained.
  • the time t is information obtained from a time stamp output by the terminal 20, for example.
  • the generation unit 36 derives the sensor value as the detection result corresponding to the state information by the process of specifying the sensor value closest to the time t from the time-series data of the sensor value.
  • the generation unit 36 may derive (eg, calculate) the detection result by statistically processing a plurality of sensor values in a predetermined period T including time t.
  • the generation unit 36 may calculate an average value of multiple sensor values in a predetermined period T as the detection result.
  • This average value may be an arithmetic mean value or a geometric mean value of the sensor values in the predetermined period T.
  • The average value may be a weighted average in which each sensor value is weighted according to the time difference between the time when that sensor value was obtained and the time t when the state information was obtained.
  • the generation unit 36 may calculate the detection result by approximation or interpolation using a plurality of sensor values in the predetermined period T.
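  • A minimal sketch of this derivation by the generation unit 36 follows, assuming the sensor values arrive as (time, value) pairs. The nearest-sample and windowed-average strategies are the ones described above; the particular weighting formula, function name, and window handling are illustrative assumptions.

```python
import numpy as np
from typing import Optional

def detection_at(samples: list[tuple[float, float]], t: float,
                 period: Optional[float] = None) -> float:
    """Derive the sensor detection result corresponding to state information obtained at time t.

    samples: time-series sensor values as (timestamp, value) pairs.
    period:  if given, average the samples within [t - period/2, t + period/2]
             (assumes at least one sample falls in that window);
             otherwise return the sample closest to t.
    """
    times = np.array([s[0] for s in samples])
    values = np.array([s[1] for s in samples])
    if period is None:
        return float(values[np.argmin(np.abs(times - t))])   # nearest sample to t
    mask = np.abs(times - t) <= period / 2.0
    window_t, window_v = times[mask], values[mask]
    # Weighted average: samples closer to t contribute more (one possible weighting).
    weights = 1.0 / (1.0 + np.abs(window_t - t))
    return float(np.average(window_v, weights=weights))
```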
  • the generation unit 36 may generate teacher data that associates one detection result (eg, sensor value) obtained from the sensor 10 with one piece of state information.
  • The generation unit 36 may generate teacher data that associates one piece of information that is a set of a plurality of detection results obtained from the sensor 10 (e.g., a waveform of sensor values with respect to time) with one piece of state information.
  • the terminal 20 does not have to output the time stamp corresponding to the timing at which the state information was acquired.
  • the generator 36 may derive the time at which the information processing device 30 acquires the state information as the timing at which the state information was acquired.
  • the generation unit 36 may derive the time when the communication unit 31 received the state information as the timing at which the state information was acquired.
  • The information processing device 30 (e.g., the processing unit 32) may transmit to the terminal 20 information requesting the provision of state information (referred to as a state information request as appropriate).
  • the generation unit 36 may derive the timing at which the state information was acquired using the time when the communication unit 31 transmitted the state information request to the terminal 20 .
  • The generator 36 may derive the time at which the state information request was transmitted as the timing at which the state information was acquired. Alternatively, as the timing at which the state information was acquired, the generating unit 36 may derive a time between the first time when the communication unit 31 transmitted the state information request and the second time when the communication unit 31 received the state information (for example, the average of the first time and the second time). The generation unit 36 may also derive the timing at which the state information was acquired using one or both of the first time and the second time together with an estimated value of the time required for communication.
  • the sensor 10 may detect the target 2 periodically, or may detect the target 2 irregularly.
  • the sensor 10 may detect the target 2 upon receiving information requesting detection of the target 2 (hereinafter referred to as a detection request).
  • the sensor 10 may output a sensor value or information obtained by processing the sensor value as a response to the detection request.
  • The device that outputs the detection request may be the information processing device 30, a device in the information processing system 1 different from the information processing device 30 (e.g., the information processing device 50), or a device external to the information processing system 1.
  • the information processing device 30 provides the data generated by the generation unit 36 to the information processing device that executes machine learning.
  • The information processing device 40 includes a learning unit 41 that executes machine learning, and the information processing device 30 provides the information processing device 40 with teacher data for machine learning or data that is the source of the teacher data.
  • FIG. 5 is a diagram showing an example of processing for executing machine learning.
  • model M is an inference model generated by deep learning.
  • the model M receives the detection result of the sensor 10 and outputs the estimation result of the state of the object 2 .
  • the model M includes an input layer M1, an intermediate layer M2 and an output layer M3.
  • the input layer M1 is a layer to which original data for inference is input.
  • the output layer M3 is a layer that outputs data indicating an inference result.
  • the intermediate layer M2 is a layer arranged between the input layer M1 and the output layer M3.
  • the number of intermediate layers M2 is arbitrary, and may be one layer or multiple layers.
  • the learning unit 41 inputs the detection result of the sensor 10 in the teacher data to the input layer M1.
  • a value input to the input layer M1 propagates to the output layer M3 via the intermediate layer M2.
  • the model M includes parameters (eg, coupling coefficients, biases) for propagating values from the input layer M1 to the output layer M3.
  • the learning unit 41 optimizes the parameters so that the output data output from the output layer M3 approaches the data representing the state information when the input data in the teacher data is input to the input layer M1.
  • the machine learning performed by the learning unit 41 is arbitrary and does not have to be deep learning.
  • the machine learning method may be a neural network that does not include the intermediate layer M2, or may be a method other than the neural network.
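  • The following is a minimal sketch of the learning performed by the learning unit 41 on the model M, written with PyTorch as an assumed framework (the description does not specify one). The layer sizes, the feature dimension of the detection result, and the three candidate states ("at work", "resting", "other") are illustrative assumptions.

```python
import torch
import torch.nn as nn

# Model M: input layer M1 -> intermediate layer M2 -> output layer M3.
# Input: a detection result of the sensor 10 (here assumed to be 16 features);
# output: scores for the candidate states (here "at work", "resting", "other").
model_m = nn.Sequential(
    nn.Linear(16, 32),   # propagation from the input layer M1 to the intermediate layer M2
    nn.ReLU(),
    nn.Linear(32, 3),    # propagation from the intermediate layer M2 to the output layer M3
)

def train(teacher_inputs: torch.Tensor, teacher_labels: torch.Tensor, epochs: int = 100) -> None:
    """Optimize the parameters (coupling coefficients, biases) so that the output for
    the teacher input approaches the data representing the state information."""
    optimizer = torch.optim.Adam(model_m.parameters(), lr=1e-3)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        optimizer.zero_grad()
        loss = loss_fn(model_m(teacher_inputs), teacher_labels)
        loss.backward()
        optimizer.step()
```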
  • the information processing device 30 may provide the learning unit 41 with teacher data obtained from the detection results and state information of a plurality of targets 2 as teacher data for one inference model.
  • the obtained inference model can be used, for example, as an inference model commonly used for a plurality of targets 2 (arbitrarily referred to as a shared model).
  • the information processing device 30 may provide the learning unit 41 with teacher data obtained from the detection results and state information of a specific target 2 (for example, the target 2A) as teacher data for one inference model.
  • the obtained inference model can be used, for example, as an inference model (arbitrarily referred to as an individual model) for estimating the state of a specific subject 2 (eg, subject 2A).
  • the number of targets 2 included in the specific target 2 may be one, or two or more.
  • a plurality of targets 2 may be a predetermined set, and a particular target 2 may be a part (eg, a subset) of the predetermined set.
  • the predetermined set may be a set of employees belonging to a specific company
  • the subset may be a set of employees belonging to a predetermined department within the specific company.
  • The learning unit 41 may generate a shared model when the amount of teacher data accumulated for the specific target 2 is less than a predetermined amount, and may generate an individual model when the accumulated amount is equal to or greater than the predetermined amount.
  • The learning unit 41 may generate a shared model using the detection results obtained by the sensor 10 in a first period, and may generate an individual model using the detection results obtained by the sensor 10 in a second period longer than the first period.
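  • A minimal sketch of the choice between a shared model and an individual model described above; the threshold value and the helper name are hypothetical.

```python
# Hypothetical threshold: below it, fall back to the shared model trained on all targets.
MIN_SAMPLES_FOR_INDIVIDUAL_MODEL = 500

def select_teacher_data(all_data: list, data_for_target: list) -> tuple[str, list]:
    """Return which kind of inference model to (re)train and the teacher data to use."""
    if len(data_for_target) >= MIN_SAMPLES_FOR_INDIVIDUAL_MODEL:
        return "individual", data_for_target   # enough data accumulated for the specific target
    return "shared", all_data                  # otherwise use the model shared by multiple targets
```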
  • the learning unit 41 provides the generated inference model to the information processing device that estimates the state of the target 2 using the detection result of the sensor 10 .
  • the information processing device 40 provides the inference model generated by the learning unit 41 to the information processing device 50 .
  • the information processing device 50 includes a processing unit realized by a general-purpose processor or the like.
  • the processing unit of the information processing device 50 configures the estimating unit 51 by executing the computation defined by the inference model.
  • the estimating unit 51 is, for example, an AI in which an inference model generated by machine learning by the learning unit 41 is implemented.
  • the information processing device 30 provides the information processing device 50 with the detection result obtained by the obtaining unit 34 from the sensor 10 .
  • the estimation unit 51 of the information processing device 50 estimates the state of the target 2 by performing an operation defined by the inference model on the detection result acquired by the acquisition unit 34 .
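  • A minimal sketch of the estimation by the estimation unit 51, continuing the PyTorch assumption used above; the state labels are the same illustrative assumptions as in the training sketch.

```python
import torch

STATE_LABELS = ["at work", "resting", "other"]  # assumed candidate states

def estimate_state(model_m: torch.nn.Module, detection_result: torch.Tensor) -> str:
    """Apply the inference model to a detection result of the sensor 10 and
    return the estimated state of the target 2."""
    model_m.eval()
    with torch.no_grad():
        scores = model_m(detection_result.unsqueeze(0))   # add a batch dimension
    return STATE_LABELS[int(scores.argmax(dim=1))]
```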
  • the information processing device 50 provides, for example, the application 60 with the state of the target 2 estimated by the estimation unit 51 (arbitrarily referred to as an estimation result of the estimation unit 51).
  • the application 60 is an application that processes the estimation result of the estimation unit 51 .
  • the application 60 uses the estimation result of the estimation unit 51, for example, to execute a process of creating a log indicating the work status of the subject 2.
  • the processing executed by the application 60 is arbitrary, and may include processing for analyzing the estimation result of the estimation unit 51 .
  • The application 60 may estimate the time during which the subject 2 was working intensively, or may calculate an index indicating the work status of the subject 2 (e.g., efficiency, degree of concentration, amount of work).
  • the application 60 may perform processing for outputting (eg, displaying) one or both of the estimation result of the estimation unit 51 and the processing result obtained by processing the estimation result of the estimation unit 51 .
  • the application 60 may be installed in at least one of the information processing devices included in the information processing system 1 .
  • the application 60 may be installed in the information processing device 30 .
  • the application 60 may be installed in at least one information processing device external to the information processing system 1 .
  • the number of applications to which the estimation result of the estimation unit 51 is provided may be one or plural.
  • the application 60 may include a web application, an application on the cloud, or an on-premises application.
  • the uses of the estimation results may be the same or different for the multiple applications.
  • a first application among the plurality of applications may manage the work status of the subject 2 and a second application among the plurality of applications may manage the health status of the subject 2 .
  • FIG. 6 is a diagram showing an information processing method according to the first embodiment.
  • FIG. 1 to FIG. 5 are referred to appropriately for each part of the information processing system 1 .
  • In step S1, the sensor 10 shown in FIG. 1 detects the target 2.
  • the sensor 10 transmits the detection result of detecting the target 2 .
  • the acquisition unit 34 acquires the detection result from the sensor 10 .
  • the acquisition unit 34 controls the communication unit 31 to receive the detection result transmitted by the sensor 10 .
  • In step S2, the evaluator 3 judges the condition of the subject 2 and inputs information representing the judgment result into the terminal 20 as condition information.
  • the acquisition unit 34 acquires state information from the terminal 20 .
  • The acquisition unit 34 controls the communication unit 31 to receive the state information transmitted by the terminal 20.
  • In step S3, the generation unit 36 generates data used for machine learning.
  • Step S3 includes, for example, steps S4 to S6.
  • In step S4, the generation unit 36 derives the timing at which the state information was acquired.
  • In step S5, the generation unit 36 derives the detection result at the timing derived in step S4.
  • In step S6, the generation unit 36 associates the state information acquired in step S2 with the detection result derived in step S5. For example, the generation unit 36 sets the detection result as the input-side teacher data for the model M (see FIG. 5) and the state information as the output-side teacher data for the model M.
  • In step S7, the information processing device 30 provides the data to the learning unit 41.
  • the communication unit 31 of the information processing device 30 transmits data including teacher data generated by the generation unit 36 to the information processing device 40 including the learning unit 41 .
  • In step S8, the learning unit 41 uses the data provided in step S7 to generate a model through machine learning.
  • In step S9, the learning unit 41 provides the model generated in step S8 to the estimation unit 51.
  • For example, the information processing device 40 including the learning unit 41 transmits data representing the inference model generated by the learning unit 41 to the information processing device 50 including the estimation unit 51.
  • In step S10, the estimation unit 51 processes the detection result of the sensor 10 using the model provided in step S9 to estimate the state of the target.
  • For example, the communication unit 31 of the information processing device 30 transmits the sensor detection result to the information processing device 50, and the estimation unit 51 estimates the state of the target using the sensor detection result received by the information processing device 50.
  • The estimation unit 51, for example, inputs the detection result to the input layer M1 of the model M (see FIG. 5) in which the result of machine learning is reflected, and derives the data output from the output layer M3 as the estimation result.
  • In step S11, the estimation unit 51 provides the application 60 with information indicating the state of the target estimated in step S10.
  • the information processing device 50 transmits data indicating the estimation result of the estimation unit 51 to the information processing device on which the application 60 is installed.
  • step S2 may be performed before at least part of the process of step S1, or may be performed in parallel with at least part of the process of step S1.
  • In step S3, the generation unit 36 may derive the timing at which the detection result of the sensor 10 was acquired, and derive the state information at the derived timing.
  • the information processing system 1 may repeatedly execute the processing from step S1 to step S3 (referred to as teacher generation processing as appropriate) and accumulate the teacher data generated in step S3.
  • the information processing system 1 may execute the processes after step S7 when the data amount of the accumulated teacher data reaches or exceeds a predetermined amount.
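  • A minimal sketch of accumulating teacher data and triggering the processes from step S7 onward once a predetermined amount has been reached; the threshold and the function names are hypothetical.

```python
TRAIN_THRESHOLD = 1000          # predetermined amount of accumulated teacher data (assumed)
accumulated_teacher_data = []   # pairs of (detection_result, state_information)

def on_teacher_data(detection_result, state_information, train_fn) -> None:
    """Accumulate one teacher data pair (steps S1 to S3) and run the processes from
    step S7 onward (providing data and learning) once enough data has accumulated."""
    accumulated_teacher_data.append((detection_result, state_information))
    if len(accumulated_teacher_data) >= TRAIN_THRESHOLD:
        train_fn(list(accumulated_teacher_data))   # e.g. hand the data to the learning unit 41
```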
  • step S8 may be repeatedly executed.
  • For example, a first model may be generated by a first teacher generation process, and a second model may be generated by a second teacher generation process executed after the first teacher generation process.
  • the second model may be generated using teacher data having a larger amount of data than the first model.
  • the first model may be generated using first teacher data
  • the second model may be generated using second teacher data accumulated after the generation of the first teacher data.
  • a second model may be generated using the first teacher data and the second teacher data.
  • the second teacher data may be generated by adding (for example, feedback) the error of the estimation result using the first model.
  • the second training data may differ from the first training data in a set of objects corresponding to the original data when generated.
  • For example, the first teacher data may be teacher data based on the detection results and state information of a plurality of targets 2, and the second teacher data may be teacher data based only on the detection results and state information of a part of the plurality of targets 2 (e.g., the target person 2A).
  • the second model may differ from the first model in terms of the state estimation target.
  • the first model is used for estimating the status of employees belonging to a given company
  • the second model is used for estimating the status of employees belonging to a given department within a given company.
  • the second model may not be generated.
  • the process of step S10 and the process of step S11 may be repeatedly performed using the first model.
  • the information processing system 1 in the above-described embodiment associates the detection result of the sensor 10 with information different from the detection result of the sensor 10 to generate an inference model.
  • the state of the target 2 can be estimated by the estimation unit 51 .
  • Such an information processing system 1 contributes to grasping the state of the subject 2 even if, for example, the evaluation by the evaluator 3 is simplified or omitted after the inference model is generated.
  • the information processing system 1 can facilitate, for example, associating different information about the target 2, and contributes to facilitating multifaceted analysis of the target 2.
  • the information processing system 1 can, for example, express the state of the target 2 as data according to a certain rule based on an inference model, and contributes to objective evaluation of the state of the target 2 .
  • the first application example has been described in which the subject 2 is an employee working in an office and the evaluator 3 is the employee's boss.
  • Application examples of the information processing system 1 will be described below, but the scope of application of the information processing system 1 is not limited to the examples described above and the examples described later.
  • In a second application example, the subject 2 is a worker at a construction site and the evaluator 3 is a site supervisor.
  • The sensor 10 detects the temperature of the target 2 (e.g., body temperature), the humidity in the vicinity of the target 2 (e.g., sweating state), the movement of the target 2 (e.g., hand movement, posture), and the vital activity of the target 2 (e.g., pulse, blood pressure).
  • the evaluator 3 evaluates the health condition of the subject 2 based on, for example, the complexion and the like, and inputs the evaluation result into the terminal 20 as condition information.
  • the information processing system 1 uses the detection result of the sensor 10 and state information to generate an inference model for estimating the health condition of the worker, for example.
  • the estimation unit 51 acquires the detection result of the sensor 10 in real time, and estimates the worker's health condition (eg, physical condition, degree of fatigue) in real time using an inference model.
  • the application 60 may use the estimation result of the estimation unit 51 to determine whether the worker's health condition is suitable for continuing work. For example, the application 60 estimates the health condition of the worker after a predetermined period of time has elapsed using the history of estimation results regarding the health condition of the worker. For example, the application 60 estimates the health condition of the worker by calculating an index indicating the health condition of the worker after a predetermined period of time has passed. The application 60 compares the health condition indicator with a threshold value to determine whether the worker's health condition is suitable for continuing work. For example, the worse the health condition, the lower the index, and the application 60 determines that the health condition of the worker is not suitable for continuing work when the calculated index is less than the threshold.
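  • A minimal sketch of the determination described above, assuming the estimation history is a list of numeric health indices where lower values mean a worse condition; the linear-trend prediction rule and the threshold are illustrative assumptions.

```python
def fit_for_continuing_work(index_history: list[float], threshold: float = 0.5) -> bool:
    """Estimate the worker's health index after a predetermined time from the history of
    estimation results and judge whether the health condition is suitable for continuing work."""
    if len(index_history) < 2:
        return True                                   # not enough history to judge
    # Simple linear extrapolation of the most recent trend (one possible estimation rule).
    trend = index_history[-1] - index_history[-2]
    predicted_index = index_history[-1] + trend
    return predicted_index >= threshold               # below the threshold: not suitable

# Example: a falling index predicts 0.35 < 0.5, so the application would output a warning.
print(fit_for_continuing_work([0.8, 0.65, 0.5]))      # False
```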
  • the application 60 may output the determination result.
  • For example, the application 60 is provided in the information processing device 30, and the information processing device 30 may provide a warning about the worker's health condition to the terminal 20 associated with the target 2 who is the worker (e.g., the site supervisor's terminal) or to another device.
  • the site supervisor may make the worker stop the work or take a rest, referring to the above warning.
  • the information processing system 1 can be used for health management of workers.
  • the position of the subject 2 and the relationship between the subject 2 and the evaluator 3 are not limited to the above examples.
  • the subject 2 may be a care recipient and the evaluator 3 may be the subject 2's caregiver.
  • the subject 2 may be a protected person and the evaluator 3 may be the subject 2's guardian.
  • the subject 2 may be a patient and the evaluator 3 may be a medical practitioner in charge of the subject 2 .
  • the target 2 may be an animal other than humans.
  • the subject 2 is a domesticated animal and the evaluator 3 is the keeper of this animal.
  • the animal may be an animal raised in a zoo or the like, or may be a pet raised by an individual.
  • the sensor 10 detects, for example, information about vital activity of the animal, which is the subject 2, and the evaluator 3 inputs the health condition of the animal, which is the subject 2, into the terminal 20 as condition information.
  • the information processing system 1 generates an inference model by the same processing as in the second application example, and uses this inference model to estimate the state of health of the animal. Such an information processing system 1 can be used for health management of the subject 2, as in the second application example.
  • the information processing system 1 may estimate the growth state as the state of the animal that is the target 2 .
  • the estimation result of the growth state may be used, for example, as a reference for adjusting breeding conditions, or as a reference for deciding shipping of livestock products (eg, shipping amount, shipping time).
  • Object 2 may be an organism other than an animal.
  • In this case as well, the information processing system 1 contributes to managing (e.g., grasping) the state of the target 2 (e.g., health condition, growth), as in the case described above where the target 2 is an animal.
  • the target 2 may be an inanimate object (eg, machine, article).
  • For example, the target 2 is a machine, and the evaluator 3 is a manager of that machine (e.g., an operator, maintenance person, or inspector).
  • the sensor 10 detects, for example, information (eg, temperature, vibration) indicating the operating state of the machine that is the object 2 .
  • the evaluator 3 inspects the object 2, which is, for example, a machine, and inputs information indicating the operating state to the terminal 20 as state information.
  • the operating state may include information as to whether the operation is normal, information indicating the degree of certainty that a failure has occurred, and information indicating whether inspection or maintenance is necessary.
  • the information processing system 1 uses the detection result of the sensor 10 and state information to generate an inference model for estimating the operating state of the machine, for example.
  • The estimating unit 51, for example, acquires the detection result of the sensor 10 and estimates the operating state of the machine using an inference model.
  • the application 60 uses the estimation result of the estimation unit 51 to output (eg, notify) information about the machine when the estimated operating state satisfies a predetermined condition.
  • the sensor 10 detects vibrations of the target 2, which is a machine, and the estimation unit 51 estimates the probability that the target 2 is out of order (arbitrarily referred to as failure probability).
  • the application 60 is provided in the information processing device 30 and outputs a warning prompting inspection of the target 2 when the failure probability estimated by the estimation unit 51 is equal to or greater than a predetermined value.
  • the information processing device 30 provides the warning output by the application 60 to the terminal 20 associated with the target 2 (for example, the terminal of the person in charge of inspection).
  • FIG. 7 is a diagram showing an information processing system according to the second embodiment.
  • the same reference numerals are given to the same configurations as those of the above-described embodiment, and the description thereof is omitted or simplified.
  • The information processing device 30 includes a control unit 37.
  • the control unit 37 requests the predetermined terminal specified by the specifying unit 35 to provide the state information.
  • the acquisition unit 34 acquires state information from the terminal 20 as a response to the request.
  • the control unit 37 requests the terminal 20 to provide state information when the detection result of the sensor 10 satisfies a predetermined condition.
  • the control unit 37 determines whether the detection result obtained from the sensor 10 by the obtaining unit 34 satisfies a predetermined condition.
  • the sensor 10 is a wearable device (eg, smart watch) worn by the subject 2 and detects information indicating the vital activity of the subject 2 .
  • the sensor 10 is worn on the hand of the subject 2 and detects acceleration.
  • the detection results of the sensor 10 show different values or waveforms, for example, when the subject 2 is working (for example, operating a computer) and when the subject 2 is resting.
  • the sensor 10 detects at least one of the subject's 2 heart rate, blood pressure, body temperature, and perspiration.
  • the detection result of the sensor 10 shows different values or waveforms depending on, for example, the stress, tension, and concentration level of the subject 2.
  • the control unit 37 performs frequency analysis on the waveform of the sensor value at predetermined time intervals to calculate the characteristic frequency of the waveform. For example, when the difference between the characteristic frequency in the first period and the characteristic frequency in the second period following the first period exceeds a threshold, the control unit 37 determines that the predetermined condition is satisfied.
  • control unit 37 may perform the above determination by comparing one value (eg, sensor value, moving average value of sensor values with respect to time) indicating the detection result of the sensor 10 with a threshold value.
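  • One possible reading of this check, sketched below in Python under the assumption that the "characteristic frequency" is the dominant FFT peak of each window, compares that peak between two consecutive periods; NumPy, the sampling rate, and the threshold are illustrative choices, not values fixed by the document.

```python
import numpy as np

def characteristic_frequency(samples: np.ndarray, sampling_rate_hz: float) -> float:
    """Frequency of the largest FFT peak of the window (one possible definition)."""
    spectrum = np.abs(np.fft.rfft(samples - samples.mean()))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sampling_rate_hz)
    return float(freqs[np.argmax(spectrum)])

def predetermined_condition_satisfied(first_period: np.ndarray,
                                      second_period: np.ndarray,
                                      sampling_rate_hz: float = 50.0,
                                      threshold_hz: float = 0.5) -> bool:
    """True when the characteristic frequency shifts by more than the threshold."""
    f1 = characteristic_frequency(first_period, sampling_rate_hz)
    f2 = characteristic_frequency(second_period, sampling_rate_hz)
    return abs(f2 - f1) > threshold_hz

# Example with synthetic acceleration: 2 Hz motion followed by 5 Hz motion.
t = np.arange(0.0, 10.0, 1.0 / 50.0)
first = np.sin(2 * np.pi * 2.0 * t)
second = np.sin(2 * np.pi * 5.0 * t)
print(predetermined_condition_satisfied(first, second))  # True: request state information
```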
  • the predetermined condition may be set to the same condition for a plurality of targets 2 or may be set for each target 2 .
  • the predetermined conditions for at least one target 2 among the plurality of targets 2 may differ from the predetermined conditions for other targets.
  • a plurality of targets 2 are divided into a plurality of groups, and the predetermined condition may be set for each group. For example, a plurality of target persons 2 may be divided into a plurality of groups (eg, departments), and predetermined conditions may be set according to the work content of each group.
  • the control unit 37 monitors the detection result of the sensor 10, for example.
  • the control unit 37 determines that the detection result of the sensor 10 satisfies a predetermined condition, it generates a state information request.
  • the state information request is information (eg, command) that causes the terminal 20 to provide (eg, transmit) state information.
  • the information processing device 30 designates the target 2 and requests provision of state information about the target 2 .
  • the state information request includes information identifying the target 2 for which state information is requested (eg, the target ID shown in FIG. 3).
  • the specifying unit 35 refers to the candidate information D6 stored in the storage unit 33, and specifies the target ID corresponding to the sensor ID of the sensor 10 that provided the detection result used in the determination.
  • the control unit 37 generates a state information request including, for example, the target ID specified by the specifying unit 35 and a message requesting state information.
  • the information processing device 30 transmits a state information request to the terminal 20 to which the state information is requested.
  • the identifying unit 35 identifies the terminal 20 corresponding to the sensor 10 that provided the detection result used in the above determination.
  • the specifying unit 35 refers to the candidate information D6 stored in the storage unit 33, and specifies the terminal ID corresponding to the sensor ID (see FIG. 3) of the sensor 10 that provided the detection result used in the determination.
  • the control unit 37 causes the communication unit 31 to transmit the state information request to the terminal 20 specified by the specifying unit 35 .
  • the communication unit 31 causes the storage unit 33 to store information (eg, time stamp) indicating the timing of transmitting the state information request.
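  • A rough sketch of this request flow is shown below; the dictionaries standing in for the candidate information D6 and the storage unit 33, and all identifiers, are hypothetical, and the stored timestamp corresponds to the first timing used later by the generation unit 36.

```python
from datetime import datetime, timezone

# Hypothetical in-memory stand-ins for the candidate information D6 and the
# storage unit 33; real identifiers and storage are not defined in this document.
SENSOR_TO_TARGET = {"sensor-001": "target-042"}
SENSOR_TO_TERMINAL = {"sensor-001": "terminal-20A"}
REQUEST_SENT_AT: dict[str, datetime] = {}

def build_state_info_request(sensor_id: str) -> dict:
    """Specifying unit 35 / control unit 37: look up the target ID and compose the request."""
    target_id = SENSOR_TO_TARGET[sensor_id]
    return {"target_id": target_id,
            "message": f"Please enter the current state of {target_id}."}

def send_state_info_request(sensor_id: str) -> str:
    """Send the request to the terminal associated with the sensor and record the timing."""
    terminal_id = SENSOR_TO_TERMINAL[sensor_id]
    request = build_state_info_request(sensor_id)
    # (actual transmission by the communication unit 31 is omitted in this sketch)
    REQUEST_SENT_AT[terminal_id] = datetime.now(timezone.utc)  # the "first timing"
    print(f"request to {terminal_id}: {request}")
    return terminal_id

send_state_info_request("sensor-001")
```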
  • When the terminal 20 receives the state information request, it outputs a notification (eg, image, voice) prompting the user to enter the state information. For example, the terminal 20 displays an image for accepting input of the state information on the display unit 21 as shown in FIG. This image includes, for example, information identifying the subject 2 (eg, the name of the subject 2). The state information is input to the terminal 20 by the evaluator 3, and the terminal 20 provides the input state information to the information processing device 30.
  • the information processing device 30 receives the state information transmitted from the terminal 20 .
  • the generation unit 36 of the information processing device 30 derives (eg, estimates) the timing at which the state information was acquired by the terminal 20, using one or both of the first timing at which the state information request is transmitted and the second timing at which the state information is transmitted.
  • For example, the generation unit 36 takes a time between the time indicating the first timing and the time indicating the second timing as the timing at which the terminal 20 acquired the state information.
  • the generation unit 36 retrieves the detection result of the sensor 10 corresponding to the derived timing and generates teacher data, as in the sketch below.
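  • The following sketch illustrates one way to realize this derivation, taking the midpoint of the two timings and pairing the nearest sensor reading with the evaluator's label; the data layout and helper names are assumptions made for the example.

```python
from datetime import datetime, timedelta

def derive_acquisition_time(request_sent: datetime, state_received: datetime) -> datetime:
    """A time between the first timing (request) and the second timing (response)."""
    return request_sent + (state_received - request_sent) / 2

def nearest_detection(sensor_log: list[tuple[datetime, float]], when: datetime) -> float:
    """Pick the sensor detection result closest to the derived timing."""
    return min(sensor_log, key=lambda entry: abs(entry[0] - when))[1]

def make_teacher_record(sensor_log, request_sent, state_received, state_info):
    """Pair the detection result at the derived timing with the state information."""
    when = derive_acquisition_time(request_sent, state_received)
    return {"input": nearest_detection(sensor_log, when), "label": state_info}

# Example usage with a toy heart-rate log and an evaluator's label.
t0 = datetime(2021, 7, 12, 9, 0, 0)
log = [(t0 + timedelta(seconds=10 * i), 70.0 + i) for i in range(30)]
print(make_teacher_record(log, t0 + timedelta(minutes=2),
                          t0 + timedelta(minutes=3), "concentrating"))
```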
  • the information processing device 30 provides teacher data to the information processing device 40 as described in the first embodiment.
  • the learning unit 41 of the information processing device 40 executes machine learning using teacher data to generate a model.
  • the information processing device 40 provides the model to the information processing device 50 .
  • the estimation unit 51 of the information processing device 50 processes the detection result of the sensor 10 using a model to estimate the state of the target 2 .
  • the information processing device 50 provides the estimation result of the state of the target 2 to the application 60 .
  • the information processing system 1 can selectively cause the evaluator 3 to evaluate the target 2 in a situation where the state of the target 2 is estimated to have changed. Such an information processing system 1 contributes, for example, to reducing the load of the process of acquiring state information while the state of the target 2 remains substantially the same.
  • FIG. 8 is a diagram showing an information processing system according to the third embodiment.
  • the same reference numerals are given to the same configurations as those of the above-described embodiment, and the description thereof will be omitted or simplified.
  • the identifying unit 35 of the information processing system 1 identifies a predetermined terminal from among the terminals 20 existing within a predetermined range with respect to the detection target position of the sensor 10 .
  • the generation unit 36 generates teacher data using the state information provided from the predetermined terminal identified by the identification unit 35 and the detection result of the sensor 10 .
  • the storage unit 33 of the information processing device 30 stores, for example, the location information D7, and the specifying unit 35 uses the location information D7 to specify the location of the predetermined terminal.
  • the position information D7 includes, for example, information on the position of the detection target of each sensor 10 (arbitrarily referred to as sensor position information).
  • the sensor position information includes, for example, information indicating the position where each sensor 10 is installed and information indicating the detectable range of each sensor 10 .
  • Sensor location information may include information indicating the location of the target 2 associated with the sensor 10 .
  • the sensor position information may be the position information of the seat of the subject 2 in the office.
  • the sensor position information may be the central position of the field of view (angle of view) when the sensor 10 takes an image.
  • the sensor position information may be given as a fixed value when the movement of the sensor 10 is not assumed (for example, when the sensor 10 is a fixed point camera).
  • the sensor position information may be given as a variable value when the sensor 10 is assumed to move (for example, when the sensor 10 is a wearable device).
  • For example, the position of the sensor 10 may be detected by a device (eg, wearable device, mobile device) that includes the sensor 10.
  • Any method may be used to detect the position of the sensor 10.
  • For example, a method using GPS, a method using signals (eg, beacon signals) received from a plurality of other devices, or another method may be used.
  • a device that detects the position of the sensor 10 may be a device including the sensor 10 or a device different from this device (for example, a device external to the sensor 10).
  • For example, the information processing device 30 may acquire the position information of the sensor 10 from a device including the sensor 10 or from a device different from this device, and may update the sensor position information stored in the storage unit 33 using the acquired information.
  • the position information D7 includes, for example, the position information of the terminal 20 (arbitrarily referred to as terminal position information).
  • the terminal location information may be given as a fixed value when the terminal 20 is not expected to move (for example, when the terminal 20 is a stationary computer).
  • the terminal location information may include information indicating the location where the terminal 20 is installed.
  • the terminal location information may include information indicating the location of the evaluator 3 associated with the terminal 20 .
  • the terminal 20 may be installed at the seat of the evaluator 3 in the office, and the terminal position information may include the position information of the seat of the evaluator 3 .
  • the terminal location information may be given as a variable value when the terminal 20 is expected to move (for example, when the terminal 20 is a smartphone).
  • For example, the position of the terminal 20 (eg, smartphone) may be detected by the terminal 20 itself.
  • Any method may be used to detect the position of the terminal 20.
  • For example, a method using GPS, a method using signals (eg, beacon signals) received from a plurality of other devices, or another method may be used.
  • a device that detects the position of the terminal 20 may be the terminal 20 or a device different from the terminal 20 (for example, a device external to the terminal 20).
  • For example, the information processing device 30 may acquire the position information of the terminal 20 from the terminal 20 or from a device different from the terminal 20, and may update the terminal position information stored in the storage unit 33 using the acquired information.
  • the specifying unit 35 refers to the position information D7 stored in the storage unit 33 and specifies (eg, determines) the terminal 20 to which the state information of the target 2 is input. For example, the specifying unit 35 sets the predetermined range R using the sensor position information included in the position information D7.
  • the predetermined range R is, for example, a range in which the distance from the detection target position of the sensor 10 is equal to or less than a predetermined value.
  • the position of the detection target of the sensor 10 is the position of the object 2 .
  • the predetermined range R is, for example, an area inside a circle (eg, geofence) centered on the position of the target 2 . The radius of this circle corresponds to the predetermined value.
  • the predetermined range R is set, for example, to a range in which the evaluator 3 can directly recognize the target 2 .
  • For example, the radius of the predetermined range R is set in advance to a distance (eg, several meters) at which a person with normal eyesight can recognize another person.
  • the specifying unit 35 determines whether or not the terminal 20 exists inside the predetermined range R using the terminal location information included in the location information D7.
  • the specifying unit 35 sets, among the terminals whose terminal location information is included in the location information D7, terminals existing inside a predetermined range R (terminals 20A and 20B in FIG. 8) as predetermined terminal candidates.
  • the specifying unit 35 excludes a terminal (the terminal 25 in FIG. 8) existing outside the predetermined range R from among the terminals whose terminal position information is included in the position information D7 from the predetermined terminal candidates.
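  • A minimal sketch of this candidate selection, assuming planar x/y coordinates for the detection target position and the terminals (the document does not fix a coordinate system or radius), might look as follows.

```python
import math

def within_range(target_xy: tuple[float, float],
                 terminal_xy: tuple[float, float],
                 radius_m: float) -> bool:
    """True when the terminal lies inside the circle (geofence) around the target."""
    return math.dist(target_xy, terminal_xy) <= radius_m

def candidate_terminals(target_xy, terminal_positions: dict, radius_m: float = 5.0):
    """Keep terminals inside the predetermined range R and exclude the rest."""
    return [tid for tid, xy in terminal_positions.items()
            if within_range(target_xy, xy, radius_m)]

# Example: terminals 20A and 20B are near the detection target, terminal 25 is not.
positions = {"20A": (1.0, 2.0), "20B": (3.5, 0.5), "25": (40.0, 12.0)}
print(candidate_terminals((2.0, 1.0), positions))  # ['20A', '20B']
```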
  • the generating unit 36 generates teacher data by associating the state information provided by the terminal set as a predetermined terminal candidate by the specifying unit 35 with the detection result of the sensor 10 that detects the target 2 .
  • the information processing device 30 transmits a state information request to at least one of the predetermined terminal candidates.
  • the control unit 37 shown in FIG. 7 selects at least one terminal 20 from the predetermined terminal candidates as the predetermined terminal.
  • the control unit 37 may randomly select a predetermined terminal from the predetermined terminal candidates.
  • the control unit 37 may select a predetermined terminal from the predetermined terminal candidates according to a predetermined rule. For example, multiple terminals are registered in advance in the information processing device 30 .
  • a priority order may be set in advance for a plurality of registered terminals, and the control unit 37 may select a predetermined terminal according to the priority order.
  • For example, the predetermined terminal candidates include the terminal 20A of the boss of the subject 2 and the terminal 20B of a colleague of the subject 2, and the control unit 37 may preferentially select the terminal 20A of the boss over the terminal 20B of the colleague (see the sketch below).
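  • One way to express such a selection rule is sketched below; the registered priority table and the fallback to random selection are illustrative assumptions.

```python
import random

# Hypothetical pre-registered priorities: a lower number means a higher priority
# (eg, the boss's terminal 20A before the colleague's terminal 20B).
TERMINAL_PRIORITY = {"terminal-20A": 1, "terminal-20B": 2}

def select_predetermined_terminal(candidates: list[str], use_priority: bool = True) -> str:
    """Pick one predetermined terminal from the candidates inside the range R."""
    if use_priority:
        # Terminals without a registered priority fall back to the lowest priority.
        return min(candidates, key=lambda tid: TERMINAL_PRIORITY.get(tid, float("inf")))
    return random.choice(candidates)  # random selection is also permitted

print(select_predetermined_terminal(["terminal-20B", "terminal-20A"]))  # terminal-20A
```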
  • the information processing device 30 when the information processing device 30 acquires the detection result of the sensor 10, it provides a state information request including information specifying the target 2 corresponding to this sensor 10 to a predetermined terminal.
  • the terminal 20 does not have to be pre-associated with the target 2, and the generator 36 associates the detection result of the sensor 10 with the state information acquired as a response to the state information request to generate teacher data.
  • the information specifying the target 2 may be associated in advance with the information specifying the sensor 10 or may be derived using the detection result of the sensor 10 .
  • For example, the sensor 10 includes a camera, and the information processing device 30 may extract features of the target 2 appearing in the image captured by the sensor 10, compare the extracted feature information with pre-registered information, and thereby specify the target 2. In this case, the information specifying the target 2 may not be associated in advance with the information specifying the sensor 10.
  • FIG. 9 is a diagram showing an information processing system according to the fourth embodiment.
  • the same reference numerals are given to the same configurations as those of the above-described embodiment, and the description thereof will be omitted or simplified.
  • the information processing system 1 generates teacher data using the detection result of the sensor 11 and the detection result of the sensor 12 different from the sensor 11 .
  • the sensor 11 is, for example, a sensor similar to the sensor 10 shown in FIG. 1.
  • the sensor 12 differs from the sensor 11 in the type of information it detects, for example.
  • the sensor 11 may be a sensor that detects vital activity of the subject 2 and the sensor 12 may be a camera that detects the appearance of the subject 2 .
  • Sensor 11 may be a sensor that detects sounds produced by object 2 and sensor 12 may be a camera that detects the appearance of object 2 .
  • the sensor 12 may differ from the sensor 11 in positional relationship with respect to the target 2 .
  • sensor 12 may have a different distance from object 2 than sensor 11 .
  • the sensor 11 may be a sensor attached to the target 2 like a wearable device, and the sensor 12 may be a sensor remote from the target 2 like a surveillance camera.
  • the sensor 12 may be a sensor that detects the target 2 in a direction different from that of the sensor 11 .
  • For example, the sensors 11 and 12 both include cameras, the sensor 11 detects (eg, photographs) the target 2 from a first direction (eg, lateral), and the sensor 12 may detect (eg, photograph) the target 2 from a second direction (eg, front) different from the first direction.
  • the sensor 12 provides the detection result to the terminal 26.
  • the terminal 26 may be the same terminal as the terminal 20 described in FIG. 1, or may be a terminal different from the terminal 20.
  • The terminal 26 has an estimation unit 27.
  • the estimation unit 27 estimates the state of the subject 2 using the detection result of the sensor 12 .
  • the estimation unit 27 processes the detection result of the sensor 12 using a model generated by machine learning to estimate the state of the object.
  • the terminal 20 uses the estimation result of the estimation unit 27 to generate state information.
  • the terminal 20 provides the generated state information to the information processing device 30 .
  • the information processing device 30 generates teacher data using the detection result of the sensor 11 and the state information provided from the terminal 20.
  • the information processing device 30 provides the generated teacher data to the information processing device 40.
  • the learning unit 41 of the information processing device 40 executes machine learning using the teacher data provided from the information processing device 30 to generate an inference model.
  • the information processing device 40 provides the generated inference model to the information processing device 50 .
  • the estimation unit 52 of the information processing device 50 may be the same as or different from the estimation unit 51 described above.
  • the estimation unit 52 estimates the state of the target 2 using the detection result of the sensor 11 and the inference model generated by the learning unit 41 .
  • the information processing device 50 provides the estimation result of the estimation unit 52 to the application 60 .
  • the information processing system 1 is used, for example, as follows. As described with reference to FIG. 1, the information processing system 1 generates teacher data using the detection result of the sensor 10 and the state information input to the terminal 20 by the evaluator 3, and generates a first inference model using the generated teacher data.
  • the sensor 12 in FIG. 9 includes a sensor that detects the same type of information as the sensor 10, and the estimation unit 27 of the terminal 26 performs estimation processing using the first inference model.
  • the sensor 10 in FIG. 1 and the sensor 12 in FIG. 9 both include sensors that detect information indicating vital activity of the subject 2 (eg, heart rate).
  • a first inference model is a model for estimating the state of the subject 2 from information indicating the vital activity of the subject 2 .
  • the estimation unit 27 of the terminal 26 processes the information indicating the vital activity of the subject 2 detected by the sensor 12 and estimates state information (eg, health condition of the subject 2).
  • Sensor 11 includes, for example, a camera.
  • the information processing apparatus 30 generates second teacher data using the data of the image of the subject 2 photographed by the sensor 11 as the detection result and the data indicating the health condition of the subject 2 estimated by the estimation unit 27 as the state information.
  • the learning unit 41 generates an inference model for estimating the health condition of the target 2 from the image of the target 2 using the second teacher data. By repeating such processing, the information processing system 1 can generate inference models that associate a plurality of types of information, as sketched below.
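  • The chained flow of the fourth embodiment can be summarized by the sketch below, in which a stand-in first model labels vital data from the sensor 12 and those labels become training targets for an image-based second model; every function here is a placeholder, not an API defined by the document.

```python
# Placeholder sketch of the chained flow: the first model labels vital data from
# the sensor 12, and those labels become targets for an image-based second model.

def first_inference_model(heart_rate: float) -> str:
    """Stand-in for the first inference model: vital activity -> state label."""
    return "good" if heart_rate < 90.0 else "tired"

def build_second_teacher_data(images: list[str], heart_rates: list[float]) -> list[tuple]:
    """Pair each image from the sensor 11 with the state estimated from the sensor 12."""
    return [(image, first_inference_model(hr)) for image, hr in zip(images, heart_rates)]

def train_second_model(teacher_data: list[tuple]):
    """Stand-in for the learning unit 41: fit an image -> state model (trivialized here)."""
    labels = [label for _, label in teacher_data]
    majority = max(set(labels), key=labels.count)
    return lambda image: majority  # a real model would learn from the image content

# Example with toy "images" and heart rates captured at matching timings.
second_model = train_second_model(
    build_second_teacher_data(["img_001", "img_002", "img_003"], [72.0, 95.0, 88.0]))
print(second_model("img_004"))  # 'good'
```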
  • FIG. 10 is a diagram showing an example of a computer according to the embodiment.
  • the computer 100 is used, for example, for at least one of the information processing devices according to the embodiments (eg, information processing device 30, information processing device 40, information processing device 50) and the terminals (eg, terminal 20, terminal 26).
  • The computer 100 includes a CPU 101, a ROM 102, a RAM 103, a bus 104, an input unit 105, an output unit 106, a storage unit 107, a communication unit 108, and a drive 109.
  • the CPU 101, the ROM 102, and the RAM 103 are interconnected via the bus 104.
  • An input unit 105 , an output unit 106 , a storage unit 107 , a communication unit 108 and a drive 109 are connected to the bus 104 .
  • the CPU 101 includes a Central Processing Unit.
  • ROM 102 includes Read Only Memory.
  • RAM 103 includes Random Access Memory.
  • the CPU 101 executes various processes according to programs recorded in the ROM 102 or programs loaded from the storage unit 107 to the RAM 103 .
  • the RAM 103 also stores data necessary for the CPU 101 to execute various kinds of processing.
  • the input unit 105 includes, for example, at least one of a keyboard, mouse, trackball, and touchpad.
  • the input unit 105 receives input of various information by being operated by the user.
  • the input unit 105 may include a microphone or the like that receives voice input.
  • the output unit 106 includes, for example, a display device (eg, display) that outputs images.
  • the output unit 106 may include a speaker that outputs audio.
  • the input unit 105 and the output unit 106 may include a touch panel in which a transmissive touch pad is superimposed on the display unit.
  • the storage unit 107 includes, for example, at least one of a hard disk, solid state drive, and nonvolatile memory.
  • the storage unit 107 stores at least part of various data externally input to the computer 100 , various data used in processing by the computer 100 , and various data generated by the computer 100 .
  • a communication unit 108 controls communication with another device via a network.
  • a removable medium 110 is appropriately attached to the drive 109 .
  • Removable media 110 include, for example, storage media such as magnetic disks, optical disks, magneto-optical disks, and semiconductor memories.
  • Drive 109 reads information from removable media 110 .
  • Information read from the removable medium 110 is stored in the storage unit 107 .
  • drive 109 reads a program stored on removable media 110 and this program is installed on computer 100 .
  • the computer 100 does not have to include one or both of the input unit 105 and the output unit 106 .
  • the output unit 106 may be a device externally attached to the computer 100, and the computer 100 may include an interface to which the output unit 106 can be connected.
  • the information processing device and the terminal read the program stored in the storage unit and the storage device, and execute various processes according to this program.
  • This program causes the computer to execute, for example: acquiring the detection result of a sensor that detects a target; acquiring, from a predetermined terminal associated with the target, state information that represents the state of the target and is obtained from information different from the detection result of the sensor; and generating data used for machine learning using the acquired detection result and state information.
  • the above program may be recorded on a computer-readable storage medium and provided.
  • the above program may be a differential program (also referred to as a differential file as appropriate) that executes various processes in combination with a program (eg, operating system) recorded in a computer system.
  • the information processing system 1 associates the first information and the second information about the target to generate data used for machine learning.
  • the second information includes information different from the first information.
  • For example, the first information includes information obtained by detecting the target 2 with a sensor, and the second information includes information representing the state of the target judged by a person.
  • the second information may include information obtained by processing the first information by the device.
  • the first information may correspond to the detection result of a sensor that has detected an object, and the second information may include the result of arithmetic processing, statistical processing, or inference processing using the first information.
  • the second information includes information representing the state of the object obtained by other systems.
  • For example, the first information may include information obtained by detecting the target 2 with a sensor, and the second information may include information on the target 2 provided by a system that monitors the target 2 or provides information about it.
  • the first information may be information of a different type from the second information.
  • the first information may be image-based information and the second information may be material composition, temperature, or force-based information.
  • the first information may be visually based information and the second information may be auditory, olfactory, gustatory, or tactile based.
  • the first information may include information acquired by a sensor attached to the target, and the second information may include information acquired by a sensor installed away from the target.
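  • As a compact illustration of such a pairing (the field names are invented for the example), a record of data used for machine learning may simply hold the first information and the second information side by side.

```python
from dataclasses import dataclass
from typing import Any

@dataclass
class MachineLearningRecord:
    first_information: Any   # eg, a sensor detection result (image, vitals, sound)
    second_information: Any  # eg, a human judgement, another system's output, or
                             # a value obtained by processing the first information

# Examples of the pairings described above (values are illustrative).
records = [
    MachineLearningRecord(first_information={"heart_rate": 72}, second_information="relaxed"),
    MachineLearningRecord(first_information="frame_0137.png", second_information={"temperature_c": 36.6}),
]
print(records[0])
```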

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Software Systems (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Medical Informatics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Artificial Intelligence (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

The invention aims to generate data to be used for machine learning. The solution according to the invention relates to an information processing device comprising: an acquisition unit that acquires a detection result from a sensor that detects a subject, and acquires, from a predetermined terminal associated with the subject, state information representing the state of the subject and obtained from information different from the detection result from the sensor; and a generation unit that generates data to be used for machine learning using the detection result and the state information acquired by the acquisition unit.
PCT/JP2021/026060 2021-07-12 2021-07-12 Dispositif de traitement d'informations, système de traitement d'informations, programme de traitement d'informations et procédé de traitement d'informations WO2023286105A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/JP2021/026060 WO2023286105A1 (fr) 2021-07-12 2021-07-12 Dispositif de traitement d'informations, système de traitement d'informations, programme de traitement d'informations et procédé de traitement d'informations
JP2023534430A JPWO2023286105A1 (fr) 2021-07-12 2021-07-12

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2021/026060 WO2023286105A1 (fr) 2021-07-12 2021-07-12 Dispositif de traitement d'informations, système de traitement d'informations, programme de traitement d'informations et procédé de traitement d'informations

Publications (1)

Publication Number Publication Date
WO2023286105A1 true WO2023286105A1 (fr) 2023-01-19

Family

ID=84919130

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/026060 WO2023286105A1 (fr) 2021-07-12 2021-07-12 Dispositif de traitement d'informations, système de traitement d'informations, programme de traitement d'informations et procédé de traitement d'informations

Country Status (2)

Country Link
JP (1) JPWO2023286105A1 (fr)
WO (1) WO2023286105A1 (fr)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003284139A (ja) * 2002-03-20 2003-10-03 Sharp Corp 情報提供サービス及び情報提供システム
JP6522173B1 (ja) * 2018-01-16 2019-05-29 株式会社エンライブ 情報処理装置及び情報処理プログラム
WO2020170986A1 (fr) * 2019-02-21 2020-08-27 ソニー株式会社 Dispositif, procédé et programme de traitement d'informations


Also Published As

Publication number Publication date
JPWO2023286105A1 (fr) 2023-01-19

Similar Documents

Publication Publication Date Title
Golan et al. A framework for operator–workstation interaction in Industry 4.0
US20210233654A1 (en) Personal protective equipment and safety management system having active worker sensing and assessment
US20100153390A1 (en) Scoring Deportment and Comportment Cohorts
CN112119396A (zh) 用于安全事件检测和可视化的具有增强现实的个人防护设备系统
US10104509B2 (en) Method and system for identifying exceptions of people behavior
US20180330815A1 (en) Dynamically-adaptive occupant monitoring and interaction systems for health care facilities
JP2018142259A (ja) 作業管理装置、方法およびプログラム
JP2021523466A (ja) 比較安全イベント評価のための個人保護具及び安全管理システム
WO2018233858A1 (fr) Procédé pour l'interaction sociale de robot
CN115797868A (zh) 一种监测对象的行为预警方法、系统、装置和介质
Sanchez et al. Hidden markov models for activity recognition in ambient intelligence environments
JP2016058078A (ja) 連想メモリによって分類されたフレームを使用して位置に対する計量値を取得すること
TW202025173A (zh) 生理數據智能處理方法與系統
Bangaru et al. Gesture recognition–based smart training assistant system for construction worker earplug-wearing training
US11704615B2 (en) Risk assessment apparatus and related methods
Cook et al. Assessing leadership behavior with observational and sensor-based methods: A brief overview
US20210271217A1 (en) Using Real Time Data For Facilities Control Systems
WO2023286105A1 (fr) Dispositif de traitement d'informations, système de traitement d'informations, programme de traitement d'informations et procédé de traitement d'informations
CN113678148A (zh) 针对个人防护设备的动态消息管理
US20200381131A1 (en) System and method for healthcare compliance
Nepal et al. A survey of passive sensing in the workplace
US11636359B2 (en) Enhanced collection of training data for machine learning to improve worksite safety and operations
US20210142047A1 (en) Salient feature extraction using neural networks with temporal modeling for real time incorporation (sentri) autism aide
Nair et al. Evaluation of an IoT framework for a workplace wellbeing application
Aksüt et al. Using wearable technological devices to improve workplace health and safety: An assessment on a sector base with multi-criteria decision-making methods

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21950049

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2023534430

Country of ref document: JP

NENP Non-entry into the national phase

Ref country code: DE