WO2022254693A1 - Operation detection device and operation detection method - Google Patents
- Publication number: WO2022254693A1 (application PCT/JP2021/021381)
- Authority: WIPO (PCT)
- Prior art keywords: operator, predetermined, sound, motion, accuracy
Classifications
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- B60K35/10—Input arrangements, i.e. from user to vehicle, associated with vehicle functions or specially adapted therefor
- G06F3/015—Input arrangements based on nervous system activity detection, e.g. brain waves [EEG] detection, electromyograms [EMG] detection, electrodermal response detection
- G06F3/0304—Detection arrangements using opto-electronic means
- G06N3/08—Learning methods (neural networks, G06N3/02)
- G06F2203/0381—Multimodal input, i.e. interface arrangements enabling the user to issue commands by simultaneous use of input devices of different nature, e.g. voice plus gesture on digitizer
Definitions
- the present invention relates to a detection device and detection method for detecting an operator's operation.
- in a known technique, user interface (UI) parts are displayed on an operation plane operated by the user (operator); the touch position where the user's finger touches the operation plane is detected; the shape of the user's hand touching the operation plane is acquired; the projection area obtained when the hand is projected onto the operation plane is recognized from the acquired hand shape; and the user's operation on a UI component is detected from the detected touch position and the position where the UI component is displayed.
- Patent Document 1 discloses such a technique of detecting a user's operation on a UI component and determining the content of the operation on the UI component according to the recognized projection area and the detected user's operation.
- the problem to be solved by the present invention is to provide an operation detection device and an operation detection method that can improve the detection accuracy of the operator's operation when the operator's touch position cannot be detected.
- the present invention detects a motion of an operator, estimates the accuracy (certainty) with which the operator has performed a predetermined operation based on the detected motion, detects a predetermined sound generated when the predetermined operation is performed, and determines that the operator has performed the predetermined operation when the accuracy is equal to or greater than a predetermined value and the predetermined sound is detected.
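The determination rule described above can be sketched as a few lines of logic. The following Python fragment is purely illustrative; the threshold value and all names (`CERTAINTY_THRESHOLD`, `operation_performed`) are assumptions for the sketch, not part of the disclosure.

```python
# Sketch of the core determination: the operator is judged to have performed
# the predetermined operation only when the estimated accuracy meets the
# predetermined value AND the predetermined sound was detected.

CERTAINTY_THRESHOLD = 0.8  # the "predetermined value", e.g. 80%

def operation_performed(certainty: float, sound_detected: bool) -> bool:
    """Return True when both conditions of the determination hold."""
    return certainty >= CERTAINTY_THRESHOLD and sound_detected

print(operation_performed(0.9, True))   # both conditions met
print(operation_performed(0.9, False))  # sound missing
print(operation_performed(0.5, True))   # accuracy too low
```

Either condition alone is not enough: a high-certainty motion without the sound, or the sound without the motion, does not count as a completed operation.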
- FIG. 1 is a block diagram showing one embodiment of an operation detection system including an operation detection device according to the present invention
- FIG. 2 is a perspective view showing an example of the positional relationship between the imaging device, myoelectric potential measurement device, sound collector, and display device of FIG. 1 and an operator.
- FIG. 3 is an explanatory diagram showing an example of a trained model used in the accuracy estimation unit of FIG. 1.
- FIG. 4 is an explanatory diagram showing an example of a predetermined procedure.
- FIG. 5 is a graph showing, for the predetermined procedure shown in FIG. 4, the accuracy of execution of the predetermined operation versus the progress of the steps, and the generation of the predetermined sound versus the progress of the steps.
- FIG. 6 is an explanatory diagram showing another example of the predetermined procedure.
- FIG. 7 is a flowchart showing an example of an information processing procedure in the operation detection system of FIG. 1.
- FIG. 1 is a block diagram showing an operation detection system 1 according to the invention.
- the operation detection system 1 is a device that detects an operator's operation on a certain device.
- at a vehicle dealer (hereinafter also referred to as a "dealer"), it can be used, for example, to confirm whether or not maintenance by a mechanic engaged in vehicle maintenance has been carried out according to the manual.
- An operator whose operation is detected by the operation detection system 1 (hereinafter also simply referred to as "operator") is not particularly limited, and examples thereof include a vehicle crew member, a factory worker, and a dealer mechanic.
- the operation detection system 1 includes an imaging device 11, a myoelectric potential measurement device 12, a sound collector 13, a display device 14, and an operation detection device 15.
- the devices constituting the operation detection system 1 are connected in a state in which data can be exchanged with each other by known means such as a wired or wireless LAN.
- the number of imaging devices 11, myoelectric potential measurement devices 12, sound collectors 13, and display devices 14 is not particularly limited as long as at least one of each is provided.
- the imaging device 11, myoelectric potential measurement device 12, sound collector 13, and display device 14 do not need to be provided together with the operation detection device 15, and may be installed at a location away from the operation detection device 15.
- for example, the imaging device 11, the sound collector 13, and the display device 14 may be installed near the assembly line of an assembly plant, while the operation detection device 15 is installed in a central control room away from the assembly line, or provided on a server at a location remote from the assembly plant.
- the imaging device 11 is a device for acquiring image data of objects existing around the operator, and is, for example, a camera including an imaging device such as a CCD, an ultrasonic camera, an infrared camera, or the like.
- Objects include objects existing around the operator in addition to the operator.
- objects include switches and touch panels around the vehicle occupants, parts assembled by workers and tools used, and vehicles maintained by dealer mechanics.
- the imaging device 11 is installed at a position where it can detect the operator's actions, such as near the dashboard, roof, or seats of a vehicle, the assembly line or workbench of an assembly plant, the vicinity of the tools used by workers, or the lift of a dealer.
- the myoelectric potential measurement device 12 is a device for measuring the operator's myoelectric potential, and is, for example, an electromyograph.
- the type of myoelectric potential measurement device 12 is not particularly limited, and may be, for example, an electromyograph with needle electrodes, an electromyograph with surface electrodes, or an electromyograph with wire electrodes.
- the myoelectric potential measuring device 12 is attached to the operator's body by, for example, an adhesive pad or hook-and-loop fastener, and measures the myoelectric potential of the attached portion that comes into contact with the operator's body.
- the part of the operator to which the myoelectric potential measuring device 12 is attached includes the arm of a vehicle occupant, the upper arm and/or the forearm of a worker, the leg of a mechanic at a dealer, and the like.
- the sound collecting device 13 is a device for acquiring ambient sound as audio data, and is, for example, a microphone such as a stand microphone, a close-talking microphone, or a gun microphone.
- a microphone may be omnidirectional or directional.
- the communication method may be either wired or wireless.
- Sounds acquired by the sound collector 13 include the operator's voice and the voices of people around the operator, as well as sounds caused by the operator's operations. Sounds caused by the operator's operation include the sound generated from a switch operated by a vehicle occupant, the sound generated from the speaker when a vehicle occupant touches the touch panel, the sound of parts meshing together when a worker assembles multiple parts, and the operating sound of tools used by dealer mechanics.
- the sound collector 13 is installed at a position where it can detect sounds around the operator, such as the dashboard, roof, or seats of a vehicle, or the assembly line, workbench, or tools of an assembly plant, and may be installed together with the imaging device 11 or the myoelectric potential measurement device 12.
- the display device 14 is a device for notifying the operator of the operation detected by the operation detection device 15 .
- the display device 14 is, for example, a liquid crystal display, a projector, or the like, and may include a speaker.
- the display device 14 is installed at a position where necessary information can be notified to the operator, such as the dashboard of a vehicle or a worker's station in an assembly plant. If a supervisor monitors the operation, it is installed near the supervisor; in that case, if the supervisor is located away from the operator, the display device 14 is likewise installed at a location away from the operator. The display device 14 may also be attached to the operator as a wearable terminal.
- the imaging device 11, the myoelectric potential measurement device 12, the sound collector 13, the display device 14, and the operation detection device 15 may be integrated into one wearable terminal and attached to the operator. Also, instead of the display device 14, only a speaker that emits an alarm sound may be used.
- FIG. 2 shows an example of the positional relationship between the imaging device 11, the myoelectric potential measuring device 12, the sound collecting device 13, the display device 14, and the operator.
- the operator OP is a worker engaged in assembly work in an assembly factory, and is assembling parts P on a workbench WB.
- An imaging device 11 is installed above the operator OP, and acquires image data of how the parts P are assembled by the operator OP on the workbench WB.
- a myoelectric potential measuring device 12 is attached to the right forearm of the operator OP; from the measured myoelectric potential values, the movement of the muscles when the operator OP assembles the part P is acquired, and the motion of the operator OP is thereby detected.
- a sound collector 13 is attached to the right upper arm of the operator OP, and detects the sound of the parts engaging with each other, which is generated when the operator OP assembles the parts P.
- a display device 14 for notifying the operator OP of the detected operation is installed, for example, as a liquid crystal display 14a in front of the operator OP, or may be provided as earphones 14b worn on the ears of the operator OP.
- the operation detection device 15 is a device for determining whether or not the operator has performed a certain operation.
- Image data, myoelectric potential data, and audio data used for the determination are acquired from the imaging device 11, the myoelectric potential measuring device 12, and the sound collecting device 13, respectively, at predetermined time intervals.
- the operation detection device 15 uses the processor 16 to implement functions such as processing of acquired data, determination of whether or not an operation has been properly executed, and output of the result of the determination.
- the processor 16 includes a ROM (Read Only Memory) 162 storing a program, a CPU (Central Processing Unit) 161, which is an operating circuit that functions as the operation detection device 15 by executing the program stored in the ROM 162, and a RAM (Random Access Memory) 163 functioning as an accessible storage device.
- a program used in the operation detection device 15 of the present embodiment includes an operation detection unit 2, which is a functional block for implementing functions such as processing of acquired data, determination of whether or not an operation has been properly executed, and output of the determination result. The operation detection unit 2 acquires image data from the imaging device 11, myoelectric potential values from the myoelectric potential measurement device 12, and voice data from the sound collector 13, and has a function of estimating the operation performed by the operator based on the acquired data, determining whether or not the estimated operation has been appropriately performed, and outputting the determination result.
- the operation detection unit 2 includes a motion detection unit 21, an accuracy estimation unit 22, a sound detection unit 23, an operation determination unit 24, and a determination result output unit 25, as shown in FIG. 1, in which each unit is extracted and shown for convenience.
- the operation detection device 15 shown in FIG. 1 includes all of the functional blocks described above, but a single operation detection device 15 need not include all of them; some of the functional blocks may be provided in another device included in the operation detection system 1 or in another information processing device (not shown).
- the determination result output section 25 may be provided in the display device 14 . In this case, the functions of the determination result output unit 25 are executed using the CPU, ROM, and RAM of the display device 14 .
- it is not necessary for a single device to execute all the processing of each functional block; the function of each functional block may be realized across multiple devices connected so that data can be exchanged.
- for example, part of the processing performed by the motion detection unit 21 may be executed by the imaging device 11 or the myoelectric potential measurement device 12, and the remaining processing by the operation detection device 15.
- the CPU, ROM, and RAM of the imaging device 11 or myoelectric potential measuring device 12 are used to perform part of the processing for realizing the function of the motion detection unit 21 .
- likewise, part of the processing performed by the sound detection unit 23 may be performed by the sound collector 13 and the rest by the operation detection device 15.
- the motion detection unit 21 has a function of detecting the motion of the operator. For example, the motion detection unit 21 acquires image data including the operator from the imaging device 11 and detects the motion of the operator from the acquired image data. The motion detection unit 21 performs analysis such as pattern matching on image data acquired from the imaging device 11, and classifies objects included in the image data. Next, an operator is selected from the classified objects, and data relating to the operator extracted from the image data is acquired. Then, each part of the operator's body and its positional relationship are recognized from the data about the operator, and the operator's motion is detected from the recognized positional relationship of each part.
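The image-based pipeline just described (classify objects, select the operator, read off body-part positions, derive a motion from how they change) might be sketched as follows. The frames here are pre-labelled dictionaries standing in for real image analysis, and every name and threshold is an illustrative assumption, not from the disclosure.

```python
# Toy sketch of the motion detection unit 21's image pipeline: pick the
# operator out of the classified objects and derive a coarse motion label
# from the change in hand position between two frames.

def select_operator(objects):
    """Select the operator from the objects classified by pattern matching."""
    return next(obj for obj in objects if obj["label"] == "operator")

def detect_motion(prev_frame, curr_frame):
    """Return a coarse motion label from the change in hand position."""
    prev_hand = select_operator(prev_frame)["parts"]["hand"]
    curr_hand = select_operator(curr_frame)["parts"]["hand"]
    dx = curr_hand[0] - prev_hand[0]
    dy = curr_hand[1] - prev_hand[1]
    if abs(dx) < 2 and abs(dy) < 2:
        return "hand stationary"
    return "reaching" if dx > 0 else "withdrawing"

# Frames as they might come out of the classification stage:
frame_a = [{"label": "operator", "parts": {"hand": (100, 80)}},
           {"label": "workbench", "parts": {}}]
frame_b = [{"label": "operator", "parts": {"hand": (130, 78)}},
           {"label": "workbench", "parts": {}}]

print(detect_motion(frame_a, frame_b))  # hand moved toward the part: "reaching"
```

In practice each placeholder would be an image-analysis model; the point is only the flow from classified objects to a motion label.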
- the motion detection unit 21 may instead acquire measured myoelectric potential values from the myoelectric potential measurement device 12 and detect the operator's motion from the acquired values.
- the myoelectric potential measurement device 12 is attached to the arm of a worker working on an assembly line in a factory, and the potential of each muscle in the upper arm and forearm of the worker's arm is measured. From the measured potential of each muscle, which muscle is contracting, that is, how the worker's arm is moving is detected as a motion.
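The EMG-based detection above (decide which muscle is contracting from the measured potentials) is commonly done by rectifying and smoothing each trace into an activation envelope. The following sketch assumes that approach; the window length, threshold, and muscle names are invented for illustration.

```python
# Toy EMG motion detection: rectify and smooth each muscle's potential trace
# into a crude envelope, then report which muscles are currently contracting.

def envelope(samples, window=3):
    """Moving average of the rectified signal (a crude EMG envelope)."""
    rectified = [abs(s) for s in samples]
    return [sum(rectified[max(0, i - window + 1):i + 1]) /
            len(rectified[max(0, i - window + 1):i + 1])
            for i in range(len(rectified))]

def contracting_muscles(traces, threshold=0.5):
    """Return the muscles whose envelope exceeds the threshold at the end."""
    return [name for name, samples in traces.items()
            if envelope(samples)[-1] >= threshold]

traces = {
    "biceps":  [0.1, -0.1, 0.05, 0.0, 0.1],   # quiet muscle
    "forearm": [0.2, -0.9, 1.1, -1.0, 0.8],   # active burst
}
print(contracting_muscles(traces))
```

The pattern of contracting muscles over time is what the motion detection unit would map to an arm movement.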
- the accuracy estimation unit 22 has a function of estimating the accuracy with which the operator has performed a predetermined operation based on the operator's motion detected by the motion detection unit 21 .
- the predetermined operation is not particularly limited, and includes any operation by which the operator performs some kind of input on an operation target. Specific examples include: a vehicle occupant pushing a switch to move the windows of the vehicle up and down; a vehicle occupant touching a touch panel to change the map display of the navigation device; an assembly plant worker fitting the coupler connected to a sensor with the coupler connected to an electronic control unit (ECU); an assembly plant worker using tools to tighten bolts and attach an exhaust manifold to an engine block; a dealer's mechanic fitting a spark plug into an engine; and a dealer's mechanic tightening a bolt using a torque wrench.
- the accuracy estimation unit 22 estimates the operation that the operator is about to perform based on the operator's motion detected by the motion detection unit 21 when estimating the certainty that the predetermined operation has been performed.
- the correspondence between the operator's motion and the operation that the operator is about to execute is obtained in advance for each operation and stored in a database such as the database 17, for example.
- the accuracy estimating unit 22 can acquire the correspondence between operations and actions from the database 17 as necessary. Based on the correspondence obtained from the database 17, the accuracy estimation unit 22 estimates the operation that the operator is about to perform from the operator's motion.
- the motion detection unit 21 detects a motion of an occupant sitting in the front passenger seat of a vehicle wiping sweat from his face with a handkerchief and then reaching out to a switch installed on the front passenger door.
- the accuracy estimating unit 22 searches the correspondence relationships between motions and operations acquired from the database 17 for the operation corresponding to the motion of reaching for the switch installed on the front passenger seat side door. Suppose that the operations corresponding to this motion are the operation of pulling the handle of the passenger seat side door to open the door and the operation of pushing in the switch for moving the window glass of the vehicle up and down. Since the vehicle was running and the occupant had been wiping sweat from his face with a handkerchief, the accuracy estimating unit 22 estimates that the motion corresponds to the operation of pushing in the switch for moving the window glass up and down.
- the motion detection unit 21 detects a motion of a worker working on an assembly line in a factory holding an uncoupled coupler in his left hand and extending his right hand toward another coupler.
- the accuracy estimating unit 22 searches for an operation corresponding to the motion of reaching out to the coupler from the correspondence relationship between motions and operations acquired from the database 17 .
- the operations corresponding to this action include the operation of removing one coupler from a pair of couplers already engaged, and the operation of engaging the coupler connected to the sensor with the coupler connected to the electronic control unit.
- since the worker is holding the unmated coupler in his left hand, the accuracy estimating unit 22 presumes that the worker's motion corresponds to the operation of mating the coupler connected to the sensor with the coupler connected to the electronic control unit.
- the accuracy estimation unit 22 determines whether the operation estimated as described above corresponds to the predetermined operation. Then, when it is determined that the estimated operation corresponds to the predetermined operation, the probability that the predetermined operation has been performed is estimated.
- the accuracy is predetermined as a function of the operator's motion for each predetermined operation, and is stored in a database such as the database 17, for example.
- the accuracy estimating unit 22 can acquire from the database 17, as necessary, the accuracy that a predetermined operation has been performed for the detected motion. For example, in the case of an operation of touching the touch panel to change the display of a device, the accuracy is highest when the occupant's hand touching the touch panel is detected, and decreases as the occupant's hand moves away from the touch panel. In the case of the operation of pushing a switch, the accuracy is highest when the occupant's hand touches the switch, and decreases as the occupant's hand moves away from the switch.
- in the case of the operation of mating couplers, the accuracy is highest when the mated coupler stops moving and the worker's hand stops moving, and decreases as the distance between the couplers increases.
- in the case of the operation of tightening a bolt using a torque wrench, the accuracy is highest when the movement of the torque wrench stops and the mechanic's hand is stationary, and is lower while the torque wrench is rotating.
- in the case of the operation of fitting a spark plug, the accuracy is highest when the spark plug stops rotating and the mechanic's hand is stationary, and decreases while the hand is moving and after the mechanic's body has moved away from the engine.
- Such accuracy estimation is performed in parallel with detection of the operator's motion by the motion detection unit 21, and the accuracy is estimated by the accuracy estimation unit 22 each time the operator's motion changes.
- the accuracy estimating unit 22 can estimate the accuracy at predetermined time intervals, and the predetermined time can be appropriately set according to the computing power of the CPU 161 .
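The distance-based accuracy functions described above can be illustrated with a simple falloff curve. The linear shape and the 20 cm scale below are invented for illustration; the description only states that accuracy is highest at the target and decreases with distance.

```python
# Illustrative accuracy-vs-distance function: 1.0 when the hand is at the
# operation target (e.g. touching the switch), falling to 0.0 as the hand
# moves beyond an assumed falloff distance.

def certainty_from_distance(distance_cm: float, falloff_cm: float = 20.0) -> float:
    """Accuracy in [0, 1]: 1.0 at the target, 0.0 beyond the falloff distance."""
    return max(0.0, 1.0 - distance_cm / falloff_cm)

print(certainty_from_distance(0.0))   # hand touching the switch
print(certainty_from_distance(10.0))  # halfway to the falloff distance
print(certainty_from_distance(30.0))  # out of range
```

Re-evaluating such a function at each detected motion gives the time-varying accuracy curve that the operation determination unit later compares against the predetermined value.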
- the accuracy estimation unit 22 can estimate the accuracy using a learned model that has undergone machine learning for estimating the accuracy from the operator's motion detected by the motion detection unit 21 .
- a trained model is a model trained in advance by machine learning so that appropriate output data is obtained for given input data, and comprises at least a program that performs the computation from input data to output data and the weighting coefficients (parameters) used in that computation.
- the trained model of the present embodiment causes a computer (specifically, the CPU 161 of the processor 16) to output, from input data including the operator's motion, output data including the operation corresponding to that motion and the accuracy with which the operation has been executed.
- although the trained model of this embodiment is not particularly limited, it is, for example, a neural network 3 as shown in FIG. 3.
- the neural network 3 comprises an input layer 31, an intermediate layer 32 and an output layer 33, each layer containing at least one neuron.
- the input layer 31 receives input data 34 including operator motion data detected by the motion detection unit 21 and outputs the input data to the intermediate layer 32 .
- the intermediate layer 32 extracts motion data from the data input from the input layer 31 . Next, the motion in the extracted motion data is associated with the operation. Then, when the operation is a predetermined operation, the accuracy with which the predetermined operation has been performed for the action is estimated.
- the output layer 33 outputs the data input from the intermediate layer 32 as output data 35 including accuracy data.
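The three-layer structure just described (input layer 31, intermediate layer 32, output layer 33, each with at least one neuron) can be sketched as a minimal feedforward network. The weights below are fixed toy values so the structure is visible; in the embodiment they would be set by machine learning, and the choice of motion features is an assumption.

```python
# Minimal feedforward network in the shape of neural network 3:
# input data 34 (motion features) -> intermediate layer -> output data 35.

import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def layer(inputs, weights, biases):
    """One fully connected layer: each neuron weights its inputs and fires."""
    return [sigmoid(sum(w * x for w, x in zip(ws, inputs)) + b)
            for ws, b in zip(weights, biases)]

def forward(motion_features):
    """Map motion features to an execution accuracy in (0, 1)."""
    hidden = layer(motion_features, [[1.5, -1.0], [0.5, 2.0]], [0.0, -0.5])
    (certainty,) = layer(hidden, [[2.0, 1.0]], [-1.5])
    return certainty

score = forward([0.9, 0.2])  # e.g. "hand near target", "low hand speed"
print(round(score, 3))
```

Training (the role of the machine learning unit 22a) would adjust the weight matrices so that motions corresponding to a predetermined operation yield high output accuracy.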
- actions such as reaching out to another coupler with the right hand, or a factory worker holding a torque wrench in his right hand and fitting it onto a bolt of the engine block on the workbench, are associated with their corresponding operations, and this correspondence relationship is established by machine learning.
- the corresponding relationship is associated with a plurality of factors such as the tool used by the operator and the environment around the operator.
- the parameters are set so that the appropriate operation is output for a given motion. For example, the fact that the vehicle in which the occupant is riding is running, or the fact that the worker is holding a coupler in each hand, is taken into consideration as a parameter when estimating the operation from the motion.
- the correspondence relationship of the intermediate layer 32 described above may be learned in advance by machine learning; alternatively, new learning may be performed using teacher data 36 that includes input data 34 previously input to the neural network 3 and output data 35 previously output from the neural network 3, or a trained model previously learned by machine learning may be trained further.
- the learning is performed by the machine learning unit 22a included in the accuracy estimation unit 22.
- the teacher data 36 is stored in a database such as the database 17 shown in FIG. 1, for example, and can be obtained as needed.
- the sound detection unit 23 has a function of detecting a predetermined sound generated when performing a predetermined operation.
- the predetermined sound is not particularly limited, and includes all sounds generated from an operation target operated by the operator in a predetermined operation and an object attached to the operation target. Specifically, if the predetermined operation is an operation of pushing a switch, it is the operation sound of the switch that is generated when the switch is pushed. If the predetermined operation is a touch panel operation, the sound is generated from the speaker when the touch panel is touched. When the predetermined operation is an operation of fitting the couplers together, the sound is generated when the fitting portions of the couplers are engaged.
- a predetermined sound corresponding to a predetermined operation is stored as data such as waveform data in a database such as the database 17, and can be obtained as needed.
- the sound detection unit 23 determines whether or not the sound data acquired from the sound collector 13 contains the predetermined sound as described above. Specifically, the sound detection unit 23 performs a process of reducing noise such as road noise, engine sound, factory noise, and human voices on the audio data acquired from the sound collector 13 . Then, the same sound as the predetermined sound obtained from the database 17 is detected from the audio data that has undergone necessary preprocessing such as noise reduction by performing frequency analysis or the like. When the same sound as the predetermined sound is detected from the preprocessed audio data, the sound detection section 23 outputs to the operation determination section 24 that the predetermined sound has been detected. Note that the correspondence between the predetermined operation and the predetermined sound is not necessarily a one-to-one correspondence, and a plurality of predetermined sounds may be set for one predetermined operation.
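The matching step described above (frequency analysis of the pre-processed audio against the stored predetermined sound) might, in its simplest form, compare dominant frequencies. The naive DFT, sample rate, and tolerance below are illustrative assumptions; a real implementation would use a richer spectral comparison.

```python
# Illustrative frequency-analysis matching for the sound detection unit 23:
# compare the dominant frequency of the heard audio with that of the stored
# reference sound for the predetermined operation.

import cmath
import math

def dominant_freq(samples, sample_rate):
    """Return the frequency of the bin with the most energy (naive DFT)."""
    n = len(samples)
    best_k, best_mag = 0, 0.0
    for k in range(1, n // 2):
        coeff = sum(s * cmath.exp(-2j * cmath.pi * k * i / n)
                    for i, s in enumerate(samples))
        if abs(coeff) > best_mag:
            best_k, best_mag = k, abs(coeff)
    return best_k * sample_rate / n

def matches_reference(samples, reference, sample_rate=8000, tol_hz=50.0):
    """True when the dominant frequencies agree within the tolerance."""
    return abs(dominant_freq(samples, sample_rate)
               - dominant_freq(reference, sample_rate)) <= tol_hz

rate, n = 8000, 64
click = [math.sin(2 * math.pi * 1000 * i / rate) for i in range(n)]       # stored sound
heard = [math.sin(2 * math.pi * 1000 * i / rate) * 0.5 for i in range(n)] # quieter, same pitch
other = [math.sin(2 * math.pi * 2000 * i / rate) for i in range(n)]       # different pitch

print(matches_reference(heard, click))  # same pitch despite lower volume
print(matches_reference(other, click))  # different pitch, no match
```

Noise reduction would run before this step, and one predetermined operation may have several reference sounds, any one of which counts as a match.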
- the operation determination unit 24 has a function of combining the accuracy estimated by the accuracy estimation unit 22 with the timing at which the predetermined sound is detected by the sound detection unit 23 to determine whether or not the predetermined operation has been executed according to a predetermined procedure.
- a predetermined procedure is a sequence of operations set in advance for a predetermined operation. Specifically, when the accuracy estimated by the accuracy estimation unit 22 is equal to or greater than a predetermined value and the sound detection unit 23 detects the predetermined sound, the operation determination unit 24 determines that the predetermined operation has been executed according to the predetermined procedure.
- the predetermined value is, for example, 80%, and can be set to any appropriate value within a range that makes it possible to determine whether or not the predetermined operation has been appropriately performed.
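The determination rule described above (accuracy at or above the predetermined value, and the predetermined sound detected) reduces to a simple conjunction. A minimal sketch, with the 0.8 default standing in for the "80%" example; the function name is illustrative:

```python
def operation_executed(accuracy, sound_detected, predetermined_value=0.8):
    """Operation-determination rule: the predetermined operation is judged
    to have been executed only when the estimated accuracy is at or above
    the predetermined value AND the predetermined sound was detected."""
    return accuracy >= predetermined_value and sound_detected
```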
- the switch SW shown in FIG. 4 switches between ON and OFF of the apparatus, and is OFF when positioned on the left side in the horizontal direction of FIG. 4, and ON when positioned on the right side.
- the switch SW can be switched from the OFF state to the ON state by the following procedure.
- step 1) the occupant touches the switch SW with the finger F of the hand
- step 2) the occupant pushes the switch SW in the ON direction with the finger F
- step 3) the occupant pushes the switch SW in the ON direction with the finger F until the operating sound S is generated
- the operation determination unit 24 acquires the accuracy from the accuracy estimation unit 22 at predetermined time intervals and grasps, as the detected motion changes, the timing at which the accuracy reaches or exceeds the predetermined value.
- a graph like the one shown in the upper part of FIG. 5 is thus obtained.
- the operation determination unit 24 determines whether or not the switch operating sound S, which is the predetermined sound, is detected while the occupant is performing the procedure of step 3), that is, while the accuracy is equal to or higher than the predetermined value.
- a graph such as that shown in the lower part of FIG. 5 can be obtained.
- the sound detection unit 23 analyzes the audio data acquired by the sound collector 13 using a method such as frequency analysis, particularly while step 3) is being executed, and detects the operating sound S of the switch.
- if the operation determination unit 24 receives an output from the sound detection unit 23 indicating that the predetermined sound has been detected while the occupant is performing the procedure of step 3), it determines that the occupant's operation of pushing the switch of the vehicle has been executed according to the predetermined procedure. On the other hand, if no output from the sound detection unit 23 is received while the occupant is performing the procedure of step 3) and the occupant proceeds to the procedure of step 4), the operation determination unit 24 determines that the operation of the switch of the vehicle has not been executed according to the predetermined procedure. That is, since the operating sound of the switch was not detected in step 3), it is determined that the switch was not pushed to the position where it operates properly, and that the occupant's switch input operation was not appropriately executed.
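The timing check illustrated by the graphs of FIG. 5, asking whether the operating sound S was detected during the interval in which the accuracy is at or above the predetermined value, can be sketched as follows. The parallel per-sample lists are an assumed data layout, not one specified in this disclosure:

```python
def sound_within_high_accuracy_window(accuracies, sound_flags, threshold=0.8):
    """Check whether the predetermined sound was detected at any sampling
    instant at which the estimated accuracy was at or above the threshold,
    i.e. during the step-3) window of the graphs. `accuracies` and
    `sound_flags` are parallel per-sample lists (assumed layout)."""
    return any(a >= threshold and s
               for a, s in zip(accuracies, sound_flags))
```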
- with reference to FIG. 6, a case will now be explained in which the predetermined operation is an operation in which a worker fits two couplers together.
- the coupler C1 held in the worker's left hand and the coupler C2 held in the worker's right hand are fitted together, and when the fitting portions of the coupler C1 and the coupler C2 are fitted properly, the operating sound S is generated.
- the predetermined procedure in this case is, for example, as shown in FIG. 6: in step 3), the couplers C1 and C2 are pushed into each other until the fitting portion Z engages and a clicking operating sound S is generated; and in step 4), when the operating sound S has been generated, the couplers C1 and C2 are released.
- the operation determination unit 24 acquires the accuracy from the accuracy estimation unit 22 and grasps the timing at which the accuracy becomes equal to or greater than the predetermined value. In this predetermined procedure, it is assumed that the accuracy becomes equal to or greater than the predetermined value in step 3). While the worker is performing the procedure of step 3), that is, while the accuracy is equal to or higher than the predetermined value, the operation determination unit 24 determines whether the predetermined sound, that is, the operating sound S generated when the fitting portion Z of the couplers engages, is detected. If an output indicating that the predetermined sound has been detected is received from the sound detection unit 23 while the worker is performing the procedure of step 3), the operation determination unit 24 determines that the worker's operation of fitting the couplers C1 and C2 has been executed according to the predetermined procedure.
- on the other hand, if no output indicating that the predetermined sound has been detected is received from the sound detection unit 23 while the worker is performing the procedure of step 3), the operation determination unit 24 determines that the worker's operation of fitting the couplers C1 and C2 has not been executed according to the predetermined procedure. That is, since the operating sound S of the fitting portion was not detected in step 3), it is determined that the couplers were not pushed to the position where they mesh properly, and therefore that the couplers C1 and C2 were not properly fitted.
- predetermined procedures and predetermined operations are not limited to a one-to-one correspondence, and a plurality of predetermined procedures may be set for one predetermined operation.
- the determination result output unit 25 has a function of outputting the determination result of the operation determination unit 24 to an external device, and a function of outputting, when the operation determination unit 24 determines that the predetermined operation was not executed, an indication to that effect. By outputting a determination result indicating that the predetermined operation was not properly performed to the display device 14 and displaying it there, the operator can be notified that the predetermined operation was not properly executed. For example, if the operator drives a bus and the supervisor who controls the operation of the bus is in an operation control room away from the bus driven by the operator, the determination result is output from the determination result output unit 25 to the display device 14 provided in the operation control room. As a result, the supervisor can monitor the operation status of the bus from a position away from the operator, and can check whether the operator driving the bus has operated the ventilator and air conditioner to appropriately control the temperature inside the bus.
- FIG. 7 is an example of a flowchart showing information processing in the operation detection system 1 of this embodiment. The processing described below is executed by the processor 16 of the operation detection device 15 at predetermined time intervals.
- step S1 the motion of the operator is detected by the function of the motion detection unit 21.
- the motion of the occupant is detected from image data acquired by an imaging device 11 installed on the dashboard of the vehicle, for example.
- the function of the accuracy estimation unit 22 estimates the operation that the operator is about to perform using the correspondence between operations and actions stored in the database 17.
- the operation of the occupant is estimated from the correspondence relationship between operations and motions, based on the motion detected by the motion detection unit 21 using the image data.
- the function of the accuracy estimation unit 22 determines whether or not the estimated operation corresponds to the predetermined operation and, if it is determined that it does, estimates the accuracy with which the predetermined operation is being executed. The estimation uses the function of motion and accuracy stored in the database 17.
- the accuracy with which the operator interface is being operated is estimated from the function of motion and accuracy, based on the motion detected by the motion detection unit 21 using the image data.
- the function of the sound detection unit 23 detects a predetermined sound using the sound collector 13.
- the sound collector 13 detects the operating sound generated from the speaker of the vehicle when the operator interface is operated.
- step S5 the function of the operation determination unit 24 determines whether or not the sound detection unit 23 has detected the predetermined sound while the accuracy estimated by the accuracy estimation unit 22 is equal to or greater than the predetermined value. If so, it is determined that the operator has executed the predetermined operation, the execution of the routine is stopped, and the operation detection process is terminated. On the other hand, if the sound detection unit 23 does not detect the predetermined sound while the accuracy is equal to or greater than the predetermined value, it is determined that the predetermined operation has not been executed according to the predetermined procedure, and the process proceeds to step S6.
- step S6 the function of the determination result output unit 25 outputs to the display device 14 that the predetermined operation was not properly executed. The display device 14 receives the output of the determination result output unit 25 and notifies the operator that the predetermined operation was not properly executed; for example, when an occupant of the vehicle has operated the operator interface of an in-vehicle device, a message indicating that the operation was not performed properly is displayed on the operator interface. After the notification, the execution of the routine is stopped and the operation detection process is terminated.
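Steps S1 to S6 above can be sketched as one pass of a routine in which the motion detection unit 21, accuracy estimation unit 22, sound detection unit 23, and display device 14 are stand-ins passed as callables; the names and the 0.8 threshold are illustrative assumptions:

```python
def operation_detection_routine(detect_motion, estimate_accuracy,
                                detect_sound, notify,
                                predetermined_value=0.8):
    """One pass of the S1-S6 flow: detect the operator's motion, estimate
    the accuracy that the predetermined operation is being performed,
    detect the predetermined sound, and either accept the operation or
    notify the display device that it was not properly executed."""
    motion = detect_motion()                       # step S1
    accuracy = estimate_accuracy(motion)           # steps S2-S3
    sound = detect_sound()                         # step S4
    if accuracy >= predetermined_value and sound:  # step S5
        return True                                # operation executed
    notify("predetermined operation not properly executed")  # step S6
    return False
```

The routine would be re-invoked by the processor 16 at predetermined time intervals, as the flowchart description states.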
- step S1 when the vehicle occupant opens the sunroof or operates the ventilator, for example, the motion of the occupant is detected in step S1 from the image data acquired by the imaging device 11 installed on the window glass at the rear of the vehicle.
- step S2 based on the motion detected by the motion detection unit 21 using the image data, the operation of the occupant is estimated from the correspondence relationship between the operation and the motion.
- step S3 based on the motion detected by the motion detection unit 21 using the image data, the probability that the sunroof or ventilator has been operated is estimated from the function of motion and probability.
- the sound of the opening/closing mechanism of the sunroof being activated or the sound of the ventilator being opened is detected.
- step S5 it is determined whether or not the sound of the opening/closing mechanism of the sunroof or the sound of the ventilator being opened is detected while the accuracy of operating the sunroof or the ventilator is equal to or higher than a predetermined value.
- if the sound is detected, it is determined that the sunroof or the ventilator has been properly operated, and the operation detection process ends.
- if the sound is not detected, it is determined that the operation of the sunroof or the ventilator has not been performed according to the predetermined procedure, and the process proceeds to step S6.
- step S6 it is output to the display device 14 that the operation of the sunroof or the ventilator was not properly executed.
- step S1 when a worker engaged in assembly work in a factory tightens a bolt using a tool, for example, the motion of the worker is detected in step S1 from the myoelectric potential values acquired by the myoelectric potential measurement device 12 attached to the worker.
- step S2 based on the motion detected by the motion detection unit 21 using the myoelectric potential values, the operation of the worker is estimated from the correspondence relationship between operations and motions.
- step S3 based on the motion detected by the motion detection unit 21 using the value of myoelectric potential, the accuracy of the bolt tightening operation being performed is estimated from the function of the motion and the accuracy.
- step S4 the sound generated from the tool when the bolt has been tightened is detected.
- step S5 it is determined whether or not the sound generated from the tool is detected while the accuracy of the bolt tightening operation is equal to or higher than a predetermined value.
- if the sound generated from the tool is detected, it is determined that the bolt has been properly tightened, and the operation detection process ends.
- if the sound generated from the tool is not detected, it is determined that the bolt has not been properly tightened, and the process proceeds to step S6.
- step S6 it is output to the display device 14 that the bolt has not been properly tightened.
- step S1 when a worker engaged in assembly work in a factory connects two couplers, for example, the motion of the worker is detected in step S1 from the image data acquired by the imaging device 11 attached to the worker as a wearable terminal.
- step S2 based on the motion detected by the motion detection unit 21 using the image data, the operator's operation is estimated from the correspondence relationship between the operation and the motion.
- step S3 based on the motion detected by the motion detection unit 21 using the image data, the accuracy of the worker combining the couplers is estimated from the function of the motion and the accuracy.
- a sound generated from the fitting portion when the coupler is fitted is detected.
- step S5 it is determined whether or not a sound generated from the fitting portion is detected while the accuracy of performing the operation of fitting the coupler is equal to or higher than a predetermined value.
- if the sound generated from the fitting portion is detected, it is determined that the couplers are properly fitted, and the operation detection process is terminated.
- if the sound generated from the fitting portion is not detected, it is determined that the couplers have not been properly fitted, and the process proceeds to step S6. Then, in step S6, it is output to the display device 14 that the couplers have not been properly fitted.
- as described above, according to this embodiment, there is provided an operation detection device comprising: the motion detection unit 21 that detects the motion of the operator; the accuracy estimation unit 22 that estimates, based on the motion of the operator detected by the motion detection unit 21, the accuracy with which the operator has performed a predetermined operation; the sound detection unit 23 that detects a predetermined sound generated when the predetermined operation is performed; and the operation determination unit 24 that determines that the operator has performed the predetermined operation when the accuracy estimated by the accuracy estimation unit 22 is equal to or greater than a predetermined value and the predetermined sound is detected by the sound detection unit 23.
- the operator's operation can be detected without detecting the operator's touch position.
- the accuracy estimation unit 22 estimates the accuracy at predetermined time intervals. This makes it possible to estimate the accuracy more precisely.
- the action detection unit 21 acquires an image including the operator, and detects the action of the operator from the acquired image. Thereby, the motion of the operator can be detected without attaching the device to the operator.
- when the predetermined sound is not detected by the sound detection unit 23 while the accuracy is equal to or greater than the predetermined value, the operation determination unit 24 determines that the predetermined operation has not been performed according to the predetermined procedure. Thereby, the operation detection device 15 can recognize that the operation was not performed according to the predetermined procedure.
- when the operation determination unit 24 determines that the predetermined operation was not performed, the determination result output unit 25 outputs that the predetermined operation has not been performed. Thereby, the operator can be notified that the operation was not performed according to the predetermined procedure.
- the display device 14 receives the output of the determination result output unit 25 and notifies the operator that the predetermined operation has not been performed. Thereby, the operator can be notified that the operation was not performed according to the predetermined procedure.
- the predetermined sound is a sound generated due to the predetermined operation by the operator.
- the sound to be detected by the sound detection unit 23 can be narrowed down, and the predetermined sound can be detected more accurately.
- the accuracy estimation unit 22 estimates the accuracy using a trained model that has undergone machine learning for estimating the accuracy from the motion of the operator detected by the motion detection unit 21. As a result, it is possible to estimate the accuracy even for operations that have not been set in advance.
- the trained model is the neural network 3 in which, when the input data 34 including the data of the motion of the operator detected by the motion detection unit 21 is input to the input layer 31, the output data 35 including the accuracy data is output from the output layer 33.
- the accuracy estimation unit 22 includes a machine learning unit that trains the neural network 3 using, as the teacher data 36, the input data 34 previously input to the neural network 3 and the output data 35 previously output from the neural network 3, and the neural network 3 is trained by this machine learning unit. This makes it possible to estimate more accurately the accuracy of operations that have not been set in advance.
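A minimal sketch of the forward pass of the neural network 3: motion-feature input, one intermediate layer, and an output layer producing the accuracy, squashed by a sigmoid so it lies in [0, 1]. The layer sizes, ReLU activation, and weight layout are illustrative assumptions; the actual weights would come from the machine learning unit trained on past input/output pairs used as teacher data:

```python
import math

def estimate_accuracy(motion_features, w_hidden, b_hidden, w_out, b_out):
    """Forward pass: input layer takes motion-data features, one hidden
    (intermediate) layer with ReLU, and the output layer emits the
    accuracy through a sigmoid so it lies in [0, 1]. Shapes and the
    activation choice are illustrative, not from the disclosure."""
    hidden = [max(0.0, sum(w * x for w, x in zip(row, motion_features)) + b)
              for row, b in zip(w_hidden, b_hidden)]       # hidden layer (ReLU)
    z = sum(w * h for w, h in zip(w_out, hidden)) + b_out  # output layer
    return 1.0 / (1.0 + math.exp(-z))                      # accuracy in [0, 1]
```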
- according to the operation detection method of this embodiment, the motion of the operator is detected, the accuracy with which the operator has performed a predetermined operation is estimated based on the detected motion of the operator, a predetermined sound generated when the predetermined operation is performed is detected, and when the accuracy is equal to or greater than a predetermined value and the predetermined sound is detected, it is determined that the operator has performed the predetermined operation.
- the operator's operation can be detected without detecting the operator's touch position.
Abstract
Description
FIG. 1 is a block diagram showing an operation detection system 1 according to the present invention. The operation detection system 1 is a device that detects an operator's operation on a certain apparatus, and can be used, for example, to detect the operation of in-vehicle equipment by a vehicle occupant (including the driver; the same applies hereinafter) and invoke the function associated with that operation; to detect, at a vehicle assembly plant, whether the work of a worker engaged in assembly has been performed according to a predetermined procedure; or to confirm, at a vehicle dealership (hereinafter also referred to as a "dealer"), whether maintenance by a mechanic engaged in vehicle maintenance has been performed according to the manual. The operator whose operation is detected by the operation detection system 1 (hereinafter also simply referred to as the "operator") is not particularly limited, and examples include vehicle occupants, factory workers, and dealer mechanics.
The program used in the operation detection device 15 of this embodiment includes the operation detection unit 2, a functional block for realizing, in the operation detection device 15, functions such as processing the acquired data, determining whether an operation has been appropriately executed, and outputting the result of that determination. The operation detection unit 2 has functions of acquiring image data from the imaging device 11, myoelectric potential values from the myoelectric potential measurement device 12, and audio data from the sound collector 13, estimating the operation performed by the operator based on the acquired data, determining whether the estimated operation has been appropriately executed, and outputting the determination result. As shown in FIG. 1, the operation detection unit 2 includes the motion detection unit 21, the accuracy estimation unit 22, the sound detection unit 23, the operation determination unit 24, and the determination result output unit 25. FIG. 1 shows these units extracted for convenience.
With reference to FIG. 7, the procedure by which the operation detection device 15 processes information will be described. FIG. 7 is an example of a flowchart showing information processing in the operation detection system 1 of this embodiment. The processing described below is executed by the processor 16 of the operation detection device 15 at predetermined time intervals.
As described above, according to the operation detection device 15 of this embodiment, there is provided an operation detection device comprising: the motion detection unit 21 that detects the motion of the operator; the accuracy estimation unit 22 that estimates, based on the motion of the operator detected by the motion detection unit 21, the accuracy with which the operator has performed a predetermined operation; the sound detection unit 23 that detects a predetermined sound generated when the predetermined operation is performed; and the operation determination unit 24 that determines that the operator has performed the predetermined operation when the accuracy estimated by the accuracy estimation unit 22 is equal to or greater than a predetermined value and the predetermined sound is detected by the sound detection unit 23. This makes it possible to detect the operator's operation without detecting the operator's touch position. Moreover, even when it is difficult to determine from images alone whether an operation has been properly executed, it is possible to determine whether the operation has been properly completed.
11…imaging device
12…myoelectric potential measurement device
13…sound collector
14…display device
14a…liquid crystal display
14b…earphone
15…operation detection device
16…processor
161…CPU
162…ROM
163…RAM
17…database
2…operation detection unit
21…motion detection unit
22…accuracy estimation unit
22a…machine learning unit
23…sound detection unit
24…operation determination unit
25…determination result output unit
3…neural network
31…input layer
32…intermediate layer
33…output layer
34…input data
35…output data
36…teacher data
C1, C2…couplers
F…finger
OP…operator
P…part
S…operating sound
SW…switch
WB…workbench
Z…fitting portion
Claims (10)
- a motion detection unit that detects a motion of an operator;
an accuracy estimation unit that estimates, based on the motion of the operator detected by the motion detection unit, an accuracy with which the operator has performed a predetermined operation;
a sound detection unit that detects a predetermined sound generated when the predetermined operation is performed; and
an operation determination unit that determines that the operator has performed the predetermined operation when the accuracy estimated by the accuracy estimation unit is equal to or greater than a predetermined value and the predetermined sound is detected by the sound detection unit; an operation detection device comprising the above. - The operation detection device according to claim 1, wherein the accuracy estimation unit estimates the accuracy at predetermined time intervals.
- The operation detection device according to claim 1 or 2, wherein the motion detection unit acquires an image including the operator and detects the motion of the operator from the acquired image.
- The operation detection device according to any one of claims 1 to 3, wherein the operation determination unit determines, when the predetermined sound is not detected by the sound detection unit while the accuracy is equal to or greater than the predetermined value, that the predetermined operation was not executed according to a predetermined procedure.
- The operation detection device according to claim 4, comprising a determination result output unit that outputs, when the operation determination unit determines that the predetermined operation was not executed, an indication that the predetermined operation was not executed.
- The operation detection device according to claim 5, comprising a display device that receives the output of the determination result output unit and notifies the operator that the predetermined operation was not executed.
- The operation detection device according to any one of claims 1 to 6, wherein the predetermined sound is a sound generated due to the predetermined operation by the operator.
- The operation detection device according to any one of claims 1 to 7, wherein the accuracy estimation unit estimates the accuracy using a trained model that has undergone machine learning for estimating the accuracy from the motion of the operator detected by the motion detection unit.
- The operation detection device according to claim 8, wherein the trained model is a neural network in which, when input data including data of the motion of the operator detected by the motion detection unit is input to an input layer, output data including data of the accuracy is output from an output layer,
and the accuracy estimation unit comprises a machine learning unit that trains the neural network using, as teacher data, input data previously input to the neural network and output data previously output from the neural network, and trains the neural network by the machine learning unit. - An operation detection method comprising:
detecting a motion of an operator;
estimating, based on the detected motion of the operator, an accuracy with which the operator has performed a predetermined operation;
detecting a predetermined sound generated when the predetermined operation is performed; and
determining, when the accuracy is equal to or greater than a predetermined value and the predetermined sound is detected, that the operator has performed the predetermined operation.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP21943339.8A EP4350483A1 (en) | 2021-06-04 | 2021-06-04 | Operation detection device and operation detection method |
JP2023525312A JPWO2022254693A1 (ja) | 2021-06-04 | 2021-06-04 | |
CN202180098857.XA CN117396829A (zh) | 2021-06-04 | 2021-06-04 | 操作检测装置和操作检测方法 |
PCT/JP2021/021381 WO2022254693A1 (ja) | 2021-06-04 | 2021-06-04 | 操作検出装置及び操作検出方法 |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2021/021381 WO2022254693A1 (ja) | 2021-06-04 | 2021-06-04 | 操作検出装置及び操作検出方法 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022254693A1 true WO2022254693A1 (ja) | 2022-12-08 |
Family
ID=84322959
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2021/021381 WO2022254693A1 (ja) | 2021-06-04 | 2021-06-04 | 操作検出装置及び操作検出方法 |
Country Status (4)
Country | Link |
---|---|
EP (1) | EP4350483A1 (ja) |
JP (1) | JPWO2022254693A1 (ja) |
CN (1) | CN117396829A (ja) |
WO (1) | WO2022254693A1 (ja) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2014147785A1 (ja) * | 2013-03-21 | 2014-09-25 | 富士通株式会社 | 動作検知装置,動作検知方法,プログラム及び記録媒体 |
JP2015088161A (ja) * | 2013-09-26 | 2015-05-07 | 富士通株式会社 | ジェスチャ入力装置、ジェスチャ入力方法、およびジェスチャ入力プログラム |
JP2017162126A (ja) | 2016-03-08 | 2017-09-14 | キヤノン株式会社 | 入力システム、入力方法、制御用プログラム、及び記憶媒体 |
JP2018173913A (ja) * | 2017-03-31 | 2018-11-08 | 綜合警備保障株式会社 | 画像処理システム、情報処理装置、プログラム |
JP2020064376A (ja) * | 2018-10-15 | 2020-04-23 | 東京瓦斯株式会社 | 情報処理システムおよびプログラム |
-
2021
- 2021-06-04 CN CN202180098857.XA patent/CN117396829A/zh active Pending
- 2021-06-04 EP EP21943339.8A patent/EP4350483A1/en active Pending
- 2021-06-04 JP JP2023525312A patent/JPWO2022254693A1/ja active Pending
- 2021-06-04 WO PCT/JP2021/021381 patent/WO2022254693A1/ja active Application Filing
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2014147785A1 (ja) * | 2013-03-21 | 2014-09-25 | 富士通株式会社 | 動作検知装置,動作検知方法,プログラム及び記録媒体 |
JP2015088161A (ja) * | 2013-09-26 | 2015-05-07 | 富士通株式会社 | ジェスチャ入力装置、ジェスチャ入力方法、およびジェスチャ入力プログラム |
JP2017162126A (ja) | 2016-03-08 | 2017-09-14 | キヤノン株式会社 | 入力システム、入力方法、制御用プログラム、及び記憶媒体 |
JP2018173913A (ja) * | 2017-03-31 | 2018-11-08 | 綜合警備保障株式会社 | 画像処理システム、情報処理装置、プログラム |
JP2020064376A (ja) * | 2018-10-15 | 2020-04-23 | 東京瓦斯株式会社 | 情報処理システムおよびプログラム |
Also Published As
Publication number | Publication date |
---|---|
EP4350483A1 (en) | 2024-04-10 |
JPWO2022254693A1 (ja) | 2022-12-08 |
CN117396829A (zh) | 2024-01-12 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10324425B2 (en) | Human collaborative robot system having improved external force detection accuracy by machine learning | |
US11498216B2 (en) | Remote control manipulator system and control device | |
US11931905B2 (en) | Failure prediction method and failure prediction apparatus | |
CN111459274B (zh) | 一种基于5g+ ar的针对非结构化环境的遥操作方法 | |
WO2019198179A1 (ja) | 搭乗者状態判定装置、警告出力制御装置及び搭乗者状態判定方法 | |
CN109543651A (zh) | 一种驾驶员危险驾驶行为检测方法 | |
CN113412178A (zh) | 机器人控制装置、机器人系统以及机器人控制方法 | |
CN112215093A (zh) | 一种车辆驾驶能力水平的评价方法及装置 | |
WO2022254693A1 (ja) | 操作検出装置及び操作検出方法 | |
JPH06110543A (ja) | 直接教示装置 | |
JP5284179B2 (ja) | 作業判定システム及び作業判定方法並びに該作業判定方法を記録した記録媒体 | |
Zhao et al. | In vehicle diver postural monitoring using a depth camera kinect | |
Solomon et al. | Driver Attention and Behavior Detection with Kinect | |
KR102013854B1 (ko) | 상지 다관절 임피던스 측정 방법 및 그 장치 | |
JPH09225872A (ja) | ロボット教示装置 | |
CN113386125A (zh) | 机器人、控制装置、信息处理装置、方法及存储介质 | |
WO2021195916A1 (zh) | 动态手部仿真方法、装置和系统 | |
Lopez et al. | Taichi algorithm: human-like arm data generation applied on non-anthropomorphic robotic manipulators for demonstration | |
JP2020163511A (ja) | 遠隔操作ロボットの作業支援システム及び作業支援方法 | |
McMahan et al. | Spectral subtraction of robot motion noise for improved event detection in tactile acceleration signals | |
JP2022186188A (ja) | 監視装置及び監視方法 | |
CN115225682B (zh) | 管理服务器、远程操作系统、远程操作方法以及存储介质 | |
Ogawa et al. | Development of interface for teleoperation of humanoid robot using task model method | |
Zhang et al. | Visuotactile feedback parallel gripper for robotic adaptive grasping | |
US20210008711A1 (en) | Robotic device, robotic device controlling system, and robotic device controlling method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 21943339 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2023525312 Country of ref document: JP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 202180098857.X Country of ref document: CN |
|
WWE | Wipo information: entry into national phase |
Ref document number: 18566893 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2021943339 Country of ref document: EP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
ENP | Entry into the national phase |
Ref document number: 2021943339 Country of ref document: EP Effective date: 20240104 |