WO2023281648A1 - Remote Control System - Google Patents
Remote Control System
- Publication number
- WO2023281648A1 (PCT/JP2021/025584; application JP2021025584W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- haptic
- target
- remote control
- marker
- information
- Prior art date
Classifications
- B25J9/1689—Programme-controlled manipulators; programme controls characterised by the tasks executed; teleoperation
- B25J13/025—Controls for manipulators; hand grip control means comprising haptic means
- G05B2219/40122—Robotics; manipulate virtual object, for trajectory planning of real object, haptic display
- G05B2219/40553—Robotics; haptic object recognition
- G05B2219/40619—Robotics; haptic, combination of tactile and proprioceptive sensing
Definitions
- the present disclosure relates to a remote control system for operating an edge device by a person remote from the edge device.
- Patent Document 1 discloses a motion detection device that detects the motion of an operator's finger for remotely controlling an articulated robot.
- There is also a remote control system in which an operator operates an edge device by wire.
- a conventional remote control system contributes to improving the operator's operability and realism by transmitting haptic information in the activity environment of the edge device to the operator.
- the present disclosure has been made in view of the above, and an object thereof is to obtain a remote control system that avoids enlargement and complication of the operation device.
- The remote control system according to the present disclosure includes an edge device that contacts a target, a motion operation transmission device that receives an operator operation for operating the edge device and outputs motion transmission information corresponding to the operator operation to the edge device, and a visual presentation device that displays images of the target and the edge device.
- the edge device includes a haptic sensor that detects a haptic sense of a target, and a force that converts the haptic sense detected by the haptic sensor into a signal specifying the intensity of light corresponding to the magnitude of the detected haptic sense. It has a haptic light transducer and a light intensity controlled light emitter that emits light of an intensity corresponding to the signal obtained by the haptic light transducer.
- The remote control system according to the present disclosure has the effect that an increase in the size and complexity of the operation device can be avoided.
- FIG. 1 shows a configuration of a remote control system according to Embodiment 1;
- FIG. 2 is a diagram showing an operator, a visual presentation device, and a motion operation transmission device according to Embodiment 1;
- FIG. 3 is a perspective view showing an edge device, an imaging device, and a target according to Embodiment 1;
- FIG. 4 is a diagram showing imaging information provided to the visual presentation device in a state where the end effector of the edge device of the remote control system according to Embodiment 1 is not in contact with the target;
- FIG. 5 is a diagram showing imaging information provided to the visual presentation device in a state where the end effector of the edge device of the remote control system according to Embodiment 1 is in contact with the target;
- FIG. 6 shows a configuration of a remote control system according to Embodiment 3;
- FIG. 7 is a diagram showing functions of a vision-based sensory transmission device included in the remote control system according to Embodiment 3;
- FIG. 8 is a diagram showing imaging information output from an imaging device in a state where an end effector of an edge device of the remote control system according to Embodiment 3 is not in contact with a target;
- FIG. 9 is a diagram showing imaging information output from an imaging device in a state where an end effector of an edge device of the remote control system according to Embodiment 3 is in contact with a target;
- FIG. 10 is a diagram showing a vision-based sensory transmission image that is an output of the vision-based sensory transmission device in a state where the end effector of the edge device of the remote control system according to Embodiment 3 is in contact with the target;
- FIG. 11 is a diagram showing functions of a vision-based sensory transmission device included in a remote control system according to Embodiment 5;
- FIG. 12 is a diagram showing imaging information output from an imaging device in a state where an end effector of an edge device of the remote control system according to Embodiment 5 is not in contact with a target;
- FIG. 13 is a diagram showing a vision-based sensory transmission image output from the vision-based sensory transmission device in a state where the end effector of the edge device of the remote control system according to Embodiment 5 is not in contact with the target;
- FIG. 14 is a diagram showing functions of a vision-based sensory transmission device included in a remote control system according to Embodiment 6;
- FIG. 15 is a perspective view showing an edge device and an imaging device of the remote control system according to Embodiment 6, and a target that is an activity environment of the edge device;
- FIG. 16 is a diagram showing imaging information output from an imaging device in a state where the end effector of the edge device of the remote control system according to Embodiment 6 is not in contact with the target and the end effector is not shielded by the target;
- FIG. 17 is a diagram showing a vision-based sensory transmission image that is an output of the vision-based sensory transmission device in a state where the end effector of the edge device of the remote control system according to Embodiment 6 is not in contact with the target and the end effector is not shielded by the target;
- FIG. 18 is a diagram showing imaging information output from an imaging device in a state where the end effector of the edge device of the remote control system according to Embodiment 6 is not in contact with the target and the end effector is shielded by the target;
- FIG. 19 is a diagram showing a vision-based sensory transmission image that is an output of the vision-based sensory transmission device in a state where the end effector of the edge device of the remote control system according to Embodiment 6 is not in contact with the target and the end effector is shielded by the target;
- FIG. 20 is a diagram showing imaging information output from an imaging device in a state where the end effector of the edge device of the remote control system according to Embodiment 6 is in contact with the target and the end effector is shielded by the target;
- FIG. 21 is a diagram showing a vision-based sensory transmission image that is an output of the vision-based sensory transmission device in a state where the end effector of the edge device of the remote control system according to Embodiment 6 is in contact with the target and the end effector is shielded by the target;
- FIG. 22 is a diagram showing functions of a vision-based sensory transmission device included in a remote control system according to Embodiment 7;
- FIG. 23 is a diagram showing functions of a vision-based sensory transmission device included in a remote control system according to Embodiment 8;
- FIG. 24 shows a configuration of a remote control system according to Embodiment 9;
- FIG. 25 is a diagram showing imaging information output from an imaging device in a state where an end effector of an edge device of the remote control system according to Embodiment 9 is not in contact with a target;
- FIG. 26 is a diagram showing imaging information output from an imaging device in a state where an end effector of an edge device of the remote control system according to Embodiment 9 is not in contact with a target;
- FIG. 27 is a diagram showing imaging information output from an imaging device in a state where an end effector of an edge device of the remote control system according to Embodiment 9 is not in contact with a target;
- FIG. 28 is a diagram showing a vision-based sensory transmission image that is an output of the vision-based sensory transmission device in a state where the end effector of the edge device of the remote control system according to Embodiment 9 is not in contact with the target;
- FIG. 29 is a diagram showing a vision-based sensory transmission image that is an output of the vision-based sensory transmission device in a state where the end effector of the edge device of the remote control system according to Embodiment 9 is not in contact with the target;
- FIG. 30 is a diagram showing a vision-based sensory transmission image that is an output of the vision-based sensory transmission device in a state where the end effector of the edge device of the remote control system according to Embodiment 9 is not in contact with the target;
- FIG. 31 is a diagram showing a vision-based sensory transmission image that is an output of the vision-based sensory transmission device in a state where the end effector of the edge device of the remote control system according to Embodiment 9 is in contact with the target;
- FIG. 32 is a diagram showing a vision-based sensory transmission image that is an output of the vision-based sensory transmission device in a state where the end effector of the edge device of the remote control system according to Embodiment 9 is in contact with the target;
- FIG. 33 is a diagram showing a vision-based sensory transmission image that is an output of the vision-based sensory transmission device in a state where the end effector of the edge device of the remote control system according to Embodiment 9 is in contact with the target;
- FIG. 34 shows a configuration of a remote control system according to Embodiment 10;
- FIG. 35 is a diagram showing imaging information that is an output of an imaging device in a state where the tool of the edge device of the remote control system according to Embodiment 10 is not in contact with the target;
- FIG. 36 is a diagram showing imaging information provided to a visual presentation device in a state where a tool possessed by the edge device of the remote control system according to Embodiment 10 is in contact with a target;
- FIG. 37 shows a configuration of a remote control system according to Embodiment 11;
- FIG. 38 is a diagram showing imaging information that is an output of an imaging device in a state where the tool of the edge device of the remote control system according to Embodiment 11 is not in contact with the target;
- FIG. 39 is a diagram showing imaging information output from an imaging device in a state where a tool of an edge device of the remote control system according to Embodiment 11 is in contact with a target;
- FIG. 40 is a diagram showing a vision-based sensory transmission image that is an output of the vision-based sensory transmission device in a state where the tool of the edge device of the remote control system according to Embodiment 11 is not in contact with the target;
- FIG. 41 is a diagram showing a vision-based sensory transmission image that is an output of the vision-based sensory transmission device in a state where the tool of the edge device of the remote control system according to Embodiment 11 is in contact with the target;
- FIG. 42 shows a configuration of a remote control system according to Embodiment 12;
- FIG. 43 is a diagram showing imaging information output from an imaging device in a state where a tool of an edge device of the remote control system according to Embodiment 12 is not in contact with a target;
- FIG. 44 is a diagram showing imaging information output from an imaging device in a state where a tool of an edge device of the remote control system according to Embodiment 12 is not in contact with a target;
- FIG. 45 is a diagram showing imaging information output from an imaging device in a state where a tool of an edge device of the remote control system according to Embodiment 12 is not in contact with a target;
- FIG. 46 is a diagram showing a vision-based sensory transmission image that is an output of the vision-based sensory transmission device in a state where the tool of the edge device of the remote control system according to Embodiment 12 is not in contact with the target;
- FIG. 47 is a diagram showing a vision-based sensory transmission image that is an output of the vision-based sensory transmission device in a state where the tool of the edge device of the remote control system according to Embodiment 12 is not in contact with the target;
- FIG. 48 is a diagram showing a vision-based sensory transmission image that is an output of the vision-based sensory transmission device in a state where the tool of the edge device of the remote control system according to Embodiment 12 is not in contact with the target;
- FIG. 49 is a diagram showing a vision-based sensory transmission image that is an output of the vision-based sensory transmission device in a state where the tool of the edge device of the remote control system according to Embodiment 12 is in contact with the target;
- FIG. 50 is a diagram showing a vision-based sensory transmission image that is an output of the vision-based sensory transmission device in a state where the tool of the edge device of the remote control system according to Embodiment 12 is in contact with the target;
- FIG. 51 is a diagram showing a vision-based sensory transmission image that is an output of the vision-based sensory transmission device in a state where the tool of the edge device of the remote control system according to Embodiment 12 is in contact with the target;
- FIG. 52 shows a configuration of a remote control system according to Embodiment 13;
- FIG. 53 is a diagram showing functions of a vision-based sensory transmission device included in the remote control system according to Embodiment 13;
- FIG. 54 is a diagram showing imaging information that is an output of an imaging device in a state where a tool possessed by an edge device of the remote control system according to Embodiment 13 is in contact with the target;
- FIG. 55 is a diagram showing a vision-based sensory transmission image that is the output of the vision-based sensory transmission device in a state where the tool of the edge device of the remote control system according to Embodiment 13 is in contact with the target;
- FIG. 56 is a diagram showing imaging information that is an output of an imaging device in a state where an end effector of an edge device of a remote control system according to Embodiment 14 is in contact with a target;
- FIG. 57 is a diagram showing a vision-based sensory transmission image that is an output of the vision-based sensory transmission device in a state where the end effector of the edge device of the remote control system according to Embodiment 14 is in contact with the target;
- FIG. 58 is a diagram showing imaging information that is an output of an imaging device in a state where a tool possessed by an edge device of a remote control system according to Embodiment 15 is in contact with a target;
- FIG. 59 is a diagram showing a vision-based sensory transmission image that is an output of the vision-based sensory transmission device in a state where the tool of the edge device of the remote control system according to Embodiment 15 is in contact with the target;
- FIG. 60 is a diagram showing a processor in a case where some of the plurality of components of the remote control system according to Embodiment 1 are realized by the processor;
- FIG. 61 is a diagram showing a processing circuit in a case where some of the plurality of components of the remote control system according to Embodiment 1 are realized by the processing circuit;
- FIG. 1 is a diagram showing the configuration of a remote control system 1 according to Embodiment 1.
- The remote control system 1 has a motion operation transmission device 1200 that receives, as an input, an operator operation 1000a, which is a motion operation of an operator 1000. The operator 1000 is also shown in FIG. 1.
- The motion operation transmission device 1200 outputs motion transmission information 1200a corresponding to the operator operation 1000a.
- The remote control system 1 further has an edge device 2000 that contacts a target 4000.
- The operator operation 1000a is an instruction for operating the edge device 2000.
- The target 4000 is also shown in FIG. 1.
- The motion operation transmission device 1200 receives the operator operation 1000a for operating the edge device 2000 and outputs the motion transmission information 1200a corresponding to the operator operation 1000a to the edge device 2000.
- The edge device 2000 is operated by the motion transmission information 1200a output from the motion operation transmission device 1200.
- The target 4000 exerts an interaction 4000a with the edge device 2000 when the edge device 2000 operates.
- The target 4000 is the activity environment of the edge device 2000.
- The remote control system 1 further has an imaging device 3000 that images the target 4000 and the edge device 2000.
- The imaging device 3000 outputs imaging information 3000a indicating the imaging result.
- The remote control system 1 further includes a visual presentation device 1100 that receives the imaging information 3000a output from the imaging device 3000 and provides the operator 1000 with image information 1100a corresponding to the imaging information 3000a. In other words, the visual presentation device 1100 displays images of the target 4000 and the edge device 2000.
- The edge device 2000 has an end effector 2100 intended for interaction with the target 4000, and a haptic sensing device 2300 attached to the end effector 2100 for detecting a haptic sense with respect to the target 4000.
- The haptic sensing device 2300 outputs haptic information 2300a indicating the detected haptic sense.
- The edge device 2000 further includes a haptic-light converter 2700 that converts the haptic information 2300a output from the haptic sensing device 2300 into a light intensity signal 2700a.
- The haptic-light converter 2700 converts the haptic sense detected by the haptic sensing device 2300 into a light intensity signal 2700a that specifies a light intensity corresponding to the magnitude of the sensed haptic sense.
- The edge device 2000 further comprises a light-intensity-controlled light emitter 2600 that is attached to the end effector 2100 and emits light at an intensity corresponding to the light intensity signal 2700a obtained by the haptic-light converter 2700.
- The intensity of the light emitted by the light-intensity-controlled light emitter 2600 is controlled by the light intensity signal 2700a.
- FIG. 2 is a diagram showing the operator 1000, the visual presentation device 1100, and the movement operation transmission device 1200 according to the first embodiment.
- FIG. 3 is a perspective view showing the edge device 2000, imaging device 3000 and target 4000 according to the first embodiment.
- FIG. 4 is a diagram showing the imaging information 3000a provided to the visual presentation device 1100 in a state where the end effector 2100 of the edge device 2000 of the remote control system 1 according to Embodiment 1 is not in contact with the target 4000. The light-intensity-controlled light emitter 2600 is not illuminated.
- FIG. 5 is a diagram showing the imaging information 3000a provided to the visual presentation device 1100 in a state where the end effector 2100 of the edge device 2000 of the remote control system 1 according to Embodiment 1 is in contact with the target 4000. The light-intensity-controlled light emitter 2600 is illuminated in accordance with the state of contact detected by the haptic sensing device 2300.
- the remote control system 1 may further have a temperature sensor arranged in parallel with the haptic sensing device 2300 .
- the remote control system 1 may have a temperature sensor instead of the haptic sensing device 2300 .
- The light-intensity-controlled light emitter 2600 attached to the end effector 2100 is lit by being supplied with a current proportional to the magnitude of the haptic sense detected by the haptic sensing device 2300.
- the imaging device 3000 images the end effector 2100 and the target 4000, and provides the visual presentation device 1100 with imaging information 3000a indicating the result of imaging.
- the visual presentation device 1100 displays video information 1100a corresponding to the imaging information 3000a.
- The operator 1000 recognizes the contact between the end effector 2100 and the target 4000 based on the image information 1100a displayed by the visual presentation device 1100, and operates the motion operation transmission device 1200.
- Since the operation interface of the operator 1000 does not require a drive unit, an increase in the size and complexity of the operation device can be avoided.
- The operator 1000, to whom the haptic information 2300a is presented in the form of an image, recognizes in the brain the haptic sensation generated at the edge device 2000 so as not to contradict the presented video, so the operator 1000 can perform operations without a sense of discomfort. Since the operator 1000 can obtain the haptic information 2300a by gazing at the contact point area on the visual presentation device 1100, the operator can concentrate on the work.
- The remote control system 1 enables the operator 1000 to detect the interaction 4000a between the end effector 2100 and the target 4000 from visual information. Since the remote control system 1 can visually transmit the haptic information 2300a to the operator 1000, the operation interface of the operator 1000 does not require a drive unit. Therefore, the remote control system 1 can avoid an increase in the size and complexity of the operation device. By concentrating on the visual presentation device 1100, the operator 1000 can obtain the haptic information 2300a without a sense of discomfort, and can therefore obtain relatively high operability. When a temperature sensor is arranged, the operator 1000 can likewise obtain temperature information without a sense of discomfort.
- Embodiment 2: The configuration of the remote control system according to Embodiment 2 is the same as the configuration of the remote control system 1 according to Embodiment 1. However, the function of the light-intensity-controlled light emitter 2600 differs between Embodiment 2 and Embodiment 1. In Embodiment 2, the light-intensity-controlled light emitter 2600 does not express the contact state by changing the light intensity; instead, it expresses the contact state of the haptic sensing device 2300 with the target 4000 by changing the hue and the lighting frequency.
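- As a minimal sketch, one possible hue-and-frequency encoding is shown below, assuming temperature selects the hue and contact force selects the blink rate; the value ranges are illustrative assumptions, not specified in the patent.

```python
def contact_state_to_hue_and_rate(temp_c: float, force_n: float):
    """Encode a contact state as (hue in degrees, blink frequency in Hz)."""
    # Hue: 240 deg (blue, cold) down to 0 deg (red, hot) over 0..60 degC.
    t = max(0.0, min(temp_c, 60.0)) / 60.0
    hue_deg = 240.0 * (1.0 - t)
    # Blink rate: 0 Hz for no contact, up to 10 Hz at 10 N of contact force.
    blink_hz = max(0.0, min(force_n, 10.0))
    return hue_deg, blink_hz

# Example: a 36.5 degC contact at 3 N gives a greenish hue blinking at 3 Hz.
print(contact_state_to_hue_and_rate(36.5, 3.0))  # (94.0, 3.0)
```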
- Since the remote control system according to Embodiment 2 has the light-intensity-controlled light emitter 2600 with a function different from that of Embodiment 1, the contact state of the haptic sensing device 2300 with the target 4000 can be communicated to the operator 1000 as temperature, tactile sensation, or both.
- FIG. 6 is a diagram showing the configuration of a remote control system 3 according to Embodiment 3.
- The remote control system 3 has a motion operation transmission device 1200 that receives, as an input, an operator operation 1000a, which is a motion operation of the operator 1000.
- The operator 1000 is also shown in FIG. 6.
- The motion operation transmission device 1200 outputs motion transmission information 1200a corresponding to the operator operation 1000a.
- The remote control system 3 further has an edge device 2000 that contacts a target 4000.
- The operator operation 1000a is an instruction for operating the edge device 2000.
- The target 4000 is also shown in FIG. 6.
- The motion operation transmission device 1200 receives the operator operation 1000a for operating the edge device 2000 and outputs the motion transmission information 1200a corresponding to the operator operation 1000a to the edge device 2000.
- The edge device 2000 is operated by the motion transmission information 1200a output from the motion operation transmission device 1200.
- The target 4000 exerts an interaction 4000a with the edge device 2000 when the edge device 2000 operates.
- The remote control system 3 further has an imaging device 3000 that images the edge device 2000 and the target 4000.
- The imaging device 3000 outputs imaging information 3000a indicating the imaging result.
- The remote control system 3 further has a vision-based sensory transmission device 3200 that receives, as inputs, the imaging information 3000a output from the imaging device 3000, marker information 3300a, and the haptic information 2300a output from the edge device 2000, and generates a vision-based sensory transmission image 3200a.
- The remote control system 3 further has a visual presentation device 1100 that receives the vision-based sensory transmission image 3200a as an input and provides the operator 1000 with image information 1100a corresponding to the vision-based sensory transmission image 3200a.
- The marker information 3300a is, for example, a color, a shape, or a feature point of the passive marker 2500 that can be extracted by image processing from the imaging information 3000a, or information combining two or more of these.
- In this example, the marker information 3300a is a combination of red color information and circular shape information.
- The edge device 2000 has an end effector 2100 intended for interaction with the target 4000, and a haptic sensing device 2300 attached to the end effector 2100 for detecting a haptic sense with respect to the target 4000.
- The haptic sensing device 2300 outputs haptic information 2300a indicating the detected haptic sense.
- The edge device 2000 further comprises a passive marker 2500 that is spatially identifiable by the imaging device 3000.
- The passive marker 2500 is an AR (Augmented Reality) marker that can be extracted from the imaging information 3000a captured by the imaging device 3000, or a member having a feature point.
- The passive marker 2500 is the source of the marker information 3300a.
- FIG. 7 is a diagram showing the functions of the vision-based sensory transmission device 3200 included in the remote control system 3 according to the third embodiment.
- the vision-based sensory transmission device 3200 has a marker point identification function 3251 that uses the marker information 3300a to identify the position of the passive marker 2500 based on the imaging information 3000a to generate marker point identification information 3251a.
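- A minimal sketch of one way such a marker point identification could be realized is shown below, assuming the red, circular passive marker of the example above and simple OpenCV color segmentation; the thresholds are illustrative assumptions, and shape verification is omitted for brevity.

```python
import cv2
import numpy as np

def identify_marker_point(frame_bgr: np.ndarray):
    """Return the (x, y) pixel centroid of a red passive marker, or None."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    # Red wraps around the hue axis in HSV, so two hue bands are combined.
    mask = cv2.inRange(hsv, (0, 120, 70), (10, 255, 255)) | \
           cv2.inRange(hsv, (170, 120, 70), (180, 255, 255))
    m = cv2.moments(mask)
    if m["m00"] == 0:          # marker not visible in this frame
        return None
    return (int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"]))
```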
- the vision-based sensory transmission device 3200 further has a visual haptic image generation function 3250 that generates visual haptic image information 3250a based on the haptic information 2300a.
- the visual haptic image generation function 3250 is a function for generating visual haptic image information 3250a representing an image corresponding to the haptic information 2300a representing the haptic sense detected by the haptic sense device 2300.
- the visual-based sensory transfer device 3200 further has a superimposing function 3252 that generates a visual-based sensory transfer image 3200a based on the visual haptic image information 3250a and the marker point identification information 3251a.
- The superimposing function 3252 is a function of superimposing the marker point identification information 3251a obtained by the marker point identification function 3251, the visual haptic image information 3250a obtained by the visual haptic image generation function 3250, and the imaging information 3000a obtained by the imaging device 3000.
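- A minimal sketch of such a superimposition is shown below, assuming the marker point from the previous sketch and a filled disc whose radius grows with the detected force; the visualization choices are assumptions, not the patent's specification.

```python
import cv2
import numpy as np

def superimpose(frame_bgr: np.ndarray, marker_xy, force_n: float,
                max_force_n: float = 10.0) -> np.ndarray:
    """Draw a haptic indicator on a copy of the frame at the marker point."""
    out = frame_bgr.copy()
    if marker_xy is None:
        return out                          # nothing to overlay this frame
    radius = 5 + int(25 * min(force_n, max_force_n) / max_force_n)
    cv2.circle(out, marker_xy, radius, (0, 0, 255), thickness=-1)  # filled disc
    return out
```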
- FIG. 8 is a diagram showing imaging information 3000a output from the imaging device 3000 in a state where the end effector 2100 of the edge device 2000 of the remote control system 3 according to Embodiment 3 is not in contact with the target 4000. The passive marker 2500 is detected as marker point identification information 3251a.
- FIG. 9 is a diagram showing imaging information 3000a output from the imaging device 3000 when the end effector 2100 of the edge device 2000 of the remote control system 3 according to Embodiment 3 is in contact with the target 4000.
- The passive marker 2500 is detected as marker point identification information 3251a, and the haptic sensing device 2300 is detecting a haptic sense.
- FIG. 10 is a diagram showing a vision-based sensory transmission image 3200a that is the output of the vision-based sensory transmission device 3200 in a state where the end effector 2100 of the edge device 2000 of the remote control system 3 according to Embodiment 3 is in contact with the target 4000.
- A superimposed image 3201 representing the haptic information 2300a is displayed at the position where the passive marker 2500 is detected as the marker point identification information 3251a in FIG. 9.
- the remote control system 3 may further have a temperature sensor arranged in parallel with the haptic sensing device 2300 .
- the remote control system 3 may have a temperature sensor instead of the haptic sensing device 2300 .
- the remote control system 3 may recognize the shape of the fingertip of the end effector 2100 and display the superimposed image 3201 on the fingertip.
- the remote control system 3 enables the operator 1000 to detect the interaction 4000a between the end effector 2100 and the target 4000 from visual information.
- the remote control system 3 can avoid enlargement and complication of the control device.
- the operator 1000 can obtain the haptic information 2300a that does not cause a sense of discomfort, and therefore can obtain relatively high operability.
- When a temperature sensor is arranged, the operator 1000 can obtain temperature information without a sense of discomfort. Since the remote control system 3 can superimpose a visual haptic image intended by the designer on the contact point, it can provide an image suited to the operator 1000 or to the scene.
- Embodiment 4: The configuration of the remote control system according to Embodiment 4 is the same as the configuration of the remote control system 3 according to Embodiment 3. However, the superimposed image 3201 differs between Embodiment 4 and Embodiment 3. In Embodiment 3, the superimposed image 3201 indicates the contact intensity by a lighting image. In Embodiment 4, the superimposed image 3201 expresses the contact state of the haptic sensing device 2300 with the target 4000 by changes in hue and lighting frequency rather than by intensity.
- the remote control system according to Embodiment 4 can transmit one or both of temperature and tactile sensation to the operator 1000 as the state of contact with the target 4000 .
- Embodiment 5: The configuration of the remote control system according to Embodiment 5 is the same as the configuration of the remote control system 3 according to Embodiment 3. However, the functions of the vision-based sensory transmission device 3200 of Embodiment 5 differ from those of Embodiment 3. In Embodiment 5, the differences from Embodiment 3 are mainly described.
- FIG. 11 is a diagram showing the functions of the vision-based sensory transmission device 3200 that the remote control system according to Embodiment 5 has.
- the vision-based sensory transmission device 3200 has a marker point identification function 3251 that uses the marker information 3300a to identify the position of the passive marker 2500 based on the imaging information 3000a to generate marker point identification information 3251a.
- the vision-based sensory transmission device 3200 further has a visual haptic image generation function 3250 that generates visual haptic image information 3250a based on the haptic information 2300a.
- the vision-based sensory transmission device 3200 further has a marker filter function 3253 that generates marker filter information 3253a for filtering the detected passive markers 2500.
- the visual-based sensory transfer device 3200 further has a superimposing function 3252 that generates a visual-based sensory transfer image 3200a based on the visual haptic image information 3250a, the marker point identification information 3251a, and the marker filter information 3253a.
- The remote control system according to Embodiment 5 performs filtering to reduce the visual load; depending on conditions, it does not display the passive marker 2500.
- FIG. 12 is a diagram showing imaging information 3000a output from the imaging device 3000 in a state where the end effector 2100 of the edge device 2000 of the remote control system according to Embodiment 5 is not in contact with the target 4000. The passive marker 2500 is detected as marker point identification information 3251a.
- FIG. 13 is a diagram showing a vision-based sensory transmission image 3200a that is the output of the vision-based sensory transmission device 3200 in a state where the end effector 2100 of the edge device 2000 of the remote control system according to Embodiment 5 is not in contact with the target 4000. In the vision-based sensory transmission image 3200a, a superimposed image 3201 representing the marker filter information 3253a, which is filtering information, is displayed at the position where the passive marker 2500 is detected as the marker point identification information 3251a.
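- A minimal sketch of one possible marker filter is shown below, assuming the marker region is simply inpainted away while no contact is detected; the patch radius and the inpainting method are illustrative assumptions.

```python
import cv2
import numpy as np

def filter_marker(frame_bgr: np.ndarray, marker_xy, radius: int = 15) -> np.ndarray:
    """Hide the passive marker by inpainting a small patch around it."""
    if marker_xy is None:
        return frame_bgr
    mask = np.zeros(frame_bgr.shape[:2], dtype=np.uint8)
    cv2.circle(mask, marker_xy, radius, 255, thickness=-1)   # region to hide
    return cv2.inpaint(frame_bgr, mask, 3, cv2.INPAINT_TELEA)
```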
- the remote control system according to Embodiment 5 enables the operator 1000 to detect the interaction 4000a between the end effector 2100 and the target 4000 from visual information.
- the remote control system according to Embodiment 5 can avoid the increase in size and complexity of the control device. By concentrating on the visual presentation device 1100, the operator 1000 can obtain the haptic information 2300a that does not cause a sense of discomfort, and therefore can obtain relatively high operability.
- In a scene in which no contact involving a haptic sensation occurs, the image of the passive marker 2500 can be replaced with an image preset by the designer.
- the remote control system according to Embodiment 5 can reduce the recognition load on the operator 1000 .
- Embodiment 6: The configuration of the remote control system according to Embodiment 6 is the same as the configuration of the remote control system 3 according to Embodiment 3. However, the functions of the vision-based sensory transmission device 3200 of Embodiment 6 differ from those of Embodiment 3. In Embodiment 6, the differences from Embodiment 3 are mainly described.
- FIG. 14 is a diagram showing the functions of the vision-based sensory transmission device 3200 included in the remote control system according to the sixth embodiment.
- the vision-based sensory transmission device 3200 has a marker point identification function 3251 that uses the marker information 3300a to identify the position of the passive marker 2500 based on the imaging information 3000a to generate marker point identification information 3251a.
- the vision-based sensory transmission device 3200 further has a visual haptic image generation function 3250 that generates visual haptic image information 3250a based on the haptic information 2300a.
- the vision-based sensory transmission device 3200 further has a marker filter function 3253 that generates marker filter information 3253a for filtering the detected passive markers 2500.
- the vision-based sensory transmission device 3200 further has a marker point estimation function 3254 that generates marker point estimation information 3254a based on the imaging information 3000a and the marker point identification information 3251a.
- The marker point estimation function 3254 is a function of estimating a marker point missing from the imaging information 3000a, based on the imaging information 3000a and the marker point identification information 3251a, when another passive marker 2500 is captured in the imaging information 3000a.
- The vision-based sensory transmission device 3200 further has a superimposing function 3252 that generates a vision-based sensory transmission image 3200a based on the visual haptic image information 3250a, the marker point identification information 3251a, the marker filter information 3253a, and the marker point estimation information 3254a.
- FIG. 15 is a perspective view showing the edge device 2000 and imaging device 3000 of the remote control system according to Embodiment 6, and the target 4000 which is the activity environment of the edge device 2000.
- FIG. 16 is a diagram showing imaging information 3000a output from the imaging device 3000 in a state where the end effector 2100 of the edge device 2000 of the remote control system according to Embodiment 6 is not in contact with the target 4000 and the end effector 2100 is not shielded by the target 4000. The passive marker 2500 is detected as marker point identification information 3251a.
- FIG. 17 is a diagram showing a vision-based sensory transmission image 3200a, which is the output of the vision-based sensory transmission device 3200, in a state where the end effector 2100 of the edge device 2000 of the remote control system according to Embodiment 6 is not in contact with the target 4000 and the end effector 2100 is not shielded by the target 4000. In the vision-based sensory transmission image 3200a, a superimposed image 3201 representing the marker filter information 3253a, which is filtering information, is displayed at the position where the passive marker 2500 is detected as the marker point identification information 3251a in FIG. 16.
- FIG. 18 is a diagram showing imaging information 3000a output from the imaging device 3000 in a state where the end effector 2100 of the edge device 2000 of the remote control system according to Embodiment 6 is not in contact with the target 4000 and the end effector 2100 is shielded by the target 4000. The passive marker 2500 is detected as marker point identification information 3251a and marker point estimation information 3254a.
- FIG. 19 is a diagram showing a vision-based sensory transmission image 3200a, which is the output of the vision-based sensory transmission device 3200, in a state where the end effector 2100 of the edge device 2000 of the remote control system according to Embodiment 6 is not in contact with the target 4000 and the end effector 2100 is shielded by the target 4000. In the vision-based sensory transmission image 3200a, a superimposed image 3201 representing the marker filter information 3253a, which is filtering information, is displayed at the positions where the passive marker 2500 is detected as the marker point identification information 3251a and the marker point estimation information 3254a in FIG. 18.
- FIG. 20 is a diagram showing imaging information 3000a output from the imaging device 3000 in a state where the end effector 2100 of the edge device 2000 of the remote control system according to Embodiment 6 is in contact with the target 4000 and the end effector 2100 is shielded by the target 4000. The passive marker 2500 is detected as marker point identification information 3251a and marker point estimation information 3254a.
- FIG. 21 is a diagram showing a vision-based sensory transmission image 3200a, which is the output of the vision-based sensory transmission device 3200, in a state where the end effector 2100 of the edge device 2000 of the remote control system according to Embodiment 6 is in contact with the target 4000 and the end effector 2100 is shielded by the target 4000. In the vision-based sensory transmission image 3200a, a superimposed image 3201 representing the haptic information 2300a is displayed at the positions where the passive marker 2500 is detected as the marker point identification information 3251a and the marker point estimation information 3254a in FIG. 20.
- Using the marker point estimation function 3254, the vision-based sensory transmission device 3200 stores the position of a passive marker 2500 at the time it becomes shielded, and derives the marker point estimation information 3254a by adding, to the stored position, the vector by which the marker point identification information 3251a of another, unshielded passive marker 2500 has moved during the time elapsed since that moment.
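- A minimal sketch of this estimation rule is shown below, assuming 2D pixel coordinates and a single visible reference marker; the names and data layout are assumptions.

```python
import numpy as np

def estimate_shielded_point(last_seen_xy, visible_at_occlusion_xy, visible_now_xy):
    """Translate the last seen position of a shielded marker by the motion
    of another marker that has stayed visible since the occlusion began."""
    motion = np.asarray(visible_now_xy) - np.asarray(visible_at_occlusion_xy)
    return tuple(np.asarray(last_seen_xy) + motion)

# Example: the visible marker moved (+12, -4), so the hidden marker is
# assumed to have moved by the same vector.
print(estimate_shielded_point((100, 200), (300, 220), (312, 216)))  # (112, 196)
```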
- the remote control system according to Embodiment 6 enables the operator 1000 to detect the interaction 4000a between the end effector 2100 and the target 4000 from visual information.
- the remote control system according to Embodiment 6 can avoid the increase in size and complexity of the control device. By concentrating on the visual presentation device 1100, the operator 1000 can obtain the haptic information 2300a that does not cause a sense of discomfort, and therefore can obtain relatively high operability.
- With the remote control system according to Embodiment 6, in a scene in which no contact involving a haptic sensation occurs between the edge device 2000 and the target 4000, the image of the passive marker 2500 can be replaced with an image preset by the designer. In other words, the remote control system according to Embodiment 6 can reduce the recognition load on the operator 1000. Even when a passive marker 2500 is hidden behind a shield, the point of the hidden passive marker 2500 can be estimated, so the remote control system according to Embodiment 6 can improve operability for the operator 1000.
- Embodiment 7: The configuration of the remote control system according to Embodiment 7 is the same as the configuration of the remote control system according to Embodiment 6. However, the functions of the vision-based sensory transmission device 3200 of Embodiment 7 differ from those of Embodiment 6. In Embodiment 7, the differences from Embodiment 6 are mainly described.
- FIG. 22 is a diagram showing the functions of the vision-based sensory transmission device 3200 included in the remote control system according to Embodiment 7. The vision-based sensory transmission device 3200 of Embodiment 7 has all the functions of the vision-based sensory transmission device 3200 of Embodiment 6 except the marker filter function 3253.
- Since the marker filter function 3253 is omitted, the image of the passive marker 2500 is not replaced with an image preset by the designer. In Embodiment 7, although the effect of reducing the recognition load on the operator 1000 is not obtained, the effect of reducing the computational load and simplifying the configuration of the remote control system is obtained.
- Embodiment 8: The configuration of the remote control system according to Embodiment 8 is the same as the configuration of the remote control system 3 according to Embodiment 3. However, the functions of the vision-based sensory transmission device 3200 of Embodiment 8 differ from those of Embodiment 3. In Embodiment 8, the differences from Embodiment 3 are mainly described.
- FIG. 23 is a diagram showing the functions of the vision-based sensory transmission device 3200 included in the remote control system according to the eighth embodiment.
- the vision-based sensory transmission device 3200 has a marker point identification function 3251 that uses the marker information 3300a to identify the position of the passive marker 2500 based on the imaging information 3000a to generate marker point identification information 3251a.
- The vision-based sensory transmission device 3200 further has an image analysis function 3255 that, based on the imaging information 3000a, generates marker filter image selection information 3255b, which selects an image effective as a marker filter, and haptic image pattern selection information 3255a, which selects an image pattern effective as a haptic image. Specifically, the image analysis function 3255 generates haptic image pattern selection information 3255a that selects a haptic image pattern that is easy for humans to recognize, and marker filter image selection information 3255b that selects an image that blends in naturally.
- the vision-based sensory transmission device 3200 further has a visual haptic image generation function 3250 that generates visual haptic image information 3250a based on the haptic information 2300a and haptic image pattern selection information 3255a.
- the vision-based sensory transmission device 3200 further has a marker filter function 3253 that generates marker filter information 3253a based on the marker filter image selection information 3255b to filter the detected passive markers 2500.
- the vision-based sensory transmission device 3200 further has a marker point estimation function 3254 that generates marker point estimation information 3254a based on the imaging information 3000a and the marker point identification information 3251a.
- The vision-based sensory transmission device 3200 further has a superimposing function 3252 that generates a vision-based sensory transmission image 3200a based on the visual haptic image information 3250a, the marker point identification information 3251a, the marker filter information 3253a, and the marker point estimation information 3254a.
- For example, the image selection criteria are set so that the filter image uses the same color as the end effector, and the visual haptic image uses a color that stands out in the imaging information.
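- A minimal sketch of one way to realize the stand-out-color criterion is shown below, assuming a hue-histogram heuristic over the imaging information; this is an illustrative assumption, not the patent's algorithm.

```python
import cv2
import numpy as np

def least_used_hue(frame_bgr: np.ndarray) -> int:
    """Return the hue (OpenCV scale, [0, 180)) least represented in the frame,
    as a candidate color for a visual haptic image that stands out."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    hist = cv2.calcHist([hsv], [0], None, [180], [0, 180]).ravel()
    return int(hist.argmin())
```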
- the remote control system according to Embodiment 8 enables the operator 1000 to detect the interaction 4000a between the end effector 2100 and the target 4000 from visual information.
- the remote control system according to Embodiment 8 can avoid an increase in size and complexity of the control device. By concentrating on the visual presentation device 1100, the operator 1000 can obtain haptic information that does not cause a sense of discomfort, and thus can obtain relatively high operability.
- With the remote control system according to Embodiment 8, in a scene in which no haptic contact occurs between the edge device 2000 and the target 4000, which is the activity environment of the edge device 2000, the image of the passive marker 2500 can be replaced with an image preset by the designer. In other words, the remote control system according to Embodiment 8 can reduce the recognition load on the operator 1000.
- the remote control system according to the eighth embodiment can improve the operability of the operator 1000.
- The remote control system according to Embodiment 8 can present a visual haptic image pattern that is easy for humans to perceive when a haptic sense is detected, and can present a marker image that blends in naturally when no haptic sense is detected, improving operability for the operator 1000.
- FIG. 24 is a diagram showing the configuration of a remote control system 9 according to the ninth embodiment.
- The remote control system 9 includes the motion operation transmission device 1200, the edge device 2000, the imaging device 3000, the vision-based sensory transmission device 3200, and the visual presentation device 1100 included in the remote control system according to any one of Embodiments 3 to 8.
- However, the passive marker 2500 of the edge device 2000 is changed to an active marker 2400.
- The operator 1000 and the target 4000 are also shown in FIG. 24.
- The remote control system 9 further has an active marker control device 3300 that receives the imaging information 3000a and generates an active marker control signal 3300b, which makes the active marker 2400 detectable by the imaging device 3000, and marker information 3300a. The active marker control signal 3300b is output to the active marker 2400. The marker information 3300a is output to the vision-based sensory transmission device 3200.
- FIGS. 25, 26, and 27 are diagrams showing imaging information 3000a that is the output of the imaging device 3000 in a state where the end effector 2100 of the edge device 2000 of the remote control system 9 according to Embodiment 9 is not in contact with the target 4000.
- The active marker control device 3300 receives the imaging information 3000a, generates a color, a shape, or a color and shape that can be easily detected by the imaging device 3000, and outputs the active marker control signal 3300b to the active marker 2400.
- In FIG. 25, the active marker 2400 is illuminated in a "blue" color that is not included in the imaging information 3000a.
- In FIG. 26, the active marker 2400 is illuminated in a "yellow" color that is not included in the imaging information 3000a.
- In FIG. 27, the active marker 2400 is illuminated in a "green" color that is not included in the imaging information 3000a.
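- A minimal sketch of such an active marker control decision is shown below, assuming a fixed candidate set of emission colors and choosing the one least present in the current frame; the candidate hues and thresholds are assumptions, not the patent's specification.

```python
import cv2
import numpy as np

CANDIDATE_HUES = {"blue": 120, "green": 60, "yellow": 30}  # OpenCV hue scale [0, 180)

def pick_marker_color(frame_bgr: np.ndarray, tol: int = 10) -> str:
    """Return the candidate color occupying the fewest pixels in the frame."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    counts = {}
    for name, hue in CANDIDATE_HUES.items():
        mask = cv2.inRange(hsv, (hue - tol, 60, 60), (hue + tol, 255, 255))
        counts[name] = int(cv2.countNonZero(mask))
    return min(counts, key=counts.get)
```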
- FIGS. 28, 29, and 30 are diagrams showing vision-based sensory transmission images 3200a that are the output of the vision-based sensory transmission device 3200 in a state where the end effector 2100 of the edge device 2000 of the remote control system 9 according to Embodiment 9 is not in contact with the target 4000.
- The vision-based sensory transmission image 3200a in each of FIGS. 28, 29, and 30 superimposes a fixed superimposed image 3201 on the position of the active marker 2400, which changes according to the imaging information 3000a.
- In the vision-based sensory transmission image 3200a presented to the operator 1000, information about the changes of the active marker 2400 itself is therefore blocked.
- FIGS. 31, 32, and 33 are diagrams showing vision-based sensory transmission images 3200a that are the output of the vision-based sensory transmission device 3200 in a state where the end effector 2100 of the edge device 2000 of the remote control system 9 according to Embodiment 9 is in contact with the target 4000.
- By means of the vision-based sensory transmission image 3200a in each of FIGS. 31, 32, and 33, the operator 1000 is presented with a superimposed image 3201 that matches the state of contact between the haptic sensing device 2300 and the target 4000.
- The remote control system 9 obtains the effect of making it possible to identify the position of the active marker 2400 based on the imaging information 3000a.
- the remote control system 9 can improve detection stability of the active marker 2400 .
- The remote control system 9 superimposes a fixed image on the position of the active marker 2400 identified based on the imaging information 3000a, thereby blocking the transmission to the operator 1000 of image changes that are unnecessary for the operator 1000. As a result, the range of use of the remote control system 9 can be expanded without hindering the operability of the operator 1000.
- FIG. 34 is a diagram showing the configuration of the remote control system 10 according to the tenth embodiment.
- the remote control system 10 has all the components of the remote control system 1 according to the first embodiment.
- the edge device 2000 of the remote control system 10 also has a tool 2200 attached to the end effector 2100 .
- The haptic sensing device 2300 and the light-intensity-controlled light emitter 2600 are attached to the tip portion of the tool 2200 that contacts the target 4000.
- FIG. 35 is a diagram showing imaging information 3000a output from the imaging device 3000 in a state where the tool 2200 of the edge device 2000 of the remote control system 10 according to Embodiment 10 is not in contact with the target 4000. The haptic sensing device 2300 and the light-intensity-controlled light emitter 2600 are placed at the point of contact of the tool 2200 with the target 4000.
- the light intensity controlled light emitter 2600 is not illuminated.
- FIG. 36 is a diagram showing imaging information 3000a provided to the visual presentation device 1100 in a state where the tool 2200 of the edge device 2000 of the remote control system 10 according to Embodiment 10 is in contact with the target 4000. The light-intensity-controlled light emitter 2600 lights up in accordance with the state of contact between the haptic sensing device 2300 and the target 4000.
- the remote control system 10 enables the operator 1000 to detect the interaction 4000a of the tool 2200 with the target 4000 from visual information.
- the remote control system 10 can avoid enlargement and complication of the control device.
- the operator 1000 can obtain the haptic information 2300a that does not cause a sense of discomfort, and therefore can obtain relatively high operability.
- Note that the light intensity controlled light emitter 2600 may, instead of emitting light of varying intensity, express the state of contact between the haptic sensing device 2300 and the target 4000 through changes in hue and blinking frequency.
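- As an illustration of these two encodings, the following sketch maps a contact force to a normalised emission intensity, or alternatively to a hue and blinking frequency. The force range, hue sweep, and frequency band are assumed values for illustration only.

```python
def force_to_intensity(force_n: float, f_max: float = 10.0) -> float:
    """Map a contact force [N] to a normalised emission intensity in [0, 1]."""
    return max(0.0, min(force_n / f_max, 1.0))

def force_to_hue_and_blink(force_n: float, f_max: float = 10.0):
    """Alternative encoding: hue sweeps from green (120 deg) to red (0 deg)
    and blink frequency rises from 1 Hz to 10 Hz as the force grows."""
    level = force_to_intensity(force_n, f_max)
    return 120.0 * (1.0 - level), 1.0 + 9.0 * level

print(force_to_intensity(2.5))       # 0.25 while touching lightly
print(force_to_hue_and_blink(10.0))  # (0.0, 10.0) at the assumed maximum force
```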
- FIG. 37 is a diagram showing the configuration of the remote control system 11 according to the eleventh embodiment.
- The remote control system 11 has all the components of any one of the remote control systems according to Embodiments 3 to 8.
- The edge device 2000 of the remote control system 11 further has a tool 2200 attached to the end effector 2100.
- In Embodiment 11, the haptic sensing device 2300 and the passive marker 2500 are attached to the tip portion that contacts the target 4000.
- FIG. 38 is a diagram showing the imaging information 3000a output from the imaging device 3000 when the tool 2200 of the edge device 2000 of the remote control system 11 according to Embodiment 11 is not in contact with the target 4000. The passive marker 2500 is detected as marker point identification information 3251a.
- FIG. 39 is a diagram showing the imaging information 3000a output from the imaging device 3000 when the tool 2200 of the edge device 2000 of the remote control system 11 according to Embodiment 11 is in contact with the target 4000. The passive marker 2500 is detected as marker point identification information 3251a, and the haptic sensing device 2300 detects a haptic sensation.
- FIG. 40 is a diagram showing a vision-based sensory transmission image 3200a, the output of the vision-based sensory transmission device 3200, when the tool 2200 of the edge device 2000 of the remote control system 11 according to Embodiment 11 is not in contact with the target 4000. In FIG. 40, a superimposed image 3201 representing the marker filter information 3253a, which is filtering information, is displayed at the position where the passive marker 2500 is detected as marker point identification information 3251a.
- FIG. 41 is a diagram showing a vision-based sensory transmission image 3200a, the output of the vision-based sensory transmission device 3200, when the tool 2200 of the edge device 2000 of the remote control system 11 according to Embodiment 11 is in contact with the target 4000. In FIG. 41, a superimposed image 3201 representing the haptic information 2300a is displayed at the position where the passive marker 2500 is detected as marker point identification information 3251a.
- The remote control system 11 enables the operator 1000 to detect the interaction 4000a of the tool 2200 with the target 4000 from visual information.
- The remote control system 11 can therefore avoid enlarging and complicating the operation device.
- In addition, the operator 1000 can obtain the haptic information 2300a without a sense of incongruity, and can therefore obtain relatively high operability.
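- The switching between FIG. 40 and FIG. 41 reduces to choosing which superimposed image 3201 to draw at the identified marker point. A minimal sketch, assuming a simple, hypothetical force threshold for deciding contact:

```python
def select_overlay(haptic_force: float, contact_thresh: float = 0.05) -> str:
    """Pick which superimposed image 3201 to draw at the marker point.

    contact_thresh is a hypothetical force threshold [N] for deciding
    that the haptic sensing device 2300 is in contact with the target.
    """
    if haptic_force > contact_thresh:
        return "haptic_image"        # represents haptic information 2300a (FIG. 41)
    return "marker_filter_image"     # represents marker filter information 3253a (FIG. 40)

print(select_overlay(0.0))  # marker_filter_image: tool is free
print(select_overlay(1.2))  # haptic_image: tool is pressing on the target
```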
- FIG. 42 is a diagram showing the configuration of the remote control system 12 according to the twelfth embodiment.
- The remote control system 12 is a system in which the passive marker 2500 of the remote control system 11 according to Embodiment 11 is replaced with the active marker 2400.
- In Embodiment 12, the haptic sensing device 2300 and the active marker 2400 are attached to the tip portion that contacts the target 4000.
- The remote control system 12 further has an active marker control device 3300 that receives the imaging information 3000a and generates an active marker control signal 3300b, which makes the marker detectable by the imaging device 3000, and marker information 3300a. The active marker control signal 3300b is output to the active marker 2400, and the marker information 3300a is output to the vision-based sensory transmission device 3200.
- Specifically, the active marker control device 3300 receives the imaging information 3000a, determines a color, a shape, or both that can easily be detected in the image of the imaging device 3000, and outputs the corresponding active marker control signal 3300b to the active marker 2400.
- In one example, the active marker 2400 is illuminated in "blue", a color not included in the imaging information 3000a.
- In another example, the active marker 2400 is illuminated in "yellow", a color not included in the imaging information 3000a.
- In another example, the active marker 2400 is illuminated in "green", a color not included in the imaging information 3000a.
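- One plausible way to obtain a color that is not included in the imaging information 3000a is to histogram the hues of the scene and drive the marker with the least-represented hue. The sketch below illustrates this under assumptions of our own; the hue-histogram approach and the bin count are not specified in this disclosure.

```python
import numpy as np

def rgb_to_hue_deg(rgb: np.ndarray) -> np.ndarray:
    """Per-pixel hue in degrees for an (H, W, 3) float image in [0, 1]."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    mx, mn = rgb.max(axis=-1), rgb.min(axis=-1)
    d = np.where(mx == mn, 1e-9, mx - mn)  # avoid division by zero on grey pixels
    h = np.where(mx == r, (g - b) / d % 6.0,
        np.where(mx == g, (b - r) / d + 2.0, (r - g) / d + 4.0))
    return (h * 60.0) % 360.0

def pick_absent_hue(frame: np.ndarray, bins: int = 36) -> float:
    """Histogram the scene's hues and return the centre of the emptiest bin,
    a colour the imaging information 3000a barely contains."""
    hues = rgb_to_hue_deg(frame.astype(float) / 255.0).ravel()
    hist, edges = np.histogram(hues, bins=bins, range=(0.0, 360.0))
    i = int(np.argmin(hist))
    return float((edges[i] + edges[i + 1]) / 2.0)

frame = np.random.randint(0, 256, (120, 160, 3), dtype=np.uint8)  # stand-in frame
hue_deg = pick_absent_hue(frame)  # would be sent via the control signal 3300b
```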
- Each of FIGS. 46, 47, and 48 shows the output of the vision-based sensory transmission device 3200, a vision-based sensory transmission image 3200a, when the tool 2200 of the edge device 2000 of the remote control system 12 according to Embodiment 12 is not in contact with the target 4000. The image based on the imaging information 3000a changes from moment to moment. In the images of FIGS. 46, 47, and 48 presented to the operator 1000, a fixed superimposed image 3201 is superimposed on the position of the active marker 2400, which changes according to the imaging information 3000a, so information about the changes of the active marker 2400 is blocked.
- Each of FIGS. 49, 50, and 51 shows the output of the vision-based sensory transmission device 3200, a vision-based sensory transmission image 3200a, when the tool 2200 of the edge device 2000 of the remote control system 12 according to Embodiment 12 is in contact with the target 4000. In the vision-based sensory transmission image 3200a of each of FIGS. 49, 50, and 51, the operator 1000 is presented with a superimposed image 3201 that matches the state of contact between the haptic sensing device 2300 and the target 4000.
- In this way, the remote control system 12 obtains the effect of being able to stably identify the position of the active marker 2400 based on the imaging information 3000a.
- By superimposing a fixed image on the position identified from the imaging information 3000a, the remote control system 12 blocks image change information that is unnecessary for the operator 1000, without hindering the operability of the operator 1000. As a result, the range of use of the remote control system 12 can be expanded.
- FIG. 52 is a diagram showing the configuration of the remote control system 13 according to the thirteenth embodiment.
- The remote control system 13 is a system in which the motion transmission information 1200a is input to the vision-based sensory transmission device 3200 of the remote control system 12 according to Embodiment 12.
- FIG. 53 is a diagram showing functions of a vision-based sensory transmission device 3200 included in the remote control system 13 according to the thirteenth embodiment.
- The vision-based sensory transmission device 3200 has an edge device model storage unit 3400 that stores edge device model information 3400a, which is model information of the edge device 2000.
- The edge device model storage unit 3400 stores geometric information of the point at which the edge device 2000 contacts the target 4000.
- For example, the edge device model storage unit 3400 is realized by semiconductor memory.
- The visual haptic image information 3250a is generated, and the marker point identification information 3251a, the marker filter information 3253a, and the marker point estimation information 3254a are each reflected in the vision-based sensory transmission image 3200a.
- FIG. 54 is a diagram showing the imaging information 3000a output from the imaging device 3000 when the tool 2200 of the edge device 2000 of the remote control system 13 according to Embodiment 13 is in contact with the target 4000. The tool 2200 has two active markers 2400.
- FIG. 55 is a diagram showing a vision-based sensory transmission image 3200a, the output of the vision-based sensory transmission device 3200, when the tool 2200 of the edge device 2000 of the remote control system 13 according to Embodiment 13 is in contact with the target 4000. In the vision-based sensory transmission image 3200a, the position of the tip of the tool 2200 is calculated from the in-image vector determined by the positions of the two active markers 2400 and from the pre-stored edge device model information 3400a, and the visual haptic image information 3250a is superimposed on the image at that position.
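- The tip-position calculation can be pictured as extrapolating along the line through the two detected marker points, scaled by the stored geometry. A minimal two-dimensional sketch, in which model_ratio stands in for the edge device model information 3400a and all numeric values are illustrative:

```python
import numpy as np

def estimate_tip(p_base: np.ndarray, p_front: np.ndarray,
                 model_ratio: float) -> np.ndarray:
    """Extrapolate beyond the front marker along the marker-to-marker axis.

    model_ratio is the tip-to-front-marker distance divided by the marker
    spacing; it stands in for the stored geometric model (value assumed).
    """
    axis = p_front - p_base
    return p_front + model_ratio * axis

p_base = np.array([210.0, 144.0])    # pixel position of the rear marker
p_front = np.array([238.0, 190.0])   # pixel position of the front marker
tip = estimate_tip(p_base, p_front, model_ratio=0.8)
# The visual haptic image information 3250a would be superimposed at `tip`.
```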
- In this way, even when the position where the active marker 2400 is attached is not the contact point with the target 4000, the remote control system 13 according to Embodiment 13 can present the haptic information 2300a at the position of contact with the target 4000, which has the effect of expanding the range of choices for the size and mounting method of the active marker 2400.
- Embodiment 14: In Embodiment 13, the active marker 2400 is attached to the tool 2200. In Embodiment 14, the active marker 2400 is attached to the end effector 2100 rather than the tool 2200. In other words, the component to which the active marker 2400 is attached differs between Embodiment 13 and Embodiment 14.
- FIG. 56 is a diagram showing the imaging information 3000a output from the imaging device 3000 when the end effector 2100 of the edge device 2000 of the remote control system according to Embodiment 14 is in contact with the target 4000.
- FIG. 57 is a diagram showing a vision-based sensory transmission image 3200a, the output of the vision-based sensory transmission device 3200, when the end effector 2100 of the edge device 2000 of the remote control system according to Embodiment 14 is in contact with the target 4000. As shown in FIGS. 56 and 57, in Embodiment 14 the active marker 2400 is attached to the end effector 2100 rather than the tool 2200.
- In this way, even when the position where the active marker 2400 is attached is not the contact point with the target 4000, the remote control system according to Embodiment 14 can present the haptic information 2300a at the position of contact with the target 4000, and the range of choices for the size and mounting method of the active marker 2400 can be further expanded.
- Embodiment 15: In Embodiment 13, the haptic sensing device 2300 is attached to the tool 2200. In Embodiment 15, the haptic sensing device 2300 is attached to the end effector 2100 instead of the tool 2200. That is, the component to which the haptic sensing device 2300 is attached differs between Embodiment 13 and Embodiment 15.
- FIG. 58 is a diagram showing the imaging information 3000a output from the imaging device 3000 when the tool 2200 of the edge device 2000 of the remote control system according to Embodiment 15 is in contact with the target 4000.
- FIG. 59 is a diagram showing a vision-based sensory transmission image 3200a, the output of the vision-based sensory transmission device 3200, when the tool 2200 of the edge device 2000 of the remote control system according to Embodiment 15 is in contact with the target 4000. As shown in FIGS. 58 and 59, in Embodiment 15 the haptic sensing device 2300 is attached to the end effector 2100 instead of the tool 2200.
- In this way, even when the position where the haptic sensing device 2300 is attached is not the point of contact with the target 4000, the remote control system according to Embodiment 15 can present the haptic information 2300a at the position of contact with the target 4000, and the range of choices for the size and mounting method of the haptic sensing device 2300 can be further expanded.
- Note that the haptic sensing device 2300 may be attached to the end effector 2100 instead of the tool 2200. In that case, the position where the haptic sensing device 2300 is attached is not the point of contact with the target 4000; even so, the haptic information 2300a can be presented at the position of contact with the target 4000, and the effect of further expanding the range of choices for the size and mounting method of the haptic sensing device 2300 is obtained.
- Embodiment 16: Although not illustrated, in Embodiment 16 the configuration in which the active marker 2400 is attached to a location other than the contact point with the target 4000, as shown in Embodiments 13 and 14, is applied to the configurations of Embodiments 3, 4, 5, 6, 7, 8, and 11.
- In Embodiment 16, in addition to the effects obtained in Embodiments 3, 4, 5, 6, 7, 8, and 11, even when the position where the haptic sensing device 2300 is attached is not the point of contact with the target 4000, the haptic information 2300a can be presented at the position of contact with the target 4000, and the effect of further expanding the range of choices for the size and mounting method of the haptic sensing device 2300 is obtained.
- Embodiment 17: Although not illustrated, in Embodiment 17 the configuration in which the haptic sensing device 2300 is attached to a location other than the contact point with the target 4000, as shown in Embodiment 15, is applied to the configurations of Embodiments 3, 4, 5, 6, 7, 8, 11, and 16.
- In Embodiment 17, even when the position where the haptic sensing device 2300 is attached is not the point of contact with the target 4000, the haptic information 2300a can be presented at the position of contact with the target 4000, and the range of choices for the size and mounting method of the haptic sensing device 2300 can be further expanded.
- Embodiment 18: Although not illustrated, in Embodiment 18 the haptic sensing device 2300 in any one of Embodiments 3, 4, 5, 6, 7, 8, 9, 11, 12, 13, 14, 15, 16, and 17 has the capability of detecting the target 4000 even before contact occurs.
- As a result, the operator 1000 can visually recognize a situation in which the edge device 2000 is about to come into contact with the target 4000, and the operability of the operator 1000 can be improved.
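- One simple way to visualize such pre-contact detection is to fade a warning overlay in as the sensed distance to the target 4000 shrinks. A sketch under an assumed sensing range and a linear fade, both of which are illustrative choices:

```python
def precontact_overlay_alpha(distance_mm: float, range_mm: float = 30.0) -> float:
    """Opacity of the pre-contact overlay: 0.0 (invisible) beyond the
    assumed sensing range, rising linearly to 1.0 (opaque) at contact."""
    if distance_mm >= range_mm:
        return 0.0
    return 1.0 - max(distance_mm, 0.0) / range_mm

print(precontact_overlay_alpha(45.0))  # 0.0: target still far away
print(precontact_overlay_alpha(3.0))   # 0.9: contact is imminent
```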
- Embodiment 19: Although not illustrated, in Embodiment 19, in any one of Embodiments 1, 2, 3, 4, 5, 6, 7, 8, 9, 11, 12, 13, 14, 15, 16, 17, and 18, when the haptic sensing device 2300 detects the target 4000, the system has the function of simultaneously transmitting not only the visual haptic image information 3250a but also auditory information, that is, loudness, pitch, timbre, or a combination of the three, corresponding to the detected distance, the detected force, the detected tactile sensation, or a combination of the three.
- In Embodiment 19, therefore, when the haptic sensing device 2300 detects the target 4000, the operator 1000 can recognize the target 4000 not only visually but also audibly, and the effect of improving the operability of the operator 1000 is obtained.
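- As an illustration of such an auditory mapping, the sketch below derives pitch from the detected distance and loudness from the detected force. The sensing ranges and the 220 Hz to 880 Hz sweep are assumptions made for illustration, not values from this disclosure.

```python
def haptic_to_audio(distance_mm: float, force_n: float,
                    d_max: float = 50.0, f_max: float = 10.0):
    """Map detected distance to pitch and detected force to loudness.

    d_max and f_max are assumed sensing limits; the 220-880 Hz sweep is
    an arbitrary, comfortable audio band chosen for illustration.
    """
    closeness = 1.0 - min(max(distance_mm / d_max, 0.0), 1.0)
    pitch_hz = 220.0 + 660.0 * closeness            # rises as the tool closes in
    loudness = min(max(force_n / f_max, 0.0), 1.0)  # normalised gain in [0, 1]
    return pitch_hz, loudness

print(haptic_to_audio(50.0, 0.0))  # (220.0, 0.0): far away, silent
print(haptic_to_audio(0.0, 5.0))   # (880.0, 0.5): contact, mid loudness
```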
- FIG. 60 is a diagram showing processor 500 when some of the plurality of components of remote control system 1 according to Embodiment 1 are implemented by processor 500 .
- The functions of some of the components of the remote control system 1 may be implemented by the processor 500 executing programs stored in the memory 600.
- The processor 500 is a CPU (Central Processing Unit), a processing device, an arithmetic device, a microprocessor, a DSP (Digital Signal Processor), or a system LSI (Large Scale Integration). FIG. 60 also shows the memory 600.
- When the functions of some of the components of the remote control system 1 are implemented by the processor 500, those functions are implemented by the processor 500 together with software, firmware, or a combination of software and firmware.
- Software or firmware is written as a program and stored in memory 600 .
- Processor 500 reads and executes programs stored in memory 600 to implement functions of some of the components of remote control system 1 .
- That is, the remote control system 1 has the memory 600 for storing programs whose execution results in the execution of the steps carried out by some of the components of the remote control system 1. It can also be said that the programs stored in the memory 600 cause a computer to execute the procedures of some of the components of the remote control system 1.
- The memory 600 is a nonvolatile or volatile semiconductor memory such as RAM (Random Access Memory), ROM (Read Only Memory), flash memory, EPROM (Erasable Programmable Read Only Memory), or EEPROM (registered trademark) (Electrically Erasable Programmable Read-Only Memory), or a magnetic disk, a flexible disk, an optical disk, a compact disc, a MiniDisc, a DVD (Digital Versatile Disc), or the like.
- The memory 600 also stores obstacle information, target shape information, and shape deformation information.
- Memory 600 may also be used for temporary memory while processor 500 is performing operations.
- The programs executed by the processor 500 may be stored in a computer-readable storage medium in an installable or executable format and provided as a computer product.
- The programs executed by the processor 500 may also be provided to the remote control system 1 via a communication network such as the Internet.
- FIG. 61 is a diagram showing a processing circuit 700 in which some of the multiple components of the remote control system 1 according to Embodiment 1 are implemented by the processing circuit 700.
- The processing circuit 700 is dedicated hardware.
- The processing circuit 700 is, for example, a single circuit, a composite circuit, a programmed processor, a parallel-programmed processor, an ASIC (Application Specific Integrated Circuit), an FPGA (Field-Programmable Gate Array), or a combination of these. Some of the components of the remote control system 1 may be realized by dedicated hardware separate from the rest.
- Alternatively, some of the functions may be implemented by software or firmware, and the rest of the functions may be implemented by dedicated hardware.
- In this way, the functions of the remote control system 1 can be realized by hardware, or by a combination of hardware with software, firmware, or both.
- Some of the components of each of the remote control systems of Embodiments 2 to 18 may likewise be implemented by a processor.
- The processor is similar to the processor 500 described above. In that case, the processor executes programs stored in the memory to implement the functions of some of the components of each of the remote control systems of Embodiments 2 to 18.
- The memory mentioned above is similar to the memory 600.
- Some of the components of each of the remote control systems of Embodiments 2 to 18 may also be implemented by a processing circuit.
- That processing circuit is similar to the processing circuit 700 described above.
- Reference signs: remote control system, 500 processor, 600 memory, 700 processing circuit, 1000 operator, 1000a operator operation, 1100 visual presentation device, 1100a video information, 1200 motion operation transmission device, 1200a motion transmission information, 2000 edge device, 2100 end effector, 2200 tool, 2300 haptic sensing device, 2300a haptic information, 2400 active marker, 2500 passive marker, 2600 light intensity controlled light emitter, 2700 haptic-to-light converter, 2700a light intensity signal, 3000 imaging device, 3000a imaging information, 3200 vision-based sensory transmission device, 3200a vision-based sensory transmission image, 3201 superimposed image, 3250 visual haptic image generation function, 3250a visual haptic image information, 3251 marker point identification function, 3251a marker point identification information, 3252 superimposition function, 3253 marker filter function, 3253a marker filter information, 3254 marker point estimation function, 3254a marker point estimation information, 3255 image analysis function, 3255a haptic image pattern selection information, 3255
Landscapes
- Engineering & Computer Science (AREA)
- Robotics (AREA)
- Mechanical Engineering (AREA)
- Human Computer Interaction (AREA)
- Manipulator (AREA)
Abstract
Description
FIG. 1 is a diagram showing the configuration of the remote control system 1 according to Embodiment 1. The remote control system 1 has a motion operation transmission device 1200 that receives as input an operator operation 1000a, which is a motion operation by the operator 1000. FIG. 1 also shows the operator 1000. The motion operation transmission device 1200 outputs motion transmission information 1200a corresponding to the operator operation 1000a.
The configuration of the remote control system according to Embodiment 2 is the same as that of the remote control system 1 according to Embodiment 1. However, Embodiment 2 and Embodiment 1 differ in the function of the light intensity controlled light emitter 2600. In Embodiment 2, the light intensity controlled light emitter 2600 does not emit light of varying intensity; instead, it expresses the state of contact between the haptic sensing device 2300 and the target 4000 through changes in hue and blinking frequency.
FIG. 6 is a diagram showing the configuration of the remote control system 3 according to Embodiment 3. The remote control system 3 has a motion operation transmission device 1200 that receives as input an operator operation 1000a, which is a motion operation by the operator 1000. FIG. 6 also shows the operator 1000. The motion operation transmission device 1200 outputs motion transmission information 1200a corresponding to the operator operation 1000a.
The configuration of the remote control system according to Embodiment 4 is the same as that of the remote control system 3 according to Embodiment 3. However, the superimposed image 3201 differs between Embodiment 4 and Embodiment 3. In Embodiment 3, the superimposed image 3201 indicates the intensity of the lighting image. In Embodiment 4, instead of indicating the intensity of the lighting image, the superimposed image 3201 expresses the state of contact between the haptic sensing device 2300 and the target 4000 through changes in hue and blinking frequency.
The configuration of the remote control system according to Embodiment 5 is the same as that of the remote control system 3 according to Embodiment 3. However, the functions of the vision-based sensory transmission device 3200 of Embodiment 5 differ from those of the vision-based sensory transmission device 3200 of Embodiment 3. The description of Embodiment 5 focuses mainly on the differences from Embodiment 3.
The configuration of the remote control system according to Embodiment 6 is the same as that of the remote control system 3 according to Embodiment 3. However, the functions of the vision-based sensory transmission device 3200 of Embodiment 6 differ from those of the vision-based sensory transmission device 3200 of Embodiment 3. The description of Embodiment 6 focuses mainly on the differences from Embodiment 3.
The configuration of the remote control system according to Embodiment 7 is the same as that of the remote control system according to Embodiment 6. However, the functions of the vision-based sensory transmission device 3200 of Embodiment 7 differ from those of the vision-based sensory transmission device 3200 of Embodiment 6. The description of Embodiment 7 focuses mainly on the differences from Embodiment 6.
The configuration of the remote control system according to Embodiment 8 is the same as that of the remote control system 3 according to Embodiment 3. However, the functions of the vision-based sensory transmission device 3200 of Embodiment 8 differ from those of the vision-based sensory transmission device 3200 of Embodiment 3. The description of Embodiment 8 focuses mainly on the differences from Embodiment 3.
FIG. 24 is a diagram showing the configuration of the remote control system 9 according to Embodiment 9. The remote control system 9 has the motion operation transmission device 1200, the edge device 2000, the imaging device 3000, the vision-based sensory transmission device 3200, and the visual presentation device 1100 of the remote control system according to any one of Embodiments 3 to 8. In Embodiment 9, the passive marker 2500 of the edge device 2000 is replaced with the active marker 2400. FIG. 24 also shows the operator 1000 and the target 4000.
FIG. 34 is a diagram showing the configuration of the remote control system 10 according to Embodiment 10. The remote control system 10 has all the components of the remote control system 1 according to Embodiment 1. The edge device 2000 of the remote control system 10 further has a tool 2200 attached to the end effector 2100. In Embodiment 10, the haptic sensing device 2300 and the light intensity controlled light emitter 2600 are attached to the tip portion that contacts the target 4000.
FIG. 37 is a diagram showing the configuration of the remote control system 11 according to Embodiment 11. The remote control system 11 has all the components of any one of the remote control systems according to Embodiments 3 to 8. The edge device 2000 of the remote control system 11 further has a tool 2200 attached to the end effector 2100. In Embodiment 11, the haptic sensing device 2300 and the passive marker 2500 are attached to the tip portion that contacts the target 4000.
FIG. 42 is a diagram showing the configuration of the remote control system 12 according to Embodiment 12. The remote control system 12 is a system in which the passive marker 2500 of the remote control system 11 according to Embodiment 11 is replaced with the active marker 2400. In Embodiment 12, the haptic sensing device 2300 and the active marker 2400 are attached to the tip portion that contacts the target 4000.
FIG. 52 is a diagram showing the configuration of the remote control system 13 according to Embodiment 13. The remote control system 13 is a system in which the motion transmission information 1200a is input to the vision-based sensory transmission device 3200 of the remote control system 12 according to Embodiment 12. FIG. 53 is a diagram showing the functions of the vision-based sensory transmission device 3200 of the remote control system 13 according to Embodiment 13. The vision-based sensory transmission device 3200 has an edge device model storage unit 3400 that stores edge device model information 3400a, which is model information of the edge device 2000. The edge device model storage unit 3400 stores geometric information of the point at which the edge device 2000 contacts the target 4000. For example, the edge device model storage unit 3400 is realized by semiconductor memory.
In Embodiment 13, the active marker 2400 is attached to the tool 2200. In Embodiment 14, the active marker 2400 is attached to the end effector 2100 rather than the tool 2200. In other words, the component to which the active marker 2400 is attached differs between Embodiment 13 and Embodiment 14.
In Embodiment 13, the haptic sensing device 2300 is attached to the tool 2200. In Embodiment 15, the haptic sensing device 2300 is attached to the end effector 2100 rather than the tool 2200. In other words, the component to which the haptic sensing device 2300 is attached differs between Embodiment 13 and Embodiment 15.
In Embodiment 16, although not illustrated, the configuration in which the active marker 2400 is attached to a location other than the contact point with the target 4000, as shown in Embodiments 13 and 14, is applied to the configurations of Embodiments 3, 4, 5, 6, 7, 8, and 11.
In Embodiment 17, although not illustrated, the configuration in which the haptic sensing device 2300 is attached to a location other than the contact point with the target 4000, as shown in Embodiment 15, is applied to the configurations of Embodiments 3, 4, 5, 6, 7, 8, 11, and 16.
In Embodiment 18, although not illustrated, the haptic sensing device 2300 in any one of Embodiments 3, 4, 5, 6, 7, 8, 9, 11, 12, 13, 14, 15, 16, and 17 has the capability of detecting the target 4000 even before contact occurs.
In Embodiment 19, although not illustrated, in any one of Embodiments 1, 2, 3, 4, 5, 6, 7, 8, 9, 11, 12, 13, 14, 15, 16, 17, and 18, when the haptic sensing device 2300 detects the target 4000, the system has the function of simultaneously transmitting not only the visual haptic image information 3250a but also auditory information, that is, loudness, pitch, timbre, or a combination of the three, corresponding to the detected distance, the detected force, the detected tactile sensation, or a combination of the three.
Claims (10)
1. A remote control system comprising: an edge device that contacts a target; a motion operation transmission device that receives an operator operation for operating the edge device and outputs motion transmission information corresponding to the operator operation to the edge device; and a visual display device that displays video of the target and the edge device, wherein the edge device has: a haptic sensing device that senses a haptic force with respect to the target; a haptic-to-light converter that converts the haptic force sensed by the haptic sensing device into a signal specifying a light intensity corresponding to the magnitude of the sensed haptic force; and a light intensity controlled light emitter that emits light of an intensity corresponding to the signal obtained by the haptic-to-light converter.
2. A remote control system comprising: an edge device that contacts a target; a motion operation transmission device that receives an operator operation for operating the edge device and outputs motion transmission information corresponding to the operator operation to the edge device; an imaging device that images the target and the edge device; a visual display device that displays video of the target and the edge device; and a vision-based sensory transmission device, wherein the edge device has: a haptic sensing device that senses a haptic force with respect to the target; and a passive marker, and the vision-based sensory transmission device has: a marker point identification function that identifies the position of the passive marker; a visual haptic image generation function that generates visual haptic image information representing video corresponding to haptic information indicating the haptic force sensed by the haptic sensing device; and a superimposition function that superimposes the marker point identification information obtained by the marker point identification function, the visual haptic image information obtained by the visual haptic image generation function, and the imaging information obtained by the imaging device.
3. The remote control system according to claim 2, wherein the vision-based sensory transmission device further has a marker filter function that generates marker filter information for filtering the detected passive marker.
4. The remote control system according to claim 2 or 3, wherein, in a case where the edge device has a plurality of the passive markers and one of the passive markers is occluded by the target and disappears from the imaging information, the vision-based sensory transmission device further has a marker point estimation function that estimates the lost marker point based on the imaging information and the marker point identification information when the other passive markers are present in the imaging information.
5. The remote control system according to any one of claims 2 to 4, wherein the vision-based sensory transmission device further has an image analysis function that generates, based on the imaging information, haptic image pattern selection information for selecting a haptic image pattern that is easy for a person to perceive, and marker filter image selection information for selecting an image that blends in naturally.
6. A remote control system comprising: an edge device that contacts a target; a motion operation transmission device that receives an operator operation for operating the edge device and outputs motion transmission information corresponding to the operator operation to the edge device; an imaging device that images the target and the edge device; a visual display device that displays video of the target and the edge device; a vision-based sensory transmission device; and an active marker control device that generates, based on the imaging information obtained by the imaging device, an active marker control signal for producing a detectable marker, and marker information, wherein the edge device has: a haptic sensing device that senses a haptic force with respect to the target; and an active marker controlled by the active marker control signal generated by the active marker control device, and the vision-based sensory transmission device has: a marker point identification function that identifies the position of the active marker; a visual haptic image generation function that generates visual haptic image information representing video corresponding to haptic information indicating the haptic force sensed by the haptic sensing device; and a superimposition function that superimposes the marker point identification information obtained by the marker point identification function, the visual haptic image information obtained by the visual haptic image generation function, and the imaging information obtained by the imaging device.
7. The remote control system according to claim 1, wherein the haptic sensing device and the light intensity controlled light emitter provided on the edge device are attached to a tip portion that contacts the target.
8. The remote control system according to any one of claims 2 to 5, wherein the haptic sensing device and the passive marker provided on the edge device are attached to a tip portion that contacts the target.
9. The remote control system according to claim 6, wherein the haptic sensing device and the active marker provided on the edge device are attached to a tip portion that contacts the target.
10. The remote control system according to any one of claims 2 to 9, wherein the vision-based sensory transmission device has an edge device model storage unit that stores geometric information of a point at which the edge device contacts the target.
Priority Applications (4)

| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/JP2021/025584 (WO2023281648A1) | 2021-07-07 | 2021-07-07 | Remote operation system |
| CN202180100061.3A (CN117580689A) | 2021-07-07 | 2021-07-07 | Remote operation system |
| JP2022508872A (JP7109699B1) | 2021-07-07 | 2021-07-07 | Remote operation system |
| EP21949282.4A (EP4368351A4) | 2021-07-07 | 2021-07-07 | Remote operating system |
Applications Claiming Priority (1)

| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/JP2021/025584 (WO2023281648A1) | 2021-07-07 | 2021-07-07 | Remote operation system |
Publications (1)

| Publication Number | Publication Date |
|---|---|
| WO2023281648A1 | 2023-01-12 |
Family

ID: 82652287
Family Applications (1)

| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2021/025584 (WO2023281648A1) | Remote operation system | 2021-07-07 | 2021-07-07 |
Country Status (4)

| Country | Link |
|---|---|
| EP (1) | EP4368351A4 |
| JP (1) | JP7109699B1 |
| CN (1) | CN117580689A |
| WO (1) | WO2023281648A1 |
Citations (4)

| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2014079824A | 2012-10-15 | 2014-05-08 | Toshiba Corp | Work screen display method and work screen display device |
| WO2016113836A1 | 2015-01-13 | 2016-07-21 | Hitachi, Ltd. | Manipulator control method, system, and manipulator |
| WO2019059364A1 | 2017-09-22 | 2019-03-28 | Mitsubishi Electric Corporation | Remote control manipulator system and control device |
| WO2019224994A1 | 2018-05-25 | 2019-11-28 | MELTIN MMI Co., Ltd. | Motion detection device |
Family Cites Families (6)

| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| EP2887884B1 | 2012-08-27 | 2019-06-12 | University Of Houston | Robotic device and system software for image-guided and robot-assisted surgery |
| JP6938369B2 | 2014-03-28 | 2021-09-22 | Intuitive Surgical Operations, Inc. | Surgical system using haptic feedback based on quantitative three-dimensional imaging |
| JP6420229B2 | 2015-12-10 | 2018-11-07 | FANUC Corporation | Robot system including a video display device that superimposes an image of a virtual object on a video of the robot |
| CA3062101A1 | 2017-05-03 | 2018-11-08 | Taiga Robotics Corp. | Systems and methods for remotely controlling a robotic device |
| JP6948164B2 | 2017-06-12 | 2021-10-13 | Hitachi-GE Nuclear Energy, Ltd. | Arm posture control system and method for a working robot |
| JP2020156800A | 2019-03-27 | 2020-10-01 | Sony Corporation | Medical arm system, control device, and control method |

Application events in 2021:

- 2021-07-07 EP application EP21949282.4A (EP4368351A4), status: pending
- 2021-07-07 WO application PCT/JP2021/025584 (WO2023281648A1), status: application filing
- 2021-07-07 JP application JP2022508872A (JP7109699B1), status: active
- 2021-07-07 CN application CN202180100061.3A (CN117580689A), status: pending
Non-Patent Citations (1)

- See also references of EP4368351A4
Also Published As

| Publication number | Publication date |
|---|---|
| EP4368351A1 | 2024-05-15 |
| JPWO2023281648A1 | 2023-01-12 |
| CN117580689A | 2024-02-20 |
| JP7109699B1 | 2022-07-29 |
| EP4368351A4 | 2024-08-21 |
Legal Events

| Date | Code | Title | Description |
|---|---|---|---|
| | ENP | Entry into the national phase | Ref document number: 2022508872; Country of ref document: JP; Kind code of ref document: A |
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 21949282; Country of ref document: EP; Kind code of ref document: A1 |
| | WWE | Wipo information: entry into national phase | Ref document number: 202180100061.3; Country of ref document: CN |
| | WWE | Wipo information: entry into national phase | Ref document number: 2021949282; Country of ref document: EP |
| | ENP | Entry into the national phase | Ref document number: 2021949282; Country of ref document: EP; Effective date: 20240207 |
| | NENP | Non-entry into the national phase | Ref country code: DE |