WO2024089890A1 - Remote operation system and remote operation method - Google Patents

Remote operation system and remote operation method

Info

Publication number
WO2024089890A1
Authority
WO
WIPO (PCT)
Prior art keywords
remote machine
image
camera
operator
attitude
Prior art date
Application number
PCT/JP2022/040482
Other languages
English (en)
Japanese (ja)
Inventor
正樹 春名
茂明 田頭
正樹 荻野
Original Assignee
三菱電機株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 三菱電機株式会社
Priority to PCT/JP2022/040482 priority Critical patent/WO2024089890A1/fr
Publication of WO2024089890A1 publication Critical patent/WO2024089890A1/fr

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04QSELECTING
    • H04Q9/00Arrangements in telecontrol or telemetry systems for selectively calling a substation from a main station, in which substation desired apparatus is selected for applying a control signal thereto or for obtaining measured values therefrom

Definitions

  • This disclosure relates to a remote operation system and a remote operation method.
  • In a remote operation system that operates machinery remotely, a combination of a head-mounted display and an operation interface that detects the operator's gestures is used, for example.
  • Patent Document 1 discloses a technology that uses a fisheye stereo camera to present a wide-field-of-view image to a remote operator without the need to drive the camera.
  • However, the technology of Patent Document 1 requires the operator to wear a head-mounted display and to have stereoscopic perception, and the operator must operate a joystick while sequentially checking the surroundings using stereoscopic perception, which places a high operational burden on the operator.
  • The present disclosure has been made in view of the above, and aims to provide a remote operation system that can reduce the operational burden on the operator.
  • The remote operation system disclosed herein comprises: a remote machine that is equipped with a camera and is remotely operated; an image presentation device that presents an image captured by the camera to an operator as a presented image; and an operation device that is operated by the operator and accepts, through the operator's operation, an input of a target attitude of the remote machine as a position in the presented image, and an input of a position movement instruction indicating that the position of the remote machine is to be moved.
  • The remote operation system further comprises an operation calculator that drives the attitude of the remote machine based on the target attitude accepted by the operation device, and moves the position of the remote machine based on the position movement instruction accepted by the operation device.
  • The remote operation system disclosed herein has the effect of reducing the operational burden on the operator.
  • FIG. 1 is a diagram showing a configuration example of a remote operation system according to Embodiment 1.
  • FIG. 2 is a schematic diagram showing a specific example of the remote operation system according to Embodiment 1.
  • FIG. 3 is a diagram showing an example of a method for setting a target attitude according to Embodiment 1.
  • FIG. 4 is a diagram showing an example of an operation using the operation device according to Embodiment 1.
  • FIG. 5 is a flowchart showing an example of the operation of the operation calculator according to Embodiment 1.
  • FIG. 6 is a diagram showing an example of the configuration of a computer system that realizes the operation calculator according to Embodiment 1.
  • FIG. 7 is a diagram showing a configuration example of a remote operation system according to Embodiment 2.
  • FIG. 8 is a diagram showing an example of a screen showing an operation method according to Embodiment 2.
  • FIG. 9 is a diagram showing an example of a screen showing an operation method according to Embodiment 2.
  • FIG. 10 is a diagram showing an example of a screen showing an operation method according to Embodiment 2.
  • FIG. 11 is a diagram showing an example in which a terminal device is used as the image presentation device according to Embodiment 2.
  • FIG. 12 is a diagram showing an example of a video display method according to Embodiment 2.
  • FIG. 13 is a diagram showing a configuration example of a remote operation system according to Embodiment 3.
  • FIG. 14 is a diagram showing an example of camera switching according to Embodiment 3.
  • FIG. 15 is a diagram showing an example of camera switching according to Embodiment 3.
  • FIG. 16 is a diagram showing an example of a camera provided inside a hand of a manipulator according to Embodiment 3.
  • FIG. 17 is a diagram showing an example of a camera according to Embodiment 3 for photographing the manipulator from the side.
  • FIG. 18 is a diagram showing an example of video switching according to Embodiment 3.
  • FIG. 19 is a diagram showing an example of superimposing images captured from the side according to Embodiment 3.
  • FIG. 20 is a diagram showing an example of a method for detecting and displaying pressure according to Embodiment 3.
  • FIG. 21 is a diagram showing another example of a method for detecting and displaying pressure according to Embodiment 3.
  • FIG. 22 is a diagram showing a configuration example of a remote operation system according to Embodiment 4.
  • FIG. 23 is a diagram showing an example of a remote operation system according to Embodiment 4.
  • FIG. 24 is a diagram showing an example of a setting screen for a hand angle according to Embodiment 4.
  • FIG. 25 is a diagram showing an example of a setting screen for a hand angle according to Embodiment 4.
  • FIG. 26 is a diagram showing a configuration example of a remote operation system according to Embodiment 5.
  • Embodiment 1. FIG. 1 is a diagram showing a configuration example of a remote operation system according to Embodiment 1.
  • The remote operation system 100 of this embodiment includes a remote machine 1, a camera 2, an operation calculator 3, an image presentation device 4, and an operation device 5.
  • The remote machine 1 is a machine that is remotely operated, and may be, for example, but is not limited to, a combination of a cart and a manipulator, a mobile vehicle, a manipulator alone, or a humanoid robot.
  • In the following, a case where the remote machine 1 is a combination of a cart and a manipulator will be mainly described.
  • The remote machine 1 includes an attitude drive mechanism 11 and a position drive mechanism 12.
  • The attitude drive mechanism 11 is a mechanism that can change the attitude of the remote machine 1, and changes the attitude of the remote machine 1 according to a control signal received from the operation calculator 3.
  • The attitude indicates the orientation of the remote machine 1.
  • For example, if the remote machine 1 is a cart or a mobile vehicle, the target attitude indicates the target orientation of that cart or vehicle.
  • For example, the attitude drive mechanism 11 includes an actuator that changes the orientation of the wheels of the remote machine 1.
  • The attitude drive mechanism 11 may be a drive mechanism that can change the orientation of the remote machine 1 on the spot, or a drive mechanism that changes the orientation by turning. Also, for example, if the remote machine 1 is a combination of a cart and a manipulator, the target attitude may be appropriately allocated between the orientation of the cart and the target orientation of the manipulator.
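  • As an illustration of the allocation mentioned above, the following sketch clamps the manipulator's share of a desired yaw change to a joint limit and assigns the remainder to the cart. The function name, the limit parameter, and the clamping policy are assumptions for illustration, not the method defined by this disclosure.

```python
def allocate_yaw(delta_yaw, manipulator_limit):
    """Split a desired yaw change (radians) between manipulator and cart.

    The manipulator absorbs as much of the change as its (hypothetical)
    joint limit allows; the cart turns through the remainder.
    Returns (cart_share, manipulator_share).
    """
    # Clamp the manipulator's share to [-limit, +limit].
    manip = max(-manipulator_limit, min(manipulator_limit, delta_yaw))
    # The cart covers whatever the manipulator cannot.
    cart = delta_yaw - manip
    return cart, manip
```

Other allocation policies (e.g., cart-first, or proportional splitting) would fit the same interface.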
  • The position drive mechanism 12 is a mechanism capable of changing the position of the remote machine 1, and changes the position of the remote machine 1 in response to a control signal received from the operation calculator 3.
  • The position drive mechanism 12 includes, for example, an actuator capable of changing at least one of the horizontal and vertical positions of the remote machine 1.
  • The camera 2 is mounted on the remote machine 1 and captures the surroundings of the remote machine 1. There is no restriction on the number of cameras 2; one or more may be used. In the following, an example in which the camera 2 includes two fisheye cameras with different viewpoints will be described, but the camera 2 is not limited to a fisheye camera.
  • The image presentation device 4 is a display device such as a monitor or a smartphone screen, and presents an image to the operator who remotely operates the remote machine 1.
  • The image presentation device 4 presents, for example, an image captured by the camera 2 to the operator as a presented image.
  • The operation device 5 is a device operated by the operator, such as, but not limited to, a mouse, a keyboard, a touchpad, or a device that performs screen operations by detecting the operator's face or line of sight.
  • The operation device 5 accepts, through the operator's operation, an input of a target attitude of the remote machine 1 as a position in the presented image, and an input of a position movement instruction indicating that the position of the remote machine 1 is to be moved.
  • The image presentation device 4 and the operation device 5 may be integrated, for example as a touch panel or a smartphone.
  • In the following, a case where the operation device 5 is a mouse and the image presentation device 4 is a monitor will be mainly described.
  • The operation calculator 3 receives the image captured by the camera 2 and outputs it to the image presentation device 4, thereby displaying the image on the image presentation device 4.
  • The operation calculator 3 also controls the remote machine 1 according to the operation content accepted by the operation device 5. For example, the operation calculator 3 drives the attitude of the remote machine 1 based on the target attitude accepted by the operation device 5, and moves the position of the remote machine 1 based on the position movement instruction accepted by the operation device 5.
  • The camera 2 and the operation calculator 3 may be directly connected, or each may be provided with a communication unit (not shown) so that data is transmitted and received by communication.
  • The communication between the camera 2 and the operation calculator 3 may be wired, wireless, or a combination of wired and wireless communication.
  • Similarly, the operation calculator 3 and the remote machine 1 may be directly connected, or each may be provided with a communication unit (not shown) so that data is transmitted and received by communication.
  • The image presentation device 4 and the operation device 5 may be directly connected to the operation calculator 3, or the image presentation device 4, the operation device 5, and the operation calculator 3 may each be provided with a communication unit (not shown) so that data is transmitted and received by communication.
  • The communication between the image presentation device 4 and the operation device 5 on one side and the operation calculator 3 on the other may be wired, wireless, or a combination of wired and wireless communication.
  • The operation calculator 3 may be provided at a first location where the remote machine 1 is located, or at a second location where the image presentation device 4 and the operation device 5 are located, that is, on the operator's side.
  • For example, the operation calculator 3 may be mounted on the remote machine 1.
  • Alternatively, the operation calculator 3 may be provided at a location different from both the first location and the second location.
  • For example, the operation calculator 3 may be provided on a cloud server.
  • The operation calculator 3 includes an input discrimination unit 31, a display information generation unit 32, a target attitude setting unit 33, an attitude drive unit 34, a target attitude determination unit 35, and a position drive unit 36.
  • The input discrimination unit 31 acquires operation information indicating the operation content from the operation device 5, and outputs the operation information to the corresponding functional unit according to its content.
  • The display information generation unit 32 generates display data representing a display screen to be shown on the image presentation device 4, using the image received from the camera 2 and the operation information received from the input discrimination unit 31, and transmits the generated display data to the image presentation device 4.
  • The display information generation unit 32 also outputs the image received from the camera 2 to the target attitude setting unit 33.
  • The target attitude setting unit 33 sets a target attitude of the remote machine 1 based on the operation information received from the input discrimination unit 31, and notifies the attitude drive unit 34 and the target attitude determination unit 35 of the set target attitude.
  • When the attitude drive unit 34 is notified of the target attitude by the target attitude setting unit 33, it generates a control signal for driving the attitude of the remote machine 1 based on the target attitude, and transmits the generated control signal to the remote machine 1. Furthermore, when the attitude drive unit 34 is notified by the target attitude determination unit 35 that the target attitude has not been reached, it generates a control signal for driving the attitude of the remote machine 1 and transmits it to the remote machine 1.
  • The target attitude determination unit 35 determines whether the attitude of the remote machine 1 has reached the target attitude. If it has, the target attitude determination unit 35 notifies the position drive unit 36 of this; if it has not, it notifies the attitude drive unit 34 of this. When the position drive unit 36 has been notified that the target attitude has been reached, and the operation information received from the input discrimination unit 31 indicates that the position of the remote machine 1 is to be moved, the position drive unit 36 generates a control signal for moving the position of the remote machine 1 and transmits it to the remote machine 1.
  • FIG. 2 is a schematic diagram showing a specific example of the remote operation system 100 according to the present embodiment.
  • In this example, the remote machine 1 includes a manipulator 6 and a cart 7.
  • A fisheye camera 2-1 is provided on the front of the cart 7, and a fisheye camera 2-2 is provided near the hand 6a of the manipulator 6.
  • The image presentation device 4 is a monitor, and the operation device 5 is a mouse.
  • The image presentation device 4 displays an image captured by the camera 2-1 or an image captured by the camera 2-2.
  • Both cameras 2-1 and 2-2 are examples of the camera 2 shown in FIG. 1.
  • Camera switching, that is, switching between displaying the image captured by the camera 2-1 and the image captured by the camera 2-2 on the image presentation device 4, is performed by the operator.
  • The operator remotely controls the attitude and position of the remote machine 1 by operating the operation device 5 while watching the image displayed on the image presentation device 4.
  • FIG. 3 is a diagram showing an example of a method for setting a target attitude in this embodiment.
  • FIG. 3 shows an example of a display screen of the image presentation device 4.
  • Here, an image captured by the camera 2-1 is displayed on the image presentation device 4.
  • This image shows the upper part of the cart 7 and the hand 6a of the manipulator 6.
  • A marker 201 indicating the target attitude is displayed together with the image captured by the camera 2-1.
  • The position of the marker 201 in the display screen can be changed by the operator operating the operation device 5.
  • The position of the marker 201 in the display screen moves in accordance with the movement of the operation device 5, and is determined by pressing the mouse, which is the operation device 5, thereby specifying the target attitude.
  • The cart 7, the hand 6a, and the marker 201 are given the reference symbols 7, 6a, and 201 for the purpose of explanation, but these symbols are not displayed on the display screen.
  • In this example, the remote machine 1 operates while the mouse, which is the operation device 5, is pressed. More specifically, when the position of the marker 201, that is, the target attitude, is determined by pressing the mouse, the operation calculator 3 controls the attitude of the remote machine 1 so that the cart 7 rotates in the direction corresponding to the target attitude. That is, the operation calculator 3 may display the marker 201, which can be moved by operating the operation device 5, in the presented image, and set the position of the marker 201 at the time when the operation device 5 is pressed by the operator as the target attitude.
  • The operation calculator 3 may determine that the position movement instruction has been accepted by the operation device 5 if the pressing of the operation device 5 continues at the time when the attitude of the remote machine 1 reaches the target attitude, and may then start moving the position of the remote machine 1.
  • When the pressing of the mouse ends, the operation calculator 3 stops driving the attitude of the remote machine 1 and stops moving its position. When the remote machine 1 has been driven to the target attitude, that is, when the orientation of the remote machine 1 corresponds to the target attitude, the operation calculator 3 controls the position of the remote machine 1 so that it starts moving in the direction of travel in that attitude. While the mouse, which is the operation device 5, is pressed, the operation calculator 3 continues to move the position of the remote machine 1; when the mouse is no longer pressed, the operation calculator 3 stops the remote machine 1.
  • FIG. 4 is a diagram showing an example of an operation using the operation device 5 of this embodiment.
  • Each diagram in FIG. 4 shows an example of a display screen of the image presentation device 4, as in FIG. 3, and a marker 201 indicating the target attitude is displayed on each display screen together with an image captured by the camera 2-1.
  • FIG. 4 shows an example in which a mouse is used as the operation device 5.
  • The first display screen in FIG. 4 shows a state in which a target attitude is set.
  • The operator moves the marker 201 to the position to be set as the target attitude and presses the mouse, which is the operation device 5, to set the target attitude.
  • Here, the vector from the center of the remote machine 1 toward its front (front face) is vector 202, and the vector corresponding to the target attitude is vector 203.
  • The operation calculator 3 calculates vector 202 and vector 203 based on the image and the position of the marker 201 in the image, and starts control to rotate the cart 7 of the remote machine 1 counterclockwise so that vector 202 coincides with vector 203.
  • As the attitude drive proceeds, the attitude of the cart 7 of the remote machine 1 changes as shown in the second display screen of FIG. 4; that is, the orientation of the cart 7 rotates counterclockwise until the marker 201 is positioned in front of the cart 7. This ends the attitude drive.
  • When the pressing of the mouse continues after the attitude drive ends, the operation calculator 3 starts position driving to move the cart 7 of the remote machine 1 forward, as shown in the third display screen of FIG. 4. The position driving of the cart 7 then continues until the pressing of the mouse, which is the operation device 5, ends.
  • In summary, the operator sets the target attitude by moving the mouse, which is the operation device 5, to the desired position in the image shown on the display screen and pressing the mouse; by continuing to press the mouse, the remote machine 1 is driven to the target attitude as described above. Then, when the remote machine 1 is in the target attitude, if the operator continues to press the mouse, position driving of the remote machine 1 begins.
  • Thus, the operator does not need to wear a head-mounted display, and stereoscopic perception is not required. This reduces the operational burden on the operator.
  • The operator can change the attitude and position of the remote machine 1 simply by moving and pressing the mouse, which is the operation device 5, while looking at the display screen of the image presentation device 4, allowing remote operation with simple operations.
  • In the above example, the operator continues to press the mouse, which is the operation device 5, to instruct the remote machine 1 to continue operating. That is, pressing the mouse after the remote machine 1 has been driven to the target attitude means the start of position driving, and releasing the mouse means the end of position driving.
  • However, the specific method of specifying the operation of the remote machine 1 is not limited to the above example. For example, the following operation may be performed. In the first display screen of FIG. 4, the operator moves the marker 201 to a position corresponding to the target attitude, and then clicks or double-clicks the mouse, which is the operation device 5, to confirm the target attitude.
  • In this case, the operation calculator 3 stops the remote machine 1 when the attitude of the remote machine 1 reaches the target attitude.
  • Thereafter, the operator may press the mouse, which is the operation device 5, to start position driving, and release the mouse to end the position driving.
  • Alternatively, the operator may click or double-click the mouse, which is the operation device 5, to start position driving, and click or double-click again to end it.
  • The operation device 5 may instead be a touchpad, in which case, for example, pressing the touchpad may be used instead of pressing the mouse, and tapping or double-tapping the touchpad may be used instead of clicking or double-clicking the mouse.
  • The operation device 5 may also be a touch panel integrated with the image presentation device 4, in which case the operator determines the target attitude by touching the position on the touch panel that corresponds to the target attitude on the display screen; for example, pressing the touch panel may be used instead of pressing the mouse, and tapping or double-tapping the touch panel may be used instead of clicking or double-clicking the mouse.
  • The operation device 5 may also be a keyboard.
  • In that case, specific keys such as the arrow keys may be assigned to up, down, left, and right movement of the marker 201, and pressing a specific key such as the Enter key may be treated the same as pressing the mouse.
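  • The device variations above (mouse, touchpad, touch panel, keyboard) all map onto the same abstract inputs: moving the marker 201, confirming a target attitude, and holding to drive. A minimal sketch of such a mapping is shown below; the device and gesture names are illustrative assumptions, not identifiers from this disclosure.

```python
# Hypothetical dispatch table translating device-specific gestures into
# the abstract operations of the operation device 5 described above.
GESTURE_TO_ACTION = {
    ("mouse", "press"): "hold_to_drive",      # press and hold drives toward the target
    ("mouse", "click"): "confirm_target",     # click/double-click variant
    ("touchpad", "press"): "hold_to_drive",
    ("touchpad", "tap"): "confirm_target",
    ("touch_panel", "press"): "hold_to_drive",
    ("touch_panel", "tap"): "confirm_target",
    ("keyboard", "enter"): "hold_to_drive",   # Enter treated like a mouse press
    ("keyboard", "arrow"): "move_marker",     # arrow keys move the marker 201
}

def to_action(device, gesture):
    """Return the abstract action for a device gesture, or None if unmapped."""
    return GESTURE_TO_ACTION.get((device, gesture))
```

With such a table, the rest of the operation calculator can be written against the abstract actions only, independent of the concrete operation device.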
  • FIG. 5 is a flowchart showing an example of the operation of the operation calculator 3 of this embodiment.
  • FIG. 5 shows an example in which the operator remotely operates the remote machine 1 by starting to press the operation device 5 at a position indicating the target attitude and continuing to press it while the remote machine 1 is operating.
  • The operation calculator 3 first judges whether or not a target setting has been input (step S1).
  • In detail, the input discrimination unit 31 judges whether or not the operation information received from the operation device 5 is information indicating the setting of a target attitude.
  • The information indicating the setting of a target attitude is, for example, information input from the operation device 5 when the operation device 5 is pressed, and includes information indicating the position of the operation device 5 (the position on the display screen).
  • The input discrimination unit 31 notifies the position drive unit 36 of operation information indicating that the input of a movement instruction is continuing while the operation device 5 is being pressed. If there is no target setting input (step S1: No), the operation calculator 3 repeats step S1.
  • If there is a target setting input (step S1: Yes), a target attitude is set (step S2).
  • In detail, the input discrimination unit 31 outputs the operation information to the target attitude setting unit 33, and the target attitude setting unit 33 sets a target attitude using the image acquired from the display information generation unit 32 and the operation information indicating the setting of the target attitude acquired from the input discrimination unit 31, and notifies the attitude drive unit 34 and the target attitude determination unit 35 of the set target attitude.
  • For example, the target attitude setting unit 33 obtains the vector 202 illustrated in the first row of FIG. 4 using the image, obtains the vector 203 using the operation information and the image, and calculates the difference between the vectors 202 and 203 to obtain, as the target attitude, the direction and angle by which the remote machine 1 is to be rotated. In this way, the target attitude is calculated as a relative value from the current attitude, that is, as the amount of change from the current attitude.
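  • The difference between vector 202 and vector 203 described above can be expressed as a signed rotation angle in the ground plane. The following sketch, with assumed 2-D vector inputs, computes that angle with atan2 and normalizes it to (-pi, pi] so the machine turns through the smaller angle; the function and argument names are illustrative, not identifiers from this disclosure.

```python
import math

def target_attitude_delta(front_vec, target_vec):
    """Signed rotation (radians) that aligns front_vec with target_vec.

    Positive values mean counterclockwise rotation. Both inputs are
    2-D (x, y) vectors in the ground plane.
    """
    a_front = math.atan2(front_vec[1], front_vec[0])
    a_target = math.atan2(target_vec[1], target_vec[0])
    delta = a_target - a_front
    # Normalize to (-pi, pi] so the rotation takes the short way around.
    while delta <= -math.pi:
        delta += 2 * math.pi
    while delta > math.pi:
        delta -= 2 * math.pi
    return delta
```

The sign of the result gives the rotation direction (counterclockwise in the FIG. 4 example) and its magnitude the remaining amount of change checked against the threshold in step S4.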
  • Next, the operation calculator 3 performs attitude driving (step S3).
  • In detail, the attitude drive unit 34 generates a control signal for driving the attitude of the remote machine 1 based on the target attitude notified by the target attitude setting unit 33, and transmits the generated control signal to the remote machine 1.
  • For example, the attitude drive unit 34 generates a control signal that changes the orientation at a constant speed so as to approach the target attitude received from the target attitude setting unit 33.
  • Next, the operation calculator 3 determines whether the target attitude has been reached (step S4).
  • In detail, the target attitude determination unit 35 determines whether the remote machine 1 has reached the target attitude based on the target attitude notified by the target attitude setting unit 33. As described above, when the target attitude is expressed as a relative value from the current attitude, for example, the target attitude determination unit 35 determines that the remote machine 1 has reached the target attitude when the remaining amount of change is equal to or less than a threshold value.
  • The threshold value is determined in advance, and may be set to 0, for example, or to a value corresponding to an error range within which the remaining amount of change in attitude can be considered to be 0.
  • If the target attitude has been reached (step S4: Yes), in step S5 the operation calculator 3 determines whether the input of a movement instruction is continuing. In detail, if it is determined that the target attitude has been reached, the target attitude determination unit 35 notifies the position drive unit 36 of this fact; upon receiving this notification, the position drive unit 36 determines Yes in step S5 if it has received, from the input discrimination unit 31, operation information indicating that the movement instruction is continuing. If it is determined that the input of a movement instruction is continuing (step S5: Yes), the operation calculator 3 performs position driving (step S6). In detail, the position drive unit 36 generates a control signal for moving the remote machine 1 forward, and transmits the generated control signal to the remote machine 1.
  • If it is determined in step S4 that the target attitude has not been reached (step S4: No), the operation calculator 3 repeats the process from step S3. If it is determined in step S5 that the input of the movement instruction is not continuing (step S5: No), the operation calculator 3 ends the process.
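  • The flow of steps S1 to S6 can be sketched as an event loop. The callables below (get_event, drive_attitude_step, drive_position_step, attitude_error) are hypothetical stand-ins for the operation device 5 and the drive units, not interfaces defined in this disclosure.

```python
def remote_control_loop(get_event, drive_attitude_step, drive_position_step,
                        attitude_error, threshold=0.01):
    """Sketch of the FIG. 5 flow: wait for a press that sets the target
    attitude (S1/S2), rotate until the residual error is within the
    threshold (S3/S4), then translate while the press continues (S5/S6).
    """
    # S1: wait for a target-setting input (press at the marker position).
    event = get_event()
    while event.kind != "press":
        event = get_event()
    target = event.position  # S2: the press position defines the target attitude.
    # S3/S4: attitude drive until the remaining change is within the threshold.
    while attitude_error(target) > threshold:
        drive_attitude_step(target)
    # S5/S6: position drive while the movement instruction (hold) continues.
    while get_event().kind == "hold":
        drive_position_step()
```

The click-based variant described above would replace the final loop's hold check with a check for a start/stop click, leaving the rest of the flow unchanged.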
  • As described above, FIG. 5 shows an example in which the operator remotely operates the remote machine 1 by starting to press the operation device 5 at a position indicating the target attitude and continuing to press it while the remote machine 1 is operating.
  • When other operation methods are used, the operation is generally similar to FIG. 5, changed appropriately depending on the method. For example, when the operation device 5 is clicked at a position indicating the target attitude, and position driving is started by a further click after the remote machine 1 reaches the target attitude, the operation calculator 3 judges Yes in step S5 when it receives operation information indicating that a click was made after the remote machine 1 reached the target attitude.
  • In the above example, position driving is performed after attitude driving, but this is not a limitation, and operations related to attitude driving and position driving may be performed separately.
  • For example, the operator may use the operation device 5 to perform a predetermined operation such as double-clicking or double-tapping to drive the position of the remote machine 1, while attitude driving may be performed by specifying a position using the operation device 5 and continuing to press it (a long press).
  • FIG. 6 is a diagram showing an example of the configuration of a computer system that realizes the operation calculator 3 of this embodiment.
  • As shown in FIG. 6, this computer system includes a control unit 101, an input unit 102, a storage unit 103, a display unit 104, a communication unit 105, and an output unit 106, which are connected via a system bus 107.
  • The control unit 101 and the storage unit 103 form a processing circuit.
  • The control unit 101 is, for example, a processor such as a CPU (Central Processing Unit), and executes a program in which the processing of the operation calculator 3 of this embodiment is described. Note that a part of the control unit 101 may be realized by dedicated hardware such as a GPU (Graphics Processing Unit) or an FPGA (Field-Programmable Gate Array).
  • The input unit 102 is, for example, a button, a keyboard, a mouse, or a touchpad.
  • The storage unit 103 includes various memories such as a RAM (Random Access Memory) and a ROM (Read Only Memory) and a storage device such as a hard disk, and stores the program to be executed by the control unit 101, necessary data obtained in the course of processing, and the like.
  • The storage unit 103 is also used as a temporary storage area for the program.
  • the display unit 104 is, for example, a display.
  • the display unit 104 and the input unit 102 may be integrated and realized by a touch panel, etc.
  • the communication unit 105 is a receiver and a transmitter that perform communication processing.
  • the output unit 106 is a speaker or the like.
  • FIG. 6 is an example, and the configuration of the computer system is not limited to the example of FIG. 6.
  • the computer system that realizes the operation calculator 3 does not need to include the display unit 104 and the output unit 106.
  • In this computer system, a computer program is installed in the storage unit 103 from a CD-ROM or DVD-ROM set in a CD (Compact Disc)-ROM drive or DVD (Digital Versatile Disc)-ROM drive (not shown). When the program is executed, the program read from the storage unit 103 is stored in the main storage area of the storage unit 103. In this state, the control unit 101 executes the processing as the operation calculator 3 of this embodiment according to the program stored in the storage unit 103.
  • In the above description, the program describing the processing in the operation calculator 3 is provided on a CD-ROM or DVD-ROM as a recording medium, but this is not limiting.
  • For example, a program provided via a transmission medium such as the Internet may be used.
  • The operation device 5 shown in FIG. 1 may be the input unit 102 in the computer system that realizes the operation calculator 3 shown in FIG. 1, or may be provided separately from the input unit 102.
  • The image presentation device 4 shown in FIG. 1 may be the display unit 104 in the computer system that realizes the operation calculator 3 shown in FIG. 1, or may be provided separately from the display unit 104.
  • The input discrimination unit 31, display information generation unit 32, target posture setting unit 33, posture drive unit 34, target posture determination unit 35, and position drive unit 36 shown in FIG. 1 are realized by the control unit 101 shown in FIG. 6 executing a computer program stored in the storage unit 103 shown in FIG. 6.
  • The storage unit 103 shown in FIG. 6 is also used to realize the input discrimination unit 31, display information generation unit 32, target posture setting unit 33, posture drive unit 34, target posture determination unit 35, and position drive unit 36 shown in FIG. 1.
  • As described above, the remote operation system 100 of this embodiment comprises the remote machine 1, the image presentation device 4 that presents the image from the camera 2 to the operator, the operation device 5, and the operation calculator 3.
  • The operation calculator 3 drives the attitude of the remote machine 1 based on the target attitude specified as a position in the target image by the operation device 5, and moves the position of the remote machine 1 based on instructions to start and end position drive input using the operation device 5. This reduces the operational burden on the operator.
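The patent does not prescribe how a position clicked in the presented image is converted into a target attitude. As one illustrative possibility only, the following sketch assumes a simple pinhole camera model and computes the yaw and pitch angles that would turn the remote machine toward the clicked pixel; the function name, parameters, and camera model are all assumptions:

```python
import math

def attitude_command(px, py, width, height, hfov_deg):
    """Return (yaw, pitch) in degrees that would turn the remote machine
    toward the clicked pixel (px, py), assuming a pinhole camera whose
    horizontal field of view is hfov_deg and whose optical axis points at
    the image center."""
    # Focal length in pixels derived from the horizontal field of view.
    f = (width / 2) / math.tan(math.radians(hfov_deg) / 2)
    yaw = math.degrees(math.atan2(px - width / 2, f))    # right of center -> positive yaw
    pitch = math.degrees(math.atan2(height / 2 - py, f)) # above center -> positive pitch
    return yaw, pitch
```

For example, clicking the image center would yield a zero attitude command, while clicking the right edge of a 90-degree-field image would yield a yaw of about 45 degrees.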
  • Embodiment 2. FIG. 7 is a diagram showing a configuration example of a remote control system according to the second embodiment.
  • A remote control system 100a according to the second embodiment is similar to the remote control system 100 according to the first embodiment, except that it includes an operation calculator 3a instead of the operation calculator 3 and that a precision operation device 8a is added.
  • Components having the same functions as those in the first embodiment are given the same reference numerals as those in the first embodiment, and duplicated explanations will be omitted. Below, differences from the first embodiment will be mainly explained.
  • The precision operation device 8a is a device that allows more precise operation than the operation device 5, and is, for example, a joystick, an operation device that combines a joystick and a dial, or a mouse that allows precision operation.
  • When a joystick is used as the precision operation device 8a and the remote machine 1 has multiple drive axes, the drive axis to be operated may be switched by a button on the joystick or by an operation on the operation device 5.
  • The precision operation device 8a may also be the same as the operation device 5 in terms of hardware. For example, a mouse that can be switched between a normal mode and a mode allowing precision operation may be set to the normal mode when used as the operation device 5, and to the mode allowing precision operation when used as the precision operation device 8a.
  • Alternatively, the operation calculator 3a may enlarge the image displayed on the image presentation device 4 in response to an operation of the operation device 5, thereby enabling precision operation by the operator.
  • The enlarging operation may be an operation generally used for enlarging a screen; for example, when the operation device 5 is a touch panel, a pinch-out may be used as the enlarging operation.
  • The enlarging operation is not limited to this; a button for enlargement or the like may be displayed on the image presentation device 4, and the position to be enlarged and displayed may be moved using the operation device 5.
  • The operation calculator 3a is similar to the operation calculator 3 of the first embodiment, except that an input switch 8b is added and an input discrimination unit 31a is provided instead of the input discrimination unit 31.
  • In FIG. 7, the input switch 8b is provided within the operation calculator 3a, but the input switch 8b may be provided separately from the operation calculator 3a.
  • The input switch 8b switches the operation target of the operator between the operation device 5 and the precision operation device 8a.
  • That is, the input switch 8b has a function of switching between normal operation (normal operation mode) and precision operation (precision operation mode).
  • In the normal operation mode, the input switch 8b outputs the operation information input from the operation device 5 to the input discrimination unit 31a.
  • When the input discrimination unit 31a receives operation information input from the operation device 5 via the input switch 8b, it performs the same operation as in the first embodiment.
  • In the precision operation mode, the input switch 8b outputs the operation information input from the precision operation device 8a to the input discrimination unit 31a.
  • In this case, the input discrimination unit 31a outputs the operation information to the attitude drive unit 34 or the position drive unit 36 depending on the content of the operation information.
  • The input switch 8b may switch between normal operation and precision operation depending on which of the operation device 5 and the precision operation device 8a the input comes from, or may switch between normal operation and precision operation when a specific operation is performed on the operation device 5.
  • Alternatively, buttons for switching the operation may be displayed on the image presentation device 4, and the input switch 8b may switch between normal operation and precision operation when one of these buttons is pressed using the operation device 5, or the input switch 8b may detect the operator's gesture and switch between normal operation and precision operation according to the gesture.
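The mode-switching behavior of the input switch 8b described above can be sketched as follows. This is an illustrative model rather than the patent's implementation, and all names are hypothetical:

```python
class InputSwitch:
    """Sketch of the input switch 8b: forwards operation information to the
    input discrimination unit from the device that matches the current mode.
    All names here are hypothetical."""

    def __init__(self):
        self.mode = "normal"  # "normal" or "precision"

    def toggle(self):
        """Switch between the normal operation mode and the precision operation mode."""
        self.mode = "precision" if self.mode == "normal" else "normal"

    def route(self, source, operation_info):
        """Forward operation_info only when it comes from the active device."""
        active = "operation_device" if self.mode == "normal" else "precision_device"
        if source == active:
            return ("input_discrimination", operation_info)
        return None  # input from the inactive device is not forwarded
```

Switching could equally be triggered by a button press or a detected gesture; only the `toggle` entry point would change.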
  • The attitude drive unit 34 may have a function as a manipulator drive unit that drives each joint and the hand 6a of the manipulator 6 of the remote machine 1.
  • In this case, the attitude drive unit 34 may generate a control signal for controlling the manipulator 6 in response to the operation of the precision operation device 8a, and transmit the generated control signal to the remote machine 1.
  • Alternatively, the manipulator drive unit may be provided separately from the attitude drive unit 34.
  • The operation method described in the first embodiment can reduce the operational burden on the operator, but since the operation is performed using the operation device 5 such as a mouse, it may be difficult to operate the remote machine 1 precisely. Also, for example, when the remote machine 1 has the manipulator 6 and approaches an object to be operated by the manipulator 6, it may be easier for the operator to perform precise remote operation by using an operation that directly corresponds to a drive axis of the remote machine 1, such as an operation using a joystick, rather than the operation method described in the first embodiment.
  • For this reason, in this embodiment, the input switch 8b that switches between normal operation and precision operation is provided, allowing the operator to operate the remote machine 1 more appropriately.
  • In the precision operation mode, a screen showing an operation method corresponding to the precision operation device 8a may be displayed on the image presentation device 4.
  • For example, the display screen of one image presentation device 4 may be divided to display the image captured by the camera 2 and the screen showing the operation method separately, or multiple image presentation devices 4 may be provided and the image captured by the camera 2 and the screen showing the operation method may be displayed on different image presentation devices 4.
  • FIGS. 8 to 10 are diagrams showing examples of a screen showing the operation method in this embodiment.
  • FIGS. 8 to 10 are display examples of the operation method when a joystick is used as the precision operation device 8a. In FIGS. 8 to 10, the joystick operations in the Arm Joystick Mode, the Wrist Mode, and the Drive Joystick Mode and the corresponding operations of the remote machine 1 are displayed together with an image showing the operation of the remote machine 1.
  • For example, the display information generating unit 32 receives an input regarding the mode setting from the input discrimination unit 31a, and controls the image presentation device 4 to display, for example, one of FIGS. 8 to 10 according to the current mode.
  • FIGS. 8 to 10 are examples, and the settable modes and the correspondence between each joystick operation and the operation of the remote machine 1 are not limited to the examples shown in FIGS. 8 to 10.
  • FIGS. 8 to 10 show examples in which a joystick is used as the precision operation device 8a, but even when a precision operation device 8a other than a joystick is used, the same effect as when a joystick is used can be obtained by displaying an operation method corresponding to that precision operation device 8a.
  • An operation method corresponding to the operation device 5 may be displayed in the same manner for normal operation using the operation device 5.
  • FIG. 11 is a diagram showing an example in which a terminal device is used as the image presentation device 4 of this embodiment.
  • When a terminal device is used as the image presentation device 4, an image corresponding to the location where the terminal device is held is displayed on the terminal device.
  • The image displayed on the image presentation device 4, which is a terminal device, may also be enlarged and reduced.
  • Alternatively, both a monitor as shown in FIG. 2 and a terminal device may be used as the image presentation devices 4.
  • In this case, an image continuous with the image displayed on the image presentation device 4 that is a monitor is displayed according to the position of the image presentation device 4 that is a terminal device, thereby realizing a virtual screen wider than the image presentation device 4 that is a monitor.
  • FIG. 12 is a diagram showing an example of a method of displaying an image in this embodiment.
  • The upper part of FIG. 12 shows an example of the display screen when the image captured by the camera 2-1 is enlarged.
  • The upper part of FIG. 12 is, for example, an enlargement of the upper part of the third display screen of FIG. 4 described in Embodiment 1; although the hand 6a is enlarged, the dolly 7 displayed on the third display screen of FIG. 4 is no longer visible. This makes it difficult for the operator to know the orientation of the dolly 7 and to operate it.
  • In such a case, a three-dimensional figure 308 shown by a dashed line in the lower part of FIG. 12 may be superimposed on the image. That is, the display information generating unit 32 may generate, for example by using CG (Computer Graphics), a three-dimensional figure 308 that imitates the current state of the remote machine 1 so that the orientation of the dolly 7 can be known, generate display data in which the three-dimensional figure 308 is superimposed on the image captured by the camera 2-1, and display the display data on the image presentation device 4.
  • The display method of the three-dimensional figure 308 is not limited to the example shown in FIG. 12, and the image to be displayed is not limited to the image of the camera 2-1.
  • Instead of the three-dimensional figure 308, a two-dimensional figure, symbol, character, or the like indicating the orientation of the dolly 7 may be displayed, or a symbol, character, or the like indicating the orientation of the dolly 7 may be displayed together with the three-dimensional figure 308.
  • The operation calculator 3a of this embodiment is realized by the computer system illustrated in FIG. 6, similar to the operation calculator 3 of Embodiment 1.
  • Embodiment 3. FIG. 13 is a diagram showing a configuration example of a remote control system according to the third embodiment.
  • A remote control system 100b according to the present embodiment is similar to the remote control system 100a according to the second embodiment, except that it includes an operation calculator 3b instead of the operation calculator 3a and that multiple cameras 2 are provided.
  • Components having the same functions as those in the second embodiment are given the same reference numerals as those in the second embodiment, and duplicated explanations are omitted. Below, differences from the second embodiment will be mainly explained.
  • In Embodiments 1 and 2, one or more cameras 2 are provided, but in the present embodiment, two or more cameras 2 are provided.
  • In the example shown in FIG. 13, cameras 2-1 and 2-2 are provided as the cameras 2.
  • The operation calculator 3b is the same as the operation calculator 3a of the second embodiment, except that a camera switching unit 37, which is a camera switch, is added.
  • The camera switching unit 37 switches, among the multiple cameras 2, the camera from which the image to be displayed on the image presentation device 4 is obtained. That is, the camera switching unit 37 switches the camera 2 corresponding to the presented image by selecting the presented image to be presented on the image presentation device 4 from the multiple images captured by each of the multiple cameras 2.
  • The camera switching unit 37 notifies the display information generating unit 32 of information indicating the camera 2 from which the image to be displayed on the image presentation device 4 is obtained, that is, information indicating the camera 2 selected as the display target.
  • The display information generating unit 32 generates display data using the image of the notified camera 2.
  • The camera switching unit 37 may switch the image to be displayed on the image presentation device 4 upon receiving operation information instructing camera switching from the operation device 5 or the precision operation device 8a via the input discrimination unit 31a, or may switch the image to be displayed on the image presentation device 4 by detecting the operator's gesture.
  • Alternatively, a button for switching cameras may be displayed on the image presentation device 4, and the operator may press the button using the operation device 5 or the precision operation device 8a to switch the camera 2.
  • The image displayed on the image presentation device 4 may also be switched according to a preset condition and the image currently displayed on the image presentation device 4.
  • FIG. 14 is a diagram showing an example of switching of the camera 2 in this embodiment.
  • FIG. 14 shows an example in which the camera 2-1 installed on the dolly 7 and the camera 2-2 installed near the hand 6a of the manipulator 6 are used, as described in FIG. 2 of the first embodiment, and the image captured by the camera 2-1 is displayed on the display screen 301, and the image captured by the camera 2-2 is displayed on the display screen 302.
  • The camera switching unit 37 may switch the image displayed as the display screen on the image presentation device 4 according to the operation information input from the operation device 5 or the precision operation device 8a as described above, may switch by detecting a gesture of the operator, or may switch according to a preset condition and the image displayed on the image presentation device 4.
  • The preset condition may be, for example, but is not limited to, a condition in which the image of the camera 2-2 is displayed when the distance between the remote machine 1 and the target object is equal to or less than a specified distance, and the image of the camera 2-1 is displayed when the distance between the remote machine 1 and the target object is greater than the specified distance.
  • The target object is, for example, determined in advance by the operator using the operation device 5 or the precision operation device 8a.
  • As a result, the operator can perform remote operation while watching the image captured by the camera 2-1 until the remote machine 1 approaches the target object to a certain extent, and, once the remote machine 1 has approached the target object beyond that extent, while watching the image captured by the camera 2-2.
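The example preset condition described above, selecting between the hand camera and the dolly camera by distance, can be sketched as a simple selection function; the threshold value and the names are assumptions for illustration:

```python
def select_camera(distance_to_object, threshold=0.5):
    """Return the camera whose image should be presented, following the
    example condition above: the hand camera 2-2 when the remote machine is
    at or below the specified distance from the target object, the dolly
    camera 2-1 otherwise. The threshold value (meters) is an assumption."""
    return "camera_2_2" if distance_to_object <= threshold else "camera_2_1"
```

The camera switching unit 37 could evaluate such a condition each time new distance information becomes available and notify the display information generating unit 32 only when the selection changes.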
  • FIG. 15 is a diagram showing an example of a camera provided inside the hand 6a of the manipulator 6 in this embodiment.
  • In FIG. 15, the manipulator 6 and the grasped object 9 to be grasped by the hand 6a are shown on the left, and images displayed on the image presentation device 4 are shown on the right.
  • The display screen 310 shows the screen on which the image captured by the camera 2-2 is displayed, and the display screen 320 shows the screen on which the image captured by the camera 2-3 is displayed.
  • The camera 2-3 is provided inside the hand 6a, and when the hand 6a approaches the grasped object 9, the image displayed on the image presentation device 4 is switched from the image captured by the camera 2-2 to the image captured by the camera 2-3.
  • This allows the operator to perform operations while understanding the state of the grasped object 9 in detail. For example, if either the grasped object 9 or the hand 6a is a deformable structure, the operator can check the degree of deformation of the deformable structure and adjust the gripping force of the hand 6a.
  • The gripping force of the hand 6a may be adjusted using the operation device 5, the precision operation device 8a, or other operation means.
  • In the above example, the image displayed on the image presentation device 4 is an image captured by one camera 2, but the image presentation device 4 may display images captured by multiple cameras 2 as separate display screens.
  • In this case, the display information generation unit 32 may display the image captured by the selected camera 2 at high luminance and reduce the luminance of the images of the unselected cameras 2.
  • The display information generation unit 32 may also display the display screen of the image of the selected camera 2 in the center and display the images of the unselected cameras 2 at the edge.
  • The display information generation unit 32 may also generate the display data so that the size of the images of the unselected cameras 2 is smaller than the size of the image of the selected camera 2.
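The display-data generation just described, with the selected camera centered at full size and brightness and the unselected cameras smaller, dimmer, and at the edge, might be sketched as follows; the specific scale and brightness values are assumptions:

```python
def layout_images(camera_ids, selected):
    """Sketch of display-data generation for multiple camera images: the
    selected camera is shown centered at full size and brightness, the
    others smaller, dimmer, and at the edge. Scale and brightness values
    are assumptions for illustration."""
    layout = []
    for cam in camera_ids:
        if cam == selected:
            layout.append({"camera": cam, "position": "center",
                           "scale": 1.0, "brightness": 1.0})
        else:
            layout.append({"camera": cam, "position": "edge",
                           "scale": 0.4, "brightness": 0.5})
    return layout
```

A renderer consuming such a layout would redraw whenever the camera switching unit changes the selection.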
  • The operation calculator 3b of this embodiment is realized by the computer system illustrated in FIG. 6, similar to the operation calculator 3 of Embodiment 1.
  • Although the camera switching unit 37 is provided within the operation calculator 3b in FIG. 13, a camera switcher having the functions of the camera switching unit 37 may be provided separately from the operation calculator 3b.
  • FIG. 13 shows an example in which the camera switching function is added to the operation calculator 3a of Embodiment 2, but the remote control system 100 of Embodiment 1 may be provided with multiple cameras 2, and the camera switching unit 37 may be added to the operation calculator 3 to add the camera switching function.
  • FIG. 16 is a diagram showing an example of a camera 2 in this embodiment that photographs the manipulator 6 from the side.
  • The camera 2-4, which is one of the multiple cameras 2, photographs the manipulator 6 from the side.
  • A drive mechanism is provided for the camera 2-4, and the drive mechanism drives the camera 2-4 so as to track, for example, the tip of the hand 6a.
  • FIG. 17 is a diagram showing an example of switching of images in this embodiment. In the example shown in FIG. 17, for example, a display screen 304 on which an image photographed by camera 2-4 is displayed and a display screen 302 on which an image photographed by camera 2-2 is displayed are switched by a gesture of the operator.
  • Although FIG. 17 shows an example of switching between the image captured by the camera 2-2 and the image captured by the camera 2-4, it is also possible to switch between the image captured by the camera 2-1 and the image captured by the camera 2-4 in a similar manner.
  • FIG. 18 is a diagram showing an example of superimposing an image captured from the side in this embodiment.
  • The upper left diagram in FIG. 18 shows an example in which an image obtained by capturing the tip of the hand 6a and the grasped object 9 with the camera 2-2 is displayed as the display screen 302.
  • A display screen 304 showing the image from the camera 2-4 is superimposed on the display screen 302.
  • The display screen 304 may simply be displayed on the display screen 302 in this way, but if left as is, the entire screen looks unnatural. For this reason, a virtual mirror 307 may be displayed as shown in the lower left diagram in FIG. 18.
  • That is, the display information generating unit 32 generates a composite image showing the mirror 307 and the image displayed on the mirror 307, assuming that the image from the camera 2-4 is reflected in the virtual mirror 307, generates display data by superimposing the composite image on the image captured by the camera 2-1 or the camera 2-2, and displays the display data on the image presentation device 4. This allows the operator to recognize the image from the camera 2-4 as a natural image.
  • FIG. 19 is a diagram showing an example of a pressure detection and pressure display method in this embodiment.
  • In the example shown in FIG. 19, when the operator moves a finger 601 on the display unit 602, which is the image presentation device 4 or another monitor, the tip of the hand 6a provided with a pressure detection sensor moves in accordance with the movement of the finger 601 and traces the surface of an object such as the grasped object 9.
  • The display unit 602 may have a function as an input means such as a touch panel, or the movement of the finger 601 may be detected by other means, and the tip of the hand 6a may be driven in accordance with the detected movement.
  • Pressure information indicating the pressure detected by the pressure detection sensor is input to the display information generation unit 32 in the same way as the image of the camera 2.
  • The display information generation unit 32 may generate display data so as to change at least one of, for example, the color density, the transparency, and the composition ratio of the displayed image based on the pressure information, and display the display data on the image presentation device 4 or another monitor to visually present the force haptics to the operator.
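As an illustration of presenting force haptics visually, the following sketch maps a detected pressure to a color density; the linear rule and the maximum pressure are assumptions, since the text only requires that at least one of color density, transparency, or composition ratio change with pressure:

```python
def pressure_to_color_density(pressure, p_max=10.0):
    """Map a detected pressure to a color density in [0, 1] with a simple
    linear rule clipped at both ends. The linear mapping and the maximum
    pressure p_max are assumptions for illustration."""
    return max(0.0, min(1.0, pressure / p_max))
```

The same value could equally drive transparency or the composition ratio of an overlay instead of color density.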
  • The presentation of the haptic sensation is not limited to display on the image presentation device 4 or another monitor, and may be achieved by projecting it onto the operator's fingertip using projection mapping.
  • When the operation calculator 3c receives an operation instructing it to trace the surface of an object such as the grasped object 9, it moves the tip of the hand 6a so as to trace the surface of the object, and outputs pressure information indicating the pressure detected by the pressure detection sensor and movement information indicating the amount of movement of the tip of the hand 6a to the haptic presentation device 607.
  • For example, the operation calculator 3c may control the horizontal position of the tip of the hand 6a in accordance with the movement of the finger 601, thereby moving the tip of the hand 6a so as to trace the surface.
  • The haptic presentation device 607 includes, for example, a contact unit 604, a drive unit 605 capable of driving the contact unit 604, and a support unit 606 that supports the operator's finger from above.
  • The haptic presentation device 607 controls the drive unit 605 based on the movement information to move the horizontal position of the contact unit 604, and moves the contact unit 604 up and down based on the pressure information.
  • The finger 601 moves horizontally together with the contact unit 604, and also moves up and down according to the pressure information.
  • This allows haptic feedback to be presented, so that the operator can perceive the surface condition.
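The control of the contact unit 604 described above, where the horizontal position follows the movement information and the vertical position follows the pressure information, could be sketched as follows; the proportional gain and the sign convention are assumptions:

```python
def contact_unit_targets(movement_info, pressure_info, gain=0.5):
    """Sketch of the drive targets for the contact unit 604: it follows the
    hand-tip movement horizontally and is displaced downward in proportion
    to the detected pressure. The gain and the sign convention are
    assumptions for illustration."""
    x, y = movement_info       # horizontal movement of the hand tip
    z = -gain * pressure_info  # higher detected pressure -> larger downward stroke
    return x, y, z
```

The drive unit 605 would then servo the contact unit 604 toward these targets each control cycle.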
  • Alternatively, the unevenness of the surface of the object may be detected by detecting the vertical position of the tip of the hand 6a, and the operation calculator 3c may move the contact unit 604 up and down according to the detected unevenness.
  • A membrane may also be provided on the portion where the operator places the finger 601.
  • The membrane can provide the operator with a sliding sensation when tracing a surface.
  • The operation calculator 3c can also reproduce and convey to the operator the unevenness detected at the tip of the hand 6a by moving the membrane up and down in response to the pressure information.
  • Embodiment 4. FIG. 21 is a diagram showing a configuration example of a remote control system according to the fourth embodiment.
  • A remote control system 100c according to the present embodiment is similar to the remote control system 100b according to the third embodiment, except that it includes an operation calculator 3c instead of the operation calculator 3b, and a remote machine 1a instead of the remote machine 1.
  • Components having the same functions as those in the third embodiment are given the same reference numerals as those in the third embodiment, and duplicated explanations will be omitted. Below, differences from the third embodiment will be mainly explained.
  • The remote machine 1a is similar to the remote machine 1 of Embodiment 3, except that a hand drive mechanism 13 is added.
  • The hand drive mechanism 13 drives the hand 6a based on a control signal received from the operation calculator 3c.
  • Note that the remote machines 1 of Embodiments 1 to 3 may also be provided with the hand drive mechanism 13, and there are no particular restrictions on the method of operating the hand drive mechanism 13 in Embodiments 1 to 3.
  • The operation calculator 3c is similar to the operation calculator 3b of Embodiment 3, except that a hand angle setting unit 38 is added and an input discrimination unit 31c is provided instead of the input discrimination unit 31a.
  • In this embodiment, the operator can also set the hand tip angle of the hand 6a using the operation device 5. When the input discrimination unit 31c receives operation information indicating the hand tip angle of the hand 6a, it outputs the operation information to the hand angle setting unit 38.
  • The hand angle setting unit 38 uses the operation information received from the input discrimination unit 31c to generate a control signal for controlling the hand 6a of the remote machine 1a, and transmits the generated control signal to the remote machine 1a.
  • FIGS. 22 and 23 are diagrams showing examples of a hand tip angle setting screen in this embodiment.
  • FIGS. 22 and 23 each show a display screen displayed on the image presentation device 4.
  • In the example shown in FIG. 22, angle change buttons 401 to 403 are displayed on the right side of the display screen 301 on which the image captured by the camera 2-1 is displayed.
  • The angle change button 401 is a button for setting the hand tip angle of the hand 6a to the horizontal direction, the angle change button 402 is a button for setting the hand tip angle of the hand 6a to 45 deg (the angle between the horizontal direction and the hand 6a is 45 deg), and the angle change button 403 is a button for setting the hand tip angle of the hand 6a to the vertical direction.
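A minimal sketch of how the hand angle setting unit 38 might map the angle change buttons 401 to 403 to hand tip angle commands follows; the mapping table and the function are hypothetical:

```python
# Hypothetical mapping from the angle change buttons 401-403 to hand tip
# angle commands, in degrees from the horizontal direction.
BUTTON_TO_ANGLE_DEG = {401: 0.0, 402: 45.0, 403: 90.0}

def hand_tip_angle_command(button_id):
    """Return the target hand tip angle for a pressed angle change button."""
    if button_id not in BUTTON_TO_ANGLE_DEG:
        raise ValueError("unknown angle change button: %d" % button_id)
    return BUTTON_TO_ANGLE_DEG[button_id]
```

The returned angle would be embedded in the control signal transmitted to the remote machine 1a.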
  • FIG. 22 shows an example in which the hand tip angle of the hand 6a can be set to three angles: horizontal, 45 deg, and vertical, but the number of settable hand tip angles is not limited to this example. For example, the hand tip angle may be selectable from two angles, or from more than three angles.
  • The display and operation for setting the hand tip angle are not limited to those of FIG. 22.
  • For example, an indicator with a scale may be displayed along with the hand tip angle, and the hand tip angle of the hand 6a may be set by moving a marker on the scale with the operation device 5; the hand tip angle may be set by a gesture; or the hand tip angle of the hand 6a may be set by other methods.
  • FIG. 23 illustrates an operation method different from that of FIG. 22.
  • In FIG. 23, a side view of the hand 6a is displayed as the display screen 304, and the operator rotates the displayed hand 6a using the operation device 5 to set the hand tip angle of the hand 6a. That is, an image including the hand tip of the hand 6a is displayed on the hand tip angle setting screen, and the hand tip angle is set by the operator changing the angle of the hand tip in the image using the operation device 5.
  • For example, the operator uses the operation device 5 to move in the rotation direction 404 while pressing the part of the hand 6a, and ends the pressing when the desired hand tip angle is reached.
  • Alternatively, the target hand tip angle may be set by clicking the target hand tip angle with the marker 201.
  • The image of the hand 6a shown in FIG. 23 may be one created in advance by simulating the remote machine 1a, or may be an image captured by a camera 2 that is provided on the manipulator 6 and captures the hand 6a from the side. In the latter case, similar to the operation of the posture and position of the remote machine 1 in Embodiment 1, the operator may start pressing the operation device 5 after moving the marker 201 to the target hand tip angle, and continue pressing the operation device 5 until the target hand tip angle is reached.
  • In the mode in which the attitude and position of the remote machine 1 are operated as described in the first embodiment, the hand tip angle setting screen of the hand 6a may be displayed when the hand 6a is clicked.
  • Similarly, the position of the hand 6a of the manipulator 6 may be settable, and the setting screen for the position of the hand 6a of the manipulator 6 may be displayed when the manipulator 6 is clicked.
  • The operation calculator 3c may grasp the joint angles by attaching an AR marker to each joint of the manipulator 6, may control the position of the hand 6a by constructing a robot model internally and grasping the joint angles, or may control the position of the hand 6a by other methods.
  • The position of the hand 6a of the manipulator 6 may be set by clicking or continuously pressing the operation device 5 at the position of the marker 201 as described in Embodiment 1, or, as illustrated in FIG. 22, a button indicating the position of the hand 6a of the manipulator 6 may be displayed and the position of the hand 6a of the manipulator 6 may be set using the button.
  • the remote machine 1a has a hand 6a whose hand end angle can be set, and the image display device 4 displays a setting screen for setting the hand end angle of the hand 6a.
  • the operator can thus set the hand end angle of the hand 6a with a simple operation.
  • the position of the manipulator 6 and the hand end angle of the hand 6a may also be set using the precision operation device 8a, as described in the second embodiment.
  • the operation calculator 3c of this embodiment is realized by the computer system illustrated in FIG. 6, similar to the operation calculator 3 of embodiment 1.
  • although FIG. 21 shows an example in which a hand angle setting function is added to the operation calculator 3b of embodiment 3, a hand angle setting unit 38 may instead be added to the operation calculator 3a of embodiment 1 or the operation calculator 3b of embodiment 2 to add the hand angle setting function.
  • Embodiment 5: FIG. 24 is a diagram showing a configuration example of a remote control system according to the fifth embodiment.
  • a remote control system 100d according to the fifth embodiment is similar to the remote control system 100c according to the fourth embodiment, except that it includes an operation calculator 3d instead of the operation calculator 3c.
  • Components having the same functions as those in the fourth embodiment are given the same reference numerals as those in the fourth embodiment, and duplicated explanations are omitted. Below, differences from the fourth embodiment will be mainly explained.
  • the operation calculator 3d is similar to the operation calculator 3c of embodiment 4, except that a motion switching unit 39, which is a motion switch, is added, and an input discrimination unit 31d is provided instead of the input discrimination unit 31c.
  • the motion switching unit 39 switches the operation part of the remote machine 1a in conjunction with switching by the camera switching unit 37. For example, the camera switching unit 37 notifies the motion switching unit 39 of information indicating the selected camera 2, and the motion switching unit 39 switches the operation part based on the notified information.
  • depending on the camera 2 selected by the camera switching unit 37, the motion switching unit 39 selects the cart 7, the manipulator 6, or the hand 6a of the remote machine 1a as the operation part.
  • the motion switching unit 39 notifies the input discrimination unit 31d of the selected operation part.
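A minimal sketch of the camera-linked operation-part switching above; the three camera labels ("surround", "arm", "wrist") are illustrative assumptions, not names from this disclosure:

```python
# Hypothetical camera-to-operation-part mapping; the camera labels are
# assumptions, while the part names (cart 7, manipulator 6, hand 6a)
# follow the text.
CAMERA_TO_PART = {
    "surround": "cart",    # wide view -> drive the cart 7
    "arm": "manipulator",  # arm view -> drive the manipulator 6
    "wrist": "hand",       # close-up -> set the hand 6a angle
}

def on_camera_switched(camera):
    """Return the operation part selected in conjunction with the camera."""
    return CAMERA_TO_PART[camera]
```

The motion switching unit would then notify the input discrimination unit of the returned part.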
  • the input discrimination unit 31d outputs the input operation information to the function unit corresponding to the operation part.
  • when the operation part is the cart 7, the input discrimination unit 31d outputs operation information to the target attitude setting unit 33 and the attitude driving unit 34; when the operation part is the hand 6a, it outputs operation information to the hand angle setting unit 38.
  • when the operation part is the manipulator 6, the input discrimination unit 31d outputs operation information to a manipulator driving unit (not shown), or to a target attitude setting unit 33 and an attitude driving unit 34 that function as a manipulator driving unit.
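The routing performed by the input discrimination unit 31d can be sketched as a dispatch table; the handler names mirror the functional units named in the text, but the dispatch structure itself is an assumption:

```python
def route_operation(part, operation_info, handlers):
    """Dispatch operation info to the functional units for the current
    operation part; returns the names of the units that received it."""
    if part == "cart":
        targets = ("target_attitude_setting", "attitude_driving")
    elif part == "hand":
        targets = ("hand_angle_setting",)
    elif part == "manipulator":
        # Fall back to the attitude units acting as a manipulator
        # driving unit when no dedicated one exists, as the text allows.
        targets = (("manipulator_driving",)
                   if "manipulator_driving" in handlers
                   else ("target_attitude_setting", "attitude_driving"))
    else:
        raise ValueError(f"unknown operation part: {part}")
    for name in targets:
        handlers[name](operation_info)
    return targets
```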
  • the motion switching unit 39 may set a gain for the motion of the remote machine 1 when the precision operation device 8a is operated according to the camera 2 selected by the camera switching unit 37.
  • the gain may be determined in advance for each camera 2.
  • the gain may be set instead of switching the operation part, or both the operation part switching and the gain setting may be performed.
  • the motion switching unit 39 outputs the set gain to the target attitude setting unit 33, the attitude driving unit 34, and the hand angle setting unit 38.
  • when the image displayed on the image display device 4 is enlarged or reduced, the camera switching unit 37 may set a gain according to the magnification of the enlargement or reduction.
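A hedged sketch of the gain setting: the per-camera values and the inverse-magnification rule are assumptions, since the text only states that a gain may be predetermined for each camera 2 and set according to the display magnification:

```python
# Hypothetical per-camera gain table; the values and camera labels are
# illustrative assumptions, not figures from this disclosure.
CAMERA_GAINS = {"surround": 1.0, "arm": 0.3, "wrist": 0.1}

def motion_gain(camera, magnification=1.0):
    """Gain applied to remote-machine motion for the precision operation
    device; a zoomed-in view yields a proportionally smaller gain,
    allowing finer movements."""
    return CAMERA_GAINS[camera] / magnification
```

The motion switching unit would output the resulting gain to the target attitude setting, attitude driving, and hand angle setting units.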
  • the operation calculator 3d of this embodiment is realized by the computer system illustrated in FIG. 6, similar to the operation calculator 3 of embodiment 1.
  • a motion switch having the function of the motion switch unit 39 may be provided separately from the operation calculator 3d.
  • although FIG. 24 shows an example in which a motion switching function is added to the operation calculator 3c of embodiment 4, a motion switching function may also be added by adding a motion switching unit 39 to the operation calculator 3a of embodiment 2 or the operation calculator 3b of embodiment 3.

Abstract

A remote operation system (100) according to the present disclosure comprises: a remote machine (1) in which a camera (2) is installed and which is operated remotely; an image presentation device (4) for presenting to an operator, as a presentation image, an image captured by the camera (2); an operation device (5) which is operated by the operator, receives, in response to an operation by the operator, an input of a target attitude of the remote machine (1) as a position in the presentation image, and receives, in response to an operation by the operator, an input of a position movement instruction indicating that the position of the remote machine (1) is to be moved; and an operation calculator (3) for controlling the attitude of the remote machine (1) on the basis of the target attitude received by the operation device (5), and moving the position of the remote machine (1) on the basis of the position movement instruction received by the operation device (5).
PCT/JP2022/040482 2022-10-28 2022-10-28 Remote operation system and remote operation method WO2024089890A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2022/040482 WO2024089890A1 (fr) 2022-10-28 2022-10-28 Remote operation system and remote operation method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2022/040482 WO2024089890A1 (fr) 2022-10-28 2022-10-28 Remote operation system and remote operation method

Publications (1)

Publication Number Publication Date
WO2024089890A1 true WO2024089890A1 (fr) 2024-05-02

Family

ID=90830319

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/040482 WO2024089890A1 (fr) 2022-10-28 2022-10-28 Remote operation system and remote operation method

Country Status (1)

Country Link
WO (1) WO2024089890A1 (fr)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH09201785A (ja) * 1996-01-30 1997-08-05 Shimadzu Corp マニピュレータ
JP2012171024A (ja) * 2011-02-17 2012-09-10 Japan Science & Technology Agency ロボットシステム
JP2018535487A (ja) * 2015-09-15 2018-11-29 エスゼット ディージェイアイ テクノロジー カンパニー リミテッドSz Dji Technology Co.,Ltd Uav経路を計画し制御するシステム及び方法


Similar Documents

Publication Publication Date Title
JP5839220B2 (ja) Information processing apparatus, information processing method, and program
US9798395B2 Electronic control apparatus and method for responsively controlling media content displayed on portable electronic device
JP4100773B2 (ja) Robot remote control method and system
EP2593848B1 (fr) Method and system for interacting with a user interface projected onto a surface
US7535486B2 Display control device, display control method, program, and portable apparatus
US7420547B2 Method and apparatus for matching tactile sensation to the contents of a display
WO2016148072A1 (fr) Computer program and computer system for controlling object manipulation in an immersive virtual space
JP4171561B2 (ja) Rotary encoder
KR101444858B1 (ko) Telepresence apparatus
US20150304615A1 Projection control apparatus and projection control method
JP2010092086A (ja) User input device, digital camera, input control method, and input control program
JP2014026355A (ja) Video display device and video display method
WO2021192491A1 (fr) Remote assistance server, remote assistance system, and remote assistance method
JP2014228702A (ja) Map display control device
KR20120136719A (ko) Method for pointing at and controlling an object on a remote screen using three-dimensional position information of the hands and eyes
JP2012179682A (ja) Mobile robot system, mobile robot control device, and movement control method and movement control program used in the control device
CN113574592A (zh) Electronic device, control method of electronic device, program, and storage medium
WO2024089890A1 (fr) Remote operation system and remote operation method
JP6549066B2 (ja) Computer program and computer system for controlling object manipulation in an immersive virtual space
JP2015194794A (ja) Operating device
TW201913298A (zh) Virtual reality system capable of displaying a real-time image of a physical input device and control method thereof
JPH08129449A (ja) Signal input device
JP2019215769A (ja) Operating device and operating method
JP7536312B2 (ja) Image interface device, image operation device, operation object operation device, operation object operation system, operation object presentation method, and operation object presentation program
JP2005332231A (ja) Pointing method, pointing device, and pointing program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22963537

Country of ref document: EP

Kind code of ref document: A1