WO2024089890A1 - Remote operation system and remote operation method - Google Patents


Info

Publication number
WO2024089890A1
Authority
WO
WIPO (PCT)
Prior art keywords
remote machine
image
camera
operator
attitude
Prior art date
Application number
PCT/JP2022/040482
Other languages
French (fr)
Japanese (ja)
Inventor
Masaki Haruna
Shigeaki Tagashira
Masaki Ogino
Original Assignee
Mitsubishi Electric Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Mitsubishi Electric Corporation
Priority to PCT/JP2022/040482
Publication of WO2024089890A1

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04Q: SELECTING
    • H04Q 9/00: Arrangements in telecontrol or telemetry systems for selectively calling a substation from a main station, in which substation desired apparatus is selected for applying a control signal thereto or for obtaining measured values therefrom

Definitions

  • This disclosure relates to a remote operation system and a remote operation method.
  • in a remote operation system that operates machinery remotely, for example, a combination of a head-mounted display and an operation interface that detects the operator's gestures is used.
  • Patent Document 1 discloses a technology that uses a fisheye stereo camera to present a wide-field-of-view image to a remote operator without the need to drive the camera.
  • however, Patent Document 1 requires the operator to wear a head-mounted display and have stereoscopic perception, and the operator must operate a joystick while sequentially checking the surroundings using stereoscopic perception, which places a high operational burden on the operator.
  • the present disclosure has been made in consideration of the above, and aims to provide a remote operation system that can reduce the operational burden on the operator.
  • the remote operation system disclosed herein comprises a remote machine equipped with a camera and remotely operated, an image presentation device that presents an image captured by the camera to an operator as a presentation image, and an operation device that is operated by the operator and accepts, through the operator's operation, an input of a target attitude of the remote machine as a position in the presentation image, and accepts, through the operator's operation, an input of a position movement instruction indicating that the position of the remote machine is to be moved.
  • the remote operation system further comprises an operation calculator that drives the attitude of the remote machine based on the target attitude received by the operation device, and moves the position of the remote machine based on the position movement instruction received by the operation device.
  • the remote operation system disclosed herein has the effect of reducing the operational burden on the operator.
  • FIG. 1 is a diagram showing a configuration example of a remote operation system according to a first embodiment
  • FIG. 2 is a schematic diagram showing a specific example of the remote operation system according to the first embodiment
  • FIG. 3 is a diagram showing an example of a method for setting a target attitude according to the first embodiment
  • FIG. 4 is a diagram showing an example of an operation using the operation device according to the first embodiment
  • FIG. 5 is a flowchart showing an example of the operation of the operation calculator according to the first embodiment
  • FIG. 6 is a diagram showing an example of the configuration of a computer system that realizes the operation calculator according to the first embodiment
  • FIG. 7 is a diagram showing a configuration example of a remote operation system according to a second embodiment
  • FIG. 8 is a diagram showing an example of a screen showing an operation method according to the second embodiment
  • FIG. 9 is a diagram showing an example of a screen showing an operation method according to the second embodiment
  • FIG. 10 is a diagram showing an example of a screen showing an operation method according to the second embodiment
  • FIG. 11 is a diagram showing an example in which a terminal device is used as an image presentation device according to the second embodiment
  • FIG. 12 is a diagram showing an example of a video display method according to the second embodiment
  • FIG. 13 is a diagram showing a configuration example of a remote operation system according to a third embodiment
  • FIG. 14 is a diagram showing an example of camera switching in the third embodiment
  • FIG. 15 is a diagram showing an example of camera switching in the third embodiment
  • FIG. 16 is a diagram showing an example of a camera provided inside a hand of a manipulator according to the third embodiment
  • FIG. 17 is a diagram showing an example of a camera according to the third embodiment for photographing the manipulator from the side
  • FIG. 18 is a diagram showing an example of video switching in the third embodiment
  • FIG. 19 is a diagram showing an example of superimposing images captured from the side in the third embodiment
  • FIG. 20 is a diagram showing an example of a method for detecting and displaying pressure according to the third embodiment
  • FIG. 21 is a diagram showing another example of a method for detecting and displaying pressure according to the third embodiment
  • FIG. 22 is a diagram showing a configuration example of a remote operation system according to a fourth embodiment
  • FIG. 23 is a diagram showing an example of a remote operation system according to the fourth embodiment
  • FIG. 24 is a diagram showing an example of a setting screen for a hand angle according to the fourth embodiment
  • FIG. 25 is a diagram showing an example of a setting screen for a hand angle according to the fourth embodiment
  • FIG. 26 is a diagram showing a configuration example of a remote operation system according to a fifth embodiment
  • Embodiment 1. FIG. 1 is a diagram showing a configuration example of a remote operation system according to the first embodiment.
  • a remote operation system 100 of this embodiment includes a remote machine 1, a camera 2, an operation calculator 3, an image presentation device 4, and an operation device 5.
  • the remote machine 1 is a machine that is remotely operated, and may be, for example, but is not limited to, a combination of a cart and a manipulator, a mobile vehicle, a manipulator, a humanoid robot, etc.
  • a case where the remote machine 1 is a combination of a cart and a manipulator will be mainly described.
  • the remote machine 1 includes an attitude drive mechanism 11 and a position drive mechanism 12.
  • the attitude drive mechanism 11 is a mechanism that can change the attitude of the remote machine 1, and changes the attitude of the remote machine 1 according to a control signal received from the operation calculator 3.
  • the attitude indicates the orientation of the remote machine 1.
  • for example, if the remote machine 1 is a combination of a cart and a manipulator, the target attitude indicates the target orientation of the cart, and if the remote machine 1 is a mobile vehicle, the target attitude indicates the target orientation of the vehicle.
  • the attitude drive mechanism 11 includes an actuator that changes the orientation of the wheels of the remote machine 1.
  • the attitude drive mechanism 11 may be a drive mechanism that can change the orientation of the remote machine 1 on the spot, or may be a drive mechanism that can change the orientation by turning. Also, for example, if the remote machine 1 is a combination of a cart and a manipulator, the target attitude may be appropriately allocated to the orientation of the cart and the target orientation of the manipulator.
  • the position drive mechanism 12 is a mechanism capable of changing the position of the remote machine 1, and changes the position of the remote machine 1 in response to a control signal received from the operation calculator 3.
  • the position drive mechanism 12 includes, for example, an actuator capable of changing at least one of the horizontal and vertical positions of the remote machine 1.
  • the camera 2 is mounted on the remote machine 1 and captures the surroundings of the remote machine 1. There is no restriction on the number of cameras 2, and one or more may be used. In the following embodiment, an example in which the camera 2 includes two fisheye cameras with different viewpoints will be described, but the camera 2 is not limited to a fisheye camera.
  • the image presentation device 4 is, for example, a display device such as a monitor, a display, or a smartphone monitor, and presents an image to an operator who remotely operates the remote machine 1.
  • the image presentation device 4 presents, for example, an image captured by the camera 2 to the operator as a presented image.
  • the operation device 5 is a device operated by the operator, such as, but not limited to, a mouse, a keyboard, a touchpad, or a device that performs screen operation by detecting a face or line of sight.
  • the operation device 5 accepts input of a target attitude of the remote machine 1 as a position in the presented image by the operator's operation, and accepts input of a position movement instruction indicating that the position of the remote machine 1 is to be moved by the operator's operation.
  • the image presentation device 4 and the operation device 5 may also be integrated, and a touch panel, a smartphone, or the like may be used.
  • the operation device 5 is a mouse and the image presentation device 4 is a monitor will be mainly described.
  • the operation calculator 3 receives the image captured by the camera 2 from the camera 2, and outputs the received image to the image presentation device 4, thereby displaying the image on the image presentation device 4.
  • the operation calculator 3 also controls the remote machine 1 according to the operation content accepted by the operation device 5. For example, the operation calculator 3 drives the attitude of the remote machine 1 based on the target attitude accepted by the operation device 5, and moves the position of the remote machine 1 based on the position movement instruction accepted by the operation device 5.
  • the camera 2 and the operation calculator 3 may be directly connected, or the camera 2 and the operation calculator 3 may have a communication unit not shown in the figure, and data may be transmitted and received by communication.
  • the communication between the camera 2 and the operation calculator 3 may be wired communication, wireless communication, or a combination of wired communication and wireless communication.
  • the operation calculator 3 and the remote machine 1 may be directly connected, or the operation calculator 3 and the remote machine 1 may have a communication unit not shown in the figure, and data may be transmitted and received by communication.
  • the image presentation device 4 and the operation device 5 may be directly connected to the operation calculator 3, or the image presentation device 4, the operation device 5, and the operation calculator 3 may be provided with a communication unit (not shown), and data may be transmitted and received through communication.
  • the communication between the image presentation device 4 and the operation device 5 on the one hand and the operation calculator 3 on the other may be wired communication, wireless communication, or a combination of wired communication and wireless communication.
  • the operation calculator 3 may be provided at a first location where the remote machine 1 is located, or at a second location where the image presentation device 4 and the operation device 5 are located, i.e., on the operator's side.
  • the operation calculator 3 may be mounted on the remote machine 1.
  • the operation calculator 3 may be provided at a location different from both the first location and the second location.
  • the operation calculator 3 may be provided on a cloud server.
  • the operation calculator 3 includes an input discrimination unit 31, a display information generation unit 32, a target attitude setting unit 33, an attitude drive unit 34, a target attitude determination unit 35, and a position drive unit 36.
  • the input discrimination unit 31 acquires operation information indicating the operation content from the operation device 5, and outputs the operation information to the corresponding functional unit according to the content of the acquired operation information.
  • the display information generation unit 32 generates display data indicating a display screen to be displayed on the image presentation device 4 using the image received from the camera 2 and the operation information received from the input discrimination unit 31, and transmits the generated display data to the image presentation device 4.
  • the display information generation unit 32 also outputs the image received from the camera 2 to the target attitude setting unit 33.
  • the target attitude setting unit 33 sets a target attitude of the remote machine 1 based on the operation information received from the input discrimination unit 31, and notifies the attitude driving unit 34 and the target attitude determination unit 35 of the set target attitude.
  • when the attitude driving unit 34 is notified of the target attitude by the target attitude setting unit 33, it generates a control signal for driving the attitude of the remote machine 1 based on the target attitude, and transmits the generated control signal to the remote machine 1. Furthermore, when the attitude driving unit 34 is notified by the target attitude determination unit 35 that the target attitude has not been reached, it generates a control signal for driving the attitude of the remote machine 1, and transmits the generated control signal to the remote machine 1.
  • the target attitude determination unit 35 determines whether the attitude of the remote machine 1 has reached the target attitude. If it determines that the target attitude has been reached, it notifies the position drive unit 36 of this; if it determines that the target attitude has not been reached, it notifies the attitude driving unit 34 of this. When notified by the target attitude determination unit 35 that the target attitude has been reached, if the operation information received from the input discrimination unit 31 indicates that the position of the remote machine 1 is to be moved, the position drive unit 36 generates a control signal for moving the position of the remote machine 1 and transmits the generated control signal to the remote machine 1.
  • FIG. 2 is a diagram showing a schematic example of the remote operation system 100 according to this embodiment.
  • the remote machine 1 includes a manipulator 6 and a cart 7.
  • a fisheye camera 2-1 is provided on the front of the cart 7, and a fisheye camera 2-2 is provided near the hand 6a of the manipulator 6.
  • the image presentation device 4 is a monitor, and the operation device 5 is a mouse.
  • the image presentation device 4 displays an image captured by the camera 2-1 or an image captured by the camera 2-2.
  • Both cameras 2-1 and 2-2 are examples of the camera 2 shown in FIG. 1.
  • camera switching, that is, switching between displaying the image captured by the camera 2-1 and the image captured by the camera 2-2 on the image presentation device 4, is performed by the operator.
  • the operator remotely controls the attitude and position of the remote machine 1 by operating the operation device 5 while watching the image displayed on the image presentation device 4.
  • FIG. 3 is a diagram showing an example of a method for setting a target attitude in this embodiment.
  • FIG. 3 shows an example of a display screen of the image presentation device 4.
  • an image captured by the camera 2-1 is displayed on the image presentation device 4.
  • this image shows the upper part of the cart 7 and the hand 6a of the manipulator 6.
  • a marker 201 indicating the target attitude is displayed together with the image captured by the camera 2-1.
  • the position of the marker 201 in the display screen can be changed by the operator operating the operation device 5.
  • the position of the marker 201 in the display screen moves in accordance with the movement of the operation device 5, and the position of the marker 201 is determined by pressing the mouse, which is the operation device 5, thereby specifying the target attitude.
  • the reference symbols 7, 6a, and 201 are given to the cart 7, the hand 6a, and the marker 201 for the purpose of explanation, but these symbols are not displayed on the display screen.
  • the remote machine 1 operates while the mouse, which is the operation device 5, is pressed. More specifically, when the position of the marker 201, i.e., the target attitude, is determined by pressing the mouse, which is the operation device 5, the operation calculator 3 controls the attitude of the remote machine 1 so that the cart 7 rotates in a direction corresponding to the target attitude. That is, the operation calculator 3 may display the marker 201, which can be moved by operating the operation device 5, in the presented image, and set the position of the marker 201 at the time when the operation device 5 is pressed by the operator as the target attitude.
  • the operation calculator 3 may determine that the position movement instruction has been accepted by the operation device 5 if the pressing of the operation device 5 continues at the time when the attitude of the remote machine 1 reaches the target attitude, and may start moving the position of the remote machine 1.
  • when the pressing of the mouse, which is the operation device 5, ends, the operation calculator 3 stops driving the attitude of the remote machine 1 and stops moving the position of the remote machine 1. When the remote machine 1 is driven to the target attitude, that is, when the orientation of the remote machine 1 corresponds to the target attitude, the operation calculator 3 controls the position of the remote machine 1 so that the remote machine 1 starts moving in the direction of travel in that attitude. While the mouse, which is the operation device 5, is pressed, the operation calculator 3 continues to move the position of the remote machine 1, and when the mouse is no longer pressed, the operation calculator 3 stops the remote machine 1.
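  • the press-and-hold behavior described above can be summarized as a small state machine. The sketch below is illustrative only, not the patent's implementation; the state names and the "press"/"attitude_reached"/"release" event strings are hypothetical assumptions.

```python
# Hypothetical sketch of the press-and-hold control phases described above:
# pressing the mouse fixes the marker position as the target attitude and
# starts attitude driving; holding through attitude completion starts
# position driving; releasing the mouse stops the remote machine.

IDLE, ATTITUDE_DRIVE, POSITION_DRIVE = "idle", "attitude_drive", "position_drive"

def next_state(state, event):
    if state == IDLE and event == "press":
        return ATTITUDE_DRIVE      # marker position fixed as target attitude
    if state == ATTITUDE_DRIVE and event == "attitude_reached":
        return POSITION_DRIVE      # pressing continues -> start moving forward
    if event == "release":
        return IDLE                # releasing the mouse stops the machine
    return state

# Example: one full press-hold-release cycle
state = IDLE
for ev in ["press", "attitude_reached", "release"]:
    state = next_state(state, ev)
```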
  • FIG. 4 is a diagram showing an example of an operation using the operation device 5 of this embodiment.
  • Each diagram in FIG. 4 shows an example of a display screen of the image presentation device 4, as in FIG. 3, and a marker 201 indicating the target attitude is displayed on each display screen together with an image captured by the camera 2-1.
  • FIG. 4 shows an example in which a mouse is used as the operation device 5.
  • the first display screen in FIG. 4 shows a state in which a target attitude is set.
  • the operator moves the marker 201 to a position to be set as the target attitude and presses the mouse, which is the operation device 5, to set the target attitude.
  • the vector from the center of the remote machine 1 to the front (front face) of the remote machine 1 is vector 202, and the vector corresponding to the target attitude is vector 203.
  • the operation calculator 3 calculates vector 202 and vector 203 based on the image and the position of the marker 201 in the image, and starts control to rotate the cart 7 of the remote machine 1 counterclockwise so that vector 202 coincides with vector 203.
  • the attitude of the cart 7 of the remote machine 1 will then change, as shown in the second display screen of FIG. 4, i.e., the orientation of the cart 7 of the remote machine 1 will rotate counterclockwise, and the marker 201 will be positioned in front of the cart 7 of the remote machine 1. This ends the attitude drive.
  • the operation calculator 3 starts position driving to move the cart 7 of the remote machine 1 forward, as shown in the third display screen of FIG. 4. Thereafter, the position driving of the cart 7 continues until the pressing of the mouse, which is the operation device 5, ends.
  • the operator sets the target attitude by moving the mouse, which is the operation device 5, to the desired position in the image projected as the display screen and pressing the mouse, and by continuing to press the mouse, the remote machine 1 is driven to the target attitude as described above. Then, when the remote machine 1 is in the target attitude, if the operator continues to press the mouse, which is the operation device 5, position driving of the remote machine 1 begins.
  • the operator does not need to wear a head-mounted display, and there is no need for stereoscopic perception. This reduces the operation load on the operator.
  • the operator can change the posture and position of the remote machine 1 simply by moving and pressing the mouse, which is the operation device 5, while looking at the display screen of the image presentation device 4, allowing remote operation with simple operations.
  • the operator continues to press the mouse, which is the operation device 5, to instruct the remote machine 1 to continue operating. That is, in the above example, continued pressing of the mouse after the remote machine 1 has been driven to the target attitude means the start of position driving, and then stopping the mouse pressing means the end of position driving.
  • the specific method of specifying the operation of the remote machine 1 is not limited to the above example. For example, the following operation may be performed. In the first display screen of FIG. 4, the operator moves the marker 201 to a position corresponding to the target attitude, and then clicks or double-clicks the mouse, which is the operation device 5, to confirm the target attitude once.
  • in this case, the operation calculator 3 stops the remote machine 1 when the attitude of the remote machine 1 reaches the target attitude.
  • the operator may press the mouse, which is the operation device 5, to start position driving, and stop pressing the mouse to end the position driving.
  • alternatively, the operator can click or double-click the mouse, which is the operation device 5, to start position driving, and click or double-click again to end position driving.
  • the operation device 5 may be a touchpad, in which case, for example, pressing the touchpad may be used instead of pressing the mouse, and tapping or double tapping the touchpad may be used instead of clicking or double clicking the mouse.
  • the operation device 5 may be a touch panel integrated with the image presentation device 4, in which case the operator determines the target attitude by touching a position on the touch panel that corresponds to the target attitude on the display screen, and, for example, pressing the touch panel may be used instead of pressing the mouse, and tapping or double tapping the touch panel may be used instead of clicking or double clicking the mouse.
  • the operation device 5 may be a keyboard.
  • specific keys such as arrow keys may be assigned to up, down, left, and right movement, and the marker 201 may be moved using these keys, and pressing specific keys such as the enter key may be treated the same as pressing the mouse.
  • FIG. 5 is a flowchart showing an example of the operation of the operation calculator 3 of this embodiment.
  • FIG. 5 shows an example in which an operator starts pressing the operation device 5 at a position indicating the target attitude, and continues pressing the operation device 5 while the remote machine 1 is operating, thereby remotely operating the remote machine 1.
  • the operation calculator 3 first judges whether or not a target setting has been input (step S1).
  • the input discrimination unit 31 judges whether or not the operation information received from the operation device 5 is information indicating the setting of the target attitude.
  • the information indicating the setting of the target attitude is, for example, information input from the operation device 5 when the operation device 5 is pressed, and includes information indicating the position of the operation device 5 (position on the display screen).
  • the input discrimination unit 31 notifies the position drive unit 36 of operation information indicating that the input of a movement instruction is continuing while the operation device 5 is being pressed. If there is no target setting input (step S1 No), the operation calculator 3 repeats step S1.
  • if a target setting has been input (step S1 Yes), the operation calculator 3 sets a target attitude (step S2).
  • the input discrimination unit 31 outputs operation information to the target attitude setting unit 33, and the target attitude setting unit 33 sets a target attitude using the image acquired from the display information generation unit 32 and the operation information indicating the setting of the target attitude acquired from the input discrimination unit 31, and notifies the attitude driving unit 34 and the target attitude determination unit 35 of the set target attitude.
  • the target attitude setting unit 33 obtains the vector 202 illustrated in the first row of FIG. 4 using the image, obtains the vector 203 using the operation information and the image, and calculates the difference between the vectors 202 and 203 to calculate the direction and angle by which the remote machine 1 is rotated as the target attitude. In this way, the target attitude is calculated as a relative value from the current attitude, that is, the amount of change from the current attitude.
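  • the calculation of the direction and angle of rotation from vector 202 and vector 203 can be sketched as a signed-angle computation in 2D. This is a minimal illustration under assumed conventions (counterclockwise positive); the function and parameter names are hypothetical, not taken from the patent.

```python
import math

def relative_rotation(heading_vec, target_vec):
    """Signed angle (radians) that rotates heading_vec onto target_vec.

    heading_vec corresponds to vector 202 (front direction of the remote
    machine) and target_vec to vector 203 (direction toward the marker);
    a positive result means a counterclockwise rotation, negative clockwise.
    """
    hx, hy = heading_vec
    tx, ty = target_vec
    cross = hx * ty - hy * tx   # sine component (rotation direction)
    dot = hx * tx + hy * ty     # cosine component
    return math.atan2(cross, dot)

# Example: front faces +x and the marker lies toward +y,
# so the cart must rotate 90 degrees counterclockwise.
angle = relative_rotation((1.0, 0.0), (0.0, 1.0))
```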
  • the operation calculator 3 performs attitude driving (step S3).
  • the attitude driving unit 34 generates a control signal for driving the attitude of the remote machine 1 based on the target attitude notified by the target attitude setting unit 33, and transmits the generated control signal to the remote machine 1.
  • the attitude driving unit 34 generates a control signal to change the orientation at a constant speed so as to approach the target attitude received from the target attitude setting unit 33.
  • the operation calculator 3 determines whether the target attitude has been reached (step S4).
  • the target attitude determination unit 35 determines whether the remote machine 1 has reached the target attitude based on the target attitude notified by the target attitude setting unit 33. As described above, when the target attitude is indicated as a relative value from the current attitude, for example, the target attitude determination unit 35 determines that the remote machine 1 has reached the target attitude when the remaining amount of change toward the target attitude is equal to or less than a threshold value.
  • the threshold value is determined in advance, and may be set to, for example, 0, or may be set to a value corresponding to an error range within which the remaining amount of attitude change can be considered to be 0.
  • in step S5, the operation calculator 3 determines whether the input of a movement command is continuing. In detail, if it is determined that the target attitude has been reached, the target attitude determination unit 35 notifies the position drive unit 36 of this fact, and upon receiving the notification that the target attitude has been reached, the position drive unit 36 determines Yes in step S5 if it has received operation information indicating that movement is continuing from the input discrimination unit 31. If it is determined that the input of a movement command is continuing (step S5 Yes), the operation calculator 3 performs position driving (step S6). In detail, the position drive unit 36 generates a control signal for moving the remote machine 1 forward, and transmits the generated control signal to the remote machine 1.
  • if it is determined in step S4 that the target attitude has not been reached (step S4: No), the operation calculator 3 repeats the process from step S3. If it is determined in step S5 that the input of the movement command is not continuing (step S5: No), the operation calculator 3 ends the process.
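  • steps S1 to S6 of the flowchart can be sketched as a polling loop. The callbacks `read_operation`, `rotate_step`, and `move_step` below are hypothetical stand-ins for the operation device input and the control signals sent to the remote machine 1; this is an illustrative sketch, not the patent's implementation.

```python
def control_loop(read_operation, rotate_step, move_step, threshold=0.01):
    """Sketch of the flowchart of FIG. 5 (steps S1 to S6).

    read_operation() -> (pressed, remaining_angle)
      pressed: whether the operation device is still being pressed
      remaining_angle: rotation remaining toward the target attitude,
                       or None if no target setting has been input yet.
    """
    # S1: check whether a target setting has been input
    pressed, remaining = read_operation()
    if remaining is None:
        return "waiting"                      # S1 No: repeat S1

    # S2/S3/S4: drive the attitude until the remainder is within threshold
    while abs(remaining) > threshold:
        rotate_step(remaining)                # S3: attitude driving
        pressed, remaining = read_operation() # S4: target attitude reached?

    # S5/S6: while the movement instruction continues, drive the position
    while pressed:
        move_step()                           # S6: position driving
        pressed, remaining = read_operation()
    return "stopped"                          # S5 No: end of process
```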
  • FIG. 5 shows an example in which the operator remotely controls the remote machine 1 by starting to press the operation device 5 at a position indicating the target attitude and continuing to press while the remote machine 1 is operating.
  • the operation is generally similar to FIG. 5, although it is changed appropriately depending on the operation method. For example, when the operation device 5 is clicked at a position indicating the target attitude, and position driving is started by clicking after the remote machine 1 reaches the target attitude, in step S5 the operation calculator 3 judges Yes when it receives operation information indicating that a click was made after the remote machine 1 reached the target attitude.
  • in the above example, position driving is performed after attitude driving, but this is not limiting, and operations related to attitude driving and position driving may be performed separately.
  • for example, the operator may use the operation device 5 to perform a predetermined operation such as double-clicking or double-tapping to drive the position of the remote machine 1, and attitude driving may be performed by specifying a position using the operation device 5 and continuing to press it (long press).
  • FIG. 6 is a diagram showing an example of the configuration of a computer system that realizes the operation calculator 3 of this embodiment.
  • this computer system includes a control unit 101, an input unit 102, a storage unit 103, a display unit 104, a communication unit 105, and an output unit 106, which are connected via a system bus 107.
  • the control unit 101 and the storage unit 103 form a processing circuit.
  • the control unit 101 is, for example, a processor such as a CPU (Central Processing Unit), and executes a program in which the processing in the operation calculator 3 of this embodiment is described. Note that a part of the control unit 101 may be realized by dedicated hardware such as a GPU (Graphics Processing Unit) or an FPGA (Field-Programmable Gate Array).
  • the input unit 102 is a button, a keyboard, a mouse, a touchpad, etc.
  • the storage unit 103 includes various memories such as a RAM (Random Access Memory) and a ROM (Read Only Memory) and a storage device such as a hard disk, and stores the program to be executed by the control unit 101, necessary data obtained in the process of processing, etc.
  • the storage unit 103 is also used as a temporary storage area for the program.
  • the display unit 104 is, for example, a display.
  • the display unit 104 and the input unit 102 may be integrated and realized by a touch panel, etc.
  • the communication unit 105 is a receiver and a transmitter that perform communication processing.
  • the output unit 106 is a speaker or the like.
  • FIG. 6 is an example, and the configuration of the computer system is not limited to the example of FIG. 6.
  • the computer system that realizes the operation calculator 3 does not need to include the display unit 104 and the output unit 106.
  • In this computer system, a computer program is installed in the storage unit 103 from a CD-ROM or DVD-ROM set in a CD (Compact Disc)-ROM drive or DVD (Digital Versatile Disc)-ROM drive (not shown). When the program is executed, the program read from the storage unit 103 is stored in the main memory area of the storage unit 103, and in this state the control unit 101 executes the processing of the operation calculator 3 of this embodiment according to the program stored in the storage unit 103.
  • In the above description, the program describing the processing in the operation calculator 3 is provided on a CD-ROM or DVD-ROM as a recording medium, but this is not limiting.
  • a program provided via a transmission medium such as the Internet may be used.
  • the operation device 5 shown in FIG. 1 may be the input unit 102 in the computer system that realizes the operation calculator 3 shown in FIG. 1, or may be provided separately from the input unit 102.
  • the image presentation device 4 shown in FIG. 1 may be the display unit 104 in the computer system that realizes the operation calculator 3 shown in FIG. 1, or may be provided separately from the display unit 104.
  • the input discrimination unit 31, display information generation unit 32, target posture setting unit 33, posture drive unit 34, target posture determination unit 35, and position drive unit 36 shown in FIG. 1 are realized by the control unit 101 shown in FIG. 6 executing a computer program stored in the storage unit 103 shown in FIG. 6.
  • the storage unit 103 shown in FIG. 6 is also used to realize the input discrimination unit 31, display information generation unit 32, target posture setting unit 33, posture drive unit 34, target posture determination unit 35, and position drive unit 36 shown in FIG. 1.
  • the remote operation system 100 of this embodiment comprises the remote machine 1, the image presentation device 4 that presents the image from the camera 2 to the operator, the operation device 5, and the operation calculator 3.
  • The operation calculator 3 drives the posture of the remote machine 1 based on the target posture specified as a position in the presented image using the operation device 5, and moves the position of the remote machine 1 based on instructions to start and end position drive input using the operation device 5. This reduces the operational burden on the operator.
  • Embodiment 2. FIG. 7 is a diagram showing a configuration example of a remote control system according to the second embodiment.
  • a remote control system 100a according to the second embodiment is similar to the remote control system 100 according to the first embodiment, except that it includes an operation calculator 3a instead of the operation calculator 3 and a precision operation device 8a is added.
  • Components having the same functions as those in the first embodiment are given the same reference numerals as those in the first embodiment, and duplicated explanations will be omitted. Below, differences from the first embodiment will be mainly explained.
  • the precision operation device 8a is a device that allows more precise operation than the operation device 5, and is, for example, a joystick, an operation device that combines a joystick and a dial, or a mouse that allows precision operation.
  • When a joystick is used as the precision operation device 8a and the remote machine 1 has multiple drive axes, the drive axis to be operated may be switched by a button on the joystick or by an operation on the operation device 5.
  • the precision operation device 8a may also be the same as the operation device 5 in terms of hardware. For example, a mouse that can be switched between a normal mode and a mode that allows precision operation may be set to the normal mode when used as the operation device 5, and to the mode that allows precision operation when used as the precision operation device 8a.
  • The operation calculator 3a may enlarge the image displayed on the image presentation device 4 in response to the operation of the operation device 5, thereby enabling precision operation by the operator.
  • The enlarging operation may be an operation generally used for enlarging a screen; for example, when the operation device 5 is a touch panel, a pinch-out may be used as the enlarging operation.
  • The enlarging operation is not limited to this; a button for enlargement or the like may be displayed on the image presentation device 4, and the position to be enlarged may be moved using the operation device 5.
  • the operation calculator 3a is similar to the operation calculator 3 of the first embodiment, except that an input switch 8b is added and an input discrimination unit 31a is provided instead of the input discrimination unit 31.
  • In FIG. 7, the input switch 8b is provided separately from the operation calculator 3a, but the input switch 8b may be provided within the operation calculator 3a.
  • the input switch 8b switches the operation target from the operator between the operation device 5 and the precision operation device 8a.
  • the input switch 8b has a function of switching between normal operation (normal operation mode) and precision operation (precision operation mode).
  • In the normal operation mode, the input switch 8b outputs operation information input from the operation device 5 to the input discrimination unit 31a.
  • When the input discrimination unit 31a receives operation information input from the operation device 5 via the input switch 8b, it performs the same operation as in the first embodiment.
  • In the precision operation mode, the input switch 8b outputs operation information input from the precision operation device 8a to the input discrimination unit 31a.
  • In this case, the input discrimination unit 31a outputs the operation information to the posture drive unit 34 or the position drive unit 36 depending on the content of the operation information.
  • When the operation device 5 and the precision operation device 8a are separate devices, the input switch 8b may switch between normal operation and precision operation depending on which of these devices the input comes from, or it may switch between normal operation and precision operation when a specific operation is performed on the operation device 5.
  • the image presentation device 4 may be provided with buttons for switching operations, and the input switch 8b may switch between normal operation and precision operation by pressing these buttons by the operation device 5, or may detect the operator's operation by gesture and switch between normal operation and precision operation by the operator's gesture.
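A minimal sketch of how the input switch 8b might route operation information follows. The class name, device identifiers, and mode names are assumptions for illustration; the patent only specifies that operation information from the active device is forwarded to the input discrimination unit 31a.

```python
# Illustrative sketch of the input switch 8b switching between the
# normal operation mode (operation device 5) and the precision
# operation mode (precision operation device 8a).

class InputSwitch:
    NORMAL = "normal"         # operation device 5 is active
    PRECISION = "precision"   # precision operation device 8a is active

    def __init__(self):
        self.mode = self.NORMAL

    def toggle(self):
        """Switch between normal and precision operation modes."""
        self.mode = self.PRECISION if self.mode == self.NORMAL else self.NORMAL

    def route(self, source, operation_info):
        """Forward operation info only when it comes from the active device."""
        active = "device5" if self.mode == self.NORMAL else "device8a"
        return operation_info if source == active else None
```

In the configuration of FIG. 7, `toggle()` would be triggered by a specific operation, an on-screen button, or a gesture, as described above.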
  • The posture drive unit 34 may also function as a manipulator drive unit that drives each joint and the hand 6a of the manipulator 6 of the remote machine 1.
  • In that case, the posture drive unit 34 may generate a control signal for controlling the manipulator 6 in response to the operation by the precision operation device 8a, and transmit the generated control signal to the remote machine 1.
  • Alternatively, the manipulator drive unit may be provided separately from the posture drive unit 34.
  • the operation method described in the first embodiment can reduce the operational burden on the operator, but since the operation is performed using an operating device 5 such as a mouse, it may be difficult to precisely operate the remote machine 1. Also, for example, when the remote machine 1 has a manipulator 6 and approaches an object to be operated by the manipulator 6, it may be easier for the operator to perform precise remote operation by using an operation that directly corresponds to the drive shaft of the remote machine 1, such as an operation using a joystick, rather than using the operation method described in the first embodiment.
  • In this embodiment, the input switch 8b that switches between normal operation and precision operation is provided, allowing the operator to operate the remote machine 1 more appropriately.
  • During precision operation, a screen showing an operation method corresponding to the precision operation device 8a may be displayed on the image presentation device 4.
  • the display screen of one image presentation device 4 may be divided to separate the image captured by the camera 2 and the screen showing the operation method, or multiple image presentation devices 4 may be provided, and the image captured by the camera 2 and the screen showing the operation method may be displayed on different image presentation devices 4.
  • FIGS. 8 to 10 are diagrams showing examples of screens showing the operation method in this embodiment.
  • FIG. 8 to FIG. 10 are display examples of the operation method when a joystick is used as the precision operation device 8a in precision operation, and in FIG. 8 to FIG. 10, the operation of the joystick in the Arm Joystick Mode, the Wrist Mode, and the Drive Joystick Mode and the corresponding operation of the remote machine 1 are displayed together with an image showing the operation of the remote machine 1.
  • The display information generating unit 32 receives an input regarding the setting of the mode from the input discrimination unit 31a, and controls the image presentation device 4 to display, for example, one of FIG. 8 to FIG. 10 according to the current mode.
  • FIG. 8 to FIG. 10 are examples, and the settable modes and the correspondence between each joystick operation and the operation of the remote machine 1 are not limited to the examples shown in FIG. 8 to FIG. 10.
  • FIGS. 8 to 10 show an example in which a joystick is used as the precision operation device 8a, but even when a precision operation device 8a other than a joystick is used, the same effect can be obtained by displaying an operation method corresponding to that precision operation device 8a.
  • an operation method corresponding to the operation device 5 may be displayed in the same manner for normal operation using the operation device 5.
  • FIG. 11 is a diagram showing an example in which a terminal device is used as the image presentation device 4 of this embodiment.
  • When a terminal device is used as the image presentation device 4, an image corresponding to the location where the terminal device is held is displayed on the terminal device.
  • The image displayed on the image presentation device 4, which is a terminal device, may also be enlargeable and reducible.
  • both the monitor and the terminal device as shown in FIG. 2 may be used as the image presentation device 4.
  • In this case, an image contiguous with the image displayed on the image presentation device 4, which is a monitor, is displayed according to the position of the image presentation device 4, which is a terminal device, thereby realizing a virtual screen wider than the monitor.
  • FIG. 12 is a diagram showing an example of a method of displaying an image in this embodiment.
  • the upper part of FIG. 12 shows an example of a display screen when the image captured by the camera 2-1 is enlarged.
  • The upper part of FIG. 12 is, for example, an enlargement of the upper portion of the third display screen of FIG. 4 described in Embodiment 1; although the hand 6a is enlarged, the dolly 7 displayed on the third display screen of FIG. 4 is no longer visible, which makes it difficult for the operator to know the orientation of the dolly 7 and to operate it.
  • In such a case, a three-dimensional figure 308, shown by a dashed line in the lower part of FIG. 12, may be superimposed on the image. That is, the display information generating unit 32 may generate, for example by using CG (Computer Graphics), a three-dimensional figure 308 that imitates the current state of the remote machine 1 so that the orientation of the dolly 7 can be known, generate display data in which the three-dimensional figure 308 is superimposed on the image captured by the camera 2-1, and display the display data on the image presentation device 4.
  • the display method of the three-dimensional figure 308 is not limited to the example shown in FIG. 12, and the image to be displayed is not limited to the image of the camera 2-1.
  • Instead of the three-dimensional figure 308, a two-dimensional figure, symbol, character, or the like indicating the orientation of the dolly 7 may be displayed, or such a symbol, character, or the like may be displayed together with the three-dimensional figure 308.
  • The operation calculator 3a of this embodiment is realized by the computer system illustrated in FIG. 6, similar to the operation calculator 3 of Embodiment 1.
  • Embodiment 3. FIG. 13 is a diagram showing a configuration example of a remote control system according to the third embodiment.
  • a remote control system 100b according to the present embodiment is similar to the remote control system 100a according to the second embodiment, except that the remote control system 100b according to the present embodiment includes an operation calculator 3b instead of the operation calculator 3a, and multiple cameras 2 are provided.
  • Components having the same functions as those in the second embodiment are given the same reference numerals as those in the second embodiment, and duplicated explanations are omitted. Below, differences from the second embodiment will be mainly explained.
  • In the preceding embodiments, one or more cameras 2 are provided, but in the present embodiment, two or more cameras 2 are provided. Here, cameras 2-1 and 2-2 are provided as the cameras 2.
  • the operation calculator 3b is the same as the operation calculator 3a of the second embodiment, except that a camera switching unit 37, which is a camera switch, is added.
  • the camera switching unit 37 switches between the multiple cameras 2 as the camera from which the image to be displayed on the image presentation device 4 is obtained. That is, the camera switching unit 37 switches the camera 2 corresponding to the presented image by selecting the presented image to be presented on the image presentation device 4 from the multiple images captured by each of the multiple cameras 2.
  • The camera switching unit 37 notifies the display information generating unit 32 of information indicating the camera 2 from which the image to be displayed on the image presentation device 4 is obtained, that is, information indicating the camera 2 selected as the display target.
  • the display information generating unit 32 generates display data using the notified image of the camera 2.
  • the camera switching unit 37 may switch the image to be displayed on the image presentation device 4 by receiving operation information instructing the camera to be switched from the operation device 5 or the precision operation device 8a via the input discrimination unit 31a, or may switch the image to be displayed on the image presentation device 4 by detecting the operator's gesture.
  • a button for switching cameras may be displayed on the image presentation device 4, and the operator may use the operation device 5 or the precision operation device 8a to press the button to switch the camera 2.
  • Alternatively, the image displayed on the image presentation device 4 may be switched according to a preset condition and the image currently displayed on the image presentation device 4.
  • FIG. 14 is a diagram showing an example of switching of the camera 2 in this embodiment.
  • FIG. 14 shows an example in which the camera 2-1 installed on the dolly 7 and the camera 2-2 installed near the hand 6a of the manipulator 6 are used, as described in FIG. 2 of the first embodiment, and the image captured by the camera 2-1 is displayed on the display screen 301, and the image captured by the camera 2-2 is displayed on the display screen 302.
  • the camera switching unit 37 may switch the image displayed as the display screen on the image presentation device 4 according to the operation information input from the operation device 5 or the precision operation device 8a, as described above, or may switch by detecting the gesture of the operator, or may switch according to a preset condition and the image displayed on the image presentation device 4.
  • the preset condition may be, for example, but is not limited to, a condition in which the image of the camera 2-2 is displayed when the distance between the remote machine 1 and the object is equal to or less than a specified distance, and the image of the camera 2-1 is displayed when the distance between the remote machine 1 and the object is greater than a specified distance.
  • the target object is determined in advance by the operator using the operation device 5 or the precision operation device 8a.
  • With this, the operator can perform remote operation while watching the image captured by the camera 2-1 until the remote machine 1 approaches the target object to a certain extent, and while watching the image captured by the camera 2-2 after the remote machine 1 has approached closer than that.
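The distance condition described above, selecting the hand-side camera 2-2 when the remote machine is within the specified distance of the target object and the dolly-side camera 2-1 otherwise, can be sketched as follows. The function name and the threshold value are illustrative assumptions.

```python
# Sketch of the preset camera-switching condition: camera 2-2 is
# selected when the remote machine is within the specified distance of
# the target object, and camera 2-1 otherwise. The threshold value of
# 0.5 m is an illustrative assumption.

def select_camera(distance_to_object_m, threshold_m=0.5):
    """Return the identifier of the camera whose image is presented."""
    return "camera2-2" if distance_to_object_m <= threshold_m else "camera2-1"
```

The camera switching unit 37 would evaluate such a condition whenever a new distance measurement is available and notify the display information generating unit 32 when the result changes.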
  • FIG. 15 is a diagram showing an example of a camera provided inside the hand 6a of the manipulator 6 in this embodiment.
  • In FIG. 15, the manipulator 6 and the object 9 to be grasped by the hand 6a are shown on the left, and an image displayed on the image presentation device 4 is shown on the right.
  • The display screen 310 is the screen on which the image captured by the camera 2-2 is displayed, and the display screen 320 is the screen on which the image captured by the camera 2-3 is displayed.
  • In this example, the camera 2-3 is provided inside the hand 6a, and when the hand 6a approaches the object 9 to be grasped, the image displayed on the image presentation device 4 is switched from the image captured by the camera 2-2 to the image captured by the camera 2-3.
  • This enables the operator to perform operations while understanding the state of the grasped object 9 in detail. For example, if either the grasped object 9 or the hand 6a is a deformable structure, the operator can check the degree of deformation of the deformable structure and adjust the gripping force of the hand 6a.
  • the adjustment of the gripping force of the hand 6a may be performed by the operating device 5 or the precision operating device 8a, or by other operating means other than these.
  • In the above example, the image displayed on the image presentation device 4 is an image captured by one camera 2, but the image presentation device 4 may display images captured by multiple cameras 2 as separate display screens.
  • In this case, the display information generation unit 32 may display the image captured by the selected camera 2 at high brightness and reduce the brightness of the image of the unselected camera 2.
  • the display information generation unit 32 may also display the display screen of the image of the selected camera 2 in the center and display the image of the unselected camera 2 at the edge.
  • the display information generation unit 32 may also generate display data so that the size of the image of the unselected camera 2 is smaller than the size of the image of the selected camera 2.
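The three forms of emphasis just described (dimming, centering, and shrinking the unselected images) can be summarized as per-camera display parameters. The concrete scale and dimming factors below are assumptions; the patent only states that the unselected images may be dimmed, moved to the edge, and reduced in size.

```python
# Illustrative per-camera display parameters for multi-camera display.
# The dim and shrink factors are assumptions, not values from the patent.

def display_params(camera_ids, selected_id, dim=0.4, shrink=0.5):
    """Return brightness, scale, and placement for each camera image."""
    params = {}
    for cid in camera_ids:
        chosen = (cid == selected_id)
        params[cid] = {
            "brightness": 1.0 if chosen else dim,
            "scale": 1.0 if chosen else shrink,
            "position": "center" if chosen else "edge",
        }
    return params
```

The display information generation unit 32 would recompute these parameters each time the camera switching unit 37 changes the selection.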
  • The operation calculator 3b of this embodiment is realized by the computer system illustrated in FIG. 6, similar to the operation calculator 3 of Embodiment 1.
  • Although the camera switching unit 37 is provided within the operation calculator 3b in FIG. 13, a camera switcher having the functions of the camera switching unit 37 may be provided separately from the operation calculator 3b.
  • FIG. 13 shows an example in which a camera switching function is added to the operation calculator 3a of embodiment 2, but the remote control system 100 of embodiment 1 may be equipped with multiple cameras 2, and a camera switching unit 37 may be added to the operation calculator 3 to add a camera switching function.
  • FIG. 16 is a diagram showing an example of a camera 2 in this embodiment that photographs the manipulator 6 from the side.
  • The camera 2-4, which is one of the multiple cameras 2, photographs the manipulator 6 from the side.
  • a drive mechanism is provided in the camera 2-4, and the drive mechanism drives the camera 2-4 so as to track, for example, the tip of the hand 6a.
  • FIG. 17 is a diagram showing an example of switching of images in this embodiment. In the example shown in FIG. 17, for example, a display screen 304 on which an image photographed by camera 2-4 is displayed and a display screen 302 on which an image photographed by camera 2-2 is displayed are switched by a gesture of the operator.
  • Although FIG. 17 shows an example of switching between the image captured by the camera 2-2 and the image captured by the camera 2-4, the image captured by the camera 2-1 and the image captured by the camera 2-4 can be switched in a similar manner.
  • FIG. 18 is a diagram showing an example of superimposing an image captured from the side in this embodiment.
  • the upper left diagram in FIG. 18 shows an example in which an image obtained by capturing the tip of hand 6a and the object to be grasped 9 by camera 2-2 is displayed as display screen 302.
  • A display screen 304 showing the image from the camera 2-4 is superimposed on the display screen 302. The display screen 304 may be superimposed on the display screen 302 as it is, but if left as is, the entire screen looks unnatural. For this reason, a virtual mirror 307 may be displayed as shown in the lower left diagram in FIG. 18.
  • The display information generating unit 32 generates a composite image showing the mirror 307 and the image displayed on the mirror 307, assuming that the image from the camera 2-4 is reflected in the virtual mirror 307, generates display data by superimposing the composite image on the image captured by the camera 2-1 or the camera 2-2, and displays the display data on the image presentation device 4. This allows the operator to recognize the image from the camera 2-4 as a natural image.
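If the reflection in the mirror 307 is emulated by horizontally flipping the side-camera image before compositing, the superimposition step might look like the sketch below. This is an assumption about how the reflection could be rendered; the patent does not specify the rendering method, and images are represented here as nested pixel lists rather than real display data.

```python
# Sketch of compositing a virtual-mirror view. The horizontal flip that
# emulates the mirror reflection is an assumption; a real display
# information generating unit would also draw the mirror frame and use
# an image library rather than nested lists.

def mirror_composite(base, side_view, row0, col0):
    """Paste a horizontally flipped side_view into base at (row0, col0)."""
    out = [row[:] for row in base]      # copy so the base image is kept
    for r, row in enumerate(side_view):
        flipped = row[::-1]             # mirror reflection (left-right flip)
        for c, px in enumerate(flipped):
            out[row0 + r][col0 + c] = px
    return out
```

The result would then be combined with the camera 2-1 or camera 2-2 image as described above.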
  • FIG. 19 is a diagram showing an example of a pressure detection and pressure display method in this embodiment.
  • In FIG. 19, when the operator moves a finger 601 on the display unit 602, which is the image presentation device 4 or another monitor, the tip of the hand 6a provided with a pressure detection sensor moves so as to trace the surface of an object such as the grasped object 9.
  • To detect the movement of the finger, the display unit 602 may have a function as an input means such as a touch panel, or the movement of the finger may be detected by another means, and the tip of the hand 6a is driven in accordance with the detected movement.
  • Pressure information indicating the pressure detected by the pressure detection sensor is input to the display information generation unit 32 in the same way as the image of the camera 2.
  • the display information generation unit 32 may generate display data based on the pressure information so as to change at least one of, for example, the color density, transparency, and composition ratio, and display the display data on the image presentation device 4 or another monitor to visually present the force haptics to the operator.
  • The presentation of the haptic sensation is not limited to display on the image presentation device 4 or another monitor; it may be achieved by projecting onto the operator's fingertips using projection mapping.
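The mapping from detected pressure to visual attributes described above can be sketched as follows. The linear mapping and the maximum-pressure normalization are assumptions; the patent only states that at least one of color density, transparency, and composition ratio may be varied according to the pressure information.

```python
# Illustrative mapping from detected pressure to display attributes.
# A stronger press is rendered as a deeper, more opaque color; the
# linear form and the max_pressure value are assumptions.

def pressure_to_visual(pressure, max_pressure=10.0):
    """Return visual attributes for the given detected pressure."""
    ratio = max(0.0, min(pressure / max_pressure, 1.0))
    return {
        "color_density": ratio,        # stronger press -> deeper color
        "transparency": 1.0 - ratio,   # stronger press -> more opaque
    }
```

The display information generation unit 32 would apply such attributes to the region of the image where the hand tip is tracing.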
  • When the operation calculator 3c receives an operation instructing it to trace the surface of an object such as the grasped object 9, it moves the tip of the hand 6a so as to trace the surface of the object, and outputs pressure information indicating the pressure detected by the pressure detection sensor and movement information indicating the amount of movement of the tip of the hand 6a to the haptic presentation device 607.
  • the operation calculator 3c may control the horizontal position of the tip of the hand 6a in accordance with the movement of the finger 601, thereby moving the tip of the hand 6a so as to trace the surface.
  • the haptic presentation device 607 includes, for example, a contact unit 604, a drive unit 605 capable of driving the contact unit 604, and a support unit 606 that supports the operator's finger from above.
  • The haptic presentation device 607 controls the drive unit 605 based on the movement information to move the horizontal position of the contact unit 604, and moves the contact unit 604 up and down based on the pressure information.
  • the finger 601 moves horizontally together with the contact unit 604, and also moves up and down according to the pressure information.
  • This allows the haptic feedback to be presented, and the operator to perceive the surface condition.
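The coupling just described, horizontal motion from the movement information and vertical motion from the pressure information, might be sketched as below. The units and the gain converting pressure into contact-unit height are assumptions for illustration.

```python
# Sketch of the haptic presentation device 607 control described above:
# the drive unit 605 moves the contact unit 604 horizontally according
# to movement information and vertically according to pressure
# information. The pressure-to-height gain is an assumption.

def contact_unit_pose(movement_mm, pressure, height_gain=-0.5):
    """Return (horizontal_mm, vertical_mm) of the contact unit 604.

    Higher detected pressure moves the contact unit (and the operator's
    finger 601 resting on it) downward, hence the negative gain.
    """
    return (movement_mm, height_gain * pressure)
```

A real controller would run this mapping in a loop as pressure and movement information arrive from the operation calculator 3c.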
  • the unevenness of the surface of the object may be detected by detecting the vertical position of the tip of the hand 6a, and the operation calculator 3c may move the contact unit 604 up and down according to the detected unevenness.
  • a membrane may also be provided on the portion where the operator places his/her finger 601.
  • the membrane can provide the operator with a sliding sensation when tracing a surface.
  • The operation calculator 3c can also reproduce the unevenness detected by the tip of the hand 6a and communicate it to the operator by moving the membrane up and down in response to the pressure information.
  • Embodiment 4. FIG. 21 is a diagram showing a configuration example of a remote control system according to the fourth embodiment.
  • a remote control system 100c according to the present embodiment is similar to the remote control system 100b according to the third embodiment, except that the remote control system 100c according to the present embodiment includes an operation calculator 3c instead of the operation calculator 3b, and a remote machine 1a instead of the remote machine 1.
  • Components having the same functions as those in the third embodiment are given the same reference numerals as those in the third embodiment, and duplicated explanations will be omitted. Below, differences from the third embodiment will be mainly explained.
  • the remote machine 1a is similar to the remote machine 1 of embodiment 3, except that a hand drive mechanism 13 is added to the remote machine 1.
  • the hand drive mechanism 13 drives the hand 6a based on a control signal received from the operation calculator 3c.
  • the remote machines 1 of embodiments 1 to 3 may also be provided with a hand drive mechanism 13, and there are no particular restrictions on the method of operating the hand drive mechanism 13 in embodiments 1 to 3.
  • the operation calculator 3c is similar to the operation calculator 3b of embodiment 3, except that a hand angle setting unit 38 is added and an input discrimination unit 31c is provided instead of the input discrimination unit 31a.
  • In this embodiment, the operator can set the hand end angle of the hand 6a using the operation device 5, and when the input discrimination unit 31c receives operation information indicating the hand end angle of the hand 6a, it outputs the operation information to the hand angle setting unit 38.
  • the hand angle setting unit 38 uses the operation information received from the input discrimination unit 31c to generate a control signal for controlling the hand 6a of the remote machine 1a, and transmits the generated control signal to the remote machine 1a.
  • FIGS. 22 and 23 are diagrams showing examples of a hand end angle setting screen in this embodiment.
  • FIG. 22 and FIG. 23 each show a display screen displayed on the image presentation device 4.
  • In FIG. 22, angle change buttons 401 to 403 are displayed on the right side of the display screen 301 on which the image captured by the camera 2-1 is displayed.
  • The angle change button 401 is a button for setting the hand end angle of the hand 6a to the horizontal direction, the angle change button 402 is a button for setting the hand end angle to 45 deg (the angle between the horizontal direction and the hand 6a is 45 deg), and the angle change button 403 is a button for setting the hand end angle to the vertical direction.
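The three buttons can be viewed as a fixed mapping from a button to a commanded hand end angle measured from the horizontal direction. The degree values follow the description above; the table and function names are illustrative assumptions.

```python
# Illustrative mapping from the angle change buttons 401-403 to the
# commanded hand end angle of the hand 6a, measured from the horizontal
# direction as described above.

BUTTON_TO_ANGLE_DEG = {
    401: 0,    # horizontal
    402: 45,   # 45 deg between the horizontal direction and the hand 6a
    403: 90,   # vertical
}

def hand_end_angle_for(button_id):
    """Return the hand end angle (deg) commanded by the pressed button."""
    return BUTTON_TO_ANGLE_DEG[button_id]
```

The hand angle setting unit 38 would translate the returned angle into a control signal for the hand drive mechanism 13.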
  • FIG. 22 shows an example in which the hand end angle of the hand 6a can be set to three angles, namely horizontal, 45 deg, and vertical, but the number of settable hand end angles is not limited to this example. For example, two angles may be settable, or more than three angles may be settable.
  • the display and operation for setting the hand end angle are not limited to FIG. 22.
  • For example, an indicator with a scale indicating the hand end angle may be displayed, and the hand end angle of the hand 6a may be set by moving the scale marker with the operation device 5; alternatively, the hand end angle may be set by a gesture, or by a method other than these.
  • FIG. 23 illustrates an operation method different from that of FIG. 22.
  • In FIG. 23, a side view of the hand 6a is displayed as the display screen 304, and the operator rotates the displayed hand 6a using the operation device 5 to set the hand end angle of the hand 6a. That is, an image including the hand end of the hand 6a is displayed on the hand end angle setting screen, and the hand end angle is set by the operator changing the angle of the hand end in the image using the operation device 5.
  • For example, the operator moves the operation device 5 in the rotation direction 404 while pressing at the position of the hand 6a, and ends the pressing when the desired hand end angle is reached.
  • Alternatively, the target hand end angle may be set by clicking at the target hand end angle with the marker 201.
  • the image of the hand 6a shown in FIG. 23 may be one that has been created in advance by simulating the remote machine 1a, or may be an image captured by a camera 2 that is provided on the manipulator 6 and captures images of the hand 6a from the side. In the latter case, similar to the operation of the posture and position of the remote machine 1 in embodiment 1, the operator may start pressing the operation device 5 after moving the marker 201 to the target hand angle, and continue pressing the operation device 5 until the target hand angle is reached.
  • In the mode in which the posture and position of the remote machine 1 are operated as described in the first embodiment, the hand end angle setting screen of the hand 6a may be displayed when the hand 6a is clicked.
  • Similarly, the position of the hand 6a of the manipulator 6 may be settable, and when the manipulator 6 is clicked, a setting screen for the position of the hand 6a of the manipulator 6 may be displayed.
  • To control the position of the hand 6a, the operation calculator 3c may grasp the joint angles by attaching an AR marker to each joint of the manipulator 6, may construct a robot model internally and grasp the joint angles, or may use other methods.
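As one concrete illustration of the robot-model approach, the hand position can be computed from the grasped joint angles by forward kinematics. The two-link planar simplification and the link lengths below are assumptions for illustration; an actual manipulator 6 would have more joints and a full kinematic model.

```python
import math

# Two-link planar forward kinematics as a minimal stand-in for the
# internal robot model mentioned above. Link lengths (meters) are
# illustrative assumptions.

def hand_position(theta1_rad, theta2_rad, l1=0.3, l2=0.2):
    """Return (x, y) of the hand 6a given the two joint angles."""
    x = l1 * math.cos(theta1_rad) + l2 * math.cos(theta1_rad + theta2_rad)
    y = l1 * math.sin(theta1_rad) + l2 * math.sin(theta1_rad + theta2_rad)
    return x, y
```

Controlling the hand position then amounts to choosing joint angles whose forward-kinematics result matches the commanded position.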
  • the position of the hand 6a of the manipulator 6 may be set by clicking or continuously pressing the operating device 5 at the position of the marker 201 as described in embodiment 1, or, as illustrated in FIG. 22, a button indicating the position of the hand 6a of the manipulator 6 may be displayed and the position of the hand 6a of the manipulator 6 may be set by the button.
  • the remote machine 1a has a hand 6a whose hand end angle can be set, and the image presentation device 4 displays a setting screen for setting the hand end angle of the hand 6a.
  • the operator can also set the hand end angle of the hand 6a with a simple operation.
  • the position of the manipulator 6 and the hand end angle of the hand 6a may be set using the precision operation device 8a, as described in the second embodiment.
  • the operation calculator 3c of this embodiment is realized by the computer system illustrated in FIG. 6, similar to the operation calculator 3 of embodiment 1.
  • although FIG. 21 shows an example in which a hand angle setting function is added to the operation calculator 3b of embodiment 3, the hand angle setting function may also be added by adding a hand angle setting unit 38 to the operation calculator 3 of embodiment 1 or the operation calculator 3a of embodiment 2.
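The press-and-hold angle setting described above can be illustrated with a minimal sketch. The function name, step size, and update-loop structure below are illustrative assumptions, not taken from the disclosure; the sketch only shows the idea that the hand end angle of the hand 6a is driven toward the target for as long as the operation device 5 remains pressed:

```python
def step_hand_end_angle(current_deg, target_deg, step_deg=2.0):
    """One control tick: move the hand end angle toward the target.

    The drive continues only while the operation device 5 is pressed;
    each tick advances the angle by at most step_deg degrees.
    """
    error = target_deg - current_deg
    if abs(error) <= step_deg:
        return target_deg  # target hand end angle reached
    return current_deg + (step_deg if error > 0 else -step_deg)

# Simulate the operator holding the press until the target is reached.
angle = 0.0
while angle != 35.0:
    angle = step_hand_end_angle(angle, 35.0)
```

Releasing the press between ticks would simply stop the loop, leaving the hand at its current angle, which matches the behavior where the operator ends the pressing at the desired hand end angle.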
  • Embodiment 5. FIG. 24 is a diagram showing a configuration example of a remote control system according to the fifth embodiment.
  • a remote control system 100d according to the fifth embodiment is similar to the remote control system 100c according to the fourth embodiment, except that the remote control system 100d according to the fifth embodiment includes an operation calculator 3d instead of the operation calculator 3c.
  • Components having the same functions as those in the fourth embodiment are given the same reference numerals as those in the fourth embodiment, and duplicated explanations are omitted. Below, differences from the fourth embodiment will be mainly explained.
  • the operation calculator 3d is similar to the operation calculator 3c of embodiment 4, except that a motion switching unit 39, which is a motion switch, is added, and the input discrimination unit 31d is provided instead of the input discrimination unit 31c.
  • the motion switching unit 39 switches the operation part of the remote machine 1a in conjunction with switching by the camera switching unit 37. For example, the camera switching unit 37 notifies the motion switching unit 39 of information indicating the selected camera 2, and the motion switching unit 39 switches the operation part based on the notified information.
  • depending on the camera 2 selected by the camera switching unit 37, the motion switching unit 39 selects the cart 7, the manipulator 6, or the hand 6a of the remote machine 1a as the operation part.
  • the motion switching unit 39 notifies the input discrimination unit 31d of the selected operation part.
  • the input discrimination unit 31d outputs the input operation information to the function unit corresponding to the operation part.
  • for example, when the operation part is the cart 7, the input discrimination unit 31d outputs operation information to the target attitude setting unit 33 and the attitude driving unit 34; when the operation part is the hand 6a, the input discrimination unit 31d outputs operation information to the hand angle setting unit 38.
  • when the operation part is the manipulator 6, the input discrimination unit 31d outputs operation information to a manipulator driving unit (not shown), or to the target attitude setting unit 33 and the attitude driving unit 34 functioning as a manipulator driving unit.
  • the motion switching unit 39 may set a gain for the motion of the remote machine 1 when the precision operation device 8a is operated according to the camera 2 selected by the camera switching unit 37.
  • the gain may be determined in advance for each camera 2.
  • the gain may be set instead of switching the operation part, or both the operation part switching and the gain setting may be performed.
  • the motion switching unit 39 outputs the set gain to the target attitude setting unit 33, the attitude driving unit 34, and the hand angle setting unit 38.
  • when the image displayed on the image presentation device 4 is enlarged or reduced, the camera switching unit 37 may also set a gain according to the magnification of the enlargement or reduction.
  • the operation calculator 3d of this embodiment is realized by the computer system illustrated in FIG. 6, similar to the operation calculator 3 of embodiment 1.
  • a motion switch having the function of the motion switching unit 39 may be provided separately from the operation calculator 3d.
  • although FIG. 24 shows an example in which a motion switching function is added to the operation calculator 3c of embodiment 4, the motion switching function may also be added by adding a motion switching unit 39 to the operation calculator 3a of embodiment 2 or the operation calculator 3b of embodiment 3.
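The cooperation between the camera switching unit 37 and the motion switching unit 39 in this embodiment can be summarized as a lookup from the selected camera 2 to an operation part, the function units that receive the operation information, and a motion gain. The camera names, part names, unit labels, and gain values below are illustrative assumptions, not taken from the disclosure:

```python
# Hypothetical mapping from the selected camera 2 to the operation part of
# the remote machine 1a (cart 7, manipulator 6, or hand 6a) and to a
# per-camera motion gain determined in advance.
CAMERA_TABLE = {
    "cart_front_camera": {"part": "cart", "gain": 1.0},
    "hand_camera": {"part": "manipulator", "gain": 0.3},
    "hand_side_camera": {"part": "hand", "gain": 0.1},
}

# Function units that receive operation information for each operation part
# (the routing performed by the input discrimination unit 31d).
PART_TO_UNITS = {
    "cart": ("target_attitude_setting_unit_33", "attitude_driving_unit_34"),
    "manipulator": ("manipulator_driving_unit",),
    "hand": ("hand_angle_setting_unit_38",),
}

def switch_motion(selected_camera, magnification=1.0):
    """Return the operation part, destination units, and effective gain.

    The per-camera gain is further scaled down when the displayed image
    is enlarged (magnification > 1), so the same input produces a finer
    motion of the remote machine.
    """
    entry = CAMERA_TABLE[selected_camera]
    part = entry["part"]
    return part, PART_TO_UNITS[part], entry["gain"] / magnification
```

As described above, an implementation could perform only the operation-part switching, only the gain setting, or both; the sketch combines them in one lookup for brevity.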

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Manipulator (AREA)

Abstract

A remote operation system (100) according to the present disclosure comprises: a remote machine (1) in which a camera (2) is installed and which is remotely operated; a video presentation device (4) for presenting to an operator, as a presentation video, a video captured by the camera (2); an operation device (5) that is operated by the operator, receives, in response to an operation by the operator, an input of a target posture of the remote machine (1) as a position in the presentation video, and receives, in response to an operation by the operator, an input of a position movement instruction indicating that the position of the remote machine (1) is to be moved; and an operation calculator (3) for driving the posture of the remote machine (1) on the basis of the target posture received by the operation device (5), and moving the position of the remote machine (1) on the basis of the position movement instruction received by the operation device (5).

Description

Remote operation system and remote operation method
 This disclosure relates to a remote control system and a remote control method.
 In a remote control system that operates machinery remotely, for example, a combination of a head-mounted display and an operation interface that detects the operator's gestures is used.
 Patent Document 1 discloses a technology that uses a fisheye stereo camera to present a wide-field-of-view image to a remote operator without the need to drive the camera.
JP 2021-180415 A
 However, the technology described in Patent Document 1 requires the operator to wear a head-mounted display and to have stereoscopic perception, and the operator must operate the joystick while sequentially checking the surroundings using stereoscopic perception, which places a high operational burden on the operator.
 The present disclosure has been made in consideration of the above, and aims to provide a remote control system that can reduce the operational burden on the operator.
 In order to solve the above-mentioned problems and achieve the object, the remote operation system disclosed herein comprises a remote machine equipped with a camera and remotely operated, an image presentation device that presents an image captured by the camera to an operator as a presentation image, and an operation device that is operated by the operator and accepts, through the operator's operation, an input of a target attitude of the remote machine as a position in the presentation image, and accepts, through the operator's operation, an input of a position movement instruction indicating that the position of the remote machine is to be moved. The remote operation system further comprises an operation calculator that drives the attitude of the remote machine based on the target attitude received by the operation device, and moves the position of the remote machine based on the position movement instruction received by the operation device.
 The remote control system disclosed herein has the effect of reducing the operational burden on the operator.
FIG. 1 is a diagram showing a configuration example of a remote control system according to a first embodiment.
FIG. 2 is a diagram schematically showing a specific example of the remote control system of the first embodiment.
FIG. 3 is a diagram showing an example of a method for setting a target attitude according to the first embodiment.
FIG. 4 is a diagram showing an example of an operation using the operation device according to the first embodiment.
FIG. 5 is a flowchart showing an example of an operation in the operation calculator according to the first embodiment.
FIG. 6 is a diagram showing a configuration example of a computer system that realizes the operation calculator according to the first embodiment.
FIG. 7 is a diagram showing a configuration example of a remote control system according to a second embodiment.
FIG. 8 is a diagram showing an example of a screen showing an operation method according to the second embodiment.
FIG. 9 is a diagram showing an example of a screen showing an operation method according to the second embodiment.
FIG. 10 is a diagram showing an example of a screen showing an operation method according to the second embodiment.
FIG. 11 is a diagram showing an example in which a terminal device is used as the image presentation device according to the second embodiment.
FIG. 12 is a diagram showing an example of an image display method according to the second embodiment.
FIG. 13 is a diagram showing a configuration example of a remote control system according to a third embodiment.
FIG. 14 is a diagram showing an example of camera switching in the third embodiment.
FIG. 15 is a diagram showing an example of a camera provided inside a hand of a manipulator according to the third embodiment.
FIG. 16 is a diagram showing an example of a camera according to the third embodiment that photographs the manipulator from the side.
FIG. 17 is a diagram showing an example of image switching in the third embodiment.
FIG. 18 is a diagram showing an example of superimposing images captured from the side in the third embodiment.
FIG. 19 is a diagram showing an example of a pressure detection and pressure display method according to the third embodiment.
FIG. 20 is a diagram showing another example of a pressure detection and pressure display method according to the third embodiment.
FIG. 21 is a diagram showing a configuration example of a remote control system according to a fourth embodiment.
FIG. 22 is a diagram showing an example of a hand end angle setting screen according to the fourth embodiment.
FIG. 23 is a diagram showing an example of a hand end angle setting screen according to the fourth embodiment.
FIG. 24 is a diagram showing a configuration example of a remote control system according to a fifth embodiment.
 Below, the remote control system and remote control method according to the embodiments will be described in detail with reference to the drawings.
Embodiment 1.
FIG. 1 is a diagram showing a configuration example of a remote operation system according to embodiment 1. A remote operation system 100 of this embodiment includes a remote machine 1, a camera 2, an operation calculator 3, an image presentation device 4, and an operation device 5.
 The remote machine 1 is a machine that is remotely operated, and may be, for example, but is not limited to, a combination of a cart and a manipulator, a mobile vehicle, a manipulator, a humanoid robot, etc. In the following, as an example, a case where the remote machine 1 is a combination of a cart and a manipulator will be mainly described.
 The remote machine 1 includes an attitude drive mechanism 11 and a position drive mechanism 12. The attitude drive mechanism 11 is a mechanism that can change the attitude of the remote machine 1, and changes the attitude of the remote machine 1 according to a control signal received from the operation calculator 3. Here, the attitude indicates the orientation of the remote machine 1. For example, if the remote machine 1 is a combination of a cart and a manipulator, the target attitude indicates the target orientation of the cart. For example, if the remote machine 1 is a vehicle, the target attitude indicates the target orientation of the vehicle. For example, if the remote machine 1 is a manipulator, the base of the manipulator is rotatable, and the target attitude indicates the target orientation of the manipulator. For example, the attitude drive mechanism 11 includes an actuator that changes the orientation of the wheels of the remote machine 1. The attitude drive mechanism 11 may be a drive mechanism that can change the orientation of the remote machine 1 on the spot, or may be a drive mechanism that can change the orientation by turning. Also, for example, if the remote machine 1 is a combination of a cart and a manipulator, the target attitude may be appropriately allocated to the orientation of the cart and the target orientation of the manipulator.
 The position drive mechanism 12 is a mechanism capable of changing the position of the remote machine 1, and changes the position of the remote machine 1 in response to a control signal received from the operation calculator 3. The position drive mechanism 12 includes, for example, an actuator capable of changing at least one of the horizontal and vertical positions of the remote machine 1.
 The camera 2 is mounted on the remote machine 1 and captures the surroundings of the remote machine 1. There is no restriction on the number of cameras 2, and one or more may be used. In the following embodiment, an example in which the camera 2 includes two fisheye cameras with different viewpoints will be described, but the camera 2 is not limited to a fisheye camera.
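A position that the operator later specifies in the fisheye image must ultimately be related to a direction in space. The disclosure does not specify the projection model of the camera 2; as one common choice, an equidistant fisheye model can convert an image point into a bearing. The following sketch uses that assumed model with illustrative parameter names:

```python
import math

def fisheye_pixel_to_bearing(u, v, cx, cy, f):
    """Convert an image point of an equidistant-model fisheye camera to a
    unit direction vector in the camera frame.

    Equidistant model (an assumption, not from the disclosure): the
    distance r from the image center (cx, cy) is proportional to the
    angle theta from the optical axis, r = f * theta.
    """
    dx, dy = u - cx, v - cy
    r = math.hypot(dx, dy)
    if r == 0.0:
        return (0.0, 0.0, 1.0)  # looking straight along the optical axis
    theta = r / f              # angle from the optical axis
    phi = math.atan2(dy, dx)   # azimuth around the optical axis
    s = math.sin(theta)
    return (s * math.cos(phi), s * math.sin(phi), math.cos(theta))
```

With such a conversion, the marker position in the presented image can be mapped to a direction relative to the camera, which is the kind of information needed to derive the target orientation of the remote machine from a point in the image.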
 The image presentation device 4 is, for example, a display device such as a monitor, a display, or a smartphone monitor, and presents an image to an operator who remotely operates the remote machine 1. The image presentation device 4 presents, for example, an image captured by the camera 2 to the operator as a presented image. The operation device 5 is a device operated by the operator, such as, but not limited to, a mouse, a keyboard, a touchpad, or a device that performs screen operation by detecting a face or line of sight. For example, as described below, the operation device 5 accepts input of a target attitude of the remote machine 1 as a position in the presented image by the operator's operation, and accepts input of a position movement instruction indicating that the position of the remote machine 1 is to be moved by the operator's operation. The image presentation device 4 and the operation device 5 may also be integrated, and a touch panel, a smartphone, or the like may be used. In the following, as an example, an example in which the operation device 5 is a mouse and the image presentation device 4 is a monitor will be mainly described.
 The operation calculator 3 receives the image captured by the camera 2 from the camera 2, and outputs the received image to the image presentation device 4, thereby causing the image presentation device 4 to display the image. The operation calculator 3 also controls the remote machine 1 according to the operation content accepted by the operation device 5. For example, the operation calculator 3 drives the attitude of the remote machine 1 based on the target attitude accepted by the operation device 5, and moves the position of the remote machine 1 based on the position movement instruction accepted by the operation device 5. The camera 2 and the operation calculator 3 may be directly connected, or the camera 2 and the operation calculator 3 may each have a communication unit (not shown), and data may be transmitted and received by communication. The communication between the camera 2 and the operation calculator 3 may be wired communication, wireless communication, or a combination of wired and wireless communication. Similarly, the operation calculator 3 and the remote machine 1 may be directly connected, or the operation calculator 3 and the remote machine 1 may each have a communication unit (not shown), and data may be transmitted and received by communication. Similarly, the image presentation device 4 and the operation device 5 may each be directly connected to the remote machine 1, or the image presentation device 4, the operation device 5, and the operation calculator 3 may be provided with communication units (not shown), and data may be transmitted and received through communication. The communication between each of the image presentation device 4 and the operation device 5 and the remote machine 1 may be wired communication, wireless communication, or a combination of wired and wireless communication.
 Furthermore, the operation calculator 3 may be provided at a first location where the remote machine 1 is located, or at a second location where the image presentation device 4 and the operation device 5 are located, i.e., on the operator's side. For example, the operation calculator 3 may be mounted on the remote machine 1. Furthermore, the operation calculator 3 may be provided at a location different from both the first location and the second location. For example, the operation calculator 3 may be provided on a cloud server.
 The operation calculator 3 includes an input discrimination unit 31, a display information generation unit 32, a target attitude setting unit 33, an attitude drive unit 34, a target attitude determination unit 35, and a position drive unit 36. The input discrimination unit 31 acquires operation information indicating the operation content from the operation device 5, and outputs the operation information to the corresponding functional unit according to the content of the acquired operation information. The display information generation unit 32 generates display data indicating a display screen to be displayed on the image presentation device 4 using the image received from the camera 2 and the operation information received from the input discrimination unit 31, and transmits the generated display data to the image presentation device 4. The display information generation unit 32 also outputs the image received from the camera 2 to the target attitude setting unit 33.
 The target attitude setting unit 33 sets a target attitude of the remote machine 1 based on the operation information received from the input discrimination unit 31, and notifies the attitude driving unit 34 and the target attitude determination unit 35 of the set target attitude. When the attitude driving unit 34 is notified of the target attitude by the target attitude setting unit 33, it generates a control signal for driving the attitude of the remote machine 1 based on the target attitude, and transmits the generated control signal to the remote machine 1. Furthermore, when the attitude driving unit 34 is notified by the target attitude determination unit 35 that the target attitude has not been reached, it generates a control signal for driving the attitude of the remote machine 1, and transmits the generated control signal to the remote machine 1.
 The target attitude determination unit 35 determines whether the attitude of the remote machine 1 has become the target attitude, and if it determines that it has become the target attitude, it notifies the position drive unit 36 that it has become the target attitude. If the target attitude determination unit 35 determines that it has not become the target attitude, it notifies the attitude drive unit 34 that it has not become the target attitude. When notified by the target attitude determination unit 35 that the target attitude has been achieved, if the operation information received from the input discrimination unit 31 indicates that the position of the remote machine 1 is to be moved, the position drive unit 36 generates a control signal for moving the position of the remote machine 1 and transmits the generated control signal to the remote machine 1.
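The interaction among the target attitude setting unit 33, the attitude driving unit 34, the target attitude determination unit 35, and the position driving unit 36 can be summarized as a small decision rule, sketched below. The function and state names are illustrative, not taken from the disclosure:

```python
def control_tick(pressed, at_target_attitude):
    """One tick of the operation calculator 3's flow.

    While the operation device 5 is pressed, the attitude is driven
    until the target attitude determination succeeds; a press that
    continues past that point is treated as a position movement
    instruction. Releasing the press stops the remote machine 1.
    """
    if not pressed:
        return "stopped"
    if not at_target_attitude:
        return "attitude_drive"
    return "position_drive"
```

For instance, a press while the attitude has not yet reached the target yields attitude driving; once the determination unit reports the target attitude, the continued press yields position driving, and releasing the press at any time stops the machine.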
 FIG. 2 is a diagram schematically showing a specific example of the remote operation system 100 of the present embodiment. In the example shown in FIG. 2, the remote machine 1 includes a manipulator 6 and a cart 7; a fisheye camera 2-1 is provided on the front of the cart 7, and a fisheye camera 2-2 is provided near the hand 6a of the manipulator 6. In the example shown in FIG. 2, the image presentation device 4 is a monitor, and the operation device 5 is a mouse. The image presentation device 4 displays an image captured by the camera 2-1 or an image captured by the camera 2-2. The cameras 2-1 and 2-2 are both examples of the camera 2 shown in FIG. 1. In this embodiment, camera switching, that is, switching between displaying the image captured by the camera 2-1 and displaying the image captured by the camera 2-2 on the image presentation device 4, is performed by the operator's operation. The operator remotely operates the target attitude and the position movement of the remote machine 1 by operating the operation device 5 while watching the image displayed on the image presentation device 4.
 FIG. 3 is a diagram showing an example of the method for setting the target attitude in this embodiment, and shows an example of a display screen of the image presentation device 4. In the example shown in FIG. 3, an image captured by the camera 2-1 is displayed on the image presentation device 4. This image shows the upper part of the cart 7 and the hand 6a of the manipulator 6. In the example shown in FIG. 3, a marker 201 indicating the target attitude is displayed together with the image captured by the camera 2-1. The marker 201 in the display screen can be moved by the operator operating the operation device 5. For example, if the operation device 5 is a mouse, the position of the marker 201 in the display screen moves in accordance with the movement of the operation device 5, and the position of the marker 201 in the display screen is determined by pressing the mouse, which is the operation device 5, thereby specifying the target attitude. Note that in FIG. 3, the cart 7, the hand 6a, and the marker 201 are given the reference numerals 7, 6a, and 201, respectively, for the purpose of explanation, but these reference numerals are not displayed on the display screen. Similarly, in the subsequent figures of display screens, reference numerals are given for the purpose of explanation but are not displayed on the display screen.
 In this embodiment, for example, the remote machine 1 operates while the mouse, which is the operation device 5, is pressed. More specifically, when the position of the marker 201, i.e., the target attitude, is determined by pressing the mouse, which is the operation device 5, the operation calculator 3 controls the attitude of the remote machine 1 so that the cart 7 rotates in a direction corresponding to the target attitude. That is, the operation calculator 3 may display, in the presented image, the marker 201 that can be moved by operating the operation device 5, and may set the position of the marker 201 at the time when the operation device 5 is pressed by the operator as the target attitude. Then, after starting to drive the attitude of the remote machine 1 by setting the target attitude, when the operation calculator 3 determines that the attitude of the remote machine 1 has reached the target attitude, it may judge that a position movement instruction has been accepted by the operation device 5 if the pressing of the operation device 5 is still continuing at that time, and may start moving the position of the remote machine 1.
 When the pressing of the mouse, which is the operation device 5, ends while the remote machine 1 is operating, the operation calculator 3 stops driving the attitude of the remote machine 1 and stops moving the position of the remote machine 1. When the remote machine 1 has been driven to the target attitude, that is, when the orientation of the remote machine 1 has become the orientation corresponding to the target attitude, the operation calculator 3 controls the position of the remote machine 1 so that the remote machine 1 starts moving in the direction of travel in that attitude. While the mouse, which is the operation device 5, is pressed, the operation calculator 3 continues to move the position of the remote machine 1, and when the pressing of the mouse, which is the operation device 5, stops, the operation calculator 3 stops the remote machine 1.
 FIG. 4 is a diagram showing an example of an operation using the operation device 5 of this embodiment. Each diagram in FIG. 4 shows an example of a display screen of the image presentation device 4, as in FIG. 3, and a marker 201 indicating the target attitude is displayed on each display screen together with an image captured by the camera 2-1. FIG. 4 shows an example in which a mouse is used as the operation device 5. The first display screen in FIG. 4 shows a state in which a target is set. In detail, the operator moves the marker 201 to a position to be set as the target attitude and presses the mouse, which is the operation device 5, to set the target attitude. In FIG. 4, the vector from the center of the remote machine 1 to the front (front face) of the remote machine 1 is a vector 202, and the vector corresponding to the target attitude is a vector 203. The operation calculator 3 calculates the vector 202 and the vector 203 based on the image and the position of the marker 201 in the image, and starts control to rotate the cart 7 of the remote machine 1 counterclockwise so that the vector 202 coincides with the vector 203.
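The alignment of the vector 202 (toward the front of the remote machine 1) with the vector 203 (corresponding to the target attitude) can be sketched as a 2-D signed-angle computation. The coordinate convention, function name, and tolerance below are illustrative assumptions, not taken from the disclosure:

```python
import math

def rotation_command(front, target, tol_deg=1.0):
    """Decide the cart rotation that aligns the front vector 202 with
    the target vector 203 (both 2-D, e.g. in a ground plane).

    The sign of the 2-D cross product gives the rotation direction;
    the signed angle decides when the target attitude is reached.
    """
    cross = front[0] * target[1] - front[1] * target[0]
    dot = front[0] * target[0] + front[1] * target[1]
    angle = math.degrees(math.atan2(cross, dot))  # signed angle to target
    if abs(angle) <= tol_deg:
        return "attitude_reached"
    return "rotate_ccw" if angle > 0 else "rotate_cw"
```

In the situation of FIG. 4, where the marker lies to the left of the machine's front direction, such a rule would emit counterclockwise rotation commands until the two vectors coincide within the tolerance.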
 If the operator keeps pressing the mouse serving as the operation device 5 after the target attitude is set, the attitude of the carriage 7 of the remote machine 1 changes, that is, the orientation of the carriage 7 rotates counterclockwise, until the marker 201 is positioned directly in front of the carriage 7, as shown in the second display screen of FIG. 4. This ends the attitude drive.
 If the mouse serving as the operation device 5 continues to be pressed in the state shown in the second display screen of FIG. 4, the operation calculator 3 starts the position drive that moves the carriage 7 of the remote machine 1 forward, as shown in the third display screen of FIG. 4. The position drive of the carriage 7 then continues until the press of the mouse ends.
 In the example shown in FIG. 4, the operator sets the target attitude by moving the mouse serving as the operation device 5 to the desired position in the image projected on the display screen and pressing the mouse; by keeping the mouse pressed, the remote machine 1 is driven to the target attitude as described above. Then, if the operator is still pressing the mouse when the remote machine 1 has reached the target attitude, the position drive of the remote machine 1 starts. In this embodiment, the operator need not wear a head-mounted display and stereoscopic perception is not required, which reduces the operator's operational burden. Moreover, the operator can change the attitude and position of the remote machine 1 simply by moving and pressing the mouse while watching the display screen of the image presentation device 4, so remote operation can be performed with simple operations.
 In the example described above, the operator instructs the remote machine 1 to continue operating by keeping the mouse serving as the operation device 5 pressed. That is, in the example above, pressing the mouse after the remote machine 1 has been driven to the target attitude means the start of the position drive, and ending the press thereafter means the end of the position drive. The specific method of designating the operation of the remote machine 1 is not limited to this example. For instance, the following operation may be used. On the first display screen of FIG. 4, the operator moves the marker 201 to the position corresponding to the target attitude and then clicks or double-clicks the mouse serving as the operation device 5 to confirm the target attitude. After that, the operator does not press the mouse while the attitude of the remote machine 1 is being changed, and the operation calculator 3 stops the remote machine 1 when its attitude reaches the target attitude. In this state, the position drive may start when the operator presses the mouse and end when the press ends. Alternatively, when the attitude of the remote machine 1 reaches the target attitude, the position drive may start when the operator clicks or double-clicks the mouse and end when the operator clicks or double-clicks again.
 The operation device 5 may also be a touchpad; in that case, for example, pressing the touchpad may be used instead of pressing the mouse, and a tap or double tap on the touchpad may be used instead of a mouse click or double-click. The operation device 5 may also be a touch panel integrated with the image presentation device 4; in that case, the operator determines the target attitude by touching the position on the touch panel corresponding to the target attitude on the display screen, and, for example, pressing the touch panel may be used instead of pressing the mouse, and a tap or double tap on the touch panel may be used instead of a mouse click or double-click. The operation device 5 may also be a keyboard. In that case, for example, specific keys such as the arrow keys may be assigned to up, down, left, and right movement, the marker 201 may be moved with these keys, and pressing a specific key such as the Enter key may be treated in the same way as pressing the mouse.
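 The device variations above all reduce to the same abstract operations (press-and-hold, click, double-click). As a non-limiting sketch of such a mapping, with all device and event names being illustrative assumptions rather than terms from this disclosure:

```python
# Hypothetical mapping from device-specific raw inputs to the
# abstract operations the description uses interchangeably across
# mouse, touchpad, touch panel, and keyboard.
EVENT_MAP = {
    ("mouse", "button_down_held"): "press",
    ("mouse", "click"): "click",
    ("mouse", "double_click"): "double_click",
    ("touchpad", "press_held"): "press",
    ("touchpad", "tap"): "click",
    ("touchpad", "double_tap"): "double_click",
    ("touch_panel", "press_held"): "press",
    ("touch_panel", "tap"): "click",
    ("touch_panel", "double_tap"): "double_click",
    ("keyboard", "enter_held"): "press",
}

def normalize(device, raw_event):
    """Return the abstract operation, or "ignore" if unmapped."""
    return EVENT_MAP.get((device, raw_event), "ignore")
```

 With such a mapping, the control logic downstream can be written once against "press", "click", and "double_click" regardless of which operation device 5 is connected.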
 FIG. 5 is a flowchart showing an example of the operation of the operation calculator 3 of this embodiment. FIG. 5 shows an example in which the operator remotely operates the remote machine 1 by starting to press the operation device 5 at the position indicating the target attitude and keeping it pressed while the remote machine 1 operates. The operation calculator 3 first determines whether a target-setting input has been made (step S1). Specifically, the input discrimination unit 31 determines whether the operation information received from the operation device 5 indicates the setting of a target attitude. As described above, the information indicating the setting of a target attitude is, for example, information input from the operation device 5 when the operation device 5 is pressed, and includes information indicating the position of the operation device 5 (the position on the display screen). While the operation device 5 is pressed, the input discrimination unit 31 also notifies the position drive unit 36 of operation information indicating that the movement instruction continues to be input. If there is no target-setting input (step S1: No), the operation calculator 3 repeats step S1.
 If a target-setting input has been made (step S1: Yes), the operation calculator 3 sets the target attitude (step S2). Specifically, the input discrimination unit 31 outputs the operation information to the target attitude setting unit 33, and the target attitude setting unit 33 sets the target attitude using the image acquired from the display information generation unit 32 and the operation information indicating the setting of the target attitude acquired from the input discrimination unit 31, and notifies the attitude drive unit 34 and the target attitude determination unit 35 of the set target attitude. For example, the target attitude setting unit 33 obtains vector 202 illustrated in the first display screen of FIG. 4 from the image, obtains vector 203 from the operation information and the image, and calculates the difference between vector 202 and vector 203 to obtain, as the target attitude, the direction and angle by which the orientation of the remote machine 1 is to be rotated. The target attitude is thus calculated as a value relative to the current attitude, that is, as the amount of change from the current attitude.
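 The difference between vector 202 and vector 203 can be expressed, for example, as a signed planar angle. The following is a minimal sketch under the assumption that both vectors are given in a common two-dimensional ground frame; the function name and the sign convention (positive = counterclockwise, matching the FIG. 4 example) are illustrative, not taken from this disclosure:

```python
import math

def relative_rotation(forward, target):
    """Signed angle (radians) rotating `forward` onto `target`.

    `forward` plays the role of vector 202 (machine center toward
    front face) and `target` the role of vector 203 (toward the
    target attitude).  atan2 of the 2-D cross and dot products
    yields the signed angle in (-pi, pi].
    """
    fx, fy = forward
    tx, ty = target
    cross = fx * ty - fy * tx   # positive when target is CCW of forward
    dot = fx * tx + fy * ty
    return math.atan2(cross, dot)
```

 In the FIG. 4 scenario, a positive result would correspond to the counterclockwise rotation of the carriage 7 that continues until the two vectors coincide.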
 Next, the operation calculator 3 performs the attitude drive (step S3). Specifically, the attitude drive unit 34 generates a control signal for driving the attitude of the remote machine 1 based on the target attitude notified by the target attitude setting unit 33, and transmits the generated control signal to the remote machine 1. For example, the attitude drive unit 34 generates the control signal so that the orientation changes at a constant speed toward the target attitude received from the target attitude setting unit 33.
 Next, the operation calculator 3 determines whether the target attitude has been reached (step S4). Specifically, the target attitude determination unit 35 determines whether the remote machine 1 has reached the target attitude based on the target attitude notified by the target attitude setting unit 33. As described above, when the target attitude is expressed as a value relative to the current attitude, the target attitude determination unit 35 determines, for example, that the remote machine 1 has reached the target attitude when the target attitude is equal to or less than a threshold. The threshold is predetermined and may be set, for example, to 0, or to a value corresponding to an error range within which the target attitude, that is, the remaining amount of attitude change of the remote machine 1, can be regarded as 0.
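 The threshold comparison of step S4 can be sketched as follows; the function name, the radian unit, and the default threshold of 0 are assumptions for illustration only:

```python
def reached_target(remaining_angle, threshold=0.0):
    """Step S4 check (sketch): the target attitude is held as a
    relative rotation, so the machine is considered to have reached
    it when the remaining amount is at or below a predetermined
    threshold (0, or a small error margin treated as 0)."""
    return abs(remaining_angle) <= threshold
```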
 If it is determined that the target attitude has been reached (step S4: Yes), the operation calculator 3 determines whether the movement instruction continues to be input (step S5). Specifically, when it is determined that the target attitude has been reached, the target attitude determination unit 35 notifies the position drive unit 36 to that effect; upon receiving this notification, the position drive unit 36 determines Yes in step S5 if it is receiving, from the input discrimination unit 31, operation information indicating that the movement continues. If it is determined that the movement instruction continues to be input (step S5: Yes), the operation calculator 3 performs the position drive (step S6). Specifically, the position drive unit 36 generates a control signal for moving the remote machine 1 forward and transmits the generated control signal to the remote machine 1.
 If it is determined in step S4 that the target attitude has not been reached (step S4: No), the operation calculator 3 repeats the processing from step S3. If it is determined in step S5 that the movement instruction is no longer being input (step S5: No), the operation calculator 3 ends the processing.
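 The flow of steps S1 through S6 can be summarized as a small state machine. The sketch below is a simplified model, not the specification's implementation; the state names and command strings are hypothetical labels:

```python
def control_step(state, target_set, pressing, at_target):
    """One iteration of the FIG. 5 flow (steps S1-S6), simplified.

    Returns (new_state, command), where `command` stands in for the
    control signal the operation calculator would send.
    """
    if state == "WAIT_TARGET":                    # step S1
        if target_set:
            return "ATTITUDE_DRIVE", "rotate"     # steps S2-S3
        return "WAIT_TARGET", "idle"
    if state == "ATTITUDE_DRIVE":
        if not at_target:                         # step S4: No -> back to S3
            return "ATTITUDE_DRIVE", "rotate"
        if pressing:                              # step S5: Yes
            return "POSITION_DRIVE", "forward"    # step S6
        return "DONE", "stop"                     # step S5: No
    if state == "POSITION_DRIVE":
        if pressing:                              # hold continues the drive
            return "POSITION_DRIVE", "forward"
        return "DONE", "stop"                     # press ended
    return "DONE", "stop"
```

 Driving this function once per control cycle with the current input state reproduces the press-and-hold behavior: rotate until the target attitude, then move forward while the press continues, then stop.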
 Although FIG. 5 shows an example in which the operator remotely operates the remote machine 1 by starting to press the operation device 5 at the position indicating the target attitude and keeping it pressed while the remote machine 1 operates, the operation when another operation method is used is modified as appropriate according to that method but is generally similar to FIG. 5. For example, when the operation device 5 is clicked at the position indicating the target attitude and the position drive is started by clicking again after the remote machine 1 reaches the target attitude, the operation calculator 3 determines Yes in step S5 upon receiving operation information indicating that a click was made after the remote machine 1 reached the target attitude.
 In the example described above, the position drive is performed after the attitude drive; however, this is not limiting, and the operations for the attitude drive and the position drive may be performed separately. For example, the position drive of the remote machine 1 may be performed when the operator performs a predetermined operation such as a double click or double tap using the operation device 5, and the attitude drive may be performed by designating a position using the operation device 5 and continuing to press (long press).
 Next, the hardware configuration of the operation calculator 3 of this embodiment will be described. The operation calculator 3 of this embodiment shown in FIG. 1 is realized by a computer system executing a program, that is, a computer program in which the processing of the operation calculator 3 is described; by executing this program, the computer system functions as the operation calculator 3. FIG. 6 is a diagram showing a configuration example of a computer system that realizes the operation calculator 3 of this embodiment. As shown in FIG. 6, this computer system includes a control unit 101, an input unit 102, a storage unit 103, a display unit 104, a communication unit 105, and an output unit 106, which are connected via a system bus 107. The control unit 101 and the storage unit 103 constitute a processing circuit.
 In FIG. 6, the control unit 101 is, for example, a processor such as a CPU (Central Processing Unit) and executes the program in which the processing of the operation calculator 3 of this embodiment is described. Part of the control unit 101 may be realized by dedicated hardware such as a GPU (Graphics Processing Unit) or an FPGA (Field-Programmable Gate Array). The input unit 102 is, for example, a button, keyboard, mouse, or touchpad. The storage unit 103 includes various memories such as a RAM (Random Access Memory) and a ROM (Read Only Memory) and a storage device such as a hard disk, and stores the program to be executed by the control unit 101, necessary data obtained in the course of processing, and the like. The storage unit 103 is also used as a temporary storage area for the program. The display unit 104 is, for example, a display, as described above. The display unit 104 and the input unit 102 may be integrated and realized as a touch panel or the like. The communication unit 105 is a receiver and a transmitter that perform communication processing. The output unit 106 is, for example, a speaker. FIG. 6 is an example, and the configuration of the computer system is not limited to the example of FIG. 6; for instance, the computer system realizing the operation calculator 3 of this embodiment need not include the display unit 104 and the output unit 106.
 Here, an example of the operation of the computer system until the program of this embodiment becomes executable will be described. In a computer system with the configuration described above, the computer program is installed in the storage unit 103 from, for example, a CD (Compact Disc)-ROM or DVD (Digital Versatile Disc)-ROM set in a CD-ROM drive or DVD-ROM drive (not shown). When the program is executed, the program read from the storage unit 103 is stored in the main storage area of the storage unit 103. In this state, the control unit 101 executes the processing of the operation calculator 3 of this embodiment according to the program stored in the storage unit 103.
 In the description above, the program describing the processing of the operation calculator 3 is provided using a CD-ROM or DVD-ROM as a recording medium; however, this is not limiting, and, depending on the configuration of the computer system, the size of the provided program, and the like, a program provided via a transmission medium such as the Internet may be used.
 The operation device 5 shown in FIG. 1 may be the input unit 102 of the computer system realizing the operation calculator 3 shown in FIG. 1, or may be provided separately from the input unit 102. The image presentation device 4 shown in FIG. 1 may be the display unit 104 of that computer system, or may be provided separately from the display unit 104.
 The input discrimination unit 31, display information generation unit 32, target attitude setting unit 33, attitude drive unit 34, target attitude determination unit 35, and position drive unit 36 shown in FIG. 1 are realized by the control unit 101 shown in FIG. 6 executing the computer program stored in the storage unit 103 shown in FIG. 6. The storage unit 103 shown in FIG. 6 is also used to realize these units.
 As described above, the remote operation system 100 of this embodiment includes the remote machine 1, the image presentation device 4 that presents the image from the camera 2 to the operator, the operation device 5, and the operation calculator 3. The operation calculator 3 drives the attitude of the remote machine 1 based on the target attitude designated by the operation device 5 as a position within the image, and moves the position of the remote machine 1 based on instructions to start and end the position drive that are input using the operation device 5. This reduces the operator's operational burden.
Embodiment 2.
 FIG. 7 is a diagram showing a configuration example of a remote operation system according to Embodiment 2. The remote operation system 100a of this embodiment is the same as the remote operation system 100 of Embodiment 1 except that it includes an operation calculator 3a instead of the operation calculator 3 and that a precision operation device 8a is added. Components having the same functions as in Embodiment 1 are given the same reference numerals as in Embodiment 1, and duplicate descriptions are omitted. The differences from Embodiment 1 are mainly described below.
 The precision operation device 8a is a device that enables more precise operation than the operation device 5, for example, a joystick, an operation device combining a joystick and a dial, or a mouse capable of precision operation. When a joystick is used as the precision operation device 8a and the remote machine 1 has a plurality of drive axes, the drive axis to be operated may be switched by a button on the joystick or by an operation on the operation device 5. The precision operation device 8a may also be identical to the operation device 5 as hardware. For example, a mouse switchable between a normal mode and a precision operation mode may be set to the normal mode when used as the operation device 5 and to the precision operation mode when used as the precision operation device 8a. Also, when the operation device 5 is, for example, a touch panel, mouse, or keyboard, the operation calculator 3a may enable precision operation by the operator by enlarging the image displayed on the image presentation device 4 in response to an operation of the operation device 5. The enlargement operation may be an operation generally used for enlarging a screen; for example, when the operation device 5 is a touch panel, a pinch-out may be used as the enlargement operation. The enlargement operation is not limited to this: a button or the like for enlargement may be displayed on the image presentation device 4, and the position to be enlarged and displayed may be moved using the operation device 5.
 The operation calculator 3a is the same as the operation calculator 3 of Embodiment 1 except that an input switch 8b is added and an input discrimination unit 31a is provided instead of the input discrimination unit 31. In FIG. 7, the input switch 8b is provided separately from the operation calculator 3a, but the input switch 8b may be provided within the operation calculator 3a.
 The input switch 8b switches the target of the operator's input between the operation device 5 and the precision operation device 8a. For example, the input switch 8b has a function of switching between normal operation (normal operation mode) and precision operation (precision operation mode). In the normal operation mode, the input switch 8b outputs the operation information input from the operation device 5 to the input discrimination unit 31a. When the input discrimination unit 31a receives operation information input from the operation device 5 via the input switch 8b, it operates in the same way as in Embodiment 1. In the precision operation mode, the input switch 8b outputs the operation information input from the precision operation device 8a to the input discrimination unit 31a. The input discrimination unit 31a outputs this operation information to the attitude drive unit 34 or the position drive unit 36 depending on its content.
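 The routing behavior of the input switch 8b described above can be sketched as follows; the class name, method names, and mode labels are illustrative assumptions, not terms from this disclosure:

```python
class InputSwitch:
    """Sketch of the input switch 8b: forwards operation information
    from either the operation device 5 ("operation_device") or the
    precision operation device 8a ("precision_device") depending on
    the current mode, and drops input from the other source."""

    def __init__(self):
        self.mode = "normal"   # "normal" or "precision"

    def toggle(self):
        """Switch between normal operation and precision operation."""
        self.mode = "precision" if self.mode == "normal" else "normal"

    def route(self, source, operation_info):
        """Return the operation information to pass to the input
        discrimination unit, or None if this source is inactive."""
        if self.mode == "normal" and source == "operation_device":
            return operation_info
        if self.mode == "precision" and source == "precision_device":
            return operation_info
        return None
```

 The `toggle` trigger corresponds to any of the switching methods described below (a specific operation, an on-screen button, or a gesture).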
 When the operation device 5 and the precision operation device 8a are different devices, the input switch 8b may, for example, switch between normal operation and precision operation depending on which of the two devices the input came from, or may switch between them when a specific operation is performed on the operation device 5. Alternatively, buttons or the like for switching the operation may be provided on the image presentation device 4, and the input switch 8b may switch between normal operation and precision operation when these buttons are pressed using the operation device 5; or the operator's gesture may be detected and used to switch between normal operation and precision operation. When the remote machine 1 has the manipulator 6, the attitude drive unit 34 of the remote machine 1 may also function as a manipulator drive unit that drives each joint of the manipulator 6 and the hand 6a. In this case, the attitude drive unit 34 may generate a control signal for controlling the manipulator 6 in response to an operation of the precision operation device 8a and transmit the generated control signal to the remote machine 1. Alternatively, the manipulator drive unit may be provided separately from the attitude drive unit 34.
 The operation method described in Embodiment 1 can reduce the operator's operational burden, but because operation is performed with the operation device 5 such as a mouse, it may be difficult to operate the remote machine 1 precisely. Also, for example, when the remote machine 1 has the manipulator 6 and approaches the object to be handled by the manipulator 6, it may be easier for the operator to perform precise remote operation using an operation that corresponds directly to the drive axes of the remote machine 1, such as joystick operation, than using the operation method described in Embodiment 1. Because this embodiment includes the input switch 8b that switches between normal operation and precision operation, the operator can operate the remote machine 1 more appropriately.
 When operation is performed by precision operation, a screen showing the operation method corresponding to the precision operation device 8a may be displayed on the image presentation device 4. The display screen of a single image presentation device 4 may be split between the image captured by the camera 2 and the screen showing the operation method, or a plurality of image presentation devices 4 may be provided and the image captured by the camera 2 and the screen showing the operation method may be displayed on different image presentation devices 4.
 FIGS. 8 to 10 are diagrams each showing an example of a screen showing the operation method of this embodiment. FIGS. 8 to 10 are display examples of the operation method when a joystick is used as the precision operation device 8a, and show the joystick operations in the Arm Joystick Mode, Wrist Mode, and Drive Joystick Mode, respectively, together with the corresponding actions of the remote machine 1 and images schematically showing those actions. The display information generation unit 32 receives an input concerning the mode setting from the input discrimination unit 31 and controls the image presentation device 4 so that, for example, one of FIGS. 8 to 10 is displayed according to the current mode. This allows the operator to operate the joystick while checking the operation method, so the operator can operate the remote machine 1 even without having memorized the correspondence between joystick operations and the actions of the remote machine 1. Operator errors can also be suppressed. FIGS. 8 to 10 are examples; the settable modes and the correspondence between each joystick operation and the action of the remote machine 1 are not limited to the examples shown in FIGS. 8 to 10. Although FIGS. 8 to 10 show an example using a joystick as the precision operation device 8a, when a precision operation device 8a other than a joystick is used, displaying an operation method corresponding to that device provides the same effect as with a joystick. Similarly, an operation method corresponding to the operation device 5 may also be displayed for normal operation using the operation device 5, not only for precision operation.
A portable terminal device such as a smartphone or tablet may also be used as the image presentation device 4. FIG. 11 is a diagram showing an example in which a terminal device is used as the image presentation device 4 of this embodiment. When a terminal device is used as the image presentation device 4, the image corresponding to the location over which the terminal device is held is displayed on the terminal device. A virtual wide monitor can thus be realized by moving the image presentation device 4. For example, the image displayed on the terminal device serving as the image presentation device 4 may also be capable of being enlarged and reduced. Both a monitor such as that shown in FIG. 2 and a terminal device may each be used as an image presentation device 4. For example, by displaying on the terminal device, according to its position, an image contiguous with the image displayed on the monitor, a virtual screen wider than the monitor can be realized.
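The virtual wide monitor described above can be sketched as choosing, from a wide source image, the region that corresponds to where the terminal is currently held. The function below is an illustrative assumption, not taken from the patent: the names and the pixels-per-metre scale are hypothetical.

```python
# Sketch of the virtual wide monitor: the region of a wide image shown on the
# terminal device is derived from the terminal's tracked horizontal position.
# The pixels-per-metre scale is a hypothetical calibration constant.

def viewport_for_terminal(panorama_width: int, view_width: int,
                          terminal_offset_m: float,
                          pixels_per_m: float = 2000.0) -> tuple:
    """Return (left, right) pixel columns of the wide image to display.

    terminal_offset_m: horizontal displacement of the terminal from the
    centre of the virtual wide screen, in metres; positive = to the right.
    """
    center = panorama_width / 2 + terminal_offset_m * pixels_per_m
    left = int(center - view_width / 2)
    # Clamp so the viewport never leaves the wide image.
    left = max(0, min(left, panorama_width - view_width))
    return left, left + view_width
```

Moving the terminal shifts the viewport continuously, and the clamp keeps the displayed region inside the source image at the edges of the virtual screen.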
When the image displayed on the image presentation device 4 is enlarged, the dolly 7 may no longer be visible, and the relative positional relationship between the manipulator 6 and the dolly 7 may become unclear. FIG. 12 is a diagram showing an example of an image display method of this embodiment. The upper part of FIG. 12 shows an example of the display screen when the image captured by the camera 2-1 is enlarged. The upper part of FIG. 12 is, for example, an enlargement of the upper portion of the third-row display screen of FIG. 4 described in the first embodiment; although the hand 6a is enlarged, the dolly 7, which was visible in the third-row display screen of FIG. 4, is no longer displayed. The operator therefore loses track of the orientation of the dolly 7, making operation difficult. For this reason, the three-dimensional figure 308 indicated by the dashed line in the lower part of FIG. 12 may be superimposed on the image. That is, the display information generation unit 32 may generate, for example by CG (Computer Graphics), a three-dimensional figure 308 that imitates the current state of the remote machine 1 so that the orientation of the dolly 7 can be seen, generate display data in which the three-dimensional figure 308 is superimposed on the image captured by the camera 2-1, and display it on the image presentation device 4. By displaying the three-dimensional figure 308 imitating the current state of the remote machine 1, the operator can grasp the orientation of the dolly 7. Note that FIG. 12 is an example: the display method of the three-dimensional figure 308 is not limited to the example shown in FIG. 12, and the displayed image is not limited to that of the camera 2-1. Instead of the three-dimensional figure 308, a two-dimensional figure, symbol, characters, or the like indicating the orientation of the dolly 7 may be displayed, or a symbol, characters, or the like indicating the orientation of the dolly 7 may be displayed together with the three-dimensional figure 308.
The operation calculator 3a of this embodiment is realized by a computer system, like the operation calculator 3 of the first embodiment. For example, the operation calculator 3a of this embodiment is realized by the computer system illustrated in FIG. 6, like the operation calculator 3 of the first embodiment.
Embodiment 3.
FIG. 13 is a diagram showing a configuration example of a remote operation system according to the third embodiment. The remote operation system 100b of this embodiment is the same as the remote operation system 100a of the second embodiment, except that it includes an operation calculator 3b instead of the operation calculator 3a and that multiple cameras 2 are provided. Components having the same functions as in the second embodiment are given the same reference numerals as in the second embodiment, and duplicate explanations are omitted. The following description focuses mainly on the differences from the second embodiment.
In the first and second embodiments, one or more cameras 2 were provided; in this embodiment, two or more cameras 2 are provided. For example, as illustrated in FIG. 2, a camera 2-1 and a camera 2-2 are provided as the cameras 2.
The operation calculator 3b is the same as the operation calculator 3a of the second embodiment except that a camera switching unit 37, which is a camera switcher, is added. The camera switching unit 37 switches, among the multiple cameras 2, the camera 2 from which the image to be displayed on the image presentation device 4 is obtained. That is, the camera switching unit 37 switches the camera 2 corresponding to the presented image by selecting the image to be presented on the image presentation device 4 from among the images captured by the multiple cameras 2. The camera switching unit 37 notifies the display information generation unit 32 of information indicating the camera 2 from which the image to be displayed is obtained, that is, information indicating the camera 2 selected as the display target. The display information generation unit 32 generates display data using the image of the notified camera 2. The camera switching unit 37 may switch the image to be displayed on the image presentation device 4 upon receiving, via the input discrimination unit 31a, operation information instructing a camera switch from the operation device 5 or the precision operation device 8a, or it may switch the displayed image by detecting a gesture of the operator. Alternatively, a button or the like for switching cameras may be displayed on the image presentation device 4, and the camera 2 may be switched when the operator presses that button using the operation device 5 or the precision operation device 8a. Alternatively, the displayed image may be switched according to preset conditions and the image displayed on the image presentation device 4.
FIG. 14 is a diagram showing an example of switching of the cameras 2 in this embodiment. FIG. 14 shows an example in which the camera 2-1 installed on the dolly 7 and the camera 2-2 installed near the hand 6a of the manipulator 6 are used, as described with reference to FIG. 2 of the first embodiment; the image captured by the camera 2-1 is displayed on the display screen 301, and the image captured by the camera 2-2 is displayed on the display screen 302. As described above, the camera switching unit 37 may switch the image displayed on the image presentation device 4 according to operation information input from the operation device 5 or the precision operation device 8a, by detecting a gesture of the operator, or according to preset conditions and the image displayed on the image presentation device 4. The preset condition can be, for example, but is not limited to, a condition that the image of the camera 2-2 is displayed when the distance between the remote machine 1 and an object is equal to or less than a specified distance, and the image of the camera 2-1 is displayed when the distance between the remote machine 1 and the object is greater than the specified distance. The object is specified in advance by the operator using the operation device 5 or the precision operation device 8a.
By the camera switching unit 37 switching the cameras 2 in this way, the operator can, for example, perform remote operation while watching the image captured by the camera 2-1 until the remote machine 1 approaches the object to a certain extent, and then, once the remote machine 1 has come closer than that, perform remote operation while watching the image captured by the camera 2-2.
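The preset-condition switch described above reduces to a simple threshold rule. The sketch below is an illustrative assumption: the threshold value and the string identifiers are hypothetical, and a real camera switching unit 37 would act on video streams rather than names.

```python
# Minimal sketch of the distance-based camera switch: the close-up camera 2-2
# near the hand 6a is selected when the remote machine is within a specified
# distance of the target object, the overview camera 2-1 on the dolly 7
# otherwise. The threshold is a hypothetical example value.

def select_camera(distance_to_object_m: float, threshold_m: float = 0.5) -> str:
    """Choose which camera's image the image presentation device should show."""
    if distance_to_object_m <= threshold_m:
        return "camera 2-2"   # close-up camera near the hand 6a
    return "camera 2-1"       # overview camera on the dolly 7
```

In practice some hysteresis around the threshold would likely be added so the displayed image does not flicker when the measured distance hovers near the boundary.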
A camera provided on the inner side of the hand 6a of the manipulator 6 may also be used as one of the cameras 2. FIG. 15 is a diagram showing an example of a camera provided on the inner side of the hand 6a of the manipulator 6 in this embodiment. In FIG. 15, the manipulator 6 and the grasped object 9 to be grasped by the hand 6a are shown on the left, and the image displayed on the image presentation device 4 is shown on the right. The display screen 310 is the screen on which the image captured by the camera 2-2 is displayed, and the display screen 320 is the screen on which the image captured by the camera 2-3 is displayed. As shown in FIG. 15, the camera 2-3 is provided on the inner side of the hand 6a, and when the hand 6a approaches the grasped object 9, the image displayed on the image presentation device 4 switches from the image captured by the camera 2-2 to the image captured by the camera 2-3.
By checking the image captured by the camera 2-3, the operator can perform operations while grasping the state of the grasped object 9 in detail. For example, when either the grasped object 9 or the hand 6a is a deformable structure, the gripping force of the hand 6a can be adjusted by checking the degree of deformation of that deformable structure. The gripping force of the hand 6a may be adjusted by the operation device 5 or the precision operation device 8a, or by another operation means other than these.
In the examples described above, the image displayed on the image presentation device 4 is the image captured by a single camera 2, but the images captured by multiple cameras 2 may each be displayed on the image presentation device 4 as separate display screens. In this case, the display information generation unit 32 may display the image captured by the selected camera 2 at high brightness and reduce the brightness of the images of the unselected cameras 2. The display information generation unit 32 may also display the screen of the selected camera 2 in the center and the images of the unselected cameras 2 at the edges. The display information generation unit 32 may further generate the display data so that the images of the unselected cameras 2 are smaller in size than the image of the selected camera 2.
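The de-emphasis of unselected cameras can be sketched as assigning each camera a brightness and a scale factor. The dimming and shrink factors below are illustrative assumptions, not values from the patent.

```python
# Sketch of how the display information generation unit 32 might emphasise the
# selected camera when several images are shown at once: the selected image
# keeps full brightness and size, the others are dimmed and shrunk.

def layout_parameters(camera_ids, selected_id,
                      dim_factor=0.4, shrink_factor=0.5):
    """Return per-camera (brightness, scale) display parameters, each in [0, 1]."""
    params = {}
    for cam in camera_ids:
        if cam == selected_id:
            params[cam] = (1.0, 1.0)            # full brightness, full size
        else:
            params[cam] = (dim_factor, shrink_factor)
    return params
```

The same mapping could drive screen placement as well, putting the (1.0, 1.0) entry in the center and the dimmed entries at the edges.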
The operation calculator 3b of this embodiment is realized by a computer system, like the operation calculator 3 of the first embodiment. For example, the operation calculator 3b of this embodiment is realized by the computer system illustrated in FIG. 6, like the operation calculator 3 of the first embodiment. Although the camera switching unit 37 is provided within the operation calculator 3b in FIG. 13, a camera switcher having the functions of the camera switching unit 37 may be provided separately from the operation calculator 3b.
FIG. 13 shows an example in which a camera switching function is added to the operation calculator 3a of the second embodiment, but the camera switching function may also be added by providing the remote operation system 100 of the first embodiment with multiple cameras 2 and adding the camera switching unit 37 to the operation calculator 3.
A camera 2 that photographs the manipulator 6 from the side may also be provided as one of the multiple cameras 2. FIG. 16 is a diagram showing an example of a camera 2 of this embodiment that photographs the manipulator 6 from the side. In the example shown in FIG. 16, the camera 2-4, which is one of the multiple cameras 2, photographs the manipulator 6 from the side. The camera 2-4 is provided with a drive mechanism, and the drive mechanism drives the camera 2-4 so as to track, for example, the tip of the hand 6a. FIG. 17 is a diagram showing an example of image switching in this embodiment. In the example shown in FIG. 17, the display screen 304 on which the image captured by the camera 2-4 is displayed and the display screen 302 on which the image captured by the camera 2-2 is displayed are switched by a gesture of the operator.
For example, when the operator performs a motion such as peering toward the image presentation device 4 while the display screen 302 is displayed, the screen displayed on the image presentation device 4 switches to the display screen 304 showing the image captured by the camera 2-4. This allows the operator to also check the manipulator 6 from the side and to avoid collisions in which the manipulator 6 would strike a structure in the environment. Although FIG. 17 shows an example of switching between the image captured by the camera 2-2 and the image captured by the camera 2-4, the image captured by the camera 2-1 and the image captured by the camera 2-4 may be switched in the same manner.
The image from the camera 2-4, which photographs from the side, may also be superimposed on the image captured by the camera 2-1 or the camera 2-2. FIG. 18 is a diagram showing an example of superimposing an image photographed from the side in this embodiment. The upper-left part of FIG. 18 shows an example in which the image obtained by photographing the tip of the hand 6a and the grasped object 9 with the camera 2-2 is displayed as the display screen 302. In the upper-right part of FIG. 18, the display screen 304 showing the image of the camera 2-4 is displayed within the display screen 302. The display screen 304 may be displayed on the display screen 302 in this way, but the screen as a whole then looks unnatural. For this reason, a virtual mirror 307 may be displayed as shown in the lower-left part of FIG. 18. That is, the display information generation unit 32 generates a composite image showing the mirror 307 and the image reflected in it, on the assumption that the image of the camera 2-4 is reflected in the virtual mirror 307, generates display data by superimposing the composite image on the image captured by the camera 2-1 or the camera 2-2, and displays the display data on the image presentation device 4. This allows the operator to perceive the image of the camera 2-4 as a natural image.
In the example shown in FIG. 15, the camera 2-3 is provided at the tip of the hand 6a; in addition, one or more pressure detection sensors may be provided on the inner side of the tip of the hand 6a. FIG. 19 is a diagram showing an example of the pressure detection and pressure display method of this embodiment. When the operator's finger 601 traces the display unit 602, which is the image presentation device 4 or another monitor, the tip of the hand 6a provided with the pressure detection sensors moves and traces the surface of an object such as the grasped object 9. The display unit 602 may have a function as an input means, such as a touch panel, or the movement of the finger may be detected and the tip of the hand 6a driven according to the detected movement. Pressure information indicating the pressure detected by the pressure detection sensors is input to the display information generation unit 32, in the same way as the images of the cameras 2. Based on the pressure information, the display information generation unit 32 may generate display data that varies at least one of, for example, color density, transparency, and color composition ratio, and display the display data on the image presentation device 4 or another monitor, thereby visually presenting the haptic sensation to the operator. The presentation of the haptic sensation is not limited to display on the image presentation device 4 or another monitor; it may also be realized by projecting onto the operator's fingertip by projection mapping.
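The visual force presentation above amounts to mapping a pressure reading onto display attributes. The sketch below assumes a linear mapping and a hypothetical sensor range; neither is specified in the patent.

```python
# Sketch of the visual haptic presentation: a detected pressure value is
# mapped to a colour density and a transparency used when rendering the
# traced point. The maximum pressure and the linear mapping are assumptions.

def pressure_to_color(pressure: float, max_pressure: float = 10.0) -> tuple:
    """Map a pressure reading to (density, transparency), both in [0, 1].

    Higher pressure gives a denser, more opaque mark on the display.
    """
    ratio = max(0.0, min(pressure / max_pressure, 1.0))
    density = ratio
    transparency = 1.0 - ratio
    return density, transparency
```

The same ratio could instead (or additionally) drive a color composition ratio, as the text allows any of the three attributes to vary.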
FIG. 20 is a diagram showing another example of the pressure detection and pressure display method of this embodiment. In the example shown in FIG. 20, upon receiving an operation instructing it to trace the surface of an object such as the grasped object 9, the operation calculator 3c moves the tip of the hand 6a so as to trace the surface of the object and outputs, to the haptic presentation device 607, pressure information indicating the pressure detected by the pressure detection sensors and movement information indicating the amount of movement of the tip of the hand 6a. Alternatively, using a means for detecting the movement of the finger 601, the operation calculator 3c may control the horizontal position of the tip of the hand 6a in accordance with the movement of the finger 601, thereby moving the tip of the hand 6a so that it traces the surface. The haptic presentation device 607 includes, for example, a contact part 604, a drive part 605 capable of driving the contact part 604, and a support part 606 that supports the operator's finger from above. For example, the haptic presentation device 607 moves the horizontal position of the contact part 604 by controlling the drive part 605 based on the movement information, and moves the contact part 604 up and down based on the pressure information. When the operator places the finger 601 on the contact part 604, the finger 601 moves together with the contact part 604, moving horizontally and moving up and down according to the pressure information. A haptic sensation can thus be presented, and the operator can perceive the state of the surface. Note that, instead of using a pressure detection sensor, the unevenness of the surface of the object may be detected from the vertical position of the tip of the hand 6a, and the operation calculator 3c may move the contact part 604 up and down according to the detected unevenness.
A membrane may also be provided at the portion where the operator's finger 601 is placed. The membrane can give the operator the sliding sensation of tracing a surface. Furthermore, by the operation calculator 3c moving the membrane up and down according to the pressure information, the unevenness detected by the tip of the hand 6a can be reproduced and conveyed to the operator. The unevenness of the surface of the object may also be detected from the vertical position of the tip of the hand 6a, and the operation calculator 3c may move the contact part 604 up and down according to the detected unevenness.
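The drive of the haptic presentation device 607 described above can be sketched as converting a stream of (movement, pressure) samples into position targets for the contact part 604. The gain value and data layout below are illustrative assumptions.

```python
# Sketch of driving the haptic presentation device 607: movement information
# sets the horizontal position of the contact part 604, and pressure
# information sets its vertical displacement. The vertical gain is a
# hypothetical scaling constant, not a value from the patent.

def contact_part_targets(samples, vertical_gain_mm_per_unit=0.2):
    """Turn (horizontal_move_mm, pressure) samples into contact-part targets.

    Returns a list of (x_mm, z_mm): the cumulative horizontal position and a
    vertical displacement proportional to the sensed pressure.
    """
    targets = []
    x = 0.0
    for move_mm, pressure in samples:
        x += move_mm                               # follow the hand-tip movement
        z = pressure * vertical_gain_mm_per_unit   # rise with sensed pressure
        targets.append((x, z))
    return targets
```

With the membrane variant, the z component would drive the membrane height instead of the whole contact part, but the mapping from pressure to vertical displacement is the same.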
Embodiment 4.
FIG. 21 is a diagram showing a configuration example of a remote operation system according to the fourth embodiment. The remote operation system 100c of this embodiment is the same as the remote operation system 100b of the third embodiment, except that it includes an operation calculator 3c instead of the operation calculator 3b and a remote machine 1a instead of the remote machine 1. Components having the same functions as in the third embodiment are given the same reference numerals as in the third embodiment, and duplicate explanations are omitted. The following description focuses mainly on the differences from the third embodiment.
The remote machine 1a is the same as the remote machine 1 of the third embodiment except that a hand drive mechanism 13 is added. The hand drive mechanism 13 drives the hand 6a based on a control signal received from the operation calculator 3c. The remote machines 1 of the first to third embodiments may also be provided with the hand drive mechanism 13; in the first to third embodiments, there are no particular restrictions on the method of operating the hand drive mechanism 13.
The operation calculator 3c is the same as the operation calculator 3b of the third embodiment except that a hand-tip angle setting unit 38 is added and an input discrimination unit 31c is provided instead of the input discrimination unit 31a. In this embodiment, the operator can also set the hand-tip angle of the hand 6a using the operation device 5; upon receiving operation information indicating the hand-tip angle of the hand 6a, the input discrimination unit 31c outputs that operation information to the hand-tip angle setting unit 38. Using the operation information received from the input discrimination unit 31c, the hand-tip angle setting unit 38 generates a control signal for controlling the hand 6a of the remote machine 1a and transmits the generated control signal to the remote machine 1a.
FIGS. 22 and 23 are diagrams showing examples of the hand-tip angle setting screen of this embodiment. FIGS. 22 and 23 each show a display screen displayed on the image presentation device 4. In FIG. 22, angle change buttons 401 to 403 are displayed on the right side of the display screen 301, on which, in the example shown in FIG. 2, the image captured by the camera 2-1 is displayed. The angle change button 401 is a button for setting the hand-tip angle of the hand 6a to horizontal, the angle change button 402 is a button for setting the hand-tip angle of the hand 6a to 45 degrees (the angle between the horizontal direction and the hand 6a is 45 degrees), and the angle change button 403 is a button for setting the hand-tip angle of the hand 6a to vertical. In the example shown in FIG. 22, the hand-tip angle of the hand 6a is set when the operator presses a button displayed on the image presentation device 4 using the operation device 5. Although FIG. 22 shows an example in which the hand-tip angle of the hand 6a can be set to three angles, horizontal, 45 degrees, and vertical, the number of settable hand-tip angles is not limited to this example. For example, it may be possible to select between two hand-tip angles, or the hand-tip angle may be settable in finer steps than three. The display and operation for setting the hand-tip angle are also not limited to FIG. 22: for example, an indicator with a graduated scale may be displayed together with the hand-tip angle, and the hand-tip angle of the hand 6a may be set by moving the scale marker with the operation device 5; the hand-tip angle may be set by a gesture; or the hand-tip angle of the hand 6a may be set by a method other than these.
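The button-based setting of FIG. 22 maps each angle change button to a preset hand-tip angle. The sketch below follows the three presets in the figure; the button identifiers and the use of degrees as the command unit are assumptions.

```python
# Sketch of the button-based hand-tip angle setting of FIG. 22: each angle
# change button corresponds to a preset angle that the hand-tip angle setting
# unit 38 would turn into a control signal for the hand drive mechanism 13.

PRESET_ANGLES_DEG = {
    "button401": 0.0,    # horizontal
    "button402": 45.0,   # 45 deg between the horizontal direction and the hand 6a
    "button403": 90.0,   # vertical
}

def hand_tip_angle_for(button_id: str) -> float:
    """Return the commanded hand-tip angle for a pressed angle change button."""
    try:
        return PRESET_ANGLES_DEG[button_id]
    except KeyError:
        raise ValueError(f"unknown angle change button: {button_id}")
```

Finer-grained variants (more presets, or a continuous scale indicator) would replace the lookup table with an interpolation over the indicator position.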
FIG. 23 illustrates an operation method different from that of FIG. 22. In the example shown in FIG. 23, a view of the hand 6a seen from the side is displayed as the display screen 304, and the operator sets the hand-tip angle of the hand 6a by rotating the displayed hand 6a using the operation device 5. That is, an image including the tip of the hand 6a is displayed on the hand-tip angle setting screen, and the hand-tip angle is set when the operator changes the angle of the hand tip in the image using the operation device 5. In the example shown in FIG. 23, with the hand-tip angle of the hand 6a set to horizontal, the operator moves the operation device 5 in the rotation direction 404 while pressing on the part of the hand 6a, and releases the press when the desired hand-tip angle is reached. The hand-tip angle of the hand 6a is thereby set. Alternatively, for example, as in the first embodiment, the target hand-tip angle may be set by clicking at the target hand-tip angle with the marker 201. The illustration of the hand 6a shown in FIG. 23 may be one created in advance by simulating the remote machine 1a, or a camera 2 that photographs the hand 6a from the side may be provided on the manipulator 6 and the displayed image may be one captured by that camera 2. In the latter case, as with the operation of the attitude and position of the remote machine 1 in the first embodiment, the operator may start pressing the operation device 5 after moving the marker 201 to the target hand-tip angle and continue pressing the operation device 5 until the target hand-tip angle is reached.
 The screen shown in FIG. 23 may be displayed on the image presentation device 4 together with the screen displayed in the third embodiment (the image captured by the camera 2), or the display may be switched to the screen shown in FIG. 23 when a specific operation is performed with the operation device 5. For example, when the cart 7 is clicked while the image captured by the camera 2 is displayed on the image presentation device 4, the system may enter the mode in which the attitude and position of the remote machine 1 are operated as described in the first embodiment, and when the hand 6a is clicked, the setting screen for the hand angle of the hand 6a may be displayed. Similarly, the position of the hand 6a of the manipulator 6 may also be settable, and when the manipulator 6 is clicked, a setting screen for the position of the hand 6a of the manipulator 6 may be displayed. For example, the operation calculator 3c may determine the joint angles by attaching an AR marker to each joint of the manipulator 6, may control the position of the hand 6a by constructing an internal robot model and determining the joint angles from it, or may control the position of the hand 6a by some other method. The position of the hand 6a of the manipulator 6 may be set by keeping the operation device 5 clicked or pressed at the position of the marker 201 as described in the first embodiment, or, as illustrated in FIG. 22, a button indicating the position of the hand 6a of the manipulator 6 may be displayed and the position of the hand 6a of the manipulator 6 may be set with the button.
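The click-driven switching of operation targets described above can be sketched as follows. This is a minimal illustration only; the function name, mode names, and default behavior are hypothetical and not taken from the patent.

```python
# Hypothetical sketch: clicking the cart enters the attitude/position
# operation mode of the first embodiment, clicking the hand opens the
# hand-angle setting screen (FIG. 23), and clicking the manipulator opens
# the hand-position setting screen.

def select_mode(clicked_part: str) -> str:
    """Map the clicked part of the remote machine to an operation mode."""
    modes = {
        "cart": "attitude_and_position",    # first-embodiment operation
        "hand": "hand_angle_setting",       # FIG. 23 setting screen
        "manipulator": "hand_position_setting",
    }
    # Clicks outside any operable part leave the plain camera view active.
    return modes.get(clicked_part, "camera_view")
```

For instance, `select_mode("hand")` would return `"hand_angle_setting"`, while a click on empty background falls back to the camera view.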
 In this manner, in this embodiment, the remote machine 1a has a hand 6a whose hand angle can be set, and the image presentation device 4 presents a setting screen for setting the hand angle of the hand 6a. In this embodiment, the operator can thus also set the hand angle of the hand 6a with a simple operation. The position of the manipulator 6 and the hand angle of the hand 6a may also be set using the precision operation device 8a, as described in the second embodiment.
 The operation calculator 3c of this embodiment, like the operation calculator 3 of the first embodiment, is realized by a computer system, for example the computer system illustrated in FIG. 6.
 While FIG. 21 shows an example in which the hand angle setting function is added to the operation calculator 3b of the third embodiment, the hand angle setting function may also be added by adding the hand angle setting unit 38 to the operation calculator 3 of the first embodiment or the operation calculator 3a of the second embodiment.
Embodiment 5.
 FIG. 24 is a diagram showing a configuration example of the remote operation system according to the fifth embodiment. The remote operation system 100d of this embodiment is the same as the remote operation system 100c of the fourth embodiment except that it includes an operation calculator 3d instead of the operation calculator 3c. Components having the same functions as in the fourth embodiment are given the same reference numerals, and duplicate explanations are omitted. The following description focuses on the differences from the fourth embodiment.
 The operation calculator 3d is the same as the operation calculator 3c of the fourth embodiment except that a motion switching unit 39, which serves as a motion switch, is added and an input discrimination unit 31d is provided instead of the input discrimination unit 31c. The motion switching unit 39 switches the operation part of the remote machine 1a in conjunction with the switching performed by the camera switching unit 37. For example, the camera switching unit 37 notifies the motion switching unit 39 of information indicating the currently selected camera 2, and the motion switching unit 39 switches the operation part based on the notified information.
 For example, the motion switching unit 39 selects the cart 7 of the remote machine 1a as the operation part when the camera switching unit 37 has selected the camera 2-1 illustrated in FIG. 2, selects the manipulator 6 or the hand 6a of the remote machine 1a when the camera 2-2 illustrated in FIG. 2 has been selected, and selects the hand 6a of the remote machine 1a when the camera 2-3 illustrated in FIG. 15 has been selected. The motion switching unit 39 notifies the input discrimination unit 31d of the selected operation part, and the input discrimination unit 31d outputs the input operation information to the functional units corresponding to that operation part. For example, when the operation part is the cart 7, the input discrimination unit 31d outputs the operation information to the target attitude setting unit 33 and the attitude driving unit 34, and when the operation part is the hand 6a, it outputs the operation information to the hand angle setting unit 38. When the operation part is the manipulator 6, the input discrimination unit 31d outputs the operation information to a manipulator driving unit (not shown), or to the target attitude setting unit 33 and the attitude driving unit 34 functioning as a manipulator driving unit.
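The camera-to-operation-part routing described above can be sketched as follows. All identifiers here are illustrative stand-ins for the motion switching unit 39 and input discrimination unit 31d, not names defined by the patent.

```python
# Hypothetical sketch: the operation part of the remote machine is chosen
# according to the camera selected by the camera switching unit, and
# operation information is routed to the corresponding functional units.

CAMERA_TO_PART = {
    "camera_2_1": "cart",         # overview camera -> cart 7
    "camera_2_2": "manipulator",  # arm camera -> manipulator 6 / hand 6a
    "camera_2_3": "hand",         # in-hand camera -> hand 6a
}

PART_TO_UNITS = {
    "cart": ("target_attitude_setting_unit_33", "attitude_driving_unit_34"),
    "manipulator": ("manipulator_driving_unit",),
    "hand": ("hand_angle_setting_unit_38",),
}

def route_operation(selected_camera: str, operation_info: dict):
    """Return the destination functional units and the operation info
    for the operation part implied by the selected camera."""
    part = CAMERA_TO_PART[selected_camera]
    return PART_TO_UNITS[part], operation_info
```

With this table-driven form, switching cameras implicitly switches which functional units receive the operator's input, mirroring the coupling between the camera switching unit 37 and the motion switching unit 39.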
 As a result, the operation part is switched simply by the operator performing the switching operation or gesture for the camera switching unit 37, so the operator can work with a reduced operational burden. The motion switching unit 39 may also set the gain of the motion of the remote machine 1a produced when the precision operation device 8a is operated, according to the camera 2 selected by the camera switching unit 37. The gain may be determined in advance for each camera 2. The gain setting may be performed instead of the switching of the operation part, or both the switching of the operation part and the gain setting may be performed. The motion switching unit 39 outputs the set gain to the target attitude setting unit 33, the attitude driving unit 34, and the hand angle setting unit 38. In addition, when the image displayed on the image presentation device 4 is enlarged or reduced, the camera switching unit 37 may set the gain according to the magnification of the enlargement or reduction.
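The per-camera gain with magnification-dependent scaling could look like the following sketch. The gain values and names are illustrative assumptions, not values from the patent.

```python
# Hypothetical sketch: a wide-view camera gets a large gain for coarse
# motion, the close-up in-hand camera a small gain for fine motion, and
# the gain shrinks further as the displayed image is magnified.

CAMERA_GAIN = {
    "camera_2_1": 1.0,    # overview camera: coarse cart motion
    "camera_2_2": 0.5,    # arm camera: moderate manipulator motion
    "camera_2_3": 0.125,  # in-hand camera: fine hand motion
}

def motion_gain(selected_camera: str, zoom: float = 1.0) -> float:
    """Gain applied to precision-operation input; zoom > 1 means the
    displayed image is magnified, so motion is scaled down further."""
    return CAMERA_GAIN[selected_camera] / zoom
```

For example, zooming in on the in-hand camera view halves the already small gain, so the same input on the precision operation device 8a produces a proportionally finer motion.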
 The operation calculator 3d of this embodiment, like the operation calculator 3 of the first embodiment, is realized by a computer system, for example the computer system illustrated in FIG. 6. A motion switch having the function of the motion switching unit 39 may also be provided separately from the operation calculator 3d.
 While FIG. 24 shows an example in which the motion switching function is added to the operation calculator 3c of the fourth embodiment, the motion switching function may also be added by adding the motion switching unit 39 to the operation calculator 3a of the second embodiment or the operation calculator 3b of the third embodiment.
 The configurations shown in the above embodiments are merely examples; they may be combined with other known technologies, the embodiments may be combined with each other, and parts of the configurations may be omitted or modified without departing from the gist of the invention.
 1, 1a remote machine; 2, 2-1 to 2-3 camera; 3, 3a, 3b, 3c, 3d operation calculator; 4 image presentation device; 5 operation device; 6 manipulator; 6a hand; 7 cart; 8a precision operation device; 8b input switch; 9 grasped object; 11 attitude drive mechanism; 12 position drive mechanism; 13 hand drive mechanism; 31, 31a, 31c, 31d input discrimination unit; 32 display information generation unit; 33 target attitude setting unit; 34 attitude driving unit; 35 target attitude determination unit; 36 position driving unit; 37 camera switching unit; 38 hand angle setting unit; 39 motion switching unit; 100, 100a, 100b, 100c, 100d remote operation system; 101 control unit; 102 input unit; 103 storage unit; 104 display unit; 105 communication unit; 106 output unit; 107 system bus.

Claims (10)

  1.  A remote operation system comprising:
     a remote machine equipped with a camera and remotely operated;
     an image presentation device that presents an image captured by the camera to an operator as a presented image;
     an operation device that is operated by the operator and that receives, through the operator's operation, an input of a target attitude of the remote machine as a position in the presented image and an input of a position movement instruction indicating that the position of the remote machine is to be moved; and
     an operation calculator that drives the attitude of the remote machine based on the target attitude received by the operation device and moves the position of the remote machine based on the position movement instruction received by the operation device.
  2.  The remote operation system according to claim 1, wherein the operation calculator
     displays, within the presented image, a marker that can be moved by operating the operation device, sets the position of the marker at the time when the operation device is pressed by the operator as the target attitude, and,
     after starting to drive the attitude of the remote machine based on the set target attitude, upon determining that the attitude of the remote machine has reached the target attitude, determines that the position movement instruction has been accepted by the operation device if the pressing of the operation device is still continuing at the time the attitude of the remote machine reaches the target attitude, and starts the movement of the position of the remote machine.
  3.  The remote operation system according to claim 2, wherein when the pressing of the operation device is stopped, the driving of the attitude of the remote machine and the movement of the position of the remote machine are stopped.
  4.  The remote operation system according to any one of claims 1 to 3, further comprising:
     a precision operation device that enables more precise operation than the operation device; and
     an input switch that switches the target of the operator's operation between the operation device and the precision operation device.
  5.  The remote operation system according to any one of claims 1 to 4, wherein the camera comprises a plurality of cameras, the remote operation system further comprising
     a camera switch that switches the camera corresponding to the presented image by selecting the presented image from among a plurality of images captured by the respective cameras.
  6.  The remote operation system according to claim 5, wherein the remote machine has a hand for which a hand angle, which is an angle of a hand tip, can be set, and
     the image presentation device presents a setting screen for setting the hand angle of the hand.
  7.  The remote operation system according to claim 6, wherein the setting screen displays an image including the hand tip of the hand, and the hand angle is set by the operator changing the angle of the hand tip in the image using the operation device.
  8.  The remote operation system according to claim 6 or 7, wherein the camera includes a camera provided on an inner side of the hand tip.
  9.  The remote operation system according to any one of claims 5 to 8, further comprising
     a motion switch that switches the operation part of the remote machine in conjunction with the switching by the camera switch.
  10.  A remote operation method in a remote operation system including a remote machine that is equipped with a camera and remotely operated, an image presentation device that presents an image captured by the camera to an operator as a presented image, an operation device, and an operation calculator, the method comprising:
     a step in which the operation device, operated by the operator, receives, through the operator's operation, an input of a target attitude of the remote machine as a position in the presented image and an input of a position movement instruction indicating that the position of the remote machine is to be moved; and
     a step in which the operation calculator drives the attitude of the remote machine based on the target attitude and moves the position of the remote machine based on the position movement instruction.
PCT/JP2022/040482 2022-10-28 2022-10-28 Remote operation system and remote operation method WO2024089890A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2022/040482 WO2024089890A1 (en) 2022-10-28 2022-10-28 Remote operation system and remote operation method


Publications (1)

Publication Number Publication Date
WO2024089890A1 true WO2024089890A1 (en) 2024-05-02

Family

ID=90830319


Country Status (1)

Country Link
WO (1) WO2024089890A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH09201785A (en) * 1996-01-30 1997-08-05 Shimadzu Corp Manipulator
JP2012171024A (en) * 2011-02-17 2012-09-10 Japan Science & Technology Agency Robot system
JP2018535487A (en) * 2015-09-15 2018-11-29 エスゼット ディージェイアイ テクノロジー カンパニー リミテッドSz Dji Technology Co.,Ltd System and method for planning and controlling UAV paths

