WO2024013895A1 - Remote control system, remote control method, and remote control program - Google Patents

Remote control system, remote control method, and remote control program

Info

Publication number
WO2024013895A1
Authority
WO
WIPO (PCT)
Prior art keywords
control signal
information
remote control
control
processing unit
Application number
PCT/JP2022/027596
Other languages
French (fr)
Japanese (ja)
Inventor
匡人 福田
仁志 瀬下
大祐 佐藤
成宗 松村
雅人 宮原
太智 金田
Original Assignee
日本電信電話株式会社
Application filed by 日本電信電話株式会社
Priority to PCT/JP2022/027596
Publication of WO2024013895A1

Classifications

    • B25J 13/02: Controls for manipulators; hand grip control means
    • B25J 13/06: Controls for manipulators; control stands, e.g. consoles, switchboards
    • B25J 13/08: Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
    • G05D 3/12: Control of position or direction using feedback

Definitions

  • the present invention relates to a remote control system, a remote control method, and a remote control program.
  • Remote control systems include systems in which an operator operates a robot at a remote location and systems in which a robot supports the operator's work. Such a remote control system needs to control the robot so that it synchronizes with, or cooperates with, the movements of the operator.
  • in a remote control system, it is desirable that control signals reflecting the operator's movements be transmitted to the robot without delay.
  • in practice, however, control signals are transmitted to the robot with a delay due to communication and processing.
  • to compensate for this delayed transmission, remote control systems generally use a predictive model that predicts the human's future movements, generate a predictive control signal that matches those future movements, and control the robot with the predictive control signal.
  • Non-Patent Document 1 discloses a remote control system that uses a predictive model of human motion to predict the future state of a human and convey the human's intentions and motions to a robot.
  • in general, predictive models of human motion are constructed based on the output of sensors attached to the human body and on human joint information, called skeletal information, obtained by image processing of human video data.
  • Non-Patent Document 2 discloses a method that assumes a correlation between past and future human movements and predicts future joint information from input sequence data of past human joint information.
  • An object of the present invention is to provide a remote control system, a remote control method, and a remote control program that accurately and stably control a remote control target.
  • a remote control system includes a remote control device operated by an operator and a controlled device remotely controlled by the remote control device.
  • the remote control device includes a control signal transmission processing section that transmits a control signal including first control information and second control information.
  • the controlled device includes a control signal reception processing unit that receives the control signal, a predictive control signal generation processing unit that generates a predictive control signal based on the first control information and the second control information included in the control signal received by the control signal reception processing unit, and an operation control processing unit that controls the operation of a remote control target based on the predictive control signal generated by the predictive control signal generation processing unit.
  • the remote control method includes a first step of transmitting a control signal including first control information and second control information from a remote control device to a controlled device, a second step of receiving the control signal at the controlled device, a third step of generating a predictive control signal based on the first control information and the second control information included in the control signal received in the second step, and a fourth step of controlling the operation of a remote control target based on the predictive control signal generated in the third step.
  • the remote control program causes a processor of a computer to execute the functions of the control signal transmission processing unit of the above remote control device, or the functions of the control signal reception processing unit, the predictive control signal generation processing unit, and the operation control processing unit of the above controlled device.
  • according to the present invention, there are provided a remote control system, a remote control method, and a remote control program that accurately and stably control a remote control target.
  • FIG. 1 is a diagram illustrating an example of an outline of general predictive control.
  • FIG. 2 is a diagram illustrating an example of an overview of predictive control according to an embodiment of the present invention.
  • FIG. 3 is a block diagram showing an example of a remote control system according to an embodiment of the present invention.
  • FIG. 4 is a flowchart illustrating an example of a process executed by a remote control device of a remote control system according to an embodiment of the present invention.
  • FIG. 5 is a flowchart illustrating an example of a process executed by a controlled device of a remote control system according to an embodiment of the present invention.
  • FIG. 6 is a block diagram showing an example of the hardware configuration of a remote control device and a controlled device of a remote control system according to an embodiment of the present invention.
  • FIG. 1 is a diagram illustrating an example of an outline of general predictive control.
  • FIG. 2 is a diagram illustrating an example of an overview of predictive control according to an embodiment of the present invention.
  • FIGS. 1 and 2 show examples in which the prediction model is composed of an encoder and a decoder.
  • the predictive model is not limited to the configuration shown in FIGS. 1 and 2, and may be a model calculated from known parameters of the robot, for example.
  • a control signal is input to the encoder, and a predictive control signal is output from the decoder based on the control signal.
  • the control signal has a single piece of control information obtained by a single modality device.
  • the control information is human joint information.
  • This predictive control uses, for example, a predictive model of human motion that relies only on human joint information. For this reason, if the human joint information contains missing data or noise, prediction accuracy decreases. Furthermore, when predicting a human's long-term future movements, the information necessary for the prediction may not be contained in the human's past movements, and in that case the accuracy of the prediction decreases.
  • a control signal including first control information and second control information is input to the encoder, and a predictive control signal is output from the decoder.
  • the first control information is control information obtained by the first modality device.
  • the second control information is control information obtained by another second modality device different from the first modality device.
  • the first control information is human joint information
  • the second control information is human line of sight information.
  • This predictive control uses a predictive model of human motion that utilizes, for example, human joint information together with human line-of-sight information, which indicates the human's intentions. Therefore, even if the human joint information contains missing data or noise, a decrease in prediction accuracy can be suppressed. Furthermore, it is possible to predict long-term future human movements with high accuracy. This makes it possible to control the robot stably.
  • FIG. 3 is a diagram showing an example of a remote control system according to an embodiment of the present invention.
  • a remote control system includes a remote control device 100 and a controlled device 200.
  • the remote control device 100 and the controlled device 200 are connected, for example, via a communication network 300 so that they can communicate in both directions.
  • the remote control device 100 is a device for remotely controlling a controlled device 200, and the controlled device 200 is a device remotely controlled by the remote control device 100.
  • the remote control device 100 is operated by an operator in order to remotely control the operation of the controlled device 200.
  • the controlled device 200 operates in response to an operator's operation on the remote control device 100.
  • remote control means controlling the operation of the controlled device 200 according to a control signal output by the remote control device 100, regardless of the distance between the remote control device 100 and the controlled device 200.
  • the controlled device 200 is a device that is located in a remote location far away from the remote control device 100, and whose operation is controlled by an operator.
  • the controlled device 200 is a device that is placed near the remote control device 100 and supports the work of an operator, or a device that performs collaborative work with the operator.
  • the remote control device 100 is a device that remotely controls the controlled device 200 via the communication network 300.
  • the remote control device 100 includes a detection section 110, a display section 120, and a communication section 130.
  • the detection unit 110 detects an operator's control input.
  • the operator's control input includes operator's motion information and operator's line of sight information.
  • the operator's motion information is, for example, skeletal information or joint information of the operator.
  • the detection unit 110 also generates a control signal for remotely controlling the controlled device 200.
  • the display unit 120 receives information transmitted from the controlled device 200 and displays the received information.
  • the information received by the display unit 120 includes video information and operating state information of the remote control target 250, as will be described later.
  • the communication unit 130 is an interface that enables information to be transmitted and received between the remote control device 100 and the controlled device 200 via the communication network 300.
  • the detection unit 110 includes a motion information acquisition processing unit 111, a line of sight information acquisition processing unit 112, and a control signal transmission processing unit 113.
  • the motion information acquisition processing unit 111 acquires the motion information of the operator.
  • the motion information is skeletal information or joint information of the operator.
  • the motion information acquisition processing unit 111 acquires skeletal information or joint information of the operator based on the output of a sensor attached to the operator's body and the operator's moving image data.
  • the operation information may also be information on an operator's operation of an input device such as a joystick.
  • the operation information acquisition processing unit 111 acquires operator operation information based on the output of the input device.
  • the line-of-sight information acquisition processing unit 112 acquires line-of-sight information of the operator.
  • the line-of-sight information is information on the eyeball direction obtained from image processing of moving image data of the operator, or information on the three-dimensional gaze position obtained from a line-of-sight measurement device.
  • the motion information acquired by the motion information acquisition processing unit 111 corresponds to the first control information in the predictive control described with reference to FIG. 2.
  • the line-of-sight information acquired by the line-of-sight information acquisition processing unit 112 corresponds to the second control information in the predictive control described with reference to FIG. 2. That is, the motion information acquisition processing unit 111 and the line of sight information acquisition processing unit 112 each acquire different types of control information obtained by different types of modality equipment (sensors, cameras, measurement devices, etc.).
  • the control signal transmission processing unit 113 generates a control signal that includes the motion information acquired by the motion information acquisition processing unit 111 and the gaze information acquired by the gaze information acquisition processing unit 112.
  • the control signal transmission processing unit 113 transmits the control signal to the controlled device 200 via the communication unit 130 and the communication network 300.
  • the controlled device 200 is a device that is remotely controlled by the remote control device 100 via the communication network 300.
  • the controlled device 200 includes a communication section 210, a prediction section 220, an operation section 230, and a photographing section 240.
  • the controlled device 200 also has a remote controlled object 250 .
  • the remote control target 250 is, for example, a humanoid robot or an arm-shaped robot.
  • in this embodiment, the remote control target 250 is described as a part of the controlled device 200, in other words, as something included in the controlled device 200, as shown in FIG. 3.
  • however, the remote control target 250 may be a separate element from the controlled device 200, in other words, an element external to the controlled device 200.
  • the communication unit 210 is an interface that enables information to be transmitted and received between the controlled device 200 and the remote control device 100 via the communication network 300.
  • the prediction unit 220 receives the control signal transmitted from the remote control device 100 through the communication unit 210 via the communication network 300.
  • the control signals received by the prediction unit 220 are delayed relative to the operator's control inputs due to communication and processing.
  • the prediction unit 220 performs prediction processing on the control signal and generates a predicted control signal.
  • the predictive control signal is ideally a control signal that does not include the influence of the delay. In other words, the predictive control signal can be said to be a control signal suited to a point in time later than the operator's control input.
  • the operation unit 230 actually operates the remote control target 250 based on the predictive control signal generated by the prediction unit 220.
  • the photographing unit 240 photographs the remote control target 250.
  • the image photographed by the photographing unit 240 shows the state as a result of the operation performed by the remote control target 250 under the control of the operation unit 230.
  • the photographing unit 240 also transmits video information to the remote control device 100 via the communication unit 210 and the communication network 300.
  • the video information is received by the display unit 120 of the remote control device 100, and the display unit 120 displays the video based on the received video information.
  • the prediction unit 220 includes a control signal reception processing unit 221 , a controlled object state acquisition processing unit 222 , a line-of-sight information processing unit 223 , a predictive control signal generation processing unit 224 , and a predictive control signal transmission processing unit 225 .
  • the control signal reception processing unit 221 receives a control signal from the remote control device 100 via the communication unit 210.
  • the controlled object state acquisition processing unit 222 obtains the current state of the remote controlled object 250.
  • for example, when the remote control target 250 is a six-axis arm-type robot, the controlled object state acquisition processing unit 222 acquires rotation angle information and torque information for each of the robot's six actuators.
  • the rotation angle information and torque information of the six actuators can indicate the state of the arm type robot.
  • the rotation angle information and torque information of each actuator are detected, for example, by a sensor provided on each actuator.
  • the line-of-sight information processing section 223 performs preprocessing on the line-of-sight information included in the control signal received by the control signal reception processing section 221. Specifically, the line-of-sight information processing unit 223 detects outliers in the line-of-sight information and replaces values exceeding the interquartile range with values from the previous frame in order to alleviate the influence of noise. The line-of-sight information processing unit 223 also smooths the line-of-sight information using a moving average in order to alleviate the influence of noise and make it easier to learn features during prediction.
  • the predictive control signal generation processing unit 224 generates a predictive control signal based on the motion information included in the control signal received by the control signal reception processing unit 221, the line-of-sight information preprocessed by the line-of-sight information processing unit 223, and the information on the current state of the remote control target 250 acquired by the controlled object state acquisition processing unit 222.
  • the predictive control signal transmission processing section 225 transmits the predictive control signal generated by the predictive control signal generation processing section 224 to the operation section 230.
  • the operation section 230 includes a predictive control signal reception processing section 231, an operation control processing section 232, an operation state acquisition processing section 233, and an operation state information transmission processing section 234.
  • the predictive control signal reception processing section 231 receives the predictive control signal transmitted from the predictive control signal transmission processing section 225 of the prediction section 220.
  • the operation control processing section 232 controls the operation of the remote control target 250 according to the predictive control signal received by the predictive control signal reception processing section 231. That is, the operation control processing unit 232 controls each actuator of the remote control target 250 according to the predictive control signal.
  • the operation state acquisition processing unit 233 acquires operation state information indicating the state of the result of the operation performed by the remote control target 250 under the control of the operation control processing unit 232. For example, when the remote control target 250 is a six-axis controlled arm robot, the operating state acquisition processing unit 233 acquires rotation angle information and torque information of each of the six actuators.
  • the operating state information transmission processing unit 234 transmits the operating state information acquired by the operating state acquisition processing unit 233 to the remote control device 100 via the communication unit 210 and the communication network 300.
  • the operating state information is received by the display unit 120 of the remote control device 100, and the display unit 120 displays the received operating state information.
  • FIG. 4 is a flowchart illustrating an example of a process executed by the remote control device 100.
  • in step S11, the detection unit 110 of the remote control device 100 detects the operator's control input. Specifically, the motion information acquisition processing unit 111 of the detection unit 110 detects the operator's motion information, and the line-of-sight information acquisition processing unit 112 of the detection unit 110 detects the operator's line-of-sight information.
  • in step S12, the control signal transmission processing unit 113 of the detection unit 110 generates a control signal including the motion information and the line-of-sight information, based on the operator's motion information acquired by the motion information acquisition processing unit 111 and the operator's line-of-sight information acquired by the line-of-sight information acquisition processing unit 112. The control signal transmission processing unit 113 then transmits the control signal to the controlled device 200 via the communication unit 130 and the communication network 300.
  • the remote control device 100 repeatedly executes the processes of steps S11 and S12 described above.
  • FIG. 5 is a flowchart illustrating an example of a process executed by the controlled device 200.
  • in step S21, the control signal reception processing unit 221 of the prediction unit 220 of the controlled device 200 receives the control signal transmitted from the remote control device 100 through the communication unit 210 via the communication network 300.
  • in step S22, the predictive control signal generation processing unit 224 of the prediction unit 220 generates a predictive control signal based on the motion information and the line-of-sight information (the latter preprocessed by the line-of-sight information processing unit 223) included in the control signal received by the control signal reception processing unit 221 and, if necessary, on the information on the current state of the remote control target 250 acquired by the controlled object state acquisition processing unit 222.
  • in step S23, the operation control processing unit 232 of the operation unit 230 of the controlled device 200 controls the operation of the remote control target 250 using the predictive control signal generated by the prediction unit 220.
  • in step S24, the controlled device 200 transmits information indicating the result of the operation of the remote control target 250 to the remote control device 100 via the communication network 300.
  • the photographing unit 240 photographs the remote control target 250 and transmits the image information to the remote control device 100 via the communication unit 210 and the communication network 300 .
  • the operating state acquisition processing unit 233 acquires the operating state information of the remote control target 250, and the operating state information transmission processing unit 234 transmits the operating state information to the remote control device 100 through the communication unit 210 and the communication network 300.
  • the remote control device 100 displays the result of the operation of the remote control target 250 on the display unit 120 based on the received video information and operating state information.
  • the controlled device 200 repeatedly executes the processes of steps S21 to S24 described above.
  • as described above, in the remote control system according to the present embodiment, the detection unit 110 of the remote control device 100 generates a control signal that includes the operator's motion information and line-of-sight information, and the prediction unit 220 of the controlled device 200 generates a predictive control signal using the motion information and the line-of-sight information included in that control signal.
  • since the remote control system according to the present embodiment uses information from two modality devices (sensor output, device output, video data, etc.) as control input, it can control the remote control target 250 more stably than a remote control system that uses information from only a single modality device as control input.
  • the prediction unit 220 of the controlled device 200 can generate the predictive control signal while suppressing a decrease in prediction accuracy.
  • the prediction unit 220 of the controlled device 200 uses line-of-sight information indicating the operator's intention for prediction, it is possible to predict the operator's long-term future movements with high accuracy.
  • FIG. 6 is a block diagram showing an example of the hardware configuration of the remote control device 100 and the controlled device 200 of the remote control system according to an embodiment of the present invention.
  • the detection unit 110, display unit 120, and communication unit 130 of the remote control device 100 are configured by a computer. Furthermore, the communication section 210, prediction section 220, operation section 230, and photographing section 240 of the controlled device 200 are configured by a computer.
  • the computer may be, for example, a personal computer, a server computer, or the like.
  • the computer has a hardware processor 501, a program memory 502, a data memory 503, a communication interface 504, and an input/output interface 505.
  • the hardware processor 501, program memory 502, data memory 503, communication interface 504, and input/output interface 505 are connected to each other via a bus 510, and can transmit and receive information between them.
  • the computer also has an input device 600 and an output device 700, as appropriate.
  • the input device 600 and the output device 700 are connected to the input/output interface 505 and can transmit and receive information to and from the input/output interface 505, respectively.
  • the hardware processor 501 is, for example, a CPU (Central Processing Unit).
  • the hardware processor 501 executes programs, performs data arithmetic processing, and the like.
  • the hardware processor 501 controls a program memory 502, a data memory 503, a communication interface 504, an input/output interface 505, and further controls an input device 600 and an output device 700 connected to the input/output interface 505.
  • the program memory 502 is a non-transitory tangible storage medium configured, for example, by combining a nonvolatile memory that can be written to and read from at any time, such as an HDD (Hard Disk Drive) or an SSD (Solid State Drive), with a nonvolatile memory such as a ROM (Read Only Memory).
  • the program memory 502 stores the programs executed by the hardware processor 501 so that the remote control device 100 or the controlled device 200 can execute various processes.
  • the data memory 503 is a tangible storage medium configured, for example, by combining the above-mentioned nonvolatile memory with a volatile memory such as a RAM (Random Access Memory). The data memory 503 temporarily stores data necessary for the processing executed by the hardware processor 501.
  • the communication interface 504 includes, for example, a wireless communication interface unit, and enables transmission and reception of information between the hardware processor 501 and the like and the communication network NW.
  • as the wireless communication interface unit, for example, an interface adopting a low-power wireless data communication standard such as a wireless LAN (Local Area Network) may be used.
  • the input/output interface 505 includes a wireless or wired communication interface unit, and enables information to be transmitted and received between the hardware processor 501 and the like, and the input device 600 and the output device 700.
  • the input device 600 may include any information input equipment such as a keyboard, mouse, touch panel, pointing device, camera, measurement device, joystick, etc.
  • the output device 700 may include any information output equipment such as a speaker, a light emitting device, etc., in addition to a display device such as a liquid crystal display or an organic EL display.
  • the function of each processing unit of the detection unit 110 can be implemented by the hardware processor 501 reading and executing the program stored in the program memory 502 in cooperation with the data memory 503.
  • the display unit 120 is configured by an output device 700 such as a display device.
  • the communication unit 130 is configured by a communication interface 504. Further, a modality device that provides the operator's motion information and the operator's line of sight information to the motion information acquisition processing section 111 and the line of sight information acquisition processing section 112, respectively, corresponds to the input device 600.
  • the communication unit 210 is configured by a communication interface 504.
  • the functions of each processing unit of the prediction unit 220 and of each processing unit of the operation unit 230 can be implemented by the hardware processor 501 reading and executing the program stored in the program memory 502 in cooperation with the data memory 503.
  • the photographing unit 240 includes an input device 600 such as a camera.
  • each processing unit of the detection unit 110 of the remote control device 100, and each processing unit of the prediction unit 220 and the operation unit 230 of the controlled device 200, may also be implemented in various other forms, including integrated circuits such as ASICs (Application Specific Integrated Circuits) and FPGAs (Field-Programmable Gate Arrays).
  • the present invention is not limited to the above-described embodiments, and can be variously modified at the implementation stage without departing from the gist thereof.
  • each embodiment may be implemented in combination as appropriate, and in that case, the combined effect can be obtained.
  • the embodiments described above include various inventions, and various inventions can be extracted by combinations selected from the plurality of constituent features disclosed. For example, if a problem can be solved and an effect can be obtained even if some constituent features are deleted from all the constituent features shown in the embodiment, the configuration from which these constituent features are deleted can be extracted as an invention.
  • DESCRIPTION OF SYMBOLS: 100…Remote control device, 110…Detection unit, 111…Motion information acquisition processing unit, 112…Line-of-sight information acquisition processing unit, 113…Control signal transmission processing unit, 120…Display unit, 130…Communication unit, 200…Controlled device, 210…Communication unit, 220…Prediction unit, 221…Control signal reception processing unit, 222…Controlled object state acquisition processing unit, 223…Line-of-sight information processing unit, 224…Predictive control signal generation processing unit, 225…Predictive control signal transmission processing unit, 230…Operation unit, 231…

Abstract

This remote control system has a remote control device that is operated by an operator, and a control target device that is remotely operated by means of the remote control device. The remote control device has a control signal transmission processing unit that transmits a control signal including first control information and second control information. The control target device has a control signal reception processing unit that receives the control signal, a prediction control signal generation processing unit that generates a prediction control signal on the basis of the first control information and the second control information included in the control signal, and an operation control processing unit that controls the operation of the remote control target on the basis of the prediction control signal.

Description

Remote control system, remote control method, and remote control program
The present invention relates to a remote control system, a remote control method, and a remote control program.
With the spread of the Internet and the increase in communication speeds in recent years, efforts have been made to construct remote control systems in which a remote control device controls a remote control target, such as a humanoid or arm-type robot, via a communication network.
Remote control systems include systems in which an operator operates a robot at a remote location and systems in which a robot supports the operator's work. Such a remote control system needs to control the robot so that it synchronizes with, or cooperates with, the movements of the operator.
In a remote control system, it is desirable that control signals reflecting the operator's movements be transmitted to the robot without delay. In reality, however, control signals are transmitted to the robot with a delay due to communication and processing.
In general, to compensate for the delayed transmission of control signals, a remote control system uses a predictive model that predicts the human's future movements, generates a predictive control signal that matches those future movements, and controls the robot with the predictive control signal.
Non-Patent Document 1 discloses a remote control system that uses a predictive model of human motion to predict the future state of a human and convey the human's intentions and motions to a robot.
In general, predictive models of human motion are constructed based on the output of sensors attached to the human body and on human joint information, called skeletal information, obtained by image processing of human video data.
Non-Patent Document 2 discloses a method that assumes a correlation between past and future human movements and predicts future joint information from input sequence data of past human joint information.
An object of the present invention is to provide a remote control system, a remote control method, and a remote control program that accurately and stably control a remote control target.
One aspect of the present invention is a remote control system. The remote control system includes a remote control device operated by an operator and a controlled device remotely controlled by the remote control device. The remote control device includes a control signal transmission processing unit that transmits a control signal including first control information and second control information. The controlled device includes a control signal reception processing unit that receives the control signal, a predictive control signal generation processing unit that generates a predictive control signal based on the first control information and the second control information included in the control signal received by the control signal reception processing unit, and an operation control processing unit that controls the operation of a remote control target based on the predictive control signal generated by the predictive control signal generation processing unit.
One aspect of the present invention is a remote control method. The remote control method includes a first step of transmitting a control signal including first control information and second control information from a remote control device to a controlled device, a second step of receiving the control signal at the controlled device, a third step of generating a predictive control signal based on the first control information and the second control information included in the control signal received in the second step, and a fourth step of controlling the operation of a remote control target based on the predictive control signal generated in the third step.
One aspect of the present invention is a remote control program. The remote control program causes a processor of a computer to execute the functions of the control signal transmission processing unit of the above remote control device, or the functions of the control signal reception processing unit, the predictive control signal generation processing unit, and the operation control processing unit of the above controlled device.
According to the present invention, there are provided a remote control system, a remote control method, and a remote control program that accurately and stably control a remote control target.
FIG. 1 is a diagram illustrating an example of an outline of general predictive control. FIG. 2 is a diagram illustrating an example of an overview of predictive control according to an embodiment of the present invention. FIG. 3 is a block diagram showing an example of a remote control system according to an embodiment of the present invention. FIG. 4 is a flowchart illustrating an example of a process executed by a remote control device of a remote control system according to an embodiment of the present invention. FIG. 5 is a flowchart illustrating an example of a process executed by a controlled device of a remote control system according to an embodiment of the present invention. FIG. 6 is a block diagram showing an example of the hardware configuration of a remote control device and a controlled device of a remote control system according to an embodiment of the present invention.
Hereinafter, embodiments according to the present invention will be described with reference to the drawings.
[Overview of predictive control]
First, an overview of general predictive control and an overview of predictive control according to an embodiment of the present invention will be described with reference to FIGS. 1 and 2. FIG. 1 is a diagram illustrating an example of an outline of general predictive control. FIG. 2 is a diagram illustrating an example of an overview of predictive control according to an embodiment of the present invention.
In predictive control, a predictive model is used that converts control signals delayed by communication and processing into predictive control signals, which are future control signals. Both FIG. 1 and FIG. 2 show examples in which the predictive model is composed of an encoder and a decoder. The predictive model is not limited to the configurations shown in FIGS. 1 and 2 and may be, for example, a model calculated from known parameters of the robot.
In the predictive control shown in FIG. 1, a control signal is input to the encoder, and a predictive control signal is output from the decoder based on the control signal. The control signal has a single piece of control information obtained by a single modality device. Generally, the control information is human joint information.
This predictive control uses a predictive model of human motion that relies, for example, only on human joint information. For this reason, if the human joint information contains missing data or noise, prediction accuracy decreases. Furthermore, when predicting a human's long-term future movements, the information necessary for the prediction may not be contained in the human's past movements, and in that case the accuracy of the prediction decreases.
In the predictive control shown in FIG. 2, a control signal including first control information and second control information is input to the encoder, and a predictive control signal is output from the decoder. The first control information is control information obtained by a first modality device. The second control information is control information obtained by a second modality device different from the first modality device. For example, the first control information is human joint information, and the second control information is human line-of-sight information.
This predictive control uses a predictive model of human motion that utilizes, for example, human joint information together with human line-of-sight information, which indicates the human's intentions. Therefore, even if the human joint information contains missing data or noise, a decrease in prediction accuracy can be suppressed. Furthermore, it is possible to predict long-term future human movements with high accuracy. This makes it possible to control the robot stably.
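A minimal sketch of how such a two-modality predictive model could look, assuming a GRU-based encoder-decoder built with PyTorch; the feature dimensions, prediction horizon, and every identifier below are illustrative assumptions rather than details taken from this publication.

```python
# Sketch of a two-modality predictive model (joint information + line-of-sight
# information), loosely following the encoder-decoder structure of FIG. 2.
# All dimensions and layer choices are assumptions for illustration only.
import torch
import torch.nn as nn


class TwoModalityPredictor(nn.Module):
    def __init__(self, joint_dim=18, gaze_dim=3, hidden_dim=128, horizon=10):
        super().__init__()
        self.horizon = horizon  # number of future frames to predict
        # The encoder consumes past joint and gaze sequences together.
        self.encoder = nn.GRU(joint_dim + gaze_dim, hidden_dim, batch_first=True)
        # The decoder unrolls the predicted future joint trajectory.
        self.decoder = nn.GRU(joint_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, joint_dim)

    def forward(self, joints, gaze):
        # joints: (batch, past_frames, joint_dim); gaze: (batch, past_frames, gaze_dim)
        _, hidden = self.encoder(torch.cat([joints, gaze], dim=-1))
        step = joints[:, -1:, :]  # start decoding from the last observed joints
        predictions = []
        for _ in range(self.horizon):
            dec_out, hidden = self.decoder(step, hidden)
            step = self.out(dec_out)  # predicted joints for the next frame
            predictions.append(step)
        return torch.cat(predictions, dim=1)  # (batch, horizon, joint_dim)


# Example: predict 10 future frames from 30 past frames of joint and gaze data.
model = TwoModalityPredictor()
past_joints = torch.randn(1, 30, 18)
past_gaze = torch.randn(1, 30, 3)
print(model(past_joints, past_gaze).shape)  # torch.Size([1, 10, 18])
```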
[Remote control system]
Next, an example of a remote control system according to an embodiment of the present invention will be described with reference to FIG. 3. FIG. 3 is a diagram showing an example of a remote control system according to an embodiment of the present invention.
As shown in FIG. 3, a remote control system according to an embodiment of the present invention includes a remote control device 100 and a controlled device 200. The remote control device 100 and the controlled device 200 are connected, for example, via a communication network 300 so that they can communicate in both directions.
The remote control device 100 is a device for remotely operating the controlled device 200, and the controlled device 200 is a device remotely operated by the remote control device 100. The remote control device 100 is operated by an operator in order to remotely control the operation of the controlled device 200. The controlled device 200 operates in response to the operator's operation of the remote control device 100.
Here, remote control means controlling the operation of the controlled device 200 according to a control signal output by the remote control device 100, regardless of the distance between the remote control device 100 and the controlled device 200.
In one example, the controlled device 200 is a device that is located in a remote location far away from the remote control device 100 and whose operation is controlled by the operator.
In another example, the controlled device 200 is a device that is placed near the remote control device 100 and supports the operator's work, or a device that performs collaborative work with the operator.
[Remote control device 100]
Next, the configuration and operation of the remote control device 100 shown in FIG. 3 will be described. The remote control device 100 is a device that remotely controls the controlled device 200 via the communication network 300.
The remote control device 100 includes a detection unit 110, a display unit 120, and a communication unit 130.
The detection unit 110 detects the operator's control input. The operator's control input includes the operator's motion information and the operator's line-of-sight information. The operator's motion information is, for example, skeletal information or joint information of the operator. The detection unit 110 also generates a control signal for remotely controlling the controlled device 200.
The display unit 120 receives information transmitted from the controlled device 200 and displays the received information. The information received by the display unit 120 includes video information and operating state information of the remote control target 250, as will be described later.
The communication unit 130 is an interface that enables information to be transmitted and received between the remote control device 100 and the controlled device 200 via the communication network 300.
(Detection unit 110)
Next, each part of the detection unit 110 will be described. The detection unit 110 includes a motion information acquisition processing unit 111, a line-of-sight information acquisition processing unit 112, and a control signal transmission processing unit 113.
The motion information acquisition processing unit 111 acquires the operator's motion information. For example, the motion information is skeletal information or joint information of the operator. The motion information acquisition processing unit 111 acquires the operator's skeletal information or joint information based on the output of sensors attached to the operator's body or on moving image data of the operator. The motion information may also be information on the operator's operation of an input device such as a joystick. In that case, the motion information acquisition processing unit 111 acquires the operator's operation information based on the output of the input device.
The line-of-sight information acquisition processing unit 112 acquires the operator's line-of-sight information. For example, the line-of-sight information is information on the eyeball direction obtained from image processing of moving image data of the operator, or information on the three-dimensional gaze position obtained from a gaze measurement device.
The motion information acquired by the motion information acquisition processing unit 111 corresponds to the first control information in the predictive control described with reference to FIG. 2, and the line-of-sight information acquired by the line-of-sight information acquisition processing unit 112 corresponds to the second control information. That is, the motion information acquisition processing unit 111 and the line-of-sight information acquisition processing unit 112 each acquire a different type of control information obtained by a different type of modality device (a sensor, a camera, a measurement device, etc.).
The control signal transmission processing unit 113 generates a control signal that includes the motion information acquired by the motion information acquisition processing unit 111 and the line-of-sight information acquired by the line-of-sight information acquisition processing unit 112. The control signal transmission processing unit 113 transmits the control signal to the controlled device 200 through the communication unit 130 via the communication network 300.
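The publication does not specify how the control signal is encoded or transported. Purely as an illustration, the sketch below assumes a JSON payload sent over UDP, with hypothetical field names for the two kinds of control information.

```python
# Sketch of assembling and sending a control signal that carries both the
# operator's motion information and line-of-sight information.
# The wire format, field names, and address are assumptions.
import json
import socket
import time

CONTROLLED_DEVICE_ADDR = ("192.0.2.10", 9000)  # placeholder address


def build_control_signal(joint_angles, gaze_point):
    """Pack motion information (first control information) and line-of-sight
    information (second control information) into one control signal."""
    return {
        "timestamp": time.time(),
        "motion": {"joint_angles": joint_angles},
        "gaze": {"gaze_point_3d": gaze_point},
    }


def send_control_signal(sock, signal):
    sock.sendto(json.dumps(signal).encode("utf-8"), CONTROLLED_DEVICE_ADDR)


# Example usage with dummy operator data.
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
send_control_signal(sock, build_control_signal([0.0] * 18, [0.3, 0.1, 1.2]))
```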
[Controlled device 200]
Next, the configuration and operation of the controlled device 200 shown in FIG. 3 will be described. The controlled device 200 is a device that is remotely controlled by the remote control device 100 via the communication network 300.
The controlled device 200 includes a communication unit 210, a prediction unit 220, an operation unit 230, and a photographing unit 240. The controlled device 200 also has a remote control target 250. The remote control target 250 is, for example, a humanoid robot or an arm-type robot.
In this embodiment, the remote control target 250 is described as a part of the controlled device 200, in other words, as something included in the controlled device 200, as shown in FIG. 3. However, the remote control target 250 may be a separate element from the controlled device 200, in other words, an element external to the controlled device 200.
The communication unit 210 is an interface that enables information to be transmitted and received between the controlled device 200 and the remote control device 100 via the communication network 300.
The prediction unit 220 receives, through the communication unit 210, the control signal transmitted from the remote control device 100 via the communication network 300. The control signal received by the prediction unit 220 is delayed relative to the operator's control input due to communication and processing. The prediction unit 220 performs prediction processing on the control signal and generates a predictive control signal. The predictive control signal is ideally a control signal that does not include the influence of the delay. In other words, the predictive control signal can be said to be a control signal suited to a point in time later than the operator's control input.
The operation unit 230 actually operates the remote control target 250 based on the predictive control signal generated by the prediction unit 220.
The photographing unit 240 photographs the remote control target 250. The video photographed by the photographing unit 240 shows the state resulting from the operation performed by the remote control target 250 under the control of the operation unit 230. The photographing unit 240 also transmits the video information to the remote control device 100 through the communication unit 210 via the communication network 300. The video information is received by the display unit 120 of the remote control device 100, and the display unit 120 displays the video based on the received video information.
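Because the received control signal lags the operator's input, the prediction unit has to decide how far into the future the predictive control signal should reach. A minimal sketch of deriving that horizon from a timestamp carried in the control signal is shown below; the timestamp field, frame rate, and processing margin are assumptions, not details from the publication.

```python
# Sketch: choose a prediction horizon that covers the observed delay between
# the operator's control input and its arrival at the controlled device.
import time


def prediction_horizon_frames(signal_timestamp, frame_rate_hz=30.0,
                              processing_margin_s=0.02):
    """Number of future frames the predictive control signal should cover."""
    delay_s = max(0.0, time.time() - signal_timestamp) + processing_margin_s
    return max(1, int(round(delay_s * frame_rate_hz)))


# Example: a control signal generated about 100 ms ago.
print(prediction_horizon_frames(time.time() - 0.1))  # roughly 4 frames at 30 Hz
```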
(Prediction unit 220)
Next, each part of the prediction unit 220 will be described. The prediction unit 220 includes a control signal reception processing unit 221, a controlled object state acquisition processing unit 222, a line-of-sight information processing unit 223, a predictive control signal generation processing unit 224, and a predictive control signal transmission processing unit 225.
The control signal reception processing unit 221 receives the control signal from the remote control device 100 via the communication unit 210.
The controlled object state acquisition processing unit 222 acquires the current state of the remote control target 250. For example, when the remote control target 250 is a six-axis arm-type robot, the controlled object state acquisition processing unit 222 acquires rotation angle information and torque information for each of the robot's six actuators. The rotation angle information and torque information of the six actuators can indicate the state of the arm-type robot. The rotation angle information and torque information of each actuator are detected, for example, by a sensor provided on each actuator.
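A minimal sketch of how the current state of a six-axis arm-type robot (rotation angle and torque per actuator) might be represented and flattened for use in prediction; the sensor access is mocked and all names are assumptions.

```python
# Sketch of the controlled-object state: per-actuator rotation angle and torque
# for a six-axis arm robot. Real sensor access is replaced by dummy values.
from dataclasses import dataclass
from typing import List


@dataclass
class ActuatorState:
    rotation_angle_rad: float
    torque_nm: float


def read_arm_state(num_actuators: int = 6) -> List[ActuatorState]:
    """Read each actuator's sensor; here the readings are dummy values."""
    return [ActuatorState(rotation_angle_rad=0.0, torque_nm=0.0)
            for _ in range(num_actuators)]


def state_to_vector(states: List[ActuatorState]) -> List[float]:
    """Flatten (angle, torque) x 6 into a 12-dimensional state vector."""
    vec: List[float] = []
    for s in states:
        vec.extend([s.rotation_angle_rad, s.torque_nm])
    return vec


print(state_to_vector(read_arm_state()))  # 12 zeros for the dummy reading
```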
The line-of-sight information processing unit 223 performs preprocessing on the line-of-sight information included in the control signal received by the control signal reception processing unit 221. Specifically, to alleviate the influence of noise, the line-of-sight information processing unit 223 detects outliers in the line-of-sight information and replaces values exceeding the interquartile range with the values from the previous frame. The line-of-sight information processing unit 223 also smooths the line-of-sight information using a moving average in order to alleviate the influence of noise and make features easier to learn during prediction.
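A minimal sketch of this preprocessing, assuming NumPy, a 1.5 × IQR fence for outlier detection, and a five-frame moving-average window; the publication states only that values exceeding the interquartile range are replaced with the previous frame's value and that a moving average is applied, so the exact fence and window size here are assumptions.

```python
# Sketch of line-of-sight preprocessing: IQR-based outlier replacement with the
# previous frame's value, followed by moving-average smoothing.
import numpy as np


def preprocess_gaze(gaze: np.ndarray, window: int = 5) -> np.ndarray:
    """gaze: (frames, dims) array of gaze measurements, e.g. 3D gaze points."""
    cleaned = gaze.astype(float).copy()
    q1 = np.percentile(cleaned, 25, axis=0)
    q3 = np.percentile(cleaned, 75, axis=0)
    iqr = q3 - q1
    lower, upper = q1 - 1.5 * iqr, q3 + 1.5 * iqr

    # Replace outliers with the previous frame's value.
    for t in range(1, cleaned.shape[0]):
        outlier = (cleaned[t] < lower) | (cleaned[t] > upper)
        cleaned[t][outlier] = cleaned[t - 1][outlier]

    # Smooth each dimension with a simple moving average.
    kernel = np.ones(window) / window
    smoothed = np.vstack([
        np.convolve(cleaned[:, d], kernel, mode="same")
        for d in range(cleaned.shape[1])
    ]).T
    return smoothed


# Example: 100 frames of 3D gaze data with one injected outlier.
rng = np.random.default_rng(0)
gaze = rng.normal(size=(100, 3))
gaze[50] = [25.0, -30.0, 40.0]  # spurious measurement
print(preprocess_gaze(gaze).shape)  # (100, 3)
```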
 The predictive control signal generation processing unit 224 generates the predictive control signal based on the motion information included in the control signal received by the control signal reception processing unit 221, the line-of-sight information preprocessed by the line-of-sight information processing unit 223, and the information on the current state of the remote control target 250 acquired by the controlled object state acquisition processing unit 222.
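 A minimal sketch of how these three inputs could be combined and passed to a learned predictor is shown below; the predictor interface, the feature layout, and the prediction horizon are assumptions and not part of the disclosed embodiment:

    import numpy as np

    def generate_predictive_control_signal(motion, gaze, robot_state, model, horizon_steps=10):
        # motion:      recent operator motion information, shape (frames, motion_dim)
        # gaze:        preprocessed line-of-sight information, shape (frames, 2)
        # robot_state: current actuator state vector, shape (state_dim,)
        # model:       any predictor exposing predict(features, horizon) (assumed interface)
        frames = np.hstack([motion, gaze])                    # fuse the two modalities per frame
        state = np.tile(robot_state, (frames.shape[0], 1))    # repeat the current robot state
        features = np.hstack([frames, state])
        # The predictor returns the control signal expected 'horizon_steps' into the future,
        # which compensates for the communication and processing delay.
        return model.predict(features, horizon_steps)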
 The predictive control signal transmission processing unit 225 transmits the predictive control signal generated by the predictive control signal generation processing unit 224 to the operation unit 230.
 (Operation unit 230)
 Next, each part of the operation unit 230 will be described. The operation unit 230 includes a predictive control signal reception processing unit 231, an operation control processing unit 232, an operation state acquisition processing unit 233, and an operation state information transmission processing unit 234.
 The predictive control signal reception processing unit 231 receives the predictive control signal transmitted from the predictive control signal transmission processing unit 225 of the prediction unit 220.
 The operation control processing unit 232 controls the operation of the remote control target 250 in accordance with the predictive control signal received by the predictive control signal reception processing unit 231. That is, the operation control processing unit 232 controls each actuator of the remote control target 250 in accordance with the predictive control signal.
 The operation state acquisition processing unit 233 acquires operation state information indicating the state resulting from the operation performed by the remote control target 250 under the control of the operation control processing unit 232. For example, when the remote control target 250 is a six-axis arm robot, the operation state acquisition processing unit 233 acquires rotation angle information and torque information for each of the six actuators.
 The operation state information transmission processing unit 234 transmits the operation state information acquired by the operation state acquisition processing unit 233 to the remote control device 100 through the communication unit 210 and via the communication network 300. The operation state information is received by the display unit 120 of the remote control device 100, and the display unit 120 displays the received operation state information.
 [Operation example]
 An example of the remote control operation performed by the remote control system according to an embodiment of the present invention will be described below.
 (Processing of the remote control device 100)
 First, an example of the processing executed by the remote control device 100 will be described with reference to FIG. 4. FIG. 4 is a flowchart illustrating an example of the processing executed by the remote control device 100.
 First, in step S11, the detection unit 110 of the remote control device 100 detects the operator's control input. Specifically, the motion information acquisition processing unit 111 of the detection unit 110 detects the operator's motion information, and the line-of-sight information acquisition processing unit 112 of the detection unit 110 detects the operator's line-of-sight information.
 Next, in step S12, the control signal transmission processing unit 113 of the detection unit 110 generates a control signal containing the motion information and the line-of-sight information, based on the operator's motion information acquired by the motion information acquisition processing unit 111 and the operator's line-of-sight information acquired by the line-of-sight information acquisition processing unit 112. The control signal transmission processing unit 113 then transmits the control signal to the controlled device 200 through the communication unit 130 and via the communication network 300.
 The remote control device 100 repeatedly executes the processing of steps S11 and S12 described above.
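 For illustration only, the operator-side loop of steps S11 and S12 might look as follows; the sensor objects, message format, and transport are hypothetical placeholders rather than elements disclosed in this application:

    import json
    import socket
    import time

    def operator_loop(motion_sensor, gaze_tracker, sock: socket.socket, period_s: float = 0.02):
        # Step S11: detect the control input; step S12: build and send the control signal.
        while True:
            motion = motion_sensor.read()   # hypothetical: operator motion information
            gaze = gaze_tracker.read()      # hypothetical: operator line-of-sight information
            control_signal = {
                "timestamp": time.time(),
                "motion": motion,           # first control information
                "gaze": gaze,               # second control information
            }
            sock.sendall(json.dumps(control_signal).encode() + b"\n")
            time.sleep(period_s)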
 (Processing of the controlled device 200)
 Next, an example of the processing executed by the controlled device 200 will be described with reference to FIG. 5. FIG. 5 is a flowchart illustrating an example of the processing executed by the controlled device 200.
 First, in step S21, the control signal reception processing unit 221 of the prediction unit 220 of the controlled device 200 receives, through the communication unit 210, the control signal transmitted from the remote control device 100 via the communication network 300.
 Next, in step S22, the predictive control signal generation processing unit 224 of the prediction unit 220 generates a predictive control signal using the motion information and the line-of-sight information (specifically, the line-of-sight information preprocessed by the line-of-sight information processing unit 223) included in the control signal received by the control signal reception processing unit 221 and, if necessary, further using the information on the current state of the remote control target 250 acquired by the controlled object state acquisition processing unit 222.
 Subsequently, in step S23, the operation control processing unit 232 of the operation unit 230 of the controlled device 200 controls the operation of the remote control target 250 using the predictive control signal generated by the prediction unit 220.
 Thereafter, in step S24, the controlled device 200 transmits information indicating the result of the operation of the remote control target 250 to the remote control device 100 via the communication network 300. For example, the photographing unit 240 photographs the remote control target 250 and transmits the video information to the remote control device 100 through the communication unit 210 and via the communication network 300. In addition, the operation state acquisition processing unit 233 acquires the operation state information of the remote control target 250, and the operation state information transmission processing unit 234 transmits the operation state information to the remote control device 100 through the communication unit 210 and via the communication network 300. The remote control device 100 displays the result of the operation of the remote control target 250 on the display unit 120 based on the received video information and operation state information.
 The controlled device 200 repeatedly executes the processing of steps S21 to S24 described above.
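 Again purely as a sketch, steps S21 to S24 can be organized as a receive-predict-actuate-report loop; every collaborator passed in below (connection, preprocessing function, predictor, robot interface, reporter) is an assumed placeholder with a hypothetical interface:

    import json

    def controlled_device_loop(conn_file, preprocess, predictor, robot, reporter):
        # S21: receive; S22: predict; S23: actuate; S24: report back to the operator.
        for line in conn_file:                                   # one control signal per line
            signal = json.loads(line)
            gaze = preprocess(signal["gaze"])                    # e.g. the IQR/moving-average step shown earlier
            state = robot.read_state()                           # current actuator angles and torques
            predicted = predictor.predict(signal["motion"], gaze, state)   # S22
            robot.apply(predicted)                               # S23: drive each actuator
            reporter.send(robot.read_state(), robot.capture_image())      # S24: state and video feedback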
 [Effects]
 In the remote control system according to the present embodiment, the detection unit 110 of the remote control device 100 generates a control signal that includes the operator's motion information and line-of-sight information, and the prediction unit 220 of the controlled device 200 generates a predictive control signal using the operator's motion information and line-of-sight information.
 That is, because the remote control system according to the present embodiment uses information from two modality devices (sensor outputs, device outputs, video data, and the like) as control input, it can control the remote control target 250 more stably than a remote control system that uses information from a single modality device as control input.
 Specifically, even when the operator's motion information is partially missing or the control signal is contaminated by noise, the prediction unit 220 of the controlled device 200 can generate the predictive control signal while suppressing a drop in prediction accuracy.
 Furthermore, because the prediction unit 220 of the controlled device 200 uses the line-of-sight information, which indicates the operator's intention, for prediction, it can predict the operator's motion far into the future with high accuracy.
 [Hardware configuration]
 An example of the hardware configuration of the remote control device 100 and the controlled device 200 of the remote control system according to an embodiment of the present invention will be described with reference to FIG. 6. FIG. 6 is a block diagram showing an example of the hardware configuration of the remote control device 100 and the controlled device 200 of the remote control system according to an embodiment of the present invention.
 The detection unit 110, the display unit 120, and the communication unit 130 of the remote control device 100 are implemented by a computer. Likewise, the communication unit 210, the prediction unit 220, the operation unit 230, and the photographing unit 240 of the controlled device 200 are implemented by a computer. The computer may be, for example, a personal computer, a server computer, or the like.
 The computer has a hardware processor 501, a program memory 502, a data memory 503, a communication interface 504, and an input/output interface 505. The hardware processor 501, the program memory 502, the data memory 503, the communication interface 504, and the input/output interface 505 are connected to one another via a bus 510 and can exchange information with one another.
 The computer also has, as appropriate, an input device 600 and an output device 700. The input device 600 and the output device 700 are connected to the input/output interface 505 and can each transmit information to and receive information from the input/output interface 505.
 The hardware processor 501 is, for example, a CPU (Central Processing Unit). The hardware processor 501 executes programs, performs arithmetic processing on data, and so on. The hardware processor 501 controls the program memory 502, the data memory 503, the communication interface 504, and the input/output interface 505, and further controls the input device 600 and the output device 700 connected to the input/output interface 505.
 The program memory 502 is configured, as a non-transitory tangible storage medium, by combining a nonvolatile memory that can be written to and read from at any time, such as an HDD (Hard Disk Drive) or an SSD (Solid State Drive), with a nonvolatile memory such as a ROM (Read Only Memory). The program memory 502 stores the programs executed by the hardware processor 501 so that the remote control device 100 or the controlled device 200 can carry out each process.
 The data memory 503 is configured, as a tangible storage medium, by combining the above-described nonvolatile memory with a volatile memory such as a RAM (Random Access Memory). The data memory 503 temporarily stores data required for the processing executed by the hardware processor 501.
 The communication interface 504 includes, for example, a wireless communication interface unit and enables information to be exchanged between the hardware processor 501 and the like and the communication network NW. As the wireless interface, for example, an interface adopting a low-power wireless data communication standard such as a wireless LAN (Local Area Network) may be used.
 The input/output interface 505 includes a wireless or wired communication interface unit and enables information to be exchanged between the hardware processor 501 and the like and the input device 600 and the output device 700.
 The input device 600 may include any information input equipment such as a keyboard, a mouse, a touch panel, a pointing device, a camera, a measurement device, or a joystick.
 The output device 700 may include, in addition to a display device such as a liquid crystal display or an organic EL display, any information output equipment such as a speaker or a light-emitting device.
 In such a hardware configuration, with regard to the remote control device 100, the function of each processing unit of the detection unit 110 can be implemented by the hardware processor 501, in cooperation with the data memory 503, reading and executing the program stored in the program memory 502. The display unit 120 is configured by an output device 700 such as a display device. The communication unit 130 is configured by the communication interface 504. The modality devices that provide the operator's motion information and the operator's line-of-sight information to the motion information acquisition processing unit 111 and the line-of-sight information acquisition processing unit 112, respectively, correspond to the input device 600.
 With regard to the controlled device 200, the communication unit 210 is configured by the communication interface 504. The functions of the processing units of the prediction unit 220 and of the operation unit 230 can be implemented by the hardware processor 501, in cooperation with the data memory 503, reading and executing the program stored in the program memory 502. The photographing unit 240 is configured by an input device 600 such as a camera.
 Some or all of the processing units of the detection unit 110 of the remote control device 100 and of the prediction unit 220 and the operation unit 230 of the controlled device 200 may be implemented in various other forms, including integrated circuits such as an ASIC (Application Specific Integrated Circuit) or an FPGA (Field-Programmable Gate Array).
 Note that the present invention is not limited to the above-described embodiment and can be variously modified at the implementation stage without departing from the gist thereof. The embodiments may also be combined as appropriate, in which case the combined effects can be obtained. Furthermore, the above embodiment includes various inventions, and various inventions can be extracted by combinations selected from the plurality of disclosed constituent features. For example, even if some constituent features are removed from all the constituent features shown in the embodiment, the configuration from which those constituent features have been removed can be extracted as an invention as long as the problem can be solved and the effects can be obtained.
  DESCRIPTION OF SYMBOLS
  100…Remote control device
  110…Detection unit
  111…Motion information acquisition processing unit
  112…Line-of-sight information acquisition processing unit
  113…Control signal transmission processing unit
  120…Display unit
  130…Communication unit
  200…Controlled device
  210…Communication unit
  220…Prediction unit
  221…Control signal reception processing unit
  222…Controlled object state acquisition processing unit
  223…Line-of-sight information processing unit
  224…Predictive control signal generation processing unit
  225…Predictive control signal transmission processing unit
  230…Operation unit
  231…Predictive control signal reception processing unit
  232…Operation control processing unit
  233…Operation state acquisition processing unit
  234…Operation state information transmission processing unit
  240…Photographing unit
  250…Remote control target
  300…Communication network
  501…Hardware processor
  502…Program memory
  503…Data memory
  504…Communication interface
  505…Input/output interface
  510…Bus
  600…Input device
  700…Output device

Claims (7)

  1.  A remote control system comprising:
     a remote control device operated by an operator; and
     a controlled device remotely operated by the remote control device,
     wherein the remote control device comprises a control signal transmission processing unit that transmits a control signal including first control information and second control information, and
     the controlled device comprises:
      a control signal reception processing unit that receives the control signal;
      a predictive control signal generation processing unit that generates a predictive control signal based on the first control information and the second control information included in the control signal received by the control signal reception processing unit; and
      an operation control processing unit that controls an operation of a remote control target based on the predictive control signal generated by the predictive control signal generation processing unit.
  2.  The remote control system according to claim 1, wherein
     the remote control device comprises:
      a motion information acquisition processing unit that acquires motion information of the operator; and
      a line-of-sight information acquisition processing unit that acquires line-of-sight information of the operator, and
     the control signal transmission processing unit generates the control signal including the motion information acquired by the motion information acquisition processing unit as the first control information and the line-of-sight information acquired by the line-of-sight information acquisition processing unit as the second control information.
  3.  The remote control system according to claim 2, wherein
     the controlled device comprises a line-of-sight information processing unit that preprocesses the line-of-sight information included in the control signal received by the control signal reception processing unit, and
     the predictive control signal generation processing unit generates the predictive control signal based on the motion information included in the control signal received by the control signal reception processing unit and the line-of-sight information preprocessed by the line-of-sight information processing unit.
  4.  The remote control system according to claim 1, wherein
     the controlled device comprises a photographing unit that photographs the remote control target and transmits video information of the remote control target, and
     the remote control device comprises a display unit that receives the video information transmitted from the photographing unit and displays the video.
  5.  The remote control system according to claim 4, wherein
     the controlled device comprises:
      an operation state acquisition processing unit that acquires operation state information of the remote control target; and
      an operation state transmission processing unit that transmits the operation state information acquired by the operation state acquisition processing unit, and
     the display unit of the remote control device receives and displays the operation state information transmitted from the operation state transmission processing unit.
  6.  A remote control method comprising:
     a first step of transmitting a control signal including first control information and second control information from a remote control device to a controlled device;
     a second step of receiving the control signal at the controlled device;
     a third step of generating a predictive control signal based on the first control information and the second control information included in the control signal received in the second step; and
     a fourth step of controlling an operation of a remote control target based on the predictive control signal generated in the third step.
  7.  A remote control program that causes a processor of a computer to execute the functions of the control signal transmission processing unit of the remote control device according to claim 1, or the functions of the control signal reception processing unit, the predictive control signal generation processing unit, and the operation control processing unit of the controlled device according to claim 1.
PCT/JP2022/027596 2022-07-13 2022-07-13 Remote control system, remote control method, and remote control program WO2024013895A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2022/027596 WO2024013895A1 (en) 2022-07-13 2022-07-13 Remote control system, remote control method, and remote control program

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2022/027596 WO2024013895A1 (en) 2022-07-13 2022-07-13 Remote control system, remote control method, and remote control program

Publications (1)

Publication Number Publication Date
WO2024013895A1 true WO2024013895A1 (en) 2024-01-18

Family

ID=89536167

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/027596 WO2024013895A1 (en) 2022-07-13 2022-07-13 Remote control system, remote control method, and remote control program

Country Status (1)

Country Link
WO (1) WO2024013895A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5684531A (en) * 1995-04-10 1997-11-04 The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration Ranging apparatus and method implementing stereo vision system
JP2017196678A (en) * 2016-04-25 2017-11-02 国立大学法人 千葉大学 Robot motion control device
JP2018153874A (en) * 2017-03-15 2018-10-04 株式会社オカムラ Presentation device, presentation method, program and work system
WO2022124398A1 (en) * 2020-12-10 2022-06-16 三菱電機株式会社 Remote control manipulator system and remote control assistance system

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
NEGISHI, KENTA; MATSUMOTO, YOUSUKE; TAKAHASHI, HUMIYASU; NAMIKI, AKIO: "Operation assistance with visual feedback that takes into consideration the operator's intentions in master-slave control", 20TH ROBOTICS SYMPOSIA; KARUIZAWA, JAPAN; MARCH 15-16, 2015, JAPAN SOCIETY OF MECHANICAL ENGINEERS (JSME), vol. 20, 14 March 2015 (2015-03-14) - 16 March 2015 (2015-03-16), pages 512 - 519, XP009552618 *


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22951109

Country of ref document: EP

Kind code of ref document: A1