WO2022254515A1 - Remote manipulation device, remote manipulation program, and non-transitory recording medium - Google Patents

Remote manipulation device, remote manipulation program, and non-transitory recording medium

Info

Publication number
WO2022254515A1
Authority
WO
WIPO (PCT)
Prior art keywords
robot
camera
unit
terminal
image
Application number
PCT/JP2021/020669
Other languages
French (fr)
Japanese (ja)
Inventor
Akihiro Chiba
Original Assignee
Nippon Telegraph and Telephone Corporation
Application filed by Nippon Telegraph and Telephone Corporation
Priority to PCT/JP2021/020669
Publication of WO2022254515A1


Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Definitions

  • the present invention relates to a remote control device, a remote control program, and a non-transitory recording medium.
  • In fields such as lifesaving and construction, the use of remotely operated robots (heavy machinery, etc.) is increasing (see Non-Patent Document 1). In the field of remote work as well, an operator is expected to remotely operate a robot deployed at the workplace.
  • in a remote operation system for remotely operating a robot, data (control information) necessary for operating the robot is communicated between the robot deployed on-site and the terminal device of the operator at a remote location.
  • a plurality of moving images captured by a plurality of cameras mounted on the robot are transmitted from the robot to the operator's terminal device in real time.
  • the display unit of the terminal device displays multiple moving images sent from the robot in real time.
  • the operator operates the operation unit of the terminal device while viewing the plurality of displayed moving images.
  • the terminal device transmits data (hereinafter referred to as "command data") indicating an instruction by the operator who operates the robot to the robot.
  • the robot operates according to the command data.
  • the operator's terminal device is not a dedicated device for remote control.
  • it is desirable that the operator's terminal device be a widely used information processing device (such as a smartphone).
  • when a plurality of moving images are displayed on a small display unit, the visibility of the displayed moving images deteriorates because the moving image display area per camera is small.
  • the operator performs an operation (selection operation) for selecting a camera on the operation unit of the terminal device, whereby command data for selecting a camera is used to select one of the cameras mounted on the robot.
  • the small display unit of the terminal device may display only the moving image captured by the camera selected according to the selection operation. As a result, it is possible to prevent the moving image display area per camera from becoming small in a small display unit, thereby suppressing deterioration in the visibility of the moving image.
  • the operator not only operates the robot, but also operates the operation unit of the terminal device to select the camera. Therefore, when the operator is performing an operation for selecting a camera, the operation of the robot is temporarily interrupted. Thus, there is a problem that the operability of the robot equipped with a plurality of cameras cannot be improved.
  • One aspect of the present invention is a remote control device including: a direction command generation unit that generates direction command data indicating a traveling direction or a turning direction of a robot equipped with a plurality of cameras that capture images in different directions; a transmission unit that transmits the direction command data to the robot; an acquisition unit that acquires an image generated by the camera selected based on the distance between the robot and an object; and a display unit that displays the image generated by the selected camera.
  • One aspect of the present invention is a program for causing a computer to function as the above remote control device.
  • One aspect of the present invention is a computer-readable non-transitory recording medium recording a program for causing a computer to function as the above-described remote control device.
  • FIG. 1 is a diagram showing a configuration example of the remote control system in the first embodiment;
  • FIG. 2 is a diagram showing an appearance example of the terminal device in the first embodiment;
  • FIG. 3 is a top view showing an appearance example of the robot in the first embodiment;
  • FIG. 4 is a side view showing an appearance example of the robot in the first embodiment;
  • FIG. 5 is a diagram showing a first example of the position of the operator with respect to the position of the terminal camera of the terminal device in the first embodiment;
  • FIG. 6 is a diagram showing an example of the position of the image of the operator within the frame of a captured image in the first embodiment;
  • FIG. 7 is a diagram showing an example of correspondence between cameras selected in the left-right direction and conditions in the first embodiment;
  • FIG. 8 is a diagram showing a second example of the position of the operator with respect to the position of the terminal camera of the terminal device in the first embodiment;
  • FIG. 9 is a diagram showing an example of the length of the image of the operator within the frame of a captured image in the first embodiment;
  • FIG. 10 is a diagram showing an example of correspondence between cameras selected in the front-rear direction and conditions in the first embodiment;
  • FIG. 11 is a flowchart showing an operation example of the remote control system in the first embodiment;
  • FIG. 12 is a diagram showing a configuration example of the remote control system in the modification of the first embodiment;
  • FIG. 13 is a flowchart showing an operation example of the remote control system in the modification of the first embodiment;
  • FIG. 14 is a diagram showing a configuration example of the remote control system in the second embodiment;
  • FIG. 15 is a top view showing an appearance example of the robot in the second embodiment;
  • FIG. 16 is a side view showing an appearance example of the robot in the second embodiment;
  • FIG. 17 is a diagram showing an example of correspondence between cameras selected in the left-right direction and the front-rear direction and conditions in the second embodiment;
  • FIG. 18 is a flowchart showing an operation example of the remote control system in the second embodiment;
  • FIG. 19 is a diagram showing a configuration example of the remote control system in the modification of the second embodiment;
  • FIG. 20 is a flowchart showing an operation example of the remote control system in the modification of the second embodiment;
  • FIG. 21 is a diagram showing a configuration example of the remote control system in the third embodiment;
  • FIG. 22 is a diagram showing an example of correspondence between cameras selected in the left-right direction and the front-rear direction and conditions in the third embodiment;
  • FIG. 23 is a flowchart showing an operation example of the remote control system in the third embodiment;
  • FIG. 24 is a diagram showing a configuration example of the remote control system in the modification of the third embodiment;
  • FIG. 25 is a flowchart showing an operation example of the remote control system in the modification of the third embodiment;
  • FIG. 26 is a diagram showing a hardware configuration example of the terminal device (remote control device) and the control device (remote control device) in each embodiment;
  • FIG. 1 is a diagram showing a configuration example of a remote control system 1a.
  • the remote control system 1a is a system for remotely operating an operation target (control target).
  • the remote control system 1a includes a terminal device 2a, a control device 3a, and a robot 4a.
  • the robot 4a is an object to be operated by the remote control system 1a.
  • the terminal device 2a (first remote control device) is, for example, an information processing device such as a smartphone terminal, a tablet terminal, and a laptop computer.
  • the terminal device 2a includes a terminal storage unit 20, an operation unit 21, a terminal camera 22 (first camera), a terminal control unit 23a, a display unit 24, and a terminal communication unit 25.
  • the terminal control unit 23a has a media processing unit 230a, a log data processing unit 231, and a first command generation unit 232a.
  • the terminal device 2a may include a microphone (not shown). This microphone (not shown) may be integrated with the terminal camera 22 .
  • FIG. 2 is a diagram showing an example of the appearance of the terminal device 2a.
  • the terminal camera 22 may be separate from the terminal device 2a.
  • in that case, the terminal camera 22 may communicate with the terminal device 2a wirelessly or by wire.
  • the display unit 24 displays the operation unit 21 (operation key images, etc.) and the display area 240 (image display area).
  • One or more cameras are selected from a plurality of cameras mounted on the robot 4a. When one camera is selected, the image captured by the selected one camera is displayed in the display area 240 .
  • the captured image may be a moving image or a still image.
  • images captured by the selected two or more cameras are displayed in the display area 240, for example, arranged by camera.
  • the operation unit 21 is an operation key image that allows operations such as pressing operations using a touch panel.
  • the operation unit 21 receives an operation for controlling the advancing direction or turning direction of the robot 4a.
  • command data indicating the traveling direction of the robot will be referred to as "direction command data”.
  • the command data may indicate the turning direction of the robot.
  • the operation unit 21-1 "left" is an operation key image for generating direction command data indicating that the traveling direction of the robot 4a is the left direction of the robot 4a.
  • the operation unit 21-2 "forward" is an operation key image for generating direction command data indicating that the traveling direction of the robot 4a is the forward direction of the robot 4a.
  • the operation unit 21-3 "right" is an operation key image for generating direction command data indicating that the traveling direction of the robot 4a is the right direction of the robot 4a.
  • the operation unit 21-4 "rear" is an operation key image for generating direction command data indicating that the traveling direction of the robot 4a is the rearward direction of the robot 4a.
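As a minimal illustration of the key-to-command mapping just described, the following Python sketch turns an operation on one of the key images 21-1 to 21-4 into direction command data. The key identifiers and the dictionary-based data format are assumptions for illustration, not taken from the patent.

```python
# Hypothetical mapping from operation key images to direction command data.
DIRECTION_KEYS = {
    "21-1": "left",      # operation unit 21-1 "left"
    "21-2": "forward",   # operation unit 21-2 "forward"
    "21-3": "right",     # operation unit 21-3 "right"
    "21-4": "backward",  # operation unit 21-4 "rear"
}

def make_direction_command(pressed_key_id: str) -> dict:
    """Generate direction command data for the operated key image."""
    return {"type": "direction", "direction": DIRECTION_KEYS[pressed_key_id]}

# Example: pressing the "forward" key yields
# {'type': 'direction', 'direction': 'forward'}
print(make_direction_command("21-2"))
```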
  • the control device 3a (second remote control device) is, for example, an information processing device such as a smart phone terminal, a tablet terminal, a notebook computer, or a server device.
  • the control device 3a includes a control communication unit 30, a control storage unit 31, and an upper control unit 32a.
  • the robot 4a is a mobile body that is determined to be an object to be operated, and is, for example, a working machine such as a heavy machine.
  • the robot 4a includes a robot communication unit 40, a robot storage unit 41, N cameras 42 (second cameras) (N is an integer equal to or greater than 2), and a lower control unit 43a.
  • FIG. 3 is a top view showing an example of the appearance of the robot 4a.
  • FIG. 4 is a side view showing an example of the appearance of the robot 4a.
  • the "x1" axis and the "y1" axis are the axes that span the horizontal plane.
  • the "y1" axis indicates the forward direction (forward direction).
  • the “z1" axis indicates the vertical direction.
  • the forward direction of the robot 4a is the direction of the "y1" axis.
  • the robot 4a includes, for example, a camera 42-1, a camera 42-2, a camera 42-3, a camera 42-4, and a camera 42-5.
  • the camera 42-1 is a camera that captures the left direction of the robot 4a.
  • the camera 42-2 is a camera that takes an image of the forward direction of the robot 4a.
  • the camera 42-3 is a camera for imaging the right direction of the robot 4a.
  • the camera 42-4 is a camera that zooms in on the forward direction of the robot 4a.
  • the camera 42-4 is provided, for example, on the arm of the robot 4a.
  • the camera 42-5 is a camera that images the rear direction of the robot 4a.
  • the communication line 10 and the communication line 11 may be wireless lines, wired lines, or lines having both a wireless line and a wired line.
  • the terminal device 2a and the control device 3a can communicate data necessary for operation via the communication line 10.
  • the control device 3a and the robot 4a can communicate data necessary for operation via the communication line 11.
  • the data necessary for operation are command data, media data, and log data.
  • Media data is image (video) data.
  • the image may be a moving image or a still image.
  • Media data may include audio data.
  • the media data may include identification information of the camera 42 so that the terminal device 2a or the control device 3a that has acquired the media data transmitted from the robot 4a can select the camera 42 .
  • the media data may include information representing the direction of each camera 42 mounted on the robot 4a.
  • the robot 4a transmits media data including images generated by the camera 42 selected from among the plurality of cameras 42 to the control device 3a.
  • the control device 3a transmits media data including images generated by the camera 42 to the terminal device 2a.
  • Log data is the data of the robot's operation log. Log data may include sensor data.
  • the robot 4a transmits log data to the control device 3a.
  • the control device 3a transmits the log data to the terminal device 2a.
  • a browser operates on the terminal device 2a.
  • the terminal device 2a uses the browser to display on the display unit 24 an image (reconstructed image) corresponding to media data including the image generated by the camera 42.
  • the terminal device 2a may display log data on the display unit 24 using a browser.
  • the terminal device 2a may transmit media data including images generated by the terminal camera 22 to the control device 3a. Further, the terminal device 2a may transmit the result of image recognition processing on media data to the control device 3a.
  • the terminal storage unit 20 stores programs executed by the terminal control unit 23a.
  • the operation unit 21 receives operations according to inputs by the operator.
  • the operation unit 21 and the display unit 24 may be integrated. For example, if the operation unit 21 is a touch panel integrated with the display unit 24, the operation unit 21 receives a pressing operation by the operator.
  • the operation unit 21 outputs a signal corresponding to the received operation to the terminal control unit 23a.
  • the terminal camera 22 (first camera) captures an image in a predetermined direction with an angle of view "v".
  • the predetermined direction is, for example, the normal direction of the screen of the display unit 24 .
  • the terminal camera 22 generates image data including, for example, the face image of the operator 200, by capturing an image from the terminal device 2a in the direction of the operator 200 who operates the terminal device 2a.
  • the media processing unit 230a generates media data including image data generated by the terminal camera 22.
  • the media processing unit 230 a performs image recognition processing on image data generated by the terminal camera 22 .
  • the image recognition process is, for example, a process of recognizing the position and length (size) of the face image of the operator 200 .
  • the media processing unit 230a outputs the result of the image recognition processing to the first command generation unit 232a.
  • the media processing section 230a may perform voice recognition processing.
  • the voice recognition process is, for example, a process of recognizing the contents of the speech of the operator 200 .
  • the utterance content includes, for example, identification information of the selected camera.
  • the media processing unit 230a acquires media data including images generated by the selected camera 42 from the terminal communication unit 25.
  • the media processing unit 230 a executes image processing on media data acquired from the terminal communication unit 25 .
  • Image processing is, for example, processing for reconstructing an image generated by the camera 42 of the robot 4a from media data.
  • the media processing unit 230a displays the reconstructed image on the display unit 24 in real time.
  • the log data processing unit 231 acquires log data generated by the lower control unit 43a from the terminal communication unit 25.
  • the log data is data such as an operation log of the robot 4a.
  • the log data processing unit 231 executes predetermined processing on log data. This predetermined process is, for example, a process of generating an image representing an operation log.
  • the log data processing section 231 displays an image representing the operation log on the display section 24 .
  • the first command generation unit 232a (selection command generation unit) generates selection command data based on the result of image recognition processing. For example, the first command generation unit 232a generates selection command data based on the position and length (size) of the face image of the operator 200 within the frame of the image generated by the terminal camera 22 . The first command generation unit 232a uses the terminal communication unit 25 to transmit the selection command data to the control device 3a.
  • the first command generation unit 232a (direction command generation unit) generates direction command data according to an operation on the operation unit 21.
  • the first command generation unit 232a uses the terminal communication unit 25 to transmit direction command data to the control device 3a.
  • the display unit 24 is a display device such as a liquid crystal display.
  • the display unit 24 displays images generated by the media processing unit 230a. Also, the display unit 24 displays an image generated by the log data processing unit 231 .
  • the image generated by the log data processing unit 231 is, for example, an image representing the action log of the robot 4a.
  • the terminal communication unit 25 uses the communication line 10 to communicate with the control communication unit 30 .
  • the terminal communication unit 25 transmits direction command data according to the operation on the operation unit 21 to the control communication unit 30 .
  • the terminal communication unit 25 transmits selection command data corresponding to the position and length of the face image of the operator 200 within the frame of the image to the control communication unit 30.
  • the terminal communication unit 25 acquires the media data generated by the lower control unit 43a from the control communication unit 30.
  • the terminal communication section 25 outputs the media data generated by the lower control section 43a to the media processing section 230a.
  • the terminal communication unit 25 acquires the log data generated by the lower control unit 43 a from the control communication unit 30 .
  • the terminal communication unit 25 outputs the log data generated by the lower control unit 43 a to the log data processing unit 231 .
  • the control communication unit 30 acquires direction command data from the terminal communication unit 25 .
  • the control communication unit 30 outputs the direction command data acquired from the terminal communication unit 25 to the upper control unit 32a.
  • the control communication unit 30 acquires selection command data from the terminal communication unit 25 .
  • the control communication unit 30 outputs the selection command data acquired from the terminal communication unit 25 to the upper control unit 32a.
  • the control communication unit 30 acquires media data generated by the lower control unit 43a from the robot communication unit 40.
  • the control communication unit 30 transmits the media data generated by the lower control unit 43 a to the terminal communication unit 25 .
  • the control communication unit 30 acquires log data generated by the lower control unit 43a from the robot communication unit 40.
  • the control communication unit 30 transmits the log data generated by the lower control unit 43 a to the terminal communication unit 25 .
  • the control communication unit 30 acquires the direction command data in the format used by the lower control unit 43a from the upper control unit 32a.
  • the control communication unit 30 outputs direction command data in a format used by the lower control unit 43 a to the robot communication unit 40 .
  • the control communication unit 30 acquires the selection command data in the format used by the lower control unit 43a from the upper control unit 32a.
  • the control communication unit 30 outputs to the robot communication unit 40 the selection command data in the format used by the lower control unit 43a.
  • the control storage unit 31 stores programs executed by the upper control unit 32a.
  • based on the direction command data transmitted from the terminal communication unit 25, the upper control unit 32a generates direction command data in a format used by the lower control unit 43a. The upper control unit 32a outputs the direction command data in the format used by the lower control unit 43a to the control communication unit 30.
  • based on the selection command data transmitted from the terminal communication unit 25, the upper control unit 32a generates selection command data in a format used by the lower control unit 43a.
  • the upper control unit 32a outputs the selection command data in the format used by the lower control unit 43a to the control communication unit 30.
  • the robot communication unit 40 communicates with the control communication unit 30 using the communication line 11 .
  • the robot communication unit 40 acquires from the control communication unit 30 direction command data according to the operation on the operation unit 21 .
  • the robot communication unit 40 acquires from the control communication unit 30 the selection command data in a format used by the lower control unit 43a.
  • the robot communication unit 40 transmits media data generated by the lower control unit 43 a to the control communication unit 30 .
  • media data includes images produced by the selected camera 42 .
  • the robot communication unit 40 transmits log data generated by the lower control unit 43 a to the control communication unit 30 .
  • the robot storage unit 41 stores programs executed by the lower control unit 43a.
  • the camera 42 (second camera) captures an image with a predetermined angle of view in a direction determined by the mounted orientation. For example, the camera 42-1 captures the left direction of the robot 4a with a predetermined angle of view. Each camera 42 outputs the captured (generated) image to the lower control unit 43a.
  • the lower control unit 43a controls the operation of each functional unit of the robot 4a.
  • the lower control unit 43a acquires from the robot communication unit 40 the selection command data in the format used by the lower control unit 43a.
  • the lower control unit 43a selects one or more cameras 42 from the plurality of cameras 42 based on the acquired selection command data.
  • the lower control unit 43a generates media data including the image generated by the selected camera 42.
  • the lower control unit 43 a transmits media data including the image generated by the selected camera 42 to the robot communication unit 40 .
  • the lower control unit 43 a outputs log data to the robot communication unit 40 .
  • the camera 42 is selected according to the position of the operator's image (for example, face image) captured by the terminal camera 22 .
  • the camera 42 is selected according to the position of the operator's image within the frame of the image generated by the terminal camera 22 .
  • the camera 42 may be selected according to the length (size) of the image of the operator within the frame of the image generated by the terminal camera 22 .
  • FIG. 5 is a diagram showing a first example of the position of the operator 200 with respect to the position of the terminal camera 22.
  • the “x2” axis and the “y2” axis are the axes that span the horizontal plane.
  • the "y2” axis indicates the front-rear direction (depth direction).
  • the “z2” axis indicates the vertical direction (vertical direction).
  • the face of the operator 200 is located at a horizontal angle "a0" from the end of the angle of view "v" of the terminal camera 22 with respect to the horizontal direction.
  • FIG. 6 is a diagram showing an example of the position of the image of the operator 200 within the frame of the captured image.
  • the frame width (length in the horizontal direction (number of pixels)) of the image generated (captured) by the terminal camera 22 is “w".
  • the frame width "w” is the length corresponding to the angle of view "v" shown in FIG.
  • the face image of the operator 200 is captured at a position separated by a length "d0" in the horizontal direction from the left end (reference position) of the frame. This length “d0” corresponds to the horizontal angle “a0” of the position of the operator 200 .
  • FIG. 7 is a diagram showing an example of correspondence between cameras 42 selected in the horizontal direction and conditions.
  • in FIG. 7, predetermined ranges are defined in the horizontal direction according to at least the number of cameras 42 mounted side by side in the left-right direction (horizontal direction).
  • three cameras 42 (camera 42-1, camera 42-2, and camera 42-3) are mounted on the robot 4a side by side in at least the left-right direction (horizontal direction). Accordingly, in FIG. 7, the frame width "w" is divided into the range from "0" to less than "w/3" (the left side of the frame), the range from "w/3" to less than "2w/3" (the middle of the frame), and the range from "2w/3" to "w" (the right side of the frame).
  • when the horizontal position "d0" of the face image of the operator 200 is in the range from "0" to less than "w/3", the camera 42-1 is selected. That is, when the face image of the operator 200 is captured in the left side of the frame, the camera 42-1 that captures the left direction of the robot 4a is selected.
  • when the horizontal position "d0" is in the range from "w/3" to less than "2w/3", the camera 42-2 is selected. That is, when the face image of the operator 200 is captured in the middle of the frame, the camera 42-2 that captures the forward direction (front direction) of the robot 4a is selected.
  • when the horizontal position "d0" is in the range from "2w/3" to "w", the camera 42-3 is selected. That is, when the face image of the operator 200 is captured in the right side of the frame, the camera 42-3 that captures the right direction of the robot 4a is selected.
  • in this way, simply by moving his or her face relative to the terminal camera 22, the operator 200 can view the image generated by the camera 42-1 that captures the left direction (negative direction of the x1 axis) of the robot 4a, or the image generated by the camera 42-3 that captures the right direction (positive direction of the x1 axis) of the robot 4a.
  • note that the opposite correspondence may be used. When the horizontal position "d0" is in the range from "0" to less than "w/3", the camera 42-3 that captures the right direction of the robot 4a may be selected. When the horizontal position "d0" is in the range from "2w/3" to "w", the camera 42-1 that captures the left direction of the robot 4a may be selected.
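A minimal sketch of the left-right selection rule of FIG. 7, assuming the frame width "w" is split into thirds and "d0" is the horizontal position of the operator's face image. The function and return strings are illustrative, not from the patent.

```python
# Hypothetical selection of a camera 42 from the face position in the frame.
def select_camera_lr(d0: float, w: float) -> str:
    """Select a camera 42 from the face position d0 in a frame of width w."""
    if d0 < w / 3:
        return "camera 42-1"  # captures the left direction of the robot 4a
    if d0 < 2 * w / 3:
        return "camera 42-2"  # captures the forward (front) direction
    return "camera 42-3"      # captures the right direction

print(select_camera_lr(d0=100, w=640))  # left third of the frame -> camera 42-1
```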
  • FIG. 8 is a diagram showing a second example of the position of the operator 200 with respect to the position of the terminal camera 22.
  • FIG. 8 shows the position of the face of the operator 200 at a distance "L1" from the terminal camera 22 and the position of the face at a distance "L2" from the terminal camera 22 in the depth direction. Here, the distance "L1" is a predetermined reference distance and is longer than the distance "L2".
  • FIG. 9 is a diagram showing an example of the length (size) of the image of the operator 200 within the frame of the captured image.
  • the horizontal length of the face image of the operator 200 is "d1" in the frame with the frame width "w”.
  • the length “d1” is a reference length corresponding to the horizontal angle “a1” with respect to the width of the operator's 200 face. Note that the length “d1” may be expressed as the length of one side of a rectangular frame (bounding box) containing the face image of the operator 200 inside.
  • for example, at the timing when the robot 4a is activated, or at any timing, the operator 200 captures his or her own face at the position of the distance "L1" using the terminal camera 22.
  • the horizontal length of the face image of the operator 200 is “d2" in the frame with the frame width "w”.
  • the length “d2” is a length corresponding to the horizontal angle “a2” with respect to the width of the operator's 200 face.
  • FIG. 10 is a diagram showing an example of correspondence between the cameras 42 selected in the front-rear direction and the conditions.
  • two cameras 42, a camera 42-2 and a camera 42-4 are mounted on the robot 4a so as to be aligned at least in the front-rear direction (depth direction).
  • a threshold "R” is predetermined for selection of the cameras 42 arranged in the front-rear direction and mounted on the robot 4a.
  • the threshold "R” is, for example, 1.2.
  • the camera 42-4 is selected when the horizontal length "d2" is greater than the reference length "d1" by 20% or more. That is, when the length ratio "d2/d1" is equal to or greater than the threshold "R", the camera 42-4 that zooms in on the forward direction (front direction) of the robot 4a is selected. On the other hand, when the length ratio "d2/d1" is less than the threshold "R", the camera 42-2 is selected.
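A minimal sketch of the front-rear selection rule of FIG. 10: the ratio of the current face width "d2" to the reference width "d1" is compared with the threshold "R" (1.2 in the example above). Names are illustrative assumptions.

```python
# Hypothetical selection between the forward camera and the zoom camera.
def select_camera_fb(d1: float, d2: float, r: float = 1.2) -> str:
    """Select the zoom camera when the operator's face appears enlarged."""
    if d2 / d1 >= r:
        return "camera 42-4"  # zooms in on the forward direction of the robot
    return "camera 42-2"      # ordinary forward camera

print(select_camera_fb(d1=100, d2=130))  # ratio 1.3 >= 1.2 -> camera 42-4
```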
  • FIG. 11 is a flow chart showing an operation example of the remote control system 1a.
  • the media processing unit 230a acquires an image generated by the terminal camera 22 that captures the operator 200 (step S101).
  • the first command generation unit 232a generates direction command data indicating the traveling direction or turning direction of the robot 4a equipped with a plurality of cameras 42 that capture images in different directions, for example, based on an operation (step S102).
  • the terminal communication unit 25 transmits direction command data to the robot 4a via the control communication unit 30 (step S103).
  • the first command generation unit 232a generates selection command data based on the position and length of the facial image of the operator 200 in the frame of the image generated by the terminal camera 22 (result of image recognition processing) (step S104).
  • the terminal communication unit 25 transmits the selection instruction data to the robot 4a (step S105).
  • the terminal communication unit 25 acquires the image generated by the camera 42 selected based on the position and length of the face image of the operator 200 (step S106).
  • the display unit 24 displays the image generated by the selected camera 42 (step S107).
  • the media processing unit 230a acquires an image generated by the terminal camera 22 (first camera).
  • the first command generation unit 232a (direction command generation unit) generates direction command data indicating the traveling direction or turning direction of the robot 4a on which the multiple cameras 42 are mounted.
  • the terminal communication unit 25 (transmission unit) transmits the direction command data to the robot 4a via the control communication unit 30.
  • the terminal communication unit 25 (acquisition unit) acquires the image generated by the camera 42 (second camera) selected based on the position and length of the face image of the operator 200.
  • the display unit 24 displays images generated by the selected camera 42 .
  • the operator does not need to manually select the camera 42 while operating the robot, so it is possible to improve the operability of the robot equipped with a plurality of cameras 42 . It is possible to improve the visibility of the image generated by the camera 42 even in an environment where a large display cannot be used (an environment where a small display is used).
  • in the first embodiment, the terminal device generated the selection command data according to the image data generated by the terminal camera 22.
  • the modification of the first embodiment differs from the first embodiment in that the control device generates the selection command data.
  • differences from the first embodiment will be mainly described.
  • FIG. 12 is a diagram showing a configuration example of the remote control system 1b.
  • the remote control system 1b is a system for remotely operating an operation target (control target).
  • the remote control system 1b includes a terminal device 2b, a control device 3b, and a robot 4b.
  • the terminal device 2b includes a terminal storage unit 20, an operation unit 21, a terminal camera 22 (first camera), a terminal control unit 23b, a display unit 24, and a terminal communication unit 25.
  • the terminal control unit 23b has a media processing unit 230b, a log data processing unit 231, and a first command generation unit 232b (direction command generation unit).
  • the terminal device 2b may include a microphone (not shown).
  • the control device 3b includes a control communication unit 30, a control storage unit 31, and a higher control unit 32b.
  • the upper controller 32b has a second instruction generator 320b.
  • the robot 4b includes a robot communication unit 40, a robot storage unit 41, N cameras 42 (second cameras), and a lower control unit 43b.
  • instead of outputting the result of the image recognition processing to the first command generation unit 232a, the media processing unit 230b outputs the result of the image recognition processing to the second command generation unit 320b.
  • the media processing section 230 b uses the terminal communication section 25 to transmit the result of the image recognition processing to the control communication section 30 .
  • the second command generation unit 320b (selection command generation unit) generates selection command data based on the result of the image recognition processing. For example, the second command generation unit 320b generates the selection command data based on the position and length of the face image of the operator 200 in the frame of the image generated by the terminal camera 22 (the result of the image recognition processing). The second command generation unit 320b uses the control communication unit 30 to transmit the selection command data to the robot 4b.
  • FIG. 13 is a flow chart showing an operation example of the remote control system 1b.
  • the media processing unit 230b acquires an image generated by the terminal camera 22 that captures the operator 200 (step S201).
  • the first command generation unit 232b generates direction command data indicating the traveling direction or turning direction of the robot 4b equipped with a plurality of cameras 42 that capture images in different directions, for example, based on an operation (step S202).
  • the terminal communication unit 25 transmits the direction command data to the robot 4b via the control communication unit 30 (step S203). The terminal communication unit 25 also transmits the result of the image recognition processing to the control device 3b (step S204).
  • the second command generation unit 320b generates selection command data based on the position of the image of the operator 200 in the frame of the image generated by the terminal camera 22 (step S205).
  • the control communication unit 30 transmits the selection command data to the robot 4b (step S206).
  • the terminal communication unit 25 acquires an image generated by the camera 42 selected based on the position and length of the facial image of the operator 200 (step S207).
  • the display unit 24 displays the image generated by the selected camera 42 (step S208).
  • the media processing unit 230b acquires an image generated by the terminal camera 22 (first camera) that captures the operator 200.
  • the first command generation unit 232b (direction command generation unit) generates direction command data indicating the traveling direction or turning direction of the robot 4b equipped with a plurality of cameras 42 that capture images in different directions.
  • the terminal communication unit 25 (transmitting unit) transmits the direction command data to the robot 4b via the control communication unit 30.
  • the terminal communication unit 25 transmits information such as the position of the image of the operator 200 to the control device 3b.
  • the terminal communication unit 25 acquires an image generated by the camera 42 (second camera) selected based on the position of the face image of the operator 200 or the like.
  • the display unit 24 displays images generated by the selected camera 42 .
  • the operator does not need to manually select the camera 42 while operating the robot, so it is possible to improve the operability of the robot equipped with a plurality of cameras 42 . It is possible to improve the visibility of the image produced by the camera 42 even in environments where a large display cannot be used.
  • the difference between the second embodiment and the first embodiment is that the camera 42 is selected according to the distance between the robot and objects around the robot.
  • the second embodiment will be described focusing on the differences from the first embodiment.
  • FIG. 14 is a diagram showing a configuration example of the remote control system 1c.
  • the remote control system 1c is a system for remotely operating an operation target (control target).
  • the remote control system 1c includes a terminal device 2c, a control device 3c, and a robot 4c.
  • the terminal device 2c (first remote control device) includes a terminal storage unit 20, an operation unit 21, a terminal camera 22 (first camera), a terminal control unit 23c, a display unit 24, and a terminal communication unit 25.
  • the terminal control unit 23c has a media processing unit 230c, a log data processing unit 231, and a first instruction generation unit 232c.
  • the terminal device 2c may include a microphone (not shown).
  • the control device 3c (second remote control device) includes a control communication unit 30, a control storage unit 31, and a higher control unit 32c.
  • the robot 4c includes a robot communication unit 40, a robot storage unit 41, N cameras 42 (second cameras), a lower control unit 43c, and M sensors 44 (M is an integer equal to or greater than 2).
  • the sensor 44 is a sensor that measures the distance between the sensor 44 and an object (not shown). This object is, for example, a wall, although it is not limited to a specific object. Sensor 44 measures the distance between sensor 44 and objects around sensor 44 .
  • when the sensor 44 is an ultrasonic sensor, the sensor 44 measures the distance between the sensor 44 and the object based on the round-trip time of an ultrasonic wave between the sensor 44 and the object.
  • when the sensor 44 is an electromagnetic-wave radar, the sensor 44 measures the distance between the sensor 44 and the object based on the round-trip time of an electromagnetic wave between the sensor 44 and the object.
  • the sensor 44 outputs the measurement result of the distance between the sensor 44 and the object to the lower controller 43c.
  • the lower control unit 43c acquires from each sensor 44 the measurement result of the distance between the sensor 44 and the object.
  • the lower control unit 43c (selection command generation unit) selects one or more cameras 42 from the plurality of cameras 42 based on the measurement results of the distance between each sensor 44 and the object. That is, the lower control unit 43c (selection command generation unit) generates selection command data based on the measurement results of the distance between the sensor 44 and the object.
  • the lower control unit 43c generates media data including the image generated by the selected camera 42.
  • the lower control unit 43 c transmits media data including the image generated by the selected camera 42 to the robot communication unit 40 .
  • the lower control unit 43c outputs the log data to the robot communication unit 40. The log data may include the distance data measured by the sensors 44.
  • FIG. 15 is a top view showing an appearance example of the robot 4c.
  • FIG. 16 is a side view showing an appearance example of the robot 4c. In FIGS. 15 and 16, the forward direction of the robot 4c is the direction of the "y1" axis.
  • the robot 4c includes, for example, a camera 42-1, a camera 42-2, a camera 42-3, a camera 42-4, and a camera 42-5.
  • the robot 4c is equipped with a sensor 44-1 on the left side of the robot 4c. Thereby, the sensor 44-1 measures the distance between the object approaching the robot 4c from the left direction and the robot 4c.
  • the robot 4c has a sensor 44-2 in front of the robot 4c. Thereby, the sensor 44-2 measures the distance between the object approaching the robot 4c from the front and the robot 4c.
  • the robot 4c has a sensor 44-3 on the right side of the robot 4c. Thereby, the sensor 44-3 measures the distance between an object approaching the robot 4c from the right direction and the robot 4c.
  • the robot 4c has a sensor 44-4 on the back of the robot 4c. Thereby, the sensor 44-4 measures the distance between the object approaching the robot 4c from behind and the robot 4c.
  • FIG. 17 is a diagram showing an example of correspondence between the cameras 42 selected in the left-right direction and the front-rear direction and the conditions.
  • a distance "L0" is predetermined as a distance (threshold value) at which at least one of the object approaching the robot 4c and the robot 4c may be damaged or hindered.
  • if the distance between the sensor 44-1 and the object is less than or equal to the distance "L0", the lower control unit 43c selects the camera 42-1 from among the plurality of cameras 42. If the distance "L4" between the sensor 44-3 and the object is less than or equal to the distance "L0", the lower control unit 43c selects the camera 42-3 from among the plurality of cameras 42. If the distance "L5" between the sensor 44-4 and the object is less than or equal to the distance "L0", the lower control unit 43c selects the camera 42-5 from among the plurality of cameras 42.
  • if the distance between the sensor 44-2 and the object is less than or equal to the distance "L0", the lower control unit 43c selects the camera 42-2 from among the plurality of cameras 42. Except for these cases, the lower control unit 43c selects the camera 42-4 from among the plurality of cameras 42. Note that these selections are examples.
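A minimal sketch of the distance-based selection rule of FIG. 17. Each sensor 44 is paired with the camera 42 facing the same direction, and a camera is selected when its sensor reports a distance at or below the threshold "L0". The pairing follows the example above; the tie-break when several sensors report a near object is an assumption.

```python
# Hypothetical sensor-to-camera pairing for the robot 4c.
SENSOR_TO_CAMERA = {
    "sensor 44-1": "camera 42-1",  # left
    "sensor 44-2": "camera 42-2",  # front
    "sensor 44-3": "camera 42-3",  # right
    "sensor 44-4": "camera 42-5",  # rear
}

def select_camera_by_distance(distances: dict[str, float], l0: float) -> str:
    """distances maps each sensor 44 to its measured distance to an object."""
    for sensor_id, distance in distances.items():
        if distance <= l0:
            return SENSOR_TO_CAMERA[sensor_id]  # first near object wins here
    return "camera 42-4"  # no object nearby: zoomed-in forward view

readings = {"sensor 44-1": 5.0, "sensor 44-2": 9.0,
            "sensor 44-3": 0.4, "sensor 44-4": 7.0}
print(select_camera_by_distance(readings, l0=0.5))  # -> camera 42-3
```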
  • FIG. 18 is a flow chart showing an operation example of the remote control system 1c.
  • the first command generation unit 232c generates direction command data indicating the traveling direction or turning direction of the robot 4c equipped with a plurality of cameras 42 that capture images in different directions, based on an operation, for example (step S301).
  • the terminal communication unit 25 transmits direction command data to the robot 4c (lower control unit 43c) via the control communication unit 30 (step S302).
  • based on the measurement result of each sensor 44, the lower control unit 43c selects the camera 42 that captures the direction of an object (not shown) whose distance from the robot 4c is equal to or less than the threshold (step S303).
  • the terminal communication unit 25 acquires the image generated by the selected camera 42. That is, the terminal communication unit 25 acquires the image generated by the camera 42 that captures the direction of the object whose distance from the robot 4c is equal to or less than the threshold (step S304).
  • the display unit 24 displays the image generated by the selected camera 42 (step S305).
  • the first command generation unit 232c (direction command generation unit) generates direction command data indicating the traveling direction or turning direction of the robot 4c equipped with a plurality of cameras 42 that capture images in different directions.
  • the terminal communication unit 25 transmits direction command data to the robot 4c via the control communication unit 30.
  • the terminal communication unit 25 acquires an image generated by the selected camera 42 .
  • the terminal communication unit 25 acquires an image generated by the camera 42 that captures the direction of the object when the distance between the robot 4c and the object is equal to or less than a threshold.
  • the display unit 24 displays images generated by the selected camera 42 .
  • for example, when another moving body approaches the moving robot 4c, the robot 4c can avoid the other moving body while the operator watches the image captured by the selected camera 42.
  • when the moving robot 4c approaches a wall or the like, the robot 4c can likewise avoid the wall while the operator watches the image captured by the selected camera 42.
  • in the second embodiment, the robot selected the camera according to the distance between the object and the robot.
  • the modification of the second embodiment differs from the second embodiment in that the control device selects the camera according to the distance between the object and the robot.
  • differences from the second embodiment will be mainly described.
  • FIG. 19 is a diagram showing a configuration example of the remote control system 1d.
  • the remote control system 1d is a system for remotely operating an operation target (control target).
  • the remote control system 1d includes a terminal device 2d, a control device 3d, and a robot 4d.
  • the terminal device 2d includes a terminal storage unit 20, an operation unit 21, a terminal camera 22 (first camera), a terminal control unit 23d, a display unit 24, and a terminal communication unit 25.
  • the terminal control unit 23d has a media processing unit 230b, a log data processing unit 231, and a first command generation unit 232d.
  • the terminal device 2d may include a microphone (not shown).
  • the control device 3d includes a control communication unit 30, a control storage unit 31, and a higher control unit 32d.
  • the upper controller 32d has a second instruction generator 320d.
  • the robot 4d includes a robot communication unit 40, a robot storage unit 41, N cameras 42 (second cameras), a lower control unit 43d, and M sensors 44.
  • the lower control unit 43d acquires from each sensor 44 the measurement result of the distance between the sensor 44 and the object.
  • the lower controller 43 d transmits the measurement result of the distance between the sensor 44 and the object to the robot communication unit 40 .
  • the lower control unit 43d acquires from the robot communication unit 40 the selection command data in the format used by the lower control unit 43d.
  • the lower controller 43d selects one or more cameras 42 from among the plurality of cameras 42 based on the selection instruction data in the format used by the lower controller 43d.
  • the lower control unit 43d generates media data including the image generated by the selected camera 42.
  • the lower control unit 43 d transmits media data including the image generated by the selected camera 42 to the robot communication unit 40 .
  • the lower control unit 43d outputs the log data to the robot communication unit 40.
  • the robot communication unit 40 acquires the measurement result of the distance between the sensor 44 and the object for each sensor 44 from the lower control unit 43d.
  • the robot communication unit 40 transmits the measurement result of the distance between the sensor 44 and the object to the control communication unit 30 .
  • the robot communication unit 40 transmits the media data generated by the lower control unit 43d to the control communication unit 30.
  • media data includes images produced by the selected camera 42 .
  • the robot communication unit 40 transmits log data generated by the lower control unit 43 d to the control communication unit 30 .
  • the control communication unit 30 transmits the media data generated by the lower control unit 43d to the terminal communication unit 25.
  • the control communication unit 30 transmits log data to the terminal communication unit 25 .
  • the control communication unit 30 outputs the measurement result of the distance between the sensor 44 and the object for each sensor 44 to the second command generation unit 320d.
  • the control communication unit 30 acquires the selection command data in the format used by the lower control unit 43d from the upper control unit 32d.
  • the control communication unit 30 outputs to the robot communication unit 40 the selection command data in the format used by the lower control unit 43d.
  • the second command generation unit 320d acquires the measurement result of the distance between the sensor 44 and the object for each sensor 44 from the control communication unit 30.
  • the second command generator 320d selects one or more cameras 42 from the plurality of cameras 42 based on the measurement result of the distance between the sensor 44 and the object.
  • the second command generation unit 320d outputs the selection command data to the control communication unit 30.
  • FIG. 20 is a flow chart showing an operation example of the remote control system 1d.
  • the first command generation unit 232d generates direction command data indicating the traveling direction or turning direction of the robot 4d equipped with a plurality of cameras 42 that capture images in different directions, for example, based on an operation (step S401).
  • the terminal communication unit 25 transmits direction command data to the robot 4d (lower control unit 43d) via the control communication unit 30 (step S402).
  • the second command generation unit 320d acquires the measurement result of the distance between the sensor 44 and the object for each sensor 44 from the control communication unit 30 (step S403).
  • the second command generation unit 320d selects the camera 42 that captures the direction of the object when the distance between the robot 4d and the object is equal to or less than the threshold (step S404).
  • the second command generation unit 320d transmits selection command data indicating the selected camera 42 to the robot 4d (lower control unit 43d) (step S405).
  • the terminal communication unit 25 acquires the image generated by the selected camera 42. For example, the terminal communication unit 25 acquires an image generated by the camera 42 that captures the direction of the object when the distance between the robot 4d and the object is equal to or less than a threshold (step S406).
  • the display unit 24 displays the image generated by the selected camera 42 (step S407).
  • the first command generation unit 232d (direction command generation unit) generates direction command data indicating the traveling direction or turning direction of the robot 4d equipped with a plurality of cameras 42 .
  • the terminal communication unit 25 transmits direction command data to the robot 4d via the control communication unit 30.
  • the terminal communication unit 25 acquires an image generated by the selected camera 42 . That is, the terminal communication unit 25 acquires an image generated by the camera 42 that captures the direction of the object in which the distance between the robot 4d and the object is equal to or less than the threshold.
  • the display unit 24 displays images generated by the selected camera 42 .
  • the difference between the third embodiment and the first and second embodiments is that the camera 42 is selected according to command data relating to the traveling direction or turning direction of the robot.
  • the third embodiment will be described focusing on the differences from the first and second embodiments.
  • FIG. 21 is a diagram showing a configuration example of the remote control system 1e.
  • the remote control system 1e is a system for remotely operating an operation target (control target).
  • the remote control system 1e includes a terminal device 2e, a control device 3e, and a robot 4e.
  • a terminal device 2e (first remote control device) includes a terminal storage unit 20, an operation unit 21, a terminal camera 22 (first camera), a terminal control unit 23e, a display unit 24, and a terminal communication unit 25.
  • the terminal control unit 23e has a media processing unit 230e, a log data processing unit 231, and a first instruction generation unit 232e.
  • the terminal device 2e may include a microphone (not shown).
  • the control device 3e (second remote control device) includes a control communication unit 30, a control storage unit 31, and a higher control unit 32e.
  • the robot 4e includes a robot communication unit 40, a robot storage unit 41, N cameras 42 (second cameras), and a lower control unit 43e.
  • the first command generation unit 232e acquires from the operation unit 21 a signal corresponding to the operation received by the operation unit 21.
  • the first command generation unit 232e generates direction command data according to an operation on the operation unit 21. Also, the first command generation unit 232e generates selection command data according to an operation on the operation unit 21.
  • for example, the first command generation unit 232e generates the selection command data so that the selected camera 42 captures the real space in the traveling direction (turning direction) indicated by the direction command data.
  • FIG. 22 is a diagram showing an example of correspondence between the cameras 42 selected in the left-right direction and the front-rear direction and the conditions.
  • when the traveling direction indicated by the direction command data is the left direction, the first command generation unit 232e selects the camera 42-1 from among the plurality of cameras 42.
  • when the traveling direction indicated by the direction command data is the right direction, the first command generation unit 232e selects the camera 42-3 from among the plurality of cameras 42.
  • when the traveling direction indicated by the direction command data is the rearward direction, the first command generation unit 232e selects the camera 42-5 from among the plurality of cameras 42. Except for these cases, the first command generation unit 232e selects the camera 42-2 from among the plurality of cameras 42. The first command generation unit 232e may instead select the camera 42-4 from among the plurality of cameras 42.
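A minimal sketch of the direction-based selection rule of FIG. 22: the camera that captures the commanded traveling direction is selected, with the forward camera 42-2 as the default (the zoom camera 42-4 could be used instead). The direction labels are illustrative assumptions.

```python
# Hypothetical mapping from the commanded traveling direction to a camera 42.
DIRECTION_TO_CAMERA = {
    "left": "camera 42-1",
    "right": "camera 42-3",
    "backward": "camera 42-5",
}

def select_camera_by_direction(direction: str) -> str:
    """Select the camera 42 facing the commanded traveling direction."""
    return DIRECTION_TO_CAMERA.get(direction, "camera 42-2")

print(select_camera_by_direction("backward"))  # -> camera 42-5
```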
  • FIG. 23 is a flow chart showing an operation example of the remote control system 1e.
  • the first command generation unit 232e acquires from the operation unit 21 a signal corresponding to an operation for controlling the traveling direction or turning direction of the robot 4e (step S501).
  • the first command generation unit 232e generates direction command data indicating the traveling direction or turning direction of the robot 4e equipped with a plurality of cameras 42 that capture images in different directions, based on the operation for controlling the traveling direction or turning direction of the robot 4e (step S502).
  • the terminal communication unit 25 transmits direction command data to the robot 4e via the control communication unit 30 (step S503).
  • the first command generation unit 232e (selection command generation unit) generates selection command data according to the direction command data (step S504).
  • the terminal communication unit 25 transmits the selection command data to the robot 4e (step S505).
  • the terminal communication unit 25 acquires an image generated by the camera 42 selected based on the operation for controlling the traveling direction or turning direction (step S506).
  • the display unit 24 displays the image generated by the selected camera 42 (step S507).
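Steps S501 to S507 amount to a short terminal-side sequence: derive both commands from one operation, send them, and display the returned image. A hedged sketch of that flow, reusing select_camera from the sketch above and assuming simple callable transports and a dict message format (none of these interfaces are specified by the patent):

```python
def handle_direction_operation(direction, send_to_robot, receive_image, show):
    """One pass of the FIG. 23 flow on the terminal device 2e side."""
    # S501-S502: the operation on the operation unit 21 yields direction command data.
    direction_command = {"type": "direction", "value": direction}
    # S503: transmit the direction command data (relayed via the control device 3e).
    send_to_robot(direction_command)
    # S504: derive selection command data from the same operation.
    selection_command = {"type": "select", "camera": select_camera(direction)}
    # S505: transmit the selection command data.
    send_to_robot(selection_command)
    # S506: acquire an image generated by the selected camera 42.
    frame = receive_image()
    # S507: the display unit 24 displays the image.
    show(frame)
```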
  • as described above, the first command generation unit 232e (direction command generation unit) generates the direction command data, which indicates the traveling direction or turning direction of the robot 4e equipped with a plurality of cameras 42, based on the operation for controlling that traveling direction or turning direction.
  • the terminal communication unit 25 transmits direction command data to the robot 4e via the control communication unit 30.
  • the terminal communication unit 25 acquires an image generated by the camera 42 selected based on the operation for controlling the traveling direction or turning direction.
  • the display unit 24 displays the images generated by the selected camera 42.
  • the operator does not need to select a camera 42 manually while operating the robot, so the operability of a robot equipped with a plurality of cameras 42 can be improved. The visibility of the image generated by the camera 42 can also be improved even in environments where a large display cannot be used.
  • simply by operating the operation unit 21-2 so that the robot 4e advances forward, the operator sees the image captured by the camera 42-2, which images the forward direction. Simply by operating the operation unit 21-4 so that the robot 4e moves backward, the operator sees the image captured by the camera 42-5, which images the backward direction. In this way, the operator can operate the robot while constantly checking on screen the real-space situation in the robot's traveling direction.
  • in the third embodiment, the terminal device generated the selection command data according to the traveling direction of the robot.
  • the modification of the third embodiment differs from the third embodiment in that the control device generates the selection command data.
  • the following description focuses on the differences from the third embodiment.
  • FIG. 24 is a diagram showing a configuration example of the remote control system 1f.
  • the remote control system 1f is a system for remotely operating an operation target (control target).
  • the remote control system 1f includes a terminal device 2f, a control device 3f, and a robot 4f.
  • the terminal device 2f includes a terminal storage unit 20, an operation unit 21, a terminal camera 22 (first camera), a terminal control unit 23f, a display unit 24, and a terminal communication unit 25.
  • the terminal control unit 23f has a media processing unit 230f, a log data processing unit 231, and a first command generation unit 232f.
  • the terminal device 2f may include a microphone (not shown).
  • the control device 3f includes a control communication unit 30, a control storage unit 31, and an upper control unit 32f.
  • the upper control unit 32f has a second command generation unit 320f.
  • the robot 4f includes a robot communication unit 40, a robot storage unit 41, N cameras 42 (second cameras), and a lower control unit 43f.
  • the second command generation unit 320f acquires direction command data from the control communication unit 30.
  • the second command generator 320f generates selection command data according to the direction command data.
  • the second command generation unit 320f generates the selection command data so that a camera 42 images the real space in the traveling direction indicated by the direction command data.
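In this modification the same direction-to-camera mapping runs on the control device rather than on the terminal. A minimal sketch under the same assumptions as the earlier snippets (dict messages, illustrative names; the pass-through stands in for the conversion into the lower control unit's format):

```python
def on_direction_command(direction_command, send_to_robot):
    """Control-device-side handling: forward the direction command and derive
    the selection command, as the second command generation unit 320f does."""
    send_to_robot(direction_command)  # format conversion omitted for brevity
    selection_command = {"type": "select",
                         "camera": select_camera(direction_command["value"])}
    send_to_robot(selection_command)
```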
  • FIG. 25 is a flow chart showing an operation example of the remote control system 1f.
  • the first command generation unit 232f acquires a signal corresponding to an operation for controlling the traveling direction or turning direction of the robot 4f from the operation unit 21 (step S601).
  • the first command generation unit 232f generates the direction command data, which indicates the traveling direction or turning direction of the robot 4f equipped with a plurality of cameras 42 that image different directions, based on the operation for controlling the traveling direction or turning direction of the robot 4f (step S602).
  • the terminal communication unit 25 transmits direction command data to the robot 4f via the control communication unit 30 (step S603).
  • the second command generation unit 320f (selection command generation unit) generates selection command data according to the direction command data (step S604).
  • the control communication unit 30 transmits the selection command data to the robot 4f (step S605).
  • the terminal communication unit 25 acquires an image generated by the camera 42 selected based on the operation for controlling the traveling direction or turning direction (step S606).
  • the display unit 24 displays the image generated by the selected camera 42 (step S607).
  • as described above, the first command generation unit 232f (direction command generation unit) generates the direction command data, which indicates the traveling direction or turning direction of the robot 4f equipped with a plurality of cameras 42, based on the operation for controlling that traveling direction or turning direction.
  • the terminal communication unit 25 transmits direction command data to the robot 4f via the control communication unit 30.
  • the terminal communication unit 25 acquires an image generated by the camera 42 selected based on the operation for controlling the traveling direction or turning direction.
  • the display unit 24 displays the images generated by the selected camera 42.
  • the operator does not need to select a camera 42 manually while operating the robot, so the operability of a robot equipped with a plurality of cameras 42 can be improved. The visibility of the image generated by the camera 42 can also be improved even in environments where a large display cannot be used.
  • the device of the present invention can also be realized by a computer and a program, and the program can be recorded on a non-transitory recording medium or provided through a network.
  • FIG. 26 is a diagram showing a hardware configuration example of the remote control device 100 (terminal device and control device) in each embodiment.
  • the remote control device 100 corresponds to at least one of a terminal device and a control device in each embodiment.
  • Some or all of the functional units of the remote control device 100 are implemented as software by a processor 101 such as a CPU (Central Processing Unit) executing a program stored in a storage device 102 having a non-volatile recording medium (non-transitory recording medium) and in a memory 103.
  • the program may be recorded on a computer-readable non-transitory recording medium.
  • a computer-readable non-transitory recording medium is, for example, a portable medium such as a flexible disk, a magneto-optical disk, a ROM (Read Only Memory), or a CD-ROM (Compact Disc Read Only Memory), or a storage device such as a hard disk built into a computer system. The communication unit 104 executes predetermined communication processing. The communication unit 104 may acquire data and programs.
  • some or all of the functional units of the remote control device 100 may be realized using hardware such as an LSI (Large Scale Integrated circuit), an ASIC (Application Specific Integrated Circuit), a PLD (Programmable Logic Device), or an FPGA (Field Programmable Gate Array).
  • Each embodiment may be combined. Also, the terminal device and the control device may be integrated or separated.
  • the present invention can be applied to an information processing device (remote control device) that remotely controls an operation target such as a robot.
  • Reference signs: … Terminal control unit; 24 Display unit; 25 Terminal communication unit; 30 Control communication unit; 31 Control storage unit; 32a Upper control unit; 40 Robot communication unit; 41 Robot storage unit; 42 Camera; … Sensor; 100 Remote control device; 101 Processor; 102 Storage device; 103 Memory; 104 Communication unit; 200 Operator; 230a, 230b, 230c, 230d, 230e, 230f Media processing unit; 231 Log data processing unit; … First command generation unit; 240 Display area; 320b, 320d, 320f Second command generation unit


Abstract

A remote manipulation device according to the present invention is provided with: a direction-instruction generation unit that generates direction instruction data indicating an advancing direction or a pivot direction of a robot having mounted thereon a plurality of cameras for capturing images in different directions; a transmission unit that transmits the direction instruction data to the robot; an acquisition unit that acquires an image generated by a camera selected on the basis of the distance between the robot and an object; and a display unit that displays the image generated by the selected camera. The acquisition unit acquires an image generated by a camera that captures an image in the direction of an object with which the distance has become less than or equal to a threshold, and the display unit displays the image acquired by the acquisition unit.

Description

The small display unit of the terminal device may display only the moving image captured by the camera selected in response to the selection operation. This keeps the moving-image display area per camera from shrinking on a small display unit, thereby suppressing the deterioration in the visibility of the moving image.
However, the operator not only operates the robot but also performs an operation on the operation unit of the terminal device to select a camera. While the operator is performing the camera-selection operation, the operation of the robot is temporarily interrupted. In this respect, the operability of a robot equipped with a plurality of cameras cannot be improved.
In view of the above circumstances, it is an object of the present invention to provide a remote control device, a remote control program, and a non-transitory recording medium capable of improving the operability of a robot equipped with a plurality of cameras.
One aspect of the present invention is a remote control device including: a direction command generation unit that generates direction command data indicating a traveling direction or a turning direction of a robot equipped with a plurality of cameras that image different directions; a transmission unit that transmits the direction command data to the robot; an acquisition unit that acquires an image generated by the camera selected based on the distance between the robot and an object; and a display unit that displays the image generated by the selected camera.
One aspect of the present invention is a program for causing a computer to function as the above remote control device.
One aspect of the present invention is a computer-readable non-transitory recording medium recording a program for causing a computer to function as the above remote control device.
According to the present invention, it is possible to improve the operability of a robot equipped with a plurality of cameras.
FIG. 1 is a diagram showing a configuration example of the remote control system in the first embodiment.
FIG. 2 is a diagram showing an example of the appearance of the terminal device in the first embodiment.
FIG. 3 is a top view showing an example of the appearance of the robot in the first embodiment.
FIG. 4 is a side view showing an example of the appearance of the robot in the first embodiment.
FIG. 5 is a diagram showing a first example of the position of the operator with respect to the position of the terminal camera of the terminal device in the first embodiment.
FIG. 6 is a diagram showing an example of the position of the image of the operator within a frame of a captured image in the first embodiment.
FIG. 7 is a diagram showing an example of the correspondence between conditions and the cameras selected in the left-right direction in the first embodiment.
FIG. 8 is a diagram showing a second example of the position of the operator with respect to the position of the terminal camera of the terminal device in the first embodiment.
FIG. 9 is a diagram showing an example of the length of the image of the operator within a frame of a captured image in the first embodiment.
FIG. 10 is a diagram showing an example of the correspondence between conditions and the cameras selected in the front-rear direction in the first embodiment.
FIG. 11 is a flowchart showing an operation example of the remote control system in the first embodiment.
FIG. 12 is a diagram showing a configuration example of the remote control system in a modification of the first embodiment.
FIG. 13 is a flowchart showing an operation example of the remote control system in the modification of the first embodiment.
FIG. 14 is a diagram showing a configuration example of the remote control system in the second embodiment.
FIG. 15 is a top view showing an example of the appearance of the robot in the second embodiment.
FIG. 16 is a side view showing an example of the appearance of the robot in the second embodiment.
FIG. 17 is a diagram showing an example of the correspondence between conditions and the cameras selected in the left-right and front-rear directions in the second embodiment.
FIG. 18 is a flowchart showing an operation example of the remote control system in the second embodiment.
FIG. 19 is a diagram showing a configuration example of the remote control system in a modification of the second embodiment.
FIG. 20 is a flowchart showing an operation example of the remote control system in the modification of the second embodiment.
FIG. 21 is a diagram showing a configuration example of the remote control system in the third embodiment.
FIG. 22 is a diagram showing an example of the correspondence between conditions and the cameras selected in the left-right and front-rear directions in the third embodiment.
FIG. 23 is a flowchart showing an operation example of the remote control system in the third embodiment.
FIG. 24 is a diagram showing a configuration example of the remote control system in a modification of the third embodiment.
FIG. 25 is a flowchart showing an operation example of the remote control system in the modification of the third embodiment.
FIG. 26 is a diagram showing a hardware configuration example of the terminal device (remote control device) and the control device (remote control device) in each embodiment.
Embodiments of the present invention will be described in detail with reference to the drawings.
(First embodiment)
FIG. 1 is a diagram showing a configuration example of the remote control system 1a. The remote control system 1a is a system for remotely operating an operation target (control target). The remote control system 1a includes a terminal device 2a, a control device 3a, and a robot 4a. The robot 4a is the operation target of the remote control system 1a.
The terminal device 2a (first remote control device) is, for example, an information processing device such as a smartphone, a tablet, or a laptop computer. The terminal device 2a includes a terminal storage unit 20, an operation unit 21, a terminal camera 22 (first camera), a terminal control unit 23a, a display unit 24, and a terminal communication unit 25. The terminal control unit 23a has a media processing unit 230a, a log data processing unit 231, and a first command generation unit 232a. The terminal device 2a may include a microphone (not shown), which may be integrated with the terminal camera 22.
FIG. 2 is a diagram showing an example of the appearance of the terminal device 2a. The terminal camera 22 may be separate from the terminal device 2a; in that case, the terminal camera 22 may communicate with the terminal device 2a wirelessly or by wire.
On the display unit 24 (screen), the operation unit 21 (operation key images and the like) and the display area 240 (image display area) are displayed, for example using a browser. One or more cameras are selected from the plurality of cameras mounted on the robot 4a. When one camera is selected, the image captured by that camera is displayed in the display area 240. The captured image may be a moving image or a still image. When two or more cameras are selected, the images captured by the selected cameras are displayed in the display area 240, for example arranged side by side per camera.
The operation unit 21 consists of operation key images that accept operations such as pressing via a touch panel. The operation unit 21 receives operations for controlling the traveling direction or turning direction of the robot 4a. Hereinafter, command data indicating the traveling direction of the robot is referred to as "direction command data". The command data may also indicate the turning direction of the robot.
The operation unit 21-1 "left" is an operation key image for generating direction command data indicating that the traveling direction of the robot 4a is the left direction of the robot 4a. The operation unit 21-2 "forward" is an operation key image for generating direction command data indicating that the traveling direction is the forward direction. The operation unit 21-3 "right" is an operation key image for generating direction command data indicating that the traveling direction is the right direction. The operation unit 21-4 "backward" is an operation key image for generating direction command data indicating that the traveling direction is the backward direction.
Returning to FIG. 1, the description of the configuration example of the remote control system 1a is continued. The control device 3a (second remote control device) is, for example, an information processing device such as a smartphone, a tablet, a laptop computer, or a server device. The control device 3a includes a control communication unit 30, a control storage unit 31, and an upper control unit 32a.
The robot 4a is a mobile body defined as the operation target, for example a working machine such as heavy machinery. The robot 4a includes a robot communication unit 40, a robot storage unit 41, N cameras 42 (second cameras, where N is an integer of 2 or more), and a lower control unit 43a.
FIG. 3 is a top view showing an example of the appearance of the robot 4a. FIG. 4 is a side view showing an example of the appearance of the robot 4a. In the following, the "x1" axis and the "y1" axis span the horizontal plane, the "y1" axis indicates the forward (advancing) direction, and the "z1" axis indicates the vertical direction. In FIG. 3 and FIG. 4, the forward direction of the robot 4a is the direction of the "y1" axis. As an example, the robot 4a includes a camera 42-1, a camera 42-2, a camera 42-3, a camera 42-4, and a camera 42-5.
The camera 42-1 images the left direction of the robot 4a. The camera 42-2 images the forward direction of the robot 4a. The camera 42-3 images the right direction of the robot 4a. The camera 42-4 zoom-images the forward direction of the robot 4a and is mounted, for example, on the arm of the robot 4a. The camera 42-5 images the backward direction of the robot 4a.
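For later reference, the camera arrangement can be summarized as a small configuration table. This is purely an illustrative restatement of FIG. 3 and FIG. 4; the dictionary keys and field names are assumptions:

```python
# Orientation of each camera 42 on the robot 4a (FIG. 3 / FIG. 4).
CAMERAS_4A = {
    "42-1": {"orientation": "left"},
    "42-2": {"orientation": "front"},
    "42-3": {"orientation": "right"},
    "42-4": {"orientation": "front", "zoom": True},  # mounted on the arm
    "42-5": {"orientation": "rear"},
}
```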
Returning to FIG. 1, the description of the configuration example of the remote control system 1a is continued. The communication line 10 and the communication line 11 may each be a wireless line, a wired line, or a line having both wireless and wired sections. The terminal device 2a and the control device 3a can communicate the data necessary for operation via the communication line 10. The control device 3a and the robot 4a can communicate the data necessary for operation via the communication line 11. The data necessary for operation are command data, media data, and log data.
Media data is image (video) data. The image may be a moving image or a still image. Media data may include audio data. Media data may include identification information of the camera 42 so that the terminal device 2a or the control device 3a that acquires the media data transmitted from the robot 4a can select a camera 42. Media data may also include information representing the orientation of each camera 42 mounted on the robot 4a.
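One plausible shape for such a media-data message is sketched below; the patent does not fix a wire format, so every field name here is an assumption. It simply bundles the image with the camera identification and orientation information the paragraph mentions:

```python
media_data = {
    "camera_id": "42-2",       # identification information of the camera 42
    "orientation": "front",    # orientation of the camera on the robot 4a
    "image": b"<jpeg bytes>",  # moving-image frame or still image
    "audio": None,             # optional audio data
}
```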
The robot 4a transmits media data including the image generated by the camera 42 selected from among the plurality of cameras 42 to the control device 3a. The control device 3a transmits the media data including the image generated by the camera 42 to the terminal device 2a.
Log data is data of the robot's operation log. Log data may include sensor data. The robot 4a transmits log data to the control device 3a. The control device 3a transmits the log data to the terminal device 2a.
For example, a browser runs on the terminal device 2a. Using the browser, the terminal device 2a displays on the display unit 24 an image (a reconstructed image) corresponding to the media data including the image generated by the camera 42. The terminal device 2a may display the log data on the display unit 24 using the browser. The terminal device 2a may transmit media data including images generated by the terminal camera 22 to the control device 3a, and may also transmit the result of image recognition processing on the media data to the control device 3a.
Next, each functional unit of the terminal device 2a will be described.
The terminal storage unit 20 stores the programs executed by the terminal control unit 23a. The operation unit 21 receives operations input by the operator. The operation unit 21 and the display unit 24 may be integrated; for example, when the operation unit 21 is a touch panel integrated with the display unit 24, the operation unit 21 receives pressing operations by the operator. The operation unit 21 outputs a signal corresponding to the received operation to the terminal control unit 23a.
The terminal camera 22 (first camera) images a predetermined direction, for example the normal direction of the screen of the display unit 24, with an angle of view "v". By imaging, from the terminal device 2a, the direction in which the operator 200 operating the terminal device 2a is located, the terminal camera 22 generates image data including, for example, the face image of the operator 200.
The media processing unit 230a generates media data including the image data generated by the terminal camera 22. The media processing unit 230a also executes image recognition processing on the image data generated by the terminal camera 22. The image recognition processing is, for example, processing for recognizing the position and length (size) of the face image of the operator 200. The media processing unit 230a outputs the result of the image recognition processing to the first command generation unit 232a.
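The image recognition step needs only the position and size of the operator's face in the frame. As one possible realization (the patent does not prescribe a detector), a sketch using OpenCV's bundled Haar cascade:

```python
import cv2

# Illustrative face detection; any detector returning a bounding box would do.
_face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def face_position_and_length(frame):
    """Return (d0, d): the horizontal position of the face's left edge and the
    face width in pixels within the frame, or None if no face is detected."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = _face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return None
    x, y, w, h = max(faces, key=lambda f: f[2] * f[3])  # largest detection
    return x, w  # x plays the role of "d0", w the face length "d"
```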
When the terminal device 2a includes a microphone (not shown), the media processing unit 230a may execute voice recognition processing. The voice recognition processing is, for example, processing for recognizing the utterance content of the operator 200, which includes, for example, identification information of the selected camera.
The media processing unit 230a acquires media data including the image generated by the selected camera 42 from the terminal communication unit 25 and executes image processing on it. The image processing is, for example, processing for reconstructing the image generated by the camera 42 of the robot 4a from the media data. The media processing unit 230a displays the reconstructed image on the display unit 24 in real time.
The log data processing unit 231 acquires the log data generated by the lower control unit 43a from the terminal communication unit 25. The log data is data such as the operation log of the robot 4a. The log data processing unit 231 executes predetermined processing on the log data, for example generating an image representing the operation log, and displays that image on the display unit 24.
Hereinafter, command data indicating the selected camera is referred to as "selection command data". The first command generation unit 232a (selection command generation unit) generates selection command data based on the result of the image recognition processing. For example, the first command generation unit 232a generates the selection command data based on the position and length (size) of the face image of the operator 200 within the frame of the image generated by the terminal camera 22. The first command generation unit 232a transmits the selection command data to the control device 3a using the terminal communication unit 25.
The first command generation unit 232a (direction command generation unit) generates direction command data according to an operation on the operation unit 21 and transmits it to the control device 3a using the terminal communication unit 25.
The display unit 24 is a display device such as a liquid crystal display. The display unit 24 displays the images generated by the media processing unit 230a and the images generated by the log data processing unit 231, for example an image representing the operation log of the robot 4a.
The terminal communication unit 25 communicates with the control communication unit 30 using the communication line 10. For example, the terminal communication unit 25 (transmission unit) transmits the direction command data corresponding to an operation on the operation unit 21 to the control communication unit 30, and transmits the selection command data corresponding to the position and length of the face image of the operator 200 within the frame of the image to the control communication unit 30.
The terminal communication unit 25 (acquisition unit) acquires the media data generated by the lower control unit 43a from the control communication unit 30 and outputs it to the media processing unit 230a. The terminal communication unit 25 also acquires the log data generated by the lower control unit 43a from the control communication unit 30 and outputs it to the log data processing unit 231.
Next, each functional unit of the control device 3a will be described.
The control communication unit 30 acquires the direction command data and the selection command data from the terminal communication unit 25 and outputs them to the upper control unit 32a.
The control communication unit 30 acquires the media data generated by the lower control unit 43a from the robot communication unit 40 and transmits it to the terminal communication unit 25.
The control communication unit 30 acquires the log data generated by the lower control unit 43a from the robot communication unit 40 and transmits it to the terminal communication unit 25.
The control communication unit 30 acquires the direction command data in the format used by the lower control unit 43a from the upper control unit 32a and outputs it to the robot communication unit 40.
The control communication unit 30 acquires the selection command data in the format used by the lower control unit 43a from the upper control unit 32a and outputs it to the robot communication unit 40.
The control storage unit 31 stores the programs executed by the upper control unit 32a.
Based on the direction command data transmitted from the terminal communication unit 25, the upper control unit 32a generates direction command data in the format used by the lower control unit 43a and outputs it to the control communication unit 30.
Based on the selection command data transmitted from the terminal communication unit 25, the upper control unit 32a generates selection command data in the format used by the lower control unit 43a and outputs it to the control communication unit 30.
Next, each functional unit of the robot 4a will be described.
The robot communication unit 40 communicates with the control communication unit 30 using the communication line 11. For example, the robot communication unit 40 acquires from the control communication unit 30 the direction command data corresponding to an operation on the operation unit 21 and the selection command data in the format used by the lower control unit 43a.
The robot communication unit 40 transmits the media data generated by the lower control unit 43a, for example media data including the image generated by the selected camera 42, to the control communication unit 30. The robot communication unit 40 also transmits the log data generated by the lower control unit 43a to the control communication unit 30.
The robot storage unit 41 stores the programs executed by the lower control unit 43a.
Each camera 42 (second camera) images, with a predetermined angle of view, the direction determined by its mounted orientation. For example, the camera 42-1 images the left direction of the robot 4a with a predetermined angle of view. Each camera 42 outputs the captured (generated) image to the lower control unit 43a.
The lower control unit 43a controls the operation of each functional unit of the robot 4a. The lower control unit 43a acquires the selection command data in the format used by the lower control unit 43a from the robot communication unit 40, and selects one or more cameras 42 from among the plurality of cameras 42 based on the acquired selection command data.
The lower control unit 43a generates media data including the image generated by the selected camera 42 and transmits it to the robot communication unit 40. The lower control unit 43a also outputs log data to the robot communication unit 40.
Next, the method of selecting a camera 42 will be described.
In the first embodiment, a camera 42 is selected according to the position of the operator's image (for example, the face image) captured by the terminal camera 22, that is, according to the position of the operator's image within the frame of the image generated by the terminal camera 22. A camera 42 may also be selected according to the length (size) of the operator's image within the frame.
FIG. 5 is a diagram showing a first example of the position of the operator 200 with respect to the position of the terminal camera 22. In the following, the "x2" axis and the "y2" axis span the horizontal plane, the "y2" axis indicates the front-rear (depth) direction, and the "z2" axis indicates the vertical direction. In FIG. 5, with respect to the horizontal direction, the face of the operator 200 is at a horizontal angle "a0" from the edge of the angle of view "v" of the terminal camera 22.
FIG. 6 is a diagram showing an example of the position of the image of the operator 200 within the frame of the captured image. The frame width (horizontal length, in pixels) of the image generated (captured) by the terminal camera 22 is "w". The frame width "w" corresponds to the angle of view "v" shown in FIG. 5. Within such a frame, the face image of the operator 200 is captured at a position separated horizontally by a length "d0" from the left edge (reference position) of the frame. The length "d0" corresponds to the horizontal angle "a0" of the position of the operator 200.
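The text does not state the mapping between the angle a0 and the in-frame position d0 explicitly; under the simplifying assumption that pixel position scales linearly with angle across the field of view (an approximation for illustration, not a claim of the patent), the relationship is roughly:

```latex
d_0 \approx w \cdot \frac{a_0}{v}
```

so that an operator at the edge of the angle of view (a0 = 0 or a0 = v) maps to the frame edges d0 = 0 or d0 = w.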
FIG. 7 is a diagram showing an example of the correspondence between conditions and the cameras 42 selected in the left-right direction. Within the frame width "w" of the image generated (captured) by the terminal camera 22, predetermined ranges are defined in the horizontal direction according to the number of cameras 42 mounted side by side at least in the left-right (horizontal) direction.
In FIG. 3, three cameras 42 (the camera 42-1, the camera 42-2, and the camera 42-3) are mounted on the robot 4a side by side at least in the left-right (horizontal) direction. Accordingly, in FIG. 7, three ranges are defined within the frame width "w": from the left edge "0" to less than "w/3" (the left part of the frame), from "w/3" to less than "2w/3" (the central part), and from "2w/3" to "w" (the right part).
When the position corresponding to the horizontal angle "a0" of the operator 200 is in the range below "w/3", the camera 42-1 is selected. That is, when the face image of the operator 200 is captured in the range from the left edge "0" to less than "w/3" of the image, the camera 42-1, which images the left direction of the robot 4a, is selected. In addition to the camera 42-1, the camera 42-2, which images the forward (front) direction of the robot 4a, may also be selected.
When the position corresponding to the horizontal angle "a0" of the operator 200 is in the range from "w/3" to less than "2w/3", the camera 42-2 is selected. That is, when the face image of the operator 200 is captured in this central range, the camera 42-2, which images the forward (front) direction of the robot 4a, is selected.
When the position corresponding to the horizontal angle "a0" of the operator 200 is in the range from "2w/3" to "w", the camera 42-3 is selected. That is, when the face image of the operator 200 is captured in this right-hand range, the camera 42-3, which images the right direction of the robot 4a, is selected. In addition to the camera 42-3, the camera 42-2, which images the forward (front) direction, may also be selected.
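The three conditions above reduce to comparing d0 with the thirds of the frame width. A minimal sketch (returning a list because, as noted, the front camera 42-2 may be selected alongside a side camera; whether to include it is a design choice the text leaves open, here exposed as a flag):

```python
def select_camera_lr(d0, w, include_front=False):
    """FIG. 7 correspondence: face position d0 within frame width w -> cameras."""
    if d0 < w / 3:
        selected = ["42-1"]   # left third -> left camera
    elif d0 < 2 * w / 3:
        selected = ["42-2"]   # middle third -> front camera
    else:
        selected = ["42-3"]   # right third -> right camera
    if include_front and "42-2" not in selected:
        selected.append("42-2")
    return selected
```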
With this mapping, when the operator 200 shown in FIG. 5 moves to the right (the positive direction of the x2 axis), the operator 200 sees the image generated by the camera 42-1, which images the left direction of the robot 4a (the negative direction of the x1 axis). When the operator 200 moves to the left (the negative direction of the x2 axis), the operator 200 sees the image generated by the camera 42-3, which images the right direction of the robot 4a (the positive direction of the x1 axis).
Alternatively, when the position corresponding to the horizontal angle "a0" is in the range below "w/3", the camera 42-3, which images the right direction of the robot 4a, may be selected instead of the camera 42-1; and when it is in the range from "2w/3" to "w", the camera 42-1, which images the left direction, may be selected instead of the camera 42-3.
With this alternative mapping, when the operator 200 shown in FIG. 5 moves to the left (the negative direction of the x2 axis), the operator 200 sees the image generated by the camera 42-1, which images the left direction of the robot 4a (the negative direction of the x1 axis). When the operator 200 moves to the right (the positive direction of the x2 axis), the operator 200 sees the image generated by the camera 42-3, which images the right direction of the robot 4a (the positive direction of the x1 axis).
FIG. 8 is a diagram showing a second example of the position of the operator 200 with respect to the position of the terminal camera 22. FIG. 8 shows, with respect to the depth direction, the position of the face of the operator 200 at a distance "L1" from the terminal camera 22 and at a distance "L2" from the terminal camera 22. The distance "L1" is a predetermined reference distance and is longer than the distance "L2".
When the face of the operator 200 is at the distance "L1" from the terminal camera 22, the horizontal angle subtended by the width of the face is "a1". When the face is at the distance "L2", the horizontal angle subtended by the width of the face is "a2".
FIG. 9 is a diagram showing an example of the length (size) of the image of the operator 200 within the frame of the captured image. When the face of the operator 200 is at the distance "L1", the horizontal length of the face image within the frame of width "w" is "d1". The length "d1" is a reference length corresponding to the horizontal angle "a1" of the width of the face of the operator 200. The length "d1" may also be expressed as the length of one side of a rectangular frame (bounding box) enclosing the face image of the operator 200.
To determine the reference length "d1", the operator 200 images his or her own face at the distance "L1" using the terminal camera 22, for example at the timing when the robot 4a is started or at an arbitrary timing.
When the face of the operator 200 is at the distance "L2" from the terminal camera 22, the horizontal length of the face image within the frame of width "w" is "d2". The length "d2" corresponds to the horizontal angle "a2" of the width of the face of the operator 200.
FIG. 10 is a diagram showing an example of the correspondence between conditions and the cameras 42 selected in the front-rear direction. In FIG. 3, two cameras 42 (the camera 42-2 and the camera 42-4) are mounted on the robot 4a side by side at least in the front-rear (depth) direction. A threshold "R" is predetermined for the selection of the cameras 42 arranged in the front-rear direction; the threshold "R" is, for example, 1.2.
In FIG. 10, when the horizontal length "d2" is at least 20% greater than the reference length "d1", the camera 42-4 is selected. That is, when the length ratio "d2/d1" is equal to or greater than the threshold "R", the camera 42-4, which zoom-images the forward (front) direction of the robot 4a, is selected. When the ratio "d2/d1" is less than the threshold "R", the camera 42-2 is selected.
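The front-rear rule is a single ratio test against the threshold R. A minimal sketch under the same illustrative naming as the earlier snippets:

```python
ZOOM_THRESHOLD_R = 1.2  # threshold "R" from FIG. 10

def select_camera_fb(d2, d1):
    """FIG. 10 correspondence: select the zoom camera 42-4 when the operator
    leans in, i.e. when the face-width ratio d2/d1 reaches the threshold R."""
    return "42-4" if d2 / d1 >= ZOOM_THRESHOLD_R else "42-2"
```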
 次に、遠隔操作システム1aの動作例を説明する。
 図11は、遠隔操作システム1aの動作例を示すフローチャートである。メディア処理部230aは、操作者200を撮像する端末カメラ22によって生成された画像を取得する(ステップS101)。第1命令生成部232aは、異なる方向を撮像する複数のカメラ42を搭載したロボット4aの進行方向又は旋回方向を示す方向命令データを、例えば操作に基づいて生成する(ステップS102)。端末通信部25は、方向命令データを、制御通信部30を介してロボット4aに送信する(ステップS103)。
Next, an operation example of the remote control system 1a will be described.
FIG. 11 is a flow chart showing an operation example of the remote control system 1a. The media processing unit 230a acquires an image generated by the terminal camera 22 that captures the operator 200 (step S101). The first command generation unit 232a generates direction command data indicating the traveling direction or turning direction of the robot 4a equipped with a plurality of cameras 42 that capture images in different directions, for example, based on an operation (step S102). The terminal communication unit 25 transmits direction command data to the robot 4a via the control communication unit 30 (step S103).
 第1命令生成部232aは、端末カメラ22によって生成された画像のフレーム内における操作者200の顔画像の位置及び長さ(画像認識処理の結果)に基づいて、選択命令データを生成する(ステップS104)。端末通信部25は、選択命令データをロボット4aに送信する(ステップS105)。 The first command generation unit 232a generates selection command data based on the position and length of the facial image of the operator 200 in the frame of the image generated by the terminal camera 22 (result of image recognition processing) (step S104). The terminal communication unit 25 transmits the selection instruction data to the robot 4a (step S105).
 端末通信部25は、端末カメラ22によって生成された画像のフレーム内における操作者200の画像の位置及び長さのうちの少なくとも一方に基づいて選択されたカメラ42に関して、選択されたカメラ42によって生成された画像を取得する(ステップS106)。表示部24は、選択されたカメラ42によって生成された画像を表示する(ステップS107)。 The terminal communication unit 25 generates a The obtained image is obtained (step S106). The display unit 24 displays the image generated by the selected camera 42 (step S107).
 以上のように、メディア処理部230a(取得部)は、端末カメラ22(第1カメラ)によって生成された画像を取得する。第1命令生成部232a(方向命令生成部)は、複数のカメラ42を搭載したロボット4aの進行方向又は旋回方向を示す方向命令データを生成する。端末通信部25(送信部)は、方向命令データを、制御通信部30を介してロボット4aに送信する。端末通信部25(第2取得部)は、端末カメラ22によって生成された画像のフレーム内における操作者200の画像の位置及び長さのうちの少なくとも一方に基づいて選択されたカメラ42(第2カメラ)によって生成された画像を取得する。表示部24は、選択されたカメラ42によって生成された画像を表示する。 As described above, the media processing unit 230a (acquisition unit) acquires an image generated by the terminal camera 22 (first camera). The first command generation unit 232a (direction command generation unit) generates direction command data indicating the traveling direction or turning direction of the robot 4a on which the multiple cameras 42 are mounted. The terminal communication unit 25 (transmitting unit) transmits the direction instruction data to the robot 4a via the control communication unit 30. FIG. The terminal communication unit 25 (second acquisition unit) selects the camera 42 (second Acquire the image produced by the camera). The display unit 24 displays images generated by the selected camera 42 .
 As a result, the operator does not need to select a camera 42 manually while operating the robot, so the operability of a robot equipped with a plurality of cameras 42 can be improved. Even in an environment where a large display unit cannot be used (an environment where a small display unit is used), the visibility of the images generated by the cameras 42 can be improved.
 (Modification of the First Embodiment)
 In the first embodiment, the terminal device generates the selection command data according to the image data generated by the terminal camera 22. The modification of the first embodiment differs from the first embodiment in that the control device generates the selection command data. The following description of this modification focuses on the differences from the first embodiment.
 FIG. 12 is a diagram showing a configuration example of the remote control system 1b. The remote control system 1b is a system for remotely operating an operation target (control target). The remote control system 1b includes a terminal device 2b, a control device 3b, and a robot 4b.
 The terminal device 2b includes a terminal storage unit 20, an operation unit 21, a terminal camera 22 (first camera), a terminal control unit 23b, a display unit 24, and a terminal communication unit 25. The terminal control unit 23b includes a media processing unit 230b, a log data processing unit 231, and a first command generation unit 232b (direction command generation unit). The terminal device 2b may include a microphone (not shown).
 The control device 3b includes a control communication unit 30, a control storage unit 31, and an upper control unit 32b. The upper control unit 32b includes a second command generation unit 320b. The robot 4b includes a robot communication unit 40, a robot storage unit 41, N cameras 42 (second cameras), and a lower control unit 43b.
 Instead of outputting the result of the image recognition processing to the first command generation unit 232a, the media processing unit 230b outputs the result of the image recognition processing to the second command generation unit 320b. Here, the media processing unit 230b uses the terminal communication unit 25 to transmit the result of the image recognition processing to the control communication unit 30.
 The second command generation unit 320b (selection command generation unit) generates the selection command data based on the result of the image recognition processing. For example, the second command generation unit 320b generates the selection command data based on the position and length of the face image of the operator 200 within the frame of the image generated by the terminal camera 22 (the result of the image recognition processing). The second command generation unit 320b uses the control communication unit 30 to transmit the selection command data to the robot 4b.
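 To illustrate the division of labor in this modification, the following sketch shows how the control device side might turn a received recognition result into a selection command. All field names, the command format, and the horizontal-position thresholds are assumptions made for this example; the description above specifies the selection rules only at the level of the correspondences in FIGS. 7 through 10.

```python
# Illustrative sketch only: the terminal transmits the image-recognition
# result, and the control device (second command generation unit 320b)
# derives the selection command. Field names, the position thresholds,
# and the command format are all assumptions, not taken from the patent.
from dataclasses import dataclass

@dataclass
class RecognitionResult:
    face_x: float        # horizontal center of the face, normalized to 0..1
    face_length: float   # measured face length "d2"

REFERENCE_LENGTH_D1 = 100.0  # calibration value "d1" (assumed unit: pixels)
RATIO_THRESHOLD_R = 1.2      # threshold "R" from the description above

def make_selection_command(result: RecognitionResult) -> dict:
    if result.face_length / REFERENCE_LENGTH_D1 >= RATIO_THRESHOLD_R:
        camera = "42-4"   # operator leaned in: zoom camera
    elif result.face_x < 0.4:
        camera = "42-1"   # face toward one side of the frame (assumed rule)
    elif result.face_x > 0.6:
        camera = "42-3"   # face toward the other side (assumed rule)
    else:
        camera = "42-2"   # default front camera
    return {"command": "select-camera", "camera": camera}
```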
 Next, an operation example of the remote control system 1b will be described.
 FIG. 13 is a flowchart showing an operation example of the remote control system 1b. The media processing unit 230b acquires an image generated by the terminal camera 22, which captures the operator 200 (step S201). The first command generation unit 232b generates, for example based on an operation, direction command data indicating the traveling direction or turning direction of the robot 4b equipped with the plurality of cameras 42 that capture images in different directions (step S202). The terminal communication unit 25 transmits the direction command data to the robot 4b via the control communication unit 30 (step S203).
 The terminal communication unit 25 transmits the position and other attributes of the image of the operator 200 (the result of the image recognition processing) to the control device 3b, which generates the selection command data based on them (step S204).
 The second command generation unit 320b generates the selection command data based on the position and other attributes of the image of the operator 200 within the frame of the image generated by the terminal camera 22 (step S205). The control communication unit 30 transmits the selection command data to the robot 4b (step S206).
 The terminal communication unit 25 acquires the image generated by the camera 42 selected based on the position and length of the face image of the operator 200 (step S207). The display unit 24 displays the image generated by the selected camera 42 (step S208).
 As described above, the media processing unit 230b (acquisition unit) acquires the image generated by the terminal camera 22 (first camera), which captures the operator 200. The first command generation unit 232b (direction command generation unit) generates direction command data indicating the traveling direction or turning direction of the robot 4b equipped with the plurality of cameras 42 that capture images in different directions. The terminal communication unit 25 (transmission unit) transmits the direction command data to the robot 4b via the control communication unit 30. The terminal communication unit 25 transmits the position and other attributes of the image of the operator 200 to the control device 3b, and acquires the image generated by the camera 42 (second camera) selected based on the position and other attributes of the face image of the operator 200. The display unit 24 displays the image generated by the selected camera 42.
 As a result, the operator does not need to select a camera 42 manually while operating the robot, so the operability of a robot equipped with a plurality of cameras 42 can be improved. Even in an environment where a large display unit cannot be used, the visibility of the images generated by the cameras 42 can be improved.
 (Second Embodiment)
 The second embodiment differs from the first embodiment in that a camera 42 is selected according to the distance between the robot and objects around the robot. The following description of the second embodiment focuses on the differences from the first embodiment.
 FIG. 14 is a diagram showing a configuration example of the remote control system 1c. The remote control system 1c is a system for remotely operating an operation target (control target). The remote control system 1c includes a terminal device 2c, a control device 3c, and a robot 4c.
 The terminal device 2c (first remote control device) includes a terminal storage unit 20, an operation unit 21, a terminal camera 22 (first camera), a terminal control unit 23c, a display unit 24, and a terminal communication unit 25. The terminal control unit 23c includes a media processing unit 230c, a log data processing unit 231, and a first command generation unit 232c. The terminal device 2c may include a microphone (not shown).
 The control device 3c (second remote control device) includes a control communication unit 30, a control storage unit 31, and an upper control unit 32c. The robot 4c includes a robot communication unit 40, a robot storage unit 41, N cameras 42 (second cameras), a lower control unit 43c, and M sensors 44 (where M is an integer equal to or greater than 2).
 Each sensor 44 measures the distance between the sensor 44 and an object (not shown). The object is not limited to any particular kind; it is, for example, a wall. The sensor 44 measures the distance between itself and the objects in its surroundings.
 For example, when the sensor 44 is an ultrasonic sonar, the sensor 44 measures the distance between the sensor 44 and an object based on the time taken by an ultrasonic wave to travel to the object and back. Likewise, when the sensor 44 is an electromagnetic-wave radar, the sensor 44 measures the distance based on the round-trip time of an electromagnetic wave between the sensor 44 and the object. The sensor 44 outputs the measured distance between the sensor 44 and the object to the lower control unit 43c.
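 Both sensor types reduce to the same round-trip (time-of-flight) computation, sketched below using standard propagation speeds; the function and constant names are illustrative assumptions.

```python
SPEED_OF_SOUND_M_S = 343.0           # ultrasonic sonar (air, approx. 20 degC)
SPEED_OF_LIGHT_M_S = 299_792_458.0   # electromagnetic-wave radar

def distance_from_round_trip(round_trip_time_s: float, speed_m_s: float) -> float:
    # The wave travels to the object and back, so halve the total path.
    return speed_m_s * round_trip_time_s / 2.0

# Example: a sonar echo returning after about 5.8 ms corresponds to roughly 1 m.
distance_m = distance_from_round_trip(5.8e-3, SPEED_OF_SOUND_M_S)
```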
 The lower control unit 43c acquires the measured distance between each sensor 44 and the object from that sensor 44. Based on these distance measurements, the lower control unit 43c (selection command generation unit) selects one or more cameras 42 from among the plurality of cameras 42. In other words, the lower control unit 43c generates the selection command data based on the measured distances between the sensors 44 and objects.
 The lower control unit 43c generates media data including the image generated by the selected camera 42 and transmits the media data to the robot communication unit 40. The lower control unit 43c also outputs log data to the robot communication unit 40. The log data may include the distance data measured by the sensors 44.
 FIG. 15 is a top view showing an example of the appearance of the robot 4c. FIG. 16 is a side view showing an example of the appearance of the robot 4c. In FIGS. 15 and 16, the forward direction of the robot 4c is the direction of the "y1" axis. The robot 4c includes, as an example, a camera 42-1, a camera 42-2, a camera 42-3, a camera 42-4, and a camera 42-5.
 The robot 4c has a sensor 44-1 on its left side face, so that the sensor 44-1 measures the distance between the robot 4c and an object approaching the robot 4c from the left. The robot 4c has a sensor 44-2 on its front face, so that the sensor 44-2 measures the distance between the robot 4c and an object approaching the robot 4c from the front.
 The robot 4c has a sensor 44-3 on its right side face, so that the sensor 44-3 measures the distance between the robot 4c and an object approaching the robot 4c from the right. The robot 4c has a sensor 44-4 on its rear face, so that the sensor 44-4 measures the distance between the robot 4c and an object approaching the robot 4c from behind.
 Next, a method for selecting a camera 42 will be described.
 FIG. 17 is a diagram showing an example of the correspondence between conditions and the camera 42 selected with respect to the left-right and front-rear directions. A distance "L0" is predetermined as a threshold, that is, a distance at which an object approaching the robot 4c, the robot 4c itself, or both could suffer damage or be impeded.
 When the distance "L3" between the sensor 44-1 and an object is equal to or less than the distance "L0", the lower control unit 43c selects the camera 42-1 from among the plurality of cameras 42. When the distance "L4" between the sensor 44-3 and an object is equal to or less than the distance "L0", the lower control unit 43c selects the camera 42-3. When the distance "L5" between the sensor 44-4 and an object is equal to or less than the distance "L0", the lower control unit 43c selects the camera 42-5.
 When the distance "L6" between the sensor 44-2 and an object is equal to or less than the distance "L0", the lower control unit 43c selects the camera 42-2. In all other cases, the lower control unit 43c selects the camera 42-4 from among the plurality of cameras 42. Note that these selections are merely examples.
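 A minimal sketch of this rule, as it might run in the lower control unit 43c, is shown below. The sensor-to-camera mapping follows FIGS. 15 through 17; the concrete value of L0, the dictionary layout, and the tie-breaking order when several sensors trip at once are assumptions for this example.

```python
DISTANCE_THRESHOLD_L0_M = 0.5  # threshold "L0"; no concrete value is given above

# Each sensor watches one side of the robot 4c; the mapped camera faces that side.
SENSOR_TO_CAMERA = {
    "44-1": "42-1",  # left
    "44-2": "42-2",  # front
    "44-3": "42-3",  # right
    "44-4": "42-5",  # rear
}

def select_camera(distances_m: dict[str, float]) -> str:
    """distances_m maps a sensor id to its latest distance measurement."""
    for sensor_id, camera_id in SENSOR_TO_CAMERA.items():
        if distances_m.get(sensor_id, float("inf")) <= DISTANCE_THRESHOLD_L0_M:
            return camera_id  # an object is close on that side: show it
    return "42-4"             # no nearby object: fall back to the default camera
```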
 Next, an operation example of the remote control system 1c will be described.
 FIG. 18 is a flowchart showing an operation example of the remote control system 1c. The first command generation unit 232c generates, for example based on an operation, direction command data indicating the traveling direction or turning direction of the robot 4c equipped with the plurality of cameras 42 that capture images in different directions (step S301). The terminal communication unit 25 transmits the direction command data to the robot 4c (lower control unit 43c) via the control communication unit 30 (step S302).
 Using each sensor 44, the lower control unit 43c selects the camera 42 that captures the direction of any object (not shown) whose distance from the robot 4c has become equal to or less than the threshold (step S303). The terminal communication unit 25 acquires the image generated by the selected camera 42; that is, the terminal communication unit 25 acquires the image generated by the camera 42 that captures the direction of the object whose distance from the robot 4c is equal to or less than the threshold (step S304). The display unit 24 displays the image generated by the selected camera 42 (step S305).
 As described above, the first command generation unit 232c (direction command generation unit) generates direction command data indicating the traveling direction or turning direction of the robot 4c equipped with the plurality of cameras 42 that capture images in different directions. The terminal communication unit 25 (transmission unit) transmits the direction command data to the robot 4c via the control communication unit 30. The terminal communication unit 25 (acquisition unit) acquires the image generated by the selected camera 42, that is, the image generated by the camera 42 that captures the direction of an object whose distance from the robot 4c has become equal to or less than the threshold. The display unit 24 displays the image generated by the selected camera 42.
 As a result, the operator does not need to select a camera 42 manually (switch screens manually) while operating the robot, so the operability of a robot equipped with a plurality of cameras 42 can be improved. Even in an environment where a large display unit cannot be used, the visibility of the images generated by the cameras 42 can be improved.
 When another moving body approaches the robot 4c, the robot 4c can avoid it while the operator watches the image captured by the selected camera 42. Similarly, when the moving robot 4c approaches a wall or the like, the robot 4c can avoid the wall while the operator watches the image captured by the selected camera 42.
 (Modification of the Second Embodiment)
 In the second embodiment, the robot selects the camera according to the distance between an object and the robot. The modification of the second embodiment differs from the second embodiment in that the control device selects the camera according to the distance between an object and the robot. The following description of this modification focuses on the differences from the second embodiment.
 FIG. 19 is a diagram showing a configuration example of the remote control system 1d. The remote control system 1d is a system for remotely operating an operation target (control target). The remote control system 1d includes a terminal device 2d, a control device 3d, and a robot 4d.
 The terminal device 2d includes a terminal storage unit 20, an operation unit 21, a terminal camera 22 (first camera), a terminal control unit 23d, a display unit 24, and a terminal communication unit 25. The terminal control unit 23d includes a media processing unit 230b, a log data processing unit 231, and a first command generation unit 232b. The terminal device 2d may include a microphone (not shown).
 The control device 3d includes a control communication unit 30, a control storage unit 31, and an upper control unit 32d. The upper control unit 32d includes a second command generation unit 320d. The robot 4d includes a robot communication unit 40, a robot storage unit 41, N cameras 42 (second cameras), a lower control unit 43d, and M sensors 44.
 The lower control unit 43d acquires the measured distance between each sensor 44 and the object from that sensor 44, and transmits the distance measurements to the robot communication unit 40.
 The lower control unit 43d acquires, from the robot communication unit 40, selection command data in the format used by the lower control unit 43d. Based on this selection command data, the lower control unit 43d selects one or more cameras 42 from among the plurality of cameras 42.
 The lower control unit 43d generates media data including the image generated by the selected camera 42 and transmits the media data to the robot communication unit 40. The lower control unit 43d also outputs log data to the robot communication unit 40.
 The robot communication unit 40 acquires the measured distance between each sensor 44 and the object from the lower control unit 43d, and transmits the distance measurements to the control communication unit 30.
 The robot communication unit 40 transmits the media data generated by the lower control unit 43d to the control communication unit 30. The media data includes, for example, the image generated by the selected camera 42. The robot communication unit 40 also transmits the log data generated by the lower control unit 43d to the control communication unit 30.
 The control communication unit 30 transmits the media data generated by the lower control unit 43d to the terminal communication unit 25, and also transmits the log data to the terminal communication unit 25. The control communication unit 30 outputs the measured distance between each sensor 44 and the object to the second command generation unit 320d.
 The control communication unit 30 acquires selection command data in the format used by the lower control unit 43d from the upper control unit 32d, and outputs that selection command data to the robot communication unit 40.
 The second command generation unit 320d (selection command generation unit) acquires the measured distance between each sensor 44 and the object from the control communication unit 30. Based on these distance measurements, the second command generation unit 320d selects one or more cameras 42 from among the plurality of cameras 42 and outputs the selection command data to the control communication unit 30.
 Next, an operation example of the remote control system 1d will be described.
 FIG. 20 is a flowchart showing an operation example of the remote control system 1d. The first command generation unit 232d generates, for example based on an operation, direction command data indicating the traveling direction or turning direction of the robot 4d equipped with the plurality of cameras 42 that capture images in different directions (step S401). The terminal communication unit 25 transmits the direction command data to the robot 4d (lower control unit 43d) via the control communication unit 30 (step S402).
 The second command generation unit 320d acquires the measured distance between each sensor 44 and the object from the control communication unit 30 (step S403). The second command generation unit 320d selects the camera 42 that captures the direction of the object whose distance from the robot 4d has become equal to or less than the threshold (step S404). The second command generation unit 320d transmits selection command data indicating the selected camera 42 to the robot 4d (lower control unit 43d) (step S405).
 The terminal communication unit 25 acquires the image generated by the selected camera 42; for example, the terminal communication unit 25 acquires the image generated by the camera 42 that captures the direction of the object whose distance from the robot 4d is equal to or less than the threshold (step S406). The display unit 24 displays the image generated by the selected camera 42 (step S407).
 As described above, the first command generation unit 232d (direction command generation unit) generates direction command data indicating the traveling direction or turning direction of the robot 4d on which the plurality of cameras 42 are mounted. The terminal communication unit 25 (transmission unit) transmits the direction command data to the robot 4d via the control communication unit 30. The terminal communication unit 25 (acquisition unit) acquires the image generated by the selected camera 42, that is, the image generated by the camera 42 that captures the direction of an object whose distance from the robot 4d has become equal to or less than the threshold. The display unit 24 displays the image generated by the selected camera 42.
 As a result, the operator does not need to select a camera 42 manually (switch screens manually) while operating the robot, so the operability of a robot equipped with a plurality of cameras 42 can be improved. Even in an environment where a large display unit cannot be used, the visibility of the images generated by the cameras 42 can be improved.
 (Third Embodiment)
 The third embodiment differs from the first and second embodiments in that a camera 42 is selected according to the command data concerning the traveling direction or turning direction of the robot. The following description of the third embodiment focuses on the differences from the first and second embodiments.
 FIG. 21 is a diagram showing a configuration example of the remote control system 1e. The remote control system 1e is a system for remotely operating an operation target (control target). The remote control system 1e includes a terminal device 2e, a control device 3e, and a robot 4e.
 The terminal device 2e (first remote control device) includes a terminal storage unit 20, an operation unit 21, a terminal camera 22 (first camera), a terminal control unit 23e, a display unit 24, and a terminal communication unit 25. The terminal control unit 23e includes a media processing unit 230e, a log data processing unit 231, and a first command generation unit 232e. The terminal device 2e may include a microphone (not shown).
 The control device 3e (second remote control device) includes a control communication unit 30, a control storage unit 31, and an upper control unit 32e. The robot 4e includes a robot communication unit 40, a robot storage unit 41, N cameras 42 (second cameras), and a lower control unit 43e.
 The first command generation unit 232e acquires, from the operation unit 21, a signal corresponding to the operation received by the operation unit 21. The first command generation unit 232e generates the direction command data according to the operation on the operation unit 21, and also generates the selection command data according to the operation on the operation unit 21. For example, the first command generation unit 232e generates the selection command data so that a camera 42 captures the real space in the traveling direction (turning direction) indicated by the direction command data.
 FIG. 22 is a diagram showing an example of the correspondence between conditions and the camera 42 selected with respect to the left-right and front-rear directions. When the traveling direction indicated by the direction command data is the left direction, the first command generation unit 232e selects the camera 42-1 from among the plurality of cameras 42. When the traveling direction indicated by the direction command data is the right direction, the first command generation unit 232e selects the camera 42-3.
 When the traveling direction indicated by the direction command data is the backward direction, the first command generation unit 232e selects the camera 42-5 from among the plurality of cameras 42. In all other cases, the first command generation unit 232e selects the camera 42-2. The first command generation unit 232e may instead select the camera 42-4 from among the plurality of cameras 42.
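 This correspondence can be written as a small lookup, sketched below. The string tokens for the directions are assumptions made for this example, while the camera assignments follow FIG. 22 as described above.

```python
# Direction-to-camera lookup for the third embodiment (first command
# generation unit 232e). The direction tokens are assumed names.
DIRECTION_TO_CAMERA = {
    "left": "42-1",
    "right": "42-3",
    "backward": "42-5",
}

def select_camera_for_direction(direction: str) -> str:
    # Any other direction (e.g. forward) falls back to the front camera 42-2;
    # as noted above, the camera 42-4 may be chosen instead.
    return DIRECTION_TO_CAMERA.get(direction, "42-2")
```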
 Next, an operation example of the remote control system 1e will be described.
 FIG. 23 is a flowchart showing an operation example of the remote control system 1e. The first command generation unit 232e acquires, from the operation unit 21, a signal corresponding to an operation for controlling the traveling direction or turning direction of the robot 4e (step S501).
 The first command generation unit 232e generates direction command data indicating the traveling direction or turning direction of the robot 4e, which is equipped with the plurality of cameras 42 that capture images in different directions, based on the operation for controlling the traveling direction or turning direction of the robot 4e (step S502).
 The terminal communication unit 25 transmits the direction command data to the robot 4e via the control communication unit 30 (step S503). The first command generation unit 232e (selection command generation unit) generates the selection command data based on the operation for controlling the traveling direction or turning direction (step S504). The terminal communication unit 25 transmits the selection command data to the robot 4e (step S505).
 The terminal communication unit 25 (acquisition unit) acquires the image generated by the camera 42 selected based on the operation for controlling the traveling direction or turning direction (step S506). The display unit 24 displays the image generated by the selected camera 42 (step S507).
 As described above, the first command generation unit 232e (direction command generation unit) generates direction command data indicating the traveling direction or turning direction of the robot 4e, on which the plurality of cameras 42 are mounted, based on the operation for controlling the traveling direction or turning direction of the robot 4e. The terminal communication unit 25 (transmission unit) transmits the direction command data to the robot 4e via the control communication unit 30. The terminal communication unit 25 (acquisition unit) acquires the image generated by the camera 42 selected based on the operation for controlling the traveling direction or turning direction. The display unit 24 displays the image generated by the selected camera 42.
 As a result, the operator does not need to select a camera 42 manually while operating the robot, so the operability of a robot equipped with a plurality of cameras 42 can be improved. Even in an environment where a large display unit cannot be used, the visibility of the images generated by the cameras 42 can be improved.
 Simply by operating the operation unit 21-1 so that the robot 4e moves to the left, the operator can view the image captured by the camera 42-1, which captures images to the left. Simply by operating the operation unit 21-3 so that the robot 4e moves to the right, the operator can view the image captured by the camera 42-3, which captures images to the right.
 Simply by operating the operation unit 21-2 so that the robot 4e moves forward, the operator can view the image captured by the camera 42-2, which captures images to the front. Simply by operating the operation unit 21-4 so that the robot 4e moves backward, the operator can view the image captured by the camera 42-5, which captures images to the rear. In this way, the operator can operate the robot while constantly checking, on the screen, the situation in the real space in the robot's traveling direction.
 (Modification of the Third Embodiment)
 In the third embodiment, the terminal device generates the selection command data according to the traveling direction of the robot. The modification of the third embodiment differs from the third embodiment in that the control device generates the selection command data. The following description of this modification focuses on the differences from the third embodiment.
 FIG. 24 is a diagram showing a configuration example of the remote control system 1f. The remote control system 1f is a system for remotely operating an operation target (control target). The remote control system 1f includes a terminal device 2f, a control device 3f, and a robot 4f.
 The terminal device 2f includes a terminal storage unit 20, an operation unit 21, a terminal camera 22 (first camera), a terminal control unit 23f, a display unit 24, and a terminal communication unit 25. The terminal control unit 23f includes a media processing unit 230b, a log data processing unit 231, and a first command generation unit 232b. The terminal device 2f may include a microphone (not shown).
 The control device 3f includes a control communication unit 30, a control storage unit 31, and an upper control unit 32f. The upper control unit 32f includes a second command generation unit 320f. The robot 4f includes a robot communication unit 40, a robot storage unit 41, N cameras 42 (second cameras), and a lower control unit 43f.
 The second command generation unit 320f acquires the direction command data from the control communication unit 30 and generates the selection command data according to the direction command data. For example, the second command generation unit 320f generates the selection command data so that a camera 42 captures the real space in the direction (traveling direction) indicated by the direction command data.
 Next, an operation example of the remote control system 1f will be described.
 FIG. 25 is a flowchart showing an operation example of the remote control system 1f. The first command generation unit 232f acquires, from the operation unit 21, a signal corresponding to an operation for controlling the traveling direction or turning direction of the robot 4f (step S601).
 The first command generation unit 232f generates direction command data indicating the traveling direction or turning direction of the robot 4f, which is equipped with the plurality of cameras 42 that capture images in different directions, based on the operation for controlling the traveling direction or turning direction of the robot 4f (step S602).
 The terminal communication unit 25 transmits the direction command data to the robot 4f via the control communication unit 30 (step S603). The second command generation unit 320f (selection command generation unit) generates the selection command data based on the operation for controlling the traveling direction or turning direction (step S604). The control communication unit 30 transmits the selection command data to the robot 4f (step S605).
 The terminal communication unit 25 acquires the image generated by the camera 42 selected based on the operation for controlling the traveling direction or turning direction (step S606). The display unit 24 displays the image generated by the selected camera 42 (step S607).
 As described above, the first command generation unit 232f (direction command generation unit) generates direction command data indicating the traveling direction or turning direction of the robot 4f, on which the plurality of cameras 42 are mounted, based on the operation for controlling the traveling direction or turning direction of the robot 4f. The terminal communication unit 25 (transmission unit) transmits the direction command data to the robot 4f via the control communication unit 30. The terminal communication unit 25 (acquisition unit) acquires the image generated by the camera 42 selected based on the operation for controlling the traveling direction or turning direction. The display unit 24 displays the image generated by the selected camera 42.
 As a result, the operator does not need to select a camera 42 manually while operating the robot, so the operability of a robot equipped with a plurality of cameras 42 can be improved. Even in an environment where a large display unit cannot be used, the visibility of the images generated by the cameras 42 can be improved.
 The device of the present invention can also be realized by a computer and a program, and the program can be recorded on a non-transitory recording medium or provided through a network.
 (Hardware Configuration Example)
 FIG. 26 is a diagram showing a hardware configuration example of the remote control device 100 (terminal device and control device) in each embodiment. The remote control device 100 corresponds to at least one of the terminal device and the control device in each embodiment. Some or all of the functional units of the remote control device 100 are realized as software by a processor 101, such as a CPU (Central Processing Unit), executing a program stored in a storage device 102, which has a nonvolatile recording medium (non-transitory recording medium), and in a memory 103. The program may be recorded on a computer-readable non-transitory recording medium. Examples of the computer-readable non-transitory recording medium include portable media such as a flexible disk, a magneto-optical disk, a ROM (Read Only Memory), and a CD-ROM (Compact Disc Read Only Memory), and storage devices such as a hard disk built into a computer system. The communication unit 104 executes predetermined communication processing, and may acquire data and programs.
 Some or all of the functional units of the remote control device 100 may be realized using hardware including an electronic circuit (or circuitry) using, for example, an LSI (Large Scale Integrated circuit), an ASIC (Application Specific Integrated Circuit), a PLD (Programmable Logic Device), or an FPGA (Field Programmable Gate Array).
 The embodiments may be combined. The terminal device and the control device may be integrated into one body or may be separate bodies.
 Although the embodiments of the present invention have been described in detail with reference to the drawings, the specific configuration is not limited to these embodiments and includes designs and the like within a scope not departing from the gist of the present invention.
 The present invention is applicable to information processing devices (remote control devices) that remotely operate an operation target such as a robot.
1a, 1b, 1c, 1d, 1e, 1f: remote control system; 2a, 2b, 2c, 2d, 2e, 2f: terminal device; 3a, 3b, 3c, 3d, 3e, 3f: control device; 4a, 4b, 4c, 4d, 4e, 4f: robot; 10: communication line; 11: communication line; 20: terminal storage unit; 21: operation unit; 22: terminal camera; 23a, 23b, 23c, 23d, 23e, 23f: terminal control unit; 24: display unit; 25: terminal communication unit; 30: control communication unit; 31: control storage unit; 32a: upper control unit; 40: robot communication unit; 41: robot storage unit; 42: camera; 43a: lower control unit; 44: sensor; 100: remote control device; 101: processor; 102: storage device; 103: memory; 104: communication unit; 200: operator; 230a, 230b, 230c, 230d, 230e, 230f: media processing unit; 231: log data processing unit; 232a, 232b, 232c, 232d, 232e, 232f: first command generation unit; 240: display area; 320b, 320d, 320f: second command generation unit

Claims (4)

  1.  A remote control device comprising:
     a direction command generation unit that generates direction command data indicating a traveling direction or a turning direction of a robot equipped with a plurality of cameras that capture images in different directions;
     a transmission unit that transmits the direction command data to the robot;
     an acquisition unit that acquires an image generated by the camera selected based on a distance between the robot and an object; and
     a display unit that displays the image generated by the selected camera.
  2.  The remote control device according to claim 1, wherein
     the acquisition unit acquires an image generated by the camera that captures the direction of the object whose distance has become equal to or less than a threshold, and
     the display unit displays the image acquired by the acquisition unit.
  3.  A remote control program for causing a computer to function as the remote control device according to claim 1 or claim 2.
  4.  A computer-readable non-transitory recording medium recording a remote control program for causing a computer to function as the remote control device according to claim 1 or claim 2.
PCT/JP2021/020669 2021-05-31 2021-05-31 Remote manipulation device, remote manipulation program, and non-transitory recording medium WO2022254515A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2021/020669 WO2022254515A1 (en) 2021-05-31 2021-05-31 Remote manipulation device, remote manipulation program, and non-transitory recording medium

Publications (1)

Publication Number Publication Date
WO2022254515A1 (en)

Family

ID=84323949

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/020669 WO2022254515A1 (en) 2021-05-31 2021-05-31 Remote manipulation device, remote manipulation program, and non-transitory recording medium

Country Status (1)

Country Link
WO (1) WO2022254515A1 (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008144378A (en) * 2006-12-06 2008-06-26 Shin Caterpillar Mitsubishi Ltd Controller for remote controlled working machine
JP2014036400A (en) * 2012-08-10 2014-02-24 Mitsubishi Agricultural Machinery Co Ltd On-vehicle camera system

Legal Events

Code Title Description
121 Ep: the EPO has been informed by WIPO that EP was designated in this application (Ref document number: 21944025; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: PCT application non-entry in the European phase (Ref document number: 21944025; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: JP)