WO2021256463A1 - Imaging system and robot system - Google Patents


Info

Publication number
WO2021256463A1
Authority
WO
WIPO (PCT)
Prior art keywords
image pickup
head
display
image
posture
Prior art date
Application number
PCT/JP2021/022669
Other languages
French (fr)
Japanese (ja)
Inventor
Masayuki Kamon
Hirokazu Sugiyama
Original Assignee
Kawasaki Heavy Industries, Ltd.
Priority date
Filing date
Publication date
Application filed by Kawasaki Heavy Industries, Ltd.
Priority to JP2022531836A, granted as JP7478236B2
Publication of WO2021256463A1


Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J13/00: Controls for manipulators
    • B25J13/08: Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
    • B25J19/00: Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
    • B25J19/02: Sensing devices
    • B25J19/04: Viewing devices

Definitions

  • This disclosure relates to an imaging system and a robot system.
  • Patent Document 1 discloses a master-slave manipulator.
  • the manipulator includes an operation robot, a work robot, and an image pickup device that captures an image of a work object.
  • the image pickup device is attached to a first articulated telescopic mechanism, and the operator's helmet is attached to a second articulated telescopic mechanism.
  • the first articulated telescopic mechanism operates following the operation of the second articulated telescopic mechanism, so that the image pickup device follows the movement of the operator's head.
  • the image captured by the image pickup device is projected on a screen, and the operator operates the operating robot while visually checking the image on the screen. For example, when the operator turns the head to the left or right with respect to the front screen, an image captured by the image pickup device, which moves following the movement of the head, is displayed on a screen located in that direction. The operator may not be able to see the image on the screen located to the right or left sufficiently, and may not be able to perform an accurate operation.
  • the image pickup system includes an image pickup device, a detection device that detects the movement of the user's head, a variable device that changes the position and orientation of the image pickup device following the movement of the head detected by the detection device, and a display device that displays the image captured by the image pickup device to the user and that changes, according to the movement of the head, at least one of the position at which the image is displayed and the direction in which the image is displayed.
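  • The configuration above amounts to one control cycle: detect the head movement, make the image pickup device follow it, and display the captured image so that the display position and/or direction also follows. A minimal sketch of one such cycle, with the four devices abstracted as hypothetical callables (none of these names appear in the disclosure):

```python
def imaging_step(detect_head, move_camera, capture, show):
    """One cycle of the imaging system: detect the head movement, make the
    image pickup device follow it, and display the captured image at a
    matching position. The four callables stand in for the detection device,
    the variable device, the image pickup device, and the display device."""
    head_pose = detect_head()        # detection device output
    move_camera(head_pose)           # variable device follows the head
    show(capture(), head_pose)       # display device also follows the head
    return head_pose
```

In a real system each callable would be a device driver; here they only fix the order of operations implied by the claim language.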
  • FIG. 1 is a perspective view showing an example of a configuration of a robot system according to an exemplary embodiment.
  • FIG. 2 is a block diagram showing an example of a functional configuration of an imaging system according to an exemplary embodiment.
  • FIG. 3 is a flowchart showing an example of the operation of the imaging system according to the exemplary embodiment.
  • FIG. 4 is a side view showing an example of the configuration of the display device according to the first modification of the exemplary embodiment.
  • FIG. 5 is a flowchart showing an example of the operation of the imaging system according to the first modification.
  • FIG. 6 is a side view showing an example of the configuration of the display device according to the second modification of the exemplary embodiment.
  • FIG. 7 is a flowchart showing an example of the operation of the imaging system according to the second modification.
  • FIG. 8 is a side view showing an example of the configuration of the display device according to the modified example 3 of the exemplary embodiment.
  • FIG. 9 is a flowchart showing an example of the operation of the imaging system according to the modified example 3.
  • FIG. 10 is a side view showing an example of the configuration of the image pickup apparatus according to the modified example 4 of the exemplary embodiment.
  • FIG. 1 is a perspective view showing an example of the configuration of the robot system 1 according to the exemplary embodiment.
  • the robot system 1 includes an image pickup system 100, a robot 200, a robot operation device 300, and a robot control device 400.
  • the image pickup system 100 includes an image pickup device 110, a mobile device 120, a motion detection device 130, a display device 140, an image pickup control device 150, and an image pickup input device 160.
  • the mobile device 120 is an example of a variable device.
  • the robot 200 is an industrial robot and includes a robot arm 210, a base 220, and an end effector 230.
  • the robot 200 may be another type of robot such as a service robot, a medical robot, a drug discovery robot, and a humanoid.
  • Service robots are robots used in various service industries such as nursing care, medical care, cleaning, security, guidance, rescue, cooking, and product provision.
  • the base 220 is fixed on the support surface and supports the robot arm 210.
  • the support surface of the base 220 may be an immovable surface such as a floor surface, or may be a movable surface on a movable device such as a traveling device.
  • the robot arm 210 has at least one joint and has at least one degree of freedom.
  • the robot arm 210 is configured so that the end effector 230 is attached to the tip of the robot arm 210.
  • the robot arm 210 can move the end effector 230 so as to freely change the position and posture of the end effector 230.
  • the end effector 230 is configured to be able to apply various actions to an object (also referred to as a "work") W according to the application of the end effector 230, such as gripping, suction, spraying of a liquid such as paint, welding, and injection of a sealing agent.
  • the robot arm 210 is a vertical articulated robot arm with 6 degrees of freedom having 6 rotating joints, but is not limited thereto.
  • the type of the robot arm 210 may be any type, for example, a horizontal articulated type, a polar coordinate type, a cylindrical coordinate type, a rectangular coordinate type, or the like.
  • the joint of the robot arm 210 may be any joint such as a linear motion joint.
  • the number of joints of the robot arm 210 may be any number such as 5 or less or 7 or more.
  • the robot operating device 300 is arranged at a position away from the robot 200 and is used for remotely controlling the robot 200.
  • the robot operating device 300 may be arranged at a position where the user P who handles the robot operating device 300 can directly see the robot 200, or may be arranged at a position where the user P cannot directly see the robot 200.
  • the robot operating device 300 may be arranged in a space isolated from the space in which the robot 200 is arranged, or in a space at a position away from the space.
  • the robot operating device 300 receives inputs such as various commands, information, and data, and outputs them to the robot control device 400.
  • the robot operating device 300 can accept an input by the user P.
  • the robot operating device 300 is connected to another device and can receive input from the device.
  • the robot operating device 300 may include known input means such as a lever, a button, a touch panel, a joystick, a motion capture, a camera, and a microphone.
  • the robot operating device 300 may include a teaching pendant, which is one of the teaching devices, a smart device such as a smartphone and a tablet, a personal computer, and a terminal device such as a dedicated terminal device.
  • the robot operating device 300 may include a master machine.
  • the master machine may be configured to perform movements that are the same as or similar to those of the robot arm 210.
  • the robot control device 400 controls the operation of the robot 200.
  • the robot control device 400 is connected to the robot 200 and the robot operation device 300 via wired communication or wireless communication. Any wired communication or wireless communication may be used.
  • the robot control device 400 processes commands, information, data, and the like input via the robot operation device 300.
  • the robot control device 400 may be connected to an external device and may be configured to receive and process inputs such as commands, information, and data from the device.
  • the robot control device 400 controls the operation of the robot 200 according to the above commands, information, data, and the like.
  • the robot control device 400 controls the supply of power and the like to the robot 200.
  • the robot control device 400 manages information and the like for managing the robot 200.
  • the robot control device 400 outputs various commands, information, data, and the like to the robot operation device 300 and / or the display device 140 of the image pickup system 100.
  • the robot control device 400 causes the display device 140 to visually and / or audibly present various commands, information, data, and the like.
  • the robot control device 400 may output an image for operating the robot 200, an image showing the state of the robot 200, an image for managing the robot 200, and the like.
  • the robot control device 400 includes a computer. The robot control device 400 may further include an electric circuit for controlling the electric power supplied to the robot 200, equipment for controlling power other than electric power supplied to the robot 200, such as air pressure and hydraulic pressure, and equipment for controlling substances supplied to the robot 200, such as cooling water and paint. The devices other than the computer may be provided separately from the robot control device 400.
  • the image pickup device 110 of the image pickup system 100 includes a camera that captures a still image and / or a moving image of a digital image.
  • the camera may be a three-dimensional camera capable of capturing a three-dimensional image including the position information of the subject in the image.
  • the mobile device 120 is equipped with an image pickup device 110 and is configured so that the position and orientation of the image pickup device 110 can be freely changed.
  • the position of the image pickup device 110 may be the three-dimensional position of the image pickup device 110 in three-dimensional space.
  • the orientation of the image pickup device 110 may be the orientation of the center of the optical axis of the camera of the image pickup device 110, specifically, the three-dimensional direction of the center of the optical axis in three-dimensional space.
  • the orientation of the image pickup device 110 may correspond to the posture of the image pickup device 110.
  • the moving device 120 is not particularly limited, but in the present exemplary embodiment, it is a robot arm similar to the robot arm 210 and is fixed on the support surface.
  • in the present exemplary embodiment, the support surface is a ceiling surface, and the robot arm of the moving device 120 is suspended from the ceiling surface; however, the position and orientation of the support surface are not limited.
  • the image pickup device 110 is attached to the tip of the robot arm.
  • the moving device 120 may be a device other than a robot arm, for example, a traveling device capable of traveling on a support surface such as a floor surface, a track traveling device capable of traveling on a track, a crane movable on a track, a crane provided with an arm, an articulated arm other than a robot arm, or another device such as an unmanned aerial vehicle such as a drone.
  • the track of the track traveling device may be arranged so as to extend in the vertical direction, the horizontal direction, or a direction intersecting them, so that the track traveling device can travel in various directions and positions.
  • the image pickup device 110 may be attached to a crane hook.
  • the mobile device 120 may include a pan head (also referred to as a "gimbal") to which the image pickup device 110 is attached, and the orientation of the image pickup device 110 may be freely changed by operating the pan head.
  • a pan head also referred to as a “gimbal”
  • the motion detection device 130 is an example of the detection device, and detects the motion of the head H of the user P who operates the robot operation device 300.
  • the motion detecting device 130 is not particularly limited, but includes at least one infrared sensor 131 and at least one infrared marker 132 attached to the head H of the user P in this exemplary embodiment.
  • a plurality of infrared sensors 131, specifically, three infrared sensors 131, are arranged around the user P, facing the user P.
  • the three infrared sensors 131 are arranged at positions away from the head H of the user P.
  • a plurality of infrared markers 132, specifically, four infrared markers 132, are arranged at different positions on the head H.
  • the head includes a portion of the human body above the neck, and may include, for example, the face, the crown, the temporal region, the occipital region, and the like.
  • the infrared marker 132 emits infrared light.
  • the infrared marker 132 may be a light emitter, such as an infrared LED (Light Emitting Diode), that emits infrared light by itself, may be a reflector that reflects irradiated infrared light, or may include both a light emitter and a reflector.
  • the infrared sensor 131 receives infrared light and can detect the direction, intensity, intensity distribution, etc. of the received infrared light.
  • the infrared sensor 131 may be configured only to receive infrared light, or may be configured to emit infrared light by itself and to receive infrared light such as the reflected light of that emission. In the latter case, the infrared sensor 131 may be an infrared camera. By detecting the infrared light from the four infrared markers 132 using the three infrared sensors 131, the position and posture of the head H can be detected with high accuracy. Although not limited to the following, the position of the head H may be a three-dimensional position in three-dimensional space, such as that of a predetermined reference point of the head H.
  • the posture of the head H may be the posture of a predetermined part, surface, or axis of the head H, such as the front portion of the head H, a plane crossing the head H, or an axis passing from the jaw of the head H to the crown. Specifically, it may be the three-dimensional orientation of that predetermined part, surface, or axis in three-dimensional space.
  • conversely to the above, the infrared sensor 131 may be attached to the head H of the user P, and the infrared marker 132 may be arranged at a position away from the head H of the user P.
  • the positions and quantities of the infrared sensor 131 and the infrared marker 132 are not particularly limited as long as they can detect the position, posture, or both the position and the posture of the head H.
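  • The disclosure does not specify how the marker detections are turned into a head pose, but a straightforward reduction is possible: take the marker centroid as the head position and derive a heading from a pair of markers. A minimal sketch under assumed marker labels ("left" and "right" temple markers) and an assumed yaw convention, neither of which comes from the disclosure:

```python
import math

def head_pose_from_markers(markers):
    """Estimate the head position (marker centroid) and a yaw angle from
    labelled 3D marker coordinates.

    `markers` maps marker IDs to (x, y, z) points. The IDs "left" and
    "right" (temple markers) and the zero-yaw convention are illustrative
    assumptions, not taken from the disclosure.
    """
    pts = list(markers.values())
    n = len(pts)
    position = (sum(p[0] for p in pts) / n,
                sum(p[1] for p in pts) / n,
                sum(p[2] for p in pts) / n)
    lx, ly, _ = markers["left"]
    rx, ry, _ = markers["right"]
    # Heading of the left-to-right temple axis, rotated 90 degrees to
    # approximate the facing direction (the sign convention is assumed).
    yaw = math.atan2(ry - ly, rx - lx) - math.pi / 2
    return position, yaw
```

A production system would instead fit a full 6-DOF pose (e.g. by least squares against a reference marker layout), which also yields the rolling and pitching angles.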
  • the display device 140 perceptibly presents the image captured by the image pickup device 110 to the user P.
  • the display device 140 is arranged in the vicinity of the robot operating device 300, and is arranged at a position away from the image pickup device 110.
  • the display device 140 may present the command, information, data, and the like received from the robot control device 400 to the user P in a perceptible manner.
  • the display device 140 includes a display, such as a liquid crystal display (LCD) or an organic or inorganic electroluminescence (EL) display, and presents information visually.
  • the display device 140 may include an audio output device such as a speaker, and may make an auditory presentation.
  • the display device 140 may be configured to make a tactile presentation.
  • the display device 140 is a head-mounted display attached to the head H of the user P.
  • the head-mounted display has a goggle-like shape, and the lens portion of the head-mounted display forms a display surface on which an image is displayed.
  • the display device 140 can change the position and direction in which the display device 140 displays an image in accordance with the movement of the head H of the user P.
  • the display device 140 may be configured so as not to be attached to the head H of the user P.
  • in that case, the display device 140 may include a display drive device that can change the position of the display, the posture of the display, or both the position and posture of the display.
  • the configuration for moving the display and the configuration for changing the position and orientation of the display may be configured by a device as exemplified for the moving device 120.
  • the configuration for changing the posture of the display may be configured by a device such as a gimbal.
  • the image pickup input device 160 receives input and operations for operating the image pickup system 100 from the user P.
  • the image pickup input device 160 receives inputs such as various commands, information, and data, and outputs them to the image pickup control device 150.
  • the image pickup input device 160 may be arranged in the vicinity of the robot operation device 300 and may have a configuration similar to the configuration exemplified by the robot operation device 300.
  • the robot operation device 300 may include an image pickup input device 160 and also have a function of the image pickup input device 160.
  • FIG. 2 is a block diagram showing an example of the functional configuration of the imaging system 100 according to the exemplary embodiment.
  • the image pickup control device 150 is connected to the image pickup device 110, the mobile device 120, the motion detection device 130, the display device 140, and the image pickup input device 160 via wired communication or wireless communication. Any wired communication or wireless communication may be used.
  • the image pickup control device 150 includes a drive control device 151 that controls the drive of the image pickup device 110 and the mobile device 120, a detection control device 152 that controls the operation of the motion detection device 130, and a display control device 153 that controls the operation of the display device 140.
  • the detection control device 152 controls the drive of the three infrared sensors 131 and processes the results of the three infrared sensors 131 detecting the infrared light from the four infrared markers 132, thereby detecting the three-dimensional positions and postures of the four infrared markers 132. That is, the detection control device 152 detects the position and posture of the head H of the user P by detecting the three-dimensional positions and postures of the infrared markers 132.
  • the detection control device 152 is an example of a processing device.
  • each of the three infrared sensors 131 receives infrared light emitted from the four infrared markers 132.
  • the infrared light emitted from each infrared marker 132 is associated with identification information such as an ID set in the infrared marker 132. Therefore, each infrared sensor 131 can detect the direction, intensity, intensity distribution, and the like of infrared light of each of the four infrared markers 132.
  • the detection control device 152 uses the information on the three-dimensional position and orientation of each infrared sensor 131 and the results of the detection of the infrared light of the four infrared markers 132 by each infrared sensor 131 to detect the three-dimensional positions of the four infrared markers 132.
  • the detection control device 152 detects the three-dimensional positions of the four infrared markers 132 according to the three-dimensional coordinate system set in the space where the three infrared sensors 131 and the robot operating device 300 are arranged. Further, the detection control device 152 detects the three-dimensional position and posture of the head H of the user P by using the information on the three-dimensional positions of the four infrared markers 132. For example, the detection control device 152 expresses the posture using posture angles such as a rolling angle, a pitching angle, and a yawing angle.
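  • Posture angles such as the rolling, pitching, and yawing angles can be extracted from a 3x3 rotation matrix describing the head orientation. A sketch assuming the common Z-Y-X (yaw-pitch-roll) Euler convention; the disclosure does not fix a convention, so this choice is an assumption:

```python
import math

def rotation_to_rpy(R):
    """Extract (roll, pitch, yaw) in radians from a 3x3 rotation matrix R
    given as nested lists, assuming the Z-Y-X Euler convention and no
    gimbal lock (|R[2][0]| < 1)."""
    pitch = math.asin(-R[2][0])
    roll = math.atan2(R[2][1], R[2][2])
    yaw = math.atan2(R[1][0], R[0][0])
    return roll, pitch, yaw
```

Near gimbal lock (pitch close to ±90 degrees) the roll and yaw angles become coupled, which is why pose pipelines often carry quaternions internally and convert to angles only for display.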
  • the drive control device 151 controls the image pickup operation of the image pickup device 110. The drive control device 151 also controls the operation of the moving device 120 so as to change the position and posture of the image pickup device 110 according to the movement of the head H of the user P detected by the detection control device 152. Specifically, the drive control device 151 controls the operation of the moving device 120 so as to move the image pickup device 110 by change amounts of position and posture corresponding to the change amounts of the position and posture of the head H detected by the detection control device 152. For example, the drive control device 151 controls the position and orientation of the image pickup device 110 according to the three-dimensional coordinate system set in the space where the moving device 120 is arranged, and uses, for example, the three-dimensional coordinate system set on the moving device 120.
  • the relationship between the amount of change in the position and posture of the image pickup device 110 and the amount of change in the position and posture of the head H is arbitrary.
  • the amount of change in the position and posture of the image pickup device 110 may correspond one-to-one to the amount of change in the position and posture of the head H, or may correspond to a constant multiple of that amount.
  • for example, the amount of change in the posture of the image pickup device 110 may correspond one-to-one to the amount of change in the posture of the head H, while the amount of change in the position of the image pickup device 110 corresponds to a constant multiple of the amount of change in the position of the head H.
  • conversely, the amount of change in the position of the image pickup device 110 may correspond one-to-one to the amount of change in the position of the head H, while the amount of change in the posture of the image pickup device 110 corresponds to a constant multiple of the amount of change in the posture of the head H.
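  • These correspondences reduce to two independent gains, one for position and one for posture, where a gain of 1.0 is the one-to-one case and any other constant is the constant-multiple case. The function name and signature below are illustrative, not from the disclosure:

```python
def map_head_to_camera(d_pos, d_rpy, pos_gain=1.0, rot_gain=1.0):
    """Scale head change amounts into image pickup device change amounts.

    d_pos: (dx, dy, dz) change in head position.
    d_rpy: (droll, dpitch, dyaw) change in head posture.
    pos_gain/rot_gain of 1.0 gives the one-to-one correspondence; any
    other constant gives the constant-multiple correspondence.
    """
    return ([pos_gain * v for v in d_pos],
            [rot_gain * v for v in d_rpy])
```

For example, pos_gain > 1.0 lets a small head translation sweep the camera over a large workspace while keeping posture tracking one-to-one.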
  • the display control device 153 causes the display device 140 to display the image captured by the image pickup device 110.
  • the display control device 153 may control the operation of the display drive device. Further, the display control device 153 may perform position processing for changing the position where the image of the image pickup device 110 is displayed on the display screen of the display device 140.
  • in the present exemplary embodiment, the display device 140 is a head-mounted display. Therefore, the display control device 153 causes the display device 140 to display the image captured by the image pickup device 110 without controlling a display drive device or performing position processing. For example, the display control device 153 positions the center of the image captured by the image pickup device 110 at the center of the display screen of the display device 140. As a result, the user P can always see the image captured by the image pickup device 110 near the front of the head H.
  • when the display device 140 is not attached to the head H, the display control device 153 may control the operation of the display drive device so that the display moves following the movement of the head H of the user P. For example, the display control device 153 may perform this control so that the display screen of the display is located near the front of the head H and/or faces the front of the head H. Further, the display control device 153 may perform position processing of the image on the display screen of the display so that the center of the image captured by the image pickup device 110 moves following the movement of the head H of the user P. In this case, the center of the image captured by the image pickup device 110 and the center of the display screen do not always coincide.
  • the user P can see the image captured by the image pickup device 110 near the front of the head H.
  • the display control device 153 may, in accordance with a command of the user P or the like, perform image position processing so that the center of the image captured by the image pickup device 110 does not move on the display screen of the display following the movement of the head H of the user P. As a result, the user P can see, in front of the head H or the like, an image of the object captured from another direction by the image pickup device 110.
  • alternatively, the display control device 153 may perform image position processing so that the center of the image captured by the image pickup device 110 moves on the display screen of the display following the movement of the head H of the user P.
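  • The two position-processing modes above (image center following the head movement, or pinned to the screen center regardless of it) can be sketched as a pixel offset of the image center on the display screen. The function and the px_per_rad scale factor are illustrative assumptions:

```python
def image_center_offset(head_yaw, head_pitch, follow=True, px_per_rad=800.0):
    """Pixel offset (dx, dy) of the displayed image center on the screen.

    follow=True  -> the center moves following the head movement;
    follow=False -> the center stays pinned to the screen center, as in
    the user-commanded mode described above.
    px_per_rad is an assumed display scale factor (pixels per radian).
    """
    if not follow:
        return (0.0, 0.0)
    return (px_per_rad * head_yaw, -px_per_rad * head_pitch)
```

The display control device would add this offset to the screen-center coordinates before drawing the captured frame.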
  • the image pickup control device 150 described above includes a computer. The image pickup control device 150 may further include an electric circuit for controlling the electric power supplied to the image pickup device 110, the mobile device 120, the motion detection device 130, and the display device 140. The devices other than the computer may be provided separately from the image pickup control device 150.
  • the computers of the robot control device 400 and the image pickup control device 150 each include a circuit or processing circuit having a processor, a memory, and the like.
  • the circuit or processing circuit sends and receives commands, information, data, etc. to and from other devices.
  • the circuit or processing circuit inputs signals from various devices and outputs control signals to each controlled object.
  • the memory is composed of a storage device such as a semiconductor memory, for example a volatile memory or a non-volatile memory, a hard disk, or an SSD (Solid State Drive).
  • the memory stores a program executed by a circuit or a processing circuit, various data, and the like.
  • the functions of the circuit or processing circuit may be realized by a computer system consisting of a processor such as a CPU (Central Processing Unit), a volatile memory such as a RAM (Random Access Memory), and a non-volatile memory such as a ROM (Read-Only Memory).
  • the computer system may realize the function of the circuit or the processing circuit by the CPU using the RAM as a work area to execute the program recorded in the ROM.
  • a part or all of the functions of the circuit or processing circuit may be realized by the above-mentioned computer system, may be realized by a dedicated hardware circuit such as an electronic circuit or an integrated circuit, or may be realized by a combination of the computer system and such hardware circuits.
  • the robot control device 400 and the image pickup control device 150 may execute each process by centralized control by a single computer, or may execute each process by distributed control by cooperation of a plurality of computers.
  • each function of the robot control device 400 and the image pickup control device 150 may be realized by a microcontroller, an MPU (Micro Processing Unit), an LSI (Large Scale Integration), a system LSI, a PLC (Programmable Logic Controller), a logic circuit, or the like.
  • the plurality of functions of the robot control device 400 and the image pickup control device 150 may each be realized by an individual chip, or may be realized by a single chip that includes a part or all of them.
  • the circuit may be a general-purpose circuit or a dedicated circuit, respectively.
  • an FPGA (Field Programmable Gate Array) that can be programmed after LSI manufacturing, a reconfigurable processor whose internal circuit cell connections and/or settings can be reconfigured, or an ASIC (Application Specific Integrated Circuit) in which circuits having a plurality of functions are integrated into one for a specific application may be used.
  • the image pickup control device 150 of the image pickup system 100 includes a drive control device 151, a detection control device 152, a display control device 153, and a storage unit 154.
  • the drive control device 151 includes an image pickup control unit 1511 and a first movement control unit 1512 as functional components.
  • the detection control device 152 includes a device control unit 1521 and a detection processing unit 1522 as functional components.
  • the display control device 153 includes a display control unit 1531, a second movement control unit 1532, and an image processing unit 1533 as functional components.
  • the function of the storage unit 154 is realized by the memory of the computer of the image pickup control device 150 or the like.
  • the functions of the functional components of the image pickup control device 150 other than the storage unit 154 are realized by a computer processor or the like.
  • the storage unit 154 stores various information and enables reading of the stored information.
  • the storage unit 154 may store a program, various data, and the like.
  • the storage unit 154 may store programs, data, information, and the like for operating each device of the image pickup system 100.
  • the storage unit 154 stores the coordinate system set in each device of the image pickup system 100.
  • the coordinate systems may include a three-dimensional coordinate system set in the space where the infrared sensors 131 and the robot operating device 300 are arranged (hereinafter also referred to as the "first coordinate system"), a three-dimensional coordinate system set in the space where the moving device 120 is arranged (hereinafter also referred to as the "second coordinate system"), and a three-dimensional coordinate system set on the moving device 120 (hereinafter also referred to as the "third coordinate system").
  • the storage unit 154 stores information on the position and orientation of each infrared sensor 131, for example, the information in the first coordinate system.
  • the storage unit 154 may store the identification information of each infrared marker 132 and the information of the characteristics of the infrared light emitted from each infrared marker 132 in association with each other.
  • the storage unit 154 may store information on the position and posture of each infrared marker 132 on the head H of the user P.
  • the storage unit 154 stores information on the position and orientation of the moving device 120, for example, the information in the second coordinate system.
  • the storage unit 154 stores information on the position and orientation of the image pickup device 110 on the mobile device 120, for example, the information in the third coordinate system.
  • the storage unit 154 may store the relationship of various parameters for moving each device according to the movement of the head H of the user P.
  • the relationships between the parameters may include a first relationship between the amount of change in the position and posture of the head H and the amount of change in the position and posture of each device, a second relationship between the amount of change in the position and posture of the head H and the amount of change in the position and posture of the display of the display device 140, and a third relationship between the amount of change in the position and posture of the head H and the amount of change in the position of the reference point of the image on the display screen of the display device 140.
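As an illustration of how such parameter relationships might be held, the following sketch stores the first and second relationships as simple per-axis gains. The class name, the gain values, and the linear form of the mapping are assumptions for illustration, not taken from this disclosure.

```python
from dataclasses import dataclass

# Hypothetical container for the first and second relationships; a pose is
# a 6-tuple (x, y, z, roll, pitch, yaw) of change amounts.
@dataclass
class FollowRelations:
    camera_gain: float = 1.0    # first relationship: head change -> camera change
    display_gain: float = 0.5   # second relationship: head change -> display change

    def camera_change(self, head_change):
        """Amount to move the image pickup device for a given head change."""
        return tuple(self.camera_gain * d for d in head_change)

    def display_change(self, head_change):
        """Amount to move the display for a given head change."""
        return tuple(self.display_gain * d for d in head_change)

relations = FollowRelations()
print(relations.camera_change((0.2, 0.0, 0.0, 0.0, 0.0, 0.1)))
```

In a real system each relationship would likely be a richer mapping (per-axis gains, limits, or filtering); the scalar gain keeps the sketch minimal.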
  • the image pickup control unit 1511 controls the drive of the image pickup device 110.
  • the image pickup control unit 1511 controls the execution and stop of the image pickup operation of the image pickup device 110, and the zoom-up and zoom-back operations of the image pickup device 110.
  • the image pickup control unit 1511 may be configured to receive information, commands, and the like from the robot operation device 300, and may control the operation of the image pickup device 110 according to the commands and the like received from the robot operation device 300.
  • the first movement control unit 1512 controls the drive of the moving device 120. For example, when the first movement control unit 1512 receives operation information from the image pickup input device 160, it generates an operation command corresponding to the operation information and outputs the command to the moving device 120. As a result, the moving device 120 performs the operation corresponding to the operation information.
  • the operation information is information indicating the content of the operation input to the image pickup input device 160 by the user P in order to operate the mobile device 120.
  • when the first movement control unit 1512 receives the amount of change in the position and posture of the head H of the user P from the detection processing unit 1522 of the detection control device 152, it reads out the first relationship from the storage unit 154. Based on the amount of change in the position and posture of the head H and the first relationship, the first movement control unit 1512 determines the amount of change in the position and posture of the image pickup device 110 for moving the image pickup device 110 to follow the movement of the head H.
  • the fluctuation amount of the position and the posture can be expressed by the second coordinate system.
  • the first movement control unit 1512 generates a command for the operation of the moving device 120 for moving the image pickup device 110 by the determined position and posture fluctuation amount, and outputs the command to the moving device 120.
  • the moving device 120 moves the image pickup device 110 so as to follow the movement of the head H.
  • the horizontal and vertical movements of the head H are associated with the horizontal and vertical movements of the image pickup apparatus 110, respectively.
  • the movement of the head H in the rolling direction, pitching direction, and yawing direction is associated with the movement of the image pickup apparatus 110 in the rolling direction, pitching direction, and yawing direction, respectively.
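The axis-by-axis association described in the two bullets above can be sketched as a direct one-to-one mapping; the dictionary-based form and the identity pairing of axes are illustrative assumptions.

```python
# Illustrative one-to-one association between head axes and camera axes:
# translations map to translations, rotations to the same rotation axis.
HEAD_TO_CAMERA = {
    "x": "x", "y": "y", "z": "z",                    # horizontal / vertical movement
    "roll": "roll", "pitch": "pitch", "yaw": "yaw",  # rotational movement
}

def to_camera_motion(head_motion):
    """Map each component of the head's motion onto the camera axis it drives."""
    return {HEAD_TO_CAMERA[axis]: amount for axis, amount in head_motion.items()}

print(to_camera_motion({"yaw": 0.2, "z": -0.05}))
```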
  • the device control unit 1521 controls the drive of each infrared sensor 131 of the motion detection device 130.
  • the device control unit 1521 may control execution and stop of irradiation of infrared light of each infrared sensor 131.
  • the apparatus control unit 1521 may control operations such as execution and stop of irradiation of infrared light of each infrared marker 132.
  • the detection processing unit 1522 processes the detection result of the infrared light from the infrared marker 132 in each infrared sensor 131, and detects the position and posture of the head H of the user P.
  • the detection processing unit 1522 is an example of a detection device.
  • the detection processing unit 1522 reads out the identification information of each infrared marker 132 and the characteristic information of its infrared light from the storage unit 154, and associates the infrared light detected by each infrared sensor 131 with the corresponding infrared marker 132.
  • the detection processing unit 1522 reads out the position and orientation information of each infrared sensor 131 from the storage unit 154, and uses that information together with each infrared sensor 131's detection result for each infrared marker 132 to detect the three-dimensional position of each infrared marker 132.
  • the detection processing unit 1522 detects the three-dimensional position and posture of the head H of the user P from the three-dimensional positions of the four infrared markers 132.
  • the position and orientation of the head H can be represented in the first coordinate system.
  • the detection processing unit 1522 detects the position and posture of the head H over time, and outputs the fluctuation amount of the position and posture of the head H to the first movement control unit 1512.
  • the amount of change in the position and posture of the head H may include at least one selected from the group consisting of: the amount of change between the position and posture of the head H before and after the change; the position and posture of the head H before the change and the position and posture of the head H after the change; the speed of change of the position and posture of the head H toward the position and posture after the change; and the acceleration of the position and posture of the head H toward the position and posture after the change.
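A minimal sketch of computing such a fluctuation amount from two sampled poses follows. The tuple layout and the finite-difference speed are assumptions; acceleration would require a third sample and is omitted.

```python
def head_fluctuation(pose_before, pose_after, dt):
    """Each pose is (x, y, z, roll, pitch, yaw); dt is the sampling interval in seconds."""
    delta = tuple(a - b for a, b in zip(pose_after, pose_before))
    velocity = tuple(d / dt for d in delta)  # finite-difference change speed
    return {"before": pose_before, "after": pose_after,
            "delta": delta, "velocity": velocity}

f = head_fluctuation((0.0,) * 6, (0.1, 0.0, 0.0, 0.0, 0.0, 0.2), 0.05)
print(f["delta"], f["velocity"])
```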
  • the display control unit 1531 acquires the image data captured by the image pickup device 110 from the image pickup device 110, outputs the image data to the display device 140, and causes the display device 140 to display the image corresponding to the image data.
  • the display control unit 1531 may perform image processing on the image data acquired from the image pickup device 110 and output the image data after the image processing to the display device 140.
  • the second movement control unit 1532 controls the operation of the display drive device when the display device 140 includes the display drive device.
  • the second movement control unit 1532 acquires information on the amount of change in the position and posture of the head H of the user P from the detection processing unit 1522. Further, the second movement control unit 1532 reads the second relationship from the storage unit 154.
  • the second movement control unit 1532 determines, based on the amount of change in the position and posture of the head H and the second relationship, the amount of change in the position and posture of the display for moving the display to follow the movement of the head H.
  • the amount of change in the position and posture can be represented by the first coordinate system.
  • the second movement control unit 1532 generates a command for operating the display drive device for moving the display by the determined position and posture fluctuation amount, and outputs the command to the display drive device.
  • the display drive device moves the display following the movement of the head H so that the display is located near the front of the head H and / or faces the head H.
  • the second movement control unit 1532 may be omitted.
  • the image processing unit 1533 controls the position at which the image captured by the image pickup device 110 is displayed on the display screen of the display device 140. For example, when the display device 140 includes one display, the image processing unit 1533 may move the reference point of the image captured by the image pickup device 110 on the display screen of that display so as to follow the movement of the head H of the user P. When the display device 140 includes a plurality of displays arranged so that the orientations of their display screens differ, the image processing unit 1533 may move the reference point of the image captured by the image pickup device 110 across the display screens of the plurality of displays so as to follow the movement of the head H of the user P.
  • the image processing unit 1533 acquires information on the amount of change in the position and posture of the head H of the user P from the detection processing unit 1522, and reads out the third relationship from the storage unit 154. Based on the amount of change in the position and posture of the head H and the third relationship, the image processing unit 1533 determines the amount of change in the position of the reference point on the display screen that follows the movement of the head H. The image processing unit 1533 then generates a command for moving the reference point by the determined amount of change in position, and outputs the command to the display control unit 1531. The display control unit 1531 displays the image on the display of the display device 140 so that the reference point of the image captured by the image pickup device 110 moves following the movement of the head H.
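One plausible form of the third relationship is a linear gain from head rotation to on-screen pixel shift of the reference point. The gain value and the sign conventions below are assumptions for illustration.

```python
PIXELS_PER_RADIAN = 800.0  # assumed gain of the third relationship

def reference_point_shift(yaw_change, pitch_change):
    """Return the (dx, dy) pixel shift of the reference point on the display screen.

    Yaw moves the point horizontally; pitch moves it vertically
    (negated because screen y typically grows downward).
    """
    return (yaw_change * PIXELS_PER_RADIAN, -pitch_change * PIXELS_PER_RADIAN)

print(reference_point_shift(0.1, 0.0))  # a 0.1 rad head turn shifts the point 80 px
```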
  • the user P can visually recognize the image captured by the image pickup device 110 at a position close to the front of the head H.
  • when not following the movement of the head H, the image processing unit 1533 maintains the position of the reference point of the image captured by the image pickup device 110 on the display screen of the display regardless of the movement of the head H of the user P.
  • FIG. 3 is a flowchart showing an example of the operation of the imaging system 100 according to the exemplary embodiment.
  • in the following, it is assumed that the user P wears a head-mounted display on the head H as the display device 140.
  • in step S101, the image pickup control device 150 operates in the initial setting mode, in which the initial position and initial posture of the image pickup device 110 and of the head H of the user P are determined. For example, the image pickup control device 150 starts the initial setting mode according to an activation command input to the image pickup input device 160 by the user P.
  • in step S102, the image pickup control device 150 determines the initial position and initial posture of the image pickup device 110. Specifically, the user P operates the moving device 120 via the image pickup input device 160 while visually checking the image captured by the image pickup device 110 on the display device 140, thereby changing the position and posture of the image pickup device 110. When the desired image appears on the display device 140, the user P inputs to the image pickup input device 160 a command to set the current position and posture of the image pickup device 110 as its initial position and initial posture. The image pickup control device 150 determines the commanded position and posture of the image pickup device 110 as the initial position and initial posture of the image pickup device 110.
  • in step S103, the image pickup control device 150 determines the initial position and initial posture of the head H of the user P. Specifically, when the position and posture of the head H have become the desired position and posture, the user P inputs to the image pickup input device 160 a command for determining the initial position and initial posture of the head H.
  • the image pickup control device 150 causes the three infrared sensors 131 of the motion detection device 130 to detect infrared light, processes the detection result of each infrared sensor 131, and detects the position and posture of the head H.
  • the image pickup control device 150 determines the detected position and posture of the head H as the initial position and initial posture of the head H.
  • in step S104, the image pickup control device 150 ends the initial setting mode and starts operation in the normal operation mode.
  • in step S105, the image pickup control device 150 causes the image pickup device 110 to start the image pickup operation.
  • the image pickup device 110 continuously captures a moving image and displays it on the display device 140.
  • in step S106, the image pickup control device 150 causes the three infrared sensors 131 to continuously detect the infrared light from the infrared markers 132 on the head H.
  • in step S107, the image pickup control device 150 processes the detection results of the infrared sensors 131 and detects the position and posture of the head H of the user P with respect to the initial position and initial posture.
  • the image pickup control device 150 detects the position and posture of the head H at predetermined time intervals, thereby detecting the amount of change in the position and posture of the head H at predetermined time intervals.
  • in step S108, the image pickup control device 150 determines the target position and posture of the image pickup device 110 with respect to its initial position and initial posture, based on the first relationship stored in the storage unit 154 and the position and posture of the head H with respect to the initial position and initial posture.
  • the image pickup control device 150 determines the target position and posture of the image pickup device 110 at predetermined time intervals, thereby determining the amount of change in the position and posture of the image pickup device 110 at predetermined time intervals.
  • in step S109, the image pickup control device 150 generates an operation command corresponding to the target position and posture of the image pickup device 110 and outputs the operation command to the moving device 120.
  • the image pickup control device 150 generates an operation command for operating the moving device 120 so that the position and posture of the image pickup device 110 satisfy the target position and posture.
  • in step S110, the moving device 120 operates according to the operation command and moves the image pickup device 110 to the target position and posture.
  • in step S111, the image pickup control device 150 determines whether or not a command to end the operation of the image pickup system 100 has been input to the image pickup input device 160 by the user P. If it has been input (Yes in step S111), the image pickup control device 150 ends the processing; if it has not been input (No in step S111), the process returns to step S106.
  • as described above, the image pickup control device 150 can change the position and posture of the image pickup device 110 so as to follow the amount of change in the position and posture of the head H, based on the relationship between the initial position and initial posture of the head H of the user P and the initial position and initial posture of the image pickup device 110.
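The normal-operation loop of steps S106 through S111 can be condensed into the following sketch. The callback-based structure and all names are assumed for illustration, not taken from this disclosure.

```python
def follow_loop(detect_head_pose, first_relation, move_camera, end_requested):
    """Sketch of steps S106-S111: track the head and drive the camera to follow it."""
    initial = detect_head_pose()  # initial position/posture fixed in the setup mode
    while not end_requested():                                   # S111
        pose = detect_head_pose()                                # S106-S107
        change = tuple(p - p0 for p, p0 in zip(pose, initial))
        target = first_relation(change)                          # S108
        move_camera(target)                                      # S109-S110

# Tiny simulated run: one detection cycle, then an end command.
commands = []
poses = iter([(0.0,) * 6, (0.1, 0.0, 0.0, 0.0, 0.0, 0.0)])
ends = iter([False, True])
follow_loop(lambda: next(poses), lambda c: c, commands.append, lambda: next(ends))
print(commands)  # the single camera command issued before the end command
```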
  • Modification 1 of the exemplary embodiment differs from the exemplary embodiment in that the display device 140A comprises one display 141 and a display drive device 142 for moving the display 141.
  • the image pickup control device 150 controls the operation of the display drive device 142 to move the position and orientation of the display 141 in accordance with the movement of the head H of the user P.
  • modification 1 will be described mainly with respect to the points that differ from the exemplary embodiment, and description of the points common to the exemplary embodiment will be omitted as appropriate.
  • FIG. 4 is a side view showing an example of the configuration of the display device 140A according to the modified example 1 of the exemplary embodiment.
  • the display drive device 142 is configured to support the display 141 and to freely change the position and orientation of the display 141.
  • the display drive device 142 is a robot arm having a plurality of joints. The base of the robot arm is fixed to a support surface or the like, and a display 141 is attached to the tip of the robot arm.
  • the display drive device 142 can arbitrarily change the position and orientation of the display 141 in the three-dimensional direction.
  • the second movement control unit 1532 of the display control device 153 of the image pickup control device 150 controls the operation of the display drive device 142 so that the position and posture of the display 141 follow the movement of the head H of the user P. For example, when the head H turns upward, the second movement control unit 1532 moves the display 141 upward and turns it downward toward the head H; when the head H turns to the left, it moves the display 141 to the left of the head H and turns it to the right. Further, the second movement control unit 1532 controls the operation of the display drive device 142 so as to change the position and posture of the display 141 according to operations made via the image pickup input device 160.
  • FIG. 5 is a flowchart showing an example of the operation of the image pickup system 100 according to the first modification.
  • steps S201 to S203 are the same as those of steps S101 to S103 in the exemplary embodiment, respectively.
  • in step S204, the image pickup control device 150 determines the initial position and initial posture of the display 141.
  • the user P operates the image pickup input device 160 to operate the display drive device 142, and changes the position and posture of the display 141 to a desired position and posture.
  • the user P inputs to the image pickup input device 160 a command to set the current position and posture of the display 141 as the initial position and initial posture of the display 141.
  • the image pickup control device 150 determines the commanded position and orientation of the display 141 as the initial position and initial posture of the display 141.
  • steps S205 to S211 are the same as those in steps S104 to S110 in the exemplary embodiment, respectively.
  • in step S212, the image pickup control device 150 determines the target position and posture of the display 141 with respect to its initial position and initial posture, based on the second relationship stored in the storage unit 154 and the position and posture of the head H with respect to the initial position and initial posture.
  • the image pickup control device 150 determines the target position and orientation of the display 141 at predetermined time intervals, thereby determining the amount of change in the position and orientation of the display 141 at predetermined time intervals.
  • in step S213, the image pickup control device 150 generates an operation command corresponding to the target position and posture of the display 141 and outputs the operation command to the display drive device 142.
  • the image pickup control device 150 generates an operation command for operating the display drive device 142 so that the position and posture of the display 141 satisfy the target position and posture.
  • in step S214, the display drive device 142 operates according to the operation command and moves the display 141 to the target position and posture.
  • in step S215, the image pickup control device 150 determines whether or not a command to end the operation of the image pickup system 100 has been input to the image pickup input device 160 by the user P. If it has been input (Yes in step S215), the image pickup control device 150 ends the processing; if it has not been input (No in step S215), the process returns to step S207.
  • as described above, the image pickup control device 150 can change the positions and postures of the image pickup device 110 and the display 141 so as to follow the amount of change in the position and posture of the head H, based on the relationships among the initial position and initial posture of the head H of the user P, the initial position and initial posture of the image pickup device 110, and the initial position and initial posture of the display 141.
  • the image pickup control device 150 may execute the processes of steps S209 to S211 and the processes of steps S212 to S214 in parallel, or may execute them in the reverse order.
  • in modification 1, the display drive device 142 is configured to move both the position and the orientation of the display 141, but it may instead be configured to move only the position of the display 141 or only the orientation of the display 141.
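A sketch of the target-pose computation of step S212, under assumed names: the display's target pose is its initial pose offset by the head's pose change scaled through the second relationship, here reduced to a single scalar gain.

```python
def display_target_pose(display_initial, head_change, second_gain=1.0):
    """Target position/posture of the display 141 following the head's change.

    Poses are (x, y, z, roll, pitch, yaw); second_gain is an assumed scalar
    form of the second relationship.
    """
    return tuple(d0 + second_gain * dh for d0, dh in zip(display_initial, head_change))

# The display starts 0.5 m forward and 1.2 m up; the head yaws by 0.3 rad.
print(display_target_pose((0.5, 0.0, 1.2, 0.0, 0.0, 0.0),
                          (0.0, 0.0, 0.0, 0.0, 0.0, 0.3)))
```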
  • modification 2 of the exemplary embodiment differs from the exemplary embodiment in that the display device 140B includes one display 143 having a display surface 143a that includes a curved surface surrounding a part of the periphery of the user P.
  • the image pickup control device 150 moves the reference point of the image captured by the image pickup device 110 on the screen of the display surface 143a so as to follow the movement of the head H of the user P, thereby varying the display position and display direction of the image.
  • modification 2 will be described mainly with respect to the points that differ from the exemplary embodiment and modification 1, and description of the points common to the exemplary embodiment and modification 1 will be omitted as appropriate.
  • FIG. 6 is a side view showing an example of the configuration of the display device 140B according to the modified example 2 of the exemplary embodiment.
  • the display surface 143a of the display 143 surrounds the user P horizontally from both sides to the front of the user P and vertically from above and below to the front of the user P.
  • Such a display surface 143a surrounds the user P in the horizontal direction, the vertical direction, and the direction intersecting the horizontal direction and the vertical direction.
  • the display surface 143a has a curved surface shape similar to, for example, a part of a spherical surface or an ellipsoidal surface.
  • the display surface 143a may have a shape that surrounds a part of the periphery of the user P, and may have a shape that surrounds the entire circumference of the user P, for example.
  • the shape of the display surface 143a is not limited to a curved surface shape, and may be any shape that includes curving, bending, or both curving and bending.
  • the shape of the display surface 143a may be the same as at least a part of a cylindrical surface or of the surface of a polyhedron.
  • the "cylindrical surface” has a cross-sectional shape perpendicular to the axis of a circle, an ellipse, a shape close to a circle, a shape close to an ellipse, or two of these. It may include the surface of a columnar body, which is the above combination.
  • the image processing unit 1533 and the display control unit 1531 of the display control device 153 of the image pickup control device 150 vary the position of the reference point Pf of the image captured by the image pickup device 110 on the screen of the display surface 143a so as to follow the movement of the head H of the user P.
  • for example, the image processing unit 1533 and the display control unit 1531 move the reference point Pf upward on the screen of the display surface 143a when the head H turns upward, and move the reference point Pf to the left with respect to the head H on the screen of the display surface 143a when the head H turns to the left.
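For a display surface 143a shaped like part of a sphere or cylinder, a head rotation can be converted into an on-screen shift of Pf as an arc length. The radius value and the cylindrical approximation below are assumptions for illustration.

```python
import math

SCREEN_RADIUS_M = 1.5  # assumed distance from the user P to the display surface 143a

def arc_shift(head_rotation_rad):
    """Shift of the reference point Pf along the curved screen, in metres."""
    return SCREEN_RADIUS_M * head_rotation_rad

# A 30-degree head turn moves Pf about 0.785 m along the screen.
print(arc_shift(math.radians(30)))
```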
  • FIG. 7 is a flowchart showing an example of the operation of the image pickup system 100 according to the second modification.
  • steps S301 to S303 are the same as those of steps S101 to S103 in the exemplary embodiment, respectively.
  • in step S304, the image pickup control device 150 determines the initial position, on the screen of the display surface 143a of the display 143, of the reference point Pf of the image captured by the image pickup device 110.
  • the reference point Pf is the center of the image.
  • the image pickup control device 150 determines the position of the reference point Pfa of the image captured by the image pickup device 110 when the initial position and the initial posture of the image pickup device 110 are determined in step S302 as the initial position.
  • steps S305 to S311 are the same as in steps S104 to S110 in the exemplary embodiment, respectively.
  • in step S312, the image pickup control device 150 determines the target position of the target reference point Pft on the screen of the display surface 143a, based on the third relationship stored in the storage unit 154 and the position and posture of the head H with respect to the initial position and initial posture.
  • the target reference point Pft is a reference point of the movement destination that follows the fluctuation of the position and posture of the head H.
  • the image pickup control device 150 determines the target position of the target reference point Pft at predetermined time intervals, thereby determining the amount of change in the position of the reference point Pf at predetermined time intervals.
  • in step S313, the image pickup control device 150 processes the image so that the position of the reference point Pf of the image captured by the image pickup device 110 coincides with the target position of the target reference point Pft on the screen of the display surface 143a, and outputs the processed image to the display 143. That is, the image pickup control device 150 executes image processing in accordance with the target reference point Pft.
  • in step S314, the display 143 displays the processed image on the screen of the display surface 143a.
  • step S315 is the same as that of step S215 in the first modification.
  • as described above, the image pickup control device 150 can vary the position and posture of the image pickup device 110 and the display position and display direction of the image so as to follow the amount of change in the position and posture of the head H, based on the relationships among the initial position and initial posture of the head H of the user P, the initial position and initial posture of the image pickup device 110, and the initial position, on the display surface 143a of the display 143, of the reference point Pf of the image captured by the image pickup device 110.
  • the image pickup control device 150 may execute the processes of steps S309 to S311 and the processes of steps S312 to S314 in parallel, or may execute them in the reverse order.
  • the image pickup control device 150 may be configured to stop or cancel, upon receiving a command from the user P via the image pickup input device 160 or the like, the control that changes the position of the reference point Pf of the image captured by the image pickup device 110 so as to follow the amount of change in the position and posture of the head H. While this control is stopped, the position and posture of the image pickup device 110 change so as to follow the amount of change in the position and posture of the head H, but the position of the reference point Pf of the image on the display surface 143a does not change.
  • this allows the user P to view, on the display surface 143a, an image of a subject such as the object W captured from another direction. For example, when the user P changes the position of the reference point Pf of the image by moving the head H and then commands the image pickup input device 160 to stop the following, the user P can view the image projected at a place other than the front of the head H.
  • the display 143 may be movable in the same manner as in modification 1.
  • the image pickup control device 150 may combine the processing in modification 2 with the processing in modification 1 to vary the display position and orientation of the image so as to follow the movement of the head H of the user P.
  • modification 3 of the exemplary embodiment differs from modification 2 in that the display device 140C includes a plurality of displays 141 arranged so as to surround a part of the periphery of the user P.
  • the plurality of displays 141 are arranged so that the positions and orientations of the respective display surfaces 141a are different.
  • the image pickup control device 150 causes each of the plurality of displays 141 to display a part of one image captured by the image pickup device 110, that is, causes the plurality of displays 141 to display the one image as a whole.
  • the image pickup control device 150 moves the reference point of the image captured by the image pickup device 110 over the screens of the display surfaces 141a of the plurality of displays 141 so as to follow the movement of the head H of the user P, thereby varying the display position and display direction of the image.
  • modification 3 will be described mainly with respect to the points that differ from the exemplary embodiment and modifications 1 and 2, and description of the points common to them will be omitted as appropriate.
  • FIG. 8 is a side view showing an example of the configuration of the display device 140C according to the modified example 3 of the exemplary embodiment.
  • the plurality of displays 141 are arranged so as to surround a part of the periphery of the user P, horizontally from both sides of the user P to the front, and vertically from above and below to the front of the user P.
  • the plurality of displays 141 are arranged so that the display surface 141a forms a plurality of horizontal rows and a plurality of vertical columns.
  • the plurality of display surfaces 141a surround the user P in the horizontal direction, the vertical direction, and the directions intersecting the horizontal direction and the vertical direction.
  • the plurality of displays 141 are arranged so that their respective display surfaces 141a are arranged along a spherical surface or an ellipsoidal surface and are adjacent to each other. Each display surface 141a is directed towards the center or focal point of a spherical or ellipsoidal surface.
  • the plurality of displays 141 may be arranged so as to surround the entire periphery of the user P, may be arranged so as to surround the user P in the horizontal direction, or may be arranged so as to surround the user P in the vertical direction.
  • the plurality of displays 141 may be arranged in a cross-shaped array extending horizontally and vertically, a cylindrical array extending and curving in the horizontal direction, a cylindrical array extending and curving in the vertical direction, or the like.
  • the plurality of displays 141 are arranged adjacent to each other, but may be arranged at a distance from each other.
  • the image processing unit 1533 and the display control unit 1531 of the display control device 153 of the image pickup control device 150 vary the position of the reference point Pf of the image captured by the image pickup device 110 over the screens of the plurality of display surfaces 141a so as to follow the movement of the head H of the user P. For example, the image processing unit 1533 and the display control unit 1531 move the reference point Pf upward over the screens of the plurality of display surfaces 141a when the head H turns upward, and move the reference point Pf to the left with respect to the head H over the screens of the plurality of display surfaces 141a when the head H turns to the left.
  • FIG. 9 is a flowchart showing an example of the operation of the image pickup system 100 according to the modified example 3.
  • steps S401 to S403 are the same as those of steps S301 to S303 in the second modification, respectively.
  • In step S404, the image pickup control device 150 determines the initial position of the reference point Pf of the image captured by the image pickup device 110. Specifically, the image pickup control device 150 determines, as the initial position, the position of the reference point Pfa of the image captured by the image pickup device 110 at the time when the initial position and initial posture of the image pickup device 110 are determined in step S402. Further, the image pickup control device 150 determines the display 141 that displays the reference point Pfa at the initial position and the position of the reference point Pfa on the screen of the display surface 141a of that display 141.
  • steps S405 to S411 are the same as those in steps S305 to S311 in the second modification, respectively.
  • In step S412, the image pickup control device 150 determines, based on the third relationship stored in the storage unit 154 and on the position and posture of the head H with respect to the initial position and the initial posture, the display 141 that is to display the target reference point Pft and the target position of the target reference point Pft on the screen of the display surface 141a of that display 141. The image pickup control device 150 executes this determination at predetermined time intervals.
  • In step S413, the image pickup control device 150 processes the image so that the position of the reference point Pf of the image captured by the image pickup device 110 matches the target position of the target reference point Pft on the screen of the display surface 141a of the determined display 141, and outputs the processed image to each display 141.
  • In step S414, the plurality of displays 141 display the processed image on the screens of their display surfaces 141a as a whole.
  • Step S415 is the same as step S315 in modification 2.
  • As described above, the image pickup control device 150 can change the position and posture of the image pickup device 110 and the display position and display direction of the image so as to follow the fluctuation amount of the position and posture of the head H, based on the relationship among the initial position and initial posture of the head H of the user P, the initial position and initial posture of the image pickup device 110, and the initial position of the reference point Pf of the image captured by the image pickup device 110.
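The part of this control that resolves a wall-level target position into a concrete display 141 and an on-screen position on its display surface 141a can be sketched as a simple tiling lookup. This fragment assumes equally sized displays arranged in a grid; the actual third relationship stored in the storage unit 154 is not specified in the disclosure, so all names and the grid layout here are illustrative.

```python
def locate_on_displays(x, y, n_cols, n_rows, disp_w, disp_h):
    """Resolve a target position given in the coordinate frame of the whole
    display wall into (display_index, local_x, local_y): which display 141
    should render the target reference point Pft, and where on the screen of
    its display surface 141a. Assumes a grid of identical displays indexed
    row-major from the top-left (an illustrative assumption)."""
    col = min(int(x // disp_w), n_cols - 1)
    row = min(int(y // disp_h), n_rows - 1)
    return row * n_cols + col, x - col * disp_w, y - row * disp_h
```

For a 3x2 wall of 1000x500-pixel displays, a wall coordinate of (2500, 300) lands on the third display of the top row at local position (500, 300).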
  • the image pickup control device 150 may execute the processes of steps S409 to S411 and the processes of steps S412 to S414 in parallel, or may execute them in the reverse order.
  • upon receiving a command from the user P via the image pickup input device 160 or the like, the image pickup control device 150 may stop or release the control that changes the position of the reference point Pf of the image captured by the image pickup device 110 so as to follow the fluctuation amount of the position and posture of the head H.
  • the image pickup control device 150 is configured to display one image by using the entire plurality of displays 141, but the present invention is not limited to this.
  • the image pickup control device 150 may be configured to display the image captured by the image pickup device 110 on a part of the plurality of displays 141.
  • the image pickup control device 150 may be configured to select a display 141 for displaying an image so as to follow the fluctuation amount of the position and posture of the head H.
  • the plurality of displays 141 may be movable in the same manner as in the modified example 1.
  • the plurality of displays 141 may have a display surface 141a that is curved, bent, or includes both curved and bent, as in the second modification.
  • in the processing of modification 3, the image pickup control device 150 may perform the processing of modification 1, the processing of modification 2, or both, so that the display position and orientation of the image follow the movement of the head H of the user P.
  • Modification example 4 of the exemplary embodiment differs from the exemplary embodiment in that the imaging system 100 includes a plurality of imaging devices 110 arranged at different positions and orientations.
  • the image pickup control device 150 changes the position and orientation of the image pickup device 110 according to the movement of the head H of the user P by switching the image pickup device 110 to be displayed on the display device 140.
  • the modified example 4 will be described mainly with respect to the points different from the exemplary embodiment and the modified examples 1 to 3, and the description of the points common to the exemplary embodiment and the modified examples 1 to 3 will be omitted as appropriate.
  • FIG. 10 is a perspective view showing an example of the configuration of the image pickup apparatus 110 according to the modified example 4 of the exemplary embodiment.
  • the plurality of image pickup devices 110 are arranged so as to surround at least a part of the periphery of the object W to be imaged.
  • the object W is a work target of the robot 200.
  • the plurality of image pickup devices 110 are arranged along a cylindrical surface having a vertical axis and separated from each other. The axis passes through the object W or the vicinity of the object W.
  • the plurality of image pickup devices 110 are arranged at the same position as each other in the vertical direction. Each image pickup device 110 is directed toward the object W and is fixed to a stationary object such as a ceiling via a support.
  • the arrangement of the plurality of image pickup devices 110 is not limited to the above.
  • the plurality of image pickup devices 110 may be arranged so as to surround the object W, the robot arm 210, or both the object W and the robot arm 210.
  • the plurality of image pickup devices 110 may be arranged at different positions in the vertical direction.
  • the plurality of image pickup devices 110 may be arranged along a cylindrical surface having a horizontal axis, a cylindrical surface having a vertical axis, a spherical surface, an ellipsoidal surface, or a combination of two or more thereof.
  • the plurality of image pickup devices 110 may be arranged along two or more horizontal circumferences having different vertical positions, two or more vertical circumferences having different horizontal positions or horizontal orientations, or a combination of horizontal and vertical circumferences.
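For illustration, the positions and orientations of image pickup devices 110 spaced evenly along a vertical-axis cylindrical surface and aimed at the axis near the object W can be generated as follows. Representing a camera pose as a 3-D position plus a single yaw angle is a simplifying assumption for this sketch.

```python
import math

def ring_camera_poses(center_xy, radius, height, n_cameras):
    """Generate (position, yaw) pairs for image pickup devices spaced evenly
    along a cylindrical surface with a vertical axis, each aimed at the axis
    that passes through (or near) the object W."""
    poses = []
    for i in range(n_cameras):
        a = 2.0 * math.pi * i / n_cameras
        x = center_xy[0] + radius * math.cos(a)
        y = center_xy[1] + radius * math.sin(a)
        # yaw of the optical axis: point from the camera back toward the axis
        yaw = math.atan2(center_xy[1] - y, center_xy[0] - x)
        poses.append(((x, y, height), yaw))
    return poses
```

Each generated pose faces the cylinder axis, matching the arrangement in which every image pickup device 110 is directed toward the object W.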
  • the position and posture of each image pickup device 110 are stored in advance in the storage unit 154 of the image pickup control device 150 as parameters of the image pickup device 110.
  • the posture of the image pickup device 110 is the posture angle of the center of the optical axis of the image pickup device 110.
  • when the first movement control unit 1512 of the drive control device 151 of the image pickup control device 150 receives the information on the position and posture of the head H of the user P, it determines, using the first relationship, the target position and target posture of the image pickup device 110 for following the movement of the head H.
  • the first movement control unit 1512 uses the parameters of the image pickup devices 110 stored in the storage unit 154 to select, from among the plurality of image pickup devices 110, the image pickup device 110 whose position and posture are closest to the target position and target posture.
  • the first movement control unit 1512 determines the zoom-up rate or zoom-back rate to be executed by the determined image pickup device 110 in order to compensate for the difference between the position and posture of that image pickup device 110 and the target position and target posture. For example, when the position of the image pickup device 110 is located in front of the target position in the direction in which the determined image pickup device 110 is oriented, the first movement control unit 1512 determines a zoom-back rate for causing the image pickup device 110 to perform zoom-back imaging. When the position of the image pickup device 110 is located behind the target position in the orientation direction, the first movement control unit 1512 determines a zoom-up rate for causing the image pickup device 110 to perform zoom-up imaging.
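A minimal sketch of this selection logic follows, under two simplifying assumptions that are not taken from the disclosure: each stored parameter set is a 3-D position plus a yaw angle, and the zoom rate varies linearly with the offset along the chosen camera's optical axis.

```python
import math

def select_camera(target_pos, target_yaw, cameras, w_angle=1.0):
    """Pick the stored camera whose position and posture are closest to the
    target pose, then derive a zoom rate compensating for the remaining
    offset along the chosen camera's optical axis.

    cameras -- list of ((x, y, z), yaw) parameter sets, one per device
    Returns (index, zoom): zoom > 1 stands for zoom-up imaging (camera behind
    the target position), zoom < 1 for zoom-back imaging (camera in front).
    The distance/angle weighting w_angle and the linear zoom model are
    illustrative assumptions.
    """
    def score(cam):
        (x, y, z), yaw = cam
        d = math.dist((x, y, z), target_pos)
        # wrapped absolute angular difference between postures
        a = abs(math.atan2(math.sin(yaw - target_yaw), math.cos(yaw - target_yaw)))
        return d + w_angle * a

    i = min(range(len(cameras)), key=lambda k: score(cameras[k]))
    (x, y, _), yaw = cameras[i]
    # signed offset of the target along the chosen camera's optical axis
    forward = (target_pos[0] - x) * math.cos(yaw) + (target_pos[1] - y) * math.sin(yaw)
    return i, 1.0 + forward
```

A real system would replace the linear zoom stand-in with the lens's actual focal-length model, but the structure (nearest-pose selection, then axial compensation) mirrors the steps described above.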
  • the first movement control unit 1512 outputs a command to the determined image pickup device 110 to execute the image pickup at the determined zoom-up rate or zoom-back rate to the image pickup control unit 1511.
  • the image pickup control unit 1511 causes the image pickup device 110 to take an image according to a command.
  • the first movement control unit 1512 is an example of a variable device.
  • the image pickup control device 150 determines the image pickup device 110 that is to perform imaging according to the movement of the head H of the user P and causes that image pickup device 110 to take an image, so that an image whose imaging position and imaging direction follow the movement of the user's head can be displayed on the display device 140.
  • the configuration of the modified example 4 may be applied to the modified examples 1 to 3.
  • the present disclosure is not limited to the above exemplary embodiment and modifications; various modifications and improvements are possible within the scope of the present disclosure. The present disclosure also includes forms obtained by applying various modifications to the exemplary embodiment and the modifications, and forms constructed by combining components of different exemplary embodiments and modifications.
  • the motion detection device 130 has an infrared sensor 131 and an infrared marker 132 as sensors for detecting the position and posture of the head H from a position away from the head H of the user P.
  • however, the present invention is not limited to this, and the motion detection device 130 may have any configuration capable of detecting the movement of the head H.
  • the motion detection device 130 may include an acceleration sensor and an angular velocity sensor mounted on the head H, and may detect the acceleration and the angular velocity of the head H in the 6-axis direction.
  • the image pickup control device 150 may be configured to receive the detection result from the acceleration sensor and the angular velocity sensor via wired communication or wireless communication. The image pickup control device 150 may detect the position and posture of the head H using the detection results of the acceleration and the angular velocity.
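As one hedged example of using such 6-axis detection results, a complementary filter can fuse the integrated angular velocity with the gravity direction taken from the acceleration to estimate head pitch. The filter coefficient, units, and function name below are assumptions for illustration; the disclosure does not specify how the detection results are combined.

```python
import math

def complementary_filter(pitch_deg, gyro_pitch_dps, accel, dt, alpha=0.98):
    """One update of a complementary filter fusing the head-mounted angular
    velocity and acceleration measurements into a pitch estimate.

    gyro_pitch_dps -- angular velocity about the pitch axis, deg/s
    accel          -- (ax, ay, az) in units of g; the gravity direction gives
                      an absolute reference that corrects integration drift
    alpha          -- blend factor favoring the gyro (illustrative value)
    """
    ax, ay, az = accel
    accel_pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
    gyro_pitch = pitch_deg + gyro_pitch_dps * dt  # integrate angular velocity
    return alpha * gyro_pitch + (1.0 - alpha) * accel_pitch
```

Running this at the sensor rate yields a drift-corrected head posture; the same scheme extends to roll, while yaw needs an additional absolute reference such as a magnetometer or the infrared markers described above.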
  • the motion detection device 130 may include a three-dimensional camera arranged at a position away from the head H and capture a three-dimensional image of the head H.
  • the pixel value of each pixel of the three-dimensional image indicates the distance value to the subject projected on the pixel.
  • the image pickup control device 150 may detect the image of the head H and the posture of the head H projected on the three-dimensional image by image processing such as a pattern matching method using a template of the head H, and may detect the position of the head H from the pixel value of each pixel of the three-dimensional image.
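Once the head H is located at a pixel of the three-dimensional image, its position in the camera frame can be recovered from that pixel's value. The sketch below assumes a calibrated pinhole model with intrinsics fx, fy, cx, cy and treats the pixel value as depth along the optical axis; the disclosure does not specify the camera model, so these are illustrative assumptions.

```python
def depth_pixel_to_point(u, v, depth, fx, fy, cx, cy):
    """Back-project pixel (u, v) of a three-dimensional (depth) image into a
    3-D point in the camera frame.

    depth  -- pixel value interpreted as distance along the optical axis
    fx, fy -- focal lengths in pixels (assumed known from calibration)
    cx, cy -- principal point in pixels (assumed known from calibration)
    """
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return x, y, depth
```

A pixel at the principal point maps straight onto the optical axis; pixels away from it fan out proportionally to their depth.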
  • the motion detection device 130 may include a plurality of three-dimensional cameras arranged at different positions and orientations from each other.
  • the image pickup control device 150 may generate a three-dimensional model of the head H by processing a three-dimensional image of each three-dimensional camera.
  • the image pickup control device 150 may detect the position and posture of the head H using a three-dimensional model of the head H.
  • the motion detection device 130 may include a magnetic field generator and a magnetic sensor mounted on the head H, and detect the position and orientation of the magnetic sensor.
  • the image pickup control device 150 may be configured to receive the detection result from the magnetic sensor via wired communication or wireless communication. The image pickup control device 150 may detect the position and posture of the head H by using the detection result of the position and posture of the magnetic sensor.
  • the image pickup system includes an image pickup device, a detection device that detects the movement of the user's head, a variable device that changes the position and orientation of the image pickup device so as to follow the movement of the head detected by the detection device, and a display device that displays the image captured by the image pickup device to the user and changes at least one of the position at which the image is displayed and the direction in which the image is displayed so as to follow the movement of the head.
  • the imaging system can change the position and orientation of the imaging device following the movement of the head. Further, on the display device, the imaging system can change, following the movement of the head, the position at which the image captured by the imaging device is displayed, the direction in which it is displayed, or both. Therefore, the user can easily and reliably view an image captured at a position and orientation following the movement of the head, in a state where the image is displayed at a position, orientation, or position and orientation corresponding to the movement of the head. Thus, the image pickup system can make the image pickup device and the display surface of the image captured by the image pickup device follow the movement of the head.
  • the variable device may be equipped with the image pickup device, and may change the position and orientation of the image pickup device by moving the image pickup device.
  • the image pickup device can be moved to a position and a direction that follows the movement of the head. Therefore, the image pickup apparatus can capture an image from a position and orientation that faithfully follows the movement of the head. Further, the image pickup apparatus can capture an image that continuously changes according to the movement of the head.
  • the variable device may be equipped with the image pickup device, and may change the position and orientation of the image pickup device by moving the variable device itself.
  • since the variable device itself moves, the variable device can increase the fluctuation range of the position and orientation of the image pickup device.
  • the user can have the image pickup device capture an image from a wide range of positions and orientations by the movement of the head and visually recognize the image.
  • the image pickup system may include a plurality of the image pickup devices arranged at different positions and orientations, and the variable device may change the position and orientation of the image pickup device by switching the image pickup device whose image is displayed on the display device.
  • since the variable device switches the image pickup device whose captured image is displayed, the image pickup device after switching captures an image from a position and direction following the movement of the head and displays it on the display device. The position and orientation of the image pickup device that captures the image can thus be changed quickly, so that an image can be captured from a position and orientation that follow even a quick movement of the head.
  • the detection device may include a sensor that detects the position and posture of the head from a position away from the head.
  • the detection device can detect the position and posture of the head without contacting the head. Therefore, the user can move the head without being restricted by the detection device.
  • the detection device may include at least one infrared sensor, at least one infrared marker, and a processing device, and the processing device may detect the position and orientation of the head by processing the result of the at least one infrared sensor detecting infrared light from the at least one infrared marker.
  • the detection device can detect the position and posture of the head with high accuracy by a simple process by using an infrared sensor and an infrared marker.
  • the user can move his head without being restricted by the detector.
  • the display device may be a head-mounted display attached to the head, and may change the position and orientation in which the image is displayed by moving together with the head so as to follow the movement of the head.
  • the display surface of the head-mounted display moves together with the head, and the position and orientation of the display surface follow the movement of the head. Therefore, the configuration for making the position and orientation of the display surface follow the movement of the head becomes simple.
  • the display device may be arranged so as to surround at least a part of the periphery of the user in the vertical direction, and may change the position and orientation of displaying the image by moving the reference point of the image captured by the imaging device in the vertical direction on the display surface of the display device, following the vertical movement of the head.
  • the position of the reference point and the direction in which the reference point exists in the image can correspond to the position and orientation of the head. Therefore, the user can easily and surely view the image.
  • the display device may be arranged so as to surround at least a part of the periphery of the user in the horizontal direction, and may change the position and orientation of displaying the image by moving the reference point of the image captured by the imaging device in the horizontal direction on the display surface of the display device, following the left-right movement of the head.
  • the position of the reference point and the direction in which the reference point exists in the image can correspond to the position and orientation of the head. Therefore, the user can easily and surely view the image.
  • the display device may have a plurality of display surfaces arranged so as to have different orientations, and may change the position and orientation of displaying the image by moving the reference point of the image captured by the image pickup device over the plurality of display surfaces, following the movement of the head.
  • the plurality of display surfaces of the display device can surround at least a part of the periphery of the user.
  • the position of the reference point of the image captured by the image pickup apparatus and the direction in which the reference point exists can correspond to the position and orientation of the head.
  • each of the plurality of display surfaces may be, for example, a flat surface.
  • the plurality of display surfaces may be display surfaces of a plurality of flat displays. This makes it possible to reduce the cost of the display device.
  • the display device may have a display surface that includes at least one of curving and bending so as to surround at least a part of the periphery of the user.
  • the display device can have a display surface extending along the periphery so as to surround at least a part of the periphery of the user. This allows the display device to present the user with a continuous image.
  • the display surface of one display device may be configured to surround at least a part around the user.
  • the display device may be equipped with a drive device that moves at least one of the position of the display surface of the display device and the orientation of the display surface in accordance with the movement of the head.
  • the imaging system can move the position of the display surface of the display device, the orientation of the display surface, or both the position and orientation of the display surface according to the movement of the head.
  • the imaging system can maintain the position and orientation of the display surface with respect to the head even when the head moves. Therefore, the user can easily visually recognize the display surface.
  • the robot system includes the image pickup system according to one aspect of the present disclosure and a robot that performs work on an object, and the image pickup device is placed in a position where it can image at least one of the object and the robot. According to this aspect, the same effects as those of the imaging system according to one aspect of the present disclosure can be obtained.
  • the numbers such as the ordinal number and the quantity used above are all examples for concretely explaining the technology of the present disclosure, and the present disclosure is not limited to the illustrated numbers.
  • the connection relationship between the components is exemplified for concretely explaining the technique of the present disclosure, and the connection relationship for realizing the function of the present disclosure is not limited to this.
  • 1 Robot system
  • 100 Imaging system
  • 110 Imaging device
  • 120 Mobile device (variable device)
  • 130 Motion detection device (detection device)
  • 131 Infrared sensor
  • 132 Infrared marker
  • 140, 140A, 140B, 140C Display device
  • 141a, 143a Display surface
  • 142 Display drive device (drive device)
  • 1512 First movement control unit (variable device)
  • 1522 Detection processing unit (detection device)
  • 1531 Display control unit
  • 1532 Second movement control unit
  • 1533 Image processing unit
  • H Head
  • P User
  • W Object

Abstract

This imaging system (100) comprises an imaging device (110), a detection device (130) that detects movement of the head of a user, a changing device (120) that changes the position and orientation of the imaging device (110) so as to track the movement of the head detected by the detection device (130), and a display device (140) that displays an image captured by the imaging device (110) to a user and that varies at least one of the position at which the image is displayed and the orientation in which the image is displayed so as to track the movement of the head.

Description

Imaging system and robot system
This disclosure relates to an imaging system and a robot system.
Conventionally, there is a technology that accepts remote control of a robot by a user who views a monitor while an image of an object captured by a camera is displayed on the monitor. For example, Patent Document 1 discloses a master-slave type manipulator. The manipulator includes an operation robot, a work robot, and an image pickup device that captures an image of a work object. The image pickup device is attached to a first articulated telescopic mechanism, and the operator's helmet is attached to a second articulated telescopic mechanism. The first articulated telescopic mechanism operates following the operation of the second articulated telescopic mechanism, so that the image pickup device follows the movement of the operator's head.
Japanese Unexamined Patent Publication No. 58-77463
In Patent Document 1, the image captured by the image pickup device is projected on a screen, and the operator operates the operation robot while viewing the image on the screen. For example, when the operator turns the head to the left or right with respect to the front screen, the screen displays an image captured by the image pickup device that has moved following the movement of the head, but the operator may not be able to sufficiently view the image on a screen located to the right or left, and may not be able to perform an accurate operation.
It is an object of the present disclosure to provide an imaging system and a robot system that enable an imaging device and a display surface of an image captured by the imaging device to follow the movement of the user's head.
The image pickup system according to one aspect of the present disclosure includes an image pickup device, a detection device that detects the movement of the user's head, a variable device that changes the position and orientation of the image pickup device so as to follow the movement of the head detected by the detection device, and a display device that displays the image captured by the image pickup device to the user and changes at least one of the position at which the image is displayed and the direction in which the image is displayed so as to follow the movement of the head.
FIG. 1 is a perspective view showing an example of a configuration of a robot system according to an exemplary embodiment. FIG. 2 is a block diagram showing an example of a functional configuration of an imaging system according to an exemplary embodiment. FIG. 3 is a flowchart showing an example of the operation of the imaging system according to the exemplary embodiment. FIG. 4 is a side view showing an example of the configuration of the display device according to the first modification of the exemplary embodiment. FIG. 5 is a flowchart showing an example of the operation of the imaging system according to the first modification. FIG. 6 is a side view showing an example of the configuration of the display device according to the second modification of the exemplary embodiment. FIG. 7 is a flowchart showing an example of the operation of the imaging system according to the second modification. FIG. 8 is a side view showing an example of the configuration of the display device according to the modified example 3 of the exemplary embodiment. FIG. 9 is a flowchart showing an example of the operation of the imaging system according to the modified example 3. FIG. 10 is a side view showing an example of the configuration of the image pickup apparatus according to the modified example 4 of the exemplary embodiment.
Hereinafter, exemplary embodiments of the present disclosure will be described with reference to the drawings. The exemplary embodiments described below each show a comprehensive or specific example. Among the components in the following exemplary embodiments, components not described in the independent claim indicating the highest-level concept are described as optional components. Each figure in the attached drawings is a schematic view and is not necessarily exactly illustrated. In each figure, substantially the same components are designated by the same reference numerals, and duplicate description may be omitted or simplified. In the present specification and claims, a "device" may mean not only one device but also a system including a plurality of devices.
[Robot system configuration]
The configuration of the robot system 1 according to the exemplary embodiment will be described. FIG. 1 is a perspective view showing an example of the configuration of the robot system 1 according to the exemplary embodiment. As shown in FIG. 1, the robot system 1 includes an image pickup system 100, a robot 200, a robot operation device 300, and a robot control device 400. The image pickup system 100 includes an image pickup device 110, a mobile device 120, a motion detection device 130, a display device 140, an image pickup control device 150, and an image pickup input device 160. The mobile device 120 is an example of a variable device.
Although not limited to the following, in this exemplary embodiment the robot 200 is an industrial robot and includes a robot arm 210, a base 220, and an end effector 230. The robot 200 may be another type of robot such as a service robot, a medical robot, a drug discovery robot, or a humanoid. Service robots are robots used in various service industries such as nursing care, medical care, cleaning, security, guidance, rescue, cooking, and product provision.
The base 220 is fixed on a support surface and supports the robot arm 210. The support surface of the base 220 may be an immovable surface such as a floor surface, or may be a movable surface on a movable device such as a traveling device. The robot arm 210 has at least one joint and at least one degree of freedom. The robot arm 210 is configured so that the end effector 230 is attached to the tip of the robot arm 210, and can move the end effector 230 so as to freely change the position and posture of the end effector 230. The end effector 230 is configured to be able to apply various actions, such as gripping, suction, spraying of a liquid such as paint, welding, and injection of a sealing agent, to the object (also referred to as a "work") W according to the application of the end effector 230.
In this exemplary embodiment, the robot arm 210 is a vertical articulated robot arm with six degrees of freedom having six rotating joints, but is not limited thereto. The type of the robot arm 210 may be any type, for example, a horizontal articulated type, a polar coordinate type, a cylindrical coordinate type, or a rectangular coordinate type. The joints of the robot arm 210 may be any joints, such as linear motion joints. The number of joints of the robot arm 210 may be any number, such as five or less or seven or more.
 The robot operating device 300 is arranged at a position away from the robot 200 and is used to operate the robot 200 remotely. The robot operating device 300 may be placed where the user P handling it can see the robot 200 directly, or where the user P cannot see the robot 200 directly. For example, the robot operating device 300 may be placed in a space isolated from the space in which the robot 200 is arranged, or in a space distant from that space.
 The robot operating device 300 accepts inputs of various commands, information, data, and the like, and outputs them to the robot control device 400. For example, the robot operating device 300 can accept input from the user P, or it can be connected to another device and accept input from that device. The robot operating device 300 may include known input means such as levers, buttons, a touch panel, a joystick, motion capture, a camera, and a microphone. It may also include a teaching pendant, which is one kind of teaching device, a smart device such as a smartphone or tablet, a personal computer, or a terminal device such as a dedicated terminal. When the robot 200 is controlled in a master-slave manner, the robot operating device 300 may include a master unit; for example, the master unit may be configured to perform motions identical or similar to those of the robot arm 210.
 The robot control device 400 controls the operation of the robot 200. The robot control device 400 is connected to the robot 200 and the robot operating device 300 via wired or wireless communication; any form of wired or wireless communication may be used. The robot control device 400 processes commands, information, data, and the like input via the robot operating device 300. It may also be connected to an external device and configured to accept and process commands, information, and data from that device.
 For example, the robot control device 400 controls the operation of the robot 200 in accordance with the above commands, information, and data. The robot control device 400 also controls the supply of power and the like to the robot 200, and manages information for administering the robot 200.
 The robot control device 400 also outputs various commands, information, and data to the robot operating device 300 and/or the display device 140 of the imaging system 100. For example, the robot control device 400 causes the display device 140 to present various commands, information, and data visually and/or audibly. It may output, for example, an image for operating the robot 200, an image showing the state of the robot 200, and an image for managing the robot 200.
 The robot control device 400 includes a computer. It may further include an electric circuit for controlling the electric power supplied to the robot 200, equipment for controlling non-electric power supplied to the robot 200 such as pneumatic and hydraulic pressure, and equipment for controlling substances supplied to the robot 200 such as cooling water and paint. Equipment other than the computer may be provided separately from the robot control device 400.
 The imaging device 110 of the imaging system 100 includes a camera that captures digital still images and/or moving images. The camera may be a three-dimensional camera capable of capturing a three-dimensional image that includes position information of the subjects in the image.
 The moving device 120 carries the imaging device 110 and is configured so that the position and orientation of the imaging device 110 can be changed freely. Although not limited to the following, the position of the imaging device 110 may be its three-dimensional position in three-dimensional space, and its orientation may be the direction of the optical-axis center of its camera, specifically the three-dimensional direction of the optical-axis center in three-dimensional space. The orientation of the imaging device 110 may thus correspond to the posture of the imaging device 110. The moving device 120 is not particularly limited, but in this exemplary embodiment it is a robot arm similar to the robot arm 210 and is fixed on a support surface. In this exemplary embodiment the support surface is a ceiling surface and the robot arm is suspended from it, but the position and orientation of the support surface are not limited. The imaging device 110 is attached to the tip of this robot arm. The moving device 120 may instead be another apparatus, for example a traveling device that can run on a support surface such as a floor, a rail-guided traveling device that can run on a rail, a crane movable on a rail, a crane provided with an arm, an articulated arm other than a robot arm, or an unmanned aerial vehicle such as a drone. The rail of a rail-guided traveling device may be arranged to extend vertically, horizontally, and in directions intersecting these, so that the device can travel in various directions and to various positions. In the case of a crane, the imaging device 110 may be attached to the crane hook. The moving device 120 may also include a pan head (also called a "gimbal") to which the imaging device 110 is attached, and may freely change the orientation of the imaging device 110 by operating the pan head.
 The motion detection device 130 is an example of a detection device and detects the motion of the head H of the user P who operates the robot operating device 300. The motion detection device 130 is not particularly limited, but in this exemplary embodiment it includes at least one infrared sensor 131 and at least one infrared marker 132 worn on the head H of the user P. In this exemplary embodiment, a plurality of infrared sensors 131, specifically three infrared sensors 131, are arranged around the user P facing the user P, at positions away from the head H. A plurality of infrared markers 132, specifically four infrared markers 132, are arranged at different positions on the head H. Here, the head includes the part of the human body above the neck and may include, for example, the face, the crown, the temples, and the back of the head.
 The infrared marker 132 emits infrared light. The infrared marker 132 may be a light emitter that itself emits infrared light, such as an infrared LED (Light Emitting Diode), a reflector that reflects irradiated infrared light, or a combination of both. The infrared sensor 131 receives infrared light and can detect its direction, intensity, intensity distribution, and the like. The infrared sensor 131 may be configured only to receive infrared light, or it may itself emit infrared light and receive infrared light such as its reflection; in the latter case, the infrared sensor 131 may be an infrared camera. By detecting the infrared light from the four infrared markers 132 with the three infrared sensors 131, the position and posture of the head H can be detected with high accuracy. Although not limited to the following, the position of the head H may be the three-dimensional position, in three-dimensional space, of a predetermined reference point of the head H. The posture of the head H may be the posture of a predetermined part, plane, or axis, such as the front of the head H, a plane crossing the head H, or an axis passing from the chin through the crown; specifically, it may be the three-dimensional orientation of that predetermined part, plane, or axis in three-dimensional space.
 Conversely to the above, the infrared sensor 131 may be worn on the head H of the user P and the infrared marker 132 arranged at a position away from the head H. The positions and quantities of the infrared sensors 131 and infrared markers 132 are not particularly limited, as long as they allow detection of the position of the head H, its posture, or both.
 The display device 140 presents the image captured by the imaging device 110 to the user P in a perceptible manner. The display device 140 is arranged near the robot operating device 300 and away from the imaging device 110. The display device 140 may also present commands, information, data, and the like received from the robot control device 400 to the user P in a perceptible manner. For example, the display device 140 includes a display such as a liquid crystal display or an organic or inorganic EL display (electro-luminescence display) and presents information visually. The display device 140 may include an audio output device such as a speaker for auditory presentation, and may also be configured for tactile presentation.
 Although not limited to the following, in this exemplary embodiment the display device 140 is a head-mounted display attached to the head H of the user P. In this exemplary embodiment, the head-mounted display has a goggle-like shape, and its lens portion forms the display surface on which images are shown. Because the display device 140 moves together with the head H of the user P, it can change the position and direction in which it displays images so as to follow the movement of the head H.
 The display device 140 may instead be configured so as not to be attached to the head H of the user P; in this case, it may include a display drive device that can change the position of the display, the posture of the display, or both. A configuration that moves the display, or that changes its position and posture, may be realized by an apparatus such as those exemplified for the moving device 120; a configuration that changes the posture of the display may be realized by a device such as a gimbal.
 The imaging input device 160 accepts inputs and operations for operating the imaging system 100 from the user P. The imaging input device 160 accepts inputs of various commands, information, data, and the like, and outputs them to the imaging control device 150. For example, the imaging input device 160 may be arranged near the robot operating device 300 and may have a configuration similar to those exemplified for the robot operating device 300. The robot operating device 300 may include the imaging input device 160 and thus also serve its function.
 FIG. 2 is a block diagram showing an example of the functional configuration of the imaging system 100 according to the exemplary embodiment. As shown in FIGS. 1 and 2, the imaging control device 150 is connected to the imaging device 110, the moving device 120, the motion detection device 130, the display device 140, and the imaging input device 160 via wired or wireless communication; any form of wired or wireless communication may be used. The imaging control device 150 includes a drive control device 151 that controls the driving of the imaging device 110 and the moving device 120, a detection control device 152 that controls the operation of the motion detection device 130, and a display control device 153 that controls the operation of the display device 140.
 The detection control device 152 controls the driving of the three infrared sensors 131 and processes the results of the three infrared sensors 131 detecting infrared light from the four infrared markers 132, thereby detecting the three-dimensional positions and postures of the four infrared markers 132. In other words, by detecting the three-dimensional positions and postures of the infrared markers 132, the detection control device 152 detects the position and posture of the head H of the user P. The detection control device 152 is an example of a processing device.
 Specifically, each of the three infrared sensors 131 receives the infrared light emitted from the four infrared markers 132. The infrared light emitted from each infrared marker 132 is associated with identification information, such as an ID, assigned to that marker, so each infrared sensor 131 can detect the direction, intensity, intensity distribution, and the like of the infrared light from each of the four infrared markers 132 individually. The detection control device 152 detects the three-dimensional positions of the four infrared markers 132 using information on the three-dimensional position and posture of each infrared sensor 131 together with each sensor's detection results for the infrared light of the four markers. For example, the detection control device 152 detects the three-dimensional positions of the four infrared markers 132 according to a three-dimensional coordinate system set in the space in which the three infrared sensors 131 and the robot control device 400 are arranged. The detection control device 152 then detects the three-dimensional position and posture of the head H of the user P using the three-dimensional position information of the four infrared markers 132. For example, the detection control device 152 expresses posture using posture angles such as roll, pitch, and yaw angles.
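As an illustrative sketch only, and not part of the disclosure: once the marker positions have been triangulated into world coordinates, the head's posture can be recovered by rebuilding the head's axes from markers whose layout on the head is known, then reading off roll, pitch, and yaw. The function names, the three-marker layout (origin, x-axis, y-axis markers), and the Z-Y-X angle convention are all assumptions made for this example.

```python
import math

def sub(a, b):
    return [a[i] - b[i] for i in range(3)]

def cross(a, b):
    return [a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0]]

def norm(a):
    n = math.sqrt(sum(c * c for c in a))
    return [c / n for c in a]

def head_rotation(p_origin, p_x, p_y):
    """Rotation matrix of the head (columns = head axes in world
    coordinates), built from three triangulated marker positions whose
    head-frame layout is assumed known: p_origin at the head origin,
    p_x along the head x axis, p_y roughly along the head y axis."""
    x = norm(sub(p_x, p_origin))
    y_rough = sub(p_y, p_origin)
    z = norm(cross(x, y_rough))   # perpendicular to both
    y = cross(z, x)               # re-orthogonalized y axis
    return [[x[0], y[0], z[0]],
            [x[1], y[1], z[1]],
            [x[2], y[2], z[2]]]

def rpy(R):
    """Z-Y-X posture angles (roll, pitch, yaw) in radians, matching the
    roll/pitch/yaw representation mentioned in the text."""
    yaw = math.atan2(R[1][0], R[0][0])
    pitch = math.asin(-R[2][0])
    roll = math.atan2(R[2][1], R[2][2])
    return roll, pitch, yaw
```

With four markers, as in the embodiment, a least-squares rigid fit (e.g. the Kabsch algorithm) over all markers would be the more robust choice; the three-marker version above keeps the geometry explicit.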
 The drive control device 151 controls the imaging operation of the imaging device 110. The drive control device 151 also controls the operation of the moving device 120 so as to change the position and posture of the imaging device 110 following the movement of the head H of the user P detected by the detection control device 152. The drive control device 151 controls the operation of the moving device 120 so that the imaging device 110 is moved by amounts of positional and postural change corresponding to the amounts of positional and postural change of the head H detected by the detection control device 152. For example, the drive control device 151 controls the position and posture of the imaging device 110 according to a three-dimensional coordinate system set in the space in which the moving device 120 is arranged, and uses, for example, a three-dimensional coordinate system set on the moving device 120.
 The relationship between the amounts of change in the position and posture of the imaging device 110 and the amounts of change in the position and posture of the head H is arbitrary. For example, the amounts of change in the position and posture of the imaging device 110 may correspond one-to-one to those of the head H, or may correspond to a constant multiple of them. For example, the amount of change in the posture of the imaging device 110 may correspond one-to-one to that of the head H while the amount of change in its position corresponds to a constant multiple of the head's positional change; or, conversely, the positional change may correspond one-to-one while the postural change corresponds to a constant multiple of the head's postural change.
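The head-to-camera mapping just described can be sketched as a pair of gains applied to the detected head deltas. This is a minimal illustration, not the disclosed implementation; the gain values and function name are assumptions (here posture tracks 1:1 and position is scaled by a constant multiple, one of the combinations the text allows).

```python
# Assumed gains, not specified by the disclosure.
POSITION_GAIN = 2.0   # camera displacement = 2x head displacement
POSTURE_GAIN = 1.0    # camera rotation follows head rotation 1:1

def camera_setpoint(cam_pos, cam_rpy, d_head_pos, d_head_rpy):
    """Map detected changes in head position/posture to a new target
    position (x, y, z) and posture (roll, pitch, yaw) for the imaging
    device carried by the moving device."""
    new_pos = [p + POSITION_GAIN * d for p, d in zip(cam_pos, d_head_pos)]
    new_rpy = [a + POSTURE_GAIN * d for a, d in zip(cam_rpy, d_head_rpy)]
    return new_pos, new_rpy
```

A one-to-one mapping for both quantities is simply both gains set to 1.0.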
 The display control device 153 causes the display device 140 to display the image captured by the imaging device 110. When the display device 140 includes a display drive device, the display control device 153 may control the operation of that display drive device. The display control device 153 may also perform position processing that changes where on the display screen of the display device 140 the image from the imaging device 110 is displayed.
 In this exemplary embodiment, the display device 140 is a head-mounted display, so the display control device 153 causes the display device 140 to display the image captured by the imaging device 110 without controlling a display drive device or performing position processing. For example, the display control device 153 positions the center of the captured image at the center of the display screen of the display device 140, so that the user P can always see the image captured by the imaging device 110 near the front of the head H.
 When the display device 140 is not a head-mounted display but includes a display drive device, the display control device 153 may control the operation of the display drive device so that the display moves following the movement of the head H of the user P. For example, the display control device 153 may perform this control so that the display screen is located near the front of the head H and/or faces the front of the head H. The display control device 153 may further perform position processing of the image so that, on the display screen, the center of the image captured by the imaging device 110 moves following the movement of the head H of the user P; in this case, the center of the captured image does not necessarily coincide with the center of the display screen. This allows the user P to see the captured image near the front of the head H. Alternatively, in accordance with a command from the user P or the like, the display control device 153 may perform position processing so that the center of the captured image does not move following the movement of the head H; this allows the user P to see, in front of the head H or elsewhere, an image of the object captured by the imaging device 110 from a different direction.
 When the display device 140 is not a head-mounted display and does not include a display drive device, the display control device 153 may perform position processing of the image so that, on the display screen, the center of the image captured by the imaging device 110 moves following the movement of the head H of the user P.
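One way to read the position processing above is as an offset of the drawn image on a fixed screen, proportional to the head's yaw and pitch. The following sketch is only an illustration of that reading; the gain, the sign conventions, and the function name are all assumptions, not from the disclosure.

```python
def draw_position(screen_w, screen_h, img_w, img_h,
                  head_yaw, head_pitch, px_per_rad=600.0):
    """Top-left pixel at which to draw the captured image so that its
    center follows the user's head direction on a fixed display.
    px_per_rad is an assumed gain converting head angles (radians)
    into on-screen pixels; positive yaw shifts the image right,
    positive pitch shifts it up."""
    center_x = screen_w / 2 + px_per_rad * head_yaw
    center_y = screen_h / 2 - px_per_rad * head_pitch
    return center_x - img_w / 2, center_y - img_h / 2
```

With zero head angles this reduces to centering the image on the screen, matching the head-mounted-display case described earlier.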
 The imaging control device 150 described above includes a computer. The imaging control device 150 may further include electric circuits and the like for controlling the electric power supplied to the imaging device 110, the moving device 120, the motion detection device 130, and the display device 140. Equipment other than the computer may be provided separately from the imaging control device 150.
 For example, the computers of the robot control device 400 and the imaging control device 150 each include a circuit or processing circuit having a processor, memory, and the like. The circuit or processing circuit transmits and receives commands, information, data, and the like to and from other devices, receives signal inputs from various instruments, and outputs control signals to each controlled object. The memory is constituted by semiconductor memory such as volatile and non-volatile memory, and by storage devices such as hard disks and SSDs (Solid State Drives). For example, the memory stores programs executed by the circuit or processing circuit, various data, and the like.
 The functions of the circuit or processing circuit may be realized by a computer system consisting of a processor such as a CPU (Central Processing Unit), volatile memory such as RAM (Random Access Memory), and non-volatile memory such as ROM (Read-Only Memory). The computer system may realize the functions of the circuit or processing circuit by having the CPU execute a program recorded in the ROM while using the RAM as a work area. Some or all of the functions of the circuit or processing circuit may be realized by the above computer system, by a dedicated hardware circuit such as an electronic circuit or integrated circuit, or by a combination of the computer system and hardware circuits. The robot control device 400 and the imaging control device 150 may each execute their processing by centralized control with a single computer, or by distributed control through the cooperation of a plurality of computers.
 For example, each function of the robot control device 400 and the imaging control device 150 may be realized by a microcontroller, an MPU (Micro Processing Unit), an LSI (Large Scale Integration circuit), a system LSI, a PLC (Programmable Logic Controller), a logic circuit, or the like. The plurality of functions of each of the robot control device 400 and the imaging control device 150 may be implemented on individual chips, or integrated on a single chip including some or all of them. Each circuit may be a general-purpose circuit or a dedicated circuit. As the LSI, an FPGA (Field Programmable Gate Array) that can be programmed after LSI manufacture, a reconfigurable processor whose internal circuit-cell connections and/or settings can be reconfigured, or an ASIC (Application Specific Integrated Circuit) in which circuits for a plurality of functions are integrated into one for a specific application may be used.
 [Functional Configuration of the Imaging System]
 The functional configuration of the imaging system 100 will now be described. As shown in FIG. 2, the imaging control device 150 of the imaging system 100 includes a drive control device 151, a detection control device 152, a display control device 153, and a storage unit 154.
 The drive control device 151 includes an imaging control unit 1511 and a first movement control unit 1512 as functional components. The detection control device 152 includes a device control unit 1521 and a detection processing unit 1522 as functional components. The display control device 153 includes a display control unit 1531, a second movement control unit 1532, and an image processing unit 1533 as functional components.
 The function of the storage unit 154 is realized by the memory or the like of the computer of the imaging control device 150. The functions of the functional components of the imaging control device 150 other than the storage unit 154 are realized by the computer's processor or the like.
 The storage unit 154 stores various kinds of information and allows the stored information to be read out. For example, the storage unit 154 may store programs and various kinds of data, such as the programs, data, and information for operating each device of the image pickup system 100.
 For example, the storage unit 154 stores the coordinate systems set for the devices of the image pickup system 100. For example, these coordinate systems may include a three-dimensional coordinate system set in the space in which the infrared sensors 131 and the robot operation device 300 are arranged (hereinafter also called the "first coordinate system"), a three-dimensional coordinate system set in the space in which the moving device 120 is arranged (hereinafter also called the "second coordinate system"), and a three-dimensional coordinate system set on the moving device 120 (hereinafter also called the "third coordinate system").
 For example, the storage unit 154 stores information on the position and posture of each infrared sensor 131, for example in the first coordinate system. The storage unit 154 may store the identification information of each infrared marker 132 in association with information on the characteristics of the infrared light emitted from that marker. The storage unit 154 may store information on the position and posture of each infrared marker 132 on the head H of the user P. The storage unit 154 stores information on the position and posture of the moving device 120, for example in the second coordinate system, and information on the position and posture of the image pickup device 110 on the moving device 120, for example in the third coordinate system.
 For example, the storage unit 154 may store relationships between various parameters for moving each device in accordance with the movement of the head H of the user P. For example, these parameter relationships may include a first relationship between the amount of change in the position and posture of the head H and the amount of change in the position and posture of each device, a second relationship between the amount of change in the position and posture of the head H and the amount of change in the position and posture of the display of the display device 140, and a third relationship between the amount of change in the position and posture of the head H and the amount of change in the position of the reference point of the image on the display screen of the display device 140.
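As an illustration only (the disclosure contains no code), each stored relationship can be thought of as a mapping from a head pose change to a target pose change. The Python sketch below shows the first relationship as a simple proportional mapping; the `PoseDelta` type and the gain parameters are assumptions introduced for illustration, not details of the disclosure.

```python
# Illustrative sketch of the "first relationship": a proportional mapping from
# the change in the head's position/posture to the change commanded for the
# image pickup device. The pose representation and gains are assumed here.
from dataclasses import dataclass

import numpy as np


@dataclass
class PoseDelta:
    position: np.ndarray  # (x, y, z) change, e.g. in metres
    rotation: np.ndarray  # (roll, pitch, yaw) change, in radians


def apply_first_relationship(head_delta: PoseDelta,
                             pos_gain: float = 1.0,
                             rot_gain: float = 1.0) -> PoseDelta:
    """Camera pose change that follows the given head pose change."""
    return PoseDelta(position=pos_gain * head_delta.position,
                     rotation=rot_gain * head_delta.rotation)
```

The second and third relationships could be sketched the same way, with the output interpreted as a display pose change or as a reference-point shift on the screen.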
 The image pickup control unit 1511 controls the driving of the image pickup device 110. For example, the image pickup control unit 1511 controls the execution and stopping of the image pickup operation of the image pickup device 110 and its zoom-in and zoom-out operations. For example, the image pickup control unit 1511 may be configured to receive information, commands, and the like from the robot operation device 300 and to control the operation of the image pickup device 110 in accordance with the commands received from the robot operation device 300.
 The first movement control unit 1512 controls the driving of the moving device 120. For example, upon receiving operation information from the image pickup input device 160, the first movement control unit 1512 generates an operation command for performing the operation corresponding to that information and outputs it to the moving device 120, which then performs the corresponding operation. The operation information is information indicating the content of an operation input to the image pickup input device 160 by the user P in order to operate the moving device 120.
 For example, upon receiving the amount of change in the position and posture of the head H of the user P from the detection processing unit 1522 of the detection control device 152, the first movement control unit 1512 reads the first relationship from the storage unit 154. Based on the amount of change in the position and posture of the head H and the first relationship, the first movement control unit 1512 determines the amount of change in the position and posture of the image pickup device 110 needed to move the image pickup device 110 following the movement of the head H; this amount of change can be expressed in the second coordinate system. The first movement control unit 1512 then generates a command for the operation of the moving device 120 to move the image pickup device 110 by the determined amount and outputs it to the moving device 120, which moves the image pickup device 110 so as to follow the movement of the head H.
 For example, horizontal and vertical movements of the head H are associated with horizontal and vertical movements of the image pickup device 110, respectively, and movements of the head H in the rolling, pitching, and yawing directions are associated with movements of the image pickup device 110 in the rolling, pitching, and yawing directions, respectively.
 The device control unit 1521 controls the driving of each infrared sensor 131 of the motion detection device 130. For example, when the infrared sensors 131 are configured to emit infrared light, the device control unit 1521 may control the execution and stopping of the infrared irradiation of each infrared sensor 131. When the infrared markers 132 are configured to emit infrared light, the device control unit 1521 may control operations such as the execution and stopping of the infrared irradiation of each infrared marker 132.
 The detection processing unit 1522 processes the results of the detection, by each infrared sensor 131, of the infrared light from the infrared markers 132, and detects the position and posture of the head H of the user P. The detection processing unit 1522 is an example of a detection device.
 Specifically, the detection processing unit 1522 reads the identification information of each infrared marker 132 and the information on the characteristics of its infrared light from the storage unit 154, and associates the infrared light detected by each infrared sensor 131 with the corresponding infrared marker 132. The detection processing unit 1522 reads the information on the position and posture of each infrared sensor 131 from the storage unit 154 and, using this information together with the detection results of each infrared marker 132 by each infrared sensor 131, detects the three-dimensional position of each infrared marker 132. From the three-dimensional positions of the four infrared markers 132, the detection processing unit 1522 detects the three-dimensional position and posture of the head H of the user P, which can be expressed in the first coordinate system. By detecting the position and posture of the head H over time, the detection processing unit 1522 obtains the amount of change in the position and posture of the head H and outputs it to the first movement control unit 1512.
 The amount of change in the position and posture of the head H may include at least one selected from the group consisting of: the amount of change in the position and posture of the head H between before and after the change; the position and posture of the head H before the change; the position and posture of the head H after the change; the speed of the change in the position and posture of the head H toward the post-change position and posture; and the acceleration of the change in the position and posture of the head H toward the post-change position and posture.
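The marker-based pose detection described above can be sketched as a standard rigid-body fit: given the stored marker layout on the head and the triangulated marker positions in the first coordinate system, find the rotation and translation relating them. The function below uses the well-known SVD-based (Kabsch) solution; this choice of algorithm is an assumption for illustration, as the disclosure does not specify one.

```python
# Sketch: recover the head's position and posture from the 3-D positions of
# the four infrared markers. markers_head holds the stored marker layout in
# the head's own frame; markers_world holds the triangulated positions in the
# first coordinate system. Returns (R, t) with markers_world ≈ markers_head @ R.T + t.
import numpy as np


def head_pose_from_markers(markers_head: np.ndarray,
                           markers_world: np.ndarray):
    """Least-squares rigid fit (Kabsch) of world = R @ head + t."""
    ch = markers_head.mean(axis=0)   # centroid of the stored layout
    cw = markers_world.mean(axis=0)  # centroid of the measured positions
    H = (markers_head - ch).T @ (markers_world - cw)  # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cw - R @ ch
    return R, t
```

With four non-coplanar markers, as in the text, the fit is fully determined; the translation gives the head position and the rotation gives its posture.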
 The display control unit 1531 acquires the image data captured by the image pickup device 110 from the image pickup device 110, outputs the image data to the display device 140, and causes the display device 140 to display the corresponding image. The display control unit 1531 may perform image processing on the image data acquired from the image pickup device 110 and output the processed image data to the display device 140.
 When the display device 140 includes a display drive device, the second movement control unit 1532 controls the operation of the display drive device. In this case, the second movement control unit 1532 acquires information on the amount of change in the position and posture of the head H of the user P from the detection processing unit 1522 and reads the second relationship from the storage unit 154. Based on the amount of change in the position and posture of the head H and the second relationship, the second movement control unit 1532 determines the amount of change in the position and posture of the display needed to move the display following the movement of the head H; this amount of change can be expressed in the first coordinate system.
 The second movement control unit 1532 then generates a command for the operation of the display drive device to move the display by the determined amount and outputs it to the display drive device. The display drive device thereby moves the display following the movement of the head H so that the display is located near the front of the head H and/or faces the head H. In the present exemplary embodiment, since the display device 140 is a head-mounted display, the second movement control unit 1532 may be omitted.
 The image processing unit 1533 controls the position at which the image captured by the image pickup device 110 is displayed on the display screen of the display device 140. For example, when the display device 140 includes one display, the image processing unit 1533 may move the reference point of the image captured by the image pickup device 110 on the display screen of that display, following the movement of the head H of the user P. When the display device 140 includes a plurality of displays arranged so that their display screens face different directions, the image processing unit 1533 may move the reference point of the captured image across the display screens of the plurality of displays, following the movement of the head H of the user P.
 The image processing unit 1533 acquires information on the amount of change in the position and posture of the head H of the user P from the detection processing unit 1522 and reads the third relationship from the storage unit 154. Based on the amount of change in the position and posture of the head H and the third relationship, the image processing unit 1533 determines the amount of change in the position of the reference point on the display screen that follows the movement of the head H. The image processing unit 1533 then generates a command for moving the reference point by the determined amount and outputs it to the display control unit 1531, which causes the display of the display device 140 to display the captured image so that its reference point moves following the movement of the head H.
 This allows the user P to view the image captured by the image pickup device 110 at a position close to the front of the head H. In the present exemplary embodiment, since the display device 140 is a head-mounted display, the image processing unit 1533 maintains the position of the reference point of the captured image on the display screen regardless of the movement of the head H of the user P.
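The "third relationship" above can be illustrated with a minimal sketch that converts a change in head yaw and pitch into a shift of the image's reference point on a flat display screen. The pixels-per-radian gains, the screen size, and the sign conventions are assumed values for illustration only.

```python
# Sketch: shift the reference point of the displayed image in proportion to
# the head's yaw/pitch change, clamped to the screen bounds. All gains and
# dimensions are illustrative assumptions.
def shift_reference_point(ref_xy, d_yaw, d_pitch,
                          px_per_rad_x=800.0, px_per_rad_y=800.0,
                          screen_w=1920, screen_h=1080):
    """Return the new (x, y) of the reference point, clamped to the screen."""
    x = ref_xy[0] + d_yaw * px_per_rad_x
    y = ref_xy[1] - d_pitch * px_per_rad_y  # screen y grows downward
    x = min(max(x, 0), screen_w - 1)
    y = min(max(y, 0), screen_h - 1)
    return (x, y)
```

A head-mounted display, as used in the present exemplary embodiment, corresponds to gains of zero: the reference point stays fixed on the screen regardless of head movement.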
 [Operation of the imaging system]
 The operation of the image pickup system 100 according to the exemplary embodiment will be described. FIG. 3 is a flowchart showing an example of the operation of the image pickup system 100 according to the exemplary embodiment. In this example, it is assumed that the user P wears a head-mounted display on the head H as the display device 140.
 First, in step S101, the image pickup control device 150 operates in an initial setting mode for determining the initial position and initial posture of each of the image pickup device 110 and the head H of the user P. For example, the image pickup control device 150 starts the initial setting mode in accordance with a start command input to the image pickup input device 160 by the user P.
 Next, in step S102, the image pickup control device 150 determines the initial position and initial posture of the image pickup device 110. Specifically, the user P operates the image pickup input device 160 to move the moving device 120 and thereby change the position and posture of the image pickup device 110 while viewing the image captured by the image pickup device 110 on the display device 140. When the desired image appears on the display device 140, the user P inputs to the image pickup input device 160 a command to set the current position and posture of the image pickup device 110 as its initial position and initial posture, and the image pickup control device 150 determines them accordingly.
 Next, in step S103, the image pickup control device 150 determines the initial position and initial posture of the head H of the user P. Specifically, when the position and posture of the head H reach the desired position and posture, the user P inputs to the image pickup input device 160 a command to determine the initial position and initial posture of the head H. The image pickup control device 150 causes the three infrared sensors 131 of the motion detection device 130 to detect infrared light, processes the detection result of each infrared sensor 131, detects the position and posture of the head H, and determines the detected position and posture as the initial position and initial posture of the head H.
 Next, in step S104, the image pickup control device 150 ends the initial setting mode and starts operating in a normal operation mode.
 Next, in step S105, the image pickup control device 150 causes the image pickup device 110 to start the image pickup operation. The image pickup device 110 continuously captures a moving image and causes the display device 140 to display it.
 Next, in step S106, the image pickup control device 150 causes the three infrared sensors 131 to continuously detect the infrared light of the infrared markers 132 on the head H.
 Next, in step S107, the image pickup control device 150 processes the detection result of each infrared sensor 131 and detects the position and posture of the head H of the user P relative to the initial position and initial posture. The image pickup control device 150 detects the position and posture of the head H at predetermined time intervals, thereby detecting the amount of change in the position and posture of the head H for each predetermined time.
 Next, in step S108, the image pickup control device 150 determines the target position and posture of the image pickup device 110 relative to its initial position and initial posture, based on the first relationship stored in the storage unit 154 and on the position and posture of the head H relative to its initial position and initial posture. The image pickup control device 150 determines the target position and posture of the image pickup device 110 at predetermined time intervals, thereby determining the amount of change in the position and posture of the image pickup device 110 for each predetermined time.
 Next, in step S109, the image pickup control device 150 generates an operation command corresponding to the target position and posture of the image pickup device 110 and outputs it to the moving device 120. The image pickup control device 150 generates an operation command that causes the moving device 120 to operate so that the position and posture of the image pickup device 110 satisfy the target position and posture.
 Next, in step S110, the moving device 120 operates in accordance with the operation command and moves the image pickup device 110 to the target position and posture.
 Next, in step S111, the image pickup control device 150 determines whether the user P has input to the image pickup input device 160 a command to end the operation of the image pickup system 100. If the command has been input (Yes in step S111), the image pickup control device 150 ends the series of processes; if not (No in step S111), it returns to step S106.
 Through the processing of steps S101 to S111, the image pickup control device 150 can change the position and posture of the image pickup device 110 so as to follow the amount of change in the position and posture of the head H, based on the relationship between the initial position and initial posture of the head H of the user P and the initial position and initial posture of the image pickup device 110.
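The normal-operation part of the flowchart (steps S106 to S111) can be condensed, for illustration, into the loop below. The sensing and actuation callables are hypothetical stand-ins for the hardware; poses are reduced to a single coordinate and the first relationship to an identity mapping, purely to keep the sketch short.

```python
# Condensed sketch of the follow loop: sense the head, derive the camera
# target from the change relative to the initial pose, command the motion,
# and repeat until an end command is received.
def tracking_loop(detect_head_pose, move_camera_to, stop_requested,
                  head_init, camera_init):
    """Run the follow loop; returns the last commanded camera target."""
    target = camera_init
    while not stop_requested():
        head_pose = detect_head_pose()   # steps S106-S107: sense the head
        delta = head_pose - head_init    # change from the initial pose
        target = camera_init + delta     # step S108: first relationship (identity here)
        move_camera_to(target)           # steps S109-S110: command the motion
    return target
```

In the disclosure the pose is a full 3-D position and posture and step S108 applies the stored first relationship; the loop structure is the same.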
 (Modification 1)
 Modification 1 of the exemplary embodiment differs from the exemplary embodiment in that a display device 140A includes one display 141 and a display drive device 142 that moves the display 141. By controlling the operation of the display drive device 142, the image pickup control device 150 moves the position and orientation of the display 141 following the movement of the head H of the user P. Modification 1 will be described below focusing on the differences from the exemplary embodiment, and descriptions of points similar to the exemplary embodiment will be omitted as appropriate.
 FIG. 4 is a side view showing an example of the configuration of the display device 140A according to Modification 1 of the exemplary embodiment. As shown in FIG. 4, the display drive device 142 is configured to support the display 141 and to be able to freely change the position and posture of the display 141. In this modification, the display drive device 142 is a robot arm having a plurality of joints; the base of the robot arm is fixed to a support surface or the like, and the display 141 is attached to the distal end of the robot arm. The display drive device 142 can arbitrarily change the position and posture of the display 141 in three dimensions.
 The second movement control unit 1532 of the display control device 153 of the image pickup control device 150 controls the operation of the display drive device 142 so that the position and posture of the display 141 follow the movement of the head H of the user P. For example, when the head H turns upward, the second movement control unit 1532 moves the display 141 upward and directs it downward; when the head H turns to the left, it moves the display 141 to the left with respect to the head H and directs it to the right. The second movement control unit 1532 also controls the operation of the display drive device 142 so as to change the position and posture of the display 141 in accordance with operations made via the image pickup input device 160.
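The behaviour just described (the display staying in front of the head and facing it) can be illustrated geometrically: place the display on a sphere of fixed radius around the head, in the head's viewing direction, and orient it back toward the head. The radius, axis conventions, and function name below are assumptions for illustration only.

```python
# Sketch: display pose that keeps the screen in front of the head and facing
# it. Conventions (assumed): head looks toward -z when yaw = pitch = 0;
# positive yaw turns the head to the left (+x); positive pitch turns it up (+y).
import math


def display_pose_facing_head(head_pos, head_yaw, head_pitch, r=1.0):
    """Return (display position, display yaw, display pitch)."""
    x = head_pos[0] + r * math.cos(head_pitch) * math.sin(head_yaw)
    y = head_pos[1] + r * math.sin(head_pitch)
    z = head_pos[2] - r * math.cos(head_pitch) * math.cos(head_yaw)
    # Orient the screen the opposite way so it looks back at the head:
    # e.g. head up -> display moved up and tilted down, as in the text.
    return (x, y, z), head_yaw + math.pi, -head_pitch
```

This reproduces the examples in the text: a head turned upward yields a display moved upward and directed downward, and a head turned left yields a display moved left and directed right.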
 The operation of the image pickup system 100 according to Modification 1 will be described. FIG. 5 is a flowchart showing an example of the operation of the image pickup system 100 according to Modification 1. First, the processes of steps S201 to S203 are the same as those of steps S101 to S103 in the exemplary embodiment, respectively.
 Next, in step S204, the image pickup control device 150 determines the initial position and initial posture of the display 141. Specifically, the user P operates the image pickup input device 160 to move the display drive device 142 and change the position and posture of the display 141 to a desired position and posture. After the movement stops, the user P inputs to the image pickup input device 160 a command to set the current position and posture of the display 141 as its initial position and initial posture, and the image pickup control device 150 determines them accordingly.
 Next, the processes of steps S205 to S211 are the same as those of steps S104 to S110 in the exemplary embodiment, respectively.
 Next, in step S212, the image pickup control device 150 determines the target position and posture of the display 141 relative to its initial position and initial posture, based on the second relationship stored in the storage unit 154 and on the position and posture of the head H relative to its initial position and initial posture. The image pickup control device 150 determines the target position and posture of the display 141 at predetermined time intervals, thereby determining the amount of change in the position and posture of the display 141 for each predetermined time.
 Next, in step S213, the image pickup control device 150 generates an operation command corresponding to the target position and posture of the display 141 and outputs it to the display drive device 142. The image pickup control device 150 generates an operation command that causes the display drive device 142 to operate so that the position and posture of the display 141 satisfy the target position and posture.
 Next, in step S214, the display drive device 142 operates in accordance with the operation command and moves the display 141 to the target position and posture.
 Next, in step S215, the image pickup control device 150 determines whether the user P has input to the image pickup input device 160 a command to end the operation of the image pickup system 100. If the command has been input (Yes in step S215), the image pickup control device 150 ends the series of processes; if not (No in step S215), it returns to step S207.
 Through the processing of steps S201 to S215, the image pickup control device 150 can change the position and posture of each of the image pickup device 110 and the display 141 so as to follow the amount of change in the position and posture of the head H, based on the relationships among the initial position and initial posture of the head H of the user P, the initial position and initial posture of the image pickup device 110, and the initial position and initial posture of the display 141. The image pickup control device 150 may execute the processes of steps S209 to S211 and the processes of steps S212 to S214 in parallel, or in the reverse of the order described above. In this modification, the display drive device 142 is configured to move both the position and the orientation of the display 141, but it may be configured to move only the position, or only the orientation, of the display 141.
 (Modification 2)
 Modification 2 of the exemplary embodiment differs from the exemplary embodiment in that a display device 140B includes one display 143 having a display surface 143a that is curved so as to surround part of the periphery of the user P. By moving the reference point of the image captured by the image pickup device 110 on the screen of the display surface 143a following the movement of the head H of the user P, the image pickup control device 150 changes the position and orientation at which the image is displayed. Modification 2 will be described below focusing on the differences from the exemplary embodiment and Modification 1, and descriptions of points similar to them will be omitted as appropriate.
 FIG. 6 is a side view showing an example of the configuration of the display device 140B according to Modification 2 of the exemplary embodiment. As shown in FIG. 6, the display surface 143a of the display 143 surrounds the user P horizontally from both sides of the user P to the front, and vertically from above and below the user P to the front. Such a display surface 143a surrounds the user P in the horizontal direction, in the vertical direction, and in directions intersecting the horizontal and vertical directions. The display surface 143a has, for example, a curved shape similar to part of a spherical or ellipsoidal surface. The display surface 143a only needs to have a shape that surrounds part of the periphery of the user P; for example, it may have a shape that surrounds the entire periphery of the user P. The shape of the display surface 143a is not limited to the above curved shape, and may be any shape including bends, curves, or both bends and curves. For example, the shape of the display surface 143a may be similar to at least part of a cylindrical surface or the surface of a polyhedron. In the present specification and claims, a "cylindrical surface" may include the surface of a columnar body whose cross section perpendicular to the axis is circular, elliptical, a shape approximating a circle, a shape approximating an ellipse, or a combination of two or more of these.
 The image processing unit 1533 and the display control unit 1531 of the display control device 153 of the image pickup control device 150 vary the position of the reference point Pf of the image captured by the image pickup device 110 on the screen of the display surface 143a so as to follow the movement of the head H of the user P. For example, the image processing unit 1533 and the display control unit 1531 move the reference point Pf upward on the screen of the display surface 143a when the head H turns upward, and move the reference point Pf leftward with respect to the head H on the screen of the display surface 143a when the head H turns leftward.
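 As a non-authoritative sketch (not part of the disclosure), the mapping from a head rotation to a displacement of the reference point Pf along a curved display surface described above could be modeled as an arc length on a sphere; the spherical screen shape, its radius, and the function name are assumptions:

```python
import math

def reference_point_shift(yaw_rad: float, pitch_rad: float,
                          screen_radius: float) -> tuple:
    """Map a head rotation (yaw, pitch) to a displacement of the image
    reference point Pf along a spherical display surface: an arc of
    angle a on a sphere of radius r has length r * a."""
    dx = screen_radius * yaw_rad    # leftward/rightward arc on the screen
    dy = screen_radius * pitch_rad  # upward/downward arc on the screen
    return dx, dy

# Head H turns 10 degrees to the left and 5 degrees upward;
# assumed screen radius of 1.5 m.
dx, dy = reference_point_shift(math.radians(10), math.radians(5), 1.5)
```

 Under this model, a larger head rotation or a larger screen radius both produce a proportionally larger on-screen shift of Pf.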
 The operation of the image pickup system 100 according to Modification 2 will be described. FIG. 7 is a flowchart showing an example of the operation of the image pickup system 100 according to Modification 2. First, the processing of steps S301 to S303 is the same as that of steps S101 to S103 in the exemplary embodiment, respectively.
 Next, in step S304, the image pickup control device 150 determines the initial position of the reference point Pf of the image captured by the image pickup device 110 on the screen of the display surface 143a of the display 143. Although not limited to this, in this modification the reference point Pf is the center of the image. The image pickup control device 150 determines, as the initial position, the position of the reference point Pfa of the image captured by the image pickup device 110 when the initial position and initial posture of the image pickup device 110 were determined in step S302.
 Next, the processing in steps S305 to S311 is the same as that in steps S104 to S110 in the exemplary embodiment, respectively.
 Next, in step S312, the image pickup control device 150 determines the target position of the target reference point Pft relative to the initial-position reference point Pfa on the screen of the display surface 143a, based on the third relationship stored in the storage unit 154 and on the position and posture of the head H relative to its initial position and initial posture. The target reference point Pft is the destination reference point that follows the variation in the position and posture of the head H. The image pickup control device 150 determines the target position of the target reference point Pft at predetermined time intervals, and thereby determines the amount of variation in the position of the reference point Pf at each predetermined time interval.
 Next, in step S313, the image pickup control device 150 processes the image captured by the image pickup device 110 so that the position of the reference point Pf of the image coincides with the target position of the target reference point Pft on the screen of the display surface 143a, and outputs the processed image to the display 143. That is, the image pickup control device 150 executes image processing that maps the image onto the target reference point Pft.
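 The repositioning in step S313 — processing the image so that its reference point lands on the target point — can be illustrated with a minimal sketch; the list-of-lists image representation, clipping behavior, and function name are assumptions, not the patent's implementation:

```python
def render_at_reference_point(canvas, image, target_xy):
    """Paste `image` (a 2-D list) onto `canvas` so that the image's
    center, i.e. its reference point Pf, lands on `target_xy`,
    clipping whatever falls outside the canvas bounds."""
    ch, cw = len(canvas), len(canvas[0])
    ih, iw = len(image), len(image[0])
    tx, ty = target_xy
    x0, y0 = tx - iw // 2, ty - ih // 2   # top-left corner of the paste
    for sy in range(ih):
        for sx in range(iw):
            dx, dy = x0 + sx, y0 + sy
            if 0 <= dx < cw and 0 <= dy < ch:
                canvas[dy][dx] = image[sy][sx]
    return canvas
```

 Calling this every time a new target reference point Pft is determined keeps the displayed image centered on the point that tracks the head.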
 Next, in step S314, the display 143 displays the processed image on the screen of the display surface 143a.
 Next, the processing of step S315 is the same as that of step S215 in Modification 1.
 Through the processing of steps S301 to S315, the image pickup control device 150 can vary the position and posture of the image pickup device 110 and the display position and display direction of the image so as to follow the amount of variation in the position and posture of the head H, based on the relationship among the initial position and initial posture of the head H of the user P, the initial position and initial posture of the image pickup device 110, and the initial position of the reference point Pf of the image captured by the image pickup device 110 on the display surface 143a of the display 143. The image pickup control device 150 may execute the processing of steps S309 to S311 and the processing of steps S312 to S314 in parallel, or in the reverse order of the above.
 The image pickup control device 150 may be configured so that, upon receiving a command from the user P via the image pickup input device 160 or the like, it can stop, or resume, the control that varies the position of the reference point Pf of the image captured by the image pickup device 110 so as to follow the amount of variation in the position and posture of the head H. While this control is stopped, the position and posture of the image pickup device 110 still vary so as to follow the variation in the position and posture of the head H, but the position of the reference point Pf of the image on the display surface 143a does not vary. The user P can thus view, on the display surface 143a, an image of a subject such as the object W captured from another direction. For example, when the user P moves the position of the reference point Pf of the image by moving the head H and then commands the image pickup input device 160 to stop the following, the user P can view the image projected at a location other than in front of the head H.
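 The stop/resume behavior described here can be sketched as a small controller in which the camera always follows the head while reference-point following can be toggled; the class, its attribute names, and the 1-D pose stand-in are all illustrative assumptions:

```python
class FollowController:
    """Sketch of the stop/resume control: the image pickup device
    always tracks the head, while reference-point tracking on the
    display surface can be stopped and resumed by user command."""

    def __init__(self):
        self.display_follow = True   # toggled via the input device
        self.camera_pose = 0.0       # 1-D stand-in for position/posture
        self.reference_point = 0.0   # 1-D stand-in for Pf on the screen

    def set_display_follow(self, enabled):
        self.display_follow = enabled

    def on_head_delta(self, delta):
        self.camera_pose += delta            # camera always follows
        if self.display_follow:
            self.reference_point += delta    # Pf follows only while enabled
```

 With following stopped, further head motion moves the camera but leaves Pf where it is, which is what lets the user view a differently-oriented image away from the front of the head.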
 Further, in Modification 2, the display 143 may be movable in the same manner as in Modification 1. In this case, the image pickup control device 150 may combine the processing of Modification 2 with the processing of Modification 1 to vary the display position and orientation of the image so as to follow the movement of the head H of the user P.
 (Modification 3)
 Modification 3 of the exemplary embodiment differs from Modification 2 in that the display device 140C includes a plurality of displays 141 arranged so as to surround part of the periphery of the user P. The plurality of displays 141 are arranged so that the positions and orientations of their respective display surfaces 141a differ. The image pickup control device 150 causes each of the plurality of displays 141 to display part of one image captured by the image pickup device 110; that is, it causes the plurality of displays 141 as a whole to display the one image. The image pickup control device 150 follows the movement of the head H of the user P and moves the reference point of the image captured by the image pickup device 110 across the screens of the display surfaces 141a of the plurality of displays 141, thereby changing the position and orientation at which the image is displayed. Hereinafter, Modification 3 will be described focusing on the points that differ from the exemplary embodiment and Modifications 1 and 2, and descriptions of points similar to the exemplary embodiment and Modifications 1 and 2 will be omitted as appropriate.
 FIG. 8 is a side view showing an example of the configuration of the display device 140C according to Modification 3 of the exemplary embodiment. As shown in FIG. 8, the plurality of displays 141 are arranged so as to surround part of the periphery of the user P horizontally from both sides of the user P to the front, and vertically from above and below the user P to the front.
 In this modification, the plurality of displays 141 are arranged so that their display surfaces 141a form a plurality of horizontal rows and a plurality of vertical columns. Such a plurality of display surfaces 141a surround the user P in the horizontal direction, in the vertical direction, and in directions intersecting the horizontal and vertical directions. Further, the plurality of displays 141 are arrayed so that their respective display surfaces 141a lie along a spherical or ellipsoidal surface and are adjacent to one another. Each display surface 141a is directed toward the center or a focal point of the spherical or ellipsoidal surface.
 The plurality of displays 141 may be arranged so as to surround the entire periphery of the user P, may be arranged so as to surround the user P in the horizontal direction, or may be arranged so as to surround the user P in the vertical direction. For example, the plurality of displays 141 may be arranged in a cross-shaped array extending in the horizontal and vertical directions, in a cylindrical-surface array extending and curving in the horizontal direction, or in a cylindrical-surface array extending and curving in the vertical direction. In FIG. 8, the plurality of displays 141 are arranged adjacent to one another, but they may instead be arranged at a distance from one another.
 The image processing unit 1533 and the display control unit 1531 of the display control device 153 of the image pickup control device 150 vary the position of the reference point Pf of the image captured by the image pickup device 110 across the screens of the plurality of display surfaces 141a so as to follow the movement of the head H of the user P. For example, the image processing unit 1533 and the display control unit 1531 move the reference point Pf upward across the screens of the plurality of display surfaces 141a when the head H turns upward, and move the reference point Pf leftward with respect to the head H across the screens of the display surfaces 141a when the head H turns leftward.
 The operation of the image pickup system 100 according to Modification 3 will be described. FIG. 9 is a flowchart showing an example of the operation of the image pickup system 100 according to Modification 3. First, the processing of steps S401 to S403 is the same as that of steps S301 to S303 in Modification 2, respectively.
 Next, in step S404, the image pickup control device 150 determines the initial position of the reference point Pf of the image captured by the image pickup device 110. The image pickup control device 150 determines, as the initial position, the position of the reference point Pfa of the image captured by the image pickup device 110 when the initial position and initial posture of the image pickup device 110 were determined in step S402. Specifically, the image pickup control device 150 determines the display 141 that shows the initial-position reference point Pfa and the position of the reference point Pfa on the screen of the display surface 141a of that display 141.
 Next, the processing in steps S405 to S411 is the same as that in steps S305 to S311 in Modification 2, respectively.
 Next, in step S412, the image pickup control device 150 determines the display 141 that is to show the target reference point Pft, and the target position of the target reference point Pft on the screen of the display surface 141a of that display 141, based on the third relationship stored in the storage unit 154 and on the position and posture of the head H relative to its initial position and initial posture. The image pickup control device 150 executes this determination at predetermined time intervals.
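 One way to picture the two-part determination in step S412 (which display, and where on its surface) is to index into a grid of identical tiles; the uniform tile size and the row-major display numbering are simplifying assumptions not taken from the disclosure:

```python
def locate_display(global_xy, tile_w, tile_h, cols):
    """Map a reference-point position on the combined tiled screen to
    (display index, local position on that display's surface 141a).

    Displays are assumed identical, gap-free, and numbered row-major:
    index = row * cols + col.
    """
    x, y = global_xy
    col, row = x // tile_w, y // tile_h
    index = row * cols + col
    local = (x % tile_w, y % tile_h)
    return index, local
```

 For example, on a 3-column wall of 100 x 100 pixel displays, a global point (250, 120) falls on the display in row 1, column 2.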
 Next, in step S413, the image pickup control device 150 processes the image captured by the image pickup device 110 so that the position of the reference point Pf of the image coincides with the target position of the target reference point Pft on the screen of the display surface 141a of the determined display 141, and outputs the processed image to each display 141.
 Next, in step S414, the plurality of displays 141 as a whole display the processed image on the screens of their display surfaces 141a.
 Next, the processing in step S415 is the same as that of step S315 in Modification 2.
 Through the processing of steps S401 to S415, the image pickup control device 150 can vary the position and posture of the image pickup device 110 and the display position and display direction of the image so as to follow the amount of variation in the position and posture of the head H, based on the relationship among the initial position and initial posture of the head H of the user P, the initial position and initial posture of the image pickup device 110, and the initial position of the reference point Pf of the image captured by the image pickup device 110. The image pickup control device 150 may execute the processing of steps S409 to S411 and the processing of steps S412 to S414 in parallel, or in the reverse order of the above.
 The image pickup control device 150 may be configured so that, upon receiving a command from the user P via the image pickup input device 160 or the like, it can stop, or resume, the control that varies the position of the reference point Pf of the image captured by the image pickup device 110 so as to follow the amount of variation in the position and posture of the head H.
 Further, in this modification, the image pickup control device 150 is configured to display one image using the plurality of displays 141 as a whole, but this is not limiting. For example, the image pickup control device 150 may be configured to display the image captured by the image pickup device 110 on some of the plurality of displays 141. The image pickup control device 150 may be configured to select the display 141 on which the image is displayed so as to follow the amount of variation in the position and posture of the head H.
 Further, in Modification 3, the plurality of displays 141 may be movable in the same manner as in Modification 1. As in Modification 2, the plurality of displays 141 may have display surfaces 141a that include curves, bends, or both curves and bends. In these cases, the image pickup control device 150 may combine the processing of Modification 3 with the processing of Modification 1, the processing of Modification 2, or both, to vary the display position and orientation of the image so as to follow the movement of the head H of the user P.
 (Modification 4)
 Modification 4 of the exemplary embodiment differs from the exemplary embodiment in that the image pickup system 100 includes a plurality of image pickup devices 110 arranged at different positions and in different orientations. The image pickup control device 150 switches the image pickup device 110 whose image is displayed on the display device 140, thereby varying the position and orientation of the image pickup device 110 so as to follow the movement of the head H of the user P. Hereinafter, Modification 4 will be described focusing on the points that differ from the exemplary embodiment and Modifications 1 to 3, and descriptions of points similar to the exemplary embodiment and Modifications 1 to 3 will be omitted as appropriate.
 FIG. 10 is a perspective view showing an example of the configuration of the image pickup devices 110 according to Modification 4 of the exemplary embodiment. As shown in FIG. 10, the plurality of image pickup devices 110 are arranged so as to surround at least part of the periphery of the object W to be imaged. For example, the object W is a workpiece on which the robot 200 works. In this modification, the plurality of image pickup devices 110 are arranged along a cylindrical surface having a vertical axis and are spaced apart from one another. The axis passes through the object W or the vicinity of the object W. The plurality of image pickup devices 110 are arranged at mutually equivalent vertical positions. Each image pickup device 110 is directed toward the object W and is fixed, via a support, to a stationary body such as a ceiling.
 The arrangement of the plurality of image pickup devices 110 is not limited to the above. For example, the plurality of image pickup devices 110 may be arranged so as to surround the object W, the robot arm 210, or both the object W and the robot arm 210. For example, the plurality of image pickup devices 110 may be arranged at different vertical positions. For example, the plurality of image pickup devices 110 may be arranged along a cylindrical surface having a horizontal axis, a cylindrical surface having a vertical axis, a spherical surface, an ellipsoidal surface, or a combination of two or more of these. For example, the plurality of image pickup devices 110 may be arranged along two or more horizontal circumferences at different vertical positions, along two or more vertical circumferences at different horizontal positions or in different horizontal orientations, or along a combination of such horizontal and vertical circumferences.
 The position and posture of each image pickup device 110 are stored in advance in the storage unit 154 of the image pickup control device 150 as parameters of that image pickup device 110. For example, the posture of an image pickup device 110 is the posture angle of the center of its optical axis.
 Upon receiving information on the position and posture of the head H of the user P, the first movement control unit 1512 of the drive control device 151 of the image pickup control device 150 uses the first relationship to determine the target position and target posture of the image pickup device 110 for moving the image pickup device 110 so as to follow the movement of the head H. Using the parameters of the image pickup devices 110 stored in the storage unit 154, the first movement control unit 1512 determines, from among the plurality of image pickup devices 110, the image pickup device 110 whose position and posture most closely approximate the target position and target posture.
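 A minimal sketch of choosing the stored camera pose closest to the target pose might look as follows; the cost function blending position distance with wrapped angle error, its weights, and the tuple representation of a camera's stored parameters are assumptions, not the patent's method:

```python
import math

def select_camera(cameras, target_pos, target_yaw, w_pos=1.0, w_ang=0.5):
    """Return the index of the camera whose stored pose best
    approximates the target pose derived from the head movement.

    `cameras` is a list of (x, y, z, yaw) tuples; `target_pos` is an
    (x, y, z) tuple and `target_yaw` an angle in radians.
    """
    def cost(cam):
        x, y, z, yaw = cam
        d = math.dist((x, y, z), target_pos)
        # Wrap the yaw difference into [-pi, pi] before taking its size.
        a = abs(math.atan2(math.sin(yaw - target_yaw),
                           math.cos(yaw - target_yaw)))
        return w_pos * d + w_ang * a

    return min(range(len(cameras)), key=lambda i: cost(cameras[i]))
```

 The weights trade position proximity against orientation agreement; how the real system balances these is not specified in the disclosure.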
 Further, in order to compensate for the difference between the position and posture of the determined image pickup device 110 and the target position and target posture, the first movement control unit 1512 determines the zoom-in rate or zoom-out rate to be executed by that image pickup device 110. For example, when, along the pointing direction of the determined image pickup device 110, the position of that image pickup device 110 is located forward of the target position, the first movement control unit 1512 determines a zoom-out rate for causing that image pickup device 110 to capture a zoomed-out image. When, along the pointing direction, the position of that image pickup device 110 is located rearward of the target position, the first movement control unit 1512 determines a zoom-in rate for causing that image pickup device 110 to capture a zoomed-in image. The first movement control unit 1512 outputs, to the image pickup control unit 1511, a command for causing the determined image pickup device 110 to capture an image at the determined zoom-in or zoom-out rate. The image pickup control unit 1511 causes that image pickup device 110 to capture an image in accordance with the command. The first movement control unit 1512 is an example of a variable device.
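 Under a simple pinhole-style assumption (apparent size scales inversely with distance to the subject), the compensating zoom rate could be derived as a distance ratio; this formula is illustrative only and is not taken from the disclosure:

```python
def zoom_rate(camera_dist: float, target_dist: float) -> float:
    """Zoom factor compensating the gap between the selected camera's
    distance to the subject and the target (head-following) distance.

    A factor > 1 means zoom in (the camera sits rearward of the target
    position); a factor < 1 means zoom out (the camera sits forward of
    it). Assumes apparent size is inversely proportional to distance.
    """
    if target_dist <= 0:
        raise ValueError("target distance must be positive")
    return camera_dist / target_dist
```

 For instance, a camera 4 m from the subject standing in for a target viewpoint 2 m away would zoom in by a factor of 2.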
 By determining, in accordance with the movement of the head H of the user P, the image pickup device 110 that is to execute imaging, and causing that image pickup device 110 to capture an image, the image pickup control device 150 can cause the display device 140 to display an image whose imaging position and imaging direction vary following the movement of the user's head. The configuration of Modification 4 may also be applied to Modifications 1 to 3.
 (Other embodiments)
 Exemplary embodiments and modifications of the present disclosure have been described above, but the present disclosure is not limited to these exemplary embodiments and modifications. That is, various alterations and improvements are possible within the scope of the present disclosure. For example, forms in which various alterations are applied to the exemplary embodiments and modifications, and forms constructed by combining components of different exemplary embodiments and modifications, are also included within the scope of the present disclosure.
 For example, in the exemplary embodiments and modifications, the motion detection device 130 includes the infrared sensor 131 and the infrared marker 132 as sensors that detect the position and posture of the head H from a position away from the head H of the user P, but this is not limiting; the motion detection device 130 may have any configuration capable of detecting the movement of the head H.
 For example, the motion detection device 130 may include an acceleration sensor and an angular velocity sensor mounted on the head H, and may detect the acceleration and angular velocity of the head H in six axial directions. In this case, the image pickup control device 150 may be configured to receive the detection results from the acceleration sensor and the angular velocity sensor via wired or wireless communication. The image pickup control device 150 may detect the position and posture of the head H using the detection results of the acceleration and angular velocity.
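 A single dead-reckoning step for such a head-mounted accelerometer/gyroscope pair, reduced to one axis for brevity, might be sketched as follows; a practical head tracker would use quaternion orientation, gravity compensation, and drift correction, none of which are detailed in the disclosure:

```python
def integrate_imu(pose, accel, gyro, dt):
    """One dead-reckoning step along a single axis.

    `pose` is (position, velocity, angle). Angular velocity `gyro` is
    integrated once into the posture angle; linear acceleration
    `accel` (gravity assumed already removed) is integrated twice,
    via velocity, into position.
    """
    x, v, theta = pose
    theta += gyro * dt   # posture from angular velocity
    v += accel * dt      # velocity from acceleration
    x += v * dt          # position from velocity
    return (x, v, theta)
```

 Iterating this at the sensor sampling rate yields the head position and posture estimates that the image pickup control device 150 would consume.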
 Alternatively, the motion detection device 130 may include a three-dimensional camera arranged at a position away from the head H, and may capture a three-dimensional image of the head H. The pixel value of each pixel of the three-dimensional image indicates the distance to the subject shown at that pixel. In this case, the image pickup control device 150 may detect the image of the head H shown in the three-dimensional image and the posture of the head H by image processing such as a pattern matching method using a template of the head H, and may detect the position of the head H from the pixel values of the pixels of the three-dimensional image. Further, the motion detection device 130 may include a plurality of three-dimensional cameras arranged at mutually different positions and in mutually different orientations. The image pickup control device 150 may generate a three-dimensional model of the head H by processing the three-dimensional images of the respective three-dimensional cameras, and may detect the position and posture of the head H using the three-dimensional model of the head H.
 Alternatively, the motion detection device 130 may include a magnetic field generator and a magnetic sensor mounted on the head H, and may detect the position and posture of the magnetic sensor. In this case, the image pickup control device 150 may be configured to receive the detection results from the magnetic sensor via wired or wireless communication, and may detect the position and posture of the head H using the detection results of the position and posture of the magnetic sensor.
 Examples of the aspects of the technology of the present disclosure are as follows. An imaging system according to one aspect of the present disclosure includes: an imaging device; a detection device that detects movement of a user's head; a variable device that varies the position and orientation of the imaging device so as to follow the movement of the head detected by the detection device; and a display device that displays an image captured by the imaging device to the user and, following the movement of the head, changes at least one of the position at which the image is displayed and the orientation in which the image is displayed.
 According to the above aspect, the imaging system can vary the position and orientation of the imaging device so as to follow the movement of the head. Further, the imaging system can change, on the display device and following that head movement, the position at which the image captured by the imaging device is displayed, the orientation in which it is displayed, or both. The user can therefore easily and reliably view an image captured from a position and orientation that follow the head movement, displayed at a position and/or orientation corresponding to that movement. The imaging system can thus make both the imaging device and the display surface of the captured image follow the movement of the head.
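The follow-the-head behaviour above can be sketched as a simple control step that applies the detected head-pose delta to the camera pose. The `Pose` type, the single `yaw` angle, and the `gain` parameter are simplifying assumptions for illustration; the actual system works in full six-degree-of-freedom poses.

```python
from dataclasses import dataclass


@dataclass
class Pose:
    """Position (x, y, z) and a yaw orientation, in arbitrary units."""
    x: float = 0.0
    y: float = 0.0
    z: float = 0.0
    yaw: float = 0.0


def follow_head(camera: Pose, head_prev: Pose, head_now: Pose,
                gain: float = 1.0) -> Pose:
    """Return a new camera pose that tracks the detected head motion.

    The camera is displaced by the same delta the head moved, scaled by
    `gain` (gain=1.0 reproduces the head motion one-to-one).
    """
    return Pose(
        camera.x + gain * (head_now.x - head_prev.x),
        camera.y + gain * (head_now.y - head_prev.y),
        camera.z + gain * (head_now.z - head_prev.z),
        camera.yaw + gain * (head_now.yaw - head_prev.yaw),
    )


# One control step: the head moved 0.1 ahead and turned 5 degrees.
cam = follow_head(Pose(), Pose(), Pose(x=0.1, yaw=5.0))
```

Each detection cycle would repeat this step, so the camera pose continuously tracks the head.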
 In the imaging system according to one aspect of the present disclosure, the variable device may carry the imaging device and vary the position and orientation of the imaging device by moving the imaging device.
 According to the above aspect, the imaging device can be moved to a position and orientation that follow the movement of the head. The imaging device can therefore capture images from positions and orientations that faithfully follow the head movement, and can capture images that change continuously as the head moves.
 In the imaging system according to one aspect of the present disclosure, the variable device may carry the imaging device and vary the position and orientation of the imaging device by the variable device itself moving.
 According to the above aspect, because the variable device itself moves, it can enlarge the range over which the position and orientation of the imaging device vary. By moving the head, the user can have the imaging device capture, and can view, images from a wide range of positions and orientations.
 The imaging system according to one aspect of the present disclosure may include a plurality of the imaging devices arranged at different positions and orientations, and the variable device may vary the position and orientation of the imaging device by switching which imaging device's image is displayed on the display device.
 According to the above aspect, by switching the imaging device whose captured image is displayed, the variable device can cause the newly selected imaging device to capture an image from a position and orientation that follow the head movement and display it on the display device. Because switching imaging devices changes the position and orientation from which the displayed image is captured, images can be captured from positions and orientations that follow even quick head movements.
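One plausible way to realize the switching above is to select, from the fixed cameras, the one whose viewing direction best matches the head's current orientation. The following sketch does this for yaw angles only; the function name and angle-only comparison are illustrative assumptions.

```python
def pick_camera(head_yaw_deg, camera_yaws_deg):
    """Return the index of the camera whose viewing direction is closest
    to the head's current yaw.  Angles are in degrees; the difference is
    wrapped so that, e.g., 350 and 10 degrees are treated as 20 apart."""
    def angular_diff(a, b):
        return abs((a - b + 180.0) % 360.0 - 180.0)

    return min(range(len(camera_yaws_deg)),
               key=lambda i: angular_diff(head_yaw_deg, camera_yaws_deg[i]))


# Four cameras facing 0, 90, 180 and 270 degrees; the head turns to 100.
selected = pick_camera(100.0, [0.0, 90.0, 180.0, 270.0])
```

Because no hardware moves, the switch can keep up with head motions faster than any mechanical positioner could follow.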
 In the imaging system according to one aspect of the present disclosure, the detection device may include a sensor that detects the position and posture of the head from a position away from the head.
 According to the above aspect, the detection device can detect the position and posture of the head without contacting it, so the user can move the head without being constrained by the detection device.
 In the imaging system according to one aspect of the present disclosure, the detection device may include at least one infrared sensor, at least one infrared marker, and a processing device. One of the at least one infrared sensor and the at least one infrared marker may be arranged on the head, and the other may be arranged at a position away from the head. The processing device may detect the position and posture of the head by processing the result of the infrared sensor detecting infrared light from the infrared marker.
 According to the above aspect, by using an infrared sensor and an infrared marker, the detection device can detect the position and posture of the head with high accuracy through simple processing. Further, the user can move the head without being constrained by the detection device.
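The "simple processing" can be illustrated with a minimal two-marker example: given the sensed 2-D positions of two markers worn on either side of the head, the midpoint gives the head position and the baseline between them gives the facing direction. The marker layout, coordinate convention, and the +90-degree rotation are assumptions made for this sketch.

```python
import math


def head_pose_from_markers(left, right):
    """Estimate head position and yaw from two infrared markers.

    `left`/`right` are (x, y) marker positions resolved by the infrared
    sensors.  The midpoint is taken as the head position; the facing
    direction is assumed to be the left-to-right baseline rotated by
    +90 degrees.  Returns ((x, y), yaw_in_degrees).
    """
    mx = (left[0] + right[0]) / 2.0
    my = (left[1] + right[1]) / 2.0
    bx, by = right[0] - left[0], right[1] - left[1]
    yaw = math.degrees(math.atan2(by, bx)) + 90.0
    return (mx, my), yaw


# Markers 20 cm apart, centred on the origin: head at (0, 0), facing +y.
pos, yaw = head_pose_from_markers((-0.1, 0.0), (0.1, 0.0))
```

A real system would use more markers and a 3-D solver, but the per-frame cost stays this low: a handful of arithmetic operations per pose.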
 In the imaging system according to one aspect of the present disclosure, the display device may be a head-mounted display attached to the head, and by moving together with the head it may change the position and orientation at which the image is displayed so as to follow the movement of the head.
 According to the above aspect, the display surface of the head-mounted display moves together with the head, and its position and orientation follow the head movement. This simplifies the configuration for making the position and orientation of the display surface follow the head.
 In the imaging system according to one aspect of the present disclosure, the display device may be arranged so as to surround at least part of the periphery of the user in the vertical direction, and may change the position and orientation at which the image is displayed by moving, on its display surface and following the movement of the head, a reference point of the image captured by the imaging device.
 According to the above aspect, the imaging system moves the reference point of the image captured by the imaging device vertically on the display surface of the display device, following the up-and-down movement of the head. The position of the reference point, and the direction in which it lies, can then correspond to the position and orientation of the head, so the user can view the image easily and reliably.
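One simple way to place the reference point, assumed here purely for illustration, is to intersect the head's line of sight with a display surface at a fixed distance: as the head pitches up or down, the reference point rises or falls accordingly.

```python
import math


def reference_point_y(head_pitch_deg, eye_height, display_distance):
    """Vertical position of the image reference point on a display that
    surrounds the user in the vertical direction.

    The reference point is placed where the line of sight, tilted
    `head_pitch_deg` above horizontal, meets a display surface at
    `display_distance` from the eyes.  All lengths share one unit.
    """
    return eye_height + display_distance * math.tan(
        math.radians(head_pitch_deg))


# Eyes at 1.6 m, display 2 m away: level gaze hits 1.6 m; 45 deg up, 3.6 m.
y_level = reference_point_y(0.0, 1.6, 2.0)
y_up = reference_point_y(45.0, 1.6, 2.0)
```

The same mapping, applied to yaw instead of pitch, gives the horizontal case described below.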
 In the imaging system according to one aspect of the present disclosure, the display device may be arranged so as to surround at least part of the periphery of the user in the horizontal direction, and may change the position and orientation at which the image is displayed by moving, on its display surface and following the movement of the head, a reference point of the image captured by the imaging device.
 According to the above aspect, the imaging system moves the reference point of the image captured by the imaging device horizontally on the display surface of the display device, following the left-right movement of the head. The position of the reference point, and the direction in which it lies, can then correspond to the position and orientation of the head, so the user can view the image easily and reliably.
 In the imaging system according to one aspect of the present disclosure, the display device may have a plurality of display surfaces arranged with mutually different orientations, and may change the position and orientation at which the image is displayed by moving, over the plurality of display surfaces and following the movement of the head, a reference point of the image captured by the imaging device.
 According to the above aspect, the plurality of display surfaces of the display device can surround at least part of the periphery of the user. On such display surfaces, the position of the reference point of the captured image, and the direction in which it lies, can correspond to the position and orientation of the head. Each of the display surfaces may, for example, be planar; for instance, they may be the display surfaces of a plurality of flat-panel displays. This makes it possible to reduce the cost of the display device.
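With flat panels arranged around the user, moving the reference point reduces to deciding which panel the head's azimuth falls on and where on that panel. The equal-sector layout and panel width below are illustrative assumptions, not the patent's arrangement.

```python
def locate_on_panels(azimuth_deg, panel_count, panel_width):
    """Map a head azimuth to one of several flat display panels.

    The panels are assumed to stand side by side around the user, each
    covering an equal sector of 360/panel_count degrees.  Returns
    (panel_index, horizontal position on that panel in [0, panel_width)).
    """
    sector = 360.0 / panel_count
    azimuth = azimuth_deg % 360.0
    index = int(azimuth // sector)
    local_x = (azimuth - index * sector) / sector * panel_width
    return index, local_x


# Four 1 m panels: an azimuth of 100 deg lands on panel 1, 10/90 across it.
panel, x = locate_on_panels(100.0, 4, 1.0)
```

As the head sweeps past a sector boundary, the reference point simply continues onto the neighbouring panel.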
 In the imaging system according to one aspect of the present disclosure, the display device may have a display surface that includes at least one of a bend and a curve so as to surround at least part of the periphery of the user.
 According to the above aspect, the display device can have a display surface extending along the periphery of the user so as to surround at least part of it, allowing the display device to present a continuous image to the user. For example, the display surface of a single display device may be configured to surround at least part of the user's periphery.
 In the imaging system according to one aspect of the present disclosure, the display device may include a drive device that moves at least one of the position of the display surface of the display device and the orientation of the display surface so as to follow the movement of the head.
 According to the above aspect, the imaging system can move the position of the display surface of the display device, the orientation of the display surface, or both, so as to follow the movement of the head. Even when the head moves, the imaging system can thus maintain the position and orientation of the display surface relative to the head, so the user can easily view the display surface.
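The drive device's target can be computed, under the illustrative assumption of planar motion, by keeping the surface a fixed distance in front of the head and facing back toward it:

```python
import math


def track_display(head_pos, head_yaw_deg, offset_forward):
    """Target pose for the drive device so the display surface stays
    `offset_forward` in front of the head and faces the user.

    Returns ((x, y) of the surface centre, yaw of the surface normal);
    the normal points back toward the user.
    """
    yaw = math.radians(head_yaw_deg)
    x = head_pos[0] + offset_forward * math.cos(yaw)
    y = head_pos[1] + offset_forward * math.sin(yaw)
    return (x, y), (head_yaw_deg + 180.0) % 360.0


# Head at the origin facing +x: surface 0.5 m ahead, normal facing back.
centre, normal = track_display((0.0, 0.0), 0.0, 0.5)
```

Repeating this computation each detection cycle keeps the head-to-surface transform constant, which is exactly the property that lets the user keep viewing the surface while moving.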
 A robot system according to one aspect of the present disclosure includes the imaging system according to one aspect of the present disclosure and a robot that performs work on an object, and the imaging device is arranged at a position at which it can image at least one of the object and the robot. According to this aspect, effects similar to those of the imaging system according to one aspect of the present disclosure are obtained.
 The ordinal numbers, quantities, and other numerical values used above are all examples for concretely explaining the technology of the present disclosure, and the present disclosure is not limited to the illustrated values. Likewise, the connection relationships between components are examples for concretely explaining the technology of the present disclosure, and the connection relationships that realize the functions of the present disclosure are not limited to them.
 So that the present disclosure can be implemented in various forms without departing from the spirit of its essential characteristics, its scope is defined by the appended claims rather than by the description in the specification; the exemplary embodiments and modifications are therefore illustrative and not restrictive. All changes that fall within the claims and their scope, and all equivalents of the claims and their scope, are intended to be embraced by the claims.
1 Robot system
100 Imaging system
110 Imaging device
120 Moving device (variable device)
130 Motion detection device (detection device)
131 Infrared sensor
132 Infrared marker
140, 140A, 140B, 140C Display device
141a, 143a Display surface
142 Display drive device (drive device)
1512 First movement control unit (variable device)
1522 Detection processing unit (detection device)
1531 Display control unit
1532 Second movement control unit
1533 Image processing unit
H Head
P User
W Object

Claims (13)

  1.  An imaging system comprising:
     an imaging device;
     a detection device that detects movement of a user's head;
     a variable device that varies the position and orientation of the imaging device so as to follow the movement of the head detected by the detection device; and
     a display device that displays an image captured by the imaging device to the user and, following the movement of the head, changes at least one of the position at which the image is displayed and the orientation in which the image is displayed.
  2.  The imaging system according to claim 1, wherein the variable device carries the imaging device and varies the position and orientation of the imaging device by moving the imaging device.
  3.  The imaging system according to claim 1 or 2, wherein the variable device carries the imaging device and varies the position and orientation of the imaging device by the variable device itself moving.
  4.  The imaging system according to any one of claims 1 to 3, comprising a plurality of the imaging devices arranged at different positions and orientations,
     wherein the variable device varies the position and orientation of the imaging device by switching which imaging device's image is displayed on the display device.
  5.  The imaging system according to any one of claims 1 to 4, wherein the detection device includes a sensor that detects the position and posture of the head from a position away from the head.
  6.  The imaging system according to claim 5, wherein the detection device includes at least one infrared sensor, at least one infrared marker, and a processing device,
     one of the at least one infrared sensor and the at least one infrared marker is arranged on the head,
     the other of the at least one infrared sensor and the at least one infrared marker is arranged at a position away from the head, and
     the processing device detects the position and posture of the head by processing the result of the infrared sensor detecting infrared light from the infrared marker.
  7.  The imaging system according to any one of claims 1 to 6, wherein the display device is a head-mounted display attached to the head, and
     the display device changes the position and orientation at which the image is displayed so as to follow the movement of the head by moving together with the head.
  8.  The imaging system according to any one of claims 1 to 6, wherein the display device is arranged so as to surround at least part of the periphery of the user in a vertical direction, and
     the display device changes the position and orientation at which the image is displayed by moving a reference point of the image captured by the imaging device on a display surface of the display device so as to follow the movement of the head.
  9.  The imaging system according to any one of claims 1 to 6 and 8, wherein the display device is arranged so as to surround at least part of the periphery of the user in a horizontal direction, and
     the display device changes the position and orientation at which the image is displayed by moving a reference point of the image captured by the imaging device on a display surface of the display device so as to follow the movement of the head.
  10.  The imaging system according to claim 8 or 9, wherein the display device has a plurality of display surfaces arranged with mutually different orientations, and
     the display device changes the position and orientation at which the image is displayed by moving the reference point of the image captured by the imaging device over the plurality of display surfaces so as to follow the movement of the head.
  11.  The imaging system according to claim 8 or 9, wherein the display device has a display surface that includes at least one of a bend and a curve so as to surround at least part of the periphery of the user.
  12.  The imaging system according to any one of claims 1 to 11, wherein the display device includes a drive device that moves at least one of the position of the display surface of the display device and the orientation of the display surface so as to follow the movement of the head.
  13.  A robot system comprising:
     the imaging system according to any one of claims 1 to 12; and
     a robot that performs work on an object,
     wherein the imaging device is arranged at a position at which it can image at least one of the object and the robot.
PCT/JP2021/022669 2020-06-19 2021-06-15 Imaging system and robot system WO2021256463A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2022531836A JP7478236B2 (en) 2020-06-19 2021-06-15 Imaging system and robot system

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020-106026 2020-06-19
JP2020106026 2020-06-19

Publications (1)

Publication Number Publication Date
WO2021256463A1 true WO2021256463A1 (en) 2021-12-23

Family

ID=79267954

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/022669 WO2021256463A1 (en) 2020-06-19 2021-06-15 Imaging system and robot system

Country Status (2)

Country Link
JP (1) JP7478236B2 (en)
WO (1) WO2021256463A1 (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000176675A (en) * 1998-12-17 2000-06-27 Kawasaki Heavy Ind Ltd Hmd-attached monitoring device for weld zone
WO2016189924A1 (en) * 2015-05-28 2016-12-01 株式会社日立製作所 Robot operation device and program
WO2017033355A1 (en) * 2015-08-25 2017-03-02 川崎重工業株式会社 Manipulator system
JP2018202032A (en) * 2017-06-08 2018-12-27 株式会社メディカロイド Remote control apparatus for medical equipment
JP2019179226A (en) * 2018-03-30 2019-10-17 株式会社小松製作所 Display device and remote control system
JP2019202354A (en) * 2018-05-21 2019-11-28 Telexistence株式会社 Robot control device, robot control method, and robot control program

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5877463B2 (en) 2011-12-07 2016-03-08 矢崎総業株式会社 Shield shell


Also Published As

Publication number Publication date
JP7478236B2 (en) 2024-05-02
JPWO2021256463A1 (en) 2021-12-23

Similar Documents

Publication Publication Date Title
US9104981B2 (en) Robot teaching system and method using imaging based on training position
US20130338525A1 (en) Mobile Human Interface Robot
SE504846C2 (en) Control equipment with a movable control means
CN111614919B (en) Image recording device and head-mounted display
JP2006513504A (en) Position and orientation reading by projector
JP6798425B2 (en) Robot control method and robot system
CN108536142B (en) Industrial robot anti-collision early warning system and method based on digital grating projection
JP6950192B2 (en) Information processing equipment, information processing systems and programs
WO2017122270A1 (en) Image display device
US20230256606A1 (en) Robot System with Object Detecting Sensors
CN112008692A (en) Teaching method
JP2001148025A (en) Device and method for detecting position, and device and method for detecting plane posture
WO2021256463A1 (en) Imaging system and robot system
EP3147752B1 (en) An arrangement for providing a user interface
WO2021256464A1 (en) Image capturing system and robot system
JPH05318361A (en) Method for manipulating object
WO2017086771A1 (en) A visual surveillance system with target tracking or positioning capability
JP7224559B2 (en) Remote control manipulator system and remote control support system
Yu et al. Efficiency and learnability comparison of the gesture-based and the mouse-based telerobotic systems
US20230190403A1 (en) System for monitoring a surgical luminaire assembly
JP6005496B2 (en) Remote monitoring device and remote monitoring method
US20230214004A1 (en) Information processing apparatus, information processing method, and information processing program
KR20190091870A (en) Robot control system using motion sensor and VR
US11407117B1 (en) Robot centered augmented reality system
US20200057501A1 (en) System, devices, and methods for remote projection of haptic effects

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21824947

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2022531836

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21824947

Country of ref document: EP

Kind code of ref document: A1