WO2020216037A1 - Control device, imaging device, mobile body, control method, and program - Google Patents

Control device, imaging device, mobile body, control method, and program

Info

Publication number
WO2020216037A1
Authority
WO
WIPO (PCT)
Prior art keywords
distance
imaging device
lens
optical axis
imaging
Prior art date
Application number
PCT/CN2020/083101
Other languages
English (en)
French (fr)
Inventor
本庄谦一
永山佳范
Original Assignee
深圳市大疆创新科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳市大疆创新科技有限公司 filed Critical 深圳市大疆创新科技有限公司
Priority to CN202080002854.7A priority Critical patent/CN112154371A/zh
Publication of WO2020216037A1 publication Critical patent/WO2020216037A1/zh
Priority to US17/506,426 priority patent/US20220046177A1/en

Links

Images

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06Systems determining position data of a target
    • G01S17/08Systems determining position data of a target for measuring distance only
    • G01S17/10Systems determining position data of a target for measuring distance only using transmission of interrupted, pulse-modulated waves
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/67Focus control based on electronic image sensor signals
    • H04N23/671Focus control based on electronic image sensor signals in combination with active ranging signals, e.g. using light or sound signals emitted toward objects
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B7/00Mountings, adjusting means, or light-tight connections, for optical elements
    • G02B7/28Systems for automatic generation of focusing signals
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B7/00Mountings, adjusting means, or light-tight connections, for optical elements
    • G02B7/28Systems for automatic generation of focusing signals
    • G02B7/30Systems for automatic generation of focusing signals using parallactic triangle with a base line
    • G02B7/32Systems for automatic generation of focusing signals using parallactic triangle with a base line using active means, e.g. light emitter
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B7/00Mountings, adjusting means, or light-tight connections, for optical elements
    • G02B7/28Systems for automatic generation of focusing signals
    • G02B7/36Systems for automatic generation of focusing signals using image sharpness techniques, e.g. image processing techniques for generating autofocus signals
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B13/00Viewfinders; Focusing aids for cameras; Means for focusing for cameras; Autofocus systems for cameras
    • G03B13/18Focusing aids
    • G03B13/20Rangefinders coupled with focusing arrangements, e.g. adjustment of rangefinder automatically focusing camera
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B13/00Viewfinders; Focusing aids for cameras; Means for focusing for cameras; Autofocus systems for cameras
    • G03B13/32Means for focusing
    • G03B13/34Power focusing
    • G03B13/36Autofocus systems
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B15/00Special procedures for taking photographs; Apparatus therefor
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B37/00Panoramic or wide-screen photography; Photographing extended surfaces, e.g. for surveying; Photographing internal surfaces, e.g. of pipe
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/45Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from two or more image sensors being of different type or operating in different modes, e.g. with a CMOS sensor for moving images in combination with a charge-coupled device [CCD] for still images
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/67Focus control based on electronic image sensor signals
    • H04N23/673Focus control based on electronic image sensor signals based on contrast or high frequency components of image signals, e.g. hill climbing method
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/695Control of camera direction for changing a field of view, e.g. pan, tilt or based on tracking of objects

Definitions

  • the invention relates to a control device, an imaging device, a mobile body, a control method, and a program.
  • there is the following description: based on the result of comparing each distance-correction TOF pixel with the imaging pixel corresponding to that distance-correction TOF pixel, for each distance-correction TOF pixel whose brightness difference from the corresponding imaging pixel is greater than or equal to a threshold value, the distance pixel corresponding to that distance-correction TOF pixel is detected as an error pixel.
  • [Patent Document 1] Japanese Patent Application Publication No. 2014-70936
  • in an imaging device including a distance measuring sensor that uses an image sensor to measure the distance to the subject in each of multiple regions, the lens optical axis of the distance measuring sensor and the lens optical axis of the imaging device are physically offset. Therefore, among the respective distances of the multiple regions measured by the distance measuring sensor, the distance corresponding to the subject passing through the lens optical axis of the imaging device differs according to the distance to the subject.
  • the control device may be a control device that controls an imaging device, the imaging device including a distance measuring sensor that uses a first image sensor to measure the distance to the subject associated with each of multiple regions.
  • the control device may include a circuit configured to determine, from the plurality of distances measured by the distance measuring sensor, the distance to the subject passing through the lens optical axis of the imaging device, based on the plurality of distances, a first distance between the lens optical axis of the imaging device and the lens optical axis of the distance measuring sensor, and the angle of view of the distance measuring sensor.
  • the circuit may be configured to determine, based on each of the plurality of distances, the first distance, and the angle of view, the width of the ranging range of the distance measuring sensor at each of the plurality of distances in the direction from the lens optical axis of the imaging device toward the lens optical axis of the distance measuring sensor, and to determine, from the plurality of distances, the distance to the subject passing through the lens optical axis of the imaging device based on the ratio of the first distance to each width.
  • the circuit may be configured to perform focus control of the imaging device based on the determined distance.
  • when the distance to the subject passing through the lens optical axis of the imaging device cannot be determined from the plurality of distances, the circuit may be configured to perform focus control of the imaging device based on the contrast evaluation value of the image captured by the imaging device.
  • the circuit may be configured to determine the first target position of the focus lens of the imaging device based on the first distance, determine the second target position of the focus lens based on the blur amounts of at least two images captured by the imaging device while the focus lens is moved based on the first target position, and move the focus lens to the second target position, thereby performing focus control.
  • the imaging device according to one aspect of the present invention may include the above-mentioned control device, the distance measuring sensor, and a second image sensor that photographs a subject.
  • the mobile body according to one aspect of the present invention may be a mobile body that carries the above-mentioned imaging device and moves.
  • the control method may be a control method for controlling an imaging device, the imaging device including a distance measuring sensor that uses a first image sensor to measure the distance to the subject associated with each of multiple regions.
  • the control method may include the stage of determining, from the plurality of distances measured by the distance measuring sensor, the distance to the subject passing through the lens optical axis of the imaging device, based on the plurality of distances, the first distance between the lens optical axis of the imaging device and the lens optical axis of the distance measuring sensor, and the angle of view of the distance measuring sensor.
  • the program according to one aspect of the present invention may be a program for causing a computer to function as the above-mentioned control device.
  • according to one aspect of the present invention, the distance to a desired subject can be measured with high accuracy by using the distance measuring sensor that uses the first image sensor to measure the distance to the subject in each of the plurality of regions.
  • Fig. 1 is an external perspective view of the camera system.
  • Fig. 2 is a schematic diagram of functional blocks of the camera system.
  • FIG. 3 is a diagram showing an example of the positional relationship between the lens optical axis of the imaging device and the lens optical axis of the TOF sensor.
  • FIG. 4 is a flowchart showing an example of a focus control procedure of the imaging control unit.
  • FIG. 5 is a diagram showing an example of a curve representing the relationship between the blur amount and the lens position.
  • FIG. 6 is a diagram showing an example of a process of calculating the distance to the object based on the amount of blur.
  • FIG. 7 is a diagram for explaining the relationship between the object position, the lens position, and the focal length.
  • FIG. 8A is a diagram for explaining the moving direction of the focus lens.
  • FIG. 8B is a diagram for explaining the moving direction of the focus lens.
  • FIG. 9 is a flowchart showing another example of the focus control procedure of the imaging control section.
  • Fig. 10 is an external perspective view showing another form of the imaging system.
  • Fig. 11 is a diagram showing an example of the appearance of an unmanned aircraft and a remote operation device.
  • Fig. 12 is a diagram showing an example of a hardware configuration.
  • the blocks may represent (1) a stage of a process in which an operation is performed or (2) a "part" of a device that has the role of performing an operation. Specific stages and "parts" can be implemented by programmable circuits and/or processors.
  • dedicated circuits may include digital and/or analog hardware circuits, and may include integrated circuits (ICs) and/or discrete circuits.
  • the programmable circuit may include a reconfigurable hardware circuit.
  • reconfigurable hardware circuits may include logical AND, logical OR, logical XOR, logical NAND, logical NOR, and other logical operations, as well as memory elements such as flip-flops, registers, field-programmable gate arrays (FPGA), and programmable logic arrays (PLA).
  • the computer-readable medium may include any tangible device that can store instructions to be executed by a suitable device.
  • as a result, the computer-readable medium on which the instructions are stored constitutes a product including instructions that can be executed to create means for performing the operations specified by the flowcharts or block diagrams.
  • examples of the computer-readable medium may include electronic storage media, magnetic storage media, optical storage media, electromagnetic storage media, semiconductor storage media, and the like.
  • more specific examples of the computer-readable medium may include floppy (registered trademark) disks, flexible disks, hard disks, random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), electrically erasable programmable read-only memory (EEPROM), static random access memory (SRAM), compact disc read-only memory (CD-ROM), digital versatile discs (DVD), Blu-ray (registered trademark) discs, memory sticks, integrated circuit cards, and the like.
  • the computer-readable instructions may include either source code or object code described in any combination of one or more programming languages.
  • the source code or object code includes traditional procedural programming languages.
  • traditional procedural programming languages may be assembly instructions, instruction set architecture (ISA) instructions, machine instructions, machine-dependent instructions, microcode, firmware instructions, state-setting data, object-oriented programming languages such as Smalltalk (registered trademark), JAVA (registered trademark), and C++, and the "C" programming language or similar programming languages.
  • the computer-readable instructions may be provided, locally or via a local area network (LAN) or a wide area network (WAN) such as the Internet, to a processor or programmable circuit of a general-purpose computer, a special-purpose computer, or another programmable data processing device.
  • the processor or programmable circuit can execute the computer-readable instructions to create means for performing the operations specified in the flowcharts or block diagrams. Examples of processors include computer processors, processing units, microprocessors, digital signal processors, controllers, microcontrollers, and the like.
  • FIG. 1 is a diagram showing an example of an external perspective view of an imaging system 10 according to this embodiment.
  • the imaging system 10 includes an imaging device 100, a supporting mechanism 200 and a grip 300.
  • the support mechanism 200 uses actuators to rotatably support the imaging device 100 around the roll axis, the pitch axis, and the yaw axis, respectively.
  • the support mechanism 200 can change or maintain the posture of the imaging device 100 by rotating the imaging device 100 around at least one of the roll axis, the pitch axis, and the yaw axis.
  • the support mechanism 200 includes a roll axis drive mechanism 201, a pitch axis drive mechanism 202, and a yaw axis drive mechanism 203.
  • the supporting mechanism 200 also includes a base 204 for fixing the yaw axis driving mechanism 203.
  • the grip 300 is fixed to the base 204.
  • the grip 300 includes an operation interface 301 and a display unit 302.
  • the imaging device 100 is fixed to the pitch axis driving mechanism 202.
  • the operation interface 301 receives instructions for operating the imaging device 100 and the supporting mechanism 200 from the user.
  • the operation interface 301 may include a shutter/recording button that instructs the imaging device 100 to capture a still image or record a moving image.
  • the operation interface 301 may include a power/function button for instructing the imaging system 10 to be powered on or off and for switching between the still image shooting mode and the moving image shooting mode of the imaging device 100.
  • the display unit 302 can display an image captured by the imaging device 100.
  • the display unit 302 can display a menu screen for operating the imaging device 100 and the support mechanism 200.
  • the display unit 302 may be a touch panel display that receives instructions for operating the imaging device 100 and the supporting mechanism 200.
  • the user holds the grip 300 to take a still image or a moving image through the imaging device 100.
  • FIG. 2 is a schematic diagram of functional blocks of the camera system 10.
  • the imaging device 100 includes an imaging control unit 110, an image sensor 120, a memory 130, a lens control unit 150, a lens driving unit 152, a plurality of lenses 154, and a TOF sensor 160.
  • the image sensor 120 may be composed of CCD or CMOS.
  • the image sensor 120 is an example of a second image sensor used for shooting.
  • the image sensor 120 outputs image data of optical images formed by the plurality of lenses 154 to the imaging control unit 110.
  • the imaging control unit 110 may be constituted by a microprocessor such as a CPU or an MPU, a microcontroller such as an MCU, or the like.
  • in accordance with operation instructions from the grip 300 to the imaging device 100, the imaging control unit 110 performs demosaicing processing on the image signal output from the image sensor 120, thereby generating image data.
  • the imaging control unit 110 stores image data in the memory 130.
  • the imaging control unit 110 controls the TOF sensor 160.
  • the imaging control unit 110 is an example of a circuit.
  • the TOF sensor 160 is a time-of-flight type sensor that measures the distance to an object.
  • the imaging device 100 adjusts the position of the focus lens based on the distance measured by the TOF sensor 160, thereby performing focus control.
  • the memory 130 may be a computer-readable storage medium and may include at least one of SRAM, DRAM, EPROM, EEPROM, and flash memory such as USB memory.
  • the memory 130 stores programs and the like necessary for the imaging control unit 110 to control the image sensor 120 and the like.
  • the memory 130 may be provided inside the housing of the imaging device 100.
  • the grip 300 may include another memory for storing image data captured by the imaging device 100.
  • the grip 300 may include a slot from which that memory can be detached from the housing of the grip 300.
  • the multiple lenses 154 can function as a zoom lens, a variable focal length lens, and a focusing lens. At least a part or all of the plurality of lenses 154 are configured to be movable along the optical axis.
  • the lens control unit 150 drives the lens driving unit 152 in accordance with a lens control command from the imaging control unit 110 to move one or more lenses 154 in the optical axis direction.
  • the lens control commands are, for example, zoom control commands and focus control commands.
  • the lens driving unit 152 may include a voice coil motor (VCM) that moves at least a part or all of the plurality of lenses 154 in the optical axis direction.
  • the lens driving unit 152 may include a motor such as a DC motor, a coreless motor, or an ultrasonic motor.
  • the lens driving unit 152 can transmit the power from the motor to at least a part or all of the plurality of lenses 154 via mechanism components such as cam rings and guide shafts, so as to move at least a part or all of the plurality of lenses 154 along the optical axis.
  • the imaging device 100 further includes a posture control unit 210, an angular velocity sensor 212, and an acceleration sensor 214.
  • the angular velocity sensor 212 detects the angular velocity of the imaging device 100.
  • the angular velocity sensor 212 detects angular velocities around the roll axis, pitch axis, and yaw axis of the imaging device 100, respectively.
  • the posture control unit 210 acquires angular velocity information related to the angular velocity of the imaging device 100 from the angular velocity sensor 212.
  • the angular velocity information may show the angular velocity around the roll axis, pitch axis, and yaw axis of the camera device 100, respectively.
  • the posture control unit 210 acquires acceleration information related to the acceleration of the imaging device 100 from the acceleration sensor 214.
  • the acceleration information may show the acceleration in the respective directions of the roll axis, the pitch axis, and the yaw axis of the imaging device 100.
  • the angular velocity sensor 212 and the acceleration sensor 214 may be provided in a housing that houses the image sensor 120, the lens 154, and the like. In this embodiment, a form in which the imaging device 100 and the support mechanism 200 are integratedly configured will be described. However, the supporting mechanism 200 may include a pedestal that detachably fixes the imaging device 100. In this case, the angular velocity sensor 212 and the acceleration sensor 214 may be provided outside the housing of the imaging device 100 such as a pedestal.
  • the posture control unit 210 controls the support mechanism 200 to maintain or change the posture of the imaging device 100 based on the angular velocity information and acceleration information.
  • the posture control unit 210 controls the support mechanism 200 to maintain or change the posture of the imaging device 100 in accordance with the operation mode of the support mechanism 200 for controlling the posture of the imaging device.
  • the operation modes include a mode in which at least one of the roll axis drive mechanism 201, the pitch axis drive mechanism 202, and the yaw axis drive mechanism 203 of the support mechanism 200 is operated so that the posture change of the imaging device 100 follows the posture change of the base 204 of the support mechanism 200.
  • the operation modes include a mode in which each of the roll axis drive mechanism 201, the pitch axis drive mechanism 202, and the yaw axis drive mechanism 203 of the support mechanism 200 is operated so that the posture change of the imaging device 100 follows the posture change of the base 204 of the support mechanism 200.
  • the operation modes include a mode in which each of the pitch axis drive mechanism 202 and the yaw axis drive mechanism 203 of the support mechanism 200 is operated so that the posture change of the imaging device 100 follows the posture change of the base 204 of the support mechanism 200.
  • the operation modes include a mode in which only the yaw axis drive mechanism 203 of the support mechanism 200 is operated so that the posture change of the imaging device 100 follows the posture change of the base 204 of the support mechanism 200.
  • the operation modes may include an FPV (First Person View) mode in which the support mechanism 200 is operated so that the posture change of the imaging device 100 follows the posture change of the base 204 of the support mechanism 200, and a fixed mode in which the support mechanism 200 is operated so as to maintain the posture of the imaging device 100.
  • the FPV mode is a mode in which at least one of the roll axis drive mechanism 201, the pitch axis drive mechanism 202, and the yaw axis drive mechanism 203 is operated so that the posture change of the imaging device 100 follows the posture change of the base 204 of the support mechanism 200.
  • the fixed mode is a mode in which at least one of the roll axis drive mechanism 201, the pitch axis drive mechanism 202, and the yaw axis drive mechanism 203 is operated so as to maintain the current posture of the imaging device 100.
  • the TOF sensor 160 includes a light emitting unit 162, a light receiving unit 164, a light emitting control unit 166, a light receiving control unit 167, and a memory 168.
  • the TOF sensor 160 is an example of a distance measuring sensor.
  • the light emitting part 162 includes at least one light emitting element 163.
  • the light emitting element 163 is a device, such as an LED or a laser, that repeatedly emits high-speed-modulated pulsed light.
  • the light emitting element 163 may emit infrared pulse light.
  • the light emission control unit 166 controls the light emission of the light emitting element 163.
  • the light emission control part 166 can control the pulse width of the pulse light emitted by the light emitting element 163.
  • the light-receiving unit 164 includes a plurality of light-receiving elements 165 that measure the distance to the subject associated with each of the plurality of regions.
  • the light receiving section 164 is an example of a first image sensor used for distance measurement.
  • the plurality of light receiving elements 165 respectively correspond to a plurality of regions.
  • the light receiving element 165 repeatedly receives reflected light of pulsed light from the object.
  • the light receiving control unit 167 controls the light receiving of the light receiving element 165.
  • the light receiving control unit 167 measures the distance to the subject associated with each of the plurality of regions based on the amount of reflected light repeatedly received by the light receiving element 165 during the predetermined light receiving period.
  • the light receiving control unit 167 can measure the distance to the subject by determining the phase difference between the pulsed light and the reflected light based on the amount of reflected light repeatedly received by the light receiving element 165 during a predetermined light receiving period.
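The phase-difference measurement described above can be sketched briefly. The four-bucket sampling scheme below (charges q1..q4 sampled at 0°, 90°, 180°, and 270° of the modulation signal) is a common indirect-TOF technique assumed here for illustration; the patent itself only states that the distance is derived from the phase difference between the pulsed light and the reflected light.

```python
# Minimal sketch of indirect time-of-flight ranging via phase difference.
# The four-bucket scheme is an assumption, not taken from the patent.
import math

C = 299_792_458.0  # speed of light (m/s)

def itof_distance(q1: float, q2: float, q3: float, q4: float,
                  f_mod: float) -> float:
    """Estimate distance from four charge samples taken at 0/90/180/270
    degree offsets of a modulation signal with frequency f_mod (Hz)."""
    phase = math.atan2(q3 - q4, q1 - q2)  # phase difference of the echo
    if phase < 0.0:
        phase += 2.0 * math.pi
    # Light travels to the subject and back, hence the extra factor of 2
    # in the denominator (4*pi instead of 2*pi).
    return (C * phase) / (4.0 * math.pi * f_mod)

# Example: with 10 MHz modulation these samples give roughly 4.9 m,
# well inside the unambiguous range C / (2 * f_mod) = 15 m.
print(itof_distance(0.3, 0.7, 0.9, 0.1, 10e6))
```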
  • the memory 168 may be a computer-readable storage medium, and may include at least one of SRAM, DRAM, EPROM, and EEPROM.
  • the memory 168 stores a program necessary for the light emitting control unit 166 to control the light emitting unit 162, a program necessary for the light receiving control unit 167 to control the light receiving unit 164, and the like.
  • the lens optical axis of the imaging device 100 and the lens optical axis of the TOF sensor 160 are physically shifted.
  • the lens optical axis 101 of the imaging device 100 and the lens optical axis 161 of the TOF sensor 160 are parallel, but the lens optical axis 101 and the lens optical axis 161 are separated by a distance h.
  • the lens optical axis 101 is an optical axis of a lens system including a lens 154 that forms light on the light receiving surface of the image sensor 120 of the imaging device 100.
  • the lens optical axis 161 is the optical axis of the lens system that forms light on the light receiving unit 164 of the TOF sensor 160, that is, on the light receiving surface of its image sensor.
  • the angle of view of the imaging device 100 is θ, and the angle of view of the TOF sensor 160 is φ.
  • since the two optical axes are offset in this way, if the distance to a subject within the ranging range of the TOF sensor 160 differs, the light receiving element 165, among the plurality of light receiving elements 165 of the TOF sensor 160, that measures the distance to that subject also differs.
  • in FIG. 3, for simplicity of explanation, the ranging range 1601 of the TOF sensor 160 is shown with 4×4 light receiving elements 165.
  • for example, when the distance to the subject is X_1, the light receiving element 165 corresponding to the third row from the top of the ranging range 1601 measures the distance to the subject passing through the lens optical axis 101 of the imaging device 100.
  • on the other hand, when the distance to the subject is X_2, the light receiving element 165 corresponding to the fourth row from the top of the ranging range 1601 measures the distance to the subject passing through the lens optical axis 101 of the imaging device 100. That is, if the distance to the subject passing through the lens optical axis 101 differs, the light receiving element 165 that measures the distance to that subject also differs.
  • the imaging control unit 110 determines, from the plurality of distances X_n, the distance to the subject passing through the lens optical axis 101 of the imaging device 100, based on the plurality of distances X_n measured by the TOF sensor 160, the distance h between the lens optical axis 101 of the imaging device 100 and the lens optical axis 161 of the TOF sensor 160, and the angle of view φ of the TOF sensor 160.
  • the imaging device 100 can determine, based on each of the plurality of distances X_n, the distance h, and the angle of view φ, the width H_n of the ranging range 1601 of the TOF sensor 160 at each of the plurality of distances X_n in the direction from the lens optical axis 101 of the imaging device 100 toward the lens optical axis 161 of the TOF sensor 160.
  • the imaging control unit 110 may determine, from the plurality of distances X_n, the distance to the subject passing through the lens optical axis 101 of the imaging device 100, based on the ratio h/H_n of the distance h to each width H_n.
  • the TOF sensor 160 includes 4 ⁇ 4 light receiving elements 165.
  • the light receiving element 165 corresponding to the third column from the top in the ranging range 1601 is measured to pass through the lens optical axis 101 of the imaging device 100 The distance of the subject X 1 .
  • the light-receiving element 165 corresponding to the fourth column from the top in the ranging range 1601 measures the distance passing through the lens optical axis 101 of the imaging device 100 The distance of the subject X 2 .
  • in this way, based on the plurality of distances X_n measured by the TOF sensor 160, the distance h, and the angle of view φ, the imaging control unit 110 can determine, from the plurality of distances X_n, the distance to the subject passing through the lens optical axis 101 of the imaging device 100.
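A minimal sketch of this determination, assuming the FIG. 3 geometry (rows counted from the top, with the camera's lens optical axis offset by h toward the lower rows); the function names are hypothetical, and the exact handling of the boundary values h/H_n = 1/4 and 1/2 is a choice of this sketch:

```python
# Select which row of the 4x4 TOF array sees the subject on the camera's
# lens optical axis, using H_n = 2 * X_n * tan(phi / 2) and the ratio h/H_n.
import math
from typing import Optional

def width_of_ranging_range(x_n: float, phi: float) -> float:
    """H_n: width of the TOF ranging range at distance x_n for a TOF
    angle of view phi (radians)."""
    return 2.0 * x_n * math.tan(phi / 2.0)

def row_on_camera_axis(x_n: float, h: float, phi: float,
                       rows: int = 4) -> Optional[int]:
    """1-based row (from the top) whose light receiving element measures the
    subject on the camera's optical axis at distance x_n, or None if that
    axis falls outside the ranging range."""
    H_n = width_of_ranging_range(x_n, phi)
    offset = 0.5 + h / H_n  # normalized position; 0.5 is the TOF axis
    if not 0.0 <= offset < 1.0:
        return None
    return int(offset * rows) + 1

# With h = 30 mm and a 60-degree angle of view, a near subject maps to a
# lower row than a far one, matching the X_1 / X_2 example above.
phi = math.radians(60.0)
print(row_on_camera_axis(2.00, 0.03, phi))  # far subject  -> row 3
print(row_on_camera_axis(0.10, 0.03, phi))  # near subject -> row 4
```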
  • the imaging control unit 110 may perform focus control of the imaging device 100 based on the determined distance.
  • none of the multiple distances X_n measured by the TOF sensor 160 may correspond to the subject passing through the lens optical axis 101 of the imaging device 100.
  • in this case, the distance to the subject cannot be measured by the TOF sensor 160. Therefore, when the imaging control unit 110 cannot determine the distance to the subject passing through the lens optical axis 101 of the imaging device 100 from the plurality of distances X_n, it can perform focus control of the imaging device 100 based on the contrast evaluation value of the image. That is, when the imaging control unit 110 cannot determine the distance to the subject passing through the lens optical axis 101 of the imaging device 100 from the plurality of distances X_n, it can perform contrast autofocus.
  • FIG. 4 is a flowchart showing an example of a focus control procedure of the imaging control unit 110.
  • the imaging control unit 110 causes the TOF sensor 160 to measure the distance to the subject in each of the plurality of regions (light-receiving elements 165) (S100).
  • the imaging control unit 110 determines whether the distance to the subject passing through the lens optical axis 101 of the imaging device 100 can be determined from a plurality of distances X n (S104).
  • when the distance to the subject passing through the lens optical axis 101 can be determined, the imaging control unit 110 determines the target position of the focus lens for focusing on the subject based on the determined distance (S106).
  • when the imaging control unit 110 cannot determine the distance to the subject passing through the lens optical axis 101 of the imaging device 100, it performs contrast AF and determines the target position of the focus lens for focusing on the subject based on the contrast evaluation value of the image (S108).
  • the imaging control unit 110 moves the focus lens to the determined target position (S110).
  • according to the present embodiment, it is possible to accurately determine, among the plurality of regions to be measured by the TOF sensor 160, the region that measures the distance to the subject passing through the lens optical axis 101 of the imaging device 100. Therefore, the distance to the subject can be measured with high accuracy, and the accuracy of focus control based on the distance measurement result of the TOF sensor 160 can be improved.
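A minimal sketch of the S100-S110 flow of FIG. 4; the callables passed in are hypothetical stand-ins for the hardware-facing steps, not APIs from the patent:

```python
# Focus control: use the TOF-derived on-axis distance when available,
# otherwise fall back to contrast AF (S100 -> S104 -> S106/S108 -> S110).
from typing import Callable, List, Optional

def focus_control(
    measure_tof_distances: Callable[[], List[List[float]]],             # S100
    pick_axis_distance: Callable[[List[List[float]]], Optional[float]], # S104
    lens_position_for: Callable[[float], float],                        # S106
    contrast_af_search: Callable[[], float],                            # S108
    move_focus_lens: Callable[[float], None],                           # S110
) -> None:
    distances = measure_tof_distances()
    d = pick_axis_distance(distances)
    target = lens_position_for(d) if d is not None else contrast_af_search()
    move_focus_lens(target)

# Usage with trivial stand-ins:
focus_control(
    measure_tof_distances=lambda: [[1.2] * 4 for _ in range(4)],
    pick_axis_distance=lambda ds: ds[2][0],   # pretend row 3 is on-axis
    lens_position_for=lambda d: 1.0 / d,      # hypothetical mapping
    contrast_af_search=lambda: 0.5,
    move_focus_lens=lambda p: print(f"move focus lens to {p:.3f}"),
)
```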
  • as another method of focus control of the imaging device 100, there is a method of determining the target position of the focus lens based on the blur amounts of multiple images captured in states where the positional relationship between the focus lens and the light receiving surface of the image sensor 120 differs, while moving the focus lens.
  • here, this method is referred to as a blur detection autofocus (Bokeh Detection Auto Focus: BDAF) method.
  • the amount of blurring (Cost) of the image can be expressed by the following formula (1) using a Gaussian function.
  • x represents the pixel position in the horizontal direction.
  • σ represents the standard deviation.
  • FIG. 5 shows an example of the curve represented by equation (1).
  • FIG. 6 is a flowchart showing an example of the distance calculation process of the BDAF method.
  • the imaging control unit 110 causes the imaging device 100 to capture a first image I_1 and store it in the memory 130. Next, the imaging control unit 110 causes the imaging device 100 to capture a second image I_2 and store it in the memory 130 (S201).
  • for example, the focus lens or the imaging surface of the image sensor 120 is moved in the optical axis direction so as not to pass the in-focus point.
  • the amount of movement of the focus lens or the imaging surface of the image sensor 120 may be, for example, 10 μm.
  • the imaging control unit 110 divides the image I_1 into a plurality of regions (S202).
  • the imaging control unit 110 may calculate a feature amount for each pixel in the image I_1 and divide the image I_1 into a plurality of regions by taking groups of pixels with similar feature amounts as one region.
  • alternatively, the imaging control unit 110 may divide the pixel group set as the range of the AF processing frame in the image I_1 into a plurality of regions.
  • the imaging control unit 110 divides the image I_2 into a plurality of regions corresponding to the plurality of regions of the image I_1.
  • the imaging control unit 110 calculates, for each of the multiple regions, the distance to the object contained in that region, based on the blur amount of each of the multiple regions of the image I_1 and the blur amount of each of the multiple regions of the image I_2 (S203).
  • the calculation process of the distance is further explained with reference to FIG. 7.
  • the focal length is F.
  • the relationship among the distance A from the lens L to the subject 510, the distance B from the lens L to the imaging surface, and the focal length F can be expressed by the following formula (2) according to the lens formula.
  • the focal length F is determined by the lens position. Therefore, if the distance B at which the subject 510 forms an image on the imaging surface can be determined, the distance A from the lens L to the subject 510 can be determined using formula (2).
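The body of formula (2) is not reproduced in this extraction; by "the lens formula" the text presumably means the standard thin-lens relation, reconstructed below together with the rearrangement for A that the surrounding explanation relies on:

```latex
\frac{1}{A} + \frac{1}{B} = \frac{1}{F}
\qquad\Longrightarrow\qquad
A = \frac{B F}{B - F}
```

For example, with F = 50 mm and an imaging distance B = 51 mm, the subject distance is A = (51 × 50) / (51 − 50) = 2550 mm.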
  • the distance B, and in turn the distance A, can be determined by calculating the imaging position of the subject 510 from the blur sizes (circles of confusion 512 and 514) of the subject 510 projected on the imaging surface. That is, the imaging position can be determined using the fact that the blur size (blur amount) is proportional to the distance between the imaging surface and the imaging position.
  • let the distance from the image I_1, which is closer to the imaging surface, to the lens L be D_1.
  • let the distance from the image I_2, which is farther from the imaging surface, to the lens L be D_2.
  • each image is blurred.
  • let the point spread function at this time be PSF, and let the images at D_1 and D_2 be I_d1 and I_d2, respectively.
  • the image I_1 can be expressed by the following formula (3) according to a convolution operation.
  • the C value shown in formula (4) is the amount of change between the respective blur amounts of the images I_d1 and I_d2; that is, the C value corresponds to the difference between the blur amount of the image I_d1 and the blur amount of the image I_d2.
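The bodies of formulas (1), (3), and (4) are not reproduced in this extraction. As an illustrative stand-in only, the sketch below scores per-region blur with the variance of a Laplacian response, a common sharpness proxy rather than the patent's Gaussian-based cost, and takes the difference of the two scores in the role of the C value:

```python
# Stand-in blur-change metric for two registered regions of I_d1 and I_d2.
# Variance of the Laplacian response drops as an image gets blurrier.
import numpy as np

LAPLACIAN = np.array([[0, 1, 0],
                      [1, -4, 1],
                      [0, 1, 0]], dtype=float)

def sharpness(region: np.ndarray) -> float:
    """Variance of the Laplacian response over one image region."""
    h, w = region.shape
    padded = np.pad(region.astype(float), 1, mode="edge")
    resp = sum(LAPLACIAN[i, j] * padded[i:i + h, j:j + w]
               for i in range(3) for j in range(3))
    return float(resp.var())

def blur_change(region_d1: np.ndarray, region_d2: np.ndarray) -> float:
    """C-like value: change in blur between the same region of two images."""
    return sharpness(region_d2) - sharpness(region_d1)

rng = np.random.default_rng(0)
sharp = rng.normal(size=(32, 32))
blurred = (sharp + np.roll(sharp, 1, 0) + np.roll(sharp, 1, 1)) / 3.0
print(blur_change(sharp, blurred) < 0)  # blurring lowers the score -> True
```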
  • the imaging control unit 110 can combine focus control based on distance measurement by the TOF sensor 160 with focus control using the BDAF method.
  • the imaging control unit 110 may determine, from the plurality of distances X_n, the distance to the subject passing through the lens optical axis 101 of the imaging device 100, and determine the first target position of the focus lens of the imaging device 100 based on that distance. Furthermore, the imaging control unit 110 may determine the second target position of the focus lens from the blur amounts of at least two images captured by the imaging device 100 while the focus lens is moved based on the first target position. In other words, the imaging control unit 110 can execute BDAF while moving the focus lens toward the first target position, and thereby determine, with high accuracy, the target position of the focus lens for focusing on the subject. The imaging control unit 110 may then perform focus control by moving the focus lens to the second target position.
  • the imaging control unit 110 needs at least two images with different blur amounts when performing focus control of the BDAF method. However, if the amount of movement of the focus lens is small, the difference in the amount of blur between the two images is too small, and the imaging control unit 110 may not be able to accurately determine the target position.
  • the imaging control unit 110 determines the distance to the subject passing through the lens optical axis 101 of the imaging device 100 from the plurality of distances X n , and determines the first target position of the focus lens of the imaging device 100 based on the distance. Thereafter, the imaging control section 110 determines the amount of movement of the focus lens required to move the focus lens from the current position of the focus lens to the first target position. The imaging control unit 110 determines whether the amount of movement is greater than or equal to a predetermined threshold value that enables BDAF to be performed.
  • when the amount of movement is greater than or equal to the threshold value, the imaging control unit 110 starts moving the focus lens to the first target position.
  • when the amount of movement is smaller than the threshold value, the imaging control unit 110 first moves the focus lens in the direction away from the first target position, and then moves the focus lens back in the opposite direction, toward the first target position, so that the total amount of movement of the focus lens becomes greater than or equal to the threshold value. In this way, the imaging control unit 110 can execute BDAF during the movement of the focus lens to the first target position and execute more precise focus control.
  • for example, as shown in FIG. 8A, the imaging control unit 110 first moves the focus lens in a direction 801 opposite to the direction toward the first target position, and then moves the focus lens in a direction 802 toward the first target position, so that the amount of movement of the focus lens becomes greater than or equal to the threshold value.
  • alternatively, as shown in FIG. 8B, the imaging control unit 110 may start moving the focus lens in a direction 803 toward the first target position, and once the focus lens has moved past the first target position, move it back toward the first target position in the opposite direction 804, so that the amount of movement of the focus lens becomes greater than or equal to the threshold value.
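A minimal sketch of this pre-movement planning, treating lens positions as hypothetical scalars; the backoff branch corresponds to FIG. 8A, and the FIG. 8B overshoot variant would swap the order of the two legs:

```python
# Plan the lens moves so the total travel is long enough for BDAF to
# observe a usable blur difference between the two captured images.
from typing import List

def plan_focus_moves(current: float, target: float,
                     bdaf_threshold: float) -> List[float]:
    """Lens positions to visit, ending at the first target position."""
    travel = abs(target - current)
    if travel >= bdaf_threshold:
        return [target]                        # direct move is long enough
    direction = 1.0 if target >= current else -1.0
    shortfall = bdaf_threshold - travel
    backoff = current - direction * shortfall  # FIG. 8A: back away first
    return [backoff, target]

print(plan_focus_moves(0.50, 0.52, bdaf_threshold=0.10))  # [0.42, 0.52]
```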
  • FIG. 9 is a flowchart showing another example of the focus control process of the imaging control section 110.
  • the imaging control unit 110 causes the TOF sensor 160 to measure the distance to the subject in each of the plurality of regions (light-receiving elements 165) (S300).
  • the imaging control unit 110 determines, from the plurality of distances X_n, the distance to the subject passing through the lens optical axis 101 of the imaging device 100, based on the width H_n and the distance h between the lens optical axis 101 and the lens optical axis 161 (S304).
  • the imaging control unit 110 determines the first target position of the focus lens for focusing on the subject based on the determined distance (S306). Next, the imaging control unit 110 moves the focus lens to the determined first target position (S308).
  • the imaging control unit 110 acquires a first image captured by the imaging device 100 during the movement of the focus lens to the first target position (S310). Next, the imaging control unit 110 moves the focus lens by a predetermined distance and then acquires a second image captured by the imaging device 100 (S312). The imaging control unit 110 derives the second target position of the focus lens by the BDAF method based on the blur amounts of the first image and the second image (S314). The imaging control unit 110 corrects the target position of the focus lens from the first target position to the second target position and moves the focus lens to that target position (S316).
  • in this way, the target position of the focus lens can be corrected by executing BDAF, so that the desired subject can be brought into focus with high accuracy. Furthermore, from the target position based on the distance measured by the TOF sensor 160, the imaging control unit 110 can accurately determine the direction in which the focus lens should start moving. That is, the imaging control unit 110 can prevent focus control time and power consumption from increasing due to pointless movement of the focus lens in the opposite direction.
  • FIG. 10 shows an example of an external perspective view of another form of the imaging system 10.
  • the imaging system 10 can be used in a state where a mobile terminal including a display, such as a smartphone 400, is fixed to the side of the grip 300.
  • the aforementioned imaging device 100 may be mounted on a mobile body.
  • the camera device 100 may also be mounted on an unmanned aerial vehicle (UAV) as shown in FIG. 11.
  • the UAV 1000 may include a UAV body 20, a gimbal 50, a plurality of imaging devices 60, and the imaging device 100.
  • the gimbal 50 and the imaging device 100 are an example of an imaging system.
  • the UAV 1000 is an example of a mobile body propelled by a propulsion unit.
  • the concept of a mobile body includes, in addition to UAVs, flying objects such as airplanes moving in the air, vehicles moving on the ground, ships moving on water, and the like.
  • the UAV body 20 includes a plurality of rotors. The plurality of rotors are an example of a propulsion unit.
  • the UAV body 20 causes the UAV 1000 to fly by controlling the rotation of the plurality of rotors.
  • the UAV body 20 uses, for example, four rotors to cause the UAV 1000 to fly.
  • the number of rotors is not limited to four.
  • the UAV 1000 may also be a fixed-wing aircraft without rotors.
  • the imaging device 100 is a camera for imaging that captures a subject included in a desired imaging range.
  • the gimbal 50 rotatably supports the imaging device 100.
  • the gimbal 50 is an example of a support mechanism.
  • the gimbal 50 rotatably supports the imaging device 100 about the pitch axis using an actuator.
  • the gimbal 50 further rotatably supports the imaging device 100 about each of the roll axis and the yaw axis using actuators.
  • the gimbal 50 can change the posture of the imaging device 100 by rotating the imaging device 100 about at least one of the yaw axis, the pitch axis, and the roll axis.
  • the plurality of imaging devices 60 are sensing cameras that photograph the surroundings of the UAV 1000 in order to control the flight of the UAV 1000.
  • the two camera devices 60 can be installed on the nose of the UAV1000, that is, on the front side.
  • the other two camera devices 60 can be installed on the bottom surface of the UAV1000.
  • the two imaging devices 60 on the front side may be paired to function as a so-called stereo camera.
  • the two imaging devices 60 on the bottom side can also be paired to function as a stereo camera.
  • the three-dimensional spatial data around the UAV 1000 can be generated from the images taken by the plurality of camera devices 60.
  • the number of imaging devices 60 included in the UAV 1000 is not limited to four.
  • the UAV1000 may be equipped with at least one camera 60.
  • the UAV1000 may be equipped with at least one camera 60 on the nose, tail, side, bottom, and top of the UAV1000.
  • the viewing angle that can be set in the imaging device 60 may be larger than the viewing angle that can be set in the imaging device 100.
  • the imaging device 60 may have a single focus lens or a fisheye lens.
  • the remote operation device 600 communicates with the UAV1000 to perform remote operation on the UAV1000.
  • the remote operation device 600 can wirelessly communicate with the UAV1000.
  • the remote operation device 600 transmits to the UAV 1000 instruction information indicating various commands related to the movement of the UAV 1000 such as ascending, descending, accelerating, decelerating, forwarding, retreating, and rotating.
  • the instruction information includes, for example, instruction information for raising the height of the UAV 1000.
  • the indication information can indicate the height at which the UAV1000 should be located.
  • the UAV 1000 moves to be at the height indicated by the instruction information received from the remote operation device 600.
  • the instruction information may include an ascent instruction for raising the UAV 1000. The UAV 1000 ascends while it is receiving the ascent instruction. When the altitude of the UAV 1000 has reached an upper limit, the UAV 1000 can be restricted from ascending even if it receives the ascent instruction.
  • FIG. 12 shows an example of a computer 1200 that may fully or partially embody various aspects of the present invention.
  • the program installed on the computer 1200 can cause the computer 1200 to function as one or more "parts" of the device according to the embodiments of the present invention, or can cause the computer 1200 to execute operations associated with that device or those one or more "parts".
  • This program enables the computer 1200 to execute the process or stages of the process involved in the embodiment of the present invention.
  • Such a program may be executed by the CPU 1212, so that the computer 1200 executes specified operations associated with some or all blocks in the flowcharts and block diagrams described in this specification.
  • the computer 1200 of this embodiment includes a CPU 1212 and a RAM 1214, which are connected to each other through a host controller 1210.
  • the computer 1200 also includes a communication interface 1222 and an input/output unit, which are connected to the host controller 1210 through an input/output controller 1220.
  • the computer 1200 also includes a ROM 1230.
  • the CPU 1212 operates in accordance with programs stored in the ROM 1230 and RAM 1214 to control each unit.
  • the communication interface 1222 communicates with other electronic devices through a network.
  • the hard disk drive can store programs and data used by the CPU 1212 in the computer 1200.
  • the ROM 1230 stores therein a boot program executed by the computer 1200 during operation, and/or a program that depends on the hardware of the computer 1200.
  • the program is provided via a computer-readable recording medium such as a CD-ROM, a USB memory, or an IC card, or via a network.
  • the program is installed in RAM 1214 or ROM 1230 which is also an example of a computer-readable recording medium, and is executed by CPU 1212.
  • the information processing described in these programs is read by the computer 1200 and brings about cooperation between the programs and the various types of hardware resources described above.
  • an apparatus or method may be constituted by realizing operations or processing of information in accordance with the use of the computer 1200.
  • the CPU 1212 can execute a communication program loaded in the RAM 1214, and based on the processing described in the communication program, instruct the communication interface 1222 to perform communication processing.
  • under the control of the CPU 1212, the communication interface 1222 reads transmission data stored in a transmission buffer provided in a recording medium such as the RAM 1214 or a USB memory and sends the read transmission data to the network, or writes reception data received from the network into a reception buffer provided in the recording medium.
  • the CPU 1212 can cause the RAM 1214 to read all or necessary parts of files or databases stored in an external recording medium such as a USB memory, and perform various types of processing on the data in the RAM 1214. The CPU 1212 can then write the processed data back to the external recording medium.
  • the CPU 1212 can perform, on the data read from the RAM 1214, various types of processing specified by the instruction sequences of the programs and described throughout this disclosure, including various types of operations, information processing, conditional judgment, conditional branching, unconditional branching, and information retrieval/replacement, and write the results back to the RAM 1214.
  • the CPU 1212 can retrieve information in files, databases, and the like in the recording medium. For example, when a plurality of entries each having an attribute value of a first attribute associated with an attribute value of a second attribute are stored in the recording medium, the CPU 1212 can retrieve, from the plurality of entries, an entry that matches a condition specifying the attribute value of the first attribute, read the attribute value of the second attribute stored in that entry, and thereby obtain the attribute value of the second attribute associated with the first attribute that satisfies the predetermined condition.
  • the programs or software modules described above may be stored on the computer 1200 or on a computer-readable storage medium near the computer 1200.
  • a recording medium such as a hard disk or RAM provided in a server system connected to a dedicated communication network or the Internet can be used as a computer-readable storage medium so that the program can be provided to the computer 1200 via the network.

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Electromagnetism (AREA)
  • Optics & Photonics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Remote Sensing (AREA)
  • Human Computer Interaction (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Studio Devices (AREA)
  • Focusing (AREA)
  • Stereoscopic And Panoramic Photography (AREA)
  • Automatic Focus Adjustment (AREA)

Abstract

Among the respective distances of the multiple regions measured by a distance measuring sensor, the distance corresponding to the subject passing through the lens optical axis of the imaging device differs according to the distance to the subject. Provided is a control device, which may be a control device that controls an imaging device, the imaging device including a distance measuring sensor that uses a first image sensor to measure the distance to the subject associated with each of multiple regions. The control device may include a circuit configured to determine, from the plurality of distances measured by the distance measuring sensor, the distance to the subject passing through the lens optical axis of the imaging device, based on the plurality of distances, a first distance between the lens optical axis of the imaging device and the lens optical axis of the distance measuring sensor, and the angle of view of the distance measuring sensor.

Description

Control device, imaging device, mobile body, control method, and program
[Technical Field]
The present invention relates to a control device, an imaging device, a mobile body, a control method, and a program.
[Background Art]
There is the following description: based on the result of comparing each distance-correction TOF pixel with the imaging pixel corresponding to that distance-correction TOF pixel, for each distance-correction TOF pixel whose brightness difference from the corresponding imaging pixel is greater than or equal to a threshold value, the distance pixel corresponding to that distance-correction TOF pixel is detected as an error pixel.
[Background Art Documents]
[Patent Documents]
[Patent Document 1] Japanese Patent Application Publication No. 2014-70936
[Summary of the Invention]
[Technical Problem to Be Solved by the Invention]
In an imaging device including a distance measuring sensor that uses an image sensor to measure the distance to the subject in each of multiple regions, the lens optical axis of the distance measuring sensor and the lens optical axis of the imaging device are physically offset. Therefore, among the respective distances of the multiple regions measured by the distance measuring sensor, the distance corresponding to the subject passing through the lens optical axis of the imaging device differs according to the distance to the subject.
[Technical Means for Solving the Problem]
A control device according to one aspect of the present invention may be a control device that controls an imaging device, the imaging device including a distance measuring sensor that uses a first image sensor to measure the distance to the subject associated with each of multiple regions. The control device may include a circuit configured to determine, from the plurality of distances measured by the distance measuring sensor, the distance to the subject passing through the lens optical axis of the imaging device, based on the plurality of distances, a first distance between the lens optical axis of the imaging device and the lens optical axis of the distance measuring sensor, and the angle of view of the distance measuring sensor.
The circuit may be configured to determine, based on each of the plurality of distances, the first distance, and the angle of view, the width of the ranging range of the distance measuring sensor at each of the plurality of distances in the direction from the lens optical axis of the imaging device toward the lens optical axis of the distance measuring sensor, and to determine, from the plurality of distances, the distance to the subject passing through the lens optical axis of the imaging device based on the ratio of the first distance to each width.
The circuit may be configured to perform focus control of the imaging device based on the determined distance.
When the distance to the subject passing through the lens optical axis of the imaging device cannot be determined from the plurality of distances, the circuit may be configured to perform focus control of the imaging device based on a contrast evaluation value of an image captured by the imaging device.
The circuit may be configured to determine a first target position of the focus lens of the imaging device based on the first distance, determine a second target position of the focus lens based on the blur amounts of at least two images captured by the imaging device while the focus lens is moved based on the first target position, and move the focus lens to the second target position, thereby performing focus control.
An imaging device according to one aspect of the present invention may include the above-described control device, the distance measuring sensor, and a second image sensor that photographs a subject.
A mobile body according to one aspect of the present invention may be a mobile body that carries the above-described imaging device and moves.
A control method according to one aspect of the present invention may be a control method for controlling an imaging device, the imaging device including a distance measuring sensor that uses a first image sensor to measure the distance to the subject associated with each of multiple regions. The control method may include the stage of determining, from the plurality of distances measured by the distance measuring sensor, the distance to the subject passing through the lens optical axis of the imaging device, based on the plurality of distances, a first distance between the lens optical axis of the imaging device and the lens optical axis of the distance measuring sensor, and the angle of view of the distance measuring sensor.
A program according to one aspect of the present invention may be a program for causing a computer to function as the above-described control device.
According to one aspect of the present invention, the distance to a desired subject can be measured with high accuracy by the distance measuring sensor that uses the first image sensor to measure the distance to the subject in each of the multiple regions.
In addition, the above summary of the invention does not enumerate all necessary features of the present invention. Sub-combinations of these feature groups may also constitute inventions.
[Brief Description of the Drawings]
FIG. 1 is an external perspective view of the imaging system.
FIG. 2 is a schematic diagram of the functional blocks of the imaging system.
FIG. 3 is a diagram showing an example of the positional relationship between the lens optical axis of the imaging device and the lens optical axis of the TOF sensor.
FIG. 4 is a flowchart showing an example of the focus control process of the imaging control unit.
FIG. 5 is a diagram showing an example of a curve representing the relationship between the blur amount and the lens position.
FIG. 6 is a diagram showing an example of the process of calculating the distance to an object based on the blur amount.
FIG. 7 is a diagram for explaining the relationship among the object position, the lens position, and the focal length.
FIG. 8A is a diagram for explaining the moving direction of the focus lens.
FIG. 8B is a diagram for explaining the moving direction of the focus lens.
FIG. 9 is a flowchart showing another example of the focus control process of the imaging control unit.
FIG. 10 is an external perspective view showing another form of the imaging system.
FIG. 11 is a diagram showing an example of the appearance of an unmanned aerial vehicle and a remote operation device.
FIG. 12 is a diagram showing an example of a hardware configuration.
[Detailed Description]
Hereinafter, the present invention will be described through embodiments of the invention, but the following embodiments do not limit the invention according to the claims. In addition, not all combinations of the features described in the embodiments are necessarily essential to the solution of the invention. It is apparent to those of ordinary skill in the art that various changes or improvements can be made to the following embodiments. It is apparent from the description of the claims that forms to which such changes or improvements are made can also be included in the technical scope of the present invention.
The claims, the specification, the drawings, and the abstract contain matters subject to copyright protection. The copyright owner does not object to reproduction of these documents by anyone as they appear in the files or records of the Patent Office. However, in all other cases, all copyrights are reserved.
Various embodiments of the present invention may be described with reference to flowcharts and block diagrams, where a block may represent (1) a stage of a process in which an operation is performed or (2) a "part" of a device that has the role of performing an operation. Specific stages and "parts" may be implemented by programmable circuits and/or processors. Dedicated circuits may include digital and/or analog hardware circuits, and may include integrated circuits (ICs) and/or discrete circuits. Programmable circuits may include reconfigurable hardware circuits. Reconfigurable hardware circuits may include logical AND, logical OR, logical XOR, logical NAND, logical NOR, and other logical operations, as well as memory elements such as flip-flops, registers, field-programmable gate arrays (FPGA), and programmable logic arrays (PLA).
A computer-readable medium may include any tangible device that can store instructions to be executed by a suitable device. As a result, a computer-readable medium on which instructions are stored constitutes a product including instructions that can be executed to create means for performing the operations specified by the flowcharts or block diagrams. Examples of the computer-readable medium may include electronic storage media, magnetic storage media, optical storage media, electromagnetic storage media, semiconductor storage media, and the like. More specific examples of the computer-readable medium may include floppy (registered trademark) disks, flexible disks, hard disks, random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), electrically erasable programmable read-only memory (EEPROM), static random access memory (SRAM), compact disc read-only memory (CD-ROM), digital versatile discs (DVD), Blu-ray (registered trademark) discs, memory sticks, integrated circuit cards, and the like.
Computer-readable instructions may include either source code or object code described in any combination of one or more programming languages. The source code or object code includes traditional procedural programming languages. Traditional procedural programming languages may be assembly instructions, instruction set architecture (ISA) instructions, machine instructions, machine-dependent instructions, microcode, firmware instructions, state-setting data, object-oriented programming languages such as Smalltalk (registered trademark), JAVA (registered trademark), and C++, and the "C" programming language or similar programming languages. Computer-readable instructions may be provided, locally or via a local area network (LAN) or a wide area network (WAN) such as the Internet, to a processor or programmable circuit of a general-purpose computer, a special-purpose computer, or another programmable data processing device. The processor or programmable circuit can execute the computer-readable instructions to create means for performing the operations specified by the flowcharts or block diagrams. Examples of processors include computer processors, processing units, microprocessors, digital signal processors, controllers, microcontrollers, and the like.
FIG. 1 is a diagram showing an example of an external perspective view of the imaging system 10 according to this embodiment. The imaging system 10 includes an imaging device 100, a support mechanism 200, and a grip 300. The support mechanism 200 uses actuators to rotatably support the imaging device 100 about each of the roll axis, the pitch axis, and the yaw axis. The support mechanism 200 can change or maintain the posture of the imaging device 100 by rotating the imaging device 100 about at least one of the roll axis, the pitch axis, and the yaw axis. The support mechanism 200 includes a roll axis drive mechanism 201, a pitch axis drive mechanism 202, and a yaw axis drive mechanism 203. The support mechanism 200 also includes a base 204 to which the yaw axis drive mechanism 203 is fixed. The grip 300 is fixed to the base 204. The grip 300 includes an operation interface 301 and a display unit 302. The imaging device 100 is fixed to the pitch axis drive mechanism 202.
The operation interface 301 receives instructions for operating the imaging device 100 and the support mechanism 200 from the user. The operation interface 301 may include a shutter/recording button that instructs the imaging device 100 to capture a still image or record a moving image. The operation interface 301 may include a power/function button for instructing the imaging system 10 to be powered on or off and for switching between the still image shooting mode and the moving image shooting mode of the imaging device 100.
The display unit 302 can display images captured by the imaging device 100. The display unit 302 can display a menu screen for operating the imaging device 100 and the support mechanism 200. The display unit 302 may be a touch panel display that receives instructions for operating the imaging device 100 and the support mechanism 200.
The user holds the grip 300 and captures still images or moving images with the imaging device 100.
FIG. 2 is a schematic diagram of the functional blocks of the imaging system 10. The imaging device 100 includes an imaging control unit 110, an image sensor 120, a memory 130, a lens control unit 150, a lens driving unit 152, a plurality of lenses 154, and a TOF sensor 160.
The image sensor 120 may be composed of a CCD or a CMOS. The image sensor 120 is an example of a second image sensor used for shooting. The image sensor 120 outputs image data of the optical image formed through the plurality of lenses 154 to the imaging control unit 110. The imaging control unit 110 may be composed of a microprocessor such as a CPU or MPU, a microcontroller such as an MCU, or the like.
In accordance with operation instructions from the grip 300 to the imaging device 100, the imaging control unit 110 performs demosaicing processing on the image signal output from the image sensor 120, thereby generating image data. The imaging control unit 110 stores the image data in the memory 130. The imaging control unit 110 controls the TOF sensor 160. The imaging control unit 110 is an example of a circuit. The TOF sensor 160 is a time-of-flight sensor that measures the distance to an object. The imaging device 100 performs focus control by adjusting the position of the focus lens based on the distance measured by the TOF sensor 160.
The memory 130 may be a computer-readable storage medium and may include at least one of SRAM, DRAM, EPROM, EEPROM, and flash memory such as USB memory. The memory 130 stores programs and the like necessary for the imaging control unit 110 to control the image sensor 120 and the like. The memory 130 may be provided inside the housing of the imaging device 100. The grip 300 may include another memory for storing image data captured by the imaging device 100. The grip 300 may include a slot from which that memory can be detached from the housing of the grip 300.
The plurality of lenses 154 can function as a zoom lens, a varifocal lens, and a focus lens. At least some or all of the plurality of lenses 154 are arranged to be movable along the optical axis. The lens control unit 150 drives the lens driving unit 152 in accordance with lens control commands from the imaging control unit 110 to move one or more lenses 154 in the optical axis direction. The lens control commands are, for example, zoom control commands and focus control commands. The lens driving unit 152 may include a voice coil motor (VCM) that moves at least some or all of the plurality of lenses 154 in the optical axis direction. The lens driving unit 152 may include an electric motor such as a DC motor, a coreless motor, or an ultrasonic motor. The lens driving unit 152 can transmit power from the electric motor to at least some or all of the plurality of lenses 154 via mechanism components such as a cam ring and guide shafts, to move at least some or all of the plurality of lenses 154 along the optical axis.
The imaging device 100 further includes an attitude controller 210, an angular velocity sensor 212, and an acceleration sensor 214. The angular velocity sensor 212 detects the angular velocity of the imaging device 100. The angular velocity sensor 212 detects the angular velocities about the roll axis, the pitch axis, and the yaw axis of the imaging device 100. The attitude controller 210 acquires angular velocity information on the angular velocity of the imaging device 100 from the angular velocity sensor 212. The angular velocity information may indicate the angular velocities about the roll axis, the pitch axis, and the yaw axis of the imaging device 100. The attitude controller 210 acquires acceleration information on the acceleration of the imaging device 100 from the acceleration sensor 214. The acceleration information may indicate the acceleration in each of the roll-axis, pitch-axis, and yaw-axis directions of the imaging device 100.
The angular velocity sensor 212 and the acceleration sensor 214 may be provided within the housing that accommodates the image sensor 120, the lenses 154, and other components. The present embodiment describes a form in which the imaging device 100 and the support mechanism 200 are constructed as one body. However, the support mechanism 200 may include a pedestal to which the imaging device 100 is detachably fixed. In that case, the angular velocity sensor 212 and the acceleration sensor 214 may be provided outside the housing of the imaging device 100, for example on the pedestal.
Based on the angular velocity information and the acceleration information, the attitude controller 210 controls the support mechanism 200 to maintain or change the attitude of the imaging device 100. The attitude controller 210 controls the support mechanism 200 to maintain or change the attitude of the imaging device 100 in accordance with the operation mode of the support mechanism 200 for controlling the attitude of the imaging device.
The operation modes include a mode that operates at least one of the roll-axis drive mechanism 201, the pitch-axis drive mechanism 202, and the yaw-axis drive mechanism 203 of the support mechanism 200 so that the attitude change of the imaging device 100 follows the attitude change of the base 204 of the support mechanism 200. The operation modes include a mode that operates each of the roll-axis drive mechanism 201, the pitch-axis drive mechanism 202, and the yaw-axis drive mechanism 203 so that the attitude change of the imaging device 100 follows the attitude change of the base 204. The operation modes include a mode that operates each of the pitch-axis drive mechanism 202 and the yaw-axis drive mechanism 203 so that the attitude change of the imaging device 100 follows the attitude change of the base 204. The operation modes include a mode that operates only the yaw-axis drive mechanism 203 so that the attitude change of the imaging device 100 follows the attitude change of the base 204.
The operation modes may include an FPV (First Person View) mode that operates the support mechanism 200 so that the attitude change of the imaging device 100 follows the attitude change of the base 204 of the support mechanism 200, and a fixed mode that operates the support mechanism 200 to maintain the attitude of the imaging device 100.
The FPV mode is a mode in which at least one of the roll-axis drive mechanism 201, the pitch-axis drive mechanism 202, and the yaw-axis drive mechanism 203 is operated so that the attitude change of the imaging device 100 follows the attitude change of the base 204 of the support mechanism 200. The fixed mode is a mode in which at least one of the roll-axis drive mechanism 201, the pitch-axis drive mechanism 202, and the yaw-axis drive mechanism 203 is operated to maintain the current attitude of the imaging device 100.
The TOF sensor 160 includes a light emitter 162, a light receiver 164, a light-emission controller 166, a light-reception controller 167, and a memory 168. The TOF sensor 160 is one example of the distance-measuring sensor.
The light emitter 162 includes at least one light-emitting element 163. The light-emitting element 163 is a device such as an LED or a laser that repeatedly emits pulsed light modulated at high speed. The light-emitting element 163 may emit pulsed infrared light. The light-emission controller 166 controls the light emission of the light-emitting element 163. The light-emission controller 166 may control the pulse width of the pulsed light emitted by the light-emitting element 163.
The light receiver 164 includes a plurality of light-receiving elements 165 that measure the distances to the subjects associated with each of a plurality of regions. The light receiver 164 is one example of the first image sensor for distance measurement. The plurality of light-receiving elements 165 correspond to the plurality of regions, respectively. Each light-receiving element 165 repeatedly receives reflected light of the pulsed light from an object. The light-reception controller 167 controls the light reception of the light-receiving elements 165. Based on the amount of reflected light repeatedly received by the light-receiving elements 165 during a predetermined light-reception period, the light-reception controller 167 measures the distances to the subjects associated with each of the plurality of regions. The light-reception controller 167 may measure the distance to a subject by determining the phase difference between the pulsed light and its reflection based on the amount of reflected light repeatedly received during the predetermined light-reception period.
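As a concrete illustration of this phase-difference principle, the following minimal sketch in Python converts demodulated charge samples from one light-receiving element into a distance. The four-bucket sampling scheme, the 20 MHz modulation frequency, and all names are illustrative assumptions, not details taken from this disclosure.

```python
import math

C = 299_792_458.0  # speed of light, m/s

def tof_distance(q0, q90, q180, q270, f_mod):
    """Distance from four phase-shifted charge samples (0°, 90°, 180°, 270°)
    of one light-receiving element, using the phase difference between
    the emitted pulse train and its reflection."""
    phase = math.atan2(q90 - q270, q0 - q180) % (2 * math.pi)
    return C * phase / (4 * math.pi * f_mod)

# One element's samples at an assumed modulation frequency of 20 MHz.
print(tof_distance(0.8, 0.6, 0.2, 0.4, 20e6))
```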
The memory 168 may be a computer-readable storage medium and may include at least one of an SRAM, a DRAM, an EPROM, and an EEPROM. The memory 168 stores programs that the light-emission controller 166 needs to control the light emitter 162, programs that the light-reception controller 167 needs to control the light receiver 164, and the like.
In the imaging system 10 configured in this way, the lens optical axis of the imaging device 100 and the lens optical axis of the TOF sensor 160 are physically offset from each other. For example, as shown in Fig. 3, the lens optical axis 101 of the imaging device 100 and the lens optical axis 161 of the TOF sensor 160 are parallel, but the lens optical axis 101 and the lens optical axis 161 are separated by a distance h. The lens optical axis 101 is the optical axis of the lens system, including the lenses 154, that forms light on the light-receiving surface of the image sensor 120 of the imaging device 100. The lens optical axis 161 is the optical axis of the lens system that forms light on the light-receiving surface of the light receiver 164, i.e., the image sensor of the TOF sensor 160. The angle of view of the imaging device 100 is θ, and the angle of view of the TOF sensor 160 is ψ.
Because the two optical axes are offset in this way, if the distance to a subject present in the ranging range of the TOF sensor 160 differs, the light-receiving element 165, among the plurality of light-receiving elements 165 of the TOF sensor 160, that measures the distance to that subject also differs.
In Fig. 3, for simplicity of explanation, the ranging range 1601 of the TOF sensor 160 is shown with 4×4 light-receiving elements 165. For example, when the distance to the subject is X_1, the light-receiving elements 165 corresponding to the third row from the top of the ranging range 1601 measure the distance to the subject lying on the lens optical axis 101 of the imaging device 100. On the other hand, when the distance to the subject is X_2, the light-receiving elements 165 corresponding to the fourth row from the top of the ranging range 1601 measure the distance to the subject lying on the lens optical axis 101 of the imaging device 100. In other words, if the distance to the subject lying on the lens optical axis 101 differs, the light-receiving elements 165 that measure the distance to that subject also differ.
Therefore, the imaging controller 110 determines, from among the plurality of distances X_n measured by the TOF sensor 160, the distance to the subject lying on the lens optical axis 101 of the imaging device 100, based on the plurality of distances X_n, the distance h between the lens optical axis 101 of the imaging device 100 and the lens optical axis 161 of the TOF sensor 160, and the angle of view ψ of the TOF sensor 160. The imaging device 100 may determine, based on each of the plurality of distances X_n, the distance h, and the angle of view ψ, the width H_n of the ranging range 1601 of the TOF sensor 160 at each of the plurality of distances X_n, measured from the lens optical axis 101 of the imaging device 100 in the direction toward the lens optical axis 161 of the TOF sensor 160. Further, the imaging controller 110 may determine the distance to the subject lying on the lens optical axis 101 of the imaging device 100 from among the plurality of distances X_n based on the ratio h/H_n of the distance h to each width H_n.
Here, H_n satisfies H_n = 2 × X_n × tan(ψ/2). For example, suppose the TOF sensor 160 includes 4×4 light-receiving elements 165. In this case, when 0 < h/H_n < 1/4 is satisfied, the light-receiving elements 165 corresponding to the third row from the top of the ranging range 1601 measure the distance X_1 to the subject lying on the lens optical axis 101 of the imaging device 100. On the other hand, when 1/4 < h/H_n < 1/2 is satisfied, the light-receiving elements 165 corresponding to the fourth row from the top of the ranging range 1601 measure the distance X_2 to the subject lying on the lens optical axis 101 of the imaging device 100.
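The selection rule above can be expressed as a minimal sketch, assuming that the TOF rows are indexed from the top, that each row reports one distance, and that the row hit by the camera's optical axis is offset from the center of the array by the fraction h/H_n of the frame height; the function name and signature are hypothetical, not part of this disclosure.

```python
import math

def axis_distance(row_distances, h, fov_rad):
    """row_distances[i] is the distance X_i measured by TOF row i (0 = top).
    Return the X_i for which the ratio h/H_i implies that row i is exactly
    the row imaging the subject on the camera's optical axis."""
    n = len(row_distances)
    for i, x in enumerate(row_distances):
        width = 2.0 * x * math.tan(fov_rad / 2.0)  # H_i at distance X_i
        row = int(n / 2 + (h / width) * n)         # row hit by the camera axis
        if row == i:
            return x
    return None  # no consistent row: fall back as described below

# 4 rows; the third row (index 2) is consistent for a subject ~2 m away.
print(axis_distance([2.0, 2.0, 2.0, 0.4], h=0.03, fov_rad=math.radians(40)))
```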
In this way, the imaging controller 110 can determine, from among the plurality of distances X_n measured by the TOF sensor 160, the distance to the subject lying on the lens optical axis 101 of the imaging device 100, based on the plurality of distances X_n, the distance h, and the angle of view ψ. Further, the imaging controller 110 may execute focus control of the imaging device 100 based on the determined distance.
Here, if the distance to the subject is too short, then depending on the angle of view of the TOF sensor 160, none of the plurality of distances X_n measured by the TOF sensor 160 may correspond to the distance to the subject lying on the lens optical axis 101 of the imaging device 100. In that case, the distance to the subject cannot be obtained from the TOF sensor 160. Therefore, when the imaging controller 110 cannot determine the distance to the subject lying on the lens optical axis 101 of the imaging device 100 from among the plurality of distances X_n, it may execute focus control of the imaging device 100 based on a contrast evaluation value of the image. In other words, when the imaging controller 110 cannot determine the distance to the subject lying on the lens optical axis 101 of the imaging device 100 from among the plurality of distances X_n, it may execute contrast autofocus.
Fig. 4 is a flowchart showing one example of the focus control process of the imaging controller 110.
The imaging controller 110 causes the TOF sensor 160 to measure the distance to the subject in each of the plurality of regions (light-receiving elements 165) (S100). The imaging controller 110 calculates the width H_n of the ranging range of the TOF sensor 160 corresponding to each of the measured distances X_n according to H_n = 2 × X_n × tan(ψ/2) (S102). Based on the widths H_n and the distance h between the lens optical axis 101 and the lens optical axis 161, the imaging controller 110 judges whether the distance to the subject lying on the lens optical axis 101 of the imaging device 100 can be determined from among the plurality of distances X_n (S104).
When the imaging controller 110 has determined the distance to the subject lying on the lens optical axis 101 of the imaging device 100, it determines, based on the determined distance, the target position of the focus lens for focusing on that subject (S106). When the imaging controller 110 cannot determine the distance to the subject lying on the lens optical axis 101 of the imaging device 100, it executes contrast AF and determines the target position of the focus lens for focusing on that subject based on image contrast evaluation values (S108).
Next, the imaging controller 110 moves the focus lens to the determined target position (S110).
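The flow of S100 to S110 can be summarized in the sketch below, which reuses axis_distance from the earlier sketch; the camera and tof objects and their methods are hypothetical stand-ins for the controllers described above, not an API defined by this disclosure.

```python
def focus_once(camera, tof, h, fov_rad):
    """One pass of the focus control flow of Fig. 4 (S100-S110)."""
    distances = tof.measure_all_regions()      # S100: one X_n per region
    x = axis_distance(distances, h, fov_rad)   # S102-S104: widths and ratio test
    if x is not None:
        target = camera.lens_position_for(x)   # S106: TOF-based target position
    else:
        target = camera.contrast_af_scan()     # S108: contrast AF fallback
    camera.move_focus_lens(target)             # S110: move to the target
```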
As described above, according to the present embodiment, among the plurality of regions subject to distance measurement by the TOF sensor 160, the region that measures the distance to the subject lying on the lens optical axis 101 of the imaging device 100 can be correctly identified. Accordingly, the distance to the subject can be measured with high accuracy, and the accuracy of focus control based on the ranging result of the TOF sensor 160 can be improved.
Now, as a focus control scheme of the imaging device 100, there is a scheme for determining the distance to a subject in which the focus lens is moved while the distance is determined based on the blur amounts of a plurality of images captured in states where the positional relationship between the focus lens and the light-receiving surface of the image sensor 120 differs. Here, this scheme is called the Bokeh Detection Auto Focus (BDAF) scheme.
For example, the blur amount (Cost) of an image can be expressed by the following equation (1) using a Gaussian function. In equation (1), x represents the pixel position in the horizontal direction, and σ represents the standard deviation.
【Equation 1】
C(x) = (1 / (σ√(2π))) · exp(−x² / (2σ²))   (1)
Fig. 5 shows one example of the curve expressed by equation (1). By setting the focus lens to the lens position corresponding to the minimum point 502 of the curve 500, the object contained in the image I can be brought into focus.
Fig. 6 is a flowchart showing one example of the distance calculation process of the BDAF scheme. First, with the lens and the imaging surface in a first positional relationship, the imaging device 100 captures the first image I_1 and stores it in the memory 130. Next, the focus lens or the imaging surface of the image sensor 120 is moved along the optical-axis direction to put the lens and the imaging surface in a second positional relationship, and the imaging controller 110 captures the second image I_2 with the imaging device 100 and stores it in the memory 130 (S201). For example, as in so-called hill-climbing AF, the focus lens or the imaging surface of the image sensor 120 is moved along the optical-axis direction without passing the in-focus point. The amount of movement of the focus lens or the imaging surface of the image sensor 120 may be, for example, 10 μm.
Next, the imaging controller 110 divides the image I_1 into a plurality of regions (S202). The imaging controller 110 may calculate a feature amount for every pixel in the image I_1 and divide the image I_1 into a plurality of regions by taking groups of pixels with similar feature amounts as one region. The imaging controller 110 may also divide the group of pixels in the range of the image I_1 set as the AF processing frame into a plurality of regions. The imaging controller 110 divides the image I_2 into a plurality of regions corresponding to the plurality of regions of the image I_1. Based on the blur amount of each of the plurality of regions of the image I_1 and the blur amount of each of the plurality of regions of the image I_2, the imaging controller 110 calculates, for each of the plurality of regions, the distance to the object contained in that region (S203).
The distance calculation process will be further described with reference to Fig. 7. Let A be the distance from the lens L (principal point) to the subject 510 (object plane), let B be the distance from the lens L (principal point) to the position (image plane) where the subject 510 forms an image on the imaging surface, and let F be the focal length. In this case, the relationship among the distance A, the distance B, and the focal length F can be expressed by the following equation (2) according to the lens formula.
【Equation 2】
1/A + 1/B = 1/F   (2)
The focal length F is determined by the lens positions. Therefore, if the distance B at which the subject 510 forms an image on the imaging surface can be determined, the distance A from the lens L to the subject 510 can be determined using equation (2).
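As a numerical illustration of equation (2), the following sketch solves the lens formula for A given B and F; the sample values are arbitrary.

```python
def subject_distance(b, f):
    """Solve the lens formula 1/A + 1/B = 1/F for the subject distance A,
    given the image distance B and the focal length F (same units)."""
    return 1.0 / (1.0 / f - 1.0 / b)

# A 50 mm lens imaging at B = 52.5 mm puts the subject at about 1.05 m.
print(subject_distance(b=0.0525, f=0.050))
```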
As shown in Fig. 7, the position where the subject 510 forms an image can be calculated based on the sizes of the blur of the subject 510 projected on the imaging surface (the circles of confusion 512 and 514), so the distance B, and in turn the distance A, can be determined. That is, the imaging position can be determined by exploiting the fact that the size of the blur (blur amount) is proportional to the distance between the imaging surface and the imaging position.
Here, let D_1 be the distance from the lens L to the image I_1, which is captured closer to the imaging surface, and let D_2 be the distance from the lens L to the image I_2, which is captured farther from the imaging surface. Each image is blurred. Let PSF be the point spread function at this time, and let I_d1 and I_d2 be the images at D_1 and D_2, respectively. In this case, for example, the image I_1 can be expressed by the following equation (3) in terms of a convolution operation.
【Equation 3】
I_1 = PSF * I_d1   (3)
Further, let f denote the Fourier transform of the image data I_d1 and I_d2, and let OTF_1 and OTF_2 denote the optical transfer functions obtained by Fourier-transforming the point spread functions PSF_1 and PSF_2 of the images I_d1 and I_d2; then the ratio is obtained as in the following equation (4).
【Equation 4】
C = f(I_d2) / f(I_d1) = OTF_2 / OTF_1   (4)
The C value shown in equation (4) is the amount of change between the respective blur amounts of the images I_d1 and I_d2; that is, the C value corresponds to the difference between the blur amount of the image I_d1 and the blur amount of the image I_d2.
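The ratio of equation (4) can be evaluated directly on image data, as in the sketch below; reducing the per-frequency ratio to one scalar with a median is a choice made here for illustration, not something specified by this disclosure.

```python
import numpy as np

def blur_change(i_d1, i_d2, eps=1e-9):
    """C value of equation (4): ratio of the Fourier spectra of two
    differently defocused images of the same scene, which equals
    OTF_2/OTF_1 and tracks the change in blur amount."""
    f1 = np.abs(np.fft.fft2(i_d1)) + eps
    f2 = np.abs(np.fft.fft2(i_d2)) + eps
    return float(np.median(f2 / f1))  # one robust scalar per region

rng = np.random.default_rng(0)
img = rng.random((64, 64))
print(blur_change(img, img))  # identical images -> C = 1.0
```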
Here, even with a distance determined as described above, the distance to the subject measured by the TOF sensor 160 may contain an error. Therefore, the imaging controller 110 may combine focus control based on the ranging of the TOF sensor 160 with focus control using the BDAF scheme.
The imaging controller 110 may determine the distance to the subject lying on the lens optical axis 101 of the imaging device 100 from among the plurality of distances X_n and, based on that distance, determine a first target position of the focus lens of the imaging device 100. Further, while moving the focus lens based on the first target position, the imaging controller 110 may determine a second target position of the focus lens from the blur amounts of at least two images captured by the imaging device 100. That is, while moving the focus lens toward the first target position, the imaging controller 110 may execute BDAF and thereby determine with high accuracy the target position of the focus lens for focusing on the subject. The imaging controller 110 may then execute focus control by moving the focus lens to the second target position.
Here, when executing focus control of the BDAF scheme, the imaging controller 110 needs at least two images with different magnitudes of blur. However, if the movement amount of the focus lens is small, the difference between the blur magnitudes of the two images becomes too small, and the imaging controller 110 may be unable to determine the target position with high accuracy.
Therefore, the imaging controller 110 determines the distance to the subject lying on the lens optical axis 101 of the imaging device 100 from among the plurality of distances X_n and, based on that distance, determines the first target position of the focus lens of the imaging device 100. After that, the imaging controller 110 determines the movement amount of the focus lens required to move it from its current position to the first target position. The imaging controller 110 judges whether this movement amount is greater than or equal to a predetermined threshold at which BDAF can be executed.
If the movement amount is greater than or equal to the threshold, the imaging controller 110 starts moving the focus lens toward the first target position. On the other hand, when the movement amount is smaller than the threshold, the imaging controller 110 first moves the focus lens in a direction away from the first target position, and then moves it in the opposite direction toward the first target position, so that the movement of the focus lens becomes greater than or equal to the threshold. The imaging controller 110 can thereby execute BDAF while moving the focus lens toward the first target position and execute more accurate focus control.
As shown in Fig. 8A, the imaging controller 110 first moves the focus lens in the direction 801 opposite to the direction toward the first target position, and then moves it in the direction 802 toward the first target position, so that the movement amount of the focus lens can be greater than or equal to the threshold. Alternatively, as shown in Fig. 8B, the imaging controller 110 may start moving in the direction 803 toward the first target position, and once the focus lens has moved past the first target position, move it back in the opposite direction 804 toward the first target position, so that the movement amount of the focus lens can be greater than or equal to the threshold.
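A minimal sketch of this movement planning, corresponding to the detour of Fig. 8A, is given below; the lens-position units, the threshold parameter, and the function name are illustrative assumptions.

```python
def plan_focus_move(current, target, bdaf_min_move):
    """Return the list of lens positions to visit so that the travel toward
    the target is at least bdaf_min_move, letting BDAF capture two images
    with a usable blur difference during the move."""
    if abs(target - current) >= bdaf_min_move:
        return [target]  # movement amount already large enough: go directly
    # Otherwise back away from the target first, then approach it (Fig. 8A).
    detour = target - bdaf_min_move if target > current else target + bdaf_min_move
    return [detour, target]

# Lens at 100, target 104, BDAF needs a travel of at least 20 -> [84, 104].
print(plan_focus_move(current=100, target=104, bdaf_min_move=20))
```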
Fig. 9 is a flowchart showing another example of the focus control process of the imaging controller 110.
The imaging controller 110 causes the TOF sensor 160 to measure the distance to the subject in each of the plurality of regions (light-receiving elements 165) (S300). The imaging controller 110 calculates the width H_n of the ranging range of the TOF sensor 160 corresponding to each of the measured distances X_n according to H_n = 2 × X_n × tan(ψ/2) (S302). Based on the widths H_n and the distance h between the lens optical axis 101 and the lens optical axis 161, the imaging controller 110 determines the distance to the subject lying on the lens optical axis 101 of the imaging device 100 from among the plurality of distances X_n (S304).
Based on the determined distance, the imaging controller 110 determines the first target position of the focus lens for focusing on the subject (S306). Next, the imaging controller 110 moves the focus lens toward the determined first target position (S308).
While the focus lens is moving toward the first target position, the imaging controller 110 acquires a first image captured by the imaging device 100 (S310). Then, after the focus lens has moved a predetermined distance, the imaging controller 110 acquires a second image captured by the imaging device 100 (S312). Based on the blur amounts of the first image and the second image, the imaging controller 110 derives the second target position of the focus lens by the BDAF scheme (S314). The imaging controller 110 corrects the target position of the focus lens from the first target position to the second target position and moves the focus lens to the target position (S316).
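The flow of S300 to S316 can be sketched as follows; as before, the camera and tof objects and their methods (including bdaf_position, which stands in for the blur-based derivation of S314) are hypothetical, and axis_distance is reused from the earlier sketch.

```python
def focus_with_bdaf(camera, tof, h, fov_rad, step):
    """Sketch of Fig. 9: seed the target from TOF ranging (S300-S306),
    then refine it with BDAF while the lens is moving (S308-S316)."""
    distances = tof.measure_all_regions()             # S300
    x = axis_distance(distances, h, fov_rad)          # S302-S304
    first_target = camera.lens_position_for(x)        # S306
    camera.start_focus_move(first_target)             # S308
    img1 = camera.capture()                           # S310: first image
    camera.wait_until_moved(step)                     # predetermined travel
    img2 = camera.capture()                           # S312: second image
    second_target = camera.bdaf_position(img1, img2)  # S314: from blur amounts
    camera.move_focus_lens(second_target)             # S316: corrected target
```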
As described above, according to the present embodiment, even when the distance measured by the TOF sensor 160 contains an error, the target position of the focus lens can be corrected by executing BDAF, so the desired subject can be brought into focus with high accuracy. Moreover, from the target position based on the distance measured by the TOF sensor 160, the imaging controller 110 can correctly judge the direction in which the focus lens should start moving. That is, the imaging controller 110 can prevent the focus control from taking longer or consuming more power due to meaningless movement of the focus lens in the opposite direction.
Fig. 10 shows one example of an external perspective view of another form of the imaging system 10. As shown in Fig. 10, the imaging system 10 can be used in a state where a mobile terminal including a display, such as a smartphone 400, is fixed alongside the grip 300.
The imaging device 100 described above may be mounted on a movable object. The imaging device 100 may also be mounted on an unmanned aerial vehicle (UAV) as shown in Fig. 11. The UAV 1000 may include a UAV body 20, a gimbal 50, a plurality of imaging devices 60, and the imaging device 100. The gimbal 50 and the imaging device 100 are one example of an imaging system. The UAV 1000 is one example of a movable object propelled by a propulsion unit. The concept of a movable object includes, in addition to UAVs, flying objects such as aircraft that move in the air, vehicles that move on the ground, ships that move on water, and the like.
The UAV body 20 includes a plurality of rotors. The plurality of rotors are one example of the propulsion unit. The UAV body 20 makes the UAV 1000 fly by controlling the rotation of the plurality of rotors. The UAV body 20 makes the UAV 1000 fly using, for example, four rotors. The number of rotors is not limited to four. The UAV 1000 may also be a fixed-wing aircraft without rotors.
The imaging device 100 is an imaging camera that captures a subject contained in a desired imaging range. The gimbal 50 rotatably supports the imaging device 100. The gimbal 50 is one example of the support mechanism. For example, the gimbal 50 supports the imaging device 100 so that it can rotate about the pitch axis using an actuator. The gimbal 50 further supports the imaging device 100 so that it can also rotate about each of the roll axis and the yaw axis using actuators. The gimbal 50 can change the attitude of the imaging device 100 by rotating it about at least one of the yaw axis, the pitch axis, and the roll axis.
The plurality of imaging devices 60 are sensing cameras that image the surroundings of the UAV 1000 in order to control its flight. Two imaging devices 60 may be provided on the nose, i.e., the front, of the UAV 1000, and another two imaging devices 60 may be provided on the bottom surface of the UAV 1000. The two imaging devices 60 on the front side may be paired to function as a so-called stereo camera. The two imaging devices 60 on the bottom side may also be paired to function as a stereo camera. Three-dimensional spatial data around the UAV 1000 can be generated from the images captured by the plurality of imaging devices 60. The number of imaging devices 60 provided in the UAV 1000 is not limited to four. It suffices for the UAV 1000 to have at least one imaging device 60. The UAV 1000 may have at least one imaging device 60 on each of the nose, the tail, the sides, the bottom surface, and the top surface of the UAV 1000. The angle of view settable on the imaging devices 60 may be wider than that settable on the imaging device 100. The imaging devices 60 may have single-focus lenses or fisheye lenses.
The remote operation device 600 communicates with the UAV 1000 to operate the UAV 1000 remotely. The remote operation device 600 may communicate with the UAV 1000 wirelessly. The remote operation device 600 transmits to the UAV 1000 instruction information indicating various instructions relating to the movement of the UAV 1000, such as ascending, descending, accelerating, decelerating, moving forward, moving backward, and rotating. The instruction information includes, for example, instruction information for raising the altitude of the UAV 1000. The instruction information may indicate the altitude at which the UAV 1000 should be located. The UAV 1000 moves so as to be located at the altitude indicated by the instruction information received from the remote operation device 600. The instruction information may include an ascending instruction that makes the UAV 1000 ascend. The UAV 1000 ascends while it is receiving the ascending instruction. When the altitude of the UAV 1000 has reached its upper limit, the UAV 1000 may be restricted from ascending even if it receives the ascending instruction.
Fig. 12 shows one example of a computer 1200 that can embody aspects of the present invention in whole or in part. A program installed on the computer 1200 can cause the computer 1200 to function as an operation associated with a device according to an embodiment of the present invention, or as one or more "units" of that device. Alternatively, the program can cause the computer 1200 to execute that operation or those one or more "units". The program can cause the computer 1200 to execute a process, or stages of a process, according to an embodiment of the present invention. Such a program can be executed by the CPU 1212 to cause the computer 1200 to execute the specified operations associated with some or all of the blocks in the flowcharts and block diagrams described in this specification.
The computer 1200 of the present embodiment includes a CPU 1212 and a RAM 1214, which are connected to each other through a host controller 1210. The computer 1200 also includes a communication interface 1222 and an input/output unit, which are connected to the host controller 1210 through an input/output controller 1220. The computer 1200 also includes a ROM 1230. The CPU 1212 operates according to programs stored in the ROM 1230 and the RAM 1214, thereby controlling each unit.
The communication interface 1222 communicates with other electronic devices through a network. A hard disk drive can store programs and data used by the CPU 1212 in the computer 1200. The ROM 1230 stores therein a boot program or the like executed by the computer 1200 at startup, and/or programs that depend on the hardware of the computer 1200. Programs are provided through a computer-readable recording medium such as a CD-ROM, a USB memory, or an IC card, or through a network. The programs are installed in the RAM 1214 or the ROM 1230, which are also examples of computer-readable recording media, and are executed by the CPU 1212. The information processing described in these programs is read by the computer 1200 and brings about cooperation between the programs and the various types of hardware resources described above. A device or method may be constituted by realizing the operation or processing of information in accordance with the use of the computer 1200.
For example, when communication is executed between the computer 1200 and an external device, the CPU 1212 may execute a communication program loaded in the RAM 1214 and instruct the communication interface 1222 to perform communication processing based on the processing described in the communication program. Under the control of the CPU 1212, the communication interface 1222 reads transmission data stored in a transmission buffer provided in a recording medium such as the RAM 1214 or a USB memory and transmits the read transmission data to the network, or writes reception data received from the network into a reception buffer or the like provided in the recording medium.
In addition, the CPU 1212 may cause the RAM 1214 to read all or necessary portions of a file or database stored in an external recording medium such as a USB memory, and execute various types of processing on the data in the RAM 1214. The CPU 1212 may then write the processed data back to the external recording medium.
Various types of information, such as various types of programs, data, tables, and databases, may be stored in recording media and subjected to information processing. For the data read from the RAM 1214, the CPU 1212 may execute various types of processing described throughout this disclosure and specified by instruction sequences of programs, including various types of operations, information processing, condition judgment, conditional branching, unconditional branching, information retrieval/replacement, and the like, and write the results back to the RAM 1214. The CPU 1212 may also retrieve information in files, databases, and the like in the recording media. For example, when a plurality of entries each having an attribute value of a first attribute associated with an attribute value of a second attribute are stored in a recording medium, the CPU 1212 may retrieve, from among the plurality of entries, an entry matching a condition that specifies the attribute value of the first attribute, read the attribute value of the second attribute stored in that entry, and thereby obtain the attribute value of the second attribute associated with the first attribute satisfying the predetermined condition.
The programs or software modules described above may be stored on the computer 1200 or on a computer-readable storage medium near the computer 1200. In addition, a recording medium such as a hard disk or a RAM provided in a server system connected to a dedicated communication network or the Internet can be used as a computer-readable storage medium, so that the programs can be provided to the computer 1200 via the network.
The present invention has been described above using embodiments, but the technical scope of the present invention is not limited to the scope described in the above embodiments. It will be apparent to a person of ordinary skill in the art that various changes or improvements can be made to the above embodiments. It is apparent from the description of the claims that forms incorporating such changes or improvements can be included within the technical scope of the present invention.
It should be noted that the execution order of each process, such as operations, sequences, steps, and stages, in the devices, systems, programs, and methods shown in the claims, the specification, and the drawings can be realized in any order, as long as "before", "in advance", and the like are not explicitly indicated, and as long as the output of a preceding process is not used in a subsequent process. Even if operational flows in the claims, the specification, and the drawings are described using "first", "next", and the like for convenience, this does not mean that they must be implemented in this order.
【Explanation of Symbols】
10 Imaging system
20 UAV body
50 Gimbal
60 Imaging device
100 Imaging device
101 Lens optical axis
110 Imaging controller
120 Image sensor
130 Memory
150 Lens controller
152 Lens driver
154 Lens
160 TOF sensor
161 Lens optical axis
162 Light emitter
163 Light-emitting element
164 Light receiver
165 Light-receiving element
166 Light-emission controller
167 Light-reception controller
168 Memory
200 Support mechanism
201 Roll-axis drive mechanism
202 Pitch-axis drive mechanism
203 Yaw-axis drive mechanism
204 Base
210 Attitude controller
212 Angular velocity sensor
214 Acceleration sensor
300 Grip
301 Operation interface
302 Display
400 Smartphone
600 Remote operation device
1200 Computer
1210 Host controller
1212 CPU
1214 RAM
1220 Input/output controller
1222 Communication interface
1230 ROM

Claims (9)

  1. A control device that controls an imaging device, the imaging device including a distance-measuring sensor that uses a first image sensor to measure the distances to subjects associated with each of a plurality of regions, characterized in that the control device comprises a circuit,
    the circuit being configured to determine, from among a plurality of distances measured by the distance-measuring sensor, the distance to the subject lying on the lens optical axis of the imaging device, based on the plurality of distances, a first distance between the lens optical axis of the imaging device and the lens optical axis of the distance-measuring sensor, and the angle of view of the distance-measuring sensor.
  2. The control device according to claim 1, characterized in that the circuit is configured to determine, based on each of the plurality of distances, the first distance, and the angle of view, the width of the ranging range of the distance-measuring sensor at each of the plurality of distances, measured from the lens optical axis of the imaging device in the direction toward the lens optical axis of the distance-measuring sensor, and to determine the distance to the subject lying on the lens optical axis of the imaging device from among the plurality of distances based on the ratio of the first distance to each of the widths.
  3. The control device according to claim 1, characterized in that the circuit is configured to execute focus control of the imaging device based on the determined distance.
  4. The control device according to claim 3, characterized in that, when the distance to the subject lying on the lens optical axis of the imaging device cannot be determined from among the plurality of distances, the circuit is configured to execute focus control of the imaging device based on a contrast evaluation value of an image captured by the imaging device.
  5. The control device according to claim 1, characterized in that the circuit is configured to:
    determine a first target position of the focus lens of the imaging device based on the first distance; determine, while the focus lens is moved based on the first target position, a second target position of the focus lens based on the blur amounts of at least two images captured by the imaging device; and move the focus lens to the second target position, thereby executing focus control.
  6. An imaging device, characterized by comprising: the control device according to any one of claims 1 to 5;
    the distance-measuring sensor; and
    a second image sensor that captures images of the subject.
  7. A movable object, characterized in that it carries the imaging device according to claim 6 and moves.
  8. A control method for controlling an imaging device, the imaging device including a distance-measuring sensor that uses a first image sensor to measure the distances to subjects associated with each of a plurality of regions, characterized by
    comprising a stage of determining, from among a plurality of distances measured by the distance-measuring sensor, the distance to the subject lying on the lens optical axis of the imaging device, based on the plurality of distances, a first distance between the lens optical axis of the imaging device and the lens optical axis of the distance-measuring sensor, and the angle of view of the distance-measuring sensor.
  9. A program, characterized in that it is used to cause a computer to function as the control device described above.
PCT/CN2020/083101 2019-04-23 2020-04-03 Control device, camera device, movable object, control method, and program WO2020216037A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202080002854.7A CN112154371A (zh) 2019-04-23 2020-04-03 Control device, camera device, movable object, control method, and program
US17/506,426 US20220046177A1 (en) 2019-04-23 2021-10-20 Control device, camera device, movable object, control method, and program

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2019082336A JP6768997B1 (ja) 2019-04-23 2019-04-23 Control device, camera device, movable object, control method, and program
JP2019-082336 2019-04-23

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/506,426 Continuation US20220046177A1 (en) 2019-04-23 2021-10-20 Control device, camera device, movable object, control method, and program

Publications (1)

Publication Number Publication Date
WO2020216037A1 true WO2020216037A1 (zh) 2020-10-29

Family

ID=72745108

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/083101 WO2020216037A1 (zh) 2019-04-23 2020-04-03 Control device, camera device, movable object, control method, and program

Country Status (4)

Country Link
US (1) US20220046177A1 (zh)
JP (1) JP6768997B1 (zh)
CN (1) CN112154371A (zh)
WO (1) WO2020216037A1 (zh)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022213339A1 (zh) * 2021-04-09 2022-10-13 SZ DJI Technology Co., Ltd. Focusing method, photographing device, photographing system, and readable storage medium

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20010031143A1 (en) * 2000-02-22 2001-10-18 Minolta Co., Ltd. Imaging system, two-dimensional photographing device and three-dimensional measuring device
CN1519639A (zh) * 2003-02-04 2004-08-11 Olympus Corporation Camera
CN1536424A (zh) * 2000-07-14 2004-10-13 Olympus Optical Co., Ltd. Distance-measuring device and camera having the same
CN101701793A (zh) * 2009-10-29 2010-05-05 Tianjin Samsung Opto-Electronics Co., Ltd. Method for measuring the distance between an object and the shooting camera using a digital camera
CN101968354A (zh) * 2010-09-29 2011-02-09 Tsinghua University Ranging method for unmanned helicopter based on laser detection and image recognition
CN102445183A (zh) * 2011-10-09 2012-05-09 Fujian Huichuan Digital Technology Co., Ltd. Apparatus and positioning method for the ranging laser point of a remote ranging system based on a laser parallel to a camera
US20160138910A1 (en) * 2014-11-13 2016-05-19 Samsung Electronics Co., Ltd. Camera for measuring depth image and method of measuring depth image
CN107333036A (zh) * 2017-06-28 2017-11-07 UISEE Technologies (Beijing) Co., Ltd. Binocular camera
CN107544073A (zh) * 2017-08-29 2018-01-05 Benewake (Beijing) Co., Ltd. Aircraft detection method and altitude control method
CN108303702A (zh) * 2017-12-30 2018-07-20 Wuhan Lingtu Sensing Technology Co., Ltd. Phase-type laser ranging system and method

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001221945A (ja) * 2000-02-08 2001-08-17 Ricoh Co Ltd Automatic focusing device
CN101430477B (zh) * 2007-11-09 2011-06-08 Hongfujin Precision Industry (Shenzhen) Co., Ltd. Method for judging the distance of a photographed object
JP2009175279A (ja) * 2008-01-22 2009-08-06 Olympus Imaging Corp Camera system
JP5609270B2 (ja) * 2010-05-28 2014-10-22 Sony Corporation Imaging device, imaging system, method of controlling an imaging device, and program
JP2012141436A (ja) * 2010-12-28 2012-07-26 Canon Inc Focus detection device and control method thereof
JP5834410B2 (ja) * 2011-01-13 2015-12-24 Ricoh Co., Ltd. Imaging device and imaging method
US9551914B2 (en) * 2011-03-07 2017-01-24 Microsoft Technology Licensing, Llc Illuminator with refractive optical element
JP2013247543A (ja) * 2012-05-28 2013-12-09 Sony Corporation Imaging device, display device, image processing method, and program
WO2017197630A1 (en) * 2016-05-19 2017-11-23 SZ DJI Technology Co., Ltd. Autofocus initialization based on target detection

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20010031143A1 (en) * 2000-02-22 2001-10-18 Minolta Co., Ltd. Imaging system, two-dimensional photographing device and three-dimensional measuring device
CN1536424A (zh) * 2000-07-14 2004-10-13 Olympus Optical Co., Ltd. Distance-measuring device and camera having the same
CN1519639A (zh) * 2003-02-04 2004-08-11 Olympus Corporation Camera
CN101701793A (zh) * 2009-10-29 2010-05-05 Tianjin Samsung Opto-Electronics Co., Ltd. Method for measuring the distance between an object and the shooting camera using a digital camera
CN101968354A (zh) * 2010-09-29 2011-02-09 Tsinghua University Ranging method for unmanned helicopter based on laser detection and image recognition
CN102445183A (zh) * 2011-10-09 2012-05-09 Fujian Huichuan Digital Technology Co., Ltd. Apparatus and positioning method for the ranging laser point of a remote ranging system based on a laser parallel to a camera
US20160138910A1 (en) * 2014-11-13 2016-05-19 Samsung Electronics Co., Ltd. Camera for measuring depth image and method of measuring depth image
CN107333036A (zh) * 2017-06-28 2017-11-07 UISEE Technologies (Beijing) Co., Ltd. Binocular camera
CN107544073A (zh) * 2017-08-29 2018-01-05 Benewake (Beijing) Co., Ltd. Aircraft detection method and altitude control method
CN108303702A (zh) * 2017-12-30 2018-07-20 Wuhan Lingtu Sensing Technology Co., Ltd. Phase-type laser ranging system and method

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
WANG, YAZHOU: "Non-official translation: Design of 3D Depth Camera using TOF Image Sensor and Binocular Vision", INFORMATION & TECHNOLOGY, CHINA MASTER'S THESES FULL-TEXT DATABASE, vol. 07, 15 July 2017 (2017-07-15) *

Also Published As

Publication number Publication date
JP2020181028A (ja) 2020-11-05
JP6768997B1 (ja) 2020-10-14
CN112154371A (zh) 2020-12-29
US20220046177A1 (en) 2022-02-10

Similar Documents

Publication Publication Date Title
WO2018185939A1 (ja) Imaging control device, imaging device, imaging system, movable object, imaging control method, and program
JP2019110462A (ja) Control device, system, control method, and program
JP2020012878A (ja) Control device, movable object, control method, and program
JP2019216343A (ja) Determination device, movable object, determination method, and program
WO2021013143A1 (zh) Device, imaging device, movable object, method, and program
WO2020216037A1 (zh) Control device, imaging device, movable object, control method, and program
CN112335227A (zh) Control device, imaging system, control method, and program
WO2020098603A1 (zh) Determination device, imaging device, imaging system, movable object, determination method, and program
WO2021031833A1 (zh) Control device, imaging system, control method, and program
JP6543875B2 (ja) Control device, imaging device, flying object, control method, and program
US20220188993A1 (en) Control apparatus, photographing apparatus, control method, and program
WO2019085771A1 (zh) Control device, lens device, imaging device, flying object, and control method
WO2020108284A1 (zh) Determination device, movable object, determination method, and program
WO2021052216A1 (zh) Control device, imaging device, control method, and program
WO2019223614A1 (zh) Control device, imaging device, movable object, control method, and program
WO2020011198A1 (zh) Control device, movable object, control method, and program
WO2021249245A1 (zh) Device, imaging device, imaging system, and movable object
WO2021031840A1 (zh) Device, imaging device, movable object, method, and program
CN111226170A (zh) Control device, movable object, control method, and program
JP7043706B2 (ja) Control device, imaging system, control method, and program
US20220070362A1 (en) Control apparatuses, photographing apparatuses, movable objects, control methods, and programs
WO2021143425A1 (zh) Control device, imaging device, movable object, control method, and program
JP6569157B1 (ja) Control device, imaging device, movable object, control method, and program
WO2022001561A1 (zh) Control device, imaging device, control method, and program
WO2021204020A1 (zh) Device, imaging device, imaging system, movable object, method, and program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20794265

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20794265

Country of ref document: EP

Kind code of ref document: A1