WO2021031833A1 - Control device, photographing system, control method, and program - Google Patents

Control device, photographing system, control method, and program Download PDF

Info

Publication number
WO2021031833A1
Authority
WO
WIPO (PCT)
Prior art keywords
positional relationship
image
imaging
focusing lens
focus
Prior art date
Application number
PCT/CN2020/106579
Other languages
French (fr)
Chinese (zh)
Inventor
高宫诚
永山佳范
Original Assignee
深圳市大疆创新科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳市大疆创新科技有限公司 filed Critical 深圳市大疆创新科技有限公司
Priority to CN202080003361.5A priority Critical patent/CN112335227A/en
Publication of WO2021031833A1 publication Critical patent/WO2021031833A1/en

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B7/00Mountings, adjusting means, or light-tight connections, for optical elements
    • G02B7/28Systems for automatic generation of focusing signals
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B7/00Mountings, adjusting means, or light-tight connections, for optical elements
    • G02B7/28Systems for automatic generation of focusing signals
    • G02B7/36Systems for automatic generation of focusing signals using image sharpness techniques, e.g. image processing techniques for generating autofocus signals
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B7/00Mountings, adjusting means, or light-tight connections, for optical elements
    • G02B7/28Systems for automatic generation of focusing signals
    • G02B7/40Systems for automatic generation of focusing signals using time delay of the reflected waves, e.g. of ultrasonic waves
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B13/00Viewfinders; Focusing aids for cameras; Means for focusing for cameras; Autofocus systems for cameras
    • G03B13/18Focusing aids
    • G03B13/20Rangefinders coupled with focusing arrangements, e.g. adjustment of rangefinder automatically focusing camera
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B13/00Viewfinders; Focusing aids for cameras; Means for focusing for cameras; Autofocus systems for cameras
    • G03B13/32Means for focusing
    • G03B13/34Power focusing
    • G03B13/36Autofocus systems
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B5/00Adjustment of optical system relative to image or object surface other than for focusing
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules

Definitions

  • the invention relates to a control device, a camera system, a control method and a program.
  • Patent Document 1 describes that the distance to the target subject is measured based on the reflected light of the light pulse.
  • The distance measuring sensor that measures the distance to an object based on the reflected light of a light pulse may be unable to measure the distance to the object accurately because it may receive external light, such as sunlight, that is not the reflected light of the light pulse.
  • the control device may be a control device that controls an imaging device including a distance measuring sensor and a focusing lens.
  • The distance measuring sensor includes a light emitting element that emits pulsed light and a light receiving element that receives light including reflected light of the pulsed light from an object and outputs a signal corresponding to the amount of received light, and the sensor measures the distance to the object based on the signal.
  • The control device may include a circuit configured to, when the signal satisfies a first condition indicating the reliability of the distance, perform focus control of focusing on the object based on a first target positional relationship between the imaging surface of the imaging device and the focus lens determined according to the distance.
  • The circuit may be configured to, when the signal does not satisfy the first condition, perform focus control based on a second target positional relationship between the imaging surface and the focus lens determined from the respective blur amounts of a first image, captured by the imaging device when the positional relationship between the imaging surface and the focus lens is a first positional relationship, and a second image, captured by the imaging device when that positional relationship is a second positional relationship.
  • The circuit may be configured to perform focus control according to the second target positional relationship when a first degree of difference, indicating the degree of difference between the first image and the second image, satisfies a second condition indicating the reliability of the second target positional relationship.
  • the circuit may be configured to perform focus control according to contrast AF when the first degree of difference does not satisfy the second condition.
  • The circuit may be configured such that, when the first degree of difference does not satisfy the second condition, the positional relationship between the imaging surface and the focus lens is changed from the second positional relationship to a third positional relationship; when a second degree of difference, indicating the degree of difference between a third image captured by the imaging device with the imaging surface and the focus lens in the third positional relationship and the first image or the second image, satisfies the second condition, focus control is performed based on a third target positional relationship between the imaging surface and the focus lens determined from the respective blur amounts of the first image or the second image and the third image; and when the second degree of difference does not satisfy the second condition, focus control is performed according to contrast AF.
  • When the signal indicates that the amount of received light is within a preset range, the signal may satisfy the first condition.
  • The circuit may be configured to, when the signal does not satisfy the first condition, acquire the first image captured by the imaging device when the positional relationship between the imaging surface and the focus lens is the first positional relationship, then change that positional relationship to the second positional relationship, and acquire the second image captured by the imaging device when the positional relationship between the imaging surface and the focus lens is the second positional relationship.
  • The circuit may be configured to, when the signal does not satisfy the first condition, acquire the first image captured by the imaging device when the positional relationship between the imaging surface and the focus lens is the first positional relationship, and then move the focus lens a predetermined distance in the direction determined from the first target positional relationship, thereby changing the positional relationship between the imaging surface and the focus lens to the second positional relationship.
  • the camera system may include the above-mentioned control device, a distance measuring sensor, and a camera device.
  • the control method may be a control method for controlling an imaging device including a distance measuring sensor and a focus lens.
  • The distance measuring sensor includes a light emitting element that emits pulsed light and a light receiving element that receives light including reflected light of the pulsed light from an object and outputs a signal corresponding to the amount of received light, and the sensor measures the distance to the object based on the signal.
  • The control method may include, when the signal satisfies the first condition indicating the reliability of the distance, performing focus control of focusing on the object based on the first target positional relationship between the imaging surface of the imaging device and the focus lens determined according to the distance.
  • The control method may include, when the signal does not satisfy the first condition, performing focus control based on the second target positional relationship between the imaging surface and the focus lens determined from the respective blur amounts of the first image, captured by the imaging device when the positional relationship between the imaging surface and the focus lens is the first positional relationship, and the second image, captured by the imaging device when that positional relationship is the second positional relationship.
  • the program according to one aspect of the present invention may be a program for causing a computer to function as the above-mentioned control device.
  • Fig. 1 is an example of an external perspective view of an imaging system.
  • FIG. 2 is an example of an external perspective view showing another form of the imaging system.
  • Fig. 3 is a diagram showing functional blocks of the camera system.
  • FIG. 4 is a diagram showing an example of a curve showing the relationship between the blur amount and the lens position.
  • FIG. 5 is a diagram showing an example of the process of calculating the distance to the object based on the blur amount.
  • FIG. 6 is a diagram for explaining the relationship between the object position, the lens position, and the focal length.
  • FIG. 7 is a flowchart showing one example of the AF processing procedure of the imaging device.
  • FIG. 8 is a diagram showing the relationship between the TOF calculation result and the focus lens position.
  • FIG. 9 is a diagram showing an example of the action of the focus lens when performing AF.
  • FIG. 10 is a diagram showing an example of the action of the focus lens when performing AF.
  • FIG. 11 is a diagram showing an example of the action of the focus lens when performing AF.
  • FIG. 12 is a diagram showing an example of the action of the focus lens when performing AF.
  • Fig. 13 is a diagram showing an example of the appearance of an unmanned aircraft and a remote control device.
  • FIG. 14 is a diagram showing an example of the hardware configuration.
  • the blocks may show (1) the stages of the process of performing operations or (2) the "parts" of the device that perform operations. Specific stages and “parts” can be implemented by programmable circuits and/or processors.
  • Dedicated circuits may include digital and/or analog hardware circuits, and may include integrated circuits (ICs) and/or discrete circuits.
  • the programmable circuit may include a reconfigurable hardware circuit.
  • Reconfigurable hardware circuits can include logic operations such as logical AND, logical OR, logical XOR, logical NAND, and logical NOR, as well as memory elements such as flip-flops, registers, field programmable gate arrays (FPGA), and programmable logic arrays (PLA).
  • the computer-readable medium may include any tangible device that can store instructions to be executed by a suitable device.
  • the computer-readable medium having instructions stored thereon includes a product including instructions that can be executed to create means for performing operations specified by the flowchart or block diagram.
  • The computer-readable medium may include electronic storage media, magnetic storage media, optical storage media, electromagnetic storage media, semiconductor storage media, and the like.
  • The computer-readable medium may include a floppy (registered trademark) disk, a hard disk, random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), electrically erasable programmable read-only memory (EEPROM), static random access memory (SRAM), compact disc read-only memory (CD-ROM), a digital versatile disc (DVD), a Blu-ray (registered trademark) disc, a memory stick, an integrated circuit card, and the like.
  • the computer-readable instructions may include any one of source code or object code described in any combination of one or more programming languages.
  • The source code or object code may be written in an object-oriented programming language such as Smalltalk (registered trademark), JAVA (registered trademark), or C++, or in a conventional procedural programming language such as the "C" programming language or a similar programming language.
  • The computer-readable instructions may also take the form of assembly instructions, instruction set architecture (ISA) instructions, machine instructions, machine-dependent instructions, microcode, firmware instructions, or state-setting data.
  • The computer-readable instructions may be provided to a processor or a programmable circuit of a general-purpose computer, a special-purpose computer, or another programmable data processing device, either locally or via a local area network (LAN) or a wide area network (WAN) such as the Internet.
  • the processor or programmable circuit can execute computer-readable instructions to create means for performing the operations specified in the flowchart or block diagram. Examples of processors include computer processors, processing units, microprocessors, digital signal processors, controllers, microcontrollers, etc.
  • FIG. 1 is an example of an external perspective view of an imaging system 10 according to this embodiment.
  • the imaging system 10 includes an imaging device 100, a supporting mechanism 200 and a grip 300.
  • the imaging device 100 includes a TOF sensor 160.
  • the support mechanism 200 uses actuators to rotatably support the imaging device 100 around the roll axis, the pitch axis, and the yaw axis.
  • the support mechanism 200 can change or maintain the posture of the imaging device 100 by rotating the imaging device 100 around at least one of the roll axis, the pitch axis, and the yaw axis.
  • the support mechanism 200 includes a roll axis drive mechanism 201, a pitch axis drive mechanism 202, and a yaw axis drive mechanism 203.
  • the supporting mechanism 200 also includes a base 204 for fixing the yaw axis driving mechanism 203.
  • the grip 300 is fixed to the base 204.
  • The grip 300 includes an operation interface 301 and a display part 302.
  • the imaging device 100 is fixed to the pitch axis driving mechanism 202.
  • the operation interface 301 accepts instructions for operating the camera device 100 and the supporting mechanism 200 from the user.
  • the operation interface 301 may include a shutter/recording button for instructing the camera device 100 to shoot or record.
  • The operation interface 301 may include a power/function button for instructing to turn the power of the imaging system 10 on or off and to switch between the still image shooting mode and the moving image shooting mode of the imaging device 100.
  • the display part 302 can display an image captured by the imaging device 100.
  • the display unit 302 can display a menu screen for operating the imaging device 100 and the supporting mechanism 200.
  • the display unit 302 may be a touch screen display that accepts instructions for operating the imaging device 100 and the supporting mechanism 200.
  • FIG. 2 is an example of an external perspective view showing another form of the imaging system 10.
  • the camera system 10 can be used in a state where a mobile terminal including a display such as a smartphone 400 is fixed to the side of the grip 300. The user holds the grip 300 and takes a still image or a moving image through the imaging device 100.
  • A display such as that of the smartphone 400 displays the still images or moving images captured by the imaging device 100.
  • FIG. 3 is a diagram showing functional blocks of the imaging system 10.
  • the imaging device 100 includes an imaging control unit 110, an image sensor 120, a memory 130, a lens control unit 150, a lens driving unit 152, a plurality of lenses 154, and a TOF sensor 160.
  • the image sensor 120 may be composed of CCD or CMOS.
  • the image sensor 120 is an example of an image sensor used for shooting.
  • the image sensor 120 outputs image data of optical images formed by the plurality of lenses 154 to the imaging control unit 110.
  • the imaging control unit 110 may be constituted by a microprocessor such as a CPU or an MPU, a microcontroller such as an MCU, or the like.
  • In accordance with an operation instruction for the imaging device 100 from the grip 300, the imaging control unit 110 performs demosaicing processing on the image signal output from the image sensor 120 to generate image data.
  • the imaging control unit 110 stores image data in the memory 130.
  • the imaging control unit 110 controls the TOF sensor 160.
  • the imaging control unit 110 is an example of a circuit.
  • the TOF sensor 160 is a time-of-flight type sensor that measures the distance to an object.
  • the imaging apparatus 100 adjusts the position of the focus lens based on the distance measured by the TOF sensor 160, thereby performing focus control.
  • The memory 130 may be a computer-readable storage medium and may include at least one of SRAM, DRAM, EPROM, EEPROM, and flash memory such as USB memory.
  • the memory 130 stores programs and the like necessary for the imaging control unit 110 to control the image sensor 120 and the like.
  • the memory 130 may be provided inside the housing of the imaging device 100.
  • the grip 300 may include other memory for storing image data captured by the imaging device 100.
  • The grip 300 may have a slot through which the memory can be detached from the housing of the grip 300.
  • the multiple lenses 154 can function as zoom lenses, varifocal lenses, and focus lenses. At least a part or all of the plurality of lenses 154 are configured to be movable along the optical axis.
  • the lens control unit 150 drives the lens driving unit 152 in accordance with a lens control command from the imaging control unit 110 to move one or more lenses 154 in the optical axis direction.
  • the lens control commands are, for example, zoom control commands and focus control commands.
  • the lens driving part 152 may include a voice coil motor (VCM) that moves at least a part or all of the plurality of lenses 154 in the optical axis direction.
  • the lens driving part 152 may include a motor such as a DC motor, a coreless motor, or an ultrasonic motor.
  • the lens driving unit 152 can transmit the power from the motor to at least a part or all of the plurality of lenses 154 via mechanism components such as cam rings and guide shafts, and move at least a part or all of the plurality of lenses 154 along the optical axis.
  • the imaging device 100 further includes a posture control unit 210, an angular velocity sensor 212, and an acceleration sensor 214.
  • the angular velocity sensor 212 detects the angular velocity of the imaging device 100.
  • the angular velocity sensor 212 detects the respective angular velocities of the imaging device 100 around the roll axis, the pitch axis, and the yaw axis.
  • the posture control unit 210 acquires angular velocity information related to the angular velocity of the imaging device 100 from the angular velocity sensor 212.
  • the angular velocity information may show the respective angular velocities of the camera device 100 around the roll axis, the pitch axis, and the yaw axis.
  • the posture control unit 210 acquires acceleration information related to the acceleration of the imaging device 100 from the acceleration sensor 214.
  • the acceleration information may also show the acceleration of the camera device 100 in each direction of the roll axis, the pitch axis, and the yaw axis.
  • the angular velocity sensor 212 and the acceleration sensor 214 may be provided in a housing that houses the image sensor 120, the lens 154, and the like. In this embodiment, a mode of the integrated configuration of the imaging device 100 and the support mechanism 200 will be described. However, the supporting mechanism 200 may include a base for detachably fixing the camera device 100. In this case, the angular velocity sensor 212 and the acceleration sensor 214 may be provided outside the housing of the imaging device 100 such as a base.
  • the posture control unit 210 controls the support mechanism 200 based on angular velocity information and acceleration information to maintain or change the posture of the imaging device 100.
  • the posture control unit 210 controls the support mechanism 200 according to the operation mode of the support mechanism 200 for controlling the posture of the imaging device 100 to maintain or change the posture of the imaging device 100.
  • The working modes include a mode in which at least one of the roll axis driving mechanism 201, the pitch axis driving mechanism 202, and the yaw axis driving mechanism 203 of the support mechanism 200 is operated so that the posture change of the imaging device 100 follows the posture change of the base 204 of the support mechanism 200.
  • The working modes include a mode in which each of the roll axis driving mechanism 201, the pitch axis driving mechanism 202, and the yaw axis driving mechanism 203 of the support mechanism 200 is operated so that the posture change of the imaging device 100 follows the posture change of the base 204 of the support mechanism 200.
  • the working modes include the following modes: each of the pitch axis driving mechanism 202 and the yaw axis driving mechanism 203 of the support mechanism 200 is operated so that the posture change of the camera device 100 follows the posture change of the base 204 of the support mechanism 200.
  • the working modes include the following modes: only the yaw axis driving mechanism 203 is operated so that the posture change of the camera device 100 follows the posture change of the base 204 of the support mechanism 200.
  • The working modes may include an FPV (First Person View) mode, in which the support mechanism 200 is operated so that the posture change of the imaging device 100 follows the posture change of the base 204 of the support mechanism 200, and a fixed mode, in which the support mechanism 200 is operated so as to maintain the posture of the imaging device 100.
  • the FPV mode is a mode in which at least one of the roll axis drive mechanism 201, the pitch axis drive mechanism 202, and the yaw axis drive mechanism 203 is operated to make the posture change of the camera 100 follow the posture change of the base 204 of the support mechanism 200.
  • the fixed mode is a mode in which at least one of the roll axis drive mechanism 201, the pitch axis drive mechanism 202, and the yaw axis drive mechanism 203 is operated in order to maintain the current posture of the imaging device 100.
  • the TOF sensor 160 includes a light emitting unit 162, a light receiving unit 164, a light emitting control unit 166, a light receiving control unit 167, and a memory 168.
  • the TOF sensor 160 is an example of a distance measuring sensor.
  • the light emitting part 162 includes at least one light emitting element 163.
  • The light emitting element 163 is a device, such as an LED or a laser, that repeatedly emits high-speed modulated pulsed light.
  • the light emitting element 163 may emit pulsed light which is infrared light.
  • the light emission control unit 166 controls the light emission of the light emitting element 163.
  • the light emission control section 166 can control the pulse width of the pulse light emitted from the light emitting element 163.
  • the light receiving unit 164 includes a plurality of light receiving elements 165 that measure the distance to the subject associated with each of the plurality of regions.
  • the light receiving unit 164 is an example of an image sensor used for distance measurement.
  • the plurality of light receiving elements 165 respectively correspond to each of the plurality of regions.
  • the light receiving element 165 repeatedly receives reflected light of pulsed light from the object.
  • the light receiving element 165 receives light including reflected light of pulsed light from the object, and outputs a signal corresponding to the amount of received light.
  • the light receiving control unit 167 controls the light receiving element 165 to receive light.
  • the light receiving control unit 167 measures the distance to the subject associated with each of the plurality of areas based on the signal output from the light receiving element 165.
  • the light receiving control unit 167 measures the distance to the subject associated with each of the plurality of regions based on the amount of reflected light repeatedly received by the light receiving element 165 in the preset light receiving period.
  • the light receiving control unit 167 can measure the distance to the subject by determining the phase difference between the pulsed light and the reflected light based on the amount of reflected light repeatedly received by the light receiving element 165 in a preset light receiving period.
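  • As an illustrative sketch only (not part of the patent text), the following Python snippet shows how an indirect time-of-flight distance could be recovered from such a phase difference; the modulation frequency and all names are assumptions.

```python
import math

SPEED_OF_LIGHT = 299_792_458.0  # metres per second


def distance_from_phase(phase_difference_rad: float, modulation_frequency_hz: float) -> float:
    """Estimate the subject distance from the phase shift between emitted and reflected pulsed light.

    The round-trip delay is t = phase / (2*pi*f), and the light travels to the
    subject and back, so the one-way distance is c * t / 2.
    """
    round_trip_time = phase_difference_rad / (2.0 * math.pi * modulation_frequency_hz)
    return SPEED_OF_LIGHT * round_trip_time / 2.0


# Example: a phase shift of pi/2 at an assumed 10 MHz modulation frequency gives about 3.75 m.
print(distance_from_phase(math.pi / 2, 10e6))
```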
  • the light receiving unit 164 can measure the distance to the subject by reading the frequency change of the reflected wave. This is called FMCW (Frequency Modulated Continuous Wave) mode.
  • the memory 168 may be a computer-readable recording medium, and may include at least one of SRAM, DRAM, EPROM, and EEPROM.
  • the memory 168 stores a program necessary for the light emitting control unit 166 to control the light emitting unit 162, a program necessary for the light receiving control unit 167 to control the light receiving unit 164, and the like.
  • the auto focus (AF) method executed by the imaging device 100 will be described.
  • the imaging device 100 may move the focus lens according to the distance (subject distance) from the imaging device 100 to the subject measured by the TOF sensor 160 to control the positional relationship between the focus lens and the imaging surface of the image sensor 120.
  • the imaging control unit 110 may perform contrast AF by deriving the contrast evaluation value of the image captured by the imaging device 100 while moving the focus lens, and determining the position of the focus lens when the contrast evaluation value reaches a peak.
  • the imaging control unit 110 may apply a contrast evaluation filter to the image captured by the imaging device 100, thereby deriving the contrast evaluation value of the image.
  • the imaging control unit 110 may determine the position of the focus lens focused on the designated subject based on the contrast evaluation value to perform contrast AF.
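  • The patent does not specify the contrast evaluation filter; purely as a hedged sketch, a Laplacian-style high-pass filter summed over the AF frame is one common way such a contrast evaluation value could be computed (NumPy is assumed).

```python
import numpy as np


def contrast_evaluation_value(image: np.ndarray) -> float:
    """Sum of absolute Laplacian responses over a 2-D grayscale AF frame.

    Sharper (better focused) images contain more high-frequency detail and
    therefore yield a larger evaluation value.
    """
    laplacian = (
        -4.0 * image[1:-1, 1:-1]
        + image[:-2, 1:-1] + image[2:, 1:-1]
        + image[1:-1, :-2] + image[1:-1, 2:]
    )
    return float(np.abs(laplacian).sum())
```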
  • The imaging control unit 110 may also perform AF in the BDAF (Bokeh Detection Auto Focus) method, which uses the amount of image blur, as described below.
  • The amount of image blur can be expressed by the following equation (1) using a Gaussian function:
    C(x) = 1/(√(2π)·σ) · exp(−x²/(2σ²))   …(1)
  • Here, x represents the pixel position in the horizontal direction and σ represents the standard deviation.
  • FIG. 4 shows an example of a curve representing the relationship between the amount of image blur (Cost) and the focus lens position.
  • C1 is the blur amount of the image acquired when the focus lens is located at x1.
  • C2 is the blur amount of the image acquired when the focus lens is located at x2. The subject can be brought into focus by moving the focus lens to the lens position x0 corresponding to the minimum point 502 of the curve 500, which is determined from the blur amounts C1 and C2 in consideration of the optical characteristics of the lens 154.
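  • Purely as an illustration of locating the minimum point 502: if the blur-cost curve 500 is approximated near its minimum by a symmetric parabola whose curvature reflects the optical characteristics of the lens 154, the two samples (x1, C1) and (x2, C2) suffice to solve for x0. The quadratic model and the curvature value are assumptions, not the patent's actual curve.

```python
def estimate_focus_position(x1: float, c1: float, x2: float, c2: float, curvature: float) -> float:
    """Minimum x0 of an assumed blur-cost model C(x) = curvature * (x - x0)**2 + c_min.

    Subtracting the model evaluated at x1 and x2 eliminates the unknown c_min:
        C1 - C2 = curvature * ((x1**2 - x2**2) - 2 * x0 * (x1 - x2))
    which is linear in x0.
    """
    if x1 == x2:
        raise ValueError("the two lens positions must differ")
    return ((x1 ** 2 - x2 ** 2) - (c1 - c2) / curvature) / (2.0 * (x1 - x2))
```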
  • Fig. 5 is a flowchart showing an example of a distance calculation process in the BDAF method.
  • the imaging control unit 110 takes a first image and stores it in the memory 130 when the lens 154 and the imaging surface of the image sensor 120 are in a first positional relationship.
  • the imaging control unit 110 moves the lens 154 in the optical axis direction so that the lens 154 and the imaging surface are in a second positional relationship, and the second image is captured by the imaging device 100 and stored in the memory 130 (S201).
  • the imaging control unit 110 changes the positional relationship between the lens 154 and the imaging surface from the first positional relationship to the second positional relationship by moving the focus lens along the optical axis direction.
  • the amount of movement of the lens may be about 10 ⁇ m, for example.
  • the imaging control unit 110 divides the first image into a plurality of regions (S202).
  • the imaging control unit 110 may calculate a feature amount for each pixel in the first image, and divide the first image into a plurality of regions by using a group of pixels having similar feature amounts as one region.
  • the imaging control unit 110 may divide the pixel group set as the range of the AF processing frame in the first image into a plurality of regions.
  • the imaging control unit 110 divides the second image into a plurality of regions corresponding to the plurality of regions of the first image.
  • the imaging control unit 110 calculates the distance to the subject corresponding to the object contained in each of the multiple areas based on the respective blur amounts of the multiple areas of the first image and the respective blur amounts of the multiple areas of the second image ( S203).
  • the method of changing the positional relationship between the lens 154 and the imaging surface of the image sensor 120 is not limited to the method of moving the focus lens provided in the lens 154.
  • the imaging control unit 110 may move the entire lens 154 in the optical axis direction.
  • the imaging control unit 110 can move the imaging surface of the image sensor 120 in the optical axis direction.
  • the imaging control unit 110 can move at least a part of the lens included in the lens 154 and the imaging surface of the image sensor 120 along the optical axis direction.
  • the imaging control unit 110 may adopt any method for optically changing the relative positional relationship between the focal point of the lens 154 and the imaging surface of the image sensor 120.
  • The calculation process of the subject distance will be further explained with reference to FIG. 6.
  • The distance from the principal point of the lens L to the subject 510 (object plane) is denoted A, the distance from the principal point of the lens L to the position at which the light beam from the subject 510 forms an image (image plane) is denoted B, and the focal length of the lens L is denoted F.
  • According to the lens formula, the relationship between the distance A, the distance B, and the focal length F can be expressed by the following formula (2):
    1/A + 1/B = 1/F   …(2)
  • The focal length F is determined by the positions of the lenses included in the lens L. Therefore, if the distance B at which the light beam from the subject 510 forms an image can be determined, formula (2) can be used to determine the distance A from the principal point of the lens L to the subject 510.
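  • As a small illustrative helper (names assumed), formula (2) can be rearranged to compute the object distance A once B and F are known:

```python
def object_distance(image_distance_b: float, focal_length_f: float) -> float:
    """Solve the lens formula 1/A + 1/B = 1/F for the object distance A.

    `image_distance_b` and `focal_length_f` must share the same unit; B must be
    larger than F for a real (finite, positive) object distance.
    """
    if image_distance_b <= focal_length_f:
        raise ValueError("B must exceed F")
    return 1.0 / (1.0 / focal_length_f - 1.0 / image_distance_b)


# Example: F = 50 mm and B = 52 mm give A = 1300 mm.
print(object_distance(52.0, 50.0))
```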
  • When the imaging surface of the image sensor is moved toward the lens L side, the positional relationship between the lens L and the imaging surface changes, and the image of the subject 510 projected on the imaging surface becomes blurred.
  • The distance B can be determined by calculating the imaging position of the subject 510 from the size of the blur (the circles of confusion 512 and 514) of the image of the subject 510 projected on the imaging surface, and the distance A can then be determined from formula (2). That is, since the size of the blur (blur amount) is proportional to the distance between the imaging surface and the imaging position, the imaging position can be determined from the difference in blur amount.
  • The image I1 captured with the imaging surface at the position at distance D1 and the image I2 captured with the imaging surface at the position at distance D2 are each blurred.
  • For the image I1, assuming that the point spread function is PSF1 and the subject image is Id1, the image I1 can be represented by the following convolution operation, equation (3):
    I1 = PSF1 * Id1   …(3)
  • Similarly, the image I2 can be represented by the convolution operation of PSF2.
  • Let the Fourier transform of the subject image be f, and let the optical transfer functions obtained by the Fourier transform of the point spread functions PSF1 and PSF2 be OTF1 and OTF2; their ratio is taken as in the following equation (4):
    C = F(I2) / F(I1) = (OTF2 · f) / (OTF1 · f) = OTF2 / OTF1   …(4)
  • The C value shown in equation (4) is the amount of change between the respective blur amounts of the image captured at the position at distance D1 from the principal point of the lens L and the image captured at the position at distance D2 from the principal point of the lens L; that is, the C value corresponds to the difference between the blur amounts of those two images.
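  • A hedged numerical sketch of equation (4): dividing the Fourier transforms of the two differently blurred images cancels the common subject spectrum and leaves the ratio of the optical transfer functions. NumPy and the regularization constant are assumptions, not part of the patent.

```python
import numpy as np


def dfd_c_value(image_1: np.ndarray, image_2: np.ndarray, eps: float = 1e-6) -> np.ndarray:
    """Per-frequency ratio C = F(I2) / F(I1), i.e. OTF2 / OTF1 in equation (4).

    Since I1 = PSF1 * Id1 and I2 is the corresponding convolution with PSF2
    (equation (3)), the subject spectrum cancels in the ratio, leaving a value
    that depends only on the change in blur between the two lens positions.
    """
    spectrum_1 = np.fft.fft2(image_1)
    spectrum_2 = np.fft.fft2(image_2)
    return spectrum_2 / (spectrum_1 + eps)  # eps avoids division by near-zero bins
```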
  • In FIG. 6, the case where the positional relationship between the lens L and the imaging surface is changed by moving the imaging surface toward the lens L side has been described.
  • Even when the positional relationship between the focal position of the lens L and the imaging surface is changed in another way, the blur amount likewise changes.
  • In the present embodiment, images with different blur amounts are acquired mainly by moving the focus lens relative to the imaging surface; a DFD calculation is performed on the acquired images to obtain a DFD calculation value indicating the defocus amount, and based on the DFD calculation value, the target value of the focus lens position for focusing on the subject is calculated as the target positional relationship between the imaging surface and the focus lens.
  • the imaging device 100 performs AF in any one of AF based on the TOF sensor 160 distance measurement (TOF method), AF based on DFD calculation (DFD method), and AF based on a contrast evaluation value (contrast method).
  • In the TOF method, the imaging device 100 can derive the target value of the focus lens position for focusing on the subject using one image.
  • The TOF method can derive the target value accurately.
  • However, external light other than the reflected light of the pulsed light may enter the TOF sensor 160, and in that case the target value may not be derived accurately.
  • In the DFD method, the imaging device 100 can derive the target value of the focus lens position for focusing on the subject based on two or more images captured at different focus lens positions.
  • However, when the imaging device 100 captures low-contrast subjects such as clouds, the difference in the blur amounts of the two or more images is sometimes small and the target value cannot be derived accurately.
  • In the contrast method, the imaging device 100 needs to capture multiple images to determine the focus lens position at which the contrast evaluation value reaches its peak, so deriving the target value takes longer.
  • On the other hand, the contrast method is highly resistant to disturbances such as mechanical changes caused by temperature and changes in optical components such as lenses, and the accuracy of the target value is also high.
  • The optimal AF method differs depending on the environment in which the subject photographed by the imaging device 100 is located. Moreover, by performing AF while moving the focus lens as little as possible, image blur and the like do not occur during the AF processing, and the sense of incongruity caused to the user is reduced. Therefore, according to the present embodiment, the imaging device 100 preferentially executes the AF method that can perform AF with as little unnecessary movement of the focus lens as possible. More specifically, when the reliability of the subject distance calculated in the TOF method is high, the imaging device 100 performs AF in the TOF method. When the reliability of the subject distance calculated in the TOF method is not high, the imaging device 100 performs AF in the DFD method. Further, when the reliability of the results calculated in the TOF method and the DFD method is not high, AF is performed in the contrast method.
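  • The priority just described can be summarized by the following schematic sketch; the function and variable names are assumptions, and the actual determinations use the conditions detailed below.

```python
def select_focus_target(tof_reliable: bool, dfd_reliable: bool,
                        tof_target: float, dfd_target: float,
                        contrast_search) -> float:
    """Choose the focus-lens target, preferring methods that move the lens least.

    `contrast_search` is a callable performing the slower contrast AF peak
    search and returning the resulting lens position.
    """
    if tof_reliable:            # signal satisfies the first condition
        return tof_target
    if dfd_reliable:            # images satisfy the second condition
        return dfd_target
    return contrast_search()    # fall back to contrast AF
```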
  • The imaging control unit 110 determines whether the signal corresponding to the amount of light output from the light receiving element 165 satisfies the first condition indicating the reliability of the subject distance. When the signal indicates that the amount of light is within the preset range, the imaging control unit 110 determines that the signal satisfies the first condition. The imaging control unit 110 may determine that the signal satisfies the first condition when the signal continues to indicate that the amount of light is within the preset range for a preset period. The imaging control unit 110 may determine that the signal satisfies the first condition when the amplitude of the light received by the light receiving element 165 indicated by the signal is at or above a lower threshold and the A/D-converted value of the signal is at or below an upper threshold.
  • That is, when the light received by the light receiving element 165 is neither too weak nor too strong relative to the pulsed light emitted from the light emitting element 163, the imaging control unit 110 determines that the signal satisfies the first condition.
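  • A minimal sketch of the first-condition test just described; the threshold names and the per-sample structure are assumptions.

```python
def signal_satisfies_first_condition(amplitude: float, adc_value: int,
                                     lower_amplitude_threshold: float,
                                     upper_adc_threshold: int) -> bool:
    """True when the received light is neither too weak (amplitude) nor too strong (saturation)."""
    return amplitude >= lower_amplitude_threshold and adc_value <= upper_adc_threshold


def condition_holds_over_period(samples) -> bool:
    """Optionally require the condition to hold for every sample in a preset period.

    `samples` is an iterable of (amplitude, adc_value, lower_threshold, upper_threshold) tuples.
    """
    return all(signal_satisfies_first_condition(*sample) for sample in samples)
```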
  • When the signal satisfies the first condition, the imaging control unit 110 performs focus control of focusing on the object based on the first target positional relationship between the imaging surface of the imaging device 100 and the focus lens, which is determined from the distance to the object measured by the TOF sensor 160.
  • When the signal does not satisfy the first condition, the imaging control unit 110 performs focus control of focusing on the object based on the second target positional relationship between the imaging surface and the focus lens, which is determined from the respective blur amounts of the first image, captured by the imaging device 100 when the positional relationship between the imaging surface and the focus lens is the first positional relationship, and the second image, captured when that positional relationship is the second positional relationship. That is, when the signal does not satisfy the first condition, the imaging control unit 110 performs focus control in the DFD method.
  • When the signal does not satisfy the first condition, the imaging control unit 110 may acquire the first image captured by the imaging device 100 when the imaging surface and the focus lens are in the first positional relationship, then change the positional relationship between the imaging surface and the focus lens to the second positional relationship, and acquire the second image captured by the imaging device 100 when the imaging surface and the focus lens are in the second positional relationship.
  • Alternatively, the imaging control unit 110 may acquire the first image captured by the imaging device 100 when the imaging surface and the focus lens are in the first positional relationship, and then move the focus lens a preset distance in the direction determined from the first target positional relationship, thereby changing the positional relationship between the imaging surface and the focus lens to the second positional relationship. Even when the accuracy of the subject distance calculated in the TOF method is low, the imaging control unit 110 can correctly determine the direction in which the focus lens should move based on that subject distance. Therefore, when the signal does not satisfy the first condition, the imaging control unit 110 may determine the moving direction of the focus lens from the subject distance calculated in the TOF method.
  • Before performing focus control in the DFD method, the imaging control unit 110 may determine whether the first degree of difference, which indicates the degree of difference between the first image and the second image, satisfies the second condition indicating the reliability of the second target positional relationship.
  • The imaging control unit 110 may determine whether the first degree of difference satisfies the second condition based on the similarity between the first image and the second image. When the similarity between the first image and the second image is lower than a preset threshold, the imaging control unit 110 may determine that the first degree of difference satisfies the second condition. When the difference between the contrast of the first image and the contrast of the second image is greater than a preset threshold, the imaging control unit 110 may determine that the first degree of difference satisfies the second condition.
  • When the first degree of difference satisfies the second condition, the imaging control unit 110 may perform focus control of focusing on the object according to the second target positional relationship derived in the DFD method.
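  • As a hedged sketch of the second-condition test: the DFD result is treated as reliable when the two images differ enough, judged either by low similarity or by a sufficiently large contrast difference. The similarity metric (normalized cross-correlation) and the use of the standard deviation as a contrast measure are assumptions.

```python
import numpy as np


def satisfies_second_condition(image_1: np.ndarray, image_2: np.ndarray,
                               similarity_threshold: float,
                               contrast_diff_threshold: float) -> bool:
    """True when the first and second images differ enough for a reliable DFD result."""
    a = image_1 - image_1.mean()
    b = image_2 - image_2.mean()
    denominator = np.sqrt((a * a).sum() * (b * b).sum())
    similarity = float((a * b).sum() / denominator) if denominator > 0 else 1.0

    contrast_difference = abs(float(image_1.std()) - float(image_2.std()))

    return similarity < similarity_threshold or contrast_difference > contrast_diff_threshold
```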
  • When the first degree of difference does not satisfy the second condition, the imaging control unit 110 may perform focus control of focusing on the object through contrast AF.
  • Alternatively, when the first degree of difference does not satisfy the second condition, the imaging control unit 110 may further change the positional relationship between the imaging surface and the focus lens from the second positional relationship to the third positional relationship. When the second degree of difference, which indicates the degree of difference between the third image captured by the imaging device 100 with the imaging surface and the focus lens in the third positional relationship and the first image or the second image, satisfies the second condition, the imaging control unit 110 may perform focus control of focusing on the object based on the third target positional relationship between the imaging surface and the focus lens determined from the respective blur amounts of the first image or the second image and the third image. When the second degree of difference does not satisfy the second condition, the imaging control unit 110 may perform focus control of focusing on the object through contrast AF.
  • FIG. 7 is a flowchart showing one example of the AF processing procedure of the imaging device 100.
  • the imaging control unit 110 causes the imaging device 100 to capture the first image for DFD when the focus lens is at the first position, and acquire the first image for DFD (S100). Then, the imaging control unit 110 determines whether the reliability of the subject distance measured by the TOF sensor 160 is high (S102). If the amount of light indicated by the signal output from the light receiving element 165 is within the preset range, the imaging control section 110 can determine that the reliability of the subject distance measured by the TOF sensor 160 is high.
  • When it is determined that the reliability is high, the imaging control unit 110 calculates, based on the TOF result showing the subject distance measured by the TOF sensor 160, the focus position, that is, the position of the focus lens that focuses on the subject (S104).
  • the imaging control unit 110 can calculate the focus position by referring to the data showing the relationship between the TOF calculation result and the focus lens position as shown in FIG. 8, for example. Then, the imaging control unit 110 drives the focus lens to the focus position of the TOF method (S106). In a case where the reliability of the TOF result is high, as shown in FIG. 9, the imaging control unit 110 drives the focus lens from the AF start position to the focus position.
  • When it is determined that the reliability is not high, the imaging control unit 110 drives the focus lens by one depth or less in the focus direction, that is, the direction toward the focus position calculated from the TOF result (S108).
  • the imaging control unit 110 moves the focus lens from the first position to the second position.
  • One depth corresponds to the minimum moving distance of the focus lens for which the point spread function (Point Spread Function) can be used when performing the DFD calculation. For example, one depth may be 0.003 mm.
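  • An illustrative computation (names assumed) of the second lens position reached in step S108: step from the first position toward the focus position calculated from the TOF result, by at most one depth.

```python
def second_lens_position(first_position: float, tof_focus_position: float,
                         depth: float = 0.003) -> float:
    """Move the focus lens toward the TOF focus position by at most one depth (here in mm)."""
    offset = tof_focus_position - first_position
    if abs(offset) <= depth:
        return tof_focus_position          # already within one depth of the target
    return first_position + (depth if offset > 0 else -depth)
```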
  • the imaging control unit 110 causes the imaging device 100 to capture the second image for DFD when the focus lens is at the second position, and acquire the second image for DFD (S110).
  • the imaging control unit 110 performs DFD calculation using the first image and the second image to calculate the focus position (S112).
  • the imaging control unit 110 determines whether the reliability of the focus position derived from the DFD calculation using the first image and the second image is high (S114). When the similarity between the first image and the second image is below the preset threshold, the imaging control unit 110 may determine that the reliability of the focus position derived by the DFD calculation is high. When the difference between the blur amount of the first image and the blur amount of the second image is greater than or equal to the preset threshold, the imaging control unit 110 may determine that the reliability of the focus position derived by the DFD calculation is high.
  • When it is determined that the reliability of the focus position derived by the DFD calculation is high, the imaging control unit 110 drives the focus lens to the focus position of the DFD method (S116).
  • the imaging control unit 110 drives the focus lens from the AF start position (first position) to the second position. Then, the imaging control unit 110 drives the focus lens to the focus position calculated by the DFD method.
  • When it is determined that the reliability of the focus position derived by the DFD calculation is low, the imaging control unit 110 further drives the focus lens in the focus direction (S118).
  • In this case, the imaging control unit 110 may drive the focus lens by one depth or less.
  • the imaging control unit 110 may drive the focus lens to a boundary position in the contrast AF search range determined by the focus position calculated by the TOF method or the DFD method to perform contrast AF.
  • the imaging control unit 110 drives the focus lens from the second position to the third position. Then, the imaging control unit 110 causes the imaging device 100 to capture a third image for DFD when the focus lens is at the third position, and acquire the third image for DFD (S120).
  • the imaging control unit 110 determines whether the reliability of the focus position derived from the DFD calculation using the second image and the third image is high (S124). The imaging control unit 110 may also determine whether the reliability of the focus position derived from the DFD calculation using the first image and the third image is high.
  • When it is determined that the reliability is high, the imaging control unit 110 drives the focus lens to the focus position of the DFD method (S116). As shown in FIG. 11, the imaging control unit 110 drives the focus lens from the AF start position (first position) to the second position and further to the third position, and then drives it to the focus position derived from the DFD calculation using the second image and the third image.
  • When it is determined that the reliability is low, the imaging control unit 110 executes contrast AF and acquires the contrast evaluation value of the third image (S126).
  • The imaging control unit 110 drives the focus lens in minute steps while searching for the focus lens position at which the contrast evaluation value reaches its peak (S128).
  • The imaging control unit 110 determines whether the focus lens position at which the contrast evaluation value reaches its peak has been found (S130). If the peak has not been found, the imaging control unit 110 repeats the steps from step S126. If the peak has been found, the imaging control unit 110 drives the focus lens to the focus position determined by the contrast method (S132). As shown in FIG. 12, when the reliability of the focus positions of the TOF method and the DFD method is low, the imaging control unit 110 drives the focus lens in a hill-climbing manner to search for the peak of the contrast evaluation value. The imaging control unit 110 drives the focus lens past the peak of the contrast evaluation value and then drives it in the reverse direction to the focus position of the contrast method.
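  • A rough, assumed sketch of the hill-climbing search of steps S126 to S132: step the focus lens, track the contrast evaluation value, stop once the value has passed its peak, and return the position that produced the peak.

```python
def contrast_af_search(capture_evaluation, positions):
    """Hill-climbing search for the lens position whose contrast evaluation value peaks.

    `capture_evaluation(pos)` drives the focus lens to `pos`, captures an image and
    returns its contrast evaluation value; `positions` is the ordered sequence of
    candidate lens positions (the contrast AF search range).
    """
    best_position = None
    best_value = float("-inf")
    for position in positions:
        value = capture_evaluation(position)
        if value > best_value:
            best_position, best_value = position, value
        else:
            break  # the evaluation value has passed its peak; stop searching
    return best_position  # drive the focus lens back to this focus position
```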
  • As described above, when the reliability of the subject distance calculated in the TOF method is high, the imaging device 100 performs AF in the TOF method. When the reliability of the subject distance calculated in the TOF method is not high, the imaging device 100 performs AF in the DFD method. Further, when the reliability of the results calculated in the TOF method and the DFD method is not high, AF is performed in the contrast method. As a result, the imaging device 100 can preferentially execute the AF method that performs AF while moving the focus lens only as much as necessary.
  • the aforementioned imaging device 100 may be mounted on a mobile body.
  • the camera device 100 may be mounted on an unmanned aerial vehicle (UAV) as shown in FIG. 13.
  • UAV 1000 may include a UAV main body 20, a universal joint 50, a plurality of camera devices 60, and a camera device 100.
  • the universal joint 50 and the camera device 100 are an example of a camera system.
  • UAV1000 is an example of a moving body propelled by a propulsion unit.
  • The concept of a mobile body includes, in addition to UAVs, flying objects such as airplanes moving in the air, vehicles moving on the ground, and ships moving on water.
  • the UAV main body 20 includes a plurality of rotors. Multiple rotors are an example of a propulsion section.
  • the UAV main body 20 makes the UAV 1000 fly by controlling the rotation of a plurality of rotors.
  • the UAV body 20 uses, for example, four rotors to make the UAV 1000 fly.
  • the number of rotors is not limited to four.
  • UAV1000 can also be a fixed-wing aircraft without rotors.
  • the imaging device 100 is an imaging camera that captures a subject included in a desired imaging range.
  • the universal joint 50 rotatably supports the imaging device 100.
  • the universal joint 50 is an example of a supporting mechanism.
  • The gimbal 50 uses an actuator to rotatably support the imaging device 100 around the pitch axis.
  • the universal joint 50 uses an actuator to further rotatably support the imaging device 100 around the roll axis and the yaw axis, respectively.
  • the gimbal 50 can change the posture of the imaging device 100 by rotating the imaging device 100 around at least one of the yaw axis, the pitch axis, and the roll axis.
  • the plurality of imaging devices 60 are sensing cameras that photograph the surroundings of the UAV 1000 in order to control the flight of the UAV 1000.
  • the two camera devices 60 can be installed on the nose of the UAV1000, that is, on the front side.
  • the other two camera devices 60 may be installed on the bottom surface of the UAV1000.
  • the two camera devices 60 on the front side may be paired to function as a so-called stereo camera.
  • the two imaging devices 60 on the bottom side may also be paired to function as a stereo camera.
  • the three-dimensional spatial data around the UAV 1000 can be generated based on the images captured by the plurality of imaging devices 60.
  • the number of imaging devices 60 included in the UAV 1000 is not limited to four. It is sufficient that the UAV1000 includes at least one camera device 60.
  • the UAV1000 may also include at least one camera 60 on the nose, tail, side, bottom and top surfaces of the UAV1000.
  • the viewing angle that can be set in the imaging device 60 may be larger than the viewing angle that can be set in the imaging device 100.
  • the imaging device 60 may have a single focus lens or a fisheye lens.
  • the remote operation device 600 communicates with the UAV1000 to perform remote operation on the UAV1000.
  • the remote operation device 600 can wirelessly communicate with the UAV1000.
  • the remote operation device 600 transmits to the UAV 1000 instruction information indicating various commands related to the movement of the UAV 1000 such as ascending, descending, accelerating, decelerating, forwarding, retreating, and rotating.
  • the instruction information includes, for example, instruction information for raising the height of the UAV 1000.
  • the indication information may show the height at which the UAV1000 should be located.
  • the UAV 1000 moves to be at the height indicated by the instruction information received from the remote operation device 600.
  • the instruction information may include an ascending instruction to raise the UAV1000.
  • UAV1000 rises while receiving the rise command. When the height of the UAV1000 has reached the upper limit height, even if the ascent command is accepted, the UAV1000 can be restricted from rising.
  • FIG. 14 shows an example of a computer 1200 that can fully or partially embody various aspects of the present invention.
  • the program installed on the computer 1200 can make the computer 1200 function as an operation associated with the device according to the embodiment of the present invention or one or more "parts" of the device. Alternatively, the program can cause the computer 1200 to perform the operation or the one or more "parts".
  • This program enables the computer 1200 to execute the process or stages of the process involved in the embodiment of the present invention.
  • Such a program may be executed by the CPU 1212, so that the computer 1200 executes specified operations associated with some or all blocks in the flowcharts and block diagrams described in this specification.
  • the computer 1200 of this embodiment includes a CPU 1212 and a RAM 1214, which are connected to each other through a host controller 1210.
  • The computer 1200 further includes a communication interface 1222 and an input/output unit, which are connected to the host controller 1210 through an input/output controller 1220.
  • the computer 1200 also includes a ROM 1230.
  • the CPU 1212 operates in accordance with programs stored in the ROM 1230 and RAM 1214 to control each unit.
  • the communication interface 1222 communicates with other electronic devices through a network.
  • the hard disk drive can store programs and data used by the CPU 1212 in the computer 1200.
  • the ROM 1230 stores therein a boot program executed by the computer 1200 during operation, and/or a program that depends on the hardware of the computer 1200.
  • The program is provided via a computer-readable recording medium such as a CD-ROM, USB memory, or IC card, or via a network.
  • the program is installed in RAM 1214 or ROM 1230 which is also an example of a computer-readable recording medium, and is executed by CPU 1212.
  • the information processing described in these programs is read by the computer 1200 and causes cooperation between the programs and the various types of hardware resources described above.
  • the apparatus or method may be constituted by implementing operations or processing of information according to the use of the computer 1200.
  • the CPU 1212 can execute a communication program loaded in the RAM 1214, and based on the processing described in the communication program, instruct the communication interface 1222 to perform communication processing.
  • the communication interface 1222, under the control of the CPU 1212, reads transmission data stored in a transmission buffer provided in a recording medium such as the RAM 1214 or a USB memory and sends the read transmission data to the network, or writes data received from the network into a reception buffer provided in the recording medium.
  • the CPU 1212 can make the RAM 1214 read all or necessary parts of files or databases stored in an external recording medium such as a USB memory, and perform various types of processing on the data on the RAM 1214. Then, the CPU 1212 can write the processed data back to the external recording medium.
  • the CPU 1212 can perform various types of processing specified by the instruction sequences of the programs described in various places in this disclosure, including various types of operations, information processing, conditional judgment, conditional branching, unconditional branching, and information retrieval/replacement, and write the results back to the RAM 1214.
  • the CPU 1212 can search for information in files, databases, and the like in the recording medium. For example, when a plurality of entries each having an attribute value of a first attribute associated with an attribute value of a second attribute are stored in the recording medium, the CPU 1212 can retrieve, from the plurality of entries, an entry whose attribute value of the first attribute matches a specified condition, read the attribute value of the second attribute stored in that entry, and thereby obtain the attribute value of the second attribute associated with the first attribute that meets the preset condition.
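For illustration only, the entry lookup described above can be sketched as follows in Python; the entry layout, attribute names, and condition are hypothetical examples, not structures defined in this disclosure.

```python
# Minimal sketch of the first-attribute / second-attribute lookup described above.
def find_second_attribute(entries, condition):
    """Return the second-attribute value of the first entry whose
    first-attribute value satisfies `condition`, or None if no entry matches."""
    for entry in entries:
        if condition(entry["first_attribute"]):
            return entry["second_attribute"]
    return None

# Usage example with hypothetical entries.
entries = [
    {"first_attribute": 1.0, "second_attribute": "value_a"},
    {"first_attribute": 2.5, "second_attribute": "value_b"},
]
print(find_second_attribute(entries, lambda v: v > 2.0))  # -> "value_b"
```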
  • the programs or software modules described above may be stored on the computer 1200 or on a computer-readable storage medium near the computer 1200.
  • a recording medium such as a hard disk or RAM provided in a server system connected to a dedicated communication network or the Internet can be used as a computer-readable storage medium so that the program can be provided to the computer 1200 via the network.

Abstract

A distance measurement sensor that measures the distance to an object from the reflected light of an optical pulse sometimes cannot precisely measure that distance because it also receives external light, such as sunlight, other than the reflected light of the optical pulse. A control device comprises a circuit. The circuit is configured to: when a signal satisfies a first condition indicating the reliability of the measured distance, perform focus control for focusing on an object on the basis of a first target positional relationship, determined according to the distance, between a photographing surface of a photographing device and a focusing lens; and when the signal does not satisfy the first condition, perform focus control on the basis of a second target positional relationship between the photographing surface and the focusing lens, wherein the second target positional relationship is determined according to the blur amount of a first image captured by the photographing device when the positional relationship between the photographing surface and the focusing lens is a first positional relationship and the blur amount of a second image captured by the photographing device when the positional relationship between the photographing surface and the focusing lens is a second positional relationship.

Description

Control device, imaging system, control method, and program
[Technical Field]
The present invention relates to a control device, an imaging system, a control method, and a program.
[Background Art]
Patent Document 1 describes measuring the distance to a target subject based on the reflected light of a light pulse.
[Prior Art Literature]
[Patent Literature]
[Patent Document 1] Japanese Patent Application Laid-Open No. 2006-79074
[Summary of the Invention]
[Technical Problem to Be Solved by the Invention]
A distance measuring sensor that measures the distance to an object based on the reflected light of a light pulse sometimes cannot accurately measure the distance to the object because it receives external light, such as sunlight, other than the reflected light of the light pulse.
[Technical Means for Solving the Problem]
A control device according to one aspect of the present invention may be a control device that controls an imaging device including a distance measuring sensor and a focusing lens. The distance measuring sensor includes a light emitting element that emits pulsed light and a light receiving element that receives light including reflected light of the pulsed light from an object and outputs a signal corresponding to the amount of received light, and measures the distance to the object based on the signal. The control device may include a circuit configured to, when the signal satisfies a first condition indicating the reliability of the distance, perform focus control for focusing on the object based on a first target positional relationship between an imaging surface of the imaging device and the focusing lens determined according to the distance. The circuit may be configured to, when the signal does not satisfy the first condition, perform focus control based on a second target positional relationship between the imaging surface and the focusing lens determined according to the respective blur amounts of a first image captured by the imaging device when the positional relationship between the imaging surface and the focusing lens is a first positional relationship and a second image captured by the imaging device when the positional relationship is a second positional relationship.
The circuit may be configured to perform focus control according to the second target positional relationship when a first degree of difference, which indicates the degree of difference between the first image and the second image, satisfies a second condition indicating the reliability of the second target positional relationship.
The circuit may be configured to perform focus control by contrast AF when the first degree of difference does not satisfy the second condition.
The circuit may be configured to: when the first degree of difference does not satisfy the second condition, change the positional relationship between the imaging surface and the focusing lens from the second positional relationship to a third positional relationship; when a second degree of difference, which indicates the degree of difference between a third image captured by the imaging device when the positional relationship between the imaging surface and the focusing lens is the third positional relationship and the first image or the second image, satisfies the second condition, perform focus control based on a third target positional relationship between the imaging surface and the focusing lens determined according to the respective blur amounts of the first image or the second image and the third image; and when the second degree of difference does not satisfy the second condition, perform focus control by contrast AF.
The signal may satisfy the first condition when the signal indicates that the amount of light is within a preset range.
The circuit may be configured to: when the signal does not satisfy the first condition, acquire the first image captured by the imaging device when the positional relationship between the imaging surface and the focusing lens is the first positional relationship, then change the positional relationship between the imaging surface and the focusing lens to the second positional relationship, and acquire the second image captured by the imaging device when the positional relationship between the imaging surface and the focusing lens is the second positional relationship.
The circuit may be configured to: when the signal does not satisfy the first condition, acquire the first image captured by the imaging device when the positional relationship between the imaging surface and the focusing lens is the first positional relationship, and then move the focusing lens by a preset distance in a direction determined according to the first target positional relationship, thereby moving the positional relationship between the imaging surface and the focusing lens to the second positional relationship.
An imaging system according to one aspect of the present invention may include the above-described control device, the distance measuring sensor, and the imaging device.
A control method according to one aspect of the present invention may be a control method for controlling an imaging device including a distance measuring sensor and a focusing lens. The distance measuring sensor includes a light emitting element that emits pulsed light and a light receiving element that receives light including reflected light of the pulsed light from an object and outputs a signal corresponding to the amount of received light, and measures the distance to the object based on the signal. The control method may include: when the signal satisfies a first condition indicating the reliability of the distance, performing focus control for focusing on the object based on a first target positional relationship between an imaging surface of the imaging device and the focusing lens determined according to the distance. The control method may include: when the signal does not satisfy the first condition, performing focus control based on a second target positional relationship between the imaging surface and the focusing lens determined according to the respective blur amounts of a first image captured by the imaging device when the positional relationship between the imaging surface and the focusing lens is a first positional relationship and a second image captured when the positional relationship is a second positional relationship.
A program according to one aspect of the present invention may be a program for causing a computer to function as the above-described control device.
According to one aspect of the present invention, meaningless driving of the focusing lens can be avoided, and focus control can be performed in a better manner.
In addition, the above summary of the invention does not enumerate all the essential features of the present invention. Furthermore, sub-combinations of these feature groups may also constitute inventions.
[Brief Description of the Drawings]
Fig. 1 is an example of an external perspective view of an imaging system.
Fig. 2 is an example of an external perspective view showing another form of the imaging system.
Fig. 3 is a diagram showing functional blocks of the imaging system.
Fig. 4 is a diagram showing an example of a curve representing the relationship between blur amount and lens position.
Fig. 5 is a diagram showing an example of the process of calculating the distance to an object based on blur amounts.
Fig. 6 is a diagram for explaining the relationship between object position, lens position, and focal length.
Fig. 7 is a flowchart showing an example of the AF processing procedure of the imaging device.
Fig. 8 is a diagram showing the relationship between the TOF calculation result and the focusing lens position.
Fig. 9 is a diagram showing an example of the movement of the focusing lens when AF is performed.
Fig. 10 is a diagram showing an example of the movement of the focusing lens when AF is performed.
Fig. 11 is a diagram showing an example of the movement of the focusing lens when AF is performed.
Fig. 12 is a diagram showing an example of the movement of the focusing lens when AF is performed.
Fig. 13 is a diagram showing an example of the appearance of an unmanned aerial vehicle and a remote operation device.
Fig. 14 is a diagram showing an example of the hardware configuration.
[Detailed Description]
Hereinafter, the present invention will be described through embodiments of the invention, but the following embodiments do not limit the invention according to the claims. In addition, not all combinations of features described in the embodiments are necessarily essential to the solution of the invention. It is obvious to a person skilled in the art that various changes or improvements can be made to the following embodiments. It is obvious from the description of the claims that modes to which such changes or improvements are made can also be included in the technical scope of the present invention.
The claims, the description, the drawings, and the abstract include matters subject to copyright protection. The copyright owner will not object to the reproduction of these documents by anyone as they appear in the files or records of the patent office. However, in all other cases, all copyrights are reserved.
Various embodiments of the present invention may be described with reference to flowcharts and block diagrams. Here, a block may represent (1) a stage of a process in which an operation is performed or (2) a "part" of a device that has the role of performing an operation. Specific stages and "parts" may be implemented by programmable circuits and/or processors. Dedicated circuits may include digital and/or analog hardware circuits, and may include integrated circuits (ICs) and/or discrete circuits. Programmable circuits may include reconfigurable hardware circuits. Reconfigurable hardware circuits may include logical AND, logical OR, logical XOR, logical NAND, logical NOR, and other logical operations, as well as memory elements such as flip-flops, registers, field programmable gate arrays (FPGA), and programmable logic arrays (PLA).
A computer-readable medium may include any tangible device that can store instructions to be executed by a suitable device. As a result, a computer-readable medium having instructions stored thereon constitutes a product including instructions that can be executed to create means for performing the operations specified by the flowcharts or block diagrams. Examples of computer-readable media may include electronic storage media, magnetic storage media, optical storage media, electromagnetic storage media, semiconductor storage media, and the like. More specific examples of computer-readable media may include floppy (registered trademark) disks, diskettes, hard disks, random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), electrically erasable programmable read-only memory (EEPROM), static random access memory (SRAM), compact disc read-only memory (CD-ROM), digital versatile disc (DVD), Blu-ray (registered trademark) discs, memory sticks, integrated circuit cards, and the like.
Computer-readable instructions may include either source code or object code described in any combination of one or more programming languages. The source code or object code includes traditional procedural programming languages. Traditional procedural programming languages may be assembly instructions, instruction set architecture (ISA) instructions, machine instructions, machine-dependent instructions, microcode, firmware instructions, state setting data, or object-oriented programming languages such as Smalltalk (registered trademark), JAVA (registered trademark), and C++, as well as the "C" programming language or similar programming languages. Computer-readable instructions may be provided, locally or via a wide area network (WAN) such as a local area network (LAN) or the Internet, to a processor or programmable circuit of a general-purpose computer, special-purpose computer, or other programmable data processing device. The processor or programmable circuit may execute the computer-readable instructions to create means for performing the operations specified in the flowcharts or block diagrams. Examples of processors include computer processors, processing units, microprocessors, digital signal processors, controllers, microcontrollers, and the like.
Fig. 1 is an example of an external perspective view of the imaging system 10 according to this embodiment. The imaging system 10 includes an imaging device 100, a support mechanism 200, and a grip 300. The imaging device 100 includes a TOF sensor 160. The support mechanism 200 uses actuators to rotatably support the imaging device 100 about each of a roll axis, a pitch axis, and a yaw axis. The support mechanism 200 can change or maintain the posture of the imaging device 100 by rotating the imaging device 100 about at least one of the roll axis, the pitch axis, and the yaw axis. The support mechanism 200 includes a roll axis drive mechanism 201, a pitch axis drive mechanism 202, and a yaw axis drive mechanism 203. The support mechanism 200 further includes a base 204 to which the yaw axis drive mechanism 203 is fixed. The grip 300 is fixed to the base 204. The grip 300 includes an operation interface 301 and a display unit 302. The imaging device 100 is fixed to the pitch axis drive mechanism 202.
The operation interface 301 accepts instructions for operating the imaging device 100 and the support mechanism 200 from the user. The operation interface 301 may include a shutter/record button for instructing the imaging device 100 to shoot or record. The operation interface 301 may include a power/function button for turning the power of the imaging device 100 on or off and for switching between the still image shooting mode and the moving image shooting mode of the imaging device 100.
The display unit 302 can display an image captured by the imaging device 100. The display unit 302 can display a menu screen for operating the imaging device 100 and the support mechanism 200. The display unit 302 may be a touch screen display that accepts instructions for operating the imaging device 100 and the support mechanism 200.
Fig. 2 is an example of an external perspective view showing another form of the imaging system 10. As shown in Fig. 2, the imaging system 10 can be used in a state where a mobile terminal including a display, such as a smartphone 400, is fixed to the side of the grip 300. The user holds the grip 300 and captures still images or moving images with the imaging device 100. The display of the smartphone 400 or the like displays the still images or moving images of the imaging device 100.
Fig. 3 is a diagram showing the functional blocks of the imaging system 10. The imaging device 100 includes an imaging control unit 110, an image sensor 120, a memory 130, a lens control unit 150, a lens driving unit 152, a plurality of lenses 154, and a TOF sensor 160.
The image sensor 120 may be composed of a CCD or CMOS. The image sensor 120 is an example of an image sensor for imaging. The image sensor 120 outputs image data of the optical image formed through the plurality of lenses 154 to the imaging control unit 110. The imaging control unit 110 may be composed of a microprocessor such as a CPU or MPU, a microcontroller such as an MCU, or the like.
The imaging control unit 110 follows operation instructions for the imaging device 100 from the grip 300, performs demosaicing processing on the image signal output from the image sensor 120, and generates image data. The imaging control unit 110 stores the image data in the memory 130. The imaging control unit 110 controls the TOF sensor 160. The imaging control unit 110 is an example of a circuit. The TOF sensor 160 is a time-of-flight sensor that measures the distance to an object. The imaging device 100 performs focus control by adjusting the position of the focusing lens based on the distance measured by the TOF sensor 160.
The memory 130 may be a computer-readable storage medium and may include at least one of flash memories such as SRAM, DRAM, EPROM, EEPROM, and USB memory. The memory 130 stores programs and the like necessary for the imaging control unit 110 to control the image sensor 120 and the like. The memory 130 may be provided inside the housing of the imaging device 100. The grip 300 may include another memory for storing image data captured by the imaging device 100. The grip 300 may have a slot from which that memory can be removed from the housing of the grip 300.
The plurality of lenses 154 may function as a zoom lens, a varifocal lens, and a focusing lens. At least some or all of the plurality of lenses 154 are configured to be movable along the optical axis. The lens control unit 150 drives the lens driving unit 152 in accordance with lens control commands from the imaging control unit 110 to move one or more of the lenses 154 in the optical axis direction. The lens control commands are, for example, zoom control commands and focus control commands. The lens driving unit 152 may include a voice coil motor (VCM) that moves at least some or all of the plurality of lenses 154 in the optical axis direction. The lens driving unit 152 may include a motor such as a DC motor, a coreless motor, or an ultrasonic motor. The lens driving unit 152 may transmit power from the motor to at least some or all of the plurality of lenses 154 via mechanism components such as a cam ring and a guide shaft to move at least some or all of the plurality of lenses 154 along the optical axis.
The imaging device 100 further includes a posture control unit 210, an angular velocity sensor 212, and an acceleration sensor 214. The angular velocity sensor 212 detects the angular velocity of the imaging device 100. The angular velocity sensor 212 detects the respective angular velocities of the imaging device 100 about the roll axis, the pitch axis, and the yaw axis. The posture control unit 210 acquires angular velocity information related to the angular velocity of the imaging device 100 from the angular velocity sensor 212. The angular velocity information may indicate the respective angular velocities of the imaging device 100 about the roll axis, the pitch axis, and the yaw axis. The posture control unit 210 acquires acceleration information related to the acceleration of the imaging device 100 from the acceleration sensor 214. The acceleration information may also indicate the acceleration of the imaging device 100 in each of the roll axis, pitch axis, and yaw axis directions.
The angular velocity sensor 212 and the acceleration sensor 214 may be provided in the housing that accommodates the image sensor 120, the lenses 154, and the like. In this embodiment, a configuration in which the imaging device 100 and the support mechanism 200 are integrated is described. However, the support mechanism 200 may include a base on which the imaging device 100 is detachably fixed. In that case, the angular velocity sensor 212 and the acceleration sensor 214 may be provided outside the housing of the imaging device 100, for example on the base.
The posture control unit 210 controls the support mechanism 200 based on the angular velocity information and the acceleration information to maintain or change the posture of the imaging device 100. The posture control unit 210 controls the support mechanism 200 in accordance with the operation mode of the support mechanism 200 for controlling the posture of the imaging device 100, to maintain or change the posture of the imaging device 100.
The operation modes include the following modes: a mode in which at least one of the roll axis drive mechanism 201, the pitch axis drive mechanism 202, and the yaw axis drive mechanism 203 of the support mechanism 200 is operated so that the change in posture of the imaging device 100 follows the change in posture of the base 204 of the support mechanism 200; a mode in which each of the roll axis drive mechanism 201, the pitch axis drive mechanism 202, and the yaw axis drive mechanism 203 is operated so that the change in posture of the imaging device 100 follows the change in posture of the base 204; a mode in which each of the pitch axis drive mechanism 202 and the yaw axis drive mechanism 203 is operated so that the change in posture of the imaging device 100 follows the change in posture of the base 204; and a mode in which only the yaw axis drive mechanism 203 is operated so that the change in posture of the imaging device 100 follows the change in posture of the base 204.
The operation modes may include an FPV (First Person View) mode in which the support mechanism 200 is operated so that the change in posture of the imaging device 100 follows the change in posture of the base 204 of the support mechanism 200, and a fixed mode in which the support mechanism 200 is operated to maintain the posture of the imaging device 100.
The FPV mode is a mode in which at least one of the roll axis drive mechanism 201, the pitch axis drive mechanism 202, and the yaw axis drive mechanism 203 is operated so that the change in posture of the imaging device 100 follows the change in posture of the base 204 of the support mechanism 200. The fixed mode is a mode in which at least one of the roll axis drive mechanism 201, the pitch axis drive mechanism 202, and the yaw axis drive mechanism 203 is operated to maintain the current posture of the imaging device 100.
The TOF sensor 160 includes a light emitting unit 162, a light receiving unit 164, a light emission control unit 166, a light reception control unit 167, and a memory 168. The TOF sensor 160 is an example of a distance measuring sensor.
The light emitting unit 162 includes at least one light emitting element 163. The light emitting element 163 is a device, such as an LED or a laser, that repeatedly emits high-speed-modulated pulsed light. The light emitting element 163 may emit pulsed light that is infrared light. The light emission control unit 166 controls the light emission of the light emitting element 163. The light emission control unit 166 can control the pulse width of the pulsed light emitted from the light emitting element 163.
The light receiving unit 164 includes a plurality of light receiving elements 165 that measure the distance to the subject associated with each of a plurality of regions. The light receiving unit 164 is an example of an image sensor for distance measurement. The plurality of light receiving elements 165 correspond respectively to the plurality of regions. The light receiving element 165 repeatedly receives reflected light of the pulsed light from the object. The light receiving element 165 receives light including the reflected light of the pulsed light from the object and outputs a signal corresponding to the amount of received light. The light reception control unit 167 controls the light reception of the light receiving element 165. The light reception control unit 167 measures the distance to the subject associated with each of the plurality of regions based on the signal output from the light receiving element 165. The light reception control unit 167 measures the distance to the subject associated with each of the plurality of regions based on the amount of reflected light repeatedly received by the light receiving element 165 during a preset light reception period. The light reception control unit 167 can measure the distance to the subject by determining the phase difference between the pulsed light and the reflected light based on the amount of reflected light repeatedly received by the light receiving element 165 during the preset light reception period. The light receiving unit 164 can also measure the distance to the subject by reading the frequency change of the reflected wave; this is called the FMCW (Frequency Modulated Continuous Wave) method.
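As a rough illustration of the phase-difference-based distance measurement mentioned above, the following is a minimal Python sketch assuming a common indirect (phase-shift) TOF scheme with four sampled light amounts and a known modulation frequency; the sampling scheme, variable names, and numeric values are illustrative assumptions, not taken from this disclosure.

```python
import math

C = 299_792_458.0  # speed of light in m/s

def tof_distance(q0, q90, q180, q270, f_mod):
    """Estimate the distance from four phase-sampled light amounts:
    phase difference between emitted and reflected light -> distance."""
    phase = math.atan2(q90 - q270, q0 - q180)   # phase difference of the reflection
    phase %= 2 * math.pi                        # wrap into [0, 2*pi)
    return C * phase / (4 * math.pi * f_mod)    # extra factor 2 accounts for the round trip

# Usage example with a 10 MHz modulation frequency (illustrative values).
print(tof_distance(1.0, 0.8, 0.2, 0.4, 10e6))  # roughly 1.1 m
```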
The memory 168 may be a computer-readable recording medium and may include at least one of SRAM, DRAM, EPROM, and EEPROM. The memory 168 stores programs necessary for the light emission control unit 166 to control the light emitting unit 162, programs necessary for the light reception control unit 167 to control the light receiving unit 164, and the like.
The auto focus (AF) methods performed by the imaging device 100 will now be described. The imaging device 100 can control the positional relationship between the focusing lens and the imaging surface of the image sensor 120 by moving the focusing lens according to the distance from the imaging device 100 to the subject (the subject distance) measured by the TOF sensor 160.
The imaging control unit 110 can perform contrast AF by deriving contrast evaluation values of images captured by the imaging device 100 while moving the focusing lens, and determining the focusing lens position at which the contrast evaluation value peaks. The imaging control unit 110 can derive the contrast evaluation value of an image by applying a contrast evaluation filter to the image captured by the imaging device 100. The imaging control unit 110 can perform contrast AF by determining, from the contrast evaluation values, the focusing lens position that brings the designated subject into focus.
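The contrast evaluation filter itself is not specified here, so the following Python sketch uses the variance of a Laplacian-filtered image as an illustrative stand-in and simply sweeps the lens positions to find the peak; capture_at is a hypothetical callback that captures an image with the focusing lens at the given position.

```python
import numpy as np
from scipy.ndimage import laplace

def contrast_score(image):
    """Illustrative contrast evaluation value: variance of the Laplacian."""
    return float(laplace(image.astype(np.float64)).var())

def contrast_af(capture_at, lens_positions):
    """Sweep the focusing lens, score each captured image, and return the
    lens position at which the contrast evaluation value peaks."""
    scores = [(pos, contrast_score(capture_at(pos))) for pos in lens_positions]
    return max(scores, key=lambda item: item[1])[0]
```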
As another AF method, there is a method of determining the focus position, while moving the focusing lens, based on the blur amounts of a plurality of images captured in states where the positional relationship between the focusing lens and the imaging surface of the image sensor 120 differs. In this document, AF using this method is referred to as the Bokeh Detection Auto Focus (BDAF) method. Specifically, in BDAF, a DFD (Depth From Defocus) calculation is performed to carry out AF.
For example, the blur amount (Cost) of an image can be expressed by the following equation (1) using a Gaussian function. In equation (1), x represents the pixel position in the horizontal direction, and σ represents the standard deviation value.
[Formula 1]
(Equation (1), original image PCTCN2020106579-appb-000001: the blur amount Cost expressed as a Gaussian function of the horizontal pixel position x with standard deviation σ.)
Fig. 4 shows an example of a curve representing the relationship between the blur amount (Cost) of an image and the focusing lens position. C1 is the blur amount of the image acquired when the focusing lens is at x1, and C2 is the blur amount of the image acquired when the focusing lens is at x2. The subject can be brought into focus by moving the focusing lens to the lens position x0 corresponding to the minimum point 502 of the curve 500 determined from the blur amounts C1 and C2 in consideration of the optical characteristics of the lenses 154.
Fig. 5 is a flowchart showing an example of the distance calculation process in the BDAF method. The imaging control unit 110 captures a first image with the lens 154 and the imaging surface of the image sensor 120 in a first positional relationship and stores it in the memory 130. The imaging control unit 110 then moves the lens 154 in the optical axis direction so that the lens 154 and the imaging surface are in a second positional relationship, captures a second image with the imaging device 100, and stores it in the memory 130 (S201). For example, the imaging control unit 110 changes the positional relationship between the lens 154 and the imaging surface from the first positional relationship to the second positional relationship by moving the focusing lens in the optical axis direction. The amount of lens movement may be, for example, about 10 μm.
Next, the imaging control unit 110 divides the first image into a plurality of regions (S202). The imaging control unit 110 may calculate a feature amount for each pixel in the first image and divide the first image into a plurality of regions by treating groups of pixels with similar feature amounts as one region. The imaging control unit 110 may also divide the group of pixels set as the range of the AF processing frame in the first image into a plurality of regions. The imaging control unit 110 divides the second image into a plurality of regions corresponding to the plurality of regions of the first image. The imaging control unit 110 calculates the distance to the subject corresponding to the object contained in each region based on the blur amount of each of the plurality of regions of the first image and the blur amount of each of the plurality of regions of the second image (S203).
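The per-region blur measure and the region layout are not specified in this excerpt; the sketch below assumes a fixed grid of regions and uses inverse high-frequency energy as an illustrative blur proxy for steps S202-S203.

```python
import numpy as np
from scipy.ndimage import laplace

def blur_amount(region):
    """Illustrative per-region blur measure: the inverse of high-frequency
    energy (a sharper region yields a smaller blur amount)."""
    energy = float(np.mean(np.abs(laplace(region.astype(np.float64)))))
    return 1.0 / (energy + 1e-9)

def region_blur_maps(img1, img2, grid=(8, 8)):
    """Split both images into the same grid of regions (S202) and return the
    per-region blur amounts compared in the distance calculation (S203)."""
    h, w = img1.shape
    rows, cols = grid
    blur1, blur2 = np.zeros(grid), np.zeros(grid)
    for r in range(rows):
        for c in range(cols):
            ys = slice(r * h // rows, (r + 1) * h // rows)
            xs = slice(c * w // cols, (c + 1) * w // cols)
            blur1[r, c] = blur_amount(img1[ys, xs])
            blur2[r, c] = blur_amount(img2[ys, xs])
    return blur1, blur2
```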
Note that the method of changing the positional relationship between the lens 154 and the imaging surface of the image sensor 120 is not limited to moving the focusing lens included in the lens 154. For example, the imaging control unit 110 may move the entire lens 154 in the optical axis direction. The imaging control unit 110 may move the imaging surface of the image sensor 120 in the optical axis direction. The imaging control unit 110 may move both at least some of the lenses included in the lens 154 and the imaging surface of the image sensor 120 in the optical axis direction. The imaging control unit 110 may adopt any method that optically changes the relative positional relationship between the focal point of the lens 154 and the imaging surface of the image sensor 120.
The calculation process of the subject distance will be further explained with reference to Fig. 5. Let A be the distance from the principal point of the lens L to the subject 510 (object plane), let B be the distance from the principal point of the lens L to the position at which the light beam from the subject 510 forms an image (image plane), and let F be the focal length of the lens L. In this case, the relationship between the distance A, the distance B, and the focal length F can be expressed by the following equation (2) according to the lens formula.
[Formula 2]
1/A + 1/B = 1/F    …(2)
The focal length F is determined by the positions of the lenses included in the lens L. Therefore, if the distance B at which the light beam from the subject 510 forms an image can be determined, the distance A from the principal point of the lens L to the subject 510 can be determined using equation (2).
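Solving equation (2) for A gives the subject distance once B and F are known; a direct Python sketch:

```python
def subject_distance(image_distance_b, focal_length_f):
    """Solve the lens formula 1/A + 1/B = 1/F for the subject distance A."""
    return 1.0 / (1.0 / focal_length_f - 1.0 / image_distance_b)

# Usage example: a 50 mm focal length with the image formed 52 mm behind
# the principal point gives A = 1300 mm.
print(subject_distance(52.0, 50.0))  # -> 1300.0
```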
Here, assume that the positional relationship between the lens L and the imaging surface has been changed by moving the imaging surface of the image sensor toward the lens L. As shown in Fig. 6, if the imaging surface is located at a distance D1 or a distance D2 from the principal point of the lens L, the image of the subject 510 projected on the imaging surface is blurred. The distance B can be determined by calculating the position at which the subject 510 forms an image from the blur sizes (circles of confusion 512 and 514) of the image of the subject 510 projected on the imaging surface, and the distance A can then be determined. That is, since the size of the blur (the blur amount) is proportional to the distance between the imaging surface and the imaging position, the imaging position can be determined from the difference in blur amounts.
Here, the image I1 at the position at distance D1 from the imaging surface and the image I2 at the position at distance D2 from the imaging surface are each blurred. For the image I1, letting the point spread function be PSF1 and the subject image be Id1, the image I1 can be expressed by the following equation (3) as a convolution.
[Formula 3]
I1 = PSF1 * Id1    …(3)
The image I2 can likewise be expressed by a convolution with PSF2. Let f be the Fourier transform of the subject image, and let OTF1 and OTF2 be the optical transfer functions obtained by Fourier transforming the point spread functions PSF1 and PSF2; the ratio is obtained as in the following equation (4).
[Formula 4]
(Equation (4), original image PCTCN2020106579-appb-000003: the ratio of the Fourier transforms of the two images, in which the subject-image component f cancels, leaving the ratio of OTF1 and OTF2; this ratio is the C value referred to below.)
The C value shown in equation (4) is the amount of change between the respective blur amounts of the image at the position at distance D1 from the principal point of the lens L and the image at the position at distance D2 from the principal point of the lens L; that is, the C value corresponds to the difference between the blur amounts of those two images.
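A rough numerical sketch of obtaining such a C value from two differently blurred images follows; taking the ratio of Fourier magnitudes (so that the subject-image spectrum cancels) and averaging over a mid-frequency band are illustrative choices, not steps specified in this disclosure.

```python
import numpy as np

def dfd_c_value(img1, img2, eps=1e-6):
    """Illustrative C value: ratio of the Fourier magnitudes of two
    differently blurred images of the same scene, so that the subject
    spectrum cancels and the change in blur (OTF ratio) remains."""
    f1 = np.fft.fft2(img1.astype(np.float64))
    f2 = np.fft.fft2(img2.astype(np.float64))
    ratio = np.abs(f2) / (np.abs(f1) + eps)
    # Average over a mid-frequency band: the DC term mostly reflects brightness
    # and the highest frequencies are dominated by noise.
    h, w = ratio.shape
    band = ratio[1:h // 4, 1:w // 4]
    return float(band.mean())
```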
Fig. 6 describes the case where the positional relationship between the lens L and the imaging surface is changed by moving the imaging surface toward the lens L. Moving the focusing lens relative to the imaging surface also changes the positional relationship between the focal position of the lens L and the imaging surface, and the blur amount therefore differs as well. In this embodiment, images with different blur amounts are mainly acquired by moving the focusing lens relative to the imaging surface, a DFD calculation is performed based on the acquired images to obtain a DFD calculation value indicating the defocus amount, and a target value of the focusing lens position for focusing on the subject is calculated from the DFD calculation value as the target positional relationship between the imaging surface and the focusing lens.
As described above, the imaging device 100 performs AF by any one of AF based on distance measurement by the TOF sensor 160 (the TOF method), AF based on the DFD calculation (the DFD method), and AF based on the contrast evaluation value (the contrast method).
With the TOF method, the imaging device 100 can derive the target value of the focusing lens position for focusing on the subject from a single image. When the imaging device 100 captures a low-brightness, low-contrast image, the TOF method can derive the target value accurately. However, in an environment where sunlight easily enters, external light other than the reflected light of the pulsed light may enter the TOF sensor 160, so the target value may not be derived accurately.
With the DFD method, the imaging device 100 can derive the target value of the focusing lens position for focusing on the subject from two or more images captured at different focusing lens positions. However, when the imaging device 100 captures a low-contrast image such as a cloud, the difference between the blur amounts of the two or more images may be small, and the target value cannot be derived accurately.
With the contrast method, the imaging device 100 needs to capture a plurality of images to determine the focusing lens position at which the contrast evaluation value peaks, so deriving the target value takes a longer time. However, the method is highly resistant to disturbances such as mechanical changes caused by temperature and changes in optical components such as lenses, and the accuracy of the target value is also high.
As described above, the optimal AF method differs depending on the environment in which the subject captured by the imaging device 100 is located. Moreover, by performing AF while moving the focusing lens as little as possible, image blur and the like do not occur during the AF processing, which reduces the sense of incongruity given to the user. Therefore, according to this embodiment, the imaging device 100 preferentially performs the AF method that can perform AF without moving the focusing lens unnecessarily. More specifically, the imaging device 100 performs AF by the TOF method when the reliability of the subject distance calculated by the TOF method is high. The imaging device 100 performs AF by the DFD method when the reliability of the subject distance calculated by the TOF method is not high. Further, when the reliability of neither the TOF method result nor the DFD method result is high, AF is performed by the contrast method.
The imaging control unit 110 determines whether the signal corresponding to the amount of light output from the light receiving element 165 satisfies the first condition, which indicates the reliability of the subject distance. When the signal indicates that the amount of light is within a preset range, the imaging control unit 110 determines that the signal satisfies the first condition. When the signal continues to indicate that the amount of light is within the preset range for a preset period, the imaging control unit 110 may determine that the signal satisfies the first condition. When the amplitude of the light received by the light receiving element 165 indicated by the signal is at or above a lower-limit threshold and the A/D-converted value of the signal is at or below an upper-limit threshold, the imaging control unit 110 may determine that the signal satisfies the first condition. When the amplitude of the light received by the light receiving element 165 indicated by the signal is at or above a threshold and the A/D-converted value of the signal is not saturated, the imaging control unit 110 may determine that the signal satisfies the first condition. That is, the imaging control unit 110 determines that the signal satisfies the first condition when, relative to the pulsed light emitted from the light emitting element 163, the light received by the light receiving element 165 is neither too weak nor too strong.
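The first-condition check described above can be summarized in a short sketch; the threshold values below are illustrative assumptions (e.g. for a 12-bit A/D converter) and are not taken from this disclosure.

```python
def tof_signal_reliable(amplitude, adc_value, amp_min=50, adc_max=4000):
    """First condition: the received light is neither too weak (amplitude below
    a lower-limit threshold) nor too strong (A/D-converted value saturated)."""
    return amplitude >= amp_min and adc_value <= adc_max
```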
When the signal satisfies the first condition, the imaging control unit 110 performs focus control for focusing on the object based on the first target positional relationship between the imaging surface of the imaging device 100 and the focusing lens determined from the distance to the object measured by the TOF sensor 160. When the signal does not satisfy the first condition, the imaging control unit 110 performs focus control for focusing on the object based on the second target positional relationship between the imaging surface and the focusing lens determined from the respective blur amounts of the first image captured by the imaging device 100 when the positional relationship between the imaging surface and the focusing lens is the first positional relationship and the second image captured when the positional relationship is the second positional relationship. That is, when the signal does not satisfy the first condition, the imaging control unit 110 performs focus control by the DFD method.
When the signal does not satisfy the first condition, the imaging control unit 110 may acquire the first image captured by the imaging device 100 when the imaging surface and the focusing lens are in the first positional relationship, then change the positional relationship between the imaging surface and the focusing lens to the second positional relationship, and acquire the second image captured by the imaging device 100 when the positional relationship is the second positional relationship.
When the signal does not satisfy the first condition, the imaging control unit 110 may acquire the first image captured by the imaging device 100 when the imaging surface and the focusing lens are in the first positional relationship, and then move the positional relationship between the imaging surface and the focusing lens to the second positional relationship by moving the focusing lens by a preset distance in a direction determined according to the first target positional relationship. Even when the accuracy of the subject distance calculated by the TOF method is low, the imaging control unit 110 may still be able to correctly determine, from that subject distance, the direction in which the focusing lens should move. Therefore, when the signal does not satisfy the first condition, the imaging control unit 110 may determine the moving direction of the focusing lens according to the subject distance calculated by the TOF method.
Before performing focus control by the DFD method, the imaging control unit 110 may determine whether the first degree of difference, which indicates the degree of difference between the first image and the second image, satisfies the second condition, which indicates the reliability of the second target positional relationship. The imaging control unit 110 may determine whether the first degree of difference satisfies the second condition indicating the reliability of the DFD method based on the similarity between the first image and the second image. When the similarity between the first image and the second image is lower than a preset threshold, the imaging control unit 110 may determine that the first degree of difference satisfies the second condition. When the difference between the contrast of the first image and the contrast of the second image is greater than a preset threshold, the imaging control unit 110 may determine that the first degree of difference satisfies the second condition. When the difference between the blur amount of the first image and the blur amount of the second image is greater than a preset threshold, the imaging control unit 110 may determine that the first degree of difference satisfies the second condition. Then, when the first degree of difference satisfies the second condition, the imaging control unit 110 may perform focus control for focusing on the object according to the second target positional relationship derived by the DFD method.
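A minimal sketch of the second-condition check follows; the similarity metric (normalized cross-correlation), the sharpness proxy, and the threshold values are illustrative assumptions.

```python
import numpy as np
from scipy.ndimage import laplace

def dfd_reliable(img1, img2, sim_max=0.98, sharp_diff_min=0.05):
    """Second condition: the DFD result is treated as reliable when the two
    images differ enough, judged here by (a) a normalized cross-correlation
    below a threshold or (b) a sufficiently large relative difference in
    high-frequency (sharpness) energy."""
    a = img1.astype(np.float64)
    b = img2.astype(np.float64)
    an = (a - a.mean()) / (a.std() + 1e-9)
    bn = (b - b.mean()) / (b.std() + 1e-9)
    similarity = float(np.mean(an * bn))           # 1.0 means identical images
    sharp1 = float(np.mean(np.abs(laplace(a))))    # proxy for inverse blur amount
    sharp2 = float(np.mean(np.abs(laplace(b))))
    sharp_diff = abs(sharp1 - sharp2) / (max(sharp1, sharp2) + 1e-9)
    return similarity < sim_max or sharp_diff > sharp_diff_min
```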
When the first degree of difference does not satisfy the second condition, the imaging control unit 110 may execute focus control to focus on the object by contrast AF.
When the first degree of difference does not satisfy the second condition, the imaging control unit 110 may further change the positional relationship between the imaging surface and the focus lens from the second positional relationship to a third positional relationship. Then, when a second degree of difference, which indicates the degree of difference between the first image or the second image and a third image captured by the imaging device 100 while the imaging surface and the focus lens are in the third positional relationship, satisfies the second condition, the imaging control unit 110 may execute focus control to focus on the object based on a third target positional relationship between the imaging surface and the focus lens determined from the respective blur amounts of the first image or the second image and the third image. When the second degree of difference does not satisfy the second condition, the imaging control unit 110 may execute focus control to focus on the object by contrast AF.
FIG. 7 is a flowchart showing an example of the AF processing procedure of the imaging device 100. When the AF processing is started, the imaging control unit 110 causes the imaging device 100 to capture a first image for DFD while the focus lens is at a first position, and acquires the first image for DFD (S100). The imaging control unit 110 then determines whether the reliability of the subject distance measured by the TOF sensor 160 is high (S102). If the amount of light indicated by the signal output from the light receiving element 165 is within a preset range, the imaging control unit 110 may determine that the reliability of the subject distance measured by the TOF sensor 160 is high.
When the reliability of the subject distance measured by the TOF sensor 160 is high, the imaging control unit 110 calculates, from the TOF result indicating the subject distance measured by the TOF sensor 160, a focus position indicating the position of the focus lens at which the object is in focus (S104). The imaging control unit 110 can calculate the focus position by referring to, for example, data showing the relationship between the TOF calculation result and the focus lens position as shown in FIG. 8. The imaging control unit 110 then drives the focus lens to the focus position obtained by the TOF method (S106). When the reliability of the TOF result is high, the imaging control unit 110 drives the focus lens from the AF start position to the focus position, as shown in FIG. 9.
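The following sketch illustrates, under assumed values, how the reliability check of S102 and the lookup of S104 could be implemented. The light-amount range, the calibration table standing in for the data of FIG. 8, the linear interpolation, and the function names are all hypothetical.

```python
import bisect

# Hypothetical calibration relating TOF subject distance (m) to the in-focus
# lens position (mm); the actual data of FIG. 8 is not disclosed here.
TOF_DISTANCE_M = [0.5, 1.0, 2.0, 5.0, 10.0]
LENS_POSITION_MM = [4.80, 4.40, 4.15, 3.95, 3.90]

LIGHT_AMOUNT_RANGE = (200, 4000)  # illustrative preset range for the received light amount

def tof_result_is_reliable(light_amount):
    low, high = LIGHT_AMOUNT_RANGE
    return low <= light_amount <= high

def focus_position_from_tof(distance_m):
    """Linearly interpolate the in-focus lens position from the TOF distance."""
    if distance_m <= TOF_DISTANCE_M[0]:
        return LENS_POSITION_MM[0]
    if distance_m >= TOF_DISTANCE_M[-1]:
        return LENS_POSITION_MM[-1]
    i = bisect.bisect_left(TOF_DISTANCE_M, distance_m)
    d0, d1 = TOF_DISTANCE_M[i - 1], TOF_DISTANCE_M[i]
    p0, p1 = LENS_POSITION_MM[i - 1], LENS_POSITION_MM[i]
    t = (distance_m - d0) / (d1 - d0)
    return p0 + t * (p1 - p0)
```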
When the reliability of the TOF result is low, the imaging control unit 110 drives the focus lens by at most one depth in the focus direction, that is, the direction toward the focus position calculated from the TOF result (S108). The imaging control unit 110 thereby moves the focus lens from the first position to a second position. One depth corresponds to the minimum focus lens travel for which a point spread function can be applied in the DFD calculation. For example, one depth may be 0.003 mm. Because the focus lens is driven by no more than one depth, the change in image blur caused by driving the lens is hard for the user to notice. This prevents the blur of the image shown on the display from changing drastically as the lens is driven, which would otherwise feel unnatural to the user.
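A minimal sketch of the one-depth-limited lens step of S108 is given below; the helper name and the use of millimetres as the unit are assumptions, while the 0.003 mm value is the example given above.

```python
ONE_DEPTH_MM = 0.003  # example value given in the embodiment

def step_toward_tof_focus(current_position_mm, tof_focus_position_mm,
                          max_step_mm=ONE_DEPTH_MM):
    """Move the lens toward the TOF-derived focus position by at most one depth."""
    delta = tof_focus_position_mm - current_position_mm
    if delta > 0:
        step = min(delta, max_step_mm)
    else:
        step = max(delta, -max_step_mm)
    return current_position_mm + step

# Example: even if the TOF focus position is far away, the lens moves only 0.003 mm.
second_position = step_toward_tof_focus(4.80, 4.15)  # -> 4.797
```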
Next, the imaging control unit 110 causes the imaging device 100 to capture a second image for DFD while the focus lens is at the second position, and acquires the second image for DFD (S110). The imaging control unit 110 performs the DFD calculation using the first image and the second image to calculate the focus position (S112). The imaging control unit 110 then determines whether the reliability of the focus position derived by the DFD calculation using the first image and the second image is high (S114). When the similarity between the first image and the second image is at or below a preset threshold, the imaging control unit 110 may determine that the reliability of the focus position derived by the DFD calculation is high. When the difference between the blur amount of the first image and the blur amount of the second image is at or above a preset threshold, the imaging control unit 110 may determine that the reliability of the focus position derived by the DFD calculation is high.
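The DFD calculation itself is not detailed in this embodiment. The sketch below therefore uses a deliberately simplified stand-in: it assumes the blur amount falls off roughly linearly with lens position on one side of focus and extrapolates to where it would vanish. The blur amounts could be estimated, for example, with the blur_amount helper sketched earlier; none of this should be read as the actual DFD algorithm.

```python
def dfd_focus_position(z1, b1, z2, b2):
    """
    Very simplified DFD-style estimate: z1, z2 are lens positions, b1, b2 the
    blur amounts estimated from the first and second images.  Assuming the
    blur decreases roughly linearly as the lens approaches focus, extrapolate
    to the position where the blur would reach zero.
    """
    if abs(z2 - z1) < 1e-9 or b1 <= b2:
        return None                  # images too similar or blur not decreasing: unreliable
    k = (b1 - b2) / (z2 - z1)        # blur change per unit of lens travel (signed)
    return z1 + b1 / k               # position where the extrapolated blur vanishes

# Example call with hypothetical values:
# dfd_focus_position(z1=4.800, b1=0.90, z2=4.797, b2=0.88)
```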
When it determines that the reliability of the focus position derived by the DFD calculation is high, the imaging control unit 110 drives the focus lens to the focus position obtained by the DFD method (S116). In this case, as shown in FIG. 10, the imaging control unit 110 drives the focus lens from the AF start position (the first position) to the second position, and then drives the focus lens to the focus position calculated by the DFD method.
When it determines that the reliability of the focus position derived by the DFD calculation is low, the imaging control unit 110 drives the focus lens further in the focus direction (S118). The imaging control unit 110 may drive the focus lens by no more than one depth. Alternatively, the imaging control unit 110 may drive the focus lens to one boundary of a contrast AF search range determined from the focus position calculated by the TOF method or the DFD method, in order to execute contrast AF. The imaging control unit 110 drives the focus lens from the second position to a third position. The imaging control unit 110 then causes the imaging device 100 to capture a third image for DFD while the focus lens is at the third position, and acquires the third image for DFD (S120). The imaging control unit 110 determines whether the reliability of the focus position derived by the DFD calculation using the second image and the third image is high (S124). The imaging control unit 110 may instead determine whether the reliability of the focus position derived by the DFD calculation using the first image and the third image is high.
When the reliability of the focus position derived by the DFD calculation is high, the imaging control unit 110 drives the focus lens to the focus position obtained by the DFD method (S116). As shown in FIG. 11, the imaging control unit 110 drives the focus lens from the AF start position (the first position) to the second position, then moves it further to the third position, and finally drives it to the focus position derived by the DFD calculation using the second image and the third image.
When the reliability of the focus position derived by the DFD calculation is low, the imaging control unit 110 acquires a contrast evaluation value of the third image in order to execute contrast AF (S126). To search for the focus lens position at which the contrast evaluation value peaks, the imaging control unit 110 drives the focus lens in small steps (S128).
The imaging control unit 110 determines whether the focus lens position at which the contrast evaluation value peaks has been found (S130). If the peak has not been found, the imaging control unit 110 repeats the processing from step S126. If the peak has been found, the imaging control unit 110 drives the focus lens to the focus position determined by the contrast method (S132). As shown in FIG. 12, when the reliability of the focus positions obtained by both the TOF method and the DFD method is low, the imaging control unit 110 drives the focus lens in a hill-climbing manner to search for the peak of the contrast evaluation value. After driving the focus lens past the peak of the contrast evaluation value, the imaging control unit 110 drives the focus lens back in the reverse direction to the focus position of the contrast method.
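A compact sketch of hill-climbing contrast AF as in S126-S132 follows; the contrast metric, the step size, and the capture_at callback are illustrative assumptions, not the disclosed implementation.

```python
import numpy as np

def contrast_evaluation(image):
    # Sum of squared gradients is one common sharpness measure; the embodiment
    # does not specify which contrast evaluation value is used.
    gy, gx = np.gradient(image.astype(np.float64))
    return float((gx ** 2 + gy ** 2).sum())

def contrast_af_hill_climb(capture_at, start_mm, step_mm=0.003, max_steps=200):
    """
    Drive the lens in small steps while the contrast evaluation value keeps
    rising; once it falls, return to the best position found.
    `capture_at(position)` is assumed to move the lens and return an image.
    """
    best_pos = pos = start_mm
    best_val = contrast_evaluation(capture_at(pos))
    for _ in range(max_steps):
        pos += step_mm
        val = contrast_evaluation(capture_at(pos))
        if val > best_val:
            best_val, best_pos = val, pos
        else:
            break  # passed the peak
    return best_pos  # the lens is then driven back to this position
```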
As described above, according to the present embodiment, when the reliability of the subject distance calculated by the TOF method is high, the imaging device 100 performs AF by the TOF method. When that reliability is not high, the imaging device 100 performs AF by the DFD method. Further, when the reliability of the results of both the TOF method and the DFD method is not high, the imaging device 100 performs AF by the contrast method. In this way, the imaging device 100 can preferentially use the AF method that achieves focus with as little unnecessary movement of the focus lens as possible.
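Tying the pieces together, the sketch below mirrors the flow of FIG. 7 (S100 to S132) using the helpers defined in the earlier sketches. The camera and tof_sensor objects and their methods, and the assumption that captured images are 2-D arrays, are interfaces introduced only for this illustration.

```python
def auto_focus(camera, tof_sensor):
    """Illustrative top-level cascade corresponding to FIG. 7 (S100-S132)."""
    pos1 = camera.lens_position()
    img1 = camera.capture()                                           # S100
    tof_target = focus_position_from_tof(tof_sensor.distance())
    if tof_result_is_reliable(tof_sensor.light_amount()):             # S102
        camera.move_lens_to(tof_target)                               # S104-S106
        return
    pos2 = step_toward_tof_focus(pos1, tof_target)                    # S108
    camera.move_lens_to(pos2)
    img2 = camera.capture()                                           # S110
    if dfd_result_is_reliable(img1, img2):                            # S112-S114
        target = dfd_focus_position(pos1, blur_amount(img1), pos2, blur_amount(img2))
        if target is not None:
            camera.move_lens_to(target)                               # S116
            return
    pos3 = step_toward_tof_focus(pos2, tof_target)                    # S118
    camera.move_lens_to(pos3)
    img3 = camera.capture()                                           # S120
    if dfd_result_is_reliable(img2, img3):                            # S124
        target = dfd_focus_position(pos2, blur_amount(img2), pos3, blur_amount(img3))
        if target is not None:
            camera.move_lens_to(target)
            return
    def capture_at(p):                                                # S126-S132
        camera.move_lens_to(p)
        return camera.capture()
    camera.move_lens_to(contrast_af_hill_climb(capture_at, pos3))
```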
The imaging device 100 described above may be mounted on a mobile body. For example, it may be mounted on an unmanned aerial vehicle (UAV) as shown in FIG. 13. The UAV 1000 may include a UAV body 20, a gimbal 50, a plurality of imaging devices 60, and the imaging device 100. The gimbal 50 and the imaging device 100 are an example of an imaging system. The UAV 1000 is an example of a mobile body propelled by a propulsion unit. The concept of a mobile body includes, in addition to UAVs, other flying bodies such as aircraft moving through the air, vehicles moving on the ground, and ships moving on water.
The UAV body 20 includes a plurality of rotors, which are an example of a propulsion unit. The UAV body 20 flies the UAV 1000 by controlling the rotation of the rotors, for example by using four rotors. The number of rotors is not limited to four. The UAV 1000 may also be a fixed-wing aircraft without rotors.
The imaging device 100 is an imaging camera that captures a subject included in a desired imaging range. The gimbal 50 rotatably supports the imaging device 100 and is an example of a support mechanism. For example, the gimbal 50 uses an actuator to rotatably support the imaging device 100 about the pitch axis, and further uses actuators to rotatably support it about the roll axis and the yaw axis. The gimbal 50 can change the attitude of the imaging device 100 by rotating it about at least one of the yaw axis, the pitch axis, and the roll axis.
The plurality of imaging devices 60 are sensing cameras that photograph the surroundings of the UAV 1000 in order to control its flight. Two of the imaging devices 60 may be installed on the nose, that is, the front, of the UAV 1000, and the other two may be installed on its bottom surface. The two front imaging devices 60 may be paired to function as a so-called stereo camera, and the two bottom imaging devices 60 may likewise be paired to function as a stereo camera. Three-dimensional spatial data of the surroundings of the UAV 1000 can be generated from the images captured by the plurality of imaging devices 60. The number of imaging devices 60 included in the UAV 1000 is not limited to four; it is sufficient for the UAV 1000 to include at least one imaging device 60. The UAV 1000 may include at least one imaging device 60 on each of its nose, tail, sides, bottom surface, and top surface. The angle of view settable on the imaging devices 60 may be wider than the angle of view settable on the imaging device 100. The imaging devices 60 may have a single focal length lens or a fisheye lens.
The remote operation device 600 communicates with the UAV 1000 to operate it remotely, and may do so wirelessly. The remote operation device 600 transmits to the UAV 1000 instruction information indicating various commands related to the movement of the UAV 1000, such as ascending, descending, accelerating, decelerating, moving forward, moving backward, and rotating. The instruction information includes, for example, instruction information for raising the altitude of the UAV 1000, and may indicate the altitude at which the UAV 1000 should be located. The UAV 1000 moves so as to be at the altitude indicated by the instruction information received from the remote operation device 600. The instruction information may include an ascend command, and the UAV 1000 ascends while it is receiving the ascend command. When the altitude of the UAV 1000 has reached its upper limit, the UAV 1000 may be restricted from ascending even if it receives the ascend command.
FIG. 14 shows an example of a computer 1200 in which aspects of the present invention may be embodied in whole or in part. A program installed on the computer 1200 can cause the computer 1200 to function as an operation associated with the device according to the embodiment of the present invention, or as one or more "units" of that device, or can cause the computer 1200 to execute that operation or those one or more "units". The program can cause the computer 1200 to execute a process according to the embodiment of the present invention or the stages of that process. Such a program may be executed by the CPU 1212 to cause the computer 1200 to execute specified operations associated with some or all of the blocks in the flowcharts and block diagrams described in this specification.
The computer 1200 of this embodiment includes a CPU 1212 and a RAM 1214, which are connected to each other by a host controller 1210. The computer 1200 further includes a communication interface 1222 and an input/output unit, which are connected to the host controller 1210 via an input/output controller 1220. The computer 1200 also includes a ROM 1230. The CPU 1212 operates in accordance with the programs stored in the ROM 1230 and the RAM 1214, thereby controlling each unit.
The communication interface 1222 communicates with other electronic devices via a network. A hard disk drive may store the programs and data used by the CPU 1212 in the computer 1200. The ROM 1230 stores a boot program executed by the computer 1200 at startup and/or programs that depend on the hardware of the computer 1200. The programs are provided via a computer-readable recording medium such as a CD-ROM, a USB memory, or an IC card, or via a network. The programs are installed in the RAM 1214 or the ROM 1230, which are also examples of computer-readable recording media, and are executed by the CPU 1212. The information processing described in these programs is read by the computer 1200 and brings about cooperation between the programs and the various types of hardware resources described above. An apparatus or method may be constituted by realizing operations or processing of information in accordance with the use of the computer 1200.
For example, when communication is performed between the computer 1200 and an external device, the CPU 1212 may execute a communication program loaded in the RAM 1214 and, based on the processing described in the communication program, instruct the communication interface 1222 to perform communication processing. Under the control of the CPU 1212, the communication interface 1222 reads transmission data stored in a transmission buffer provided in a recording medium such as the RAM 1214 or a USB memory and sends the read transmission data to the network, or writes reception data received from the network into a reception buffer provided in the recording medium.
In addition, the CPU 1212 may cause the RAM 1214 to read all or a necessary part of a file or database stored in an external recording medium such as a USB memory, and perform various types of processing on the data in the RAM 1214. The CPU 1212 may then write the processed data back to the external recording medium.
Various types of information such as programs, data, tables, and databases may be stored in a recording medium and subjected to information processing. On the data read from the RAM 1214, the CPU 1212 may execute the various types of processing described throughout this disclosure and specified by the instruction sequences of the programs, including various operations, information processing, condition judgment, conditional branching, unconditional branching, and information retrieval and replacement, and write the results back to the RAM 1214. The CPU 1212 may also retrieve information in files, databases, and the like in the recording medium. For example, when a plurality of entries each having an attribute value of a first attribute associated with an attribute value of a second attribute are stored in the recording medium, the CPU 1212 may retrieve, from the plurality of entries, an entry matching a condition specifying the attribute value of the first attribute, read the attribute value of the second attribute stored in that entry, and thereby obtain the attribute value of the second attribute associated with the first attribute satisfying the preset condition.
The programs or software modules described above may be stored on the computer 1200 or on a computer-readable storage medium near the computer 1200. A recording medium such as a hard disk or a RAM provided in a server system connected to a dedicated communication network or the Internet may also be used as a computer-readable storage medium, so that the programs can be provided to the computer 1200 via the network.
The present invention has been described above using embodiments, but the technical scope of the present invention is not limited to the scope described in the above embodiments. It is apparent to those skilled in the art that various changes or improvements can be made to the above embodiments. It is also apparent from the description of the claims that embodiments incorporating such changes or improvements are included in the technical scope of the present invention.
It should be noted that the order of execution of operations, procedures, steps, stages, and the like in the devices, systems, programs, and methods shown in the claims, the specification, and the drawings may be realized in any order, unless explicitly indicated by expressions such as "before" or "prior to", and as long as the output of an earlier process is not used in a later process. Even where the operational flows in the claims, the specification, and the drawings are described using terms such as "first" and "next" for convenience, this does not mean that they must be carried out in that order.
[Symbol Description]
10 Imaging system
20 UAV body
50 Gimbal
60 Imaging device
100 Imaging device
110 Imaging control unit
120 Image sensor
130 Memory
150 Lens control unit
152 Lens driving unit
154 Lens
160 TOF sensor
162 Light emitting unit
163 Light emitting element
164 Light receiving unit
165 Light receiving element
166 Light emission control unit
167 Light reception control unit
168 Memory
200 Support mechanism
201 Roll axis driving mechanism
202 Pitch axis driving mechanism
203 Yaw axis driving mechanism
204 Base
210 Attitude control unit
212 Angular velocity sensor
214 Acceleration sensor
300 Grip unit
301 Operation interface
302 Display unit
400 Smartphone
600 Remote operation device
1200 Computer
1210 Host controller
1212 CPU
1214 RAM
1220 Input/output controller
1222 Communication interface
1230 ROM

Claims (10)

  1. A control device that controls an imaging device including a distance measuring sensor and a focus lens, the distance measuring sensor including a light emitting element that emits pulsed light and a light receiving element that receives light including reflected light of the pulsed light from an object and outputs a signal corresponding to the amount of the received light, and the distance measuring sensor measuring a distance to the object based on the signal, the control device comprising a circuit configured to:
    when the signal satisfies a first condition indicating the reliability of the distance, execute focus control of focusing on the object based on a first target positional relationship between an imaging surface of the imaging device and the focus lens determined according to the distance;
    when the signal does not satisfy the first condition, execute the focus control based on a second target positional relationship between the imaging surface and the focus lens determined according to respective blur amounts of a first image captured by the imaging device when the positional relationship between the imaging surface and the focus lens is a first positional relationship and a second image captured by the imaging device when the positional relationship between the imaging surface and the focus lens is a second positional relationship.
  2. The control device according to claim 1, wherein the circuit is configured to execute the focus control based on the second target positional relationship when a first degree of difference indicating the degree of difference between the first image and the second image satisfies a second condition indicating the reliability of the second target positional relationship.
  3. The control device according to claim 2, wherein the circuit is configured to execute the focus control based on contrast AF when the first degree of difference does not satisfy the second condition.
  4. The control device according to claim 2, wherein the circuit is configured to:
    when the first degree of difference does not satisfy the second condition, change the positional relationship between the imaging surface and the focus lens from the second positional relationship to a third positional relationship;
    when a second degree of difference indicating the degree of difference between the first image or the second image and a third image captured by the imaging device when the positional relationship between the imaging surface and the focus lens is the third positional relationship satisfies the second condition, execute the focus control based on a third target positional relationship between the imaging surface and the focus lens determined according to respective blur amounts of the first image or the second image and the third image;
    when the second degree of difference does not satisfy the second condition, execute the focus control based on contrast AF.
  5. The control device according to claim 1, wherein the signal satisfies the first condition when the signal indicates that the amount of light is within a preset range.
  6. The control device according to claim 1, wherein the circuit is configured to, when the signal does not satisfy the first condition, acquire the first image captured by the imaging device when the positional relationship between the imaging surface and the focus lens is the first positional relationship, then change the positional relationship between the imaging surface and the focus lens to the second positional relationship, and acquire the second image captured by the imaging device when the positional relationship between the imaging surface and the focus lens is the second positional relationship.
  7. The control device according to claim 6, wherein the circuit is configured to, when the signal does not satisfy the first condition, acquire the first image captured by the imaging device when the positional relationship between the imaging surface and the focus lens is the first positional relationship, and then move the focus lens by a preset distance in a direction determined based on the first target positional relationship, thereby shifting the positional relationship between the imaging surface and the focus lens to the second positional relationship.
  8. An imaging system, comprising: the control device according to any one of claims 1 to 7;
    the distance measuring sensor; and
    the imaging device.
  9. A control method for controlling an imaging device including a distance measuring sensor and a focus lens, the distance measuring sensor including a light emitting element that emits pulsed light and a light receiving element that receives light including reflected light of the pulsed light from an object and outputs a signal corresponding to the amount of the received light, and the distance measuring sensor measuring a distance to the object based on the signal, the control method comprising:
    when the signal satisfies a first condition indicating the reliability of the distance, executing focus control of focusing on the object based on a first target positional relationship between an imaging surface of the imaging device and the focus lens determined according to the distance;
    when the signal does not satisfy the first condition, executing the focus control based on a second target positional relationship between the imaging surface and the focus lens determined according to respective blur amounts of a first image captured by the imaging device when the positional relationship between the imaging surface and the focus lens is a first positional relationship and a second image captured by the imaging device when the positional relationship between the imaging surface and the focus lens is a second positional relationship.
  10. A program for causing a computer to function as the control device according to any one of claims 1 to 7.
PCT/CN2020/106579 2019-08-21 2020-08-03 Control device, photographing system, control method, and program WO2021031833A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202080003361.5A CN112335227A (en) 2019-08-21 2020-08-03 Control device, imaging system, control method, and program

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2019151539A JP2021032990A (en) 2019-08-21 2019-08-21 Control device, imaging system, control method and program
JP2019-151539 2019-08-21

Publications (1)

Publication Number Publication Date
WO2021031833A1 true WO2021031833A1 (en) 2021-02-25

Family

ID=74660175

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/106579 WO2021031833A1 (en) 2019-08-21 2020-08-03 Control device, photographing system, control method, and program

Country Status (2)

Country Link
JP (1) JP2021032990A (en)
WO (1) WO2021031833A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7160391B2 (en) * 2021-03-24 2022-10-25 Victor Hasselblad Aktiebolag Distance detection device and imaging device

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104954681A (en) * 2015-06-16 2015-09-30 广东欧珀移动通信有限公司 Method for switching off laser focusing mode and terminal
CN105763805A (en) * 2016-02-29 2016-07-13 广东欧珀移动通信有限公司 Control method and device and electronic device
US20170048459A1 (en) * 2014-05-02 2017-02-16 Fujifilm Corporation Distance measurement device, distance measurement method, and distance measurement program
CN107809586A (en) * 2017-10-31 2018-03-16 努比亚技术有限公司 Switching method, mobile terminal and the storage medium of mobile terminal focal modes
CN108235815A (en) * 2017-04-07 2018-06-29 深圳市大疆创新科技有限公司 Video camera controller, photographic device, camera system, moving body, camera shooting control method and program
WO2019061887A1 (en) * 2017-09-28 2019-04-04 深圳市大疆创新科技有限公司 Control device, photographing device, aircraft, control method and program

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170048459A1 (en) * 2014-05-02 2017-02-16 Fujifilm Corporation Distance measurement device, distance measurement method, and distance measurement program
CN104954681A (en) * 2015-06-16 2015-09-30 广东欧珀移动通信有限公司 Method for switching off laser focusing mode and terminal
CN105763805A (en) * 2016-02-29 2016-07-13 广东欧珀移动通信有限公司 Control method and device and electronic device
CN108235815A (en) * 2017-04-07 2018-06-29 深圳市大疆创新科技有限公司 Video camera controller, photographic device, camera system, moving body, camera shooting control method and program
WO2019061887A1 (en) * 2017-09-28 2019-04-04 深圳市大疆创新科技有限公司 Control device, photographing device, aircraft, control method and program
CN107809586A (en) * 2017-10-31 2018-03-16 努比亚技术有限公司 Switching method, mobile terminal and the storage medium of mobile terminal focal modes

Also Published As

Publication number Publication date
JP2021032990A (en) 2021-03-01

Similar Documents

Publication Publication Date Title
CN111567032B (en) Specifying device, moving body, specifying method, and computer-readable recording medium
CN112335227A (en) Control device, imaging system, control method, and program
WO2021013143A1 (en) Apparatus, photgraphic apparatus, movable body, method, and program
WO2021031833A1 (en) Control device, photographing system, control method, and program
WO2020098603A1 (en) Determination device, camera device, camera system, moving object, determination method and program
WO2020216037A1 (en) Control device, camera device, movable body, control method and program
JP2021085893A (en) Control device, image capturing device, control method, and program
US20220188993A1 (en) Control apparatus, photographing apparatus, control method, and program
JP6543875B2 (en) Control device, imaging device, flying object, control method, program
JP2019082539A (en) Control device, lens device, flying body, control method, and program
WO2021052216A1 (en) Control device, photographing device, control method, and program
JP6714802B2 (en) Control device, flying body, control method, and program
WO2020108284A1 (en) Determining device, moving object, determining method, and program
WO2021031840A1 (en) Device, photographing apparatus, moving body, method, and program
WO2019223614A1 (en) Control apparatus, photographing apparatus, moving body, control method, and program
WO2022001561A1 (en) Control device, camera device, control method, and program
WO2021249245A1 (en) Device, camera device, camera system, and movable member
WO2020216039A1 (en) Control apparatus, camera system, moving body, control method and program
JP6746856B2 (en) Control device, imaging system, moving body, control method, and program
WO2021204020A1 (en) Device, camera device, camera system, moving body, method, and program
US20220070362A1 (en) Control apparatuses, photographing apparatuses, movable objects, control methods, and programs
WO2020216057A1 (en) Control device, photographing device, mobile body, control method and program
JP2022011712A (en) Scene recognition apparatus, imaging apparatus, scene recognition method, and program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20855069

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20855069

Country of ref document: EP

Kind code of ref document: A1