WO2021052216A1 - Control device, imaging device, control method, and program - Google Patents

Control device, imaging device, control method, and program

Info

Publication number
WO2021052216A1
WO2021052216A1 (PCT/CN2020/113963)
Authority
WO
WIPO (PCT)
Prior art keywords
distance
imaging
area
light receiving
sensor
Prior art date
Application number
PCT/CN2020/113963
Other languages
English (en)
French (fr)
Inventor
本庄谦一
永山佳范
Original Assignee
深圳市大疆创新科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳市大疆创新科技有限公司
Priority to CN202080003318.9A priority Critical patent/CN112313941A/zh
Publication of WO2021052216A1 publication Critical patent/WO2021052216A1/zh
Priority to US17/683,163 priority patent/US20220188993A1/en

Links

Images

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B7/00Mountings, adjusting means, or light-tight connections, for optical elements
    • G02B7/28Systems for automatic generation of focusing signals
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B7/00Mountings, adjusting means, or light-tight connections, for optical elements
    • G02B7/28Systems for automatic generation of focusing signals
    • G02B7/40Systems for automatic generation of focusing signals using time delay of the reflected waves, e.g. of ultrasonic waves
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B13/00Viewfinders; Focusing aids for cameras; Means for focusing for cameras; Autofocus systems for cameras
    • G03B13/18Focusing aids
    • G03B13/20Rangefinders coupled with focusing arrangements, e.g. adjustment of rangefinder automatically focusing camera
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B13/00Viewfinders; Focusing aids for cameras; Means for focusing for cameras; Autofocus systems for cameras
    • G03B13/32Means for focusing
    • G03B13/34Power focusing
    • G03B13/36Autofocus systems
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules

Definitions

  • the present invention relates to a control device, an imaging device, a control method, and a program.
  • Patent Document 1 discloses calculating a distance value for each of M×N pixels based on a TOF algorithm and storing the distance information in a depth map memory.
  • Patent Document 1: Japanese translation of PCT international application, Publication No. 2019-508717.
  • the positional relationship between the light-receiving surface of the TOF sensor and the light-receiving surface of the image sensor of the imaging device varies with the distance to the subject measured by the TOF sensor. Due to an error in the positional relationship between the position of the subject on the light-receiving surface of the TOF sensor and the position of that subject on the light-receiving surface of the image sensor of the imaging device, it is sometimes impossible to focus on the desired subject.
  • the control device according to one aspect of the present invention may be a control device that controls an imaging device including a ranging sensor, which measures the distance to the subject associated with each of a plurality of ranging areas on the light-receiving surface of a light-receiving element, and an image sensor that photographs the subject.
  • the control device includes a circuit configured to correct, based on the plurality of distances measured by the ranging sensor, a predetermined positional relationship between the plurality of ranging areas on the light-receiving surface of the light-receiving element and a plurality of imaging areas on the light-receiving surface of the image sensor.
  • the circuit may be configured to determine the first ranging area corresponding to the first imaging area of the focus object based on the corrected positional relationship.
  • the circuit may be configured to perform focus control of the imaging device based on the distance of the first ranging area measured by the ranging sensor.
  • the circuit may be configured to classify, based on the multiple distances measured by the ranging sensor, adjacent ranging areas whose distances belong to a predetermined distance range into group areas.
  • the circuit can be configured to correct the positional relationship of each group area.
  • the circuit may be configured to determine the group area corresponding to the first imaging area based on the corrected positional relationship.
  • the circuit may be configured to perform focus control of the imaging device based on the distance of the group area, which is determined from the multiple distances measured by the ranging sensor.
  • the circuit may be configured to perform focus control of the imaging device based on the distance of the distance measurement area located at the reference position among the plurality of distance measurement areas included in the group area.
  • the circuit may be configured to superimpose a frame indicating the position of the subject on the position corresponding to the group area of the image captured by the imaging device and display it on the display unit.
  • the circuit may be configured to classify the multiple ranging areas in the corrected positional relationship into group areas of adjacent ranging areas belonging to a predetermined distance range.
  • the circuit may be configured to specify the group area corresponding to the first imaging area based on the corrected positional relationship.
  • the circuit may be configured to perform focus control of the imaging device based on the distance of the group area, which is determined from the plurality of distances measured by the ranging sensor.
  • the circuit may be configured to perform focus control of the imaging device based on the distance of the distance measurement area located at the reference position among the plurality of distance measurement areas included in the group area.
  • the circuit may be configured to superimpose a frame indicating the position of the subject on the position corresponding to the group area of the image captured by the imaging device and display it on the display unit.
  • the predetermined positional relationship may be determined based on the positional relationship between the position of the optical axis center on the light receiving surface of the light receiving element and the position of the optical axis center on the light receiving surface of the image sensor.
  • the predetermined positional relationship may indicate the correspondence between the first coordinate system associated with the light-receiving surface of the light-receiving element and the second coordinate system associated with the light-receiving surface of the image sensor.
  • the circuit may be configured to determine, based on predetermined correction conditions indicating the correction amount of the positional relationship corresponding to the angle of view of the ranging sensor, the angle of view of the imaging device, and the distance to the subject, the correction amounts corresponding to the multiple distances measured by the ranging sensor, and to correct the positional relationship based on the determined correction amounts.
  • the imaging device may include the above-mentioned control device, a distance measuring sensor, and a second image sensor.
  • the control method according to one aspect of the present invention may be a method of controlling an imaging device that includes a ranging sensor, which measures the distance to the subject associated with each of a plurality of ranging areas on the light-receiving surface of a light-receiving element, and an image sensor that photographs the subject.
  • the control method may include a stage of correcting, based on the multiple distances measured by the ranging sensor, the predetermined positional relationship between the plurality of ranging areas on the light-receiving surface of the light-receiving element and the plurality of imaging areas on the light-receiving surface of the image sensor.
  • the control method may include the following stages: based on the corrected positional relationship, the first ranging area corresponding to the first imaging area of the focus object is determined.
  • the control method may include the following stages: performing focus control of the imaging device based on the distance of the first ranging area measured by the ranging sensor.
  • the program according to one aspect of the present invention may be a program for causing a computer to function as the above-mentioned control device.
  • Fig. 1 is an external perspective view of the camera system.
  • Fig. 2 is a diagram showing functional blocks of the camera system.
  • FIG. 3 is a diagram showing an example of the positional relationship between the optical axis of the lens of the imaging device and the optical axis of the TOF sensor.
  • FIG. 4 is a diagram showing an example of the positional relationship between a plurality of imaging areas of an image sensor and a plurality of ranging areas of a TOF sensor.
  • FIG. 5 is a diagram showing an example of the positional relationship between a plurality of imaging regions of an image sensor and a plurality of ranging regions of a TOF sensor.
  • FIG. 6 is a diagram showing an example of a table showing a correspondence relationship between a coordinate system related to the light-receiving surface of the TOF sensor and a coordinate system related to the light-receiving surface of the image sensor.
  • FIG. 7 is a diagram showing an example of correction conditions indicating the relationship between the distance to the subject and the correction amount.
  • FIG. 8 is a diagram showing an example of a table showing a corrected correspondence relationship between the coordinate system associated with the light receiving surface of the TOF sensor and the coordinate system associated with the light receiving surface of the image sensor.
  • FIG. 9 is a flowchart showing an example of a focus control processing procedure of the imaging control unit.
  • Fig. 10 is a flowchart showing an example of a focus control processing procedure of the imaging control unit.
  • Fig. 11 is an external perspective view showing another form of the imaging system.
  • Fig. 12 is a diagram showing an example of the appearance of an unmanned aircraft and a remote control device.
  • FIG. 13 is a diagram showing an example of the hardware configuration.
  • the blocks may represent (1) a stage of a process of performing operations or (2) a "part" of a device that performs operations.
  • Specific stages and “sections” can be implemented by programmable circuits and/or processors.
  • Dedicated circuits may include digital and/or analog hardware circuits, and may include integrated circuits (ICs) and/or discrete circuits.
  • Programmable circuits may include reconfigurable hardware circuits.
  • Reconfigurable hardware circuits can include memory elements such as logical AND, logical OR, logical exclusive-OR, logical NAND, logical NOR and other logical operations, flip-flops, registers, field-programmable gate arrays (FPGA), and programmable logic arrays (PLA).
  • the computer-readable medium may include any tangible device that can store instructions to be executed by a suitable device.
  • the computer-readable medium on which instructions are stored includes a product that includes instructions that can be executed to create means for performing operations specified by the flowchart or block diagram.
  • examples of a computer-readable medium may include electronic storage media, magnetic storage media, optical storage media, electromagnetic storage media, semiconductor storage media, and the like.
  • more specific examples of the computer-readable medium may include a floppy (registered trademark) disk, a flexible disk, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an electrically erasable programmable read-only memory (EEPROM), a static random access memory (SRAM), a compact disc read-only memory (CD-ROM), a digital versatile disc (DVD), a Blu-ray (registered trademark) disc, a memory stick, an integrated circuit card, and the like.
  • the computer-readable instructions may include any one of source code or object code described in any combination of one or more programming languages.
  • the source code or object code includes traditional procedural programming languages.
  • Traditional procedural programming languages may be assembly instructions, instruction set architecture (ISA) instructions, machine instructions, machine-dependent instructions, microcode, firmware instructions, state-setting data, or object-oriented programming languages such as Smalltalk (registered trademark), JAVA (registered trademark) and C++, and the "C" programming language or similar programming languages.
  • the computer-readable instructions may be provided to a processor or programmable circuit of a general-purpose computer, special-purpose computer, or other programmable data processing device, either locally or via a local area network (LAN) or a wide area network (WAN) such as the Internet.
  • the processor or programmable circuit can execute computer-readable instructions to create means for performing the operations specified in the flowchart or block diagram.
  • Examples of processors include computer processors, processing units, microprocessors, digital signal processors, controllers, microcontrollers, and so on.
  • FIG. 1 is an example of an external perspective view of an imaging system 10 according to this embodiment.
  • the imaging system 10 includes an imaging device 100, a supporting mechanism 200, and a holding part 300.
  • the support mechanism 200 uses actuators to rotatably support the imaging device 100 around the roll axis, the pitch axis, and the yaw axis, respectively.
  • the support mechanism 200 can change or maintain the posture of the imaging device 100 by rotating the imaging device 100 around at least one of the roll axis, the pitch axis, and the yaw axis.
  • the support mechanism 200 includes a roll axis drive mechanism 201, a pitch axis drive mechanism 202, and a yaw axis drive mechanism 203.
  • the support mechanism 200 also includes a base 204 to which the yaw axis drive mechanism 203 is fixed.
  • the grip 300 is fixed to the base 204.
  • the holding part 300 includes an operation interface 301 and a display part 302.
  • the imaging device 100 is fixed to the pitch axis driving mechanism 202.
  • the operation interface 301 accepts instructions for operating the camera device 100 and the supporting mechanism 200 from the user.
  • the operation interface 301 may include a shutter/recording button for instructing the camera device 100 to shoot or record.
  • the operation interface 301 may include a power/function button for instructing to turn on or off the power of the camera system 10 and to switch the still image shooting mode or the movie shooting mode of the camera 100.
  • the display part 302 can display an image captured by the imaging device 100.
  • the display unit 302 can display a menu screen for operating the imaging device 100 and the supporting mechanism 200.
  • the display unit 302 may be a touch screen display that accepts instructions for operating the imaging device 100 and the supporting mechanism 200.
  • the user holds the grip 300 and shoots still images or animations through the imaging device 100.
  • FIG. 2 is a diagram showing functional blocks of the imaging system 10.
  • the imaging device 100 includes an imaging control unit 110, an image sensor 120, a memory 130, a lens control unit 150, a lens drive unit 152, a plurality of lenses 154, and a TOF sensor 160.
  • the image sensor 120 may be composed of CCD or CMOS.
  • the image sensor 120 is an example of an image sensor used for shooting.
  • the image sensor 120 outputs image data of the optical image formed by the plurality of lenses 154 to the imaging control unit 110.
  • the imaging control unit 110 may be constituted by a microprocessor such as a CPU or an MPU, a microcontroller such as an MCU, or the like.
  • in accordance with operation instructions for the imaging device 100 from the grip 300, the imaging control unit 110 performs demosaicing processing on the image signal output from the image sensor 120 to generate image data.
  • the imaging control unit 110 stores image data in the memory 130.
  • the imaging control unit 110 controls the TOF sensor 160.
  • the imaging control unit 110 is an example of a circuit.
  • the TOF sensor 160 is a time-of-flight type sensor that measures the distance to an object.
  • the imaging device 100 adjusts the position of the focus lens based on the distance measured by the TOF sensor 160, thereby performing focus control.
  • the memory 130 may be a computer-readable storage medium, and may include at least one of SRAM, DRAM, EPROM, EEPROM, and flash memory such as USB memory.
  • the memory 130 stores programs and the like necessary for the imaging control unit 110 to control the image sensor 120 and the like.
  • the memory 130 may be provided inside the housing of the imaging device 100.
  • the grip 300 may include other memory for storing image data captured by the imaging device 100.
  • the holding part 300 may include a slot through which the memory can be detached from the housing of the holding part 300.
  • the multiple lenses 154 can function as zoom lenses, varifocal lenses, and focusing lenses. At least a part or all of the plurality of lenses 154 are configured to be movable along the optical axis.
  • the lens control unit 150 drives the lens driving unit 152 in accordance with a lens control command from the imaging control unit 110 to move one or more lenses 154 in the optical axis direction.
  • the lens control commands are, for example, zoom control commands and focus control commands.
  • the lens driving part 152 may include a voice coil motor (VCM) that moves at least a part or all of the plurality of lenses 154 in the optical axis direction.
  • the lens driving part 152 may include a motor such as a DC motor, a coreless motor, or an ultrasonic motor.
  • the lens driving unit 152 can transmit the power from the motor to at least a part or all of the plurality of lenses 154 via mechanism components such as cam rings and guide shafts, and move at least a part or all of the plurality of lenses 154 along the optical axis.
  • in the present embodiment, an example in which the plurality of lenses 154 are integrated with the imaging device 100 will be described.
  • the plurality of lenses 154 may be interchangeable lenses, and may be formed separately from the imaging device 100.
  • the imaging device 100 further includes a posture control unit 210, an angular velocity sensor 212, and an acceleration sensor 214.
  • the angular velocity sensor 212 detects the angular velocity of the imaging device 100.
  • the angular velocity sensor 212 detects the respective angular velocities of the imaging device 100 around the roll axis, the pitch axis, and the yaw axis.
  • the posture control unit 210 acquires angular velocity information related to the angular velocity of the imaging device 100 from the angular velocity sensor 212.
  • the angular velocity information may show the respective angular velocities of the camera device 100 around the roll axis, the pitch axis, and the yaw axis.
  • the posture control unit 210 acquires acceleration information related to the acceleration of the imaging device 100 from the acceleration sensor 214.
  • the acceleration information may also show the acceleration of the camera device 100 in each direction of the roll axis, the pitch axis, and the yaw axis.
  • the angular velocity sensor 212 and the acceleration sensor 214 may be provided in a housing that houses the image sensor 120, the lens 154, and the like. In the present embodiment, the form of the integrated configuration of the imaging device 100 and the support mechanism 200 will be described. However, the supporting mechanism 200 may include a base for detachably fixing the imaging device 100. In this case, the angular velocity sensor 212 and the acceleration sensor 214 may be provided outside the housing of the imaging device 100 such as a base.
  • the posture control unit 210 controls the support mechanism 200 based on the angular velocity information and acceleration information to maintain or change the posture of the imaging device 100.
  • the posture control unit 210 controls the support mechanism 200 in accordance with the operating mode of the support mechanism 200 for controlling the posture of the imaging device 100 to maintain or change the posture of the imaging device 100.
  • the operation modes include a mode in which at least one of the roll axis drive mechanism 201, the pitch axis drive mechanism 202, and the yaw axis drive mechanism 203 of the support mechanism 200 is operated so that the posture change of the imaging device 100 follows the posture change of the base 204 of the support mechanism 200.
  • the operation modes include a mode in which each of the roll axis drive mechanism 201, the pitch axis drive mechanism 202, and the yaw axis drive mechanism 203 of the support mechanism 200 is operated so that the posture change of the imaging device 100 follows the posture change of the base 204 of the support mechanism 200.
  • the operation modes include a mode in which each of the pitch axis drive mechanism 202 and the yaw axis drive mechanism 203 of the support mechanism 200 is operated so that the posture change of the imaging device 100 follows the posture change of the base 204 of the support mechanism 200.
  • the operation modes include a mode in which only the yaw axis drive mechanism 203 is operated so that the posture change of the imaging device 100 follows the posture change of the base 204 of the support mechanism 200.
  • the operation modes may include an FPV (First Person View) mode, in which the support mechanism 200 is operated so that the posture change of the imaging device 100 follows the posture change of the base 204 of the support mechanism 200, and a fixed mode, in which the support mechanism 200 is operated so as to maintain the posture of the imaging device 100.
  • the FPV mode is a mode in which at least one of the roll axis drive mechanism 201, the pitch axis drive mechanism 202, and the yaw axis drive mechanism 203 is operated to make the posture change of the imaging device 100 follow the posture change of the base 204 of the support mechanism 200.
  • the fixed mode is a mode in which at least one of the roll axis drive mechanism 201, the pitch axis drive mechanism 202, and the yaw axis drive mechanism 203 is operated in order to maintain the current posture of the imaging device 100.
  • the TOF sensor 160 includes a light emitting unit 162, a light receiving unit 164, a light emitting control unit 166, a light receiving control unit 167, and a memory 168.
  • the TOF sensor 160 is an example of a distance measuring sensor.
  • the light emitting part 162 includes at least one light emitting element 163.
  • the light-emitting element 163 is a device such as an LED or laser that repeatedly emits pulsed light modulated at high speed.
  • the light-emitting element 163 may emit pulsed infrared light.
  • the light emission control unit 166 controls the light emission of the light emitting element 163.
  • the light emission control part 166 can control the pulse width of the pulse light emitted from the light emitting element 163.
  • the light-receiving unit 164 includes a plurality of light-receiving elements 165 that measure the distance to the subject associated with each of the plurality of ranging areas.
  • the light receiving unit 164 is an example of a sensor for distance measurement.
  • the plurality of light receiving elements 165 correspond to each of the plurality of ranging areas.
  • the light receiving element 165 repeatedly receives reflected light of pulsed light from the object.
  • the light receiving control unit 167 controls the light receiving element 165 to receive light.
  • the light receiving control unit 167 measures the distance to the subject associated with each of the plurality of ranging areas based on the amount of reflected light repeatedly received by the light receiving element 165 during the predetermined light receiving period.
  • the light receiving control unit 167 can measure the distance to the subject by determining the phase difference between the pulsed light and the reflected light based on the amount of reflected light repeatedly received by the light receiving element 165 during a predetermined light receiving period.
  • the light receiving unit 164 can measure the distance to the subject by reading the frequency change of the reflected wave. This is called FMCW (Frequency Modulated Continuous Wave) mode.
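As a rough illustration of the phase-difference method described in the bullets above, the distance follows from the fraction of a modulation cycle by which the reflected light lags the emitted light. The sketch below assumes a continuous-wave modulation frequency, which the text does not specify:

```python
import math

C = 299_792_458.0  # speed of light in m/s

def tof_distance(phase_shift_rad: float, mod_freq_hz: float) -> float:
    """Distance from the phase difference between the emitted pulsed light and
    the reflected light received by the light-receiving element 165:
    d = (c / 2) * (phase / 2*pi) / f_mod. The factor 1/2 accounts for the
    round trip; the result is unambiguous only up to c / (2 * f_mod)."""
    return (C / 2.0) * (phase_shift_rad / (2.0 * math.pi)) / mod_freq_hz

# A quarter-cycle phase shift at a hypothetical 10 MHz modulation gives ~3.75 m.
print(tof_distance(math.pi / 2, 10e6))
```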
  • the memory 168 may be a computer-readable recording medium, and may include at least one of SRAM, DRAM, EPROM, and EEPROM.
  • the memory 168 stores a program necessary for the light-emitting control unit 166 to control the light-emitting unit 162, a program necessary for the light-receiving control unit 167 to control the light-receiving unit 164, and the like.
  • the TOF sensor 160 can measure the distance to the subject associated with each of the plurality of distance measurement areas corresponding to the number of pixels of the light receiving unit 164. However, generally, the number of pixels of the light receiving unit 164 is less than the number of pixels of the image sensor 120 for imaging of the imaging device 100. In addition, the positional relationship between the light-receiving surface of the light-receiving portion 164 of the TOF sensor 160 and the light-receiving surface of the image sensor 120 of the imaging device 100 changes according to the distance to the subject measured by the TOF sensor 160.
  • when the imaging device 100 performs focus control based on the distance to the subject measured by the TOF sensor 160, it sometimes cannot focus on the subject intended by the user.
  • FIG. 3 shows the positional relationship between the position of the subject on the light-receiving surface of the light-receiving section 164 of the TOF sensor 160 and the position of the subject on the light-receiving surface of the image sensor 120 of the imaging device 100.
  • 3 shows a situation where the imaging device 100 is capturing a subject (Obj1) 501 at a distance of L1 from the imaging device 100 and a subject (Obj2) 502 at a distance of L2 from the imaging device 100.
  • the angle of view of the imaging device 100 is α.
  • the angle of view of the TOF sensor 160 is β.
  • the ranging area of the TOF sensor 160 is 8 pixels × 8 pixels, that is, 64 areas.
  • one ranging area corresponds to one pixel of the light receiving unit 164.
  • the area 511 indicates the positional relationship between the ranging areas and the imaging areas of the imaging device 100 when the distance from the imaging device 100 is L1.
  • the area 512 indicates the positional relationship between the ranging areas and the imaging areas of the imaging device 100 when the distance from the imaging device 100 is L2.
  • the optical axis P2 of the TOF sensor 160 passes through the center of the plurality of ranging areas of the TOF sensor 160.
  • at the distance L1, the optical axis P1 of the imaging device 100 is 1.9 pixels away from the center of the plurality of ranging areas of the TOF sensor 160.
  • at the distance L2, the optical axis P1 of the imaging device 100 is 1.2 pixels away from the center of the plurality of ranging areas of the TOF sensor 160. That is, the distance between the optical axis P1 of the imaging device 100 and the center of the plurality of ranging areas of the TOF sensor 160 differs depending on the distance to the subject. This is the so-called parallax phenomenon.
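This parallax can be reproduced with a simple pinhole model. The sketch below assumes parallel optical axes separated by a fixed baseline; the baseline and angle of view are hypothetical values chosen only to land near the 1.9/1.2-pixel figures quoted above, not values from the patent:

```python
import math

def parallax_offset_px(baseline_m: float, distance_m: float,
                       fov_deg: float, pixels_across: int) -> float:
    """Offset, in TOF-sensor pixels, between the image sensor's optical axis P1
    and the center of the TOF ranging-area grid, for a subject at distance_m.
    One TOF pixel spans (2 * d * tan(fov/2)) / pixels_across meters at range d."""
    pixel_pitch_m = 2.0 * distance_m * math.tan(math.radians(fov_deg) / 2.0) / pixels_across
    return baseline_m / pixel_pitch_m

# With an assumed 20 mm baseline and 60-degree TOF angle of view on the 8x8 grid,
# a nearer subject shows a larger offset than a farther one:
print(parallax_offset_px(0.02, 0.07, 60.0, 8))   # ~1.9 pixels at the nearer L1
print(parallax_offset_px(0.02, 0.115, 60.0, 8))  # ~1.2 pixels at the farther L2
```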
  • the imaging device 100 corrects the positional relationship between the light-receiving surface of the image sensor 120 and the light-receiving surface of the light-receiving unit 164 of the TOF sensor 160 based on the distance to the subject measured by the TOF sensor 160. Furthermore, the imaging device 100 determines, from the corrected positional relationship, the ranging area of the TOF sensor 160 that corresponds to the position of the subject on the image captured by the imaging device 100, and performs focus control based on the distance of the determined ranging area measured by the TOF sensor 160.
  • the imaging control unit 110 acquires the distance to the subject associated with each of the plurality of ranging areas measured by the TOF sensor 160.
  • the imaging control unit 110 corrects the predetermined positional relationship between the plurality of distance measurement areas on the light receiving surface of the light receiving unit 164 and the plurality of imaging areas on the light receiving surface of the image sensor 120 based on the multiple distances.
  • the imaging control unit 110 may determine the correction amounts corresponding to the multiple distances measured by the TOF sensor 160 based on predetermined correction conditions indicating the correction amount of the positional relationship corresponding to the angle of view of the TOF sensor 160, the angle of view of the imaging device 100, and the distance to the subject, and may correct the positional relationship based on the determined correction amounts.
  • the imaging control unit 110 can correct the positional relationship by moving the position on the light-receiving surface of the TOF sensor 160 corresponding to the position on the light-receiving surface of the image sensor 120 by the number of pixels corresponding to the correction amount.
  • the predetermined positional relationship may be determined based on the positional relationship between the position of the optical axis center on the light receiving surface of the light receiving section 164 and the position of the optical axis center on the light receiving surface of the image sensor 120.
  • the predetermined positional relationship may indicate the correspondence between the first coordinate system associated with the light-receiving surface of the light-receiving portion 164 and the second coordinate system associated with the light-receiving surface of the image sensor 120.
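One plausible way to hold and correct this correspondence is a per-ranging-area coordinate table, as Figs. 6 and 8 suggest. The sketch below is an assumed data layout, not the patent's actual table format; the shift direction ("upward", i.e. decreasing y) follows the example given later for the group areas 531 and 532:

```python
# Predetermined positional relationship: each TOF ranging area (col, row) maps
# to the center coordinate (x, y) of its corresponding imaging area.
PositionTable = dict[tuple[int, int], tuple[float, float]]

def correct_positional_relationship(table: PositionTable,
                                    correction_tof_px: float,
                                    image_px_per_tof_px: float) -> PositionTable:
    """Return a corrected table in which the image-sensor coordinate paired
    with every ranging area is shifted upward by the number of image pixels
    equivalent to the correction amount (given in TOF-sensor pixels)."""
    dy = correction_tof_px * image_px_per_tof_px
    return {tof: (x, y - dy) for tof, (x, y) in table.items()}
```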
  • the imaging control unit 110 may determine the first ranging region corresponding to the first imaging region of the focus target among the plurality of ranging regions based on the corrected positional relationship.
  • the imaging control unit 110 performs focus control of the imaging device 100 based on the distance of the first ranging area measured by the TOF sensor 160.
  • the focus control is control that moves the focus lens so as to focus on the subject existing at the distance of the first ranging area.
  • the imaging control unit 110 may classify, based on the multiple distances measured by the TOF sensor 160, adjacent ranging areas whose distances belong to a predetermined distance range into group areas.
  • the area formed by adjacent ranging areas belonging to a predetermined distance range is an area in which a subject is likely to exist within that distance range.
  • the imaging control unit 110 may classify, based on the multiple distances measured by the TOF sensor 160, adjacent ranging areas whose distances belong to the same distance range into group areas. The imaging control unit 110 can correct the positional relationship for each group area.
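The grouping step can be read as connected-component labeling over the grid of ranging areas. The sketch below implements one plausible version with 4-neighbour adjacency and fixed-width distance bins; the bin width is an assumption, since the text only says "belonging to the same distance range":

```python
from collections import deque

def group_ranging_areas(dist: list[list[float]],
                        bin_m: float = 0.5) -> list[list[tuple[int, int]]]:
    """Classify adjacent ranging areas whose distances fall in the same
    distance range into group areas, via flood fill over the (e.g. 8x8) grid."""
    rows, cols = len(dist), len(dist[0])
    label = [[None] * cols for _ in range(rows)]
    groups: list[list[tuple[int, int]]] = []
    for r in range(rows):
        for c in range(cols):
            if label[r][c] is not None:
                continue
            bucket = int(dist[r][c] // bin_m)  # distance range of this seed
            members, queue = [], deque([(r, c)])
            label[r][c] = len(groups)
            while queue:
                y, x = queue.popleft()
                members.append((y, x))
                for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                    if (0 <= ny < rows and 0 <= nx < cols
                            and label[ny][nx] is None
                            and int(dist[ny][nx] // bin_m) == bucket):
                        label[ny][nx] = len(groups)
                        queue.append((ny, nx))
            groups.append(members)
    return groups
```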
  • the imaging control unit 110 may determine the group area corresponding to the first imaging area based on the corrected positional relationship.
  • the imaging control unit 110 may perform focus control of the imaging device 100 based on the distance of the group area, which is determined from the plurality of distances measured by the TOF sensor 160.
  • the imaging control unit 110 may perform the focus control of the imaging device 100 based on the distance of the distance measurement area located at the reference position of the group area among the plurality of distance measurement areas included in the group area.
  • the reference position is, for example, the position at half the maximum length of the group area in the column direction and half its maximum length in the row direction.
  • the imaging control unit 110 may perform focus control of the imaging device 100 based on the average value of the respective distances of the plurality of distance measurement areas included in the group area.
  • the imaging control unit 110 may classify the plurality of distance-measuring regions in the corrected positional relationship into group regions according to adjacent distance-measuring regions belonging to a predetermined distance range.
  • the imaging control unit 110 may, after correcting the positional relationship based on the distances of the plurality of ranging areas, classify adjacent ranging areas belonging to the same distance range in the corrected positional relationship into group areas.
  • the imaging control unit 110 may determine the group area corresponding to the first imaging area based on the positional relationship after the multiple ranging areas have been classified into group areas, and perform focus control of the imaging device 100 based on the distance of the group area, which is determined from the multiple distances measured by the TOF sensor 160.
  • the imaging control unit 110 may superimpose a frame indicating the position of the subject on the position corresponding to the group region of the captured image captured by the imaging device 100 and display it on the display unit 302 or the like.
  • the imaging device 100 may determine a plurality of adjacent ranging areas belonging to a predetermined distance range measured by the TOF sensor 160, superimpose a frame containing the determined plurality of ranging areas on the captured image as a frame indicating the position of the subject, and display it on the display unit 302 or the like as a preview image.
  • the imaging control unit 110 may classify the adjacent 7 pixels included in the distance range representing the distance L1 into the group area 531 corresponding to the subject 501.
  • the imaging control unit 110 may classify the adjacent 3 pixels included in the distance range representing the distance L2 into the group area 532 corresponding to the subject 502.
  • the memory 130 stores, as 1.9 pixels, the correction amount of the positional relationship corresponding to the angle of view β of the TOF sensor 160, the angle of view α of the imaging device 100, and the distance L1.
  • the memory 130 stores, as 1.2 pixels, the correction amount of the positional relationship corresponding to the angle of view β of the TOF sensor 160, the angle of view α of the imaging device 100, and the distance L2.
  • the imaging control unit 110 corrects the positional relationship by moving the position of the imaging area corresponding to the group area 531 upward by an amount equivalent to 1.9 pixels.
  • the imaging control unit 110 corrects the positional relationship by moving the position of the imaging area corresponding to the group area 532 upward by 1.2 pixels.
  • the imaging control unit 110 determines a group area 611 and a group area 622 including a plurality of adjacent distance measurement areas belonging to the same distance range from the plurality of distance measurement areas 602.
  • the imaging control unit 110 determines the correction amount corresponding to the distance between the group area 611 and the group area 622 based on predetermined correction conditions stored in the memory 130.
  • the imaging control unit 110 corrects the positional relationship by moving the positions of the imaging regions corresponding to the group region 611 and the group region 622 by a determined correction amount, respectively.
  • the memory 130 may store a table representing the correspondence relationship between the coordinate system associated with the light-receiving surface of the TOF sensor 160 and the coordinate system associated with the light-receiving surface of the image sensor 120 as a predetermined positional relationship. That is, the memory 130 may store the coordinate values of a plurality of imaging regions of the image sensor 120 corresponding to the coordinate values of the respective ranging regions of the TOF sensor 160. As shown in FIG. 7, the memory 130 may store the correction amount corresponding to the distance to the subject for each combination of the angle of view of the TOF sensor 160 and the angle of view of the imaging device 100 as a predetermined correction condition.
  • the imaging control unit 110 may refer to the predetermined positional relationship and predetermined correction conditions stored in the memory 130, and for each ranging area, move the position of the imaging area by a corresponding correction amount to correct the predetermined positional relationship.
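The correction condition of Fig. 7 can be held as a small per-distance table for each combination of angles of view. The values and the linear interpolation between tabulated distances below are assumptions made for illustration; the patent only states that a correction amount is stored per distance:

```python
# Assumed correction table for one (TOF angle of view, imaging angle of view)
# combination: (subject distance in meters, correction amount in TOF pixels).
CORRECTION_TABLE = [(0.5, 1.9), (1.0, 1.2), (2.0, 0.8), (4.0, 0.5)]

def correction_for(distance_m: float) -> float:
    """Correction amount for a measured distance, linearly interpolated
    between tabulated distances and clamped at the table ends."""
    pts = CORRECTION_TABLE
    if distance_m <= pts[0][0]:
        return pts[0][1]
    for (d0, c0), (d1, c1) in zip(pts, pts[1:]):
        if distance_m <= d1:
            t = (distance_m - d0) / (d1 - d0)
            return c0 + t * (c1 - c0)
    return pts[-1][1]
```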
  • FIG. 9 is a flowchart showing an example of a focus control procedure of the imaging control unit 110.
  • the imaging control unit 110 acquires the distance of each ranging area from the TOF sensor 160 (S100).
  • the imaging control unit 110 classifies adjacent ranging areas of two or more pixels whose distances belong to the same distance range into group areas (S102).
  • the imaging control unit 110 determines the distance for each group area.
  • the imaging control unit 110 determines, for example, the distance of the distance measurement area located at the reference position among the plurality of distance measurement areas included in the group area as the distance of the group area.
  • the imaging control unit 110 may determine the average distance of the respective distances of the plurality of ranging areas included in the group area as the distance of the group area.
  • the imaging control unit 110 may weight each of the multiple distance measurement areas included in the group area, and determine the average distance of each weighted distance as the distance of the group area.
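The three ways of determining a group area's distance listed above (reference position, plain average, weighted average) might look like the following sketch. Reading the "reference position" as the member nearest the center of the group's bounding box is an interpretation of the "half the maximum length" description, not a detail stated by the patent:

```python
def group_distance(dist, members, mode="reference", weights=None):
    """Distance of a group area from its member ranging areas.
    `members` is a list of (row, col) indices into the distance grid."""
    if mode == "average":
        return sum(dist[r][c] for r, c in members) / len(members)
    if mode == "weighted":  # weights: dict mapping (row, col) -> weight
        total = sum(weights[m] for m in members)
        return sum(weights[m] * dist[m[0]][m[1]] for m in members) / total
    # "reference": the member closest to the bounding-box center of the group.
    rows = [r for r, _ in members]
    cols = [c for _, c in members]
    cy, cx = (min(rows) + max(rows)) / 2, (min(cols) + max(cols)) / 2
    r, c = min(members, key=lambda m: (m[0] - cy) ** 2 + (m[1] - cx) ** 2)
    return dist[r][c]
```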
  • the imaging control unit 110 corrects the predetermined positional relationship between the imaging area of the image sensor 120 and the distance measuring area of the TOF sensor 160 for each group area based on the distance of the group area.
  • the imaging control unit 110 may correct the predetermined positional relationship according to a predetermined correction condition indicating the correction amount of each distance corresponding to the combination of the angle of view of the TOF sensor 160 and the angle of view of the imaging device 100.
  • the imaging control unit 110 acquires a captured image captured by the imaging device 100 (S106).
  • the imaging device 100 may display a preview image obtained by superimposing a frame indicating the location of the subject on the position of the captured image corresponding to the group area on the display unit 302 or the like.
  • the imaging control unit 110 determines the distance of the group area corresponding to the imaging area of the focus target touched by the user on the captured image displayed by the display unit 302.
  • the user can touch the frame including the desired subject in at least one frame displayed in the preview image.
  • the imaging control unit 110 performs an auto-focus operation, that is, focus control based on the determined distance of the group area (S108).
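Putting the S100-S108 steps of Fig. 9 together, the flow might be sketched as below. The `tof`, `camera`, and `display` objects are hypothetical interfaces standing in for the TOF sensor 160, the imaging device 100, and the display unit 302; they are not APIs defined by the patent:

```python
def focus_control(tof, camera, display):
    """Hedged end-to-end sketch of the Fig. 9 flow, reusing the helper
    functions sketched above."""
    dist = tof.read_distances()                                     # S100
    groups = [g for g in group_ranging_areas(dist) if len(g) >= 2]  # S102
    group_dists = [group_distance(dist, g) for g in groups]
    # Correct the predetermined positional relationship per group area,
    # using the correction amount looked up for that group's distance.
    corrections = [correction_for(d) for d in group_dists]
    image = camera.capture()                                        # S106
    # Show the preview with one frame per group area; the user touches the
    # frame containing the desired subject.
    touched = display.show_preview(image, groups, corrections)
    camera.autofocus_to(group_dists[touched])                       # S108
```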
  • FIG. 10 is a flowchart showing an example of a focus control processing procedure of the imaging control unit 110.
  • the imaging control unit 110 acquires the distance of each ranging area from the TOF sensor 160 (S200).
  • the imaging control unit 110 corrects the predetermined positional relationship between the imaging area of the image sensor 120 and the distance measuring area of the TOF sensor 160 based on the respective distances of the distance measuring area of the TOF sensor 160 (S202).
  • the imaging control unit 110 may correct the predetermined positional relationship based on the respective distances of the ranging areas of the TOF sensor 160, in accordance with predetermined correction conditions indicating the correction amount for each distance corresponding to the combination of the angle of view of the TOF sensor 160 and the angle of view of the imaging device 100.
  • the imaging control unit 110 classifies adjacent ranging areas of the TOF sensor 160 belonging to the same distance range in the corrected positional relationship into group areas (S204).
  • the imaging control unit 110 acquires a captured image captured by the imaging device 100 (S206).
  • the imaging device 100 may display a preview image obtained by superimposing a frame indicating the location of the subject on the position of the captured image corresponding to the group area on the display unit 302 or the like.
  • the imaging control unit 110 determines the distance of the group area corresponding to the imaging area of the focus target touched by the user on the captured image displayed by the display unit 302.
  • the imaging control unit 110 executes an autofocus operation based on the distance of the group area (S208).
  • the imaging device 100 can perform focus control based on the distance of the subject detected based on the distance measurement information of the TOF sensor 160 to focus on the desired subject.
  • when the imaging device 100 photographs a person wearing white clothes standing in front of a white wall, the imaging device 100 sometimes cannot identify the person from the captured image. Even in this case, the TOF sensor 160 can measure the distance to the person. Based on the distances measured by the TOF sensor 160, the imaging device 100 classifies the plurality of ranging areas of the TOF sensor 160 into group areas of adjacent ranging areas in the same distance range. In addition, the imaging device 100 corrects the positional relationship between positions on the light-receiving surface of the TOF sensor 160 and positions on the light-receiving surface of the image sensor 120 according to the distance of the group area.
  • the imaging device 100 displays a captured image obtained by superimposing a frame indicating the location of the person on the captured image at a location corresponding to the group area on the display unit 302 as a preview image.
  • the user touches the box.
  • the imaging device 100 determines the group area corresponding to the position of the touched frame based on the corrected positional relationship.
  • the imaging device 100 performs focus control based on the distance of the group area measured by the TOF sensor 160. As a result, even when the subject cannot be identified from the captured image, the imaging device 100 can reliably focus on the desired subject existing at an arbitrary distance.
  • in the present embodiment, an example has been described in which the optical axis of the image sensor 120 and the optical axis of the TOF sensor 160 are parallel.
  • the optical axis of the image sensor 120 and the optical axis of the TOF sensor 160 may not be parallel.
  • the viewing angle of the camera 100 may be smaller than the viewing angle of the TOF sensor 160.
  • FIG. 11 is an example of an external perspective view of another form of the imaging system 10. As shown in FIG. 11, the camera system 10 can be used in a state where a mobile terminal including a display such as a smartphone 400 is fixed to the side of the grip 300.
  • the aforementioned imaging device 100 may be mounted on a mobile body.
  • the camera device 100 can be mounted on an unmanned aerial vehicle (UAV) as shown in FIG. 12.
  • UAV 1000 may include a UAV main body 20, a universal joint 50, a plurality of camera devices 60, and a camera device 100.
  • the gimbal 50 and the camera device 100 are an example of a camera system.
  • UAV1000 is an example of a moving body propelled by a propulsion unit.
  • the concept of moving objects also includes flying objects such as airplanes that move in the air, vehicles that move on the ground, and ships that move on the water.
  • the UAV main body 20 includes a plurality of rotors. Multiple rotors are an example of a propulsion section.
  • the UAV main body 20 controls the rotation of a plurality of rotors to make the UAV 1000 fly.
  • the UAV body 20 uses, for example, four rotors to make the UAV1000 fly. The number of rotors is not limited to four.
  • UAV1000 can also be a fixed-wing aircraft without rotors.
  • the imaging device 100 is an imaging camera for imaging a subject included in a desired imaging range.
  • the universal joint 50 rotatably supports the imaging device 100.
  • the universal joint 50 is an example of a supporting mechanism.
  • the gimbal 50 uses an actuator to rotatably support the imaging device 100 with a pitch axis.
  • the universal joint 50 uses an actuator to further rotatably support the imaging device 100 around the roll axis and the yaw axis, respectively.
  • the gimbal 50 can change the posture of the camera device 100 by rotating the camera device 100 around at least one of the yaw axis, the pitch axis, and the roll axis.
  • the plurality of imaging devices 60 are sensing cameras that photograph the surroundings of the UAV 1000 in order to control the flight of the UAV 1000.
  • the two camera devices 60 can be installed on the nose of the UAV1000, that is, on the front side.
  • the other two camera devices 60 may be installed on the bottom surface of the UAV1000.
  • the two camera devices 60 on the front side may be paired to function as a so-called stereo camera.
  • the two imaging devices 60 on the bottom side may also be paired to function as a stereo camera.
  • the three-dimensional spatial data around the UAV 1000 can be generated based on the images captured by the plurality of imaging devices 60.
  • the number of imaging devices 60 included in the UAV 1000 is not limited to four. It is sufficient that the UAV1000 includes at least one camera device 60.
  • the UAV1000 may also include at least one camera 60 on the nose, tail, side, bottom and top surfaces of the UAV1000.
  • the viewing angle that can be set in the imaging device 60 may be larger than the viewing angle that can be set in the imaging device 100.
  • the imaging device 60 may also have a single focus lens or a fisheye lens.
  • the remote operation device 600 communicates with the UAV1000 to perform remote operation on the UAV1000.
  • the remote operation device 600 can wirelessly communicate with the UAV1000.
  • the remote operation device 600 transmits to the UAV 1000 instruction information indicating various commands related to the movement of the UAV 1000 such as ascending, descending, accelerating, decelerating, forwarding, retreating, and rotating.
  • the instruction information includes, for example, instruction information for raising the height of the UAV 1000.
  • the indication information can indicate the height at which the UAV1000 should be located.
  • the UAV 1000 moves to be located at the height indicated by the instruction information received from the remote operation device 600.
  • the instruction information may include an ascending instruction to raise the UAV1000. UAV1000 rises while receiving the rise command. When the height of the UAV1000 has reached the upper limit height, even if the ascending command is accepted, the UAV1000 can be restricted from ascending.
  • FIG. 13 shows an example of a computer 1200 that can embody various aspects of the present invention in whole or in part.
  • the program installed on the computer 1200 can cause the computer 1200 to function as operations associated with the device according to the embodiments of the present invention, or as one or more "units" of that device. Alternatively, the program can cause the computer 1200 to execute those operations or those one or more "units".
  • This program enables the computer 1200 to execute the process or stages of the process involved in the embodiment of the present invention.
  • Such a program may be executed by the CPU 1212, so that the computer 1200 executes designated operations associated with some or all of the blocks in the flowcharts and block diagrams described in this specification.
  • the computer 1200 of this embodiment includes a CPU 1212 and a RAM 1214, which are connected to each other through a host controller 1210.
  • the computer 1200 further includes a communication interface 1222 and an input/output unit, which are connected to the host controller 1210 via an input/output controller 1220.
  • the computer 1200 also includes a ROM 1230.
  • the CPU 1212 operates in accordance with programs stored in the ROM 1230 and RAM 1214 to control each unit.
  • the communication interface 1222 communicates with other electronic devices through a network.
  • the hard disk drive can store programs and data used by the CPU 1212 in the computer 1200.
  • the ROM 1230 stores therein a boot program executed by the computer 1200 during operation, and/or a program dependent on the hardware of the computer 1200.
  • the program is provided via a computer-readable recording medium such as a CD-ROM, USB memory, or IC card, or via a network.
  • the program is installed in RAM 1214 or ROM 1230 which is also an example of a computer-readable recording medium, and is executed by CPU 1212.
  • the information processing described in these programs is read by the computer 1200 and brings about cooperation between the programs and the various types of hardware resources described above.
  • an apparatus or method may be constituted by realizing operations or processing of information in accordance with the use of the computer 1200.
  • the CPU 1212 can execute a communication program loaded in the RAM 1214, and based on the processing described in the communication program, instructs the communication interface 1222 to perform communication processing.
  • under the control of the CPU 1212, the communication interface 1222 reads transmission data stored in a transmit buffer provided in a recording medium such as the RAM 1214 or USB memory, sends the read transmission data to the network, and writes data received from the network into a receive buffer provided in the recording medium, or the like.
  • the CPU 1212 can cause the RAM 1214 to read all or a required part of a file or database stored in an external recording medium such as USB memory, and perform various types of processing on the data in the RAM 1214. The CPU 1212 can then write the processed data back to the external recording medium.
  • the CPU 1212 can perform, on the data read from the RAM 1214, various types of processing specified by the instruction sequences of the programs described throughout this disclosure, including various types of operations, information processing, condition determination, conditional branching, unconditional branching, and information retrieval/replacement, and write the results back to the RAM 1214.
  • the CPU 1212 can search for information in files, databases, and the like in the recording medium. For example, when multiple entries, each having an attribute value of a first attribute associated with an attribute value of a second attribute, are stored in the recording medium, the CPU 1212 may retrieve from the multiple entries an entry whose attribute value of the first attribute matches a specified condition, read the attribute value of the second attribute stored in that entry, and thereby obtain the attribute value of the second attribute associated with the first attribute satisfying the predetermined condition.
  • the programs or software modules described above may be stored on the computer 1200 or on a computer-readable storage medium near the computer 1200.
  • a recording medium such as a hard disk or RAM provided in a server system connected to a dedicated communication network or the Internet can be used as a computer-readable storage medium so that the program can be provided to the computer 1200 via the network.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)
  • Automatic Focus Adjustment (AREA)
  • Focusing (AREA)

Abstract

Due to an error in the positional relationship between the position of a subject on the light-receiving surface of a TOF sensor and the position of that subject on the light-receiving surface of the image sensor of an imaging device, it is sometimes impossible to focus on the desired subject. Provided is a control device that controls an imaging device including a ranging sensor, which measures the distance to the subject associated with each of a plurality of ranging areas on the light-receiving surface of a light-receiving element, and an image sensor that photographs the subject. The control device includes a circuit configured to: correct, based on the multiple distances measured by the ranging sensor, a predetermined positional relationship between the plurality of ranging areas on the light-receiving surface of the light-receiving element and a plurality of imaging areas on the light-receiving surface of the image sensor; determine, based on the corrected positional relationship, a first ranging area corresponding to a first imaging area of the focus target; and perform focus control of the imaging device based on the distance of the first ranging area measured by the ranging sensor.

Description

Control device, imaging device, control method, and program

TECHNICAL FIELD

The present invention relates to a control device, an imaging device, a control method, and a program.

BACKGROUND ART

Patent Document 1 discloses calculating a distance value for each of M×N pixels based on a TOF algorithm and storing the distance information in a depth map memory.

[Patent Document 1] Japanese translation of PCT international application, Publication No. 2019-508717.
SUMMARY OF THE INVENTION

The positional relationship between the light-receiving surface of the TOF sensor and the light-receiving surface of the image sensor of the imaging device varies with the distance to the subject measured by the TOF sensor. Due to an error in the positional relationship between the position of the subject on the light-receiving surface of the TOF sensor and the position of that subject on the light-receiving surface of the image sensor of the imaging device, it is sometimes impossible to focus on the desired subject.

A control device according to one aspect of the present invention may be a control device that controls an imaging device including a ranging sensor, which measures the distance to the subject associated with each of a plurality of ranging areas on the light-receiving surface of a light-receiving element, and an image sensor that photographs the subject. The control device includes a circuit configured to correct, based on the multiple distances measured by the ranging sensor, a predetermined positional relationship between the plurality of ranging areas on the light-receiving surface of the light-receiving element and a plurality of imaging areas on the light-receiving surface of the image sensor. The circuit may be configured to determine, based on the corrected positional relationship, a first ranging area corresponding to a first imaging area of the focus target. The circuit may be configured to perform focus control of the imaging device based on the distance of the first ranging area measured by the ranging sensor.

The circuit may be configured to classify, based on the multiple distances measured by the ranging sensor, adjacent ranging areas whose distances belong to a predetermined distance range into group areas. The circuit may be configured to correct the positional relationship for each group area. The circuit may be configured to determine, based on the corrected positional relationship, the group area corresponding to the first imaging area. The circuit may be configured to perform focus control of the imaging device based on the distance of the group area, which is determined from the multiple distances measured by the ranging sensor.

The circuit may be configured to perform focus control of the imaging device based on the distance of the ranging area located at a reference position among the multiple ranging areas included in the group area.

The circuit may be configured to superimpose a frame indicating the position of the subject at the position corresponding to the group area on the image captured by the imaging device and display it on a display unit.

The circuit may be configured to classify the multiple ranging areas in the corrected positional relationship into group areas of adjacent ranging areas belonging to a predetermined distance range. The circuit may be configured to specify, based on the corrected positional relationship, the group area corresponding to the first imaging area. The circuit may be configured to perform focus control of the imaging device based on the distance of the group area, which is determined from the multiple distances measured by the ranging sensor.

The circuit may be configured to perform focus control of the imaging device based on the distance of the ranging area located at a reference position among the multiple ranging areas included in the group area.

The circuit may be configured to superimpose a frame indicating the position of the subject at the position corresponding to the group area on the image captured by the imaging device and display it on a display unit.

The predetermined positional relationship may be determined based on the positional relationship between the position of the optical-axis center on the light-receiving surface of the light-receiving element and the position of the optical-axis center on the light-receiving surface of the image sensor.

The predetermined positional relationship may indicate the correspondence between a first coordinate system associated with the light-receiving surface of the light-receiving element and a second coordinate system associated with the light-receiving surface of the image sensor.

The circuit may be configured to determine, based on predetermined correction conditions indicating the correction amount of the positional relationship corresponding to the angle of view of the ranging sensor, the angle of view of the imaging device, and the distance to the subject, the correction amounts corresponding to the multiple distances measured by the ranging sensor, and to correct the positional relationship based on the determined correction amounts.

An imaging device according to one aspect of the present invention may include the above-described control device, the ranging sensor, and the second image sensor.

A control method according to one aspect of the present invention may be a method of controlling an imaging device that includes a ranging sensor, which measures the distance to the subject associated with each of a plurality of ranging areas on the light-receiving surface of a light-receiving element, and an image sensor that photographs the subject. The control method may include a stage of correcting, based on the multiple distances measured by the ranging sensor, a predetermined positional relationship between the plurality of ranging areas on the light-receiving surface of the light-receiving element and a plurality of imaging areas on the light-receiving surface of the image sensor. The control method may include a stage of determining, based on the corrected positional relationship, a first ranging area corresponding to a first imaging area of the focus target. The control method may include a stage of performing focus control of the imaging device based on the distance of the first ranging area measured by the ranging sensor.

A program according to one aspect of the present invention may be a program for causing a computer to function as the above-described control device.

According to one aspect of the present invention, it is possible to prevent a situation in which the desired subject cannot be focused on due to an error in the positional relationship between the position of the subject on the light-receiving surface of the light-receiving element of the TOF sensor and the position of that subject on the light-receiving surface of the image sensor of the imaging device.

The above summary of the invention does not enumerate all of the necessary features of the present invention. Sub-combinations of these feature groups may also constitute inventions.
Brief description of the drawings
FIG. 1 is an external perspective view of an imaging system.
FIG. 2 is a diagram showing functional blocks of the imaging system.
FIG. 3 is a diagram showing one example of the positional relationship between the optical axis of the imaging device lens and the optical axis of the TOF sensor lens.
FIG. 4 is a diagram showing one example of the positional relationship between the plurality of imaging areas of the image sensor and the plurality of distance measurement areas of the TOF sensor.
FIG. 5 is a diagram showing one example of the positional relationship between the plurality of imaging areas of the image sensor and the plurality of distance measurement areas of the TOF sensor.
FIG. 6 is a diagram showing one example of a table representing the correspondence between the coordinate system associated with the light receiving surface of the TOF sensor and the coordinate system associated with the light receiving surface of the image sensor.
FIG. 7 is a diagram showing one example of a correction condition representing the relationship between the distance to the subject and the correction amount.
FIG. 8 is a diagram showing one example of a table representing the corrected correspondence between the coordinate system associated with the light receiving surface of the TOF sensor and the coordinate system associated with the light receiving surface of the image sensor.
FIG. 9 is a flowchart showing one example of the focus control procedure of the imaging control unit.
FIG. 10 is a flowchart showing one example of the focus control procedure of the imaging control unit.
FIG. 11 is an external perspective view showing another form of the imaging system.
FIG. 12 is a diagram showing one example of the appearance of an unmanned aerial vehicle and a remote operation device.
FIG. 13 is a diagram showing one example of a hardware configuration.
[Description of reference numerals]
10 Imaging system
20 UAV body
50 Gimbal
60 Imaging device
100 Imaging device
110 Imaging control unit
120 Image sensor
130 Memory
150 Lens control unit
152 Lens drive unit
154 Lens
160 TOF sensor
162 Light emitting unit
163 Light emitting element
164 Light receiving unit
165 Light receiving element
166 Light emission control unit
167 Light reception control unit
168 Memory
200 Support mechanism
201 Roll axis drive mechanism
202 Pitch axis drive mechanism
203 Yaw axis drive mechanism
204 Base
210 Attitude control unit
212 Angular velocity sensor
214 Acceleration sensor
300 Grip
301 Operation interface
302 Display unit
400 Smartphone
600 Remote operation device
1200 Computer
1210 Host controller
1212 CPU
1214 RAM
1220 Input/output controller
1222 Communication interface
1230 ROM
Detailed description of embodiments
The present invention will be described below through embodiments of the invention, but the following embodiments do not limit the invention according to the claims. Moreover, not all combinations of features described in the embodiments are necessarily essential to the solution of the invention. It will be apparent to those of ordinary skill in the art that various changes or improvements can be made to the following embodiments. It is apparent from the description of the claims that forms incorporating such changes or improvements can be included within the technical scope of the present invention.
The claims, the specification, the drawings, and the abstract contain matters subject to copyright protection. The copyright holder will not object to reproduction of these documents by anyone as they appear in the files or records of the patent office. In all other cases, however, all copyrights are reserved.
Various embodiments of the present invention may be described with reference to flowcharts and block diagrams, where a block may represent (1) a stage of a process in which an operation is performed or (2) a "unit" of a device that has the role of performing the operation. Specific stages and "units" may be implemented by programmable circuits and/or processors. Dedicated circuits may include digital and/or analog hardware circuits, and may include integrated circuits (ICs) and/or discrete circuits. Programmable circuits may include reconfigurable hardware circuits. Reconfigurable hardware circuits may include logical AND, logical OR, logical XOR, logical NAND, logical NOR, and other logical operations, as well as memory elements such as flip-flops and registers, field programmable gate arrays (FPGA), programmable logic arrays (PLA), and the like.
A computer-readable medium may include any tangible device capable of storing instructions to be executed by a suitable device. As a result, a computer-readable medium having instructions stored thereon constitutes a product that includes instructions which can be executed to create the means for performing the operations specified in the flowcharts or block diagrams. Examples of computer-readable media may include electronic storage media, magnetic storage media, optical storage media, electromagnetic storage media, semiconductor storage media, and the like. More specific examples of computer-readable media may include floppy (registered trademark) disks, diskettes, hard disks, random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), electrically erasable programmable read-only memory (EEPROM), static random access memory (SRAM), compact disc read-only memory (CD-ROM), digital versatile discs (DVD), Blu-ray (registered trademark) discs, memory sticks, integrated circuit cards, and the like.
Computer-readable instructions may include either source code or object code written in any combination of one or more programming languages. The source code or object code includes conventional procedural programming languages. Conventional procedural programming languages may be assembly instructions, instruction set architecture (ISA) instructions, machine instructions, machine-dependent instructions, microcode, firmware instructions, state-setting data, object-oriented programming languages such as Smalltalk (registered trademark), JAVA (registered trademark), and C++, and the "C" programming language or similar programming languages. Computer-readable instructions may be provided to a processor or programmable circuit of a general-purpose computer, a special-purpose computer, or another programmable data processing device, either locally or via a local area network (LAN) or a wide area network (WAN) such as the Internet. The processor or programmable circuit may execute the computer-readable instructions to create the means for performing the operations specified in the flowcharts or block diagrams. Examples of processors include computer processors, processing units, microprocessors, digital signal processors, controllers, microcontrollers, and the like.
FIG. 1 is one example of an external perspective view of the imaging system 10 according to the present embodiment. The imaging system 10 includes an imaging device 100, a support mechanism 200, and a grip 300. The support mechanism 200 uses actuators to rotatably support the imaging device 100 about each of a roll axis, a pitch axis, and a yaw axis. The support mechanism 200 can change or maintain the attitude of the imaging device 100 by rotating it about at least one of the roll axis, the pitch axis, and the yaw axis. The support mechanism 200 includes a roll axis drive mechanism 201, a pitch axis drive mechanism 202, and a yaw axis drive mechanism 203. The support mechanism 200 further includes a base 204 to which the yaw axis drive mechanism 203 is fixed. The grip 300 is fixed to the base 204. The grip 300 includes an operation interface 301 and a display unit 302. The imaging device 100 is fixed to the pitch axis drive mechanism 202.
The operation interface 301 receives instructions from the user for operating the imaging device 100 and the support mechanism 200. The operation interface 301 may include a shutter/recording button that instructs the imaging device 100 to shoot a still image or record video. The operation interface 301 may include a power/function button that instructs the imaging system 10 to be powered on or off and the imaging device 100 to switch between a still image shooting mode and a video shooting mode.
The display unit 302 can display images captured by the imaging device 100. The display unit 302 can display a menu screen for operating the imaging device 100 and the support mechanism 200. The display unit 302 may be a touch screen display that receives instructions for operating the imaging device 100 and the support mechanism 200.
The user holds the grip 300 and shoots still images or video with the imaging device 100.
FIG. 2 is a diagram showing functional blocks of the imaging system 10. The imaging device 100 includes an imaging control unit 110, an image sensor 120, a memory 130, a lens control unit 150, a lens drive unit 152, a plurality of lenses 154, and a TOF sensor 160.
The image sensor 120 may be composed of a CCD or a CMOS. The image sensor 120 is one example of an image sensor for imaging. The image sensor 120 outputs image data of the optical image formed through the plurality of lenses 154 to the imaging control unit 110. The imaging control unit 110 may be composed of a microprocessor such as a CPU or an MPU, a microcontroller such as an MCU, or the like.
The imaging control unit 110 follows operation instructions for the imaging device 100 from the grip 300, applies demosaicing to the image signal output from the image sensor 120, and generates image data. The imaging control unit 110 stores the image data in the memory 130. The imaging control unit 110 controls the TOF sensor 160. The imaging control unit 110 is one example of a circuit. The TOF sensor 160 is a time-of-flight sensor that measures the distance to an object. The imaging device 100 executes focus control by adjusting the position of the focus lens based on the distance measured by the TOF sensor 160.
The memory 130 may be a computer-readable storage medium and may include at least one of SRAM, DRAM, EPROM, EEPROM, and flash memories such as USB memory. The memory 130 stores the programs and the like that the imaging control unit 110 needs to control the image sensor 120 and other components. The memory 130 may be provided inside the housing of the imaging device 100. The grip 300 may include another memory for saving the image data captured by the imaging device 100. The grip 300 may include a slot from which the memory can be removed from the housing of the grip 300.
The plurality of lenses 154 may function as a zoom lens, a varifocal lens, and a focus lens. At least some or all of the plurality of lenses 154 are configured to be movable along the optical axis. The lens control unit 150 drives the lens drive unit 152 in accordance with lens control instructions from the imaging control unit 110 to move one or more lenses 154 in the optical axis direction. The lens control instructions are, for example, zoom control instructions and focus control instructions. The lens drive unit 152 may include a voice coil motor (VCM) that moves at least some or all of the plurality of lenses 154 in the optical axis direction. The lens drive unit 152 may include an electric motor such as a DC motor, a coreless motor, or an ultrasonic motor. The lens drive unit 152 may transmit power from the electric motor to at least some or all of the plurality of lenses 154 via mechanism members such as a cam ring and a guide shaft, to move at least some or all of the plurality of lenses 154 along the optical axis. In the present embodiment, an example in which the plurality of lenses 154 are integrated with the imaging device 100 is described. However, the plurality of lenses 154 may be interchangeable lenses and may be constituted separately from the imaging device 100.
The imaging device 100 further includes an attitude control unit 210, an angular velocity sensor 212, and an acceleration sensor 214. The angular velocity sensor 212 detects the angular velocity of the imaging device 100. The angular velocity sensor 212 detects the angular velocities of the imaging device 100 about each of the roll axis, the pitch axis, and the yaw axis. The attitude control unit 210 acquires angular velocity information related to the angular velocity of the imaging device 100 from the angular velocity sensor 212. The angular velocity information may indicate the angular velocities of the imaging device 100 about each of the roll axis, the pitch axis, and the yaw axis. The attitude control unit 210 acquires acceleration information related to the acceleration of the imaging device 100 from the acceleration sensor 214. The acceleration information may also indicate the acceleration of the imaging device 100 in each direction of the roll axis, the pitch axis, and the yaw axis.
The angular velocity sensor 212 and the acceleration sensor 214 may be provided inside the housing that houses the image sensor 120, the lenses 154, and other components. In the present embodiment, a configuration in which the imaging device 100 and the support mechanism 200 are integrated is described. However, the support mechanism 200 may include a base on which the imaging device 100 is detachably fixed. In that case, the angular velocity sensor 212 and the acceleration sensor 214 may be provided outside the housing of the imaging device 100, for example on the base.
The attitude control unit 210 controls the support mechanism 200 based on the angular velocity information and the acceleration information to maintain or change the attitude of the imaging device 100. The attitude control unit 210 controls the support mechanism 200 in accordance with the operation mode of the support mechanism 200 for controlling the attitude of the imaging device 100, to maintain or change the attitude of the imaging device 100.
The operation modes include a mode in which at least one of the roll axis drive mechanism 201, the pitch axis drive mechanism 202, and the yaw axis drive mechanism 203 of the support mechanism 200 operates so that the attitude change of the imaging device 100 follows the attitude change of the base 204 of the support mechanism 200. The operation modes include a mode in which each of the roll axis drive mechanism 201, the pitch axis drive mechanism 202, and the yaw axis drive mechanism 203 of the support mechanism 200 operates so that the attitude change of the imaging device 100 follows the attitude change of the base 204 of the support mechanism 200. The operation modes include a mode in which each of the pitch axis drive mechanism 202 and the yaw axis drive mechanism 203 operates so that the attitude change of the imaging device 100 follows the attitude change of the base 204 of the support mechanism 200. The operation modes include a mode in which only the yaw axis drive mechanism 203 operates so that the attitude change of the imaging device 100 follows the attitude change of the base 204 of the support mechanism 200.
The operation modes may include an FPV (First Person View) mode in which the support mechanism 200 operates so that the attitude change of the imaging device 100 follows the attitude change of the base 204 of the support mechanism 200, and a fixed mode in which the support mechanism 200 operates to maintain the attitude of the imaging device 100.
The FPV mode is a mode in which at least one of the roll axis drive mechanism 201, the pitch axis drive mechanism 202, and the yaw axis drive mechanism 203 operates so that the attitude change of the imaging device 100 follows the attitude change of the base 204 of the support mechanism 200. The fixed mode is a mode in which at least one of the roll axis drive mechanism 201, the pitch axis drive mechanism 202, and the yaw axis drive mechanism 203 operates to maintain the current attitude of the imaging device 100.
The TOF sensor 160 includes a light emitting unit 162, a light receiving unit 164, a light emission control unit 166, a light reception control unit 167, and a memory 168. The TOF sensor 160 is one example of a distance measurement sensor.
The light emitting unit 162 includes at least one light emitting element 163. The light emitting element 163 is a device, such as an LED or a laser, that repeatedly emits pulsed light modulated at high speed. The light emitting element 163 may emit infrared pulsed light. The light emission control unit 166 controls the light emission of the light emitting element 163. The light emission control unit 166 can control the pulse width of the pulsed light emitted from the light emitting element 163.
The light receiving unit 164 includes a plurality of light receiving elements 165 and measures the distances to the subjects associated with each of the plurality of distance measurement areas. The light receiving unit 164 is one example of a sensor for distance measurement. The plurality of light receiving elements 165 correspond to the respective distance measurement areas. The light receiving element 165 repeatedly receives the reflected light of the pulsed light from the object. The light reception control unit 167 controls the light reception of the light receiving element 165. The light reception control unit 167 measures the distances to the subjects associated with each of the plurality of distance measurement areas based on the amount of reflected light repeatedly received by the light receiving element 165 during a predetermined light reception period. The light reception control unit 167 can measure the distance to a subject by determining the phase difference between the pulsed light and the reflected light based on the amount of reflected light repeatedly received by the light receiving element 165 during the predetermined light reception period. The light receiving unit 164 can also measure the distance to a subject by reading the frequency change of the reflected wave. This is called the FMCW (Frequency Modulated Continuous Wave) method.
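The phase-difference reading described above can be made concrete with a small numeric sketch. The following is a minimal illustration, assuming a four-phase (0/90/180/270 degree) demodulation scheme at a single modulation frequency; neither detail is specified in this disclosure, and the function and variable names are hypothetical, not the API of the TOF sensor 160.

```python
import math

C = 299_792_458.0  # speed of light, m/s

def itof_distance(q0, q90, q180, q270, f_mod):
    """Distance from four phase-shifted charge samples of the reflected light.

    q0..q270: charge accumulated in demodulation windows shifted by
    0/90/180/270 degrees relative to the emitted modulated light.
    f_mod: modulation frequency in Hz.
    """
    # Phase delay of the reflection relative to the emission, in [0, 2*pi).
    phase = math.atan2(q90 - q270, q0 - q180) % (2.0 * math.pi)
    # Light travels to the subject and back, hence the extra factor of 2;
    # distances beyond c / (2 * f_mod) alias back into this range.
    return C * phase / (4.0 * math.pi * f_mod)
```

With one such computation per light receiving element 165, an 8x8 grid of elements yields one distance per distance measurement area.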
The memory 168 may be a computer-readable recording medium and may include at least one of SRAM, DRAM, EPROM, and EEPROM. The memory 168 stores the programs that the light emission control unit 166 needs to control the light emitting unit 162, the programs that the light reception control unit 167 needs to control the light receiving unit 164, and the like.
The TOF sensor 160 can measure the distances to the subjects associated with each of the plurality of distance measurement areas corresponding to the number of pixels of the light receiving unit 164. However, the number of pixels of the light receiving unit 164 is usually smaller than the number of pixels of the imaging image sensor 120 of the imaging device 100. Moreover, the positional relationship between the light receiving surface of the light receiving unit 164 of the TOF sensor 160 and the light receiving surface of the image sensor 120 of the imaging device 100 varies with the distance to the subject measured by the TOF sensor 160. Therefore, even if a subject is detected based on distance information from the TOF sensor 160 and the imaging device 100 executes focus control based on the distance to that subject measured by the TOF sensor 160, it may not be possible to focus on the subject intended by the user.
FIG. 3 shows the positional relationship between the position of a subject on the light receiving surface of the light receiving unit 164 of the TOF sensor 160 and the position of that subject on the light receiving surface of the image sensor 120 of the imaging device 100. FIG. 3 shows a situation in which the imaging device 100 is imaging a subject (Obj1) 501 at a distance L1 from the imaging device 100 and a subject (Obj2) 502 at a distance L2 from the imaging device 100.
The angle of view of the imaging device 100 is θ, and the angle of view of the TOF sensor 160 is φ. The distance between the optical axis P1 of the imaging device 100 and the optical axis P2 of the TOF sensor 160 is h. The distance measurement areas of the TOF sensor 160 are 8 pixels × 8 pixels, that is, 64 areas. Here, one distance measurement area corresponds to one pixel of the light receiving unit 164.
Area 511 shows the positional relationship between the distance measurement areas and the imaging areas of the imaging device 100 at the distance L1 from the imaging device 100. Area 512 shows the positional relationship between the distance measurement areas and the imaging areas of the imaging device 100 at the distance L2 from the imaging device 100.
The spacing between the optical axis P1 of the image sensor 120 and the optical axis P2 of the TOF sensor 160 is h. The optical axis P2 of the TOF sensor 160 passes through the center of the plurality of distance measurement areas of the TOF sensor 160. However, at the distance L1, the optical axis P1 of the imaging device 100 is 1.9 pixels away from the center of the plurality of distance measurement areas of the TOF sensor 160. On the other hand, at the distance L2, the optical axis P1 of the imaging device 100 is 1.2 pixels away from the center of the plurality of distance measurement areas of the TOF sensor 160. That is, the distance between the optical axis P1 of the imaging device 100 and the center of the plurality of distance measurement areas of the TOF sensor 160 differs depending on the distance between the imaging device 100 and the subject. This phenomenon is known as parallax.
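The pixel offsets above follow from simple pinhole geometry. Below is a rough sketch, assuming the two optical axes are parallel (as in this embodiment) and the ranging grid spans the TOF angle of view uniformly: the offset in ranging pixels is the baseline h divided by the scene width one pixel covers at the subject distance. The numeric inputs are made up for illustration and do not reproduce the 1.9-pixel and 1.2-pixel figures of FIG. 3.

```python
import math

def parallax_offset_pixels(h, distance, tof_fov_deg, n_pixels):
    """Offset, in ranging-area pixels, between the imaging optical axis P1 and
    the center of the TOF ranging grid, for a subject at the given distance.

    h: baseline between the two optical axes, in meters.
    tof_fov_deg: full angle of view of the TOF sensor, in degrees.
    n_pixels: number of ranging areas across the TOF field of view.
    """
    # Scene width covered by the TOF sensor at this distance.
    field_width = 2.0 * distance * math.tan(math.radians(tof_fov_deg) / 2.0)
    # One ranging pixel covers field_width / n_pixels meters, so the baseline
    # spans this many pixels; the offset shrinks as the distance grows.
    return h * n_pixels / field_width

# Illustrative inputs only: a 3 cm baseline with a 70-degree, 8-pixel TOF grid
# gives a larger offset at 0.9 m than at 1.4 m, mirroring the FIG. 3 trend.
print(parallax_offset_pixels(0.03, 0.9, 70.0, 8))   # ~0.19 pixels
print(parallax_offset_pixels(0.03, 1.4, 70.0, 8))   # ~0.12 pixels
```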
The imaging device 100 according to the present embodiment corrects the positional relationship between the light receiving surface of the image sensor 120 and the light receiving surface of the light receiving unit 164 of the TOF sensor 160 based on the distance to the subject measured by the TOF sensor. Furthermore, in accordance with the corrected positional relationship, the imaging device 100 determines the distance measurement area of the TOF sensor 160 corresponding to the position of the subject in the image captured by the imaging device 100, and executes focus control based on the distance measured by the TOF sensor 160 for the determined distance measurement area.
The imaging control unit 110 acquires the distances, measured by the TOF sensor 160, to the subjects associated with each of the plurality of distance measurement areas. Based on the plurality of distances, the imaging control unit 110 corrects the predetermined positional relationship between the plurality of distance measurement areas on the light receiving surface of the light receiving unit 164 and the plurality of imaging areas on the light receiving surface of the image sensor 120.
The imaging control unit 110 may determine the correction amounts corresponding to the plurality of distances measured by the TOF sensor 160 based on a predetermined correction condition representing correction amounts of the positional relationship corresponding to the angle of view of the TOF sensor 160, the angle of view of the imaging device 100, and the distance to the subject, and correct the positional relationship based on the determined correction amounts. The imaging control unit 110 may correct the positional relationship by shifting the position on the light receiving surface of the TOF sensor 160 that corresponds to a position on the light receiving surface of the image sensor 120 by the number of pixels corresponding to the correction amount.
The predetermined positional relationship may be determined based on the positional relationship between the position of the optical axis center on the light receiving surface of the light receiving unit 164 and the position of the optical axis center on the light receiving surface of the image sensor 120. The predetermined positional relationship may represent the correspondence between the first coordinate system associated with the light receiving surface of the light receiving unit 164 and the second coordinate system associated with the light receiving surface of the image sensor 120.
The imaging control unit 110 may determine, based on the corrected positional relationship, the first distance measurement area corresponding to the first imaging area containing the focus target among the plurality of distance measurement areas. The imaging control unit 110 executes focus control of the imaging device 100 based on the distance of the first distance measurement area measured by the TOF sensor 160. Focus control is control that moves the focus lens so as to focus on the subject present at the distance of the first distance measurement area. The imaging control unit 110 may classify, based on the plurality of distances measured by the TOF sensor 160, the plurality of distance measurement areas into group areas of adjacent distance measurement areas belonging to a predetermined distance range. An area covered by adjacent distance measurement areas belonging to a predetermined distance range is an area in which a subject is highly likely to be present within that distance range. With a reference distance L, the predetermined distance range may be L±αL (0<α<1). For example, when L is 1 m and α=0.1, the predetermined distance range may be 0.9 m to 1.1 m. The imaging control unit 110 may classify, based on the plurality of distances measured by the TOF sensor 160, the plurality of distance measurement areas into group areas of adjacent distance measurement areas belonging to the same distance range. The imaging control unit 110 may correct the positional relationship for each group area.
The imaging control unit 110 may determine, based on the corrected positional relationship, the group area corresponding to the first imaging area. The imaging control unit 110 may execute focus control of the imaging device 100 based on the distance of the group area, where the distance of the group area is based on the plurality of distances measured by the TOF sensor 160. The imaging control unit 110 may execute focus control of the imaging device 100 based on the distance of the distance measurement area located at the reference position of the group area among the plurality of distance measurement areas included in the group area. The reference position is, for example, half of the maximum length of the group area in the column direction and half of its maximum length in the row direction. It need not be half; the reference position may also be set by weighting the row direction and the column direction separately. The imaging control unit 110 may execute focus control of the imaging device 100 based on the average of the distances of the plurality of distance measurement areas included in the group area.
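One way to realize the grouping and group-distance rules just described is a connected-component pass over the ranging grid. The sketch below is a minimal illustration, assuming 4-neighbor adjacency and seeding each band with the first unvisited area's distance as the reference distance L; the disclosure fixes only the L±αL banding rule and the reference-position/average options, so everything else is one plausible choice with hypothetical names.

```python
from collections import deque

def group_ranging_areas(dist, alpha=0.1):
    """Classify adjacent ranging areas whose distances fall within
    L*(1-alpha)..L*(1+alpha) of a reference distance L into group areas.

    dist: 2D list of per-area distances measured by the TOF sensor.
    Returns a list of (members, group_distance) pairs, where members is a
    list of (row, col) ranging-area indices.
    """
    rows, cols = len(dist), len(dist[0])
    seen = [[False] * cols for _ in range(rows)]
    groups = []
    for r in range(rows):
        for c in range(cols):
            if seen[r][c]:
                continue
            ref = dist[r][c]  # reference distance L for this band
            lo, hi = ref * (1 - alpha), ref * (1 + alpha)
            members, queue = [], deque([(r, c)])
            seen[r][c] = True
            while queue:
                cr, cc = queue.popleft()
                members.append((cr, cc))
                for nr, nc in ((cr - 1, cc), (cr + 1, cc),
                               (cr, cc - 1), (cr, cc + 1)):
                    if (0 <= nr < rows and 0 <= nc < cols
                            and not seen[nr][nc] and lo <= dist[nr][nc] <= hi):
                        seen[nr][nc] = True
                        queue.append((nr, nc))
            if len(members) < 2:
                continue  # FIG. 9 (S102) groups two or more adjacent pixels
            # Group distance: the area at the reference position, i.e. the
            # midpoint of the group's row and column extents; fall back to the
            # mean of member distances if that cell lies outside the group.
            rs = [m[0] for m in members]
            cs = [m[1] for m in members]
            ref_cell = ((min(rs) + max(rs)) // 2, (min(cs) + max(cs)) // 2)
            if ref_cell in members:
                gdist = dist[ref_cell[0]][ref_cell[1]]
            else:
                gdist = sum(dist[m[0]][m[1]] for m in members) / len(members)
            groups.append((members, gdist))
    return groups
```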
The imaging control unit 110 may also, after correcting the positional relationship based on the distances of the plurality of distance measurement areas, classify the plurality of distance measurement areas in the corrected positional relationship into group areas of adjacent distance measurement areas belonging to a predetermined distance range. The imaging control unit 110 may also, after correcting the positional relationship based on the distances of the plurality of distance measurement areas, classify the plurality of distance measurement areas in the corrected positional relationship into group areas of adjacent distance measurement areas belonging to the same distance range. The imaging control unit 110 may determine the group area corresponding to the first imaging area based on the positional relationship obtained after classifying the plurality of distance measurement areas into group areas, and execute focus control of the imaging device 100 based on the distance of the group area, where the distance of the group area is based on the plurality of distances measured by the TOF sensor 160.
The imaging control unit 110 may superimpose a frame indicating the position of the subject at the position, corresponding to the group area, of the image captured by the imaging device 100, and display it on the display unit 302 or the like.
For example, the imaging device 100 may determine a plurality of adjacent distance measurement areas belonging to a predetermined distance range measured by the TOF sensor 160, superimpose a frame containing the determined plurality of distance measurement areas on the captured image as a frame indicating the position where a subject is present, and display the result as a preview image on the display unit 302 or the like.
In FIG. 3, the imaging control unit 110 may classify the 7 adjacent pixels included in the distance range representing the distance L1 as the group area 531 corresponding to the subject 501. The imaging control unit 110 may classify the 3 adjacent pixels included in the distance range representing the distance L2 as the group area 532 corresponding to the subject 502. The memory 130 stores 1.9 pixels as the correction amount of the positional relationship corresponding to the angle of view φ of the TOF sensor 160, the angle of view θ of the imaging device 100, and the distance L1. The memory 130 also stores 1.2 pixels as the correction amount of the positional relationship corresponding to the angle of view φ of the TOF sensor 160, the angle of view θ of the imaging device 100, and the distance L2. The imaging control unit 110 corrects the positional relationship by moving the position of the imaging area corresponding to the group area 531 upward by an amount equivalent to 1.9 pixels. The imaging control unit 110 corrects the positional relationship by moving the position of the imaging area corresponding to the group area 532 upward by an amount equivalent to 1.2 pixels.
FIGS. 4 and 5 show one example of the positional relationship between the plurality of imaging areas 601 of the image sensor 120 and the plurality of distance measurement areas 602 of the TOF sensor 160. As shown in FIG. 4, the imaging control unit 110 determines, from the plurality of distance measurement areas 602, a group area 611 and a group area 622, each containing a plurality of adjacent distance measurement areas belonging to the same distance range. The imaging control unit 110 determines the correction amounts corresponding to the distances of the group area 611 and the group area 622 based on the predetermined correction condition stored in the memory 130. As shown in FIG. 5, the imaging control unit 110 corrects the positional relationship by moving the positions of the imaging areas corresponding to the group area 611 and the group area 622 by the respective determined correction amounts.
As shown in FIG. 6, the memory 130 may store, as the predetermined positional relationship, a table representing the correspondence between the coordinate system associated with the light receiving surface of the TOF sensor 160 and the coordinate system associated with the light receiving surface of the image sensor 120. That is, the memory 130 may store the coordinate values of the plurality of imaging areas of the image sensor 120 corresponding to the coordinate values of the respective distance measurement areas of the TOF sensor 160. As shown in FIG. 7, the memory 130 may store, as the predetermined correction condition, the correction amounts corresponding to the distance to the subject for each combination of the angle of view of the TOF sensor 160 and the angle of view of the imaging device 100.
As shown in FIG. 8, the imaging control unit 110 may refer to the predetermined positional relationship and the predetermined correction condition stored in the memory 130 and correct the predetermined positional relationship by moving the position of the imaging area by the corresponding correction amount for each distance measurement area.
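FIGS. 6 to 8 suggest a straightforward table-driven implementation. Below is a minimal sketch, assuming the FIG. 7 condition is sampled at a few distances per angle-of-view pair and linearly interpolated in between; the table contents, key format, and function names are hypothetical, not values from the disclosure.

```python
import bisect

# Hypothetical data in the spirit of FIG. 7 (not values from the disclosure):
# per (TOF angle of view, imaging angle of view) pair, correction amounts in
# ranging pixels sampled at a few subject distances in meters.
CORRECTION_TABLE = {
    (70.0, 60.0): [(0.5, 2.6), (0.9, 1.9), (1.4, 1.2), (3.0, 0.6)],
}

def correction_pixels(tof_fov, cam_fov, distance):
    """Correction amount for a distance, linearly interpolated between the
    sampled distances of the matching angle-of-view pair."""
    samples = CORRECTION_TABLE[(tof_fov, cam_fov)]
    ds = [d for d, _ in samples]
    i = bisect.bisect_left(ds, distance)
    if i == 0:
        return samples[0][1]
    if i == len(samples):
        return samples[-1][1]
    (d0, c0), (d1, c1) = samples[i - 1], samples[i]
    return c0 + (distance - d0) / (d1 - d0) * (c1 - c0)

def corrected_mapping(mapping, areas, correction):
    """Shift the image-sensor coordinates mapped to the given ranging areas
    upward by the correction amount, as in FIG. 8.

    mapping: dict (tof_row, tof_col) -> (img_x, img_y), as in FIG. 6.
    areas: iterable of (tof_row, tof_col) keys to correct.
    """
    out = dict(mapping)
    for key in areas:
        x, y = mapping[key]
        out[key] = (x, y - correction)  # upward shift in image coordinates
    return out
```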
FIG. 9 is a flowchart showing one example of the focus control procedure of the imaging control unit 110. The imaging control unit 110 acquires the distance of each distance measurement area from the TOF sensor 160 (S100). The imaging control unit 110 selects adjacent distance measurement areas of two or more pixels whose distances belong to the same distance range and classifies them into group areas (S102). The imaging control unit 110 determines a distance for each group area. For example, the imaging control unit 110 determines the distance of the distance measurement area located at the reference position among the plurality of distance measurement areas included in the group area as the distance of the group area. The imaging control unit 110 may determine the average of the distances of the plurality of distance measurement areas included in the group area as the distance of the group area. The imaging control unit 110 may weight each of the plurality of distance measurement areas included in the group area and determine the average of the weighted distances as the distance of the group area.
Based on the distance of each group area, the imaging control unit 110 corrects, for each group area, the predetermined positional relationship between the imaging areas of the image sensor 120 and the distance measurement areas of the TOF sensor 160. The imaging control unit 110 may correct the predetermined positional relationship in accordance with a predetermined correction condition representing the correction amount for each distance corresponding to the combination of the angle of view of the TOF sensor 160 and the angle of view of the imaging device 100.
The imaging control unit 110 acquires a captured image captured by the imaging device 100 (S106). The imaging device 100 may display, on the display unit 302 or the like, a preview image obtained by superimposing frames indicating the positions where subjects are present at the positions of the captured image corresponding to the group areas. The imaging control unit 110 determines the distance of the group area corresponding to the imaging area of the focus target touched by the user on the captured image displayed on the display unit 302. The user can touch, among the at least one frame displayed in the preview image, the frame that includes the desired subject. The imaging control unit 110 executes the autofocus operation, that is, focus control, based on the determined distance of the group area (S108).
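Putting the pieces together, the S100 to S108 flow can be sketched as below, reusing the hypothetical helpers from the earlier snippets (group_ranging_areas, correction_pixels, corrected_mapping). The tof and camera objects and the touch-to-group test are likewise stand-ins, since the disclosure does not name these interfaces.

```python
def touch_hits_group(touch_xy, members, mapping, cell=80):
    """True if the touched image coordinate lands on any ranging area of the
    group; cell is an assumed per-area footprint in image pixels."""
    tx, ty = touch_xy
    return any(abs(tx - mapping[m][0]) <= cell / 2
               and abs(ty - mapping[m][1]) <= cell / 2 for m in members)

def focus_control(tof, camera, mapping, tof_fov, cam_fov, touch_xy):
    """FIG. 9 flow: group first, correct per group, then focus (S100 to S108)."""
    dist = tof.read_distances()                        # S100
    groups = group_ranging_areas(dist)                 # S102
    corrected = dict(mapping)
    for members, gdist in groups:                      # per-group correction
        amount = correction_pixels(tof_fov, cam_fov, gdist)
        corrected = corrected_mapping(corrected, members, amount)
    camera.capture()                                   # S106: shown as a preview
    # with one frame superimposed per group area; the user touches one frame.
    target = next((gdist for members, gdist in groups
                   if touch_hits_group(touch_xy, members, corrected)), None)
    if target is not None:
        camera.drive_focus_to(target)                  # S108: autofocus
```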
FIG. 10 is a flowchart showing one example of the focus control processing procedure of the imaging control unit 110.
The imaging control unit 110 acquires the distance of each distance measurement area from the TOF sensor 160 (S200). Based on the distances of the distance measurement areas of the TOF sensor 160, the imaging control unit 110 corrects the predetermined positional relationship between the imaging areas of the image sensor 120 and the distance measurement areas of the TOF sensor 160 (S202). The imaging control unit 110 may correct the predetermined positional relationship based on the distances of the distance measurement areas of the TOF sensor 160 in accordance with a predetermined correction condition representing the correction amount for each distance corresponding to the combination of the angle of view of the TOF sensor 160 and the angle of view of the imaging device 100.
The imaging control unit 110 classifies the distance measurement areas of the TOF sensor 160 in the corrected positional relationship into group areas of distance measurement areas belonging to the same distance range (S204). The imaging control unit 110 acquires a captured image captured by the imaging device 100 (S206). The imaging device 100 may display, on the display unit 302 or the like, a preview image obtained by superimposing frames indicating the positions where subjects are present at the positions of the captured image corresponding to the group areas. The imaging control unit 110 determines the distance of the group area corresponding to the imaging area of the focus target touched by the user on the captured image displayed on the display unit 302. The imaging control unit 110 executes the autofocus operation based on the distance of the group area (S208).
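The FIG. 10 variant differs from FIG. 9 only in ordering: each ranging area is corrected with its own measured distance before grouping. A minimal sketch under the same assumptions and with the same hypothetical helpers as the previous snippet:

```python
def focus_control_v2(tof, camera, mapping, tof_fov, cam_fov, touch_xy):
    """FIG. 10 flow: correct every ranging area first, then group (S200 to S208)."""
    dist = tof.read_distances()                        # S200
    corrected = dict(mapping)                          # S202: per-area shift
    for (r, c) in mapping:
        amount = correction_pixels(tof_fov, cam_fov, dist[r][c])
        corrected = corrected_mapping(corrected, [(r, c)], amount)
    groups = group_ranging_areas(dist)                 # S204: group afterwards
    camera.capture()                                   # S206: preview with frames
    target = next((gdist for members, gdist in groups
                   if touch_hits_group(touch_xy, members, corrected)), None)
    if target is not None:
        camera.drive_focus_to(target)                  # S208: autofocus
```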
According to the present embodiment, taking into account that the positional relationship between the light receiving surface of the TOF sensor 160 and the light receiving surface of the image sensor 120 of the imaging device 100 varies with the distance to the subject measured by the TOF sensor 160, the positional relationship between the position of a subject on the light receiving surface of the TOF sensor 160 and the position of that subject on the light receiving surface of the image sensor 120 of the imaging device 100 is corrected. Therefore, the imaging device 100 can execute focus control based on the distance of the subject detected from the distance measurement information of the TOF sensor 160 and thereby focus on the desired subject.
For example, when the imaging device 100 images a person wearing white clothes standing in front of a white wall, the imaging device 100 sometimes cannot identify the person from the captured image. Even in such a case, the TOF sensor 160 can measure the distance of the person. Based on the distances measured by the TOF sensor 160, the imaging device 100 classifies the plurality of distance measurement areas of the TOF sensor 160 into group areas of adjacent distance measurement areas in the same distance range. Furthermore, the imaging device 100 corrects the positional relationship between positions on the light receiving surface of the TOF sensor 160 and positions on the light receiving surface of the image sensor 120 according to the distance of the group area. The imaging device 100 displays, as a preview image on the display unit 302, the captured image obtained by superimposing a frame indicating the position where the person is present at the position of the captured image corresponding to the group area. The user touches the frame. The imaging device 100 determines the group area corresponding to the position of the touched frame based on the corrected positional relationship. The imaging device 100 executes focus control based on the distance of that group area measured by the TOF sensor 160. In this way, even when the subject cannot be identified from the captured image, the imaging device 100 can reliably focus on the desired subject present at an arbitrary distance.
In the present embodiment, an example in which the optical axis of the image sensor 120 and the optical axis of the TOF sensor 160 are parallel has been described. However, the optical axis of the image sensor 120 and the optical axis of the TOF sensor 160 may be non-parallel. The angle of view of the imaging device 100 may be smaller than the angle of view of the TOF sensor 160.
FIG. 11 is one example of an external perspective view of another form of the imaging system 10. As shown in FIG. 11, the imaging system 10 can be used with a mobile terminal including a display, such as a smartphone 400, fixed to the side of the grip 300.
The imaging device 100 described above may be mounted on a mobile object. The imaging device 100 may be mounted on an unmanned aerial vehicle (UAV) as shown in FIG. 12. The UAV 1000 may include a UAV body 20, a gimbal 50, a plurality of imaging devices 60, and the imaging device 100. The gimbal 50 and the imaging device 100 are one example of an imaging system. The UAV 1000 is one example of a mobile object propelled by a propulsion unit. In addition to UAVs, the concept of a mobile object includes flying objects such as airplanes moving in the air, vehicles moving on the ground, ships moving on water, and the like.
The UAV body 20 includes a plurality of rotors. The plurality of rotors are one example of a propulsion unit. The UAV body 20 makes the UAV 1000 fly by controlling the rotation of the plurality of rotors. The UAV body 20 makes the UAV 1000 fly using, for example, four rotors. The number of rotors is not limited to four. The UAV 1000 may also be a fixed-wing aircraft without rotors.
The imaging device 100 is an imaging camera that images subjects included in a desired imaging range. The gimbal 50 rotatably supports the imaging device 100. The gimbal 50 is one example of a support mechanism. For example, the gimbal 50 uses an actuator to rotatably support the imaging device 100 about the pitch axis. The gimbal 50 uses actuators to further rotatably support the imaging device 100 about each of the roll axis and the yaw axis. The gimbal 50 can change the attitude of the imaging device 100 by rotating it about at least one of the yaw axis, the pitch axis, and the roll axis.
The plurality of imaging devices 60 are sensing cameras that image the surroundings of the UAV 1000 in order to control its flight. Two imaging devices 60 may be provided at the nose, that is, the front, of the UAV 1000. The other two imaging devices 60 may be provided on the bottom surface of the UAV 1000. The two imaging devices 60 on the front side may be paired to function as a so-called stereo camera. The two imaging devices 60 on the bottom side may also be paired to function as a stereo camera. Three-dimensional spatial data around the UAV 1000 can be generated from the images captured by the plurality of imaging devices 60. The number of imaging devices 60 included in the UAV 1000 is not limited to four. It suffices for the UAV 1000 to include at least one imaging device 60. The UAV 1000 may include at least one imaging device 60 on each of the nose, the tail, the sides, the bottom surface, and the top surface of the UAV 1000. The angle of view settable in the imaging devices 60 may be larger than the angle of view settable in the imaging device 100. The imaging devices 60 may also have a single focus lens or a fisheye lens.
The remote operation device 600 communicates with the UAV 1000 to operate it remotely. The remote operation device 600 may communicate with the UAV 1000 wirelessly. The remote operation device 600 transmits to the UAV 1000 instruction information indicating various instructions related to the movement of the UAV 1000, such as ascent, descent, acceleration, deceleration, forward movement, backward movement, and rotation. The instruction information includes, for example, instruction information for raising the altitude of the UAV 1000. The instruction information may indicate the altitude at which the UAV 1000 should be located. The UAV 1000 moves so as to be located at the altitude indicated by the instruction information received from the remote operation device 600. The instruction information may include an ascent instruction to make the UAV 1000 ascend. The UAV 1000 ascends while the ascent instruction is being received. When the altitude of the UAV 1000 has reached its upper limit, ascent of the UAV 1000 may be restricted even if the ascent instruction is received.
FIG. 13 shows one example of a computer 1200 in which aspects of the present invention may be wholly or partly embodied. A program installed on the computer 1200 can cause the computer 1200 to function as an operation associated with a device according to an embodiment of the present invention or as one or more "units" of that device. Alternatively, the program can cause the computer 1200 to execute that operation or those one or more "units". The program can cause the computer 1200 to execute a process according to an embodiment of the present invention or stages of that process. Such a program may be executed by the CPU 1212 to cause the computer 1200 to execute the specified operations associated with some or all of the blocks in the flowcharts and block diagrams described in this specification.
The computer 1200 of the present embodiment includes the CPU 1212 and a RAM 1214, which are connected to each other by a host controller 1210. The computer 1200 also includes a communication interface 1222 and an input/output unit, which are connected to the host controller 1210 via an input/output controller 1220. The computer 1200 also includes a ROM 1230. The CPU 1212 operates in accordance with programs stored in the ROM 1230 and the RAM 1214, thereby controlling each unit.
The communication interface 1222 communicates with other electronic devices through a network. A hard disk drive may store the programs and data used by the CPU 1212 in the computer 1200. The ROM 1230 stores a boot program or the like executed by the computer 1200 at startup, and/or programs that depend on the hardware of the computer 1200. The programs are provided through a computer-readable recording medium such as a CD-ROM, a USB memory, or an IC card, or through a network. The programs are installed in the RAM 1214 or the ROM 1230, which are also examples of computer-readable recording media, and are executed by the CPU 1212. The information processing described in these programs is read by the computer 1200 and brings about cooperation between the programs and the various types of hardware resources described above. A device or method may be constituted by realizing the operation or processing of information in accordance with the use of the computer 1200.
For example, when communication is executed between the computer 1200 and an external device, the CPU 1212 may execute a communication program loaded in the RAM 1214 and, based on the processing described in the communication program, instruct the communication interface 1222 to perform communication processing. Under the control of the CPU 1212, the communication interface 1222 reads transmission data stored in a transmission buffer provided in a recording medium such as the RAM 1214 or a USB memory, transmits the read transmission data to the network, and writes reception data received from the network into a reception buffer or the like provided in the recording medium.
In addition, the CPU 1212 may cause the RAM 1214 to read all or a necessary portion of a file or database stored in an external recording medium such as a USB memory, and perform various types of processing on the data in the RAM 1214. The CPU 1212 may then write the processed data back to the external recording medium.
Various types of information such as various types of programs, data, tables, and databases may be stored in recording media and subjected to information processing. For data read from the RAM 1214, the CPU 1212 may perform the various types of processing described throughout this disclosure and specified by the instruction sequences of programs, including various types of operations, information processing, condition determination, conditional branching, unconditional branching, and information retrieval/replacement, and write the results back to the RAM 1214. In addition, the CPU 1212 may retrieve information in files, databases, and the like in the recording medium. For example, when multiple entries each having an attribute value of a first attribute associated with an attribute value of a second attribute are stored in the recording medium, the CPU 1212 may retrieve, from the multiple entries, an entry matching a condition that specifies the attribute value of the first attribute, read the attribute value of the second attribute stored in that entry, and thereby obtain the attribute value of the second attribute associated with the first attribute satisfying the predetermined condition.
The programs or software modules described above may be stored on the computer 1200 or on a computer-readable storage medium near the computer 1200. A recording medium such as a hard disk or RAM provided in a server system connected to a dedicated communication network or the Internet can also be used as a computer-readable storage medium, so that the programs can be provided to the computer 1200 via the network.
The present invention has been described above using embodiments, but the technical scope of the present invention is not limited to the scope described in the above embodiments. It will be apparent to those of ordinary skill in the art that various changes or improvements can be made to the above embodiments. It is apparent from the description of the claims that forms incorporating such changes or improvements can be included within the technical scope of the present invention.
It should be noted that the order of execution of processes such as actions, sequences, steps, and stages in the devices, systems, programs, and methods shown in the claims, the specification, and the drawings may be realized in any order, as long as "before", "prior to", and the like are not explicitly indicated and as long as the output of a preceding process is not used in a subsequent process. Even where the operational flows in the claims, the specification, and the drawings are described using "first", "next", and the like for convenience, this does not mean that implementation in that order is essential.

Claims (13)

  1. A control device that controls an imaging device including a distance measurement sensor, which measures the distances to the subjects associated with each of a plurality of distance measurement areas on a light receiving surface of a light receiving element, and an image sensor that captures images of the subjects, characterized in that it comprises a circuit configured to:
    correct, based on the plurality of distances measured by the distance measurement sensor, a predetermined positional relationship between the plurality of distance measurement areas on the light receiving surface of the light receiving element and a plurality of imaging areas on a light receiving surface of the image sensor;
    determine, based on the corrected positional relationship, a first distance measurement area corresponding to a first imaging area containing a focus target; and
    execute focus control of the imaging device based on the distance of the first distance measurement area measured by the distance measurement sensor.
  2. The control device according to claim 1, characterized in that the circuit is configured to:
    classify, based on the plurality of distances measured by the distance measurement sensor, the plurality of distance measurement areas into group areas of adjacent distance measurement areas belonging to a predetermined distance range;
    correct the positional relationship for each of the group areas;
    determine, based on the corrected positional relationship, the group area corresponding to the first imaging area; and
    execute focus control of the imaging device based on the distance of the group area, the distance of the group area being based on the plurality of distances measured by the distance measurement sensor.
  3. The control device according to claim 2, characterized in that the circuit is configured to:
    execute focus control of the imaging device based on the distance of the distance measurement area located at a reference position among the plurality of distance measurement areas included in the group area.
  4. The control device according to claim 2, characterized in that the circuit is configured to: superimpose a frame indicating the position of a subject at the position, corresponding to the group area, of an image captured by the imaging device, and display it on a display unit.
  5. The control device according to claim 1, characterized in that the circuit is configured to:
    classify the plurality of distance measurement areas in the corrected positional relationship into group areas of adjacent distance measurement areas belonging to a predetermined distance range;
    determine, based on the corrected positional relationship, the group area corresponding to the first imaging area; and
    execute focus control of the imaging device based on the distance of the group area, the distance of the group area being based on the plurality of distances measured by the distance measurement sensor.
  6. The control device according to claim 5, characterized in that the circuit is configured to:
    execute focus control of the imaging device based on the distance of the distance measurement area located at a reference position among the plurality of distance measurement areas included in the group area.
  7. The control device according to claim 5, characterized in that the circuit is configured to: superimpose a frame indicating the position of a subject at the position, corresponding to the group area, of an image captured by the imaging device, and display it on a display unit.
  8. The control device according to claim 1, characterized in that the predetermined positional relationship is determined based on the positional relationship between the position of the optical axis center on the light receiving surface of the light receiving element and the position of the optical axis center on the light receiving surface of the image sensor.
  9. The control device according to claim 1, characterized in that the predetermined positional relationship represents the correspondence between a first coordinate system associated with the light receiving surface of the light receiving element and a second coordinate system associated with the light receiving surface of the image sensor.
  10. The control device according to claim 1, characterized in that
    the circuit determines the correction amounts corresponding to the plurality of distances measured by the distance measurement sensor based on a predetermined correction condition representing correction amounts of the positional relationship corresponding to the angle of view of the distance measurement sensor, the angle of view of the imaging device, and the distance to a subject, and corrects the positional relationship based on the determined correction amounts.
  11. An imaging device, characterized in that it comprises: the control device according to any one of claims 1 to 10;
    the distance measurement sensor; and
    the image sensor.
  12. A control method for controlling an imaging device including a distance measurement sensor, which measures the distances to the subjects associated with each of a plurality of distance measurement areas on a light receiving surface of a light receiving element, and an image sensor that captures images of the subjects, characterized in that it comprises the following stages:
    correcting, based on the plurality of distances measured by the distance measurement sensor, a predetermined positional relationship between the plurality of distance measurement areas on the light receiving surface of the light receiving element and a plurality of imaging areas on a light receiving surface of the image sensor;
    determining, based on the corrected positional relationship, a first distance measurement area corresponding to a first imaging area containing a focus target; and
    executing focus control of the imaging device based on the distance of the first distance measurement area measured by the distance measurement sensor.
  13. A program, characterized in that it is used to cause a computer to function as the control device according to any one of claims 1 to 10.
PCT/CN2020/113963 2019-09-20 2020-09-08 Control device, imaging device, control method, and program WO2021052216A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202080003318.9A 2019-09-20 2020-09-08 Control device, imaging device, control method, and program
US17/683,163 US20220188993A1 (en) 2019-09-20 2022-02-28 Control apparatus, photographing apparatus, control method, and program

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2019171306A 2019-09-20 2019-09-20 Control device, imaging device, control method, and program
JP2019-171306 2019-09-20

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/683,163 Continuation US20220188993A1 (en) 2019-09-20 2022-02-28 Control apparatus, photographing apparatus, control method, and program

Publications (1)

Publication Number Publication Date
WO2021052216A1 true WO2021052216A1 (zh) 2021-03-25

Family

ID=74876295

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/113963 WO2021052216A1 (zh) 2019-09-20 2020-09-08 控制装置、摄像装置、控制方法以及程序

Country Status (2)

Country Link
JP (1) JP7173657B2 (zh)
WO (1) WO2021052216A1 (zh)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004361431A * 2003-05-30 2004-12-24 Minolta Co Ltd Imaging device
JP2005300925A * 2004-04-12 2005-10-27 Konica Minolta Photo Imaging Inc Imaging device
CN102025914A * 2009-09-10 2011-04-20 Canon Inc. Image pickup apparatus and ranging method therefor
CN102445688A * 2010-08-20 2012-05-09 Denso International America Inc. Combined time-of-flight and image sensor system
CN107924040A * 2016-02-19 2018-04-17 Sony Corp Image pickup device, image pickup control method, and program

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3956317B2 (ja) 2006-06-01 2007-08-08 Sharp Corp Mobile terminal with camera having adjustable focusing area
JP5272551B2 (ja) * 2008-07-15 2013-08-28 Ricoh Co Ltd Imaging apparatus and method
JP5444651B2 (ja) * 2008-07-16 2014-03-19 Casio Computer Co Ltd Camera device, image capturing method thereof, and image capturing control program
CN107005646B (zh) 2014-12-02 2020-05-19 Olympus Corp Focus control device, endoscope device, and control method of focus control device
JP6838417B2 (ja) 2016-03-16 2021-03-03 Ricoh Imaging Co Ltd Imaging device

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004361431A (ja) * 2003-05-30 2004-12-24 Minolta Co Ltd 撮像装置
JP2005300925A (ja) * 2004-04-12 2005-10-27 Konica Minolta Photo Imaging Inc 撮像装置
CN102025914A (zh) * 2009-09-10 2011-04-20 佳能株式会社 摄像设备及其测距方法
CN102445688A (zh) * 2010-08-20 2012-05-09 电装国际美国公司 组合的飞行时间和图像传感器系统
CN107924040A (zh) * 2016-02-19 2018-04-17 索尼公司 图像拾取装置、图像拾取控制方法和程序

Also Published As

Publication number Publication date
JP2021047367A (ja) 2021-03-25
JP7173657B2 (ja) 2022-11-16

Similar Documents

Publication Publication Date Title
JP2019216343A (ja) Determination device, moving object, determination method, and program
JP6630939B2 (ja) Control device, imaging device, moving object, control method, and program
US20220188993A1 (en) Control apparatus, photographing apparatus, control method, and program
WO2021013143A1 (zh) Device, imaging device, moving object, method, and program
CN112335227A (zh) Control device, imaging system, control method, and program
WO2021031833A1 (zh) Control device, imaging system, control method, and program
JP2021085893A (ja) Control device, imaging device, control method, and program
WO2020216037A1 (zh) Control device, imaging device, moving object, control method, and program
JP6543875B2 (ja) Control device, imaging device, flying object, control method, and program
WO2021052216A1 (zh) Control device, imaging device, control method, and program
WO2019174343A1 (zh) Moving object detection device, control device, moving object, moving object detection method, and program
WO2021031840A1 (zh) Device, imaging device, moving object, method, and program
US11125970B2 (en) Method for lens autofocusing and imaging device thereof
WO2020108284A1 (zh) Determination device, moving object, determination method, and program
WO2019223614A1 (zh) Control device, imaging device, moving object, control method, and program
JP6805448B2 (ja) Control device, imaging system, moving object, control method, and program
WO2022001561A1 (zh) Control device, imaging device, control method, and program
US20220070362A1 (en) Control apparatuses, photographing apparatuses, movable objects, control methods, and programs
JP7043706B2 (ja) Control device, imaging system, control method, and program
WO2021249245A1 (zh) Device, imaging device, imaging system, and moving object
JP7043707B2 (ja) Scene recognition device, imaging device, scene recognition method, and program
WO2020244440A1 (zh) Control device, imaging device, imaging system, control method, and program
WO2019080805A1 (zh) Control device, imaging device, flying object, control method, and program
WO2020156085A1 (zh) Image processing device, imaging device, unmanned aerial vehicle, image processing method, and program
WO2020216057A1 (zh) Control device, imaging device, moving object, control method, and program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20866521

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20866521

Country of ref document: EP

Kind code of ref document: A1