WO2021249245A1 - Device, camera device, camera system, and movable member - Google Patents

Device, camera device, camera system, and movable member

Info

Publication number
WO2021249245A1
Authority
WO
WIPO (PCT)
Prior art keywords
receiving element
lens
imaging device
light receiving
imaging
Prior art date
Application number
PCT/CN2021/097705
Other languages
French (fr)
Chinese (zh)
Inventor
本庄谦一
高宫诚
Original Assignee
深圳市大疆创新科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳市大疆创新科技有限公司 filed Critical 深圳市大疆创新科技有限公司
Priority to CN202180006043.9A priority Critical patent/CN114600024A/en
Publication of WO2021249245A1 publication Critical patent/WO2021249245A1/en

Links

Images

Classifications

    • G: PHYSICS
    • G02: OPTICS
    • G02B: OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 7/00: Mountings, adjusting means, or light-tight connections, for optical elements
    • G02B 7/28: Systems for automatic generation of focusing signals
    • G: PHYSICS
    • G03: PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B: APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B 13/00: Viewfinders; Focusing aids for cameras; Means for focusing for cameras; Autofocus systems for cameras
    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00: Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot

Definitions

  • The present invention relates to a device, an imaging device, an imaging system, and a movable body.
  • Patent Document 1 discloses a camera system that calculates a distance value for each of M × N pixels based on a TOF algorithm.
  • Patent Document 1: Japanese Patent Application Publication (Translation of PCT Application) No. 2019-508717
  • An apparatus according to one aspect includes a plurality of light-receiving elements including a first light-receiving element, the first light-receiving element being configured to receive a light beam that has passed through a first lens and been split by pupil division.
  • The device includes a first image sensor arranged to deviate from the focal position of the first lens by a predetermined distance or more along the optical-axis direction of the first lens.
  • The device includes a circuit configured to generate, based on a signal generated by the first light-receiving element, information used for control of a second imaging device that includes a second image sensor.
  • The circuit may be configured to generate information used for focus adjustment of the second imaging device based on the signal generated by the first light-receiving element.
  • The first image sensor may be disposed closer to the first lens than the position of the focal point of the first lens.
  • The first light-receiving elements may be arranged at a preset interval.
  • The first image sensor may be arranged to deviate from the focal position of the first lens along the optical-axis direction of the first lens so that the spread width of the image of an infinitely distant subject on the image plane of the first image sensor is greater than or equal to the preset interval.
  • The first lens may be a fixed-focus lens.
  • The first image sensor may include the first light-receiving element and a second light-receiving element, the second light-receiving element being configured to receive a light beam in the wavelength range of visible light and generate an image signal.
  • The circuit may be configured to generate information used for exposure control and color adjustment of the second imaging device based on the image signal generated by the second light-receiving element.
  • The first image sensor may include a third light-receiving element configured to receive a light beam in a wavelength range of invisible light and generate an image signal.
  • The circuit may be configured to generate an image in the wavelength range of invisible light based on the image signal generated by the third light-receiving element.
  • The imaging range of a first imaging device that includes the first lens and the first image sensor may be larger than the imaging range of the second imaging device.
  • The imaging range of the first imaging device may be set to include the imaging range of the second imaging device.
  • The circuit may be configured to generate distance-measurement information for the imaging range of the first imaging device based on the signal generated by the first light-receiving element.
  • The circuit may be configured such that, when the imaging range of the second imaging device is changed, focus adjustment of the second imaging device is performed based on the distance-measurement information for the changed imaging range of the second imaging device.
  • An imaging device according to one aspect includes the above-mentioned device and the first lens.
  • An imaging system according to one aspect includes the above-mentioned imaging device and the second imaging device.
  • A movable body according to one aspect carries the above-mentioned imaging device and moves.
  • The movable body may include the second imaging device and a support mechanism that supports the second imaging device while being able to control its posture.
  • According to the above aspect, information on the distance to a subject can be acquired more accurately.
  • FIG. 1 shows an example of the appearance of an unmanned aerial vehicle (UAV) 10 and a remote operation device 12.
  • FIG. 2 shows an example of the functional blocks of the UAV 10.
  • FIG. 3 shows an example of functional blocks of the imaging device 60.
  • FIG. 4 schematically shows the arrangement pattern of the light receiving elements of the image sensor 320.
  • FIG. 5 schematically shows the positional relationship between the focal point FP of the lens 300 and the image sensor 320.
  • FIG. 6 schematically shows the MTF characteristics obtained by the lens 300 and the image sensor 320.
  • FIG. 7 schematically shows the output waveform of the light receiving element 410 assuming that the image surface 322 of the image sensor 320 is located at the position of the focal point FP.
  • FIG. 8 schematically shows the output waveform of the light receiving element 410 when the image plane 322 of the image sensor 320 is located at a position away from the focal point FP.
  • FIG. 9 schematically shows the imaging range 910 of the imaging device 60 and the imaging range 920 of the imaging device 100.
  • FIG. 10 schematically shows a distance-measuring operation when a variable-focus lens is used.
  • FIG. 11 shows an example of a computer 1200.
  • The blocks may represent (1) a stage of a process in which an operation is performed or (2) a "part" of a device that has the role of performing the operation.
  • Specific stages and "parts" can be implemented by programmable circuits and/or processors.
  • Dedicated circuits may include digital and/or analog hardware circuits, and may include integrated circuits (ICs) and/or discrete circuits.
  • Programmable circuits may include reconfigurable hardware circuits.
  • Reconfigurable hardware circuits can include logic operations such as logical AND, logical OR, logical XOR, logical NAND, and logical NOR, as well as memory elements such as flip-flops, registers, field-programmable gate arrays (FPGAs), and programmable logic arrays (PLAs).
  • The computer-readable medium may include any tangible device that can store instructions to be executed by a suitable device.
  • The computer-readable medium on which the instructions are stored constitutes a product including instructions that can be executed to create means for performing the operations specified in the flowcharts or block diagrams.
  • The computer-readable medium may include electronic storage media, magnetic storage media, optical storage media, electromagnetic storage media, semiconductor storage media, and the like.
  • The computer-readable medium may include a floppy (registered trademark) disk, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an electrically erasable programmable read-only memory (EEPROM), a static random access memory (SRAM), a compact disc read-only memory (CD-ROM), a digital versatile disc (DVD), a Blu-ray (RTM) disc, a memory stick, an integrated circuit card, and the like.
  • The computer-readable instructions may include either source code or object code described in any combination of one or more programming languages.
  • The source code or object code may be described in object-oriented or conventional procedural programming languages.
  • These may be assembly instructions, instruction-set-architecture (ISA) instructions, machine instructions, machine-dependent instructions, microcode, firmware instructions, state-setting data, object-oriented programming languages such as Smalltalk (registered trademark), JAVA (registered trademark), and C++, or conventional procedural programming languages such as the "C" programming language or similar programming languages.
  • The computer-readable instructions may be provided to a processor or programmable circuit of a general-purpose computer, a special-purpose computer, or another programmable data-processing device, either locally or via a local area network (LAN) or a wide area network (WAN) such as the Internet.
  • The processor or programmable circuit can execute the computer-readable instructions to create means for performing the operations specified in the flowcharts or block diagrams.
  • Examples of processors include computer processors, processing units, microprocessors, digital signal processors, controllers, microcontrollers, and so on.
  • FIG. 1 shows an example of the appearance of an unmanned aerial vehicle (UAV) 10 and a remote operation device 12.
  • The UAV 10 includes a UAV main body 20, a gimbal 50, a plurality of imaging devices 60, and an imaging device 100.
  • The gimbal 50, the imaging device 100, and the imaging devices 60 are an example of an imaging system.
  • The UAV 10 is an example of a movable body. A movable body is a concept that includes flying objects moving in the air, vehicles moving on the ground, ships moving on the water, and the like. A flying object moving in the air is a concept that includes not only the UAV but also other aircraft, airships, helicopters, and the like that move in the air.
  • The UAV main body 20 includes a plurality of rotors. The plurality of rotors is an example of a propulsion section.
  • The UAV main body 20 makes the UAV 10 fly by controlling the rotation of the plurality of rotors.
  • The UAV main body 20 uses, for example, four rotors to make the UAV 10 fly.
  • The number of rotors is not limited to four.
  • The UAV 10 may also be a fixed-wing aircraft without rotors.
  • The imaging device 100 is a camera for image capture that captures a subject included in a desired imaging range.
  • The gimbal 50 rotatably supports the imaging device 100.
  • The gimbal 50 is an example of a support mechanism that supports the imaging device 100 while being able to control its posture.
  • The gimbal 50 uses an actuator to rotatably support the imaging device 100 about the pitch axis.
  • The gimbal 50 uses actuators to further rotatably support the imaging device 100 about the roll axis and the yaw axis, respectively.
  • The gimbal 50 can change the posture of the imaging device 100 by rotating the imaging device 100 about at least one of the yaw axis, the pitch axis, and the roll axis.
  • The imaging devices 60 are sensing cameras that photograph the surroundings of the UAV 10 in order to control the flight of the UAV 10.
  • Two imaging devices 60 can be installed on the nose, that is, the front side, of the UAV 10.
  • Two other imaging devices 60 may be provided on the bottom surface of the UAV 10.
  • The two imaging devices 60 on the front side can be paired to function as a so-called stereo camera.
  • The two imaging devices 60 on the bottom side may also be paired to function as a stereo camera.
  • Three-dimensional spatial data around the UAV 10 can thereby be generated.
  • At least one of the plurality of imaging devices 60 is an imaging device for acquiring information used for control of the imaging device 100, including focus control and exposure control.
  • The number of imaging devices 60 included in the UAV 10 is not limited.
  • It is sufficient that the UAV 10 includes at least one imaging device 60.
  • The UAV 10 may include at least one imaging device 60 on each of the nose, tail, sides, bottom surface, and top surface of the UAV 10.
  • The angle of view that can be set in the imaging devices 60 may be larger than the angle of view that can be set in the imaging device 100.
  • At least one of the plurality of imaging devices 60 may have a single-focus lens or a fisheye lens.
  • The remote operation device 12 communicates with the UAV 10 to operate the UAV 10 remotely.
  • The remote operation device 12 can communicate with the UAV 10 wirelessly.
  • The remote operation device 12 transmits to the UAV 10 instruction information indicating various commands related to the movement of the UAV 10, such as ascending, descending, accelerating, decelerating, advancing, retreating, and rotating.
  • The instruction information includes, for example, instruction information for raising the altitude of the UAV 10.
  • The instruction information may indicate the altitude at which the UAV 10 should be located.
  • The UAV 10 moves so as to be located at the altitude indicated by the instruction information received from the remote operation device 12.
  • The instruction information may include an ascent command to raise the UAV 10. The UAV 10 ascends while it is receiving the ascent command. When the altitude of the UAV 10 has reached an upper-limit altitude, the ascent of the UAV 10 can be restricted even if the ascent command is accepted.
  • The UAV 10 includes a UAV control unit 30, a memory 37, a communication interface 36, a propulsion unit 40, a GPS receiver 41, an inertial measurement unit 42 (IMU 42), a magnetic compass 43, a barometric altimeter 44, a temperature sensor 45, a humidity sensor 46, the gimbal 50, the imaging devices 60, and the imaging device 100.
  • The communication interface 36 communicates with other devices such as the remote operation device 12.
  • The communication interface 36 can receive instruction information including various commands for the UAV control unit 30 from the remote operation device 12.
  • The memory 37 stores programs and the like required for the UAV control unit 30 to control the propulsion unit 40, the GPS receiver 41, the IMU 42, the magnetic compass 43, the barometric altimeter 44, the temperature sensor 45, the humidity sensor 46, the gimbal 50, the imaging devices 60, and the imaging device 100.
  • The memory 37 may be a computer-readable recording medium, and may include at least one of SRAM, DRAM, EPROM, EEPROM, flash memory such as USB memory, and a solid-state drive (SSD).
  • The memory 37 may be provided inside the UAV main body 20, and may be configured to be detachable from the UAV main body 20.
  • The UAV control unit 30 controls the flight and imaging of the UAV 10 in accordance with a program stored in the memory 37.
  • The UAV control unit 30 may be composed of a microprocessor such as a CPU or an MPU, a microcontroller such as an MCU, or the like.
  • The UAV control unit 30 controls the flight and imaging of the UAV 10 in accordance with instructions received from the remote operation device 12 via the communication interface 36.
  • The propulsion unit 40 propels the UAV 10.
  • The propulsion unit 40 includes a plurality of rotors and a plurality of drive motors that rotate the plurality of rotors.
  • The propulsion unit 40 rotates the plurality of rotors via the plurality of drive motors in accordance with a command from the UAV control unit 30 to cause the UAV 10 to fly.
  • The GPS receiver 41 receives a plurality of signals indicating times transmitted from a plurality of GPS satellites.
  • The GPS receiver 41 calculates the position (latitude and longitude) of the GPS receiver 41, that is, the position (latitude and longitude) of the UAV 10, based on the received signals.
  • The IMU 42 detects the attitude of the UAV 10.
  • The IMU 42 detects, as the attitude of the UAV 10, the accelerations in the three axial directions of front-rear, left-right, and up-down, and the angular velocities about the three axes of pitch, roll, and yaw.
  • The magnetic compass 43 detects the orientation of the nose of the UAV 10.
  • The barometric altimeter 44 detects the flight altitude of the UAV 10.
  • The barometric altimeter 44 detects the air pressure around the UAV 10 and converts the detected air pressure into an altitude to detect the altitude.
  • The temperature sensor 45 detects the temperature around the UAV 10.
  • The humidity sensor 46 detects the humidity around the UAV 10.
  • The imaging device 100 includes an imaging unit 102 and a lens unit 200.
  • The lens unit 200 is an example of a lens device.
  • The imaging unit 102 includes an image sensor 120, a control unit 110, a memory 130, and a distance-measuring sensor.
  • The image sensor 120 may be composed of a CCD or a CMOS sensor.
  • The image sensor 120 captures optical images formed through the plurality of lenses 210 and outputs the captured image data to the control unit 110.
  • The control unit 110 generates a recording image by image processing based on the pixel information read from the image sensor 120 and stores it in the memory 130.
  • The control unit 110 may be composed of a microprocessor such as a CPU or an MPU, a microcontroller such as an MCU, or the like.
  • The control unit 110 can control the imaging device 100 in accordance with an operation command for the imaging device 100 from the UAV control unit 30.
  • The control unit 110 may be at least a part of the circuit in the present invention.
  • The memory 130 may be a computer-readable recording medium, and may include at least one of SRAM, DRAM, EPROM, EEPROM, flash memory such as USB memory, and a solid-state drive (SSD).
  • The memory 130 stores programs and the like necessary for the control unit 110 to control the image sensor 120 and the like.
  • The memory 130 may be provided inside the housing of the imaging device 100.
  • The memory 130 may be configured to be detachable from the housing of the imaging device 100.
  • The distance-measuring sensor measures the distance to a subject.
  • The distance-measuring sensor can measure the distance to a specified subject.
  • The distance-measuring sensor may be an infrared sensor, an ultrasonic sensor, a stereo camera, a TOF (time-of-flight) sensor, or the like.
  • The lens unit 200 includes a plurality of lenses 210, a plurality of lens driving units 212, and a lens control unit 220.
  • The plurality of lenses 210 can function as a zoom lens, a varifocal lens, and a focus lens. At least some or all of the plurality of lenses 210 are configured to be movable along the optical axis.
  • The lens unit 200 may be an interchangeable lens provided so as to be detachable from the imaging unit 102.
  • The lens driving unit 212 moves at least some or all of the plurality of lenses 210 along the optical axis via a mechanism member such as a cam ring.
  • The lens driving unit 212 may include an actuator.
  • The actuator may include a stepping motor.
  • The lens control unit 220 drives the lens driving unit 212 in accordance with a lens control command from the imaging unit 102 to move one or more lenses 210 in the optical-axis direction via the mechanism member.
  • The lens control commands are, for example, a zoom control command and a focus control command.
  • The lens unit 200 further includes a memory 222 and a position sensor 214.
  • The lens control unit 220 controls the movement of the lenses 210 in the optical-axis direction via the lens driving unit 212 in accordance with a lens operation command from the imaging unit 102, and some or all of the lenses 210 move along the optical axis.
  • The lens control unit 220 performs at least one of a zoom operation and a focus operation by moving at least one of the lenses 210 along the optical axis.
  • The position sensor 214 detects the position of the lenses 210.
  • The position sensor 214 can detect the current zoom position or focus position.
  • The lens driving unit 212 may include a shake-correction mechanism.
  • The lens control unit 220 may move a lens 210 in a direction along the optical axis or a direction perpendicular to the optical axis via the shake-correction mechanism to perform shake correction.
  • The lens driving unit 212 may drive the shake-correction mechanism with a stepping motor to perform shake correction.
  • The shake-correction mechanism may be driven by a stepping motor to move the image sensor 120 in a direction along the optical axis or a direction perpendicular to the optical axis to perform shake correction.
  • The memory 222 stores control values of the plurality of lenses 210 moved via the lens driving unit 212.
  • The memory 222 may include at least one of SRAM, DRAM, EPROM, EEPROM, and flash memory such as USB memory.
  • FIG. 3 shows an example of functional blocks of the imaging device 60.
  • FIG. 3 shows functional blocks of the imaging device 60 that acquires an image for generating information used for control of the imaging device 100.
  • The imaging device 60 includes a lens 300, an image sensor 320, a control unit 310, and a memory 330.
  • The control unit 310 exchanges information with the control unit 110 through the UAV control unit 30.
  • The control unit 310 may be composed of a microprocessor such as a CPU or an MPU, a microcontroller such as an MCU, or the like.
  • The memory 330 may be a computer-readable recording medium, and may include at least one of SRAM, DRAM, EPROM, EEPROM, flash memory such as USB memory, and a solid-state drive (SSD).
  • The memory 330 stores programs and the like necessary for the control unit 310 to control the image sensor 320 and the like.
  • The control unit 310 included in the imaging device 60, the control unit 110 included in the imaging device 100, and the UAV control unit 30 may each function as at least a part of the circuit.
  • The image sensor 320 may be composed of a CCD or a CMOS sensor.
  • The image sensor 320 captures an optical image formed by the light beam passing through the lens 300 and outputs the captured image data to the control unit 310.
  • The control unit 310 performs image processing based on the pixel information read from the image sensor 320 to generate an image, which is stored in the memory 330.
  • The lens 300 is a fixed-focus lens.
  • The positional relationship between the lens 300 and the image sensor 320 is fixed.
  • The image sensor 320 is provided so as to deviate from the position of the focal point FP of the lens 300 by a predetermined distance or more in the direction of the optical axis AX of the lens 300.
  • The image sensor 320 is provided closer to the lens 300 than the focal position of the lens 300. That is, the imaging device 60 is designed to capture a subject in a state in which the focus is shifted rearward of the image plane.
  • The control unit 310 uses the subject image captured in this defocused state to acquire distance-measurement information by a phase-difference detection method and generate information for focus adjustment of the imaging device 100.
  • FIG. 4 schematically shows the arrangement pattern of the light receiving elements of the image sensor 320.
  • The image sensor 320 includes a plurality of light-receiving elements arranged two-dimensionally.
  • G denotes a light-receiving element that receives light passing through a filter that transmits light in the green wavelength range.
  • R denotes a light-receiving element that receives light passing through a filter that transmits light in the red wavelength range.
  • B denotes a light-receiving element that receives light passing through a filter that transmits light in the blue wavelength range.
  • IR denotes a light-receiving element that receives light passing through a filter that transmits light in the infrared wavelength range. Each light-receiving element generates information about one pixel of the image.
  • The image sensor 320 includes a plurality of light-receiving elements, including light-receiving elements 410a and 410b, light-receiving elements 420G, 420Ga, 420Gb, 420B, and 420R, and a light-receiving element 430.
  • The image sensor 320 has a pixel arrangement in which the 4 × 4 pixel pattern shown in FIG. 4 is repeated in a matrix.
  • The light-receiving element 420G, the light-receiving element 420Ga, the light-receiving element 420Gb, the light-receiving element 420B, and the light-receiving element 420R are in some cases collectively referred to as the "light-receiving element 420".
  • The light-receiving element 410a and the light-receiving element 410b are in some cases collectively referred to as the "light-receiving element 410".
  • The light-receiving element 410 is an example of a first light-receiving element that receives a light beam that has passed through the lens 300 and been split by pupil division. That is, the light-receiving element 410 is a light-receiving element for distance measurement based on phase-difference detection.
  • The control unit 310 is configured to generate information used for focus adjustment of the imaging device 100 based on the signal generated by the light-receiving element 410.
  • The control unit 310 generates information used for control of the imaging device 100, which includes the image sensor 120, based on the signal generated by the light-receiving element 410.
  • The light-receiving element 420 is an example of a second light-receiving element that receives a light beam in the wavelength range of visible light and generates an image signal.
  • The light-receiving element 430 is an example of a third light-receiving element that receives a light beam in a wavelength range of invisible light and generates an image signal. In this embodiment, the light-receiving element 430 receives light in the infrared wavelength range.
  • The control unit 310 generates information used for exposure control and color adjustment of the imaging device 100 based on the image signal generated by the light-receiving element 420. Specifically, the control unit 310 generates, based on the image signal generated by the light-receiving element 420, a visible-light image, that is, an image formed by light in the wavelength range of visible light. The control unit 310 generates brightness information for each area within the imaging range of the imaging device 60 based on the brightness information of the visible-light image. In addition, the control unit 310 generates color information for each area within the imaging range of the imaging device 60 based on the brightness information of each color of the visible-light image. The control unit 310 outputs the generated brightness information and color information to the control unit 110.
  • The control unit 110 performs automatic exposure control of the imaging device 100 based on the brightness information output from the control unit 310.
  • The control unit 110 adjusts the white balance of the image generated by the image sensor 120 of the imaging device 100 based on the color information output from the control unit 310.
  • In this way, the control unit 310 is configured to generate information used for exposure control and color adjustment of the imaging device 100 based on the image signal generated by the light-receiving element 420; a minimal sketch of such per-area statistics is given below.
  • The light-receiving element 430 generates an image signal formed from light in the infrared wavelength range.
  • The control unit 310 generates an infrared image, that is, an image formed by light in the infrared wavelength range, based on the image signal generated by the light-receiving element 430.
  • An infrared image is used, for example, as an image for analyzing temperature distribution.
  • Infrared images are also used as images for analyzing subjects in dark places.
  • In this way, the control unit 310 is configured to generate an image in the wavelength range of invisible light based on the image signal generated by the light-receiving element 430.
  • The light-receiving element 410a is arranged on the right side within one pixel area.
  • The light-receiving element 410b is arranged on the left side within one pixel area.
  • In the pixel area in which the light-receiving element 410a and the light-receiving element 420Ga are provided, when viewed from the subject side, the light-receiving element 410a is provided on the right side and the light-receiving element 420Ga is provided on the left side.
  • In the pixel area in which the light-receiving element 410b and the light-receiving element 420Gb are provided, when viewed from the subject side, the light-receiving element 410b is provided on the left side and the light-receiving element 420Gb is provided on the right side.
  • The light-receiving element 410b and the light-receiving element 420Gb are provided separately on the left and right in the horizontal pixel direction within one pixel area.
  • The light-receiving element 410a and the light-receiving element 410b mainly receive light beams that pass through different pupil regions of the exit pupil of the lens 300.
  • The plurality of light-receiving elements 410 provided on the right side of a pixel area, like the light-receiving element 410a, may be referred to as "first phase-difference elements".
  • The plurality of light-receiving elements 410 provided on the left side of a pixel area, like the light-receiving element 410b, may be referred to as "second phase-difference elements".
  • The light-receiving elements 410a and the light-receiving elements 410b are arranged alternately every two pixels in the horizontal pixel direction. Therefore, the interval between a pixel provided with a light-receiving element 410a and a pixel provided with a light-receiving element 410b is two pixels.
  • The first phase-difference elements, including the light-receiving element 410a, are provided every four pixels in the horizontal pixel direction.
  • The second phase-difference elements, including the light-receiving element 410b, are provided every four pixels in the horizontal pixel direction.
  • The control unit 310 detects the phase difference using the correlation between the image generated by the first phase-difference elements, including the light-receiving element 410a, and the image generated by the second phase-difference elements, including the light-receiving element 410b.
  • The control unit 310 acquires distance-measurement information based on the detected phase difference. In this way, the imaging device 60 acquires the distance information of the subject by the phase-difference detection method; a minimal sketch of this correlation step follows.
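The correlation operation itself can be sketched in a few lines. The following Python snippet (an illustration, not the patent's implementation) estimates the horizontal shift between the output waveforms of the first and second phase-difference elements by locating the peak of their cross-correlation; converting that shift into a subject distance requires calibration constants of the optical system, which are omitted here.

```python
import numpy as np

def detect_phase_difference(first: np.ndarray, second: np.ndarray) -> int:
    """Return the shift (in samples) that best aligns the two 1-D output
    waveforms of the first and second phase-difference elements."""
    a = first - first.mean()
    b = second - second.mean()
    corr = np.correlate(a, b, mode="full")  # cross-correlation at all lags
    return int(np.argmax(corr)) - (len(b) - 1)

# Hypothetical example: the same blurred edge response seen by the two
# pupil halves, displaced by 3 pixels.
x = np.arange(64)
waveform = np.exp(-0.5 * ((x - 32) / 3.0) ** 2)
print(detect_phase_difference(np.roll(waveform, 3), waveform))  # -> 3
```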
  • The control unit 310 generates information used for focus adjustment of the imaging device 100 based on the signal generated by the light-receiving element 410.
  • The control unit 310 may also generate information for zoom-value or angle-of-view control of the lens 300 based on the signals generated by the light-receiving elements for phase-difference detection.
  • FIG. 5 schematically shows the positional relationship between the focal point FP of the lens 300 and the image sensor 320.
  • The focal point FP is the point at which light rays that are incident on the lens 300 parallel to the optical axis AX of the lens 300 intersect the optical axis AX after passing through the lens 300.
  • The image surface 322 of the image sensor 320 is located at a position away from the focal point FP. Specifically, the image surface 322 of the image sensor 320 is located closer to the lens 300 than the focal point FP.
  • The image sensor 320 is arranged at a position deviated from the position of the focal point FP so that the image plane 322 is located closer to the lens 300 than the focal point FP.
  • The lens 300 is fixed so that the focal point FP is located on the rear side of the image sensor 320.
  • The image of a subject at infinity is therefore projected onto the image sensor 320 in a defocused state.
  • An image of a short-distance subject is likewise projected onto the image sensor 320 in a defocused state.
  • FIG. 6 schematically shows the MTF characteristics obtained with the lens 300 and the image sensor 320.
  • The solid line 600 represents the MTF characteristic when the image surface 322 of the image sensor 320 is set at a position away from the position of the focal point FP, as in the imaging device 60.
  • The dotted line 610 represents the MTF characteristic when the image surface 322 of the image sensor 320 is assumed to be at the position of the focal point FP.
  • As these curves show, the MTF of the imaging device 60 becomes lower in the high-spatial-frequency domain; a simplified numerical model is sketched below.
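As a rough illustration of why the MTF drops at high spatial frequencies when the image plane is moved away from the focal point, the defocus blur can be modeled as a one-dimensional box point-spread function of width w pixels, whose MTF magnitude is |sinc(w·ν)|. This simplified model and its numbers are assumptions, not values from the patent.

```python
import numpy as np

def box_mtf(width_px: float, nu: np.ndarray) -> np.ndarray:
    """MTF magnitude of a 1-D box blur of `width_px` pixels at spatial
    frequencies `nu` in cycles/pixel. np.sinc(x) = sin(pi*x)/(pi*x)."""
    return np.abs(np.sinc(width_px * nu))

nu = np.linspace(0.0, 0.5, 6)  # frequencies up to Nyquist
print(box_mtf(0.5, nu))  # near focus: MTF stays high (cf. dotted line 610)
print(box_mtf(4.0, nu))  # defocused: MTF falls off fast (cf. solid line 600)
```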
  • FIG. 7 schematically shows the output waveforms of the light-receiving elements 410 on the assumption that the image surface 322 of the image sensor 320 is located at the position of the focal point FP.
  • The graph 710 shows the output waveform of the first phase-difference elements, including the light-receiving element 410a.
  • The graph 720 shows the output waveform of the second phase-difference elements, including the light-receiving element 410b.
  • The horizontal axis represents the position in the horizontal direction.
  • The vertical axis represents the brightness value.
  • The graph 710 and the graph 720 show the output waveforms of the light-receiving elements 410 when a pattern with a strong luminance difference in the horizontal direction is used as the subject.
  • When the image plane is at the focal position, the edge becomes steep. That is, the edge is represented by the signals of only a few light-receiving elements 410. The phase difference is then calculated by a correlation operation on an edge portion formed from fewer pixels, so the calculation accuracy of the phase difference is reduced. In addition, since the phase difference that arises is itself smaller, the calculation accuracy of the phase difference is all the more liable to decrease.
  • FIG. 8 schematically shows the output waveforms of the light-receiving elements 410 when the image plane 322 of the image sensor 320 is located at a position away from the focal point FP.
  • The graph 810 shows the output waveform of the first phase-difference elements, including the light-receiving element 410a.
  • The graph 820 shows the output waveform of the second phase-difference elements, including the light-receiving element 410b.
  • The horizontal axis represents the position in the horizontal direction.
  • The vertical axis represents the brightness value.
  • The graph 810 and the graph 820 show the output waveforms of the light-receiving elements 410 when a pattern with a strong luminance difference in the horizontal direction is used as the subject. Compared with FIG. 7, the edges are gentler and are represented by the signals of more light-receiving elements 410.
  • At least the image of an infinitely distant subject formed on the image plane 322 of the image sensor 320 through the lens 300 should be capturable by the plurality of light-receiving elements 410. That is, the image sensor 320 is preferably set offset from the focal point FP in the direction of the optical axis AX so that the spread width of the image of a subject at infinity is greater than or equal to the interval between the plurality of light-receiving elements 410. As shown in FIG. 4, the light-receiving elements 410a and the light-receiving elements 410b are arranged alternately every two pixels in the horizontal pixel direction.
  • The interval between a pixel provided with a light-receiving element 410a and a pixel provided with a light-receiving element 410b is two pixels. Therefore, when the light-receiving elements 410 are arranged every two pixels, the image sensor 320 is preferably offset from the focal point FP in the direction of the optical axis AX so that the spread width of the image of a subject at infinity is two pixels or more.
  • In order to be able to represent an edge with the signals of more light-receiving elements 410, it is desirable to set the image sensor 320 offset from the focal point FP in the direction of the optical axis AX so that the spread width of the image of an infinitely distant subject is greater than or equal to the interval between the light-receiving elements 410 that receive light passing through a specified pupil region.
  • For example, as shown in FIG. 4, since the first phase-difference elements are provided every four pixels in the horizontal pixel direction, the image sensor 320 may be set offset from the focal point FP in the direction of the optical axis AX so that the spread width of the image of a subject at infinity is four pixels or more.
  • The half-value width of the point spread function, the half-value width of the line spread function, or the like can be applied as the spread width of the image of an infinitely distant subject.
  • In this way, the image sensor 320 is preferably arranged offset from the position of the focal point FP of the lens 300 along the optical-axis direction of the lens 300 so that the spread width of the image of an infinitely distant subject on the image surface 322 of the image sensor 320 is greater than or equal to the preset interval; a numerical sketch of this condition is given below.
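The offset condition can be made concrete with thin-lens geometry: a point at infinity focuses at the focal point FP, so an image plane displaced from FP by δ along the optical axis spreads the point over a blur of diameter roughly D·δ/f, where D is the aperture diameter and f the focal length. The sketch below solves for the smallest δ that spreads the image over a given number of pixels; every numeric value in it is an assumed example, not a parameter from the patent.

```python
# Thin-lens estimate of the sensor offset needed for a given blur spread.
f = 4.0e-3        # focal length in metres (assumed)
n = 2.8           # f-number of the lens 300 (assumed)
pitch = 1.55e-6   # pixel pitch in metres (assumed)
interval_px = 4   # e.g. first phase-difference elements every 4 pixels

# Blur diameter at offset delta: (D / f) * delta with D = f / n,
# so delta = spread * pitch * n.
delta = interval_px * pitch * n
print(f"required offset from FP: {delta * 1e6:.1f} um")  # ~17.4 um
```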
  • As described above, the positional relationship between the lens 300 and the image sensor 320 is fixed so that the focal point FP is located on the rear side of the image sensor 320 (the side opposite to the subject side).
  • The control unit 310 acquires brightness information and color information of the subject based on the image information of the visible-light image obtained from the image sensor 320.
  • Brightness information in units of pixels is not necessarily required; it is sometimes sufficient to obtain average brightness information for each partial region of the image.
  • Color information in units of pixels is likewise not necessarily required; it is sometimes sufficient to obtain average color information for each partial region of the image. Therefore, brightness information and color information can be acquired with sufficient accuracy even from an image accompanied by a certain degree of blur.
  • FIG. 9 schematically shows the imaging range 910 of the imaging device 60 and the imaging range 920 of the imaging device 100.
  • The angle of view of the imaging device 60 is larger than the angle of view of the imaging device 100.
  • The imaging range 910 of the imaging device 60 is larger than the imaging range 920 of the imaging device 100.
  • The imaging range 910 of the imaging device 60 includes the whole of the imaging range 920 of the imaging device 100.
  • The imaging range 910 of the imaging device 60 is determined by the attitude of the UAV 10.
  • The imaging range 920 of the imaging device 100 is determined by the attitude of the UAV 10 and, in addition, by the posture of the imaging device 100 controlled by the gimbal 50 and the angle of view of the imaging device 100 controlled by the control unit 110.
  • The imaging range 910 of the imaging device 60 may include all of the imaging ranges that the imaging device 100 can be changed to under the control of the gimbal 50 and the control unit 110.
  • The control unit 310 generates distance-measurement information by phase-difference detection based on the signals generated by the light-receiving elements 410 among the signals generated by the image sensor 320.
  • The control performed when the current imaging range 920 of the imaging device 100 is changed to an imaging range 922 will now be described.
  • When the imaging range is changed, the control unit 110 acquires, from the distance-measurement information generated by the control unit 310, the distance-measurement information of the area corresponding to the imaging range 922.
  • The control unit 110 controls the position of the focus lens included in the lenses 210 based on the acquired distance-measurement information, thereby performing focus control on the subject within the imaging range 922. As a result, the control unit 110 can quickly perform focus control in accordance with the change in the imaging range of the imaging device 100.
  • The control unit 110 acquires the brightness information detected by the control unit 310 for the region of the image of the imaging range 910 that corresponds to the imaging range 922, and performs exposure control of the imaging device 100 based on the acquired brightness information. As a result, exposure control can be performed quickly in accordance with changes in the imaging range of the imaging device 100.
  • The control unit 110 acquires the color information detected by the control unit 310 for the region of the image of the imaging range 910 that corresponds to the imaging range 922, and determines the parameters of the white-balance processing applied to the image acquired by the image sensor 120 based on the acquired color information. As a result, the imaging device 100 can quickly determine appropriate white-balance processing parameters and perform white-balance adjustment. A minimal sketch of this range-to-region lookup follows.
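The lookup of measurement information for a changed imaging range can be pictured as a simple rectangle-to-grid mapping. The sketch below is illustrative only: the function name, the (x0, y0, x1, y1) range representation in a shared frame, and the per-region map shapes are all assumptions, not details from the patent.

```python
import numpy as np

def roi_in_map(range_910, range_922, map_shape):
    """Map the new imaging range 922 (of the imaging device 100) to the
    slice of a per-region map derived from the imaging range 910 (of the
    imaging device 60). Ranges are (x0, y0, x1, y1) in a common frame;
    range 910 is assumed to contain range 922."""
    X0, Y0, X1, Y1 = range_910
    x0, y0, x1, y1 = range_922
    rows, cols = map_shape
    r0 = int((y0 - Y0) / (Y1 - Y0) * rows)
    r1 = int(np.ceil((y1 - Y0) / (Y1 - Y0) * rows))
    c0 = int((x0 - X0) / (X1 - X0) * cols)
    c1 = int(np.ceil((x1 - X0) / (X1 - X0) * cols))
    return slice(r0, r1), slice(c0, c1)

# Hypothetical usage: drive the focus of the imaging device 100 from the
# distance-measurement map of the imaging device 60.
distance_map = np.full((8, 8), 5.0)  # metres per region (dummy data)
rs, cs = roi_in_map((0, 0, 90, 60), (30, 20, 60, 40), distance_map.shape)
focus_distance = float(np.median(distance_map[rs, cs]))
print(focus_distance)
```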
  • In this way, the control unit 310 generates distance-measurement information for the imaging range of the imaging device 60 based on the signals generated by the light-receiving elements 410.
  • When the imaging range of the imaging device 100 is changed, the focus adjustment of the imaging device 100 is performed based on the distance-measurement information for the changed imaging range of the imaging device 100.
  • The imaging device 60 can capture a larger range than the imaging device 100 and acquire distance information, brightness information, and color information of the subject.
  • The imaging device 60 can therefore be used as a 3A (AE/AF/AWB) camera. Since the imaging range of the imaging device 60 is set to include the imaging range of the imaging device 100, even when the imaging range of the imaging device 100 changes due to the attitude of the UAV 10 or the control of the gimbal 50, 3A-related control can be performed quickly using the distance information, brightness information, and color information acquired by the imaging device 60.
  • In addition, since the wavelength filter of some pixel regions of the image sensor 320 is an IR filter, an infrared image can be acquired at the same time.
  • FIG. 10 schematically shows a distance-measuring operation when a variable-focus lens is used.
  • A configuration in which the position of the focal point of the lens 300 is variable will be described with reference to FIG. 10.
  • The control unit 310 calculates the subject distance based on the image information of the light-receiving elements 410 obtained by changing the focal position of the lens 300 and performing imaging three times. For example, the control unit 310 performs imaging in a state where the imaging point of the lens 300 is located at the position 1010 and performs phase-difference detection using the image information generated by the image sensor 320. In this way, the distance D1 to the subject is calculated.
  • The control unit 310 then moves the focal point of the lens 300, performs imaging with the imaging point at the position 1020, and performs phase-difference detection using the image information generated by the image sensor 320.
  • In this way, the distance D2 to the subject is calculated.
  • The control unit 310 further moves the focal point of the lens 300, performs imaging with the imaging point at the position 1030, and performs phase-difference detection using the image information generated by the image sensor 320.
  • In this way, the distance D3 to the subject is calculated.
  • The control unit 110 determines the distance to the subject based on the values of the distance D1, the distance D2, and the distance D3. For example, the control unit 110 selects one of the distances D1, D2, and D3 as the distance to the subject. As an example, the control unit 110 can calculate the difference for every pair among D1, D2, and D3, select the two values whose mutual difference is the smallest, and adopt one of the two selected values as the distance to the subject. The control unit 110 may also adopt the average of the two selected values as the distance to the subject; a minimal sketch of this selection rule is given below.
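The selection rule described above can be sketched as follows. This is an illustration under the assumption that the "distance" between two scalar estimates reduces to their absolute difference; the function name and the averaging choice are hypothetical.

```python
from itertools import combinations

def pick_distance(estimates: list[float]) -> float:
    """Given subject-distance estimates obtained at several focal positions
    (e.g. D1, D2, D3), pick the two that agree most closely and return
    their average as the distance to the subject."""
    a, b = min(combinations(estimates, 2), key=lambda p: abs(p[0] - p[1]))
    return (a + b) / 2.0

print(pick_distance([4.9, 5.1, 7.8]))  # -> 5.0 (D1 and D2 agree best)
```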
  • By changing the position of the focal point of the lens 300, performing imaging multiple times, and performing phase-difference detection, the phase difference can be detected with higher accuracy.
  • The control unit 110 may determine the position of the focal point of the lens 300 based on the imaging mode of the imaging device 100.
  • The control unit 110 may select a predetermined position corresponding to the imaging mode of the imaging device 100 as the focal position of the lens 300.
  • Depending on the imaging mode, the control unit 110 can set the focal position of the lens 300 closer to the image sensor 320 than when the imaging mode of the imaging device 100 is set to the landscape mode.
  • The focal position of the lens 300 can be changed by changing the position of the lens 300 on the optical axis AX.
  • The position of the focal point of the lens 300 can also be changed by changing the position of a focusing lens on the optical axis AX.
  • The method of changing the focal position of the lens 300 has been described, but a method of changing the positional relationship between the lens 300 and the image sensor 320 on the optical axis AX may also be adopted.
  • That is, instead of changing the focal position of the lens 300, a method of changing the position of the image sensor 320 on the optical axis AX may be adopted.
  • The imaging device 100 and the imaging devices 60 mounted on the UAV 10 have been described above as an example of the imaging system.
  • The imaging device 100 and the imaging devices 60 may be configured to be detachable from the UAV 10.
  • At least one of the imaging device 100 and the imaging devices 60 may be realized by a camera included in a portable terminal such as a smartphone.
  • At least one of the imaging device 100 and the imaging devices 60 need not be an imaging device mounted on a movable body such as the UAV 10.
  • For example, the imaging device 100 may be an imaging device supported by a handheld gimbal.
  • The imaging device 100 and the imaging devices 60 may also be imaging devices supported by neither the UAV 10 nor a handheld gimbal.
  • For example, at least one of the imaging device 100 and the imaging devices 60 may be an imaging device that can be held by the user.
  • At least one of the imaging device 100 and the imaging devices 60 may be a fixedly installed imaging device such as a surveillance camera.
  • FIG. 11 shows an example of a computer 1200 that can embody aspects of the present invention in whole or in part.
  • A program installed on the computer 1200 can cause the computer 1200 to function as operations associated with the device according to the embodiments of the present invention or as one or more "parts" of the device, or can cause the computer 1200 to execute the operations or the one or more "parts".
  • The program enables the computer 1200 to execute the processes or the stages of the processes according to the embodiments of the present invention.
  • Such a program may be executed by a CPU 1212 so that the computer 1200 executes specified operations associated with some or all of the blocks in the flowcharts and block diagrams described in this specification.
  • The computer 1200 of this embodiment includes the CPU 1212 and a RAM 1214, which are connected to each other by a host controller 1210.
  • The computer 1200 also includes a communication interface 1222 and an input/output unit, which are connected to the host controller 1210 through an input/output controller 1220.
  • The computer 1200 also includes a ROM 1230.
  • The CPU 1212 operates in accordance with programs stored in the ROM 1230 and the RAM 1214, thereby controlling each unit.
  • The communication interface 1222 communicates with other electronic devices through a network.
  • A hard disk drive can store the programs and data used by the CPU 1212 in the computer 1200.
  • The ROM 1230 stores a boot program or the like executed by the computer 1200 at startup, and/or programs that depend on the hardware of the computer 1200.
  • The programs are provided via a computer-readable recording medium such as a CD-ROM, USB memory, or IC card, or via a network.
  • The programs are installed in the RAM 1214 or the ROM 1230, which are also examples of computer-readable recording media, and are executed by the CPU 1212.
  • The information processing described in these programs is read by the computer 1200 to bring about cooperation between the programs and the various types of hardware resources described above.
  • An apparatus or method may be constituted by realizing operations or processing of information in accordance with the use of the computer 1200.
  • For example, the CPU 1212 may execute a communication program loaded in the RAM 1214 and instruct the communication interface 1222 to perform communication processing in accordance with the processing described in the communication program.
  • Under the control of the CPU 1212, the communication interface 1222 reads transmission data stored in a transmission buffer provided in a recording medium such as the RAM 1214 or USB memory and sends the read transmission data to the network, or writes data received from the network into a reception buffer or the like provided in the recording medium.
  • The CPU 1212 can make the RAM 1214 read all or a required part of a file or database stored in an external recording medium such as USB memory, and perform various types of processing on the data in the RAM 1214. The CPU 1212 can then write the processed data back to the external recording medium.
  • The CPU 1212 can perform, on the data read from the RAM 1214, various types of processing specified by the instruction sequences of the programs and described throughout this disclosure, including various types of operations, information processing, conditional judgment, conditional branching, unconditional branching, and information retrieval/replacement, and can write the results back to the RAM 1214.
  • The CPU 1212 can also retrieve information in files, databases, and the like in the recording medium.
  • For example, when a plurality of entries each associating an attribute value of a first attribute with an attribute value of a second attribute are stored in the recording medium, the CPU 1212 may retrieve, from the plurality of entries, an entry whose attribute value of the first attribute matches a specified condition, read the attribute value of the second attribute stored in that entry, and thereby obtain the attribute value of the second attribute associated with the first attribute satisfying the predetermined condition.
  • The programs or software modules described above may be stored on the computer 1200 or on a computer-readable storage medium near the computer 1200.
  • A recording medium such as a hard disk or RAM provided in a server system connected to a dedicated communication network or the Internet can be used as a computer-readable storage medium, whereby the programs are provided to the computer 1200 via the network.

Abstract

A device comprising a plurality of light-receiving elements including a first light-receiving element, wherein the first light-receiving element is configured to receive a light beam that passes through a first lens and is split by pupil division. The device comprises a first image sensor, wherein the first image sensor is arranged so as to deviate from the focal position of the first lens along the optical-axis direction of the first lens by a predetermined distance or more. The device comprises a circuit, wherein the circuit is configured to generate, on the basis of a signal generated by the first light-receiving element, information for the control of a second camera device. The second camera device comprises a second image sensor.

Description

装置、摄像装置、摄像系统及移动体Device, camera device, camera system and moving body 技术领域Technical field
本发明涉及一种装置、摄像装置、摄像系统及移动体。The invention relates to a device, a camera device, a camera system and a mobile body.
背景技术Background technique
专利文献1公开了一种根据TOF算法针对M×N个像素的各个像素计算距离值的相机系统。Patent Document 1 discloses a camera system that calculates a distance value for each pixel of M×N pixels based on the TOF algorithm.
[现有技术文献][Prior Art Literature]
[专利文献][Patent Literature]
[专利文献1]日本专利文献特表2019-508717号公报[Patent Document 1] Japanese Patent Document Special Form No. 2019-508717
发明内容Summary of the invention
一个方面所涉及的装置包括包含第一受光元件的多个受光元件,所述第一受光元件构成为:接收通过第一镜头并被光瞳分割的光束。装置包括第一图像传感器,所述第一图像传感器设置成沿第一镜头的光轴方向从第一镜头的焦点位置偏离预定距离以上。装置包括电路,所述电路构成为:基于第一受光元件生成的信号,生成用于包括第二图像传感器的第二摄像装置的控制的信息。An apparatus according to one aspect includes a plurality of light receiving elements including a first light receiving element, and the first light receiving element is configured to receive a light beam that passes through a first lens and is divided by a pupil. The device includes a first image sensor that is arranged to deviate from the focal position of the first lens by more than a predetermined distance along the optical axis direction of the first lens. The device includes a circuit configured to generate information for control of a second imaging device including a second image sensor based on a signal generated by the first light receiving element.
电路可以构成为:基于第一受光元件生成的信号,生成用于第二摄像装置的焦点调节的信息。The circuit may be configured to generate information used for focus adjustment of the second imaging device based on the signal generated by the first light receiving element.
第一图像传感器可以设置在比第一镜头的焦点的位置更靠近第一镜头侧。The first image sensor may be disposed closer to the first lens side than the position of the focal point of the first lens.
第一受光元件可以以预设间隔设置。第一图像传感器可以设置成沿第一镜头的光轴方向偏离第一镜头的焦点位置,以使第一图像传感器的像面上的无限远被摄体的像的扩展宽度大于等于预设间隔。The first light receiving element may be arranged at a preset interval. The first image sensor may be arranged to deviate from the focal position of the first lens along the optical axis direction of the first lens, so that the expansion width of the image of the infinitely distant subject on the image plane of the first image sensor is greater than or equal to a preset interval.
第一镜头可以是固定焦点镜头。The first lens may be a fixed focus lens.
The first image sensor may include the first light receiving element and a second light receiving element, the second light receiving element being configured to receive a light beam in the visible wavelength range and generate an image signal. The circuit may be configured to generate information for exposure control and color adjustment of the second imaging device based on the image signal generated by the second light receiving element.

The first image sensor may include a third light receiving element configured to receive a light beam in a non-visible wavelength range and generate an image signal. The circuit may be configured to generate an image in the non-visible wavelength range based on the image signal generated by the third light receiving element.

The imaging range of a first imaging device including the first lens and the first image sensor may be larger than the imaging range of the second imaging device.

The imaging range of the first imaging device may be set to include the imaging range of the second imaging device.

The circuit may be configured to generate distance measurement information for the imaging range of the first imaging device based on the signal generated by the first light receiving element. The circuit may be configured such that, when the imaging range of the second imaging device is changed, focus adjustment of the second imaging device is performed based on the distance measurement information for the changed imaging range of the second imaging device.
An imaging device according to one aspect includes the above-mentioned device and the first lens.

An imaging system according to one aspect includes the above-mentioned imaging device and the second imaging device.

A movable body according to one aspect carries the above-mentioned imaging device and moves.

The movable body may include the second imaging device and a support mechanism that supports the second imaging device such that its posture can be controlled.

According to the above aspect, information on the distance to a subject can be acquired more accurately.

The above does not enumerate all of the necessary features of the present invention. Sub-combinations of these feature groups may also constitute inventions.
Description of the drawings

Fig. 1 shows an example of the appearance of an unmanned aerial vehicle (UAV) 10 and a remote operation device 12.

Fig. 2 shows an example of the functional blocks of the UAV 10.

Fig. 3 shows an example of the functional blocks of an imaging device 60.

Fig. 4 schematically shows the arrangement pattern of the light receiving elements of an image sensor 320.

Fig. 5 schematically shows the positional relationship between the focal point FP of a lens 300 and the image sensor 320.

Fig. 6 schematically shows the MTF characteristics obtained with the lens 300 and the image sensor 320.

Fig. 7 schematically shows the output waveforms of light receiving elements 410 when the image plane 322 of the image sensor 320 is assumed to be located at the focal point FP.

Fig. 8 schematically shows the output waveforms of the light receiving elements 410 when the image plane 322 of the image sensor 320 is located at a position deviating from the focal point FP.

Fig. 9 schematically shows an imaging range 910 of the imaging device 60 and an imaging range 920 of an imaging device 100.

Fig. 10 schematically shows a distance measurement operation when a variable focus lens is used.

Fig. 11 shows an example of a computer 1200.
Detailed description
Hereinafter, the present invention will be described through embodiments of the invention, but the following embodiments do not limit the invention according to the claims. Not all combinations of features described in the embodiments are necessarily essential to the solution of the invention. It will be apparent to those skilled in the art that various changes or improvements can be made to the following embodiments. It is apparent from the description of the claims that modes to which such changes or improvements are made can also be included in the technical scope of the present invention.

The claims, the description, the drawings, and the abstract include matters subject to copyright protection. The copyright owner will not object to reproduction of these documents by anyone as they appear in the files or records of the patent office. In all other cases, all copyrights are reserved.
Various embodiments of the present invention may be described with reference to flowcharts and block diagrams, where a block may represent (1) a stage of a process in which an operation is performed or (2) a "unit" of a device that performs an operation. Specific stages and "units" may be implemented by programmable circuits and/or processors. Dedicated circuits may include digital and/or analog hardware circuits, and may include integrated circuits (ICs) and/or discrete circuits. Programmable circuits may include reconfigurable hardware circuits. Reconfigurable hardware circuits may include logical AND, logical OR, logical XOR, logical NAND, logical NOR, and other logical operations, as well as memory elements such as flip-flops, registers, field programmable gate arrays (FPGAs), and programmable logic arrays (PLAs).

The computer-readable medium may include any tangible device that can store instructions for execution by a suitable device. As a result, a computer-readable medium having instructions stored thereon constitutes a product that includes instructions which can be executed to create means for performing the operations specified in the flowcharts or block diagrams. Examples of computer-readable media may include electronic storage media, magnetic storage media, optical storage media, electromagnetic storage media, semiconductor storage media, and the like. More specific examples of computer-readable media may include floppy (registered trademark) disks, diskettes, hard disks, random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), electrically erasable programmable read-only memory (EEPROM), static random access memory (SRAM), compact disc read-only memory (CD-ROM), digital versatile discs (DVD), Blu-ray (RTM) discs, memory sticks, integrated circuit cards, and the like.

Computer-readable instructions may include either source code or object code described in any combination of one or more programming languages. The source code or object code includes conventional procedural programming languages. Conventional procedural programming languages may be assembly instructions, instruction set architecture (ISA) instructions, machine instructions, machine-dependent instructions, microcode, firmware instructions, state setting data, object-oriented programming languages such as Smalltalk (registered trademark), JAVA (registered trademark), and C++, and the "C" programming language or similar programming languages. The computer-readable instructions may be provided to a processor or programmable circuit of a general-purpose computer, a special-purpose computer, or other programmable data processing device, either locally or via a local area network (LAN) or a wide area network (WAN) such as the Internet. The processor or programmable circuit may execute the computer-readable instructions to create means for performing the operations specified in the flowcharts or block diagrams. Examples of processors include computer processors, processing units, microprocessors, digital signal processors, controllers, microcontrollers, and the like.
Fig. 1 shows an example of the appearance of an unmanned aerial vehicle (UAV) 10 and a remote operation device 12. The UAV 10 includes a UAV body 20, a gimbal 50, a plurality of imaging devices 60, and an imaging device 100. The gimbal 50, the imaging device 100, and the imaging devices 60 are an example of an imaging system. The UAV 10, that is, a movable body, is a concept that includes a flying body moving in the air, a vehicle moving on the ground, a ship moving on water, and the like. A flying body moving in the air is a concept that includes not only UAVs but also other aircraft, airships, helicopters, and the like that move in the air.

The UAV body 20 includes a plurality of rotors. The plurality of rotors is an example of a propulsion unit. The UAV body 20 causes the UAV 10 to fly by controlling the rotation of the plurality of rotors. The UAV body 20 uses, for example, four rotors to make the UAV 10 fly. The number of rotors is not limited to four. The UAV 10 may also be a fixed-wing aircraft without rotors.
The imaging device 100 is an imaging camera that captures a subject included in a desired imaging range. The gimbal 50 rotatably supports the imaging device 100. The gimbal 50 is an example of a support mechanism that supports the imaging device 100 such that its posture can be controlled. For example, the gimbal 50 uses an actuator to rotatably support the imaging device 100 about a pitch axis. The gimbal 50 further uses actuators to rotatably support the imaging device 100 about a roll axis and a yaw axis, respectively. The gimbal 50 can change the posture of the imaging device 100 by rotating the imaging device 100 about at least one of the yaw axis, the pitch axis, and the roll axis.

The imaging devices 60 are sensing cameras that photograph the surroundings of the UAV 10 in order to control the flight of the UAV 10. Two imaging devices 60 may be provided on the nose, that is, the front, of the UAV 10. Two other imaging devices 60 may be provided on the bottom surface of the UAV 10. The two imaging devices 60 on the front side can function as a so-called stereo camera. The two imaging devices 60 on the bottom side may also be paired to function as a stereo camera. Three-dimensional spatial data around the UAV 10 can be generated based on the images captured by the plurality of imaging devices 60. At least one of the plurality of imaging devices 60 is an imaging device for acquiring information used for control of the imaging device 100, including focus control and exposure control. The number of imaging devices 60 included in the UAV 10 is not limited; it suffices that the UAV 10 includes at least one imaging device 60. The UAV 10 may include at least one imaging device 60 on each of the nose, tail, sides, bottom surface, and top surface of the UAV 10. The angle of view that can be set in the imaging device 60 may be larger than the angle of view that can be set in the imaging device 100. At least one of the plurality of imaging devices 60 may have a single focus lens or a fisheye lens.
The remote operation device 12 communicates with the UAV 10 to remotely operate the UAV 10. The remote operation device 12 may communicate with the UAV 10 wirelessly. The remote operation device 12 transmits to the UAV 10 instruction information indicating various commands related to the movement of the UAV 10, such as ascending, descending, accelerating, decelerating, moving forward, moving backward, and rotating. The instruction information includes, for example, instruction information for raising the altitude of the UAV 10. The instruction information may indicate the altitude at which the UAV 10 should be located. The UAV 10 moves so as to be located at the altitude indicated by the instruction information received from the remote operation device 12. The instruction information may include an ascent command to raise the UAV 10. The UAV 10 ascends while receiving the ascent command. When the altitude of the UAV 10 has reached its upper limit, the ascent of the UAV 10 can be restricted even if the ascent command is received.
Fig. 2 shows an example of the functional blocks of the UAV 10. The UAV 10 includes a UAV control unit 30, a memory 37, a communication interface 36, a propulsion unit 40, a GPS receiver 41, an inertial measurement unit 42 (IMU 42), a magnetic compass 43, a barometric altimeter 44, a temperature sensor 45, a humidity sensor 46, the gimbal 50, the imaging devices 60, and the imaging device 100.

The communication interface 36 communicates with other devices such as the remote operation device 12. The communication interface 36 may receive instruction information including various commands for the UAV control unit 30 from the remote operation device 12. The memory 37 stores programs and the like necessary for the UAV control unit 30 to control the propulsion unit 40, the GPS receiver 41, the IMU 42, the magnetic compass 43, the barometric altimeter 44, the temperature sensor 45, the humidity sensor 46, the gimbal 50, the imaging devices 60, and the imaging device 100. The memory 37 may be a computer-readable recording medium and may include at least one of flash memories such as SRAM, DRAM, EPROM, EEPROM, USB memory, and solid state drives (SSD). The memory 37 may be provided inside the UAV body 20. It may be provided so as to be detachable from the UAV body 20.
The UAV control unit 30 controls the flight and imaging of the UAV 10 in accordance with a program stored in the memory 37. The UAV control unit 30 may be composed of a microprocessor such as a CPU or MPU, a microcontroller such as an MCU, or the like. The UAV control unit 30 controls the flight and imaging of the UAV 10 in accordance with commands received from the remote operation device 12 via the communication interface 36. The propulsion unit 40 propels the UAV 10. The propulsion unit 40 includes a plurality of rotors and a plurality of drive motors that rotate the rotors. The propulsion unit 40 rotates the rotors via the drive motors in accordance with a command from the UAV control unit 30 to cause the UAV 10 to fly.

The GPS receiver 41 receives a plurality of signals indicating the times transmitted from a plurality of GPS satellites. The GPS receiver 41 calculates the position (latitude and longitude) of the GPS receiver 41, that is, the position (latitude and longitude) of the UAV 10, based on the received signals. The IMU 42 detects the posture of the UAV 10. The IMU 42 detects, as the posture of the UAV 10, the accelerations in the three axial directions of front-rear, left-right, and up-down, and the angular velocities about the three axes of pitch, roll, and yaw. The magnetic compass 43 detects the heading of the nose of the UAV 10. The barometric altimeter 44 detects the flight altitude of the UAV 10. The barometric altimeter 44 detects the air pressure around the UAV 10 and converts the detected air pressure into an altitude to detect the altitude. The temperature sensor 45 detects the temperature around the UAV 10. The humidity sensor 46 detects the humidity around the UAV 10.
The imaging device 100 includes an imaging unit 102 and a lens unit 200. The lens unit 200 is an example of a lens device. The imaging unit 102 includes an image sensor 120, a control unit 110, a memory 130, and a distance measuring sensor. The image sensor 120 may be composed of a CCD or CMOS. The image sensor 120 captures an optical image formed via a plurality of lenses 210 and outputs the captured image to the control unit 110. The control unit 110 generates an image for recording through image processing based on the pixel information read from the image sensor 120 and stores it in the memory 130. The control unit 110 may be composed of a microprocessor such as a CPU or MPU, a microcontroller such as an MCU, or the like. The control unit 110 may control the imaging device 100 in accordance with an operation command for the imaging device 100 from the UAV control unit 30. The control unit 110 may be at least a part of the circuit in the present invention. The memory 130 may be a computer-readable recording medium and may include at least one of flash memories such as SRAM, DRAM, EPROM, EEPROM, USB memory, and solid state drives (SSD). The memory 130 stores programs and the like necessary for the control unit 110 to control the image sensor 120 and the like. The memory 130 may be provided inside the housing of the imaging device 100. The memory 130 may be provided so as to be detachable from the housing of the imaging device 100.

The distance measuring sensor measures the distance to a subject. The distance measuring sensor can measure the distance to a specified subject. The distance measuring sensor may be an infrared sensor, an ultrasonic sensor, a stereo camera, a TOF (Time of Flight) sensor, or the like.
The lens unit 200 includes the plurality of lenses 210, a plurality of lens driving units 212, and a lens control unit 220. The plurality of lenses 210 may function as a zoom lens, a varifocal lens, and a focus lens. At least some or all of the plurality of lenses 210 are arranged to be movable along the optical axis. The lens unit 200 may be an interchangeable lens provided so as to be attachable to and detachable from the imaging unit 102. The lens driving unit 212 moves at least some or all of the plurality of lenses 210 along the optical axis via a mechanism member such as a cam ring. The lens driving unit 212 may include an actuator. The actuator may include a stepping motor. The lens control unit 220 drives the lens driving unit 212 in accordance with a lens control command from the imaging unit 102 to move one or more lenses 210 in the optical axis direction via the mechanism member. The lens control commands are, for example, a zoom control command and a focus control command.

The lens unit 200 further includes a memory 222 and a position sensor 214. The lens control unit 220 controls the movement of the lenses 210 in the optical axis direction via the lens driving unit 212 in accordance with a lens operation command from the imaging unit 102. Some or all of the lenses 210 move along the optical axis. The lens control unit 220 performs at least one of a zoom operation and a focus operation by moving at least one of the lenses 210 along the optical axis. The position sensor 214 detects the positions of the lenses 210. The position sensor 214 can detect the current zoom position or focus position.

The lens driving unit 212 may include a shake correction mechanism. The lens control unit 220 may perform shake correction by moving the lenses 210 in a direction along the optical axis or a direction perpendicular to the optical axis via the shake correction mechanism. The lens driving unit 212 may drive the shake correction mechanism with a stepping motor to perform shake correction. Alternatively, the shake correction mechanism may be driven by a stepping motor to move the image sensor 120 in a direction along the optical axis or a direction perpendicular to the optical axis to perform shake correction.

The memory 222 stores control values of the plurality of lenses 210 moved via the lens driving unit 212. The memory 222 may include at least one of flash memories such as SRAM, DRAM, EPROM, EEPROM, and USB memory.
Fig. 3 shows an example of the functional blocks of the imaging device 60, specifically of an imaging device 60 that acquires images for generating information used for control of the imaging device 100. The imaging device 60 includes a lens 300, an image sensor 320, a control unit 310, and a memory 330. The control unit 310 exchanges information with the control unit 110 via the UAV control unit 30.

The control unit 310 may be composed of a microprocessor such as a CPU or MPU, a microcontroller such as an MCU, or the like. The memory 330 may be a computer-readable recording medium and may include at least one of flash memories such as SRAM, DRAM, EPROM, EEPROM, USB memory, and solid state drives (SSD). The memory 330 stores programs and the like necessary for the control unit 310 to control the image sensor 320 and the like. The control unit 310 included in the imaging device 60, the control unit 110 included in the imaging device 100, and the UAV control unit 30 may each function as at least a part of the circuit.

The image sensor 320 may be composed of a CCD or CMOS. The image sensor 320 captures an optical image formed by the light beam passing through the lens 300 and outputs the captured image to the control unit 310. The control unit 310 performs image processing based on the pixel information read from the image sensor 320 to generate an image and stores it in the memory 330.
In the present embodiment, the lens 300 is a fixed focal length lens. The positional relationship between the lens 300 and the image sensor 320 is fixed. As described later, the image sensor 320 is arranged so as to deviate from the position of the focal point FP of the lens 300 by a predetermined distance or more in the direction of the optical axis AX of the lens 300. For example, the image sensor 320 is arranged closer to the lens 300 than the focal position of the lens 300. That is, the imaging device 60 is designed to capture a subject in a so-called back-focus state. The control unit 310 uses the subject image captured in the back-focus state to acquire distance measurement information by a phase difference detection method and generates information for focus adjustment of the imaging device 100.
Fig. 4 schematically shows the arrangement pattern of the light receiving elements of the image sensor 320. The image sensor 320 includes a plurality of light receiving elements arranged two-dimensionally. In Fig. 4, "G" denotes a light receiving element that receives light passing through a filter that transmits light in the green wavelength range. "R" denotes a light receiving element that receives light passing through a filter that transmits light in the red wavelength range. "B" denotes a light receiving element that receives light passing through a filter that transmits light in the blue wavelength range. "IR" denotes a light receiving element that receives light passing through a filter that transmits light in the infrared wavelength range. Each light receiving element generates the information of one pixel in the image.
The image sensor 320 includes a plurality of light receiving elements, including light receiving elements 410a and 410b, light receiving elements 420G, 420Ga, 420Gb, 420B, and 420R, and a light receiving element 430. The image sensor 320 has a pixel arrangement in which the 4×4 pixel pattern shown in Fig. 4 is tiled in a matrix. In the following, the light receiving elements 420G, 420Ga, 420Gb, 420B, and 420R are sometimes collectively referred to as "light receiving elements 420", and the light receiving elements 410a and 410b as "light receiving elements 410".
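As a rough illustration of how such a tiled layout might be represented in software, the following sketch builds a mosaic from a hypothetical 4×4 tile. The tile is an assumption: it only reproduces properties stated in the text (phase difference pixels alternating every two pixels horizontally, each kind repeating every four pixels, ordinary R, G, B pixels, and an IR pixel), not the exact layout of Fig. 4.

```python
# Hypothetical 4x4 repeating tile consistent with the constraints in the
# text; "PDa" marks a pixel region holding phase pixel 410a paired with
# 420Ga, "PDb" one holding 410b paired with 420Gb. PDa/PDb alternate every
# two pixels horizontally, so each kind repeats every four pixels.
TILE = [
    ["PDa", "R", "PDb", "R"],
    ["B",   "G", "B",   "IR"],
    ["PDa", "R", "PDb", "R"],
    ["B",   "G", "B",   "G"],
]

def mosaic(rows: int, cols: int) -> list[list[str]]:
    """Tile the 4x4 pattern across a rows x cols sensor."""
    return [[TILE[r % 4][c % 4] for c in range(cols)] for r in range(rows)]
```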
The light receiving elements 410 are an example of first light receiving elements that receive light beams that pass through the lens 300 and are split by the pupil. That is, the light receiving elements 410 are light receiving elements for distance measurement based on phase difference detection. The control unit 310 is configured to generate information for focus adjustment of the imaging device 100 based on the signals generated by the light receiving elements 410. The control unit 310 generates information for control of the imaging device 100, which includes the image sensor 120, based on the signals generated by the light receiving elements 410.

The light receiving elements 420 are an example of second light receiving elements that receive light beams in the visible wavelength range and generate image signals. The light receiving element 430 is an example of a third light receiving element that receives light beams in a non-visible wavelength range and generates an image signal. In the present embodiment, the light receiving element 430 receives light in the infrared wavelength range.
The control unit 310 generates information for exposure control and color adjustment of the imaging device 100 based on the image signals generated by the light receiving elements 420. Specifically, the control unit 310 generates a visible light image, that is, an image formed by light in the visible wavelength range, based on the image signals generated by the light receiving elements 420. Based on the brightness information of the visible light image, the control unit 310 generates brightness information for each region within the imaging range of the imaging device 60. In addition, based on the brightness information of each color in the visible light image, the control unit 310 generates color information for each region within the imaging range of the imaging device 60. The control unit 310 outputs the generated brightness information and color information to the control unit 110. The control unit 110 performs automatic exposure control of the imaging device 100 based on the brightness information output from the control unit 310. The control unit 110 adjusts the white balance of the image generated by the image sensor 120 of the imaging device 100 based on the color information output from the control unit 310. In this way, the control unit 310 is configured to generate information for exposure control and color adjustment of the imaging device 100 based on the image signals generated by the light receiving elements 420.
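Since the text describes per-region average brightness and color statistics feeding automatic exposure and white balance, a minimal sketch of that step may help. It assumes the visible light image is available as an H×W×3 array; the block count, luma weights, and function name are illustrative assumptions, not values from the patent.

```python
# Per-region brightness and color statistics for AE/AWB (illustrative).
import numpy as np

def region_stats(rgb: np.ndarray, blocks: int = 8):
    """Average brightness and per-channel color means for each region."""
    h, w, _ = rgb.shape
    bh, bw = h // blocks, w // blocks
    luma = 0.299 * rgb[..., 0] + 0.587 * rgb[..., 1] + 0.114 * rgb[..., 2]
    brightness = np.empty((blocks, blocks))
    color = np.empty((blocks, blocks, 3))
    for r in range(blocks):
        for c in range(blocks):
            ys, xs = slice(r * bh, (r + 1) * bh), slice(c * bw, (c + 1) * bw)
            brightness[r, c] = luma[ys, xs].mean()
            color[r, c] = rgb[ys, xs].mean(axis=(0, 1))
    return brightness, color  # inputs to exposure and white-balance decisions
```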
The light receiving element 430 generates an image signal formed by light in the infrared wavelength range. Based on the image signal generated by the light receiving element 430, the control unit 310 generates an infrared image, that is, an image formed by light in the infrared wavelength range. For example, the infrared image is used as an image for analyzing temperature distribution. The infrared image is also used as an image for analyzing subjects in dark places. In this way, the control unit 310 is configured to generate an image in a non-visible wavelength range based on the image signal generated by the light receiving element 430.
The light receiving element 410a is arranged on the right side within one pixel region. The light receiving element 410b is arranged on the left side within one pixel region. Specifically, in the pixel region where the light receiving element 410a and the light receiving element 420Ga are provided, the light receiving element 410a is arranged on the right side and the light receiving element 420Ga on the left side when viewed from the subject side. In the pixel region where the light receiving element 410b and the light receiving element 420Gb are provided, the light receiving element 410b is arranged on the left side and the light receiving element 420Gb on the right side when viewed from the subject side. In this way, the light receiving element 410b and the light receiving element 420Gb are arranged separately on the left and right in the horizontal pixel direction within one pixel region. As a result, the light receiving elements 410a and 410b mainly receive light beams that pass through mutually different pupil regions of the exit pupil of the lens 300.

The plurality of light receiving elements 410 arranged on the right side within a pixel region, like the light receiving element 410a, are sometimes referred to as "first phase difference elements". The plurality of light receiving elements 410 arranged on the left side within a pixel region, like the light receiving element 410b, are sometimes referred to as "second phase difference elements". As shown in Fig. 4, the light receiving elements 410a and 410b are arranged alternately every two pixels in the horizontal pixel direction. Therefore, the interval between a pixel provided with a light receiving element 410a and a pixel provided with a light receiving element 410b is two pixels. A first phase difference element including a light receiving element 410a is provided every four pixels in the horizontal pixel direction, and a second phase difference element including a light receiving element 410b is likewise provided every four pixels in the horizontal pixel direction.
The control unit 310 detects the phase difference using the correlation between the image generated by the first phase difference elements including the light receiving elements 410a and the image generated by the second phase difference elements including the light receiving elements 410b. The control unit 310 acquires distance measurement information based on the detected phase difference. In this way, the imaging device 60 acquires distance information to the subject by the phase difference detection method. The control unit 310 then generates information for focus adjustment of the imaging device 100 based on the signals generated by the light receiving elements 410. The control unit 310 may also generate information for controlling the zoom value or angle of view of the lens 300 based on the signals generated by the light receiving elements for phase difference detection.
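A minimal sketch of correlation-based phase difference detection is given below. It assumes two one-dimensional brightness profiles sampled from the first and second phase difference elements along a row; the SAD cost, the search range, and the function name are illustrative choices, since the patent does not specify the correlation operation in detail.

```python
# Find the horizontal shift that best aligns the two pupil images (SAD).
import numpy as np

def phase_difference(left: np.ndarray, right: np.ndarray,
                     max_shift: int = 16) -> int:
    """Return the pixel shift that best aligns the two pupil profiles."""
    best_shift, best_cost = 0, np.inf
    valid = slice(max_shift, len(left) - max_shift)  # ignore wrapped ends
    for s in range(-max_shift, max_shift + 1):
        shifted = np.roll(right, s)
        cost = np.abs(left[valid] - shifted[valid]).sum()  # SAD cost
        if cost < best_cost:
            best_shift, best_cost = s, cost
    return best_shift  # mapped to a subject distance via calibration
```

The mapping from the returned pixel shift to a subject distance would depend on a calibration tied to the pupil separation and the sensor offset, which the patent leaves to the implementation.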
Fig. 5 schematically shows the positional relationship between the focal point FP of the lens 300 and the image sensor 320. The focal point FP is the point at which rays incident on the lens 300 parallel to the optical axis AX intersect the optical axis AX after passing through the lens 300. As shown in Fig. 5, the image plane 322 of the image sensor 320 is located at a position deviating from the focal point FP in the direction of the optical axis AX. Specifically, the image plane 322 of the image sensor 320 is located closer to the lens 300 than the focal point FP. That is, the image sensor 320 is arranged at a position deviating from the focal point FP such that the image plane 322 is closer to the lens 300 than the focal point FP.

In this way, the lens 300 is fixed such that the focal point FP is located behind the image sensor 320. As a result, the image of an infinitely distant subject is projected onto the image sensor 320 in a defocused state. The image of a close subject is also projected onto the image sensor 320 in a defocused state.

Fig. 6 schematically shows the MTF characteristics obtained with the lens 300 and the image sensor 320. The solid line 600 shows the MTF characteristic when the image plane 322 of the image sensor 320 is placed at a position deviating from the focal point FP, as in the imaging device 60. The dotted line 610 shows the MTF characteristic when the image plane 322 of the image sensor 320 is assumed to be located at the focal point FP. As described above, the MTF of the imaging device 60 is lower in the high spatial frequency range than in the case where the image plane 322 is assumed to be located at the focal point FP.
Fig. 7 schematically shows the output waveforms of the light receiving elements 410 when the image plane 322 of the image sensor 320 is assumed to be located at the focal point FP. Graph 710 shows the output waveform of the first phase difference elements including the light receiving elements 410a. Graph 720 shows the output waveform of the second phase difference elements including the light receiving elements 410b. In graphs 710 and 720, the horizontal axis represents the position in the horizontal direction, and the vertical axis represents the brightness value. Graphs 710 and 720 show the output waveforms of the light receiving elements 410 when a pattern with a strong brightness difference in the horizontal direction is used as the subject. As shown in graphs 710 and 720, when the sensor is close to being in focus on the subject, the edges become steep. That is, an edge is represented by the signals of only a few light receiving elements 410. The phase difference is then calculated by a correlation operation over an edge portion formed by fewer pixels, so the calculation accuracy of the phase difference decreases. In addition, since the generated phase difference also becomes small, the calculation accuracy of the phase difference is likely to decrease for this reason as well.

Fig. 8 schematically shows the output waveforms of the light receiving elements 410 when the image plane 322 of the image sensor 320 is located at a position deviating from the focal point FP. Graph 810 shows the output waveform of the first phase difference elements including the light receiving elements 410a. Graph 820 shows the output waveform of the second phase difference elements including the light receiving elements 410b. In graphs 810 and 820, the horizontal axis represents the position in the horizontal direction, and the vertical axis represents the brightness value. As in Fig. 7, graphs 810 and 820 show the output waveforms of the light receiving elements 410 when a pattern with a strong brightness difference in the horizontal direction is used as the subject. When the image plane 322 is located at a position deviating from the focal point FP, the blur produced in the subject image is larger than when the image plane 322 is located at the focal point FP. Therefore, as shown in graphs 810 and 820, the edges become gentle, and an edge can be represented by the signals of many light receiving elements 410. This improves the calculation accuracy of the phase difference when it is calculated by the correlation operation. Moreover, since the generated phase difference also becomes larger, the calculation accuracy of the phase difference improves.
To improve the calculation accuracy of the phase difference, it is preferable that at least a plurality of light receiving elements 410 can capture the image of an infinitely distant subject formed on the image plane 322 of the image sensor 320 through the lens 300. That is, it is preferable to arrange the image sensor 320 offset from the focal point FP along the optical axis AX so that the spread width of the image of an infinitely distant subject is greater than or equal to the interval between the light receiving elements 410. As shown in Fig. 4, the light receiving elements 410a and 410b are arranged alternately every two pixels in the horizontal pixel direction; that is, the interval between a pixel provided with a light receiving element 410a and a pixel provided with a light receiving element 410b is two pixels. Therefore, when the light receiving elements 410 are arranged every two pixels, it is preferable to offset the image sensor 320 from the focal point FP along the optical axis AX so that the spread width of the image of an infinitely distant subject is two pixels or more.

In addition, in order to allow an edge to be represented by the signals of even more light receiving elements 410, it is desirable to arrange the image sensor 320 offset from the focal point FP along the optical axis AX so that the spread width of the image of an infinitely distant subject is greater than or equal to the interval between light receiving elements 410 that receive light passing through a given pupil region. For example, as shown in Fig. 4, when the first phase difference elements including the light receiving elements 410a are arranged every four pixels and the second phase difference elements including the light receiving elements 410b are arranged every four pixels, it is preferable to offset the image sensor 320 from the focal point FP along the optical axis AX so that the spread width of the image of an infinitely distant subject is four pixels or more. As the spread width of the image of an infinitely distant subject, the half-value width of the point spread function, the half-value width of the line spread function, or the like can be applied.
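As a rough thin-lens sketch of the geometry implied here (this approximation is not given in the patent): let the lens 300 have focal length f and entrance pupil diameter A, so that its f-number is N = f/A. Offsetting the image plane 322 from the focal point FP by δ along the optical axis spreads the image of a point at infinity into a blur spot of width approximately

\[ b \;\approx\; \frac{A}{f}\,\delta \;=\; \frac{\delta}{N}. \]

Requiring the spread width to cover n pixels of pitch p (n = 2 for the two-pixel interval above, n = 4 for the four-pixel interval of same-side phase difference elements) then suggests a minimum sensor offset of roughly

\[ \delta \;\gtrsim\; n\,p\,N. \]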
In this way, when the light receiving elements 410 are arranged at a preset interval in the image sensor 320, it is preferable to arrange the image sensor 320 offset from the position of the focal point FP of the lens 300 along the optical axis of the lens 300 so that the spread width of the image of an infinitely distant subject on the image plane 322 of the image sensor 320 is greater than or equal to the preset interval.

As described above, in the imaging device 60, the positional relationship between the lens 300 and the image sensor 320 is fixed such that the focal point FP is located behind the image sensor 320 (on the side opposite to the object side). This improves the detection accuracy of the phase difference over a large distance range, from infinitely distant subjects to close subjects. In addition, the control unit 310 acquires brightness information and color information of the subject based on the image information of the visible light image obtained from the image sensor 320. For the exposure control of the imaging device 100, brightness information in pixel units is not necessarily required; it is often preferable to acquire average brightness information for each partial region of the image. Similarly, for the white balance adjustment of the imaging device 100, color information in pixel units is not necessarily required; it is often preferable to acquire average color information for each partial region of the image. Therefore, brightness information and color information can be acquired with sufficient accuracy even from an image accompanied by a certain degree of blur.
Fig. 9 schematically shows the imaging range 910 of the imaging device 60 and the imaging range 920 of the imaging device 100. The angle of view of the imaging device 60 is larger than that of the imaging device 100. The imaging range 910 of the imaging device 60 is larger than the imaging range 920 of the imaging device 100. As shown in Fig. 9, the imaging range 910 of the imaging device 60 includes the whole of the imaging range 920 of the imaging device 100.
The imaging range of the imaging device 60 is determined by the posture of the UAV 10. On the other hand, the imaging range 920 of the imaging device 100 is determined not only by the posture of the UAV 10 but also by the posture of the imaging device 100 controlled by the gimbal 50 and by the angle of view of the imaging device 100 controlled by the control unit 110. For any posture of the UAV 10, the imaging range 910 of the imaging device 60 can include the entire range over which the imaging range of the imaging device 100 can be changed by the control of the gimbal 50 and the control unit 110.
As described above, the control unit 310 generates distance measurement information by phase difference detection based on the signals generated by the light receiving elements 410 among the signals generated by the image sensor 320. Here, the control in the case where the current imaging range 920 of the imaging device 100 is changed to an imaging range 922 will be described. For example, when the gimbal 50 rotates the imaging device 100 about the yaw axis and the pitch axis so that the imaging range of the imaging device 100 is changed from the imaging range 920 to the imaging range 922, the control unit 110 acquires, from the distance measurement information generated by the control unit 310, the distance measurement information of the region corresponding to the imaging range 922. Based on the acquired distance measurement information, the control unit 110 controls the position of the focus lens included in the lenses 210 to perform focus control on the subject within the imaging range 922. This allows the control unit 110 to perform focus control quickly in response to a change in the imaging range of the imaging device 100.
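A minimal sketch of this refocus-on-range-change step follows. It assumes the distance measurement information from the imaging device 60 is held as an H×W distance map registered to the imaging range 910 and that the new imaging range 922 is known as a rectangle within it; the median rule and the helper name are illustrative assumptions, not details from the patent.

```python
# Pick a focus distance for the narrow camera's new imaging range by
# reading the corresponding region out of the wide camera's distance map.
import numpy as np

def focus_distance_for_roi(distance_map: np.ndarray,
                           roi: tuple[int, int, int, int]) -> float:
    """Representative subject distance for a (top, left, bottom, right) ROI."""
    top, left, bottom, right = roi
    region = distance_map[top:bottom, left:right]
    return float(np.median(region))  # robust against outlier pixels

# The controller would then convert this distance into a focus-lens
# position via the lens's calibration table and drive the actuator.
```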
When the imaging range of the imaging device 100 is changed from the imaging range 920 to the imaging range 922, the control unit 110 also acquires the brightness information that the control unit 310 detected from the region of the image of the imaging range 910 corresponding to the imaging range 922, and performs exposure control of the imaging device 100 based on the acquired brightness information. Exposure control can thus be performed quickly in response to a change in the imaging range of the imaging device 100. In addition, the control unit 110 acquires the color information that the control unit 310 detected from the region of the image of the imaging range 910 corresponding to the imaging range 922, and determines the parameters of the white balance processing for the image acquired by the image sensor 120 based on the acquired color information. The imaging device 100 can thereby quickly determine appropriate white balance parameters and perform white balance adjustment.

As described above, the control unit 310 generates distance measurement information for the imaging range of the imaging device 60 based on the signals generated by the light receiving elements 410. When the imaging range of the imaging device 100 is changed, the control unit 310 performs focus adjustment of the imaging device 100 based on the distance measurement information for the changed imaging range of the imaging device 100.

As described above, the imaging device 60 can capture a larger range than the imaging device 100 and acquire distance information, brightness information, and color information of the subject. The imaging device 60 can therefore be used as a 3A (AE/AF/AWB) camera. Since the imaging range of the imaging device 60 is set to include the imaging range of the imaging device 100, 3A-related control can be performed quickly using the distance information, brightness information, and color information acquired by the imaging device 60 even when the imaging range of the imaging device 100 changes due to the posture of the UAV 10 or the control of the gimbal 50. In addition, since the wavelength filters of some pixel regions of the image sensor 320 are IR filters, an infrared image can be acquired at the same time.
Fig. 10 schematically shows a distance measurement operation when a variable focus lens is used. A mode in which the position of the focal point of the lens 300 is variable will be described with reference to Fig. 10. The control unit 310 calculates the subject distance from the image information of the light receiving elements 410 obtained by imaging three times while changing the focal position of the lens 300. For example, the control unit 310 performs imaging with the imaging point of the lens 300 at a position 1010 and performs phase difference detection using the image information generated by the image sensor 320. The control unit 310 thereby calculates a distance D1 to the subject. Next, the control unit 310 moves the focal point of the lens 300, performs imaging with the imaging point at a position 1020, and performs phase difference detection using the image information generated by the image sensor 320. A distance D2 to the subject is thereby calculated. Next, the control unit 310 moves the focal point of the lens 300, performs imaging with the imaging point at a position 1030, and performs phase difference detection using the image information generated by the image sensor 320. A distance D3 to the subject is thereby calculated.
The control unit 110 determines the distance to the subject based on the values of the distances D1, D2, and D3. For example, the control unit 110 selects one of D1, D2, and D3 as the distance to the subject. As an example, the control unit 110 may calculate the Euclidean distance between every pair of D1, D2, and D3, select the two values that give the shortest Euclidean distance, and take one of the two selected values as the distance to the subject. Alternatively, the control unit 110 may select the two values that give the shortest Euclidean distance and take the average of the two selected values as the distance to the subject. Depending on the distance to the subject and the spatial frequency components of the subject itself, there may be an optimal focus position at which the phase difference can be detected with high accuracy. As described in connection with Fig. 10, there are cases where the phase difference can be detected more accurately by imaging multiple times while changing the position of the focal point of the lens 300 and performing phase difference detection.
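A minimal sketch of the pair-selection rule just described: among the three estimates, the two that agree most closely (the smallest absolute difference, which is the Euclidean distance for scalar values) are kept, and here their average is taken, which is one of the two options the text allows.

```python
# Fuse distance estimates taken at different focal positions by keeping
# the most mutually consistent pair and averaging it.
from itertools import combinations

def fuse_distances(estimates: list[float]) -> float:
    """Fuse distance estimates (e.g. [D1, D2, D3]) into one distance."""
    a, b = min(combinations(estimates, 2), key=lambda p: abs(p[0] - p[1]))
    return (a + b) / 2.0

print(fuse_distances([10.2, 10.4, 13.9]))  # -> 10.3; the outlier is dropped
```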
另外,控制部110也可以基于摄像装置100的摄像模式来确定镜头300的焦点的位置。控制部110可以选择对应于摄像装置100的摄像模式的预定位置作为镜头300的焦点位置。控制部110在摄像装置100的摄像模式设定为肖像模式时,与摄像装置100的摄像模式设定为风景模式的情况相比,可以将镜头300的焦点位置设定在接近图像传感器320的位置。In addition, the control unit 110 may determine the position of the focal point of the lens 300 based on the imaging mode of the imaging device 100. The control section 110 may select a predetermined position corresponding to the imaging mode of the imaging device 100 as the focal position of the lens 300. When the imaging mode of the imaging device 100 is set to the portrait mode, the control unit 110 can set the focal position of the lens 300 closer to the image sensor 320 than when the imaging mode of the imaging device 100 is set to the landscape mode. .
The focal position of the lens 300 can be changed by changing the position of the lens 300 on the optical axis AX. When the lens 300 includes a focusing lens, the focal position of the lens 300 can be changed by changing the position of the focusing lens on the optical axis AX. Although the above modification describes changing the focal position of the lens 300, the positional relationship between the lens 300 and the image sensor 320 on the optical axis AX may be changed instead. For example, instead of changing the focal position of the lens 300, the position of the image sensor 320 on the optical axis AX may be changed. Furthermore, the position of the image sensor 320 on the optical axis AX may be changed in addition to changing the focal position of the lens 300.
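Both alternatives shift the same relative position along the optical axis; a hypothetical sketch of the two degrees of freedom:

```python
# Hypothetical: the relative position of lens 300 and image sensor 320 on
# the optical axis AX can be changed by moving either element (or both).
def shift_imaging_relation(lens, image_sensor, delta: float,
                           move_sensor: bool = False) -> None:
    if move_sensor:
        image_sensor.position_on_axis += delta  # move the image sensor 320
    else:
        lens.focal_position_on_axis += delta    # move the lens or its focusing lens
```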
In the above embodiments, the imaging device 100 and the imaging device 60 are imaging devices mounted on the UAV 10, which is an example of an imaging system. The imaging device 100 and the imaging device 60 may be configured to be detachable from the UAV 10. At least one of the imaging device 100 and the imaging device 60 may be realized by a camera included in a portable terminal such as a smartphone. At least one of the imaging device 100 and the imaging device 60 need not be an imaging device mounted on a movable body such as the UAV 10. For example, the imaging device 100 may be an imaging device supported by a handheld gimbal. The imaging device 100 and the imaging device 60 may also be imaging devices supported by neither the UAV 10 nor a handheld gimbal. For example, at least one of the imaging device 100 and the imaging device 60 may be an imaging device that can be held and supported by a user's hand. At least one of the imaging device 100 and the imaging device 60 may be a fixedly installed imaging device such as a surveillance camera.
Fig. 11 shows an example of a computer 1200 in which a plurality of aspects of the present invention may be embodied in whole or in part. A program installed on the computer 1200 can cause the computer 1200 to function as an operation associated with the device according to an embodiment of the present invention or as one or more "parts" of that device. Alternatively, the program can cause the computer 1200 to execute that operation or those one or more "parts". The program enables the computer 1200 to execute the processes according to the embodiments of the present invention or the stages of those processes. Such a program may be executed by the CPU 1212 so as to cause the computer 1200 to execute specified operations associated with some or all of the blocks in the flowcharts and block diagrams described in this specification.
The computer 1200 of the present embodiment includes a CPU 1212 and a RAM 1214, which are connected to each other via a host controller 1210. The computer 1200 also includes a communication interface 1222 and an input/output unit, which are connected to the host controller 1210 via an input/output controller 1220. The computer 1200 further includes a ROM 1230. The CPU 1212 operates in accordance with programs stored in the ROM 1230 and the RAM 1214, thereby controlling each unit.
The communication interface 1222 communicates with other electronic devices via a network. A hard disk drive may store programs and data used by the CPU 1212 in the computer 1200. The ROM 1230 stores a boot program and the like executed by the computer 1200 at startup, and/or programs that depend on the hardware of the computer 1200. The programs are provided via a computer-readable recording medium such as a CD-ROM, a USB memory, or an IC card, or via a network. The programs are installed in the RAM 1214 or the ROM 1230, which are also examples of computer-readable recording media, and are executed by the CPU 1212. The information processing described in these programs is read by the computer 1200 and brings about cooperation between the programs and the various types of hardware resources described above. An apparatus or method may be constituted by realizing operations or processing of information in accordance with the use of the computer 1200.
For example, when communication is executed between the computer 1200 and an external device, the CPU 1212 may execute a communication program loaded in the RAM 1214 and instruct the communication interface 1222 to perform communication processing in accordance with the processing described in the communication program. Under the control of the CPU 1212, the communication interface 1222 reads transmission data stored in a transmission buffer provided in a recording medium such as the RAM 1214 or a USB memory, sends the read transmission data to the network, and writes reception data received from the network into a reception buffer or the like provided in the recording medium.
In addition, the CPU 1212 may cause the RAM 1214 to read all or a necessary portion of a file or database stored in an external recording medium such as a USB memory, and perform various types of processing on the data in the RAM 1214. The CPU 1212 may then write the processed data back to the external recording medium.
Various types of information, such as various types of programs, data, tables, and databases, may be stored in a recording medium and subjected to information processing. With respect to the data read from the RAM 1214, the CPU 1212 may execute various types of processing described throughout this disclosure and specified by the instruction sequences of programs, including various types of operations, information processing, condition judgment, conditional branching, unconditional branching, and information retrieval/replacement, and write the results back to the RAM 1214. In addition, the CPU 1212 may retrieve information in files, databases, and the like in the recording medium. For example, when a plurality of entries each having an attribute value of a first attribute associated with an attribute value of a second attribute are stored in the recording medium, the CPU 1212 may retrieve, from the plurality of entries, an entry that matches a condition specifying the attribute value of the first attribute, read the attribute value of the second attribute stored in that entry, and thereby obtain the attribute value of the second attribute associated with the first attribute satisfying the predetermined condition.
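The entry-retrieval step described here amounts to a keyed lookup over stored entries; a minimal sketch, with invented field names, might look like this:

```python
# Minimal sketch of retrieving the second attribute value associated with
# a first attribute value; "first_attr" and "second_attr" are invented
# names used only for illustration.
entries = [
    {"first_attr": "A", "second_attr": 1},
    {"first_attr": "B", "second_attr": 2},
]

def lookup_second_attribute(entries, wanted_first_attr):
    for entry in entries:
        if entry["first_attr"] == wanted_first_attr:  # condition match
            return entry["second_attr"]               # associated value
    return None

assert lookup_second_attribute(entries, "B") == 2
```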
The programs or software modules described above may be stored on the computer 1200 or on a computer-readable storage medium near the computer 1200. In addition, a recording medium such as a hard disk or a RAM provided in a server system connected to a dedicated communication network or the Internet can be used as a computer-readable storage medium, whereby the programs are provided to the computer 1200 via the network.
It should be noted that the order of execution of the processes, such as the actions, sequences, steps, and stages, in the devices, systems, programs, and methods shown in the claims, the specification, and the drawings may be realized in any order, as long as "before", "prior to", or the like is not explicitly indicated and as long as the output of a preceding process is not used in a subsequent process. Even where the operational flows in the claims, the specification, and the drawings are described using "first", "next", and the like for convenience, this does not mean that they must be carried out in that order.
The present invention has been described above using embodiments, but the technical scope of the present invention is not limited to the scope described in the above embodiments. It is apparent to a person of ordinary skill in the art that various changes or improvements can be made to the above embodiments. It is apparent from the description of the claims that modes incorporating such changes or improvements can also be included in the technical scope of the present invention.
Symbol Description
10 UAV
12 Remote operation device
20 UAV body
30 UAV control unit
36 Communication interface
37 Memory
40 Propulsion unit
41 GPS receiver
42 Inertial measurement unit
43 Magnetic compass
44 Barometric altimeter
45 Temperature sensor
46 Humidity sensor
50 Gimbal
60 Imaging device
100 Imaging device
102 Imaging unit
110 Control unit
120 Image sensor
130 Memory
200 Lens unit
210 Lens
212 Lens drive unit
214 Position sensor
220 Lens control unit
222 Memory
300 Lens
310 Control unit
320 Image sensor
322 Image plane
330 Memory
410, 420, 430 Light receiving elements
710, 720, 810, 820 Graphs
910, 920, 922 Imaging ranges
1010, 1020, 1030 Positions
1200 Computer
1210 Host controller
1212 CPU
1214 RAM
1220 Input/output controller
1222 Communication interface
1230 ROM

Claims (14)

  1. An apparatus, characterized by comprising: a first image sensor, the first image sensor including a plurality of light receiving elements that include a first light receiving element, the first light receiving element being configured to receive a light beam that passes through a first lens and is divided by a pupil, and the first image sensor being arranged to deviate from the focal position of the first lens by a predetermined distance or more along the optical axis direction of the first lens; and
    a circuit configured to generate, based on a signal generated by the first light receiving element, information used for control of a second imaging device, the second imaging device including a second image sensor.
  2. The apparatus according to claim 1, wherein the circuit is configured to generate, based on the signal generated by the first light receiving element, information used for focus adjustment of the second imaging device.
  3. The apparatus according to claim 1, wherein the first image sensor is arranged closer to the first lens side than the focal position of the first lens.
  4. The apparatus according to claim 1 or 2, wherein the first light receiving elements are arranged at a preset interval, and
    the first image sensor is arranged to deviate from the focal position of the first lens along the optical axis direction of the first lens such that the spread width of the image of an infinitely distant subject on the image plane of the first image sensor is greater than or equal to the preset interval.
  5. The apparatus according to claim 1 or 2, wherein the first lens is a fixed focus lens.
  6. The apparatus according to claim 1 or 2, wherein the first image sensor includes
    the first light receiving element, and
    a second light receiving element configured to receive a light beam in the wavelength range of visible light and generate an image signal, and
    the circuit is configured to generate, based on the image signal generated by the second light receiving element, information used for exposure control and color adjustment of the second imaging device.
  7. The apparatus according to claim 6, wherein the first image sensor further includes a third light receiving element configured to receive a light beam in a wavelength range of invisible light and generate an image signal, and
    the circuit is configured to generate an image in the wavelength range of invisible light based on the image signal generated by the third light receiving element.
  8. The apparatus according to claim 1 or 2, wherein the imaging range of a first imaging device including the first lens and the first image sensor is larger than the imaging range of the second imaging device.
  9. The apparatus according to claim 8, wherein the imaging range of the first imaging device is set to include the imaging range of the second imaging device.
  10. The apparatus according to claim 8, wherein the circuit is configured to:
    generate ranging information within the imaging range of the first imaging device based on the signal generated by the first light receiving element; and
    when the imaging range of the second imaging device is changed, perform focus adjustment of the second imaging device based on the ranging information within the changed imaging range of the second imaging device.
  11. An imaging device, characterized by comprising: the apparatus according to claim 1 or 2; and
    the first lens.
  12. An imaging system, characterized by comprising: the imaging device according to claim 11; and
    the second imaging device.
  13. A movable body, characterized in that it carries the imaging device according to claim 11 and moves.
  14. The movable body according to claim 13, characterized by further comprising: the second imaging device; and
    a support mechanism that supports the second imaging device such that an attitude of the second imaging device can be controlled.
PCT/CN2021/097705 2020-06-08 2021-06-01 Device, camera device, camera system, and movable member WO2021249245A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202180006043.9A CN114600024A (en) 2020-06-08 2021-06-01 Device, imaging system, and moving object

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020-099170 2020-06-08
JP2020099170A JP2021193412A (en) 2020-06-08 2020-06-08 Device, imaging device, imaging system, and mobile object

Publications (1)

Publication Number Publication Date
WO2021249245A1 true WO2021249245A1 (en) 2021-12-16

Family

ID=78845279

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/097705 WO2021249245A1 (en) 2020-06-08 2021-06-01 Device, camera device, camera system, and movable member

Country Status (3)

Country Link
JP (1) JP2021193412A (en)
CN (1) CN114600024A (en)
WO (1) WO2021249245A1 (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4373791A (en) * 1979-02-20 1983-02-15 Ricoh Company, Ltd. Focusing position detection apparatus
US20110199506A1 (en) * 2008-11-11 2011-08-18 Canon Kabushiki Kaisha Focus detection apparatus and control method therefor
CN102713713A (en) * 2009-10-13 2012-10-03 佳能株式会社 Focusing device and focusing method
CN104272161A (en) * 2012-05-01 2015-01-07 富士胶片株式会社 Imaging device and focus control method
CN104380167A (en) * 2012-04-25 2015-02-25 株式会社尼康 Focus detection device, focus adjustment device, and camera
CN109416458A (en) * 2016-06-30 2019-03-01 株式会社尼康 Camera
US20200125101A1 (en) * 2016-08-12 2020-04-23 Skydio, Inc. Unmanned aerial image capture platform
CN111226154A (en) * 2018-09-26 2020-06-02 深圳市大疆创新科技有限公司 Autofocus camera and system

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012239135A (en) * 2011-05-13 2012-12-06 Nikon Corp Electronic apparatus
JP2014056014A (en) * 2012-09-11 2014-03-27 Canon Inc Imaging element and imaging apparatus
JP6249636B2 (en) * 2013-05-28 2017-12-20 キヤノン株式会社 Imaging apparatus and control method thereof
JP2017049426A (en) * 2015-09-01 2017-03-09 富士通株式会社 Phase difference estimation device, phase difference estimation method, and phase difference estimation program
JP6702777B2 (en) * 2016-04-12 2020-06-03 キヤノン株式会社 Imaging device, imaging method, and program
US10250794B2 (en) * 2017-01-04 2019-04-02 Motorola Mobility Llc Capturing an image using multi-camera automatic focus
WO2018185940A1 (en) * 2017-04-07 2018-10-11 エスゼット ディージェイアイ テクノロジー カンパニー リミテッド Imaging control device, imaging device, imaging system, mobile body, imaging control method and program
JP2019086785A (en) * 2018-12-27 2019-06-06 株式会社ニコン Imaging element and imaging device

Also Published As

Publication number Publication date
JP2021193412A (en) 2021-12-23
CN114600024A (en) 2022-06-07

Similar Documents

Publication Publication Date Title
WO2018185939A1 (en) Imaging control device, imaging device, imaging system, mobile body, imaging control method and program
JP6496955B1 (en) Control device, system, control method, and program
US20210120171A1 (en) Determination device, movable body, determination method, and program
WO2020011230A1 (en) Control device, movable body, control method, and program
WO2019206076A1 (en) Control device, camera, moving body, control method and program
WO2020098603A1 (en) Determination device, camera device, camera system, moving object, determination method and program
WO2019174343A1 (en) Active body detection device, control device, moving body, active body detection method and procedure
WO2019085771A1 (en) Control apparatus, lens apparatus, photographic apparatus, flying body, and control method
WO2020216037A1 (en) Control device, camera device, movable body, control method and program
WO2018185940A1 (en) Imaging control device, imaging device, imaging system, mobile body, imaging control method and program
WO2021249245A1 (en) Device, camera device, camera system, and movable member
WO2021031833A1 (en) Control device, photographing system, control method, and program
JP6641574B1 (en) Determination device, moving object, determination method, and program
WO2020020042A1 (en) Control device, moving body, control method and program
JP6696092B2 (en) Control device, moving body, control method, and program
WO2019223614A1 (en) Control apparatus, photographing apparatus, moving body, control method, and program
JP6569157B1 (en) Control device, imaging device, moving object, control method, and program
WO2020216057A1 (en) Control device, photographing device, mobile body, control method and program
WO2021031840A1 (en) Device, photographing apparatus, moving body, method, and program
JP6878738B1 (en) Control devices, imaging systems, moving objects, control methods, and programs
JP6459012B1 (en) Control device, imaging device, flying object, control method, and program
WO2021143425A1 (en) Control device, photographing device, moving body, control method, and program
JP2020098289A (en) Control apparatus, photography system, moving body, control method, and program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21822372

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21822372

Country of ref document: EP

Kind code of ref document: A1