CN114600024A - Device, imaging system, and moving object - Google Patents

Device, imaging system, and moving object

Info

Publication number
CN114600024A
Authority
CN
China
Prior art keywords
image
lens
light receiving
receiving element
image sensor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202180006043.9A
Other languages
Chinese (zh)
Inventor
本庄谦一
高宫诚
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
SZ DJI Technology Co Ltd
Original Assignee
SZ DJI Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by SZ DJI Technology Co Ltd
Publication of CN114600024A

Classifications

    • G - PHYSICS
    • G02 - OPTICS
    • G02B - OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 7/00 - Mountings, adjusting means, or light-tight connections, for optical elements
    • G02B 7/28 - Systems for automatic generation of focusing signals
    • G - PHYSICS
    • G03 - PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B - APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B 13/00 - Viewfinders; Focusing aids for cameras; Means for focusing for cameras; Autofocus systems for cameras
    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05D - SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00 - Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot

Abstract

The device includes a plurality of light receiving elements including a first light receiving element configured to receive a light beam that has passed through a first lens and been pupil-divided. The device includes a first image sensor disposed so as to be offset from the focal position of the first lens by a predetermined distance or more in the optical axis direction of the first lens. The device includes a circuit configured to generate, based on a signal generated by the first light receiving element, information for controlling a second imaging device that includes a second image sensor.

Description

Device, imaging system, and moving object
Technical Field
The present invention relates to a device, an imaging system, and a moving object.
Background
Patent document 1 discloses a camera system that calculates a distance value for each of M × N pixels according to a TOF algorithm.
[Prior Art Documents]
[Patent Documents]
[Patent Document 1] Japanese patent publication JP 2019-508717 A
Disclosure of Invention
A device according to an aspect of the present invention includes a plurality of light receiving elements including a first light receiving element configured to receive a light beam that has passed through a first lens and been pupil-divided. The device includes a first image sensor disposed so as to be offset from the focal position of the first lens by a predetermined distance or more in the optical axis direction of the first lens. The device includes a circuit configured to generate, based on a signal generated by the first light receiving element, information for controlling a second imaging device that includes a second image sensor.
The circuit may be configured to generate information for focus adjustment of the second imaging device based on the signal generated by the first light receiving element.
The first image sensor may be disposed closer to the first lens than the focal position of the first lens.
The first light receiving elements may be disposed at preset intervals. The first image sensor may be disposed so as to be offset from the focal position of the first lens in the optical axis direction of the first lens such that the spread width of an image of an object at infinity on the image plane of the first image sensor is greater than or equal to the preset interval.
The first lens may be a fixed focus lens.
The first image sensor may include the first light receiving element and a second light receiving element, the second light receiving element being configured to receive a light beam in the wavelength range of visible light and generate an image signal. The circuit may be configured to generate information for exposure control and color adjustment of the second imaging device based on the image signal generated by the second light receiving element.
The first image sensor may include a third light receiving element configured to receive a light beam in a wavelength range of non-visible light and generate an image signal. The circuit may be configured to generate an image in the wavelength range of the non-visible light based on the image signal generated by the third light receiving element.
The imaging range of the first imaging device, which includes the first lens and the first image sensor, may be larger than the imaging range of the second imaging device.
The imaging range of the first imaging device may be set so as to include the imaging range of the second imaging device.
The circuit may be configured to generate distance measurement information within the imaging range of the first imaging device based on the signal generated by the first light receiving element. The circuit may be configured to, when the imaging range of the second imaging device is changed, perform focus adjustment of the second imaging device based on the distance measurement information within the changed imaging range of the second imaging device.
An imaging device according to one aspect includes the above device and the first lens.
An imaging system according to an aspect includes the above-described imaging device and the second imaging device.
A moving object according to one aspect carries the above-described imaging device and moves.
The moving object may include the second imaging device and a support mechanism that supports the second imaging device while controlling its posture.
According to the one aspect described above, information relating to the distance to the object can be acquired more accurately.
Moreover, the foregoing does not recite all of the necessary features of the present invention. Furthermore, sub-combinations of these feature sets may also constitute the invention.
Drawings
Fig. 1 shows one example of the appearance of an Unmanned Aerial Vehicle (UAV) 10 and a remote operation device 12.
Fig. 2 shows one example of the functional blocks of the UAV 10.
Fig. 3 shows one example of functional blocks of the image pickup device 60.
Fig. 4 schematically shows an arrangement pattern of the light receiving elements of the image sensor 320.
Fig. 5 schematically shows a positional relationship of the focal point FP of the lens 300 and the image sensor 320.
Fig. 6 schematically shows MTF characteristics obtained by the lens 300 and the image sensor 320.
Fig. 7 schematically shows the output waveform of the light receiving element 410 on the assumption that the image plane 322 of the image sensor 320 is located at the position of the focal point FP.
Fig. 8 schematically shows the output waveform of the light receiving element 410 when the image plane 322 of the image sensor 320 is located at a position deviated from the focal point FP.
Fig. 9 schematically illustrates an imaging range 910 of the imaging device 60 and an imaging range 920 of the imaging device 100.
Fig. 10 schematically shows a distance measuring operation when a variable focus lens is used.
Fig. 11 shows one example of a computer 1200.
Detailed Description
The present invention will be described below with reference to embodiments of the invention, but the following embodiments do not limit the invention according to the claims. Moreover, not all combinations of the features described in the embodiments are necessarily essential to the solution of the invention. It will be apparent to those skilled in the art that various changes and modifications can be made to the following embodiments, and it is apparent from the description of the claims that modes to which such changes or improvements are made are included in the technical scope of the present invention.
The claims, the specification, the drawings, and the abstract contain matter subject to copyright protection. The copyright owner has no objection to the facsimile reproduction of these documents by anyone as they appear in the patent office files or records, but otherwise reserves all copyright rights whatsoever.
Various embodiments of the present invention may be described with reference to flowcharts and block diagrams, where a block may represent (1) a stage of a process in which an operation is performed or (2) a "section" of a device that has the role of performing an operation. Specific stages and "sections" may be implemented by programmable circuits and/or processors. Dedicated circuits may include digital and/or analog hardware circuits, and may include integrated circuits (ICs) and/or discrete circuits. Programmable circuits may include reconfigurable hardware circuits. Reconfigurable hardware circuits may include logical AND, logical OR, logical XOR, logical NAND, logical NOR, and other logical operations, as well as memory elements such as flip-flops, registers, field programmable gate arrays (FPGAs), and programmable logic arrays (PLAs).
A computer-readable medium may include any tangible device that can store instructions for execution by a suitable device. As a result, a computer-readable medium having instructions stored thereon constitutes an article of manufacture including instructions that can be executed to create means for implementing the operations specified in the flowcharts or block diagrams. Examples of the computer-readable medium may include an electronic storage medium, a magnetic storage medium, an optical storage medium, an electromagnetic storage medium, a semiconductor storage medium, and the like. More specific examples of the computer-readable medium may include a floppy (registered trademark) disk, a flexible disk, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an electrically erasable programmable read-only memory (EEPROM), a static random access memory (SRAM), a compact disc read-only memory (CD-ROM), a digital versatile disc (DVD), a Blu-ray (registered trademark) disc, a memory stick, an integrated circuit card, and the like.
Computer-readable instructions may include either source code or object code written in any combination of one or more programming languages, including an object-oriented programming language such as Smalltalk (registered trademark), JAVA (registered trademark), or C++, and a conventional procedural programming language such as the "C" programming language or similar programming languages. The instructions may take the form of assembly instructions, instruction set architecture (ISA) instructions, machine instructions, machine-dependent instructions, microcode, firmware instructions, or state-setting data. The computer-readable instructions may be provided to a processor or programmable circuit of a general-purpose computer, a special-purpose computer, or other programmable data processing apparatus, either locally or via a local area network (LAN) or a wide area network (WAN) such as the Internet. The processor or programmable circuit may execute the computer-readable instructions to create means for implementing the operations specified in the flowcharts or block diagrams. Examples of processors include computer processors, processing units, microprocessors, digital signal processors, controllers, microcontrollers, and the like.
Fig. 1 shows an example of the appearance of an Unmanned Aerial Vehicle (UAV) 10 and a remote operation device 12. The UAV 10 includes a UAV body 20, a gimbal 50, a plurality of imaging devices 60, and an imaging device 100. The gimbal 50, the imaging device 100, and the imaging devices 60 are one example of an imaging system. The UAV 10 is an example of a moving object, a concept that includes a flying body moving in the air, a vehicle moving on the ground, a ship moving on water, and the like. A flying body moving in the air is a concept that includes not only the UAV but also other aircraft, airships, helicopters, and the like that move in the air.
The UAV body 20 includes a plurality of rotors. Multiple rotors are one example of a propulsion section. The UAV body 20 flies the UAV10 by controlling the rotation of the plurality of rotors. The UAV body 20 uses, for example, four rotors to fly the UAV 10. The number of rotors is not limited to four. Alternatively, the UAV10 may be a fixed wing aircraft without a rotor.
The imaging device 100 is an imaging camera that captures an object included in a desired imaging range. The gimbal 50 rotatably supports the imaging device 100 and is an example of a support mechanism that can support the imaging device 100 while controlling its posture. For example, the gimbal 50 rotatably supports the imaging device 100 about the pitch axis using an actuator, and further rotatably supports it about the roll axis and the yaw axis using actuators. The gimbal 50 can change the attitude of the imaging device 100 by rotating the imaging device 100 about at least one of the yaw axis, the pitch axis, and the roll axis.
The imaging device 60 is a sensing camera that images the surroundings of the UAV 10 in order to control the flight of the UAV 10. Two imaging devices 60 may be provided at the nose, i.e., the front, of the UAV 10, and two other imaging devices 60 may be provided on the bottom surface of the UAV 10. The two imaging devices 60 on the front side can function as a so-called stereo camera, and the two imaging devices 60 on the bottom surface side may likewise be paired to function as a stereo camera. Three-dimensional spatial data around the UAV 10 may be generated based on the images captured by the plurality of imaging devices 60. At least one of the plurality of imaging devices 60 is an imaging device for acquiring information used for control of the imaging device 100, including focus control and exposure control. The number of imaging devices 60 included in the UAV 10 is not limited; it is sufficient that the UAV 10 includes at least one imaging device 60. The UAV 10 may also include at least one imaging device 60 at each of the nose, tail, sides, bottom, and top of the UAV 10. The angle of view settable in the imaging device 60 may be larger than the angle of view settable in the imaging device 100. At least one of the plurality of imaging devices 60 may have a fixed focus lens or a fisheye lens.
The remote operation device 12 communicates with the UAV 10 to remotely operate the UAV 10. The remote operation device 12 may communicate with the UAV 10 wirelessly. The remote operation device 12 transmits to the UAV 10 instruction information indicating various instructions related to the movement of the UAV 10, such as ascending, descending, accelerating, decelerating, advancing, retreating, and rotating. The instruction information includes, for example, an instruction to raise the altitude of the UAV 10. The instruction may indicate an altitude at which the UAV 10 should be located, and the UAV 10 moves so as to be located at the altitude indicated by the instruction received from the remote operation device 12. The instruction may include an ascent command to raise the UAV 10; the UAV 10 ascends while it is receiving the ascent command. When the altitude of the UAV 10 has reached an upper limit altitude, the UAV 10 may be restricted from ascending even if the ascent command is accepted.
Fig. 2 shows one example of the functional blocks of the UAV 10. The UAV 10 includes a UAV control section 30, a memory 37, a communication interface 36, a propulsion section 40, a GPS receiver 41, an inertial measurement unit 42 (IMU 42), a magnetic compass 43, a barometric altimeter 44, a temperature sensor 45, a humidity sensor 46, the gimbal 50, the imaging device 60, and the imaging device 100.
The communication interface 36 communicates with other devices such as the remote operation device 12. The communication interface 36 may receive instruction information including various instructions to the UAV control section 30 from the remote operation device 12. The memory 37 stores programs and the like necessary for the UAV control section 30 to control the propulsion section 40, the GPS receiver 41, the IMU 42, the magnetic compass 43, the barometric altimeter 44, the temperature sensor 45, the humidity sensor 46, the gimbal 50, the imaging device 60, and the imaging device 100. The memory 37 may be a computer-readable recording medium, and may include at least one of SRAM, DRAM, EPROM, EEPROM, USB memory, and flash memory such as a solid state drive (SSD). The memory 37 may be disposed inside the UAV body 20, and may be configured to be detachable from the UAV body 20.
The UAV control unit 30 controls the flight and shooting of the UAV10 according to a program stored in the memory 37. The UAV control unit 30 may be constituted by a microprocessor such as a CPU or MPU, a microcontroller such as an MCU, or the like. The UAV control section 30 controls the flight and shooting of the UAV10 in accordance with instructions received from the remote operation device 12 via the communication interface 36. The propulsion portion 40 propels the UAV 10. The propulsion section 40 includes a plurality of rotors and a plurality of drive motors that rotate the plurality of rotors. The propulsion unit 40 rotates the plurality of rotors via the plurality of drive motors in accordance with instructions from the UAV control unit 30 to fly the UAV 10.
The GPS receiver 41 receives a plurality of signals indicating times transmitted from a plurality of GPS satellites. The GPS receiver 41 calculates the position (latitude and longitude) of the GPS receiver 41, that is, the position (latitude and longitude) of the UAV10, from the plurality of received signals. The IMU42 detects the pose of the UAV 10. The IMU42 detects the acceleration of the UAV10 in the three-axis directions of the front-back, left-right, and up-down, and the angular velocity of the UAV10 in the three-axis directions of the pitch axis, roll axis, and yaw axis. The magnetic compass 43 detects the orientation of the nose of the UAV 10. The barometric altimeter 44 detects the altitude of the UAV 10. The barometric altimeter 44 detects the barometric pressure around the UAV10 and converts the detected barometric pressure into altitude to detect altitude. The temperature sensor 45 detects the temperature around the UAV 10. The humidity sensor 46 detects the humidity around the UAV 10.
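As a hedged illustration of the pressure-to-altitude conversion performed by the barometric altimeter 44, a commonly used approximation is the international barometric formula; the function name and sea-level reference pressure below are assumptions for illustration only and are not taken from this disclosure.

```python
# Illustrative only: a common way to convert static pressure to altitude
# (international barometric formula). The actual conversion used by the
# barometric altimeter 44 is not specified in this disclosure.
def pressure_to_altitude_m(pressure_pa: float, sea_level_pa: float = 101325.0) -> float:
    """Approximate altitude in meters from measured static pressure."""
    return 44330.0 * (1.0 - (pressure_pa / sea_level_pa) ** 0.1903)
```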
The imaging device 100 includes an imaging section 102 and a lens section 200. The lens section 200 is one example of a lens device. The imaging section 102 includes an image sensor 120, a control unit 110, a memory 130, and a distance measuring sensor. The image sensor 120 may be formed of a CCD or a CMOS; it captures an optical image formed via the plurality of lenses 210 and outputs the captured image to the control unit 110. The control unit 110 generates an image for recording by performing image processing based on the pixel information read from the image sensor 120, and stores the image in the memory 130. The control unit 110 may be constituted by a microprocessor such as a CPU or MPU, a microcontroller such as an MCU, or the like. The control unit 110 may control the imaging device 100 in accordance with an operation command for the imaging device 100 from the UAV control section 30. The control unit 110 may be at least a part of the circuit in the present invention. The memory 130 may be a computer-readable recording medium and may include at least one of SRAM, DRAM, EPROM, EEPROM, USB memory, and flash memory such as a solid state drive (SSD). The memory 130 stores a program and the like necessary for the control unit 110 to control the image sensor 120 and the like. The memory 130 may be provided inside the housing of the imaging device 100, and may be configured to be detachable from the housing of the imaging device 100.
The distance measurement sensor measures a distance to the subject. The ranging sensor may measure a distance to a specified object. The ranging sensor may be an infrared sensor, an ultrasonic sensor, a stereo camera, a TOF (Time of flight) sensor, or the like.
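For reference only, and as a general property of TOF sensors rather than a statement specific to this disclosure: a TOF sensor typically obtains the distance d from the round-trip time Δt of emitted light as d = c·Δt/2, where c is the speed of light; for example, Δt ≈ 66.7 ns corresponds to d ≈ 10 m.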
The lens part 200 includes a plurality of lenses 210, a plurality of lens driving parts 212, and a lens control part 220. The plurality of lenses 210 may function as a zoom lens, a variable focal length lens, and a focus lens. At least a part or all of the plurality of lenses 210 are configured to be movable along the optical axis. The lens portion 200 may be an interchangeable lens that is provided to be attachable to and detachable from the image pickup portion 102. The lens driving section 212 moves at least a part or all of the plurality of lenses 210 along the optical axis via a mechanism member such as a cam ring. The lens driving part 212 may include an actuator. The actuator may comprise a stepper motor. The lens control section 220 drives the lens driving section 212 in accordance with a lens control instruction from the image pickup section 102 to move the one or more lenses 210 in the optical axis direction via the mechanism member. The lens control command is, for example, a zoom control command and a focus control command.
The lens portion 200 also includes a memory 222 and a position sensor 214. The lens control unit 220 controls the lens 210 to move in the optical axis direction via the lens driving unit 212 in accordance with a lens operation command from the imaging unit 102. A part or all of the lens 210 moves along the optical axis. The lens control section 220 performs at least one of a zooming operation and a focusing operation by moving at least one of the lenses 210 along the optical axis. The position sensor 214 detects the position of the lens 210. The position sensor 214 may detect a current zoom position or focus position.
The lens driving part 212 may include a shake correction mechanism. The lens control section 220 may perform shake correction by moving the lens 210 in a direction along the optical axis or a direction perpendicular to the optical axis via the shake correction mechanism. The lens driving section 212 may drive the shake correction mechanism by a stepping motor to perform shake correction. In addition, the shake correction mechanism may be driven by a stepping motor to move the image sensor 120 in a direction along the optical axis or a direction perpendicular to the optical axis to perform shake correction.
The memory 222 stores control values of the plurality of lenses 210 moved via the lens driving section 212. The memory 222 may include at least one of SRAM, DRAM, EPROM, EEPROM, USB memory, and other flash memory.
Fig. 3 shows one example of functional blocks of the imaging device 60, which acquires images for generating information used for control of the imaging device 100. The imaging device 60 includes a lens 300, an image sensor 320, a control unit 310, and a memory 330. The control unit 310 exchanges information with the control unit 110 via the UAV control section 30.
The control unit 310 may be constituted by a microprocessor such as a CPU or MPU, a microcontroller such as an MCU, or the like. The memory 330 may be a computer-readable recording medium, and may include at least one of SRAM, DRAM, EPROM, EEPROM, USB memory, and flash memory such as a solid state drive (SSD). The memory 330 stores programs and the like necessary for the control unit 310 to control the image sensor 320 and the like. The control unit 310 included in the imaging device 60, the control unit 110 included in the imaging device 100, and the UAV control section 30 may each function as at least a part of the circuit.
The image sensor 320 may be formed of a CCD or a CMOS. The image sensor 320 captures an optical image formed by the light beam passing through the lens 300, and outputs the captured image to the control section 310. The control section 310 performs image processing based on the pixel information read from the image sensor 320, thereby generating an image, and stores it in the memory 330.
In the present embodiment, the lens 300 is a fixed focus lens, and the positional relationship between the lens 300 and the image sensor 320 is fixed. As described later, the image sensor 320 is disposed so as to be offset from the position of the focal point FP of the lens 300 by a predetermined distance or more in the direction of the optical axis AX of the lens 300. For example, the image sensor 320 is disposed closer to the lens 300 than the focal position of the lens 300. That is, the imaging device 60 is designed to capture an object in a so-called back-focus state. The control unit 310 acquires distance measurement information by a phase difference detection method using an object image captured in this back-focus state, and generates information for focus adjustment of the imaging device 100.
Fig. 4 schematically shows an arrangement pattern of the light receiving elements of the image sensor 320. The image sensor 320 includes a plurality of light receiving elements arranged two-dimensionally. In fig. 4, "G" indicates a light receiving element that receives light passing through a filter that transmits light in the green wavelength range. "R" indicates a light receiving element that receives light passing through a filter that transmits light in the red wavelength range. "B" indicates a light receiving element that receives light passing through a filter that transmits light in the blue wavelength range. "IR" indicates a light receiving element that receives light passing through a filter that transmits light in the infrared wavelength range. Each light receiving element generates the information of one pixel in an image.
The image sensor 320 includes a plurality of light receiving elements including a light receiving element 410a, a light receiving element 410b, a light receiving element 420G, a light receiving element 420Ga, a light receiving element 420Gb, a light receiving element 420B, a light receiving element 420R, and a light receiving element 430. The image sensor 320 has a pixel array in which the 4 × 4 pixel array shown in fig. 4 is repeated in a matrix. The light receiving elements 420G, 420Ga, 420Gb, 420B, and 420R may be collectively referred to as "light receiving elements 420". The light receiving element 410a and the light receiving element 410b may be collectively referred to as "light receiving elements 410".
The light receiving element 410 is one example of a first light receiving element that receives the light beam that has passed through the lens 300 and has been pupil-divided. That is, the light receiving element 410 is a light receiving element for distance measurement based on phase difference detection. The control unit 310 generates information for focus adjustment of the imaging device 100 based on the signal generated by the light receiving element 410. More generally, the control unit 310 generates information for controlling the imaging device 100, which includes the image sensor 120, based on the signal generated by the light receiving element 410.
The light receiving element 420 is an example of a second light receiving element that receives a light beam of a wavelength range of visible light and generates an image signal. The light receiving element 430 is an example of a third light receiving element that receives a light beam of a wavelength range of non-visible light and generates an image signal. In the present embodiment, the light receiving element 430 receives light in the infrared wavelength range.
The control unit 310 generates information for exposure control and color adjustment of the imaging device 100 based on the image signal generated by the light receiving element 420. Specifically, the control unit 310 generates a visible light image, which is an image formed by light in the wavelength range of visible light, based on the image signal generated by the light receiving element 420. The control unit 310 generates luminance information of each region within the imaging range of the imaging device 60 based on the luminance information of the visible light image. Further, the control unit 310 generates color information of each region within the imaging range of the imaging device 60 based on the luminance information of each color of the visible light image. The control unit 310 outputs the generated luminance information and color information to the control unit 110. The control unit 110 performs automatic exposure control of the imaging device 100 based on the luminance information output from the control unit 310, and adjusts the white balance of the image generated by the image sensor 120 of the imaging device 100 based on the color information output from the control unit 310. In this way, the control unit 310 generates information for exposure control and color adjustment of the imaging device 100 based on the image signal generated by the light receiving element 420.
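A minimal sketch of region-wise luminance and color statistics of the kind described above is shown below. The grid division, the BT.601 luma weights, and all function and variable names are assumptions for illustration; the disclosure does not specify how the control unit 310 actually computes these values.

```python
# Illustrative sketch (not the disclosed implementation): divide the visible
# light image from image sensor 320 into a grid of regions and compute the
# average luminance and average R/G/B values per region, as hints for the
# exposure control and white balance adjustment of imaging device 100.
import numpy as np

def region_statistics(rgb_image: np.ndarray, grid: tuple = (8, 8)):
    """Return (luma_map, color_map) with one entry per grid region."""
    h, w, _ = rgb_image.shape
    gh, gw = grid
    luma_map = np.zeros((gh, gw))
    color_map = np.zeros((gh, gw, 3))
    for i in range(gh):
        for j in range(gw):
            block = rgb_image[i * h // gh:(i + 1) * h // gh,
                              j * w // gw:(j + 1) * w // gw]
            color_map[i, j] = block.reshape(-1, 3).mean(axis=0)  # mean R, G, B
            r, g, b = color_map[i, j]
            luma_map[i, j] = 0.299 * r + 0.587 * g + 0.114 * b   # BT.601 luma
    return luma_map, color_map
```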
The light receiving element 430 generates an image signal formed by light in the infrared wavelength range. The control unit 310 generates an infrared image, which is an image formed by light in the infrared wavelength range, based on the image signal generated by the light receiving element 430. The infrared image is used, for example, as an image for analyzing the temperature distribution, and also as an image for analyzing a subject in a dark place. In this way, the control unit 310 generates an image in the wavelength range of non-visible light based on the image signal generated by the light receiving element 430.
The light receiving element 410a is disposed on the right side within one pixel region, and the light receiving element 410b is disposed on the left side within one pixel region. Specifically, in the pixel region where the light receiving element 410a and the light receiving element 420Ga are provided, the light receiving element 410a is on the right side and the light receiving element 420Ga is on the left side as viewed from the object side. In the pixel region where the light receiving element 410b and the light receiving element 420Gb are provided, the light receiving element 410b is on the left side and the light receiving element 420Gb is on the right side as viewed from the object side. The light receiving element 410b and the light receiving element 420Gb are thus arranged side by side in the horizontal pixel direction within one pixel region. Thereby, the light receiving element 410a and the light receiving element 410b mainly receive light beams that pass through mutually different pupil regions of the exit pupil of the lens 300.
The plurality of light receiving elements 410 provided on the right side within one pixel region, like the light receiving element 410a, may be referred to as "first phase difference elements". The plurality of light receiving elements 410 provided on the left side within one pixel region, like the light receiving element 410b, may be referred to as "second phase difference elements". As shown in fig. 4, the light receiving elements 410a and 410b are arranged alternately every two pixels in the horizontal pixel direction; that is, the interval between a pixel provided with the light receiving element 410a and a pixel provided with the light receiving element 410b is two pixels. Further, one first phase difference element including the light receiving element 410a is provided every four pixels in the horizontal pixel direction, and one second phase difference element including the light receiving element 410b is provided every four pixels in the horizontal pixel direction.
The control unit 310 detects a phase difference using the correlation between an image generated by the first phase difference elements including the light receiving element 410a and an image generated by the second phase difference elements including the light receiving element 410b, and acquires distance measurement information based on the detected phase difference. In this way, the imaging device 60 acquires distance information of the object by the phase difference detection method. The control unit 310 generates information for focus adjustment of the imaging device 100 based on the signal generated by the light receiving elements 410. The control unit 310 may also generate information for controlling the zoom value or the angle of view of the lens 300 based on a signal generated by the light receiving elements for phase difference detection.
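One simple way such a correlation search can be carried out is sketched below, assuming two one-dimensional luminance profiles sampled along a horizontal line from the first and second phase difference elements and a sum-of-absolute-differences (SAD) cost; the names, the search range, and the SAD choice are assumptions, not details taken from this disclosure, and converting the detected shift to an object distance additionally requires lens calibration data that is not given here.

```python
# Illustrative sketch of a phase difference search by correlation (SAD) between
# the profiles of the first and second phase difference elements.
import numpy as np

def detect_phase_shift(left: np.ndarray, right: np.ndarray, max_shift: int = 16) -> int:
    """Return the integer shift (in element pitches) that minimizes the SAD."""
    n = min(len(left), len(right))
    best_shift, best_cost = 0, float("inf")
    for s in range(-max_shift, max_shift + 1):
        a = left[max(0, s):n + min(0, s)]      # overlapping part of the left profile
        b = right[max(0, -s):n - max(0, s)]    # overlapping part of the right profile
        if len(a) == 0:
            continue
        cost = np.abs(a.astype(float) - b.astype(float)).mean()  # normalized SAD
        if cost < best_cost:
            best_shift, best_cost = s, cost
    return best_shift
```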
Fig. 5 schematically shows the positional relationship between the focal point FP of the lens 300 and the image sensor 320. The focal point FP is the point at which a ray that is parallel to the optical axis AX of the lens 300 and is incident on the lens 300 intersects the optical axis AX after passing through the lens 300. As shown in fig. 5, the image plane 322 of the image sensor 320 is located at a position offset from the focal point FP in the direction of the optical axis AX. Specifically, the image plane 322 of the image sensor 320 is located closer to the lens 300 than the focal point FP. In other words, the image sensor 320 is disposed at a position offset from the position of the focal point FP so that the image plane 322 is closer to the lens 300 than the focal point FP.
Thus, the lens 300 is fixed such that the focal point FP is located behind the image sensor 320. As a result, an image of an object at infinity is projected onto the image sensor 320 in a defocused (blurred) state, and an image of a close-distance object is likewise projected onto the image sensor 320 in a defocused state.
Fig. 6 schematically shows the MTF characteristics obtained with the lens 300 and the image sensor 320. The solid line 600 represents the MTF characteristic when the image plane 322 of the image sensor 320 is set at a position offset from the focal point FP, as in the imaging device 60. The dotted line 610 represents the MTF characteristic assuming that the image plane 322 of the image sensor 320 is located at the position of the focal point FP. As shown, the MTF of the imaging device 60 is lower in the high spatial frequency region than in the case where the image plane 322 is assumed to be at the position of the focal point FP.
Fig. 7 schematically shows the output waveform of the light receiving elements 410 on the assumption that the image plane 322 of the image sensor 320 is located at the position of the focal point FP. The graph 710 shows the output waveform of the first phase difference elements including the light receiving element 410a, and the graph 720 shows the output waveform of the second phase difference elements including the light receiving element 410b. In the graphs 710 and 720, the horizontal axis represents the position in the horizontal direction and the vertical axis represents the luminance value. The graphs 710 and 720 show the output waveforms of the light receiving elements 410 when a pattern having a strong luminance difference in the horizontal direction is taken as the object. As shown in the graphs 710 and 720, when the image is close to the in-focus state with respect to the object, the edge becomes steep; that is, the edge is represented by the signals of only a few light receiving elements 410. The phase difference must therefore be calculated by a correlation operation on an edge portion formed by few pixels, so the calculation accuracy of the phase difference decreases. In addition, since the phase difference that arises is itself small, its calculation accuracy is also likely to decrease.
Fig. 8 schematically shows the output waveform of the light receiving elements 410 when the image plane 322 of the image sensor 320 is located at a position offset from the focal point FP. The graph 810 shows the output waveform of the first phase difference elements including the light receiving element 410a, and the graph 820 shows the output waveform of the second phase difference elements including the light receiving element 410b. In each of the graphs 810 and 820, the horizontal axis represents the position in the horizontal direction and the vertical axis represents the luminance value. As in fig. 7, the graphs 810 and 820 show the output waveforms of the light receiving elements 410 when a pattern having a strong luminance difference in the horizontal direction is the object. When the image plane 322 is located at a position offset from the focal point FP, the blur in the object image becomes larger than when the image plane 322 is at the position of the focal point FP. As shown in the graphs 810 and 820, the edges therefore become gentle and can be represented by the signals of many light receiving elements 410, which improves the accuracy of the correlation-based phase difference calculation. Further, the phase difference that arises also becomes larger, which further improves the calculation accuracy.
In order to improve the calculation accuracy of the phase difference, it is preferable that the image of an object at infinity formed on the image plane 322 of the image sensor 320 through the lens 300 be captured by at least a plurality of light receiving elements 410. That is, it is preferable to dispose the image sensor 320 offset from the focal point FP in the direction of the optical axis AX so that the spread width of the image of the object at infinity is greater than or equal to the interval between the light receiving elements 410. As shown in fig. 4, the light receiving elements 410a and 410b are arranged alternately every two pixels in the horizontal pixel direction; that is, the interval between a pixel provided with the light receiving element 410a and a pixel provided with the light receiving element 410b is two pixels. Thus, when the light receiving elements 410 are arranged every two pixels, it is preferable to offset the image sensor 320 from the focal point FP in the direction of the optical axis AX so that the spread width of the image of the object at infinity is two pixels or more.
In addition, in order to be able to represent an edge with the signals of more light receiving elements 410, it is desirable to dispose the image sensor 320 offset from the focal point FP in the direction of the optical axis AX so that the spread width of the image of the object at infinity is greater than or equal to the interval between the light receiving elements 410 that receive light passing through the same specified pupil region. For example, as shown in fig. 4, when one first phase difference element including the light receiving element 410a is arranged every four pixels and one second phase difference element including the light receiving element 410b is arranged every four pixels, it is preferable to dispose the image sensor 320 offset from the focal point FP in the direction of the optical axis AX so that the spread width of the image of the object at infinity is four pixels or more. As the spread width of the image of the object at infinity, the half-value width of the point spread function, the half-value width of the line spread function, or the like can be used.
In this way, when the light receiving elements 410 in the image sensor 320 are disposed at preset intervals, it is preferable that the image sensor 320 be disposed at a position offset from the focal point FP of the lens 300 in the optical axis direction of the lens 300 so that the spread width of the image of an object at infinity on the image plane 322 of the image sensor 320 is greater than or equal to the preset interval.
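A rough geometric sketch of this condition is given below. It assumes an ideal lens of f-number N, for which an axial offset d of the image plane from the focal point FP spreads an infinity point image into a blur of diameter roughly d / N; the f-number, pixel pitch, and function name are example assumptions and this is not a design rule stated in the disclosure.

```python
# Illustrative approximation only: estimate the smallest image-plane offset from
# the focal point FP that spreads an infinity point image over at least the
# interval of the phase difference elements.
def minimum_offset_m(f_number: float, pixel_pitch_m: float, interval_pixels: int) -> float:
    """Smallest axial offset giving a blur at least `interval_pixels` wide."""
    required_blur_m = interval_pixels * pixel_pitch_m
    return f_number * required_blur_m  # blur diameter ~ offset / f_number

# Example: an f/2.0 lens, 3 um pixel pitch, elements every 2 pixels
# -> offset of at least 2.0 * 2 * 3e-6 = 12 um from the focal point.
```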
As described above, in the imaging device 60, the positional relationship between the lens 300 and the image sensor 320 is fixed so that the focal point FP is located on the back side (the side opposite to the object side) of the image sensor 320. This can improve the detection accuracy of the phase difference over a wide distance range from an infinite-distance object to a short-distance object. Further, the control section 310 acquires luminance information and color information of the subject based on image information of the visible light image obtained from the image sensor 320. In order to control the exposure of the imaging apparatus 100, luminance information on a pixel basis is not necessarily required, and it may be preferable to acquire average luminance information of each partial region of an image. Similarly, color information on a pixel basis is not necessarily required for white balance adjustment of the imaging apparatus 100, and it may be preferable to acquire average color information of each partial region of an image. Therefore, luminance information and color information can be acquired with sufficient accuracy even from an image accompanied by a certain degree of blur.
Fig. 9 schematically illustrates an image capturing range 910 of the image capturing apparatus 60 and an image capturing range 920 of the image capturing apparatus 100. The angle of view of the image capture device 60 is greater than the angle of view of the image capture device 100. The imaging range 910 of the imaging device 60 is larger than the imaging range 920 of the imaging device 100. As shown in fig. 9, the imaging range 910 of the imaging apparatus 60 includes the entire imaging range 920 of the imaging apparatus 100.
In addition, the imaging range 910 of the imaging device 60 is determined by the attitude of the UAV 10. On the other hand, the imaging range 920 of the imaging device 100 is determined not only by the attitude of the UAV 10 but also by the attitude of the imaging device 100 controlled by the gimbal 50 and the angle of view of the imaging device 100 controlled by the control unit 110. Whatever the attitude of the UAV 10, the imaging range 910 of the imaging device 60 may include all of the imaging ranges that the imaging device 100 can take under the control of the gimbal 50 and the control unit 110.
As described above, the control unit 310 generates the distance measurement information by phase difference detection based on the signals generated by the light receiving elements 410 among the signals generated by the image sensor 320. Here, control in the case where the current imaging range 920 of the imaging device 100 is changed to the imaging range 922 will be described. For example, when the gimbal 50 rotates the imaging device 100 about the yaw axis and the pitch axis and the imaging range of the imaging device 100 changes from the imaging range 920 to the imaging range 922, the control unit 110 acquires, from the distance measurement information generated by the control unit 310, the distance measurement information of the region corresponding to the imaging range 922. The control unit 110 controls the position of the focus lens included in the lenses 210 based on the acquired distance measurement information, thereby performing focus control on the object within the imaging range 922. Thus, the control unit 110 can quickly perform focus control in response to a change in the imaging range of the imaging device 100.
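A minimal sketch of reusing the ranging information already generated over the wide imaging range 910 when the imaging range changes to the imaging range 922 is shown below. Representing the imaging range 922 as a simple pixel rectangle in the map of the imaging range 910, using the median as the representative subject distance, and all names are assumptions for illustration only.

```python
# Illustrative sketch: crop the precomputed range map of imaging range 910 to the
# region corresponding to imaging range 922 and pick a representative distance
# to drive the focus control of imaging device 100.
import numpy as np

def focus_distance_for_new_range(range_map_m: np.ndarray, rect: tuple) -> float:
    """rect = (top, left, bottom, right) of range 922 inside the map of range 910."""
    top, left, bottom, right = rect
    region = range_map_m[top:bottom, left:right]
    valid = region[np.isfinite(region)]   # ignore points without a valid result
    return float(np.median(valid))        # representative subject distance (assumes >= 1 valid point)
```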
Similarly, when the imaging range of the imaging device 100 is changed from the imaging range 920 to the imaging range 922, the control unit 110 acquires the luminance information detected by the control unit 310 from the region corresponding to the imaging range 922 within the image of the imaging range 910, and performs exposure control of the imaging device 100 based on the acquired luminance information. This makes it possible to quickly perform exposure control in response to a change in the imaging range of the imaging device 100. The control unit 110 also acquires the color information detected by the control unit 310 from the region corresponding to the imaging range 922 within the image of the imaging range 910, and determines parameters of the white balance processing for the image acquired by the image sensor 120 based on the acquired color information. Thus, the imaging device 100 can quickly determine appropriate parameters for white balance processing and perform white balance adjustment.
As described above, the control unit 310 generates the distance measurement information within the imaging range of the imaging device 60 based on the signals generated by the light receiving elements 410. When the imaging range of the imaging device 100 is changed, the control unit 310 performs focus adjustment of the imaging device 100 based on the distance measurement information within the changed imaging range of the imaging device 100.
As described above, the imaging device 60 can capture an image of a wider range than the imaging device 100 and acquire distance information, luminance information, and color information of the object. Thus, the imaging device 60 can be used as a camera for 3A control (AE/AF/AWB). Since the imaging range of the imaging device 60 is set so as to include the imaging range of the imaging device 100, even when the imaging range of the imaging device 100 changes due to the attitude of the UAV 10 or the control of the gimbal 50, the 3A-related control can be promptly performed using the distance information, luminance information, and color information acquired by the imaging device 60. In addition, since the wavelength filters of some pixel regions of the image sensor 320 are IR filters, infrared images can be acquired at the same time.
Fig. 10 schematically shows a distance measuring operation when a variable focus lens is used. A mode in which the focal position of the lens 300 is variable is explained with reference to fig. 10. The control unit 310 calculates the object distance from the image information of the light receiving elements 410 obtained by performing image capturing three times while changing the focal position of the lens 300. For example, the control unit 310 performs image capturing with the imaging point of the lens 300 at the position 1010 and performs phase difference detection using the image information generated by the image sensor 320; thereby, the distance D1 to the object is calculated. Next, the control unit 310 shifts the focal point of the lens 300, performs image capturing with the imaging point at the position 1020, and performs phase difference detection using the image information generated by the image sensor 320; thereby, the distance D2 to the object is calculated. Next, the control unit 310 shifts the focal point of the lens 300 again, performs image capturing with the imaging point at the position 1030, and performs phase difference detection using the image information generated by the image sensor 320; thereby, the distance D3 to the object is calculated.
The control unit 110 determines the distance to the object based on the values of the distance D1, the distance D2, and the distance D3. For example, the control unit 110 selects one of the distances D1, D2, and D3 as the distance to the subject. As an example, the control unit 110 may calculate the Euclidean distance (i.e., the absolute difference) for every combination of two values among D1, D2, and D3, select the two values giving the shortest Euclidean distance, and select one of the two selected values as the distance to the subject. Alternatively, the control unit 110 may use the average of the two selected values as the distance to the subject. Depending on the distance to the object and the spatial frequency components of the object itself, there may be an optimal focal position at which the phase difference is detected with high accuracy. As described in connection with fig. 10, the phase difference can in some cases be detected with higher accuracy by performing image capturing a plurality of times while changing the focal position of the lens 300 and performing phase difference detection each time.
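A minimal sketch of the selection just described is given below; for scalar values the pairwise Euclidean distance reduces to the absolute difference. The tie-breaking behavior and the use_average option are assumptions, not details stated in the disclosure.

```python
# Illustrative sketch: among the three distance estimates D1, D2, D3 obtained at
# different focal positions, pick the pair whose values agree most closely and
# use one of them (or their average) as the distance to the subject.
from itertools import combinations

def select_distance(d1: float, d2: float, d3: float, use_average: bool = True) -> float:
    pairs = list(combinations((d1, d2, d3), 2))
    a, b = min(pairs, key=lambda p: abs(p[0] - p[1]))  # closest-agreeing pair
    return (a + b) / 2.0 if use_average else a
```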
The control unit 110 may determine the focal position of the lens 300 based on the imaging mode of the imaging device 100, and may select a predetermined position corresponding to the imaging mode of the imaging device 100 as the focal position of the lens 300. For example, when the imaging mode of the imaging device 100 is set to a portrait mode, the control unit 110 may set the focal position of the lens 300 closer to the image sensor 320 than when the imaging mode is set to a landscape mode.
The focal position of the lens 300 can be changed by changing the position of the lens 300 on the optical axis AX. When the lens 300 includes a focus lens, the focal position of the lens 300 can be changed by changing the position of the focus lens on the optical axis AX. In the modification described above, instead of changing the focal position of the lens 300, the positional relationship between the lens 300 and the image sensor 320 on the optical axis AX may be changed; for example, the position of the image sensor 320 on the optical axis AX may be changed. Further, the position of the image sensor 320 on the optical axis AX may be changed in addition to changing the focal position of the lens 300.
In the above-described embodiment, the imaging device 100 and the imaging device 60 are imaging devices mounted on the UAV 10, as an example of an imaging system. The imaging device 100 and the imaging device 60 may be configured to be detachable from the UAV 10. At least one of the imaging device 100 and the imaging device 60 may be implemented by a camera provided in a portable terminal such as a smartphone. At least one of the imaging device 100 and the imaging device 60 may be configured to be detachable from the UAV 10. At least one of the imaging device 100 and the imaging device 60 need not be an imaging device mounted on a moving object such as the UAV 10. For example, the imaging device 100 may be a camera supported by a handheld gimbal, and the imaging device 100 and the imaging device 60 may be imaging devices supported by neither the UAV 10 nor a handheld gimbal. For example, at least one of the imaging device 100 and the imaging device 60 may be an imaging device that can be held in a user's hand, or may be a fixedly installed imaging device such as a surveillance camera.
FIG. 11 illustrates one example of a computer 1200 that may embody aspects of the present invention in whole or in part. A program installed on the computer 1200 can cause the computer 1200 to function as the operations associated with the device according to the embodiments of the present invention or as one or more "sections" of the device, or can cause the computer 1200 to execute those operations or those one or more "sections". The program can also cause the computer 1200 to execute the processes according to the embodiments of the present invention or the stages of those processes. Such a program may be executed by a CPU 1212 to cause the computer 1200 to perform the specified operations associated with some or all of the blocks in the flowcharts and block diagrams described herein.
The computer 1200 of the present embodiment includes a CPU 1212 and a RAM 1214, which are connected to each other via a host controller 1210. The computer 1200 also includes a communication interface 1222 and an input/output unit, which are connected to the host controller 1210 through an input/output controller 1220. The computer 1200 also includes a ROM 1230. The CPU 1212 operates in accordance with programs stored in the ROM 1230 and the RAM 1214, thereby controlling each unit.
The communication interface 1222 communicates with other electronic devices through a network. A hard disk drive may store the programs and data used by the CPU 1212 in the computer 1200. The ROM 1230 stores a boot program or the like executed by the computer 1200 at startup, and/or a program that depends on the hardware of the computer 1200. The program is provided through a computer-readable recording medium such as a CD-ROM, a USB memory, or an IC card, or through a network. The program is installed in the RAM 1214 or the ROM 1230, which are also examples of computer-readable recording media, and is executed by the CPU 1212. The information processing described in these programs is read by the computer 1200 and brings about cooperation between the programs and the various types of hardware resources described above. An apparatus or method may be constituted by realizing operations or processing on information in accordance with the use of the computer 1200.
For example, in performing communication between the computer 1200 and an external device, the CPU1212 may execute a communication program loaded in the RAM1214 and instruct the communication interface 1222 to perform communication processing according to processing described by the communication program. The communication interface 1222 reads transmission data stored in a transmission buffer provided in a recording medium such as the RAM1214 or a USB memory and transmits the read transmission data to a network, or writes reception data received from the network in a reception buffer or the like provided in the recording medium, under the control of the CPU 1212.
Further, the CPU1212 may cause the RAM1214 to read all or a necessary portion of a file or a database stored in an external recording medium such as a USB memory, and execute various types of processing on data on the RAM 1214. Then, the CPU1212 may write back the processed data to the external recording medium.
Various types of information such as various types of programs, data, tables, and databases may be stored in the recording medium and subjected to information processing. With respect to the data read from the RAM 1214, the CPU 1212 can execute various types of processing described throughout the present disclosure and specified by the instruction sequences of the programs, including various types of operations, information processing, condition judgment, conditional branching, unconditional branching, search/replacement of information, and the like, and write the result back into the RAM 1214. Further, the CPU 1212 can search for information in files, databases, and the like within the recording medium. For example, when a plurality of entries each having an attribute value of a first attribute associated with an attribute value of a second attribute are stored in the recording medium, the CPU 1212 may search the plurality of entries for an entry matching a condition specifying the attribute value of the first attribute, read the attribute value of the second attribute stored in that entry, and thereby acquire the attribute value of the second attribute associated with the first attribute satisfying the predetermined condition.
The programs or software modules described above may be stored in the computer 1200 or in a computer-readable storage medium near the computer 1200. Further, a recording medium such as a hard disk or a RAM provided in a server system connected to a dedicated communication network or the Internet can be used as the computer-readable storage medium, whereby the programs are provided to the computer 1200 through the network.
It should be noted that the order of execution of the operations, procedures, steps, stages, and the like in the devices, systems, programs, and methods shown in the claims, the specification, and the drawings may be any order, as long as it is not explicitly indicated by "before", "prior to", or the like, and as long as the output of a preceding process is not used in a subsequent process. Even where the operational flow in the claims, the specification, and the drawings is described using "first", "next", and the like for convenience, this does not mean that the flow must be performed in that order.
The present invention has been described above using embodiments, but the technical scope of the present invention is not limited to the scope described in the above embodiments. It will be apparent to those skilled in the art that various changes and improvements can be made to the above embodiments. It is apparent from the description of the claims that embodiments incorporating such changes or improvements can also be included in the technical scope of the present invention.
Description of the symbols
10 UAV
12 remote operation device
20 UAV body
30 UAV control section
36 communication interface
37 memory
40 advancing part
41 GPS receiver
42 inertia measuring device
43 magnetic compass
44 air pressure altimeter
45 temperature sensor
46 humidity sensor
50 universal joint
60 image pickup device
100 image pickup device
102 image pickup part
110 control part
120 image sensor
130 memory
200 lens part
210 lens
212 lens driving unit
214 position sensor
220 lens control part
222 memory
300 lens
310 control unit
320 image sensor
322 image plane
330 memory
410, 420, 430 light receiving elements
710, 720, 810, 820 diagrams
910, 920, 922 imaging ranges
1010, 1020, 1030 positions
1200 computer
1210 host controller
1212 CPU
1214 RAM
1220 input/output controller
1222 communication interface
1230 ROM

Claims (14)

  1. An apparatus, comprising: a first image sensor including a plurality of light receiving elements including a first light receiving element configured to receive a light beam that passes through a first lens and is pupil-divided, the first image sensor being provided so as to be deviated from a focal position of the first lens by a predetermined distance or more in an optical axis direction of the first lens; and
    a circuit configured to generate information for controlling a second imaging device including a second image sensor, based on a signal generated by the first light receiving element.
  2. The apparatus of claim 1, wherein the circuit is configured to generate information for focus adjustment of the second imaging device based on the signal generated by the first light receiving element.
  3. The apparatus according to claim 1, wherein the first image sensor is disposed closer to the first lens side than a focal position of the first lens.
  4. The device according to claim 1 or 2, wherein a plurality of the first light receiving elements are disposed at a predetermined interval, and
    the first image sensor is disposed so as to be deviated from the focal position of the first lens in the optical axis direction of the first lens such that the spread width of an image of an object at infinity on the image plane of the first image sensor is equal to or greater than the predetermined interval.
  5. The apparatus of claim 1 or 2, wherein the first lens is a fixed focus lens.
  6. The apparatus of claim 1 or 2, wherein the first image sensor comprises:
    the first light receiving element; and
    a second light receiving element configured to receive a light beam in a wavelength range of visible light and generate an image signal,
    wherein the circuit is configured to generate information for exposure control and color adjustment of the second imaging device based on the image signal generated by the second light receiving element.
  7. The apparatus according to claim 6, wherein the first image sensor further comprises a third light receiving element configured to receive a light beam in a wavelength range of non-visible light and generate an image signal; and
    the circuit is configured to generate an image in the wavelength range of non-visible light based on the image signal generated by the third light receiving element.
  8. The apparatus according to claim 1 or 2, wherein an imaging range of a first imaging device including the first lens and the first image sensor is larger than an imaging range of the second imaging device.
  9. The apparatus according to claim 8, wherein the imaging range of the first imaging device is set to include the imaging range of the second imaging device.
  10. The apparatus of claim 8, wherein the circuit is configured to:
    generate ranging information within the imaging range of the first imaging device based on the signal generated by the first light receiving element; and
    when the imaging range of the second imaging device is changed, perform focus adjustment of the second imaging device based on the ranging information within the changed imaging range of the second imaging device.
  11. An imaging apparatus, comprising: the apparatus according to claim 1 or 2; and
    the first lens.
  12. An imaging system, comprising: the imaging apparatus according to claim 11; and
    the second imaging device.
  13. A movable body that moves with the imaging apparatus according to claim 11 mounted thereon.
  14. The movable body according to claim 13, further comprising: the second imaging device; and
    a support mechanism capable of supporting the second imaging device while controlling an attitude of the second imaging device.
CN202180006043.9A 2020-06-08 2021-06-01 Device, imaging system, and moving object Pending CN114600024A (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2020-099170 2020-06-08
JP2020099170A JP2021193412A (en) 2020-06-08 2020-06-08 Device, imaging device, imaging system, and mobile object
PCT/CN2021/097705 WO2021249245A1 (en) 2020-06-08 2021-06-01 Device, camera device, camera system, and movable member

Publications (1)

Publication Number Publication Date
CN114600024A (en) 2022-06-07

Family

ID=78845279

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202180006043.9A Pending CN114600024A (en) 2020-06-08 2021-06-01 Device, imaging system, and moving object

Country Status (3)

Country Link
JP (1) JP2021193412A (en)
CN (1) CN114600024A (en)
WO (1) WO2021249245A1 (en)

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE3006244C2 (en) * 1979-02-20 1984-08-30 Ricoh Co., Ltd., Tokio/Tokyo Device for determining the focus of a lens on an object
JP5489641B2 (en) * 2008-11-11 2014-05-14 キヤノン株式会社 Focus detection apparatus and control method thereof
CN102713713B (en) * 2009-10-13 2016-01-27 佳能株式会社 Focus adjusting apparatus and focus adjusting method
JP2012239135A (en) * 2011-05-13 2012-12-06 Nikon Corp Electronic apparatus
JPWO2013161944A1 (en) * 2012-04-25 2015-12-24 株式会社ニコン Focus detection device, focus adjustment device and camera
WO2013164937A1 (en) * 2012-05-01 2013-11-07 富士フイルム株式会社 Imaging device and focus control method
JP6249636B2 (en) * 2013-05-28 2017-12-20 キヤノン株式会社 Imaging apparatus and control method thereof
JP2017049426A (en) * 2015-09-01 2017-03-09 富士通株式会社 Phase difference estimation device, phase difference estimation method, and phase difference estimation program
US10520943B2 (en) * 2016-08-12 2019-12-31 Skydio, Inc. Unmanned aerial image capture platform
WO2018185940A1 (en) * 2017-04-07 2018-10-11 エスゼット ディージェイアイ テクノロジー カンパニー リミテッド Imaging control device, imaging device, imaging system, mobile body, imaging control method and program

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103685916A (en) * 2012-09-11 2014-03-26 佳能株式会社 Image pickup apparatus with image pickup device and control method for image pickup apparatus
CN107295246A (en) * 2016-04-12 2017-10-24 佳能株式会社 Picture pick-up device and image capture method
CN109416458A (en) * 2016-06-30 2019-03-01 株式会社尼康 Camera
US20180191945A1 (en) * 2017-01-04 2018-07-05 Motorola Mobility Llc Capturing an image using multi-camera automatic focus
CN111226154A (en) * 2018-09-26 2020-06-02 深圳市大疆创新科技有限公司 Autofocus camera and system
JP2019086785A (en) * 2018-12-27 2019-06-06 株式会社ニコン Imaging element and imaging device

Also Published As

Publication number Publication date
JP2021193412A (en) 2021-12-23
WO2021249245A1 (en) 2021-12-16

Similar Documents

Publication Publication Date Title
CN111567032B (en) Specifying device, moving body, specifying method, and computer-readable recording medium
CN110383812B (en) Control device, system, control method, and program
US11070735B2 (en) Photographing device, photographing system, mobile body, control method and program
CN111356954B (en) Control device, mobile body, control method, and program
CN110809746A (en) Control device, imaging device, mobile body, control method, and program
US10942331B2 (en) Control apparatus, lens apparatus, photographic apparatus, flying body, and control method
CN111630838B (en) Specifying device, imaging system, moving object, specifying method, and program
CN112335227A (en) Control device, imaging system, control method, and program
CN111264055A (en) Specifying device, imaging system, moving object, synthesizing system, specifying method, and program
CN110785997B (en) Control device, imaging device, mobile body, and control method
CN111357271B (en) Control device, mobile body, and control method
JP6641574B1 (en) Determination device, moving object, determination method, and program
CN112154371A (en) Control device, imaging device, mobile body, control method, and program
CN114600024A (en) Device, imaging system, and moving object
CN111226170A (en) Control device, mobile body, control method, and program
CN112313941A (en) Control device, imaging device, control method, and program
CN112335230A (en) Control device, imaging device, mobile body, control method, and program
CN110770667A (en) Control device, mobile body, control method, and program
CN110383815B (en) Control device, imaging device, flying object, control method, and storage medium
CN112166374B (en) Control device, imaging device, mobile body, and control method
CN111213369B (en) Control device, control method, imaging device, mobile object, and computer-readable storage medium
JP6569157B1 (en) Control device, imaging device, moving object, control method, and program
CN112313943A (en) Device, imaging device, moving object, method, and program
CN114600446A (en) Control device, imaging device, mobile body, control method, and program
CN111615663A (en) Control device, imaging system, mobile object, control method, and program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20220607