CN108235815B - Imaging control device, imaging system, moving object, imaging control method, and medium

Info

Publication number: CN108235815B
Authority: CN (China)
Prior art keywords: image, blur amount, captured, blur, lens
Legal status: Expired - Fee Related
Application number: CN201780002652.0A
Other languages: Chinese (zh)
Other versions: CN108235815A
Inventors: 邵明 (Shao Ming), 本庄谦一 (Honjo Kenichi)
Current Assignee: SZ DJI Technology Co Ltd
Application filed by SZ DJI Technology Co Ltd
Publication of CN108235815A (application)
Publication of CN108235815B (grant)

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60: Control of cameras or camera modules
    • H04N23/64: Computer-aided capture of images, e.g. transfer from script file into camera, check of taken image quality, advice or proposal for image composition or decision on when to take image

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)
  • Focusing (AREA)
  • Accessories Of Cameras (AREA)
  • Automatic Focus Adjustment (AREA)

Abstract

An imaging control device includes an acquisition unit that acquires a first image included in a first captured image captured with an imaging surface and a lens in a first positional relationship, and a second image included in a second captured image captured with the imaging surface and the lens in a second positional relationship. The imaging control device includes a calculation unit that calculates blur amounts of the first image and the second image. The imaging control device includes a control unit that controls a positional relationship between the imaging surface and the lens based on the blur amounts of the first image and the second image when a difference between the blur amount of the first image and the blur amount of the second image is equal to or greater than a first threshold value.

Description

Imaging control device, imaging system, moving object, imaging control method, and medium
Technical Field
The invention relates to an imaging control device, an imaging system, a moving object, an imaging control method, and a medium.
Background
Patent document 1 discloses an image processing apparatus that calculates distance information of an object in an image using a plurality of images with different blurs captured with different imaging parameters.
Patent document 1: JP 5932476A
Disclosure of Invention
When calculating the distance to the subject based on the amounts of blur of the plurality of images, if the difference between the amounts of blur of the plurality of images is small, the distance to the subject may not be accurately calculated.
An imaging control apparatus according to an aspect of the present invention may include an acquisition unit that acquires a first image included in a first captured image captured with an imaging surface and a lens in a first positional relationship, and a second image included in a second captured image captured with the imaging surface and the lens in a second positional relationship. The imaging control device may include a calculation unit that calculates the blur amount of each of the first image and the second image. The imaging control device may include a control unit that controls the positional relationship between the imaging surface and the lens based on the respective blur amounts of the first image and the second image when the difference between the blur amount of the first image and the blur amount of the second image is equal to or greater than a first threshold value.
The acquiring unit may further acquire a third image included in a third captured image captured with the imaging surface and the lens in a third positional relationship, when a difference between the blur amount of the first image and the blur amount of the second image is smaller than a first threshold value. The calculation section may further calculate a blur amount of the third image. The control unit may control the positional relationship between the imaging surface and the lens based on the blur amounts of the first image and the third image, when the difference between the blur amount of the first image and the blur amount of the third image is equal to or greater than a first threshold value.
The acquisition unit may further acquire a fourth image included in the first captured image and a fifth image included in the second captured image. The calculation unit may further calculate the blur amount of each of the fourth image and the fifth image. The control unit may control the positional relationship between the imaging surface and the lens based on the respective blur amounts of the fourth image and the fifth image when the difference between the blur amount of the first image and the blur amount of the second image is smaller than a first threshold value and the difference between the blur amount of the fourth image and the blur amount of the fifth image is equal to or greater than the first threshold value.
The acquisition unit may further acquire a fourth image adjacent to the first image included in the first captured image and a fifth image adjacent to the second image included in the second captured image. The calculation unit may further calculate the blur amount of each of the fourth image and the fifth image. The control unit may control the positional relationship between the image pickup surface and the lens based on the blur amounts of the first image, the second image, the fourth image, and the fifth image when a difference between the blur amount of the first image and the blur amount of the second image is equal to or greater than a first threshold value and a difference between the blur amount of the fourth image and the blur amount of the fifth image is equal to or greater than a first threshold value.
The imaging control device may include a deriving unit that derives a first distance to the first target object included in the first image and the second image based on the blur amount of the first image and the blur amount of the second image, and derives a second distance to the second target object included in the fourth image and the fifth image based on the blur amount of the fourth image and the blur amount of the fifth image. The control unit may control the positional relationship between the imaging surface and the lens based on the first distance and the second distance.
The acquiring unit may further acquire a sixth image included in a third captured image captured with the imaging surface and the lens in a third positional relationship, when a difference between the blur amount of the fourth image and the blur amount of the fifth image is smaller than a first threshold value. The calculation section may further calculate a blur amount of the sixth image. The control unit may control the positional relationship between the image pickup surface and the lens based on the blur amounts of the first image, the second image, the fourth image, and the sixth image when a difference between the blur amount of the first image and the blur amount of the second image is equal to or greater than a first threshold value and a difference between the blur amount of the fourth image and the blur amount of the sixth image is equal to or greater than a first threshold value.
The acquisition unit may further acquire a fourth image included in the first captured image and a fifth image included in the second captured image. The calculation unit may further calculate the blur amount of each of the fourth image and the fifth image. The imaging control device may include a deriving unit that derives a first distance to a first target object included in the first image and the second image based on the blur amount of each of the first image and the second image, and derives a second distance to a second target object included in the fourth image and the fifth image based on the blur amount of each of the fourth image and the fifth image. The control unit may control the positional relationship between the imaging surface and the lens based on the first distance when the first distance satisfies a predetermined imaging condition, the second distance does not satisfy the imaging condition, and the difference between the blur amount of the first image and the blur amount of the second image is equal to or greater than a first threshold value.
The acquisition unit may further acquire a sixth image included in a third captured image captured with the imaging surface and the lens in a third positional relationship, when the first distance satisfies the imaging condition, the second distance does not satisfy the imaging condition, and a difference between the blur amount of the first image and the blur amount of the second image is smaller than a first threshold value.
The calculation section may further calculate a blur amount of the sixth image. The control unit may control the positional relationship between the imaging surface and the lens based on the blur amount of the first image and the blur amount of the sixth image when the difference between the blur amount of the first image and the blur amount of the sixth image is equal to or greater than the first threshold.
The imaging control device may further include a specifying unit that specifies a region of the second captured image corresponding to the first image by comparing feature points included in the first image with feature points of the second captured image. The acquisition unit may acquire, as the second image, an image of a region of the second captured image that has, with respect to the second captured image, the same positional relationship as the positional relationship of the first image with respect to the first captured image, when the difference between the position of the first image in the first captured image and the position of the region in the second captured image is equal to or smaller than a second threshold value.
The acquisition unit may acquire the region as the second image when a difference between a position of the first image in the first captured image and a position of the region in the second captured image is larger than a second threshold value.
The acquisition unit may acquire the image of the region as the second image when a difference between a position of the first image in the first captured image and a position of the region in the second captured image is greater than a second threshold value and equal to or less than a third threshold value.
The specifying unit may specify the respective feature points based on the luminance of the first image and the luminance of the region of the second captured image.
The specifying unit may specify the center of gravity of the luminance of the first image as the feature point included in the first image, and may specify the center of gravity of the luminance of the region of the second captured image as the feature point of the second captured image.
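As a rough illustration of the luminance center of gravity described above, the following is a minimal sketch; the function name and the use of NumPy are illustrative assumptions, not part of the patent.

```python
import numpy as np

def luminance_centroid(region: np.ndarray):
    # (x, y) center of gravity of pixel luminance for a 2-D grayscale
    # region, used here as that region's feature point.
    region = region.astype(np.float64)
    total = region.sum()
    if total == 0.0:
        h, w = region.shape
        return (w - 1) / 2.0, (h - 1) / 2.0  # black region: geometric center
    ys, xs = np.indices(region.shape)
    return (xs * region).sum() / total, (ys * region).sum() / total
```

The displacement of this centroid between the first captured image and the second captured image could then serve as the positional difference that is compared against the second threshold value.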
By changing the position of the focus lens included in the lens, the state in which the imaging surface and the lens are in the first positional relationship can be changed to the state in which the imaging surface and the lens are in the second positional relationship.
By changing the position of the image pickup surface, the state in which the image pickup surface and the lens are in the first positional relationship can be changed to the state in which the image pickup surface and the lens are in the second positional relationship.
An imaging device according to an aspect of the present invention may include the above imaging control device, an image sensor having the imaging surface, and the lens.
An imaging system according to an aspect of the present invention includes: the imaging device and a support mechanism for supporting the imaging device.
A moving body according to an aspect of the present invention is a moving body that includes the above imaging system and moves.
An imaging control method according to an aspect of the present invention may include a step of acquiring a first image and a second image, the first image being included in a first captured image captured with the imaging surface and the lens in a first positional relationship, and the second image being included in a second captured image captured with the imaging surface and the lens in a second positional relationship. The imaging control method may include a step of calculating the blur amount of each of the first image and the second image. The imaging control method may include a step of controlling the positional relationship between the imaging surface and the lens based on the blur amounts of the first image and the second image when the difference between the blur amount of the first image and the blur amount of the second image is equal to or greater than a first threshold value.
A program according to an aspect of the present invention may cause a computer to execute a step of acquiring a first image and a second image, the first image being included in a first captured image captured with the imaging surface and the lens in a first positional relationship, and the second image being included in a second captured image captured with the imaging surface and the lens in a second positional relationship. The program may cause the computer to execute a step of calculating the blur amount of each of the first image and the second image. The program may cause the computer to execute a step of controlling the positional relationship between the imaging surface and the lens based on the blur amounts of the first image and the second image when the difference between the blur amount of the first image and the blur amount of the second image is equal to or greater than a first threshold value.
According to an aspect of the present invention, the positional relationship between the imaging surface and the lens can be adjusted with higher accuracy based on the blur amount of the plurality of images.
The above summary of the invention does not enumerate all of the features of the present invention. Sub-combinations of these feature groups may also constitute an invention.
Drawings
Fig. 1 is a diagram showing an example of the external appearance of an unmanned aerial vehicle and a remote operation device.
Fig. 2 is a diagram showing an example of a functional module of the unmanned aerial vehicle.
Fig. 3 is a diagram showing an example of a curve showing a relationship between the blur amount and the lens position.
Fig. 4 is a diagram illustrating an example of a process of calculating a distance to a target object based on a blur amount.
Fig. 5 is a diagram for explaining the relationship between the position of the target object, the position of the lens, and the focal length.
Fig. 6A is a diagram for explaining a relationship between a difference in blur amount and a specific accuracy of a lens position.
Fig. 6B is a diagram for explaining a relationship between a difference in blur amount and a specific accuracy of the lens position.
Fig. 6C is a diagram for explaining a relationship between a difference in blur amount and a specific accuracy of the lens position.
Fig. 7 is a diagram for explaining an image for calculating the blur amount.
Fig. 8 is a diagram showing an example of a curve showing a relationship between the blur amount and the lens position.
Fig. 9 is a flowchart showing an example of the procedure of the AF processing of the BDAF method.
Fig. 10 is a flowchart showing another example of the procedure of the AF processing of the BDAF mode.
Fig. 11 is a flowchart showing another example of the procedure of the AF processing of the BDAF mode.
Fig. 12 is a flowchart showing another example of the procedure of the AF processing of the BDAF mode.
Fig. 13 is a diagram showing another example of the functional module of the unmanned aerial vehicle.
Fig. 14 is a diagram for explaining the amount of movement of the target object.
Fig. 15A is a diagram for explaining an example in which the movement amount of the target object is determined based on the center of gravity of the luminance.
Fig. 15B is a diagram for explaining an example in which the movement amount of the target object is determined based on the center of gravity of the luminance.
Fig. 16 is a flowchart showing an example of a process of moving the AF processing frame in accordance with the amount of movement of the target object.
Fig. 17 is a diagram showing an example of the hardware configuration.
Detailed Description
The present invention will be described below through embodiments of the invention, but the following embodiments do not limit the invention according to the claims. Not all combinations of the features described in the embodiments are essential to the solving means of the invention. It will be apparent to those skilled in the art that various changes and improvements can be made to the following embodiments, and it is apparent from the description of the claims that embodiments with such changes or improvements can also be included in the technical scope of the present invention.
The claims, the description, the drawings, and the abstract include matters subject to copyright protection. The copyright owner has no objection to the facsimile reproduction by anyone of these documents, as they appear in the patent office files or records. However, in all other cases, all copyrights are reserved.
Various embodiments of the present invention may be described with reference to flowcharts and block diagrams, where a block may represent (1) a stage of a process in which an operation is executed or (2) a "section" of an apparatus having the role of executing the operation. Specified stages and "sections" may be implemented by programmable circuits and/or processors. Dedicated circuits may include digital and/or analog hardware circuits, and may include integrated circuits (ICs) and/or discrete circuits. Programmable circuits may include reconfigurable hardware circuits. Reconfigurable hardware circuits may include logical AND, logical OR, logical XOR, logical NAND, logical NOR, and other logical operations, as well as memory elements such as flip-flops, registers, field programmable gate arrays (FPGAs), and programmable logic arrays (PLAs).
A computer-readable medium may include any tangible device capable of storing instructions for execution by a suitable device. As a result, a computer-readable medium having instructions stored therein constitutes an article of manufacture including instructions that can be executed to create means for performing the operations specified in the flowcharts or block diagrams. Examples of the computer-readable medium may include an electronic storage medium, a magnetic storage medium, an optical storage medium, an electromagnetic storage medium, a semiconductor storage medium, and the like. More specific examples of the computer-readable medium may include a floppy (registered trademark) disk, a magnetic tape, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an electrically erasable programmable read-only memory (EEPROM), a static random access memory (SRAM), a compact disc read-only memory (CD-ROM), a digital versatile disc (DVD), a Blu-ray (registered trademark) disc, a memory stick, an integrated circuit card, and the like.
Computer-readable instructions may include either source code or object code written in any combination of one or more programming languages. The computer-readable instructions may be assembler instructions, instruction-set-architecture (ISA) instructions, machine instructions, machine-dependent instructions, microcode, firmware instructions, state-setting data, or written in an object-oriented programming language such as Smalltalk, JAVA (registered trademark), or C++, or a conventional procedural programming language such as the "C" programming language or similar programming languages. The computer-readable instructions may be provided to a processor or programmable circuitry of a general-purpose computer, special-purpose computer, or other programmable data processing apparatus, either locally or via a local area network (LAN), a wide area network (WAN) such as the Internet, or the like. The processor or programmable circuitry may execute the computer-readable instructions to create means for performing the operations specified in the flowcharts or block diagrams. Examples of the processor include a computer processor, a processing unit, a microprocessor, a digital signal processor, a controller, a microcontroller, and the like.
Fig. 1 shows an example of the external appearance of an Unmanned Aerial Vehicle (UAV)10 and a remote operation device 300. The UAV10 includes: a UAV main body 20, a gimbal 50, a plurality of imaging devices 60, and an imaging device 100. The gimbal 50 and the imaging apparatus 100 are examples of an imaging system. The UAV10 is an example of a moving body propelled by a propulsion unit. The mobile body is a concept including, in addition to the UAV, a flying object such as another aircraft moving in the air, a vehicle moving on the ground, a ship moving on water, and the like.
The UAV main body 20 includes a plurality of rotary wings. The plurality of rotary wings is an example of the propulsion unit. The UAV main body 20 flies the UAV10 by controlling the rotation of the plurality of rotary wings. The UAV main body 20 flies the UAV10 using, for example, four rotary wings. The number of rotary wings is not limited to four. The UAV10 may also be a fixed-wing aircraft without rotary wings.
The imaging apparatus 100 is a camera that images a subject included in a desired imaging range. The gimbal 50 rotatably supports the imaging apparatus 100. The gimbal 50 is an example of a support mechanism. For example, the gimbal 50 rotatably supports the imaging apparatus 100 about the pitch axis using an actuator. The gimbal 50 further rotatably supports the imaging apparatus 100 about the roll axis and the yaw axis, respectively, using actuators. The gimbal 50 can change the attitude of the imaging apparatus 100 by rotating the imaging apparatus 100 about at least one of the yaw axis, the pitch axis, and the roll axis.
The plurality of imaging devices 60 are sensing cameras that image the surroundings of the UAV10 in order to control the flight of the UAV10. Two imaging devices 60 may be provided at the nose, that is, the front, of the UAV10, and two more imaging devices 60 may be provided on the bottom surface of the UAV10. The two imaging devices 60 on the front side are paired and can function as a so-called stereo camera. The two imaging devices 60 on the bottom surface side also form a pair and can function as a stereo camera. Three-dimensional spatial data of the surroundings of the UAV10 may be generated based on the images captured by the plurality of imaging devices 60. The number of imaging devices 60 provided in the UAV10 is not limited to four. The UAV10 may include at least one imaging device 60. The UAV10 may also include at least one imaging device 60 on each of the nose, tail, sides, bottom surface, and top surface of the UAV10. The angle of view settable in the imaging device 60 may be wider than the angle of view settable in the imaging device 100. The imaging device 60 may have a single-focus lens or a fisheye lens.
The remote operation device 300 communicates with the UAV10 to remotely operate the UAV10. The remote operation device 300 may communicate wirelessly with the UAV10. The remote operation device 300 transmits to the UAV10 instruction information indicating various commands related to the movement of the UAV10, such as ascending, descending, accelerating, decelerating, advancing, retreating, and rotating. The instruction information includes, for example, instruction information for raising the altitude of the UAV10. The instruction information may indicate the altitude at which the UAV10 should be located. The UAV10 moves so as to be located at the altitude indicated by the instruction information received from the remote operation device 300. The instruction information may include an ascent command to raise the UAV10. The UAV10 ascends upon receiving the ascent command. The UAV10 may limit its ascent even when it accepts an ascent command, if the altitude of the UAV10 has reached an upper-limit altitude.
Fig. 2 shows an example of functional modules of the UAV 10. The UAV10 includes: UAV control section 30, memory 32, communication interface 34, propulsion section 40, GPS receiver 41, inertial measurement device 42, magnetic compass 43, barometric altimeter 44, temperature sensor 45, gimbal 50, and imaging device 100.
The communication interface 34 communicates with other devices such as the remote operation device 300. The communication interface 34 may receive instruction information containing various commands for the UAV control section 30 from the remote operation device 300. The memory 32 stores programs and the like necessary for the UAV control unit 30 to control the propulsion unit 40, the GPS receiver 41, the inertial measurement unit (IMU) 42, the magnetic compass 43, the barometric altimeter 44, the temperature sensor 45, the gimbal 50, the imaging device 60, and the imaging device 100. The memory 32 may be a computer-readable recording medium and may include at least one of SRAM, DRAM, EPROM, EEPROM, and flash memory such as USB memory. The memory 32 may be provided inside the UAV main body 20, or may be provided so as to be detachable from the UAV main body 20.
The UAV control unit 30 controls the flight and imaging of the UAV10 in accordance with a program stored in the memory 32. The UAV control unit 30 may be configured by a microprocessor such as a CPU or MPU, a microcontroller such as an MCU, or the like. The UAV control unit 30 controls the flight and imaging of the UAV10 in accordance with commands received from the remote operation device 300 via the communication interface 34. The propulsion unit 40 propels the UAV10. The propulsion unit 40 has a plurality of rotary wings and a plurality of drive motors that rotate the plurality of rotary wings. The propulsion unit 40 rotates the plurality of rotary wings via the plurality of drive motors in accordance with commands from the UAV control unit 30, thereby flying the UAV10.
The GPS receiver 41 receives a plurality of signals indicating the time, transmitted from a plurality of GPS satellites. The GPS receiver 41 calculates the position of the GPS receiver 41, that is, the position of the UAV10, based on the plurality of received signals. The IMU42 detects the attitude of the UAV10. The IMU42 detects, as the attitude of the UAV10, the accelerations in the three axial directions of front-back, left-right, and up-down of the UAV10, and the angular velocities about the three axes of pitch, roll, and yaw. The magnetic compass 43 detects the heading of the UAV10. The barometric altimeter 44 detects the altitude at which the UAV10 is flying. The barometric altimeter 44 detects the barometric pressure around the UAV10 and converts the detected barometric pressure into an altitude, thereby detecting the altitude. The temperature sensor 45 detects the temperature around the UAV10.
The imaging device 100 includes an imaging unit 102 and a lens unit 200. The lens unit 200 is an example of a lens device. The imaging unit 102 includes an image sensor 120, an imaging control unit 110, and a memory 130. The image sensor 120 may be formed of a CCD or a CMOS. The image sensor 120 outputs image data of an optical image formed via the plurality of lenses 210 to the imaging control unit 110. The imaging control unit 110 may be configured by a microprocessor such as a CPU or MPU, a microcontroller such as an MCU, or the like. The imaging control unit 110 can control the imaging device 100 in accordance with an operation command for the imaging device 100 from the UAV control unit 30. The memory 130 may be a computer-readable recording medium and may include at least one of SRAM, DRAM, EPROM, EEPROM, and flash memory such as USB memory. The memory 130 stores programs and the like necessary for the imaging control unit 110 to control the image sensor 120 and the like. The memory 130 may be provided inside the housing of the imaging device 100, or may be provided so as to be detachable from the housing of the imaging device 100.
The lens portion 200 has: a plurality of lenses 210, a lens moving mechanism 212, and a lens control unit 220. The plurality of lenses 210 may function as a zoom lens, a variable focal length lens, and a focus lens. At least a part or all of the plurality of lenses 210 is configured to be movable along the optical axis. The lens portion 200 may be an interchangeable lens provided to be attachable to and detachable from the image pickup portion 102. The lens moving mechanism 212 moves at least a part or all of the plurality of lenses 210 along the optical axis. The lens control unit 220 drives the lens moving mechanism 212 to move the one or more lenses 210 in the optical axis direction in accordance with a lens control command from the image pickup unit 102. The lens control command is, for example, a zoom control command and a focus control command.
The imaging apparatus 100 configured as described above executes autofocus processing (AF processing) to image a desired object.
The imaging apparatus 100 determines the distance from the lens to the subject (subject distance) in order to execute the AF processing. One method for determining the subject distance is to determine it based on the blur amounts of a plurality of images captured in states where the positional relationship between the lens and the imaging surface differs. This method is referred to as the Bokeh Detection Auto Focus (BDAF) method.
For example, the blur amount (cost) of an image can be represented by the following formula (1) using a Gaussian function. In formula (1), x represents a pixel position in the horizontal direction, and σ represents a standard deviation.
[Formula 1]
C(x) = 1/(√(2π)·σ) · exp(-x²/(2σ²)) …(1)
Fig. 3 shows an example of a curve represented by equation (1). By aligning the focusing lens to the lens position corresponding to the minimum point 502 of the curve 500, the target object contained in the image I can be brought into focus.
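As a minimal numerical sketch of this idea: blur amounts measured at a few lens positions can be fitted with a Gaussian-shaped cost model, and the in-focus position read off at the fitted minimum. The inverted-Gaussian form, the helper names, and all numeric values below are illustrative assumptions, not taken from the patent.

```python
import numpy as np
from scipy.optimize import curve_fit

def blur_cost(x, x0, sigma, c_min, amp):
    # Assumed inverted-Gaussian model: the blur amount is minimal (best
    # focus) at lens position x0 and rises on both sides, like curve 500.
    return c_min + amp * (1.0 - np.exp(-((x - x0) ** 2) / (2.0 * sigma ** 2)))

positions = np.array([0.0, 10.0, 20.0, 30.0])  # lens positions in um (assumed)
costs = np.array([0.80, 0.55, 0.42, 0.60])     # measured blur amounts (assumed)

params, _ = curve_fit(blur_cost, positions, costs, p0=(20.0, 10.0, 0.4, 0.5))
print(f"estimated in-focus lens position: {params[0]:.1f} um")  # minimum point 502
```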
Fig. 4 is a flowchart showing an example of the distance calculation procedure in the BDAF method. First, with the lens and the imaging surface in the first positional relationship, the imaging apparatus 100 captures a first image I1 and stores it in the memory 130. Next, the focus lens or the imaging surface of the image sensor 120 is moved in the optical axis direction so that the lens and the imaging surface are in the second positional relationship, and the imaging apparatus 100 captures a second image I2 and stores it in the memory 130 (S101). For example, as in so-called hill-climbing AF, the focus lens or the imaging surface of the image sensor 120 is moved in the optical axis direction so as not to pass the in-focus position. The amount of movement of the focus lens or the imaging surface of the image sensor 120 may be, for example, 10 μm.
Next, the imaging apparatus 100 divides the image I1 into a plurality of regions (S102). A feature amount may be calculated for each pixel of the image I1, and groups of pixels having similar feature amounts may each be treated as one region, whereby the image I1 is divided into a plurality of regions. Alternatively, only the pixel group within the range set as the AF processing frame in the image I1 may be divided into a plurality of regions. The imaging apparatus 100 divides the image I2 into a plurality of regions corresponding to the plurality of regions of the image I1. The imaging apparatus 100 then calculates, for each of the plurality of regions, the distance to the target object included in that region, based on the blur amount of each of the plurality of regions of the image I1 and the blur amount of each of the plurality of regions of the image I2 (S103).
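One minimal way to realize the division in S102 is a fixed rectangular grid over the AF processing frame; the grid approach and the function below are illustrative assumptions (the text above also allows grouping pixels by feature-amount similarity).

```python
import numpy as np

def split_af_frame(frame: np.ndarray, rows: int, cols: int) -> list:
    """Divide the pixels inside an AF processing frame into rows x cols
    rectangular regions, like images 601-605 inside frame 610 of Fig. 7."""
    h, w = frame.shape[:2]
    regions = []
    for r in range(rows):
        for c in range(cols):
            regions.append(frame[r * h // rows:(r + 1) * h // rows,
                                 c * w // cols:(c + 1) * w // cols])
    return regions
```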
The calculation process of the distance is further explained with reference to fig. 5. Let A be the distance from the lens L (principal point) to the target object 510 (object plane), let B be the distance from the lens L (principal point) to the position (image plane) where the target object 510 forms an image, and let F be the focal distance. In this case, the relationship between the distance A, the distance B, and the focal distance F can be expressed by the following formula (2) according to the lens formula.
[Formula 2]
1/A + 1/B = 1/F …(2)
The focal distance F is determined by the lens position. Therefore, as long as the distance B at which the target object 510 forms an image on the imaging surface can be specified, the distance A from the lens L to the target object 510 can be specified using formula (2).
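As a worked example of formula (2) with assumed values: if the focal distance is F = 10 mm and the target object forms an image at B = 10.2 mm from the principal point, then 1/A = 1/F - 1/B = 1/10 - 1/10.2 = 0.2/102 = 1/510, so the object distance is A = 510 mm. For a fixed F, a nearer object thus corresponds to a larger image distance B.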
As shown in fig. 5, the distance B, and in turn the distance A, can be specified by calculating the position at which the target object 510 forms an image from the sizes of the blurs (circles of confusion 512 and 514) of the target object 510 projected on the imaging surface. That is, the imaging position can be specified by using the fact that the size of the blur (blur amount) is proportional to the distance between the imaging surface and the imaging position.
Here, let D1 be the distance from the lens L for the image I1 captured at the position closer to the imaging surface, and let D2 be the distance from the lens L for the image I2 captured at the position farther from the imaging surface. Each image is blurred. Let PSF be the point spread function at this time, and let Id1 and Id2 be the images at D1 and D2, respectively. In this case, the image I1, for example, can be expressed by the convolution in the following formula (3).
[Formula 3]
I1 = PSF ∗ Id1 …(3)
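A minimal sketch of the convolution model in formula (3), assuming a Gaussian PSF; the library choice, kernel size, and sigma are illustrative.

```python
import numpy as np
from scipy.signal import convolve2d

def gaussian_psf(size: int, sigma: float) -> np.ndarray:
    # Discrete Gaussian point spread function, normalized to sum to 1.
    ax = np.arange(size) - (size - 1) / 2.0
    xx, yy = np.meshgrid(ax, ax)
    psf = np.exp(-(xx ** 2 + yy ** 2) / (2.0 * sigma ** 2))
    return psf / psf.sum()

# I1 = PSF * Id1: the observed image is the ideal image blurred by the PSF.
ideal = np.zeros((64, 64))
ideal[32, 32] = 1.0                                              # Id1: a point source
observed = convolve2d(ideal, gaussian_psf(9, 2.0), mode="same")  # I1
```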
Further, let f(Id1) and f(Id2) be the Fourier transforms of the image data Id1 and Id2, and let OTF1 and OTF2 be the optical transfer functions obtained by Fourier-transforming the point spread functions PSF1 and PSF2. Taking the ratio gives the following formula (4).
[Formula 4]
C = (OTF2 · f(Id2)) / (OTF1 · f(Id1)) …(4)
The value C shown in formula (4) corresponds to the amount of change between the respective blur amounts of the images Id1 and Id2; that is, the value C corresponds to the difference between the blur amount of the image Id1 and the blur amount of the image Id2.
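The following sketch illustrates one way such a spectral ratio could be computed numerically; treating the element-wise magnitude ratio of the two images' Fourier transforms as a proxy for the blur-change value C is an assumption made for illustration, not the patent's exact computation.

```python
import numpy as np

def blur_change_ratio(i1: np.ndarray, i2: np.ndarray, eps: float = 1e-8) -> np.ndarray:
    # Element-wise ratio of the Fourier magnitude spectra of two images
    # captured at different lens positions; its deviation from 1 reflects
    # the difference between their blur amounts (the value C above).
    f1 = np.abs(np.fft.fft2(i1))
    f2 = np.abs(np.fft.fft2(i2))
    return f1 / (f2 + eps)
```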
However, when the distances to the target objects included in two images are calculated from two images having different blur amounts, if the difference between the blur amounts of the two images is small, the curve 500 shown in fig. 3 may not be specified accurately, and the distance to the target object may not be calculated accurately. For example, as shown in fig. 6A, if the difference between the blur amount C(t0) obtained from the image I0 and the blur amount C(t1) obtained from the image I1 is smaller than the threshold value Th, the curve 522 of the Gaussian function specified from the blur amount C(t0) and the blur amount C(t1) may be an inaccurate curve. The lens position 524 specified from the curve 522 deviates from the ideal lens position 520 corresponding to the imaging position. The same holds in fig. 6B, where the difference between the blur amount C(t0) obtained from the image I0 and the blur amount C(t2) obtained from the image I2 is smaller than the threshold value Th. In this case, the lens position 528 specified from the curve 526 of the Gaussian function specified from the blur amount C(t0) and the blur amount C(t2) deviates from the ideal lens position 520. On the other hand, as shown in fig. 6C, when the difference between the blur amount C(t0) obtained from the image I0 and the blur amount C(t3) obtained from the image I3 is equal to or greater than the threshold value Th, the lens position 532 specified from the curve 530 of the Gaussian function specified from the blur amounts C(t0) and C(t3) coincides with the ideal lens position 520. In fig. 6A to 6C, x denotes the position of the focus lens.
As described above, when the distance to the target object is determined by the BDAF method, it is desirable that the difference between the blur amounts of the images to be compared be equal to or greater than a predetermined threshold.
Therefore, as shown in fig. 2, the imaging control unit 110 included in the imaging apparatus 100 according to the present embodiment includes: an acquisition unit 112, a calculation unit 114, a derivation unit 116, and a focus control unit 140.
The acquisition unit 112 acquires a first image included in a first captured image captured with the imaging surface and the lens in a first positional relationship, and a second image included in a second captured image captured with the imaging surface and the lens in a second positional relationship. The acquisition unit 112 acquires an image 601 from images 601 to 605 in an AF processing frame 610 of a captured image 600 captured with the imaging surface and the lens in the first positional relationship, for example, as shown in fig. 7. Further, the acquisition unit 112 acquires an image 621 from images 621 to 625 in the AF processing frame 630 of the captured image 620 captured after the positional relationship between the image pickup surface and the lens is changed by moving the focus lens or the image pickup surface of the image sensor 120. The acquisition unit 112 acquires an image of an area having a feature amount satisfying a predetermined condition from among a plurality of areas in the AF processing frame. The acquisition unit 112 may acquire an image for each of the plurality of areas in the AF processing frame when the feature value for each of the plurality of areas satisfies a predetermined condition.
The calculation unit 114 calculates the blur amount of each of the first image and the second image. The calculation unit 114 calculates the blur amount c (t) of each of the image 601 and the image 621, for example. When the difference between the blur amount of the first image and the blur amount of the second image is equal to or greater than the first threshold Th1, the focus control unit 140 controls the positional relationship between the image pickup surface and the lens based on the blur amounts of the first image and the second image. The focus control unit 140 can control the positional relationship between the image pickup surface and the lens by controlling the position of at least one of the image pickup surface and the lens. The first threshold Th1 may be determined according to the specification of the imaging apparatus 100. The first threshold Th1 may be determined based on the lens characteristics of the image pickup apparatus 100. The first threshold Th1 may be decided based on the pixel pitch of the image sensor 120. The focus control unit 140 is an example of a control unit that controls the positional relationship between the imaging surface and the lens.
The deriving unit 116 derives a first distance to a first target object included in the first image and the second image based on the blur amount of the first image and the blur amount of the second image. The deriving unit 116 may derive the distance to the target object 650 included in the image 601 and the image 621, for example, based on the above equation (2) and the geometric relationship shown in fig. 5. The focus control unit 140 may execute the AF process by moving the focus lens or the imaging surface of the image sensor 120 based on the distance to the target object 650 derived by the derivation unit 116.
The obtaining unit 112 may further obtain a third image included in a third captured image captured with the imaging surface and the lens in a third positional relationship, when the difference between the blur amount of the first image and the blur amount of the second image is smaller than the first threshold Th 1. The acquisition unit 112 may acquire, for example, an image 661 included in the captured image 660 captured in the third positional relationship. The calculation section 114 may further calculate a blur amount of the third image. The focus control unit 140 may control the positional relationship between the imaging surface and the lens based on the blur amounts of the first image and the third image when the difference between the blur amount of the first image and the blur amount of the third image is equal to or greater than the first threshold Th 1. When the difference between the blur amount of the image 601 and the blur amount of the image 661 is equal to or greater than the first threshold value Th1, the focus control unit 140 may move the focus lens or the imaging surface of the image sensor 120 so as to focus on the target object 650 based on the blur amounts of the image 601 and the image 661.
The acquisition unit 112 may further acquire a fourth image included in the first captured image and a fifth image included in the second captured image. The acquisition unit 112 can acquire, for example, an image 602 included in the captured image 600 and an image 622 included in the captured image 620. The calculation unit 114 may calculate the blur amount of each of the fourth image and the fifth image. The calculation unit 114 may calculate the blur amount of each of the image 602 and the image 622, for example. The focus control unit 140 may control the positional relationship between the image pickup surface and the lens based on the respective blur amounts of the fourth image and the fifth image when the difference between the blur amount of the first image and the blur amount of the second image is smaller than the first threshold Th1 and the difference between the blur amount of the fourth image and the blur amount of the fifth image is equal to or greater than the first threshold Th 1. The deriving unit 116 may derive the second distance to the second target object included in the fourth image and the fifth image based on the blur amount of the fourth image and the blur amount of the fifth image. The deriving unit 116 may derive the distance to the target object 652 included in the image 602 and the image 622, for example, based on the above equation (2) and the geometric relationship shown in fig. 5. For example, when the difference between the blur amount of the image 601 and the blur amount of the image 621 is smaller than the first threshold Th1 and the difference between the blur amount of the image 602 and the blur amount of the image 622 is equal to or larger than the first threshold Th1, the focus control unit 140 may control the positional relationship between the image pickup surface and the lens based on the blur amounts of the image 602 and the image 622. The focus control unit 140 may move the focus lens or the imaging surface of the image sensor 120 so as to focus on the target object 652, based on the distances to the target object 652 included in the image 602 and the image 622 derived by the derivation unit 116.
The acquisition unit 112 may acquire a fourth image adjacent to the first image included in the first captured image and a fifth image adjacent to the second image included in the second captured image. The acquisition unit 112 can acquire, for example, an image 602 adjacent to the image 601 included in the captured image 600 and an image 622 adjacent to the image 621 included in the captured image 620. The calculation unit 114 may calculate the blur amount of each of the fourth image and the fifth image. The calculation unit 114 may calculate the blur amount of each of the image 602 and the image 622, for example.
The focus control unit 140 may control the positional relationship between the image pickup surface and the lens based on the blur amounts of the first image, the second image, the fourth image, and the fifth image when the difference between the blur amount of the first image and the blur amount of the second image is equal to or greater than a first threshold value Th1 and the difference between the blur amount of the fourth image and the blur amount of the fifth image is equal to or greater than a first threshold value Th 1. For example, when the difference between the blur amount of the image 601 and the blur amount of the image 621 is equal to or greater than the first threshold Th1 and the difference between the blur amount of the image 602 and the blur amount of the image 622 is equal to or greater than the first threshold Th1, the focus control unit 140 may control the positional relationship between the image pickup surface and the lens based on the blur amounts of the image 601, the image 621, the image 602, and the image 622. The focus control unit 140 may control the positional relationship between the image plane and the lens based on the distance to the target object 650 determined based on the blur amounts of the image 601 and the image 621 and the distance to the target object 652 determined based on the blur amounts of the image 602 and the image 622. The focus control unit 140 may determine the distance of the object in the AF processing frame based on the weighted distance of each region in accordance with a weight set in advance for each region in the AF processing frame.
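A minimal sketch of the per-region weighting mentioned in the last sentence above, with all weight and distance values assumed for illustration:

```python
def weighted_subject_distance(distances, weights):
    # Combine per-region object distances inside the AF processing frame
    # using weights set in advance for each region.
    assert len(distances) == len(weights) and sum(weights) > 0
    return sum(d * w for d, w in zip(distances, weights)) / sum(weights)

# Example: a center region weighted more heavily than its neighbor.
focus_distance = weighted_subject_distance([5.0, 8.0], [0.7, 0.3])  # meters (assumed)
```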
The obtaining unit 112 may further obtain a sixth image included in the third captured image captured with the imaging surface and the lens in the third positional relationship, when the difference between the blur amount of the fourth image and the blur amount of the fifth image is smaller than the first threshold Th 1. The obtaining unit 112 may further obtain an image 662 included in the captured image 660, for example, after changing the positional relationship between the imaging surface and the lens from the second positional relationship to the third positional relationship. The calculation section 114 may calculate the blur amount of the sixth image. The calculation section 114 may calculate the blur amount of the image 662. The focus control unit 140 may control the positional relationship between the image pickup surface and the lens based on the blur amounts of the first image, the second image, the fourth image, and the sixth image when the difference between the blur amount of the first image and the blur amount of the second image is equal to or greater than the first threshold value Th1 and the difference between the blur amount of the fourth image and the blur amount of the sixth image is equal to or greater than the first threshold value Th 1. The focus control unit 140 may control the positional relationship between the image pickup surface and the lens based on the blur amounts of the image 601, the image 621, the image 602, and the image 662 when the difference between the blur amount of the image 601 and the blur amount of the image 621 is equal to or greater than the first threshold value Th1 and the difference between the blur amount of the image 602 and the blur amount of the image 662 is equal to or greater than the first threshold value Th 1.
The focus control unit 140 may control the positional relationship between the imaging surface and the lens based on the first distance when the first distance derived by the deriving unit 116 satisfies a predetermined imaging condition, the second distance derived by the deriving unit 116 does not satisfy the imaging condition, and the difference between the blur amount of the first image and the blur amount of the second image is equal to or greater than the first threshold value. The imaging condition may be a condition determined according to an imaging mode such as a portrait mode or a landscape mode. The imaging condition may also be a condition such as near-end priority or infinity-end priority. If the imaging condition is near-end priority, the focus control unit 140 may determine, from the distances to the target objects of the respective areas in the AF processing frame derived by the deriving unit 116, the image of the area having the shortest distance to its target object as the image satisfying the imaging condition, and determine the images of the other areas as images not satisfying the imaging condition. Here, the acquisition unit 112 may further acquire a sixth image included in a third captured image captured with the imaging surface and the lens in a third positional relationship, when the first distance satisfies the imaging condition, the second distance does not satisfy the imaging condition, and the difference between the blur amount of the first image and the blur amount of the second image is smaller than the first threshold value Th1. The acquisition unit 112 can acquire, for example, an image 661 from the captured image 660. The focus control unit 140 may control the positional relationship between the imaging surface and the lens based on the blur amount of the first image and the blur amount of the sixth image when the difference between the blur amount of the first image and the blur amount of the sixth image is equal to or greater than the first threshold value. The deriving unit 116 may derive the distance to the target object 650 included in the image 601 and the image 661 based on the blur amount of the image 601 and the blur amount of the image 661. The focus control unit 140 may move the focus lens or the imaging surface of the image sensor 120 so as to focus on the target object 650, based on the distance to the target object 650 derived by the deriving unit 116.
For example, as shown in fig. 8, the calculation unit 114 calculates the blur amount C(t0) of the image 601, the blur amount C(t1) of the image 621, the blur amount C'(t0) of the image 602, and the blur amount C'(t1) of the image 622. At this time, the difference Ta between the blur amount C(t0) of the image 601 and the blur amount C(t1) of the image 621 is smaller than the first threshold value Th1. On the other hand, the difference Tb between the blur amount C'(t0) of the image 602 and the blur amount C'(t1) of the image 622 is equal to or greater than the first threshold value Th1. If the imaging condition in this case is infinity-end priority, the focus control unit 140 may determine to bring the target object 652 included in the image 602 and the image 622 into focus. On the other hand, if the imaging condition is near-end priority, the focus control unit 140 may determine to bring the target object 650 included in the image 601 and the image 621 into focus. Here, since the difference Tb is equal to or greater than the first threshold value Th1, the accuracy of the curve 700 specified based on the blur amount C'(t0) of the image 602 and the blur amount C'(t1) of the image 622 is high. Therefore, the focus control unit 140 can execute the AF processing with high accuracy by moving the focus lens or the imaging surface of the image sensor 120 based on the lens position 712 specified from the curve 700. On the other hand, since the difference Ta is smaller than the first threshold value Th1, the accuracy of the curve 702 specified based on the blur amount C(t0) of the image 601 and the blur amount C(t1) of the image 621 is low. Therefore, in the case of near-end priority, if the focus control unit 140 moves the focus lens or the imaging surface of the image sensor 120 based on the lens position 714 specified from the curve 702, the AF processing cannot be executed with high accuracy. The imaging apparatus 100 therefore further moves the focus lens or the imaging surface of the image sensor 120 and captures the captured image 660. The calculation unit 114 then calculates the blur amount C(t2) of the image 661 of the captured image 660. The difference Tc between the blur amount C(t0) of the image 601 and the blur amount C(t2) of the image 661 is equal to or greater than the first threshold value Th1. Accordingly, the curve 704 specified based on the blur amount C(t0) of the image 601 and the blur amount C(t2) of the image 661 is more accurate than the curve 702. Thus, by moving the focus lens or the imaging surface of the image sensor 120 based on the lens position 716 specified from the curve 704, the focus control unit 140 can execute the AF processing with high accuracy even in the case of near-end priority.
Fig. 9 is a flowchart showing an example of the procedure of the AF processing of the BDAF method.
The imaging apparatus 100 moves the focus lens to the position X(t0) (S201). The imaging apparatus 100 moves the focus lens by, for example, 10 μm in the optical axis direction. The acquisition unit 112 acquires an image I(t0) from the captured image captured at the position X(t0) (S202). The imaging control unit 110 increments the counter (S203). Next, the imaging control unit 110 moves the focus lens to the position X(tn) via the lens control unit 220 (S204). The acquisition unit 112 acquires, from the captured image captured at the position X(tn), an image I(tn) at the position corresponding to the image I(t0) (S205). The calculation unit 114 calculates the blur amount C(t0) of the image I(t0) and the blur amount C(tn) of the image I(tn) (S206). If the difference |C(tn) - C(t0)| between the blur amount C(t0) and the blur amount C(tn) is smaller than the first threshold value Th1, the imaging control unit 110 repeats the processing from step S203 onward in order to move the focus lens further.
If the difference |C(tn) - C(t0)| is equal to or greater than the first threshold value Th1, the derivation unit 116 derives the distance to the target object included in the image I(t0) and the image I(tn) based on the blur amount C(t0) and the blur amount C(tn). The focus control unit 140 determines the distance to the subject based on this distance (S208). The focus control unit 140 moves the focus lens to the predicted focus position based on the determined distance (S209).
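Putting steps S201 to S209 together, the following is a schematic rendering of the fig. 9 loop; all helper functions (move_focus_lens, capture_roi, blur_amount, derive_distance, focus_position_for) and the numeric constants are hypothetical stand-ins for the units and values described above.

```python
STEP_UM = 10.0    # lens movement per iteration (example value from the text)
TH1 = 0.05        # first threshold Th1 for the blur-amount difference (assumed)
MAX_ITERS = 20    # safety bound, not part of the flowchart (assumed)

def bdaf_af():
    # The helpers called below are hypothetical stand-ins for the lens
    # control unit, the acquisition unit, and the derivation unit.
    move_focus_lens(0.0)                    # S201: move to X(t0)
    c0 = blur_amount(capture_roi())         # S202: image I(t0) and its blur
    for n in range(1, MAX_ITERS + 1):       # S203: increment the counter
        move_focus_lens(n * STEP_UM)        # S204: move to X(tn)
        cn = blur_amount(capture_roi())     # S205/S206: I(tn) and its blur
        if abs(cn - c0) >= TH1:             # difference large enough to trust
            dist = derive_distance(c0, cn)  # derivation unit 116
            move_focus_lens(focus_position_for(dist))  # S208/S209
            return dist
    return None  # no sufficient blur difference found (not in the flowchart)
```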
As described above, when the difference between the blur amounts of the images is small, the imaging apparatus 100 moves the focus lens until that difference becomes equal to or greater than the first threshold value Th1. The imaging apparatus 100 can therefore execute the AF processing by the BDAF method with higher accuracy and at higher speed.
Fig. 10 is a flowchart showing another example of the procedure of the AF processing of the BDAF mode. The procedure shown in fig. 10 can be applied to the case of the photographing mode that prioritizes the speed of the AF processing.
The imaging apparatus 100 moves the focus lens to the position X(t0) (S301). The imaging apparatus 100 moves the focus lens by, for example, 10 μm in the optical axis direction. The acquisition unit 112 acquires a plurality of images within the AF processing frame set for the image captured at X(t0). The acquisition unit 112 divides the AF processing frame into a plurality of areas and acquires an image for each area. The acquisition unit 112 calculates the feature amount of each of the plurality of images (S302). The acquisition unit 112 may calculate the feature amount based on the pixel values, the luminance values, edge detection, and the like of each of the plurality of images. If none of the plurality of images has a feature amount equal to or greater than the threshold value, the AF processing by the BDAF method is not executed and the processing is terminated.
On the other hand, if there are images having a feature amount equal to or greater than the threshold value (S303), the acquisition unit 112 acquires these images I(t0) (S304). Next, the imaging control unit 110 increments the counter (S305). The imaging control unit 110 moves the focus lens to the position X(tn) via the lens control unit 220 (S306). The acquisition unit 112 acquires, from the image captured at the position X(tn), the images I(tn) at the positions corresponding to the respective images I(t0) (S307). The calculation unit 114 calculates the blur amount C(t0) of each image I(t0) and the blur amount C(tn) of each image I(tn) (S308). If there is no image for which the difference |C(tn)-C(t0)| between the blur amount C(t0) and the blur amount C(tn) is equal to or greater than the first threshold Th1, the imaging control unit 110 repeats the processing from step S305 onward.
If there is an image for which the difference |C(tn)-C(t0)| is equal to or greater than the first threshold Th1, the deriving unit 116 derives, based on the blur amount C(t0) of the corresponding image I(t0) and the blur amount C(tn) of the image I(tn), the distance to the target object contained in the image I(t0) and the image I(tn). The focus control unit 140 determines the distance to the subject based on the distance derived by the deriving unit 116, and moves the focus lens to the predicted in-focus position (S311).
According to the above processing, the focus control unit 140 can move the focus lens to the predicted in-focus position as soon as an image whose blur-amount difference is equal to or greater than the first threshold value can be acquired from among the plurality of images within the AF processing frame.
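As a sketch, this speed-priority variant differs from the fig. 9 loop only in that it tracks several AF-frame regions and focuses on the first one whose blur difference reaches Th1. The camera interface, feature_amount, and the other helpers are assumptions, as before; blur_amount is reused from the earlier sketch.

```python
def bdaf_af_speed_priority(camera, th1, feature_th, step_um=10.0, max_steps=50):
    camera.move_focus_lens(step_um)                    # S301: move to X(t0)
    regions0 = {k: r for k, r in camera.capture_af_regions().items()
                if feature_amount(r) >= feature_th}    # S302-S304
    if not regions0:
        return None                                    # no usable region: abort
    c0 = {k: blur_amount(r) for k, r in regions0.items()}
    for _ in range(max_steps):                         # S305: counter
        camera.move_focus_lens(step_um)                # S306: move to X(tn)
        regions_n = camera.capture_af_regions()        # S307
        for k in regions0:                             # S308
            c_n = blur_amount(regions_n[k])
            if abs(c_n - c0[k]) >= th1:                # first qualifying region
                d = distance_from_blur(c0[k], c_n)
                camera.move_focus_lens_to(in_focus_position(d))  # S311
                return d
    return None
```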
Fig. 11 is a flowchart showing another example of the procedure of the AF processing of the BDAF method. The procedure shown in fig. 11 can be applied to the multipoint AF mode, in which weights are set for the respective areas within the AF processing frame.
The imaging apparatus 100 moves the focus lens to the position X(t0) (S401). The imaging apparatus 100 moves the focus lens by, for example, 10 μm in the optical axis direction. The acquisition unit 112 acquires a plurality of images within the AF processing frame set on the image captured at the position X(t0). The acquisition unit 112 divides the AF processing frame into a plurality of areas and acquires an image for each area. The acquisition unit 112 calculates the feature amount of each of the plurality of images (S402). If none of the plurality of images has a feature amount equal to or greater than the threshold value, the AF processing by the BDAF method is not executed and the processing is terminated.
On the other hand, if there are images having a feature amount equal to or greater than the threshold value (S403), the acquisition unit 112 acquires these images I(t0) (S404). Next, the imaging control unit 110 increments the counter (S405). The imaging control unit 110 moves the focus lens to the position X(tn) via the lens control unit 220 (S406). The acquisition unit 112 acquires, from the image captured at the position X(tn), the images I(tn) at the positions corresponding to the respective images I(t0) (S407). The calculation unit 114 calculates the blur amount C(t0) of each image I(t0) and the blur amount C(tn) of each image I(tn) (S408).
If there is no image for which the difference |C(tn)-C(t0)| is equal to or greater than the first threshold Th1, the imaging control unit 110 repeats the processing from step S405 onward in order to move the focus lens via the lens control unit 220 and acquire captured images with a different positional relationship between the lens and the imaging surface.
If there is an image for which the difference |C(tn)-C(t0)| is equal to or greater than the first threshold Th1, the deriving unit 116 derives, based on the blur amount C(t0) of the corresponding image I(t0) and the blur amount C(tn) of the image I(tn), the distance to the target object contained in the image I(t0) and the image I(tn) (S409).
If the distance to the target object has not yet been derived for all images having a feature amount equal to or greater than the threshold value, the imaging control unit 110 repeats the processing from step S405 onward in order to move the focus lens via the lens control unit 220 and acquire captured images with a different positional relationship between the lens and the imaging surface.
When the distance to the target object has been derived for all images having a feature amount equal to or greater than the threshold value, the focus control unit 140 determines the distance to the subject based on these distances (S410). For example, in the case of the AF processing frame 610 shown in fig. 7, the areas of the respective images within the AF processing frame 610 may be individually weighted. A weight of "100" may be set for the area of the image 601 at the center of the AF processing frame 610, a weight of "70" for the images 602 and 603 adjacent to the left and right of the image 601, a weight of "70" for the image 604 adjacent above the image 601, and a weight of "50" for the image 605 adjacent below the image 601. The focus control unit 140 may apply the weight of each area to the distance to the target object derived for that area of the AF processing frame 610, and determine the distance to the subject based on the weighted distances. The focus control unit 140 moves the focus lens to the predicted in-focus position based on the determined distance to the subject (S413).
As described above, when distances are calculated for a plurality of regions, an image whose blur-amount difference is equal to or greater than the first threshold value is acquired for each of the plurality of regions. This prevents the accuracy of the derived distances from varying between the regions.
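The following is a minimal sketch of the fig. 11 weighting, using the example weights quoted above. Combining the per-area distances by a weighted average is an assumption; the text only states that the weighted distances are used to determine the subject distance.

```python
def weighted_subject_distance(distances: dict, weights: dict) -> float:
    # Weighted average of the per-area target-object distances
    # inside the AF processing frame (an assumed combination rule).
    total = sum(weights[k] for k in distances)
    return sum(distances[k] * weights[k] for k in distances) / total

# Example with the weights from the text: 100 for the centre area,
# 70 for the areas left/right of and above it, 50 for the area below it.
weights = {"center": 100, "left": 70, "right": 70, "above": 70, "below": 50}
distances_m = {"center": 2.4, "left": 2.6, "right": 2.5, "above": 3.0, "below": 2.2}
print(weighted_subject_distance(distances_m, weights))  # about 2.55 (metres)
```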
Fig. 12 is a flowchart showing another example of the procedure of the AF processing of the BDAF method. The procedure shown in fig. 12 can be applied to an imaging mode such as infinity-side priority or near-end priority.
The imaging apparatus 100 moves the focus lens to the position X(t0) (S501). The imaging apparatus 100 moves the focus lens by, for example, 10 μm in the optical axis direction. The acquisition unit 112 acquires a plurality of images within the AF processing frame set on the image captured at the position X(t0). The acquisition unit 112 divides the AF processing frame into a plurality of areas and acquires an image for each area. The acquisition unit 112 calculates the feature amount of each of the plurality of images (S502). If none of the plurality of images has a feature amount equal to or greater than the threshold value, the AF processing by the BDAF method is not executed and the processing is terminated.
On the other hand, if there are images having a feature amount equal to or greater than the threshold value (S503), the acquisition unit 112 acquires these images I(t0) (S504). Next, the imaging control unit 110 increments the counter (S505). The imaging control unit 110 moves the focus lens to the position X(tn) via the lens control unit 220 (S506). The acquisition unit 112 acquires, from the image captured at the position X(tn), the images I(tn) at the positions corresponding to the respective images I(t0) (S507). The calculation unit 114 calculates the blur amount C(t0) of each image I(t0) and the blur amount C(tn) of each image I(tn) (S508). The deriving unit 116 derives, based on the blur amount C(t0) of each image I(t0) and the blur amount C(tn) of each image I(tn), the distance to each target object contained in each image I(t0) and each image I(tn) (S509).
The focus control unit 140 selects, based on the calculated distances, the image I(t0) and the image I(tn) that satisfy the imaging condition (S510). For example, in the case of near-end priority, the focus control unit 140 selects the images I with the shortest calculated distance. In the case of infinity-side priority, the focus control unit 140 selects the images I with the longest calculated distance.
The focus control unit 140 determines whether the difference |C(tn)-C(t0)| between the blur amount C(t0) of the selected image I(t0) and the blur amount C(tn) of the selected image I(tn) is equal to or greater than the first threshold Th1 (S511). If the difference |C(tn)-C(t0)| is equal to or greater than the first threshold Th1, the focus control unit 140 determines the distance to the subject based on the distance, derived by the deriving unit 116, to the target object contained in the image I(t0) and the image I(tn) (S516), and moves the focus lens to the predicted in-focus position.
If the difference |C(tn)-C(t0)| between the blur amount C(t0) of the selected image I(t0) and the blur amount C(tn) of the selected image I(tn) is smaller than the first threshold Th1, the imaging control unit 110 increments the counter (S512). Next, the imaging control unit 110 moves the focus lens to the position X(tn) via the lens control unit 220 (S513). The acquisition unit 112 acquires, from the image captured at the position X(tn), the image I(tn) at the position corresponding to the image I(t0) selected in step S510 (S514). The calculation unit 114 calculates the blur amount C(tn) of the acquired image I(tn) (S515).
The imaging control unit 110 repeats the determination of step S511 until an image I(tn) is obtained whose blur amount differs from the blur amount C(t0) of the selected image I(t0) by the first threshold Th1 or more. When an image I(tn) for which the difference |C(tn)-C(t0)| is equal to or greater than the first threshold Th1 is obtained, the focus control unit 140 determines the distance to the subject based on the distance, derived by the deriving unit 116, to the target object contained in the image I(t0) and the image I(tn) (S516), and moves the focus lens to the predicted in-focus position.
As described above, even when the AF processing is executed in an imaging mode such as infinity-side priority or near-end priority, the AF processing of the BDAF method can be executed with high accuracy.
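The selection in step S510 reduces to picking the AF-frame region whose derived distance best matches the imaging condition. A minimal sketch follows; the condition names are chosen here as assumptions.

```python
def select_by_imaging_condition(per_region_distance: dict, condition: str):
    # Returns the key of the AF-frame region to focus on (step S510).
    if condition == "near_end_priority":
        return min(per_region_distance, key=per_region_distance.get)
    if condition == "infinity_side_priority":
        return max(per_region_distance, key=per_region_distance.get)
    raise ValueError(f"unknown imaging condition: {condition}")
```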
However, when imaging is performed by the imaging apparatus 100 mounted on a moving object such as the UAV10, a target object may move within the captured image. If the amount of movement of the target object is large, the imaging control unit 110 may be unable to execute the AF processing of the BDAF method with high accuracy.
Therefore, the imaging control unit 110 can execute the AF processing of the BDAF method in consideration of the amount of movement of the target object within the image.
Fig. 13 shows an example of functional modules of the UAV10 according to another embodiment. The UAV10 shown in fig. 13 differs from the UAV10 shown in fig. 2 in that the imaging control unit 110 includes the specifying unit 113.
The specifying unit 113 specifies a region of the second captured image corresponding to the first image by comparing a feature point included in the first image in the first captured image captured with the lens and the imaging surface in the first positional relationship with a feature point of the second captured image captured with the lens and the imaging surface in the second positional relationship. When the difference between the position of the first image in the first captured image and the position of the region in the second captured image is equal to or less than the second threshold Th2, the acquisition unit 112 acquires, as the second image, the image of the region of the second captured image that has the same positional relationship as the positional relationship between the first captured image and the first image.
The acquisition unit 112 may acquire the image of the region as the second image when the difference between the position of the first image in the first captured image and the position of the region in the second captured image is greater than the second threshold Th2. The acquisition unit 112 may acquire the image of the region as the second image when the difference between the position of the first image and the position of the region in the second captured image is greater than the second threshold Th2 and equal to or less than the third threshold Th3. The second threshold Th2 and the third threshold Th3 may be determined based on, for example, the pixel pitch of the image sensor 120. The specifying unit 113 may divide the image into a plurality of blocks and search for the feature points in units of blocks. The second threshold Th2 and the third threshold Th3 may therefore be determined based on the size of the blocks in which the feature points are searched.
For example, as shown in fig. 14, the specifying unit 113 specifies the feature point 820 from the first image 812 of the AF processing frame 810 in the first captured image 800. The specifying unit 113 may specify the feature point 820 based on pixel values, luminance, edge detection, or the like. Next, the specifying unit 113 specifies the feature point 821 corresponding to the feature point 820 from the second captured image 802 captured in a state where the positional relationship between the lens and the imaging surface has been changed. The specifying unit 113 specifies the region 814 of the second captured image 802 corresponding to the first image 812 by comparing the feature point 820 of the first captured image 800 with the feature point 821 of the second captured image 802. When the difference between the position of the first image 812 in the first captured image 800 and the position of the region 814 in the second captured image 802 is equal to or less than the second threshold Th2, the acquisition unit 112 acquires, as the second image, the image of the region of the second captured image 802 that has the same positional relationship as the positional relationship between the first captured image 800 and the first image 812. That is, the focus control unit 140 executes the AF processing of the BDAF method without moving the AF processing frame 811 in the second captured image 802 relative to the AF processing frame 810 in the first captured image 800.
On the other hand, suppose the specifying unit 113 specifies the feature point 822 corresponding to the feature point 820 from the third captured image 804 captured in a state where the positional relationship between the lens and the imaging surface has been changed. In this case, the specifying unit 113 specifies the region 816 of the third captured image 804 corresponding to the first image 812 by comparing the feature point 820 of the first captured image 800 with the feature point 822 of the third captured image 804. The acquisition unit 112 determines that the difference between the position of the first image 812 in the first captured image 800 and the position of the region 816 in the third captured image 804 is greater than the second threshold Th2, and acquires the image of the region 816 as the second image. That is, the focus control unit 140 moves the AF processing frame 810 in the third captured image 804, relative to the AF processing frame 810 in the first captured image 800, by an amount corresponding to the movement amount of the feature point so as to match the region 816, and executes the AF processing of the BDAF method using the moved AF processing frame 810.
In this manner, when the target object moves within the image beyond the allowable range, the position of the image to be compared is changed, and the calculation unit 114 calculates the blur amount from the changed image. Therefore, the AF processing of the BDAF method can be executed with high accuracy even when the target object moves within the image.
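The Th2/Th3 decision can be sketched as follows. The behavior when the movement exceeds Th3 is not spelled out in this passage, so treating it as an unusable match is an assumption, as are the pixel units and function names.

```python
def choose_af_frame_shift(p_first, p_second, th2_px, th3_px):
    # p_first / p_second: (x, y) of the matched feature point in the first
    # and second captured images. Returns the AF-frame shift to apply.
    dx, dy = p_second[0] - p_first[0], p_second[1] - p_first[1]
    shift = max(abs(dx), abs(dy))
    if shift <= th2_px:
        return (0, 0)   # compare the same-position region (frame not moved)
    if shift <= th3_px:
        return (dx, dy)  # move the AF frame with the target object
    return None          # movement too large: match treated as unusable
```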
The feature point specified by the specifying unit 113 may be determined from the center of gravity of the luminance in the image. For example, as shown in fig. 15A, a target object 902 is included in the first captured image 900. The specifying unit 113 divides the first captured image 900 into a plurality of blocks (for example, 8 × 8 pixels each), calculates the luminance for each block, and generates a monochrome image 901 representing the per-block luminance. The specifying unit 113 determines the position of the center of gravity 903 of the luminance from the monochrome image 901. Similarly, the specifying unit 113 divides the second captured image 910 into a plurality of blocks, calculates the luminance for each block, and generates a monochrome image 911. The specifying unit 113 then determines the position of the center of gravity 913 of the luminance from the monochrome image 911. The specifying unit 113 thereby detects that the center of gravity of the luminance has moved from the position of the center of gravity 903 to the position of the center of gravity 913. If the movement amount of the center of gravity of the luminance is within the range of one block, that is, within 8 × 8 pixels, the specifying unit 113 may determine that the movement amount of the feature point is within the second threshold. On the other hand, if the movement amount of the center of gravity is within the range of two blocks, the specifying unit 113 may determine that the movement amount of the feature point is within the third threshold. Then, as shown in fig. 15B, for the first image 930 of the first captured image 900, the acquisition unit 112 acquires the image of the region 932, not the image of the region 931, as the second image acquired from the second captured image 910. This avoids the influence of the movement of the target objects 902 and 912 between the images.
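A minimal sketch of this per-block luminance centroid, assuming a single-channel image as input; the 8-pixel block size follows the example above, under which a centroid shift of at most one block corresponds to Th2 and at most two blocks to Th3.

```python
import numpy as np

def luminance_centroid(image: np.ndarray, block: int = 8):
    # Average the luminance over (block x block) tiles, then take the
    # luminance-weighted centroid of the tile grid (result in block units).
    h, w = image.shape[0] // block, image.shape[1] // block
    tiles = image[:h * block, :w * block].astype(np.float64)
    lum = tiles.reshape(h, block, w, block).mean(axis=(1, 3))
    ys, xs = np.mgrid[0:h, 0:w]
    total = lum.sum() + 1e-12
    return (xs * lum).sum() / total, (ys * lum).sum() / total
```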
Fig. 16 is a flowchart showing an example of a process of moving the AF processing frame in accordance with the amount of movement of the target object.
The acquisition unit 112 acquires the image I(t0) from the first captured image captured with the lens and the imaging surface in the first positional relationship (S601). Next, the acquisition unit 112 acquires, from the second captured image captured with the lens and the imaging surface in the second positional relationship, the image I(t1) at the position corresponding to the image I(t0) (S602). The specifying unit 113 calculates the movement amount X of the target object S in the image I(t1) relative to the image I(t0) (S603). As described above, the specifying unit 113 may calculate the movement amount X by determining feature points based on pixel values, luminance, and the like, and comparing the positions of the feature points with each other.
Next, the focus control unit 140 determines whether the movement amount X is equal to or less than the second threshold Th2 (S604). The focus control unit 140 may determine whether a feature point of the second captured image exists in the block corresponding to the position of the feature point of the first captured image. The focus control unit 140 may determine, for example, whether the movement amount X is within 8 × 8 pixels. When the movement amount X is equal to or less than the second threshold Th2, the focus control unit 140 executes the distance calculation processing based on the blur amount using the image I(t1) acquired in step S602, without moving the AF processing frame for the second captured image (S605). That is, the focus control unit 140 determines the distance to the target object S based on the blur amounts of the image I(t0) and the image I(t1). The focus control unit 140 then moves the focus lens based on the distance to the target object S.
On the other hand, when the movement amount X is greater than the second threshold Th2, the focus control unit 140 determines whether the movement amount X is equal to or less than the third threshold Th3 (S606). The focus control unit 140 may determine whether the feature point of the second captured image exists in a block adjacent to the block corresponding to the position of the feature point of the first captured image. The focus control unit 140 may determine, for example, whether the movement amount is within 24 × 24 pixels. When the movement amount X is equal to or less than the third threshold Th3, the focus control unit 140 moves the AF processing frame for the image I(t1) to be compared by an amount corresponding to the movement amount X (S607). Next, the focus control unit 140 determines the distance to the target object S based on the blur amounts of the image I(t1) obtained from the moved AF processing frame and the image I(t0) (S605). The focus control unit 140 then moves the focus lens based on the distance to the target object S.
As described above, by adjusting the position of the image to be compared in consideration of the movement of the target object within the image, the AF processing of the BDAF method can be executed with higher accuracy.
FIG. 17 illustrates an example of a computer 1200 that can embody, in whole or in part, various aspects of the present invention. A program installed in the computer 1200 can cause the computer 1200 to function as the operations associated with the apparatus according to the embodiments of the present invention, or as one or more "sections" of that apparatus, or can cause the computer 1200 to execute those operations or those one or more "sections". The program enables the computer 1200 to execute the processes, or stages of the processes, according to the embodiments of the present invention. Such a program may be executed by the CPU1212 to cause the computer 1200 to perform specific operations associated with some or all of the blocks of the flowcharts and block diagrams described in this specification.
The computer 1200 according to this embodiment includes a CPU1212 and a RAM1214, which are connected to each other via a host controller 1210. The computer 1200 also includes a communication interface 1222 and an input/output unit, which are connected to the host controller 1210 via an input/output controller 1220. The computer 1200 also includes a ROM1230. The CPU1212 operates in accordance with programs stored in the ROM1230 and the RAM1214, thereby controlling each unit.
The communication interface 1222 communicates with other electronic devices via a network. A hard disk drive may store the programs and data used by the CPU1212 in the computer 1200. The ROM1230 stores a boot program or the like executed by the computer 1200 at startup, and/or programs that depend on the hardware of the computer 1200. The programs are provided via a computer-readable recording medium such as a CD-ROM, a USB memory, or an IC card, or via a network. The programs are installed in the RAM1214 or the ROM1230, which are also examples of a computer-readable recording medium, and are executed by the CPU1212. The information processing described in these programs is read by the computer 1200 and brings about cooperation between the programs and the various types of hardware resources described above. An apparatus or method may be constituted by realizing operations or processing of information in accordance with the use of the computer 1200.
For example, when communication is performed between the computer 1200 and an external device, the CPU1212 may execute a communication program loaded into the RAM1214 and instruct the communication interface 1222 to perform communication processing based on the processing described in the communication program. Under the control of the CPU1212, the communication interface 1222 reads transmission data stored in a transmission buffer provided in a recording medium such as the RAM1214 or a USB memory and transmits the read transmission data to the network, or writes reception data received from the network into a reception buffer provided in the recording medium.
The CPU1212 may read all or a necessary portion of a file or database stored in an external recording medium such as a USB memory into the RAM1214, and execute various types of processing on the data in the RAM1214. The CPU1212 may then write the processed data back to the external recording medium.

Various types of information, such as various types of programs, data, tables, and databases, may be stored in a recording medium and subjected to information processing. With respect to the data read from the RAM1214, the CPU1212 can execute the various types of processing described throughout this disclosure, including the various types of operations, information processing, condition judgment, conditional branching, unconditional branching, information retrieval/replacement, and the like specified by the instruction sequence of a program, and writes the results back to the RAM1214. The CPU1212 can also retrieve information in files, databases, and the like within the recording medium. For example, when a plurality of entries, each having an attribute value of a first attribute associated with an attribute value of a second attribute, are stored in the recording medium, the CPU1212 may retrieve from the plurality of entries an entry that matches the condition specifying the attribute value of the first attribute, read the attribute value of the second attribute stored in that entry, and thereby acquire the attribute value of the second attribute associated with the first attribute that satisfies the predetermined condition.
The programs or software modules described above may be stored on the computer 1200 or in a computer-readable storage medium near the computer 1200. A recording medium such as a hard disk or a RAM provided in a server system connected to a dedicated communication network or the Internet can also be used as the computer-readable storage medium, whereby the programs are provided to the computer 1200 via the network.
It should be noted that the execution order of the processes, such as the actions, procedures, steps, and stages in the devices, systems, programs, and methods shown in the claims, the specification, and the drawings, may be realized in any order, unless explicitly indicated by expressions such as "before" or "prior to", and as long as the output of a preceding process is not used by a subsequent process. Even if the operation flows in the claims, the specification, and the drawings are described using "first", "next", and the like for convenience, this does not mean that the operations must be performed in this order.
Description of the reference numerals
10 UAV
20 UAV body
30 UAV control section
32 memory
34 communication interface
40 advancing part
41 GPS receiver
42 inertia measuring device
43 magnetic compass
44 barometric altimeter
45 temperature sensor
50 universal support
60 image pickup device
100 image pickup device
102 image pickup part
110 image pickup control unit
112 acquisition part
113 determination unit
114 calculation unit
116 lead-out part
120 image sensor
130 memory
140 focus control unit
200 lens part
210 lens
212 lens moving mechanism
220 lens control part
300 remote operation device
1200 computer
1210 host controller
1212 CPU
1214 RAM
1220 input/output controller
1222 communication interface
1230 ROM

Claims (19)

1. An imaging control device is provided with:
an acquisition unit that acquires a first image included in a first captured image captured with an imaging surface and a lens in a first positional relationship, and a second image included in a second captured image captured with the imaging surface and the lens in a second positional relationship;
a calculation unit that calculates blur amounts of the first image and the second image, respectively; and
a control unit that controls a positional relationship between the imaging surface and the lens based on respective blur amounts of the first image and the second image when a difference between the blur amount of the first image and the blur amount of the second image is equal to or greater than a first threshold value;
the acquisition unit further acquires a fourth image included in the first captured image and a fifth image included in the second captured image,
the calculation section further calculates the blur amount of each of the fourth image and the fifth image,
the imaging control device further includes:
a deriving unit that derives a first distance to a first target object included in the first image and the second image based on the blur amount of each of the first image and the second image, and derives a second distance to a second target object included in the fourth image and the fifth image based on the blur amount of each of the fourth image and the fifth image,
the control unit controls the positional relationship between the image pickup surface and the lens based on the first distance when the first distance satisfies a predetermined image pickup condition, the second distance does not satisfy the image pickup condition, and a difference between a blur amount of the first image and a blur amount of the second image is equal to or greater than the first threshold.
2. The imaging control apparatus according to claim 1,
the acquisition unit further acquires a third image included in a third captured image captured with the imaging surface and the lens in a third positional relationship when a difference between the blur amount of the first image and the blur amount of the second image is smaller than the first threshold value,
the calculation section further calculates a blur amount of the third image,
the control unit controls the positional relationship between the imaging surface and the lens based on the blur amounts of the first image and the third image when the difference between the blur amount of the first image and the blur amount of the third image is equal to or greater than the first threshold.
3. The imaging control apparatus according to claim 1,
the acquisition unit further acquires a fourth image included in the first captured image and a fifth image included in the second captured image,
the calculation section further calculates the blur amount of each of the fourth image and the fifth image,
the control unit controls the positional relationship between the image pickup surface and the lens based on the respective blur amounts of the fourth image and the fifth image when the difference between the blur amount of the first image and the blur amount of the second image is smaller than the first threshold and the difference between the blur amount of the fourth image and the blur amount of the fifth image is equal to or greater than the first threshold.
4. The imaging control apparatus according to claim 1,
the acquisition unit further acquires a fourth image adjacent to the first image included in the first captured image and a fifth image adjacent to the second image included in the second captured image,
the calculation section further calculates the blur amount of each of the fourth image and the fifth image,
the control unit controls the positional relationship between the image pickup surface and the lens based on the respective blur amounts of the first image, the second image, the fourth image, and the fifth image when the difference between the blur amount of the first image and the blur amount of the second image is equal to or greater than the first threshold value and the difference between the blur amount of the fourth image and the blur amount of the fifth image is equal to or greater than the first threshold value.
5. The imaging control apparatus according to claim 4, further comprising:
a deriving unit that derives a first distance to a first target object included in the first image and the second image based on a blur amount of the first image and a blur amount of the second image, and derives a second distance to a second target object included in the fourth image and the fifth image based on a blur amount of the fourth image and a blur amount of the fifth image,
the control unit controls a positional relationship between the imaging surface and the lens based on the first distance and the second distance.
6. The imaging control apparatus according to claim 4,
the acquisition unit further acquires a sixth image included in a third captured image captured with the imaging surface and the lens in a third positional relationship, when a difference between the blur amount of the fourth image and the blur amount of the fifth image is smaller than the first threshold value,
the calculation section further calculates a blur amount of the sixth image,
the control unit controls the positional relationship between the image pickup surface and the lens based on the respective amounts of blur of the first image, the second image, the fourth image, and the sixth image when the difference between the amount of blur of the first image and the amount of blur of the second image is equal to or greater than the first threshold value and the difference between the amount of blur of the fourth image and the amount of blur of the sixth image is equal to or greater than the first threshold value.
7. The imaging control apparatus according to claim 1,
the acquisition unit further acquires a sixth image included in a third captured image captured with the imaging surface and the lens in a third positional relationship, when the first distance satisfies the imaging condition, the second distance does not satisfy the imaging condition, and a difference between a blur amount of the first image and a blur amount of the second image is smaller than the first threshold value,
the calculation section further calculates a blur amount of the sixth image,
the control unit controls the positional relationship between the imaging surface and the lens based on the blur amount of the first image and the blur amount of the sixth image when the difference between the blur amount of the first image and the blur amount of the sixth image is equal to or greater than the first threshold.
8. The imaging control apparatus according to claim 1, further comprising:
a specifying unit that specifies a region of the second captured image corresponding to the first image by comparing a feature point included in the first image with a feature point of the second captured image,
the acquisition unit acquires, as the second image, an image of a region of the second captured image that has the same positional relationship as the positional relationship between the first captured image and the first image, for the second captured image, when a difference between the position of the first image in the first captured image and the position of the region in the second captured image is equal to or smaller than a second threshold value.
9. The imaging control apparatus according to claim 8,
the acquisition unit acquires the region as the second image when a difference between a position of the first image in the first captured image and a position of the region in the second captured image is greater than the second threshold value.
10. The imaging control apparatus according to claim 8,
the acquisition unit acquires an image of the region as the second image when a difference between a position of the first image in the first captured image and a position of the region in the second captured image is greater than the second threshold value and equal to or less than a third threshold value.
11. The imaging control apparatus according to claim 8,
the determination unit determines the feature points based on the luminance of the first image and the luminance of the region of the second captured image.
12. The imaging control apparatus according to claim 11,
the determination unit determines a center of gravity of the luminance of the first image as the feature point included in the first image, and determines a center of gravity of the luminance of the region of the second captured image as the feature point of the second captured image.
13. The imaging control apparatus according to claim 1,
by changing the position of a focus lens included in the lens, the state in which the imaging surface and the lens are in the first positional relationship is changed to the state in which the imaging surface and the lens are in the second positional relationship.
14. The imaging control apparatus according to claim 1,
by changing the position of the imaging surface, the state in which the imaging surface and the lens are in the first positional relationship is changed to the state in which the imaging surface and the lens are in the second positional relationship.
15. An imaging device is provided with:
the imaging control apparatus according to any one of claims 1 to 14;
an image sensor having the imaging surface; and
the lens.
16. An imaging system includes:
the image pickup apparatus according to claim 15; and
and a support mechanism that supports the imaging device.
17. A mobile body that moves with the imaging system according to claim 16 mounted thereon.
18. An imaging control method includes:
a step of acquiring a first image included in a first captured image captured with an imaging surface and a lens in a first positional relationship, and a second image included in a second captured image captured with the imaging surface and the lens in a second positional relationship;
a step of calculating the blur amount of each of the first image and the second image;
a step of controlling a positional relationship between the imaging surface and the lens based on respective blur amounts of the first image and the second image when a difference between the blur amount of the first image and the blur amount of the second image is equal to or greater than a first threshold value;
acquiring a fourth image included in the first captured image and a fifth image included in the second captured image;
calculating the blur amount of each of the fourth image and the fifth image;
deriving a first distance to a first target object included in the first image and the second image based on respective blur amounts of the first image and the second image, and deriving a second distance to a second target object included in the fourth image and the fifth image based on respective blur amounts of the fourth image and the fifth image;
and controlling a positional relationship between the image pickup surface and the lens based on the first distance when the first distance satisfies a predetermined image pickup condition, the second distance does not satisfy the image pickup condition, and a difference between a blur amount of the first image and a blur amount of the second image is equal to or greater than the first threshold.
19. A computer-readable storage medium storing a computer program which, when executed by a processor, is adapted to carry out the stages of:
a step of acquiring a first image included in a first captured image captured with an imaging surface and a lens in a first positional relationship, and a second image included in a second captured image captured with the imaging surface and the lens in a second positional relationship;
a step of calculating the blur amount of each of the first image and the second image;
a step of controlling a positional relationship between the imaging surface and the lens based on respective blur amounts of the first image and the second image when a difference between the blur amount of the first image and the blur amount of the second image is equal to or greater than a first threshold value;
a step of acquiring a fourth image included in the first captured image and a fifth image included in the second captured image;
a step of calculating the blur amount of each of the fourth image and the fifth image;
a step of deriving a first distance to a first target object included in the first image and the second image based on the blur amount of each of the first image and the second image, and deriving a second distance to a second target object included in the fourth image and the fifth image based on the blur amount of each of the fourth image and the fifth image;
and a step of controlling a positional relationship between the image pickup surface and the lens based on the first distance when the first distance satisfies a predetermined image pickup condition, the second distance does not satisfy the image pickup condition, and a difference between a blur amount of the first image and a blur amount of the second image is equal to or greater than the first threshold.
CN201780002652.0A 2017-04-07 2017-04-07 Imaging control device, imaging system, moving object, imaging control method, and medium Expired - Fee Related CN108235815B (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2017/014554 WO2018185939A1 (en) 2017-04-07 2017-04-07 Imaging control device, imaging device, imaging system, mobile body, imaging control method and program

Publications (2)

Publication Number Publication Date
CN108235815A CN108235815A (en) 2018-06-29
CN108235815B true CN108235815B (en) 2020-11-13

Family

Family ID: 62645421

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201780002652.0A Expired - Fee Related CN108235815B (en) 2017-04-07 2017-04-07 Imaging control device, imaging system, moving object, imaging control method, and medium

Country Status (3)

Country Link
JP (1) JPWO2018185939A1 (en)
CN (1) CN108235815B (en)
WO (1) WO2018185939A1 (en)

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6710864B1 (en) * 2018-12-20 2020-06-17 エスゼット ディージェイアイ テクノロジー カンパニー リミテッドSz Dji Technology Co.,Ltd Lens device, imaging device, and moving body
JP6690106B1 (en) * 2019-03-26 2020-04-28 エスゼット ディージェイアイ テクノロジー カンパニー リミテッドSz Dji Technology Co.,Ltd Determination device, imaging system, and moving body
JP6798072B2 (en) * 2019-04-24 2020-12-09 エスゼット ディージェイアイ テクノロジー カンパニー リミテッドSz Dji Technology Co.,Ltd Controls, mobiles, control methods, and programs
CN111932901B (en) * 2019-05-13 2022-08-09 斑马智行网络(香港)有限公司 Road vehicle tracking detection apparatus, method and storage medium
JP6874251B2 (en) * 2019-07-23 2021-05-19 エスゼット ディージェイアイ テクノロジー カンパニー リミテッドSz Dji Technology Co.,Ltd Devices, imaging devices, moving objects, methods, and programs
JP2021032990A (en) * 2019-08-21 2021-03-01 エスゼット ディージェイアイ テクノロジー カンパニー リミテッドSz Dji Technology Co.,Ltd Control device, imaging system, control method and program
CN112335227A (en) * 2019-08-21 2021-02-05 深圳市大疆创新科技有限公司 Control device, imaging system, control method, and program
JP7019895B2 (en) * 2020-04-07 2022-02-16 エスゼット ディージェイアイ テクノロジー カンパニー リミテッド Devices, imaging devices, imaging systems, moving objects, methods, and programs
CN112822410B (en) * 2021-04-19 2021-06-22 浙江华创视讯科技有限公司 Focusing method, focusing device, electronic device and storage medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009089348A (en) * 2007-09-11 2009-04-23 Ricoh Co Ltd Electronic apparatus, imaging device, and reproducing device
CN101713902A (en) * 2008-09-30 2010-05-26 索尼株式会社 Fast camera auto-focus
CN103209297A (en) * 2012-01-13 2013-07-17 佳能株式会社 Imaging Apparatus And Method Thereof, Lens Apparatus And Method Thereof, And Imaging System
CN106303201A (en) * 2015-06-04 2017-01-04 光宝科技股份有限公司 Image capture unit and focusing method

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006086952A (en) * 2004-09-17 2006-03-30 Casio Comput Co Ltd Digital camera and program
WO2012137650A1 (en) * 2011-04-01 2012-10-11 富士フイルム株式会社 Imaging device and program
JP5882750B2 (en) * 2012-01-13 2016-03-09 キヤノン株式会社 IMAGING DEVICE, LENS DEVICE, IMAGING DEVICE CONTROL METHOD, LENS DEVICE CONTROL METHOD, COMPUTER PROGRAM, IMAGING SYSTEM
DE102013210204A1 (en) * 2013-05-31 2014-12-18 Gilupi Gmbh Detection device for in vivo and / or in vitro enrichment of sample material
JP6136019B2 (en) * 2014-02-03 2017-05-31 パナソニックIpマネジメント株式会社 Moving image photographing apparatus and focusing method of moving image photographing apparatus
JP6137316B2 (en) * 2014-02-26 2017-05-31 パナソニックIpマネジメント株式会社 Depth position detection device, imaging device, and depth position detection method

Also Published As

Publication number Publication date
JPWO2018185939A1 (en) 2019-04-11
WO2018185939A1 (en) 2018-10-11
CN108235815A (en) 2018-06-29

Similar Documents

Publication Publication Date Title
CN108235815B (en) Imaging control device, imaging system, moving object, imaging control method, and medium
CN111567032B (en) Specifying device, moving body, specifying method, and computer-readable recording medium
CN111356954B (en) Control device, mobile body, control method, and program
US20210014427A1 (en) Control device, imaging device, mobile object, control method and program
JP2019110462A (en) Control device, system, control method, and program
US20210105411A1 (en) Determination device, photographing system, movable body, composite system, determination method, and program
US10942331B2 (en) Control apparatus, lens apparatus, photographic apparatus, flying body, and control method
CN112292712A (en) Device, imaging device, moving object, method, and program
JP6543875B2 (en) Control device, imaging device, flying object, control method, program
JP6515423B2 (en) CONTROL DEVICE, MOBILE OBJECT, CONTROL METHOD, AND PROGRAM
JP6503607B2 (en) Imaging control apparatus, imaging apparatus, imaging system, moving object, imaging control method, and program
CN111602385B (en) Specifying device, moving body, specifying method, and computer-readable recording medium
CN111357271B (en) Control device, mobile body, and control method
US11066182B2 (en) Control apparatus, camera apparatus, flying object, control method and program
CN110785997B (en) Control device, imaging device, mobile body, and control method
CN111226170A (en) Control device, mobile body, control method, and program
CN110770667A (en) Control device, mobile body, control method, and program
CN111213369B (en) Control device, control method, imaging device, mobile object, and computer-readable storage medium
JP6569157B1 (en) Control device, imaging device, moving object, control method, and program
CN111615663A (en) Control device, imaging system, mobile object, control method, and program
JP2020052220A (en) Controller, imaging device, mobile body, method for control, and program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee
Granted publication date: 20201113