CN108235815A - Imaging control device, imaging device, imaging system, moving body, imaging control method, and program - Google Patents

Imaging control device, imaging device, imaging system, moving body, imaging control method, and program

Info

Publication number
CN108235815A
Authority
CN
China
Prior art keywords
image
blur amount
captured image
imaging surface
threshold
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201780002652.0A
Other languages
Chinese (zh)
Other versions
CN108235815B (en)
Inventor
邵明
本庄谦一
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Dajiang Innovations Technology Co Ltd
Original Assignee
Shenzhen Dajiang Innovations Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Dajiang Innovations Technology Co Ltd
Publication of CN108235815A
Application granted
Publication of CN108235815B
Expired - Fee Related
Anticipated expiration

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/64Computer-aided capture of images, e.g. transfer from script file into camera, check of taken image quality, advice or proposal for image composition or decision on when to take image

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)
  • Automatic Focus Adjustment (AREA)
  • Focusing (AREA)
  • Accessories Of Cameras (AREA)

Abstract

An imaging control device includes an acquisition unit that acquires a first image included in a first captured image, captured with the imaging surface and the lens in a first positional relationship, and a second image included in a second captured image, captured with the imaging surface and the lens in a second positional relationship. The imaging control device includes a calculation unit that calculates the respective blur amounts of the first image and the second image. The imaging control device includes a control unit that, when the difference between the blur amount of the first image and the blur amount of the second image is equal to or greater than a first threshold, controls the positional relationship between the imaging surface and the lens based on the respective blur amounts of the first image and the second image.

Description

Imaging control device, imaging device, imaging system, moving body, imaging control method, and program
Technical field
The present invention relates to an imaging control device, an imaging device, an imaging system, a moving body, an imaging control method, and a program.
Background technology
Patent Document 1 discloses an image processing apparatus that calculates distance information of a subject in an image by using a plurality of differently blurred images captured with different imaging parameters.
Patent Document 1: Japanese Patent No. 5932476.
Summary of the invention
When the distance to a subject is calculated based on the blur amounts of a plurality of images, the distance to the subject may not be calculated accurately if the difference between the blur amounts of the images is small.
An imaging control device of the present invention may include an acquisition unit that acquires a first image and a second image, the first image being included in a first captured image captured with the imaging surface and the lens in a first positional relationship, and the second image being included in a second captured image captured with the imaging surface and the lens in a second positional relationship. The imaging control device may include a calculation unit that calculates the respective blur amounts of the first image and the second image. The imaging control device may include a control unit that, when the difference between the blur amount of the first image and the blur amount of the second image is equal to or greater than a first threshold, controls the positional relationship between the imaging surface and the lens based on the respective blur amounts of the first image and the second image.
When the difference between the blur amount of the first image and the blur amount of the second image is smaller than the first threshold, the acquisition unit may further acquire a third image included in a third captured image captured with the imaging surface and the lens in a third positional relationship. The calculation unit may further calculate the blur amount of the third image. When the difference between the blur amount of the first image and the blur amount of the third image is equal to or greater than the first threshold, the control unit may control the positional relationship between the imaging surface and the lens based on the respective blur amounts of the first image and the third image.
The acquisition unit may further acquire a fourth image included in the first captured image and a fifth image included in the second captured image. The calculation unit may further calculate the respective blur amounts of the fourth image and the fifth image. When the difference between the blur amount of the first image and the blur amount of the second image is smaller than the first threshold and the difference between the blur amount of the fourth image and the blur amount of the fifth image is equal to or greater than the first threshold, the control unit may control the positional relationship between the imaging surface and the lens based on the respective blur amounts of the fourth image and the fifth image.
The acquisition unit may further acquire a fourth image included in the first captured image and adjacent to the first image, and a fifth image included in the second captured image and adjacent to the second image. The calculation unit may further calculate the respective blur amounts of the fourth image and the fifth image. When the difference between the blur amount of the first image and the blur amount of the second image is equal to or greater than the first threshold and the difference between the blur amount of the fourth image and the blur amount of the fifth image is equal to or greater than the first threshold, the control unit may control the positional relationship between the imaging surface and the lens based on the respective blur amounts of the first image, the second image, the fourth image, and the fifth image.
The imaging control device may include a derivation unit that derives a first distance to a first target object included in the first image and the second image based on the blur amount of the first image and the blur amount of the second image, and derives a second distance to a second target object included in the fourth image and the fifth image based on the blur amount of the fourth image and the blur amount of the fifth image. The control unit may control the positional relationship between the imaging surface and the lens based on the first distance and the second distance.
When the difference between the blur amount of the fourth image and the blur amount of the fifth image is smaller than the first threshold, the acquisition unit may further acquire a sixth image included in a third captured image captured with the imaging surface and the lens in a third positional relationship. The calculation unit may further calculate the blur amount of the sixth image. When the difference between the blur amount of the first image and the blur amount of the second image is equal to or greater than the first threshold and the difference between the blur amount of the fourth image and the blur amount of the sixth image is equal to or greater than the first threshold, the control unit may control the positional relationship between the imaging surface and the lens based on the respective blur amounts of the first image, the second image, the fourth image, and the sixth image.
The acquisition unit may further acquire a fourth image included in the first captured image and a fifth image included in the second captured image. The calculation unit may further calculate the respective blur amounts of the fourth image and the fifth image. The imaging control device may include a derivation unit that derives a first distance to a first target object included in the first image and the second image based on the respective blur amounts of the first image and the second image, and derives a second distance to a second target object included in the fourth image and the fifth image based on the respective blur amounts of the fourth image and the fifth image. When the first distance satisfies a predetermined imaging condition, the second distance does not satisfy the imaging condition, and the difference between the blur amount of the first image and the blur amount of the second image is equal to or greater than the first threshold, the control unit may control the positional relationship between the imaging surface and the lens based on the first distance.
When the first distance satisfies the imaging condition, the second distance does not satisfy the imaging condition, and the difference between the blur amount of the first image and the blur amount of the second image is smaller than the first threshold, the acquisition unit may further acquire a sixth image included in a third captured image captured with the imaging surface and the lens in a third positional relationship.
The calculation unit may further calculate the blur amount of the sixth image. When the difference between the blur amount of the first image and the blur amount of the sixth image is equal to or greater than the first threshold, the control unit may control the positional relationship between the imaging surface and the lens based on the respective blur amounts of the first image and the sixth image.
The imaging control device may further include a determination unit that determines a region of the second captured image corresponding to the first image by comparing a feature point included in the first image with a feature point of the second captured image. When the difference between the position of the first image in the first captured image and the position of the region in the second captured image is equal to or smaller than a second threshold, the acquisition unit may acquire, as the second image, the image of the region of the second captured image that is in the same positional relationship with the second captured image as the positional relationship of the first image with the first captured image.
When the difference between the position of the first image in the first captured image and the position of the region in the second captured image is greater than the second threshold, the acquisition unit may acquire the image of the region as the second image.
When the difference between the position of the first image in the first captured image and the position of the region in the second captured image is greater than the second threshold and equal to or smaller than a third threshold, the acquisition unit may acquire the image of the region as the second image.
The determination unit may determine the respective feature points based on the luminance of the first image and the luminance of the region of the second captured image.
The determination unit may determine the centroid of the luminance of the first image as the feature point included in the first image, and determine the centroid of the luminance of the region of the second captured image as the feature point of the second captured image.
The state in which the imaging surface and the lens are in the first positional relationship may be changed to the state in which they are in the second positional relationship by changing the position of a focus lens included in the lens.
The state in which the imaging surface and the lens are in the first positional relationship may be changed to the state in which they are in the second positional relationship by changing the position of the imaging surface.
An imaging device according to one aspect of the present invention includes the above-described imaging control device, an image sensor having the imaging surface, and the lens.
An imaging system according to one aspect of the present invention includes the above-described imaging device and a support mechanism that supports the imaging device.
A moving body according to one aspect of the present invention carries the above-described imaging system and moves.
An imaging control method according to one aspect of the present invention may include a stage of acquiring a first image and a second image, the first image being included in a first captured image captured with the imaging surface and the lens in a first positional relationship, and the second image being included in a second captured image captured with the imaging surface and the lens in a second positional relationship. The imaging control method may include a stage of calculating the respective blur amounts of the first image and the second image. The imaging control method may include a stage of controlling the positional relationship between the imaging surface and the lens, based on the respective blur amounts of the first image and the second image, when the difference between the blur amount of the first image and the blur amount of the second image is equal to or greater than a first threshold.
A program according to one aspect of the present invention may cause a computer to execute a stage of acquiring a first image and a second image, the first image being included in a first captured image captured with the imaging surface and the lens in a first positional relationship, and the second image being included in a second captured image captured with the imaging surface and the lens in a second positional relationship. The program may cause the computer to execute a stage of calculating the respective blur amounts of the first image and the second image. The program may cause the computer to execute a stage of controlling the positional relationship between the imaging surface and the lens, based on the respective blur amounts of the first image and the second image, when the difference between the blur amount of the first image and the blur amount of the second image is equal to or greater than a first threshold.
According to one aspect of the present invention, the positional relationship between the imaging surface and the lens can be adjusted more accurately based on the blur amounts of a plurality of images.
The above summary of the invention does not enumerate all the features of the present invention. Sub-combinations of these feature groups may also constitute the invention.
Description of the drawings
Fig. 1 is a diagram showing an example of the appearance of an unmanned aerial vehicle and a remote operation device.
Fig. 2 is a diagram showing an example of the functional blocks of the unmanned aerial vehicle.
Fig. 3 is a diagram showing an example of a curve representing the relationship between blur amount and lens position.
Fig. 4 is a diagram showing an example of a process of calculating the distance to a target object based on blur amounts.
Fig. 5 is a diagram illustrating the relationship between the position of a target object, the position of the lens, and the focal length.
Fig. 6A is a diagram illustrating the relationship between the difference in blur amount and the accuracy of determining the lens position.
Fig. 6B is a diagram illustrating the relationship between the difference in blur amount and the accuracy of determining the lens position.
Fig. 6C is a diagram illustrating the relationship between the difference in blur amount and the accuracy of determining the lens position.
Fig. 7 is a diagram illustrating the images used to calculate blur amounts.
Fig. 8 is a diagram showing an example of a curve representing the relationship between blur amount and lens position.
Fig. 9 is a flowchart showing an example of the procedure of AF processing in the BDAF method.
Fig. 10 is a flowchart showing another example of the procedure of AF processing in the BDAF method.
Fig. 11 is a flowchart showing another example of the procedure of AF processing in the BDAF method.
Fig. 12 is a flowchart showing another example of the procedure of AF processing in the BDAF method.
Fig. 13 is a diagram showing another example of the functional blocks of the unmanned aerial vehicle.
Fig. 14 is a diagram illustrating the amount of movement of a target object.
Fig. 15A is a diagram illustrating an example of determining the amount of movement of a target object based on the centroid of luminance.
Fig. 15B is a diagram illustrating an example of determining the amount of movement of a target object based on the centroid of luminance.
Fig. 16 is a flowchart showing an example of a procedure for moving the AF processing frame according to the amount of movement of a target object.
Fig. 17 is a diagram showing an example of a hardware configuration.
Specific embodiment
Hereinafter, the present invention will be described through embodiments of the invention, but the following embodiments do not limit the invention according to the claims. In addition, not all combinations of the features described in the embodiments are essential to the solution of the invention. It will be apparent to those skilled in the art that various changes and improvements can be made to the following embodiments. It is apparent from the description of the claims that embodiments incorporating such changes or improvements may also be included in the technical scope of the present invention.
The claims, the specification, the drawings, and the abstract contain matters subject to copyright protection. The copyright owner will not object to the reproduction of these documents by anyone as long as it is as shown in the files or records of the Patent Office. In all other cases, all copyrights are reserved.
Various embodiments of the present invention may be described with reference to flowcharts and block diagrams, where a block may represent (1) a stage of a process in which an operation is performed or (2) a "unit" of a device having the function of performing the operation. Specific stages and "units" may be implemented by programmable circuits and/or processors. Dedicated circuits may include digital and/or analog hardware circuits, and may include integrated circuits (ICs) and/or discrete circuits. Programmable circuits may include reconfigurable hardware circuits. Reconfigurable hardware circuits may include logic AND, logic OR, logic XOR, logic NAND, logic NOR and other logical operations, flip-flops, registers, and memory elements such as field programmable gate arrays (FPGAs) and programmable logic arrays (PLAs).
A computer-readable medium may include any tangible device that can store instructions to be executed by a suitable device. As a result, a computer-readable medium having instructions stored therein constitutes a product that includes instructions that can be executed to create means for performing the operations specified in the flowcharts or block diagrams. Examples of computer-readable media may include electronic storage media, magnetic storage media, optical storage media, electromagnetic storage media, semiconductor storage media, and the like. More specific examples of computer-readable media may include floppy disks (registered trademark), magnetic tape, hard disks, random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), electrically erasable programmable read-only memory (EEPROM), static random access memory (SRAM), compact disc read-only memory (CD-ROM), digital versatile discs (DVD), Blu-ray (RTM) discs, memory sticks, integrated circuit cards, and the like.
Computer-readable instructions may include either source code or object code written in any combination of one or more programming languages, including conventional procedural programming languages. The source or object code may be assembly instructions, instruction set architecture (ISA) instructions, machine instructions, machine-dependent instructions, microcode, firmware instructions, state-setting data, object-oriented programming languages such as Smalltalk, JAVA (registered trademark), and C++, or the "C" programming language or similar programming languages. The computer-readable instructions may be provided to a processor or programmable circuit of a general-purpose computer, special-purpose computer, or other programmable data processing device, either locally or via a local area network (LAN), a wide area network (WAN) such as the Internet, or the like. The processor or programmable circuit may execute the computer-readable instructions to create means for performing the operations specified in the flowcharts or block diagrams. Examples of processors include computer processors, processing units, microprocessors, digital signal processors, controllers, microcontrollers, and the like.
Fig. 1 shows an example of the appearance of an unmanned aerial vehicle (UAV) 10 and a remote operation device 300. The UAV 10 includes a UAV body 20, a gimbal 50, a plurality of imaging devices 60, and an imaging device 100. The gimbal 50 and the imaging device 100 are an example of an imaging system. The UAV 10 is an example of a moving body propelled by a propulsion unit. The concept of a moving body includes, in addition to UAVs, other aircraft that move in the air, vehicles that move on the ground, ships that move on the water, and the like.
The UAV body 20 includes a plurality of rotors. The plurality of rotors are an example of the propulsion unit. The UAV body 20 causes the UAV 10 to fly by controlling the rotation of the plurality of rotors. For example, the UAV body 20 causes the UAV 10 to fly using four rotors. The number of rotors is not limited to four. The UAV 10 may also be a fixed-wing aircraft without rotors.
The imaging device 100 is an imaging camera that images a subject included in a desired imaging range. The gimbal 50 rotatably supports the imaging device 100. The gimbal 50 is an example of a support mechanism. For example, the gimbal 50 rotatably supports the imaging device 100 about a pitch axis using an actuator. The gimbal 50 further rotatably supports the imaging device 100 about each of a roll axis and a yaw axis using actuators. The gimbal 50 can change the attitude of the imaging device 100 by rotating the imaging device 100 about at least one of the yaw axis, the pitch axis, and the roll axis.
The plurality of imaging devices 60 are sensing cameras that image the surroundings of the UAV 10 in order to control its flight. Two imaging devices 60 may be provided on the nose, that is, the front, of the UAV 10. Two further imaging devices 60 may be provided on the bottom surface of the UAV 10. The two imaging devices 60 on the front side are paired and can function as a so-called stereo camera. The two imaging devices 60 on the bottom side are also paired and can function as a stereo camera. Three-dimensional spatial data around the UAV 10 can be generated based on the images captured by the plurality of imaging devices 60. The number of imaging devices 60 included in the UAV 10 is not limited to four; it suffices that the UAV 10 includes at least one imaging device 60. The UAV 10 may also include at least one imaging device 60 on each of the nose, the tail, the sides, the bottom surface, and the top surface. The angle of view that can be set for the imaging devices 60 may be wider than the angle of view that can be set for the imaging device 100. The imaging devices 60 may also have a single-focus lens or a fisheye lens.
The remote operation device 300 communicates with the UAV 10 and operates it remotely. The remote operation device 300 can communicate wirelessly with the UAV 10. The remote operation device 300 transmits to the UAV 10 instruction information representing various commands related to the movement of the UAV 10, such as ascending, descending, accelerating, decelerating, advancing, retreating, and rotating. The instruction information includes, for example, instruction information for raising the altitude of the UAV 10. The instruction information may indicate the altitude at which the UAV 10 should be located. The UAV 10 moves so as to be located at the altitude indicated by the instruction information received from the remote operation device 300. The instruction information may include an ascent command for making the UAV 10 ascend. The UAV 10 ascends while it accepts the ascent command. Even if the ascent command is accepted, the UAV 10 may limit its ascent when its altitude has reached an upper limit.
Fig. 2 shows an example of the functional blocks of the UAV 10. The UAV 10 includes a UAV control unit 30, a memory 32, a communication interface 34, a propulsion unit 40, a GPS receiver 41, an inertial measurement unit 42, a magnetic compass 43, a barometric altimeter 44, a temperature sensor 45, the gimbal 50, and the imaging device 100.
The communication interface 34 communicates with other devices such as the remote operation device 300. The communication interface 34 can receive from the remote operation device 300 instruction information including various commands for the UAV control unit 30. The memory 32 stores programs and the like necessary for the UAV control unit 30 to control the propulsion unit 40, the GPS receiver 41, the inertial measurement unit (IMU) 42, the magnetic compass 43, the barometric altimeter 44, the temperature sensor 45, the gimbal 50, the imaging devices 60, and the imaging device 100. The memory 32 may be a computer-readable recording medium and may include at least one of SRAM, DRAM, EPROM, EEPROM, and flash memory such as USB memory. The memory 32 may be provided inside the UAV body 20, or it may be provided so as to be removable from the UAV body 20.
The UAV control unit 30 controls the flight and imaging of the UAV 10 in accordance with the programs stored in the memory 32. The UAV control unit 30 may be constituted by a microprocessor such as a CPU or MPU, a microcontroller such as an MCU, or the like. The UAV control unit 30 controls the flight and imaging of the UAV 10 in accordance with commands received from the remote operation device 300 via the communication interface 34. The propulsion unit 40 propels the UAV 10. The propulsion unit 40 includes a plurality of rotors and a plurality of drive motors that rotate the rotors. The propulsion unit 40 causes the UAV 10 to fly by rotating the plurality of rotors via the plurality of drive motors in accordance with commands from the UAV control unit 30.
The GPS receiver 41 receives a plurality of signals indicating times transmitted from a plurality of GPS satellites. The GPS receiver 41 calculates the position of the GPS receiver 41, that is, the position of the UAV 10, based on the received signals. The IMU 42 detects the attitude of the UAV 10. As the attitude of the UAV 10, the IMU 42 detects the accelerations in the three axial directions of front-back, left-right, and up-down, and the angular velocities about the three axes of pitch, roll, and yaw. The magnetic compass 43 detects the orientation of the nose of the UAV 10. The barometric altimeter 44 detects the flight altitude of the UAV 10. The barometric altimeter 44 detects the air pressure around the UAV 10 and converts the detected air pressure into an altitude, thereby detecting the altitude. The temperature sensor 45 detects the temperature around the UAV 10.
The imaging device 100 includes an imaging unit 102 and a lens unit 200. The lens unit 200 is an example of a lens device. The imaging unit 102 includes an image sensor 120, an imaging control unit 110, and a memory 130. The image sensor 120 may be constituted by a CCD or a CMOS. The image sensor 120 outputs to the imaging control unit 110 image data of an optical image formed via a plurality of lenses 210. The imaging control unit 110 may be constituted by a microprocessor such as a CPU or MPU, a microcontroller such as an MCU, or the like. The imaging control unit 110 can control the imaging device 100 in accordance with operation commands for the imaging device 100 from the UAV control unit 30. The memory 130 may be a computer-readable recording medium and may include at least one of SRAM, DRAM, EPROM, EEPROM, and flash memory such as USB memory. The memory 130 stores programs and the like necessary for the imaging control unit 110 to control the image sensor 120 and the like. The memory 130 may be provided inside the housing of the imaging device 100, or it may be provided so as to be removable from the housing of the imaging device 100.
The lens unit 200 includes a plurality of lenses 210, a lens moving mechanism 212, and a lens control unit 220. The plurality of lenses 210 can function as a zoom lens, a varifocal lens, and a focus lens. At least some or all of the plurality of lenses 210 are arranged to be movable along the optical axis. The lens unit 200 may be an interchangeable lens provided so as to be attachable to and detachable from the imaging unit 102. The lens moving mechanism 212 moves at least some or all of the plurality of lenses 210 along the optical axis. The lens control unit 220 drives the lens moving mechanism 212 in accordance with lens control commands from the imaging unit 102 to move one or more lenses 210 along the optical axis direction. The lens control commands are, for example, zoom control commands and focus control commands.
The imaging device 100 configured in this way performs autofocus (AF) processing and images a desired subject.
In order to perform AF processing, the imaging device 100 determines the distance from the lens to the subject (subject distance). As a method for determining the subject distance, there is a method of determining it based on the blur amounts of a plurality of images captured with the lens and the imaging surface in different positional relationships. Here, this method is referred to as the Bokeh Detection Auto Focus (BDAF) method.
For example, the blur amount (Cost) of an image can be represented by the following formula (1) using a Gaussian function. In formula (1), x represents the pixel position in the horizontal direction, and σ represents the standard deviation.
[formula 1]
Fig. 3 shows an example of the curve represented by formula (1). By moving the focus lens to the lens position corresponding to the minimal point 502 of the curve 500, the focus can be brought onto the target object included in the image I.
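As an illustration only (not part of the disclosure), the position of the minimal point 502 can be estimated numerically from a few (lens position, blur amount) samples. The Python sketch below assumes a simple parabolic fit near the minimum instead of a full Gaussian fit; the function name and the sample values are hypothetical.

```python
import numpy as np

def estimate_in_focus_position(lens_positions, blur_amounts):
    """Fit a parabola to (lens position, blur amount) samples and return
    the lens position of the fitted minimum (cf. minimal point 502)."""
    a, b, _ = np.polyfit(lens_positions, blur_amounts, 2)
    if a <= 0:
        raise ValueError("samples do not bracket a minimum")
    return -b / (2.0 * a)

# Example: three blur samples taken around the minimum of curve 500.
positions = np.array([0.0, 10.0, 20.0])   # focus lens positions (arbitrary units)
blurs = np.array([0.8, 0.3, 0.6])         # blur amounts at those positions
print(estimate_in_focus_position(positions, blurs))  # ~11.25
```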
Fig. 4 is a flowchart showing an example of the distance calculation procedure of the BDAF method. First, with the lens and the imaging surface in a first positional relationship, the imaging device 100 captures a first image I1 and stores it in the memory 130. Then, by moving the focus lens or the imaging surface of the image sensor 120 along the optical axis so that the lens and the imaging surface are in a second positional relationship, the imaging device 100 captures a second image I2 and stores it in the memory 130 (S101). For example, as in so-called hill-climbing AF, the focus lens or the imaging surface of the image sensor 120 is moved along the optical axis direction so as not to go beyond the in-focus point. The amount of movement of the focus lens or the imaging surface of the image sensor 120 may be, for example, 10 μm.
Then, the imaging device 100 divides the image I1 into a plurality of regions (S102). A feature amount may be calculated for each pixel in the image I1, and pixel groups having similar feature amounts may be taken as one region, thereby dividing the image I1 into a plurality of regions. Alternatively, only the pixel group within the range set as the AF processing frame in the image I1 may be divided into a plurality of regions. The imaging device 100 divides the image I2 into a plurality of regions corresponding to the plurality of regions of the image I1. Based on the blur amount of each of the plurality of regions of the image I1 and the blur amount of each of the plurality of regions of the image I2, the imaging device 100 calculates, for each region, the distance to the target object included in that region (S103).
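A rough illustration of steps S102-S103 follows. It is a sketch under assumptions: the grid split and the gradient-based blur proxy are not specified by the patent, and `blur_amount` only stands in for the blur measure of formula (1).

```python
import numpy as np

def blur_amount(region: np.ndarray) -> float:
    """Proxy blur measure (assumption): the flatter the gradients, the blurrier."""
    gy, gx = np.gradient(region.astype(float))
    return 1.0 / (np.mean(gx ** 2 + gy ** 2) + 1e-9)

def per_region_blur(image: np.ndarray, grid=(3, 3)) -> dict:
    """S102: split the image into grid regions (cf. images 601-605 in the
    AF processing frame) and compute a blur amount for each region."""
    h, w = image.shape
    rh, rw = h // grid[0], w // grid[1]
    return {(r, c): blur_amount(image[r * rh:(r + 1) * rh, c * rw:(c + 1) * rw])
            for r in range(grid[0]) for c in range(grid[1])}

# S103 would compare the per-region blur amounts of I1 and I2 and convert
# each pair into a distance, e.g. via the lens formula sketched below.
```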
The distance calculation procedure will be further described with reference to Fig. 5. Let the distance from the lens L (principal point) to the target object 510 (object plane) be A, let the distance from the lens L (principal point) to the position (image plane) at which the target object 510 forms an image on the imaging surface be B, and let the focal length be F. In this case, the relationship between the distance A, the distance B, and the focal length F can be expressed by the following formula (2) according to the lens formula.
[formula 2]
1/A + 1/B = 1/F … (2)
The focal length F is determined by the lens position. Therefore, if the distance B at which the target object 510 forms an image on the imaging surface can be determined, the distance A from the lens L to the target object 510 can be determined using formula (2).
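Taking formula (2) as the thin-lens relation written above, the distance A follows directly once B is known. A minimal sketch (the function name, units, and values are illustrative only):

```python
def object_distance(image_distance_b: float, focal_length_f: float) -> float:
    """Solve 1/A + 1/B = 1/F for the object distance A (formula (2))."""
    return 1.0 / (1.0 / focal_length_f - 1.0 / image_distance_b)

# Example: F = 50 mm, image formed 52 mm behind the lens -> A = 1300 mm.
print(object_distance(52.0, 50.0))
```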
As shown in Fig. 5, the position at which the target object 510 forms an image can be calculated from the size of the blur of the target object 510 projected onto the imaging surface (the circles of confusion 512 and 514), so the distance B can be determined, and in turn the distance A. That is, since the size of the blur (the blur amount) is proportional to the distance between the imaging surface and the image-forming position, the image-forming position can be determined.
Here, let the distance from the image I1, which is closer to the imaging surface, to the lens L be D1, and let the distance from the image I2, which is farther from the imaging surface, to the lens L be D2. Each image is blurred. Let the point spread function at this time be PSF, and let the images at D1 and D2 be Id1 and Id2, respectively. In this case, the image I1, for example, can be expressed by the following formula (3) using a convolution operation.
[formula 3]
I1=PSF*Id1…(3)
Further, let the Fourier transform of the image data Id1 and Id2 be f, and let the optical transfer functions obtained by Fourier transforming the point spread functions PSF1 and PSF2 of the images Id1 and Id2 be OTF1 and OTF2, respectively. Taking their ratio gives the following formula (4).
[formula 4]
The value C shown in formula (4) corresponds to the amount of change between the respective blur amounts of the images Id1 and Id2; that is, the value C corresponds to the difference between the blur amount of the image Id1 and the blur amount of the image Id2.
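As an illustration only, the ratio in formula (4) can be evaluated numerically with discrete Fourier transforms. The reduction of the ratio to a single scalar C below is an assumption made for brevity; the patent only states that C corresponds to the difference between the blur amounts of Id1 and Id2.

```python
import numpy as np

def blur_change(img1: np.ndarray, img2: np.ndarray) -> float:
    """Numerical counterpart of formula (4): the spectra of the two blurred
    observations differ only by the ratio of their OTFs."""
    f1 = np.fft.fft2(img1.astype(float))
    f2 = np.fft.fft2(img2.astype(float))
    ratio = np.abs(f1) / (np.abs(f2) + 1e-9)  # ~ |OTF1| / |OTF2|
    return float(np.median(ratio))            # scalar C (assumed reduction)
```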
However, when the distance to a target object included in two images is calculated from two images with different blur amounts, if the difference between the blur amounts of the two images is small, the curve 500 shown in Fig. 3 cannot be determined accurately, and the distance to the target object may not be calculated accurately. For example, as shown in Fig. 6A, if the difference between the blur amount C(t0) obtained from the image I0 and the blur amount C(t1) obtained from the image I1 is smaller than a threshold Th, the curve 522 of the Gaussian function determined from the blur amount C(t0) and the blur amount C(t1) may not be an ideal curve. The lens position 524 determined from the curve 522 deviates from the ideal lens position 520 corresponding to the image-forming position. The same applies to Fig. 6B: the difference between the blur amount C(t0) obtained from the image I0 and the blur amount C(t2) obtained from the image I2 is smaller than the threshold Th. In this case, the lens position 528 determined from the curve 526 of the Gaussian function determined from the blur amount C(t0) and the blur amount C(t2) deviates from the ideal lens position 520. On the other hand, for example, as shown in Fig. 6C, when the difference between the blur amount C(t0) obtained from the image I0 and the blur amount C(t3) obtained from the image I3 is equal to or greater than the threshold Th, the lens position 532 determined from the curve 530 of the Gaussian function determined from the blur amount C(t0) and the blur amount C(t3) coincides with the ideal lens position 520. X denotes the position of the focus lens.
As described above, when the distance to a target object is determined by the BDAF method, it is desirable that the difference between the blur amounts of the images to be compared be equal to or greater than a predetermined threshold.
Therefore, as shown in Fig. 2, the imaging control unit 110 included in the imaging device 100 according to the present embodiment includes an acquisition unit 112, a calculation unit 114, a derivation unit 116, and a focus control unit 140.
The acquisition unit 112 acquires a first image included in a first captured image captured with the imaging surface and the lens in a first positional relationship, and a second image included in a second captured image captured with the imaging surface and the lens in a second positional relationship. For example, as shown in Fig. 7, the acquisition unit 112 acquires an image 601 from among the images 601 to 605 in the AF processing frame 610 of a captured image 600 captured with the imaging surface and the lens in the first positional relationship. Further, the acquisition unit 112 acquires an image 621 from among the images 621 to 625 in the AF processing frame 630 of a captured image 620 captured after the positional relationship between the imaging surface and the lens has been changed by moving the focus lens or the imaging surface of the image sensor 120. The acquisition unit 112 acquires, from among the plurality of regions within the AF processing frame, the images of regions having feature amounts that satisfy a predetermined condition. When the feature amount of each of the plurality of regions within the AF processing frame satisfies the predetermined condition, the acquisition unit 112 may acquire the image of each region.
The calculation unit 114 calculates the respective blur amounts of the first image and the second image. For example, the calculation unit 114 calculates the respective blur amounts C(t) of the image 601 and the image 621. When the difference between the blur amount of the first image and the blur amount of the second image is equal to or greater than a first threshold Th1, the focus control unit 140 controls the positional relationship between the imaging surface and the lens based on the respective blur amounts of the first image and the second image. The focus control unit 140 can control the positional relationship between the imaging surface and the lens by controlling the position of at least one of the imaging surface and the lens. The first threshold Th1 may be determined according to the specifications of the imaging device 100. The first threshold Th1 may be determined based on the lens characteristics of the imaging device 100. The first threshold Th1 may be determined based on the pixel pitch of the image sensor 120. The focus control unit 140 is an example of a control unit that controls the positional relationship between the imaging surface and the lens.
The derivation unit 116 derives the first distance to a first target object included in the first image and the second image based on the blur amount of the first image and the blur amount of the second image. For example, the derivation unit 116 can derive the distance to a target object 650 included in the image 601 and the image 621 based on the above formula (2) and the geometric relationship shown in Fig. 5. The focus control unit 140 can perform AF processing by moving the focus lens or the imaging surface of the image sensor 120 based on the distance to the target object 650 derived by the derivation unit 116.
When the difference between the blur amount of the first image and the blur amount of the second image is smaller than the first threshold Th1, the acquisition unit 112 may further acquire a third image included in a third captured image captured with the imaging surface and the lens in a third positional relationship. For example, the acquisition unit 112 may acquire an image 661 included in a captured image 660 captured in the state of the third positional relationship. The calculation unit 114 may further calculate the blur amount of the third image. When the difference between the blur amount of the first image and the blur amount of the third image is equal to or greater than the first threshold Th1, the focus control unit 140 may control the positional relationship between the imaging surface and the lens based on the respective blur amounts of the first image and the third image. When the difference between the blur amount of the image 601 and the blur amount of the image 661 is equal to or greater than the first threshold Th1, the focus control unit 140 may move the focus lens or the imaging surface of the image sensor 120 based on the respective blur amounts of the image 601 and the image 661 so that the focus is brought onto the target object 650.
The acquisition unit 112 may further acquire a fourth image included in the first captured image and a fifth image included in the second captured image. For example, the acquisition unit 112 may acquire an image 602 included in the captured image 600 and an image 622 included in the captured image 620. The calculation unit 114 may calculate the respective blur amounts of the fourth image and the fifth image. For example, the calculation unit 114 may calculate the respective blur amounts of the image 602 and the image 622. When the difference between the blur amount of the first image and the blur amount of the second image is smaller than the first threshold Th1 and the difference between the blur amount of the fourth image and the blur amount of the fifth image is equal to or greater than the first threshold Th1, the focus control unit 140 may control the positional relationship between the imaging surface and the lens based on the respective blur amounts of the fourth image and the fifth image. The derivation unit 116 may derive the second distance to a second target object included in the fourth image and the fifth image based on the blur amount of the fourth image and the blur amount of the fifth image. For example, the derivation unit 116 may derive the distance to a target object 652 included in the image 602 and the image 622 based on the above formula (2) and the geometric relationship shown in Fig. 5. For example, when the difference between the blur amount of the image 601 and the blur amount of the image 621 is smaller than the first threshold Th1 and the difference between the blur amount of the image 602 and the blur amount of the image 622 is equal to or greater than the first threshold Th1, the focus control unit 140 may control the positional relationship between the imaging surface and the lens based on the respective blur amounts of the image 602 and the image 622. The focus control unit 140 may move the focus lens or the imaging surface of the image sensor 120 based on the distance, derived by the derivation unit 116, to the target object 652 included in the image 602 and the image 622, so that the focus is brought onto the target object 652.
The acquisition unit 112 may acquire a fourth image included in the first captured image and adjacent to the first image, and a fifth image included in the second captured image and adjacent to the second image. For example, the acquisition unit 112 may acquire the image 602, which is included in the captured image 600 and adjacent to the image 601, and the image 622, which is included in the captured image 620 and adjacent to the image 621. The calculation unit 114 may calculate the respective blur amounts of the fourth image and the fifth image. For example, the calculation unit 114 may calculate the respective blur amounts of the image 602 and the image 622.
When the difference between the blur amount of the first image and the blur amount of the second image is equal to or greater than the first threshold Th1 and the difference between the blur amount of the fourth image and the blur amount of the fifth image is equal to or greater than the first threshold Th1, the focus control unit 140 may control the positional relationship between the imaging surface and the lens based on the respective blur amounts of the first image, the second image, the fourth image, and the fifth image. For example, when the difference between the blur amount of the image 601 and the blur amount of the image 621 is equal to or greater than the first threshold Th1 and the difference between the blur amount of the image 602 and the blur amount of the image 622 is equal to or greater than the first threshold Th1, the focus control unit 140 may control the positional relationship between the imaging surface and the lens based on the respective blur amounts of the image 601, the image 621, the image 602, and the image 622. The focus control unit 140 may control the positional relationship between the imaging surface and the lens according to the distance to the target object 650 determined based on the blur amounts of the image 601 and the image 621 and the distance to the target object 652 determined based on the blur amounts of the image 602 and the image 622. The focus control unit 140 may determine the distance to the subject within the AF processing frame based on the weighted distances of the respective regions, according to weights preset for each region within the AF processing frame.
When the difference between the blur amount of the fourth image and the blur amount of the fifth image is smaller than the first threshold Th1, the acquisition unit 112 may further acquire a sixth image included in a third captured image captured with the imaging surface and the lens in a third positional relationship. For example, the acquisition unit 112 may acquire an image 662 included in the captured image 660 captured after the positional relationship between the imaging surface and the lens has been further changed from the second positional relationship to the third positional relationship. The calculation unit 114 may calculate the blur amount of the sixth image; for example, it may calculate the blur amount of the image 662. When the difference between the blur amount of the first image and the blur amount of the second image is equal to or greater than the first threshold Th1 and the difference between the blur amount of the fourth image and the blur amount of the sixth image is equal to or greater than the first threshold Th1, the focus control unit 140 may control the positional relationship between the imaging surface and the lens based on the respective blur amounts of the first image, the second image, the fourth image, and the sixth image. For example, when the difference between the blur amount of the image 601 and the blur amount of the image 621 is equal to or greater than the first threshold Th1 and the difference between the blur amount of the image 602 and the blur amount of the image 662 is equal to or greater than the first threshold Th1, the focus control unit 140 may control the positional relationship between the imaging surface and the lens based on the respective blur amounts of the image 601, the image 621, the image 602, and the image 662.
When the first distance derived by the derivation unit 116 satisfies a predetermined imaging condition, the second distance derived by the derivation unit 116 does not satisfy the predetermined imaging condition, and the difference between the blur amount of the first image and the blur amount of the second image is equal to or greater than the first threshold, the focus control unit 140 may control the positional relationship between the imaging surface and the lens based on the first distance. The imaging condition may be a condition determined according to a shooting mode such as a portrait mode or a landscape mode. The imaging condition may be a condition such as nearest-end priority or infinity-end priority. If the imaging condition is nearest-end priority, the focus control unit 140 may determine, among the distances to target objects derived by the derivation unit 116 for the respective regions within the AF processing frame, the image of the region with the shortest distance to the target object as the image satisfying the imaging condition, and determine the images of the other regions within the AF processing frame as images not satisfying the imaging condition. Here, when the first distance satisfies the imaging condition, the second distance does not satisfy the imaging condition, and the difference between the blur amount of the first image and the blur amount of the second image is smaller than the first threshold Th1, the acquisition unit 112 further acquires a sixth image included in a third captured image captured with the imaging surface and the lens in a third positional relationship. For example, the acquisition unit 112 may acquire the image 661 from the captured image 660. When the difference between the blur amount of the first image and the blur amount of the sixth image is equal to or greater than the first threshold, the focus control unit 140 may control the positional relationship between the imaging surface and the lens based on the respective blur amounts of the first image and the sixth image. The derivation unit 116 may derive the distance to the target object 650 included in the image 601 and the image 661 based on the blur amount of the image 601 and the blur amount of the image 661. The focus control unit 140 may move the focus lens or the imaging surface of the image sensor 120 based on the distance to the target object 650 derived by the derivation unit 116, so that the focus is brought onto the target object 650.
For example, as shown in Fig. 8, the calculation unit 114 calculates the blur amount C(t0) of the image 601, the blur amount C(t1) of the image 621, the blur amount C′(t0) of the image 602, and the blur amount C′(t1) of the image 622. Here, the difference Ta between the blur amount C(t0) of the image 601 and the blur amount C(t1) of the image 621 is smaller than the first threshold Th1. On the other hand, the difference Tb between the blur amount C′(t0) of the image 602 and the blur amount C′(t1) of the image 622 is equal to or greater than the first threshold Th1. If the imaging condition in this case is infinity-end priority, the focus control unit 140 may determine that the focus should be brought onto the target object 652 included in the image 602 and the image 622. On the other hand, if the imaging condition is nearest-end priority, the focus control unit 140 may determine that the focus should be brought onto the target object 650 included in the image 601 and the image 621. Here, since the difference Tb is equal to or greater than the first threshold Th1, the curve 700 determined based on the blur amount C′(t0) of the image 602 and the blur amount C′(t1) of the image 622 has high accuracy. Therefore, the focus control unit 140 can perform AF processing accurately by moving the focus lens or the imaging surface of the image sensor 120 based on the lens position 712 determined from the curve 700. On the other hand, since the difference Ta is smaller than the first threshold Th1, the curve 702 determined based on the blur amount C(t0) of the image 601 and the blur amount C(t1) of the image 621 has low accuracy. Therefore, in the case of nearest-end priority, if the focus control unit 140 moves the focus lens or the imaging surface of the image sensor 120 according to the lens position 714 determined based on the curve 702, AF processing cannot be performed accurately. The imaging device 100 therefore further moves the focus lens or the imaging surface of the image sensor 120 and captures the captured image 660. The calculation unit 114 then calculates the blur amount C(t2) of the image 661 of the captured image 660. The difference Tc between the blur amount C(t0) of the image 601 and the blur amount C(t2) of the image 661 is equal to or greater than the first threshold Th1. Therefore, the curve 704 determined based on the blur amount C(t0) of the image 601 and the blur amount C(t2) of the image 661 has higher accuracy than the curve 702. Accordingly, by moving the focus lens or the imaging surface of the image sensor 120 based on the lens position 716 determined from the curve 704, the focus control unit 140 can perform AF processing accurately even in the case of nearest-end priority.
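A minimal sketch of the region-selection logic just described, assuming the per-region blur differences and derived distances are already available; the dictionary layout and the helper name are illustrative only, not part of the disclosure.

```python
def choose_focus_region(regions, th1, nearest_priority=True):
    """regions: list of dicts like {'name': '602', 'blur_diff': Tb, 'distance': d}.
    Keep only regions whose blur difference is reliable (>= Th1), then pick
    the nearest or farthest one according to the imaging condition."""
    reliable = [r for r in regions if abs(r["blur_diff"]) >= th1]
    if not reliable:
        return None  # capture another image at a third lens position (e.g. image 661)
    pick = min if nearest_priority else max
    return pick(reliable, key=lambda r: r["distance"])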
Fig. 9 is a flowchart showing an example of the procedure of AF processing in the BDAF method.
The imaging device 100 moves the focus lens to the position X(t0) (S201). For example, the imaging device 100 moves the focus lens by 10 μm in the optical axis direction. The acquisition unit 112 acquires an image I(t0) from the captured image captured at the position X(t0) (S202). The imaging control unit 110 increments a counter (S203). Then, the imaging control unit 110 moves the focus lens to the position X(tn) via the lens control unit 220 (S204). The acquisition unit 112 acquires, from the captured image captured at the position X(tn), an image I(tn) at the position corresponding to the image I(t0) (S205). The calculation unit 114 calculates the blur amount C(t0) of the image I(t0) and the blur amount C(tn) of the image I(tn) (S206). If the difference |C(tn) − C(t0)| between the blur amount C(t0) and the blur amount C(tn) is smaller than the first threshold Th1, the imaging control unit 110 repeats the processing from step S203 onward in order to move the focus lens further.
If the difference |C(tn) − C(t0)| is equal to or greater than the first threshold Th1, the derivation unit 116 derives the distance to the target object included in the image I(t0) and the image I(tn) based on the blur amount C(t0) and the blur amount C(tn). The focus control unit 140 determines the distance to the subject based on this distance (S208). The focus control unit 140 moves the focus lens to the predicted in-focus position based on the determined distance (S209).
As described above, while the difference between the blur amounts of the images is small, the imaging device 100 keeps moving the focus lens until the difference between the blur amounts of the images becomes equal to or greater than the first threshold Th1. Therefore, the imaging device 100 can perform AF processing based on the BDAF method with higher accuracy and at higher speed.
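The loop of Fig. 9 can be sketched as follows. `capture_at`, `blur_of`, and `distance_from_blur` stand in for the image acquisition, blur calculation, and distance derivation described above; they are assumptions of this sketch, not APIs of any particular camera.

```python
def bdaf_af(capture_at, blur_of, distance_from_blur,
            x0=0.0, step=10e-6, th1=0.1, max_steps=20):
    """Sketch of S201-S209: move the focus lens until the blur difference
    from the first image reaches Th1, then derive the subject distance."""
    c0 = blur_of(capture_at(x0))            # S201-S202
    for n in range(1, max_steps + 1):       # S203
        xn = x0 + n * step                  # S204
        cn = blur_of(capture_at(xn))        # S205-S206
        if abs(cn - c0) >= th1:             # S207
            return distance_from_blur(c0, cn, x0, xn)  # S208; S209 moves the lens
    return None
```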
Fig. 10 is a flowchart showing another example of the procedure of AF processing in the BDAF method. The procedure shown in Fig. 10 can be applied to a shooting mode that gives priority to the speed of AF processing.
The imaging device 100 moves the focus lens to the position X(t0) (S301). For example, the imaging device 100 moves the focus lens by 10 μm in the optical axis direction. The acquisition unit 112 acquires a plurality of images within the AF processing frame set in the captured image captured at the position X(t0). The acquisition unit 112 divides the AF processing frame into a plurality of regions and acquires an image for each region. The acquisition unit 112 calculates the feature amount of each of the plurality of images (S302). The acquisition unit 112 may calculate the feature amounts based on the pixel values, luminance values, edge detection, or the like of the plurality of images. If none of the plurality of images has a feature amount equal to or greater than a threshold, the AF processing based on the BDAF method is not performed, and the processing ends.
On the other hand, if there is the image (S303) with characteristic quantity more than threshold value, acquisition unit 112 obtains these figures As I (t0)(S304).Then, imaging control part 110 makes counter be incremented by (S305).Imaging control part 110 is via lens control Portion 220 makes condenser lens be moved to X (tn) position (S306).Acquisition unit 112 is from X (tn) in the photographed images that image of position It obtains and each image I (t0) corresponding position each image I (tn)(S307).Calculating part 114 calculates each image I (t0) Fuzzy quantity C (t0) and each image I (tn) fuzzy quantity C (tn)(S308).If there is no fuzzy quantity C (t0) and fuzzy quantity C (tn) difference | C (tn)-C(t0) | it is the image of more than first threshold Th1, then step S305 is later repeatedly for imaging control part 110 Processing.
If it is poor to exist | C (tn)-C(t0) | it is the image of more than first threshold Th1, then leading-out portion 116 is based on corresponding image I(t0) fuzzy quantity C (t0) and image I (tn) fuzzy quantity C (tn), export to image I (t0) and image I (tn) in include The distance of target object.Focusing control unit 140 determines the distance of subject based on the distance derived from leading-out portion 116, and Condenser lens is made to be moved to prediction focusing position (S311).
According to above processing, focusing control unit 140 can obtain mould in the multiple images out of AF processing blocks In stage of the difference of paste amount for image more than first threshold, condenser lens is made to be moved to prediction focusing position immediately.
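A minimal sketch of this speed-priority variant is shown below: regions of the AF processing frame are filtered by feature amount, and the loop stops as soon as any one region reaches the blur-difference threshold. The region dictionary structure and the helpers capture_regions(), feature_amount(), blur_amount(), distance_from_blur() and camera.focus_to(), as well as the numeric thresholds, are illustrative assumptions, not the patent's implementation.

FEATURE_TH = 10.0   # feature-amount threshold (illustrative)
TH1 = 0.05          # first threshold Th1 (illustrative)

def bdaf_focus_speed_priority(camera, lens_step_um=10, max_steps=50):
    regions0 = capture_regions(camera, lens_pos=0)                    # S301
    feats = {k: feature_amount(img) for k, img in regions0.items()}   # S302
    selected = {k: regions0[k] for k, f in feats.items() if f >= FEATURE_TH}
    if not selected:                          # no region qualifies: no BDAF AF
        return None
    c0 = {k: blur_amount(img) for k, img in selected.items()}   # images I(t0) from S304
    for n in range(1, max_steps + 1):                            # S305
        regionsn = capture_regions(camera, lens_pos=n * lens_step_um)   # S306, S307
        for k in selected:
            cn = blur_amount(regionsn[k])                        # S308
            if abs(cn - c0[k]) >= TH1:      # stop at the first region over Th1
                distance = distance_from_blur(c0[k], cn)
                camera.focus_to(distance)   # move to predicted focus position (S311)
                return distance
    return None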
Fig. 11 is a flowchart showing another example of the procedure of BDAF-based AF processing. The process shown in Fig. 11 can be applied to a multi-point AF mode in which a weight is set for each region within the AF processing frame.
The imaging device 100 moves the focus lens to position X(t0) (S401). For example, the imaging device 100 moves the focus lens by 10 μm in the optical axis direction. The acquisition unit 112 obtains a plurality of images within an AF processing frame set in the captured image taken at position X(t0). The acquisition unit 112 divides the AF processing frame into a plurality of regions and obtains an image for each region. The acquisition unit 112 calculates a feature amount for each of the plurality of images (S402). If none of the plurality of images has a feature amount equal to or greater than a threshold value, the BDAF-based AF processing is not executed and the processing ends.
On the other hand, if there are images having feature amounts equal to or greater than the threshold value (S403), the acquisition unit 112 obtains these images I(t0) (S404). Then, the imaging control unit 110 increments the counter (S405). The imaging control unit 110 moves the focus lens to position X(tn) via the lens control unit 220 (S406). The acquisition unit 112 obtains, from the captured image taken at position X(tn), each image I(tn) at the position corresponding to each image I(t0) (S407). The calculation unit 114 calculates the blur amount C(t0) of each image I(t0) and the blur amount C(tn) of each image I(tn) (S408).
If there is no image for which the difference |C(tn)-C(t0)| is equal to or greater than the first threshold Th1, the imaging control unit 110 moves the focus lens via the lens control unit 220, further obtains captured images with a different positional relationship between the lens and the imaging surface, and repeats the processing from step S405 onward.
If there is an image for which the difference |C(tn)-C(t0)| is equal to or greater than the first threshold Th1, the distance to the target object contained in image I(t0) and image I(tn) is derived based on the blur amount C(t0) of the corresponding image I(t0) and the blur amount C(tn) of image I(tn) (S410).
If the distance to the target object has not yet been derived for all of the images having feature amounts equal to or greater than the threshold value, the imaging control unit 110 moves the focus lens via the lens control unit 220, further obtains captured images with a different positional relationship between the lens and the imaging surface, and repeats the processing from step S405 onward.
If the distance to the target object has been derived for all of the images having feature amounts equal to or greater than the threshold value, the focus control unit 140 determines the distance to the subject based on these distances (S410). For example, in the case of the AF processing frame 610 shown in Fig. 7, the region of each image within the AF processing frame 610 can be weighted individually. A weight of "100" can be set for the region of the image 601 at the center of the AF processing frame 610, a weight of "70" for the images 602 and 603 adjacent to the left and right of image 601, a weight of "70" for the image 604 adjacent above image 601, and a weight of "50" for the image 605 adjacent below image 601. The focus control unit 140 can then weight, for each region, the distances to the target object derived for the respective regions within the AF processing frame 610, and determine the distance to the subject based on the weighted distances. The focus control unit 140 moves the focus lens to the predicted focus position based on the determined distance to the subject (S413).
As described above, when distances are calculated for a plurality of regions, an image whose difference in blur amount is equal to or greater than the first threshold is obtained for each of the plurality of regions. This makes it possible to prevent variation in the accuracy of the derived distances between the regions.
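The weight layout described for Fig. 7 (100 for the center region, 70 for left, right and top, 50 for bottom) can be illustrated with a small sketch. Combining the per-region distances by a weighted average is one plausible reading of the weighting in S410/S413, not the only one; the region keys and distance values are made up for the example.

WEIGHTS = {
    "center": 100,   # image 601
    "left":   70,    # image 602
    "right":  70,    # image 603
    "top":    70,    # image 604
    "bottom": 50,    # image 605
}

def subject_distance(region_distances):
    # Weighted average of the per-region target distances derived by BDAF.
    num = sum(WEIGHTS[k] * d for k, d in region_distances.items())
    den = sum(WEIGHTS[k] for k in region_distances)
    return num / den

# Example: distances (in meters) derived per region
print(subject_distance({"center": 4.8, "left": 5.1, "right": 5.0,
                        "top": 5.3, "bottom": 6.0}))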
Fig. 12 is a flowchart showing another example of the procedure of BDAF-based AF processing. The process shown in Fig. 12 can be applied to imaging modes such as an infinity-priority mode or a nearest-end-priority mode.
The imaging device 100 moves the focus lens to position X(t0) (S501). For example, the imaging device 100 moves the focus lens by 10 μm in the optical axis direction. The acquisition unit 112 obtains a plurality of images within an AF processing frame set in the captured image taken at position X(t0). The acquisition unit 112 divides the AF processing frame into a plurality of regions and obtains an image for each region. The acquisition unit 112 calculates a feature amount for each of the plurality of images (S502). If none of the plurality of images has a feature amount equal to or greater than a threshold value, the BDAF-based AF processing is not executed and the processing ends.
On the other hand, if there are images having feature amounts equal to or greater than the threshold value (S503), the acquisition unit 112 obtains these images I(t0) (S504). Then, the imaging control unit 110 increments the counter (S505). The imaging control unit 110 moves the focus lens to position X(tn) via the lens control unit 220 (S506). The acquisition unit 112 obtains, from the captured image taken at position X(tn), each image I(tn) at the position corresponding to each image I(t0) (S507). The calculation unit 114 calculates the blur amount C(t0) of each image I(t0) and the blur amount C(tn) of each image I(tn) (S508). The derivation unit 116 derives, based on the blur amount C(t0) of each image I(t0) and the blur amount C(tn) of each image I(tn), the distance to each target object contained in each image I(t0) and each image I(tn) (S509).
The focus control unit 140 selects, based on the calculated distances, the image I(t0) and image I(tn) that satisfy the imaging condition (S510). For example, if nearest-end priority is set, the focus control unit 140 selects the image I at the shortest of the calculated distances. If infinity priority is set, the focus control unit 140 selects the image I at the longest of the calculated distances.
The focus control unit 140 judges whether the difference |C(tn)-C(t0)| between the blur amount C(t0) of the selected image I(t0) and the blur amount C(tn) of image I(tn) is equal to or greater than the first threshold Th1 (S511). If the difference |C(tn)-C(t0)| is equal to or greater than the first threshold Th1, the focus control unit 140 determines the distance to the subject based on the distance, derived by the derivation unit 116, to the target object contained in image I(t0) and image I(tn), and moves the focus lens to the predicted focus position (S516).
If the difference |C(tn)-C(t0)| between the blur amount C(t0) of the selected image I(t0) and the blur amount C(tn) of image I(tn) is less than the first threshold Th1, the imaging control unit 110 increments the counter (S512). Then, the imaging control unit 110 moves the focus lens to position X(tn) via the lens control unit 220 (S513). The acquisition unit 112 obtains, from the captured image taken at position X(tn), the image I(tn) at the position corresponding to the image I(t0) selected in step S510 (S514). The calculation unit 114 calculates the blur amount C(tn) of the obtained image I(tn) (S515).
The imaging control unit 110 repeats the processing of step S511 until an image I(tn) is obtained whose difference in blur amount from the blur amount C(t0) of the selected image I(t0) becomes equal to or greater than the first threshold Th1. Then, when an image I(tn) is obtained for which the difference |C(tn)-C(t0)| is equal to or greater than the first threshold Th1, the focus control unit 140 determines the distance to the subject based on the distance, derived by the derivation unit 116, to the target object contained in image I(t0) and image I(tn), and moves the focus lens to the predicted focus position (S516).
As described above, even when AF processing is executed in an imaging mode such as infinity priority or nearest-end priority, the BDAF-based AF processing can be executed with high accuracy.
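The region selection in S509-S510 reduces to picking the nearest or farthest derived distance depending on the mode. The sketch below shows that selection step only; the dictionary of per-region distances and the mode strings are assumptions for illustration.

def select_region(region_distances, mode):
    # Pick the AF region that satisfies the imaging condition (S510),
    # using the per-region distances derived in S509.
    if mode == "nearest_priority":
        return min(region_distances, key=region_distances.get)
    if mode == "infinity_priority":
        return max(region_distances, key=region_distances.get)
    raise ValueError("unknown mode: " + mode)

regions = {"center": 4.8, "left": 2.1, "right": 9.5}   # distances in meters
print(select_region(regions, "nearest_priority"))       # -> "left"
print(select_region(regions, "infinity_priority"))      # -> "right"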
However, when the imaging device 100 mounted on a moving body such as the UAV 10 performs imaging, the target object in the captured image may move within the captured image. If the amount of movement of the target object is large, the imaging control unit 110 may be unable to execute the BDAF-based AF processing with high accuracy.
Therefore, the imaging control unit 110 can execute the BDAF-based AF processing while taking into account the amount of movement of the target object within the image.
Fig. 13 shows an example of the functional blocks of a UAV 10 according to another embodiment. The UAV 10 shown in Fig. 13 differs from the UAV 10 shown in Fig. 2 in that the imaging control unit 110 has a determination unit 113.
The determination unit 113 compares the feature points contained in the first image included in the first captured image, taken in a state in which the lens and the imaging surface are in the first positional relationship, with the feature points of the second captured image, taken in a state in which the lens and the imaging surface are in the second positional relationship, and thereby determines the region of the second captured image that corresponds to the first image. When the difference between the position of the first image in the first captured image and the position of this region in the second captured image is equal to or less than a second threshold Th2, the acquisition unit 112 obtains, as the second image, the image of the region of the second captured image that is in the same positional relationship with the second captured image as the positional relationship between the first captured image and the first image.
When the difference between the position of the first image in the first captured image and the position of the region in the second captured image is greater than the second threshold, the acquisition unit 112 can obtain that region as the second image. When the difference between the position of the first image in the first captured image and the position of the region in the second captured image is greater than the second threshold Th2 and equal to or less than a third threshold Th3, the acquisition unit 112 can obtain the image of that region as the second image. The second threshold Th2 and the third threshold Th3 can be determined based on, for example, the pixel pitch of the image sensor 120. The determination unit 113 can divide the image into a plurality of blocks and search for feature points on a block-by-block basis. Accordingly, the second threshold Th2 and the third threshold Th3 can be determined based on the size of the blocks used for the feature point search.
For example, as shown in Fig. 14, the determination unit 113 determines a feature point 820 from the first image 812 of the AF processing frame 810 within the first captured image 800. The determination unit 113 can determine the feature point 820 based on pixel values, luminance, edge detection, and the like. Then, the determination unit 113 determines a feature point 821 corresponding to the feature point 820 from the second captured image 802 taken after the positional relationship between the lens and the imaging surface has changed. By comparing the feature point 820 of the first captured image 800 with the feature point 821 of the second captured image 802, the determination unit 113 determines the region 814 of the second captured image 802 that corresponds to the first image 812. When the difference between the position of the first image 812 in the first captured image 800 and the position of the region 814 in the second captured image 802 is equal to or less than the second threshold Th2, the acquisition unit 112 obtains, as the second image, the image of the region 816 of the second captured image 802 that is in the same positional relationship as the positional relationship between the first captured image 800 and the first image 812. That is, the focus control unit 140 executes the BDAF-based AF processing without moving the AF processing frame 811 in the second captured image 802 relative to the AF processing frame 810 in the first captured image 800.
On the other hand, suppose that the determination unit 113 determines a feature point 822 corresponding to the feature point 820 from a third captured image 804 taken after the positional relationship between the lens and the imaging surface has changed. In this case, by comparing the feature point 820 of the first captured image 800 with the feature point 822 of the third captured image 804, the determination unit 113 determines the region 816 of the third captured image 804 that corresponds to the first image 812. The acquisition unit 112 judges that the difference between the position of the first image 812 in the first captured image 800 and the position of the region 816 in the third captured image 804 is greater than the second threshold Th2, and therefore obtains the region 816 as the second image. That is, in correspondence with the region 816, the focus control unit 140 moves the AF processing frame 813 in the third captured image 804 relative to the AF processing frame 810 of the first captured image 800 by an amount corresponding to the amount of movement of the feature point, and executes the BDAF-based AF processing using the AF processing frame 813 after the movement.
In this way, when the target object being compared has moved within the image beyond the allowable range, the position of the image being compared is changed, and the calculation unit 114 calculates the blur amount from the image after the change. Therefore, even if the target object moves within the image, the BDAF-based AF processing can be executed with high accuracy.
The feature points determined by the determination unit 113 may also be determined from the centroid of luminance within the image. For example, as shown in Fig. 15A, the first captured image 900 contains a target object 902. The determination unit 113 divides the first captured image 900 into units of a plurality of blocks (for example, 8 × 8 pixels), calculates the luminance for each block, and generates a monochrome image 901 representing the luminance in units of blocks. The determination unit 113 determines the position of the centroid 903 of the luminance from the monochrome image 901. Similarly, the determination unit 113 divides the second captured image 910 into a plurality of blocks, calculates the luminance for each block, and generates a monochrome image 911. The determination unit 113 then determines the position of the centroid 913 of the luminance from the monochrome image 911. The determination unit 113 determines that the centroid of luminance has moved from the position of the centroid 903 to the position of the centroid 913. If the amount of movement of the luminance centroid is within one block, that is, within 8 × 8 pixels, the determination unit 113 can judge that the amount of movement of the feature point is within the second threshold. On the other hand, if the amount of movement of the centroid is within two blocks, the determination unit 113 can judge that the amount of movement of the feature point is within the third threshold. Then, as shown in Fig. 15B, the acquisition unit 112 obtains, as the second image from the second captured image 910, not the image of the region 931 but the image of the region 932, relative to the first image 930 of the first captured image 900. This makes it possible to avoid the influence of the movement of the target object 902, 912 between the images.
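The luminance-centroid feature point of Fig. 15A/15B can be sketched as follows, assuming grayscale images stored as NumPy arrays. The 8 × 8 block size follows the example above; the array layout, the centroid formula, and the use of the Chebyshev distance for the movement amount are illustrative assumptions.

import numpy as np

BLOCK = 8   # block size in pixels, as in the 8 x 8 pixel example above

def block_luminance(img):
    # Average luminance per block (the block-wise monochrome image 901/911).
    h, w = img.shape
    img = img[:h - h % BLOCK, :w - w % BLOCK]
    return img.reshape(img.shape[0] // BLOCK, BLOCK,
                       img.shape[1] // BLOCK, BLOCK).mean(axis=(1, 3))

def luminance_centroid(img):
    # Centroid of luminance, in block coordinates (centroid 903/913).
    lum = block_luminance(img)
    ys, xs = np.indices(lum.shape)
    total = lum.sum()
    return np.array([(ys * lum).sum() / total, (xs * lum).sum() / total])

def feature_point_movement(img1, img2):
    # Movement of the luminance centroid between two captured images, in blocks.
    return np.abs(luminance_centroid(img2) - luminance_centroid(img1)).max()

A movement of at most one block would then be treated as within the second threshold Th2 (keep the AF processing frame), and a movement of at most two blocks as within the third threshold Th3 (shift the AF processing frame), in line with the description above.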
Fig. 16 is a flowchart showing an example of a procedure for moving the AF processing frame according to the amount of movement of the target object.
The acquisition unit 112 obtains an image I(t0) from a first captured image taken in a state in which the lens and the imaging surface are in the first positional relationship (S601). Then, the acquisition unit 112 obtains, from a second captured image taken in a state in which the lens and the imaging surface are in the second positional relationship, an image I(t1) at the position corresponding to image I(t0) (S602). The determination unit 113 calculates the amount of movement X of the target object S in image I(t1) relative to the target object S in image I(t0) (S603). As described above, the determination unit 113 can calculate the amount of movement X by determining feature points based on pixel values, luminance, and the like, and comparing the positions of the feature points with each other.
Then, the focus control unit 140 judges whether the amount of movement X is equal to or less than the second threshold Th2 (S604). The focus control unit 140 can judge whether a feature point of the second captured image exists in the block corresponding to the position of the feature point of the first captured image. For example, the focus control unit 140 can judge whether the amount of movement X is within 8 × 8 pixels. If the amount of movement X is equal to or less than the second threshold Th2, the focus control unit 140 does not move the AF processing frame for the second captured image, and executes the blur-amount-based distance calculation processing using the image I(t1) obtained in step S602 (S605). That is, the focus control unit 140 determines the distance to the target object S based on the blur amount of image I(t0) and the blur amount of image I(t1). The focus control unit 140 then moves the focus lens based on the distance to the target object S.
On the other hand, when the amount of movement X is greater than the second threshold Th2, the focus control unit 140 judges whether the amount of movement X is equal to or less than the third threshold Th3 (S606). The focus control unit 140 can judge whether a feature point of the second captured image exists in a block adjacent to the block corresponding to the position of the feature point of the first captured image. For example, the focus control unit 140 can judge whether the amount of movement is within 24 × 24 pixels. If the amount of movement X is equal to or less than the third threshold Th3, the focus control unit 140 moves the AF processing frame of the image I(t1) being compared by an amount corresponding to the amount of movement X (S607). Then, the focus control unit 140 determines the distance to the target object S based on the blur amount of the image I(t1) obtained from the AF processing frame after the movement and the blur amount of image I(t0) (S605). The focus control unit 140 then moves the focus lens based on the distance to the target object S.
As described above, by adjusting the position of the image being compared in consideration of the movement of the target object within the image, the BDAF-based AF processing can be executed with still higher accuracy.
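The decision flow of Fig. 16 (S601-S607) can be condensed into the following sketch. The helpers movement_amount(), shift_frame(), crop(), blur_amount() and distance_from_blur() are hypothetical, and the pixel thresholds simply restate the 8 × 8 / 24 × 24 pixel example given above.

TH2_PX = 8    # second threshold Th2: one block (8 x 8 pixels) in the example
TH3_PX = 24   # third threshold Th3: reach of the adjacent blocks (24 x 24 pixels)

def bdaf_with_tracking(img0, img1, af_frame):
    x = movement_amount(img0, img1, af_frame)      # S603
    if x <= TH2_PX:                                # S604: keep the AF processing frame
        frame = af_frame
    elif x <= TH3_PX:                              # S606
        frame = shift_frame(af_frame, by=x)        # S607: shift by the movement amount
    else:
        return None                                # movement too large for BDAF
    c0 = blur_amount(crop(img0, af_frame))
    c1 = blur_amount(crop(img1, frame))
    return distance_from_blur(c0, c1)              # distance calculation (S605)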
Fig. 17 shows an example of a computer 1200 in which a plurality of aspects of the present invention may be embodied in whole or in part. A program installed in the computer 1200 can cause the computer 1200 to function as one or more "units" of, or as the operations associated with, the device according to the embodiments of the present invention. Alternatively, the program can cause the computer 1200 to execute those operations or those one or more "units". The program can cause the computer 1200 to execute the process according to the embodiments of the present invention or the stages of that process. Such a program can be executed by the CPU 1212 so as to cause the computer 1200 to execute specific operations associated with some or all of the blocks of the flowcharts and block diagrams described in this specification.
The computer 1200 according to the present embodiment includes a CPU 1212 and a RAM 1214, which are connected to each other via a host controller 1210. The computer 1200 also includes a communication interface 1222 and an input/output unit, which are connected to the host controller 1210 via an input/output controller 1220. The computer 1200 also includes a ROM 1230. The CPU 1212 operates in accordance with the programs stored in the ROM 1230 and the RAM 1214, thereby controlling each unit.
The communication interface 1222 communicates with other electronic devices via a network. A hard disk drive can store the programs and data used by the CPU 1212 in the computer 1200. The ROM 1230 stores a boot program and the like executed by the computer 1200 at startup, and/or programs that depend on the hardware of the computer 1200. Programs are provided via a computer-readable recording medium such as a CD-ROM, a USB memory, or an IC card, or via a network. The programs are installed in the RAM 1214 or the ROM 1230, which are also examples of computer-readable recording media, and are executed by the CPU 1212. The information processing described in these programs is read by the computer 1200 and provides cooperation between the programs and the various types of hardware resources described above. An apparatus or a method may be constituted by realizing operations on, or processing of, information in accordance with the use of the computer 1200.
For example, when communication is performed between the computer 1200 and an external device, the CPU 1212 can execute a communication program loaded into the RAM 1214 and instruct the communication interface 1222 to perform communication processing based on the processing described in the communication program. Under the control of the CPU 1212, the communication interface 1222 reads transmission data stored in a transmission buffer area provided in a recording medium such as the RAM 1214 or a USB memory, transmits the read transmission data to the network, and writes reception data received from the network into a reception buffer area or the like provided on the recording medium.
In addition, the CPU 1212 can cause all or a necessary part of a file or a database stored in an external recording medium such as a USB memory to be read into the RAM 1214, and execute various types of processing on the data in the RAM 1214. The CPU 1212 can then write the processed data back to the external recording medium.
Various types of information such as various types of programs, data, tables, and databases can be stored in the recording medium and subjected to information processing. For the data read from the RAM 1214, the CPU 1212 can execute various types of processing described throughout the present disclosure and specified by the instruction sequences of the programs, including various types of operations, information processing, condition judgments, conditional branches, unconditional branches, information retrieval/replacement, and the like, and write the results back to the RAM 1214. In addition, the CPU 1212 can retrieve information in a file, a database, or the like in the recording medium. For example, when a plurality of entries, each having an attribute value of a first attribute associated with an attribute value of a second attribute, are stored in the recording medium, the CPU 1212 can retrieve, from the plurality of entries, an entry whose attribute value of the first attribute matches a specified condition, read the attribute value of the second attribute stored in that entry, and thereby obtain the attribute value of the second attribute associated with the first attribute satisfying the predetermined condition.
The programs or software modules described above may be stored in a computer-readable storage medium on the computer 1200 or near the computer 1200. In addition, a recording medium such as a hard disk or a RAM provided in a server system connected to a dedicated communication network or the Internet can be used as the computer-readable storage medium, whereby the programs are provided to the computer 1200 via the network.
It should be noted that the operations, procedures, steps, stages, and the like of each process performed by the devices, systems, programs, and methods shown in the claims, the specification, and the drawings can be performed in any order, as long as the order is not explicitly indicated by expressions such as "before" or "prior to", and as long as the output of a preceding process is not used in a subsequent process. Even if the operational flows in the claims, the specification, and the drawings are described using expressions such as "first" and "next" for convenience, this does not mean that they must be carried out in this order.
List of reference numerals
10 UAV
20 UAV main body
30 UAV control unit
32 memory
34 communication interface
40 propulsion unit
41 GPS receiver
42 inertial measurement unit
43 magnetic compass
44 barometric altimeter
45 temperature sensor
50 gimbal
60 imaging device
100 imaging device
102 imaging unit
110 imaging control unit
112 acquisition unit
113 determination unit
114 calculation unit
116 derivation unit
120 image sensor
130 memory
140 focus control unit
200 lens unit
210 lens
212 lens moving mechanism
220 lens control unit
300 remote operation device
1200 computer
1210 host controller
1212 CPU
1214 RAM
1220 input/output controller
1222 communication interface
1230 ROM

Claims (20)

1. An imaging control device comprising:
an acquisition unit that obtains a first image and a second image, the first image being contained in a first captured image taken in a state in which an imaging surface and a lens are in a first positional relationship, and the second image being contained in a second captured image taken in a state in which the imaging surface and the lens are in a second positional relationship;
a calculation unit that calculates a blur amount of each of the first image and the second image; and
a control unit that, when a difference between the blur amount of the first image and the blur amount of the second image is equal to or greater than a first threshold, controls the positional relationship between the imaging surface and the lens based on the respective blur amounts of the first image and the second image.
2. The imaging control device according to claim 1, wherein
the acquisition unit further obtains a third image when the difference between the blur amount of the first image and the blur amount of the second image is less than the first threshold, the third image being contained in a third captured image taken in a state in which the imaging surface and the lens are in a third positional relationship,
the calculation unit further calculates a blur amount of the third image, and
the control unit, when a difference between the blur amount of the first image and the blur amount of the third image is equal to or greater than the first threshold, controls the positional relationship between the imaging surface and the lens based on the respective blur amounts of the first image and the third image.
3. The imaging control device according to claim 1, wherein
the acquisition unit further obtains a fourth image contained in the first captured image and a fifth image contained in the second captured image,
the calculation unit further calculates a blur amount of each of the fourth image and the fifth image, and
the control unit, when the difference between the blur amount of the first image and the blur amount of the second image is less than the first threshold and a difference between the blur amount of the fourth image and the blur amount of the fifth image is equal to or greater than the first threshold, controls the positional relationship between the imaging surface and the lens based on the respective blur amounts of the fourth image and the fifth image.
4. The imaging control device according to claim 1, wherein
the acquisition unit further obtains a fourth image contained in the first captured image and adjacent to the first image, and a fifth image contained in the second captured image and adjacent to the second image,
the calculation unit further calculates a blur amount of each of the fourth image and the fifth image, and
the control unit, when the difference between the blur amount of the first image and the blur amount of the second image is equal to or greater than the first threshold and a difference between the blur amount of the fourth image and the blur amount of the fifth image is equal to or greater than the first threshold, controls the positional relationship between the imaging surface and the lens based on the respective blur amounts of the first image, the second image, the fourth image, and the fifth image.
5. The imaging control device according to claim 4, further comprising:
a derivation unit that derives a first distance to a first target object contained in the first image and the second image based on the blur amount of the first image and the blur amount of the second image, and derives a second distance to a second target object contained in the fourth image and the fifth image based on the blur amount of the fourth image and the blur amount of the fifth image,
wherein the control unit controls the positional relationship between the imaging surface and the lens based on the first distance and the second distance.
6. The imaging control device according to claim 4, wherein
the acquisition unit further obtains a sixth image when the difference between the blur amount of the fourth image and the blur amount of the fifth image is less than the first threshold, the sixth image being contained in a third captured image taken in a state in which the imaging surface and the lens are in a third positional relationship,
the calculation unit further calculates a blur amount of the sixth image, and
the control unit, when the difference between the blur amount of the first image and the blur amount of the second image is equal to or greater than the first threshold and a difference between the blur amount of the fourth image and the blur amount of the sixth image is equal to or greater than the first threshold, controls the positional relationship between the imaging surface and the lens based on the respective blur amounts of the first image, the second image, the fourth image, and the sixth image.
7. The imaging control device according to claim 1, wherein
the acquisition unit further obtains a fourth image contained in the first captured image and a fifth image contained in the second captured image,
the calculation unit further calculates a blur amount of each of the fourth image and the fifth image,
the imaging control device further comprises a derivation unit that derives a first distance to a first target object contained in the first image and the second image based on the respective blur amounts of the first image and the second image, and derives a second distance to a second target object contained in the fourth image and the fifth image based on the respective blur amounts of the fourth image and the fifth image, and
the control unit, when the first distance satisfies a predetermined imaging condition, the second distance does not satisfy the imaging condition, and the difference between the blur amount of the first image and the blur amount of the second image is equal to or greater than the first threshold, controls the positional relationship between the imaging surface and the lens based on the first distance.
8. The imaging control device according to claim 7, wherein
the acquisition unit, when the first distance satisfies the imaging condition, the second distance does not satisfy the imaging condition, and the difference between the blur amount of the first image and the blur amount of the second image is less than the first threshold, further obtains a sixth image contained in a third captured image taken in a state in which the imaging surface and the lens are in a third positional relationship,
the calculation unit further calculates a blur amount of the sixth image, and
the control unit, when a difference between the blur amount of the first image and the blur amount of the sixth image is equal to or greater than the first threshold, controls the positional relationship between the imaging surface and the lens based on the respective blur amounts of the first image and the sixth image.
9. The imaging control device according to claim 1, further comprising:
a determination unit that determines a region of the second captured image corresponding to the first image by comparing a feature point contained in the first image with a feature point of the second captured image,
wherein the acquisition unit, when a difference between the position of the first image in the first captured image and the position of the region in the second captured image is equal to or less than a second threshold, obtains, as the second image for the second captured image, an image of a region of the second captured image that is in the same positional relationship as the positional relationship between the first captured image and the first image.
10. The imaging control device according to claim 9, wherein
the acquisition unit, when the difference between the position of the first image in the first captured image and the position of the region in the second captured image is greater than the second threshold, obtains the region as the second image.
11. The imaging control device according to claim 9, wherein
the acquisition unit, when the difference between the position of the first image in the first captured image and the position of the region in the second captured image is greater than the second threshold and equal to or less than a third threshold, obtains an image of the region as the second image.
12. The imaging control device according to claim 9, wherein
the determination unit determines the respective feature points based on the luminance of the first image and the luminance of the region of the second captured image.
13. The imaging control device according to claim 12, wherein
the determination unit determines a centroid of the luminance of the first image as the feature point contained in the first image, and determines a centroid of the luminance of the region of the second captured image as the feature point of the second captured image.
14. The imaging control device according to claim 1, wherein
the state in which the imaging surface and the lens are in the first positional relationship is changed to the state in which they are in the second positional relationship by changing the position of a focus lens included in the lens.
15. The imaging control device according to claim 1, wherein
the state in which the imaging surface and the lens are in the first positional relationship is changed to the state in which they are in the second positional relationship by changing the position of the imaging surface.
16. An imaging device comprising:
the imaging control device according to any one of claims 1 to 15;
an image sensor having the imaging surface; and
the lens.
17. An imaging system comprising:
the imaging device according to claim 16; and
a support mechanism that supports the imaging device.
18. A moving object that carries the imaging system according to claim 17 and moves.
19. An imaging control method comprising:
a stage of obtaining a first image and a second image, the first image being contained in a first captured image taken in a state in which an imaging surface and a lens are in a first positional relationship, and the second image being contained in a second captured image taken in a state in which the imaging surface and the lens are in a second positional relationship;
a stage of calculating a blur amount of each of the first image and the second image; and
a stage of controlling, when a difference between the blur amount of the first image and the blur amount of the second image is equal to or greater than a first threshold, the positional relationship between the imaging surface and the lens based on the respective blur amounts of the first image and the second image.
20. A program for causing a computer to execute:
a stage of obtaining a first image and a second image, the first image being contained in a first captured image taken in a state in which an imaging surface and a lens are in a first positional relationship, and the second image being contained in a second captured image taken in a state in which the imaging surface and the lens are in a second positional relationship;
a stage of calculating a blur amount of each of the first image and the second image; and
a stage of controlling, when a difference between the blur amount of the first image and the blur amount of the second image is equal to or greater than a first threshold, the positional relationship between the imaging surface and the lens based on the respective blur amounts of the first image and the second image.
CN201780002652.0A 2017-04-07 2017-04-07 Imaging control device, imaging system, moving object, imaging control method, and medium Expired - Fee Related CN108235815B (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2017/014554 WO2018185939A1 (en) 2017-04-07 2017-04-07 Imaging control device, imaging device, imaging system, mobile body, imaging control method and program

Publications (2)

Publication Number Publication Date
CN108235815A true CN108235815A (en) 2018-06-29
CN108235815B CN108235815B (en) 2020-11-13

Family

ID=62645421

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201780002652.0A Expired - Fee Related CN108235815B (en) 2017-04-07 2017-04-07 Imaging control device, imaging system, moving object, imaging control method, and medium

Country Status (3)

Country Link
JP (1) JPWO2018185939A1 (en)
CN (1) CN108235815B (en)
WO (1) WO2018185939A1 (en)

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020125390A1 (en) * 2018-12-20 2020-06-25 深圳市大疆创新科技有限公司 Lens device, camera device and moving body
WO2020192385A1 (en) * 2019-03-26 2020-10-01 深圳市大疆创新科技有限公司 Determination device, camera system, and moving object
CN111932901A (en) * 2019-05-13 2020-11-13 阿里巴巴集团控股有限公司 Road vehicle tracking detection apparatus, method and storage medium
CN112166374A (en) * 2019-04-24 2021-01-01 深圳市大疆创新科技有限公司 Control device, imaging device, mobile body, control method, and program
WO2021013143A1 (en) * 2019-07-23 2021-01-28 深圳市大疆创新科技有限公司 Apparatus, photgraphic apparatus, movable body, method, and program
CN112335227A (en) * 2019-08-21 2021-02-05 深圳市大疆创新科技有限公司 Control device, imaging system, control method, and program
WO2021031833A1 (en) * 2019-08-21 2021-02-25 深圳市大疆创新科技有限公司 Control device, photographing system, control method, and program
CN112822410A (en) * 2021-04-19 2021-05-18 浙江华创视讯科技有限公司 Focusing method, focusing device, electronic device and storage medium
WO2021204020A1 (en) * 2020-04-07 2021-10-14 深圳市大疆创新科技有限公司 Device, camera device, camera system, moving body, method, and program

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009089348A (en) * 2007-09-11 2009-04-23 Ricoh Co Ltd Electronic apparatus, imaging device, and reproducing device
CN101713902A (en) * 2008-09-30 2010-05-26 索尼株式会社 Fast camera auto-focus
CN103209297A (en) * 2012-01-13 2013-07-17 佳能株式会社 Imaging Apparatus And Method Thereof, Lens Apparatus And Method Thereof, And Imaging System
CN103502866A (en) * 2011-04-01 2014-01-08 富士胶片株式会社 Imaging device and program
US20140357967A1 (en) * 2013-05-31 2014-12-04 Gilupi Gmbh Detection Device for the In Vivo and/or In Vitro Enrichment of Sample Material
CN106303201A (en) * 2015-06-04 2017-01-04 光宝科技股份有限公司 Image capture unit and focusing method

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006086952A (en) * 2004-09-17 2006-03-30 Casio Comput Co Ltd Digital camera and program
JP5882750B2 (en) * 2012-01-13 2016-03-09 キヤノン株式会社 IMAGING DEVICE, LENS DEVICE, IMAGING DEVICE CONTROL METHOD, LENS DEVICE CONTROL METHOD, COMPUTER PROGRAM, IMAGING SYSTEM
JP6136019B2 (en) * 2014-02-03 2017-05-31 パナソニックIpマネジメント株式会社 Moving image photographing apparatus and focusing method of moving image photographing apparatus
JP6137316B2 (en) * 2014-02-26 2017-05-31 パナソニックIpマネジメント株式会社 Depth position detection device, imaging device, and depth position detection method

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009089348A (en) * 2007-09-11 2009-04-23 Ricoh Co Ltd Electronic apparatus, imaging device, and reproducing device
CN101713902A (en) * 2008-09-30 2010-05-26 索尼株式会社 Fast camera auto-focus
CN103502866A (en) * 2011-04-01 2014-01-08 富士胶片株式会社 Imaging device and program
CN103209297A (en) * 2012-01-13 2013-07-17 佳能株式会社 Imaging Apparatus And Method Thereof, Lens Apparatus And Method Thereof, And Imaging System
US20140357967A1 (en) * 2013-05-31 2014-12-04 Gilupi Gmbh Detection Device for the In Vivo and/or In Vitro Enrichment of Sample Material
CN106303201A (en) * 2015-06-04 2017-01-04 光宝科技股份有限公司 Image capture unit and focusing method

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020125390A1 (en) * 2018-12-20 2020-06-25 深圳市大疆创新科技有限公司 Lens device, camera device and moving body
CN112136067A (en) * 2018-12-20 2020-12-25 维克多哈苏有限公司 Lens device, imaging device, and moving object
WO2020192385A1 (en) * 2019-03-26 2020-10-01 深圳市大疆创新科技有限公司 Determination device, camera system, and moving object
CN112166374A (en) * 2019-04-24 2021-01-01 深圳市大疆创新科技有限公司 Control device, imaging device, mobile body, control method, and program
CN111932901A (en) * 2019-05-13 2020-11-13 阿里巴巴集团控股有限公司 Road vehicle tracking detection apparatus, method and storage medium
CN111932901B (en) * 2019-05-13 2022-08-09 斑马智行网络(香港)有限公司 Road vehicle tracking detection apparatus, method and storage medium
CN112292712A (en) * 2019-07-23 2021-01-29 深圳市大疆创新科技有限公司 Device, imaging device, moving object, method, and program
WO2021013143A1 (en) * 2019-07-23 2021-01-28 深圳市大疆创新科技有限公司 Apparatus, photgraphic apparatus, movable body, method, and program
CN112335227A (en) * 2019-08-21 2021-02-05 深圳市大疆创新科技有限公司 Control device, imaging system, control method, and program
WO2021031833A1 (en) * 2019-08-21 2021-02-25 深圳市大疆创新科技有限公司 Control device, photographing system, control method, and program
WO2021204020A1 (en) * 2020-04-07 2021-10-14 深圳市大疆创新科技有限公司 Device, camera device, camera system, moving body, method, and program
CN112822410A (en) * 2021-04-19 2021-05-18 浙江华创视讯科技有限公司 Focusing method, focusing device, electronic device and storage medium
CN112822410B (en) * 2021-04-19 2021-06-22 浙江华创视讯科技有限公司 Focusing method, focusing device, electronic device and storage medium

Also Published As

Publication number Publication date
WO2018185939A1 (en) 2018-10-11
CN108235815B (en) 2020-11-13
JPWO2018185939A1 (en) 2019-04-11

Similar Documents

Publication Publication Date Title
CN108235815A (en) Video camera controller, photographic device, camera system, moving body, camera shooting control method and program
CN113038016B (en) Unmanned aerial vehicle image acquisition method and unmanned aerial vehicle
US11722647B2 (en) Unmanned aerial vehicle imaging control method, unmanned aerial vehicle imaging method, control terminal, unmanned aerial vehicle control device, and unmanned aerial vehicle
CN110494360A (en) For providing the autonomous system and method photographed and image
JP6943988B2 (en) Control methods, equipment and systems for movable objects
CN106161892A (en) Photographic attachment and attitude control method thereof and unmanned vehicle
US20210112194A1 (en) Method and device for taking group photo
US20210109312A1 (en) Control apparatuses, mobile bodies, control methods, and programs
CN111567032B (en) Specifying device, moving body, specifying method, and computer-readable recording medium
JP2019110462A (en) Control device, system, control method, and program
CN107580161A (en) Photographic equipment and method, travelling shot device, photography moving body and its control device
CN111344650B (en) Information processing device, flight path generation method, program, and recording medium
JP6630939B2 (en) Control device, imaging device, moving object, control method, and program
JP6790318B2 (en) Unmanned aerial vehicles, control methods, and programs
JP6503607B2 (en) Imaging control apparatus, imaging apparatus, imaging system, moving object, imaging control method, and program
US20210112202A1 (en) Control apparatuses, mobile bodies, control methods, and programs
JP6515423B2 (en) CONTROL DEVICE, MOBILE OBJECT, CONTROL METHOD, AND PROGRAM
CN111602385B (en) Specifying device, moving body, specifying method, and computer-readable recording medium
CN110337609A (en) Control device, lens assembly, photographic device, flying body and control method
CN110392891A (en) Mobile's detection device, control device, moving body, movable body detecting method and program
US20210092282A1 (en) Control device and control method
CN109844634A (en) Control device, photographic device, flying body, control method and program
JP6569157B1 (en) Control device, imaging device, moving object, control method, and program
JP6818987B1 (en) Image processing equipment, imaging equipment, moving objects, image processing methods, and programs
US20210218879A1 (en) Control device, imaging apparatus, mobile object, control method and program

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20201113

CF01 Termination of patent right due to non-payment of annual fee