CN107529006A - Camera device - Google Patents

Camera device

Info

Publication number
CN107529006A
CN107529006A (application CN201710464425.1A)
Authority
CN
China
Prior art keywords
image
view data
camera device
lens
region
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201710464425.1A
Other languages
Chinese (zh)
Inventor
新谷浩一
川口胜久
富泽将臣
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Olympus Corp
Original Assignee
Olympus Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Olympus Corp filed Critical Olympus Corp
Publication of CN107529006A publication Critical patent/CN107529006A/en
Pending legal-status Critical Current


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T1/00 General purpose image data processing
    • G06T1/20 Processor architectures; Processor configuration, e.g. pipelining
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B7/00 Mountings, adjusting means, or light-tight connections, for optical elements
    • G02B7/02 Mountings, adjusting means, or light-tight connections, for optical elements for lenses
    • G02B7/04 Mountings, adjusting means, or light-tight connections, for optical elements for lenses with mechanism for focusing or varying magnification
    • G02B7/09 Mountings, adjusting means, or light-tight connections, for optical elements for lenses with mechanism for focusing or varying magnification adapted for automatic focusing or varying magnification
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B7/00 Mountings, adjusting means, or light-tight connections, for optical elements
    • G02B7/28 Systems for automatic generation of focusing signals
    • G02B7/36 Systems for automatic generation of focusing signals using image sharpness techniques, e.g. image processing techniques for generating autofocus signals
    • G02B7/38 Systems for automatic generation of focusing signals using image sharpness techniques, e.g. image processing techniques for generating autofocus signals measured at different points on the optical axis, e.g. focussing on two or more planes and comparing image data
    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B13/00 Viewfinders; Focusing aids for cameras; Means for focusing for cameras; Autofocus systems for cameras
    • G03B13/32 Means for focusing
    • G03B13/34 Power focusing
    • G03B13/36 Autofocus systems
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00 Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/50 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding
    • H04N19/503 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding involving temporal prediction
    • H04N19/51 Motion estimation or motion compensation
    • H04N19/513 Processing of motion vectors
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/63 Control of cameras or camera modules by using electronic viewfinders
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/67 Focus control based on electronic image sensor signals
    • H04N23/673 Focus control based on electronic image sensor signals based on contrast or high frequency components of image signals, e.g. hill climbing method
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/68 Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
    • H04N23/682 Vibration or motion blur correction
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80 Camera processing pipelines; Components thereof
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80 Camera processing pipelines; Components thereof
    • H04N23/81 Camera processing pipelines; Components thereof for suppressing or minimising disturbance in the image signal generation
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/95 Computational photography systems, e.g. light-field imaging systems
    • H04N23/951 Computational photography systems, e.g. light-field imaging systems by using two or more images to influence resolution, frame rate or aspect ratio
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 Details of television systems
    • H04N5/222 Studio circuitry; Studio devices; Studio equipment
    • H04N5/262 Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects; Cameras specially adapted for the electronic generation of special effects

Abstract

A camera device is provided. The camera device (1) has: a photographing element (202) that repeatedly images a subject and obtains multiple frames of image data of the subject; a photographic lens (102) that forms an image of the subject on the photographing element (202) and includes a focus lens; an AF control unit (2143) that controls the focus state of the photographic lens (102) by controlling the driving of the focus lens; an image combining unit (2041) that, for the multiple frames of image data obtained by repeated imaging with the photographing element (202) while the focus lens is being driven, combines, into the regions of the current frame's image data in which the blur amount is large, the corresponding regions of the previous frame's image data in which the blur amount is small; and a display part (206) that displays the image combined by the image combining unit (2041).

Description

Camera device
Technical field
The present invention relates to a camera device having an AF function.
Background technology
As an automatic focus adjustment technique for the photographic lens of a camera device, the contrast AF method is known. The contrast AF method adjusts the focus lens according to a contrast value, which is calculated from an image signal generated by a photographing element from the light beam received through the photographic lens. In the contrast AF method, wobbling driving is performed, for example, in order to determine the driving direction of the focus lens toward the in-focus position. Wobbling driving is a technique in which the focus lens is driven minutely toward the near side and toward the infinity side, and the contrast value obtained when driving toward the near side is compared with that obtained when driving toward the infinity side, thereby determining the driving direction of the focus lens toward the in-focus position. In recent camera devices, contrast AF with wobbling driving is sometimes performed even during live view display and the like. In wobbling driving, in order to compare contrast values, the focus lens is also driven in the direction in which the contrast value decreases, that is, the direction in which blur increases. Therefore, when lens driving such as wobbling driving is performed during live view display, blur accompanying the lens driving may appear in the displayed image. The camera device of Japanese Unexamined Patent Publication No. 2011-43646 performs repair processing on the image blur accompanying wobbling driving when AF with wobbling driving is performed during such live view display. Specifically, in the camera device of Japanese Unexamined Patent Publication No. 2011-43646, as the blur repair processing, a blur function representing the blur is generated from various information relating to the wobbling, and the blur is repaired by applying deconvolution processing based on the blur function.
In Japanese Unexamined Patent Publication No. 2011-43646, the blur repair is performed by processing with a comparatively large computational load, namely the generation of the blur function and the deconvolution processing. When such processing is applied to the entire image, a heavy circuit load is easily incurred.
Summary of the invention
The present invention has been made in view of the above circumstances, and its object is to provide a camera device that can, by comparatively simple computation, suppress deterioration of the displayed image when lens driving is performed during live view display and the like, and deterioration of the recorded image when lens driving is performed during a shooting operation.
In order to achieve the above object, a camera device according to one aspect of the present invention has: a photographing element that repeatedly images a subject and obtains multiple frames of image data relating to the subject; a photographic lens that forms an image of the subject on the photographing element and includes a focus lens; an AF control unit that controls the focus state of the photographic lens by controlling the driving of the focus lens; an image combining unit that, for the multiple frames of image data obtained by the photographing element through repeated imaging while the focus lens is being driven, combines, into the regions of the current frame's image data in which the blur amount is large, the corresponding regions, in which the blur amount is small, of the previous frame's image data obtained before the current frame's image data; and a display part that displays the image combined by the image combining unit.
Brief description of the drawings
Fig. 1 is a diagram showing the structure of the camera device according to the 1st embodiment of the present invention.
Fig. 2 is a flowchart showing the photographing processing of the camera device according to the embodiments of the present invention.
Fig. 3 is a flowchart showing the image synthesis processing in the 1st embodiment.
Fig. 4A and Fig. 4B are diagrams for explaining the process of extracting regions with a large contrast change.
Fig. 5 is a diagram for explaining the image combining.
Fig. 6 is a diagram showing, in a comparative manner, the contrast in the example of Fig. 5 when the image combining is performed and when it is not performed.
Fig. 7 is a flowchart showing the image synthesis processing in the 2nd embodiment.
Fig. 8 is a diagram for explaining a modification.
Embodiment
Embodiments of the present invention will be described below with reference to the drawings.
[the 1st embodiment]
First, the 1st embodiment of the present invention will be described. Fig. 1 is a diagram showing the structure of the camera device according to the 1st embodiment of the present invention. The camera device 1 shown in Fig. 1 has an interchangeable lens 100 and a camera device main body 200. The interchangeable lens 100 is configured to be attachable to and detachable from the camera device main body 200. When the interchangeable lens 100 is mounted on the camera device main body 200, the interchangeable lens 100 and the camera device main body 200 are connected so as to be able to communicate freely. Note that the camera device 1 does not have to be a lens-interchangeable camera device. For example, the camera device 1 may be a lens-integrated camera device.
The interchangeable lens 100 has a photographic lens 102, a focus lens control unit 104, and a communication unit 106.
The photographic lens 102 has an aperture, lens elements and the like, and causes the light beam from a subject (not shown) to be incident on the photographing element 202 of the camera device main body 200. The photographic lens 102 includes a focus lens for adjusting the focus state.
Under the control of the control unit 214 of the camera device main body 200, the focus lens control unit 104 drives the focus lens in the optical axis direction, thereby adjusting the focus state of the photographic lens 102.
When the interchangeable lens 100 is mounted on the camera device main body 200, the communication unit 106 is connected to the communication unit 212 of the camera device main body 200 and mediates communication between the interchangeable lens 100 and the camera device main body 200.
The camera device main body 200 has a photographing element 202, an image processing part 204, a display part 206, a recording part 208, a motion detecting part 210, a communication unit 212, and a control unit 214.
The photographing element 202 images the subject and obtains image data of the subject.
The image processing part 204 applies various kinds of image processing to the image data obtained by the photographing element 202. This image processing includes white balance processing, gradation correction processing and the like. The image processing part 204 also includes an image combining unit 2041. The image combining unit 2041 performs image synthesis processing in which multiple frames of image data are combined.
The display part 206 is, for example, a liquid crystal display or an organic EL display, and displays various images, such as the image based on the image data obtained by the photographing element 202.
The recording part 208 is a nonvolatile memory such as a flash memory, and is a recording medium for recording various data. For example, image files obtained as a result of photographing operations are recorded in the recording part 208. Programs used for controlling the camera device main body 200 are also recorded in the recording part 208.
The motion detecting part 210 detects the motion of the subject between frames of image data obtained by the photographing element 202, for example by detecting motion vectors between the frames of image data.
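The patent does not specify how the motion detecting part 210 computes motion vectors; a common approach is block matching on luminance data. The following is a minimal illustrative sketch under that assumption (the function name and parameters are not from the patent), estimating one vector for the central block by minimising the sum of absolute differences (SAD):

```python
import numpy as np

def motion_vector(prev, curr, block=8, search=4):
    """Estimate a dominant motion vector between two grayscale frames by
    exhaustive block matching: for the central block of the current frame,
    find the offset in the previous frame with the minimum SAD."""
    h, w = curr.shape
    y0, x0 = (h - block) // 2, (w - block) // 2
    target = curr[y0:y0 + block, x0:x0 + block].astype(np.int32)
    best, best_sad = (0, 0), None
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            y, x = y0 + dy, x0 + dx
            if y < 0 or x < 0 or y + block > h or x + block > w:
                continue  # candidate window would fall outside the frame
            cand = prev[y:y + block, x:x + block].astype(np.int32)
            sad = int(np.abs(target - cand).sum())
            if best_sad is None or sad < best_sad:
                best_sad, best = sad, (dy, dx)
    return best  # (dy, dx): offset of the block's source in the previous frame
```

A real implementation would estimate a vector per block over the whole frame; one central block keeps the sketch short.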
When the interchangeable lens 100 is mounted on the camera device main body 200, the communication unit 212 is connected to the communication unit 106 of the interchangeable lens 100 and mediates communication between the interchangeable lens 100 and the camera device main body 200.
The control unit 214 is a control circuit such as a CPU or an ASIC, and performs overall control of the operation of the camera device main body 200. The control unit 214 has a function as an imaging control part 2141, a function as a display control part 2142, and a function as an AF control unit 2143. Here, each function of the control unit 214 may be realized by a single piece of hardware or software, or by multiple pieces of hardware or software. Some of the functions may also be provided separately from the control unit 214.
The imaging control part 2141 controls the imaging operation of the photographing element 202 with respect to the subject and the operation of reading image data from the photographing element 202. The display control part 2142 performs control when various images are displayed on the display part 206. The AF control unit 2143 performs AF control, for example by a contrast AF method with wobbling driving (hereinafter referred to as wobbling AF control (Wob AF)). Wobbling AF control is a technique in which the focus lens is driven minutely toward the near side and the infinity side, and the contrast values at the near end and the infinity end of the driving amplitude are compared, thereby determining the driving direction of the focus lens toward the in-focus position. The AF control unit 2143 may also be configured to perform AF control by a phase difference AF method.
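As an illustration of the direction decision described above, the following sketch compares the contrast values sampled at the two ends of one wobble cycle; the function name, the equality tolerance and the return convention are assumptions for illustration only, not the patent's implementation:

```python
def wobble_decision(contrast_near, contrast_inf, eq_tol=0.02):
    """Decide the focus-lens drive direction from the contrast values
    sampled at the near end and the infinity end of one wobble cycle.
    Returns 'near', 'infinity', or 'stop' (peak considered detected
    when the two values are approximately equal)."""
    avg = (contrast_near + contrast_inf) / 2.0
    if avg > 0 and abs(contrast_near - contrast_inf) / avg <= eq_tol:
        return "stop"  # roughly equal contrast on both sides: at the peak
    # drive toward whichever side showed the higher contrast
    return "near" if contrast_near > contrast_inf else "infinity"
```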
Next, the operation of the camera device of the present embodiment will be described. Fig. 2 is a flowchart showing the photographing processing of the camera device of the present embodiment. The control unit 214 performs the processing of Fig. 2. Here, it is assumed that the camera device 1 has a still image mode and a moving image mode as photographing modes. The still image mode is a photographing mode for recording still images. The moving image mode is a photographing mode for recording moving images. These photographing modes are set by the user's operation of an operating part (not shown). The camera device may also have modes other than the photographing modes, for example a playback mode.
The processing of Fig. 2 is started, for example, when the power of the camera device 1 is turned on. In step S1, the control unit 214 determines whether to start moving image photography with the camera device 1. For example, when an operating member (not shown) for instructing the start of moving image photography is operated, it is determined that moving image photography is to be started. When it is determined in step S1 that moving image photography is not to be started, that is, that still image photography is to be started, the processing proceeds to step S2. When it is determined in step S1 that moving image photography is to be started, the processing proceeds to step S3.
In step S2, the control unit 214 performs the still image photographing processing, which is briefly described here. In the still image photographing processing, the control unit 214 drives the focus lens to the in-focus position by Wob AF or the like. Then, upon receiving the user's operation to start still image photography, the control unit 214 operates the photographing element 202 to perform still image photography, and records the still image file obtained by the still image photography in the recording part 208. After the still image photographing processing, the processing of Fig. 2 ends.
In step S3, the control unit 214 determines whether the current blur amount is large. In the contrast AF method, the blur amount is represented by the contrast value, which represents the contrast of the subject. The contrast value is calculated by accumulating the high-frequency components of the image data corresponding to the AF area set in the photographing region. A larger contrast value indicates a state in which the contrast of the subject is higher (a state in which the blur amount is smaller). Therefore, in step S3, for example, it is determined whether the contrast value is equal to or smaller than a threshold value. When it is determined in step S3 that the contrast value is equal to or smaller than the threshold value, that is, that the blur amount is large (for example, an out-of-focus state), wobbling AF control would take time to reach the in-focus position, so the processing proceeds to step S4. When it is determined in step S3 that the contrast value is larger than the threshold value, that is, that the blur amount is small (for example, an in-focus state), the processing proceeds to step S6 in order to perform wobbling AF control. As for the threshold value, a prescribed initial value is set when the power is turned on. Alternatively, the peak value regarded as the in-focus value, obtained after executing the scan focusing control described later, may be stored and set as the threshold value.
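The accumulation of high-frequency components described above can be sketched as follows; a simple horizontal first difference stands in for whatever high-pass filter an actual AF circuit would use, and the helper name and AF-area convention are assumptions:

```python
import numpy as np

def contrast_value(frame, af_area):
    """Compute a contrast AF evaluation value by accumulating the
    absolute horizontal pixel differences (a crude high-pass filter)
    inside the AF area. frame: 2-D grayscale array,
    af_area: (top, left, height, width)."""
    t, l, h, w = af_area
    roi = frame[t:t + h, l:l + w].astype(np.int64)
    return int(np.abs(np.diff(roi, axis=1)).sum())
```

A sharply focused subject yields large neighbouring-pixel differences and hence a large value; a blurred one yields a small value.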
In step S4, the control unit 214 performs scan focusing control. In the scan focusing control, the control unit 214 sends an instruction to the focus lens control unit 104 so as to continuously drive the focus lens in whichever of the near direction and the infinity direction increases the contrast value. Then, the control unit 214 calculates contrast values from the image data obtained by the photographing element 202 during the continuous driving of the focus lens. The control unit 214 then sends an instruction to the focus lens control unit 104 to stop the focus lens, with the position at which the contrast value peaks (the peak position) taken as the in-focus position. Here, when a change of the discretely obtained contrast values from increasing to decreasing is detected, the peak position is calculated by interpolation from the lens position corresponding to the peak of the discrete contrast values and the lens positions before and after it.
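The patent does not give the interpolation formula, so the refinement of the peak position from the discrete contrast peak and its two neighbouring samples is illustrated here with a standard three-point parabolic fit, assuming evenly spaced lens positions; this is one plausible sketch, not the patent's method:

```python
def interpolate_peak(p_prev, p_peak, p_next, c_prev, c_peak, c_next):
    """Refine the in-focus lens position by fitting a parabola through
    the discrete contrast sample at the detected peak and its two
    neighbours, returning the vertex position. Assumes the three lens
    positions p_prev, p_peak, p_next are evenly spaced."""
    step = p_peak - p_prev
    denom = c_prev - 2.0 * c_peak + c_next
    if denom == 0:
        return p_peak  # flat top: keep the sampled peak position
    # vertex offset of the parabola, in units of the sampling step
    offset = 0.5 * (c_prev - c_next) / denom
    return p_peak + offset * step
```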
In step S5, the control unit 214 stores the peak value of the contrast value in a temporary storage part (not shown). Then, the processing proceeds to step S12.
When the blur amount is small in step S3, that is, in the in-focus state, in step S6 the control unit 214 causes the photographing element 202 to perform an imaging operation. The control unit 214 then stores the image data obtained by the imaging operation in the temporary storage part (not shown). In step S7, the control unit 214 calculates the contrast value from the image data obtained by the photographing element 202.
In step S8, the control unit 214 determines whether the calculated contrast value has changed relative to the peak contrast value temporarily stored in step S5. Here, the temporarily stored peak contrast value is the peak value obtained by the scan focusing control, stored as the contrast value in the in-focus state. When it is determined in step S8 that the contrast value has changed, that is, that the state has changed from the in-focus state to an out-of-focus state, the processing proceeds to step S9. When it is determined in step S8 that the contrast value has not changed, that is, that the in-focus state is maintained, the processing proceeds to step S12.
In step S9, the control unit 214 starts wobbling AF control. The control unit 214 sends an instruction to the focus lens control unit 104 to start driving the focus lens minutely toward the near direction and the infinity direction. While the focus lens is driven minutely, the control unit 214 calculates contrast values from the image data obtained by the photographing element 202 at the near end and the infinity end of the minute driving amplitude. The control unit 214 then determines the driving direction of the focus lens by comparing the two contrast values. Further, when the comparison of the contrast values detects that the two contrast values are approximately equal, that is, that the peak position has been detected, the control unit 214 sends an instruction to the focus lens control unit 104 to stop the focus lens. After the driving direction of the focus lens is determined, the processing proceeds to step S10. Here, the driving cycle of the focus lens in the wobbling AF control of the present embodiment may be, for example, a cycle in which one driving-direction determination can be made in two frames.
In step S10, the control unit 214 performs the image synthesis processing. After the image synthesis processing, the processing proceeds to step S11. When wobbling AF control is started, the lens driving may deteriorate the image quality of the displayed image or the recorded moving image. The image synthesis processing is performed in order to suppress this image quality deterioration. The image synthesis processing is described in detail below.
In step S11, the control unit 214 displays on the display part 206 the image based on the image data obtained after the image synthesis processing, as the live view image. Then, in step S16, the control unit 214 performs moving image recording. That is, the control unit 214 records the image data obtained after the image synthesis processing in the recording part 208 as a moving image file. After the moving image recording, the processing returns to step S3.
In step S12, the control unit 214 performs normal image processing using the image processing part 204. The normal image processing is image processing without the image synthesis processing, and includes processing required for image display, such as white balance processing and gradation correction processing. After the normal image processing, the processing proceeds to step S13. The normal image processing is performed, for example, when the focus lens is at the in-focus position. In this case, there is no lens driving accompanying wobbling AF control, so the normal image processing is performed.
In step S13, the control unit 214 displays on the display part 206 the image of the image data obtained after the normal image processing, as the live view image. Then, the processing proceeds to step S14.
In step S14, the control unit 214 performs moving image recording. That is, the control unit 214 records the image data obtained from the photographing element 202 in the recording part 208 as a moving image file. After the moving image recording, the processing proceeds to step S15.
In step S15, the control unit 214 determines whether to end the moving image photography of the camera device 1. For example, when an operating member (not shown) for instructing the end of moving image photography is operated, it is determined that the moving image photography is to be ended. When it is determined in step S15 that the moving image photography is not to be ended, the processing returns to step S3. When it is determined in step S15 that the moving image photography is to be ended, the processing of Fig. 2 ends.
Fig. 3 is a flowchart showing the image synthesis processing in the 1st embodiment. In step S101, the control unit 214 obtains the image data that was obtained in step S6 one frame earlier. That is, the control unit 214 reads the image data stored in the temporary storage part.
In step S102, the control unit 214 uses the motion detecting part 210 to detect the motion vector of the subject between the image data of the current frame and the image data of the previous frame.
In step S103, after aligning the image data of the current frame and the image data of the previous frame according to the motion vector, the control unit 214 compares the image data of the current frame with the image data of the previous frame and extracts the regions of the current frame's image data in which the change of the contrast value is large. For example, the control unit 214 extracts, from the entire area of the current frame's image data, the regions in which the contrast value has changed by 10% or more relative to the contrast value of the previous frame's image data. In wobbling AF control, the position of the focus lens changes from moment to moment. Accordingly, the contrast of the subject shown in the region 302 of Fig. 4A and the region 304 of Fig. 4B changes. In step S103, regions such as the region 302 of Fig. 4A and the region 304 of Fig. 4B are extracted. It is estimated that the subject exists in those regions.
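A minimal sketch of this region extraction, assuming the frames are already aligned by the motion vector and using block-wise local contrast with the 10% threshold mentioned above (the tile size and the contrast measure are illustrative assumptions):

```python
import numpy as np

def changed_regions(prev, curr, block=16, thresh=0.10):
    """Split both aligned frames into block x block tiles and flag the
    tiles whose local contrast changed by thresh (10%) or more between
    frames. Local contrast per tile: sum of absolute horizontal pixel
    differences. Returns a boolean tile mask."""
    h, w = curr.shape
    rows, cols = h // block, w // block
    mask = np.zeros((rows, cols), dtype=bool)
    for r in range(rows):
        for c in range(cols):
            ys = slice(r * block, (r + 1) * block)
            xs = slice(c * block, (c + 1) * block)
            cp = np.abs(np.diff(prev[ys, xs].astype(np.int64), axis=1)).sum()
            cc = np.abs(np.diff(curr[ys, xs].astype(np.int64), axis=1)).sum()
            # relative contrast change against the previous frame's tile
            if cp > 0 and abs(int(cc) - int(cp)) / cp >= thresh:
                mask[r, c] = True
    return mask
```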
In step S104, the control unit 214 performs image feature point matching between the image data of the current frame and the image data of the previous frame. Then, the control unit 214 sets the regions in which the similarity between the image data of the current frame and the image data of the previous frame is low as regions to be excluded from the subsequent image combining. The image feature point matching is performed by calculating the difference of image feature amounts (for example, pixel values) between corresponding pixels of the two frames of image data. When the difference is small, the similarity between the pixels is high.
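The similarity check based on pixel-value differences might be sketched as a per-pixel exclusion mask like the following; the threshold value and function name are assumptions, and a real implementation would likely work on feature points or blocks rather than raw pixels:

```python
import numpy as np

def exclusion_mask(prev, curr, max_diff=30):
    """Per-pixel similarity check used to exclude regions from the
    combining step: pixels whose absolute value difference between the
    aligned previous and current frames exceeds max_diff are treated as
    low-similarity (e.g. a subject that changed shape) and excluded."""
    diff = np.abs(curr.astype(np.int32) - prev.astype(np.int32))
    return diff > max_diff  # True = exclude from combining
```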
In step S105, the control unit 214 determines whether the image data of the current frame is in the in-focus state. That is, the control unit 214 determines whether the peak position has been detected by the wobbling AF control. When it is determined in step S105 that the image data of the current frame is in the in-focus state, the processing proceeds to step S106. When it is determined in step S105 that the image data of the current frame is not in the in-focus state, the processing proceeds to step S107.
In step S106, the control unit 214 performs only the normal image processing, without the image combining described in step S107. Then, the processing of Fig. 3 ends. In step S106, no blur occurs in the image, so the image combining for suppressing blur is unnecessary.
In step S107, the control unit 214 performs the image combining using the image processing part 204. Then, the processing of Fig. 3 ends. The image combining will be described below. Fig. 5 is a diagram for explaining the image combining. In the following example, the user performs moving image photography with a bird flying away as the subject.
Fig. 5(a) is a diagram showing the relation among the subject position, the lens position (the photographing distance corresponding to the lens position) and the blur amount with respect to the passage of time during the wobbling AF control. At the start of the wobbling AF control, the focus lens is focused on the bird as the subject, so there is no blur. Then, because the bird flies up, the subject position changes toward the far side with the passage of time, as shown in Fig. 5(a). This produces a change of the contrast value, so wobbling AF control is performed so as to follow the motion of the bird. As shown in Fig. 5(a), in the wobbling AF control, the lens is driven in both the near direction and the infinity direction, and the lens is driven so that it finally focuses on the subject. Therefore, as shown in Fig. 5(a), the blur amount also changes with the change of the lens position. That is, if the difference between the subject position and the lens position is large, the blur amount is "large"; if the difference between the subject position and the lens position is small, the blur amount is "small"; and if there is no difference between the subject position and the lens position, there is "no" blur.
Fig. 5(b) is a diagram showing the imaging timings during the wobbling AF control. The imaging during the wobbling AF control is performed according to an imaging synchronizing signal synchronized with the lens driving timing. Therefore, when imaging is performed at a timing when the blur amount is large, the blur amount of the resulting image data is large, and when imaging is performed at a timing when the blur amount is small, the blur amount of the resulting image data is small. In the example of Fig. 5(b), the blur state changes continuously. Therefore, the blur state changes continuously in the live view image, which gives the user a sense of unnaturalness.
Fig. 5(c) is a diagram showing the concept of the image combining. The image combining is performed by replacing the blurred regions of the current frame's image data with the corresponding non-blurred regions of the previous frame's image data. Therefore, the image combining is performed from the 2nd frame onward. The blurred regions and non-blurred regions in the image data are the regions extracted in step S103 of Fig. 3 as regions with a large change of the contrast value.
For example, when the image synthesis for the 2nd frame image data to Fig. 5 (c) illustrates, to from the 2nd frame figure When being compared as region that extracting data goes out and the region extracted from the 1st frame image data, and from the 2nd two field picture number The region extracted in is compared, and the fuzzy quantity in the region extracted from the 1st frame image data is small.Therefore, substantially, image Processing unit 204 is extracted using the small regional replacement of the fuzzy quantity extracted from the 1st frame image data from the 2nd frame image data The big region of the fuzzy quantity that goes out.But in the region to being extracted from the view data of the 2nd frame and the picture number from the 1st frame When the region extracted in is compared, in the 2nd frame and the 1st frame, the state spread the wings of bird is different.Therefore, in Fig. 3 step It is exclusion object when image synthesizes by the section sets of the wing of bird in rapid S104.Therefore, as shown in Fig. 5 (c), image Processing unit 204 does not utilize the wing of the regional replacement bird extracted from the 1st frame image data, and utilizes from the 1st two field picture number The region of the regional replacement extracted in addition.
Also, when the image synthesis for the 3rd frame image data to Fig. 5 (c) illustrates, to from the 3rd frame figure When being compared as region that extracting data goes out and the region extracted from the 2nd frame image data, and from the 3rd two field picture number The region extracted in is compared, and the fuzzy quantity in the region extracted from the 2nd frame image data is small.Therefore, substantially, image Processing unit 204 is extracted using the small regional replacement of the fuzzy quantity extracted from the 2nd frame image data from the 3rd frame image data The big region of the fuzzy quantity that goes out.It is different during from the image synthesis for the 2nd frame image data, to from the 3rd frame image data When the region extracted and the region extracted from the 2nd frame image data are compared, in the 3rd frame and the 2nd frame, the shape of bird State is roughly the same.Therefore, as shown in Fig. 5 (c), image processing part 204 utilizes the region extracted from the 2nd frame image data Replace the Zone Full extracted from the 3rd frame image data.
On the other hand, in the image synthesis for the 4th frame and the 5th frame image data of Fig. 5 (c), with previous frame View data is compared, and the fuzzy quantity of the view data of present frame is small.Therefore, image processing part 204 synthesizes without image.
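The replacement synthesis walked through above (steps S103-S104, Fig. 5(c)) can be sketched in code. This is an illustrative sketch, not the patent's implementation: `sharpness` is a hypothetical contrast-based stand-in for the blur-amount measure, and the mean-intensity check is a crude stand-in for the feature-amount comparison that excludes changed regions such as the bird's wings.

```python
import numpy as np

def sharpness(region):
    """Local contrast (mean squared gradient) as a simple proxy for the
    blur amount: lower contrast = more blur. Hypothetical measure."""
    gy, gx = np.gradient(region.astype(float))
    return float(np.mean(gx**2 + gy**2))

def synthesize(prev_frame, cur_frame, tile=32, feat_thresh=20.0):
    """Replace tiles of the current frame that are blurrier than the
    corresponding tiles of the previous frame, skipping tiles whose
    mean intensity differs too much (a crude feature-difference test
    standing in for the exclusion of step S104)."""
    out = cur_frame.copy()
    h, w = cur_frame.shape
    for y in range(0, h - tile + 1, tile):
        for x in range(0, w - tile + 1, tile):
            cur = cur_frame[y:y+tile, x:x+tile]
            prev = prev_frame[y:y+tile, x:x+tile]
            # Exclude tiles where the subject itself changed (e.g. the
            # bird's wings in Fig. 5(c)): feature amounts differ too much.
            if abs(float(cur.mean()) - float(prev.mean())) > feat_thresh:
                continue
            # Replace only where the previous frame's tile is sharper.
            if sharpness(prev) > sharpness(cur):
                out[y:y+tile, x:x+tile] = prev
    return out
```

As in the 4th and 5th frames of Fig. 5(c), a tile is left untouched whenever the current frame is already the sharper of the two.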
Fig. 6 compares the contrast in the example of Fig. 5 with and without the image synthesis. As shown in Fig. 6, the image synthesis improves the contrast (that is, reduces the blur) of the 2nd and 3rd frames.
As described above, according to the present embodiment, even when live view display is performed during AF, when an image containing blur would otherwise be displayed, image synthesis is performed that replaces the blurred part of the current frame with the corresponding blur-free part of the previous frame. This simple processing reduces the blur amount of the live view display during AF.
Furthermore, a part whose image feature amount differs between the two frames of image data is excluded from the target of the image synthesis even if its blur amount is large. This suppresses blur over the whole image while avoiding the situation in which the replacement itself makes the image look more unnatural.
Here, the image synthesis in this embodiment is a replacement process. Alternatively, the synthesis may take the additive average of the current frame's image data and the previous frame's image data, or may be performed by weighted addition according to the blur amount. The weight coefficients of this weighted addition take values of, for example, 0 to 1, and are set so that the weight coefficient for the image data with the smaller blur amount of the two frames is large and the weight coefficient for the image data with the larger blur amount is small.
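A minimal sketch of the blur-weighted addition described above, under the assumption that scalar blur amounts are already available for both frames. The function name and weighting formula are illustrative, not from the patent; the patent only requires weights in 0 to 1 with the smaller-blur frame weighted more heavily.

```python
import numpy as np

def blend_by_blur(prev_frame, cur_frame, blur_prev, blur_cur):
    """Weighted addition in which the frame with the smaller blur amount
    receives the larger weight; the weights lie in [0, 1] and sum to 1.
    blur_prev / blur_cur are scalar blur amounts measured elsewhere
    (e.g. from contrast)."""
    total = blur_prev + blur_cur
    if total == 0:                 # both perfectly sharp: plain average
        w_cur = 0.5
    else:
        w_cur = blur_prev / total  # large blur_prev -> trust current frame
    w_prev = 1.0 - w_cur
    return w_cur * cur_frame.astype(float) + w_prev * prev_frame.astype(float)
```

With equal blur amounts this degenerates to the additive average also mentioned above.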
In the example described, a two-frame synthesis between the current frame's image data and the image data of the immediately preceding frame has been explained. However, a synthesis of three or more frames, such as the current frame's image data together with the image data of the two preceding frames, may also be performed.
[2nd Embodiment]
Next, the 2nd embodiment of the present invention will be described. The 2nd embodiment is a modification of the image synthesis processing. In the 1st embodiment, a part whose image feature amount differs between the two frames of image data is unconditionally excluded from the target of the image synthesis. In contrast, the 2nd embodiment is an example in which the image synthesis is performed by weighted addition according to the similarity of the image feature amounts.
Fig. 7 is a flowchart showing the image synthesis processing in the 2nd embodiment. In Fig. 7, descriptions of parts identical to Fig. 3 are omitted. That is, steps S201-S206 of Fig. 7 are identical to steps S101-S106 of Fig. 3 and are therefore not described.
In step S207, entered when step S205 determines that the current frame's image data is not in focus, the control unit 214 determines whether the region extracted from the current frame's image data and the region extracted from the previous frame's image data are similar. The two frames of image data are judged similar by, for example, determining whether the similarity between them (computed, for example, from the difference of the image feature amounts) is at least a predetermined value; if so, they are determined to be similar. If step S207 determines that the region extracted from the current frame's image data and the region extracted from the previous frame's image data are similar, the processing proceeds to step S208. If step S207 determines that they are dissimilar, the processing proceeds to step S209.
In step S208, the control unit 214 performs image synthesis using the image processing unit 204, and the processing of Fig. 7 then ends. This image synthesis is identical to that of step S107 of the 1st embodiment, so its description is omitted.
In step S209, the control unit 214 performs image synthesis according to the similarity, using the image processing unit 204, and the processing of Fig. 7 then ends. This image synthesis is performed by weighted addition according to the similarity. The weight coefficients of this weighted addition take values of, for example, 0 to 1, and are set so that the lower the similarity, the larger the weight coefficient for the current frame's image data and the smaller the weight coefficient for the previous frame's image data. That is, when the similarity is low, the current frame's image data is given priority. Furthermore, when the region extracted from the current frame's image data and the region extracted from the previous frame's image data are completely dissimilar, they are excluded from the synthesis target.
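A sketch of the similarity-dependent weighting of step S209. The similarity formula and both thresholds are hypothetical: the patent specifies only that the similarity is derived from the feature-amount difference, that lower similarity gives the current frame a larger weight, and that completely dissimilar regions are excluded from the synthesis.

```python
import numpy as np

def similarity(feat_cur, feat_prev, scale=100.0):
    """Map a feature-amount difference to a similarity in [0, 1]
    (1 = identical). Hypothetical formula; the patent only says the
    similarity is computed from the feature-amount difference."""
    return max(0.0, 1.0 - abs(feat_cur - feat_prev) / scale)

def blend_by_similarity(prev_region, cur_region, sim, sim_min=0.1):
    """Weighted addition per step S209: the lower the similarity, the
    larger the weight of the current frame. Completely dissimilar
    regions (sim below sim_min) are excluded from the synthesis, so
    the current frame is returned unchanged."""
    if sim < sim_min:
        return cur_region.astype(float)  # excluded from synthesis target
    w_prev = sim * 0.5                   # sim = 1 -> plain average
    w_cur = 1.0 - w_prev                 # low sim -> current frame dominates
    return w_cur * cur_region.astype(float) + w_prev * prev_region.astype(float)
```

At maximum similarity this reduces to the ordinary synthesis of step S208's case; as the similarity falls, the output approaches the unmodified current frame.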
As described above, according to the present embodiment, similarity-dependent image synthesis is used in addition as the processing for the case where the image feature amounts differ. This further suppresses the influence on the image of subject displacement or changes in the subject's state caused by the synthesis.
[Other Variations]
A variation of the embodiments will be described. Fig. 8 is a diagram for explaining the image synthesis of the variation. As in Fig. 5, Fig. 8(a) shows the relationship over time between the subject position during wobbling AF control, the lens position (the photographing distance corresponding to the lens position), and the blur amount. Fig. 8(b) shows the imaging timing during wobbling AF control. Fig. 8(c) illustrates the concept of the image synthesis. In Fig. 8, in order to extend the in-focus time as much as possible, the lens is driven once every 2 frames. This driving is effective when the subject moves little.
In wobbling driving, the lens is driven toward both the near side and the infinity side. Therefore, when the lens is driven once per frame, the blur-amount state changes frequently. In contrast, with driving as in Fig. 8, especially when the subject moves little, the blur-amount state does not change frequently.
Because the blur-amount state does not change frequently, unlike the example of Fig. 5, in the example of Fig. 8 image data with the same blur-amount state can be synthesized with each other. As shown in Fig. 8, when the subject's motion is small, out-of-depth image data can be synthesized with each other to generate image data with a larger blur amount (indicated by black triangles in the figure), and out-of-depth image data can be synthesized with in-depth image data to generate in-depth image data with a smaller blur amount (indicated by white circles in the figure). When the lens is driven once every 2 frames, in-depth image data is generated once every 3 frames. When in-depth and out-of-depth image data are generated regularly in this way and displayed as live view, the blur in the images is averaged by the interpolation performed by the human brain.
The present invention has been described above based on the embodiments, but the invention is of course not limited to the above embodiments, and various modifications and applications are possible within the scope of the gist of the invention. For example, the embodiments take AF with wobbling driving as an example; however, the technique of the present embodiments can be applied to various AF processes in which the lens is driven during live view display, such as phase difference AF.
Furthermore, in the descriptions of the operation flowcharts, terms such as "first" and "then" have been used for convenience, but this does not mean that the operations must be performed in that order.
In addition, each process of the above embodiments can be stored in advance as a program executable by a computer serving as the control unit 214. It can also be stored and distributed in a storage medium of an external storage device such as a magnetic disk, an optical disc, or a semiconductor memory. The control unit 214 reads the program stored in the storage medium of the external storage device and controls its operation according to the read program, thereby executing the above processing.

Claims (5)

1. A camera device comprising:
an imaging element that repeatedly images a subject and acquires a plurality of frames of image data relating to the subject;
a photographing lens that forms an image of the subject on the imaging element and includes a focus lens;
an AF control unit that controls a focus state of the photographing lens by controlling driving of the focus lens;
an image synthesis unit that synthesizes, into a large-blur region of a current frame of image data among the plurality of frames of image data acquired by the imaging element through the repeated imaging during the driving of the focus lens, a small-blur region of a previous frame of image data acquired before the current frame of image data; and
a display unit that displays the image synthesized by the image synthesis unit.
2. The camera device according to claim 1, wherein
the image synthesis unit excludes from the target of the synthesis a small-blur region whose similarity is low, the similarity being the similarity of image feature amounts between that region and the corresponding large-blur region.
3. The camera device according to claim 1, wherein
the image synthesis unit synthesizes the small-blur region with a weighting according to a similarity, the similarity being the similarity of image feature amounts between that region and the corresponding large-blur region.
4. The camera device according to claim 1, wherein
the camera device further has a motion detection unit that detects motion of the subject in the plurality of frames of image data acquired by the imaging element with the driving of the focus lens, and
the image synthesis unit aligns the positions of the current frame of image data and the previous frame of image data according to the motion, and then performs the synthesis.
5. The camera device according to any one of claims 1 to 4, wherein
the AF control unit drives the focus lens in a wobbling manner.
CN201710464425.1A 2016-06-22 2017-06-19 Camera device Pending CN107529006A (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2016-123342 2016-06-22
JP2016123342A JP2017228913A (en) 2016-06-22 2016-06-22 Imaging apparatus

Publications (1)

Publication Number Publication Date
CN107529006A true CN107529006A (en) 2017-12-29

Family

ID=60675239

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710464425.1A Pending CN107529006A (en) 2016-06-22 2017-06-19 Camera device

Country Status (3)

Country Link
US (1) US10491800B2 (en)
JP (1) JP2017228913A (en)
CN (1) CN107529006A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2021013143A1 (en) * 2019-07-23 2021-01-28 深圳市大疆创新科技有限公司 Apparatus, photgraphic apparatus, movable body, method, and program

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107330408B (en) * 2017-06-30 2021-04-20 北京乐蜜科技有限责任公司 Video processing method and device, electronic equipment and storage medium
KR20200110906A (en) 2019-03-18 2020-09-28 삼성전자주식회사 Electronic device for auto focusing function and operating method thereof
DE102019135192A1 (en) * 2019-12-19 2021-06-24 Connaught Electronics Ltd. Method for determining a state of a tailgate of a flatbed vehicle by evaluating a region of interest, computer program product, electronic computing device and camera system

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101601279A (en) * 2007-08-03 2009-12-09 松下电器产业株式会社 Camera head, image capture method and program
US20100157072A1 (en) * 2008-12-22 2010-06-24 Jun Luo Image processing apparatus, image processing method, and program
US20130215290A1 (en) * 2012-02-21 2013-08-22 Johannes Solhusvik Detecting transient signals using stacked-chip imaging systems
US20140354875A1 (en) * 2013-05-31 2014-12-04 Canon Kabushiki Kaisha Image capturing apparatus and control method therefor
CN105100594A (en) * 2014-05-12 2015-11-25 奥林巴斯株式会社 Imaging device and imaging method

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5254904B2 (en) 2009-08-20 2013-08-07 キヤノン株式会社 Imaging apparatus and method
JP6160292B2 (en) * 2013-06-24 2017-07-12 富士通株式会社 Image correction apparatus, imaging apparatus, and computer program for image correction
JP6218520B2 (en) * 2013-09-11 2017-10-25 キヤノン株式会社 Image processing apparatus, image processing method, and program
JP5744161B2 (en) * 2013-11-18 2015-07-01 シャープ株式会社 Image processing device



Also Published As

Publication number Publication date
US10491800B2 (en) 2019-11-26
US20170371231A1 (en) 2017-12-28
JP2017228913A (en) 2017-12-28


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20171229