CN104145475A - Image processing device, image processing method, program - Google Patents

Image processing device, image processing method, program

Info

Publication number
CN104145475A
CN104145475A
Authority
CN
China
Prior art keywords
stable
main subject
state estimation
processing
picture
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201380012098.6A
Other languages
Chinese (zh)
Other versions
CN104145475B (en)
Inventor
木下雅也
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Corp
Original Assignee
Sony Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Corp filed Critical Sony Corp
Publication of CN104145475A publication Critical patent/CN104145475A/en
Application granted granted Critical
Publication of CN104145475B publication Critical patent/CN104145475B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/67Focus control based on electronic image sensor signals
    • H04N23/673Focus control based on electronic image sensor signals based on contrast or high frequency components of image signals, e.g. hill climbing method
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/61Control of cameras or camera modules based on recognised objects
    • H04N23/611Control of cameras or camera modules based on recognised objects where the recognised objects include parts of the human body
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/61Control of cameras or camera modules based on recognised objects
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/63Control of cameras or camera modules by using electronic viewfinders
    • H04N23/631Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/68Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/68Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
    • H04N23/681Motion detection
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/69Control of means for changing angle of the field of view, e.g. optical zoom objectives or electronic zooming
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/65Control of camera operation in relation to power supply
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/68Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
    • H04N23/681Motion detection
    • H04N23/6811Motion detection based on the image signal
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/68Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
    • H04N23/681Motion detection
    • H04N23/6812Motion detection based on additional sensors, e.g. acceleration sensors

Abstract

The invention includes: a stable imaging state inference unit that performs a stable imaging state inference process to infer whether or not the current state is a stable imaging state; and a main subject assessment unit that performs a main subject assessment process. When the stable imaging state inference process infers that the state is the stable imaging state, the main subject assessment unit outputs the result of the main subject assessment process.

Description

Image processing device, image processing method, and program
Technical field
The present disclosure relates to an image processing device that performs processing relating to a main subject in an image, an image processing method, and a program.
Citation list
Patent documentation
PTL 1: Japanese Unexamined Patent Application Publication No. 2011-166305
PTL 2: Japanese Unexamined Patent Application Publication No. 2011-146826
PTL 3: Japanese Unexamined Patent Application Publication No. 2011-146827
PTL 4: Japanese Unexamined Patent Application Publication No. 2011-160379
Background art
In recent years, digital still cameras and digital video cameras commonly have a face detection function, and a function of optimally adjusting camera parameters such as focus and brightness to match the position or region of a face.
Meanwhile, PTL 1 discloses a method in which a user designates and selects a "main subject", that is, a subject to serve as a target of subject tracking, in a captured image.
In addition, if the techniques disclosed in PTL 2, 3, and 4 are used, for example, subject tracking that encompasses the whole-body frame of an arbitrary subject can be realized.
There also exist functions, such as automatic focusing and automatic exposure, in which a desired region in a captured image is detected and tracked, and the optical system and the like are controlled so that the region becomes optimal.
As described above, techniques are known for tracking an image designated by the user as the main subject, such as an image region like a face, or for focusing on a face region.
Summary of the invention
Incidentally, at present, the desired region serving as the target of tracking or focusing in a captured image, that is, the "main subject", is determined by the photographer selecting, by some method, one candidate from "a plurality of candidate regions" obtained from various detectors.
For example, a main subject is selected by an operation such as choosing an arbitrary face, using a touch panel, from among a plurality of faces reflected in a preview image displayed on the screen (a monitoring image of the subject displayed at times other than shutter operation). Alternatively, a subject located in a predetermined region at a timing designated by the user, such as half-pressing of the shutter, is selected as the main subject.
However, when such user interfaces are considered under actual use conditions, the operation of "selection of a main subject by the photographer" is itself often difficult.
For example, a user may want to use this function in order to keep focusing on a subject that moves around, but it is difficult for the user to select the subject while holding the camera and aiming at it.
For example, designation may be difficult because of the user's reaction speed to changes or movements of the subject; a subject moving around on the screen of the preview image may not be designated accurately.
In addition, in a situation where the user holds the camera in his or her hand and points it at the subject to select it, the very action of selecting the main subject with a finger is difficult.
Moreover, the user may find it difficult to select a subject because of the resolution of the display screen on which the touch panel is arranged. Depending on the size of the subject on the display screen and the size or thickness of the user's finger, the desired subject may not be designated properly.
Furthermore, because of time lags in the camera system, for example the time lag between the actual scene and the preview image of the captured image, the user may find it difficult to designate the subject properly.
In addition, when such an operation is performed while shooting or recording a moving image, image blur caused by the action of selecting the main subject is recorded as it is, and when the main subject goes out of frame or temporarily disappears because of occlusion, that is, when tracking fails, the user is forced to select it again.
As described above, in a handheld camera, the very action of selecting the main subject is difficult in many of the use cases that require it, and can put stress on the photographer.
Therefore, an object of the present disclosure is to realize a technique that determines the target subject desired by a user such as a photographer as the main subject at an appropriate time, even without the user performing an action of selecting the subject.
Image processing equipment of the present disclosure comprises and is stable into picture state estimation parts, described in be stable into as state estimation parts and estimate that whether occurring stablize being stable into of image formation state looks like state estimation process; With main subject judging part, described main subject judging part carries out main subject determination processing, when owing to being stable into picture state estimation process, while estimating to occur being stable into picture state, also exports the result of main subject determination processing.
Image processing method of the present disclosure comprises and is stable into picture state estimation step, described in be stable into as state estimation step and estimate that whether occurring stablize being stable into of image formation state looks like state estimation process; Carry out the main subject determination step of main subject determination processing; With when owing to being stable into picture state estimation process, while estimating to occur being stable into picture state, export the output step of the result of main subject determination processing.
Program of the present disclosure makes arithmetic processing equipment carry out described each step.
According to technology of the present disclosure, carry out for view data, automatically judge the main subject determination processing of main subject.As the processing of the preliminary step of main subject determination processing, that estimates whether to occur to stablize image formation state is stable into picture state estimation process.In addition, in the situation that estimating to occur being stable into picture state, carry out main subject determination processing, so that output result of determination, or export the main subject result of determination obtaining when estimating to occur being stable into as state.Stable image formation state is due to subject image stabilization, is therefore suitable for carrying out the state of main subject determination processing.
According to the disclosure, when occurring stablizing image formation state, the result of determination of the main subject in output photographic images.Thereby, need to the user such as cameraman do not select the action of main subject, and main subject information can be in due course and used.
Brief description of drawings
Fig. 1 is a block diagram illustrating a configuration example of an image processing device according to an embodiment of the present disclosure.
Fig. 2 is a flowchart illustrating a main subject determination process in the image processing device according to the embodiment.
Fig. 3 is a block diagram illustrating an imaging device according to the embodiment.
Fig. 4 is a flowchart illustrating a detailed main subject determination process according to the embodiment.
Fig. 5 is a diagram illustrating candidate image frames and a determination reference point according to the embodiment.
Fig. 6 is a diagram illustrating distances between candidate image frames and a determination reference point according to the embodiment.
Fig. 7 is a diagram illustrating determination of the degree of stable presence based on the position state according to the embodiment.
Fig. 8 is a flowchart illustrating a specific main subject determination process according to the embodiment.
Fig. 9 is a diagram illustrating execution timing of the stable imaging state estimation process according to the embodiment.
Fig. 10 is a diagram illustrating execution timing of the stable imaging state estimation process according to the embodiment.
Fig. 11 is a flowchart illustrating processing example 1 according to the embodiment.
Fig. 12 is a flowchart illustrating processing example 2 according to the embodiment.
Fig. 13 is a diagram illustrating operation in processing example 2 according to the embodiment.
Fig. 14 is a flowchart illustrating processing example 3 according to the embodiment.
Fig. 15 is a diagram illustrating operation in processing example 3 according to the embodiment.
Fig. 16 is a flowchart illustrating processing example 4 according to the embodiment.
Fig. 17 is a diagram illustrating operation in processing example 4 according to the embodiment.
Fig. 18 is a flowchart illustrating processing example 5 according to the embodiment.
Fig. 19 is a diagram illustrating global vectors in processing example 5 according to the embodiment.
Fig. 20 is a diagram illustrating operation in processing example 5 according to the embodiment.
Fig. 21 is a diagram illustrating a change of the predetermined time according to the embodiment.
Fig. 22 is a flowchart illustrating a predetermined time change process according to the embodiment.
Fig. 23 is a flowchart illustrating another main subject determination process in the image processing device according to the embodiment.
Fig. 24 is a block diagram illustrating a case in which the present disclosure is applied to a computer device according to the embodiment.
Embodiment
Hereinafter, embodiments will be described in the following order.
<1. Configuration of image processing device>
<2. Configuration of imaging device>
<3. Main subject determination process>
<4. Stable imaging state estimation process>
[4-1: Execution timing of the stable imaging state estimation process]
[4-2: Processing example 1]
[4-3: Processing example 2]
[4-4: Processing example 3]
[4-5: Processing example 4]
[4-6: Processing example 5]
[4-7: Change process for the predetermined time used in estimation]
<5. Another processing example in the image processing device>
<6. Application to a program and a computer device>
<7. Modification examples>
The meanings of the terms "stable imaging state", "degree of stable presence", and "field of view" used in the following description are noted here.
"Degree of stable presence" is a value that serves as an index for automatically determining the main subject. In other words, it is a value indicating the temporal frequency with which a certain subject is in a predetermined positional state within the angle-of-view space. For example, it is an index value for determining, with high temporal accuracy, whether or not a subject is in a predetermined state in the image. In the embodiments described below, the degree of stable presence is a value indicating cumulative time, duration, average presence, or the like, during which a candidate image is in a predetermined positional state within the angle-of-view space, for example near a predetermined position; an image for which a long cumulative time or a long duration is calculated as the "degree of stable presence" can be estimated as the main subject at which the photographer is mainly aiming.
The above "angle-of-view space" indicates the space appearing in a captured image. It refers mainly to the two-dimensional space of the screen plane in the captured image, or to a three-dimensional space that also includes the relative distance of the subject from the camera position at the time of imaging.
"Stable imaging state" indicates a state or situation that is suitable for performing the main subject determination process, or for using the main subject determination result. Although the relative relationship between the imaging device and the subject differs in various assumed situations, the main subject determination process is meaningful when the change in the content of the captured image is small and the captured image remains stable to some extent for a continuing period.
For example, in the case of a handheld camera, a situation in which the user holds the camera and is looking for a subject is a stable imaging state. The stable imaging state is, for example, a state in which the imaging device 10 is held steadily, such as when the user is looking for a subject in order to capture a still image of it.
In the present embodiment, a "stable imaging state estimation process" is performed as a start condition of the automatic main subject determination process; this is a process of estimating whether or not it is appropriate to perform the main subject determination process. For example, in the case of a handheld camera, the "stable imaging state estimation process" determines whether or not the user is in an operating state that can be estimated as holding the camera and looking for a subject.
"Field of view" indicates the subject scene in the range appearing in the captured image. A field-of-view change indicates the various changes, along the time axis, of the captured image appearing in the captured image data. For example, it indicates the various changes appearing in the image data, such as changes in the angle of view of the captured image, changes in the subject orientation, changes in the subject range caused by camera shake or camera posture, image quality changes such as luminance, color, and contrast, and changes in the focus state. A situation in which the field-of-view change is within a predetermined range, that is, a situation in which the field of view can be determined to be stable, can be estimated as a "stable imaging state".
<1. Configuration of image processing device>
Fig. 1 illustrates a configuration example of an image processing device according to an embodiment.
The image processing device 1 includes a main subject determination unit 2 and a stable imaging state estimation unit 3.
The stable imaging state estimation unit 3 performs a stable imaging state estimation process to estimate whether or not a stable imaging state has occurred.
The main subject determination unit 2 performs a main subject determination process.
When the stable imaging state estimation process of the stable imaging state estimation unit 3 estimates that a stable imaging state has occurred, the main subject determination unit 2 outputs the result of the main subject determination process.
For example, the main subject determination unit 2 performs the main subject determination process and outputs the determination result when the stable imaging state is estimated to have occurred.
Alternatively, the main subject determination unit 2 may perform the main subject determination process successively, and output the latest main subject determination result obtained at the time when the stable imaging state is estimated to have occurred.
The stable imaging state estimation unit 3 uses input estimation information Inf to perform the stable imaging state estimation process, which estimates whether or not the current state is a stable imaging state. The estimation information Inf may be various kinds of information, for example elapsed time information, sensor outputs for detecting movement of the imaging device, control values or instruction values for various operations of the imaging device, sensor outputs for detecting motion of the imaging optical system, image analysis information, and so on.
The stable imaging state estimation unit 3 performs the estimation process using the estimation information Inf, and when it estimates that a stable imaging state has occurred, it notifies the main subject determination unit 2 of this.
In response to the notification of the estimation of the stable imaging state, the main subject determination unit 2 performs the main subject determination process as in the following example.
For example, as arithmetic processing functions realized by a software program, the main subject determination unit 2 has a candidate detection function and a main subject determination processing function.
The main subject determination unit 2 first performs candidate detection using the candidate detection function.
Candidate detection is a process of detecting, from input image data Dg of a plurality of frames, candidate images that are candidates for the main subject.
In other words, face image detection or human body image detection is performed on each frame of the image data Dg input successively on the time axis (or on intermittent frames), and images that are candidates for the main subject are extracted from them.
Face detection, human body detection, and the like can be performed by pattern matching methods in image analysis of the captured image data, and in principle any other detector can be realized simply by replacing the dictionary used for pattern matching; for example, candidate images of the main subject may be extracted by dog face detection, cat face detection, or the like.
In addition, a moving body may be detected by a moving body detection method using frame differences, and the moving body may be used as a candidate image; in this case, a region-of-interest extraction method called saliency may be used.
Then, the main subject determination unit 2 sets, as candidate image information, the information indicating the candidate images extracted by the candidate detection process, for example the positional information of the candidate image in two dimensions within the screen (x and y coordinate values), the subject distance, the image size, and so on.
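The frame-difference route to candidate detection mentioned above can be sketched minimally as follows. This is an illustrative assumption, not the patent's detector: changed pixels between two frames are grouped into a single bounding box that is treated as one moving-body candidate, and its center coordinates and size are recorded as candidate image information.

```python
# Illustrative sketch: extract one moving-body candidate by frame difference.
# Frames are flat lists of luminance values in row-major order; the function
# name, representation, and single-box grouping are assumptions.

def detect_moving_candidate(prev, curr, width, threshold=10):
    """Return candidate image info {'x', 'y', 'w', 'h'} for the bounding box
    of pixels that changed by more than `threshold`, or None if nothing moved."""
    changed = [i for i, (a, b) in enumerate(zip(prev, curr))
               if abs(a - b) > threshold]
    if not changed:
        return None
    xs = [i % width for i in changed]   # column of each changed pixel
    ys = [i // width for i in changed]  # row of each changed pixel
    return {
        "x": (min(xs) + max(xs)) / 2,   # candidate center, x coordinate
        "y": (min(ys) + max(ys)) / 2,   # candidate center, y coordinate
        "w": max(xs) - min(xs) + 1,     # bounding-box width
        "h": max(ys) - min(ys) + 1,     # bounding-box height
    }
```

A real detector would segment multiple moving regions and add subject distance; this sketch only shows how positional candidate image information of the kind described above could be derived.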
Using the main subject determination processing function, the main subject determination unit 2 then successively performs main subject setting based on calculation of the degree of stable presence.
In other words, for the candidate images indicated by the candidate image information obtained by candidate detection, the main subject determination unit 2 obtains the degree of stable presence within the image data over a plurality of frames, and determines the main subject from among the candidate images. It then outputs main subject information Dm.
First, for each candidate image indicated by the candidate image information, the main subject determination unit 2 determines the position state within the angle-of-view space.
"Position state" is a general term for the state of the absolute or relative position of a candidate image in the image data within the angle-of-view space.
Specific examples of the "position state" include:
• the relative distance from a certain determination reference point in the angle-of-view space;
• the relative positional relationship or relative distance with respect to a certain determination reference region in the angle-of-view space;
• the position of the candidate image in the two-dimensional plane of the captured image;
• the relative distance of the subject from the camera position at the time of imaging;
• the relative positional relationship between the subject distance and the determination reference point or determination reference region; and so on.
Next, from the position state determined for each candidate image frame, the degree of stable presence of each candidate image within the image data over a plurality of frames is obtained.
Then, a process is performed that uses the degree of stable presence obtained by the stable presence degree calculation function to determine the main subject from among the candidate images and to set a certain candidate image as the main subject.
As described above, the information indicating the image set as the main subject is output as main subject information Dm, and is passed to other application software, processing circuit units, and the like.
The image processing device 1, including the main subject determination unit 2 and the stable imaging state estimation unit 3, can be realized by a central processing unit (CPU) or a digital signal processor (DSP) serving as an arithmetic processing device.
The flow of the processing performed by the image processing device 1 is as shown in Fig. 2A.
In step F1000, the stable imaging state estimation unit 3 performs the stable imaging state estimation process. As described above, the stable imaging state estimation unit 3 determines, using the input estimation information Inf, whether or not the current state is estimated to be a stable imaging state.
If the stable imaging state estimation process of step F1000 estimates that a stable imaging state has not occurred, the processing of Fig. 2A by the image processing device 1 ends at step F1001, and the processing is then restarted from step F1000.
On the other hand, if the stable imaging state estimation process of step F1000 estimates that a stable imaging state has occurred, the image processing device 1 determines at step F1001 that this is the start timing of the main subject determination process. The flow then proceeds to step F1002, and the main subject determination unit 2 performs the main subject determination process.
First, in step F1002, the above-described candidate image detection is started.
For example, the image data Dg is input from an imaging device unit (not shown), or from a reception unit that receives image data transmitted from a separate imaging device. Alternatively, moving image data obtained by an imaging device and stored in a recording medium may be reproduced and input.
In step F1002, the main subject determination unit 2 performs image analysis, frame difference detection, region-of-interest detection, and the like on each frame of the successively input image data Dg, and starts the process of detecting predetermined candidate images.
The main subject determination unit 2 may perform candidate image extraction on all input frames, or on intermittent frames, for example every other frame or every two frames. In other words, the main subject determination unit 2 need only perform the candidate image extraction process on a plurality of frames in time sequence during the period of performing the main subject determination. Which images become candidate images depends on the settings, for example face images, human body images, dog images, cat images, and so on.
Then, for each frame, the main subject determination unit 2 generates candidate image information indicating the detected candidate images.
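The gating structure of Fig. 2A described above (F1000 estimation, F1001 branch, F1002 candidate detection, followed by determination and output of Dm) can be sketched as a small control function. All function names and signatures here are assumptions for illustration; the actual units are hardware/software components of the device, not this API.

```python
# Illustrative sketch of the Fig. 2A control flow. The three callables stand
# in for the stable imaging state estimation unit, the candidate detection
# function, and the main subject determination processing function.

def run_image_processing(estimate_stable_state, detect_candidates,
                         determine_main_subject, frames):
    """F1000: estimate stable imaging state; F1001: branch (restart later
    when not stable); F1002: candidate detection per frame; F1003/F1004:
    main subject determination and output of main subject information Dm."""
    if not estimate_stable_state(frames):                          # F1000
        return None                                                # F1001
    candidates_per_frame = [detect_candidates(f) for f in frames]  # F1002
    return determine_main_subject(candidates_per_frame)            # F1003/F1004
```

The point of the structure is that the (relatively expensive) candidate detection and determination steps run only after the estimation gate passes, matching the "no useless processing" behavior described later in the text.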
Next, in step F1003, the main subject determination unit 2 performs the main subject determination process.
Step F1003 is described in detail with reference to Fig. 2B.
First, in step F1, the main subject determination unit 2 performs a position state determination process. This process determines the position state of each previously generated candidate image.
A candidate image may be present over one frame or over a plurality of frames, and may be absent in a certain frame. When a plurality of candidate images are present in one frame, the position state is determined for each candidate image.
Next, in step F2, the main subject determination unit 2 calculates the degree of stable presence of each candidate image within the image data over a plurality of frames, based on the position state of the candidate image in each frame determined in the position state determination process. For example, the degree of stable presence is calculated as a value indicating the frequency with which the position state is close to the image center, or the like.
In step F3, the main subject determination unit 2 determines the main subject from among the candidate images by using the degree of stable presence of each candidate image. For example, the candidate image whose degree of stable presence, such as a value indicating the frequency of being close to the image center, is the highest, or reaches a predetermined value in the shortest time, is determined to be the main subject.
In step F1003 of Fig. 2A, the main subject determination unit 2 performs steps F1 to F3 of Fig. 2B as described above, thereby setting one candidate image as the main subject.
Then, in step F1004 of Fig. 2A, the main subject determination unit 2 passes the main subject information Dm to an application program or the like.
The application program or the like performs processing corresponding to the fact that an image has been set as the main subject. For example, such processing includes focus control, tracking processing, image effect processing, and the like.
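Steps F1 to F3 above can be sketched as follows, under stated assumptions: the position state is taken here to be the distance of each candidate's center from a determination reference point (the image center in normalized coordinates), and the degree of stable presence to be the number of frames in which that distance stays below a threshold. The function name, data layout, and thresholds are illustrative, not the patent's.

```python
# Hedged sketch of steps F1-F3 of Fig. 2B.
import math

def main_subject_determination(candidates_per_frame, reference=(0.5, 0.5),
                               near_threshold=0.2):
    """candidates_per_frame: list over frames of {candidate_id: (x, y)} with
    normalized coordinates. Returns the id of the candidate with the highest
    degree of stable presence, or None when no candidate qualifies."""
    presence = {}
    for frame in candidates_per_frame:
        for cid, (x, y) in frame.items():
            # F1: position state = distance from the determination reference point
            dist = math.hypot(x - reference[0], y - reference[1])
            # F2: accumulate stable presence while the candidate stays near it
            if dist < near_threshold:
                presence[cid] = presence.get(cid, 0) + 1
    # F3: the candidate with the highest degree of stable presence wins
    return max(presence, key=presence.get) if presence else None
```

For example, a face that hovers near the center across most frames accumulates a higher count than one that drifts through the edges, and is therefore selected, which is the behavior the text attributes to the degree of stable presence.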
As described above, in the main subject determination processing performed by the image processing apparatus 1, the main subject determination is carried out when the stable imaging state estimation processing estimates that a stable imaging state has occurred.
In other words, by estimating the stable imaging state, a point in time suitable for determining the main subject is identified, so that the main subject determination processing can be executed at an appropriate time.
For example, a state in which a photographer holds the camera with the intention of shooting is estimated to be a stable imaging state; such a state is suitable for starting the main subject determination processing. Conversely, when the user is not even holding the camera, executing the main subject determination processing is useless.
In the image processing apparatus 1 of this example, the stable imaging state estimation processing estimates a state in which the photographer intends to capture an image, for example a situation in which the photographer is trying to find a subject. The main subject determination processing is then performed at the appropriate moment based on this estimation, that is, at the time the user expects, without executing useless processing.
Furthermore, in the main subject determination processing, the degree of stable presence of a subject over multiple frames is obtained from the extracted candidate images. In other words, an index value is obtained for judging whether a subject is present at a positionally stable location in the image with high temporal frequency. A subject that can be regarded with high accuracy as the target aimed at by the photographer holding the camera has a high degree of stable presence. That is, the photographer frames the shot so that the subject of primary interest is included at a point or in a region that the photographer regards as the center, and as long as the photographer keeps aiming at that subject, it naturally remains in the captured image for a long time. Accordingly, a subject that is positionally stable and present in the captured image with high temporal frequency, and therefore has a high degree of stable presence, can be estimated to be the main subject at which the photographer is aiming.
The main subject is determined on the basis of such a degree of stable presence. The main subject is thus determined automatically, without a user such as the photographer having to perform a deliberate designation operation, which greatly improves operability in the various electronic devices that perform operations corresponding to the setting of a main subject.
<2. Configuration of the imaging device>
Hereinafter, the stable imaging state estimation operation and the main subject determination will be described in detail by taking as an example an imaging device 10 that includes the image processing apparatus described above.
Fig. 3 illustrates a configuration example of the imaging device 10 of the embodiment. The imaging device 10 is a so-called digital still camera or digital video camera, that is, a device that captures and/or records still images or moving images, and incorporates the image processing apparatus described in the claims.
In the control unit 30 of the imaging device 10, the constituent elements corresponding to the main subject determination unit 2 and the stable imaging state estimation unit 3 of the image processing apparatus described above are implemented in software. In Fig. 3 these are shown as a main subject determination unit 30a and a stable imaging state estimation unit 30b. The control unit 30 executes processing based on the program described in the claims, thereby performing the operations of the image processing method described in the claims.
As illustrated in Fig. 3, the imaging device 10 includes an optical system 11, an imager 12, an optical system drive unit 13, a sensor unit 14, a recording unit 15, a communication unit 16, a digital signal processing unit 20, a control unit 30, a user interface controller (hereinafter "UI controller") 32, and a user interface 33.
The optical system 11 includes lenses such as a cover lens, a zoom lens, and a focus lens, as well as an aperture mechanism. Light from the subject is collected on the imager 12 by the optical system 11.
The imager 12 includes an image sensor of the charge-coupled device (CCD) type, complementary metal-oxide-semiconductor (CMOS) type, or the like.
The imager 12 performs, on the electric signal obtained by photoelectric conversion in the image sensor, correlated double sampling (CDS) processing, automatic gain control (AGC) processing, and the like, and also performs analog-to-digital (A/D) conversion. The imaging signal, now digital data, is output to the digital signal processing unit 20 in the subsequent stage.
Under the control of the control unit 30, the optical system drive unit 13 drives the focus lens of the optical system 11 to perform a focusing operation, drives the aperture mechanism of the optical system 11 to adjust the exposure, and drives the zoom lens of the optical system 11 to perform a zoom operation.
The digital signal processing unit 20 is formed as an image processing processor, for example using a DSP. The digital signal processing unit 20 performs various kinds of signal processing on the captured image signal from the imager 12.
For example, the digital signal processing unit 20 includes a preprocessing section 21, a synchronization section 22, a YC generation section 23, a resolution conversion section 24, a codec section 25, a candidate detection section 26, and a motion vector detection section 27.
The preprocessing section 21 performs, on the captured image signal from the imager 12, clamp processing that fixes the black levels of R, G, and B to a predetermined level, correction processing among the R, G, and B color channels, and the like. The synchronization section 22 performs demosaicing so that the image data of each pixel has all of the R, G, and B color components. The YC generation section 23 generates a luminance (Y) signal and a color (C) signal from the R, G, and B image data. The resolution conversion section 24 performs resolution conversion on image data that has undergone the various kinds of image processing. The codec section 25 encodes the resolution-converted image data for recording or communication.
The candidate detection section 26 corresponds to the candidate detection function of the main subject determination unit 2 described with reference to Fig. 1. In the example of Fig. 3, the candidate detection section 26 is a functional component executed by the digital signal processing unit 20, but this is merely an example; the processing of the candidate detection section 26 may instead be performed by the main subject determination unit 30a of the control unit 30.
Taking as a target the captured image signal, such as the luminance signal or color signal obtained by the YC generation section 23, the candidate detection section 26 performs image analysis in units of frames to extract candidate images. For example, face images are detected, and the region in which a face image is present is extracted as a candidate image frame. For each extracted candidate image, position information of the candidate image frame on the screen (such as xy coordinate values and subject distance) and size information of the candidate image frame (such as width, height, and number of pixels) are sent to the main subject determination unit 30a of the control unit 30 as candidate image information. Because the candidate image information indicates a frame enclosing the image region serving as a candidate image, it is also referred to as "candidate image frame information".
The candidate image frame information may additionally include attribute information on the kind of candidate image (face, human body, dog, cat, and so on), individual identification information, or the image data itself.
As described above, the candidate detection section 26 may use a pattern matching method to extract specific images serving as candidates, or may detect a moving body using a moving body detection method based on frame differences and set the moving body as a candidate image. The methods of extracting and selecting candidate images are not limited to the above and may vary widely. The candidate detection section 26 may also perform processing such as smoothing or outlier removal on the image in order to generate the candidate image frame information.
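As a rough illustration only (not the patent's implementation), the "candidate image frame information" described above could be modeled as a simple data structure; all field names here are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class CandidateFrame:
    """Hypothetical container for candidate image frame information sent
    from the candidate detection section to the main subject determination
    unit: position, size, and optional attribute/identification fields."""
    x: float                  # upper-left x coordinate on the screen
    y: float                  # upper-left y coordinate
    w: float                  # width of the frame
    h: float                  # height of the frame
    attribute: str = "face"   # kind of candidate (face, body, dog, cat, ...)
    subject_id: int = -1      # individual identification, if available

# Example: three face frames detected in one frame, as in Fig. 5
frames = [
    CandidateFrame(x=40, y=30, w=50, h=50, subject_id=1),
    CandidateFrame(x=160, y=60, w=40, h=40, subject_id=2),
    CandidateFrame(x=260, y=45, w=45, h=45, subject_id=3),
]
print(len(frames))  # → 3
```

Keeping a per-subject identifier in the structure matches the later requirement (Fig. 8 description) that candidate frames be distinguished by individual across frames.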
The motion vector detection section 27 is a functional component provided when processing example 5, described later, is adopted. Taking the captured image signal obtained by the YC generation section 23 as a target, the motion vector detection section 27 performs image analysis in units of frames: it obtains the motion vectors (local vectors) of a plurality of regions into which the screen is divided, and from these obtains the vector of the image as a whole (global vector). Details will be given in processing example 5.
The motion vector detection section 27 sends the global vector to the stable imaging state estimation unit 30b of the control unit 30 for use in the stable imaging state estimation processing.
In the example of Fig. 3, the motion vector detection section 27 is a functional component executed by the digital signal processing unit 20, but this is merely an example; the stable imaging state estimation unit 30b of the control unit 30 may instead perform the processing of the motion vector detection section 27.
The control unit 30 is constituted by a microcomputer including a CPU, read-only memory (ROM), random access memory (RAM), flash memory, and the like.
The CPU executes programs held in the ROM, flash memory, or the like, thereby controlling the imaging device 10 as a whole.
The RAM is used as a work area for the CPU when performing various kinds of data processing, for temporarily storing data, programs, and the like.
The ROM or nonvolatile memory holds not only the operating system (OS) used by the CPU to control each component and content files such as image files, but also application programs and firmware for various operations. In this example, it holds, among others, the program for executing the main subject determination processing described later and application programs that use the main subject determination result.
The control unit 30 thus controls the operation of each necessary component: it issues instructions for the various kinds of signal processing in the digital signal processing unit 20, and controls imaging and recording operations in response to user operations, reproduction of recorded image files, camera operations such as zooming, focusing, and exposure adjustment, user interface operations, and so on.
In the present embodiment, the control unit 30 functions as the main subject determination unit 30a and executes the main subject determination processing described later.
The main subject determination unit 30a performs, on the candidate image information sent from the candidate detection section 26, the position state determination processing, the calculation of the degree of stable presence, and the setting of the main subject based on the degree of stable presence.
Also in the present embodiment, the control unit 30 functions as the stable imaging state estimation unit 30b and executes the stable imaging state estimation processing described later.
The stable imaging state estimation unit 30b determines, on the basis of elapsed time, various sensor information, control command values, and the like, whether a stable imaging state is estimated to have occurred.
The user interface 33 produces display output and audio output for the user and receives operation input from the user. For this purpose the user interface includes a display device, operation devices, a speaker device, a microphone device, and the like. Here, a display section 34 and an operation section 35 are illustrated.
The display section 34 performs various displays for the user and includes, for example, a display device such as a liquid crystal display (LCD) or an organic electroluminescence (EL) display formed on the housing of the imaging device 10. The display section may also be formed as a so-called viewfinder using an LCD, an organic EL device, or the like.
The display section 34 includes the display device and a display driver that causes the display device to display. The display driver performs various displays on the display device in accordance with instructions from the control unit 30. For example, the display driver reproduces and displays still images or moving images captured and recorded on the recording medium, or displays on the screen of the display device a through image (subject monitoring image) captured while waiting for a shutter operation. Graphical user interface (GUI) elements such as various operation menus, icons, and messages are also displayed on the screen. In the present embodiment, a display is also performed on the through image or reproduced image so that the user can recognize the determination result of the main subject determination.
The operation section 35 has an input function for receiving user operations and sends signals corresponding to input operations to the control unit 30.
The operation section 35 is realized, for example, by various operation devices provided on the housing of the imaging device 10, a touch panel formed on the display section 34, and the like. As operation devices on the housing, a reproduction menu start button, an enter button, a cross key, a cancel button, a zoom key, a slide key, a shutter button, and the like are provided. Various operations may also be performed through the touch panel, using icons, menus, and the like displayed on the touch panel and the display section 34.
The operations of the display section 34 and the other parts of the user interface 33 are controlled by the UI controller 32 in response to instructions from the control unit 30. The UI controller 32 also transmits operation information from the operation section 35 to the control unit 30.
The recording unit 15 includes, for example, nonvolatile memory and functions as a storage area for content files such as still image data and moving image data, attribute information of image files, thumbnail images, and the like.
Image files are stored in a format such as Joint Photographic Experts Group (JPEG), Tagged Image File Format (TIFF), or Graphics Interchange Format (GIF).
The actual form of the recording unit 15 may vary. For example, the recording unit 15 may be a flash memory unit built into the imaging device 10, or may take the form of a memory card attachable to and detachable from the imaging device 10 (for example a portable flash memory card) together with a card recording/reproduction section that performs recording/reproduction access to that flash memory. It may also be realized as a hard disk drive (HDD) built into the imaging device 10.
In this example, the program for executing the stable imaging state estimation processing and the main subject determination processing described later may be stored in the recording unit 15.
The communication unit 16 performs data communication or network communication with external devices in a wired or wireless manner. For example, it communicates captured image data (still image files or moving image files) with an external display device, recording device, reproduction device, or the like.
As a network communication unit, it may perform communication over various networks such as the Internet, a home network, or a local area network (LAN), and transmit and receive various data to and from servers, terminals, and the like on the network.
The sensor unit 14 comprehensively represents various sensors. For example, a gyro sensor that detects camera shake and an acceleration sensor that detects the posture of the imaging device 10 are provided. An angular velocity sensor that detects the posture or movement of the imaging device 10, an illuminance sensor that detects ambient illuminance for exposure adjustment, and a distance measuring sensor that detects the subject distance may also be provided. With the acceleration sensor or angular velocity sensor, panning and tilting movements of the imaging device 10 can be detected.
In some cases the sensor unit 14 also includes a zoom lens position sensor that detects the position of the zoom lens of the optical system 11 and a focus lens position sensor that detects the position of the focus lens of the optical system 11.
In some cases the sensor unit 14 further includes a sensor that detects the opening amount of the mechanical iris serving as the aperture mechanism. Each sensor of the sensor unit 14 transmits the detected information to the control unit 30, and the control unit 30 can perform various kinds of control using the information detected by the sensor unit 14.
<3. Main subject determination processing>
First, the main subject determination processing performed by the main subject determination unit 30a in the imaging device 10 will be described. The stable imaging state estimation processing, which determines the conditions under which the main subject determination processing is executed, will be described afterwards.
The main subject determination processing obtains the degree of stable presence within the image data of multiple frames and determines the main subject from among the candidate images.
An overview of the main subject determination processing is given below with reference to Fig. 4, and concrete processing examples are described with reference to Figs. 5 to 8.
Fig. 4 is a flowchart of the main subject determination processing. For steps F10 to F15, it also illustrates the correspondence with step F1 (determining the position state), step F2 (calculating the degree of stable presence), and step F3 (setting the main subject) of Fig. 2B.
In step F10, the control unit 30 obtains the candidate image frame information of a certain frame from the candidate detection section 26.
In step F11, the control unit 30 determines the position state of each of the one or more candidate image frames indicated by the obtained candidate image frame information.
In this case, the distance of the candidate image from a determination reference point set in the angle-of-view space is determined as the position state. Alternatively, the positional relationship of the candidate image with respect to a determination reference region set in the angle-of-view space is determined as the position state.
In step F12, the control unit 30 calculates the degree of stable presence for each candidate image frame. In this case, the control unit 30 calculates, as the degree of stable presence, cumulative time information on how long the position state satisfies a predetermined condition. Alternatively, the control unit 30 calculates, as the degree of stable presence, duration information on how long the position state continuously satisfies the predetermined condition.
Position information of the candidate image in the angle-of-view space, or size information of the candidate image, may also be used in calculating the degree of stable presence.
In step F13, the control unit 30 determines the main subject using the degree of stable presence.
Here, the determination performed in step F13 may be a process in which the candidate image whose degree of stable presence reaches a predetermined value in the shortest time from the start of the main subject determination is determined to be the main subject. Alternatively, it may be a process in which the candidate image having the largest value of the degree of stable presence during the current main subject determination is determined to be the main subject.
Together with the value of the degree of stable presence, position information in the angle-of-view space or size information may also be used for the main subject determination.
When no candidate image has yet reached the predetermined value of the degree of stable presence, or when the predetermined main subject determination period has not yet elapsed and the candidate image with the largest degree of stable presence cannot yet be selected within that period, the main subject cannot be determined in the processing of step F13. In this case, the control unit 30 returns from step F14 to F10 and repeats each process. That is, it obtains from the candidate detection section 26 the candidate image frame information of the next frame to be processed and performs the same processing on it.
When at some point a candidate image whose degree of stable presence reaches the predetermined value is found, or when the predetermined main subject determination period has elapsed and the candidate image with the largest degree of stable presence can be selected within that period, the control unit 30 proceeds from step F14 to step F15. The candidate image determined in step F13 is then set as the main subject.
The main subject determination processing of Fig. 4 is a processing mode in which the determination is made while candidate image information is being obtained during execution of the main subject determination.
As another mode, candidate image information may be obtained over a certain period, and when that period has elapsed, the main subject determination may be performed using the obtained candidate image information.
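Under the assumption of a per-frame candidate feed, steps F10 to F15 above can be sketched as the following loop. This is an illustrative skeleton, not the patent's implementation; the function and parameter names (`candidate_feed`, `stability_of`, `is_decided`) are invented for the sketch:

```python
def determine_main_subject(candidate_feed, stability_of, is_decided):
    """Sketch of the Fig. 4 loop: F10 obtain candidates, F11/F12 update
    each candidate's degree of stable presence, F13 test whether a
    candidate can be set as main subject, F14 otherwise repeat with the
    next frame, F15 return the decided candidate."""
    for candidates in candidate_feed:            # F10: next frame's candidates
        for c in candidates:                     # F11 + F12
            degree = stability_of(c)
            if is_decided(c, degree):            # F13
                return c                         # F15: set as main subject
    return None                                  # feed ended with no decision

# Minimal usage: candidate "B" reaches the required degree after 3 frames
counts = {}
def stability_of(c):
    counts[c] = counts.get(c, 0) + 1             # cumulative presence count
    return counts[c]

feed = [["A", "B"], ["B"], ["B", "C"]]
main = determine_main_subject(feed, stability_of, lambda c, d: d >= 3)
print(main)  # → B
```

The "other mode" mentioned above (collect candidate information over a fixed period, then decide) would correspond to running the loop without `is_decided` and picking the maximum degree after the feed is exhausted.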
A concrete processing example corresponding to the main subject determination processing of Fig. 4 is described below.
In the concrete example below, the distance from a set determination reference point is obtained as the position state of each candidate image frame.
As the degree of stable presence of each candidate image frame, cumulative time information is calculated on how long the condition that the distance from the determination reference point is within a predetermined threshold remains satisfied.
From the start of the main subject determination, the candidate image whose degree of stable presence first reaches a predetermined value is determined to be the main subject.
First, then, the candidate image frames, the distance from the determination reference point, and the degree of stable presence are described with reference to Figs. 5, 6, and 7, respectively.
Fig. 5 schematically illustrates the candidate image frame extraction operation performed by the candidate detection section 26.
Fig. 5 shows the frames FR1, FR2, FR3, ... of the captured image signal input to the digital signal processing unit 20 by the operation of the optical system 11 and the imager 12 of the imaging device 10. The candidate detection section 26 detects candidate images in each of the successive input frames, or in intermittently sampled frames.
For example, as illustrated, when three people are present in frame FR1, each face image portion is extracted as a candidate image, and the candidate image frame information of candidate image frames E1, E2, and E3 is output and sent to the main subject determination unit 30a. The candidate image frame information consists of position information, size information, attribute information, and so on.
The candidate detection section 26 extracts candidate images in the same manner for each of the subsequent frames FR2, FR3, ..., generates candidate image frame information for each candidate image frame, and sends it to the main subject determination unit 30a.
On obtaining the candidate image frame information of each frame, the control unit 30 calculates the distance from the determination reference point as the position state of each candidate image frame.
Fig. 6A illustrates an example of the determination reference point SP. This is an example in which the center of the picture is used as the determination reference point SP, whose xy coordinate values are (Cx, Cy).
Here, the distances Diff1, Diff2, and Diff3 from the centers of gravity G of the illustrated candidate image frames E1, E2, and E3 to the determination reference point SP are calculated.
The determination reference point SP need not be set at the center of the picture.
For example, as shown in Fig. 6B, the determination reference point SP may be set at a position slightly above and to the left of the center. This is because, when the composition of a still image is considered, placing the main subject at a position other than the center often yields a better composition.
In this case too, the distance between the determination reference point SP and the center of gravity G of each candidate image frame is calculated by the same method as in Fig. 6A.
The determination reference point SP may be fixed at a position as in Fig. 6A or 6B, or may be designated arbitrarily by the user through a touch operation on the display screen. Several candidate points for the determination reference point may also be presented to the user on the screen of the display section 34, to be selected by the user. Furthermore, the control unit 30 may determine an optimum position in consideration of composition and the like, according to the image content, image analysis results, and so on, and set the determination reference point automatically. In other words, the determination reference point SP may be set as follows.
The determination reference point is set at a predetermined fixed position, such as the center of the picture or a position off the center.
The determination reference point is designated arbitrarily by the user.
Several candidate points are presented to the user, and the determination reference point is set by the user selecting one of them.
The control unit 30 determines an optimum position according to the image content and sets the determination reference point automatically and variably.
At the time point of each frame, the distance Diff(n) is obtained as the position state of candidate image frame E(n).
Fig. 7 shows how the calculated distances Diff1, Diff2, and Diff3 vary, assuming that the candidate image frames E1, E2, and E3 remain continuously present in each frame (FR1, FR2, ...) during a certain period. Each of the calculated distances Diff1, Diff2, and Diff3 varies along the time axis.
In the processing example of Fig. 8 described later, the degree of stable presence is defined as the cumulative time in the state of being close to the determination reference point SP. The distance threshold Trs-diff is therefore used to determine "close / not close".
The bottom of Fig. 7 shows the determination result at each time point of whether the distances Diff1, Diff2, and Diff3 are equal to or less than the distance threshold Trs-diff. If the distance Diff(n) is equal to or less than the distance threshold Trs-diff, this is regarded as close, namely "1". By accumulating the determination result "1" at each time point, the degree of stable presence in processing example 1 is obtained.
The period from the start to the end of the determination differs depending on the concrete processing example. In the processing example of Fig. 8 described later, the accumulated value of determination results "1" (distance equal to or less than the threshold Trs-diff) indicates the degree of stable presence as cumulative time up to that point, and the time at which some candidate image's cumulative time reaches a predetermined value is the time at which the determination is terminated.
For example, in the example of Fig. 7, candidate image frame E3 is continuously determined as "1"; when the accumulated value reaches a certain predetermined value, the determination is terminated and candidate image frame E3 is determined to be the main subject.
Continuity is not essential here. For example, the distance Diff2 of candidate image frame E2 in Fig. 7 is determined as "1" at some time points and "0" at others, but as long as the "1" cases are numerous in terms of cumulative time, and its cumulative time reaches the predetermined value earlier than that of the other candidate image frames, candidate image frame E2 can be determined to be the main subject.
An example in which continuity is used as a condition of the degree of stable presence is, however, also conceivable.
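The two variants just described — cumulative time versus continuous duration — can be sketched as follows. This is an illustrative sketch (function names invented), operating on the per-frame "close = 1 / not close = 0" flags shown at the bottom of Fig. 7:

```python
def cumulative_presence(flags):
    """Degree of stable presence as cumulative time: count every frame
    in which the 'close to the reference point' flag is 1."""
    return sum(flags)

def continuous_presence(flags):
    """Variant that uses continuity as the condition: length of the
    longest unbroken run of 1s."""
    best = run = 0
    for f in flags:
        run = run + 1 if f else 0
        best = max(best, run)
    return best

# An E2-like pattern: intermittently but frequently close
flags_e2 = [1, 0, 1, 1, 0, 1, 1, 1]
print(cumulative_presence(flags_e2))   # → 6
print(continuous_presence(flags_e2))   # → 3
```

With the cumulative measure, the intermittent E2-like pattern still scores highly (6 of 8 frames), which is exactly why, as noted above, continuity is not essential in the Fig. 8 processing example.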
A concrete example of the main subject determination processing in the control unit 30 will now be described with reference to Fig. 8.
First, in step F100, the variable TSF is set to 0 and the count values Cnt(n) are set to 0.
The variable TSF is a flag indicating whether the main subject has been set; TSF = "0" indicates that the main subject has not yet been determined. The count value Cnt(n) is a counter that accumulates the results of comparing the distance Diff with the distance threshold Trs-diff described above.
Here "n" denotes a natural number, and the count value Cnt(n) is the count value corresponding to the detected candidate image frame E(n). For example, when three candidate image frames E1, E2, and E3 are detected, Cnt1, Cnt2, and Cnt3 are used as count values.
Similarly, the distance Diff(n) collectively denotes the distances Diff1, Diff2, and Diff3 from the three candidate image frames E1, E2, and E3 to the determination reference point SP.
The candidate image frame E(n) denotes the candidate image frames E1, E2, E3, ..., but these are preferably distinguished from one another by subject across the frames. For example, in the example in which the candidate detection section 26 extracts faces, when persons A, B, and C are the subjects, the face image portions of persons A, B, and C serve as candidate image frames E1, E2, and E3 in common across the frames. If in some intermediate frame only person D is additionally a subject, the face image portion of person D serves as candidate image frame E4. Preferably, therefore, the candidate detection section 26 not only detects "faces" but also identifies individuals.
In step F101, the control unit 30 obtains the candidate image frame information of a certain frame from the candidate detection section 26. For example, position information and size information are obtained for each candidate image frame E(n).
In step F102, the control unit 30 calculates the coordinates of the center of gravity G for each candidate image frame E(n).
For example, suppose the coordinate values of the upper-left vertex of a square candidate image frame are known; these xy coordinate values are denoted (E(n)_x, E(n)_y). As illustrated in Fig. 6, the xy coordinate system takes the upper-left point of the screen plane as the origin O.
The width w of candidate image frame E(n) is denoted E(n)_w and its height h is denoted E(n)_h.
When the coordinate values of the center of gravity G of candidate image frame E(n) are denoted (E(n)_cx, E(n)_cy), the coordinates of the center of gravity G are obtained as follows.
E(n)_cx = E(n)_x + (E(n)_w)/2
E(n)_cy = E(n)_y + (E(n)_h)/2
In step F103, the control unit 30 calculates the distance Diff(n) from the center of gravity G of each candidate image frame E(n) to the determination reference point SP, using the coordinate values (Cx, Cy) of the determination reference point SP, as follows.
Diff(n) = √{(E(n)_cx − Cx)² + (E(n)_cy − Cy)²}
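The centroid and distance computations of steps F102 and F103 can be expressed directly in code. This is a sketch under the document's own definitions (function names are illustrative):

```python
import math

def center_of_gravity(x, y, w, h):
    """Step F102: center of gravity G of a candidate frame, computed from
    its upper-left vertex (x, y) and its size (w, h)."""
    return x + w / 2, y + h / 2

def diff_to_reference(gx, gy, Cx, Cy):
    """Step F103: Euclidean distance Diff(n) from the center of gravity G
    to the determination reference point SP at (Cx, Cy)."""
    return math.hypot(gx - Cx, gy - Cy)

# A 40x40 frame at (100, 80); reference point = picture center (160, 120)
gx, gy = center_of_gravity(100, 80, 40, 40)   # → (120.0, 100.0)
print(diff_to_reference(gx, gy, 160, 120))    # → 44.72135954999579
```

`math.hypot` computes the same √{Δx² + Δy²} as the formula above while avoiding intermediate overflow for large coordinates.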
In step F104, the variable TSF is checked. If the variable TSF is 0, the flow proceeds to step F105.
As described later, in the present embodiment the main subject determination processing is executed in accordance with the stable imaging state estimation processing, so step F104 may be omitted.
However, as will be illustrated later in Fig. 23, there is also a processing example in which the main subject determination processing is performed at all times. In that case, when TSF = 1, the processing from step F105 onward is skipped at step F104.
In steps F105, F106, and F107, the distance threshold Trs-diff is used to determine whether the distance Diff(n) indicates that the candidate image frame is close to the determination reference point SP.
The distance Diff(n) from each candidate image frame E(n) to the determination reference point SP is compared with the distance threshold Trs-diff. If Diff(n) < Trs-diff, the flag Flg(n) is set to 1 in step F106; if Diff(n) ≥ Trs-diff, the flag Flg(n) is set to 0 in step F107.
Then, in steps F108 and F109, the degree of stable presence of each candidate image frame E(n) is calculated. First, in step F108, it is checked whether the flag Flg(n) is 1. If Flg(n) = 1, the count value Cnt(n) is incremented in step F109; if Flg(n) = 0, the count value Cnt(n) is left unchanged.
The count value Cnt(n) is the value of the degree of stable presence as an accumulated value. In other words, it is a value representing the frequency with which candidate image frame E(n) is in the state of being "close" to the determination reference point SP.
Then, in steps F111, F112 and F113, the main subject is determined using the stable existence degree of each candidate image frame E(n).
In step F111, whether the count value Cnt(n) of each candidate image frame E(n) has reached the count threshold CTthr is checked.
If no Cnt(n) is equal to or greater than CTthr, that is, none of the count values Cnt(n) of the candidate image frames E(n) has reached the count threshold CTthr, then the variable TSF remains 0 in step F113, the determination is not concluded in step F114, and the flow returns to step F101. In this case, the processing of step F101 and the subsequent steps is carried out as described above on the candidate image frame information input for the next frame.
In step F114, if variable TSF = 0, the main subject determination is not yet complete and the determination processing continues; if variable TSF = 1, the main subject determination is complete.
If variable TSF = 1 is detected in step F104 described above, the determination is regarded as already settled, and the processing ends in that state.
Although detailed description is omitted, in parallel with the automatic main subject determination of this example, the user can also select the main subject by an operation such as touching the main subject on the screen of the display section 34, or by adjusting the subject to a predetermined position on the screen and half-pressing the shutter button. If the user performs such a designation operation while the processing of Fig. 8 is being executed, it is preferable to give priority to the user's operation. When the main subject is set by such a manual operation, the variable TSF is set to 1. In this case, the processing of Fig. 8 can be terminated through the determinations in steps F104 and F114.
Since the main subject determination based on the stable existence degree takes a certain length of time, the determination is not concluded in step F114 unless candidate image frame information has been processed over a number of frames, as described above; the flow therefore returns to step F101 and the processing is repeated.
Here, for example, suppose the situation illustrated in Fig. 7, in which a certain candidate image frame E3 is, not necessarily continuously but with high frequency over multiple frames, present in the captured images at a position close to the determination reference point SP. In that case, the chance of incrementing the count value Cnt3 of candidate image frame E3 arises many times in step F109, so Cnt3 rises faster than the count values Cnt1 and Cnt2 and reaches the count threshold CTthr first.
In this case, the control unit 30 advances the processing from step F111 to step F112.
In step F112, the control unit 30 determines the candidate image frame E(n) whose count value Cnt(n) has reached the count threshold CTthr to be the main subject, and performs main subject setting. The variable TSF is also set to 1.
In this case, the determination is concluded in step F114. That is, for example, candidate image frame E3 is set as the main subject, and the main subject determination processing of Fig. 8 is complete.
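The flow of steps F101 to F114 described above can be sketched as a short loop. This is a minimal illustration, not the patent's implementation; the frame format, the reference point SP, and the threshold values are assumed for this example.

```python
import math

# Assumed parameters (names follow the description: Trs-diff, CTthr, SP)
TRS_DIFF = 40.0          # distance threshold Trs-diff
CT_THR = 5               # count threshold CTthr
SP = (160.0, 120.0)      # determination reference point, e.g. picture center

def determine_main_subject(frames_per_step):
    """frames_per_step: iterable of dicts {n: (cx, cy)} giving each candidate
    image frame E(n)'s center of gravity per captured frame. Returns the index
    n of the candidate whose stable existence degree Cnt(n) first reaches
    CTthr, or None if the input runs out first."""
    cnt = {}                                  # Cnt(n): accumulated "close" count
    for candidates in frames_per_step:        # F101: candidate info for next frame
        for n, (cx, cy) in candidates.items():
            diff = math.hypot(cx - SP[0], cy - SP[1])   # F103: Diff(n)
            if diff < TRS_DIFF:                         # F105-F107: Flg(n)
                cnt[n] = cnt.get(n, 0) + 1              # F108-F109: increment Cnt(n)
            if cnt.get(n, 0) >= CT_THR:                 # F111: reached CTthr?
                return n                                # F112: main subject set
    return None                               # determination not concluded
```

With a candidate that stays near SP over several frames, its count reaches CTthr first, mirroring the E3 example above.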
In this processing example, the processing is repeated until the variable TSF becomes 1, but in practice a predetermined time limit is preferably set. That is, if no main subject can be determined even after a predetermined time has elapsed from the start of the processing of Fig. 8, the processing is terminated on the ground that there is no main subject.
According to the above processing example of Fig. 8, the user holds the imaging device 10 so that the subject he or she mainly wants to capture comes as close as possible to the determination reference point SP, such as the picture center, and that subject is then automatically set as the main subject.
In particular, this processing example determines the stable existence degree from the accumulated time of being close to the determination reference point SP. For a moving subject, or a fast-moving subject such as an animal, it is difficult even for an experienced photographer to keep the subject intended as the main subject at the picture center for any length of time, say several seconds. Depending on the photographer's skill, severe camera shake may also make it impossible to keep the intended subject at the picture center. Even in such cases, using accumulated time allows the main subject to be determined relatively quickly.
This approach is therefore also suitable when the target is a fast-moving subject, or for relatively inexperienced users. Moreover, since the main subject determination processing does not necessarily run for a fixed time but ends as soon as the main subject is determined, there is the advantage that, depending on the subject and the photographer's skill, the main subject determination completes quickly.
The concrete form of the main subject determination processing can be varied. For example, as the stable existence degree of each candidate image frame, information on the duration for which the condition of being within a predetermined distance threshold of the determination reference point is continuously satisfied may be calculated. If whether a subject is stably present is estimated from this duration, then for a slowly moving target subject the target can easily be kept at the picture center or the like, so the likelihood of accurately setting the main subject the user intends is high. Also, a skilled photographer can keep the intended main subject at a position close to the determination reference point SP, so the photographer's intended subject is highly likely to be determined as the main subject. In other words, depending on the photographer's skill and on the subject, the probability that the subject the user intends is determined as the main subject can be further increased.
A weighting can also be applied in calculating the stable existence degree. For example, a subject close to the determination reference point SP is valued more highly the later it occurs within the execution period of the main subject determination processing.
Generally, when a photographer aims at a subject while holding the camera, the subject intended as the main subject is not initially captured at a desired position such as the picture center, and the photographer gradually adjusts the orientation of the camera. Taking this into account, the subject the photographer has in mind as the "main subject" is gradually brought to the picture center as time passes.
Therefore, weighting a subject close to the determination reference point SP more heavily as time passes during the main subject determination processing increases the probability of a main subject determination that matches the photographer's intention.
In addition to the condition that the distance Diff(n) is equal to or less than the distance threshold Trs-diff, other conditions may be added in calculating the stable existence degree.
For example, a condition that the subject distance be within a predetermined range, that the size be within a predetermined range, that the subject be a particular type of image, and so on, may be added.
A processing example is also conceivable in which a specific main subject determination period is set, and the candidate image whose stable existence degree has the highest value within that period is determined to be the main subject.
In the above processing example, the positional relationship with a set determination reference point is used as the positional state of a candidate image frame, but a positional relationship with a determination reference region may be used instead. For example, a square, circular or other region is set at the picture center or the like and used as the determination reference region. The positional relationship of each candidate image frame E(n) with the determination reference region can include, for example:
whether the center of gravity is included in the determination reference region;
whether the whole of candidate image frame E(n) is included in the determination reference region;
whether at least a part of candidate image frame E(n) is included in the determination reference region; and
whether the distance from the outer edge of the determination reference region is within a predetermined range.
The stable existence degree can be obtained under any of these positional-relationship conditions.
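For a rectangular determination reference region, each of the positional-relationship conditions above reduces to simple coordinate comparisons. The following is a hedged sketch under the assumption that both the region and the candidate frames are given as (x, y, w, h) rectangles; the function names are illustrative, not from the patent.

```python
def contains_point(region, px, py):
    """Whether point (px, py), e.g. a frame's center of gravity, lies inside
    a rectangular determination reference region given as (x, y, w, h)."""
    x, y, w, h = region
    return x <= px <= x + w and y <= py <= y + h

def contains_frame(region, frame):
    """Whether the whole candidate image frame lies inside the region
    (both opposite corners of the frame are inside)."""
    fx, fy, fw, fh = frame
    return (contains_point(region, fx, fy)
            and contains_point(region, fx + fw, fy + fh))

def overlaps_frame(region, frame):
    """Whether at least a part of the frame lies inside the region
    (standard axis-aligned rectangle overlap test)."""
    rx, ry, rw, rh = region
    fx, fy, fw, fh = frame
    return fx < rx + rw and rx < fx + fw and fy < ry + rh and ry < fy + fh
```

Any one of these predicates can replace the Diff(n) < Trs-diff test of steps F105 to F107 when accumulating the stable existence degree.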
Alternatively, as the stable existence degree of each candidate image frame, a positional state such as the average distance to the determination reference point may be calculated. A small average distance is an index indicating that the candidate image frame is close to the determination reference point with high temporal frequency. For example, "the average distance is small" has the same significance as "the accumulated time is long" in the processing example above. A candidate image frame that has the shortest average distance and satisfies the condition of being within a predetermined threshold can then be determined to be the main subject.
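The average-distance variant can be sketched as follows. This is illustrative only; the reference point SP and the threshold on the average distance are assumed values, and the history format is an assumption for this example.

```python
import math

SP = (160.0, 120.0)      # assumed determination reference point
AVG_THR = 50.0           # assumed threshold on the average distance

def main_subject_by_average_distance(history):
    """history: {n: [(cx, cy), ...]} listing each candidate E(n)'s center of
    gravity over the observed frames. Picks the candidate with the shortest
    average distance to SP, provided that average is within AVG_THR."""
    averages = {
        n: sum(math.hypot(cx - SP[0], cy - SP[1]) for cx, cy in pts) / len(pts)
        for n, pts in history.items()
    }
    n_best = min(averages, key=averages.get)
    return n_best if averages[n_best] < AVG_THR else None
```

Unlike the accumulated-count version, this variant needs the whole observation window before deciding, which fits the fixed determination-period example mentioned above.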
In the present embodiment, the main subject is determined in the same manner as in the examples above; here, the purpose of main subject determination in the imaging device 10 and related matters are described.
First, examples of using the main subject determination result are described.
Main subject determination is carried out, for example, while the user is waiting for the shutter timing; the control unit 30 can determine the main subject automatically and then perform the following kinds of processing.
Tracking processing
The set main subject is tracked in each captured frame. For example, the main subject is indicated to the user while the preview image is displayed, for use when the user adjusts the angle of view, for example while deciding on the subject with the camera held in hand.
As a way of indicating the main subject, the frame of the main subject can be displayed highlighted while the preview image is displayed on the display section 34. The highlighting or the like may be performed for a specific period immediately after the determination, or for as long as the main subject is present in the preview image.
Focusing
Autofocus control is performed on the main subject. Combined with the tracking processing, the focus is adjusted so as to follow the main subject even if it moves around.
Exposure adjustment
Automatic exposure adjustment is performed based on the brightness of the main subject.
Directivity adjustment
When sound is collected with a microphone during moving-image shooting, the directivity is adjusted according to the direction of the main subject within the angle of view.
The main subject can also be used in various kinds of signal processing applied to the captured image signal.
Image effect processing
Image processing such as image quality adjustment, noise reduction and skin-tone adjustment is applied only to the region of the main subject in each captured frame.
Alternatively, image effects such as mosaic processing, blurring or fill-in processing may be applied to the regions outside the region of the main subject.
Image editing processing
Editing processing such as adding a frame or cropping can be applied to a captured or recorded image.
For example, cropping or enlargement of a partial region of a frame that includes the main subject can be performed.
The periphery of the image can also be cut so that the main subject is placed at the center of the captured image data, thereby adjusting the composition.
These are only examples; various other kinds of processing by application programs or by automatic adjustment functions of the imaging device may use the set main subject.
The effects obtained by carrying out the main subject determination processing are as follows. The operation of designating the main subject is itself difficult while the photographer holds the imaging device 10 and aims at the subject. Determining the main subject automatically solves this problem and has the effect of reducing the user's stress.
Furthermore, the imaging devices 10 that users carry and use, such as ordinary digital still cameras and cameras built into mobile phones, have small display sections 34, so it is difficult for the user to designate the main subject accurately on the screen. Automatic determination as in the present embodiment also avoids erroneous designation.
In addition, since the imaging device 10 can be used with the natural feeling of simply holding it and capturing the main subject, imaging opportunities and corresponding use cases increase, and an easy-to-use camera can be provided to the user.
From the above, the imaging device 10 of the present embodiment, which performs main subject determination automatically, is particularly suitable as a compact camera.
<4. Stable imaging state estimation process>
Here, consider when the above main subject determination processing should be carried out.
For example, the main subject determination processing could run at all times while the imaging device 10 is powered on and the preview image is displayed on the display section 34, but carrying it out indiscriminately is not necessarily meaningful.
For example, when the user has no intention of imaging, such as when the user merely carries around the powered-on imaging device 10, there is little point in carrying out the main subject determination processing or in using its result.
Specifically, it is appropriate to carry out the main subject determination processing when the user is holding the imaging device 10 and is about to perform imaging.
Therefore, in the imaging device of the present embodiment, the stable imaging state estimation part 30b estimates whether a stable imaging state has occurred, for example, a state in which the user is holding the camera steady. Only when a stable imaging state is estimated to have occurred does the main subject determination part 30a carry out the above main subject determination processing and output the main subject information that is its determination result.
Because of the stable imaging state estimation process, the main subject determination processing can thus be carried out at a meaningful time.
In other words, since the main subject determination processing is gated by the stable imaging state estimation process, it is carried out only when its result will not be meaningless, for example, when the photographer of the imaging device 10 holds the camera with the intention of photographing, or is looking for a subject. In particular, since the above main subject determination processing involves a considerable amount of processing while the stable imaging state estimation process described below has a small processing load, effects of reducing the processing load of the control unit 30 and of reducing power consumption are obtained.
In short, the user's convenience is improved, the processing load of the control unit 30 is reduced, and the timing of the automatic main subject determination processing is made appropriate.
[4-1: Execution timing of the stable imaging state estimation process]
As described above, the stable imaging state estimation process is a process for determining the execution timing of the main subject determination processing. The execution timing of the stable imaging state estimation process itself is first described with reference to Figs. 9 and 10.
Fig. 9 illustrates a camera mode MD1 and a reproduction mode MD2 as operation modes of the imaging device 10.
The camera mode MD1 is a mode for performing imaging and, essentially, recording still images or moving images. The reproduction mode MD2 is mainly a mode for reproducing and displaying, on the display section 34, still image data or moving image data that has been captured and recorded, for example in the recording unit 15.
When the imaging device 10 is powered on, it starts in the camera mode MD1 or the reproduction mode MD2. The user can also switch between the camera mode MD1 and the reproduction mode MD2 at any time according to the intended use of the imaging device 10.
In the example of Fig. 9, the stable imaging state estimation process ST1 (step F1000 of Fig. 2A) is started when the camera mode MD1 is started.
Concrete examples of the stable imaging state estimation process are described later, but as shown in Fig. 2A, when a stable imaging state is estimated to have occurred by the stable imaging state estimation process, the main subject determination processing is carried out (F1001-F1004), and the main subject information obtained as its determination result is used.
That is, after the main subject determination is carried out, processing that uses the main subject information as the determination result is performed. In Fig. 9, this is shown as the main subject information use processing ST2.
For example, the processing described above in the examples of using main subject information is carried out.
Afterwards, there are cases where the main subject information is no longer used, or no longer needs to be used.
For example, while the main subject is present in the captured image, the tracking processing, focusing, exposure adjustment, directivity adjustment, image effect processing, image editing processing and the like described in the examples of using main subject information can be performed; but if the main subject leaves the screen and is no longer present in the captured image, the main subject information use processing cannot be performed.
Also, when the user performs an operation to change the main subject, an operation to redo the determination, or an operation to stop the main subject information use processing, the use processing based on the current main subject information is no longer necessary.
When the main subject information use processing based on the current main subject information becomes unnecessary or impossible (ST3), the stable imaging state estimation process ST1 is carried out again.
This operation transition, enclosed by the chain-dotted line, is the operation transition within the camera mode MD1. When the mode is switched from the camera mode MD1 to the reproduction mode MD2, neither the stable imaging state estimation process nor the main subject determination processing is carried out.
In other words, the example of Fig. 9 is one in which the stable imaging state estimation part 30b carries out the stable imaging state estimation process in response to entering the camera mode MD1, the expected state in which the main subject determination processing works effectively.
It is also an example in which, even after the result of the main subject determination processing can no longer be used or no longer needs to be used, the stable imaging state estimation part 30b still carries out the stable imaging state estimation process during the camera mode MD1.
The example of Fig. 10A is one in which, within the camera mode MD1, the user is allowed to select a mode for determining the main subject.
For example, a main subject determination mode MD11 is selected by a menu operation or the like, as a subordinate mode of the camera mode MD1. Fig. 10B illustrates a display example in which the user is allowed to select the main subject determination mode on the display section 34.
When the user performs an operation of starting the main subject determination mode, for example by a selection operation on the screen, the control unit 30 starts the main subject determination mode MD11.
When the main subject determination mode MD11 is started, the control unit 30 starts the stable imaging state estimation process ST1.
Subsequently, as described with reference to Fig. 2A, when a stable imaging state is estimated to have occurred, the main subject determination processing is carried out and the main subject information obtained as its determination result is used.
First, the main subject determination is carried out, and then the main subject information use processing ST2, which uses the main subject information as the determination result, is performed; for example, the processing described in the examples of using main subject information. Afterwards, the main subject information may become unusable or no longer needed; as in the example above, when the main subject information use processing based on the current main subject information becomes unnecessary or impossible, the stable imaging state estimation process ST1 is carried out again.
This operation transition, shown by the chain-dotted line, is the operation transition while the main subject determination mode MD11 is active.
When the main subject determination mode MD11 is turned off, even while remaining in the camera mode MD1, or when the mode is switched to the reproduction mode MD2, neither the stable imaging state estimation process nor the main subject determination processing is carried out.
In other words, the example of Fig. 10 is one in which the stable imaging state estimation part 30b carries out the stable imaging state estimation process in response to entering the main subject determination mode MD11, the expected state in which the main subject determination processing works effectively and, in particular, the user also requests the main subject determination.
It is also an example in which, even after the result of the main subject determination processing can no longer be used or no longer needs to be used, the stable imaging state estimation part 30b carries out the stable imaging state estimation process as long as the main subject determination mode MD11 is on.
As in the examples of Figs. 9 and 10, the stable imaging state estimation process ST1 is carried out during the camera mode MD1, or during the main subject determination mode MD11 within the camera mode MD1.
The main subject determination processing is then carried out according to the estimation result of the stable imaging state estimation process ST1.
[4-2: Processing example 1]
Concrete examples of the stable imaging state estimation process are described below.
Fig. 11 illustrates processing carried out by the stable imaging state estimation part 30b as processing example 1 of the stable imaging state estimation process.
The processing of Fig. 11 is carried out when the camera mode MD1 is started as in Fig. 9, or when the main subject determination mode MD11 is started as in Fig. 10. The same applies to processing examples 2 to 5 described later, and this is not repeated in their descriptions.
In step F201, the control unit 30 branches the processing depending on whether the stable imaging state estimation process is currently under way.
At the start, the stable imaging state estimation process has not yet been carried out, so the control unit 30 performs the processing of step F202.
First, an elapsed time counter CTt is reset and counting is started. Also, an estimation flag Fst, which indicates whether a stable imaging state is estimated to have occurred, is initialized to Fst = 0.
The control unit 30 repeats the processing of Fig. 2A; step F1001 of Fig. 2A is a step of determining whether the estimation flag Fst is 1. While the estimation flag Fst = 0, the processing of step F1000 of Fig. 2A, that is, the processing of Fig. 11, is carried out repeatedly.
Once the stable imaging state estimation process has started, the processing of Fig. 11 proceeds from step F201 to step F203.
In step F203, the control unit 30 increments the elapsed time counter CTt.
In step F204, the control unit 30 compares the value of the elapsed time counter CTt with a predetermined estimation time thTM.
If the value of the elapsed time counter CTt does not exceed the predetermined time thTM, the processing ends as it is. In other words, the estimation flag Fst = 0 is maintained.
As time passes, CTt > thTM is determined at some point in step F204. In this case, the control unit 30 proceeds to step F205, regards a stable imaging state as estimated to have occurred, and sets the estimation flag Fst = 1.
When the estimation flag Fst = 1 as described above, the control unit 30 starts the main subject determination processing in step F1001 of Fig. 2A, and proceeds to steps F1002 to F1004.
In the stable imaging state estimation of Fig. 11 described above, a stable imaging state is estimated to have occurred when the predetermined time thTM has elapsed from the transition to the predetermined mode state in which the main subject determination processing works effectively, that is, from the camera mode MD1 in Fig. 9, or from the main subject determination mode MD11 within the camera mode MD1 in Fig. 10.
Usually, the user performs the operation of turning on the power or the camera mode MD1 and then takes the posture of holding the imaging device 10 for photographing. Similarly, after turning on the main subject determination mode MD11, the user can be expected to take the photographing posture.
Therefore, when a predetermined time has elapsed from the transition to such a mode, it can be estimated that the user is holding the imaging device 10 in preparation for photographing. Accordingly, in processing example 1, the stable imaging state estimation process is, most simply, a count of the predetermined time thTM.
As described above, since the stable imaging state estimation process is carried out and the main subject determination processing is carried out accordingly, the user does not need to be aware of starting the main subject determination processing; it is started at an appropriate time without imposing any operation burden. In addition, the processing load on the control unit 30 for the stable imaging state estimation process is very small.
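The elapsed-time estimation of Fig. 11 can be sketched as a small state machine driven once per repetition of step F1000. This is a hedged illustration only; the tick granularity and the thTM value are assumptions, and the class name is not from the patent.

```python
class ElapsedTimeEstimator:
    """Processing example 1: estimate a stable imaging state once a
    predetermined time thTM has elapsed since the mode became active
    (Fig. 11, steps F201-F205)."""

    def __init__(self, th_tm):
        self.th_tm = th_tm   # estimation predetermined time thTM, in ticks
        self.ctt = None      # elapsed time counter CTt (None: not yet started)
        self.fst = 0         # estimation flag Fst

    def tick(self):
        """Called repeatedly as step F1000 of Fig. 2A while Fst = 0."""
        if self.ctt is None:        # F201/F202: first call resets CTt and Fst
            self.ctt, self.fst = 0, 0
            return self.fst
        self.ctt += 1               # F203: increment CTt
        if self.ctt > self.th_tm:   # F204: compare with thTM
            self.fst = 1            # F205: stable imaging state estimated
        return self.fst
```

Once `tick()` returns 1, the caller corresponds to step F1001 of Fig. 2A deciding to start the main subject determination processing.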
[4-3: Processing example 2]
Processing example 2 is described below with reference to Figs. 12 and 13. Processing example 2 estimates the stable imaging state from the detection result of a change in the field of view. Specifically, it is an example in which the output of a sensor that detects the motion of the imaging device 10 is used to detect the change in the field of view.
The sensor is, for example, a gyro sensor that detects shaking applied to the imaging device 10, such as camera shake while the user holds the imaging device 10, or an acceleration sensor that detects the posture.
For example, the sensor unit 14 of Fig. 3 includes a gyro sensor or an acceleration sensor, and the control unit 30 can read the output of the sensor.
Fig. 12A illustrates processing carried out by the stable imaging state estimation part 30b as processing example 2 of the stable imaging state estimation process.
In step F300, the control unit 30 branches the processing depending on whether the stable imaging state estimation process is currently under way.
At the start, the stable imaging state estimation process has not yet been carried out, so in step F307 the control unit 30 resets a counter CTst for measuring the stable time (hereinafter called the stable time measurement counter CTst) to CTst = 0. Also, the estimation flag Fst, which indicates whether a stable imaging state is estimated to have occurred, is initialized to Fst = 0.
As in processing example 1, while the estimation flag Fst = 0, the processing of step F1000 of Fig. 2A, that is, the processing of Fig. 12A, is carried out repeatedly.
Once the stable imaging state estimation process has started, the processing of Fig. 12A proceeds from step F300 to step F301.
In step F301, the control unit 30 acquires the sensor input from the sensor unit 14, for example, the detected value of the gyro sensor or the acceleration sensor.
If the detected value of the gyro sensor is acquired in step F301, the control unit 30 determines in step F302 whether the detected value is within a predetermined level range. The predetermined level range, illustrated in Fig. 13, is a level range within which the camera shake level is estimated to be small, that is, within which the imaging device 10 is stable.
If the detected value of the sensor is within the predetermined level range, the control unit 30 proceeds to step F303 and increments the stable time measurement counter CTst.
If the detected value of the sensor is not within the predetermined level range, the control unit 30 proceeds to step F304 and decrements the stable time measurement counter CTst.
In step F305, the control unit 30 compares the count value of the counter CTst with the predetermined estimation time thTM.
If the value of the stable time measurement counter CTst does not exceed the predetermined time thTM, the processing ends as it is. In other words, the estimation flag Fst = 0 is maintained.
If CTst > thTM is determined in step F305, the control unit 30 proceeds to step F306, regards a stable imaging state as estimated to have occurred, and sets the estimation flag Fst = 1.
When the estimation flag Fst = 1 as described above, the control unit 30 starts the main subject determination processing in step F1001 of Fig. 2A, and proceeds to steps F1002 to F1004.
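Steps F300 to F306 can be sketched in the same style. This is a hedged sketch: the level range and thTM are assumed values, sensor readings are fed in as plain numbers, and flooring the decremented counter at zero is an assumption of this example rather than something stated in the description.

```python
class SensorStableEstimator:
    """Processing example 2: estimate a stable imaging state when the sensor
    output (gyro or accelerometer) stays within a level range [l2, l1] long
    enough (Fig. 12A). CTst is incremented while the detected value is within
    the range and decremented otherwise."""

    def __init__(self, l1, l2, th_tm):
        self.l1, self.l2 = l1, l2   # predetermined level range: L1 (upper), L2 (lower)
        self.th_tm = th_tm          # estimation predetermined time thTM
        self.ctst = 0               # stable time measurement counter CTst
        self.fst = 0                # estimation flag Fst

    def feed(self, value):
        """Step F301: one sensor detected value per call."""
        if self.l2 <= value <= self.l1:          # F302: within level range?
            self.ctst += 1                       # F303: increment CTst
        else:
            self.ctst = max(0, self.ctst - 1)    # F304: decrement (floored at 0 here)
        if self.ctst > self.th_tm:               # F305: compare with thTM
            self.fst = 1                         # F306: stable imaging state estimated
        return self.fst
```

The decrement on out-of-range readings means brief shakes delay the estimate rather than merely pausing it, matching the intent that stability must persist for a while.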
Below with reference to Figure 13, the example of operation based on this processing is described.
Figure 13 A graphic extension control unit 30 utilizes the situation of the detection output of gyrosensor.The longitudinal axis represents the detected value of gyrosensor, and transverse axis represents the time.Each point of the waveform dotting is illustrated in step F 301, at the detected value of each time point Input Control Element 30.
In step F 302, for the predeterminated level scope of described judgement, be assumed that between the horizontal L1 of detected value and the scope between L2.
In addition, also illustrate the value of estimating flag F st.
In addition, suppose at time point t0, start to be stable into picture state estimation process.
When the shake applied to the imaging device 10 is large, for example due to the user's camera shake, the detected value of the gyro sensor has a high level.
Conversely, if the detected value stays within the predetermined level range, a situation in which camera shake or the like is small can be estimated, that is, a situation in which the photographer is holding the imaging device 10 steady to some extent.
This example shows a case where the shake is relatively large until time point t1. For example, a situation is expected in which the user is not yet holding the imaging device 10 steady: the user has just picked up the imaging device, or is still roughly searching for a subject.
In the processing of Fig. 12A, the stable time measurement counter CTst is incremented if the detected value of the sensor is within the predetermined level range, and decremented if the detected value is outside the predetermined level range.
Therefore, if the shake applied to the imaging device 10 decreases after time point t1, for example, the value of the stable time measurement counter CTst exceeds the predetermined time thTM at some point. Suppose that CTst>thTM occurs at time point t2, for example.
If CTst>thTM is determined in the above-described step F305, the estimation flag Fst is set to 1 in step F306. As shown in Fig. 13A, the estimation flag Fst is set to 1 at time point t2.
As described above, a change in the field of view is detected by detecting the shake applied to the imaging device 10. If a period in which the shake applied to the imaging device 10 remains small continues to some extent, it can be determined that the change in the field of view is small and the field of view is therefore stable. In this case, the occurrence of the stable imaging state is estimated.
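The counter logic of Fig. 12A can be summarized in a short sketch. This is an illustrative model only, not the device's firmware: the names CTst, Fst, thTM, and the level bounds L1/L2 follow the text, while the numeric values and the function interface are assumptions.

```python
# Sketch of the Fig. 12A loop: each tick, a gyro reading inside the level
# range [L1, L2] increments the stable time measurement counter CTst; a
# reading outside it decrements CTst; once CTst exceeds the predetermined
# time thTM, the estimation flag Fst is set to 1.
def estimate_stable_state(readings, lo=-1.0, hi=1.0, th_tm=5):
    ct_st = 0       # stable time measurement counter CTst
    fst = 0         # estimation flag Fst
    t_flag = None   # time index at which Fst was first set
    for t, value in enumerate(readings):
        if lo <= value <= hi:            # detected value within level range
            ct_st += 1
        else:                            # shake too large: step back
            ct_st = max(0, ct_st - 1)
        if ct_st > th_tm and fst == 0:   # CTst > thTM: stable imaging state
            fst = 1
            t_flag = t
    return fst, t_flag

# Large shake for three ticks (as before t1), then a steady hand: the flag
# is set once CTst exceeds thTM (as at t2).
readings = [5.0, -4.0, 6.0, 0.2, -0.1, 0.3, 0.0, 0.1, -0.2, 0.1, 0.0]
print(estimate_stable_state(readings))  # -> (1, 8)
```

With a purely shaky input the counter never builds up, so the flag stays 0 and the main subject determination would not be started.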
Next, Fig. 13B illustrates a case where the control unit 30 uses the detection output of the acceleration sensor. The vertical axis represents the detected value of the acceleration sensor, and the horizontal axis represents time. Each point on the waveform drawn with a dotted line indicates a detected value input to the control unit 30 at each time point in step F301.
The predetermined level range used for the determination in step F302 is assumed to be the range between detected-value levels L1 and L2. The value of the estimation flag Fst is also illustrated. It is assumed that the stable imaging state estimation process starts at time point t0.
The detected value of the acceleration sensor varies with changes in the posture of the imaging device 10. The larger the motion of the imaging device 10, the larger the change in the detected value. For example, the detected value changes greatly when the user has just picked up the imaging device 10, or is roughly searching for a subject while holding the imaging device 10.
Accordingly, if a period in which the detected value stays within the predetermined level range continues to some extent, a situation can be estimated in which the photographer has narrowed down the subject direction to some extent and is holding the imaging device 10 steady.
This example shows a case where relatively large motion exists until time point t1.
On the other hand, if the user narrows down the subject direction to some extent and holds the imaging device 10 steady, the motion of the imaging device 10 decreases and the posture of the imaging device 10 becomes stable. For example, from time point t1, the posture of the imaging device 10 is stable.
In the processing of Fig. 12A, the stable time measurement counter CTst is incremented if the detected value of the sensor is within the predetermined level range, and decremented if the detected value is outside the predetermined level range.
Therefore, if the posture of the imaging device 10 is kept stable after time point t1, for example, the value of the stable time measurement counter CTst exceeds the predetermined time thTM at some point. Suppose that CTst>thTM occurs at time point t2, for example.
If CTst>thTM is determined in the above-described step F305, the estimation flag Fst is set to 1 in step F306. In Fig. 13B, the estimation flag Fst is set to 1 at time point t2.
As described above, a change in the field of view is detected by detecting the motion of the imaging device 10. If a state in which the posture of the imaging device 10 is fixed continues to some extent, it can be determined that the change in the field of view is very small and the field of view is therefore stable. In this case, the occurrence of the stable imaging state is estimated.
In particular, by determining the shake or the posture of the imaging device 10, the accuracy of estimating the occurrence of the stable imaging state can be improved.
In the processing example 2 of Fig. 12A, the detected value of either the gyro sensor or the acceleration sensor may be used, or both sensors may be used.
For example, the processing of Fig. 12A for the detected value of the gyro sensor and the processing of Fig. 12A for the detected value of the acceleration sensor may be performed in parallel, and when the estimation flag Fst is set to 1 in either processing, the occurrence of the stable imaging state may be estimated (OR condition).
Alternatively, the determination in step F302 may be a determination of whether both the detected value of the gyro sensor and the detected value of the acceleration sensor are within their predetermined level ranges (AND condition).
Furthermore, the detected values of other sensors, such as an orientation sensor and a geomagnetic sensor, may be used alone or in combination. In short, a situation in which the motion or posture change of the imaging device 10 is sufficiently small can be estimated as the stable imaging state. When a plurality of sensors are used in combination, the above-described OR condition or AND condition may naturally be adopted.
In addition, other ways of handling the stable time measurement counter CTst are possible.
As shown by the dotted line in Fig. 12A, if the detected value is determined not to be within the predetermined level range in step F302, the stable time measurement counter CTst may be left without being decremented, and the flow may proceed directly to step F305. In other words, this is an example in which the count value is not changed.
Thereby, even if some shake, motion, or the like occurs, a state in which shake is generally small and the posture is stable can still easily be estimated as the stable imaging state. When users prone to large camera shake, such as inexperienced users, are anticipated, this allows the main subject determination processing to start easily.
Alternatively, the processing in step F304 may be replaced by step F304A shown in Fig. 12B.
In other words, this is an example in which the value of the stable time measurement counter CTst is reset to 0 if the detected value is determined not to be within the predetermined level range in step F302. Thereby, the stable imaging state is estimated only after a state free of shake and motion has continued for the predetermined time thTM, so that the accuracy of the estimation can be improved. This is preferable for users accustomed to photographing, since the main subject determination processing is not started rashly.
Conversely, the decrement in step F304 of Fig. 12A is suitable for cases targeting general users, because the stable time measurement counter CTst is not reset even if a momentary shake or posture change occurs, and a more or less stable state can thus still be detected.
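The three counter policies discussed above differ only in the out-of-range branch. A hypothetical helper (names and interface are illustrative, not from the patent) makes the comparison concrete:

```python
# Counter-update policies for the stable time measurement counter CTst:
# "decrement" (step F304), "hold" (dotted-line path: value unchanged),
# and "reset" (step F304A: back to zero).
def update_ctst(ct_st, in_range, policy="decrement"):
    if in_range:
        return ct_st + 1
    if policy == "decrement":
        return max(0, ct_st - 1)   # tolerate a momentary shake (Fig. 12A)
    if policy == "hold":
        return ct_st               # count value left unchanged
    if policy == "reset":
        return 0                   # strict: require thTM contiguous ticks
    raise ValueError(policy)

# One out-of-range tick after four in-range ticks, under each policy:
print(update_ctst(4, False, "decrement"))  # -> 3
print(update_ctst(4, False, "hold"))       # -> 4
print(update_ctst(4, False, "reset"))      # -> 0
```

As the text notes, "reset" favors experienced users (highest confidence before the flag is set), while "decrement" and "hold" let the flag be reached despite momentary disturbances.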
[4-4: Processing example 3]
Processing example 3 will be described below with reference to Figs. 14 and 15. Processing example 3 is also an example in which the occurrence of the stable imaging state is estimated according to the detection result of a change in the field of view. Here, this example is one in which the detection result of camera work causing the change in the field of view is used in particular for the estimation of the stable imaging state.
The camera work causing the change in the field of view described here includes panning, tilting, zooming, and the like, and covers both automatic operations and operations based on the user's operation.
The camera work is detected as follows.
Panning, which is a movement of the field of view in the horizontal direction, can be detected using, for example, a gyro sensor or a horizontal-direction acceleration sensor. These sensors may be arranged in the sensor unit 14. In addition, in a case where the imaging device 10 is mounted on a pan-tilt head and is panned by the operation of the pan motor of the pan-tilt head, a control signal applied to its panning mechanism can be used. Furthermore, if a pan position detection sensor or the like is provided on the pan-tilt head, the detected value of that sensor can be used. In this manner, the control unit 30 can detect the execution/non-execution of panning or the amount of panning movement.
In addition, the digital signal processing unit 20, for example the motion vector detection section 27, or the control unit 30 may analyze the image data to detect motion in the panning direction, thereby detecting the execution/non-execution of panning or the amount of panning movement.
Tilting, which is a movement of the field of view in the vertical direction, can be detected using, for example, a gyro sensor or a vertical-direction acceleration sensor. These sensors may be arranged in the sensor unit 14. In addition, in a case where the imaging device 10 is mounted on a pan-tilt head and is tilted by the operation of the tilt motor of the pan-tilt head, a control signal applied to its tilting mechanism can be used. Furthermore, if a tilt position detection sensor or the like is provided on the pan-tilt head, the detected value of that sensor can be used. In this manner, the control unit 30 can detect the execution/non-execution of tilting or the amount of tilting movement.
In addition, the digital signal processing unit 20, for example the motion vector detection section 27, or the control unit 30 may analyze the image data to detect motion of the captured image in the tilting direction, thereby detecting the execution/non-execution of tilting or the amount of tilting movement.
As for zooming, in a case where zooming is performed by the user's zoom operation or by automatic control, the control unit 30 itself can detect the zoom operation from the zoom control signal that the control unit 30 sends to the optical system drive unit 13.
In addition, if a zoom lens position sensor is provided in the sensor unit 14, the control unit 30 can detect the motion of the zoom lens, that is, the zoom operation, from the detected value of that sensor.
In this manner, the control unit 30 can detect the execution/non-execution of zooming or the amount of zoom movement.
In addition, the digital signal processing unit 20 or the control unit 30 may analyze the image data to detect a change in the subject region or a change in the angle of view, thereby detecting the zoom operation.
Fig. 14A illustrates the processing performed by the stable imaging state estimation unit 30b in processing example 3 of the stable imaging state estimation process.
In step F400, the control unit 30 branches the processing according to whether the stable imaging state estimation process is currently in progress.
When the stable imaging state estimation process starts, the stable imaging state estimation process has not yet been performed, so in step F407 the control unit 30 resets the stable time measurement counter CTst to 0. In addition, the estimation flag Fst, which indicates whether the occurrence of the stable imaging state has been estimated, is initialized to Fst=0.
In the same manner as in processing examples 1 and 2 above, while the estimation flag Fst=0, step F1000, that is, the processing of Fig. 14A, is performed repeatedly as the processing of Fig. 2A.
After the stable imaging state estimation process starts, the processing of Fig. 14A proceeds from step F400 to step F401.
In step F401, the control unit 30 obtains the sensor input from the sensor unit 14, for example, the detected value of the gyro sensor, as well as the command value of the zoom operation that the control unit sends to the optical system drive unit 13, and the like. In other words, the operation states of panning, tilting, and zooming are detected.
Then, in step F402, the control unit 30 determines whether the detected value of motion due to a panning or tilting operation is within a predetermined level range, and determines whether a zoom operation has been performed. The predetermined level range within which the detected value of motion due to a panning or tilting operation falls is a level range indicating that no panning or tilting is being performed, or that only minute movement is performed even if panning or tilting is performed.
If the detected value of the sensor due to panning or tilting operation is determined to be within the predetermined level range and no zoom operation has been performed, the control unit 30 proceeds to step F403 and increments the stable time measurement counter CTst.
On the other hand, if the detected value of the sensor is not within the predetermined level range, that is, a panning or tilting operation has been performed, or if a zoom operation has been performed, the control unit 30 proceeds to step F404 and decrements the stable time measurement counter CTst.
In step F405, the control unit 30 compares the count value of the counter CTst with the predetermined estimation time thTM.
If the value of the stable time measurement counter CTst does not exceed the predetermined time thTM, this processing ends as it is. In other words, the estimation flag Fst=0 is maintained.
If CTst>thTM is determined in step F405, the control unit 30 proceeds to step F406, regards the occurrence of the stable imaging state as having been estimated, and sets the estimation flag Fst=1.
When the estimation flag Fst=1 is set as above, the control unit 30 starts the main subject determination processing in step F1001 of Fig. 2A, and proceeds to steps F1002 to F1004.
An example of operation based on this processing will be described with reference to Fig. 15.
Fig. 15 illustrates a case where the control unit 30 detects panning or tilting operations using the gyro sensor, and detects zoom operations from the command value of the zoom operation. The vertical axis represents the detected value of the gyro sensor and the zoom command value output, and the horizontal axis represents time. For the gyro sensor output, each point on the waveform drawn with a dotted line indicates a detected value input to the control unit 30 at each time point in step F401. The predetermined level range used for the determination in step F402 is assumed to be the range between detected-value levels L1 and L2.
As for the command values with which the control unit 30 instructs the optical system drive unit 13 to perform zoom operations, T-ON indicates output of a command value for operation toward the telephoto end, and W-ON indicates output of a command value for operation toward the wide-angle end. The control unit 30 can check whether a zoom operation is being performed from the command values it outputs.
The value of the estimation flag Fst is also illustrated.
It is assumed that the stable imaging state estimation process starts at time point t0.
During panning or tilting, the detected value of the gyro sensor has a high level.
Therefore, if the detected value is within the predetermined level range, it can be determined that no panning or tilting, or only minute panning or tilting, is being performed.
In this example, panning or tilting, or zooming, is performed until time point t1. Until time point t1, then, a situation is estimated in which the user is moving the imaging device 10 roughly to find a subject or to find an angle of view.
When the user's panning, tilting, or zoom operations end, it can be considered that the user has narrowed the field of view to some extent and is holding the imaging device 10 steady while waiting for a shutter opportunity.
For example, from time point t1, no large panning, tilting, or zoom operation is performed.
In the processing of Fig. 14A, since the stable time measurement counter CTst is incremented under these circumstances, if these circumstances continue, the value of the stable time measurement counter CTst exceeds the predetermined time thTM at some point. For example, CTst>thTM occurs at time point t2.
If CTst>thTM is determined in the above-described step F405, the estimation flag Fst is set to 1 in step F406. As shown in Fig. 15, the estimation flag Fst is set to 1 at time point t2.
As described above, if a period without panning, tilting, or zoom operations continues to some extent, that is, if a period in which the change in the field of view of the captured image is small continues to some extent, it can be determined that the field of view is stable. In this case, the occurrence of the stable imaging state is estimated.
If the change in the field of view of the imaging device 10 due to panning, tilting, or zoom operations is small, it can be accurately estimated that the photographer has roughly decided on the subject direction or angle of view and is about to photograph. Accordingly, based on the stable imaging state estimation result, the main subject determination is started in step F1001 of Fig. 2A, so that the main subject determination processing is performed at an appropriate time.
In the processing example 3 of Fig. 14A, in step F402 the flow proceeds to step F403 and the stable time measurement counter CTst is incremented under an AND condition on the panning/tilting operation and the zoom operation; however, either the panning/tilting operation or the zoom operation alone may be the detection target, and, with both kinds of operations as detection targets, the flow may instead proceed to step F403 under an OR condition.
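The step-F402 decision can be sketched under both combination rules. The inputs and names here are illustrative assumptions: `pan_tilt_value` stands for the gyro reading for pan/tilt motion, and `zooming` for whether a zoom command is being issued.

```python
# Step-F402 check: should the stable time measurement counter be
# incremented this tick? "AND" requires both pan/tilt stillness and no
# zoom; "OR" lets either condition alone suffice.
def f402_stable(pan_tilt_value, zooming, lo=-1.0, hi=1.0, mode="AND"):
    pan_tilt_still = lo <= pan_tilt_value <= hi   # no (or minute) pan/tilt
    zoom_still = not zooming                      # no zoom operation
    if mode == "AND":
        return pan_tilt_still and zoom_still
    return pan_tilt_still or zoom_still           # "OR" condition

print(f402_stable(0.3, True, mode="AND"))  # -> False (zoom in progress)
print(f402_stable(0.3, True, mode="OR"))   # -> True
```

The AND form is stricter: any ongoing zoom blocks the counter even if the camera body itself is still.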
In addition, other ways of handling the stable time measurement counter CTst are possible.
As shown by the dotted line in Fig. 14A, if the condition determined in step F402 is not satisfied, the stable time measurement counter CTst may be left without being decremented, and the flow may proceed to step F405 without changing the count value.
Alternatively, the processing in step F404 may be replaced by step F404A shown in Fig. 14B. In other words, this is an example in which the value of the stable time measurement counter CTst is reset to 0 if the condition determined in step F402 is not satisfied.
As described in processing example 2, there are cases where each of the decrement, reset, and no-change examples is well suited.
[4-5: Processing example 4]
Processing example 4 will be described below with reference to Figs. 16 and 17. Processing example 4 is an example in which the stable imaging state is estimated according to the detection result of a change in the field of view. Here, this example is one in which the detection results of the exposure operation and the focusing operation of the imaging device 10 are used in particular for the estimation of the stable imaging state.
The exposure operation and the focusing operation described here include automatic iris and autofocus operation under the control of the control unit 30, as well as exposure adjustment operations and focusing operations based on the user's operation.
The control unit 30 can detect the exposure operation as follows.
Exposure is adjusted by adjusting the mechanical iris serving as the aperture mechanism in the optical system 11, adjusting the shutter speed of the imager 12, and adjusting the gain of the captured image signal obtained by the imager 12. The control unit 30 performs some or all of these adjustments, either in response to the user's operation or automatically according to the luminance level detected from the captured image data, to realize the exposure adjustment. Accordingly, the control unit 30 can detect whether the control unit is performing exposure adjustment from the exposure control command values for the optical system drive unit 13, the imager 12, or the digital signal processing unit 20. In addition, the control amount can be determined from the command values.
Furthermore, if a sensor for detecting the opening size of the mechanical iris is provided, the control unit 30 can detect an exposure adjustment operation performed with the mechanical iris from the detected value of that sensor.
In addition, in a case where the digital signal processing unit 20 or the control unit 30 analyzes the luminance values of the image data and detects a luminance value indicating automatic exposure adjustment, or detects a large luminance change, the execution of exposure adjustment can be detected.
Focusing is adjusted by moving the focus lens in the optical system 11. The control unit 30 performs the focus adjustment in response to the user's operation, or automatically by autofocus control. Accordingly, the control unit 30 can detect whether the control unit has performed focus adjustment, and the focus lens movement amount, from the control signals for the optical system drive unit 13.
Furthermore, if a focus lens position sensor is provided in the sensor unit 14, the control unit 30 can detect a focus adjustment operation from the detected value of that sensor.
In addition, the digital signal processing unit 20 or the control unit 30 may analyze the focus state of the image data, for example by contrast analysis, to find that the focus lens has been moved.
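One way to notice focus motion from the image alone, as the contrast analysis just mentioned suggests, is to track a per-frame contrast score and flag a large frame-to-frame change. The scoring method and threshold below are assumptions for illustration, not the patent's specified algorithm:

```python
# Contrast score: sum of absolute horizontal neighbour differences over a
# grayscale frame (a list of pixel rows). A sharp frame scores high; a
# defocused frame scores low.
def contrast_score(frame):
    return sum(abs(row[x + 1] - row[x])
               for row in frame for x in range(len(row) - 1))

# Flag "focus lens moving" when the contrast changes by more than the
# given relative threshold between consecutive frames.
def focus_moving(prev_frame, cur_frame, rel_threshold=0.2):
    prev_c, cur_c = contrast_score(prev_frame), contrast_score(cur_frame)
    if prev_c == 0:
        return cur_c > 0
    return abs(cur_c - prev_c) / prev_c > rel_threshold

sharp = [[0, 255, 0, 255], [255, 0, 255, 0]]        # high-contrast frame
blurry = [[96, 128, 96, 128], [128, 96, 128, 96]]   # defocused frame
print(focus_moving(sharp, blurry))  # -> True: contrast dropped sharply
print(focus_moving(sharp, sharp))   # -> False: focus state steady
```

A real implementation would evaluate contrast only within a focus window and smooth the score over several frames, but the principle is the same: a steady contrast score suggests the focus lens is not being driven.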
Fig. 16A illustrates the processing performed by the stable imaging state estimation unit 30b as processing example 4 of the stable imaging state estimation process.
In step F500, the control unit 30 branches the processing according to whether the stable imaging state estimation process is currently in progress.
When the stable imaging state estimation process starts, the control unit 30 proceeds from step F500 to step F507 and resets the stable time measurement counter CTst to 0. In addition, the estimation flag Fst, which indicates whether the occurrence of the stable imaging state has been estimated, is initialized to Fst=0.
In the same manner as in the foregoing processing examples 1, 2, and 3, while the estimation flag Fst=0, step F1000, that is, the processing of Fig. 16A, is performed repeatedly as the processing of Fig. 2A.
After the stable imaging state estimation process starts, the processing of Fig. 16A proceeds from step F500 to step F501.
In step F501, the control unit 30 obtains the sensor input from the sensor unit 14, for example, the detected value of the focus lens position sensor, as well as the focus lens movement command value that the control unit sends to the optical system drive unit 13, the command value of the exposure adjustment, and the like. In other words, the operation states of the exposure adjustment operation and the focus adjustment operation are detected.
Then, in step F502, the control unit 30 determines whether the change in exposure state due to exposure adjustment is within a predetermined level range, and whether the change in focus due to a focus adjustment operation is within a predetermined level range. The predetermined level ranges in this case are level ranges indicating that no exposure adjustment or focus adjustment is being performed, or that the exposure state or focus state varies only within a minute range even if exposure adjustment or focus adjustment is performed.
If the motions of the exposure adjustment operation and the focus adjustment operation are determined to be within the predetermined level ranges, that is, no exposure adjustment operation or focus adjustment operation, or only minute movements thereof, have been performed, the control unit 30 proceeds to step F503 and increments the stable time measurement counter CTst.
On the other hand, in a case where the detected value of the sensor exceeds the predetermined level range, it is determined that an exposure adjustment operation or a focus adjustment operation has been performed, and the control unit 30 proceeds to step F504 and decrements the stable time measurement counter CTst.
In step F505, the control unit 30 compares the count value of the counter CTst with the predetermined estimation time thTM.
If the value of the stable time measurement counter CTst does not exceed the predetermined time thTM, this processing ends as it is. In other words, the estimation flag Fst=0 is maintained.
If CTst>thTM is determined in step F505, the control unit 30 proceeds to step F506, regards the occurrence of the stable imaging state as having been estimated, and sets the estimation flag Fst=1.
When the estimation flag Fst=1 is set as above, the control unit 30 starts the main subject determination processing in step F1001 of Fig. 2A, and proceeds to steps F1002 to F1004.
An example of operation based on this processing will be described with reference to Fig. 17.
Fig. 17 illustrates a case where the control unit 30 detects exposure adjustment operations and focus adjustment operations using sensors or from command values. As for the exposure adjustment operation, the position of the mechanical iris is shown with a solid line, and the position change is shown with a dotted line. Each point indicates the value of the position change detected by the control unit 30 at each time point in step F501.
As for the focus adjustment operation, the focus lens position difference is shown with a dotted line. Each point indicates the value of the lens position change detected by the control unit 30 at each time point in step F501.
The predetermined level ranges used for the determination in step F502 are the range between detected-value levels L10 and L11 for the iris position difference, and the range between detected-value levels L20 and L21 for the focus lens position difference. These ranges indicate that no change occurs in the iris state of the mechanical iris or in the focus lens position, respectively, or that the variation is a minute amount even if a change occurs.
The value of the estimation flag Fst is also illustrated.
It is assumed that the stable imaging state estimation process starts at time point t0.
During a period in which the iris is driven for exposure adjustment, the position difference of the mechanical iris increases, but during a period in which the iris is not driven, the position difference of the mechanical iris is 0.
Likewise, during a period in which the lens is driven for focus adjustment, the position difference of the focus lens increases, but during a period in which the lens is not driven, the position difference of the focus lens is 0. However, for autofocus control, the focus lens is always driven slightly back and forth, so that a minute position change is observed, as shown in the figure. In this example, exposure adjustment or focus adjustment is performed until time point t1. Until time point t1, then, the user has manually performed exposure adjustment or focus adjustment, or the exposure state or focus state has been adjusted by automatic control. In this case, it is expected that the user is not yet waiting for a shutter opportunity for the subject.
Once the exposure adjustment and focus adjustment become stable, the captured image also becomes stable, so that the user can start actual imaging. For example, from time point t1, no large exposure adjustment or focus adjustment is performed.
In the processing of Fig. 16A, since the stable time measurement counter CTst is incremented under these circumstances, if these circumstances continue, the value of the stable time measurement counter CTst exceeds the predetermined time thTM at some point. For example, CTst>thTM occurs at time point t2.
If CTst>thTM is determined in the above-described step F505, the estimation flag Fst is set to 1 in step F506. As illustrated in Fig. 17, the estimation flag Fst is set to 1 at time point t2.
As described above, if a period without large exposure adjustment or focus adjustment continues to some extent, it can be determined that the change in the field of view is very small and the field of view is therefore stable. In this case, the occurrence of the stable imaging state is estimated.
In the imaging device 10, a situation in which almost no exposure adjustment or focus adjustment is performed can be estimated as the stable imaging state, because the photographer is almost ready to photograph. Accordingly, based on the stable imaging state estimation result, the main subject determination is started in step F1001 of Fig. 2A, so that the main subject determination processing is performed at an appropriate time.
In the processing example 4 of Fig. 16A, in step F502 the flow proceeds to step F503 and the stable time measurement counter CTst is incremented under an AND condition on the exposure adjustment operation and the focus adjustment operation; however, either the exposure adjustment operation or the focus adjustment operation alone may be the detection target, and, with both operations as detection targets, the flow may instead proceed to step F503 under an OR condition.
In addition, the detection of the exposure adjustment operation may cover all or some of the mechanical iris operation, the shutter speed change operation, and the gain adjustment operation for the captured image signal.
Furthermore, other ways of handling the stable time measurement counter CTst are possible.
As shown by the dotted line in Fig. 16A, if the condition determined in step F502 is not satisfied, the stable time measurement counter CTst may be left without being decremented, and the flow may proceed to step F505 without changing the count value.
Alternatively, the processing in step F504 may be replaced by step F504A shown in Fig. 16B. In other words, this is an example in which the value of the stable time measurement counter CTst is reset to 0 if the condition determined in step F502 is not satisfied.
As described in processing example 2, there are cases where each of the decrement, reset, and no-change examples is well suited.
[4-6: Processing example 5]
Processing example 5 will now be described with reference to Figures 18, 19, and 20. Processing example 5 is also an example in which the stable imaging state is estimated according to a detection result of a field-of-view variation; here, however, motion vectors of the captured image are analyzed, and the analysis result is used for the estimation of the stable imaging state.
First, the motion vectors mentioned here will be described with reference to Figure 19.
Figure 19A illustrates how the motion vector detection unit 27 detects motion vectors by taking, as targets, the captured image data of the frames sequentially obtained on the time axis.
As shown in Figure 19B, the motion vector detection unit 27 performs processing of dividing the screen of one frame into a plurality of regions and, for each region, detecting the motion of the subject image during the period in which the frames change, as a vector.
The vectors detected in the respective regions, shown with dotted or solid lines, are referred to as local vectors.
Among the local vectors, as vectors used for the estimation of the stable imaging state, there are vectors of high reliability and vectors of low reliability.
For example, a region containing a subject that is a target of main subject determination, such as a person, an animal, or a moving body, has high contrast and therefore high reliability.
On the other hand, a region containing a background subject or the like has low contrast and therefore low reliability.
In Figure 19B, local vectors of regions with high reliability are shown with solid lines, and local vectors of regions with low reliability are shown with dotted lines.
In processing example 5, the global motion of the screen is used for the estimation of the stable imaging state. The global motion is represented not by the local vectors but by the global vector indicated by the hatched arrow. The global vector can be obtained by calculating the average of the local vectors with high reliability.
The motion vector detection unit 27 performs an operation of sequentially calculating the global vector from the sequentially input frame image data and supplying the global vector to the control unit 30.
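The averaging of high-reliability local vectors into a global vector, as described above, can be sketched as follows. This is a minimal illustration with assumed names (`LocalVector`, `global_vector`, the reliability threshold); it is not the actual implementation of the motion vector detection unit 27.

```python
from dataclasses import dataclass

@dataclass
class LocalVector:
    dx: float          # horizontal motion component of the region
    dy: float          # vertical motion component of the region
    reliability: float # e.g. contrast-based confidence in [0, 1]

def global_vector(local_vectors, min_reliability=0.5):
    """Average only the high-reliability local vectors to obtain
    the global motion of the screen."""
    usable = [v for v in local_vectors if v.reliability >= min_reliability]
    if not usable:
        # No reliable region: treat as no detectable global motion.
        return (0.0, 0.0)
    gx = sum(v.dx for v in usable) / len(usable)
    gy = sum(v.dy for v in usable) / len(usable)
    return (gx, gy)
```

A low-reliability background region with a spurious large vector is simply excluded from the average, which is the point of using reliability here.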
Figure 18A illustrates, as processing example 5, the stable imaging state estimation process performed by the stable imaging state estimation unit 30b.
In step F600, the control unit 30 branches the processing according to whether the stable imaging state estimation process is currently under way.
When the stable imaging state estimation process is started, the control unit 30 proceeds from step F600 to step F607 and resets the stable time measurement counter CTst to 0. In addition, the estimation flag Fst, which indicates whether the stable imaging state has been estimated to exist, is initialized to Fst=0.
In the same manner as in the aforementioned processing examples 1, 2, 3, and 4, during the period of the estimation flag Fst=0, step F1000 of Figure 2A, that is, the processing of Figure 18A, is repeatedly performed.
After the stable imaging state estimation process is started, the processing of Figure 18A proceeds from step F600 to step F601.
In step F601, the control unit 30 acquires the value of the global vector from the motion vector detection unit 27. The value of the vector is acquired here, but in this processing example, it is sufficient to acquire the amount of motion (the magnitude) of the vector.
Then, in step F602, the control unit 30 determines whether the amount of motion of the acquired global vector is within a predetermined level range. The predetermined level range in this case is a level range indicating that the amount of global motion of the subject on the screen is small.
If the amount of motion of the global vector is determined to be within the predetermined level range, and thus there is no large motion in the captured image, the control unit 30 proceeds to step F603 and increments the stable time measurement counter CTst.
On the other hand, if the amount of motion of the global vector is determined to exceed the predetermined level range, and thus there is a large motion in the captured image, the control unit 30 proceeds to step F604 and decrements the stable time measurement counter CTst.
In step F605, the control unit 30 compares the count value of the counter CTst with the predetermined time thTM used for the estimation.
If the value of the counter CTst does not exceed the predetermined time thTM, the processing ends as it is. In other words, the estimation flag Fst=0 is maintained.
On the other hand, if CTst>thTM is determined in step F605, the control unit 30 proceeds to step F606 and, considering that the stable imaging state has been estimated to exist, sets the estimation flag Fst=1.
When the estimation flag Fst=1 is set as described above, the control unit 30 starts the main subject determination processing in step F1001 of Figure 2A and then proceeds to steps F1002 to F1004.
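The flow of steps F600 to F607 described above can be sketched as a simple loop over the sequence of global-motion magnitudes. This is an illustrative model under assumed names and values (the level range, `thTM`, clamping the counter at 0), not the apparatus's actual control code.

```python
def estimate_stable_imaging(motion_magnitudes, level_range=(0.0, 1.0), thTM=3):
    """Return (Fst, index at which the stable state was estimated, or None)."""
    low, high = level_range
    ctst = 0                   # F607: stable time measurement counter reset
    fst = 0                    # estimation flag Fst initialized to 0
    for i, m in enumerate(motion_magnitudes):  # F601: obtain global vector
        if low <= m <= high:   # F602: amount of motion within the level range?
            ctst += 1          # F603: increment CTst
        else:
            # F604: decrement CTst (clamped at 0 here, as one design choice)
            ctst = max(0, ctst - 1)
        if ctst > thTM:        # F605: compare CTst with thTM
            fst = 1            # F606: stable imaging state estimated
            return fst, i
    return fst, None
```

With two large-motion frames followed by small-motion frames, the counter first stays at 0 and then climbs until it exceeds `thTM`, which mirrors the t1-to-t2 behavior of Figure 20.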
An example of the operation based on this processing will be described with reference to Figure 20.
In Figure 20, the dotted line shows the transition of the scalar value of the global vector. Each point indicates the amount of motion of the global vector detected by the control unit 30 at each time point in step F601.
The predetermined level range used for the determination in step F602 is the range between the motion amount levels L30 and L31. This range indicates that there is no global motion of the subject on the captured image or, even if there is global motion, the amount of global motion is small.
In addition, the value of the estimation flag Fst is also shown.
It is assumed that the stable imaging state estimation process is started at time point t0.
In this example, a large motion is detected as the global motion until time point t1. For example, a situation in which the user has not yet steadied the imaging apparatus 10, a situation in which the user is roughly searching for a subject, or a situation in which the subject is moving around greatly can be assumed.
On the other hand, after time point t1, the amount of motion falls within the predetermined level range. This is assumed to be a situation in which the user holds the imaging apparatus 10 steadily, or the subject occupying most of the screen is almost stationary relative to the imaging apparatus 10.
In the processing of Figure 18A, under such a situation after time point t1, the stable time measurement counter CTst is incremented, and at time point t2 the value of the counter CTst exceeds the predetermined time thTM. When it is determined in step F605 that the value of the stable time measurement counter CTst exceeds the predetermined time thTM, the estimation flag Fst is set to 1 in step F606. As shown in Figure 20, the estimation flag Fst is set to 1 at time point t2.
If a period in which the global motion on the captured image is small continues to some extent, it can be determined that the field-of-view variation is small and the field of view is stable. In this case, the stable imaging state is estimated to exist.
In the same manner as in the above examples, in processing example 5, the stable imaging state estimation unit 30b uses the detection result of the motion of the captured image for the estimation of the stable imaging state. In particular, global vector detection is performed as the analysis processing of the captured image data, and the stable imaging state in which the field of view is stable is estimated according to the state of the motion.
In the processing example 5 of Figure 18A, whether the amount of motion of the global vector is within the predetermined level range is determined in step F602; however, the determination may also be made using the direction of the motion as a vector.
In addition, since the local vectors of the respective regions are analyzed, the relative motion state between each subject and the imaging apparatus 10 can also be estimated, and thus the local vectors may also be used.
In addition, other handling of the stable time measurement counter CTst is possible.
As shown by the dotted line in Figure 18A, when the condition of the determination in step F602 is not satisfied, the stable time measurement counter CTst may be left without being decremented, so that the flow proceeds to step F605 without the count value being changed.
In addition, the processing of step F604 may be replaced with step F604A shown in Figure 18B. In other words, this is an example in which the value of the stable time measurement counter CTst is reset to 0 when the condition of the determination in step F602 is not satisfied.
As described in processing example 2, there are situations for which each of the examples of decrementing, resetting, and leaving the counter unchanged is suitable.
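The three counter behaviors for the non-stable branch can be contrasted in one small sketch (assumed names; the policy labels are illustrative only):

```python
def update_counter(ctst, within_range, policy="decrement"):
    """Update the stable time measurement counter CTst for one frame."""
    if within_range:
        return ctst + 1            # F603: increment while motion is small
    if policy == "decrement":
        return max(0, ctst - 1)    # F604: decrement on large motion
    if policy == "reset":
        return 0                   # F604A (Figure 18B): reset to 0
    return ctst                    # dotted path: leave count value unchanged
```

Resetting demands an uninterrupted stable period, decrementing tolerates brief disturbances, and holding tolerates them even more; which is suitable depends on the expected camera-handling situation.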
[4-7: Processing of changing the predetermined time used in the estimation processing]
In the above processing examples 1 to 5, the predetermined time thTM is used for the estimation of the stable imaging state. In each processing example, the predetermined time thTM may be set to a fixed value, but may also be changed as described below. In other words, the time condition used for the estimation of the stable imaging state in the stable imaging state estimation process is changed according to the execution timing of the stable imaging state estimation process.
Figure 21 illustrates an example. For example, three kinds of times thTM1, thTM2, and thTM3 are set as the predetermined time thTM. It is assumed that thTM1>thTM2>thTM3.
The stable imaging state estimation process is performed at the timings described in Figures 9 and 10. For example, when the stable imaging state estimation process is performed within a predetermined period after the power is turned on, the longer time thTM1 is used as the predetermined time thTM. Also when the stable imaging state estimation process is performed within a predetermined period after the transition from the reproduction mode MD2 to the camera mode MD1, the longer time thTM1 is used as the predetermined time thTM.
In these cases, it is rare that the user holds the camera and aims at a subject immediately after the power-on operation or the mode operation. Therefore, in the sense of improving the estimation accuracy, it is appropriate to lengthen the predetermined time thTM used for the estimation of the stable imaging state.
During the camera mode MD1 after a predetermined time has passed since the power-on or the transition to the camera mode MD1, and when the main subject determination has not yet been performed, the standard time thTM2 is used as the predetermined time thTM.
This is because the user is, to some extent, prepared for the imaging operation.
However, at the timing at which the stable imaging state estimation process is performed again after the main subject determination has been performed once, the shortest time thTM3 is used as the predetermined time thTM. This is because, as described in Figures 9 and 10, a new main subject determination processing needs to be performed quickly after the main subject information becomes unusable.
In addition, in a situation in which a certain period continues, that is, a period in which the main subject determination is not performed continues, the standard time thTM2 is used as the predetermined time thTM.
The above description is an example; for example, the control unit 30 performs the estimation predetermined time setting processing of Figure 22 at certain time intervals, by interrupt processing, or the like, so as to change the predetermined time thTM according to the execution timing of the stable imaging state estimation process.
In Figure 22, in step F4000, the control unit 30 determines whether the current point is within the period Ta from the power-on. If it is within the period Ta, the longest time thTM1 is set as the predetermined time thTM in step F4006.
If the current point is not within the period Ta from the power-on, the control unit 30 determines in step F4001 whether the current point is within the period Tb from the transition to the camera mode MD1. If it is within the period Tb, the longest time thTM1 is set as the predetermined time thTM in step F4006.
If the current point does not correspond to the determinations in steps F4000 and F4001, the control unit 30 branches the processing in step F4002 according to whether the initial main subject determination processing has been performed after the power-on or the transition to the camera mode MD1. If the initial main subject determination processing has not yet been performed, the standard time thTM2 is set as the predetermined time thTM in step F4004.
If the initial main subject determination processing has been performed, the control unit 30 determines in step F4003 whether a main subject determination has been performed within the period Tc from the last main subject determination. If no main subject determination has been performed, the standard time thTM2 is set as the predetermined time thTM in step F4004.
On the other hand, if a main subject determination has been performed within the period Tc in step F4003, the shortest time thTM3 is set as the predetermined time thTM in step F4005.
In the stable imaging state estimation process, the predetermined time thTM set by the above processing of Figure 22 is used at the point of its execution.
Thus, as illustrated in Figure 21, an appropriate predetermined time thTM for the estimation processing is used according to the execution timing of the stable imaging state estimation process, so that an appropriate operation is realized.
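The branch structure of Figure 22 (steps F4000 to F4006) can be sketched as a selection function. The parameter names and the concrete values of Ta, Tb, Tc, and the three times are assumptions for illustration only:

```python
def select_thTM(since_power_on, since_mode_change, determination_done,
                since_last_determination,
                Ta=10.0, Tb=10.0, Tc=30.0,
                thTM1=5.0, thTM2=3.0, thTM3=1.0):
    """Select the predetermined time thTM according to the execution timing
    (all time arguments in seconds; thTM1 > thTM2 > thTM3)."""
    if since_power_on <= Ta:            # F4000: within Ta of power-on
        return thTM1                    # F4006: longest time
    if since_mode_change <= Tb:         # F4001: within Tb of entering camera mode MD1
        return thTM1                    # F4006: longest time
    if not determination_done:          # F4002: initial determination not yet done
        return thTM2                    # F4004: standard time
    if since_last_determination <= Tc:  # F4003: determination done within Tc
        return thTM3                    # F4005: shortest time
    return thTM2                        # F4004: standard time again
```

For example, immediately after power-on the longest time is chosen, whereas right after a completed main subject determination the shortest time is chosen so that a new determination can start quickly.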
<5. Another processing example in the image processing apparatus>
The above processing examples 1 to 5 have been described as specific examples of step F1000 in the processing of Figure 2A.
The example of Figure 2A is an example in which, when the stable imaging state is estimated to exist as a result of the stable imaging state estimation process, the main subject determination unit 30a performs the main subject determination processing and outputs its result.
Here, as the stable imaging state estimation process and the main subject determination processing performed in the image processing apparatus 1 of Figure 1 or the imaging apparatus 10 of Figure 3, not only the example illustrated in Figure 2A but also the example illustrated in Figure 23 can be considered.
In the example of Figure 23, during the camera mode, the main subject determination processing is repeated at any time as shown in Figure 23A.
In other words, during the camera mode MD1, the main subject determination unit 30a starts the detection of candidate images in step F2000 and performs the main subject determination processing in step F2001. The detailed processing of step F2001 is the same as the processing of steps F1 to F3 of Figure 2B.
When the main subject determination has been performed in step F2001, the main subject information as the determination result is not transmitted to an application program or the like at this point, but is held in an internal memory in step F2002. Subsequently, the processing of steps F2000, F2001, and F2002 is performed again.
Meanwhile, at the execution timings described in Figures 9 and 10, the stable imaging state estimation unit 30b performs the stable imaging state estimation process as shown in Figure 23B.
First, in step F3000, the control unit 30 performs the stable imaging state estimation process, for example, as in the above processing examples 1 to 5.
When the stable imaging state is estimated to exist at a certain point, the control unit 30 proceeds from step F3001 to step F3002 and acquires the main subject information. Since the main subject determination processing is performed at any time as shown in Figure 23A, the latest main subject information held at that point is read from the internal memory.
Then, in step F3003, the control unit 30 transmits the acquired main subject information to an application program or the like.
In other words, in the above processing of Figure 23, the main subject determination unit 30a sequentially performs the main subject determination processing. In addition, the stable imaging state estimation unit 30b performs the stable imaging state estimation process at the execution timings described in Figures 9 and 10. This example is an example in which, when the stable imaging state is estimated to exist as a result of the stable imaging state estimation process, the control unit 30 outputs the latest result of the main subject determination processing.
Also by this processing, the main subject information is output to an application program or the like at an appropriate timing.
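The arrangement of Figure 23, in which determination runs continuously but delivery waits for the stable-state estimate, can be sketched as follows (class and method names are assumptions for illustration):

```python
class MainSubjectReporter:
    """Hold the latest main subject determination result and deliver it
    only when the stable imaging state is estimated to exist."""

    def __init__(self):
        self.latest = None    # latest main subject info (internal memory)
        self.delivered = []   # what has been sent to the application

    def on_determination(self, main_subject_info):
        # F2002: hold the result in memory; do not deliver yet.
        self.latest = main_subject_info

    def on_stable_state_estimated(self):
        # F3001 -> F3002: read the latest result, F3003: send it.
        if self.latest is not None:
            self.delivered.append(self.latest)
```

Because determination keeps running in the background, the application always receives the most recent result at the moment the field of view becomes stable, rather than waiting for a fresh determination to complete.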
<6. Application to a program and a computer apparatus>
The embodiments of the image processing apparatus 1 and the imaging apparatus 10 have been described above; the above main subject determination processing may be performed by hardware or by software.
The program of the embodiment is a program that causes an arithmetic processing apparatus such as a central processing unit (CPU) or a digital signal processor (DSP) to execute the processing described in the above embodiments.
In other words, the program is a program that causes an arithmetic processing apparatus to execute: a stable imaging state estimation step of performing a stable imaging state estimation process of estimating whether a stable imaging state exists; a main subject determination step of performing main subject determination processing; and an output step of outputting a result of the main subject determination processing when the stable imaging state is estimated to exist as a result of the stable imaging state estimation process.
Specifically, the program of the present embodiment may be a program that causes an arithmetic processing apparatus to execute the processing illustrated in Figure 2 or Figure 23, the processing described in Figures 4 and 8 regarding the main subject determination processing, and furthermore the processing described in processing examples 1 to 5 regarding the stable imaging state estimation process.
With such a program, an apparatus that performs the aforementioned stable imaging state estimation processing and main subject determination processing can be realized using the arithmetic processing apparatus.
The program may be recorded in advance in an HDD as a recording medium built into an apparatus such as a computer apparatus, in a ROM of a microcomputer having a CPU, or the like.
Alternatively, the program may be temporarily or permanently stored in a removable recording medium such as a flexible disk, a compact disc read-only memory (CD-ROM), a magneto-optical (MO) disc, a digital versatile disc (DVD), a Blu-ray disc, a magnetic disk, a semiconductor memory, or a memory card. Such a removable recording medium may be provided as so-called package software.
In addition, the program may be installed from a removable recording medium to a personal computer or the like, or may be downloaded from a download site via a network such as a local area network (LAN) or the Internet.
Such a program is suitable for widely providing the image processing apparatus of the embodiment. For example, by downloading the program to a personal computer, a portable information processing apparatus, a game machine, a video apparatus, a personal digital assistant (PDA), or the like, the portable information processing apparatus or the like can serve as the information processing apparatus of the present disclosure.
For example, a computer apparatus as illustrated in Figure 24 can be caused to perform the stable imaging state estimation process and the main subject determination processing in the image processing apparatus 1 of Figure 1 or the imaging apparatus 10.
In Figure 24, the CPU 71 of the computer apparatus 70 executes various kinds of processing according to a program stored in the ROM 72 or a program loaded from the storage unit 78 into the RAM 73. The RAM 73 also stores, as appropriate, data necessary for the CPU 71 to execute the various kinds of processing.
The CPU 71, the ROM 72, and the RAM 73 are interconnected via a bus 74. The bus 74 is also connected to an input and output interface 75.
The input and output interface 75 is connected to an input unit 76 including a keyboard, a mouse, and the like; an output unit 77 including a display formed by a cathode ray tube (CRT), an LCD, or an organic EL panel, and a speaker; a storage unit 78 including a hard disk and the like; and a communication unit 79 including a modem and the like. The communication unit 79 performs communication processing via a network including the Internet.
The input and output interface 75 is also connected to a drive 80 as necessary. A removable medium 81 such as a magnetic disk, an optical disc, a magneto-optical disc, or a semiconductor memory is mounted in the drive as appropriate, and a computer program read therefrom is installed in the storage unit 78 as appropriate.
In the case where the aforementioned stable imaging state estimation processing and main subject determination processing are executed by software, a program constituting the software is installed from a network or a recording medium.
For example, as illustrated in Figure 24, the recording medium includes the removable medium 81, which is distributed to deliver the program to the user separately from the apparatus body, such as a magnetic disk (including a flexible disk), an optical disc (including a Blu-ray Disc (registered trademark), a CD-ROM, and a DVD), a magneto-optical disc (including a MiniDisc (MD)), or a semiconductor memory. Alternatively, the recording medium includes the ROM 72 in which the program is recorded and which is delivered to the user in a state of being incorporated in the apparatus body in advance, the hard disk included in the storage unit 78, and the like.
In the computer apparatus 70, when moving image data is input by a reception operation of the communication unit 79, a reproduction operation in the drive 80 (removable medium 81) or the storage unit 78, or the like, the CPU 71 executes, according to the program, the functions of the aforementioned stable imaging state estimation unit (3, 30b) and main subject determination unit (2, 30a). In other words, by executing the processing as in Figures 2 and 23, main subject determination can be performed on the input image data at an appropriate timing.
<7. Modifications>
The above embodiments may have various modifications.
In the stable imaging state estimation process, a plurality of processing examples among the above processing examples 1 to 5 may be combined with each other.
For example, by combining some of the processing examples 1 to 5, the stable imaging state may be estimated to exist under an OR condition or an AND condition of the estimation results of the stable imaging state in the respective processing examples.
In addition, when the processing examples 1 to 5 are combined, the weight for the estimation of the stable imaging state may be changed in each processing example, or a priority of determination may be set. For example, when processing example 2 and processing example 3 are combined, the increment value of the counter CTst may be made different in each processing example, or the predetermined time thTM may be set to different values.
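The OR/AND combination of several estimation results mentioned above amounts to a simple boolean reduction; a minimal sketch (function name assumed):

```python
def combine_estimates(flags, condition="or"):
    """Combine per-example stable-state estimation flags (Fst values as
    booleans) under an OR condition or an AND condition."""
    return any(flags) if condition == "or" else all(flags)
```

Under the OR condition a single processing example estimating the stable state suffices; under the AND condition all combined examples must agree, which trades responsiveness for robustness.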
In addition, in the example of the imaging apparatus 10, the stable imaging state estimation process is performed in the camera mode MD1 or the main subject determination mode MD11, but it may also be performed in the reproduction mode MD2.
For example, the main subject information is used for image effect processing or image editing processing; for this purpose, the main subject determination processing is performed with a reproduced image as a target. It is therefore also useful to perform the main subject determination processing on a reproduced image after estimating the stable imaging state that existed at the time of imaging.
For example, a situation in which the variation of the reproduced image is small can be estimated as a stable imaging state at the time of imaging; the main subject determination processing can then be performed, and image editing processing and image effect processing can be performed for reproduction display, creation of edited data, and the like.
In addition, such stable imaging state estimation process and main subject determination processing for a reproduced image can naturally be assumed to be performed by the information processing apparatus of Figure 1, the information processing apparatus of Figure 24, or the like.
In addition, the result of the main subject determination processing may be added as metadata to still image data or moving image data captured and recorded thereafter. In other words, information indicating the main subject is added to a still image file or the like.
In addition, a main subject designation operation by an operation of the photographer may be made possible while a through image is displayed and the main subject determination processing is being performed.
In addition, although the processing of determining a main subject has been described in the embodiments mainly on the assumption of still image capturing, the above processing of the embodiments can also be applied, as processing of determining a main subject from a plurality of captured frames, during standby for moving image capturing or during capturing and recording of a moving image.
Additionally, the present technology may also be configured as below.
(1) An image processing apparatus including: a stable imaging state estimation unit that performs a stable imaging state estimation process of estimating whether a stable imaging state exists; and a main subject determination unit that performs main subject determination processing and outputs a result of the main subject determination processing when the stable imaging state is estimated to exist as a result of the stable imaging state estimation process.
(2) The image processing apparatus according to (1), wherein, in the stable imaging state estimation process, the stable imaging state estimation unit estimates the stable imaging state according to a detection result of a field-of-view variation.
(3) The image processing apparatus according to (2), wherein the stable imaging state estimation unit detects the field-of-view variation using an output of a sensor that detects a motion of an imaging apparatus generating a captured image.
(4) The image processing apparatus according to (2) or (3), wherein the stable imaging state estimation unit detects the field-of-view variation using an output of a sensor that detects a motion of an imaging optical system of an imaging apparatus generating a captured image.
(5) The image processing apparatus according to any one of (2) to (4), wherein the stable imaging state estimation unit detects the field-of-view variation using an instruction value for controlling an operation of an imaging apparatus generating a captured image.
(6) The image processing apparatus according to any one of (2) to (5), wherein the stable imaging state estimation unit detects the field-of-view variation using a result of analysis processing of captured image data.
(7) The image processing apparatus according to (6), wherein, as the analysis processing of the captured image data, the stable imaging state estimation unit detects a motion of the captured image.
(8) The image processing apparatus according to any one of (1) to (7), wherein the stable imaging state estimation unit estimates that the stable imaging state exists under a condition that a predetermined time has passed from transition to a predetermined mode state in which the main subject determination processing functions effectively.
(9) The image processing apparatus according to any one of (1) to (8), wherein the stable imaging state estimation unit changes a time condition used for the estimation of the stable imaging state in the stable imaging state estimation process according to an execution timing of the stable imaging state estimation process.
(10) The image processing apparatus according to any one of (1) to (9), wherein the stable imaging state estimation unit performs the stable imaging state estimation process when a predetermined mode state in which the main subject determination processing functions effectively is established.
(11) The image processing apparatus according to any one of (1) to (10), wherein the stable imaging state estimation unit performs the stable imaging state estimation process after the result of the main subject determination processing becomes unusable or no longer needs to be used.
(12) The image processing apparatus according to any one of (1) to (11), wherein, when the stable imaging state is estimated to exist as a result of the stable imaging state estimation process, the main subject determination unit performs the main subject determination processing and outputs main subject information as the result of the main subject determination processing.
(13) The image processing apparatus according to any one of (1) to (11), wherein the main subject determination unit sequentially performs the main subject determination processing and, when the stable imaging state is estimated to exist as a result of the stable imaging state estimation process, outputs main subject information as the result of the latest main subject determination processing.
Reference numerals list
1 image processing apparatus, 2 main subject determination unit, 3 stable imaging state estimation unit, 10 imaging apparatus, 11 optical system, 12 imager, 13 optical system drive unit, 14 sensor unit, 15 recording unit, 16 communication unit, 20 digital signal processing unit, 21 pre-processing unit, 22 synchronization unit, 23 YC generation unit, 24 resolution conversion unit, 25 codec unit, 26 candidate detection unit, 27 motion vector detection unit, 30 control unit, 30a main subject determination unit, 30b stable imaging state estimation unit, 32 UI controller, 33 user interface, 34 display unit, 35 operation unit, 70 computer apparatus, 71 CPU

Claims (15)

1. an image processing equipment, comprising:
Be stable into picture state estimation parts, described in be stable into as state estimation parts and estimate that whether occurring stablize being stable into of image formation state looks like state estimation process; With
Main subject judging part, described main subject judging part carries out main subject determination processing, and owing to being stable into while estimating to occur being stable into picture state as state estimation process, also exports the result of main subject determination processing.
2. according to image processing equipment claimed in claim 1,
Wherein, in being stable into picture state estimation process, be stable into the estimation that the testing result based on visual field change is stablized image formation state as state estimation parts.
3. according to image processing equipment claimed in claim 2,
Wherein be stable into the output of transducer of motion that detect to generate the imaging device of photographic images as the utilization of state estimation parts and detect described visual field change.
4. according to image processing equipment claimed in claim 2,
Wherein be stable into the output of transducer of motion that the utilization of picture state estimation parts detects the imaging optical system of the imaging device that generates photographic images, detect described visual field change.
5. according to image processing equipment claimed in claim 2,
Wherein be stable into as the utilization of state estimation parts for controlling the command value of the operation of the imaging device that generates photographic images, detect described visual field change.
6. according to image processing equipment claimed in claim 2,
Wherein be stable into picture state estimation parts and utilize the result about the analyzing and processing of captured image data, detect described visual field change.
7. according to image processing equipment claimed in claim 6,
Wherein, as the analyzing and processing for captured image data, be stable into the motion that picture state estimation parts detect photographic images.
8. according to image processing equipment claimed in claim 1,
Wherein described, from being converted to have pass by time of the preassigned pattern state that main subject determination processing works effectively, the condition of the scheduled time, being stable into as state estimation parts and estimating to occur being stable into picture state in being stable into as state estimation process.
9. according to image processing equipment claimed in claim 1,
Wherein according to the execution opportunity that is stable into picture state estimation process, be stable into as state estimation parts changes about in being stable into picture state estimation process for stablizing the condition of time of the estimation of image formation state.
10. according to image processing equipment claimed in claim 1,
Wherein, when there is the preassigned pattern state that main subject determination processing works effectively, be stable into as state estimation parts and be stable into picture state estimation process.
11. according to image processing equipment claimed in claim 1,
Wherein, after the result of main subject determination processing can not be used or not need to be used, be stable into as state estimation parts and be stable into picture state estimation process.
12. The image processing device according to claim 1,
wherein, when the stable imaging state is estimated to have occurred by the stable imaging state estimation process, the main subject determination unit performs the main subject determination process and outputs main subject information as a result of the main subject determination process.
13. The image processing device according to claim 1,
wherein the main subject determination unit sequentially performs the main subject determination process and, when the stable imaging state is estimated to have occurred by the stable imaging state estimation process, outputs main subject information as a result of the latest main subject determination process.
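Claims 8, 12, and 13 tie the output of the main subject determination result to a time condition: the stable imaging state is only estimated to have occurred once a predetermined time has elapsed without disturbance. A hypothetical sketch of that elapsed-time logic, with a consecutive frame count standing in for the patent's "predetermined time" (the class and parameter names are assumptions, not the patent's):

```python
class StableImagingStateEstimator:
    """Declares a stable imaging state once the field of view has been
    unchanged for `required_stable_frames` consecutive frames."""

    def __init__(self, required_stable_frames: int = 30):
        self.required = required_stable_frames
        self.stable_count = 0

    def update(self, fov_changed: bool) -> bool:
        # Any field-of-view change resets the elapsed-time condition.
        self.stable_count = 0 if fov_changed else self.stable_count + 1
        return self.stable_count >= self.required
```

In such a design, the main subject determination process would run (or, per claim 13, its latest result would be output) only when `update()` returns True.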
14. An image processing method, comprising:
a stable imaging state estimation step of performing a stable imaging state estimation process that estimates whether a stable imaging state has occurred;
a main subject determination step of performing a main subject determination process; and
an output step of outputting a result of the main subject determination process when the stable imaging state is estimated to have occurred by the stable imaging state estimation process.
15. A program causing an arithmetic processing device to execute:
a stable imaging state estimation step of performing a stable imaging state estimation process that estimates whether a stable imaging state has occurred;
a main subject determination step of performing a main subject determination process; and
an output step of outputting a result of the main subject determination process when the stable imaging state is estimated to have occurred by the stable imaging state estimation process.
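The method of claims 14 and 15 can be pictured as a per-frame loop: estimate the stable imaging state from incoming frames, and output the main subject determination result once that state is estimated to have occurred. A self-contained sketch of such a loop (the motion threshold, frame-count condition, and `determine_main_subject` callback are illustrative assumptions, not the patent's implementation):

```python
import numpy as np

def run_pipeline(frames, determine_main_subject,
                 required_stable_frames=30, motion_threshold=8.0):
    """Per-frame loop mirroring the claimed steps: (1) estimate whether a
    stable imaging state has occurred, (2) run main subject determination,
    and (3) output its result only once the stable state is estimated.
    `determine_main_subject` stands in for the determination process."""
    prev, stable_count = None, 0
    for frame in frames:
        # Field-of-view change check via mean absolute frame difference.
        moved = (prev is not None and
                 float(np.abs(frame.astype(np.float32) -
                              prev.astype(np.float32)).mean()) > motion_threshold)
        prev = frame
        stable_count = 0 if moved else stable_count + 1
        if stable_count >= required_stable_frames:
            return determine_main_subject(frame)  # main subject information
    return None  # stable imaging state never estimated
```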
CN201380012098.6A 2012-03-09 2013-02-22 Image processing device and image processing method Active CN104145475B (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2012-052453 2012-03-09
JP2012052453A JP2015097301A (en) 2012-03-09 2012-03-09 Image processing system, image processing method, and program
PCT/JP2013/055609 WO2013133142A1 (en) 2012-03-09 2013-02-22 Image processing device, image processing method, program

Publications (2)

Publication Number Publication Date
CN104145475A true CN104145475A (en) 2014-11-12
CN104145475B CN104145475B (en) 2018-02-02

Family

ID=49116623

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201380012098.6A Active CN104145475B (en) 2012-03-09 2013-02-22 Image processing equipment and image processing method

Country Status (7)

Country Link
US (1) US10455154B2 (en)
EP (1) EP2824909B1 (en)
JP (1) JP2015097301A (en)
KR (1) KR102008040B1 (en)
CN (1) CN104145475B (en)
TW (1) TW201351980A (en)
WO (1) WO2013133142A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109983758A (en) * 2016-12-02 2019-07-05 索尼半导体解决方案公司 Image-forming component, imaging method and electronic equipment
CN113034069A (en) * 2019-12-25 2021-06-25 菜鸟智能物流控股有限公司 Logistics object processing method and logistics management equipment

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2014155961A1 (en) * 2013-03-26 2014-10-02 パナソニック株式会社 Image generation device, imaging device, image generation method, and program
JP6295443B2 (en) * 2013-03-26 2018-03-20 パナソニックIpマネジメント株式会社 Image generating apparatus, photographing apparatus, image generating method, and program
JP6137921B2 (en) * 2013-04-16 2017-05-31 オリンパス株式会社 Image processing apparatus, image processing method, and program
JP2016219899A (en) * 2015-05-15 2016-12-22 カシオ計算機株式会社 Imaging device, imaging method and program
US10192582B2 (en) * 2015-05-21 2019-01-29 Adobe Inc. Automatic generation of time-lapse videos
JP6576177B2 (en) * 2015-09-09 2019-09-18 キヤノン株式会社 Information processing apparatus, information processing apparatus control method, and program
US11336831B2 (en) 2018-07-06 2022-05-17 Canon Kabushiki Kaisha Image processing device, control method, and program storage medium
JP2022133723A (en) * 2021-03-02 2022-09-14 株式会社アイシン Body information acquisition device

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002064728A (en) * 2000-08-23 2002-02-28 Fuji Photo Film Co Ltd Digital still camera and exposure display control method
JP2008028758A (en) * 2006-07-21 2008-02-07 Canon Inc Photographing apparatus and its control method, and computer program
US20080187185A1 (en) * 2007-02-05 2008-08-07 Takeshi Misawa Image pickup apparatus, and device and method for control of image pickup
US20100208126A1 (en) * 2009-02-17 2010-08-19 Canon Kabushiki Kaisha Focus adjustment apparatus and focus adjustment method
JP2010191073A (en) * 2009-02-17 2010-09-02 Canon Inc Focus adjustment apparatus and focus adjustment method

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4262023B2 (en) 2003-07-16 2009-05-13 日本放送協会 Image shake detection device, method and program thereof, and camera image selection device
JP4444936B2 (en) 2006-09-19 2010-03-31 富士フイルム株式会社 Imaging apparatus, method, and program
JP2008109336A (en) 2006-10-25 2008-05-08 Matsushita Electric Ind Co Ltd Image processor and imaging apparatus
KR101444103B1 (en) * 2008-03-14 2014-09-26 삼성전자주식회사 Media signal generating method and apparatus using state information
JP2009290827A (en) * 2008-06-02 2009-12-10 Sony Corp Image processing apparatus, and image processing method
EP2148499B1 (en) 2008-07-25 2018-01-17 FUJIFILM Corporation Imaging apparatus and method
JP5159515B2 (en) * 2008-08-26 2013-03-06 キヤノン株式会社 Image processing apparatus and control method thereof
JP5517435B2 (en) * 2008-10-22 2014-06-11 キヤノン株式会社 Automatic focusing device, automatic focusing method, and imaging device
JP2011146827A (en) 2010-01-13 2011-07-28 Sony Corp Unit and method for processing image, and program
JP2011146826A (en) 2010-01-13 2011-07-28 Sony Corp Unit and method for processing image, and program
JP2011160379A (en) 2010-02-04 2011-08-18 Sony Corp Image processing device and method, and program therefor
JP2011166305A (en) 2010-02-05 2011-08-25 Sony Corp Image processing apparatus and imaging apparatus


Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109983758A (en) * 2016-12-02 2019-07-05 索尼半导体解决方案公司 Image-forming component, imaging method and electronic equipment
CN109983758B (en) * 2016-12-02 2021-11-16 索尼半导体解决方案公司 Imaging element, imaging method, and electronic apparatus
CN113034069A (en) * 2019-12-25 2021-06-25 菜鸟智能物流控股有限公司 Logistics object processing method and logistics management equipment

Also Published As

Publication number Publication date
US10455154B2 (en) 2019-10-22
JP2015097301A (en) 2015-05-21
KR102008040B1 (en) 2019-08-06
US20150116517A1 (en) 2015-04-30
TW201351980A (en) 2013-12-16
WO2013133142A1 (en) 2013-09-12
EP2824909B1 (en) 2018-12-19
CN104145475B (en) 2018-02-02
EP2824909A1 (en) 2015-01-14
KR20140138135A (en) 2014-12-03
EP2824909A4 (en) 2015-07-29

Similar Documents

Publication Publication Date Title
CN104145475A (en) Image processing device, image processing method, program
US11012614B2 (en) Image processing device, image processing method, and program
CN100545733C Imaging device, control method of imaging device, and computer program
JP5661373B2 (en) Imaging system, imaging apparatus, and control method thereof
US10630891B2 (en) Image processing apparatus, image processing method, and program
JP3849645B2 (en) Monitoring device
JP6418879B2 (en) Image processing apparatus, control method thereof, control program, and imaging apparatus
US8462215B2 (en) Photographing control method and apparatus according to motion of digital photographing apparatus
JP2005176143A (en) Monitoring apparatus
US9942460B2 (en) Image processing device, image processing method, and program
JP2010199694A (en) Image capturing apparatus, angle-of-view adjusting method, and program
JP4807582B2 (en) Image processing apparatus, imaging apparatus, and program thereof
EP2763395B1 (en) Imaging apparatus
JP7342883B2 (en) Imaging control device, imaging device, imaging control method
JP2009111827A (en) Photographing apparatus and image file providing system
JP2010103725A (en) Imaging apparatus and image transfer method
JP2004112537A (en) Imaging apparatus, imaging method, and program for executing the method
JP2004117195A (en) Digital camera with speed measuring function
JP6729583B2 (en) Image processing apparatus and method
JP2005176142A (en) Monitoring apparatus
CN114175631A (en) Image processing apparatus, image processing method, program, and storage medium
JP2005176141A (en) Monitoring apparatus

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant