CN102387301B - Imaging device and imaging method - Google Patents
Imaging device and imaging method
- Publication number
- CN102387301B CN102387301B CN201110236738.4A CN201110236738A CN102387301B CN 102387301 B CN102387301 B CN 102387301B CN 201110236738 A CN201110236738 A CN 201110236738A CN 102387301 B CN102387301 B CN 102387301B
- Authority
- CN
- China
- Prior art keywords
- imaging
- image
- imaging device
- shooting image
- distance
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Abstract
The invention discloses an imaging device and an imaging method. The imaging device includes: an optical system that forms an image corresponding to object light entering through a lens; an imaging device that generates a signal corresponding to the object light entering through the lens and outputs the signal as a shot image; an acquisition device for obtaining the distance to an object; and a correcting unit for correcting blur in the shot image output from the imaging device, based on the imaging characteristic of the optical system corresponding to the object distance acquired by the acquisition device.
Description
Technical field
The present invention relates to an imaging device, an imaging method and a program, and more particularly to an imaging device, imaging method and program that allow a focused image to be generated with a reduced amount of calculation.
Background art
When shooting an image with a digital still camera or any other similar imaging device, the user generally performs a predetermined operation to focus on a desired object. Specifically, the user aims the digital still camera at the desired object so that the object falls at the center of the viewfinder, and then half-presses the shutter button to focus on the object.
Typically, the focusing operation of a digital still camera is performed as follows: a contrast method is used to determine the lens position that maximizes the contrast of the object image, and the lens is moved to the position so determined. This focusing operation is so-called autofocus. Alternatively, an optical rangefinder is used to superimpose images generated from luminous flux propagating along two different light paths, so that the object image is brought into focus.
There is also a technology in which the user selects a predetermined object in a real-time image (referred to as a through image) displayed on an electronic viewfinder (EVF) having a touch-panel function, and the camera automatically measures the distance to the region containing the selected object and focuses on that object (see, for example, JP-A-8-122847).
Summary of the invention
However, before the image of an object focused on as described above is shot, the distance to the object may change suddenly, or the autofocus may not be performed with high precision. In such a case, the resulting shot image is not a focused image but an out-of-focus image.
To solve this problem, it is conceivable to perform predetermined signal processing on such an out-of-focus image to correct the blur and generate a focused image. Specifically, a blur function representing the amount of blur in the out-of-focus image is estimated, and based on the imaging characteristic of the lens corresponding to the distance to the object (the object distance), inverse calculation is performed on the out-of-focus image using the inverse of the estimated blur function. In this way, a focused image can be generated without any autofocus operation. However, since the distance to the object contained in the out-of-focus image is unknown, the signal processing needs to be performed based on the lens imaging characteristic corresponding to every possible object distance. In this case, the amount of calculation is huge.
It is therefore desirable to generate a focused image with a reduced amount of calculation.
An imaging device according to an embodiment of the invention includes: an optical system that forms an image corresponding to object light entering through a lens; an imaging device that generates a signal corresponding to the object light entering through the lens and outputs the signal as a shot image; an acquisition device for obtaining the distance to an object; and a correcting unit for correcting blur in the shot image output from the imaging device, based on the imaging characteristic of the optical system corresponding to the object distance acquired by the acquisition device.
The imaging device may further include: a display device for displaying the shot image; and a selection device for selecting, based on the user's operation, an object in the shot image displayed on the display device. The correcting unit can correct the blur in the shot image according to the imaging characteristic of the optical system corresponding to, among the object distances obtained by the acquisition device, the distance to the object selected by the selection device.
The display device can display through images, each of which is an image shot in real time and generated by commanding the imaging device to perform pixel-value addition or selective readout. The selection device can select an object in one of the through images based on the user's operation. The acquisition device can obtain the distance to the object at the time the selection device selects the object in the through image. The correcting unit can correct the blur in the shot image according to the imaging characteristic of the optical system corresponding to, among the object distances acquired by the acquisition device, the distance to the object selected by the selection device in the through image.
The imaging device may further include a generating device for generating a focused through image from the through image, the focused through image being generated so that the object selected by the selection device in the through image is in focus. The display device can display the focused through image generated by the generating device.
The imaging device can output part of the signal corresponding to the object light entering through the lens, the signal part serving as range information representing the distance to the object. The acquisition device can obtain the distance to the object based on the range information output from the imaging device.
The lens can be a single-focal-length lens focused on an object at infinity or at a distant location.
The lens can be a zoom lens, and the correcting unit can correct the blur in the shot image based on an imaging characteristic of the optical system that is obtained in advance and corresponds not only to the object distance acquired by the acquisition device but also to the zoom state of the zoom lens.
The correcting unit can correct the blur in the shot image based on the point image intensity distribution (point spread function) of the optical system corresponding to the object distance obtained by the acquisition device. The correcting unit can alternatively correct the blur based on the line image intensity distribution of that optical system, or on its optical transfer function.
An imaging method according to an embodiment of the invention is used with an imaging device including: an optical system that forms an image corresponding to object light entering through a lens; an imaging device that generates a signal corresponding to the object light entering through the lens and outputs the signal as a shot image; an acquisition device for obtaining the distance to an object; and a correcting unit for correcting blur in the shot image output from the imaging device based on the imaging characteristic of the optical system corresponding to the object distance acquired by the acquisition device. The method includes: obtaining the distance to the object with the acquisition device; and correcting, with the correcting unit, the blur in the shot image output from the imaging device based on the imaging characteristic of the optical system corresponding to the object distance obtained in the obtaining step.
A program according to an embodiment of the invention commands a computer to perform imaging processing in an imaging device including an optical system and an imaging device, the optical system forming an image corresponding to object light entering through a lens, the imaging device generating a signal corresponding to the object light entering through the lens and outputting the signal as a shot image. The program commands the computer to perform the following steps: an acquisition control step of controlling acquisition of the distance to an object; and a correction step of correcting blur in the shot image output from the imaging device based on the imaging characteristic of the optical system corresponding to the object distance acquired in the acquisition control step.
In another embodiment of the invention, the distance to an object is obtained, and blur in the shot image output from the imaging device is corrected based on the imaging characteristic of the optical system corresponding to the obtained distance to the object.
In any of the embodiments of the invention, a focused image can be generated with a reduced amount of calculation.
Brief description of the drawings
Fig. 1 is a block diagram showing the structure of an embodiment of an imaging device to which the present invention is applied;
Fig. 2 shows the arrangement of the pixels forming a range image sensor;
Fig. 3 is a block diagram showing an exemplary functional structure of the imaging device;
Fig. 4 is a flowchart showing imaging processing;
Fig. 5 is a flowchart showing blur correction processing;
Fig. 6 is a flowchart showing another example of the imaging processing;
Fig. 7 is a block diagram showing another exemplary functional configuration of the imaging device;
Fig. 8 is a flowchart showing the imaging processing performed by the imaging device shown in Fig. 7;
Fig. 9 is a block diagram showing still another exemplary functional configuration of the imaging device;
Fig. 10 is a flowchart showing the imaging processing performed by the imaging device shown in Fig. 9;
Fig. 11 is a block diagram showing a representative hardware configuration of a computer.
Detailed description of the invention
Embodiments of the invention will be described below with reference to the drawings, in the following order.
1. First embodiment (structure provided with a single-focal-length lens and shooting an image in response to a shutter operation)
2. Second embodiment (structure provided with a single-focal-length lens and shooting an image in response to a touch operation)
3. Third embodiment (structure provided with a zoom lens and shooting an image in response to a shutter operation)
1. First embodiment
[Structure of the imaging device]
Fig. 1 shows the structure of an embodiment of an imaging device to which the present invention is applied.
The imaging device 11 shown in Fig. 1 is configured, for example, as a digital still camera; it shoots an image of an object according to the user's operation, stores the shot object image (a still image), and presents the shot image to the user.
The imaging device 11 operates in at least the following modes: an imaging mode, in which it shoots an image of an object and records the shot image; and an image-viewing mode, in which the user can view the recorded shot images. When the user operates the imaging device 11 and selects the imaging mode, the imaging device 11 shoots an image of the object in response to the user's shutter operation and records the shot image. When the user operates the imaging device 11 and selects the image-viewing mode, the imaging device 11 allows the user to select a desired shot image from the recorded shot images, and displays the selected shot image.
The imaging device 11 shown in Fig. 1 includes an optical system 31, a range image sensor 32, an A/D (analog-to-digital) converter 33, a timing generator 34, an image processor 35, a camera signal processor 36, a memory 37, a monitor 38, an operating unit 39, a controller 40 and a ranging auxiliary light emitter 41.
The optical system 31 delivers the image of the object to the range image sensor 32, where the image is captured. The optical system 31, which includes a lens 31a and an aperture 31b, adjusts the amount of light from the object entering through the lens 31a and directs the resulting light onto the range image sensor 32. The optical system 31 does not need to adjust the position at which the image corresponding to the light from the object entering through the lens 31a is formed. That is, the lens 31a is a single-focus lens configured to have a single focal length, focused on a far point 2 meters, 4 meters or some other distance away from the lens 31a, or even at infinity. The aperture 31b adjusts the amount of light incident on the range image sensor 32 through the lens 31a.
The range image sensor 32 is a CCD (charge-coupled device) sensor, a CMOS (complementary metal-oxide-semiconductor) sensor or any other suitable sensor; it shoots an image of the object and outputs the resulting image signal. That is, the range image sensor 32 receives the light entering through the optical system 31, converts the light in a photoelectric conversion process into an image signal in the form of an electrical signal (an analog signal) according to the amount of light received, and delivers the image signal to the A/D converter 33. The range image sensor 32 also provides the A/D converter 33 with part of the signal corresponding to the light from the object entering through the lens 31a as range information (an analog signal), the range information representing the distance between the imaging device 11 and the object.
The spread pattern of the pixel forming range images sensor 32 is described now with reference to Fig. 2.
A typical image sensor uses a so-called Bayer layout, in which the colors R (red), G (green) and B (blue) are assigned to the pixels so that the G pixels are arranged in a checkerboard pattern with the R and B pixels arranged between them.
The range image sensor 32 shown in Fig. 2 is configured so that half of the G pixels arranged in a checkerboard pattern as in a typical image sensor (one of the two G pixels in each group of 2×2 (=4) adjacent RGBG pixels) are used as ranging pixels (shaded in Fig. 2).
The range image sensor 32 receives light that is projected from the ranging auxiliary light emitter 41 (described later) onto the object, reflected by the object, and incident on the ranging pixels. Using the so-called TOF (time of flight) method, the range image sensor 32 determines the distance to the object from the period between projection and reception of the light, and outputs range information representing the distance to the object.
That is, the range image sensor 32 can output both the image signal generated by the RGB pixels used for imaging and the range information generated by the pixels used for ranging.
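The TOF calculation itself reduces to halving the round-trip travel time multiplied by the speed of light, since the light covers the emitter-to-object path twice. A minimal sketch (the patent leaves the computation unspecified):

```python
C = 299_792_458.0  # speed of light in m/s

def tof_distance(round_trip_seconds: float) -> float:
    """Distance to the object from the measured round-trip time:
    the light travels out and back, hence the division by two."""
    return C * round_trip_seconds / 2.0
```

For example, a round trip of about 6.7 nanoseconds corresponds to an object roughly 1 meter away.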
In the range image sensor 32 shown in Fig. 2, one pixel in every four is used as a ranging pixel, but the ratio of ranging pixels to all RGB pixels can be set freely, for example one ranging pixel per 16 pixels. Further, although the range image sensor 32 is a single-chip sensor formed of RGB pixels and ranging pixels (that is, the pixels are integrated in one chip), it can alternatively be formed of two sensors, an image sensor and a distance sensor, in which case the image sensor uses the Bayer layout described above and all the pixels of the distance sensor are used for ranging. In this case, one pixel in the distance sensor only needs to correspond to one pixel, four pixels or any other number of pixels in the image sensor. In other words, the number of pixels (pixel arrangement) in the distance sensor does not have to equal the number of pixels (pixel arrangement) in the image sensor. Alternatively, the range image sensor 32 can be a three-chip sensor; specifically, in at least one of the RGB image sensors, one pixel in, for example, every 2×2 (=4) adjacent pixels can be used as a ranging pixel.
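The single-chip layout of Fig. 2 can be modeled as a boolean mask marking which G pixel of each 2×2 RGBG cell serves as a ranging pixel. A sketch under the assumption that the lower-left pixel of each cell is the ranging one (the patent leaves both the position and the ratio configurable):

```python
def ranging_mask(height: int, width: int) -> list:
    """Boolean grid: True marks a ranging pixel, one per 2x2 cell
    (assumed position: odd row, even column within each cell)."""
    return [[r % 2 == 1 and c % 2 == 0 for c in range(width)]
            for r in range(height)]
```

With this layout a 4×4 sensor contains exactly four ranging pixels, matching the one-in-four ratio described above.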
Referring back to Fig. 1, the structure of the imaging device 11 will be further described. The A/D converter 33 converts the analog signals (the image signal and range information) sent from the range image sensor 32 into digital signals, and outputs the digital signals to the image processor 35.
For example, when the range image sensor 32 is a CCD sensor, the timing generator 34 generates, under the control of the controller 40, a clock signal for driving the CCD, and supplies the clock signal to the range image sensor 32 and the A/D converter 33.
The image processor 35 performs, under the control of the controller 40, predetermined image processing on the digital signal (image signal) from the A/D converter 33, and delivers the processed signal to the camera signal processor 36.
The camera signal processor 36 performs various kinds of processing (for example, color correction, white balance adjustment, gray-scale conversion, gamma correction, YC conversion and compression) on the image signal from the image processor 35, records the processed signal in the memory 37, and displays it on the monitor 38.
The operating unit 39 is formed of buttons, dials, joysticks, a touch panel layered on the monitor 38, and other components, and receives the user's operation input.
The controller 40 controls the components in the imaging device 11 based on a signal representing the user's operation input.
The ranging auxiliary light emitter 41 emits light toward the object; the object reflects the light, and the reflected light is received by the ranging pixels in the range image sensor 32 (described with reference to Fig. 2).
[Exemplary functional structure of the imaging device]
An exemplary functional structure of the imaging device 11 shown in Fig. 1 will next be described with reference to Fig. 3.
The imaging device 11 shown in Fig. 3 includes an imaging section 51, an input unit 52, an input control portion 53, an imaging control portion 54, a range information acquisition portion 55, a display control portion 56, a display portion 57, a recording control portion 58, a recording portion 59 and a blur correction portion 60.
The imaging section 51 includes a lens 51a and a range image sensor 51b. The lens 51a and the range image sensor 51b are identical to the lens 31a and the range image sensor 32 shown in Fig. 1, and will therefore not be described in detail. The object light entering the lens 51a is focused on the range image sensor 51b, and the imaging section 51 not only supplies the image signal carrying the formed object image to the imaging control portion 54, but also supplies range information representing the distance to the imaged object to the range information acquisition portion 55.
The input unit 52 corresponds to the operating unit 39 shown in Fig. 1. When the user operates the operating unit 39, the input unit 52 receives the input from the operating unit 39 and delivers a signal (operation signal) corresponding to the user's operation input to the input control portion 53.
The input control portion 53 supplies instructions corresponding to the operation signal from the input unit 52 to the imaging control portion 54 and the range information acquisition portion 55.
The imaging control portion 54 generates, based on the image signal from the imaging section 51, an image signal carrying a display image to be displayed on the display portion 57, and supplies the generated image signal to the display control portion 56. The imaging control portion 54 also acquires the image signal from the imaging section 51 according to instructions from the input control portion 53, and delivers the acquired image signal to the recording control portion 58.
The range information acquisition portion 55 acquires the range information from the imaging section 51 according to instructions from the input control portion 53, and delivers the range information to the recording control portion 58.
The display control portion 56 displays on the display portion 57 the display image carried by the image signal from the imaging control portion 54. Alternatively, the display control portion 56 displays on the display portion 57 an image carried by an image signal sent (read) from the recording portion 59 via the recording control portion 58.
The display portion 57 corresponds to the monitor 38 shown in Fig. 1 and displays images under the control of the display control portion 56.
The recording control portion 58 associates the image signal from the imaging control portion 54 with the range information from the range information acquisition portion 55 on a pixel basis in the range image sensor 51b (for example, per four adjacent pixels), and records the associated image signal and range information in the recording portion 59. Alternatively, the recording control portion 58 reads, as required, the image signal and range information recorded in the recording portion 59, and supplies them to the display control portion 56 and the blur correction portion 60.
The recording portion 59 corresponds to the memory 37 shown in Fig. 1 and records the image signal and range information from the recording control portion 58. The recording control portion 58 reads the recorded image signal and range information from the recording portion 59 as required.
The blur correction portion 60 corrects, according to instructions from the input control portion 53, the blur contained in the shot image carried by the image signal from the recording control portion 58, and supplies the corrected shot image to the recording control portion 58. The blur-corrected shot image supplied to the recording control portion 58 is recorded in the recording portion 59.
[Imaging processing performed by the imaging device]
The imaging processing performed by the imaging device 11 will next be described with reference to the flowchart of Fig. 4.
For example, when the user turns on the imaging device 11 and operates it to select the imaging mode, the imaging device 11 operates in the imaging mode, which allows the user to shoot images. In step S11, the imaging control portion 54 controls the imaging section 51 so that a through image, an image shot in real time, is displayed on the display portion 57.
Specifically, the imaging control portion 54 commands the range image sensor 51b to perform pixel-value addition or selective readout so as to generate an image signal carrying a display image of, for example, VGA (Video Graphics Array) size (hereinafter referred to as the display image), and delivers the image signal to the display control portion 56. The display control portion 56 displays the display image from the imaging control portion 54 on the display portion 57 as a through image. The user can thus check the through image displayed on the display portion 57 in real time and determine the composition of the image to be shot.
In step S12, the input control portion 53 judges whether the user has performed a shutter operation, that is, whether the user has pressed the input unit 52 serving as the shutter button.
When the judgment in step S12 shows that no shutter operation has been performed, control returns to step S11, and the processes in steps S11 and S12 are repeated until a shutter operation is performed.
On the other hand, when the judgment in step S12 shows that a shutter operation has been performed, the input control portion 53 sends instructions corresponding to the operation signal representing the shutter operation performed on the input unit 52 to the imaging control portion 54 and the range information acquisition portion 55, and control proceeds to step S13.
In step S13, the imaging control portion 54 performs imaging processing according to the instruction from the input control portion 53. Specifically, when the input control portion 53 issues the instruction to the imaging control portion 54, the imaging control portion 54 reads all the pixel values from the range image sensor 51b to generate an image signal carrying a shot image of, for example, full HD size (hereinafter referred to as the shot image), and supplies the shot image to the recording control portion 58.
In step S14, when the input control portion 53 issues the instruction to the range information acquisition portion 55, the range information acquisition portion 55 acquires the range information from the range image sensor 51b and supplies the range information to the recording control portion 58.
The processes in steps S13 and S14 can be performed simultaneously.
In step S15, the recording control portion 58 associates, on a pixel basis in the range image sensor 51b, the shot image from the imaging control portion 54 with the range information from the range information acquisition portion 55, and records the associated shot image and range information in the recording portion 59.
As described with reference to Fig. 2, every four adjacent pixels in the range image sensor 51b (range image sensor 32) are formed of three RGB pixels, which output the image signal as a whole, and one ranging pixel. Here, the four adjacent pixels are taken as one block. Each block generates an image signal carrying part of the shot image (hereinafter referred to as image information) and range information. In other words, the image information and range information of each block in the shot image are associated with each other using coordinates representing the position of the block in the shot image.
In the description above, the image information and range information of each block are separate from each other and are associated using the block coordinates. Alternatively, for each block, the range information can be included in the image information.
Further, in the description above, four adjacent pixels form a single block. A block only needs to contain at least one ranging pixel; for example, 4×4=16 adjacent pixels can form a single block.
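The block-wise association performed in step S15 can be sketched as a mapping from block coordinates to an image patch plus its range sample. The function name and the dictionary layout below are illustrative, not taken from the patent:

```python
def associate_blocks(image, distances, block=2):
    """Key each block's pixel patch and range sample by its
    (row, column) block coordinates within the shot image."""
    records = {}
    for br in range(0, len(image), block):
        for bc in range(0, len(image[0]), block):
            records[(br // block, bc // block)] = {
                "pixels": [row[bc:bc + block]
                           for row in image[br:br + block]],
                "distance_m": distances[br // block][bc // block],
            }
    return records
```

Looking up a block by its coordinates then yields both its image information and the distance recorded for that block, which is exactly what the later blur correction needs.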
In step S16, the input control portion 53 judges whether the user has pressed the input unit 52 serving as an operation-mode switching button, a power on/off button or another button (not shown) so as to issue an instruction to terminate the imaging mode.
When the judgment in step S16 shows that no instruction to terminate the imaging mode has been issued, control returns to step S11, and the processes in steps S11 to S16 are repeated.
On the other hand, when the judgment in step S16 shows that an instruction to terminate the imaging mode has been issued, the imaging processing ends.
By performing the processing described above, the imaging device 11 can record a shot image with the composition desired by the user in response to the user's shutter operation. Since the lens 51a (lens 31a) is a single-focal-length lens focused on an object at a distant location or at infinity, the resulting shot image contains distant objects in focus. The shot image can be an excellent natural-landscape image containing scenery or any other distant object.
The processing described above, however, generally does not produce a focused image of an object near the imaging device 11; such a nearby object is entirely blurred in the shot image.
Blur correction processing, which corrects the blur of an object selected by the user in a shot image recorded in the recording portion 59, will be described below.
[Blur correction performed by the imaging device]
Fig. 5 is a flowchart showing the blur correction performed by the imaging device 11.
For example, when the user operates the operating unit to select the image-viewing mode, the imaging device 11 operates in the image-viewing mode, which allows the user to view recorded shot images, and the display portion 57 displays a thumbnail list corresponding to the shot images recorded in the recording portion 59. Each thumbnail is accompanied by an icon indicating whether the image has undergone blur correction. When the user operates the operating unit to select from the thumbnail list a thumbnail that has not undergone blur correction, the blur correction starts.
In step S31, the recording control portion 58 reads from the recording portion 59, among the shot images recorded therein, the shot image corresponding to the thumbnail selected by the user and the range information associated with that shot image. Of the shot image and range information so read, the recording control portion 58 supplies the shot image to the display control portion 56, and supplies both the shot image and the range information to the blur correction portion 60.
In step S32, the display control section 56 displays the shot image from the recording control section 58 on the display section 57. Since the shot image from the recording control section 58 has the full HD size as described above, the display control section 56 converts the resolution of the shot image into a VGA-size image that can be displayed on the display section 57, and displays the converted image on the display section 57.
In step S33, the input control section 53 judges, based on an operation signal from the input unit 52 serving as a touch panel layered on the monitor 38, whether the user has selected a predetermined object in the resolution-converted shot image displayed on the display section 57 (hereinafter referred to as the "display shot image").
When the judgement in step S33 shows that no object in the display shot image has been selected, the process in step S33 is repeated until the input control section 53 judges that a predetermined object in the display shot image has been selected.
On the other hand, when the judgement in step S33 shows that an object in the display shot image has been selected, the input control section 53 supplies coordinate information on coordinates in the shot image to the blur correction section 60. The coordinate information is contained in the operation signal from the input unit 52 serving as a touch panel, and the coordinates represent the region (block) corresponding to the object portion selected by the user in the display shot image.
In step S34, the blur correction section 60 corrects the blur in the shot image from the recording control section 58 based on the imaging characteristic of the lens 51a corresponding to the distance information from the recording control section 58, the distance information corresponding to the coordinates represented by the coordinate information from the input control section 53, that is, being associated with the object portion selected by the user.
In general, the blur distribution of an object that is contained in a shot image and located at a predetermined object distance is modeled by using a point image intensity distribution (point spread function: PSF) that depends on the characteristics of the lens 51a and on the object distance.
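As an illustration of this model (a minimal numpy sketch, not the patent's implementation: the PSF is approximated by a Gaussian whose width grows as the object moves off an assumed focal plane, and `psf_for_distance` with its constants is hypothetical):

```python
import numpy as np

def gaussian_psf(size, sigma):
    # Hypothetical stand-in for the lens PSF: a normalized 2-D Gaussian.
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    psf = np.exp(-(xx ** 2 + yy ** 2) / (2.0 * sigma ** 2))
    return psf / psf.sum()

def psf_for_distance(object_distance_m, focus_distance_m=10.0):
    # Toy model: blur width grows with defocus relative to the focal plane.
    sigma = 0.5 + 2.0 * abs(1.0 / object_distance_m - 1.0 / focus_distance_m)
    return gaussian_psf(9, sigma)

def blur(image, psf):
    # The observed (blurred) image is modeled as sharp image convolved
    # with the distance-dependent PSF.
    k = psf.shape[0] // 2
    padded = np.pad(image, k, mode="edge")
    out = np.zeros_like(image, dtype=float)
    for dy in range(psf.shape[0]):
        for dx in range(psf.shape[1]):
            out += psf[dy, dx] * padded[dy:dy + image.shape[0],
                                        dx:dx + image.shape[1]]
    return out
```

Under this toy model, a distant object near the assumed focal plane receives a narrow PSF and stays sharp, while a nearby object receives a wide PSF and is spread out.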
The blur correction section 60 performs the blur correction by deconvolving the shot image. The deconvolution is performed by using the PSF according to the distance information (object distance) associated with the region in the shot image corresponding to the region touched (selected) by the user on the display section 57 in the display shot image (the region in the shot image is hereinafter referred to as the "selected region"). The blur correction section 60 sends the blur-corrected shot image to the recording control section 58.
In the full-HD-size shot image, the selected region may, for example, correspond to a single block described above, or may be a region formed of several blocks. When the selected region is formed of several blocks, the distance information on which the PSF depends can be the distance information associated with an approximately central one of those blocks, or can be the average of the distance information associated with the blocks.
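The two choices just described, the distance of the approximately central block or the mean over the selected blocks, can be sketched as follows (a hypothetical helper; the block layout is illustrative):

```python
import numpy as np

def representative_distance(block_distances, mode="center"):
    # Pick the object distance used to select the PSF when the selected
    # region spans several blocks. block_distances is a 2-D array of
    # per-block distance information (hypothetical layout, in meters).
    d = np.asarray(block_distances, dtype=float)
    if mode == "center":
        # distance associated with the approximately central block
        return d[d.shape[0] // 2, d.shape[1] // 2]
    # otherwise, the mean of the distances of all selected blocks
    return d.mean()
```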
Since the distance of the object selected in the shot image (the object that the user wants to bring into focus) is known in advance in the form of distance information as described above, the blur correction section 60 can perform the blur correction without deconvolving the shot image by using PSFs corresponding to every possible object distance.
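The patent does not prescribe a particular deconvolution algorithm; one common frequency-domain realization using a single, distance-selected PSF is Wiener filtering. A sketch under that assumption (`k` is a hypothetical regularization constant):

```python
import numpy as np

def wiener_deconvolve(blurred, psf, k=1e-3):
    # Frequency-domain deconvolution with the one PSF selected by the
    # known object distance (Wiener filter; k damps noise amplification).
    H = np.fft.fft2(psf, s=blurred.shape)   # transfer function of the PSF
    G = np.fft.fft2(blurred)
    restored = np.fft.ifft2(G * np.conj(H) / (np.abs(H) ** 2 + k))
    return np.real(restored)
```

Because the object distance of the selected region is known, only this one PSF is applied; no trial deconvolution over candidate distances is needed.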
In step S35, the recording control section 58 records the blur-corrected shot image from the blur correction section 60 in the recording section 59.
In the process described above, a predetermined object is selected in a shot image associated with distance information representing an object distance, and the blur correction is performed on the shot image based on the lens imaging characteristic corresponding to the distance information associated with the selected object portion. That is, since the distance of the object selected in the shot image (the object that the user wants to bring into focus) is known in advance in the form of distance information, the imaging device can perform the blur correction without deconvolving the shot image by using PSFs corresponding to every possible object distance. As a result, the imaging device can generate a focused image with a reduced amount of calculation, without performing autofocusing or otherwise bringing the object into focus.
In the description above, the blur correction section 60 uses the PSF (point image intensity distribution) as the imaging characteristic of the lens 51a corresponding to the distance information when performing the blur correction on a shot image. Alternatively, the imaging characteristic of the lens 51a may be a line image intensity distribution (LSF) or an optical transfer function (OTF) obtained by performing a two-dimensional Fourier transform on the point image intensity distribution. The OTF is a function representing the degradation of an image in the spatial frequency domain.
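The relationship mentioned here can be sketched numerically: the OTF follows from the point image intensity distribution by a two-dimensional Fourier transform, and its magnitude is the modulation transfer function (a minimal numpy sketch, not taken from the patent; the helper names are hypothetical):

```python
import numpy as np

def otf_from_psf(psf, shape):
    # The OTF is obtained as the 2-D Fourier transform of the (padded)
    # PSF; it describes image degradation in the spatial frequency domain.
    return np.fft.fft2(psf, s=shape)

def mtf_from_psf(psf, shape):
    # The modulation transfer function is the magnitude of the OTF.
    return np.abs(otf_from_psf(psf, shape))
```

For a PSF normalized to unit sum, the OTF equals 1 at zero frequency and its magnitude never exceeds 1, reflecting that blur only attenuates spatial-frequency content.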
Alternatively, a shot image provided by performing the imaging processing described with reference to the flowchart of Fig. 4 may be supplied, via a recording medium or any other suitable medium (not shown), to an image processing apparatus, such as a personal computer, and the blur correction described above may be performed in the image processing apparatus. In this case, the image processing apparatus does not include the imaging section 51, the imaging control section 54, or the distance information acquisition section 55 of the imaging device 11 shown in Fig. 1. In the blur correction performed by the image processing apparatus thus configured, the display shot image displayed on the display section 57 may still be a full-HD-size image, and the object selection may be performed by a non-touch operation, such as a click operation by the user, instead of an operation on a touch panel.
Further, in the description above, the blur correction is performed when the imaging mode in the imaging device 11 has been switched to the image viewing mode and the user selects, from the thumbnail list of shot images, a shot image that has not undergone blur correction. Alternatively, the blur correction may be performed automatically when the imaging mode is switched to another mode including the image viewing mode.
[Another example of imaging processing performed by imaging device]
Imaging processing in which the blur correction is automatically performed after the imaging mode is terminated will next be described with reference to Fig. 6.
The processes in steps S71 to S76 in the flowchart of Fig. 6 are substantially the same as those in steps S11 to S16 in the flowchart of Fig. 4, and the processes in steps S78 to S81 in the flowchart of Fig. 6 are substantially the same as those in steps S32 to S35 in the flowchart of Fig. 5. These steps in Fig. 6 will therefore not be described.
That is, in the processes in steps S71 to S75, a shot image and distance information are associated with each other and recorded, and in step S76, it is judged whether the user has performed a pressing operation on the input unit 52 serving as a mode switching button (not shown) and issued an instruction to terminate the imaging mode.
When the judgement in step S76 shows that the instruction to terminate the imaging mode has been issued, the control proceeds to step S77.
In step S77, the recording control section 58 reads, from among the shot images captured in the processes in steps S71 to S75 and recorded in the recording section 59, the shot image recorded (shot) earliest along the time axis together with the distance information associated with that shot image. Of the shot image and distance information thus read, the recording control section 58 supplies the shot image to the display control section 56 and supplies both the shot image and the distance information to the blur correction section 60.
After the blur-corrected shot image is recorded in the recording section 59 in the processes in steps S78 to S81, the recording control section 58 judges in step S82 whether the blur correction has been performed on all the shot images captured and recorded in the processes in steps S71 to S75.
When the judgement in step S82 shows that not all the shot images have been processed, the control returns to step S77, and the processes in step S77 and the subsequent steps are repeated on the remaining shot images in the order in which they were shot.
On the other hand, when the judgement in step S82 shows that all the shot images have been processed, the imaging processing is terminated.
In the process described above, a predetermined object is selected in a shot image associated with distance information representing an object distance, and the blur in the shot image is corrected based on the lens imaging characteristic corresponding to the distance information associated with the selected object portion. That is, since the distance of the object selected in the shot image (the object that the user wants to bring into focus) is known in advance in the form of distance information, the imaging device can perform the blur correction without deconvolving the shot image by using PSFs corresponding to every possible object distance. As a result, the imaging device can generate a focused image with a reduced amount of calculation, without performing autofocusing or otherwise bringing the object into focus.
The description above has been made with reference to the configuration in which the user selects an object that the user wants to bring into focus in an already obtained shot image. Alternatively, the user may select an object that the user wants to bring into focus in a through image before the image is shot.
<2. Second embodiment>
[Exemplary functional configuration of imaging device]
Fig. 7 shows an exemplary functional configuration of an imaging device configured in such a way that the user selects an object that the user wants to bring into focus in a through image before the image is shot.
The imaging device 111 includes an imaging section 51, an input unit 52, an input control section 53, an imaging control section 54, a distance information acquisition section 55, a display control section 56, a display section 57, a recording section 59, a blur correction section 60, a focused image generation section 151, and a recording control section 152.
In the imaging device 111 shown in Fig. 7, the components having the same functions as those provided in the imaging device 11 shown in Fig. 3 have the same names and reference characters, and these components will not be described as appropriate.
That is, the imaging device 111 shown in Fig. 7 differs from the imaging device 11 shown in Fig. 3 in that the focused image generation section 151 is newly provided and the recording control section 58 is replaced with the recording control section 152.
The focused image generation section 151 generates a display focused image in accordance with an instruction from the input control section 53, based on a display image from the imaging control section 54 and distance information from the distance information acquisition section 55. The display focused image is generated in such a way that the object selected by the user in the display image is brought into focus. The focused image generation section 151 then sends the display focused image to the display control section 56 and the recording control section 152.
The recording control section 152 not only associates an image signal from the imaging control section 54 with distance information from the distance information acquisition section 55 based on the pixels in the distance image sensor 51b, but also associates the display focused image with the image signal and the distance information, and records the image signal, the distance information, and the display focused image associated with each other in the recording section 59. The recording control section 152 reads the image signal, the distance information, and the display focused image recorded in the recording section 59 as required, and supplies them to the display control section 56 and the blur correction section 60.
[Imaging processing performed by imaging device]
Imaging processing performed by the imaging device 111 will next be described with reference to the flowchart in Fig. 8.
For example, when the user turns on the imaging device 111 and operates it to select the imaging mode, the imaging device 111 operates in the imaging mode, which allows the user to shoot images. In step S111, the imaging control section 54 controls the imaging section 51 to display through images, which are images captured in real time, on the display section 57.
Specifically, the imaging control section 54 instructs the distance image sensor 51b to perform pixel value addition or selective reading to generate a display image having, for example, the VGA size, and sends the display image to the display control section 56. The display control section 56 displays the display image from the imaging control section 54 on the display section 57 as a through image.
In step S112, the input control section 53 judges, based on an operation signal from the input unit 52 serving as a touch panel layered on the monitor 38, whether the user has selected a predetermined object in the display image displayed on the display section 57.
When the judgement in step S112 shows that no object in the display image has been selected, the processes in steps S111 and S112 are repeated until the input control section 53 judges that a predetermined object in the display image has been selected.
On the other hand, when the judgement in step S112 shows that an object in the display image has been selected, the input control section 53 sends instructions corresponding to the operation signal from the input unit 52 serving as a touch panel and representing the object selecting operation to the imaging control section 54 and the distance information acquisition section 55. The input control section 53 also supplies the focused image generation section 151 with coordinate information that is contained in the operation signal and represents the coordinates of the object portion selected by the user in the display image.
In step S113, the imaging control section 54 performs imaging processing in accordance with the instruction from the input control section 53. Specifically, the imaging control section 54 not only generates the display image as described above, but also, when the input control section 53 issues the instruction to the imaging control section 54, reads all the pixel values from the distance image sensor 51b to generate a shot image having, for example, the full HD size, and sends the shot image to the recording control section 152.
In step S114, the distance information acquisition section 55 acquires distance information from the distance image sensor 51b when the input control section 53 issues the instruction to the distance information acquisition section 55, and supplies the distance information to the focused image generation section 151 and the recording control section 152.
In step S115, the focused image generation section 151 acquires the display image generated by the imaging control section 54 when the input control section 53 supplies the coordinate information to the focused image generation section 151.
In step S116, based on the coordinate information from the input control section 53 and the distance information from the distance information acquisition section 55, the focused image generation section 151 uses the display image acquired from the imaging control section 54 to generate a display focused image in which the object portion selected by the user in the display image is brought into focus.
Specifically, the focused image generation section 151 performs blur correction by deconvolving the display image. The deconvolution is performed by using the PSF according to the distance information (object distance) associated with the selected region in the shot image corresponding to the region touched (selected) by the user on the display section 57 in the display image. The focused image generation section 151 supplies the display focused image (the blur-corrected display image) to the display control section 56 and the recording control section 152. It should be noted that the coordinate information representing the region (object portion) touched by the user is added to the display focused image.
In step S117, the display control section 56 displays the display focused image from the focused image generation section 151 on the display section 57.
The processes in steps S113 and S114 and the processes in steps S115 to S117 can be carried out simultaneously.
In step S118, the recording control section 152 associates the shot image from the imaging control section 54, the distance information from the distance information acquisition section 55, and the display focused image from the focused image generation section 151 with each other based on the pixels in the distance image sensor 51b, and records the shot image, the distance information, and the display focused image associated with each other in the recording section 59.
It should be noted that since the display focused image is generated by instructing the distance image sensor 51b (distance image sensor 32) to perform pixel value addition or selective reading, a single pixel in the display focused image is associated with a block of a predetermined number of pixels in the shot image, which is generated by reading all the pixel values from the distance image sensor 51b (distance image sensor 32).
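For illustration only (the patent specifies the block simply as "a predetermined number of pixels" without fixing its geometry), a mapping from one display-image pixel to its block in the full-size shot image might look like this, with VGA and full HD sizes assumed:

```python
def block_for_display_pixel(x, y, display_size=(640, 480), shot_size=(1920, 1080)):
    # Map a pixel (x, y) selected in the display image to the corresponding
    # block of pixels in the full-size shot image. Sizes are illustrative;
    # non-integer ratios are handled by rounding the block edges so that
    # the blocks tile the shot image exactly.
    dw, dh = display_size
    sw, sh = shot_size
    x0 = x * sw // dw
    x1 = (x + 1) * sw // dw
    y0 = y * sh // dh
    y1 = (y + 1) * sh // dh
    return (x0, y0, x1, y1)   # half-open pixel ranges [x0, x1) x [y0, y1)
```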
In step S119, the input control section 53 judges whether the user has performed a pressing operation on the input unit 52 serving as a mode switching button (not shown) to issue an instruction to terminate the imaging mode.
When the judgement in step S119 shows that no instruction to terminate the imaging mode has been issued, the control returns to step S111, and the processes in steps S111 to S119 are repeated.
On the other hand, when the judgement in step S119 shows that the instruction to terminate the imaging mode has been issued, the control proceeds to step S120.
In step S120, the recording control section 152 reads, from among the shot images captured in the processes in steps S111 to S118 and recorded in the recording section 59, the shot image recorded (shot) earliest along the time axis together with the distance information and the display focused image associated with that shot image. The recording control section 152 supplies the shot image, the distance information, and the display focused image thus read to the blur correction section 60.
In step S121, the blur correction section 60 corrects the blur in the shot image from the recording control section 152 based on the imaging characteristic of the lens 51a corresponding to the distance information from the recording control section 152, the distance information corresponding to the coordinates represented by the coordinate information added to the display focused image, that is, being associated with the object portion selected by the user.
Specifically, the blur correction section 60 performs the blur correction by deconvolving the shot image. The deconvolution is performed by using the PSF according to the distance information (object distance) associated with the selected region in the shot image corresponding to the region in the display focused image represented by the coordinate information added to the display focused image. The blur correction section 60 sends the blur-corrected shot image to the recording control section 152.
Since the distance of the object selected in the display image (the object that the user wants to bring into focus) is known in advance in the form of distance information as described above, the blur correction section 60 can perform the blur correction without deconvolving the shot image by using PSFs corresponding to every possible object distance.
In step S122, the recording control section 152 records the blur-corrected shot image from the blur correction section 60 in the recording section 59.
In step S123, the recording control section 152 judges whether the blur correction has been performed on all the shot images captured and recorded in the processes in steps S111 to S118.
When the judgement in step S123 shows that not all the shot images have been processed, the control returns to step S120, and the processes in step S120 and the subsequent steps are repeated on the remaining shot images in the order in which they were shot.
On the other hand, when the judgement in step S123 shows that all the shot images have been processed, the imaging processing is terminated.
In the process described above, when a predetermined object in a display image (through image) is selected, a shot image associated with distance information representing the distance to the object is acquired (shot), and the blur in the shot image is corrected based on the lens imaging characteristic corresponding to the distance information, which is associated with the region in the shot image corresponding to the object portion selected in the display image. That is, since the distance of the object selected in the display image (the object that the user wants to bring into focus) is known in advance in the form of distance information, the imaging device can perform the blur correction without deconvolving the shot image by using PSFs corresponding to every possible object distance. As a result, the imaging device can generate a focused image with a reduced amount of calculation, without performing autofocusing or otherwise bringing the object into focus.
Further, when an object is selected in a display image, the blur correction is performed on the display image generated by pixel value addition or selective reading, and the resulting display focused image is displayed. The amount of calculation in this process is smaller than that typically required when the blur correction is performed on a shot image generated by reading all the pixel values. Moreover, during the imaging mode (steps S111 to S118), no blur correction is performed on the shot images; only the shot images and the distance information are recorded in the recording section 59. The imaging processing is therefore carried out in real time, and the user can shoot images continuously while checking the through images. It should be noted that when the imaging device performs no high-load processing during the imaging mode, the blur correction may be performed on a shot image in parallel with the processes in the imaging mode.
The description above has been made with reference to the configuration in which the user selects an object in a display image and a shot image containing the object in focus is generated. Alternatively, the imaging control section 54 may be provided with a face recognition, person recognition, or any other object recognition function, so that an object in a display image undergoes face recognition or person recognition and a shot image containing the recognized object in focus is generated.
Furthermore, the description above has been made with reference to the configuration in which a single-focal-length lens is used as the lens 51a in the imaging section 51. A configuration in which a zoom lens is used as the lens in the imaging section 51 will be described below.
<3. Third embodiment>
[Exemplary functional configuration of imaging device]
Fig. 9 shows an exemplary functional configuration of an imaging device using a zoom lens as the lens in the imaging section 51.
The imaging device 211 includes an imaging section 51, an input unit 52, an input control section 53, an imaging control section 54, a distance information acquisition section 55, a display control section 56, a display section 57, a recording control section 58, a recording section 59, a zoom information acquisition section 231, a blur correction section 232, and a blur correction data storage section 233. The imaging section 51 includes the distance image sensor 51b and a zoom lens 234.
In the imaging device 211 shown in Fig. 9, the components having the same functions as those provided in the imaging device 11 shown in Fig. 3 have the same names and reference characters, and these components will not be described as appropriate.
That is, the imaging device 211 shown in Fig. 9 differs from the imaging device 11 shown in Fig. 3 in that the zoom information acquisition section 231 and the blur correction data storage section 233 are newly provided, and the lens 51a and the blur correction section 60 are replaced with the zoom lens 234 and the blur correction section 232.
The zoom information acquisition section 231 acquires zoom information on the zoom ratio of the zoom lens 234 in the imaging section 51 from the imaging section 51 in accordance with an instruction from the input control section 53, and supplies the zoom information to the blur correction section 232.
The blur correction section 232 reads, in accordance with an instruction from the input control section 53, blur correction data corresponding to the zoom information from the zoom information acquisition section 231 from the blur correction data storage section 233. The blur correction section 232 uses the blur correction data thus read to correct the blur contained in a shot image from the recording control section 58, and sends the blur-corrected shot image to the recording control section 58. The blur-corrected shot image sent to the recording control section 58 is recorded in the recording section 59.
The blur correction data storage section 233 is formed of, for example, a flash memory, and stores blur correction data prepared in advance in correspondence with zoom information. The blur correction section 232 reads the blur correction data stored in the blur correction data storage section 233 as required.
The zoom lens 234 is driven based on an instruction from the input control section 53, and the zoom ratio is determined based on the position of the zoom lens 234. That is, the zoom information acquired by the zoom information acquisition section 231 corresponds to the position of the zoom lens 234.
[Imaging processing performed by imaging device]
Imaging processing performed by the imaging device 211 will next be described with reference to the flowchart of Fig. 10.
The processes in steps S211 to S214 in the flowchart of Fig. 10 are substantially the same as those in steps S71 to S74 in the flowchart of Fig. 6, and these processes in Fig. 10 will not be described.
When the judgement in step S212 of Fig. 10 shows that a shutter operation has been performed, the input control section 53 sends instructions corresponding to the operation signal representing the shutter operation performed on the input unit 52 to the imaging control section 54, the distance information acquisition section 55, and the zoom information acquisition section 231.
In step S215, the zoom information acquisition section 231 acquires the zoom information associated with the zoom lens 234 in the imaging section 51 when the input control section 53 issues the instruction to the zoom information acquisition section 231, and sends the zoom information to the recording control section 58.
In step S216, the recording control section 58 not only associates the shot image from the imaging control section 54 with the distance information from the distance information acquisition section 55 based on the pixels in the distance image sensor 51b, but also associates the zoom information with the shot image and the distance information, and then records the shot image, the distance information, and the zoom information associated with each other in the recording section 59.
In step S217, the input control section 53 judges whether the user has performed a pressing operation on the input unit 52 serving as a mode switching button (not shown) to issue an instruction to terminate the imaging mode.
When the judgement in step S217 shows that no instruction to terminate the imaging mode has been issued, the control returns to step S211, and the processes in steps S211 to S217 are repeated.
On the other hand, when the judgement in step S217 shows that the instruction to terminate the imaging mode has been issued, the control proceeds to step S218.
In step S218, the recording control section 58 reads, from among the shot images captured in the processes in steps S211 to S216 and recorded in the recording section 59, the shot image recorded earliest along the time axis together with the distance information and the zoom information associated with that shot image. Of the shot image, distance information, and zoom information thus read, the recording control section 58 supplies the shot image to the display control section 56, and supplies the shot image, the distance information, and the zoom information to the blur correction section 232.
In step S219, the display control section 56 displays the shot image from the recording control section 58 on the display section 57. Since the shot image from the recording control section 58 has the full HD size as described above, the display control section 56 converts the resolution of the shot image into a VGA-size image that can be displayed on the display section 57, and displays the converted image on the display section 57.
In step S220, the input control section 53 judges, based on an operation signal from the input unit 52 serving as a touch panel layered on the monitor 38, whether the user has selected a predetermined object in the resolution-converted display shot image displayed on the display section 57.
When the judgement in step S220 shows that no object in the display shot image has been selected, the process in step S220 is repeated until the input control section 53 judges that a predetermined object in the display shot image has been selected.
On the other hand, when the judgement in step S220 shows that an object in the display shot image has been selected, the input control section 53 supplies coordinate information on coordinates in the shot image to the blur correction section 232. The coordinate information is contained in the operation signal from the input unit 52 serving as a touch panel, and the coordinates represent the region (block) corresponding to the object portion selected by the user in the display shot image.
In step S221, the blur correction section 232 corrects the blur in the shot image from the recording control section 58 based on the imaging characteristic of the zoom lens 234 corresponding not only to the distance information from the recording control section 58 but also to the zoom information from the recording control section 58, the distance information corresponding to the coordinates represented by the coordinate information from the input control section 53, that is, being associated with the object portion selected by the user.
Specifically, the blur correction section 232 first reads, from the blur correction data storage section 233, the blur correction data corresponding to the zoom information from the recording control section 58. An example of the blur correction data stored in the blur correction data storage section 233 is data on the PSF of the zoom lens 234 (PSF data) prepared in advance for each predetermined lens position. That is, in this process, the PSF data corresponding to the zoom lens 234 position at which the zoom ratio represented by the zoom information is achieved is read.
It is not necessary to prepare PSF data for continuous lens positions (zoom ratios); instead, PSF data may be prepared for discrete zoom ratios. When no PSF data corresponding to the zoom ratio represented by specific zoom information have been prepared in the form of blur correction data, the corresponding PSF data can be determined by performing linear interpolation or other processing on the prepared PSF data.
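The interpolation step described above can be sketched as follows. The `psf_store` dictionary, its discrete zoom ratios, and the 3×3 kernels are hypothetical stand-ins for the correction data store 233; the actual PSF data format is not specified in the text.

```python
import numpy as np

# Hypothetical correction data store: PSF kernels prepared in advance
# only for a few discrete zoom ratios, as the text describes.
psf_store = {
    1.0: np.array([[0.0, 0.1, 0.0],
                   [0.1, 0.6, 0.1],
                   [0.0, 0.1, 0.0]]),
    2.0: np.array([[0.05, 0.1, 0.05],
                   [0.1,  0.4, 0.1 ],
                   [0.05, 0.1, 0.05]]),
}

def psf_for_zoom(zoom, store):
    """Return the PSF for an arbitrary zoom ratio, linearly interpolating
    between the two nearest prepared ratios when no exact match exists."""
    if zoom in store:
        return store[zoom]
    ratios = sorted(store)
    lo = max(r for r in ratios if r < zoom)
    hi = min(r for r in ratios if r > zoom)
    t = (zoom - lo) / (hi - lo)          # interpolation weight in [0, 1]
    psf = (1 - t) * store[lo] + t * store[hi]
    return psf / psf.sum()               # renormalize so total energy stays 1
```

With the store above, `psf_for_zoom(1.5, psf_store)` returns the element-wise average of the two prepared kernels, renormalized to unit sum.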
The blur correction unit 232 then performs blur correction by deconvolving the shot image. The deconvolution is performed by using the PSF that is represented by the PSF data read from the blur correction data storage unit 233 and that corresponds to the distance information (object distance) associated with the selected region, that is, the region in the displayed shot image touched (selected) by the user on the display unit 57. The blur correction unit 232 supplies the blur-corrected shot image to the recording control unit 58.
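The text does not name a particular deconvolution algorithm, so the sketch below uses a Wiener filter, one common regularized form of deconvolution. The function name and the regularization constant `k` are illustrative; the PSF is assumed to be a small 2-D kernel already selected for the object distance and zoom ratio.

```python
import numpy as np

def wiener_deconvolve(blurred, psf, k=0.01):
    """Correct blur by deconvolving the shot image with the given PSF.

    A Wiener filter is used: H* / (|H|^2 + k), which regularizes the
    division by frequencies where the PSF response |H| is small."""
    # Embed the PSF in a full-size kernel, centered at the origin.
    kernel = np.zeros_like(blurred, dtype=float)
    kh, kw = psf.shape
    kernel[:kh, :kw] = psf
    kernel = np.roll(kernel, (-(kh // 2), -(kw // 2)), axis=(0, 1))
    H = np.fft.fft2(kernel)
    G = np.fft.fft2(blurred)
    F = G * np.conj(H) / (np.abs(H) ** 2 + k)
    return np.real(np.fft.ifft2(F))
```

Because the distance to the selected object is known in advance, a single deconvolution with the matching PSF suffices, rather than one per candidate object distance.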
In step S222, the recording control unit 58 records the blur-corrected shot image from the blur correction unit 232 in the recording unit 59.
In step S223, the recording control unit 58 judges whether blur correction has been performed on all the shot images captured and recorded in the processes in steps S211 to S216.
When the judgment in step S223 shows that not all the shot images have been processed, control returns to step S218, and the processes in step S218 and the subsequent steps are repeated on the remaining shot images in descending order of shooting time.
On the other hand, when the judgment in step S223 shows that all the shot images have been processed, the imaging process terminates.
In the processes described above, a predetermined object is selected in a shot image associated with distance information on the object distance, and blur in the shot image is corrected based on the imaging characteristic of the zoom lens that corresponds not only to the distance information associated with the selected object portion but also to the zoom information associated with the zoom lens. That is, since the distance to the object selected in the shot image (the object the user desires to bring into focus) is known in advance in the form of the distance information, and the PSF corresponding to the zoom information (zoom ratio) associated with the zoom lens is prepared in advance, the imaging apparatus can perform blur correction without deconvolving the shot image at each possible object distance by using the PSF corresponding to that distance. As a result, even an imaging apparatus including a zoom lens can generate a focused image with a reduced amount of computation, without performing autofocusing or otherwise bringing the object into focus.
In the above description, the distance to an object is determined in an imaging apparatus including a monocular lens by using the TOF (time of flight) method and the distance image sensor 51b (distance image sensor 32). The configuration described above is not essential and may be replaced with any other suitable configuration for determining the distance to an object.
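For reference, the TOF principle reduces to halving the round-trip travel time of the emitted light. The helper below is a minimal sketch of that relation, not the interface of the distance image sensor 51b.

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def tof_distance(round_trip_seconds):
    """Distance from a time-of-flight measurement: the light travels to the
    object and back, so the one-way distance is half the total path."""
    return C * round_trip_seconds / 2.0
```

For example, a measured round trip of 20 ns corresponds to an object roughly 3 m away.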
For example, in an imaging apparatus including a right lens and a left lens (binocular lenses) whose optical axes are parallel to each other, an object may be positioned on the center line of the imaging apparatus (the center line between the two optical axes), and the distance to the object can be determined based on the positional difference between the object images formed through the two lenses.
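Under a pinhole model with parallel optical axes, the positional difference (disparity) between the two object images yields the distance as Z = f·B/d, where f is the focal length and B is the baseline between the optical axes. The function below is a hypothetical sketch of this relation, not part of the described apparatus.

```python
def distance_from_disparity(focal_length_m, baseline_m, disparity_m):
    """Distance to an object from the disparity between the images formed
    through the right and left lenses: Z = f * B / d (pinhole model)."""
    return focal_length_m * baseline_m / disparity_m
```

For instance, a 50 mm focal length, a 100 mm baseline, and a 1 mm disparity give an object distance of 5 m.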
Further, the above description has been made with reference to a configuration in which the imaging apparatus is a digital still camera that captures still images. Alternatively, the imaging apparatus may, for example, be a digital video camcorder that captures video images. In this case, blur correction is performed on the image captured when the object is selected.
Moreover, the above description has been made with reference to the case where the optical system 31 provided in the imaging apparatus does not adjust the position where the image corresponding to the object light incident through the lens 31a is formed. Alternatively, the imaging apparatus may have the following two operation modes: a first mode in which no focus position adjustment is necessary, and a second mode in which so-called autofocusing is performed, and the operation mode may be switched between the first and second modes. In this case, the imaging apparatus performs the imaging process described above when operating in the first mode.
The series of processes described above may be implemented by hardware or software. When the series of processes is implemented by software, a program constituting the software is installed from a program recording medium into a computer incorporated in dedicated hardware or, for example, a general-purpose personal computer in which a variety of programs are installed to provide a variety of functions.
Figure 11 is a block diagram showing an exemplary hardware configuration of a computer that executes the series of processes described above in the form of a program.
The computer includes a CPU (central processing unit) 901, a ROM (read only memory) 902, and a RAM (random access memory) 903, which are connected to each other via a bus 904.
An input/output interface 905 is also connected to the bus 904. The following components are connected to the input/output interface 905: an input unit 906 formed of a keyboard, a mouse, a microphone, and other devices; an output unit 907 formed of a display, a loudspeaker, and other devices; a storage unit 908 formed of a hard disk drive, a nonvolatile memory, and other devices; a communication unit 909 formed of a network interface and other devices; and a drive 910 that drives a removable medium 911 (for example, a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory).
In the thus configured computer, the CPU 901 loads a program stored, for example, in the storage unit 908 into the RAM 903 via the input/output interface 905 and the bus 904 and executes the program to carry out the series of processes described above.
The program executed by the computer (CPU 901) may, for example, be recorded on the removable medium 911, which is a packaged medium formed, for example, of a magnetic disk (including a flexible disk), an optical disk (such as a CD-ROM (compact disk read only memory) or a DVD (digital versatile disk)), a magneto-optical disk, or a semiconductor memory. The program may alternatively be provided via a wired or wireless transmission medium, such as a local area network, the Internet, or digital satellite broadcasting.
The program provided in any of the forms described above can be installed in the storage unit 908 by loading the removable medium 911 into the drive 910 via the input/output interface 905. The program can alternatively be received by the communication unit 909 via a wired or wireless transmission medium and installed in the storage unit 908. Still alternatively, the program can be installed in the ROM 902 or the storage unit 908 in advance.
The program executed by the computer may be a program in which the processes are carried out successively in the chronological order described herein, or a program in which the processes are carried out concurrently or at necessary timings, for example, when the processes are invoked.
Embodiments of the invention are not limited to those described above, and various modifications can be made thereto without departing from the spirit of the invention.
The present application contains subject matter related to that disclosed in Japanese Priority Patent Application JP 2010-189302 filed in the Japan Patent Office on August 26, 2010, the entire contents of which are hereby incorporated by reference.
It should be understood by those skilled in the art that various modifications, combinations, sub-combinations, and alterations may occur depending on design requirements and other factors insofar as they are within the scope of the appended claims or the equivalents thereof.
Claims (12)
1. An imaging apparatus comprising:
an optical system that forms an image corresponding to object light incident through a lens;
an imaging device that generates a signal corresponding to the object light incident through the lens and outputs the signal as a shot image;
acquisition means for acquiring the distance from the optical system to an object to be shot in the shot image; and
correction means for correcting blur in the shot image output from the imaging device based on an imaging characteristic of the optical system corresponding to the distance acquired by the acquisition means,
wherein the acquired distance is an average of distance information associated with a plurality of blocks in the shot image corresponding to the object, each of the plurality of blocks including at least four adjacent pixels.
2. The imaging apparatus according to claim 1, further comprising:
display means for displaying the shot image; and
selection means for selecting an object in the shot image displayed on the display means based on a user's operation,
wherein the correction means corrects blur in the shot image according to the imaging characteristic of the optical system corresponding to the distance, among the object distances acquired by the acquisition means, to the object selected by the selection means.
3. The imaging apparatus according to claim 2,
wherein the display means displays a through image, which is an image captured in real time and generated by instructing the imaging device to perform pixel value addition or thinned-out reading,
the selection means selects an object in the through image based on a user's operation,
the acquisition means acquires the distance to the object when the selection means selects the object in the through image, and
the correction means corrects blur in the shot image according to the imaging characteristic of the optical system corresponding to the distance, among the object distances acquired by the acquisition means, to the object selected by the selection means in the through image.
4. The imaging apparatus according to claim 3, further comprising:
generation means for generating a focused through image from the through image, the focused through image being generated so that the object selected by the selection means in the through image is in focus,
wherein the display means displays the focused through image generated by the generation means.
5. The imaging apparatus according to claim 1,
wherein the imaging device outputs part of the signal corresponding to the object light incident through the lens, the part of the signal serving as distance information representing the distance to an object, and
the acquisition means acquires the distance to the object based on the distance information output from the imaging device.
6. The imaging apparatus according to claim 1,
wherein the lens is a single focal length lens focused on an object at infinity or a far distance.
7. The imaging apparatus according to claim 1,
wherein the lens is a zoom lens, and
the correction means corrects blur in the shot image based on an imaging characteristic of the optical system, the imaging characteristic being obtained in advance and corresponding not only to the object distance acquired by the acquisition means but also to a zoom state of the zoom lens.
8. The imaging apparatus according to claim 1,
wherein the correction means corrects blur in the shot image based on a point image intensity distribution of the optical system corresponding to the object distance acquired by the acquisition means.
9. The imaging apparatus according to claim 1,
wherein the correction means corrects blur in the shot image based on a line image intensity distribution of the optical system corresponding to the object distance acquired by the acquisition means.
10. The imaging apparatus according to claim 1,
wherein the correction means corrects blur in the shot image based on an optical transfer function of the optical system corresponding to the object distance acquired by the acquisition means.
11. An imaging method used in an imaging apparatus including an optical system, the method comprising:
acquiring the distance from the optical system to an object to be shot in a shot image; and
correcting blur in the shot image based on an imaging characteristic of the optical system corresponding to the acquired distance,
wherein the acquired distance is an average of distance information associated with a plurality of blocks in the shot image corresponding to the object, each of the plurality of blocks including at least four adjacent pixels.
12. An imaging system used in an imaging apparatus including an optical system, the imaging system comprising:
means for acquiring the distance from the optical system to an object to be shot in a shot image; and
means for correcting blur in the shot image based on an imaging characteristic of the optical system corresponding to the acquired distance,
wherein the acquired distance is an average of distance information associated with a plurality of blocks in the shot image corresponding to the object, each of the plurality of blocks including at least four adjacent pixels.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2010189302A JP2012049773A (en) | 2010-08-26 | 2010-08-26 | Imaging apparatus and method, and program |
JP2010-189302 | 2010-08-26 |
Publications (2)
Publication Number | Publication Date |
---|---|
CN102387301A CN102387301A (en) | 2012-03-21 |
CN102387301B true CN102387301B (en) | 2016-12-14 |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP0981245A2 (en) * | 1998-08-20 | 2000-02-23 | Canon Kabushiki Kaisha | Solid-state image sensing apparatus, control method therefor, basic layout of photoelectric conversion cell and storage medium |
CN1652009A (en) * | 2004-02-03 | 2005-08-10 | 佳能株式会社 | Imaging apparatus |
US20090147999A1 (en) * | 2007-12-10 | 2009-06-11 | Fujifilm Corporation | Image processing system, image processing method, and computer readable medium |
US20100103311A1 (en) * | 2007-06-06 | 2010-04-29 | Sony Corporation | Image processing device, image processing method, and image processing program |
CN102422630A (en) * | 2009-05-12 | 2012-04-18 | 佳能株式会社 | Image pickup apparatus |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN100545733C (en) | The control method of imaging device, imaging device and computer program | |
CN101527860B (en) | White balance control apparatus, control method therefor, and image sensing apparatus | |
CN107087107A (en) | Image processing apparatus and method based on dual camera | |
CN100565320C (en) | The control method of imaging device, imaging device and computer program | |
CN101582989B (en) | Image capture apparatus | |
TWI471004B (en) | Imaging apparatus, imaging method, and program | |
JP6791336B2 (en) | Imaging device | |
CN108141531B (en) | Image pickup apparatus | |
CN103002211B (en) | Photographic equipment | |
KR101501395B1 (en) | Image capturing device and image capturing method | |
CN101867723A (en) | Image processing apparatus, camera head and image-reproducing apparatus | |
CN101959020A (en) | Imaging device and formation method | |
CN104885440B (en) | Image processing apparatus, camera device and image processing method | |
CN102959942B (en) | Image capture device for stereoscopic viewing-use and control method thereof | |
CN101115139A (en) | Photographing apparatus and exposure control method | |
KR101204888B1 (en) | Digital photographing apparatus, method for controlling the same, and recording medium storing program to implement the method | |
JP6095266B2 (en) | Image processing apparatus and control method thereof | |
CN106254754A (en) | Filming apparatus, image processing apparatus, the control method of filming apparatus | |
CN104756493A (en) | Image capture device, image processing device, image capture device control program, and image processing device control program | |
CN102387301B (en) | Imaging device and formation method | |
CN114208153B (en) | Multi-shot image capture without anti-shake | |
JP6723853B2 (en) | Imaging device and control method thereof | |
CN104737527A (en) | Image processing device, imaging device, image processing method, and image processing program | |
JP7079833B2 (en) | Mobile information terminal | |
JP2014011639A (en) | Imaging device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
TA01 | Transfer of patent application right | Effective date of registration: 20160830; Address after: Kanagawa, Japan; Applicant after: Sony Semiconductor Solutions; Address before: Tokyo, Japan; Applicant before: Sony Corp |
GR01 | Patent grant |