WO2013061947A1 - Imaging method, image processing method, programs therefor, recording medium, and imaging apparatus - Google Patents
Imaging method, image processing method, programs therefor, recording medium, and imaging apparatus
- Publication number
- WO2013061947A1 (PCT/JP2012/077304)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- imaging
- focus
- subject
- focus state
- Prior art date
Classifications
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B7/00—Mountings, adjusting means, or light-tight connections, for optical elements
- G02B7/28—Systems for automatic generation of focusing signals
- G02B7/36—Systems for automatic generation of focusing signals using image sharpness techniques, e.g. image processing techniques for generating autofocus signals
- G02B7/38—Systems for automatic generation of focusing signals using image sharpness techniques, e.g. image processing techniques for generating autofocus signals measured at different points on the optical axis, e.g. focussing on two or more planes and comparing image data
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/0075—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 with means for altering, e.g. increasing, the depth of field or depth of focus
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B3/00—Simple or compound lenses
- G02B3/10—Bifocal lenses; Multifocal lenses
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B7/00—Mountings, adjusting means, or light-tight connections, for optical elements
- G02B7/28—Systems for automatic generation of focusing signals
- G02B7/34—Systems for automatic generation of focusing signals using different areas in a pupil plane
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B13/00—Viewfinders; Focusing aids for cameras; Means for focusing for cameras; Autofocus systems for cameras
- G03B13/32—Means for focusing
- G03B13/34—Power focusing
- G03B13/36—Autofocus systems
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/50—Constructional details
- H04N23/55—Optical parts specially adapted for electronic image sensors; Mounting thereof
Definitions
- the present invention relates to a technique for controlling an in-focus state with an imaging apparatus including a multifocal lens.
- Patent Document 1 describes that an image of a subject is obtained by convolution processing in an imaging apparatus including a multifocal lens.
- Japanese Patent Application Laid-Open No. H11-228707 describes that an imaging signal is corrected by an inverse function based on a point spread function in an imaging apparatus including a multifocal lens.
- In Patent Documents 1 and 2, images having different focal lengths are acquired using an imaging apparatus including a multifocal lens, and an in-focus image is obtained by a restoration process.
- Patent Document 3 describes that a barcode is read in a short focal region provided in the central portion of a multifocal lens and normal photographing is performed in a long focal region provided in a peripheral portion of the lens.
- In Patent Document 4, the same subject is photographed by a plurality of camera modules having different focal lengths, instead of using a multifocal lens, to obtain a multifocal image as a whole.
- The present invention has been made in view of such circumstances, and an object thereof is to provide an imaging method and an image processing method capable of acquiring a desired image utilizing the characteristics of a multifocal lens and of extracting and selecting the acquired image according to the purpose and preference, as well as programs therefor, a recording medium, and an imaging apparatus.
- To achieve the above object, an imaging method according to a first aspect of the present invention is an imaging method using a multifocal lens having a plurality of regions, each of the plurality of regions having a different focal length, and includes an in-focus state control step of controlling the in-focus state of the multifocal lens and an imaging step of acquiring an image of the subject in the controlled in-focus state.
- According to this imaging method, since the image of the subject is acquired by controlling the in-focus state of a multifocal lens having a plurality of different focal lengths, a desired image utilizing the characteristics of the multifocal lens, such as an accurately focused image or an image focused at high speed, can be acquired.
- In an imaging method according to a second aspect of the present invention, in the in-focus state control step, the in-focus state is controlled according to the relationship between the depth of field of the multifocal lens and the width of the distance distribution of the subject.
- In the second aspect, since focus control is performed according to the relationship between the depth of field of the multifocal lens and the width of the distance distribution of the subject, it is possible, depending on the shooting purpose, to acquire an image that is accurately focused in one of the plurality of focal regions according to the distance to the subject, or to acquire an image in which a specific subject is blurred by imaging at a lens position different from the in-focus position.
- the “depth of field of the multifocal lens” includes the depth of field of each focal region of the multifocal lens and the depth of field of the multifocal lens as a whole.
- the distance distribution can be detected by various methods. For example, it may be calculated using phase difference information, or may be calculated by comparing the contrast of a plurality of imaging signals acquired at different lens positions. Further, the image acquired in the second mode is not limited to a still image, and a moving image may be acquired by continuously performing focus control in accordance with a change in the distance distribution of the subject. In the second aspect, it is preferable that the imaging apparatus includes an imaging element including a light receiving element that selectively receives a light beam that has passed through any of the plurality of regions.
- In an imaging method according to a third aspect of the present invention, when the depth of field of the multifocal lens as a whole is not less than the width of the distance distribution of the subject, the in-focus state is controlled in the in-focus state control step so that the width of the distance distribution of the subject falls within the depth of field as a whole.
- The relationship between the depth of field of the multifocal lens as a whole and the width of the distance distribution of the subject varies depending on the design of the multifocal lens, the actual shooting conditions, and so on, and either of them may be the wider.
- In the third aspect, since an image is acquired by performing focus control so that the width of the distance distribution of the subject falls within the depth of field as a whole, an image that is appropriately focused in accordance with the distance distribution of the subject can be acquired.
- How the width of the distance distribution of the subject is covered by the depth of field of the multifocal lens as a whole can be determined according to the photographing purpose. For example, after controlling the distance distribution of the subject to fall within the depth of field as a whole, the center of the depth of field as a whole may be controlled to coincide with the center of the distance distribution of the subject. Alternatively, the overall depth of field may be controlled so as to be closer to the front side or the rear side of the distance distribution of the subject. In addition, control may be performed such that the width of the subject distance distribution falls within the depth of field as a whole and the degree of focus of a specific subject, such as the main subject, is greater than or equal to a threshold value in one of the focal regions.
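- The placements described above can be illustrated with a minimal sketch (an assumption for illustration, not part of the patent): given the subject distances and the width of the overall depth of field, the "center", "front", and "rear" policies simply position the depth-of-field window differently. All names and values below are hypothetical.

```python
# Minimal sketch (assumption): positioning the overall depth of field over the
# subject distance distribution according to the policies described above.

def place_overall_dof(subject_distances, dof_width, policy="center"):
    """Return (near_limit, far_limit) in metres for the overall depth of field.

    policy: "center" aligns the DOF centre with the centre of the subject
    distance distribution; "front"/"rear" bias the DOF toward the near or
    far side of the distribution.
    """
    near, far = min(subject_distances), max(subject_distances)
    if policy == "center":
        mid = (near + far) / 2.0
        return mid - dof_width / 2.0, mid + dof_width / 2.0
    if policy == "front":
        return near, near + dof_width
    return far - dof_width, far          # "rear"


# Example: subjects at 1.5 m, 2 m and 4 m with a 5 m overall depth of field
print(place_overall_dof([1.5, 2.0, 4.0], 5.0, policy="center"))  # (0.25, 5.25)
```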
- In an imaging method according to a fourth aspect of the present invention, when the depth of field of the multifocal lens as a whole is narrower than the width of the distance distribution of the subject, the in-focus state is controlled in the in-focus state control step so that the portions of the distance distribution of the subject that do not fit within the depth of field as a whole are equal on the front side and the back side of the depth of field.
- The fourth aspect prescribes one form of focus control for the case where the depth of field of the entire multifocal lens is narrower than the width of the distance distribution of the subject.
- In an imaging method according to a fifth aspect of the present invention, when the depth of field of the multifocal lens as a whole is narrower than the width of the distance distribution of the subject, the in-focus state is controlled in the in-focus state control step based on the focus priority order between subjects.
- For example, the subject with the highest priority may be controlled to be accurately focused in one of the focal regions, or the degree of focus of the subject with the highest priority may be set to a predetermined value or more while the degree of focus of subjects with the second or lower priority is maximized; in this way, an image that is appropriately focused in accordance with the distance distribution of the subject can be obtained.
- The imaging method according to a sixth aspect of the present invention is the imaging method according to the fifth aspect, further including a step of detecting a person's face and setting the focus priority of the person whose face is detected higher than that of subjects other than the person, wherein in the in-focus state control step the in-focus state is controlled based on the set focus priority order.
- An imaging method according to a seventh aspect of the present invention further includes a step of causing the user of the imaging apparatus to specify the focus priority, and in the in-focus state control step the in-focus state is controlled based on the specified focus priority.
- the focus priority order of a person may be increased, or the focus priority order of subjects other than the person may be increased.
- In an imaging method according to an eighth aspect of the present invention, in the in-focus state control step in the first aspect, the in-focus state is controlled so that the main subject is in focus in one of the plurality of regions. Note that whether or not a subject is the main subject can be determined based on, for example, whether the subject is a person's face or its proportion of the photographing region.
- the imaging method according to a ninth aspect of the present invention is the imaging method according to the eighth aspect, wherein the focus state is controlled so that the main subject is focused in an area having the shortest focal length among a plurality of areas in the focus state control step.
- the imaging method according to a tenth aspect of the present invention is the imaging method according to the eighth aspect, wherein the focus state is controlled so that the main subject is focused in the region having the longest focal length among the plurality of regions in the focus state control step.
- The imaging method according to an eleventh aspect of the present invention is the imaging method according to any one of the first to tenth aspects, wherein in the in-focus state control step the in-focus state is controlled based on the relationship between the aperture value of the multifocal lens and the depth of field of the multifocal lens.
- The depth of field varies depending on the aperture of the lens: generally, when the aperture value (F-number) increases, the depth of field becomes deeper, and when the aperture value decreases, the depth of field becomes shallower. Therefore, as in the eleventh aspect, by performing focus control based on the relationship between the aperture value of the multifocal lens and the depth of field of the multifocal lens, an image that is appropriately focused according to the distance distribution of the subject can be acquired.
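- The text gives only this qualitative relationship; the following sketch uses the standard thin-lens depth-of-field approximation (hyperfocal distance H = f*f/(N*c) + f), which is textbook optics rather than anything specified in this document, to show how the depth of field widens as the aperture value N increases.

```python
# Standard thin-lens depth-of-field approximation (textbook optics, not from
# the patent): larger F-numbers give a deeper depth of field.

def depth_of_field(focal_length_mm, f_number, focus_distance_mm, coc_mm=0.03):
    """Return (near_limit, far_limit) of the depth of field in millimetres."""
    f, N, s, c = focal_length_mm, f_number, focus_distance_mm, coc_mm
    H = f * f / (N * c) + f                       # hyperfocal distance
    near = H * s / (H + (s - f))
    far = float("inf") if s >= H else H * s / (H - (s - f))
    return near, far


# Example: a 35 mm lens focused at 3 m; stopping down from f/2.8 to f/8
# clearly enlarges the depth of field.
for N in (2.8, 8.0):
    print(N, depth_of_field(35.0, N, 3000.0))
```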
- The imaging method according to a twelfth aspect of the present invention is the imaging method according to any one of the first to eleventh aspects, further including a step of recording the acquired image and information indicating the degree of focus of the subject included in the acquired image in association with each other.
- the user can select and extract a desired image such as an image having a degree of focus equal to or greater than a predetermined value with reference to the degree-of-focus information after image acquisition.
- An image processing method according to a thirteenth aspect of the present invention includes a step of extracting an image from the recorded images based on the information recorded by the imaging method according to the twelfth aspect.
- the image processing method according to the fourteenth aspect of the present invention further includes a step of extracting an image including a subject having a designated degree of focus based on the recorded information.
- The image processing method according to a fifteenth aspect of the present invention further includes, in the thirteenth or fourteenth aspect, a step of synthesizing, from the extracted images, a new image having the designated degree of focus for the designated subject.
- the user can obtain an image with a desired degree of focus according to the purpose and preference.
- a specific value may be designated for the degree of focus, or only an upper limit value, only a lower limit value, or a range consisting of an upper limit value and a lower limit value may be designated.
- an imaging program according to a sixteenth aspect of the present invention causes an imaging apparatus to execute an imaging method according to any of the first to twelfth aspects.
- an image processing program according to a seventeenth aspect of the present invention causes an image processing apparatus to execute any one of the thirteenth to fifteenth image processing methods.
- the program according to the sixteenth and seventeenth aspects may be used by being incorporated in an imaging apparatus such as a digital camera, or may be used as image processing / editing software in a personal computer (PC) or the like.
- the computer-readable codes of the imaging programs according to the sixteenth and seventeenth aspects are recorded on the recording media according to the eighteenth and nineteenth aspects of the present invention, respectively.
- Examples of the recording media according to the eighteenth and nineteenth aspects include, in addition to the ROM/RAM of a digital camera or PC, non-transitory semiconductor storage media and magneto-optical recording media such as CDs, DVDs, BDs, HDDs, SSDs, and various memory cards.
- An imaging apparatus according to a twentieth aspect of the present invention includes a multifocal lens having a plurality of regions, each of the plurality of regions having a different focal length, in-focus state control means for controlling the in-focus state of the multifocal lens, and imaging means for acquiring an image of the subject in the controlled in-focus state, wherein the in-focus state control means controls the in-focus state according to the relationship between the depth of field of the multifocal lens and the width of the distance distribution of the subject.
- According to this imaging apparatus, by controlling the in-focus state according to the relationship between the depth of field of the multifocal lens and the width of the distance distribution of the subject, it is possible, depending on the shooting purpose, to acquire an image that is accurately focused in one of the plurality of focal regions according to the distance to the subject, or to acquire an image in which a specific subject is blurred by imaging at a lens position different from the in-focus position; a desired image utilizing the characteristics of the multifocal lens can thus be obtained.
- Here, the “depth of field of the multifocal lens” includes, as in the second aspect, the depth of field of each focal region of the multifocal lens and the depth of field of the multifocal lens as a whole.
- the image acquired in the twentieth aspect is not limited to a still image, and a moving image may be acquired by continuously performing focus control according to a change in the distance distribution of the subject.
- the imaging device according to the twentieth aspect preferably includes an imaging element including a light receiving element that selectively receives a light beam that has passed through any of the plurality of regions.
- In an imaging method according to a twenty-first aspect of the present invention, in the in-focus state control step, the in-focus state is controlled in accordance with the photographing instruction so that the main subject is focused through any one of the plurality of regions.
- The imaging apparatus used here preferably includes an imaging element including light receiving elements that selectively receive a light beam that has passed through any of the plurality of regions.
- Note that the “multifocal lens” is not limited to a single lens and includes a lens composed of a plurality of lenses, and “control of the multifocal lens” means that at least a part of the lenses constituting the multifocal lens is moved so as to focus on the subject.
- the “shooting instruction” may be input by the user pressing the shutter button, or may be generated by the imaging apparatus at each shooting time during moving image shooting.
- whether or not the subject is a main subject can be determined based on whether or not the subject is a person's face, a ratio in the photographing region, and the like.
- In the imaging method according to a twenty-second aspect of the present invention, in the in-focus state control step, the main subject is controlled to be in focus through the region having the shortest required focusing time among the plurality of regions.
- In this aspect, since the main subject is controlled so as to be focused through the region having the shortest required focusing time among the plurality of regions, focusing can be performed at higher speed.
- In an imaging method according to a twenty-third aspect of the present invention, in the twenty-first or twenty-second aspect, in the in-focus state control step, the main subject is controlled to be focused through the main region having the largest area among the plurality of regions.
- The obtainable image quality depends on the area of each focal region; in the imaging method according to the twenty-third aspect, since the image is acquired by controlling the main subject to be in focus through the main region having the largest area among the plurality of regions, a high-quality, accurately focused image can be obtained and the best performance of the lens can be utilized.
- The imaging method according to a twenty-fourth aspect of the present invention is the imaging method according to any one of the twenty-first to twenty-third aspects, wherein, in the in-focus state control step, the main subject is first controlled to be in focus through the region having the shortest required focusing time among the plurality of regions and an image of the subject is acquired in the imaging step, and then, in the in-focus state control step, the main subject is controlled to be in focus through the main region having the largest area among the plurality of regions and another image is acquired in the imaging step. By acquiring an image while controlling focus through the region having the shortest required focusing time and then through the main region having the largest area, both an image focused at high speed with no time lag and an image with a slight time lag but of high quality and accurately focused can be obtained.
- the user can select a desired image from these images according to the purpose and preference.
- The imaging method according to a twenty-fifth aspect of the present invention is the imaging method according to any one of the twenty-first to twenty-fourth aspects, wherein in the imaging step, after the photographing instruction and before the in-focus state control step is performed, images are acquired for all of the plurality of regions in the focus state at the time of the photographing instruction. Since imaging is performed in the focus state at the time of the photographing instruction and a multifocal lens is used, an image that is focused to some extent in one of the plurality of regions can be obtained at high speed.
- The imaging method according to a twenty-sixth aspect of the present invention is the imaging method according to any one of the twenty-first to twenty-fifth aspects, wherein, in accordance with the shooting instruction, at least one of control in the in-focus state control step so that the main subject is focused through the region having the shortest required focusing time among the plurality of regions and control so that the main subject is focused through the main region having the largest area among the plurality of regions is performed continuously, together with image acquisition, to obtain a moving image of the subject. In this way, at least one of an image focused at high speed and a high-quality image that is accurately focused can be obtained continuously.
- the user may specify which image is to be acquired, or both images may be continuously acquired and then selected and edited afterwards. A desired image can be obtained according to the user's preference.
- The imaging method according to a twenty-seventh aspect of the present invention is the imaging method according to any one of the twenty-first to twenty-sixth aspects, further including a step of recording the acquired image and imaging time information indicating the time from the imaging instruction to image acquisition in association with each other.
- Since the acquired image and the imaging time information indicating the time from the imaging instruction to image acquisition are recorded in association with each other, the user can refer to the imaging time information after image acquisition and select and extract a desired image, such as an image whose imaging time is a predetermined time or less.
- The imaging method according to a twenty-eighth aspect of the present invention is the imaging method according to any one of the twenty-first to twenty-seventh aspects, further including a step of recording the acquired image and focus degree information indicating the focus degree of the acquired image in association with each other.
- Since the acquired image and the focus degree information indicating the focus degree of the acquired image are recorded in association with each other, the user can refer to the focus degree information after image acquisition and select and extract a desired image, such as an image having a focus degree equal to or higher than a predetermined value.
- The imaging method according to a twenty-ninth aspect of the present invention is the imaging method according to any one of the twenty-first to twenty-eighth aspects, further including a step of recording the acquired image and imaging region information indicating from which region the acquired image was acquired in association with each other.
- Since the acquired image and the imaging region information indicating from which region the acquired image was acquired are recorded in association with each other, it is possible to select and extract an image shot through a specific region (for example, the main region).
- An image processing method according to a thirtieth aspect of the present invention includes a step of extracting an image from the recorded images based on the information recorded by the imaging method according to any one of the twenty-seventh to twenty-ninth aspects.
- Since an image is extracted with reference to the imaging time information, focus degree information, or imaging region information recorded in association with the image, a desired image can be obtained.
- An imaging program according to a thirty-first aspect of the present invention causes an imaging apparatus to execute the imaging method according to any one of the twenty-first to thirtieth aspects.
- An image processing program according to a thirty-second aspect of the present invention causes an image processing apparatus to execute the image processing method according to the thirtieth aspect.
- The programs according to the thirty-first and thirty-second aspects may be used by being incorporated in an imaging apparatus such as a digital camera, or may be used as image processing / editing software on a personal computer (PC) or the like.
- computer-readable codes of the imaging programs according to the thirty-first and thirty-second aspects are recorded on the recording media according to the thirty-third and thirty-fourth aspects of the present invention, respectively.
- Examples of the recording media according to the thirty-third and thirty-fourth aspects include, in addition to the ROM/RAM of a digital camera or PC, non-transitory semiconductor storage media and magneto-optical recording media such as CDs, DVDs, BDs, HDDs, SSDs, and various memory cards.
- An imaging apparatus according to a further aspect of the present invention includes a multifocal lens having a plurality of regions, each of the plurality of regions having a different focal length, an in-focus state control unit for controlling the in-focus state of the multifocal lens, and an imaging unit for acquiring an image of the subject in the controlled in-focus state, wherein the in-focus state is controlled so that the main subject is focused through any one of the plurality of regions in accordance with a shooting instruction.
- According to this imaging apparatus, since the image is acquired by controlling the in-focus state of the multifocal lens so that the main subject is in focus through one of the plurality of regions, an image in which the main subject is accurately focused can be obtained.
- the imaging apparatus preferably includes an imaging element including a light receiving element that selectively receives a light beam that has passed through any of a plurality of regions.
- Preferably, the in-focus state control means controls the in-focus state by a phase difference method. Whereas the contrast method requires the lens to be driven in order to detect the in-focus state, the phase difference method detects the in-focus state based on the phase difference of the subject image, so focus control can be performed at a higher speed than with the contrast method, and the effect of the present invention of accurate and high-speed focusing is further enhanced.
- As above, the “multifocal lens” is not limited to a single lens and includes a lens composed of a plurality of lenses, and “control of the multifocal lens” means that at least a part of the lenses constituting the multifocal lens is moved so as to focus on the subject.
- a desired image utilizing the characteristics of the multifocal lens can be acquired, and the acquired image can be extracted and selected according to the purpose and preference.
- FIG. 1 is a block diagram illustrating a configuration of the imaging apparatus 10.
- FIG. 2 is a schematic diagram showing the photographic lens 12.
- FIG. 3 is a schematic diagram illustrating another example of a multifocal lens.
- FIG. 4 is a schematic diagram showing the relationship between the subject distance and the focusing degree in the photographic lens 12.
- FIG. 5 is a schematic diagram illustrating the configuration of the image sensor 16.
- FIG. 6 is a schematic diagram showing the arrangement of the light receiving elements in the image sensor 16.
- FIG. 7 is a flowchart showing an imaging method according to the first embodiment of the present invention.
- FIG. 8 is a conceptual diagram showing an imaging method according to the first embodiment of the present invention.
- FIG. 9 is a flowchart showing an imaging method according to the second embodiment of the present invention.
- FIG. 10 is a flowchart showing an imaging method according to the third embodiment of the present invention.
- FIG. 11 is a diagram illustrating an example of priority setting between subjects in the imaging method according to the third embodiment of the present invention.
- FIG. 12 is a flowchart showing an imaging method according to the fourth embodiment of the present invention.
- FIG. 13 is a flowchart showing an image processing method according to the fifth embodiment of the present invention.
- FIG. 14 is a flowchart showing an image processing method according to the sixth embodiment of the present invention.
- FIG. 15 is a schematic diagram illustrating the relationship between the subject distance and the focusing degree in the photographing lens 12.
- FIG. 16 is a flowchart showing an imaging method according to the seventh embodiment of the present invention.
- FIG. 17 is another flowchart showing an imaging method according to the seventh embodiment of the present invention.
- FIG. 18 is a flowchart showing an imaging method according to the eighth embodiment of the present invention.
- FIG. 19 is another flowchart showing an imaging method according to the eighth embodiment of the present invention.
- FIG. 20 is a flowchart showing an imaging method according to the ninth embodiment of the present invention.
- FIG. 21 is another flowchart showing an imaging method according to the ninth embodiment of the present invention.
- FIG. 22 is a conceptual diagram showing an imaging method according to the ninth embodiment of the present invention.
- FIG. 23 is a schematic diagram illustrating an example of main subject designation in the imaging apparatus 10.
- FIG. 24 is a conceptual diagram showing image extraction processing in the present invention.
- FIG. 1 is a block diagram showing an embodiment of an imaging apparatus 10 (imaging apparatus, image processing apparatus) according to the first embodiment of the present invention.
- The overall operation of the imaging apparatus 10 is centrally controlled by a central processing unit (CPU) 40 (in-focus state control means, imaging means), and the programs necessary for the operation of the CPU 40 (including programs used for processing such as in-focus state control, imaging, and image extraction / composition) and parameters are stored in an EEPROM (Electrically Erasable and Programmable Read Only Memory) 46 (recording medium).
- These programs can also be recorded, in addition to the EEPROM 46 of the imaging apparatus 10, on a non-transitory semiconductor storage medium or magneto-optical recording medium such as a CD, DVD, BD, HDD, SSD, or one of various memory cards, and used in an image processing apparatus such as a digital camera or a personal computer (PC).
- the imaging device 10 is provided with operation units 38 such as a shutter button, a mode dial, a playback button, a MENU / OK key, a cross key, and a BACK key.
- a signal from the operation unit 38 is input to the CPU 40, and the CPU 40 controls each circuit of the imaging device 10 based on the input signal as described later.
- the shutter button is an operation button for inputting an instruction to start shooting, and is configured by a two-stroke switch having an S1 switch that is turned on when half-pressed and an S2 switch that is turned on when fully pressed.
- the mode dial is a means for selecting a still image / moving image shooting mode, a manual / auto shooting mode, a shooting scene, and the like.
- the playback button is a button for switching to a playback mode in which a still image or a moving image of a photographed and recorded image is displayed on the liquid crystal monitor 30.
- the MENU / OK key is an operation key that has both a function for instructing to display a menu on the screen of the liquid crystal monitor 30 and a function for instructing confirmation and execution of selection contents.
- the cross key is an operation unit that inputs instructions in four directions, up, down, left, and right, and functions as a cursor moving operation means, a zoom switch, a frame advance button in the playback mode, and the like.
- the BACK key is used to delete a desired object such as a selection item, cancel an instruction content, or return to the previous operation state.
- the image light indicating the subject is imaged on the light receiving surface of a solid-state imaging device (hereinafter referred to as “CCD”) 16 (imaging means) through the photographing lens 12 and the diaphragm 14.
- the photographing lens 12 is driven by a lens driving unit 36 (focusing state control means) controlled by the CPU 40, and focusing control described later is performed.
- the lens driving unit 36 moves the photographing lens 12 in the optical axis direction in accordance with a command from the CPU 40 to change the focal position.
- The photographing lens 12 (multifocal lens, imaging means) is a multifocal lens (trifocal lens) having a region 12a with a short focal length for short-distance shooting (hereinafter referred to as the near focus region), a region 12b with a focal length longer than that of the near focus region for photographing a person or the like (hereinafter referred to as the mid-focus region), and a region 12c with a focal length longer than that of the mid-focus region for photographing a landscape or the like (hereinafter referred to as the far focus region).
- In the photographing lens 12, half-moon-shaped regions are provided at the top and bottom of FIG. 2 with a region containing the lens center O between them, forming the near focus region 12a, the mid-focus region 12b, and the far focus region 12c.
- the middle focal region 12b having the largest area is the main region. In these areas, the specific value of the focal length may be set according to the purpose of photographing.
- While each focal region of the photographing lens 12 shown in FIG. 2 is formed in a half-moon shape or a band shape, a configuration in which a circular region 12a′ (mid-focus region) including the lens center O′ is used, as in the other example of a multifocal lens shown in FIG. 3, is also possible.
- The area ratio of the near focus region 12a, the mid-focus region 12b, and the far focus region 12c can be set to, for example, 30:50:20, but may be set to a different ratio depending on the characteristics of the optical system and the photographing purpose.
- the number of focal regions is not limited to 3, and may be 2 focal points, or 4 or more focal points.
- FIG. 4 (imaging method, imaging program) is a conceptual diagram showing the relationship between the subject distance and the focusing degree in the taking lens 12.
- The focus degree curves Ca, Cb, and Cc indicate the focus degree at subject distance D for the near focus region 12a, the middle focus region 12b, and the far focus region 12c, respectively.
- The distances at the points Pa, Pb, and Pc where the curves peak represent the in-focus distances of the respective regions of the photographing lens 12, the height of each curve corresponds to the degree of focus, and the spread of each curve corresponds to the depth of field.
- If the photographing lens 12 is controlled so that the subject distance coincides with Pa, Pb, or Pc, the subject is accurately focused in the corresponding focal region, and if the subject lies within the spread (Wa, Wb, Wc) of a curve Ca, Cb, or Cc, it is focused to some extent in that region according to its distance from Pa, Pb, or Pc. In addition, if the subject is within the depth of field of the entire photographing lens 12, it is focused to some extent in one of the regions.
- the area of the area surrounded by these curves corresponds to the area of each focal area.
- the shape of each curve and the degree of overlap with other curves, and the depth of field DOF of the entire photographing lens 12 determined thereby may be set according to the characteristics of the optical system and the purpose of photographing.
- These curves Ca, Cb, and Cc correspond to the MTF (Modulation Transfer Function) curves of the respective regions.
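- As a rough illustration (an assumption, not taken from the patent), the focus degree curves Ca, Cb, and Cc of FIG. 4 can be modelled as bell curves over subject distance, which makes it easy to compute which focal region gives the best focus at a given distance; the in-focus distances and spreads below are hypothetical.

```python
# Hypothetical model of the focus degree curves Ca, Cb, Cc over subject distance.
import math

REGIONS = {                     # assumed in-focus distance (m) and spread per region
    "near (12a)": (0.5, 0.4),
    "mid (12b)": (2.0, 0.5),    # main region
    "far (12c)": (8.0, 0.6),
}


def focus_degree(distance_m, peak_m, spread):
    """Bell-shaped focus degree in [0, 1], peaking at the region's in-focus distance."""
    x = math.log(distance_m) - math.log(peak_m)
    return math.exp(-(x * x) / (2.0 * spread * spread))


def best_region(distance_m):
    """Return the focal region with the highest focus degree at this distance."""
    return max(REGIONS, key=lambda r: focus_degree(distance_m, *REGIONS[r]))


print(best_region(0.6))   # "near (12a)"
print(best_region(3.0))   # "mid (12b)"
```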
- The CCD 16 includes near-image light-receiving cells 16a that receive the light beam that has passed through the near focus region 12a of the photographing lens 12, mid-image light-receiving cells 16b that receive the light beam that has passed through the middle focus region 12b, and far-image light-receiving cells 16c that receive the light beam that has passed through the far focus region 12c.
- The light-receiving cells 16a, 16b, and 16c selectively receive the light beams that have passed through the near focus region 12a, the middle focus region 12b, and the far focus region 12c, respectively, by means of the microlenses ML and the light-shielding films 18a, 18b, and 18c provided on the front surface of the light-receiving section 17.
- Each of the light-shielding films 18a, 18b, and 18c has a different shape.
- a light shielding member may be provided on the front surface of the microlens ML.
- The numbers of these light-receiving cells are in a ratio corresponding to the areas of the corresponding lens regions, and the mid-image light-receiving cells 16b corresponding to the middle focus region 12b are the most numerous.
- The image obtained from the mid-image light-receiving cells 16b serves as the main image, and a high-quality main image can be obtained by increasing the number of mid-image light-receiving cells 16b in accordance with the area ratio of the photographing lens.
- the number of light receiving cells may be approximately equal to the area ratio of the focal region corresponding to the light receiving cells.
- the CPU 40 controls the aperture 14 via the aperture drive unit 34, and controls the charge accumulation time (shutter speed) in the CCD 16 and the readout control of the image signal from the CCD 16 via the CCD control unit 32.
- the signal charge accumulated in the CCD 16 is read out as a voltage signal corresponding to the signal charge based on the readout signal applied from the CCD control unit 32 and added to the analog signal processing unit 20.
- the analog signal processing unit 20 samples and holds the R, G, and B signals for each pixel by a correlated double sampling process on the voltage signal output from the CCD 16, amplifies the signal, and adds the amplified signal to the A / D converter 21.
- the A / D converter 21 converts analog R, G, and B signals that are sequentially input into digital R, G, and B signals and outputs them to the image input controller 22.
- the digital signal processing unit 24 performs predetermined processing such as offset control, gain control processing including white balance correction and sensitivity correction, gamma correction processing, YC processing, etc., on the digital image signal input via the image input controller 22. Perform signal processing.
- the image data processed by the digital signal processing unit 24 is input to the VRAM 50.
- the VRAM 50 includes an A area and a B area each storing image data representing an image of one frame, and image data representing an image of one frame is rewritten alternately between the A area and the B area.
- the written image data is read from an area other than the area where the image data is rewritten.
- the image data read from the VRAM 50 is encoded by the video encoder 28 and output to the liquid crystal monitor 30, whereby the subject image is displayed on the liquid crystal monitor 30.
- The liquid crystal monitor 30 employs a touch panel and displays the acquired images; as described later, operations such as designating the main subject and other subjects, setting the focus priority order between subjects, and specifying a region to be blurred can be performed by user operation via the screen.
- The CPU 40 starts the AF operation (in-focus state control step), and the in-focus state of the photographing lens 12 is controlled via the lens driving unit 36.
- the image data output from the A / D converter 21 when the shutter button is half-pressed is taken into the AE detection unit 44.
- The CPU 40 calculates the brightness of the subject (shooting Ev value) from the integrated value of the G signals input from the AE detection unit 44, determines the aperture value of the diaphragm 14 and the electronic shutter (shutter speed) of the CCD 16 based on this shooting Ev value, and controls the diaphragm 14 and the charge accumulation time of the CCD 16 based on the result.
- The AF processing unit 42 (in-focus state control means) performs phase difference AF processing: it controls the focus lens in the photographing lens 12 so that the defocus amount obtained from the phase difference between the main-pixel and sub-pixel image data in a predetermined focus area of the image data becomes zero.
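- A minimal sketch of this kind of phase difference AF loop (an assumption about the general technique, not the patent's actual implementation) looks like the following; read_pixel_pair(), estimate_phase_shift(), and move_focus_lens() are hypothetical stand-ins for hardware and driver calls, and the conversion gain is an arbitrary example value.

```python
# Sketch of a phase-difference AF loop: drive the focus lens until the defocus
# amount estimated from the main-pixel / sub-pixel phase shift is nearly zero.

DEFOCUS_TOLERANCE_UM = 5.0      # stop when |defocus| falls below this
CONVERSION_GAIN = 120.0         # assumed um of defocus per pixel of phase shift


def phase_difference_af(read_pixel_pair, estimate_phase_shift, move_focus_lens,
                        max_iterations=10):
    for _ in range(max_iterations):
        main_img, sub_img = read_pixel_pair()             # images of the focus area
        shift_px = estimate_phase_shift(main_img, sub_img)
        defocus_um = shift_px * CONVERSION_GAIN           # signed defocus amount
        if abs(defocus_um) < DEFOCUS_TOLERANCE_UM:
            return True                                   # in focus
        move_focus_lens(-defocus_um)                      # drive the lens to cancel it
    return False
```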
- The image data output from the A/D converter 21 in response to the press of the shutter button is input from the image input controller 22 to the memory (SDRAM) 48 and temporarily stored.
- an image file is generated through signal processing such as YC processing in the digital signal processing unit 24 and compression processing to a JPEG (joint photographic experts group) format in the compression / decompression processing unit 26.
- the image file is read by the media controller 52 and recorded on the memory card 54.
- the image recorded on the memory card 54 can be reproduced and displayed on the liquid crystal monitor 30 by operating the reproduction button of the operation unit 38.
- FIG. 7 (imaging method, imaging program) is a flowchart showing the imaging process according to the first embodiment of the present invention, and the CPU 40 controls the execution based on the program (imaging program) stored in the EEPROM 46 (recording medium).
- an image is acquired by performing control so that the main subject is focused in the middle focus area 12b which is the main area. Images of the imaging process are shown in FIGS. 8A to 8F (imaging method and imaging program).
- the main subject is the child Q1, and the dog Q2 and the tree Q3 are other subjects.
- the focus areas 12a and 12c and the focusing curves Ca and Cc corresponding to these are omitted.
- the lens is first moved to the initial position (S100).
- the initial position may be arbitrarily set, for example, the lens position at which the middle focus area 12b, which is the main area, focuses on a subject having a distance of 2 m can be set as the initial position.
- When the photographing lens 12 has been moved to the initial position, photographing becomes possible.
- When the user presses the shutter button of the operation unit 38 and a photographing instruction is input (S102), the distance distribution of the subject is detected (S104), and based on the detected distance distribution, the child Q1, which is the main subject, is brought into focus in the main region (S106; in-focus state control step).
- As a result, the child Q1 falls within the width Wb of the middle focus region 12b (and also within the depth of field DOF of the entire photographing lens 12), and the position of the child Q1 coincides with the position of the peak Pb of the curve Cb.
- whether or not the subject is the main subject can be determined based on whether or not the subject is a person's face, a ratio in the photographing region, or a user designation via the liquid crystal monitor 30.
- the distance distribution can be detected by various methods. For example, it may be calculated using phase difference information as described in Japanese Patent Application Laid-Open No. 2011-124712, or calculated by comparing the contrast of a plurality of imaging signals acquired at different lens positions. You may make it do. The same applies to second to fourth embodiments described later.
- Next, an image is acquired (S108; imaging step), and the acquired image is recorded in the memory card 54 in association with the focus information (the width of the distance distribution of the subjects, the depth of field of the photographing lens 12, the focus degree of each subject, and the like) (S110). This makes it possible to extract and synthesize images with reference to the focus degree information, as described later.
- When a moving image is acquired, the processing of S104 to S110 is performed continuously in accordance with the movement of the main subject (that is, the change in its distance distribution), and images are acquired and recorded.
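- A compact sketch of this S100-S110 flow (an assumption for illustration; the camera object and its methods are hypothetical) is shown below: the main subject is focused in the main (mid-focus) region and each acquired frame is kept together with its focus information.

```python
# Sketch of the first-embodiment flow S100-S110 with hypothetical camera calls.

def capture_with_main_region_focus(camera, frames=1):
    camera.move_lens_to_initial_position()                     # S100
    camera.wait_for_shutter()                                  # S102: shooting instruction
    records = []
    for _ in range(frames):                                    # frames > 1 = moving image
        distances = camera.detect_distance_distribution()      # S104
        main_subject = camera.select_main_subject(distances)   # face / user designation
        camera.focus_region_on(region="mid", subject=main_subject)   # S106
        image = camera.capture()                               # S108
        focus_info = {                                         # S110: record together
            "distance_spread": max(distances) - min(distances),
            "depth_of_field": camera.overall_depth_of_field(),
            "focus_degree": camera.focus_degree_of(main_subject),
        }
        records.append((image, focus_info))
    return records
```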
- FIG. 9 (imaging method, imaging program) is a flowchart showing the imaging process according to the second embodiment of the present invention, and the CPU 40 controls the execution based on the program stored in the EEPROM 46.
- focus control is performed according to whether or not the depth of field of the photographing lens 12 is equal to or larger than the width of the distance distribution of the subject.
- the CPU 40 moves the photographing lens 12 to the initial position (S200).
- When a shooting instruction is input (S202), the depth of field DOF of the entire photographing lens 12 is calculated based on the aperture value of the diaphragm 14 (S204), and then the distance distribution of the subject is detected (S206).
- When the depth of field DOF of the entire photographing lens 12 is equal to or greater than the width SD of the distance distribution of the subject, the process proceeds to S210 (in-focus state control step), and the in-focus state is controlled so that the width SD falls within the depth of field DOF.
- In controlling the width SD of the distance distribution to fall within the depth of field DOF, control may be performed so that the center of the DOF coincides with the center of the SD, or the DOF may be brought closer to the front side or the rear side of the distance distribution.
- control may be performed so that the degree of focus on a specific subject such as the main subject is maximized.
- In this state, an image is captured (S212; imaging step), and the acquired image and the focus information for each of the subjects Q1 to Q3 (the width of the distance distribution of the subjects, the depth of field of the photographing lens 12, the focus degree of each subject, and the like) are recorded in association with each other in the memory card 54 (S214).
- On the other hand, when the depth of field DOF is narrower than the width SD of the distance distribution, the process proceeds to S216 (in-focus state control step), and focus control is performed so that the portions of SD that do not fit within the DOF protrude equally on the front side and the rear side of the DOF. An image is then acquired (S218; imaging step), and the acquired image and the focus degree information of each of the subjects Q1 to Q3 are recorded in association with each other in the memory card 54 (S220).
- As described above, in the second embodiment, an image is acquired by performing focus control according to whether or not the depth of field DOF of the photographing lens 12 as a whole is equal to or greater than the width SD of the distance distribution of the subject, so an image that is appropriately focused in accordance with the distance distribution of the subject can be acquired.
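- The branch between S210 and S216 can be summarised in a small sketch (an assumption for illustration only): if the overall DOF is at least as wide as the distance distribution SD, centre it on the distribution; otherwise let the distribution protrude equally on both sides.

```python
# Sketch of the second-embodiment branch: fit SD inside the DOF, or let it
# protrude equally front and back when the DOF is too narrow.

def fit_or_protrude(subject_distances, dof_width):
    """Return (near, far) limits (metres) that the overall DOF should cover."""
    near, far = min(subject_distances), max(subject_distances)
    sd = far - near                              # width SD of the distance distribution
    if dof_width >= sd:                          # S210: centre the DOF on the distribution
        mid = (near + far) / 2.0
        return mid - dof_width / 2.0, mid + dof_width / 2.0
    overflow = (sd - dof_width) / 2.0            # S216: equal protrusion front and back
    return near + overflow, far - overflow


# A 5 m DOF covers subjects spread over 2.5 m; a 1 m DOF does not.
print(fit_or_protrude([1.5, 2.0, 4.0], 5.0))     # (0.25, 5.25)
print(fit_or_protrude([1.5, 2.0, 4.0], 1.0))     # (2.25, 3.25)
```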
- FIG. 10 is a flowchart showing the imaging process according to the third embodiment of the present invention, and the CPU 40 controls the execution based on the program stored in the EEPROM 46.
- focusing control is performed according to whether or not the depth of field of the photographing lens 12 is equal to or larger than the width of the distance distribution of the subject as in the second embodiment.
- However, when the depth of field is narrower than the width of the distance distribution of the subject, focus control is performed based on the focus priority order between the subjects, and an image is acquired.
- the processing of S300 to S314 is the same as S200 to S214 of the flowchart of FIG.
- the depth of field DOF of the entire photographing lens 12 is calculated based on the aperture value of the aperture 14.
- the process proceeds to S316 and the priority between subjects is designated.
- The priority may be specified by detecting a face on the imaging apparatus 10 side and setting the priority of the person whose face is detected higher than that of subjects other than the person, or designation by the user via the liquid crystal monitor 30 having a touch panel function may be accepted.
- In the example of FIG. 11B, when a child Q1, a dog Q2, and a tree Q3 exist as subjects, the child Q1 is the main subject with the first priority, the dog Q2 has the second priority, and the tree Q3 has the third priority.
- These subjects Q1 to Q3 are provided with a frame FL1 (thick solid line), a frame FL2 (thin solid line), and a frame FL3 (dotted line) in order of priority so that the user can check the priority order between the subjects.
- the sizes and positions of the frames FL1 to FL3 may be changed by the user operating the liquid crystal monitor 30.
- (C) of FIG. 11 is an example in the case where the user sets priorities among subjects.
- In this case, priority is set in the order touched by the user (here, the order of the child Q1, the dog Q2, and the tree Q3), and the frames FL1 to FL3 are displayed as in (b) of FIG. 11.
- the process proceeds to S318 (in-focus state control step), and the focus control is performed so that a subject with a high priority order falls within the depth of field DOF of the photographing lens 12.
- various methods can be adopted as to how to control a subject having a higher priority within the depth of field DOF of the photographing lens 12.
- For example, the subject Q1 with the highest priority can be brought into focus in one of the focal regions of the photographing lens 12, or the focus degree of the subject Q1 with the highest priority can be set to a predetermined value or more while the subject Q2 with the second highest priority is also kept within the depth of field DOF.
- as many subjects as possible may be accommodated within the DOF.
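- One possible realisation of this priority-based placement (an assumption, not the patent's algorithm) is to add subjects to the depth-of-field window in priority order for as long as they still fit, as in the following sketch.

```python
# Sketch of priority-based focus placement (S316-S318): cover subjects in
# priority order until the overall DOF can hold no more of them.

def priority_focus_window(subjects, dof_width):
    """subjects: list of (priority, distance_m), with priority 1 the highest.
    Returns (near, far) limits of the overall DOF."""
    ordered = sorted(subjects)                       # priority 1 first
    covered = [ordered[0][1]]                        # always include the top subject
    for _, dist in ordered[1:]:
        candidate = covered + [dist]
        if max(candidate) - min(candidate) <= dof_width:
            covered = candidate                      # next subject still fits
    near, far = min(covered), max(covered)
    slack = dof_width - (far - near)
    return near - slack / 2.0, far + slack / 2.0     # centre the leftover DOF on them


# Child (priority 1) at 2 m, dog (2) at 3.5 m, tree (3) at 9 m, with a 3 m DOF:
print(priority_focus_window([(1, 2.0), (2, 3.5), (3, 9.0)], 3.0))   # (1.25, 4.25)
```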
- After the in-focus state has been controlled, an image of the subject is acquired (S320; imaging step), and the acquired image and the focus information (the width of the distance distribution of the subjects, the depth of field of the photographing lens 12, the focus degree of each subject, and the like) are recorded in association with each other in the memory card 54 (S322).
- an image that is appropriately focused in accordance with the distance distribution of the subject can be acquired.
- FIG. 12 (imaging method, imaging program) is a flowchart showing the imaging process according to the fourth embodiment of the present invention, and the CPU 40 controls the execution based on the program stored in the EEPROM 46.
- In the fourth embodiment, an image in which the region on the near side or the far side of the main subject is blurred is acquired.
- the processes of S400 to S406 are the same as the processes of S200 to S206 of the flowchart of FIG. 9 and S300 to S306 of the flowchart of FIG.
- the depth of field DOF of the entire photographing lens 12 is calculated based on the aperture value of the aperture 14.
- If focusing is performed so that the subjects behind the child Q1 (for example, the tree Q3) fall within the depth of field DOF of the photographing lens 12, the subject in front of the child Q1 (for example, the dog Q2) is blurred according to its distance from the child Q1.
- the middle focal region 12b of the photographic lens 12 and the corresponding focus degree curve Cb are not shown.
- Specifically, the CPU 40 controls the in-focus state so that the child Q1, which is the main subject, is focused in the far focus region 12c of the photographing lens 12 (S416: in-focus state control step), and imaging is performed (S418: imaging step).
- the captured image is recorded on the memory card 54 in association with the focus information (S420).
- the user can acquire a desired image in which the near side or the back side of the main subject is blurred in accordance with the shooting purpose.
- the user may designate a blur target area via the liquid crystal monitor 30, and focus control may be performed so that the area is blurred.
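- A very small sketch of this idea (an assumption; the per-region depth-of-field values are made up) is to pick a focal region whose depth of field contains the main subject but excludes the distance to be blurred.

```python
# Sketch: choose a focal region that keeps the main subject sharp while leaving
# the blur-target distance outside that region's depth of field.

REGION_DOF = {                   # assumed (near, far) DOF limits per region, in metres
    "near (12a)": (0.3, 1.0),
    "mid (12b)": (1.0, 4.0),
    "far (12c)": (4.0, 30.0),
}


def region_for_blur(main_subject_m, blur_target_m, regions=REGION_DOF):
    """Return a region whose DOF covers the main subject but not the blur target."""
    for name, (near, far) in regions.items():
        main_in = near <= main_subject_m <= far
        blur_out = not (near <= blur_target_m <= far)
        if main_in and blur_out:
            return name
    return None


# Main subject (child) at 5 m, subject to blur (dog) at 2 m -> use the far region
print(region_for_blur(5.0, 2.0))   # "far (12c)"
```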
- FIG. 13 (image processing method, image processing program) is a flowchart showing image processing according to the fifth embodiment of the present invention; the CPU 40 controls its execution based on a program (image processing program) stored in the EEPROM 46 (recording medium).
- In the fifth embodiment, a case will be described in which images are extracted from those acquired by the methods of the first to fourth embodiments and a new image in which all subjects have a degree of focus equal to or higher than a threshold value is synthesized from the extracted images.
- First, an image including a subject (here, the child Q1, the dog Q2, and the tree Q3, as in the examples of FIGS. 11A to 11C) is detected.
- When a subject (for example, the child Q1) is detected, the focus degree of that target subject is detected, and it is determined whether or not the focus degree is equal to or greater than a threshold value (S502).
- As described above, the image and the focus information are recorded in association with each other in the memory card 54, so the focus degree of the target subject can be detected by referring to this focus information.
- As the threshold value, a value set on the imaging apparatus 10 side may be used, or a value specified by the user via the operation unit 38 may be used; a threshold value may also be set for each subject.
- If the focus degree is equal to or greater than the threshold value, the process proceeds to S504 to determine whether or not the detected focus degree is the maximum so far. If it is determined to be the maximum, the processing target image is updated as the composition source image for the target subject (S506), and the process proceeds to S508 to determine whether or not all images have been processed. If the determination is NO in S502 or S504, the process proceeds to S508 without updating the composition source image. If the determination is YES in S508, the process advances to S510; if NO, the processes of S502 to S506 are repeated until all images have been processed. As a result, from the images recorded on the memory card 54, the image in which the focus degree of the target subject is equal to or greater than the threshold value and is the maximum is extracted for each subject.
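- The selection loop S502-S508 amounts to a per-subject "best image so far" search over the recorded focus information; the following sketch (an assumption about data layout, not the patent's code) keeps, for each subject, the image whose focus degree is above the threshold and highest.

```python
# Sketch of the fifth-embodiment selection loop (S502-S508).

def pick_composition_sources(records, threshold=0.7):
    """records: list of (image, {subject_name: focus_degree}) pairs, as recorded
    in the earlier embodiments. Returns {subject_name: best_image}."""
    best = {}                                    # subject -> (focus_degree, image)
    for image, focus_info in records:
        for subject, degree in focus_info.items():
            if degree < threshold:               # S502: below threshold, skip
                continue
            if subject not in best or degree > best[subject][0]:   # S504: new maximum
                best[subject] = (degree, image)  # S506: update the composition source
    return {subject: image for subject, (_, image) in best.items()}


records = [
    ("img_001", {"child Q1": 0.95, "dog Q2": 0.40, "tree Q3": 0.20}),
    ("img_002", {"child Q1": 0.60, "dog Q2": 0.90, "tree Q3": 0.75}),
]
print(pick_composition_sources(records))   # child from img_001, dog and tree from img_002
```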
- the image extraction / combination may be performed by displaying the pre-processing image and the post-processing image on the liquid crystal monitor 30.
- FIG. 14 (image processing method, image processing program) is a flowchart showing image processing according to the sixth embodiment of the present invention; the CPU 40 controls its execution based on a program (image processing program) stored in the EEPROM 46.
- In the sixth embodiment, a case will be described in which images are extracted from those acquired by the methods of the first to fourth embodiments and a new image in which a specific subject (subject of interest, for example the main subject) has a high focus degree and the other subjects have a low focus degree is synthesized from the extracted images.
- First, an image including a subject (here, the child Q1, the dog Q2, and the tree Q3, as in the examples of (a) to (c) of FIG. 11) is detected (S600). If a subject is detected, it is determined whether or not the detected subject is the subject of interest (S602).
- the child Q1 which is the main subject, can be set as the subject of interest by designation of the imaging device 10 or the user, but may be another subject. If the detected subject is the subject of interest, the process proceeds to S604, and if it is not the subject of interest (NO in S602), the process proceeds to S614.
- the focus level of the subject of interest is detected, and it is determined whether the focus level is equal to or greater than a threshold value.
- As described above, the image and the focus information (the width of the distance distribution of the subjects, the depth of field of the photographing lens 12, the focus degree of each subject, and the like) are recorded in association with each other in the memory card 54; therefore, the focus degree of the subject of interest can be detected by referring to this focus information.
- As the threshold value, a value set on the imaging apparatus 10 side may be used, or a value specified by the user via the operation unit 38 may be used; a threshold value may also be set for each subject.
- If the focus degree is equal to or greater than the threshold value, the process proceeds to S606 to determine whether the detected focus degree is the maximum so far. If it is determined to be the maximum (YES in S606), the processing target image is updated as the composition source image for the subject of interest (S608), and the process proceeds to S610 to determine whether or not all images have been processed. If the determination is NO in S604 or S606, the process proceeds to S610 without updating the composition source image. If the determination result is YES in S610, the process proceeds to S612; if NO, the processes of S600 to S608 are repeated until all images have been processed. As a result, the image in which the focus degree of the subject of interest is equal to or greater than the threshold value and is the maximum is extracted from the images recorded on the memory card 54.
- When the process proceeds to S614, it is determined whether the focus degree of the detected subject (a subject other than the subject of interest) is equal to or less than the threshold value. If so, the process proceeds to S616 to determine whether that focus degree is the minimum so far. If it is, the process proceeds to S618, where the processing target image is updated as a composition source image for subjects other than the subject of interest, and the process then proceeds to S610. If the determination in S614 or S616 is NO, the process proceeds to S610 without updating the composition source image.
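The selection logic of S600 to S618 can be sketched as follows. This is a minimal illustration under stated assumptions rather than the disclosed implementation: each image record is assumed to carry a "focus_degrees" dictionary mapping subject names to focus degrees, and the function name is introduced here only for explanation.

```python
def select_composition_sources(images, subject_of_interest, threshold):
    """Choose, per subject, a composition source image (sketch of S600-S618).

    `images` is assumed to be a list of dicts, each holding the acquired image
    under "pixels" and a "focus_degrees" dict mapping subject name -> focus
    degree (0.0-1.0), mirroring the focus information recorded with each image.
    Returns a dict mapping subject -> chosen image record.
    """
    best = {}  # subject -> (focus degree of current choice, image record)
    for image in images:  # repeated until all images have been processed (S610)
        for subject, focus in image["focus_degrees"].items():
            current = best.get(subject)
            if subject == subject_of_interest:
                # S604/S606: focus degree must be >= threshold and the maximum so far
                if focus >= threshold and (current is None or focus > current[0]):
                    best[subject] = (focus, image)          # S608
            else:
                # S614/S616: focus degree must be <= threshold and the minimum so far
                if focus <= threshold and (current is None or focus < current[0]):
                    best[subject] = (focus, image)          # S618
    return {subject: record for subject, (degree, record) in best.items()}
```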
- Image composition is then performed in S612. This image composition is a process of extracting, from the extracted images, the portion corresponding to each target subject and combining these portions into a single image. As described above, an image with a high focus degree is extracted for the subject of interest, and an image with a low focus degree is extracted for the other subjects, so a new image is synthesized in which the focus degree of the subject of interest is equal to or greater than the threshold value and the focus degree of the other subjects is equal to or less than the threshold value.
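The composition of S612 can likewise be sketched. The version below is a minimal illustration assuming that each chosen source image comes with a binary mask marking its subject's pixels; the masks, array shapes, and NumPy usage are assumptions made for this sketch, not part of the disclosed flowchart.

```python
import numpy as np

def compose_by_subject_masks(base_image, sources):
    """Paste each subject's region from its chosen source image (sketch of S612).

    base_image : H x W x 3 uint8 array used as the background.
    sources    : list of (source_image, subject_mask) pairs, where subject_mask
                 is an H x W boolean array marking the pixels of that subject.
    """
    result = base_image.copy()
    for source_image, subject_mask in sources:
        result[subject_mask] = source_image[subject_mask]  # copy only the subject's pixels
    return result
```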
- Such image extraction and composition may be performed while displaying the pre-processing image and the post-processing image on the liquid crystal monitor 30.
- In the case of moving images, the extraction reference value may be changed over time. For example, instead of always extracting images with a focus degree equal to or greater than a fixed threshold, images with a low focus degree may be extracted at first and images with a progressively higher focus degree extracted as time passes. When images extracted in this way are reproduced as a moving image, the initially blurred subject gradually becomes sharp, which gives a natural moving image in cases where rapid focusing would feel unnatural. Conversely, images may be extracted so that an initially focused subject gradually blurs.
- Such a time-varying reference value may be set by the imaging apparatus, or a value set by the user via the liquid crystal monitor 30 or the operation unit 38 may be used.
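One simple way to realize such a time-varying reference value is a linear ramp from a low to a high focus-degree threshold over the duration of the clip; the ramp shape and the parameter values below are illustrative assumptions only.

```python
def focus_threshold_at(t, t_start, t_end, low=0.2, high=0.8):
    """Extraction threshold that ramps linearly from `low` to `high` over time.

    Frames near t_start may be blurred (low threshold); frames near t_end must
    be sharply focused (high threshold). Swapping `low` and `high` produces the
    opposite effect, a focused subject that gradually blurs.
    """
    if t <= t_start:
        return low
    if t >= t_end:
        return high
    fraction = (t - t_start) / (t_end - t_start)
    return low + fraction * (high - low)
```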
- In the above embodiments, the apparatus that performs the imaging processing and the image processing has been described as the imaging apparatus 10. However, the imaging processing and the image processing according to the present invention are not limited to execution by an imaging apparatus; they can also be executed by a camera-equipped mobile phone, a personal computer (PC), a portable game machine, or the like.
- Next, a seventh embodiment of the present invention will be described.
- Since the configuration of the imaging apparatus 10 (imaging apparatus, image processing apparatus) according to the seventh embodiment is the same as that of the imaging apparatus 10 according to the first to sixth embodiments (see FIGS. 1 to 3, 5, 6, etc.), detailed description is omitted and only the differences from those embodiments are described.
- FIG. 15 is a schematic diagram showing the relationship between the subject distance and the focusing degree in the photographing lens 12 (multifocal lens).
- Focus degree curves Ca, Cb, and Cc indicate the focus degree at subject distance D for the near focus region 12a, the middle focus region 12b, and the far focus region 12c, respectively.
- The distances at the points Pa, Pb, and Pc where the curves peak represent the in-focus distances of the respective regions of the photographing lens 12; the height of each curve corresponds to the focus degree, and the spread of each curve corresponds to the depth of field.
- If the photographing lens is controlled so that the subject distance coincides with Pa, Pb, or Pc, the subject is brought into focus through the corresponding focal region; even when the subject merely lies within the spread of one of the curves Ca, Cb, or Cc, it is focused to some extent.
- The area enclosed by each curve corresponds to the area of the corresponding focal region.
- the shape of each curve and the degree of overlap with other curves may be set according to the characteristics of the optical system and the shooting purpose.
- the focus degree curves Ca, Cb, Cc are set so as to cover from a short distance of 1 m or less to a long distance (infinity).
- These curves Ca, Cb, and Cc correspond to MTF (Modulation Transfer Function) curves in the respective regions.
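Purely for illustration, the three focus degree curves can be modeled as bell-shaped functions of subject distance; the Gaussian form, peak distances, and widths below are assumptions standing in for the actual MTF data of the lens, not values taken from the disclosure.

```python
import math

# Assumed peak distances (m) and curve widths for the near, middle and far regions.
REGIONS = {
    "near":   {"peak": 0.8, "width": 0.5},
    "middle": {"peak": 2.0, "width": 1.5},
    "far":    {"peak": 8.0, "width": 6.0},
}

def focus_degree(region, subject_distance):
    """Bell-shaped stand-in for the focus degree curve of one focal region."""
    p = REGIONS[region]
    return math.exp(-((subject_distance - p["peak"]) ** 2) / (2.0 * p["width"] ** 2))

def best_region(subject_distance):
    """Return the focal region whose curve gives the highest focus degree."""
    return max(REGIONS, key=lambda region: focus_degree(region, subject_distance))
```

Under these assumed curves, best_region(1.0) would pick the near focus region, while best_region(10.0) would pick the far focus region.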
- FIG. 16 is a flowchart showing an imaging process when still image shooting is performed, and the CPU 40 controls the execution based on a program (imaging program) stored in the EEPROM 46 (recording medium).
- When the imaging processing starts, the CPU 40 moves the photographing lens 12 to the initial position (S700). When a shooting instruction is input (S702), an image is acquired over all the regions with the lens kept at the position at the time of the instruction, and the image is recorded (S704).
- When the image acquired in S704 is recorded, it may be recorded in association with the focusing information described later.
- Next, the CPU 40 detects the in-focus state of the photographing lens 12 and calculates the time required for focusing in each of the focal regions 12a, 12b, and 12c (S706).
- As a result of the focusing time calculation, if the focusing time for the main region is the shortest (YES in S708), focus control of the main subject onto the main region is performed (S709; focus state control step), an image is acquired (S710; imaging step), and the acquired image and the focusing information are recorded on the memory card 54 in association with each other (S712). On the other hand, if the main region does not have the shortest focusing time (NO in S708), the process proceeds to S714, where the photographing lens 12 is controlled so as to focus through the region with the shortest required focusing time (S714; focus state control step), an image is acquired (S716; imaging step), and the acquired image and the focusing information are recorded on the memory card 54 in association with each other (S718).
- The time required for focusing can be calculated from the lens position at the time of the shooting instruction, the lens position at which focus is achieved, and the lens movement speed.
- In this way, in accordance with the shooting instruction, the main subject is focused through the region with the shortest required focusing time, so that an image can be acquired quickly. In addition, since the main subject is focused through one of the focal regions, an accurately focused image can be obtained.
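The focusing-time comparison underlying S706 and S708 can be sketched as follows. Lens positions are treated as scalar values and the drive speed as constant, which are simplifying assumptions for illustration rather than details given in the disclosure.

```python
def focusing_time(current_position, target_position, lens_speed):
    """Time needed to drive the lens from its current position to the in-focus position."""
    return abs(target_position - current_position) / lens_speed

def choose_region_to_focus(current_position, target_positions, lens_speed):
    """Return (region, time) for the region that can be focused fastest.

    target_positions maps each region name to the lens position at which the
    main subject is focused through that region (cf. the calculation in S706).
    """
    times = {region: focusing_time(current_position, position, lens_speed)
             for region, position in target_positions.items()}
    fastest = min(times, key=times.get)
    return fastest, times[fastest]
```

If the fastest region happens to be the main region, it is used directly, as in S708 and S709; otherwise the fastest region is used first.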
- Next, an eighth embodiment of the imaging processing according to the present invention will be described. FIG. 18 (imaging method, imaging program) is a flowchart showing the imaging processing for still image shooting; the CPU 40 controls its execution based on a program (imaging program) stored in the EEPROM 46.
- When the imaging processing starts, the CPU 40 moves the photographing lens 12 to the initial position (S800). When a shooting instruction is input (S802), an image is acquired over all the regions with the lens kept at the position at the time of the instruction, and the image is recorded (S804; imaging step).
- Next, the CPU 40 detects the in-focus state of the photographing lens 12 (S806), focuses the main subject through the main region (S807; focus state control step), and acquires an image (S808; imaging step). The acquired image and the focusing information are associated with each other and recorded in the memory card 54 (S810).
- In this way, the main subject is focused through the main region and an image is acquired in accordance with the shooting instruction, so that a high-quality, accurately focused image can be obtained.
- Next, a ninth embodiment of the imaging processing according to the present invention will be described. FIG. 20 (imaging method, imaging program) is a flowchart showing the imaging processing for still image shooting; the CPU 40 controls its execution based on a program (imaging program) stored in the EEPROM 46. In the ninth embodiment, as described below, imaging is first performed in the focal region with the shortest required focusing time, and if that region is not the main region, imaging is then performed in the main region as well.
- When the imaging processing starts, the CPU 40 first moves the photographing lens 12 to the initial position; FIG. 22A schematically shows this state.
- The initial position may be set arbitrarily; for example, the lens position at which the middle focus region 12b, which corresponds to the main region, focuses on a subject at a distance of 2 m can be used as the initial position.
- When the photographing lens 12 has been moved to the initial position, shooting becomes possible. When a shooting instruction is input (S902), an image is acquired and recorded over the entire region (that is, all of the near focus region 12a, the middle focus region 12b, and the far focus region 12c) with the lens kept at the position at the time of the instruction (S904; imaging step).
- As described above, each focal region of the photographing lens 12 is set so as to cover from a short distance to a long distance, so that, as shown in FIG. 22, the main subject Q is focused to some extent in one of the focal regions even at the lens position at the time of the shooting instruction. Therefore, an image of a certain image quality can be acquired at high speed.
- Next, the CPU 40 detects the in-focus state of the photographing lens 12 and calculates the time required for focusing in each focal region (S906).
- If the focusing time for the main region is the shortest, the process proceeds to S909 (focus state control step), where focus control onto the main region is performed; an image is then acquired (S910; imaging step), and the acquired image and the focusing information are associated with each other and recorded in the memory card 54 (S912).
- If the main region does not have the shortest required focusing time, the photographing lens 12 is controlled so as to focus through the region with the shortest required focusing time (S914; focus state control step), an image is acquired (S916; imaging step), and the acquired image and the focusing information are associated with each other and recorded in the memory card 54 (S918).
- In the example shown in FIG. 22, the subject distance of the main subject Q is closest to the peak Pc, so the focusing time for the far focus region 12c is the shortest. Therefore, an image is first acquired by focusing through the far focus region 12c (S914, S916), and then, as shown in (d) of FIG. 22, focusing is performed through the middle focus region 12b, which is the main region, and an image is acquired (S910, S912).
- The processing may be terminated at this point without performing image acquisition in the other regions.
- Here, the focusing information means imaging time information indicating the time from the shooting instruction to image acquisition, focus degree information indicating the degree of focus of the acquired image, and imaging region information indicating through which region of the photographing lens 12 the image was acquired. The same applies to the seventh and eighth embodiments.
- In this way, images are acquired in all the focal regions while the lens is at the initial position, then an image is acquired in the region with the shortest required focusing time, and, if that region is not the main region, an image is further acquired in the main region. Consequently, both an image focused at high speed and a high-quality, accurately focused image can be acquired. In addition, by focusing and imaging through one of the focal regions of the photographing lens 12, which is a multifocal lens, an image focused to the full performance of the lens can be acquired.
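The association of each image with its focusing information can be represented, for example, by a small record type; the field names below are assumptions chosen to mirror the three kinds of information listed above and are not taken from the disclosure.

```python
from dataclasses import dataclass

@dataclass
class RecordedImage:
    """One acquired image together with the focusing information recorded with it."""
    pixels: bytes            # encoded image data as written to the memory card
    imaging_time: float      # seconds from the shooting instruction to image acquisition
    focus_degree: float      # degree of focus of the acquired image (0.0-1.0)
    imaging_region: str      # "near", "middle" (main region) or "far"
```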
- The main subject may be designated by recognizing a person as the main subject when a person's face is detected, or by recognizing the subject occupying the largest area as the main subject. Further, a subject at a position touched by the user on the liquid crystal monitor 30, which has a touch panel function, may be recognized as the main subject.
- (a) to (c) of FIG. 23 (imaging method, imaging program) are diagrams showing such main subject designation processing. As shown in (a) of FIG. 23, an example is shown in which the dog Q2 exists at a short distance, the child Q1 at a medium distance, and the tree Q3 at a long distance.
- FIG. 23B shows an example in which the imaging apparatus 10 recognizes the child Q1 as a main subject by face recognition, and a frame FL indicating the main subject is displayed around the face of the child Q1.
- FIG. 23C shows an example in which the child Q1 is recognized as the main subject by the user touching the face of the child Q1 on the liquid crystal monitor 30; the frame FL is displayed centered on the touched point.
- Such a frame FL may be changed in size and moved by a user operation via the liquid crystal monitor 30.
- FIG. 21 is a flowchart showing the moving image shooting process according to the ninth embodiment. Steps in which processing similar to that in the flowchart of FIG. 20 is performed are denoted by the same reference numerals and detailed description thereof is omitted.
- In moving image shooting, the processing of S906 to S912 is performed continuously after the image acquisition and recording in the main region (S910, S912), until moving image shooting ends (NO in S920). That is, an image is acquired in the region with the shortest required focusing time, and if that region is not the main region, an image is also acquired in the main region; this is repeated until moving image shooting ends.
- In the example of FIGS. 22A to 22F, after the main subject Q is focused in the main region in FIG. 22D and an image is acquired and recorded (S910, S912), the main subject Q moves closer to the imaging apparatus. In this case, as shown in FIG. 22, the CPU 40 controls the photographing lens 12 so that the main subject is focused through the near focus region 12a (S914), and an image is acquired and recorded (S916, S918). Further, as shown in FIG. 22, the photographing lens 12 is controlled so that the main subject is again focused through the main region (S909), and an image is acquired and recorded (S910, S912).
- In this way, an image is acquired in the region with the shortest required focusing time in accordance with the shooting instruction, and if that region is not the main region, an image is also acquired in the main region; it is therefore possible to acquire both an image focused at high speed and a high-quality, accurately focused image. In addition, by focusing and imaging through one of the focal regions of the photographing lens 12, which is a multifocal lens, an image focused to the full performance of the lens can be acquired.
- Here, shooting over the entire region after the shooting instruction (S904) is omitted, but such shooting may be performed as necessary.
- In the above focusing control, the AF processing unit 42 may perform contrast AF processing. In this case, the AF processing unit 42 integrates the high-frequency components of the image data in a predetermined focus area to calculate an AF evaluation value indicating the in-focus state, and the focus state of the photographing lens 12 is controlled so that this AF evaluation value is maximized.
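A typical contrast-AF evaluation value of this kind sums a high-frequency measure over the focus area. The Laplacian-based measure below is one common choice, offered as an illustrative sketch and not as the actual computation of the AF processing unit 42.

```python
import numpy as np

def af_evaluation_value(gray_image, focus_area):
    """Sum of high-frequency energy inside the focus area (contrast-AF sketch).

    gray_image : 2-D float array of luminance values.
    focus_area : (top, bottom, left, right) bounds of the evaluation window.
    """
    top, bottom, left, right = focus_area
    roi = gray_image[top:bottom, left:right]
    # Discrete Laplacian used as a simple high-pass filter.
    lap = (roi[1:-1, :-2] + roi[1:-1, 2:] + roi[:-2, 1:-1] + roi[2:, 1:-1]
           - 4.0 * roi[1:-1, 1:-1])
    return float(np.sum(lap ** 2))
```

The lens position that maximizes this evaluation value over the search range would then be taken as the in-focus position.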
- FIGS. 24 (a) to 24 (c) are conceptual diagrams showing the image extraction processing in the present invention.
- Moving image shooting is performed between times t0 and tf.
- Which focal region each image is acquired through depends on which of the seventh to ninth embodiments of the imaging processing described above is used, on which subject is the main subject, and so on.
- In FIG. 24A, the images A, B, and C shown below times t0 to tf are images focused and acquired through the near focus region 12a, the middle focus region 12b, and the far focus region 12c, respectively.
- an image is acquired in the area with the shortest required focusing time, and if the area with the shortest required focusing time is not the main area, the image is also acquired in the main area.
- Therefore, two images may be acquired within the time interval Δt corresponding to one frame; in the following description, such two images are referred to together, for example, as "images A and B captured at time ti".
- In FIG. 24A, it is assumed that the images shown in the lower part of the figure have been acquired at times t0, t1, t2, ..., tf-1, tf.
- At time t0, image C is acquired because the region with the shortest required focusing time was the far focus region 12c, and then, because that region is not the main region (middle focus region 12b), image B is also acquired in the main region; the figure shows this situation.
- As described above, the image and the focusing information (imaging time information, focus degree information, imaging region information) are recorded in association with each other. Therefore, a desired image can be selected and extracted by referring to the focusing information recorded for each image.
- FIG. 24B shows an example in which the image with the shortest imaging time is extracted from the image group shown in FIG. 24A.
- For example, since image C is acquired first and image B afterward at time t0, image C is extracted. Alternatively, the image B captured in the main region can be extracted by referring to the imaging region information.
- By referring to the focus degree information, an image having a focus degree equal to or greater than a threshold value may be extracted. In this case, when there are a plurality of such images, the image with the highest focus degree may be extracted.
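Selection by the recorded focusing information can be sketched with simple filters over the records; the attribute names reuse the illustrative RecordedImage record introduced earlier and are assumptions, not part of the disclosure.

```python
def pick_fastest(records):
    """Extract the image with the shortest imaging time (cf. FIG. 24B)."""
    return min(records, key=lambda r: r.imaging_time)

def pick_main_region(records, main_region="middle"):
    """Extract an image captured through the main region, if any."""
    candidates = [r for r in records if r.imaging_region == main_region]
    return candidates[0] if candidates else None

def pick_best_focused(records, threshold):
    """Among images with focus degree >= threshold, extract the best-focused one."""
    candidates = [r for r in records if r.focus_degree >= threshold]
    return max(candidates, key=lambda r: r.focus_degree) if candidates else None
```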
- The information to be referred to may be changed over time, and the extraction reference value may also be changed over time. For example, instead of always extracting images with a focus degree equal to or greater than a fixed threshold, images with a low focus degree may be extracted at first and images with a progressively higher focus degree extracted as time passes. When images extracted in this way are reproduced as a moving image, the initially blurred subject gradually becomes sharp, which gives a natural moving image in cases where rapid focusing would feel unnatural. Conversely, images may be selected and extracted so that an initially focused subject gradually blurs.
- Such image selection and extraction may be performed while displaying the pre-processing image and the post-processing image on the liquid crystal monitor 30.
- DESCRIPTION OF SYMBOLS: 10 ... imaging device, 12 ... photographing lens, 12a ... near focus region
Abstract
Description
FIG. 1 is a block diagram showing an imaging apparatus 10 (imaging apparatus, image processing apparatus) according to the first embodiment of the present invention. The overall operation of the imaging apparatus 10 is centrally controlled by a central processing unit (CPU) 40 (focus state control means, imaging means), and the programs required for the operation of the CPU 40 (including the programs according to the present invention used for processing such as focus state control, imaging, and image extraction and composition) and parameters are stored in an EEPROM (Electronically Erasable and Programmable Read Only Memory) 46 (recording medium). The programs according to the present invention (imaging program, image processing program) can be recorded not only in the EEPROM 46 of the imaging apparatus 10 but also on non-transitory semiconductor storage media or magneto-optical recording media such as CDs, DVDs, BDs, HDDs, SSDs, and various memory cards, and can be used in an image processing apparatus such as a digital camera or a personal computer (PC).
<First Embodiment>
Next, the imaging processing performed by the imaging apparatus 10 configured as described above will be described. FIG. 7 (imaging method, imaging program) is a flowchart showing the imaging processing according to the first embodiment of the present invention; the CPU 40 controls its execution based on a program (imaging program) stored in the EEPROM 46 (recording medium). In the first embodiment, an image is acquired while controlling the lens so that the main subject is focused in the middle focus region 12b, which is the main region. An outline of the imaging processing is shown in (a) to (f) of FIG. 8 (imaging method, imaging program). In (a) of FIG. 8, the main subject is the child Q1, and the dog Q2 and the tree Q3 are the other subjects. In (a) of FIG. 8, the focal regions 12a and 12c and the corresponding focus degree curves Ca and Cc are not shown.
FIG. 9 (imaging method, imaging program) is a flowchart showing the imaging processing according to the second embodiment of the present invention; the CPU 40 controls its execution based on a program stored in the EEPROM 46. In the second embodiment, focus control is performed according to whether or not the depth of field of the photographing lens 12 is equal to or greater than the width of the subject distance distribution.
FIG. 10 (imaging method, imaging program) is a flowchart showing the imaging processing according to the third embodiment of the present invention; the CPU 40 controls its execution based on a program stored in the EEPROM 46. As in the second embodiment, the third embodiment performs focus control according to whether or not the depth of field of the photographing lens 12 is equal to or greater than the width of the subject distance distribution; however, unlike the second embodiment, when the depth of field DOF is narrower than the width SD of the subject distance distribution, the focus control is performed based on the focusing priority among the subjects and an image is acquired.
FIG. 12 (imaging method, imaging program) is a flowchart showing the imaging processing according to the fourth embodiment of the present invention; the CPU 40 controls its execution based on a program stored in the EEPROM 46. In the fourth embodiment, an image is acquired in which the region in front of or behind the main subject is blurred.
Next, extraction and composition of images acquired by the methods described above will be described.
FIG. 13 (image processing method, image processing program) is a flowchart showing the image processing according to the fifth embodiment of the present invention; the CPU 40 controls its execution based on a program (image processing program) stored in the EEPROM 46 (recording medium). The flowchart of FIG. 13 describes a case in which images are extracted from the images acquired by the methods according to the first to fourth embodiments described above, and a new image is synthesized from the extracted images such that all the subjects have a focus degree equal to or greater than a threshold value.
FIG. 14 (image processing method, image processing program) is a flowchart showing the image processing according to the sixth embodiment of the present invention; the CPU 40 controls its execution based on a program (image processing program) stored in the EEPROM 46. The flowchart of FIG. 14 describes a case in which images are extracted from the images acquired by the methods according to the first to fourth embodiments described above, and a new image is synthesized from the extracted images such that the focus degree of a specific subject (subject of interest) is equal to or greater than a threshold value and the focus degree of the other subjects is equal to or less than the threshold value (for example, the main subject is kept in sharp focus while the other subjects are kept less focused).
[Configuration of Imaging Apparatus]
Next, the seventh embodiment of the present invention will be described. Since the configuration of the imaging apparatus 10 (imaging apparatus, image processing apparatus) according to the seventh embodiment is the same as that of the imaging apparatus 10 according to the first to sixth embodiments described above (see FIGS. 1 to 3, 5, 6, and so on), detailed description is omitted and the differences from those embodiments are described.
Next, the imaging processing performed by the imaging apparatus 10 configured as described above will be described. FIG. 16 (imaging method, imaging program) is a flowchart showing the imaging processing for still image shooting; the CPU 40 controls its execution based on a program (imaging program) stored in the EEPROM 46 (recording medium). When the imaging processing starts, the CPU 40 moves the photographing lens 12 to the initial position (S700). When a shooting instruction is input (S702), an image is acquired over all the regions with the lens kept at the position at the time of the instruction, and the image is recorded (S704). When the image acquired in S704 is recorded, it may be recorded in association with the focusing information described later. Next, the CPU 40 detects the in-focus state of the photographing lens 12 and calculates the time required for focusing in each of the focal regions 12a, 12b, and 12c (S706). As a result of the focusing time calculation, if the focusing time for the main region is the shortest (YES in S708), focus control of the main subject onto the main region is performed (S709; focus state control step), an image is acquired (S710; imaging step), and the acquired image and the focusing information are recorded on the memory card 54 in association with each other (S712). On the other hand, if the main region does not have the shortest focusing time (NO in S708), the process proceeds to S714, where the photographing lens 12 is controlled so as to focus through the region with the shortest required focusing time (S714; focus state control step), an image is acquired (S716; imaging step), and the acquired image and the focusing information are recorded on the memory card 54 in association with each other (S718).
Next, an eighth embodiment of the imaging processing according to the present invention will be described. FIG. 18 (imaging method, imaging program) is a flowchart showing the imaging processing for still image shooting; the CPU 40 controls its execution based on a program (imaging program) stored in the EEPROM 46.
Next, a ninth embodiment of the imaging processing according to the present invention will be described. FIG. 20 (imaging method, imaging program) is a flowchart showing the imaging processing for still image shooting; the CPU 40 controls its execution based on a program (imaging program) stored in the EEPROM 46. As described below, in the ninth embodiment, imaging is first performed in the focal region with the shortest required focusing time, and if that region is not the main region, imaging is then performed in the main region as well.
Next, the image processing in the present invention will be described. In the present invention, as shown in the seventh to ninth embodiments of the imaging processing described above, the acquired image and the focusing information are recorded in association with each other, so that a desired image can be selected and extracted by referring to this focusing information. The specific processing is executed under the control of the CPU 40 based on a program (image processing program) stored in the EEPROM 46.
Claims (36)
- 1. An imaging method using a multifocal lens that has a plurality of regions, each of the plurality of regions having a different focal length, the method comprising: a focus state control step of controlling a focus state of the multifocal lens; and an imaging step of acquiring an image of a subject in the controlled focus state.
- 2. The imaging method according to claim 1, wherein in the focus state control step, the focus state is controlled according to a relationship between a depth of field of the multifocal lens and a width of a distance distribution of the subject.
- 3. The imaging method according to claim 1 or 2, wherein in the focus state control step, when the overall depth of field of the multifocal lens is equal to or greater than the width of the distance distribution of the subject, the focus state is controlled so that the width of the distance distribution of the subject falls within the overall depth of field.
- 4. The imaging method according to claim 1 or 2, wherein in the focus state control step, when the overall depth of field of the multifocal lens is narrower than the width of the distance distribution of the subject, the focus state is controlled so that the portions of the width of the distance distribution of the subject that do not fall within the overall depth of field are equal on the front side and the rear side of the overall depth of field.
- 5. The imaging method according to claim 1 or 2, wherein in the focus state control step, when the overall depth of field of the multifocal lens is narrower than the width of the distance distribution of the subject, the focus state is controlled based on a focusing priority among subjects.
- 6. The imaging method according to claim 5, further comprising: a step of performing face detection of a person; and a step of setting the focusing priority of a person whose face has been detected higher than that of subjects other than persons, wherein in the focus state control step, the focus state is controlled based on the set focusing priority.
- 7. The imaging method according to claim 5, further comprising a step of having a user of an imaging apparatus designate the focusing priority, wherein in the focus state control step, the focus state is controlled based on the designated focusing priority.
- 8. The imaging method according to claim 1, wherein in the focus state control step, the focus state is controlled so that a main subject is focused in one of the plurality of regions.
- 9. The imaging method according to claim 8, wherein, by controlling the focus state in the focus state control step so that the main subject is focused in the region having the shortest focal length among the plurality of regions, an image in which a region in front of the main subject is blurred is acquired in the imaging step.
- 10. The imaging method according to claim 8, wherein, by controlling the focus state in the focus state control step so that the main subject is focused in the region having the longest focal length among the plurality of regions, an image in which a region behind the main subject is blurred is acquired in the imaging step.
- 11. The imaging method according to any one of claims 1 to 10, wherein in the focus state control step, the focus state is controlled based on a relationship between an aperture value of the multifocal lens and the depth of field of the multifocal lens.
- 12. The imaging method according to any one of claims 1 to 11, further comprising a step of recording the acquired image in association with information indicating a focus degree of a subject included in the acquired image.
- 13. An image processing method comprising a step of extracting an image included in the recorded images based on the information recorded by the imaging method according to claim 12.
- 14. The image processing method according to claim 13, further comprising a step of extracting, based on the recorded information, an image including a subject having a designated focus degree.
- 15. The image processing method according to claim 13 or 14, further comprising a step of synthesizing, using the extracted images, a new image in which a designated subject has a designated focus degree.
- 16. An imaging program causing an imaging apparatus to execute the imaging method according to any one of claims 1 to 12.
- 17. An image processing program causing an image processing apparatus to execute the image processing method according to any one of claims 13 to 15.
- 18. A recording medium on which computer-readable code of the imaging program according to claim 16 is recorded.
- 19. A recording medium on which computer-readable code of the image processing program according to claim 17 is recorded.
- 20. An imaging apparatus comprising: a multifocal lens that has a plurality of regions, each of the plurality of regions having a different focal length; focus state control means for controlling a focus state of the multifocal lens; and imaging means for acquiring an image of a subject in the controlled focus state, wherein the focus state control means controls the focus state according to a relationship between a depth of field of the multifocal lens and a width of a distance distribution of the subject.
- 21. The imaging method according to claim 1, wherein in the focus state control step, the focus state is controlled in response to a shooting instruction so that a main subject is focused through one of the plurality of regions.
- 22. The imaging method according to claim 21, wherein in the focus state control step, control is performed so that the main subject is focused through the region having the shortest required focusing time among the plurality of regions.
- 23. The imaging method according to claim 21 or 22, wherein in the focus state control step, control is performed so that the main subject is focused through a main region having the largest area among the plurality of regions.
- 24. The imaging method according to any one of claims 21 to 23, wherein control is performed in the focus state control step so that the main subject is focused through the region having the shortest required focusing time among the plurality of regions and an image of the subject is acquired in the imaging step, and subsequently control is performed in the focus state control step so that the main subject is focused through a main region having the largest area among the plurality of regions and an image of the subject is acquired in the imaging step.
- 25. The imaging method according to any one of claims 21 to 24, wherein in the imaging step, after the shooting instruction and before the focus state control step is performed, images are acquired for all of the plurality of regions while maintaining the focus state at the time of the shooting instruction.
- 26. The imaging method according to any one of claims 21 to 25, wherein, in response to the shooting instruction, at least one of controlling in the focus state control step so that the main subject is focused through the region having the shortest required focusing time among the plurality of regions and acquiring an image of the subject in the imaging step, and controlling in the focus state control step so that the main subject is focused through a main region having the largest area among the plurality of regions and acquiring an image of the subject in the imaging step, is performed continuously to acquire a moving image of the subject.
- 27. The imaging method according to any one of claims 21 to 26, further comprising a step of recording the acquired image in association with imaging time information indicating a time from the shooting instruction to image acquisition.
- 28. The imaging method according to any one of claims 21 to 27, further comprising a step of recording the acquired image in association with focus degree information indicating a degree of focus of the acquired image.
- 29. The imaging method according to any one of claims 21 to 28, further comprising a step of recording the acquired image in association with imaging region information indicating through which region the acquired image was acquired.
- 30. An image processing method comprising a step of extracting an image included in the recorded images based on the information recorded by the imaging method according to any one of claims 27 to 29.
- 31. An imaging program causing an imaging apparatus to execute the imaging method according to any one of claims 21 to 30.
- 32. An image processing program causing an image processing apparatus to execute the image processing method according to claim 30.
- 33. A recording medium on which computer-readable code of the imaging program according to claim 31 is recorded.
- 34. A recording medium on which computer-readable code of the image processing program according to claim 32 is recorded.
- 35. An imaging apparatus comprising: a multifocal lens that has a plurality of regions, each of the plurality of regions having a different focal length; focus state control means for controlling a focus state of the multifocal lens; and imaging means for acquiring an image of a subject in the controlled state, wherein the imaging means performs control, in response to a shooting instruction, so that a main subject is focused through one of the plurality of regions.
- 36. The imaging apparatus according to claim 35, wherein the focus state control means controls the focus state by a phase difference method.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201280053034.6A CN103907043B (zh) | 2011-10-28 | 2012-10-23 | 摄像方法 |
JP2013540778A JP5647739B2 (ja) | 2011-10-28 | 2012-10-23 | 撮像方法 |
EP12844116.9A EP2772782B1 (en) | 2011-10-28 | 2012-10-23 | Imaging method and image processing method, program using same, recording medium, and imaging device |
US14/262,822 US9270878B2 (en) | 2011-10-28 | 2014-04-28 | Imaging method using multifocal lens |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2011237635 | 2011-10-28 | ||
JP2011237634 | 2011-10-28 | ||
JP2011-237634 | 2011-10-28 | ||
JP2011-237635 | 2011-10-28 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/262,822 Continuation US9270878B2 (en) | 2011-10-28 | 2014-04-28 | Imaging method using multifocal lens |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2013061947A1 true WO2013061947A1 (ja) | 2013-05-02 |
Family
ID=48167777
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2012/077304 WO2013061947A1 (ja) | 2011-10-28 | 2012-10-23 | 撮像方法及び画像処理方法並びにそれらのプログラム、記録媒体、及び撮像装置 |
Country Status (5)
Country | Link |
---|---|
US (1) | US9270878B2 (ja) |
EP (1) | EP2772782B1 (ja) |
JP (1) | JP5647739B2 (ja) |
CN (1) | CN103907043B (ja) |
WO (1) | WO2013061947A1 (ja) |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2015181237A (ja) * | 2015-04-23 | 2015-10-15 | オリンパス株式会社 | 撮影機器及び撮影方法 |
US9756235B2 (en) | 2013-09-17 | 2017-09-05 | Olympus Corporation | Photographing apparatus, photographing method and recording medium on which photographing/display program is recorded |
EP3110131A4 (en) * | 2014-02-19 | 2017-11-01 | Samsung Electronics Co., Ltd. | Method for processing image and electronic apparatus therefor |
CN113574857A (zh) * | 2019-03-25 | 2021-10-29 | 日本电气株式会社 | 图像处理设备和图像处理方法 |
WO2022201782A1 (ja) * | 2021-03-26 | 2022-09-29 | 富士フイルム株式会社 | 撮像装置、撮像方法、及びプログラム |
Families Citing this family (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5710835B2 (ja) * | 2012-03-21 | 2015-04-30 | 富士フイルム株式会社 | 撮像装置 |
JP2014103353A (ja) * | 2012-11-22 | 2014-06-05 | Samsung R&D Institute Japan Co Ltd | 認識装置、認識方法、実装装置及び実装方法 |
KR20150077646A (ko) * | 2013-12-30 | 2015-07-08 | 삼성전자주식회사 | 이미지 처리 장치 및 방법 |
JP6415196B2 (ja) * | 2014-09-08 | 2018-10-31 | キヤノン株式会社 | 撮像装置および撮像装置の制御方法 |
US20170054897A1 (en) * | 2015-08-21 | 2017-02-23 | Samsung Electronics Co., Ltd. | Method of automatically focusing on region of interest by an electronic device |
JP6800628B2 (ja) * | 2016-06-22 | 2020-12-16 | キヤノン株式会社 | 追跡装置、追跡方法、及びプログラム |
US20190297270A1 (en) * | 2016-09-29 | 2019-09-26 | Nikon Corporation | Image - capturing apparatus |
JP6838994B2 (ja) * | 2017-02-22 | 2021-03-03 | キヤノン株式会社 | 撮像装置、撮像装置の制御方法およびプログラム |
JP2019086775A (ja) * | 2017-11-06 | 2019-06-06 | キヤノン株式会社 | 画像処理装置、その制御方法、プログラム及び記憶媒体 |
CN110324532B (zh) * | 2019-07-05 | 2021-06-18 | Oppo广东移动通信有限公司 | 一种图像虚化方法、装置、存储介质及电子设备 |
US11172112B2 (en) | 2019-09-09 | 2021-11-09 | Embedtek, LLC | Imaging system including a non-linear reflector |
DE102020109929B3 (de) | 2020-04-09 | 2021-01-14 | Sick Ag | Erfassung von Bilddaten eines bewegten Objekts |
Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH05199443A (ja) * | 1992-01-21 | 1993-08-06 | Konica Corp | 電子カメラの合焦位置検出装置 |
JPH07119893B2 (ja) * | 1986-09-22 | 1995-12-20 | オリンパス光学工業株式会社 | 内視鏡光学系 |
JP2001116980A (ja) * | 1999-10-18 | 2001-04-27 | Fuji Photo Film Co Ltd | 自動焦点カメラ及び撮影方法 |
JP2003098426A (ja) | 2001-09-20 | 2003-04-03 | Olympus Optical Co Ltd | 撮影レンズ、その撮影レンズを用いたカメラ及びそのカメラにおける絞り |
JP2008058540A (ja) | 2006-08-30 | 2008-03-13 | Kyocera Corp | 撮像装置、および画像処理方法 |
JP2008516299A (ja) | 2004-10-15 | 2008-05-15 | 松下電器産業株式会社 | 撮像装置及び画像改質処理方法 |
JP2008172523A (ja) | 2007-01-11 | 2008-07-24 | Fujifilm Corp | 多焦点カメラ装置及びそれに用いられる制御方法並びにプログラム |
JP2011013514A (ja) * | 2009-07-03 | 2011-01-20 | Fujifilm Corp | 撮影制御装置および方法並びにプログラム |
JP2011118235A (ja) * | 2009-12-04 | 2011-06-16 | Ricoh Co Ltd | 撮像装置 |
JP2011124712A (ja) | 2009-12-09 | 2011-06-23 | Fujifilm Corp | 画像処理装置、方法およびプログラム |
Family Cites Families (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH07119893A (ja) * | 1993-10-27 | 1995-05-12 | Chiyoda Corp | 低温液化ガス配管の制御方法 |
JPH1068870A (ja) * | 1996-08-29 | 1998-03-10 | Asahi Optical Co Ltd | 多点オートフォーカス装置 |
US6614998B1 (en) | 1999-10-18 | 2003-09-02 | Fuji Photo Film Co., Ltd. | Automatic focusing camera and shooting method |
JP3944039B2 (ja) * | 2002-09-13 | 2007-07-11 | キヤノン株式会社 | 焦点調節装置及びプログラム |
JP2006154465A (ja) * | 2004-11-30 | 2006-06-15 | Olympus Corp | 焦点検出装置およびその制御方法 |
WO2007060794A1 (ja) * | 2005-11-22 | 2007-05-31 | Matsushita Electric Industrial Co., Ltd. | 撮影装置、携帯端末装置、撮影方法、およびプログラム |
US20070150138A1 (en) * | 2005-12-08 | 2007-06-28 | James Plante | Memory management in event recording systems |
JP5154392B2 (ja) * | 2008-12-12 | 2013-02-27 | 株式会社キーエンス | 撮像装置 |
WO2010134343A1 (ja) * | 2009-05-21 | 2010-11-25 | 株式会社ニコン | 形状測定装置、観察装置および画像処理方法 |
-
2012
- 2012-10-23 EP EP12844116.9A patent/EP2772782B1/en active Active
- 2012-10-23 CN CN201280053034.6A patent/CN103907043B/zh active Active
- 2012-10-23 JP JP2013540778A patent/JP5647739B2/ja active Active
- 2012-10-23 WO PCT/JP2012/077304 patent/WO2013061947A1/ja active Application Filing
-
2014
- 2014-04-28 US US14/262,822 patent/US9270878B2/en active Active
Patent Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH07119893B2 (ja) * | 1986-09-22 | 1995-12-20 | オリンパス光学工業株式会社 | 内視鏡光学系 |
JPH05199443A (ja) * | 1992-01-21 | 1993-08-06 | Konica Corp | 電子カメラの合焦位置検出装置 |
JP2001116980A (ja) * | 1999-10-18 | 2001-04-27 | Fuji Photo Film Co Ltd | 自動焦点カメラ及び撮影方法 |
JP2003098426A (ja) | 2001-09-20 | 2003-04-03 | Olympus Optical Co Ltd | 撮影レンズ、その撮影レンズを用いたカメラ及びそのカメラにおける絞り |
JP2008516299A (ja) | 2004-10-15 | 2008-05-15 | 松下電器産業株式会社 | 撮像装置及び画像改質処理方法 |
JP2008058540A (ja) | 2006-08-30 | 2008-03-13 | Kyocera Corp | 撮像装置、および画像処理方法 |
JP2008172523A (ja) | 2007-01-11 | 2008-07-24 | Fujifilm Corp | 多焦点カメラ装置及びそれに用いられる制御方法並びにプログラム |
JP2011013514A (ja) * | 2009-07-03 | 2011-01-20 | Fujifilm Corp | 撮影制御装置および方法並びにプログラム |
JP2011118235A (ja) * | 2009-12-04 | 2011-06-16 | Ricoh Co Ltd | 撮像装置 |
JP2011124712A (ja) | 2009-12-09 | 2011-06-23 | Fujifilm Corp | 画像処理装置、方法およびプログラム |
Non-Patent Citations (1)
Title |
---|
See also references of EP2772782A4 |
Cited By (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9756235B2 (en) | 2013-09-17 | 2017-09-05 | Olympus Corporation | Photographing apparatus, photographing method and recording medium on which photographing/display program is recorded |
US10367990B2 (en) | 2013-09-17 | 2019-07-30 | Olympus Corporation | Photographing apparatus, photographing method and recording medium on which photographing/display program is recorded |
EP3110131A4 (en) * | 2014-02-19 | 2017-11-01 | Samsung Electronics Co., Ltd. | Method for processing image and electronic apparatus therefor |
US10674066B2 (en) | 2014-02-19 | 2020-06-02 | Samsung Electronics Co., Ltd. | Method for processing image and electronic apparatus therefor |
JP2015181237A (ja) * | 2015-04-23 | 2015-10-15 | オリンパス株式会社 | 撮影機器及び撮影方法 |
CN113574857A (zh) * | 2019-03-25 | 2021-10-29 | 日本电气株式会社 | 图像处理设备和图像处理方法 |
CN113574857B (zh) * | 2019-03-25 | 2023-10-31 | 日本电气株式会社 | 图像处理设备和图像处理方法 |
US11985432B2 (en) | 2019-03-25 | 2024-05-14 | Nec Corporation | Image processing device and image processing method suitably applied to biometric authentication |
WO2022201782A1 (ja) * | 2021-03-26 | 2022-09-29 | 富士フイルム株式会社 | 撮像装置、撮像方法、及びプログラム |
JP7421008B2 (ja) | 2021-03-26 | 2024-01-23 | 富士フイルム株式会社 | 撮像装置、撮像方法、及びプログラム |
Also Published As
Publication number | Publication date |
---|---|
JPWO2013061947A1 (ja) | 2015-04-02 |
EP2772782A4 (en) | 2015-07-22 |
EP2772782A1 (en) | 2014-09-03 |
JP5647739B2 (ja) | 2015-01-07 |
EP2772782B1 (en) | 2017-04-12 |
US20140232928A1 (en) | 2014-08-21 |
CN103907043A (zh) | 2014-07-02 |
CN103907043B (zh) | 2016-03-02 |
US9270878B2 (en) | 2016-02-23 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5647739B2 (ja) | 撮像方法 | |
JP5554453B2 (ja) | 画像処理装置及び方法 | |
TWI399082B (zh) | Display control device, display control method and program | |
US9258545B2 (en) | Stereoscopic imaging apparatus | |
US7769287B2 (en) | Image taking apparatus and image taking method | |
JP5546692B2 (ja) | 撮影装置、撮影方法およびプログラム | |
JP5437781B2 (ja) | 画像処理装置、方法およびプログラム | |
JP4697606B2 (ja) | 撮影装置及び合焦制御方法 | |
JP5647751B2 (ja) | 撮像装置及び撮像方法 | |
JP2008236534A (ja) | デジタルカメラ、及び情報表示方法、情報表示制御プログラム | |
JP5546691B2 (ja) | 撮影装置、撮影方法およびプログラム | |
JP2009177503A (ja) | 撮像装置 | |
JP7281977B2 (ja) | 焦点調節装置および焦点調節方法 | |
JP5832353B2 (ja) | 撮像装置 | |
JP2009071592A (ja) | 撮像装置及び撮像制御方法 | |
JP2007225897A (ja) | 合焦位置決定装置及び方法 | |
JP2007249000A (ja) | 撮像装置、撮像方法および撮像プログラム | |
JP2010134309A (ja) | オートフォーカス装置、オートフォーカス方法及び撮像装置 | |
JP5195663B2 (ja) | 撮像装置、合焦方法及びプログラム | |
JP5493273B2 (ja) | 撮像装置、撮影方法及びプログラム | |
JP5157528B2 (ja) | 撮影装置 | |
JP5054209B2 (ja) | 撮影装置及び合焦制御方法 | |
JP5040669B2 (ja) | 撮像装置 | |
JP2005242271A (ja) | オートフォーカスカメラ | |
JP2005326506A (ja) | 焦点検出装置及び焦点検出方法 |
Legal Events
Code | Title | Description
---|---|---
121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 12844116; Country of ref document: EP; Kind code of ref document: A1
ENP | Entry into the national phase | Ref document number: 2013540778; Country of ref document: JP; Kind code of ref document: A
REEP | Request for entry into the european phase | Ref document number: 2012844116; Country of ref document: EP
WWE | Wipo information: entry into national phase | Ref document number: 2012844116; Country of ref document: EP
NENP | Non-entry into the national phase | Ref country code: DE