US9210334B2 - Imaging apparatus and imaging method for flare portrait scene imaging
- Publication number
- US9210334B2 (application US14/071,813)
- Authority
- US
- United States
- Prior art keywords
- exposure
- value
- image data
- face part
- face
- Prior art date
- Legal status
- Expired - Fee Related
Links
- 238000003384 imaging method Methods 0.000 title claims abstract description 79
- 238000006243 chemical reaction Methods 0.000 claims description 23
- 230000002194 synthesizing effect Effects 0.000 claims description 16
- 230000008859 change Effects 0.000 claims description 11
- 230000008921 facial expression Effects 0.000 claims description 10
- 230000004907 flux Effects 0.000 claims description 10
- 238000000034 method Methods 0.000 description 140
- 230000008569 process Effects 0.000 description 127
- 238000012545 processing Methods 0.000 description 33
- 238000010586 diagram Methods 0.000 description 29
- 230000004048 modification Effects 0.000 description 28
- 238000012986 modification Methods 0.000 description 28
- 238000003786 synthesis reaction Methods 0.000 description 27
- 238000012937 correction Methods 0.000 description 17
- 230000003287 optical effect Effects 0.000 description 8
- 230000008901 benefit Effects 0.000 description 4
- 230000000694 effects Effects 0.000 description 4
- 239000011159 matrix material Substances 0.000 description 4
- 230000003321 amplification Effects 0.000 description 3
- 230000006870 function Effects 0.000 description 3
- 230000004397 blinking Effects 0.000 description 2
- 238000011156 evaluation Methods 0.000 description 2
- 238000002360 preparation method Methods 0.000 description 2
- 230000009467 reduction Effects 0.000 description 2
- 230000009471 action Effects 0.000 description 1
- 238000003491 array Methods 0.000 description 1
- 238000004891 communication Methods 0.000 description 1
- 230000002708 enhancing effect Effects 0.000 description 1
- 238000002474 experimental method Methods 0.000 description 1
- 230000001815 facial effect Effects 0.000 description 1
- 238000001914 filtration Methods 0.000 description 1
- 210000003128 head Anatomy 0.000 description 1
- 239000000203 mixture Substances 0.000 description 1
- 238000009877 rendering Methods 0.000 description 1
- 238000012546 transfer Methods 0.000 description 1
Classifications
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/10—Cameras or camera modules for generating image signals from different wavelengths
- H04N23/60—Control of cameras or camera modules
- H04N23/61—Control of cameras or camera modules based on recognised objects
- H04N23/611—Control based on recognised objects where the recognised objects include parts of the human body
- H04N23/63—Control of cameras or camera modules by using electronic viewfinders
- H04N23/631—Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters
- H04N23/67—Focus control based on electronic image sensor signals
- H04N23/673—Focus control based on contrast or high frequency components of image signals, e.g. hill climbing method
- H04N23/70—Circuitry for compensating brightness variation in the scene
- H04N23/741—Circuitry for compensating brightness variation by increasing the dynamic range of the image compared to the dynamic range of the electronic image sensors
- H04N23/80—Camera processing pipelines; Components thereof
- H04N23/84—Camera processing pipelines for processing colour signals
- H04N23/843—Demosaicing, e.g. interpolating colour pixel values
- H04N25/00—Circuitry of solid-state image sensors [SSIS]; Control thereof
- H04N25/10—Circuitry of SSIS for transforming different wavelengths into image signals
- H04N25/11—Arrangement of colour filter arrays [CFA]; Filter mosaics
- H04N25/13—Filter mosaics characterised by the spectral characteristics of the filter elements
- H04N25/134—Filter mosaics based on three different wavelength filter elements
- H04N5/2355
- H04N5/23219
- H04N5/23293
- H04N9/045
Definitions
- This invention relates to an imaging apparatus and an imaging method.
- Generally, the exposure value is controlled so that the photographed subject acquires an appropriate exposure value (brightness).
- In a backlighted scene, however, only one of the subject and the background can be set to an appropriate exposure value; the other may then fail to be appropriately exposed.
- In a known technique, photographing with different exposure values is repeatedly executed, and then the resultant images are synthesized. This technique can provide an image of a high dynamic range in which both the subject and the background have appropriate exposure values even in a backlighted scene.
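The multi-exposure synthesis idea can be sketched as a toy example; the function name and the brightness-dependent weighting below are illustrative assumptions, not the patent's synthesis ratio table:

```python
# Hypothetical sketch: a short exposure keeps highlight (background) detail,
# a long exposure keeps shadow (subject) detail, and a brightness-dependent
# weight blends them into one high-dynamic-range image.
import numpy as np

def synthesize_hdr(short_exp, long_exp):
    """Blend two exposures of the same scene (float arrays in [0, 1]).

    Bright regions take their value from the short exposure, dark regions
    from the long exposure; the weight varies smoothly in between.
    """
    w = np.clip(long_exp, 0.0, 1.0)      # weight from the long exposure's brightness
    return (1.0 - w) * long_exp + w * short_exp

# Toy data: a backlighted scene where the long exposure clips the background.
long_exp = np.array([0.05, 0.5, 1.0])    # subject ok, background clipped white
short_exp = np.array([0.0, 0.2, 0.6])    # background ok, subject crushed
result = synthesize_hdr(short_exp, long_exp)
```

At the clipped end the result falls back entirely on the short exposure, so the background detail survives.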
- An imaging apparatus comprises: an imaging unit including an imaging element, configured to perform photoelectric conversion on a light flux received at the imaging element and to generate image data; a main subject detecting unit configured to detect a main subject from the image data; and a control unit configured to compare an exposure value of the image data with a predetermined appropriate value and to perform an exposure control so that the exposure value for the main subject part in the image data changes to the appropriate value and the exposure value for a part excluding the main subject part in the image data changes to a value larger than the appropriate value.
- An imaging method comprises: acquiring image data in an imaging unit having an imaging element, by performing photoelectric conversion on a light flux received at the imaging element; detecting a main subject from the image data; comparing an exposure value of the image data with a predetermined appropriate value; and performing an exposure control so that the exposure value for the main subject part in the image data changes to the appropriate value and the exposure value for a part excluding the main subject part in the image data changes to a value larger than the appropriate value.
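The claimed control rule (face part driven to the appropriate value, the rest of the frame driven above it) can be illustrated with a small sketch; the 18%-gray normalization, the gain formulation, and the `over_ratio` parameter are assumptions for illustration only:

```python
# Minimal sketch of the claimed exposure control: compare measured exposure
# data with the appropriate value, then derive per-region correction gains
# so the face lands on the appropriate value while the background lands
# above it (deliberately overexposed, which is what produces the flare look).

APPROPRIATE_VALUE = 0.18   # assumed 18%-gray reference, normalized to [0, 1]

def exposure_gains(face_exposure, background_exposure, over_ratio=2.0):
    """Return (face_gain, background_gain) correction factors.

    face_gain drives the face part to the appropriate value; background_gain
    drives the remaining part to a value larger than the appropriate value
    (here, a fixed ratio above it).
    """
    face_gain = APPROPRIATE_VALUE / face_exposure
    background_gain = (APPROPRIATE_VALUE * over_ratio) / background_exposure
    return face_gain, background_gain

# A backlighted scene: dark face (0.05), bright background (0.6).
face_gain, bg_gain = exposure_gains(face_exposure=0.05, background_exposure=0.6)
```

In practice the two targets are realized with separate exposures (as in the synthesis circuit below), since a single sensor exposure cannot apply different gains optically.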
- FIG. 1 is a block diagram showing the configuration of a digital camera that is one example of an imaging apparatus according to the first embodiment of this invention
- FIG. 2 is a block diagram showing the internal configuration of a synthesis process circuit
- FIG. 3 is a block diagram showing the internal configuration of an image processing circuit
- FIG. 4 is a flowchart showing how the digital camera according to the first embodiment of the invention operates
- FIG. 5A is a diagram showing the exposure data of a face part
- FIG. 5B is a diagram showing the exposure data of a background part
- FIG. 6 is a diagram showing how the photographer is notified that the image is a flare portrait scene
- FIG. 7A is a diagram showing an example of a backlighted scene
- FIG. 7B is a diagram showing a brightness distribution along the chain line shown in FIG. 7A ;
- FIG. 8A is a diagram showing a face part that is a crushed shadow
- FIG. 8B is a diagram showing a background that is clipped white
- FIG. 8C is a diagram showing an image processing for providing an image of a high dynamic range
- FIG. 9 is a diagram showing an exemplary control performed to provide a flare portrait image
- FIG. 10 is a flowchart showing how the digital camera according to a second embodiment of the invention operates.
- FIG. 11A is a diagram showing the brightness distribution in a flare portrait scene
- FIG. 11B is a diagram showing how an exposure control is performed for a flare portrait scene
- FIG. 12 is a timing chart showing how a through image is displayed in preparation for providing a flare portrait image by performing exposure several times;
- FIG. 13 is a flowchart showing how the digital camera according to a third embodiment of the invention operates.
- FIG. 14 is a diagram showing how the photographer is notified that the scene is a flare portrait scene
- FIG. 15 is a flowchart showing how a modification 1 performs an exposure process in the flare portrait mode
- FIG. 16A is the first diagram explaining how the modification 1 performs the exposure process in the flare portrait mode
- FIG. 16B is the second diagram explaining how the modification 1 performs the exposure process in the flare portrait mode
- FIG. 16C is the third diagram explaining how the modification 1 performs the exposure process in the flare portrait mode
- FIG. 17 is a flowchart showing how a modification 2 performs an exposure process in the flare portrait mode
- FIG. 18 is a flowchart showing how a digital camera, which is a combination of the modifications 1 and 2, performs an exposure process in the flare portrait mode;
- FIG. 19 is a flowchart showing the process performed in a modification 3.
- FIG. 20A is a diagram showing the first exemplary facial expression regarded as a flare portrait scene.
- FIG. 20B is a diagram showing the second exemplary facial expression regarded as a flare portrait scene.
- FIG. 1 is a block diagram showing the configuration of a digital camera 100 (hereinafter referred to as “camera”) that is one example of an imaging apparatus according to the first embodiment of this invention.
- the camera 100 includes a photographing optical system 101 , an imaging element 102 , an imaging process circuit 103 , a bus 104 , a RAM 105 , a synthesis process circuit 106 , an image processing circuit 107 , a face detecting circuit 108 , a display unit 109 , an interface (I/F) 110 , a recording medium 111 , a ROM 112 , a console unit 113 , and a microcomputer 114 .
- the photographing optical system 101 includes a photographing lens 101 a and a diaphragm 101 b .
- the photographing lens 101 a condenses the light flux coming from a subject (not shown) on the imaging element 102 .
- the photographing lens 101 a may be configured as an optical system that includes a focus lens and a zoom lens.
- the focus lens is a lens for adjusting the focal position of the photographing lens 101 a .
- the zoom lens is a lens for adjusting the focal distance of the photographing lens 101 a .
- the photographing lens 101 a is constituted by a movable lens, which is driven along its optical axis as it is controlled by the microcomputer 114 .
- the diaphragm 101 b is configured to open and close, and adjusts the amount of light condensed on the imaging element 102 by the photographing optical system 101 .
- the diaphragm 101 b is driven as it is controlled by the microcomputer 114 .
- the imaging element 102 has a photoelectric conversion surface for receiving the light flux condensed by the photographing optical system 101 .
- On the photoelectric conversion surface, pixels, i.e., photoelectric conversion elements (e.g., photodiodes), are arranged in a two-dimensional pattern, each configured to convert the received light to an electric charge.
- A color filter, having for example a Bayer array, is arranged on the photoelectric conversion surface.
- the imaging element 102 converts the light condensed by the photographing optical system 101 to an electric signal (image signal).
- the imaging element 102 is controlled by the microcomputer 114 .
- the imaging element 102 has a configuration known in the art, such as CCD type or CMOS type.
- the color filter has a color array selected from various arrays known in the art, such as the Bayer array.
- the imaging element 102 used in this embodiment is not limited to a specific configuration. Various types of imaging elements can be used in this embodiment.
- the imaging process circuit 103 functions in the imaging unit together with the imaging element 102 , and includes an analog processing unit 201 , an A/D conversion unit 202 , a pixel-data separating unit 203 , a brightness-data generating unit 204 , and an area-for-area output unit 205 .
- the imaging process circuit 103 processes the image signal generated in the imaging element 102 .
- the analog processing unit 201 performs various processes, such as a CDS (correlated double sampling) process and an AGC (automatic gain control) process, on the image signal input from the imaging element 102 .
- In the CDS process, the dark-current noise component is removed from the image signal.
- In the AGC process, the image signal is amplified.
- the A/D conversion unit 202 converts the image signal processed in the analog process unit 201 to image data that is digital data.
- the image data is composed of the pixel data items generated in the pixels of the imaging element 102 . If the color filter array is the Bayer array, the image data will accord with the Bayer array.
- any image data that accords with the Bayer array shall be referred to as “Bayer data.”
- the pixel-data separating unit 203 separates the Bayer data generated in the A/D conversion unit 202 into color-component data items. If the imaging element 102 has color filters arranged in the Bayer array, the pixel-data separating unit 203 separates the Bayer data generated in the A/D conversion unit 202 into three pixel data items R (red), G (green) and B (blue), or into four pixel data items R, Gr, Gb and B.
- the brightness-data generating unit 204 generates brightness data from the R data, G data and B data acquired in the pixel-data separating unit 203 . If the imaging element 102 has color filters arranged in the Bayer array, the brightness data is generated area for area, each area defined by, for example, four pixels (R, Gr, Gb and B) arranged in two rows and two columns. The brightness data is a mixture of four different-color pixel data items weighted by specific coefficients.
- Alternatively, the G data may be regarded as brightness data; in that case, each area includes two G data items (Gr and Gb).
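The area-for-area brightness generation described above might be sketched as follows; the weights are the common luma coefficients, standing in for the patent's unspecified "specific coefficients":

```python
import numpy as np

# Sketch: brightness for each 2x2 Bayer cell (R, Gr, Gb, B) as a weighted
# mix of the four different-colour pixel data items. RGGB layout assumed.
def area_brightness(bayer, weights=(0.299, 0.587, 0.114)):
    """bayer: (2H, 2W) array in RGGB layout -> (H, W) brightness per cell."""
    r  = bayer[0::2, 0::2]
    gr = bayer[0::2, 1::2]
    gb = bayer[1::2, 0::2]
    b  = bayer[1::2, 1::2]
    wr, wg, wb = weights
    g = 0.5 * (gr + gb)                  # average the two green samples
    return wr * r + wg * g + wb * b

bayer = np.array([[0.4, 0.6],
                  [0.6, 0.2]])           # one RGGB cell
y = area_brightness(bayer)
```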
- the area-for-area output unit 205 outputs, to the bus 104 , either the pixel data items separated by color component in the pixel-data separating unit 203 , or the exposure data acquired by cumulating, for each area, the brightness data generated in the brightness-data generating unit 204 .
- the bus 104 is a transfer path for transferring various data generated in the camera 100 .
- the RAM 105 is a storage unit for temporary storage of the various data generated in the camera 100 .
- the RAM 105 is used also as a buffer memory in processing data in the synthesis process circuit 106 or image processing circuit 107 .
- the synthesis process circuit 106 synthesizes the Bayer data items generated as the exposure process was performed several times. The synthesis process circuit 106 will be described later in detail.
- the image processing circuit 107 performs various image processes on the Bayer data output from the area-for-area output unit 205 of the imaging process circuit 103 , or the synthesized Bayer data generated in the synthesis process circuit 106 .
- the face detecting circuit 108 which functions as a part of the main subject detecting unit, detects a face part, or the main part of the image represented by the image data. To detect the face part, a known method such as local image-feature matching is used. Alternatively, the face detecting circuit 108 may store the facial feature data of each person, thereby detecting the face image of the person.
- the display unit 109 is provided on, for example, the back of the camera 100 , and is configured to display various images represented by the image data processed in the image processing circuit 107 .
- the I/F 110 is an interface through which the microcomputer 114 performs data communication with the recording medium 111 .
- the recording medium 111 is a medium for recording the image files acquired by photographing.
- the ROM 112 stores a synthesis ratio table for use in the synthesis process circuit 106 , image processing parameters for use in the image processing circuit 107 , and various other parameters for driving the camera 100 .
- the ROM 112 stores two standard exposure values for achieving two types of exposure control, respectively.
- the ROM 112 further stores various programs the microcomputer 114 executes.
- the console unit 113 is a unit that the photographer may operate to make the camera 100 perform various functions.
- the console unit 113 includes a release button, a mode button, a menu button, a reproduction button, and a power-supply button. Some or all of these buttons may be virtual buttons displayed on a touch panel.
- the release button is a two-stage switch composed of a first (1st) release switch and a second (2nd) release switch. If the 1st release switch is turned on while the release button remains half depressed, the microcomputer 114 performs an AE process and an AF process. If the 2nd release switch is turned on while the release button remains fully depressed, the microcomputer 114 performs an image recording process (i.e., photographing process).
- the mode button sets the camera 100 to an operating mode.
- the camera 100 has at least two operation modes, i.e., photographing mode and reproducing mode. While being set to the photographing mode, the camera 100 can generate an image to record. While set to the reproducing mode, the camera 100 can reproduce the image recorded.
- the menu button instructs the display unit 109 to display a menu screen.
- the photographer may touch the menu screen, thereby changing the various items set in the camera 100 .
- the photographing mode set in the camera 100 and detailed operating modes can be set on the menu screen.
- the reproduction button may be pushed to instruct the microcomputer 114 to reproduce image files.
- the power-supply button may be pushed to turn on or off the power supply of the camera 100 .
- the microcomputer 114 is a control unit configured to control the various operation sequences of the digital camera 100 . If any member of the console unit 113 is operated, the microcomputer 114 controls the area associated with the member operated.
- the microcomputer 114 includes an AF control unit 301 , an AE control unit 302 , and a display control unit 303 .
- the AF control unit 301 controls the AF process for focusing the photographing lens 101 a on the subject of photography.
- the AE control unit 302 controls the AE process for adjusting the exposure value of the image data (Bayer data) acquired in the imaging element 102 .
- the display control unit 303 controls the display unit 109 , causing the same to display various types of images.
- FIG. 2 is a block diagram showing the internal configuration of the synthesis process circuit 106 .
- the synthesis process circuit 106 includes a positional shift detecting unit 401 and a synthesizing unit 402 .
- the positional shift detecting unit 401 detects the positional shift (i.e., shift between the images of a subject) between the Bayer data items acquired through several exposures performed (between Bayer data 1 and Bayer data 2 , in the instance shown in FIG. 2 ).
- the positional shift is, for example, the motion vector between Bayer data 1 and Bayer data 2 .
- the motion vector is calculated by a known method such as the matching of the Bayer data items.
- the synthesizing unit 402 synthesizes Bayer data items which have been acquired at different exposure timings. If neither the camera 100 nor the subject moves at the exposure timing, no positional shift will occur between the Bayer data items. In most cases, however, either the camera 100 or the subject moves at the exposure timing, resulting in a positional shift between Bayer data items. If Bayer data items having a positional shift between them are synthesized, the resultant image may be composed of two images shifted with respect to each other. Therefore, the positional shift between the Bayer data items is corrected before the synthesizing unit 402 synthesizes the Bayer data items. A correction value for correcting the positional shift is calculated in the positional shift detecting unit 401 .
- two Bayer data items are input to the synthesis process circuit 106 and the positional shift between the two data items is detected. If the synthesizing unit 402 synthesizes three or more Bayer data items, the positional shift detecting unit 401 detects the positional shift between the three or more Bayer data items.
- the synthesizing unit 402 synthesizes the pixel data items corresponding to the Bayer data items input to it, in accordance with the synthesis ratio acquired from the synthesis ratio table.
- the synthesizing unit 402 synthesizes the pixel data items, after the positional shift between the Bayer data items has been corrected in accordance with the shift correction value calculated in the positional shift detecting unit 401 .
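A toy version of this detect-then-align-then-synthesize flow, assuming a single global circular shift and a fixed 50% synthesis ratio rather than the block-wise motion vectors and synthesis ratio table of the actual circuit:

```python
import numpy as np

# Illustrative sketch: estimate a global shift between two frames by
# exhaustive matching (sum of absolute differences), then align frame 2
# to frame 1 before blending. Real hardware works on Bayer data; this
# toy version is single-channel with wrap-around shifts.
def detect_shift(ref, img, search=2):
    best, best_dxy = None, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            shifted = np.roll(np.roll(img, dy, axis=0), dx, axis=1)
            sad = np.abs(ref - shifted).sum()   # sum of absolute differences
            if best is None or sad < best:
                best, best_dxy = sad, (dy, dx)
    return best_dxy

def synthesize(ref, img, ratio=0.5):
    dy, dx = detect_shift(ref, img)
    aligned = np.roll(np.roll(img, dy, axis=0), dx, axis=1)
    return ratio * ref + (1.0 - ratio) * aligned

rng = np.random.default_rng(0)
frame1 = rng.random((8, 8))
frame2 = np.roll(frame1, 1, axis=1)      # frame 2 shifted right by one pixel
dy, dx = detect_shift(frame1, frame2)
```

Because the second frame is aligned before blending, the synthesized output does not show the "two images shifted with respect to each other" artifact described above.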
- FIG. 3 is a block diagram showing the internal configuration of the image processing circuit 107 .
- the image processing circuit 107 includes a noise reducing (NR) unit 501 , a white balancing (WB) unit 502 , a synchronizing unit 503 , a color converting unit 504 , a gradation converting unit 505 , a YC converting unit 506 , an edge extracting unit 507 , an edge emphasizing unit 508 , an edge synthesizing unit 509 , a resizing unit 510 , a compressing unit 511 , an expanding unit 512 , and an amplifying unit 513 .
- the NR unit 501 reads Bayer data from the RAM 105 and performs noise reduction on the Bayer data, thereby removing high-frequency noise, etc.
- the noise reduction is achieved by using, for example, a coring process.
- the WB unit 502 performs a white-balance correction, thereby correcting the white balance of the Bayer data processed by the NR unit 501 .
- the white-balance correction is achieved by, for example, multiplying the pixel data by the gain value (i.e., white-balance gain) for each color component.
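As a minimal sketch of the per-channel gain multiplication (the gain values below are invented for illustration):

```python
import numpy as np

# Sketch of white-balance correction: multiply each colour component by
# its white-balance gain. These gains are made-up example values, not
# values from the patent.
def apply_white_balance(rgb, gains=(1.8, 1.0, 1.5)):
    """rgb: (..., 3) float array; gains: (R, G, B) white-balance gains."""
    return rgb * np.asarray(gains)

pixel = np.array([[0.2, 0.4, 0.1]])      # a slightly bluish grey patch
balanced = apply_white_balance(pixel)
```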
- the synchronizing unit 503 performs interpolation on the Bayer data output from the WB unit 502 , in which each pixel has a single color component (R, G or B), thereby converting the Bayer data to image data (RGB data) in which each pixel has all three color components R, G and B.
- the color converting unit 504 performs a color conversion, thereby reproducing the color of an image appropriately.
- the color conversion may be, for example, color matrix calculation.
- the color matrix calculation is a process of multiplying the RGB data by the color matrix coefficient that accords with, for example, the white balance.
- the color converting unit 504 further corrects chroma and hue.
- the gradation converting unit 505 performs a gradation conversion on the RGB data output from the color converting unit 504 .
- the gradation conversion is a process of converting the RGB data by using a predetermined gradation conversion table, thereby correcting the gradation of the image.
- the YC converting unit 506 converts the RGB data output from the gradation converting unit 505 , to YC data (i.e., brightness-color difference data).
- the RGB data is converted to YC data by multiplying the RGB data by a predetermined brightness-color difference matrix.
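A sketch of that matrix multiplication, using the common BT.601 coefficients as a stand-in for the patent's unspecified brightness-color difference matrix:

```python
import numpy as np

# RGB -> YCbCr conversion matrix (BT.601 analog coefficients, assumed here).
RGB_TO_YCBCR = np.array([
    [ 0.299,     0.587,     0.114   ],   # Y  (brightness)
    [-0.168736, -0.331264,  0.5     ],   # Cb (blue colour difference)
    [ 0.5,      -0.418688, -0.081312],   # Cr (red colour difference)
])

def rgb_to_yc(rgb):
    """rgb: (..., 3) float array -> (..., 3) YCbCr data (Y, Cb, Cr)."""
    return rgb @ RGB_TO_YCBCR.T

yc = rgb_to_yc(np.array([1.0, 1.0, 1.0]))   # pure white
```

For a neutral (white or grey) pixel both colour-difference rows sum to zero, so Cb and Cr vanish and only the brightness term remains.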
- the edge extracting unit 507 performs, for example, band-pass filtering on the Bayer data output from the NR unit 501 , extracting an edge component signal.
- the edge emphasizing unit 508 receives the edge component signal extracted by the edge extracting unit 507 , and multiplies the edge component signal by an edge emphasis coefficient.
- the edge synthesizing unit 509 receives the edge component signal from the edge emphasizing unit 508 , and adds the edge component signal to the brightness (Y) data acquired in the YC converting unit 506 , thereby emphasizing the edge component in the image.
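The extract, scale, and add-back path of units 507 to 509 can be illustrated in one dimension; the Laplacian-style high-pass and the coefficient 0.5 are illustrative choices, not the circuit's actual filter:

```python
import numpy as np

# Toy 1-D sketch of the edge path: a high-pass (second difference) extracts
# the edge component, an edge emphasis coefficient scales it, and the scaled
# component is added back to the brightness (Y) data.
def extract_edge(y):
    """Second-difference edge component with zero-padded ends."""
    e = np.zeros_like(y)
    e[1:-1] = 2 * y[1:-1] - y[:-2] - y[2:]
    return e

def emphasize_edges(y, coeff=0.5):
    return y + coeff * extract_edge(y)

y = np.array([0.0, 0.0, 1.0, 1.0], dtype=float)   # a step edge
sharp = emphasize_edges(y)
```

The under/overshoot around the step is what visually "emphasizes the edge component in the image."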
- the resizing unit 510 resizes the edge-emphasized brightness (Y) data output from the edge synthesizing unit 509 and the color difference (C) data acquired in the YC converting unit 506 , so that both the edge-emphasized brightness (Y) data and the color difference (C) data may be appropriately recorded and displayed.
- the compressing unit 511 compresses the YC data resized in the resizing unit 510 , and stores the image data compressed (compressed image data) in the RAM 105 . If a still image is photographed, the compressing unit 511 uses, for example, the known JPEG system, to compress the YC data. If a moving picture is photographed, the compressing unit 511 uses, for example, the known MPEG system, to compress the YC data.
- the expanding unit 512 expands the compressed image data contained in an image file, and stores the expanded image data (YC data) in the RAM 105 . If the compressed data has been generated by the JPEG system, the expanding unit 512 uses the JPEG system to expand the compressed image data. If the compressed data has been generated by the MPEG system, the expanding unit 512 uses the MPEG system to expand the compressed image data.
- the amplifying unit 513 amplifies the image data (YC data).
- the amplifying unit 513 is used to achieve exposure correction.
- FIG. 4 is a flowchart showing the operation of the camera 100 according to the first embodiment.
- In Step S 101 , the microcomputer 114 determines whether or not the camera 100 is set to the photographing mode.
- If the operating mode is found to be the photographing mode, the microcomputer 114 performs exposure control in order to display a through image (Step S 102 ).
- the microcomputer 114 controls the exposure value of the face part detected by the face detecting circuit 108 , changing the exposure value to a first appropriate value.
- the microcomputer 114 inputs the image data (Bayer data or YC data) to the face detecting circuit 108 , and causes the face detecting circuit 108 to detect the face part.
- the microcomputer 114 sets exposure conditions (i.e., aperture opening of the diaphragm 101 b , the exposure time for the imaging element 102 , etc.), changing the exposure value of the face part detected by the face detecting circuit 108 , to the first appropriate value.
- the microcomputer 114 controls the diaphragm 101 b and the imaging element 102 in accordance with the exposure conditions.
- the face part of the image represented by the Bayer data acquired in the imaging element 102 comes to have the first appropriate value.
- the first appropriate value is, for example, an exposure value obtained by using, for example, 18% gray as a reference.
- the microcomputer 114 After the exposure performed to display a through image, the microcomputer 114 operates to display the through image (Step S 103 ). First, the microcomputer 114 causes the image processing circuit 107 to process the Bayer data acquired by photographing and sequentially stored in the RAM 105 . The image processing circuit 107 performs the NR process to resizing process, acquiring the YC data. The YC data so acquired is stored in the RAM 105 . Thereafter, the microcomputer 114 reads the YC data from the RAM 105 , and inputs the YC data to the display unit 109 . The display unit 109 displays the image represented by the YC data. As this operation sequence proceeds, the photographer can view the image displayed by the display unit 109 , to confirm the state of the subject of photography.
- In Step S 104 , the microcomputer 114 determines whether or not the 1st release switch is on. If the 1st release switch is not on, the process returns to Step S 101 .
- If the 1st release switch is on, the microcomputer 114 performs the AF process in Step S 105 .
- the microcomputer 114 cumulates the Bayer data acquired from the imaging element 102 , thereby acquiring a focus evaluation value.
- the microcomputer 114 uses the focus evaluation value, evaluating the image contrast, and minutely drives the focus lens of the photographing lens 101 a .
- When the focus evaluation value reaches its peak (i.e., the image contrast is maximal), the microcomputer 114 ceases to drive the focus lens.
- This AF process is called “contrast-type AF process.”
- the AF process may be replaced by a phase-difference AF process.
- If the face detecting circuit 108 detects the face part of the image, the focus lens may be driven to bring the face part into focus.
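A contrast-type ("hill climbing") AF loop of the kind referenced above can be sketched as follows; the simulated lens model and the squared-gradient evaluation value are assumptions for illustration:

```python
import numpy as np

# Sketch of contrast AF: sweep the focus lens, compute a focus evaluation
# value from the captured image, and keep the position where contrast peaks.
def focus_value(image):
    """Focus evaluation value: cumulated squared horizontal gradient."""
    return np.square(np.diff(image, axis=1)).sum()

def contrast_af(capture_at, positions):
    """capture_at(pos) -> image; returns the lens position of peak contrast."""
    return max(positions, key=lambda p: focus_value(capture_at(p)))

def capture_at(pos, focus_pos=5):
    """Simulated capture: a step edge blurred more as |pos - focus_pos| grows."""
    edge = np.zeros(32)
    edge[16:] = 1.0
    blur = abs(pos - focus_pos) + 1
    kernel = np.ones(blur) / blur          # box blur of width `blur`
    return np.convolve(edge, kernel, mode="same")[None, :]

best = contrast_af(capture_at, positions=range(11))
```

The squared gradient rewards sharp transitions, so the evaluation value peaks exactly at the simulated in-focus position.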
- the microcomputer 114 After performing the AF process, the microcomputer 114 performs the AE process.
- two types of AE process are interchangeably performed in accordance with the relation between the brightness of the face part and the brightness of the background part.
- One type is an AE process using an exposure control reference that is the exposure value obtained by using 18% gray as a reference.
- the other type is an AE process using an exposure control reference that is greater than the exposure value obtained by using 18% gray as a reference.
- the reference can, of course, be changed from 18% gray in prescribed conditions.
- the microcomputer 114 acquires the face-part exposure data Ef from the imaging process circuit 103 (Step S 106 ) in order to determine which AE process should be performed. More specifically, the microcomputer 114 calculates the face-part exposure data Ef from the brightness data about the face part detected by the face detecting circuit 108 .
- FIG. 5A is a diagram showing the exposure data Ef of a face part.
- the Bayer data is divided into 12×9 areas AR.
- the exposure data is acquired for each area AR.
- the exposure data for one area AR is obtained by cumulating the brightness data acquired in the area AR.
- the exposure data Ef of the face part has been acquired from the areas (i.e., black areas) constituting the face part.
- the method of acquiring the exposure data, described here, is no more than an example. Any other method may be used to acquire the exposure data Ef.
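- As a minimal sketch of the area-for-area scheme of FIG. 5A, the brightness data can be split into 12 x 9 areas AR, cumulated per area, and averaged over the face-part areas to yield Ef. The function name, the use of a boolean face mask, and averaging (rather than some other combination) are assumptions for illustration.

```python
import numpy as np

def area_exposure(brightness, face_mask, rows=9, cols=12):
    """Sketch of the per-area exposure data (FIG. 5A).

    `brightness` is an HxW brightness image derived from the Bayer
    data; it is divided into 12x9 areas AR, and the exposure data for
    each area is the cumulated brightness in that area.  Ef is taken
    from the areas flagged in `face_mask` (a rows x cols boolean array).
    """
    h, w = brightness.shape
    areas = brightness[:h - h % rows, :w - w % cols].reshape(
        rows, h // rows, cols, w // cols).sum(axis=(1, 3))  # cumulate per AR
    ef = areas[face_mask].mean()    # exposure data Ef of the face part
    return areas, ef
```

The background exposure data Es of FIG. 5B could be computed the same way over the background-part areas.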
- the microcomputer 114 acquires the exposure data Es of the background part from the imaging process circuit 103 (Step S 107 ). More precisely, the microcomputer 114 generates the exposure data Es of the background part from that part of the brightness data generated in the brightness-data generating unit 204 which corresponds to the background part.
- FIG. 5B is a diagram showing the exposure data Es of the background part.
- the exposure data Es is that part of the exposure data acquired for areas ARs from the area-for-area output unit 205 which is associated with the areas (i.e., black areas in FIG. 5B ) constituting the background part.
- the “background part” is composed of areas remote from the areas constituting the face part, by at least a specific number of areas, and having almost uniform brightness different from the brightness of the face part.
- the background part may be composed of all areas other than the face-part areas.
- the threshold value Th is used to determine whether the photographed scene is a flare portrait scene or not.
- the threshold value Th is set at the time of, for example, designing the camera 100 , and then stored in the ROM 112 .
- the flare portrait scene is a scene including any backlighted subject (e.g., backlighted face part). Flare is unnecessary light, such as diffused reflection and stray light in the optical system. If the subject is backlighted, flare will appear in the image in many cases, making a special photographic effect in few cases. In view of this, flare is generally undesirable light. In this embodiment, however, flare is utilized to provide a special photographic effect.
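- The decision of Steps S 106 to S 109 reduces to a threshold test, sketched below: compute the difference ΔE = Es − Ef between the background exposure data and the face-part exposure data, and treat the scene as a flare portrait scene when ΔE exceeds the threshold Th stored in the ROM 112.

```python
def is_flare_portrait_scene(es, ef, th):
    """Sketch of Steps S106-S109: a backlighted (flare portrait) scene
    is one whose background exposure data Es exceeds the face-part
    exposure data Ef by more than the threshold Th."""
    delta_e = es - ef
    return delta_e > th
```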
- If ΔE is found greater than Th in Step S 109 , showing that the scene is a flare portrait scene, the microcomputer 114 informs the photographer that the scene is a flare portrait scene (Step S 110 ).
- FIG. 6 is a diagram showing how the photographer is notified that the image is a flare portrait scene, by using a display. More specifically, the microcomputer 114 causes the display unit 109 to display the letter D, informing the photographer that the image is a flare portrait scene. At the same time, the microcomputer 114 causes the display unit 109 to display selection buttons B 1 and B 2 .
- the selection buttons B 1 and B 2 are an OK button and an NG button, respectively, in the instance of FIG. 6 . If the selection button B 1 is touched, flare portrait photography is selected.
- “Flare portrait photography” is a coined term, which means photography in which scattered light is utilized to apply a special effect to the portrait image.
- the notification may be achieved not only by displaying it on the display unit 109 , but also by blinking an LED and/or generating a voice message.
- In Step S 111, the microcomputer 114 determines whether the flare portrait should be photographed or not.
- The microcomputer 114 determines in Step S 111 that the flare portrait should be photographed if the selection button B 1 has been selected, and that it should not be photographed if the selection button B 2 has been selected.
- the microcomputer 114 may determine, in Step S 111 , that the flare portrait should be photographed. In this case, the microcomputer 114 sets the photographing mode to the flare portrait mode, thereby to perform the exposure process (Step S 112 ). In the exposure process, the microcomputer 114 controls the exposure value of the face part detected by the face detecting circuit 108 , changing the exposure value to the second appropriate value that is greater than the first appropriate value.
- the microcomputer 114 inputs the image data (i.e., Bayer data or YC data) to the face detecting circuit 108 and causes the face detecting circuit 108 to detect the face part.
- the microcomputer 114 sets the exposure conditions (i.e., the aperture opening of the diaphragm 101 b and the exposure time of the imaging element 102 ), thereby changing the exposure value of the face part detected by the face detecting circuit 108 to the second appropriate value.
- the exposure value of the face part of the image represented by the Bayer data acquired in the imaging element 102 becomes the second appropriate value. How the exposure control is performed will be later explained in detail.
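- A sketch of this exposure control, under the assumption of a linear sensor response, is to scale the exposure time so that the face-part exposure value moves from its current value to the second appropriate value E2. The function name and the exposure-time-only adjustment (leaving aperture control aside) are illustrative assumptions.

```python
def exposure_for_face(current_time, current_ef, target_e2):
    """Sketch of the flare-portrait exposure control (Step S112):
    lengthen the exposure time so the face-part exposure value rises
    from its current value to the second appropriate value E2, which
    is greater than the first appropriate value.  Linear response is
    assumed; the diaphragm 101b is held fixed for brevity."""
    assert target_e2 >= current_ef
    return current_time * (target_e2 / current_ef)
```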
- The result of the exposure process so performed is applied to the through image (Step S 113 ). Then, the microcomputer 114 determines whether the 2nd release switch is on or not (Step S 114 ). If the 2nd release switch is not on in Step S 114 , the process returns to Step S 104 . Thus, if the 2nd release switch is not on while the 1st release switch is on, the AF process and the AE process will be continued.
- In Step S 115, the microcomputer 114 performs the photographing process.
- the microcomputer 114 controls the diaphragm 101 b and the imaging element 102 in accordance with the exposure condition (either for appropriating the exposure value of the face part or for the flare portrait mode).
- the Bayer data acquired in the imaging element 102 is stored in the RAM 105 .
- the image processing circuit 107 reads the Bayer data from the RAM 105 and processes the data, first reducing noise in the NR unit 501 , and finally compressing the data in the compressing unit 511 .
- the data compressed is stored in the RAM 105 .
- the microcomputer 114 applies predetermined header data to the compressed image data, generating an image file.
- the image file is recorded in the recording medium 111 .
- the header data is composed of various data items such as file name, file size, and exposure conditions for photographing.
- In Step S 116, the microcomputer 114 determines whether the present photographing mode is the flare portrait mode. If the present photographing mode is determined to be the flare portrait mode in Step S 116 , the process goes to Step S 110 . In the flare portrait mode, both the face part and the background part have exposure values close to the saturation level of the imaging element 102 . As a result, ΔE may become equal to or smaller than Th. In this embodiment, once the process of Step S 116 has been performed and the exposure control has thereby been performed in the flare portrait mode, the exposure control can be continued in the flare portrait mode until the photographer selects non-flare portrait photographing.
- the microcomputer 114 may determine in Step S 111 that the flare portrait should not be photographed, or may determine in Step S 116 that the present photographing mode is not the flare portrait mode. In this case, the microcomputer 114 performs an exposure process using the first appropriate value for the face part of the subject (Step S 117 ). In this exposure process, the microcomputer 114 changes the exposure value of the face part to the predetermined first appropriate value. Thereafter, the process goes to Step S 113 .
- In Step S 101, the microcomputer 114 determines whether the operating mode is the photographing mode.
- In Step S 118, the microcomputer 114 determines whether the operating mode is the reproducing mode.
- the operating mode may be found, in Step S 118 , not to be the reproducing mode. In this case, the process returns to Step S 101 .
- If the microcomputer 114 determines in Step S 118 that the operating mode is the reproducing mode, it makes the camera 100 operate in the reproducing mode.
- the reproducing mode will be briefly explained.
- the microcomputer 114 causes the display unit 109 to display the list of the image files stored in the recording medium 111 . If the photographer selects a desirable image file and pushes the reproduction button, the microcomputer 114 reads the compressed image data from the image file selected, and inputs the compressed image data to the image processing circuit 107 .
- the image processing circuit 107 expands the compressed image data.
- the image data expanded (YC data) is input to the display unit 109 .
- the display unit 109 displays the image represented by the image data.
- FIG. 7A is a diagram showing an example of a backlighted scene.
- FIG. 7B is a diagram showing a brightness distribution along the chain line shown in FIG. 7A . As seen from FIG. 7B , the background part S has higher brightness than the face part F.
- the A/D conversion unit 202 has a brightness range called “dynamic range,” in which the A/D conversion can be performed.
- the dynamic range is shown as “DR” in FIG. 7B .
- An image signal having brightness higher than the maximum value DRH of the dynamic range and an image signal having brightness lower than the minimum value DRL are clipped to the maximum value DRH and the minimum value DRL of the dynamic range DR, respectively.
- the brightness of the background part S is exposure-controlled to fall within the dynamic range DR in the case where the face part is photographed in a backlighted scene. Then, the brightness of the face part F, which is less bright than the background part S, will become even lower, and the image of the face part F may appear as a crushed shadow as shown in FIG. 8A . Conversely, the brightness of the background part S, which is brighter than the face part, will become even higher, and the image of the background part S may appear clipped white as shown in FIG. 8B .
- an image synthesizing technique has been developed.
- an image exposure-controlled as shown in FIG. 8A and an image exposure-controlled as shown in FIG. 8B are synthesized to provide an image of high dynamic range, which is free of crushed shadow and clipped white as shown in FIG. 8C .
- the synthesized image may, however, appear unnatural in some cases.
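- A minimal sketch of such an image synthesizing technique follows. The per-pixel weighting (trusting the face-exposed frame where the background-exposed frame is dark) is an assumption for illustration, not the patent's exact synthesis method.

```python
import numpy as np

def synthesize_hdr(under, over, drh):
    """Sketch of the synthesis of FIG. 8C: blend a frame exposed for
    the background (`under`, face crushed as in FIG. 8A) with a frame
    exposed for the face (`over`, background clipped white as in
    FIG. 8B).  Pixels dark in `under` are taken mostly from `over`;
    this weighting scheme is an illustrative assumption."""
    w = 1.0 - np.clip(under / drh, 0.0, 1.0)   # dark in `under` -> use `over`
    return w * over + (1.0 - w) * under
```

The blend avoids crushed shadow and clipped white, but, as the text notes, can look unnatural in some cases.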
- This embodiment can accomplish flare portrait photographing to provide a photograph in which the subject looks bright even in a bright background, thus mirroring the photographer's taste. More precisely, the exposure control is performed for a flare portrait scene as shown in FIG. 9 , in order to set the exposure value of the face part to the second appropriate value E 2 that is greater than the first appropriate value E 1 .
- the second appropriate value E 2 is, for example, the maximum value DRH for the dynamic range DR.
- this embodiment can perform an automatic exposure control on a high-key image (i.e., flare portrait image) desirable as a backlighted-scene portrait, by setting the exposure value of the face part greater than that of the background part, as shown in FIG. 9 .
- the automatic exposure control can result in a soft and mysterious image of a person embraced in light.
- a photograph is desirable in which the scattered light looks white and the face image appears in the scattered light.
- the exposure value is so selected that the background may look as white as possible, while preventing the face part from appearing as clipped white or crushed shadow.
- the exposure level may be held in an appropriate range, since any change in color balance is undesirable and the face image is important. As long as the exposure level remains in this range, the facial expression and the parts of the face can be recognized in the image, and the color balance never greatly deviates from the color of the subject. Even an image that is not entirely bright, but only locally bright, can be called a “high-key image.” If the brightness difference is large, the background looks bright and the face looks bright in one part and dark in the others. Such an image is still considered a flare portrait. In this embodiment, the main subject of photography is a person's face. Nonetheless, the entire face part need not be set to a predetermined exposure level.
- the main subject can be regarded as a part composed of the main parts of a face, i.e., eyes, nose and mouth. It is therefore not absolutely necessary to expose the corners of the face appropriately. This expands the range of possible scenes. Further, flowers, accessories and statues, which are better embraced in light, may be handled as main subjects.
- flare may be more easily generated around the face.
- the flare generating rate is raised in this embodiment, thereby providing images more desirable for a backlighted scene.
- the second appropriate value E 2 is the maximum value DRH for the dynamic range. Nonetheless, the second appropriate value E 2 may range from the first appropriate value E 1 to the maximum value DRH for the dynamic range.
- An exposure value capable of achieving the best possible photographic effect may be obtained through experiments and may be stored in the ROM 112 as the second appropriate value E 2 .
- a backlighted scene including the image of a person's face is considered a flare portrait scene.
- the main subject in a flare portrait scene is not necessarily the face of a person. It suffices to incorporate, in the camera 100 , a circuit configured to detect the main subject.
- the second embodiment of this invention will be described.
- an exposure control of the flare portrait mode is performed on a backlighted scene having a large brightness difference between the face part and the background part
- an ordinary exposure control is performed on a backlighted scene having a small brightness difference between the face part and the background part, thereby adjusting the exposure value of the face to an appropriate one.
- the second embodiment is designed to provide an image of the same quality as that acquired through the exposure control of the flare portrait mode, even in a backlighted scene that has no sufficient brightness difference between the background part and the face part.
- the second embodiment is identical to the first embodiment, in terms of the configuration of the camera 100 and the configuration of the image processing circuit 107 . Therefore, the configurations of the camera 100 and image processing circuit 107 of the second embodiment will not be described.
- FIG. 10 is a flowchart showing how the digital camera 100 of the second embodiment operates.
- the steps identical to those performed in the first embodiment are designated by the same numbers as in FIG. 4 and will not be described. That is, Steps S 101 to S 115 and Step S 118 will not be described.
- In Step S 109, ΔE may be found equal to or smaller than Th.
- the microcomputer 114 causes the display unit 109 to display the selection buttons B 1 and B 2 as shown in FIG. 6 (Step S 201 ). Then, the photographer may select the selection button B 1 or the selection button B 2 . Since the scene is not a flare portrait scene, character D for notifying a flare portrait scene need not be displayed.
- In Step S 202, the microcomputer 114 determines whether the flare portrait photographing should be performed or not. This decision is made in the same way as in Step S 111 .
- If the microcomputer 114 determines in Step S 202 that the flare portrait photographing should not be performed, it performs an exposure process, using the exposure value of the face part of the subject as a first appropriate value (Step S 203 ). In this exposure process, the microcomputer 114 adjusts the exposure value of the face part detected by the face detecting circuit 108 to a predetermined first appropriate value. Thereafter, the process goes to Step S 113 .
- Step S 202 the microcomputer 114 may determine that the flare portrait photographing should be performed.
- the microcomputer 114 controls the exposure value of the background part, rendering the same greater than the second appropriate value (Step S 204 ).
- the background part is composed of areas having almost uniform brightness different from that of the face part, as explained with reference to FIG. 5B .
- the second appropriate value is, for example, an exposure value smaller than the maximum value DRH for the dynamic range DR.
- Step S 205 is similar to Step S 112 .
- the microcomputer 114 then acquires image data (Bayer data) in which the exposure value of the background part is greater than the second appropriate value, and also the image data (Bayer data) in which the exposure value of the face part is equal to the second appropriate value. Thereafter, the microcomputer 114 causes the synthesis process circuit 106 to synthesize these two frames of image data (Step S 206 ). Then, the process goes to Step S 113 . In the synthesis process, the Bayer data acquired in Step S 205 is subjected to the synthesis for the face-part areas, and the Bayer data acquired in Step S 204 is subjected to the synthesis for the background-part areas. This data synthesis can provide an image comparable with a flare portrait image, even if the background is dark.
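- The synthesis of Step S 206 is, area for area, a selection between the two frames: face-part areas come from the frame whose face exposure equals the second appropriate value, and the remaining areas come from the frame whose background exposure exceeds it. A sketch, with `face_area_mask` as an illustrative name for the face-part area map:

```python
import numpy as np

def synthesize_flare_frames(face_frame, background_frame, face_area_mask):
    """Sketch of the synthesis of Step S206: the Bayer data acquired in
    Step S205 supplies the face-part areas, and the Bayer data acquired
    in Step S204 supplies the background-part areas."""
    return np.where(face_area_mask, face_frame, background_frame)
```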
- Photographing may be performed in Step S 115 after the processes of Steps S 204 to S 206 have been performed.
- the image data items acquired by performing the exposure process several times as in Steps S 204 to S 206 are synthesized, thereby generating image data.
- the image data so generated is processed in the image processing circuit 107 and then recorded in the recording medium 111 .
- the exposure control performed in the flare portrait photographing in Steps S 204 to S 206 will be explained.
- the exposure control is performed in Steps S 204 to S 206 for such a scene as shown in FIG. 11A , in which the face part F is brighter than the background part S. Since the face part F is brighter than the background part S in this scene, the exposure value of the background part S never exceeds the exposure value of the face part F even if the exposure control is performed in the same way as in the flare portrait photographing performed in the first embodiment. Consequently, an image of the same quality as attained in the first embodiment cannot be provided.
- an exposure control is performed, changing the exposure value of the background part to a value greater than the second appropriate value, and another exposure control is performed, changing the exposure value of the face part to the second appropriate value.
- the two images acquired by these exposure controls, respectively, are synthesized, providing a flare portrait image even if the background part is darker than the face part in the scene.
- the image is a through image acquired prior to the flare portrait photographing, having ΔE equal to or smaller than Th, and the exposure process is performed twice for it. In practice, however, it suffices to perform the exposure process only once on one frame of the image to be displayed.
- FIG. 12 is a timing chart showing how a through image is displayed in preparation for providing a flare portrait image in such a modified embodiment.
- the exposure process for displaying the through image is performed in accordance with a sync signal VD.
- two types of exposure processes, each with a different exposure time, are alternately repeated.
- the exposure value of the background part is greater than the second appropriate value, as controlled in Step S 204 ( FIG. 10 ).
- the exposure value of the face part is the second appropriate value, as controlled in Step S 205 ( FIG. 10 ).
- the imaging process circuit 103 reads the image signal.
- R 1 to R 6 are the data readings performed after the exposure processes EX 1 to EX 6 , respectively.
- the Bayer data items read for the respective frames are synthesized in the synthesis process circuit 106 .
- the Bayer data items for two immediately preceding frames are synthesized.
- the synthesis is sequentially performed, starting after the data reading R 2 for the second frame.
- synthesis C 12 is performed after the data reading R 2 , synthesizing the Bayer data items for the first and second frames.
- similarly, syntheses C 23 , C 34 and C 45 are performed, synthesizing the Bayer data items for the second and third, the third and fourth, and the fourth and fifth frames, respectively.
- the image data is processed in the image processing circuit 107 .
- image processing operations IP 12 , IP 23 , IP 34 and IP 45 are performed after the syntheses C 12 , C 23 , C 34 and C 45 , respectively.
- image displaying operations D 12 , D 23 , D 34 and D 45 are performed after the image processing operations IP 12 , IP 23 , IP 34 and IP 45 , respectively.
- the image displaying starts after the exposure has been performed on the third frame.
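- The FIG. 12 timing can be sketched as an event schedule: exposures alternate between the background setting of Step S 204 and the face setting of Step S 205, and each reading from the second frame onward triggers a synthesis of the two immediately preceding frames. Which setting comes first is not specified in the patent and is an assumption here.

```python
def through_image_schedule(n_frames):
    """Sketch of the FIG. 12 timing chart: exposure processes EX1,
    EX2, ... alternate between the two settings, readings R1, R2, ...
    follow each exposure, and syntheses C12, C23, ... combine the two
    immediately preceding frames."""
    events = []
    for k in range(1, n_frames + 1):
        kind = "background" if k % 2 == 1 else "face"  # order assumed
        events.append(f"EX{k}:{kind}")
        events.append(f"R{k}")
        if k >= 2:
            events.append(f"C{k-1}{k}")  # synthesize frames k-1 and k
    return events
```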
- the exposure control is performed in the photographing mode, thereby acquiring a high-key image (i.e., flare portrait image) desirable for a backlighted scene.
- the exposure correction performed in the reproducing mode is utilized to acquire an image having photographic effects comparable with those of the flare portrait image.
- the third embodiment is identical to the first embodiment in terms of the configuration of the camera 100 and the configuration of the image processing circuit 107 . Therefore, the configurations of the camera 100 and image processing circuit 107 of the third embodiment will not be described.
- FIG. 13 is a flowchart showing how the digital camera according to a third embodiment of the invention operates.
- the steps identical to those performed in the first embodiment are designated by the same numbers as in FIG. 4 and will not be described. That is, Steps S 101 to S 105 , Step S 117 , Steps S 113 to S 115 and Step S 118 will not be described.
- the operating mode may be found to be the reproducing mode.
- the microcomputer 114 operates in the reproducing mode.
- the microcomputer 114 causes the display unit 109 to display the list of the image files stored in the recording medium 111 (Step S 301 ). Thereafter, the microcomputer 114 determines which image file in the list displayed has been selected by the photographer (Step S 302 ).
- Step S 302 an image file may be found to have been selected.
- the microcomputer 114 reproduces the image file selected (Step S 303 ). More specifically, the microcomputer 114 reads the compressed image data and inputs the same to the image processing circuit 107 .
- the image processing circuit 107 expands the compressed image data, generating image data (YC data).
- the image data (YC data) is stored in the RAM 105 . Thereafter, the microcomputer 114 inputs the image data stored in the RAM 105 , to the display unit 109 .
- the display unit 109 displays the image represented by the image data.
- the microcomputer 114 After reproducing the image file, the microcomputer 114 acquires the exposure data Ef about the face part in the image file reproduced (Step S 304 ). Thereafter, the microcomputer 114 acquires the exposure data Es about the background part in the image file (Step S 305 ).
- the face part and the background part assume such positions as has been explained with reference to FIG. 5A and FIG. 5B .
- the microcomputer 114 After acquiring the exposure data Ef and the exposure data Es, the microcomputer 114 calculates the difference ⁇ E between the exposure data Es about the background part and the exposure data Ef about the face part, i.e., Es ⁇ Ef (Step S 305 ). The microcomputer 114 then determines whether ⁇ E is greater than threshold value Th (Step S 307 ).
- the threshold value Th is a threshold value for determining whether the photographed scene is a flare portrait scene or not, as has been explained in connection with the first embodiment.
- the threshold value Th is set at the time of, for example, designing the camera 100 , and then stored in the ROM 112 .
- In Step S 307, ΔE may be found greater than Th, i.e., the scene may be found to be a flare portrait scene.
- the microcomputer 114 notifies the photographer that the image is a flare portrait scene (Step S 308 ).
- FIG. 14 is a diagram showing how the photographer is notified that the scene is a flare portrait scene, by using the display unit 109 .
- the microcomputer 114 causes the display unit 109 to display character D and also selection buttons B 3 and B 4 .
- the character D informs the photographer that the scene is a flare portrait scene.
- the selection buttons B 3 and B 4 are displayed, enabling the photographer to select, or not to select, the exposure correction for the flare portrait.
- the exposure correction for the flare portrait is performed if the selection button B 3 is selected and is not performed if the selection button B 4 is selected.
- the notification may be achieved not only by displaying it on the display unit 109 , but also by blinking an LED and/or generating a voice message.
- In Step S 309, the microcomputer 114 determines whether or not the flare portrait should undergo the exposure correction.
- The microcomputer 114 determines in Step S 309 that the flare portrait should undergo the exposure correction if the photographer has selected the selection button B 3 , and that it should not if the photographer has selected the selection button B 4 .
- In Step S 310, the microcomputer 114 causes the amplifying unit 513 of the image processing circuit 107 to amplify the image data, pixel by pixel, so that the brightness of the face part detected by the face detecting circuit 108 may have the second appropriate value that is greater than the first appropriate value.
- the amplification of the image data can result in a flare portrait image comparable with the flare portrait image provided through the exposure process of FIG. 9 .
- a certain amplification factor may be applied to the face part, and a different amplification factor may be applied to the background part. In this case, a flare portrait image can be provided even if the background part is darker than the face part in the scene as described in connection with the second embodiment.
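- This gain-based correction can be sketched as follows; the per-pixel gain model, the boolean `face_mask`, and the separate `background_gain` parameter are illustrative assumptions.

```python
import numpy as np

def flare_gain_correction(y, face_mask, first_value, second_value,
                          background_gain=1.0):
    """Sketch of the Step S310 correction: amplify the face-part
    pixels so their brightness rises from the first appropriate value
    to the greater second appropriate value; a different gain may be
    applied to the background part, as described for the case of a
    background darker than the face."""
    gain = np.where(face_mask, second_value / first_value, background_gain)
    return y * gain
```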
- In Step S 311, the microcomputer 114 determines that the reproduction of the image file should be terminated if the photographer has depressed the reproduction button again at the operation unit 113 .
- If the microcomputer 114 determines, in Step S 311 , that the reproduction of the image file should not be terminated, the process returns to Step S 303 , whereby the reproduction of the image file is continued. After the exposure correction has been performed on the image data, the image represented by the image data is displayed by the display unit 109 . If the microcomputer 114 determines, in Step S 311 , that the reproduction of the image file should be terminated, the process returns to Step S 118 .
- this embodiment can provide a flare portrait image comparable with one acquired in ordinary photographing, by performing the exposure correction in the reproducing mode.
- an exposure control is performed, adjusting the exposure value of the face part to the second appropriate value.
- the exposure value of the areas defining the boundary between the face part and the background part may be used as second appropriate value.
- FIG. 15 is a flowchart showing how the modification 1 performs the exposure process in the flare portrait mode. This exposure process is performed in place of Step S 112 shown in FIG. 4 .
- the microcomputer 114 detects the low-contrast areas surrounding the face part from the image data (Bayer data or YC data) (Step S 401 ).
- the “areas surrounding the face part” are areas (i.e., black areas) that surround the face part, are closer to the face part than to the background part and undergo small brightness change.
- the method of detecting low-contrast areas is no more than an example.
- the low-contrast areas may be detected by any other method.
- In Step S 402, the microcomputer 114 detects the low-contrast area that is brighter than any other low-contrast area detected. Since any flare portrait scene is a backlighted scene including a face part, the area detected in Step S 402 can be considered the closest to the background part.
- the microcomputer 114 detects the areas defining the boundary between the areas detected in Step S 402 and the face part detected by the face detecting circuit 108 (Step S 403 ). Thereafter, the microcomputer 114 performs an exposure process using, as a second appropriate value, the exposure value of the areas detected in Step S 403 (Step S 404 ). Then, the microcomputer 114 terminates the exposure process of FIG. 15 .
- the exposure control is performed using the exposure value of the areas defining the boundary between the face part and the background part as a second appropriate value. These areas are most susceptible to flare. The exposure value of these areas is therefore used as the second appropriate value, enhancing the possibility of generating flare FL around the face part of a person as shown in FIG. 16B .
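- Steps S 401 to S 404 can be sketched at the area level: find the brightest low-contrast area surrounding the face, then take the exposure value of the face-side areas adjoining it as the second appropriate value. Using 4-neighbour adjacency and averaging the adjoining face areas are assumptions made for this sketch.

```python
import numpy as np

def boundary_exposure_value(area_exposure, face_mask, low_contrast_mask):
    """Sketch of FIG. 15: pick the brightest low-contrast area
    surrounding the face (Step S402), then use the exposure value of
    the face-part areas bordering it, i.e. the boundary areas, as the
    second appropriate value (Steps S403-S404)."""
    rows, cols = area_exposure.shape
    # brightest low-contrast area: the one closest to the background
    lc = np.where(low_contrast_mask, area_exposure, -np.inf)
    r, c = np.unravel_index(np.argmax(lc), lc.shape)
    # face-part areas adjoining it define the boundary (4-neighbour)
    vals = []
    for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        rr, cc = r + dr, c + dc
        if 0 <= rr < rows and 0 <= cc < cols and face_mask[rr, cc]:
            vals.append(area_exposure[rr, cc])
    return float(np.mean(vals)) if vals else None
```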
- a high-key image desirable as a portrait in a backlighted scene can be more easily provided than in the first embodiment.
- a flare portrait image is provided by performing only one exposure process, as in the first embodiment. Nonetheless, the exposure control may be repeated several times as in the second embodiment. If this is the case, the exposure control of using the exposure value of the boundary areas as the second appropriate value may be combined with the exposure control of clipping white in the background part. Alternatively, the exposure control of using the exposure value of the face part as the second appropriate value may be combined with the exposure control of clipping white in the boundary areas.
- Flare is generated from the light in any bright part of the background.
- the exposure control may be performed by using, as the second appropriate value, the exposure value of the area (black area) adjacent to the low-contrast area having the largest exposure value Esmax. In this case, the flare can be expressed more naturally.
- the exposure control is performed by using the exposure data calculated from the brightness data generated by the brightness-data generating unit 204 .
- the exposure control may be performed by using the R, G and B components of the Bayer data.
- FIG. 17 is a flowchart showing how a modification 2 performs an exposure process in the flare portrait mode. This exposure process is performed in place of Step S 112 shown in FIG. 4 .
- the microcomputer 114 acquires the Bayer data separated into three color components by the pixel-data separating unit 203 of the imaging process circuit 103 .
- the microcomputer 114 then cumulates the color-components of the Bayer data, generating exposure data items for the respective color components (Step S 501 ).
- the exposure data about the G component is the sum of the exposure data items about the Gr and Gb components.
- the microcomputer 114 performs an exposure process, increasing the exposure value gradually (Step S 502 ).
- the microcomputer 114 sets exposure conditions (e.g., the aperture opening of the diaphragm 101 b and the exposure time of the imaging element 102 ), thereby increasing the exposure value by a preset value (e.g., 1 EV).
- the microcomputer 114 then controls the diaphragm 101 b and the imaging element 102 in accordance with the exposure conditions.
- the microcomputer 114 determines whether any one of the exposure data items about the R, G and B components of the face part detected by the face detecting circuit 108 has almost reached the saturation level (Step S 503 ). If any one of the exposure data items about the R, G and B components has become, for example, 80% of the maximum value DRH for the dynamic range DR, the microcomputer 114 determines that the exposure data has almost reached the saturation level.
- In Step S503, none of the exposure data items about the R, G and B components may be found to have almost reached the saturation level.
- the microcomputer 114 determines whether any one of the exposure data items about the R, G and B components of the background part has reached the saturation level (Step S 504 ).
- In Step S504, none of the exposure data items about the R, G and B components of the background part may be found to have reached the saturation level. If so, the process returns to Step S502, and the microcomputer 114 keeps performing the exposure process to increase the exposure value.
- In Step S503, any one of the R, G and B components of the face part may be found to have almost reached the saturation level.
- In Step S504, any one of the exposure data items about the R, G and B components of the background part may be found to have reached the saturation level. In either case, the microcomputer 114 terminates the exposure process, and then terminates the process of FIG. 17.
- the exposure data items for the three color components are calculated from the R, G and B components of the Bayer data, respectively, and the exposure values are determined for the color components, respectively, from the exposure data items.
- The modification 2 can therefore control the exposure values more minutely than each embodiment described above.
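The stepwise exposure loop of Steps S501 to S504 can be sketched as follows. This is a minimal illustration, not the patented implementation: the `DRH` value, the 80% near-saturation threshold taken from the text, and the `measure_face`/`measure_background` callables standing in for the imaging pipeline are all assumptions.

```python
# Minimal sketch of the modification-2 exposure loop (Steps S501 to S504).
# DRH and the measurement callables are hypothetical stand-ins for the
# sensor dynamic range and the per-color-component exposure data items.

DRH = 4095                   # assumed 12-bit dynamic-range maximum
NEAR_SATURATION = 0.8 * DRH  # "almost reached saturation": 80% of DRH

def exposure_process(measure_face, measure_background, step_ev=1.0, max_ev=8.0):
    """Raise the exposure value stepwise until a stop condition holds.

    measure_face / measure_background take an EV and return a dict of
    cumulated exposure data items {"R": ..., "G": ..., "B": ...},
    where G is the sum of the Gr and Gb components (Step S501).
    """
    ev = 0.0
    while ev < max_ev:
        face = measure_face(ev)
        background = measure_background(ev)
        # Step S503: has any face color component almost reached saturation?
        if any(v >= NEAR_SATURATION for v in face.values()):
            break
        # Step S504: has any background color component reached saturation?
        if any(v >= DRH for v in background.values()):
            break
        ev += step_ev  # Step S502: increase the exposure value by a preset step
    return ev
```

The `max_ev` guard is an added safety bound; the flowchart itself loops until one of the two saturation conditions holds.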
- FIG. 18 is a flowchart showing how a digital camera, which is a combination of the modifications 1 and 2, performs an exposure process in the flare portrait mode. This exposure process is performed in place of Step S 112 shown in FIG. 4 .
- the microcomputer 114 generates exposure data items for the brightest color components detected by a method similar to that of Steps S401 and S402 shown in FIG. 15 (Step S601).
- the microcomputer 114 then performs an exposure process to increase the exposure value gradually (Step S 602 ).
- This exposure process is similar to the exposure process explained with reference to FIG. 17 .
- the microcomputer 114 determines whether any one of the exposure data items for R, G and B components for the brightest area has reached the saturation level (Step S 603 ).
- In Step S603, none of the exposure data items for the R, G and B components of the brightest area may be found to have reached the saturation level. If this is the case, the process returns to Step S602, and the microcomputer 114 continues the exposure process, gradually increasing the exposure value.
- In Step S603, any one of the exposure data items for the R, G and B components of the brightest area may be found to have reached the saturation level. In this case, the microcomputer 114 terminates the exposure process shown in FIG. 18.
- In this process, the exposure data items for the R, G and B color components of the Bayer data are calculated, the exposure values are determined for the respective color components, and the exposure control is performed in accordance with the exposure values thus determined.
- This combination of the modifications 1 and 2 can therefore control the exposure values more minutely than each embodiment described above.
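The combined process of FIG. 18 differs mainly in its stop condition: it watches the brightest area rather than the face and background parts. A hedged sketch, in which the `DRH` value and the `measure_brightest` callable are assumed stand-ins:

```python
# Minimal sketch of the combined modification-1-and-2 loop (Steps S601 to S603).
# DRH and measure_brightest are hypothetical stand-ins for the sensor
# dynamic range and the brightest-area exposure data items.

DRH = 4095  # assumed 12-bit dynamic-range maximum

def exposure_process_brightest(measure_brightest, step_ev=1.0, max_ev=8.0):
    """measure_brightest takes an EV and returns the exposure data items
    {"R": ..., "G": ..., "B": ...} of the brightest area (Step S601)."""
    ev = 0.0
    while ev < max_ev:
        # Step S603: has any component of the brightest area reached saturation?
        if any(v >= DRH for v in measure_brightest(ev).values()):
            break
        ev += step_ev  # Step S602: keep increasing the exposure value
    return ev
```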
- the Bayer data is separated into R, G and B components, and exposure values are determined for the R, G and B components, respectively.
- the image data acquired in the imaging element 102 is Bayer data of the primary color system.
- If the imaging element 102 has a color-filter arrangement different from the primary-color arrangement, the color components for which exposure values are determined will differ.
- the flare portrait photographing is performed on a flare portrait scene that is actually a backlighted scene including a face part. Nonetheless, the flare portrait photographing may be performed on a scene other than a backlighted scene.
- the scene is assumed to be a flare portrait scene, thus enabling the photographer to perform flare portrait photographing in order to provide portraits of a model.
- FIG. 19 is a flowchart showing the process performed in the modification 3.
- the face detecting circuit 108 detects not only the face part, but also the facial expression from the positions of the parts of face (e.g., eyes, nose and mouth).
- the microcomputer 114 causes the face detecting circuit 108 to detect the face part and the parts of face (Step S 701 ).
- the microcomputer 114 determines whether the ratio of the height of the face part to the screen height of the display unit 109 (i.e., the height of the image displayed on the display unit 109) is equal to or larger than a threshold value (e.g., 1/6 of the image height) (Step S702). If the ratio is equal to or larger than the threshold value, it is determined that the photographer wants to perform portrait photographing. As shown in FIG. 20A, the height of the face part is the distance H between the top of the head and the tip of the jaw. If the ratio for the distance H is equal to or larger than the threshold value (1/6), the decision made in Step S702 is affirmative (Yes).
- The threshold value is not limited to 1/6, however.
- In Step S702, the ratio of the height of the face part detected by the face detecting circuit 108 to the screen height of the display unit 109 may be determined to be equal to or larger than 1/6.
- the microcomputer 114 determines whether the ratio of the height of the eye part detected by the face detecting circuit 108 to the width of the eye part is equal to or larger than a threshold value (e.g., 1/3) (Step S703). If the eye part has a relatively large height-to-width ratio, the person (i.e., the photography subject) seems to be opening his or her eyes fully. In this case, too, the photographer is considered as having the intention of performing portrait photographing. As shown in FIG. 20A, the height of the eye part is the distance h between the upper and lower edges of either eye, and the width of the eye part is the distance d between the left and right ends of either eye. If the ratio of the distance h to the distance d is equal to or larger than the threshold value (1/3), the decision made in Step S703 is affirmative (Yes).
- The threshold value is not limited to 1/3, however.
- In Step S703, the eye part detected by the face detecting circuit 108 may be determined to have a height-to-width ratio equal to or larger than 1/3.
- the microcomputer 114 determines whether the boundary between the upper and lower lips is straight or not, and whether the upper and lower lips are symmetrical (Step S 704 ).
- m2 shows that the boundary between the upper and lower lips is straight.
- m1 shows that the upper and lower lips are symmetrical. If the person wears the expression of FIG. 20A or the expression of FIG. 20B, the photographer is considered as having the intention of performing portrait photographing.
- In Step S704, the microcomputer 114 may determine that the boundary between the upper and lower lips is straight or that the upper and lower lips are symmetrical. If this is the case, the microcomputer 114 determines that the scene is a flare portrait scene, and then recommends the flare portrait photographing to the photographer.
- In Step S702, the ratio of the height of the face part to the screen height of the display unit 109 may be determined to be smaller than 1/6.
- In Step S703, the ratio of the height of the eye part to the width of the eye part may be determined to be smaller than 1/3.
- In Step S704, the boundary between the upper and lower lips may be found not straight, and the upper and lower lips may be found not symmetrical. In this case, the microcomputer 114 determines that the scene is not a flare portrait scene, and therefore does not recommend the flare portrait photographing to the photographer.
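The decision sequence of Steps S702 to S704 can be sketched as a single predicate. The geometric inputs below are hypothetical names; the 1/6 and 1/3 thresholds are the example values given in the text.

```python
# Minimal sketch of the modification-3 flare portrait scene decision
# (Steps S702 to S704). All parameter names are assumed, not from the patent.

def is_flare_portrait_scene(face_height, screen_height,
                            eye_height, eye_width,
                            lip_boundary_is_straight, lips_are_symmetrical):
    # Step S702: the face part occupies at least 1/6 of the screen height
    if face_height / screen_height < 1 / 6:
        return False
    # Step S703: the eyes are wide open (height-to-width ratio at least 1/3)
    if eye_height / eye_width < 1 / 3:
        return False
    # Step S704: the lip boundary is straight, or the lips are symmetrical
    return lip_boundary_is_straight or lips_are_symmetrical
```

When the predicate returns True, the camera would recommend flare portrait photographing to the photographer.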
- the photographer can perform the flare portrait photographing even if the scene is not a backlighted scene.
- the technique of determining a backlighted scene and the technique of determining the facial expression of the subject, both described above, may be used in combination.
- The flare portrait photographing can be recommended to the photographer if the scene is backlighted and includes a face part, and if the decision made in Step S704 (FIG. 19) is affirmative (Yes).
- the facial expressions shown in FIGS. 20A and 20B are no more than examples.
- the model may assume other facial expressions, each defined by a positional relation of the face parts of the model. Therefore, whether or not to perform the flare portrait photographing may be determined in accordance with the arrangement of the face parts.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Human Computer Interaction (AREA)
- Studio Devices (AREA)
- Exposure Control For Cameras (AREA)
Abstract
Description
Y=(r×R+gr×Gr+gb×Gb+b×B)/(r+gr+gb+b) (1)
where R, Gr, Gb and B are the values of the pixels, respectively, and r, gr, gb and b are coefficients.
Y=(Gr+Gb)/2 (2)
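Equations (1) and (2) can be illustrated directly. The default coefficient values below are assumed luma-style weights for illustration only; the description does not specify them.

```python
# Worked sketch of equations (1) and (2). The default coefficients are
# hypothetical luma-style weights, not values given in the description.

def brightness(R, Gr, Gb, B, r=0.299, gr=0.587, gb=0.587, b=0.114):
    """Equation (1): weighted average of the four Bayer pixel values."""
    return (r * R + gr * Gr + gb * Gb + b * B) / (r + gr + gb + b)

def green_brightness(Gr, Gb):
    """Equation (2): simplified brightness from the green components only."""
    return (Gr + Gb) / 2
```

Because equation (1) divides by the sum of the coefficients, equal pixel values yield that same value as the brightness, regardless of the weights chosen.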
Claims (13)
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2012-245382 | 2012-11-07 | ||
JP2012245382A JP5647209B2 (en) | 2012-11-07 | 2012-11-07 | Imaging apparatus and imaging method |
Publications (2)
Publication Number | Publication Date |
---|---|
US20140125863A1 US20140125863A1 (en) | 2014-05-08 |
US9210334B2 true US9210334B2 (en) | 2015-12-08 |
Family
ID=50622020
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/071,813 Expired - Fee Related US9210334B2 (en) | 2012-11-07 | 2013-11-05 | Imaging apparatus and imaging method for flare portrait scene imaging |
Country Status (3)
Country | Link |
---|---|
US (1) | US9210334B2 (en) |
JP (1) | JP5647209B2 (en) |
CN (1) | CN103813097B (en) |
Families Citing this family (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR102263537B1 (en) * | 2014-09-30 | 2021-06-11 | 삼성전자주식회사 | Electronic device and control method of the same |
JP6186521B2 (en) * | 2014-12-26 | 2017-08-23 | 富士フイルム株式会社 | Focus control device, imaging device, focus control method, and focus control program |
WO2016132976A1 (en) * | 2015-02-17 | 2016-08-25 | ソニー株式会社 | Transmission device, transmission method, reception device, and reception method |
CN105049726B (en) * | 2015-08-05 | 2018-09-04 | 广东欧珀移动通信有限公司 | A kind of mobile terminal photographic method and mobile terminal |
CN105245786B (en) * | 2015-09-09 | 2019-01-08 | 厦门美图之家科技有限公司 | A kind of self-timer method based on intelligent testing light, self-heterodyne system and camera terminal |
CN105450932B (en) * | 2015-12-31 | 2018-11-09 | 华为技术有限公司 | Backlight photographic method and device |
JP6702752B2 (en) * | 2016-02-16 | 2020-06-03 | キヤノン株式会社 | Image processing device, imaging device, control method, and program |
US20180139369A1 (en) | 2016-11-16 | 2018-05-17 | Motorola Mobility Llc | Backlit face detection |
US10158797B2 (en) * | 2017-03-31 | 2018-12-18 | Motorola Mobility Llc | Combining images when a face is present |
US11272113B2 (en) | 2017-10-24 | 2022-03-08 | Sony Corporation | Control apparatus and control method for exposure adjustment |
CN108616689B (en) * | 2018-04-12 | 2020-10-02 | Oppo广东移动通信有限公司 | Portrait-based high dynamic range image acquisition method, device and equipment |
CN110536072A (en) * | 2018-05-25 | 2019-12-03 | 神讯电脑(昆山)有限公司 | Automobile-used image-taking device and image acquisition method |
CN108683862B (en) | 2018-08-13 | 2020-01-10 | Oppo广东移动通信有限公司 | Imaging control method, imaging control device, electronic equipment and computer-readable storage medium |
CN109788278B (en) * | 2019-01-16 | 2020-12-01 | 深圳市壹欣科技有限公司 | Camera glare testing method and glare collecting device thereof |
CN112686852A (en) * | 2020-12-25 | 2021-04-20 | 浙江伟星实业发展股份有限公司 | Product defect identification method, device, equipment and storage medium |
Citations (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS6460156A (en) | 1987-08-31 | 1989-03-07 | Canon Kk | Method for controlling image pickup device |
JPH0220840A (en) | 1988-07-08 | 1990-01-24 | Fuji Photo Film Co Ltd | Photometry device |
JPH0895146A (en) | 1994-09-21 | 1996-04-12 | Konica Corp | Camera |
US6246837B1 (en) * | 1998-06-09 | 2001-06-12 | Minolta Co., Ltd. | Image sensing apparatus |
US20040119874A1 (en) * | 2002-09-20 | 2004-06-24 | Toshie Imai | Backlight adjustment processing of image using image generation record information |
US20050187437A1 (en) * | 2004-02-25 | 2005-08-25 | Masakazu Matsugu | Information processing apparatus and method |
US20060027733A1 (en) * | 2004-08-05 | 2006-02-09 | Broadcom Corporation | Apparatus and method of digital imaging on a semiconductor substrate |
US20060055784A1 (en) * | 2004-09-02 | 2006-03-16 | Nikon Corporation | Imaging device having image color adjustment function |
JP2006311311A (en) | 2005-04-28 | 2006-11-09 | Fuji Photo Film Co Ltd | Imaging apparatus and method |
JP2007074163A (en) | 2005-09-05 | 2007-03-22 | Sony Corp | Imaging device and imaging method |
JP2007228118A (en) | 2006-02-22 | 2007-09-06 | Seiko Epson Corp | Determination of shooting scene |
JP2007243384A (en) | 2006-03-07 | 2007-09-20 | Seiko Epson Corp | Photography assistance in digital camera |
US20080094493A1 (en) * | 2004-09-10 | 2008-04-24 | Senshu Igarashi | Imaging Apparatus |
JP2008131542A (en) * | 2006-11-24 | 2008-06-05 | Fujifilm Corp | Color correction apparatus, and color correction program |
US20090073275A1 (en) * | 2005-06-01 | 2009-03-19 | Kouhei Awazu | Image capturing apparatus with flash device |
US20100315521A1 (en) * | 2009-06-15 | 2010-12-16 | Keiji Kunishige | Photographing device, photographing method, and playback method |
US20120098993A1 (en) * | 2008-01-17 | 2012-04-26 | Canon Kabushiki Kaisha | Image processing apparatus and image processing method and image capturing apparatus |
US20120134558A1 (en) * | 2010-11-29 | 2012-05-31 | Alexander Sienkiewicz | Method for providing visual simulation of teeth whitening |
JP2012156647A (en) | 2011-01-24 | 2012-08-16 | Nikon Corp | Digital camera and electronic apparatus with camera |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5386793B2 (en) * | 2006-12-11 | 2014-01-15 | 株式会社リコー | Imaging apparatus and exposure control method for imaging apparatus |
JP4678060B2 (en) * | 2009-03-25 | 2011-04-27 | 株式会社ニコン | Imaging apparatus and image processing program |
- 2012-11-07 JP JP2012245382A patent/JP5647209B2/en not_active Expired - Fee Related
- 2013-11-05 US US14/071,813 patent/US9210334B2/en not_active Expired - Fee Related
- 2013-11-07 CN CN201310551648.3A patent/CN103813097B/en not_active Expired - Fee Related
Non-Patent Citations (3)
Title |
---|
English Translation: JP Patent Application Publication 2008-131542; Published on Jun. 5, 2008. Industrial Property Digital Library PAJ-Machine Translation. * |
Office Action for corresponding Japanese Patent Application Serial No. 2012-245382, mailed Jul. 29, 2014 (2 pgs.), with translation (2 pgs.). |
Office Action to Japanese Patent Application No. 2014-226278, mailed on Aug. 18, 2015 (3 pgs.) with translation (4 pgs.). |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20170201665A1 (en) * | 2014-06-20 | 2017-07-13 | Sony Corporation | Image capturing apparatus and image capturing method |
US10237488B2 (en) * | 2014-06-20 | 2019-03-19 | Sony Corporation | Image capturing apparatus and image capturing method |
US20190172420A1 (en) * | 2016-06-15 | 2019-06-06 | Shenzhen Tcl New Technology Co., Ltd | Photographing method by smart television and system thereof |
US10957281B2 (en) * | 2016-06-15 | 2021-03-23 | Shenzhen Tcl New Technology Co., Ltd | Photographing method by smart television and system thereof |
US20180091719A1 (en) * | 2016-09-28 | 2018-03-29 | Renesas Electronics Corporation | Backlight correction program and semiconductor device |
Also Published As
Publication number | Publication date |
---|---|
CN103813097A (en) | 2014-05-21 |
JP2014096621A (en) | 2014-05-22 |
JP5647209B2 (en) | 2014-12-24 |
US20140125863A1 (en) | 2014-05-08 |
CN103813097B (en) | 2017-11-28 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9210334B2 (en) | Imaging apparatus and imaging method for flare portrait scene imaging | |
US7184078B2 (en) | Image-processing apparatus and image-quality control method | |
KR100799215B1 (en) | Camera | |
US6812969B2 (en) | Digital camera | |
JP5321163B2 (en) | Imaging apparatus and imaging method | |
JP2006025312A (en) | Imaging apparatus and image acquisition method | |
US20130050542A1 (en) | Image processing apparatus and photographing apparatus | |
US9749546B2 (en) | Image processing apparatus and image processing method | |
JP5660341B2 (en) | Imaging apparatus and imaging method | |
JP2013055567A (en) | Imaging apparatus | |
JP5458937B2 (en) | IMAGING DEVICE, IMAGING METHOD, AND COMPUTER-READABLE RECORDING MEDIUM CONTAINING PROGRAM FOR EXECUTING THE IMAGING METHOD | |
JP4999871B2 (en) | Imaging apparatus and control method thereof | |
US10762600B2 (en) | Image processing apparatus, image processing method, and non-transitory computer-readable recording medium | |
JP5903478B2 (en) | Imaging apparatus and imaging method | |
US10051200B2 (en) | Imaging apparatus and control method thereof | |
JP6197062B2 (en) | Imaging device, imaging method, display device, and display method | |
JP5146015B2 (en) | Imaging apparatus and imaging method | |
JP4879127B2 (en) | Digital camera and digital camera focus area selection method | |
JP5803233B2 (en) | Imaging apparatus and imaging method | |
JP5310331B2 (en) | Imaging apparatus and imaging method | |
JP2009118052A (en) | Image signal processing method and apparatus | |
US20210289116A1 (en) | Image processing apparatus and image processing method | |
JP2011086209A (en) | Moving picture processing apparatus and moving picture processing method | |
JP5644260B2 (en) | Imaging apparatus and white balance control method | |
JP5807378B2 (en) | Imaging apparatus, imaging method, and imaging program |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: OLYMPUS IMAGING CORP., JAPAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:TOYODA, TETSUYA;NONAKA, OSAMU;REEL/FRAME:031654/0592 Effective date: 20131023 |
|
AS | Assignment |
Owner name: OLYMPUS CORPORATION, JAPAN Free format text: MERGER;ASSIGNOR:OLYMPUS IMAGING CORP.;REEL/FRAME:036616/0332 Effective date: 20150401 |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
AS | Assignment |
Owner name: OLYMPUS CORPORATION, JAPAN Free format text: CHANGE OF ADDRESS;ASSIGNOR:OLYMPUS CORPORATION;REEL/FRAME:039344/0502 Effective date: 20160401 |
|
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 4 |
|
AS | Assignment |
Owner name: OM DIGITAL SOLUTIONS CORPORATION, JAPAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:OLYMPUS CORPORATION;REEL/FRAME:058294/0274 Effective date: 20210730 |
|
FEPP | Fee payment procedure |
Free format text: MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
LAPS | Lapse for failure to pay maintenance fees |
Free format text: PATENT EXPIRED FOR FAILURE TO PAY MAINTENANCE FEES (ORIGINAL EVENT CODE: EXP.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
STCH | Information on status: patent discontinuation |
Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362 |
|
FP | Lapsed due to failure to pay maintenance fee |
Effective date: 20231208 |