EP3675477B1 - Electronic device for providing function by using rgb image and ir image acquired through one image sensor - Google Patents
- Publication number
- EP3675477B1 (application EP18857514A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- image data
- raw image
- processor
- raw
- electronic device
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/50—Constructional details
- H04N23/54—Mounting of pick-up tubes, electronic image sensors, deviation or focusing coils
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/30—Authentication, i.e. establishing the identity or authorisation of security principals
- G06F21/31—User authentication
- G06F21/32—User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/10—Image acquisition
- G06V10/12—Details of acquisition arrangements; Constructional details thereof
- G06V10/14—Optical characteristics of the device performing the acquisition or on the illumination arrangements
- G06V10/143—Sensing or illuminating at different wavelengths
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/18—Eye characteristics, e.g. of the iris
- G06V40/19—Sensors therefor
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/10—Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/10—Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths
- H04N23/11—Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths for generating image signals from visible and infrared light wavelengths
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/45—Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from two or more image sensors being of different type or operating in different modes, e.g. with a CMOS sensor for moving images in combination with a charge-coupled device [CCD] for still images
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/70—Circuitry for compensating brightness variation in the scene
- H04N23/71—Circuitry for evaluating the brightness variation
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N25/00—Circuitry of solid-state image sensors [SSIS]; Control thereof
- H04N25/10—Circuitry of solid-state image sensors [SSIS]; Control thereof for transforming different wavelengths into image signals
- H04N25/11—Arrangement of colour filter arrays [CFA]; Filter mosaics
- H04N25/13—Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements
- H04N25/131—Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements including elements passing infrared wavelengths
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N25/00—Circuitry of solid-state image sensors [SSIS]; Control thereof
- H04N25/10—Circuitry of solid-state image sensors [SSIS]; Control thereof for transforming different wavelengths into image signals
- H04N25/11—Arrangement of colour filter arrays [CFA]; Filter mosaics
- H04N25/13—Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements
- H04N25/135—Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements based on four or more different wavelength filter elements
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/30—Transforming light or analogous information into electric information
- H04N5/33—Transforming infrared radiation
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/64—Circuits for processing colour signals
- H04N9/646—Circuits for processing colour signals for image enhancement, e.g. vertical detail restoration, cross-colour elimination, contour correction, chrominance trapping filters
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/64—Circuits for processing colour signals
- H04N9/73—Colour balance circuits, e.g. white balance circuits or colour temperature control
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10048—Infrared image
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/80—Camera processing pipelines; Components thereof
- H04N23/84—Camera processing pipelines; Components thereof for processing colour signals
- H04N23/843—Demosaicing, e.g. interpolating colour pixel values
Definitions
- Embodiments disclosed herein relate to a technology for providing a function using an RGB image and an IR image acquired through one image sensor.
- the digital camera may convert light into an electrical image signal and then store the electrical image signal as digital data (image data).
- An electronic device may include an image sensor to generate the image data.
- the image sensor may include millions to tens of millions of unit pixels each having a photoelectric conversion element.
- in the photoelectric conversion element, the movement of electric charges, that is, a current, occurs according to the photoelectric effect.
- the image data may be generated by converting the current into a digital signal.
- the size of the unit pixel of the image sensor as described above is very small.
- an optical color filter may be inserted into each unit pixel, and a filter for blocking infrared light may be inserted between a lens and each unit pixel to obtain image data, particularly, a color image.
- an infrared camera may be mounted on an electronic device.
- the size of the electronic device may increase.
- a camera using one image sensor that obtains both the visible light image and the infrared image may be used.
- the camera using the image sensor may not include a filter for blocking infrared light, and the image sensor may include a unit pixel for receiving visible light and a unit pixel for receiving infrared light.
- infrared components may be mixed into an RGB image because infrared light also reaches the unit pixels for receiving visible light, and the quality of the RGB image may be degraded.
- Embodiments disclosed herein may provide an electronic device that generates and uses a high-quality RGB image and an IR image from one image sensor, by efficiently generating an RGB image from which the infrared component has been removed, using image data obtained through a single image sensor that receives both visible light and infrared light.
- an electronic device according to claim 1 is proposed. Further embodiments are defined by claims 2-13.
- FIG. 1 is a block diagram of an electronic device according to an embodiment.
- an electronic device 100 may include an image sensor 110, a display 120, a memory 130, a photo sensor 140, an IR flash 150, and a processor 160.
- the electronic device 100 may be implemented with some components omitted.
- the electronic device 100 may be implemented with the display 120, the photo sensor 140, or the IR flash 150 omitted.
- the image sensor 110 may generate an electrical signal (e.g., movement of electric charges or current) that is the basis of the image data in response to incident light.
- the image sensor 110 may include a pixel array.
- the pixel array may include a plurality of pixels.
- each of the pixels may include an R subpixel, a G subpixel, a B subpixel, and an IR subpixel.
- the R subpixel may receive red light and infrared light
- G subpixel may receive green light and infrared light
- the B subpixel may receive blue light and infrared light
- the IR subpixel may receive infrared light.
- the pixel array may include a color filter array.
- the color filter array may include a first region through which visible light and infrared light pass and a second region through which infrared light passes.
- the first region may be a region included in the R subpixel, the G subpixel, and the B subpixel, and the second region may be a region included in the IR subpixel.
- the first region may include a region which transmits red light and infrared light, a region which transmits green light and infrared light, and a region which transmits blue light and infrared light.
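As a concrete illustration, the color filter array can be modeled as a repeating unit cell in which the R, G, and B subpixels form the first region (passing visible and infrared light) and the IR subpixel forms the second region (passing only infrared light). The 2×2 layout below is an assumption for illustration only; the text does not fix a specific mosaic geometry.

```python
import numpy as np

# Hypothetical 2x2 RGB-IR unit cell (an assumed layout, not specified
# in the text). "R", "G", "B" cells belong to the first region
# (visible + infrared), "IR" cells to the second region (infrared only).
UNIT_CELL = np.array([["R", "G"],
                      ["B", "IR"]])

def build_cfa(rows, cols):
    """Tile the unit cell into a color filter array of shape (rows, cols)."""
    reps = (rows // 2 + 1, cols // 2 + 1)
    return np.tile(UNIT_CELL, reps)[:rows, :cols]

cfa = build_cfa(4, 4)
first_region = (cfa != "IR")   # subpixels passing visible + infrared light
second_region = (cfa == "IR")  # subpixels passing only infrared light
```

In a 4×4 tile of this assumed cell, three quarters of the subpixels belong to the first region and one quarter to the second region.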
- the image sensor 110 may include an image processor 160, and the image processor 160 may obtain raw image data corresponding to light in an infrared band and a visible light band with respect to an external object.
- the display 120 may include, for example, a liquid crystal display (LCD), a light-emitting diode (LED) display, an organic LED (OLED) display, a microelectromechanical systems (MEMS) display, or an electronic paper display.
- the display 120 may display, for example, an RGB image generated in real time.
- the memory 130 may include a volatile and/or nonvolatile memory 130.
- the memory 130 may store, for example, commands or data related to at least one other component of the electronic device 100.
- the memory 130 may store an image data file as a final result processed by the image processor 160.
- the memory 130 may store a plurality of applications that provide a function using at least one of an RGB image or an IR image.
- the photo sensor 140 may measure at least one of an illuminance or a color temperature corresponding to obtained raw image data. According to an embodiment, the photo sensor 140 may be provided on an outer surface of the electronic device 100.
- the IR flash 150 may output infrared light.
- the IR flash 150 may operate to increase the amount of infrared light reflected from a subject when an IR image is required.
- the processor 160 may be electrically connected to the components included in the electronic device 100 to execute operations or data processing related to the control and/or communication of the components included in the electronic device 100.
- FIG. 2 illustrates a pixel array of the image sensor 110 according to an embodiment.
- the pixel array of FIG. 2 may correspond to a pixel array included in the image sensor 110 of FIG. 1 .
- the pixel array may include a plurality of pixels (e.g., millions to tens of millions).
- each of the pixels may include an R subpixel, a G subpixel, a B subpixel, and an IR subpixel.
- each of the pixels may include a first region 210 which transmits visible light and infrared light and a second region 220 which transmits infrared light.
- the image sensor 110 may include a first set of sensor pixels arranged to respond to light passing through a filter configured to transmit at least some light in the infrared band and the visible light band and a second set of sensor pixels arranged to respond to light passing through a filter configured to transmit at least some light in the infrared band and block light in the visible light band.
- the subpixel may include a micro lens 211, various films or filters 212 and 213, and a photoelectric conversion element 214.
- the subpixel may further include other components such as various conductor patterns that electrically connect the photoelectric conversion element 214 and the image processor 160.
- the micro lens 211 may collect incident light such that the incident light reaches a top of the photoelectric conversion element 214.
- the incident light may be refracted by the micro lens 211 to form a condensed spot (also referred to as an optical spot) on the photoelectric conversion element 214.
- the color filter 212 may be disposed under the micro lens 211 to transmit light having a specified color, that is, light having a specified wavelength range.
- the color filter 212 may include a primary color filter (e.g., R, G, or B) or an infrared filter.
- a bundle of color filters may correspond to a color filter array.
- the anti-reflection film 213 may increase the amount of light reaching the photoelectric conversion element 214 by preventing light incident through the micro lens 211 from being reflected.
- the photoelectric conversion element 214 may correspond to, for example, a photo diode formed on a semiconductor substrate.
- the photoelectric conversion element 214 may output an electrical signal in response to the incident light according to the photoelectric effect.
- the photoelectric conversion element 214 may generate electric charges (or current) according to the intensity (or the amount of light) of received light. An output value may be determined based on the amount of the charges (or current).
- FIG. 3 is a flowchart illustrating a method of providing a function of an application using raw image data according to a type of a required image according to an embodiment.
- an operation described as being performed by the electronic device 100 may be understood to be controlled by the processor 160 of the electronic device 100.
- Operations described as being performed by the electronic device 100 may be implemented as instructions (commands) that may be performed (or executed) by the processor 160 of the electronic device 100.
- the instructions may be stored in, for example, a computer recording medium or the memory 130 of the electronic device 100 shown in FIG. 1 .
- the processor 160 may execute any one of a plurality of applications stored in the memory 130.
- the processor 160 may obtain raw image data through the image sensor 110.
- the raw image data may include a value according to the magnitude of an electrical signal output by each subpixel of the image sensor 110.
- the raw image data may have a value according to the magnitude of an electrical signal output by an R subpixel, a value according to the magnitude of an electrical signal output by a G subpixel, a value according to the magnitude of an electrical signal output by a B subpixel, and a value according to the magnitude of an electrical signal output by an IR subpixel.
- the raw image data may include a first image data set which is data obtained through a first region of a color filter array, which transmits visible light and infrared light and a second image data set which is data obtained through a second region of the color filter array, which transmits infrared light.
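A minimal sketch of splitting the raw mosaic data into the first and second image data sets, assuming a hypothetical repeating 2×2 unit cell (R, G / B, IR); the actual mosaic layout is not specified in the text:

```python
import numpy as np

def second_region_mask(rows, cols):
    """Boolean mask of IR-only (second region) subpixel positions,
    assuming the IR cell sits at the bottom-right of each 2x2 cell."""
    mask = np.zeros((rows, cols), dtype=bool)
    mask[1::2, 1::2] = True
    return mask

def split_raw(raw):
    """Split raw mosaic data into the first image data set (visible + IR
    subpixels) and the second image data set (IR-only subpixels)."""
    mask = second_region_mask(*raw.shape)
    return raw[~mask], raw[mask]

raw = np.arange(16).reshape(4, 4)  # toy raw mosaic values
first_set, second_set = split_raw(raw)
```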
- the processor 160 may obtain first raw image data by using the first image data set.
- the processor 160 may correct an infrared component of the raw image data by further using the second image data set, and obtain first raw image data corresponding to the first region from the corrected raw image data.
- the first raw image data may be data corresponding to the first region of the color filter array among the corrected raw image data.
- the processor 160 may generate an RGB image based on the first raw image data.
- the processor 160 may obtain third raw image data corresponding to the second region and including a visible light component, an infrared component, or both the visible light component and the infrared component, using the first raw image data.
- the processor 160 may interpolate the RGB image using the first raw image data and the third raw image data.
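The interpolation that fills in values at second-region (IR) positions from surrounding first-region data can be sketched as follows; simple 4-neighbor averaging is an assumed stand-in, since the text does not prescribe a specific interpolation method:

```python
import numpy as np

def fill_missing(plane, mask):
    """Fill positions where mask is True with the mean of their valid
    4-neighbors - a simple stand-in for producing the third raw image
    data at second-region positions from first raw image data."""
    out = plane.astype(float).copy()
    rows, cols = plane.shape
    for r, c in zip(*np.where(mask)):
        neighbors = []
        for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
            rr, cc = r + dr, c + dc
            if 0 <= rr < rows and 0 <= cc < cols and not mask[rr, cc]:
                neighbors.append(out[rr, cc])
        if neighbors:
            out[r, c] = sum(neighbors) / len(neighbors)
    return out

plane = np.array([[1.0, 2.0],
                  [3.0, 0.0]])           # 0.0 is a placeholder at the IR position
mask = np.array([[False, False],
                 [False, True]])         # True marks the position to fill
filled = fill_missing(plane, mask)       # (1,1) becomes (2.0 + 3.0) / 2
```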
- the processor 160 may generate an RGB image related to an external object by using the first raw image data corresponding to the light in the visible light band obtained through the image sensor 110.
- the processor 160 may provide a function of using the application by using the generated RGB image.
- the processor 160 may display an RGB image on the display 120 or store the RGB image in the memory 130.
- the processor 160 may obtain second raw image data by using the second image data set.
- the processor 160 may correct an infrared component of the raw image data by using the second image data set, and obtain second raw image data corresponding to the second region from the corrected raw image data.
- the second raw image data may be data corresponding to the second region of the color filter array among the corrected raw image data.
- the processor 160 may generate an IR image based on the second raw image data.
- the processor 160 may obtain fourth raw image data corresponding to the first region and including an infrared component by using the second raw image data.
- the processor 160 may interpolate the IR image using the second raw image data and the fourth raw image data.
- the processor 160 may provide a function of using the application by using the generated IR image.
- the processor 160 may display the IR image on the display 120 or perform biometric authentication (e.g., iris authentication) using the IR image.
- the processor 160 may generate an IR image by using the second raw image data corresponding to the light in the infrared light band obtained through the image sensor 110, and perform biometric authentication related to an external object using the generated IR image.
- the processor 160 may obtain distance information (or depth information) between the electronic device 100 and the object using the IR image.
- the processor 160 may output IR light of a predetermined pattern through the IR flash 150 and obtain distance information to the object by using the IR light of the pattern included in the obtained IR image.
- FIG. 4 is a flowchart illustrating a method of correcting an infrared component of raw image data and providing a function of an application using the corrected RGB image, according to an embodiment.
- an operation described as being performed by the electronic device 100 may be understood to be controlled by the processor 160 of the electronic device 100.
- Operations described as being performed by the electronic device 100 may be implemented as instructions (commands) that may be performed (or executed) by the processor 160 of the electronic device 100.
- the instructions may be stored in, for example, a computer recording medium or the memory 130 of the electronic device 100 shown in FIG. 1 .
- the above-described operations 305a to 309a may include operations 401 to 417 of FIG. 4 . According to an embodiment, some of operations 401 to 417 may be omitted.
- operation 401, operation 407a, or operation 407b may be performed after operation 303 of FIG. 3 .
- the processor 160 may obtain first raw image data using the first image data set.
- the first range may be defined as an illuminance corresponding to the raw image data being lower than a predetermined illuminance.
- the first image data set may be image data obtained through an R subpixel, a G subpixel, and a B subpixel.
- the first image data set may include a visible light component and an infrared component.
- the processor 160 may obtain third raw image data corresponding to the second region and including the visible light component and the infrared component using the first raw image data.
- the first raw image data may include a component related to light incident to the first region of the color filter array and may not include a component related to light incident to the second region.
- the processor 160 may interpolate the third raw image data corresponding to the second region by using the first raw image data.
- the processor 160 may obtain RGB image data corresponding to all regions of a color filter array by obtaining the third raw image data.
- the processor 160 may generate an RGB image based on the first raw image data and the third raw image data.
- the processor 160 may interpolate the RGB image by using the first raw image data and the third raw image data.
- the processor 160 may demosaic the first raw image data and the third raw image data to generate the RGB image.
- the processor 160 may correct an infrared component of the raw image data based on the second image data set.
- the second range may be defined as an illuminance corresponding to the raw image data being equal to or higher than a predetermined illuminance.
- the processor 160 may generate an RGB image using the first image data set.
- the processor 160 may generate an RGB image in which a component corresponding to the infrared light band in the first image data set is corrected using at least some of the second image data set.
- an RGB image of improved quality may be generated even in a low-illuminance environment by obtaining an RGB image including an IR component.
- the processor 160 may obtain an RGB image including an IR component to provide various functions, not only when illuminance is low (operations 401, 403, and 405).
- the processor 160 may obtain an RGB image including an IR component to smooth the texture of human skin to be photographed.
- the processor 160 may obtain an RGB image including an IR component to optimize a dynamic range of the RGB image.
- the processor 160 may obtain an RGB image including an IR component to obtain an effect of removing fog from the RGB image.
- the processor 160 may correct an IR component of the raw image data based on the second image data set according to a first method.
- the first range may be defined as a color temperature corresponding to the raw image data being lower than a predetermined color temperature.
- the processor 160 may correct an IR component of the raw image data based on the second image data set according to a second method different from the first method.
- the second range may be defined as a color temperature corresponding to the raw image data being equal to or higher than a predetermined color temperature.
- the processor 160 may correct the infrared component of the raw image data in a method determined according to the color temperature of the raw image data.
- the processor 160 may perform correction by subtracting, from the value of the RGB component obtained through an RGB subpixel, the value of the IR component obtained through the IR subpixel multiplied by a predetermined coefficient corresponding to the type of that RGB subpixel.
- the raw image data may include a value of a component corresponding to an R subpixel, a value of a component corresponding to a G subpixel, a value of a component corresponding to a B subpixel, and a value of a component corresponding to an IR subpixel, which correspond to one pixel.
- the value of the component corresponding to the R subpixel may be 3, the value of the component corresponding to the G subpixel may be 3.5, the value of the component corresponding to the B subpixel may be 4, and the value of the component corresponding to the IR subpixel may be 1.
- the coefficient corresponding to the R subpixel may be 1.24
- the coefficient corresponding to the G subpixel may be 1.03
- the coefficient corresponding to the B subpixel may be 0.84.
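Using the example component values and coefficients above, the first correction method can be sketched as follows (wrapping it in a helper function is illustrative, not part of the text):

```python
# Coefficients per RGB subpixel type, taken from the example in the text.
IR_COEF = {"R": 1.24, "G": 1.03, "B": 0.84}

def correct_rgb(value, subpixel_type, ir_value):
    """First correction method: subtract the IR component scaled by the
    coefficient for this subpixel type from the measured RGB component."""
    return value - IR_COEF[subpixel_type] * ir_value

# Example values from the text: R = 3, G = 3.5, B = 4, IR = 1.
r_corr = correct_rgb(3.0, "R", 1.0)   # 3   - 1.24 * 1 = 1.76
g_corr = correct_rgb(3.5, "G", 1.0)   # 3.5 - 1.03 * 1 = 2.47
b_corr = correct_rgb(4.0, "B", 1.0)   # 4   - 0.84 * 1 = 3.16
```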
- the processor 160 may correct at least a part of the second image data set using a first parameter when the color temperature corresponding to the raw image data falls within the first range, and may correct at least a part of the second image data set using a second parameter when the color temperature falls within the second range.
- the processor 160 may correct a component corresponding to the infrared band in the first image data set by using at least a part of the corrected second image data set.
- the processor 160 may correct a component corresponding to the infrared light of the raw image data based further on at least a part of the first image data set.
- the processor 160 may correct an infrared component of the raw image data by using the following matrix formula.
- [R'; G'; B'; IR'] = [a b c d; e f g h; i j k l; m n o p] × [R; G; B; IR]
- R', G', B', and IR' in the matrix may be respectively a value of a component corresponding to an R subpixel, a value of a component corresponding to a G subpixel, a value of a component corresponding to a B subpixel and a value of a component corresponding to an IR subpixel, which are included in the corrected raw image data.
- R, G, B, and IR in the matrix may be respectively a value of a component corresponding to an R subpixel, a value of a component corresponding to a G subpixel, a value of a component corresponding to a B subpixel and a value of a component corresponding to an IR subpixel, which are included in the raw image data before correction.
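A sketch of applying this 4×4 linear correction per pixel; the entries a through p are calibration coefficients, and the placeholder values below (an identity with an IR-subtraction column, reusing the example coefficients of the first method) are assumptions for illustration:

```python
import numpy as np

# Placeholder 4x4 correction matrix (a..p in the formula). Real values
# would be calibrated; this choice reproduces the simple subtraction
# method as a special case.
M = np.array([
    [1.0, 0.0, 0.0, -1.24],
    [0.0, 1.0, 0.0, -1.03],
    [0.0, 0.0, 1.0, -0.84],
    [0.0, 0.0, 0.0,  1.0 ],
])

def correct_pixel(r, g, b, ir):
    """Apply [R', G', B', IR']^T = M x [R, G, B, IR]^T."""
    return M @ np.array([r, g, b, ir])

corrected = correct_pixel(3.0, 3.5, 4.0, 1.0)
```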
- the processor 160 may correct an infrared component of the raw image data by using the following matrix formula.
- [R'; G'; B'; IR'] = M(4×11) × [R; G; B; IR; R²; G²; B²; IR²; R×IR; G×IR; B×IR]
- R', G', B', and IR' in the matrix may be respectively a value of a component corresponding to an R subpixel, a value of a component corresponding to a G subpixel, a value of a component corresponding to a B subpixel and a value of a component corresponding to an IR subpixel, which are included in the corrected raw image data.
- R, G, B, and IR in the matrix may be respectively a value of a component corresponding to an R subpixel, a value of a component corresponding to a G subpixel, a value of a component corresponding to a B subpixel and a value of a component corresponding to an IR subpixel, which are included in the raw image data before correction.
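A sketch of the second correction method, which extends the per-pixel input to an 11-element vector of linear, squared, and cross terms; the 4×11 matrix entries below are placeholders, since real coefficients would be calibrated for the target color temperature range:

```python
import numpy as np

def feature_vector(r, g, b, ir):
    """Build the 11-element vector [R, G, B, IR, R^2, G^2, B^2, IR^2,
    R*IR, G*IR, B*IR] used by the second correction method."""
    return np.array([r, g, b, ir,
                     r * r, g * g, b * b, ir * ir,
                     r * ir, g * ir, b * ir])

# Placeholder 4x11 matrix: identity on the linear terms, zeros elsewhere.
# Calibrated coefficients would also weight the squared and cross terms.
M2 = np.zeros((4, 11))
M2[0, 0] = M2[1, 1] = M2[2, 2] = M2[3, 3] = 1.0

def correct_pixel_poly(r, g, b, ir):
    """Apply [R', G', B', IR']^T = M2 (4x11) x feature_vector."""
    return M2 @ feature_vector(r, g, b, ir)

out = correct_pixel_poly(3.0, 3.5, 4.0, 1.0)
```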
- when the color temperature of the raw image data falls within the first range, the processor 160 may perform correction using the first method described above (subtracting, from the value of the RGB component obtained through an RGB subpixel, the value of the IR component obtained through the IR subpixel multiplied by a predetermined coefficient corresponding to the type of the RGB subpixel), and when the color temperature falls within the second range (e.g., the color temperature range of incandescent light), the processor 160 may perform correction using the second method described above (correcting the infrared component of the raw image data using the matrix formula).
- the processor 160 may obtain first raw image data by using at least a part of the corrected raw image data.
- the first raw image data, which is raw image data corresponding to a first region of the corrected raw image data, may be obtained.
- the corrected raw image data as described above may include a value R' of a component corresponding to an R subpixel, a value G' of a component corresponding to a G subpixel, a value B' of a component corresponding to a B subpixel, and a value IR' of the component corresponding to an IR subpixel.
- the processor 160 may obtain the value R' of the component corresponding to the R subpixel corresponding to the first region, the value G' of the component corresponding to the G subpixel corresponding to the first region, and the value B' of the component corresponding to the B subpixel corresponding to the first region, among the values of the plurality of components.
- the processor 160 may obtain third raw image data corresponding to the second region and including a visible light component and an infrared component using the first raw image data, and obtain fourth raw image data corresponding to the first region and including an infrared component using the second raw image data.
- the first raw image data may include a component related to light incident to the first region of the color filter array and may not include a component related to light incident to the second region.
- the processor 160 may interpolate the third raw image data corresponding to the second region by using the first raw image data.
- the processor 160 may obtain RGB image data corresponding to all regions of a color filter array by obtaining the third raw image data.
- the second raw image data may include a component related to light incident to the second region of the color filter array and may not include a component related to light incident to the first region.
- the processor 160 may interpolate the fourth raw image data corresponding to the first region by using the second raw image data.
- the processor 160 may obtain IR image data corresponding to all regions of a color filter array by obtaining the fourth raw image data.
- FIG. 5 is a diagram illustrating a process of obtaining third raw image data and fourth raw image data by using a part of raw image data according to an embodiment.
- corrected raw image data may include a value 510 of a component corresponding to a first region and a value 520 of a component corresponding to a second region.
- the processor 160 may obtain first raw image data including a value 511 of a component corresponding to an R subpixel, a value 512 of a component corresponding to a G subpixel, and a value 513 of a component corresponding to a B subpixel, and obtain second raw image data including the value 520 of a component corresponding to an IR subpixel.
- the processor 160 may interpolate third raw image data 530 corresponding to the second region using the first raw image data. Although FIG. 5 illustrates the use of values of three components, the illustrated pixel may have a plurality of neighboring pixels, and the processor 160 may interpolate the third raw image data 530 using the first raw image data corresponding to those neighboring pixels.
- the processor 160 may interpolate the third raw image data 530 using the value 512 of the component corresponding to the G subpixel as it is.
- the processor 160 may process values of components corresponding to neighboring subpixels according to a predetermined algorithm, and interpolate the third raw image data 530 using the processed values.
- the processor 160 may interpolate the third raw image data 530 by using the value 512 of the component corresponding to the G subpixel and an average value of the values of the components corresponding to the G subpixels of the neighboring other pixels.
- the processor 160 may interpolate the fourth raw image data 540 corresponding to the first region using the second raw image data.
- the processor 160 may interpolate the fourth raw image data 540 using the second raw image data corresponding to the neighboring other pixels.
- the fourth raw image data 540 may include a value 541 of a component corresponding to the R subpixel, a value 542 of a component corresponding to the G subpixel, and a value 543 of a component corresponding to the B subpixel.
- the processor 160 may interpolate the fourth raw image data 540 using the second raw image data 520 as it is.
- the processor 160 may process values of components corresponding to IR subpixels included in neighboring pixels according to a predetermined algorithm, and interpolate the fourth raw image data 540 using the processed values.
- the processor 160 may interpolate the fourth raw image data 540 using an average value of values of components corresponding to the IR subpixels of the neighboring pixels.
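The neighbor-averaging interpolation described for both the third and the fourth raw image data can be sketched as follows. `interpolate_missing` is a hypothetical helper that fills each unknown site with the mean of the valid values in its 3×3 window, a simplified stand-in for the "predetermined algorithm":

```python
import numpy as np

def interpolate_missing(plane):
    """Fill NaN sites of a single-component plane with the average of
    the valid values in the surrounding 3x3 window (simple
    neighbor-averaging interpolation)."""
    h, w = plane.shape
    out = plane.copy()
    for y in range(h):
        for x in range(w):
            if np.isnan(plane[y, x]):
                ys = slice(max(y - 1, 0), y + 2)
                xs = slice(max(x - 1, 0), x + 2)
                # nanmean ignores the NaN sites still present in the window
                out[y, x] = np.nanmean(plane[ys, xs])
    return out
```

Running it on the RGB plane produced above yields the third raw image data covering the IR sites; running it on the IR plane yields the fourth raw image data covering the RGB sites.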
- the processor 160 may remove infrared components included in the first raw image data and the third raw image data based on the second raw image data and the fourth raw image data.
- the first raw image data and the third raw image data may include visible light components and infrared components
- the second raw image data and fourth raw image data may include only infrared components.
- the processor 160 may remove infrared components by subtracting a value of a component corresponding to each subpixel included in the second raw image data and the fourth raw image data from a value of a component corresponding to the same subpixel included in the first raw image data and the third raw image data.
- the processor 160 may subtract the value of the component corresponding to the R subpixel of the fourth raw image data from the value of the component corresponding to the R subpixel of the first raw image data.
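This per-subpixel subtraction can be sketched as below; clipping at zero is an added safeguard against noise driving components negative, not something stated in the text:

```python
import numpy as np

def remove_infrared(visible_plus_ir, ir_estimate):
    """Subtract the per-site IR estimate (second/fourth raw image data)
    from the visible-plus-IR values (first/third raw image data),
    clipping at zero so noise cannot produce negative components."""
    return np.clip(visible_plus_ir - ir_estimate, 0, None)
```

The result is the visible-light-only data from which the RGB image is demosaiced.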
- the processor 160 may generate an RGB image based on the first raw image data and the third raw image data from which the infrared components have been removed. According to an example, the processor 160 may interpolate the RGB image by using the first raw image data and the third raw image data from which the infrared components have been removed. For example, the processor 160 may demosaic the first raw image data and the third raw image data from which the infrared components have been removed to generate an RGB image.
- the processor 160 may provide a function of an application using the RGB image.
- the processor 160 may display an RGB image obtained in real time on the display 120 and store a still image or a moving image in the memory 130 according to a user input.
- a range of illuminance corresponding to the raw image data is determined, and a range of color temperature corresponding to the raw image data is then determined.
- alternatively, only a range of illuminance corresponding to the raw image data may be determined, or only a range of color temperature may be determined.
- the processor may perform operation (operation 407a) of correcting the infrared component of the raw image data based on the second image data set according to the first method when the range of the color temperature falls within the first range, and perform operation (operation 407b) of correcting the infrared component of the raw image data based on the second image data set according to the second method when the range of the color temperature falls within the second range.
- the processor may perform operation (operation 401) of obtaining first raw image data using the first image data set when the range of illuminance falls within the first range and perform operation (operation 407a or operation 407b) of correcting an infrared component of the raw image data based on the second image data set when the range of illuminance falls within the second range.
- FIG. 6 is a flowchart illustrating a method of providing a function of an application using an IR image according to an embodiment.
- an operation described as being performed by the electronic device 100 may be understood to be controlled by the processor 160 of the electronic device 100.
- Operations described as being performed by an electronic device 100 may be implemented as instructions (commands) that may be performed (or executed) by the processor 160 of the electronic device 100.
- the instructions may be stored in, for example, a computer recording medium or the memory 130 of the electronic device 100 shown in FIG. 1 .
- the above-described operations 305b to 309b may include operations 601 to 607 of FIG. 6 . According to an embodiment, some of operations 601 to 607 may be omitted.
- operation 601 may be performed after operation 303 of FIG. 3 .
- the processor 160 may output IR light through the IR flash 150.
- the processor 160 may obtain second raw image data using a second image data set.
- the second image data set may be image data obtained through an IR pixel.
- the second image data set may include an infrared component.
- the processor 160 may obtain fourth raw image data corresponding to a first region and including an infrared component by using the second raw image data.
- the second raw image data may include a component related to light incident to the second region of the color filter array and may not include a component related to light incident to the first region.
- the processor 160 may interpolate the fourth raw image data corresponding to the first region by using the second raw image data.
- the processor 160 may obtain IR image data corresponding to all regions of the color filter array by obtaining the fourth raw image data. A detailed method of obtaining the fourth raw image data by the processor 160 is the same as described above with reference to FIG. 5 .
- the processor 160 may generate an IR image based on the second raw image data and the fourth raw image data.
- the processor 160 may interpolate the IR image by using the second raw image data and the fourth raw image data.
- the processor 160 may demosaic the second raw image data and the fourth raw image data to generate an IR image.
- the processor 160 may provide a function of an application using the IR image.
- the processor 160 may perform iris authentication by comparing an iris pattern included in the obtained IR image with an iris pattern stored in the memory 130.
- the processor 160 may generate an RGB image according to the method described with reference to FIG. 4 . According to an example, the processor 160 may display the generated RGB image on the display 120, and provide a function using the IR image while the RGB image is displayed on the display 120. A detailed example thereof will be described with reference to FIG. 7 .
- FIG. 7 is a flowchart illustrating a method of performing biometric authentication using an IR image while displaying an RGB image, according to an example.
- the electronic device 100 of FIG. 1 performs the process of FIG. 7 .
- an operation described as being performed by the electronic device 100 with reference to FIG. 7 may be understood as being controlled by the processor 160 of the electronic device 100.
- Operations described as being performed by an electronic device 100 may be implemented as instructions (commands) that may be performed (or executed) by the processor 160 of the electronic device 100.
- the instructions may be stored in, for example, a computer recording medium or the memory 130 of the electronic device 100 shown in FIG. 1 .
- the processor 160 may execute an application associated with biometric authentication.
- the processor 160 may execute a card payment application that authenticates a user according to a biometric authentication method.
- biometric authentication may correspond to iris recognition.
- the processor 160 may obtain image data including eyes of a user through the image sensor 110. According to an embodiment, the processor 160 may output infrared light through the IR flash 150 before operation 703. According to an embodiment, the processor 160 may continuously output infrared light through the IR flash 150 until user authentication is completed.
- the processor 160 may display, on the display 120, at least a portion of an RGB image generated based at least on first information obtained by an R subpixel, a G subpixel, and a B subpixel of the image data.
- the first information may include first image data set, first raw image data, or corrected first raw image data which are described above.
- the processor 160 may generate an RGB image according to the method described above with reference to FIG. 4 , and display at least a portion of the generated RGB image on the display 120.
- the processor 160 may authenticate a user based on second information obtained by the IR subpixel while the generated RGB image is displayed on the display 120.
- the second information may include second image data set, second raw image data, or corrected second raw image data which are described above.
- the processor 160 may generate an IR image by the method described above with reference to FIG. 6 .
- the second information may include iris image data obtained from the eyes of the user.
- the processor 160 may perform user authentication by comparing the obtained iris image data with iris image data stored in the memory 130.
- the processor 160 may provide a function of an application according to a result of the user authentication. For example, when the user authentication is successful, the processor 160 may perform card payment.
- the processor 160 may output flickering infrared light through the IR flash 150.
- the processor 160 may display, on the display 120, at least a portion of the RGB image generated based at least on the first information obtained by the R subpixel, the G subpixel, and the B subpixel of the image data corresponding to a time point at which the infrared light is not output.
- the processor 160 may authenticate the user based on the second information obtained by the IR subpixel of the image data corresponding to the time point at which the infrared light is output.
- FIG. 8 is a block diagram of an electronic device in a network environment according to various embodiments.
- an electronic device 801 may communicate with an electronic device 802 through a first network 898 (e.g., a short-range wireless communication) or may communicate with an electronic device 804 or a server 808 through a second network 899 (e.g., a long-distance wireless communication) in a network environment 800.
- the electronic device 801 may communicate with the electronic device 804 through the server 808.
- the electronic device 801 may include a processor 820, a memory 830, an input device 850, a sound output device 855, a display device 860, an audio module 870, a sensor module 876, an interface 877, a haptic module 879, a camera module 880, a power management module 888, a battery 889, a communication module 890, a subscriber identification module 896, and an antenna module 897.
- at least one (e.g., the display device 860 or the camera module 880) of the components of the electronic device 801 may be omitted or other components may be added to the electronic device 801.
- some components may be integrated and implemented as in the case of the sensor module 876 (e.g., a fingerprint sensor, an iris sensor, or an illuminance sensor) embedded in the display device 860 (e.g., a display).
- the processor 820 may operate, for example, software (e.g., a program 840) to control at least one of other components (e.g., a hardware or software component) of the electronic device 801 connected to the processor 820 and may process and compute a variety of data.
- the processor 820 may load a command set or data, which is received from other components (e.g., the sensor module 876 or the communication module 890), into a volatile memory 832, may process the loaded command or data, and may store result data into a nonvolatile memory 834.
- the processor 820 may include a main processor 821 (e.g., a central processing unit or an application processor) and an auxiliary processor 823 (e.g., a graphic processing device, an image signal processor, a sensor hub processor, or a communication processor), which operates independently from the main processor 821, additionally or alternatively uses less power than the main processor 821, or is specified to a designated function.
- the auxiliary processor 823 may operate separately from the main processor 821 or may be embedded in the main processor 821.
- the auxiliary processor 823 may control, for example, at least some of functions or states associated with at least one component (e.g., the display device 860, the sensor module 876, or the communication module 890) among the components of the electronic device 801 instead of the main processor 821 while the main processor 821 is in an inactive (e.g., sleep) state or together with the main processor 821 while the main processor 821 is in an active (e.g., an application execution) state.
- the auxiliary processor 823 may be implemented as a part of another component (e.g., the camera module 880 or the communication module 890) that is functionally related to the auxiliary processor 823.
- the memory 830 may store a variety of data used by at least one component (e.g., the processor 820 or the sensor module 876) of the electronic device 801, for example, software (e.g., the program 840) and input data or output data with respect to commands associated with the software.
- the memory 830 may include the volatile memory 832 or the nonvolatile memory 834.
- the program 840 may be stored in the memory 830 as software and may include, for example, an operating system 842, a middleware 844, or an application 846.
- the input device 850 may be a device for receiving a command or data, which is used for a component (e.g., the processor 820) of the electronic device 801, from an outside (e.g., a user) of the electronic device 801 and may include, for example, a microphone, a mouse, or a keyboard.
- the sound output device 855 may be a device for outputting a sound signal to the outside of the electronic device 801 and may include, for example, a speaker used for general purposes, such as multimedia playback or recording playback, and a receiver used only for receiving calls. According to an example, the receiver and the speaker may be either integrally or separately implemented.
- the display device 860 may be a device for visually presenting information to the user of the electronic device 801 and may include, for example, a display, a hologram device, or a projector and a control circuit for controlling a corresponding device.
- the display device 860 may include a touch circuitry or a pressure sensor for measuring an intensity of pressure on the touch.
- the audio module 870 may convert a sound into an electrical signal and vice versa. According to an example, the audio module 870 may obtain the sound through the input device 850 or may output the sound through the sound output device 855 or an external electronic device (e.g., the electronic device 802 (e.g., a speaker or a headphone)) wired or wirelessly connected to the electronic device 801.
- the sensor module 876 may generate an electrical signal or a data value corresponding to an operating state (e.g., power or temperature) inside or an environmental state outside the electronic device 801.
- the sensor module 876 may include, for example, a gesture sensor, a gyro sensor, a barometric pressure sensor, a magnetic sensor, an acceleration sensor, a grip sensor, a proximity sensor, a color sensor, an infrared sensor, a biometric sensor, a temperature sensor, a humidity sensor, or an illuminance sensor.
- the interface 877 may support a designated protocol wired or wirelessly connected to the external electronic device (e.g., the electronic device 802).
- the interface 877 may include, for example, an HDMI (high-definition multimedia interface), a USB (universal serial bus) interface, an SD card interface, or an audio interface.
- a connecting terminal 878 may include a connector that physically connects the electronic device 801 to the external electronic device (e.g., the electronic device 802), for example, an HDMI connector, a USB connector, an SD card connector, or an audio connector (e.g., a headphone connector).
- the haptic module 879 may convert an electrical signal to a mechanical stimulation (e.g., vibration or movement) or an electrical stimulation perceived by the user through tactile or kinesthetic sensations.
- the haptic module 879 may include, for example, a motor, a piezoelectric element, or an electric stimulator.
- the camera module 880 may shoot a still image or a video image.
- the camera module 880 may include, for example, at least one lens, an image sensor, an image signal processor, or a flash.
- the power management module 888 may be a module for managing power supplied to the electronic device 801 and may serve as at least a part of a power management integrated circuit (PMIC).
- the battery 889 may be a device for supplying power to at least one component of the electronic device 801 and may include, for example, a non-rechargeable (primary) battery, a rechargeable (secondary) battery, or a fuel cell.
- the communication module 890 may establish a wired or wireless communication channel between the electronic device 801 and the external electronic device (e.g., the electronic device 802, the electronic device 804, or the server 808) and support communication execution through the established communication channel.
- the communication module 890 may include at least one communication processor operating independently from the processor 820 (e.g., the application processor) and supporting the wired communication or the wireless communication.
- the communication module 890 may include a wireless communication module 892 (e.g., a cellular communication module, a short-range wireless communication module, or a GNSS (global navigation satellite system) communication module) or a wired communication module 894 (e.g., an LAN (local area network) communication module or a power line communication module) and may communicate with the external electronic device using a corresponding communication module among them through the first network 898 (e.g., the short-range communication network such as a Bluetooth, a WiFi direct, or an IrDA (infrared data association)) or the second network 899 (e.g., the long-distance wireless communication network such as a cellular network, an internet, or a computer network (e.g., LAN or WAN)).
- the above-mentioned various communication modules 890 may be implemented as a single chip or as separate chips, respectively.
- the wireless communication module 892 may identify and authenticate the electronic device 801 using user information stored in the subscriber identification module 896 in the communication network.
- the antenna module 897 may include one or more antennas to transmit or receive the signal or power to or from an external source.
- Some components among the components may be connected to each other through a communication method (e.g., a bus, a GPIO (general purpose input/output), an SPI (serial peripheral interface), or an MIPI (mobile industry processor interface)) used between peripheral devices to exchange signals (e.g., a command or data) with each other.
- the command or data may be transmitted or received between the electronic device 801 and the external electronic device 804 through the server 808 connected to the second network 899.
- Each of the electronic devices 802 and 804 may be a device of the same type as or a different type from the electronic device 801.
- all or some of the operations performed by the electronic device 801 may be performed by another electronic device or a plurality of external electronic devices.
- the electronic device 801 may request the external electronic device to perform at least some of the functions related to the functions or services, in addition to or instead of performing the functions or services by itself.
- the external electronic device receiving the request may carry out the requested function or the additional function and transmit the result to the electronic device 801.
- the electronic device 801 may provide the requested functions or services based on the received result as is or after additionally processing the received result.
- for example, cloud computing, distributed computing, or client-server computing technology may be used.
- Fig. 9 is a block diagram 900 illustrating the camera module 880 according to various embodiments.
- the camera module 880 may include a lens assembly 910, a flash 920, an image sensor 930, an image stabilizer 940, memory 950 (e.g., buffer memory), or an image signal processor 960.
- the lens assembly 910 may collect light emitted or reflected from an object whose image is to be taken.
- the lens assembly 910 may include one or more lenses.
- the camera module 880 may include a plurality of lens assemblies 910. In such a case, the camera module 880 may form, for example, a dual camera, a 360-degree camera, or a spherical camera.
- Some of the plurality of lens assemblies 910 may have the same lens attribute (e.g., view angle, focal length, auto-focusing, f number, or optical zoom), or at least one lens assembly may have one or more lens attributes different from those of another lens assembly.
- the lens assembly 910 may include, for example, a wide-angle lens or a telephoto lens.
- the flash 920 may emit light that is used to reinforce light reflected from an object.
- the flash 920 may include one or more light emitting diodes (LEDs) (e.g., a red-green-blue (RGB) LED, a white LED, an infrared (IR) LED, or an ultraviolet (UV) LED) or a xenon lamp.
- the image sensor 930 may obtain an image corresponding to an object by converting light emitted or reflected from the object and transmitted via the lens assembly 910 into an electrical signal.
- the image sensor 930 may include one selected from image sensors having different attributes, such as a RGB sensor, a black-and-white (BW) sensor, an IR sensor, or a UV sensor, a plurality of image sensors having the same attribute, or a plurality of image sensors having different attributes.
- Each image sensor included in the image sensor 930 may be implemented using, for example, a charged coupled device (CCD) sensor or a complementary metal oxide semiconductor (CMOS) sensor.
- the image stabilizer 940 may move the image sensor 930 or at least one lens included in the lens assembly 910 in a particular direction, or control an operational attribute (e.g., adjust the read-out timing) of the image sensor 930 in response to the movement of the camera module 880 or the electronic device 801 including the camera module 880. This allows compensation for at least part of a negative effect (e.g., image blurring) caused by the movement on an image being captured.
- the image stabilizer 940 may sense such a movement by the camera module 880 or the electronic device 801 using a gyro sensor (not shown) or an acceleration sensor (not shown) disposed inside or outside the camera module 880.
- the image stabilizer 940 may be implemented, for example, as an optical image stabilizer.
- the memory 950 may store, at least temporarily, at least part of an image obtained via the image sensor 930 for a subsequent image processing task. For example, if image capturing is delayed due to shutter lag or multiple images are quickly captured, a raw image obtained (e.g., a Bayer-patterned image, a high-resolution image) may be stored in the memory 950, and its corresponding copy image (e.g., a low-resolution image) may be previewed via the display device 860. Thereafter, if a specified condition is met (e.g., by a user's input or system command), at least part of the raw image stored in the memory 950 may be obtained and processed, for example, by the image signal processor 960. According to an example, the memory 950 may be configured as at least part of the memory 830 or as a separate memory that is operated independently from the memory 830.
- the image signal processor 960 may perform one or more image processing with respect to an image obtained via the image sensor 930 or an image stored in the memory 950.
- the one or more image processing may include, for example, depth map generation, three-dimensional (3D) modeling, panorama generation, feature point extraction, image synthesizing, or image compensation (e.g., noise reduction, resolution adjustment, brightness adjustment, blurring, sharpening, or softening).
- the image signal processor 960 may perform control (e.g., exposure time control or read-out timing control) with respect to at least one (e.g., the image sensor 930) of the components included in the camera module 880.
- An image processed by the image signal processor 960 may be stored back in the memory 950 for further processing, or may be provided to an external component (e.g., the memory 830, the display device 860, the electronic device 802, the electronic device 804, or the server 808) outside the camera module 880.
- the image signal processor 960 may be configured as at least part of the processor 820, or as a separate processor that is operated independently from the processor 820. If the image signal processor 960 is configured as a separate processor from the processor 820, at least one image processed by the image signal processor 960 may be displayed, by the processor 820, via the display device 860 as it is or after being further processed.
- the electronic device 801 may include a plurality of camera modules 880 having different attributes or functions.
- at least one of the plurality of camera modules 880 may form, for example, a wide-angle camera and at least another of the plurality of camera modules 880 may form a telephoto camera.
- at least one of the plurality of camera modules 880 may form, for example, a front camera and at least another of the plurality of camera modules 880 may form a rear camera.
- an electronic device may include an image sensor that obtains raw image data corresponding to light in an infrared band and a visible band for an external object and a processor, wherein the processor may receive a request to obtain the raw image data corresponding to the external object, generate an RGB image associated with the external object using first raw image data corresponding to the light in the visible band obtained through the image sensor, based on the request being set to be performed using a first function of the image sensor, and perform biometric authentication associated with the external object using second raw image data corresponding to the light in the infrared band obtained through the image sensor, based on the request being set to be performed using a second function of the image sensor.
- the image sensor may include a first sensor pixel set arranged to respond to light passing through a filter configured to transmit at least a part of the light in the infrared band and the visible band and a second sensor pixel set arranged to respond to light passing through a filter configured to transmit at least a part of the light in the infrared band and block the light in the visible band.
- the processor may obtain raw image data including a first image data set obtained through the first sensor pixel set and a second image data set obtained through the second sensor pixel set, using the image sensor, and obtain the first raw image data or the second raw image data using at least a part of the raw image data.
- the processor may, as a part of an operation of generating the RGB image, generate the RGB image using the first image data set based on an illuminance corresponding to the raw image data satisfying a specified first range, and generate the RGB image in which a component corresponding to the infrared band is corrected in the first image data set using at least a part of the second image data set based on the illuminance satisfying a specified second range.
- the processor may, as a part of an operation of generating the RGB image, correct at least a part of the second image data set using a first parameter based on a color temperature corresponding to the raw image data satisfying a specified first range, and correct at least a part of the second image data set using a second parameter based on the color temperature satisfying a specified second range, and generate the RGB image in which a component corresponding to the infrared band is corrected in the first image data set using at least a part of the corrected second image data set.
- the processor may, as a part of an operation of generating the RGB image, correct at least a part of the second image data set using the first parameter according to a color temperature corresponding to the raw image data satisfying a color temperature range of fluorescent light, and generate the RGB image in which a component corresponding to the infrared band is corrected in the first image data set using the at least a part of the second image data set which is corrected; and correct at least a part of the first image data set and at least a part of the second image data set using the second parameter according to a color temperature corresponding to the raw image data satisfying a color temperature range of incandescent light, and generate the RGB image in which a component corresponding to the infrared band is corrected in the first image data set using the at least a part of the first image data set and the at least a part of the second image data set which are corrected.
- an electronic device may include an image sensor including a color filter array, the color filter array including a first region transmitting visible light and infrared light and a second region transmitting infrared light, a memory, and a processor electrically connected to the image sensor and the memory, wherein the processor may execute an application, obtain first raw image data using a first image data set corresponding to the first region among raw image data obtained through the image sensor in response to a request to obtain an RGB image from the application, and provide the RGB image generated based at least on the obtained first raw image data to the application, and obtain second raw image data using a second image data set corresponding to the second region among raw image data obtained through the image sensor in response to a request to obtain an IR image from the application, and provide the IR image generated based at least on the obtained second raw image data to the application.
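The two request paths summarized above can be pictured as a small dispatcher. This is an illustrative sketch only: the `handle_capture_request` name, the string keys, and the `pipelines` mapping are assumptions for the sketch, not part of the claimed device.

```python
def handle_capture_request(kind, raw, pipelines):
    """Route one capture request to the RGB or IR processing path.

    kind      -- "rgb" or "ir", i.e., which image the application asked for
    raw       -- one raw readout from the RGB-IR image sensor
    pipelines -- maps "rgb"/"ir" to the callable implementing that path
    """
    if kind == "rgb":
        # Request set to the first function: generate an RGB image from
        # the first raw image data (visible-band samples).
        return ("rgb_image", pipelines["rgb"](raw))
    if kind == "ir":
        # Request set to the second function: produce an IR image from
        # the second raw image data (infrared-band samples).
        return ("ir_image", pipelines["ir"](raw))
    raise ValueError(f"unknown request kind: {kind}")
```

A caller would register one callable per path and hand every readout to this function together with the application's request type.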
- the first region may include a region transmitting red light and infrared light, a region transmitting green light and infrared light and a region transmitting blue light and infrared light.
- the processor may obtain the first raw image data using the first image data set according to an illuminance corresponding to the raw image data falling within a first range, and obtain the first raw image data using at least a part of the raw image data in which a component corresponding to infrared light included in the raw image data is corrected based at least on the second image data set according to the illuminance corresponding to the raw image data falling within a second range.
- the first range may be defined as being lower than a predetermined illuminance and the second range may be defined as being higher than or equal to the predetermined illuminance.
- the processor may correct a component of the raw image data corresponding to the infrared light based at least on the second image data set using a first method according to a color temperature corresponding to the raw image data falling within a first range, correct a component of the raw image data corresponding to the infrared light based at least on the second image data set using a second method different from the first method according to a color temperature corresponding to the raw image data falling within a second range, and obtain the first raw image data using at least a part of the corrected raw image data.
- the processor may correct a component of the raw image data corresponding to the infrared light further based on at least a part of the first image data set.
- the first range may be defined as being lower than a predetermined color temperature and the second range may be defined as being higher than or equal to the predetermined color temperature.
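The range-based selection in the preceding items can be sketched in code. The thresholds and the mapping of ranges to correction methods below are placeholders chosen only for illustration, since the claims leave the concrete values and the choice of method open.

```python
ILLUM_THRESHOLD = 50.0   # lux; illustrative boundary, not from the claims
CT_THRESHOLD = 4000.0    # kelvin; illustrative boundary between the two ranges

def choose_processing(illuminance, color_temperature):
    """Pick the RGB pipeline variant described above.

    Returns (correct_ir, method):
      correct_ir -- whether the infrared component should be corrected
                    (skipped in the low first illuminance range, applied
                    in the second range)
      method     -- "first" or "second", i.e., which correction method or
                    parameter set applies for the measured color temperature
    """
    correct_ir = illuminance >= ILLUM_THRESHOLD
    # First color-temperature range -> first method; second range -> second
    # method. Which physical illuminant falls in which range is left open.
    method = "first" if color_temperature < CT_THRESHOLD else "second"
    return correct_ir, method
```

The two returned flags would then steer the actual correction step applied to the first and second image data sets.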
- the processor may further obtain third raw image data corresponding to the second region and including a visible light component and an infrared component based on the first raw image data in response to the request to obtain the RGB image, and remove an infrared component included in the first raw image data and the third raw image data based on the second raw image data, wherein the RGB image is generated based on the first raw image data and the third raw image data.
- the processor may further obtain fourth raw image data corresponding to the first region and including an infrared component based on the second raw image data in response to the request to obtain the IR image, and the IR image may be generated based on the second raw image data and the fourth raw image data.
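One way to picture the third and fourth raw image data of the two items above is as hole-filling over the mosaic, followed by subtraction of the infrared plane. The nearest-neighbor interpolation below is a deliberately crude stand-in for whatever demosaicing a real image signal processor would use; all function names are illustrative.

```python
import numpy as np

def fill_holes(sparse, known_mask):
    """Fill unknown positions from the nearest known sample (Manhattan
    distance) -- a crude stand-in for real interpolation."""
    out = sparse.astype(float).copy()
    h, w = out.shape
    known = [(y, x) for y in range(h) for x in range(w) if known_mask[y, x]]
    for y in range(h):
        for x in range(w):
            if not known_mask[y, x]:
                ny, nx = min(known, key=lambda p: abs(p[0] - y) + abs(p[1] - x))
                out[y, x] = out[ny, nx]
    return out

def build_rgb_and_ir_planes(raw, rgb_mask):
    """rgb_mask is True at first-region (visible+IR) positions."""
    # First raw image data plus "third raw image data": visible+IR samples
    # interpolated onto the IR-only positions.
    first_plus_third = fill_holes(raw, rgb_mask)
    # Second raw image data plus "fourth raw image data": IR samples
    # interpolated onto the visible positions.
    second_plus_fourth = fill_holes(raw, ~rgb_mask)
    # Removing the infrared component from the visible plane.
    visible = first_plus_third - second_plus_fourth
    return visible, second_plus_fourth
```

The visible plane would feed the RGB image, the infrared plane the IR image.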
- the electronic device may further include a display.
- the processor may obtain the first raw image data using the first image data set corresponding to the first region among the raw image data obtained through the image sensor in response to the request to obtain the IR image, and display the RGB image generated based at least on the obtained first raw image data on the display, and provide the IR image to the application while the RGB image is displayed on the display.
- an electronic device may include an image sensor, the image sensor including a pixel array, the pixel array including a plurality of pixels, each of the pixels including an R subpixel, a G subpixel, a B subpixel, and an IR subpixel, a display, a memory that stores instructions, and a processor electrically connected to the image sensor, the display and the memory, wherein the processor may execute instructions to perform an application associated with biometric authentication, obtain image data including eyes of a user through the image sensor, display, on the display, at least a portion of an RGB image generated based at least on first information obtained by the R subpixel, the G subpixel and the B subpixel among the image data, authenticate the user based on second information obtained by the IR subpixel while the generated RGB image is displayed on the display, and provide a function of the application according to a result of the authentication.
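The display-while-authenticating loop described above might be sketched as follows; the `sensor`, `display`, and `matcher` interfaces are hypothetical stand-ins, not APIs from the disclosure.

```python
def run_iris_login(sensor, display, matcher, max_frames=30):
    """Preview from R/G/B subpixels while authenticating from IR subpixels.

    sensor.read() is assumed to return (rgb_info, ir_info) split from one
    RGB-IR readout; matcher.match(ir_info) returns True on an iris match.
    """
    for _ in range(max_frames):
        rgb_info, ir_info = sensor.read()
        display.show(rgb_info)        # live preview from the first information
        if matcher.match(ir_info):    # authentication from the second information
            return True               # unlock the application's function
    return False
```

On success the application would proceed to the protected function; on timeout it would fall back to another authentication method.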
- the second information may include iris image data obtained from the eyes of the user and the biometric authentication corresponds to iris authentication.
- the electronic device may further include an IR flash.
- the processor may execute the instructions to output flickering of infrared light through the IR flash, display, on the display, at least a part of the RGB image generated based at least on the first information obtained by the R subpixel, the G subpixel and the B subpixel among the image data corresponding to a time point at which the infrared light is not output and authenticate the user based on the second information obtained by the IR subpixel among the image data at a time point at which the infrared light is output.
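The flicker timing described above (preview frames taken while the IR flash is off, authentication frames taken while it is on) can be sketched as an alternating capture loop; the flash and sensor objects and their methods are assumptions for the sketch.

```python
def flicker_capture(sensor, ir_flash, n_pairs=5):
    """Alternate IR-flash-off preview frames with IR-flash-on IR frames.

    sensor.read() is assumed to return whatever was captured under the
    current flash state; ir_flash.on()/off() toggle the infrared output.
    """
    previews, ir_frames = [], []
    for _ in range(n_pairs):
        ir_flash.off()
        previews.append(sensor.read())   # RGB info while IR light is not output
        ir_flash.on()
        ir_frames.append(sensor.read())  # IR info while IR light is output
    ir_flash.off()
    return previews, ir_frames
```

The preview frames would feed the displayed RGB image and the flash-on frames the iris matching, matching the time-point split in the item above.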
- the processor may execute the instructions to correct an infrared component included in the first information based at least on the second information and display, on the display, at least a portion of the RGB image generated based at least on the first information in which the infrared component is corrected.
- the processor may execute the instructions to correct the infrared component included in the first information based at least on the second information according to a first method when a color temperature corresponding to the image data falls within a first range and correct the infrared component included in the first information based at least on the second information according to a second method different from the first method when the color temperature corresponding to the image data falls within a second range.
- the electronic device may be various types of devices.
- the electronic device may include, for example, at least one of a portable communication device (e.g., a smartphone), a computer device, a portable multimedia device, a mobile medical appliance, a camera, a wearable device, or a home appliance.
- The terms "a first", "a second", "the first", or "the second" used herein may refer to various components regardless of the order and/or the importance, but do not limit the corresponding components.
- The above expressions are used merely for the purpose of distinguishing a component from the other components. It should be understood that when a component (e.g., a first component) is referred to as being (operatively or communicatively) "connected" or "coupled" to another component (e.g., a second component), it may be connected or coupled directly to the other component, or any other component (e.g., a third component) may be interposed between them.
- The term "module" used herein may represent, for example, a unit including one or more combinations of hardware, software, and firmware.
- The term "module" may be interchangeably used with the terms "logic", "logical block", "part", and "circuit".
- the “module” may be a minimum unit of an integrated part or may be a part thereof.
- the “module” may be a minimum unit for performing one or more functions or a part thereof.
- the “module” may include an application-specific integrated circuit (ASIC).
- Various embodiments of the present disclosure may be implemented by software (e.g., the program 840) including an instruction stored in a machine-readable storage medium (e.g., an internal memory 836 or an external memory 838) readable by a machine (e.g., a computer).
- the machine may be a device that calls the instruction from the machine-readable storage media and operates depending on the called instruction and may include the electronic device (e.g., the electronic device 801).
- the processor may perform a function corresponding to the instruction directly or using other components under the control of the processor.
- the instruction may include a code generated or executed by a compiler or an interpreter.
- the machine-readable storage media may be provided in the form of non-transitory storage media.
- The term "non-transitory" is a limitation of the medium itself (i.e., the medium is tangible, not a signal), as opposed to a limitation on data storage persistency.
- the method according to various embodiments disclosed in the present disclosure may be provided as a part of a computer program product.
- the computer program product may be traded between a seller and a buyer as a product.
- the computer program product may be distributed in the form of a machine-readable storage medium (e.g., a compact disc read-only memory (CD-ROM)) or may be distributed online through an application store (e.g., Play Store™).
- at least a portion of the computer program product may be temporarily stored or generated in a storage medium such as a memory of a manufacturer's server, an application store's server, or a relay server.
- Each component (e.g., the module or the program) according to various embodiments may include at least one of the above components, and a portion of the above sub-components may be omitted, or additional other sub-components may be further included.
- Operations performed by a module, a program, or other components according to various embodiments of the present disclosure may be executed sequentially, in parallel, repeatedly, or heuristically. Also, at least some operations may be executed in a different order or omitted, or other operations may be added.
Description
- Embodiments disclosed herein relate to a technology for providing a function using an RGB image and an IR image acquired through one image sensor.
- With the advancement of information technology (IT), cameras have evolved from traditional film cameras to digital cameras. The digital camera may convert light into an electrical image signal and then store the electrical image signal as digital data (image data).
- An electronic device may include an image sensor to generate the image data. The image sensor may include millions to tens of millions of unit pixels each having a photoelectric conversion element. In the photoelectric conversion element, the movement of electric charges, that is, a current, occurs according to the photoelectric effect. The image data may be generated by converting the current into a digital signal.
- In recent years, with the trend toward miniaturization and an increasing number of pixels in camera modules mounted in electronic devices, the unit pixel of the image sensor described above has become very small.
- It is known from US 2017/0134704 to provide an imaging processing device for performing both photographing with visible light and photographing with infrared light. It is also known from US 6,759,646 to provide an imager with a four-color mosaic pattern of red, green, blue, and infrared pass filters. US 2015/0356351 also provides systems, devices, and methods for authenticating an individual or user using biometric features.
- To obtain image data, particularly a color image, an optical color filter may be inserted into each unit pixel, and a filter for blocking infrared light may be inserted between a lens and each unit pixel.
- Meanwhile, as the utilization of infrared images increases, an infrared camera may be mounted on an electronic device. When the infrared camera is mounted on the electronic device, the size of the electronic device may increase. To reduce the size of the electronic device that obtains both a visible light image and an infrared image, a camera using one image sensor that obtains both the visible light image and the infrared image may be used. The camera using the image sensor may not include a filter for blocking infrared light, and the image sensor may include a unit pixel for receiving visible light and a unit pixel for receiving infrared light.
- However, when the camera does not include an infrared cut-off filter, infrared light reaching the unit pixels for receiving visible light may mix infrared components into the obtained RGB image, and the quality of the RGB image may be degraded.
- Embodiments disclosed herein may provide an electronic device that generates and uses both a high-quality RGB image and an IR image from one image sensor, by efficiently removing the infrared component from the RGB image using image data obtained through a single image sensor that receives visible light and infrared light.
- According to an embodiment disclosed herein, an electronic device according to claim 1 is proposed. Further embodiments are defined by claims 2-13.
- According to the embodiments disclosed herein, it is possible to obtain and use an RGB image from which infrared components have been removed in an efficient manner.
- According to the embodiments herein, it is possible to perform biometric authentication with an IR image while displaying an RGB image on a display using one image sensor.
- In addition, various effects may be provided that are directly or indirectly understood through the disclosure.
- FIG. 1 is a block diagram of an electronic device according to an embodiment.
- FIG. 2 illustrates a pixel array of an image sensor according to an embodiment.
- FIG. 3 is a flowchart illustrating a method of providing a function of an application using raw image data according to the type of image required, according to an embodiment.
- FIG. 4 is a flowchart illustrating a method of correcting an infrared component of raw image data and providing a function of an application using the corrected RGB image, according to an embodiment.
- FIG. 5 is a diagram illustrating a process of obtaining third raw image data and fourth raw image data by using a part of raw image data according to an embodiment.
- FIG. 6 is a flowchart illustrating a method of providing a function of an application using an IR image according to an embodiment.
- FIG. 7 is a flowchart illustrating a method of performing biometric authentication using an IR image while displaying an RGB image, according to an embodiment.
- FIG. 8 is a block diagram of an electronic device in a network environment according to various embodiments.
- FIG. 9 is a block diagram illustrating the camera module according to various embodiments.
- With regard to the description of drawings, the same or similar components may be designated by the same or similar reference numerals.
- Hereinafter, various embodiments of the present disclosure will be described with reference to the accompanying drawings. Those of ordinary skill in the art will recognize that modifications, equivalents, and/or alternatives to the various embodiments described herein can be made without departing from the scope of the appended claims.
- FIG. 1 is a block diagram of an electronic device according to an embodiment.
- Referring to FIG. 1, an electronic device 100 according to an embodiment may include an image sensor 110, a display 120, a memory 130, a photo sensor 140, an IR flash 150, and a processor 160. According to various embodiments, the electronic device 100 may be implemented with some components omitted. For example, the electronic device 100 may be implemented with the display 120, the photo sensor 140, or the IR flash 150 omitted. - The
image sensor 110 may generate an electrical signal (e.g., movement of electric charges or a current) that is the basis of the image data in response to incident light. According to an embodiment, the image sensor 110 may include a pixel array. - According to an embodiment, the pixel array may include a plurality of pixels.
- According to an embodiment, each of the pixels may include an R subpixel, a G subpixel, a B subpixel, and an IR subpixel. According to an embodiment, the R subpixel may receive red light and infrared light, the G subpixel may receive green light and infrared light, the B subpixel may receive blue light and infrared light, and the IR subpixel may receive infrared light.
- According to an embodiment, the pixel array may include a color filter array. According to an embodiment, the color filter array may include a first region through which visible light and infrared light pass and a second region through which infrared light passes. According to an embodiment, the first region may be a region included in the R subpixel, the G subpixel, and the B subpixel, and the second region may be a region included in the IR subpixel. According to an embodiment, the first region may include a region which transmits red light and infrared light, a region which transmits green light and infrared light, and a region which transmits blue light and infrared light.
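As an illustration of such a color filter array, one plausible 2×2 RGB-IR tile can be laid out in code. The exact subpixel arrangement is not fixed by the description, so the tile below is an assumption made only for the sketch.

```python
import numpy as np

# One plausible 2x2 RGB-IR unit cell (the arrangement is illustrative):
#   R  G
#   IR B
TILE = np.array([["R", "G"],
                 ["IR", "B"]])

def cfa_layout(h, w):
    """Tile the unit cell over an h x w pixel array."""
    reps = (h // 2 + 1, w // 2 + 1)
    return np.tile(TILE, reps)[:h, :w]

def first_region_mask(h, w):
    """True where the filter passes visible + infrared light (the first
    region: R, G, B subpixels), False where it passes infrared only
    (the second region: IR subpixels)."""
    return cfa_layout(h, w) != "IR"
```

For a 4×4 array this yields twelve first-region positions and four second-region positions, one IR subpixel per 2×2 pixel.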
- According to an embodiment, the
image sensor 110 may include an image processor 160, and the image processor 160 may obtain raw image data corresponding to light in an infrared band and a visible light band with respect to an external object. - The
display 120 may include, for example, a liquid crystal display (LCD), a light-emitting diode (LED) display, an organic LED (OLED) display, a microelectromechanical systems (MEMS) display, or an electronic paper display. The display 120 may display, for example, an RGB image generated in real time. - The
memory 130 may include a volatile and/or nonvolatile memory. The memory 130 may store, for example, commands or data related to at least one other component of the electronic device 100. For example, the memory 130 may store an image data file as a final result processed by the image processor 160. As another example, the memory 130 may store a plurality of applications that provide a function using at least one of an RGB image or an IR image. - The
photo sensor 140 may measure at least one of an illuminance or a color temperature corresponding to obtained raw image data. According to an embodiment, the photo sensor 140 may be provided on an outer surface of the electronic device 100. - The
IR flash 150 may output infrared light. The IR flash 150 may operate to increase the amount of infrared light reflected from a subject when an IR image is required. - The
processor 160 may be electrically connected to the components included in the electronic device 100 to execute operations or data processing related to the control and/or communication of those components. -
FIG. 2 illustrates a pixel array of the image sensor 110 according to an embodiment. - According to an embodiment, the pixel array of
FIG. 2 may correspond to a pixel array included in the image sensor 110 of FIG. 1. - Referring to
FIG. 2, the pixel array may include a plurality of pixels (e.g., millions to tens of millions). According to an embodiment, as illustrated in FIG. 2, each of the pixels may include an R subpixel, a G subpixel, a B subpixel, and an IR subpixel. According to an embodiment, each of the pixels may include a first region 210 which transmits visible light and infrared light and a second region 220 which transmits infrared light. In other words, the image sensor 110 may include a first set of sensor pixels arranged to respond to light passing through a filter configured to transmit at least some light in the infrared band and the visible light band, and a second set of sensor pixels arranged to respond to light passing through a filter configured to transmit at least some light in the infrared band and block light in the visible light band. - According to an embodiment, the subpixel may include a
micro lens 211, various films or filters 212 and 213, and a photoelectric conversion element 214. Although not shown, according to various embodiments, the subpixel may further include other components such as various conductor patterns that electrically connect the photoelectric conversion element 214 and the image processor 160. - The
micro lens 211 may collect incident light such that the incident light reaches the top of the photoelectric conversion element 214. The incident light may be refracted by the micro lens 211 to form a condensed spot (also referred to as an optical spot) on the photoelectric conversion element 214. - The
color filter 212 may be disposed under the micro lens 211 to transmit light having a specified color, that is, light having a specified wavelength range. For example, the color filter 212 may include a primary color filter (e.g., R, G, B) and an infrared filter. According to an embodiment, a bundle of color filters may correspond to a color filter array. - The
anti-reflection film 213 may increase the amount of light reaching the photoelectric conversion element 214 by preventing light incident through the micro lens 211 from being reflected. - The
photoelectric conversion element 214 may correspond to, for example, a photodiode formed on a semiconductor substrate. The photoelectric conversion element 214 may output an electrical signal in response to the incident light according to the photoelectric effect. For example, the photoelectric conversion element 214 may generate electric charges (or a current) according to the intensity (or the amount) of received light. An output value may be determined based on the amount of the charges (or current). -
FIG. 3 is a flowchart illustrating a method of providing a function of an application using raw image data according to the type of image required, according to an embodiment. - Hereinafter, it is assumed that the
electronic device 100 of FIG. 1 performs the process of FIG. 3. In addition, in the description of FIG. 3, an operation described as being performed by the electronic device 100 may be understood to be controlled by the processor 160 of the electronic device 100. Operations described as being performed by the electronic device 100 may be implemented as instructions (commands) that may be performed (or executed) by the processor 160 of the electronic device 100. The instructions may be stored in, for example, a computer recording medium or the memory 130 of the electronic device 100 shown in FIG. 1. - In
operation 301, the processor 160 may execute any one of a plurality of applications stored in the memory 130. - In
operation 303, the processor 160 may obtain raw image data through the image sensor 110. According to an embodiment, the raw image data may include a value according to the magnitude of an electrical signal output by each subpixel of the image sensor 110. For example, the raw image data may have a value according to the magnitude of an electrical signal output by an R subpixel, a value according to the magnitude of an electrical signal output by a G subpixel, a value according to the magnitude of an electrical signal output by a B subpixel, and a value according to the magnitude of an electrical signal output by an IR subpixel.
- When an RGB image is required (in response to a request to obtain an RGB image from an application) to provide a function using the executed application according to one embodiment, in
operation 305a, theprocessor 160 may obtain first raw image data by using the first image data set. - According to an embodiment, the
processor 160 may correct an infrared component of the raw image data by further using the second image data set, and obtain first raw image data corresponding to the first region from the corrected raw image data. For example, the first raw image data may be data corresponding to the first region of the color filter array among the corrected raw image data. A detailed description of operations that may be included inoperation 305a will be described with reference toFIG. 4 . - In
operation 307a, theprocessor 160 may generate an RGB image based on the first raw image data. - According to an embodiment, the
processor 160 may obtain third raw image data corresponding to the second region and including a visible light component, an infrared component, or both the visible light component and the infrared component, using the first raw image data. - According to an embodiment, the
processor 160 may interpolate the RGB image using the first raw image data and the third raw image data. In other words, when the function requested in relation to theimage sensor 110 belongs to a first function, theprocessor 160 may generate an RGB image related to an external object by using the first raw image data corresponding to the light in the visible light band obtained through theimage sensor 110. A detailed description of operations that may be included inoperation 307a will be described with reference toFIG. 4 . - In
operation 309a, theprocessor 160 may provide a function of using the application by using the generated RGB image. For example, theprocessor 160 may display an RGB image on thedisplay 120 or store the RGB image in thememory 130. - When an IR image is required to provide the function using the executed application according to one embodiment (in response to a request to obtain an IR image from an application), in
operation 305b, theprocessor 160 may obtain second raw image data by using the second image data set. - According to an embodiment, the
processor 160 may correct an infrared component of the raw image data by using the second image data set, and obtain second raw image data corresponding to the second region from the corrected raw image data. For example, the second raw image data may be data corresponding to the second region of the color filter array among the corrected raw image data. - In
operation 307b, theprocessor 160 may generate an IR image based on the second raw image data. - According to an embodiment, the
processor 160 may obtain fourth raw image data corresponding to the first region and including an infrared component by using the second raw image data. - According to an embodiment, the
processor 160 may interpolate the IR image using the second raw image data and the fourth raw image data. - In
operation 309b, theprocessor 160 may provide a function of using the application by using the generated IR image. For example, theprocessor 160 may display the IR image on thedisplay 120 or perform biometric authentication (e.g., iris authentication) using the IR image. In other words, when the requested function in relation to theimage sensor 110 belongs to a specified second function, theprocessor 160 may generate an IR image by using the second raw image data corresponding to the light in the infrared light band obtained through theimage sensor 110, and perform biometric authentication related to an external object using the generated IR image. According to an embodiment, theprocessor 160 may obtain distance information (or depth information) between theelectronic device 100 and the object using the IR image. For example, theprocessor 160 may output IR light of a predetermined pattern through theIR flash 150 and obtain distance information to the object by using the IR light of the pattern included in the obtained IR image. - Hereinafter, a detailed operation performed when an RGB image is required to provide a function of an application will be described with reference to
FIGS. 4 and5 . -
FIG. 4 is a flowchart illustrating a method of correcting an infrared component of raw image data and providing a function of an application using the corrected RGB image, according to an embodiment. - Hereinafter, it is assumed that the
electronic device 100 of FIG. 1 performs the process of FIG. 4. In addition, in the description of FIG. 4, an operation described as being performed by the electronic device 100 may be understood to be controlled by the processor 160 of the electronic device 100. Operations described as being performed by the electronic device 100 may be implemented as instructions (commands) that may be performed (or executed) by the processor 160 of the electronic device 100. The instructions may be stored in, for example, a computer recording medium or the memory 130 of the electronic device 100 shown in FIG. 1. - According to an embodiment, the above-described
operations 305a to 309a may include operations 401 to 417 of FIG. 4. According to an embodiment, some of operations 401 to 417 may be omitted. - According to an embodiment,
operation 401, operation 407a, or operation 407b may be performed after operation 303 of FIG. 3. - According to an embodiment, when an illuminance corresponding to raw image data falls within a first range, in
operation 401, the processor 160 may obtain first raw image data using the first image data set. According to an embodiment, the first range may be defined as an illuminance corresponding to the raw image data being lower than a predetermined illuminance. According to an embodiment, the first image data set may be image data obtained through an R subpixel, a G subpixel, and a B subpixel. According to an embodiment, the first image data set may include a visible light component and an infrared component. - In
operation 403, the processor 160 may obtain third raw image data corresponding to the second region and including the visible light component and the infrared component using the first raw image data. According to an embodiment, the first raw image data may include a component related to light incident to the first region of the color filter array and may not include a component related to light incident to the second region. The processor 160 may interpolate the third raw image data corresponding to the second region by using the first raw image data. According to an embodiment, the processor 160 may obtain RGB image data corresponding to all regions of a color filter array by obtaining the third raw image data. - In
operation 405, the processor 160 may generate an RGB image based on the first raw image data and the third raw image data. According to an embodiment, the processor 160 may interpolate the RGB image by using the first raw image data and the third raw image data. For example, the processor 160 may demosaic the first raw image data and the third raw image data to generate the RGB image. - According to an embodiment, when the illuminance corresponding to the raw image data falls within the second range, in
operation 407a or operation 407b, the processor 160 may correct an infrared component of the raw image data based on the second image data set. According to an embodiment, the second range may be defined as an illuminance corresponding to the raw image data being equal to or higher than the predetermined illuminance. - In other words, when the illuminance corresponding to the raw image data falls within the first range, the
processor 160 may generate an RGB image using the first image data set. In addition, when the illuminance corresponding to the raw image data falls within a specified second range, the processor 160 may generate an RGB image in which a component corresponding to the infrared light band in the first image data set is corrected using at least some of the second image data set. - According to an embodiment, it is possible to obtain an RGB image of improved quality even in a low illuminance environment by obtaining the RGB image including an IR component.
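The illuminance-dependent branch described above can be sketched as follows. This is an illustrative sketch only: the numeric threshold is a hypothetical stand-in for the "predetermined illuminance" that the text leaves unspecified.

```python
# Sketch of the illuminance branch (operation 401 vs. operations 407a/407b).
# LOW_LIGHT_THRESHOLD_LUX is hypothetical; the text only specifies
# "a predetermined illuminance" separating the two ranges.

LOW_LIGHT_THRESHOLD_LUX = 50.0

def select_rgb_pipeline(illuminance_lux: float) -> str:
    """Return which RGB-generation path applies at a measured illuminance."""
    if illuminance_lux < LOW_LIGHT_THRESHOLD_LUX:
        # First range: keep the IR component to boost the low-light signal
        # (operations 401 to 405).
        return "use_ir_inclusive_data"
    # Second range: correct (remove) the IR component before generating
    # the RGB image (operations 407a/407b onward).
    return "correct_ir_component"
```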
- According to various embodiments, the
processor 160 may obtain an RGB image including an IR component to provide various functions in addition to low-light capture (operation 401, operation 403, and operation 405). For example, the processor 160 may obtain an RGB image including an IR component to smooth the skin texture of a photographed subject. As another example, the processor 160 may obtain an RGB image including an IR component to optimize the dynamic range of the RGB image. As another example, the processor 160 may obtain an RGB image including an IR component to obtain an effect of removing fog from the RGB image. - According to an embodiment, when the color temperature corresponding to the raw image data falls within the first range, in
operation 407a, the processor 160 may correct an IR component of the raw image data based on the second image data set according to a first method. According to one embodiment, the first range may be defined as a color temperature corresponding to the raw image data being lower than a predetermined color temperature. - According to an embodiment, when the color temperature corresponding to the raw image data falls within the second range, in
operation 407b, the processor 160 may correct an IR component of the raw image data based on the second image data set according to a second method different from the first method. According to one embodiment, the second range may be defined as a color temperature corresponding to the raw image data being equal to or higher than the predetermined color temperature. - In other words, the
processor 160 may correct the infrared component of the raw image data using a method determined according to the color temperature of the raw image data. - According to an embodiment, the
processor 160 may perform correction by subtracting, from the value of an RGB component obtained through an RGB subpixel, the value of the IR component obtained through an IR pixel multiplied by a predetermined coefficient corresponding to the type of that RGB subpixel. For example, the raw image data may include a value of a component corresponding to an R subpixel, a value of a component corresponding to a G subpixel, a value of a component corresponding to a B subpixel, and a value of a component corresponding to an IR subpixel, which correspond to one pixel. In an embodiment, the value of the component corresponding to the R subpixel may be 3, the value of the component corresponding to the G subpixel may be 3.5, the value of the component corresponding to the B subpixel may be 4, and the value of the component corresponding to the IR subpixel may be 1. In addition, the coefficient corresponding to the R subpixel may be 1.24, the coefficient corresponding to the G subpixel may be 1.03, and the coefficient corresponding to the B subpixel may be 0.84. - In an embodiment, the
processor 160 may correct the value of the component corresponding to the R subpixel to be 1.76 (= 3 - 1.24 x 1), correct the value of the component corresponding to the G subpixel to be 2.47 (= 3.5 - 1.03 x 1), and correct the value of the component corresponding to the B subpixel to be 3.16 (= 4 - 0.84 x 1). - In other words, the
processor 160 may correct at least a part of the second image data set using a first parameter when the color temperature corresponding to the raw image data falls within the first range, and correct at least a part of the second image data set using a second parameter when the color temperature falls within the second range. The processor 160 may correct a component corresponding to the infrared band in the first image data set by using at least a part of the corrected second image data set. - According to an embodiment, the
processor 160 may correct a component corresponding to the infrared light of the raw image data further based on at least a part of the first image data set. -
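The coefficient-based correction worked through above can be sketched as follows; the subpixel values and coefficients are the ones quoted in the text, and the dictionary representation of a pixel is only for illustration.

```python
# Coefficient-based IR correction (the "first method"): subtract, from each
# RGB subpixel value, the IR subpixel value scaled by a per-channel
# coefficient. The coefficients are the ones quoted in the text.

IR_COEFFICIENTS = {"R": 1.24, "G": 1.03, "B": 0.84}

def correct_ir(pixel: dict) -> dict:
    """Return the corrected R, G, B component values of one pixel."""
    ir = pixel["IR"]
    return {ch: round(pixel[ch] - IR_COEFFICIENTS[ch] * ir, 2)
            for ch in ("R", "G", "B")}

corrected = correct_ir({"R": 3.0, "G": 3.5, "B": 4.0, "IR": 1.0})
# corrected reproduces the values worked out in the text:
# {"R": 1.76, "G": 2.47, "B": 3.16}
```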
- According to an example, the second method may correct the raw image data using a matrix formula of the form (R', G', B', IR') = A x (R, G, B, IR), where A is a 4 x 4 coefficient matrix. R', G', B', and IR' in the matrix may be respectively a value of a component corresponding to an R subpixel, a value of a component corresponding to a G subpixel, a value of a component corresponding to a B subpixel, and a value of a component corresponding to an IR subpixel, which are included in the corrected raw image data. According to an embodiment, R, G, B, and IR in the matrix may be respectively a value of a component corresponding to an R subpixel, a value of a component corresponding to a G subpixel, a value of a component corresponding to a B subpixel, and a value of a component corresponding to an IR subpixel, which are included in the raw image data before correction.
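The two correction methods and the color-temperature dispatch can be sketched together as follows. The 3000 K boundary and the matrix coefficients are hypothetical placeholders (the text specifies neither), while the subtraction coefficients are the ones quoted earlier. With these placeholder coefficients the matrix simply reproduces the first method; in general its off-diagonal entries could mix all four components.

```python
# Sketch of choosing the correction method by color temperature
# (operations 407a/407b). COLOR_TEMP_BOUNDARY_K and CORRECTION_MATRIX are
# hypothetical; SUBTRACT_COEFFS are the coefficients quoted in the text.

COLOR_TEMP_BOUNDARY_K = 3000.0
SUBTRACT_COEFFS = {"R": 1.24, "G": 1.03, "B": 0.84}
CORRECTION_MATRIX = [                # hypothetical 4x4 matrix A; with these
    [1.0, 0.0, 0.0, -1.24],          # coefficients it reproduces the first
    [0.0, 1.0, 0.0, -1.03],          # method, but in general A may mix all
    [0.0, 0.0, 1.0, -0.84],          # four components
    [0.0, 0.0, 0.0,  1.0],
]

def correct_pixel(r, g, b, ir, color_temp_k):
    """Return corrected (R', G', B', IR') for one pixel's (R, G, B, IR)."""
    if color_temp_k < COLOR_TEMP_BOUNDARY_K:
        # First method: per-channel subtraction of coefficient * IR.
        return (r - SUBTRACT_COEFFS["R"] * ir,
                g - SUBTRACT_COEFFS["G"] * ir,
                b - SUBTRACT_COEFFS["B"] * ir,
                ir)
    # Second method: (R', G', B', IR') = A x (R, G, B, IR).
    v = (r, g, b, ir)
    return tuple(sum(a * x for a, x in zip(row, v)) for row in CORRECTION_MATRIX)
```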
- According to an embodiment, when the color temperature of the raw image data falls within the first range (e.g., the color temperature range of fluorescent light), the
processor 160 may perform correction using the first method of the above-described correction methods (e.g., subtracting, from the value of an RGB component obtained through an RGB subpixel, the value of the IR component obtained through the IR pixel multiplied by a predetermined coefficient corresponding to the type of the RGB subpixel), and when the color temperature of the raw image data falls within the second range (e.g., the color temperature range of incandescent light), perform correction using the second method of the above-described correction methods (e.g., a method of correcting an infrared component of the raw image data using the matrix formula). - In
operation 409, the processor 160 may obtain first raw image data by using at least a part of the corrected raw image data. According to an embodiment, the first raw image data, that is, the raw image data corresponding to the first region of the corrected raw image data, may be obtained. For example, the corrected raw image data as described above may include a value R' of a component corresponding to an R subpixel, a value G' of a component corresponding to a G subpixel, a value B' of a component corresponding to a B subpixel, and a value IR' of the component corresponding to an IR subpixel. The processor 160 may obtain the value R' of the component corresponding to the R subpixel corresponding to the first region, the value G' of the component corresponding to the G subpixel corresponding to the first region, and the value B' of the component corresponding to the B subpixel corresponding to the first region, among the values of the plurality of components. - In
operation 411, the processor 160 may obtain third raw image data corresponding to the second region and including a visible light component and an infrared component using the first raw image data, and obtain fourth raw image data corresponding to the first region and including an infrared component using the second raw image data. - According to an example, the first raw image data may include a component related to light incident to the first region of the color filter array and may not include a component related to light incident to the second region. The
processor 160 may interpolate the third raw image data corresponding to the second region by using the first raw image data. According to an embodiment, the processor 160 may obtain RGB image data corresponding to all regions of a color filter array by obtaining the third raw image data. - According to an embodiment, the second raw image data may include a component related to light incident to the second region of the color filter array and may not include a component related to light incident to the first region. The
processor 160 may interpolate the fourth raw image data corresponding to the first region by using the second raw image data. According to an embodiment, the processor 160 may obtain IR image data corresponding to all regions of a color filter array by obtaining the fourth raw image data. -
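A minimal sketch of the neighbor-based interpolation in operation 411, together with the subtraction step of operation 413 described later, might look like the following, assuming a 2 x 2 R-G-B-IR tile and a plain 3 x 3 neighborhood average; both the tile layout and the averaging kernel are illustrative simplifications.

```python
# 2x2 RGBIR tile assumed for illustration: (0,0)=R, (0,1)=G, (1,0)=B, (1,1)=IR
PATTERN = {(0, 0): "R", (0, 1): "G", (1, 0): "B", (1, 1): "IR"}

def channel_of(y, x):
    """Channel measured at mosaic position (y, x) under the assumed tile."""
    return PATTERN[(y % 2, x % 2)]

def interpolate_channel(mosaic, channel):
    """Estimate `channel` at every position: keep measured samples, and fill
    the rest with the mean of that channel's samples in the 3x3 neighborhood."""
    h, w = len(mosaic), len(mosaic[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            if channel_of(y, x) == channel:
                out[y][x] = mosaic[y][x]       # measured directly
            else:
                samples = [mosaic[ny][nx]
                           for ny in range(max(0, y - 1), min(h, y + 2))
                           for nx in range(max(0, x - 1), min(w, x + 2))
                           if channel_of(ny, nx) == channel]
                out[y][x] = sum(samples) / len(samples)  # neighbor average
    return out

def subtract_ir(vis_plus_ir, ir):
    """Operation 413: subtract the interpolated IR estimate element-wise
    (clamped at zero so noise cannot produce negative intensities)."""
    return [[max(a - b, 0.0) for a, b in zip(r1, r2)]
            for r1, r2 in zip(vis_plus_ir, ir)]
```

Interpolating the IR channel over the whole array corresponds to the fourth raw image data; interpolating R, G, and B over the IR positions corresponds to the third raw image data.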
FIG. 5 is a diagram illustrating a process of obtaining third raw image data and fourth raw image data by using a part of raw image data according to an embodiment. - Referring to
FIG. 5, corrected raw image data may include a value 510 of a component corresponding to a first region and a value 520 of a component corresponding to a second region. The processor 160 may obtain first raw image data including a value 511 of a component corresponding to an R subpixel, a value 512 of a component corresponding to a G subpixel, and a value 513 of a component corresponding to a B subpixel, and obtain second raw image data including the value 520 of a component corresponding to an IR subpixel. - According to an embodiment, the
processor 160 may interpolate third raw image data 530 corresponding to the second region using the first raw image data. Although the use of values of three components is illustrated in FIG. 5, there may be a plurality of other pixels neighboring the illustrated pixel, and the processor 160 may interpolate the third raw image data 530 using the first raw image data corresponding to the neighboring other pixels. - According to an example, the
processor 160 may interpolate the third raw image data 530 using the value 512 of the component corresponding to the G subpixel as it is. According to an embodiment, the processor 160 may process values of components corresponding to neighboring subpixels according to a predetermined algorithm, and interpolate the third raw image data 530 using the processed values. For example, the processor 160 may interpolate the third raw image data 530 by using the value 512 of the component corresponding to the G subpixel and an average value of the values of the components corresponding to the G subpixels of the neighboring other pixels. - According to an embodiment, the
processor 160 may interpolate the fourth raw image data 540 corresponding to the first region using the second raw image data. Although the use of a value of one component is illustrated in FIG. 5, there may be a plurality of other pixels neighboring the illustrated pixel, and the processor 160 may interpolate the fourth raw image data 540 using the second raw image data corresponding to the neighboring other pixels. According to an embodiment, the fourth raw image data 540 may include a value 541 of a component corresponding to the R subpixel, a value 542 of a component corresponding to the G subpixel, and a value 543 of a component corresponding to the B subpixel. - According to an example, the
processor 160 may interpolate the fourth raw image data 540 using the second raw image data 520 as it is. According to an embodiment, the processor 160 may process values of components corresponding to IR subpixels included in neighboring pixels according to a predetermined algorithm, and interpolate the fourth raw image data 540 using the processed values. For example, the processor 160 may interpolate the fourth raw image data 540 using an average value of values of components corresponding to the IR subpixels of the neighboring pixels. - Referring back to
FIG. 4, in operation 413, the processor 160 may remove infrared components included in the first raw image data and the third raw image data based on the second raw image data and the fourth raw image data. - According to an embodiment, the first raw image data and the third raw image data may include visible light components and infrared components, and the second raw image data and the fourth raw image data may include only infrared components. According to an example, the
processor 160 may remove the infrared components by subtracting the value of the component corresponding to each subpixel included in the second raw image data and the fourth raw image data from the value of the component corresponding to each subpixel included in the first raw image data and the third raw image data. For example, the processor 160 may subtract the value of the component corresponding to the R subpixel of the fourth raw image data from the value of the component corresponding to the R subpixel of the first raw image data. - In
operation 415, the processor 160 may generate an RGB image based on the first raw image data and the third raw image data from which the infrared components have been removed. According to an example, the processor 160 may interpolate the RGB image by using the first raw image data and the third raw image data from which the infrared components have been removed. For example, the processor 160 may demosaic the first raw image data and the third raw image data from which the infrared components have been removed to generate an RGB image. - In
operation 417, the processor 160 may provide a function of an application using the RGB image. For example, when the application is a camera application, the processor 160 may display an RGB image obtained in real time on the display 120 and store a still image or a moving image in the memory 130 according to a user input. - In the above description with reference to
FIG. 4, it has been described that a range of illuminance corresponding to the raw image data is determined, and a range of color temperature corresponding to the raw image data is then determined. According to an example, only a range of illuminance corresponding to the raw image data may be determined, or only a range of color temperature may be determined. - For example, regardless of which range the illuminance corresponding to the raw image data belongs to, when the range of color temperature falls within the first range, the processor may perform operation (
operation 407a) of correcting the infrared component of the raw image data based on the second image data set according to the first method, and perform operation (operation 407b) of correcting the infrared component of the raw image data based on the second image data set according to the second method when the range of the color temperature falls within the second range. - As another example, regardless of which range the color temperature corresponding to the raw image data belongs to, the processor may perform operation (operation 401) of obtaining first raw image data using the first image data set when the range of illuminance falls within the first range and perform operation (
operation 407a or operation 407b) of correcting an infrared component of the raw image data based on the second image data set when the range of illuminance falls within the second range. - Hereinafter, a detailed operation performed when an IR image is required to provide a function of an application (in response to a request to obtain the IR image from the application) is described with reference to
FIG. 6. -
FIG. 6 is a flowchart illustrating a method of providing a function of an application using an IR image according to an embodiment. - Hereinafter, it is assumed that the
electronic device 100 of FIG. 1 performs the process of FIG. 6. In addition, in the description of FIG. 6, an operation described as being performed by the electronic device 100 may be understood to be controlled by the processor 160 of the electronic device 100. Operations described as being performed by the electronic device 100 may be implemented as instructions (commands) that may be performed (or executed) by the processor 160 of the electronic device 100. The instructions may be stored in, for example, a computer recording medium or the memory 130 of the electronic device 100 shown in FIG. 1. - According to an embodiment, the above-described
operations 305b to 309b may include operations 601 to 607 of FIG. 6. According to an embodiment, some of operations 601 to 607 may be omitted. - According to an embodiment,
operation 601 may be performed after operation 303 of FIG. 3. - According to an example, before
operation 303, the processor 160 may output IR light through the IR flash 150. - In
operation 601, the processor 160 may obtain second raw image data using a second image data set. According to an embodiment, the second image data set may be image data obtained through an IR pixel. According to an embodiment, the second image data set may include an infrared component. - In
operation 603, the processor 160 may obtain fourth raw image data corresponding to a first region and including an infrared component by using the second raw image data. According to an embodiment, the second raw image data may include a component related to light incident to the second region of the color filter array and may not include a component related to light incident to the first region. The processor 160 may interpolate the fourth raw image data corresponding to the first region by using the second raw image data. According to an embodiment, the processor 160 may obtain IR image data corresponding to all regions of the color filter array by obtaining the fourth raw image data. A detailed method of obtaining the fourth raw image data by the processor 160 is the same as described above with reference to FIG. 5. - In
operation 605, the processor 160 may generate an IR image based on the second raw image data and the fourth raw image data. According to an embodiment, the processor 160 may interpolate the IR image by using the second raw image data and the fourth raw image data. For example, the processor 160 may demosaic the second raw image data and the fourth raw image data to generate an IR image. - In
operation 607, the processor 160 may provide a function of an application using the IR image. For example, when the application is an iris authentication application, the processor 160 may perform iris authentication by comparing an iris pattern included in the obtained IR image with an iris pattern stored in the memory 130. - According to an embodiment, the
processor 160 may generate an RGB image according to the method described with reference to FIG. 4. According to an example, the processor 160 may display the generated RGB image on the display 120, and provide a function using the IR image while the RGB image is displayed on the display 120. A detailed example thereof will be described with reference to FIG. 7. -
FIG. 7 is a flowchart illustrating a method of performing biometric authentication using an IR image while displaying an RGB image, according to an example. - Hereinafter, it is assumed that the
electronic device 100 of FIG. 1 performs the process of FIG. 7. In addition, an operation described as being performed by the electronic device 100 with reference to FIG. 7 may be understood as being controlled by the processor 160 of the electronic device 100. - Operations described as being performed by the
electronic device 100 may be implemented as instructions (commands) that may be performed (or executed) by the processor 160 of the electronic device 100. The instructions may be stored in, for example, a computer recording medium or the memory 130 of the electronic device 100 shown in FIG. 1. - In
operation 701, the processor 160 may execute an application associated with biometric authentication. For example, the processor 160 may execute a card payment application that authenticates a user according to a biometric authentication method. According to an embodiment, biometric authentication may correspond to iris recognition. - In
operation 703, the processor 160 may obtain image data including eyes of a user through the image sensor 110. According to an embodiment, the processor 160 may output infrared light through the IR flash 150 before operation 703. According to an embodiment, the processor 160 may continuously output infrared light through the IR flash 150 until user authentication is completed. - In
operation 705, the processor 160 may display, on the display 120, at least a portion of an RGB image generated based at least on first information obtained by an R subpixel, a G subpixel, and a B subpixel of the image data. - According to an embodiment, the first information may include the first image data set, the first raw image data, or the corrected first raw image data described above. According to an embodiment, the
processor 160 may generate an RGB image according to the method described above with reference to FIG. 4, and display at least a portion of the generated RGB image on the display 120. - In
operation 707, the processor 160 may authenticate a user based on second information obtained by the IR subpixel while the generated RGB image is displayed on the display 120. - According to an embodiment, the second information may include the second image data set, the second raw image data, or the corrected second raw image data described above. According to an embodiment of the present disclosure, the
processor 160 may generate an IR image by the method described above with reference to FIG. 6. According to an embodiment, the second information may include iris image data obtained from the eyes of the user. - According to an embodiment, the
processor 160 may perform user authentication by comparing the obtained iris image data with iris image data stored in the memory 130. - In
operation 709, the processor 160 may provide a function of an application according to a result of the user authentication. For example, when the user authentication is successful, the processor 160 may perform card payment. - According to an example, the
processor 160 may output flickering infrared light through the IR flash 150. - According to an example, in
operation 705, the processor 160 may display, on the display 120, at least a portion of the RGB image generated based at least on the first information obtained by the R subpixel, the G subpixel, and the B subpixel of the image data corresponding to a time point at which the infrared light is not output. According to an example, in operation 707, the processor 160 may authenticate the user based on the second information obtained by the IR subpixel of the image data corresponding to the time point at which the infrared light is output. -
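The time-multiplexing just described can be sketched as follows; the representation of frames as (frame, flash_on) pairs is hypothetical.

```python
# Route frames by the IR-flash state at capture time: flash-off frames feed
# the RGB preview (operation 705), flash-on frames feed iris authentication
# (operation 707). The (frame, flash_on) tuple representation is illustrative.

def route_frames(frames):
    """Split (frame, flash_on) pairs into (preview_frames, auth_frames)."""
    preview, auth = [], []
    for frame, flash_on in frames:
        (auth if flash_on else preview).append(frame)
    return preview, auth
```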
FIG. 8 is a block diagram of an electronic device in a network environment according to various embodiments. - Referring to
FIG. 8 , anelectronic device 801 may communicate with anelectronic device 802 through a first network 898 (e.g., a short-range wireless communication) or may communicate with anelectronic device 804 or aserver 808 through a second network 899 (e.g., a long-distance wireless communication) in anetwork environment 800. According to an example, theelectronic device 801 may communicate with theelectronic device 804 through theserver 808. According to an example, theelectronic device 801 may include aprocessor 820, amemory 830, aninput device 850, asound output device 855, adisplay device 860, anaudio module 870, asensor module 876, aninterface 877, ahaptic module 879, acamera module 880, apower management module 888, abattery 889, acommunication module 890, asubscriber identification module 896, and anantenna module 897. According to some examples, at least one (e.g., thedisplay device 860 or the camera module 880) among components of theelectronic device 801 may be omitted or other components may be added to theelectronic device 801. According to some examples, some components may be integrated and implemented as in the case of the sensor module 876 (e.g., a fingerprint sensor, an iris sensor, or an illuminance sensor) embedded in the display device 860 (e.g., a display). - The
processor 820 may operate, for example, software (e.g., a program 840) to control at least one other component (e.g., a hardware or software component) of the electronic device 801 connected to the processor 820 and may process and compute a variety of data. The processor 820 may load a command set or data, which is received from other components (e.g., the sensor module 876 or the communication module 890), into a volatile memory 832, may process the loaded command or data, and may store result data in a nonvolatile memory 834. According to an example, the processor 820 may include a main processor 821 (e.g., a central processing unit or an application processor) and an auxiliary processor 823 (e.g., a graphic processing device, an image signal processor, a sensor hub processor, or a communication processor), which operates independently from the main processor 821, additionally or alternatively uses less power than the main processor 821, or is specified to a designated function. In this case, the auxiliary processor 823 may operate separately from the main processor 821 or may be embedded therein. - In this case, the
auxiliary processor 823 may control, for example, at least some of functions or states associated with at least one component (e.g., thedisplay device 860, thesensor module 876, or the communication module 890) among the components of theelectronic device 801 instead of themain processor 821 while themain processor 821 is in an inactive (e.g., sleep) state or together with themain processor 821 while themain processor 821 is in an active (e.g., an application execution) state. According to an example, the auxiliary processor 823 (e.g., the image signal processor or the communication processor) may be implemented as a part of another component (e.g., thecamera module 880 or the communication module 890) that is functionally related to theauxiliary processor 823. Thememory 830 may store a variety of data used by at least one component (e.g., theprocessor 820 or the sensor module 876) of theelectronic device 801, for example, software (e.g., the program 840) and input data or output data with respect to commands associated with the software. Thememory 830 may include thevolatile memory 832 or thenonvolatile memory 834. - The
program 840 may be stored in thememory 830 as software and may include, for example, anoperating system 842, amiddleware 844, or anapplication 846. - The
input device 850 may be a device for receiving a command or data, which is used for a component (e.g., the processor 820) of theelectronic device 801, from an outside (e.g., a user) of theelectronic device 801 and may include, for example, a microphone, a mouse, or a keyboard. - The
sound output device 855 may be a device for outputting a sound signal to the outside of the electronic device 801 and may include, for example, a speaker used for general purposes, such as multimedia playback or recording playback, and a receiver used only for receiving calls. According to an example, the receiver and the speaker may be implemented either integrally or separately. - The
display device 860 may be a device for visually presenting information to the user of theelectronic device 801 and may include, for example, a display, a hologram device, or a projector and a control circuit for controlling a corresponding device. According to an example, thedisplay device 860 may include a touch circuitry or a pressure sensor for measuring an intensity of pressure on the touch. - The
audio module 870 may bidirectionally convert between sounds and electrical signals. According to an example, the audio module 870 may obtain the sound through the input device 850 or may output the sound through an external electronic device (e.g., the electronic device 802 (e.g., a speaker or a headphone)) wired or wirelessly connected to the sound output device 855 or the electronic device 801. - The
sensor module 876 may generate an electrical signal or a data value corresponding to an operating state (e.g., power or temperature) inside or an environmental state outside theelectronic device 801. Thesensor module 876 may include, for example, a gesture sensor, a gyro sensor, a barometric pressure sensor, a magnetic sensor, an acceleration sensor, a grip sensor, a proximity sensor, a color sensor, an infrared sensor, a biometric sensor, a temperature sensor, a humidity sensor, or an illuminance sensor. - The
interface 877 may support a designated protocol for connecting, wired or wirelessly, to the external electronic device (e.g., the electronic device 802). According to an example, the interface 877 may include, for example, an HDMI (high-definition multimedia interface), a USB (universal serial bus) interface, an SD card interface, or an audio interface. - A connecting
terminal 878 may include a connector that physically connects the electronic device 801 to the external electronic device (e.g., the electronic device 802), for example, an HDMI connector, a USB connector, an SD card connector, or an audio connector (e.g., a headphone connector). - The
haptic module 879 may convert an electrical signal to a mechanical stimulation (e.g., vibration or movement) or an electrical stimulation perceived by the user through tactile or kinesthetic sensations. The haptic module 879 may include, for example, a motor, a piezoelectric element, or an electric stimulator. - The
camera module 880 may capture a still image or a moving image. According to an example, the camera module 880 may include, for example, at least one lens, an image sensor, an image signal processor, or a flash. - The
power management module 888 may be a module for managing power supplied to the electronic device 801 and may serve as at least a part of a power management integrated circuit (PMIC). - The
battery 889 may be a device for supplying power to at least one component of the electronic device 801 and may include, for example, a non-rechargeable (primary) battery, a rechargeable (secondary) battery, or a fuel cell. - The
communication module 890 may establish a wired or wireless communication channel between the electronic device 801 and the external electronic device (e.g., the electronic device 802, the electronic device 804, or the server 808) and support communication through the established communication channel. The communication module 890 may include at least one communication processor operating independently from the processor 820 (e.g., the application processor) and supporting wired or wireless communication. According to an example, the communication module 890 may include a wireless communication module 892 (e.g., a cellular communication module, a short-range wireless communication module, or a GNSS (global navigation satellite system) communication module) or a wired communication module 894 (e.g., a LAN (local area network) communication module or a power line communication module) and may communicate with the external electronic device using a corresponding one of these communication modules through the first network 898 (e.g., a short-range communication network such as Bluetooth, WiFi direct, or IrDA (infrared data association)) or the second network 899 (e.g., a long-distance wireless communication network such as a cellular network, the internet, or a computer network (e.g., a LAN or WAN)). The above-mentioned various communication modules 890 may be implemented in one chip or in separate chips, respectively. - According to an example, the
wireless communication module 892 may identify and authenticate the electronic device 801 in the communication network using user information stored in the subscriber identification module 896. - The
antenna module 897 may include one or more antennas to transmit a signal or power to, or receive them from, an external source. According to an embodiment, the communication module 890 (e.g., the wireless communication module 892) may transmit or receive a signal to or from the external electronic device through an antenna suitable for the communication method. - Some of the components may be connected to each other through a communication method (e.g., a bus, a GPIO (general purpose input/output), an SPI (serial peripheral interface), or an MIPI (mobile industry processor interface)) used between peripheral devices to exchange signals (e.g., a command or data) with each other.
- According to an example, the command or data may be transmitted or received between the
electronic device 801 and the external electronic device 804 through the server 808 connected to the second network 899. Each of the electronic devices 802 and 804 may be a device of the same type as, or a different type from, the electronic device 801. According to an example, all or some of the operations performed by the electronic device 801 may be performed by another electronic device or a plurality of external electronic devices. When the electronic device 801 performs some functions or services automatically or by request, the electronic device 801 may request the external electronic device to perform at least some of the functions related to the functions or services, in addition to or instead of performing the functions or services by itself. The external electronic device receiving the request may carry out the requested function or the additional function and transmit the result to the electronic device 801. The electronic device 801 may provide the requested functions or services based on the received result as is or after additionally processing the received result. To this end, for example, cloud computing, distributed computing, or client-server computing technology may be used. -
Fig. 9 is a block diagram 900 illustrating the camera module 880 according to various embodiments. Referring to Fig. 9, the camera module 880 may include a lens assembly 910, a flash 920, an image sensor 930, an image stabilizer 940, memory 950 (e.g., buffer memory), or an image signal processor 960. The lens assembly 910 may collect light emitted or reflected from an object whose image is to be taken. The lens assembly 910 may include one or more lenses. According to an embodiment, the camera module 880 may include a plurality of lens assemblies 910. In such a case, the camera module 880 may form, for example, a dual camera, a 360-degree camera, or a spherical camera. Some of the plurality of lens assemblies 910 may have the same lens attribute (e.g., view angle, focal length, auto-focusing, f number, or optical zoom), or at least one lens assembly may have one or more lens attributes different from those of another lens assembly. The lens assembly 910 may include, for example, a wide-angle lens or a telephoto lens. - The
flash 920 may emit light that is used to reinforce light reflected from an object. According to an example, the flash 920 may include one or more light emitting diodes (LEDs) (e.g., a red-green-blue (RGB) LED, a white LED, an infrared (IR) LED, or an ultraviolet (UV) LED) or a xenon lamp. The image sensor 930 may obtain an image corresponding to an object by converting light emitted or reflected from the object and transmitted via the lens assembly 910 into an electrical signal. According to an example, the image sensor 930 may include one image sensor selected from image sensors having different attributes, such as an RGB sensor, a black-and-white (BW) sensor, an IR sensor, or a UV sensor, a plurality of image sensors having the same attribute, or a plurality of image sensors having different attributes. Each image sensor included in the image sensor 930 may be implemented using, for example, a charge-coupled device (CCD) sensor or a complementary metal oxide semiconductor (CMOS) sensor. - The
image stabilizer 940 may move the image sensor 930 or at least one lens included in the lens assembly 910 in a particular direction, or control an operational attribute (e.g., adjust the read-out timing) of the image sensor 930 in response to the movement of the camera module 880 or the electronic device 801 including the camera module 880. This makes it possible to compensate for at least part of a negative effect (e.g., image blurring) of the movement on an image being captured. According to an example, the image stabilizer 940 may sense such a movement of the camera module 880 or the electronic device 801 using a gyro sensor (not shown) or an acceleration sensor (not shown) disposed inside or outside the camera module 880. According to an example, the image stabilizer 940 may be implemented, for example, as an optical image stabilizer. - The
memory 950 may store, at least temporarily, at least part of an image obtained via the image sensor 930 for a subsequent image processing task. For example, if image capturing is delayed due to shutter lag or multiple images are quickly captured, a raw image obtained (e.g., a Bayer-patterned image or a high-resolution image) may be stored in the memory 950, and its corresponding copy image (e.g., a low-resolution image) may be previewed via the display device 860. Thereafter, if a specified condition is met (e.g., by a user's input or a system command), at least part of the raw image stored in the memory 950 may be obtained and processed, for example, by the image signal processor 960. According to an example, the memory 950 may be configured as at least part of the memory 830 or as a separate memory that is operated independently from the memory 830. - The
image signal processor 960 may perform one or more image processing operations with respect to an image obtained via the image sensor 930 or an image stored in the memory 950. The one or more image processing operations may include, for example, depth map generation, three-dimensional (3D) modeling, panorama generation, feature point extraction, image synthesizing, or image compensation (e.g., noise reduction, resolution adjustment, brightness adjustment, blurring, sharpening, or softening). Additionally or alternatively, the image signal processor 960 may perform control (e.g., exposure time control or read-out timing control) with respect to at least one (e.g., the image sensor 930) of the components included in the camera module 880. An image processed by the image signal processor 960 may be stored back in the memory 950 for further processing, or may be provided to an external component (e.g., the memory 830, the display device 860, the electronic device 802, the electronic device 804, or the server 808) outside the camera module 880. According to an example, the image signal processor 960 may be configured as at least part of the processor 820, or as a separate processor that is operated independently from the processor 820. If the image signal processor 960 is configured as a separate processor from the processor 820, at least one image processed by the image signal processor 960 may be displayed, by the processor 820, via the display device 860 as it is or after being further processed. - According to an example, the
electronic device 801 may include a plurality of camera modules 880 having different attributes or functions. In such a case, at least one of the plurality of camera modules 880 may form, for example, a wide-angle camera and at least another of the plurality of camera modules 880 may form a telephoto camera. Similarly, at least one of the plurality of camera modules 880 may form, for example, a front camera and at least another of the plurality of camera modules 880 may form a rear camera. - According to an embodiment disclosed herein, an electronic device may include an image sensor that obtains raw image data corresponding to light in an infrared band and a visible band for an external object and a processor, wherein the processor may receive a request to obtain the raw image data corresponding to the external object, generate an RGB image associated with the external object using first raw image data corresponding to the light in the visible band obtained through the image sensor, based on the request being set to be performed using a first function of the image sensor, and perform biometric authentication associated with the external object using second raw image data corresponding to the light in the infrared band obtained through the image sensor, based on the request being set to be performed using a second function of the image sensor.
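- The two functions described in this embodiment amount to a dispatch on the requested function of the single image sensor. A minimal sketch follows; the function names, dictionary shapes, and the iris-matching stub are all illustrative assumptions, not part of the patent:

```python
def handle_capture_request(raw_rgb, raw_ir, function):
    """Dispatch one image-sensor read-out to either RGB image generation
    (first function) or IR-based biometric authentication (second function).
    The return values and the matching check are illustrative stubs."""
    if function == "rgb":
        # First function: build an RGB image from visible-band raw data.
        return {"type": "rgb_image", "data": raw_rgb}
    if function == "biometric":
        # Second function: authenticate using infrared-band raw data.
        matched = raw_ir == "enrolled_iris"  # stand-in for real iris matching
        return {"type": "auth_result", "data": matched}
    raise ValueError(f"unknown function: {function}")

rgb_out = handle_capture_request("visible", "enrolled_iris", "rgb")
auth_out = handle_capture_request("visible", "enrolled_iris", "biometric")
```

Both branches consume raw data obtained from the same sensor; only the requested function decides which band's data is used.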
- According to an embodiment, the image sensor may include a first sensor pixel set arranged to respond to light passing through a filter configured to transmit at least a part of the light in the infrared band and the visible band, and a second sensor pixel set arranged to respond to light passing through a filter configured to transmit at least a part of the light in the infrared band and block the light in the visible band, and the processor may obtain raw image data including a first image data set obtained through the first sensor pixel set and a second image data set obtained through the second sensor pixel set, using the image sensor, and obtain the first raw image data or the second raw image data using at least a part of the raw image data.
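- Splitting one raw mosaic into the two image data sets can be sketched as simple strided slicing. The 2x2 R/G/B/IR unit pattern below is an assumption for illustration; the patent does not fix the subpixel arrangement:

```python
import numpy as np

# Assumed 2x2 RGB-IR unit pattern (layout is illustrative):
#   R  G
#   B  IR
def split_rgbir(raw):
    """Split a raw RGB-IR mosaic into the first image data set
    (R, G, B subpixels, whose filters also pass infrared) and the
    second image data set (IR-only subpixels)."""
    first = {
        "R": raw[0::2, 0::2],
        "G": raw[0::2, 1::2],
        "B": raw[1::2, 0::2],
    }
    second = raw[1::2, 1::2]  # IR-only sensor pixel set
    return first, second

raw = np.array([[10, 20, 11, 21],
                [30, 40, 31, 41],
                [12, 22, 13, 23],
                [32, 42, 33, 43]])
first_set, second_set = split_rgbir(raw)
```

Each returned plane is a view at half resolution per axis; a real pipeline would follow this with interpolation back to full resolution.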
- According to an embodiment, the processor may, as a part of an operation of generating the RGB image, generate the RGB image using the first image data set based on an illuminance corresponding to the raw image data satisfying a specified first range, and generate the RGB image in which a component corresponding to the infrared band is corrected in the first image data set using at least a part of the second image data set based on the illuminance satisfying a specified second range.
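- The illuminance-dependent branch above can be sketched as follows. The lux threshold and the plain per-pixel subtraction are assumptions; the patent only specifies that a first (lower) range keeps the infrared contribution while a second range corrects it:

```python
import numpy as np

LOW_LIGHT_LUX = 50.0  # assumed threshold; the patent leaves it unspecified

def generate_rgb(first_set, second_set, illuminance_lux):
    """In the first (low) illuminance range, use the first image data set
    as-is, keeping its infrared contribution for extra sensitivity; in the
    second (higher) range, subtract the IR estimate from each channel."""
    if illuminance_lux < LOW_LIGHT_LUX:
        return {c: p.copy() for c, p in first_set.items()}
    return {c: np.clip(p - second_set, 0, None) for c, p in first_set.items()}

first = {"R": np.array([120.0]), "G": np.array([90.0]), "B": np.array([60.0])}
ir = np.array([30.0])
dark = generate_rgb(first, ir, 10.0)     # first range: IR component retained
bright = generate_rgb(first, ir, 500.0)  # second range: IR component removed
```

Retaining the IR component in low light trades color accuracy for signal level, which matches the two-range behavior described here and in claim 1.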
- According to an embodiment, the processor may, as a part of an operation of generating the RGB image, correct at least a part of the second image data set using a first parameter based on a color temperature corresponding to the raw image data satisfying a specified first range, and correct at least a part of the second image data set using a second parameter based on the color temperature satisfying a specified second range, and generate the RGB image in which a component corresponding to the infrared band is corrected in the first image data set using at least a part of the corrected second image data set.
- According to an embodiment, the processor may, as a part of an operation of generating the RGB image, correct at least a part of the second image data set using the first parameter according to a color temperature corresponding to the raw image data satisfying a color temperature range of fluorescent light, and generate the RGB image in which a component corresponding to the infrared band is corrected in the first image data set using the at least a part of the second image data set which is corrected; and correct at least a part of the first image data set and at least a part of the second image data set using the second parameter according to a color temperature corresponding to the raw image data satisfying a color temperature range of incandescent light, and generate the RGB image in which a component corresponding to the infrared band is corrected in the first image data set using the at least a part of the first image data set and the at least a part of the second image data set which are corrected.
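- Selecting the correction parameter by color temperature can be sketched as below. The Kelvin ranges and scale factors are assumptions chosen only to reflect that incandescent light carries far more infrared than fluorescent light; the patent does not give numeric values:

```python
import numpy as np

FLUORESCENT_RANGE = (4000, 6500)   # Kelvin; boundaries are assumptions
INCANDESCENT_RANGE = (2000, 4000)

def correct_ir_component(rgb_plane, ir_plane, color_temp_k):
    """Subtract a scaled IR estimate from an RGB plane, choosing the
    correction parameter by color temperature (parameters illustrative)."""
    if FLUORESCENT_RANGE[0] <= color_temp_k < FLUORESCENT_RANGE[1]:
        k = 0.2  # first parameter: fluorescent light carries little IR
    elif INCANDESCENT_RANGE[0] <= color_temp_k < INCANDESCENT_RANGE[1]:
        k = 0.8  # second parameter: incandescent light is IR-rich
    else:
        k = 0.5
    return np.clip(rgb_plane - k * ir_plane, 0, None)

plane = np.array([100.0, 50.0])
ir = np.array([40.0, 40.0])
out = correct_ir_component(plane, ir, 3000)        # incandescent range
out_fluor = correct_ir_component(plane, ir, 5000)  # fluorescent range
```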
- Furthermore, according to an embodiment disclosed herein, an electronic device may include an image sensor including a color filter array, the color filter array including a first region transmitting visible light and infrared light and a second region transmitting infrared light, a memory, and a processor electrically connected to the image sensor and the memory, wherein the processor may execute an application, obtain first raw image data using a first image data set corresponding to the first region among raw image data obtained through the image sensor in response to a request to obtain an RGB image from the application, and provide the RGB image generated based at least on the obtained first raw image data to the application, and obtain second raw image data using a second image data set corresponding to the second region among the raw image data obtained through the image sensor in response to a request to obtain an IR image from the application, and provide the IR image generated based at least on the obtained second raw image data to the application.
- According to an embodiment, the first region may include a region transmitting red light and infrared light, a region transmitting green light and infrared light and a region transmitting blue light and infrared light.
- According to an embodiment, the processor may obtain the first raw image data using the first image data set according to an illuminance corresponding to the raw image data falling within a first range, and obtain the first raw image data using at least a part of the raw image data in which a component corresponding to infrared light included in the raw image data is corrected based at least on the second image data set according to the illuminance corresponding to the raw image data falling within a second range.
- According to an embodiment, the first range may be defined as being lower than a predetermined illuminance and the second range may be defined as being higher than or equal to the predetermined illuminance.
- According to an embodiment, the processor may correct a component of the raw image data corresponding to the infrared light based at least on the second image data set using a first method according to a color temperature corresponding to the raw image data falling within a first range, correct a component of the raw image data corresponding to the infrared light based at least on the second image data set using a second method different from the first method according to a color temperature corresponding to the raw image data falling within a second range, and obtain the first raw image data using at least a part of the corrected raw image data.
- According to an embodiment, the processor may correct a component of the raw image data corresponding to the infrared light further based on at least a part of the first image data set.
- According to an embodiment, the first range may be defined as being lower than a predetermined color temperature and the second range may be defined as being higher than or equal to the predetermined color temperature.
- According to an embodiment, the processor may further obtain third raw image data corresponding to the second region and including a visible light component and an infrared component based on the first raw image data in response to the request to obtain the RGB image, and remove an infrared component included in the first raw image data and the third raw image data based on the second raw image data, wherein the RGB image is generated based on the first raw image data and the third raw image data.
- According to an embodiment, the processor may further obtain fourth raw image data corresponding to the first region and including an infrared component based on the second raw image data in response to the request to obtain the IR image, and the IR image may be generated based on the second raw image data and the fourth raw image data.
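- The third and fourth raw image data both require estimating pixel values at positions the corresponding filter region does not cover (for example, IR values at R/G/B subpixel positions). A deliberately simple 4-neighbor interpolation sketch is shown below; real demosaicing is more elaborate, and the helper name is illustrative:

```python
import numpy as np

def interpolate_missing(plane, mask):
    """Fill masked positions with the mean of their valid 4-neighbors
    (a deliberately simple stand-in for real demosaicing; assumes every
    masked position has at least one unmasked neighbor)."""
    out = plane.astype(float)
    h, w = plane.shape
    for y, x in zip(*np.nonzero(mask)):
        vals = [plane[ny, nx]
                for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1))
                if 0 <= ny < h and 0 <= nx < w and not mask[ny, nx]]
        out[y, x] = sum(vals) / len(vals)
    return out

# Fourth raw image data: IR values at first-region (R/G/B) positions,
# interpolated from the surrounding IR-only subpixels.
ir = np.array([[0, 4],
               [8, 0]], dtype=float)
mask = np.array([[True, False],
                 [False, True]])  # True marks positions lacking an IR sample
filled = interpolate_missing(ir, mask)
```

The same routine applied to the visible-light mosaic with the IR positions masked would yield the third raw image data.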
- According to an example, the electronic device may further include a display, and the processor may obtain the first raw image data using the first image data set corresponding to the first region among the raw image data obtained through the image sensor in response to the request to obtain the IR image, and display the RGB image generated based at least on the obtained first raw image data on the display, and provide the IR image to the application while the RGB image is displayed on the display.
- Furthermore, according to an example disclosed herein, an electronic device may include an image sensor, the image sensor including a pixel array, the pixel array including a plurality of pixels, each of the pixels including an R subpixel, a G subpixel, a B subpixel, and an IR subpixel, a display, a memory that stores instructions, and a processor electrically connected to the image sensor, the display and the memory, wherein the processor may execute instructions to perform an application associated with biometric authentication, obtain image data including eyes of a user through the image sensor, display, on the display, at least a portion of an RGB image generated based at least on first information obtained by the R subpixel, the G subpixel and the B subpixel among the image data, authenticate the user based on second information obtained by the IR subpixel while the generated RGB image is displayed on the display, and provide a function of the application according to a result of the authentication.
- According to an example, the second information may include iris image data obtained from the eyes of the user, and the biometric authentication may correspond to iris authentication.
- According to an example, the electronic device may further include an IR flash, and the processor may execute the instructions to output infrared light in a flickering manner through the IR flash, display, on the display, at least a part of the RGB image generated based at least on the first information obtained by the R subpixel, the G subpixel, and the B subpixel among the image data corresponding to a time point at which the infrared light is not output, and authenticate the user based on the second information obtained by the IR subpixel among the image data at a time point at which the infrared light is output.
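- The flickering scheme above routes each frame by IR-flash state: flash-off frames feed the RGB preview, flash-on frames feed the IR-based authentication. A minimal sketch, with the frame dictionary layout as an assumption:

```python
def route_frames(frames):
    """Route alternating frames: flash-off frames feed the RGB preview
    (first information), flash-on frames feed IR-based iris
    authentication (second information)."""
    preview, auth = [], []
    for frame in frames:
        if frame["ir_flash_on"]:
            auth.append(frame["ir_data"])      # second information
        else:
            preview.append(frame["rgb_data"])  # first information
    return preview, auth

frames = [
    {"ir_flash_on": False, "rgb_data": "rgb0", "ir_data": "ir0"},
    {"ir_flash_on": True,  "rgb_data": "rgb1", "ir_data": "ir1"},
    {"ir_flash_on": False, "rgb_data": "rgb2", "ir_data": "ir2"},
]
preview, auth = route_frames(frames)
```

This is why the RGB preview can stay on screen while authentication runs: the two streams come from interleaved frames of the same sensor.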
- According to an example, the processor may execute the instructions to correct an infrared component included in the first information based at least on the second information and display, on the display, at least a portion of the RGB image generated based at least on the first information in which the infrared component is corrected.
- According to an embodiment, the processor may execute the instructions to correct the infrared component included in the first information based at least on the second information according to a first method when a color temperature corresponding to the image data falls within a first range and correct the infrared component included in the first information based at least on the second information according to a second method different from the first method when the color temperature corresponding to the image data falls within a second range.
- The electronic device according to various examples disclosed in the present disclosure may be various types of devices. The electronic device may include, for example, at least one of a portable communication device (e.g., a smartphone), a computer device, a portable multimedia device, a mobile medical appliance, a camera, a wearable device, or a home appliance. The electronic device according to an embodiment of the present disclosure should not be limited to the above-mentioned devices.
- It should be understood that various embodiments of the present disclosure and the terms used in the embodiments are not intended to limit the technologies disclosed in the present disclosure to the particular forms disclosed herein; rather, the present disclosure should be construed to cover various modifications, according to the appended claims. With regard to the description of the drawings, similar components may be assigned similar reference numerals. As used herein, singular forms may include plural forms as well unless the context clearly indicates otherwise. In the present disclosure, the expressions "A or B", "at least one of A and/or B", "A, B, or C", or "one or more of A, B, and/or C", and the like may include any and all combinations of one or more of the associated listed items. The expressions "a first", "a second", "the first", or "the second", used herein, may refer to various components regardless of order and/or importance, but do not limit the corresponding components. The above expressions are used merely for the purpose of distinguishing one component from other components. It should be understood that when a component (e.g., a first component) is referred to as being (operatively or communicatively) "connected" or "coupled" to another component (e.g., a second component), it may be directly connected or coupled to the other component, or any other component (e.g., a third component) may be interposed between them.
- The term "module" used herein may represent, for example, a unit including one or more combinations of hardware, software and firmware. The term "module" may be interchangeably used with the terms "logic", "logical block", "part" and "circuit". The "module" may be a minimum unit of an integrated part or may be a part thereof. The "module" may be a minimum unit for performing one or more functions or a part thereof. For example, the "module" may include an application-specific integrated circuit (ASIC).
- Various embodiments of the present disclosure may be implemented by software (e.g., the program 840) including instructions stored in a machine-readable storage medium (e.g., an internal memory 836 or an external memory 838) readable by a machine (e.g., a computer). The machine may be a device that calls an instruction from the machine-readable storage medium and operates according to the called instruction, and may include the electronic device (e.g., the electronic device 801). When the instruction is executed by the processor (e.g., the processor 820), the processor may perform a function corresponding to the instruction directly or using other components under the control of the processor. The instruction may include code generated or executed by a compiler or an interpreter. The machine-readable storage medium may be provided in the form of a non-transitory storage medium. Here, the term "non-transitory", as used herein, is a limitation on the medium itself (i.e., tangible, not a signal), as opposed to a limitation on data storage persistence. - According to an example, the method according to various embodiments disclosed in the present disclosure may be provided as a part of a computer program product. The computer program product may be traded between a seller and a buyer as a product. The computer program product may be distributed in the form of a machine-readable storage medium (e.g., a compact disc read-only memory (CD-ROM)) or may be distributed online through an application store (e.g., Play Store™). In the case of online distribution, at least a portion of the computer program product may be temporarily stored or generated in a storage medium such as a memory of a manufacturer's server, an application store's server, or a relay server.
- Each component (e.g., a module or a program) according to various embodiments may include at least one of the above components, and a portion of the above sub-components may be omitted, or additional other sub-components may be further included. Alternatively or additionally, some components (e.g., a module or a program) may be integrated into one component and may perform the same or similar functions performed by each corresponding component prior to the integration. Operations performed by a module, a program, or other components according to various embodiments of the present disclosure may be executed sequentially, in parallel, repeatedly, or in a heuristic manner. Also, at least some operations may be executed in a different sequence or omitted, or other operations may be added.
- While the present disclosure has been shown and described with reference to various embodiments thereof, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the scope of the appended claims.
Claims (13)
- An electronic device comprising: an image sensor (110) configured to obtain raw image data corresponding to light in an infrared band and a visible band for an external object; and a processor (160); wherein the processor is configured to: receive a request to obtain (303) the raw image data corresponding to the external object; generate (307a) an RGB image associated with the external object using first raw image data (510) by the processor being further configured to: if an illuminance corresponding to the raw image data falls within a first range, obtain (401) the first raw image data using a first image data set corresponding to a first region (210) among raw image data obtained through the image sensor (110); or if the illuminance corresponding to the raw image data falls within a second range, obtain (409) the first raw image data using at least a part of the raw image data in which a component corresponding to infrared light included in the raw image data is corrected in the first image data set based at least on a second image data set corresponding to a second region (220) among the raw image data obtained through the image sensor, wherein the first range is defined as being lower than a predetermined illuminance and the second range is defined as being higher than or equal to the predetermined illuminance; and
generate (415) the RGB image based on the request being set to be performed using a first function of the image sensor, wherein the RGB image generated according to the illuminance corresponding to the raw image data falling within the first range includes an infrared component, and the RGB image generated according to the illuminance corresponding to the raw image data falling within the second range does not include the infrared component; and perform (309b) biometric authentication associated with the external object using second raw image data (520) corresponding to the light in the infrared band obtained through the image sensor, based on the request being set to be performed using a second function of the image sensor. - The electronic device of claim 1, wherein the image sensor (110) includes a first sensor pixel set arranged to respond to light passing through a filter configured to transmit at least a part of the light in the infrared band and the visible band, and a second sensor pixel set arranged to respond to light passing through a filter configured to transmit at least a part of the light in the infrared band and block the light in the visible band;
wherein the processor (160) is configured to: obtain (303) the raw image data including the first image data set obtained through the first sensor pixel set and the second image data set obtained through the second sensor pixel set, using the image sensor; and obtain (305a or 305b) the first raw image data or the second raw image data using at least a part of the raw image data. - The electronic device of claim 2, wherein the processor is configured to, as a part of an operation of generating the RGB image: correct (407a) at least a part of the second image data set using a first parameter based on a color temperature corresponding to the raw image data satisfying a specified first range; or correct (407b) at least a part of the second image data set using a second parameter based on the color temperature satisfying a specified second range; and generate (415) the RGB image in which a component corresponding to the infrared band is corrected in the first image data set, using at least a part of the corrected second image data set.
- The electronic device of claim 3, wherein the processor is configured to, as a part of an operation of generating the RGB image: correct at least a part of the second image data set using the first parameter according to a color temperature corresponding to the raw image data satisfying a color temperature range of fluorescent light, and generate the RGB image in which a component corresponding to the infrared band is corrected in the first image data set, using the at least a part of the second image data set which is corrected; and correct at least a part of the first image data set and at least a part of the second image data set using the second parameter according to a color temperature corresponding to the raw image data satisfying a color temperature range of incandescent light, and generate the RGB image in which a component corresponding to the infrared band is corrected in the first image data set, using the at least a part of the first image data set and the at least a part of the second image data set which are corrected.
- The electronic device of claim 1, wherein the processor is configured to: execute (301) an application; and obtain (305b) the second raw image data using the second image data set corresponding to the second region among raw image data obtained through the image sensor in response to a request to obtain an IR image from the application, and provide the IR image generated based at least on the obtained second raw image data to the application.
- The electronic device of any of claims 1 to 5, wherein the first region (210) includes a region transmitting red light and infrared light, a region transmitting green light and infrared light, and a region transmitting blue light and infrared light.
- The electronic device of any of claims 1 to 6, wherein the first range is defined as being lower than a predetermined illuminance, and the second range is defined as being higher than or equal to the predetermined illuminance.
- The electronic device of claim 1, wherein the processor is configured to:
correct (407a) a component of the raw image data corresponding to the infrared light based at least on the second image data set using a first method according to a color temperature corresponding to the raw image data falling within a first range; or
correct (407b) a component of the raw image data corresponding to the infrared light based at least on the second image data set using a second method different from the first method according to a color temperature corresponding to the raw image data falling within a second range; and
obtain (409) the first raw image data using at least a part of the corrected raw image data.
- The electronic device of claim 8, wherein the processor is further configured to correct a component of the raw image data corresponding to the infrared light further based on at least a part of the first image data set.
- The electronic device of claim 8, wherein the first range is defined as being lower than a predetermined color temperature, and the second range is defined as being higher than or equal to the predetermined color temperature.
- The electronic device of any of claims 1 to 10, wherein the processor is configured to:
further obtain (411) third raw image data (530) corresponding to the second region and including a visible light component and an infrared component based on the first raw image data in response to the request to obtain the RGB image; and
remove (413) an infrared component included in the first raw image data and the third raw image data based on the second raw image data;
wherein the RGB image is generated based on the first raw image data and the third raw image data.
- The electronic device of claim 5, wherein the processor is configured to further obtain (411) fourth raw image data (540) corresponding to the first region and including an infrared component based on the second raw image data in response to the request to obtain the IR image;
wherein the IR image is generated based on the second raw image data and the fourth raw image data.
- The electronic device of claim 5, further comprising:
a display (860);
wherein the processor is configured to further obtain (305a) the first raw image data using the first image data set corresponding to the first region among the raw image data obtained through the image sensor in response to the request to obtain the IR image, display (705) the RGB image generated based at least on the obtained first raw image data on the display, and provide the IR image to the application while the RGB image is displayed on the display.
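The correction recited in the claims above selects between two parameter sets by color temperature and then removes the infrared component from the visible channels. The sketch below illustrates that selection logic only; every numeric threshold and weight is assumed for illustration (the claims do not publish concrete values), and the single scalar per range stands in for the claimed "first parameter" and "second parameter".

```python
import numpy as np

# Assumed color-temperature ranges (Kelvin) -- illustrative, not from the patent.
FLUORESCENT_K = (3500.0, 5500.0)   # stands in for "color temperature range of fluorescent light"
INCANDESCENT_K = (2000.0, 3500.0)  # stands in for "color temperature range of incandescent light"

def correct_ir_component(raw, color_temp_k):
    """Subtract the infrared contribution from an R/G/B/IR raw frame.

    raw: float array of shape (H, W, 4), channels ordered R, G, B, IR,
         the IR channel standing in for the "second image data set".
    Returns an (H, W, 3) RGB array with the IR component removed.
    """
    r, g, b, ir = raw[..., 0], raw[..., 1], raw[..., 2], raw[..., 3]

    if FLUORESCENT_K[0] <= color_temp_k < FLUORESCENT_K[1]:
        ir_est = 0.8 * ir   # assumed "first parameter": fluorescent light carries little IR
    elif INCANDESCENT_K[0] <= color_temp_k < INCANDESCENT_K[1]:
        ir_est = 1.2 * ir   # assumed "second parameter": incandescent light is IR-rich
    else:
        ir_est = ir         # outside both ranges, use the unscaled IR data

    # Remove the estimated IR component from each visible channel and clamp at zero.
    rgb = np.stack([r - ir_est, g - ir_est, b - ir_est], axis=-1)
    return np.clip(rgb, 0.0, None)
```

In practice the parameters would be per-channel calibration values measured for the specific sensor and IR-transmitting color filter array; the scalar weights here only show where the range-dependent branch sits in the pipeline.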
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020170120616A KR102407200B1 (en) | 2017-09-19 | 2017-09-19 | Electronic device for providing function using RGB image and IR image acquired through one image sensor |
PCT/KR2018/011047 WO2019059635A1 (en) | 2017-09-19 | 2018-09-19 | Electronic device for providing function by using rgb image and ir image acquired through one image sensor |
Publications (3)
Publication Number | Publication Date |
---|---|
EP3675477A1 EP3675477A1 (en) | 2020-07-01 |
EP3675477A4 EP3675477A4 (en) | 2020-10-28 |
EP3675477B1 true EP3675477B1 (en) | 2023-03-15 |
Family
ID=65809756
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP18857514.6A Active EP3675477B1 (en) | 2017-09-19 | 2018-09-19 | Electronic device for providing function by using rgb image and ir image acquired through one image sensor |
Country Status (4)
Country | Link |
---|---|
US (1) | US11146720B2 (en) |
EP (1) | EP3675477B1 (en) |
KR (1) | KR102407200B1 (en) |
WO (1) | WO2019059635A1 (en) |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20210077901A (en) * | 2019-12-18 | 2021-06-28 | 엘지전자 주식회사 | Apparatus and Method for Obtaining Image |
KR20220068034A (en) | 2020-11-18 | 2022-05-25 | 삼성전자주식회사 | A camera module and an electronic device including the camera module |
WO2023277219A1 (en) * | 2021-06-30 | 2023-01-05 | 한국전자기술연구원 | Lightweight deep learning processing device and method for vehicle to which environmental change adaptive feature generator is applied |
Family Cites Families (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6759646B1 (en) | 1998-11-24 | 2004-07-06 | Intel Corporation | Color interpolation for a four color mosaic pattern |
JP4710319B2 (en) * | 2004-12-27 | 2011-06-29 | カシオ計算機株式会社 | Imaging apparatus and program thereof |
KR20080029051A (en) * | 2006-09-28 | 2008-04-03 | 엠텍비젼 주식회사 | Device having image sensor and method for getting image |
KR101437849B1 (en) | 2007-11-21 | 2014-09-04 | 삼성전자주식회사 | Portable terminal and method for performing shooting mode thereof |
KR100905269B1 (en) * | 2008-11-27 | 2009-06-29 | 크라제비전(주) | Image sensor with infrared ray correction |
US20170161557A9 (en) * | 2011-07-13 | 2017-06-08 | Sionyx, Inc. | Biometric Imaging Devices and Associated Methods |
US9661230B2 (en) | 2013-07-05 | 2017-05-23 | Lg Electronics Inc. | Image display apparatus and method of operating the image display apparatus |
KR102206382B1 (en) * | 2013-07-05 | 2021-01-22 | 엘지전자 주식회사 | Image display device and operation method of the image display device |
KR101588225B1 (en) * | 2014-01-15 | 2016-01-25 | 주식회사 아이리시스 | Iris data Registration and Authentication device and the method in the camera of a mobile |
CN106664394B (en) | 2014-06-24 | 2018-10-02 | 麦克赛尔株式会社 | Camera treatment device and image pickup processing method |
KR20170013082A (en) * | 2015-07-27 | 2017-02-06 | 엘지전자 주식회사 | Mobile terminal and method for controlling the same |
KR102480600B1 (en) * | 2015-10-21 | 2022-12-23 | 삼성전자주식회사 | Method for low-light image quality enhancement of image processing devices and method of operating an image processing system for performing the method |
JP6734647B2 (en) | 2015-12-23 | 2020-08-05 | マクセル株式会社 | Imaging device |
- 2017-09-19: KR KR1020170120616A patent/KR102407200B1/en active IP Right Grant
- 2018-09-19: US US16/648,449 patent/US11146720B2/en active Active
- 2018-09-19: EP EP18857514.6A patent/EP3675477B1/en active Active
- 2018-09-19: WO PCT/KR2018/011047 patent/WO2019059635A1/en unknown
Also Published As
Publication number | Publication date |
---|---|
KR102407200B1 (en) | 2022-06-10 |
US11146720B2 (en) | 2021-10-12 |
EP3675477A1 (en) | 2020-07-01 |
EP3675477A4 (en) | 2020-10-28 |
US20200228689A1 (en) | 2020-07-16 |
KR20190032101A (en) | 2019-03-27 |
WO2019059635A1 (en) | 2019-03-28 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
KR102338576B1 (en) | Electronic device which stores depth information associating with image in accordance with Property of depth information acquired using image and the controlling method thereof | |
KR102385360B1 (en) | Electronic device performing image correction and operation method of thereof | |
KR102328539B1 (en) | Electronic device for acquiring image using plurality of cameras and method for processing image using the same | |
US10979612B2 (en) | Electronic device comprising plurality of cameras using rolling shutter mode | |
KR102318013B1 (en) | Electronic device composing a plurality of images and method | |
KR102452564B1 (en) | Apparatus and method for estimating optical image stabilization motion | |
KR102663537B1 (en) | electronic device and method of image processing | |
US11626447B2 (en) | Electronic device comprising image sensor for identifying an operation setting and an external environmental condition and method of operation thereof | |
EP3675477B1 (en) | Electronic device for providing function by using rgb image and ir image acquired through one image sensor | |
TWI785162B (en) | Method of providing image and electronic device for supporting the method | |
US11238279B2 (en) | Method for generating plural information using camera to sense plural wave bandwidth and apparatus thereof | |
US11558587B2 (en) | Camera module comprising complementary color filter array and electronic device comprising same | |
US20210211615A1 (en) | Electronic device comprising image sensor and method of operation thereof | |
US20200204747A1 (en) | Electronic device and method for obtaining data from second image sensor by means of signal provided from first image sensor | |
US11354777B2 (en) | Image processing device and method of electronic device | |
KR102418852B1 (en) | Electronic device and method for controlling an image display | |
US20220360713A1 (en) | Electronic device including camera | |
EP4171019A1 (en) | Electronic device comprising image sensor, and method for controlling same | |
US20200260023A1 (en) | Electronic device and image up-sampling method for electronic device | |
US11323654B2 (en) | Electronic device and method capable of compressing image on basis of attributes of image data | |
EP4329286A1 (en) | Electronic device for synchronizing lens driving information with image | |
KR20190111506A (en) | Electronic device for processing image with external electronic device acquired and method for operating thereof |
Legal Events

Date | Code | Title | Description
---|---|---|---
| STAA | Information on the status of an ep patent application or granted ep patent | STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE
| PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase | ORIGINAL CODE: 0009012
| STAA | Information on the status of an ep patent application or granted ep patent | STATUS: REQUEST FOR EXAMINATION WAS MADE
20200326 | 17P | Request for examination filed |
| AK | Designated contracting states | Kind code of ref document: A1; designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR
| AX | Request for extension of the european patent | Extension state: BA ME
20200929 | A4 | Supplementary search report drawn up and despatched |
| RIC1 | Information provided on ipc code assigned before grant | Ipc (20200923, BHEP): H04N 5/225 (2006.01) AFI; G06K 9/20 (2006.01) ALI; H04N 9/64 (2006.01) ALI; G06K 9/00 (2006.01) ALI; H04N 9/04 (2006.01) ALI; G06F 21/32 (2013.01) ALI; H04N 5/33 (2006.01) ALI; G06T 7/70 (2017.01) ALI; H04N 9/73 (2006.01) ALI
| DAV | Request for validation of the european patent (deleted) |
| DAX | Request for extension of the european patent (deleted) |
| STAA | Information on the status of an ep patent application or granted ep patent | STATUS: EXAMINATION IS IN PROGRESS
20210813 | 17Q | First examination report despatched |
| RIC1 | Information provided on ipc code assigned before grant | Ipc (20220819, BHEP): H04N 5/225 (2006.01) AFI; G06V 40/19 (2022.01) ALI; G06V 10/143 (2022.01) ALI; H04N 9/73 (2006.01) ALI; H04N 9/04 (2006.01) ALI; G06F 21/32 (2013.01) ALI; G06T 7/70 (2017.01) ALI; G06K 9/00 (2006.01) ALI; H04N 9/64 (2006.01) ALI; H04N 5/33 (2006.01) ALI
| GRAP | Despatch of communication of intention to grant a patent | ORIGINAL CODE: EPIDOSNIGR1
| STAA | Information on the status of an ep patent application or granted ep patent | STATUS: GRANT OF PATENT IS INTENDED
20220930 | INTG | Intention to grant announced |
| RIN1 | Information on inventor provided before grant (corrected) | Inventors: WON, JONG HUN; BOREGOWDA, LOKESH RAYASANDRA; YOON, YOUNG KWON; KANG, HWA YONG; NAIR, PRAJIT SIVASANKARAN; KIM, DONG SOO
| GRAS | Grant fee paid | ORIGINAL CODE: EPIDOSNIGR3
| GRAA | (expected) grant | ORIGINAL CODE: 0009210
| STAA | Information on the status of an ep patent application or granted ep patent | STATUS: THE PATENT HAS BEEN GRANTED
| AK | Designated contracting states | Kind code of ref document: B1; designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR
| REG | Reference to a national code | CH: EP; GB: FG4D
| REG | Reference to a national code | DE: R096; ref document number: 602018047324
| REG | Reference to a national code | IE: FG4D
20230415 | REG | Reference to a national code | AT: REF; ref document number: 1554644; kind code: T
| REG | Reference to a national code | LT: MG9D
20230315 | REG | Reference to a national code | NL: MP
| PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] | Lapse because of failure to submit a translation of the description or to pay the fee within the prescribed time-limit: RS, LV, LT, HR (effective 20230315); NO (effective 20230615)
20230315 | REG | Reference to a national code | AT: MK05; ref document number: 1554644; kind code: T
| PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] | Lapse because of failure to submit a translation of the description or to pay the fee within the prescribed time-limit: SE, NL, FI (effective 20230315); GR (effective 20230616)
| PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] | Lapse because of failure to submit a translation of the description or to pay the fee within the prescribed time-limit: SM, RO, ES, EE, CZ, AT (effective 20230315); PT (effective 20230717)
| PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] | Lapse because of failure to submit a translation of the description or to pay the fee within the prescribed time-limit: SK, PL (effective 20230315); IS (effective 20230715)
| PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] | DE: payment date 20230822, year of fee payment 6
| REG | Reference to a national code | DE: R097; ref document number: 602018047324
| PLBE | No opposition filed within time limit | ORIGINAL CODE: 0009261
| STAA | Information on the status of an ep patent application or granted ep patent | STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT
| PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] | Lapse because of failure to submit a translation of the description or to pay the fee within the prescribed time-limit: SI, DK (effective 20230315)
20231218 | 26N | No opposition filed |
| REG | Reference to a national code | CH: PL
| PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] | Lapse because of non-payment of due fees: LU (effective 20230919)
20230930 | REG | Reference to a national code | BE: MM
20230919 | GBPC | Gb: european patent ceased through non-payment of renewal fee |
| PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] | Lapse because of failure to submit a translation of the description or to pay the fee within the prescribed time-limit: MC, IT (effective 20230315); lapse because of non-payment of due fees: LU (effective 20230919)