WO2020047859A1 - Electronic apparatus, image correction method, and non-transitory computer-readable recording medium
- Publication number: WO2020047859A1 (application PCT/CN2018/104679)
- Authority: WIPO (PCT)
Classifications
- H04N 23/60: Control of cameras or camera modules comprising electronic image sensors
- H04N 23/10: Cameras or camera modules comprising electronic image sensors, for generating image signals from different wavelengths
- H04N 23/57: Mechanical or electrical details of cameras or camera modules specially adapted for being embedded in other devices
- H04N 23/80: Camera processing pipelines; components thereof
- H04M 1/0264: Details of the structure or mounting of specific components for a camera module assembly
Definitions
- in the above-described embodiment, the CPU 31 determines a color value of a pixel 4t of the emission map by so-called bilinear interpolation, so the CPU 31 can easily generate the emission map with a low computational load.
- however, the method of determining color values of the pixels of the emission map is not confined to so-called bilinear interpolation. A nonlinear, trilinear, or multilinear interpolation may be applied to the above-described embodiment. The CPU 31 may also calculate the color value of a pixel based on not only the four neighboring pixels but also another number of pixels, such as 16 neighboring pixels. The accuracy of the emission map increases with the complexity of the interpolation, but so does the amount of calculation. Therefore, an appropriate calculation method may be adopted in consideration of the tradeoff between accuracy and computational load.
- the CPU 31 may calculate color values of the emission map by a method not using interpolation. If the processing capability of the CPU 31 permits, the CPU 31 may calculate the color value of each pixel of the emission map based on the emission colors of the plurality of light emitting elements and the zoom magnification. If the capacity of the memory permits, a correspondence table in which the color value of each pixel of the emission map is predefined with respect to the emission colors of the light emitting elements 23 may be stored in the memory 32.
- in step S13 of FIG. 4, an example is described in which the CPU 31 determines color values of pixels in a corrected image by subtracting the color values of the corresponding pixels in the emission map from the color values of the corresponding pixels in the image obtained by the camera 10. However, the method of determining color values of pixels in a corrected image is not confined to the above-described method. For example, when calculating a color value of a pixel corresponding to a pixel of a dark color in the emission map, the CPU 31 may calculate the color value by interpolating from the pixels surrounding the pixel of the image obtained by the camera 10 that corresponds to the pixel of the emission map.
- a threshold color value may be stored in advance in the memory 32, and the CPU 31 may determine whether the color value of a pixel of the emission map is larger than the threshold color value. When it is, the CPU 31 may calculate the color value of the corresponding pixel in the corrected image by interpolating from surrounding pixels of the image obtained by the camera 10 as described above. Otherwise, the CPU 31 may calculate the color value of the corresponding pixel in the corrected image by subtracting the color value of the pixel in the emission map from the color value of the corresponding pixel in the image obtained by the camera 10. In this way, even if the influence of the light of the display is so strong that the color of the object in front of the display cannot be completely restored by subtraction, an image of the object in front of the electronic apparatus can be acquired that is substantially free of the influence of light emitted by the display.
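The threshold-based choice described above can be sketched as follows; the threshold value of 200 and the plain averaging of the surrounding captured pixels are assumptions for illustration, not values given in the embodiment.

```python
def corrected_value(captured, emission, surrounding, threshold=200):
    """Determine one color value of the corrected image.

    If the emission value exceeds the stored threshold, the display
    light is assumed too strong for subtraction to restore the object
    color, so the value is interpolated (here: averaged) from the
    surrounding pixels of the captured image; otherwise the emission
    value is subtracted, clamped at zero.
    """
    if emission > threshold:
        return sum(surrounding) / len(surrounding)
    return max(0, captured - emission)
```

For example, a pixel whose emission value is 240 would be replaced by the average of its neighbors, while a pixel whose emission value is 50 would simply have 50 subtracted from it.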
Abstract
An electronic apparatus includes a display, a camera, and a processor. The display is light-transmissive and is configured to emit light. The camera is behind the display and is configured to capture an image. The processor is configured to correct the image captured by the camera by removing an influence of the light emitted by the display.
Description
The present disclosure relates to an electronic apparatus, an image correction method, and a non-transitory computer-readable recording medium.
In electronic apparatuses typified by cellphones, the screen is spread as widely as possible over the display unit. However, in an electronic apparatus on which a front-facing camera is mounted for so-called selfies, a notch in the display is provided at an end of the apparatus, and the camera is installed there so as not to be affected by light emitted by the display. Such a configuration restricts the size of the display, and a placement restriction arises in that the camera can only be installed at the edge of the apparatus. Therefore, electronic apparatuses have been developed in which a front-facing camera is arranged on the back side of the display. In such an electronic apparatus, the front-facing camera is available for imaging when the status of the display is "inactive", but when the statuses of both the display and the camera are "active" at the same time, images captured by the camera are degraded by light emitted by the display. Therefore, an electronic apparatus is desired in which, even though the camera is placed behind the display, the images captured by the camera are little degraded by light emitted by the display.
The present disclosure has been created in view of the foregoing circumstances, and an objective of the disclosure is to provide an electronic apparatus that is capable of obtaining a good-quality image even if a front-facing camera is placed behind a display.
SUMMARY
To achieve the above-described objective, an electronic apparatus according to a first aspect of the present disclosure includes a display, a camera, and a processor. The display is light-transmissive and is configured to emit light. The camera is behind the display and is configured to capture an image. The processor is configured to correct the image captured by the camera by removing an influence of the light emitted by the display.
In the electronic apparatus of the aforementioned aspect, the electronic apparatus may further include a memory configured to store display information representing information to be displayed on the display screen, wherein the processor may generate an emission map obtained from the display information stored in the memory and correct the image captured by the camera based on the emission map.
In the electronic apparatus of the aforementioned aspect, the processor may generate the emission map by interpolating emission data obtained from the display information stored in the memory.
In the electronic apparatus of the aforementioned aspect, the processor may calculate a color value of each pixel of the emission map using bilinear interpolation.
In the electronic apparatus of the aforementioned aspect, the display may further include: a light emitting member to emit the light; and a reflector to reflect light emitted by the light emitting member, the reflector defining a hole in front of the camera.
To achieve the above-described objective, an image correction method according to a second aspect of the present disclosure includes: capturing an image through a light-transmissive and light-emitting display from the back of the display; and correcting the captured image by removing an influence of the light emitted by the display.
To achieve the above-described objective, a non-transitory computer-readable recording medium according to a third aspect of the present disclosure stores a program causing a computer to execute an image correction process. The image correction process includes: capturing an image through a light-transmissive and light-emitting display from the back of the display; and correcting the captured image by removing an influence of the light emitted by the display.
A more complete understanding of this application can be obtained when the following detailed description is considered in conjunction with the following drawings, in which:
FIG. 1 is a front view of a cellphone according to an embodiment of the present disclosure;
FIG. 2 is a cross-sectional view of the cellphone taken along line A-A of FIG. 1;
FIG. 3 is a hardware configuration diagram of a cellphone according to an embodiment of the present disclosure;
FIG. 4 is a flowchart of image correction processing according to an embodiment of the present disclosure;
FIG. 5 is an illustration for explaining light entering a camera according to an embodiment of the present disclosure;
FIG. 6 is an illustration for explaining the relationship between an image captured by a camera and a display according to an embodiment of the present disclosure; and
FIG. 7 is an illustration for explaining an emission map according to an embodiment of the present disclosure.
The following describes embodiments of the present disclosure with reference to the drawings. In these embodiments, the electronic apparatus according to the present disclosure is described as a cellphone including a front-facing camera.
As illustrated in FIG. 1, and in FIG. 2, which is a cross-sectional view taken along line A-A of FIG. 1, a cellphone 1 according to the present embodiment includes a camera 10 and a display 20.
The camera 10 is arranged on the back side of the display 20 and faces toward the front of the cellphone 1. The camera 10 includes a lens 11 and a body 12. A light receiving element (not illustrated) in the body 12 receives light having passed through the lens 11, generates a digital signal representing a captured color image, and transmits the generated digital signal to the cellphone 1. The digital signal includes image data representing a bitmap image including multiple pixels, and each pixel has color information represented by a value from 0 to 255.
The display 20 is arranged on the front face of the cellphone 1. The display 20 is a light-transmissive display, such as an organic light emitting diode (OLED) display. The display 20 includes a light emitting member 21 and a reflector 22. The light emitting member 21 includes multiple light emitting elements. Each of the light emitting elements 23 (illustrated in FIG. 5) emits light. The light is emitted toward the front and the back of the cellphone 1.
The reflector 22 reflects the light emitted toward the back of the cellphone 1 so as to ensure visibility. The reflector 22 defines a hole in front of the lens 11, and the light emitting member 21 transmits light, so that the lens 11 can receive light from the front side of the cellphone 1.
As illustrated in FIG. 3, the cellphone 1 includes the camera 10, the display 20, and a controller 30. The controller 30 is connected to the camera 10 and the display 20 so as to communicate with the camera 10 and the display 20. The controller 30 includes a central processing unit (CPU) 31 and a memory 32.
The CPU 31 executes various types of processing by executing a program for control, which is stored in the memory 32.
The memory 32 stores various data such as a program for the CPU 31, the image data received from the camera 10, and display information for display on the display 20. The memory 32 further stores structural feature information such as a distance between the light emitting member 21 and the lens 11, a size of the lens 11, and a distance between the adjacent light emitting elements of the light emitting elements 23.
Various processing executed by the cellphone 1 is described concretely below with reference to the drawings. In response to a so-called pinch-in or pinch-out operation on the surface of the display 20 by a user of the cellphone 1, the CPU 31 determines a zoom magnification of the camera 10 and stores the determined zoom magnification in the memory 32. Thereafter, in response to operation of a switch (not illustrated) of the cellphone 1 or a depression of the surface of the display 20 by the user, the CPU 31 controls the camera 10 to capture an image, and starts the "image correction processing" illustrated in FIG. 4. The image correction processing is a process for removing the influence of the light emitted by the display 20 from the image captured by the camera 10.
Upon starting the image correction processing, the CPU 31 obtains the image from the camera 10 (step S11). The image captured by the camera 10 includes environment light, illustrated as arrows 2 in FIG. 5, and the emission light emitted by the light emitting elements 23, illustrated as arrows 3 in FIG. 5. The emission light emitted by the light emitting elements 23 travels mainly toward the front and the back of the cellphone 1. In other words, there is very little light emission in the diagonal direction. Therefore, most of the outgoing light emitted from one of the light emitting elements 23 reaches the portion of the lens 11 just behind that light emitting element.
Thereafter, returning to FIG. 4, the CPU 31 generates an emission map by using bilinear interpolation (step S12). The emission map indicates how light emitted from the display 20 affects the image captured by the camera 10, and is expressed by color information of the same number of pixels as the image. Each pixel of the emission map has color information represented by a value of the same 0–255 range as that of the image.
As illustrated in FIG. 6, the light emitting member 21 includes a plurality of light emitting elements that are independently controlled to emit various colors of light. The light emitting elements are regularly arranged with a little clearance therebetween. For example, light emitting elements 23a, 23b, 23c, and 23d are arranged side by side to form a quadrangle. A lens portion 11a is just behind the light emitting element 23a. Similarly, lens portions 11b, 11c, and 11d are just behind light emitting elements 23b, 23c, and 23d, respectively.
Specifically, in step S12, the CPU 31 identifies the pixels of the emission map 4, each of which is the pixel most affected by a respective light emitting element, based on the zoom magnification and the structural feature information stored in the memory 32, such as the distance between the light emitting member 21 and the lens 11, the size of the lens 11, and the distance between the adjacent light emitting elements of the light emitting elements 23. For example, the CPU 31 calculates that the light emitted from the light emitting element 23a and passed through the lens portion 11a is projected onto a pixel 4a at the zoom magnification stored in the memory 32, so the CPU 31 identifies the pixel 4a as the pixel most affected by the light emitting element 23a. Similarly, the CPU 31 identifies pixels 4b, 4c, and 4d as the pixels most affected by the light emitting elements 23b, 23c, and 23d, respectively, at the zoom magnification stored in the memory 32.
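As a rough sketch of this identification step: since the elements emit almost no diagonal light, an element can be assumed to project straight back onto the lens portion just behind it, so only the zoom magnification scales the offset in the image. The coordinate convention, the `pixels_per_mm` factor, and the clamping below are illustrative assumptions, not details given in the embodiment.

```python
def most_affected_pixel(elem_x, elem_y, zoom, image_w, image_h,
                        pixels_per_mm=10.0):
    """Map a light emitting element's position (in mm, measured from
    the display centre) to the image pixel it affects most.

    The element is assumed to project onto the lens portion directly
    behind it, so its offset from the optical axis is simply scaled
    by the zoom magnification into image-pixel coordinates.
    """
    px = image_w / 2 + elem_x * pixels_per_mm * zoom
    py = image_h / 2 + elem_y * pixels_per_mm * zoom
    # Clamp to the image bounds and round to the nearest pixel index.
    px = min(max(int(round(px)), 0), image_w - 1)
    py = min(max(int(round(py)), 0), image_h - 1)
    return px, py
```

An element at the display centre then maps to the centre pixel of the emission map, and doubling the zoom doubles every element's offset from that centre.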
Then the CPU 31 determines color values of the pixels identified as the pixels most affected by the corresponding light emitting elements, based on the display information stored in the memory 32. For example, the CPU 31 determines a color value for each of the pixels 4a, 4b, 4c, and 4d.
Thereafter, the CPU 31 calculates color values of the other pixels of the emission map 4. For example, as illustrated in FIG. 7, the CPU 31 calculates a color value of a pixel 4t of the emission map by so-called bilinear interpolation. Specifically, the CPU 31 calculates the color value of the pixel based on the four neighboring pixels selected from among the pixels identified as the pixels most affected by the light emitting elements 23. For example, the CPU 31 calculates the color value DP_t of the pixel 4t based on the color values DP_1, DP_2, DP_3, and DP_4 of the pixels 4a, 4b, 4c, and 4d, and the distances x_1, x_2, y_1, and y_2. The CPU 31 calculates the color value DP_t by the following formula (a):

DP_t = w_1 DP_1 + w_2 DP_2 + w_3 DP_3 + w_4 DP_4 ... (a)

wherein the weights w_1, w_2, w_3, and w_4 in the formula (a) are calculated from the distances x_1, x_2, y_1, and y_2 by bilinear interpolation.
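As a sketch, the weights and the interpolated value DP_t of formula (a) can be computed as follows. The corner layout (4a top-left, 4b top-right, 4c bottom-left, 4d bottom-right) and the distance convention (x_1, x_2 horizontal distances to the left/right columns, y_1, y_2 vertical distances to the top/bottom rows) are assumptions, since FIG. 7 is not reproduced here.

```python
def bilinear_weights(x1, x2, y1, y2):
    """Compute w1..w4 of formula (a) from the distances of pixel 4t
    to the four most-affected pixels 4a-4d.

    A pixel that 4t is close to (small distance) receives a large
    weight, and the four weights always sum to 1.
    """
    w_sum = (x1 + x2) * (y1 + y2)
    w1 = x2 * y2 / w_sum  # weight of 4a (top-left)
    w2 = x1 * y2 / w_sum  # weight of 4b (top-right)
    w3 = x2 * y1 / w_sum  # weight of 4c (bottom-left)
    w4 = x1 * y1 / w_sum  # weight of 4d (bottom-right)
    return w1, w2, w3, w4

def interpolate_color(dp1, dp2, dp3, dp4, x1, x2, y1, y2):
    """Formula (a): DP_t = w1 DP_1 + w2 DP_2 + w3 DP_3 + w4 DP_4."""
    w1, w2, w3, w4 = bilinear_weights(x1, x2, y1, y2)
    return w1 * dp1 + w2 * dp2 + w3 * dp3 + w4 * dp4
```

For a pixel 4t exactly in the middle of the quadrangle, all four weights are 0.25 and DP_t is simply the average of the four corner values.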
Thereafter, returning to FIG. 4, the CPU 31 subtracts the emission values of the generated emission map from the image obtained by the camera 10 (step S13). Specifically, the CPU 31 determines color values of pixels in a corrected image by subtracting the color values of the corresponding pixels in the emission map from the color values of the corresponding pixels in the image obtained by the camera 10. The CPU 31 finally obtains the corrected image when it has determined the color values of all of the pixels in the corrected image.
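A minimal sketch of step S13, operating on one color channel represented as rows of 0–255 values (the clamp at zero is an assumption; the embodiment specifies only the per-pixel subtraction):

```python
def correct_image(captured, emission):
    """Subtract the emission map from the captured image, pixel by
    pixel, removing the influence of the light emitted by the display.
    `captured` and `emission` are same-sized lists of rows of 0-255
    color values; the result is the corrected image."""
    return [
        # Clamp at 0 so a strong emission value cannot produce a
        # negative color value (an assumed detail).
        [max(0, cap - emi) for cap, emi in zip(cap_row, emi_row)]
        for cap_row, emi_row in zip(captured, emission)
    ]
```

A pixel unaffected by the display (emission value 0) passes through unchanged, while a pixel over a bright emitter has the emitter's contribution removed.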
According to the cellphone 1 of this embodiment, even when a camera 10 disposed on the back of the display 20 acquires an image influenced by light emitted from the display 20, an emission map indicating the influence of the emitted light is generated. Then, by calculating the color value in the case in which the influence is not present, the image can be acquired in which the influence of the display 20 is reduced.
In the above-described embodiment, the cellphone 1 is described as an example of the electronic apparatus according to the present disclosure, but the electronic apparatus is not limited to the cellphone; other examples of the electronic apparatus include a smartphone, a personal digital assistant (PDA), various types of computers, and the like.
In the above-described embodiment, an example is described in which the image pixel and the emission map pixel have color information as values of 0 to 255. However, the values of the image pixel or the emission map pixel are not confined to the above-mentioned range. Not only 8-bit color with values of 0 to 255, but also 15-bit, 16-bit, 18-bit, 24-bit, and other color depths may be applied to this embodiment.
In the above-described embodiment, an example is described in which the display 20 is the OLED display. However, the display 20 is not confined to the OLED display. The display 20 may be any type of display that has a light emitting element and transmits light.
In the above-described embodiment, an example is described in which the operation to acquire the image by the camera 10 acquires a still image. However, the image acquired by the camera 10 is not confined to a still image. When the camera 10 acquires a moving image, the above-described embodiment may be achieved by performing the image correction processing illustrated in FIG. 4 for each frame of the moving image.
In the above-described embodiment, in step S12 of FIG. 4, the CPU 31 identifies pixels of the emission map 4 such that each of those pixels is the pixel most affected by the corresponding light emitting element, based on the zoom magnification and the structural feature information stored in the memory 32, such as the distance between the light emitting member 21 and the lens 11, the size of the lens 11, and the distance between adjacent light emitting elements of the light emitting elements 23. However, because the structural feature information stored in the memory 32 includes fixed values, a preset correspondence table taking these values into consideration, and including the pixels most affected by the light emitting elements 23, may be stored in the memory 32 with only the zoom magnification as a parameter.
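The preset correspondence table keyed only by zoom magnification could be sketched as below. All entries here (element identifiers, zoom levels, pixel coordinates) are hypothetical placeholders; in practice the table would be precomputed offline from the fixed structural feature information.

```python
# Hypothetical preset table: zoom magnification -> mapping from each
# light emitting element to the emission-map pixel it most affects.
# Element IDs and coordinates are illustrative placeholders only.
CORRESPONDENCE_TABLE = {
    1.0: {"23a": (10, 10), "23b": (10, 20), "23c": (20, 10), "23d": (20, 20)},
    2.0: {"23a": (5, 5),   "23b": (5, 25),  "23c": (25, 5),  "23d": (25, 25)},
}

def most_affected_pixels(zoom_magnification):
    """Look up the precomputed most-affected pixels; because the
    structural feature information is fixed, only the zoom
    magnification remains as a parameter."""
    return CORRESPONDENCE_TABLE[zoom_magnification]
```

This trades memory for computation: the geometric projection calculation of step S12 is replaced by a single table lookup at capture time.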
In the above-described embodiment, in step S12 of FIG. 4, the CPU 31 further determines a color value of a pixel 4t of the emission map based on so-called bilinear interpolation. As a result, the CPU 31 can easily generate the emission map with a low computational load. However, the method of determining color values of pixels of the emission map is not confined to bilinear interpolation. For example, nonlinear, trilinear, or multilinear interpolation may be applied to the above-described embodiment. Additionally, the CPU 31 may calculate the color value of the pixel based on not only the four neighboring pixels but also another number of pixels, such as 16 neighboring pixels. The accuracy of the emission map increases with the complexity of the interpolation, but the amount of calculation also increases. Therefore, an appropriate calculation method may be adopted in consideration of the tradeoff between accuracy and computational load.
The CPU 31 may calculate color values of the emission map by a method not using interpolation. If the processing capability of the CPU 31 permits, the CPU 31 may calculate the color value of each pixel of the emission map based on the emission colors of the plurality of light emitting elements and the zoom magnification. If the capacity of the memory permits, a correspondence table in which the color value of each pixel of the emission map is predefined with respect to the emission color of the light emitting elements 23 may be stored in the memory 32.
In the above-described embodiment, in step S13 of FIG. 4, an example is described in which the CPU 31 determines color values of pixels in a corrected image by subtracting the color values of the corresponding pixels in the emission map from the color values of the corresponding pixels in the image obtained by the camera 10. However, the method of determining color values of pixels in a corrected image is not confined to the above-described method. For example, when calculating a color value of a pixel corresponding to a pixel of a dark color of the emission map, the CPU 31 may calculate the color value by interpolating from the pixels surrounding the pixel of the image obtained by the camera 10 that corresponds to the pixel of the emission map. A threshold color value may be stored in advance in the memory 32, and the CPU 31 may determine whether the color value of the pixel of the emission map is larger than the threshold color value. When the CPU 31 determines that the color value of the pixel of the emission map is larger than the threshold value, the CPU 31 may calculate the color value of the corresponding pixel in a corrected image by interpolating from surrounding pixels of the image obtained by the camera 10 as described above. When the CPU 31 determines that the color value of the pixel of the emission map is not larger than the threshold value, the CPU 31 may calculate the color value of the corresponding pixel in a corrected image by subtracting the color value of the pixel in the emission map from the color value of the corresponding pixel in the image obtained by the camera 10. In this way, even if the influence of the light of the display is strong and the color of the object in front of the display cannot be completely restored, an image of the object in front of the electronic apparatus can be acquired that is substantially free of the influence of light emitted by the display.
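The threshold-based variant can be sketched per pixel as follows. Averaging the four direct neighbors is only one possible interpolation scheme (the embodiment does not fix a specific one), and single-channel arrays with interior pixel coordinates are assumed for brevity.

```python
import numpy as np

def correct_pixel(image, emission_map, y, x, threshold):
    """Threshold variant of step S13 for one (single-channel) pixel:
    if the emission value exceeds the threshold, rebuild the pixel by
    averaging its four neighbors in the captured image (one assumed
    interpolation scheme); otherwise subtract the emission value,
    clamped at 0. Interior coordinates (y, x) are assumed."""
    if emission_map[y, x] > threshold:
        neighbors = [image[y - 1, x], image[y + 1, x],
                     image[y, x - 1], image[y, x + 1]]
        return int(np.mean(neighbors))
    return max(0, int(image[y, x]) - int(emission_map[y, x]))
```

The threshold thus selects, pixel by pixel, between restoring the scene by subtraction where the display's contribution is moderate and reconstructing it from neighbors where the contribution is too strong to subtract reliably.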
The foregoing describes some example embodiments for explanatory purposes. Although the foregoing discussion has presented specific embodiments, persons skilled in the art will recognize that changes may be made in form and detail without departing from the broader spirit and scope of the invention. Accordingly, the specification and drawings are to be regarded in an illustrative rather than a restrictive sense. This detailed description, therefore, is not to be taken in a limiting sense, and the scope of the invention is defined only by the included claims, along with the full range of equivalents to which such claims are entitled.
Claims (7)
- An electronic apparatus, comprising: a display that is light-transmissive and is configured to emit light; a camera behind the display and configured to capture an image; and a processor configured to correct the image captured by the camera by removing an influence of the light emitted by the display.
- The electronic apparatus of claim 1, further comprising a memory configured to store display information representing information to be displayed on the display screen, wherein the processor generates an emission map obtained from the display information stored in the memory and corrects the image captured by the camera based on the emission map.
- The electronic apparatus of claim 2, wherein the processor generates the emission map by interpolating emission data obtained from the display information stored in the memory.
- The electronic apparatus of claim 3, wherein the processor calculates a color value of each pixel of the emission map using bilinear interpolation.
- The electronic apparatus of claim 1, wherein the display further comprises: a light emitting member to emit the light; and a reflector to reflect light emitted by the light emitting member, the reflector defining a hole in front of the camera.
- An image correction method, comprising: capturing an image through a light-transmissive and light emitting display from the back of the display; and correcting the captured image by removing an influence of the light emitted by the display.
- A non-transitory computer-readable recording medium storing a program causing a computer to execute an image correction process, the image correction process comprising: capturing an image through a light-transmissive and light emitting display from the back of the display; and correcting the captured image by removing an influence of the light emitted by the display.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/CN2018/104679 WO2020047859A1 (en) | 2018-09-07 | 2018-09-07 | Electronic apparatus, image correction method, and non-transitory computer-readable recording medium |
CN201880095654.3A CN112567719B (en) | 2018-09-07 | 2018-09-07 | Electronic device, image correction method, and non-transitory computer-readable storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2020047859A1 true WO2020047859A1 (en) | 2020-03-12 |
Family
ID=69722071
Country Status (2)
Country | Link |
---|---|
CN (1) | CN112567719B (en) |
WO (1) | WO2020047859A1 (en) |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5710428A (en) * | 1995-08-10 | 1998-01-20 | Samsung Electronics Co., Ltd. | Infrared focal plane array detecting apparatus having light emitting devices and infrared camera adopting the same |
CN101594428A (en) * | 2008-05-29 | 2009-12-02 | Lg电子株式会社 | Portable terminal and image-capturing method thereof |
CN105096816A (en) * | 2014-05-06 | 2015-11-25 | 西安诺瓦电子科技有限公司 | Method for correcting brightness and chrominance of LED display screen and mobile communication terminal |
CN107395987A (en) * | 2017-08-31 | 2017-11-24 | 华南理工大学 | A kind of smart mobile phone frame rate control method and system for visible light communication |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5598033B2 (en) * | 2010-03-15 | 2014-10-01 | ソニー株式会社 | Information processing apparatus, information processing method, and program |
CN103800019B (en) * | 2012-11-07 | 2015-07-01 | 上海联影医疗科技有限公司 | Random scattering point forming method and PET (Positron Emission Tomography) image scattering correcting method |
CN105446061B (en) * | 2014-08-26 | 2018-11-09 | 鸿富锦精密工业(深圳)有限公司 | Electronic equipment |
CN107330415A (en) * | 2017-07-10 | 2017-11-07 | 广东欧珀移动通信有限公司 | Electronic installation |
- 2018-09-07 CN CN201880095654.3A patent/CN112567719B/en active Active
- 2018-09-07 WO PCT/CN2018/104679 patent/WO2020047859A1/en active Application Filing
Also Published As
Publication number | Publication date |
---|---|
CN112567719A (en) | 2021-03-26 |
CN112567719B (en) | 2022-01-07 |
Legal Events
Date | Code | Title | Description
---|---|---|---
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 18932408; Country of ref document: EP; Kind code of ref document: A1 |
| NENP | Non-entry into the national phase | Ref country code: DE |
| 122 | Ep: pct application non-entry in european phase | Ref document number: 18932408; Country of ref document: EP; Kind code of ref document: A1 |