WO2021138867A1 - Method for electronic device with a plurality of cameras and electronic device - Google Patents


Info

Publication number
WO2021138867A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
histogram
statistical information
camera
electronic device
Prior art date
Application number
PCT/CN2020/071141
Other languages
French (fr)
Inventor
Hajime Numata
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp., Ltd.
Priority to CN202080091004.9A
Priority to PCT/CN2020/071141
Publication of WO2021138867A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/50 Image enhancement or restoration by the use of more than one image, e.g. averaging, subtraction
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/40 Image enhancement or restoration by the use of histogram techniques
    • G06T5/90 Dynamic range modification of images or parts thereof
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70 Circuitry for compensating brightness variation in the scene
    • H04N23/71 Circuitry for evaluating the brightness variation
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/90 Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums

Definitions

  • the present disclosure relates to method for electronic device with a plurality of cameras, and electronic device.
  • FIG. 14A is an example of an image taken by a downsized telephoto camera
  • FIG. 14B is an example of an image taken by a conventional regular (i.e. non-downsized) telephoto camera.
  • the contrast of the image taken by the downsized telephoto camera is lower due to lens flare.
  • FIGs. 15A and 15B show a worse example.
  • FIG. 15A is an example of an image taken by a downsized telephoto camera
  • FIG. 15B is an example of an image taken by a conventional regular (i.e. non-downsized) telephoto camera.
  • the image taken by the downsized telephoto camera has a lower contrast due to lens flare and is generally whitish.
  • the present disclosure aims to solve at least one of the technical problems mentioned above. Accordingly, the present disclosure needs to provide a method for electronic device with a plurality of cameras, and an electronic device.
  • a method for an electronic device which includes a first camera having a first FOV and a second camera having a second FOV which is narrower than the first FOV may include:
  • the first statistical information may include a first histogram of the partial image
  • the second statistical information may include a second histogram of the second image
  • in the determining whether to correct the second image, it may be determined that the second image will be corrected if the first histogram is not similar to the second histogram.
  • both of the first histogram and the second histogram may be luminance histograms.
  • the determining whether to correct the second image may include:
  • the method may further include: correcting the second image based on the second statistical information if it is determined that the second image will be corrected.
  • the first statistical information may include a first histogram of the partial image
  • the second statistical information may include a second histogram of the second image
  • the second image may be converted until a shape of the second histogram is similar to a shape of the first histogram
  • the first statistical information may include a first histogram of the partial image
  • the second statistical information may include a second histogram of the second image
  • the correcting the second image may include:
  • An electronic device may include:
  • a first camera configured to have a first FOV and to take a first image
  • a second camera configured to have a second FOV which is narrower than the first FOV and to take a second image, the second image being a part of the first image;
  • a statistical information creating unit configured to create a first statistical information based on a partial image of the first image, and to create a second statistical information based on the second image, a FOV of the partial image being substantially the same as that of the second image;
  • a determining unit configured to determine whether to correct the second image based on the first statistical information and the second statistical information.
  • the first statistical information may include a first histogram of the partial image
  • the second statistical information may include a second histogram of the second image
  • the determining unit may determine that the second image will be corrected if the first histogram is not similar to the second histogram.
  • both of the first histogram and the second histogram may be luminance histograms.
  • the electronic device may further include a correcting unit configured to correct the second image based on the second statistical information if it is determined that the second image will be corrected.
  • the first statistical information may include a first histogram of the partial image
  • the second statistical information may include a second histogram of the second image
  • the correcting unit may convert the second image until a shape of the second histogram is similar to a shape of the first histogram.
  • the first camera may have a first lens and a first image sensor
  • the second camera may have a second lens and a second image sensor
  • a first distance between the first lens and the first image sensor may be substantially the same as a second distance between the second lens and the second image sensor.
  • the electronic device may be a smartphone, and the first and second cameras may be disposed in a housing of the smartphone so as to face a back side thereof.
  • FIG. 1A is a rear view of an electronic device in the present disclosure
  • FIG. 1B is a cross-sectional view taken along line I-I of the electronic device shown in FIG. 1A;
  • FIG. 2 is a block diagram of the electronic device in the present disclosure
  • FIG. 3 is a functional block diagram of a processor provided with the electronic device in the present disclosure
  • FIG. 4 is a flow chart illustrating a method for an electronic device with a plurality of cameras according to an implementation of the present disclosure
  • FIG. 5A is an example of a first image taken by a main camera
  • FIG. 5B is an example of a second image taken by a sub camera
  • FIG. 6A is an example of a luminance histogram of the first image
  • FIG. 6B is an example of a luminance histogram of the second image
  • FIG. 7A is another example of a first image taken by the main camera
  • FIG. 7B is another example of a second image taken by the sub camera
  • FIG. 8A is another example of a luminance histogram of the first image
  • FIG. 8B is another example of a luminance histogram of the second image
  • FIG. 9 is a flow chart illustrating a method for determining whether to correct the second image according to an implementation of the present disclosure
  • FIG. 10A shows a first difference between a minimum signal level and a maximum signal level of a luminance histogram of the first image
  • FIG. 10B shows a second difference between a minimum signal level and a maximum signal level of a luminance histogram of the second image
  • FIG. 11 is a flow chart illustrating a method for correcting the second image according to an implementation of the present disclosure
  • FIG. 12A is a diagram illustrating tone curves for explaining the method for correcting the second image according to an implementation of the present disclosure
  • FIG. 12B is a diagram illustrating converted histograms to explain the method for correcting the second image according to an implementation of the present disclosure
  • FIG. 13A is an example of the second image before a correction process
  • FIG. 13B is an example of the second image after the correction process
  • FIG. 14A is an example of an image taken by a conventional downsized telephoto camera
  • FIG. 14B is an example of an image taken by a conventional regular (non-downsized) telephoto camera
  • FIG. 15A is another example of an image taken by a conventional downsized telephoto camera
  • FIG. 15B is another example of an image taken by a conventional regular (non-downsized) telephoto camera.
  • FIG. 1A is a rear view of an electronic device in the present disclosure.
  • FIG. 1B is a cross-sectional view taken along line I-I of the electronic device shown in FIG. 1A. Please note that a light unit 4 is omitted in FIG. 1B.
  • FIG. 2 is a block diagram of the electronic device in the present disclosure.
  • FIG. 3 is a functional block diagram of a processor installed in the electronic device in the present disclosure.
  • an electronic device 1 is a smartphone.
  • the electronic device 1 may be another mobile device or terminal, such as a tablet computer, a personal digital assistant (PDA), a laptop, a mobile Internet device (MID) or a wearable device.
  • the electronic device 1 may be any electronic apparatus which has a plurality of cameras.
  • the electronic device 1 includes a main camera (first camera) 2, a sub camera (second camera) 3, a light unit 4, a display 5 and a housing 6.
  • the light unit 4 is configured to illuminate a subject of a photograph.
  • the display 5 is a display such as a Liquid Crystal Display (LCD) or an Organic Light Emitting Diode (OLED) display.
  • the display 5 is configured to be a touch panel. Alternatively, the display 5 may be a regular display which is not a touch panel if the electronic device 1 is not a smartphone.
  • the housing 6 houses the main camera 2, the sub camera 3, the light unit 4 and the display 5.
  • the housing 6 also houses other components illustrated in FIG. 2 such as a processor 10, a memory 20 and a power supply 30 etc.
  • the main camera 2 and the sub camera 3 are disposed in the housing 6 so as to face a back side of the electronic device 1.
  • the main camera 2 includes a lens 2a and an image sensor 2b
  • the sub camera 3 includes a lens 3a and an image sensor 3b.
  • each of the lens 2a and the lens 3a consists of a plurality of lenses.
  • the image sensor 2b takes an image which passes through the lens 2a.
  • the image sensor 3b takes an image which passes through the lens 3a.
  • the image sensors 2b, 3b have a plurality of pixels, each of which has a photo detector such as a photo-diode and a color filter provided above the photo detector.
  • the image sensors 2b, 3b may be a CMOS (Complementary Metal-Oxide-Semiconductor) image sensor, or may be a CCD (Charge Coupled Device) image sensor.
  • the main camera 2 has a first field of view (FOV)
  • the sub camera 3 has a second field of view.
  • the second FOV is narrower than the first FOV.
  • the main camera 2 is a wide angle camera and the sub camera 3 is a telephoto camera.
  • the distance between a lens and an image sensor depends on a focal length of a camera.
  • the focal length of a telephoto camera should be longer than that of a regular or wide camera. That is, a distance between the lens 3a and the image sensor 3b should be longer than a distance between the lens 2a and the image sensor 2b.
  • the distance between the lens 3a and the image sensor 3b is substantially the same as the distance between the lens 2a and the image sensor 2b (the reason why this is possible will be described later).
  • a thickness T of the electronic device 1 can be made thinner than that of a conventional electronic device.
  • whether to correct an image taken by the sub camera 3 is determined in consideration of an image captured by the main camera 2.
  • the electronic device 1 includes the processor 10 such as a CPU or GPU, the memory 20, the power supply 30 and a communication unit 40 in addition to the already described components (i.e. the main camera 2, the sub camera 3, the light unit 4 and the display 5).
  • the above components in the electronic device 1 are connected together via a bus 50.
  • the processor 10 executes one or more programs stored in the memory 20.
  • the processor 10 implements various applications and data processing of the electronic device 1 by executing the programs.
  • the processor 10 may be one or more computer processors.
  • the processor 10 is not limited to one CPU core, but it may have a plurality of CPU cores.
  • the processor 10 is configured to process an image taken by the main camera 2 and the sub camera 3.
  • the processor 10 may be a main CPU of the electronic device 1.
  • the processor 10 may be an image processing unit (IPU) or a DSP (Digital Signal Processor) provided for the main camera 2 and the sub camera 3.
  • the memory 20 stores a program to be executed by the processor 10 and various kinds of data such as image data or statistical data thereof.
  • the memory 20 may be a random access memory (RAM), a read-only memory (ROM) or an erasable programmable read-only memory (EPROM or flash memory).
  • the power supply 30 includes a battery such as a lithium-ion rechargeable battery and a battery management unit (BMU) for managing the battery.
  • the communication unit 40 is configured to receive and transmit data to communicate with the web or other electronic devices via wireless communication.
  • the wireless communication may use any communication standard or protocol, including but not limited to GSM (Global System for Mobile communication), CDMA (Code Division Multiple Access), LTE (Long Term Evolution), LTE-Advanced and 5th generation (5G).
  • the communication unit 40 may include an antenna and an RF (radio frequency) circuit.
  • the processor 10 includes a statistical information creating unit 11, a determining unit 12 and a correcting unit 13. At least one of the units 11, 12 and 13 may be software (a program) or hardware such as an ASIC (application specific integrated circuit). For example, statistical information such as a histogram can be created by an integrated circuit for performing automatic exposure (i.e., AE).
  • the statistical information creating unit 11 is configured to create a statistical information such as a histogram based on an image captured by the cameras 2, 3. More specifically, the creating unit 11 creates a first statistical information based on a partial image of a first image taken by the main camera 2. Further, the creating unit 11 creates a second statistical information based on a second image taken by the sub camera 3. Here, a field of view (FOV) of the partial image is substantially the same as that of the second image.
  • the first and second statistical information are, for example, a histogram such as a luminance histogram.
  • the histogram may be a Red histogram, a Green histogram or a Blue histogram.
  • the determining unit 12 is configured to determine whether to correct the second image based on the first statistical information and the second statistical information. The detailed determination method will be described later with reference to FIG. 9.
  • the correcting unit 13 is configured to correct the second image based on the second statistical information if the determining unit 12 determines that the second image should be corrected. The detailed correction method will be described later with reference to FIG. 11.
  • a method according to an implementation of the present disclosure for the electronic device 1 will be described with reference to the flowchart shown in FIG. 4.
  • the method includes the following steps.
  • the main camera 2 takes an image (a first image) and the sub camera 3 takes an image (a second image).
  • the second image is an enlarged image of a part of the first image.
  • FIG. 5A is an example image I1 taken by the main camera 2 and FIG. 5B is an example image I2 taken by the sub camera 3.
  • the example image I2 corresponds to a region R of the example image I1. In other words, the region R indicates a region corresponding to a FOV of the sub camera 3.
  • FIG. 7A is another example image I3 taken by the main camera 2 and FIG. 7B is another example image I4 taken by the sub camera 3.
  • the statistical information creating unit 11 creates a first statistical information based on a partial image of the first image.
  • the partial image is an image of the region R of the first image.
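As a sketch only (not the patent's implementation), extracting a partial image whose FOV matches the sub camera could look like the following, assuming the two optical axes are aligned and the FOV ratio between the cameras is known; the function name and parameters are hypothetical:

```python
import numpy as np

def crop_matching_fov(first_image: np.ndarray, fov_ratio: float) -> np.ndarray:
    """Crop the central region R of the wide (first) image whose field of
    view matches the narrower sub camera, assuming aligned optical axes
    and a known FOV ratio (second FOV / first FOV, 0 < fov_ratio < 1)."""
    h, w = first_image.shape[:2]
    crop_h, crop_w = int(h * fov_ratio), int(w * fov_ratio)
    top, left = (h - crop_h) // 2, (w - crop_w) // 2
    return first_image[top:top + crop_h, left:left + crop_w]
```

In practice the two cameras are physically offset, so a real implementation would also compensate for parallax rather than simply taking the center crop.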
  • the first statistical information includes a first histogram of the partial image
  • the second statistical information includes a second histogram of the second image.
  • both of the first histogram and the second histogram are luminance histograms.
  • FIG. 6A shows a luminance histogram H1 of the partial image of the image I1 shown in FIG. 5A.
  • FIG. 8A shows a luminance histogram H3 of the partial image of the image I3 shown in FIG. 7A.
  • a luminance histogram is a graph which shows a luminance distribution of an image. The horizontal axis of the luminance histogram indicates pixel brightness (left side: dark, right side: bright), and the vertical axis indicates the number of pixels for each brightness.
  • the statistical information creating unit 11 creates a second statistical information based on the second image. Precisely, the second statistical information is created based on a full FOV of the second image.
  • FIG. 6B is a luminance histogram H2 of the image I2 shown in FIG. 5B.
  • FIG. 8B is a luminance histogram H4 of the image I4 shown in FIG. 7B.
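A luminance histogram of this kind can be sketched with NumPy as follows; the 8-bit range and the Rec. 601 luma weights are assumptions, since the patent does not specify how luminance is derived:

```python
import numpy as np

def luminance_histogram(image: np.ndarray) -> np.ndarray:
    """Return a 256-bin luminance histogram of an 8-bit image.
    An RGB image is converted with the Rec. 601 luma weights;
    a grayscale image is used as-is."""
    if image.ndim == 3:
        # Rec. 601 approximation: Y = 0.299 R + 0.587 G + 0.114 B
        luma = (0.299 * image[..., 0] + 0.587 * image[..., 1]
                + 0.114 * image[..., 2])
    else:
        luma = image
    hist, _ = np.histogram(luma, bins=256, range=(0, 256))
    return hist
```

The horizontal bin index corresponds to pixel brightness and each count to the number of pixels at that brightness, matching the axes described above.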
  • the determining unit 12 determines whether to correct the second image taken by the sub camera 3 based on the first statistical information and the second statistical information. As shown in FIGs. 6A and 6B, in this example, the histogram H1 and the histogram H2 are almost the same, which means that no optical issues such as lens flare have occurred in the sub camera 3. Therefore, the determining unit 12 determines that it is not necessary to correct the second image I2. As a result, it can be avoided that the image I2, which captures a beautiful foggy scene, is erroneously enhanced.
  • the determining unit 12 determines that it is necessary to correct the second image I2. As a result, the image I4 can be enhanced correctly.
  • the first statistical information may not be used if it is not normal.
  • the first statistical information is a luminance histogram
  • the determining unit 12 calculates a first difference between a minimum signal level and a maximum signal level of the first histogram.
  • a difference D1 between a minimum signal level L1 and a maximum signal level L2 is calculated.
  • the minimum signal level L1 may be a value greater than the minimum value of the first histogram by a predetermined ratio (e.g., 1%).
  • the maximum signal level L2 may be a value smaller than the maximum value of the first histogram by a predetermined ratio (e.g., 1%).
  • the determining unit 12 may store the difference D1 in the memory 20 so that the correcting unit 13 can read it to correct the second image.
  • the determining unit 12 calculates a second difference between a minimum signal level and a maximum signal level of the second histogram.
  • a difference D2 between the minimum signal level L3 and the maximum signal level L4 is calculated.
  • the minimum signal level L3 may be a value greater than the minimum value of the second histogram by a predetermined ratio (e.g., 1%).
  • the maximum signal level L4 may be a value smaller than the maximum value of the second histogram by a predetermined ratio (e.g., 1%).
  • the determining unit 12 calculates an absolute difference between the first difference and the second difference (i.e., |D1 - D2|).
  • the determining unit 12 judges whether the absolute difference is greater than a determination threshold value or not.
  • the determination threshold value is a predetermined value stored in advance in the memory 20. If the absolute difference is greater than the determination threshold value, the determining unit 12 determines that the second image should be corrected (S45).
  • the determination that the second image should be corrected may be made if the first histogram is not similar to the second histogram.
  • the determination may be made based on whether the shapes of both histograms are similar to each other.
  • the determination may be made based on other characteristics of the histograms such as a position of a peak or a number of peaks.
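The determination of steps S41-S45 might be sketched as below. The 1% tail-clipping follows the example ratio given above, while the function names and the exact cumulative-count logic are illustrative assumptions:

```python
import numpy as np

def signal_range(hist: np.ndarray, clip_ratio: float = 0.01) -> float:
    """Difference between the maximum and minimum signal levels of a
    histogram (steps S41/S42), where each level excludes a small
    predetermined ratio of pixels (e.g., 1%) at the corresponding tail."""
    total = hist.sum()
    cdf = np.cumsum(hist)
    low = int(np.searchsorted(cdf, clip_ratio * total))
    high = int(np.searchsorted(cdf, (1.0 - clip_ratio) * total))
    return float(high - low)

def should_correct(first_hist: np.ndarray, second_hist: np.ndarray,
                   threshold: float) -> bool:
    """Steps S43-S45: correct the second image when the absolute difference
    between the two spreads, |D1 - D2|, exceeds the determination threshold."""
    d1 = signal_range(np.asarray(first_hist, dtype=float))
    d2 = signal_range(np.asarray(second_hist, dtype=float))
    return abs(d1 - d2) > threshold
```

A flare-washed telephoto image has a compressed histogram, so its spread D2 is much smaller than the reference spread D1 and the check fires.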
  • the correcting unit 13 corrects the second image based on the second statistical information (S6). For example, the correcting unit 13 enhances the second image.
  • the correcting unit 13 obtains a first difference between a minimum signal level and a maximum signal level of the first histogram.
  • the first difference is the difference D1 described above.
  • the correcting unit 13 reads the difference D1 stored in the memory 20.
  • the correcting unit 13 may calculate the first difference by itself.
  • the correcting unit 13 sets a tone enhance bottom and a tone enhance top.
  • an initial value of the tone enhance bottom and an initial value of the tone enhance top are set.
  • the tone enhance bottom and the tone enhance top define a tone curve for converting tones of an image.
  • the tone curve is a line which represents a change between a pixel value before correction (input level) and a pixel value after correction (output level) .
  • a tone curve C1 shown on the left side of FIG. 12A is an example of a tone curve when the initial value of the tone enhance bottom (B0) is 0 and the initial value of the tone enhance top (T0) is 255.
  • the tone curve C1 is a straight line connecting point (0, 0) and point (255, 255) and does not convert an image.
  • a histogram Ha shown on the left side of FIG. 12B is a luminance histogram converted by the tone curve C1.
  • a tone curve C2 is defined by the values B1 and T1.
  • the correcting unit 13 converts the second histogram based on a tone curve defined by the tone enhance bottom and the tone enhance top.
  • a histogram Hb shown in the center of FIG. 12B is a luminance histogram obtained by converting the histogram Ha with the tone curve C2.
  • the histogram Hb has a shape obtained by expanding the histogram Ha in the horizontal direction.
  • the correcting unit 13 calculates a second difference between a minimum signal level and a maximum signal level of the converted second histogram.
  • a minimum signal level and a maximum signal level of the converted second histogram obtained in the S63 are set in the same manner as described in the steps S41 and S42.
  • a difference between the minimum signal level and the maximum signal level is calculated.
  • the correcting unit 13 calculates an absolute difference between the first difference obtained in the S61 and the second difference calculated in the S64.
  • the correcting unit 13 judges whether the absolute difference is greater than a correction threshold value or not.
  • the correction threshold value is a predetermined value stored in advance in the memory 20.
  • the correcting unit 13 changes the tone enhance bottom and the tone enhance top set in the S62 so that a difference between the tone enhance bottom and the tone enhance top is smaller than the current difference (S67).
  • a tone curve C3 is defined by the values B2 and T2.
  • an amount of change in the tone enhance bottom and an amount of change in the tone enhance top may be different. That is to say, the value B1 of the tone enhance bottom may be changed to a value of “B1 + x”, and the value T1 of the tone enhance top may be changed to a value of “T1 - y”, where y is a positive number different from x.
  • only one of the tone enhance bottom and the tone enhance top may be changed.
  • the correcting unit 13 converts the converted second histogram Hb based on the tone curve C3.
  • a histogram Hc shown on the right side of FIG. 12B is a luminance histogram obtained by converting the histogram Hb with the tone curve C3.
  • the histogram Hc has a shape obtained by expanding the histogram Hb in the horizontal direction.
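The iterative correction of steps S61-S67 might look like the following sketch: a piecewise-linear tone curve defined by a tone enhance bottom and top is applied to the luminance, and the interval [bottom, top] is narrowed until the spread of the converted histogram is close to that of the first image. The step size, stopping rule and names are illustrative assumptions, not the patent's exact procedure:

```python
import numpy as np

def signal_range(hist: np.ndarray, clip_ratio: float = 0.01) -> float:
    """Histogram spread after clipping a small ratio of pixels from each tail."""
    total = hist.sum()
    cdf = np.cumsum(hist)
    low = int(np.searchsorted(cdf, clip_ratio * total))
    high = int(np.searchsorted(cdf, (1.0 - clip_ratio) * total))
    return float(high - low)

def apply_tone_curve(luma: np.ndarray, bottom: float, top: float) -> np.ndarray:
    """Piecewise-linear tone curve: map [bottom, top] onto [0, 255] and clip.
    bottom=0, top=255 is the identity curve C1, which leaves the image unchanged."""
    out = (luma.astype(np.float64) - bottom) * 255.0 / (top - bottom)
    return np.clip(out, 0.0, 255.0)

def correct_second_image(luma: np.ndarray, first_range: float,
                         threshold: float, step: float = 8.0) -> np.ndarray:
    """Raise the tone enhance bottom and lower the tone enhance top until the
    spread of the converted histogram is within `threshold` of the spread of
    the first image's histogram (first_range)."""
    bottom, top = 0.0, 255.0  # initial values B0 and T0
    converted = luma
    while top - bottom > 2 * step:
        hist, _ = np.histogram(converted, bins=256, range=(0, 256))
        if abs(first_range - signal_range(hist)) <= threshold:
            break  # spreads are close enough; stop converting
        # A narrower [bottom, top] expands the histogram horizontally,
        # as with curves C2 and C3 in FIG. 12A.
        bottom, top = bottom + step, top - step
        converted = apply_tone_curve(luma, bottom, top)
    return converted
```

Each pass corresponds to one trip around the S63-S67 loop: convert, re-measure the spread, and tighten the curve if the absolute difference still exceeds the correction threshold.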
  • FIG. 13A is an example of the second image (i.e., image taken by the sub camera 3) before correction and FIG. 13B is an example of the second image after correction.
  • the correction method described above is just an example, and the method is not limited to the above.
  • the correction method may be a method in which the second image is converted until a shape of the second histogram is similar to a shape of the first histogram.
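One conventional way to realize such a conversion is classic histogram matching (histogram specification), sketched below using CDF interpolation. This is a generic textbook technique offered as an illustration, not necessarily the method contemplated by the patent:

```python
import numpy as np

def match_histogram(source: np.ndarray, reference: np.ndarray) -> np.ndarray:
    """Remap pixel values of `source` so that its histogram approximates
    that of `reference` (histogram specification via cumulative
    distribution functions)."""
    src_values, src_idx, src_counts = np.unique(
        source.ravel(), return_inverse=True, return_counts=True)
    ref_values, ref_counts = np.unique(reference.ravel(), return_counts=True)
    src_cdf = np.cumsum(src_counts) / source.size
    ref_cdf = np.cumsum(ref_counts) / reference.size
    # For each source quantile, take the reference value at the same quantile.
    matched = np.interp(src_cdf, ref_cdf, ref_values)
    return matched[src_idx].reshape(source.shape)
```

Here `source` would be the second image's luminance and `reference` the partial image of the first image, so the converted second histogram ends up with a shape similar to the first histogram.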
  • a lens height of the sub camera 3 can be reduced.
  • a distance between the lens 3a and the image sensor 3b can be as short as a distance between the lens 2a and the image sensor 2b.
  • the thickness T of the electronic device 1 can be made thinner than that of conventional electronic devices. Further, the cost of the electronic device is not increased since a complicated optical system such as a diffractive optical system is not necessary.
  • the object of the present disclosure described above is an electronic device having two cameras, but it may also be an electronic device having three or more cameras.
  • the electronic device 1 may further include a third camera having a third FOV.
  • An image taken by the third camera may be used as the first image described above if the third FOV is narrower than the first FOV of the main camera 2.
  • in an image with a wide FOV, strong light such as sunlight easily enters the image and, as a result, there is a higher probability that normal statistical information cannot be obtained.
  • which image is used as the first image may be determined based on statistical information of each image taken by the first and the third camera.
  • according to the present disclosure, it can be determined whether an image taken by one camera is deteriorated due to an optical issue such as lens flare by comparing the image with an image taken by the other camera having a wider FOV than the one camera.
  • the one camera is a telephoto camera and the other camera is a regular camera.
  • the one camera may be a regular camera and the other camera may be a wide angle camera.
  • the terms “first” and “second” are used herein for purposes of description and are not intended to indicate or imply relative importance or significance or to imply the number of indicated technical features.
  • a feature defined as “first” and “second” may comprise one or more of this feature.
  • “a plurality of” means “two or more than two”, unless otherwise specified.
  • the terms “mounted”, “connected”, “coupled” and the like are used broadly, and may be, for example, fixed connections, detachable connections, or integral connections; may also be mechanical or electrical connections; may also be direct connections or indirect connections via intervening structures; may also be inner communications of two elements, which can be understood by those skilled in the art according to specific situations.
  • a structure in which a first feature is “on” or “below” a second feature may include an embodiment in which the first feature is in direct contact with the second feature, and may also include an embodiment in which the first feature and the second feature are not in direct contact with each other, but are in contact via an additional feature formed therebetween.
  • a first feature “on”, “above” or “on top of” a second feature may include an embodiment in which the first feature is orthogonally or obliquely “on”, “above” or “on top of” the second feature, or just means that the first feature is at a height higher than that of the second feature; while a first feature “below”, “under” or “on bottom of” a second feature may include an embodiment in which the first feature is orthogonally or obliquely “below”, “under” or “on bottom of” the second feature, or just means that the first feature is at a height lower than that of the second feature.
  • Any process or method described in a flow chart or described herein in other ways may be understood to include one or more modules, segments or portions of codes of executable instructions for achieving specific logical functions or steps in the process, and the scope of a preferred embodiment of the present disclosure includes other implementations, in which it should be understood by those skilled in the art that functions may be implemented in a sequence other than the sequences shown or discussed, including in a substantially identical sequence or in an opposite sequence.
  • the logic and/or step described in other manners herein or shown in the flow chart may be specifically achieved in any computer readable medium to be used by the instructions execution system, device or equipment (such as a system based on computers, a system comprising processors or other systems capable of obtaining instructions from the instructions execution system, device and equipment executing the instructions) , or to be used in combination with the instructions execution system, device and equipment.
  • the computer readable medium may be any device adaptive for including, storing, communicating, propagating or transferring programs to be used by or in combination with the instruction execution system, device or equipment.
  • the computer readable medium includes but is not limited to: an electronic connection (an electronic device) with one or more wires, a portable computer enclosure (a magnetic device), a random access memory (RAM), a read only memory (ROM), an erasable programmable read-only memory (EPROM or a flash memory), an optical fiber device and a portable compact disk read-only memory (CDROM).
  • the computer readable medium may even be paper or another appropriate medium capable of having the programs printed thereon, because the paper or other medium may be optically scanned and then edited, decrypted or processed with other appropriate methods when necessary to obtain the programs in an electronic manner, after which the programs may be stored in the computer memories.
  • each part of the present disclosure may be realized by the hardware, software, firmware or their combination.
  • a plurality of steps or methods may be realized by the software or firmware stored in the memory and executed by the appropriate instructions execution system.
  • the steps or methods may be realized by one or a combination of the following techniques known in the art: a discrete logic circuit having a logic gate circuit for realizing a logic function of a data signal, an application-specific integrated circuit having an appropriate combination logic gate circuit, a programmable gate array (PGA) , a field programmable gate array (FPGA) , etc.
  • each function cell of the embodiments of the present disclosure may be integrated in a processing module, or these cells may be separate physical existence, or two or more cells are integrated in a processing module.
  • the integrated module may be realized in a form of hardware or in a form of software function modules. When the integrated module is realized in a form of software function module and is sold or used as a standalone product, the integrated module may be stored in a computer readable storage medium.
  • the storage medium mentioned above may be a read-only memory, a magnetic disk, a CD, etc.

Abstract

Disclosed is a method for an electronic device which includes a first camera having a first FOV and a second camera having a second FOV which is narrower than the first FOV. According to the method, a first image is taken by the first camera and a second image is taken by the second camera. The second image is a part of the first image. A first statistical information is created based on a partial image of the first image. A FOV of the partial image is substantially the same as that of the second image. A second statistical information is created based on the second image. It is determined whether to correct the second image based on the first statistical information and the second statistical information.

Description

METHOD FOR ELECTRONIC DEVICE WITH A PLURALITY OF CAMERAS AND ELECTRONIC DEVICE
FIELD
The present disclosure relates to a method for an electronic device with a plurality of cameras, and to an electronic device.
BACKGROUND
In recent years, electronic devices such as smartphones having a plurality of cameras (e.g., a regular camera and a telephoto camera) have become very popular. One problem they face is that the telephoto camera increases the thickness of a smartphone, since telephoto camera lenses have a long focal length.
It is necessary to shorten the distance between a lens and an image sensor in order to reduce the size of the telephoto camera. However, shortening the distance causes lens flaring and other phenomena which negatively affect image quality.
FIG. 14A is an example of an image taken by a downsized telephoto camera, and FIG. 14B is an example of an image taken by a conventional regular (i.e., non-downsized) telephoto camera. As can be seen by comparing FIG. 14A and FIG. 14B, the contrast of the image taken by the downsized telephoto camera is lower due to lens flare.
FIGs. 15A and 15B show a worse example. FIG. 15A is an example of an image taken by a downsized telephoto camera, and FIG. 15B is an example of an image taken by a conventional regular (i.e., non-downsized) telephoto camera. As can be seen by comparing FIG. 15A and FIG. 15B, the image taken by the downsized telephoto camera has a lower contrast due to lens flare and is generally whitish.
Some approaches (e.g., Diffractive Optics: DO) have been studied to reduce the distance between a lens and an image sensor. However, the costs of such optical systems are considerable.
SUMMARY
The present disclosure aims to solve at least one of the technical problems mentioned above. Accordingly, the present disclosure needs to provide a method for an electronic device with a plurality of cameras, and an electronic device.
A method for an electronic device which includes a first camera having a first FOV and a second camera having a second FOV which is narrower than the first FOV, the method may include:
taking a first image by the first camera and a second image by the second camera, the second image being a part of the first image;
creating a first statistical information based on a partial image of the first image, a FOV of the partial image being substantially the same as that of the second image;
creating a second statistical information based on the second image; and
determining whether to correct the second image based on the first statistical information and the second statistical information.
In some embodiments, the first statistical information may include a first histogram of the partial image, and the second statistical information may include a second histogram of the second image.
In some embodiments, in the determining whether to correct the second image, it may be determined that the second image will be corrected if the first histogram is not similar to the second histogram.
In some embodiments, both of the first histogram and the second histogram may be luminance histograms.
In some embodiments, the determining whether to correct the second image may include:
calculating a first difference between a minimum signal level and a maximum signal level of the first histogram;
calculating a second difference between a minimum signal level and a maximum signal level of the second histogram;
calculating an absolute difference between the first difference and the second difference; and
determining that the second image will be corrected if the absolute difference is greater than a determination threshold value.
In some embodiments, the method may further include: correcting the second image based on the second statistical information if it is determined that the second image will be corrected.
In some embodiments, the first statistical information may include a first histogram of the partial image, the second statistical information may include a second histogram of the second image, and the second image may be converted until a shape of the second histogram is similar to a shape of the first histogram.
In some embodiments, the first statistical information may include a first histogram of the partial image, and the second statistical information may include a second histogram of the second image, and wherein
the correcting the second image may include:
obtaining a first difference between a minimum signal level and a maximum signal level of the first histogram;
setting a tone enhance bottom and a tone enhance top;
converting the second histogram based on a tone curve defined by the tone enhance bottom and the tone enhance top;
calculating a second difference between a minimum signal level and a maximum signal level of the converted second histogram;
calculating an absolute difference between the first difference and the second difference;
changing the tone enhance bottom and/or the tone enhance top so that a difference between the tone enhance bottom and the tone enhance top is smaller if the absolute  difference is greater than a correction threshold value; and
converting the converted second histogram based on the changed tone enhance bottom and/or the changed tone enhance top.
An electronic device may include:
a first camera configured to have a first FOV and to take a first image;
a second camera configured to have a second FOV which is narrower than the first FOV and to take a second image, the second image being a part of the first image;
a statistical information creating unit configured to create a first statistical information based on a partial image of the first image, and to create a second statistical information based on the second image, a FOV of the partial image being substantially the same as that of the second image; and
a determining unit configured to determine whether to correct the second image based on the first statistical information and the second statistical information.
In some embodiments, the first statistical information may include a first histogram of the partial image, and the second statistical information may include a second histogram of the second image.
In some embodiments, the determining unit may determine that the second image will be corrected if the first histogram is not similar to the second histogram.
In some embodiments, both of the first histogram and the second histogram may be luminance histograms.
In some embodiments, the electronic device may further include a correcting unit configured to correct the second image based on the second statistical information if it is determined that the second image will be corrected.
In some embodiments, the first statistical information may include a first histogram of the partial image, and the second statistical information may include a second histogram of the second image, and
the correcting unit may convert the second image until a  shape of the second histogram is similar to a shape of the first histogram.
In some embodiments, the first camera may have a first lens and a first image sensor, and the second camera may have a second lens and a second image sensor, and
a first distance between the first lens and the first image sensor may be substantially the same as a second distance between the second lens and the second image sensor.
In some embodiments, the electronic device may be a smartphone, and the first and second cameras may be disposed in a housing of the smartphone so as to face a back side thereof.
BRIEF DESCRIPTION OF THE DRAWINGS
These and/or other aspects and advantages of embodiments of the present disclosure will become apparent and more readily appreciated from the following descriptions made with reference to the drawings, in which:
FIG. 1A is a rear view of an electronic device in the present disclosure;
FIG. 1B is a cross-sectional view taken along line I-I of the electronic device shown in FIG. 1A;
FIG. 2 is a block diagram of the electronic device in the present disclosure;
FIG. 3 is a functional block diagram of a processor provided with the electronic device in the present disclosure;
FIG. 4 is a flow chart illustrating a method for an electronic device with a plurality of cameras according to an implementation of the present disclosure;
FIG. 5A is an example of a first image taken by a main camera;
FIG. 5B is an example of a second image taken by a sub camera;
FIG. 6A is an example of a luminance histogram of the first image;
FIG. 6B is an example of a luminance histogram of the  second image;
FIG. 7A is an another example of a first image taken by the main camera;
FIG. 7B is an another example of a second image taken by the sub camera;
FIG. 8A is an another example of a luminance histogram of the first image;
FIG. 8B is an another example of a luminance histogram of the second image;
FIG. 9 is a flow chart illustrating a method for determining whether to correct the second image according to an implementation of the present disclosure;
FIG. 10A shows a first difference between a minimum signal level and a maximum signal level of a luminance histogram of the first image;
FIG. 10B shows a second difference between a minimum signal level and a maximum signal level of a luminance histogram of the second image;
FIG. 11 is a flow chart illustrating a method for correcting the second image according to an implementation of the present disclosure;
FIG. 12A is a diagram illustrating tone curves for explaining the method for correcting the second image according to an implementation of the present disclosure;
FIG. 12B is a diagram illustrating converted histograms to explain the method for correcting the second image according to an implementation of the present disclosure;
FIG. 13A is an example of the second image before a correction process;
FIG. 13B is an example of the second image after the correction process;
FIG. 14A is an example of an image taken by a conventional downsized telephoto camera;
FIG. 14B is an example of an image taken by a conventional regular (non-downsized) telephoto camera;
FIG. 15A is another example of an image taken by a  conventional downsized telephoto camera;
FIG. 15B is another example of an image taken by a conventional regular (non-downsized) telephoto camera.
DETAILED DESCRIPTION
Implementations of the present disclosure will be described in detail and examples of the implementations will be illustrated in the accompanying drawings. The same or similar elements and the elements having same or similar functions are denoted by like reference numerals throughout the descriptions. The implementations described herein with reference to the drawings are explanatory, which aim to illustrate the present disclosure, but shall not be construed to limit the present disclosure.
<Electronic device>
First, a schematic configuration of an electronic device in the present disclosure is described with reference to FIG. 1A to FIG. 3.
FIG. 1A is a rear view of an electronic device in the present disclosure. FIG. 1B is a cross-sectional view taken along line I-I of the electronic device shown in FIG. 1A. Please note that a light unit 4 is omitted in FIG. 1B. FIG. 2 is a block diagram of the electronic device in the present disclosure. FIG. 3 is a functional block diagram of a processor installed in the electronic device in the present disclosure.
As shown in FIGs. 1A and 1B, an electronic device 1 according to an implementation is a smartphone. The electronic device 1 may be another mobile device or terminal, such as a tablet computer, a personal digital assistant (PDA) , a laptop, a mobile Internet device (MID) or a wearable device. The electronic device 1 may be any electronic apparatus which has a plurality of cameras.
The electronic device 1 includes a main camera (first camera) 2, a sub camera (second camera) 3, a light unit 4, a display 5 and a housing 6.
The light unit 4 is configured to illuminate a subject of a photograph. The display 5 is a display such as a Liquid Crystal Display (LCD) or an Organic Light Emitting Diode (OLED) display. The display 5 is configured to be a touch panel. Alternatively, the display 5 may be a regular display which is not a touch panel if the electronic device 1 is not a smartphone. The housing 6 houses the main camera 2, the sub camera 3, the light unit 4 and the display 5. The housing 6 also houses other components illustrated in FIG. 2, such as a processor 10, a memory 20 and a power supply 30.
As shown in FIGs. 1A and 1B, the main camera 2 and the sub camera 3 are disposed in the housing 6 so as to face a back side of the electronic device 1.
As shown in FIG. 1B, the main camera 2 includes a lens 2a and an image sensor 2b, and the sub camera 3 includes a lens 3a and an image sensor 3b. In practice, each of the lens 2a and the lens 3a consists of a plurality of lenses. The image sensor 2b takes an image which passes through the lens 2a. The image sensor 3b takes an image which passes through the lens 3a. The image sensors 2b, 3b have a plurality of pixels, each of which has a photo detector such as a photo-diode and a color filter provided above the photo detector. The image sensors 2b, 3b may be CMOS (Complementary Metal-Oxide-Semiconductor) image sensors, or may be CCD (Charge Coupled Device) image sensors.
The main camera 2 has a first field of view (FOV), and the sub camera 3 has a second field of view. The second FOV is narrower than the first FOV. In the present implementation, the main camera 2 is a wide angle camera and the sub camera 3 is a telephoto camera.
Generally, the distance between a lens and an image sensor depends on a focal length of a camera. The focal length of a telephoto camera should be longer than that of a regular or wide camera. That is, a distance between the lens 3a and the image sensor 3b should be longer than a distance between the lens 2a and the image sensor 2b. However, according to the  present implementation, the distance between the lens 3a and the image sensor 3b is substantially the same as the distance between the lens 2a and the image sensor 2b (the reason why this is possible will be described later) . Thereby, a thickness T of the electronic device 1 can be made to be thinner than that of a conventional electronic device. As will be explained in detail later, whether to correct an image taken by the sub camera 3 is determined in consideration of an image captured by the main camera 2.
Next, a functional block of the electronic device 1 is described with reference to FIGs. 2 and 3.
As shown in FIG. 2, the electronic device 1 includes the processor 10 such as a CPU or GPU, the memory 20, the power supply 30 and a communication unit 40 in addition to the already described components (i.e., the main camera 2, the sub camera 3, the light unit 4 and the display 5). The above components in the electronic device 1 are connected together via a bus 50.
The processor 10 executes one or more programs stored in the memory 20. The processor 10 implements various applications and data processing of the electronic device 1 by executing the programs. The processor 10 may be one or more computer processors. The processor 10 is not limited to one CPU core, but it may have a plurality of CPU cores.
The processor 10 is configured to process an image taken by the main camera 2 and the sub camera 3. The processor 10 may be a main CPU of the electronic device 1. Alternatively, the processor 10 may be an image processing unit (IPU) or a DSP (Digital Signal Processor) provided for the main camera 2 and the sub camera 3.
The memory 20 stores a program to be executed by the processor 10 and various kinds of data such as image data or statistical data thereof. The memory 20 may be a random access memory (RAM) , a read-only memory (ROM) or an erasable programmable read-only memory (EPROM or flash memory) .
The power supply 30 includes a battery such as a lithium-ion rechargeable battery and a battery management unit (BMU) for managing the battery.
The communication unit 40 is configured to receive and transmit data to communicate with the web or other electronic devices via wireless communication. The wireless communication may use any communication standard or protocol, including but not limited to GSM (Global System for Mobile communication) , CDMA (Code Division Multiple Access) , LTE (Long Term Evolution) , LTE-Advanced, 5th generation (5G) . The communication unit 40 may include an antenna and a RF (radio frequency) circuit.
Next, referring to FIG. 3, the processor 10 will be described in detail.
The processor 10 includes a statistical information creating unit 11, a determining unit 12 and a correcting unit 13. At least one of the units 11, 12 and 13 may be software (a program) or hardware such as an ASIC (application specific integrated circuit). For example, statistical information such as a histogram can be created by an integrated circuit for performing automatic exposure (i.e., AE).
The statistical information creating unit 11 is configured to create statistical information such as a histogram based on an image captured by the cameras 2, 3. More specifically, the creating unit 11 creates a first statistical information based on a partial image of a first image taken by the main camera 2. Further, the creating unit 11 creates a second statistical information based on a second image taken by the sub camera 3. Here, a field of view (FOV) of the partial image is substantially the same as that of the second image.
The first and second statistical information are, for example, a histogram such as a luminance histogram. The histogram may be a Red histogram, a Green histogram or a Blue histogram.
The determining unit 12 is configured to determine whether to correct the second image based on the first  statistical information and the second statistical information. The detailed determination method will be described later with reference to FIG. 9.
The correcting unit 13 is configured to correct the second image based on the second statistical information if the determining unit 12 determines that the second image should be corrected. The detailed correction method will be described later with reference to FIG. 11.
<Method for the electronic device>
A method according to an implementation of the present disclosure for the electronic device 1 will be described with reference to the flowchart shown in FIG. 4. In this example, the method includes the following steps.
In S1, the main camera 2 takes an image (a first image) and the sub camera 3 takes an image (a second image) . The second image is an enlarged image of a part of the first image. FIG. 5A is an example image I1 taken by the main camera 2 and FIG. 5B is an example image I2 taken by the sub camera 3. The example image I2 corresponds to a region R of the example image I1. In other words, the region R indicates a region corresponding to a FOV of the sub camera 3. Further, FIG. 7A is another example image I3 taken by the main camera 2 and FIG. 7B is another example image I4 taken by the sub camera 3.
In S2, the statistical information creating unit 11 creates a first statistical information based on a partial image of the first image. The partial image is an image of the region R of the first image. The first statistical information includes a first histogram of the partial image, and the second statistical information includes a second histogram of the second image.
In the present disclosure, both of the first histogram and the second histogram are luminance histograms. FIG. 6A shows a luminance histogram H1 of the partial image of the image I1 shown in FIG. 5A. FIG. 8A shows a luminance histogram H3 of the partial image of the image I3 shown in FIG. 7A. A luminance histogram is a graph which shows a luminance distribution of an image. The horizontal axis of the luminance histogram indicates pixel brightness (left side: dark, right side: bright), and the vertical axis indicates the number of pixels for each brightness.
In S3, the statistical information creating unit 11 creates a second statistical information based on the second image. Precisely, the second statistical information is created based on a full FOV of the second image. FIG. 6B is a luminance histogram H2 of the image I2 shown in FIG. 5B. FIG. 8B is a luminance histogram H4 of the image I4 shown in FIG. 7B.
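The histogram creation of S2 and S3 can be sketched as follows for an 8-bit luminance image. The region coordinates, array shapes and function names below are illustrative assumptions, not part of the disclosed method.

```python
import numpy as np

def luminance_histogram(image, bins=256):
    """Build a luminance histogram for an 8-bit grayscale image.

    `image` is assumed to be a 2-D numpy array of luminance values in
    [0, 255]; each bin counts the pixels at that brightness level."""
    hist, _ = np.histogram(image, bins=bins, range=(0, bins))
    return hist

def crop_partial_image(first_image, region):
    """Extract the partial image (region R) whose FOV matches the second
    camera; `region` is a hypothetical (top, bottom, left, right) tuple
    in pixel coordinates of the first image."""
    top, bottom, left, right = region
    return first_image[top:bottom, left:right]

# Example: a synthetic 8-bit first image and a central crop as region R.
first_image = np.tile(np.arange(256, dtype=np.uint8), (64, 1))
partial = crop_partial_image(first_image, (16, 48, 64, 192))
h1 = luminance_histogram(partial)
```

The second histogram would be built the same way, but over the full FOV of the second image rather than a crop.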
In S4, the determining unit 12 determines whether to correct the second image taken by the sub camera 3 based on the first statistical information and the second statistical information. As shown in FIGs. 6A and 6B, in this example, the histogram H1 and the histogram H2 are almost the same, which means that no optical issue such as lens flare has occurred in the sub camera 3. Therefore, the determining unit 12 determines that it is not necessary to correct the second image I2. As a result, erroneous enhancement of the image I2, which shows a beautiful foggy scene, can be avoided.
On the other hand, as shown in FIGs. 8A and 8B, in this example, the luminance histogram H3 and the luminance histogram H4 are different from each other, which means that an optical issue such as lens flare has occurred in the sub camera 3. Therefore, the determining unit 12 determines that it is necessary to correct the second image I4. As a result, the image I4 can be enhanced correctly.
Please note that, in S4, the first statistical information may not be used if it is not normal. Specifically, in the case where the first statistical information is a luminance histogram, the histogram is judged not to be normal when the number of pixels is scaled out over a wide range of grey levels. This phenomenon occurs when a strong light such as sunlight enters the first FOV.
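One possible way to judge that a histogram is "not normal" (i.e., the number of pixels is scaled out over a wide range of grey levels) is sketched below. The scale limit and the fraction threshold are illustrative assumptions, not values taken from the disclosure.

```python
import numpy as np

def histogram_is_normal(hist, scale_limit, max_scaled_fraction=0.25):
    """Hypothetical normality check for the first statistical information.

    A bin is treated as "scaled out" when its pixel count exceeds the
    display scale limit; the histogram is judged abnormal when such bins
    cover more than the given fraction of all grey levels. Both
    thresholds here are assumptions."""
    scaled_out = hist > scale_limit
    return float(scaled_out.mean()) <= max_scaled_fraction

# A histogram clipped across most grey levels (e.g., direct sunlight
# entering the first FOV) versus one with a single narrow peak.
clipped = np.full(256, 10_000)
normal = np.zeros(256, dtype=int)
normal[100:110] = 10_000
```

When the first histogram fails this check, the determination of S4 could simply skip the comparison for that frame.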
An example of a method performed by the determining unit 12 is described in detail with reference to FIG. 9.
In S41, the determining unit 12 calculates a first difference between a minimum signal level and a maximum signal level of the first histogram. In an example shown in FIG.  10A, a difference D1 between a minimum signal level L1 and a maximum signal level L2 is calculated. The minimum signal level L1 may be a value greater than the minimum value of the first histogram by a predetermined ratio (e.g., 1%) . Similarly, the maximum signal level L2 may be a value smaller than the maximum value of the first histogram by a predetermined ratio (e.g., 1%) .
The determining unit 12 may store the difference D1 in the memory 20 so that the correcting unit 13 can read it to correct the second image.
In S42, the determining unit 12 calculates a second difference between a minimum signal level and a maximum signal level of the second histogram. In an example shown in FIG. 10B, a difference D2 between the minimum signal level L3 and the maximum signal level L4 is calculated. The minimum signal level L3 may be a value greater than the minimum value of the second histogram by a predetermined ratio (e.g., 1%) . Similarly, the maximum signal level L4 may be a value smaller than the maximum value of the second histogram by a predetermined ratio (e.g., 1%) .
In S43, the determining unit 12 calculates an absolute difference between the first difference and the second difference. Specifically, the determining unit 12 calculates an absolute difference (i.e., |D1-D2|) between the difference D1 calculated in the S41 and the difference D2 calculated in the S42.
In S44, the determining unit 12 judges whether the absolute difference is greater than a determination threshold value or not. The determination threshold value is a predetermined value stored in advance in the memory 20. If the absolute difference is greater than the determination threshold value, the determining unit 12 determines that the second image should be corrected (S45) .
It should be noted that the determination method described above is just an example, and the method is not limited to the above. The determination that the second image should be corrected may be made if the first histogram is not  similar to the second histogram. For example, the determination may be made based on whether the shapes of both histograms are similar to each other. Alternatively, the determination may be made based on other characteristics of the histograms such as a position of a peak or a number of peaks.
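The determination of S41 to S45 can be sketched as follows. The 1% clip ratio mirrors the example above; the test histograms and the threshold value are illustrative assumptions.

```python
import numpy as np

def signal_range(hist, clip_ratio=0.01):
    """Return the minimum and maximum signal levels of a histogram,
    excluding a predetermined ratio (here 1%) of pixels at each tail,
    as described for the levels L1/L2 and L3/L4."""
    cdf = np.cumsum(hist) / hist.sum()
    min_level = int(np.searchsorted(cdf, clip_ratio))
    max_level = int(np.searchsorted(cdf, 1.0 - clip_ratio))
    return min_level, max_level

def should_correct(hist1, hist2, threshold):
    """Steps S41-S45: compute the first difference D1 and the second
    difference D2, and determine that the second image should be
    corrected when |D1 - D2| exceeds the determination threshold."""
    lo1, hi1 = signal_range(hist1)   # S41
    lo2, hi2 = signal_range(hist2)   # S42
    d1, d2 = hi1 - lo1, hi2 - lo2
    return abs(d1 - d2) > threshold  # S43, S44

# A well-spread first histogram against a narrow, flare-washed second one.
hist1 = np.ones(256, dtype=int)
hist2 = np.zeros(256, dtype=int)
hist2[100:151] = 100
```

Here the narrow second histogram triggers correction, while comparing a histogram against itself does not.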
Returning to the flowchart shown in FIG. 4, the steps after S4 are described below.
If the determining unit 12 determines that the second image should be corrected (i.e., S5: Yes) , the correcting unit 13 corrects the second image based on the second statistical information (S6) . For example, the correcting unit 13 enhances the second image.
An example of a method performed by the correcting unit 13 is described in detail with reference to FIG. 11.
In S61, the correcting unit 13 obtains a first difference between a minimum signal level and a maximum signal level of the first histogram. The first difference is the difference D1 described above. For example, the correcting unit 13 reads the difference D1 stored in the memory 20. Alternatively, the correcting unit 13 may calculate the first difference by itself.
In S62, the correcting unit 13 sets a tone enhance bottom and a tone enhance top. In this step, an initial value of the tone enhance bottom and an initial value of the tone enhance top are set. The tone enhance bottom and the tone enhance top define a tone curve for converting tones of an image. The tone curve is a line which represents a change between a pixel value before correction (input level) and a pixel value after correction (output level) .
A tone curve C1 shown on the left side of FIG. 12A is an example of a tone curve when the initial value of the tone enhance bottom (B0) is 0 and the initial value of the tone enhance top (T0) is 255. The tone curve C1 is a straight line connecting point (0, 0) and point (255, 255) and does not convert an image. A histogram Ha shown on the left side of FIG. 12B is a luminance histogram converted by the tone curve C1.
As shown in the center of FIG. 12A, a value B1 (=B0 + x) is then set as the tone enhance bottom, and a value T1 (=T0 - x) is set as the tone enhance top, where x is a positive number. A tone curve C2 is defined by the values B1 and T1.
In S63, the correcting unit 13 converts the second histogram based on a tone curve defined by the tone enhance bottom and the tone enhance top. A histogram Hb shown in the center of FIG. 12B is a luminance histogram obtained by converting the histogram Ha with the tone curve C2. The histogram Hb has a shape obtained by expanding the histogram Ha in the horizontal direction.
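The conversion of S63 can be sketched as a linear tone curve defined by the tone enhance bottom and top; the bin values below are illustrative.

```python
import numpy as np

def tone_curve(level, bottom, top, max_level=255):
    """Map an input level through the tone curve: levels at or below
    `bottom` become 0, levels at or above `top` become `max_level`, and
    levels in between are stretched linearly, as in the straight lines
    of FIG. 12A."""
    out = (level - bottom) * max_level / (top - bottom)
    return int(np.clip(round(out), 0, max_level))

def convert_histogram(hist, bottom, top):
    """Step S63: move each bin of the histogram to its mapped level,
    expanding the histogram horizontally when bottom > 0 and top < 255
    (compare histograms Ha and Hb of FIG. 12B)."""
    out = np.zeros_like(hist)
    for level, count in enumerate(hist):
        out[tone_curve(level, bottom, top)] += count
    return out

# With bottom = 0 and top = 255 the curve is the identity line C1;
# a narrowed (bottom, top) pair expands a narrow histogram.
narrow = np.zeros(256, dtype=int)
narrow[100:156] = 10
expanded = convert_histogram(narrow, 64, 192)
```

The total pixel count is preserved while the occupied grey levels spread outward.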
In S64, the correcting unit 13 calculates a second difference between a minimum signal level and a maximum signal level of the converted second histogram. In this step, a minimum signal level and a maximum signal level of the converted second histogram obtained in the S63 are set in the same manner as described in the steps S41 and S42. Next, a difference between the minimum signal level and the maximum signal level is calculated.
In S65, the correcting unit 13 calculates an absolute difference between the first difference obtained in the S61 and the second difference calculated in the S64.
In S66, the correcting unit 13 judges whether the absolute difference is greater than a correction threshold value or not. The correction threshold value is a predetermined value stored in advance in the memory 20.
If the absolute difference is greater than the correction threshold value (S66: Yes), the correcting unit 13 changes the tone enhance bottom and the tone enhance top set in S62 so that a difference between the tone enhance bottom and the tone enhance top is smaller than the current difference (S67). In S67, for example, as shown on the right side of FIG. 12A, the value B1 of the tone enhance bottom is changed to a value B2 (=B1 + x), and the value T1 of the tone enhance top is changed to a value T2 (=T1 - x). A tone curve C3 is defined by the values B2 and T2.
Alternatively, an amount of change in the tone enhance bottom and an amount of change in the tone enhance top may be different. That is to say, the value B1 of the tone enhance bottom may be changed to a value of "B1 + x", and the value T1 of the tone enhance top may be changed to a value of "T1 - y", where y is a positive number different from x.
Alternatively, only one of the tone enhance bottom and the tone enhance top may be changed.
Returning to S63, the correcting unit 13 converts the converted second histogram Hb based on the tone curve C3. A histogram Hc shown on the right side of FIG. 12B is a luminance histogram obtained by converting the histogram Hb with the tone curve C3. The histogram Hc has a shape obtained by expanding the histogram Hb in the horizontal direction.
On the other hand, if the absolute difference calculated in the S65 is not greater than the correction threshold value (S66: No) , the correction process ends. FIG. 13A is an example of the second image (i.e., image taken by the sub camera 3) before correction and FIG. 13B is an example of the second image after correction.
It should be noted that the correction method described above is just an example, and the method is not limited to the above. The correction method may be a method in which the second image is converted until a shape of the second histogram is similar to a shape of the first histogram.
According to the implementation of the present disclosure, it is possible to recover a contrast of an image taken by the sub camera (telephoto camera) 3. Therefore, a lens height of the sub camera 3 can be reduced. For example, a distance between the lens 3a and the image sensor 3b can be as short as a distance between the lens 2a and the image sensor 2b. As a result, the thickness T of the electronic device 1 can be made thinner than that of conventional electronic devices. Further, the cost of the electronic device is not increased since a complicated optical system such as a diffractive optical system is not necessary.
The electronic device of the present disclosure described above has two cameras, but it may also have three or more cameras. In this case, the electronic device 1 may further include a third camera having a third FOV. An image taken by the third camera may be used as the first image described above if the third FOV is narrower than the first FOV of the main camera 2. With a wide FOV, strong light such as sunlight easily enters the image and, as a result, there is a higher probability that normal statistical information cannot be obtained.
Alternatively, which image is used as the first image may be determined based on statistical information of each image taken by the first and the third camera.
According to the present disclosure, it can be determined whether an image taken by one camera is deteriorated due to an optical issue such as lens flare by comparing the image with an image taken by another camera having a wider FOV than the one camera. In the above disclosure, the one camera is a telephoto camera and the other camera is a regular camera. Alternatively, the one camera may be a regular camera and the other camera may be a wide-angle camera.
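As a minimal sketch of this determination (assuming 8-bit luminance arrays; the threshold value of 30 and the function name are illustrative assumptions, not values fixed by the disclosure), the dynamic ranges of the two luminance histograms can be compared as follows:

```python
import numpy as np

def is_deteriorated(partial_first, second, threshold=30):
    # Compare the luminance dynamic range (maximum level minus minimum
    # level) of the cropped wide-FOV image against the narrow-FOV image.
    # Flare typically compresses the narrow-FOV histogram, producing a
    # large absolute difference between the two ranges.
    d1 = int(partial_first.max()) - int(partial_first.min())
    d2 = int(second.max()) - int(second.min())
    return abs(d1 - d2) > threshold

# Demo with synthetic 8-bit luminance data.
reference_patch = np.tile(np.arange(10, 241, dtype=np.uint8), 8)  # range 230
washed_tele = np.tile(np.arange(110, 161, dtype=np.uint8), 8)     # range 50
clean_tele = np.tile(np.arange(20, 231, dtype=np.uint8), 8)       # range 210
```

A large absolute difference flags the narrow-FOV image for the correction process described earlier; similar ranges leave it untouched.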
In the description of embodiments of the present disclosure, it is to be understood that terms such as "central", "longitudinal", "transverse", "length", "width", "thickness", "upper", "lower", "front", "rear", "back", "left", "right", "vertical", "horizontal", "top", "bottom", "inner", "outer", "clockwise" and "counterclockwise" should be construed to refer to the orientation or the position as described or as shown in the drawings in discussion. These relative terms are only used to simplify the description of the present disclosure, and do not indicate or imply that the device or element referred to must have a particular orientation, or must be constructed or operated in a particular orientation. Thus, these terms cannot be construed to limit the present disclosure.
In addition, terms such as "first" and "second" are used herein for purposes of description and are not intended to indicate or imply relative importance or significance or to imply the number of indicated technical features. Thus, a feature defined as "first" and "second" may comprise one or more of this feature. In the description of the present disclosure, "a plurality of" means "two or more than two", unless otherwise specified.
In the description of embodiments of the present disclosure, unless specified or limited otherwise, the terms "mounted" , "connected" , "coupled" and the like are used broadly, and may be, for example, fixed connections, detachable connections, or integral connections; may also be mechanical or electrical connections; may also be direct connections or indirect connections via intervening structures; may also be inner communications of two elements which can be understood by those skilled in the art according to specific situations.
In the embodiments of the present disclosure, unless specified or limited otherwise, a structure in which a first feature is "on" or "below" a second feature may include an embodiment in which the first feature is in direct contact with the second feature, and may also include an embodiment in which the first feature and the second feature are not in direct contact with each other, but are in contact via an additional feature formed therebetween. Furthermore, a first feature "on" , "above" or "on top of" a second feature may include an embodiment in which the first feature is orthogonally or obliquely "on" , "above" or "on top of" the second feature, or just means that the first feature is at a height higher than that of the second feature; while a first feature "below" , "under" or "on bottom of" a second feature may include an embodiment in which the first feature is orthogonally or obliquely "below" , "under" or "on bottom of" the second feature, or just means that the first feature is at a height lower than that of the second feature.
Various embodiments and examples are provided in the above description to implement different structures of the present disclosure. In order to simplify the present disclosure, certain elements and settings are described in the above. However, these elements and settings are only by way of example and are not intended to limit the present disclosure. In addition, reference numbers and/or reference letters may be repeated in different examples in the present disclosure. This repetition is for the purpose of simplification and clarity and does not refer to relations between different embodiments and/or settings. Furthermore, examples of different processes and materials are provided in the present disclosure. However, it would be appreciated by those skilled in the art that other processes and/or materials may also be applied.
Reference throughout this specification to "an embodiment" , "some embodiments" , "an exemplary embodiment" , "an example" , "a specific example" or "some examples" means that a particular feature, structure, material, or characteristics described in connection with the embodiment or example is included in at least one embodiment or example of the present disclosure. Thus, the appearances of the above phrases throughout this specification are not necessarily referring to the same embodiment or example of the present disclosure. Furthermore, the particular features, structures, materials, or characteristics may be combined in any suitable manner in one or more embodiments or examples.
Any process or method described in a flow chart or described herein in other ways may be understood to include one or more modules, segments or portions of codes of executable instructions for achieving specific logical functions or steps in the process, and the scope of a preferred embodiment of the present disclosure includes other implementations, in which it should be understood by those skilled in the art that functions may be implemented in a sequence other than the sequences shown or discussed, including in a substantially simultaneous manner or in a reverse sequence.
The logic and/or steps described in other manners herein or shown in the flow chart, for example, a particular sequence table of executable instructions for realizing the logical function, may be specifically achieved in any computer readable medium to be used by the instruction execution system, device or equipment (such as a system based on computers, a system comprising processors or other systems capable of obtaining instructions from the instruction execution system, device and equipment and executing the instructions), or to be used in combination with the instruction execution system, device and equipment. In this specification, "the computer readable medium" may be any device adaptive for including, storing, communicating, propagating or transferring programs to be used by or in combination with the instruction execution system, device or equipment. More specific examples of the computer readable medium comprise but are not limited to: an electronic connection (an electronic device) with one or more wires, a portable computer enclosure (a magnetic device), a random access memory (RAM), a read only memory (ROM), an erasable programmable read-only memory (EPROM or a flash memory), an optical fiber device and a portable compact disk read-only memory (CDROM). In addition, the computer readable medium may even be a paper or other appropriate medium capable of printing programs thereon. This is because, for example, the paper or other appropriate medium may be optically scanned and then edited, decrypted or processed with other appropriate methods when necessary to obtain the programs in an electronic manner, and then the programs may be stored in the computer memories.
It should be understood that each part of the present disclosure may be realized by hardware, software, firmware or their combination. In the above embodiments, a plurality of steps or methods may be realized by software or firmware stored in the memory and executed by an appropriate instruction execution system. For example, if it is realized by hardware, likewise in another embodiment, the steps or methods may be realized by one or a combination of the following techniques known in the art: a discrete logic circuit having a logic gate circuit for realizing a logic function of a data signal, an application-specific integrated circuit having an appropriate combination logic gate circuit, a programmable gate array (PGA), a field programmable gate array (FPGA), etc.
Those skilled in the art shall understand that all or part of the steps in the above exemplifying methods of the present disclosure may be achieved by instructing the related hardware with programs. The programs may be stored in a computer readable storage medium, and, when run on a computer, the programs perform one or a combination of the steps in the method embodiments of the present disclosure.
In addition, each function cell of the embodiments of the present disclosure may be integrated in a processing module, or the cells may exist separately and physically, or two or more cells may be integrated in a processing module. The integrated module may be realized in a form of hardware or in a form of software function modules. When the integrated module is realized in a form of software function module and is sold or used as a standalone product, the integrated module may be stored in a computer readable storage medium.
The storage medium mentioned above may be a read-only memory, a magnetic disk, a CD, etc.
Although embodiments of the present disclosure have been shown and described, it would be appreciated by those skilled in the art that the embodiments are explanatory and cannot be construed to limit the present disclosure, and changes, modifications, alternatives and variations can be made in the embodiments without departing from the scope of the present disclosure.

Claims (16)

  1. A method for an electronic device which comprises a first camera having a first FOV and a second camera having a second FOV which is narrower than the first FOV, the method comprising:
    taking a first image by the first camera and a second image by the second camera, the second image being a part of the first image;
    creating a first statistical information based on a partial image of the first image, a FOV of the partial image being substantially the same as that of the second image;
    creating a second statistical information based on the second image; and
    determining whether to correct the second image based on the first statistical information and the second statistical information.
  2. The method according to claim 1, wherein the first statistical information includes a first histogram of the partial image, and the second statistical information includes a second histogram of the second image.
  3. The method according to claim 2, wherein in the determining whether to correct the second image, it is determined that the second image will be corrected if the first histogram is not similar to the second histogram.
  4. The method according to claim 2 or 3, wherein both of the first histogram and the second histogram are luminance histograms.
  5. The method according to any one of claims 2 to 4, wherein the determining whether to correct the second image comprising:
    calculating a first difference between a minimum signal level and a maximum signal level of the first histogram;
    calculating a second difference between a minimum signal level and a maximum signal level of the second histogram;
    calculating an absolute difference between the first difference and the second difference; and
    determining that the second image will be corrected if the absolute difference is greater than a determination threshold value.
  6. The method according to claim 1, further comprising:
    correcting the second image based on the second statistical information if it is determined that the second image will be corrected.
  7. The method according to claim 6, wherein the first statistical information includes a first histogram of the partial image, the second statistical information includes a second histogram of the second image, and the second image is converted until a shape of the second histogram is similar to a shape of the first histogram.
  8. The method according to claim 6 or 7, wherein the first statistical information includes a first histogram of the partial image, and the second statistical information includes a second histogram of the second image, and wherein
    the correcting the second image comprising:
    obtaining a first difference between a minimum signal level and a maximum signal level of the first histogram;
    setting a tone enhance bottom and a tone enhance top;
    converting the second histogram based on a tone curve defined by the tone enhance bottom and the tone enhance top;
    calculating a second difference between a minimum signal level and a maximum signal level of the converted second histogram;
    calculating an absolute difference between the first difference and the second difference; and
    changing the tone enhance bottom and/or the tone enhance top so that a difference between the tone enhance bottom and the tone enhance top is smaller if the absolute difference is greater than a correction threshold value; and
    converting the converted second histogram based on the changed tone enhance bottom and/or the changed tone enhance top.
  9. An electronic device comprising:
    a first camera configured to have a first FOV and to take a first image;
    a second camera configured to have a second FOV which is narrower than the first FOV and to take a second image, the second image being a part of the first image;
    a statistical information creating unit configured to create a first statistical information based on a partial image of the first image, and to create a second statistical information based on the second image, a FOV of the partial image being substantially the same as that of the second image; and
    a determining unit configured to determine whether to correct the second image based on the first statistical information and the second statistical information.
  10. The electronic device according to claim 9, wherein the first statistical information includes a first histogram of the partial image, and the second statistical information includes a second histogram of the second image.
  11. The electronic device according to claim 10, wherein the determining unit determines that the second image will be corrected if the first histogram is not similar to the second histogram.
  12. The electronic device according to claim 10 or 11, wherein both of the first histogram and the second histogram are luminance histograms.
  13. The electronic device according to claim 9, further comprising a correcting unit configured to correct the second image based on the second statistical information if it is determined that the second image will be corrected.
  14. The electronic device according to claim 13, wherein the first statistical information includes a first histogram of the partial image, and the second statistical information includes a second histogram of the second image, and
    the correcting unit converts the second image until a shape of the second histogram is similar to a shape of the first histogram.
  15. The electronic device according to any one of claims 9 to 14, wherein the first camera has a first lens and a first image sensor, and the second camera has a second lens and a second image sensor, and
    a first distance between the first lens and the first image sensor is substantially the same as a second distance between the second lens and the second image sensor.
  16. The electronic device according to any one of claims 9 to 15, wherein the electronic device is a smartphone, and the first and second cameras are disposed in a housing of the smartphone so as to face a back side thereof.
PCT/CN2020/071141 2020-01-09 2020-01-09 Method for electronic device with a plurality of cameras and electronic device WO2021138867A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202080091004.9A CN114930799B (en) 2020-01-09 2020-01-09 Method for electronic device with multiple cameras and electronic device
PCT/CN2020/071141 WO2021138867A1 (en) 2020-01-09 2020-01-09 Method for electronic device with a plurality of cameras and electronic device


Publications (1)

WO2021138867A1 (en), published 2021-07-15



Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP4164211A1 (en) * 2021-10-06 2023-04-12 Axis AB Method and system for stray light compensation

Citations (6)

Publication number Priority date Publication date Assignee Title
US20110128406A1 (en) * 2009-11-30 2011-06-02 Canon Kabushiki Kaisha Image pickup apparatus capable of correcting image quality degradation due to optical member, method of controlling image pickup apparatus, and nonvolatile storage medium
US20180096487A1 (en) * 2016-09-30 2018-04-05 Qualcomm Incorporated Systems and methods for fusing images
US20180352165A1 (en) * 2017-06-05 2018-12-06 Samsung Electronics Co., Ltd. Device having cameras with different focal lengths and a method of implementing cameras with different focal lenghts
CN109040524A (en) * 2018-08-16 2018-12-18 Oppo广东移动通信有限公司 Artifact eliminating method, device, storage medium and terminal
CN109299696A (en) * 2018-09-29 2019-02-01 成都臻识科技发展有限公司 A kind of method for detecting human face and device based on dual camera
US20190227267A1 (en) * 2015-01-03 2019-07-25 Corephotonics Ltd. Miniature telephoto lens module and a camera utilizing such a lens module

Family Cites Families (4)

Publication number Priority date Publication date Assignee Title
CN102457675B (en) * 2010-10-27 2014-08-06 展讯通信(上海)有限公司 Image shooting anti-shaking manner for handheld camera equipment
CN107085825A (en) * 2017-05-27 2017-08-22 成都通甲优博科技有限责任公司 Image weakening method, device and electronic equipment
US10244164B1 (en) * 2017-09-11 2019-03-26 Qualcomm Incorporated Systems and methods for image stitching
CN107835372A (en) * 2017-11-30 2018-03-23 广东欧珀移动通信有限公司 Imaging method, device, mobile terminal and storage medium based on dual camera



Also Published As

Publication number Publication date
CN114930799A (en) 2022-08-19
CN114930799B (en) 2024-02-20


Legal Events

121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 20912788; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 20912788; Country of ref document: EP; Kind code of ref document: A1)