WO2018161758A1 - Exposure control method, exposure control device and electronic device - Google Patents

Exposure control method, exposure control device and electronic device

Info

Publication number
WO2018161758A1
WO2018161758A1 · PCT/CN2018/075491 · CN2018075491W
Authority
WO
WIPO (PCT)
Prior art keywords
exposure amount
main image
depth
exposure
processing
Prior art date
Application number
PCT/CN2018/075491
Other languages
English (en)
French (fr)
Inventor
Sun Jianbo (孙剑波)
Original Assignee
Guangdong OPPO Mobile Telecommunications Corp., Ltd. (广东欧珀移动通信有限公司)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong OPPO Mobile Telecommunications Corp., Ltd. (广东欧珀移动通信有限公司)
Priority: EP18764519.7A (granted as EP3579546B1)
Published as WO2018161758A1
Priority: US16/561,806 (granted as US11206360B2)

Classifications

    • H04N 5/2226 Determination of depth image, e.g. for foreground/background separation
    • G01S 17/08 Systems determining position data of a target for measuring distance only
    • G06T 5/50 Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • G06T 5/92 Dynamic range modification of images or parts thereof based on global image properties
    • G06T 7/11 Region-based segmentation
    • G06T 7/194 Segmentation; edge detection involving foreground-background segmentation
    • G06T 7/55 Depth or shape recovery from multiple images
    • H04N 13/20 Image signal generators (stereoscopic/multi-view video systems)
    • H04N 23/70 Circuitry for compensating brightness variation in the scene
    • H04N 23/71 Circuitry for evaluating the brightness variation
    • H04N 23/73 Compensating brightness variation in the scene by influencing the exposure time
    • H04N 23/741 Increasing the dynamic range of the image compared to the dynamic range of the electronic image sensors
    • H04N 23/743 Bracketing, i.e. taking a series of images with varying exposure conditions
    • H04N 23/959 Computational photography for extended depth of field imaging by adjusting depth of field during image capture, e.g. maximising or setting range based on scene characteristics
    • G01S 17/894 3D imaging with simultaneous measurement of time-of-flight at a 2D array of receiver pixels, e.g. time-of-flight cameras or flash lidar
    • G06T 2207/10024 Color image
    • G06T 2207/10028 Range image; depth image; 3D point clouds
    • G06T 2207/10144 Varying exposure
    • G06T 2207/20208 High dynamic range [HDR] image processing
    • G06T 2207/20221 Image fusion; image merging

Definitions

  • The present invention relates to imaging technology, and in particular to an exposure control method, an exposure control device, and an electronic device.
  • In the related art, the exposure times of the underexposed image and the overexposed image are generally fixed; if these exposure times are not appropriate for the scene, the quality of the wide dynamic range image deteriorates.
  • The present invention aims to solve at least one of the technical problems existing in the prior art.
  • To that end, embodiments of the present invention provide an exposure control method, an exposure control device, and an electronic device.
  • In the exposure control method, the exposure of the imaging device is controlled according to the reference exposure amount, the dark frame exposure amount, and the bright frame exposure amount.
  • A depth-based exposure control device is configured to control an imaging device to collect scene data, the scene data including a cached main image. The exposure control device includes a processing module, a first determining module, a second determining module, and a control module.
  • The processing module is configured to process the scene data to obtain a foreground portion of the cached main image.
  • The first determining module is configured to determine a reference exposure amount according to the brightness information of the foreground portion.
  • The second determining module is configured to determine a dark frame exposure amount and a bright frame exposure amount according to the reference exposure amount, the dark frame exposure amount being smaller than the reference exposure amount and the bright frame exposure amount being greater than the reference exposure amount.
  • The control module is configured to control exposure of the imaging device according to the reference exposure amount, the dark frame exposure amount, and the bright frame exposure amount.
  • An electronic device includes an imaging device and the exposure control device.
  • The exposure control method, the exposure control device, and the electronic device determine the foreground portion as the subject of the image using depth information, determine the reference exposure amount according to the luminance information of that subject, determine the dark frame exposure amount and the bright frame exposure amount according to the reference exposure amount, and control the exposure of the imaging device according to these three exposure amounts to obtain a plurality of images, so that a wide dynamic range image with a clear subject and a reasonable dynamic range can be obtained by image processing.
  • FIG. 1 is a flow chart showing an exposure control method according to an embodiment of the present invention.
  • FIG. 2 is a schematic plan view of an electronic device according to an embodiment of the present invention.
  • FIG. 3 is a flow chart of an exposure control method according to some embodiments of the present invention.
  • FIG. 4 is a functional block diagram of a processing module of an exposure control device according to some embodiments of the present invention.
  • FIG. 5 is a schematic flow chart of an exposure control method according to some embodiments of the present invention.
  • FIG. 6 is a functional block diagram of a processing unit in accordance with some embodiments of the present invention.
  • FIG. 7 is a flow chart showing an exposure control method according to some embodiments of the present invention.
  • FIG. 8 is a schematic diagram of another functional block of a processing unit in accordance with some embodiments of the present invention.
  • FIG. 9 is a flow chart showing an exposure control method according to some embodiments of the present invention.
  • FIG. 10 is a schematic diagram of functional modules of a first acquiring unit according to some embodiments of the present invention.
  • FIG. 11 is a flow chart showing an exposure control method according to some embodiments of the present invention.
  • FIG. 12 is a flow chart showing an exposure control method according to some embodiments of the present invention.
  • FIG. 13 is a functional block diagram of an exposure control device in accordance with some embodiments of the present invention.
  • FIG. 14 is a flow chart showing an exposure control method according to some embodiments of the present invention.
  • FIG. 15 is a functional block diagram of a second determining module of some embodiments of the present invention.
  • FIG. 16 is a flow chart showing an exposure control method according to some embodiments of the present invention.
  • FIG. 17 is a schematic diagram of another functional block of an exposure control device according to some embodiments of the present invention.
  • Reference numerals: electronic device 100; exposure control device 10; processing module 11; processing unit 112; first processing sub-unit 1122; second processing sub-unit 1124; third processing sub-unit 1126; fourth processing sub-unit 1128; first obtaining unit 114; fifth processing sub-unit 1142; finding sub-unit 1144; first determining module 12; second determining module 13; second obtaining unit 132; third obtaining unit 134; determining unit 136; control module 14; third determining module 15.
  • The depth-based exposure control method of the embodiment of the present invention can be used to control the imaging device 20 to collect scene data.
  • The scene data includes a cached main image.
  • The exposure control method includes the following steps:
  • S11 The scene data is processed to obtain a foreground portion of the cached main image;
  • S12 A reference exposure amount is determined according to the brightness information of the foreground portion;
  • S13 A dark frame exposure amount and a bright frame exposure amount are determined according to the reference exposure amount, the dark frame exposure amount being smaller than the reference exposure amount and the bright frame exposure amount being greater than the reference exposure amount;
  • S14 The imaging device 20 is controlled to be exposed according to the reference exposure amount, the dark frame exposure amount, and the bright frame exposure amount.
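As an illustration only (not the patent's algorithm), steps S11–S14 can be sketched on per-pixel depth and luminance arrays. The 1 m foreground depth band, the mid-grey target of 118, and the 0.5×/2× bracketing factors are all assumed values:

```python
def exposure_control(depth, luma, dark_scale=0.5, bright_scale=2.0):
    """Toy sketch of steps S11-S14 on per-pixel depth/luminance arrays."""
    # S11: take pixels close to the foremost point as the foreground (subject).
    threshold = min(min(row) for row in depth) + 1.0  # illustrative 1 m band
    fg = [(i, j) for i, row in enumerate(depth)
          for j, d in enumerate(row) if d < threshold]
    # S12: reference exposure from the mean brightness of the foreground,
    # aiming at an 8-bit mid-grey target of 118.
    mean_luma = sum(luma[i][j] for i, j in fg) / len(fg)
    reference = 118.0 / mean_luma  # relative exposure multiplier
    # S13: dark frame below and bright frame above the reference.
    dark, bright = reference * dark_scale, reference * bright_scale
    # S14: the three exposure amounts would then drive the imaging device.
    return reference, dark, bright

ref, dark, bright = exposure_control(
    depth=[[0.5, 0.6], [3.0, 3.2]],   # subject at ~0.5 m, background at ~3 m
    luma=[[59, 59], [200, 210]])
assert dark < ref < bright
```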
  • The depth-based exposure control apparatus 10 of the embodiment of the present invention may be used to control the imaging device 20 to collect scene data.
  • The scene data includes a cached main image.
  • The exposure control device 10 includes a processing module 11, a first determining module 12, a second determining module 13, and a control module 14.
  • The processing module 11 is configured to process the scene data to obtain a foreground portion of the cached main image.
  • The first determining module 12 is configured to determine a reference exposure amount according to the brightness information of the foreground portion.
  • The second determining module 13 is configured to determine a dark frame exposure amount and a bright frame exposure amount according to the reference exposure amount, the dark frame exposure amount being less than the reference exposure amount and the bright frame exposure amount being greater than the reference exposure amount.
  • The control module 14 is configured to control the exposure of the imaging device 20 according to the reference exposure amount, the dark frame exposure amount, and the bright frame exposure amount.
  • The exposure control method of the embodiment of the present invention can be implemented by the exposure control device 10: step S11 can be implemented by the processing module 11, step S12 by the first determining module 12, step S13 by the second determining module 13, and step S14 by the control module 14.
  • The exposure control device 10 of the embodiment of the present invention may be applied to the electronic device 100 of the embodiment of the present invention; in other words, the electronic device 100 may include the exposure control device 10. Furthermore, the electronic device 100 further includes an imaging device 20 that is electrically connected to the exposure control device 10.
  • The exposure control method, the exposure control device 10, and the electronic device 100 of the embodiment of the present invention determine the foreground portion as the subject of the image using depth information, determine the reference exposure amount according to the luminance information of that subject, determine the dark frame exposure amount and the bright frame exposure amount according to the reference exposure amount, and control the imaging device 20 according to these three exposure amounts to obtain a plurality of images, so that a wide dynamic range image with a clear subject and a reasonable dynamic range can be obtained by image processing.
  • The reference exposure amount may be the exposure amount of the normally exposed image; obtaining it from the luminance information of the foreground portion improves the definition of the foreground portion, that is, the subject, making the normally exposed image more pleasing to view.
  • The electronic device 100 includes, without limitation, a cell phone or a tablet computer. In an embodiment of the invention, the electronic device 100 is a mobile phone.
  • The imaging device 20 includes a front camera and/or a rear camera, without limitation. In an embodiment of the invention, the imaging device 20 is a front camera.
  • In some embodiments, step S11 includes the following steps:
  • S112 The scene data is processed to obtain depth information of the cached main image;
  • S114 A foreground portion of the cached main image is acquired according to the depth information.
  • The processing module 11 includes a processing unit 112 and a first obtaining unit 114.
  • The processing unit 112 is configured to process the scene data to obtain depth information of the cached main image.
  • The first obtaining unit 114 is configured to acquire a foreground portion of the cached main image according to the depth information.
  • Step S112 can be implemented by the processing unit 112, and step S114 can be implemented by the first obtaining unit 114.
  • In this way, the foreground portion of the cached main image can be obtained from the depth information.
  • In some embodiments, the scene data includes a depth image corresponding to the cached main image, and step S112 includes the following steps:
  • S1122 The depth image is processed to obtain depth data of the cached main image;
  • S1124 The depth data is processed to obtain the depth information.
  • The processing unit 112 includes a first processing sub-unit 1122 and a second processing sub-unit 1124.
  • The first processing sub-unit 1122 is configured to process the depth image to obtain depth data of the cached main image.
  • The second processing sub-unit 1124 is configured to process the depth data to obtain the depth information.
  • Step S1122 can be implemented by the first processing sub-unit 1122, and step S1124 can be implemented by the second processing sub-unit 1124.
  • In this way, the depth information of the cached main image can be quickly obtained by using the depth image.
  • The cached main image is an RGB color image, and the depth image contains a large amount of depth data, that is, the depth of each person or object in the scene; the depth information includes the size and/or range of the depths. Since the color information of the cached main image has a one-to-one correspondence with the depth information of the depth image, the depth information of the cached main image can be obtained.
  • The depth image corresponding to the cached main image may be acquired using structured light depth ranging or using a time of flight (TOF) depth camera.
  • When the depth image is acquired using structured light depth ranging, the imaging device 20 includes a camera and a projector.
  • In structured light depth ranging, the projector projects light with a known pattern onto the surface of the object; the pattern is modulated by the shape of the measured surface, forming a three-dimensional image of light strips on it.
  • The camera captures the light strips, yielding a two-dimensional distorted strip image.
  • The degree of distortion of the strips depends on the relative position between the projector and the camera and on the surface profile or height of the object.
  • The displacement along a light strip is proportional to the height of the object surface; kinks indicate changes in the plane, and discontinuities reveal physical gaps in the surface.
  • Since the relative position between the projector and the camera is constant, the three-dimensional contour of the object surface can be reproduced from the coordinates of the distorted two-dimensional strip image, and the depth information can thus be acquired.
  • Structured light depth ranging offers high resolution and measurement accuracy.
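The triangulation underlying structured light ranging can be illustrated with a toy calculation. The pinhole-model relation Z = b·f/u (depth inversely proportional to the observed stripe displacement) and the sample numbers are illustrative assumptions, not values from the patent:

```python
def stripe_depth(displacement_px, baseline_m, focal_px):
    """Triangulate depth from the observed stripe displacement.
    Under a pinhole model the displacement u is inversely proportional
    to depth: Z = b * f / u (illustrative simplification)."""
    return baseline_m * focal_px / displacement_px

# A stripe shifted 50 px, with a 5 cm projector-camera baseline and an
# 800 px focal length, places the surface at 0.8 m.
assert abs(stripe_depth(50, 0.05, 800) - 0.8) < 1e-9
```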
  • When a depth image is acquired using a TOF depth camera, the imaging device 20 includes a TOF depth camera.
  • The TOF depth camera uses a sensor to record the phase change of modulated infrared light as it travels from the light emitting unit to the object and is reflected back, and obtains the depth of the entire scene in real time from the speed of light.
  • The depth calculation of the TOF depth camera is not affected by the gray level or surface features of the object; the depth information can be computed quickly, with high real-time performance.
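The phase-shift relation used by continuous-wave TOF cameras can be written out directly. The formula d = c·Δφ/(4πf) is the standard round-trip relation; the 20 MHz modulation frequency below is only an example:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def tof_depth(phase_shift_rad, mod_freq_hz):
    """Depth from the phase shift of modulated light: d = c * dphi / (4*pi*f).
    The light travels out and back, hence 4*pi rather than 2*pi."""
    return C * phase_shift_rad / (4 * math.pi * mod_freq_hz)

# At 20 MHz modulation, a phase shift of pi/2 corresponds to ~1.87 m.
d = tof_depth(math.pi / 2, 20e6)
assert abs(d - C / (8 * 20e6)) < 1e-9
```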
  • In some embodiments, the scene data includes a cached secondary image corresponding to the cached main image, and step S112 includes the following steps:
  • S1126 The cached main image and the cached secondary image are processed to obtain depth data of the cached main image;
  • S1128 The depth data is processed to obtain the depth information.
  • The processing unit 112 includes a third processing sub-unit 1126 and a fourth processing sub-unit 1128.
  • The third processing sub-unit 1126 is configured to process the cached main image and the cached secondary image to obtain depth data of the cached main image.
  • The fourth processing sub-unit 1128 is configured to process the depth data to obtain the depth information.
  • Step S1126 can be implemented by the third processing sub-unit 1126, and step S1128 can be implemented by the fourth processing sub-unit 1128.
  • In this way, the depth information of the cached main image can be obtained by processing the cached main image and the cached secondary image.
  • In these embodiments, the imaging device 20 includes a primary camera and a secondary camera.
  • The depth information can be acquired by binocular stereo vision ranging, in which case the scene data includes the cached main image and the cached secondary image.
  • The cached main image is captured by the primary camera, and the cached secondary image is captured by the secondary camera.
  • Binocular stereo vision ranging uses two identical cameras to image the same subject from different positions, obtaining a stereo image pair of the subject; an algorithm then matches corresponding image points of the pair to calculate the parallax, and the depth information is recovered by triangulation.
  • In this way, the depth information of the cached main image can be obtained by matching the stereo image pair formed by the cached main image and the cached secondary image.
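A minimal sketch of the matching and triangulation described above, assuming rectified image rows, a sum-of-absolute-differences cost, and invented camera parameters (2 cm baseline, 700 px focal length):

```python
def disparity_to_depth(disparity_px, baseline_m, focal_px):
    """Depth from binocular parallax: Z = f * B / d."""
    return focal_px * baseline_m / disparity_px

def match_disparity(left_row, right_row, x, window=1, max_d=8):
    """Find the disparity at column x of the left row by minimising the
    sum of absolute differences over a small window."""
    best_d, best_cost = 0, float("inf")
    for d in range(min(max_d, x - window) + 1):
        cost = sum(abs(left_row[x + k] - right_row[x - d + k])
                   for k in range(-window, window + 1))
        if cost < best_cost:
            best_d, best_cost = d, cost
    return best_d

# The feature at left column 4 matches right column 2 -> disparity 2;
# with a 2 cm baseline and 700 px focal length the point is 7 m away.
left  = [0, 0, 0, 10, 50, 10, 0, 0, 0, 0]
right = [0, 10, 50, 10, 0, 0, 0, 0, 0, 0]
d = match_disparity(left, right, x=4)
assert d == 2
assert disparity_to_depth(d, 0.02, 700) == 7.0
```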
  • In some embodiments, step S114 includes the following steps:
  • S1142 A foremost point of the cached main image is obtained according to the depth information;
  • S1144 An area adjacent to the foremost point and continuously varying in depth is found as the foreground portion.
  • The first obtaining unit 114 includes a fifth processing sub-unit 1142 and a finding sub-unit 1144.
  • The fifth processing sub-unit 1142 is configured to obtain a foremost point of the cached main image according to the depth information.
  • The finding sub-unit 1144 is configured to find an area adjacent to the foremost point and continuously varying in depth as the foreground portion.
  • Step S1142 can be implemented by the fifth processing sub-unit 1142, and step S1144 can be implemented by the finding sub-unit 1144.
  • In this way, the physically connected foreground portion of the cached main image can be obtained.
  • The parts of the foreground are usually connected together. By taking the physically connected foreground as the subject, the extent of the foreground portion can be obtained intuitively.
  • Specifically, the foremost point of the cached main image is obtained according to the depth information; the foremost point is the starting point of the foreground portion, and the region is grown outward from it to obtain the area adjacent to the foremost point and continuously varying in depth. This area and the foremost point are merged into the foreground region.
  • The foremost point is the pixel corresponding to the object with the smallest depth, that is, the object with the smallest object distance, closest to the imaging device 20.
  • Adjacency means that two pixel points are connected to each other.
  • Continuously varying depth means that the depth difference between two adjacent pixel points is smaller than a predetermined difference; the region thus grows over adjacent pixel points whose depths differ by less than that predetermined difference.
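The region growing described above can be sketched as a breadth-first flood fill over the depth map. The 4-neighbour adjacency and the 0.1 m continuity step are assumptions for illustration:

```python
from collections import deque

def grow_foreground(depth, max_step=0.1):
    """Region-grow from the foremost (smallest-depth) pixel, adding
    4-adjacent pixels whose depth differs from their neighbour's by less
    than max_step (the 'continuously varying depth' criterion)."""
    h, w = len(depth), len(depth[0])
    start = min(((i, j) for i in range(h) for j in range(w)),
                key=lambda p: depth[p[0]][p[1]])
    seen, queue = {start}, deque([start])
    while queue:
        i, j = queue.popleft()
        for ni, nj in ((i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1)):
            if (0 <= ni < h and 0 <= nj < w and (ni, nj) not in seen
                    and abs(depth[ni][nj] - depth[i][j]) < max_step):
                seen.add((ni, nj))
                queue.append((ni, nj))
    return seen

depth = [[0.50, 0.55, 3.0],
         [0.56, 0.60, 3.1],
         [3.00, 3.10, 3.2]]
# Only the smoothly connected near region joins the foreground.
assert grow_foreground(depth) == {(0, 0), (0, 1), (1, 0), (1, 1)}
```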
  • In some embodiments, step S114 may include the following steps:
  • S1146 A foremost point of the cached main image is obtained according to the depth information;
  • S1148 An area whose depth differs from that of the foremost point by less than a predetermined threshold is found as the foreground portion.
  • In this way, the logically associated foreground portion of the cached main image can be obtained.
  • The parts of the foreground may not be physically connected but may still be logically related. For example, in a scene where an eagle swoops down to catch a chick, the eagle and the chick are not physically connected, but logically they can be judged to be linked.
  • Specifically, the foremost point of the cached main image is obtained according to the depth information; the foremost point is the starting point of the foreground portion, and the areas whose depth differs from that of the foremost point by less than a predetermined threshold are found. These areas and the foremost point are merged into the foreground region.
  • The predetermined threshold can be a value set by the user, so that the user can determine the range of the foreground portion according to his or her own needs, thereby obtaining an ideal composition suggestion and achieving an ideal composition.
  • Alternatively, the predetermined threshold may be a value determined by the exposure control device 10, without any limitation. A threshold determined by the exposure control device 10 may be a fixed value stored internally, or a value calculated according to the circumstances, such as the depth of the foremost point.
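The threshold-based variant reduces to a simple depth-band test; the 0.5 m predetermined threshold below is an arbitrary example, not a value from the patent:

```python
def threshold_foreground(depth, predetermined=0.5):
    """Take every pixel whose depth is within `predetermined` of the
    foremost point, whether or not it is physically connected to it."""
    nearest = min(min(row) for row in depth)
    return [(i, j) for i, row in enumerate(depth)
            for j, d in enumerate(row) if d - nearest < predetermined]

# The eagle (0.8 m) and the chick (1.0 m) are separated in the image but
# both fall inside the 0.5 m band around the foremost point.
depth = [[0.8, 4.0, 1.0]]
assert threshold_foreground(depth) == [(0, 0), (0, 2)]
```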
  • In some embodiments, the area whose depth falls within a predetermined interval may be taken as the foreground portion, and the area of the cached main image other than the foreground portion is taken as the background portion.
  • The subject is not always the foremost object; it may sit slightly behind something nearer. For example, when photographing a person seated behind a computer, the computer is closer to the camera, but the person is the subject. Taking the area whose depth lies within a predetermined interval as the foreground portion effectively avoids selecting the wrong subject.
  • In some embodiments, the reference exposure amount includes an exposure time and a sensitivity of the imaging device 20, and the exposure control method includes the following steps:
  • S15 The exposure time is determined according to motion information of the foreground portion;
  • S16 The sensitivity is determined according to the reference exposure amount and the exposure time.
  • The exposure control device 10 includes a third determining module 15 and a fourth determining module 16.
  • The third determining module 15 is configured to determine the exposure time according to the motion information of the foreground portion.
  • The fourth determining module 16 is configured to determine the sensitivity according to the reference exposure amount and the exposure time.
  • Step S15 can be implemented by the third determining module 15, and step S16 can be implemented by the fourth determining module 16.
  • In this way, the exposure time and the sensitivity of the imaging device 20 can be determined according to the motion state of the foreground portion.
  • When the foreground portion is moving, the exposure time can be reduced and, to preserve the brightness of the foreground portion, the sensitivity can be increased, so that the reference exposure amount remains substantially unchanged.
  • When the foreground portion is still, the sensitivity can be appropriately lowered and the exposure time increased.
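This time/sensitivity trade-off can be illustrated by holding the product of exposure time and sensitivity constant (a common first-order model, not a formula stated in the patent):

```python
def iso_for_time(reference_exposure, exposure_time_s):
    """Hold the reference exposure fixed while the exposure time changes:
    exposure ~ time * ISO, so ISO = reference / time (illustrative units)."""
    return reference_exposure / exposure_time_s

# Halving the exposure time for a moving subject doubles the sensitivity,
# leaving the reference exposure amount unchanged.
assert iso_for_time(4.0, 1 / 50) == 2 * iso_for_time(4.0, 1 / 25)
```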
  • In some embodiments, step S13 includes the following steps:
  • S132 An area of the cached main image whose brightness value is greater than a first brightness threshold is obtained as a highlight area;
  • S134 An area of the cached main image whose brightness value is smaller than a second brightness threshold is obtained as a low-light area, the first brightness threshold being greater than the second brightness threshold;
  • S136 The dark frame exposure amount and the bright frame exposure amount are determined according to the proportions of the highlight area and the low-light area and the reference exposure amount.
  • The second determining module 13 includes a second obtaining unit 132, a third obtaining unit 134, and a determining unit 136.
  • The second obtaining unit 132 is configured to obtain, as a highlight area, an area of the cached main image whose brightness value is greater than the first brightness threshold.
  • The third obtaining unit 134 is configured to obtain, as a low-light area, an area of the cached main image whose brightness value is smaller than the second brightness threshold, the first brightness threshold being greater than the second brightness threshold.
  • The determining unit 136 is configured to determine the dark frame exposure amount and the bright frame exposure amount according to the proportions of the highlight area and the low-light area and the reference exposure amount.
  • Step S132 can be implemented by the second obtaining unit 132, step S134 by the third obtaining unit 134, and step S136 by the determining unit 136.
  • the dark frame exposure amount and the bright frame exposure amount can be determined according to the ratio of the highlight area and the low light area of the cache main image and the reference exposure amount.
  • the dark frame exposure amount may be the exposure amount of the underexposed image
  • the bright frame exposure amount may be the exposure amount of the overexposed image.
  • the proportion of the highlighted area is large, the brightness of the image is high, and the exposure amount of the dark frame and/or the exposure amount of the bright frame can be appropriately reduced; when the proportion of the low-light area is large, the brightness of the image is explained. Low, the dark frame exposure amount and/or the bright frame exposure amount can be appropriately increased, so that the appropriate dark frame exposure amount and bright frame exposure amount can be determined according to actual conditions.
  • the dark frame exposure amount and the bright frame exposure amount may be determined from the proportions of the highlight and low-light areas using pre-stored relationships between those proportions and exposure amounts.
  • For example, before leaving the factory, the electronic device 100 obtains, through extensive data collection and experiments, preferred dark frame and bright frame exposure amounts corresponding to the proportions of the highlight and low-light areas, and stores these relationships in the memory of the electronic device 100, so that the dark frame and bright frame exposure amounts can be obtained quickly once the proportions are determined.
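As an illustration of the lookup-style determination described above, the following Python sketch derives dark- and bright-frame exposures from the proportions of highlight and low-light pixels. The thresholds (200/50) and the scale factors are invented for this example, not values taken from the patent.

```python
# Hypothetical sketch of step S13: the thresholds and scale factors below
# are illustrative assumptions, not values specified by the patent.

def region_ratios(luma, hi_thresh=200, lo_thresh=50):
    """Return (highlight_ratio, lowlight_ratio) for a 2-D grid of 0-255 luma values."""
    pixels = [v for row in luma for v in row]
    n = len(pixels)
    hi = sum(1 for v in pixels if v > hi_thresh) / n
    lo = sum(1 for v in pixels if v < lo_thresh) / n
    return hi, lo

def bracket_exposures(reference_ev, hi_ratio, lo_ratio):
    """Dark frame below and bright frame above the reference exposure.

    A large highlight share pulls both frames down; a large low-light
    share pushes them up, mirroring the behaviour described above.
    """
    dark = reference_ev * (0.5 - 0.2 * hi_ratio + 0.1 * lo_ratio)
    bright = reference_ev * (2.0 - 0.5 * hi_ratio + 0.5 * lo_ratio)
    return dark, bright
```

In a real pipeline the two linear mappings would be replaced by the factory-calibrated proportion-to-exposure tables mentioned above.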
  • the exposure control method includes the following steps:
  • S17: Determine the area of the cached main image other than the foreground portion as the background portion; and
  • S18: Adjust the dark frame exposure amount and the bright frame exposure amount according to the brightness information of the background portion.
  • exposure control device 10 includes a fifth determination module 17 and an adjustment module 18.
  • the fifth determining module 17 is configured to determine that the area of the cached main image other than the foreground portion is the background portion.
  • the adjustment module 18 is configured to adjust the dark frame exposure amount and the bright frame exposure amount according to the brightness information of the background portion.
  • step S17 can be implemented by the fifth determining module 17, and step S18 can be implemented by the adjusting module 18.
  • the dark frame exposure amount and the bright frame exposure amount can be adjusted according to the brightness information of the background portion.
  • since the background portion is less important than the foreground portion, i.e., the subject portion,
  • the dark frame exposure amount and the bright frame exposure amount can be appropriately adjusted when the darkest or brightest region of the cached main image lies in the background portion,
  • so that the image obtained by exposure of the imaging device 20 has better contrast.
  • For example, when the darkest region of the cached main image lies in the background portion, the bright frame exposure amount can be reduced so that the overexposed image has suitable contrast and the noise of its background portion is reduced.
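A minimal sketch of this background-aware adjustment, assuming flat pixel lists and a hypothetical 0.8 reduction factor (the patent does not specify one):

```python
def adjust_bright_frame(luma, foreground_mask, bright_ev, factor=0.8):
    """Reduce the bright-frame exposure when the darkest pixel of the
    cached main image falls in the background portion; otherwise leave
    the bright-frame exposure unchanged. The factor is an assumption."""
    darkest = min(range(len(luma)), key=lambda i: luma[i])  # flat index of the darkest pixel
    if not foreground_mask[darkest]:  # darkest pixel lies in the background
        return bright_ev * factor
    return bright_ev
```

A symmetric rule could raise the dark-frame exposure when the brightest region lies in the background; the patent leaves the exact adjustment open.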
  • first and second are used for descriptive purposes only, and are not to be construed as indicating or implying a relative importance or implicitly indicating the number of technical features indicated.
  • features defining “first” or “second” may include one or more of the described features either explicitly or implicitly.
  • the meaning of "a plurality" is two or more unless specifically defined otherwise.
  • the terms “installation”, “connected”, and “coupled” should be understood broadly: the connection may be fixed, detachable, or integral; it may be mechanical, electrical, or communicative; it may be direct, or indirect through an intermediate medium; and it may be an internal communication between two elements or an interaction relationship between two elements.
  • a "computer-readable medium” can be any apparatus that can contain, store, communicate, propagate, or transport a program for use in an instruction execution system, apparatus, or device, or in conjunction with the instruction execution system, apparatus, or device.
  • more specific examples of computer-readable media include: an electrical connection (electronic device) having one or more wires, a portable computer diskette (magnetic device), random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), a fiber-optic device, and portable compact disc read-only memory (CDROM).
  • the computer-readable medium may even be paper or another suitable medium on which the program can be printed, since the program can be obtained electronically, for example by optically scanning the paper or other medium and then editing, interpreting, or otherwise suitably processing it when necessary, and then stored in computer memory.
  • portions of the embodiments of the invention may be implemented in hardware, software, firmware or a combination thereof.
  • multiple steps or methods may be implemented in software or firmware stored in a memory and executed by a suitable instruction execution system.
  • for example, if implemented in hardware, as in another embodiment, the steps may be implemented by any one of the following techniques known in the art, or a combination thereof: discrete logic circuits having logic gates for implementing logic functions on data signals, application-specific integrated circuits with suitable combinational logic gates, programmable gate arrays (PGAs), field-programmable gate arrays (FPGAs), and so on.
  • each functional unit in each embodiment of the present invention may be integrated into one processing module, or each unit may exist physically separately, or two or more units may be integrated into one module.
  • the above integrated modules can be implemented in the form of hardware or in the form of software functional modules.
  • the integrated modules, if implemented in the form of software functional modules and sold or used as stand-alone products, may also be stored in a computer readable storage medium.
  • the above mentioned storage medium may be a read only memory, a magnetic disk or an optical disk or the like.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Electromagnetism (AREA)
  • Computing Systems (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Studio Devices (AREA)

Abstract

The present invention discloses a depth-based exposure control method. The exposure control method comprises: (S11) processing scene data to obtain a foreground portion of a cached main image; (S12) determining a reference exposure amount according to brightness information of the foreground portion; (S13) determining a dark-frame exposure amount and a bright-frame exposure amount according to the reference exposure amount; and (S14) controlling an imaging device (20) to expose according to the reference exposure amount, the dark-frame exposure amount, and the bright-frame exposure amount. The present invention also discloses an exposure control device (10) and an electronic device (100).

Description

Exposure control method, exposure control device, and electronic device
Priority Information
This application claims priority to and the benefit of Chinese Patent Application No. 201710137930.5, filed with the State Intellectual Property Office of China on March 9, 2017, the entire content of which is incorporated herein by reference.
Technical Field
The present invention relates to imaging technology, and more particularly to an exposure control method, an exposure control device, and an electronic device.
Background
In existing ways of obtaining a wide-dynamic-range image, the exposure times of the underexposed image and the overexposed image are generally fixed. If these exposure times are incorrect, the quality of the wide-dynamic-range image easily deteriorates.
Summary
The present invention aims to solve at least one of the technical problems existing in the prior art. To this end, embodiments of the present invention provide an exposure control method, an exposure control device, and an electronic device.
A depth-based exposure control method is used to control an imaging device to collect scene data, the scene data including a cached main image. The exposure control method includes the following steps:
processing the scene data to obtain a foreground portion of the cached main image;
determining a reference exposure amount according to brightness information of the foreground portion;
determining a dark-frame exposure amount and a bright-frame exposure amount according to the reference exposure amount, the dark-frame exposure amount being smaller than the reference exposure amount and the bright-frame exposure amount being larger than the reference exposure amount; and
controlling the imaging device to expose according to the reference exposure amount, the dark-frame exposure amount, and the bright-frame exposure amount.
A depth-based exposure control device is used to control an imaging device to collect scene data, the scene data including a cached main image. The exposure control device includes a processing module, a first determining module, a second determining module, and a control module.
The processing module is configured to process the scene data to obtain a foreground portion of the cached main image.
The first determining module is configured to determine a reference exposure amount according to brightness information of the foreground portion.
The second determining module is configured to determine a dark-frame exposure amount and a bright-frame exposure amount according to the reference exposure amount, the dark-frame exposure amount being smaller than the reference exposure amount and the bright-frame exposure amount being larger than the reference exposure amount.
The control module is configured to control the imaging device to expose according to the reference exposure amount, the dark-frame exposure amount, and the bright-frame exposure amount.
An electronic device includes an imaging device and the exposure control device.
The exposure control method, exposure control device, and electronic device of embodiments of the present invention use depth information to determine the foreground portion as the subject of the image, determine a reference exposure amount according to brightness information of the subject portion, determine a dark-frame exposure amount and a bright-frame exposure amount according to the reference exposure amount, and control the imaging device to expose according to these three exposure amounts to obtain multiple images, so that a wide-dynamic-range image with a clear subject and a reasonable dynamic range can be obtained through image processing.
Additional aspects and advantages of the present invention will be set forth in part in the following description, will become apparent in part from the following description, or will be learned through practice of the present invention.
Brief Description of the Drawings
The above and/or additional aspects and advantages of the present invention will become apparent and readily understood from the following description of embodiments in conjunction with the accompanying drawings, in which:
FIG. 1 is a schematic flowchart of an exposure control method according to an embodiment of the present invention.
FIG. 2 is a schematic plan view of an electronic device according to an embodiment of the present invention.
FIG. 3 is a schematic flowchart of an exposure control method according to some embodiments of the present invention.
FIG. 4 is a schematic diagram of the functional modules of a processing module of an exposure control device according to some embodiments of the present invention.
FIG. 5 is a schematic flowchart of an exposure control method according to some embodiments of the present invention.
FIG. 6 is a schematic diagram of the functional modules of a processing unit according to some embodiments of the present invention.
FIG. 7 is a schematic flowchart of an exposure control method according to some embodiments of the present invention.
FIG. 8 is another schematic diagram of the functional modules of a processing unit according to some embodiments of the present invention.
FIG. 9 is a schematic flowchart of an exposure control method according to some embodiments of the present invention.
FIG. 10 is a schematic diagram of the functional modules of a first obtaining unit according to some embodiments of the present invention.
FIG. 11 is a schematic flowchart of an exposure control method according to some embodiments of the present invention.
FIG. 12 is a schematic flowchart of an exposure control method according to some embodiments of the present invention.
FIG. 13 is a schematic diagram of the functional modules of an exposure control device according to some embodiments of the present invention.
FIG. 14 is a schematic flowchart of an exposure control method according to some embodiments of the present invention.
FIG. 15 is a schematic diagram of the functional modules of a second determining module according to some embodiments of the present invention.
FIG. 16 is a schematic flowchart of an exposure control method according to some embodiments of the present invention.
FIG. 17 is another schematic diagram of the functional modules of an exposure control device according to some embodiments of the present invention.
Reference numerals of main elements:
electronic device 100, exposure control device 10, processing module 11, processing unit 112, first processing subunit 1122, second processing subunit 1124, third processing subunit 1126, fourth processing subunit 1128, first obtaining unit 114, fifth processing subunit 1142, finding subunit 1144, first determining module 12, second determining module 13, second obtaining unit 132, third obtaining unit 134, determining unit 136, control module 14, third determining module 15, fourth determining module 16, fifth determining module 17, adjusting module 18, imaging device 20.
Detailed Description
Embodiments of the present invention are described in detail below, and examples of the embodiments are illustrated in the accompanying drawings, in which the same or similar reference numerals throughout denote the same or similar elements or elements having the same or similar functions. The embodiments described below with reference to the drawings are exemplary, are intended only to explain the present invention, and are not to be construed as limiting the present invention.
Referring to FIGS. 1 and 2 together, the depth-based exposure control method of an embodiment of the present invention may be used to control an imaging device 20 to collect scene data. The scene data includes a cached main image. The exposure control method includes the following steps:
S11: processing the scene data to obtain a foreground portion of the cached main image;
S12: determining a reference exposure amount according to brightness information of the foreground portion;
S13: determining a dark-frame exposure amount and a bright-frame exposure amount according to the reference exposure amount, the dark-frame exposure amount being smaller than the reference exposure amount and the bright-frame exposure amount being larger than the reference exposure amount; and
S14: controlling the imaging device 20 to expose according to the reference exposure amount, the dark-frame exposure amount, and the bright-frame exposure amount.
Referring again to FIG. 2, the depth-based exposure control device 10 of an embodiment of the present invention may be used to control an imaging device 20 to collect scene data. The scene data includes a cached main image. The exposure control device 10 includes a processing module 11, a first determining module 12, a second determining module 13, and a control module 14. The processing module 11 is configured to process the scene data to obtain a foreground portion of the cached main image. The first determining module 12 is configured to determine a reference exposure amount according to brightness information of the foreground portion. The second determining module 13 is configured to determine a dark-frame exposure amount and a bright-frame exposure amount according to the reference exposure amount, the dark-frame exposure amount being smaller than the reference exposure amount and the bright-frame exposure amount being larger than the reference exposure amount. The control module 14 is configured to control the imaging device 20 to expose according to the reference exposure amount, the dark-frame exposure amount, and the bright-frame exposure amount.
That is, the exposure control method of embodiments of the present invention may be implemented by the exposure control device 10 of embodiments of the present invention, in which step S11 may be implemented by the processing module 11, step S12 by the first determining module 12, step S13 by the second determining module 13, and step S14 by the control module 14.
In some embodiments, the exposure control device 10 of embodiments of the present invention may be applied to the electronic device 100 of embodiments of the present invention; in other words, the electronic device 100 may include the exposure control device 10. In addition, the electronic device 100 further includes an imaging device 20, and the imaging device 20 is electrically connected to the exposure control device 10.
The exposure control method, exposure control device 10, and electronic device 100 of embodiments of the present invention use depth information to determine the foreground portion as the subject of the image, determine a reference exposure amount according to brightness information of the subject portion, determine a dark-frame exposure amount and a bright-frame exposure amount according to the reference exposure amount, and control the imaging device 20 to expose according to these three exposure amounts to obtain multiple images, so that a wide-dynamic-range image with a clear subject and a reasonable dynamic range can be obtained through image processing.
It can be understood that the reference exposure amount may be the exposure amount of a normally exposed image. Obtaining the reference exposure amount according to brightness information of the foreground portion can improve the clarity of the foreground, i.e., the subject, thereby making the normally exposed image more pleasing to view.
In some embodiments, the electronic device 100 includes a mobile phone or a tablet computer. In the embodiment of the present invention, the electronic device 100 is a mobile phone.
In some embodiments, the imaging device 20 includes a front camera and/or a rear camera, which is not limited herein. In the embodiment of the present invention, the imaging device 20 is a front camera.
Referring to FIG. 3, in some embodiments, step S11 includes the following steps:
S112: processing the scene data to obtain depth information of the cached main image; and
S114: obtaining the foreground portion of the cached main image according to the depth information.
Referring to FIG. 4, in some embodiments, the processing module 11 includes a processing unit 112 and a first obtaining unit 114. The processing unit 112 is configured to process the scene data to obtain depth information of the cached main image. The first obtaining unit 114 is configured to obtain the foreground portion of the cached main image according to the depth information.
That is, step S112 may be implemented by the processing unit 112, and step S114 may be implemented by the first obtaining unit 114.
In this way, the foreground portion of the cached main image can be obtained according to the depth information.
Referring to FIG. 5, in some embodiments, the scene data includes a depth image corresponding to the cached main image, and step S112 includes the following steps:
S1122: processing the depth image to obtain depth data of the cached main image; and
S1124: processing the depth data to obtain the depth information.
Referring to FIG. 6, in some embodiments, the scene data includes a depth image corresponding to the cached main image, and the processing unit 112 includes a first processing subunit 1122 and a second processing subunit 1124. The first processing subunit 1122 is configured to process the depth image to obtain depth data of the cached main image. The second processing subunit 1124 is configured to process the depth data to obtain the depth information.
That is, step S1122 may be implemented by the first processing subunit 1122, and step S1124 may be implemented by the second processing subunit 1124.
In this way, the depth information of the cached main image can be obtained quickly using the depth image.
It can be understood that the cached main image is an RGB color image, and the depth image contains a large amount of depth data, that is, depth information of each person or object in the scene, the depth information including the magnitude and/or range of the depth. Since the color information of the cached main image is in one-to-one correspondence with the depth information of the depth image, the depth information of the cached main image can be obtained.
In some embodiments, the depth image corresponding to the cached main image can be acquired in two ways: by structured-light depth ranging, or by a time-of-flight (TOF) depth camera.
When structured-light depth ranging is used to acquire the depth image, the imaging device 20 includes a camera and a projector.
It can be understood that structured-light depth ranging uses the projector to project a light structure with a certain pattern onto the object surface, forming on the surface a three-dimensional image of light stripes modulated by the shape of the measured object. The camera detects the three-dimensional stripe image to obtain a two-dimensional distorted stripe image. The degree of distortion of the stripes depends on the relative position between the projector and the camera and on the surface profile or height of the object. The displacement shown along the stripes is proportional to the height of the object surface, kinks indicate changes of plane, and discontinuities show physical gaps in the surface. When the relative position between the projector and the camera is fixed, the three-dimensional contour of the object surface can be reconstructed from the coordinates of the distorted two-dimensional stripe image, so that the depth information can be obtained. Structured-light depth ranging has relatively high resolution and measurement accuracy.
When a TOF depth camera is used to acquire the depth image, the imaging device 20 includes a TOF depth camera.
It can be understood that a TOF depth camera uses a sensor to record the phase change of modulated infrared light that is emitted from a light-emitting unit, travels to the object, and is reflected back. Within one wavelength, according to the speed of light, the depth distance of the entire scene can be obtained in real time. A TOF depth camera is not affected by the gray level or surface features of the photographed object when computing depth information, can compute depth information quickly, and offers high real-time performance.
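The phase measurement just described maps to depth through the standard continuous-wave TOF relation d = c·Δφ/(4π·f_mod). The sketch below applies that textbook formula; the 20 MHz modulation frequency used in the example is an illustrative assumption, not a value from the patent.

```python
import math

# Standard continuous-wave TOF relation: the measured phase shift of the
# modulated infrared light is proportional to the round-trip distance.
def tof_depth(phase_shift_rad, modulation_hz, c=299_792_458.0):
    """Depth in metres from the phase shift: d = c * dphi / (4 * pi * f_mod)."""
    return c * phase_shift_rad / (4 * math.pi * modulation_hz)
```

A phase shift of π at 20 MHz modulation corresponds to roughly 3.75 m, half the unambiguous range c/(2·f_mod).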
Referring to FIG. 7, in some embodiments, the scene data includes a cached sub-image corresponding to the cached main image, and step S112 includes the following steps:
S1126: processing the cached main image and the cached sub-image to obtain depth data of the cached main image; and
S1128: processing the depth data to obtain the depth information.
Referring to FIG. 8, in some embodiments, the scene data includes a cached sub-image corresponding to the cached main image, and the processing unit 112 includes a third processing subunit 1126 and a fourth processing subunit 1128. The third processing subunit 1126 is configured to process the cached main image and the cached sub-image to obtain depth data of the cached main image. The fourth processing subunit 1128 is configured to process the depth data to obtain the depth information.
That is, step S1126 may be implemented by the third processing subunit 1126, and step S1128 may be implemented by the fourth processing subunit 1128.
In this way, the depth information of the cached main image can be obtained by processing the cached main image and the cached sub-image.
In some embodiments, the imaging device 20 includes a main camera and a sub-camera.
It can be understood that the depth information may be acquired by a binocular stereo vision ranging method, in which case the scene data includes the cached main image and the cached sub-image, where the cached main image is captured by the main camera and the cached sub-image is captured by the sub-camera. Binocular stereo vision ranging uses two identical cameras to image the same object from different positions to obtain a stereo image pair of the object, matches the corresponding image points of the stereo image pair by an algorithm to compute the disparity, and finally recovers the depth information by a triangulation-based method. In this way, the depth information of the cached main image can be obtained by matching the stereo image pair formed by the cached main image and the cached sub-image.
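The final triangulation step of the stereo method reduces, for a rectified camera pair, to the textbook relation Z = f·B/d (focal length × baseline / disparity). The values in the example below are invented for illustration.

```python
def stereo_depth(focal_px, baseline_m, disparity_px):
    """Depth of a matched point from a rectified stereo pair: Z = f * B / d."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px
```

With a 1000-pixel focal length and a 5 cm baseline, a 10-pixel disparity places the point about 5 m away; matching every pixel of the cached main image against the cached sub-image in this way yields the full depth map.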
Referring to FIG. 9, in some embodiments, step S114 includes the following steps:
S1142: obtaining the foremost point of the cached main image according to the depth information; and
S1144: finding a region adjacent to the foremost point and continuously varying in depth as the foreground portion.
Referring to FIG. 10, in some embodiments, the first obtaining unit 114 includes a fifth processing subunit 1142 and a finding subunit 1144. The fifth processing subunit 1142 is configured to obtain the foremost point of the cached main image according to the depth information. The finding subunit 1144 is configured to find a region adjacent to the foremost point and continuously varying in depth as the foreground portion.
That is, step S1142 may be implemented by the fifth processing subunit 1142, and step S1144 may be implemented by the finding subunit 1144.
In this way, a physically connected foreground portion of the cached main image can be obtained. In a real scene, the foreground is usually connected together. Taking the physically connected foreground as the subject, the relationships within the foreground portion can be grasped intuitively.
Specifically, the foremost point of the cached main image is first obtained according to the depth information; the foremost point is equivalent to the beginning of the foreground portion. Diffusing outward from the foremost point, the regions that are adjacent to the foremost point and continuously varying in depth are acquired, and these regions are merged with the foremost point into the foreground region.
It should be noted that the foremost point refers to the pixel corresponding to the object with the smallest depth, that is, the object with the smallest object distance or closest to the imaging device 20. Adjacency means that two pixels are connected together. Continuously varying depth means that the depth difference between two adjacent pixels is smaller than a predetermined difference; in other words, the depth of two adjacent pixels whose depth difference is smaller than the predetermined difference varies continuously.
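The search just described — start at the foremost (smallest-depth) pixel and grow across adjacent pixels whose depth varies continuously — can be sketched as a breadth-first flood fill. The max_step value below stands in for the unspecified "predetermined difference" and is an assumption.

```python
from collections import deque

def foreground_mask(depth, max_step=0.1):
    """Grow the foreground from the foremost pixel, accepting each
    4-neighbour whose depth differs from an already-accepted pixel by
    less than max_step (a hypothetical 'predetermined difference')."""
    h, w = len(depth), len(depth[0])
    start = min(((r, c) for r in range(h) for c in range(w)),
                key=lambda rc: depth[rc[0]][rc[1]])  # foremost point
    mask = [[False] * w for _ in range(h)]
    mask[start[0]][start[1]] = True
    queue = deque([start])
    while queue:
        r, c = queue.popleft()
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if 0 <= nr < h and 0 <= nc < w and not mask[nr][nc] \
                    and abs(depth[nr][nc] - depth[r][c]) < max_step:
                mask[nr][nc] = True
                queue.append((nr, nc))
    return mask
```

Pixels belonging to a distant background violate the continuity test at the boundary and are left out of the mask.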
Referring to FIG. 11, in some embodiments, step S114 may include the following steps:
S1146: obtaining the foremost point of the cached main image according to the depth information; and
S1148: finding a region whose depth difference from the foremost point is smaller than a predetermined threshold as the foreground portion.
In this way, a logically connected foreground portion of the cached main image can be obtained. In a real scene, the foreground portion may not be physically connected but may follow some logical relationship, for example, a scene in which an eagle dives down to catch a chick: the eagle and the chick may not be physically connected, but logically they can be judged to be related.
Specifically, the foremost point of the cached main image is first obtained according to the depth information; the foremost point is equivalent to the beginning of the foreground portion. Diffusing outward from the foremost point, the regions whose depth difference from the foremost point is smaller than the predetermined threshold are acquired, and these regions are merged with the foremost point into the foreground region.
In some embodiments, the predetermined threshold may be a value set by the user. In this way, the user can determine the extent of the foreground portion according to his or her own needs, thereby obtaining an ideal composition suggestion and achieving an ideal composition.
In some embodiments, the predetermined threshold may be a value determined by the exposure control device 10, which is not limited herein. The predetermined threshold determined by the exposure control device 10 may be a fixed value stored internally, or may be a value computed according to different situations, for example, according to the depth of the foremost point.
In some embodiments, step S114 may include the following steps:
finding a region whose depth lies within a predetermined interval as the foreground portion; and
determining the region of the cached main image other than the foreground portion as the background portion.
In this way, a foreground portion and a background portion whose depths lie within a suitable range can be obtained.
It can be understood that in some shooting situations the foreground portion is not the foremost part but a part slightly behind the foremost part. For example, when a person sits behind a computer, the computer is closer to the camera, but the person is the subject; therefore, taking a region whose depth lies within a predetermined interval as the foreground portion can effectively avoid selecting the wrong subject.
Referring to FIG. 12, in some embodiments, the reference exposure amount includes an exposure time and a sensitivity of the imaging device 20, and the exposure control method includes:
S15: determining the exposure time according to motion information of the foreground portion; and
S16: determining the sensitivity according to the reference exposure amount and the exposure time.
Referring to FIG. 13, in some embodiments, the reference exposure amount includes an exposure time and a sensitivity of the imaging device 20, and the exposure control device 10 includes a third determining module 15 and a fourth determining module 16. The third determining module 15 is configured to determine the exposure time according to motion information of the foreground portion. The fourth determining module 16 is configured to determine the sensitivity according to the reference exposure amount and the exposure time.
That is, step S15 may be implemented by the third determining module 15, and step S16 may be implemented by the fourth determining module 16.
In this way, the exposure time and the sensitivity of the imaging device 20 can be determined according to the motion state of the foreground portion.
It can be understood that when the foreground portion is moving, the exposure time can be reduced in order to keep the foreground portion sharp and avoid problems such as ghosting, and at the same time the sensitivity can be increased to keep the foreground bright so that the reference exposure amount remains substantially unchanged. When the foreground portion is stationary, the sensitivity can be appropriately reduced and the exposure time increased to avoid the noise caused by an excessively high sensitivity.
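The trade-off described in this paragraph — a shorter exposure time for a moving subject, with the sensitivity raised to hold the reference exposure roughly constant — can be sketched as follows. The shutter values and the simple exposure = time × sensitivity model are assumptions made for the example.

```python
def split_reference_exposure(reference_exposure, foreground_moving,
                             fast_shutter=1 / 120, slow_shutter=1 / 30):
    """Choose the exposure time from the foreground's motion state (S15),
    then solve the sensitivity so that time * sensitivity reproduces the
    reference exposure (S16). Shutter values are illustrative only."""
    exposure_time = fast_shutter if foreground_moving else slow_shutter
    sensitivity = reference_exposure / exposure_time
    return exposure_time, sensitivity
```

Whichever branch is taken, the product of the two returned values equals the reference exposure, which is what keeps the overall image brightness stable.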
Referring to FIG. 14, in some embodiments, step S13 includes the following steps:
S132: obtaining, as a highlight area, a region of the cached main image whose brightness value is greater than a first brightness threshold;
S134: obtaining, as a low-light area, a region of the cached main image whose brightness value is smaller than a second brightness threshold, the first brightness threshold being greater than the second brightness threshold; and
S136: determining the dark-frame exposure amount and the bright-frame exposure amount according to the proportions of the highlight area and the low-light area and the reference exposure amount.
Referring to FIG. 15, in some embodiments, the second determining module 13 includes a second obtaining unit 132, a third obtaining unit 134, and a determining unit 136. The second obtaining unit 132 is configured to obtain, as a highlight area, a region of the cached main image whose brightness value is greater than the first brightness threshold. The third obtaining unit 134 is configured to obtain, as a low-light area, a region of the cached main image whose brightness value is smaller than the second brightness threshold, the first brightness threshold being greater than the second brightness threshold. The determining unit 136 is configured to determine the dark-frame exposure amount and the bright-frame exposure amount according to the proportions of the highlight area and the low-light area and the reference exposure amount.
That is, step S132 may be implemented by the second obtaining unit 132, step S134 by the third obtaining unit 134, and step S136 by the determining unit 136.
In this way, the dark-frame exposure amount and the bright-frame exposure amount can be determined according to the proportions of the highlight and low-light areas of the cached main image and the reference exposure amount.
It can be understood that the dark-frame exposure amount may be the exposure amount of the underexposed image, and the bright-frame exposure amount may be the exposure amount of the overexposed image. When the proportion of the highlight area is large, the image is relatively bright, and the dark-frame exposure amount and/or the bright-frame exposure amount can be appropriately reduced; when the proportion of the low-light area is large, the image is relatively dark, and the dark-frame exposure amount and/or the bright-frame exposure amount can be appropriately increased, so that suitable dark-frame and bright-frame exposure amounts can be determined according to the actual situation.
In some embodiments, the dark-frame and bright-frame exposure amounts may be determined from the highlight and low-light proportions using pre-stored relationships between those proportions and exposure amounts. For example, before leaving the factory, the electronic device 100 obtains, through extensive data collection and experiments, preferred dark-frame and bright-frame exposure amounts corresponding to the proportions of the highlight and low-light areas, and stores these relationships in the memory of the electronic device 100, so that the dark-frame and bright-frame exposure amounts can be obtained quickly once the proportions are determined.
Referring to FIG. 16, in some embodiments, the exposure control method includes the following steps:
S17: determining the region of the cached main image other than the foreground portion as the background portion; and
S18: adjusting the dark-frame exposure amount and the bright-frame exposure amount according to brightness information of the background portion.
In some embodiments, the exposure control device 10 includes a fifth determining module 17 and an adjusting module 18. The fifth determining module 17 is configured to determine the region of the cached main image other than the foreground portion as the background portion. The adjusting module 18 is configured to adjust the dark-frame exposure amount and the bright-frame exposure amount according to brightness information of the background portion.
That is, step S17 may be implemented by the fifth determining module 17, and step S18 may be implemented by the adjusting module 18.
In this way, the dark-frame and bright-frame exposure amounts can be adjusted according to the brightness information of the background portion.
It can be understood that, since the background portion is less important than the foreground portion, i.e., the subject portion, the dark-frame and bright-frame exposure amounts can be appropriately adjusted when the darkest or brightest region of the cached main image lies in the background portion, so that the image obtained by exposure of the imaging device 20 has better contrast. For example, when the darkest region of the cached main image lies in the background portion, the bright-frame exposure amount can be reduced so that the overexposed image has suitable contrast and the noise of its background portion is reduced.
In the description of embodiments of the present invention, the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, features defined with "first" or "second" may explicitly or implicitly include one or more of the described features. In the description of embodiments of the present invention, "a plurality" means two or more, unless specifically defined otherwise.
In the description of embodiments of the present invention, it should be noted that, unless otherwise explicitly specified and limited, the terms "installed", "connected", and "coupled" should be understood broadly: the connection may be fixed, detachable, or integral; it may be mechanical, electrical, or communicative; it may be direct, or indirect through an intermediate medium; and it may be an internal communication between two elements or an interaction relationship between two elements. For those of ordinary skill in the art, the specific meanings of the above terms in embodiments of the present invention can be understood according to the specific situation.
In the description of this specification, descriptions referring to the terms "one embodiment", "some embodiments", "an illustrative embodiment", "an example", "a specific example", or "some examples" mean that a specific feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the present invention. In this specification, the illustrative expressions of the above terms do not necessarily refer to the same embodiment or example. Moreover, the specific features, structures, materials, or characteristics described may be combined in a suitable manner in any one or more embodiments or examples.
Any process or method description in a flowchart or otherwise described herein may be understood as representing a module, segment, or portion of code that includes one or more executable instructions for implementing specific logical functions or steps of the process, and the scope of preferred embodiments of the present invention includes additional implementations in which functions may be executed out of the order shown or discussed, including in a substantially simultaneous manner or in the reverse order according to the functions involved, which should be understood by those skilled in the art to which embodiments of the present invention belong.
The logic and/or steps represented in a flowchart or otherwise described herein may, for example, be considered a sequenced list of executable instructions for implementing logical functions, and may be embodied in any computer-readable medium for use by, or in connection with, an instruction execution system, apparatus, or device (such as a computer-based system, a system including a processing module, or another system that can fetch and execute instructions from an instruction execution system, apparatus, or device). For the purposes of this specification, a "computer-readable medium" may be any means that can contain, store, communicate, propagate, or transport a program for use by, or in connection with, an instruction execution system, apparatus, or device. More specific examples (a non-exhaustive list) of computer-readable media include: an electrical connection (electronic device) having one or more wires, a portable computer diskette (magnetic device), a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a fiber-optic device, and a portable compact disc read-only memory (CDROM). In addition, the computer-readable medium may even be paper or another suitable medium on which the program can be printed, since the program can be obtained electronically, for example by optically scanning the paper or other medium and then editing, interpreting, or otherwise suitably processing it when necessary, and then stored in a computer memory.
It should be understood that portions of embodiments of the present invention may be implemented in hardware, software, firmware, or a combination thereof. In the above embodiments, multiple steps or methods may be implemented with software or firmware stored in a memory and executed by a suitable instruction execution system. For example, if implemented in hardware, as in another embodiment, they may be implemented by any one of the following techniques known in the art, or a combination thereof: a discrete logic circuit having logic gates for implementing logic functions on data signals, an application-specific integrated circuit with suitable combinational logic gates, a programmable gate array (PGA), a field-programmable gate array (FPGA), and so on.
Those of ordinary skill in the art can understand that all or part of the steps carried by the methods of the above embodiments can be completed by instructing the relevant hardware through a program; the program may be stored in a computer-readable storage medium, and the program, when executed, includes one of the steps of the method embodiments or a combination thereof.
In addition, the functional units in the embodiments of the present invention may be integrated into one processing module, or each unit may exist physically separately, or two or more units may be integrated into one module. The above integrated module may be implemented in the form of hardware or in the form of a software functional module. The integrated module, if implemented in the form of a software functional module and sold or used as an independent product, may also be stored in a computer-readable storage medium.
The storage medium mentioned above may be a read-only memory, a magnetic disk, an optical disc, or the like.
Although embodiments of the present invention have been shown and described above, it can be understood that the above embodiments are exemplary and are not to be construed as limiting the present invention; within the scope of the present invention, those of ordinary skill in the art may make changes, modifications, substitutions, and variations to the above embodiments.

Claims (21)

  1. A depth-based exposure control method for controlling an imaging device to collect scene data, the scene data comprising a cached main image, wherein the exposure control method comprises the following steps:
    processing the scene data to obtain a foreground portion of the cached main image;
    determining a reference exposure amount according to brightness information of the foreground portion;
    determining a dark-frame exposure amount and a bright-frame exposure amount according to the reference exposure amount, the dark-frame exposure amount being smaller than the reference exposure amount and the bright-frame exposure amount being larger than the reference exposure amount; and
    controlling the imaging device to expose according to the reference exposure amount, the dark-frame exposure amount, and the bright-frame exposure amount.
  2. The exposure control method of claim 1, wherein the step of processing the scene data to obtain the foreground portion of the cached main image comprises the following steps:
    processing the scene data to obtain depth information of the cached main image; and
    obtaining the foreground portion of the cached main image according to the depth information.
  3. The exposure control method of claim 2, wherein the scene data comprises a depth image corresponding to the cached main image, and the step of processing the scene data to obtain the depth information of the cached main image comprises the following steps:
    processing the depth image to obtain depth data of the cached main image; and
    processing the depth data to obtain the depth information.
  4. The exposure control method of claim 2, wherein the scene data comprises a cached sub-image corresponding to the cached main image, and the step of processing the scene data to obtain the depth information of the cached main image comprises the following steps:
    processing the cached main image and the cached sub-image to obtain depth data of the cached main image; and
    processing the depth data to obtain the depth information.
  5. The exposure control method of claim 2, wherein the step of obtaining the foreground portion of the cached main image according to the depth information comprises the following steps:
    obtaining a foremost point of the cached main image according to the depth information; and
    finding a region adjacent to the foremost point and continuously varying in depth as the foreground portion.
  6. The exposure control method of claim 1, wherein the reference exposure amount comprises an exposure time and a sensitivity of the imaging device, and the exposure control method comprises:
    determining the exposure time according to motion information of the foreground portion; and
    determining the sensitivity according to the reference exposure amount and the exposure time.
  7. The exposure control method of claim 1, wherein the step of determining the dark-frame exposure amount and the bright-frame exposure amount according to the reference exposure amount comprises the following steps:
    obtaining, as a highlight area, a region of the cached main image whose brightness value is greater than a first brightness threshold;
    obtaining, as a low-light area, a region of the cached main image whose brightness value is smaller than a second brightness threshold, the first brightness threshold being greater than the second brightness threshold; and
    determining the dark-frame exposure amount and the bright-frame exposure amount according to the proportions of the highlight area and the low-light area and the reference exposure amount.
  8. The exposure control method of claim 1, wherein the exposure control method comprises the following steps:
    determining the region of the cached main image other than the foreground portion as a background portion; and
    adjusting the dark-frame exposure amount and the bright-frame exposure amount according to brightness information of the background portion.
  9. A depth-based exposure control device for controlling an imaging device to collect scene data, the scene data comprising a cached main image, wherein the exposure control device comprises:
    a processing module configured to process the scene data to obtain a foreground portion of the cached main image;
    a first determining module configured to determine a reference exposure amount according to brightness information of the foreground portion;
    a second determining module configured to determine a dark-frame exposure amount and a bright-frame exposure amount according to the reference exposure amount, the dark-frame exposure amount being smaller than the reference exposure amount and the bright-frame exposure amount being larger than the reference exposure amount; and
    a control module configured to control the imaging device to expose according to the reference exposure amount, the dark-frame exposure amount, and the bright-frame exposure amount.
  10. The exposure control device of claim 9, wherein the processing module comprises:
    a processing unit configured to process the scene data to obtain depth information of the cached main image; and
    a first obtaining unit configured to obtain the foreground portion of the cached main image according to the depth information.
  11. The exposure control device of claim 10, wherein the scene data comprises a depth image corresponding to the cached main image, and the processing unit comprises:
    a first processing subunit configured to process the depth image to obtain depth data of the cached main image; and
    a second processing subunit configured to process the depth data to obtain the depth information.
  12. The exposure control device of claim 10, wherein the scene data comprises a cached sub-image corresponding to the cached main image, and the processing unit comprises:
    a third processing subunit configured to process the cached main image and the cached sub-image to obtain depth data of the cached main image; and
    a fourth processing subunit configured to process the depth data to obtain the depth information.
  13. The exposure control device of claim 10, wherein the first obtaining unit comprises:
    a fifth processing subunit configured to obtain a foremost point of the cached main image according to the depth information; and
    a finding subunit configured to find a region adjacent to the foremost point and continuously varying in depth as the foreground portion.
  14. The exposure control device of claim 9, wherein the reference exposure amount comprises an exposure time and a sensitivity of the imaging device, and the exposure control device comprises:
    a third determining module configured to determine the exposure time according to motion information of the foreground portion; and
    a fourth determining module configured to determine the sensitivity according to the reference exposure amount and the exposure time.
  15. The exposure control device of claim 9, wherein the second determining module comprises:
    a second obtaining unit configured to obtain, as a highlight area, a region of the cached main image whose brightness value is greater than a first brightness threshold;
    a third obtaining unit configured to obtain, as a low-light area, a region of the cached main image whose brightness value is smaller than a second brightness threshold, the first brightness threshold being greater than the second brightness threshold; and
    a determining unit configured to determine the dark-frame exposure amount and the bright-frame exposure amount according to the proportions of the highlight area and the low-light area and the reference exposure amount.
  16. The exposure control device of claim 9, wherein the exposure control device comprises:
    a fifth determining module configured to determine the region of the cached main image other than the foreground portion as a background portion; and
    an adjusting module configured to adjust the dark-frame exposure amount and the bright-frame exposure amount according to brightness information of the background portion.
  17. An electronic device, comprising:
    an imaging device; and
    the exposure control device of any one of claims 9 to 16.
  18. The electronic device of claim 17, wherein the electronic device comprises a mobile phone or a tablet computer.
  19. The electronic device of claim 17, wherein the imaging device comprises a main camera and a sub-camera.
  20. The electronic device of claim 17, wherein the imaging device comprises a camera and a projector.
  21. The electronic device of claim 17, wherein the imaging device comprises a time-of-flight (TOF) depth camera.
PCT/CN2018/075491 2017-03-09 2018-02-06 Exposure control method, exposure control device, and electronic device WO2018161758A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP18764519.7A EP3579546B1 (en) 2017-03-09 2018-02-06 Exposure control method, exposure control device and electronic device
US16/561,806 US11206360B2 (en) 2017-03-09 2019-09-05 Exposure control method for obtaining HDR image, related exposure control device and electronic device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201710137930.5 2017-03-09
CN201710137930.5A CN106851123B (zh) 2017-03-09 2017-03-09 Exposure control method, exposure control device and electronic device

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US16/561,806 Continuation US11206360B2 (en) 2017-03-09 2019-09-05 Exposure control method for obtaining HDR image, related exposure control device and electronic device

Publications (1)

Publication Number Publication Date
WO2018161758A1 true WO2018161758A1 (zh) 2018-09-13

Family

ID=59143524

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/075491 WO2018161758A1 (zh) 2017-03-09 2018-02-06 曝光控制方法、曝光控制装置及电子装置

Country Status (4)

Country Link
US (1) US11206360B2 (zh)
EP (1) EP3579546B1 (zh)
CN (1) CN106851123B (zh)
WO (1) WO2018161758A1 (zh)

Families Citing this family (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106851123B (zh) 2017-03-09 2020-12-22 Oppo广东移动通信有限公司 Exposure control method, exposure control device and electronic device
EP3588363A4 (en) * 2017-03-09 2020-05-27 Guangdong Oppo Mobile Telecommunications Corp., Ltd. DEPTH-BASED CONTROL METHOD, DEPTH-BASED CONTROL DEVICE AND ELECTRONIC DEVICE
CN107241557A (zh) * 2017-06-16 2017-10-10 广东欧珀移动通信有限公司 Image exposure method and device, imaging apparatus, and storage medium
CN107197169B (zh) * 2017-06-22 2019-12-06 维沃移动通信有限公司 High-dynamic-range image shooting method and mobile terminal
CN107295269A (zh) * 2017-07-31 2017-10-24 努比亚技术有限公司 Light metering method, terminal, and computer storage medium
CN107846556B (zh) * 2017-11-30 2020-01-10 Oppo广东移动通信有限公司 Imaging method and device, mobile terminal, and storage medium
CN107948519B (zh) 2017-11-30 2020-03-27 Oppo广东移动通信有限公司 Image processing method, device, and apparatus
CN109389582B (zh) * 2018-09-11 2020-06-26 广东智媒云图科技股份有限公司 Method and device for recognizing the brightness of an image subject
US10609299B1 (en) * 2018-09-17 2020-03-31 Black Sesame International Holding Limited Method of measuring light using dual cameras
CN109218628B (zh) * 2018-09-20 2020-12-08 Oppo广东移动通信有限公司 Image processing method and device, electronic apparatus, and storage medium
WO2020107295A1 (zh) * 2018-11-28 2020-06-04 深圳市大疆创新科技有限公司 Photographing method and photographing system
CN109598237A (zh) * 2018-12-04 2019-04-09 青岛小鸟看看科技有限公司 Fatigue state detection method and device
CN109729275A (zh) * 2019-03-14 2019-05-07 Oppo广东移动通信有限公司 Imaging method and device, terminal, and storage medium
CN110308458B (zh) * 2019-06-27 2021-03-23 Oppo广东移动通信有限公司 Adjustment method, adjustment device, terminal, and computer-readable storage medium
CN110456380B (zh) * 2019-07-31 2021-12-28 炬佑智能科技(苏州)有限公司 Time-of-flight sensing camera and depth detection method thereof
DE102020103575B4 (de) * 2020-02-12 2022-08-11 Basler Aktiengesellschaft Feature point detection device and method for detecting feature points in image data
CN112822404B (zh) * 2021-01-12 2022-08-09 Oppo广东移动通信有限公司 Image processing method and device, and storage medium
CN114866703A (zh) * 2021-02-03 2022-08-05 浙江舜宇智能光学技术有限公司 Active exposure method and device based on a TOF imaging system, and electronic apparatus
CN114979498B (zh) * 2021-02-20 2023-06-30 Oppo广东移动通信有限公司 Exposure processing method and device, electronic apparatus, and computer-readable storage medium

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101909150A (zh) * 2009-06-03 2010-12-08 索尼公司 Imaging apparatus and imaging control method
CN103248828A (zh) * 2012-02-13 2013-08-14 宏达国际电子股份有限公司 Exposure value adjustment device and exposure value adjustment method
CN104333710A (zh) * 2014-11-28 2015-02-04 广东欧珀移动通信有限公司 Camera exposure method, device, and apparatus
CN104917973A (zh) * 2014-03-11 2015-09-16 宏碁股份有限公司 Dynamic exposure adjustment method and electronic device thereof
CN105100637A (zh) * 2015-08-31 2015-11-25 联想(北京)有限公司 Image processing method and electronic apparatus
US20160191896A1 (en) * 2014-12-31 2016-06-30 Dell Products, Lp Exposure computation via depth-based computational photography
CN106851123A (zh) * 2017-03-09 2017-06-13 广东欧珀移动通信有限公司 Exposure control method, exposure control device and electronic device

Family Cites Families (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7379100B2 (en) * 2004-02-12 2008-05-27 Canesta, Inc. Method and system to increase dynamic range of time-of-flight (TOF) and/or imaging sensors
CN100432940C (zh) 2006-10-19 2008-11-12 华为技术有限公司 Method for allocating shared resource locks in a computer cluster system, computer, and cluster system
TW200820123A (en) * 2006-10-20 2008-05-01 Primax Electronics Ltd Method and system of generating high dynamic range image corresponding to specific scene
TWI374400B (en) * 2008-06-11 2012-10-11 Vatics Inc Method for auto-exposure control
CN101621629B (zh) * 2008-06-30 2011-09-14 睿致科技股份有限公司 Automatic exposure method
JP5349072B2 (ja) 2009-02-17 2013-11-20 パナソニック株式会社 Resource exclusive control method and resource exclusive control device
US8570396B2 (en) * 2009-04-23 2013-10-29 Csr Technology Inc. Multiple exposure high dynamic range image capture
JP5610762B2 (ja) * 2009-12-21 2014-10-22 キヤノン株式会社 Imaging apparatus and control method
JP5932474B2 (ja) * 2012-05-09 2016-06-08 キヤノン株式会社 Imaging apparatus and control method therefor
JP6120500B2 (ja) * 2012-07-20 2017-04-26 キヤノン株式会社 Imaging apparatus and control method therefor
CN103973690B (zh) 2014-05-09 2018-04-24 北京智谷睿拓技术服务有限公司 Resource access method and resource access device
CN103973691B (zh) 2014-05-09 2018-02-02 北京智谷睿拓技术服务有限公司 Resource access method and resource access device
US9544503B2 (en) * 2014-12-30 2017-01-10 Light Labs Inc. Exposure control methods and apparatus
JP7321956B2 (ja) * 2020-02-28 2023-08-07 株式会社日立エルジーデータストレージ Method for correcting measurement values of a distance-measuring device

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101909150A (zh) * 2009-06-03 2010-12-08 索尼公司 Imaging apparatus and imaging control method
CN103248828A (zh) * 2012-02-13 2013-08-14 宏达国际电子股份有限公司 Exposure value adjustment device and exposure value adjustment method
CN104917973A (zh) * 2014-03-11 2015-09-16 宏碁股份有限公司 Dynamic exposure adjustment method and electronic device thereof
CN104333710A (zh) * 2014-11-28 2015-02-04 广东欧珀移动通信有限公司 Camera exposure method, device, and apparatus
US20160191896A1 (en) * 2014-12-31 2016-06-30 Dell Products, Lp Exposure computation via depth-based computational photography
CN105100637A (zh) * 2015-08-31 2015-11-25 联想(北京)有限公司 Image processing method and electronic apparatus
CN106851123A (zh) * 2017-03-09 2017-06-13 广东欧珀移动通信有限公司 Exposure control method, exposure control device and electronic device

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3579546A4 *

Also Published As

Publication number Publication date
EP3579546A1 (en) 2019-12-11
EP3579546B1 (en) 2021-04-07
US11206360B2 (en) 2021-12-21
CN106851123B (zh) 2020-12-22
CN106851123A (zh) 2017-06-13
US20200007736A1 (en) 2020-01-02
EP3579546A4 (en) 2019-12-25

Similar Documents

Publication Publication Date Title
WO2018161758A1 (zh) Exposure control method, exposure control device and electronic device
US10404969B2 (en) Method and apparatus for multiple technology depth map acquisition and fusion
CN106851124B (zh) Depth-of-field-based image processing method, processing device, and electronic device
TWI713547B (zh) Method and apparatus for determining a depth map of an image
US10475237B2 (en) Image processing apparatus and control method thereof
WO2018228467A1 (zh) Image exposure method and device, imaging apparatus, and storage medium
CN107948538B (zh) Imaging method and device, mobile terminal, and storage medium
WO2019042216A1 (zh) Image blurring processing method and device, and shooting terminal
WO2019109805A1 (zh) Image processing method and device
WO2019011147A1 (zh) Method and device for processing the face region of a backlit scene
US20180309919A1 (en) Methods and apparatus for controlling exposure and synchronization of image sensors
CN110378946B (zh) Depth map processing method and device, and electronic apparatus
CN106851122A (zh) Method and device for calibrating automatic exposure parameters based on a dual-camera system
WO2019047985A1 (zh) Image processing method and device, electronic device, and computer-readable storage medium
TW200539055A (en) Method and apparatus for optimizing capture device settings through depth information
CN108053438B (zh) Depth-of-field acquisition method, device, and apparatus
CN106851103B (zh) Automatic focusing method and device based on a dual-camera system
JP6460829B2 (ja) Imaging device, electronic apparatus, and method for calculating light-quantity change characteristics
JP6412386B2 (ja) Image processing apparatus, control method therefor, program, and recording medium
WO2019011110A1 (zh) Method and device for processing the face region of a backlit scene
WO2016197494A1 (zh) Focusing area adjustment method and device
US20230033956A1 (en) Estimating depth based on iris size
JP2004133919A (ja) Pseudo-three-dimensional image generation device and method, and program and recording medium therefor
WO2018161322A1 (zh) Depth-based image processing method, processing device, and electronic device
US20190199933A1 (en) Determination of a contrast value for a digital image

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18764519

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2018764519

Country of ref document: EP

Effective date: 20190904