WO2020038069A1 - Exposure control method and device, and electronic apparatus - Google Patents


Publication number
WO2020038069A1
Authority
WO
WIPO (PCT)
Prior art keywords
exposure
night scene
image
scene mode
frame
Prior art date
Application number
PCT/CN2019/090146
Other languages
English (en)
Chinese (zh)
Inventor
胡孔勇
Original Assignee
Oppo广东移动通信有限公司
Priority date
Filing date
Publication date
Application filed by Oppo广东移动通信有限公司 filed Critical Oppo广东移动通信有限公司
Publication of WO2020038069A1

Classifications

    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 - Control of cameras or camera modules
    • H04N 23/61 - Control of cameras or camera modules based on recognised objects
    • H04N 23/611 - Control of cameras or camera modules based on recognised objects where the recognised objects include parts of the human body
    • H04N 23/667 - Camera operation mode switching, e.g. between still and video, sport and normal or high- and low-resolution modes
    • H04N 23/70 - Circuitry for compensating brightness variation in the scene
    • H04N 23/73 - Circuitry for compensating brightness variation in the scene by influencing the exposure time

Definitions

  • the present application relates to the technical field of mobile terminals, and in particular, to an exposure control method, device, and electronic device.
  • This application is intended to solve at least one of the technical problems in the related technology.
  • this application proposes an exposure control method that dynamically adjusts the applicable night scene mode by identifying whether there is a portrait in the night scene, and uses the corresponding exposure parameters to perform exposure control of the image to be acquired, which improves the imaging quality of images in night scenes.
  • the application proposes an exposure control device.
  • the present application proposes an electronic device.
  • the present application proposes a computer-readable storage medium.
  • An embodiment of one aspect of the present application provides an exposure control method, including: determining that a current shooting scene belongs to a night scene; performing face area recognition on a preview image, and identifying a night scene mode applicable to the current shooting scene according to whether a face area is recognized; determining, according to the night scene mode, exposure parameters of each frame of an image to be acquired; and performing exposure control using the exposure parameters.
  • An embodiment of another aspect of the present application provides an exposure control device, including:
  • a scene determination module configured to determine that a current shooting scene belongs to a night scene;
  • a recognition module for recognizing a face area of a preview image, and identifying a night scene mode applicable to a current shooting scene according to whether a face area is recognized;
  • a parameter determining module configured to determine an exposure parameter of an image to be acquired in each frame according to the night scene mode
  • a control module configured to perform exposure control by using the exposure parameter.
  • An embodiment of another aspect of the present application provides an electronic device, including a memory, a processor, and a computer program stored in the memory and executable on the processor. When the processor executes the program, the exposure control method according to the foregoing aspect is implemented.
  • An embodiment of yet another aspect of the present application provides a computer-readable storage medium on which a computer program is stored. When the program is executed by a processor, the exposure control method according to the foregoing aspect is implemented.
  • FIG. 1 is a schematic flowchart of an exposure control method according to an embodiment of the present application.
  • FIG. 2 is a schematic flowchart of another exposure control method according to an embodiment of the present application.
  • FIG. 3 is a schematic flowchart of another exposure control method according to an embodiment of the present application.
  • FIG. 4 is a schematic flowchart of still another exposure control method according to an embodiment of the present application.
  • FIG. 5 is a schematic structural diagram of an exposure control apparatus according to an embodiment of the present application.
  • FIG. 6 is a schematic diagram of the internal structure of the electronic device 200 in one embodiment.
  • FIG. 7 is a schematic diagram of an image processing circuit 90 in one embodiment.
  • this application proposes an exposure control method: it is determined that the current shooting scene belongs to a night scene; face area recognition is performed on the preview image; according to whether a face area is recognized, the night scene mode applicable to the current shooting scene is identified; according to the night scene mode, the exposure parameters of each frame of the image to be collected are determined; and exposure control is performed using those parameters. By identifying whether there is a portrait in the night scene, the applicable night scene mode is dynamically adjusted and the corresponding exposure parameters are used for exposure control of the images to be collected, which improves the imaging quality of images in night scenes.
  • FIG. 1 is a schematic flowchart of an exposure control method according to an embodiment of the present application.
  • the method includes the following steps:
  • Step 101 Determine that the current shooting scene belongs to a night scene.
  • an image acquisition module is used to obtain a preview image of the current scene; image feature extraction is performed on the preview image; the extracted image features are input into a recognition model; and it is determined, according to the scene type output by the recognition model, that the current shooting scene is a night scene, where the recognition model has learned the correspondence between image features and scene types.
  • a user operation for scene switching is detected; when a user operation switching to a night scene is detected, the ambient brightness is measured to obtain brightness information.
  • the light metering module built into the electronic device detects the current ambient brightness and determines the brightness information of the current environment; according to the brightness information, it is determined that the current shooting scene belongs to a night scene.
  • the brightness index Lix_index can be used to measure the brightness level; the greater the value of the brightness information, the lower the brightness of the current scene.
  • the obtained brightness information is compared with a preset brightness value; if the obtained brightness information is greater than the preset brightness value, it is determined that the current shooting scene belongs to a night scene.
  • if the obtained brightness information is less than the preset brightness value, it is determined that the current shooting scene belongs to a non-night scene, and in a non-night scene, imaging is performed in a high dynamic range mode; high dynamic range imaging can be achieved by setting different exposure compensation values to obtain a wider dynamic range. For example, 3 frames of images can be acquired, with exposure compensation values taken from the interval [-4, +1].
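The brightness-based decision described above can be sketched in Python. The threshold value and return shapes here are illustrative assumptions, not values stated in the application; only the "larger index means darker" convention and the 3-frame [-4, +1] EV bracket come from the text.

```python
# Illustrative sketch of the night-scene decision described above.
# A larger brightness index means a darker scene, so a value above the
# threshold selects night-scene processing; otherwise imaging falls back
# to HDR bracketing (e.g. 3 frames with EV in [-4, +1]).
# NIGHT_BRIGHTNESS_THRESHOLD is an assumed value for illustration.

NIGHT_BRIGHTNESS_THRESHOLD = 400

def choose_capture_mode(brightness_index: int):
    """Return the capture mode and, for HDR, the bracketed EV values."""
    if brightness_index > NIGHT_BRIGHTNESS_THRESHOLD:
        return ("night_scene", [])
    return ("hdr", [-4, 0, 1])  # 3 frames spanning the [-4, +1] EV interval
```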
  • Step 102 Perform face area recognition on the preview image, and identify a night scene mode applicable to the current shooting scene according to whether the face area is recognized.
  • face area recognition is performed on the preview image to identify whether the preview image contains a human face.
  • a face detection algorithm may be adopted, for example, a skin-detection-based face detection algorithm or a facial-feature-based face detection algorithm, to detect the face region of interest (FACE ROI) and identify the face area in the preview image. If a face area is present, the FACE ROI box is displayed; if no face area is recognized, it is not displayed. Whether there is a face area in the preview image is then determined according to the recognition result.
  • if a face area is recognized, a portrait night scene mode is determined; if no face area is recognized, a non-portrait night scene mode is determined.
  • the portrait night scene mode and the non-portrait night scene mode use different exposure compensation values. This is because, when the image processor post-processes the image, it increases the brightness of the face area based on a face-brightening algorithm; if the image were captured at normal exposure, the later face-brightening effect would be superimposed on it and the final image would show face brightness distortion. To avoid this distortion, a lower exposure compensation value can be used in the portrait night scene mode to offset all or part of the later face-brightening effect.
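As a sketch, the mode selection and its effect on the exposure-compensation range might look as follows. The concrete ranges mirror the examples given later in the text ([-6, 0] EV for portrait, [-6, +1] EV for non-portrait); the function and tuple shapes are assumptions for illustration.

```python
# Hypothetical sketch: the portrait night scene mode draws its exposure
# compensation from a range with a lower upper limit than the
# non-portrait mode, leaving headroom for the later face-brightening step.

PORTRAIT_EV_RANGE = (-6, 0)      # first value range (example from the text)
NON_PORTRAIT_EV_RANGE = (-6, 1)  # second value range (example from the text)

def select_night_mode(face_detected: bool):
    if face_detected:
        return ("portrait_night", PORTRAIT_EV_RANGE)
    return ("non_portrait_night", NON_PORTRAIT_EV_RANGE)
```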
  • Step 103 Determine the exposure parameters of the images to be collected in each frame according to the night scene mode.
  • the exposure parameters include exposure compensation value, sensitivity and exposure time.
  • in the portrait night scene mode, the preset exposure compensation value of each frame to be acquired is determined from a corresponding first value range; in the non-portrait night scene mode, the preset exposure compensation value of each frame to be acquired is determined from a corresponding second value range.
  • the upper limit of the first value range is smaller than the upper limit of the second value range.
  • the preset sensitivity of each frame to be acquired is determined; the reference exposure amount is determined based on the brightness information of the preview image; and the target exposure amount of each frame is determined based on the reference exposure amount and the preset exposure compensation value of that frame.
  • the exposure time of each frame of the image to be acquired is determined according to the target exposure amount and the preset sensitivity of that frame.
  • Step 104 Use exposure parameters to perform exposure control.
  • the exposure control is performed using the determined exposure parameters of the images to be acquired in each frame.
  • according to the exposure control method in the embodiment of the present application, it is determined that the current shooting scene belongs to a night scene; face area recognition is performed on the preview image; according to whether a face area is recognized, a night scene mode applicable to the current shooting scene is identified; according to the night scene mode, the exposure parameters of each frame to be captured are determined; and exposure control is performed using those parameters.
  • by identifying whether there is a portrait in the night scene, the applicable night scene mode is dynamically adjusted, and the corresponding exposure parameters are used to control the exposure of the image to be collected, which improves the imaging quality of images in night scenes.
  • this embodiment provides another exposure control method, which illustrates that before the face area recognition of the preview image, the night scene mode needs to be determined according to the degree of jitter of the imaging device.
  • FIG. 2 is a schematic flowchart of another exposure control method provided by an embodiment of this application. As shown in FIG. 2, before step 102, the method may further include the following steps:
  • Step 201 Obtain the degree of shaking of the imaging device, and determine the night scene mode according to the degree of shaking.
  • the night scene mode determined according to the degree of shaking includes a portrait night scene mode or a non-portrait night scene mode, a single-frame night scene mode, and a tripod night scene mode.
  • the collected displacement information is acquired from a sensor provided in the imaging device, and the degree of jitter of the imaging device is determined according to the displacement information.
  • the sensor may be a gyroscope, which outputs displacement information on the three axes x, y, and z. The absolute value of the displacement information on each axis is taken and the values are summed; the sum of the displacement information corresponding to the three axes is denoted S, and the value of S indicates the degree of jitter of the imaging device.
  • a preset jitter threshold is determined, and the obtained degree of jitter is compared with it to determine the current night scene mode.
  • the jitter threshold is divided into a first jitter threshold and a second jitter threshold, where the first jitter threshold is greater than the second jitter threshold.
  • the night scene mode is determined according to the jitter thresholds as follows: if the degree of jitter is less than the first jitter threshold and greater than the second jitter threshold, a portrait night scene mode or a non-portrait night scene mode is used; if the degree of jitter is greater than or equal to the first jitter threshold, a single-frame night scene mode is used; if the degree of jitter is less than or equal to the second jitter threshold, a tripod night scene mode is used.
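The gyroscope summation and threshold comparison can be sketched as follows. The structure (sum of absolute axis displacements, two thresholds with t1 > t2) comes from the text; the concrete threshold values are assumptions for illustration.

```python
# Sketch of the jitter-based mode selection. The shake degree S is the
# sum of absolute displacements on the gyroscope's x, y and z axes; the
# two thresholds satisfy t1 > t2. Threshold defaults are assumed values.

def shake_degree(dx: float, dy: float, dz: float) -> float:
    return abs(dx) + abs(dy) + abs(dz)

def night_mode_from_shake(s: float, t1: float = 8.0, t2: float = 1.0) -> str:
    if s >= t1:
        return "single_frame"  # large shake: one short exposure
    if s <= t2:
        return "tripod"        # nearly still: long multi-frame capture
    return "handheld"          # portrait/non-portrait, per face detection
```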
  • the number of frames of the image to be collected in the tripod night scene mode is larger than the number of frames in the portrait night scene mode or the non-portrait night scene mode, and the number of frames of the image to be collected in the single-frame night scene mode is smaller than the number of frames in the portrait night scene mode or the non-portrait night scene mode.
  • when the degree of jitter lies between the two thresholds, a portrait night scene mode or a non-portrait night scene mode can be used; which of the two applies is then determined by identifying whether the preview image contains a face area. In the other cases, the night scene mode can be determined by the degree of jitter alone.
  • if the degree of jitter is very small, the tripod night scene mode can be used: jitter is minimal, and the imaging device itself has an anti-shake strategy such as Optical Image Stabilization (OIS). Because jitter in this mode is so small, a long capture, for example 1 minute, can be used to obtain a high-quality picture; since even a person appearing in the image cannot stay still for 1 minute, the mode does not distinguish whether a face area is included, that is, no special processing is performed even if a portrait is present.
  • if the degree of jitter is large, the single-frame night scene mode can be used. Due to the large jitter, this mode uses only a single frame acquisition and a single exposure to shorten the exposure time and avoid blur and ghosting; it therefore also does not distinguish whether a face region is included.
  • according to the exposure control method in this embodiment, it is determined that the current shooting scene belongs to a night scene, the current night scene mode is determined according to the jitter situation, and face area recognition is further performed on the preview image. According to whether a face area is recognized, the night scene mode applicable to the current shooting scene is identified; according to the night scene mode, the exposure parameters of each frame to be collected are determined; and exposure control is performed using those parameters. By identifying whether there is a portrait in the night scene, the applicable night scene mode is dynamically adjusted and the corresponding exposure parameters are used to control the exposure of the image to be collected, improving the imaging quality of images in night scenes.
  • an embodiment of the present application further proposes an exposure control method, which further clearly explains how to determine the exposure parameters of the images to be collected in each frame according to the determined night scene mode or portrait night scene mode.
  • FIG. 3 is a schematic flowchart of another exposure control method provided by an embodiment of the present application. As shown in FIG. 3, step 103 may further include the following sub-steps:
  • Step 1031 Determine the preset sensitivity of the image to be collected for each frame.
  • when the night scene mode is a portrait night scene mode or a non-portrait night scene mode, the preset sensitivity (iso) of each frame of the image to be acquired is determined; for example, the preset sensitivity is 200 iso.
  • the preset sensitivity value is kept small to prevent excessive noise from appearing in the captured image and to improve the imaging quality of the image.
  • the preset sensitivities of the images to be collected in each frame in the portrait night scene mode or the non-portrait night scene mode may be completely the same, or there may be minor differences. For example, the preset sensitivity of each frame to be collected is 200 iso; or the preset sensitivity of the first frame is 200 iso and subsequent frames increase by 5 iso per frame.
  • the value of the preset sensitivity is not limited in this embodiment.
  • Step 1032 Determine a preset exposure compensation value for each frame of the image to be acquired according to the night scene mode.
  • in the portrait night scene mode, the preset exposure compensation value of each frame to be acquired is determined from the corresponding first value range.
  • for example, the image to be acquired is set to 7 frames, and the first value range of the exposure compensation value EV (Exposure Compensation Value) is [-6, 0] EV, so the preset exposure compensation values of the frames to be collected are determined as [-6, -4, -2, 0, 0, 0, 0] EV.
  • in the non-portrait night scene mode, the preset exposure compensation value of each frame to be acquired is determined from the corresponding second value range.
  • for example, the image to be acquired is set to 7 frames and the second value range of the exposure compensation value is set to [-6, +1] EV; the corresponding exposure compensation values can be [-6, -3, 0, +1, +1, +1, +1] EV.
  • the exposure compensation values can also be adjusted according to the ambient brightness and the captured images, narrowing the exposure range; for example, the range of the exposure compensation value can be adjusted to [-5, +1] EV.
  • Step 1033 Determine a reference exposure amount according to the brightness information of the preview image.
  • specifically, the brightness information of the preview image corresponding to the current shooting scene is measured by the photometry module in the electronic device, and the measured brightness information is converted using a set, relatively low sensitivity to determine the reference exposure amount, denoted EV0.
  • for example, if the sensitivity measured by the photometry module is 500 iso with an exposure time of 50 milliseconds (ms) and the target sensitivity is 100 iso, then after conversion the sensitivity is 100 iso and the exposure time is 250 ms; the combination of 100 iso and 250 ms is taken as the reference exposure amount EV0.
  • EV0 is the reference exposure best suited to the current night scene environment, but it is not a fixed value; when the brightness information of the preview image changes, the reference exposure EV0 changes accordingly.
  • Step 1034 Determine the target exposure amount of the image to be acquired for each frame according to the reference exposure amount and the preset exposure compensation value of the image to be acquired for each frame.
  • for example, the preset exposure compensation values corresponding to the 7 frames of images to be acquired are -6 EV, -4 EV, -2 EV, 0 EV, 0 EV, 0 EV, and 0 EV, respectively, where the sign indicates the direction of the compensation and the number indicates the number of exposure compensation steps. According to the preset exposure compensation value and the reference exposure of each frame of the image to be collected, the target exposure of that frame is determined.
  • if the exposure compensation value of a frame is -6 EV and the reference exposure is EV0, the target exposure of that frame is determined as EV0 × 2^-6, that is, EV0/64, reducing the brightness of the frame acquisition; if the exposure compensation value of a frame is 0 EV and the reference exposure is EV0, the target exposure of that frame is determined as EV0 × 2^0, that is, EV0, meaning exposure is performed at the reference exposure amount. The target exposure amounts of the images to be collected in the other frames are determined in the same way and are not listed here.
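The per-frame target exposure computation above amounts to scaling the reference exposure by 2 raised to the EV value. A minimal sketch, using the 7-frame portrait schedule quoted from the text:

```python
# Target exposure per frame = EV0 * 2**EV, so -6 EV yields EV0/64 and
# 0 EV yields EV0 itself, matching the worked example above.

def target_exposures(ev0: float, ev_schedule):
    return [ev0 * 2 ** ev for ev in ev_schedule]

PORTRAIT_SCHEDULE = [-6, -4, -2, 0, 0, 0, 0]  # example schedule from the text
```

For instance, with a reference exposure of 64 units, the -6 EV frame is captured at 1 unit (64/64) and the 0 EV frames at the full 64 units.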
  • Step 1035 Determine the exposure duration of the image to be acquired in each frame according to the target exposure of the image to be acquired in each frame and the preset sensitivity of the image to be acquired in each frame.
  • in the night scene mode, the aperture value is fixed when collecting each frame of images, so the target exposure amount is determined by the sensitivity and the exposure duration; given the target exposure amount and the preset sensitivity, the corresponding exposure time can be determined.
  • for example, the sensitivity (iso) value and exposure time corresponding to the reference exposure are 100 iso and 250 ms. If the preset sensitivity of a frame to be captured is 100 iso and its exposure compensation value is -4 EV, the target exposure time is 250 ms × 2^-4, that is, about 16 ms, meaning the exposure time is reduced; if the exposure compensation value is 0 EV, the obtained exposure time is 250 ms. In this way, the exposure time of each frame can be determined.
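With a fixed aperture, exposure is proportional to sensitivity × time, so the per-frame exposure time follows directly. A sketch consistent with the 100 iso / 250 ms reference used above (the function name and signature are illustrative assumptions):

```python
# Exposure time for a frame, assuming exposure ∝ sensitivity × time.
# The reference exposure is scaled by 2**ev, then divided by the frame's
# preset sensitivity to recover that frame's exposure time.

def exposure_time_ms(ref_iso: float, ref_time_ms: float,
                     frame_iso: float, ev: float) -> float:
    target_exposure = ref_iso * ref_time_ms * 2 ** ev
    return target_exposure / frame_iso
```

For example, exposure_time_ms(100, 250, 100, -4) gives 15.625 ms, which rounds to the roughly 16 ms figure in the text.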
  • in a possible scenario, the minimum exposure duration supported by the shutter is 10 milliseconds (ms).
  • the metering device may mistakenly assume that the current scene is relatively bright, so that the determined reference exposure is smaller, that is, the exposure duration corresponding to the reference exposure is shorter.
  • in that case, the calculated exposure time of a frame may be lower than the preset minimum exposure time of 10 ms, for example 8 ms; the exposure time is then increased from 8 ms to 10 ms, and the corresponding magnification compensation value is determined to ensure that the darkest frame still has a certain shooting brightness.
  • the exposure of each frame to be collected is increased according to this magnification compensation value, so that the brightness of the collected images increases linearly; when the acquired images are synthesized later, the halo transitions are natural, improving the effect of the synthesized image.
  • in another possible scenario, the maximum exposure duration of a single frame calculated from the reference exposure may exceed the maximum value set for the exposure duration, for example 5 seconds. For example, if the reference sensitivity is 100 iso with an exposure time of 4 seconds, and the preset sensitivity of the image to be captured is also 100 iso, the calculated exposure time may be 8 seconds, which exceeds the preset maximum of 5 seconds; the exposure time of that frame is then reduced to the preset maximum of 5 seconds, the reduction ratio is determined, and the sensitivity is raised by this ratio to prevent the exposure time from being too long.
  • according to the exposure control method in the embodiment of the present application, when the night scene mode is a portrait night scene mode or a non-portrait night scene mode, the preset sensitivity and the preset exposure compensation value of each frame to be collected are determined; the reference exposure amount is determined from the brightness information of the preview image; the target exposure amount of each frame is determined from the reference exposure amount and the preset exposure compensation value; and the exposure duration is determined from the target exposure amount and the preset sensitivity of each frame, thereby determining the exposure parameters of each frame to be collected. By setting different exposure parameters in the portrait night scene mode and the non-portrait night scene mode, dynamic adjustment of the exposure parameters when shooting night scenes is realized, and the imaging quality of night scenes is improved.
  • FIG. 4 is a schematic flowchart of another exposure control method provided by an embodiment of the present application. As shown in FIG. 4, after step 104, the method may further include the following steps:
  • Step 401 Acquire each frame image collected under exposure control, and synthesize each frame image to obtain an imaging image.
  • each frame image captured under the control of the corresponding exposure parameters is acquired; the acquired images are aligned to eliminate the influence of jitter; at the same time, moving objects in the images are detected to eliminate ghosting; and then the corresponding pixels in each frame image are weighted and synthesized to obtain a corresponding frame of target image.
  • because the exposure parameters used for each acquired frame differ and correspond to different exposure durations, synthesizing the frames allows the dark portions of the final output image to be compensated with the corresponding pixel information from images with longer exposure durations, while the bright portions can be suppressed with the corresponding pixel information from images with shorter exposure durations.
  • moreover, the amplitude and position of the noise generated by the sensor current are random, so when multiple images are superimposed and synthesized, the noise cancels out, thereby improving the imaging quality.
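The weighted per-pixel synthesis can be sketched for a single pixel position. Alignment and deghosting are omitted, and the weighting scheme is an assumption; the point is that averaging across frames is what lets random noise cancel out.

```python
# Weighted fusion of the same pixel sampled from each aligned frame.

def fuse_pixel(samples, weights):
    """Combine one pixel's values across frames by weighted average."""
    total_weight = sum(weights)
    return sum(s * w for s, w in zip(samples, weights)) / total_weight
```

In practice the weights would favor long-exposure frames in dark regions and short-exposure frames in bright regions, per the bullet above.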
  • according to the exposure control method in the embodiment of the present application, it is determined that the current shooting scene belongs to a night scene; face area recognition is performed on the preview image; according to whether a face area is recognized, a night scene mode applicable to the current shooting scene is identified; according to the night scene mode, the exposure parameters of each frame to be captured are determined; and exposure control is performed using those parameters. By identifying whether there is a portrait in the night scene, the applicable night scene mode is dynamically adjusted and the corresponding exposure parameters are used to control the exposure of the image to be collected, which improves the imaging quality of images in night scenes; at the same time, multiple frames of images are synthesized to preserve the details of the highlights and the corresponding transitions, which improves the imaging effect.
  • the present application also proposes an exposure control device.
  • FIG. 5 is a schematic structural diagram of an exposure control apparatus according to an embodiment of the present application.
  • the device includes a scene determination module 51, an identification module 52, a parameter determination module 53, and a control module 54.
  • the scene determining module 51 is configured to determine that a current shooting scene belongs to a night scene.
  • the recognition module 52 is configured to recognize a face area of the preview image, and identify a night scene mode applicable to the current shooting scene according to whether the face area is recognized.
  • a parameter determining module 53 is configured to determine an exposure parameter of an image to be acquired in each frame according to a night scene mode.
  • the control module 54 is configured to perform exposure control by using an exposure parameter.
  • the apparatus further includes a determination module and a synthesis module.
  • the determining module is configured to obtain the degree of jitter of the imaging device and, when the degree of jitter is less than the first jitter threshold and greater than the second jitter threshold, determine that a portrait night scene mode or a non-portrait night scene mode is adopted, where the first jitter threshold is greater than the second jitter threshold.
  • the determining module is further used for:
  • if the degree of jitter is greater than or equal to the first jitter threshold, it is determined that a single-frame night scene mode is adopted; if the degree of jitter is less than or equal to the second jitter threshold, it is determined that a tripod night scene mode is adopted;
  • the number of frames of the image to be acquired in the tripod night scene mode is greater than the number of frames in the portrait night scene mode or the non-portrait night scene mode;
  • the number of frames to be captured in the single-frame night scene mode is smaller than the number of frames in the portrait night scene mode or the non-portrait night scene mode.
  • the determining module is further specifically used to:
  • a synthesizing module is configured to acquire each frame image collected under exposure control; and synthesize each frame image to obtain an imaging image.
  • the foregoing recognition module 52 is specifically configured to: if a face area is recognized, determine a portrait night scene mode; if no face area is recognized, determine a non-portrait night scene mode; the portrait night scene mode and the non-portrait night scene mode use different exposure compensation values.
  • the parameter determining module 53 is specifically configured to: in the portrait night scene mode, determine the preset exposure compensation value of each frame to be acquired from the corresponding first value range; in the non-portrait night scene mode, determine the preset exposure compensation value of each frame to be acquired from the corresponding second value range; the upper limit of the first value range is smaller than the upper limit of the second value range.
  • the above-mentioned parameter determining module 53 is further specifically configured to: determine the exposure time of each frame of the image to be acquired according to the target exposure amount and the preset sensitivity of that frame.
  • the foregoing scenario determining module 51 is specifically configured to:
  • detect a user operation for scene switching; when a user operation switching to a night scene is detected, detect the ambient brightness to obtain brightness information; and determine, according to the brightness information, that the current shooting scene belongs to a night scene.
  • the foregoing scene determination module 51 is further specifically configured to: if it is determined according to the brightness information that the current shooting scene belongs to a non-night scene, use a high dynamic range mode for imaging.
  • the foregoing scenario determination module 51 is further specifically configured to:
  • Extract the image features of the preview image, input the extracted image features into the recognition model, and determine that the current shooting scene belongs to the night scene according to the scene type output by the recognition model; wherein the recognition model has learned the correspondence between image features and scene types.
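The brightness-based branch of the scene determination described above can be sketched as a mean-luminance threshold test. The threshold value and the BT.601 luma weights are assumptions for illustration; the application's learned recognition model is not reproduced here:

```python
def is_night_scene(pixels, threshold: float = 50.0) -> bool:
    """Classify a scene as a night scene if the mean luma of the preview
    falls below a threshold. `pixels` is an iterable of (r, g, b) tuples
    with channel values in [0, 255]. The threshold and the BT.601 luma
    weights are illustrative assumptions."""
    lumas = [0.299 * r + 0.587 * g + 0.114 * b for r, g, b in pixels]
    return sum(lumas) / len(lumas) < threshold
```

A production pipeline would compute this from the sensor's exposure statistics rather than raw preview pixels.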
  • With the exposure control device of the embodiment of the present application, it is determined that the current shooting scene belongs to a night scene, the current night scene mode is determined from the shake situation, and face area recognition is performed on the preview image.
  • According to the night scene mode applicable to the shooting scene, the exposure parameters of the frames to be captured are determined and used to perform exposure control; the applicable night scene mode is dynamically adjusted by identifying whether there is a portrait in the night scene, and the corresponding exposure parameters control the exposure of the images to be collected, which improves the imaging quality of images in night scenes. At the same time, synthesizing multiple frames of images retains the details of the highlights and the corresponding transitions, which improves the imaging effect.
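The multi-frame synthesis step can be sketched as a per-pixel weighted average that favors well-exposed samples, so that highlight detail from short exposures and shadow detail from long exposures both survive. The triangular weighting below is a simplified stand-in, not the application's synthesis algorithm:

```python
def merge_frames(frames):
    """Merge aligned grayscale frames (lists of pixel values in [0, 255])
    into one image. Pixels near mid-gray get higher weight, so clipped
    highlights and crushed shadows contribute little. The triangular
    weight function is an illustrative assumption."""
    def weight(v):
        # Highest confidence at mid-gray (127.5), lowest at 0 and 255.
        return max(1e-3, 1.0 - abs(v - 127.5) / 127.5)

    merged = []
    for pixel_stack in zip(*frames):  # same pixel across all frames
        ws = [weight(v) for v in pixel_stack]
        merged.append(sum(w * v for w, v in zip(ws, pixel_stack)) / sum(ws))
    return merged
```

With one underexposed, one overexposed, and one mid-gray frame, the merged pixel lands near the well-exposed mid-gray value.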
  • an embodiment of the present application further provides an electronic device including a memory, a processor, and a computer program stored in the memory and executable on the processor.
  • When the processor executes the program, the exposure control method according to the foregoing method embodiment is implemented.
  • FIG. 6 is a schematic diagram of the internal structure of the electronic device 200 in one embodiment.
  • the electronic device 200 includes a processor 60, a memory 50 (for example, a non-volatile storage medium), an internal memory 82, a display screen 83, and an input device 84 connected through a system bus 81.
  • the memory 50 of the electronic device 200 stores an operating system and computer-readable instructions.
  • the computer-readable instructions can be executed by the processor 60 to implement the control method in the embodiment of the present application.
  • the processor 60 is used to provide computing and control capabilities to support the operation of the entire electronic device 200.
  • the internal memory 82 of the electronic device 200 provides an environment for the execution of the computer-readable instructions in the memory 50.
  • the display screen 83 of the electronic device 200 may be a liquid crystal display or an electronic ink display.
  • the input device 84 may be a touch layer covering the display screen 83, or may be a button, a trackball, or a touchpad provided on the housing of the electronic device 200; it may also be an external keyboard, trackpad, or mouse.
  • the electronic device 200 may be a mobile phone, a tablet computer, a notebook computer, a personal digital assistant, or a wearable device (for example, a smart bracelet, a smart watch, a smart helmet, or smart glasses).
  • Those skilled in the art can understand that the structure shown in FIG. 6 is only a schematic diagram of a part of the structure related to the solution of the present application, and does not constitute a limitation on the electronic device 200 to which the solution of the present application is applied.
  • the specific electronic device 200 may include more or fewer components than shown in the figure, or some components may be combined, or have different component arrangements.
  • the electronic device 200 includes an image processing circuit 90.
  • the image processing circuit 90 may be implemented using hardware and/or software components, and may include various processing units defining an ISP (Image Signal Processing) pipeline.
  • FIG. 7 is a schematic diagram of an image processing circuit 90 in one embodiment. As shown in FIG. 7, for ease of description, only aspects of the image processing technology related to the embodiments of the present application are shown.
  • the image processing circuit 90 includes an ISP processor 91 (the ISP processor 91 may be the processor 60) and a control logic 92.
  • the image data captured by the camera 93 is first processed by the ISP processor 91.
  • the ISP processor 91 analyzes the image data to capture image statistical information that can be used to determine one or more control parameters of the camera 93.
  • the camera 93 may include one or more lenses 932 and an image sensor 934.
  • the image sensor 934 may include a color filter array (such as a Bayer filter). The image sensor 934 may obtain light intensity and wavelength information captured by each imaging pixel, and provide a set of raw image data that can be processed by the ISP processor 91.
  • the sensor 94 (such as a gyroscope) may provide acquired image-processing parameters (such as image stabilization parameters) to the ISP processor 91 based on the interface type of the sensor 94.
  • the sensor 94 interface may be a SMIA (Standard Mobile Imaging Architecture) interface, other serial or parallel camera interfaces, or a combination of the foregoing interfaces.
  • the image sensor 934 may also send the original image data to the sensor 94.
  • the sensor 94 may provide the original image data to the ISP processor 91 based on the interface type of the sensor 94, or the sensor 94 stores the original image data into the image memory 95.
  • the ISP processor 91 processes the original image data pixel by pixel in a variety of formats. For example, each image pixel may have a bit depth of 8, 10, 12, or 14 bits, and the ISP processor 91 may perform one or more image processing operations on the original image data and collect statistical information about the image data. The image processing operations may be performed with the same or different bit depth accuracy.
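The bit-depth handling mentioned above can be illustrated by a minimal requantization step that rescales a raw sample from one bit depth to another. This is a sketch only; a real ISP would also apply black-level subtraction, demosaicing, and gamma correction:

```python
def requantize(raw: int, src_bits: int, dst_bits: int = 8) -> int:
    """Rescale a raw sample from src_bits depth to dst_bits depth,
    rounding to the nearest output code. Illustrative of the bit-depth
    conversion an ISP performs when processing 8/10/12/14-bit data."""
    src_max = (1 << src_bits) - 1
    dst_max = (1 << dst_bits) - 1
    # Integer rounding: add half the divisor before dividing.
    return (raw * dst_max + src_max // 2) // src_max
```

For example, a full-scale 10-bit sample (1023) maps to the full-scale 8-bit code (255).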
  • the ISP processor 91 may also receive image data from the image memory 95.
  • the sensor 94 interface sends the original image data to the image memory 95, and the original image data in the image memory 95 is then provided to the ISP processor 91 for processing.
  • the image memory 95 may be a part of the memory 50, a storage device, or an independent dedicated memory within an electronic device, and may include a DMA (Direct Memory Access) feature.
  • the ISP processor 91 may perform one or more image processing operations, such as time-domain filtering.
  • the processed image data may be sent to the image memory 95 for further processing before being displayed.
  • the ISP processor 91 receives processed data from the image memory 95 and processes the image data in the raw domain and in the RGB and YCbCr color spaces.
  • the image data processed by the ISP processor 91 may be output to a display 97 (the display 97 may include a display screen 83) for viewing by a user and / or further processing by a graphics engine or a GPU (Graphics Processing Unit).
  • the output of the ISP processor 91 can also be sent to the image memory 95, and the display 97 can read image data from the image memory 95.
  • the image memory 95 may be configured to implement one or more frame buffers.
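A frame buffer of the kind the image memory implements can be sketched as a small ring buffer that keeps only the most recent frames. The capacity and the drop-oldest eviction policy are assumptions for illustration:

```python
from collections import deque

class FrameBuffer:
    """Minimal ring buffer holding the most recent `capacity` frames,
    illustrating the frame buffers the image memory may implement.
    Oldest frames are evicted automatically when capacity is exceeded."""

    def __init__(self, capacity: int):
        self._frames = deque(maxlen=capacity)  # deque drops the oldest

    def push(self, frame):
        self._frames.append(frame)

    def latest(self):
        return self._frames[-1]

    def __len__(self):
        return len(self._frames)
```

The display side reads the newest complete frame while the capture side keeps pushing, which is the usual producer/consumer arrangement for preview pipelines.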
  • the output of the ISP processor 91 may be sent to an encoder / decoder 96 to encode / decode image data.
  • the encoded image data can be saved and decompressed before being displayed on the display 97.
  • the encoder / decoder 96 may be implemented by a CPU or a GPU or a coprocessor.
  • the statistical data determined by the ISP processor 91 may be sent to the control logic unit 92.
  • the statistical data may include image information of the image sensor 934 such as auto exposure, auto white balance, auto focus, flicker detection, black level compensation, and lens 932 shading correction.
  • the control logic 92 may include a processing element and/or a microcontroller that executes one or more routines (such as firmware); the one or more routines may determine, according to the received statistical data, the control parameters of the camera 93 and the control parameters of the ISP processor 91.
  • control parameters of the camera 93 may include sensor 94 control parameters (such as gain and integration time for exposure control, anti-shake parameters, etc.), camera flash control parameters, lens 932 control parameters (such as focal distance for focusing or zooming), or a combination of these parameters.
  • the ISP control parameters may include gain levels and color correction matrices for automatic white balance and color adjustment (eg, during RGB processing), and lens 932 shading correction parameters.
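The white-balance gains and color correction matrix mentioned above are applied per pixel. The following is a minimal sketch, assuming caller-supplied gains and a 3x3 matrix rather than values produced by a real statistics engine:

```python
def apply_awb_and_ccm(rgb, gains, ccm):
    """Apply per-channel white-balance gains, then a 3x3 color-correction
    matrix, to one RGB pixel, clipping the result to [0, 255]. In a real
    ISP the gains and matrix come from the auto-white-balance statistics;
    here both are caller-supplied assumptions."""
    # Step 1: white balance - scale each channel by its gain.
    r, g, b = (c * k for c, k in zip(rgb, gains))
    # Step 2: color correction - multiply by the 3x3 matrix.
    corrected = [ccm[i][0] * r + ccm[i][1] * g + ccm[i][2] * b
                 for i in range(3)]
    # Step 3: clip to the valid 8-bit display range.
    return [min(255.0, max(0.0, c)) for c in corrected]
```

With an identity matrix, only the gains act: boosting red 2x turns a bluish-gray pixel neutral, and values that would exceed 255 are clipped.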
  • the following are the steps of implementing the exposure control method by using the processor 60 in FIG. 6 or the image processing circuit 90 (specifically, the ISP processor 91) in FIG. 7:
  • an embodiment of the present application further provides a computer-readable storage medium on which a computer program is stored.
  • When the instructions in the storage medium are executed by a processor, the exposure control method of the foregoing method embodiment is implemented.
  • The terms "first" and "second" are used for descriptive purposes only and cannot be understood as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Therefore, a feature defined as "first" or "second" may explicitly or implicitly include at least one such feature. In the description of the present application, "a plurality" means at least two, for example, two or three, unless specifically defined otherwise.
  • Any process or method description in a flowchart or otherwise described herein can be understood as representing a module, fragment, or portion of code that includes one or more executable instructions for implementing the steps of a custom logic function or process.
  • The scope of the preferred embodiments of this application includes additional implementations in which the functions may be performed out of the order shown or discussed, including in a substantially simultaneous manner or in reverse order depending on the functions involved, as should be understood by those skilled in the art to which the embodiments of the present application pertain.
  • The logic and/or steps represented in a flowchart or otherwise described herein, for example, a sequenced list of executable instructions that may be considered to implement logical functions, may be embodied in any computer-readable medium for use by, or in combination with, an instruction execution system, apparatus, or device (such as a computer-based system, a system including a processor, or another system that can fetch and execute instructions from an instruction execution system, apparatus, or device).
  • a "computer-readable medium” may be any device that can contain, store, communicate, propagate, or transmit a program for use by or in connection with an instruction execution system, apparatus, or device.
  • More specific examples of computer-readable media include the following: an electrical connection (electronic device) with one or more wirings, a portable computer diskette (magnetic device), a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a fiber-optic device, and a portable compact disc read-only memory (CD-ROM).
  • the computer-readable medium may even be paper or another suitable medium on which the program can be printed, because the program can be obtained electronically by, for example, optically scanning the paper or other medium and then editing, interpreting, or otherwise processing it in a suitable manner, and then stored in a computer memory.
  • each part of the application may be implemented by hardware, software, firmware, or a combination thereof.
  • multiple steps or methods may be implemented by software or firmware stored in a memory and executed by a suitable instruction execution system.
  • For example: discrete logic circuits having logic gates for implementing logic functions on data signals, application-specific integrated circuits (ASICs) with suitable combinational logic gates, programmable gate arrays (PGA), field-programmable gate arrays (FPGA), and the like.
  • a person of ordinary skill in the art can understand that all or part of the steps carried by the methods in the foregoing embodiments can be implemented by a program instructing related hardware.
  • the program can be stored in a computer-readable storage medium.
  • When the program is executed, one of the steps of the method embodiment or a combination thereof is performed.
  • each functional unit in each embodiment of the present application may be integrated into one processing module, or each unit may exist separately physically, or two or more units may be integrated into one module.
  • the above integrated modules may be implemented in the form of hardware or software functional modules. If the integrated module is implemented in the form of a software functional module and sold or used as an independent product, it may also be stored in a computer-readable storage medium.
  • the aforementioned storage medium may be a read-only memory, a magnetic disk, or an optical disk.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Abstract

The present invention relates to an exposure control method and device, and an electronic apparatus, and belongs to the technical field of mobile terminals. The method comprises: determining that a current shooting scene is a night scene; performing face-region recognition on a preview image, and identifying, according to whether a face region is recognized, a night scene mode applicable to the current shooting scene; determining, according to the night scene mode, an exposure parameter for each frame of an image to be acquired; using the exposure parameters to perform exposure control; dynamically adjusting the applicable night scene mode by identifying whether a person is present in the night scene; and using the corresponding exposure parameters to perform exposure control on the images to be acquired. The invention improves the imaging quality of images in night scenes, and solves the technical problem of poor imaging quality caused by using a single shooting mode when shooting night scenes.
PCT/CN2019/090146 2018-08-22 2019-06-05 Exposure control method and device, and electronic apparatus WO2020038069A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201810963326.2 2018-08-22
CN201810963326.2A CN109068067B (zh) Exposure control method and device, and electronic apparatus

Publications (1)

Publication Number Publication Date
WO2020038069A1 true WO2020038069A1 (fr) 2020-02-27

Family

ID=64755843

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/090146 WO2020038069A1 (fr) Exposure control method and device, and electronic apparatus

Country Status (2)

Country Link
CN (1) CN109068067B (fr)
WO (1) WO2020038069A1 (fr)

Families Citing this family (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109068067B (zh) * 2018-08-22 2020-06-12 Oppo广东移动通信有限公司 曝光控制方法、装置和电子设备
CN109618102B (zh) * 2019-01-28 2021-08-31 Oppo广东移动通信有限公司 对焦处理方法、装置、电子设备及存储介质
CN109862282B (zh) * 2019-02-18 2021-04-30 Oppo广东移动通信有限公司 人物图像处理方法和装置
CN109831629B (zh) * 2019-03-14 2021-07-02 Oppo广东移动通信有限公司 终端拍照模式的调整方法、装置、终端及存储介质
CN109831628B (zh) * 2019-03-14 2021-07-16 Oppo广东移动通信有限公司 终端拍照模式的调整方法、装置、终端及存储介质
CN112396574B (zh) * 2019-08-02 2024-02-02 浙江宇视科技有限公司 一种车牌图像质量处理方法、装置、存储介质及电子设备
CN112532857B (zh) * 2019-09-18 2022-04-12 华为技术有限公司 一种延时摄影的拍摄方法及设备
CN110740238B (zh) * 2019-10-24 2021-05-11 华南农业大学 一种应用于移动机器人slam领域的分光hdr相机
CN111131693B (zh) * 2019-11-07 2021-07-30 深圳市艾为智能有限公司 一种基于多曝光人脸检测的人脸图像增强方法
CN113099101B (zh) * 2019-12-23 2023-03-24 杭州宇泛智能科技有限公司 摄像参数调节方法、装置及电子设备
CN111402135B (zh) * 2020-03-17 2023-06-20 Oppo广东移动通信有限公司 图像处理方法、装置、电子设备及计算机可读存储介质
CN111447374B (zh) * 2020-05-13 2021-01-26 重庆紫光华山智安科技有限公司 补光调节方法、装置、电子设备及存储介质
CN111654623B (zh) * 2020-05-29 2022-03-22 维沃移动通信有限公司 拍照方法、装置和电子设备
CN117714900A (zh) * 2023-08-01 2024-03-15 荣耀终端有限公司 视频拍摄方法、装置、电子设备及存储介质

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080037975A1 (en) * 2006-08-08 2008-02-14 Kenichi Nakajima Imaging device
CN101772952A (zh) * 2007-07-23 2010-07-07 松下电器产业株式会社 摄像装置
CN103002224A (zh) * 2011-09-09 2013-03-27 佳能株式会社 摄像装置及其控制方法
CN103227896A (zh) * 2012-01-26 2013-07-31 佳能株式会社 电子装置及电子装置控制方法
CN103841324A (zh) * 2014-02-20 2014-06-04 小米科技有限责任公司 拍摄处理方法、装置和终端设备
CN109068067A (zh) * 2018-08-22 2018-12-21 Oppo广东移动通信有限公司 曝光控制方法、装置和电子设备

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5386793B2 (ja) * 2006-12-11 2014-01-15 株式会社リコー 撮像装置および撮像装置の露出制御方法
KR101594295B1 (ko) * 2009-07-07 2016-02-16 삼성전자주식회사 촬영 장치 및 촬영 방법
CN103220431A (zh) * 2013-05-07 2013-07-24 深圳市中兴移动通信有限公司 自动切换拍照模式的方法及装置

Cited By (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113870300A (zh) * 2020-06-29 2021-12-31 北京迈格威科技有限公司 图像处理方法、装置、电子设备及可读存储介质
CN111986129A (zh) * 2020-06-30 2020-11-24 普联技术有限公司 基于多摄图像融合的hdr图像生成方法、设备及存储介质
CN111986129B (zh) * 2020-06-30 2024-03-19 普联技术有限公司 基于多摄图像融合的hdr图像生成方法、设备及存储介质
CN112929576A (zh) * 2021-02-01 2021-06-08 北京字节跳动网络技术有限公司 图像处理方法、装置、设备和存储介质
CN112929576B (zh) * 2021-02-01 2023-08-01 北京字节跳动网络技术有限公司 图像处理方法、装置、设备和存储介质
CN112819722A (zh) * 2021-02-03 2021-05-18 东莞埃科思科技有限公司 一种红外图像人脸曝光方法、装置、设备及存储介质
CN112911165A (zh) * 2021-03-02 2021-06-04 杭州海康慧影科技有限公司 内窥镜曝光方法、装置及计算机可读存储介质
CN112911165B (zh) * 2021-03-02 2023-06-16 杭州海康慧影科技有限公司 内窥镜曝光方法、装置及计算机可读存储介质
CN115314629B (zh) * 2021-05-08 2024-03-01 杭州海康威视数字技术股份有限公司 一种成像方法、系统及摄像机
CN115314628B (zh) * 2021-05-08 2024-03-01 杭州海康威视数字技术股份有限公司 一种成像方法、系统及摄像机
CN115314629A (zh) * 2021-05-08 2022-11-08 杭州海康威视数字技术股份有限公司 一种成像方法、系统及摄像机
CN115314628A (zh) * 2021-05-08 2022-11-08 杭州海康威视数字技术股份有限公司 一种成像方法、系统及摄像机
CN113422908B (zh) * 2021-07-01 2023-05-23 联想(北京)有限公司 一种数据处理方法及装置
CN113422908A (zh) * 2021-07-01 2021-09-21 联想(北京)有限公司 一种数据处理方法及装置
CN115706766A (zh) * 2021-08-12 2023-02-17 荣耀终端有限公司 视频处理方法、装置、电子设备和存储介质
CN115706766B (zh) * 2021-08-12 2023-12-15 荣耀终端有限公司 视频处理方法、装置、电子设备和存储介质
CN113766260A (zh) * 2021-08-24 2021-12-07 武汉瓯越网视有限公司 人脸自动曝光优化方法、存储介质、电子设备及系统
CN114222075A (zh) * 2022-01-28 2022-03-22 广州华多网络科技有限公司 移动端图像处理方法及其装置、设备、介质、产品
CN114500865A (zh) * 2022-01-29 2022-05-13 北京精英路通科技有限公司 补光灯的调控方法、装置、电子设备和存储介质
CN114500865B (zh) * 2022-01-29 2024-04-09 北京精英路通科技有限公司 补光灯的调控方法、装置、电子设备和存储介质
WO2023231479A1 (fr) * 2022-06-01 2023-12-07 同方威视科技江苏有限公司 Procédé et appareil de détection de pupille, ainsi que support de stockage et dispositif électronique
CN116739014A (zh) * 2022-09-15 2023-09-12 荣耀终端有限公司 一种扫码方法及电子设备
CN116894984A (zh) * 2023-09-07 2023-10-17 中海物业管理有限公司 一种基于图像识别的入户访问方法和计算机可读存储介质
CN116894984B (zh) * 2023-09-07 2023-12-26 中海物业管理有限公司 一种基于图像识别的入户访问方法和计算机可读存储介质

Also Published As

Publication number Publication date
CN109068067B (zh) 2020-06-12
CN109068067A (zh) 2018-12-21

Similar Documents

Publication Publication Date Title
WO2020038069A1 (fr) Procédé et dispositif de commande d'exposition, et appareil électronique
WO2020038072A1 (fr) Procédé et dispositif de contrôle d'exposition, et dispositif électronique
US11582400B2 (en) Method of image processing based on plurality of frames of images, electronic device, and storage medium
JP6911202B2 (ja) 撮像制御方法および撮像装置
AU2019326496B2 (en) Method for capturing images at night, apparatus, electronic device, and storage medium
CN110072052B (zh) 基于多帧图像的图像处理方法、装置、电子设备
CN110062160B (zh) 图像处理方法和装置
CN108683862B (zh) 成像控制方法、装置、电子设备及计算机可读存储介质
CN109788207B (zh) 图像合成方法、装置、电子设备及可读存储介质
WO2020057199A1 (fr) Procédé et dispositif d'imagerie, et dispositif électronique
WO2020034737A1 (fr) Procédé de commande d'imagerie, appareil, dispositif électronique et support d'informations lisible par ordinateur
CN109194882B (zh) 图像处理方法、装置、电子设备及存储介质
WO2020029732A1 (fr) Procédé et appareil de photographie panoramique, et dispositif d'imagerie
CN110290289B (zh) 图像降噪方法、装置、电子设备以及存储介质
WO2020207261A1 (fr) Procédé et appareil de traitement d'images basés sur de multiples trames d'images, et dispositif électronique
CN110191291B (zh) 基于多帧图像的图像处理方法和装置
CN110166708B (zh) 夜景图像处理方法、装置、电子设备以及存储介质
CN110248106B (zh) 图像降噪方法、装置、电子设备以及存储介质
CN110166707B (zh) 图像处理方法、装置、电子设备以及存储介质
US11490024B2 (en) Method for imaging controlling, electronic device, and non-transitory computer-readable storage medium
WO2020038087A1 (fr) Procédé et appareil de commande photographique dans un mode de scène de super nuit et dispositif électronique
CN108683861A (zh) 拍摄曝光控制方法、装置、成像设备和电子设备
CN110166706B (zh) 图像处理方法、装置、电子设备以及存储介质
CN108833802B (zh) 曝光控制方法、装置和电子设备
CN110264420B (zh) 基于多帧图像的图像处理方法和装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19852464

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19852464

Country of ref document: EP

Kind code of ref document: A1