WO2020038072A1 - Exposure control method and device, and electronic device - Google Patents

Exposure control method and device, and electronic device

Info

Publication number
WO2020038072A1
Authority
WO
WIPO (PCT)
Prior art keywords
exposure
night scene
frame
image
jitter
Prior art date
Application number
PCT/CN2019/090476
Other languages
English (en)
Chinese (zh)
Inventor
胡孔勇
Original Assignee
Oppo广东移动通信有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Oppo广东移动通信有限公司 filed Critical Oppo广东移动通信有限公司
Publication of WO2020038072A1

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70: Circuitry for compensating brightness variation in the scene
    • H04N23/71: Circuitry for evaluating the brightness variation
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70: Circuitry for compensating brightness variation in the scene
    • H04N23/741: Circuitry for compensating brightness variation in the scene by increasing the dynamic range of the image compared to the dynamic range of the electronic image sensors

Definitions

  • the present application relates to the technical field of mobile terminals, and in particular, to an exposure control method, device, and electronic device.
  • This application is intended to solve at least one of the technical problems in the related art.
  • this application proposes an exposure control method.
  • The night scene mode for shooting is determined according to the degree of shake, and the exposure parameters used for each frame of the image in the currently adopted night scene mode are determined, thereby dynamically adjusting the night scene mode and exposure parameters based on different shooting scenes and improving the image quality of night scene shooting.
  • the application proposes an exposure control device.
  • the present application proposes an electronic device.
  • the present application proposes a computer-readable storage medium.
  • An embodiment of one aspect of the present application provides an exposure control method, including:
  • determining that a current shooting scene belongs to a night scene; identifying, according to a degree of jitter of the imaging device, a night scene mode applicable to the current shooting scene; determining, according to the night scene mode, an exposure parameter of each frame of image to be acquired; and performing exposure control by using the exposure parameter.
  • An embodiment of another aspect of the present application provides an exposure control device, including:
  • a scene determination module configured to determine that a current shooting scene belongs to a night scene
  • a recognition module configured to identify a night scene mode applicable to the current shooting scene according to the degree of jitter of the imaging device
  • a parameter determining module configured to determine an exposure parameter of an image to be acquired in each frame according to the night scene mode
  • a control module configured to perform exposure control by using the exposure parameter.
  • An embodiment of another aspect of the present application provides an electronic device, including: a memory, a processor, and a computer program stored in the memory and executable on the processor.
  • When the processor executes the program, the exposure control method according to the foregoing aspect is implemented.
  • Another embodiment of the present application provides a computer-readable storage medium on which a computer program is stored.
  • When the instructions in the storage medium are executed by a processor, the exposure control method according to the foregoing aspect is implemented.
  • FIG. 1 is a schematic flowchart of an exposure control method according to an embodiment of the present application
  • FIG. 2 is a schematic flowchart of another exposure control method according to an embodiment of the present application.
  • FIG. 3 is a schematic flowchart of another exposure control method according to an embodiment of the present application.
  • FIG. 4 is a schematic flowchart of still another exposure control method according to an embodiment of the present application.
  • FIG. 5 is a schematic structural diagram of an exposure control apparatus according to an embodiment of the present application.
  • FIG. 6 is a schematic diagram of the internal structure of the electronic device 200 in one embodiment.
  • FIG. 7 is a schematic diagram of an image processing circuit 90 in one embodiment.
  • This embodiment provides an exposure control method: it is determined that the current shooting scene belongs to a night scene, the night scene mode applicable to the current shooting scene is identified according to the degree of shaking of the imaging device, the exposure parameters of the images to be collected in each frame are determined according to the night scene mode, and exposure control is performed using those exposure parameters, so that different night scene modes and exposure parameters are used based on different shooting scenes when shooting night scenes, improving the shooting effect.
  • FIG. 1 is a schematic flowchart of an exposure control method according to an embodiment of the present application.
  • the method includes the following steps:
  • Step 101 Determine that the current shooting scene belongs to a night scene.
  • As a possible implementation, an image acquisition module is used to obtain a preview image of the current scene, image feature extraction is performed on the preview image, the extracted image features are input to a recognition model, and it is determined, according to the scene type output by the recognition model, that the current shooting scene is a night scene, where the recognition model has learned the correspondence between image features and scene types.
  • a user operation for scene switching is detected, and when a user operation switched to a night scene is detected, the ambient brightness is detected to obtain brightness information.
  • The light metering module built into the electronic device detects the current ambient brightness and determines the brightness information of the current environment; according to the brightness information, it is determined whether the current shooting scene belongs to a night scene. For example, the brightness level can be measured by the brightness index Lux_index, where a larger value of the brightness information indicates a lower brightness of the current scene.
  • the obtained brightness information is compared with a preset brightness value. If the obtained brightness information is greater than the preset brightness value, it is determined that the current shooting scene belongs to a night scene.
  • the obtained brightness information is less than the preset brightness value, it is determined that the current shooting scene belongs to a non-night scene.
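The threshold comparison above can be sketched as follows. This is an illustrative sketch only: the function name and the concrete threshold value are assumptions, since the text only states that a brightness index above a preset value indicates a night scene.

```python
# Hypothetical sketch of the night-scene decision described above.
# The threshold value is an assumption; the text only states that a
# larger brightness-index value (e.g. Lux_index) means a darker scene.
NIGHT_SCENE_THRESHOLD = 400  # assumed preset brightness value

def is_night_scene(lux_index: float) -> bool:
    # A larger brightness-index value indicates a darker scene.
    return lux_index > NIGHT_SCENE_THRESHOLD
```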
  • In that case, imaging is performed in a high dynamic range (HDR) mode, in which multiple frames are acquired with different exposure compensation values to obtain a higher dynamic range. For example, 3 frames of images can be acquired, and the range of the exposure compensation values is [-4, +1] EV.
  • Step 102 Identify a night scene mode applicable to the current shooting scene according to the degree of shaking of the imaging device.
  • Specifically, the degree of jitter of the imaging device is obtained. If the degree of jitter is greater than or equal to a first jitter threshold, a single-frame night scene mode is adopted; if the degree of jitter is less than the first jitter threshold and greater than a second jitter threshold, a handheld night scene mode is adopted; and if the degree of jitter is less than or equal to the second jitter threshold, a tripod night scene mode is adopted. Here, the first jitter threshold is greater than the second jitter threshold, the number of frames to be collected in the handheld night scene mode is greater than one, and the number of frames to be collected in the tripod night scene mode is greater than that in the handheld night scene mode.
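The three-way mode selection can be sketched as follows. The threshold values are illustrative assumptions (the text only fixes their ordering), and the frame-count comments use the 7-frame and 17-frame examples given elsewhere in this document.

```python
# Sketch of the jitter-based mode selection. The two threshold values
# are assumed; the text only requires FIRST > SECOND.
FIRST_JITTER_THRESHOLD = 8.0
SECOND_JITTER_THRESHOLD = 2.0

def select_night_mode(jitter_degree: float) -> str:
    if jitter_degree >= FIRST_JITTER_THRESHOLD:
        return "single_frame"  # jitter too large to merge multiple frames
    if jitter_degree > SECOND_JITTER_THRESHOLD:
        return "handheld"      # multi-frame capture, e.g. 7 frames
    return "tripod"            # most frames, e.g. 17 frames
```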
  • Step 103 Determine the exposure parameters of the images to be collected in each frame according to the night scene mode.
  • the exposure parameters include exposure duration, sensitivity, and exposure compensation.
  • the preset sensitivity of each frame of the image to be acquired is determined according to the night scene mode, wherein the preset sensitivity in the handheld night scene mode is greater than the preset sensitivity in the tripod night scene mode.
  • A preset exposure compensation value for each frame of the image to be acquired is determined according to the night scene mode; a reference exposure amount is determined based on the brightness information of the preview image; the target exposure amount of each frame of the image to be acquired is determined based on the reference exposure amount and the preset exposure compensation value of that frame; and the exposure duration of each frame of the image to be acquired is determined according to its target exposure amount and preset sensitivity.
  • In the hand-held night scene mode, since the shake is large, the number of frames of the image captured is reduced as much as possible in order to improve the imaging quality and avoid the afterimage caused by the shake.
  • In addition, the interval between the exposure compensation values of the multiple frames of images is set small, so that the highlight areas and low-light areas in the final composite image transition smoothly.
  • As a result, the value range of the exposure compensation value in the handheld night scene mode is often set small, so that the exposure compensation value range in the handheld night scene mode is smaller than that in the tripod night scene mode.
  • the degree of jitter in the handheld night scene mode can be further subdivided.
  • the degree of jitter has a positive relationship with the sensitivity in the exposure parameter, that is, the greater the degree of jitter, the greater the sensitivity;
  • The degree of jitter has an inverse relationship with the exposure duration in the exposure parameters; that is, the greater the degree of jitter, the shorter the exposure duration. This is because, although a lower sensitivity produces less noise in the image, at the same exposure amount a lower sensitivity requires a longer exposure duration.
  • The degree of jitter also has an inverse relationship with the value range of the exposure compensation value in the exposure parameters. This is because, when the degree of jitter is greater, the number of captured frames can be appropriately reduced in order to improve the imaging quality and avoid the afterimage caused by the jitter.
  • Therefore, the greater the degree of jitter, the smaller the interval between the exposure compensation values of the multiple frames of images; and the smaller the number of captured frames, the smaller the value range of the exposure compensation values.
  • the correspondence relationship is queried to obtain the exposure parameters of the images to be acquired for each frame.
  • Step 104 Use exposure parameters to perform exposure control.
  • the exposure control is performed using the determined exposure parameters of the images to be acquired in each frame.
  • With the exposure control method of this embodiment, it is determined that the current shooting scene belongs to a night scene, the night scene mode applicable to the current shooting scene is identified according to the degree of jitter of the imaging device, and exposure control is performed accordingly, so as to dynamically adjust the night scene mode and exposure parameters based on different shooting scenes and improve the imaging quality of night scene shooting. This solves the technical problem in the related art that the night shooting mode is single, cannot be applied to all shooting scenes, and causes poor shooting quality in some scenes.
  • this embodiment provides another exposure control method, and further clearly explains that in step 102, the night scene mode applicable to the current shooting scene is identified according to the degree of jitter of the imaging device.
  • FIG. 2 is a schematic flowchart of another exposure control method provided by an embodiment of the present application.
  • step 102 may include the following sub-steps:
  • Step 1021 Obtain the degree of jitter of the imaging device.
  • a sensor is provided in the imaging device to acquire the collected displacement information, and according to the displacement information, a degree of jitter of the imaging device is determined.
  • the sensor may be a gyroscope, and the gyroscope may output displacement information of three axes of x, y, and z.
  • The absolute values of the displacement information on the three axes are summed, and the sum of the displacement information corresponding to the three axes is denoted by S. The value of S is used to indicate the degree of jitter of the imaging device; that is, the value of the degree of jitter is equal to the value of S.
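The computation of S can be sketched as below, assuming, per the text, that the gyroscope reports per-axis displacement; the function name is illustrative.

```python
def jitter_degree(x: float, y: float, z: float) -> float:
    # S = |x| + |y| + |z|, the sum of absolute displacements on the three axes.
    return abs(x) + abs(y) + abs(z)
```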
  • step 1022 it is determined whether the degree of jitter is greater than or equal to the first jitter threshold. If yes, go to step 1023; if no, go to step 1024.
  • step 1023 it is determined that a single-frame night scene mode is adopted.
  • If the degree of jitter is greater than or equal to the first jitter threshold, the current jitter is large, and a single-frame night scene mode is directly used to acquire a single frame of image. This is because, when the degree of jitter is large, if multiple frames of images are acquired, the frames may differ greatly from one another and cannot be synthesized. Therefore, when the jitter is large, only the single-frame night scene mode is used to acquire a single frame of image.
  • Step 1024: Determine whether the degree of jitter is less than the first jitter threshold and greater than the second jitter threshold. If yes, go to step 1025; if no, go to step 1026.
  • the second jitter threshold is smaller than the first jitter threshold.
  • step 1025 it is determined that the handheld night scene mode is adopted.
  • If the degree of jitter is not greater than or equal to the first jitter threshold, the current jitter is not very large. It is further judged whether the current degree of jitter is less than the first jitter threshold and greater than the second jitter threshold; if so, a handheld night scene mode is adopted.
  • The number of frames of the image to be acquired in the handheld night scene mode is greater than one; for example, it can be 7 frames, where the exposure parameters used for each frame are not exactly the same. The method of determining the exposure parameters is detailed in the following embodiments.
  • step 1026 it is determined that the night scene mode of the tripod is adopted.
  • a tripod night scene mode is used.
  • In the tripod night scene mode, the number of frames of the image to be acquired is larger than in the hand-held night scene mode. In the tripod night scene mode the corresponding jitter is very small, and the imaging device itself has anti-shake strategies such as Optical Image Stabilization (OIS); therefore the residual jitter is very small, and a long exposure can be used to obtain a high-quality picture. In this mode, many frames of images can thus be taken continuously, for example 17 frames, to improve the imaging quality, where the exposure parameters used for each frame are not exactly the same. The method of determining the exposure parameters is detailed in the following embodiments.
  • In this embodiment, the degree of shake of the imaging device is obtained, and according to the magnitude of the degree of shake, the night scene mode to be adopted is determined to be a tripod night scene mode, a handheld night scene mode, or a single-frame night scene mode. By refining the night scene mode in this way, different night scene modes are used for different shooting scenes, which improves the imaging quality of the collected images.
  • an embodiment of the present application further proposes an exposure control method, which further clearly explains how to determine the exposure parameters of the images to be collected in each frame according to the determined night scene mode.
  • FIG. 3 is a schematic flowchart of another exposure control method provided by an embodiment of the present application.
  • Step 1031 determine the preset sensitivity of the image to be collected for each frame.
  • If the night scene mode is the handheld night scene mode, a relatively high preset sensitivity is determined for each frame of the image to be acquired; if the night scene mode is a tripod night scene mode, a relatively low preset sensitivity is determined.
  • the preset sensitivities of the images to be collected in each frame in the handheld night scene mode may be completely the same, or there may be small differences, which are not limited in this embodiment.
  • The principle of determining the preset sensitivity in the tripod night scene mode is the same as in the hand-held night scene mode, and will not be repeated here.
  • However, the preset sensitivity in the tripod night scene mode is smaller than that in the handheld night scene mode. This is because the degree of jitter in the tripod night scene mode is smaller than in the handheld night scene mode; therefore, when acquiring each frame of the image to be collected, a lower sensitivity can be used, thereby extending the exposure duration, reducing the noise of the image to be collected, and improving the imaging quality.
  • Step 1032 Determine a preset exposure compensation value for each frame of the image to be acquired according to the night scene mode.
  • If the night scene mode is the tripod night scene mode, the range of the exposure compensation value (EV, Exposure Compensation Value) is set to [-6, +2] EV, and the interval between the exposure compensation values of adjacent frames is set to 0.5 EV.
  • If the night scene mode is the handheld night scene mode, the value range is set to [-6, +1] EV; for example, for the 7 frames of images to be acquired, the corresponding exposure compensation values can be [-6, -3, 0, +1, +1, +1, +1] EV.
  • The exposure compensation values can also be changed according to the ambient brightness and the captured images, and the exposure range can be reduced; for example, the range of the exposure compensation value is adjusted to [-5, +1] EV.
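The compensation-value ladders above can be sketched as follows. The values come from the examples in the text; note that stepping [-6, +2] EV in 0.5 EV increments yields exactly 17 values, matching the 17-frame tripod example.

```python
# Exposure-compensation ladders using the example values from the text.
HANDHELD_EVS = [-6, -3, 0, 1, 1, 1, 1]  # 7 frames, range [-6, +1] EV

def tripod_evs(lo: float = -6.0, hi: float = 2.0, step: float = 0.5) -> list:
    # Evenly spaced compensation values over [lo, hi] in `step` EV increments.
    n = int(round((hi - lo) / step)) + 1
    return [lo + i * step for i in range(n)]
```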
  • Step 1033 Determine a reference exposure amount according to the brightness information of the preview image.
  • the reference exposure amount is determined according to the brightness information of the preview image.
  • The brightness information of the preview image corresponding to the current shooting scene is measured by the photometry module in the electronic device, and the measured brightness information is converted using a set, relatively low sensitivity to determine the reference exposure amount, denoted EV0.
  • For example, if the sensitivity measured by the photometric module is 500 ISO with an exposure duration of 50 milliseconds (ms), and the target sensitivity is 100 ISO, then the converted sensitivity is 100 ISO and the exposure duration is 250 ms, since the exposure amount (the product of sensitivity and exposure duration) is kept constant. The combination of 100 ISO and 250 ms is taken as the reference exposure amount EV0.
  • It should be noted that EV0 is not a fixed value, but a value that changes according to the brightness information of the preview image; when the brightness information of the preview image changes, the reference exposure amount EV0 also changes.
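The conversion in the example above holds the exposure amount (the product of sensitivity and duration) constant while rescaling to the lower target sensitivity. A minimal sketch, with illustrative names:

```python
def to_reference_exposure(metered_iso: float, metered_ms: float,
                          target_iso: float = 100.0) -> tuple:
    # Keep iso * duration constant: 500 ISO x 50 ms -> 100 ISO x 250 ms.
    return target_iso, metered_ms * metered_iso / target_iso
```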
  • Step 1034 Determine the target exposure amount of the image to be acquired for each frame according to the reference exposure amount and the preset exposure compensation value of the image to be acquired for each frame.
  • For example, the preset exposure compensation values corresponding to the 7 frames of images to be acquired are -6 EV, -3 EV, 0 EV, +1 EV, +1 EV, +1 EV, and +1 EV, where "+" indicates that the exposure is increased relative to the reference exposure set by metering and "-" indicates that the exposure is decreased.
  • The corresponding number is the number of stops of exposure compensation. According to the preset exposure compensation value of each frame of the image to be acquired and the reference exposure amount, the target exposure amount of each frame to be acquired is determined.
  • For example, if the exposure compensation value of a frame image is -6 EV and the reference exposure amount is EV0, the target exposure amount of that frame is determined as EV0 × 2^(-6), that is, EV0/64, which reduces the brightness of that frame's capture; if the exposure compensation value of a frame image is +1 EV and the reference exposure amount is EV0, the target exposure amount of that frame is determined as EV0 × 2, that is, twice EV0, which increases the brightness of that frame's capture.
  • the method of confirming the target exposure of the images to be collected in other frames is the same, and not listed here.
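The per-frame target exposure follows the standard EV relation, each stop of compensation doubling or halving the reference exposure amount EV0. A sketch with illustrative names:

```python
def target_exposure(ev0: float, compensation_ev: float) -> float:
    # -6 EV -> EV0 / 64; +1 EV -> 2 * EV0, as in the examples above.
    return ev0 * 2.0 ** compensation_ev
```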
  • Step 1035 Determine the exposure duration of the image to be acquired in each frame according to the target exposure of the image to be acquired in each frame and the preset sensitivity of the image to be acquired in each frame.
  • In the night scene mode, the aperture value is fixed when collecting each frame of image. Therefore, the target exposure amount is determined by the sensitivity and the exposure duration, and given the target exposure amount and the preset sensitivity, the corresponding exposure duration can be determined.
  • For example, if the sensitivity (ISO) value and exposure duration corresponding to the reference exposure amount are 100 ISO and 250 ms respectively, the preset sensitivity of a frame to be captured is 100 ISO, and the exposure compensation value is -3 EV, then the target exposure duration is 250 ms × 2^(-3), that is, approximately 31 ms; that is, the exposure duration is reduced. If the exposure compensation value is +1 EV, the obtained exposure duration is 500 ms; that is, the exposure duration is increased. In this way, the exposure duration of each frame can be determined.
  • the minimum exposure duration supported by the shutter is 10 milliseconds (ms).
  • In some scenes, the metering device may mistakenly assume that the current scene is brighter than it is, so that the determined reference exposure amount is small; that is, the exposure duration corresponding to the reference exposure amount is short. The calculated exposure duration may then be lower than the preset minimum exposure duration of 10 ms, for example 8 ms; in this case the exposure duration is increased from 8 ms to 10 ms, and the corresponding magnification compensation value is determined to ensure that even the darkest frame has a certain shooting brightness.
  • The brightness of each frame of the image to be acquired is increased according to this magnification compensation value, so that the brightness of the acquired images increases linearly and, when the acquired images are subsequently synthesized, the transition of the halo is natural, improving the effect of the synthesized image.
  • In addition, the maximum single-frame exposure duration calculated from the reference exposure amount may be greater than the maximum value set for the exposure duration, for example 5 seconds. For example, suppose the sensitivity corresponding to the reference exposure amount is 100 ISO, the exposure duration is 2 seconds, the corresponding exposure compensation value is +2 EV, and the preset sensitivity of the image to be captured is also 100 ISO. The calculated exposure duration is then 8 seconds, which exceeds the preset maximum of 5 seconds; the exposure duration of the frame must be reduced to the preset maximum of 5 seconds, the reduction ratio determined, and the sensitivity adjusted by this ratio to prevent the exposure duration from being too long.
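The duration calculation with the shutter limits described above can be sketched as follows. This is a simplified sketch: the limits (10 ms, 5 s) come from the text, but the compensating ISO/brightness rescaling applied when a clamp is hit is omitted here.

```python
MIN_EXPOSURE_MS = 10.0     # shortest supported shutter duration
MAX_EXPOSURE_MS = 5000.0   # preset single-frame maximum (5 s)

def exposure_duration_ms(ref_iso: float, ref_ms: float,
                         frame_iso: float, compensation_ev: float) -> float:
    # Duration realising the target exposure at the frame's preset ISO,
    # then clamped to the supported shutter range.
    ms = ref_ms * (ref_iso / frame_iso) * 2.0 ** compensation_ev
    return min(max(ms, MIN_EXPOSURE_MS), MAX_EXPOSURE_MS)
```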
  • If the night scene mode is a single-frame night scene mode, it is difficult to acquire high-quality images due to the large jitter. Therefore, only one frame of image is captured, and its exposure duration is shorter than in the handheld night scene mode and the tripod night scene mode.
  • the exposure duration is used to perform image exposure control on the image to be acquired, which will not be described in detail in this embodiment.
  • In this embodiment, the preset sensitivity and the preset exposure compensation value of each frame to be acquired are determined according to the night scene mode, a reference exposure amount is determined based on the brightness information of the preview image, the target exposure amount corresponding to each frame of the image to be acquired is determined based on the reference exposure amount and the preset exposure compensation value, and the exposure duration is determined according to the target exposure amount and the preset sensitivity of each frame, thereby determining the exposure parameters of each frame of the image to be acquired.
  • Different exposure parameters are set in different night scene modes: in the handheld night scene mode, with large shake, multiple frames of images are acquired with reduced exposure durations, while in the tripod night scene mode multiple frames are acquired with increased exposure durations. In this way, the exposure parameters are dynamically adjusted during night scene shooting, improving the imaging quality of night scene shooting.
  • FIG. 4 is a schematic flowchart of another exposure control method provided by an embodiment of the present application. As shown in FIG. 4, after step 104, the method It can also include the following steps:
  • Step 401 Acquire each frame image collected under exposure control, and synthesize each frame image to obtain an imaging image.
  • Specifically, each frame of image acquired under the control of the corresponding exposure parameters is obtained, the acquired images are aligned to eliminate the influence of jitter, moving objects in the images are detected to eliminate ghosting, and then the corresponding pixels in each frame of image are weighted and synthesized to obtain a corresponding single-frame target image.
  • Because the exposure parameters used for each acquired frame are different and correspond to different exposure durations, by combining the frames, the dark portions of the final output imaging image can be compensated by the corresponding pixel information in the frames with longer exposure durations, and the bright portions can be suppressed by the corresponding pixel information in the frames with shorter exposure durations.
  • Moreover, the amplitude and position of the noise generated by the current are random; therefore, when multiple images are superimposed and synthesized, the noise tends to cancel out, thereby improving the imaging quality.
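The weighted merge of aligned frames can be illustrated with a one-dimensional stand-in. Real pipelines operate on 2-D images after alignment and ghost removal; the equal default weights are an assumption.

```python
def synthesize(frames, weights=None):
    # Pixel-wise weighted average of aligned frames (lists of equal length).
    if weights is None:
        weights = [1.0 / len(frames)] * len(frames)
    return [sum(w * frame[i] for w, frame in zip(weights, frames))
            for i in range(len(frames[0]))]
```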
  • With the exposure control method of the embodiment of the present application, it is determined that the current shooting scene belongs to a night scene, the night scene mode applicable to the current shooting scene is identified according to the degree of jitter of the imaging device, the exposure parameters of each frame to be collected are determined according to the night scene mode, exposure control is performed using these parameters, and the acquired multi-frame images are synthesized to obtain an imaging image. In this way, multiple frames of images are obtained and composited using different night scene modes and exposure parameters based on different shooting scenes, retaining the details of the highlights and shadows and their transitions, which improves the imaging effect.
  • the present application also proposes an exposure control device.
  • FIG. 5 is a schematic structural diagram of an exposure control apparatus according to an embodiment of the present application.
  • the device includes a scene determination module 51, an identification module 52, a parameter determination module 53, and a control module 54.
  • the scene determining module 51 is configured to determine that a current shooting scene belongs to a night scene.
  • the identification module 52 is configured to identify a night scene mode applicable to the current shooting scene according to the degree of shaking of the imaging device.
  • a parameter determining module 53 is configured to determine an exposure parameter of an image to be acquired in each frame according to a night scene mode.
  • the control module 54 is configured to perform exposure control by using an exposure parameter.
  • the apparatus further includes: a synthesis module.
  • a synthesizing module is configured to acquire each frame image collected under exposure control; and synthesize each frame image to obtain an imaging image.
  • the foregoing identification module 52 may further include: an obtaining unit and a determining unit.
  • the obtaining unit is configured to obtain a degree of jitter of the imaging device.
  • The determining unit is configured to determine that a single-frame night scene mode is adopted if the degree of jitter is greater than or equal to the first jitter threshold; that a hand-held night scene mode is adopted if the degree of jitter is less than the first jitter threshold and greater than the second jitter threshold; and that a tripod night scene mode is adopted if the degree of jitter is less than or equal to the second jitter threshold;
  • the first jitter threshold is greater than the second jitter threshold, the number of frames of the image to be collected in the handheld night scene mode is greater than one frame, and the number of frames of the image to be collected in the tripod night scene mode is greater than the number of frames in the handheld night scene mode.
  • the obtaining unit is specifically configured to obtain the collected displacement information from a sensor provided on the imaging device; and determine the degree of jitter of the imaging device according to the displacement information.
  • the foregoing parameter determining module 53 is specifically configured to:
  • determine, according to the night scene mode, the preset sensitivity of each frame of the image to be acquired, wherein the preset sensitivity in the handheld night scene mode is greater than the preset sensitivity in the tripod night scene mode.
  • the foregoing parameter determining module 53 is further specifically configured to:
  • determine, according to the night scene mode, a preset exposure compensation value for each frame of the image to be acquired; determine a reference exposure amount according to the brightness information of the preview image; determine the target exposure amount of each frame of the image to be acquired according to the reference exposure amount and the preset exposure compensation value of that frame; and determine the exposure duration of each frame of the image to be acquired according to its target exposure amount and preset sensitivity.
  • the value range of the exposure compensation value in the handheld night scene mode is smaller than the value range of the exposure compensation value in the tripod night scene mode.
  • the above-mentioned parameter determination module 53 may also be specifically used for:
  • query, according to the degree of jitter of the imaging device, the correspondence relationship to obtain the exposure parameters of each frame of the image to be acquired, wherein the exposure parameters include the exposure duration, sensitivity, and exposure compensation value.
  • The degree of jitter has a positive relationship with the sensitivity in the exposure parameters; the degree of jitter has an inverse relationship with the exposure duration in the exposure parameters; and the degree of jitter has an inverse relationship with the value range of the exposure compensation value in the exposure parameters.
  • the foregoing scenario determining module 51 is specifically configured to extract image features from the preview image, input the extracted image features into the recognition model, and determine that the current shooting scene belongs to a night scene according to the scene type output by the recognition model, wherein the recognition model has learned the correspondence between image features and scene types.
  • the foregoing scenario determining module 51 is further specifically configured to detect a user operation for scene switching; when a user operation switching to night scene shooting is detected, detect the ambient brightness to obtain brightness information, and determine, according to the brightness information, that the current shooting scene belongs to a night scene.
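A minimal sketch of the brightness check described above: when the user switches to night shooting, the ambient light level confirms the scene. The lux threshold, mode names, and sensor-reading callback are hypothetical, introduced only for illustration.

```python
NIGHT_LUX_THRESHOLD = 10.0  # illustrative dim-light level, not from the patent

def is_night_scene(ambient_lux):
    """Decide from brightness information whether the current scene is a night scene."""
    return ambient_lux < NIGHT_LUX_THRESHOLD

def on_scene_switch(user_selected_mode, read_ambient_lux):
    """When the user switches to night shooting, confirm it with the light sensor."""
    if user_selected_mode != "night":
        return False
    return is_night_scene(read_ambient_lux())
```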
  • with the exposure control device of this embodiment, it is determined that the current shooting scene belongs to a night scene, and the night scene mode applicable to the current shooting scene is identified according to the degree of jitter of the imaging device;
  • according to the night scene mode, the exposure parameters of each frame of images to be collected are determined, and exposure control is performed using these exposure parameters, so that different shooting modes and exposure parameters are applied to different shooting scenes when shooting at night. This improves the shooting effect and solves the technical problem in the related art that a single night shooting mode cannot be applied to all shooting scenes, causing poor shooting quality in some scenes.
  • an embodiment of the present application further provides an electronic device including a memory, a processor, and a computer program stored in the memory and executable on the processor; when the processor executes the program, the exposure control method of the foregoing method embodiment is implemented.
  • FIG. 6 is a schematic diagram of the internal structure of the electronic device 200 in one embodiment.
  • the electronic device 200 includes a processor 60, a memory 50 (for example, a non-volatile storage medium), an internal memory 82, a display screen 83, and an input device 84 connected through a system bus 81.
  • the memory 50 of the electronic device 200 stores an operating system and computer-readable instructions.
  • the computer-readable instructions can be executed by the processor 60 to implement the control method in the embodiment of the present application.
  • the processor 60 is used to provide computing and control capabilities to support the operation of the entire electronic device 200.
  • the internal memory 82 of the electronic device 200 provides an environment for the execution of the computer-readable instructions in the memory 50.
  • the display screen 83 of the electronic device 200 may be a liquid crystal display or an electronic ink display.
  • the input device 84 may be a touch layer covering the display screen 83; a button, a trackball, or a touchpad provided on the housing of the electronic device 200; or an external keyboard, trackpad, or mouse.
  • the electronic device 200 may be a mobile phone, a tablet computer, a notebook computer, a personal digital assistant, or a wearable device (for example, a smart bracelet, a smart watch, a smart helmet, or smart glasses).
  • Those skilled in the art can understand that the structure shown in FIG. 6 is only a schematic diagram of a part of the structure related to the solution of the present application, and does not constitute a limitation on the electronic device 200 to which the solution of the present application is applied.
  • the specific electronic device 200 may include more or fewer components than shown in the figure, or some components may be combined, or have different component arrangements.
  • the electronic device 200 includes an image processing circuit 90.
  • the image processing circuit 90 may be implemented using hardware and/or software components, and may include various processing units that define an ISP (Image Signal Processing) pipeline.
  • FIG. 7 is a schematic diagram of an image processing circuit 90 in one embodiment. As shown in FIG. 7, for ease of description, only aspects of the image processing technology related to the embodiments of the present application are shown.
  • the image processing circuit 90 includes an ISP processor 91 (the ISP processor 91 may be the processor 60) and a control logic 92.
  • the image data captured by the camera 93 is first processed by the ISP processor 91.
  • the ISP processor 91 analyzes the image data to capture image statistical information that can be used to determine one or more control parameters of the camera 93.
  • the camera 93 may include one or more lenses 932 and an image sensor 934.
  • the image sensor 934 may include a color filter array (such as a Bayer filter). The image sensor 934 may obtain light intensity and wavelength information captured by each imaging pixel, and provide a set of raw image data that can be processed by the ISP processor 91.
  • the sensor 94 (such as a gyroscope) may provide acquired image processing parameters (such as image stabilization parameters) to the ISP processor 91 based on the interface type of the sensor 94.
  • the sensor 94 interface may be a SMIA (Standard Mobile Imaging Architecture) interface, other serial or parallel camera interfaces, or a combination of the foregoing interfaces.
  • the image sensor 934 may also send the original image data to the sensor 94.
  • the sensor 94 may provide the original image data to the ISP processor 91 based on the interface type of the sensor 94, or the sensor 94 stores the original image data into the image memory 95.
  • the ISP processor 91 processes the original image data pixel by pixel in a variety of formats. For example, each image pixel may have a bit depth of 8, 10, 12, or 14 bits, and the ISP processor 91 may perform one or more image processing operations on the original image data and collect statistical information about the image data. The image processing operations may be performed with the same or different bit depth accuracy.
  • the ISP processor 91 may also receive image data from the image memory 95.
  • the sensor 94 interface sends the original image data to the image memory 95, and the original image data in the image memory 95 is then provided to the ISP processor 91 for processing.
  • the image memory 95 may be an independent dedicated memory in the memory 50, a part of the memory 50, a storage device, or an electronic device, and may include a DMA (Direct Memory Access) feature.
  • the ISP processor 91 may perform one or more image processing operations, such as time-domain filtering.
  • the processed image data may be sent to the image memory 95 for further processing before being displayed.
  • the ISP processor 91 receives processed data from the image memory 95 and processes the image data in the raw domain and in the RGB and YCbCr color spaces.
  • the image data processed by the ISP processor 91 may be output to a display 97 (the display 97 may include a display screen 83) for viewing by a user and / or further processing by a graphics engine or a GPU (Graphics Processing Unit).
  • the output of the ISP processor 91 can also be sent to the image memory 95, and the display 97 can read image data from the image memory 95.
  • the image memory 95 may be configured to implement one or more frame buffers.
  • the output of the ISP processor 91 may be sent to an encoder / decoder 96 to encode / decode image data.
  • the encoded image data can be saved, and decompressed before being displayed on the display 97.
  • the encoder / decoder 96 may be implemented by a CPU or a GPU or a coprocessor.
  • the statistical data determined by the ISP processor 91 may be sent to the control logic unit 92.
  • the statistical data may include image sensor 934 statistical information such as auto exposure, auto white balance, auto focus, flicker detection, black level compensation, and lens 932 shading correction.
  • the control logic 92 may include a processing element and/or a microcontroller that executes one or more routines (such as firmware); the one or more routines may determine, according to the received statistical data, the control parameters of the camera 93 and the control parameters of the ISP processor 91.
  • the control parameters of the camera 93 may include sensor 94 control parameters (such as gain, integration time for exposure control, and anti-shake parameters), camera flash control parameters, lens 932 control parameters (such as focal distance for focusing or zooming), or a combination of these parameters.
  • the ISP control parameters may include gain levels and color correction matrices for automatic white balance and color adjustment (eg, during RGB processing), and lens 932 shading correction parameters.
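The statistics-driven feedback described in the preceding paragraphs — the ISP processor gathers auto-exposure statistics, and the control logic updates integration time and gain — can be summarized as a sketch. The target-brightness logic, limits, and parameter names below are assumptions for illustration, not the patent's firmware.

```python
def update_exposure_control(mean_luma, integration_time_s, gain,
                            target_luma=0.18, max_time_s=0.1, max_gain=16.0):
    """One iteration of an auto-exposure routine like those run by control logic 92.

    Adjust integration time first; once it saturates, raise sensor gain.
    """
    if mean_luma <= 0:
        raise ValueError("mean luminance must be positive")
    correction = target_luma / mean_luma
    new_time = min(integration_time_s * correction, max_time_s)
    # Whatever correction the integration time could not absorb goes into gain.
    residual = correction * integration_time_s / new_time
    new_gain = min(gain * residual, max_gain)
    return new_time, new_gain
```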
  • the following are the steps of implementing the exposure control method by using the processor 60 in FIG. 6 or the image processing circuit 90 (specifically, the ISP processor 91) in FIG. 7:
  • an embodiment of the present application further provides a computer-readable storage medium on which a computer program is stored; when the instructions in the storage medium are executed by a processor, the exposure control method of the foregoing method embodiment is implemented.
  • first and second are used for descriptive purposes only, and cannot be understood as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Therefore, the features defined as “first” and “second” may explicitly or implicitly include at least one of the features. In the description of the present application, the meaning of "a plurality” is at least two, for example, two, three, etc., unless it is specifically and specifically defined otherwise.
  • Any process or method description in a flowchart, or otherwise described herein, can be understood as representing a module, segment, or portion of code that includes one or more executable instructions for implementing the steps of a custom logic function or process, and the scope of the preferred embodiments of this application includes additional implementations in which the functions may be performed out of the order shown or discussed, including in a substantially simultaneous manner or in the reverse order depending on the functions involved, as should be understood by those skilled in the art to which the embodiments of the present application pertain.
  • Logic and/or steps represented in a flowchart or otherwise described herein, for example, an ordered list of executable instructions that may be considered to implement a logical function, may be embodied in any computer-readable medium for use by, or in combination with, an instruction execution system, apparatus, or device, such as a computer-based system, a system including a processor, or another system that can fetch and execute the instructions from the instruction execution system, apparatus, or device.
  • a "computer-readable medium” may be any device that can contain, store, communicate, propagate, or transmit a program for use by or in connection with an instruction execution system, apparatus, or device.
  • computer-readable media include the following: an electrical connection (electronic device) with one or more wires, a portable computer diskette (magnetic device), random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), a fiber optic device, and portable compact disc read-only memory (CD-ROM).
  • the computer-readable medium may even be paper or another suitable medium on which the program can be printed, because the program can be obtained electronically, for example, by optically scanning the paper or other medium and then editing, interpreting, or otherwise processing it in a suitable manner, and then stored in a computer memory.
  • each part of the application may be implemented by hardware, software, firmware, or a combination thereof.
  • multiple steps or methods may be implemented by software or firmware stored in a memory and executed by a suitable instruction execution system.
  • For example, if implemented in hardware, any one or a combination of the following techniques known in the art may be used: discrete logic circuits with logic gates for implementing logic functions on data signals, application-specific integrated circuits (ASICs) with suitable combinational logic gates, programmable gate arrays (PGA), field programmable gate arrays (FPGA), and so on.
  • a person of ordinary skill in the art can understand that all or part of the steps carried by the methods in the foregoing embodiments can be implemented by a program instructing related hardware.
  • the program can be stored in a computer-readable storage medium.
  • when the program is executed, one of the steps of the method embodiment, or a combination thereof, is performed.
  • each functional unit in each embodiment of the present application may be integrated into one processing module, or each unit may exist separately physically, or two or more units may be integrated into one module.
  • the above integrated modules may be implemented in the form of hardware or software functional modules. If the integrated module is implemented in the form of a software functional module and sold or used as an independent product, it may also be stored in a computer-readable storage medium.
  • the aforementioned storage medium may be a read-only memory, a magnetic disk, or an optical disk.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Abstract

The present invention relates to the technical field of mobile terminals, and in particular to an exposure control method and device, and an electronic device. The method comprises the steps of: determining that a current shooting scene is a night scene; identifying a night scene mode suitable for the current shooting scene according to the degree of jitter of an imaging device; determining exposure parameters for each frame of images to be acquired according to the night scene mode; and performing exposure control using the exposure parameters, so as to dynamically adjust the night scene mode and the exposure parameters on the basis of different shooting scenes. This improves the imaging quality of an image captured in a night scene mode, and solves the technical problem in the related art that a single night shooting mode cannot be applied to all shooting scenes, resulting in poor shooting quality in some scenes.
PCT/CN2019/090476 2018-08-22 2019-06-10 Procédé et dispositif de contrôle d'exposition, et dispositif électronique WO2020038072A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201810962773.6A CN109040609B (zh) 2018-08-22 2018-08-22 曝光控制方法、装置、电子设备和计算机可读存储介质
CN201810962773.6 2018-08-22

Publications (1)

Publication Number Publication Date
WO2020038072A1 true WO2020038072A1 (fr) 2020-02-27

Family

ID=64627979

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/090476 WO2020038072A1 (fr) 2018-08-22 2019-06-10 Procédé et dispositif de contrôle d'exposition, et dispositif électronique

Country Status (2)

Country Link
CN (1) CN109040609B (fr)
WO (1) WO2020038072A1 (fr)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112818732A (zh) * 2020-08-11 2021-05-18 腾讯科技(深圳)有限公司 一种图像处理方法、装置、计算机设备和存储介质
CN112911165A (zh) * 2021-03-02 2021-06-04 杭州海康慧影科技有限公司 内窥镜曝光方法、装置及计算机可读存储介质
CN114222075A (zh) * 2022-01-28 2022-03-22 广州华多网络科技有限公司 移动端图像处理方法及其装置、设备、介质、产品

Families Citing this family (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109040609B (zh) * 2018-08-22 2021-04-09 Oppo广东移动通信有限公司 曝光控制方法、装置、电子设备和计算机可读存储介质
CN108833804A (zh) * 2018-09-20 2018-11-16 Oppo广东移动通信有限公司 成像方法、装置和电子设备
CN109618102B (zh) * 2019-01-28 2021-08-31 Oppo广东移动通信有限公司 对焦处理方法、装置、电子设备及存储介质
CN110060213B (zh) * 2019-04-09 2021-06-15 Oppo广东移动通信有限公司 图像处理方法、装置、存储介质及电子设备
CN110072051B (zh) * 2019-04-09 2021-09-03 Oppo广东移动通信有限公司 基于多帧图像的图像处理方法和装置
CN110113539A (zh) * 2019-06-13 2019-08-09 Oppo广东移动通信有限公司 曝光控制方法、装置、电子设备以及存储介质
CN110536057B (zh) * 2019-08-30 2021-06-08 Oppo广东移动通信有限公司 图像处理方法和装置、电子设备、计算机可读存储介质
CN112532857B (zh) 2019-09-18 2022-04-12 华为技术有限公司 一种延时摄影的拍摄方法及设备
CN110708475B (zh) * 2019-11-27 2021-08-24 维沃移动通信有限公司 一种曝光参数确定方法、电子设备及存储介质
CN111654594B (zh) * 2020-06-16 2022-05-17 Oppo广东移动通信有限公司 图像拍摄方法、图像拍摄装置、移动终端及存储介质
CN111988523B (zh) * 2020-08-14 2022-05-13 RealMe重庆移动通信有限公司 超级夜景图像生成方法及装置、终端和可读存储介质
CN112911109B (zh) * 2021-01-20 2023-02-24 维沃移动通信有限公司 电子设备及拍摄方法
CN113660425B (zh) * 2021-08-19 2023-08-22 维沃移动通信(杭州)有限公司 图像处理方法、装置、电子设备及可读存储介质
CN116723408B (zh) * 2022-02-28 2024-05-14 荣耀终端有限公司 一种曝光控制方法及电子设备

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090051783A1 (en) * 2007-08-23 2009-02-26 Samsung Electronics Co., Ltd. Apparatus and method of capturing images having optimized quality under night scene conditions
CN101795358A (zh) * 2009-01-30 2010-08-04 佳能株式会社 摄像设备及其控制方法
CN103002224A (zh) * 2011-09-09 2013-03-27 佳能株式会社 摄像装置及其控制方法
CN103227896A (zh) * 2012-01-26 2013-07-31 佳能株式会社 电子装置及电子装置控制方法
CN109040609A (zh) * 2018-08-22 2018-12-18 Oppo广东移动通信有限公司 曝光控制方法、装置和电子设备

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101241294B (zh) * 2007-02-06 2010-09-01 亚洲光学股份有限公司 摄像控制方法及其装置
CN101262567B (zh) * 2008-04-07 2010-12-08 北京中星微电子有限公司 自动曝光方法与装置
CN106375676A (zh) * 2016-09-20 2017-02-01 广东欧珀移动通信有限公司 终端设备的拍照控制方法、装置和终端设备
CN108322669B (zh) * 2018-03-06 2021-03-23 Oppo广东移动通信有限公司 图像获取方法及装置、成像装置和可读存储介质

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090051783A1 (en) * 2007-08-23 2009-02-26 Samsung Electronics Co., Ltd. Apparatus and method of capturing images having optimized quality under night scene conditions
CN101795358A (zh) * 2009-01-30 2010-08-04 佳能株式会社 摄像设备及其控制方法
CN103002224A (zh) * 2011-09-09 2013-03-27 佳能株式会社 摄像装置及其控制方法
CN103227896A (zh) * 2012-01-26 2013-07-31 佳能株式会社 电子装置及电子装置控制方法
CN109040609A (zh) * 2018-08-22 2018-12-18 Oppo广东移动通信有限公司 曝光控制方法、装置和电子设备

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112818732A (zh) * 2020-08-11 2021-05-18 腾讯科技(深圳)有限公司 一种图像处理方法、装置、计算机设备和存储介质
CN112818732B (zh) * 2020-08-11 2023-12-12 腾讯科技(深圳)有限公司 一种图像处理方法、装置、计算机设备和存储介质
CN112911165A (zh) * 2021-03-02 2021-06-04 杭州海康慧影科技有限公司 内窥镜曝光方法、装置及计算机可读存储介质
CN112911165B (zh) * 2021-03-02 2023-06-16 杭州海康慧影科技有限公司 内窥镜曝光方法、装置及计算机可读存储介质
CN114222075A (zh) * 2022-01-28 2022-03-22 广州华多网络科技有限公司 移动端图像处理方法及其装置、设备、介质、产品

Also Published As

Publication number Publication date
CN109040609B (zh) 2021-04-09
CN109040609A (zh) 2018-12-18

Similar Documents

Publication Publication Date Title
WO2020038072A1 (fr) Procédé et dispositif de contrôle d'exposition, et dispositif électronique
WO2020038069A1 (fr) Procédé et dispositif de commande d'exposition, et appareil électronique
AU2019326496B2 (en) Method for capturing images at night, apparatus, electronic device, and storage medium
US11582400B2 (en) Method of image processing based on plurality of frames of images, electronic device, and storage medium
JP6911202B2 (ja) 撮像制御方法および撮像装置
WO2020034737A1 (fr) Procédé de commande d'imagerie, appareil, dispositif électronique et support d'informations lisible par ordinateur
CN110072052B (zh) 基于多帧图像的图像处理方法、装置、电子设备
WO2020057199A1 (fr) Procédé et dispositif d'imagerie, et dispositif électronique
CN108683862B (zh) 成像控制方法、装置、电子设备及计算机可读存储介质
CN109788207B (zh) 图像合成方法、装置、电子设备及可读存储介质
WO2020029732A1 (fr) Procédé et appareil de photographie panoramique, et dispositif d'imagerie
CN110191291B (zh) 基于多帧图像的图像处理方法和装置
WO2020207261A1 (fr) Procédé et appareil de traitement d'images basés sur de multiples trames d'images, et dispositif électronique
CN109194882B (zh) 图像处理方法、装置、电子设备及存储介质
CN110166708B (zh) 夜景图像处理方法、装置、电子设备以及存储介质
WO2020038087A1 (fr) Procédé et appareil de commande photographique dans un mode de scène de super nuit et dispositif électronique
CN110248106B (zh) 图像降噪方法、装置、电子设备以及存储介质
CN110166707B (zh) 图像处理方法、装置、电子设备以及存储介质
WO2020034701A1 (fr) Procédé et appareil de commande d'imagerie, dispositif électronique et support de stockage lisible
CN108833802B (zh) 曝光控制方法、装置和电子设备
CN110166709B (zh) 夜景图像处理方法、装置、电子设备以及存储介质
CN110166706B (zh) 图像处理方法、装置、电子设备以及存储介质
CN110264420B (zh) 基于多帧图像的图像处理方法和装置
WO2020034702A1 (fr) Procédé de commande, dispositif, équipement électronique et support d'informations lisible par ordinateur
US11601600B2 (en) Control method and electronic device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19852869

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19852869

Country of ref document: EP

Kind code of ref document: A1