WO2024032033A1 - Video processing method and electronic device - Google Patents
Video processing method and electronic device
- Publication number
- WO2024032033A1 (PCT/CN2023/090835)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- mode
- hdr
- exposure
- target
- Prior art date
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
Definitions
- the present application relates to the field of terminal technology, and in particular, to a video processing method and an electronic device.
- Exposure is the process in which a photosensitive device receives light through the lens to form an image. During shooting, the light and dark intensity of the background or subject changes. Light that is too strong easily leads to overexposure, resulting in an image that is too bright and lacks layers and details; light that is too weak easily leads to underexposure, resulting in an image that is too dark to reflect true colors.
- Dynamic range refers to the ratio of the maximum output signal to the minimum output signal supported by a device, or the grayscale ratio between the upper and lower brightness limits of an image. If the ambient brightness exceeds the upper limit of the dynamic range, the captured image will be too bright; if the ambient brightness falls below the lower limit of the dynamic range, the captured image will be too dark.
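As an illustrative aside (not part of the patent text), the ratio described above is commonly expressed in stops (doublings of light) or decibels; a minimal sketch:

```python
import math

def dynamic_range_stops(max_signal: float, min_signal: float) -> float:
    """Dynamic range in photographic stops (number of doublings of light)."""
    return math.log2(max_signal / min_signal)

def dynamic_range_db(max_signal: float, min_signal: float) -> float:
    """Dynamic range in decibels, as commonly quoted for image sensors."""
    return 20 * math.log10(max_signal / min_signal)

# e.g. a sensor whose largest readable signal is 4096x its noise floor
print(dynamic_range_stops(4096, 1))  # 12.0 stops
print(dynamic_range_db(4096, 1))     # ~72.2 dB
```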
- Embodiments of the present application provide a video processing method that seamlessly and automatically switches between multiple exposure modes according to the actual video shooting conditions, thereby achieving efficient HDR processing in video shooting scenarios and improving video image quality.
- a video processing method, applied to an electronic device, and the method includes:
- the ambient brightness is obtained, and the target HDR exposure method is determined according to a preset strategy.
- the preset strategy includes the correspondence between the dynamic range information corresponding to the video shooting, the strobe state, the ambient brightness, and the target HDR exposure mode;
- the default HDR exposure mode is switched to the target HDR exposure mode, and the video shooting is continued to obtain a second image.
- the dynamic range information here may include dynamic range and/or dynamic range compression gain.
- multiple types of HDR processing solutions can be switched seamlessly according to changes in ambient brightness, required dynamic range, and strobe detection, so that the HDR solution adapted to the actual shooting environment and picture quality requirements is used for image processing, effectively expanding the dynamic range in the recording scene and improving image quality in the recording scene.
- the target HDR exposure method includes at least a first HDR exposure method and a second HDR exposure method, where the first HDR exposure method is a single-frame mode and the second HDR exposure method is a dual-frame mode;
- the images input in the dual-frame mode are fused.
- the single frame mode may be a binning exposure mode, that is, the image sensor outputs a single frame image after exposure.
- Dual-frame mode can be an exposure method such as SHDR, DCG, or DXG, that is, reading out two frames of images based on one exposure and then fusing the two frames to adjust HDR; or obtaining two frames of images with a long exposure and a short exposure and then fusing the two frames to adjust HDR.
- the first HDR exposure method is a binning mode
- the second HDR exposure method includes the interleaved high dynamic range mode SHDR and the DXG mode
- the DXG is a mode in which the dual conversion gain mode DCG and the dual analog gain mode DAG are superimposed.
- the method further includes:
- the target parameters here may include image size, bit depth, etc.
- adjusting the parameters of the images corresponding to different exposure methods to the target parameters can make the images of different exposure methods have consistent image parameters and avoid image jumps when switching between different exposure methods.
- the electronic device includes an automatic exposure control AEC module, an image sensor, and a sensing module,
- acquiring the first image according to the preset default HDR exposure method specifically includes:
- the AEC module sends first indication information to the image sensor, the first indication information being used to instruct the image to be captured using the default HDR exposure mode;
- the image sensor uses the default HDR exposure mode and acquires the first image.
- the electronic device includes an AEC module, an image sensor, and a sensing module,
- the image sensor sends the first image to the sensing module
- the sensing module obtains the ambient brightness according to the first image, and indicates the ambient brightness to the AEC module;
- the AEC module determines the target HDR exposure mode according to a preset strategy based on the ambient brightness.
- the electronic device includes an AEC module, an image sensor, a sensing module and a fusion module,
- fusing the two-frame input images specifically includes:
- the image sensor transmits the image in the dual-frame mode to the fusion module;
- the fusion module fuses the images in the dual-frame mode; or,
- the image sensor fuses the images in the dual-frame mode.
- when the fusion module fuses the images in the dual-frame mode, the fusion is performed according to the sensitivity ratio required by the DXG mode.
- the DCG mode dual-frame input image and the DAG mode dual-frame input image are respectively superimposed according to the target sensitivity ratio to obtain a superimposed dual-frame input image that satisfies the DXG mode sensitivity ratio.
- the dual-frame input image in the DCG mode is superimposed with the dual-frame input image in the DAG mode according to a preset sensitivity ratio.
- determining the target HDR exposure method according to a preset strategy specifically includes:
- the target HDR exposure mode is determined according to the ambient brightness;
- the target HDR exposure mode is determined to be the DXG mode.
- the method further includes:
- the target HDR exposure mode is determined to be the binning mode.
- the electronic device supports the first HDR video mode, and the first HDR video mode includes HDR10 or HDR10+.
- when there is switching between an HDR camera and a non-HDR camera during video shooting, the method further includes:
- the exposure mode of the HDR camera is the first HDR exposure mode
- the exposure mode corresponding to the non-HDR camera is the second HDR exposure mode
- the second dynamic range gain should be the closest to the first dynamic range gain.
- the method further includes:
- a first interface is displayed, and the first interface includes a first control, and the first control is used to enable the function of automatically switching the HDR exposure mode.
- an electronic device, including: one or more processors; one or more memories; the one or more memories store one or more computer programs, and the one or more computer programs include instructions that, when executed by the one or more processors, cause the electronic device to perform the method described in any implementation of the first aspect or the second aspect.
- a computer-readable storage medium stores computer-executable program instructions that, when run on a computer, cause the computer to execute the method described in any implementation of the first aspect or the second aspect.
- a computer program product includes computer program code that, when run on a computer, causes the computer to execute the method described in any implementation of the first aspect or the second aspect.
- FIG. 1 is a schematic diagram of an image captured in an interleaved high dynamic range mode according to an embodiment of the present application.
- FIGS. 2A and 2B are schematic diagrams of images captured through DCG mode and DAG mode provided by embodiments of the present application.
- FIG. 3 is a schematic structural diagram of an electronic device 100 provided by an embodiment of the present application.
- FIG. 4 is a software structure block diagram of an electronic device 100 provided by an embodiment of the present application.
- FIG. 5 is a schematic diagram illustrating differences in relevant parameters corresponding to multiple exposure modes provided by an embodiment of the present application.
- FIGS. 6A to 6D are schematic diagrams of GUIs that may be involved in some video processing processes provided by embodiments of the present application.
- FIG. 7 is a schematic diagram of a video processing method provided by an embodiment of the present application.
- FIG. 8 is a schematic diagram of another video processing method provided by an embodiment of the present application.
- FIG. 9 is a schematic diagram of yet another video processing method provided by an embodiment of the present application.
- FIG. 10 is a schematic diagram of yet another video processing method provided by an embodiment of the present application.
- FIG. 11 is a schematic diagram of yet another video processing method provided by an embodiment of the present application.
- FIG. 12 is a schematic diagram of yet another video processing method provided by an embodiment of the present application.
- Figure 13 is a schematic diagram of adjusting image parameters during video processing provided by an embodiment of the present application.
- Figure 14 is a schematic diagram of adjusting image parameters during video processing provided by an embodiment of the present application.
- FIG. 15 is a schematic diagram of yet another video processing method provided by an embodiment of the present application.
- first and second are used for descriptive purposes only and cannot be understood as indicating or implying relative importance or implicitly indicating the quantity of the indicated technical features. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more such features.
- GUI: graphical user interface.
- Controls can include icons, buttons, menus, tabs, text boxes, dialog boxes, status bars, navigation bars, widgets, etc.
- HDR technology is a group of technologies that can achieve a greater dynamic range of exposure (that is, a greater difference between light and dark), and its purpose is to correctly represent the real-world brightness range from direct sunlight to the darkest shadows.
- the currently photographed object can be exposed by setting multiple sets of exposure values (EV), including exposure at the normal EV value obtained by the current metering calculation, exposure at an EV value lower than normal (EV-n), and exposure at an EV value higher than normal (EV+n). After that, the multiple exposed photos are fused: objects in dark areas are taken from the high-EV photo, and bright objects are taken from the low-EV photo, so that the scene in the entire photo is neither too bright nor too dark.
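As an illustrative aside, the bracket-and-merge idea above can be sketched with a toy exposure-fusion rule (the mid-gray weighting scheme here is an assumption, a stand-in for the more complex fusion algorithms the text refers to):

```python
import numpy as np

def fuse_brackets(frames: list[np.ndarray], sigma: float = 0.2) -> np.ndarray:
    """Naive exposure fusion over EV-bracketed frames normalized to [0, 1].

    Pixels near mid-gray (0.5) get the highest weight, so dark regions are
    drawn mostly from the high-EV frame and highlights from the low-EV frame.
    """
    stack = np.stack([f.astype(np.float32) for f in frames])
    weights = np.exp(-((stack - 0.5) ** 2) / (2 * sigma ** 2))
    weights /= weights.sum(axis=0, keepdims=True)
    return (weights * stack).sum(axis=0)
```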
- the above-mentioned method of enlarging dynamic range by bracketing and merging multiple photos with EV involves a more complex image processing algorithm, which requires a sufficient interval between two frames as calculation time.
- that is, the traditional HDR exposure method can expand the dynamic range only when the time interval between obtaining two consecutive frames of images is long enough.
- in view of this, embodiments of the present application provide a video processing method that seamlessly switches among multiple types of HDR exposure methods according to changes in ambient brightness, dynamic range, strobe, and other factors, so that the HDR solution adapted to the actual shooting status and image quality requirements is used for image collection and processing, effectively expanding the dynamic range in the recording scene and improving the image quality in the recording scene.
- SHDR (staggered high dynamic range), also known as line-interleaved HDR or digital overlap HDR (DOL-HDR).
- SHDR technology can collect multiple frames of images with different exposure times in one acquisition cycle, and then fuse the corresponding long-exposure frames and short-exposure frames into one frame through multi-frame fusion technology to obtain a high dynamic range image.
- the short exposure frame (or "short exposure image") captures the highlight information, while the long exposure frame (or "long exposure image") captures the shadow information and offers excellent noise control; after the two frames are fused, the benefits of both highlights and shadows can be obtained.
- the images collected through SHDR technology can be shown in Figure 1, including one periodically collected long exposure frame (the frame marked with the letter "L" in Figure 1) and one short exposure frame (the frame marked with the letter "S" in Figure 1).
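A minimal sketch of long/short SHDR fusion, assuming the common rule of substituting brightness-matched short-exposure pixels where the long exposure clips (the patent does not specify its exact fusion algorithm):

```python
import numpy as np

def fuse_shdr(long_exp: np.ndarray, short_exp: np.ndarray,
              exposure_ratio: float, sat: float = 1.0) -> np.ndarray:
    """Fuse one long-exposure and one short-exposure frame from an SHDR cycle.

    The short frame is scaled by the exposure ratio to match the long frame's
    brightness, then used wherever the long frame is clipped; the long frame
    supplies the low-noise shadow detail.
    """
    short_scaled = short_exp.astype(np.float32) * exposure_ratio
    return np.where(long_exp >= sat, short_scaled,
                    long_exp.astype(np.float32))
```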
- SHDR may not be able to obtain ideal shooting results in some scenes.
- SHDR needs to collect images under multiple exposure parameters (such as different exposure times) in one cycle, so the blank interval between two frames of images is shorter, leaving less time for image processing.
- the two frames used by SHDR to fuse into one image come from different exposure periods and correspond to different exposure times. The longer the exposure time, the more likely smear (or ghosting) is to be produced, so motion ghosting will inevitably appear during fusion (ideal results can only be achieved in bright scenes where both frames use short exposure times).
- Binning is an image readout mode that adds the charges induced in adjacent pixels together and reads them out as one pixel. For example, during the process of capturing an image by an electronic device, the light reflected by the target object is collected by the camera, so that the reflected light is transmitted to the image sensor.
- the image sensor includes multiple photosensitive elements.
- the charge collected by each photosensitive element is one pixel, and a binning operation is performed on the pixel information.
- binning can merge n ⁇ n pixels into one pixel.
- binning can combine adjacent 2 ⁇ 2 pixels into one pixel, that is, the color of adjacent 2 ⁇ 2 pixels is presented in the form of one pixel.
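A minimal numpy sketch of the 2×2 binning just described (real sensors sum the charge in the analog domain; summing digital samples here only models the effect):

```python
import numpy as np

def bin2x2(raw: np.ndarray) -> np.ndarray:
    """Merge each 2x2 block of a single-channel image into one pixel.

    Height and width must be even; summing models charge addition,
    while a mean would model normalized binning output.
    """
    h, w = raw.shape
    return raw.reshape(h // 2, 2, w // 2, 2).sum(axis=(1, 3))

frame = np.arange(16, dtype=np.uint32).reshape(4, 4)
print(bin2x2(frame))  # 4x4 input reduced to 2x2
```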
- DCG is also an image readout method. It can be understood as the ability to read twice in a pixel circuit, or it can be understood as having two capacitors that store photon energy in the photosensitive unit corresponding to a pixel.
- the DCG involved in the embodiment of the present application may specifically refer to two conversion gain readouts of an image based on one exposure operation.
- DCG can be used to expand the dynamic range.
- the implementation principle is: an image sensor with dual conversion gain (DCG) capability has two potential wells in one pixel, and the two potential wells correspond to different full-well capacities and different conversion gains (CG); a large full-well capacity corresponds to low conversion gain (LCG) and low sensitivity, and a small full-well capacity corresponds to high conversion gain (HCG) and high sensitivity.
- the sensor can use two potential wells (two sensitivities) and two conversion gains in the same scene to acquire two images in one exposure: an image in a high-sensitivity mode and an image in a low-sensitivity mode.
- the electronic device then combines the two acquired images into one image, which is HDR technology.
- the two images read out by DCG can be shown in Figure 2A.
- the LCG frame is a frame of image read out using the LCG gain signal, which can protect the highlight area from being overexposed
- the HCG frame is a frame of the image read out using the HCG gain signal, which can improve the brightness of the shadows and control noise at the same time.
- the two frames of images are fused to obtain the benefits of highlights and shadows, and an image with optimized dynamic range is obtained.
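One common fusion rule for the two DCG readouts, shown as a hedged sketch (the gain ratio and clipping threshold are assumed values; the patent does not fix them): use the low-noise HCG pixel unless it is saturated, otherwise fall back to the LCG pixel brought to the same radiometric scale.

```python
import numpy as np

def fuse_dcg(lcg: np.ndarray, hcg: np.ndarray,
             gain_ratio: float = 4.0, sat: float = 4095.0) -> np.ndarray:
    """Fuse LCG and HCG frames read out from a single exposure.

    The HCG frame is `gain_ratio` times more sensitive, so dividing it by
    the ratio aligns both frames radiometrically; clipped HCG highlights
    are replaced by the corresponding LCG pixels.
    """
    hcg_lin = hcg.astype(np.float32) / gain_ratio
    return np.where(hcg < sat, hcg_lin, lcg.astype(np.float32))
```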
- As another image readout method, DAG is similar to the dual conversion gain DCG introduced above: it also reads out two frames of images through two analog signals based on one exposure. The difference lies in the readout methods of the two.
- the DAG readout mode uses two analog gains for image readout.
- the two analog gains include: low analog gain (LAG) and high analog gain (HAG).
- LAG can protect highlight areas from overexposure; HAG can brighten shadows while controlling noise.
- the images read out twice by DAG can be shown in Figure 2B, where the LAG frame is a frame of image read out using LAG, and the HAG frame is a frame of image read out using HAG.
- the two frames of images are fused to obtain the benefits of highlights and shadows and expand the dynamic range.
- compared with analog gain (AG), conversion gain (CG) has better noise control capabilities.
- DCG and DAG can also be superimposed to expand the dynamic range.
- the superimposed use of DCG and DAG is called DXG.
- DCG and DAG can be superimposed according to sensitivity ratios. For example, if the required sensitivity ratio between the image read out with the high-gain signal and the image read out with the low-gain signal is 1:16, DCG can be used to obtain two frames with a sensitivity ratio of 1:2, and DAG can then be used to obtain two frames with a sensitivity ratio of 1:8; multiplying the two sensitivity ratios yields the 1:16 sensitivity ratio.
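The ratio arithmetic in this example is simply multiplicative, as this one-line sketch makes explicit:

```python
def dxg_ratio(cg_ratio: float, ag_ratio: float) -> float:
    """Overall DXG sensitivity split from stacked CG and AG splits.

    Gains multiply: a 1:2 conversion-gain split stacked with a 1:8
    analog-gain split yields the 1:16 split from the text.
    """
    return cg_ratio * ag_ratio

assert dxg_ratio(2, 8) == 16
```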
- the embodiment of the present application refers to the method of fusing two frames acquired through DXG into one frame in the image sensor as iDXG.
- a RAW image contains the data output by the image sensor of a digital camera, scanner, or film scanner.
- RAW images contain the most original information of the image, without non-linear processing in the image signal processing (ISP) process.
- HDR10 video is configured according to static metadata.
- the PQ conversion curve of the HDR10 is fixedly mapped according to the monitor's baseline display brightness.
- the bit depth of the HDR10 video is 10 bit; the static metadata can meet the definitions in SMPTE ST2086 or other standards.
- HDR10+ is a further improvement based on HDR10.
- HDR10+ supports dynamic metadata; that is, HDR10+ can adjust or enhance image brightness, contrast, color saturation, etc. according to different scenes in the video, so that each frame in an HDR10+ video can have an independently adjusted HDR effect. The bit depth of HDR10+ video is 12 bit; the dynamic metadata can meet the definitions in SMPTE ST2094 or other standards.
- Luminance scenes may also be called brightness levels.
- the brightness scene can be used to determine the exposure method for image acquisition, such as DXG or SHDR.
- brightness scenes can include: highlight scenes, medium scenes, dark light scenes, etc.
- brightness scenes may correspond to different brightness ranges, and the device may distinguish different brightness levels based on the intensity of light reflected by the photographed object.
- the brightness range corresponding to a high-brightness scene can be greater than 50,000 lux
- the brightness range corresponding to a medium-brightness scene can be 10 lux to 50,000 lux
- the brightness range corresponding to a dark-light scene can be 0 lux to 10 lux.
- the brightness levels described in the embodiments of the present application may not be limited to the above three.
- the brightness ranges corresponding to these three brightness scenes are only used as an example.
- the values of the brightness ranges corresponding to different brightness scenes can also be other values, which are not limited in the embodiments of the present application.
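A sketch of a lux-to-scene classifier using the example thresholds above (as the surrounding text notes, real devices may define different levels and boundary values):

```python
def brightness_scene(lux: float) -> str:
    """Classify ambient brightness with the example thresholds from the text."""
    if lux > 50_000:
        return "high-brightness"
    if lux > 10:
        return "medium-brightness"
    return "dark-light"

print(brightness_scene(80_000))  # high-brightness
print(brightness_scene(300))     # medium-brightness
```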
- FIG. 3 shows a schematic structural diagram of an electronic device 100 provided by an embodiment of the present application.
- the electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, a headphone interface 170D, a sensor module 180, a button 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a subscriber identification module (SIM) card interface 195, etc.
- the sensor module 180 may include a pressure sensor 180A, a gyro sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, etc.
- the structure illustrated in the embodiment of the present invention does not constitute a specific limitation on the electronic device 100 .
- the electronic device 100 may include more or fewer components than shown in the figures, or some components may be combined, some components may be separated, or some components may be arranged differently.
- the components illustrated may be implemented in hardware, software, or a combination of software and hardware.
- the processor 110 may include one or more processing units.
- the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc.
- different processing units can be independent devices or integrated in one or more processors.
- the controller may be the nerve center and command center of the electronic device 100 .
- the controller can generate operation control signals based on the instruction operation code and timing signals to complete the control of fetching and executing instructions.
- the processor 110 may also be provided with a memory for storing instructions and data.
- the memory in processor 110 is cache memory. This memory may hold instructions or data that have been recently used or recycled by processor 110 . If the processor 110 needs to use the instructions or data again, it can be called directly from the memory. Repeated access is avoided and the waiting time of the processor 110 is reduced, thus improving the efficiency of the system.
- processor 110 may include one or more interfaces.
- Interfaces may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, etc.
- the wireless communication function of the electronic device 100 can be implemented through the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor and the baseband processor.
- Antenna 1 and Antenna 2 are used to transmit and receive electromagnetic wave signals.
- Each antenna in electronic device 100 may be used to cover a single or multiple communication frequency bands. Different antennas can also be reused to improve antenna utilization. For example: Antenna 1 can be reused as a diversity antenna for a wireless LAN. In other embodiments, antennas may be used in conjunction with tuning switches.
- the mobile communication module 150 can provide solutions for wireless communication including 2G/3G/4G/5G applied on the electronic device 100 .
- the mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (LNA), etc.
- the mobile communication module 150 can receive electromagnetic waves through the antenna 1, perform filtering, amplification and other processing on the received electromagnetic waves, and transmit them to the modem processor for demodulation.
- the mobile communication module 150 can also amplify the signal modulated by the modem processor and convert it into electromagnetic waves through the antenna 1 for radiation.
- at least part of the functional modules of the mobile communication module 150 may be disposed in the processor 110 .
- at least part of the functional modules of the mobile communication module 150 and at least part of the modules of the processor 110 may be provided in the same device.
- a modem processor may include a modulator and a demodulator.
- the modulator is used to modulate the low-frequency baseband signal to be sent into a medium-high frequency signal.
- the demodulator is used to demodulate the received electromagnetic wave signal into a low-frequency baseband signal.
- the demodulator then transmits the demodulated low-frequency baseband signal to the baseband processor for processing.
- the application processor outputs sound signals through audio devices (not limited to speaker 170A, receiver 170B, etc.), or displays images or videos through display screen 194.
- the modem processor may be a stand-alone device.
- the modem processor may be independent of the processor 110 and may be provided in the same device as the mobile communication module 150 or other functional modules.
- the wireless communication module 160 can provide solutions for wireless communication applied on the electronic device 100, including wireless local area networks (WLAN) (such as a wireless fidelity (Wi-Fi) network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR) technology, etc.
- the wireless communication module 160 may be one or more devices integrating at least one communication processing module.
- the wireless communication module 160 receives electromagnetic waves via the antenna 2 , frequency modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110 .
- the wireless communication module 160 can also receive the signal to be sent from the processor 110, frequency modulate it, amplify it, and convert it into electromagnetic waves through the antenna 2 for radiation.
- the antenna 1 of the electronic device 100 is coupled to the mobile communication module 150, and the antenna 2 is coupled to the wireless communication module 160, so that the electronic device 100 can communicate with the network and other devices through wireless communication technology.
- the wireless communication technology may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division synchronous code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technology, etc.
- the GNSS may include global positioning system (GPS), global navigation satellite system (GLONASS), BeiDou navigation satellite system (BDS), quasi-zenith satellite system (QZSS), and/or satellite based augmentation systems (SBAS).
- the audio module 170 is used to convert digital audio information into analog audio signal output, and is also used to convert analog audio input into digital audio signals.
- Speaker 170A also called “speaker” is used to convert audio electrical signals into sound signals.
- Receiver 170B also called “earpiece” is used to convert audio electrical signals into sound signals.
- Microphone 170C, also called a "mic", is used to convert sound signals into electrical signals.
- the electronic device 100 may be provided with at least one microphone 170C.
- the headphone interface 170D is used to connect wired headphones.
- the sensor module 180 may include one or more sensors, which may be of the same type or different types.
- the sensor module 180 may include a pressure sensor, a gyroscope sensor, an acceleration sensor, a distance sensor, a proximity light sensor, a fingerprint sensor, a touch sensor, an ambient light sensor, and the like.
- the buttons 190 include a power button, a volume button, etc.
- Key 190 may be a mechanical key. It can also be a touch button.
- the electronic device 100 may receive key inputs and generate key signal inputs related to user settings and function control of the electronic device 100 .
- the display screen 194 is used to display images, videos, etc.
- Display 194 includes a display panel.
- the display panel can use a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), etc.
- the electronic device 100 may include 1 or N display screens 194, where N is a positive integer greater than 1.
- the electronic device 100 implements display functions through a GPU, a display screen 194, an application processor, and the like.
- the GPU is an image processing microprocessor and is connected to the display screen 194 and the application processor. GPUs are used to perform mathematical and geometric calculations for graphics rendering.
- Processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
- the electronic device 100 can implement the shooting function through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
- Digital signal processors are used to process digital signals. In addition to digital image signals, they can also process other digital signals. For example, when the electronic device 100 selects a frequency point, the digital signal processor is used to perform Fourier transform on the frequency point energy. Video codecs are used to compress or decompress digital video. NPU is a neural network (NN) computing processor. By drawing on the structure of biological neural networks, such as the transmission mode between neurons in the human brain, it can quickly process input information and can continuously learn by itself.
- the external memory interface 120 can be used to connect an external memory card, such as a Micro SD card, to extend the storage capability of the electronic device 100.
- the external memory card communicates with the processor 110 through the external memory interface 120 to implement the data storage function, for example, saving files such as music and videos in the external memory card.
- Internal memory 121 may be used to store computer executable program code, which includes instructions.
- the processor 110 executes instructions stored in the internal memory 121 to execute various functional applications and data processing of the electronic device 100 .
- the internal memory 121 may include a program storage area and a data storage area. Among them, the stored program area can store an operating system, at least one application program required for a function (such as a sound playback function, an image video playback function, etc.), etc. Data created during use of the electronic device 100 (such as audio data, phone book, etc.) may be stored in the storage data area.
- the internal memory 121 may store codes related to automatic exposure control (automatic exposure control, AEC) and exposure mode switching methods.
- the processor can realize the automatic exposure control process by running AEC-related code, and can realize the switching of the exposure mode of the image sensor in the camera by running the code related to the exposure mode switching method.
- the AEC module may include an AEC algorithm module and an AEC statistics module.
- the AEC statistics module is used to statistically analyze the parameters in the collected images, such as image brightness.
- the AEC algorithm module can automatically adjust the exposure parameters of the camera based on statistical results.
- the AEC algorithm module can also estimate the ambient light brightness based on statistical results.
- the electronic device 100 can realize the function of acquiring images through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
- the ISP is used to process the data fed back by the camera 193. For example, when taking a photo, the shutter is opened, the light is transmitted to the camera sensor through the lens, the light signal is converted into an electrical signal, and the camera sensor passes the electrical signal to the ISP for processing, and converts it into an image or video visible to the naked eye. ISP can also perform algorithm optimization on image noise, brightness, and skin color. ISP can also optimize the exposure, color temperature and other parameters of the shooting scene. In some embodiments, the ISP may be provided in the camera 193.
- Camera 193 is used to capture still images or video.
- the object passes through the lens to produce an optical image that is projected onto the photosensitive element.
- the photosensitive element can be a Charge Coupled Device (CCD) or a Complementary Metal-Oxide-Semiconductor (CMOS) phototransistor.
- the photosensitive element converts the optical signal into an electrical signal, and then passes the electrical signal to the ISP for conversion into a digital image or video signal.
- ISP outputs digital images or video signals to DSP for processing.
- DSP converts digital images or video signals into standard RGB, YUV and other formats.
- the electronic device 100 may include 1 or N cameras 193, where N is a positive integer greater than 1.
- the electronic device 100 can use N cameras 193 to acquire images with multiple exposure coefficients.
- the electronic device 100 can use high dynamic range (HDR) technology to synthesize HDR images.
- Video codecs are used to compress or decompress digital video.
- Electronic device 100 may support one or more video codecs. In this way, the electronic device 100 can play or record videos in multiple encoding formats, such as: Moving Picture Experts Group (MPEG) 1, MPEG2, MPEG3, MPEG4, etc.
- NPU is a neural network (Neural-Network, NN) computing processor.
- Intelligent cognitive applications of the electronic device 100 can be implemented through the NPU, such as image recognition, face recognition, speech recognition, text understanding, etc.
- the SIM card interface 195 is used to connect a SIM card.
- the SIM card can be connected to or separated from the electronic device 100 by inserting it into the SIM card interface 195 or pulling it out from the SIM card interface 195 .
- the electronic device 100 can support 1 or N SIM card interfaces, where N is a positive integer greater than 1.
- the electronic device 100 interacts with the network through the SIM card to implement functions such as calls and data communications.
- the electronic device 100 uses an eSIM, that is, an embedded SIM card.
- the eSIM card can be embedded in the electronic device 100 and cannot be separated from the electronic device 100 .
- the software system of the electronic device 100 may adopt a layered architecture, an event-driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture.
- This embodiment of the present invention takes the Android system with a layered architecture as an example to illustrate the software structure of the electronic device 100 .
- FIG. 4 is a software structure block diagram of an electronic device 100 provided by an embodiment of the present application.
- a layered architecture divides software into several layers, with each layer having clear roles and division of labor.
- the layers communicate through software interfaces.
- the system may include an application layer, a hardware abstraction layer and a kernel layer from top to bottom.
- the application framework layer, system library, runtime, etc. can also be included between the application layer and the hardware abstraction layer.
- the application layer can include a series of application packages. As shown in Figure 4, the application package can include camera, gallery, music, video, call and other applications (or applications).
- the hardware abstraction layer shields the differences between different hardware devices and provides a standard interface for the system. As shown in Figure 4, the hardware abstraction layer transmits data to the kernel layer through the standard HAL interface and accepts data uploaded by the kernel layer.
- the hardware abstraction layer can contain multiple library modules, each of which implements a set of interfaces for a specific type of hardware component, such as wireless fidelity (Wi-Fi)/Bluetooth modules and camera modules.
- the hardware abstraction layer may also include a camera module.
- the camera module may include an automatic exposure control (AEC) module and an image processing pipeline.
- the AEC module can be used to implement automatic exposure control.
- the AEC module can obtain exposure parameters from the system configuration file and configure the exposure parameters to the image sensor.
- the image sensor captures images based on exposure parameters.
- the AEC statistics module in the AEC module can statistically analyze the parameters in the collected images, such as image brightness, etc.
- the AEC module can also estimate the dynamic range compression gain (ADRC gain) based on the above-mentioned exposure parameters and image brightness used when acquiring images.
- the exposure mode control module may be used to control the exposure mode of the image sensor of the camera based on an estimated measure of ambient light. Specifically, in bright light scenes, the image sensor can be controlled to operate in a low-sensitivity mode, thereby extending the exposure time so that the exposure time meets the exposure time requirements for shooting videos.
- the kernel layer is the layer between hardware and software.
- the kernel layer at least includes display driver, camera driver, sensor driver, sensor front-end processor, image processor front-end, image processor back-end, etc.
- the above-mentioned exposure mode control module can control the exposure mode of the image sensor of the camera through the camera driver, and the AEC module can configure the exposure parameters to the camera through the camera driver.
- the workflow of the software and hardware of the electronic device 100 will be exemplified below based on the video recording scene.
- when a touch operation is received, the corresponding hardware interrupt is sent to the kernel layer.
- the kernel layer processes touch operations into raw input events (including touch coordinates, timestamps of touch operations, and other information).
- raw input event is stored in the kernel layer.
- the application framework layer obtains the original input event from the kernel layer and identifies the control corresponding to the input event. For example, if the touch operation is a click operation, and the control acted upon by the click operation is the camera application icon control, the camera application calls the interface of the application framework layer to start the camera application, and then starts the camera driver by calling the kernel layer.
- the camera 193 captures a video image
- the display screen 194 displays a preview interface of the camera, and the captured video image is displayed in the preview interface.
- the video processing method provided by the embodiment of the present application establishes a correspondence between specific influencing factors and exposure methods, and then, during video shooting, seamlessly switches between multiple HDR solutions according to the actual status of those influencing factors, thereby adapting the processing method to the high dynamic range influencing factors and effectively expanding the dynamic range in recording scenarios.
- SHDR obtains long and short frame images by changing the shutter speed (that is, exposure conditions), which can save the computing power of subsequent image reading and processing.
- SHDR easily causes ghosting in the image, so it is suitable for scenes with shorter exposure times (higher ambient brightness).
- the Binning exposure method is a single frame exposure. Compared with the DXG exposure method, its exposure time is longer, for example, it can be longer than the long exposure time in the SHDR exposure method. Taking several exposure methods shown in Figure 5 as an example, the exposure time corresponding to the binning exposure method can be the longest, and the specific exposure time value can be set as needed. This is not limited in the embodiment of the present application.
- the DXG exposure method reads out two gain signals (a conversion gain signal and an analog gain signal) from one exposure to obtain two frames of images with different gains (conversion gain or analog gain). Although the process of obtaining images is relatively complicated, this method does not cause the ghosting problem, and the exposure time is short, so it can be applied to scenes with medium and low ambient brightness.
- for scenes with a small dynamic range, the binning single-frame exposure method can be used. For scenes with a large dynamic range, it can be further determined whether flicker exists in the current scene, that is, flicker judgment. If strobe exists, the binning exposure method is used, because its longer exposure time can effectively overcome the stroboscopic phenomenon caused by periodic changes in alternating current. If there is no strobe, SHDR or DXG can be further chosen based on ambient brightness: the SHDR exposure method is used when ambient brightness is high, and the DXG exposure method is used when ambient brightness is low.
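The decision flow just described can be summarized in a short sketch (the brightness boundary between SHDR and DXG is an assumed placeholder; the patent leaves the exact value open):

```python
def select_exposure_mode(dynamic_range_large: bool,
                         flicker_detected: bool,
                         ambient_lux: float,
                         bright_threshold: float = 50_000.0) -> str:
    """Pick an exposure method following the strategy in the text."""
    if not dynamic_range_large:
        return "binning"      # small dynamic range: single-frame binning
    if flicker_detected:
        return "binning"      # longer exposure masks AC-induced flicker
    if ambient_lux >= bright_threshold:
        return "SHDR"         # bright scene: short exposures avoid ghosting
    return "DXG"              # medium/low brightness: one-exposure dual gain
```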
- Table 1 shows examples of corresponding exposure methods under different influencing factors:
- the influencing factors introduced above are only examples; the types of influencing factors can be flexibly set according to needs. For example, in some cases, only the dynamic range and ambient brightness of the shooting scene may be considered, while flicker and other factors are ignored. The embodiments of the present application do not limit this.
- the video processing method provided by the embodiment of the present application can be applied in various scenarios that require HDR processing, and can especially be applied in HDR processing scenarios during video recording.
- the method for triggering the video processing process may include multiple methods.
- the video processing process may be triggered by a click operation on the smart HDR switch control provided in the mobile phone; for another example, the smart HDR function can be turned on by default when the phone is powered on, which means the video processing process can be triggered at power-on.
- the smart HDR switch control can be set in the settings application of the mobile phone.
- FIG. 6A is a schematic diagram of a setting interface including a smart HDR switch control.
- the setting interface 601 includes a shooting parameter setting area, a video parameter setting area, a general setting area, etc.
- the shooting parameter setting area may specifically include a photo ratio setting bar, a voice-activated shooting function setting bar, a smile capture setting bar, etc.
- the video parameter setting area may specifically include a video resolution setting bar, a video frame rate setting bar, a smart HDR setting bar, etc.
- the general setting area may specifically include a reference line function setting bar, a level function setting bar (not shown in the figure), a timed shooting function setting bar (not shown in the figure), etc.
- the smart HDR setting bar includes a smart HDR switch control. When the user turns on the smart HDR function through the switch control, the phone can intelligently enable the HDR mode according to the shooting scene during video recording.
- the smart HDR switch control can also be set in the camera application of the mobile phone.
- Figure 6B and Figure 6C are schematic diagrams of the GUI that may be involved when turning on the smart HDR function through the camera application.
- the camera main interface 603 as shown in FIG. 6B can be displayed.
- the camera main interface 603 may include an image preview area, a mode setting area, shooting controls, etc.
- the mode setting area 605 may include a viewfinder frame, an album icon, a shooting control 604, a rotation control, etc.
- the viewfinder frame is used to obtain a shooting preview image and display the preview image in real time.
- the photo album icon is used to quickly enter the photo album.
- the shooting control 604 is used for shooting or video recording.
- when the mobile phone detects that the user clicks the shooting control, the mobile phone performs a photo-taking operation and saves the photos taken; or, when the mobile phone is in video recording mode and the user clicks the shooting control, the mobile phone performs a video recording operation and saves the recorded video.
- the camera rotation control is used to control the switching between the front camera and the rear camera.
- the camera main interface 603 also includes functional controls for setting shooting modes, such as aperture shooting mode, night scene shooting mode, portrait shooting mode, video recording mode, photo taking mode and more as shown in FIG. 6B .
- it can also include slow motion mode, panoramic mode, black and white art mode, dual scene recording mode, filter mode, smart HDR mode, etc.
- a specific operation can be input for the smart HDR icon shown in FIG. 6C to turn on the smart HDR function.
- for example, in a backlit scene with a high dynamic range, the corresponding exposure mode is SHDR, and the preview interface can display the prompt message "Backlight HDR"; for another example, in a scene with low ambient brightness and a high dynamic range, the corresponding exposure mode is DXG, and the preview interface can display the prompt message "Night Scene HDR".
- the positions of the smart HDR switch controls shown in FIGS. 6A to 6C are only examples; in actual applications, the smart HDR switch control is not limited to being set in the settings application and the camera application.
- the above embodiment only takes the user manually enabling the smart HDR function as an example. In other embodiments, there can be multiple ways to enable smart HDR: for example, the smart HDR function can be enabled by default when the mobile phone is powered on; for another example, when the mobile phone detects that a preset trigger event occurs, it can automatically turn on the smart HDR function. In some embodiments, if the electronic device does not support the smart HDR function (e.g., due to hardware limitations), the HDR switch control icon may be grayed out. This application does not limit this.
- when the electronic device detects that the scene brightness difference in the current shooting environment is large and the smart HDR function is turned off, the user can also be prompted in various ways to turn on the smart HDR function. For example, when a user takes a backlit photo of a cave, the brightness level of the backlit part is high while that of the cave part is low, with a big difference between the two. At this time, the shooting device can prompt the user to turn on the smart (AI) HDR function; for example, the prompt message shown in Figure 6D can be displayed on the shooting interface: "The difference between light and dark in the current shooting picture is large; you can tap the smart HDR function in the 'More' option to improve picture quality."
- when the smart HDR function is turned on, the mobile phone can, during the shooting process, perform exposure and image processing according to the HDR scheme that matches the current actual shooting situation to obtain high dynamic range video images.
- the shooting picture can then show the details of bright and dark areas more clearly, without losing detail because the brightness is too high or too low.
- the preset HDR solution mentioned here refers to the HDR solution determined by the mobile phone based on factors such as the dynamic range, strobe status, and brightness of the shooting environment corresponding to the actual shooting environment.
- the difference between different HDR solutions mainly lies in the different exposure methods and image processing methods.
- mobile phones can seamlessly switch between multiple supported exposure methods based on actual shooting conditions.
- the exposure methods that can be selected in the embodiment of the present application may include, for example: SHDR, DXG, and binning. It should be noted that the exposure methods available for use in different scenarios may be different. For example, limited by the hardware performance of the device, for devices that do not support DXG exposure, the exposure methods available may only include SHDR and binning.
- the underlying implementation process of controlling the smart HDR switch control may include the following (see the sketch after this list): HAL reports multi-state switching video HDR capability to differentiate products.
- the APP configures the "Smart HDR" switch based on the reported capabilities.
- the APP confirms the switch status according to the mode and scene: on/off/grayed out.
- the APP issues commands to turn on and off the HDR capability to HAL based on the switch status. When turned off, only the binning capabilities are used.
- HAL reports AEC HDR status tag to support AI prompts and recommendations.
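A hedged sketch of this switch-state flow (all names here are hypothetical; the patent does not define an API):

```python
from enum import Enum

class SwitchState(Enum):
    ON = "on"
    OFF = "off"
    GRAYED_OUT = "grayed_out"

def smart_hdr_switch_state(hal_reports_hdr: bool, mode_supports_hdr: bool,
                           user_enabled: bool) -> SwitchState:
    """APP-side logic: reported capability and mode/scene decide the state."""
    if not (hal_reports_hdr and mode_supports_hdr):
        return SwitchState.GRAYED_OUT  # capability or mode/scene excludes HDR
    return SwitchState.ON if user_enabled else SwitchState.OFF

# When the state is OFF, the APP commands HAL to use only binning capture.
```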
- FIG. 7 is a schematic diagram of the data flow interaction between relevant modules when implementing a video processing method provided by an embodiment of the present application.
- the electronic device 100 includes an image sensor (camera sensor), an automatic exposure control (AEC) module, a sensor front-end processor (or fusion module), an image processor front-end, and a multi-camera smooth switching module. , image processor backend and its included local tone mapping (local tone mapping, LTM) module, etc.
- the image sensor may correspond to the camera 193 shown in Figure 3 and is mainly used to capture video images.
- The image sensor collects image frames through exposure: when light reflected by the subject passes through the lens, it converges on the image sensor, which converts the optical signal into an analog electrical signal and outputs the original digital image it collects, that is, the raw (RAW) image, to the fusion module.
- In the initial stage of shooting (when the camera is turned on), the image sensor can collect images using the default exposure method. Subsequently, after the exposure method suitable for the current shooting conditions has been determined, the image sensor can shoot in the target mode under the instruction of the AEC module.
- the default exposure method may be, for example, the binning exposure method, but this is not limited in the embodiments of the present application.
- It should be noted that the image sensor may output a different number of frames for the same picture depending on the exposure method. For example, when shooting with the binning exposure method, the image sensor outputs a single frame (like the long-exposure frame in Figure 1); when shooting with the DXG exposure method, the image sensor outputs two frames.
- In some embodiments, the image sensor can also perform image fusion itself. For example, in the DXG exposure mode, after the image sensor acquires two frames, it can fuse the two frames into one.
- The image sensor can also be used to adjust image parameters (such as image size and bit depth), for example adjusting the sizes of images obtained by different exposure methods to a consistent target size, or adjusting their bit depths to a consistent target bit depth.
- The sensor front-end processor (also called the fusion module) is used to fuse the RAW image frames collected by the image sensor; it supports single-frame input with single-frame output, or two-frame input with single-frame output.
- The SHDR exposure method obtains two frames with different exposure times. Therefore, when shooting with the SHDR exposure method, the initial images that the image sensor inputs to the fusion module can be two frames with a certain frame rate (such as 30fps) and different exposure times, that is, dual-frame mode. After the fusion module obtains the initial images, it fuses the two frames with different exposure times into one frame and adjusts the fused image parameters to the preset target parameters, for example adjusting the initial format RAW10 of the input images to the target format RAW14.
- The DXG exposure method obtains two frames based on one exposure. Therefore, when shooting with the DXG exposure method, one exposure of the image sensor can correspond to inputting two initial frames to the fusion module, which is also a dual-frame mode. The two initial frames can be images read out with high and low conversion gain respectively, or images read out with high and low analog gain respectively, or images obtained by superimposing the conversion-gain and analog-gain readouts. The initial format of the input images may be, for example, RAW10, and the frame rate may be, for example, 30fps. After the fusion module obtains the initial images, it fuses the two frames read out with different gains into one frame, and can adjust the fused image parameters to the preset target parameters, for example adjusting the input initial format RAW10 to the target format RAW14.
- The binning exposure method obtains a single frame. In this case, one exposure of the image sensor corresponds to inputting a single initial frame to the fusion module, that is, single-frame mode. After the fusion module obtains the initial image, it does not need to perform image fusion, but can adjust the image parameters as needed, for example adjusting the input initial format RAW10 to the target format RAW14.
- The purpose of adjusting the format of the initial images is to unify images obtained by different exposure methods into the same format (size, bit depth, etc.), so that the picture size does not jump when the video is subsequently played.
- In some scenarios, a single frame is input to the fusion module; the fusion module then needs neither to fuse the image nor to adjust the image size, and simply outputs the single frame to the image processor front end for further processing.
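- As a minimal sketch of the fusion module's behavior described above (dual-frame inputs are fused into one frame, single-frame binning input passes through, and outputs are normalized from RAW10 to RAW14), the following illustrative Python code uses a simple saturation-replacement rule with an assumed 1:4 exposure or sensitivity ratio; the actual fusion algorithm is not specified by this document.

    import numpy as np

    def to_raw14(img10: np.ndarray) -> np.ndarray:
        # Zero-pad the low bits: 10-bit values -> 14-bit values.
        return img10.astype(np.uint16) << 4

    def fusion_module(frames: list, dual_frame: bool) -> np.ndarray:
        if dual_frame:                     # SHDR or DXG: two-frame input
            long_f, short_f = frames       # e.g. long/short or HCG/LCG frames
            # Illustrative rule: where the long/high-gain frame nears 10-bit
            # saturation, substitute the short/low-gain frame scaled by the
            # assumed 1:4 ratio.
            fused = np.where(long_f >= 1000, short_f * 4, long_f)
            return to_raw14(fused.astype(np.uint16))
        return to_raw14(frames[0])         # binning: single frame, no fusion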
- The image processor front end is used to perform front-end processing on images to obtain the initial preview image. The image processor front end may specifically include a statistics module (STATS) and a global tone mapping (GTM) module, where the GTM module is used to globally brighten the dark parts of the image to improve image quality.
- the image output by the image processor front-end can be divided into the following two transmission paths:
- The image processor back end may include a local tone mapping (LTM) module, which is used to locally brighten the dark parts of the image.
- The image processor back end then transmits the back-end-processed image to the sensing module.
- The sensing module can obtain the ambient brightness from the image and indicate the ambient brightness to the automatic exposure control (AEC) module; the AEC module then determines the exposure method based on the ambient brightness and instructs the image sensor to switch to that exposure method. Alternatively, the sensing module can obtain the ambient brightness from the image, itself determine the exposure method based on the ambient brightness, and indicate that exposure method to the AEC module; the AEC module then instructs the image sensor to switch to it.
- The way the sensing module obtains the ambient brightness from the image may include: comprehensively determining the ambient brightness based on the current state of the image sensor, the exposure parameters, and the brightness-channel information of the image frame.
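- A minimal sketch of such an estimate is shown below; the normalization by exposure time and gain, and the calibration constant k, are assumptions for illustration, since the document does not give the exact formula.

    import numpy as np

    def estimate_ambient_brightness(luma: np.ndarray,
                                    exposure_time_s: float,
                                    total_gain: float,
                                    k: float = 250.0) -> float:
        """Rough ambient-brightness estimate from the frame's luma channel.
        k is a sensor-specific calibration constant (assumed here)."""
        mean_luma = float(luma.mean())
        # A bright scene reaches the same luma with far less exposure and gain,
        # so normalize the measured luma by the light-gathering applied.
        return k * mean_luma / (exposure_time_s * total_gain)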
- Although the AEC module can be used to control the exposure method, the system's AEC module by itself cannot evaluate the enabling conditions for the SHDR and DXG exposure methods; it needs to be combined with the ambient brightness detection result of the sensing module. For example, in high-brightness scenes the SHDR exposure method is used, while in medium- and low-brightness scenes the DXG exposure method is used.
- The other transmission path is from the image processor front end to the multi-camera smooth switching module.
- The multi-camera smooth switching module is used to apply a multi-camera smooth switching algorithm (such as a spatial alignment transform (SAT) algorithm) to realize smooth switching from the image of one camera to that of another.
- After processing, the images can be transmitted to the image processor back end, which can locally brighten the dark parts of the image through the LTM module it includes. Afterwards, the preview video image can be output at a specific frame rate (such as 30fps), or a video file can be generated and stored at a specific frame rate (such as 30fps).
- It should be noted that the smooth switching operation performed by the multi-camera smooth switching module is mainly used for switching between images captured by an HDR camera and a non-HDR camera. The module can smoothly switch from the image of the HDR camera to that of the non-HDR camera, or from the image of the non-HDR camera to that of the HDR camera, that is, it realizes smooth switching between images collected by two different types of cameras, ensuring smooth video playback and avoiding the image jumps caused by direct switching.
- The HDR camera in the embodiments of the present application may be the main camera, and the non-HDR camera may be a non-main camera, such as a wide-angle camera.
- It should also be noted that the HDR camera and non-HDR camera mentioned in the embodiments of this application may not refer to two physically different cameras, but to a camera using the HDR mode and a camera not using it: when a certain camera uses the HDR scheme to capture images it is considered an HDR camera, and when it does not, it is considered a non-HDR camera.
- As shown in FIG. 8, it is a schematic flowchart of a video processing method provided by an embodiment of the present application.
- The execution subject of this method may be the electronic device 100 introduced above, and the method may be specifically implemented by the various functional modules in the electronic device.
- The method may include the following steps:
- S801: The AEC module determines dynamic range information and obtains a first result.
- The dynamic range information here may be the dynamic range or the dynamic range compression gain, where the dynamic range compression gain can be calculated based on the dynamic range of the current image and a preset standard dynamic range.
- In some embodiments, the video processing method provided by the embodiment of the present application may further include: upon receiving a first operation input by the user for starting shooting, in response to the first operation, the AEC module of the electronic device can instruct the image sensor to capture images using the default exposure method; after the AEC module obtains the image captured by the image sensor, the dynamic range of the current image can be obtained through its histogram.
- For the specific process of obtaining the dynamic range of the current image from the histogram, reference may be made to existing processes, which will not be described in detail in the embodiments of this application.
- In some embodiments, the AEC module can obtain the current dynamic range based on multiple frames, for example by averaging the dynamic ranges obtained from the histogram corresponding to each frame.
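- The following sketch shows one plausible, percentile-based way to derive a dynamic-range figure from an image histogram and to average it over several frames; the exact definition used by the AEC module is not specified in this document.

    import numpy as np

    def dynamic_range_from_histogram(img: np.ndarray) -> float:
        lo, hi = np.percentile(img, [1, 99])    # robust dark and bright levels
        return float(hi) / max(float(lo), 1.0)  # brightness ratio as dynamic range

    def average_dynamic_range(frames: list) -> float:
        # Average the per-frame figures, as mentioned above.
        return sum(dynamic_range_from_histogram(f) for f in frames) / len(frames)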
- S802: Determine whether the first result is greater than a preset first threshold.
- the first threshold can be flexibly set according to actual conditions, and this is not limited in the embodiments of the present application.
- By comparing the first result with the first threshold, the level of the current dynamic range can be determined. When the first result is greater than the first threshold, the difference between the light and dark parts of the current picture is large, and the current dynamic range can be determined to be a high dynamic range; when the first result is not greater than the first threshold, the difference between light and dark is small, and the current dynamic range can be determined to be a medium-low dynamic range.
- Different dynamic ranges can correspond to different exposure methods. For example, for a medium or low dynamic range, the exposure method is determined to be binning; for a high dynamic range, the exposure method needs to be further determined according to the strobe situation, the ambient brightness, and the preset judgment method.
- S803: The AEC module determines whether flicker exists in the image and obtains a second result.
- When strobe exists, the AEC module instructs the image sensor to use the binning exposure method. In some shooting environments, stroboscopic flicker may occur in the captured images; based on the strobe detection result, it can be determined whether to use the binning exposure method. When strobe exists, the binning exposure method is used; when no strobe exists, whether to use the SHDR or DXG exposure method can be further determined based on the ambient brightness.
- To eliminate strobe, the exposure time corresponding to the binning exposure method in the embodiments of the present application may be greater than or equal to the AC power cycle. For example, the exposure time of the binning exposure method may be 10 ms.
- When no strobe exists, the sensing module may acquire an image captured by the image sensor and obtain the ambient brightness from the image; the exposure method is then selected by comparing the ambient brightness with preset thresholds.
- the second threshold can be flexibly set according to actual conditions, and this is not limited in the embodiments of the present application.
- the third threshold is smaller than the second threshold.
- the third threshold and the fourth threshold can be flexibly set according to the actual situation, and this is not limited in the embodiments of the present application.
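- Putting steps S801 to S803 together, the decision flow of Figure 8 can be summarized in the following sketch; the threshold values are illustrative placeholders, since the embodiments deliberately leave them unspecified.

    # Illustrative placeholder thresholds (first, second, third, fourth).
    FIRST, SECOND, THIRD, FOURTH = 4.0, 50000.0, 10.0, 1.0

    def select_exposure_method(dyn_range: float, flicker: bool,
                               ambient_lux: float) -> str:
        if dyn_range < FIRST:             # small light/dark difference
            return "binning"              # medium-low dynamic range scene
        if flicker:                       # strobe present: binning avoids banding
            return "binning"
        if ambient_lux > SECOND:          # high-brightness scene
            return "SHDR"
        if FOURTH < ambient_lux < THIRD:  # medium/low brightness
            return "DXG"
        return "binning"                  # very dark scene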
- In the above manner, multiple types of HDR processing solutions are switched seamlessly according to changes in factors such as ambient brightness, required dynamic range, and strobe detection, so that image processing is performed with an HDR solution adapted to the actual shooting conditions and image quality requirements, effectively expanding the dynamic range in recording scenarios and improving image quality in recording scenarios.
- In some embodiments, the video processing method provided by the embodiments of the present application can also switch between only the SHDR and binning exposure methods; in this case, the exposure methods corresponding to different influencing factors can be as shown in Table 2. Likewise, the method can switch between only the DCG and binning exposure methods; in this case, the exposure methods corresponding to different influencing factors can be as shown in Table 3.
- As shown in FIG. 9, it is a schematic flowchart of another video processing method provided by an embodiment of the present application. Figure 9 shows the interaction process between the various functional modules, which may include the following steps:
- The automatic exposure control module sends first indication information to the image sensor. A default exposure method, for example binning, can be preset in the automatic exposure control module.
- In some embodiments, the video processing method provided by the embodiment of the present application may further include: upon receiving a first operation input by the user for starting shooting, in response to the first operation, the AEC module of the electronic device instructs the image sensor to capture images using the default exposure method. The first operation may be used to start video recording; for example, it may be an operation of tapping the video recording control on the camera main interface as shown in FIG. 6C.
- In response to the first indication information, the image sensor captures images using the default exposure method. In this method, the image sensor is only used to collect raw images, such as images in RAW10 format, without performing image fusion or adjusting the image size.
- The image sensor sends the first image to the fusion module. The first image may be a raw image collected by the image sensor, such as an image in RAW10 format.
- In some embodiments, when the first image is input in dual-frame mode, step S904 may also be performed, in which the fusion module performs multi-frame fusion on the first image. When the first image is input in single-frame mode, the fusion module does not need to perform fusion and may only adjust the image parameters.
- S905: The fusion module sends the second image to the image processor front end. The second image may be an image processed by the fusion module: the fusion module may fuse the acquired images, and may also adjust image parameters such as the image size.
- S906: The image processor front end sends the third image to the image processor back end. The third image may be an image obtained by front-end processing of the second image by the image processor front end.
- In some embodiments, the third image can be output along two transmission paths. One path transmits the third image to the multi-camera smooth switching module, which then transmits it to the image processor back end; the image processor back end can then perform the following step S907A. The other path transmits the third image directly to the image processor back end; afterwards, the image processor back end sends the image to the sensing module, that is, the following step S907B is performed.
- S907A: The image processor back end performs back-end processing to obtain the preview video and the stored video file.
- S907B: The image processor back end sends the fourth image to the sensing module. The fourth image may be an image obtained by back-end processing of the third image by the image processor back end.
- the sensing module obtains the environment brightness based on the fourth image.
- the sensing module sends the ambient brightness to the automatic exposure control module.
- the automatic exposure control module determines the exposure method based on preset influencing factors, including ambient brightness.
- The automatic exposure control module sends second indication information to the image sensor.
- The second indication information is used to instruct the image sensor to capture images using a target exposure method.
- In response to the second indication information, the image sensor continues to capture images using the target exposure method, which may be the default exposure method or another exposure method.
- Subsequently, during video shooting, the real-time exposure method can be determined according to the above steps S903 to S911, and images can be captured and processed with the target exposure method.
- In the above manner, multiple types of HDR processing solutions are switched seamlessly according to changes in ambient brightness, required dynamic range, and strobe detection, so that image processing is performed with an HDR solution adapted to the actual shooting environment and image quality requirements, effectively expanding the dynamic range in recording scenarios and improving image quality in recording scenarios.
- In some embodiments, when the image sensor has the functions of fusing images and adjusting image size, the image sensor itself can perform the multi-frame fusion and image-size adjustment in the DXG exposure mode.
- the video processing process in this method is introduced below with reference to the accompanying drawings.
- As shown in FIG. 10, it is a schematic diagram of the data flow between relevant modules when implementing another video processing method provided by an embodiment of the present application.
- the automatic exposure control module may send exposure indication information to the image sensor, instructing the image sensor to capture images using a default exposure method (such as a binning exposure method).
- The image sensor collects images using the default exposure method (outputting, for example, RAW10 images), fuses the images (in the DXG or SHDR exposure mode) and adjusts image parameters (such as image size and bit depth), and then outputs the image (for example, in RAW14 format) to the fusion module.
- The fusion module transparently passes the acquired image to the image processor front end. After the image processor front end performs front-end processing on the image, it outputs the image along two transmission paths: (1) in one path, the image processor front end transmits the image to the image processor back end.
- The image processor back end can perform back-end processing on the image and then transmit it to the sensing module. The sensing module can obtain the ambient brightness from the image and transmit the ambient brightness to the decision-making module. The decision-making module determines the target exposure method according to the ambient brightness, where the target exposure method is the exposure method matching actual conditions such as the current dynamic range, strobe status, and ambient brightness; afterwards, the decision-making module sends exposure indication information to the image sensor, instructing it to switch to the target exposure method.
- (2) In the other path, the image processor front end transmits the image to the multi-camera smooth switching module.
- The multi-camera smooth switching module can use the multi-camera smooth switching algorithm to realize smooth switching from the image of one camera to that of another. After processing, the images can be transmitted to the image processor back end, which can locally brighten the dark parts of the image through the LTM module it includes, and then output the preview video image at a specific frame rate (such as 30fps), or generate and store a video file at a specific frame rate (such as 30fps).
- It should be noted that the smooth switching operation performed by the multi-camera smooth switching module is mainly used for switching between images captured by HDR cameras and non-HDR cameras. Specifically, when switching between an HDR camera and a non-HDR camera occurs during video shooting, the multi-camera smooth switching module can smoothly switch from the image of the HDR camera to that of the non-HDR camera, or from the image of the non-HDR camera to that of the HDR camera, realizing smooth switching between the images collected by two different types of cameras, ensuring smooth video playback and avoiding the image jumps caused by direct switching.
- The HDR camera in the embodiment of the present application may be the main camera, and the non-HDR camera may be a non-main camera, such as a wide-angle camera. It should also be noted that the HDR camera and non-HDR camera mentioned in the embodiments of this application may not refer to two physically different cameras, but to a camera using the HDR mode and a camera not using it: when a certain camera uses the HDR scheme to capture images it is considered an HDR camera, and when it does not, it is considered a non-HDR camera.
- In the above manner, multiple types of HDR processing solutions are switched seamlessly according to changes in ambient brightness, required dynamic range, and strobe detection, so that image processing is performed with an HDR solution adapted to the actual shooting environment and image quality requirements, effectively expanding the dynamic range in recording scenarios and improving image quality in recording scenarios.
- Figure 11 shows the interaction process between various functional modules, which may include the following steps:
- The automatic exposure control module sends first indication information to the image sensor. A default exposure method, for example binning, can be preset in the automatic exposure control module.
- In some embodiments, the video processing method provided by the embodiment of the present application may also include: upon receiving a first operation input by the user for starting shooting, in response to the first operation, the AEC module of the electronic device can instruct the image sensor to capture images using the default exposure method. The first operation may be used to start video recording; for example, it may be an operation of tapping the video recording control on the camera main interface as shown in FIG. 6C.
- In response to the first indication information, the image sensor captures images using the default exposure method and performs image processing according to a preset strategy. Unlike in the foregoing embodiments, the image sensor in this embodiment of the present application has capabilities such as fusing images and adjusting image size, and can therefore process images according to the preset strategy itself.
- In the DXG exposure mode, the image sensor can fuse the two frames of the first image itself; the fused result is input to the fusion module in single-frame mode, so the fusion module does not need to perform fusion and may only adjust the image size.
- For ease of description, the method in which the image sensor acquires two frames through DXG and fuses them internally is denoted iDXG.
- The way the image sensor fuses the two frames includes: acquiring two frames with a preset sensitivity ratio, and then fusing the two frames to obtain an image with a larger dynamic range. The sensitivity ratio of the two frames obtained through iDXG can be preset to fixed values such as iDXG 1:4, iDXG 1:8, or iDXG 1:16.
- Different switching conditions can correspond to different sensitivity ratios. For example, high-dynamic scenes can correspond to iDXG 1:4 (that is, the iDXG exposure method is used with a sensitivity ratio of 1:4); extremely high-dynamic scenes can correspond to iDXG 1:16 (that is, the iDXG exposure method is used with a sensitivity ratio of 1:16); medium- and low-dynamic scenes can correspond to the binning exposure method; and thermal escape (overheating) scenes can correspond to the binning exposure method. The extremely high, high, and medium-low dynamic ranges can be divided as needed; this application does not specifically limit each dynamic range.
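- The fixed mapping described above can be summarized in the following sketch (the scene names and dictionary form are illustrative):

    # Scene -> (exposure method, iDXG sensitivity ratio), per the examples above.
    SCENE_TO_MODE = {
        "extremely_high_dynamic": ("iDXG", 16),   # iDXG with ratio 1:16
        "high_dynamic":           ("iDXG", 4),    # iDXG with ratio 1:4
        "medium_low_dynamic":     ("binning", 1),
        "thermal_escape":         ("binning", 1), # overheating fallback
    }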
- It should be noted that when the fusion module performs dual-frame fusion (as in the embodiment of Figure 9), the sensitivity ratio of the two frames can be set flexibly and is not limited to a few fixed values. Therefore, the fusion method of the Figure 9 embodiment facilitates flexible adjustment of image brightness, while the fusion method of the Figure 11 embodiment achieves HDR processing of the image while preserving the performance of the image sensor.
- The image sensor sends the fifth image to the image processor front end. In some embodiments, the image sensor may first transmit the fifth image to the fusion module, which then transparently passes it to the image processor front end.
- The image processor front end sends the sixth image to the image processor back end. The sixth image may be an image obtained by front-end processing of the fifth image by the image processor front end.
- In some embodiments, the sixth image can be output along two transmission paths. The first path transmits the sixth image to the multi-camera smooth switching module, which then transmits it to the image processor back end; the image processor back end can then perform the following step S1105A. The other path transmits the sixth image directly to the image processor back end; afterwards, the image processor back end sends the image to the sensing module, that is, the following step S1105B is performed.
- the image processor backend performs backend processing to obtain preview videos and stored video files.
- S1105B: The image processor back end sends the seventh image to the sensing module.
- the sensing module obtains the environment brightness based on the seventh image.
- the sensing module sends the environment brightness to the decision-making module.
- The decision-making module here may be an additional module, or it may be a sub-module of the automatic exposure control module, which is not limited in the embodiments of the present application.
- The decision-making module determines the target exposure method according to a preset strategy.
- The preset strategy here refers to a strategy for determining the target exposure method based on the current dynamic range, whether strobe exists, the ambient brightness, and so on, and may correspond to the strategy for determining the target exposure method introduced in the embodiment of FIG. 8 above.
- The decision-making module sends second indication information to the image sensor.
- In response to the second indication information, the image sensor captures images using the target exposure method.
- Subsequently, during video shooting, the real-time exposure method can be determined according to the above steps S1103 to S1109, and images can be captured and processed with the target exposure method.
- In the above manner, multiple types of HDR processing solutions are switched seamlessly according to changes in ambient brightness, required dynamic range, and strobe detection, so that image processing is performed with an HDR solution adapted to the actual shooting environment and image quality requirements, effectively expanding the dynamic range in recording scenarios and improving image quality in recording scenarios.
- In some embodiments, the HDR mode and the non-HDR mode can be switched between each other, where the HDR mode is a video shooting mode processed with HDR technology, and the non-HDR mode is a video shooting mode not processed with HDR technology.
- In some electronic devices, the main camera and the secondary camera are included in one logical camera. The main camera has HDR capability, while the secondary camera may have other capabilities (such as an ultra-wide angle) but no HDR capability. Therefore, when switching between the HDR mode and the non-HDR mode is required during video shooting, the device can switch between the main camera and the secondary camera.
- As shown in FIG. 12, it is a schematic flowchart of a video processing method in a main-camera/secondary-camera switching scenario provided by an embodiment of the present application.
- In some scenarios, the exposure method used is binning; at this time, the dynamic range gain of the system is flat and the image effect does not jump. When switching, the main camera can capture images with the target exposure method determined as introduced above, while the secondary camera uses the binning exposure method and maintains its maximum dynamic range, staying as close as possible to the dynamic range of the main camera to reduce image jumps.
- The main camera can capture images with multiple exposure methods, including the SHDR, DXG, and binning exposure methods. After the main camera captures an image, it can transmit the initial image (such as a RAW10 image) to the fusion module. When the fusion module receives dual-frame input, it fuses the two frames and transmits the fused image (a RAW14 image) to image processor front end 0.
- The secondary camera can capture images through binning exposure, and then transmit the captured initial image (such as a RAW image) to image processor front end 1.
- the image processor front end 0 and the image processor front end 1 can transmit the images processed by each front end to the multi-camera smooth switching module.
- The multi-camera smooth switching module can process the images corresponding to the different cameras according to the multi-camera smooth switching algorithm, realizing smooth switching between the images collected by the two different types of cameras, ensuring smooth video playback and avoiding the image jumps caused by direct switching.
- Afterwards, the image can be transmitted to the image processor back end, which can perform anti-shake processing on the image through the preview-stream anti-shake module (such as EIS2.0 and EIS3.0), and then further process the image through the affine transformation module, the color conversion matrix (CCM) module, and the gamma (Gamma) module. Finally, a preview image at a certain frame rate (such as 30fps) and a stored video file at a certain frame rate (such as 30fps) can be obtained.
- In the above manner, multiple types of HDR processing solutions are switched seamlessly according to changes in ambient brightness, required dynamic range, and strobe detection, so that image processing is performed with an HDR solution adapted to the actual shooting environment and image quality requirements, effectively expanding the dynamic range in recording scenarios and improving image quality in recording scenarios.
- In some embodiments, the electronic device may also have HDR10+ video capability, that is, the electronic device can apply HDR10+ technology for video recording or playback. The HDR10+ capability mainly acts on the image processor back end, including adjusting the color conversion matrix module to BT.2020, adjusting the gamma module to the PQ conversion curve, generating dynamic metadata, and so on. That is to say, HDR10+ technology can be used simultaneously with seamless switching of different exposure methods. The path for switching exposure methods when using HDR10+ technology can be seen in Figure 9 and will not be described in detail here.
- In some embodiments, corresponding switch controls can be provided for the multi-state HDR function, and a switch control corresponding to HDR10+ can also be provided. In some embodiments, the switch control corresponding to multi-state HDR or HDR10+ on the electronic device can be grayed out.
- As shown in FIG. 13, it is a schematic diagram of a method for adjusting image parameters provided by an embodiment of the present application. The method of adjusting the image parameters may include: padding zeros into the low bits (the Pad0 method), so that images corresponding to different exposure methods have the same bit depth. For example, the image corresponding to the iDCG 1:4 exposure mode is padded with zeros in its lower 2 bits, where 1:4 is the sensitivity ratio between the two differently gained frames of iDCG; the image corresponding to the binning exposure mode is padded with zeros in its lower 4 bits. After the low-bit padding, the images obtained by the iDCG and binning exposure methods have the same bit depth.
- As shown in FIG. 14, it is a schematic diagram of another method for adjusting image parameters provided by an embodiment of the present application. Image sensor settings that support seamless switching of multiple exposure methods can unify the bit depth, for example to the bit depth of RAW14. Images with a bit depth of less than 14 bits can be unified into RAW14 through bit padding. In some embodiments, the unified bit depth may be based on the bit-depth level of the iDXG image, because among the several exposure methods provided by the present application, the image corresponding to the iDXG exposure mode has the largest bit depth. In practical applications, the unified bit depth can also be set flexibly according to the types of exposure methods used, which is not limited in the embodiments of the present application.
- For example, the method of adjusting the bit depth can be as shown in Figure 13, including: padding the image captured with binning exposure from 10 bit to 14 bit; padding the two-frame fused image acquired with iDXG exposure at a sensitivity ratio of 1:4 from 12 bit to 14 bit; and leaving the two-frame fused image acquired with iDXG exposure at a sensitivity ratio of 1:16 unpadded, since it is already 14 bit.
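- The Pad0 adjustment amounts to a left shift of the pixel values, as in the following sketch:

    import numpy as np

    def pad_to_raw14(img: np.ndarray, src_bits: int) -> np.ndarray:
        """Zero-pad the low bits so all exposure methods yield RAW14:
        binning 10 bit -> shift by 4; iDXG 1:4 12 bit -> shift by 2;
        iDXG 1:16 is already 14 bit -> shift by 0."""
        return img.astype(np.uint16) << (14 - src_bits)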
- In some embodiments, the sensitivity can be a combination of the exposure time (exp) and the system gain (gain). When capturing images, AEC can instruct the image sensor to capture images with the gain corresponding to the exposure method; therefore, images captured through different exposure methods may correspond to different sensitivities, which would cause brightness jumps when switching. The video processing method provided in the embodiments of the present application solves this problem by adjusting the sensitivity ratio corresponding to the exposure method.
- Take the exposure methods and sensitivity ratios given in Table 4 above as an example: the binning exposure mode corresponds to a sensitivity ratio of 1 and iDXG 1:4 corresponds to a sensitivity ratio of 4, so when the exposure mode is switched from binning to iDXG 1:4, the sensitivity of the iDXG 1:4 exposure mode can be increased to 4 times the original. Likewise, binning corresponds to a sensitivity ratio of 1 and iDXG 1:16 to a sensitivity ratio of 16, so when switching from binning to iDXG 1:16, the sensitivity of the iDXG 1:16 exposure mode can be increased to 16 times the original; and iDXG 1:4 corresponds to a sensitivity ratio of 4 and iDXG 1:16 to 16, so when switching from iDXG 1:4 to iDXG 1:16, the sensitivity of the iDXG 1:16 exposure mode can be increased to 4 times the original.
- the several sensitivity ratios given in Table 4 above are only examples. In practical applications, the sensitivity ratios corresponding to different exposure methods can also be other specific values, which are not limited in the embodiments of the present application.
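- A minimal sketch of this compensation, using the example ratios above (binning = 1, iDXG 1:4 = 4, iDXG 1:16 = 16), could look as follows:

    # Example sensitivity ratios per exposure method (Table 4 values).
    RATIO = {"binning": 1, "iDXG1:4": 4, "iDXG1:16": 16}

    def compensated_sensitivity(old_mode: str, new_mode: str,
                                old_value: float) -> float:
        # e.g. binning -> iDXG1:16 raises the sensitivity by 16x,
        # and iDXG1:4 -> iDXG1:16 raises it by 4x.
        return old_value * RATIO[new_mode] / RATIO[old_mode]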
- As shown in FIG. 15, it is a schematic flowchart of yet another video processing method provided by an embodiment of the present application. This method may include the following steps:
- the first operation may correspond to the operation of clicking the video recording button in FIG. 6C; or the first operation may also correspond to the operation of clicking to turn on the smart HDR function.
- In some embodiments, when the video recording function is turned on, the electronic device can display a first interface, where the first interface includes a first control used to enable the function of automatically switching the HDR exposure method.
- the first interface may correspond to the interface shown in FIG. 6A or FIG. 6D, for example.
- the exposure methods in the embodiments of the present application may include: binning exposure method, SHDR exposure method, DXG exposure method, etc.
- the first HDR exposure method is a single frame mode, that is, one frame of image can be obtained based on one exposure.
- the first HDR exposure method can be binning mode;
- the second HDR exposure method is double frame mode, that is, two frames of images can be read based on one exposure, such as DXG mode, or two frames of images can be obtained by two long and short exposures, such as SHDR mode.
- a default HDR exposure mode may be preset, and when receiving the first operation, the AEC module may instruct the image sensor to capture an image using the default HDR exposure mode.
- the default HDR exposure mode may be binning mode.
- The preset strategy includes the correspondence between the dynamic range information corresponding to the video shooting, the strobe status, and the ambient brightness, on the one hand, and the target HDR exposure method on the other.
- The dynamic range information here may include the dynamic range and/or the dynamic range compression gain. The embodiments of this application are introduced taking the dynamic range as an example.
- In some embodiments, determining the target HDR exposure method according to the preset strategy may include: obtaining the dynamic range corresponding to the video shooting; when the dynamic range is less than a first threshold, determining that the target HDR exposure method is the binning mode; when the dynamic range is greater than or equal to the first threshold, detecting whether strobe exists; when strobe exists, determining that the target HDR exposure method is the binning mode; when no strobe exists, determining the target HDR exposure method according to the ambient brightness, wherein: when the ambient brightness is greater than a second threshold, the target HDR exposure method is determined to be the SHDR mode; when the ambient brightness is less than a third threshold and greater than a fourth threshold, the target HDR exposure method is determined to be the DXG mode, the third threshold being less than the second threshold; and when the ambient brightness is less than the fourth threshold, the target HDR exposure method is determined to be the binning mode.
- In addition to determining the exposure method based on the dynamic range, the judgment can also be made based on the dynamic range compression gain. The process of determining the exposure method according to the dynamic range compression gain is similar to the above process (with the dynamic range replaced by the dynamic range compression gain) and will not be described again. It should be noted that in this case the specific value of the first threshold may change accordingly; that is, the first threshold corresponding to the dynamic range and the first threshold corresponding to the dynamic range compression gain may have different values.
- In some embodiments, determining the target HDR exposure method according to the preset strategy may also include: detecting whether a thermal escape (overheating) phenomenon exists during the video shooting; when the thermal escape phenomenon exists, determining that the target HDR exposure method is the binning mode.
- In some embodiments, when switching between an HDR camera and a non-HDR camera occurs during the video shooting, the method further includes: at the time of the switch, if the exposure method of the HDR camera is the first HDR exposure method and the exposure method corresponding to the non-HDR camera is the second HDR exposure method, adjusting the first dynamic range gain corresponding to the first HDR exposure method so that the first dynamic range gain is closest to the second dynamic range gain corresponding to the second HDR exposure method.
- In some embodiments, when the target HDR exposure method is the second HDR exposure method, the images input in dual-frame mode are fused. Specifically, the image sensor may transmit the dual-frame-mode images to the fusion module, and the fusion module fuses them; or, when the target HDR exposure method is the second HDR exposure method, the image sensor may itself fuse the dual-frame-mode images.
- In some embodiments, when the fusion module fuses the dual-frame-mode images, it determines, according to the sensitivity ratio required by the DXG mode, the target sensitivity ratios for the dual-frame input images in the DCG mode and in the DAG mode respectively; according to these target sensitivity ratios, the DCG-mode dual-frame input images and the DAG-mode dual-frame input images are correspondingly superimposed to obtain superimposed dual-frame input images that satisfy the DXG-mode sensitivity ratio.
- In some embodiments, when the image sensor fuses the dual-frame-mode images, the DCG-mode dual-frame input images and the DAG-mode dual-frame input images are superimposed according to a preset sensitivity ratio; taking iDXG as an example, the sensitivity ratio is 1:4 or 1:16.
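- As a small worked sketch of the superposition: the terminology section of this application gives 1:16 = DCG 1:2 multiplied by DAG 1:8 as an example, so an overall DXG ratio can be split into a DCG factor and a DAG factor whose product equals the total (the particular split chosen below is an assumption):

    def split_dxg_ratio(total: int, dcg_part: int) -> tuple:
        """Split the overall DXG sensitivity ratio into DCG and DAG factors."""
        assert total % dcg_part == 0, "DCG factor must divide the total ratio"
        return dcg_part, total // dcg_part    # (DCG ratio, DAG ratio)

    # Usage: split_dxg_ratio(16, 2) -> (2, 8), i.e. DCG 1:2 stacked with DAG 1:8.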
- In some embodiments, to avoid image jumps when switching among exposure methods, the image parameters corresponding to these exposure methods can be adjusted to be consistent. The specific process may include: presetting the target parameters corresponding to the images obtained through the video shooting; adjusting the initial parameters corresponding to the first image obtained with the first HDR exposure method to the target parameters; and/or adjusting the initial parameters corresponding to the second image obtained with the second HDR exposure method to the target parameters.
- the electronic device supports the first HDR video mode, and the first HDR video mode includes HDR10 or HDR10+.
- In the above manner, multiple types of HDR processing solutions are switched seamlessly according to changes in ambient brightness, required dynamic range, and strobe detection, so that image processing is performed with an HDR solution adapted to the actual shooting environment and image quality requirements, effectively expanding the dynamic range in recording scenarios and improving image quality in recording scenarios.
- Embodiments of the present application also provide an electronic device, including one or more processors and one or more memories; the one or more memories store one or more computer programs including instructions that, when executed by the one or more processors, cause the electronic device to perform one or more steps of any of the methods described above.
- Embodiments of the present application also provide a computer-readable storage medium storing computer-executable program instructions that, when run on a computer, cause the computer to perform one or more steps of any of the methods described above.
- Embodiments of the present application also provide a computer program product containing instructions. The computer program product includes computer program code that, when run on a computer, causes the computer or processor to perform one or more steps of any of the methods described above.
- the computer program product includes one or more computer instructions.
- the computer may be a general purpose computer, a special purpose computer, a computer network, or other programmable device.
- The computer instructions may be stored in a computer-readable storage medium or transmitted via a computer-readable storage medium. The computer instructions can be transmitted from one website, computer, server, or data center to another website, computer, server, or data center by wired means (such as coaxial cable, optical fiber, or digital subscriber line) or wireless means (such as infrared, radio, or microwave).
- The computer-readable storage medium may be any available medium accessible by a computer, or a data storage device such as a server or data center integrating one or more available media. The available media may be magnetic media (e.g., floppy disk, hard disk, magnetic tape), optical media (e.g., DVD), or semiconductor media (e.g., solid state disk (SSD)), etc.
Abstract
This application provides a video processing method and an electronic device, belonging to the field of terminal technology. The method includes: receiving a first operation input by a user; in response to the first operation, acquiring a first image according to a preset default high-dynamic-range (HDR) exposure method; acquiring the ambient brightness according to the first image, and determining a target HDR exposure method according to a preset strategy, where the preset strategy includes the correspondence between the dynamic range corresponding to the video shooting, the strobe status, and the ambient brightness, and the target HDR exposure method; and when the target HDR exposure method differs from the default HDR exposure method, switching the default HDR exposure method to the target HDR exposure method and continuing the video shooting. By switching seamlessly and automatically among multiple exposure methods according to the actual video shooting conditions, the method achieves efficient HDR processing in video shooting scenarios and improves video image quality.
Description
This application claims priority to the Chinese patent application No. 202210950356.6, entitled "Video processing method and electronic device", filed with the China National Intellectual Property Administration on August 9, 2022, the entire content of which is incorporated herein by reference.
This application relates to the field of terminal technology, and in particular to a video processing method and an electronic device.
Exposure is the process of forming an image by receiving, on a photosensitive device, the light entering through the lens. During shooting, the brightness of the background or subject changes: if the outside light is too strong, overexposure easily occurs and the image becomes too bright, lacking gradation and detail; if the outside light is too weak, underexposure easily occurs and the image becomes too dark to reflect true color.
In practical applications, image brightness is usually limited by the dynamic range. The dynamic range refers to the ratio of the maximum to the minimum output signal supported by a device, or the grayscale ratio between the upper and lower brightness limits of an image. If the ambient brightness is greater than the upper limit of the dynamic range, the captured image tends to be too bright; if the ambient brightness is less than the lower limit, the captured image tends to be too dark.
At present, the factors affecting the dynamic range include the sizes of the device and of the image sensor (camera sensor). A larger sensor has a larger photosensitive surface area and can therefore provide more area for receiving light information within the exposure time: more pixels, larger dynamic range. However, as technology develops, the miniaturization of device physical space limits the sensor size, resulting in a limited dynamic range.
Summary of the invention
Embodiments of this application provide a video processing method that switches seamlessly and automatically among multiple exposure methods according to the actual video shooting conditions, thereby achieving efficient HDR processing in video shooting scenarios and improving video image quality.
In a first aspect, a video processing method is provided, applied to an electronic device, the method including:
receiving a first operation input by a user, the first operation being used to turn on the video shooting function of the electronic device;
in response to the first operation, acquiring a first image according to a preset default high-dynamic-range (HDR) exposure method;
acquiring an ambient brightness according to the first image, and determining a target HDR exposure method according to a preset strategy, the preset strategy including the correspondence between the dynamic range information corresponding to the video shooting, a strobe status, and the ambient brightness, and the target HDR exposure method;
when the target HDR exposure method is different from the default HDR exposure method, switching the default HDR exposure method to the target HDR exposure method, and continuing the video shooting to acquire a second image.
In one possible implementation, the dynamic range information here may include the dynamic range and/or a dynamic range compression gain.
According to the video processing method provided by this implementation, multiple types of HDR processing solutions are switched seamlessly according to changes in factors such as ambient brightness, required dynamic range, and strobe detection, so that image processing is performed with an HDR solution adapted to the actual shooting environment and image quality requirements, effectively expanding the dynamic range in recording scenarios and improving image quality in recording scenarios.
With reference to the first aspect, in some implementations of the first aspect, the target HDR exposure method includes at least a first HDR exposure method and a second HDR exposure method, the first HDR exposure method being a single-frame mode and the second HDR exposure method being a dual-frame mode;
when the target HDR exposure method is the second HDR exposure method, the images input in the dual-frame mode are fused.
In one possible implementation, the single-frame mode may be the binning exposure method, that is, the image sensor outputs a single frame after exposure. The dual-frame mode may be an exposure method such as SHDR, DCG or DXG, that is, reading two frames based on one exposure and then fusing the two frames to adjust the HDR; or acquiring two frames with a long exposure and a short exposure and then fusing the two frames to adjust the HDR.
With reference to the first aspect, in some implementations of the first aspect, the first HDR exposure method is the binning mode, and the second HDR exposure method includes the staggered high-dynamic-range mode SHDR and DXG, where DXG is a mode in which the dual conversion gain mode DCG and the dual analog gain DAG are used in superposition.
With reference to the first aspect, in some implementations of the first aspect, the method further includes:
presetting target parameters corresponding to images acquired through the video shooting;
adjusting initial parameters corresponding to the first image acquired based on the first HDR exposure method to the target parameters; and/or,
adjusting initial parameters corresponding to the second image acquired based on the second HDR exposure method to the target parameters.
In one possible implementation, the target parameters here may include the image size, the bit depth, and so on.
It should be understood that adjusting the parameters of the images corresponding to different exposure methods to the target parameters enables images from different exposure methods to have consistent image parameters, avoiding image jumps when switching between different exposure methods.
With reference to the first aspect, in some implementations of the first aspect, the electronic device includes an automatic exposure control AEC module, an image sensor, and a sensing module,
and the acquiring the first image according to the preset default HDR exposure method in response to the first operation specifically includes:
in response to the first operation, the AEC module sends first indication information to the image sensor, the first indication information being used to instruct capturing images using the default HDR exposure method;
in response to the first indication information, the image sensor uses the default HDR exposure method and acquires the first image.
With reference to the first aspect, in some implementations of the first aspect, the electronic device includes an AEC module, an image sensor, and a sensing module,
and the acquiring the ambient brightness according to the first image and determining the target HDR exposure method according to the preset strategy specifically includes:
the image sensor sends the first image to the sensing module;
the sensing module acquires the ambient brightness according to the first image and indicates the ambient brightness to the AEC module;
the AEC module determines the target HDR exposure method according to the ambient brightness and the preset strategy.
With reference to the first aspect, in some implementations of the first aspect, the electronic device includes an AEC module, an image sensor, a sensing module, and a fusion module,
and the fusing the dual-frame input images when the target HDR exposure method is the second HDR exposure method specifically includes:
when the target HDR exposure method is the second HDR exposure method, the image sensor transmits the dual-frame-mode images to the fusion module;
the fusion module fuses the dual-frame-mode images; or,
when the target HDR exposure method is the second HDR exposure method, the image sensor fuses the dual-frame-mode images.
With reference to the first aspect, in some implementations of the first aspect, when the fusion module fuses the dual-frame-mode images, target sensitivity ratios for the dual-frame input images in the DCG mode and the dual-frame input images in the DAG mode are respectively determined according to the sensitivity ratio required by the DXG mode;
the DCG-mode dual-frame input images and the DAG-mode dual-frame input images are respectively superimposed according to the target sensitivity ratios, to acquire superimposed dual-frame input images satisfying the DXG-mode sensitivity ratio.
With reference to the first aspect, in some implementations of the first aspect, when the image sensor fuses the dual-frame-mode images, the dual-frame input images in the DCG mode and the dual-frame input images in the DAG mode are superimposed according to a preset sensitivity ratio.
With reference to the first aspect, in some implementations of the first aspect, the determining the target HDR exposure method according to the preset strategy specifically includes:
acquiring the dynamic range corresponding to the video shooting;
when the dynamic range information is less than a first threshold, determining that the target HDR exposure method is the binning mode;
when the dynamic range information is greater than or equal to the first threshold, detecting whether strobe exists;
when strobe exists, determining that the target HDR exposure method is the binning mode;
when no strobe exists, determining the target HDR exposure method according to the ambient brightness; wherein,
when the ambient brightness is greater than a second threshold, determining that the target HDR exposure method is the SHDR mode;
when the ambient brightness is less than a third threshold, determining that the target HDR exposure method is the DXG mode.
With reference to the first aspect, in some implementations of the first aspect, the method further includes:
detecting whether a thermal escape phenomenon exists in the video shooting;
when the thermal escape phenomenon exists, determining that the target HDR exposure method is the binning mode.
With reference to the first aspect, in some implementations of the first aspect, the electronic device supports a first HDR video mode, the first HDR video mode including HDR10 or HDR10+.
With reference to the first aspect, in some implementations of the first aspect, when switching between an HDR camera and a non-HDR camera exists during the video shooting, the method further includes:
at the time of the switch, if the exposure method of the HDR camera is the first HDR exposure method and the exposure method corresponding to the non-HDR camera is the second HDR exposure method, adjusting a first dynamic range gain corresponding to the first HDR exposure method so that the first dynamic range gain is closest to a second dynamic range gain corresponding to the second HDR exposure method.
With reference to the first aspect, in some implementations of the first aspect, the method further includes:
displaying a first interface, the first interface including a first control, the first control being used to enable the function of automatically switching the HDR exposure method.
In a second aspect, an electronic device is provided, including: one or more processors; one or more memories; the one or more memories store one or more computer programs including instructions that, when executed by the one or more processors, cause the electronic device to perform the method described in any implementation of the first aspect or the second aspect above.
In a third aspect, a computer-readable storage medium is provided, storing computer-executable program instructions that, when run on a computer, cause the computer to perform the method described in any implementation of the first aspect or the second aspect above.
In a fourth aspect, a computer program product is provided, the computer program product including computer program code that, when run on a computer, causes the computer to perform the method described in any implementation of the first aspect or the second aspect above.
FIG. 1 is a schematic diagram of an image captured in the staggered high-dynamic-range mode according to an embodiment of this application.
FIG. 2A and FIG. 2B are schematic diagrams of images captured in the DCG mode and the DAG mode according to embodiments of this application.
FIG. 3 is a schematic structural diagram of an electronic device 100 according to an embodiment of this application.
FIG. 4 is a software structural block diagram of an electronic device 100 according to an embodiment of this application.
FIG. 5 is a schematic diagram of differences in relevant parameters corresponding to multiple exposure modes according to an embodiment of this application.
FIG. 6A to FIG. 6D are schematic diagrams of GUIs that may be involved in some video processing procedures according to embodiments of this application.
FIG. 7 is a schematic diagram of a video processing method according to an embodiment of this application.
FIG. 8 is a schematic diagram of another video processing method according to an embodiment of this application.
FIG. 9 is a schematic diagram of yet another video processing method according to an embodiment of this application.
FIG. 10 is a schematic diagram of yet another video processing method according to an embodiment of this application.
FIG. 11 is a schematic diagram of yet another video processing method according to an embodiment of this application.
FIG. 12 is a schematic diagram of yet another video processing method according to an embodiment of this application.
FIG. 13 is a schematic diagram of adjusting image parameters in a video processing procedure according to an embodiment of this application.
FIG. 14 is a schematic diagram of adjusting image parameters in a video processing procedure according to an embodiment of this application.
FIG. 15 is a schematic diagram of yet another video processing method according to an embodiment of this application.
It should be noted that the terms used in the implementation section of the embodiments of this application are only used to explain specific embodiments of this application and are not intended to limit this application. In the description of the embodiments of this application, unless otherwise stated, "/" means "or"; for example, A/B may mean A or B. "And/or" herein merely describes an association relationship between associated objects, indicating that three relationships may exist; for example, "A and/or B" may indicate: A exists alone, both A and B exist, and B exists alone. In addition, in the description of the embodiments of this application, unless otherwise stated, "multiple" means two or more, and "at least one" or "one or more" means one, two, or more.
Hereinafter, the terms "first" and "second" are used for descriptive purposes only and shall not be understood as indicating or implying relative importance or implicitly specifying the number of the indicated technical features. Thus, a feature defined with "first" or "second" may explicitly or implicitly include one or more of that feature.
Reference in this specification to "one embodiment", "some embodiments" and the like means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of this application. Thus, the statements "in one embodiment", "in some embodiments", "in some other embodiments", "in still other embodiments", etc., appearing in different places in this specification do not necessarily all refer to the same embodiment, but rather mean "one or more but not all embodiments", unless specifically emphasized otherwise. The terms "include", "comprise", "have" and their variants all mean "including but not limited to", unless specifically emphasized otherwise.
The term "user interface" in the specification, claims, and drawings of this application is a medium interface for interaction and information exchange between an application or operating system and a user. A common form of user interface is the graphical user interface (GUI), which refers to a user interface related to computer operations displayed in a graphical manner. It may be an interface element such as an icon, window, or control displayed on the display screen of an electronic device, where controls may include visible interface elements such as icons, buttons, menus, tabs, text boxes, dialog boxes, status bars, navigation bars, and widgets.
In connection with the introduction in the background, in order to overcome the problem of low image capture quality (small light-dark difference) caused by a limited dynamic range, high-dynamic-range imaging (HDR) technology has emerged. HDR is a group of techniques that achieve a larger exposure dynamic range (that is, a larger light-dark difference), with the goal of correctly representing the real-world brightness range from direct sunlight to the darkest shadows. In a specific implementation of HDR technology, multiple sets of exposure values (EV) can be used to expose the currently photographed object, including exposure at the normal EV value obtained by the current metering calculation, as well as exposures below the normal EV value (EV-n) and above the normal EV value (EV+n); afterwards, the multiple exposed photos are fused, so that objects in dark areas use the local photo with the higher-EV exposure and objects in bright areas use the local photo with the lower-EV exposure, so that no part of the scene in the whole photo is too bright or too dark.
The above approach of expanding the dynamic range by EV bracketing and fusing multiple photos involves relatively complex image processing algorithms and requires a sufficient interval between two frames as computation time. Generally speaking, in photo-taking scenarios, the time interval between two consecutive frames is long enough to expand the dynamic range in the traditional HDR exposure manner. However, in video recording scenarios, to ensure smooth playback, images are generally collected and processed at a specific frame rate with very short frame intervals, that is, each frame corresponds to an extremely short computation time (for example, at a frame rate of 30fps, the average computation time per frame is less than 33ms). This means that in recording scenarios, the highly complex traditional approach cannot be used to expand the dynamic range; instead, an HDR solution tailored to the characteristics of video recording is needed.
In view of this, embodiments of this application provide a video processing method that seamlessly switches among multiple types of HDR exposure methods according to changes in factors such as ambient brightness, dynamic range, and strobe, so that image collection and processing are performed with an HDR solution adapted to the actual shooting state and image quality requirements, effectively expanding the dynamic range in recording scenarios and improving image quality in recording scenarios.
To better understand the video processing method provided by the embodiments of this application, some definitions of terms that may be involved herein are first introduced.
1. Staggered high-dynamic range (stagger high-dynamic range, SHDR)
Different manufacturers may have different names for technologies similar to SHDR; for example, Sony's related technology is called line-interleaved HDR (digital overlap HDR, DOL-HDR). By increasing the sensor frame rate, SHDR technology can collect multiple frames with different exposure times within one collection cycle, and then fuse the corresponding long-exposure frame and short-exposure frame into one frame through multi-frame fusion, obtaining a high-dynamic-range image. The short-exposure frame (or "short-exposure image") captures highlight information; the long-exposure frame (or "long-exposure image") captures shadow information, and the long-exposure frame has excellent noise control. Fusing the two frames yields gains in both highlights and shadows.
Illustratively, images collected by SHDR technology may be as shown in FIG. 1, including a periodically collected long-exposure frame (marked with the letter "L" in FIG. 1) and a short-exposure frame (marked with the letter "S" in FIG. 1).
It should be noted that, affected by the different exposure periods and exposure durations, SHDR may not achieve ideal results in some scenarios. For example, compared with non-SHDR shooting scenarios, since SHDR needs to collect images under multiple exposure parameters (such as exposure times), the blank interval between two frames is shorter, making it unsuitable for scenarios with overly complex image processing algorithms. Moreover, the two frames that SHDR fuses into one image come from different exposure periods and correspond to different exposure durations; the longer the exposure time, the more likely smearing (or ghosting) occurs, so motion ghosting is inevitably produced during fusion (relatively ideal results can only be achieved in bright scenes where both exposure times are very short).
2. Binning
Binning is an image readout mode in which the charges sensed in adjacent pixels are added together and read out as one pixel. For example, during image capture by an electronic device, the light reflected by the target object is collected by the camera, so that the reflected light is transmitted to the image sensor. The image sensor contains multiple photosensitive elements; the charge collected by each photosensitive element is one pixel, and a binning operation is performed on the pixel information. Specifically, binning can merge n×n pixels into one pixel. For example, binning can merge adjacent 2×2 pixels into one pixel, that is, the colors of the adjacent 2×2 pixels are presented in the form of a single pixel.
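As a minimal illustration of the 2×2 case described above, the following sketch sums the charge of each 2×2 block into one output pixel (averaging instead of summing would be an equally plausible variant):

    import numpy as np

    def binning_2x2(raw: np.ndarray) -> np.ndarray:
        h, w = raw.shape
        blocks = raw[: h // 2 * 2, : w // 2 * 2].reshape(h // 2, 2, w // 2, 2)
        return blocks.sum(axis=(1, 3))   # one output pixel per 2x2 pixel block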
3. Dual conversion gain (dual conversion gain, DCG)
DCG is also an image readout method. It can be understood as the ability to perform two readouts within one pixel circuit, or as having two capacitors storing photon energy in the photosensitive unit corresponding to one pixel. The DCG involved in the embodiments of this application may specifically refer to performing two conversion-gain readouts of an image based on one exposure operation.
In the embodiments of this application, DCG can be used to expand the dynamic range. The implementation principle is as follows: in an image sensor with dual conversion gain (DCG) capability, one pixel has two potential wells, and the two potential wells correspond to different full-well capacities and different conversion gains CG; the large full-well capacity corresponds to low conversion gain (LCG) and low sensitivity, and the small full-well capacity corresponds to high conversion gain (HCG) and high sensitivity. In this way, the sensor can use the two potential wells (two sensitivities) and two conversion gains in the same scene, acquiring two images with one exposure: an image in high-sensitivity mode and an image in low-sensitivity mode. The electronic device then combines the two acquired images into one image, which is the HDR technique.
As an example, based on the same short exposure, the two images read out by DCG may be as shown in FIG. 2A, where the LCG frame is a frame read out with the LCG gain signal, which protects highlight areas from overexposure; the HCG frame is a frame read out with the HCG gain signal, which raises shadow brightness while controlling noise. Afterwards, the two frames are fused to obtain gains in highlights and shadows, yielding one image with an optimized dynamic range.
4. Dual analog gain (dual analog gain, DAG)
As another image readout method, similar to the dual conversion gain DCG introduced above, DAG also reads two frames through two analog signal paths based on one exposure; the difference lies in the readout method. The DAG readout mode performs image readout with two analog gains: low analog gain (LAG) and high analog gain (HAG), where LAG protects highlight areas from overexposure and HAG brightens shadows while controlling noise.
As an example, based on the same short exposure, the two images read out by DAG may be as shown in FIG. 2B, where the LAG frame is a frame read out with LAG and the HAG frame is a frame read out with HAG. Afterwards, the two frames are fused to obtain gains in highlights and shadows and expand the dynamic range. Compared with conversion gain (CG), analog gain (AG) has better noise control capability.
It should be noted that since the two frames of both DCG and DAG come from different readouts of one exposure, no ghosting problem exists after fusing the two frames, which makes them suitable for wide use in various recording scenarios.
It should also be noted that in the embodiments of this application, the above DCG and DAG techniques can also be superimposed to expand the dynamic range. For ease of description, the embodiments of this application refer to the superimposed use of DCG and DAG as DXG. Illustratively, DCG and DAG can be superimposed according to the sensitivity ratio. For example, if a sensitivity ratio of 1:16 is required between the image read out with the high-gain signal and the image read out with the low gain, DCG can be used to acquire two frames with a sensitivity ratio of 1:2 and DAG to acquire two frames with a sensitivity ratio of 1:8; multiplying the two sensitivity ratios yields an image with a 1:16 sensitivity ratio.
5. Single-frame dynamic-range DXG (intra-scene DXG, iDXG)
For ease of distinction, the embodiments of this application refer to the approach of fusing, inside the image sensor, the two frames acquired through DXG into one frame as iDXG.
6. RAW image (or RAW-domain image)
That is, the raw image, containing the data processed from the image sensor of a digital camera, scanner, or film scanner. A RAW image contains the most original information of the image, without the nonlinear processing of the image signal processing (ISP) pipeline.
7. HDR10 video
HDR10 video is configured according to static metadata; for example, the PQ conversion curve of HDR10 is fixedly mapped according to the reference display brightness of the monitor. The bit depth of HDR10 video is 10 bit; the static metadata can satisfy the definitions in SMPTE ST 2086 or other standards.
8. HDR10+ video
HDR10+ continues to improve on the basis of HDR. HDR10+ supports dynamic metadata, that is, HDR10+ can adjust or enhance image brightness, contrast, color saturation, etc., according to the different scenes in the video, so that each frame in an HDR10+ video has an independently adjusted HDR effect. The bit depth of HDR10+ video is 12 bit; the dynamic metadata can satisfy the definitions in SMPTE ST 2094 or other standards.
9. Brightness scene
A brightness scene may also be called a brightness level. In the embodiments of this application, the brightness scene can be used to determine the exposure method for image collection (DXG or SHDR). Brightness scenes may include: highlight scenes, medium scenes, dark-light scenes, and so on.
Illustratively, brightness scenes may correspond to different brightness ranges, and the device can distinguish different brightness levels according to the illumination intensity reflected by the photographed object. For example, the brightness range corresponding to a highlight scene may be greater than 50000 lux, the brightness range corresponding to a medium-brightness scene may be 50000lux-10lux, and the brightness range corresponding to a dark-light scene may be 10lux-0lux.
It should be noted that the brightness levels described in the embodiments of this application are not limited to the above three. Moreover, the brightness ranges corresponding to these three brightness scenes are only examples; the values of the brightness ranges corresponding to different brightness scenes may also be other values, which is not limited in the embodiments of this application.
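Using the example ranges above, a brightness scene could be classified as in the following sketch (the boundaries are only the example values given and may differ per device):

    def brightness_scene(lux: float) -> str:
        if lux > 50000:
            return "highlight"    # highlight scene
        if lux > 10:
            return "medium"       # medium-brightness scene
        return "dark"             # dark-light scene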
Illustratively, FIG. 3 is a schematic structural diagram of an electronic device 100 according to an embodiment of this application.
The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, antenna 1, antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, a headset jack 170D, a sensor module 180, buttons 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a subscriber identification module (SIM) card interface 195, and so on. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, a barometric pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and so on.
It can be understood that the structure illustrated in the embodiments of the present invention does not constitute a specific limitation on the electronic device 100. In other embodiments of this application, the electronic device 100 may include more or fewer components than shown, or combine some components, or split some components, or arrange the components differently. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units. For example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc. Different processing units may be independent devices or may be integrated in one or more processors.
The controller may be the nerve center and command center of the electronic device 100. The controller can generate operation control signals according to instruction opcodes and timing signals, completing the control of instruction fetching and execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache, which can store instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs to use the instruction or data again, it can be called directly from the memory, avoiding repeated access, reducing the waiting time of the processor 110, and thus improving system efficiency.
In some embodiments, the processor 110 may include one or more interfaces. The interfaces may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, etc.
The wireless communication function of the electronic device 100 can be implemented through antenna 1, antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor, the baseband processor, and so on.

Antenna 1 and antenna 2 are used to transmit and receive electromagnetic wave signals. Each antenna in the electronic device 100 may cover one or more communication frequency bands, and different antennas may also be multiplexed to improve antenna utilization; for example, antenna 1 may be multiplexed as a diversity antenna of the wireless local area network. In other embodiments, an antenna may be used in combination with a tuning switch.

The mobile communication module 150 can provide solutions for wireless communication including 2G/3G/4G/5G applied on the electronic device 100. The mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (LNA), and the like. The mobile communication module 150 can receive electromagnetic waves through antenna 1, filter and amplify the received electromagnetic waves, and pass them to the modem processor for demodulation. The mobile communication module 150 can also amplify signals modulated by the modem processor and radiate them as electromagnetic waves through antenna 1. In some embodiments, at least some functional modules of the mobile communication module 150 may be provided in the processor 110; in some embodiments, at least some functional modules of the mobile communication module 150 may be provided in the same device as at least some modules of the processor 110.

The modem processor may include a modulator and a demodulator. The modulator modulates the low-frequency baseband signal to be sent into a medium/high-frequency signal; the demodulator demodulates received electromagnetic wave signals into low-frequency baseband signals and then passes them to the baseband processor for processing. After being processed by the baseband processor, the low-frequency baseband signal is passed to the application processor, which outputs sound signals through audio devices (not limited to the speaker 170A and the receiver 170B) or displays images or video through the display 194. In some embodiments, the modem processor may be a separate device; in other embodiments, the modem processor may be independent of the processor 110 and provided in the same device as the mobile communication module 150 or other functional modules.

The wireless communication module 160 can provide solutions for wireless communication applied on the electronic device 100, including wireless local area networks (WLAN) (such as wireless fidelity (Wi-Fi) networks), Bluetooth (BT), global navigation satellite systems (GNSS), frequency modulation (FM), near field communication (NFC), and infrared (IR). The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via antenna 2, frequency-modulates and filters the signals, and sends the processed signals to the processor 110. The wireless communication module 160 can also receive signals to be sent from the processor 110, frequency-modulate and amplify them, and radiate them as electromagnetic waves through antenna 2.

In some embodiments, antenna 1 of the electronic device 100 is coupled to the mobile communication module 150 and antenna 2 to the wireless communication module 160, so that the electronic device 100 can communicate with networks and other devices through wireless communication technologies. The wireless communication technologies may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR. The GNSS may include the global positioning system (GPS), the global navigation satellite system (GLONASS), the BeiDou navigation satellite system (BDS), the quasi-zenith satellite system (QZSS), and/or satellite based augmentation systems (SBAS).
The audio module 170 is used to convert digital audio information into an analog audio signal output, and also to convert analog audio input into a digital audio signal.

The speaker 170A, also called a "loudspeaker", converts audio electrical signals into sound signals.

The receiver 170B, also called an "earpiece", converts audio electrical signals into sound signals.

The microphone 170C, also called a "mic" or "mike", converts sound signals into electrical signals. The electronic device 100 may be provided with at least one microphone 170C.

The headset jack 170D is used to connect wired headsets.

The sensor module 180 may include one or more sensors, which may be of the same or different types, such as a pressure sensor, a gyroscope sensor, an acceleration sensor, a distance sensor, a proximity light sensor, a fingerprint sensor, a touch sensor, an ambient light sensor, and so on.

The keys 190 include a power key, volume keys, and the like. The keys 190 may be mechanical keys or touch keys. The electronic device 100 can receive key input and generate key signal input related to the user settings and function control of the electronic device 100.
The display 194 is used to display images, video, and the like. The display 194 includes a display panel, which may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), Mini LED, Micro LED, Micro-OLED, quantum dot light-emitting diodes (QLED), or the like. In some embodiments, the electronic device 100 may include 1 or N displays 194, where N is a positive integer greater than 1.

The electronic device 100 implements the display function through the GPU, the display 194, the application processor, and so on. The GPU is a microprocessor for image processing, connecting the display 194 and the application processor. The GPU performs mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or change display information.

The electronic device 100 can implement the shooting function through the ISP, the camera 193, the video codec, the GPU, the display 194, the application processor, and so on.

The digital signal processor is used to process digital signals; besides digital image signals, it can process other digital signals. For example, when the electronic device 100 selects a frequency point, the digital signal processor performs a Fourier transform on the frequency-point energy. The video codec is used to compress or decompress digital video. The NPU is a neural-network (NN) computing processor that processes input information quickly by drawing on the structure of biological neural networks, for example the transfer pattern between neurons in the human brain, and can also learn continuously.
The external memory interface 120 can be used to connect an external memory card, such as a Micro SD card, to extend the storage capability of the electronic device 100. The external memory card communicates with the processor 110 through the external memory interface 120 to implement the data storage function, for example saving music, video, and other files in the external memory card.

The internal memory 121 can be used to store computer-executable program code, which includes instructions. By running the instructions stored in the internal memory 121, the processor 110 executes the various functional applications and data processing of the electronic device 100. The internal memory 121 may include a program storage area and a data storage area. The program storage area can store the operating system and the applications required for at least one function (such as a sound playback function or an image/video playback function); the data storage area can store data created during the use of the electronic device 100 (such as audio data or a phone book).

In some embodiments, the internal memory 121 may store code related to automatic exposure control (AEC) and to the exposure mode switching method. By running the AEC-related code, the processor can implement the automatic exposure control process; by running the code related to the exposure mode switching method, it can switch the exposure mode of the image sensor in the camera.

In some embodiments, the AEC module may include an AEC algorithm module and an AEC statistics module. The AEC statistics module statistically analyzes the parameters of the captured images, such as image brightness. The AEC algorithm module can automatically adjust the camera's exposure parameters based on the statistics, and can also estimate ambient brightness from them.
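By way of illustration, a minimal sketch of such an AE loop follows; the normalized target luma of 0.18 and the per-frame step clamp are illustrative assumptions, not values from this application:

```python
import numpy as np

def mean_luma(raw: np.ndarray) -> float:
    """AEC statistics sketch: average brightness of a normalized RAW frame."""
    return float(raw.mean())

def next_exposure(exp_ms: float, luma: float, target: float = 0.18,
                  max_step: float = 2.0) -> float:
    """Move exposure toward the target luma, clamping the per-frame step
    to avoid oscillation."""
    ratio = target / max(luma, 1e-6)
    ratio = min(max(ratio, 1.0 / max_step), max_step)
    return exp_ms * ratio
```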
The electronic device 100 can implement the image acquisition function through the ISP, the camera 193, the video codec, the GPU, the display 194, the application processor, and so on.

The ISP processes the data fed back by the camera 193. For example, when taking a photo, the shutter opens, light is passed through the lens onto the camera's photosensitive element, the light signal is converted into an electrical signal, and the photosensitive element passes the electrical signal to the ISP for processing, converting it into an image or video visible to the naked eye. The ISP can also algorithmically optimize the noise, brightness, and skin tone of the image, and can optimize parameters such as the exposure and color temperature of the shooting scene. In some embodiments, the ISP may be provided in the camera 193.

The camera 193 captures still images or video. An object passes through the lens to form an optical image projected onto the photosensitive element, which may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The photosensitive element converts the light signal into an electrical signal and then passes it to the ISP to be converted into a digital image or video signal. The ISP outputs the digital image or video signal to the DSP for processing, and the DSP converts it into a standard image or video signal in RGB, YUV, or another format.

In some embodiments, the electronic device 100 may include 1 or N cameras 193, where N is a positive integer greater than 1. For example, in some embodiments the electronic device 100 can use the N cameras 193 to acquire images at multiple exposure coefficients; then, in video post-processing, the electronic device 100 can synthesize an HDR image from those images through high dynamic range (HDR) technology.

The video codec is used to compress or decompress digital video. The electronic device 100 may support one or more video codecs, so that it can play or record video in multiple encoding formats, such as Moving Picture Experts Group (MPEG) 1, MPEG2, MPEG3, and MPEG4.

The NPU is a neural-network (NN) computing processor that processes input information quickly by drawing on the structure of biological neural networks, such as the transfer pattern between neurons in the human brain, and can also learn continuously. Applications such as intelligent cognition of the electronic device 100 can be implemented through the NPU, for example image recognition, face recognition, speech recognition, and text understanding.

The SIM card interface 195 is used to connect a SIM card. A SIM card can be inserted into or pulled out of the SIM card interface 195 to contact or separate from the electronic device 100. The electronic device 100 may support 1 or N SIM card interfaces, where N is a positive integer greater than 1. The electronic device 100 interacts with the network through the SIM card to implement functions such as calls and data communication. In some embodiments, the electronic device 100 uses an eSIM, i.e., an embedded SIM card; the eSIM card can be embedded in the electronic device 100 and cannot be separated from it.
By way of example, the software system of the electronic device 100 may use a layered architecture, an event-driven architecture, a microkernel architecture, a microservices architecture, or a cloud architecture. The embodiments of the present invention take the Android system with a layered architecture as an example to illustrate the software structure of the electronic device 100.

By way of example, FIG. 4 is a block diagram of a software structure of the electronic device 100 according to an embodiment of this application.

In some embodiments, the layered architecture divides the software into several layers, each with a clear role and division of labor; the layers communicate with one another through software interfaces. In the embodiments of this application, specifically, the system may include, from top to bottom, an application layer, a hardware abstraction layer, and a kernel layer. Between the application layer and the hardware abstraction layer there may also be an application framework layer, system libraries, a runtime, and so on.

The application layer may include a series of application packages. As shown in FIG. 4, the application packages may include applications (or apps) such as Camera, Gallery, Music, Video, and Phone.

The hardware abstraction layer shields the differences between hardware devices and provides standard interfaces for the system. As shown in FIG. 4, the hardware abstraction layer transfers data to the kernel layer through standard HAL interfaces and accepts data uploaded by the kernel layer. The hardware abstraction layer may contain multiple library modules, each implementing a set of interfaces for a specific type of hardware component, for example a wireless fidelity (Wi-Fi)/Bluetooth module or a camera module. When an application framework layer application programming interface requests access to device hardware, the system loads the corresponding library module for that hardware.

In some embodiments, the hardware abstraction layer may also include a camera module. The camera module may include an automatic exposure control (AEC) module and an image processing pipeline. The AEC module can be used to implement automatic exposure control; specifically, the AEC module can obtain exposure parameters from the system configuration file and configure them on the image sensor, and the image sensor captures images according to those exposure parameters. The AEC statistics module within the AEC module can statistically analyze the parameters of the captured images, such as image brightness. The AEC module can also estimate the dynamic range compression gain (ADRC gain) from the exposure parameters used during capture and the image brightness. The exposure mode control module can be used to control the exposure mode of the camera's image sensor according to the estimated ambient brightness. Specifically, in a bright scene the image sensor can be controlled to work in a low-sensitivity mode, thereby lengthening the exposure time so that it meets the exposure-time requirements of video shooting.

The kernel layer is the layer between hardware and software. The kernel layer contains at least the display driver, the camera driver, the sensor driver, the sensor front-end processor, the image-processor front end, the image-processor back end, and so on. The exposure mode control module described above can control the exposure mode of the camera's image sensor through the camera driver, and the AEC module can configure exposure parameters to the camera through the camera driver.

The following describes the workflow of the software and hardware of the electronic device 100 by way of example with a video recording scenario.

When the touch sensor receives a touch operation, the corresponding hardware interrupt is sent to the kernel layer, which processes the touch operation into a raw input event (including information such as the touch coordinates and the timestamp of the touch operation). The raw input event is stored in the kernel layer. The application framework layer obtains the raw input event from the kernel layer and identifies the control corresponding to the input event. Taking the touch operation being a tap and the tapped control being the camera application icon as an example, the camera application calls the interface of the application framework layer to start the camera application, which then starts the camera driver through the kernel layer, captures video images through the camera 193, and the display 194 shows the camera preview interface with the captured video images.
It should be understood that because dynamic range, flicker, ambient brightness, and the like all affect image quality, different scenes require an HDR scheme adapted to the actual shooting conditions for image exposure and processing. To effectively extend the dynamic range in recording scenarios and obtain high-quality video pictures, the video processing method provided in the embodiments of this application establishes correspondences between specific influencing factors and exposure schemes, and then switches seamlessly among multiple HDR schemes during video shooting according to the actual state of those factors, thereby achieving processing adapted to the factors that influence high dynamic range and effectively extending the dynamic range in recording scenarios.

For a better understanding of the video processing method provided in the embodiments of this application, the differences among the exposure schemes involved in this application, and the situations in which each is used, are introduced below.

By way of example, FIG. 5 is a timing diagram of image capture through the SHDR, binning, and DXG exposure schemes according to an embodiment of this application.

SHDR obtains long and short frames by changing the shutter speed (i.e., the exposure conditions), which saves computing power in subsequent image readout and processing; however, with longer exposure times SHDR tends to produce ghosting, so it is suitable for scenes with short exposure times (high ambient brightness).

The binning exposure scheme is a single-frame exposure. Compared with the DXG scheme its exposure time is longer and may, for example, be longer than the long exposure in SHDR. Taking the schemes shown in FIG. 5 as an example, binning may have the longest exposure time; the specific value of the exposure time can be set as needed and is not limited in the embodiments of this application.

In the DXG exposure scheme, two gain signal paths (a conversion-gain signal and an analog-gain signal) are exposed simultaneously, yielding two frames at different gains (conversion gain or analog gain). Although acquiring the images is relatively complex, this scheme does not cause ghosting and has a short exposure time, so it can be applied to scenes of medium and low ambient brightness.

Combining the characteristics of the exposure schemes above: for scenes with a small dynamic range, where the light/dark difference in the image is small, the binning single-frame exposure can be used. For scenes with a large dynamic range, it can further be determined whether flicker is present in the current scene. If flicker is present, the binning scheme is used, because its longer exposure time effectively overcomes the flicker caused by the periodic variation of the AC mains. If no flicker is present, SHDR or DXG can be chosen further according to ambient brightness: SHDR when ambient brightness is high, and DXG when ambient brightness is low.
By way of example, Table 1 below shows the exposure schemes corresponding to different influencing factors:

Table 1

Dynamic range | Flicker | Ambient brightness | Exposure scheme
---|---|---|---
Medium/low | — | — | binning
High | Present | — | binning
High | Absent | High | SHDR
High | Absent | Medium/low | DXG

It should be noted that the influencing factors introduced above are examples only; in practice, the types of influencing factors can be set flexibly as needed. For instance, in some cases only the dynamic range and ambient brightness of the shooting scene may be considered while the flicker factor is ignored. This is not limited in the embodiments of this application.
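By way of illustration, the selection logic of Table 1 can be expressed as the following sketch; the function and its boolean simplifications are illustrative, not part of the claimed method:

```python
def pick_exposure_mode(high_dynamic: bool, flicker: bool, bright: bool) -> str:
    """Choose an exposure scheme following Table 1."""
    if not high_dynamic:
        return "binning"   # small light/dark spread: one long exposure suffices
    if flicker:
        return "binning"   # a long exposure averages out AC-mains flicker
    return "SHDR" if bright else "DXG"
```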
The video processing method provided in the embodiments of this application can be applied in many scenarios requiring HDR processing, and especially in HDR processing scenarios during video recording. By way of example, FIG. 6A to FIG. 6D are schematic diagrams of graphical user interfaces (GUI) that may be involved in implementing some of the video processing methods provided in the embodiments of this application. For ease of understanding, the GUI of a mobile phone is used as the example here.

It should be noted that in the embodiments of this application there are multiple ways to trigger the video processing procedure. For example, it can be triggered by a tap on a smart HDR switch control provided in the phone; or, when the phone powers on, the smart HDR function can be enabled by default, i.e., powering on the phone triggers the video processing procedure, and so on.

In one possible implementation, the smart HDR switch control can be provided in the phone's Settings application. For example, FIG. 6A is a schematic diagram of a settings interface including the smart HDR switch control. The settings interface 601 includes a shooting parameter setting area, a video parameter setting area, a general setting area, and so on. The shooting parameter setting area may specifically include a photo aspect-ratio setting bar, a voice-controlled shooting setting bar, a smile-capture setting bar, and the like; the video parameter setting area may specifically include a video resolution setting bar, a video frame-rate setting bar, a smart HDR setting bar, and the like; the general setting area may specifically include a reference-grid setting bar, a level setting bar (not shown), a timed-shooting setting bar, and the like (not shown). The smart HDR setting bar includes the smart HDR switch control; when the user enables the smart HDR function through this control, the phone can intelligently enable HDR mode according to the shooting scene while recording video.

In another possible implementation, the smart HDR switch control can also be provided in the phone's camera application. For example, FIG. 6B and FIG. 6C are schematic diagrams of GUIs that may be involved when the smart HDR function is enabled through the camera application. Specifically, after the phone receives the user's operation to open the camera application (such as a tap on the camera icon on the home screen or lock screen), it can display the camera main interface 603 shown in FIG. 6B. By way of example, the camera main interface 603 may include an image preview area, a mode setting area, a shooting control, and so on. The mode setting area 605 may include a viewfinder frame, an album icon, a shooting control 604, a rotation control, and the like.

The viewfinder frame is used to obtain the shooting preview image and display the preview in real time. The album icon provides quick entry to the album: after the phone detects that the user taps the album icon, the photos or videos already taken can be shown on the touchscreen. The shooting control 604 is used for taking photos or recording: after the phone detects that the user taps the shooting control, the phone performs the photo-taking operation and saves the photo; or, when the phone is in recording mode and the user taps the shooting control, the phone performs the recording operation and saves the recording. The camera rotation control is used to switch between the front and rear cameras.

In addition, the camera main interface 603 also includes function controls for setting the shooting mode, such as the aperture mode, night mode, portrait mode, video mode, photo mode, and More shown in FIG. 6B. As shown in FIG. 6C, More may also include a slow-motion mode, a panorama mode, a black-and-white art mode, a dual-view recording mode, a filter mode, a smart HDR mode, and so on.

In some embodiments, when recording video through the camera application, the user can input a specific operation on the smart HDR icon shown in FIG. 6C to enable the smart HDR function.

In some embodiments, when the smart HDR switch is on, artificial intelligence (AI) prompting can be supported in the recording preview scene (before recording starts). For example, in a bright, high-dynamic scene, the corresponding exposure scheme is the SHDR state, and the preview interface can display the prompt "Backlight HDR"; for another example, in a medium/low-brightness, high-dynamic scene, the corresponding exposure scheme is the DXG state, and when the ambient brightness falls below a certain threshold the preview interface can display the prompt "Night HDR".

It should be noted that the positions of the smart HDR switch control shown in FIG. 6A to FIG. 6C are examples only; in practice, the smart HDR switch control is not limited to the Settings application and the camera application. Moreover, the embodiments above only take the user enabling smart HDR on the phone as an example; in other embodiments there can be multiple ways to enable smart HDR. For example, the smart HDR function can be on by default when the phone powers on; for another example, the phone can enable it automatically when a preset trigger event is detected. In still other embodiments, if the electronic device does not support the smart HDR function (e.g., due to hardware limitations), the HDR switch control icon can be grayed out. This is not limited in this application.
In some embodiments, when the electronic device detects that scene brightness in the current shooting environment varies greatly and the smart HDR function is off, the user can also be prompted in multiple ways to enable the smart HDR function. For example, when the user shoots a cave against the light, the backlit portion has a high brightness level and the cave portion a low one, with a large difference between them; the device can then use artificial intelligence (AI) to prompt the user to enable smart HDR, for instance by displaying on the shooting interface the prompt shown in FIG. 6D: "The current scene has a large light/dark difference. Tap the smart HDR function under 'More' to improve image quality."

In some embodiments, when the smart HDR function is on, the phone can expose and process images during shooting according to the HDR scheme matched to the current actual shooting conditions, obtaining high-dynamic-range video pictures. With smart HDR on, the captured picture can present the details of both bright and dark regions more clearly, without details being blurred by excessive or insufficient brightness.

It is worth noting that the preset HDR scheme here refers to the HDR scheme determined by the phone according to factors such as the dynamic range, flicker state, and brightness corresponding to the actual shooting environment. Different HDR schemes differ mainly in their exposure schemes and image processing; in practice, the phone can switch seamlessly among the multiple supported exposure schemes according to the actual shooting conditions. As introduced above, the selectable exposure schemes in the embodiments of this application may include, for example, SHDR, DXG, and binning. It should be noted that the available exposure schemes may differ by scenario; for example, limited by the hardware performance of the device, a device that does not support the DXG exposure scheme may only have SHDR and binning to choose from.

In some embodiments, the underlying implementation controlling the smart HDR switch control may include: the HAL reports the multi-state switching video HDR capability, distinguishing products; the app configures the "smart HDR" switch according to the reported capability; the app confirms the switch state (on/off/grayed out) according to the mode and scene; the app issues commands to the HAL to enable or disable the HDR capability according to the switch state (when disabled, only the binning capability is used); and the HAL reports the AEC HDR state tag to support AI prompting and recommendation.
The principles of the different exposure schemes and the ways of enabling smart HDR were introduced above. For a better understanding of the video processing method provided in the embodiments of this application, the specific functional modules that execute the method and its underlying implementation are introduced below.

By way of example, FIG. 7 is a schematic diagram of the data flow interaction among the relevant modules when a video processing method provided in an embodiment of this application is implemented.

As shown in FIG. 7, the electronic device 100 includes an image sensor (camera sensor), an automatic exposure control (AEC) module, a sensor front-end processor (also called the fusion module), an image-processor front end, a multi-camera smooth switching module, and an image-processor back end including a local tone mapping (LTM) module, among others.

The image sensor, which can correspond to the camera 193 shown in FIG. 3, is mainly used to capture video images. In some embodiments, the image sensor captures image frames through exposure: the light reflected by the subject passes through the lens and converges on the image sensor, which converts the light signal into an analog electrical signal and transmits the analog electrical signal to the fusion module. The image sensor outputs the raw digital image it captures, i.e., a RAW image. In some embodiments, in the initial stage of shooting (when the camera is opened), the image sensor can capture images with a default exposure scheme; later, once the exposure scheme adapted to the current shooting conditions has been determined, the image sensor can shoot with the target scheme as directed by the AEC module. The default exposure scheme can, for example, be the binning exposure scheme, but this is not limited in the embodiments of this application.

In some embodiments, when images are captured with different exposure schemes, the image sensor may output different numbers of frames for the same picture at the same moment. For example, when shooting with the binning scheme, the image sensor outputs a single frame (such as the long-exposure frame in FIG. 1); when shooting with the DXG scheme, the image sensor outputs two frames.

Moreover, in the embodiments of this application, the image sensor can also perform image fusion; for example, in the DXG scheme, after acquiring two frames the image sensor can fuse them into one. The image sensor can also be used to adjust image parameters (such as image size and bit depth), for example adjusting the images acquired by different exposure schemes to a consistent target size, or adjusting their bit depths to a consistent target bit depth.

The sensor front-end processor (i.e., the fusion module) is used to fuse the RAW image frames captured by the image sensor, and can implement single-frame in/single-frame out, or two-frames in/single-frame out.

For example, SHDR can acquire two frames with different exposure times, so when shooting with SHDR, the initial images the image sensor feeds to the fusion module can be two frames at a certain frame rate (e.g., 30 fps) with different exposure times, i.e., dual-frame mode. After obtaining the initial images, the fusion module fuses the two frames with different exposure times into one and adjusts the fused image parameters to preset target parameters, for example converting the input's initial RAW10 format to the target RAW14 format.

For another example, DXG acquires two frames from a single exposure, so when shooting with DXG, one exposure of the image sensor can feed two initial frames to the fusion module, likewise in dual-frame mode. The two initial frames can be images read out with high and low conversion gain respectively, or images read out with high and low analog gain respectively, or images obtained by stacking the conversion-gain and analog-gain readouts. The initial input format can, for example, be RAW10 and the frame rate 30 fps. After obtaining the initial images, the fusion module fuses the two frames read out at different gains into one, and can adjust the fused image parameters to preset target parameters, for example converting the input's initial RAW10 format to the target RAW14 format.
For yet another example, the binning exposure scheme acquires a single frame. When shooting with binning, one exposure of the image sensor can feed a single initial frame to the fusion module, i.e., single-frame mode. After obtaining the initial image, the fusion module does not need to fuse it, but can adjust image parameters as required, for example converting the input's initial RAW10 format to the target RAW14 format.

It should be noted that the purpose of adjusting the format of the initial images is to unify the images acquired by different exposure schemes into the same format (size, bit depth, etc.), so that the picture size does not jump during later video playback.

In some embodiments, if the image sensor has already fused the images, a single frame can be input to the fusion module; in that case the fusion module needs no fusion processing and may leave the image size unchanged, outputting the single frame to the image-processor front end for further processing.

The image-processor front end performs front-end processing on the image to obtain an initial preview image. The image-processor front end may specifically include a statistics module (STATS) and a GTM module, the GTM module being used to globally brighten the dark parts of the image and improve image quality.
In some embodiments, the images output by the image-processor front end can take the following two transmission paths:

(1) One transmission path goes from the image-processor front end to the image-processor back end for back-end processing. The image-processor back end may include a local tone mapping (LTM) module, used to locally brighten the dark parts of the image. The back end then passes the back-end-processed image on to the perception module, which can derive the ambient brightness from the image and indicate the ambient brightness to the automatic exposure control (AEC) module; the AEC module then determines the exposure scheme from the ambient brightness and instructs the image sensor to switch exposure schemes. Alternatively, the perception module can derive the ambient brightness from the image, determine the exposure scheme from the ambient brightness, and then indicate the exposure scheme to the AEC module, which instructs the image sensor to switch.

In some embodiments, the perception module can derive the ambient brightness from the image by jointly considering the current state of the image sensor, the exposure parameters, and the luminance-channel information of the image frame.

It should be noted that although the AEC module can be used to control the exposure scheme, the system's AEC module by itself does not have the enabling conditions for choosing between the SHDR and DXG schemes; it needs the ambient-brightness detection result of the perception module. When the ambient brightness exceeds X, the SHDR scheme is used; when it falls below Y, the DXG scheme is used. X and Y can be set according to actual requirements, but to prevent ping-ponging a dual-threshold design can be used, i.e., X greater than Y, for example X = 500 lux and Y = 300 lux.
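By way of illustration, the dual-threshold behavior can be sketched as follows; the default values match the example above, and holding the current scheme inside the band is precisely what prevents the ping-ponging:

```python
def shdr_or_dxg(lux: float, current: str,
                x: float = 500.0, y: float = 300.0) -> str:
    """Dual-threshold (x > y) mode switch to avoid ping-ponging."""
    if lux > x:
        return "SHDR"
    if lux < y:
        return "DXG"
    return current  # within the band [y, x], hold the current scheme
```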
(2) The other transmission path goes from the image-processor front end to the multi-camera smooth switching module, which uses a multi-camera smooth switching algorithm (such as a spatial alignment transform (SAT) algorithm) to smoothly switch the picture from one camera to another. After performing the smooth switch, the multi-camera smooth switching module can transmit the processed image to the image-processor back end, which can locally brighten the dark parts of the image through its LTM module. The preview video images can then be output at a particular frame rate (e.g., 30 fps), or a video file can be generated and stored at a particular frame rate (e.g., 30 fps).

It should be understood that the front-end and back-end processing performed by the image-processor front end and back end can follow existing procedures and are not detailed here.

It should be noted that in the embodiments of this application, the smooth switching performed by the multi-camera smooth switching module is mainly used for switching between the images shot by an HDR camera and a non-HDR camera. Specifically, when there is a switch between an HDR camera and a non-HDR camera during video shooting, the multi-camera smooth switching module can be used to smoothly transition from the HDR camera's image to the non-HDR camera's image, or from the non-HDR camera's image to the HDR camera's image, i.e., to achieve a smooth switch between the images captured by the two different types of camera, ensuring fluent video playback and avoiding the picture jumps that a direct switch would cause.

It should also be noted that the HDR camera in the embodiments of this application may be the main camera and the non-HDR camera a secondary camera, such as a wide-angle camera. Optionally, the HDR and non-HDR cameras mentioned in the embodiments of this application need not be two distinct cameras; they can refer to a camera using HDR mode and a camera not using HDR mode. For example, a given camera is regarded as an HDR camera when it captures images with an HDR scheme, and as a non-HDR camera when it does not.
By way of example, FIG. 8 is a schematic flowchart of a video processing method according to an embodiment of this application. The method can be executed by the electronic device 100 introduced above, specifically by the functional modules within the electronic device, and may include the following steps:

S801: The AEC module evaluates the dynamic range information and obtains a first result.

The dynamic range information here may be the dynamic range or the dynamic range compression gain; the dynamic range compression gain can be computed from the dynamic range of the current image and a preset standard dynamic range.

In some embodiments, before step S801 is performed, the video processing method provided in the embodiments of this application may further include: upon receiving the user's first operation for starting shooting, in response to that operation, the AEC module of the electronic device can instruct the image sensor to capture images with the default exposure scheme; after the AEC module obtains the images captured by the image sensor, it can derive the dynamic range of the current image from the histogram. The specific procedure for deriving the dynamic range of the current image from the histogram can follow existing practice and is not detailed in the embodiments of this application.

In some embodiments, as the image sensor captures more and more frames during video shooting, the AEC module can derive the current dynamic range from multiple frames, for example an average dynamic range from the per-frame histograms.
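By way of illustration, one common histogram-based estimate is the percentile spread in stops; this is a sketch only, since the embodiment does not fix a particular formula, and the percentile choices are assumptions:

```python
import numpy as np

def dynamic_range_stops(raw: np.ndarray,
                        lo_pct: float = 1.0, hi_pct: float = 99.0) -> float:
    """Estimate scene dynamic range (in stops) from RAW pixel statistics."""
    lo = np.percentile(raw, lo_pct) + 1e-6  # guard against log of zero
    hi = np.percentile(raw, hi_pct) + 1e-6
    return float(np.log2(hi / lo))
```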
S802: Determine whether the first result exceeds a preset first threshold.

The first threshold can be set flexibly according to the actual situation and is not limited in the embodiments of this application.

In some embodiments, the level of the current dynamic range can be determined from the relation between the first result and the first threshold. For example, when the first result exceeds the first threshold, the light/dark difference in the current picture is large, and the current dynamic range can be determined to be high; when the first result does not exceed the first threshold, the light/dark difference is small, and the current dynamic range can be determined to be medium/low.

It should be noted that different dynamic ranges can correspond to different exposure schemes: with a medium/low dynamic range, the exposure scheme is determined to be binning, while with a high dynamic range, the exposure scheme must be determined further from the flicker condition, the ambient brightness, and the preset decision procedure.

S803: When the first result does not exceed the first threshold, the AEC module instructs the image sensor to use the binning exposure scheme.

S804: When the first result exceeds the first threshold, the AEC module determines whether the image exhibits flicker and obtains a second result.

S805: When the second result indicates that flicker is present, the AEC module instructs the image sensor to use the binning exposure scheme.

In some embodiments, because the devices all run on AC mains, the captured picture may flicker during video shooting. Whether to use the binning scheme can be determined from the flicker result: when flicker is present, binning is used; when flicker is absent, SHDR or DXG can be chosen further according to the ambient brightness.

It should be noted that the exposure time of the binning scheme in the embodiments of this application can be greater than or equal to the flicker period of the AC-powered lighting. Since lamp intensity peaks twice per AC cycle, the flicker period is half the mains period; for example, with a 50 Hz mains frequency, the binning exposure time can be 10 ms.
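By way of illustration, the flicker-safe exposure floor can be computed as follows; the halving of the mains period reflects the physics noted above, and the function itself is illustrative:

```python
def flicker_safe_exposure_ms(mains_hz: float) -> float:
    """Shortest exposure that averages one full light-flicker cycle.

    Lamp intensity peaks twice per AC cycle, so the flicker period is
    half the mains period: 50 Hz mains -> 10 ms, 60 Hz -> ~8.33 ms.
    """
    return 1000.0 / (2.0 * mains_hz)
```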
S806: When the second result indicates that no flicker is present, the perception module evaluates the ambient brightness.

In some embodiments, the perception module can obtain the images captured by the image sensor and derive the ambient brightness from them.

S807: When the ambient brightness exceeds a second threshold, the AEC module instructs the image sensor to use the SHDR exposure scheme.

The second threshold can be set flexibly according to the actual situation and is not limited in the embodiments of this application.

S808: When the ambient brightness is below a third threshold and above a fourth threshold, the AEC module instructs the image sensor to use the DXG exposure scheme.

The third threshold is smaller than the second threshold. The third and fourth thresholds can be set flexibly according to the actual situation and are not limited in the embodiments of this application.

S809: When the ambient brightness is below the fourth threshold, the AEC module instructs the image sensor to use the binning exposure scheme.

According to the video processing method provided in the embodiments of this application, multiple types of HDR processing schemes are switched seamlessly in response to changes in factors such as ambient brightness, the required dynamic range, and flicker detection, so that image processing uses the HDR scheme adapted to the actual shooting conditions and image-quality requirements, effectively extending the dynamic range in recording scenarios and improving image quality in those scenarios.
It should be noted that in other embodiments, if the electronic device's image sensor does not support certain exposure schemes (such as DXG), the video processing method provided in the embodiments of this application can also switch between the SHDR and binning exposure schemes; the exposure schemes corresponding to the different influencing factors in that case can be as shown in Table 2.

Table 2
Alternatively, in still other embodiments, if the electronic device's image sensor does not support certain exposure schemes (such as SHDR), the video processing method provided in the embodiments of this application can also switch between the DCG and binning exposure schemes; the exposure schemes corresponding to the different influencing factors in that case can be as shown in Table 3.

Table 3
When switching between different exposure schemes, a seamless switching effect can be achieved. The specific means of achieving the seamless switching effect are described in detail below and not elaborated here.
By way of example, FIG. 9 is a schematic flowchart of another video processing method according to an embodiment of this application. FIG. 9 shows the interaction among the functional modules, which may specifically include the following steps:

S901: The automatic exposure control module sends first indication information to the image sensor.

In some embodiments, the automatic exposure control module can preset a default exposure scheme, for example binning.

In some embodiments, before step S901 is performed, the video processing method provided in the embodiments of this application may further include: upon receiving the user's first operation for starting shooting, in response to that operation, the AEC module of the electronic device can instruct the image sensor to capture images with the default exposure scheme. By way of example, the first operation can be used to start video recording, for instance a tap on the video-recording control in the camera main interface shown in FIG. 6C.

S902: In response to the first indication information, the image sensor captures images with the default exposure scheme.

It should be noted that in this embodiment the image sensor is only used to capture raw images, such as images in RAW10 format, and neither fuses images nor adjusts image size.

S903: The image sensor sends a first image to the fusion module.

The first image may be a raw image captured by the image sensor, such as an image in RAW10 format.

Optionally, when the default exposure scheme is one that captures dual-frame-mode images, such as SHDR or DXG, step S904 can also be performed, i.e., the fusion module performs multi-frame fusion on the first image. When the image sensor captures single-frame-mode images with the binning scheme, the input to the fusion module is in single-frame mode; the fusion module then needs no fusion processing and can merely adjust the image parameters.

S905: The fusion module sends a second image to the image-processor front end.

The second image may be the image after processing by the fusion module; the fusion module can fuse the obtained images and can also adjust the image size by bit padding.

S906: The image-processor front end sends a third image to the image-processor back end.

The third image may be the image obtained after the image-processor front end performs front-end processing on the second image.

In some embodiments, after front-end processing, the image-processor front end can output the third image along two transmission paths: one path transmits the third image to the multi-camera smooth switching module, which passes it to the image-processor back end; after receiving the image, the back end can perform step S907A below. The other path transmits the third image to the image-processor back end, which then sends the image to the perception module, i.e., performs step S907B below.
S907A: The image-processor back end performs back-end processing to obtain the preview video and the stored video file.

S907B: The image-processor back end sends a fourth image to the perception module.

The fourth image may be the image obtained after the image-processor back end performs back-end processing on the third image.

S908: The perception module derives the ambient brightness from the fourth image.

S909: The perception module sends the ambient brightness to the automatic exposure control module.

S910: The automatic exposure control module determines the exposure scheme according to preset influencing factors, which include the ambient brightness.

S911: The automatic exposure control module sends second indication information to the image sensor.

The second indication information instructs the image sensor to capture images with the target exposure scheme.

S912: In response to the second indication information, the image sensor continues with the default exposure scheme or switches to another exposure scheme.

After the image sensor captures raw images (such as RAW10 images) with the target exposure scheme, the exposure scheme can be determined in real time according to steps S903 to S911 above, and images can be captured and processed with the target exposure scheme.

According to the video processing method provided in the embodiments of this application, multiple types of HDR processing schemes are switched seamlessly in response to changes in factors such as ambient brightness, the required dynamic range, and flicker detection, so that image processing uses the HDR scheme adapted to the actual shooting environment and image-quality requirements, effectively extending the dynamic range in recording scenarios and improving image quality in those scenarios.
It should be noted that in the video processing method provided in the embodiments of this application, when the image sensor has the functions of fusing images and adjusting image size, the operations of fusing the multi-frame images of the DXG exposure scheme and adjusting the image size can be performed by the image sensor. The video processing procedure in this mode is introduced below with reference to the drawings.

By way of example, FIG. 10 is a schematic diagram of the data flow interaction among the relevant modules when another video processing method provided in an embodiment of this application is implemented.

In some embodiments, when video shooting starts, the automatic exposure control module can send exposure indication information to the image sensor, instructing it to capture images with the default exposure scheme (such as the binning scheme). The image sensor captures images (e.g., RAW10) according to the default scheme, fuses them (in the DXG or SHDR schemes) and adjusts image parameters (such as image size and bit depth), and then outputs the images (e.g., RAW14) to the fusion module. The fusion module passes the obtained images through to the image-processor front end, which performs front-end processing and outputs the images along two transmission paths. (1) On one path, the front end transmits the image to the image-processor back end, which can perform back-end processing and then transmit the image to the perception module; the perception module can derive the ambient brightness from the image and transmit it to the decision module, which determines the target exposure scheme from the ambient brightness, the target exposure scheme being the one matched to the current dynamic range, flicker state, ambient brightness, and other actual conditions; the decision module then sends exposure-scheme indication information to the image sensor, instructing it to switch to the target exposure scheme. (2) On the other path, the front end transmits the image to the multi-camera smooth switching module; optionally, that module can use the multi-camera smooth switching algorithm to smoothly switch the picture from one camera to another, after which it can transmit the processed image to the image-processor back end. The back end can then locally brighten the dark parts of the image through its LTM module, and afterwards output preview video images at a particular frame rate (e.g., 30 fps) or generate and store a video file at a particular frame rate (e.g., 30 fps).
It should be understood that the front-end and back-end processing performed by the image-processor front end and back end can follow existing procedures and are not detailed here.

It should be noted that in the embodiments of this application, the smooth switching performed by the multi-camera smooth switching module is mainly used for switching between the images shot by an HDR camera and a non-HDR camera. Specifically, when there is a switch between an HDR camera and a non-HDR camera during video shooting, the multi-camera smooth switching module can be used to smoothly transition from the HDR camera's image to the non-HDR camera's image, or from the non-HDR camera's image to the HDR camera's image, i.e., to achieve a smooth switch between the images captured by the two different types of camera, ensuring fluent video playback and avoiding the picture jumps that a direct switch would cause.

It should also be noted that the HDR camera in the embodiments of this application may be the main camera and the non-HDR camera a secondary camera, such as a wide-angle camera. Optionally, the HDR and non-HDR cameras mentioned in the embodiments of this application need not be two distinct cameras; they can refer to a camera using HDR mode and a camera not using HDR mode: a given camera is regarded as an HDR camera when it captures images with an HDR scheme and as a non-HDR camera when it does not.

According to the video processing method provided in the embodiments of this application, multiple types of HDR processing schemes are switched seamlessly in response to changes in factors such as ambient brightness, the required dynamic range, and flicker detection, so that image processing uses the HDR scheme adapted to the actual shooting environment and image-quality requirements, effectively extending the dynamic range in recording scenarios and improving image quality in those scenarios.
By way of example, FIG. 11 is a schematic flowchart of another video processing method according to an embodiment of this application. FIG. 11 shows the interaction among the functional modules, which may specifically include the following steps:

S1101: The automatic exposure control module sends first indication information to the image sensor.

In some embodiments, the automatic exposure control module can preset a default exposure scheme, for example binning.

In some embodiments, before step S1101 is performed, the video processing method provided in the embodiments of this application may further include: upon receiving the user's first operation for starting shooting, in response to that operation, the AEC module of the electronic device can instruct the image sensor to capture images with the default exposure scheme. By way of example, the first operation can be used to start video recording, for instance a tap on the video-recording control in the camera main interface shown in FIG. 6C.

S1102: In response to the first indication information, the image sensor captures images with the default exposure scheme and processes the images according to a preset policy.

It should be noted that, unlike the embodiment of FIG. 9 above, the image sensor in this embodiment can have the capabilities of fusing images and adjusting image size, and can process images according to the preset policy. For example, when the default exposure scheme acquires images in a dual-frame mode such as DCG, DAG, or DXG, the image sensor can fuse the two frames of the first image. When the image sensor captures images with the binning scheme, the input to the fusion module is in single-frame mode; the fusion module then needs no fusion processing and can merely adjust the image size.

By way of example, for ease of description, the embodiments of this application denote acquiring two frames through DXG inside the image sensor and fusing them as iDXG. Referring to FIG. 11, the way the image sensor fuses the dual frames can include: acquiring two frames at a preset sensitivity ratio and then fusing the two frames, obtaining an image with a larger dynamic range. Limited by the hardware performance of the image sensor, the sensitivity ratio of the two frames acquired through iDXG can be preset to a few fixed values, such as iDXG 1:4, iDXG 1:8, or iDXG 1:16.

Different switching conditions can correspond to different sensitivity ratios. For example: a high-dynamic scene can correspond to iDXG 1:4 (i.e., the iDXG exposure scheme at a 1:4 sensitivity ratio); an extremely high-dynamic scene can correspond to iDXG 1:16 (i.e., the iDXG exposure scheme at a 1:16 sensitivity ratio); a medium/low-dynamic scene can correspond to the binning exposure scheme; and a thermal-escape scene can correspond to the binning exposure scheme.

The extremely high, high, and medium/low dynamic ranges can be divided as required; this application does not specifically limit each dynamic range.
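By way of illustration, the mapping from switching condition to in-sensor mode can be sketched as follows; the scene labels are shorthand for the conditions above, and the function is illustrative:

```python
def sensor_mode(scene: str) -> str:
    """Fixed in-sensor choices per switching condition (ratios are preset)."""
    return {
        "extreme_dynamic": "iDXG 1:16",
        "high_dynamic":    "iDXG 1:4",
        "mid_low_dynamic": "binning",
        "thermal_escape":  "binning",
    }[scene]
```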
In contrast, in the embodiment of FIG. 9, when the fusion module fuses the dual frames, the sensitivity ratio of the two frames can be set flexibly and is not restricted to a few fixed values. The fusion approach of the FIG. 9 embodiment therefore makes it easy to adjust image brightness flexibly, while the fusion approach of the FIG. 11 embodiment can implement HDR processing of the image while preserving the image sensor's performance.
S1103: The image sensor sends a fifth image to the image-processor front end.

Specifically, the image sensor can first transmit the fifth image to the fusion module, which then passes the fifth image through to the image-processor front end.

S1104: The image-processor front end sends a sixth image to the image-processor back end.

The sixth image may be the image obtained after the image-processor front end performs front-end processing on the fifth image.

In some embodiments, after front-end processing, the image-processor front end can output the sixth image along two transmission paths: one path transmits the sixth image to the multi-camera smooth switching module, which passes it to the image-processor back end; after receiving the image, the back end can perform step S1105A below. The other path transmits the sixth image directly to the image-processor back end, which then sends the image to the perception module, i.e., performs step S1105B below.

S1105A: The image-processor back end performs back-end processing to obtain the preview video and the stored video file.

S1105B: The image-processor back end sends a seventh image to the perception module.

S1106: The perception module derives the ambient brightness from the seventh image.

S1107: The perception module sends the ambient brightness to the decision module.

It should be noted that the decision module here may be a separately provided module, or it may be a submodule of the automatic exposure control; this is not limited in the embodiments of this application.

S1108: The decision module determines the target exposure scheme according to the preset policy.

The preset policy refers to the policy of determining the target exposure scheme from the current dynamic range, the presence or absence of flicker, the ambient brightness, and so on; it can correspond to the policy for determining the target exposure scheme introduced in the embodiment of FIG. 8 above.

S1109: The decision module sends second indication information to the image sensor.

S1110: In response to the second indication information, the image sensor captures images with the target exposure scheme.

After the image sensor captures raw images (such as RAW10 images) with the target exposure scheme, the exposure scheme can be determined in real time according to steps S1103 to S1109 above, and images can be captured and processed with the target exposure scheme.
According to the video processing method provided in the embodiments of this application, multiple types of HDR processing schemes are switched seamlessly in response to changes in factors such as ambient brightness, the required dynamic range, and flicker detection, so that image processing uses the HDR scheme adapted to the actual shooting environment and image-quality requirements, effectively extending the dynamic range in recording scenarios and improving image quality in those scenarios.

In the video processing method provided in the embodiments of this application, the HDR mode and the non-HDR mode can be switched between during video shooting, where the HDR mode is the video shooting mode processed with HDR technology and the non-HDR mode the video shooting mode not processed with HDR technology. Generally, the main camera and the secondary camera of the electronic device are contained within one logical camera, where the main camera has HDR capability and the secondary camera may have other capabilities (such as ultra-wide-angle capability) but not HDR capability; therefore, when switching between HDR and non-HDR modes is needed during video shooting, it can be achieved by switching between the main and secondary cameras.
By way of example, FIG. 12 is a schematic flowchart of a video processing method in a scenario of switching between the main camera and the secondary camera according to an embodiment of this application.

When the main and secondary cameras switch, the image jump after switching can be mitigated in the following ways:

(1) In a low-dynamic scene, or in a scene with flicker or with the thermal-escape phenomenon, the exposure scheme used is binning; the system's dynamic range gain then stays level and the image shows no jump.

(2) In a high-dynamic scene without flicker or thermal escape, the main camera can capture images with the target exposure scheme determined in the manner introduced above, while the secondary camera uses the binning exposure scheme and keeps its maximum dynamic range, staying as close as possible to the main camera's dynamic range to mitigate image jumps.

Referring to FIG. 12, in some embodiments the main camera can capture images through multiple exposure schemes, including the SHDR, DXG, and binning schemes. After capturing, the main camera can transmit the initial images (e.g., RAW10 images) to the fusion module; when the input to the fusion module is in dual-frame mode, the fusion module fuses the two frames and transmits the fused image (a RAW14 image) to image-processor front end 0.

The secondary camera can capture images through the binning exposure scheme and then transmit the captured initial images (e.g., RAW images) to image-processor front end 1.

Image-processor front end 0 and image-processor front end 1 can transmit their respectively front-end-processed images to the multi-camera smooth switching module. That module can process the images of the different cameras according to the multi-camera smooth switching algorithm, achieving a smooth switch between the images captured by the two different types of camera, ensuring fluent video playback and avoiding the picture jumps that a direct switch would cause.

The multi-camera smooth switching module can then pass the image on to the image-processor back end, which can stabilize the image through the preview-stream stabilization modules (such as EIS 2.0 and EIS 3.0) and then further process it through the image affine transformation module, the color conversion matrix (CCM) module, and the gamma transformation (Gamma) module; the specific processing can follow existing procedures and principles and is not repeated here.

After the back-end processing above, preview images at a certain frame rate (e.g., 30 fps) and a video storage file at a certain frame rate (e.g., 30 fps) can be obtained.

According to the video processing method provided in the embodiments of this application, multiple types of HDR processing schemes are switched seamlessly in response to changes in factors such as ambient brightness, the required dynamic range, and flicker detection, so that image processing uses the HDR scheme adapted to the actual shooting environment and image-quality requirements, effectively extending the dynamic range in recording scenarios and improving image quality in those scenarios.
The video processing method provided in the embodiments of this application can support multiple HDR capabilities, for example the HDR10+ video capability, i.e., the electronic device can apply HDR10+ technology for video recording or playback. In some embodiments, the HDR10+ capability can act mainly on the image-processor back end, including adjusting the color conversion matrix module to BT.2020, adjusting the gamma transformation to the PQ transfer curve, and generating dynamic metadata. That is, HDR10+ technology can be used simultaneously with the seamless switching of the different exposure schemes; the pipeline for exposure-scheme switching with HDR10+ applied can be seen in FIG. 9 and is not detailed here.

In some embodiments, when the electronic device supports the multi-state HDR capability, a corresponding switch control can be provided. In addition, when the electronic device supports another HDR technology (taking HDR10+ as an example), a switch control corresponding to that HDR10+ can also be provided. For an electronic device that does not support multi-state HDR or HDR10+, the switch control corresponding to multi-state HDR or HDR10+ on that device can be grayed out.
The process of adjusting the image parameters corresponding to the different exposure schemes is introduced below with reference to the drawings, taking bit-depth adjustment as an example.

By way of example, FIG. 13 is a schematic diagram of a way of adjusting image parameters according to an embodiment of this application.

In some embodiments, the way of adjusting image parameters can include: zero-padding the low-order bits in the Pad0 fashion so that the images corresponding to different exposure schemes have the same bit depth. For example, as shown in FIG. 13, the image of the iDCG 1:4 exposure scheme is padded with zeros in its 2 low-order bits, where 1:4 is the sensitivity ratio between the two iDCG frames at different gains; the image of the binning exposure scheme is padded with zeros in its 4 low-order bits. After the low-order zero padding, the images acquired by the iDCG and binning schemes have the same bit depth.
By way of example, FIG. 14 is a schematic diagram of another way of adjusting image parameters according to an embodiment of this application.

In some embodiments, an image sensor configuration supporting seamless switching among multiple exposure schemes (such as the binning, DXG, and SHDR schemes) can unify the bit depth, for example to the bit depth of RAW14. Images whose bit depth is less than 14 bits can be unified to RAW14 by bit padding.

It should be noted that in the video processing method provided in the embodiments of this application, the unified bit depth can be based on the bit depth of the iDXG image, because among the exposure schemes provided in this application the image of the iDXG scheme has the largest bit depth. When other exposure schemes are used in practice, the unified bit depth can also be set flexibly according to the type of exposure scheme; this is not limited in the embodiments of this application.

Taking a unified bit depth of 14 bits as an example, the bit-depth adjustment can be as shown in FIG. 13 and includes: padding the image captured by binning exposure from 10 bits to 14 bits; padding the fused image of the two iDXG frames at a 1:4 sensitivity ratio from 12 bits to 14 bits; and the fused image of the two iDXG frames at a 1:16 sensitivity ratio is itself 14 bits, so no padding is needed.
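By way of illustration, the padding step amounts to a left shift of the RAW codes; a minimal sketch, assuming unsigned integer RAW data:

```python
import numpy as np

def pad_to_14bit(raw: np.ndarray, src_bits: int) -> np.ndarray:
    """Zero-pad low-order bits so every scheme outputs RAW14.

    binning RAW10 -> shift by 4; iDXG 1:4 RAW12 -> shift by 2;
    iDXG 1:16 RAW14 -> shift by 0 (already 14-bit).
    """
    return raw.astype(np.uint16) << (14 - src_bits)
```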
It should also be noted that in the video processing method provided in the embodiments of this application, different exposure schemes can correspond to different sensitivity ratios. It should be noted that a sensitivity ratio can be a combination of exposure time (exp) and system gain (gain); multiple system gains can be configured, for example gain = 1, gain = 4, gain = 16, and so on.

Suppose there are currently three exposure schemes, binning, iDXG 1:4, and iDXG 1:16, and the three schemes correspond to different exposure times and system gains; the sensitivity ratio corresponding to each scheme can then be obtained from the exposure time and system gain. For example, the principle of obtaining the sensitivity ratio can include: (1) when the system dynamic range compression gain is 3.99 (i.e., adrc gain = 3.99), the exposure scheme is the binning mode, the exposure time is a ms (exp = a ms), and the system gain is 1, the sensitivity ratio combining exposure time and system gain is 1; (2) when the system dynamic range compression gain is 4 (i.e., adrc gain = 4), the exposure scheme is iDXG 1:4, the exposure time is a ms (exp = a ms), and the system gain is 4, the combined sensitivity ratio is 4; (3) when the system dynamic range compression gain is 4 (i.e., adrc gain = 4), the exposure scheme is iDXG 1:16, the exposure time is a ms (exp = a ms), and the system gain is 16, the combined sensitivity ratio is 16. By way of example, the sensitivity ratio corresponding to each exposure scheme can be as shown in Table 4:

Table 4

Exposure scheme | ADRC gain | Exposure time | System gain | Sensitivity ratio
---|---|---|---|---
binning | 3.99 | a ms | 1 | 1
iDXG 1:4 | 4 | a ms | 4 | 4
iDXG 1:16 | 4 | a ms | 16 | 16
Under different exposure schemes, the AEC can instruct the image sensor to capture images with the gain corresponding to that scheme; images captured through different schemes may therefore correspond to different sensitivities. To prevent the images of different exposure schemes from exhibiting brightness jumps caused by differing sensitivities when the schemes switch, the video processing method provided in the embodiments of this application solves this problem by adjusting the sensitivity ratio corresponding to the exposure scheme.

By way of example, take the three exposure schemes and sensitivity ratios given in Table 4 above. In Table 4, the binning scheme corresponds to a sensitivity ratio of 1 and iDXG 1:4 to a sensitivity ratio of 4; when the exposure scheme switches from binning to iDXG 1:4, the iDXG 1:4 exposure can be increased to 4 times the original. Likewise, binning corresponds to a sensitivity ratio of 1 and iDXG 1:16 to 16; when switching from binning to iDXG 1:16, the iDXG 1:16 exposure can be increased to 16 times the original. Or, iDXG 1:4 corresponds to a sensitivity ratio of 4 and iDXG 1:16 to 16; when switching from iDXG 1:4 to iDXG 1:16, the iDXG 1:16 exposure can be increased to 4 times the original. It should be noted that the sensitivity ratios given in Table 4 are examples only; in practice, other specific values can correspond to the different exposure schemes, which is not limited in the embodiments of this application.

By adjusting the sensitivity of the images when different exposure schemes switch, brightness jumps can be avoided, so that when exposure schemes switch, the video picture also transitions smoothly, preserving image quality.
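By way of illustration, one reading of the Table 4 compensation is a simple ratio lookup; this is a sketch of that interpretation only, and the dictionary values mirror the example table:

```python
SENSITIVITY_RATIO = {"binning": 1, "iDXG 1:4": 4, "iDXG 1:16": 16}  # Table 4

def gain_compensation(old_mode: str, new_mode: str) -> float:
    """Factor applied to the new scheme's exposure at a switch so image
    brightness does not jump, e.g. binning -> iDXG 1:16 returns 16."""
    return SENSITIVITY_RATIO[new_mode] / SENSITIVITY_RATIO[old_mode]
```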
By way of example, FIG. 15 is a schematic flowchart of yet another video processing method according to an embodiment of this application. The method can include the following steps:

S1501: Receive a first operation input by the user, the first operation being used to enable the video recording function of the electronic device.

The first operation can correspond to the tap on the record control in FIG. 6C above; or, the first operation can also correspond to a tap that enables the smart HDR function.

In some embodiments, after the recording function is enabled, the electronic device can display a first interface, the first interface including a first control used to enable the function of automatically switching the HDR exposure scheme. The first interface can, for example, correspond to the interface shown in FIG. 6A or FIG. 6D.

S1502: In response to the first operation, obtain a first image according to the preset default high dynamic range (HDR) exposure scheme.

The exposure schemes in the embodiments of this application can include the binning, SHDR, and DXG exposure schemes. According to the number of image frames corresponding to each scheme, they can be divided into a first HDR exposure scheme and a second HDR exposure scheme: the first HDR exposure scheme is a single-frame mode, i.e., one frame can be obtained per exposure, and can be the binning mode; the second HDR exposure scheme is a dual-frame mode, i.e., two frames can be read out from one exposure, as in the DXG mode, or two frames are obtained through long and short exposures, as in the SHDR mode.

In some embodiments, the default HDR exposure scheme can be preset; upon receiving the first operation, the AEC module can instruct the image sensor to capture images with the default HDR exposure scheme. By way of example, the default HDR exposure scheme can be the binning mode.
S1503: Derive the ambient brightness from the first image, and determine the target HDR exposure scheme according to a preset policy, the preset policy including correspondences between the target HDR exposure scheme and the dynamic range information, flicker state, and ambient brightness corresponding to the video capture.

The dynamic range information here can include the dynamic range and/or the dynamic range compression gain; the embodiments of this application take the dynamic range as the example.

In some embodiments, determining the target HDR exposure scheme according to the preset policy can include: obtaining the dynamic range corresponding to the video capture; when the dynamic range is below a first threshold, determining the target HDR exposure scheme to be the binning mode; when the dynamic range is greater than or equal to the first threshold, detecting whether flicker is present; when flicker is present, determining the target HDR exposure scheme to be the binning mode; when flicker is absent, determining the target HDR exposure scheme from the ambient brightness, where: when the ambient brightness exceeds a second threshold, the target HDR exposure scheme is determined to be the SHDR mode; when the ambient brightness is below a third threshold and above a fourth threshold, the target HDR exposure scheme is determined to be the DXG mode, the third threshold being smaller than the second threshold; and when the ambient brightness is below the fourth threshold, the target HDR exposure scheme is determined to be the binning mode.

It should be noted that besides judging the exposure scheme from the dynamic range, it can be judged from the dynamic range compression gain. The procedure for judging the exposure scheme from the compression gain is similar to the above (the dynamic range can be replaced by the compression gain) and is not repeated. When the dynamic range is replaced by the compression gain, the specific value of the first threshold can also change accordingly, i.e., the first threshold for the dynamic range and that for the compression gain can differ.
In some embodiments, determining the target HDR exposure scheme according to the preset policy can further include: detecting whether the thermal-escape phenomenon is present during the video capture; and when the thermal-escape phenomenon is present, determining the target HDR exposure scheme to be the binning mode.

In some embodiments, when there is a switch between an HDR camera and a non-HDR camera during the video capture, the method further includes: at the switch, if the HDR camera's exposure scheme is the first HDR exposure scheme and the non-HDR camera's exposure scheme is the second HDR exposure scheme, adjusting the first dynamic range gain corresponding to the first HDR exposure scheme so that the first dynamic range gain is as close as possible to the second dynamic range gain corresponding to the second HDR exposure scheme.

S1504: When the target HDR exposure scheme differs from the default HDR exposure scheme, switch the default HDR exposure scheme to the target HDR exposure scheme, continue the video capture, and obtain a second image.

In some embodiments, when the target HDR exposure scheme is the second HDR exposure scheme, the dual-frame-mode images are fused.

In some embodiments, when the target HDR exposure scheme is the second HDR exposure scheme, the image sensor transmits the dual-frame-mode images to the fusion module, and the fusion module fuses the dual-frame-mode images; or, when the target HDR exposure scheme is the second HDR exposure scheme, the image sensor fuses the dual-frame-mode images.

In some embodiments, when the fusion module fuses the dual-frame-mode images, the target sensitivity ratios between the dual input frames of the DCG mode and the dual input frames of the DAG mode are determined separately according to the sensitivity ratio required by the DXG mode; the DCG-mode and DAG-mode dual input frames are then correspondingly stacked according to the target sensitivity ratios, yielding stacked dual input frames that satisfy the DXG-mode sensitivity ratio.
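By way of illustration, splitting a required DXG ratio into a DCG factor and a DAG factor can be sketched as follows; the candidate ratio sets are assumptions, since the text only fixes their product (e.g., a 1:16 target met as DCG 1:2 with DAG 1:8):

```python
def split_dxg_ratio(target: int,
                    dcg_ratios=(1, 2, 4),
                    dag_ratios=(1, 2, 4, 8)) -> tuple[int, int]:
    """Pick DCG and DAG ratios whose product equals the DXG target ratio."""
    for d in dcg_ratios:
        for a in dag_ratios:
            if d * a == target:
                return d, a
    raise ValueError(f"no DCG/DAG split for 1:{target}")

assert split_dxg_ratio(16) == (2, 8)  # 1:2 (DCG) stacked with 1:8 (DAG)
```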
In some embodiments, when the image sensor fuses the dual-frame-mode images, the DCG-mode and DAG-mode dual input frames are stacked according to preset sensitivity ratios; taking iDXG as an example, the sensitivity ratio can, for instance, be 1:4 or 1:16.

In some embodiments, for images acquired through different exposure schemes, the image parameters corresponding to those schemes can be made consistent. The specific procedure can include: presetting the target parameters corresponding to the images acquired through the video capture; adjusting the initial parameters corresponding to the first image acquired with the first HDR exposure scheme to the target parameters; and/or adjusting the initial parameters corresponding to the second image acquired with the second HDR exposure scheme to the target parameters.

In some embodiments, the electronic device supports the first HDR video mode, the first HDR video mode including HDR10 or HDR10+.

According to the video processing method provided in the embodiments of this application, multiple types of HDR processing schemes are switched seamlessly in response to changes in factors such as ambient brightness, the required dynamic range, and flicker detection, so that image processing uses the HDR scheme adapted to the actual shooting environment and image-quality requirements, effectively extending the dynamic range in recording scenarios and improving image quality in those scenarios.
Based on the same technical concept, an embodiment of this application further provides an electronic device including one or more processors and one or more memories; the one or more memories store one or more computer programs, the one or more computer programs including instructions that, when executed by the one or more processors, cause the computer or processor to perform one or more steps of any of the methods above.

Based on the same technical concept, an embodiment of this application further provides a computer-readable storage medium storing computer-executable program instructions that, when run on a computer, cause the computer or processor to perform one or more steps of any of the methods above.

Based on the same technical concept, an embodiment of this application further provides a computer program product containing instructions, the computer program product including computer program code that, when run on a computer, causes the computer or processor to perform one or more steps of any of the methods above.

The embodiments above may be implemented wholly or partly in software, hardware, firmware, or any combination thereof. When software is used for implementation, they may be implemented wholly or partly in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, the procedures or functions according to the embodiments of this application are produced wholly or partly. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted through the computer-readable storage medium. The computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center by wire (e.g., coaxial cable, optical fiber, digital subscriber line) or wirelessly (e.g., infrared, radio, microwave). The computer-readable storage medium may be any available medium accessible by a computer, or a data storage device such as a server or data center integrating one or more available media. The available media may be magnetic media (e.g., floppy disk, hard disk, magnetic tape), optical media (e.g., DVD), semiconductor media (e.g., a solid state disk (SSD)), and the like.

Those of ordinary skill in the art can understand that all or part of the procedures in the methods of the embodiments above can be completed by a computer program instructing the relevant hardware; the program can be stored in a computer-readable storage medium, and when executed may include the procedures of the method embodiments above. The aforementioned storage media include ROM, random access memory (RAM), magnetic disks, optical discs, and other media that can store program code.

The above are only specific implementations of the embodiments of this application, but the protection scope of the embodiments of this application is not limited thereto; any variation or replacement within the technical scope disclosed in the embodiments of this application shall be covered by the protection scope of the embodiments of this application. Therefore, the protection scope of the embodiments of this application shall be subject to the protection scope of the claims.
Claims (16)

- A video processing method, applied to an electronic device, the method comprising: receiving a first operation input by a user, the first operation being used to enable a video recording function of the electronic device; in response to the first operation, obtaining a first image according to a preset default high dynamic range (HDR) exposure scheme; deriving ambient brightness from the first image, and determining a target HDR exposure scheme according to a preset policy, the preset policy comprising correspondences between the target HDR exposure scheme and the dynamic range information, flicker state, and ambient brightness corresponding to the video capture; and when the target HDR exposure scheme differs from the default HDR exposure scheme, switching the default HDR exposure scheme to the target HDR exposure scheme, continuing the video capture, and obtaining a second image.
- The method according to claim 1, wherein the target HDR exposure scheme comprises at least a first HDR exposure scheme and a second HDR exposure scheme, the first HDR exposure scheme being a single-frame mode and the second HDR exposure scheme a dual-frame mode; and when the target HDR exposure scheme is the second HDR exposure scheme, fusing the images input in the dual-frame mode.
- The method according to claim 2, wherein the first HDR exposure scheme is a binning mode, and the second HDR exposure scheme comprises a staggered high dynamic range mode (SHDR) and DXG, DXG being a mode in which the dual conversion gain mode (DCG) and dual analog gain (DAG) are used stacked.
- The method according to claim 2 or 3, further comprising: presetting target parameters corresponding to the images acquired through the video capture; adjusting initial parameters corresponding to the first image acquired with the first HDR exposure scheme to the target parameters; and/or adjusting initial parameters corresponding to the second image acquired with the second HDR exposure scheme to the target parameters.
- The method according to any one of claims 1-4, wherein the electronic device comprises an automatic exposure control (AEC) module, an image sensor, and a perception module, and obtaining the first image according to the preset default HDR exposure scheme in response to the first operation specifically comprises: in response to the first operation, the AEC module sends first indication information to the image sensor, the first indication information instructing the image sensor to capture images with the default HDR exposure scheme; and in response to the first indication information, the image sensor uses the default HDR exposure scheme and obtains the first image.
- The method according to any one of claims 1-5, wherein the electronic device comprises an AEC module, an image sensor, and a perception module, and deriving the ambient brightness from the first image and determining the target HDR exposure scheme according to the preset policy specifically comprises: the image sensor sends the first image to the perception module; the perception module derives the ambient brightness from the first image and indicates the ambient brightness to the AEC module; and the AEC module determines the target HDR exposure scheme from the ambient brightness according to the preset policy.
- The method according to any one of claims 2-6, wherein the electronic device comprises an AEC module, an image sensor, a perception module, and a fusion module, and fusing the dual-frame input images when the target HDR exposure scheme is the second HDR exposure scheme specifically comprises: when the target HDR exposure scheme is the second HDR exposure scheme, the image sensor transmits the dual-frame-mode images to the fusion module, and the fusion module fuses the dual-frame-mode images; or, when the target HDR exposure scheme is the second HDR exposure scheme, the image sensor fuses the dual-frame-mode images.
- The method according to claim 7, wherein, when the fusion module fuses the dual-frame-mode images, target sensitivity ratios between the dual input frames of the DCG mode and the dual input frames of the DAG mode are determined separately according to the sensitivity ratio required by the DXG mode; and the DCG-mode and DAG-mode dual input frames are correspondingly stacked according to the target sensitivity ratios, yielding stacked dual input frames that satisfy the DXG-mode sensitivity ratio.
- The method according to claim 7, wherein, when the image sensor fuses the dual-frame-mode images, the DCG-mode and DAG-mode dual input frames are stacked according to preset sensitivity ratios.
- The method according to any one of claims 1-9, wherein determining the target HDR exposure scheme according to the preset policy specifically comprises: obtaining the dynamic range corresponding to the video capture; when the dynamic range information is below a first threshold, determining the target HDR exposure scheme to be the binning mode; when the dynamic range information is greater than or equal to the first threshold, detecting whether flicker is present; when flicker is present, determining the target HDR exposure scheme to be the binning mode; and when flicker is absent, determining the target HDR exposure scheme from the ambient brightness, wherein: when the ambient brightness exceeds a second threshold, the target HDR exposure scheme is determined to be the SHDR mode; when the ambient brightness is below a third threshold and above a fourth threshold, the target HDR exposure scheme is determined to be the DXG mode, the third threshold being smaller than the second threshold; and when the ambient brightness is below the fourth threshold, the target HDR exposure scheme is determined to be the binning mode.
- The method according to claim 10, further comprising: detecting whether a thermal-escape phenomenon is present during the video capture; and when the thermal-escape phenomenon is present, determining the target HDR exposure scheme to be the binning mode.
- The method according to any one of claims 1-11, wherein the electronic device supports the first HDR video mode, the first HDR video mode comprising HDR10 or HDR10+.
- The method according to any one of claims 3-12, wherein, when there is a switch between an HDR camera and a non-HDR camera during the video capture, the method further comprises: at the switch, if the exposure scheme of the HDR camera is the first HDR exposure scheme and the exposure scheme corresponding to the non-HDR camera is the second HDR exposure scheme, adjusting the first dynamic range gain corresponding to the first HDR exposure scheme so that the first dynamic range gain is as close as possible to the second dynamic range gain corresponding to the second HDR exposure scheme.
- The method according to any one of claims 1-13, further comprising: displaying a first interface, the first interface comprising a first control used to enable the function of automatically switching the HDR exposure scheme.
- An electronic device, comprising: one or more processors; and one or more memories, the one or more memories storing one or more computer programs, the one or more computer programs comprising instructions that, when executed by the one or more processors, cause the electronic device to perform the method according to any one of claims 1 to 14.
- A computer-readable storage medium, storing computer-executable program instructions that, when run on a computer, cause the computer to perform the method according to any one of claims 1 to 14.