CN116614714A - Real exposure correction method and system guided by perception characteristics of camera - Google Patents

Real exposure correction method and system guided by perception characteristics of camera Download PDF

Info

Publication number
CN116614714A
CN116614714A (application CN202310348986.0A)
Authority
CN
China
Prior art keywords
network
exposure
correction
channel
exposure correction
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310348986.0A
Other languages
Chinese (zh)
Inventor
付莹
张涛
张军
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Institute of Technology BIT
Original Assignee
Beijing Institute of Technology BIT
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Institute of Technology BIT filed Critical Beijing Institute of Technology BIT
Priority to CN202310348986.0A priority Critical patent/CN116614714A/en
Publication of CN116614714A publication Critical patent/CN116614714A/en
Pending legal-status Critical Current

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70Circuitry for compensating brightness variation in the scene
    • H04N23/74Circuitry for compensating brightness variation in the scene by influencing the scene brightness using illuminating means
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/045Combinations of networks
    • G06N3/0455Auto-encoder networks; Encoder-decoder networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/0464Convolutional networks [CNN, ConvNet]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/048Activation functions
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods
    • G06N3/096Transfer learning
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/67Focus control based on electronic image sensor signals
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70Circuitry for compensating brightness variation in the scene
    • H04N23/73Circuitry for compensating brightness variation in the scene by influencing the exposure time
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70Circuitry for compensating brightness variation in the scene
    • H04N23/75Circuitry for compensating brightness variation in the scene by influencing optical camera components

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • General Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • Signal Processing (AREA)
  • Evolutionary Computation (AREA)
  • Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Multimedia (AREA)
  • Exposure Control For Cameras (AREA)

Abstract

The invention discloses a real exposure correction method and system guided by the perception characteristics of a camera, belonging to the technical field of computational photography and image processing. The method performs exposure correction on real data: a channel-guided convolutional neural network is designed according to the perception characteristics of the camera, and a real paired data acquisition system is built. Meanwhile, a knowledge distillation strategy for exposure correction is designed. The acquisition system collects a real paired exposure correction dataset, and the collected dataset is used to train the exposure correction neural network, so that exposure correction of real images can be completed with high quality and the correction quality of real images is improved.

Description

Real exposure correction method and system guided by perception characteristics of camera
Technical Field
The invention relates to a real exposure correction method and system guided by the perception characteristics of a camera, and belongs to the technical field of computational photography and image processing.
Background
Exposure affects the brightness and quality of captured images. Modern digital cameras can control exposure automatically via auto-exposure, or the user can set it manually to obtain images of suitable quality. However, several important factors lead to exposure errors, including through-the-lens metering errors, user mistakes in manual mode, brightness variations, and difficult lighting conditions. Exposure errors occur early in image capture and greatly reduce the contrast and visual quality of the final image.
To solve the above problems, many model-driven exposure correction techniques have been proposed, such as histogram-based methods and Retinex-theory-based methods. In recent years, data-driven methods using deep learning have achieved better performance; such methods learn a nonlinear mapping between wrongly exposed and well-exposed images. However, most methods handle only overexposure or only underexposure, limiting their practical application in correcting arbitrary exposures. More and more research focuses on solving exposure correction with a single end-to-end network. However, these methods either ignore that the two tasks require different prior knowledge, or merely design a simple module to implicitly learn the priors for overexposure and underexposure correction. Since the overexposure and underexposure correction processes differ significantly, a single set of shared priors serves both tasks poorly. Thus, correcting both kinds of unreasonable exposure simultaneously with one network remains a challenge. Currently, most digital camera sensors are designed with higher sensitivity in the green channel than in the red and blue channels, because the human eye is more sensitive to green than to other colors. This camera perception characteristic means that the red and blue channels are more advantageous for overexposure correction, while the green channel is more advantageous for underexposure correction.
In addition, most existing methods perform exposure correction on 8-bit standard color (sRGB) images, which are generated from RAW data by a highly nonlinear pipeline, the camera image signal processor (ISP). Due to the compression and quantization operations of the ISP, sRGB images suffer information loss, creating a potential bottleneck for exposure correction. In fact, previous studies have shown that RAW data contains richer information, has a higher bit depth, and maintains a linear relationship between exposure and scene brightness, which benefits image restoration. However, a real paired RAW-domain exposure correction dataset is still lacking.
Disclosure of Invention
The invention aims to solve the technical problems that no real paired exposure correction dataset exists in the prior art and that existing methods cannot effectively handle overexposure correction and underexposure correction at the same time. It creatively provides a real exposure correction method and system guided by the perception characteristics of a camera for exposure correction of real data: a channel-guided convolutional neural network is designed according to the perception characteristics of the camera, a real paired data acquisition system is built, a high-quality real paired exposure correction dataset is collected, and an accurate channel-guided exposure correction network is trained, so that exposure correction of real images can be completed with high quality and imaging quality is improved.
According to the invention, the perception characteristics of the camera's different channels are analyzed, and a channel-guided exposure correction network is designed around the channels' different sensing characteristics. Meanwhile, a knowledge distillation strategy for exposure correction is further designed, a real paired exposure correction data acquisition system is provided, a dataset is collected, and the collected dataset is used to train the exposure correction neural network, thereby realizing high-precision image exposure correction and improving the correction quality of real images.
Advantageous effects
Compared with the prior art, the invention has the following advantages:
1. The invention builds a real data acquisition system and uses it to collect a dataset of real paired over/under exposure images and normal exposure images.
2. The invention designs a channel-guided exposure correction network according to the perception characteristics of the camera's different color channels and adopts a knowledge distillation training strategy, so that the prior knowledge required by different exposure correction tasks can be mined adaptively and modulation parameters generated adaptively, effectively improving the network's modeling capability on both overexposure and underexposure correction tasks.
3. The invention uses the mapping relation between over/under exposure images and normal exposure images learned by the convolutional neural network and, combined with beneficial effects 1 and 2, improves the generalization of the convolutional neural network, improves the quality of real image exposure correction, and ensures the fidelity of the corrected image.
Drawings
FIG. 1 is a general flow chart of the method of the present invention;
FIG. 2 is a schematic diagram of the composition of the system of the present invention;
FIG. 3 is a schematic diagram of a camera having different sensing characteristics for different color channels;
FIG. 4 is a schematic diagram of a channel-guided exposure correction network in accordance with the present invention;
FIG. 5 is a schematic diagram of exposure correction knowledge distillation in accordance with the present invention.
Detailed Description
The invention is further described below with reference to the accompanying drawings.
As shown in fig. 1, in one aspect, the present invention provides a real exposure correction method guided by a perception characteristic of a camera, including the following steps:
step 101: a real data acquisition system is built, and a real paired exposure correction data set is acquired by using the real data acquisition system. Wherein the true paired exposure correction dataset contains paired over/under exposure images and normal exposure images.
To support exposure correction on RAW images, the wrong exposure and the right exposure RAW images are acquired in pairs to obtain a real RAW exposure correction dataset.
Specifically, a camera is fixed on a tripod to form the real data acquisition system. When shooting, the system first adjusts the aperture, focal length, exposure time, etc., to acquire a high-quality normal exposure image. Then, control software scales the exposure time by a factor between 1/100 and 10 relative to the normal exposure. Finally, an underexposed or overexposed image is acquired. Through this flow, the real paired exposure correction dataset is collected.
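The exposure-time scaling above maps directly to exposure-value (EV) offsets; a minimal sketch of that arithmetic (the helper name is ours, and it assumes aperture and ISO are held fixed while only exposure time changes):

```python
import math

def exposure_offset_ev(scale):
    """EV offset implied by scaling the exposure time by `scale`
    (aperture and ISO fixed): each doubling of exposure time adds +1 EV."""
    return math.log2(scale)

# Endpoints of the 1/100x .. 10x range used when capturing mis-exposed shots:
print(round(exposure_offset_ev(1 / 100), 2))  # strongest underexposure, about -6.64 EV
print(round(exposure_offset_ev(10), 2))       # strongest overexposure, about +3.32 EV
```

So the capture protocol spans roughly −6.6 EV to +3.3 EV around the normal exposure.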
Step 102: the camera perception characteristics are analyzed and a channel-guided exposure correction network is designed to learn the different prior knowledge required for underexposure and overexposure in the same network.
The camera perception characteristic is that the green channel has a higher sensitivity than the red and blue channels, as shown in fig. 3. This means that the red and blue channels are more advantageous for over-exposure correction and the green channel for under-exposure correction. According to the perceived characteristics of the camera, the invention designs a channel-guided exposure correction network, and the overall network structure comprises a main branch, a color channel guiding branch and a guiding channel selecting module, as shown in fig. 4.
To fully utilize the useful information in the input RAW image, the RAW image is decomposed into the four channels of the RGGB pattern (red, green, green, blue). The red and blue channels carry richer information for overexposure correction, and the two green channels carry richer information for underexposure correction. Since different guide channels contribute to different exposure corrections, the guide channel selection module is first used to automatically select the color channels required for overexposure or underexposure correction. The selected color channels are then input into the color channel guide branch, generating guide features. Finally, the corresponding features in the main branch are modulated with the guide features. In this way, the guidance information allows overexposure and underexposure correction to be automatically selected and effectively performed, respectively.
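The four-channel decomposition can be sketched in NumPy; this assumes an RGGB Bayer layout with even image dimensions, and the function name is ours, not the patent's:

```python
import numpy as np

def unpack_rggb(raw):
    """Unpack a single-channel Bayer RAW mosaic (RGGB layout assumed)
    into a 4-channel half-resolution stack: R, G1, G2, B."""
    r  = raw[0::2, 0::2]   # red sites: even rows, even cols
    g1 = raw[0::2, 1::2]   # first green: even rows, odd cols
    g2 = raw[1::2, 0::2]   # second green: odd rows, even cols
    b  = raw[1::2, 1::2]   # blue sites: odd rows, odd cols
    return np.stack([r, g1, g2, b], axis=0)

raw = np.arange(16, dtype=np.float32).reshape(4, 4)
channels = unpack_rggb(raw)
print(channels.shape)  # (4, 2, 2)
```

Each output channel has half the spatial resolution of the mosaic, matching the four-channel RAW input described for the network.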
Further, the main branch comprises 4 encoder stages and 4 corresponding decoder stages. At the end of each encoder stage, a convolution with a 4×4 kernel and a stride of 2 downsamples the feature map by a factor of 2. Before each decoder stage, the feature map is upsampled by a factor of 2 with bilinear interpolation. Skip connections pass the shallow feature maps from each encoder stage to the corresponding decoder stage. Further, to simplify training, residual learning is introduced into the exposure correction network, and the encoder and decoder are built from residual blocks as basic units, where a residual block consists of two 3×3 convolutions, followed by an activation function and one 1×1 convolution.
The guide channel selection module is designed to automatically select the channels required for overexposure and underexposure correction. First, the four-channel RAW image is input into a global average pooling (GAP) layer to obtain an average vector. Subsequently, to further acquire weight information for the different channels, two fully connected (FC) layers learn the nonlinear relationships between the channels. Then, a Sigmoid gating mechanism bounds the final attention weight of each channel, and the channels corresponding to the two largest values are the selected channels.
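A minimal NumPy sketch of this selection logic (GAP, two FC layers, Sigmoid gating, then keep the top-2 channels); the hidden width, the ReLU between the FC layers, and all names are our assumptions, not details from the patent:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def select_guide_channels(x, w1, w2):
    """Guide channel selection: GAP over each of the 4 RAW channels,
    two FC layers (ReLU assumed in between), Sigmoid gating, then keep
    the two channels with the largest attention weights."""
    gap = x.mean(axis=(1, 2))            # (4,) average vector from GAP
    hidden = np.maximum(w1 @ gap, 0.0)   # first FC + ReLU
    weights = sigmoid(w2 @ hidden)       # per-channel attention in (0, 1)
    selected = np.argsort(weights)[-2:]  # indices of the two largest weights
    return weights, sorted(selected.tolist())

x = rng.standard_normal((4, 8, 8))       # toy 4-channel RAW patch
w1 = rng.standard_normal((8, 4))         # FC1: 4 -> 8 (width is our choice)
w2 = rng.standard_normal((4, 8))         # FC2: 8 -> 4
weights, selected = select_guide_channels(x, w1, w2)
print(weights.shape, selected)
```

During training, the channel selection loss below constrains these two selected channels to be the ones appropriate for the current correction task.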
As described above, the red and blue channels are easier to recover under overexposure, and the green channels under underexposure. When training the network for overexposure and underexposure correction, the guide channel selection module is constrained to select the specific channels for the corresponding correction task, expressed as follows:
L_s = ||I_s − I_d||_1        (1)
where L_s denotes the channel selection loss, and I_s and I_d denote the selected channel and the target channel, respectively, namely: the blue and red channels for overexposure correction and the green channels for underexposure correction.
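Equation (1) is a plain L1 distance between selected and target channels; a sketch of the computation (whether pixels are averaged or summed is not specified in the text, so we average):

```python
import numpy as np

def channel_selection_loss(selected, target):
    """L_s = ||I_s - I_d||_1 of Eq. (1): mean absolute error between the
    selected channel(s) and the target channel(s). Averaging over pixels
    is our assumption."""
    return np.abs(selected - target).mean()

selected = np.array([[0.2, 0.8], [0.5, 0.1]])
target   = np.array([[0.0, 1.0], [0.5, 0.0]])
print(channel_selection_loss(selected, target))  # 0.125
```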
To adapt the main branch features to different exposure corrections, the invention designs a color channel guide module. Inspired by guided filtering, the module modulates the features using the guidance information. First, the selected color channels are input into the color channel guide branch, generating guide features. Then, the main branch features are modulated with the guide features at the corresponding spatial resolution.
Specifically, two convolution layers are used to generate pixel-level scaling and bias values from the guide feature F_g for enhancing the main branch feature F_m, expressed as:
F̂_m = α(F_g) ⊙ F_m + β(F_g)        (2)
where F̂_m is the enhanced feature, α(F_g) and β(F_g) are two learnable modulation parameters, and ⊙ denotes element-wise multiplication.
Thus, the design of the channel-guided exposure correction network is completed, and different priori knowledge required by underexposure and overexposure can be learned in the same network.
Step 103: using knowledge distillation, the channel-directed exposure correction network is made to learn the a priori knowledge required for overexposure and underexposure correction more effectively, as shown in fig. 5.
Here, knowledge distillation means distilling the prior knowledge required for overexposure and underexposure correction into the channel-guided exposure correction network. Before applying knowledge distillation, two networks are first trained for overexposure and underexposure correction, respectively, with the target loss function L expressed as:
L = L_s + L_r        (3)
where L_s and L_r are the guide channel selection loss and the reconstruction loss, respectively.
Further, the reconstruction loss L_r is expressed as:
L_r = ||O − T||_1        (4)
where O and T denote the output of the exposure correction network and the corresponding target image, respectively.
When the overexposure and underexposure correction networks are well trained, knowledge distillation is adopted to distill the knowledge of overexposure and underexposure correction into a unified network.
In the knowledge distillation, the two well-trained overexposure and underexposure correction networks are the teacher networks, and the unified exposure correction network is the student network. For both overexposure and underexposure correction, the features of the student network should be close to the corresponding features of the teacher networks.
The knowledge distillation process is expressed as:
L = L_s + L_r + L_kd        (5)
where L_kd denotes the error between the teacher and student network features, expressed as:
L_kd = (1/N) Σ_{i=1}^{N} ||F_i^s − F_i^t||_1        (6)
where N is the number of features used, and F_i^s and F_i^t represent the features in the student network and the teacher network, respectively. The teacher feature F_i^t is taken from the overexposure correction network or the underexposure correction network and is used to supervise the corresponding features in the student network.
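The distillation term can be sketched as an averaged L1 distance over N student/teacher feature pairs; the exact norm and averaging scheme are our assumptions, since the source equation is only partially legible:

```python
import numpy as np

def distillation_loss(student_feats, teacher_feats):
    """L_kd: average L1 distance between N pairs of student and teacher
    feature maps (averaging over pairs and pixels is our assumption)."""
    n = len(student_feats)
    return sum(np.abs(s - t).mean()
               for s, t in zip(student_feats, teacher_feats)) / n

# Toy example with N = 2 feature pairs:
s = [np.zeros((2, 2)), np.ones((2, 2))]   # student features
t = [np.ones((2, 2)),  np.ones((2, 2))]   # teacher features
print(distillation_loss(s, t))  # 0.5
```

In training, each teacher feature would come from whichever pretrained network (overexposure or underexposure) matches the current sample's correction task.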
Thus, the knowledge distillation training strategy is constructed, and different priori knowledge can be distilled into a unified exposure correction network.
Step 104: and inputting a learning rate, an optimization method, iteration times and an acquired real paired data set, and training network parameters according to a knowledge distillation strategy to obtain a mapping relation f between the over/under exposure image and the normal exposure image.
Optimized network parameters are obtained by optimizing the target functions of Equations (3) and (5), completing the training of the exposure correction network and yielding the optimized mapping relation f between over/under exposure images and normal exposure images.
Step 105: the over/under exposure image to be tested is input, and the mapping relation f between the over/under exposure image and the normal exposure image obtained in step 104 is input. The over/under exposure image is mapped into the normal exposure image through the mapping relation f between the over/under exposure image and the normal exposure image, thereby realizing high-efficiency and high-precision image exposure correction and improving imaging quality.
Further, the normal exposure image is Î = f(I), where I and Î denote the over/under exposure image and the corrected normal exposure image, respectively.
Preferably, the training process of the network in step 104 and the image exposure correction process in step 105 are completed by using a GPU, and the operation speed of the convolutional neural network is accelerated by using a cuDNN library.
On the other hand, as shown in fig. 2, the invention provides a real exposure correction system guided by the perception characteristics of a camera, which comprises a data acquisition module, a channel guide exposure correction network design module, a knowledge distillation strategy design module and a network training test module.
The data acquisition module is used for acquiring real paired over/under exposure image and normal exposure image data sets. The output end of the data acquisition module is connected with the input end of the network training test module.
The channel guide exposure correction network design module is used for designing an exposure correction network so as to fully utilize prior information of different channels. The output end of the channel guide exposure correction network design module is connected with the input end of the knowledge distillation strategy design module.
The knowledge distillation strategy design module is used for designing a network training strategy to distill knowledge of the overexposure and underexposure correction networks into a unified exposure correction network model. The output end of the knowledge distillation strategy design module is connected with the input end of the network training test module.
The network training test module is used for training the exposure correction network and testing the effectiveness of the exposure correction network.
While the foregoing is directed to embodiments of the present invention, other and further embodiments of the invention may be devised without departing from the basic scope thereof, and the scope thereof is determined by the claims that follow.

Claims (9)

1. The real exposure correction method guided by the perception characteristics of the camera is characterized by comprising the following steps of:
step 101: building a real data acquisition system, and acquiring a real paired exposure correction data set by using the real data acquisition system, wherein the real paired exposure correction data set comprises paired over/under exposure images and normal exposure images;
step 102: analyzing the perception characteristic of a camera, designing a channel-guided exposure correction network, and learning different priori knowledge required by underexposure and overexposure in the same network;
the network overall structure comprises a main branch, a color channel guide branch and a guide channel selection module; the RAW image is decomposed into the four RGGB channels (red, green, green, blue); firstly, the guide channel selection module is used to automatically select the color channels required for overexposure or underexposure correction; then, the selected color channels are input into the color channel guide branch to generate guide features; finally, the corresponding features in the main branch are modulated with the guide features;
designing the guide channel selection module: firstly, the four-channel RAW image is input into a global average pooling layer to obtain an average vector; subsequently, to further acquire the weight information of the different channels, two fully connected layers learn the nonlinear relationships between the channels; then, a Sigmoid gating mechanism bounds the final attention weight of each channel, and the channels corresponding to the two largest values are the selected channels;
when the training network performs over-exposure and under-exposure correction, the constraint guide channel selection module selects a specific channel to perform corresponding correction tasks, which are expressed as follows:
L_s = ||I_s − I_d||_1        (1)
wherein L_s denotes the channel selection loss, and I_s and I_d denote the selected channel and the target channel, respectively, namely: the blue and red channels for overexposure correction and the green channels for underexposure correction;
designing a color channel guiding module, and modulating the characteristics through guiding information; firstly, inputting a selected color channel into a color channel guide branch to generate a guide feature; then, modulating the main branch feature with a pilot feature having a corresponding spatial resolution;
step 103: knowledge distillation is used to make the channel-guided exposure correction network more efficient in learning the a priori knowledge required for overexposure and underexposure correction;
wherein, knowledge distillation is to distill the prior knowledge required by overexposure and underexposure correction to a channel-guided exposure correction network; before using knowledge distillation, two networks for overexposure and underexposure correction, respectively, are first trained, whose target loss function L is expressed as:
L = L_s + L_r        (3)
wherein L_s and L_r are the guide channel selection loss and the reconstruction loss, respectively;
the reconstruction loss L_r is expressed as:
L_r = ||O − T||_1        (4)
wherein O and T denote the output of the exposure correction network and the corresponding target image, respectively;
when the overexposure and underexposure correction network is well trained, knowledge distillation is adopted to extract the knowledge of overexposure and underexposure correction into a unified network;
in the knowledge distillation, the two well-trained overexposure and underexposure correction networks are the teacher networks, and the unified exposure correction network is the student network; for both overexposure and underexposure correction, the features of the student network should be close to the corresponding features of the teacher networks;
the knowledge distillation process is expressed as:
L = L_s + L_r + L_kd        (5)
wherein L_kd denotes the error between the teacher and student network features, expressed as:
L_kd = (1/N) Σ_{i=1}^{N} ||F_i^s − F_i^t||_1        (6)
wherein N is the number of features used, and F_i^s and F_i^t denote the features in the student network and the teacher network, respectively; F_i^t is used for supervising the corresponding features in the student network;
step 104: inputting a learning rate, an optimization method, iteration times and an acquired real paired data set, and training network parameters according to a knowledge distillation strategy to obtain a mapping relation f between an over/under exposure image and a normal exposure image;
obtaining optimized network parameters through optimized training target functions 3 and 5, completing training of an exposure correction network, and obtaining a mapping relation f between an optimized over/under exposure image and a normal exposure image;
step 105: inputting an over/under exposure image to be tested, and inputting a mapping relation f between the over/under exposure image and a normal exposure image obtained in the step 104; mapping the over/under exposure image into a normal exposure image through the mapping relation f between the over/under exposure image and the normal exposure image, thereby realizing high-efficiency and high-precision image exposure correction and improving imaging quality;
the normal exposure image is Î = f(I), where I and Î denote the over/under exposure image and the normal exposure image, respectively.
2. The real exposure correction method guided by the perception characteristics of the camera according to claim 1, wherein in step 101, when shooting, the real data acquisition system first adjusts the aperture, focal length and exposure time to acquire a high-quality normal exposure image; then, control software scales the exposure time by a factor between 1/100 and 10; finally, an underexposed or overexposed image is acquired.
3. The camera perception feature-directed real exposure correction method as claimed in claim 1, wherein in step 102, the main branch includes 4 encoder stages and 4 corresponding decoder stages;
at the end of each encoder stage, a convolution with a 4×4 kernel and a stride of 2 downsamples the feature map by a factor of 2;
before each decoder stage, the feature map is upsampled by a factor of 2 with bilinear interpolation;
skip connections pass the shallow feature maps from each encoder stage to the corresponding decoder stage.
4. A camera perception feature-guided real exposure correction method as claimed in claim 1 or 3, characterized in that residual learning is introduced in the exposure correction network, the encoder and decoder being constructed using residual blocks as basic blocks, wherein the residual blocks consist of two 3 x 3 convolutions followed by an activation function and one 1 x 1 convolution.
5. The camera perception feature-guided real exposure correction method of claim 1, wherein in step 102, two convolution layers are utilized to generate pixel-level scaling and bias values from the guide feature F_g for enhancing the main branch feature F_m, expressed as:
F̂_m = α(F_g) ⊙ F_m + β(F_g)        (2)
wherein F̂_m is the enhanced feature, and α(F_g) and β(F_g) are two learnable modulation parameters.
6. The camera perception feature-guided real exposure correction method according to claim 1, wherein, in step 103, the teacher feature F_i^t supervising overexposure correction is a feature in the overexposure correction network.
7. The camera perception feature-guided real exposure correction method according to claim 1, wherein, in step 103, the teacher feature F_i^t supervising underexposure correction is a feature in the underexposure correction network.
8. The camera perception feature-guided real exposure correction method of claim 1, wherein the training process of the network of step 104 and the image exposure correction process of step 105 are completed using a GPU, and the operation speed of the convolutional neural network is increased using a cuDNN library.
9. The real exposure correction system guided by the perception characteristics of the camera is characterized by comprising a data acquisition module, a channel guide exposure correction network design module, a knowledge distillation strategy design module and a network training test module;
the data acquisition module is used for acquiring real paired over/under exposure images and normal exposure image data sets; the output end of the data acquisition module is connected with the input end of the network training test module;
the channel guide exposure correction network design module is used for designing an exposure correction network so as to fully utilize prior information of different channels; the output end of the channel guide exposure correction network design module is connected with the input end of the knowledge distillation strategy design module;
the knowledge distillation strategy design module is used for designing a network training strategy so as to distill knowledge of the overexposure and underexposure correction networks into a unified exposure correction network model; the output end of the knowledge distillation strategy design module is connected with the input end of the network training test module;
the network training test module is used for training the exposure correction network and testing the effectiveness of the exposure correction network.
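The claims do not specify the exact loss the knowledge distillation strategy design module uses to distill the overexposure and underexposure correction networks into the unified model. One common realization is an L2 feature-matching loss between the unified student network and whichever expert teacher handles the input's exposure case; the sketch below assumes that form and is purely illustrative.

```python
import numpy as np

def feature_distillation_loss(student_feat, over_feat, under_feat, overexposed):
    """Match the student's feature to the teacher for this exposure case.

    student_feat : feature from the unified exposure correction network
    over_feat    : feature from the overexposure correction (teacher) network
    under_feat   : feature from the underexposure correction (teacher) network
    overexposed  : True if the input image is overexposed, False if underexposed
    """
    teacher_feat = over_feat if overexposed else under_feat
    # mean-squared error between student and teacher feature maps
    return float(np.mean((student_feat - teacher_feat) ** 2))
```

In training, this term would be added to the usual reconstruction loss against the normal-exposure ground truth, so the unified network inherits the behavior of both expert correction networks.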
CN202310348986.0A 2023-04-04 2023-04-04 Real exposure correction method and system guided by perception characteristics of camera Pending CN116614714A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310348986.0A CN116614714A (en) 2023-04-04 2023-04-04 Real exposure correction method and system guided by perception characteristics of camera

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310348986.0A CN116614714A (en) 2023-04-04 2023-04-04 Real exposure correction method and system guided by perception characteristics of camera

Publications (1)

Publication Number Publication Date
CN116614714A true CN116614714A (en) 2023-08-18

Family

ID=87677106

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310348986.0A Pending CN116614714A (en) 2023-04-04 2023-04-04 Real exposure correction method and system guided by perception characteristics of camera

Country Status (1)

Country Link
CN (1) CN116614714A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117793538A (en) * 2024-02-23 2024-03-29 北京理工大学 Automatic image exposure correction and enhancement method and device


Similar Documents

Publication Publication Date Title
US11037278B2 (en) Systems and methods for transforming raw sensor data captured in low-light conditions to well-exposed images using neural network architectures
CN110378845B (en) Image restoration method based on convolutional neural network under extreme conditions
US20220036523A1 (en) Image processor
CN109862389B (en) Video processing method, device, server and storage medium
CN111064904A (en) Dark light image enhancement method
JPH0225551B2 (en)
WO2022000397A1 (en) Low-illumination image enhancement method and apparatus, and computer device
CN112183637A (en) Single-light-source scene illumination re-rendering method and system based on neural network
CN108510560A (en) Image processing method, device, storage medium and computer equipment
CN110930341A (en) Low-illumination image enhancement method based on image fusion
CN112508812A (en) Image color cast correction method, model training method, device and equipment
CN114862698A (en) Method and device for correcting real overexposure image based on channel guidance
CN113096029A (en) High dynamic range image generation method based on multi-branch codec neural network
CN116614714A (en) Real exposure correction method and system guided by perception characteristics of camera
CN116456183B (en) High dynamic range video generation method and system under guidance of event camera
CN112581392A (en) Image exposure correction method, system and storage medium based on bidirectional illumination estimation and fusion restoration
CN114638764B (en) Multi-exposure image fusion method and system based on artificial intelligence
WO2023110880A1 (en) Image processing methods and systems for low-light image enhancement using machine learning models
CN115209119A (en) Video automatic coloring method based on deep neural network
CN114283101A (en) Multi-exposure image fusion unsupervised learning method and device and electronic equipment
Yang et al. Multi-scale extreme exposure images fusion based on deep learning
TWI590192B (en) Adaptive high dynamic range image fusion algorithm
CN114511462B (en) Visual image enhancement method
US20230186612A1 (en) Image processing methods and systems for generating a training dataset for low-light image enhancement using machine learning models
US20230063209A1 (en) Neural network training based on consistency loss

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination