CN111970432A - Image processing method and image processing device - Google Patents


Info

Publication number: CN111970432A
Authority: CN (China)
Prior art keywords: color, pixel point, pixel, value, space image
Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis)
Application number: CN201910418979.7A
Other languages: Chinese (zh)
Inventors: 唐道龙, 竺旭东, 孙超伟, 周焕铳, 黄普发
Current Assignee: Huawei Technologies Co Ltd (the listed assignees may be inaccurate; Google has not performed a legal analysis)
Original Assignee: Huawei Technologies Co Ltd
Application filed by Huawei Technologies Co Ltd
Priority: CN201910418979.7A
Publication: CN111970432A


Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80: Camera processing pipelines; Components thereof
    • H04N9/00: Details of colour television systems
    • H04N9/77: Circuits for processing the brightness signal and the chrominance signal relative to each other, e.g. adjusting the phase of the brightness signal relative to the colour signal, correcting differential gain or differential phase

Abstract

The present application provides an image processing method and an image processing apparatus for addressing image quality problems, such as poor brightness and color, in night images. A color image and an infrared image are processed in RGB space: the luminance values of the two images are obtained, and correction parameters for correcting the color image in RGB space are determined from those luminance values. The pixel values of the three RGB channels of the color image are then adjusted according to the determined correction parameters in such a way that the hue of the adjusted color image is unchanged, which solves the color cast problem of existing fusion algorithms.

Description

Image processing method and image processing device
Technical Field
The embodiment of the application relates to the technical field of images, in particular to an image processing method and an image processing device.
Background
In the field of video security, color attributes play an important role in identifying objects. In daytime mode, visible light in the environment is abundant, so a camera can work in color mode; at night, visible light is scarce, and the camera can hardly obtain a high-quality color image. Supplementing infrared light increases the optical signal in the environment and thereby the brightness of the captured image. However, although infrared illumination improves overall image brightness, the resulting image carries only grayscale information and no color information, so some content, such as the color of a person's clothes, cannot be identified.
Solving the image quality problems of brightness, color, and the like in night images, and thereby obtaining high-quality night images, is of great importance in the field of video security.
Disclosure of Invention
The embodiment of the application provides an image processing method and device, which are used for solving the image quality problems of brightness, color and the like of a night image.
In a first aspect, an embodiment of the present application provides an image processing method, including:
acquiring the brightness value of a first pixel point of a color RGB space image and the brightness value of a second pixel point of an infrared RGB space image, wherein the infrared RGB space image and the color RGB space image are obtained by collecting the same target scene, and the position of the second pixel point in the infrared RGB space image is the same as the position of the first pixel point in the color RGB space image; determining a correction parameter for correcting the brightness of the first pixel point according to the brightness value of the second pixel point and the brightness value of the first pixel point; and correcting the RGB pixel value of the first pixel point of the color RGB space image according to the correction parameter.
A color image captured at night carries color information but has low luminance; an infrared image captured at the same time is bright but lacks color information. With the image processing method provided by the embodiments of the application, color information and brightness information can be effectively extracted from the color image and the infrared image for fusion, improving image quality while keeping the hue of the fused color image unchanged, which effectively avoids color cast. By contrast, the prior art performs image fusion in YUV space. Since the color information (UV) and luminance information (Y) cannot be fully decoupled in YUV space, processing only the Y component necessarily changes the color information and causes color cast in the fused image. Compared with the prior art, the technical solution of the application enhances the color information while processing the brightness information, and it can be proven mathematically that the hue of the fused image is exactly the same as that of the color image, so the color cast problem of existing fusion algorithms is solved and the color of the fused image is more real and natural.
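As a rough illustration of the method of the first aspect, the sketch below corrects one color pixel using the co-located infrared pixel. The luma formula, the weights `w_ir`/`w_c`, and the default `boost` are illustrative assumptions; the patent does not fix these values.

```python
def luminance(rgb):
    """Illustrative luminance measure (Rec. 601 luma weights)."""
    r, g, b = rgb
    return 0.299 * r + 0.587 * g + 0.114 * b

def correct_pixel(color_rgb, ir_rgb, w_ir=0.5, w_c=0.5, boost=0.0):
    """Correct one color pixel from its co-located infrared pixel.

    The color correction coefficient k is the ratio of the infrared
    luminance to a weighted sum of both luminances; the same k is
    applied to all three channels, which is what keeps the hue fixed.
    """
    y_c = luminance(color_rgb)
    y_ir = luminance(ir_rgb)
    k = y_ir / (w_ir * y_ir + w_c * y_c)
    return tuple(k * (v + boost) for v in color_rgb)
```

With `boost=0`, a dark color pixel paired with a bright infrared pixel gets `k > 1`, which raises all three channels by the same factor and so brightens the pixel without shifting its hue.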
In one possible design, the correction parameters include a color correction factor and a brightness boost parameter;
determining the correction parameter of the first pixel point according to the brightness value of the second pixel point and the brightness value of the first pixel point includes: determining a color correction coefficient of the first pixel point according to the brightness value of the second pixel point and the brightness value of the first pixel point, and determining a brightness boost parameter of the first pixel point according to the brightness value of the second pixel point; and correcting the RGB pixel value of the first pixel point of the color RGB space image according to the correction parameter includes: correcting the RGB pixel value of the first pixel point of the color RGB space image according to the brightness boost parameter and the color correction coefficient.
Texture may exist in the dark regions of the scene, but because the light there is too dim, it is not reflected in the dark regions of the captured color RGB space image. In the embodiments of the application, the brightness boost parameter is determined from the brightness values of the pixel points of the infrared RGB space image, so the brightness of dark regions is raised and more texture detail is recovered.
In a possible design, the correcting of the RGB pixel values of the first pixel point of the color RGB space image according to the brightness boost parameter and the color correction coefficient includes:
linearizing the RGB pixel value of the first pixel point of the color RGB space image according to the brightness boost parameter and the color correction coefficient to obtain the corrected RGB pixel value of the first pixel point of the color RGB space image.
Through the design, the color tone of the fused image after the linearization processing is completely the same as that of the color image, so that the problem of color cast of the existing fusion algorithm can be solved, and the color of the fused image is more real and natural.
In one possible design, the linearizing of the RGB pixel values of the first pixel point of the color RGB space image according to the brightness boost parameter and the color correction coefficient includes:
multiplying the color correction coefficient by the sum of the RGB pixel value of the color RGB space image and the brightness boost parameter to obtain the corrected RGB pixel value of the first pixel point of the color RGB space image; or,
multiplying the color correction coefficient by the RGB pixel value of the color RGB space image and then adding the brightness boost parameter to obtain the corrected RGB pixel value of the first pixel point of the color RGB space image.
The above design provides two ways of performing the linearization.
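The two orderings can be written out directly; here `k` stands for the color correction coefficient and `delta` for the brightness boost parameter (the sample values used below are arbitrary):

```python
def variant_a(rgb, k, delta):
    """k * (RGB + delta): the boost is added before the uniform scaling."""
    return tuple(k * (v + delta) for v in rgb)

def variant_b(rgb, k, delta):
    """k * RGB + delta: the boost is added after the uniform scaling."""
    return tuple(k * v + delta for v in rgb)
```

Both are affine maps applied identically to R, G, and B, so the differences between channels are all scaled by the same factor `k`; that is why the hue is unchanged in either case.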
In one possible design, the correction parameters include a color correction factor;
determining a correction parameter of the first pixel point according to the brightness value of the second pixel point and the brightness value of the first pixel point, including:
determining the color correction coefficient according to the brightness value of the second pixel point and the brightness value of the first pixel point;
and correcting the RGB pixel value of the first pixel point of the color RGB space image according to the correction parameter, wherein the correction process comprises the following steps:
and correcting the RGB pixel value of the first pixel point of the color RGB space image according to the color correction coefficient.
In this design, the same color correction coefficient is used to correct the pixel values of the three channels of the first pixel point of the color RGB space image, so neither the hue nor the saturation of the corrected image changes; the color cast problem is thus avoided and the quality of the processed image is improved.
In one possible design, determining the color correction coefficient of the first pixel point according to the luminance value of the second pixel point and the luminance value of the first pixel point includes:
taking the ratio of the brightness value of the second pixel point to a weighted value as the color correction coefficient of the first pixel point, where the weighted value is the weighted sum of the brightness value of the second pixel point and the brightness value of the first pixel point.
The method for determining the color correction coefficient provided by the design is simple and effective, and ensures that the hue and the saturation of the corrected image are unchanged, so that the color cast problem can be avoided, and the quality of the processed image is improved.
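A minimal sketch of this coefficient, with the weights as free parameters (the patent does not prescribe their values):

```python
def color_correction_coefficient(y_ir, y_color, w_ir=0.5, w_c=0.5):
    """Ratio of the infrared luminance to the weighted sum of the
    infrared and color luminances, per the design above."""
    return y_ir / (w_ir * y_ir + w_c * y_color)
```

When the two luminances are equal and the weights sum to 1, the coefficient is 1 and the pixel is left unchanged; a brighter infrared pixel yields a coefficient above 1.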
In one possible design, determining a color correction coefficient of the first pixel according to the luminance value of the second pixel and the luminance value of the first pixel includes:
determining the ratio of the brightness value of the second pixel point to a weighted value, where the weighted value is the weighted sum of the brightness value of the second pixel point and the brightness value of the first pixel point;
and carrying out linear transformation on the ratio to obtain a color correction coefficient of the first pixel point.
The method for determining the color correction coefficient provided by the design is simple and effective, and ensures that the hue and the saturation of the corrected image are unchanged, so that the color cast problem can be avoided, and the quality of the processed image is improved.
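This variant differs from the previous one only in the extra linear transformation of the ratio; the slope `a` and intercept `b` below are hypothetical tuning parameters, not values from the patent:

```python
def coefficient_linear(y_ir, y_color, w_ir=0.5, w_c=0.5, a=1.0, b=0.0):
    """Compute the weighted-sum ratio, then linearly transform it."""
    ratio = y_ir / (w_ir * y_ir + w_c * y_color)
    return a * ratio + b
```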
Based on the same inventive concept as the method embodiment of the first aspect, an embodiment of the present application provides, in a second aspect, an image processing apparatus, including:
the device comprises an acquisition unit and a processing unit, wherein the acquisition unit is used for acquiring the brightness value of a first pixel point of a color RGB space image and acquiring the brightness value of a second pixel point of an infrared RGB space image, the infrared RGB space image and the color RGB space image are obtained by acquiring the same target scene, and the position of the second pixel point in the infrared RGB space image is the same as the position of the first pixel point in the color RGB space image;
the correction processing unit is used for determining a correction parameter for correcting the brightness of the first pixel point according to the brightness value of the second pixel point and the brightness value of the first pixel point; and correcting the RGB pixel value of the first pixel point of the color RGB space image according to the correction parameter.
In one possible design, the device may further comprise a first collection unit and a second collection unit.
The first collection unit is configured to collect the color RGB space image, and the second collection unit is configured to collect the infrared RGB space image.
In another possible design, the first collection unit is configured to collect a color luminance-chrominance YUV space image, and the second collection unit is configured to collect an infrared YUV space image, in which case the apparatus further includes a conversion unit configured to convert the color YUV space image into the color RGB space image and to convert the infrared YUV space image into the infrared RGB space image.
In one possible design, the correction parameters include a color correction factor and a brightness boost parameter; the correction processing unit is specifically configured to:
determining a color correction coefficient of the first pixel point according to the brightness value of the second pixel point and the brightness value of the first pixel point, and determining a brightness boost parameter of the first pixel point according to the brightness value of the second pixel point;
and correcting the RGB pixel value of the first pixel point of the color RGB space image according to the brightness boost parameter and the color correction coefficient.
In a possible design, when correcting the RGB pixel values of the first pixel point of the color RGB space image according to the brightness boost parameter and the color correction coefficient, the correction processing unit is specifically configured to:
linearize the RGB pixel value of the first pixel point of the color RGB space image according to the brightness boost parameter and the color correction coefficient to obtain the corrected RGB pixel value of the first pixel point of the color RGB space image.
In a possible design, when linearizing the RGB pixel values of the first pixel point of the color RGB space image according to the brightness boost parameter and the color correction coefficient, the correction processing unit is specifically configured to:
multiply the color correction coefficient by the sum of the RGB pixel value of the color RGB space image and the brightness boost parameter to obtain the corrected RGB pixel value of the first pixel point of the color RGB space image; or,
multiply the color correction coefficient by the RGB pixel value of the color RGB space image and then add the brightness boost parameter to obtain the corrected RGB pixel value of the first pixel point of the color RGB space image.
In one possible design, the correction parameters include a color correction factor;
the correction processing unit is specifically configured to:
determining the color correction coefficient according to the brightness value of the second pixel point and the brightness value of the first pixel point;
and correcting the RGB pixel value of the first pixel point of the color RGB space image according to the color correction coefficient.
In a possible design, when determining the color correction coefficient of the first pixel point according to the luminance value of the second pixel point and the luminance value of the first pixel point, the correction processing unit is specifically configured to:
take the ratio of the brightness value of the second pixel point to a weighted value as the color correction coefficient of the first pixel point, where the weighted value is the weighted sum of the brightness value of the second pixel point and the brightness value of the first pixel point.
In a possible design, when determining the color correction coefficient of the first pixel point according to the luminance value of the second pixel point and the luminance value of the first pixel point, the correction processing unit is specifically configured to:
determine the ratio of the brightness value of the second pixel point to a weighted value, where the weighted value is the weighted sum of the brightness value of the second pixel point and the brightness value of the first pixel point; and perform a linear transformation on the ratio to obtain the color correction coefficient of the first pixel point.
In a third aspect, an embodiment of the present application provides an image processing apparatus including at least one processor coupled with at least one memory. The at least one processor is configured to execute the computer program or instructions stored in the at least one memory to cause the apparatus to perform the method of the first aspect or any possible design thereof.
In one possible design, a first image sensor for capturing a static or dynamic color image and a second image sensor for capturing a static or dynamic infrared image are also included.
In one example, the color image and the infrared image are both RGB space images, and the first image sensor is specifically configured to acquire the color RGB space image, and the second image sensor is specifically configured to acquire the infrared RGB space image.
In another example, the color image and the infrared image are both YUV space images, the first image sensor is specifically configured to acquire a color YUV space image, the second image sensor is specifically configured to acquire an infrared YUV space image, and the at least one processor converts the color YUV space image into a color RGB space image and converts the infrared YUV space image into an infrared RGB space image.
In a fourth aspect, the present application provides a computer program product comprising a computer program that, when executed on a computer or a processor, enables the computer or the processor to implement the functions involved in the method of the first aspect or any possible design thereof.
In a fifth aspect, embodiments of the present application provide a computer-readable storage medium for storing a program or instructions which, when run on a computer, cause the computer to perform the functions involved in the method of the first aspect or any possible design thereof.
In a sixth aspect, the present application provides a chip system, where the chip system includes a processor and may further include a memory, and is configured to implement the functions involved in the foregoing methods. The chip system may be formed by a chip, and may also include a chip and other discrete devices.
Drawings
Fig. 1A-1B are schematic diagrams of an image processing apparatus provided in an embodiment of the present application;
fig. 2 is a schematic flowchart of an image processing method provided in an embodiment of the present application;
Fig. 3A-3B are schematic diagrams of another image processing apparatus provided in an embodiment of the present application;
fig. 4 to fig. 5 are schematic diagrams of still another image processing apparatus according to an embodiment of the present application.
Detailed Description
Some terms referred to in the present application will be described in detail below.
The term "at least one" as used herein means one or more than one, i.e., one, two, three, or more; "plurality" means two or more, i.e., two, three, or more. In addition, the terms "first", "second", etc. in the description of the present application are used to distinguish between items and do not necessarily describe a sequential or chronological order. "And/or" describes an association relationship between objects and indicates that three relationships may exist; for example, "A and/or B" may mean: A exists alone, both A and B exist, or B exists alone, where A and B may each be singular or plural. The character "/" generally indicates an "or" relationship between the preceding and following objects. "At least one of the following" and similar expressions refer to any combination of the listed items, including any combination of single or plural items. For example, "at least one of a, b, or c" may represent: a, b, c, a-b, a-c, b-c, or a-b-c, where a, b, and c may each be single or multiple.
Image fusion (Image fusion): image data of the same target scene collected through multi-source channels are processed with image processing and computer techniques to extract the beneficial information of each channel to the maximum extent and finally synthesize a high-quality image, thereby improving the utilization of image information and raising the spatial and spectral resolution of the original image.
Hue (Hue): hue is what appears as color in a color image. On a remote-sensing image, tone expresses the intensity of the energy reflected or radiated by ground objects; differences in tone can reveal the attributes, geometric shapes, distribution ranges, and combination rules of those objects.
Luminance (luminance): the physical quantity characterizing how strongly a luminous body, or an illuminated surface, appears to emit or reflect light to the human eye. If any two surfaces photograph as equally bright, or look equally bright to the eye, their luminance is the same.
Saturation (saturation): saturation refers to the vividness of a color, also called the purity of the color. Saturation depends on the ratio between the chromatic component and the achromatic component (gray) in the color: the larger the chromatic component, the greater the saturation; the larger the achromatic component, the smaller the saturation. Pure colors, such as bright red and bright green, are highly saturated. Colors mixed with white, gray, or other hues, such as magenta, pink, and yellow-brown, are unsaturated. Fully unsaturated colors, such as the various grays between black and white, have no hue at all.
The infrared image is recorded by an image sensor capturing infrared light from objects in the scene; the color image is recorded by an image sensor capturing visible light from objects in the scene.
Color Space (Color Space): color can be described as the eye's differing perception of light of different frequencies, or as the objective existence of light of different frequencies. A color space is the range of colors expressible in a coordinate system established to represent colors. A color model together with a gamut defines a color space, where a color model is an abstract mathematical model that represents colors with a set of color components; examples include the red-green-blue (RGB) model and the cyan-magenta-yellow-black (CMYK) model used in printing. Gamut refers to the set of colors a system is capable of producing. For example, Adobe RGB and sRGB are two different color spaces based on the RGB model.
Each device, such as a display or printer, has its own color space and can only generate colors within its gamut. When an image is moved from one device to another, its colors may change because each device converts and displays RGB or CMYK values according to its own color space.
In the embodiments of the present application, RGB space means an image space in which an image is quantitatively represented by the brightness of the three primary colors red, green, and blue. YCC space denotes a luminance-chrominance-separated color space: the three components of a YCC video signal represent one luminance component and two chrominance components, and common YCC signals include YUV, YCbCr, ICtCp, and the like.
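For concreteness, one common linear mapping from a YCC-type signal back to RGB is the BT.601 full-range YUV-to-RGB transform; the exact coefficients vary by standard, so treat the ones below as one representative choice rather than the patent's own conversion:

```python
def yuv_to_rgb(y, u, v):
    """BT.601 full-range YUV -> RGB (illustrative coefficients).

    U and V are chrominance offsets centered on zero; a neutral pixel
    (u = v = 0) maps to a gray RGB value equal to its luma.
    """
    r = y + 1.402 * v
    g = y - 0.344136 * u - 0.714136 * v
    b = y + 1.772 * u
    return r, g, b
```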
To improve the brightness of a night image, one approach is to increase the amount of light collected by the camera chip. This amount is determined by three factors: the aperture size of the camera, the exposure time of the camera, and the gain of the chip; it can therefore be raised by enlarging the aperture, lengthening the exposure time, or increasing the chip gain. Another approach is to fuse an infrared image with a color image; the technique currently adopted is YUV color space fusion. This fusion algorithm transforms the images into the YUV color space so that the luminance component Y is separated from the chrominance components UV. Because a night color image has low brightness, the luminance information of the fused image is provided by the infrared image; because the infrared image lacks color information, the chrominance information of the fused image is provided by the color image; the fused image is then obtained.
Although both approaches can improve the brightness of a night image, each has drawbacks. Raising the light collected by the chip yields a brighter image, but enlarging the aperture (reducing the f-number) shrinks the depth of field and limits the range over which the camera images sharply; lengthening the exposure time causes motion blur, smearing moving objects such as faces and license plates; and raising the chip gain in a low-illumination environment amplifies noise, making the picture dirty and degrading the visual effect. The YUV color space fusion method can cause color cast in the fused image, mainly because the chrominance and luminance components in YUV space are not completely decoupled: adjusting only Y without correspondingly processing UV, for example replacing the luminance component of the color image with that of the infrared image, visibly changes the hue and therefore causes color cast.
The applicant found in research that the HSL color space decouples color information from luminance information better than the YUV color space. The HSL color space decomposes an image into three components: hue (H), saturation (S), and lightness (L). In the HSL color space, changing the lightness does not change the type of color (i.e., the hue).
Generally, cameras capture YUV video streams because they are convenient to transmit and store, but processing such streams in the HSL color space consumes considerable computing resources. The applicant found that processing in the RGB color space (RGB space for short) can significantly reduce this consumption. Moreover, from the interconversion formulas between the RGB and HSL color spaces, it can be strictly proven mathematically that, to keep the fused image free of color cast, i.e., to keep its hue unchanged, the three RGB color channels must be changed to the same extent, for example multiplied by the same factor. Based on this, the embodiments of the present application provide an image processing method and an image processing apparatus that fuse an infrared image and a color image in RGB space: correction coefficients for the pixel values of the three RGB channels of the color image are determined from the infrared image and the color image, and the same correction coefficient is applied to the three RGB channels of a given pixel point of the color image. The brightness of the color image is thus improved while neither the hue nor the saturation of the corrected image changes, so the color cast problem of the fused image is avoided.
It should be understood that, since the RGB color space and the YUV color space can be converted into each other by a linear transformation, the present application can achieve the same processing effect in the YUV color space as in the RGB color space.
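The hue-invariance claim is easy to check numerically with the standard library's HLS conversion: scaling all three RGB channels by the same factor changes the lightness but leaves the hue untouched. The scale factor 1.5 and the sample pixel below are arbitrary.

```python
import colorsys

def scale_rgb(rgb, k):
    """Apply the same multiplicative correction to all three channels."""
    return tuple(k * v for v in rgb)

rgb = (0.2, 0.5, 0.3)  # arbitrary sample pixel in [0, 1]
h0, l0, s0 = colorsys.rgb_to_hls(*rgb)
h1, l1, s1 = colorsys.rgb_to_hls(*scale_rgb(rgb, 1.5))
```

After scaling, the lightness rises (`l1 > l0`) while the hue is unchanged (`h1 == h0` up to floating-point rounding), consistent with the mathematical argument above.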
Hereinafter, embodiments of the present application will be described in detail with reference to the drawings.
The method and the apparatus can be applied to video surveillance scenarios, photographing scenarios, video capture scenarios, and the like, and in particular to low-illumination scenarios such as indoor, backlit, and night scenes.
The image processing system provided by the embodiments of the application may include a signal source and the image processing apparatus provided by the embodiments of the application. The signal source supplies the content to be processed by the image processing apparatus, which may be pictures or video frames, and may come from a network, a removable storage medium, a camera device, and the like. The image processing apparatus processes the image from the signal source according to the image processing method provided by the embodiments of the application. In one optional case, the image processing apparatus has a display function, so the image processing system can itself display the processed image signal, and the processed image signal need not be output to a display device. In another optional case, the image processing system further includes a display device, such as a monitor or a display screen, configured to receive the image signal transmitted by the image processing apparatus and display it.
For example, the signal source may be generated inside the image processing apparatus, such as by an image processing apparatus that includes multiple image sensors for acquiring image signals; or it may be generated by an external device, such as a camera or camera equipment, arranged outside the image processing apparatus. The following description takes a signal source generated inside the image processing apparatus as an example.
Illustratively, the image processing method provided by the present application can be applied to an image processing apparatus including at least two image sensors, the image processing apparatus including, but not limited to, a camera, a video camera, a smart camcorder, a smart mobile terminal (such as a mobile phone (mobile phone), a tablet computer, etc.), and the like. The image processing method provided by the application can also be applied to the camera equipment in the video monitoring system, such as a network camera, or be realized by a cloud server in the video monitoring system, that is, the image processing device can be the camera equipment in the video monitoring system.
In a first possible example, referring to fig. 1A, the image processing apparatus according to the embodiment of the present application may include at least two cameras and a processor 102. Taking two cameras as an example, these are a first camera 101a and a second camera 101b.
The first camera 101a includes an optical lens 1 and an image sensor 1, the second camera 101b includes an optical lens 2 and an image sensor 2, and the optical axes of the optical lenses included in the first camera 101a and the second camera 101b are parallel to each other. The image sensor 1 may be an infrared light sensor and the image sensor 2 may be a color image sensor (also referred to as a visible light sensor), i.e. the first camera is used for acquiring infrared images and the second camera is used for acquiring color images.
In a second possible example, referring to fig. 1B, the image processing apparatus according to the embodiment of the present application may include only one camera 101 c. The camera 101c includes an optical lens, a beam splitter, and at least two image sensors, which are exemplified by two image sensors in fig. 1B. The two image sensors are an infrared light sensor and a color image sensor, respectively.
In the following description, an example in which the image processing apparatus includes two cameras will be described.
The image sensor may be a charge-coupled device (CCD), a Complementary Metal Oxide Semiconductor (CMOS), a Contact Image Sensor (CIS), or the like.
The processor 102 may include one or more of the following: general purpose processors, Image Signal Processors (ISPs), microprocessors, Digital Signal Processors (DSPs), field-programmable gate arrays (FPGAs), and the like.
Illustratively, a video encoder may also be included in the image processing apparatus. The image processing apparatus may further include a memory 103. The Memory 103 may be a Read-Only Memory (ROM) or other type of static storage device that can store static information and instructions, a Random Access Memory (RAM) or other type of dynamic storage device that can store information and instructions, an Electrically Erasable Programmable Read-Only Memory (EEPROM), a Compact Disc Read-Only Memory (CD-ROM) or other optical Disc storage, optical Disc storage (including Compact Disc, laser Disc, optical Disc, digital versatile Disc, blu-ray Disc, etc.), magnetic disk storage media or other magnetic storage devices, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by the apparatus, but is not limited to such. The memory may be self-contained and coupled to the processor via a bus (as shown in fig. 1A and 1B). The memory 103 may also be integrated with the processor.
The memory 103 may be configured to store an application program code for executing the scheme of the present application, and the processor 102 controls the execution, that is, the processor 102 is configured to execute the application program code stored in the memory 103 to implement the image processing method in the embodiment of the present application.
An image processing method provided by the embodiment of the present application is described below with reference to fig. 2, where the method may be executed by an image processing apparatus, for example, by a processor 102 in the image processing apparatus, and the method includes the following steps:
S201, acquiring the brightness value of a second pixel point of the infrared RGB space image, and acquiring the brightness value of a first pixel point of the color RGB space image.
The infrared RGB space image and the color RGB space image are obtained by collecting the same target scene, the position of the second pixel point in the infrared RGB space image is the same as the position of the first pixel point in the color RGB space image, and the first pixel point is any pixel point in the color RGB space image.
The infrared RGB space image may be obtained by collecting the target scene with an infrared sensor, and the color RGB space image may be obtained by collecting the target scene with a color image sensor. For example, the infrared sensor and the color image sensor may be located on the same device and capture the color image and the infrared image at the same moment. The captured color image and infrared image may be in YUV color space format, in which case a conversion operation is performed on them to obtain the infrared RGB space image and the color RGB space image. The captured color image and infrared image may also already be in RGB color space format, in which case no conversion operation is required.
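As a concrete illustration of the YUV-to-RGB conversion step mentioned above, the following sketch converts a single full-range YUV pixel to RGB. The BT.601 coefficients are an assumption made for illustration; the embodiment does not fix a particular conversion matrix, and a real pipeline would use a vectorized library routine.

```python
def yuv_to_rgb(y, u, v):
    """Convert a single full-range BT.601 YUV pixel (0-255 per
    component, chroma centered at 128) to an RGB triple."""
    r = y + 1.402 * (v - 128)
    g = y - 0.344136 * (u - 128) - 0.714136 * (v - 128)
    b = y + 1.772 * (u - 128)

    def clamp(c):
        # Keep each channel inside the valid 8-bit range.
        return max(0, min(255, int(round(c))))

    return clamp(r), clamp(g), clamp(b)
```

For a neutral (gray) pixel with U = V = 128 the chroma terms vanish and R = G = B = Y, which matches the role of the Y channel as the brightness value used in step S201.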
Illustratively, the position of the second pixel point in the infrared RGB space image is the same as the position of the first pixel point in the color RGB space image, and may be a coordinate of the second pixel point in the infrared RGB space image, which is the same as the coordinate of the first pixel point in the color RGB space image.
Illustratively, the infrared RGB space image and the color RGB space image may be one frame image in a video stream or may be one picture taken.
In one example, when acquiring the luminance value of an RGB space image (including the infrared RGB space image and the color RGB space image), the luminance value of a pixel point may be determined according to the pixel values of the RGB three channels of that pixel point.
In another example, the RGB space image may be converted from a YUV color image; before the conversion, the Y channel pixel value of each pixel point in the YUV color image may be obtained, and this Y value is the brightness value of the corresponding pixel point of the RGB space image.
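A minimal sketch of deriving a luminance value from the RGB three channels, per the first example above. The BT.601 luma weights used here are one common choice and an assumption on our part; the text does not prescribe a specific weighting.

```python
def luminance(r, g, b):
    """Luminance of one pixel from its RGB three-channel values,
    using BT.601 luma weights (an illustrative choice)."""
    return 0.299 * r + 0.587 * g + 0.114 * b
```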
S202, determining a correction parameter for correcting the brightness of the first pixel point according to the brightness value of the second pixel point and the brightness value of the first pixel point.
In an example, the correction parameter may include a color correction coefficient. The color correction coefficient may be determined according to the luminance value of the second pixel point and the luminance value of the first pixel point, which may specifically be implemented in the following manners:
example 1, a ratio of a luminance value to a weighted value of a second pixel is used as a color correction coefficient of the first pixel; and the weighted value is the weighted sum of the brightness value of the second pixel point and the brightness value of the first pixel point.
For example, the color correction coefficient of the first pixel point may be determined by the following formula (1-1):
r1 = Y2/[α×Y2 + β×Y1]    formula (1-1);
where r1 represents the color correction coefficient determined according to example 1, Y1 represents the luminance value of the first pixel point, Y2 represents the luminance value of the second pixel point, α is the weight of the luminance value of the second pixel point, and β is the weight of the luminance value of the first pixel point. α + β may be equal to 1, or may not be equal to 1, for example, less than 1.
For example, when α + β is equal to 1, the color correction coefficient of the first pixel point may be determined by the following formula (1-2):
r1 = Y2/[α×Y2 + (1−α)×Y1]    formula (1-2);
where r1 represents the color correction coefficient determined according to example 1, Y1 represents the luminance value of the first pixel point, and Y2 represents the luminance value of the second pixel point. Optionally, the weight α of the luminance value of the second pixel point may be less than 1/2, for example α = 1/6.
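The computation in formula (1-2) can be sketched as follows. The default α = 1/6 mirrors the example value above; the zero-denominator guard is an added safeguard not discussed in the text.

```python
def color_correction_coefficient(y_ir, y_color, alpha=1 / 6):
    """Color correction coefficient per formula (1-2):
    r1 = Y2 / (alpha*Y2 + (1 - alpha)*Y1),
    where y_ir is Y2 (infrared luminance) and y_color is Y1
    (color-image luminance)."""
    denom = alpha * y_ir + (1 - alpha) * y_color
    # Guard against a fully black pixel pair (case not covered in the text).
    return y_ir / denom if denom else 1.0
```

When the infrared pixel is brighter than the color pixel, the coefficient exceeds 1 and the later correction step brightens the pixel.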
Example 2, the ratio of the luminance value of the second pixel point to the weighted value is determined, and a linear transformation is then performed on the ratio to obtain the color correction coefficient of the first pixel point. The weighted value is the weighted sum of the luminance value of the second pixel point and the luminance value of the first pixel point.
For example, the ratio can be determined by the above formula (1-1), and then the ratio is linearly transformed. For example, the linear transformation may be a maximum-minimum normalization process, and may be specifically implemented by the following formula (2):
Lr1 = (r1 − r1min)/(r1max − r1min) × (Lr1max − Lr1min) + Lr1min    formula (2);
where Lr1 represents the color correction coefficient determined according to example 2, r1max is the maximum of the N values of r1 determined from the N pixel points, and r1min is the minimum of those N values. The infrared RGB space image and the color RGB space image each comprise N pixel points; r1 is determined according to the luminance value of the i-th pixel point of the infrared RGB space image and the luminance value of the i-th pixel point of the color RGB space image, with i taking each positive integer less than or equal to N, so that N values of r1 are obtained; r1max is the maximum and r1min the minimum of these N values. Lr1max and Lr1min are respectively the preset upper and lower limit values of Lr1, and may be empirically determined maximum and minimum values of the color correction coefficient, such as Lr1max = 2.8, Lr1min = 0.7.
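The max-min normalization of formula (2) can be sketched as below. The default limits 0.7 and 2.8 are the example Lr1min/Lr1max values from the text; the constant-input fallback is an added assumption for the degenerate case where all ratios are equal.

```python
def normalize_coefficients(ratios, lo=0.7, hi=2.8):
    """Max-min normalize the per-pixel ratios r1 into [lo, hi],
    following formula (2). lo/hi default to the example
    Lr1min/Lr1max values given in the text."""
    r_min, r_max = min(ratios), max(ratios)
    span = r_max - r_min
    if span == 0:
        # Degenerate case (all ratios equal): an assumed fallback.
        return [lo for _ in ratios]
    return [(r - r_min) / span * (hi - lo) + lo for r in ratios]
```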
In another example, a dark region of the target scene may contain texture changes, but the corresponding dark region of the color RGB space image obtained by shooting cannot reflect those texture changes because the light is too weak. To address this, the correction parameters may further include a brightness enhancement parameter in addition to the color correction coefficient. For example, the brightness enhancement parameter of the first pixel point may be determined according to the brightness value of the second pixel point; then, when the RGB pixel value of the first pixel point of the color RGB space image is corrected according to the correction parameters, it is corrected according to both the brightness enhancement parameter and the color correction coefficient.
Exemplarily, the brightness enhancement parameter may be determined from the brightness value of the second pixel point by the following formula (3):
Lr2 = L × Y2    formula (3);
where Lr2 represents the brightness enhancement parameter, L represents a luminance parameter, and Y2 represents the luminance value of the second pixel point. Illustratively, L may take a value greater than 0 and less than or equal to 1.
S203, correcting the RGB pixel value of the first pixel point of the color RGB space image according to the correction parameter.
It should be understood that the infrared RGB space image and the color RGB space image both include a plurality of pixel points, and the operation of the image processing flow shown in fig. 2 is performed for each pixel point, that is, the fusion processing of the infrared RGB space image and the color RGB space image is completed.
Of the color image and the infrared image shot by the two image sensors at the same moment, the color image has color information but low brightness, while the infrared image is bright but lacks color information. With the image processing method provided by the embodiment of the application, color information and brightness information can be effectively extracted from the color image and the infrared image for fusion processing, improving image quality; the color tone of the color image after fusion processing remains unchanged, effectively avoiding the color cast problem.
In addition, the prior art performs image fusion in YUV space. Since the color information (UV) and luminance information (Y) in YUV space cannot be decoupled, processing only the Y information necessarily changes the color information, resulting in color cast of the fused image. Compared with the prior art, the technical scheme of the application enhances the color information while processing the brightness information, and it can be proven mathematically that the color tone of the fused image is identical to that of the color image; the color cast problem of existing fusion algorithms can thus be solved, and the color of the fused image is more real and natural.
In one example, the correction parameter only includes a color correction coefficient, and when the RGB pixel value of the first pixel point of the color RGB space image is corrected according to the correction parameter, the color correction coefficient may be multiplied by the RGB pixel value of the first pixel point of the color RGB space image, so as to obtain the corrected RGB pixel value of the first pixel point of the color RGB space image.
In another example, the correction parameter includes not only the color correction coefficient but also the luminance boost parameter. When the RGB pixel values of the first pixel point of the color RGB space image are corrected according to the brightness enhancement parameter and the color correction coefficient, the RGB pixel values of the first pixel point of the color RGB space image may be linearized according to the brightness enhancement parameter and the color correction coefficient, so as to obtain the corrected RGB pixel values of the first pixel point of the color RGB space image.
Two ways of performing linearization processing on the RGB pixel values of the first pixel point of the color RGB space image according to the luminance boost parameter and the color correction coefficient are exemplarily described as follows:
In example one, the color correction coefficient is multiplied by the sum of the RGB pixel value of the color RGB space image and the luminance boost parameter, to obtain the corrected RGB pixel value of the first pixel point of the color RGB space image.
For example, the RGB pixel value of the first pixel point of the corrected color RGB space image can be represented by the following formula (4):
Cfuse1 = A1 × Lr1 × (C + Lr2)    formula (4);
where Cfuse1 represents a corrected pixel value (one of R, G, B) of the color RGB space image, Lr1 represents the color correction coefficient of the first pixel point, Lr2 represents the luminance boost parameter of the first pixel point, and C ∈ {R, G, B}, i.e. C is the pixel value of one of the three color channels R, G and B. A1 is a preconfigured parameter value and may be a number greater than 0 and less than or equal to 1.
For example, when A1 = 1, formula (4) can be expanded for the three RGB channels as:
Rfuse1 = Lr1 × (CR + Lr2);
Gfuse1 = Lr1 × (CG + Lr2);
Bfuse1 = Lr1 × (CB + Lr2);
where Rfuse1 represents the R channel pixel value of the first pixel point of the corrected color RGB space image, Gfuse1 represents the G channel pixel value of the first pixel point of the corrected color RGB space image, and Bfuse1 represents the B channel pixel value of the first pixel point of the corrected color RGB space image; CR represents the R channel pixel value of the first pixel point of the color RGB space image, CG represents the G channel pixel value, and CB represents the B channel pixel value.
Exemplarily, when any of the RGB three channel pixel values of the first pixel point, obtained by multiplying the color correction coefficient by the sum of the corresponding channel pixel value of the color RGB space image and the luminance boost parameter, is greater than the maximum pixel value supported by the image processing device, that channel pixel value is set to the maximum pixel value supported by the image processing device. For example, if the maximum pixel value supported by the image processing device is 255 and the R channel pixel value computed in this way is 258, the R channel pixel value of the first pixel point is set to 255.
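A sketch of the example-one correction (formula (4)) for a single pixel, including the per-channel clipping just described. The defaults A1 = 1 and a maximum pixel value of 255 follow the examples in the text.

```python
def fuse_pixel_v1(rgb, lr1, lr2, a1=1.0, max_val=255):
    """Example-one correction, formula (4):
    C_fuse1 = A1 * Lr1 * (C + Lr2) for each channel C in (R, G, B),
    clipped to the device's maximum supported pixel value."""
    return tuple(min(max_val, a1 * lr1 * (c + lr2)) for c in rgb)
```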
In example two, the color correction coefficient is multiplied by the RGB pixel value of the color RGB space image, and the result is then summed with the luminance boost parameter, to obtain the corrected RGB pixel value of the first pixel point of the color RGB space image.
For example, the RGB pixel values of the first pixel point of the corrected color RGB space image can be represented by the following formula (5):
Cfuse2 = A2 × Lr1 × C + Lr2    formula (5);
where Cfuse2 represents a corrected pixel value (one of R, G, B) of the color RGB space image, Lr1 represents the color correction coefficient of the first pixel point, Lr2 represents the luminance boost parameter of the first pixel point, and C ∈ {R, G, B}, i.e. C is the pixel value of one of the three color channels R, G and B. A2 is a preconfigured parameter value and may be a number greater than 0 and less than or equal to 1; for example, when A2 = 1, the corrected RGB pixel value of the first pixel point of the color RGB space image may be represented as: Cfuse2 = Lr1 × C + Lr2.
Illustratively, the color correction coefficient is multiplied by RGB pixel values of the color RGB spatial image, and then summed with the luminance boost parameter, and when any channel pixel value of RGB three channels of the first pixel point of the obtained RGB spatial image is greater than a maximum pixel value supported by the image processing device, the channel pixel value is set as the maximum pixel value supported by the image processing device.
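The example-two correction (formula (5)) with the same clipping rule can be sketched analogously; A2 = 1 and the 255 maximum are again the example values from the text.

```python
def fuse_pixel_v2(rgb, lr1, lr2, a2=1.0, max_val=255):
    """Example-two correction, formula (5):
    C_fuse2 = A2 * Lr1 * C + Lr2 for each channel C in (R, G, B),
    clipped to the device's maximum supported pixel value."""
    return tuple(min(max_val, a2 * lr1 * c + lr2) for c in rgb)
```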
In another example, the correction parameter includes a color correction coefficient. When the color correction coefficient determined according to the luminance value of the second pixel point and the luminance value of the first pixel point is less than or equal to 1, the step of determining a luminance boost parameter from the luminance value of the second pixel point is performed, and the RGB pixel value of the first pixel point is then corrected according to both the color correction coefficient and the luminance boost parameter. When the color correction coefficient is greater than 1, the step of determining a luminance boost parameter is not performed, and the RGB pixel value of the first pixel point is corrected according to the color correction coefficient alone.
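The conditional flow described above can be sketched for one pixel as follows. This sketch combines the coefficient and the boost in the example-two style (formula (5), with A2 = 1); the values α = 1/6 and L = 0.5 are illustrative parameter choices, not values fixed by the text.

```python
def correct_pixel(rgb, y_ir, y_color, alpha=1 / 6, lum_param=0.5, max_val=255):
    """Conditional correction of one pixel: the luminance boost is
    computed and applied only when the color correction coefficient
    r1 is <= 1; otherwise the coefficient is applied alone.
    Combination follows the example-two form (formula (5), A2 = 1)."""
    denom = alpha * y_ir + (1 - alpha) * y_color
    r1 = y_ir / denom if denom else 1.0
    lr2 = lum_param * y_ir if r1 <= 1 else 0.0  # boost only when r1 <= 1
    return tuple(min(max_val, r1 * c + lr2) for c in rgb)
```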
Based on the same inventive concept of the method embodiment, the embodiment of the present application provides an image processing apparatus, which includes an acquisition unit 301 and a correction processing unit 302, as shown in fig. 3A and 3B.
The acquiring unit 301 is configured to acquire a luminance value of a first pixel point of a color RGB space image, and acquire a luminance value of a second pixel point of an infrared RGB space image, where the infrared RGB space image and the color RGB space image are obtained by acquiring the same target scene, and a position of the second pixel point in the infrared RGB space image is the same as a position of the first pixel point in the color RGB space image.
A correction processing unit 302, configured to determine, according to the luminance value of the second pixel and the luminance value of the first pixel, a correction parameter for correcting the luminance of the first pixel; and correcting the RGB pixel value of the first pixel point of the color RGB space image according to the correction parameter.
In one possible design, as shown in fig. 3A, the apparatus may further include a first acquisition unit 303a and a second acquisition unit 303b.
The first collecting unit 303a is configured to collect the color RGB spatial image, and the second collecting unit 303b is configured to collect the infrared RGB spatial image.
In another possible design, referring to fig. 3B, the first acquiring unit 303a is configured to acquire a color brightness chrominance YUV space image, and the second acquiring unit 303B is configured to acquire an infrared YUV space image, and then the apparatus further includes: a conversion unit 304, configured to convert the color YUV space image into the color RGB space image, and convert the infrared YUV space image into the infrared RGB space image.
In one possible design, the correction parameters include a color correction factor and a brightness boost parameter; the modification processing unit 302 is specifically configured to:
determining the color correction coefficient of the first pixel point according to the brightness value of the second pixel point and the brightness value of the first pixel point, and determining the brightness improvement parameter of the first pixel point according to the brightness value of the second pixel point;
and correcting the RGB pixel value of the first pixel point of the color RGB space image according to the brightness improving parameter and the color correction coefficient.
In a possible design, when performing correction processing on RGB pixel values of a first pixel of the color RGB space image according to the luminance boost parameter and the color correction coefficient, the correction processing unit 302 is specifically configured to:
and carrying out linearization processing on the RGB pixel value of the first pixel point of the color RGB space image according to the brightness improving parameter and the color correction coefficient to obtain the corrected RGB pixel value of the first pixel point of the color RGB space image.
In a possible design, when the RGB pixel values of the first pixel of the color RGB space image are linearized according to the luminance enhancement parameter and the color correction coefficient, the correction processing unit 302 is specifically configured to:
multiplying the color correction coefficient by the sum of the RGB pixel value of the color RGB space image and the brightness improvement parameter to obtain the corrected RGB pixel value of the first pixel point of the color RGB space image; or
multiplying the color correction coefficient by the RGB pixel value of the color RGB space image and then summing with the brightness improvement parameter to obtain the corrected RGB pixel value of the first pixel point of the color RGB space image.
In one possible design, the correction parameters include a color correction factor;
the modification processing unit 302 is specifically configured to:
determining the color correction coefficient according to the brightness value of the second pixel point and the brightness value of the first pixel point;
and correcting the RGB pixel value of the first pixel point of the color RGB space image according to the color correction coefficient.
In a possible design, when determining the color correction coefficient of the first pixel point according to the luminance value of the second pixel point and the luminance value of the first pixel point, the correction processing unit 302 is specifically configured to:
taking the ratio of the brightness value to the weighted value of the second pixel point as the color correction coefficient of the first pixel point; and the weighted value is the weighted sum of the brightness value of the second pixel point and the brightness value of the first pixel point.
In a possible design, when determining the color correction coefficient of the first pixel according to the luminance value of the second pixel and the luminance value of the first pixel, the correction processing unit 302 is specifically configured to:
determining the ratio of the brightness value to the weighted value of the second pixel point; the weighted value is the weighted sum of the brightness value of the second pixel point and the brightness value of the first pixel point; and carrying out linear transformation on the ratio to obtain a color correction coefficient of the first pixel point.
It should be noted that the division of the unit in the embodiment of the present application is schematic, and is only a logic function division, and there may be another division manner in actual implementation. The functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
Illustratively, the obtaining unit 301, the modification processing unit 302, and the converting unit 304 may be implemented by one or more processors. The processor may be implemented as a central processing unit, a general purpose processor, a digital signal processor, an application specific integrated circuit, a field programmable gate array or other programmable logic device, transistor logic, hardware components, or any combination thereof that may implement or execute the various illustrative logical blocks, modules, and circuits described in connection with the embodiments disclosed herein. A processor may also be a combination of computing functions, e.g., a combination of one or more microprocessors, a digital signal processor and a microprocessor, or the like. The first collecting unit 303a and the second collecting unit 303b may be two image sensors, the two image sensors may be configured in the same camera, and the one camera further includes an optical lens and a beam splitter, and the beam splitter distributes light received by the optical lens to the two image sensors. The two image sensors can also be respectively configured in different cameras, and each camera also comprises an optical lens.
In one possible implementation manner, referring to fig. 4, an example of another possible structure of an image processing apparatus according to an embodiment of the present application includes a processor 401, a memory 402, and a communication interface 403. The communication interface 403 receives an image signal from a signal source. The signal source may be a processing content source of the image processing apparatus, and may be a picture or a video frame. The processor 401 may be used to support image processing apparatus to perform the relevant functions of image processing. The memory 402 is used for supporting the processor 401 to call the computer program and instructions in the memory 402 to implement the steps involved in the image processing method provided by the embodiment of the present application, and in addition, the memory 402 is also used for storing data, such as color images and infrared images to be processed. The memory 402 may include both volatile and non-volatile memory, such as memory and a hard disk. The non-volatile memory may be a read-only memory (ROM), a Programmable ROM (PROM), an Erasable PROM (EPROM), an electrically Erasable EPROM (EEPROM), or a flash memory. Volatile memory can be Random Access Memory (RAM), which acts as external cache memory. By way of example, but not limitation, many forms of RAM are available, such as Static Random Access Memory (SRAM), Dynamic Random Access Memory (DRAM), Synchronous Dynamic Random Access Memory (SDRAM), double data Rate SDRAM (DDR SDRAM), Enhanced SDRAM (ESDRAM), synchlink DRAM (SLDRAM), and Direct Rambus RAM (DRRAM).
In another possible implementation manner, referring to fig. 5, another possible structure of an image processing apparatus according to an embodiment of the present application includes a first image sensor 501a, a second image sensor 501b, a processor 502, and a memory 503. The first image sensor 501a is used to capture color images and the second image sensor 501b is used to capture infrared images. In one example, the color image and the infrared image are both RGB space images, the first image sensor 501a is specifically configured to capture the color RGB space image, and the second image sensor 501b is specifically configured to capture the infrared RGB space image. In another example, the color image and the infrared image are both YUV space images, the first image sensor 501a is specifically configured to acquire a color YUV space image, the second image sensor 501b is specifically configured to acquire an infrared YUV space image, and the processor 502 converts the color YUV space image into a color RGB space image and converts the infrared YUV space image into an infrared RGB space image. The processor 502 may be used to support image processing devices to perform image processing related functions. The memory 503 is used for supporting the processor 502 to call the computer program and instructions in the memory 503 to implement the steps involved in the image processing method provided by the embodiment of the present application, and in addition, the memory 503 is also used for storing data, such as color images and infrared images to be processed. The memory 503 may include both volatile and nonvolatile memories, such as a memory and a hard disk, which are described in detail above and will not be described herein again.
Illustratively, the image processing apparatus shown in fig. 4 or 5 may further include a display screen for displaying images, videos, and the like. The display screen includes a display panel. The display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like.
Based on the same inventive concept, embodiments of the present application provide a computer program product, which includes a computer program, and when the computer program is executed on a computer, the computer will implement the functions of any of the above-mentioned embodiments of the image processing method.
Based on the same inventive concept, embodiments of the present application provide a computer program, which when executed on a computer, will enable the computer to implement the functions involved in any of the above-described image processing method embodiments.
Based on the same inventive concept, the present application provides a computer-readable storage medium for storing programs and instructions, which when called to execute in a computer, can cause the computer to execute the functions related to any of the above-mentioned image processing method embodiments.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
It will be apparent to those skilled in the art that various changes and modifications may be made in the embodiments of the present application without departing from the scope of the embodiments of the present application. Thus, if such modifications and variations of the embodiments of the present application fall within the scope of the claims of the present application and their equivalents, the present application is also intended to encompass such modifications and variations.

Claims (18)

1. An image processing method, comprising:
acquiring the brightness value of a first pixel point of a color RGB space image and the brightness value of a second pixel point of an infrared RGB space image, wherein the infrared RGB space image and the color RGB space image are obtained by capturing the same target scene, and the position of the second pixel point in the infrared RGB space image is the same as the position of the first pixel point in the color RGB space image;
determining a correction parameter for correcting the brightness of the first pixel point according to the brightness value of the second pixel point and the brightness value of the first pixel point;
and correcting the RGB pixel value of the first pixel point of the color RGB space image according to the correction parameter.
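The pipeline of claim 1 can be sketched per pixel as follows. This is an illustrative sketch only: the Rec.601 luminance weights, the blend weight `w`, and the function names are assumptions for demonstration, not values fixed by the patent.

```python
# Sketch of the claim-1 pipeline: correct one color pixel's RGB value
# using the luminance of the co-located infrared pixel.

def luma(rgb):
    """Approximate luminance of an (R, G, B) triple (Rec.601 weights, assumed)."""
    r, g, b = rgb
    return 0.299 * r + 0.587 * g + 0.114 * b

def correct_pixel(color_rgb, ir_rgb, w=0.5):
    """Correct one color pixel using the co-located infrared pixel."""
    y_color = luma(color_rgb)   # brightness value of the first pixel point
    y_ir = luma(ir_rgb)         # brightness value of the second pixel point
    # Correction coefficient: IR luminance over a weighted sum of the two
    # luminances (the claim-6 form); small epsilon avoids division by zero.
    k = y_ir / (w * y_ir + (1 - w) * y_color + 1e-6)
    return tuple(k * c for c in color_rgb)

corrected = correct_pixel((40.0, 50.0, 60.0), (120.0, 120.0, 120.0))
```

Because the infrared pixel is brighter than the color pixel in this example, the coefficient exceeds 1 and the corrected RGB values are raised.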
2. The method of claim 1, wherein the correction parameters include a color correction coefficient and a brightness boost parameter;
the determining a correction parameter of the first pixel point according to the brightness value of the second pixel point and the brightness value of the first pixel point comprises:
determining a color correction coefficient of the first pixel point according to the brightness value of the second pixel point and the brightness value of the first pixel point, and determining a brightness boost parameter of the first pixel point according to the brightness value of the second pixel point;
and the correcting the RGB pixel value of the first pixel point of the color RGB space image according to the correction parameter comprises:
correcting the RGB pixel value of the first pixel point of the color RGB space image according to the brightness boost parameter and the color correction coefficient.
3. The method according to claim 2, wherein the correcting the RGB pixel value of the first pixel point of the color RGB space image according to the brightness boost parameter and the color correction coefficient comprises:
performing linearization processing on the RGB pixel value of the first pixel point of the color RGB space image according to the brightness boost parameter and the color correction coefficient, to obtain a corrected RGB pixel value of the first pixel point of the color RGB space image.
4. The method according to claim 3, wherein the performing linearization processing on the RGB pixel value of the first pixel point of the color RGB space image according to the brightness boost parameter and the color correction coefficient comprises:
multiplying the color correction coefficient by the sum of the RGB pixel value of the first pixel point of the color RGB space image and the brightness boost parameter, to obtain a corrected RGB pixel value of the first pixel point of the color RGB space image; or,
multiplying the color correction coefficient by the RGB pixel value of the first pixel point of the color RGB space image and then adding the brightness boost parameter, to obtain a corrected RGB pixel value of the first pixel point of the color RGB space image.
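The two alternative linearizations of claim 4 can be written compactly as follows, where `k` stands for the color correction coefficient and `delta` for the brightness boost parameter (variable names are illustrative, not the patent's notation):

```python
def correct_add_then_multiply(rgb, k, delta):
    """Claim 4, first form: k * (RGB + delta) applied per channel."""
    return tuple(k * (c + delta) for c in rgb)

def correct_multiply_then_add(rgb, k, delta):
    """Claim 4, second form: k * RGB + delta applied per channel."""
    return tuple(k * c + delta for c in rgb)
```

The two forms differ only in whether the boost is scaled by the coefficient; for example, with k = 2 and delta = 5, the first form maps (10, 20, 30) to (30, 50, 70) while the second maps it to (25, 45, 65).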
5. The method of claim 1, wherein the correction parameters include color correction coefficients;
the determining a correction parameter of the first pixel point according to the brightness value of the second pixel point and the brightness value of the first pixel point comprises:
determining the color correction coefficient according to the brightness value of the second pixel point and the brightness value of the first pixel point;
and the correcting the RGB pixel value of the first pixel point of the color RGB space image according to the correction parameter comprises:
correcting the RGB pixel value of the first pixel point of the color RGB space image according to the color correction coefficient.
6. The method according to any one of claims 1 to 5, wherein the determining the color correction coefficient of the first pixel point according to the brightness value of the second pixel point and the brightness value of the first pixel point comprises:
taking the ratio of the brightness value of the second pixel point to a weighted value as the color correction coefficient of the first pixel point, wherein the weighted value is the weighted sum of the brightness value of the second pixel point and the brightness value of the first pixel point.
7. The method according to any one of claims 1 to 5, wherein the determining the color correction coefficient of the first pixel point according to the brightness value of the second pixel point and the brightness value of the first pixel point comprises:
determining the ratio of the brightness value of the second pixel point to a weighted value, wherein the weighted value is the weighted sum of the brightness value of the second pixel point and the brightness value of the first pixel point;
and performing linear transformation on the ratio to obtain the color correction coefficient of the first pixel point.
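Claims 6 and 7 can be sketched in one helper: the ratio of the infrared luminance to a weighted sum of the two luminances, optionally passed through a linear transform. The weights `w_ir`, `w_color` and the transform parameters `a`, `b` are assumed defaults for illustration; the patent does not fix their values.

```python
def color_correction_coefficient(y_ir, y_color,
                                 w_ir=0.5, w_color=0.5,
                                 a=1.0, b=0.0):
    """Claim 6: ratio of the second (IR) pixel's brightness to a weighted
    sum of the two brightness values. Claim 7 additionally applies the
    linear transform a * ratio + b; a=1, b=0 reduces to the claim-6 form."""
    ratio = y_ir / (w_ir * y_ir + w_color * y_color)
    return a * ratio + b
```

With equal weights, an IR luminance of 120 and a color luminance of 40 give a ratio of 120 / 80 = 1.5, so the color pixel is brightened.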
8. An image processing apparatus characterized by comprising:
the device comprises an acquisition unit and a processing unit, wherein the acquisition unit is used for acquiring the brightness value of a first pixel point of a color RGB space image and acquiring the brightness value of a second pixel point of an infrared RGB space image, the infrared RGB space image and the color RGB space image are obtained by acquiring the same target scene, and the position of the second pixel point in the infrared RGB space image is the same as the position of the first pixel point in the color RGB space image;
the correction processing unit is used for determining a correction parameter for correcting the brightness of the first pixel point according to the brightness value of the second pixel point and the brightness value of the first pixel point; and correcting the RGB pixel value of the first pixel point of the color RGB space image according to the correction parameter.
9. The apparatus of claim 8, further comprising a first acquisition unit and a second acquisition unit;
the first acquisition unit is configured to acquire the color RGB space image, and the second acquisition unit is configured to acquire the infrared RGB space image; or,
the first acquisition unit is configured to acquire a color luminance-chrominance (YUV) space image, the second acquisition unit is configured to acquire an infrared YUV space image, and the apparatus further comprises:
a conversion unit, configured to convert the color YUV space image into the color RGB space image and convert the infrared YUV space image into the infrared RGB space image.
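The YUV-to-RGB conversion performed by the conversion unit of claim 9 could use any standard color matrix; the patent does not fix one. A common choice is the BT.601 full-range transform, sketched below with assumed 8-bit-style values centered on 128 for the chrominance channels.

```python
def yuv_to_rgb(y, u, v):
    """Convert one full-range BT.601 YUV sample to RGB (one common choice;
    the matrix coefficients are an assumption, not the patent's)."""
    r = y + 1.402 * (v - 128.0)
    g = y - 0.344136 * (u - 128.0) - 0.714136 * (v - 128.0)
    b = y + 1.772 * (u - 128.0)
    return (r, g, b)
```

A neutral gray (U = V = 128) maps to R = G = B = Y, which is a quick sanity check for any chosen matrix.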
10. The apparatus according to claim 8 or 9, wherein the correction parameters include a color correction coefficient and a brightness boost parameter; the correction processing unit is specifically configured to:
determine a color correction coefficient of the first pixel point according to the brightness value of the second pixel point and the brightness value of the first pixel point, and determine a brightness boost parameter of the first pixel point according to the brightness value of the second pixel point; and
correct the RGB pixel value of the first pixel point of the color RGB space image according to the brightness boost parameter and the color correction coefficient.
11. The apparatus according to claim 10, wherein the correction processing unit, when correcting the RGB pixel value of the first pixel point of the color RGB space image according to the brightness boost parameter and the color correction coefficient, is specifically configured to:
perform linearization processing on the RGB pixel value of the first pixel point of the color RGB space image according to the brightness boost parameter and the color correction coefficient, to obtain a corrected RGB pixel value of the first pixel point of the color RGB space image.
12. The apparatus according to claim 11, wherein the correction processing unit, when performing linearization processing on the RGB pixel value of the first pixel point of the color RGB space image according to the brightness boost parameter and the color correction coefficient, is specifically configured to:
multiply the color correction coefficient by the sum of the RGB pixel value of the first pixel point of the color RGB space image and the brightness boost parameter, to obtain a corrected RGB pixel value of the first pixel point of the color RGB space image; or,
multiply the color correction coefficient by the RGB pixel value of the first pixel point of the color RGB space image and then add the brightness boost parameter, to obtain a corrected RGB pixel value of the first pixel point of the color RGB space image.
13. The apparatus of claim 8 or 9, wherein the correction parameters comprise color correction coefficients;
the correction processing unit is specifically configured to:
determine the color correction coefficient according to the brightness value of the second pixel point and the brightness value of the first pixel point; and
correct the RGB pixel value of the first pixel point of the color RGB space image according to the color correction coefficient.
14. The apparatus according to any one of claims 8 to 13, wherein the correction processing unit, when determining the color correction coefficient of the first pixel point according to the brightness value of the second pixel point and the brightness value of the first pixel point, is specifically configured to:
take the ratio of the brightness value of the second pixel point to a weighted value as the color correction coefficient of the first pixel point, wherein the weighted value is the weighted sum of the brightness value of the second pixel point and the brightness value of the first pixel point.
15. The apparatus according to any one of claims 8 to 14, wherein the correction processing unit, when determining the color correction coefficient of the first pixel point according to the brightness value of the second pixel point and the brightness value of the first pixel point, is specifically configured to:
determine the ratio of the brightness value of the second pixel point to a weighted value, wherein the weighted value is the weighted sum of the brightness value of the second pixel point and the brightness value of the first pixel point; and
perform linear transformation on the ratio to obtain the color correction coefficient of the first pixel point.
16. An image processing apparatus, comprising at least one processor coupled with at least one memory, wherein the at least one processor is configured to execute computer programs or instructions stored in the at least one memory to cause the apparatus to perform the method of any one of claims 1 to 7.
17. The apparatus of claim 16, further comprising a first image sensor configured to capture a static or dynamic color image and a second image sensor configured to capture a static or dynamic infrared image.
18. A computer-readable storage medium, having stored thereon a computer program or instructions, which, when read and executed by a computer, cause the computer to perform the method of any one of claims 1 to 7.
CN201910418979.7A 2019-05-20 2019-05-20 Image processing method and image processing device Pending CN111970432A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910418979.7A CN111970432A (en) 2019-05-20 2019-05-20 Image processing method and image processing device


Publications (1)

Publication Number Publication Date
CN111970432A true CN111970432A (en) 2020-11-20

Family

ID=73358261

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910418979.7A Pending CN111970432A (en) 2019-05-20 2019-05-20 Image processing method and image processing device

Country Status (1)

Country Link
CN (1) CN111970432A (en)


Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140168444A1 (en) * 2012-12-14 2014-06-19 Korea University Research And Business Foundation Apparatus and method for fusing images
CN103971351A (en) * 2013-02-04 2014-08-06 三星泰科威株式会社 Image fusion method and apparatus using multi-spectral filter array sensor
CN105554483A (en) * 2015-07-16 2016-05-04 宇龙计算机通信科技(深圳)有限公司 Image processing method and terminal
CN105578063A (en) * 2015-07-14 2016-05-11 宇龙计算机通信科技(深圳)有限公司 Image processing method and terminal


Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112446839A (en) * 2020-11-30 2021-03-05 平安科技(深圳)有限公司 Image enhancement method and device, electronic equipment and computer readable storage medium
WO2022110712A1 (en) * 2020-11-30 2022-06-02 平安科技(深圳)有限公司 Image enhancement method and apparatus, electronic device and computer readable storage medium
CN112446839B (en) * 2020-11-30 2023-11-07 平安科技(深圳)有限公司 Image enhancement method, image enhancement device, electronic equipment and computer readable storage medium
CN112788321A (en) * 2021-01-05 2021-05-11 锐芯微电子股份有限公司 Image color recovery method and apparatus, image pickup apparatus, and storage medium
CN112884688A (en) * 2021-02-03 2021-06-01 浙江大华技术股份有限公司 Image fusion method, device, equipment and medium
CN112884688B (en) * 2021-02-03 2024-03-29 浙江大华技术股份有限公司 Image fusion method, device, equipment and medium
CN113592739A (en) * 2021-07-30 2021-11-02 浙江大华技术股份有限公司 Method and device for correcting lens shadow and storage medium
WO2023124201A1 (en) * 2021-12-29 2023-07-06 荣耀终端有限公司 Image processing method and electronic device
TWI812516B (en) * 2022-10-20 2023-08-11 緯創資通股份有限公司 Image processing apparatus and image processing method

Similar Documents

Publication Publication Date Title
CN111970432A (en) Image processing method and image processing device
TWI737979B (en) Image demosaicer and method
TWI416940B (en) Image processing apparatus and image processing program
CN105915909B (en) A kind of high dynamic range images layered compression method
JP5392560B2 (en) Image processing apparatus and image processing method
CN111738970A (en) Image fusion method and device and computer readable storage medium
WO2019105305A1 (en) Image brightness processing method, computer readable storage medium and electronic device
JP2012109900A (en) Photographing device, photographing method and program
CN112351195B (en) Image processing method, device and electronic system
US20180025476A1 (en) Apparatus and method for processing image, and storage medium
US20170154437A1 (en) Image processing apparatus for performing smoothing on human face area
CN110213502A (en) Image processing method, device, storage medium and electronic equipment
CN110248242A (en) A kind of image procossing and live broadcasting method, device, equipment and storage medium
Kao High dynamic range imaging by fusing multiple raw images and tone reproduction
WO2018152977A1 (en) Image noise reduction method and terminal, and computer storage medium
CN110807735A (en) Image processing method, image processing device, terminal equipment and computer readable storage medium
CN113824914B (en) Video processing method and device, electronic equipment and storage medium
CN104010134B (en) For forming the system and method with wide dynamic range
JP2003199115A (en) Method for contrast enhancement in color digital image
CN105118032B (en) A kind of wide method for dynamically processing of view-based access control model system
JP7277158B2 (en) Setting device and method, program, storage medium
EP4090006A2 (en) Image signal processing based on virtual superimposition
JP4359662B2 (en) Color image exposure compensation method
WO2022067761A1 (en) Image processing method and apparatus, capturing device, movable platform, and computer readable storage medium
KR101903428B1 (en) System and Method of Color Correction for Related Images

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20201120