CN115103168A - Image generation method, image generation device, electronic equipment and storage medium - Google Patents

Image generation method, image generation device, electronic equipment and storage medium

Info

Publication number
CN115103168A
CN115103168A CN202210737275.8A
Authority
CN
China
Prior art keywords
color
image
channel
target
pixel point
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210737275.8A
Other languages
Chinese (zh)
Inventor
王秀花
凌晨
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Spreadtrum Communications Shanghai Co Ltd
Original Assignee
Spreadtrum Communications Shanghai Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Spreadtrum Communications Shanghai Co Ltd filed Critical Spreadtrum Communications Shanghai Co Ltd
Priority to CN202210737275.8A priority Critical patent/CN115103168A/en
Publication of CN115103168A publication Critical patent/CN115103168A/en
Pending legal-status Critical Current

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/12Picture reproducers
    • H04N9/31Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N9/3179Video signal processing therefor
    • H04N9/3182Colour adjustment, e.g. white balance, shading or gamut

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Image Processing (AREA)

Abstract

According to the image generation method and apparatus, the electronic device and the storage medium provided by the present application, the server acquires an original image in the current color temperature scene and performs resolution supplement on the original image to obtain each color channel value of each pixel point in the original image. The server then acquires the target channel correction coefficient corresponding to the current color temperature scene and corrects the original color channel value of each pixel point according to the target channel correction coefficient and the color channel values of that pixel point, obtaining a corrected image in which the components of other channels mixed into the original color channel are effectively separated out. Finally, the corrected image is interpolated to generate each color channel value of each pixel point, and a target image corresponding to the original image is generated, so that the target image is accurately restored from the original image and its texture and color are accurate.

Description

Image generation method, image generation device, electronic equipment and storage medium
Technical Field
The present application relates to the field of image processing technologies, and in particular, to an image generation method and apparatus, an electronic device, and a storage medium.
Background
The night vision function of cameras is increasingly required in fields such as monitoring and automatic driving. In dark conditions, acquiring an image with an RGB-IR (Red Green Blue-Infrared) image sensor capable of sensing infrared (IR) light can enhance the display effect of the image; the RGB-IR image sensor is one type of camera.
For reasons such as the RGB-IR image sensor manufacturing process, the R, G and B channels of the RGB-IR image sensor sense infrared light in addition to visible light, and the IR channel also senses visible light; that is, in the original image collected by the RGB-IR image sensor, information of the visible-light R, G and B channels is mixed into the IR channel. In the prior art, in order to separate an RGB image from the original image, the IR information in the R, G and B channels of the RGB-IR image is usually removed, and the image with the IR information removed is then interpolated to obtain the RGB image.
However, with the above processing method, neither the RGB image nor the IR image can be accurately restored.
Disclosure of Invention
The application provides an image generation method, an image generation device, electronic equipment and a storage medium, which are used for solving the problem that in the prior art, information of R, G and B channels mixed in an IR channel is not considered, so that an RGB image and an IR image cannot be accurately restored.
In a first aspect, the present application provides an image generation method, including:
acquiring an original image in a current color temperature scene, wherein each pixel point in the original image is represented by an original color channel numerical value;
performing resolution supplement on the original image to obtain each color channel value of each pixel point in the original image;
acquiring a target channel correction coefficient corresponding to the current color temperature scene;
for each pixel point, correcting the original color channel value of the pixel point according to the target channel correction coefficient and each color channel value of the pixel point to obtain a corrected image;
interpolating the original color channel value of each pixel point in the corrected image to generate each color channel value of each pixel point;
and generating a target image corresponding to the original image according to each color channel value of each pixel point.
In a second aspect, the present application provides an image generation apparatus comprising:
the first acquisition module is used for acquiring an original image under a current color temperature scene, wherein each pixel point in the original image is represented by an original color channel numerical value;
the supplement module is used for performing resolution supplement on the original image to obtain each color channel numerical value of each pixel point in the original image;
the second acquisition module is used for acquiring a target channel correction coefficient corresponding to the current color temperature scene;
the correction module is used for correcting the original color channel value of each pixel point according to the target channel correction coefficient and each color channel value of the pixel point to obtain a corrected image;
the interpolation module is used for carrying out interpolation processing on the original color channel numerical value of each pixel point in the corrected image to generate each color channel numerical value of each pixel point;
and the generating module is used for generating a target image corresponding to the original image according to each color channel numerical value of each pixel point.
In a third aspect, the present application provides an electronic device, comprising: a processor, and a memory communicatively coupled to the processor;
the memory stores computer-executable instructions;
the processor executes computer-executable instructions stored in the memory to implement the picture generation method of the first aspect.
In a fourth aspect, the present application provides a computer-readable storage medium, in which computer-executable instructions are stored, and when the computer-executable instructions are executed by a processor, the computer-readable storage medium is used for implementing the picture generation method according to the first aspect.
In a fifth aspect, the present application provides a computer program product comprising a computer program which, when executed by a processor, implements the picture generation method of the first aspect.
According to the image generation method and apparatus, the electronic device and the storage medium provided by the present application, the server acquires an original image in the current color temperature scene and performs resolution supplement on the original image to obtain each color channel value of each pixel point in the original image. The server then acquires the target channel correction coefficient corresponding to the current color temperature scene and corrects the original color channel value of each pixel point according to the target channel correction coefficient and the color channel values of that pixel point, obtaining a corrected image in which the components of other channels mixed into the original color channel are effectively separated out. Finally, the corrected image is interpolated to generate each color channel value of each pixel point, and a target image corresponding to the original image is generated, so that the target image is accurately restored from the original image and its texture and color are accurate.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present application and together with the description, serve to explain the principles of the application.
Fig. 1 is a schematic flowchart of an image generation method according to an embodiment of the present application;
FIG. 2 is a diagram of an original image according to an example of the embodiment of the present application;
fig. 3 is a schematic flowchart of another image generation method provided in the second embodiment of the present application;
fig. 4 is a schematic flowchart of another image generation method provided in the third embodiment of the present application;
fig. 5 is a schematic flowchart of another image generation method according to the fourth embodiment of the present application;
FIG. 6 is a diagram illustrating a corrected image according to a fourth example of the embodiment of the present application;
FIG. 7 is a schematic diagram of a color difference gray scale map according to a fourth example of the present application;
fig. 8 is a schematic structural diagram of an image generating apparatus according to a fifth embodiment of the present application;
fig. 9 is a schematic structural diagram of an electronic device according to a sixth embodiment of the present invention.
With the above figures, there are shown specific embodiments of the present application, which will be described in more detail below. These drawings and written description are not intended to limit the scope of the inventive concepts in any manner, but rather to illustrate the inventive concepts to those skilled in the art by reference to specific embodiments.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The embodiments described in the following exemplary embodiments do not represent all embodiments consistent with the present application. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present application, as detailed in the appended claims.
In fields such as security monitoring and automatic driving, the camera used is generally an RGB-IR image sensor with a strong image display effect. The R, G and B channels of the RGB-IR image sensor sense visible light and infrared light at the same time, and the IR channel also senses visible light. That is, in the original image acquired by the RGB-IR image sensor, information of the visible-light R, G and B channels is mixed into the IR channel, and infrared information is mixed into the visible-light R, G and B channels.
In order to separate an RGB image and an IR image from the original image, the prior art usually removes the IR information in the R, G and B channels of the RGB-IR image, and then interpolates the image with the IR information removed to obtain the RGB image.
However, because the information of the R, G and B channels mixed into the IR channel is not considered, neither the RGB image nor the IR image can be accurately restored.
Therefore, the present application provides an image generation method that obtains the channel correction coefficient corresponding to the current color temperature scene to correct the original color channel values of the resolution-supplemented original image, then interpolates the corrected image to obtain each color channel value of each pixel point, and finally generates the target image corresponding to the original image, for example an RGB image and an IR image. The method can remove the IR component from the R, G and B channels and the R, G and B components from the IR channel, and can accurately restore the target image from the original image, so that the texture and color of the target image are accurate.
The application scenario of the present application may be the generation of the captured image in a monitoring place where the RGB-IR image sensor is installed, or the generation of the captured image in an autonomous vehicle where the RGB-IR image sensor is installed, which is not limited in the present application. It is understood that the image generation method provided by the present application includes, but is not limited to, the above application scenarios.
The following describes the technical solutions of the present application and how to solve the above technical problems with specific embodiments. The following embodiments may exist independently or in combination, and the same or similar concepts or processes may not be described in detail in some embodiments. Embodiments of the present application will be described below with reference to the accompanying drawings.
Referring to fig. 1, fig. 1 is a schematic flowchart of an image generating method provided in an embodiment of the present application, where the method may be performed by an image generating apparatus, where the apparatus may be a server, and the following description takes the server as an example, and the method includes the following steps.
S101, obtaining an original image in a current color temperature scene, wherein each pixel point in the original image is represented by an original color channel numerical value.
The original image in the current color temperature scene can be acquired by an RGB-IR image sensor, and the pixel value of each pixel point in the original image is represented by an original color channel value.
For example, fig. 2 shows an original image in which each grid cell represents a pixel point and the letter in each cell indicates that the pixel value of that pixel point is represented by the corresponding channel value. For example, the first pixel point in the upper left corner is represented by a G channel value (i.e., its original color channel value). In the figure, G denotes the green channel, R the red channel, B the blue channel, and IR the infrared channel.
S102, performing resolution supplement on the original image to obtain each color channel numerical value of each pixel point in the original image.
After the server acquires the original image, the resolution of the original image is supplemented, and the numerical value of each color channel of each pixel point in the image is obtained.
The resolution supplement method may be, for example, a simple interpolation operation: for a given pixel point, each of the three channel values other than its original color channel is computed as the mean of the corresponding channel values of neighboring pixel points. Take the target pixel point in fig. 2, whose original color channel value is a B channel value: interpolation is needed for its other three channels (R, G and IR). Taking the R channel value as an example, the mean of the R channel values of the neighboring pixel points (the four pixel points shown in bold in the figure; the number of neighboring pixel points is not limited) gives the R channel value of the target pixel point. A bilinear interpolation method may also be used, which is not limited in the present application.
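As an illustration of this step, the following Python sketch (assuming NumPy) fills the three missing channels of every pixel with a simple neighbourhood mean; the 4 × 4 mosaic pattern, the neighbourhood radius and the function names are illustrative assumptions, not details taken from the patent.

```python
import numpy as np

# Illustrative 4x4 RGB-IR mosaic pattern (an assumption, not the patent's layout).
PATTERN = np.array([["G",  "R",  "G",  "IR"],
                    ["B",  "G",  "IR", "G"],
                    ["G",  "IR", "G",  "R"],
                    ["IR", "G",  "B",  "G"]])

CHANNELS = ["R", "G", "B", "IR"]

def supplement_resolution(raw, pattern=PATTERN, radius=2):
    """Resolution supplement by neighbourhood mean: raw is an HxW mosaic,
    the result is an HxWx4 array ordered (R, G, B, IR) plus the channel map."""
    h, w = raw.shape
    chan_map = np.tile(pattern, (h // pattern.shape[0] + 1,
                                 w // pattern.shape[1] + 1))[:h, :w]
    full = np.zeros((h, w, 4), dtype=np.float32)
    for y in range(h):
        for x in range(w):
            for ci, c in enumerate(CHANNELS):
                if chan_map[y, x] == c:
                    full[y, x, ci] = raw[y, x]      # keep the original channel value
                    continue
                y0, y1 = max(0, y - radius), min(h, y + radius + 1)
                x0, x1 = max(0, x - radius), min(w, x + radius + 1)
                mask = chan_map[y0:y1, x0:x1] == c  # neighbours that carry channel c
                if mask.any():
                    full[y, x, ci] = raw[y0:y1, x0:x1][mask].mean()
    return full, chan_map
```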
S103, acquiring a target channel correction coefficient corresponding to the current color temperature scene.
After supplementing the resolution of the original image, the server acquires the target channel correction coefficient corresponding to the current color temperature scene, which is subsequently used to correct the original color channel value of each pixel point in the original image.
The target channel correction coefficient may be obtained, for example, as follows: a color card image in the current color temperature scene is acquired and input into a coefficient prediction model obtained by pre-training to obtain the target channel correction coefficient corresponding to the current color temperature scene. It should be noted that the color card image is an image of a standard color card containing multiple colors, shot by the camera in the current color temperature scene.
For example, the original image after resolution supplement contains four channel values, and the target channel correction coefficient may be represented by a 4 × 4 matrix, for example:
[ a1  a2  a3  a4 ]
[ b1  b2  b3  b4 ]
[ c1  c2  c3  c4 ]
[ d1  d2  d3  d4 ]    (1)
where each row contains the correction coefficients for one color channel, e.g., a1-a4 for the R color channel, b1-b4 for the G color channel, c1-c4 for the B color channel, and d1-d4 for the IR color channel.
And S104, for each pixel point, correcting the original color channel value of the pixel point according to the target channel correction coefficient and each color channel value of the pixel point to obtain a corrected image.
After the server obtains the target channel correction coefficient corresponding to the current color temperature scene, for each pixel point, the server can correct the original color channel value of the pixel point according to the target channel correction coefficient and each color channel value of the pixel point.
Illustratively, the B-channel value of the target pixel point in fig. 2 is corrected as follows:
and (3) calculating products of correction coefficients c1-c4 corresponding to the B channel exemplified in the formula (1) and the pixel values of all channels of the target pixel point respectively, and then summing to obtain R × c1+ G × c2+ B × c3+ IR × c 4. The corrected B channel value of the target pixel is R × c1+ G × c2+ B × c3+ IR × c 4.
Alternatively, before performing the correction, the server may perform black level removal and/or dead pixel correction on the original image after the resolution supplement in step S102. The server then corrects the original color channel value of each pixel point according to the target channel correction coefficient and each color channel value of the pixel point after this pixel value processing, so that the influence of the black level and dead pixels on the correction result is avoided.
And S105, carrying out interpolation processing on the original color channel numerical value of each pixel point in the corrected image to generate each color channel numerical value of each pixel point.
The server performs interpolation processing on the original color channel value of each pixel point in the corrected raw-format image to generate each color channel value of each pixel point.
The interpolation processing method may be, for example: and obtaining other three channel values of each pixel point according to the color difference value of the channel value of the neighborhood pixel point of a certain pixel point to other three channel values except the original color channel.
It can be understood that, in the present application, the corrected value of the original color channel value is an accurate value obtained by separating components of other color channels, and at this time, interpolation processing is performed to obtain other color channel values, so that the pixel value of the image is more accurate, and the texture and color of the generated image are more accurate.
Optionally, before performing interpolation processing on the corrected image, the server may perform operations such as black level addition on the corrected image to remove the adverse effect of the above black level removal on the original image. The server then performs interpolation processing on the corrected image after this pixel value processing to obtain each color channel value of each pixel point, so that the interpolation result is more accurate.
And S106, generating a target image corresponding to the original image according to each color channel numerical value of each pixel point.
After obtaining each color channel value of each pixel point, the server may generate a target image corresponding to the original image according to each color channel value of each pixel point, where the target image may include one or more of the following images: RGB image, IR image, RAW image, YUV image, and the like.
The generated target image may be selected according to actual requirements, which is not limited in the present application.
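As a simple illustration of step S106, the sketch below assembles an RGB image, an IR image and a YUV image from the four per-pixel channel planes; the BT.601 luma/chroma weights are an illustrative assumption, since the patent does not specify the conversion.

```python
import numpy as np

def build_target_images(r, g, b, ir):
    """Assemble target images from the per-pixel channel values (step S106).
    Each argument is an HxW plane of the corresponding channel."""
    rgb = np.stack([r, g, b], axis=-1)
    y = 0.299 * r + 0.587 * g + 0.114 * b          # assumed BT.601 weights
    u = 0.492 * (b - y)
    v = 0.877 * (r - y)
    yuv = np.stack([y, u, v], axis=-1)
    return {"rgb": rgb, "ir": ir, "yuv": yuv}
```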
In this embodiment, the server acquires an original image in the current color temperature scene and performs resolution supplement on the original image to obtain each color channel value of each pixel point in the original image. The server then acquires the target channel correction coefficient corresponding to the current color temperature scene and corrects the original color channel value of each pixel point according to the target channel correction coefficient and the color channel values of that pixel point, obtaining a corrected image in which the components of other channels mixed into the original color channel are effectively separated out. Finally, the corrected image is interpolated to generate each color channel value of each pixel point, and a target image corresponding to the original image is generated, so that the target image is accurately restored from the original image and its texture and color are accurate.
On the basis of the first embodiment, a specific implementation manner for determining the target channel correction coefficient corresponding to the current color temperature scene is further provided through a second embodiment.
Referring to fig. 3, fig. 3 is a schematic flowchart of another image generation method provided in the second embodiment of the present application, where the method may be executed by an image generation apparatus, where the apparatus may be a server, and the following description takes the server as an example, and the method includes the following steps.
S301, obtaining a color temperature value of the current color temperature scene.
Generally speaking, the color temperature value is lower in a scene with darker light and higher in a scene with brighter light. The server may obtain the color temperature value of the current color temperature scene by reading the value measured by a color temperature meter placed in the scene. Of course, the color temperature value of the current color temperature scene may also be obtained in other manners, which is not limited in the present application.
S302, determining a target channel correction coefficient in the stored channel correction coefficients corresponding to the color temperature scenes according to the color temperature values.
According to the color temperature value, the server determines the target channel correction coefficient among the stored channel correction coefficients corresponding to the color temperature scenes.
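A minimal lookup sketch, assuming the stored coefficients are kept in a table keyed by calibrated color temperature (in Kelvin) and that the entry whose key is closest to the measured value is taken as the target channel correction coefficient; the temperatures and the identity matrices are placeholders.

```python
import numpy as np

# Placeholder table: real matrices would come from calibration or from the
# first neural network described below.
STORED_CCMS = {
    2800: np.eye(4),
    4000: np.eye(4),
    6500: np.eye(4),
}

def select_target_ccm(color_temp, table=STORED_CCMS):
    """Pick the correction matrix whose calibrated colour temperature is
    nearest to the measured colour temperature of the current scene."""
    nearest = min(table, key=lambda t: abs(t - color_temp))
    return table[nearest]

target_ccm = select_target_ccm(5100)   # falls back to the 4000 K matrix here
```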
Optionally, the channel correction coefficients corresponding to the color temperature scenes stored in the server may be obtained in the following manner:
and acquiring image data in each color temperature scene, wherein the image data comprises a monochromatic color card image corresponding to each color channel and a standard color card image corresponding to each color channel. The monochrome color chart images are captured by a single color filter in the camera, such as a blue color chart image, a green color chart image, a red color chart image, an infrared color chart image, and the like. The standard color card image is a standard color image photographed in a standard color temperature scene.
And then the server respectively inputs the image data under each color temperature scene into the first neural network to obtain the channel correction coefficient corresponding to each color temperature scene.
After the server obtains the channel correction coefficients corresponding to the color temperature scenes, the channel correction coefficients corresponding to the color temperature scenes can be stored in a matrix form to form a one-to-one correspondence relationship, so that subsequent searching is facilitated. The stored channel correction coefficients corresponding to the color temperature scenes are used for determining the channel correction coefficients of the current color temperature scene, so that the image generation efficiency is higher, and the real-time performance of the shooting equipment can be improved.
Or, the server may also acquire a channel correction coefficient corresponding to each color temperature scene determined by a professional through other conventional calibration methods, and the acquisition mode of the channel correction coefficient corresponding to each color temperature scene is not limited in the present application.
In this embodiment, the server obtains the color temperature value of the current color temperature scene, and then determines the target channel correction coefficient in the stored channel correction coefficients corresponding to the color temperature scenes according to the color temperature value, so as to quickly obtain the target channel correction coefficient of the current color temperature scene, thereby improving the image generation efficiency and accuracy of the shooting device.
On the basis of the first embodiment, another specific implementation manner for determining the target channel correction coefficient corresponding to the current color temperature scene is provided below through a third embodiment.
Referring to fig. 4, fig. 4 is a flowchart illustrating another image generation method provided in the third embodiment of the present application, where the method may be executed by an image generation apparatus, where the apparatus may be a server, and the following description takes the server as an example, and the method includes the following steps.
S401, obtaining a color card image in the current color temperature scene.
In order to obtain the target channel correction coefficient of the current color temperature scene, the server may obtain a color card image captured by the camera in the current color temperature scene.
S402, inputting the color card image of the current color temperature scene into a coefficient prediction model obtained in advance to obtain a target channel correction coefficient corresponding to the current color temperature scene.
The color card image is then input into the coefficient prediction model obtained by pre-training, and the target channel correction coefficient corresponding to the current color temperature scene is obtained by prediction.
The coefficient prediction model is obtained by training in a second neural network according to training data, and specifically comprises the following steps:
the server acquires training data, wherein the training data comprises color card images in different color temperature scenes and channel correction coefficients corresponding to the different color temperature scenes. It can be understood that the channel correction coefficients corresponding to different color temperature scenes may be obtained by a conventional calibration method or the first neural network in the second embodiment, and the like, which is not limited in the present application.
The training data are then input into a second neural network for training to obtain the coefficient prediction model.
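A rough sketch, assuming PyTorch, of the kind of small regression network the coefficient prediction model could be: it maps a color card image to the 16 entries of the 4 × 4 correction matrix. The layer sizes, loss and optimizer are illustrative assumptions, not details taken from the patent.

```python
import torch
import torch.nn as nn

class CoefficientPredictor(nn.Module):
    """Maps a 4-channel colour-card image to a 4x4 channel correction matrix."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(4, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(32, 16)   # 16 values -> 4x4 correction matrix

    def forward(self, x):               # x: (N, 4, H, W) colour-card image
        z = self.features(x).flatten(1)
        return self.head(z).view(-1, 4, 4)

# Training sketch: colour-card images paired with calibrated coefficients.
model = CoefficientPredictor()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()
train_loader = []                       # placeholder for the real DataLoader
for images, ccms in train_loader:
    optimizer.zero_grad()
    loss = loss_fn(model(images), ccms)
    loss.backward()
    optimizer.step()
```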
It can be understood that, compared with the first neural network in the above embodiment, the second neural network has fewer model parameters, so that the coefficient prediction model occupies a smaller storage space, and is beneficial to being embedded into a device for practical application, such as a camera or an unmanned vehicle, and the accuracy of the determined target channel correction coefficient is improved while the real-time performance is satisfied.
In this embodiment, the server obtains the target channel correction coefficient corresponding to the current color temperature scene by acquiring a color card image in the current color temperature scene and inputting it into the pre-obtained coefficient prediction model. Compared with the acquisition method of the second embodiment, the method of this embodiment effectively avoids the deviation of the determined target channel correction coefficient that occurs when the judged color temperature differs from the actual color temperature, so the target channel correction coefficient acquired for the current color temperature scene is more accurate.
On the basis of the first embodiment, a specific implementation manner of interpolation processing on the corrected original image in the first embodiment is further provided through a fourth embodiment.
Referring to fig. 5, fig. 5 is a flowchart illustrating another image generation method provided in the fourth embodiment of the present application, where the method may be executed by an image generation apparatus, where the apparatus may be a server, and the following description takes the server as an example, and the method includes the following steps.
S501, carrying out interpolation calculation on the target color channel numerical value in the corrected image to obtain an interpolation result of the target color channel numerical value, wherein the target color channel is the color channel with the largest number.
For example, as shown in fig. 6, the color channel with the largest number of samples in fig. 6 is the G channel, so the G channel values are interpolated first. Taking the target pixel point as an example, its G channel value is calculated as follows:
First, the differences between the B channel value of the target pixel point and the G channel values of its four adjacent pixel points are calculated; denote these differences a, b, c and d, and take the reciprocal of each difference.
Each reciprocal is then multiplied by the corresponding G channel value, the products are summed, and the sum is normalized by the sum of the reciprocals:
G = (G1/a + G2/b + G3/c + G4/d) / (1/a + 1/b + 1/c + 1/d)
This value is the G channel value of the target pixel point, where G1, G2, G3 and G4 are the G channel values of the four adjacent pixel points above, below, to the left and to the right, respectively.
Through the calculation, the G channel numerical value of each pixel point in the original image can be obtained.
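The following sketch applies the reciprocal-of-difference weighting described above at a single non-G pixel; the epsilon guard against a zero difference and the boundary handling are implementation assumptions.

```python
import numpy as np

def interpolate_g_at(corrected, chan_map, y, x, eps=1e-6):
    """G interpolation at a non-G pixel (step S501): weight each of the four
    neighbouring G values by the reciprocal of its difference to the centre
    pixel and take the weighted average."""
    h, w = corrected.shape
    neighbours = [(y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)]  # up, down, left, right
    g_vals, weights = [], []
    for ny, nx in neighbours:
        if 0 <= ny < h and 0 <= nx < w and chan_map[ny, nx] == "G":
            g = float(corrected[ny, nx])
            diff = abs(float(corrected[y, x]) - g) + eps   # the a, b, c, d of the text
            g_vals.append(g)
            weights.append(1.0 / diff)                     # reciprocal of the difference
    # Sum of (reciprocal x G) normalised by the sum of the reciprocals.
    return float(np.dot(weights, g_vals) / np.sum(weights))
```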
S502, calculating the color difference value of the interpolation result of the numerical values of the other color channels of each pixel point and the target color channel to obtain a color difference gray scale map corresponding to each other color channel.
Then, the color difference values between the other color channel values of each pixel point and the interpolation result of the target color channel value are calculated. It can be understood that the other color channels are the color channels in the original image other than the target color channel; for example, since the G channel value is calculated in step S501, the other color channels are the R, B and IR channels.
For the R, B and IR channels, the color difference values between the R, B or IR channel value and the G channel value of each pixel point are calculated respectively to obtain the color difference gray-scale maps corresponding to the R, B and IR channels. Taking fig. 6 as an example, the color difference gray-scale map corresponding to the IR channel is shown in fig. 7, in which the gray cells indicate pixels formed by color difference values.
S503, carrying out interpolation calculation on blank pixel points in the color difference gray-scale image to obtain the color difference gray-scale image corresponding to all the pixel points.
Because blank pixel points exist in the color difference gray-scale map, the server may interpolate each blank pixel point from the color difference values of its adjacent pixel points; specifically, the simple interpolation operation in step S102 of the first embodiment may be used, and details are not repeated here.
S504, according to the color difference gray-scale map corresponding to all the pixel points and the interpolation result of the target color channel numerical value, generating numerical values corresponding to other color channels of all the pixel points.
For each pixel point, the server adds the value in the color difference gray-scale map to the corresponding interpolation result of the target color channel value to obtain the values corresponding to the R, B and IR channels.
According to the R, G, B and IR channel values of each pixel point, images such as an RGB image, an IR image, a RAW image and a YUV image can be generated.
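For illustration, the sketch below runs steps S502-S504 for one of the other channels (R, B or IR), given the corrected mosaic, the channel map and a full G plane produced by applying the interpolation above at every pixel; the window size used to fill blank pixels is an assumption.

```python
import numpy as np

def reconstruct_channel(corrected, chan_map, g_full, channel):
    """Steps S502-S504 for one channel: form the colour-difference values
    channel - G where that channel is sampled, fill blank pixels with a
    neighbourhood mean, then add the G plane back to recover the channel."""
    h, w = corrected.shape
    diff = np.full((h, w), np.nan, dtype=np.float32)
    mask = chan_map == channel
    diff[mask] = corrected[mask] - g_full[mask]       # colour-difference grey map (S502)
    for y in range(h):                                 # fill blank pixels (S503)
        for x in range(w):
            if np.isnan(diff[y, x]):
                y0, y1 = max(0, y - 2), min(h, y + 3)
                x0, x1 = max(0, x - 2), min(w, x + 3)
                window = diff[y0:y1, x0:x1]
                if not np.all(np.isnan(window)):
                    diff[y, x] = np.nanmean(window)
    return diff + g_full                               # add G back (S504)
```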
In this embodiment, the server performs interpolation calculation on the target color channel value in the corrected image to obtain an interpolation result of the target color channel value. And then, calculating the color difference value of the interpolation result of the numerical values of the other color channels of each pixel point and the target color channel to obtain a color difference gray scale map corresponding to each other color channel. And obtaining values corresponding to other color channels of all the pixel points according to the interpolation results of the color difference gray-scale images corresponding to all the pixel points and the values of the target color channels so as to generate a target image corresponding to the original image, so that the texture and the color of the target image are more accurate.
The present application does not limit the interpolation processing method for the corrected image; the values of the other color channels may also be calculated by other interpolation methods.
Referring to fig. 8, fig. 8 is a schematic structural diagram of an image generating apparatus according to a fifth embodiment of the present application. The image generation apparatus 80 includes: a first acquisition module 801, a supplement module 802, a second acquisition module 803, a correction module 804, an interpolation module 805 and a generation module 806.
The first obtaining module 801 is configured to obtain an original image in a current color temperature scene, where each pixel in the original image is represented by an original color channel value.
The supplement module 802 is configured to perform resolution supplement on the original image to obtain each color channel value of each pixel point in the original image.
A second obtaining module 803, configured to obtain a target channel correction coefficient corresponding to the current color temperature scene.
And the correcting module 804 is configured to correct, for each pixel point, the original color channel value of the pixel point according to the target channel correction coefficient and each color channel value of the pixel point, so as to obtain a corrected image.
The interpolation module 805 is configured to perform interpolation processing on the original color channel value of each pixel point in the corrected image, and generate each color channel value of each pixel point.
The generating module 806 is configured to generate a target image corresponding to the original image according to each color channel value of each pixel point.
Optionally, the image generating device 80 further includes: a first processing module.
The first processing module is specifically configured to:
and acquiring image data in each color temperature scene, wherein the image data comprises color card images corresponding to each color channel and standard color card images corresponding to each color channel.
And respectively inputting the image data under each color temperature scene into the first neural network to obtain the channel correction coefficient corresponding to each color temperature scene.
And storing the channel correction coefficients corresponding to the color temperature scenes.
Optionally, the second obtaining module 803 is specifically configured to:
and acquiring the color temperature value of the current color temperature scene.
And determining a target channel correction coefficient in the stored channel correction coefficients corresponding to the color temperature scenes according to the color temperature values.
Optionally, the second obtaining module 803 is further configured to:
and acquiring a color card image in the current color temperature scene.
And inputting the color card image of the current color temperature scene into a pre-obtained coefficient prediction model to obtain a target channel correction coefficient corresponding to the current color temperature scene.
Optionally, the image generation apparatus 80 further comprises a second processing module.
The second processing module is specifically configured to:
and acquiring training data, wherein the training data comprises color card images in different color temperature scenes and corresponding channel correction coefficients.
And inputting the training data into a second neural network for training to obtain a coefficient prediction model.
Optionally, the correction module 804 is further configured to:
and performing black level removal and/or dead pixel correction on the original image after resolution supplement.
Optionally, the interpolation module 805 is specifically configured to:
and carrying out interpolation calculation on the target color channel numerical value in the corrected image to obtain an interpolation result of the target color channel numerical value, wherein the target color channel is the color channel with the largest quantity.
And calculating the color difference value of the interpolation result of the other color channel numerical values of each pixel point and the target color channel numerical value to obtain a color difference gray scale image corresponding to each other color channel.
And carrying out interpolation calculation on blank pixel points in the color difference gray image to obtain the color difference gray image corresponding to all the pixel points.
And generating values corresponding to other color channels of all the pixel points according to the color difference gray-scale images corresponding to all the pixel points and the interpolation result of the target color channel value.
Optionally, the interpolation module 805 is further configured to:
the black level adding process is performed on the corrected image.
The apparatus of this embodiment may be configured to execute the steps of the image generation method in the first to fourth embodiments, and specific implementation and technical effects are similar and will not be described herein again.
Fig. 9 is a schematic structural diagram of an electronic device according to a sixth embodiment of the present invention, and as shown in fig. 9, the electronic device may include: at least one processor 901 and memory 902.
And a memory 902 for storing programs. In particular, the program may include program code comprising computer operating instructions.
The memory 902 may comprise high-speed RAM and may also include non-volatile memory, such as at least one magnetic disk storage.
The processor 901 is configured to execute the computer-executable instructions stored in the memory 902 to implement the methods described in the foregoing method embodiments. The processor 901 may be a Central Processing Unit (CPU), an Application Specific Integrated Circuit (ASIC), or one or more integrated circuits configured to implement the embodiments of the present application.
Optionally, the device may also comprise a communication interface 903. In a specific implementation, if the communication interface 903, the memory 902 and the processor 901 are implemented independently, they may be connected to one another through a bus and communicate with one another. The bus may be an Industry Standard Architecture (ISA) bus, a Peripheral Component Interconnect (PCI) bus, an Extended ISA (EISA) bus, or the like. The bus may be divided into an address bus, a data bus, a control bus, and so on, but this does not mean that there is only one bus or one type of bus.
Optionally, in a specific implementation, if the communication interface 903, the memory 902, and the processor 901 are integrated into a chip, the communication interface 903, the memory 902, and the processor 901 may complete communication through an internal interface.
Alternatively, the electronic device may be a component or a product integrated with an Image Signal Processing (ISP) chip in fields such as smart home, transportation, monitoring and industrial automation. For example, the electronic device may be an intelligent transportation device (such as an autonomous automobile), a security device (such as a monitor), or various communication devices (such as a mobile phone or a tablet computer).
The electronic device of this embodiment may be configured to execute the steps of any one of the image generation methods in the first to fourth embodiments, and specific implementation manners and technical effects are similar and will not be described herein again.
An embodiment of the present application provides a computer-readable storage medium, which may include: a USB flash disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, an optical disk or any other medium capable of storing a computer program. Specifically, a computer program is stored in the computer-readable storage medium, and when the computer program is executed by a processor, it is used to implement the steps of any one of the image generation methods in the first to fourth embodiments.
An eighth embodiment of the present invention provides a computer program product, which includes a computer program, and when the computer program is executed by a processor, the steps of the image generation method according to any one of the first to fourth embodiments are implemented, and the specific implementation manner and the technical effect are similar, and are not described herein again.
Other embodiments of the present application will be apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein. This application is intended to cover any variations, uses, or adaptations of the invention following, in general, the principles of the application and including such departures from the present disclosure as come within known or customary practice within the art to which the invention pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the application being indicated by the following claims.
It will be understood that the present application is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the application is limited only by the appended claims.

Claims (11)

1. An image generation method, characterized in that the method comprises:
acquiring an original image in a current color temperature scene, wherein each pixel point in the original image is represented by an original color channel numerical value;
performing resolution supplement on the original image to obtain each color channel numerical value of each pixel point in the original image;
acquiring a target channel correction coefficient corresponding to the current color temperature scene;
for each pixel point, correcting the original color channel value of the pixel point according to the target channel correction coefficient and each color channel value of the pixel point to obtain a corrected image;
carrying out interpolation processing on the original color channel numerical value of each pixel point in the corrected image to generate each color channel numerical value of each pixel point;
and generating a target image corresponding to the original image according to each color channel numerical value of each pixel point.
2. The method of claim 1, wherein prior to acquiring the original image in the current color temperature scene, the method further comprises:
acquiring image data in each color temperature scene, wherein the image data comprises color card images corresponding to each color channel and standard color card images corresponding to each color channel;
respectively inputting the image data under each color temperature scene into a first neural network to obtain a channel correction coefficient corresponding to each color temperature scene;
and storing the channel correction coefficients corresponding to the color temperature scenes.
3. The method of claim 2, wherein the obtaining the target channel correction coefficient corresponding to the current color temperature scene comprises:
acquiring a color temperature value of the current color temperature scene;
and determining the target channel correction coefficient in the stored channel correction coefficients corresponding to the color temperature scenes according to the color temperature values.
4. The method of claim 1, wherein the obtaining of the target channel correction coefficient corresponding to the current color temperature scene comprises:
acquiring a color card image in the current color temperature scene;
and inputting the color card image of the current color temperature scene into a pre-obtained coefficient prediction model to obtain a target channel correction coefficient corresponding to the current color temperature scene.
5. The method of claim 4, wherein prior to obtaining the original image of the current color temperature scene, the method further comprises:
acquiring training data, wherein the training data comprises color card images in scenes with different color temperatures and corresponding channel correction coefficients;
and inputting the training data into a second neural network for training to obtain the coefficient prediction model.
6. The method according to any one of claims 1 to 5, wherein before correcting, for each pixel point, the original color channel value of the pixel point according to the target channel correction coefficient and each color channel value of the pixel point, the method further comprises:
and performing black level removal and/or dead pixel correction on the original image after resolution supplement.
7. The method of claim 6, wherein before the interpolating the original color channel values for each pixel in the corrected image, the method further comprises:
and performing black level addition processing on the corrected image.
8. The method according to claim 7, wherein the interpolating the original color channel values of the pixels in the corrected image to generate the color channel values of the pixels comprises:
performing interpolation calculation on the target color channel numerical value in the corrected image to obtain an interpolation result of the target color channel numerical value, wherein the target color channel is the color channel with the largest quantity;
calculating the color difference value of the interpolation result of the other color channel numerical values of each pixel point and the target color channel numerical value to obtain a color difference gray scale image corresponding to each other color channel;
carrying out interpolation calculation on blank pixel points in the color difference gray-scale image to obtain a color difference gray-scale image corresponding to all the pixel points;
and generating numerical values corresponding to other color channels of all the pixel points according to the color difference gray-scale images corresponding to all the pixel points and the interpolation result of the numerical value of the target color channel.
9. An image generation apparatus, characterized in that the apparatus comprises:
the first acquisition module is used for acquiring an original image under a current color temperature scene, wherein each pixel point in the original image is represented by an original color channel numerical value;
the supplement module is used for supplementing the resolution of the original image to obtain each color channel numerical value of each pixel point in the original image;
the second acquisition module is used for acquiring a target channel correction coefficient corresponding to the current color temperature scene;
the correction module is used for correcting the original color channel numerical value of each pixel point according to the target channel correction coefficient and each color channel numerical value of the pixel point to obtain a corrected image;
the interpolation module is used for carrying out interpolation processing on the original color channel numerical value of each pixel point in the corrected image to generate each color channel numerical value of each pixel point;
and the generating module is used for generating a target image corresponding to the original image according to each color channel numerical value of each pixel point.
10. An electronic device, comprising: a processor, and a memory communicatively coupled to the processor;
the memory stores computer-executable instructions;
the processor executes computer-executable instructions stored by the memory to implement the image generation method of any of claims 1-8.
11. A computer-readable storage medium having computer-executable instructions stored therein, which when executed by a processor, are configured to implement the image generation method of any one of claims 1-8.
CN202210737275.8A 2022-06-27 2022-06-27 Image generation method, image generation device, electronic equipment and storage medium Pending CN115103168A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210737275.8A CN115103168A (en) 2022-06-27 2022-06-27 Image generation method, image generation device, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210737275.8A CN115103168A (en) 2022-06-27 2022-06-27 Image generation method, image generation device, electronic equipment and storage medium

Publications (1)

Publication Number Publication Date
CN115103168A true CN115103168A (en) 2022-09-23

Family

ID=83294936

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210737275.8A Pending CN115103168A (en) 2022-06-27 2022-06-27 Image generation method, image generation device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN115103168A (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103686111A (en) * 2013-12-31 2014-03-26 上海富瀚微电子有限公司 Method and device for correcting color based on RGBIR (red, green and blue, infra red) image sensor
CN111953953A (en) * 2019-05-17 2020-11-17 北京地平线机器人技术研发有限公司 Method and device for adjusting pixel brightness and electronic equipment
CN112243117A (en) * 2019-07-17 2021-01-19 杭州海康威视数字技术股份有限公司 Image processing apparatus, method and camera
WO2021114184A1 (en) * 2019-12-12 2021-06-17 华为技术有限公司 Neural network model training method and image processing method, and apparatuses therefor
CN113781326A (en) * 2021-08-11 2021-12-10 北京旷视科技有限公司 Demosaicing method and device, electronic equipment and storage medium

Similar Documents

Publication Publication Date Title
US20150278996A1 (en) Image processing apparatus, method, and medium for generating color image data
CN111837155A (en) Image processing method and apparatus
US20120257088A1 (en) Image processing apparatus and method thereof
CN111163301B (en) Color adjustment method, device and computer readable storage medium
CN110909750B (en) Image difference detection method and device, storage medium and terminal
CN111510692B (en) Image processing method, terminal and computer readable storage medium
CN108230407B (en) Image processing method and device
CN111383254A (en) Depth information acquisition method and system and terminal equipment
CN114331916A (en) Image processing method and electronic device
CN103079077A (en) Image processing method
CN105049820B (en) IMAGE PROCESSING APPARATUS, IMAGING APPARATUS, and IMAGE PROCESSING METHOD
CN115103168A (en) Image generation method, image generation device, electronic equipment and storage medium
JP2002223452A (en) Image interpolation device
JP5898428B2 (en) Image processing apparatus and control method thereof
JP5818568B2 (en) Image processing apparatus and control method thereof
CN114500850A (en) Image processing method, device and system and readable storage medium
CN103313066A (en) Interpolation method and device
US9041815B2 (en) Digital camera imaging evaluation module
JP2001197321A (en) Color picture processing method and picture processor
JP2001078235A (en) Method and system for image evaluation
CN112907653B (en) Image processing method, image pickup device and storage medium
CN110148089B (en) Image processing method, device and equipment and computer storage medium
CN112055191B (en) White balance adjustment method, image acquisition device and storage medium
JP2013005363A (en) Image pickup apparatus
CN114025144A (en) White balance gain adjustment method, electronic device, and computer-readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination