CN117278862A - Image processing method, device, equipment and medium - Google Patents


Info

Publication number
CN117278862A
Authority
CN
China
Prior art keywords
image
pixel
resolution
yuv
pixel values
Prior art date
Legal status
Pending
Application number
CN202311098984.7A
Other languages
Chinese (zh)
Inventor
卢二利
况璐
卢劲
Current Assignee
Zhejiang Dahua Technology Co Ltd
Original Assignee
Zhejiang Dahua Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Zhejiang Dahua Technology Co Ltd
Priority to CN202311098984.7A
Publication of CN117278862A


Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70 Circuitry for compensating brightness variation in the scene
    • H04N23/76 Circuitry for compensating brightness variation in the scene by influencing the image signals
    • H04N23/80 Camera processing pipelines; Components thereof
    • H04N23/84 Camera processing pipelines; Components thereof for processing colour signals

Abstract

The application discloses an image processing method, apparatus, device, and medium for increasing image brightness while avoiding frame loss. When a camera sensor operates in high dynamic range imaging (HDR) mode, the camera sensor outputs a first image and a second image; when the current ambient brightness is less than or equal to a set threshold, the image signal processor (ISP) accumulates the pixel values at the same pixel positions of the first image and the second image to obtain a target image. Accumulating the pixel values increases the image brightness while avoiding frame loss.

Description

Image processing method, device, equipment and medium
Technical Field
The present disclosure relates to the field of video surveillance technologies, and in particular to an image processing method, apparatus, device, and medium.
Background
As shown in fig. 1, a Bayer format image, a Quadra Bayer format image, and an RGB three-channel image are illustrated in order from left to right. The Bayer format image is an image in which red pixels R, green pixels G, and blue pixels B are arranged in a Bayer array. The Quadra Bayer format image treats each 2x2 block of 4 same-color pixels as one large pixel of that color, with the red, green, and blue large pixels themselves arranged in a Bayer array. For example, in the Quadra Bayer image illustrated in fig. 1, the 4 red pixels R in the upper-left 2x2 block form one red large pixel. The RGB three-channel image is a color image generated from a red channel image, a green channel image, and a blue channel image; each channel image determines the corresponding color component of the RGB three-channel image.
As shown in fig. 2, the original raw image generated by the sensor is a Quadra Bayer format image. The sensor can output images in two modes. One is the binning output mode, in which each large pixel of the Quadra Bayer image is combined into a single binned pixel of the same color; the binned pixel occupies one quarter of the area of the Quadra large pixel, so the resolution of the binning output is one quarter of that of the original raw image. Part of the image information is lost, and this mode is usually used for displaying thumbnails. The other is the Fullsize output mode, also called the remosaic output mode, in which the resolution of the output image is identical to that of the original raw image; enabling the remosaic register configuration makes the sensor remosaic the original raw image into a raw image in Bayer format.
As shown in fig. 3, when illumination is sufficient, the sensor uses the Fullsize output mode and outputs the left image with 4x4 resolution; when illumination is insufficient, the sensor uses the binning output mode and outputs the right image with 2x2 resolution. In a low-illumination environment, image brightness can therefore be raised by reducing the image resolution through the binning output mode.
When the sensor's binning output mode is switched from off to on, the sensor registers must be re-initialized. This process may drop frames, making the video monitoring picture appear uneven.
Disclosure of Invention
The application provides an image processing method, an image processing device, image processing equipment and a medium, which are used for improving the brightness of an image and avoiding frame loss.
In a first aspect, there is provided an image processing method including:
when a camera sensor works in a high dynamic range imaging HDR mode, the camera sensor outputs a first image and a second image;
the pixel value of each pixel point of the first image is obtained by accumulating the pixel values of the 2 same-color pixel points located on a first diagonal among 4 adjacent same-color pixel points in the Quadra Bayer format;
the pixel value of each pixel point of the second image is obtained by accumulating the pixel values of the 2 same-color pixel points located on a second diagonal among 4 adjacent same-color pixel points in the Quadra Bayer format;
when the current ambient brightness is less than or equal to the set threshold value, the image signal processor ISP accumulates the pixel values at the same pixel point positions of the first image and the second image to obtain a target image.
In one possible implementation, accumulating the pixel values at the same pixel point positions of the first image and the second image to obtain a target image includes:
accumulating the pixel values at the same pixel point positions of the first image and the second image, keeping accumulated values less than or equal to a preset value unchanged, and updating accumulated values greater than the preset value to the preset value, to obtain the target image.
In one possible implementation, before the camera sensor outputs the first image and the second image, the method further includes:
setting the exposure time and/or gain of the camera sensor for acquiring the first image and the second image to be the same.
In one possible implementation, after obtaining the target image, the method further includes:
the image signal processor ISP performs color space conversion processing on the target image to obtain a first YUV image, and outputs the first YUV image.
In a second aspect, there is provided an image processing method including:
when the camera sensor operates in a linear mode, outputting a third image; the third image is an image of a first resolution in a Quadra Bayer format;
when the current ambient brightness is determined to be less than or equal to a set threshold, the image signal processor ISP combines each group of 4 adjacent same-color pixels in the third image into one pixel, taking the sum of their pixel values as the pixel value of that pixel, to obtain a fourth image; the fourth image is an image of a second resolution in Bayer format, and the second resolution is 1/4 of the first resolution;
performing color space conversion processing on the fourth image to obtain a second YUV image; wherein the resolution of the second YUV image is the second resolution;
amplifying the second YUV image into a third YUV image; wherein the resolution of the third YUV image is the first resolution;
outputting the third YUV image.
In one possible implementation, the first resolution is the highest resolution supported by the camera sensor.
In one possible implementation, the method further includes:
when the current ambient brightness is determined to be greater than the set threshold, performing color space conversion processing on the third image to obtain a fourth YUV image; wherein the resolution of the fourth YUV image is the first resolution;
and outputting the fourth YUV image.
In a third aspect, there is provided an image processing apparatus comprising:
a camera sensor module, configured to determine that the camera sensor operates in a high dynamic range imaging HDR mode and to output a first image and a second image; the pixel value of each pixel point of the first image is obtained by accumulating the pixel values of the 2 same-color pixel points located on a first diagonal among 4 adjacent same-color pixel points in the Quadra Bayer format; the pixel value of each pixel point of the second image is obtained by accumulating the pixel values of the 2 same-color pixel points located on a second diagonal among 4 adjacent same-color pixel points in the Quadra Bayer format;
and the image signal processor ISP module is used for accumulating the pixel values at the same pixel point positions of the first image and the second image when the current ambient brightness is less than or equal to the set threshold value, so as to obtain the target image.
In one possible implementation, the image signal processor ISP module is specifically configured to accumulate the pixel values at the same pixel point positions of the first image and the second image, keep accumulated values less than or equal to a preset value unchanged, and update accumulated values greater than the preset value to the preset value, to obtain the target image.
In a possible implementation, the camera sensor module is further configured to set, before outputting the first image and the second image, the exposure times and/or gains with which the camera sensor acquires the first image and the second image to be the same.
In one possible implementation, the image signal processor ISP module is further configured to perform color space conversion processing on the target image to obtain a first YUV image, and output the first YUV image.
In a fourth aspect, there is provided an image processing apparatus including:
the camera sensor module is used for outputting a third image when the camera sensor works in a linear mode; the third image is an image with a first resolution in a Quadra Bayer format;
an image signal processor ISP module, configured to, when it is determined that the current ambient brightness is less than or equal to a set threshold, combine each group of 4 adjacent same-color pixels in the third image into one pixel, taking the sum of their pixel values as the pixel value of that pixel, to obtain a fourth image; the fourth image is an image of a second resolution in Bayer format, and the second resolution is 1/4 of the first resolution; perform color space conversion processing on the fourth image to obtain a second YUV image, the resolution of the second YUV image being the second resolution; enlarge the second YUV image into a third YUV image, the resolution of the third YUV image being the first resolution; and output the third YUV image.
In one possible implementation, the ISP module is further configured to: when the current ambient brightness is determined to be greater than the set threshold, perform color space conversion processing on the third image to obtain a fourth YUV image, the resolution of the fourth YUV image being the first resolution; and output the fourth YUV image.
In a fifth aspect, the present application provides an electronic device, including a processor and, optionally, a memory coupled to the processor. The memory is used to store a computer program or instructions; the processor is configured to execute part or all of the computer program or instructions in the memory, which, when executed, implement the functions of any of the methods described above.
In one possible implementation, the apparatus may further include a transceiver for transmitting the signal processed by the processor or receiving a signal input to the processor. The transceiver may perform the transmit or receive actions of any of the methods.
In a sixth aspect, a computer readable storage medium is provided for storing a computer program comprising instructions for implementing the functions of any of the methods described above.
Alternatively, the computer-readable storage medium stores a computer program which, when executed by a computer, causes the computer to perform any of the methods described above.
In a seventh aspect, there is provided a computer program product comprising: computer program code which, when run on a computer, causes the computer to perform the method of any of the above.
In the prior art, when the sensor operates in the HDR mode and the ambient brightness is low, the system controls the sensor to switch to the linear mode and enables the sensor's binning output mode. Switching the binning output mode on or off requires re-initializing the sensor registers; during video monitoring this initialization may cause frame loss, making the video picture stutter. In addition, changing the sensor output also requires resetting the VI parameters. Also in the prior art, during HDR fusion the 2 frames carry different weights at each co-located pixel, and the weights of the 2 frames sum to 1; that is, the fused target image is stitched together from the pixel values of the 2 frames. In this embodiment, when the ambient brightness is relatively low, the image brightness is improved by accumulating the pixel values of the 2 frames. The sensor registers do not need to be re-initialized and the VI parameters do not need to be reset, so frame loss is avoided.
In the prior art, when the sensor operates in the linear mode and the ambient brightness is low, the system enables the sensor's binning output mode, with the same register re-initialization, possible frame loss, and VI parameter reset described above. In the present embodiment, when the ambient brightness is relatively low, the ISP performs a sensor-binning-like operation instead, and the picture brightness of the resulting image is raised to 4 times that of the image output by the sensor.
Drawings
In order to more clearly illustrate the embodiments of the present application or the related art, the drawings required for the embodiments or the related art are briefly described below. It is apparent that the drawings in the following description show only some embodiments of the present application, and that those of ordinary skill in the art may derive other drawings from them.
FIG. 1 shows a schematic view of a Bayer format image, a Quadra Bayer format image, and an RGB three-channel image;
FIG. 2 shows a schematic diagram of a sensor output mode;
FIG. 3 shows a schematic diagram of a sensor output image;
FIG. 4 is a schematic flow chart of image processing when the sensor operates in HDR mode according to an embodiment of the present application;
FIG. 5 is a schematic flow chart of image processing when the sensor operates in linear mode according to an embodiment of the present application;
FIG. 6 is a schematic flow chart of image processing according to an embodiment of the present application;
fig. 7 shows a device configuration diagram of image processing according to an embodiment of the present application;
fig. 8 shows a block diagram of an electronic device according to an embodiment of the present application.
Detailed Description
For purposes of clarity, the following describes exemplary implementations of the present application with reference to the accompanying drawings; it is apparent that the described implementations are only some, not all, of the examples of the present application. The terms "first," "second," "third," and the like in the description, claims, and drawings are used to distinguish similar objects or entities and do not necessarily imply a particular order or sequence, unless otherwise indicated. It is to be understood that terms so used are interchangeable under appropriate circumstances.
It should be noted that the embodiments of the present application are only used to illustrate the technical solution of the present application, not to limit it. Although the present application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art will understand that the technical schemes described in the foregoing embodiments can still be modified, or some or all of their technical features can be replaced by equivalents; such modifications and substitutions do not cause the corresponding technical solutions to depart from the scope of the technical solutions of the embodiments of the present application.
An optical lens focuses light onto the sensor; the sensor converts the optical signal into an electrical signal; the ISP processes the signal obtained from the sensor to produce a viewable image.
The sensor operates in either a linear mode or a high dynamic range imaging (High Dynamic Range Imaging, HDR) mode, depending on the scene. In a general monitoring scenario, the sensor usually works in the linear mode; in a wide-dynamic-range monitoring scenario, the sensor typically operates in the HDR mode.
When the sensor operates in the linear mode, it may or may not enable the binning output mode. The present application requires that, if the sensor operates in the linear mode, the sensor enables neither the binning output mode nor the remosaic output mode, so that the sensor outputs a full-resolution raw image in the Quadra Bayer format, and the VI of the ISP collects the full-resolution Quadra Bayer raw image output by the sensor.
If the sensor operates in the HDR mode, it outputs 2 raw frames at 1/4 full resolution, a first image and a second image, which may be called the long video frame and the short video frame; the VI of the ISP acquires these 1/4-full-resolution raw frames output by the sensor.
Here, full denotes the resolution in the preset linear mode, for example the highest resolution supported by the sensor; it may also be smaller than the highest resolution.
Fig. 4 shows a schematic flow chart of image processing of a sensor operating in an HDR mode according to an embodiment of the present application, where the process includes the following steps:
step 41: the camera sensor outputs a first image and a second image when operating in a high dynamic range imaging HDR mode.
The pixel value of each pixel point of the first image is obtained by accumulating the pixel values of 2 pixel points with the same color, which are positioned on the first diagonal line, in adjacent 4 pixel points with the same color under the Quadra Bayer format.
The pixel value of each pixel point of the second image is obtained by accumulating the pixel values of 2 pixel points with the same color, which are positioned on the second diagonal line, in adjacent 4 pixel points with the same color under the Quadra Bayer format.
For example, the position coordinates of adjacent 4 same-color pixel points are (1, 1), (1, 2), (2, 1) and (2, 2), the position coordinates of 2 same-color pixel points on the first diagonal are (1, 1) and (2, 2), and the position coordinates of 2 same-color pixel points on the second diagonal are (2, 1) and (1, 2).
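The diagonal split described above can be sketched with numpy; this is an illustrative sketch (the patent provides no code), modeling the Quadra Bayer raw frame as a 2D array in which every aligned 2x2 block holds same-color pixels:

```python
import numpy as np

def diagonal_sum_frames(quadra: np.ndarray):
    """Split a Quadra Bayer raw frame into two quarter-size frames.

    Each 2x2 same-color block contributes its main diagonal,
    (1,1)+(2,2), to the first image and its anti-diagonal,
    (2,1)+(1,2), to the second image. uint32 avoids overflow
    during accumulation.
    """
    q = quadra.astype(np.uint32)
    first = q[0::2, 0::2] + q[1::2, 1::2]    # (1,1) + (2,2) of every block
    second = q[1::2, 0::2] + q[0::2, 1::2]   # (2,1) + (1,2) of every block
    return first, second
```

Both outputs have half the width and half the height of the input, matching the 1/4-full-resolution first and second images of the embodiment.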
Because the first image and the second image are obtained by summing the pixel values of the 2 same-color pixels on each diagonal, their exposures end at the same time, which reduces motion smear. If instead the long and short video frames were determined by summing all 4 adjacent same-color pixels, the long frame would be exposed first and the short frame afterwards, making motion smear obvious during fusion.
The first image and the second image have the same resolution, illustratively 1/4 full resolution. The full resolution is the resolution in the preset linear mode, for example the highest resolution supported by the camera sensor; it may also be smaller than the highest resolution.
Step 42: it is determined whether the current ambient brightness is less than or equal to a set threshold. If so, step 43 is performed.
Step 43: the image signal processor ISP adds up the pixel values at the same pixel point positions of the first image and the second image to obtain the target image.
Wherein the resolution of the target image is the same as the resolution of the first image and the second image.
Further optionally, the ISP performs conventional ISP processing on the target image to obtain a first YUV image, and outputs the first YUV image.
Wherein the resolution of the first YUV image is the same as the resolution of the target image.
Conventional ISP processing includes, but is not limited to, one or more of the following: black level compensation, lens shading correction, bad pixel correction, color interpolation (demosaicing), Bayer-domain noise removal, auto white balance (AWB) correction, color correction, gamma correction, color space conversion (RGB to YUV), color-noise removal and edge enhancement in the YUV color space, color and contrast enhancement, automatic exposure control, image compression, and the like.
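One stage of this pipeline, the RGB-to-YUV color space conversion, can be sketched as follows. The BT.601 coefficients are an assumption; the patent names the stage but does not specify the conversion matrix.

```python
import numpy as np

def rgb_to_yuv(rgb: np.ndarray) -> np.ndarray:
    """Convert an HxWx3 float RGB image (values in 0..1) to YUV.

    Coefficients are BT.601 (assumed; the patent only names the
    'RGB to YUV' stage).
    """
    m = np.array([
        [ 0.299,     0.587,     0.114   ],   # Y (luma)
        [-0.14713,  -0.28886,   0.436   ],   # U (blue-difference chroma)
        [ 0.61500,  -0.51499,  -0.10001 ],   # V (red-difference chroma)
    ])
    return rgb @ m.T
```

A neutral gray input yields Y equal to its intensity and near-zero chroma, which is a quick sanity check for any such matrix.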
In the prior art, when the sensor operates in the HDR mode and the ambient brightness is low, the system controls the sensor to switch to the linear mode and enables the sensor's binning output mode. Switching the binning output mode on or off requires re-initializing the sensor registers; during video monitoring this initialization may cause frame loss, making the video picture stutter. In addition, changing the sensor output also requires resetting the VI parameters.
Also in the prior art, during HDR fusion the 2 frames carry different weights at each co-located pixel, and the weights of the 2 frames sum to 1; that is, the fused target image is stitched together from the pixel values of the 2 frames.
In this embodiment, when the ambient brightness is relatively low, the image brightness is improved by accumulating the pixel values of the 2 frames. The sensor registers do not need to be re-initialized and the VI parameters do not need to be reset, so frame loss is avoided.
In one possible implementation, when the pixel values at the same pixel positions of the first image and the second image are accumulated to obtain the target image, specifically: the pixel values at the same pixel positions of the first image and the second image are accumulated; accumulated values less than or equal to a preset value are kept unchanged, and accumulated values greater than the preset value are updated to the preset value, giving the target image. The preset value may be 255 or another value. This prevents overexposure and improves image quality. Alternatively, accumulated values greater than the preset value may also be kept unchanged.
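The accumulate-and-clamp fusion described above can be sketched as follows (a minimal numpy sketch; the function name is illustrative):

```python
import numpy as np

def fuse_accumulate(first: np.ndarray, second: np.ndarray,
                    preset: int = 255) -> np.ndarray:
    """Fuse two same-size frames by per-pixel accumulation.

    Sums less than or equal to `preset` are kept unchanged; sums
    greater than `preset` are updated to `preset` (255 here for
    8-bit data, as in the embodiment).
    """
    total = first.astype(np.uint32) + second.astype(np.uint32)  # overflow-safe sum
    return np.minimum(total, preset).astype(first.dtype)
```

The clamp is what prevents the doubled exposure from blowing out highlights while dark regions gain a full stop of brightness.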
In one possible example, before the camera sensor outputs the first image and the second image, the camera sensor is set so that the exposure times and/or gains for acquiring the first image and the second image are the same.
After determining that the current ambient brightness is less than or equal to the set threshold, the camera sensor may be reset so that the exposure times and/or gains for acquiring the first image and the second image are the same; for example, the long and short exposure times are each 3 ms, 5 ms, or 10 ms. The first and second images are then re-acquired by the camera sensor based on the reset exposure time and/or gain, and the pixel values are accumulated based on the re-acquired first and second images. Resetting parameters such as exposure time and gain requires neither initializing the sensor registers nor resetting the VI parameters. Setting the exposure times and/or gains to be the same further improves the image quality of the target image.
Fig. 5 shows a schematic flow chart of image processing in a linear mode of sensor operation according to an embodiment of the present application, where the process includes the following steps:
step 51: the camera sensor works in a linear mode and outputs a third image; the third image is an image with a first resolution in a Quadra Bayer format.
The first resolution is, for example, a resolution in a preset linear mode, i.e., the full resolution described above.
Step 52: and calculating the current ambient brightness, and determining whether the current ambient brightness is less than or equal to a set threshold.
If so, step 53 is performed.
The order of steps 51 and 52 is not limited.
Step 53: ISP combines the adjacent 4 pixels with the same color in the third image into one pixel, adds the pixel values of the adjacent 4 pixels with the same color as the pixel value of the one pixel, and obtains a fourth image; the fourth image is an image with a second resolution of a Bayer format, and the second resolution is 1/4 of the first resolution.
The brightness of the fourth image is 4 times that of the third image. The principle of this process is similar to the sensor's binning output mode introduced in fig. 2; step 53 can be seen as the ISP performing a sensor-binning-like process.
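The ISP-side binning of step 53 can be sketched as follows (an illustrative numpy sketch, again modeling the Quadra Bayer frame as a 2D array of aligned same-color 2x2 blocks):

```python
import numpy as np

def isp_binning(quadra: np.ndarray) -> np.ndarray:
    """Sum each 2x2 same-color block of a Quadra Bayer frame into one
    Bayer pixel, quartering the resolution and roughly quadrupling
    per-pixel brightness."""
    q = quadra.astype(np.uint32)  # widen before summing to avoid overflow
    return q[0::2, 0::2] + q[0::2, 1::2] + q[1::2, 0::2] + q[1::2, 1::2]
```

Because all four pixel values are added rather than averaged, a uniformly lit block produces a binned pixel 4 times as bright, which is the brightness gain the embodiment relies on.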
Step 54: ISP carries out traditional ISP processing on the fourth image to obtain a second YUV image; wherein the resolution of the second YUV image is the second resolution.
Conventional ISP processing includes, but is not limited to, one or more of the following: black level compensation, lens shading correction, bad pixel correction, color interpolation (demosaicing), Bayer-domain noise removal, auto white balance (AWB) correction, color correction, gamma correction, color space conversion (RGB to YUV), color-noise removal and edge enhancement in the YUV color space, color and contrast enhancement, automatic exposure control, image compression, and the like.
Step 55: ISP amplifies the second YUV image into a third YUV image; wherein the resolution of the third YUV image is the first resolution.
For example, the enlargement is performed by interpolation, such as bilinear interpolation. The resolution of the third YUV image and the resolution of the raw image output by the sensor are both the first resolution, full, and the picture brightness of the third YUV image is 4 times that of the Quadra Bayer raw image output by the sensor. That is, the image brightness is increased without reducing the resolution.
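A minimal bilinear 2x upscale of one image plane, as an illustrative sketch of step 55 (applied to the Y, U, and V planes in turn; edge pixels are replicated; function name is assumed, not from the patent):

```python
import numpy as np

def bilinear_upscale_2x(plane: np.ndarray) -> np.ndarray:
    """Upscale a single 2D plane 2x per axis by bilinear interpolation,
    mapping each output sample back to source coordinates."""
    h, w = plane.shape
    # Output pixel centers expressed in source coordinates.
    ys = (np.arange(2 * h) + 0.5) / 2 - 0.5
    xs = (np.arange(2 * w) + 0.5) / 2 - 0.5
    y0 = np.clip(np.floor(ys).astype(int), 0, h - 1)
    x0 = np.clip(np.floor(xs).astype(int), 0, w - 1)
    y1 = np.minimum(y0 + 1, h - 1)
    x1 = np.minimum(x0 + 1, w - 1)
    wy = np.clip(ys - y0, 0.0, 1.0)[:, None]   # vertical blend weights
    wx = np.clip(xs - x0, 0.0, 1.0)[None, :]   # horizontal blend weights
    p = plane.astype(np.float64)
    top = p[y0][:, x0] * (1 - wx) + p[y0][:, x1] * wx
    bottom = p[y1][:, x0] * (1 - wx) + p[y1][:, x1] * wx
    return top * (1 - wy) + bottom * wy
```

Running the 1/4-full-resolution second YUV image through this per plane yields a full-resolution third YUV image, trading interpolation softness for the 4x brightness gained by binning.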
Step 56: the ISP outputs the third YUV image.
In the prior art, when the sensor operates in the linear mode and the ambient brightness is low, the system enables the sensor's binning output mode. Switching the binning output mode on or off requires re-initializing the sensor registers; during video monitoring this initialization may cause frame loss, making the video picture stutter. In addition, changing the sensor output also requires resetting the VI parameters. In this embodiment, when the ambient brightness is low, the ISP performs a sensor-binning-like operation instead, so the sensor registers do not need to be re-initialized and the VI parameters do not need to be reset, avoiding frame loss.
When the current ambient brightness is determined to be greater than the set threshold, the application does not limit the subsequent process. In one possible implementation, the following steps are performed:
step 57: ISP carries out color space conversion processing on the third image to obtain a fourth YUV image; wherein the resolution of the fourth YUV image is the first resolution.
Step 58: and outputting the fourth YUV image.
In connection with the above-described examples of fig. 4 and 5, taking the first resolution as full and the second resolution as 1/4full as an example, fig. 6 shows a schematic flow chart of image processing provided in an embodiment of the present application, where the process includes the following steps:
first, each device is initially started.
Individual devices include, but are not limited to: the camera sensor, the video interface (VI), and the image signal processor (Image Signal Processing, ISP).
During initial start-up, various parameters are set, for example sensor parameters (gain, exposure time, etc.) and ISP parameters (HDR fusion weights).
Then, the current ambient brightness Luma is calculated, and whether the image brightness needs to be improved is judged from it. From this point on, the procedure runs once per frame; a typical video stream is 25 frames/second.
Next, the current operation mode sensor_mode of the sensor is acquired.
The current mode of operation of the sensor is either linear or HDR. Illustratively, it is determined whether the current operating mode of the sensor is an HDR mode, if so, the sensor operates in an HDR mode, and if not, the sensor operates in a linear mode.
If the current ambient brightness is less than or equal to the set threshold, the image brightness needs to be increased. At this time, according to the current working mode of the sensor, a corresponding method for improving the brightness of the image is selected.
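The per-frame decision flow of fig. 6 can be sketched as a small dispatch function (all names here are illustrative, not from the patent; the three path callables stand in for the processing chains described below):

```python
def process_frame(luma: float, threshold: float, sensor_mode: str,
                  linear_path, hdr_path, normal_path):
    """Per-frame dispatch sketched from fig. 6. Brightness is boosted
    only when ambient luma is at or below the threshold; the boost
    method is chosen by the sensor's current operating mode."""
    if luma > threshold:
        return normal_path()    # sufficient light: normal processing
    if sensor_mode == "HDR":
        return hdr_path()       # accumulate the two diagonal-summed frames
    return linear_path()        # linear mode: ISP-side binning + upscale
```

This mirrors the text: mode is read per frame, and no branch ever toggles the sensor's own binning mode, which is what avoids the register re-initialization.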
For example, when the sensor operates in the linear mode, the process of increasing the brightness of the image is as follows (refer to steps 53 to 56 in fig. 5):
the ISP performs sensor-like binning processing on the third image of full resolution in the Quadra Bayer format to obtain a fourth image of 1/4 full resolution in the Bayer format;
the ISP performs conventional ISP processing on the fourth image;
the ISP upscales the processed image from 1/4 full resolution to full resolution;
displaying the full resolution image.
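The linear-mode branch above can be sketched in numpy. In the Quadra Bayer layout, each 2x2 block holds 4 adjacent pixels of the same color, so binning sums each block into one Bayer pixel (full → 1/4 full), and a final upscale restores the original resolution. The 12-bit clip value and the nearest-neighbour upscale are assumptions; the patent does not fix either.

```python
import numpy as np


def quadra_binning(quadra, clip_max=4095):
    """Sensor-like binning: sum every 2x2 same-colour block of a Quadra Bayer
    frame into one Bayer pixel, halving width and height (full -> 1/4 full)."""
    h, w = quadra.shape
    blocks = quadra.reshape(h // 2, 2, w // 2, 2).astype(np.uint32)
    binned = blocks.sum(axis=(1, 3))  # sum the 4 same-colour pixels
    return np.clip(binned, 0, clip_max)


def upscale_2x(img):
    """Upscale the 1/4 full image back to full resolution
    (nearest-neighbour, used here only for illustration)."""
    return np.repeat(np.repeat(img, 2, axis=0), 2, axis=1)
```

Summing (rather than averaging) the 4 pixels is what raises the signal level, which is the point of the brightness boost in this branch.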
By way of example, when the sensor is operating in HDR mode, the process of increasing the brightness of the image is as follows (see part of fig. 4):
resetting the camera sensor so that the first image and the second image are acquired with the same exposure time and/or the same gain;
the HDR parameters are reset to update the HDR fusion mode to an accumulation of the pixel values of the 2 frames. Equivalently, the HDR fusion weight of each of the 2 frames is reset to 1. Specifically, the reset can be performed in the corresponding register.
The ISP acquires a first image and a second image acquired by the camera sensor based on the reset exposure time and/or gain; optionally, the resolution of the first image and the second image are each 1/4full resolution.
The ISP accumulates the pixel values of the same pixel point positions of the first image and the second image to obtain a target image; optionally, the resolution of the target image is 1/4full resolution.
And performing conventional ISP processing on the target image by the ISP to obtain a first YUV image. Optionally, the resolution of the first YUV image is 1/4full resolution;
the first YUV image is displayed.
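The HDR-mode accumulation above (fusion weight 1 for each of the 2 equally exposed frames) can be sketched as follows. The patent says accumulated values greater than a preset value are "updated" without fixing the rule; clipping to the preset, and the 12-bit preset itself, are assumptions in this sketch.

```python
import numpy as np

PRESET_MAX = 4095  # hypothetical preset value (12-bit saturation)


def accumulate_frames(first, second, preset=PRESET_MAX):
    """Pixel-wise accumulation of the first and second images at the same
    pixel positions; sums at or below the preset are kept unchanged, sums
    above it are updated (here: clipped back to the preset)."""
    total = first.astype(np.uint32) + second.astype(np.uint32)
    return np.minimum(total, preset)
```

Because both frames share the same exposure and gain, the sum roughly doubles the signal at each pixel, which is the brightness boost this branch provides.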
If the current ambient brightness is greater than or equal to the set threshold, the image brightness does not need to be increased. Normal image processing is performed according to the current working mode of the sensor (linear mode or HDR mode), and an image of the corresponding resolution is output.
For example, when the sensor is operating in the linear mode, the sensor outputs a third image of full resolution, the VI is operating in the full mode, the VI of the ISP receives the third image of full resolution output by the sensor, the ISP performs conventional ISP processing on the third image, and displays the image of full resolution.
For example, when the sensor is operating in the HDR mode, the sensor outputs a first image and a second image, the VI of the ISP is operating in the HDR mode, and the first image and the second image output by the sensor are received; and after the ISP performs conventional HDR fusion on the first image and the second image, performing conventional ISP processing, and displaying the image after the ISP processing.
When the sensor works in the linear mode and it is determined that the image brightness needs to be increased, the ISP may, in addition to the brightness-boosted image, also output the non-boosted image for display, or it may output only the boosted image. For example, when the brightness is less than or equal to the set threshold, steps 53 to 56 are performed and the third YUV image is output; optionally, steps 57 to 58 are also performed and the fourth YUV image is output.
When the sensor works in the HDR mode, in order to further optimize the brightness-boosting effect, the gain strategy may be constrained so that the sensor analog gain is used up first and then the ISP analog gain is applied, rather than following the sensor analog gain with the sensor digital gain.
Based on the same technical concept, the present application further provides an image processing apparatus, and fig. 7 shows a schematic structural diagram of the image processing apparatus, including: a camera sensor module 71 and an image signal processor ISP module 72.
In one example:
a camera sensor module 71 for determining that the camera sensor is operating in a high dynamic range imaging HDR mode, outputting a first image and a second image; the pixel value of each pixel point of the first image is obtained by accumulating the pixel values of 2 pixel points with the same color, which are positioned on a first diagonal line, in adjacent 4 pixel points with the same color under the Quadra Bayer format; the pixel value of each pixel point of the second image is obtained by accumulating the pixel values of 2 pixel points with the same color, which are positioned on a second diagonal line, in adjacent 4 pixel points with the same color under the Quadra Bayer format;
and the image signal processor ISP module 72 is configured to accumulate pixel values at the same pixel point positions of the first image and the second image to obtain a target image when the current ambient brightness is determined to be less than or equal to the set threshold.
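The diagonal accumulation that produces the first and second images is performed on-chip by the camera sensor; as an illustrative numpy model (the pairing of main diagonal with "first" and anti-diagonal with "second" is an assumption), it can be written as:

```python
import numpy as np


def diagonal_split(quadra):
    """From each 2x2 same-colour block of a Quadra Bayer frame, build two
    half-resolution images: one accumulates the pair on the first (main)
    diagonal, the other the pair on the second (anti) diagonal."""
    h, w = quadra.shape
    blocks = quadra.reshape(h // 2, 2, w // 2, 2).astype(np.uint32)
    first = blocks[:, 0, :, 0] + blocks[:, 1, :, 1]   # first-diagonal pair
    second = blocks[:, 0, :, 1] + blocks[:, 1, :, 0]  # second-diagonal pair
    return first, second
```

Each output pixel already sums 2 same-colour pixels, so adding the two images pixel-wise (as the ISP module does) recovers the full 4-pixel sum of the binning case.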
In one possible implementation, the image signal processor ISP module 72 is specifically configured to accumulate pixel values at the same pixel positions of the first image and the second image, keep the accumulated pixel value smaller than or equal to a preset value unchanged, and update the accumulated pixel value greater than the preset value to obtain the target image.
In a possible implementation, the camera sensor module 71 is further configured to set, before outputting the first image and the second image, that the exposure time and/or the gain of the camera sensor for acquiring the first image and the second image are the same.
In a possible implementation, the image signal processor ISP module 72 is further configured to perform color space conversion processing on the target image, obtain a first YUV image, and output the first YUV image.
In one example:
a camera sensor module 71 for outputting a third image when the camera sensor is operated in the linear mode; the third image is an image with a first resolution in a Quadra Bayer format;
the image signal processor ISP module 72 is configured to: when the current ambient brightness is determined to be less than or equal to the set threshold, combine adjacent 4 pixels of the same color in the third image into one pixel, taking the sum of the pixel values of the 4 adjacent pixels of the same color as the pixel value of that one pixel, to obtain a fourth image, where the fourth image is an image of a second resolution in the Bayer format and the second resolution is 1/4 of the first resolution; perform color space conversion processing on the fourth image to obtain a second YUV image, where the resolution of the second YUV image is the second resolution; upscale the second YUV image into a third YUV image, where the resolution of the third YUV image is the first resolution; and output the third YUV image.
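The color space conversion step appears repeatedly above but the patent does not pin down the conversion matrix. As one common choice, a full-range BT.601 RGB-to-YUV conversion (applied after demosaicing the Bayer data, which is omitted here) looks like:

```python
import numpy as np


def rgb_to_yuv_bt601(rgb):
    """Full-range BT.601 RGB -> YUV; an assumed stand-in for the patent's
    unspecified color space conversion processing."""
    m = np.array([[ 0.299,     0.587,     0.114   ],
                  [-0.168736, -0.331264,  0.5     ],
                  [ 0.5,      -0.418688, -0.081312]])
    yuv = rgb.astype(np.float64) @ m.T
    yuv[..., 1:] += 128.0  # offset chroma channels to the unsigned range
    return yuv
```

A neutral gray input maps to Y equal to the gray level and both chroma channels at the 128 midpoint, which is a quick sanity check on the matrix.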
In one possible implementation, the ISP module 72 is further configured to: when the current ambient brightness is determined to be less than or equal to the set threshold, perform color space conversion processing on the third image to obtain a fourth YUV image, where the resolution of the fourth YUV image is the first resolution; and output the fourth YUV image.
Based on the same technical concept, the application further provides an electronic device, fig. 8 shows a schematic structural diagram of the electronic device, and as shown in fig. 8, the electronic device includes: the processor 81, optionally, further comprises: a communication interface 82, a memory 83 and a communication bus 84, wherein the processor 81, the communication interface 82 and the memory 83 perform communication with each other through the communication bus 84;
the memory 83 stores a computer program which, when executed by the processor 81, causes the processor 81 to perform the steps of the image processing method described above.
The communication bus mentioned above for the electronic device may be a peripheral component interconnect standard (Peripheral Component Interconnect, PCI) bus or an extended industry standard architecture (Extended Industry Standard Architecture, EISA) bus, etc. The communication bus may be classified as an address bus, a data bus, a control bus, or the like. For ease of illustration, only one thick line is shown in the figure, but this does not mean that there is only one bus or only one type of bus.
The communication interface 82 is used for communication between the above-described electronic device and other devices.
The Memory may include random access Memory (Random Access Memory, RAM) or may include Non-Volatile Memory (NVM), such as at least one disk Memory. Optionally, the memory may also be at least one memory device located remotely from the aforementioned processor.
The processor may be a general-purpose processor, including a central processing unit, a network processor (Network Processor, NP), etc.; it may also be a digital signal processor (Digital Signal Processor, DSP), an application-specific integrated circuit, a field-programmable gate array or other programmable logic device, discrete gate or transistor logic, discrete hardware components, etc.
Based on the same technical concept, on the basis of the above embodiments, the present application provides a computer-readable storage medium having stored therein a computer program executable by an electronic device; when the program runs on the electronic device, it causes the electronic device to execute the steps of the above image processing method.
The computer-readable storage medium may be any available medium or data storage device that can be accessed by a processor in an electronic device, including but not limited to magnetic memories such as floppy disks, hard disks, magnetic tapes, and magneto-optical disks (MO); optical memories such as CD, DVD, BD, and HVD; and semiconductor memories such as ROM, EPROM, EEPROM, non-volatile memories (NAND flash), and solid-state disks (SSD).
It will be appreciated by those skilled in the art that embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to the application. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
It will be apparent to those skilled in the art that various modifications and variations can be made in the present application without departing from the spirit or scope of the application. Thus, if such modifications and variations of the present application fall within the scope of the claims and the equivalents thereof, the present application is intended to cover such modifications and variations.

Claims (10)

1. An image processing method, comprising:
when a camera sensor works in a high dynamic range imaging HDR mode, the camera sensor outputs a first image and a second image;
the pixel value of each pixel point of the first image is obtained by accumulating the pixel values of 2 pixel points with the same color, which are positioned on a first diagonal line, in adjacent 4 pixel points with the same color under the Quadra Bayer format;
the pixel value of each pixel point of the second image is obtained by accumulating the pixel values of 2 pixel points with the same color, which are positioned on a second diagonal line, in adjacent 4 pixel points with the same color under the Quadra Bayer format;
when the current ambient brightness is less than or equal to the set threshold value, the image signal processor ISP accumulates the pixel values at the same pixel point positions of the first image and the second image to obtain a target image.
2. The method of claim 1, wherein accumulating pixel values at the same pixel locations of the first image and the second image to obtain a target image comprises:
and accumulating the pixel values at the same pixel point positions of the first image and the second image, keeping the accumulated pixel values smaller than or equal to a preset value unchanged, and updating the accumulated pixel values larger than the preset value to obtain a target image.
3. The method of claim 1, further comprising, prior to the camera sensor outputting the first image and the second image:
setting the exposure time and/or gain of the camera sensor for acquiring the first image and the second image to be the same.
4. The method of claim 1, further comprising, after obtaining the target image:
the image signal processor ISP performs color space conversion processing on the target image to obtain a first YUV image, and outputs the first YUV image.
5. The method as recited in claim 1, further comprising:
when the camera sensor works in a linear mode, the camera sensor outputs a third image; the third image is an image with a first resolution in a Quadra Bayer format;
when the current ambient brightness is determined to be smaller than or equal to a set threshold value, the image signal processor ISP combines adjacent 4 pixels with the same color in the third image into one pixel, and adds the pixel values of the adjacent 4 pixels with the same color to obtain a fourth image; the fourth image is an image with a second resolution of a Bayer format, and the second resolution is 1/4 of the first resolution;
performing color space conversion processing on the fourth image to obtain a second YUV image; wherein the resolution of the second YUV image is the second resolution;
amplifying the second YUV image into a third YUV image; wherein the resolution of the third YUV image is the first resolution;
outputting the third YUV image.
6. The method of claim 5, wherein the first resolution is a highest resolution supported by the camera sensor.
7. The method of claim 5 or 6, further comprising:
when the current ambient brightness is determined to be smaller than or equal to a set threshold value, performing color space conversion processing on the third image to obtain a fourth YUV image; wherein the resolution of the fourth YUV image is the first resolution;
and outputting the fourth YUV image.
8. An image processing apparatus, comprising:
the camera sensor module is used for outputting a first image and a second image when working in a high dynamic range imaging HDR mode; the pixel value of each pixel point of the first image is obtained by accumulating the pixel values of 2 pixel points with the same color, which are positioned on a first diagonal line, in adjacent 4 pixel points with the same color under the Quadra Bayer format; the pixel value of each pixel point of the second image is obtained by accumulating the pixel values of 2 pixel points with the same color, which are positioned on a second diagonal line, in adjacent 4 pixel points with the same color under the Quadra Bayer format;
and the image signal processor ISP module is used for accumulating the pixel values at the same pixel point positions of the first image and the second image when the current ambient brightness is less than or equal to the set threshold value, so as to obtain the target image.
9. An electronic device, comprising: a processor and a memory;
the memory is used for storing a computer program or instructions;
the processor being configured to execute part or all of the computer program or instructions in the memory, which, when executed, is configured to implement the method of any of claims 1-8.
10. A computer readable storage medium storing a computer program comprising instructions for implementing the method of any one of claims 1-8.
CN202311098984.7A 2023-08-29 2023-08-29 Image processing method, device, equipment and medium Pending CN117278862A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311098984.7A CN117278862A (en) 2023-08-29 2023-08-29 Image processing method, device, equipment and medium


Publications (1)

Publication Number Publication Date
CN117278862A true CN117278862A (en) 2023-12-22

Family

ID=89213329

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311098984.7A Pending CN117278862A (en) 2023-08-29 2023-08-29 Image processing method, device, equipment and medium

Country Status (1)

Country Link
CN (1) CN117278862A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination