CN113055665A - Image processing method, terminal and storage medium - Google Patents

Image processing method, terminal and storage medium

Info

Publication number
CN113055665A
CN113055665A (application CN201911379772.XA)
Authority
CN
China
Prior art keywords
infrared
characteristic value
image data
image
infrared characteristic
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201911379772.XA
Other languages
Chinese (zh)
Other versions
CN113055665B (en)
Inventor
王琳
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN201911379772.XA priority Critical patent/CN113055665B/en
Publication of CN113055665A publication Critical patent/CN113055665A/en
Application granted granted Critical
Publication of CN113055665B publication Critical patent/CN113055665B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/64Circuits for processing colour signals
    • H04N9/73Colour balance circuits, e.g. white balance circuits or colour temperature control

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Image Processing (AREA)

Abstract

The embodiment of the application discloses an image processing method, a terminal and a storage medium, wherein the image processing method comprises the following steps: acquiring first infrared information, second infrared information and visible light components through a color temperature sensor; generating a first infrared characteristic value and a second infrared characteristic value according to the first infrared information, the second infrared information and the visible light component; screening original image data corresponding to the image to be processed according to the first infrared characteristic value and the second infrared characteristic value to obtain screened image data; and performing image processing on the image to be processed by using the screened image data.

Description

Image processing method, terminal and storage medium
Technical Field
The embodiment of the application relates to the technical field of image processing, in particular to an image processing method, a terminal and a storage medium.
Background
When performing image processing, the terminal can detect the position of a light source in an image by using an Automatic White Balance (AWB) algorithm based on the original image data, and can then perform white balance restoration adapted to the image once the light source position is determined, achieving a better image processing effect.
However, real scenes often introduce a large number of confusing colors that interfere with the white balance determination, so the terminal cannot accurately determine the position of the light source in the image; this degrades the restoration effect of the AWB algorithm and thereby reduces the quality of the image processing.
Disclosure of Invention
The embodiment of the application provides an image processing method, a terminal and a storage medium, which can accurately determine the position of a light source in an image, effectively enhance the restoration effect of the AWB algorithm, and further improve the image processing effect.
The technical scheme of the embodiment of the application is realized as follows:
in a first aspect, an embodiment of the present application provides an image processing method, where the method includes:
acquiring first infrared information, second infrared information and visible light components through a color temperature sensor;
generating a first infrared characteristic value and a second infrared characteristic value according to the first infrared information, the second infrared information and the visible light component;
screening original image data corresponding to the image to be processed according to the first infrared characteristic value and the second infrared characteristic value to obtain screened image data;
and carrying out image processing on the image to be processed by using the screened image data.
In a second aspect, an embodiment of the present application provides a terminal, where the terminal includes: an acquisition unit, a generation unit, a screening unit, a processing unit,
the acquisition unit is used for acquiring first infrared information, second infrared information and visible light components through the color temperature sensor;
the generating unit is used for generating a first infrared characteristic value and a second infrared characteristic value according to the first infrared information, the second infrared information and the visible light component;
the screening unit is used for screening original image data corresponding to the image to be processed according to the first infrared characteristic value and the second infrared characteristic value to obtain screened image data;
and the processing unit is used for carrying out image processing on the image to be processed by utilizing the screened image data.
In a third aspect, an embodiment of the present application provides a terminal, where the terminal includes a processor and a memory storing instructions executable by the processor, and when the instructions are executed by the processor, the terminal implements the image processing method as described above.
In a fourth aspect, the present application provides a computer-readable storage medium, on which a program is stored, and the program is applied to a terminal, and when the program is executed by a processor, the program implements the image processing method as described above.
The embodiment of the application provides an image processing method, a terminal and a storage medium. The terminal acquires first infrared information, second infrared information and a visible light component through a color temperature sensor; generates a first infrared characteristic value and a second infrared characteristic value according to the first infrared information, the second infrared information and the visible light component; screens the original image data corresponding to the image to be processed according to the two characteristic values to obtain screened image data; and performs image processing on the image to be processed using the screened image data. That is to say, in the embodiment of the application, the terminal can obtain the first and second infrared characteristic values of the image to be processed using the color temperature sensor and then screen the original image data of the image with them, so that confusing-color data that could affect white balance processing can be removed from the original image data before image processing is performed with the screened data. In this way, the dual-channel infrared information serves as an important auxiliary input to the AWB algorithm for eliminating the confusing colors that would otherwise interfere, so that the position of the light source in the image can be accurately determined, the restoration effect of the AWB algorithm is effectively enhanced, and the overall image processing effect is further improved.
Drawings
FIG. 1 is a schematic flow chart of an implementation of an image processing method;
FIG. 2 is a first schematic view of a position of a color temperature sensor;
FIG. 3 is a schematic diagram of a second location of the color temperature sensor;
FIG. 4 is a schematic view of a current color temperature sensor;
FIG. 5 is a schematic diagram of a third position of the color temperature sensor;
FIG. 6 is a fourth schematic view of the position of the color temperature sensor;
FIG. 7 is a graphical representation of the spectral response of a color temperature sensor;
FIG. 8 is a schematic view of different detection channels;
FIG. 9 is a diagram of time domain signals before time-frequency transformation;
FIG. 10 is a schematic diagram of a frequency domain signal after time-frequency transformation;
FIG. 11 is a first schematic diagram of a coordinate point in a predetermined color gamut space;
FIG. 12 is a second schematic diagram of coordinate points in the preset color gamut space;
FIG. 13 is a third schematic diagram of coordinate points in the preset color gamut space;
FIG. 14 is a first schematic diagram of the structure of the terminal;
FIG. 15 is a second schematic diagram of the structure of the terminal.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application. It is to be understood that the specific embodiments described herein are merely illustrative of the relevant application and are not limiting of the application. It should be noted that, for the convenience of description, only the parts related to the related applications are shown in the drawings.
The basic assumption behind most AWB algorithms is a single light source; if light sources with different color temperatures coexist in an image, this assumption breaks down, and the AWB module must fall back on other methods to handle mixed-light-source scenes. Object recognition tasks, such as face recognition, gesture recognition and license plate recognition, also require removing the influence of the light source color, and under a mixed light source the removal strategy is often considerably more complicated.
The AWB algorithm can detect the position of a light source in an image based on the raw image data of that image, and can then perform white balance restoration adapted to the image once the light source position is determined. However, when there is no light source in the image, or when a large number of confusing colors are present, achieving white balance becomes very difficult; that is, image processing is often disturbed by confusing colors such as skin tones and green-blue hues, which reduces the processing effect.
In the AWB algorithm, in the UV color gamut formed by R/G and B/G, actual scenes introduce a large number of confusing colors that influence the white balance determination, including green plants in outdoor scenes and objects in indoor scenes whose colors resemble Light-Emitting Diodes (LEDs) and Organic Light-Emitting Diodes (OLEDs); these are not actually light source colors and need to be excluded during calculation. Specifically, the AWB algorithm usually calculates R/G and B/G from the original image data, places the corresponding coordinate points into the UV color gamut with R/G as the abscissa and B/G as the ordinate, and then infers the position of the real light source in the scene from the relative positions of these coordinate points and the standard light sources (e.g., the D75, D65 and A light sources). Because a real scene contains many objects whose colors mimic those of light sources, the judgment of the white balance algorithm can be misled; the prior art cannot effectively distinguish whether a coordinate point corresponds to a real light source or a confusing color, so the terminal cannot accurately determine the light source position in the image, which impairs the restoration effect of the AWB and reduces the processing effect.
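As a sketch of this UV-gamut inference, the following hypothetical snippet maps one stats patch to (R/G, B/G) coordinates and picks the nearest reference illuminant. The illuminant coordinates are illustrative placeholders, not calibration values from the patent:

```python
import math

# Hypothetical reference illuminants in (R/G, B/G) coordinates; real
# calibration values are device-specific and not given in the text.
REFERENCE_ILLUMINANTS = {
    "D75": (0.55, 1.10),
    "D65": (0.60, 1.00),
    "CWF": (0.72, 0.78),
    "TL84": (0.78, 0.70),
    "A":   (1.05, 0.45),
}

def nearest_illuminant(r: float, g: float, b: float) -> str:
    """Map one stats patch to the UV gamut (R/G, B/G) and return the
    closest reference light source by Euclidean distance."""
    u, v = r / g, b / g
    return min(REFERENCE_ILLUMINANTS,
               key=lambda name: math.dist((u, v), REFERENCE_ILLUMINANTS[name]))
```

In a full AWB pipeline this nearest-illuminant vote would be aggregated over all stats patches, which is exactly where confusing colors can mislead the result.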
Further, since the color temperature sensor can provide dual-channel infrared (IR) component information and thereby help the white balance algorithm eliminate confusing colors in the environment, in an embodiment of the present application the terminal can obtain the first infrared characteristic value and the second infrared characteristic value of the image to be processed using the color temperature sensor, and then screen the original image data of the image to be processed using these two characteristic values, so that confusing-color data affecting the white balance processing can be removed from the original image data and image processing can be performed with the screened data. In this way, the dual-channel infrared information serves as an important auxiliary input to the AWB algorithm for eliminating the interfering confusing colors in the image to be processed, so that the light source position in the image can be accurately determined, the restoration effect of the AWB is effectively enhanced, and the image processing effect is further improved.
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application.
An embodiment of the present application provides an image processing method, fig. 1 is a schematic implementation flow diagram of the image processing method, and as shown in fig. 1, in the embodiment of the present application, a method for a terminal to perform image processing may include the following steps:
step 101, acquiring first infrared information, second infrared information and visible light components through a color temperature sensor.
In the embodiment of the application, the terminal may first obtain the first infrared information, the second infrared information and the visible light component through detection of the configured color temperature sensor.
It should be noted that, in the embodiments of the present application, the terminal may be any device having communication and storage functions, for example: tablet computers, mobile phones, electronic readers, remote controllers, Personal Computers (PCs), notebook computers, vehicle-mounted devices, network televisions, wearable devices, and the like.
Further, in the embodiment of the present application, the terminal may be provided with a shooting device for image acquisition, and specifically, the terminal may be provided with at least one front camera and at least one rear camera.
It is understood that, in the embodiment of the present application, the image to be processed may be obtained by shooting the terminal through the arranged shooting device.
It should be noted that, in the embodiments of the present application, the terminal may further be provided with a color temperature sensor, and specifically, in the present application, the terminal may be provided with the color temperature sensor on one side of the front camera, and may also be provided with the color temperature sensor on one side of the rear camera. Fig. 2 is a first schematic position diagram of a color temperature sensor, and fig. 3 is a second schematic position diagram of the color temperature sensor, where the color temperature sensor is disposed on the left side of the front camera of the terminal as shown in fig. 2, and the color temperature sensor is disposed on the lower side of the rear camera of the terminal as shown in fig. 3.
In a common placement of the color temperature sensor, the sensor is arranged in the notch area of a full screen. Specifically, fig. 4 is a schematic diagram of this current placement; as shown in fig. 4, the color temperature sensor is placed below the ink layer in the notch area of the terminal.
The terminal may also have the color temperature sensor disposed in the slit at the top. Fig. 5 is a third position schematic diagram of the color temperature sensor and fig. 6 is a fourth; as shown in figs. 5 and 6, with the sensor disposed in the slit at the top of the terminal, it does not affect the appearance of either the front side (fig. 5) or the back side (fig. 6), and a sensor placed in the slit does not require an ink hole to be opened in the terminal.
Further, in the embodiment of the application, the terminal may detect the environmental parameters corresponding to the image to be processed through the configured color temperature sensor. Specifically, the color temperature sensor can detect parameters such as red (R), green (G), blue (B), visible light (C), full-spectrum wide band (WB), correlated color temperature (CCT), and the flash frequency (FD) of two channels, denoted FD1 and FD2, corresponding to the image to be processed.
Fig. 7 is a graph showing the spectral response of the color temperature sensor, and as shown in fig. 7, the spectral response curves corresponding to R, G, B, C, WB, FD1 and FD2 detected by the color temperature sensor vary with the wavelength.
It should be noted that, in the embodiment of the present application, the first infrared information and the second infrared information are different; specifically, the first infrared information may be used to measure the intensity of the infrared band between 800 nm and 900 nm, and the second infrared information may be used to measure the intensity of the infrared band between 950 nm and 1000 nm.
Further, in the embodiment of the application, the color temperature sensor configured in the terminal may detect infrared light in the environment corresponding to the image to be processed over different reception bands, thereby obtaining the first infrared information and the second infrared information. Fig. 8 is a schematic diagram of the different detection channels; as shown in fig. 8, the terminal can perform detection of the infrared band using two frequencies, 50 Hz and 60 Hz respectively.
It should be noted that, in the embodiment of the present application, the terminal may obtain, by using the color temperature sensor, first time domain information obtained by detecting the first infrared channel, that is, obtain the first infrared information; meanwhile, the terminal can also obtain second time domain information detected and obtained by the second infrared channel through the color temperature sensor, namely obtain second infrared information.
Accordingly, the terminal can also obtain the component of the visible light band through the color temperature sensor, namely obtain the visible light component.
And 102, generating a first infrared characteristic value and a second infrared characteristic value according to the first infrared information, the second infrared information and the visible light component.
In the embodiment of the application, after the terminal detects and obtains the first infrared information, the second infrared information and the visible light component by using the color temperature sensor, the terminal can directly generate the first infrared characteristic value and the second infrared characteristic value corresponding to the image to be processed according to the first infrared information, the second infrared information and the visible light component.
It should be noted that, in the embodiment of the present application, when the terminal generates the first infrared characteristic value and the second infrared characteristic value, the terminal may first perform time-frequency transformation on the first infrared information, so as to obtain a first direct current component corresponding to the first infrared information, and may also perform time-frequency transformation on the second infrared information, so as to obtain a second direct current component corresponding to the second infrared information. Fig. 9 is a schematic diagram of a time domain signal before time-frequency transformation, and fig. 10 is a schematic diagram of a frequency domain signal after time-frequency transformation, as shown in fig. 9 and fig. 10, after time-frequency transformation processing is performed, infrared information of a time domain can be converted into a corresponding direct current component.
Further, in the embodiment of the present application, after the terminal performs time-frequency transform processing on the first infrared information and the second infrared information respectively to obtain the first direct current component and the second direct current component, the terminal may further generate the first infrared characteristic value and the second infrared characteristic value by using the first direct current component, the second direct current component, and the visible light component.
It should be noted that, in the embodiment of the present application, the terminal may calculate to obtain the first infrared characteristic value by using the second direct current component and the visible light component, and meanwhile, the terminal may also calculate to obtain the second infrared characteristic value by using the first direct current component and the second direct current component.
Further, in an embodiment of the present application, the first infrared characteristic value may be used to measure the intensity of the infrared band from 800 nm to 900 nm, and the second infrared characteristic value may be used to measure the intensity of the infrared band from 950 nm to 1000 nm.
It should be noted that, in the embodiment of the present application, when the terminal generates the first infrared characteristic value, the first infrared characteristic value IR1 may be calculated from the second direct current component Dc(FD2) and the visible light component C according to the following formula (1):
IR1=(Dc(FD2)-C)/Dc(FD2) (1)
It should be noted that, in the embodiment of the present application, when the terminal generates the second infrared characteristic value, the second infrared characteristic value IR2 may be calculated from the first direct current component Dc(FD1) and the second direct current component Dc(FD2) according to the following formula (2):
IR2=(Dc(FD1)-Dc(FD2))/Dc(FD2) (2)
the operator of Dc represents the direct current component of the corresponding channel, FD1DC is Dc (FD1), FD2DC is Dc (FD 2).
In this application, light in the 380 nm to 780 nm spectrum is detectable by the human eye and is referred to as the visible band. The region beyond 800 nm is usually referred to as the infrared band and is imperceptible to the human eye. Under indoor fluorescent lamps, the energy in the 800-900 nm infrared band is very weak; under sunlight it is quite strong but attenuates sharply beyond 950 nm; by contrast, an incandescent lamp shows strong energy across the whole 800-1000 nm band. Therefore, the dual-channel infrared band information detected by the color temperature sensor can be used directly to obtain this discriminating characteristic information.
And 103, screening original image data corresponding to the image to be processed according to the first infrared characteristic value and the second infrared characteristic value to obtain screened image data.
In the embodiment of the application, after the terminal generates the first infrared characteristic value and the second infrared characteristic value according to the first infrared information, the second infrared information and the visible light component, the terminal may screen the original image data corresponding to the image to be processed according to the first infrared characteristic value and the second infrared characteristic value, so as to obtain the screened image data.
Further, in the embodiment of the present application, the RAW image data of the image to be processed may be a RAW image of the image to be processed acquired by the terminal.
A RAW image is the original image data in which a Complementary Metal Oxide Semiconductor (CMOS) or Charge-Coupled Device (CCD) image sensor has converted the captured light signal into a digital signal. A RAW file records the raw information from the image sensor along with some metadata generated by the camera (ISO setting, shutter speed, aperture value, white balance, etc.). RAW is an unprocessed and uncompressed format; it can be conceptualized as "original image encoded data" or, more vividly, as "digital film".
It can be understood that, during Image Signal Processing (ISP), after Automatic Exposure (AE) is performed on an image, the original image data may be extracted and optimized using the AWB algorithm or an Automatic Focus (AF) algorithm; after demosaicing the optimized original image data, Color Space Conversion (CSC) and noise reduction (NR) are performed. The original image data corresponding to the image to be processed in this application is this original image data.
It should be noted that, in the embodiment of the present application, the terminal chooses to work with the original image data in the RAW domain: during the subsequent clustering process, extraction based on the RAW-domain stats information (120 × 90) is much faster than operating in the RGB domain (4000 × 3000).
Further, in the embodiment of the application, the terminal may first acquire original image data of an image to be processed, and then screen the original image data by using the first infrared characteristic value and the second infrared characteristic value.
It is to be understood that, in the embodiment of the present application, the original image data may be stats information of a preset data amount. The preset data amount may be a data size standard pre-stored in the terminal, for example, the preset data amount may be 64 × 48, 120 × 90, 32 × 24, and the like.
It should be noted that, in the embodiment of the present application, before the terminal filters the original image data corresponding to the image to be processed according to the first infrared characteristic value and the second infrared characteristic value, a preset color gamut space may be established first. Specifically, the terminal may establish the preset color gamut space based on the R, G, B information. Illustratively, the terminal may establish a preset color gamut space, i.e., a UV space, with log (R/G) and log (B/G) as abscissa and ordinate, respectively; the terminal can also establish a preset color gamut space by taking R/G and B/G as an abscissa and an ordinate respectively.
Further, in the embodiment of the application, the terminal may map the original image data corresponding to the image to be processed into the preset color gamut space, that is, generate the coordinate point corresponding to the original image data in the preset color gamut space, and then may perform screening processing on the coordinate point by using the first infrared characteristic value and the second infrared characteristic value to obtain the screened coordinate, so that the screened image data may be determined according to the screened coordinate.
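A minimal sketch of this mapping-and-screening step follows. The IR thresholds and the "confusing-color" region used here are hypothetical illustrations; the patent does not specify their values:

```python
import math

# Hypothetical thresholds: above these, the scene is treated as having
# strong infrared energy (e.g. sunlight on green plants).
IR1_MAX = 0.4
IR2_MAX = 0.3

def screen_stats(stats, ir1, ir2):
    """stats: list of (R, G, B) patches. Map each patch to a coordinate
    point (log(R/G), log(B/G)) in the preset color gamut space, then drop
    points in an illustrative confusing-color region when the infrared
    characteristic values indicate strong IR energy."""
    points = [(math.log(r / g), math.log(b / g)) for r, g, b in stats]
    if ir1 <= IR1_MAX and ir2 <= IR2_MAX:
        return points                        # weak IR: keep all points
    # Strong IR: drop points in an (illustrative) green-plant region,
    # i.e. patches where both R/G < 1 and B/G < 1.
    return [(u, v) for u, v in points if not (u < 0 and v < 0)]
```

The surviving coordinates correspond to the screened image data handed on to the AWB light source estimation.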
It should be noted that, in the embodiment of the present application, the terminal may further combine the luminance parameter corresponding to the image to be processed with the first infrared characteristic value and the second infrared characteristic value, so as to complete the screening of the coordinate point in the preset color gamut space.
Further, in the embodiment of the application, the terminal may use the dual-channel infrared band information detected by the color temperature sensor to screen the original image data corresponding to the image to be processed through the first and second infrared characteristic values, deleting the confusing colors in the image that interfere with determining the light source position, and thereby preventing a large number of confusing colors from affecting the white balance determination.
It can be understood that, in the embodiment of the application, the terminal may further acquire the flash frequencies of the two channels, i.e., the first flash frequency and the second flash frequency, through the color temperature sensor; when screening the original image data, the terminal may then also screen it according to the first and second flash frequencies to obtain the screened image data.
And 104, performing image processing on the image to be processed by using the screened image data.
In the embodiment of the application, after the terminal screens the original image data corresponding to the image to be processed according to the first infrared characteristic value and the second infrared characteristic value and obtains the screened image data, the image to be processed can be processed by using the screened image data.
It should be noted that, in the embodiment of the present application, when the terminal performs image processing on the image to be processed by using the screened image data, the terminal may determine a light source detection result corresponding to the image to be processed according to the first infrared characteristic value, the second infrared characteristic value, and the screened image data; and then, the image to be processed is processed by utilizing the light source detection result.
It is understood that, in the embodiment of the present application, the light source detection result determined by the terminal may include a corresponding light source position in the image to be processed, and may also include a corresponding light source type in the image to be processed.
Further, in the embodiment of the present application, the light source types in the light source detection result may include a single light source and a mixed light source. Specifically, the single light source may be any one of a daylight light source, a fluorescent lamp light source, or a tungsten lamp light source; the hybrid light source can be any combination of a daylight light source, a fluorescent lamp light source, and a tungsten lamp light source.
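As one illustration of this final processing step, the screened stats can feed a white balance gain computation. The patent does not prescribe this exact formula; the sketch below uses a common gray-world-style normalization to the green channel as an assumed stand-in:

```python
def awb_gains(screened_stats):
    """Derive white balance gains from screened (R, G, B) stats by
    averaging the surviving patches and normalizing to the green
    channel. This gray-world-style step is an assumption, not the
    patent's exact method."""
    n = len(screened_stats)
    r_avg = sum(s[0] for s in screened_stats) / n
    g_avg = sum(s[1] for s in screened_stats) / n
    b_avg = sum(s[2] for s in screened_stats) / n
    return g_avg / r_avg, 1.0, g_avg / b_avg   # (R gain, G gain, B gain)

def apply_gains(pixel, gains):
    """Apply the per-channel gains to one (R, G, B) pixel."""
    return tuple(v * gain for v, gain in zip(pixel, gains))
```

Because the confusing-color patches were already deleted, the averages reflect the light source rather than, say, green foliage, which is the point of screening before this step.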
Further, in the embodiment of the application, after the terminal acquires the flash frequencies of the two channels (the first flash frequency and the second flash frequency) through the color temperature sensor and screens the original image data according to them to obtain screened image data, it may likewise perform image processing on the image to be processed using that screened image data.
The embodiment of the application provides an image processing method, wherein a terminal acquires first infrared information, second infrared information and a visible light component through a color temperature sensor; generates a first infrared characteristic value and a second infrared characteristic value according to the first infrared information, the second infrared information and the visible light component; screens original image data corresponding to the image to be processed according to the first infrared characteristic value and the second infrared characteristic value to obtain screened image data; and performs image processing on the image to be processed by using the screened image data. That is to say, in the embodiment of the application, the terminal may obtain the first infrared characteristic value and the second infrared characteristic value of the image to be processed by using the color temperature sensor, and then screen the original image data of the image to be processed by using the first infrared characteristic value and the second infrared characteristic value, so that the confusing data in the original image data that may affect the white balance processing can be deleted, and the image processing can be performed by using the screened image data. In this way, the image processing method can take the dual-channel infrared information as an important auxiliary factor of the AWB algorithm to eliminate the confusing colors that may cause interference in the image to be processed, so that the position of the light source in the image can be accurately determined, the recovery effect of the AWB is effectively enhanced, and the processing effect of the image processing is further improved.
Based on the above embodiments, in still another embodiment of the present application, when the terminal performs image processing, the ISP may provide 120 × 90 blocks of stats information (the original image data) for the image to be processed, each stats block containing the three components R, G, and B. The terminal may calculate R/G and B/G for each stats block, and then use these two ratios to map each stats block as a point into a gamut with R/G as the abscissa and B/G as the ordinate, usually called the UV gamut, i.e. the preset color gamut space, thereby obtaining the coordinate point corresponding to that stats block. Fig. 11 is a schematic diagram of coordinate points in the preset color gamut space. As shown in Fig. 11, the terminal may mark in advance the preset points corresponding to different light sources in the preset color gamut space, for example the specific coordinate positions corresponding to the light sources D75, D65, CWF, TL84, A, and H. Among the coordinate points mapped into the preset color gamut space, the range of area 2 is usually where light sources such as LEDs and OLEDs often appear; the upper-left corner of area 2 lies above the CWF light source point at a distance of [0.015, 0.03] from it, and the length and width of area 2 are [0.33, 0.09]. When the infrared-band energy obtained by the color temperature sensor is strong, all coordinate points appearing in area 2 may be determined to be invalid, so these points can be omitted when the AWB algorithm is executed, thereby avoiding interference from some confusing colors.
Green plants such as leaves in an outdoor light environment reflect a large amount of the infrared band. When the ambient brightness of the image to be processed is high, for example when the brightness parameter is greater than 3, the range of area 1 in Fig. 11 is usually where pseudo light sources such as green plants often appear; the upper-left corner of area 1 lies below the D65 light source point at a distance of [0.12, 0.1] from it, and the length and width of area 1 are [0.15, 0.22]. These coordinate points can likewise be excluded when calculating the white balance parameters.
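The area-2 exclusion described above can be sketched as follows. The CWF coordinate, the rectangle's corner arithmetic, and the helper names are all illustrative assumptions, since the application gives only the offsets and dimensions:

```python
# Illustrative CWF calibration point in (R/G, B/G) space; a real value
# would come from light-box calibration of the specific sensor.
CWF = (0.55, 0.45)

# Per the text: area 2's upper-left corner lies above the CWF point at a
# distance of [0.015, 0.03], with length and width [0.33, 0.09].
CORNER_OFFSET = (0.015, 0.03)
LENGTH, WIDTH = 0.33, 0.09

def in_area2(point):
    """True if a (R/G, B/G) coordinate point falls inside area 2
    (one possible reading of the rectangle geometry in Fig. 11)."""
    left = CWF[0] + CORNER_OFFSET[0]
    top = CWF[1] + CORNER_OFFSET[1]
    x, y = point
    return left <= x <= left + LENGTH and top - WIDTH <= y <= top

def drop_area2(points):
    """Discard coordinate points inside area 2; applied only when the
    infrared-band energy measured by the color temperature sensor is strong."""
    return [p for p in points if not in_area2(p)]
```

With these placeholder numbers, a point at (0.6, 0.45) would be discarded while (0.2, 0.2) would be kept.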
The embodiment of the application provides an image processing method, and a terminal can take dual-channel infrared information as an important auxiliary factor of an AWB algorithm to eliminate confusing colors which can cause interference in an image to be processed, so that the position of a light source in the image can be accurately determined, the recovery effect of the AWB is effectively enhanced, and the processing effect of image processing is further improved.
Based on the foregoing embodiment, in yet another embodiment of the present application, the method for obtaining the filtered image data by the terminal filtering the original image data corresponding to the image to be processed according to the first infrared characteristic value and the second infrared characteristic value may include the following steps:
step 201, mapping the original image data to a preset color gamut space, and obtaining a coordinate point corresponding to the image to be processed.
In the embodiment of the application, the terminal may map the original image data to a preset color gamut space first, so as to obtain a coordinate point corresponding to the image to be processed.
It should be noted that, in the embodiment of the present application, after obtaining the original image data corresponding to the image to be processed, the terminal may first perform two-dimensional space construction by using R, G, B information of the original image data, so as to map the corresponding coordinate point in the preset color gamut space. Specifically, based on the construction manner of the preset color gamut space, the terminal may first perform corresponding calculation on R, G, B information corresponding to each original image data, so as to obtain coordinate information corresponding to each original image data.
For example, in the present application, if the abscissa and the ordinate of the preset color gamut space are log (R/G) and log (B/G), respectively, the terminal may calculate the R, G, B information corresponding to each original image data according to log (R/G) and log (B/G), so as to determine the corresponding position of each original image data in the preset color gamut space.
For example, in the present application, if the abscissa and the ordinate of the preset color gamut space are R/G and B/G, respectively, the terminal may calculate the R, G, B information corresponding to each original image data according to R/G and B/G, so as to determine the corresponding position of each original image data in the preset color gamut space.
It is to be understood that, in the embodiment of the present application, one original image data corresponds to one coordinate point in the preset color gamut space.
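A minimal sketch of step 201, assuming each item of original image data carries mean R, G, B values (the function name is illustrative, not from the application):

```python
def map_to_gamut(stats):
    """Map each (R, G, B) stats item to a coordinate point in the preset
    color gamut space with R/G as abscissa and B/G as ordinate."""
    points = []
    for r, g, b in stats:
        if g == 0:  # guard: a zero green component cannot be mapped
            continue
        points.append((r / g, b / g))
    return points
```

A variant using log(R/G) and log(B/G) as the axes, as also described above, would simply wrap each ratio in `math.log`.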
Step 202, screening the coordinate points by using the first infrared characteristic value and the second infrared characteristic value to obtain screened coordinates.
In the embodiment of the application, after the terminal maps the original image data to the preset color gamut space and obtains the coordinate point corresponding to the image to be processed, the terminal may use the first infrared characteristic value and the second infrared characteristic value to perform screening processing on the coordinate point in the preset color gamut space, so as to obtain the screened coordinate.
Further, in the embodiment of the application, before the terminal performs the screening processing on the coordinate points by using the first infrared characteristic value and the second infrared characteristic value, a preset color gamut space may be established first, and a preset point may also be determined in the preset color gamut space, where the preset point is used for calibrating light sources of different types.
Specifically, in the embodiment of the present application, the terminal may establish the preset color gamut space based on the R, G, B information. Illustratively, the terminal may establish a preset color gamut space with log (R/G) and log (B/G) as abscissa and ordinate, respectively; the terminal can also establish a preset color gamut space by taking R/G and B/G as an abscissa and an ordinate respectively.
It should be noted that, in the embodiment of the present application, the preset color gamut space established by the terminal may be a square area, for example, a square area of 0.2 to 1.5 may be selected as the preset color gamut space from the UV space established based on the R, G, B information.
It is to be understood that in the embodiments of the present application, the terminal may mark at least one preset point for characterizing a light source in the preset color gamut space. For example, according to light-box calibration, the terminal may mark in the preset color gamut space the specific coordinate positions corresponding to the 10K, D75, D65, D50, Cool White Fluorescent (CWF), TL84, A, and H light sources, i.e. mark at least one preset point for the different light sources.
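Marking the preset points might amount to a small calibration table, for example as below; every coordinate value here is an illustrative placeholder, since the real positions come from light-box calibration of the specific sensor:

```python
# Hypothetical (R/G, B/G) positions of the calibrated light-source points.
PRESET_POINTS = {
    "10K": (0.40, 0.65), "D75": (0.45, 0.60), "D65": (0.50, 0.55),
    "D50": (0.58, 0.48), "CWF": (0.55, 0.45), "TL84": (0.60, 0.42),
    "A":   (0.85, 0.30), "H":   (0.95, 0.25),
}

def preset_point(light_source):
    """Look up the marked preset point for a calibrated light source."""
    return PRESET_POINTS[light_source]
```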
In the embodiments of the present application, the International Commission on Illumination (French: Commission Internationale de l'Éclairage, CIE for short) specifies the color temperature standards of various standard illuminants, which are:
standard illuminant A: represents the light emitted by a complete radiator at 2856K;
standard illuminant B: represents direct sunlight with a correlated color temperature of about 4874K;
standard illuminant C: represents average sunlight with a correlated color temperature of about 6774K; its light color is similar to sunlight under an overcast sky;
standard illuminant D65: represents daylight with a correlated color temperature of about 6504K;
standard illuminant D: represents daylight other than standard illuminant D65.
The standard illuminants defined by the CIE refer to specific spectral energy distributions; they are defined light source color standards rather than particular physical lamps, and need not be provided or realized directly by any light source. To meet the requirements of the standard illuminants specified by the CIE, standard light sources must also be specified to embody the spectral power distributions required by the standard illuminants. The CIE recommends the following artificial light sources to realize the standard illuminants:
standard light source A: a gas-filled spiral tungsten lamp with a color temperature of 2856K; its light color is yellowish;
standard light source B: color temperature 4874K, equivalent to midday sunlight; composed of light source A covered with a B-type D-G liquid filter;
standard light source C: color temperature 6774K, equivalent to overcast daylight; composed of light source A covered with a C-type D-G liquid filter.
Specifically, the artificial standard light sources include the following:
D65: international standard artificial daylight with a color temperature of 6500K; replaces natural light for color matching and suits general requirements;
TL84: a tri-phosphor fluorescent light source with a color temperature of 4000K; a European and Japanese shop light;
CWF: color temperature 4150K; an American shop or office light source (cool white fluorescent);
F/A: simulated sunset light, a yellowish light source, used as a colorimetric reference;
UV: fluorescent or ultraviolet light, used to inspect articles treated with fluorescent or whitening dyes and to control their whiteness;
U30: American commercial fluorescent light with a color temperature of 3000K; a rare-earth fluorescent lamp used for store lighting;
D50: simulated daylight with a color temperature of 5000K;
H: Horizon, simulated horizon daylight with a color temperature of 2300K.
Further, in the embodiment of the present application, while marking at least one preset point in the preset color gamut space, the terminal may also mark, for each preset point, a corresponding region in the preset color gamut space; the region identifies the position range where the light sources or pseudo light sources that can be excluded based on that preset point often appear, and one preset point corresponds to one region. For example, for the first preset point a corresponding first region is marked in the preset color gamut space, and for the second preset point a corresponding second region is marked in the preset color gamut space. Specifically, each preset point has a fixed positional relationship with its corresponding region.
For example, in the present application, Fig. 12 is a second schematic diagram of coordinate points in the preset color gamut space. As shown in Fig. 12, if the first preset point is used to represent the CWF light source, the first region corresponding to the first preset point may represent the position range in the preset color gamut space where light sources such as LEDs and OLEDs, which can be excluded based on the CWF light source, often appear. Specifically, the upper-left corner of the first region may lie above the first preset point, the distance range between the first region and the first preset point is [0.015, 0.03], and the length and width range is [0.33, 0.09].
For example, in the present application, Fig. 13 is a third schematic diagram of coordinate points in the preset color gamut space. As shown in Fig. 13, if the second preset point is used to represent the D65 light source, the second region corresponding to the second preset point may represent the position range in the preset color gamut space where pseudo light sources such as green plants, which can be excluded based on the D65 light source, often appear. Specifically, the upper-left corner of the second region may lie below the second preset point, the distance range between the second region and the second preset point is [0.12, 0.1], and the length and width range is [0.15, 0.22].
Therefore, in the present application, after the terminal marks at least one preset point in the preset color gamut space, the corresponding at least one region can be determined based on the at least one preset point. The terminal then compares the first infrared characteristic value and the second infrared characteristic value with at least one preset infrared threshold respectively; after the comparison results are obtained, they can be combined with the preset points and their corresponding regions to delete coordinate points, thereby obtaining the filtered coordinates.
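The fixed positional relationship between a preset point and its region might be captured as below; the sign convention (upper-left corner above or below the preset point) is a guess at the geometry of the figures, and the function name is illustrative:

```python
def region_bounds(preset_point, corner_offset, size, corner_above=True):
    """Return (x_min, x_max, y_min, y_max) for the rectangular region whose
    upper-left corner lies at `corner_offset` from the preset light-source
    point and which extends `size` = (length, width)."""
    px, py = preset_point
    dx, dy = corner_offset
    length, width = size
    left = px + dx
    top = py + dy if corner_above else py - dy
    return (left, left + length, top - width, top)
```

With a placeholder CWF point, `region_bounds(cwf_point, (0.015, 0.03), (0.33, 0.09))` would give the bounds of the first region; `corner_above=False` with the D65 offsets would give the second region's bounds.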
And step 203, determining the screened image data according to the screened coordinates.
In the embodiment of the application, the terminal performs screening processing on the coordinate points by using the first infrared characteristic value and the second infrared characteristic value to obtain the screened coordinates, and then can further determine the screened image data according to the screened coordinates.
Further, in the embodiment of the application, the terminal determines the corresponding position of the original image data of the image to be processed in the preset color gamut space, i.e. maps the original image data to coordinate points in the preset color gamut space, and then screens those coordinate points based on the first infrared characteristic value and the second infrared characteristic value, so that the screened image data can be obtained from the resulting screened coordinates.
In an embodiment of the present application, further, the terminal performs a screening process on the coordinate point by using the first infrared characteristic value and the second infrared characteristic value, and the method for obtaining the screened coordinate may include:
step 202a, if the first infrared characteristic value is greater than the first infrared threshold value and the second infrared characteristic value is greater than the second infrared threshold value, determining a first area corresponding to the first preset point in the preset color gamut space.
And step 202b, deleting the points in the first area in the coordinate points to obtain the screened coordinates.
In the embodiment of the application, after mapping the original image data into the preset color gamut space and obtaining the coordinate points corresponding to the image to be processed, the terminal may compare the first infrared characteristic value with the first infrared threshold and, at the same time, compare the second infrared characteristic value with the second infrared threshold to obtain a comparison result; based on the comparison result, the terminal determines the first region corresponding to the first preset point in the preset color gamut space and deletes the coordinate points falling within it, obtaining the filtered coordinates.
Further, in the embodiment of the present application, the terminal may be preset with a plurality of infrared threshold values, where the infrared threshold values preset by the terminal, that is, the first infrared threshold value and the second infrared threshold value, may be used to perform preliminary screening on the light source types of the images to be processed.
It should be noted that, in the embodiment of the application, after the terminal respectively compares the first infrared characteristic value with the first infrared threshold and compares the second infrared characteristic value with the second infrared threshold, if the obtained comparison result is that the first infrared characteristic value is greater than the first infrared threshold and the second infrared characteristic value is greater than the second infrared threshold, the terminal may further obtain the first preset point and determine the first area corresponding to the first preset point in the preset color gamut space.
It is understood that, in the embodiment of the present application, the first preset point corresponds to the comparison result; specifically, the first preset point corresponds to the first infrared threshold and the second infrared threshold. Illustratively, suppose the first infrared threshold is 0.4 and the second infrared threshold is 0.6. If the comparison result is that the first infrared characteristic value is greater than 0.4 and the second infrared characteristic value is greater than 0.6, the terminal may consider the infrared-band energy corresponding to the image to be processed to be strong, and therefore determines the calibration point corresponding to the CWF light source in the preset color gamut space as the first preset point. After determining the first region, where light sources such as LEDs and OLEDs that can be excluded based on the CWF light source often appear, the terminal may use the first region to delete coordinate points.
Further, in the embodiment of the present application, after the terminal determines the first region corresponding to the first preset point in the preset color gamut space, the point appearing in the first region may be deleted, that is, the point in the first region in the coordinate points is deleted, so as to obtain the filtered coordinates.
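Steps 202a and 202b might be sketched as follows; the default thresholds echo the example above, while the region bounds and function name are placeholders:

```python
def screen_first_region(points, ir_value_1, ir_value_2,
                        ir_threshold_1=0.4, ir_threshold_2=0.6,
                        first_region=(0.565, 0.895, 0.39, 0.48)):
    """When both infrared characteristic values exceed their thresholds,
    delete the coordinate points that fall inside the first region
    (given as x_min, x_max, y_min, y_max); otherwise keep all points."""
    if not (ir_value_1 > ir_threshold_1 and ir_value_2 > ir_threshold_2):
        return list(points)
    x_min, x_max, y_min, y_max = first_region
    return [(x, y) for x, y in points
            if not (x_min <= x <= x_max and y_min <= y <= y_max)]
```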
In the embodiment of the application, further, before the terminal screens the original image data corresponding to the image to be processed according to the first infrared characteristic value and the second infrared characteristic value and obtains the screened image data, the terminal may further determine the brightness parameter corresponding to the image to be processed by using the original image data.
It can be understood that, in the embodiment of the present application, the terminal may further perform screening processing on the coordinate points by using the first infrared characteristic value and the second infrared characteristic value, and may further determine which coordinate points need to be deleted in the preset color gamut space by combining with the brightness parameter of the image to be processed.
Specifically, in an embodiment of the present application, the method for the terminal to perform screening processing on the coordinate points by using the first infrared characteristic value and the second infrared characteristic value to obtain the screened coordinates may include the following steps:
step 202c, if the first infrared characteristic value is greater than the third infrared threshold, the second infrared characteristic value is greater than the fourth infrared threshold, and the brightness parameter is greater than the preset brightness threshold, determining a second area corresponding to a second preset point in the preset color gamut space.
And step 202d, deleting the points in the second area in the coordinate points to obtain the screened coordinates.
In the embodiment of the application, after mapping the original image data into the preset color gamut space and obtaining the coordinate points corresponding to the image to be processed, the terminal may compare the first infrared characteristic value with the third infrared threshold, compare the second infrared characteristic value with the fourth infrared threshold, and at the same time compare the brightness parameter with the preset brightness threshold to obtain a comparison result; based on the comparison result, the terminal determines the second region corresponding to the second preset point in the preset color gamut space and deletes the coordinate points falling within it, obtaining the filtered coordinates.
Further, in the embodiment of the present application, the terminal may be preset with a plurality of infrared thresholds, where the infrared thresholds preset by the terminal, that is, the third infrared threshold and the fourth infrared threshold, may be used to perform preliminary screening on the light source types of the image to be processed.
Correspondingly, the terminal may be preset with a brightness threshold, where the brightness threshold preset by the terminal, that is, the preset brightness threshold, may also be used to perform preliminary screening on the light source type of the image to be processed.
It should be noted that, in the embodiment of the present application, after the terminal respectively compares the first infrared characteristic value with the third infrared threshold, compares the second infrared characteristic value with the fourth infrared threshold, and simultaneously compares the brightness parameter with the preset brightness threshold, if the obtained comparison result is that the first infrared characteristic value is greater than the third infrared threshold, the second infrared characteristic value is greater than the fourth infrared threshold, and the brightness parameter is greater than the preset brightness threshold, the terminal may further obtain the second preset point, and determine a second area corresponding to the second preset point in the preset color gamut space.
It is understood that, in the embodiment of the present application, the second preset point corresponds to the comparison result; specifically, the second preset point corresponds to the third infrared threshold, the fourth infrared threshold, and the preset brightness threshold. Illustratively, suppose the third infrared threshold is 0.5, the fourth infrared threshold is 0.4, and the preset brightness threshold is 3. If the comparison result is that the first infrared characteristic value is greater than 0.5, the second infrared characteristic value is greater than 0.4, and the brightness parameter is greater than 3, the terminal may consider the ambient brightness corresponding to the image to be processed to be high, and therefore determines the calibration point corresponding to the D65 light source in the preset color gamut space as the second preset point. After determining the second region, where pseudo light sources such as green plants that can be excluded based on the D65 light source often appear, the terminal may use the second region to delete coordinate points.
Further, in the embodiment of the present application, after the terminal determines the second region corresponding to the second preset point in the preset color gamut space, the point appearing in the second region may be deleted, that is, the point in the second region in the coordinate points is deleted, so as to obtain the filtered coordinates.
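Analogously, steps 202c and 202d might be sketched as below, with the brightness condition added; the default thresholds echo the example above and the second region's bounds are placeholders:

```python
def screen_second_region(points, ir_value_1, ir_value_2, brightness,
                         ir_threshold_3=0.5, ir_threshold_4=0.4,
                         brightness_threshold=3,
                         second_region=(0.62, 0.77, 0.23, 0.45)):
    """Delete the coordinate points inside the second region only when both
    infrared characteristic values and the brightness parameter all exceed
    their thresholds (the green-plant pseudo light source case)."""
    if not (ir_value_1 > ir_threshold_3 and ir_value_2 > ir_threshold_4
            and brightness > brightness_threshold):
        return list(points)
    x_min, x_max, y_min, y_max = second_region
    return [(x, y) for x, y in points
            if not (x_min <= x <= x_max and y_min <= y <= y_max)]
```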
The embodiment of the application provides an image processing method, wherein a terminal acquires first infrared information, second infrared information and a visible light component through a color temperature sensor; generates a first infrared characteristic value and a second infrared characteristic value according to the first infrared information, the second infrared information and the visible light component; screens original image data corresponding to the image to be processed according to the first infrared characteristic value and the second infrared characteristic value to obtain screened image data; and performs image processing on the image to be processed by using the screened image data. That is to say, in the embodiment of the application, the terminal may obtain the first infrared characteristic value and the second infrared characteristic value of the image to be processed by using the color temperature sensor, and then screen the original image data of the image to be processed by using the first infrared characteristic value and the second infrared characteristic value, so that the confusing data in the original image data that may affect the white balance processing can be deleted, and the image processing can be performed by using the screened image data. In this way, the image processing method can take the dual-channel infrared information as an important auxiliary factor of the AWB algorithm to eliminate the confusing colors that may cause interference in the image to be processed, so that the position of the light source in the image can be accurately determined, the recovery effect of the AWB is effectively enhanced, and the processing effect of the image processing is further improved.
Based on the foregoing embodiment, in another embodiment of the present application, fig. 14 is a schematic diagram of a composition structure of a terminal, and as shown in fig. 14, the terminal 10 according to the embodiment of the present application may include an obtaining unit 11, a generating unit 12, a screening unit 13, a processing unit 14, and a determining unit 15.
The acquiring unit 11 is configured to acquire the first infrared information, the second infrared information, and the visible light component through the color temperature sensor;
the generating unit 12 is configured to generate a first infrared characteristic value and a second infrared characteristic value according to the first infrared information, the second infrared information, and the visible light component;
the screening unit 13 is configured to screen original image data corresponding to an image to be processed according to the first infrared characteristic value and the second infrared characteristic value, so as to obtain screened image data;
the processing unit 14 is configured to perform image processing on the image to be processed by using the screened image data.
Further, in an embodiment of the present application, the generating unit 12 is specifically configured to perform time-frequency transform processing on the first infrared information to obtain a first direct current component; performing time-frequency transformation processing on the second infrared information to obtain a second direct current component; generating the first infrared characteristic value and the second infrared characteristic value using the first direct current component, the second direct current component, and the visible light component.
Further, in an embodiment of the present application, the generating unit 12 is specifically configured to determine the first infrared characteristic value according to the second direct current component and the visible light component; and determining the second infrared characteristic value according to the first direct current component and the second direct current component.
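A minimal sketch of the generating unit's computation, assuming the "time-frequency transform" yields the DC (zero-frequency) component of each sampled infrared channel, which equals the signal's mean; forming the characteristic values as ratios is an assumption, but which components pair up follows the text:

```python
def dc_component(samples):
    """Zero-frequency DFT term of a sampled signal, i.e. its mean --
    a stand-in for the time-frequency transform step."""
    return sum(samples) / len(samples)

def infrared_characteristic_values(ir1_samples, ir2_samples, visible):
    """First value from the second direct current component and the visible
    light component; second value from the first and second DC components."""
    dc1 = dc_component(ir1_samples)
    dc2 = dc_component(ir2_samples)
    return dc2 / visible, dc1 / dc2
```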
Further, in an embodiment of the present application, the screening unit 13 is specifically configured to map the original image data into a preset color gamut space, and obtain a coordinate point corresponding to the image to be processed; screening the coordinate points by using the first infrared characteristic value and the second infrared characteristic value to obtain screened coordinates; and determining the screened image data according to the screened coordinates.
Further, in an embodiment of the present application, the screening unit 13 is specifically configured to determine a first area corresponding to a first preset point in the preset color gamut space if the first infrared characteristic value is greater than a first infrared threshold value and the second infrared characteristic value is greater than a second infrared threshold value; and deleting the points in the first area in the coordinate points to obtain the screened coordinates.
Further, in an embodiment of the present application, the determining unit 15 is configured to determine, by using the original image data, the luminance parameter corresponding to the image to be processed before the original image data corresponding to the image to be processed is screened according to the first infrared characteristic value and the second infrared characteristic value to obtain the screened image data.
Further, in an embodiment of the present application, the screening unit 13 is specifically configured to determine a second area corresponding to a second preset point in the preset color gamut space if the first infrared characteristic value is greater than a third infrared threshold, the second infrared characteristic value is greater than a fourth infrared threshold, and the brightness parameter is greater than a preset brightness threshold; and deleting the points in the second area in the coordinate points to obtain the screened coordinates.
Further, in an embodiment of the present application, the processing unit 14 is specifically configured to determine a light source detection result corresponding to the image to be processed according to the first infrared characteristic value, the second infrared characteristic value, and the screened image data; and carrying out image processing on the image to be processed by utilizing the light source detection result.
Further, in an embodiment of the present application, the obtaining unit 11 is further configured to detect, by a color temperature sensor, a first flashing frequency and a second flashing frequency corresponding to the image to be processed before performing image processing on the image to be processed by using the screened image data;
the screening unit 13 is further configured to screen the original image data according to the first flash frequency and the second flash frequency, so as to obtain the screened image data.
In an embodiment of the present application, further, fig. 15 is a schematic diagram of a composition structure of a terminal, as shown in fig. 15, the terminal 10 according to the embodiment of the present application may further include a processor 16, a memory 17 storing executable instructions of the processor 16, and further, the terminal 10 may further include a communication interface 18, and a bus 19 for connecting the processor 16, the memory 17, and the communication interface 18.
In an embodiment of the present application, the processor 16 may be at least one of an Application Specific Integrated Circuit (ASIC), a Digital Signal Processor (DSP), a Digital Signal Processing Device (DSPD), a Programmable Logic Device (PLD), a Field Programmable Gate Array (FPGA), a Central Processing Unit (CPU), a controller, a microcontroller, and a microprocessor. It is understood that other electronic devices may also implement the above processor functions; the embodiments of the present application are not particularly limited in this respect. The terminal 10 may further comprise a memory 17 connected to the processor 16, wherein the memory 17 is adapted to store executable program code comprising computer operating instructions; the memory 17 may comprise a high-speed RAM and may further comprise a non-volatile memory, for example at least two disk memories.
In the embodiment of the present application, the bus 19 is used to connect the communication interface 18, the processor 16, and the memory 17, and to enable intercommunication among these devices.
In the embodiment of the present application, the memory 17 is used for storing instructions and data.
Further, in the embodiment of the present application, the processor 16 is configured to obtain the first infrared information, the second infrared information, and the visible light component through the color temperature sensor; generating a first infrared characteristic value and a second infrared characteristic value according to the first infrared information, the second infrared information and the visible light component; screening original image data corresponding to the image to be processed according to the first infrared characteristic value and the second infrared characteristic value to obtain screened image data; and carrying out image processing on the image to be processed by using the screened image data.
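The generation of the two infrared characteristic values (claims 2 and 3 below spell out the dependency structure: the first value from the second direct current component and the visible light component, the second value from the two direct current components) can be sketched as follows. The patent only says "determining according to", so the FFT-based DC extraction and the ratio formulas are assumed concrete forms, not the disclosed computation:

```python
import numpy as np

def dc_component(samples):
    """DC (zero-frequency) term of a sensor channel's time series, obtained
    via an FFT as one concrete form of the claimed 'time-frequency
    transformation processing'. For a real signal this equals its mean."""
    return np.abs(np.fft.rfft(samples))[0] / len(samples)

def infrared_characteristic_values(ir1_samples, ir2_samples, visible):
    """Sketch of claims 2-3 (formulas are assumptions): the first infrared
    characteristic value is derived from the second DC component and the
    visible light component; the second from the two DC components."""
    dc1 = dc_component(ir1_samples)
    dc2 = dc_component(ir2_samples)
    first = dc2 / max(visible, 1e-9)   # guard against a zero visible component
    second = dc1 / max(dc2, 1e-9)
    return first, second
```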
In practical applications, the memory 17 may be a volatile memory, such as a Random-Access Memory (RAM); or a non-volatile memory, such as a Read-Only Memory (ROM), a flash memory, a Hard Disk Drive (HDD), or a Solid-State Drive (SSD); or a combination of the above types of memories. The memory 17 provides instructions and data to the processor 16.
In addition, the functional modules in this embodiment may be integrated into one processing unit, or each unit may exist physically alone, or two or more units may be integrated into one unit. The integrated unit can be realized either in the form of hardware or in the form of a software functional module.
Based on such understanding, the technical solution of the present embodiment, in essence or in the part contributing to the prior art, or in whole or in part, may be embodied in the form of a software product. The software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) or a processor to execute all or part of the steps of the method of the present embodiment. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
According to the terminal provided by the embodiment of the application, the first infrared information, the second infrared information, and the visible light component are acquired through the color temperature sensor; a first infrared characteristic value and a second infrared characteristic value are generated according to the first infrared information, the second infrared information, and the visible light component; original image data corresponding to the image to be processed are screened according to the first infrared characteristic value and the second infrared characteristic value to obtain screened image data; and image processing is performed on the image to be processed by using the screened image data. That is to say, in the embodiment of the application, the terminal may obtain the first infrared characteristic value and the second infrared characteristic value of the image to be processed by using the color temperature sensor, and then screen the original image data of the image to be processed by using the two characteristic values, so that confounding data that may affect the white balance processing can be deleted from the original image data, and image processing can be performed with the screened image data. In this way, the image processing method can take the dual-channel infrared information as an important auxiliary factor of the AWB algorithm to eliminate confusable colors that would otherwise cause interference in the image to be processed, so that the position of the light source in the image can be accurately determined, the recovery effect of the AWB is effectively enhanced, and the overall processing effect is further improved.
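The screening step described above (and detailed in claims 4 and 5 below) can be sketched as follows. The patent does not name the color gamut space, the thresholds, or the preset point, so the (R/G, B/G) chromaticity plane, the circular deletion region, and every numeric value here are placeholders chosen for illustration:

```python
import numpy as np

def screen_coordinates(rgb_stats, first_ir, second_ir,
                       first_thresh=0.5, second_thresh=0.8,
                       preset=(1.0, 0.45), radius=0.1):
    """Sketch of claims 4-5 (all values are assumptions). Raw RGB statistics
    are mapped to an (R/G, B/G) chromaticity plane; when both infrared
    characteristic values exceed their thresholds, coordinate points inside a
    circular region around a preset point (e.g. a surface color that mimics a
    light source under IR-rich illumination) are deleted."""
    rg = rgb_stats[:, 0] / rgb_stats[:, 1]
    bg = rgb_stats[:, 2] / rgb_stats[:, 1]
    coords = np.stack([rg, bg], axis=1)
    if first_ir > first_thresh and second_ir > second_thresh:
        dist = np.linalg.norm(coords - np.asarray(preset), axis=1)
        coords = coords[dist > radius]   # delete points in the preset region
    return coords
```

Claims 6-7 extend the same pattern with a brightness condition selecting a second region; only the guard on the `if` changes.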
An embodiment of the present application provides a computer-readable storage medium on which a program is stored, which when executed by a processor implements the image processing method as described above.
Specifically, the program instructions corresponding to the image processing method in the present embodiment may be stored on a storage medium such as an optical disc, a hard disk, or a USB flash drive; when the program instructions corresponding to the image processing method in the storage medium are read and executed by an electronic device, the method includes the following steps:
acquiring first infrared information, second infrared information and visible light components through a color temperature sensor;
generating a first infrared characteristic value and a second infrared characteristic value according to the first infrared information, the second infrared information and the visible light component;
screening original image data corresponding to the image to be processed according to the first infrared characteristic value and the second infrared characteristic value to obtain screened image data;
and carrying out image processing on the image to be processed by using the screened image data.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of a hardware embodiment, a software embodiment, or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flowcharts and/or block diagrams, and combinations of flows and/or blocks in the flowcharts and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer-implemented process, such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
The above description is only a preferred embodiment of the present application, and is not intended to limit the scope of the present application.

Claims (12)

1. An image processing method, characterized in that the method comprises:
acquiring first infrared information, second infrared information and visible light components through a color temperature sensor;
generating a first infrared characteristic value and a second infrared characteristic value according to the first infrared information, the second infrared information and the visible light component;
screening original image data corresponding to the image to be processed according to the first infrared characteristic value and the second infrared characteristic value to obtain screened image data;
and carrying out image processing on the image to be processed by using the screened image data.
2. The method of claim 1, wherein the generating a first infrared characteristic value and a second infrared characteristic value according to the first infrared information, the second infrared information, and the visible light component comprises:
performing time-frequency transformation processing on the first infrared information to obtain a first direct current component; performing time-frequency transformation processing on the second infrared information to obtain a second direct current component;
generating the first infrared characteristic value and the second infrared characteristic value using the first direct current component, the second direct current component, and the visible light component.
3. The method of claim 2, wherein the generating the first infrared characteristic value and the second infrared characteristic value using the first direct current component, the second direct current component, and the visible light component comprises:
determining the first infrared characteristic value according to the second direct current component and the visible light component;
and determining the second infrared characteristic value according to the first direct current component and the second direct current component.
4. The method according to claim 1, wherein the screening original image data corresponding to the image to be processed according to the first infrared characteristic value and the second infrared characteristic value to obtain screened image data includes:
mapping the original image data to a preset color gamut space to obtain a coordinate point corresponding to the image to be processed;
screening the coordinate points by using the first infrared characteristic value and the second infrared characteristic value to obtain screened coordinates;
and determining the screened image data according to the screened coordinates.
5. The method according to claim 4, wherein the screening the coordinate points by using the first infrared characteristic value and the second infrared characteristic value to obtain screened coordinates comprises:
if the first infrared characteristic value is larger than a first infrared threshold value and the second infrared characteristic value is larger than a second infrared threshold value, determining a first area corresponding to a first preset point in the preset color gamut space;
and deleting the points in the first area in the coordinate points to obtain the screened coordinates.
6. The method according to claim 4, wherein before the original image data corresponding to the image to be processed is screened according to the first infrared characteristic value and the second infrared characteristic value to obtain the screened image data, the method further comprises:
and determining the brightness parameter corresponding to the image to be processed by using the original image data.
7. The method according to claim 6, wherein the screening the coordinate points by using the first infrared characteristic value and the second infrared characteristic value to obtain screened coordinates comprises:
if the first infrared characteristic value is larger than a third infrared threshold value, the second infrared characteristic value is larger than a fourth infrared threshold value, and the brightness parameter is larger than a preset brightness threshold value, determining a second area corresponding to a second preset point in the preset color gamut space;
and deleting the points in the second area in the coordinate points to obtain the screened coordinates.
8. The method according to any one of claims 1 to 6, wherein the performing image processing on the image to be processed by using the screened image data comprises:
determining a light source detection result corresponding to the image to be processed according to the first infrared characteristic value, the second infrared characteristic value and the screened image data;
and carrying out image processing on the image to be processed by utilizing the light source detection result.
9. The method according to any one of claims 1 to 6, wherein before performing image processing on the image to be processed by using the screened image data, the method further comprises:
detecting a first flashing frequency and a second flashing frequency corresponding to the image to be processed through a color temperature sensor;
and screening the original image data according to the first flash frequency and the second flash frequency to obtain the screened image data.
10. A terminal, characterized in that the terminal comprises: an acquisition unit, a generation unit, a screening unit, a processing unit,
the acquisition unit is used for acquiring first infrared information, second infrared information and visible light components through the color temperature sensor;
the generating unit is used for generating a first infrared characteristic value and a second infrared characteristic value according to the first infrared information, the second infrared information and the visible light component;
the screening unit is used for screening original image data corresponding to the image to be processed according to the first infrared characteristic value and the second infrared characteristic value to obtain screened image data;
and the processing unit is used for carrying out image processing on the image to be processed by utilizing the screened image data.
11. A terminal, characterized in that the terminal comprises: a processor, a memory storing processor-executable instructions that, when executed by the processor, implement the method of any of claims 1-9.
12. A computer-readable storage medium, on which a program is stored, for use in a terminal, characterized in that the program, when executed by a processor, implements the method according to any one of claims 1-9.
CN201911379772.XA 2019-12-27 2019-12-27 Image processing method, terminal and storage medium Active CN113055665B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911379772.XA CN113055665B (en) 2019-12-27 2019-12-27 Image processing method, terminal and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911379772.XA CN113055665B (en) 2019-12-27 2019-12-27 Image processing method, terminal and storage medium

Publications (2)

Publication Number Publication Date
CN113055665A true CN113055665A (en) 2021-06-29
CN113055665B CN113055665B (en) 2023-04-07

Family

ID=76506687

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911379772.XA Active CN113055665B (en) 2019-12-27 2019-12-27 Image processing method, terminal and storage medium

Country Status (1)

Country Link
CN (1) CN113055665B (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101779109A (en) * 2007-07-25 2010-07-14 NXP B.V. Indoor/outdoor detection
CN103493212A (en) * 2011-03-29 2014-01-01 Osram Opto Semiconductors GmbH Unit for determining the type of a dominating light source by means of two photodiodes
CN105430367A (en) * 2015-12-30 2016-03-23 浙江宇视科技有限公司 Automatic white balance method and device
CN106454305A (en) * 2016-11-14 2017-02-22 浙江宇视科技有限公司 White balance correcting method and device
CN106993175A (en) * 2016-01-20 2017-07-28 Realtek Semiconductor Corp. Method for generating a pixel screening range used for realizing automatic white balance computation
CN107690065A (en) * 2017-07-31 2018-02-13 努比亚技术有限公司 A kind of white balance correcting, device and computer-readable recording medium
CN108027278A (en) * 2015-08-26 2018-05-11 株式会社普瑞密斯 Lighting detecting device and its method


Also Published As

Publication number Publication date
CN113055665B (en) 2023-04-07

Similar Documents

Publication Publication Date Title
CN113452980B (en) Image processing method, terminal and storage medium
CN108111772B (en) Shooting method and terminal
JP4977707B2 (en) Image processing apparatus with auto white balance
US9245332B2 (en) Method and apparatus for image production
US20030035156A1 (en) System and method for efficiently performing a white balance operation
US9460521B2 (en) Digital image analysis
EP3657785B1 (en) Image white balance processing method and apparatus, and terminal device
US20140265882A1 (en) System and method for controlling lighting
US20020131635A1 (en) System and method for effectively performing a white balance operation
WO2021105398A1 (en) Ambient light source classification
WO2021115419A1 (en) Image processing method, terminal, and storage medium
CN104243793B (en) Have the camera device and method of image identification mechanism
US20190052855A1 (en) System and method for detecting light sources in a multi-illuminated environment using a composite rgb-ir sensor
CN110830794B (en) Light source detection method, terminal and storage medium
US20200228770A1 (en) Lens rolloff assisted auto white balance
JP7144678B2 (en) Image processing device, image processing method, and image processing program
US11457189B2 (en) Device for and method of correcting white balance of image
CN113055665B (en) Image processing method, terminal and storage medium
KR20200145670A (en) Device and method for correcting white balance of image
US10602112B2 (en) Image processing apparatus
US20200228769A1 (en) Lens rolloff assisted auto white balance
KR20000059451A (en) Method of raw color adjustment and atmosphere color auto extract in a image reference system
CN105321153A (en) Video monitor low-illumination image color restoration method and device
WO2022032666A1 (en) Image processing method and related apparatus
JP2008109604A (en) Digital image capture apparatus and white balance adjustment method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant