WO2024027287A1 - 图像处理系统及方法、计算机可读介质和电子设备 - Google Patents

图像处理系统及方法、计算机可读介质和电子设备 Download PDF

Info

Publication number
WO2024027287A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
target
color data
original image
color
Prior art date
Application number
PCT/CN2023/095718
Other languages
English (en)
French (fr)
Other versions
WO2024027287A9 (zh)
Inventor
张科武
Original Assignee
Oppo广东移动通信有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Oppo广东移动通信有限公司 filed Critical Oppo广东移动通信有限公司
Publication of WO2024027287A1 publication Critical patent/WO2024027287A1/zh
Publication of WO2024027287A9 publication Critical patent/WO2024027287A9/zh

Links

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/40Extracting pixel data from image sensors by controlling scanning circuits, e.g. by modifying the number of pixels sampled or to be sampled

Definitions

  • the present disclosure relates to the technical field of image processing, and specifically relates to an image processing system, an image processing method, a computer-readable medium and an electronic device.
  • A Bayer sensor is a sensor that can spatially separate the red, green, and blue components of an optical signal and measure each component separately.
  • Most current camera solutions use Bayer sensors, that is, image sensors whose filter patterns are combinations of red, green, and blue (RGB) color channels, such as RGBG-based image sensors.
  • Because the color filters used by the camera are limited to the red, green, and blue spectral ranges, the spectral perception information in complex light-source scenes is limited and the color information of local areas cannot be obtained, so the color reproduction accuracy of the final output image is poor.
  • the purpose of the present disclosure is to provide an image processing system, an image processing method, a computer-readable medium, and an electronic device.
  • an image processing system including:
  • a spectral sensing device, used to collect target spectral color data;
  • a camera module, including an image sensor and an image signal processor, wherein:
  • the image sensor is used to generate original images
  • the image signal processor is electrically connected to the image sensor and the spectrum sensing device, and is used to restore the color of the original image according to the target spectrum color data to obtain a target image.
  • an image processing method including:
  • Color restoration is performed on the original image according to the target spectral color data to obtain the target image.
  • a computer-readable medium is provided, a computer program is stored thereon, and when the computer program is executed by a processor, the above method is implemented.
  • an electronic device, including:
  • a housing;
  • the main circuit board, arranged in the housing;
  • the image processing system provided in the first aspect is disposed in the housing and electrically connected to the main circuit board.
  • Figure 1 shows a schematic system architecture diagram of an image signal processing flow in a related technical solution
  • Figure 2 schematically shows an architectural diagram of an image processing system in an exemplary embodiment of the present disclosure
  • Figure 3 schematically shows a schematic diagram of the arrangement of a spectrum sensing device and electronic equipment in an exemplary embodiment of the present disclosure
  • Figure 4 schematically shows a principle diagram of a spectral sensor array collecting spectral color data in an exemplary embodiment of the present disclosure
  • Figure 5 schematically shows an architectural diagram of another image processing system in an exemplary embodiment of the present disclosure
  • Figure 6 schematically shows a principle diagram of an image area segmentation module in an exemplary embodiment of the present disclosure
  • Figure 7 schematically shows an architectural diagram of yet another image processing system in an exemplary embodiment of the present disclosure
  • FIG. 8 schematically shows a structural diagram of an RGB domain processing module in an exemplary embodiment of the present disclosure
  • Figure 9 schematically shows a structural diagram of an image signal processing pipeline in an exemplary embodiment of the present disclosure.
  • Figure 10 schematically illustrates the difference between a Raw RGB domain image and a Full RGB domain image in an exemplary embodiment of the present disclosure
  • Figure 11 schematically shows the structural diagram of a Raw RGB domain processing module in an exemplary embodiment of the present disclosure
  • Figure 12 schematically shows the structural diagram of a Full RGB domain processing module in an exemplary embodiment of the present disclosure
  • Figure 13 schematically shows a structural diagram of a YUV domain processing module in an exemplary embodiment of the present disclosure
  • Figure 14 schematically shows a flow chart of an image processing method in an exemplary embodiment of the present disclosure
  • Figure 15 shows a schematic structural diagram of an electronic device that can be applied to embodiments of the present disclosure
  • FIG. 16 shows a schematic structural diagram of another electronic device that can be applied to embodiments of the present disclosure.
  • Example embodiments will now be described more fully with reference to the accompanying drawings.
  • Example embodiments may, however, be embodied in various forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete and will fully convey the concepts of the example embodiments to those skilled in the art.
  • the described features, structures or characteristics may be combined in any suitable manner in one or more embodiments.
  • Figure 1 shows a schematic system architecture diagram of an image signal processing flow in a related technical solution.
  • a system architecture of an image signal processing flow may include an image sensor 110, an RGB domain processing module 120, a YUV domain processing module 130 and a JPG output module 140, wherein:
  • the image sensor 110 refers to a sensor that collects optical signals based on a Bayer filter pattern template and converts the optical signals into electrical signals.
  • the image sensor 110 can be a complementary metal oxide semiconductor (Complementary Metal Oxide Semiconductor, CMOS). It can also be a charge-coupled device (CCD).
  • the RGB domain processing module 120 refers to the module corresponding to all image signal processing performed in the RGB domain on the original image collected by the image sensor 110.
  • the YUV domain processing module 130 refers to a module that converts the output result of the RGB domain processing module 120 into the YUV color space (YUV domain) and performs all processes of image signal processing in the YUV color space.
  • the JPG output module 140 refers to the processing module that maps the processing results of the RGB domain processing module 120 and the YUV domain processing module 130 onto the original image, and converts the processed original image into a picture in a format that can finally be shown on the display unit, such as JPG.
  • However, because the image sensor 110 in this solution is a Bayer-filter sensor whose filter patterns are combinations of RGB color channels, the color filtering it uses is limited to the red, green, and blue spectral ranges, so spectral perception in complex light-source scenes is limited and the color information of local areas cannot be obtained, resulting in poor color accuracy of the final output image and an inability to faithfully restore the colors of the real scene.
  • the image processing system can be installed in an electronic device with certain computing capabilities.
  • the electronic device can be a desktop computer, a portable computer, a smartphone, a smart robot, a wearable device, a tablet computer, etc.; this example embodiment does not specifically limit this.
  • FIG. 2 schematically shows an architectural diagram of an image processing system in an exemplary embodiment of the present disclosure.
  • the image processing system 200 may include:
  • the spectral sensing device 210 can be used to collect target spectral color data
  • the camera module 220 may include an image sensor 221 and an image signal processor 222, wherein:
  • Image sensor 221 may be used to generate raw images
  • the image signal processor 222 can be electrically connected to the image sensor 221 and the spectrum sensing device 210, and is used to restore the color of the original image output by the image sensor 221 according to the target spectral color data collected by the spectrum sensing device 210 to obtain the target image.
  • the spectral sensing device 210 refers to a sensing unit used to sense the real color in the current scene when the original image is captured.
  • the spectral sensing device 210 can be a multispectral sensor, and the multispectral sensor can include imaging optical elements and spectrum-splitting optical elements, so that it can accurately identify all spectral color information in the current scene; of course, the spectrum sensing device 210 can also be a CCD direct-reading spectrometer. This exemplary embodiment does not specifically limit the type of the spectrum sensing device.
  • Figure 3 schematically shows a schematic diagram of the arrangement of a spectrum sensing device and electronic equipment in an exemplary embodiment of the present disclosure.
  • the multispectral sensor 310 can be integrated into the camera module 220, or can be installed in the electronic device independently of the camera module 220, for example, on the housing 320 of the electronic device within a distance threshold of the camera module 220.
  • the spectrum sensing device 210 can also be used as an external device in communication with the electronic device containing the camera module 220; for example, an external spectrum sensing device 330 can communicate with the camera module 220 in a wired or wireless manner 340. This exemplary embodiment does not impose any special limitation on the connection method of the spectrum sensing device 210.
  • the target spectral color data refers to the spectral color data selected to act on the original image.
  • the target spectral color data can be all of the spectral color data collected by the spectral sensing device 210, or the spectral color data, among all of the data collected by the spectral sensing device 210, that corresponds to the foreground area of the original image; it can also be spectral color data selected in other ways to act on the original image. This example embodiment does not particularly limit how the target spectral color data is selected.
  • the image sensor 221 refers to a sensor that collects optical signals and converts the optical signals into electrical signals.
  • the image sensor 221 can be a complementary metal oxide semiconductor CMOS or a charge-coupled detection element CCD.
  • the sensor type of the image sensor 221 is not particularly limited.
  • the image sensor 221 may be an image sensor based on a Bayer filter pattern, such as an image sensor based on an RGBG or RGBW filter pattern; of course, it may also be an image sensor based on a non-Bayer filter pattern, such as an image sensor based on an RYYB filter pattern. This embodiment is not limited to these.
  • the original image refers to the image data obtained by the image sensor collecting optical signals and converting them.
  • For example, if the original image is collected by an image sensor based on an RGBG filter pattern, the original image can be an RGBG image; if the original image is collected by an image sensor based on an RYYB filter pattern, the original image can be an RYYB image. This example embodiment is not limited to these.
  • Image signal processor refers to a processor that processes the original image collected by the image sensor into image data that can be output to the display.
  • the image signal processor can be an ISP (Image Signal Processor), an application processor (Application Processor, AP), a graphics processing unit (Graphics Processing Unit, GPU), etc.; this example embodiment does not limit this.
  • the following description takes the image signal processor as the ISP processor.
  • the spectrum sensing device may include multiple spectrum sensors, and the multiple spectrum sensors may be arranged in an M*N array to form a spectrum sensor array, where M and N are both positive integers greater than 1.
  • FIG. 4 schematically shows a principle diagram of a spectral sensor array collecting spectral color data in an exemplary embodiment of the present disclosure.
  • the spectrum sensing device 210 may include multiple spectrum sensors 410.
  • the spectrum sensors 410 are spatially arranged according to a preset number of columns and rows to form a spectrum sensor array.
  • Specifically, the multiple spectrum sensors 410 may be arranged in an M*N array to form the spectrum sensor array 420.
  • For example, if M is preset to 10 and N is preset to 10, the spectrum sensing device 210 can include 100 spectrum sensors 410, which are spatially arranged into a 10*10 spectrum sensor array.
  • the specific M and N can be customized according to actual application scenarios, and this example embodiment does not impose special limitations on this.
  • the spectrum sensing device 210 can be used to collect, through the spectrum sensor array 420, spectral color data 440 having M*N detection areas 430, and to use the spectral color data 440 of the M*N local detection areas as the target spectral color data.
  • When the original image is captured, the spectral sensing device 210, which spatially has the M*N spectrum sensor array 420 structure, collects the real color information in the current scene to obtain spectral color data 440 having M*N detection areas 430.
  • The spectral color data in each detection area 430 captures the spectral color data of the corresponding real-scene area. Compared with an image sensor based on RGB channels, this obtains richer color information from the real scene and perceives a wider range of spectral information, which effectively supplements the color information of the image sensor and improves the color reproduction accuracy of the output target image; moreover, through the spectral color data in each detection area, local-area color enhancement of the target image can be effectively achieved, ensuring the color expressiveness of the target image.
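  • As a minimal illustration of this mapping (not part of the disclosure), the following Python/NumPy sketch associates every pixel of an H*W image with the reading of the M*N detection area that covers it; the assumption that the spectral sensor array and the image sensor share the same field of view, and all names, are illustrative only.

```python
import numpy as np

def region_index_map(height, width, m, n):
    """Return broadcastable row/column indices of the M*N detection area
    covering each pixel, assuming the spectral sensor array and the image
    sensor share the same field of view (an illustrative assumption)."""
    rows = np.minimum((np.arange(height) * m) // height, m - 1)
    cols = np.minimum((np.arange(width) * n) // width, n - 1)
    return rows[:, None], cols[None, :]

def per_pixel_spectral_reference(image, spectral_grid):
    """Expand an M*N grid of spectral color readings (e.g. per-area reference
    values, shape (M, N, C)) to a per-pixel reference map."""
    h, w = image.shape[:2]
    m, n, _ = spectral_grid.shape
    r, c = region_index_map(h, w, m, n)
    return spectral_grid[r, c]          # shape (H, W, C)

# Example: a 10*10 grid of 3-channel readings applied to a 480x640 image.
raw = np.random.rand(480, 640, 3).astype(np.float32)
grid = np.random.rand(10, 10, 3).astype(np.float32)
reference = per_pixel_spectral_reference(raw, grid)
print(reference.shape)                  # (480, 640, 3)
```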
  • the image processing system 200 may include an image area segmentation module 510.
  • the image area segmentation module 510 may be configured in a processor of the electronic device, such as a central processing unit (Central Processing Unit, CPU) or a neural-network processing unit (Neural-Network Processing Unit, NPU), or in the image signal processor of the camera module.
  • This example embodiment does not impose special limitations on this.
  • the image area segmentation module 510 can be used to: determine a region of interest according to the original image, determine a target detection area among the M*N detection areas through the region of interest, and use the spectral color data corresponding to the target detection area as the target spectral color data.
  • the area of interest refers to the area in the original image where specific image content is located.
  • For example, if the original image captures a human face, the region of interest corresponding to the original image can be the face area in the original image; if the original image captures a tree, the region of interest corresponding to the original image may be the image area corresponding to that tree.
  • This example embodiment does not impose any special restrictions on the selection or setting of the region of interest.
  • the target detection area refers to the overlapping area between the region of interest in the original image and the multiple detection areas corresponding to the spectrum sensing device; that is, the target detection area is obtained by taking the intersection of the region of interest in the original image and the multiple detection areas corresponding to the spectrum sensing device.
  • FIG. 6 schematically shows a principle diagram of an image area segmentation module in an exemplary embodiment of the present disclosure.
  • the image sensor 221 collects the original image 610 , and the original image 610 can be input into a central processing unit, an image signal processor, or other processors with computing capabilities to determine the area of interest 620 in the original image 610 .
  • For example, the original image 610 can be input into the central processor and the region of interest 620 determined through an edge detection algorithm, or the original image 610 can be input into an artificial intelligence processor and the region of interest 620 determined through a pre-trained deep-learning region-of-interest extraction model; this is not specifically limited in this example embodiment.
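  • As a rough sketch of the edge-detection route mentioned above (a trained segmentation model would replace it in practice), the following snippet proposes a single bounding box around strong gradients; the threshold and the single-box assumption are illustrative only.

```python
import numpy as np

def roi_bounding_box(gray, threshold=0.2):
    """Very rough region-of-interest proposal: find strong gradients and
    return the bounding box enclosing them. Purely illustrative; a real
    system would use a proper edge detector or a trained model."""
    gx = np.abs(np.diff(gray, axis=1, prepend=gray[:, :1]))
    gy = np.abs(np.diff(gray, axis=0, prepend=gray[:1, :]))
    ys, xs = np.nonzero((gx + gy) > threshold)
    if ys.size == 0:                     # no edges found: use whole image
        return 0, 0, gray.shape[0], gray.shape[1]
    return ys.min(), xs.min(), ys.max() + 1, xs.max() + 1  # y0, x0, y1, x1

# Example on a synthetic grayscale frame.
gray = np.random.rand(480, 640)
print(roi_bounding_box(gray))
```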
  • the region of interest 620 corresponding to the original image 610 and the spectral color data of the multiple detection areas 630 corresponding to the spectrum sensing device 210 can be input into the image area segmentation module 510, which determines the overlapping area 640 between the region of interest 620 and the spectral color data of the multiple detection areas 630, uses the overlapping area 640 as the target detection area, and uses the spectral color data corresponding to the target detection area as the target spectral color data.
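  • The overlap computation can be sketched as follows, again assuming the detection-area grid and the image cover the same field of view; the box format and helper name are illustrative assumptions, not the patent's implementation.

```python
import numpy as np

def target_detection_cells(roi_box, image_shape, m, n):
    """Return a boolean (M, N) mask of the detection areas that overlap the
    region of interest. roi_box is (y0, x0, y1, x1) in pixel coordinates."""
    h, w = image_shape
    y0, x0, y1, x1 = roi_box
    mask = np.zeros((m, n), dtype=bool)
    # Grid cell (i, j) spans rows [i*h/m, (i+1)*h/m) and cols [j*w/n, (j+1)*w/n).
    r0, r1 = int(y0 * m // h), int(np.ceil(y1 * m / h))
    c0, c1 = int(x0 * n // w), int(np.ceil(x1 * n / w))
    mask[r0:r1, c0:c1] = True
    return mask

# Example: an ROI near the image centre against a 10*10 grid.
mask = target_detection_cells((100, 200, 300, 440), (480, 640), 10, 10)
print(mask.sum(), "overlapping detection areas")
```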
  • the image processing system in this embodiment does not change the system architecture of the already mature image signal processing flow; that is, the image processing system in this embodiment combines the spectral color data of the multiple detection areas 630 corresponding to the spectrum sensing device 210 with the system architecture of the mature image signal processing flow.
  • FIG. 7 schematically shows an architectural diagram of yet another image processing system in an exemplary embodiment of the present disclosure.
  • the image signal processor 222 may include an RGB domain processing module 710 and a YUV domain processing module 720.
  • the RGB domain processing module 710 may include a Raw RGB domain processing unit 810 and a Full RGB domain processing unit 820.
  • It should be understood that the Raw RGB domain, the Full RGB domain and the YUV domain refer to the original image in different formats or at different stages of the image signal processing pipeline.
  • For example, consider a mature image signal processing pipeline 900 that converts the original image into the final JPG output image. Data processed by the Dead Pixel Concealment, Black Level Compensation, Lens Shading Correction, Anti-aliasing Noise Filter, AWB Gain Control, CFA Interpolation and similar stages is considered the original image corresponding to the Raw RGB domain 910; data processed by the Gamma Correction, Color Correction, Color Space Conversion and similar stages is considered the original image corresponding to the Full RGB domain 920; and data processed by the Noise Filter for Chroma, Hue Saturation Control, Noise Filter for Luma, Edge Enhancement, Contrast Brightness Control, Data Formatter and similar stages is considered the original image corresponding to the YUV domain 930.
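  • A minimal sketch of this stage ordering is shown below, with the stages grouped by the domain in which they operate; the stage functions themselves are placeholders (identity functions in the example), not implementations of the actual pipeline.

```python
import numpy as np

# Stage ordering of a conventional ISP pipeline as described above, grouped by
# domain. Only the grouping and ordering mirror the text; implementations are
# placeholders.
RAW_RGB_STAGES = [
    "dead_pixel_concealment", "black_level_compensation",
    "lens_shading_correction", "anti_aliasing_noise_filter",
    "awb_gain_control", "cfa_interpolation",
]
FULL_RGB_STAGES = ["gamma_correction", "color_correction", "color_space_conversion"]
YUV_STAGES = [
    "chroma_noise_filter", "hue_saturation_control", "luma_noise_filter",
    "edge_enhancement", "contrast_brightness_control", "data_formatter",
]

def run_pipeline(frame, stages, registry):
    """Apply the registered stage functions in order; `registry` maps a stage
    name to a callable that takes and returns an image array."""
    for name in stages:
        frame = registry[name](frame)
    return frame

# Identity placeholders stand in for the real stage implementations.
identity = {name: (lambda x: x)
            for name in RAW_RGB_STAGES + FULL_RGB_STAGES + YUV_STAGES}
out = run_pipeline(np.zeros((4, 4, 3)),
                   RAW_RGB_STAGES + FULL_RGB_STAGES + YUV_STAGES, identity)
```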
  • For ease of understanding, the Raw RGB domain original image 1000 can be collected by an image sensor based on a Bayer filter pattern.
  • The Raw RGB domain original image 1000 can include color filter arrays 1010 of different channels (an R color filter array, a G color filter array, and a B color filter array); the color filter arrays 1010 of the different channels can then be interpolated (for example, through a demosaic algorithm) to obtain global color arrays 1020 of the different channels (an R global color array, a G global color array, and a B global color array).
  • The Full RGB domain original image 1030 can then be obtained from the global color arrays 1020 of the different channels, and color space conversion can be further performed on the Full RGB domain original image 1030 to obtain the YUV domain original image.
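  • The two conversions described above can be sketched as follows, using a simple bilinear demosaic for an RGGB Bayer mosaic (via SciPy convolution) and the BT.601 full-range RGB-to-YUV matrix; production demosaicing and color space handling are considerably more sophisticated, so this is only an illustration.

```python
import numpy as np
from scipy.ndimage import convolve

def demosaic_bilinear(raw):
    """Bilinear demosaic of an RGGB Bayer mosaic (2D array) into full RGB."""
    h, w = raw.shape
    r_mask = np.zeros((h, w)); r_mask[0::2, 0::2] = 1
    b_mask = np.zeros((h, w)); b_mask[1::2, 1::2] = 1
    g_mask = 1 - r_mask - b_mask
    k_rb = np.array([[1, 2, 1], [2, 4, 2], [1, 2, 1]]) / 4.0
    k_g  = np.array([[0, 1, 0], [1, 4, 1], [0, 1, 0]]) / 4.0
    r = convolve(raw * r_mask, k_rb, mode="mirror")
    g = convolve(raw * g_mask, k_g,  mode="mirror")
    b = convolve(raw * b_mask, k_rb, mode="mirror")
    return np.stack([r, g, b], axis=-1)

def rgb_to_yuv(rgb):
    """BT.601 full-range RGB -> YUV conversion."""
    m = np.array([[ 0.299,  0.587,  0.114],
                  [-0.169, -0.331,  0.500],
                  [ 0.500, -0.419, -0.081]])
    return rgb @ m.T

# Example: mosaic -> Full RGB -> YUV.
bayer = np.random.rand(480, 640)
yuv = rgb_to_yuv(demosaic_bilinear(bayer))
```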
  • the image signal processor 222 may include an RGB domain processing module 710.
  • the RGB domain processing module 710 may include a Raw RGB domain processing unit 810.
  • the Raw RGB domain processing unit 810 can be used to:
  • divide the original image into multiple local image areas, where the local image areas can be arranged in a K*L array, K and L are both positive integers greater than 1, K is greater than or equal to M, and L is greater than or equal to N; and perform color restoration processing on the original image having K*L local image areas according to the target spectral color data to obtain the target image.
  • the Raw RGB domain processing unit 810 may include a white balance gain subunit 1110 and a first tone mapping subunit 1120, wherein the white balance gain subunit 1110 may be used to perform white balance processing on the original image having K*L local image areas according to the target spectral color data to determine a white balance gain matrix, and the first tone mapping subunit 1120 may be used to perform tone mapping processing on the original image having K*L local image areas according to the target spectral color data to determine a first tone mapping gain function.
  • the Raw RGB domain processing unit 810 can perform color restoration on the original image through the white balance gain matrix and the first tone mapping gain function to obtain the target image.
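  • A sketch of how a per-region white balance gain matrix and a tone mapping gain function might be applied under the K*L partitioning described above; the gain values and the Reinhard-style stand-in curve are assumptions for illustration, not the disclosed algorithm for deriving them from the target spectral color data.

```python
import numpy as np

def apply_regional_wb(image, wb_gains):
    """Apply a (K, L, 3) white balance gain matrix to an (H, W, 3) linear
    image: each K*L local area gets its own per-channel gain."""
    h, w, _ = image.shape
    k, l, _ = wb_gains.shape
    rows = np.minimum((np.arange(h) * k) // h, k - 1)
    cols = np.minimum((np.arange(w) * l) // w, l - 1)
    gains = wb_gains[rows[:, None], cols[None, :]]    # (H, W, 3)
    return image * gains

def apply_tone_mapping(image, gain_fn):
    """Apply a luminance-dependent tone mapping gain g(Y) so out = in * g(Y)."""
    y = image.mean(axis=-1, keepdims=True)
    return image * gain_fn(y)

# Illustrative stand-ins: gains derived elsewhere from the target spectral
# color data would replace these.
wb = np.ones((8, 8, 3)); wb[..., 2] = 1.3            # e.g. boost blue locally
reinhard_gain = lambda y: 1.0 / (1.0 + y)             # simple stand-in curve
out = apply_tone_mapping(apply_regional_wb(np.random.rand(480, 640, 3), wb),
                         reinhard_gain)
```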
  • the image signal processor 222 may include an RGB domain processing module 710.
  • the RGB domain processing module 710 may include a Full RGB domain processing unit 820.
  • the Full RGB domain processing unit 820 may be used for:
  • divide the original image into multiple local image areas, where the local image areas can be arranged in an X*Y array, X and Y are both positive integers greater than 1, X is greater than or equal to M, and Y is greater than or equal to N;
  • the original image with X*Y local image areas is color restored according to the target spectral color data to obtain the target image.
  • the Full RGB domain processing unit 820 may include a color correction sub-unit 1210, a second tone mapping sub-unit 1220 and a 2D/3D lookup table sub-unit 1230, where:
  • the color correction subunit 1210 may be used to perform gamma correction processing on the original image having X*Y local image areas according to the target spectral color data, and determine a gamma correction curve;
  • the second tone mapping subunit 1220 may be configured to perform tone mapping processing on the original image having X*Y local image areas according to the target spectral color data, and determine a second tone mapping gain function;
  • the 2D/3D lookup table subunit 1230 may be used to perform color mapping processing on the original image with X*Y local image areas according to the target spectral color data and the 2D/3D lookup table, and determine the corrected color data.
  • After the gamma correction curve, the second tone mapping gain function and the corrected color data are determined, the Full RGB domain processing unit 820 can perform color restoration on the original image through the gamma correction curve, the second tone mapping gain function and the corrected color data to obtain the target image.
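  • A sketch of applying a gamma correction curve and a 3D lookup table in the Full RGB domain; the power-law curve, the nearest-neighbour LUT lookup (real ISPs typically interpolate, e.g. trilinearly) and the identity LUT placeholder are illustrative assumptions.

```python
import numpy as np

def apply_gamma(image, gamma=1.0 / 2.2):
    """Apply a simple power-law gamma curve to a linear [0, 1] image."""
    return np.clip(image, 0.0, 1.0) ** gamma

def apply_3d_lut(image, lut):
    """Apply an (S, S, S, 3) color cube to an (H, W, 3) image using
    nearest-neighbour lookup."""
    s = lut.shape[0]
    idx = np.clip((image * (s - 1)).round().astype(int), 0, s - 1)
    return lut[idx[..., 0], idx[..., 1], idx[..., 2]]

# Identity LUT as a placeholder for one derived from the spectral data.
size = 17
grid = np.linspace(0.0, 1.0, size)
identity_lut = np.stack(np.meshgrid(grid, grid, grid, indexing="ij"), axis=-1)
img = np.random.rand(480, 640, 3)
out = apply_3d_lut(apply_gamma(img), identity_lut)
```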
  • the image signal processor 222 may include a YUV domain processing module 720, which may be used to:
  • divide the original image into multiple local image areas, where the local image areas can be arranged in an E*F array, E and F are both positive integers greater than 1, E is greater than or equal to M, and F is greater than or equal to N; and perform color restoration processing on the original image having E*F local image areas according to the target spectral color data to obtain the target image.
  • the YUV domain processing module 720 may include a third tone mapping subunit 1310 and a color adjustment enhancement subunit 1320, wherein:
  • the third tone mapping subunit 1310 may be configured to perform tone mapping processing on the original image having E*F local image areas according to the target spectral color data, and determine a third tone mapping gain function;
  • the color adjustment and enhancement subunit 1320 may be used to perform color adjustment and enhancement on the original image having E*F local image areas according to the target spectral color data and determine color adjustment and enhancement parameters, where the color adjustment and enhancement parameters may include contrast adjustment parameters, brightness adjustment parameters, saturation adjustment parameters, edge enhancement parameters, etc.; these are not specifically limited in this exemplary embodiment.
  • the YUV domain processing module 720 may perform color restoration on the original image according to the third tone mapping gain function and the color adjustment enhancement parameters to obtain the target image.
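  • A sketch of simple YUV-domain adjustments of the kind listed above (contrast, brightness, saturation); the formulas are generic stand-ins rather than the disclosed enhancement parameters, and a per-region E*F application would reuse the grid lookup from the earlier sketches.

```python
import numpy as np

def adjust_yuv(yuv, contrast=1.0, brightness=0.0, saturation=1.0):
    """Simple YUV-domain adjustment: remap luma (Y in [0, 1]) for contrast and
    brightness, and scale chroma (U, V in [-0.5, 0.5]) around zero for
    saturation."""
    out = yuv.astype(np.float32)
    out[..., 0] = (out[..., 0] - 0.5) * contrast + 0.5 + brightness
    out[..., 1:] *= saturation
    return np.clip(out, [0.0, -0.5, -0.5], [1.0, 0.5, 0.5])

# Example on a random YUV frame.
yuv = np.dstack([np.random.rand(480, 640),
                 np.random.rand(480, 640) - 0.5,
                 np.random.rand(480, 640) - 0.5])
enhanced = adjust_yuv(yuv, contrast=1.1, saturation=1.2)
```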
  • the K*L array, X*Y array, and E*F array in the embodiment of the present disclosure are grids divided when processing data in the Raw RGB domain, processing data in the Full RGB domain, and processing data in the YUV domain, respectively.
  • the sizes of the K*L array, X*Y array, and E*F array may be consistent or inconsistent, and can be customized according to actual application conditions. This example embodiment does not impose special limitations on this.
  • the image signal processor can also be used to perform temporal filtering, based on an infinite impulse response (IIR) filter, on the color-restored original images at different times to obtain the target image.
  • By performing temporal filtering on the color-restored original images at different times, the problem of region-edge or color discontinuity between consecutive frames caused by local processing of the image can be reduced, improving the continuity of the image appearance when shooting video or previewing in real time and improving the user experience.
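  • The temporal filtering can be sketched as a first-order IIR filter over consecutive color-restored frames; the smoothing factor below is an illustrative choice, not a disclosed parameter.

```python
import numpy as np

class TemporalIIRFilter:
    """First-order IIR temporal filter: y[t] = a * x[t] + (1 - a) * y[t-1].
    Smooths the color-restored frames over time to suppress region-edge or
    color discontinuities between consecutive frames."""
    def __init__(self, alpha=0.7):
        self.alpha = alpha
        self.state = None

    def __call__(self, frame):
        if self.state is None:
            self.state = frame.astype(np.float32)
        else:
            self.state = self.alpha * frame + (1.0 - self.alpha) * self.state
        return self.state

# Usage: feed color-restored frames in capture order.
filt = TemporalIIRFilter(alpha=0.7)
smoothed = [filt(f) for f in (np.random.rand(480, 640, 3) for _ in range(3))]
```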
  • this exemplary embodiment may include a spectrum sensing device for collecting target spectral color data, and a camera module.
  • the camera module may include an image sensor and an image signal processor, where the image sensor can be used to generate the original image, and the image signal processor can be electrically connected to the image sensor and the spectrum sensing device and used to perform color restoration on the original image according to the target spectral color data to obtain the target image.
  • On the one hand, the target spectral color data obtained through the spectrum sensing device captures richer color information from the current scene than the original image, which the image sensor generates with fewer color channels; performing color restoration on the original image based on the target spectral color data can effectively improve the accuracy of the target image's color expression and enhance its color expressiveness.
  • On the other hand, the target spectral color data is collected through an independent spectral sensing device and combined, inside the image signal processor, with the system architecture of the already mature image signal processing flow to perform color restoration; there is no need to adjust the system structure of the mature image signal processing flow, which effectively improves color restoration efficiency and reduces optimization costs.
  • Figure 14 shows a schematic flowchart of an image processing method in this exemplary embodiment, including the following steps S1410 to S1430 (a minimal sketch follows the steps):
  • Step S1410 collect target spectral color data
  • Step S1420 obtain the original image
  • Step S1430 Color restore the original image according to the target spectral color data to obtain the target image.
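  • A minimal end-to-end sketch of steps S1410 to S1430, with hypothetical callables standing in for the spectral sensing device, the image sensor and the color restoration pipeline; none of these names come from the disclosure.

```python
import numpy as np

def process_frame(spectral_device, image_sensor, restore_fn):
    """End-to-end flow of steps S1410-S1430 with hypothetical callables:
    collect the target spectral color data, acquire the original image,
    and color-restore the original image to obtain the target image."""
    target_spectral = spectral_device()                 # S1410
    original = image_sensor()                           # S1420
    return restore_fn(original, target_spectral)        # S1430

# Illustrative stand-ins for the real devices and restoration pipeline.
target = process_frame(lambda: np.random.rand(10, 10, 3),
                       lambda: np.random.rand(480, 640, 3),
                       lambda img, spec: np.clip(img, 0, 1))
```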
  • In an example embodiment, the target spectral color data may include multiple detection areas, the original image may include multiple image areas, and the detection areas may correspond to the image areas.
  • the electronic device 1500 may include: a housing 1510 , a main circuit board 1520 , and an image processing system 1530 .
  • the main circuit board 1520 is located in the housing 1510.
  • Image processing system 1530 may be an image processing system as in embodiments of the present disclosure.
  • the image processing system 1530 may be electrically connected to the main circuit board 1520 to implement signal or data transmission between the image processing system 1530 and the main circuit board 1520 .
  • the image processing system 1530 can be disposed on the main circuit board 1520, or connected to the main circuit board 1520 through a medium such as a flexible circuit board to form an electrical connection to power the image processing system 1530 or transmit signals.
  • the following takes the mobile terminal 1600 in FIG. 16 as an example to illustrate the structure of the electronic device. It will be appreciated by those skilled in the art that, in addition to components specifically intended for mobile purposes, the configuration in Figure 16 can also be applied to stationary type equipment.
  • the mobile terminal 1600 may specifically include: a processor 1601, a memory 1602, a bus 1603, a mobile communication module 1604, an antenna 1, a wireless communication module 1605, an antenna 2, a display screen 1606, a camera module 1607, an audio module 1608, a power module 1609 and a sensor module 1610.
  • the processor 1601 may include one or more processing units.
  • the processor 1601 may include an application processor AP, a modem processor, a graphics processor GPU, an ISP processor, a controller, an encoder, a decoder, a DSP (Digital Signal Processor), a baseband processor and/or a neural network processor NPU, etc.
  • An encoder can encode (i.e. compress) an image or video to reduce the data size so it can be stored or sent.
  • the decoder can decode (ie decompress) the encoded data of the image or video to restore the image or video data.
  • the mobile terminal 1600 can support one or more encoders and decoders, for example, image formats such as JPEG (Joint Photographic Experts Group), PNG (Portable Network Graphics) and BMP (Bitmap), and video formats such as MPEG (Moving Picture Experts Group) 1, MPEG2, H.263, H.264 and HEVC (High Efficiency Video Coding).
  • Processor 1601 may be connected to memory 1602 or other components via bus 1603.
  • Memory 1602 may be used to store computer executable program code, which includes instructions.
  • the processor 1601 executes various functional applications and data processing of the mobile terminal 1600 by executing instructions stored in the memory 1602 .
  • the memory 1602 can also store application data, such as images, videos and other files.
  • the communication function of the mobile terminal 1600 can be implemented through the mobile communication module 1604, antenna 1, wireless communication module 1605, antenna 2, modem processor, baseband processor, etc.
  • Antenna 1 and Antenna 2 are used to transmit and receive electromagnetic wave signals.
  • the mobile communication module 1604 can provide 3G, 4G, 5G and other mobile communication solutions applied to the mobile terminal 1600.
  • the wireless communication module 1605 can provide wireless communication solutions such as wireless LAN, Bluetooth, and near field communication applied on the mobile terminal 1600.
  • the display screen 1606 is used to implement display functions, such as displaying user interfaces, images, videos, etc.
  • the camera module 1607 is used to implement shooting functions, such as shooting images, videos, etc.
  • the audio module 1608 is used to implement audio functions, such as playing audio, collecting voice, etc.
  • the power module 1609 is used to implement power management functions, such as charging the battery, powering the device, monitoring battery status, etc.
  • the sensor module 1610 may include one or more sensors for implementing corresponding sensing detection functions.
  • the sensor module 1610 may include an inertial sensor, which is used to detect the motion posture of the mobile terminal 1600 and output inertial sensing data.
  • Exemplary embodiments of the present disclosure also provide a computer-readable storage medium on which a program product capable of implementing the method described above in this specification is stored.
  • various aspects of the present disclosure can also be implemented in the form of a program product, which includes program code.
  • When the program product runs on a terminal device, the program code is used to cause the terminal device to execute the steps, according to the various exemplary embodiments of the present disclosure, described in the "Exemplary Method" section above in this specification.
  • the computer-readable medium shown in the present disclosure may be a computer-readable signal medium or a computer-readable storage medium, or any combination of the above two.
  • the computer-readable storage medium may be, for example, but is not limited to, an electrical, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus or device, or any combination thereof. More specific examples of computer-readable storage media may include, but are not limited to: an electrical connection having one or more wires, a portable computer disk, a hard drive, random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), optical fiber, portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the above.
  • a computer-readable storage medium may be any tangible medium that contains or stores a program for use by or in connection with an instruction execution system, apparatus, or device.
  • a computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, carrying computer-readable program code therein. Such propagated data signals may take many forms, including but not limited to electromagnetic signals, optical signals, or any suitable combination of the above.
  • a computer-readable signal medium may also be any computer-readable medium other than a computer-readable storage medium that can send, propagate, or transmit a program for use by or in connection with an instruction execution system, apparatus, or device.
  • Program code embodied on a computer-readable medium may be transmitted using any suitable medium, including but not limited to: wireless, wire, optical cable, RF, etc., or any suitable combination of the foregoing.
  • program code for performing operations of the present disclosure may be written in any combination of one or more programming languages, including object-oriented programming languages such as Java and C++, as well as conventional procedural programming languages such as "C" or similar programming languages.
  • the program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server.
  • the remote computing device may be connected to the user's computing device through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computing device (for example, through the Internet using an Internet service provider).

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Image Processing (AREA)

Abstract

An image processing system and method, a computer-readable medium and an electronic device, relating to the technical field of image processing. The system includes: a spectral sensing device, used to collect target spectral color data; and a camera module, including an image sensor and an image signal processor, wherein the image sensor is used to generate an original image, and the image signal processor is electrically connected to the image sensor and the spectral sensing device and is used to perform color restoration on the original image according to the target spectral color data to obtain a target image. By introducing an independent spectral sensing device to collect the spectral color data of the current scene, richer color information is captured than with an ordinary Bayer-filter image sensor, and color restoration and enhancement are performed on the original image using this spectral color data, improving the color accuracy of the target image and enhancing the color expressiveness of the image.

Description

图像处理系统及方法、计算机可读介质和电子设备
交叉引用
本公开要求于2022年08月03日提交的申请号为202210927465.6名称均为“图像处理系统及方法、计算机可读介质和电子设备”的中国专利申请的优先权,该中国专利申请的全部内容通过引用全部并入本文。
技术领域
本公开涉及图像处理技术领域,具体涉及一种图像处理系统、图像处理方法、计算机可读介质和电子设备。
背景技术
伴随着人们生活水平的不断提高,对于拍摄的图像质量的要求也越来越高。拜耳传感器(Bayer sensor)是一种可以将红,绿,蓝三种成份光信号在空间上区分开来单独统计光信号的传感器。
目前大多数相机方案采用的是拜耳传感器以及基于红绿蓝(RGB)颜色通道的组合所获得不同滤光样式组合的图像传感器,例如,基于RGBG的图像传感器。但是,这种方案中,由于相机所采用的色彩滤波仅限于红,绿,蓝三种光谱范围,复杂光源场景下的光谱感知信息有限,并且无法得到局部区域的色彩信息,因此最终输出图像的色彩还原的准确性较差。
发明内容
本公开的目的在于提供一种图像处理系统、图像处理方法、计算机可读介质和电子设备。
根据本公开的第一方面,提供一种图像处理系统,包括:
光谱感知装置,用于采集目标光谱色彩数据;
摄像模组,包括图像传感器和图像信号处理器,其中:
所述图像传感器,用于生成原始图像;
所述图像信号处理器,与所述图像传感器和所述光谱感知装置电连接,用于根据所述目标光谱色彩数据对所述原始图像进行色彩还原,得到目标图像。
根据本公开的第二方面,提供一种图像处理方法,包括:
采集目标光谱色彩数据;
获取原始图像;
根据目标光谱色彩数据对所述原始图像进行色彩还原,得到目标图像。
根据本公开的第三方面,提供一种计算机可读介质,其上存储有计算机程序,计算机程序被处理器执行时实现上述的方法。
根据本公开的第四方面,提供一种电子设备,其特征在于,包括:
壳体;
主电路板,设置在所述壳体内;
如第一方面所提供的图像处理系统,设置在所述壳体内,与所述主电路板电连接。
应当理解的是,以上的一般描述和后文的细节描述仅是示例性和解释性的,并不能限制本公开。
附图说明
此处的附图被并入说明书中并构成本说明书的一部分,示出了符合本公开的实施例,并与说明书一起用于解释本公开的原理。显而易见地,下面描述中的附图仅仅是本公开的一些实施例,对于本领域普通技术人员来讲,在不付出创造性劳动的前提下,还可以根据这些附图获得其他的附图。在附图中:
图1示出了相关技术方案中的一种图像信号处理流程的系统架构示意图;
图2示意性示出本公开示例性实施例中一种图像处理系统的架构示意图;
图3示意性示出本公开示例性实施例中一种光谱感知装置与电子设备设置方式的示意图;
图4示意性示出本公开示例性实施例中一种光谱传感器阵列采集光谱色彩数据的原理示意图;
图5示意性示出本公开示例性实施例中另一种图像处理系统的架构示意图;
图6示意性示出本公开示例性实施例中一种图像区域分割模块的原理示意图;
图7示意性示出本公开示例性实施例中再一种图像处理系统的架构示意图;
图8示意性示出本公开示例性实施例中一种RGB域处理模块的结构示意图;
图9示意性示出本公开示例性实施例中一种图像信号处理流水线的结构示意图;
图10示意性示出本公开示例性实施例中一种Raw RGB域图像与Full RGB域的区别的示意图;
图11示意性示出本公开示例性实施例中一种Raw RGB域处理模块的结构示意图;
图12示意性示出本公开示例性实施例中一种Full RGB域处理模块的结构示意图;
图13示意性示出本公开示例性实施例中一种YUV域处理模块的结构示意图;
图14示意性示出本公开示例性实施例中一种图像处理方法的流程示意图;
图15示出了可以应用于本公开实施例的一种电子设备的结构示意图;
图16示出了可以应用于本公开实施例的另一种电子设备的结构示意图。
具体实施方式
现在将参考附图更全面地描述示例实施方式。然而,示例实施方式能够以多种形式实施,且不应被理解为限于在此阐述的范例;相反,提供这些实施方式使得本公开将更加全面和完整,并将示例实施方式的构思全面地传达给本领域的技术人员。所描述的特征、结构或特性可以以任何合适的方式结合在一个或更多实施方式中。
此外,附图仅为本公开的示意性图解,并非一定是按比例绘制。图中相同的附图标记表示相同或类似的部分,因而将省略对它们的重复描述。附图中所示的一些方框图是功能实体,不一定必须与物理或逻辑上独立的实体相对应。可以采用软件形式来实现这些功能实体,或在一个或多个硬件模块或集成电路中实现这些功能实体,或在不同网络和/或处理器装置和/或微控制器装置中实现这些功能实体。
图1示出了相关技术方案中的一种图像信号处理流程的系统架构示意图。
参考图1所示,一种图像信号处理流程的系统架构可以包括图像传感器110、RGB域处理模块120、YUV域处理模块130以及JPG输出模块140。其中:
图像传感器110是指基于拜耳滤光样式的模板实现光信号采集,并将光信号转换为电信号的传感器,例如,图像传感器110可以是互补性金属氧化物半导体(Complementary Metal Oxide Semiconductor,CMOS),也可以是电荷耦合探测元件(Charge-coupled Device,CCD)。
RGB域处理模块120是指对于图像传感器110采集的原始图像在RGB域上进行 图像信号处理的所有流程对应的模块。
YUV域处理模块130是指将RGB域处理模块120的输出结果转换到YUV颜色空间(YUV域),并在YUV颜色空间上进行图像信号处理的所有流程对应的模块。
JPG输出模块140是指将RGB域处理模块120的处理结果以及YUV域处理模块130的处理结果,映射到原始图像中,并将处理后的原始图像转换为最终可以显示到显示单元如JPG格式图片的处理模块。
但是,由于该方案中图像传感器110采用的是基于拜耳滤镜以及基于RGB颜色通道的组合所获得不同滤光样式组合的传感器,基于RGB三通道的传感器所采用的色彩滤波仅限于红,绿,蓝(R,G,B)三种光谱范围,复杂光源场景下的光谱感知信息有限;同时对于系统结构中需要用到当前色温,以及反应当前场景颜色信息,而基于全局RGB的方式存在很多局限性,无法获得局部区域的色彩信息,导致最终输出的图像的色彩的准确性较差,无法较好还原真实场景中的色彩。
另外,一些技术方案中,虽然有采用到基于红绿蓝青品红黄(RGBCMY)以及红黄蓝(RYYB)等非常规颜色通道的传感器,用于弥补当前图像传感器仅能够采集RGB三通道色彩的缺陷,但是,由于红绿蓝青品红黄(RGBCMY)以及红黄蓝(RYYB)等非常规颜色通道的传感器的应用较少,需要在图像信号处理流程(ISP)的系统架构以及单点算法层面上需要针对该类型的色彩滤波模板做进一步的支持,工作量较大,应用成本较高。
基于相关方案中存在的一个或者多个问题,本公开首先提供了一种图像处理系统,图像处理系统可以设置在具有一定计算能力的电子设备中,例如,电子设备可以是台式计算机、便携式计算机、智能手机、智能机器人、可穿戴设备和平板电脑等,本示例实施例对此不做特殊限定。
图2示意性示出本公开示例性实施例中一种图像处理系统的架构示意图。参考图2所示,图像处理系统200可以包括:
光谱感知装置210可以用于采集目标光谱色彩数据;
摄像模组220可以包括图像传感器221和图像信号处理器222,其中:
图像传感器221可以用于生成原始图像;
图像信号处理器222可以与图像传感器221和光谱感知装置210电连接,用于根据光谱感知装置210采集的目标光谱色彩数据对图像传感器221输出的原始图像进行色彩还原,得到目标图像。
其中,光谱感知装置210是指用于感知拍摄原始图像时的当前场景中真实色彩的感知单元,例如,光谱感知装置210可以是多光谱传感器(Multispectral sensor),多光谱传感器可以包括成像光学元件和划分光谱的光学元件,能够准确识别当前场景中所有的光谱色彩信息;当然,光谱感知装置210也可以是CCD直读光谱仪,本示例实施例对光谱感知装置的类型不做特殊限定。
图3示意性示出本公开示例性实施例中一种光谱感知装置与电子设备设置方式的示意图。
参考图3所示,以光谱感知装置210是多光谱传感器310为例,可以将多光谱传感器310集成到摄像模组220中,也可以独立于摄像模组220设置到电子设备中,例如,可以设置在电子设备的外壳320上,并且与摄像模组220的距离处于距离阈值之内;当然,光谱感知装置210也可以作为外接设备与摄像模组220对应的电子设备通信连接,例如,外接的光谱感知装置330可以通过有线或者无线的方式340与摄像模组220通信连接,本示例实施例对于光谱感知装置210的连接方式不做特殊限定。
继续参考图2所示,目标光谱色彩数据是指筛选的作用于原始图像的光谱色彩数据,例如,目标光谱色彩数据可以是光谱感知装置210采集的所有光谱色彩数据,也 可以是光谱感知装置210采集的所有光谱色彩数据中与原始图像的前景区域对应的光谱色彩数据,当然,还可以是以其他方式筛选的作用于原始图像的光谱色彩数据,本示例实施例对目标光谱色彩数据的筛选方式不做特殊限定。
图像传感器221是指实现光信号采集,并将光信号转换为电信号的传感器,例如,图像传感器221可以是互补性金属氧化物半导体CMOS,也可以是电荷耦合探测元件CCD,本示例实施例对图像传感器221的传感器类型不做任何特殊限定。可选的,图像传感器221可以是基于拜耳滤光样式的图像传感器,如基于RGBG滤光样式、RGBW滤光样式的图像传感器,当然,也可以是基于非拜耳滤光样式的图像传感器,如基于RYYB滤光样式的图像传感器,本实施例不以此为限。
原始图像是指通过图像传感器选择光信号并转换得到的图像数据,例如,若原始图像是基于RGBG滤光样式的图像传感器采集的图像数据,那么原始图像可以是RGBG图像;若原始图像是基于RYYB滤光样式的图像传感器采集的图像数据,那么原始图像可以是RYYB图像,本示例实施例不以此为限。
图像信号处理器是指将图像传感器采集的原始图像加工成可以输出到显示器展示的图像数据的处理器,例如,图像信号处理器可以是ISP(Image Signal Processor)处理器,也可以是应用处理器(Application Processor,AP)、GPU图形处理器(Graphics Processing Unit,GPU)等,本示例实施例对此不做限定,为便于理解,下面均以图像信号处理器为ISP处理器进行说明。
在一示例实施例中,光谱感知装置可以包括多个光谱传感器,多个光谱传感器可以按照M*N的阵列排布构成光谱传感器阵列,其中,M和N均为大于1的正整数。
图4示意性示出本公开示例性实施例中一种光谱传感器阵列采集光谱色彩数据的原理示意图。
参考图4所示,光谱感知装置210可以包括多个光谱传感器410,光谱传感器410在空间上按照预设列数和行数进行排列,排列得到光谱传感器阵列,具体的,多个光谱传感器410可以按照M*N的阵列排布构成光谱传感器阵列420,例如,可以预先设置M为10,可以设置N为10,那么光谱感知装置210可以包括100个光谱传感器410,对这100个光谱传感器410在空间上进行排列,得到10*10的光谱传感器阵列。当然,具体的M和N可以根据实际应用场景进行自定义设置,本示例实施例对此不做特殊限定。
具体的,光谱感知装置210可以用于,通过光谱传感器阵列420采集得到具有M*N个检测区域430的光谱色彩数据440,并将该M*N个局部检测区域的光谱色彩数据440作为目标光谱色彩数据。
继续参考图4所示,在拍摄原始图像时,在空间上具有M*N光谱传感器阵列420结构的光谱感知装置210,采集当前场景中的真实色彩信息,得到具有M*N个检测区域430的光谱色彩数据440,各检测区域430内的光谱色彩数据能够采集所对应的真实场景区域中的光谱色彩数据,相比于基于RGB通道的图像传感器,能够得到真实场景中更加丰富的色彩信息,感知更大范围的光谱信息,有效补充图像传感器的色彩信息,提升输出的目标图像的色彩还原的准确性,并且通过各检测区域中的光谱色彩数据,能够有效实现对目标图像的局部区域色彩增强,保证目标图像的色彩表现力。
在一示例实施例中,参考图5所示,图像处理系统200可以包括图像区域分割模块510,该图像区域分割模块510可以设置在电子设备的中央处理器(Central Processing Unit,CPU)、神经网络处理器(Neural-Network Processing Unit,NPU)等处理器中,也可以设置在摄像模组的图像信号处理器中,本示例实施例对此不做特殊限定。
具体的,图像区域分割模块510可以用于:根据原始图像确定感兴趣区域,通过感兴趣区域在M*N个检测区域中确定目标检测区域,并将目标检测区域对应的光谱 色彩数据作为目标光谱色彩数据。
其中,感兴趣区域是指原始图像中特定图像内容所在的区域,例如,假设原始图像采集的是人脸内容,那么原始图像对应的感兴趣区域可以是原始图像中的人脸区域;假设原始图像采集的是一颗树,那么原始图像对应的感兴趣区域可以是这棵树对应的图像区域,本示例实施例对于感兴趣区域的选取或者设定不做特殊限定。
目标检测区域是指原始图像中的感兴趣区域与光谱感知装置对应的多个检测区域的重叠区域,即对原始图像中的感兴趣区域与光谱感知装置对应的多个检测区域取交集,得到目标检测区域。
通过确定原始图像对应的感兴趣区域,并根据感兴趣区域筛选需要使用的光谱色彩数据对原始图像进行色彩还原,而不是将光谱感知装置采集到的所有光谱色彩数据均应用于色彩还原,可以有效降低色彩还原过程中的计算量,提升计算效率,并且筛选目标检测区域对应的光谱色彩数据对原始图像进行色彩还原,能够实现对原始图像局部感兴趣区域的色彩增强,提升图像中感兴趣区域的色彩表现力,使输出的目标图像中的感兴趣区域的图像内容更加鲜艳,提升用户观赏体验。
图6示意性示出本公开示例性实施例中一种图像区域分割模块的原理示意图。
参考图6所示,图像传感器221采集到原始图像610,可以将原始图像610输入到中央处理器、图像信号处理器或者其他具有计算能力的处理器中,确定原始图像610中的感兴趣区域620,例如,可以将原始图像610输入到中央处理器中,通过边缘检测算法确定感兴趣区域620,也可以将原始图像610输入到人工智能处理器中,通过预先训练的基于深度学习的感兴趣区域提取模型确定感兴趣区域620,本示例实施例对此不做特殊限定。
可以将原始图像610对应的感兴趣区域620,与光谱感知装置210对应的多个检测区域630的光谱色彩数据输入到图像区域分割模块510中,确定感兴趣区域620与多个检测区域630的光谱色彩数据的重叠区域640,并将重叠区域640作为目标检测区域,并将目标检测区域对应的光谱色彩数据作为目标光谱色彩数据。
通过确定原始图像对应的感兴趣区域,并根据感兴趣区域筛选需要使用的光谱色彩数据对原始图像进行色彩还原,而不是将光谱感知装置采集到的所有光谱色彩数据均应用于色彩还原,可以有效降低色彩还原过程中的计算量,提升计算效率。
在一示例实施例中,本实施例中的图像处理系统不改变已经成熟的图像信号处理流程的系统架构,即本实施例中的图像处理系统是将光谱感知装置210对应的多个检测区域630的光谱色彩数据,在已经成熟的图像信号处理流程的系统架构上进行结合。
图7示意性示出本公开示例性实施例中再一种图像处理系统的架构示意图,参考图7所示,图像信号处理器222可以包括RGB域处理模块710和YUV域处理模块720,具体的,参考图8所示,RGB域处理模块710可以包括Raw RGB域处理单元810以及Full RGB域处理单元820。
可以理解的是,Raw RGB域、Full RGB域以及YUV域分别是指在图像信号处理流水线中对不同格式或者阶段的原始图像的区分,举例而言,参考图9所示,对于将原始图像转换为最终的JPG输出图像的成熟的图像信号处理流水线900,其中,可以将坏点隐藏(Dead Pixel Concealment)、黑电平补偿(Black Level Compensation)、镜头阴影校正(Lens Shading Correction)、抗混叠噪声滤波器(Anti-aliasing Noise filter)、自动白平衡增益控制(AWB Gain control)、CFA插值(CFA Interpolation)等流程处理的数据认为是Raw RGB域910对应的原始图像;可以将伽马校正(Gamma Correction)、色彩校正(Color Correction)、色彩空间转换(Color space conversion)等流程处理的数据认为是Full RGB域920对应的原始图像;可以将色度噪声滤波器(Noise filter for Chroma)、色度饱和度控制(Hue Saturation Control)、Luma噪声滤波器(Noise filter for  Luma)、边缘增强(Edge Enhancement)、对比度亮度控制(Contrast Brightness Control)、数据格式化(Data formatter)等流程处理的数据认为是YUV域930对应的原始图像。
便于理解的,参考图10所示,可以通过基于拜耳滤光样式的图像传感器采集到Raw RGB域原始图像1000,此时的Raw RGB域原始图像1000可以包含不同通道的颜色滤波阵列1010(R颜色滤波阵列、G颜色滤波阵列、B颜色滤波阵列);此时可以通过对不同通道的颜色滤波阵列1010进行插值(如可以通过反马赛克Demosaic算法对颜色滤波阵列1010进行插值),得到不同通道的全局颜色阵列1020(R全局颜色阵列、G全局颜色阵列、B全局颜色阵列),进而可以通过不同通道的全局颜色阵列1020得到Full RGB域原始图像1030。可以继续对Full RGB域原始图像1030进行颜色空间转换,得到YUV域的原始图像。
具体的,继续参考图7和图8所示,图像信号处理器222可以包括RGB域处理模块710,具体的,RGB域处理模块710可以包括Raw RGB域处理单元810,该Raw RGB域处理单元810可以用于:
将原始图像划分为多个局部图像区域,局部图像区域可以按照K*L的阵列排布,其中,K和L均为大于1的正整数,K大于或者等于M,且L大于或者等于N;根据目标光谱色彩数据对具有K*L个局部图像区域的原始图像进行色彩还原处理,得到目标图像。
进一步的,参考图11所示,Raw RGB域处理单元810可以包括白平衡增益子单元1110以及第一色调映射子单元1120,其中:白平衡增益子单元1110可以用于根据目标光谱色彩数据对具有K*L个局部图像区域的原始图像进行白平衡处理,确定白平衡增益矩阵;第一色调映射子单元1120可以用于根据目标光谱色彩数据对具有K*L个局部图像区域的原始图像进行色调映射处理,确定第一色调映射增益函数。
在确定白平衡增益矩阵和第一色调映射增益函数之后,Raw RGB域处理单元810可以通过白平衡增益矩阵以及第一色调映射增益函数对原始图像进行色彩还原,得到目标图像。
在一示例实施例中,图像信号处理器222可以包括RGB域处理模块710,RGB域处理模块710可以包括Full RGB域处理单元820,该Full RGB域处理单元820可以用于:
将原始图像划分为多个局部图像区域,局部图像区域可以按照X*Y的阵列排布,其中,X和Y均为大于1的正整数,X大于或者等于M,且Y大于或者等于N;根据目标光谱色彩数据对具有X*Y个局部图像区域的原始图像进行色彩还原处理,得到目标图像。
具体的,参考图12所示,Full RGB域处理单元820可以包括颜色校正子单元1210、第二色调映射子单元1220和2D/3D查找表子单元1230,其中:
颜色校正子单元1210可以用于根据所述目标光谱色彩数据对具有X*Y个局部图像区域的原始图像进行伽马校正处理,确定伽马校正曲线;
第二色调映射子单元1220可以用于根据所述目标光谱色彩数据对具有X*Y个局部图像区域的原始图像进行色调映射处理,确定第二色调映射增益函数;
2D/3D查找表子单元1230可以用于根据所述目标光谱色彩数据以及2D/3D查找表,对具有X*Y个局部图像区域的原始图像进行色彩映射处理,确定校正色彩数据。
在确定伽马校正曲线、第二色调映射增益函数以及校正色彩数据之后,Full RGB域处理单元820可以通过伽马校正曲线、第二色调映射增益函数以及校正色彩数据对原始图像进行色彩还原,得到目标图像。
在一示例实施例中,图像信号处理器222可以包括YUV域处理模块720,该YUV域处理模块720可以用于:
将原始图像划分为多个局部图像区域,局部图像区域可以按照E*F的阵列排布,其中,E和F均为大于1的正整数,E大于或者等于M,且F大于或者等于N;根据目标光谱色彩数据对具有E*F个局部图像区域的原始图像进行色彩还原处理,得到目标图像。
具体的,参考图13所示,YUV域处理模块720可以包括第三色调映射子单元1310和色彩调整增强子单元1320,其中:
第三色调映射子单元1310可以用于根据所述目标光谱色彩数据对具有E*F个局部图像区域的原始图像进行色调映射处理,确定第三色调映射增益函数;
色彩调整增强子单元1320可以用于根据所述目标光谱色彩数据对具有E*F个局部图像区域的原始图像进行色彩调整增强,确定色彩调整增强参数,其中色彩调整增强参数可以包括对比度调整参数、亮度调整参数、饱和度调整参数和边缘增强参数等,本示例实施例对此不做特殊限定。
在确定第三色调映射增益函数以及色彩调整增强参数之后,YUV域处理模块720可以根据第三色调映射增益函数和色彩调整增强参数对原始图像进行色彩还原,得到目标图像。
需要说明的是,本公开实施例中的K*L阵列、X*Y阵列、E*F阵列分别为在Raw RGB域处理数据、Full RGB域处理数据、YUV域处理数据时划分的网格,K*L阵列、X*Y阵列、E*F阵列的尺寸可以是一致的,也可以是不一致的,具体可以根据实际应用情况进行自定义设置,本示例实施例对此不做特殊限定。
在一示例实施例中,图像信号处理器还可以用于基于无限脉冲响应IIR滤波器对不同时刻色彩还原后的原始图像进行时域滤波处理,得到目标图像。通过对不同时刻色彩还原后的原始图像进行时域滤波处理,能够降低由于对图像局部处理而导致的区域边缘或者前后图像色彩不连续的问题,提升拍摄视频或者实时预览时的图像表现连续性,提升用户体验。
综上所述,本示例性实施方式中,可以包括光谱感知装置,光谱感知装置用于采集目标光谱色彩数据,以及摄像模组,摄像模组可以包括图像传感器和图像信号处理器,其中图像传感器可以用于生成原始图像,图像信号处理器可以与图像传感器和光谱感知装置电连接,用于根据目标光谱色彩数据对原始图像进行色彩还原,得到目标图像。一方面,通过光谱感知装置获取目标光谱色彩数据,相比于图像传感器生成的颜色通道较少的原始图像,能够感知当前场景中更加丰富的色彩信息,并结合目标光谱色彩数据对原始图像进行色彩还原,能够有效提升目标图像的色彩表达的准确性,提高目标图像的色彩表现力;另一方面,通过独立的光谱感知装置采集目标光谱色彩数据,并将目标光谱色彩数据与图像信号处理器中的已经成熟的图像信号处理流程的系统架构相结合进行色彩还原,并不需要调整已经成熟的图像信号处理流程的系统结构,有效提升色彩还原效率,降低优化成本。
需要注意的是,上述附图仅是根据本公开示例性实施例的方法所包括的处理的示意性说明,而不是限制目的。易于理解,上述附图所示的处理并不表明或限制这些处理的时间顺序。另外,也易于理解,这些处理可以是例如在多个模块中同步或异步执行的。
进一步的,本示例的实施方式中还提供一种图像处理方法,图14示出了本示例性实施方式中一种图像处理方法的流程示意图,包括以下步骤S1410至步骤S1430:
步骤S1410,采集目标光谱色彩数据;
步骤S1420,获取原始图像;
步骤S1430,根据目标光谱色彩数据对所述原始图像进行色彩还原,得到目标图像。
在一示例实施例中,目标光谱色彩数据可以包括多个检测区域,原始图像可以包括多个图像区域,且检测区域与图像区域相对应。
上述方法的具体细节在系统部分实施方式中已经详细说明,未披露的细节内容可以参见系统部分的实施方式内容,因而不再赘述。
所属技术领域的技术人员能够理解,本公开的各个方面可以实现为系统、方法或程序产品。因此,本公开的各个方面可以具体实现为以下形式,即:完全的硬件实施方式、完全的软件实施方式(包括固件、微代码等),或硬件和软件方面结合的实施方式,这里可以统称为“电路”、“模块”或“系统”。
本公开的示例性实施方式还提供一种电子设备,参考图15所示,电子设备1500可以包括:壳体1510,主电路板1520,图像处理系统1530。其中,主电路板1520位于壳体1510内。图像处理系统1530可以是如本公开实施例中的图像处理系统。图像处理系统1530可以与主电路板1520电连接,以实现图像处理系统1530可以与主电路板1520之间的信号或数据传输。例如,图像处理系统1530可以设置于主电路板1520上,或者通过柔性电路板等介质连接于主电路板1520,以形成电连接,给图像处理系统1530供电或者传输信号。
下面以图16中的移动终端1600为例,对该电子设备的构造进行示例性说明。本领域技术人员应当理解,除了特别用于移动目的的部件之外,图16中的构造也能够应用于固定类型的设备。
如图16所示,移动终端1600具体可以包括:处理器1601、存储器1602、总线1603、移动通信模块1604、天线1、无线通信模块1605、天线2、显示屏1606、摄像模块1607、音频模块1608、电源模块1609与传感器模块1610。
处理器1601可以包括一个或多个处理单元,例如:处理器1601可以包括应用处理器AP、调制解调处理器、图形处理器GPU、ISP处理器、控制器、编码器、解码器、DSP(Digital Signal Processor,数字信号处理器)、基带处理器和/或神经网络处理器NPU等。
编码器可以对图像或视频进行编码(即压缩),以减小数据大小,便于存储或发送。解码器可以对图像或视频的编码数据进行解码(即解压缩),以还原出图像或视频数据。移动终端1600可以支持一种或多种编码器和解码器,例如:JPEG(Joint Photographic Experts Group,联合图像专家组)、PNG(Portable Network Graphics,便携式网络图形)、BMP(Bitmap,位图)等图像格式,MPEG(Moving Picture Experts Group,动态图像专家组)1、MPEG10、H.1063、H.1064、HEVC(High Efficiency Video Coding,高效率视频编码)等视频格式。
处理器1601可以通过总线1603与存储器1602或其他部件形成连接。
存储器1602可以用于存储计算机可执行程序代码,可执行程序代码包括指令。处理器1601通过运行存储在存储器1602的指令,执行移动终端1600的各种功能应用以及数据处理。存储器1602还可以存储应用数据,例如存储图像,视频等文件。
移动终端1600的通信功能可以通过移动通信模块1604、天线1、无线通信模块1605、天线2、调制解调处理器以及基带处理器等实现。天线1和天线2用于发射和接收电磁波信号。移动通信模块1604可以提供应用在移动终端1600上3G、4G、5G等移动通信解决方案。无线通信模块1605可以提供应用在移动终端1600上的无线局域网、蓝牙、近场通信等无线通信解决方案。
显示屏1606用于实现显示功能,如显示用户界面、图像、视频等。摄像模块1607用于实现拍摄功能,如拍摄图像、视频等。音频模块1608用于实现音频功能,如播放音频,采集语音等。电源模块1609用于实现电源管理功能,如为电池充电、为设备供电、监测电池状态等。
传感器模块1610可以包括一种或多种传感器,用于实现相应的感应检测功能。例如,传感器模块1610可以包括惯性传感器,其用于检测移动终端1600的运动位姿,输出惯性传感数据。
本公开的示例性实施方式还提供了一种计算机可读存储介质,其上存储有能够实现本说明书上述方法的程序产品。在一些可能的实施方式中,本公开的各个方面还可以实现为一种程序产品的形式,其包括程序代码,当程序产品在终端设备上运行时,程序代码用于使终端设备执行本说明书上述“示例性方法”部分中描述的根据本公开各种示例性实施方式的步骤。
需要说明的是,本公开所示的计算机可读介质可以是计算机可读信号介质或者计算机可读存储介质或者是上述两者的任意组合。计算机可读存储介质例如可以是——但不限于——电、磁、光、电磁、红外线、或半导体的系统、装置或器件,或者任意以上的组合。计算机可读存储介质的更具体的例子可以包括但不限于:具有一个或多个导线的电连接、便携式计算机磁盘、硬盘、随机访问存储器(RAM)、只读存储器(ROM)、可擦式可编程只读存储器(EPROM或闪存)、光纤、便携式紧凑磁盘只读存储器(CD-ROM)、光存储器件、磁存储器件、或者上述的任意合适的组合。
在本公开中,计算机可读存储介质可以是任何包含或存储程序的有形介质,该程序可以被指令执行系统、装置或者器件使用或者与其结合使用。而在本公开中,计算机可读信号介质可以包括在基带中或者作为载波一部分传播的数据信号,其中承载了计算机可读的程序代码。这种传播的数据信号可以采用多种形式,包括但不限于电磁信号、光信号或上述的任意合适的组合。计算机可读的信号介质还可以是计算机可读存储介质以外的任何计算机可读介质,该计算机可读介质可以发送、传播或者传输用于由指令执行系统、装置或者器件使用或者与其结合使用的程序。计算机可读介质上包含的程序代码可以用任何适当的介质传输,包括但不限于:无线、电线、光缆、RF等等,或者上述的任意合适的组合。
此外,可以以一种或多种程序设计语言的任意组合来编写用于执行本公开操作的程序代码,程序设计语言包括面向对象的程序设计语言—诸如Java、C++等,还包括常规的过程式程序设计语言—诸如“C”语言或类似的程序设计语言。程序代码可以完全地在用户计算设备上执行、部分地在用户设备上执行、作为一个独立的软件包执行、部分在用户计算设备上部分在远程计算设备上执行、或者完全在远程计算设备或服务器上执行。在涉及远程计算设备的情形中,远程计算设备可以通过任意种类的网络,包括局域网(LAN)或广域网(WAN),连接到用户计算设备,或者,可以连接到外部计算设备(例如利用因特网服务提供商来通过因特网连接)。
本领域技术人员在考虑说明书及实践这里公开的发明后,将容易想到本公开的其他实施例。本申请旨在涵盖本公开的任何变型、用途或者适应性变化,这些变型、用途或者适应性变化遵循本公开的一般性原理并包括本公开未公开的本技术领域中的公知常识或惯用技术手段。说明书和实施例仅被视为示例性的,本公开的真正范围和精神由权利要求指出。
应当理解的是,本公开并不局限于上面已经描述并在附图中示出的精确结构,并且可以在不脱离其范围进行各种修改和改变。本公开的范围仅由所附的权利要求来限。

Claims (20)

  1. 一种图像处理系统,其特征在于,包括:
    光谱感知装置,用于采集目标光谱色彩数据;
    摄像模组,包括图像传感器和图像信号处理器,其中:
    所述图像传感器,用于生成原始图像;
    所述图像信号处理器,与所述图像传感器和所述光谱感知装置电连接,用于根据所述目标光谱色彩数据对所述原始图像进行色彩还原,得到目标图像。
  2. 根据权利要求1所述的系统,其特征在于,所述光谱感知装置包括多个光谱传感器,所述多个光谱传感器按照M*N的阵列排布构成光谱传感器阵列,其中,M和N均为大于1的正整数。
  3. 根据权利要求2所述的系统,其特征在于,所述光谱感知装置用于:
    通过所述光谱传感器阵列采集得到具有M*N个检测区域的光谱色彩数据,并将所述M*N个局部检测区域的光谱色彩数据作为目标光谱色彩数据。
  4. 根据权利要求3所述的系统,其特征在于,所述图像处理系统包括图像区域分割模块,所述图像区域分割模块用于:
    根据所述原始图像确定感兴趣区域;
    通过所述感兴趣区域在所述M*N个检测区域中确定目标检测区域,并将所述目标检测区域对应的光谱色彩数据作为目标光谱色彩数据。
  5. 根据权利要求1至4任一项所述的系统,其特征在于,所述图像信号处理器包括RGB域处理模块,所述RGB域处理模块包括Raw RGB域处理单元,
    所述Raw RGB域处理单元用于:
    将所述原始图像划分为多个局部图像区域,所述局部图像区域按照K*L的阵列排布,其中,K和L均为大于1的正整数,K大于或者等于M,且L大于或者等于N;
    根据所述目标光谱色彩数据对具有K*L个局部图像区域的原始图像进行色彩还原处理,得到目标图像。
  6. 根据权利要求5所述的系统,其特征在于,所述Raw RGB域处理单元包括白平衡增益子单元以及第一色调映射子单元,其中:
    所述白平衡增益子单元,用于根据所述目标光谱色彩数据对具有K*L个局部图像区域的原始图像进行白平衡处理,确定白平衡增益矩阵;
    所述第一色调映射子单元,用于根据所述目标光谱色彩数据对具有K*L个局部图像区域的原始图像进行色调映射处理,确定第一色调映射增益函数;以及
    所述Raw RGB域处理单元,用于通过所述白平衡增益矩阵以及所述第一色调映射增益函数对所述原始图像进行色彩还原,得到目标图像。
  7. 根据权利要求1至4任一项所述的系统,其特征在于,所述图像信号处理器包括RGB域处理模块,所述RGB域处理模块包括Full RGB域处理单元,
    所述Full RGB域处理单元用于:
    将所述原始图像划分为多个局部图像区域,所述局部图像区域按照X*Y的阵列排布,其中,X和Y均为大于1的正整数,X大于或者等于M,且Y大于或者等于N;
    根据所述目标光谱色彩数据对具有X*Y个局部图像区域的原始图像进行色彩还原处理,得到目标图像。
  8. 根据权利要求7所述的系统,其特征在于,所述Full RGB域处理单元包括颜色校正子单元、第二色调映射子单元和2D/3D查找表子单元,其中:
    所述颜色校正子单元,用于根据所述目标光谱色彩数据对具有X*Y个局部图像区域的原始图像进行伽马校正处理,确定伽马校正曲线;
    所述第二色调映射子单元,用于根据所述目标光谱色彩数据对具有X*Y个局部图 像区域的原始图像进行色调映射处理,确定第二色调映射增益函数;
    所述2D/3D查找表子单元,用于根据所述目标光谱色彩数据以及2D/3D查找表,对具有X*Y个局部图像区域的原始图像进行色彩映射处理,确定校正色彩数据;以及
    所述Full RGB域处理单元,用于通过所述伽马校正曲线、所述第二色调映射增益函数以及所述校正色彩数据对所述原始图像进行色彩还原,得到目标图像。
  9. 根据权利要求1至4任一项所述的系统,其特征在于,所述图像信号处理器包括YUV域处理模块,所述YUV域处理模块用于:
    将所述原始图像划分为多个局部图像区域,所述局部图像区域按照E*F的阵列排布,其中,E和F均为大于1的正整数,E大于或者等于M,且F大于或者等于N;
    根据所述目标光谱色彩数据对具有E*F个局部图像区域的原始图像进行色彩还原处理,得到目标图像。
  10. 根据权利要求9所述的系统,其特征在于,所述YUV域处理模块包括第三色调映射子单元和色彩调整增强子单元,其中:
    所述第三色调映射子单元,用于根据所述目标光谱色彩数据对具有E*F个局部图像区域的原始图像进行色调映射处理,确定第三色调映射增益函数;
    所述色彩调整增强子单元,用于根据所述目标光谱色彩数据对具有E*F个局部图像区域的原始图像进行色彩调整增强,确定色彩调整增强参数,其中所述色彩调整增强参数包括对比度调整参数、亮度调整参数、饱和度调整参数和边缘增强参数;以及
    所述YUV域处理模块,用于根据所述第三色调映射增益函数和所述色彩调整增强参数对所述原始图像进行色彩还原,得到目标图像。
  11. 根据权利要求1所述的系统,其特征在于,所述图像信号处理器还用于:
    基于无限脉冲响应IIR滤波器对不同时刻色彩还原后的原始图像进行时域滤波处理,得到目标图像。
  12. 一种图像处理方法,其特征在于,由权利要求1-11任一项所述的图像处理系统执行,所述方法包括:
    采集目标光谱色彩数据;
    获取原始图像;
    根据目标光谱色彩数据对所述原始图像进行色彩还原,得到目标图像。
  13. 根据权利要求12所述的方法,其特征在于,所述采集目标光谱色彩数据包括:
    通过光谱传感器阵列采集得到具有M*N个检测区域的光谱色彩数据,并将所述M*N个局部检测区域的光谱色彩数据作为目标光谱色彩数据。
  14. 根据权利要求13所述的方法,其特征在于,所述将所述M*N个局部检测区域的光谱色彩数据作为目标光谱色彩数据包括:
    根据所述原始图像确定感兴趣区域;
    通过所述感兴趣区域在所述M*N个检测区域中确定目标检测区域,并将所述目标检测区域对应的光谱色彩数据作为目标光谱色彩数据。
  15. 根据权利要求12所述的方法,其特征在于,所述根据目标光谱色彩数据对所述原始图像进行色彩还原,得到目标图像包括:
    将所述原始图像划分为多个局部图像区域,所述局部图像区域按照K*L的阵列排布,其中,K和L均为大于1的正整数,K大于或者等于M,且L大于或者等于N;
    根据所述目标光谱色彩数据对具有K*L个局部图像区域的原始图像进行色彩还原处理,得到目标图像。
  16. 根据权利要求15所述的方法,其特征在于,所述根据所述目标光谱色彩数据对具有K*L个局部图像区域的原始图像进行色彩还原处理,得到目标图像包括:
    根据所述目标光谱色彩数据对具有K*L个局部图像区域的原始图像进行白平衡处 理,确定白平衡增益矩阵;
    根据所述目标光谱色彩数据对具有K*L个局部图像区域的原始图像进行色调映射处理,确定第一色调映射增益函数;
    通过所述白平衡增益矩阵以及所述第一色调映射增益函数对所述原始图像进行色彩还原,得到目标图像。
  17. 根据权利要求12所述的方法,其特征在于,所述根据目标光谱色彩数据对所述原始图像进行色彩还原,得到目标图像包括:
    基于无限脉冲响应IIR滤波器对不同时刻色彩还原后的原始图像进行时域滤波处理,得到目标图像。
  18. 根据权利要求12所述的方法,其特征在于,所述目标光谱色彩数据包括多个检测区域,所述原始图像包括多个图像区域,所述检测区域与所述图像区域相对应。
  19. 一种计算机可读介质,其上存储有计算机程序,其特征在于,所述计算机程序被处理器执行时实现如权利要求12至18中任一项所述的方法。
  20. 一种电子设备,其特征在于,包括:
    壳体;
    主电路板,设置在所述壳体内;
    如权利要求1-11任一项所述的图像处理系统,设置在所述壳体内,与所述主电路板电连接。
PCT/CN2023/095718 2022-08-03 2023-05-23 图像处理系统及方法、计算机可读介质和电子设备 WO2024027287A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202210927465.6 2022-08-03
CN202210927465.6A CN115314617A (zh) 2022-08-03 2022-08-03 图像处理系统及方法、计算机可读介质和电子设备

Publications (2)

Publication Number Publication Date
WO2024027287A1 true WO2024027287A1 (zh) 2024-02-08
WO2024027287A9 WO2024027287A9 (zh) 2024-05-10

Family

ID=83859660

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2023/095718 WO2024027287A1 (zh) 2022-08-03 2023-05-23 图像处理系统及方法、计算机可读介质和电子设备

Country Status (2)

Country Link
CN (1) CN115314617A (zh)
WO (1) WO2024027287A1 (zh)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115314617A (zh) * 2022-08-03 2022-11-08 Oppo广东移动通信有限公司 图像处理系统及方法、计算机可读介质和电子设备
CN118057830A (zh) * 2022-11-18 2024-05-21 华为技术有限公司 图像处理方法、电子设备、计算机程序产品及存储介质
CN117319815B (zh) * 2023-09-27 2024-05-14 北原科技(深圳)有限公司 基于图像传感器的视频流识别方法和装置、设备、介质

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20200195862A1 (en) * 2018-12-14 2020-06-18 Lyft, Inc. Multispectrum, multi-polarization (msmp) filtering for improved perception of difficult to perceive colors
CN112562017A (zh) * 2020-12-07 2021-03-26 奥比中光科技集团股份有限公司 一种rgb图像的色彩还原方法及计算机可读存储介质
CN112752023A (zh) * 2020-12-29 2021-05-04 深圳市天视通视觉有限公司 一种图像调整方法、装置、电子设备及存储介质
CN113676713A (zh) * 2021-08-11 2021-11-19 维沃移动通信(杭州)有限公司 图像处理方法、装置、设备及介质
CN115314617A (zh) * 2022-08-03 2022-11-08 Oppo广东移动通信有限公司 图像处理系统及方法、计算机可读介质和电子设备

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11493387B2 (en) * 2020-03-12 2022-11-08 Spectricity Correction and calibration of spectral sensor output
CN114286072A (zh) * 2020-09-27 2022-04-05 北京小米移动软件有限公司 色彩还原装置及方法、图像处理器
CN113418864B (zh) * 2021-06-03 2022-09-16 奥比中光科技集团股份有限公司 一种多光谱图像传感器及其制造方法
CN113676628B (zh) * 2021-08-09 2023-05-02 Oppo广东移动通信有限公司 成像装置和图像处理方法
CN113639881A (zh) * 2021-08-23 2021-11-12 Oppo广东移动通信有限公司 色温测试方法及装置、计算机可读介质和电子设备

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20200195862A1 (en) * 2018-12-14 2020-06-18 Lyft, Inc. Multispectrum, multi-polarization (msmp) filtering for improved perception of difficult to perceive colors
CN112562017A (zh) * 2020-12-07 2021-03-26 奥比中光科技集团股份有限公司 一种rgb图像的色彩还原方法及计算机可读存储介质
CN112752023A (zh) * 2020-12-29 2021-05-04 深圳市天视通视觉有限公司 一种图像调整方法、装置、电子设备及存储介质
CN113676713A (zh) * 2021-08-11 2021-11-19 维沃移动通信(杭州)有限公司 图像处理方法、装置、设备及介质
CN115314617A (zh) * 2022-08-03 2022-11-08 Oppo广东移动通信有限公司 图像处理系统及方法、计算机可读介质和电子设备

Also Published As

Publication number Publication date
WO2024027287A9 (zh) 2024-05-10
CN115314617A (zh) 2022-11-08

Similar Documents

Publication Publication Date Title
WO2024027287A1 (zh) 图像处理系统及方法、计算机可读介质和电子设备
EP3039864B1 (en) Automatic white balancing with skin tone correction for image processing
US11317070B2 (en) Saturation management for luminance gains in image processing
US10055815B2 (en) Image processing apparatus, image processing system, imaging apparatus and image processing method
CN113810641B (zh) 视频处理方法、装置、电子设备和存储介质
US10600170B2 (en) Method and device for producing a digital image
CN111147857B (zh) 一种图像处理方法、图像处理装置、电子设备和存储介质
CN113810642B (zh) 视频处理方法、装置、电子设备和存储介质
US9654756B1 (en) Method and apparatus for interpolating pixel colors from color and panchromatic channels to color channels
CN113824914B (zh) 视频处理方法、装置、电子设备和存储介质
WO2021179142A1 (zh) 一种图像处理方法及相关装置
CN111107336B (zh) 一种图像处理方法、图像处理装置、电子设备和存储介质
WO2020215263A1 (zh) 一种图像处理方法及装置
WO2023016044A1 (zh) 视频处理方法、装置、电子设备和存储介质
CN115205159A (zh) 图像处理方法及装置、电子设备、存储介质
CN115187487A (zh) 图像处理方法及装置、电子设备、存储介质
WO2021217428A1 (zh) 图像处理方法、装置、摄像设备和存储介质
CN115278189A (zh) 图像色调映射方法及装置、计算机可读介质和电子设备
CN115187488A (zh) 图像处理方法及装置、电子设备、存储介质
CN115330633A (zh) 图像色调映射方法及装置、电子设备、存储介质
CN115706765A (zh) 视频处理方法、装置、电子设备和存储介质
CN115278191B (zh) 图像白平衡方法及装置、计算机可读介质和电子设备
WO2023016041A1 (zh) 视频处理方法、装置、电子设备和存储介质
WO2023016043A1 (zh) 视频处理方法、装置、电子设备和存储介质
WO2023016038A1 (zh) 视频处理方法、装置、电子设备和存储介质

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23848997

Country of ref document: EP

Kind code of ref document: A1