WO2024027287A9 - 图像处理系统及方法、计算机可读介质和电子设备 - Google Patents

图像处理系统及方法、计算机可读介质和电子设备 (Image processing system and method, computer-readable medium, and electronic device)

Info

Publication number
WO2024027287A9
WO2024027287A9 · PCT/CN2023/095718
Authority
WO
WIPO (PCT)
Prior art keywords
image
target
color data
original image
color
Prior art date
Application number
PCT/CN2023/095718
Other languages
English (en)
French (fr)
Other versions
WO2024027287A1 (zh)
Inventor
张科武
Original Assignee
Oppo广东移动通信有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Oppo广东移动通信有限公司 filed Critical Oppo广东移动通信有限公司
Publication of WO2024027287A1 publication Critical patent/WO2024027287A1/zh
Publication of WO2024027287A9 publication Critical patent/WO2024027287A9/zh

Links

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/40Extracting pixel data from image sensors by controlling scanning circuits, e.g. by modifying the number of pixels sampled or to be sampled

Definitions

  • the present disclosure relates to the technical field of image processing, and in particular to an image processing system, an image processing method, a computer-readable medium, and an electronic device.
  • A Bayer sensor is a sensor that can spatially separate the red, green, and blue components of the incoming light signal and measure each component independently.
  • the present disclosure aims to provide an image processing system, an image processing method, a computer-readable medium and an electronic device.
  • an image processing system comprising:
  • a spectral sensing device used to collect target spectral color data
  • the camera module includes an image sensor and an image signal processor, wherein:
  • the image sensor is used to generate an original image
  • the image signal processor is electrically connected to the image sensor and the spectrum sensing device, and is used to perform color restoration on the original image according to the target spectrum color data to obtain a target image.
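  • To make the relationship between these modules concrete, the following is a minimal Python sketch of the architecture just described. The class and method names (SpectralSensingDevice, ImageSensor, ImageSignalProcessor, collect, capture, restore_color) and the array shapes are illustrative assumptions, not identifiers from the disclosure; restore_color is left as a stub, and later sketches fill in individual stages.

```python
from dataclasses import dataclass

import numpy as np


@dataclass
class SpectralSensingDevice:
    """Collects target spectral color data (assumed here as M*N areas x B bands)."""
    rows: int = 10
    cols: int = 10
    bands: int = 8

    def collect(self) -> np.ndarray:
        # Placeholder acquisition; real hardware would return measured band energies.
        return np.random.rand(self.rows, self.cols, self.bands)


@dataclass
class ImageSensor:
    """Generates the original image (stand-in for a Bayer-pattern sensor)."""
    height: int = 480
    width: int = 640

    def capture(self) -> np.ndarray:
        return np.random.rand(self.height, self.width, 3)


class ImageSignalProcessor:
    """Performs color restoration on the original image using the spectral data."""

    def restore_color(self, original: np.ndarray, spectral: np.ndarray) -> np.ndarray:
        # Stub: the real ISP applies white balance, tone mapping, LUTs, etc.
        return np.clip(original, 0.0, 1.0)


class ImageProcessingSystem:
    def __init__(self) -> None:
        self.spectral_device = SpectralSensingDevice()
        self.image_sensor = ImageSensor()
        self.isp = ImageSignalProcessor()

    def shoot(self) -> np.ndarray:
        spectral = self.spectral_device.collect()           # target spectral color data
        original = self.image_sensor.capture()              # original image
        return self.isp.restore_color(original, spectral)   # target image
```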
  • an image processing method comprising:
  • collecting target spectral color data;
  • obtaining an original image; and
  • performing color restoration on the original image according to the target spectral color data to obtain a target image.
  • a computer-readable medium on which a computer program is stored, and when the computer program is executed by a processor, the above method is implemented.
  • an electronic device comprising:
  • a housing;
  • a main circuit board arranged in the housing; and
  • the image processing system provided in the first aspect, arranged in the housing and electrically connected to the main circuit board.
  • FIG1 is a schematic diagram showing a system architecture of an image signal processing flow in a related technical solution
  • FIG2 schematically shows a schematic diagram of the architecture of an image processing system in an exemplary embodiment of the present disclosure
  • FIG3 schematically shows a schematic diagram of a configuration of a spectrum sensing device and an electronic device in an exemplary embodiment of the present disclosure
  • FIG4 schematically shows a schematic diagram of the principle of collecting spectral color data by a spectral sensor array in an exemplary embodiment of the present disclosure
  • FIG5 schematically shows a schematic diagram of the architecture of another image processing system in an exemplary embodiment of the present disclosure
  • FIG6 schematically shows a principle diagram of an image region segmentation module in an exemplary embodiment of the present disclosure
  • FIG7 schematically shows a schematic diagram of the architecture of yet another image processing system in an exemplary embodiment of the present disclosure
  • FIG8 schematically shows a structural diagram of an RGB domain processing module in an exemplary embodiment of the present disclosure
  • FIG9 schematically shows a structural diagram of an image signal processing pipeline in an exemplary embodiment of the present disclosure
  • FIG10 schematically shows a schematic diagram of the difference between a Raw RGB domain image and a Full RGB domain image in an exemplary embodiment of the present disclosure
  • FIG11 schematically shows a schematic structural diagram of a Raw RGB domain processing module in an exemplary embodiment of the present disclosure
  • FIG12 schematically shows a structural diagram of a Full RGB domain processing module in an exemplary embodiment of the present disclosure
  • FIG13 schematically shows a structural diagram of a YUV domain processing module in an exemplary embodiment of the present disclosure
  • FIG14 schematically shows a flowchart of an image processing method in an exemplary embodiment of the present disclosure
  • FIG15 is a schematic diagram showing a structure of an electronic device that can be applied to an embodiment of the present disclosure
  • FIG. 16 shows a schematic structural diagram of another electronic device that can be applied to the embodiments of the present disclosure.
  • FIG1 is a schematic diagram showing a system architecture of an image signal processing flow in a related technical solution.
  • a system architecture of an image signal processing flow may include an image sensor 110, an RGB domain processing module 120, a YUV domain processing module 130, and a JPG output module 140. Among them:
  • the image sensor 110 refers to a sensor that collects optical signals based on a Bayer filter pattern template and converts the optical signals into electrical signals.
  • the image sensor 110 can be a complementary metal oxide semiconductor (CMOS) or a charge-coupled device (CCD).
  • the RGB domain processing module 120 is the module that performs all of the image-signal-processing stages carried out in the RGB domain on the original image captured by the image sensor 110.
  • the YUV domain processing module 130 is the module that converts the output of the RGB domain processing module 120 into the YUV color space (YUV domain) and performs all of the image-signal-processing stages carried out in that color space.
  • the JPG output module 140 maps the processing results of the RGB domain processing module 120 and of the YUV domain processing module 130 onto the original image, and converts the processed image into a picture that can ultimately be shown on a display unit, for example a JPG-format picture.
  • However, the image sensor 110 in this solution is a sensor whose filter pattern is derived from the Bayer filter and combinations of the RGB color channels.
  • Color filtering based on the three RGB channels is limited to the red, green, and blue (R, G, B) spectral ranges, so the spectral information that can be perceived in scenes with complex light sources is limited. At the same time, the system needs the current color temperature and the color information of the current scene, and a global RGB-based approach has many limitations and cannot obtain the color information of local areas. As a result, the color accuracy of the final output image is poor and the colors of the real scene cannot be faithfully restored.
  • the present disclosure first provides an image processing system, which can be set in an electronic device with certain computing capabilities.
  • the electronic device can be a desktop computer, a portable computer, a smart phone, an intelligent robot, a wearable device, and a tablet computer, etc. This example embodiment does not make any special limitations on this.
  • FIG2 schematically shows a schematic diagram of the architecture of an image processing system in an exemplary embodiment of the present disclosure.
  • the image processing system 200 may include:
  • the spectrum sensing device 210 can be used to collect target spectrum color data
  • the camera module 220 may include an image sensor 221 and an image signal processor 222, wherein:
  • the image sensor 221 may be used to generate a raw image
  • the image signal processor 222 may be electrically connected to the image sensor 221 and the spectrum sensing device 210 , and is configured to perform color restoration on the original image output by the image sensor 221 according to the target spectrum color data collected by the spectrum sensing device 210 , so as to obtain a target image.
  • the spectral sensing device 210 refers to a sensing unit for sensing the true color in the current scene when capturing the original image.
  • the spectral sensing device 210 may be a multispectral sensor (Multispectral sensor), which may include imaging optical elements and optical elements for dividing the spectrum, and may accurately identify all spectral color information in the current scene.
  • the spectral sensing device 210 may also be a CCD direct-reading spectrometer. This example embodiment does not specifically limit the type of the spectral sensing device.
  • FIG. 3 schematically shows a schematic diagram of a configuration of a spectrum sensing device and an electronic device in an exemplary embodiment of the present disclosure.
  • the multispectral sensor 310 can be integrated into the camera module 220, or it can be set in the electronic device independently of the camera module 220. For example, it can be set on the housing 320 of the electronic device, and the distance from the camera module 220 is within the distance threshold; of course, the spectral sensing device 210 can also be used as an external device to communicate with the electronic device corresponding to the camera module 220.
  • the external spectral sensing device 330 can communicate with the camera module 220 in a wired or wireless manner 340. This example embodiment does not specifically limit the connection method of the spectral sensing device 210.
  • the target spectral color data is the spectral color data that is selected to act on the original image.
  • for example, the target spectral color data may be all of the spectral color data collected by the spectral sensing device 210, or only the spectral color data, among all of the data collected by the spectral sensing device 210, that corresponds to the foreground area of the original image.
  • it may also be spectral color data selected to act on the original image in some other way; this example embodiment does not specifically limit how the target spectral color data is selected.
  • Image sensor 221 refers to a sensor that realizes light signal acquisition and converts light signals into electrical signals.
  • image sensor 221 may be a complementary metal oxide semiconductor (CMOS) sensor or a charge-coupled device (CCD).
  • image sensor 221 may be an image sensor based on a Bayer filter pattern, such as an image sensor based on an RGBG filter pattern or an RGBW filter pattern.
  • it may also be an image sensor based on a non-Bayer filter pattern, such as an image sensor based on an RYYB filter pattern. This embodiment is not limited thereto.
  • the original image is the image data obtained after the image sensor captures the light signals and converts them into electrical signals.
  • for example, if the original image is collected by an image sensor based on an RGBG filter pattern, the original image may be an RGBG image;
  • if the original image is collected by an image sensor based on an RYYB filter pattern, the original image may be an RYYB image.
  • This example embodiment is not limited to this.
  • An image signal processor is a processor that processes the original image captured by an image sensor into image data that can be output to a display for display.
  • the image signal processor may be an ISP (Image Signal Processor), an application processor (AP), a graphics processing unit (GPU), or the like; this example embodiment does not limit this.
  • for ease of understanding, the image signal processor is described as an ISP processor in the following.
  • the spectral sensing device may include a plurality of spectral sensors, and the plurality of spectral sensors may be arranged in an array of M*N to form a spectral sensor array, wherein M and N are both positive integers greater than 1.
  • FIG. 4 schematically shows a schematic diagram of the principle of collecting spectral color data by a spectral sensor array in an exemplary embodiment of the present disclosure.
  • the spectrum sensing device 210 may include a plurality of spectrum sensors 410, which are arranged spatially according to a preset number of columns and rows to obtain a spectrum sensor array.
  • the plurality of spectrum sensors 410 may be arranged in an M*N array to form a spectrum sensor array 420.
  • for example, M may be preset to 10 and N to 10; the spectrum sensing device 210 then includes 100 spectrum sensors 410, which are spatially arranged to obtain a 10*10 spectrum sensor array.
  • the specific M and N may be customized according to the actual application scenario, and this exemplary embodiment does not specifically limit this.
  • the spectral sensing device 210 can be used to collect spectral color data 440 having M*N detection areas 430 through the spectral sensor array 420, and use the spectral color data 440 of the M*N local detection areas as target spectral color data.
  • when the original image is captured, the spectral sensing device 210, with its spatial structure of an M*N spectral sensor array 420, collects the real color information of the current scene and obtains spectral color data 440 having M*N detection areas 430.
  • the spectral color data in each detection area 430 captures the spectral color data of the corresponding area of the real scene. Compared with an image sensor based on RGB channels, this provides richer color information about the real scene, perceives a wider range of spectral information, effectively supplements the color information of the image sensor, and improves the color-restoration accuracy of the output target image.
  • in addition, the per-detection-area spectral color data makes it possible to enhance the color of local areas of the target image, ensuring the color expressiveness of the target image.
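  • As a rough illustration of how the M*N detection areas might be consumed downstream, the sketch below summarizes each area's spectrum into an approximate per-area RGB white point. The band count of 8 and the band-to-RGB weight matrix are invented for illustration only; the disclosure does not specify them. Such per-area statistics could then feed the local white balance and tone mapping stages described later.

```python
import numpy as np

M, N, BANDS = 10, 10, 8
spectral_data = np.random.rand(M, N, BANDS)        # target spectral color data (placeholder)

# Assumed contribution of each spectral band to R, G and B (rows: R/G/B, columns: bands).
BAND_TO_RGB = np.array([
    [0.0, 0.0, 0.0, 0.1, 0.3, 0.6, 0.8, 1.0],      # long-wavelength (red-ish) bands
    [0.1, 0.3, 0.7, 1.0, 0.7, 0.3, 0.1, 0.0],      # mid-wavelength (green-ish) bands
    [1.0, 0.8, 0.5, 0.2, 0.0, 0.0, 0.0, 0.0],      # short-wavelength (blue-ish) bands
])


def per_area_white_point(spectra: np.ndarray) -> np.ndarray:
    """Return an (M, N, 3) array of approximate RGB white points, one per detection area."""
    rgb = np.einsum('mnb,cb->mnc', spectra, BAND_TO_RGB)
    return rgb / rgb.max(axis=-1, keepdims=True)   # normalize so the largest channel is 1


white_points = per_area_white_point(spectral_data)
print(white_points.shape)   # (10, 10, 3)
```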
  • the image processing system 200 may include an image region segmentation module 510.
  • the image region segmentation module 510 may be arranged in a processor such as a central processing unit (CPU) or a neural network processor (NPU) of the electronic device, or may be arranged in an image signal processor of a camera module. This example embodiment does not make any special limitation to this.
  • the image region segmentation module 510 can be used to: determine a region of interest according to the original image, determine a target detection region in the M*N detection regions through the region of interest, and use the spectral color data corresponding to the target detection region as the target spectral color data.
  • the region of interest refers to the area where specific image content is located in the original image.
  • the region of interest corresponding to the original image may be the face area in the original image; assuming that the original image captures a tree, then the region of interest corresponding to the original image may be the image area corresponding to the tree.
  • This example embodiment does not impose any special limitation on the selection or setting of the region of interest.
  • the target detection area refers to the overlapping area of the region of interest in the original image and the multiple detection areas corresponding to the spectral sensing device, that is, the target detection area is obtained by taking the intersection of the region of interest in the original image and the multiple detection areas corresponding to the spectral sensing device.
  • by determining the region of interest of the original image and selecting only the spectral color data needed for it, instead of applying all of the spectral color data collected by the spectral sensing device to color restoration, the amount of calculation in the color restoration process can be effectively reduced and the calculation efficiency improved.
  • moreover, performing color restoration with the spectral color data of the target detection area enhances the color of the local region of interest in the original image, improves the color expressiveness of the region of interest, makes the image content of that region in the output target image more vivid, and improves the viewing experience.
  • FIG. 6 schematically shows a principle diagram of an image region segmentation module in an exemplary embodiment of the present disclosure.
  • the image sensor 221 captures the original image 610, and the original image 610 can be input into a central processing unit, an image signal processor or other processor with computing capability to determine the region of interest 620 in the original image 610.
  • the original image 610 can be input into the central processing unit, and the region of interest 620 can be determined by an edge detection algorithm.
  • the original image 610 can also be input into an artificial intelligence processor, and the region of interest 620 can be determined by a pre-trained region of interest extraction model based on deep learning. This example embodiment does not specifically limit this.
  • the region of interest 620 corresponding to the original image 610 and the spectral color data of the multiple detection regions 630 corresponding to the spectral sensing device 210 can be input into the image region segmentation module 510 to determine the overlapping region 640 between the region of interest 620 and the multiple detection regions 630; the overlapping region 640 is used as the target detection region, and the spectral color data corresponding to the target detection region is used as the target spectral color data.
  • the amount of calculation in the color restoration process can be effectively reduced and the calculation efficiency can be improved.
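  • A minimal sketch of the intersection step performed by the image region segmentation module follows, assuming the region of interest is available as a pixel-space bounding box and that the M*N detection areas tile the frame uniformly; these assumptions and the function name are illustrative rather than taken from the disclosure. The spectral color data of the areas flagged in the mask would then serve as the target spectral color data.

```python
import numpy as np


def target_detection_mask(roi, image_size, grid=(10, 10)):
    """Return an (M, N) boolean mask of detection areas that overlap the region of interest.

    roi        : (x0, y0, x1, y1) bounding box of the region of interest, in pixels.
    image_size : (width, height) of the original image.
    grid       : (M, N) layout of the detection areas, assumed to tile the frame equally.
    """
    x0, y0, x1, y1 = roi
    width, height = image_size
    m, n = grid
    mask = np.zeros((m, n), dtype=bool)
    for i in range(m):
        for j in range(n):
            # Pixel extent of detection area (i, j) under the equal-tiling assumption.
            ax0, ax1 = j * width / n, (j + 1) * width / n
            ay0, ay1 = i * height / m, (i + 1) * height / m
            # Axis-aligned overlap test between the ROI box and this detection area.
            mask[i, j] = (ax0 < x1) and (ax1 > x0) and (ay0 < y1) and (ay1 > y0)
    return mask


mask = target_detection_mask(roi=(200, 120, 420, 360), image_size=(640, 480))
```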
  • the image processing system in this embodiment does not change the system architecture of the already mature image signal processing pipeline; instead, it combines the spectral color data of the multiple detection areas 630 corresponding to the spectral sensing device 210 with that existing architecture.
  • FIG. 7 schematically shows an architectural diagram of another image processing system in an exemplary embodiment of the present disclosure.
  • the image signal processor 222 may include an RGB domain processing module 710 and a YUV domain processing module 720.
  • the RGB domain processing module 710 may include a Raw RGB domain processing unit 810 and a Full RGB domain processing unit 820.
  • the Raw RGB domain, the Full RGB domain, and the YUV domain refer to the original image at different formats or stages of the image signal processing pipeline.
  • data processed by processes such as Dead Pixel Concealment, Black Level Compensation, Lens Shading Correction, Anti-aliasing Noise filter, Automatic White Balance Gain Control, and CFA Interpolation can be considered as the original image corresponding to the Raw RGB domain 910;
  • data processed by processes such as Gamma Correction, Color Correction, and Color space conversion can be considered as the original image corresponding to the Full RGB domain 920;
  • data processed by stages such as the Chroma noise filter, Hue Saturation Control, the Luma noise filter, Edge Enhancement, Contrast Brightness Control, and the Data formatter can be considered the original image corresponding to the YUV domain 930.
  • a Raw RGB domain original image 1000 can be collected by an image sensor based on a Bayer filter pattern.
  • the Raw RGB domain original image 1000 contains the color filter arrays 1010 of the different channels (the R, G, and B color filter arrays); the color filter arrays 1010 of the different channels can then be interpolated (for example, by a Demosaic algorithm) to obtain the global color arrays 1020 of the different channels (the R, G, and B global color arrays), from which the Full RGB domain original image 1030 is obtained.
  • the Full RGB domain original image 1030 can be further converted to a color space to obtain a YUV domain original image.
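  • The Raw RGB → Full RGB → YUV progression can be sketched as follows, using a simplified bilinear (normalized-convolution) demosaic of an RGGB mosaic and a BT.601 RGB-to-YUV matrix. The disclosure only names "a Demosaic algorithm" and "color space conversion", so the mosaic layout, interpolation kernel, and conversion matrix here are assumptions for illustration.

```python
import numpy as np
from scipy.ndimage import convolve


def demosaic_rggb(mosaic: np.ndarray) -> np.ndarray:
    """Very simplified bilinear demosaic of an RGGB Bayer mosaic: (H, W) -> (H, W, 3)."""
    h, w = mosaic.shape
    r_mask = np.zeros((h, w)); r_mask[0::2, 0::2] = 1
    g_mask = np.zeros((h, w)); g_mask[0::2, 1::2] = 1; g_mask[1::2, 0::2] = 1
    b_mask = np.zeros((h, w)); b_mask[1::2, 1::2] = 1
    kernel = np.array([[1, 2, 1], [2, 4, 2], [1, 2, 1]], dtype=float)
    channels = []
    for mask in (r_mask, g_mask, b_mask):
        # Normalized convolution: interpolate each channel from its sampled positions.
        num = convolve(mosaic * mask, kernel, mode='mirror')
        den = convolve(mask, kernel, mode='mirror')
        channels.append(num / den)
    return np.stack(channels, axis=-1)


def rgb_to_yuv(rgb: np.ndarray) -> np.ndarray:
    """BT.601 full-range RGB -> YUV conversion of an (H, W, 3) image."""
    m = np.array([[ 0.299,    0.587,    0.114],
                  [-0.14713, -0.28886,  0.436],
                  [ 0.615,   -0.51499, -0.10001]])
    return rgb @ m.T


full_rgb = demosaic_rggb(np.random.rand(480, 640))   # Full RGB domain image
yuv = rgb_to_yuv(full_rgb)                           # YUV domain image
```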
  • the image signal processor 222 may include an RGB domain processing module 710.
  • the RGB domain processing module 710 may include a Raw RGB domain processing unit 810.
  • the Raw RGB domain processing unit 810 may be used to:
  • the original image is divided into a plurality of local image areas, and the local image areas can be arranged in a K*L array, wherein K and L are both positive integers greater than 1, K is greater than or equal to M, and L is greater than or equal to N; the original image having K*L local image areas is subjected to color restoration processing according to the target spectral color data to obtain a target image.
  • the Raw RGB domain processing unit 810 may include a white balance gain subunit 1110 and a first tone mapping subunit 1120, wherein: the white balance gain subunit 1110 may be used to perform white balance processing on the original image having K*L local image areas according to the target spectral color data, and determine a white balance gain matrix; the first tone mapping subunit 1120 may be used to perform tone mapping processing on the original image having K*L local image areas according to the target spectral color data, and determine a first tone mapping gain function.
  • the Raw RGB domain processing unit 810 can perform color restoration on the original image through the white balance gain matrix and the first tone mapping gain function to obtain a target image.
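  • A sketch of how the white balance gain subunit's per-region gains might be applied over a K*L grid of local image areas is shown below. Deriving the gains from per-area white points and the nearest-neighbour upsampling are simplifications assumed here; the disclosure only states that a white balance gain matrix and a first tone mapping gain function are determined from the target spectral color data. A tone mapping gain function could be applied in the same per-region fashion, scaling luminance instead of the individual color channels.

```python
import numpy as np


def local_white_balance(raw_rgb: np.ndarray, area_white_points: np.ndarray) -> np.ndarray:
    """Apply per-region white balance gains derived from the spectral detection areas.

    raw_rgb           : (H, W, 3) Raw RGB domain image in [0, 1].
    area_white_points : (K, L, 3) estimated white point per local image area,
                        e.g. derived from the target spectral color data.
    """
    k, l, _ = area_white_points.shape
    # Gain matrix: scale each area so its white point maps to neutral grey.
    gains = area_white_points.mean(axis=-1, keepdims=True) / area_white_points   # (K, L, 3)

    h, w, _ = raw_rgb.shape
    # Upsample per-area gains to per-pixel gains (nearest neighbour for brevity;
    # a real pipeline would interpolate smoothly to avoid visible region borders).
    rows = np.minimum((np.arange(h) * k) // h, k - 1)
    cols = np.minimum((np.arange(w) * l) // w, l - 1)
    per_pixel_gain = gains[rows[:, None], cols[None, :]]   # (H, W, 3)
    return np.clip(raw_rgb * per_pixel_gain, 0.0, 1.0)


balanced = local_white_balance(np.random.rand(480, 640, 3), np.random.rand(12, 12, 3))
```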
  • the image signal processor 222 may include an RGB domain processing module 710, and the RGB domain processing module 710 may include a Full RGB domain processing unit 820, which may be used to:
  • the original image is divided into a plurality of local image regions, and the local image regions can be arranged in an array of X*Y, wherein X and Y are both positive integers greater than 1, X is greater than or equal to M, and Y is greater than or equal to N; the original image having X*Y local image regions is subjected to color restoration processing according to the target spectral color data to obtain a target image.
  • the Full RGB domain processing unit 820 may include a color correction subunit 1210, a second tone mapping subunit 1220, and a 2D/3D lookup table subunit 1230, wherein:
  • the color correction subunit 1210 may be used to perform gamma correction processing on the original image having X*Y local image regions according to the target spectral color data, and determine a gamma correction curve;
  • the second tone mapping subunit 1220 may be configured to perform tone mapping processing on the original image having X*Y local image regions according to the target spectral color data, and determine a second tone mapping gain function;
  • the 2D/3D lookup table subunit 1230 may be configured to perform color mapping processing on the original image having X*Y local image regions according to the target spectral color data and the 2D/3D lookup table to determine corrected color data.
  • the Full RGB domain processing unit 820 can restore the color of the original image through the gamma correction curve, the second tone mapping gain function, and the corrected color data to obtain the target image.
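  • The Full RGB domain stages can be sketched as a gamma correction curve followed by a 3D lookup-table color mapping. The power-law gamma, the nearest-node LUT lookup, and the 17x17x17 identity table below are illustrative stand-ins; in the disclosure these would be determined per local image area from the target spectral color data.

```python
import numpy as np


def apply_gamma(img: np.ndarray, gamma: float = 2.2) -> np.ndarray:
    """Apply a simple power-law gamma correction curve to a Full RGB image in [0, 1]."""
    return np.power(np.clip(img, 0.0, 1.0), 1.0 / gamma)


def apply_3d_lut(img: np.ndarray, lut: np.ndarray) -> np.ndarray:
    """Map colors through a 3D lookup table (nearest-node lookup for brevity).

    img : (H, W, 3) image in [0, 1].
    lut : (S, S, S, 3) table giving the corrected color for each RGB grid node.
    """
    size = lut.shape[0]
    idx = np.clip(np.rint(img * (size - 1)).astype(int), 0, size - 1)
    return lut[idx[..., 0], idx[..., 1], idx[..., 2]]


# Identity LUT of size 17 as a stand-in for a table tuned from the spectral color data.
s = 17
grid = np.linspace(0.0, 1.0, s)
identity_lut = np.stack(np.meshgrid(grid, grid, grid, indexing='ij'), axis=-1)

out = apply_3d_lut(apply_gamma(np.random.rand(480, 640, 3)), identity_lut)
```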
  • the image signal processor 222 may include a YUV domain processing module 720, which may be used to:
  • the original image is divided into a plurality of local image regions, and the local image regions can be arranged in an array of E*F, wherein E and F are both positive integers greater than 1, E is greater than or equal to M, and F is greater than or equal to N; the original image having E*F local image regions is subjected to color restoration processing according to the target spectral color data to obtain a target image.
  • the YUV domain processing module 720 may include a third tone mapping subunit 1310 and a color adjustment enhancement subunit 1320, wherein:
  • the third tone mapping subunit 1310 may be configured to perform tone mapping processing on the original image having E*F local image regions according to the target spectral color data, and determine a third tone mapping gain function;
  • the color adjustment enhancement subunit 1320 can be used to perform color adjustment and enhancement on the original image having E*F local image areas according to the target spectral color data, and determine color adjustment enhancement parameters, where the color adjustment enhancement parameters may include contrast adjustment parameters, brightness adjustment parameters, saturation adjustment parameters, edge enhancement parameters, etc., and this example embodiment does not make any special limitations on this.
  • the YUV domain processing module 720 may perform color restoration on the original image according to the third tone mapping gain function and the color adjustment enhancement parameter to obtain a target image.
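  • A compact sketch of the YUV domain adjustments: contrast and brightness applied to the Y channel and saturation scaling applied to the U/V channels. The parameter values and value ranges are assumptions; edge enhancement and the third tone mapping gain function are omitted for brevity.

```python
import numpy as np


def adjust_yuv(yuv, contrast=1.1, brightness=0.0, saturation=1.2):
    """Apply contrast/brightness to Y and saturation scaling to U/V.

    yuv : (H, W, 3) array with Y in [0, 1] and U/V roughly in [-0.5, 0.5].
    """
    out = yuv.astype(float).copy()
    out[..., 0] = np.clip((out[..., 0] - 0.5) * contrast + 0.5 + brightness, 0.0, 1.0)
    out[..., 1:] = np.clip(out[..., 1:] * saturation, -0.5, 0.5)
    return out


yuv = np.dstack([np.random.rand(480, 640), np.random.rand(480, 640, 2) - 0.5])
enhanced = adjust_yuv(yuv)
```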
  • the K*L array, X*Y array, and E*F array in the embodiments of the present disclosure are grids divided when processing data in the Raw RGB domain, the Full RGB domain, and the YUV domain, respectively.
  • the sizes of the K*L array, X*Y array, and E*F array may be consistent or inconsistent, and may be customized according to the actual application situation. This example embodiment does not make any special limitation on this.
  • the image signal processor can also be used to perform time domain filtering processing on the original image after color restoration at different times based on an infinite impulse response (IIR) filter to obtain a target image.
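  • A first-order IIR temporal filter over successive color-restored frames might look like the following sketch; the blending weight alpha is an assumed value, not one given in the disclosure. Blending each frame with the filtered history suppresses frame-to-frame color discontinuities and region-edge artifacts introduced by the local, per-region processing.

```python
import numpy as np


class TemporalIIRFilter:
    """First-order IIR filter blending the current color-restored frame with history."""

    def __init__(self, alpha: float = 0.8):
        self.alpha = alpha      # weight of the current frame
        self.state = None       # filtered history

    def __call__(self, frame: np.ndarray) -> np.ndarray:
        if self.state is None:
            self.state = frame.astype(float)
        else:
            self.state = self.alpha * frame + (1.0 - self.alpha) * self.state
        return self.state


smooth = TemporalIIRFilter(alpha=0.8)
for _ in range(3):
    target_image = smooth(np.random.rand(480, 640, 3))   # temporally filtered target image
```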
  • a spectral sensing device may be included, the spectral sensing device is used to collect target spectral color data, and a camera module, the camera module may include an image sensor and an image signal processor, wherein the image sensor may be used to generate an original image, and the image signal processor may be electrically connected to the image sensor and the spectral sensing device, and is used to perform color restoration on the original image according to the target spectral color data to obtain a target image.
  • on the one hand, the target spectral color data obtained by the spectral sensing device can perceive richer color information about the current scene than the original image generated by the image sensor with its fewer color channels, and performing color restoration on the original image in combination with the target spectral color data effectively improves the accuracy of the color reproduction and the color expressiveness of the target image;
  • on the other hand, because the target spectral color data is collected by an independent spectral sensing device and is combined, inside the image signal processor, with the already mature image-signal-processing architecture for color restoration, the mature architecture does not need to be adjusted, which effectively improves color-restoration efficiency and reduces optimization cost.
  • FIG. 14 shows a flowchart of an image processing method in this exemplary embodiment, which includes the following steps S1410 to S1430:
  • Step S1410 collecting target spectral color data
  • Step S1420 obtaining an original image
  • Step S1430 performing color restoration on the original image according to the target spectral color data to obtain a target image.
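  • Tying steps S1410 to S1430 together, a minimal sketch of the method might look as follows; the function names and the lambda stand-ins are purely illustrative and are not taken from the disclosure.

```python
import numpy as np


def image_processing_method(collect_spectral, capture_raw, restore_color):
    """Steps S1410-S1430: collect spectral data, obtain the original image, restore color."""
    target_spectral_color_data = collect_spectral()                   # step S1410
    original_image = capture_raw()                                    # step S1420
    return restore_color(original_image, target_spectral_color_data)  # step S1430


target_image = image_processing_method(
    collect_spectral=lambda: np.random.rand(10, 10, 8),
    capture_raw=lambda: np.random.rand(480, 640, 3),
    restore_color=lambda img, spec: np.clip(img, 0.0, 1.0),
)
```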
  • the target spectral color data may include a plurality of detection areas
  • the original image may include a plurality of image areas
  • the detection areas correspond to the image areas
  • the electronic device 1500 may include: a housing 1510, a main circuit board 1520, and an image processing system 1530.
  • the main circuit board 1520 is located in the housing 1510.
  • the image processing system 1530 may be an image processing system as in the embodiment of the present disclosure.
  • the image processing system 1530 may be electrically connected to the main circuit board 1520 to achieve signal or data transmission between the image processing system 1530 and the main circuit board 1520.
  • the image processing system 1530 may be disposed on the main circuit board 1520, or may be connected to the main circuit board 1520 through a medium such as a flexible circuit board to form an electrical connection, and to supply power to the image processing system 1530 or transmit a signal.
  • the mobile terminal 1600 may specifically include: a processor 1601, a memory 1602, a bus 1603, a mobile communication module 1604, an antenna 1, a wireless communication module 1605, an antenna 2, a display screen 1606, a camera module 1607, an audio module 1608, a power module 1609 and a sensor module 1610.
  • Processor 1601 may include one or more processing units.
  • processor 1601 may include an application processor AP, a modem processor, a graphics processor GPU, an ISP processor, a controller, an encoder, a decoder, a DSP (Digital Signal Processor), a baseband processor and/or a neural network processor NPU, etc.
  • the encoder can encode (i.e. compress) an image or video to reduce the data size for easy storage or transmission.
  • the decoder can decode (i.e. decompress) the encoded data of the image or video to restore the image or video data.
  • the mobile terminal 1600 can support one or more encoders and decoders, for example: image formats such as JPEG (Joint Photographic Experts Group), PNG (Portable Network Graphics), and BMP (Bitmap), and video formats such as MPEG-1 and MPEG-2 (Moving Picture Experts Group), H.263, H.264, and HEVC (High Efficiency Video Coding).
  • the processor 1601 may be connected to the memory 1602 or other components via a bus 1603 .
  • the memory 1602 may be used to store computer executable program codes, which may include instructions.
  • the processor 1601 executes various functional applications and data processing of the mobile terminal 1600 by running the instructions stored in the memory 1602.
  • the memory 1602 may also store application data, such as images, videos, and other files.
  • the communication function of the mobile terminal 1600 can be implemented by the mobile communication module 1604, antenna 1, wireless communication module 1605, antenna 2, modulation and demodulation processor and baseband processor. Antenna 1 and antenna 2 are used to transmit and receive electromagnetic wave signals.
  • the mobile communication module 1604 can provide 3G, 4G, 5G and other mobile communication solutions applied to the mobile terminal 1600.
  • the wireless communication module 1605 can provide wireless communication solutions such as wireless LAN, Bluetooth, and near field communication applied to the mobile terminal 1600.
  • the display screen 1606 is used to implement display functions, such as displaying a user interface, images, videos, etc.
  • the camera module 1607 is used to implement shooting functions, such as shooting images, videos, etc.
  • the audio module 1608 is used to implement audio functions, such as playing audio, collecting voice, etc.
  • the power module 1609 is used to implement power management functions, such as charging the battery, powering the device, monitoring the battery status, etc.
  • the sensor module 1610 may include one or more sensors for implementing corresponding sensing detection functions.
  • the sensor module 1610 may include an inertial sensor for detecting the motion posture of the mobile terminal 1600 and outputting inertial sensing data.
  • the exemplary embodiments of the present disclosure also provide a computer-readable storage medium on which a program product capable of implementing the above-mentioned method of the present specification is stored.
  • various aspects of the present disclosure may also be implemented in the form of a program product, which includes a program code, and when the program product is run on a terminal device, the program code is used to enable the terminal device to execute the steps according to various exemplary embodiments of the present disclosure described in the above-mentioned "Exemplary Method" section of the present specification.
  • the computer-readable medium shown in the present disclosure may be a computer-readable signal medium or a computer-readable storage medium or any combination of the above two.
  • the computer-readable storage medium may be, for example, but not limited to, an electrical, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the above.
  • Computer-readable storage media may include, but are not limited to: an electrical connection with one or more wires, a portable computer disk, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disk read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the above.
  • a computer-readable storage medium may be any tangible medium containing or storing a program that may be used by or in combination with an instruction execution system, apparatus, or device.
  • a computer-readable signal medium may include a data signal propagated in a baseband or as part of a carrier wave, which carries a computer-readable program code. Such propagated data signals may take a variety of forms, including but not limited to electromagnetic signals, optical signals, or any suitable combination of the above.
  • a computer-readable signal medium may also be any computer-readable medium other than a computer-readable storage medium, which may send, propagate, or transmit a program for use by or in combination with an instruction execution system, apparatus, or device.
  • the program code contained on the computer-readable medium may be transmitted using any suitable medium, including but not limited to: wireless, wire, optical cable, RF, etc., or any suitable combination of the above.
  • program code for performing the operations of the present disclosure may be written in any combination of one or more programming languages, including object-oriented programming languages such as Java, C++, etc., and conventional procedural programming languages such as "C" or similar programming languages.
  • the program code may be executed entirely on the user computing device, partially on the user device, as a separate software package, partially on the user computing device and partially on a remote computing device, or entirely on a remote computing device or server.
  • the remote computing device may be connected to the user computing device through any type of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computing device (e.g., via the Internet using an Internet service provider).

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Image Processing (AREA)

Abstract

An image processing system and method, a computer-readable medium, and an electronic device are provided, relating to the technical field of image processing. The system comprises: a spectral sensing device, configured to collect target spectral color data; and a camera module comprising an image sensor and an image signal processor, wherein the image sensor is configured to generate an original image, and the image signal processor is electrically connected to the image sensor and the spectral sensing device and is configured to perform color restoration on the original image according to the target spectral color data to obtain a target image. By introducing an independent spectral sensing device to collect the spectral color data of the current scene, richer color information is acquired than with an ordinary Bayer-filter-based image sensor; the original image is color-restored and enhanced using this spectral color data, which improves the color accuracy of the target image and enhances the color expressiveness of the image.

Description

图像处理系统及方法、计算机可读介质和电子设备
交叉引用
本公开要求于2022年08月03日提交的申请号为202210927465.6名称均为“图像处理系统及方法、计算机可读介质和电子设备”的中国专利申请的优先权,该中国专利申请的全部内容通过引用全部并入本文。
技术领域
本公开涉及图像处理技术领域,具体涉及一种图像处理系统、图像处理方法、计算机可读介质和电子设备。
背景技术
伴随着人们生活水平的不断提高,对于拍摄的图像质量的要求也越来越高。拜耳传感器(Bayer sensor)是一种可以将红,绿,蓝三种成份光信号在空间上区分开来单独统计光信号的传感器。
目前大多数相机方案采用的是拜耳传感器以及基于红绿蓝(RGB)颜色通道的组合所获得不同滤光样式组合的图像传感器,例如,基于RGBG的图像传感器。但是,这种方案中,由于相机所采用的色彩滤波仅限于红,绿,蓝三种光谱范围,复杂光源场景下的光谱感知信息有限,并且无法得到局部区域的色彩信息,因此最终输出图像的色彩还原的准确性较差。
发明内容
本公开的目的在于提供一种图像处理系统、图像处理方法、计算机可读介质和电子设备。
根据本公开的第一方面,提供一种图像处理系统,包括:
光谱感知装置,用于采集目标光谱色彩数据;
摄像模组,包括图像传感器和图像信号处理器,其中:
所述图像传感器,用于生成原始图像;
所述图像信号处理器,与所述图像传感器和所述光谱感知装置电连接,用于根据所述目标光谱色彩数据对所述原始图像进行色彩还原,得到目标图像。
根据本公开的第二方面,提供一种图像处理方法,包括:
采集目标光谱色彩数据;
获取原始图像;
根据目标光谱色彩数据对所述原始图像进行色彩还原,得到目标图像。
根据本公开的第三方面,提供一种计算机可读介质,其上存储有计算机程序,计算机程序被处理器执行时实现上述的方法。
根据本公开的第四方面,提供一种电子设备,其特征在于,包括:
壳体;
主电路板,设置在所述壳体内;
如第一方面所提供的图像处理系统,设置在所述壳体内,与所述主电路板电连接。
应当理解的是,以上的一般描述和后文的细节描述仅是示例性和解释性的,并不能限制本公开。
附图说明
此处的附图被并入说明书中并构成本说明书的一部分,示出了符合本公开的实施例,并与说明书一起用于解释本公开的原理。显而易见地,下面描述中的附图仅仅是本公开的一些实施例,对于本领域普通技术人员来讲,在不付出创造性劳动的前提下,还可以根据这些附图获得其他的附图。在附图中:
图1示出了相关技术方案中的一种图像信号处理流程的系统架构示意图;
图2示意性示出本公开示例性实施例中一种图像处理系统的架构示意图;
图3示意性示出本公开示例性实施例中一种光谱感知装置与电子设备设置方式的示意图;
图4示意性示出本公开示例性实施例中一种光谱传感器阵列采集光谱色彩数据的原理示意图;
图5示意性示出本公开示例性实施例中另一种图像处理系统的架构示意图;
图6示意性示出本公开示例性实施例中一种图像区域分割模块的原理示意图;
图7示意性示出本公开示例性实施例中再一种图像处理系统的架构示意图;
图8示意性示出本公开示例性实施例中一种RGB域处理模块的结构示意图;
图9示意性示出本公开示例性实施例中一种图像信号处理流水线的结构示意图;
图10示意性示出本公开示例性实施例中一种Raw RGB域图像与Full RGB域的区别的示意图;
图11示意性示出本公开示例性实施例中一种Raw RGB域处理模块的结构示意图;
图12示意性示出本公开示例性实施例中一种Full RGB域处理模块的结构示意图;
图13示意性示出本公开示例性实施例中一种YUV域处理模块的结构示意图;
图14示意性示出本公开示例性实施例中一种图像处理方法的流程示意图;
图15示出了可以应用于本公开实施例的一种电子设备的结构示意图;
图16示出了可以应用于本公开实施例的另一种电子设备的结构示意图。
具体实施方式
现在将参考附图更全面地描述示例实施方式。然而,示例实施方式能够以多种形式实施,且不应被理解为限于在此阐述的范例;相反,提供这些实施方式使得本公开将更加全面和完整,并将示例实施方式的构思全面地传达给本领域的技术人员。所描述的特征、结构或特性可以以任何合适的方式结合在一个或更多实施方式中。
此外,附图仅为本公开的示意性图解,并非一定是按比例绘制。图中相同的附图标记表示相同或类似的部分,因而将省略对它们的重复描述。附图中所示的一些方框图是功能实体,不一定必须与物理或逻辑上独立的实体相对应。可以采用软件形式来实现这些功能实体,或在一个或多个硬件模块或集成电路中实现这些功能实体,或在不同网络和/或处理器装置和/或微控制器装置中实现这些功能实体。
图1示出了相关技术方案中的一种图像信号处理流程的系统架构示意图。
参考图1所示,一种图像信号处理流程的系统架构可以包括图像传感器110、RGB域处理模块120、YUV域处理模块130以及JPG输出模块140。其中:
图像传感器110是指基于拜耳滤光样式的模板实现光信号采集,并将光信号转换为电信号的传感器,例如,图像传感器110可以是互补性金属氧化物半导体(Complementary Metal Oxide Semiconductor,CMOS),也可以是电荷耦合探测元件(Charge-coupled Device,CCD)。
RGB域处理模块120是指对于图像传感器110采集的原始图像在RGB域上进行图像信号处理的所有流程对应的模块。
YUV域处理模块130是指将RGB域处理模块120的输出结果转换到YUV颜色空间(YUV域),并在YUV颜色空间上进行图像信号处理的所有流程对应的模块。
JPG输出模块140是指将RGB域处理模块120的处理结果以及YUV域处理模块130的处理结果,映射到原始图像中,并将处理后的原始图像转换为最终可以显示到显示单元如JPG格式图片的处理模块。
但是,由于该方案中图像传感器110采用的是基于拜耳滤镜以及基于RGB颜色通道的组合所获得不同滤光样式组合的传感器,基于RGB三通道的传感器所采用的色彩滤波仅限于红,绿,蓝(R,G,B)三种光谱范围,复杂光源场景下的光谱感知信息有限;同时对于系统结构中需要用到当前色温,以及反应当前场景颜色信息,而基于全局RGB的方式存在很多局限性,无法获得局部区域的色彩信息,导致最终输出的图像的色彩的准确性较差,无法较好还原真实场景中的色彩。
另外,一些技术方案中,虽然有采用到基于红绿蓝青品红黄(RGBCMY)以及红黄蓝(RYYB)等非常规颜色通道的传感器,用于弥补当前图像传感器仅能够采集RGB三通道色彩的缺陷,但是,由于红绿蓝青品红黄(RGBCMY)以及红黄蓝(RYYB)等非常规颜色通道的传感器的应用较少,需要在图像信号处理流程(ISP)的系统架构以及单点算法层面上需要针对该类型的色彩滤波模板做进一步的支持,工作量较大,应用成本较高。
基于相关方案中存在的一个或者多个问题,本公开首先提供了一种图像处理系统,图像处理系统可以设置在具有一定计算能力的电子设备中,例如,电子设备可以是台式计算机、便携式计算机、智能手机、智能机器人、可穿戴设备和平板电脑等,本示例实施例对此不做特殊限定。
图2示意性示出本公开示例性实施例中一种图像处理系统的架构示意图。参考图2所示,图像处理系统200可以包括:
光谱感知装置210可以用于采集目标光谱色彩数据;
摄像模组220可以包括图像传感器221和图像信号处理器222,其中:
图像传感器221可以用于生成原始图像;
图像信号处理器222可以与图像传感器221和光谱感知装置210电连接,用于根据光谱感知装置210采集的目标光谱色彩数据对图像传感器221输出的原始图像进行色彩还原,得到目标图像。
其中,光谱感知装置210是指用于感知拍摄原始图像时的当前场景中真实色彩的感知单元,例如,光谱感知装置210可以是多光谱传感器(Multispectral sensor),多光谱传感器可以包括成像光学元件和划分光谱的光学元件,能够准确识别当前场景中所有的光谱色彩信息;当然,光谱感知装置210也可以是CCD直读光谱仪,本示例实施例对光谱感知装置的类型不做特殊限定。
图3示意性示出本公开示例性实施例中一种光谱感知装置与电子设备设置方式的示意图。
参考图3所示,以光谱感知装置210是多光谱传感器310为例,可以将多光谱传感器310集成到摄像模组220中,也可以独立于摄像模组220设置到电子设备中,例如,可以设置在电子设备的外壳320上,并且与摄像模组220的距离处于距离阈值之内;当然,光谱感知装置210也可以作为外接设备与摄像模组220对应的电子设备通信连接,例如,外接的光谱感知装置330可以通过有线或者无线的方式340与摄像模组220通信连接,本示例实施例对于光谱感知装置210的连接方式不做特殊限定。
继续参考图2所示,目标光谱色彩数据是指筛选的作用于原始图像的光谱色彩数据,例如,目标光谱色彩数据可以是光谱感知装置210采集的所有光谱色彩数据,也可以是光谱感知装置210采集的所有光谱色彩数据中与原始图像的前景区域对应的光谱色彩数据,当然,还可以是以其他方式筛选的作用于原始图像的光谱色彩数据,本示例实施例对目标光谱色彩数据的筛选方式不做特殊限定。
图像传感器221是指实现光信号采集,并将光信号转换为电信号的传感器,例如,图像传感器221可以是互补性金属氧化物半导体CMOS,也可以是电荷耦合探测元件CCD,本示例实施例对图像传感器221的传感器类型不做任何特殊限定。可选的,图像传感器221可以是基于拜耳滤光样式的图像传感器,如基于RGBG滤光样式、RGBW滤光样式的图像传感器,当然,也可以是基于非拜耳滤光样式的图像传感器,如基于RYYB滤光样式的图像传感器,本实施例不以此为限。
原始图像是指通过图像传感器选择光信号并转换得到的图像数据,例如,若原始图像是基于RGBG滤光样式的图像传感器采集的图像数据,那么原始图像可以是RGBG图像;若原始图像是基于RYYB滤光样式的图像传感器采集的图像数据,那么原始图像可以是RYYB图像,本示例实施例不以此为限。
图像信号处理器是指将图像传感器采集的原始图像加工成可以输出到显示器展示的图像数据的处理器,例如,图像信号处理器可以是ISP(Image Signal Processor)处理器,也可以是应用处理器(Application Processor,AP)、GPU图形处理器(Graphics Processing Unit,GPU)等,本示例实施例对此不做限定,为便于理解,下面均以图像信号处理器为ISP处理器进行说明。
在一示例实施例中,光谱感知装置可以包括多个光谱传感器,多个光谱传感器可以按照M*N的阵列排布构成光谱传感器阵列,其中,M和N均为大于1的正整数。
图4示意性示出本公开示例性实施例中一种光谱传感器阵列采集光谱色彩数据的原理示意图。
参考图4所示,光谱感知装置210可以包括多个光谱传感器410,光谱传感器410在空间上按照预设列数和行数进行排列,排列得到光谱传感器阵列,具体的,多个光谱传感器410可以按照M*N的阵列排布构成光谱传感器阵列420,例如,可以预先设置M为10,可以设置N为10,那么光谱感知装置210可以包括100个光谱传感器410,对这100个光谱传感器410在空间上进行排列,得到10*10的光谱传感器阵列。当然,具体的M和N可以根据实际应用场景进行自定义设置,本示例实施例对此不做特殊限定。
具体的,光谱感知装置210可以用于,通过光谱传感器阵列420采集得到具有M*N个检测区域430的光谱色彩数据440,并将该M*N个局部检测区域的光谱色彩数据440作为目标光谱色彩数据。
继续参考图4所示,在拍摄原始图像时,在空间上具有M*N光谱传感器阵列420结构的光谱感知装置210,采集当前场景中的真实色彩信息,得到具有M*N个检测区域430的光谱色彩数据440,各检测区域430内的光谱色彩数据能够采集所对应的真实场景区域中的光谱色彩数据,相比于基于RGB通道的图像传感器,能够得到真实场景中更加丰富的色彩信息,感知更大范围的光谱信息,有效补充图像传感器的色彩信息,提升输出的目标图像的色彩还原的准确性,并且通过各检测区域中的光谱色彩数据,能够有效实现对目标图像的局部区域色彩增强,保证目标图像的色彩表现力。
在一示例实施例中,参考图5所示,图像处理系统200可以包括图像区域分割模块510,该图像区域分割模块510可以设置在电子设备的中央处理器(Central Processing Unit,CPU)、神经网络处理器(Neural-Network Processing Unit,NPU)等处理器中,也可以设置在摄像模组的图像信号处理器中,本示例实施例对此不做特殊限定。
具体的,图像区域分割模块510可以用于:根据原始图像确定感兴趣区域,通过感兴趣区域在M*N个检测区域中确定目标检测区域,并将目标检测区域对应的光谱色彩数据作为目标光谱色彩数据。
其中,感兴趣区域是指原始图像中特定图像内容所在的区域,例如,假设原始图像采集的是人脸内容,那么原始图像对应的感兴趣区域可以是原始图像中的人脸区域;假设原始图像采集的是一颗树,那么原始图像对应的感兴趣区域可以是这棵树对应的图像区域,本示例实施例对于感兴趣区域的选取或者设定不做特殊限定。
目标检测区域是指原始图像中的感兴趣区域与光谱感知装置对应的多个检测区域的重叠区域,即对原始图像中的感兴趣区域与光谱感知装置对应的多个检测区域取交集,得到目标检测区域。
通过确定原始图像对应的感兴趣区域,并根据感兴趣区域筛选需要使用的光谱色彩数据对原始图像进行色彩还原,而不是将光谱感知装置采集到的所有光谱色彩数据均应用于色彩还原,可以有效降低色彩还原过程中的计算量,提升计算效率,并且筛选目标检测区域对应的光谱色彩数据对原始图像进行色彩还原,能够实现对原始图像局部感兴趣区域的色彩增强,提升图像中感兴趣区域的色彩表现力,使输出的目标图像中的感兴趣区域的图像内容更加鲜艳,提升用户观赏体验。
图6示意性示出本公开示例性实施例中一种图像区域分割模块的原理示意图。
参考图6所示,图像传感器221采集到原始图像610,可以将原始图像610输入到中央处理器、图像信号处理器或者其他具有计算能力的处理器中,确定原始图像610中的感兴趣区域620,例如,可以将原始图像610输入到中央处理器中,通过边缘检测算法确定感兴趣区域620,也可以将原始图像610输入到人工智能处理器中,通过预先训练的基于深度学习的感兴趣区域提取模型确定感兴趣区域620,本示例实施例对此不做特殊限定。
可以将原始图像610对应的感兴趣区域620,与光谱感知装置210对应的多个检测区域630的光谱色彩数据输入到图像区域分割模块510中,确定感兴趣区域620与多个检测区域630的光谱色彩数据的重叠区域640,并将重叠区域640作为目标检测区域,并将目标检测区域对应的光谱色彩数据作为目标光谱色彩数据。
通过确定原始图像对应的感兴趣区域,并根据感兴趣区域筛选需要使用的光谱色彩数据对原始图像进行色彩还原,而不是将光谱感知装置采集到的所有光谱色彩数据均应用于色彩还原,可以有效降低色彩还原过程中的计算量,提升计算效率。
在一示例实施例中,本实施例中的图像处理系统不改变已经成熟的图像信号处理流程的系统架构,即本实施例中的图像处理系统是将光谱感知装置210对应的多个检测区域630的光谱色彩数据,在已经成熟的图像信号处理流程的系统架构上进行结合。
图7示意性示出本公开示例性实施例中再一种图像处理系统的架构示意图,参考图7所示,图像信号处理器222可以包括RGB域处理模块710和YUV域处理模块720,具体的,参考图8所示,RGB域处理模块710可以包括Raw RGB域处理单元810以及Full RGB域处理单元820。
可以理解的是,Raw RGB域、Full RGB域以及YUV域分别是指在图像信号处理流水线中对不同格式或者阶段的原始图像的区分,举例而言,参考图9所示,对于将原始图像转换为最终的JPG输出图像的成熟的图像信号处理流水线900,其中,可以将坏点隐藏(Dead Pixel Concealment)、黑电平补偿(Black Level Compensation)、镜头阴影校正(Lens Shading Correction)、抗混叠噪声滤波器(Anti-aliasing Noise filter)、自动白平衡增益控制(AWB Gain control)、CFA插值(CFA Interpolation)等流程处理的数据认为是Raw RGB域910对应的原始图像;可以将伽马校正(Gamma Correction)、色彩校正(Color Correction)、色彩空间转换(Color space conversion)等流程处理的数据认为是Full RGB域920对应的原始图像;可以将色度噪声滤波器(Noise filter for Chroma)、色度饱和度控制(Hue Saturation Control)、Luma噪声滤波器(Noise filter for  Luma)、边缘增强(Edge Enhancement)、对比度亮度控制(Contrast Brightness Control)、数据格式化(Data formatter)等流程处理的数据认为是YUV域930对应的原始图像。
便于理解的,参考图10所示,可以通过基于拜耳滤光样式的图像传感器采集到Raw RGB域原始图像1000,此时的Raw RGB域原始图像1000可以包含不同通道的颜色滤波阵列1010(R颜色滤波阵列、G颜色滤波阵列、B颜色滤波阵列);此时可以通过对不同通道的颜色滤波阵列1010进行插值(如可以通过反马赛克Demosaic算法对颜色滤波阵列1010进行插值),得到不同通道的全局颜色阵列1020(R全局颜色阵列、G全局颜色阵列、B全局颜色阵列),进而可以通过不同通道的全局颜色阵列1020得到Full RGB域原始图像1030。可以继续对Full RGB域原始图像1030进行颜色空间转换,得到YUV域的原始图像。
具体的,继续参考图7和图8所示,图像信号处理器222可以包括RGB域处理模块710,具体的,RGB域处理模块710可以包括Raw RGB域处理单元810,该Raw RGB域处理单元810可以用于:
将原始图像划分为多个局部图像区域,局部图像区域可以按照K*L的阵列排布,其中,K和L均为大于1的正整数,K大于或者等于M,且L大于或者等于N;根据目标光谱色彩数据对具有K*L个局部图像区域的原始图像进行色彩还原处理,得到目标图像。
进一步的,参考图11所示,Raw RGB域处理单元810可以包括白平衡增益子单元1110以及第一色调映射子单元1120,其中:白平衡增益子单元1110可以用于根据目标光谱色彩数据对具有K*L个局部图像区域的原始图像进行白平衡处理,确定白平衡增益矩阵;第一色调映射子单元1120可以用于根据目标光谱色彩数据对具有K*L个局部图像区域的原始图像进行色调映射处理,确定第一色调映射增益函数。
在确定白平衡增益矩阵和第一色调映射增益函数之后,Raw RGB域处理单元810可以通过白平衡增益矩阵以及第一色调映射增益函数对原始图像进行色彩还原,得到目标图像。
在一示例实施例中,图像信号处理器222可以包括RGB域处理模块710,RGB域处理模块710可以包括Full RGB域处理单元820,该Full RGB域处理单元820可以用于:
将原始图像划分为多个局部图像区域,局部图像区域可以按照X*Y的阵列排布,其中,X和Y均为大于1的正整数,X大于或者等于M,且Y大于或者等于N;根据目标光谱色彩数据对具有X*Y个局部图像区域的原始图像进行色彩还原处理,得到目标图像。
具体的,参考图12所示,Full RGB域处理单元820可以包括颜色校正子单元1210、第二色调映射子单元1220和2D/3D查找表子单元1230,其中:
颜色校正子单元1210可以用于根据所述目标光谱色彩数据对具有X*Y个局部图像区域的原始图像进行伽马校正处理,确定伽马校正曲线;
第二色调映射子单元1220可以用于根据所述目标光谱色彩数据对具有X*Y个局部图像区域的原始图像进行色调映射处理,确定第二色调映射增益函数;
2D/3D查找表子单元1230可以用于根据所述目标光谱色彩数据以及2D/3D查找表,对具有X*Y个局部图像区域的原始图像进行色彩映射处理,确定校正色彩数据。
在确定伽马校正曲线、第二色调映射增益函数以及校正色彩数据之后,Full RGB域处理单元820可以通过伽马校正曲线、第二色调映射增益函数以及校正色彩数据对原始图像进行色彩还原,得到目标图像。
在一示例实施例中,图像信号处理器222可以包括YUV域处理模块720,该YUV域处理模块720可以用于:
将原始图像划分为多个局部图像区域,局部图像区域可以按照E*F的阵列排布,其中,E和F均为大于1的正整数,E大于或者等于M,且F大于或者等于N;根据目标光谱色彩数据对具有E*F个局部图像区域的原始图像进行色彩还原处理,得到目标图像。
具体的,参考图13所示,YUV域处理模块720可以包括第三色调映射子单元1310和色彩调整增强子单元1320,其中:
第三色调映射子单元1310可以用于根据所述目标光谱色彩数据对具有E*F个局部图像区域的原始图像进行色调映射处理,确定第三色调映射增益函数;
色彩调整增强子单元1320可以用于根据所述目标光谱色彩数据对具有E*F个局部图像区域的原始图像进行色彩调整增强,确定色彩调整增强参数,其中色彩调整增强参数可以包括对比度调整参数、亮度调整参数、饱和度调整参数和边缘增强参数等,本示例实施例对此不做特殊限定。
在确定第三色调映射增益函数以及色彩调整增强参数之后,YUV域处理模块720可以根据第三色调映射增益函数和色彩调整增强参数对原始图像进行色彩还原,得到目标图像。
需要说明的是,本公开实施例中的K*L阵列、X*Y阵列、E*F阵列分别为在Raw RGB域处理数据、Full RGB域处理数据、YUV域处理数据时划分的网格,K*L阵列、X*Y阵列、E*F阵列的尺寸可以是一致的,也可以是不一致的,具体可以根据实际应用情况进行自定义设置,本示例实施例对此不做特殊限定。
在一示例实施例中,图像信号处理器还可以用于基于无限脉冲响应IIR滤波器对不同时刻色彩还原后的原始图像进行时域滤波处理,得到目标图像。通过对不同时刻色彩还原后的原始图像进行时域滤波处理,能够降低由于对图像局部处理而导致的区域边缘或者前后图像色彩不连续的问题,提升拍摄视频或者实时预览时的图像表现连续性,提升用户体验。
综上所述,本示例性实施方式中,可以包括光谱感知装置,光谱感知装置用于采集目标光谱色彩数据,以及摄像模组,摄像模组可以包括图像传感器和图像信号处理器,其中图像传感器可以用于生成原始图像,图像信号处理器可以与图像传感器和光谱感知装置电连接,用于根据目标光谱色彩数据对原始图像进行色彩还原,得到目标图像。一方面,通过光谱感知装置获取目标光谱色彩数据,相比于图像传感器生成的颜色通道较少的原始图像,能够感知当前场景中更加丰富的色彩信息,并结合目标光谱色彩数据对原始图像进行色彩还原,能够有效提升目标图像的色彩表达的准确性,提高目标图像的色彩表现力;另一方面,通过独立的光谱感知装置采集目标光谱色彩数据,并将目标光谱色彩数据与图像信号处理器中的已经成熟的图像信号处理流程的系统架构相结合进行色彩还原,并不需要调整已经成熟的图像信号处理流程的系统结构,有效提升色彩还原效率,降低优化成本。
需要注意的是,上述附图仅是根据本公开示例性实施例的方法所包括的处理的示意性说明,而不是限制目的。易于理解,上述附图所示的处理并不表明或限制这些处理的时间顺序。另外,也易于理解,这些处理可以是例如在多个模块中同步或异步执行的。
进一步的,本示例的实施方式中还提供一种图像处理方法,图14示出了本示例性实施方式中一种图像处理方法的流程示意图,包括以下步骤S1410至步骤S1430:
步骤S1410,采集目标光谱色彩数据;
步骤S1420,获取原始图像;
步骤S1430,根据目标光谱色彩数据对所述原始图像进行色彩还原,得到目标图像。
在一示例实施例中,目标光谱色彩数据可以包括多个检测区域,原始图像可以包括多个图像区域,且检测区域与图像区域相对应。
上述方法的具体细节在系统部分实施方式中已经详细说明,未披露的细节内容可以参见系统部分的实施方式内容,因而不再赘述。
所属技术领域的技术人员能够理解,本公开的各个方面可以实现为系统、方法或程序产品。因此,本公开的各个方面可以具体实现为以下形式,即:完全的硬件实施方式、完全的软件实施方式(包括固件、微代码等),或硬件和软件方面结合的实施方式,这里可以统称为“电路”、“模块”或“系统”。
本公开的示例性实施方式还提供一种电子设备,参考图15所示,电子设备1500可以包括:壳体1510,主电路板1520,图像处理系统1530。其中,主电路板1520位于壳体1510内。图像处理系统1530可以是如本公开实施例中的图像处理系统。图像处理系统1530可以与主电路板1520电连接,以实现图像处理系统1530可以与主电路板1520之间的信号或数据传输。例如,图像处理系统1530可以设置于主电路板1520上,或者通过柔性电路板等介质连接于主电路板1520,以形成电连接,给图像处理系统1530供电或者传输信号。
下面以图16中的移动终端1600为例,对该电子设备的构造进行示例性说明。本领域技术人员应当理解,除了特别用于移动目的的部件之外,图16中的构造也能够应用于固定类型的设备。
如图16所示,移动终端1600具体可以包括:处理器1601、存储器1602、总线1603、移动通信模块1604、天线1、无线通信模块1605、天线2、显示屏1606、摄像模块1607、音频模块1608、电源模块1609与传感器模块1610。
处理器1601可以包括一个或多个处理单元,例如:处理器1601可以包括应用处理器AP、调制解调处理器、图形处理器GPU、ISP处理器、控制器、编码器、解码器、DSP(Digital Signal Processor,数字信号处理器)、基带处理器和/或神经网络处理器NPU等。
编码器可以对图像或视频进行编码(即压缩),以减小数据大小,便于存储或发送。解码器可以对图像或视频的编码数据进行解码(即解压缩),以还原出图像或视频数据。移动终端1600可以支持一种或多种编码器和解码器,例如:JPEG(Joint Photographic Experts Group,联合图像专家组)、PNG(Portable Network Graphics,便携式网络图形)、BMP(Bitmap,位图)等图像格式,MPEG(Moving Picture Experts Group,动态图像专家组)1、MPEG10、H.1063、H.1064、HEVC(High Efficiency Video Coding,高效率视频编码)等视频格式。
处理器1601可以通过总线1603与存储器1602或其他部件形成连接。
存储器1602可以用于存储计算机可执行程序代码,可执行程序代码包括指令。处理器1601通过运行存储在存储器1602的指令,执行移动终端1600的各种功能应用以及数据处理。存储器1602还可以存储应用数据,例如存储图像,视频等文件。
移动终端1600的通信功能可以通过移动通信模块1604、天线1、无线通信模块1605、天线2、调制解调处理器以及基带处理器等实现。天线1和天线2用于发射和接收电磁波信号。移动通信模块1604可以提供应用在移动终端1600上3G、4G、5G等移动通信解决方案。无线通信模块1605可以提供应用在移动终端1600上的无线局域网、蓝牙、近场通信等无线通信解决方案。
显示屏1606用于实现显示功能,如显示用户界面、图像、视频等。摄像模块1607用于实现拍摄功能,如拍摄图像、视频等。音频模块1608用于实现音频功能,如播放音频,采集语音等。电源模块1609用于实现电源管理功能,如为电池充电、为设备供电、监测电池状态等。
传感器模块1610可以包括一种或多种传感器,用于实现相应的感应检测功能。例如,传感器模块1610可以包括惯性传感器,其用于检测移动终端1600的运动位姿,输出惯性传感数据。
本公开的示例性实施方式还提供了一种计算机可读存储介质,其上存储有能够实现本说明书上述方法的程序产品。在一些可能的实施方式中,本公开的各个方面还可以实现为一种程序产品的形式,其包括程序代码,当程序产品在终端设备上运行时,程序代码用于使终端设备执行本说明书上述“示例性方法”部分中描述的根据本公开各种示例性实施方式的步骤。
需要说明的是,本公开所示的计算机可读介质可以是计算机可读信号介质或者计算机可读存储介质或者是上述两者的任意组合。计算机可读存储介质例如可以是——但不限于——电、磁、光、电磁、红外线、或半导体的系统、装置或器件,或者任意以上的组合。计算机可读存储介质的更具体的例子可以包括但不限于:具有一个或多个导线的电连接、便携式计算机磁盘、硬盘、随机访问存储器(RAM)、只读存储器(ROM)、可擦式可编程只读存储器(EPROM或闪存)、光纤、便携式紧凑磁盘只读存储器(CD-ROM)、光存储器件、磁存储器件、或者上述的任意合适的组合。
在本公开中,计算机可读存储介质可以是任何包含或存储程序的有形介质,该程序可以被指令执行系统、装置或者器件使用或者与其结合使用。而在本公开中,计算机可读信号介质可以包括在基带中或者作为载波一部分传播的数据信号,其中承载了计算机可读的程序代码。这种传播的数据信号可以采用多种形式,包括但不限于电磁信号、光信号或上述的任意合适的组合。计算机可读的信号介质还可以是计算机可读存储介质以外的任何计算机可读介质,该计算机可读介质可以发送、传播或者传输用于由指令执行系统、装置或者器件使用或者与其结合使用的程序。计算机可读介质上包含的程序代码可以用任何适当的介质传输,包括但不限于:无线、电线、光缆、RF等等,或者上述的任意合适的组合。
此外,可以以一种或多种程序设计语言的任意组合来编写用于执行本公开操作的程序代码,程序设计语言包括面向对象的程序设计语言—诸如Java、C++等,还包括常规的过程式程序设计语言—诸如“C”语言或类似的程序设计语言。程序代码可以完全地在用户计算设备上执行、部分地在用户设备上执行、作为一个独立的软件包执行、部分在用户计算设备上部分在远程计算设备上执行、或者完全在远程计算设备或服务器上执行。在涉及远程计算设备的情形中,远程计算设备可以通过任意种类的网络,包括局域网(LAN)或广域网(WAN),连接到用户计算设备,或者,可以连接到外部计算设备(例如利用因特网服务提供商来通过因特网连接)。
本领域技术人员在考虑说明书及实践这里公开的发明后,将容易想到本公开的其他实施例。本申请旨在涵盖本公开的任何变型、用途或者适应性变化,这些变型、用途或者适应性变化遵循本公开的一般性原理并包括本公开未公开的本技术领域中的公知常识或惯用技术手段。说明书和实施例仅被视为示例性的,本公开的真正范围和精神由权利要求指出。
应当理解的是,本公开并不局限于上面已经描述并在附图中示出的精确结构,并且可以在不脱离其范围进行各种修改和改变。本公开的范围仅由所附的权利要求来限。

Claims (20)

  1. An image processing system, characterized by comprising:
    a spectral sensing device, configured to collect target spectral color data;
    a camera module, comprising an image sensor and an image signal processor, wherein:
    the image sensor is configured to generate an original image; and
    the image signal processor is electrically connected to the image sensor and the spectral sensing device, and is configured to perform color restoration on the original image according to the target spectral color data to obtain a target image.
  2. The system according to claim 1, characterized in that the spectral sensing device comprises a plurality of spectral sensors, the plurality of spectral sensors being arranged in an M*N array to form a spectral sensor array, wherein M and N are both positive integers greater than 1.
  3. The system according to claim 2, characterized in that the spectral sensing device is configured to:
    collect, by means of the spectral sensor array, spectral color data having M*N detection areas, and use the spectral color data of the M*N local detection areas as the target spectral color data.
  4. The system according to claim 3, characterized in that the image processing system comprises an image region segmentation module, the image region segmentation module being configured to:
    determine a region of interest according to the original image; and
    determine a target detection area among the M*N detection areas by means of the region of interest, and use the spectral color data corresponding to the target detection area as the target spectral color data.
  5. The system according to any one of claims 1 to 4, characterized in that the image signal processor comprises an RGB domain processing module, and the RGB domain processing module comprises a Raw RGB domain processing unit,
    the Raw RGB domain processing unit being configured to:
    divide the original image into a plurality of local image areas, the local image areas being arranged in a K*L array, wherein K and L are both positive integers greater than 1, K is greater than or equal to M, and L is greater than or equal to N; and
    perform color restoration processing on the original image having K*L local image areas according to the target spectral color data to obtain a target image.
  6. The system according to claim 5, characterized in that the Raw RGB domain processing unit comprises a white balance gain subunit and a first tone mapping subunit, wherein:
    the white balance gain subunit is configured to perform white balance processing on the original image having K*L local image areas according to the target spectral color data, and determine a white balance gain matrix;
    the first tone mapping subunit is configured to perform tone mapping processing on the original image having K*L local image areas according to the target spectral color data, and determine a first tone mapping gain function; and
    the Raw RGB domain processing unit is configured to perform color restoration on the original image by means of the white balance gain matrix and the first tone mapping gain function to obtain a target image.
  7. The system according to any one of claims 1 to 4, characterized in that the image signal processor comprises an RGB domain processing module, and the RGB domain processing module comprises a Full RGB domain processing unit,
    the Full RGB domain processing unit being configured to:
    divide the original image into a plurality of local image areas, the local image areas being arranged in an X*Y array, wherein X and Y are both positive integers greater than 1, X is greater than or equal to M, and Y is greater than or equal to N; and
    perform color restoration processing on the original image having X*Y local image areas according to the target spectral color data to obtain a target image.
  8. The system according to claim 7, characterized in that the Full RGB domain processing unit comprises a color correction subunit, a second tone mapping subunit, and a 2D/3D lookup table subunit, wherein:
    the color correction subunit is configured to perform gamma correction processing on the original image having X*Y local image areas according to the target spectral color data, and determine a gamma correction curve;
    the second tone mapping subunit is configured to perform tone mapping processing on the original image having X*Y local image areas according to the target spectral color data, and determine a second tone mapping gain function;
    the 2D/3D lookup table subunit is configured to perform color mapping processing on the original image having X*Y local image areas according to the target spectral color data and a 2D/3D lookup table, and determine corrected color data; and
    the Full RGB domain processing unit is configured to perform color restoration on the original image by means of the gamma correction curve, the second tone mapping gain function, and the corrected color data to obtain a target image.
  9. The system according to any one of claims 1 to 4, characterized in that the image signal processor comprises a YUV domain processing module, the YUV domain processing module being configured to:
    divide the original image into a plurality of local image areas, the local image areas being arranged in an E*F array, wherein E and F are both positive integers greater than 1, E is greater than or equal to M, and F is greater than or equal to N; and
    perform color restoration processing on the original image having E*F local image areas according to the target spectral color data to obtain a target image.
  10. The system according to claim 9, characterized in that the YUV domain processing module comprises a third tone mapping subunit and a color adjustment enhancement subunit, wherein:
    the third tone mapping subunit is configured to perform tone mapping processing on the original image having E*F local image areas according to the target spectral color data, and determine a third tone mapping gain function;
    the color adjustment enhancement subunit is configured to perform color adjustment and enhancement on the original image having E*F local image areas according to the target spectral color data, and determine color adjustment enhancement parameters, wherein the color adjustment enhancement parameters comprise a contrast adjustment parameter, a brightness adjustment parameter, a saturation adjustment parameter, and an edge enhancement parameter; and
    the YUV domain processing module is configured to perform color restoration on the original image according to the third tone mapping gain function and the color adjustment enhancement parameters to obtain a target image.
  11. The system according to claim 1, characterized in that the image signal processor is further configured to:
    perform, based on an infinite impulse response (IIR) filter, time-domain filtering processing on original images that have undergone color restoration at different moments, to obtain a target image.
  12. An image processing method, characterized in that it is performed by the image processing system according to any one of claims 1 to 11, the method comprising:
    collecting target spectral color data;
    obtaining an original image; and
    performing color restoration on the original image according to the target spectral color data to obtain a target image.
  13. The method according to claim 12, characterized in that collecting the target spectral color data comprises:
    collecting, by means of a spectral sensor array, spectral color data having M*N detection areas, and using the spectral color data of the M*N local detection areas as the target spectral color data.
  14. The method according to claim 13, characterized in that using the spectral color data of the M*N local detection areas as the target spectral color data comprises:
    determining a region of interest according to the original image; and
    determining a target detection area among the M*N detection areas by means of the region of interest, and using the spectral color data corresponding to the target detection area as the target spectral color data.
  15. The method according to claim 12, characterized in that performing color restoration on the original image according to the target spectral color data to obtain the target image comprises:
    dividing the original image into a plurality of local image areas, the local image areas being arranged in a K*L array, wherein K and L are both positive integers greater than 1, K is greater than or equal to M, and L is greater than or equal to N; and
    performing color restoration processing on the original image having K*L local image areas according to the target spectral color data to obtain the target image.
  16. The method according to claim 15, characterized in that performing color restoration processing on the original image having K*L local image areas according to the target spectral color data to obtain the target image comprises:
    performing white balance processing on the original image having K*L local image areas according to the target spectral color data, and determining a white balance gain matrix;
    performing tone mapping processing on the original image having K*L local image areas according to the target spectral color data, and determining a first tone mapping gain function; and
    performing color restoration on the original image by means of the white balance gain matrix and the first tone mapping gain function to obtain the target image.
  17. The method according to claim 12, characterized in that performing color restoration on the original image according to the target spectral color data to obtain the target image comprises:
    performing, based on an infinite impulse response (IIR) filter, time-domain filtering processing on original images that have undergone color restoration at different moments, to obtain the target image.
  18. The method according to claim 12, characterized in that the target spectral color data comprises a plurality of detection areas, the original image comprises a plurality of image areas, and the detection areas correspond to the image areas.
  19. A computer-readable medium on which a computer program is stored, characterized in that, when the computer program is executed by a processor, the method according to any one of claims 12 to 18 is implemented.
  20. An electronic device, characterized by comprising:
    a housing;
    a main circuit board arranged in the housing; and
    the image processing system according to any one of claims 1 to 11, arranged in the housing and electrically connected to the main circuit board.
PCT/CN2023/095718 2022-08-03 2023-05-23 图像处理系统及方法、计算机可读介质和电子设备 WO2024027287A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202210927465.6A CN115314617A (zh) 2022-08-03 2022-08-03 图像处理系统及方法、计算机可读介质和电子设备
CN202210927465.6 2022-08-03

Publications (2)

Publication Number Publication Date
WO2024027287A1 WO2024027287A1 (zh) 2024-02-08
WO2024027287A9 true WO2024027287A9 (zh) 2024-05-10

Family

ID=83859660

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2023/095718 WO2024027287A1 (zh) 2022-08-03 2023-05-23 图像处理系统及方法、计算机可读介质和电子设备

Country Status (2)

Country Link
CN (1) CN115314617A (zh)
WO (1) WO2024027287A1 (zh)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115314617A (zh) * 2022-08-03 2022-11-08 Oppo广东移动通信有限公司 图像处理系统及方法、计算机可读介质和电子设备
CN118057830A (zh) * 2022-11-18 2024-05-21 华为技术有限公司 图像处理方法、电子设备、计算机程序产品及存储介质
US20240257325A1 (en) * 2023-01-26 2024-08-01 Samsung Electronics Co., Ltd. Tone mapping in high-resolution imaging systems
CN117319815B (zh) * 2023-09-27 2024-05-14 北原科技(深圳)有限公司 基于图像传感器的视频流识别方法和装置、设备、介质
CN118505573B (zh) * 2024-07-18 2024-10-01 奥谱天成(湖南)信息科技有限公司 光谱数据恢复方法、装置及存储介质

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10708557B1 (en) * 2018-12-14 2020-07-07 Lyft Inc. Multispectrum, multi-polarization (MSMP) filtering for improved perception of difficult to perceive colors
US11493387B2 (en) * 2020-03-12 2022-11-08 Spectricity Correction and calibration of spectral sensor output
CN114286072A (zh) * 2020-09-27 2022-04-05 北京小米移动软件有限公司 色彩还原装置及方法、图像处理器
CN112562017B (zh) * 2020-12-07 2024-08-23 奥比中光科技集团股份有限公司 一种rgb图像的色彩还原方法及计算机可读存储介质
CN112752023B (zh) * 2020-12-29 2022-07-15 深圳市天视通视觉有限公司 一种图像调整方法、装置、电子设备及存储介质
CN113418864B (zh) * 2021-06-03 2022-09-16 奥比中光科技集团股份有限公司 一种多光谱图像传感器及其制造方法
CN113676628B (zh) * 2021-08-09 2023-05-02 Oppo广东移动通信有限公司 成像装置和图像处理方法
CN113676713B (zh) * 2021-08-11 2024-09-27 维沃移动通信(杭州)有限公司 图像处理方法、装置、设备及介质
CN113639881A (zh) * 2021-08-23 2021-11-12 Oppo广东移动通信有限公司 色温测试方法及装置、计算机可读介质和电子设备
CN115314617A (zh) * 2022-08-03 2022-11-08 Oppo广东移动通信有限公司 图像处理系统及方法、计算机可读介质和电子设备

Also Published As

Publication number Publication date
WO2024027287A1 (zh) 2024-02-08
CN115314617A (zh) 2022-11-08

Similar Documents

Publication Publication Date Title
WO2024027287A9 (zh) 图像处理系统及方法、计算机可读介质和电子设备
JP6129119B2 (ja) 画像処理装置、画像処理システム、撮像装置、および画像処理方法
CN113810641B (zh) 视频处理方法、装置、电子设备和存储介质
US10600170B2 (en) Method and device for producing a digital image
WO2022148446A1 (zh) 图像处理方法、装置、设备及存储介质
CN115242992A (zh) 视频处理方法、装置、电子设备和存储介质
CN113810642B (zh) 视频处理方法、装置、电子设备和存储介质
CN113824914B (zh) 视频处理方法、装置、电子设备和存储介质
US9654756B1 (en) Method and apparatus for interpolating pixel colors from color and panchromatic channels to color channels
WO2021179142A1 (zh) 一种图像处理方法及相关装置
WO2019164767A1 (en) Multiple tone control
CN115205159A (zh) 图像处理方法及装置、电子设备、存储介质
WO2020215263A1 (zh) 一种图像处理方法及装置
WO2023016044A1 (zh) 视频处理方法、装置、电子设备和存储介质
CN115187487A (zh) 图像处理方法及装置、电子设备、存储介质
CN115550575B (zh) 图像处理方法及其相关设备
CN115187488A (zh) 图像处理方法及装置、电子设备、存储介质
CN115239739A (zh) 图像处理方法及装置、电子设备和计算机可读介质
CN115278189A (zh) 图像色调映射方法及装置、计算机可读介质和电子设备
CN113364964B (zh) 图像处理方法、图像处理装置、存储介质与终端设备
CN112218062B (zh) 图像缩放装置、电子设备、图像缩放方法及图像处理芯片
CN114125319A (zh) 图像传感器、摄像模组、图像处理方法、装置和电子设备
WO2021217428A1 (zh) 图像处理方法、装置、摄像设备和存储介质
CN115706765A (zh) 视频处理方法、装置、电子设备和存储介质
CN115278191B (zh) 图像白平衡方法及装置、计算机可读介质和电子设备

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23848997

Country of ref document: EP

Kind code of ref document: A1