CN116071626A - Image processing device and method and electronic equipment - Google Patents

Image processing device and method and electronic equipment

Info

Publication number
CN116071626A
Authority
CN
China
Prior art keywords
type
sensors
pixels
pixel
sensor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211712781.8A
Other languages
Chinese (zh)
Inventor
杨建明
张韵东
周学武
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Chongqing Zhongxing Micro Artificial Intelligence Chip Technology Co ltd
Vimicro Corp
Original Assignee
Chongqing Zhongxing Micro Artificial Intelligence Chip Technology Co ltd
Vimicro Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Chongqing Zhongxing Micro Artificial Intelligence Chip Technology Co ltd, Vimicro Corp filed Critical Chongqing Zhongxing Micro Artificial Intelligence Chip Technology Co ltd
Priority to CN202211712781.8A priority Critical patent/CN116071626A/en
Publication of CN116071626A publication Critical patent/CN116071626A/en
Pending legal-status Critical Current

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 5/00 Details of television systems
    • H04N 5/30 Transforming light or analogous information into electric information
    • H04N 5/33 Transforming infrared radiation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/10 Image acquisition
    • G06V 10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V 10/77 Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
    • G06V 10/80 Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level
    • G06V 10/803 Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level of input or preprocessed data


Abstract

An image processing device and method and an electronic device are provided. The image processing device includes: a plurality of first type sensors for acquiring analog signals of a plurality of first type pixels of a photographic subject; a plurality of second type sensors for acquiring analog signals of a plurality of second type pixels of the photographic subject; a conversion module for converting the analog signals of the first type pixels into digital signals and converting the analog signals of the second type pixels into digital signals; and a processing module, connected to the conversion module, for fusing the digital signals of the first type pixels with the digital signals of the second type pixels according to a preset pixel fusion algorithm to obtain a target image. By digitizing the pixel signals, the embodiments of the application allow image fusion to be performed at the pixel level as soon as partial pixel data of an image frame has been received, giving good real-time performance and accurate fusion.

Description

Image processing device and method and electronic equipment
Technical Field
Embodiments of the present application relate to the technical field of image processing, and more particularly to an image processing device and method and an electronic device.
Background
Image fusion technology has been widely used in fields such as target detection, surveillance, military applications, remote sensing, and medicine. Image fusion aims to combine the advantages of different sensors: for the same target, multiple images acquired by different sensors are fused to output an image of better quality. At present, a visible light sensor and a long-wave infrared sensor typically image independently, and image fusion is then performed by a graphics processing unit. This fusion approach is computationally heavy and tends to introduce large delays.
Disclosure of Invention
The embodiment of the application provides an image processing device and method and electronic equipment. Various aspects related to embodiments of the present application are described below.
In a first aspect, there is provided an apparatus for image processing, comprising: a plurality of first type sensors for acquiring analog signals of a plurality of first type pixels of a photographic subject; a plurality of second type sensors for acquiring analog signals of a plurality of second type pixels of the photographic subject; the conversion module is used for converting the analog signals of the first type of pixels into digital signals and converting the analog signals of the second type of pixels into digital signals; the processing module is connected with the conversion module and used for carrying out fusion processing on the digital signals of the plurality of first type pixels and the digital signals of the plurality of second type pixels according to a preset pixel fusion algorithm so as to obtain a target image.
In a second aspect, there is provided a method of image processing, applied to an image processing apparatus, the image processing apparatus comprising: a plurality of first type sensors, a plurality of second type sensors, a conversion module and a processing module; the method comprises the following steps: acquiring analog signals of a plurality of first type pixels of a shooting object; acquiring analog signals of a plurality of second type pixels of the shooting object; converting the analog signals of the first type of pixels into digital signals, and converting the analog signals of the second type of pixels into digital signals; and according to a preset pixel fusion algorithm, carrying out fusion processing on the digital signals of the plurality of first type pixels and the digital signals of the plurality of second type pixels so as to obtain a target image.
In a third aspect, there is provided an electronic device comprising a memory and an apparatus for image processing as described in the first aspect.
In a fourth aspect, there is provided a computer-readable storage medium having stored thereon a computer program for performing the method of image processing according to the second aspect.
According to the embodiments of the application, the analog signals of the first type pixels are converted into digital signals, the analog signals of the second type pixels are converted into digital signals, and the digital signals of the plurality of first type pixels and the plurality of second type pixels are fused according to a preset pixel fusion algorithm to obtain a target image. Because the pixels are digitized and the fusion is performed at the pixel level, fusion does not have to wait for the transmission of an entire image frame to complete; it can proceed as soon as partial pixel data of the frame has been received, giving good real-time performance and accurate fusion.
Drawings
Fig. 1 is a schematic structural diagram of an apparatus for image processing according to an embodiment of the present application.
Fig. 2 is a schematic layout of one possible implementation of the apparatus of fig. 1.
Fig. 3 is a circuit schematic of another possible implementation of the apparatus of fig. 1.
Fig. 4 is a flowchart of a method for image processing according to an embodiment of the present application.
Fig. 5 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application are described below clearly and completely with reference to the drawings in the embodiments. It is apparent that the described embodiments are only some, not all, embodiments of the present application.
At present, image fusion technology is widely applied in fields such as target detection, surveillance, military applications, remote sensing, and medicine. Image fusion aims to combine the advantages of different sensors: for the same target, multiple images acquired by different sensors are fused to output images that a computer can analyze and process better. Fusion of infrared and visible light images is one of the more actively studied directions.
Visible light (VI) is the part of the electromagnetic spectrum perceived by the human eye; electromagnetic radiation in this range is called visible light. Visible light imaging technology is developing rapidly; for example, image sensors in mobile phones now reach resolutions of one hundred million pixels. Visible light images are usually captured under good lighting conditions in the daytime or at night; they fully reflect the detailed information of the whole scene and have finer texture information and higher spatial resolution. However, visible light transmission is easily affected by the environment: it penetrates fog, rain, snow, dust, and haze poorly, its detection distance is limited, and effective information cannot be acquired when the visibility of the target scene is low or occlusion exists. For example, the working wavelength of a common visible light camera is 0.38-0.78 micrometers, far smaller than the 2.5 micrometer scale of PM2.5 particles, so visible light reflected by an object is blocked by PM2.5 and cannot reach the camera; the shooting performance of a visible light camera in haze weather is therefore poor.
Infrared (IR) is electromagnetic radiation in the band between visible light and microwaves, with wavelengths of 0.75 to 1000 micrometers (μm). Unlike visible imaging, infrared imaging is based on the temperature or emissivity of the object. Compared with visible light images, infrared images have poorer resolution, lower contrast, lower signal-to-noise ratio, and a blurrier visual effect. However, infrared imaging has stronger penetrating power, is not affected by complex external illumination conditions, identifies thermal targets well, and has a long detection distance. The complementary characteristics of visible light images and infrared images can therefore be fused into a single high-quality image.
Visible-infrared image fusion is a fusion of images from different sensor types. In current fusion schemes for visible light and long-wave infrared images, a visible light sensor and a long-wave infrared sensor each image independently, and the images are fused afterwards by a graphics processing unit (GPU); this is application-level fusion.
Such methods fuse the images in the spatial and temporal domains only after complete infrared and visible light image frames have been acquired, so the computation is heavy and the delay is often large. In addition, two independent imaging systems are needed to obtain the visible light image and the infrared image, which is costly, bulky, and unsuitable for miniaturized applications.
Therefore, how to develop a low-delay image fusion scheme is a problem to be solved.
Based on this, an embodiment of the present application proposes an apparatus for image processing. Fig. 1 is a schematic structural diagram of an apparatus for image processing according to an embodiment of the present application. Embodiments of the present application are described in detail below in conjunction with fig. 1. As shown in fig. 1, the image processing apparatus may include a first type sensor 110, a second type sensor 120, a conversion module 130, and a processing module 140.
The first type sensor 110 is used to acquire analog signals of first type pixels of a photographic subject. The photographic subject includes, but is not limited to, a product, a person, or a scene, and may be static or dynamic; the embodiments of the present application do not limit the type or state of the photographic subject. The first type of sensor may be a visible light sensor. The visible light sensor may be a "point" light-sensitive element such as a photodiode or phototransistor, or it may be an image sensor.
The image sensor uses the photoelectric conversion function of a photoelectric device to convert the light image on its light-sensing surface into an electrical signal proportional to that image. The image sensor divides the light image on its light-receiving surface into many small units, each of which is converted into a usable electrical signal. Image sensors mainly come in two types: complementary metal oxide semiconductor (CMOS) and charge coupled device (CCD).
CMOS is a mainstream semiconductor process with the advantages of low power consumption and high speed, and is widely used to manufacture CPUs, memories, and various digital logic chips. An image sensor designed in a CMOS process is called a CMOS image sensor (CIS). A CIS typically consists of a pixel array including photodetectors (e.g., photodiodes) with their corresponding control circuitry, peripheral addressing circuitry, readout circuitry, and the like. The first type of sensor may, for example, be a visible light CIS.
The first type of pixels may be visible light pixels. A pixel is short for "picture element": one unit of image color and intensity, and also the smallest unit that can be addressed and assigned a color value.
The means for image processing may comprise a plurality of sensors 110 of a first type. The plurality of first-type sensors may acquire analog signals of a plurality of first-type pixels of the photographic subject. For example, a plurality of visible light sensors may acquire analog signals of a plurality of visible light pixels of a photographic subject. For another example, a plurality of visible light CIS sensors may acquire analog signals of a plurality of visible light pixels of a photographing object.
The second type sensor 120 is used to acquire analog signals of the second type pixels of the photographic subject. The second type of sensor may be an image sensor other than a visible light sensor, such as an infrared sensor, or another image sensor used in future technologies. The second type sensor 120 operates at longer wavelengths than the first type sensor 110. The second type of pixel may be an infrared pixel.
Infrared light, with wavelengths of 0.75 to 1000 μm, can be divided into four bands: (1) near infrared, 0.75-1 μm; (2) short-wave infrared, 1-3 μm; (3) mid-wave (mid) infrared, 3-5 μm; (4) long-wave or far infrared, 7.5-14 μm. Infrared sensors can accordingly be classified into near infrared sensors, mid-wave infrared sensors, long-wave infrared sensors, and so on. For example, long-wave infrared technology follows the thermal imaging route: no auxiliary light source is needed, imaging relies on temperature alone, and uncooled detection technology is often adopted.
The infrared sensor may be a photosensor or a microelectromechanical system (MEMS) structure. MEMS, also called microsystems or micromachines, have internal structures typically on the order of micrometers or even nanometers and form independent intelligent systems. A MEMS infrared sensor is typically electrically connected to an application-specific integrated circuit that controls the sensor and amplifies its output signal.
The means for image processing may comprise a plurality of sensors 120 of the second type. The plurality of second-type sensors may acquire analog signals of a plurality of second-type pixels of the photographic subject. For example, a plurality of infrared sensors may acquire analog signals of a plurality of infrared pixels of a photographic subject. For another example, a plurality of MEMS long-wave infrared sensors may acquire analog signals of a plurality of infrared pixels of a photographic subject.
The conversion module 130 is connected to the first type sensors 110 and also to the plurality of second type sensors 120. The conversion module 130 converts the analog signals of the first type pixels into digital signals and converts the analog signals of the second type pixels into digital signals. That is, the conversion module 130 digitizes the pixel signals acquired by the sensors. For example, the conversion module 130 may convert the analog signal of a visible light pixel (a first type pixel) into a digital signal, and the analog signal of a long-wave infrared pixel (a second type pixel) into a digital signal.
The processing module 140 is connected to the converting module 130, and is configured to perform fusion processing on the digital signals of the plurality of first type pixels and the digital signals of the plurality of second type pixels according to a preset pixel fusion algorithm, so as to obtain a target image. For example, according to a preset pixel fusion algorithm, the processing module 140 may perform fusion processing on the digital signals of the plurality of visible light pixels (which are the first type of pixels) and the digital signals of the plurality of long-wave infrared pixels (which are the second type of pixels) to obtain the target image.
A pixel fusion algorithm generally exploits the matrix arrangement of image pixels: it compares the pixel values at the same position in each source image and selects the better value as the pixel value at the corresponding position of the fused image. Preset pixel fusion algorithms include, but are not limited to, maximum absolute value, weighting, and averaging. Compared with an image-frame fusion algorithm, a pixel fusion algorithm has a smaller input data volume and a smaller computation load, giving good real-time performance and small delay.
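As a minimal, non-normative sketch of such pixel-level fusion, the Python fragment below applies the weighting and maximum-absolute-value rules to co-located visible and infrared samples as they stream in; the function names, the fixed weight of 0.7, and the sample values are illustrative assumptions, not taken from the patent.

```python
# Illustrative sketch only: per-pixel fusion rules mentioned above
# (weighting, maximum absolute value). Names and the weight value
# are assumptions, not taken from the patent.

def fuse_weighted(vis: int, ir: int, w: float = 0.7) -> int:
    """Weighted fusion of one visible sample and one co-located IR sample."""
    return int(w * vis + (1.0 - w) * ir)

def fuse_max_abs(vis: int, ir: int) -> int:
    """Pick the sample with the larger magnitude (maximum-absolute-value rule)."""
    return vis if abs(vis) >= abs(ir) else ir

def fuse_stream(vis_pixels, ir_pixels, rule=fuse_weighted):
    """Fuse pixels as they arrive, without waiting for a full frame."""
    for vis, ir in zip(vis_pixels, ir_pixels):
        yield rule(vis, ir)

# Fusion can start after only part of the frame has been received.
partial_vis = [120, 130, 125, 140]   # digitized visible pixels
partial_ir = [200, 190, 210, 180]    # digitized co-located infrared pixels
print(list(fuse_stream(partial_vis, partial_ir)))
```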
The target image in the embodiment of the application is a fused image obtained by performing pixel fusion processing on the digital signals of the plurality of first type pixels and the digital signals of the plurality of second type pixels, and the fused image can be a final image or an initial image subjected to subsequent further analysis processing.
In some implementations, the plurality of first type sensors 110 and the plurality of second type sensors 120 may be integrated on the same silicon die. The two kinds of pixels differ in height in the vertical direction, and the first type sensors 110 and second type sensors 120 are arranged in a staggered manner so that the focal planes of the first type sensors 110 coincide with those of the second type sensors 120, and hence the first type pixel image coincides with the second type pixel image. For example, the visible light pixels and the infrared pixels are integrated on the same silicon wafer: the visible light pixels may be photodiodes located on the wafer surface, while the infrared pixels are MEMS structures located above the wafer. The height difference between the two kinds of pixels in the vertical direction makes the visible light focal plane coincide with the infrared focal plane, so the visible light image coincides with the infrared image. This guarantees that the subsequent pixel fusion operates on the same image, keeps the fusion simple, and eliminates the spatial alignment problem of conventional multi-path sensors.
The plurality of first type sensors 110 and the plurality of second type sensors 120 may be arranged in a variety of ways on a mounting surface (e.g., a silicon wafer). In some implementations, the first type sensors 110 and the second type sensors 120 may be staggered (i.e., interleaved at intervals) in the lateral direction, and/or staggered in the longitudinal direction. Arranging the two sensor types uniformly at intervals further eliminates the spatial alignment and time-axis alignment problems of conventional multi-path sensors, making the fusion accurate, simple, and direct.
The second type sensor 120 operates at longer wavelengths than the first type sensor 110. As described above, the wavelength range of visible light is generally 0.38-0.78 μm while that of infrared light is generally 0.75-1000 μm, so the infrared sensor works at wavelengths greater than those of the visible light sensor. The second type sensor 120 is therefore generally larger than the first type sensor 110, and an area the size of one second type sensor 120 can accommodate several first type sensors 110. In some implementations, a group of the plurality of first type sensors and one of the plurality of second type sensors are staggered in the lateral direction; and/or a group of the plurality of first type sensors and one of the plurality of second type sensors are staggered in the longitudinal direction. That is, the first type sensors are interleaved with the second type sensors in groups, laterally and/or longitudinally. A group is a subset of the plurality of first type sensors, and each group may contain the same number of sensors, such as 2, 4, or 6; in some embodiments, the groups may also differ in size. This staggered arrangement with more first type sensors helps exploit the high-resolution advantage of the first type (visible light) sensors, further eliminates the spatial alignment and time-axis alignment problems of conventional multi-path sensors, and supports accurate, simple, direct fusion with good real-time performance.
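To make the grouped interleaving concrete, here is a small illustrative sketch that builds one lateral row with an infrared position inserted after every group of four visible positions; the same interleaving can be applied longitudinally. The row length, the "V"/"I" labels, and the group size of 4 (which matches the fig. 2 example discussed below) are assumptions for illustration only.

```python
# Illustrative sketch: one lateral row in which an infrared ("I")
# position follows every group of four visible ("V") positions.
# Labels and row length are assumptions, not taken from the patent.

def interleaved_row(n_visible: int, group: int = 4) -> list[str]:
    row = []
    for i in range(n_visible):
        row.append("V")
        if (i + 1) % group == 0:
            row.append("I")   # insert one IR position after each group
    return row

print(" ".join(interleaved_row(12)))
# -> V V V V I V V V V I V V V V I
```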
In some implementations, the conversion module 130 may include a first readout circuit and an analog-to-digital conversion circuit. A readout circuit is an interface circuit between the first type sensor 110 (or the second type sensor 120) and the subsequent analog-to-digital conversion circuit; it mainly extracts the photo-generated current (or voltage) signal produced by the sensor. The conversion module 130 may include a plurality of readout circuits, and the first readout circuit may be any one of them.
Typically, the readout circuits of the first type sensor 110 and the second type sensor 120 are different; for example, the readout circuit of a visible light sensor differs from that of an infrared sensor. Chip area is usually limited, and designing two independent readout circuits occupies more of it. The first readout circuit may therefore be configured to transmit the analog signals acquired by some of the plurality of first type sensors 110 as well as the analog signals acquired by some of the plurality of second type sensors 120. The analog-to-digital conversion circuit, which may be an analog-to-digital converter, converts the analog signal transmitted by the first readout circuit into a digital signal. Using the same readout circuit to deliver the analog signals of both the first type sensors 110 and the second type sensors 120 to the analog-to-digital conversion circuit reduces system cost and circuit area and improves the accuracy of identifying the photographic subject.
In some implementations, the first readout circuit may transmit the analog signals acquired by some of the plurality of first type sensors 110 in a time-multiplexed manner, and likewise transmit the analog signals acquired by some of the plurality of second type sensors 120 in a time-multiplexed manner. Using the same readout circuit to deliver the analog signals of the visible light sensors and the infrared sensors to the analog-to-digital conversion circuit helps reduce system cost and circuit area.
In some embodiments, the conversion module 130 may include a single readout circuit, i.e., the first readout circuit. The first readout circuit may transmit the analog signals acquired by the plurality of first type sensors 110 in a time-division multiplexed manner, and transmit the analog signals acquired by the plurality of second type sensors 120 in the same manner.
According to the embodiments of the application, the analog signals of the first type pixels are converted into digital signals, the analog signals of the second type pixels are converted into digital signals, and the digital signals of the plurality of first type pixels and the plurality of second type pixels are fused according to a preset pixel fusion algorithm to obtain a target image. Because the pixels are digitized and the fusion is performed at the pixel level, fusion does not have to wait for the transmission of an entire image frame to complete; it can proceed as soon as partial pixel data of the frame has been received, giving good real-time performance and accurate fusion.
Fig. 2 is a schematic layout of one possible implementation of the apparatus of fig. 1. As shown in fig. 2, the image processing apparatus may include a plurality of visible light image sensors (CIS) and a plurality of MEMS infrared sensors. The visible light image sensor (CIS) is a first type of sensor, and the first type of pixels may be visible light pixels (CIS pixels), as shown by the gray boxes in fig. 2. The MEMS infrared sensor is a second type of sensor and the second type of pixels may be infrared pixels (IR pixels), as indicated by the diagonal boxes in fig. 2.
As shown in fig. 2, the effective pixel area lies inside the dotted-line box, the internal redundant (dummy) pixel area lies between the dotted-line box and the solid-line box, and the active pixel area lies inside the solid-line box. The effective pixel region is located within the active pixel region and may include a plurality of visible light pixels (with their corresponding sensors) and a plurality of infrared pixels (with their corresponding sensors). For example, the region may include 640 CIS pixels and/or 320 IR pixels in the lateral direction, and 480 CIS pixels and/or 240 IR pixels in the longitudinal direction.
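Note that the example region has twice as many CIS pixels as IR pixels in each direction (640×480 versus 320×240), so pixel-level fusion implies mapping each CIS coordinate to a co-located IR sample. The following is a minimal sketch of one such mapping; the nearest-neighbor 2:1 mapping is an assumption, since the patent does not specify how co-location is resolved.

```python
# Minimal sketch: map a CIS pixel coordinate to the co-located IR sample
# for the 640x480 CIS / 320x240 IR example region. The nearest-neighbor
# 2:1 mapping is an assumption; the patent does not fix the method.

CIS_W, CIS_H = 640, 480
IR_W, IR_H = 320, 240

def ir_sample_for_cis(x: int, y: int, ir_frame: list[list[int]]) -> int:
    """Return the IR sample co-located with CIS pixel (x, y)."""
    ix = min(x * IR_W // CIS_W, IR_W - 1)   # 2:1 horizontal mapping
    iy = min(y * IR_H // CIS_H, IR_H - 1)   # 2:1 vertical mapping
    return ir_frame[iy][ix]

# Usage: fetch the IR sample to fuse with CIS pixel (100, 50).
ir_frame = [[0] * IR_W for _ in range(IR_H)]
print(ir_sample_for_cis(100, 50, ir_frame))
```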
The plurality of visible light CIS sensors may acquire analog signals of a plurality of visible light pixels of a photographing object. The plurality of MEMS infrared sensors may acquire analog signals of a plurality of infrared pixels of the photographic subject.
As shown in fig. 2, the visible light CIS pixel units (i.e., sensors) and the infrared pixel units (i.e., sensors) may be integrated on the same silicon wafer. The visible light CIS pixel units may be photodiodes located on the wafer surface, while the infrared pixels are MEMS structures located above the wafer. Owing to the differences in wavelength and sensor size, the two kinds of pixel units differ in height in the vertical direction. The visible light CIS pixel units and the infrared pixel units are arranged so that the focal plane of the visible light sensors coincides with that of the infrared sensors, and hence the visible light image coincides with the infrared image.
The visible light pixels and the infrared pixels can be interleaved in various ways to form different pixel arrays, with corresponding pixel-level fusion algorithms matched to those arrays.
In one pixel array, a group of CIS sensors and one infrared sensor may be staggered in the lateral direction; and/or a group of CIS sensors and one infrared sensor may be staggered in the longitudinal direction. That is, the CIS sensors are interleaved with the infrared sensors in groups, laterally and/or longitudinally. In the example shown in fig. 2, a group consists of 4 first type sensors: an infrared pixel is inserted after every 4 visible light pixel positions in the lateral direction, and after every 4 visible light pixel positions in the longitudinal direction. This arrangement helps fully exploit the high-resolution advantage of the visible light sensors, further eliminates the spatial alignment and time-axis alignment problems of conventional multi-path sensors, and supports accurate, simple, direct fusion with good real-time performance.
The visible light pixel sensors and the infrared pixel sensors may be interleaved in various ways, not limited to the pattern shown in fig. 2; for example, the CIS and infrared sensors in successive lateral rows may be aligned with or offset from one another. The embodiments of the present application do not specifically limit the interleaved layout pattern.
The infrared sensor is a MEMS structure located above the silicon wafer, and some space is usually left beneath the MEMS structure. In some implementations, the pixel-level readout circuitry and the pixel-fusion processing module may be placed in this space between the silicon surface and the MEMS structure, achieving digitized pixel-level fusion.
According to the embodiments of the application, the analog signals of the visible light pixels are converted into digital signals, the analog signals of the infrared pixels are converted into digital signals, and the digital signals of the plurality of visible light pixels and the plurality of infrared pixels are fused according to a preset pixel fusion algorithm to obtain a target image. Because the pixels are digitized and fused at the pixel level, and the focal plane of the visible light sensors coincides with that of the infrared sensors, the spatial alignment and time-axis alignment problems of conventional multi-path sensors are eliminated; fusion is accurate, simple, and direct, with small delay and good real-time performance.
Typically, the readout circuits of the first type sensor and the second type sensor are different; for example, the readout circuit of a visible light sensor differs from that of an infrared sensor. The visible light sensor outputs a photo-generated current when excited by incident light; the readout circuit can read this current signal directly, or convert it into a voltage signal and read that.
Infrared sensors include cooled infrared sensors and uncooled infrared sensors. A cooled infrared sensor is a photoelectric device whose parameters (such as sensitivity and dynamic range) differ from those of a visible light sensor. An uncooled infrared sensor is essentially a thermistor: its resistance changes when irradiated by infrared light, and the readout circuit must compare the resistance of the thermistor with that of a reference resistor. As shown in fig. 3, the circuit compares the thermistor R_S with the reference resistance R_b.
Chip area is usually limited, and designing two independent readout circuits would occupy more of it. Accordingly, embodiments of the present application provide a readout circuit compatible with the photoelectric characteristics of both pixel types. The front stage of the visible light readout circuit performs photoelectric conversion and feeds its photo-generated current signal or integrated voltage signal to one input port of a multiplexer. The front stage of the infrared readout circuit typically performs photoelectric or thermoelectric conversion and feeds its current signal or integrated voltage signal to the other input port of the multiplexer. The output of the multiplexer is connected to the analog-to-digital converter. The readout circuit thus digitizes the visible light pixels and the infrared pixels in a time-shared manner.
A combination of a visible light sensor and an infrared sensor is exemplified below in connection with fig. 3. Fig. 3 is a circuit schematic of another possible implementation of the apparatus of fig. 1. Fig. 3 shows a schematic representation of the multiplexing of 4 visible light pixels with one uncooled infrared pixel readout circuit. The 4 visible light pixels and one uncooled infrared pixel are connected to an analog-to-digital converter through a readout circuit comprising a multiplexer. The embodiment of fig. 3 is described in detail below.
Fig. 3 shows a partial circuit of the image processing apparatus, which may include a first sensor 310, a second sensor 320, a first readout circuit 330, and an analog-to-digital converter 340.
The first sensors 310 are four visible light pixel units, TX0, TX1, TX2, and TX3. As shown in fig. 3, each visible light pixel unit uses a four-transistor readout structure: the photo-generated current is integrated on the parasitic capacitance of the node, and the integrated voltage signal is output through a source follower, which improves the dynamic range. The source follower is a field-effect-transistor circuit used for impedance transformation and voltage following. A photoelectric cell also generates a current when not illuminated, known as dark current; dark current effects can be reduced by the reset (RST) control logic. Since the analog signal is transmitted only within the pixel space, noise coupling is low.
The second sensor 320 may be an uncooled infrared pixel unit. Its readout circuit is a current-mirror structure that outputs a difference current, which an active integrator converts into a photo-generated voltage signal. The current mirror here is a cascode current mirror built from stacked transistors, as shown in fig. 3.
The first readout circuit 330 may include a visible light pixel readout circuit 331, an infrared pixel readout circuit 332, and a multiplexer 333.
The analog-to-digital converter 340 is used for converting the analog signal of the pixel transmitted by the multiplexer 333 into a digital signal.
The processing flow of the apparatus for image processing of fig. 3 is approximately as follows:
the front stage of the visible light pixel readout circuit 331 performs photoelectric conversion and feeds its photo-generated current signal or integrated voltage signal to one input port of the multiplexer 333. The front stage of the infrared pixel readout circuit 332 typically performs photoelectric or thermoelectric conversion and feeds its current signal or integrated voltage signal to the other input port of the multiplexer 333. The output of the multiplexer 333 is connected to the input of the analog-to-digital converter 340, which converts each pixel's analog signal into a digital signal. The first readout circuit 330 thus digitizes the visible light pixels and the infrared pixels in a time-division multiplexed manner.
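The time-shared digitization described above can be modeled behaviorally. The sketch below (not the actual circuit) multiplexes four visible channels and one uncooled infrared channel onto a single idealized ADC, mirroring the fig. 3 arrangement; the 8-bit resolution, the 1 V reference, the scan order, and the sample voltages are illustrative assumptions.

```python
# Behavioral sketch (not the actual circuit): time-division multiplexing
# of four visible pixel channels (TX0..TX3) and one uncooled infrared
# channel onto a single ADC, as in fig. 3. The ADC resolution, reference
# voltage, scan order, and sample values are assumptions.

def adc_convert(voltage: float, vref: float = 1.0, bits: int = 8) -> int:
    """Idealized ADC: quantize a 0..vref voltage to a digital code."""
    full_scale = (1 << bits) - 1
    code = round(voltage / vref * full_scale)
    return max(0, min(code, full_scale))

def readout_cycle(vis_voltages: list[float], ir_voltage: float) -> list[int]:
    """One multiplexing cycle: the four visible channels, then the IR channel."""
    digital = [adc_convert(v) for v in vis_voltages]  # mux selects TX0..TX3
    digital.append(adc_convert(ir_voltage))           # then the IR channel
    return digital

# Example: four integrated visible voltages and one IR integrator output.
print(readout_cycle([0.20, 0.35, 0.31, 0.28], 0.62))
```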
The readout circuit can be implemented in many ways, but the basic idea is to time-division multiplex the same analog-to-digital converter so that the photoelectric analog signal is digitized within the pixel space. The embodiments of the present application do not specifically limit the readout circuit arrangement.
Apparatus embodiments of the present application are described above in detail in connection with figs. 1 to 3; method embodiments are described below in detail in connection with fig. 4. The description of the method embodiments corresponds to that of the apparatus embodiments, so for details not repeated here, refer to the preceding apparatus embodiments.
Fig. 4 is a flowchart of a method for image processing according to an embodiment of the present application. The method of fig. 4 is applied to an image processing apparatus, which may include: the system comprises a plurality of first type sensors, a plurality of second type sensors, a conversion module and a processing module.
As shown in fig. 4, the image processing method may mainly include steps S410 to S430, described in detail below. It should be noted that the sequence numbers of the steps in the embodiments of the present application do not imply an execution order; the execution order of each process should be determined by its function and internal logic, and the numbering should not limit the implementation of the embodiments of the present application.
In step S410, analog signals of a plurality of first type pixels of the subject are acquired, and analog signals of a plurality of second type pixels of the subject are acquired.
In step S420, the analog signals of the first type of pixels are converted into digital signals, and the analog signals of the second type of pixels are converted into digital signals.
In step S430, fusion processing is performed on the digital signals of the plurality of first type pixels and the digital signals of the plurality of second type pixels according to a preset pixel fusion algorithm, so as to obtain a target image.
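As a compact illustration of steps S410 to S430, the flow might be sketched as follows; all function and class names, data shapes, the ADC parameters, and the weighted fusion rule are illustrative assumptions, since the method only fixes the sequence acquire, digitize, fuse.

```python
# Compact sketch of steps S410-S430. All names, shapes, and the fusion
# rule are illustrative assumptions; the method only fixes the flow:
# acquire analog pixels -> digitize -> fuse at the pixel level.

def acquire_analog(sensors):                       # S410: sample each sensor
    return [s.sample() for s in sensors]

def digitize(analog_values, vref=1.0, bits=8):     # S420: idealized ADC
    full_scale = (1 << bits) - 1
    return [max(0, min(round(v / vref * full_scale), full_scale))
            for v in analog_values]

def fuse(first_digital, second_digital, w=0.7):    # S430: weighted fusion
    return [int(w * a + (1 - w) * b)
            for a, b in zip(first_digital, second_digital)]

class FakeSensor:
    """Stand-in for a first/second type sensor (assumption for the demo)."""
    def __init__(self, level):
        self.level = level
    def sample(self):
        return self.level

vis = [FakeSensor(0.3), FakeSensor(0.4)]   # first type (visible) sensors
ir = [FakeSensor(0.8), FakeSensor(0.6)]    # second type (infrared) sensors
target = fuse(digitize(acquire_analog(vis)), digitize(acquire_analog(ir)))
print(target)
```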
Optionally, the plurality of first type sensors and the plurality of second type sensors are staggered such that the focal planes of the plurality of first type sensors coincide with the focal planes of the plurality of second type sensors.
Optionally, the plurality of first type sensors and the plurality of second type sensors are staggered in a lateral direction, and/or the plurality of first type sensors and the plurality of second type sensors are staggered in a longitudinal direction.
Optionally, the set of the plurality of first type sensors and the one of the plurality of second type sensors are staggered in a lateral direction, and/or the set of the plurality of first type sensors and the one of the plurality of second type sensors are staggered in a longitudinal direction, wherein the set of the plurality of first type sensors is a partial sensor of the plurality of first type sensors.
Optionally, the conversion module includes a first readout circuit and an analog-to-digital conversion circuit. The image processing method comprises the following steps: the first readout circuit is used for transmitting analog signals acquired by partial sensors of the plurality of first type sensors, and the first readout circuit is used for transmitting analog signals acquired by partial sensors of the plurality of second type sensors. The analog signal transmitted by the first readout circuit is converted into a digital signal by the analog-to-digital conversion circuit.
Optionally, the first type of sensor is a visible light sensor, and the first type of pixel is a visible light pixel; the second type of sensor is an infrared sensor and the second type of pixel is an infrared pixel.
Fig. 5 is a schematic structural diagram of an electronic device according to an embodiment of the present application. The electronic device 500 may comprise a memory 510 and an image processing means 520 as described in any of the foregoing.
The memory 510 is used to store information and data, for example, intermediate data produced during processing by the image processing device 520 or its output data. Illustratively, the memory 510 may be a read-only memory (ROM), a random access memory (RAM), a dynamic random access memory (DRAM), an SDRAM, a DDR memory, or a flash memory; the embodiments of the present application do not specifically limit the memory type.
It should be noted that the electronic device mentioned in the embodiments of the present application is a device with a shooting function built from microelectronic components; it may be composed of electronic components such as integrated circuits, transistors, and electron tubes, and operates by applying electronic technology (including software). The electronic device may be any suitable device and may be referred to as a terminal, portable terminal, mobile terminal, communication terminal, portable mobile terminal, touch-screen device, and so on. For example, the electronic device may be, but is not limited to, a smartphone, portable phone, digital camera, video camera, notebook computer, tablet computer, game machine, television, display unit, personal media player (PMP), personal digital assistant (PDA), computer-controlled robot, or the like. The electronic device may also be a pocket-sized portable communication terminal with a wireless communication function.
The present embodiment also provides a computer-readable storage medium having stored thereon a computer program for performing the method of image processing as described in any of the foregoing.
In the above embodiments, the functions may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. When implemented in software, they may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the flows or functions according to the embodiments of the present disclosure are produced in whole or in part. The computer may be a general purpose computer, a special purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a machine-readable storage medium or transmitted from one machine-readable storage medium to another; for example, they may be transmitted from one website, computer, server, or data center to another by wired (e.g., coaxial cable, optical fiber, digital subscriber line (DSL)) or wireless (e.g., infrared, radio, microwave) means. The machine-readable storage medium may be any available medium that a computer can access, or a data storage device such as a server or data center integrating one or more available media. The available medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium (e.g., digital video disc (DVD)), a semiconductor medium (e.g., solid state disk (SSD)), or the like.
Those of ordinary skill in the art will appreciate that the elements and algorithm steps described in connection with the embodiments disclosed herein can be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present disclosure.
It should be understood that in the various embodiments of the present application, "first", "second", and the like are used to distinguish different objects, not to describe a specific order. The sequence numbers of the processes described above do not imply an execution order; the execution order of each process should be determined by its function and internal logic, and the numbering should not be construed as limiting the implementation of the embodiments of the present application.
In the several embodiments provided in the present disclosure, it should be understood that the disclosed systems, devices, and methods may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative, e.g., the division of the units is merely a logical function division, and there may be additional divisions when actually implemented, e.g., multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, which may be in electrical, mechanical or other form.
In the several embodiments provided herein, it should be understood that when a portion is said to be "connected" to another portion, it may be directly connected, or electrically connected with another element interposed between them. The term "connected" covers both physical and wireless connections. In addition, when a portion is said to "comprise" an element, this means that it may include other elements rather than excluding them, unless otherwise stated.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in each embodiment of the present disclosure may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit.
The foregoing is merely specific embodiments of the disclosure, but the protection scope of the disclosure is not limited thereto. Any change or substitution that a person skilled in the art could readily conceive within the technical scope of the disclosure shall be covered by the protection scope of the disclosure. Therefore, the protection scope of the present disclosure shall be subject to the protection scope of the claims.

Claims (14)

1. An apparatus for image processing, comprising:
a plurality of first type sensors for acquiring analog signals of a plurality of first type pixels of a photographic subject;
a plurality of second type sensors for acquiring analog signals of a plurality of second type pixels of the photographic subject;
the conversion module is used for converting the analog signals of the first type of pixels into digital signals and converting the analog signals of the second type of pixels into digital signals;
the processing module is connected with the conversion module and used for carrying out fusion processing on the digital signals of the plurality of first type pixels and the digital signals of the plurality of second type pixels according to a preset pixel fusion algorithm so as to obtain a target image.
2. The apparatus of claim 1, wherein the plurality of first-type sensors and the plurality of second-type sensors are staggered such that a focal plane of the plurality of first-type sensors coincides with a focal plane of the plurality of second-type sensors.
3. The apparatus of claim 2, wherein the plurality of first-type sensors and the plurality of second-type sensors are staggered in a lateral direction and/or the plurality of first-type sensors and the plurality of second-type sensors are staggered in a longitudinal direction.
4. The apparatus of claim 3, wherein a group of sensors of the plurality of first type sensors and one sensor of the plurality of second type sensors are staggered in a lateral direction; and/or,
a group of sensors of the plurality of sensors of the first type and one sensor of the plurality of sensors of the second type are staggered in a longitudinal direction;
wherein a group of sensors of the plurality of sensors of the first type is a portion of the plurality of sensors of the first type.
5. The apparatus of claim 1, wherein the conversion module comprises:
a first readout circuit for transmitting analog signals acquired by partial sensors of the plurality of first type sensors, and transmitting analog signals acquired by partial sensors of the plurality of second type sensors;
and the analog-to-digital conversion circuit is used for converting the analog signals transmitted by the first readout circuit into digital signals.
6. The apparatus of claim 1, wherein the first type of sensor is a visible light sensor, the first type of pixel is a visible light pixel, the second type of sensor is an infrared sensor, and the second type of pixel is an infrared pixel.
7. A method of image processing, characterized by being applied to an image processing apparatus, the image processing apparatus comprising: a plurality of first type sensors, a plurality of second type sensors, a conversion module and a processing module;
the method comprises the following steps:
acquiring analog signals of a plurality of first type pixels of a shooting object;
acquiring analog signals of a plurality of second type pixels of the shooting object;
converting the analog signals of the first type of pixels into digital signals, and converting the analog signals of the second type of pixels into digital signals;
and according to a preset pixel fusion algorithm, carrying out fusion processing on the digital signals of the plurality of first type pixels and the digital signals of the plurality of second type pixels so as to obtain a target image.
8. The method according to claim 7, comprising:
the plurality of first type sensors and the plurality of second type sensors are staggered such that the focal planes of the plurality of first type sensors coincide with the focal planes of the plurality of second type sensors.
9. The method of claim 8, wherein the plurality of first-type sensors and the plurality of second-type sensors are staggered in a lateral direction and/or the plurality of first-type sensors and the plurality of second-type sensors are staggered in a longitudinal direction.
10. The method of claim 9, wherein the plurality of first-type sensors and the plurality of second-type sensors are staggered in a lateral direction and/or the plurality of first-type sensors and the plurality of second-type sensors are staggered in a longitudinal direction, comprising:
a set of the plurality of sensors of the first type and a sensor of the plurality of sensors of the second type are staggered in a lateral direction, and/or,
a group of sensors of the first plurality of sensors and a sensor of the second plurality of sensors are staggered longitudinally,
wherein a group of sensors of the plurality of sensors of the first type is a portion of the plurality of sensors of the first type.
11. The method of claim 7, wherein the conversion module comprises: a first readout circuit and an analog-to-digital conversion circuit;
the method comprises the following steps:
transmitting analog signals acquired by partial sensors of the plurality of first type sensors by using the first readout circuit, and transmitting analog signals acquired by partial sensors of the plurality of second type sensors by using the first readout circuit;
and converting the analog signal transmitted by the first readout circuit into a digital signal by utilizing the analog-to-digital conversion circuit.
12. The method of claim 7, wherein the first type of sensor is a visible light sensor, the first type of pixel is a visible light pixel, the second type of sensor is an infrared sensor, and the second type of pixel is an infrared pixel.
13. An electronic device comprising a memory and an apparatus for image processing as claimed in any one of claims 1-6.
14. A computer-readable storage medium, on which a computer program is stored, the computer program being for performing the method of image processing according to any one of claims 7-12.
CN202211712781.8A 2022-12-29 2022-12-29 Image processing device and method and electronic equipment Pending CN116071626A (en)

Priority Applications (1)

Application number: CN202211712781.8A | Priority date: 2022-12-29 | Filing date: 2022-12-29 | Title: Image processing device and method and electronic equipment

Applications Claiming Priority (1)

Application number: CN202211712781.8A | Priority date: 2022-12-29 | Filing date: 2022-12-29 | Title: Image processing device and method and electronic equipment

Publications (1)

Publication Number Publication Date
CN116071626A 2023-05-05

Family

ID=86170972

Family Applications (1)

Application number: CN202211712781.8A (Pending) | Priority date: 2022-12-29 | Filing date: 2022-12-29 | Title: Image processing device and method and electronic equipment

Country Status (1)

Country Link
CN (1) CN116071626A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination