WO2022068598A1 - Imaging method and apparatus - Google Patents


Info

Publication number
WO2022068598A1
Authority
WO
WIPO (PCT)
Prior art keywords
signal
image
light
resolution
grayscale
Prior art date
Application number
PCT/CN2021/118697
Other languages
French (fr)
Chinese (zh)
Inventor
季军 (Ji Jun)
陈敏 (Chen Min)
胡翔宇 (Hu Xiangyu)
Original Assignee
华为技术有限公司 (Huawei Technologies Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co., Ltd. (华为技术有限公司)
Publication of WO2022068598A1 publication Critical patent/WO2022068598A1/en

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules

Definitions

  • the present application relates to image processing technology, and in particular, to an imaging method and apparatus.
  • the present application provides an imaging method and device, which can not only retain the high signal-to-noise ratio of low-resolution images, but also acquire high-resolution images, so that images with more vivid colors and higher definition can be obtained.
  • the present application provides an imaging method, which includes: acquiring a reflected light signal of a photographed object; separating the reflected light signal into a first light signal and a second light signal by a light splitting unit, where the light splitting unit performs both spectrum splitting and energy separation; obtaining a first image signal according to the first light signal; obtaining a second image signal according to the second light signal; in a first illumination scene, obtaining a first target image according to the first image signal and the second image signal; and in a second illumination scene, obtaining a second target image according to the first image signal. The resolution of the first image signal is higher than the resolution of the second image signal, and the illumination intensity of the second illumination scene is greater than that of the first illumination scene.
  • the application can capture the reflected light signal of the photographed object through a light capture module (such as a lens). The reflected light signal is consistent with the light irradiated on the object: natural light is still natural light after reflection, and infrared light is still infrared light after reflection; reflection does not change the wavelength of the optical signal. Therefore, the reflected light signal in the present application may include natural light (also known as visible light) emitted by the sun, visible light emitted by fluorescent lamps, infrared light emitted by infrared supplementary lights, and the like.
  • natural light is also known as visible light.
  • the light splitting unit can use a spectroscopic prism, which separates the visible light signal and the infrared light signal in the reflected light signal to achieve spectrum splitting; the unit also divides the visible light signal into a first component signal carrying 10%–40% of the visible spectral energy and a second component signal carrying 60%–90% of the visible spectral energy.
  • the first component signal and the infrared light signal constitute the first light signal on the first optical path, and the second component signal constitutes the second light signal on the second optical path.
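As a rough numerical sketch (not part of the patent text), the splitting behaviour described above can be modelled as follows, assuming the example 10%/90% visible-energy split; the function name and array shapes are illustrative only:

```python
import numpy as np

def split_light(visible, infrared, visible_ratio_first=0.10):
    """Model of the light splitting unit: the first optical path carries a
    fraction of the visible energy plus all of the infrared energy; the
    second path carries the remaining visible energy and no infrared."""
    first_signal = visible_ratio_first * visible + infrared   # high-resolution path
    second_signal = (1.0 - visible_ratio_first) * visible     # low-resolution color path
    return first_signal, second_signal

visible = np.full((4, 4), 100.0)    # toy visible-light intensity map
infrared = np.full((4, 4), 50.0)    # toy infrared intensity map
first, second = split_light(visible, infrared)
# first carries 10% of the visible energy plus the infrared component;
# second carries the remaining 90% of the visible energy.
```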
  • the first optical signal is input to the first image sensor
  • the second optical signal is input to the second image sensor
  • the resolution of the first image sensor is higher than that of the second image sensor.
  • the first image sensor converts the first light signal into a first image signal (for example, an image with the suffix .raw) through photoelectric conversion, and the second image sensor likewise converts the second light signal into a second image signal (for example, an image with the suffix .raw) through photoelectric conversion.
  • as resolution increases, the unit pixel area on the photosensitive surface of the image sensor decreases, which in turn reduces the number of photons each pixel receives, resulting in a decrease in the signal-to-noise ratio.
  • in a low illuminance scene (the above-mentioned first illuminance scene), an increase in resolution will therefore seriously affect the color effect.
  • two image sensors, one of high resolution and one of low resolution, are therefore used in combination.
  • the low-resolution image sensor is used for color-path imaging and receives only visible light signals; the high-resolution image sensor receives both visible light signals and infrared signals.
  • in this way, high-resolution color imaging can be achieved directly during the day, and at night a high-resolution infrared image (grayscale image) can be fused with a low-resolution color image (chromaticity image).
  • high-resolution imaging can thus be supported day and night.
  • in the first illumination scene, the infrared fill light is turned on to illuminate the object to be photographed, and the infrared filter is turned off, so that the first light signal includes the first component signal of the visible light signal and the infrared light signal reflected from the infrared fill light; the second light signal includes the second component signal of the visible light signal.
  • the infrared fill light works and emits an infrared light signal, so the reflected light signal includes a visible light signal and an infrared light signal from the infrared fill light.
  • the light splitting unit uses optical principles to separate the reflected light signal by spectrum and energy to obtain a first light signal and a second light signal, where the first light signal includes the first component signal of the visible light signal (for example, 10% of the visible light signal) and the infrared light signal reflected from the fill light, and the second light signal includes the second component signal of the visible light signal (for example, 90% of the visible light signal).
  • a first grayscale image can be obtained from the first image signal; a second grayscale image and a chromaticity map can be obtained from the second image signal; grayscale fusion of the first and second grayscale images yields a third grayscale image; and color fusion of the third grayscale image and the chromaticity map yields the first target image. The resolution of the first grayscale image is higher than that of the second grayscale image, and also higher than that of the chromaticity map.
  • the first image signal, converted from the first light signal, yields a black-and-white first grayscale (luma) image.
  • obtaining the first grayscale image from the first image signal involves the image processing algorithms of the ISP, that is, converting an original image in .raw format into an image visible to the human eye.
  • the second image signal, converted from the second light signal, is separated by grayscale and chromaticity into a black-and-white second grayscale (luma) image and a color chromaticity (chroma) image. This also involves the ISP's image processing algorithms: the original image in .raw format is converted into an image visible to the human eye, the RGB format is converted to YUV format, and the YUV image is separated into its planes.
  • the resolution of the first image signal obtained by the first image acquisition unit is higher than the resolution of the second image signal obtained by the second image acquisition unit.
  • correspondingly, the resolution of the first grayscale (luma) image is higher than that of the second grayscale (luma) image, and also higher than that of the chromaticity (chroma) image.
  • the first grayscale (luma) image and the second grayscale (luma) image are fused by grayscale fusion to obtain a third grayscale (luma) image.
  • a high-resolution third grayscale (luma) image can thus be obtained.
  • the third grayscale (luma) image and the chromaticity (chroma) image are color-fused to obtain the target image; the resolution of the target image is the same as that of the first grayscale image.
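The two-step fusion above can be sketched numerically. This is a toy illustration under stated assumptions, not the patent's actual ISP algorithm: the weighted average used for grayscale fusion and the nearest-neighbour upsampling are placeholders, since the text does not specify the fusion operators.

```python
import numpy as np

def fuse_images(luma_high, luma_low, chroma_low, w_high=0.7):
    """Two-step fusion sketch: (1) grayscale fusion of the high- and
    low-resolution luma maps, (2) color fusion of the fused luma with the
    upsampled chroma map, yielding a YUV target image."""
    scale = luma_high.shape[0] // luma_low.shape[0]
    # Nearest-neighbour upsampling brings the low-resolution maps up to size.
    luma_low_up = np.kron(luma_low, np.ones((scale, scale)))
    chroma_up = np.kron(chroma_low, np.ones((scale, scale, 1)))
    # Grayscale fusion: simple weighted average of the two luma maps.
    luma_fused = w_high * luma_high + (1.0 - w_high) * luma_low_up
    # Color fusion: attach the fused luma (Y) to the upsampled chroma (UV).
    return np.dstack([luma_fused, chroma_up])

y_hi = np.full((4, 4), 200.0)       # high-resolution grayscale (luma) image
y_lo = np.full((2, 2), 100.0)       # low-resolution grayscale (luma) image
uv_lo = np.full((2, 2, 2), 128.0)   # low-resolution chromaticity (chroma) map
target = fuse_images(y_hi, y_lo, uv_lo)   # (4, 4, 3) YUV image
```

The target keeps the high-resolution grid of the first luma image while the color planes come from the low-resolution, high-SNR path, mirroring the trade-off the text describes.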
  • in summary, the reflected light signal is separated by the light splitting unit, the separated light signals are photoelectrically converted by the high-resolution and low-resolution image sensors to obtain the corresponding image signals, these are converted into the corresponding images, and a two-step fusion (grayscale fusion followed by color fusion) produces the final target image.
  • because the first grayscale image also contains a visible light component, the two luma images share the same spectrum, which eases registration and improves the fusion efficiency of the images.
  • both grayscale fusion and color fusion combine a high-resolution image with a low-resolution image, which retains the high signal-to-noise ratio of the low-resolution image while acquiring a high-resolution result, so that images with more vivid colors and higher definition can be obtained.
  • in the second illumination scene, the infrared fill light is turned off and the infrared filter is turned on, so that the first light signal includes the first component signal of the visible light signal but does not include the infrared light signal; the second light signal includes the second component signal of the visible light signal.
  • in this case the infrared fill light in the imaging device does not work and the infrared filter works, so the reflected light signal includes the visible light signal and natural infrared light.
  • the light splitting unit in the imaging device uses optical principles to separate the reflected light signal into a first light signal and a second light signal, where the first light signal includes the first component signal of the visible light signal (for example, 10% of the visible light signal) and the natural infrared light signal, and the second light signal includes the second component signal of the visible light signal (for example, 90% of the visible light signal).
  • the infrared filter filters out the infrared light signal in the first light signal, so the first light signal reaching the first image acquisition unit only includes the first component signal of the visible light signal.
  • the first image sensor converts the first optical signal into an electrical signal to form a first image signal with an initial image prototype.
  • the first image signal is a photoelectrically converted electrical signal, forming, for example, an image with the suffix .raw. Since the first light signal includes a visible light signal, the first image signal is a color visible-light image signal.
  • the second image sensor converts the second light signal into an electrical signal to form a second image signal carrying an initial image; the second image signal is likewise a photoelectrically converted electrical signal, forming, for example, an image with the suffix .raw. Since the second light signal only includes visible light signals, the second image signal is also a color visible-light image signal.
  • the resolution of the first image signal is higher than that of the second image signal.
  • the image processing module obtains a target image according to the first image signal, and the resolution of the target image is the same as that of the first image signal and higher than that of the second image signal.
  • the image processing unit may only use the second image signal to detect the illumination intensity, and it is not necessary to use the second image signal to perform image fusion.
  • a high-resolution image can be directly acquired, thereby improving the imaging effect of the image.
  • optionally, the method further includes judging the current illumination scene according to the second light signal: when the illumination intensity corresponding to the second light signal is less than a first threshold, the current illumination scene is determined to be the first illumination scene; when the illumination intensity is greater than or equal to the first threshold, it is determined to be the second illumination scene. Alternatively, when the signal gain corresponding to the second light signal is greater than a second threshold, the current illumination scene is determined to be the first illumination scene; when the signal gain is less than or equal to the second threshold, it is determined to be the second illumination scene.
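The scene-judgment logic reduces to a simple threshold test. A minimal sketch, assuming illustrative threshold values (the text says only that the thresholds are set in advance from historical data or experience):

```python
def classify_scene(illuminance=None, signal_gain=None,
                   illuminance_threshold=100.0, gain_threshold=30.0):
    """Return which illumination scene applies, using either criterion from
    the text: low illuminance or high signal gain indicates the first
    (dark) scene; otherwise the second (bright) scene. Threshold values
    here are hypothetical."""
    if illuminance is not None:
        return "first" if illuminance < illuminance_threshold else "second"
    if signal_gain is not None:
        return "first" if signal_gain > gain_threshold else "second"
    raise ValueError("need either illuminance or signal_gain")

classify_scene(illuminance=20.0)   # dim scene -> "first"
classify_scene(signal_gain=10.0)   # low gain -> "second"
```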
  • the above-mentioned first threshold and second threshold may be set in advance according to historical data or experience.
  • the light intensity can be detected, for example, by a light sensor on the imaging device.
  • the signal gain can be detected, for example, by an image sensor in the second image acquisition unit.
  • the present application provides an imaging device, comprising: a light capturing module for acquiring a reflected light signal of a photographed object; a light splitting module for separating the reflected light signal into a first light signal and a second light signal, where the light splitting module performs both spectrum splitting and energy separation; and an image acquisition module for acquiring a first image signal according to the first light signal and a second image signal according to the second light signal, where the resolution of the first image signal is higher than the resolution of the second image signal.
  • the device further includes an image processing module configured to acquire a first target image according to the first image signal and the second image signal in a first illumination scene, and to acquire a second target image according to the first image signal in a second illumination scene, where the illumination intensity of the second illumination scene is greater than that of the first illumination scene.
  • the light splitting module is used to separate the visible light signal and the infrared light signal in the reflected light signal to achieve spectrum splitting; it is also used to divide the visible light signal into a first component signal that accounts for 10%–40% of the visible spectral energy and a second component signal that accounts for 60%–90% of the visible spectral energy.
  • the image processing module is further configured to turn on an infrared fill light to illuminate the photographed object and turn off the infrared filter, so that the first light signal includes the first component signal of the visible light signal and the infrared light signal reflected from the infrared fill light, and the second light signal includes the second component signal of the visible light signal.
  • the image processing module is further configured to turn off the infrared fill light and turn on the infrared filter, so that the first light signal includes the first component signal of the visible light signal but not the infrared light signal, and the second light signal includes the second component signal of the visible light signal.
  • the image acquisition module is specifically configured to input the first optical signal into a first image sensor to acquire the first image signal; input the second optical signal into the second image sensor Acquiring the second image signal; wherein the resolution of the first image sensor is higher than the resolution of the second image sensor.
  • the image processing module is specifically configured to obtain a first grayscale image according to the first image signal; obtain a second grayscale image and a chromaticity map according to the second image signal; Performing grayscale fusion on the first grayscale image and the second grayscale image to obtain a third grayscale image; performing color fusion on the third grayscale image and the chromaticity map to obtain the first target image; wherein, the resolution of the first grayscale image is higher than the resolution of the second grayscale image; the resolution of the first grayscale image is higher than the resolution of the chromaticity map.
  • the image processing module is further configured to determine the current illumination scene according to the second light signal: when the illumination intensity corresponding to the second light signal is less than a first threshold, the current illumination scene is determined to be the first illumination scene; when the illumination intensity is greater than or equal to the first threshold, it is the second illumination scene; or, when the signal gain corresponding to the second light signal is greater than a second threshold, the current illumination scene is the first illumination scene, and when the signal gain is less than or equal to the second threshold, it is the second illumination scene.
  • the present application provides a terminal device, comprising: one or more processors; and a memory for storing one or more programs which, when executed by the one or more processors, cause the one or more processors to implement the method according to any one of the above first aspects.
  • the present application provides a computer-readable storage medium comprising a computer program which, when executed on a computer, causes the computer to perform the method according to any one of the above first aspects.
  • the present application provides a computer program which, when executed by a computer, performs the method according to any one of the above first aspects.
  • FIG. 1 is a flowchart of an embodiment of an imaging device of the present application
  • FIG. 2 is an exemplary block diagram of a device embodiment of the present application
  • FIG. 3 is a flowchart of a process 300 of an embodiment of the imaging method of the present application.
  • FIG. 4 is an exemplary frame diagram of an embodiment of an imaging method of the present application.
  • FIG. 5 is a schematic structural diagram of an embodiment of an imaging device of the present application.
  • “At least one (item)” refers to one or more, and “a plurality” refers to two or more.
  • “And/or” describes the relationship between associated objects and indicates that three relationships may exist; for example, “A and/or B” can mean: only A, only B, or both A and B, where A and B can be singular or plural.
  • the character “/” generally indicates that the associated objects are in an “or” relationship.
  • “At least one item(s) below” or similar expressions refer to any combination of these items, including any combination of single items or plural items.
  • for example, “at least one (item) of a, b or c” can mean: a, b, c, “a and b”, “a and c”, “b and c”, or “a and b and c”, where a, b and c can be single or multiple.
  • Illuminance intensity refers to the energy of visible light received per unit area, referred to as illuminance for short; the unit is lux (lx).
  • Illuminance can be used to indicate the degree to which an object is illuminated, that is, the ratio of the luminous flux on the surface of the object to the illuminated area. For example, in summer under direct sunlight, the illuminance can reach 60,000 lx to 100,000 lx; outdoors without direct sun, 1,000 lx to 10,000 lx; in a bright room, 100 lx to 550 lx; under a full moon at night, about 0.2 lx.
  • the present application involves two application scenarios, namely a first illumination scene and a second illumination scene, and the illumination of the first illumination scene is smaller than that of the second illumination scene.
  • the first illuminance scene may refer to night with no light, a dimly lit room, a dark corner with poor daylight, etc.
  • the second illuminance scene may refer to outdoors in the daytime, a room with sufficient daylight, or any other well-lit indoor or outdoor environment.
  • YUV: a color space format for pixels. Y is the luminance (luma) component, representing brightness or grayscale intensity; an image based on the luminance component alone is a black-and-white image. UV is the chrominance (chroma) component, representing the pixel's color; an image based on the chrominance components is a color image.
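To make the luma/chroma split concrete, here is a standard RGB-to-YUV conversion using BT.601 coefficients; the text does not name the conversion matrix the ISP uses, so these coefficients are a conventional choice, not taken from the patent:

```python
def rgb_to_yuv(r, g, b):
    """Full-range BT.601 RGB -> YUV: Y is the grayscale intensity of the
    pixel; U and V carry the color information."""
    y = 0.299 * r + 0.587 * g + 0.114 * b     # luma
    u = -0.147 * r - 0.289 * g + 0.436 * b    # chroma (blue difference)
    v = 0.615 * r - 0.515 * g - 0.100 * b     # chroma (red difference)
    return y, u, v

# A pure-white pixel has full luma and (near-)zero chroma,
# i.e. it survives intact in a black-and-white (Y-only) image.
y, u, v = rgb_to_yuv(255, 255, 255)
```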
  • Visible light: also known as the visible light signal.
  • the part of the electromagnetic spectrum that the human eye can perceive has a wavelength between 400nm and 750nm, for example.
  • Infrared light: also known as the infrared light signal.
  • An electromagnetic wave whose frequency lies between microwave and visible light; its wavelength is greater than 750 nm, for example between 760 nm and 1 mm.
  • Prism: a polyhedron made of transparent material (such as glass or crystal) used to split or disperse light. Prisms are widely used in spectroscopic instruments: for example, the “dispersive prism” that decomposes composite light into a spectrum (commonly an equilateral prism), or the “total reflection prism” that changes the direction of light to adjust its imaging position (commonly the right-angle prisms in periscopes, binoculars, and similar instruments).
  • Image sensor: uses the photoelectric conversion function of a photoelectric device to convert the light image on the photosensitive surface into an electrical signal proportional to the light image.
  • the photosensitive surface is divided into many small units, and each small unit corresponds to a pixel.
  • in a Bayer sensor, RGB color filters are arranged on the photosensitive surface to form a mosaic color filter array; 50% of the filters are green, used to sense green light, 25% are red, used to sense red light, and 25% are blue, used to sense blue light.
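A sketch of that filter layout, using the common RGGB tile ordering (the ordering within the 2x2 tile is an assumption; the text only gives the 50/25/25 percentages):

```python
import numpy as np

def bayer_mask(height, width):
    """Build a Bayer color-filter-array mask: each 2x2 tile holds one red,
    two green and one blue filter, giving 50% G, 25% R, 25% B overall."""
    mask = np.empty((height, width), dtype="<U1")
    mask[0::2, 0::2] = "R"   # even rows, even cols
    mask[0::2, 1::2] = "G"   # even rows, odd cols
    mask[1::2, 0::2] = "G"   # odd rows, even cols
    mask[1::2, 1::2] = "B"   # odd rows, odd cols
    return mask

m = bayer_mask(4, 4)   # 16 filters: 8 green, 4 red, 4 blue
```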
  • Signal-to-noise ratio: the ratio of the signal strength received by the sensor to the noise strength it generates.
  • the signal-to-noise ratio of a sensor is positively related to the area of a single pixel on the photosensitive surface. For example, in a 1/1.8-inch sensor with 4 million pixels, each pixel occupies about 3 um. If the resolution is increased to 8 million pixels, each pixel occupies about 2 um. The photosensitive area of a single pixel is clearly reduced, which weakens the light signal it receives and thereby lowers the signal-to-noise ratio.
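The arithmetic behind that example can be checked directly. The sensor dimensions below (7.2 mm x 5.4 mm, roughly a 1/1.8-inch format) are an assumption for illustration; the resulting pitches come out near the 3 um and 2 um figures quoted above:

```python
def pixel_pitch_um(sensor_width_mm, sensor_height_mm, megapixels):
    """Approximate pixel pitch (side length of one square pixel, in um)
    from total photosensitive area and pixel count. Sensor dimensions
    are illustrative, not taken from the text."""
    area_um2 = (sensor_width_mm * 1000.0) * (sensor_height_mm * 1000.0)
    return (area_um2 / (megapixels * 1e6)) ** 0.5

p4 = pixel_pitch_um(7.2, 5.4, 4)   # about 3.1 um per pixel at 4 MP
p8 = pixel_pitch_um(7.2, 5.4, 8)   # about 2.2 um per pixel at 8 MP
# Doubling the pixel count on the same area shrinks each pixel,
# which reduces the photons collected per pixel and hence the SNR.
```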
  • Heterospectral image: an image formed by imaging in a different part of the spectrum.
  • the wavelengths of 400nm-750nm belong to visible light
  • the wavelengths of 750nm-850nm belong to infrared light
  • the imaging images of these two kinds of light are heterospectral images.
  • FIG. 1 is a flowchart of an embodiment of an imaging device of the present application.
  • the imaging device may include: a light capturing unit 10, a light splitting unit 20, a first image acquisition unit 30, a second image acquisition unit 40 and an image processing unit 50.
  • the output end of the light capturing unit 10 is connected to the input end of the light splitting unit 20; the two output ends of the light splitting unit 20 are connected to the input ends of the first image acquisition unit 30 and the second image acquisition unit 40, respectively; and the output ends of the first image acquisition unit 30 and the second image acquisition unit 40 are each connected to an input of the image processing unit 50.
  • the light capturing unit 10 is used to capture the reflected light signal of the photographed object.
  • the light capturing unit 10 may be a camera or a camera lens; light (or a light signal) irradiates the photographed object and is captured by the lens after being reflected by it.
  • the aforementioned light may be emitted by any light source in the environment where the subject is located, which may include natural light (also known as visible light) emitted by the sun, visible light emitted by fluorescent lamps, and infrared light emitted by infrared fill lights.
  • the light capturing unit 10 is responsible for capturing the reflected light signal, which is consistent with the aforementioned light, that is, natural light is still natural light after reflection, and infrared light is still infrared light after reflection, and the reflection of light does not change the wavelength of the light signal.
  • the light splitting unit 20 is configured to separate the reflected light signal into a first light signal and a second light signal, transmit the first light signal to the first image acquisition unit, and transmit the second light signal to the second image acquisition unit.
  • the light splitting unit 20 can use a prism, and use the optical principle of the prism to decompose the reflected optical signal into two optical signals, so that the two optical signals can be independently processed in subsequent modules.
  • the first image acquisition unit 30 is configured to generate a first image signal according to the first light signal and transmit it to the image processing unit; the second image acquisition unit 40 is configured to generate a second image signal according to the second light signal and transmit it to the image processing unit. The resolution of the first image signal is higher than that of the second image signal.
  • the first image acquisition unit 30 and the second image acquisition unit 40 each include an image sensor that converts the incident light signal into an electrical signal to form an image signal carrying an initial image. The light signal strikes the photosensitive surface, which photoelectrically converts the received light into corresponding electrical signals; the electrical signals of all the pixels on the photosensitive surface together form the image signal.
  • the image format output by the Bayer sensor can be the original image inside the imaging device, and its suffix is .raw.
  • the resolution of the first image signal is higher than that of the second image signal because the resolution of the image sensor in the first image acquisition unit 30 is higher than that of the image sensor in the second image acquisition unit 40; that is, the photosensitive surface of the image sensor in the first image acquisition unit 30 contains more pixels than that of the image sensor in the second image acquisition unit 40.
  • the image processing unit 50 is configured to acquire a target image according to the first image signal and the second image signal.
  • the image processing unit 50 may be any processor or processing chip with data processing and computing capability, or a software program running on such a processor or processing chip.
  • the image processing unit 50 may adopt a system on chip (SOC) integrated in an image signal processor (ISP). An SOC is a complete system integrated on a single chip, a technique that groups all or part of the necessary electronic circuits onto one chip.
  • SOC: system on chip
  • ISP: image signal processor
  • a complete system generally includes a central processing unit, memory, and peripheral circuits.
  • the imaging device may further include: a supplementary light unit 60 connected to the image processing unit 50 .
  • the supplementary light unit 60 may be any device that provides infrared light signals, such as an infrared supplementary light lamp.
  • the present application can control the working state of the supplementary light unit 60 through the image processing unit 50.
  • the supplementary light unit 60 may work only in the first illumination scene, in order to supplement the light intensity, and its working state can be switched to cope with situations where the illumination changes greatly or frequently.
  • the imaging device may further include: a filter unit 70 , the filter unit 70 is disposed between the output end of the spectroscopic unit 20 and the input end of the first image acquisition unit 30 .
  • the filter unit 70 may be any device with the function of filtering out optical signals of a certain wavelength or wavelengths in a certain range, especially to filter out infrared light with a wavelength greater than 750 nm, such as an infrared light filter.
  • the filter unit 70 may only work in the second illumination scene, in order to filter out infrared light signals when the illumination is relatively high.
  • the image processing unit 50 is further configured to determine whether the current illumination scene is the first illumination scene or the second illumination scene according to the second image signal.
  • for example, the illumination intensity is detected according to the second image signal; when the illumination intensity is less than the first threshold, the current illumination scene is determined to be the first illumination scene, and when the illumination intensity is greater than or equal to the first threshold, it is determined to be the second illumination scene.
  • for another example, the signal gain is detected according to the second image signal; when the signal gain is greater than the second threshold, the current illumination scene is determined to be the first illumination scene, and when the signal gain is less than or equal to the second threshold, it is determined to be the second illumination scene.
  • in the first illumination scene, the fill light unit 60 works and the filter unit 70 stops working.
  • the reflected light signal acquired by the light capturing unit 10 may include a visible light signal and a first infrared light signal from an infrared fill light (eg, fill light unit 60 ).
  • the light splitting unit 20 separates the reflected light signal into a first light signal, which includes a first component of the visible light signal (for example, 10% of the visible light signal) and the first infrared light signal, and a second light signal, which includes a second component of the visible light signal (for example, 90% of the visible light signal).
  • the first image acquisition unit 30 obtains a first image signal from the first optical signal; the first image signal is an electrical signal produced by photoelectric conversion, forming, for example, a raw picture with the suffix .raw. Since the first light signal includes an infrared light signal, the first image signal is a black-and-white first grayscale image signal.
  • the second image acquisition unit 40 obtains a second image signal from the second optical signal; the second image signal is likewise a photoelectrically converted electrical signal, forming, for example, a raw picture with the suffix .raw. Since the second light signal only includes a visible light signal, the second image signal is a color visible light image signal.
  • the image processing unit 50 performs luminance and chrominance separation on the second image signal to obtain a second grayscale (luma) map and a chrominance (chroma) map, and obtains a first grayscale map based on the first image signal. Since the resolution of the image sensor in the first image acquisition unit 30 is higher than that of the image sensor in the second image acquisition unit 40, the resolution of the first grayscale map is higher than the resolutions of the second grayscale (luma) map and the chroma map.
  • the image processing unit 50 performs grayscale fusion of the first grayscale map and the second grayscale map to obtain a third grayscale (luma) map; the resolution of the third grayscale (luma) map is the same as that of the first grayscale map.
  • the image processing unit 50 then performs color fusion of the third grayscale (luma) image and the chromaticity (chroma) image to obtain a target image, and the resolution of the target image is also the same as the resolution of the first grayscale image.
  • the first grayscale map also contains a visible light signal component, which enables same-spectrum registration and improves the fusion efficiency; both the grayscale fusion and the color fusion are fusions between a high-resolution image and a low-resolution image, which can not only retain the high signal-to-noise ratio of the low-resolution image but also yield a high-resolution image, so that a more vivid and higher-definition image can be obtained.
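The grayscale fusion and color fusion steps above can be sketched as follows. This is a minimal NumPy illustration under stated assumptions: the nearest-neighbour upsampling and the fixed blending weight are placeholders for whatever fusion algorithm an actual ISP would use; the disclosure does not specify one.

```python
# Hedged sketch of the grayscale + color fusion described above.
# The upsampling method and blend weight are illustrative assumptions.
import numpy as np

def fuse(gray_hi: np.ndarray, gray_lo: np.ndarray, chroma_lo: np.ndarray,
         w: float = 0.7) -> np.ndarray:
    """Fuse a high-resolution grayscale map with a low-resolution
    luma/chroma pair into a high-resolution YUV-like image.

    gray_hi:   (H, W)    high-resolution first grayscale map
    gray_lo:   (h, w)    low-resolution second grayscale (luma) map
    chroma_lo: (h, w, 2) low-resolution chroma planes
    """
    H, W = gray_hi.shape
    # Upsample the low-resolution planes to the high resolution
    # (nearest-neighbour here for brevity).
    ys = np.arange(H) * gray_lo.shape[0] // H
    xs = np.arange(W) * gray_lo.shape[1] // W
    luma_up = gray_lo[np.ix_(ys, xs)]
    chroma_up = chroma_lo[np.ix_(ys, xs)]
    # Grayscale fusion: blend keeps the low-resolution SNR while
    # preserving high-resolution detail.
    luma_fused = w * gray_hi + (1.0 - w) * luma_up
    # Color fusion: attach the upsampled chroma to the fused luma.
    return np.dstack([luma_fused, chroma_up])
```

The output has the resolution of the first grayscale map, matching the target-image resolution stated above.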
  • the supplementary light unit 60 stops working, and the filter unit 70 works.
  • the reflected light signals acquired by the light capturing unit 10 may include visible light signals and infrared light signals from nature.
  • the light splitting unit 20 separates the reflected light signal into a first light signal, which includes a first component of the visible light signal (for example, 10% of the visible light signal) and an infrared light signal from nature, and a second light signal, which includes a second component of the visible light signal (for example, 90% of the visible light signal).
  • the filter unit 70 filters out the infrared light signal in the first light signal, so the first light signal reaching the first image acquisition unit 30 only includes the first component signal of the visible light signal.
  • the first image acquisition unit 30 obtains a first image signal from the first optical signal; the first image signal is an electrical signal produced by photoelectric conversion, forming, for example, a raw picture with the suffix .raw. Since the first light signal only includes a visible light signal, the first image signal is a color visible light image signal.
  • the second image acquisition unit 40 obtains a second image signal from the second optical signal; the second image signal is likewise a photoelectrically converted electrical signal, forming, for example, a raw picture with the suffix .raw. Since the second light signal only includes a visible light signal, the second image signal is also a color visible light image signal. In the present application, the second image signal is of low resolution; to reduce the amount of computation, it can be used only to detect the illumination intensity, and the image processing unit 50 does not need to use it for image fusion.
  • the image processing unit 50 obtains a target image based on the first image signal.
  • the resolution of the target image is the same as the resolution of the first image signal and higher than the resolution of the second image signal.
  • a high-resolution image can be directly acquired, thereby improving the imaging effect of the image.
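The two scenarios described above reduce to a simple branch on the detected scene. The sketch below is a hypothetical control-flow illustration: `fuse_images` and `develop` are placeholder names standing in for the fusion pipeline and the direct raw-development path, neither of which is named in the disclosure.

```python
# Hedged sketch of the overall branch: in the first (low-light) scene the
# target image is fused from both image signals; in the second (bright)
# scene it is obtained from the first image signal alone.
# `fuse_images` and `develop` are hypothetical placeholders.

def target_image(scene: str, first_signal, second_signal,
                 fuse_images=lambda a, b: ("fused", a, b),
                 develop=lambda a: ("developed", a)):
    if scene == "first":
        # Low light: fill light on, filter off -> fuse the high-res
        # grayscale signal with the low-res color signal.
        return fuse_images(first_signal, second_signal)
    # Bright light: filter on, fill light off -> the first signal is
    # already a color image; use it directly.
    return develop(first_signal)
```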
  • Fig. 2 is an exemplary block diagram of an embodiment of a device of the present application, and Fig. 2 shows a schematic structural diagram when the device is a mobile phone.
  • the mobile phone 200 may include a processor 210, an external memory interface 220, an internal memory 221, a universal serial bus (USB) interface 230, a charging management module 240, a power management module 241, a battery 242, an antenna 1 , Antenna 2, Mobile Communication Module 250, Wireless Communication Module 260, Audio Module 270, Speaker 270A, Receiver 270B, Microphone 270C, Headphone Interface 270D, Sensor Module 280, Key 290, Motor 291, Indicator 292, Camera 293, Display Screen 294, and a subscriber identification module (subscriber identification module, SIM) card interface 295 and so on.
  • the sensor module 280 may include a pressure sensor 280A, a gyroscope sensor 280B, an air pressure sensor 280C, a magnetic sensor 280D, an acceleration sensor 280E, a distance sensor 280F, a proximity light sensor 280G, a fingerprint sensor 280H, a temperature sensor 280J, a touch sensor 280K, an ambient light sensor 280L, and the like.
  • the structures illustrated in the embodiments of the present application do not constitute a specific limitation on the mobile phone 200 .
  • the mobile phone 200 may include more or fewer components than shown, or combine some components, or split some components, or use a different component arrangement.
  • the illustrated components may be implemented in hardware, software, or a combination of software and hardware.
  • the processor 210 may include one or more processing units, for example, the processor 210 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), controller, video codec, digital signal processor (digital signal processor, DSP), baseband processor, and/or neural-network processing unit (neural-network processing unit, NPU), etc. Wherein, different processing units may be independent devices, or may be integrated in one or more processors.
  • the processor 210 in this application can implement the functions of the image processing unit 50 in the imaging device shown in FIG. 1 .
  • the controller can generate an operation control signal according to the instruction operation code and timing signal, and complete the control of fetching and executing instructions.
  • a memory may also be provided in the processor 210 for storing instructions and data.
  • the memory in processor 210 is cache memory.
  • the memory may hold instructions or data that have just been used or recycled by the processor 210 . If the processor 210 needs to use the instruction or data again, it can be called directly from the memory. Repeated accesses are avoided, and the waiting time of the processor 210 is reduced, thereby improving the efficiency of the system.
  • the processor 210 may include one or more interfaces.
  • the interface may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, etc.
  • the I2C interface is a bidirectional synchronous serial bus that includes a serial data line (SDA) and a serial clock line (SCL).
  • the processor 210 may contain multiple sets of I2C buses.
  • the processor 210 can be respectively coupled to the touch sensor 280K, the charger, the flash, the camera 293 and the like through different I2C bus interfaces.
  • the processor 210 can couple the touch sensor 280K through the I2C interface, so that the processor 210 and the touch sensor 280K communicate with each other through the I2C bus interface, so as to realize the touch function of the mobile phone 200.
  • the I2S interface can be used for audio communication.
  • the processor 210 may contain multiple sets of I2S buses.
  • the processor 210 may be coupled with the audio module 270 through an I2S bus to implement communication between the processor 210 and the audio module 270 .
  • the audio module 270 can transmit audio signals to the wireless communication module 260 through the I2S interface, so as to realize the function of answering calls through the Bluetooth headset.
  • the PCM interface can also be used for audio communications, sampling, quantizing and encoding analog signals.
  • the audio module 270 and the wireless communication module 260 may be coupled through a PCM bus interface.
  • the audio module 270 can also transmit audio signals to the wireless communication module 260 through the PCM interface, so as to realize the function of answering calls through the Bluetooth headset. Both the I2S interface and the PCM interface can be used for audio communication.
  • the UART interface is a universal serial data bus used for asynchronous communication.
  • the bus may be a bidirectional communication bus. It converts the data to be transmitted between serial communication and parallel communication.
  • a UART interface is typically used to connect the processor 210 with the wireless communication module 260 .
  • the processor 210 communicates with the Bluetooth module in the wireless communication module 260 through the UART interface to implement the Bluetooth function.
  • the audio module 270 can transmit audio signals to the wireless communication module 260 through the UART interface, so as to realize the function of playing music through the Bluetooth headset.
  • the MIPI interface can be used to connect the processor 210 with peripheral devices such as the display screen 294 and the camera 293 .
  • MIPI interfaces include camera serial interface (CSI), display serial interface (DSI), etc.
  • the processor 210 communicates with the camera 293 through the CSI interface, so as to realize the shooting function of the mobile phone 200 .
  • the processor 210 communicates with the display screen 294 through the DSI interface to realize the display function of the mobile phone 200 .
  • the GPIO interface can be configured by software.
  • the GPIO interface can be configured as a control signal or as a data signal.
  • the GPIO interface may be used to connect the processor 210 with the camera 293, the display screen 294, the wireless communication module 260, the audio module 270, the sensor module 280, and the like.
  • the GPIO interface can also be configured as I2C interface, I2S interface, UART interface, MIPI interface, etc.
  • the USB interface 230 is an interface that conforms to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type C interface, and the like.
  • the USB interface 230 can be used to connect a charger to charge the mobile phone 200, and can also be used to transmit data between the mobile phone 200 and peripheral devices. It can also be used to connect headphones to play audio through the headphones.
  • the interface can also be used to connect other mobile phones, such as AR devices.
  • the interface connection relationship between the modules illustrated in the embodiments of the present application is only a schematic illustration, and does not constitute a structural limitation of the mobile phone 200 .
  • the mobile phone 200 may also adopt different interface connection manners in the foregoing embodiments, or a combination of multiple interface connection manners.
  • the charging management module 240 is used to receive charging input from the charger.
  • the charger may be a wireless charger or a wired charger.
  • the charging management module 240 may receive charging input from the wired charger through the USB interface 230 .
  • the charging management module 240 may receive wireless charging input through the wireless charging coil of the mobile phone 200 . While the charging management module 240 charges the battery 242 , it can also supply power to the mobile phone through the power management module 241 .
  • the power management module 241 is used to connect the battery 242 , the charging management module 240 and the processor 210 .
  • the power management module 241 receives input from the battery 242 and/or the charging management module 240, and supplies power to the processor 210, the internal memory 221, the display screen 294, the camera 293, and the wireless communication module 260.
  • the power management module 241 can also be used to monitor parameters such as battery capacity, battery cycle times, battery health status (leakage, impedance).
  • the power management module 241 may also be provided in the processor 210 .
  • the power management module 241 and the charging management module 240 may also be provided in the same device.
  • the wireless communication function of the mobile phone 200 can be realized by the antenna 1, the antenna 2, the mobile communication module 250, the wireless communication module 260, the modulation and demodulation processor, the baseband processor, and the like.
  • Antenna 1 and Antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in handset 200 may be used to cover a single or multiple communication frequency bands. Different antennas can also be reused to improve antenna utilization.
  • the antenna 1 can be multiplexed as a diversity antenna of the wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
  • the mobile communication module 250 may provide a wireless communication solution including 2G/3G/4G/5G, etc. applied on the mobile phone 200 .
  • the mobile communication module 250 may include at least one filter, switch, power amplifier, low noise amplifier (LNA), and the like.
  • the mobile communication module 250 can receive electromagnetic waves from the antenna 1, filter and amplify the received electromagnetic waves, and transmit them to the modulation and demodulation processor for demodulation.
  • the mobile communication module 250 can also amplify the signal modulated by the modulation and demodulation processor, and then convert it into electromagnetic waves for radiation through the antenna 1 .
  • at least part of the functional modules of the mobile communication module 250 may be provided in the processor 210 .
  • at least part of the functional modules of the mobile communication module 250 may be provided in the same device as at least part of the modules of the processor 210 .
  • the modem processor may include a modulator and a demodulator.
  • the modulator is used to modulate the low frequency baseband signal to be sent into a medium and high frequency signal.
  • the demodulator is used to demodulate the received electromagnetic wave signal into a low frequency baseband signal. Then the demodulator transmits the demodulated low-frequency baseband signal to the baseband processor for processing.
  • the low frequency baseband signal is processed by the baseband processor and passed to the application processor.
  • the application processor outputs sound signals through audio devices (not limited to the speaker 270A, the receiver 270B, etc.), or displays images or videos through the display screen 294 .
  • the modem processor may be a stand-alone device.
  • the modem processor may be independent of the processor 210, and may be provided in the same device as the mobile communication module 250 or other functional modules.
  • the wireless communication module 260 can provide applications on the mobile phone 200 including wireless local area networks (WLAN) (such as wireless fidelity (Wi-Fi) network), bluetooth (BT), global navigation satellite system (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), near field communication technology (near field communication, NFC), infrared technology (infrared, IR) and other wireless communication solutions.
  • the wireless communication module 260 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 260 receives electromagnetic waves via the antenna 2 , modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 210 .
  • the wireless communication module 260 can also receive the signal to be sent from the processor 210, perform frequency modulation on the signal, amplify it, and then convert it into electromagnetic waves for radiation through the antenna 2.
  • the antenna 1 of the mobile phone 200 is coupled with the mobile communication module 250, and the antenna 2 is coupled with the wireless communication module 260, so that the mobile phone 200 can communicate with the network and other devices through wireless communication technology.
  • the wireless communication technologies may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), broadband Code Division Multiple Access (WCDMA), Time Division Code Division Multiple Access (TD-SCDMA), Long Term Evolution (LTE), BT, GNSS, WLAN, NFC , FM, and/or IR technology, etc.
  • the GNSS may include a global positioning system (GPS), a global navigation satellite system (GLONASS), a Beidou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS), and/or satellite based augmentation systems (SBAS).
  • the mobile phone 200 realizes the display function through the GPU, the display screen 294, and the application processor.
  • the GPU is a microprocessor for image processing, and is connected to the display screen 294 and the application processor.
  • the GPU is used to perform mathematical and geometric calculations for graphics rendering.
  • Processor 210 may include one or more GPUs that execute program instructions to generate or alter display information.
  • Display screen 294 is used to display images, videos, and the like.
  • Display screen 294 includes a display panel.
  • the display panel can be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a MiniLED, a MicroLED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like.
  • cell phone 200 may include 1 or N display screens 294, where N is a positive integer greater than 1.
  • the mobile phone 200 can realize the shooting function through the ISP, the camera 293, the video codec, the GPU, the display screen 294 and the application processor.
  • the ISP is used to process the data fed back by the camera 293 .
  • the shutter is opened, the light is transmitted to the photosensitive element of the camera through the lens, the light signal is converted into an electrical signal, and the photosensitive element of the camera transmits the electrical signal to the ISP for processing, and converts it into an image visible to the naked eye.
  • ISP can also perform algorithm optimization on image noise, brightness, and skin tone.
  • ISP can also optimize the exposure, color temperature and other parameters of the shooting scene.
  • the ISP may be provided in the camera 293 . In this application, the ISP can acquire the target image according to the first image signal and the second image signal.
  • the ISP performs luminance and chrominance separation on the second image signal to obtain a second grayscale (luma) map and a chrominance (chroma) map, and obtains a first grayscale map based on the first image signal; the resolution of the first grayscale map is higher than the resolutions of the second grayscale map and the chroma map.
  • the first grayscale (luma) image and the second grayscale image are grayscale-fused to obtain a third grayscale image, and the third grayscale image has the same high resolution as the first grayscale image.
  • the third grayscale image and the chroma map are then color-fused to obtain a target image, and the resolution of the target image is also the same as the resolution of the first grayscale image.
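The luminance and chrominance separation step mentioned above can be illustrated with a standard RGB-to-luma/chroma transform. This is an assumption for illustration only: the disclosure does not specify a color space, so ITU-R BT.601-style coefficients are used here as a stand-in.

```python
# Minimal luma/chroma separation as a stand-in for the ISP step above.
# BT.601-style coefficients are an assumption; the patent names no
# particular color space.
import numpy as np

def split_luma_chroma(rgb: np.ndarray):
    """Split an (H, W, 3) RGB image into a luma map and a chroma map."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y = 0.299 * r + 0.587 * g + 0.114 * b   # luma (grayscale map)
    cb = 0.564 * (b - y)                    # blue-difference chroma
    cr = 0.713 * (r - y)                    # red-difference chroma
    return y, np.dstack([cb, cr])
```

For an achromatic input the chroma planes are zero, which is the property the fusion relies on: all the structural detail lives in the luma map, so it is the luma that is fused at high resolution while the chroma is simply reattached.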
  • Camera 293 is used to capture still images or video.
  • the object is projected through the lens to generate an optical image onto the photosensitive element.
  • the photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
  • the photosensitive element converts the optical signal into an electrical signal, and then transmits the electrical signal to the ISP to convert it into a digital image signal.
  • the ISP outputs the digital image signal to the DSP for processing.
  • DSP converts digital image signals into standard RGB, YUV and other formats of image signals.
  • the mobile phone 200 may include one or N cameras 293 , where N is a positive integer greater than one.
  • a digital signal processor is used to process digital signals, in addition to processing digital image signals, it can also process other digital signals. For example, when the mobile phone 200 selects a frequency point, the digital signal processor is used to perform Fourier transform on the energy of the frequency point, and the like.
  • Video codecs are used to compress or decompress digital video.
  • the handset 200 may support one or more video codecs.
  • the mobile phone 200 can play or record videos in various encoding formats, such as: moving picture experts group (moving picture experts group, MPEG) 1, MPEG2, MPEG3, MPEG4 and so on.
  • the NPU is a neural-network (NN) computing processor.
  • Applications such as intelligent cognition of the mobile phone 200 can be implemented through the NPU, such as image recognition, face recognition, speech recognition, text understanding, and the like.
  • the external memory interface 220 can be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the mobile phone 200 .
  • the external memory card communicates with the processor 210 through the external memory interface 220 to realize the data storage function, for example, saving files such as music and videos in the external memory card.
  • Internal memory 221 may be used to store computer executable program code, which includes instructions.
  • the internal memory 221 may include a storage program area and a storage data area.
  • the storage program area can store an operating system, an application program required for at least one function (such as a sound playback function, an image playback function, etc.), and the like.
  • the storage data area can store data (such as audio data, phone book, etc.) created during the use of the mobile phone 200 and the like.
  • the internal memory 221 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, universal flash storage (UFS), and the like.
  • the processor 210 executes various functional applications and data processing of the mobile phone 200 by executing the instructions stored in the internal memory 221 and/or the instructions stored in the memory provided in the processor.
  • the mobile phone 200 can implement audio functions through an audio module 270, a speaker 270A, a receiver 270B, a microphone 270C, an earphone interface 270D, and an application processor. Such as music playback, recording, etc.
  • the audio module 270 is used for converting digital audio information into analog audio signal output, and also for converting analog audio input into digital audio signal. Audio module 270 may also be used to encode and decode audio signals. In some embodiments, the audio module 270 may be provided in the processor 210 , or some functional modules of the audio module 270 may be provided in the processor 210 .
  • the speaker 270A, also referred to as a "loudspeaker", is used to convert audio electrical signals into sound signals.
  • Mobile phone 200 can listen to music through speaker 270A, or listen to hands-free calls.
  • the receiver 270B also referred to as an "earpiece" is used to convert audio electrical signals into sound signals.
  • the voice can be received by placing the receiver 270B close to the human ear.
  • the microphone 270C, also called a "mic" or "sound transmitter", is used to convert sound signals into electrical signals.
  • the user can speak with the mouth close to the microphone 270C to input a sound signal into it.
  • the mobile phone 200 may be provided with at least one microphone 270C.
  • the mobile phone 200 may be provided with two microphones 270C, which can implement a noise reduction function in addition to collecting sound signals.
  • the mobile phone 200 may further be provided with three, four or more microphones 270C to collect sound signals, reduce noise, identify sound sources, and implement directional recording functions.
  • the headphone jack 270D is used to connect wired headphones.
  • the earphone interface 270D may be a USB interface 230, a 3.5mm open mobile terminal platform (OMTP) standard interface, or a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface.
  • the pressure sensor 280A is used to sense pressure signals, and can convert the pressure signals into electrical signals.
  • the pressure sensor 280A may be provided on the display screen 294 .
  • the capacitive pressure sensor may be composed of at least two parallel plates of conductive material.
  • the gyroscope sensor 280B can be used to determine the motion attitude of the mobile phone 200 .
  • Air pressure sensor 280C is used to measure air pressure.
  • Magnetic sensor 280D includes a Hall sensor.
  • the acceleration sensor 280E can detect the magnitude of the acceleration of the mobile phone 200 in various directions (generally three axes).
  • Distance sensor 280F for measuring distance.
  • Proximity light sensor 280G may include, for example, light emitting diodes (LEDs) and light detectors, such as photodiodes.
  • the light emitting diodes may be infrared light emitting diodes.
  • the ambient light sensor 280L is used to sense ambient light brightness.
  • the fingerprint sensor 280H is used to collect fingerprints.
  • the temperature sensor 280J is used to detect the temperature.
  • the touch sensor 280K is also called “touch device”.
  • the touch sensor 280K may be disposed on the display screen 294, and the touch sensor 280K and the display screen 294 form a touch screen, also called a "touch screen”.
  • the touch sensor 280K is used to detect a touch operation on or near it.
  • the touch sensor can pass the detected touch operation to the application processor to determine the type of touch event.
  • Visual output related to touch operations may be provided through display screen 294 .
  • the touch sensor 280K may also be disposed on the surface of the mobile phone 200 , which is different from the location where the display screen 294 is located.
  • the bone conduction sensor 280M can acquire vibration signals.
  • the image sensor (sensor) 280N uses the photoelectric conversion function of the photoelectric device to convert the light image on the photosensitive surface into an electrical signal that is proportional to the light image.
  • the photosensitive surface is divided into many small units, and each small unit corresponds to a pixel.
  • in a Bayer sensor, RGB color filters are arranged on the photosensitive surface to form a mosaic color filter array; 50% of this color filter array is green, which is used to sense green light, 25% is red for sensing red light, and 25% is blue for sensing blue light.
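The 50%/25%/25% split described above follows from tiling a 2x2 cell across the sensor. The sketch below constructs such a pattern; the RGGB cell ordering is one common convention and is assumed here for illustration.

```python
# Illustrative construction of a Bayer color filter array: each 2x2 cell
# holds two green, one red and one blue filter, giving the 50%/25%/25%
# split. The RGGB ordering is an assumed convention.
import numpy as np

def bayer_pattern(h: int, w: int) -> np.ndarray:
    """Return an (h, w) array of filter colors 'R', 'G', 'B' (h, w even)."""
    cell = np.array([["R", "G"],
                     ["G", "B"]])
    return np.tile(cell, (h // 2, w // 2))
```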
  • the keys 290 include a power-on key, a volume key, and the like. The keys 290 may be mechanical keys or touch keys.
  • the cell phone 200 can receive key input and generate key signal input related to user settings and function control of the cell phone 200 .
  • Motor 291 can generate vibrating cues.
  • the motor 291 can be used for vibrating alerts for incoming calls, and can also be used for touch vibration feedback.
  • the indicator 292 can be an indicator light, which can be used to indicate the charging status, the change of power, and can also be used to indicate messages, missed calls, notifications, and the like.
  • the SIM card interface 295 is used to connect a SIM card.
  • the SIM card can be contacted and separated from the mobile phone 200 by inserting into the SIM card interface 295 or pulling out from the SIM card interface 295 .
  • the mobile phone 200 may support one or N SIM card interfaces, where N is a positive integer greater than one.
  • the SIM card interface 295 can support Nano SIM card, Micro SIM card, SIM card and so on.
  • the mobile phone 200 interacts with the network through the SIM card to realize functions such as call and data communication.
  • the handset 200 may employ an eSIM, i.e., an embedded SIM card.
  • the eSIM card can be embedded in the mobile phone 200 and cannot be separated from the mobile phone 200 .
  • the structures illustrated in the embodiments of the present application do not constitute a specific limitation on the device.
  • the device may include more or fewer components than shown, or some components may be combined or split, or the components may be arranged differently.
  • the illustrated components may be implemented in hardware, software, or a combination of software and hardware.
  • FIG. 3 is a flowchart of a process 300 of an embodiment of an imaging method of the present application.
  • the process 300 may be performed by the imaging device shown in FIG. 1 , and specifically, may be performed by a mobile phone, a tablet computer, a video camera or a camera, etc. including the imaging device.
  • Process 300 is described as a series of steps or operations, and it should be understood that process 300 may be performed in various orders and/or concurrently, and is not limited to the order of execution shown in FIG. 3 .
  • Process 300 may include:
  • Step 301 Acquire a reflected light signal of a photographed object.
  • the imaging device can capture the reflected light signal of the photographed object through a light capture unit (such as a lens) therein. The reflected light signal is consistent with the incident light: natural light is still natural light after reflection, and infrared light is still infrared light, because reflection does not change the wavelength of the optical signal.
  • therefore, the reflected light signal in the present application may include natural light (also known as visible light) emitted by the sun, visible light emitted by fluorescent lamps, infrared light emitted by infrared supplementary lights, and the like.
  • Step 302 The reflected optical signal is separated into a first optical signal and a second optical signal by the optical splitting unit.
  • the beam splitting unit can use a beam splitter prism. In the present application, the beam splitter prism separates the reflected light signal into a first optical path and a second optical path, where the first optical path can be allocated 10% to 40% of the visible spectrum energy and more than 80% of the infrared spectrum energy, and the second optical path can be allocated 60% to 90% of the visible spectrum energy and less than 20% of the infrared spectrum energy.
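The energy bookkeeping of the split can be sketched as below. This is an illustrative model, not the patented prism design; the fractions chosen (10% visible plus all infrared on the first path, 90% visible on the second) are one point inside the ranges described above.

```python
VISIBLE_TO_PATH1 = 0.10   # within the 10%-40% visible-energy range above
INFRARED_TO_PATH1 = 1.00  # "more than 80%" of infrared; here all of it

def split_signal(visible_energy, infrared_energy):
    """Return (first_path, second_path), each as a (visible, infrared) energy tuple."""
    first = (visible_energy * VISIBLE_TO_PATH1,
             infrared_energy * INFRARED_TO_PATH1)
    second = (visible_energy * (1 - VISIBLE_TO_PATH1),
              infrared_energy * (1 - INFRARED_TO_PATH1))
    return first, second
```

With 100 units of visible energy and 50 units of infrared energy, the first path receives (10, 50) and the second path (90, 0), mirroring the day/night roles of the two sensors described later.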
  • the optical signal (first optical signal) on the first optical path is input to the first image sensor
  • the optical signal (second optical signal) on the second optical path is input to the second image sensor
  • the resolution of the first image sensor is higher than that of the second image sensor
  • the first image sensor converts the first optical signal into a first image signal (for example, a .raw image) through photoelectric conversion
  • the second image sensor converts the second optical signal into a second image signal (for example, a .raw image) through photoelectric conversion.
  • as the resolution increases, the unit pixel area on the photosensitive surface of the image sensor decreases, which in turn reduces the number of photons each pixel receives, resulting in a decrease in the signal-to-noise ratio.
  • in a low-illuminance scene (the above-mentioned first illuminance scene), an increase in resolution therefore seriously affects the color effect.
  • two image sensors, one of high resolution and one of low resolution, are therefore used in combination.
  • the low-resolution image sensor is used for color-path imaging and receives only visible light signals; the high-resolution image sensor receives both visible light signals and infrared signals.
  • large-resolution color imaging can be directly achieved during the day, and large-resolution infrared images (grayscale images) and small-resolution color images (chromaticity images) can be fused at night.
  • large-resolution imaging can be supported day and night.
  • in the first illumination scene, the infrared fill light is turned on to illuminate the photographed object, and the infrared filter is turned off, so that the first light signal includes the first component signal of the visible light signal and the infrared light signal reflected from the infrared fill light, and the second light signal includes the second component signal of the visible light signal.
  • the supplementary light unit in the imaging device works and emits an infrared light signal, so the reflected light signal includes a visible light signal and the first infrared light signal from the infrared supplementary light.
  • the spectroscopic unit in the imaging device uses optical principles to perform spectral and energy separation on the reflected light signal to obtain a first light signal and a second light signal, where the first light signal includes the first component signal of the visible light signal (for example, 10% of the visible light signal) and the infrared light signal reflected from the infrared fill light, and the second light signal includes the second component signal of the visible light signal (for example, 90% of the visible light signal).
  • Step 303 Acquire a first image signal according to the first optical signal.
  • the first image acquisition unit in the imaging device converts the first optical signal into an electrical signal using the working principle of the image sensor to form a first image signal, i.e., an initial image prototype.
  • the first image signal is a photoelectrically converted electrical signal that constitutes an original image, for example a .raw file. Since the first light signal includes an infrared light signal, the first image signal is a black-and-white first grayscale image signal.
  • the image processing unit in the imaging device obtains the first grayscale image according to the first image signal. This involves the image processing algorithm of the ISP, which converts the original image in .raw format into an image visible to the human eye.
  • the first grayscale image is black and white.
  • Step 304 Acquire a second image signal according to the second optical signal.
  • the second image acquisition unit in the imaging device converts the second optical signal into an electrical signal using the working principle of the image sensor to form a second image signal, i.e., an initial image prototype. The second image signal is also a photoelectrically converted electrical signal constituting an original image, for example a .raw file. Since the second light signal only includes visible light signals, the second image signal is a colored visible light image signal.
  • the image processing unit in the imaging device separates the luminance and chromaticity of the second image signal to obtain a second grayscale (luma) image and a chrominance (chroma) image. This also involves the image processing algorithm of the ISP: the original image in .raw format is converted into an image visible to the human eye, the RGB format is converted into YUV format, and the YUV image is separated into its components.
  • the second grayscale (luma) map is black and white, and the chroma map is in color.
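The luminance/chromaticity separation above can be illustrated per pixel. This is a hedged sketch using the well-known BT.601 full-range coefficients; the patent's ISP operates on whole .raw images and may use different coefficients, so this only shows the principle of splitting Y (luma) from UV (chroma).

```python
def rgb_to_yuv(r, g, b):
    """Convert one RGB pixel (0-255) to (Y, U, V) with BT.601-style weights.

    Y is the grayscale (luma) component; U and V carry the chromaticity.
    """
    y = 0.299 * r + 0.587 * g + 0.114 * b
    u = -0.14713 * r - 0.28886 * g + 0.436 * b   # blue-difference chroma
    v = 0.615 * r - 0.51499 * g - 0.10001 * b    # red-difference chroma
    return y, u, v
```

A neutral gray pixel keeps its value in Y while U and V stay near zero, which is why the luma map alone looks like a black-and-white image.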
  • the resolution of the image sensor in the first image acquisition unit of the imaging device is higher than that of the image sensor in the second image acquisition unit, that is, the photosensitive surface of the image sensor in the first image acquisition unit contains more pixels than that of the image sensor in the second image acquisition unit.
  • therefore, the resolution of the first image signal obtained by the first image acquisition unit is higher than the resolution of the second image signal obtained by the second image acquisition unit.
  • the resolution of the first grayscale image obtained in step 303 is higher than the resolution of the second grayscale (luma) image and the chromaticity (chroma) image obtained in step 304 .
  • in the first illumination scene, the first target image is acquired according to the first image signal and the second image signal.
  • in the second illumination scene, the second target image is acquired according to the first image signal.
  • the resolution of the first image signal is higher than the resolution of the second image signal; the illumination intensity of the second illumination scene is greater than the illumination intensity of the first illumination scene.
  • the image processing unit in the imaging device performs grayscale fusion of the first grayscale image and the second grayscale (luma) image to obtain a third grayscale image.
  • the resolution of the third grayscale image is the same as the resolution of the above-mentioned first grayscale image.
  • the third grayscale image and the chroma map are then color-fused to obtain the target image, and the resolution of the target image is also the same as the resolution of the first grayscale image.
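The two-step fusion above can be sketched on images represented as nested lists. This is a hypothetical illustration only: real implementations use calibrated registration and the ISP's proprietary fusion algorithms, while here nearest-neighbour 2x upsampling and a plain weighted average stand in for them, and all function names are our own.

```python
def upsample2x(img):
    """Nearest-neighbour 2x upsampling of a 2-D list (rows of pixel values)."""
    return [[v for v in row for _ in (0, 1)] for row in img for _ in (0, 1)]

def grayscale_fusion(high_gray, low_luma, weight=0.5):
    """Step 1: fuse the high-res grayscale image with the upsampled low-res luma."""
    up = upsample2x(low_luma)
    return [[weight * a + (1 - weight) * b for a, b in zip(ra, rb)]
            for ra, rb in zip(high_gray, up)]

def color_fusion(fused_luma, low_chroma):
    """Step 2: attach upsampled chroma (U, V) pairs to the fused luma as YUV pixels."""
    up = upsample2x(low_chroma)
    return [[(y, uv[0], uv[1]) for y, uv in zip(ry, ruv)]
            for ry, ruv in zip(fused_luma, up)]
```

Both steps fuse a high-resolution image with a low-resolution one, which is why the output keeps the first grayscale image's resolution while inheriting the color path's signal-to-noise ratio.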
  • in summary, the reflected light signal is separated by a spectroscopic unit, the separated light signals are photoelectrically converted by a high-resolution and a low-resolution image sensor to obtain the corresponding image signals, these are converted into the corresponding images, and a two-step fusion process (grayscale fusion and color fusion) yields the final target image.
  • since the first grayscale image also contains visible light signals, registration within the same spectrum can be achieved, improving the fusion efficiency of the images.
  • both grayscale fusion and color fusion are fusions between a high-resolution image and a low-resolution image, which can not only retain the high signal-to-noise ratio of the low-resolution image but also yield a high-resolution image, so that images with more vivid colors and higher definition can be obtained.
  • in the second illumination scene, the infrared fill light is turned off and the infrared filter is turned on, so that the first light signal includes the first component signal of the visible light signal but does not include the infrared light signal, and the second light signal includes the second component signal of the visible light signal.
  • the supplementary light unit in the imaging device does not work, and the filter unit works, so the reflected light signal includes the visible light signal and the second infrared light signal from nature.
  • the light splitting unit in the imaging device uses optical principles to separate the reflected light signal to obtain a first light signal and a second light signal, where the first light signal includes the first component signal of the visible light signal (for example, 10% of the visible light signal) and the natural infrared light signal, and the second light signal includes the second component signal of the visible light signal (for example, 90% of the visible light signal).
  • the filter unit in the imaging device filters out the second infrared light signal in the first light signal, so the first light signal reaching the first image acquisition unit only includes the first component signal of the visible light signal.
  • the first image acquisition unit in the imaging device converts the first optical signal into an electrical signal using the working principle of the image sensor to form a first image signal, which is a photoelectrically converted electrical signal constituting an original image, for example a .raw file. Since the first light signal now includes only a visible light signal, the first image signal is a colored visible light image signal.
  • the second image acquisition unit in the imaging device converts the second optical signal into an electrical signal using the working principle of the image sensor to form a second image signal, which is also a photoelectrically converted electrical signal constituting an original image, for example a .raw file. Since the second light signal only includes visible light signals, the second image signal is a colored visible light image signal.
  • the resolution of the first image signal is higher than that of the second image signal.
  • the image processing unit in the imaging device obtains a target image according to the first image signal, and the resolution of the target image is the same as that of the first image signal and higher than that of the second image signal.
  • the image processing unit may only use the second image signal to detect the illumination intensity, and it is not necessary to use the second image signal to perform image fusion.
  • a high-resolution image can be directly acquired, thereby improving the imaging effect of the image.
  • the method further includes: judging the current illumination scene according to the second light signal. When the illumination intensity corresponding to the second light signal is less than a first threshold, the current illumination scene is determined to be the first illumination scene; when the illumination intensity is greater than or equal to the first threshold, the current illumination scene is determined to be the second illumination scene. Alternatively, when the signal gain corresponding to the second light signal is greater than a second threshold, the current illumination scene is determined to be the first illumination scene; when the signal gain is less than or equal to the second threshold, the current illumination scene is determined to be the second illumination scene.
  • the above-mentioned first threshold and second threshold may be set in advance according to historical data or experience.
  • the light intensity can be detected, for example, by a light sensor on the imaging device.
  • the signal gain can be detected by an image sensor in the second image acquisition unit 40, for example.
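The scene decision above reduces to two threshold tests, either of which may be used. In this sketch the threshold values are placeholders, since the patent only says they are "set in advance according to historical data or experience"; the function names are ours.

```python
FIRST_THRESHOLD_LUX = 5.0    # placeholder illumination-intensity threshold
SECOND_THRESHOLD_DB = 30.0   # placeholder signal-gain threshold

def scene_from_illumination(lux):
    """'first' (low-light) scene if intensity is below the first threshold."""
    return 'first' if lux < FIRST_THRESHOLD_LUX else 'second'

def scene_from_gain(gain_db):
    """'first' (low-light) scene if gain exceeds the second threshold.

    High sensor gain indicates the scene is dark, hence the inverted comparison
    relative to the illumination test.
    """
    return 'first' if gain_db > SECOND_THRESHOLD_DB else 'second'
```

Note the boundary handling matches the text: intensity equal to the first threshold, or gain equal to the second threshold, both select the second (bright) illumination scene.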
  • the light capture unit uses an F1.4 constant aperture lens, which is confocal in the wavelength range of 400nm to 940nm.
  • the fill light unit uses a set of 850nm-band light-emitting diode (LED) infrared fill lights, six in total.
  • the fill light unit is built in the imaging device, and the imaging device controls the switch and brightness of the infrared fill light through the I2C bus.
  • the beam splitting unit uses an optical prism composed of two isosceles right-angle glass prisms. Based on coating technology and the design of the prism parameters, it transmits all of the infrared light signal and 10% of the visible light signal in the received reflected light signal in the A direction, while refracting 90% of the visible light signal in the reflected light signal in the B direction.
  • the A direction and the B direction form an included angle of 90 degrees.
  • the filter unit uses an IR-CUT dual filter installed in the A direction behind the optical prism. When the IR-CUT dual filter is in the working state, it only allows light signals with wavelengths of 400nm to 750nm to pass; when it is in the non-working state, all light signals pass through.
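The two filter states can be modeled in a few lines. This is an illustrative model of the behavior described above, not vendor firmware; the function name and signature are assumptions.

```python
def ircut_passes(wavelength_nm, working):
    """Whether light of the given wavelength passes the IR-CUT dual filter.

    Working state: only the 400-750 nm visible band passes.
    Non-working state: all wavelengths pass (infrared included).
    """
    if not working:
        return True
    return 400 <= wavelength_nm <= 750
```

In day mode the filter works, so the 850nm fill-light band is blocked; in night mode it is switched off, letting the infrared reach the 4K sensor.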
  • the first image acquisition unit uses a 4K Bayer sensor (hereinafter referred to as a 4K sensor) with a 1/1.2 target surface; the resolution of the sensor is 3840×2160.
  • the second image acquisition unit uses a 2K Bayer sensor (hereinafter referred to as a 2K sensor) with a 1/1.2 target surface; the resolution of the sensor is 1920×1080.
  • the image processing unit uses the SOC processor of the ARM+DSP architecture.
  • FIG. 4 is an exemplary frame diagram of an embodiment of an imaging method of the present application.
  • the imaging method of this embodiment includes a day mode and a night mode, where the night mode corresponds to the first illumination scene and the day mode corresponds to the second illumination scene.
  • the initial mode of the camera is day mode
  • the exposure time of the 2K sensor is set to 10ms.
  • when the SOC processor detects that the gain of the second image signal from the 2K sensor is greater than or equal to 30dB, the camera switches to night mode; when the gain is less than 30dB, the camera continues to run in day mode.
  • the 4K sensor captures 10% of the visible light signal and converts it into the first image signal and transmits it to the SOC processor. At the same time, the 2K sensor captures 90% of the visible light signal and converts it into a second image signal, which is also transmitted to the SOC processor.
  • the SOC processor opens an ISP pipe (pipe), and performs image processing on the first image signal through the ISP pipe, and finally outputs a high-resolution RGB image, and the RGB image is color.
  • the SOC processor also detects the signal gain according to the second image signal to determine whether mode conversion is required.
  • the 4K sensor captures 10% of the visible light signal and the first infrared light signal and converts it into a first image signal, which is transmitted to the SOC processor. At the same time, the 2K sensor captures 90% of the visible light signal and converts it into a second image signal, which is also transmitted to the SOC processor.
  • the SOC processor opens two ISP pipes, performs image processing on the first image signal through one ISP pipe, and converts the 4K RGB image signal into a high-resolution first grayscale image. At the same time, image processing is performed on the second image signal through another ISP pipe, and the 2K RGB signal is converted into a low-resolution RGB image.
  • the SOC processor converts the low-resolution RGB image into YUV format to obtain a low-resolution luminance (luma) image (i.e., the Y channel) and a low-resolution chrominance (chroma) image (i.e., the UV channel).
  • the SOC processor performs grayscale fusion of the low-resolution luminance (luma) image and the high-resolution first grayscale image to obtain a high-resolution luminance (luma) image.
  • the SOC processor performs color fusion of the high-resolution luminance (luma) image and the low-resolution chrominance (chroma) image to obtain a high-resolution RGB image for output.
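The resolution bookkeeping in this night-mode pipeline is worth making explicit: the 2K (1920×1080) luma/chroma pair must be brought up to the 4K (3840×2160) grid of the first grayscale image before fusion. A minimal sketch of that check (helper name is ours):

```python
SENSOR_4K = (3840, 2160)  # resolution of the first (grayscale) path
SENSOR_2K = (1920, 1080)  # resolution of the second (color) path

def upscale_factor(low, high):
    """Integer (x, y) scale factor between two (width, height) resolutions."""
    assert high[0] % low[0] == 0 and high[1] % low[1] == 0, "not an integer scale"
    return high[0] // low[0], high[1] // low[1]
```

For the sensors in this embodiment the factor is exactly 2 in each dimension, so a simple 2x upscaling aligns the two paths pixel-for-pixel.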
  • the SOC processor also detects the signal gain according to the second image signal to determine whether mode conversion is required.
  • FIG. 5 is a schematic structural diagram of an embodiment of an imaging apparatus of the present application. As shown in FIG. 5 , the apparatus may be applied to the terminal device in the above-mentioned embodiment.
  • the imaging device of this embodiment may include: a light capturing module 1501 , a light splitting module 1502 , an image acquiring module 1503 and an image processing module 1504 , wherein:
  • the light capture module 1501 is used to acquire the reflected light signal of the photographed object; the light splitting module 1502 is used to separate the reflected light signal into a first light signal and a second light signal by a light splitting unit, where the light splitting unit is used for splitting spectrum and energy; the image acquisition module 1503 is configured to acquire a first image signal according to the first optical signal and a second image signal according to the second optical signal, where the resolution of the first image signal is higher than the resolution of the second image signal.
  • the image processing module 1504 is configured to acquire a first target image according to the first image signal and the second image signal in a first illumination scene; or, in a second illumination scene, acquire a second target image according to the first image signal; where the illumination intensity of the second illumination scene is greater than the illumination intensity of the first illumination scene.
  • the light splitting unit is used to separate the visible light signal and the infrared light signal in the reflected light signal, so as to achieve spectrum splitting; the light splitting unit is also used to divide the visible light signal into a first component signal that accounts for 10%-40% of the visible spectrum energy and a second component signal that accounts for 60%-90% of the visible spectrum energy.
  • the image processing module 1504 is further configured to turn on an infrared fill light to illuminate the photographed object and turn off the infrared filter, so that the first optical signal includes a first component signal of the visible light signal and an infrared light signal reflected from the infrared fill light, and the second optical signal includes a second component signal of the visible light signal.
  • the image processing module 1504 is further configured to turn off the infrared fill light and turn on the infrared filter, so that the first light signal includes the first component signal of the visible light signal but does not include the infrared light signal, and the second light signal includes the second component signal of the visible light signal.
  • the image acquisition module 1503 is specifically configured to input the first optical signal into the first image sensor to acquire the first image signal, and input the second optical signal into the second image sensor to acquire the second image signal, where the resolution of the first image sensor is higher than the resolution of the second image sensor.
  • the image processing module 1504 is specifically configured to: obtain a first grayscale map according to the first image signal; obtain a second grayscale map and a chromaticity map according to the second image signal; perform grayscale fusion of the first grayscale image and the second grayscale image to obtain a third grayscale image; and perform color fusion on the third grayscale image and the chromaticity map to obtain the first target image; where the resolution of the first grayscale image is higher than the resolution of the second grayscale image and higher than the resolution of the chromaticity map.
  • the image processing module 1504 is further configured to determine the current lighting scene according to the second light signal: when the light intensity corresponding to the second light signal is less than the first threshold, determine that the current illumination scene is the first illumination scene; when the illumination intensity is greater than or equal to the first threshold, determine that the current illumination scene is the second illumination scene; or, when the signal gain corresponding to the second light signal is greater than the second threshold, determine that the current illumination scene is the first illumination scene; or, when the signal gain is less than or equal to the second threshold, determine that the current illumination scene is the second illumination scene.
  • the apparatus of this embodiment can be used to execute the technical solution of the method embodiment shown in FIG. 3 , and its implementation principle and technical effect are similar, and details are not repeated here.
  • each step of the above method embodiments may be completed by a hardware integrated logic circuit in a processor or an instruction in the form of software.
  • the processor may be a general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component.
  • a general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
  • the steps of the methods disclosed in the embodiments of the present application may be directly embodied as executed by a hardware coding processor, or executed by a combination of hardware and software modules in the coding processor.
  • the software modules can be located in random access memory, flash memory, read-only memory, programmable read-only memory or electrically erasable programmable memory, registers and other storage media mature in the art.
  • the storage medium is located in the memory, and the processor reads the information in the memory, and completes the steps of the above method in combination with its hardware.
  • the memory mentioned in the above embodiments may be volatile memory or non-volatile memory, or may include both volatile and non-volatile memory.
  • the non-volatile memory may be read-only memory (ROM), programmable read-only memory (PROM), erasable programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM) or flash memory.
  • Volatile memory may be random access memory (RAM), which acts as an external cache.
  • the disclosed system, apparatus and method may be implemented in other manners.
  • the apparatus embodiments described above are only illustrative.
  • the division of the units is only a logical function division; in actual implementation, there may be other division methods.
  • multiple units or components may be combined or integrated into another system, or some features may be ignored or not implemented.
  • the shown or discussed mutual coupling, direct coupling or communication connection may be an indirect coupling or communication connection of devices or units through some interfaces, and may be in electrical, mechanical or other forms.
  • the units described as separate components may or may not be physically separated, and components displayed as units may or may not be physical units, that is, may be located in one place, or may be distributed to multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution in this embodiment.
  • each functional unit in each embodiment of the present application may be integrated into one processing unit, or each unit may exist physically alone, or two or more units may be integrated into one unit.
  • the functions, if implemented in the form of software functional units and sold or used as independent products, may be stored in a computer-readable storage medium.
  • the technical solution of the present application, in essence, or the part that contributes to the prior art, or a part of the technical solution, can be embodied in the form of a software product; the computer software product is stored in a storage medium and includes several instructions used to cause a computer device (a personal computer, a server, a network device, etc.) to execute all or part of the steps of the methods described in the various embodiments of the present application.
  • the aforementioned storage medium includes various media that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk or an optical disc.

Abstract

Provided in the present application are an imaging method and apparatus. The imaging method of the present application comprises: acquiring a reflected optical signal of a photographed object; separating the reflected optical signal into a first optical signal and a second optical signal by means of a light splitting unit, the light splitting unit being used for spectral and energy splitting; acquiring a first image signal according to the first optical signal; and acquiring a second image signal according to the second optical signal, the resolution of the first image signal being higher than the resolution of the second image signal. In the present application, a high signal-to-noise ratio of a low-resolution image can be retained, and a high-resolution image can also be acquired, thereby obtaining an image having brighter colors and a higher definition.

Description

Imaging method and apparatus
This application claims the priority of the Chinese patent application with the application number 202011058209.5 and the application title "Imaging Method and Apparatus" filed with the China Patent Office on September 29, 2020, the entire contents of which are incorporated into this application by reference.
Technical Field
The present application relates to image processing technology, and in particular, to an imaging method and apparatus.
Background
When shooting at night, many cameras suffer from poor image quality and blurred imaging, and cannot meet the needs of low-illumination scenes, especially night shooting. Some technologies aimed at low-illumination shooting have appeared on the market: infrared fill light is used, and a "spectral prism" then divides the light signal entering the camera into two light signals, one below 750nm and one above 750nm. The two light signals are imaged by two image sensors, and the two imaging results are then fused into a single color image.
However, as the resolution of the sensor increases, the signal-to-noise ratio of the sensor decreases, resulting in poor imaging effects in low-illumination scenes.
Summary of the Invention
The present application provides an imaging method and apparatus that can retain the high signal-to-noise ratio of a low-resolution image while also acquiring a high-resolution image, so that an image with more vivid colors and higher definition can be obtained.
In a first aspect, the present application provides an imaging method, including: acquiring a reflected light signal of a photographed object; separating the reflected light signal into a first light signal and a second light signal by a light splitting unit, where the light splitting unit is used for spectral splitting and energy splitting; acquiring a first image signal according to the first light signal; acquiring a second image signal according to the second light signal; in a first illuminance scene, acquiring a first target image according to the first image signal and the second image signal; or, in a second illuminance scene, acquiring a second target image according to the first image signal; where the resolution of the first image signal is higher than the resolution of the second image signal, and the illumination intensity of the second illuminance scene is greater than the illumination intensity of the first illuminance scene.
In this application, the reflected light signal of the photographed object may be captured by a light capturing module (for example, a lens). The reflected light signal is consistent with the light irradiating the lens: natural light is still natural light after reflection, and infrared light is still infrared light after reflection; reflection does not change the wavelength of the light signal. Therefore, the reflected light signal in this application may include natural light emitted by the sun (also known as visible light), visible light emitted by fluorescent lamps, infrared light emitted by an infrared fill light, and the like.
The light splitting unit may be a light splitting prism, which separates the visible light signal and the infrared light signal in the reflected light signal to achieve spectral splitting. The light splitting unit is further used to divide the visible light signal into a first component signal accounting for 10% to 40% of the visible-spectrum energy and a second component signal accounting for 60% to 90% of the visible-spectrum energy. The first component signal and the infrared light signal form the first light signal on a first light path, and the second component signal forms the second light signal on a second light path. The first light signal is input to a first image sensor, and the second light signal is input to a second image sensor, where the resolution of the first image sensor is higher than that of the second image sensor. Through photoelectric conversion, the first image sensor converts the first light signal into a first image signal (for example, a .raw image), and the second image sensor converts the second light signal into a second image signal (for example, a .raw image). As the resolution of an image sensor increases, the unit pixel area on its photosensitive surface decreases, which reduces the number of photons received per pixel and therefore lowers the signal-to-noise ratio. Especially in a low-illuminance scene (the first illuminance scene described above), because the amount of light entering the color light path is insufficient, an increase in resolution severely degrades the color effect. In the above method, two image sensors of large and small resolutions form a large/small-resolution sensor combination: the small-resolution sensor is used for color-path imaging and receives only visible light, while the large-resolution sensor receives both visible light and infrared light. In the daytime, large-resolution color imaging can be achieved directly; at night, a large-resolution infrared image (grayscale image) and a small-resolution color image (chromaticity image) are fused. Large-resolution imaging is thus supported both day and night.
Optionally, in the first illuminance scene, the infrared fill light is turned on to illuminate the photographed object, and the infrared filter is disabled, so that the first light signal includes the first component signal of the visible light signal and the infrared light signal reflected from the infrared fill light, and the second light signal includes the second component signal of the visible light signal.
In the first illuminance scene, the infrared fill light operates and emits an infrared light signal, so the reflected light signal includes a visible light signal and the infrared light signal from the infrared fill light. Using optical principles, the light splitting unit performs spectral and energy separation on the reflected light signal to obtain the first light signal and the second light signal, where the first light signal includes the first component signal of the visible light signal (for example, 10% of the visible light signal) and the infrared light signal reflected from the infrared fill light, and the second light signal includes the second component signal of the visible light signal (for example, 90% of the visible light signal).
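As an illustrative sketch (not part of the claimed embodiments; the 10%/90% split is one example taken from the ranges above), the spectral and energy separation in the first illuminance scene can be modeled as follows:

```python
def split_light(visible_energy, infrared_energy, visible_ratio_path1=0.10):
    """Model the light splitting unit: the first path receives a fraction of the
    visible energy plus all infrared energy; the second path receives the rest."""
    path1 = {
        "visible": visible_energy * visible_ratio_path1,
        "infrared": infrared_energy,   # infrared goes entirely to the first light path
    }
    path2 = {
        "visible": visible_energy * (1.0 - visible_ratio_path1),
        "infrared": 0.0,               # the second (color) path receives no infrared
    }
    return path1, path2

p1, p2 = split_light(visible_energy=100.0, infrared_energy=40.0)
print(p1["visible"], p2["visible"], p1["infrared"])  # 10.0 90.0 40.0
```

The energy bookkeeping shows why the second path is suitable for color imaging (visible light only) while the first path, carrying all the infrared energy, retains enough signal for grayscale imaging at low illuminance.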
In this application, a first grayscale image may be acquired according to the first image signal; a second grayscale image and a chromaticity image may be acquired according to the second image signal; grayscale fusion is performed on the first grayscale image and the second grayscale image to obtain a third grayscale image; and color fusion is performed on the third grayscale image and the chromaticity image to obtain the first target image, where the resolution of the first grayscale image is higher than the resolution of the second grayscale image, and the resolution of the first grayscale image is higher than the resolution of the chromaticity image.
Since the first light signal contains part of the visible light signal and the infrared light signal, a black-and-white first grayscale (luma) image can be obtained from the first image signal converted from the first light signal. Acquiring the first grayscale image from the first image signal involves the image processing algorithms of an ISP, that is, converting the original .raw picture into an image visible to the human eye.
Since the second light signal contains only part of the visible light signal (the visible light signal also contains an infrared component; refer to the spectrum of natural light), the second image signal converted from the second light signal is separated into grayscale and chromaticity components to obtain a black-and-white second grayscale (luma) image and a color chromaticity (chroma) image. This also involves the image processing algorithms of an ISP: the original .raw picture is converted into an image visible to the human eye, the RGB format is converted into the YUV format, and the YUV image is separated into its components.
Since the resolution of the first image sensor is higher than that of the second image sensor, that is, the photosensitive surface of the first image sensor contains more pixels than that of the second image sensor, the resolution of the first image signal obtained by the first image acquisition unit is higher than the resolution of the second image signal obtained by the second image acquisition unit. Correspondingly, the resolution of the first grayscale (luma) image is higher than that of the second grayscale (luma) image, and also higher than that of the chromaticity (chroma) image.
In this application, grayscale fusion of the first grayscale (luma) image and the second grayscale (luma) image yields a third grayscale (luma) image, whose resolution is the same as that of the first grayscale image. Color fusion of the third grayscale (luma) image and the chromaticity (chroma) image then yields the target image, whose resolution is also the same as that of the first grayscale image.
As the resolution of an image sensor increases, the unit pixel area on its photosensitive surface decreases, which reduces the number of photons received per pixel and therefore lowers the signal-to-noise ratio. Especially in a low-illuminance scene (the first illuminance scene described above), because the amount of light entering the color light path is insufficient, an increase in resolution severely degrades the color effect. In the above method, two image sensors of large and small resolutions form a large/small-resolution sensor combination: the small-resolution sensor is used for color-path imaging and receives only visible light, while the large-resolution sensor receives both visible light and infrared light. In the daytime, large-resolution color imaging can be achieved directly; at night, a large-resolution infrared image (grayscale image) and a small-resolution color image (chromaticity image) are fused. Large-resolution imaging is thus supported both day and night.
In the present application, the reflected light signal is separated by the light splitting unit, the separated light signals are photoelectrically converted by a high-resolution image sensor and a low-resolution image sensor into corresponding image signals, and these are converted into corresponding images. The final target image is obtained through a two-step fusion process (grayscale fusion and color fusion). During grayscale fusion, the first grayscale image also contains the visible light signal, which enables same-spectrum registration and improves fusion efficiency. Both grayscale fusion and color fusion are fusions between a high-resolution image and a low-resolution image, which retains the high signal-to-noise ratio of the low-resolution image while acquiring a high-resolution image, so that an image with more vivid colors and higher definition can be obtained.
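The two-step fusion described above can be sketched numerically as follows. This is an illustrative model only: the nearest-neighbor upsampling, the fixed weighting factor, and the YUV plane layout are assumptions for the sketch, and the application does not prescribe a specific fusion algorithm.

```python
import numpy as np

def upsample(img, factor):
    """Nearest-neighbor upsampling of an H x W (or H x W x C) image."""
    return np.repeat(np.repeat(img, factor, axis=0), factor, axis=1)

def fuse(luma_hi, luma_lo, chroma_lo, w=0.7):
    """Two-step fusion: (1) grayscale fusion of the high-resolution luma with the
    upsampled low-resolution luma; (2) color fusion attaching the upsampled chroma."""
    factor = luma_hi.shape[0] // luma_lo.shape[0]
    luma_fused = w * luma_hi + (1 - w) * upsample(luma_lo, factor)  # third grayscale image
    chroma_up = upsample(chroma_lo, factor)                         # U and V planes
    return np.dstack([luma_fused, chroma_up])                       # YUV target image

luma_hi = np.random.rand(8, 8)        # luma from the large-resolution sensor
luma_lo = np.random.rand(4, 4)        # luma of the small-resolution color path
chroma_lo = np.random.rand(4, 4, 2)   # U, V of the small-resolution color path
target = fuse(luma_hi, luma_lo, chroma_lo)
print(target.shape)  # (8, 8, 3)
```

The resulting target image has the resolution of the first (high-resolution) grayscale image, while its color planes originate from the low-resolution path, matching the behavior described above.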
Optionally, in the second illuminance scene, the infrared fill light is turned off and the infrared filter is enabled, so that the first light signal includes the first component signal of the visible light signal but does not include the infrared light signal, and the second light signal includes the second component signal of the visible light signal.
In the second illuminance scene, the infrared fill light in the imaging apparatus does not operate and the infrared filter operates, so the reflected light signal includes the visible light signal and the infrared light signal from nature. Using optical principles, the light splitting unit in the imaging apparatus separates the reflected light signal into the first light signal and the second light signal, where the first light signal includes the first component signal of the visible light signal (for example, 10% of the visible light signal) and the natural infrared light signal, and the second light signal includes the second component signal of the visible light signal (for example, 90% of the visible light signal).
The infrared filter filters out the infrared light signal in the first light signal, so the first light signal reaching the first image acquisition unit includes only the first component signal of the visible light signal.
The first image sensor converts the first light signal into an electrical signal, forming a preliminary first image signal. This first image signal is a photoelectrically converted electrical signal constituting, for example, a .raw image. Since the first light signal contains a visible light signal, the first image signal is a color visible-light image signal.
The second image sensor converts the second light signal into an electrical signal, forming a preliminary second image signal. This second image signal is likewise a photoelectrically converted electrical signal constituting, for example, a .raw image. Since the second light signal contains only a visible light signal, the second image signal is a color visible-light image signal.
Similarly, the resolution of the first image signal is higher than that of the second image signal. The image processing module obtains the target image according to the first image signal; the resolution of the target image is the same as that of the first image signal and higher than that of the second image signal.
To reduce the amount of computation, the image processing unit may use the second image signal only for detecting the illumination intensity, without using it for image fusion. In the second illuminance scene, a high-resolution image can be acquired directly, improving the imaging effect.
In a possible implementation, the method further includes: determining the current illumination scene according to the second light signal; when the illumination intensity corresponding to the second light signal is less than a first threshold, determining that the current illumination scene is the first illuminance scene; when the illumination intensity is greater than or equal to the first threshold, determining that the current illumination scene is the second illuminance scene; or, when the signal gain corresponding to the second light signal is greater than a second threshold, determining that the current illumination scene is the first illuminance scene; or, when the signal gain is less than or equal to the second threshold, determining that the current illumination scene is the second illuminance scene.
The first threshold and the second threshold may be set in advance according to historical data or experience. The illumination intensity may be detected, for example, by a light sensor on the imaging apparatus. The signal gain may be detected, for example, by the image sensor in the second image acquisition unit.
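The threshold decision above can be sketched as follows. The threshold values are illustrative assumptions only; the application leaves them to be set from historical data or experience.

```python
FIRST_THRESHOLD_LUX = 1.0     # illustrative illumination-intensity threshold (lux)
SECOND_THRESHOLD_GAIN = 30.0  # illustrative signal-gain threshold

def detect_scene(lux=None, gain=None):
    """Classify the current scene as 'first' (low illuminance) or 'second'
    (high illuminance) from either an illuminance or a signal-gain reading."""
    if lux is not None:
        return "first" if lux < FIRST_THRESHOLD_LUX else "second"
    if gain is not None:
        return "first" if gain > SECOND_THRESHOLD_GAIN else "second"
    raise ValueError("need an illumination intensity or a signal gain")

print(detect_scene(lux=0.2))    # first  -> turn on IR fill light, disable IR filter
print(detect_scene(gain=12.0))  # second -> turn off IR fill light, enable IR filter
```

Note that gain and illuminance vary inversely: a high sensor gain indicates that little light is available, which is why the gain comparison is reversed relative to the illuminance comparison.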
In a second aspect, the present application provides an imaging apparatus, including: a light capturing module, configured to acquire a reflected light signal of a photographed object; a light splitting module, configured to separate the reflected light signal into a first light signal and a second light signal by a light splitting unit, where the light splitting unit is used for spectral splitting and energy splitting; and an image acquisition module, configured to acquire a first image signal according to the first light signal and acquire a second image signal according to the second light signal, where the resolution of the first image signal is higher than the resolution of the second image signal.
In a possible implementation, the apparatus further includes: an image processing module, configured to acquire a first target image according to the first image signal and the second image signal in a first illuminance scene; or acquire a second target image according to the first image signal in a second illuminance scene, where the illumination intensity of the second illuminance scene is greater than the illumination intensity of the first illuminance scene.
In a possible implementation, the light splitting unit is configured to separate the visible light signal and the infrared light signal in the reflected light signal to achieve spectral splitting; the light splitting unit is further configured to divide the visible light signal into a first component signal accounting for 10% to 40% of the visible-spectrum energy and a second component signal accounting for 60% to 90% of the visible-spectrum energy.
In a possible implementation, in the first illuminance scene, the image processing module is further configured to turn on the infrared fill light to illuminate the photographed object and disable the infrared filter, so that the first light signal includes the first component signal of the visible light signal and the infrared light signal reflected from the infrared fill light, and the second light signal includes the second component signal of the visible light signal.
In a possible implementation, in the second illuminance scene, the image processing module is further configured to turn off the infrared fill light and enable the infrared filter, so that the first light signal includes the first component signal of the visible light signal but does not include the infrared light signal, and the second light signal includes the second component signal of the visible light signal.
In a possible implementation, the image acquisition module is specifically configured to input the first light signal into a first image sensor to acquire the first image signal, and input the second light signal into a second image sensor to acquire the second image signal, where the resolution of the first image sensor is higher than the resolution of the second image sensor.
In a possible implementation, the image processing module is specifically configured to acquire a first grayscale image according to the first image signal; acquire a second grayscale image and a chromaticity image according to the second image signal; perform grayscale fusion on the first grayscale image and the second grayscale image to obtain a third grayscale image; and perform color fusion on the third grayscale image and the chromaticity image to obtain the first target image, where the resolution of the first grayscale image is higher than the resolution of the second grayscale image, and the resolution of the first grayscale image is higher than the resolution of the chromaticity image.
In a possible implementation, the image processing module is further configured to determine the current illumination scene according to the second light signal; when the illumination intensity corresponding to the second light signal is less than a first threshold, determine that the current illumination scene is the first illuminance scene; when the illumination intensity is greater than or equal to the first threshold, determine that the current illumination scene is the second illuminance scene; or, when the signal gain corresponding to the second light signal is greater than a second threshold, determine that the current illumination scene is the first illuminance scene; or, when the signal gain is less than or equal to the second threshold, determine that the current illumination scene is the second illuminance scene.
In a third aspect, the present application provides a terminal device, including: one or more processors; and a memory configured to store one or more programs; when the one or more programs are executed by the one or more processors, the one or more processors are caused to implement the method according to any one of the first aspect.
In a fourth aspect, the present application provides a computer-readable storage medium, including a computer program, which, when executed on a computer, causes the computer to perform the method according to any one of the first aspect.
In a fifth aspect, the present application provides a computer program, which, when executed by a computer, is used to perform the method according to any one of the first aspect.
Brief Description of the Drawings
FIG. 1 is a flowchart of an embodiment of an imaging apparatus of the present application;
FIG. 2 is an exemplary block diagram of a device embodiment of the present application;
FIG. 3 is a flowchart of a process 300 of an embodiment of the imaging method of the present application;
FIG. 4 is an exemplary framework diagram of an embodiment of the imaging method of the present application;
FIG. 5 is a schematic structural diagram of an embodiment of an imaging apparatus of the present application.
Detailed Description of Embodiments
To make the objectives, technical solutions, and advantages of the present application clearer, the technical solutions in the present application are described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some rather than all of the embodiments of the present application. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present application without creative efforts shall fall within the protection scope of the present application.
The terms "first", "second", and the like in the specification, claims, and accompanying drawings of the present application are used only to distinguish between descriptions and are not to be understood as indicating or implying relative importance or order. In addition, the terms "include" and "have" and any variants thereof are intended to cover non-exclusive inclusion, for example, inclusion of a series of steps or units. A method, system, product, or device is not necessarily limited to the steps or units expressly listed, but may include other steps or units not expressly listed or inherent to the process, method, product, or device.
It should be understood that in this application, "at least one (item)" means one or more, and "a plurality of" means two or more. "And/or" describes an association relationship between associated objects and indicates that three relationships may exist; for example, "A and/or B" may indicate three cases: only A exists, only B exists, and both A and B exist, where A and B may be singular or plural. The character "/" generally indicates an "or" relationship between the associated objects. "At least one of the following items" or a similar expression means any combination of these items, including a single item or any combination of plural items. For example, at least one of a, b, or c may indicate: a, b, c, "a and b", "a and c", "b and c", or "a and b and c", where a, b, and c may each be singular or plural.
The following explains terms involved in this application:
Illuminance: illumination intensity refers to the energy of visible light received per unit area, referred to as illuminance for short, measured in lux (lx). Illuminance indicates the degree to which an object is illuminated, that is, the ratio of the luminous flux received on the surface of an object to the illuminated area. For example, in summer under direct sunlight, the illuminance can reach 60,000 lx to 100,000 lx; outdoors without direct sun, 1,000 lx to 10,000 lx; in a bright room, 100 lx to 550 lx; under a full moon at night, about 0.2 lx.
The present application involves two application scenarios, namely a first illuminance scene and a second illuminance scene, where the illuminance of the first illuminance scene is less than that of the second illuminance scene. For example, the first illuminance scene may be a lightless night, a dim room, or a dark corner with poor daily lighting; the second illuminance scene may be outdoors in the daytime, a well-lit room in the daytime, or an indoor or outdoor space with sufficient artificial lighting.
YUV: a color space format for pixels. Y is the luminance (luma) component of a pixel, representing brightness or grayscale intensity; an image based only on the luminance component is a black-and-white image. U and V are the chrominance (chroma) components of a pixel, representing pixel color; an image based on the chrominance components is a color image.
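For illustration, one common RGB-to-YUV definition (the BT.601 analog form; the application does not mandate a specific conversion matrix) separates a pixel into its luma and chroma components:

```python
def rgb_to_yuv(r, g, b):
    """BT.601 RGB -> YUV: Y is the grayscale level; U and V carry the color."""
    y = 0.299 * r + 0.587 * g + 0.114 * b
    u = 0.492 * (b - y)
    v = 0.877 * (r - y)
    return y, u, v

# A neutral gray pixel has zero chroma: an image built from Y alone is black-and-white.
y, u, v = rgb_to_yuv(128, 128, 128)
print(round(y, 3), round(u, 3), round(v, 3))  # 128.0 0.0 0.0
```

This is the separation exploited above: the Y plane alone yields the grayscale image used for luma fusion, while the U/V planes form the chromaticity image used for color fusion.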
Visible light: also referred to as a visible light signal. The part of the electromagnetic spectrum perceivable by the human eye, with wavelengths of, for example, 400 nm to 750 nm.
Infrared light: also referred to as an infrared light signal. An electromagnetic wave whose frequency is between microwaves and visible light, with a wavelength greater than 750 nm, for example, between 760 nm and 1 mm.
Prism: a polyhedron made of a transparent material (for example, glass or crystal), used to split light or disperse a light beam. Prisms are widely used in spectral instruments. For example, a "dispersive prism" decomposes composite light into a spectrum, the most common being the equilateral triangular prism; a "total reflection prism" changes the direction of light to adjust its imaging position, the most common being the right-angle prism used in instruments such as periscopes and binoculars.
Image sensor (sensor): uses the photoelectric conversion function of a photoelectric device to convert the light image on the photosensitive surface into an electrical signal proportional to the light image. The photosensitive surface is divided into many small units, each corresponding to a pixel. For example, in a Bayer sensor, RGB color filters are arranged over the photosensitive surface to form a mosaic color filter array, of which 50% is green for sensing green light, 25% is red for sensing red light, and 25% is blue for sensing blue light.
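A sketch of the 2 x 2 RGGB Bayer tile described above, verifying the 50%/25%/25% filter proportions:

```python
def bayer_pattern(height, width):
    """Build an RGGB Bayer color filter array; each cell names its filter color."""
    tile = [["R", "G"], ["G", "B"]]  # one 2x2 RGGB unit, repeated across the surface
    return [[tile[y % 2][x % 2] for x in range(width)] for y in range(height)]

cfa = bayer_pattern(4, 4)
flat = [c for row in cfa for c in row]
print(flat.count("G") / len(flat))  # 0.5 -> half of the filters sense green
```

The doubled green count mirrors the human eye's greater sensitivity to green light, which is why Bayer arrays devote half their filters to it.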
Signal-to-noise ratio: the ratio of the signal strength sensed by the sensor to the noise it generates. With the lens and sensor size unchanged, the signal-to-noise ratio of a sensor is positively correlated with the area of a single pixel on the photosensitive surface. For example, a 1/1.8-inch sensor with 4 megapixels has a pixel pitch of 3 um; if the resolution is increased to 8 megapixels, the pixel pitch drops to 2 um. Clearly, the photosensitive area corresponding to a single pixel decreases, so the light signal it receives is weaker, which lowers the signal-to-noise ratio.
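The arithmetic in this example can be checked directly, using the simplified assumption that the photon count relevant to the signal-to-noise ratio scales with pixel area:

```python
def pixel_area_um2(pitch_um):
    """Photosensitive area of a single square pixel, in square micrometers."""
    return pitch_um ** 2

area_4mp = pixel_area_um2(3.0)  # 4 MP on a 1/1.8" sensor: 3 um pitch -> 9 um^2
area_8mp = pixel_area_um2(2.0)  # 8 MP on the same sensor:  2 um pitch -> 4 um^2
print(area_8mp / area_4mp)      # ~0.444: each pixel collects under half the photons
```

Doubling the pixel count thus costs more than half the per-pixel light, which is the quantitative basis for pairing the high-resolution sensor with a lower-resolution color sensor.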
Heterospectral images: images formed from different spectra. For example, wavelengths of 400 nm to 750 nm belong to visible light and wavelengths of 750 nm to 850 nm belong to infrared light; the images formed from these two kinds of light are heterospectral images.
FIG. 1 is a schematic diagram of an embodiment of an imaging apparatus of the present application. As shown in FIG. 1, the imaging apparatus may include a light capturing unit 10, a light splitting unit 20, a first image acquisition unit 30, a second image acquisition unit 40, and an image processing unit 50. The output of the light capturing unit 10 is connected to the input of the light splitting unit 20; the two outputs of the light splitting unit 20 are connected to the input of the first image acquisition unit 30 and the input of the second image acquisition unit 40, respectively; and the output of the first image acquisition unit 30 and the output of the second image acquisition unit 40 are each connected to an input of the image processing unit 50.
The light capturing unit 10 is configured to capture the light signal reflected by the photographed object. For example, the light capturing unit 10 may be the lens of a video camera or still camera: light (a light signal) strikes the photographed object, is reflected by it, and is captured by the lens. The light may be emitted by any light source in the environment of the photographed object, including natural light from the sun (also called visible light), visible light from fluorescent lamps, and infrared light from an infrared fill light. The light capturing unit 10 captures the reflected light signal, which is spectrally identical to the incident light: natural light is still natural light after reflection and infrared light is still infrared light after reflection, because reflection does not change the wavelength of the light signal.
The light splitting unit 20 is configured to separate the reflected light signal into a first light signal and a second light signal, transmit the first light signal to the first image acquisition unit, and transmit the second light signal to the second image acquisition unit. The light splitting unit 20 may be a prism, using the prism's optical properties to decompose the reflected light signal into two light signals so that the two can be processed independently by subsequent modules.
The first image acquisition unit 30 is configured to generate a first image signal from the first light signal and transmit it to the image processing unit; the second image acquisition unit 40 is configured to generate a second image signal from the second light signal and transmit it to the image processing unit. The resolution of the first image signal is higher than that of the second image signal. The first image acquisition unit 30 and the second image acquisition unit 40 each include an image sensor that converts the incident light signal into an electrical signal, forming a preliminary image signal. For example, a Bayer sensor lets the corresponding light signals strike the photosensitive surface through its color filter array; the photosensitive surface photoelectrically converts the received light signals into electrical signals, and the electrical signals of all pixels on the surface form the image signal. The image format output by a Bayer sensor is typically the raw internal picture of the imaging apparatus, with the file suffix .raw. The resolution of the first image signal is higher than that of the second image signal because the resolution of the image sensor in the first image acquisition unit 30 is higher than that of the image sensor in the second image acquisition unit 40; that is, the photosensitive surface of the image sensor in the first image acquisition unit 30 contains more pixels than that of the image sensor in the second image acquisition unit 40.
The image processing unit 50 is configured to obtain a target image from the first image signal and the second image signal. The image processing unit 50 may be any processor or processing chip with data processing and computing capability, or a software program running on such a processor or processing chip. For example, the image processing unit 50 may be a system on chip (SoC) integrated in an image signal processor (ISP). An SoC integrates a complete system on a single chip, packaging all or part of the necessary electronic circuits. A complete system generally includes a central processing unit, memory, and peripheral circuits.
Optionally, the imaging apparatus may further include a fill light unit 60 connected to the image processing unit 50. The fill light unit 60 may be any device that provides an infrared light signal, such as an infrared fill light lamp. In the present application, the image processing unit 50 can control the working state of the fill light unit 60. For example, the fill light unit 60 may work only in the first illumination scene, in order to supplement the light intensity; alternatively, it may remain working at all times, to cope with illumination that varies widely or frequently.
Optionally, the imaging apparatus may further include a filter unit 70 disposed between an output of the light splitting unit 20 and the input of the first image acquisition unit 30. The filter unit 70 may be any device that filters out light signals of a particular wavelength or wavelength range, in particular infrared light with wavelengths greater than 750 nm; an example is an infrared cut filter. In the present application, the filter unit 70 may work only in the second illumination scene, in order to filter out the infrared light signal when the illumination is high.
The image processing unit 50 is further configured to determine, from the second image signal, whether the current illumination scene is the first illumination scene or the second illumination scene. For example, the illumination intensity may be detected from the second image signal: when the illumination intensity is less than a first threshold, the current scene is determined to be the first illumination scene; when the illumination intensity is greater than or equal to the first threshold, the current scene is determined to be the second illumination scene. As another example, the signal gain may be detected from the second image signal: when the signal gain is greater than a second threshold, the current scene is determined to be the first illumination scene; when the signal gain is less than or equal to the second threshold, the current scene is determined to be the second illumination scene. The first threshold and the second threshold may be set in advance according to historical data or experience. The illumination intensity may be detected, for example, by a light sensor on the imaging apparatus; the signal gain may be detected, for example, by the image sensor in the second image acquisition unit 40.
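The two decision rules above can be sketched as follows. The threshold values and function names are hypothetical placeholders for illustration; the disclosure only states that the thresholds are set from historical data or experience:

```python
FIRST_SCENE = "first illumination scene"    # low light: fill light on, filter off
SECOND_SCENE = "second illumination scene"  # bright: fill light off, filter on

# Hypothetical thresholds, assumed set in advance from historical data.
INTENSITY_THRESHOLD = 50.0   # e.g. lux, derived from the second image signal
GAIN_THRESHOLD = 24.0        # e.g. dB of sensor gain on the second sensor

def scene_from_intensity(illumination_intensity):
    """Rule 1: intensity below the first threshold means the low-light scene."""
    if illumination_intensity < INTENSITY_THRESHOLD:
        return FIRST_SCENE
    return SECOND_SCENE

def scene_from_gain(signal_gain):
    """Rule 2: gain above the second threshold means the low-light scene."""
    if signal_gain > GAIN_THRESHOLD:
        return FIRST_SCENE
    return SECOND_SCENE

print(scene_from_intensity(10.0))   # first illumination scene
print(scene_from_gain(6.0))         # second illumination scene
```

The two rules are alternatives for the same decision: low measured intensity and high sensor gain both indicate that the scene is dark, so both map to the first illumination scene.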
When the imaging apparatus is in the first illumination scene, the fill light unit 60 works and the filter unit 70 stops working.
The reflected light signal acquired by the light capturing unit 10 may include a visible light signal and a first infrared light signal from an infrared fill light (for example, the fill light unit 60). The light splitting unit 20 separates the reflected light signal into a first light signal, which includes a first component of the visible light signal (for example, 10% of the visible light signal) and the first infrared light signal, and a second light signal, which includes a second component of the visible light signal (for example, 90% of the visible light signal).
The first image acquisition unit 30 obtains a first image signal from the first light signal; the first image signal is a photoelectrically converted electrical signal, forming, for example, a raw picture with the suffix .raw. Because the first light signal contains an infrared light signal, the first image signal is a black-and-white first grayscale image signal.
The second image acquisition unit 40 obtains a second image signal from the second light signal; the second image signal is likewise a photoelectrically converted electrical signal, forming, for example, a raw picture with the suffix .raw. Because the second light signal contains only the visible light signal, the second image signal is a color visible-light image signal.
The image processing unit 50 separates the second image signal into luminance and chrominance, obtaining a second grayscale (luma) map and a chroma map, and obtains a first grayscale map from the first image signal. Because the resolution of the image sensor in the first image acquisition unit 30 is higher than that of the image sensor in the second image acquisition unit 40, the resolution of the first grayscale map is higher than that of the second grayscale (luma) map and the chroma map. The image processing unit 50 performs grayscale fusion of the first grayscale map and the second grayscale map to obtain a third grayscale (luma) map, whose resolution is the same as the (higher) resolution of the first grayscale map. The image processing unit 50 then performs color fusion of the third grayscale (luma) map and the chroma map to obtain the target image, whose resolution is also the same as that of the first grayscale map.
In the first illumination scene, the first grayscale map also contains a visible-light component during grayscale fusion, which enables same-spectrum registration and improves the fusion efficiency. Both the grayscale fusion and the color fusion are fusions between a high-resolution image and a low-resolution image, so they retain the high signal-to-noise ratio of the low-resolution image while obtaining a high-resolution image, yielding an image with more vivid colors and higher definition.
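The first-scene pipeline above (luma/chroma separation, grayscale fusion, then color fusion) can be sketched with NumPy. The specific operators used here (BT.601-style luma weights, nearest-neighbor upsampling, a fixed blend weight) are illustrative assumptions, since the disclosure does not fix the fusion algorithm:

```python
import numpy as np

def split_luma_chroma(rgb):
    """Separate a color image into a luma map and two chroma maps (BT.601-style)."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    luma = 0.299 * r + 0.587 * g + 0.114 * b
    cb = b - luma   # blue-difference chroma
    cr = r - luma   # red-difference chroma
    return luma, cb, cr

def upsample(channel, factor):
    """Nearest-neighbor upsampling of one channel to the high-resolution grid."""
    return np.repeat(np.repeat(channel, factor, axis=0), factor, axis=1)

def fuse_first_scene(gray_hi, rgb_lo, factor=2, weight=0.5):
    """First-scene fusion: high-res grayscale + low-res color -> high-res color."""
    luma_lo, cb_lo, cr_lo = split_luma_chroma(rgb_lo)
    # Grayscale fusion: blend the high-res grayscale with the upsampled low-res luma
    # to form the third (luma) map at the first grayscale map's resolution.
    luma_fused = weight * gray_hi + (1 - weight) * upsample(luma_lo, factor)
    # Color fusion: reattach the upsampled chroma maps to the fused luma.
    cb_hi, cr_hi = upsample(cb_lo, factor), upsample(cr_lo, factor)
    r = luma_fused + cr_hi
    b = luma_fused + cb_hi
    g = (luma_fused - 0.299 * r - 0.114 * b) / 0.587  # invert the luma equation
    return np.clip(np.stack([r, g, b], axis=-1), 0.0, 1.0)

gray_hi = np.random.rand(8, 8)     # high-resolution first grayscale map
rgb_lo = np.random.rand(4, 4, 3)   # low-resolution color second image
target = fuse_first_scene(gray_hi, rgb_lo)
print(target.shape)  # (8, 8, 3)
```

The output has the resolution of the first grayscale map, as the text describes, while the color information comes entirely from the low-resolution, high-SNR second image.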
When the imaging apparatus is in the second illumination scene, the fill light unit 60 stops working and the filter unit 70 works.
The reflected light signal acquired by the light capturing unit 10 may include a visible light signal and an infrared light signal from the natural environment. The light splitting unit 20 separates the reflected light signal into a first light signal, which includes a first component of the visible light signal (for example, 10% of the visible light signal) and the natural infrared light signal, and a second light signal, which includes a second component of the visible light signal (for example, 90% of the visible light signal).
The filter unit 70 filters the infrared light signal out of the first light signal, so the first light signal that reaches the first image acquisition unit 30 includes only the first component of the visible light signal.
The first image acquisition unit 30 obtains a first image signal from the first light signal; the first image signal is a photoelectrically converted electrical signal, forming, for example, a raw picture with the suffix .raw. Because the first light signal now contains only the visible light signal, the first image signal is a color visible-light image signal.
The second image acquisition unit 40 obtains a second image signal from the second light signal; it is likewise a photoelectrically converted electrical signal, forming, for example, a raw picture with the suffix .raw. Because the second light signal contains only the visible light signal, the second image signal is also a color visible-light image signal. In the present application the second image signal has low resolution; to reduce the amount of computation, it can be used only to detect the illumination intensity, and the image processing unit 50 need not use it for image fusion.
The image processing unit 50 obtains the target image from the first image signal. The resolution of the target image is the same as that of the first image signal and higher than that of the second image signal.
In the second illumination scene, a high-resolution image can be obtained directly, improving the imaging effect.
The above imaging apparatus can be applied to devices with a shooting function, such as smartphones, tablet computers, video cameras, and surveillance cameras. FIG. 2 is an exemplary block diagram of a device embodiment of the present application, showing the schematic structure when the device is a mobile phone.
As shown in FIG. 2, the mobile phone 200 may include a processor 210, an external memory interface 220, an internal memory 221, a universal serial bus (USB) interface 230, a charging management module 240, a power management module 241, a battery 242, an antenna 1, an antenna 2, a mobile communication module 250, a wireless communication module 260, an audio module 270, a speaker 270A, a receiver 270B, a microphone 270C, a headset interface 270D, a sensor module 280, keys 290, a motor 291, an indicator 292, a camera 293, a display screen 294, a subscriber identification module (SIM) card interface 295, and the like. The sensor module 280 may include a pressure sensor 280A, a gyroscope sensor 280B, a barometric pressure sensor 280C, a magnetic sensor 280D, an acceleration sensor 280E, a distance sensor 280F, a proximity light sensor 280G, a fingerprint sensor 280H, a temperature sensor 280J, a touch sensor 280K, an ambient light sensor 280L, a bone conduction sensor 280M, an image sensor 280N, and the like.
It can be understood that the structure illustrated in this embodiment of the present application does not constitute a specific limitation on the mobile phone 200. In other embodiments of the present application, the mobile phone 200 may include more or fewer components than shown, combine some components, split some components, or arrange the components differently. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 210 may include one or more processing units. For example, the processor 210 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU). Different processing units may be independent devices or may be integrated in one or more processors.
In cooperation with the camera 293 and the image sensor 280N, the processor 210 in the present application can implement the functions of the image processing unit 50 in the imaging apparatus shown in FIG. 1.
The controller can generate operation control signals according to instruction operation codes and timing signals, completing the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 210 for storing instructions and data. In some embodiments, the memory in the processor 210 is a cache. This memory may hold instructions or data that the processor 210 has just used or uses repeatedly. If the processor 210 needs the instructions or data again, it can fetch them directly from this memory, avoiding repeated accesses and reducing the waiting time of the processor 210, thereby improving system efficiency.
In some embodiments, the processor 210 may include one or more interfaces. The interfaces may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, among others.
The I2C interface is a bidirectional synchronous serial bus that includes a serial data line (SDA) and a serial clock line (SCL). In some embodiments, the processor 210 may contain multiple sets of I2C buses. The processor 210 may be coupled to the touch sensor 280K, a charger, a flash, the camera 293, and the like through different I2C bus interfaces. For example, the processor 210 may be coupled to the touch sensor 280K through an I2C interface, so that the processor 210 and the touch sensor 280K communicate through the I2C bus interface to implement the touch function of the mobile phone 200.
The I2S interface can be used for audio communication. In some embodiments, the processor 210 may contain multiple sets of I2S buses. The processor 210 may be coupled to the audio module 270 through an I2S bus to implement communication between the processor 210 and the audio module 270. In some embodiments, the audio module 270 may transmit audio signals to the wireless communication module 260 through the I2S interface to implement the function of answering calls through a Bluetooth headset.
The PCM interface can also be used for audio communication, sampling, quantizing, and encoding analog signals. In some embodiments, the audio module 270 and the wireless communication module 260 may be coupled through a PCM bus interface. In some embodiments, the audio module 270 may also transmit audio signals to the wireless communication module 260 through the PCM interface to implement the function of answering calls through a Bluetooth headset. Both the I2S interface and the PCM interface can be used for audio communication.
The UART interface is a universal serial data bus used for asynchronous communication. The bus may be a bidirectional communication bus; it converts the data to be transmitted between serial and parallel forms. In some embodiments, a UART interface is typically used to connect the processor 210 and the wireless communication module 260. For example, the processor 210 communicates with the Bluetooth module in the wireless communication module 260 through the UART interface to implement the Bluetooth function. In some embodiments, the audio module 270 may transmit audio signals to the wireless communication module 260 through the UART interface to implement the function of playing music through a Bluetooth headset.
The MIPI interface can be used to connect the processor 210 to peripheral devices such as the display screen 294 and the camera 293. The MIPI interface includes a camera serial interface (CSI), a display serial interface (DSI), and the like. In some embodiments, the processor 210 communicates with the camera 293 through the CSI interface to implement the shooting function of the mobile phone 200, and communicates with the display screen 294 through the DSI interface to implement the display function of the mobile phone 200.
The GPIO interface can be configured by software. The GPIO interface may be configured as a control signal or as a data signal. In some embodiments, the GPIO interface may be used to connect the processor 210 to the camera 293, the display screen 294, the wireless communication module 260, the audio module 270, the sensor module 280, and the like. The GPIO interface may also be configured as an I2C interface, an I2S interface, a UART interface, a MIPI interface, or the like.
The USB interface 230 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type-C interface, or the like. The USB interface 230 can be used to connect a charger to charge the mobile phone 200, to transmit data between the mobile phone 200 and peripheral devices, or to connect headphones to play audio through them. The interface may also be used to connect other devices, such as AR devices.
It can be understood that the interface connection relationships between the modules illustrated in this embodiment of the present application are only schematic and do not constitute a structural limitation on the mobile phone 200. In other embodiments of the present application, the mobile phone 200 may also adopt interface connection manners different from those in the above embodiment, or a combination of multiple interface connection manners.
The charging management module 240 is configured to receive charging input from a charger. The charger may be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module 240 may receive the charging input of a wired charger through the USB interface 230. In some wireless charging embodiments, the charging management module 240 may receive wireless charging input through the wireless charging coil of the mobile phone 200. While charging the battery 242, the charging management module 240 may also supply power to the mobile phone through the power management module 241.
The power management module 241 is configured to connect the battery 242 and the charging management module 240 to the processor 210. The power management module 241 receives input from the battery 242 and/or the charging management module 240 and supplies power to the processor 210, the internal memory 221, the display screen 294, the camera 293, the wireless communication module 260, and the like. The power management module 241 may also be used to monitor parameters such as battery capacity, battery cycle count, and battery health (leakage, impedance). In some other embodiments, the power management module 241 may also be provided in the processor 210. In still other embodiments, the power management module 241 and the charging management module 240 may be provided in the same device.
The wireless communication function of the mobile phone 200 can be implemented by the antenna 1, the antenna 2, the mobile communication module 250, the wireless communication module 260, the modem processor, the baseband processor, and the like.
The antenna 1 and the antenna 2 are used to transmit and receive electromagnetic wave signals. Each antenna in the mobile phone 200 may be used to cover a single communication frequency band or multiple bands. Different antennas may also be multiplexed to improve antenna utilization. For example, the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, an antenna may be used in combination with a tuning switch.
The mobile communication module 250 can provide wireless communication solutions applied on the mobile phone 200, including 2G/3G/4G/5G. The mobile communication module 250 may include at least one filter, a switch, a power amplifier, a low noise amplifier (LNA), and the like. The mobile communication module 250 may receive electromagnetic waves through the antenna 1, filter and amplify the received electromagnetic waves, and transmit them to the modem processor for demodulation. The mobile communication module 250 may also amplify a signal modulated by the modem processor and convert it into an electromagnetic wave radiated through the antenna 1. In some embodiments, at least some of the functional modules of the mobile communication module 250 may be provided in the processor 210. In some embodiments, at least some of the functional modules of the mobile communication module 250 may be provided in the same device as at least some of the modules of the processor 210.
The modem processor may include a modulator and a demodulator. The modulator modulates a low-frequency baseband signal to be sent into a medium- or high-frequency signal. The demodulator demodulates a received electromagnetic wave signal into a low-frequency baseband signal and then transmits the demodulated low-frequency baseband signal to the baseband processor for processing. After being processed by the baseband processor, the low-frequency baseband signal is passed to the application processor. The application processor outputs a sound signal through an audio device (not limited to the speaker 270A and the receiver 270B) or displays an image or video through the display screen 294. In some embodiments, the modem processor may be an independent device. In other embodiments, the modem processor may be independent of the processor 210 and provided in the same device as the mobile communication module 250 or another functional module.
无线通信模块260可以提供应用在手机200上的包括无线局域网(wireless local area networks,WLAN)(如无线保真(wireless fidelity,Wi-Fi)网络),蓝牙(bluetooth,BT),全球导航卫星系统(global navigation satellite system,GNSS),调频(frequency modulation,FM),近距离无线通信技术(near field communication,NFC),红外技术(infrared,IR)等无线通信的解决方案。无线通信模块260可以是集成至少一个通信处理模块的一个或多个器件。无线通信模块260经由天线2接收电磁波,将电磁波信号调频以及滤波处理,将处理后的信号发送到处理器210。无线通信模块260还可以从处理器210接收待发送的信号,对其进行调频,放大,经天线2转为电磁波辐射出去。The wireless communication module 260 can provide applications on the mobile phone 200 including wireless local area networks (WLAN) (such as wireless fidelity (Wi-Fi) network), bluetooth (BT), global navigation satellite system (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), near field communication technology (near field communication, NFC), infrared technology (infrared, IR) and other wireless communication solutions. The wireless communication module 260 may be one or more devices integrating at least one communication processing module. The wireless communication module 260 receives electromagnetic waves via the antenna 2 , modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 210 . The wireless communication module 260 can also receive the signal to be sent from the processor 210 , perform frequency modulation on the signal, amplify the signal, and then convert it into an electromagnetic wave for radiation through the antenna 2 .
在一些实施例中,手机200的天线1和移动通信模块250耦合,天线2和无线通信模块260耦合,使得手机200可以通过无线通信技术与网络以及其他设备通信。所述无线通信技术可以包括全球移动通讯系统(global system for mobile communications,GSM),通用分组无线服务(general packet radio service,GPRS),码分多址接入(code division multiple access,CDMA),宽带码分多址(wideband code division multiple access,WCDMA),时分码分多址(time-division code division multiple access,TD-SCDMA),长期演进(long term evolution,LTE),BT,GNSS,WLAN,NFC,FM,和/或IR技术等。所述GNSS可以包括全球卫星定位系统(global positioning system,GPS),全球导航卫星系统(global navigation satellite system,GLONASS),北斗卫星导航系统(beidou navigation satellite  system,BDS),准天顶卫星系统(quasi-zenith satellite system,QZSS)和/或星基增强系统(satellite based augmentation systems,SBAS)。In some embodiments, the antenna 1 of the mobile phone 200 is coupled with the mobile communication module 250, and the antenna 2 is coupled with the wireless communication module 260, so that the mobile phone 200 can communicate with the network and other devices through wireless communication technology. The wireless communication technologies may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), broadband Code Division Multiple Access (WCDMA), Time Division Code Division Multiple Access (TD-SCDMA), Long Term Evolution (LTE), BT, GNSS, WLAN, NFC , FM, and/or IR technology, etc. The GNSS may include global positioning system (global positioning system, GPS), global navigation satellite system (global navigation satellite system, GLONASS), Beidou navigation satellite system (beidou navigation satellite system, BDS), quasi-zenith satellite system (quasi satellite system) -zenith satellite system, QZSS) and/or satellite based augmentation systems (SBAS).
The mobile phone 200 implements a display function through the GPU, the display screen 294, the application processor, and the like. The GPU is a microprocessor for image processing, and connects the display screen 294 and the application processor. The GPU is configured to perform mathematical and geometric calculations for graphics rendering. The processor 210 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 294 is configured to display images, videos, and the like. The display screen 294 includes a display panel. The display panel may be a liquid crystal display (liquid crystal display, LCD), an organic light-emitting diode (organic light-emitting diode, OLED), an active-matrix organic light-emitting diode (active-matrix organic light emitting diode, AMOLED), a flexible light-emitting diode (flex light-emitting diode, FLED), a MiniLED, a MicroLED, a Micro-OLED, quantum dot light emitting diodes (quantum dot light emitting diodes, QLED), or the like. In some embodiments, the mobile phone 200 may include one or N display screens 294, where N is a positive integer greater than 1.
The mobile phone 200 can implement a shooting function through the ISP, the camera 293, the video codec, the GPU, the display screen 294, the application processor, and the like.
The ISP is configured to process data fed back by the camera 293. For example, during photographing, the shutter is opened, light is transmitted to the photosensitive element of the camera through the lens, the optical signal is converted into an electrical signal, and the photosensitive element of the camera transmits the electrical signal to the ISP for processing, to convert it into an image visible to the naked eye. The ISP may also perform algorithm optimization on the noise, brightness, and skin tone of the image. The ISP may also optimize parameters such as the exposure and color temperature of the shooting scene. In some embodiments, the ISP may be disposed in the camera 293. In this application, the ISP may acquire a target image according to the first image signal and the second image signal. The ISP may also determine, according to the second image signal, whether the current illumination scene is the first illumination scene or the second illumination scene. Optionally, the ISP performs luminance-chrominance separation on the second image signal to obtain a second grayscale (luma) map and a chrominance (chroma) map, and obtains a first grayscale map based on the first image signal; the resolution of the first grayscale map is higher than that of the second grayscale map and the chroma map. The first grayscale map and the second grayscale map are grayscale-fused to obtain a third grayscale map, whose resolution is the same as that of the first grayscale map. The third grayscale map and the chroma map are then color-fused to obtain the target image, whose resolution is also the same as that of the first grayscale map.
The camera 293 is configured to capture a static image or a video. An optical image of an object is generated through the lens and projected onto the photosensitive element. The photosensitive element may be a charge coupled device (charge coupled device, CCD) or a complementary metal-oxide-semiconductor (complementary metal-oxide-semiconductor, CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, and then transmits the electrical signal to the ISP, which converts it into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard format such as RGB or YUV. In some embodiments, the mobile phone 200 may include one or N cameras 293, where N is a positive integer greater than 1.
The digital signal processor is configured to process digital signals; in addition to digital image signals, it can process other digital signals. For example, when the mobile phone 200 performs frequency selection, the digital signal processor is configured to perform a Fourier transform or the like on the frequency energy.
The video codec is configured to compress or decompress digital video. The mobile phone 200 may support one or more video codecs. In this way, the mobile phone 200 can play or record videos in a plurality of encoding formats, for example, moving picture experts group (moving picture experts group, MPEG)-1, MPEG-2, MPEG-3, and MPEG-4.
The NPU is a neural-network (neural-network, NN) computing processor. By drawing on the structure of biological neural networks, for example, the transfer mode between neurons in the human brain, it quickly processes input information and can also continuously learn by itself. Applications such as intelligent cognition of the mobile phone 200, for example, image recognition, face recognition, speech recognition, and text understanding, can be implemented through the NPU.
The external memory interface 220 may be configured to connect an external memory card, for example, a Micro SD card, to expand the storage capacity of the mobile phone 200. The external memory card communicates with the processor 210 through the external memory interface 220 to implement a data storage function, for example, to save files such as music and videos in the external memory card.
The internal memory 221 may be configured to store computer-executable program code, where the executable program code includes instructions. The internal memory 221 may include a program storage area and a data storage area. The program storage area may store an operating system, an application required by at least one function (for example, a sound playback function or an image playback function), and the like. The data storage area may store data created during use of the mobile phone 200 (for example, audio data and a phone book), and the like. In addition, the internal memory 221 may include a high-speed random access memory, and may further include a non-volatile memory, for example, at least one magnetic disk storage device, a flash memory device, or a universal flash storage (universal flash storage, UFS). The processor 210 performs various functional applications and data processing of the mobile phone 200 by running the instructions stored in the internal memory 221 and/or instructions stored in a memory disposed in the processor.
The mobile phone 200 can implement audio functions, for example, music playback and recording, through the audio module 270, the speaker 270A, the receiver 270B, the microphone 270C, the earphone interface 270D, the application processor, and the like.
The audio module 270 is configured to convert digital audio information into an analog audio signal for output, and is also configured to convert an analog audio input into a digital audio signal. The audio module 270 may also be configured to encode and decode audio signals. In some embodiments, the audio module 270 may be disposed in the processor 210, or some functional modules of the audio module 270 may be disposed in the processor 210.
The speaker 270A, also referred to as a "loudspeaker", is configured to convert an audio electrical signal into a sound signal. The mobile phone 200 can play music or conduct a hands-free call through the speaker 270A.
The receiver 270B, also referred to as an "earpiece", is configured to convert an audio electrical signal into a sound signal. When the mobile phone 200 answers a call or receives a voice message, the voice can be heard by placing the receiver 270B close to the ear.
The microphone 270C, also referred to as a "mic" or "mike", is configured to convert a sound signal into an electrical signal. When making a call or sending a voice message, the user can speak with the mouth close to the microphone 270C to input the sound signal into the microphone 270C. The mobile phone 200 may be provided with at least one microphone 270C. In other embodiments, the mobile phone 200 may be provided with two microphones 270C, which can implement a noise reduction function in addition to collecting sound signals. In still other embodiments, the mobile phone 200 may be provided with three, four, or more microphones 270C to collect sound signals, reduce noise, identify sound sources, implement a directional recording function, and the like.
The earphone interface 270D is configured to connect a wired earphone. The earphone interface 270D may be the USB interface 230, or may be a 3.5 mm open mobile terminal platform (open mobile terminal platform, OMTP) standard interface or a cellular telecommunications industry association of the USA (cellular telecommunications industry association of the USA, CTIA) standard interface.
The pressure sensor 280A is configured to sense a pressure signal and convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 280A may be disposed on the display screen 294. There are many types of pressure sensors 280A, for example, resistive pressure sensors, inductive pressure sensors, and capacitive pressure sensors. A capacitive pressure sensor may include at least two parallel plates of conductive material.
The gyroscope sensor 280B may be configured to determine the motion posture of the mobile phone 200.
The barometric pressure sensor 280C is configured to measure barometric pressure.
The magnetic sensor 280D includes a Hall sensor.
The acceleration sensor 280E can detect the magnitude of acceleration of the mobile phone 200 in various directions (generally along three axes).
The distance sensor 280F is configured to measure a distance.
The proximity light sensor 280G may include, for example, a light-emitting diode (LED) and a light detector such as a photodiode. The light-emitting diode may be an infrared light-emitting diode.
The ambient light sensor 280L is configured to sense ambient light brightness.
The fingerprint sensor 280H is configured to collect a fingerprint.
The temperature sensor 280J is configured to detect a temperature.
The touch sensor 280K is also referred to as a "touch device". The touch sensor 280K may be disposed on the display screen 294, and the touch sensor 280K and the display screen 294 form a touchscreen, also referred to as a "touch screen". The touch sensor 280K is configured to detect a touch operation performed on or near it. The touch sensor may pass the detected touch operation to the application processor to determine the touch event type. Visual output related to the touch operation may be provided through the display screen 294. In other embodiments, the touch sensor 280K may alternatively be disposed on a surface of the mobile phone 200 at a location different from that of the display screen 294.
The bone conduction sensor 280M can acquire a vibration signal.
The image sensor 280N uses the photoelectric conversion function of a photoelectric device to convert the light image on its photosensitive surface into an electrical signal proportional to the light image, where the photosensitive surface is divided into many small units and each small unit corresponds to one pixel. For example, in a Bayer sensor, RGB color filters are arranged on the photosensitive surface to form a mosaic color filter array (color filter array); 50% of this color filter array is green for sensing green light, 25% is red for sensing red light, and 25% is blue for sensing blue light.
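As an illustration of the proportions above, a minimal sketch that builds a Bayer color filter array and checks the 50/25/25 split (it assumes the common RGGB tile arrangement; the text above does not specify which Bayer variant is used):

```python
import numpy as np

def bayer_mask(h, w):
    """Build an RGGB Bayer color filter array of shape (h, w).

    Each cell names the color ('R', 'G', or 'B') that the pixel at that
    position senses; h and w are assumed even so whole 2x2 tiles fit.
    """
    tile = np.array([["R", "G"],
                     ["G", "B"]])  # one 2x2 RGGB tile
    return np.tile(tile, (h // 2, w // 2))

mask = bayer_mask(4, 4)
# Green occupies half of the array; red and blue a quarter each.
print((mask == "G").sum() / mask.size)  # 0.5
print((mask == "R").sum() / mask.size)  # 0.25
print((mask == "B").sum() / mask.size)  # 0.25
```

The doubled green fraction mirrors the human eye's higher sensitivity to green, which is why Bayer arrays devote half their sites to it.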
The keys 290 include a power key, volume keys, and the like. The keys 290 may be mechanical keys or touch keys. The mobile phone 200 can receive key input and generate key signal input related to user settings and function control of the mobile phone 200.
The motor 291 can generate a vibration alert. The motor 291 can be used for an incoming-call vibration alert, and can also be used for touch vibration feedback.
The indicator 292 may be an indicator light, which may be used to indicate a charging status or a change in battery level, and may also be used to indicate a message, a missed call, a notification, and the like.
The SIM card interface 295 is configured to connect a SIM card. The SIM card can be brought into contact with and separated from the mobile phone 200 by inserting it into the SIM card interface 295 or pulling it out of the SIM card interface 295. The mobile phone 200 may support one or N SIM card interfaces, where N is a positive integer greater than 1. The SIM card interface 295 can support a Nano SIM card, a Micro SIM card, a SIM card, and the like. The mobile phone 200 interacts with the network through the SIM card to implement functions such as calls and data communication. In some embodiments, the mobile phone 200 uses an eSIM, that is, an embedded SIM card. The eSIM card may be embedded in the mobile phone 200 and cannot be separated from the mobile phone 200.
It can be understood that the structures illustrated in the embodiments of this application do not constitute a specific limitation on the device. In other embodiments of this application, the device may include more or fewer components than shown, or some components may be combined, or some components may be split, or the components may be arranged differently. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
FIG. 3 is a flowchart of a process 300 of an embodiment of an imaging method of this application. The process 300 may be performed by the imaging device shown in FIG. 1, and specifically may be performed by a mobile phone, a tablet computer, a video camera, a camera, or the like that includes the imaging device. The process 300 is described as a series of steps or operations; it should be understood that the process 300 may be performed in various orders and/or concurrently, and is not limited to the execution order shown in FIG. 3. The process 300 may include the following steps.
Step 301: Acquire a reflected light signal of a photographed object.
The imaging device can capture the reflected light signal of the photographed object through a light capture unit (for example, a lens) therein. The reflected light signal is consistent with the incident light: natural light remains natural light after reflection, and infrared light remains infrared light after reflection; reflection does not change the wavelength of the light signal. Therefore, the reflected light signal in this application may include natural light (also referred to as visible light) emitted by the sun, visible light emitted by a fluorescent lamp, infrared light emitted by an infrared fill light, and the like.
Step 302: Separate the reflected light signal into a first light signal and a second light signal through a light splitting unit.
The light splitting unit may be a light splitting prism. In this application, the light splitting prism is configured to separate the reflected light signal into a first optical path and a second optical path, where the first optical path may be allocated 10% to 40% of the visible spectral energy and more than 80% of the infrared spectral energy, and the second optical path may be allocated 60% to 90% of the visible spectral energy and less than 20% of the infrared spectral energy.
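A minimal numeric sketch of this energy allocation (the 10%/90% visible and 90%/10% infrared fractions below are example values chosen from within the ranges stated above, not values fixed by this application; an ideal lossless prism is assumed):

```python
def split_light(visible, infrared, vis_to_path1=0.10, ir_to_path1=0.90):
    """Split incoming visible and infrared energy between the two paths.

    vis_to_path1 is assumed to lie in [0.10, 0.40] and ir_to_path1 above
    0.80, matching the allocation ranges described for the prism; the
    remainder of each band goes to the second path.
    """
    path1 = (visible * vis_to_path1, infrared * ir_to_path1)
    path2 = (visible * (1.0 - vis_to_path1), infrared * (1.0 - ir_to_path1))
    return path1, path2

p1, p2 = split_light(visible=100.0, infrared=100.0)
# Path 1 (mostly infrared) feeds the high-resolution sensor;
# path 2 (mostly visible) feeds the low-resolution color sensor.
print(tuple(round(x, 6) for x in p1))  # (10.0, 90.0)
print(tuple(round(x, 6) for x in p2))  # (90.0, 10.0)
```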
The optical signal on the first optical path (the first light signal) is input to a first image sensor, and the optical signal on the second optical path (the second light signal) is input to a second image sensor, where the resolution of the first image sensor is higher than that of the second image sensor. The first image sensor converts the first light signal into a first image signal (for example, an image with the .raw suffix) through photoelectric conversion, and the second image sensor converts the second light signal into a second image signal (for example, an image with the .raw suffix) through photoelectric conversion. As the resolution of an image sensor increases, the unit pixel area on the photosensitive surface of the image sensor decreases, which reduces the number of photons sensed and causes the signal-to-noise ratio to drop. Especially in a low-illumination scene (the foregoing first illumination scene), because the amount of light entering the color light path is insufficient, an increase in resolution severely affects the color effect. In the foregoing method, two image sensors of high and low resolution form a high/low-resolution sensor combination: the low-resolution image sensor is used for color-path imaging and receives only the visible light signal, while the high-resolution image sensor receives both the visible light signal and the infrared light signal. In the daytime, high-resolution color imaging can be achieved directly; at night, the high-resolution infrared image (grayscale map) is fused with the low-resolution color image (chroma map). High-resolution imaging is thus supported both day and night.
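The photon-count argument above can be made concrete with a small calculation (a hedged illustration assuming a fixed sensor die size, uniform illumination, and a shot-noise-limited SNR proportional to the square root of the photon count; all numbers are illustrative, not taken from this application):

```python
import math

def per_pixel_snr(die_area_mm2, megapixels, photons_per_mm2):
    """Shot-noise-limited per-pixel SNR: SNR = sqrt(photon count).

    With the die area fixed, more megapixels means a smaller pixel,
    fewer photons collected per pixel, and therefore a lower SNR.
    """
    pixel_area = die_area_mm2 / (megapixels * 1e6)
    photons = photons_per_mm2 * pixel_area
    return math.sqrt(photons)

low_res = per_pixel_snr(die_area_mm2=30.0, megapixels=2, photons_per_mm2=1e9)
high_res = per_pixel_snr(die_area_mm2=30.0, megapixels=8, photons_per_mm2=1e9)
print(low_res > high_res)            # True: fewer, larger pixels keep a higher SNR
print(round(low_res / high_res, 1))  # 2.0: 4x fewer pixels -> 2x the per-pixel SNR
```

This is why the method assigns the color path, which is starved of light at night, to the lower-resolution sensor.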
Optionally, in the first illumination scene, the infrared fill light is turned on to illuminate the photographed object, and the infrared filter is turned off, so that the first light signal includes a first component signal of the visible light signal and an infrared light signal reflected from the infrared fill light, and the second light signal includes a second component signal of the visible light signal.
In the first illumination scene, the fill light unit in the imaging device works and emits an infrared light signal, so the reflected light signal includes the visible light signal and the first infrared light signal from the infrared fill light. The light splitting unit in the imaging device uses optical principles to perform spectral and energy separation on the reflected light signal to obtain the first light signal and the second light signal, where the first light signal includes the first component signal of the visible light signal (for example, 10% of the visible light signal) and the infrared light signal reflected from the infrared fill light, and the second light signal includes the second component signal of the visible light signal (for example, 90% of the visible light signal).
Step 303: Acquire a first image signal according to the first light signal.
The first image acquisition unit in the imaging device uses the working principle of an image sensor to convert the first light signal into an electrical signal, forming a first image signal that is a rudimentary image; the first image signal is a photoelectrically converted electrical signal that constitutes, for example, an original picture with the .raw suffix. Because the first light signal contains an infrared light signal, the first image signal is a black-and-white first grayscale map signal.
The image processing unit in the imaging device obtains a first grayscale map according to the first image signal. This involves the image processing algorithm of the ISP, that is, converting the original picture in the .raw format into an image visible to the human eye. The first grayscale map is black and white.
Step 304: Acquire a second image signal according to the second light signal.
The second image acquisition unit in the imaging device uses the working principle of an image sensor to convert the second light signal into an electrical signal, forming a second image signal that is a rudimentary image; the second image signal is also a photoelectrically converted electrical signal that constitutes, for example, an original picture with the .raw suffix. Because the second light signal contains only the visible light signal, the second image signal is a color visible-light image signal.
The image processing unit in the imaging device performs luminance-chrominance separation on the second image signal to obtain a second grayscale (luma) map and a chrominance (chroma) map. This also involves the image processing algorithm of the ISP, that is, converting the original picture in the .raw format into an image visible to the human eye, converting the RGB format into the YUV format, and separating the YUV-format image. The second grayscale (luma) map is black and white, and the chroma map is in color.
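The luminance-chrominance separation above can be sketched with the standard BT.601 RGB-to-YUV conversion (a hedged example; this application does not specify which YUV conversion matrix the ISP uses):

```python
import numpy as np

def rgb_to_luma_chroma(rgb):
    """Split an RGB image (shape (H, W, 3), floats in [0, 1]) into a
    luma map Y and a chroma map (U, V), using BT.601 coefficients."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y = 0.299 * r + 0.587 * g + 0.114 * b  # grayscale (luma) map
    u = 0.492 * (b - y)                    # blue-difference chroma
    v = 0.877 * (r - y)                    # red-difference chroma
    return y, np.stack([u, v], axis=-1)

img = np.ones((2, 2, 3))           # pure white test image
y, uv = rgb_to_luma_chroma(img)
print(np.allclose(y, 1.0))         # True: white has full luma
print(np.allclose(uv, 0.0))        # True: and no chroma
```

The luma map Y carries the black-and-white detail, while the (U, V) chroma map carries only the color information, matching the two outputs described above.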
Because the resolution of the image sensor in the first image acquisition unit of the imaging device is higher than that of the image sensor in the second image acquisition unit, that is, the photosensitive surface of the image sensor in the first image acquisition unit contains more pixels than that of the image sensor in the second image acquisition unit, the resolution of the first image signal obtained by the first image acquisition unit is higher than that of the second image signal obtained by the second image acquisition unit. Correspondingly, the resolution of the first grayscale map obtained in step 303 is higher than the resolution of the second grayscale (luma) map and the chroma map obtained in step 304.
Optionally, in the first illumination scene, a first target image is acquired according to the first image signal and the second image signal.
Optionally, in the second illumination scene, a second target image is acquired according to the first image signal.
The resolution of the first image signal is higher than that of the second image signal, and the illumination intensity of the second illumination scene is greater than that of the first illumination scene.
The image processing unit in the imaging device performs grayscale fusion on the first grayscale (luma) map and the second grayscale map to obtain a third grayscale map, whose resolution is the same as that of the first grayscale map. The third grayscale map and the chroma map are then color-fused to obtain the target image, whose resolution is also the same as that of the first grayscale map.
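A minimal sketch of this two-step fusion (heavily hedged: this application does not specify the fusion operators, so grayscale fusion is shown as a simple weighted blend after nearest-neighbor upsampling, and color fusion as reattaching the upsampled chroma map to the fused luma; a real ISP would use more sophisticated registration and multi-scale fusion):

```python
import numpy as np

def upsample(img, factor):
    """Nearest-neighbor upsampling by an integer factor (illustrative only)."""
    return np.repeat(np.repeat(img, factor, axis=0), factor, axis=1)

def fuse(gray1, gray2_low, chroma_low, factor, w=0.5):
    """Two-step fusion:
    1) grayscale fusion: blend the high-resolution gray1 with the
       upsampled low-resolution gray2 -> third grayscale map;
    2) color fusion: attach the upsampled chroma map to the third
       grayscale map -> target (Y, U, V) image at the high resolution."""
    gray3 = w * gray1 + (1 - w) * upsample(gray2_low, factor)  # step 1
    chroma = upsample(chroma_low, factor)                      # step 2
    return np.concatenate([gray3[..., None], chroma], axis=-1)

gray1 = np.ones((4, 4))        # high-resolution first grayscale map
gray2 = np.zeros((2, 2))       # low-resolution second grayscale (luma) map
chroma = np.zeros((2, 2, 2))   # low-resolution chroma (U, V) map
target = fuse(gray1, gray2, chroma, factor=2)
print(target.shape)            # (4, 4, 3): target matches the high resolution
```

The key property shown here is that both fusion steps output at the resolution of the first grayscale map, so the target image inherits the high resolution while the color comes from the low-resolution path.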
由于随着图像传感器的分辨率的提升,图像传感器中的感光面上单位像素面积会减小,进而减少了感光的光子数量,导致信噪比下降。尤其在低照度场景(上述第一照度场景)下,由于彩色光路上的进光量不足,分辨率的提升会严重影响色彩效果。上述方法中通过大小分辨率两个图像传感器,形成大小分辨率的图像传感器组合,小分辨率的图像传感器用于彩色路成像,只接收可见光信号;大分辨率的图像传感器同时接收可见光信号和红外光信号,白天可以直接实现大分辨率的彩色成像,晚上用大分辨率的红外图像(灰度图)和小分辨率彩色图像(色度图)融合。从而实现白天晚上都支持大分辨率成像。As the resolution of the image sensor increases, the unit pixel area on the photosensitive surface in the image sensor decreases, which in turn reduces the number of photons that are photosensitive, resulting in a decrease in the signal-to-noise ratio. Especially in a low illuminance scene (the above-mentioned first illuminance scene), due to insufficient light input on the color light path, an increase in resolution will seriously affect the color effect. In the above method, two image sensors with large and small resolutions are used to form a combination of image sensors with large and small resolutions. The image sensor with small resolution is used for color path imaging and only receives visible light signals; the image sensor with large resolution receives both visible light signals and infrared signals. For light signals, large-resolution color imaging can be directly achieved during the day, and large-resolution infrared images (grayscale images) and small-resolution color images (chromaticity images) can be fused at night. Thus, large-resolution imaging can be supported day and night.
本申请通过分光单元对反射光信号进行分离,然后通过高分辨率和低分辨率两个图像传感器分别对分离得到的光信号进行光电转换得到对应的图像信号,进而将其转换成对应的图像,借助两步融合处理(灰度融合和色彩融合)得到最终的目标图像,其中灰度融合时,第一灰度图中也包含可见光信号,可以实现同谱配准,提高图像的融合效率,而灰度融合和色彩融合均是高分辨率和低分辨率的两个图像之间的融合,既可以保留低分辨率图像的高信噪比,又可以获取高分辨率图像,从而可以得到色彩更鲜艳、更高清的图像。In the present application, the reflected light signal is separated by a spectroscopic unit, and then the separated light signal is photoelectrically converted by two image sensors of high resolution and low resolution to obtain a corresponding image signal, and then converted into a corresponding image, With the help of two-step fusion processing (grayscale fusion and color fusion), the final target image is obtained. During the grayscale fusion, the first grayscale image also contains visible light signals, which can achieve the same spectrum registration and improve the fusion efficiency of the image. Both grayscale fusion and color fusion are fusions between two high-resolution and low-resolution images, which can not only retain the high signal-to-noise ratio of low-resolution images, but also acquire high-resolution images, so that more color can be obtained. Vivid, higher-definition images.
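The two-step fusion described above can be sketched as follows. This is a minimal numpy illustration only: the patent does not specify the fusion algorithm, so the nearest-neighbour upsampling and the simple weighted luma blend below are assumptions, not the disclosed implementation.

```python
import numpy as np

def upsample(img, shape):
    """Nearest-neighbour upsample to a target (H, W); a stand-in for a real interpolator."""
    h, w = img.shape[:2]
    rows = np.arange(shape[0]) * h // shape[0]
    cols = np.arange(shape[1]) * w // shape[1]
    return img[rows][:, cols]

def two_step_fusion(gray_hi, luma_lo, chroma_lo, w=0.5):
    """Step 1: grayscale fusion of the high-res grayscale image with the upsampled
    low-res luma. Step 2: color fusion by attaching the upsampled chroma planes."""
    H, W = gray_hi.shape
    luma = w * gray_hi + (1.0 - w) * upsample(luma_lo, (H, W))   # grayscale fusion
    u = upsample(chroma_lo[..., 0], (H, W))
    v = upsample(chroma_lo[..., 1], (H, W))
    return np.stack([luma, u, v], axis=-1)                       # color fusion (YUV output)

gray_hi = np.full((4, 4), 200.0)     # high-resolution grayscale image (IR + visible path)
luma_lo = np.full((2, 2), 100.0)     # low-resolution luma from the color path
chroma  = np.full((2, 2, 2), 128.0)  # low-resolution chroma (U and V planes)
target = two_step_fusion(gray_hi, luma_lo, chroma)
assert target.shape == (4, 4, 3)     # target resolution equals the first grayscale image's
```

Both fusion steps take one high-resolution and one low-resolution input, so the low-resolution image's signal-to-noise ratio contributes to the result while the output keeps the high resolution.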
Optionally, under the second illuminance, the infrared fill light is turned off and the infrared filter is enabled, so that the first light signal includes the first component signal of the visible light signal but no infrared light signal, and the second light signal includes the second component signal of the visible light signal.
In the second illuminance scene, the fill-light unit in the imaging apparatus is inactive and the filter unit is active, so the reflected light signal consists of the visible light signal and a second infrared light signal from the environment. Using optical principles, the light-splitting unit in the imaging apparatus separates the reflected light signal into a first light signal, comprising the first component signal of the visible light signal (for example, 10% of the visible light) together with the ambient infrared light, and a second light signal, comprising the second component signal of the visible light signal (for example, 90% of the visible light).
The filter unit in the imaging apparatus removes the second infrared light signal from the first light signal, so the first light signal reaching the first image acquisition unit contains only the first component signal of the visible light signal.
The first image acquisition unit in the imaging apparatus uses an image sensor to convert the first light signal into an electrical signal, forming a first image signal, i.e. a preliminary, photoelectrically converted image such as a raw picture (e.g. with a .raw suffix). Because the first light signal contains visible light, the first image signal is a color visible-light image signal.
The second image acquisition unit likewise converts the second light signal into an electrical signal, forming a second image signal, also a photoelectrically converted raw picture. Because the second light signal contains only visible light, the second image signal is a color visible-light image signal.
As before, the resolution of the first image signal is higher than that of the second image signal. The image processing unit in the imaging apparatus obtains the target image from the first image signal; the target image has the same resolution as the first image signal, which is higher than that of the second image signal.
To reduce computation, the image processing unit may use the second image signal only for detecting the illumination intensity, without using it for image fusion. In the second illuminance scene, a high-resolution image is therefore obtained directly, improving the imaging result.
In a possible implementation, the method further includes determining the current lighting scene from the second light signal: when the illumination intensity corresponding to the second light signal is less than a first threshold, the current lighting scene is determined to be the first illuminance scene, and when the illumination intensity is greater than or equal to the first threshold, the second illuminance scene; or, when the signal gain corresponding to the second light signal is greater than a second threshold, the current lighting scene is determined to be the first illuminance scene, and when the signal gain is less than or equal to the second threshold, the second illuminance scene.
The first and second thresholds may be set in advance based on historical data or experience. The illumination intensity may be measured, for example, by a light sensor on the imaging apparatus; the signal gain may be read, for example, from the image sensor in the second image acquisition unit 40.
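The two alternative scene tests can be sketched as follows. The threshold values here are illustrative placeholders, not values disclosed by the patent (which only says they are set from historical data or experience).

```python
FIRST_SCENE = "first illuminance scene"    # low light, e.g. night
SECOND_SCENE = "second illuminance scene"  # bright light, e.g. day

def scene_from_intensity(lux, first_threshold=50.0):
    # intensity below the first threshold -> first (low) illuminance scene
    return FIRST_SCENE if lux < first_threshold else SECOND_SCENE

def scene_from_gain(gain_db, second_threshold=30.0):
    # gain above the second threshold -> first (low) illuminance scene
    return FIRST_SCENE if gain_db > second_threshold else SECOND_SCENE

assert scene_from_intensity(5.0) == FIRST_SCENE
assert scene_from_intensity(50.0) == SECOND_SCENE   # equal to threshold -> second scene
assert scene_from_gain(36.0) == FIRST_SCENE
assert scene_from_gain(24.0) == SECOND_SCENE
```

Either test alone suffices; the gain-based test is convenient because the gain is already reported by the second image sensor, with no separate light sensor needed.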
A specific embodiment is used below to describe the technical solution of the method embodiment shown in FIG. 3 in detail.
The imaging apparatus in this embodiment is configured as follows:
1. The light capture unit uses an F1.4 constant-aperture lens that is confocal over the 400 nm to 940 nm wavelength range.
2. The fill-light unit uses a group of six 850 nm light-emitting diode (LED) infrared fill lights. The fill-light unit is built into the imaging apparatus, which controls the on/off state and brightness of the infrared fill lights over an I2C bus.
3. The light-splitting unit uses an optical prism composed of two isosceles right-triangle glass elements. Through its coating and prism-parameter design, the prism transmits all of the infrared light and 10% of the visible light in the received reflected light signal in direction A, while refracting 90% of the visible light in direction B. Directions A and B are at a 90-degree angle to each other.
4. The filter unit uses an IR-CUT dual filter, mounted in direction A behind the optical prism. When active, the IR-CUT dual filter passes only light with wavelengths from 400 nm to 750 nm; when inactive, it passes all light.
5. The first image acquisition unit uses a 1/1.2 target-surface 4K Bayer sensor (hereinafter the 4K sensor) with a resolution of 3840×2160.
6. The second image acquisition unit uses a 1/1.2 target-surface 2K Bayer sensor (hereinafter the 2K sensor) with a resolution of 1920×1080.
7. The image processing unit uses an SoC processor with an ARM+DSP architecture.
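The energy budget of the prism in item 3 can be checked with a toy model. This is an illustrative accounting only: real splitting is wavelength-dependent and lossy, which this sketch ignores.

```python
def prism_split(visible, infrared, vis_to_a=0.10):
    """Toy energy model of the prism: direction A receives all infrared energy plus
    vis_to_a of the visible energy; direction B receives the remaining visible energy."""
    path_a = (visible * vis_to_a, infrared)       # toward the 4K sensor (behind the IR-CUT filter)
    path_b = (visible * (1.0 - vis_to_a), 0.0)    # visible-only path toward the 2K sensor
    return path_a, path_b

a, b = prism_split(visible=1000.0, infrared=400.0)
assert a == (100.0, 400.0) and b == (900.0, 0.0)
assert a[0] + b[0] == 1000.0   # visible energy is conserved across the two paths
```

Sending 90% of the visible energy to the 2K color path is what preserves the color signal-to-noise ratio, while the 4K path's 10% visible share is enough for same-spectrum registration during fusion.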
FIG. 4 is an exemplary framework diagram of an embodiment of the imaging method of the present application. As shown in FIG. 4, the imaging method of this embodiment includes a day mode and a night mode, where the night mode corresponds to the first illuminance scene and the day mode corresponds to the second illuminance scene.
The camera is mounted on a pole at a preset angle. Its initial mode is day mode, and the exposure time of the 2K sensor is set to 10 ms. When the SoC processor detects that the gain of the second image signal from the 2K sensor is greater than or equal to 30 dB, the camera switches to night mode; when the gain is less than 30 dB, the camera continues to run in day mode.
In day mode:
(1) The LED infrared fill light in the camera is turned off, and the IR-CUT dual filter is activated to filter out the second infrared light signal in direction A.
(2) The 4K sensor captures the 10% visible light signal, converts it into the first image signal, and transmits it to the SoC processor. At the same time, the 2K sensor captures the 90% visible light signal, converts it into the second image signal, and also transmits it to the SoC processor.
(3) The SoC processor opens one ISP pipeline (pipe), processes the first image signal through it, and outputs a single high-resolution color RGB image.
(4) The SoC processor also monitors the signal gain of the second image signal to determine whether a mode switch is needed.
In night mode:
(1) The LED infrared fill light in the camera is turned on and emits 850 nm infrared light in the shooting direction. The IR-CUT dual filter is set to inactive so that the first infrared light signal in direction A can pass through.
(2) The 4K sensor captures the 10% visible light signal together with the first infrared light signal, converts them into the first image signal, and transmits it to the SoC processor. At the same time, the 2K sensor captures the 90% visible light signal, converts it into the second image signal, and also transmits it to the SoC processor.
(3) The SoC processor opens two ISP pipes. One pipe processes the first image signal, converting the 4K RGB image signal into a high-resolution first grayscale image; the other pipe processes the second image signal, converting the 2K RGB signal into a low-resolution RGB image.
(4) The SoC processor converts the low-resolution RGB image into YUV format, obtaining a low-resolution luminance (luma) image (the Y channel) and a low-resolution chrominance (chroma) image (the UV channels).
(5) The SoC processor performs grayscale fusion of the low-resolution luma image and the high-resolution first grayscale image to obtain a high-resolution luma image.
(6) The SoC processor performs color fusion of the high-resolution luma image and the low-resolution chroma image to output a high-resolution RGB image.
(7) The SoC processor also monitors the signal gain of the second image signal to determine whether a mode switch is needed.
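The RGB-to-YUV conversion in step (4) can be sketched with a standard conversion matrix. The patent does not name the exact conversion, so the full-range BT.601 coefficients below are an assumption.

```python
import numpy as np

def rgb_to_yuv(rgb):
    """Full-range BT.601 RGB -> YUV; Y is the luma plane, U and V the chroma planes."""
    m = np.array([[ 0.299,  0.587,  0.114],
                  [-0.169, -0.331,  0.500],
                  [ 0.500, -0.419, -0.081]])
    yuv = rgb.astype(float) @ m.T
    yuv[..., 1:] += 128.0   # shift chroma to the conventional mid-grey offset
    return yuv

rgb_2k = np.full((2, 2, 3), 100.0)        # tiny stand-in for the low-resolution 2K frame
yuv = rgb_to_yuv(rgb_2k)
luma, chroma = yuv[..., 0], yuv[..., 1:]  # Y channel for step (5), UV channels for step (6)
assert np.allclose(luma, 100.0)           # grey input: Y equals the grey level
assert np.allclose(chroma, 128.0)         # grey input: chroma sits at the midpoint
```

Separating luma from chroma is what lets steps (5) and (6) fuse resolution (via Y) and color (via UV) independently.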
FIG. 5 is a schematic structural diagram of an embodiment of an imaging apparatus of the present application. As shown in FIG. 5, the apparatus may be applied to the terminal device in the above embodiments. The imaging apparatus of this embodiment may include a light capture module 1501, a light-splitting module 1502, an image acquisition module 1503, and an image processing module 1504.
The light capture module 1501 is configured to acquire the reflected light signal of a photographed object. The light-splitting module 1502 is configured to separate the reflected light signal into a first light signal and a second light signal by means of a light-splitting unit, where the light-splitting unit splits both spectrum and energy. The image acquisition module 1503 is configured to acquire a first image signal according to the first light signal and a second image signal according to the second light signal, where the resolution of the first image signal is higher than that of the second image signal.
In a possible implementation, the image processing module 1504 is configured to acquire a first target image according to the first image signal and the second image signal in a first illuminance scene, or to acquire a second target image according to the first image signal in a second illuminance scene, where the illumination intensity of the second illuminance scene is greater than that of the first illuminance scene.
In a possible implementation, the light-splitting unit is configured to separate the visible light signal and the infrared light signal in the reflected light signal, thereby splitting the spectrum; the light-splitting unit is further configured to divide the visible light signal into a first component signal carrying 10%–40% of the visible-spectrum energy and a second component signal carrying 60%–90% of the visible-spectrum energy.
In a possible implementation, under the first illuminance, the image processing module 1504 is further configured to turn on the infrared fill light to illuminate the photographed object and to disable the infrared filter, so that the first light signal includes the first component signal of the visible light signal and the infrared light signal reflected from the infrared fill light, and the second light signal includes the second component signal of the visible light signal.
In a possible implementation, under the second illuminance, the image processing module 1504 is further configured to turn off the infrared fill light and to enable the infrared filter, so that the first light signal includes the first component signal of the visible light signal but no infrared light signal, and the second light signal includes the second component signal of the visible light signal.
In a possible implementation, the image acquisition module 1503 is specifically configured to input the first light signal into a first image sensor to acquire the first image signal, and to input the second light signal into a second image sensor to acquire the second image signal, where the resolution of the first image sensor is higher than that of the second image sensor.
In a possible implementation, the image processing module 1504 is specifically configured to: obtain a first grayscale image according to the first image signal; obtain a second grayscale image and a chroma image according to the second image signal; perform grayscale fusion on the first grayscale image and the second grayscale image to obtain a third grayscale image; and perform color fusion on the third grayscale image and the chroma image to obtain the first target image, where the resolution of the first grayscale image is higher than that of the second grayscale image and higher than that of the chroma image.
In a possible implementation, the image processing module 1504 is further configured to determine the current lighting scene according to the second light signal: when the illumination intensity corresponding to the second light signal is less than a first threshold, the current lighting scene is determined to be the first illuminance scene, and when the illumination intensity is greater than or equal to the first threshold, the second illuminance scene; or, when the signal gain corresponding to the second light signal is greater than a second threshold, the current lighting scene is determined to be the first illuminance scene, and when the signal gain is less than or equal to the second threshold, the second illuminance scene.
The apparatus of this embodiment may be used to execute the technical solution of the method embodiment shown in FIG. 3; its implementation principle and technical effect are similar and are not repeated here.
In implementation, the steps of the above method embodiments may be completed by integrated logic circuits of hardware in a processor or by instructions in the form of software. The processor may be a general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component. A general-purpose processor may be a microprocessor or any conventional processor. The steps of the methods disclosed in the embodiments of the present application may be executed directly by a hardware coding processor, or by a combination of hardware and software modules in a coding processor. The software modules may reside in storage media mature in the art, such as random access memory, flash memory, read-only memory, programmable read-only memory, electrically erasable programmable memory, or registers. The storage medium is located in the memory, and the processor reads the information in the memory and completes the steps of the above methods in combination with its hardware.
The memory mentioned in the above embodiments may be a volatile memory or a non-volatile memory, or may include both. The non-volatile memory may be a read-only memory (ROM), a programmable ROM (PROM), an erasable programmable ROM (EPROM), an electrically erasable programmable ROM (EEPROM), or a flash memory. The volatile memory may be a random access memory (RAM), used as an external cache. By way of example and not limitation, many forms of RAM are available, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchlink DRAM (SLDRAM), and direct rambus RAM (DR RAM). It should be noted that the memory of the systems and methods described herein is intended to include, without limitation, these and any other suitable types of memory.
Those of ordinary skill in the art will appreciate that the units and algorithm steps of the examples described in conjunction with the embodiments disclosed herein can be implemented in electronic hardware, or in a combination of computer software and electronic hardware. Whether these functions are performed in hardware or software depends on the specific application and the design constraints of the technical solution. Skilled artisans may implement the described functionality in different ways for each particular application, but such implementations should not be considered beyond the scope of the present application.
Those skilled in the art will clearly understand that, for convenience and brevity of description, reference may be made to the corresponding processes in the foregoing method embodiments for the specific working processes of the systems, apparatuses, and units described above, which are not repeated here.
In the several embodiments provided in this application, it should be understood that the disclosed system, apparatus, and method may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative: the division into units is only a division by logical function, and other divisions are possible in actual implementation; for example, multiple units or components may be combined or integrated into another system, or some features may be omitted or not executed. In addition, the couplings, direct couplings, or communication connections shown or discussed may be indirect couplings or communication connections through interfaces, apparatuses, or units, and may be electrical, mechanical, or in other forms.
The units described as separate components may or may not be physically separate, and components shown as units may or may not be physical units; they may be located in one place or distributed across multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, the functional units in the embodiments of the present application may be integrated into one processing unit, each unit may exist alone physically, or two or more units may be integrated into one unit.
If the functions are implemented in the form of software functional units and sold or used as independent products, they may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present application, in essence, or the part contributing to the prior art, or a part of the technical solution, may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions for causing a computer device (a personal computer, a server, a network device, or the like) to execute all or part of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
The above are only specific embodiments of the present application, but the protection scope of the present application is not limited thereto. Any variation or replacement readily conceivable by a person skilled in the art within the technical scope disclosed in the present application shall fall within the protection scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (19)

  1. An imaging method, characterized by comprising:
    acquiring a reflected light signal of a photographed object;
    separating the reflected light signal into a first light signal and a second light signal by a light-splitting unit, wherein the light-splitting unit splits both spectrum and energy;
    acquiring a first image signal according to the first light signal;
    acquiring a second image signal according to the second light signal;
    wherein the resolution of the first image signal is higher than the resolution of the second image signal.
  2. The method according to claim 1, characterized in that, after the acquiring of the second image signal according to the second light signal, the method further comprises:
    in a first illuminance scene, acquiring a first target image according to the first image signal and the second image signal; or, in a second illuminance scene, acquiring a second target image according to the first image signal;
    wherein the illumination intensity of the second illuminance scene is greater than the illumination intensity of the first illuminance scene.
  3. The method according to claim 1 or 2, characterized in that the light-splitting unit is configured to separate the visible light signal and the infrared light signal in the reflected light signal, thereby splitting the spectrum; the light-splitting unit is further configured to divide the visible light signal into a first component signal carrying 10%–40% of the visible-spectrum energy and a second component signal carrying 60%–90% of the visible-spectrum energy.
  4. The method according to claim 3, characterized in that, under a first illuminance, before the acquiring of the reflected light signal of the photographed object, the method further comprises:
    turning on an infrared fill light to illuminate the photographed object, and disabling an infrared filter, so that the first light signal includes the first component signal and an infrared light signal reflected from the infrared fill light, and the second light signal includes the second component signal.
  5. The method according to claim 3, characterized in that, under a second illuminance, before the acquiring of the reflected light signal of the photographed object, the method further comprises:
    turning off the infrared fill light, and enabling the infrared filter, so that the first light signal includes the first component signal but no infrared light signal, and the second light signal includes the second component signal.
  6. The method according to any one of claims 1-5, characterized in that the acquiring of the first image signal according to the first light signal comprises:
    inputting the first light signal into a first image sensor to acquire the first image signal;
    and the acquiring of the second image signal according to the second light signal comprises:
    inputting the second light signal into a second image sensor to acquire the second image signal;
    wherein the resolution of the first image sensor is higher than the resolution of the second image sensor.
  7. The method according to any one of claims 1-6, wherein the acquiring a first target image according to the first image signal and the second image signal comprises:
    obtaining a first grayscale image according to the first image signal;
    obtaining a second grayscale image and a chrominance map according to the second image signal;
    performing grayscale fusion on the first grayscale image and the second grayscale image to obtain a third grayscale image;
    performing color fusion on the third grayscale image and the chrominance map to obtain the first target image;
    wherein the resolution of the first grayscale image is higher than the resolution of the second grayscale image, and higher than the resolution of the chrominance map.
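The two fusion steps recited in claim 7 can be illustrated with a minimal numerical sketch. This is not the patent's implementation: the nearest-neighbour upsampling, the weighted luma blend, and the weight `w` are all assumptions chosen for brevity; a real pipeline would use a proper interpolation filter and a learned or edge-aware fusion rule.

```python
import numpy as np

def upsample(img, factor):
    # Nearest-neighbour upsampling of a 2-D plane; illustrative only.
    return np.kron(img, np.ones((factor, factor)))

def fuse_target_image(gray_hi, gray_lo, chroma_lo, w=0.6):
    """Sketch of claim 7: grayscale fusion, then color fusion.

    gray_hi:   HxW high-resolution grayscale (from the first image signal)
    gray_lo:   (H/k)x(W/k) low-resolution grayscale (second image signal)
    chroma_lo: (H/k)x(W/k)x2 low-resolution chrominance (e.g. Cb/Cr planes)
    w:         assumed blend weight favouring high-resolution detail
    """
    k = gray_hi.shape[0] // gray_lo.shape[0]
    # Grayscale fusion: high-res detail blended with the low-noise low-res base
    # to obtain the "third grayscale image" of the claim.
    gray_fused = w * gray_hi + (1 - w) * upsample(gray_lo, k)
    # Color fusion: attach upsampled chrominance to the fused luma plane.
    chroma_hi = np.stack([upsample(chroma_lo[..., c], k) for c in range(2)], axis=-1)
    return np.concatenate([gray_fused[..., None], chroma_hi], axis=-1)  # YCbCr-like
```

The output inherits the high resolution of the first grayscale image while its chrominance comes entirely from the low-resolution, high signal-to-noise path, which is the asymmetry the claim relies on.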
  8. The method according to any one of claims 1-7, further comprising:
    determining the current illumination scene according to the second light signal;
    when the illumination intensity corresponding to the second light signal is less than a first threshold, determining that the current illumination scene is the first illuminance scene; when the illumination intensity is greater than or equal to the first threshold, determining that the current illumination scene is the second illuminance scene; or,
    when the signal gain corresponding to the second light signal is greater than a second threshold, determining that the current illumination scene is the first illuminance scene; or, when the signal gain is less than or equal to the second threshold, determining that the current illumination scene is the second illuminance scene.
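Claim 8 gives two alternative decision rules for classifying the scene. The sketch below encodes both; the threshold values are illustrative placeholders, since the claim deliberately leaves them unspecified.

```python
def classify_scene(illuminance=None, gain=None,
                   lux_threshold=30.0, gain_threshold=24.0):
    """Sketch of claim 8's two alternative rules. Threshold values are assumed.

    Returns "first" (low-light) or "second" (bright) illuminance scene.
    """
    if illuminance is not None:
        # Rule 1: compare measured illumination intensity against the first threshold.
        return "first" if illuminance < lux_threshold else "second"
    if gain is not None:
        # Rule 2: high sensor gain implies little light reached the sensor.
        return "first" if gain > gain_threshold else "second"
    raise ValueError("need illuminance or gain measured from the second light signal")
```

Note the boundary behaviour matches the claim: an illuminance exactly equal to the first threshold, or a gain exactly equal to the second threshold, selects the second (bright) scene.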
  9. An imaging apparatus, comprising:
    a light capture module, configured to acquire a reflected light signal of a photographed object;
    a light splitting module, configured to separate the reflected light signal into a first light signal and a second light signal by a light splitting unit, wherein the light splitting unit is used for spectrum splitting and energy splitting;
    an image acquisition module, configured to acquire a first image signal according to the first light signal, and acquire a second image signal according to the second light signal;
    wherein the resolution of the first image signal is higher than the resolution of the second image signal.
  10. The apparatus according to claim 9, further comprising:
    an image processing module, configured to acquire a first target image according to the first image signal and the second image signal in a first illuminance scene; or acquire a second target image according to the first image signal in a second illuminance scene;
    wherein the illumination intensity of the second illuminance scene is greater than the illumination intensity of the first illuminance scene.
  11. The apparatus according to claim 9 or 10, wherein the light splitting unit is configured to separate the visible light signal and the infrared light signal in the reflected light signal, so as to achieve spectrum splitting; and the light splitting unit is further configured to split the visible light signal into a first component signal carrying 10%-40% of the visible spectrum energy and a second component signal carrying 60%-90% of the visible spectrum energy.
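The combined spectrum-and-energy split of claim 11 can be modelled as a simple energy-bookkeeping sketch. The 25% ratio below is an assumed value inside the claimed 10%-40% band, and the routing of infrared energy to the first path reflects claims 4/12 (IR filter off, fill light on); none of this is a physical simulation of the splitter.

```python
def split_reflected_signal(visible_energy, infrared_energy, visible_ratio=0.25):
    """Sketch of claim 11: one splitter, two outputs.

    first path:  a minority share of visible energy, plus all infrared energy
                 (when the IR filter is disabled, per claims 4/12);
    second path: the remaining visible energy only, infrared removed.
    """
    assert 0.10 <= visible_ratio <= 0.40, "ratio must lie in the claimed band"
    first = {"visible": visible_energy * visible_ratio, "infrared": infrared_energy}
    second = {"visible": visible_energy * (1 - visible_ratio), "infrared": 0.0}
    return first, second
```

The asymmetry is the point: the second path keeps most of the visible energy (hence the high signal-to-noise, low-resolution color image), while the first path can be boosted with infrared light to feed the high-resolution sensor.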
  12. The apparatus according to claim 11, wherein, under the first illuminance, the image processing module is further configured to turn on an infrared fill light to illuminate the photographed object and disable the infrared filter, so that the first light signal comprises the first component signal and an infrared light signal reflected from the infrared fill light, and the second light signal comprises the second component signal.
  13. The apparatus according to claim 11, wherein, under the second illuminance, the image processing module is further configured to turn off the infrared fill light and enable the infrared filter, so that the first light signal comprises the first component signal but does not comprise an infrared light signal, and the second light signal comprises the second component signal.
  14. The apparatus according to any one of claims 9-13, wherein the image acquisition module is specifically configured to: input the first light signal into a first image sensor to acquire the first image signal; and input the second light signal into a second image sensor to acquire the second image signal; wherein the resolution of the first image sensor is higher than the resolution of the second image sensor.
  15. The apparatus according to any one of claims 9-14, wherein the image processing module is specifically configured to: obtain a first grayscale image according to the first image signal; obtain a second grayscale image and a chrominance map according to the second image signal; perform grayscale fusion on the first grayscale image and the second grayscale image to obtain a third grayscale image; and perform color fusion on the third grayscale image and the chrominance map to obtain the first target image; wherein the resolution of the first grayscale image is higher than the resolution of the second grayscale image, and higher than the resolution of the chrominance map.
  16. The apparatus according to any one of claims 9-15, wherein the image processing module is further configured to determine the current illumination scene according to the second light signal; when the illumination intensity corresponding to the second light signal is less than a first threshold, determine that the current illumination scene is the first illuminance scene; when the illumination intensity is greater than or equal to the first threshold, determine that the current illumination scene is the second illuminance scene; or, when the signal gain corresponding to the second light signal is greater than a second threshold, determine that the current illumination scene is the first illuminance scene; or, when the signal gain is less than or equal to the second threshold, determine that the current illumination scene is the second illuminance scene.
  17. A terminal device, comprising:
    one or more processors; and
    a memory, configured to store one or more programs;
    wherein, when the one or more programs are executed by the one or more processors, the one or more processors are caused to implement the method according to any one of claims 1-8.
  18. A computer-readable storage medium, comprising a computer program, wherein, when the computer program is executed on a computer, the computer is caused to perform the method according to any one of claims 1-8.
  19. A computer program, wherein, when the computer program is executed by a computer, the computer program is used to perform the method according to any one of claims 1-8.
PCT/CN2021/118697 2020-09-29 2021-09-16 Imaging method and apparatus WO2022068598A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202011058209.5A CN114338962B (en) 2020-09-29 2020-09-29 Image forming method and apparatus
CN202011058209.5 2020-09-29

Publications (1)

Publication Number Publication Date
WO2022068598A1 true WO2022068598A1 (en) 2022-04-07

Family

ID=80949542

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/118697 WO2022068598A1 (en) 2020-09-29 2021-09-16 Imaging method and apparatus

Country Status (2)

Country Link
CN (1) CN114338962B (en)
WO (1) WO2022068598A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117499789A (en) * 2023-12-25 2024-02-02 荣耀终端有限公司 Shooting method and related device

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2015097776A1 (en) * 2013-12-25 2015-07-02 日立マクセル株式会社 Image pickup element and image pickup apparatus
CN104822033A (en) * 2015-05-05 2015-08-05 太原理工大学 Visual sensor based on infrared and visible light image fusion and using method thereof
CN107563971A (en) * 2017-08-12 2018-01-09 四川精视科技有限公司 A kind of very color high-definition night-viewing imaging method
CN108387944A (en) * 2018-02-01 2018-08-10 北京理工大学 Minimize the life-detection instrument of visible light and the fusion of LONG WAVE INFRARED two-waveband video
CN110891138A (en) * 2018-09-10 2020-03-17 杭州萤石软件有限公司 Black light full-color realization method and black light full-color camera

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3592182B2 (en) * 2000-02-29 2004-11-24 株式会社睦コーポレーション Imaging device
JP4757221B2 (en) * 2007-03-30 2011-08-24 富士フイルム株式会社 Imaging apparatus and method
CN109040534A (en) * 2017-06-12 2018-12-18 杭州海康威视数字技术股份有限公司 A kind of image processing method and image capture device
CN107580163A (en) * 2017-08-12 2018-01-12 四川精视科技有限公司 A kind of twin-lens black light camera
CN110490044B (en) * 2019-06-14 2022-03-15 杭州海康威视数字技术股份有限公司 Face modeling device and face modeling method
CN111027489B (en) * 2019-12-12 2023-10-20 Oppo广东移动通信有限公司 Image processing method, terminal and storage medium



Also Published As

Publication number Publication date
CN114338962A (en) 2022-04-12
CN114338962B (en) 2023-04-18


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application — Ref document number: 21874259; Country of ref document: EP; Kind code of ref document: A1
NENP Non-entry into the national phase — Ref country code: DE
122 Ep: pct application non-entry in european phase — Ref document number: 21874259; Country of ref document: EP; Kind code of ref document: A1