WO2021139401A1 - Camera module, imaging method and imaging device - Google Patents

Camera module, imaging method and imaging device

Info

Publication number
WO2021139401A1
Authority
WO
WIPO (PCT)
Prior art keywords
module
filter
camera module
mode
lens
Prior art date
Application number
PCT/CN2020/128975
Other languages
English (en)
French (fr)
Inventor
罗巍
王轶凡
唐玮
Original Assignee
华为技术有限公司 (Huawei Technologies Co., Ltd.)
Priority date
Filing date
Publication date
Application filed by 华为技术有限公司 (Huawei Technologies Co., Ltd.)
Priority to US17/791,471 (US20230045724A1)
Priority to EP20911551.8A (EP4075781A4)
Publication of WO2021139401A1

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01JMEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J3/00Spectrometry; Spectrophotometry; Monochromators; Measuring colours
    • G01J3/12Generating the spectrum; Monochromators
    • G01J3/26Generating the spectrum; Monochromators using multiple reflection, e.g. Fabry-Perot interferometer, variable interference filters
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/50Constructional details
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/64Circuits for processing colour signals
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01JMEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J3/00Spectrometry; Spectrophotometry; Monochromators; Measuring colours
    • G01J3/02Details
    • G01J3/0205Optical elements not provided otherwise, e.g. optical manifolds, diffusers, windows
    • G01J3/0208Optical elements not provided otherwise, e.g. optical manifolds, diffusers, windows using focussing or collimating elements, e.g. lenses or mirrors; performing aberration correction
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01JMEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J3/00Spectrometry; Spectrophotometry; Monochromators; Measuring colours
    • G01J3/02Details
    • G01J3/0205Optical elements not provided otherwise, e.g. optical manifolds, diffusers, windows
    • G01J3/0224Optical elements not provided otherwise, e.g. optical manifolds, diffusers, windows using polarising or depolarising elements
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01JMEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J3/00Spectrometry; Spectrophotometry; Monochromators; Measuring colours
    • G01J3/02Details
    • G01J3/0205Optical elements not provided otherwise, e.g. optical manifolds, diffusers, windows
    • G01J3/0235Optical elements not provided otherwise, e.g. optical manifolds, diffusers, windows using means for replacing an element by another, for replacing a filter or a grating
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01JMEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J3/00Spectrometry; Spectrophotometry; Monochromators; Measuring colours
    • G01J3/02Details
    • G01J3/027Control of working procedures of a spectrometer; Failure detection; Bandwidth calculation
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01JMEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J3/00Spectrometry; Spectrophotometry; Monochromators; Measuring colours
    • G01J3/12Generating the spectrum; Monochromators
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01JMEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J3/00Spectrometry; Spectrophotometry; Monochromators; Measuring colours
    • G01J3/12Generating the spectrum; Monochromators
    • G01J3/1256Generating the spectrum; Monochromators using acousto-optic tunable filter
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01JMEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J3/00Spectrometry; Spectrophotometry; Monochromators; Measuring colours
    • G01J3/28Investigating the spectrum
    • G01J3/2803Investigating the spectrum using photoelectric array detector
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01JMEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J3/00Spectrometry; Spectrophotometry; Monochromators; Measuring colours
    • G01J3/28Investigating the spectrum
    • G01J3/30Measuring the intensity of spectral lines directly on the spectrum itself
    • G01J3/32Investigating bands of a spectrum in sequence by a single detector
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01JMEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J3/00Spectrometry; Spectrophotometry; Monochromators; Measuring colours
    • G01J3/46Measurement of colour; Colour measuring devices, e.g. colorimeters
    • G01J3/462Computing operations in or between colour spaces; Colour management systems
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01JMEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J3/00Spectrometry; Spectrophotometry; Monochromators; Measuring colours
    • G01J3/46Measurement of colour; Colour measuring devices, e.g. colorimeters
    • G01J3/50Measurement of colour; Colour measuring devices, e.g. colorimeters using electric radiation detectors
    • G01J3/51Measurement of colour; Colour measuring devices, e.g. colorimeters using electric radiation detectors using colour filters
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/10Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths
    • H04N23/125Colour sequential image capture, e.g. using a colour wheel
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/50Constructional details
    • H04N23/54Mounting of pick-up tubes, electronic image sensors, deviation or focusing coils
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/50Constructional details
    • H04N23/55Optical parts specially adapted for electronic image sensors; Mounting thereof
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/667Camera operation mode switching, e.g. between still and video, sport and normal or high- and low-resolution modes
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80Camera processing pipelines; Components thereof
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80Camera processing pipelines; Components thereof
    • H04N23/84Camera processing pipelines; Components thereof for processing colour signals
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/10Circuitry of solid-state image sensors [SSIS]; Control thereof for transforming different wavelengths into image signals
    • H04N25/11Arrangement of colour filter arrays [CFA]; Filter mosaics
    • H04N25/13Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements
    • H04N25/131Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements including elements passing infrared wavelengths
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01JMEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J3/00Spectrometry; Spectrophotometry; Monochromators; Measuring colours
    • G01J3/12Generating the spectrum; Monochromators
    • G01J2003/1213Filters in general, e.g. dichroic, band
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01JMEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J3/00Spectrometry; Spectrophotometry; Monochromators; Measuring colours
    • G01J3/12Generating the spectrum; Monochromators
    • G01J2003/1269Electrooptic filter
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B26/00Optical devices or arrangements for the control of light using movable or deformable optical elements
    • G02B26/001Optical devices or arrangements for the control of light using movable or deformable optical elements based on interference in an adjustable optical cavity
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B5/00Optical elements other than lenses
    • G02B5/20Filters
    • G02B5/208Filters for use with infrared or ultraviolet radiation, e.g. for separating visible light from infrared and/or ultraviolet radiation
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B5/00Optical elements other than lenses
    • G02B5/20Filters
    • G02B5/28Interference filters
    • G02B5/281Interference filters designed for the infrared light
    • GPHYSICS
    • G02OPTICS
    • G02FOPTICAL DEVICES OR ARRANGEMENTS FOR THE CONTROL OF LIGHT BY MODIFICATION OF THE OPTICAL PROPERTIES OF THE MEDIA OF THE ELEMENTS INVOLVED THEREIN; NON-LINEAR OPTICS; FREQUENCY-CHANGING OF LIGHT; OPTICAL LOGIC ELEMENTS; OPTICAL ANALOGUE/DIGITAL CONVERTERS
    • G02F1/00Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics
    • G02F1/01Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the intensity, phase, polarisation or colour 
    • G02F1/11Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the intensity, phase, polarisation or colour  based on acousto-optical elements, e.g. using variable diffraction by sound or like mechanical waves

Definitions

  • This application relates to the field of image processing, and more specifically, to a camera module, an imaging method, and an imaging device.
  • The image sensor is one of the most important components of a terminal camera system and plays a decisive role in imaging quality.
  • The commonly used color imaging sensor is a Bayer-type sensor.
  • Bayer-type sensors need to use a demosaicing algorithm to interpolate and fill in the missing color samples, which results in a decrease in image resolution.
  • Moreover, problems such as moiré, color noise, and zipper noise are often introduced in the interpolation process, thereby reducing image quality.
  • the present application provides a camera module, an imaging method, and an imaging device, which can improve the imaging quality of an image.
  • The present application provides a camera module that includes a filter module and a sensor module. The filter module is configured to output, at different times and to the same pixel on the sensor module, target light signals of different wavelength bands selected from the light signals incident on the filter module. The sensor module is configured to convert the target light signals incident on the sensor module into electrical signals and output them.
  • In this solution, the filter module filters in a time-sharing manner to obtain target light signals of different wavebands, and the sensor performs photoelectric conversion on these wavebands in a time-sharing manner. Compared with the prior art, in which light signals of multiple wavebands are collected simultaneously and a full-color image is obtained by interpolation with a demosaicing algorithm, this helps improve the imaging resolution and the accuracy of color reproduction, and also helps avoid problems such as moiré fringes, color noise, and zipper noise.
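  • As an illustration only (not the patented implementation), the following Python sketch shows the time-sharing idea: the hypothetical set_band() and capture_frame() calls stand in for the filter-control and sensor interfaces, and the per-band exposures of the same pixel grid are simply stacked into a full-color cube, with no demosaicing step.
```python
import numpy as np

def capture_time_shared(filter_driver, sensor, bands=("red", "green", "blue")):
    """Capture one full-resolution frame per band at different times.

    `filter_driver.set_band(name)` and `sensor.capture_frame()` are assumed
    (hypothetical) interfaces; because every pixel is exposed to every band,
    no interpolation between neighbouring pixels is required.
    """
    planes = []
    for band in bands:
        filter_driver.set_band(band)                 # filter module passes only this band
        planes.append(sensor.capture_frame().astype(np.float32))
    return np.stack(planes, axis=-1)                 # H x W x len(bands) raw cube
```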
  • The filter module includes a moving module and a plurality of filters. Each filter is configured to output the target optical signal from among the optical signals incident on it, and the wavelength bands of the target light signals output by different filters among the plurality of filters are different. The moving module is configured to move, at different times, different filters of the plurality of filters to a target position at which the optical signal can be received.
  • The filter module includes a moving module and a linear gradual interference filter. The moving module is configured to move, at different times, different parts of the linear gradual interference filter to a target position at which the optical signal can be received, and the wavelength bands of the target optical signals output by different parts of the filter are different.
  • the filter is continuously adjustable from visible light to infrared light.
  • The filter module includes two double-sided mirrors and a moving module. The moving module is configured to adjust the distance between the two double-sided mirrors so that the distance between them is different at different times. The two double-sided mirrors are configured to filter out the target light signal from the optical signal incident on one of the two double-sided mirrors and output it from the other of the two double-sided mirrors, and when the distance between the two double-sided mirrors is different, the wavelength band of the target light signal is different.
  • the filter module includes a liquid crystal tunable filter.
  • the filter module is one of a plurality of filter modules of the camera module.
  • the filter module includes an acousto-optic tunable filter.
  • The filter module further includes an infrared cut-off filter, and the infrared cut-off filter is configured to filter out the infrared light in the target light signal before the target light signal is incident on the sensor module.
  • the camera module further includes: a first lens module.
  • the first lens module is used to output the optical signal incident on the first lens module to the filter module.
  • the first lens module can reduce the color shift problem caused by the filter module under a large angle of incidence, and at the same time ensure that more light signals are incident on the sensor module, thereby further improving the imaging quality.
  • The first lens module includes one or more of the following: a plastic lens, a glass lens, a diffractive optical element (DOE), a metalens, and the like.
  • The camera module further includes a motor module; the motor module is configured to control the movement of the first lens module and/or the sensor module, so as to realize the focusing function and/or the anti-shake function of the camera module.
  • the camera module further includes a second lens module, and the second lens module is located between the filter module and the sensor module.
  • the second lens module is used to output the target light signal output by the filter to the sensor module, and increase the coverage of the target light signal on the sensor module.
  • The second lens module can increase the incident angle of the target light signal output by the filter module onto the sensor module so as to increase the coverage of the target light signal on the sensor module; that is, the second lens module allows the sensor module to collect more target light signals at the same time, so that more raw image data can be provided and the imaging quality can be improved.
  • The camera module further includes a second lens module, and the second lens module is located between the filter module and the sensor module, where the second lens module is configured to output the target light signal output by the filter to the sensor module and to increase the coverage of the target light signal on the sensor module.
  • The camera module further includes a second motor module; the second motor module is configured to control the movement of the second lens module and/or the sensor module, so as to realize the focusing function and/or the anti-shake function of the camera module.
  • The camera module further includes a first lens module, where the first lens module is configured to output the optical signal incident on the first lens module to the target position of the filter module.
  • The second lens module includes one or more of the following: a plastic lens, a glass lens, a diffractive optical element (DOE), a metalens, and the like.
  • the sensor module includes a full band pass sensor or a wide band pass sensor.
  • the present application provides an imaging method.
  • The imaging method includes: acquiring multiple sets of raw image data, where the multiple sets of raw image data are raw image data obtained through photoelectric conversion, by a camera module, of target light signals of different wavelength bands collected at different times; and performing color adjustment processing on the multiple sets of raw image data to obtain a color image, where different sets of raw image data among the multiple sets of raw image data correspond to target light signals of different wavelength bands.
  • Performing color adjustment processing on the multiple sets of raw image data collected by the camera module includes: performing white balance, color restoration, gamma correction, and three-dimensional color look-up table (3D LUT) processing on the multiple sets of raw image data.
  • the present application provides an imaging device including various modules for implementing the imaging method described in the second aspect. These modules can be implemented in hardware or software.
  • In a fourth aspect, an imaging device is provided, which includes: a memory for storing a program; and a processor for executing the program stored in the memory, where, when the program stored in the memory is executed, the processor is configured to perform the method described in the second aspect.
  • the imaging device further includes a communication interface, and the communication interface is used for information interaction with the camera module or other devices.
  • the imaging device further includes a transceiver, and the transceiver is used for information interaction with the camera module or other devices.
  • The present application provides a method for adjusting the spectral mode of a camera module. The method includes: receiving first information, where the first information is used to instruct setting of the spectral mode of the camera module; and, in response to the first information, outputting mode information of each spectral mode among a plurality of spectral modes, where the mode information of each spectral mode includes the name information of that spectral mode.
  • the multiple spectral modes include at least one of the following: a normal mode, a high-precision mode, a dark light mode, a printing mode, or an expert mode.
  • The method further includes: receiving second information, where the second information is used to indicate that the spectral mode of the camera module is to be set to a target spectral mode among the multiple spectral modes; and, in response to the second information, setting the spectral mode of the camera module to the target spectral mode.
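  • As a rough sketch of how such a method might be realised in software (names and data structures are assumptions, not the claimed implementation), the handler below lists the mode names in response to the first information and sets the target mode in response to the second information.
```python
from dataclasses import dataclass

# Hypothetical mode table; the names follow the examples given in the text.
SPECTRAL_MODES = {
    "normal": "Normal mode",
    "high_precision": "High-precision mode",
    "dark_light": "Dark light mode",
    "printing": "Printing mode",
    "expert": "Expert mode",
}

@dataclass
class CameraModule:
    spectral_mode: str = "normal"

def handle_first_information(camera: CameraModule) -> list[dict]:
    """In response to the first information, output the mode information
    (here just the name) of every available spectral mode."""
    return [{"id": mode_id, "name": name} for mode_id, name in SPECTRAL_MODES.items()]

def handle_second_information(camera: CameraModule, target_mode: str) -> str:
    """In response to the second information, set the camera module to the
    indicated target spectral mode."""
    if target_mode not in SPECTRAL_MODES:
        raise ValueError(f"unknown spectral mode: {target_mode}")
    camera.spectral_mode = target_mode
    return camera.spectral_mode
```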
  • the present application provides a device, which includes various modules for implementing the method described in the fifth aspect. These modules can be implemented in hardware or software.
  • A device is provided, which includes: an input unit for receiving information input by a user; a memory for storing a program; an output unit for outputting information to a user; and a processor for executing the program stored in the memory, where, when the program stored in the memory is executed, the processor is configured to execute the method described in the fifth aspect.
  • The input unit may be a touch screen, a microphone, a mouse, a keyboard, a camera, or another device capable of sensing user input; the output unit may be a display screen, a loudspeaker, or another device.
  • The present application provides an imaging system that includes one or more of the following devices: the camera module according to the first aspect, the imaging device according to the third or fourth aspect, and the device according to the sixth or seventh aspect.
  • the present application provides a computer-readable storage medium that stores instructions for execution by an imaging device, and the instructions are used to execute the imaging method described in the second aspect.
  • the present application provides a computer-readable storage medium that stores instructions for execution by an imaging device, and the instructions are used to execute the method described in the fifth aspect.
  • A computer program product containing instructions is provided; when the computer program product runs on a computer, the computer executes the imaging method described in the second aspect.
  • A twelfth aspect provides a computer program product containing instructions; when the computer program product runs on a computer, the computer executes the method described in the fifth aspect.
  • In a thirteenth aspect, a chip is provided, which includes a processor and a data interface.
  • the processor reads instructions stored in a memory through the data interface and executes the method described in the second aspect.
  • In a fourteenth aspect, a chip is provided, which includes a processor and a data interface.
  • the processor reads instructions stored in a memory through the data interface and executes the method described in the fifth aspect.
  • FIG. 1 is a schematic architecture diagram of an imaging system according to an embodiment of the present application.
  • FIG. 2 is a schematic structural diagram of an imaging system according to another embodiment of the present application.
  • Fig. 3 is a schematic structural diagram of an imaging system according to another embodiment of the present application.
  • FIG. 4 is a schematic structural diagram of an imaging system according to another embodiment of the present application.
  • FIG. 5 is a schematic structural diagram of an imaging system according to another embodiment of the present application.
  • Fig. 6 is a schematic structural diagram of an imaging system according to another embodiment of the present application.
  • FIG. 7 is a schematic structural diagram of an imaging system according to another embodiment of the present application.
  • FIG. 8 is a schematic structural diagram of an imaging system according to another embodiment of the present application.
  • FIG. 9 is a schematic structural diagram of an imaging system according to another embodiment of the present application.
  • FIG. 10 is a schematic structural diagram of an imaging system according to another embodiment of the present application.
  • FIG. 11 is a schematic structural diagram of an imaging system according to another embodiment of the present application.
  • FIG. 12 is a schematic structural diagram of an imaging system according to another embodiment of the present application.
  • FIG. 13 is a schematic structural diagram of an imaging system according to another embodiment of the present application.
  • FIG. 14 is a schematic structural diagram of a filter module according to an embodiment of the present application.
  • FIG. 15 is a schematic flowchart of a filter module according to another embodiment of the present application.
  • FIG. 16 is a schematic structural diagram of a filter module according to another embodiment of the present application.
  • FIG. 17 is a schematic structural diagram of a filter module according to another embodiment of the present application.
  • FIG. 18 is a schematic structural diagram of a filter module according to another embodiment of the present application.
  • FIG. 19 is a schematic structural diagram of a filter module according to another embodiment of the present application.
  • FIG. 20 is a schematic structural diagram of a filter module according to another embodiment of the present application.
  • FIG. 21 is a schematic structural diagram of a filter module according to another embodiment of the present application.
  • FIG. 22 is a schematic structural diagram of a filter module according to another embodiment of the present application.
  • FIG. 23 is a schematic structural diagram of a filter module according to another embodiment of the present application.
  • FIG. 24 is a schematic structural diagram of a filter module according to another embodiment of the present application.
  • FIG. 25 is a schematic structural diagram of a filter module according to another embodiment of the present application.
  • FIG. 26 is a schematic structural diagram of a filter module according to another embodiment of the present application.
  • FIG. 27 is a schematic flowchart of an imaging method according to an embodiment of the present application.
  • FIG. 28 is a schematic flowchart of an imaging method according to another embodiment of the present application.
  • FIG. 29 is a schematic diagram of a mode selection interface of an embodiment of the present application.
  • FIG. 30 is a schematic structural diagram of an imaging system according to another embodiment of the present application.
  • FIG. 31 is a schematic flowchart of an imaging method according to an embodiment of the present application.
  • FIG. 32 is a schematic flowchart of an imaging method according to another embodiment of the present application.
  • FIG. 33 is a schematic structural diagram of a camera module according to an embodiment of the present application.
  • FIG. 34 is a schematic structural diagram of an imaging device according to an embodiment of the present application.
  • FIG. 35 is a schematic structural diagram of an imaging device according to another embodiment of the present application.
  • FIG. 36 is a schematic structural diagram of an apparatus for adjusting the spectral mode of a camera module according to an embodiment of the present application.
  • FIG. 37 is a schematic structural diagram of an apparatus for adjusting the spectral mode of a camera module according to another embodiment of the present application.
  • FIG. 38 is a schematic structural diagram of a Bayer array filter according to an embodiment of the application.
  • FIG. 39 is a schematic diagram of a single-color image according to an embodiment of the application.
  • FIG. 40 is a schematic diagram of a single-color image according to another embodiment of the application.
  • The optical image generated by the lens is projected onto the surface of the image sensor, the image sensor converts the light signal into an electrical signal, the electrical signal is converted into a digital image signal through analog-to-digital conversion, and the digital image signal is processed by the processor and then transmitted to the display so that the image can be viewed.
  • the implementation of obtaining a single color image of the scene based on the Bayer array filter is introduced in conjunction with FIG. 38.
  • In a Bayer array filter, the sensing elements for red, green, and blue band light are arranged in a checkerboard pattern. Therefore, in the image obtained after the scene is projected onto the sensor through the Bayer array filter, the three colors red, green, and blue are arranged in a checkerboard pattern, and the single-color images corresponding to the three colors of red, green, and blue are shown in FIG. 39.
  • each single-color image contains only part of the data.
  • The color image restored from the three single-color images by the interpolation operation not only has a lower resolution; moreover, problems such as moiré, color noise, and zipper noise are introduced in the interpolation process, which also reduce image quality.
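  • For comparison, the following sketch shows the kind of naive bilinear interpolation a Bayer pipeline has to perform (an RGGB layout and a 3x3 neighbourhood average are assumptions made for illustration); it is exactly this filling-in of missing samples that lowers resolution and can introduce moiré and zipper artifacts.
```python
import numpy as np

def demosaic_bilinear(raw):
    """Naive demosaic of a single-channel Bayer mosaic (RGGB layout assumed).

    Each colour plane keeps its sampled pixels and fills the missing ones with
    the average of the available same-colour neighbours in a 3x3 window.
    """
    h, w = raw.shape
    masks = {c: np.zeros((h, w), bool) for c in "RGB"}
    masks["R"][0::2, 0::2] = True
    masks["G"][0::2, 1::2] = True
    masks["G"][1::2, 0::2] = True
    masks["B"][1::2, 1::2] = True

    out = np.zeros((h, w, 3), np.float32)
    for i, c in enumerate("RGB"):
        plane = np.where(masks[c], raw, 0.0).astype(np.float32)
        count = masks[c].astype(np.float32)
        acc = np.zeros_like(plane)
        num = np.zeros_like(count)
        pp, pn = np.pad(plane, 1), np.pad(count, 1)
        for dy in (0, 1, 2):
            for dx in (0, 1, 2):
                acc += pp[dy:dy + h, dx:dx + w]   # sum of sampled neighbours
                num += pn[dy:dy + h, dx:dx + w]   # how many were sampled
        out[..., i] = np.where(masks[c], raw, acc / np.maximum(num, 1.0))
    return out
```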
  • this application proposes a new filter, camera module, imaging system and imaging method.
  • FIG. 1 is a schematic architecture diagram of an imaging system 100 according to an embodiment of the present application.
  • the imaging system 100 may include a camera module 110 and a processing module 120.
  • the imaging system 100 may further include a storage module 130 and a display module 140.
  • the camera module 110 may include a filter module 112 and a sensor module 115.
  • the processing module 120 may include an image signal processing unit 121 and a filter control unit 123.
  • the filter module 112, the filter control unit 123, the sensor module 115, the image signal processing unit 121, the storage module 130, and the display module 140 communicate with each other through the bus system 150, that is, transmit signals or data.
  • The filter module 112 is configured to receive the control signal output by the filter control unit 123 and, under the control of the control signal, output light signals of different wavelength bands from the incident light to the same pixel on the sensor module at different times.
  • the optical signal output by the filter module 112 includes the optical signal of the wavelength band required by the imaging system 100 to complete imaging.
  • For example, when the spectral mode of the imaging system 100 is the RGB mode, the required bands include the three bands of red, green, and blue; when the spectral mode of the imaging system 100 is the RYB mode, the required bands include the three bands of red, yellow, and blue; when the spectral mode of the imaging system 100 is the RWB mode, the required bands include the red, blue, and full-bandpass bands; when the spectral mode of the imaging system 100 is the RGBW mode, the required bands include the red, green, blue, and full-bandpass bands; and in some other special spectral modes, the required bands may also include near-infrared light in addition to the visible light bands. For convenience of description, the optical signal of a required wavelength band is called the target optical signal.
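  • For illustration, the mode-to-band correspondence described above could be represented as follows (the dictionary keys and band names are assumptions for the sketch; "white" stands for the full-bandpass setting and the "special" entry for a mode that additionally uses near-infrared light).
```python
# Hypothetical mapping from spectral mode to the wavelength bands whose
# signals are the target optical signals.
REQUIRED_BANDS = {
    "RGB":     ["red", "green", "blue"],
    "RYB":     ["red", "yellow", "blue"],
    "RWB":     ["red", "white", "blue"],
    "RGBW":    ["red", "green", "blue", "white"],
    "special": ["red", "green", "blue", "near_infrared"],
}

def target_bands(spectral_mode: str) -> list[str]:
    """Return the bands the filter module must output for the given mode."""
    try:
        return REQUIRED_BANDS[spectral_mode]
    except KeyError as err:
        raise ValueError(f"unsupported spectral mode: {spectral_mode}") from err
```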
  • The filter module 112 may include, but is not limited to, the following tunable filters: MEMS-driven interferometric thin-film filters, tunable filters based on liquid crystal modulation, acousto-optic tunable filters (AOTF), tunable filters based on Fabry-Perot (FP) cavity interference, tunable filters based on the magneto-optical effect, and so on.
  • the sensor module 115 is used to convert the light signal incident through the filter module 112 into an electrical signal, and output the electrical signal.
  • the process in which the sensor module 115 converts optical signals into electrical signals may also be referred to as photosensitive imaging.
  • the process of continuously converting the optical signal into an electrical signal by the sensor module 115 can be referred to as continuous integration of the optical signal.
  • The sensor module 115 may perform photoelectric conversion only when the filter module 112 outputs the target light signal under the control of the control unit, or it may perform photoelectric conversion on all of the optical signals output by the filter module 112.
  • the image signal processing unit 121 can filter out the electrical signal converted from the target light signal from the electrical signal output by the sensor module 115.
  • the electrical signal output by the sensor module 115 may be referred to as image raw data.
  • Each pixel of the full-bandpass sensor can sense and image visible light, or each pixel of the full-bandpass sensor can sense and image both visible light and near-infrared light.
  • the image signal processor (ISP) unit 121 is used to process the electrical signal output by the sensor module 115 to obtain a color image.
  • The image signal processing unit 121 is configured to perform the following processing on the original image data: white balance (WB), color correction (CC), gamma correction, and three-dimensional color look-up table (3D LUT) correction, so as to complete the color-related adjustment and obtain a full-color image.
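  • A minimal sketch of these colour-related steps is given below, assuming the raw data have already been assembled into an H x W x N cube of per-band planes; the function name, argument shapes, and the optional lut3d callable are assumptions for illustration rather than the unit 121's actual interface.
```python
import numpy as np

def color_pipeline(raw_cube, wb_gains, ccm, gamma=2.2, lut3d=None):
    """Apply white balance, colour correction, gamma, and an optional 3D LUT.

    raw_cube : H x W x N stack of per-band raw image data
    wb_gains : length-N white-balance gains
    ccm      : 3 x N colour-correction matrix mapping the N bands to RGB
    lut3d    : optional callable implementing a 3D look-up-table adjustment
    """
    x = raw_cube.astype(np.float32) * np.asarray(wb_gains, np.float32)   # white balance
    rgb = np.einsum("hwn,cn->hwc", x, np.asarray(ccm, np.float32))       # colour correction
    rgb = np.clip(rgb, 0.0, 1.0) ** (1.0 / gamma)                        # gamma correction
    if lut3d is not None:
        rgb = lut3d(rgb)                                                 # 3D LUT
    return rgb
```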
  • the filter control unit 123 is configured to output a control signal to the filter module 112 to control the filter module 112 to filter and output target light signals of different wavelength bands in the incident light at different times.
  • The storage module 130 is configured to store the program code executed by the processing module 120 and/or image-related data, for example, the original image data collected by the sensor module 115, the temporary data generated when the processing module 120 performs color adjustment on the original image data, and the color image obtained after color adjustment.
  • the storage module 130 may be a read only memory (ROM), a static storage device, a dynamic storage device, or a random access memory (RAM), etc.
  • the display module 140 is used for displaying the color image processed by the image signal processing unit 121.
  • The display module 140 may be a liquid crystal display (LCD), a light-emitting diode (LED) display device, a cathode ray tube (CRT) display device, or a projector.
  • The filter module 112 can provide finer raw spectral data, and the processing module 120 can provide a high-dimensional color correction matrix (CCM) to ensure multiple degrees of freedom during color adjustment and ensure the accuracy of color reproduction.
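  • As one way to see where the extra degrees of freedom come from (a sketch under the assumption that N > 3 band responses of reference colour patches are available), a 3 x N CCM can be fitted by least squares; with more bands the matrix has more free parameters than the 3 x 3 CCM of a Bayer pipeline.
```python
import numpy as np

def fit_high_dim_ccm(band_samples, target_rgb):
    """Least-squares fit of a 3 x N colour-correction matrix.

    band_samples : M x N measured responses of M colour patches in N bands
    target_rgb   : M x 3 reference RGB values of the same patches
    Both inputs are assumed calibration data; names are illustrative only.
    """
    A = np.asarray(band_samples, np.float64)        # M x N
    Y = np.asarray(target_rgb, np.float64)          # M x 3
    ccm_t, *_ = np.linalg.lstsq(A, Y, rcond=None)   # N x 3 least-squares solution
    return ccm_t.T                                  # 3 x N CCM

# Usage sketch: rgb = raw_bands @ fit_high_dim_ccm(band_samples, target_rgb).T
```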
  • FIG. 2 is a schematic architecture diagram of an imaging system 200 according to another embodiment of the present application. As shown in FIG. 2, in addition to the various components of the imaging system 100, the camera module 110 of the imaging system 200 may further include a lens module 111.
  • the lens module 111 can also be used to enable the target light signal output by the filter module 112 to cover a larger area of the photosensitive area of the sensor module 115, for example, just cover the photosensitive area of the sensor module 115.
  • the angle at which the optical signal passing through the lens module 111 enters the filter module 112 is smaller than the angle at which the optical signal enters the filter module 112 without the lens module 111.
  • the lens module 111 can be understood as reducing the incident angle of the optical signal to the filter module 112. In this way, the lens module 111 reduces the color shift problem caused by the filter module 112 under a large angle of incidence, and at the same time ensures that more light signals are incident on the sensor module 115, thereby further improving the imaging quality.
  • the lens module 111 is located in front of the filter module 112 and can also be used to collect the light signal emitted or reflected by the target and/or the target scene, and output the light signal to the filter module 112.
  • The lens module 111 may include, but is not limited to, the following lens components: a plastic lens group, a plastic-glass hybrid lens group, a diffractive optical element (DOE) lens, a metalens, or other lenses. It is understandable that the three lenses shown in the lens module 111 are only an example, and the lens module may include more or fewer lenses or more types of lenses.
  • FIG. 3 is a schematic architecture diagram of an imaging system 300 according to another embodiment of the present application.
  • the imaging system 300 may also include a motor module 114 in the camera module 110 of the imaging system 300, and a motor control unit 122 in the processing module 120.
  • The motor module 114 is specifically configured to receive the control signal output by the motor control unit 122 and move the lens module 111 according to the control signal to adjust the relative position between the lens module 111 and the sensor module 115; for example, the distance between the lens module 111 and the sensor module 115 is adjusted along the direction of the optical axis of the lens to realize the focusing function, and the relative position between the lens module 111 and the sensor module 115 is adjusted along the direction perpendicular to the optical axis to realize optical image stabilization.
  • The motor module 114 may include, but is not limited to, the following types of motors: a voice coil motor (VCM), a shape-memory alloy (SMA) motor, a piezoelectric (Piezo) motor, a microelectromechanical systems (MEMS) motor, and so on.
  • the filter module and the focusing and optical image stabilization functions are independent of each other, so the optical image stabilization performance is not sacrificed.
  • FIG. 4 is a schematic architecture diagram of an imaging system 400 according to another embodiment of the present application.
  • the imaging system 400 may include various components of the imaging system 300.
  • The motor module 114 is specifically configured to receive the control signal output by the motor control unit 122 and move the sensor module 115 according to the control signal to adjust the relative position between the lens module 111 and the sensor module 115; for example, the distance between the lens module 111 and the sensor module 115 is adjusted along the optical axis of the lens to realize the focusing function, and the relative position between the lens module 111 and the sensor module 115 is adjusted along the direction perpendicular to the optical axis of the lens to realize optical image stabilization.
  • FIG. 5 is a schematic architecture diagram of an imaging system 500 according to another embodiment of the present application.
  • the imaging system 500 may include various components in the imaging system 300.
  • The motor module 114 is specifically configured to receive the control signal output by the motor control unit 122 and move the lens module 111 and the sensor module 115 according to the control signal to adjust the relative position between the lens module 111 and the sensor module 115; for example, the distance between the lens module 111 and the sensor module 115 is adjusted along the optical axis of the lens to realize the focusing function, and the relative position between the lens module 111 and the sensor module 115 is adjusted along the direction perpendicular to the optical axis to realize optical image stabilization. Because the motor module 114 can move the lens module 111 and the sensor module 115 at the same time, focusing and image stabilization can be achieved quickly.
  • FIG. 6 is a schematic architecture diagram of an imaging system 600 according to another embodiment of the present application.
  • the imaging system 600 may include various components in the imaging system 200. The difference is that the lens module 111 in the imaging system 600 is in a different position from the lens module 111 in the imaging system 200. Specifically, the lens module 111 in the imaging system 600 is located between the filter module 112 and the sensor module 115.
  • FIG. 7 is a schematic architecture diagram of an imaging system 700 according to another embodiment of the present application.
  • the imaging system 700 may include various components in the imaging system 300. The difference is that the lens module 111 in the imaging system 700 has a different position from the lens module 111 in the imaging system 300. Specifically, the lens module 111 in the imaging system 700 is located between the filter module 112 and the sensor module 115.
  • FIG. 8 is a schematic architecture diagram of an imaging system 800 according to another embodiment of the present application.
  • the imaging system 800 may include various components in the imaging system 400, the difference is that the lens module 111 in the imaging system 800 is in a different position from the lens module 111 in the imaging system 400. Specifically, the lens module 111 in the imaging system 800 is located between the filter module 112 and the sensor module 115.
  • FIG. 9 is a schematic architecture diagram of an imaging system 900 according to another embodiment of the present application.
  • the imaging system 900 may include various components in the imaging system 500. The difference is that the lens module 111 in the imaging system 900 is in a different position from the lens module 111 in the imaging system 500. Specifically, the lens module 111 in the imaging system 900 is located between the filter module 112 and the sensor module 115.
  • FIG. 10 is a schematic structural diagram of an imaging system 1000 according to another embodiment of the present application.
  • The imaging system 1000 may include the various components of the imaging system 600. The difference is that the imaging system 1000 has at least one more lens module 111 than the imaging system 600, and this lens module 111 is located on the side of the filter module 112 away from the sensor module 115.
  • FIG. 11 is a schematic architecture diagram of an imaging system 1100 according to another embodiment of the present application.
  • The imaging system 1100 may include the various components of the imaging system 700. The difference is that the imaging system 1100 has at least one more lens module 111 than the imaging system 700, and this lens module 111 is located on the side of the filter module 112 away from the sensor module 115.
  • FIG. 12 is a schematic architecture diagram of an imaging system 1200 according to another embodiment of the present application.
  • The imaging system 1200 may include the various components of the imaging system 800. The difference is that the imaging system 1200 has at least one more lens module 111 than the imaging system 800, and this lens module 111 is located on the side of the filter module 112 away from the sensor module 115.
  • FIG. 13 is a schematic architecture diagram of an imaging system 1300 according to another embodiment of the present application.
  • The imaging system 1300 may include the various components of the imaging system 900. The difference is that the imaging system 1300 has at least one more lens module 111 than the imaging system 900, and this lens module 111 is located on the side of the filter module 112 away from the sensor module 115.
  • FIG. 30 is a schematic structural diagram of an imaging system 3000 according to another embodiment of the present application.
  • the imaging system 3000 may include various components in the imaging system 1300.
  • The motor module 114 in the imaging system 3000 is configured to receive the control signal output by the motor control unit 122 and, according to the control signal, move the lens module 111 located on the side of the filter module 112 away from the sensor module 115 together with the sensor module 115, so as to adjust the relative position between the lens module 111 and the sensor module 115, thereby quickly achieving focusing and anti-shake.
  • the schematic structure of the imaging system proposed by the present application is described above.
  • the structure of the filter module proposed in this application is described below. Before introducing the filter module proposed in this application, let us first introduce the light-passing hole in the imaging system of this application.
  • FIG. 33 is a schematic structural diagram of a camera module according to an embodiment of the application.
  • the camera module 3300 includes a lens 3310, a filter module 3320, and a sensor module 3330.
  • the dotted cylinder area in FIG. 33 is centered on the normal line of the lens 3310 and the sensor module 3330, and the dotted cylinder area is the light-passing hole 3340.
  • the camera module 3300 including the lens 3310, the filter module 3320, and the sensor module 3330 is only an example.
  • The structure of the camera module 3300 may be the structure of the camera module in any one of the aforementioned imaging systems.
  • the camera module 3300 may further include a motor module, or another lens module may be included between the filter 3320 and the sensor module 3330, or the camera module may not include the lens 3310.
  • the shape of the filter module 3320 shown in FIG. 33 is only an example, and the filter module 3320 may be the filter module shown in any one of FIGS. 14 to 19.
  • FIG. 14 is a schematic structural diagram of a filter module 1400 according to an embodiment of the present application.
  • the filter module 1400 may be a filter module in the camera module shown in any one of FIGS. 1 to 13.
  • The filter module 1400 includes a red narrow-band filter 14-1, a green narrow-band filter 14-2, a blue narrow-band filter 14-3, and a high-speed motorized wheel 14-4. The red narrow-band filter 14-1, the green narrow-band filter 14-2, and the blue narrow-band filter 14-3 are installed on the high-speed motorized wheel 14-4. The high-speed motorized wheel 14-4 is opaque; only the narrow-band filters transmit light.
  • The working principle of the filter module 1400 is as follows: the high-speed motorized wheel receives the control signal output by the filter control unit and rotates at high speed under the control of the control signal, thereby driving the red narrow-band filter 14-1, the green narrow-band filter 14-2, and the blue narrow-band filter 14-3 to rotate. Because the position of the light-passing hole 14-5 is fixed, driven by the high-speed motorized wheel, the red narrow-band filter 14-1, the green narrow-band filter 14-2, and the blue narrow-band filter 14-3 alternately cover the light-passing hole 14-5; when each of these filters covers the light-passing hole 14-5, the light-passing hole 14-5 outputs the target light signal of the corresponding wavelength band. That is, through the high-speed rotation of the motorized wheel, the target light signal output by the filter module 1400 can be switched among the red, green, and blue wavebands.
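  • A small timing sketch of this switching is shown below; the constant rotation rate, the equal 120-degree sectors, and the function names are assumptions for illustration, since the actual wheel is driven by the filter control unit.
```python
def band_at_time(t_s, period_s, bands=("red", "green", "blue")):
    """Which filter sector covers the light-passing hole at time t_s,
    assuming one revolution every period_s seconds and equal sectors."""
    phase = (t_s % period_s) / period_s            # fraction of one revolution
    return bands[int(phase * len(bands)) % len(bands)]

def exposure_schedule(period_s, bands=("red", "green", "blue")):
    """Start time and duration of the exposure window for each band sector
    within one revolution, so the sensor can integrate band by band."""
    sector_s = period_s / len(bands)
    return {band: (i * sector_s, sector_s) for i, band in enumerate(bands)}
```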
  • FIG. 16 is a schematic structural diagram of a filter module 1600 according to another embodiment of the application.
  • The filter module 1600 includes a red narrow-band filter 16-1, a green narrow-band filter 16-2, a blue narrow-band filter 16-3, and a high-speed motorized wheel 16-4; the red narrow-band filter 16-1, the green narrow-band filter 16-2, and the blue narrow-band filter 16-3 are installed on the high-speed motorized wheel 16-4, and the high-speed motorized wheel 16-4 is opaque.
  • the filter module 1600 is a drawer push-pull mechanical switching structure.
  • the filter module 1600 may be a filter module in the camera module shown in any one of FIGS. 1 to 13.
  • The working principle of the filter module 1600 is as follows: since the position of the light-passing hole 16-5 is fixed, driven by the push-pull/translation motor, the red narrow-band filter 16-1, the green narrow-band filter 16-2, and the blue narrow-band filter 16-3 alternately cover the light-passing hole 16-5; when each of these filters covers the light-passing hole 16-5, the light-passing hole 16-5 outputs the target light signal of the corresponding band. That is to say, driven by the push-pull/translation motor, the target light signal output by the filter module 1600 can be switched among the red, green, and blue wavebands.
  • The filters in the filter module 1400 or the filter module 1600 may be filters of other wavelength bands; for example, they may be a red narrow-band filter, a yellow narrow-band filter, and a blue narrow-band filter; or a red narrow-band filter, a blue narrow-band filter, and a full-band filter; or a red narrow-band filter, a green narrow-band filter, a blue narrow-band filter, and a full-band filter.
  • A filter 14-6, or even more filters, can be added to the filter module 1400.
  • an infrared filter can be arranged on the high-speed motorized wheel 14-4, and the infrared filter only allows infrared light to pass through.
  • the infrared filter 16-6 can be set on the high-speed motorized wheel 16-4, and the infrared filter only allows infrared light to pass through.
  • Alternatively, an infrared cut-off filter can cover the light-passing hole 14-5, or the light-passing hole 16-5 can be covered with an infrared cut-off filter; this infrared cut-off filter does not rotate with the high-speed motorized wheel and filters out the infrared light in the target light signal, thereby improving image quality. Alternatively, to filter out the infrared light completely, the surface of the red filter can be covered with an infrared cut-off filter.
  • FIG. 18 is a schematic structural diagram of a filter module 1800 according to another embodiment of the application.
  • the filter module 1800 may be a filter module in the camera module shown in any one of FIGS. 1 to 13.
  • the filter module 1800 includes a gradual interference type thin film filter 18-1 and a high-speed electric rotating wheel 18-2, and the gradual interference type thin film filter 18-1 is fixed on the high-speed electric rotating wheel 18-2.
  • the gradual interference thin film filter 18-1 can continuously filter in the visible light range.
  • the high-speed motorized runner 18-2 does not transmit light, only the gradual interference thin film filter 18-1 transmits light, and the light-passing hole of the imaging system is shown in 18-3.
  • the gradient interference type thin film filter 18-1 shows a gradient color along the arrow direction, and the color gradually changes from light to dark, so it can continuously filter light in the visible light range.
  • the graded interference thin film filter 18-1 can continuously filter light with a wavelength in the range of 380 nanometers to 780 nanometers.
  • the wavelength of the light that the filter 18-1 can transmit continuously changes in the direction of the arrow.
  • That is, the wavelength of the target optical signal output by the filter module 1800 can be varied continuously.
  • the high-speed electric rotating wheel 18-2 can drive the filter 18-1 to rotate only in the counterclockwise direction or only in the clockwise direction, and can also drive the filter to rotate in the counterclockwise and clockwise directions.
  • the high-speed electric rotating wheel 18-2 can drive the filter 18-1 to continuously rotate in a counterclockwise direction, thereby realizing continuous filtering.
  • the high-speed electric rotating wheel 18-2 can drive the filter 18-1 to continuously rotate in a clockwise direction, thereby realizing continuous filtering.
  • Alternatively, after the high-speed electric rotating wheel 18-2 drives the filter 18-1 to rotate counterclockwise, it drives the filter 18-1 to rotate clockwise, and after rotating clockwise, it drives the filter to rotate counterclockwise again, thereby achieving continuous filtering.
  • The working principle of the filter module 1800 is similar to that of the filter module 1400 and will not be repeated here. The difference is that the graded filter transmits light at all times, without the gaps that exist between two adjacent filters in the 1400 structure.
  • a separate infrared cut-off filter can be installed on the light-passing hole 18-3, and the infrared cut-off filter does not rotate with the wheel .
  • the gradual interference thin film filter 18-1 is continuously adjustable from visible light to infrared light.
  • FIG. 19 is a schematic structural diagram of a filter module 1900 according to another embodiment of the application.
  • the filter module 1900 may be a filter module in the camera module shown in any one of FIGS. 1 to 13.
  • The filter module 1900 includes a gradual interference type thin film filter 19-1 and a motor 19-2, and the gradual interference type thin film filter 19-1 is fixed on the motor 19-2.
  • the gradual interference thin film filter 19-1 continuously filters in the visible light range along the long side.
  • the motor 19-2 does not transmit light, and only the gradual interference thin film filter 19-1 transmits light.
  • the light-through hole of the imaging system is shown in 19-3.
  • the gradient interference thin film filter 19-1 has a gradient color along the arrow direction, and the color gradually changes from light to dark, so it can continuously filter light in the visible light range.
  • the graded interference thin film filter 19-1 can continuously filter light with a wavelength in the range of 380 nanometers to 780 nanometers.
  • That is, the wavelength of the target optical signal output by the filter module 1900 can be varied continuously.
  • the motor 19-2 can drive the filter 19-1 to move in the direction of the arrow and in the opposite direction of the arrow.
  • the motor 19-2 drives the filter 19-1 to move in the direction of the arrow, so that the different color parts of the filter 19-1 slide through the light-passing hole 19-3 in order from left to right; then the motor 19 -2 drives the filter 19-1 to move in the opposite direction of the arrow, so that the different color parts of the filter 19-1 slide through the light-passing hole 19-3 in the order from right to left.
  • the working principle of the filter module 1900 is similar to that of the filter module 1400 and is not repeated here. The difference is that the graded filter can filter light continuously, without the gap between two adjacent filters in the 1400 structure.
  • If the graded interference thin-film filter 19-1 cannot completely filter out infrared light, a separate infrared cut-off filter can be installed over the light-passing hole 19-5, and this infrared cut-off filter does not move with the motor.
  • the gradual interference thin film filter 19-1 is continuously adjustable from visible light to infrared light.
  • FIG. 20 is a schematic structural diagram of a filter module 2000 according to another embodiment of the application.
  • the filter module 2000 may be a filter module in the camera module shown in any one of FIGS. 1 to 13.
  • the filter module 2000 includes a mirror 20-1, a mirror 20-2, a high-speed moving motor 20-3, and a fixed structure 20-4.
  • the reflecting mirror 20-1 and the reflecting mirror 20-2 are arranged in sequence along the optical axis, where the optical axis can be understood as the center line of the light beam passing perpendicularly through the reflecting mirror 20-1 and the reflecting mirror 20-2.
  • the reflector 20-2 is fixed at a fixed position in the filter module 2000 by the fixing structure 20-4, and the reflector 20-1 is driven by the high-speed moving motor 20-3 to move away from or toward the reflector 20-2 along the optical axis.
  • the reflecting mirror 20-1 can reflect the light incident on the reflecting mirror 20-1 to the reflecting mirror 20-2, and the reflecting mirror 20-2 can also reflect the light incident on the reflecting mirror 20-2 to the reflecting mirror 20-1.
  • the opposite inner surfaces of the mirror 20-1 and the mirror 20-2 have high reflectivity.
  • Based on the Fabry-Perot interference principle, when the two reflecting surfaces of the mirror 20-1 and the mirror 20-2 are strictly parallel and monochromatic light from any point of the light source is incident on them at an angle θ, the transmitted light is the superposition of many parallel beams, and the optical path difference of any pair of adjacent beams is 2*n*l*cosθ. The transmittance of the mirror 20-1 and the mirror 20-2 is determined by this optical path difference, where n is the refractive index of the medium between the mirror 20-1 and the mirror 20-2, l is the distance between them, and θ is the angle of incidence.
  • When the optical path difference is an integer multiple of the wavelength of the target optical signal, the mirror 20-1 and the mirror 20-2 have the largest transmittance for the target optical signal. At this time, the filter module 2000 can be considered to pass only the target optical signal and to reflect light of other wavelengths.
  • Therefore, the high-speed moving motor 20-3 can move the mirror 20-1 to change the distance between the mirror 20-1 and the mirror 20-2, thereby adjusting the optical path difference of the transmitted light; this in turn adjusts the transmittance of light passing through the mirror 20-1 and the mirror 20-2, and finally makes the filter module 2000 output a target light signal of the desired wavelength.
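  • As an illustration only, the following sketch computes the mirror separation that makes the optical path difference 2*n*l*cosθ an integer multiple of a target wavelength, which is the maximum-transmittance condition described above; the function name and default values are assumptions, not part of the patent.
```python
# Hypothetical sketch: choosing the gap l between mirror 20-1 and mirror 20-2 so
# that the optical path difference 2*n*l*cos(theta) equals an integer multiple
# of the target wavelength (the maximum-transmittance condition).
import math

def mirror_gap_nm(target_wavelength_nm: float,
                  order: int = 1,
                  refractive_index: float = 1.0,
                  incidence_angle_deg: float = 0.0) -> float:
    theta = math.radians(incidence_angle_deg)
    return order * target_wavelength_nm / (2.0 * refractive_index * math.cos(theta))

# Example: gap for a 550 nm (green) target signal at normal incidence in air.
print(mirror_gap_nm(550.0))  # 275.0 nm
```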
  • In other examples, the mirror 20-2 may also be arranged on a high-speed moving motor and move with it. In these examples, it is only necessary to ensure that when the mirror 20-1 and the mirror 20-2 move with the high-speed moving motor, the distance between them can be changed so that the optical path difference can reach an integer multiple of the wavelength of the target optical signal.
  • the filter module 2000 may be continuously adjustable in the entire visible light range, or may be continuously adjustable in the visible light range and the near-infrared light range.
  • the reflectors 20-1 and 20-2 can usually be silver-plated (Ag) reflectors.
  • the high-speed moving motor 20-3 may be a high-speed moving motor based on MEMS or piezoelectric (Piezo) actuation.
  • the high-speed moving motor 20-3 can receive a control signal output by the filter control unit, and the control signal is used to control the distance moved by the high-speed moving motor 20-3.
  • an infrared cut filter 20-5 can be added to filter infrared light.
  • the infrared cut filter 20-5 completely covers the clear aperture of the reflector 20-2, so as to achieve the purpose of allowing only the visible light waveband to pass.
  • FIG. 22 is a schematic structural diagram of a filter module 2200 according to another embodiment of the application.
  • the filter module 2200 may be a filter module in the camera module shown in any one of FIGS. 1 to 13.
  • the filter module 2200 includes a linear polarizer 22-1, a phase delay unit 22-2, a liquid crystal cell 22-3, and an analyzer 22-4.
  • the linear polarizer 22-1, the phase retardation unit 22-2, the liquid crystal cell 22-3, and the analyzer 22-4 are arranged in sequence along the optical axis.
  • Here the optical axis can be understood as the center line of the light beam passing perpendicularly through the linear polarizer 22-1.
  • the optical signal is converted into linearly polarized light by the linear polarizer 22-1.
  • the linearly polarized light undergoes a birefringence effect under the action of the phase retarder 22-2 and the liquid crystal cell 22-3, which produces a phase difference and makes the polarization direction of the linearly polarized light rotate, so that only the optical signal whose polarization direction is consistent with that of the analyzer 22-4 can pass.
  • the same liquid crystal rotation direction introduces different phase differences for optical signals of different wavelength bands, so a variable voltage can be output to the liquid crystal cell, and the liquid crystal cell can be controlled by the variable voltage to filter out the target optical signal of the corresponding wavelength band.
  • the liquid crystal cell 22-3 can receive the control signal output by the filter control unit, and the control signal is used to control the input voltage of the liquid crystal cell 22-3.
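  • The following is a purely illustrative sketch of how the filter control unit could map a requested band to a drive voltage for the liquid crystal cell; the calibration values and names are invented, since a real module would be calibrated per device.
```python
# Hypothetical sketch: selecting a drive voltage for the liquid crystal cell so
# that the module passes the requested band. The calibration table is invented
# for illustration only.

CALIBRATION_VOLTS = {  # band name -> drive voltage in volts (fictitious values)
    "red": 2.1,
    "green": 3.4,
    "blue": 4.8,
}

def drive_voltage_for_band(band: str) -> float:
    try:
        return CALIBRATION_VOLTS[band]
    except KeyError:
        raise ValueError(f"no calibration entry for band '{band}'") from None

print(drive_voltage_for_band("green"))  # 3.4
```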
  • an infrared cut filter can be added to the filter module 2200 to filter out infrared light, thereby improving the imaging quality.
  • In the tunable filter module 2200, there is no special requirement for the position of the infrared cut filter, as long as it covers the entire light-passing hole area.
  • the infrared cut filter 22-5 is arranged behind the analyzer along the optical axis direction.
  • As shown in FIG. 24, the structure shown in FIG. 22 (a structure inside a dashed frame in FIG. 24) is taken as an integral unit, and a plurality of such units are arranged in sequence along the optical axis to obtain a new filter module 2400; this cascaded filter structure can output optical signals in finer wavebands.
  • an infrared cut filter can be added to the filter module 2400 to filter out infrared light, thereby improving the image quality.
  • the infrared cut filter 24-5 is arranged after the last analyzer along the optical axis.
  • FIG. 25 is a schematic structural diagram of a filter module 2500 according to another embodiment of the application.
  • the filter module 2500 may be a filter module in the camera module shown in any one of FIGS. 1 to 13.
  • the filter module 2500 includes a linear polarizer 25-1, a phase delay unit 25-2, an acousto-optic tunable filter (AOTF) 25-3, and an analyzer 25-4.
  • the linear polarizer 25-1, the phase delay unit 25-2, the acousto-optic tunable filter (AOTF) 25-3, and the analyzer 25-4 are arranged in order along the optical axis.
  • Here the optical axis can be understood as the center line of the light beam passing perpendicularly through the linear polarizer 25-1.
  • the phase delay unit is also called a phase retarder.
  • the incident light signal is converted into linearly polarized light by the linear polarizer 25-1.
  • the linearly polarized light undergoes a birefringence effect under the action of the phase retarder 25-2 and AOTF 25-3, resulting in a phase difference, which makes the polarization direction of the linearly polarized light rotate.
  • In this way, only the optical signal whose polarization direction is consistent with that of the analyzer 25-4 can pass.
  • the AOTF is driven by radio frequency, and the same radio frequency (RF) signal introduces different phase differences for different wavelengths. Therefore, the AOTF can be controlled by changing the frequency of the radio frequency signal to filter out light of different bands.
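  • As an illustration only, the sketch below uses the commonly quoted first-order acousto-optic tuning relation f ≈ α*v*Δn/λ to pick an RF drive frequency for a target wavelength; the constants are rough illustrative values and are not taken from the patent.
```python
# Hypothetical sketch: choosing the RF drive frequency for the AOTF from a target
# wavelength with a first-order tuning relation. All constants are illustrative.

def aotf_rf_frequency_hz(wavelength_m: float,
                         acoustic_velocity_m_s: float = 650.0,  # rough TeO2 shear-wave value
                         birefringence: float = 0.15,           # illustrative delta-n
                         geometric_factor: float = 1.0) -> float:
    return geometric_factor * acoustic_velocity_m_s * birefringence / wavelength_m

# Example: RF frequency for a 550 nm target signal, on the order of 10^8 Hz.
print(aotf_rf_frequency_hz(550e-9) / 1e6, "MHz")  # roughly 177 MHz
```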
  • the phase retarder can be a liquid crystal phase retarder, and the material of the AOTF can be tellurium dioxide (TeO2).
  • an infrared cut filter can be added to the filter module.
  • In the tunable filter module 2600, there is no special requirement for the position of the infrared cut-off filter, as long as it covers the entire light-passing hole area.
  • the infrared cut filter 25-5 is arranged after the analyzer along the optical axis direction.
  • In the embodiments of this application, because the filter module can output target signal light of different wavebands in a time-division manner, the sensor module can obtain a complete single-color image without interpolation, thereby avoiding the image-quality degradation caused by interpolation.
  • FIG. 27 is a schematic flowchart of an imaging method according to an embodiment of the application.
  • the imaging method can be executed by the imaging system shown in any one of FIGS. 1 to 13.
  • the imaging method shown in FIG. 27 includes S2710 to S2780. It is understandable that the imaging method proposed in this application may include more or fewer steps. For example, when the imaging system that executes the imaging method does not include the motor module and the motor control unit, the imaging method does not include S2750 and S2740.
  • S2710 start the imaging application.
  • S2720 the filter control unit outputs a control signal to the filter module to control the filter module to output a target light signal.
  • the filter control unit controls the filter module to perform spectral screening according to the set spectrum mode.
  • Spectral modes include but are not limited to the following: RGB mode, in which the required bands are red, green, and blue; RYB mode, in which the required bands are red, yellow, and blue; RWB mode, in which the required bands are red, blue, and a full-band-pass band; and RGBW mode, in which the required bands are red, green, blue, and a full-band-pass band.
  • In some other special spectral modes, the required bands may include the near-infrared band in addition to the visible bands, for example for later image processing so that better image quality can be obtained through computation in the ISP. The spectral range of each band can be fine-tuned according to the actual application.
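  • The band lists behind these spectral modes could be represented as in the sketch below; the band edges in nanometers are rough values chosen for illustration (the patent only states that each range can be fine-tuned), and "W" stands for a full-band-pass channel.
```python
# Hypothetical sketch of the band lists for the spectral modes named above.
# Band edges (nm) are illustrative; "W" denotes a full-band-pass channel.

SPECTRAL_MODES = {
    "RGB":  [("R", 600, 700), ("G", 500, 600), ("B", 400, 500)],
    "RYB":  [("R", 600, 700), ("Y", 560, 600), ("B", 400, 500)],
    "RWB":  [("R", 600, 700), ("W", 380, 780), ("B", 400, 500)],
    "RGBW": [("R", 600, 700), ("G", 500, 600), ("B", 400, 500), ("W", 380, 780)],
}

def bands_for_mode(mode: str):
    return SPECTRAL_MODES[mode]

print([name for name, *_ in bands_for_mode("RGBW")])  # ['R', 'G', 'B', 'W']
```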
  • the sensor module receives light signals, converts them into electrical signals (i.e., image raw data), and transmits them to the ISP.
  • In some designs, the light-sensing integration time and period of the sensor module are matched with the filtering time and period of the filters in the filter module; specifically, the sensor module continuously integrates the target light signal within the set single band and does not detect the optical signal during band switching in each spectral mode.
  • the data obtained by the sensor module continuously integrating the target light signal in the set single band can be called a narrow band picture.
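  • Purely as an illustration of the timing just described, the sketch below shows a time-division capture loop that switches the filter first and integrates only while a band is stable; filter_module and sensor are hypothetical driver objects, not interfaces defined by the patent.
```python
# Hypothetical sketch of the time-division capture loop: one narrow-band picture
# per band, with no integration during band switching.

def capture_narrow_band_pictures(filter_module, sensor, bands, integration_ms):
    pictures = {}
    for band in bands:
        filter_module.select_band(band)     # control signal from the filter control unit
        filter_module.wait_until_settled()  # skip the band-switching interval
        pictures[band] = sensor.integrate(integration_ms)  # continuous integration
    return pictures
```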
  • S2730 the ISP receives the raw image data from the sensor module and processes it. For example, the sensor module transmits a group of narrow-band pictures to the ISP, and the ISP performs preprocessing, white balance, color restoration, gamma correction, and 3D LUT correction on the narrow-band pictures in the entire color image processing pipeline, finally completing the color-related tuning and obtaining a color image.
  • S2740 the motor control unit outputs control information to the motor module according to the feedback information of the gyroscope, so that the motor module adjusts the distance between the lens module and the sensor module under the control of that information. If the imaging system has no motor module and motor control unit, this step may be skipped.
  • S2750 the motor control unit outputs control information to the motor module according to the focus information output by the ISP, so that the motor module adjusts the distance between the lens module and the sensor module under the control of that information. If the imaging system has no motor module and motor control unit, this step may be skipped.
  • S2760 the display module displays the color image processed by the ISP.
  • S2770 Determine whether to take a photo or video. For example, when the user clicks the camera or video button on the user interface, it is determined to take a photo or video.
  • S2780 If it is determined to take a photo or record a video, the storage module saves the color image processed by the ISP; otherwise, S2720 is performed again.
  • In this embodiment of the application, because each narrow-band picture the ISP receives from the sensor module is a single-color image, the entire ISP color pipeline can obtain complete single-color images without interpolation (i.e., demosaicing), which effectively improves image resolution, avoids moiré fringes, and ultimately improves image quality.
  • the present application also provides an imaging method, and a schematic flowchart of the imaging method is shown in FIG. 31.
  • the imaging method shown in FIG. 31 may include S3101 to S3107.
  • the imaging method can be executed by the ISP.
  • ISP processes narrow-band images in the color image processing path to obtain color images.
  • the n narrow-band images may be images captured by a camera module including the filter module shown in any one of FIGS. 14 to 26.
  • Narrow-band image 1 to narrow-band image n correspond one-to-one to the n bands included in the spectral mode of the imaging system to which the ISP belongs.
  • Narrow-band image i refers to the image data obtained by the sensor module continuously integrating the target light signal of the i-th band after the filter module has filtered out that i-th band from the n bands.
  • For example, when n equals 3, narrow-band image 1 corresponds to the red band, narrow-band image 2 to the green band, and narrow-band image 3 to the blue band.
  • S3101 Perform preprocessing on the n narrow-band images. The preprocessing here refers to non-color-related processing of the image data collected by the sensor module, for example noise suppression, sharpening, or other related processing.
  • For example, some tunable filters are intended to provide three channels (n = 3) but actually capture more, finer channels (n > 3) during acquisition; in the preprocessing stage, the image information corresponding to adjacent channels can be added together to obtain the final channel mode.
  • Adding the image information of adjacent channels also cancels random noise and improves the noise resistance of the resulting image; in other words, this preprocessing improves the light sensitivity of the multi-channel filter.
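  • A minimal sketch of this binning step, assuming the finer channels arrive as a NumPy array and adjacent channels are summed in fixed groups (the group size is an assumption for illustration):
```python
# Hypothetical sketch: reducing finer capture channels to the final channel mode
# by summing adjacent channels, which also averages out random noise.
import numpy as np

def bin_adjacent_channels(fine_channels: np.ndarray, group_size: int = 2) -> np.ndarray:
    """fine_channels: array of shape (num_fine_channels, height, width)."""
    c, h, w = fine_channels.shape
    assert c % group_size == 0, "channel count must be divisible by the group size"
    return fine_channels.reshape(c // group_size, group_size, h, w).sum(axis=1)

fine = np.random.rand(6, 4, 4)           # 6 finely filtered channels
coarse = bin_adjacent_channels(fine, 2)  # -> 3 output channels
print(coarse.shape)                      # (3, 4, 4)
```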
  • S3102 Perform white balance on the preprocessed image.
  • White balance can restore the color of the scene being shot, so that the scene shot under different light source conditions is similar to the color of the picture viewed by the human eye.
  • If image data of three channels is obtained after preprocessing, white balance processing usually requires three data matrices that correspond one-to-one to the image data of the three channels; these three data matrices can be obtained by referring to the prior art.
  • If image data of more than three channels is obtained after preprocessing, white balance processing usually requires a corresponding number of data matrices that correspond one-to-one to the image data of those channels; they can be obtained in the same way as the three data matrices described above, which is not repeated here.
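  • A minimal sketch of this per-channel white balance, assuming each channel's data matrix is simply multiplied element-wise by its white-balance matrix (the gain values are invented, and gain estimation itself is out of scope here):
```python
# Hypothetical sketch: white balance with one data matrix per channel, applied
# element-wise to the corresponding channel image.
import numpy as np

def white_balance(channels: np.ndarray, wb_matrices: np.ndarray) -> np.ndarray:
    """channels and wb_matrices both have shape (n, height, width)."""
    assert channels.shape == wb_matrices.shape
    return channels * wb_matrices

channels = np.random.rand(3, 4, 4)
gains = np.ones((3, 4, 4)) * np.array([1.2, 1.0, 1.5])[:, None, None]  # invented gains
print(white_balance(channels, gains).shape)  # (3, 4, 4)
```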
  • S3103 Use the color correction matrix to perform color reproduction on the white balanced image.
  • Color correction can ensure that the color of the image can be more accurately reproduced by the human eye at the shooting scene.
  • The dimensions of the color correction matrix (CCM) used for color restoration are determined by the number of image channels obtained after preprocessing and the number of primary colors of the display system.
  • For example, when preprocessing yields image data of n channels and the display system uses m primary colors, the corresponding CCM is an m*n matrix, where m and n are positive integers.
  • For example, when preprocessing yields image data of three channels and the display system is a three-primary-color system, the corresponding CCM is a 3*3 matrix.
  • the CCM acquisition method in the embodiment of the present application can refer to the 3*3 CCM acquisition method in the prior art, which will not be repeated here.
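  • A minimal sketch of applying an m*n CCM to n preprocessed channels to obtain m display primaries; the matrix values here are random placeholders, since obtaining a real CCM follows the prior art as noted above.
```python
# Hypothetical sketch: color correction as one m*n matrix multiplication per pixel.
import numpy as np

def apply_ccm(channels: np.ndarray, ccm: np.ndarray) -> np.ndarray:
    """channels: (n, height, width); ccm: (m, n); returns (m, height, width)."""
    n, h, w = channels.shape
    m = ccm.shape[0]
    assert ccm.shape[1] == n
    return (ccm @ channels.reshape(n, h * w)).reshape(m, h, w)

channels = np.random.rand(4, 4, 4)     # e.g. n = 4 filter channels
ccm = np.random.rand(3, 4)             # m = 3 display primaries
print(apply_ccm(channels, ccm).shape)  # (3, 4, 4)
```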
  • S3104 Perform gamma correction on the image obtained by color restoration.
  • In an embodiment of this application, if the i-th of the n preprocessed channel images is denoted by the matrix λ_i and the matrix used to white-balance it is denoted awb_i, then white balance, color correction, and gamma correction can be jointly expressed as λ'_j = (Σ_{i=1}^{n} CCM_{j,i} * (awb_i ∘ λ_i))^γ for j = 1, …, m, where λ_i represents the image data matrix of the i-th of the n channels obtained by preprocessing (i being a positive integer less than or equal to n), CCM is the m*n color correction matrix, m is the number of primary colors of the display system, ∘ denotes element-wise multiplication, ()^γ denotes gamma processing, and λ'_j is the j-th of the m primary-color image data matrices obtained after white balance, color correction, and gamma correction (j being a positive integer less than or equal to m).
  • S3105 Perform 3D Lut correction on the image obtained by gamma correction.
  • S3106 Perform post processing on the image obtained by 3D Lut correction.
  • the post-processing is similar to the pre-processing and will not be repeated here.
  • S3107 Display the post-processed image, and/or store the post-processed image. Among them, before storing the image, the image can be compressed first to save storage space.
  • It can be understood that the operations shown in FIG. 31 are only examples; the imaging method proposed in this application may include more or fewer operations, or similar operations may be performed instead.
  • For example, as shown in FIG. 32, when the ISP performs color image processing, it can process not only narrow-band image 1 to narrow-band image n but also infrared (IR) band images.
  • infrared band images can be used for grayscale imaging under low illumination, and for more accurate color reproduction of visible light images.
  • This application also provides an imaging method, which may include S2720 and S2730.
  • the imaging method may further include S2740; optionally, it may also include S2750; and optionally, it may also include S2760 and/or S2780.
  • the present application also provides a camera module in the imaging system as shown in any one of FIGS. 1 to 13.
  • the present application also provides a multi-module camera, which includes a plurality of camera modules, at least one of which is a camera module in the imaging system shown in any one of FIGS. 1 to 13.
  • At least one of the camera modules may be a traditional camera module, for example a camera module based on a Bayer-array sensor or a Foveon sensor. In this way, the image quality can be improved, and binocular ranging can be jointly realized.
  • FIG. 28 shows a schematic flowchart of an imaging method used when the multi-module camera includes both a traditional camera module and the camera module of the imaging system shown in any one of FIGS. 1 to 13.
  • the camera module newly proposed in this application is called an adjustable camera module.
  • S2820 start the traditional camera module to preview and focus. This step can refer to the prior art.
  • S2840 use the adjustable camera module to take pictures or videos.
  • For a specific implementation of this step, refer to S2720 to S2780 in the imaging method shown in FIG. 27.
  • the present application also provides an imaging device, which includes the processing module of the imaging system shown in any one of FIGS. 1 to 13, and may further include the storage module and/or the display module of that imaging system.
  • the present application also provides a terminal device, which includes the imaging system shown in any one of FIG. 1 to FIG. 13.
  • This application also provides a mode setting method.
  • The method includes: displaying a mode selection interface that includes multiple mode options; determining the mode selected by the user according to the user's input on the mode selection interface (for example, the mode corresponding to the option the user taps is the mode selected by the user); and setting the spectral mode according to the mode selected by the user, where different spectral modes correspond to optical signals of different bands.
  • the present application also provides a method for adjusting the spectral mode of a camera module.
  • The method includes: receiving instruction information for instructing to set the spectral mode of the camera module; and, in response to the instruction information, outputting mode information of each of a plurality of spectral modes, where the mode information of each spectral mode includes the name information of that spectral mode.
  • In different spectral modes among the plurality of spectral modes, the filter channels of the camera module are different.
  • the multiple spectral modes include but are not limited to: normal mode, high-precision mode, dark-light mode, printing mode, and the like. By default, the normal mode is selected.
  • In the normal mode, the tunable filter adopts the same filtering strategy as a traditional Bayer filter, using red, green, and blue three-channel filtering; in the high-precision mode, the tunable filter uses at least four-channel filtering, with a spectral fineness higher than that of a traditional Bayer filter; in the dark-light mode, the filter channels of the tunable filter generally include at least one channel wider than a single RGB band, for example RYB or RWB.
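  • The mapping from these user-facing modes to filter channel sets could look like the sketch below; the concrete channel lists are assumptions for illustration, since the description above only fixes their general properties (three channels for normal, at least four for high precision, at least one wide channel for dark light), and the printing and expert modes are omitted because their channels are not specified here.
```python
# Hypothetical sketch: mapping UI-level spectral modes to filter channel sets.
# Channel lists are illustrative only.

MODE_TO_CHANNELS = {
    "normal":         ["R", "G", "B"],          # same strategy as a Bayer filter
    "high_precision": ["R", "G1", "G2", "B"],   # at least four, finer channels
    "dark_light":     ["R", "Y", "B"],          # Y is wider than a single RGB band
}

def channels_for_mode(mode: str):
    return MODE_TO_CHANNELS.get(mode, MODE_TO_CHANNELS["normal"])  # default: normal

print(channels_for_mode("dark_light"))  # ['R', 'Y', 'B']
```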
  • the multiple spectrum modes may also include an expert mode.
  • the expert mode is mainly aimed at the needs of more professional people. In the expert mode, the user can customize the number of filtering channels and the band of each channel.
  • the method may further include: receiving second information, the second information being used to indicate that the spectral mode of the camera module is set to the target spectral mode of the plurality of spectral modes; in response to the second information, The spectrum mode of the camera module is set to the target spectrum mode.
  • the target spectrum mode is the spectrum mode selected by the user from the multiple spectrum modes.
  • the mode information of these multiple spectral modes can be output in the form of an interface.
  • An example is shown in Figure 29.
  • the mode information of the multiple spectral modes can also be output in other ways, for example, by means of voice playback.
  • the mode information of each spectral mode may also include the band or filter channel information corresponding to the spectral mode.
  • FIG. 29 is a schematic diagram of a mode setting interface according to an embodiment of the application.
  • the interface includes the following spectral mode options: normal mode, high-precision mode, dark mode, printing mode, and expert mode.
  • the default setting is normal mode.
  • The icon after each mode name is selectable; the icon selected by the user indicates that the user has set the spectral mode to the mode corresponding to that icon.
  • The expert mode is controlled by a two-position switch and is normally in the off state.
  • FIG. 34 is a schematic structural diagram of an imaging device according to an embodiment of the present application.
  • the imaging device 3400 includes an acquisition module 3410 and a processing module 3420.
  • the imaging device 3400 can be used to implement any of the above-mentioned imaging methods.
  • the acquiring module 3410 is configured to acquire multiple sets of image raw data, where the multiple sets of image raw data are the raw image data obtained by photoelectric conversion of the target light signals of different wavelength bands collected by the camera module at different times.
  • the processing module 3420 is configured to perform color adjustment processing on the multiple sets of image raw data to obtain a color image, and different sets of image raw data in the multiple sets of image raw data correspond to target light signals in different wavelength bands.
  • the processing module 3420 is specifically configured to: perform white balance, color restoration, gamma correction, and three-dimensional color search processing on the multiple sets of image raw data.
  • FIG. 35 is a schematic structural diagram of an imaging device according to another embodiment of the present application.
  • the device 3500 includes a memory 3510 and a processor 3520.
  • the memory 3510 is used to store programs.
  • the processor 3520 is configured to execute a program stored in the memory 3510, and when the program stored in the memory 3510 is executed, the processor 3520 is configured to execute any one of the foregoing imaging methods.
  • FIG. 36 is a schematic structural diagram of an apparatus for adjusting the spectral mode of a camera module according to an embodiment of the present application.
  • the device 3600 includes an input module 3610 and an output module 3620, and optionally, may also include a processing module 3630.
  • the device 3600 can be used to implement the above-mentioned method of adjusting the spectral mode of the camera module.
  • the input module 3610 is configured to: receive instruction information for instructing to set the spectrum mode of the camera module;
  • the output module 3620 is configured to: in response to the instruction information, output the mode of each of the multiple spectrum modes Information, the mode information of each spectral mode includes name information of each spectral mode.
  • the input module 3610 may be further configured to: receive second information, the second information being used to instruct to set the spectral mode of the camera module to the target spectral mode among the multiple spectral modes; processing module 3630 is configured to: in response to the second information, set the spectrum mode of the camera module to the target spectrum mode, where the target spectrum mode is the spectrum mode selected by the user from the multiple spectrum modes.
  • FIG. 37 is a schematic structural diagram of an apparatus for adjusting the spectral mode of a camera module according to an embodiment of the present application.
  • the device 3700 includes a memory 3710 and a processor 3720.
  • the memory 3710 is used to store programs.
  • the processor 3720 is configured to execute the program stored in the memory 3710, and when the program stored in the memory 3710 is executed, the processor 3720 is configured to execute the aforementioned method of adjusting the spectral mode of the camera module.
  • the disclosed system, device, and method can be implemented in other ways.
  • the device embodiments described above are merely illustrative; for example, the division into units is only a division by logical function, and there may be other divisions in an actual implementation, for example multiple units or components may be combined or integrated into another system, or some features may be ignored or not implemented.
  • the displayed or discussed mutual coupling or direct coupling or communication connection may be indirect coupling or communication connection through some interfaces, devices or units, and may be in electrical, mechanical or other forms.
  • the units described as separate components may or may not be physically separated, and the components displayed as units may or may not be physical units, that is, they may be located in one place, or they may be distributed on multiple network units. Some or all of the units may be selected according to actual needs to achieve the objectives of the solutions of the embodiments.
  • the functional units in the various embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit.
  • the processor in the embodiments of the present application may be a central processing unit (CPU), or may be another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like.
  • the general-purpose processor may be a microprocessor or the processor may also be any conventional processor or the like.
  • If the functions are implemented in the form of a software functional unit and sold or used as an independent product, they can be stored in a computer-readable storage medium.
  • Based on this understanding, the technical solution of this application essentially, or the part contributing to the prior art, or a part of the technical solution, can be embodied in the form of a software product; the computer software product is stored in a storage medium and includes several instructions for enabling a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or some of the steps of the methods described in the embodiments of this application.
  • The aforementioned storage media include: a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disc, or any other medium that can store program code.


Abstract

This application provides a camera module, an imaging method, and an imaging apparatus. The camera module of this application includes a filter module and a sensor module. The filter module is configured to output, at different times and to the same pixel on the sensor module, target optical signals of different wavebands from the optical signal incident on the filter module. The sensor module is configured to convert the target optical signal incident on the sensor module into an electrical signal and output the electrical signal. The camera module, imaging method, and imaging apparatus provided in this application can improve the imaging quality of images.

Description

摄像头模组、成像方法和成像装置
本申请要求于2020年1月8日提交中国专利局、申请号为202010018843.X、申请名称为“摄像头模组、成像方法和成像装置”的中国专利申请的优先权,其全部内容通过引用结合在本申请中。
技术领域
本申请涉及图像处理领域,并且更具体地,涉及摄像头模组、成像方法和成像装置。
背景技术
图像传感器是终端拍照系统最主要的组成元件之一,对成像质量起决定性作用。目前常用的彩色成像传感器为拜耳型传感器。拜耳型传感器为了得到全彩影像,需要采用去马赛克算法来插值补充,这会导致图像分辨率下降,同时,在插值过程中经常引入摩尔条纹、彩色噪声、拉链状噪声等问题,从而降低成像质量。
随着智能终端技术的发展和普及,人们对智能终端的拍照体验要求越来越高。其中,用户希望获得更高的拍照质量。
因此,如何提高图像的成像质量,成为了一个亟待解决的问题。
发明内容
本申请提供摄像头模组、成像方法和成像装置,能够提高图像的成像质量。
第一方面,本申请提供一种摄像头模组,该摄像头模组包括:滤光片模块和传感器模块;所述滤光片模块用于:在不同的时间,向所述传感器模块上的同一像素点,输出入射到所述滤光片模块的光信号中不同波段的目标光信号;所述传感器模块用于:将入射到所述传感器模块的所述目标光信号转换为电信号并输出。
本申请提供的摄像头模组中,滤光片模块分时过滤得到不同波段的目标光信号,传感器分时对这些不同波段进行光电转换,与现有技术中同时采集多个波段的光信号并采用去马赛克算法来插值补充以获取全彩影像相比,有助于提高成像分辨率和提高成像的色彩还原准确度,还可以有助于避免摩尔条纹、彩色噪声和拉链状噪声等问题。
在一些可能的实现方式中,所述滤光片模块包括移动模块和多个滤波片;所述滤波片用于输出入射到所述滤波片上的光信号中的所述目标光信号,所述多个滤波片中不同的滤波片输出的所述目标光信号的波段不同;所述移动模块用于:在不同的时间,将所述多个滤波片中不同的滤波片移动到能够接收所述光信号的目标位置。
在另一些可能的实现方式中,所述滤光片模块包括移动模块和线性渐变干涉式的滤光片;所述移动模块用于:在不同的时间,将所述线性渐变干涉式的滤光片不同部位移动到能够接收所述光信号的目标位置,所述滤光片的不同部分输出的所述目标光信号的波段不通过。
可选地,所述滤光片从可见光到红外光连续可调。
在另一些可能的实现方式中,所述滤光片模块包括两个两面反射镜和移动模块;所述移动模块用于:调节所述两个两面反射镜之间的距离,使得在不同的时间所述两个两面反射镜之间的距离不同;所述两个两面反射镜用于:将入射到所述一个两面反射镜上的光信号中的目标光信号过滤出并从所述两个两面反射镜中的另一个两面反射镜输出,所述两面反射镜之间的距离不同时,所述目标光信号的波段不同。
在另一些可能的实现方式中,所述滤光片模块包括液晶可调滤光片。可选地,所述滤光片模块为所述摄像头模组的多个滤光片模块中的一个。
在另一些可能的实现方式中,所述滤光片模块包括声光可调滤光片。
在第一方面或上述任意一种可能的实现方式中,可选地所述滤光片模块还包括红外截止滤光片,所述红外截止滤光片用于:在所述目标光信号入射到所述传感器模块之前,过滤掉所述目标光信号中的红外光。通过滤掉目标光信号中的红外光,传感器感光的波段更接近于人眼的感光波段,最终照片或者视频更接近与人眼视物的习惯。
在第一方面或上述任意一种可能的实现方式中,可选地,所述摄像头模组还包括:第一镜头模块。其中,所述第一镜头模块用于:将入射到所述第一镜头模块的光信号输出到所述滤光片模块。第一镜头模块可以减少大角度入射下滤光片模块产生的色偏问题,同时保证更多的光信号入射到传感器模块,从而可以进一步提高成像质量。
可选地,所述第一镜头模块包括以下一项或多项:塑料镜片,玻璃镜片,衍射光学元件DOE,超透镜等。
可选地,所述摄像头模组还包括马达模块;所述马达模块用于:控制所述第一镜头模块和/或所述传感器模块移动,以实现所述摄像头模块的对焦功能和/或防抖功能。
可选地,所述摄像头模块还包括第二镜头模块,所述第二镜头模块位于所述滤光片模块和所述传感器模块之间。其中所述第二镜头模块用于:将所述滤光片输出的目标光信号输出到所述传感器模块,并增大所述目标光信号在所述传感器模块上的覆盖范围。
第二镜头模块可以增大滤光片模块输出的目标光信号在传感器模块的入射角度,以增大该目标光信号在传感器模块上的覆盖范围,也就是说,第二镜头模块可以使得传感器模块在同一时间可以采集到更多的目标光信号,从而可以提供更多的图像原始数据,进而可以提高成像质量。
在第一方面或上述任意一种可能的实现方式中,可选地,所述摄像头模块还包括第二镜头模块,所述第二镜头模块位于所述滤光片模块和所述传感器模块之间;其中所述第二镜头模块用于:将所述滤光片输出的目标光信号输出到所述传感器模块,并增大所述目标光信号在所述传感器模块上的覆盖范围。
可选地,所述摄像头模组还包括第二马达模块;所述第二马达模块用于:控制所述第二镜头模块和/或传感器模块移动,以实现所述摄像头模块的对焦功能和/或防抖功能。
可选地,所述摄像头模组还包括:第一镜头模块;其中,所述第一镜头模块用于:将入射到所述第一镜头模块的光信号输出到所述滤光片模块上的所述目标位置。
可选地,所述第二镜头模块包括以下一项或多项:塑料镜片,玻璃镜片,衍射光学元件DOE,超透镜等。
在第一方面或上述任意一种可能的实现方式中,可选地,所述传感器模块包括全带通 传感器或宽带通传感器。
第二方面,本申请提供一种成像方法,该成像方法包括:获取多组图像原始数据,所述多组图像原始数据为摄像头模组对不同时间采集的不同波段的目标光信号进行光电转换得到的图像原始数据;对所述多组图像原始数据进行色彩调试处理,以得到彩色图像,所述多组图像原始数据中不同组的图像原始数据对应的不同波段内的目标光信号。
本申请的成像方法中,整个ISP色彩通路中没有去马赛克处理,有效的提高了分辨率,避免了摩尔条纹的产生;精细的图像原始数据和高维度的CCM矩阵,保障了色彩调试时候的多自由度,保障了色彩还原的准确性。
在一些可能的实现方式中,所述对所述摄像头模组采集的多组图像原始数据进行色彩调试处理,包括:对所述多组图像原始数据进行白平衡、色彩还原、伽马矫正和三维色彩查找处理。
第三方面,本申请提供一种成像装置,该成像装置包括用于实现第二方面所述的成像方法的各个模块。这些模块可以通过硬件或软件的方式实现。
第四方面,提供了一种成像装置,该装置包括:存储器,用于存储程序;处理器,用于执行所述存储器存储的程序,当所述存储器存储的程序被执行时,所述处理器用于执行第一方面所述的方法。
可选地,该成像装置还包括通信接口,该通信接口用于与摄像头模组或其他装置进行信息交互。
可选地,该成像装置还包括收发器,该收发器用于与摄像头模组或其他装置进行信息交互。
第五方面,本申请提供一种调整摄像头模组的光谱模式的方法,该方法包括:接收第一信息,所述第一信息用于指示设置所述摄像头模组的光谱模式;响应于所述第一信息,输出多个光谱模式中每个光谱模式的模式信息,所述每个光谱模式的模式信息包括所述每个光谱模式的名称信息。
可选地,所述多个光谱模式包括以下至少一个:普通模式、高精度模式、暗光模式、印刷模式或专家模式。
在一些可能的实现方式中,所述方法还包括:接收第二信息,所述第二信息用于指示将所述摄像头模组的光谱模式设置为所述多个光谱模式中的目标光谱模式;响应于所述第二信息,将所述摄像头模组的光谱模式设置为所述目标光谱模式。
第六方面,本申请提供一种装置,该装置包括用于实现第五方面所述的方法的各个模块。这些模块可以通过硬件或软件的方式实现。
第七方面,提供了一种装置,该装置包括:输入单元,用于接收用户输入的信息;存储器,用于存储程序;输出单元,用于向用户输出信息;处理器,用于执行所述存储器存储的程序,当所述存储器存储的程序被执行时,所述处理器用于执行第五方面所述的方法。
其中,输入单元可以是触摸屏、麦克风、鼠标、键盘、摄像头或者其他能够感应用户输入的装置;输出单元可以是显示屏、扩音器等等装置。
第八方面,本申请提供一种成像系统,该成像系统包括以下一种或多种装置:第一方面所述的摄像头模组,第三方面或第四方面所述的成像装置,第六方面或第七方面所述的装置。
第九方面,本申请提供一种计算机可读存储介质,所述计算机可读存储介质中存储用于成像装置执行的指令,所述指令用于执行第二方面所述的成像方法。
第十方面,本申请提供一种计算机可读存储介质,所述计算机可读存储介质中存储用于成像装置执行的指令,所述指令用于执行第五方面所述的方法。
第十一方面,提供一种包含指令的计算机程序产品,当该计算机程序产品在计算机上运行时,使得计算机执行上述第二方面所述的成像方法。
第十二方面,提供一种包含指令的计算机程序产品,当该计算机程序产品在计算机上运行时,使得计算机执行上述第五方面所述的成像方法。
第十三方面,提供一种芯片,所述芯片包括处理器与数据接口,所述处理器通过所述数据接口读取存储器上存储的指令,执行上述第二方面所述的方法。
第十四方面,提供一种芯片,所述芯片包括处理器与数据接口,所述处理器通过所述数据接口读取存储器上存储的指令,执行上述第五方面所述的方法。
附图说明
图1是本申请一个实施例的成像系统的示意架构图;
图2是本申请另一个实施例的成像系统的示意性结构图;
图3是本申请另一个实施例的成像系统的示意性结构图;
图4是本申请另一个实施例的成像系统的示意性结构图;
图5是本申请另一个实施例的成像系统的示意性结构图;
图6是本申请另一个实施例的成像系统的示意性结构图;
图7是本申请另一个实施例的成像系统的示意性结构图;
图8是本申请另一个实施例的成像系统的示意性结构图;
图9是本申请另一个实施例的成像系统的示意性结构图;
图10是本申请另一个实施例的成像系统的示意性结构图;
图11是本申请另一个实施例的成像系统的示意性结构图;
图12是本申请另一个实施例的成像系统的示意性结构图;
图13是本申请另一个实施例的成像系统的示意性结构图;
图14是本申请一个实施例的滤光片模块的示意性结构图;
图15是本申请另一个实施例的滤光片模块的示意性流程图;
图16是本申请另一个实施例的滤光片模块的示意性结构图;
图17是本申请另一个实施例的滤光片模块的示意性结构图;
图18是本申请另一个实施例的滤光片模块的示意性结构图;
图19是本申请另一个实施例的滤光片模块的示意性结构图;
图20是本申请另一个实施例的滤光片模块的示意性结构图;
图21是本申请另一个实施例的滤光片模块的示意性结构图;
图22是本申请另一个实施例的滤光片模块的示意性结构图;
图23是本申请另一个实施例的滤光片模块的示意性结构图;
图24是本申请另一个实施例的滤光片模块的示意性结构图;
图25是本申请另一个实施例的滤光片模块的示意性结构图;
图26是本申请另一个实施例的滤光片模块的示意性结构图;
图27是本申请一个实施例的成像方法的示意性流程图;
图28是本申请另一个实施例的成像方法的示意性流程图;
图29是本申请一个实施例的模式选择界面的示意图;
图30是本申请另一个实施例的成像系统的示意性结构图;
图31是本申请一个实施例的成像方法的示意性流程图;
图32是本申请另一个实施例的成像方法的示意性流程图;
图33是本申请一个实施例的摄像头模组的示意性结构图;
图34是本申请一个实施例的成像装置的示意性结构图;
图35是本申请另一个实施例的成像装置的示意性结构图;
图36是本申请一个实施例的调整摄像头模组的光谱模式的装置的示意性结构图;
图37是本申请另一个实施例的调整摄像头模组的光谱模式的装置的示意性结构图;
图38为本申请一个实施例的拜耳阵列滤光片的示意性结构图;
图39为本申请一个实施例的单一色图像的示意图;
图40为本申请另一个实施例的单一色图像的示意图。
具体实施方式
下面先介绍成像原理:景物通过镜头生成的光学图像投射到图像传感器表面,图像传感器将光信号转换为电信号,电信号经模数转换处理之后得到数字图像信号,数字图像信号经过处理器的加工处理,再传输到显示器上就可以看到图像了。
由于自然界中的绝大部分彩色,通常是由几种单一色,例如红、绿、蓝三种基色,按一定比例混合得到的,因此,想要获得彩色图像,需要先获取几种基色的单一色图像,然后再将这几种单一色图像按照一定的比例混合,才能得到彩色图像。
下面以单一色为红、绿和蓝三种基色为例,结合图38介绍基于拜耳阵列滤光片获取景物的单一色图像的实现方式。如图38所示,拜耳阵列滤光片中,红色、绿色和蓝色波段光的传感器呈棋盘状排列,因此,景物通过拜耳阵列滤光片投射到传感器上后得到的图像中,红绿蓝三个颜色呈棋盘状排列,红绿蓝三个颜色分别对应的单一色图像如图39所示。
如图39所示,每种单一色图像都只包含了一部分数据,要想获取完整的红、绿和蓝单一色图像,就需要根据图39所示的每一个单一色图像对该种单一色进行插值运算,以得到如图40所示的分别仅包含红色、绿色和蓝色的完整单一色图像。
而根据插值运算得到的三个单一色图像还原得到的彩色图像,不仅分辨率较低;而且,在插值过程中还会引入摩尔条纹、彩色噪声和拉链状噪声等问题,这也会降低成像质量。
为了提高图像的成像质量,本申请提出了新的滤光片、摄像头模组、成像系统以及成像方法。
图1是本申请一个实施例的成像系统100的示意架构图。成像系统100可以包括摄像头模组110和处理模块120。可选地,成像系统100还可以包括存储模块130和显示模块140。
其中,摄像头模组110可以包括滤光片模块112和传感器模块115。处理模块120可 以包括图像信号处理单元121和滤光片控制单元123。滤光片模块112、滤光片控制单元123、传感器模块115、图像信号处理单元121、存储模块130以及显示模块140之间通过总线系统150通信,即传输信号或数据。
滤光片模块112用于接收滤光片控制单元123输出的控制信号,并在控制信号的控制下,在不同的时间,向所述传感器模块上的同一像素点,输出入射光中不同波段内的光信号。滤光片模块112输出的光信号中包括成像系统100完成成像所需波段的光信号。例如,成像系统100的光谱模式为RGB模式时,所需波段包括红、绿和蓝三个波段;成像系统100的光谱模式为RYB模式时,所需波段包括红、黄、蓝三个波段;成像系统100的光谱模式为RWB模式时,所需波段包括红、蓝和全带通三个波段;成像系统100的光谱模式为RGBW模式时,所需波段包括红、绿、蓝、全带通波段;在其他一些特殊的光谱模式中,所需波段除了可见光中的波段以外,还可以包括近红外光波段。为了描述方便,将所需波段的光信号称为目标光信号。
可选地,滤光片模块112可以包括但不限于以下可调滤光片:微机电驱动的干涉式薄膜滤光片、基于液晶调制的可调滤光片、基于声光可调滤波器(AOTF)的可调滤光片、基于法布里-珀罗谐振腔(fabry–pérot cavity,FP)腔干涉的可调滤光片、基于磁光效应的可调滤光片等。滤光片模块112的示例性结构后续将会详细介绍。
传感器模块115用于将通过滤光片模块112入射的光信号转换为电信号,并输出该电信号。传感器模块115将光信号转换为电信号的过程,也可以称为感光成像。传感器模块115持续将光信号转换为电信号的过程,可以称为对光信号的连续积分。
可选地,传感器模块115可以在控制单元的控制下,仅在滤光片模块112输出目标光信号时才进行光电信号的转换;也可以是对滤光片模块112输出的所有光信号均进行光电信号的转换。后一种情况下,可以由图像信号处理单元121从传感器模块115输出的电信号中筛选出目标光信号转换得到的电信号。传感器模块115输出的电信号可以称为图像原始数据。
传感器模块115的一种示例为全带通传感器,该全带通传感器的每个像素对可见光均可感光成像,或者,该全带通传感器的每个像素对可见光和近红外光均可感光成像。
图像信号处理(image signal processor,ISP)单元121用于对传感器模块115输出的电信号进行处理,以得到彩色图像。例如,图像信号处理单元121用于对图像原始数据进行以下处理:白平衡(white balance,WB)、色彩还原(color correction,CC)、伽马(gamma)矫正、三维颜色查找表(3dimensions look-up-table,3D Lut)矫正,以实现色彩相关调试,得到全彩图像。
滤光片控制单元123用于向滤光片模块112输出控制信号,以控制滤光片模块112在不同的时间过滤输出入射光中不同波段的目标光信号。
存储模块130用于存储处理模块120执行的程序代码和/或图像相关数据,例如,传感器模块115采集的图像原始数据、处理模块120对图像原始数据进行色彩调试时的临时数据以及色彩调试后得到的彩色图像。
存储模块130可以是只读存储器(read only memory,ROM),静态存储设备,动态存储设备或者随机存取存储器(random access memory,RAM)等。
显示模块140用于显示图像信号处理单元121处理得到的彩色图像。显示模块140可 以是液晶显示器(liquid crystal display,LCD),发光二级管(light emitting diode,LED)显示设备,阴极射线管(cathode ray tube,CRT)显示设备,或投影仪(projector)等。
图1所示的成像系统中,不需要去马赛克处理,可以调高图像分辨率,避免摩尔条纹的产生。滤光片模块112可以提供更精细的光谱原始数据,以及处理模块120可以提供高维度CCM矩阵,以保障色彩调试时候的多自由度,保障色彩还原的准确性。
图2是本申请另一个实施例的成像系统200的示意架构图。如图2所示,成像系统200除了可以包括成像系统100中的各个组成部分,成像系统200中的摄像头模组110还可以包括镜头模块111。
镜头模块111还可以用于使得滤光片模块112输出的目标光信号可以更大面积地覆盖传感器模块115的感光区域,例如刚好覆盖传感器模块115的感光区域。其中,经过镜头模块111的光信号入射到滤光片模块112的角度,小于没有镜头模块111时,该光信号入射到滤光片模块112的角度。或者可以说,镜头模块111可以理解为减小光信号入射到滤光片模块112的角度。这样,镜头模块111减少大角度入射下滤光片模块112产生的色偏问题,同时保证更多的光信号入射到传感器模块115,从而可以进一步提高成像质量。
可以理解的是,镜头模块111位于滤光片模块112之前,还可以用于收集目标物和/或目标景发射或反射的光信号,并将该光信号输出到滤光片模块112。
镜头模块111可以包含但不限于以下镜头组件:塑料镜片组、塑玻混合镜片组、衍射光学元件(diffractive optical elements,DOE)镜片、超透镜(metalens)或其他镜头。可以理解的是,镜头模块111中包括3个镜片仅是一种示例,其可以包括更多或更少、或是更多种类的镜片。
图3是本申请另一个实施例的成像系统300的示意架构图。如图3所示,成像系统300除了可以包括成像系统200中的各个组成部分,成像系统300的摄像头模组110还可以包括马达模块114,处理模块120还可以包括马达控制单元122。
其中,马达模块114具体用于:接收马达控制单元122输出的控制信号,并根据该控制信号移动镜头模块111,以调整镜头模块111与传感器模块115之间相对位置,例如沿着镜头光轴方向调整镜头模块111与传感器模块115之间的距离,从而实现对焦功能,沿着垂直于光轴方向调整镜头模块111与传感器模块115之间的相对位置,从而实现光学防抖。
马达模块114可以包含但不限于以下类型马达:音圈马达(voice coil motor,VCM)、记忆金属合金(shape-memory alloy,SMA)马达、压电式(Peizo)马达或微机电系统(microelectro mechanical systems,Mems)马达等。
本申请实施例中,滤光片模块与对焦以及光学防抖功能互相独立,因此不会牺牲光学防抖性能。
图4是本申请另一个实施例的成像系统400的示意架构图。如图4所示,成像系统400可以包括成像系统300中的各个组成部分,不同之处在于,马达模块114具体用于:接收马达控制单元122输出的控制信号,并根据该控制信号移动传感器模块115,以调整镜头模块111与传感器模块115之间的相对位置,例如沿着镜头光轴方向调整镜头模块111与传感器模块115之间的距离,从而实现对焦功能,沿着与镜头光轴垂直方向调整镜头模块111与传感器模块115的相对位置,以实现光学防抖。
图5是本申请另一个实施例的成像系统500的示意架构图。如图5所示,成像系统500可以包括成像系统300中的各个组成部分,不同之处在于,马达模块114具体用于:接收马达控制单元122输出的控制信号,并根据该控制信号移动镜头模块111和传感器模块115,以调整镜头模块111与传感器模块115之间的相对位置,例如沿着镜头光轴方向调整镜头模块111与传感器模块115之间的距离,从而实现对焦功能,沿着垂直于光轴的方向调整镜头模块111与传感器模块115之间的相对位置,从而实现光学防抖,且该马达模块114可以同时移动镜头模块111和传感器模块115,从而可以快速实现对焦和快速防抖。
图6是本申请另一个实施例的成像系统600的示意架构图。如图6所示,成像系统600可以包括成像系统200中的各个组成部分,不同之处在于,成像系统600中的镜头模块111,与成像系统200中的镜头模块111的位置不同。具体地,成像系统600中的镜头模块111位于滤波片模块112与传感器模块115之间。
图7是本申请另一个实施例的成像系统700的示意架构图。如图7所示,成像系统700可以包括成像系统300中的各个组成部分,不同之处在于,成像系统700中的镜头模块111,与成像系统300中的镜头模块111的位置不同。具体地,成像系统700中的镜头模块111位于滤波片模块112与传感器模块115之间。
图8是本申请另一个实施例的成像系统800的示意架构图。如图8所示,成像系统800可以包括成像系统400中的各个组成部分,不同之处在于,成像系统800中的镜头模块111,与成像系统400中的镜头模块111的位置不同。具体地,成像系统800中的镜头模块111位于滤波片模块112与传感器模块115之间。
图9是本申请另一个实施例的成像系统900的示意架构图。如图9所示,成像系统900可以包括成像系统500中的各个组成部分,不同之处在于,成像系统900中的镜头模块111,与成像系统500中的镜头模块111的位置不同。具体地,成像系统900中的镜头模块111位于滤波片模块112与传感器模块115之间。
图10是本申请另一个实施例的成像系统1000的示意架构图。如图10所示,成像系统1000可以包括成像系统600中的各个组成部分,不同之处在于,成像系统1000比成像系统600至少多一个镜头模块111,且该镜头模块111位于滤波片模块112远离传感器模块115的一侧。
图11是本申请另一个实施例的成像系统1100的示意架构图。如图11所示,成像系统1100可以包括成像系统700中的各个组成部分,不同之处在于,成像系统1100比成像系统700至少多一个镜头模块111,且该镜头模块111位于滤波片模块112远离传感器模块115的一侧。
图12是本申请另一个实施例的成像系统1200的示意架构图。如图12所示,成像系统1200可以包括成像系统800中的各个组成部分,不同之处在于,成像系统1200比成像系统800至少多一个镜头模块111,且该镜头模块111位于滤波片模块112远离传感器模块115的一侧。
图13是本申请另一个实施例的成像系统1300的示意架构图。如图13所示,成像系统1300可以包括成像系统900中的各个组成部分,不同之处在于,成像系统1300比成像系统900至少多一个镜头模块111,且该镜头模块111位于滤波片模块112远离传感器模 块115的一侧。
图30是本申请另一个实施例的成像系统3000的示意架构图。如图30所示,成像系统3000可以包括成像系统1300中的各个组成部分,不同之处在于,成像系统3000中的马达模块114用于:接收马达控制单元122输出的控制信号,并根据该控制信号移动位于滤波片模块112远离传感器模块115的那一侧的镜头模块111和传感器模块115,以调整该镜头模块111与传感器模块115之间的相对位置,从而快速实现对焦和快速防抖。
上面介绍了本申请提出的成像系统的示意性结构。下面介绍本申请提出的滤光片模块的结构。在介绍本申请提出的滤光片模块之前,先介绍一下本申请的成像系统中的通光孔。
图33为本申请一个实施例的摄像头模组的示意性结构图。如图33所示,摄像头模组3300包括镜头3310、滤波片模块3320和传感器模块3330。图33中的虚线圆柱体区域以镜头3310和传感器模块3330的法线为中心,该虚线圆柱体区域即为通光孔3340。
可以理解的是,摄像头模组3300包括镜头3310、滤波片模块3320和传感器模块3330仅是一种示例,实际上,摄像头模组3300的结构可以是前述任意一种成像系统中的摄像头模组的结构。例如,摄像头模组3300中还可以包括马达模块,或者,滤波片3320与传感器模块3330之间可以包括另一个镜头模块,或者,摄像头模组中可以不包括镜头3310。
另外,可以理解的是,图33中示出的滤波片模块3320的形状仅是一种示例,该滤波片模块3320可以是图14至图19中任一所示的滤光片模块。
图14是本申请一个实施例的滤光片模块1400的示意性结构图。滤光片模块1400可以是图1至图13任意一个所示的摄像头模组中的滤光片模块。
滤光片模块1400包括红色窄带滤光片14-1、绿色窄带滤光片14-2、蓝色窄带滤光片14-3和高速电动转轮14-4,其中,红色窄带滤光片14-1、绿色窄带滤光片14-2和蓝色窄带滤光片14-3安装在高速电动转轮14-4上,高速电动转轮14-4不透光,仅有窄带滤光片透光。
滤光片模块1400的工作原理如下:高速电动转轮接收滤光片控制单元输出的控制信号,在控制信号的控制下高速转动,从而带动红色窄带滤光片14-1、绿色窄带滤光片14-2和蓝色窄带滤光片14-3转动;由于通光孔14-5的位置固定不变,因此,在高速电动转轮的带动下,红色窄带滤光片14-1、绿色窄带滤光片14-2和蓝色窄带滤光片14-3轮流覆盖通光孔14-5;红色窄带滤光片14-1、绿色窄带滤光片14-2和蓝色窄带滤光片14-3中每一个滤光片覆盖通光孔14-5时,通光孔14-5输出相应波段的目标光信号。也就是说,通过高速电动转轮的高速转动,实现滤光片模块1400输出的目标光信号在红色、绿色和蓝色波段之间的切换。
图16为本申请另一个实施例的滤光片模块1600的示意性结构图。滤光片模块1600包括红色窄带滤光片16-1、绿色窄带滤光片16-2、蓝色窄带滤光片16-3和高速电动转轮16-4,其中,红色窄带滤光片16-1、绿色窄带滤光片16-2和蓝色窄带滤光片16-3安装在高速电动转轮16-4上,高速电动转轮16-4不透光。滤光片模块1600为抽屉推拉式的机械切换结构。例如,在一个时段,红色窄带滤光片16-1被高速电动转轮16-4推进到通光孔所在的通道,而其他滤光片被拉出通光孔所在通道。滤光片模块1600可以是图1至图13任意一个所示的摄像头模组中的滤光片模块。
滤光片模块1600的工作原理如下:由于通光孔16-5的位置固定不变,因此,在抽拉 式/平移式马达的带动下,红色窄带滤光片16-1、绿色窄带滤光片16-2和蓝色窄带滤光片16-3轮流覆盖通光孔16-5;红色窄带滤光片16-1、绿色窄带滤光片16-2和蓝色窄带滤光片16-3中每一个滤光片覆盖通光孔16-5时,通光孔14-5输出相应波段的目标光信号。也就是说,通过抽拉式/平移式马达的带动,实现滤光片模块1600输出的目标光信号在红色、绿色和蓝色波段之间的切换。
可以理解的是,滤光片模块1400或滤光片模块1600中的滤光片可以是其他波段的滤光片,例如,可以是红色窄带滤波片、黄色窄带滤波片和蓝色窄带滤波片,或者可以是红色窄带滤波片、蓝色窄带滤波片和全带滤波片,或者可以是红色窄带滤波片、绿色窄带滤波片、蓝色窄带滤波片和全带滤波片。
可以理解的是,为了实现更精细的光谱调制,得到质量更高的图像,可以增加滤光片。例如,如图15所示,可以在滤光片模块1400中增加滤光片14-6甚至更多的滤光片。例如,滤光片模块1400用于对红外光成像时,可以在高速电动转轮14-4上按照红外滤光片,该红外滤光片仅允许红外光通过。例如,如图17所示,滤光片模块1600用于对红外光成像时,可以在高速电动转轮16-4上按照红外滤光片16-6,该红外滤光片仅允许红外光通过。
基于图14和图15所示的滤光片模块,可以在通光孔14-5上覆盖红外截止滤光片,基于图16或图17所示的滤光片模块,可以在通光孔16-5上覆盖红外截止滤光片,该红外截止滤光片不随高速电动马达的转动而转动,以过滤掉目标光信号中的红外光,从而提高成像质量;或者,可以在不能完全过滤掉红外红的滤光片的表面覆盖红外截止滤光片,以过滤掉红外光。
图18为本申请另一个实施例的滤光片模块1800的示意性结构图。滤光片模块1800可以是图1至图13任意一个所示的摄像头模组中的滤光片模块。
滤光片模块1800包括渐变干涉式薄膜滤光片18-1和高速电动转轮18-2,渐变干涉式薄膜滤光片18-1固定在高速电动转轮18-2上。渐变干涉式薄膜滤光片18-1在可见光范围可以连续滤波。高速电动转轮18-2不透光,仅有渐变干涉式薄膜滤光片18-1透光,成像系统的通光孔如18-3所示。
渐变干涉式薄膜滤光片18-1沿箭头方向呈渐变色,且颜色逐渐由浅变深,因此其可以对可见光范围内的光线进行连续的滤波。例如,渐变干涉式薄膜滤光片18-1可以对波长位于380纳米至780纳米范围内的光线进行连续滤波。
如图18所示,以虚线为起点,沿箭头方向,滤光片18-1能够透过的光的波长连续变化。这样,当渐变干涉式薄膜滤光片18-1沿箭头相反方向连续转动时,滤光片模块1800能够过滤输出的目标光信号的波长连续变化。
高速电动转轮18-2可以带动滤光片18-1仅沿逆时针方向转动或者仅沿顺时针方向转动,也可以带动滤光片沿逆时针方向和顺时针方向交叉转动。
例如,高速电动转轮18-2可以带动滤光片18-1沿逆时针方向不停地旋转,从而实现连续滤波。
又如,高速电动转轮18-2可以带动滤光片18-1沿顺时针方向不停旋转,从而实现连续滤波。
又如,高速电动转轮18-2带动滤光片18-1沿逆时针方向旋转一圈之后,带动滤光片 18-1沿顺时针方向旋转,并在沿顺时针方向旋转一圈之后,再带动滤光片沿逆时针方向旋转,从而实现连续滤波。
滤光片模块1800的工作原理与滤光片模块1400的工作原理类似,此处不再赘述。不同之处在于,渐变式滤光片每时每刻都有光通过,没有1400结构中两片滤光片之间的间隙。
可选地,若渐变干涉式薄膜滤光片18-1无法完全滤掉红外光,可以在通光孔18-3上安装单独的红外截止滤光片,该红外截止滤光片不随转轮转动。
可选地,滤光片模块1800用于对红外光成像时,渐变干涉式薄膜滤光片18-1在可见光到红外光连续可调。
图19为本申请另一个实施例的滤光片模块1900的示意性结构图。滤光片模块1900可以是图1至图13任意一个所示的摄像头模组中的滤光片模块。
滤光片模块1900包括渐变干涉式薄膜滤光片19-1和高速电动转轮19-2,渐变干涉式薄膜滤光片19-1固定在马达19-2上。渐变干涉式薄膜滤光片19-1沿长边在可见光范围连续滤波。马达19-2不透光,且仅有渐变干涉式薄膜滤光片19-1透光,成像系统的通光孔如19-3所示。
渐变干涉式薄膜滤光片19-1沿箭头方向呈渐变色,且颜色逐渐由浅变深,因此其可以对可见光范围内的光线进行连续的滤波。例如,渐变干涉式薄膜滤光片19-1可以对波长位于380纳米至780纳米范围内的光线进行连续滤波。
如图19所示,沿箭头方向,滤光片19-1能够透过的光的波长连续变化。这样,当渐变干涉式薄膜滤光片19-1沿箭头相反方向连续转动时,滤光片模块1900能够连续过滤输出的目标光信号的波长连续变化。
在一些示例中,马达19-2可以带动滤光片19-1沿箭头方向移动和沿箭头相反方向移动。例如,马达19-2带动滤光片19-1沿箭头方向移动,以便于滤光片19-1不同颜色部分按照从左到右的顺序从通光孔19-3上滑过;然后马达19-2带动滤光片19-1沿箭头反方向移动,以便于滤光片19-1不同颜色部分按照从右到左的顺序从通光孔19-3上滑过。
滤光片模块1900的工作原理与滤光片模块1400的工作原理类似,此处不再赘述。不同之处在于,渐变式滤光片可以连续滤光,没有1400结构中两片滤光片之间的间隙。
可选地,若渐变干涉式薄膜滤光片19-1无法完全滤掉红外光,可以在通光孔19-5上安装单独的红外截止滤光片,该红外截止滤光片不随转轮转动。
可选地,滤光片模块1900用于对红外光成像时,渐变干涉式薄膜滤光片19-1在可见光到红外光连续可调。
图20为本申请另一个实施例的滤光片模块2000的示意性结构图。滤光片模块2000可以是图1至图13任意一个所示的摄像头模组中的滤光片模块。
滤光片模块2000包含反射镜20-1、反射镜20-2、高速移动马达20-3,固定结构20-4。反射镜20-1和反射镜20-2沿光轴方向依次排列,光轴可以理解为垂直通过反射镜20-1和反射镜20-2的光柱(光束)的中心线。反射镜20-2通过固定结构20-4固定在滤光片模块2000中的固定位置,反射镜20-1在高速移动马达20-3的带动下沿光轴方向远离或接近反射镜20-1。
反射镜20-1可以将入射到反射镜20-1的光线反射到反射镜20-2,反射镜20-2也可以 将入射到反射镜20-2的光线反射到反射镜20-1。例如,反射镜20-1和反射镜20-2相对的内表面都具有高反射率。
基于法布里-珀罗(Fabry–Pérot,F-P)干涉原理,当反射镜20-1与反射镜20-2的两个反射面严格平行,且来自光源任一点的单色光以入射角θ照射到反射镜20-1和反射镜20-2时,透射光是许多平行光束的叠加,且任意一对相邻光束的光程差为2*n*l*cosθ,反射镜20-1与反射镜20-2的透射率由该光程差决定,其中,n指反射镜20-1与反射镜20-2之间介质的折射率,l指反射镜20-1与反射镜20-2之间的距离,θ指入射角。
当光程差为目标光信号的波长的整数倍时,该反射镜20-1和反射镜20-2对目标光信号的透射率最大。此时,可以认为滤光片模块2000仅能通过此目标光信号,而反射其他波长的光。
因此,可以通过高速移动马达20-3移动反射镜20-1,以改变反射镜20-1与反射镜20-2之间的距离,从而可以调节反射镜20-1与反射镜20-2的透射光的光程差,进而可以调节能够透过反射镜20-1与反射镜20-2的光的透射率,最终使得滤光片模块2000输出所需波长的目标光信号。
在另一些示例中,反射镜20-2也可以设置在高速移动马达上且随高速移动马达的移动而移动。这些示例中,只需保证反射镜20-1和反射镜20-2随高速移动马达移动时,相互之间的距离能够发生改变,以使得光程差能够达到目标光信号的波长的整数倍即可。
滤光片模块2000可以在整个可见光范围连续可调,或者可以在可见光范围和近红外光范围内连续可调。滤光片模块2000在整个可见光范围连续可调时,反射镜20-1、20-2通常可以采用镀银(Ag)反射镜。
高速移动马达20-3可以采用基于MEMs或者Peizo的高速移动马达。高速移动马达20-3可以接收滤光片控制单元输出的控制信号,该控制信号用于控制高速移动马达20-3移动的距离。
可选地,若不考虑利用红外光来进行图像处理,如图21所示,可以增加红外截止滤光片20-5来滤除红外光。其中,红外截止滤光片20-5完全覆盖反射镜20-2的通光口径,这样才能实现仅容许可见光波段通过的目的。
图22为本申请另一个实施例的滤光片模块2200的示意性结构图。滤光片模块2200可以是图1至图13任意一个所示的摄像头模组中的滤光片模块。
滤光片模块2200包括线性偏光片22-1、相位延迟单元22-2、液晶盒22-3、检偏器22-4。线性偏光片22-1、相位延迟单元22-2、液晶盒22-3、检偏器22-4沿光轴方向依次排开,光轴可以理解为垂直通过线性偏光片22-1的光柱(光束)的中心线。
光信号经过线性偏光片22-1转换成线偏光,线偏光在相位延迟器22-2和液晶盒22-3的作用下发生双折射效应,产生相位差,从而使得线偏光偏振方向旋转,进而使得只有与检偏器22-4方向一致的光信号可以通过。相同的液晶旋向对于不同的波段的光信号引入的相位差不同,因此可以向液晶盒输出变化的电压,通过可变电压控制液晶盒筛选出对应的波段的目标光信号。液晶盒22-3可以接收滤光片控制单元输出的控制信号,该控制信号用于控制液晶盒22-3的输入电压。
可选地,可以在滤光片模块2200中添加红外截止滤光片,以过滤掉红外光,从而提高成像质量。在可调滤光片模块2200内,对红外截止滤光片的位置无特殊要求,只要能 覆盖整个通光孔区域即可。在一个示例中,如图23所示,红外截止滤光片22-5沿光轴方向排列在检偏器之后。
可选地,如图24所示,将图22所示的结构(图24中一个虚线框内结构)当作一个整体单元,并将多个该结构沿着光轴方向依次排列,得到新的滤光片模块2400,该滤光片结构可以输出更精细的波段光信号。可选地,可以在滤光片模块2400中添加红外截止滤光片,以过滤掉红外光,从而提高成像质量。
在可调滤光片模块2400内,对红外截止滤光片的位置无特殊要求,只要能覆盖整个通光孔区域即可。在一个示例中,红外截止滤光片24-5沿光轴方向排列在最后一个检偏器之后。
图25为本申请另一个实施例的滤光片模块2500的示意性结构图。滤光片模块2500可以是图1至图13任意一个所示的摄像头模组中的滤光片模块。
滤光片模块2500包含线性偏光片25-1、相位延迟单元25-2、声光可调滤光器(AOTF)25-3、检偏器25-4。线性偏光片25-1、相位延迟单元25-2、声光可调滤光器(AOTF)25-3、检偏器25-4沿光轴方向依次排列,光轴可以理解为垂直通过线性偏光片25-1的光柱(光束)的中心线。相位延迟单元也称为相位延迟器。
入射光信号经过线性偏光片25-1转换成线偏光,线偏光在相位延迟器25-2和AOTF25-3的作用下发生双折射效应,产生相位差,使得线偏光偏振方向旋转,这样,只有与检偏器25-4方向一致的光信号可以通过。其中,AOTF利用射频进行驱动,同一个射频(radio frequency,RF)信号对于不同波长引入的相位差不同,因此可以通过改变射频信号的频率控制AOTF筛选出不同波段的光。
其中,相位延迟器可以采用液晶相位延迟器,AOTF的材料可以选择二氧化碲(TeO2)。
可选地,为了提高成像质量,可以在滤光片模块中添加红外截止滤光片。在可调滤光片模块2600内,对红外截止滤光片的位置无特殊要求,只要能覆盖整个通光孔区域即可。在一个示例中,如图26所示,红外截止滤光片25-5沿光轴方向排列在检偏器之后。
本申请各个实施例中,因为滤光片模块可以分时输出不同波段的目标信号光,传感器模块无需进行插值运算即可获取完整单一色图像,从而可以避免因插值运算而导致图像质量下降的问题。
图27为本申请一个实施例的成像方法的示意性流程图。该成像方法可以由图1至图13中任一所示的成像系统来执行。
图27所示的成像方法包括S2710至S2780。可以理解的是,本申请提出的成像方法中可以包括更多或更少的步骤。例如,执行成像方法的成像系统中不包含马达模块和马达控制单元时,该成像方法中不包括S2750和S2740。
S2710,启动成像应用。例如,用户点击手机上成像应用的图标,手机启动该成像应用。
S2720,滤光片控制单元向滤光片模块输出控制信号,以控制滤光片模块输出目标光信号。
具体地,滤光片控制单元根据设定的光谱模式控制滤光片模块进行光谱筛选。光谱模式包含但不局限于以下模式:RGB模式,所需波段包括红、绿和蓝三个波段;RYB模式, 所需波段包括红、黄、蓝三个波段;RWB模式,所需波段包括红、蓝和全带通三个波段;RGBW模式,所需波段包括红、绿、蓝、全带通波段;在其他一些特殊的光谱模式中,所需波段除了可见光中的波段以外,还可以包括近红外光波段,例如用于后期图像处理,以在ISP里通过计算的方法获得更好的成像质量。其中,每个波段的光谱范围可以根据实际应用进行微调。
传感器模块接受光信号,将光信号转换为电信号(即图像原始数据)并传递给ISP。在一些设计中,传感器模块的感光积分时间和周期与滤光片模块中的各种滤光片的滤光时间和周期配合,具体地,传感器模块对设定的单波段内的目标光信号进行连续积分,对各个光谱模式中波段切换过程中的光信号不进行探测。传感器模块对设定的单波段内的目标光信号进行连续积分得到的数据可以称为窄波段图片。
S2730,ISP从传感器模块接收图像原始数据,并对图像原始数据进行处理。例如,传感器模块将窄波段图片组传给ISP,ISP在整个彩色图像处理通路(color image processing pipeline)中,对窄波段图片进行预处理、白平衡、色彩还原、Gamma校正和3D Lut校正等处理,最终完成色彩相关的调试,得到彩色图像。
S2740,马达控制单元根据陀螺仪反馈信息向马达模块输出控制信息,以使得马达模块在该控制信息的控制下调整镜头模块与传感器模块之间的距离。若成像系统中没有马达模块和马达控制单元,则可以不执行该步骤。
S2750,马达控制单元根据ISP输出的对焦信息向马达模块输出控制信息,以使得马达模块在该控制信息的控制下调整镜头模块与传感器模块之间的距离。若成像系统中没有马达模块和马达控制单元,则可以不执行该步骤。
S2760,显示模块显示ISP处理得到的彩色图像。
S2770,判断是否进行拍照或录像。例如用户点击用户界面上的拍照或录像按键时,确定进行拍照或录像。
S2780,若确定进行拍照或录像,则存储模块保存ISP处理得到的彩色图像;否则重新执行S2720。
本申请实施例中,由于ISP从传感器模块接收的窄波段图片为单一色图像,因此整个ISP色彩通路中不需要通过插值运算(即去马赛克处理)即可获取完整单一色图像,从而可以有效的提高图像分辨率,避免摩尔条纹的产生,最终提高成像质量。
本申请还提供了一种成像方法,该成像方法的一种示意性流程图如图31所示。图31所示的成像方法可以包括S3101至S3107。
该成像方法可以由ISP执行。例如,ISP在彩色图像处理通路中对窄波段图像进行处理以得到彩色图像。这n个窄波段图像可以是采用包含前述图14至图26中任一图所示的滤光片模块的摄像头模组拍摄得到的图像。其中,窄波段图像1至窄波段图像n与ISP所属的成像系统的光谱模式所包括的n个波段一一对应,窄波段图像i是指滤波片模块过滤出所述n个波段中的第i个波段的目标光信号之后,传感器模块对该第i个波段的目标光信号连续积分得到的图像数据。例如,n等于3时,窄波段图像1对应红色波段,窄波段图像2对应绿色波段、窄波段图像3对应蓝色波段。
S3101,对n个窄波段图像进行预处理(preprocessing)。
此处的预处理可以指对传感器模块采集的图像数据进行非颜色相关的处理,例如,对 进行噪声抑制或锐化等相关处理。
例如,有些可调滤光片,要实现三通道(n=3),但是实际上这种滤光片采集时候实际采集的通道要更精细更多(n>3),可以在预处理阶段通过相邻通道对应的图像信息相加的方式,获取最终的通道模式。
其中,相邻通道对应的图像信息相加,可以抵消随机噪声,提高得到的图像的抗噪声能力。也就是说,该预处理方式可以提高多通道滤光片的感光能力。
S3102,对预处理后的图像进行白平衡。
白平衡可以还原被拍摄景物的色彩,使在不同光源条件下拍摄的景物同人眼观看的画面色彩相近。
其中,若预处理之后得到三个通道的图像数据,则进行白平衡处理时,通常需要三个数据矩阵,这三个数据矩阵与这三个通道的图像数据一一对应,这三个数据矩阵的获取方式可以参考现有技术。
若预处理之后得到的是三个以上通道的图像数据,则进行白平衡处理时,通常需要三个以上数据矩阵,这三个以上的数据矩阵与这三个以上通道的图像数据一一对应,这三个以上的数据矩阵的获取方式可以参考上述三个数据矩阵的获取方式,此处不再赘述。
S3103,使用色彩矫正矩阵对白平衡后的图像进行色彩还原。
色彩矫正可以确保图像的色彩能够被较为精确地再现拍摄现场人眼看到的情况。其中,色彩还原使用的色彩校正矩阵(color correction matrix,CCM)的维度可以由预处理后得到的图像通道数和显示系统的基色数量。
例如,预处理得到n个通道的图像数据,且显示系统为m基色系统时,对应的CCM是个m*n的矩阵,m和n为正整数。
例如,预处理得到三个通道的图像数据,且显示系统为三基色系统时,对应的CCM是一个3*3的矩阵。
本申请实施例中的CCM的获取方式可以参考现有技术中3*3的CCM的获取方式,此处不再赘述。
S3104,对色彩还原得到的图像进行伽马矫正。
本申请的实施例中,若将预处理得到的n个窄波段图像中的第i个图像记为矩阵λi,将对该第i个图像进行白平衡处理所使用的矩阵记为awb i,则对预处理得到的n窄波段图像进行白平衡、色彩校正和伽马校正的一种示例性数学表达式如下:
Figure PCTCN2020128975-appb-000001
其中,
Figure PCTCN2020128975-appb-000002
中的λ i表示预处理得到的n个通道中的第i个通道的图像数据矩阵,i为小于 或等于n的正整数;
Figure PCTCN2020128975-appb-000003
表示CCM,m表示显示系统的基色数量,() γ表示伽马处理,
Figure PCTCN2020128975-appb-000004
中的λ j表示进行白平衡、色彩校正和伽马校正得到的m个基色图像数据矩阵中的第j个基色图像数据矩阵,j为小于或等于m的矩阵。
S3105,对伽马矫正得到的图像进行3D Lut矫正。
S3106,对3D Lut矫正得到的图像进行后处理(post processing)。后处理与前处理类似,此处不再赘述。
S3107,显示后处理得到的图像,和/或,存储后处理得到的图像。其中,在存储图像之前,可以对图像先进行压缩,以节省存储空间。
可以理解的是,图31所示的操作仅是示例,本申请提出的成像方法中可以包括更多或更少的操作,或者,可以执行类似的操作。
例如,如图32所示,ISP进行彩色图像处理时,不仅对窄波段图像1至窄波段图像n进行处理,还可以对红外(infrared radiation,IR)波段图像进行处理。其中,红外波段图像可以用于低照度下的灰度成像,以及用于可见光图像更精准的色彩还原。
本申请还提供了一种成像方法,该成像方法可以包括S2720和S2730。可选地,该成像方法还可以包括S2740;可选地,还可以包括S2750,可选地,还可以包括2760和/或S2780。
本申请还提供了如图1至图13中任意一个所示的成像系统中的摄像头模组。
本申请还提供了一种多模组摄像头,该该模组摄像头包括多个摄像头模组,其中至少一个摄像头模组为图1至图13中任意一个所示的成像系统中的摄像头模组。可选地,其中至少一个摄像头模组可以为传统摄像头模组,例如基于拜耳阵列传感器或基于弗文(foveon)传感器的摄像头模组。这样,可以提高成像质量以及联合实现双目测距。
该多模组摄像头包括传统摄像头模组和如图1至图13中任意一个所示的成像系统中的摄像头模组时的成像方法的一种示意性流程图如图28所示。其中,本申请新提出的摄像头模组称为可调摄像头模组。
S2810,启动成像应用。该步骤可以参考S2710。
S2820,启动传统摄像头模组进行预览对焦。该步骤可以参考现有技术。
S2830,判断是否使用可调摄像头模组进行拍照或录像,是则执行S2840,否则重复执行S2850。
S2840,使用可调摄像头模组进行拍照或录像。该步骤的具体实现方式可以参考图27所示的成像方法中的S2720至S2780。
本申请还提供一种成像装置,该成像装置包括图1至图13任一所示的成像系统中的 处理模块,甚至可以包括其中的存储模块和/或显示模块。
本申请还提供一种终端设备,该终端设备包括图1至图13中任一所示的成像系统。
本申请还提供一种模式设置方法,该方法包括:显示模式选择界面,该模式选择界面上包括多种模式选项;根据用户在模式选择界面上的输入信息确定用户选择的模式,例如,用户点击的选项对应的模式即为用户选择的模式;根据用户选择的模式设置光谱模式,不同的光谱模式对应不同波段的光信号。
本申请还提供一种调整摄像头模组的光谱模式的方法,该方法包括:接收用于指示设置所述摄像头模组的光谱模式的指示信息,响应于所述指示信息,输出多个光谱模式中每个光谱模式的模式信息,所述每个光谱模式的模式信息包括所述每个光谱模式的名称信息。
其中,所述多个光谱模式中不同的光谱模式下,所述摄像头模组的滤光通道不同。例如,所述多个光谱模式包括但不仅限于:普通模式、高精度模式、暗光模式、印刷模式等。通常情况下,默认用户选择的是普通模式。
普通模式下,可调滤光片采用与传统拜耳滤光片相同的滤光策略,采用红,绿,蓝三通道滤光;高精度模式下,可调滤光片采用至少四通道滤光,光谱精细程度高于传统拜耳滤光片;暗光模式下,可调滤光片的滤光通道中一般至少含有一个宽于RGB单色波段的通道,例如RYB,RWB等。可选地,这多个光谱模式还可以包括专家模式,专家模式主要针对更专业的人群的需求,在专家模式中用户可以自定义滤光的通道数以及每个通道的波段。
该方法还可以包括:接收第二信息,所述第二信息用于指示将所述摄像头模组的光谱模式设置为所述多个光谱模式中的目标光谱模式;响应于所述第二信息,将所述摄像头模组的光谱模式设置为所述目标光谱模式。其中,目标光谱模式即为用户从所述多个光谱模式中选择的光谱模式。
在一些设计中,可以通过界面的形式输出这多个光谱模式的模式信息,一种示例如图29所示。或者,也可以通过其他方式输出这多个光谱模式的模式信息,例如,通过语音播放的方式。
在一些设计中,每个光谱模式的模式信息还可以包括该光谱模式对应的波段或者滤光通道信息。
图29为本申请一个实施例的模式设置界面的示意图。该界面中包括以下几个光谱模式选项:普通模式、高精度模式、暗光模式、印刷模式和专家模式,其中,默认设置为普通模式。每个模式名称后的图标为可选中图标,用户选中其中哪个图标,表示用户将光谱模式设置为该图标对应的模式。专家模式通过双选开关控制,一般处于关闭状态。
图34是本申请一个实施例的成像装置的示意性结构图。该成像装置3400包括获取模块3410和处理模块3420。成像装置3400可以用于实现上述任意成像方法。
例如,获取模块3410用于:获取多组图像原始数据,所述多组图像原始数据为摄像头模组对不同时间采集的不同波段的目标光信号进行光电转换得到的图像原始数据。处理模块3420用于对所述多组图像原始数据进行色彩调试处理,以得到彩色图像,所述多组图像原始数据中不同组的图像原始数据对应的不同波段内的目标光信号。
在一些可能的实现方式中,所述处理模块3420具体用于:对所述多组图像原始数据 进行白平衡、色彩还原、伽马矫正和三维色彩查找处理。
图35是本申请另一个实施例的成像装置的示意性结构图。该装置3500包括存储器3510和处理器3520。
存储器3510用于存储程序。处理器3520用于执行所述存储器3510存储的程序,当所述存储器3510存储的程序被执行时,所述处理器3520用于执行前述任意一种成像方法。
图36是本申请一个实施例的调整摄像头模组的光谱模式的装置的示意性结构图。该装置3600包括输入模块3610和输出模块3620,可选地,还可以包括处理模块3630。装置3600可以用于实现上述调整摄像头模组的光谱模式的方法。
例如,输入模块3610用于:接收用于指示设置所述摄像头模组的光谱模式的指示信息;输出模块3620用于:响应于所述指示信息,输出多个光谱模式中每个光谱模式的模式信息,所述每个光谱模式的模式信息包括所述每个光谱模式的名称信息。
可选地,输入模块3610还可以用于:接收第二信息,所述第二信息用于指示将所述摄像头模组的光谱模式设置为所述多个光谱模式中的目标光谱模式;处理模块3630用于:响应于所述第二信息,将所述摄像头模组的光谱模式设置为所述目标光谱模式,其中,目标光谱模式即为用户从所述多个光谱模式中选择的光谱模式。
图37是本申请一个实施例的调整摄像头模组的光谱模式的装置的示意性结构图。该装置3700包括存储器3710和处理器3720。
存储器3710用于存储程序。处理器3720用于执行所述存储器3710存储的程序,当所述存储器3710存储的程序被执行时,所述处理器3720用于执行前述调整摄像头模组的光谱模式的方法。
本领域普通技术人员可以意识到,结合本文中所公开的实施例描述的各示例的单元及算法步骤,能够以电子硬件、或者计算机软件和电子硬件的结合来实现。这些功能究竟以硬件还是软件方式来执行,取决于技术方案的特定应用和设计约束条件。专业技术人员可以对每个特定的应用来使用不同方法来实现所描述的功能,但是这种实现不应认为超出本申请的范围。
所属领域的技术人员可以清楚地了解到,为描述的方便和简洁,上述描述的系统、装置和单元的具体工作过程,可以参考前述方法实施例中的对应过程,在此不再赘述。
在本申请所提供的几个实施例中,应该理解到,所揭露的系统、装置和方法,可以通过其它的方式实现。例如,以上所描述的装置实施例仅仅是示意性的,例如,所述单元的划分,仅仅为一种逻辑功能划分,实际实现时可以有另外的划分方式,例如多个单元或组件可以结合或者可以集成到另一个系统,或一些特征可以忽略,或不执行。另一点,所显示或讨论的相互之间的耦合或直接耦合或通信连接可以是通过一些接口,装置或单元的间接耦合或通信连接,可以是电性,机械或其它的形式。
所述作为分离部件说明的单元可以是或者也可以不是物理上分开的,作为单元显示的部件可以是或者也可以不是物理单元,即可以位于一个地方,或者也可以分布到多个网络单元上。可以根据实际的需要选择其中的部分或者全部单元来实现本实施例方案的目的。
另外,在本申请各个实施例中的各功能单元可以集成在一个处理单元中,也可以是各个单元单独物理存在,也可以两个或两个以上单元集成在一个单元中。
应理解,本申请实施例中的处理器可以为中央处理单元(central processing unit,CPU), 该处理器还可以是其他通用处理器、数字信号处理器(digital signal processor,DSP)、专用集成电路(application specific integrated circuit,ASIC)、现场可编程门阵列(field programmable gate array,FPGA)或者其他可编程逻辑器件、分立门或者晶体管逻辑器件、分立硬件组件等。通用处理器可以是微处理器或者该处理器也可以是任何常规的处理器等。
所述功能如果以软件功能单元的形式实现并作为独立的产品销售或使用时,可以存储在一个计算机可读取存储介质中。基于这样的理解,本申请的技术方案本质上或者说对现有技术做出贡献的部分或者该技术方案的部分可以以软件产品的形式体现出来,该计算机软件产品存储在一个存储介质中,包括若干指令用以使得一台计算机设备(可以是个人计算机,服务器,或者网络设备等)执行本申请各个实施例所述方法的全部或部分步骤。而前述的存储介质包括:U盘、移动硬盘、只读存储器(read-only memory,ROM)、随机存取存储器(random access memory,RAM)、磁碟或者光盘等各种可以存储程序代码的介质。
应理解,本申请中的“/”表示“或”的意思。其中“和/或”可以包括三种并列的方案。例如“A和/或B”可以包括:A,B,A和B。应理解,本申请中的“A或B”可以包括:A,B,A和B。
以上所述,仅为本申请的具体实施方式,但本申请的保护范围并不局限于此,任何熟悉本技术领域的技术人员在本申请揭露的技术范围内,可轻易想到变化或替换,都应涵盖在本申请的保护范围之内。因此,本申请的保护范围应以所述权利要求的保护范围为准。

Claims (30)

  1. 一种摄像头模组,其特征在于,包括:滤光片模块和传感器模块;
    所述滤光片模块用于:在不同的时间,向所述传感器模块上的同一像素点输出入射到所述滤光片模块的光信号中不同波段的目标光信号;
    所述传感器模块用于:将入射到所述传感器模块的所述目标光信号转换为电信号并输出。
  2. 如权利要求1所述的摄像头模组,其特征在于,所述滤光片模块包括移动模块和多个滤波片;
    所述滤波片用于输出入射到所述滤波片上的光信号中的所述目标光信号,所述多个滤波片中不同的滤波片输出的所述目标光信号的波段不同;
    所述移动模块用于:在不同的时间,将所述多个滤波片中不同的滤波片移动到能够接收所述光信号的目标位置。
  3. 如权利要求1所述的摄像头模组,其特征在于,所述滤光片模块包括移动模块和线性渐变干涉式的滤光片;
    所述移动模块用于:在不同的时间,将所述线性渐变干涉式的滤光片不同部位移动到能够接收所述光信号的目标位置,所述滤光片的不同部分输出的所述目标光信号的波段不通过。
  4. 如权利要求3所述的摄像头模组,其特征在于,所述滤光片从可见光到红外光连续可调。
  5. 如权利要求1所述的摄像头模组,其特征在于,所述滤光片模块包括两个两面反射镜和移动模块;
    所述移动模块用于:调节所述两个两面反射镜之间的距离,使得在不同的时间所述两个两面反射镜之间的距离不同;
    所述两个两面反射镜用于:将入射到所述一个两面反射镜上的光信号中的目标光信号过滤出并从所述两个两面反射镜中的另一个两面反射镜输出,所述两面反射镜之间的距离不同时,所述目标光信号的波段不同。
  6. 如权利要求1所述的摄像头模组,其特征在于,所述滤光片模块包括液晶可调滤光片。
  7. 如权利要求6所述的摄像头模组,其特征在于,所述滤光片模块为所述摄像头模组的多个滤光片模块中的一个。
  8. 如权利要求1所述的摄像头模组,其特征在于,所述滤光片模块包括声光可调滤光片。
  9. 如权利要求1至8中任一项所述的摄像头模组,其特征在于,所述滤光片模块还包括红外截止滤光片,所述红外截止滤光片用于:在所述目标光信号入射到所述传感器模块之前,过滤掉所述目标光信号中的红外光。
  10. 如权利要求1至9中任一项所述的摄像头模组,其特征在于,所述摄像头模组还包括:第一镜头模块;
    其中,所述第一镜头模块用于:将入射到所述第一镜头模块的光信号输出到所述滤光片模块。
  11. 如权利要求10所述的摄像头模组,其特征在于,所述第一镜头模块包括以下一项或多项:塑料镜片,玻璃镜片,衍射光学元件DOE,超透镜,镜头。
  12. 如权利要求10或11所述的摄像头模组,其特征在于,所述摄像头模组还包括马达模块;
    所述马达模块用于:控制所述第一镜头模块和/或所述传感器模块移动,以实现所述摄像头模块的对焦功能和/或防抖功能。
  13. 如权利要求10至12中任一项所述的摄像头模组,其特征在于,所述摄像头模块还包括第二镜头模块,所述第二镜头模块位于所述滤光片模块和所述传感器模块之间;
    其中所述第二镜头模块用于:将所述滤光片输出的目标光信号输出到所述传感器模块,并增大所述目标光信号在所述传感器模块上的覆盖范围。
  14. 如权利要求1至9中任一项所述的摄像头模组,其特征在于,所述摄像头模块还包括第二镜头模块,所述第二镜头模块位于所述滤光片模块和所述传感器模块之间;
    其中所述第二镜头模块用于:将所述滤光片输出的目标光信号输出到所述传感器模块,并增大所述目标光信号在所述传感器模块上的覆盖范围。
  15. 如权利要求14所述的摄像头模组,其特征在于,所述摄像头模组还包括第二马达模块;
    所述第二马达模块用于:控制所述第二镜头模块和/或传感器模块移动,以实现所述摄像头模块的对焦功能和/或防抖功能。
  16. 如权利要求14或15所述的摄像头模组,其特征在于,所述摄像头模组还包括:第一镜头模块;
    其中,所述第一镜头模块用于:将入射到所述第一镜头模块的光信号输出到所述滤光片模块上的所述目标位置。
  17. 如权利要求13至16中任一项所述的摄像头模组,其特征在于,所述第二镜头模块包括以下一项或多项:塑料镜片,玻璃镜片,衍射光学元件DOE,超透镜,镜头。
  18. 如权利要求1至17中任一项所述的摄像头模组,其特征在于,所述传感器模块包括全带通传感器或宽带通传感器。
  19. 一种成像方法,其特征在于,包括:
    获取多组图像原始数据,所述多组图像原始数据为摄像头模组对不同时间采集的不同波段的目标光信号进行光电转换得到的图像原始数据;
    对所述多组图像原始数据进行色彩调试处理,以得到彩色图像,所述多组图像原始数据中不同组的图像原始数据对应的不同波段内的目标光信号。
  20. 如权利要求19所述的成像方法,其特征在于,所述对所述摄像头模组采集的多组图像原始数据进行色彩调试处理,包括:
    对所述多组图像原始数据进行白平衡、色彩还原、伽马矫正和三维色彩查找处理。
  21. 一种成像装置,其特征在于,包括:
    获取模块,用于获取多组图像原始数据,所述多组图像原始数据为摄像头模组对不同时间采集的不同波段的目标光信号进行光电转换得到的图像原始数据;
    处理模块,用于对所述摄像头模组采集的多组图像原始数据进行色彩调试处理,以得到彩色图像,所述多组图像原始数据中不同组的图像原始数据对应的不同波段内的目标光信号。
  22. 如权利要求21所述的成像装置,其特征在于,所述处理模块具体用于:
    对所述多组图像原始数据进行白平衡、色彩还原、伽马矫正和三维色彩查找处理。
  23. 一种成像系统,其特征在于,包括如权利要求1至18中任一项所述的摄像头模组和/或如权利要求21至22中任一项所述的成像装置。
  24. 一种计算机可读存储介质,其特征在于,所述计算机可读存储介质中存储用于成像装置执行的指令,所述指令用于执行权利要求19至20中任一项所述的成像方法。
  25. 一种调整摄像头模组的光谱模式的方法,其特征在于,包括:
    接收第一信息,所述第一信息用于指示设置所述摄像头模组的光谱模式;
    响应于所述第一信息,输出多个光谱模式中每个光谱模式的模式信息,所述每个光谱模式的模式信息包括所述每个光谱模式的名称信息。
  26. 如权利要求25所述的方法,其特征在于,所述多个光谱模式包括以下至少一个:普通模式、高精度模式、暗光模式、印刷模式或专家模式。
  27. 如权利要求25或26所述的方法,其特征在于,所述方法还包括:
    接收第二信息,所述第二信息用于指示将所述摄像头模组的光谱模式设置为所述多个光谱模式中的目标光谱模式;
    响应于所述第二信息,将所述摄像头模组的光谱模式设置为所述目标光谱模式。
  28. 一种调整摄像头模组的光谱模式的装置,其特征在于,包括:
    输入模块,用于接收第一信息,所述第一信息用于指示设置所述摄像头模组的光谱模式;
    输出模块,用于:响应于所述第一信息,输出多个光谱模式中每个光谱模式的模式信息,所述每个光谱模式的模式信息包括所述每个光谱模式的名称信息。
  29. 如权利要求28所述的装置,其特征在于,所述多个光谱模式包括以下至少一个:普通模式、高精度模式、暗光模式、印刷模式或专家模式。
  30. 如权利要求28或29所述的装置,其特征在于,所述装置还包括处理模块;
    其中,所述输入模块还用于接收第二信息,所述第二信息用于指示将所述摄像头模组的光谱模式设置为所述多个光谱模式中的目标光谱模式;
    所述处理模块用于:响应于所述第二信息,将所述摄像头模组的光谱模式设置为所述目标光谱模式。
PCT/CN2020/128975 2020-01-08 2020-11-16 摄像头模组、成像方法和成像装置 WO2021139401A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US17/791,471 US20230045724A1 (en) 2020-01-08 2020-11-16 Camera Module, Imaging Method, and Imaging Apparatus
EP20911551.8A EP4075781A4 (en) 2020-01-08 2020-11-16 CAMERA MODULE, IMAGE METHOD AND IMAGE DEVICE

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010018843.XA CN113099078B (zh) 2020-01-08 2020-01-08 摄像头模组、成像方法和成像装置
CN202010018843.X 2020-01-08

Publications (1)

Publication Number Publication Date
WO2021139401A1 true WO2021139401A1 (zh) 2021-07-15

Family

ID=76664124

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/128975 WO2021139401A1 (zh) Camera module, imaging method, and imaging apparatus

Country Status (4)

Country Link
US (1) US20230045724A1 (zh)
EP (1) EP4075781A4 (zh)
CN (1) CN113099078B (zh)
WO (1) WO2021139401A1 (zh)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112492182B (zh) * 2020-12-11 2022-05-20 维沃移动通信有限公司 Camera module, electronic device, and photographing control method and apparatus
CN113747014B (zh) * 2021-09-03 2023-12-26 维沃移动通信(杭州)有限公司 Camera module, electronic device, and image collection method
CN113905164A (zh) * 2021-10-09 2022-01-07 奕目(上海)科技有限公司 Light field imaging system and method for collecting light field information by using the light field imaging system
CN114384052A (zh) * 2021-12-22 2022-04-22 重庆盾银科技有限公司 Tracking imaging system and imaging method
TW202346911A (zh) * 2022-02-14 2023-12-01 美商圖納堤克斯股份有限公司 Systems and methods for high-quality imaging using a color-splitting meta-optical computational camera
CN114845017A (zh) * 2022-04-13 2022-08-02 Oppo广东移动通信有限公司 Imaging module, electronic device, imaging method, and storage medium

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN202600333U (zh) * 2012-04-20 2012-12-12 中国科学院遥感应用研究所 Multispectral CCD camera with adjustable wavelength bands
US20130229646A1 (en) * 2012-03-02 2013-09-05 Seiko Epson Corporation Component analyzer
CN105467490A (zh) * 2015-12-28 2016-04-06 北京天诚盛业科技有限公司 Filter module, imaging apparatus, and mobile terminal
CN106657802A (zh) * 2016-12-19 2017-05-10 北京空间机电研究所 Automatic exposure adjustment system and adjustment method for a filter-wheel multispectral camera
CN107395940A (zh) * 2017-08-30 2017-11-24 广东欧珀移动通信有限公司 Filter assembly, imaging apparatus, electronic device, and imaging method of the electronic device
CN107561684A (zh) * 2017-09-30 2018-01-09 广东欧珀移动通信有限公司 Filter, lens module, and imaging module
CN108827886A (zh) * 2018-03-16 2018-11-16 上海帆声图像科技有限公司 Imaging apparatus

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7477309B2 (en) * 2002-07-10 2009-01-13 Lockheed Martin Corporation Infrared camera system and method
JP2004159293A (ja) * 2002-09-11 2004-06-03 Casio Comput Co Ltd Imaging and recording apparatus, image processing apparatus, imaging and recording control program, image processing program, imaging and recording method, and image processing method
JP2009121986A (ja) * 2007-11-15 2009-06-04 Omron Corp Spectroscopic apparatus
WO2014020791A1 (ja) * 2012-08-02 2014-02-06 パナソニック株式会社 Polarization color imaging apparatus
TWI593950B (zh) * 2012-11-13 2017-08-01 唯亞威方案公司 Portable spectrometer
KR20160023838A (ko) * 2013-06-24 2016-03-03 테크놀로지 이노베이션 모멘텀 펀드 (이스라엘) 리미티드 파트너쉽 System and method for color image acquisition
CN103698010B (zh) * 2013-12-31 2016-05-04 中国科学院光电研究院 Miniaturized linear variable filter imaging spectrometer
CN104159049A (zh) * 2014-08-15 2014-11-19 北京思比科微电子技术股份有限公司 Image sensor and operating method thereof
CN105987754B (zh) * 2015-03-04 2020-06-12 中国人民解放军电子工程学院 Imager integrating hyperspectral and polarization hyperspectral detection capabilities
CN207164435U (zh) * 2017-08-24 2018-03-30 普联技术有限公司 Camera apparatus
CN108449531B (zh) * 2018-03-26 2020-03-06 京东方科技集团股份有限公司 Hovering-touch camera module, electronic device, and touch method
CN209182232U (zh) * 2018-11-23 2019-07-30 深圳市雷泛科技有限公司 Spectrometer
CN110595616A (zh) * 2019-08-23 2019-12-20 南京理工大学 Hyperspectral imaging apparatus and imaging method using a linear variable filter and a slit

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130229646A1 (en) * 2012-03-02 2013-09-05 Seiko Epson Corporation Component analyzer
CN202600333U (zh) * 2012-04-20 2012-12-12 中国科学院遥感应用研究所 Multispectral CCD camera with adjustable wavelength bands
CN105467490A (zh) * 2015-12-28 2016-04-06 北京天诚盛业科技有限公司 Filter module, imaging apparatus, and mobile terminal
CN106657802A (zh) * 2016-12-19 2017-05-10 北京空间机电研究所 Automatic exposure adjustment system and adjustment method for a filter-wheel multispectral camera
CN107395940A (zh) * 2017-08-30 2017-11-24 广东欧珀移动通信有限公司 Filter assembly, imaging apparatus, electronic device, and imaging method of the electronic device
CN107561684A (zh) * 2017-09-30 2018-01-09 广东欧珀移动通信有限公司 Filter, lens module, and imaging module
CN108827886A (zh) * 2018-03-16 2018-11-16 上海帆声图像科技有限公司 Imaging apparatus

Also Published As

Publication number Publication date
EP4075781A1 (en) 2022-10-19
US20230045724A1 (en) 2023-02-09
CN113099078A (zh) 2021-07-09
CN113099078B (zh) 2023-06-27
EP4075781A4 (en) 2023-01-11

Similar Documents

Publication Publication Date Title
WO2021139401A1 (zh) 摄像头模组、成像方法和成像装置
JP4717363B2 (ja) マルチスペクトル画像撮影装置及びアダプタレンズ
JP5543616B2 (ja) 色フィルタアレイ画像反復デノイズ
US7932932B2 (en) Method and apparatus for a chopped two-chip cinematography camera
JP3943848B2 (ja) 撮像装置
US9008412B2 (en) Image processing device, image processing method and recording medium for combining image data using depth and color information
EP2087725A1 (en) Improved light sensitivity in image sensors
TW200917833A (en) Image sensor having checkerboard pattern
CN103636199B (zh) 三维摄像装置、图像处理装置、图像处理方法
JP4952329B2 (ja) 撮像装置、色収差補正方法およびプログラム
KR20140131452A (ko) 이미지 신호 처리장치 및 방법과 이를 이용한 영상 처리 시스템
WO2017199557A1 (ja) 撮像装置、撮像方法、プログラム、及び非一時的記録媒体
JP2010245870A (ja) マルチスペクトル撮影用レンズアダプタ装置、マルチスペクトル撮影装置、および画像処理装置
WO2013027507A1 (ja) 撮像装置
JP5507362B2 (ja) 3次元撮像装置および光透過板
JP2011017827A (ja) フィルタ装置ならびにこれを備える撮影レンズおよび撮影装置
CN112203065A (zh) 图像生成装置及电子设备、图像生成方法
JP2011232615A (ja) 撮像装置
US8063979B2 (en) Digital camera displaying moving image and exposure control method thereof
US7176967B1 (en) Method and apparatus for a two-chip cinematography camera
CN102687514B (zh) 三维摄像装置以及图像处理装置
CN116471466B (zh) 一种双晶片内窥镜成像方法及成像装置
CN117061841B (zh) 一种双晶片内窥镜成像方法及成像装置
JP2009141842A (ja) 撮像装置、印刷システム、撮像装置制御プログラム及び撮像装置制御方法
CN117061841A (zh) 一种双晶片内窥镜成像方法及成像装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20911551

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2020911551

Country of ref document: EP

Effective date: 20220713

NENP Non-entry into the national phase

Ref country code: DE