CN113259546B - Image acquisition device and image acquisition method

Image acquisition device and image acquisition method

Info

Publication number
CN113259546B
Authority
CN
China
Prior art keywords
image
filter
sampling interval
light
image sampling
Prior art date
Legal status
Active
Application number
CN202010087243.9A
Other languages
Chinese (zh)
Other versions
CN113259546A
Inventor
陈晓雷
胡红旗
吴志江
赖昌材
郑士胜
李瑞华
Current Assignee
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd
Priority to CN202010087243.9A
Priority to PCT/CN2020/126076 (WO2021159768A1)
Publication of CN113259546A
Application granted
Publication of CN113259546B

Classifications

    • G02B26/00 Optical devices or arrangements for the control of light using movable or deformable optical elements
    • G03B11/00 Filters or other obturators specially adapted for photographic purposes
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/50 Constructional details
    • H04N23/54 Mounting of pick-up tubes, electronic image sensors, deviation or focusing coils
    • H04N23/55 Optical parts specially adapted for electronic image sensors; Mounting thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/80 Camera processing pipelines; Components thereof
    • H04N5/222 Studio circuitry; Studio devices; Studio equipment
    • H04N5/262 Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects; Cameras specially adapted for the electronic generation of special effects
    • H04N5/265 Mixing

Abstract

The application provides an image acquisition device and method. The image acquisition device comprises an optical filter, an image sensor, a control unit and an image synthesis unit. In this embodiment, the image sensor photoelectrically images the visible light passing through the optical filter during the first image sampling interval to obtain a first image containing chromaticity information, and photoelectrically images the infrared light (or infrared light and visible light) passing through the optical filter during the second image sampling interval to obtain a second image containing brightness information. The image synthesis unit then synthesizes the first image and the second image, so that a higher-quality target image can be obtained. No beam-splitting prism is required and only one sensor is needed, which reduces the material cost and complexity of the device.

Description

Image acquisition device and image acquisition method
Technical Field
The present application relates to the field of image applications, and more particularly, to an image acquisition apparatus and an image acquisition method.
Background
With the rapid development of image application technology, ever-higher imaging quality is required of image acquisition equipment under low illumination. Compared with a normally illuminated scene, a low-illuminance environment degrades imaging quality, mainly through reduced image brightness and increased noise. The biggest factor causing this degradation is the reduced amount of light entering the imaging sensor, which lowers the signal-to-noise ratio of the output signal. Therefore, increasing the amount of light entering the image sensor at low illumination is the key to improving low-illumination image quality. Supplementary lighting with fill-light devices is a common means, and these devices can be divided into two types according to the wavelength of the emitted light: visible light and infrared light. Visible fill light conveniently captures the real color information of the scene, but it disturbs people. Infrared fill light does not obviously disturb people, but it cannot capture the true colors of objects well.
To acquire both visible and infrared images of a scene, current practice is to employ a beam-splitting prism. As shown in fig. 1, incident light passing through the beam-splitting prism is split into visible light and infrared light by the optical coating inside the prism. A color sensor is placed on the exit surface of the visible light to image the visible light and obtain a visible-light image, and an infrared sensor is placed on the exit surface of the infrared light to image the infrared light and obtain an infrared image. The visible-light image and the infrared image are then synthesized by an algorithm, and the final color image is output.
However, the above scheme requires a beam-splitting prism and two sensors, which raises the material cost, and the two sensors must be accurately aligned during assembly, which raises the production complexity. In addition, the beam-splitting prism added behind the lens is incompatible with existing standard back-focal lenses and enlarges the optical path, which is not conducive to device miniaturization.
Disclosure of Invention
The application provides an image acquisition device and an image acquisition method that acquire a first image containing chromaticity information and a second image containing brightness information at different image sampling intervals and synthesize the first image and the second image into a target image. No beam-splitting prism is required and only one sensor is needed, which reduces the material cost and complexity of the device.
In a first aspect, an image acquisition apparatus is provided that includes a filter, an image sensor, a control unit, and an image synthesizing unit.
The optical filter is used for gating the incident light.
The image sensor is used for photoelectrically imaging the light in the incident light that passes through the optical filter.
The control unit is connected to the optical filter and the image sensor, and is used for controlling the optical filter to pass the visible light in the incident light and block the infrared light in the incident light (i.e., not pass the infrared light) during a first image sampling interval, and to pass the infrared light in the incident light during a second image sampling interval; and for controlling the image sensor to photoelectrically image the light passing through the optical filter during the first image sampling interval to obtain a first image, and to photoelectrically image the light passing through the optical filter during the second image sampling interval to obtain a second image.
The image synthesis unit is used for synthesizing the first image and the second image to generate a first target image.
Therefore, in the embodiment of the application, a first image containing chromaticity information is obtained by photoelectrically imaging the light passing through the optical filter during the first image sampling interval, a second image containing brightness information is obtained by photoelectrically imaging the light passing through the optical filter during the second image sampling interval, and the first image and the second image are then synthesized, so that a higher-quality target image can be obtained. Because the image sensor in the embodiment of the application acquires the first image and the second image in different time periods, a single sensor can be used, which reduces the material cost; no registration between image sensors is needed, which reduces the complexity; and the device volume can be reduced.
Optionally, the control unit may be further connected to the image synthesis unit, for controlling the image synthesis unit to synthesize the first image and the second image.
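As a rough illustration of this time-division flow, one acquisition period can be sketched as follows. This is a minimal sketch under assumed, hypothetical names (ImageAcquisitionDevice, set_mode, expose, fuse); it is not the claimed implementation.

```python
# Minimal sketch of the time-division acquisition flow described above.
# All class and method names are hypothetical, not from the application.

class ImageAcquisitionDevice:
    def __init__(self, optical_filter, sensor, synthesizer):
        self.filter = optical_filter    # gates the incident light
        self.sensor = sensor            # performs photoelectric imaging
        self.synthesizer = synthesizer  # fuses the two frames

    def capture_target_image(self, t1, t2):
        # First image sampling interval: pass visible light, block infrared.
        self.filter.set_mode("visible_only")
        first_image = self.sensor.expose(duration=t1)   # chromaticity frame

        # Second image sampling interval: pass infrared (visible optional).
        self.filter.set_mode("infrared")
        second_image = self.sensor.expose(duration=t2)  # brightness frame

        # One target image per target image acquisition period.
        return self.synthesizer.fuse(first_image, second_image)
```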
With reference to the first aspect, in certain implementations of the first aspect, the filter includes a first filter region for passing visible light and not passing infrared light, and a second filter region for passing infrared light. Alternatively, the second filtering area may pass visible light, or may not pass visible light, which is not limited in the embodiments of the present application.
The control unit is specifically configured to filter the incident light using the first filtering region at the first image sampling interval, and filter the incident light using the second filtering region at the second image sampling interval.
Therefore, in the embodiment of the application, the first filter region can cover the photosensitive area of the image sensor during the first image sampling interval and the second filter region can cover it during the second image sampling interval, so that the image sensor photoelectrically images the visible light passing through the optical filter during the first image sampling interval, and the infrared light (or infrared light and visible light) passing through the optical filter during the second image sampling interval.
It should be noted that "filtering the incident light using the first filter region during the first image sampling interval" covers two cases: the first filter region alone is used to filter the incident light, or the first filter region and the second filter region are used simultaneously. Likewise, "filtering the incident light using the second filter region during the second image sampling interval" covers two cases: the second filter region alone is used to filter the incident light, or the first filter region and the second filter region are used simultaneously.
With reference to the first aspect, in certain implementations of the first aspect, the filter is a circular filter, the first filter region is a first sector region of the circular filter, and the second filter region is a second sector region of the circular filter;
the control unit comprises a motor, wherein the motor is used for controlling the circular optical filter to rotate around the circle center of the circular optical filter, so that the first fan-shaped area covers the photosensitive area of the image sensor at the first image sampling interval, and the second fan-shaped area covers the photosensitive area at the second image sampling interval.
Since infrared light interferes strongly with RGB (visible-light) imaging while visible light interferes little with infrared imaging, "covers" in "the first fan-shaped area covers the photosensitive area of the image sensor during the first image sampling interval" means covering the entire photosensitive area, whereas "covers" in "the second fan-shaped area covers the photosensitive area during the second image sampling interval" means that the entire photosensitive area or only a part of it may be covered.
Therefore, the embodiment of the application can periodically change the spectrum on the image sensor positioned behind the circular optical filter by rotating the circular optical filter around the circle center, so that the image sensor periodically performs color frame exposure and black and white frame exposure.
With reference to the first aspect, in certain implementation manners of the first aspect, the photosensitive area is a rectangular area, and in a case that a duration of the first image sampling interval and a duration of the second image sampling interval are the same, the circular filter meets the following condition:
R≥c+b/(2sin((π–ωt)/2));
wherein ω represents the angular velocity of rotation of the circular filter, t represents the duration of the first image sampling interval or the second image sampling interval, R represents the radius of the circular filter, b represents the first side length of the photosensitive area of the image sensor, c represents the second side length of the photosensitive area, the first side and the second side are perpendicular to each other, and ωt ≤ π.
It should be noted that no imaging is performed within the angle 2β, that is, no photoelectric imaging is performed while the photosensitive area is covered partly by the first fan-shaped area and partly by the second fan-shaped area. Here β is the central angle of the circular filter subtended by the first side.
With reference to the first aspect, in certain implementations of the first aspect, the central angle corresponding to the first sector area is greater than or equal to ωt + 2arcsin(b/(2(R − c))), and/or the central angle corresponding to the second sector area is greater than or equal to ωt + 2arcsin(b/(2(R − c))).
With reference to the first aspect, in certain implementation manners of the first aspect, the photosensitive area is a rectangular area, and in a case that a duration of the first image sampling interval and a duration of the second image sampling interval are the same, the circular filter meets the following condition:
R≥c+b/(2sin(π–ωt));
wherein ω represents an angular velocity of rotation of the circular filter, t represents a duration of the first image sampling interval or the second image sampling interval, R represents a radius of the circular filter, b represents a first side length of a photosensitive region of the image sensor, c represents a second side length of the photosensitive region of the image sensor, and the first side and the second side are perpendicular to each other.
With reference to the first aspect, in certain implementations of the first aspect, the central angle corresponding to the first fan-shaped area is greater than or equal to ωt + 2arcsin(b/(2(R − c))).
With reference to the first aspect, in certain implementation manners of the first aspect, the photosensitive area is a rectangular area, and the circular filter meets the following conditions:
R≥c+b/(2sin(β/2));
wherein R represents the radius of the circular filter, b represents the first side length of the photosensitive area of the image sensor, c represents the second side length of the photosensitive area, β is the central angle of the circular filter subtended by the first side, and the first side and the second side are perpendicular to each other.
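As a numeric illustration of these radius and angle conditions, the following sketch plugs in assumed example values (the sensor dimensions, exposure time, and angular velocity are illustrative assumptions, not values from the application):

```python
import math

# Assumed example values (not from the application):
t = 0.02                    # t1 = t2 = t, seconds
omega = 0.9 * math.pi / t   # rad/s, chosen so that omega*t = 0.9*pi < pi
b, c = 5.0, 7.0             # photosensitive-area side lengths, mm

wt = omega * t
# Condition R >= c + b/(2*sin(pi - omega*t)):
r1 = c + b / (2 * math.sin(math.pi - wt))
# Condition R >= c + b/(2*sin((pi - omega*t)/2)):
r2 = c + b / (2 * math.sin((math.pi - wt) / 2))
print(f"minimum radius, first condition : {r1:.2f} mm")   # ~15.09 mm
print(f"minimum radius, second condition: {r2:.2f} mm")   # ~22.98 mm

# Lower bound on a sector central angle, omega*t + 2*arcsin(b/(2*(R - c))):
R = r2
angle = wt + 2 * math.asin(b / (2 * (R - c)))
print(f"minimum sector central angle: {math.degrees(angle):.1f} deg")  # ~180
```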
With reference to the first aspect, in certain implementations of the first aspect, the rotation angular velocity of the circular filter is ω, the central angle of the first sector area in the circular filter is (2π − γ), the central angle of the second sector area is γ, the exposure time of the first image (i.e., the first image sampling interval) is t1, and the exposure time of the second image (i.e., the second image sampling interval) is t2.
ω, γ, t1 and t2 satisfy the following formulas:
2π − γ − β ≥ ωt1;
γ + β ≥ ωt2, or γ ≥ ωt2, or γ − β ≥ ωt2.
Optionally, t1 and t2 may be equal or unequal; the embodiments of the present application do not limit this. When t1 and t2 are equal, this can be written t1 = t2 = t.
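The angle conditions above can be checked mechanically. The following helper is a sketch under the same notation (the function itself is an illustration, not part of the application):

```python
import math

def feasible_exposures(omega, gamma, beta, t1, t2):
    """Return which of the three alternative black-and-white conditions
    hold, given that the color-frame condition holds.

    The first (visible-only) sector spans (2*pi - gamma), the second
    (infrared) sector spans gamma, and beta is the central angle
    subtended by the sensor edge of length b.
    """
    if 2 * math.pi - gamma - beta < omega * t1:
        return []  # the color-frame exposure does not fit
    bounds = {
        "gamma + beta >= omega*t2": gamma + beta,
        "gamma >= omega*t2": gamma,
        "gamma - beta >= omega*t2": gamma - beta,
    }
    return [name for name, lhs in bounds.items() if lhs >= omega * t2]

# Example: omega*t = 0.9*pi with a small beta satisfies all three.
print(feasible_exposures(45 * math.pi, math.pi, 0.3, 0.02, 0.02))
```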
With reference to the first aspect, in certain implementations of the first aspect, the control unit includes a driver and a lever, a first end of the lever is connected to the optical filter, and the driver is configured to control, by the lever, movement of the optical filter such that the first optical filter region covers a photosensitive region of the image sensor at the first image sampling interval, and the second optical filter region covers the photosensitive region at the second image sampling interval.
Since infrared light interferes strongly with RGB imaging while visible light interferes little with infrared imaging, "covers" in "the first filter region covers the photosensitive area of the image sensor during the first image sampling interval" means covering the entire photosensitive area, whereas "covers" in "the second filter region covers the photosensitive area during the second image sampling interval" means that the entire photosensitive area or only a part of it may be covered.
Therefore, in the embodiment of the application, the driver drives the lever connected to the optical filter, and the lever moves the optical filter at its end, so that the spectrum reaching the image sensor located behind the optical filter changes periodically, and the image sensor periodically performs color-frame exposure and black-and-white-frame exposure.
With reference to the first aspect, in certain implementations of the first aspect, the optical filter includes electronically controlled light absorbing material therein,
the control unit is specifically configured to apply a voltage to the optical filter, and control the voltage so that the optical filter passes visible light and cannot pass infrared light at the first image sampling interval, and passes infrared light at the second image sampling interval.
Therefore, according to the embodiment of the application, the electric control light absorption material is arranged in the optical filter, and the absorption peak of the light absorption material in the optical filter can be periodically changed by changing the voltage applied to the optical filter, so that the spectrum on the image sensor positioned behind the optical filter is periodically changed, and the image sensor is periodically subjected to color frame exposure and black and white frame exposure.
With reference to the first aspect, in certain implementations of the first aspect, the electronically controlled light-absorbing material comprises an organic color-changing material or a liquid crystal material. The response time of such electronically controlled light-absorbing materials can reach the millisecond level, allowing fast switching between black-and-white-frame exposure and color-frame exposure.
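For illustration, driving such a filter reduces to toggling the applied voltage once per sampling interval. The sketch below assumes hypothetical voltage levels and a hypothetical set_voltage()/expose() interface:

```python
# Sketch of voltage-driven spectral gating for an electronically
# controlled light-absorbing filter. The voltage levels and the
# set_voltage()/expose() interfaces are illustrative assumptions.
V_BLOCK_INFRARED = 5.0  # hypothetical: absorption peak set to block infrared
V_PASS_INFRARED = 0.0   # hypothetical: infrared passes

def run_acquisition_period(filter_dac, sensor, t1, t2):
    filter_dac.set_voltage(V_BLOCK_INFRARED)   # first image sampling interval
    first = sensor.expose(duration=t1)         # color frame
    filter_dac.set_voltage(V_PASS_INFRARED)    # second image sampling interval
    second = sensor.expose(duration=t2)        # black-and-white frame
    return first, second
```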
With reference to the first aspect, in certain implementation manners of the first aspect, the method further includes:
An optical unit including an optical lens for capturing the incident light and imaging it on the image sensor, wherein the optical filter is disposed in front of or behind the optical lens, or between two lenses of the optical lens.
With reference to the first aspect, in certain implementation manners of the first aspect, the control unit is specifically configured to:
Under the condition that the gain of the image sensor is determined to be greater than a preset value, the control unit controls the optical filter to pass visible light and block infrared light during the first image sampling interval and to pass infrared light during the second image sampling interval, and controls the image sensor to photoelectrically image the light passing through the optical filter during the first image sampling interval and the light passing through the optical filter during the second image sampling interval.
When the gain of the image sensor is determined to be greater than the preset value, the photographed environment can be judged to be a low-illuminance environment. Therefore, when a low-illuminance environment is perceived, the optical filter can be controlled to pass visible light and block infrared light during the first image sampling interval, and to pass infrared light (or infrared light and visible light) during the second image sampling interval.
With reference to the first aspect, in certain implementation manners of the first aspect, the control unit is further configured to:
and when the gain of the image sensor is smaller than or equal to the preset value, controlling the image sensor to perform photoelectric imaging on incident light in a third image sampling interval to acquire a second target image, wherein the third image sampling interval is equal to the sum of the first image sampling interval and the second image sampling interval.
In this way, the frame rate of the images output by the image acquisition device in a low-illuminance environment can be kept the same as in a normal environment.
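The gain-based mode switch can be summarized in a short sketch (device, capture_* and fuse are hypothetical names, not the application's interfaces):

```python
def acquire(device, gain, gain_threshold, t1, t2):
    """Sketch of the gain-based mode switch; all names are illustrative."""
    if gain > gain_threshold:
        # Low illuminance: time-division visible + infrared acquisition.
        first = device.capture_color_frame(t1)      # visible light only
        second = device.capture_infrared_frame(t2)  # infrared (or IR+visible)
        return device.fuse(first, second)           # first target image
    # Normal illuminance: one exposure over the third image sampling
    # interval t3 = t1 + t2, so the output frame rate stays unchanged.
    return device.capture_frame(t1 + t2)            # second target image
```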
In a second aspect, an image acquisition method is provided. The method may be performed by an image acquisition device in the first aspect or any one of the possible implementations of the first aspect.
In the method, the optical filter is controlled to pass the visible light in the incident light and block the infrared light during a first image sampling interval, and to pass the infrared light in the incident light during a second image sampling interval; the light passing through the optical filter during the first image sampling interval is photoelectrically imaged by an image sensor to obtain a first image; the light passing through the optical filter during the second image sampling interval is photoelectrically imaged by the image sensor to obtain a second image; and the first image and the second image are synthesized by an image synthesis unit to generate a first target image.
Therefore, in the embodiment of the application, a first image containing chromaticity information is obtained by photoelectrically imaging the light passing through the optical filter during the first image sampling interval, a second image containing brightness information is obtained by photoelectrically imaging the light passing through the optical filter during the second image sampling interval, and the first image and the second image are then synthesized, so that a higher-quality target image can be obtained. Because the image sensor in the embodiment of the application acquires the first image and the second image in different time periods, a single sensor can be used, which reduces the material cost; no registration between image sensors is needed, which reduces the complexity; and the device volume can be reduced.
For the steps of the method of the second aspect, reference may be made to the corresponding operations of the modules of the apparatus of the first aspect; they are not repeated here.
In a third aspect, a computer-readable medium is provided for storing a computer program comprising instructions for performing the method of the first aspect or the second aspect, or any possible implementation of the first aspect or the second aspect.
In a fourth aspect, a computer program product containing instructions is also provided which, when run on a computer, causes the computer to perform the method of the first aspect or the second aspect, or any possible implementation of the first aspect or the second aspect.
Drawings
Fig. 1 is a schematic diagram of imaging using a beam splitting prism.
Fig. 2 is a schematic diagram of an image capturing device according to an embodiment of the present application.
Fig. 3 is another schematic diagram of an image capturing device according to an embodiment of the present application.
Fig. 4 is a specific example of a circular filter provided in an embodiment of the present application.
Fig. 5 is an example of exposure of a circular filter provided in an embodiment of the present application.
Fig. 6 is a specific example of the connection between the lever and the filter according to the embodiment of the present application.
Fig. 7 is an example of exposure of a filter coupled to a lever according to an embodiment of the present application.
Fig. 8 is another example of exposure of a filter coupled to a lever according to an embodiment of the present application.
Fig. 9 is another example of exposure of a filter coupled to a lever according to an embodiment of the present application.
Fig. 10 is another specific example of the filter provided in the embodiment of the present application.
Fig. 11 is a schematic diagram of another image capturing device according to an embodiment of the present application.
Fig. 12 is a schematic diagram of combining a first image and a second image in a YUV color space according to an embodiment of the present application.
Fig. 13 is a schematic flow chart of a method of image acquisition provided in an embodiment of the present application.
Detailed Description
The technical solutions in the present application will be described below with reference to the accompanying drawings.
Fig. 2 shows a schematic diagram of an image capturing device 200 according to an embodiment of the present application. As an example, the image obtaining apparatus 200 may be a monitoring camera, a mobile phone or other wearable devices with shooting function or a camera in a smart home device, which is not limited in the embodiment of the present application.
For example, in a monitored-imaging application scene, a monitoring camera needs to operate 7×24 hours (that is, 7 days a week, 24 hours a day). This means most monitoring cameras operate in a low-illuminance environment for about half of the time. A monitoring camera therefore needs high imaging quality (e.g., high brightness, low noise) in low-illuminance environments so that it can help capture important information (e.g., a face or a body color). In addition, in a photographing application scene, a camera used in a low-illuminance environment is also required to acquire images that satisfy the business application.
In some possible embodiments, the image acquisition apparatus 200 may be an apparatus for acquiring images at night, and the related units of the apparatus 200 may perform operations such as light gating and the corresponding image synthesis at night, but the embodiment of the present application is not limited thereto.
As shown in fig. 2, the image acquisition apparatus 200 includes a filter 210, an image sensor 220, a control unit 230, and an image synthesizing unit 240.
Wherein the optical filter 210 is used for gating the incident light.
The image sensor 220 is used for photoelectrically imaging the light passing through the filter 210 in the incident light.
A control unit 230 is connected to the optical filter 210 and the image sensor 220, and is used for controlling the optical filter 210 to pass the visible light in the incident light and block the infrared light in the incident light (i.e., not pass the infrared light) during a first image sampling interval, and to pass the infrared light in the incident light during a second image sampling interval. It also controls the image sensor 220 to photoelectrically image the light passing through the optical filter 210 during the first image sampling interval to obtain a first image, and to photoelectrically image the light passing through the optical filter 210 during the second image sampling interval to obtain a second image.
That is, the control unit 230 can control the optical filter 210 to pass different types of light during different image sampling intervals, i.e., realize a spectral gating function, and can thereby control the type of light reaching the surface of the image sensor 220.
It should be noted that, in the embodiment of the present application, the first image is obtained by imaging visible light of the real scene, so as to embody chromaticity information of the real scene, and the second image is mainly obtained by imaging infrared light of the real scene, so as to embody brightness information of the real scene. In some embodiments, the first image may be referred to as a color frame image and the second image may be referred to as a black and white frame image.
Since infrared light affects the imaging of visible light, the optical filter 210 does not pass infrared light during the first image sampling interval. Visible light does not affect the imaging of infrared light, so during the second image sampling interval the optical filter 210 may either pass or block the visible light in the incident light; the embodiments of the present application do not limit this. Allowing the optical filter 210 to pass visible light during the second image sampling interval can help reduce the complexity of the filter.
The image sensor 220 is configured to receive an optical signal and convert the optical signal into an electrical signal. As an example, the image sensor 220 may be a charge coupled device (charge coupled device, CCD), a complementary metal oxide semiconductor (complementary metal oxide semiconductor, CMOS), or the like, which is not limited by the embodiments of the present application.
That is, in the embodiment of the present application, the control unit 230 controls the optical filter 210 and the image sensor 220 so that, during the first image sampling interval, the optical filter 210 allows the visible light in the incident light to pass and blocks the infrared light, and the light passing through the optical filter 210 is photoelectrically imaged on the image sensor 220 to obtain the first image; during the second image sampling interval, the optical filter 210 allows the infrared light in the incident light (or the infrared light and visible light) to pass, and the light passing through the optical filter 210 is photoelectrically imaged on the image sensor 220 to obtain the second image.
It should be noted that, since the image sensor 220 photoelectrically images visible light during the first image sampling interval and infrared light (or visible and infrared light) during the second image sampling interval, the embodiment of the present application can employ one sensor to acquire the visible light and the infrared light of a photographed scene in different time periods and image them separately. That is, in the present embodiment, the image sensor 220 can be time-division multiplexed.
Alternatively, the duration of the first image sampling interval and the second image sampling interval may be the same or different, which is not limited in the embodiment of the present application.
In other possible implementations, two sensors may be used to perform visible light imaging and infrared light imaging, respectively, which is not limited in this embodiment of the present application.
The image synthesizing unit 240 is configured to synthesize the first image acquired by the image sensor 220 and the second image acquired by the image sensor 220, and acquire a target image. Wherein the chrominance information in the target image is mainly from the first image and the luminance information in the target image is mainly from the second image.
In some alternative embodiments, the control unit 230 may be further connected to the image synthesis unit 240, for controlling the image synthesis unit 240 to synthesize the first image and the second image.
In the embodiment of the present application, the period for acquiring one frame of target image may be referred to as a target image acquisition period. The first image and the second image need to be acquired in a time-division manner within one target image acquisition period, that is, the first image is acquired during the first image sampling interval of the period and the second image during the second image sampling interval. The target image of that acquisition period can then be generated from the first image and the second image.
As an example, when the output frame rate of the apparatus 200 is 25 frames/second, one target image acquisition period is 0.04 s, and the first image sampling interval and the second image sampling interval may each be 0.02 s. As a specific example, within a 0.04 s target image acquisition period, the first image may be acquired during the first 0.02 s and the second image during the second 0.02 s.
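This arithmetic can be restated in a few lines (a trivial sketch of the numbers above):

```python
fps = 25.0             # output frame rate of the apparatus, frames/second
period = 1.0 / fps     # one target image acquisition period: 0.04 s
t1 = t2 = period / 2   # equal first and second image sampling intervals
print(period, t1, t2)  # 0.04 0.02 0.02
```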
Therefore, in the embodiment of the application, a first image containing chromaticity information is obtained by photoelectrically imaging the light passing through the optical filter during the first image sampling interval, a second image containing brightness information is obtained by photoelectrically imaging the light passing through the optical filter during the second image sampling interval, and the first image and the second image are then synthesized, so that a higher-quality target image can be obtained. Because the image sensor in the embodiment of the application acquires the first image and the second image in different time periods, a single sensor can be used, which reduces the material cost; no registration between image sensors is needed, which reduces the complexity; and the device volume can be reduced.
In the embodiment of the present application, the image synthesizing unit may be implemented in electronic hardware, or computer software, or a combination of computer software and electronic hardware, which is not limited in the embodiment of the present application. As an example, the image composition unit may be implemented by a central processing unit (central processing unit, CPU), a graphics processor (graphics processing unit, GPU) or a specific image processing chip. In some possible implementations, the CPU, GPU, or image processing chip for image composition may read code to implement composition of the first image and the second image, which is not limited in this embodiment of the present application.
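As one concrete illustration of the synthesis in the YUV color space (cf. fig. 12), the sketch below takes the brightness channel mainly from the second image and the chromaticity channels from the first image. This simple channel replacement is an assumption for illustration, not the application's synthesis algorithm:

```python
import numpy as np

def fuse_yuv(first_yuv: np.ndarray, second_yuv: np.ndarray) -> np.ndarray:
    """Fuse two H x W x 3 YUV frames: Y from the black-and-white frame,
    U and V from the color frame (illustrative sketch only)."""
    target = np.empty_like(first_yuv)
    target[..., 0] = second_yuv[..., 0]   # Y (brightness) from second image
    target[..., 1:] = first_yuv[..., 1:]  # U, V (chromaticity) from first
    return target
```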
In some possible implementations, the filter 210 may include a first filter region and a second filter region. The first filter region is used for passing visible light and not passing infrared light, and the second filter region is used for passing infrared light. Alternatively, the second filtering area may pass visible light, or may not pass visible light, which is not limited in the embodiments of the present application.
As shown in fig. 3, the optical filter 210 may include a first filter region 2101 and a second filter region 2102. The control unit 230 may be specifically configured to gate the incident light using the first filter region during the first image sampling interval and using the second filter region during the second image sampling interval.
As an example, the control unit 230 may control the optical filter 210 to move so that the first filter region in the optical filter 210 covers the photosensitive area of the image sensor 220 during the first image sampling interval and the second filter region covers the photosensitive area during the second image sampling interval. That is, during the first image sampling interval, the first filter region lies in the optical path before the image sensor and gates the visible light in the path so that the image sensor receives visible light; during the second image sampling interval, the second filter region lies in the optical path before the image sensor and gates the infrared light (or the infrared and visible light) so that the image sensor receives infrared light (or infrared and visible light).
Therefore, in the embodiment of the present application, the first filtering area in the optical filter 210 can be covered on the photosensitive area of the image sensor 220 at the first image sampling interval, and the second filtering area can be covered on the photosensitive area of the image sensor 220 at the second image sampling interval, so that the image sensor can perform photoelectric imaging on the visible light passing through the optical filter at the first image sampling interval, and perform photoelectric imaging on the infrared light (or the infrared light and the visible light) passing through the optical filter at the second image sampling interval.
Note that fig. 3 is for illustrating a schematic structure of the filter 210, and thus only a part of modules or units in the image capturing apparatus are shown. That is, in addition to the filter 210 and the control unit 230 shown in fig. 3, other modules or units (not shown in fig. 3) such as an image sensor and an image synthesizing unit are also included in the image capturing apparatus, which is not limited in the embodiment of the present application.
The following describes specific implementations of two types of optical filters including a first optical filtering region and a second optical filtering region provided in embodiments of the present application.
Mode one
The filter 210 may be designed as a circular filter. Wherein the first filter region 2101 is a first fan-shaped region in the circular filter and the second filter region is a second fan-shaped region in the circular filter. Accordingly, the control unit 230 may include a motor therein, and the motor may be configured to control the circular filter to rotate around the center of the circular filter, such that the first fan-shaped area covers the photosensitive area of the image sensor during the first image sampling interval and the second fan-shaped area covers the photosensitive area during the second image sampling interval. In some descriptions, the circular filter may also be referred to as a color wheel, which is not limited by the embodiments of the present application.
In some embodiments, the motor may control the circular filter to rotate at a constant speed around the center of the circle, or rotate at a variable speed, or control the circular filter to rotate clockwise around the center of the circle, or rotate counterclockwise, but the embodiment of the application is not limited thereto.
Optionally, when the control unit 230 includes a motor, the control unit 230 may further include a control chip, and the control chip may send a control instruction to the motor to control the motor to operate, so as to control the circular filter by the motor. As an example, the control chip may read instructions or code in the memory to effect the acquisition of the image.
The control chip may be different from the CPU or may be the CPU.
Therefore, the embodiment of the application can periodically change the spectrum on the image sensor positioned behind the circular optical filter by rotating the circular optical filter around the circle center, so that the image sensor periodically performs color frame exposure and black and white frame exposure.
Referring to fig. 4, a specific example of a circular filter is shown. The rectangular shadow area is an example of the photosensitive area of the image sensor; its two sides are b and c. The center of the circular filter is point O and its radius is R; AO and BO are the boundary lines between the two filter regions (i.e., the first fan-shaped area and the second fan-shaped area) of the circular filter.
Illustratively, the fan-shaped area OBCA is one example of a first fan-shaped area that allows visible light to pass through and does not allow infrared light to pass through. The fan-shaped area OBDA is one example of the second fan-shaped area, and allows infrared light to pass through, or may allow infrared light and visible light to pass through. For ease of understanding, the description below will be given taking the sector area OBCA as an example of allowing visible light to pass and not allowing infrared light to pass, the sector area OBDA allowing infrared light to pass, or may allow infrared light and visible light to pass.
Thus, color frame exposure may be performed when the entire area of the photosensitive area of the image sensor (i.e., the hatched area in fig. 4) is located within the sector area OBCA, and black and white frame exposure may be performed when the entire area or a portion of the area of the photosensitive area of the image sensor (i.e., the hatched area in fig. 4) is located within the sector area OBDA.
Typically, the radius of the circular filter is greater than the length of the photosensitive area of the image sensor. At this time, the photosensitive region of the image sensor may be disposed at a side away from the center of the circular filter, and the circular filter may be made to entirely cover the photosensitive region of the image sensor, such as the structure shown in fig. 4.
With continued reference to fig. 4, when the length of the photosensitive area of the image sensor is c and the width is b, the central angle β of the circular filter corresponding to the edge of the sensor with length b (i.e., the angle formed by the two end points G and F of the edge of the sensor with length b near the center O and the center O, i.e., the angle FOG) satisfies the following formula:
R≥c + b/(2sin(β/2)) (1)
From equation (1), β ≥ 2arcsin(b/(2(R − c)));
wherein R is the radius of the circular filter.
In some embodiments, the empirical value of c may be expressed as:
c≤R/3;
the process of periodically performing color frame exposure and black and white frame exposure of the image sensor with rotation of the circular filter is described below with reference to fig. 4 and 5.
As shown in fig. 4, assuming that the circular filter rotates clockwise around the center of the circle, the moment when OA intersects the vertex F of the shadow area may be taken as the start of one target image acquisition period. When OA has rotated clockwise through one revolution and again intersects the vertex F, the target image acquisition period ends. The image sensor may perform one black-and-white-frame exposure and one color-frame exposure during the target image acquisition period.
Specifically, when OA intersects with the vertex F of the shadow region (as shown in fig. 4), the infrared light can reach the image sensor (i.e., the photosensitive region of the image sensor) through the second filter region, and at this time or before that, the exposure of the color frame in the previous image acquisition cycle is ended.
Alternatively, black-and-white frame exposure may be started when OA intersects the vertex F of the shadow area, i.e., when the second fan-shaped area (OBDA) begins to overlap the photosensitive area of the image sensor. Alternatively, as the circular filter rotates, black-and-white frame exposure may be started when AO intersects the vertex G, i.e., when the fan-shaped area OBDA first completely covers the photosensitive area, or a first duration after that. Alternatively, black-and-white frame exposure may be started at any time between the intersection of OA with vertex F and the intersection of OA with vertex G.
Note that, in the period from when OA intersects vertex F to when OA intersects vertex G, the fan-shaped area OBDA does not yet completely cover the photosensitive area of the image sensor, so relatively less infrared light is gated than when OBDA completely covers the photosensitive area; the quality of a black-and-white frame exposed in this period is therefore poorer than when OBDA completely covers the photosensitive area.
As the circular filter continues to rotate, BO successively intersects the vertex F and the other vertex G of the shadow area. Alternatively, black-and-white frame exposure may be ended when BO intersects the vertex F, i.e., the moment the fan-shaped area OBDA ceases to completely cover the photosensitive area of the image sensor, or a second duration before that. Alternatively, as the circular filter rotates, black-and-white frame exposure may end when BO intersects the vertex G (as shown in fig. 5), i.e., when the first fan-shaped area (OBCA) completely covers the photosensitive area. Alternatively, black-and-white frame exposure may be ended at any time between the intersection of BO with vertex F and the intersection of BO with vertex G.
Illustratively, the frame number of the black-and-white frame obtained at this time may be denoted (2n−1). As the circular filter continues to rotate, color-frame exposure may be performed after BO intersects vertex G (i.e., the situation shown in fig. 5) until OA again intersects vertex F. For example, color-frame exposure may begin when BO intersects vertex G, or a third duration after that, and end a fourth duration before OA intersects vertex F. Illustratively, the frame number of the color frame obtained at this time may be denoted (2n), where n is an integer greater than or equal to 1.
Three specific examples of the image sensor periodically performing color-frame exposure and black-and-white-frame exposure are described below. The central angle of the fan-shaped area OBDA is γ, and the central angle of the fan-shaped area OBCA is (2π − γ). The filter rotates clockwise around the center O at a constant angular velocity ω.
First example: when AO just begins to intersect the vertex F of the shadow area (i.e., the photosensitive area of the image sensor), black-and-white frame exposure is started. When BO intersects the point G of the shadow area, black-and-white frame exposure ends and color frame exposure begins. When AO again intersects the vertex F of the shadow area, color frame exposure ends and the black-and-white frame exposure of the next cycle begins. In this case the black-and-white exposure angle is (γ + β), and the color exposure angle is (2π − γ − β), which can be denoted α (i.e., α = 2π − γ − β).
The following conditions are then satisfied:
α = ωt1;
γ + β = ωt2;
where the exposure time of the color frame is t1 and the exposure time of the black-and-white frame is t2. Optionally, t1 and t2 may be equal or unequal; the embodiments of the present application do not limit this.
When the exposure time of the color frame equals that of the black-and-white frame, i.e., t1 = t2 = t, then α = γ + β = π.
In some possible implementations, the exposure angle of the color frame may also be less than or equal to α, i.e., α ≥ ωt1, and the exposure angle of the black-and-white frame may be less than or equal to (γ + β), i.e., γ + β ≥ ωt2. When designing a circular filter, it is usually designed with α = ωt1 and γ + β = ωt2 to ensure compactness.
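A worked design under the first example's equalities (all numeric values here are assumptions for illustration, not values from the application):

```python
import math

# First example with t1 = t2: no dead angle, so omega*(t1 + t2) = 2*pi.
fps = 25.0
t1 = t2 = 1.0 / (2 * fps)        # 0.02 s exposures
omega = 2 * math.pi * fps        # 25 revolutions per second
alpha = omega * t1               # color exposure angle = pi
R, b, c = 24.0, 5.0, 7.0         # assumed wheel radius and sensor sides, mm
beta = 2 * math.asin(b / (2 * (R - c)))  # minimum beta from equation (1)
gamma = omega * t2 - beta        # infrared sector: gamma + beta = omega*t2
print(f"omega = {omega:.1f} rad/s, alpha = {math.degrees(alpha):.1f} deg, "
      f"beta = {math.degrees(beta):.1f} deg, "
      f"gamma = {math.degrees(gamma):.1f} deg")
# omega = 157.1 rad/s, alpha = 180.0 deg, beta = 16.9 deg, gamma = 163.1 deg
```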
A second example: when AO coincides with the central axis of the shadow area, black and white frame exposure is turned on. When BO coincides with the central axis of the shadow area, black and white frame exposure is ended. When BO intersects the G point of the shadow area, color frame exposure is started. When AO again intersects the vertex F of the shadow area, the color frame exposure is ended. When AO coincides again with the central axis of the shadow area, black and white frame exposure of the next cycle is started. At this time, the black-and-white frame exposure angle is γ, and the color frame exposure angle is still α.
Here, in the process in which AO intersects with the vertex F until AO coincides with the central axis of the shadow area (corresponding to the (β/2) angle of the sector area OBCA near OA), neither black-and-white frame exposure nor color light exposure is performed. Similarly, in the process where BO coincides with the central axis of the shadow region to where BO intersects with the vertex F (corresponding to the (β/2) angle near OB of the sector-shaped region OBCA), neither black-and-white frame exposure nor color frame exposure is performed.
At this time, the following conditions are satisfied:
α=ωt 1
γ=ωt 2
wherein the exposure time of the color frame can be t 1 The exposure time of the black-and-white frame can be t 2 . Alternatively, t 1 And t 2 May be equal or unequal, and embodiments of the present application are not limited in this regard.
When the exposure time of the color frame is equal to that of the black-and-white frame, i.e. t 1 =t 2 When=t, α=γ=pi- (β/2).
That is, β=2 (pi- α) =2 (pi- ωt), at which time the above formula (1) may be deformed into the following formula:
R≥c+b/(2sin((π–ωt)));
in some possible implementations, the exposure angle of the color frame may also be less than or equal to α, i.e., α+.ωt 1 The exposure angle of the black-and-white frame can also be less than or equal to gamma, i.e. gamma is more than or equal to ωt 2 . In the case of designing a circular filter, it is usually designed to be α=ωt to ensure compactness 1 ,γ=ωt 2
I.e. when t 1 =t 2 When t is equal to or greater than ωt+2arcsin (b/(2 (R-c))) the central angle (i.e., α+β) corresponding to the first sector.
Third example: black-and-white frame exposure begins when AO intersects the vertex G of the shadow area. When BO intersects the vertex F of the shadow area, black-and-white frame exposure ends. When BO intersects the point G of the shadow area, color frame exposure begins. When AO again intersects the vertex F of the shadow area, color frame exposure ends. When AO again intersects the vertex G of the shadow area, the black-and-white frame exposure of the next cycle begins. In this case the black-and-white exposure angle is (γ − β), and the color exposure angle is still α.
Here, while AO sweeps from the vertex F to the vertex G (the β angle of the sector area OBDA near OA), neither black-and-white frame exposure nor color frame exposure is performed. Similarly, while BO sweeps from the vertex F to the vertex G (the β angle of the sector area OBCA near OB), neither black-and-white frame exposure nor color frame exposure is performed.
The following conditions are then satisfied:
α = ωt1;
γ − β = ωt2;
where the exposure time of the color frame is t1 and the exposure time of the black-and-white frame is t2. Optionally, t1 and t2 may be equal or unequal; the embodiments of the present application do not limit this.
When the exposure time of the color frame equals that of the black-and-white frame, i.e., t1 = t2 = t, then α = π − β and γ = π.
That is, β = π − α = π − ωt, and equation (1) can then be rewritten as:
R ≥ c + b/(2sin((π − ωt)/2));
In some possible implementations, the exposure angle of the color frame may also be less than or equal to α, i.e., α ≥ ωt1, and the exposure angle of the black-and-white frame may be less than or equal to (γ − β), i.e., γ − β ≥ ωt2. When designing a circular filter, it is usually designed with α = ωt1 and γ − β = ωt2 to ensure compactness.
That is, when t1 = t2 = t, the central angle corresponding to the first sector area (i.e., α + β) is greater than or equal to ωt + 2arcsin(b/(2(R − c))), and/or the central angle corresponding to the second sector area (i.e., γ) is greater than or equal to ωt + 2arcsin(b/(2(R − c))).
The greater the rotation angular velocity ω of the circular filter, the smaller the exposure time t1 or t2.
In some alternative embodiments, the position of OA may be detected by a Hall sensor. As one implementation, a magnet may be disposed at position A on the optical filter, and a Hall sensor may be disposed at a first position, corresponding to position A in fig. 4, on the image sensor side. When the Hall sensor detects the maximum magnetic field strength, it can be determined that OA intersects the vertex F.
When the filter rotates at a constant speed, black-and-white-frame exposure and color-frame exposure can be controlled by setting a timer. As a possible implementation, timing may start from the moment OA intersects the vertex F. As a specific example, for the first example above, black-and-white-frame exposure may be started at time 0 and ended at time t2, at which point color-frame exposure begins. After a further duration t1, OA again intersects the vertex F, and the timer is cleared and restarted.
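A sketch of this Hall-sensor-plus-timer control for the first example follows; the wait_for_peak(), start_exposure() and stop_exposure() interfaces are hypothetical:

```python
import time

def run_revolution(hall_sensor, sensor, t1, t2):
    """Timer-driven exposure control for the first example (sketch).

    hall_sensor.wait_for_peak() is assumed to block until the magnet at
    point A passes the Hall sensor, i.e. until OA intersects vertex F.
    """
    hall_sensor.wait_for_peak()     # time 0: OA intersects vertex F
    sensor.start_exposure("bw")     # black-and-white frame begins
    time.sleep(t2)
    sensor.stop_exposure()          # time t2: black-and-white frame ends
    sensor.start_exposure("color")  # color frame begins
    time.sleep(t1)
    sensor.stop_exposure()          # time t2 + t1: OA meets vertex F again;
                                    # the timer clears and the cycle repeats
```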
Mode two
The control unit 230 may include a driver and a lever. A first end of the lever is connected to the optical filter 210, and the driver is used for controlling, through the lever, the movement of the optical filter 210 so that the first filter region 2101 covers the photosensitive area of the image sensor during the first image sampling interval and the second filter region 2102 covers the photosensitive area during the second image sampling interval.
Alternatively, the second end of the lever may be fixed and act as a fulcrum; the driver then moves the optical filter 210 connected to the first end by controlling the movement of a first position on the lever. Alternatively, a second position between the two ends of the lever may be fixed and act as a fulcrum; the driver then moves the optical filter 210 connected to the first end by controlling the movement of the second end of the lever.
As one example, the driver may be a piezoelectric (piezo) ceramic driver, but embodiments of the present application are not limited thereto.
Optionally, when the control unit 230 includes a driver and a lever, the control unit 230 may further include a control chip, and the control chip may send a control instruction to the driver to control the driver to operate, so as to control the optical filter through the lever. As an example, the control chip may read instructions or code in a memory to effect the acquisition of the image.
Referring to fig. 6, a specific example of the connection between the lever and the filter is shown. Illustratively, JK is the boundary between the first filter region 2101 and the second filter region 2102. The lever 2402 has a first end J connected to the filter 210 and a fixed second end H. The second end H may be fixed by a bearing, for example. As a specific implementation, the driver 2401 may include a linear motion portion coupled to the lever 2402 at point I through a bearing. As shown in fig. 6, the distance e by which the first end J of the lever 2402 moves up and down and the distance f by which the linear motion portion moves up and down satisfy the following relationship:
e/f = (L1 + L2)/L1;
Where L1 represents the distance between the end point of the second end H and the point I, and L2 represents the distance between the end point of the first end J and the point I.
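Assuming the reconstruction of the lever relation above (e/f = (L1 + L2)/L1, with H as the fulcrum), the filter displacement follows directly from the driver stroke; the numbers below are illustrative only:

```python
def filter_displacement(f, L1, L2):
    # Displacement e of the first end J for a driver stroke f at point I,
    # with the second end H fixed as the fulcrum (similar triangles).
    return f * (L1 + L2) / L1

# A 0.1 mm stroke of the linear motion portion, amplified by the lever:
e = filter_displacement(0.1e-3, L1=5e-3, L2=15e-3)
print(f"filter moves {e * 1e3:.1f} mm")   # 0.4 mm
```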
With continued reference to fig. 6, the filter moves around the fixed end H under the pushing of the driver 2401. Illustratively, when the filter moves from bottom to top and JK intersects one vertex M of the shaded region (as shown in fig. 7), infrared light can pass through the second filter region 2102 to the image sensor, at which point color frame exposure stops. Alternatively, black-and-white frame exposure may start at this point. As the filter continues to move upward, it will reach its highest point.
Alternatively, as shown in fig. 8, the second filter region 2102 can still be located in front of the shaded region (i.e., the image sensor) when the filter reaches the highest point. In some implementations, by controlling the lengths of L1 and L2 and the displacement of the linear motion portion, the second filter region 2102 can still be located in front of the image sensor when the filter reaches the highest point.
After the filter reaches the highest point, it begins to swing downward. As the filter swings, JK will intersect the vertex M again (as shown in fig. 9); thereafter the second filter region 2102 no longer overlaps the shaded portion, and black-and-white frame exposure may be ended. The frame number of the black-and-white frame obtained at this point can be noted as (2n − 1). After the black-and-white frame exposure is completed, color frame exposure may begin as the filter continues to swing downward.
Alternatively, the first filter region 2101 can still be located in front of the image sensor when the filter reaches the lowest point. In some implementations, by controlling the lengths of L1 and L2 and the displacement of the linear motion portion, the first filter region 2101 can still be located in front of the image sensor when the filter reaches the lowest point.
After the filter reaches the lowest point, it starts to move upward. As the filter moves, the color frame exposure ends when JK intersects the vertex M again. Illustratively, the frame number of the color frame obtained at this point may be noted as 2n.
It should be understood that "up", "down", and "back" in the embodiments of the present application are used only to help those skilled in the art understand the movement of the lever or the relative positions of the image sensor and the optical filter, and are not intended to limit the technical solutions of the embodiments of the present application.
Therefore, in the embodiment of the present application, the driver actuates the lever connected to the optical filter so that the filter at the end of the lever moves, the spectrum reaching the image sensor located behind the filter changes periodically, and the image sensor is periodically exposed to color frames and black-and-white frames.
The embodiments of the present application also provide another specific implementation of the optical filter, shown in the following mode three.
Mode three
The filter 210 may include an electronically controlled light absorbing material. In this case, the control unit 230 may be specifically configured to apply a voltage to the optical filter 210 and control the magnitude of the voltage such that the optical filter 210 passes visible light and blocks infrared light in the first time period, and passes infrared light in the second time period.
In some alternative embodiments, the electronically controlled light absorbing material comprises an organic color-shifting material or a liquid crystal material. The response time of the electronically controlled light absorbing material can thus reach the millisecond level, allowing fast response to the switching between black-and-white frame exposure and color frame exposure.
Referring to fig. 10, a specific example of the filter 210 is shown. The electronically controlled light absorbing material may be disposed between two layers of transparent electrodes; for example, the material may be sandwiched between the two transparent electrode layers. In this way, the control unit 230 may, by varying the voltage applied between the transparent electrodes, switch the absorption peak of the light absorbing material between the infrared region and the ultraviolet region, thereby implementing the spectrum gating function.
Illustratively, when the control unit 230 controls the light absorbing material to gate infrared light, the infrared light may pass through the filter 210 to the image sensor, at which point black-and-white frame exposure may be performed. The frame number of the black-and-white frame obtained at this point can be noted as (2n − 1). When the control unit 230 controls the light absorbing material to gate visible light (or visible light and infrared light), color frame exposure may be performed. The frame number of the color frame obtained at this point can be noted as 2n.
In addition, in some alternative embodiments, glass may also be provided on the outside of the transparent electrode (i.e., the side remote from the electronically controlled light absorbing material).
Therefore, in the embodiment of the present application, an electronically controlled light absorbing material is disposed in the optical filter, and the absorption peak of the material can be changed periodically by varying the voltage applied to the filter, so that the spectrum reaching the image sensor located behind the filter changes periodically and the image sensor is periodically subjected to color frame exposure and black-and-white frame exposure.
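A minimal control-loop sketch of mode three follows. The voltage levels and the filter_drive/sensor interfaces are assumptions for illustration; the patent only specifies that the applied voltage switches the absorption peak between the infrared and ultraviolet regions.

```python
def voltage_gating_loop(filter_drive, sensor, t1, t2,
                        v_gate_visible=5.0, v_gate_infrared=0.0):
    """Alternate the filter's absorption peak by voltage and expose
    matching frames; v_gate_* are hypothetical drive levels."""
    n = 1
    while True:
        filter_drive.set_voltage(v_gate_infrared)   # IR passes to the sensor
        mono = sensor.expose("mono", t2)            # frame number 2n - 1
        filter_drive.set_voltage(v_gate_visible)    # visible passes, IR blocked
        color = sensor.expose("color", t1)          # frame number 2n
        yield (2 * n - 1, mono), (2 * n, color)
        n += 1
```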
Fig. 11 shows a schematic diagram of another image acquisition apparatus 300 provided in an embodiment of the present application. As shown in fig. 11, the apparatus 300 includes an optical unit 301, an infrared light supplementing unit 302, a tunable filter 303, an image sensor 304, an image processing unit 305, a filter control unit 306, and an image synthesizing unit 307. The tunable filter 303 may be an example of the above filter 210, the image sensor 304 may be an example of the above image sensor 220, the filter control unit 306 may be an example of the above control unit 230, and the image synthesizing unit 307 may be an example of the above image synthesizing unit 240.
It should be understood that fig. 11 shows modules or units of the image capturing device, but these modules or units are only examples, and the image capturing device of the embodiments of the present application may also include other modules or units, or include variations of the respective modules or units in fig. 11. Further, the image acquisition apparatus in fig. 11 may not necessarily include all of the modules or units in fig. 11.
The optical unit 301 is configured to capture incident light and image it on the image sensor 304. Illustratively, the primary component of the optical unit is an optical lens, which is used for optical imaging. As an example, the optical lens may be a lens confocal for visible light and infrared light. Optionally, the optical unit 301 may further include a filter, mainly used for polarizing light and the like.
It should be noted that light is transmitted between the optical unit 301 and the image sensor 304. In some possible descriptions, this transmission of light may be regarded as a transmission of data; in others it may not. The embodiments of the present application are not limited in this regard.
It should be further noted that the optical filters described elsewhere in the embodiments of the present application refer to optical filters for visible light gating or infrared light gating, such as the optical filters in fig. 2 and 3 and the tunable filter shown in fig. 11.
The tunable filter 303 may be disposed before or after the optical lens, which is not limited here. In addition, when the optical lens includes at least two lenses, the tunable filter 303 may be disposed between two of the at least two lenses (in this case, the tunable filter 303 may be said to be disposed in the lens).
When the tunable filter 303 is disposed before the optical lens, the incident light is first gated by the tunable filter 303, and the optical lens then images the gated visible light or infrared light on the image sensor 304. When the tunable filter 303 is disposed behind the optical lens, the incident light passes through the optical lens, the tunable filter 303 gates the visible light or infrared light therein, and the light passing through the filter is imaged on the image sensor 304. When the tunable filter 303 is disposed in the optical lens, the incident light is gated to visible light or infrared light by the tunable filter 303 while passing through the optical lens, and the light is imaged on the image sensor 304 after exiting the lens.
The infrared light supplementing unit 302 is used to supplement light in the subject environment under low illuminance. As an example, an infrared light-emitting diode (light emitting diode, LED) may be employed, or another light-emitting device may be employed. The center wavelength of the infrared light supplementing unit 302 may be 750 nm, 850 nm, or 950 nm, which is not limited in the embodiments of the present application.
In some possible implementations, the on/off state, illumination intensity, or center wavelength of the infrared light supplementing unit 302 may be controlled by a switching value or a level signal.
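As an illustration of such control (the pin assignments and the gpio interface below are hypothetical, not part of the patent):

```python
def set_ir_fill_light(gpio, on, intensity_duty=None):
    # On/off by a switching value; intensity optionally by a level/PWM signal.
    gpio.write(pin=17, value=1 if on else 0)
    if intensity_duty is not None:
        gpio.pwm(pin=18, duty=intensity_duty)   # 0.0 .. 1.0 illumination level
```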
The image processing unit 305 is configured to process the digital image signal acquired by the image sensor 304, performing at least one of demosaicing, automatic exposure, automatic white balancing, and the like. As an example, the image processing unit 305 may be implemented by dedicated hardware, such as an image signal processor (image signal processor, ISP), which is not limited in the embodiments of the present application.
The filter control unit 306 is configured to control the tunable filter 303 such that the tunable filter 303 gates the visible light in the incident light and blocks the infrared light at the first image sampling interval, and gates the infrared light, or the infrared light and the visible light, in the incident light at the second image sampling interval.
For example, the filter control unit 306 may send a control signal to the tunable filter 303 such that the tunable filter 303 gates the visible light in the incident light and blocks the infrared light at the first image sampling interval, and gates the infrared light, or the infrared light and the visible light, at the second image sampling interval.
Alternatively, the tunable filter 303 may send a feedback signal about its state to the filter control unit 306, so that the filter control unit 306 knows the gating status of the tunable filter 303.
In some alternative embodiments, the filter control unit 306 may control the initiation and termination of the exposure of the image sensor 304 according to the light gating conditions of the tunable filter 303. That is, the filter control unit 306 may control the synchronization of the light gating of the tunable filter 303 with the exposure of the image sensor 304.
For example, when the filter control unit 306 outputs a control signal to switch the tunable filter 303 to gate visible light and block infrared light, the same control signal may also trigger the image sensor 304 to perform color frame exposure. When the filter control unit 306 outputs a control signal to switch the tunable filter 303 to gate infrared light, the control signal may also trigger the image sensor 304 to perform black-and-white frame exposure. In this way, the embodiments of the present application synchronize the switching of the tunable filter 303 with the exposure of the image sensor 304.
Alternatively, after the exposure time is satisfied, the filter control unit 306 may output one or more control signals to stop the exposure of the image sensor 304.
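The synchronization can be pictured as one control signal fanning out to both the tunable filter and the image sensor. A sketch, with all interface names assumed:

```python
def sample_interval(ctrl, tunable_filter, sensor, mode, t_exp):
    """Switch the filter and trigger the matching exposure with the same
    control signal, then stop once the exposure time t_exp is satisfied."""
    signal = ctrl.make_signal(mode)     # mode: "color" or "mono"
    tunable_filter.apply(signal)        # gate visible light, or infrared
    sensor.start_exposure(signal)       # the same signal triggers exposure
    ctrl.wait(t_exp)
    sensor.stop_exposure()              # one or more stop control signals

def acquisition_period(ctrl, tunable_filter, sensor, t1, t2):
    sample_interval(ctrl, tunable_filter, sensor, "color", t1)  # first image
    sample_interval(ctrl, tunable_filter, sensor, "mono", t2)   # second image
```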
In some alternative embodiments, the filter control unit 306 may control the image synthesizing unit to perform image synthesis according to the light gating status of the tunable filter 303. For example, when the tunable filter 303 is a color wheel, each time the color wheel completes one revolution, the image synthesizing unit is controlled to synthesize the first image and the second image acquired in that period to obtain the target image.
Specifically, as the tunable filter 303 switches, the image sensor 304 periodically and alternately outputs a first image including chrominance information and a second image including luminance information. As an example, the image sensor 304 may sequentially output one frame of the first image and one frame of the second image in one image acquisition period, where the frame numbers of the first image and the second image in one period are adjacent. As a specific example, one image acquisition period may include a second image of the (2n − 1)th frame (denoted I(2n − 1)) and a first image of the 2nth frame (denoted I(2n)), where n is an integer greater than or equal to 1.
In some alternative embodiments, after the first image and the second image in one image acquisition period are processed by the image processing unit 305, the filter control unit 306 controls the image synthesizing unit 307 to synthesize the two frames of images in that image acquisition period to obtain the target image.
Taking the Y (luminance) UV (chrominance) color space as an example, the UV component of the target image may take the UV component of the even-numbered frame (i.e., the first image), and the Y component of the target image may take the Y component of the odd-numbered frame (i.e., the second image). The UV component represents the chrominance information of the image, and the Y component represents the luminance information of the image.
Illustratively, the target image may be denoted as I′; the target image I′(n) of the nth frame may then be expressed as:
I′_Y(n) = I_Y(2n − 1)    (2)
I′_UV(n) = I_UV(2n)    (3)
where I′_Y(n) represents the Y component of the target image with frame number n, I_Y(2n − 1) represents the Y component of the second image with frame number (2n − 1), I′_UV(n) represents the UV component of the target image with frame number n, and I_UV(2n) represents the UV component of the first image with frame number 2n.
As is clear from the above formulas (2) and (3), the second image of the (2n − 1)th frame and the first image of the 2nth frame can be synthesized by the image synthesizing unit 307 to obtain the target image with frame number n.
Fig. 12 shows a schematic diagram of the image synthesizing unit 307 synthesizing the first image and the second image in the YUV color space. The second image with frame number 1 and the first image with frame number 2 are the two frames output by the image sensor 304 in the first image acquisition period. The image synthesizing unit 307 may synthesize these two frames into a target image with frame number 1, whose Y component comes from the Y component of the second image with frame number 1 and whose UV component comes from the UV component of the first image with frame number 2. Similarly, the image synthesizing unit 307 may synthesize the second image with frame number 3 and the first image with frame number 4 in the second image acquisition period to obtain the target image with frame number 2, whose Y component comes from the Y component of the second image with frame number 3 and whose UV component comes from the UV component of the first image with frame number 4.
It should be understood that fig. 12 only shows an example of the first image acquisition period and the second image acquisition period, and after the second image acquisition period, the image sensor may acquire the first image and the second image in a subsequent image acquisition period, and the manner of image synthesis in the subsequent image acquisition period may refer to the description in the first image acquisition period or the second image acquisition period, which is not repeated herein for brevity.
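The per-period synthesis of formulas (2) and (3) reduces to picking components from adjacent frames. A sketch in YUV (the frame representation below is an assumption; real pipelines would operate on planar buffers):

```python
def synthesize_targets(frames):
    """frames[k] is the YUV image with frame number k + 1: odd frame
    numbers (2n - 1) are second images (luminance), even frame numbers
    (2n) are first images (chrominance).  Implements formulas (2), (3)."""
    targets = []
    for n in range(1, len(frames) // 2 + 1):
        second = frames[2 * n - 2]              # frame number 2n - 1
        first = frames[2 * n - 1]               # frame number 2n
        targets.append({"Y": second["Y"],       # I'_Y(n) = I_Y(2n - 1)
                        "UV": first["UV"]})     # I'_UV(n) = I_UV(2n)
    return targets

# Example: four sensor frames yield target images 1 and 2, as in fig. 12.
```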
As an example, a hall sensor and a timer may be provided in the filter control unit 306 to monitor position information of the tunable filter 303 and transmit control signals to the tunable filter 303, the image sensor, and the image synthesizing unit according to the position information and the timer information.
In some possible implementations, the control unit 230 is further configured to determine the ambient illuminance of the subject environment according to the gain of the image sensor 220. In actual use of the image capturing apparatus 200, the gain of the image sensor 220 gradually increases as the illuminance of the subject environment decreases. When the gain of the image sensor 220 is greater than a preset value, the subject environment may be determined to be a low-illuminance environment. At this point, the filter may be controlled to pass visible light and block infrared light at the first image sampling interval and to pass infrared light at the second image sampling interval. As an example, the preset value may be 36 dB.
In some alternative embodiments, the control unit 230 is further configured to, when determining that the gain of the image sensor 220 is less than or equal to the preset value, control the image sensor 220 to photoelectrically image visible light at a third image sampling interval to obtain the target image. In this case, within one target image acquisition period, the target image may be obtained directly in the third image sampling interval.
Here, the third image sampling interval may be equal to the sum of the first image sampling interval and the second image sampling interval. For example, assuming that the image acquisition device requires 10 milliseconds (ms) to output an image, the first image sampling interval plus the second image sampling interval should be 10 ms, together producing one output image, while the third image sampling interval is itself 10 ms and alone produces one output image.
That is, the output frame rate of the image sensor at the third image sampling interval is half of its output frame rate at the first or second image sampling interval; in other words, the output frame rate of the image sensor when the gain is less than or equal to the preset value (i.e., in a normal environment) is half of its output frame rate when the gain is greater than the preset value (i.e., in a low-illuminance environment).
Illustratively, for the apparatus 300, the filter control unit 306 controls the output frame rate of the image sensor 304 at the first or second image sampling interval to be twice the output frame rate of the image sensor 304 at the third image sampling interval.
Thus, after the image synthesizing unit synthesizes the two frames acquired by the image sensor into one frame, the output frame rate of the image acquisition device in a low-illuminance environment can remain the same as its output frame rate in a normal environment.
As an example, under normal ambient illuminance (e.g., when the gain of the image sensor is less than 36 dB), the output frame rate of the image sensor is 25 frames/second. When the subject environment is detected to be a low-illuminance environment (e.g., when the gain of the image sensor reaches 36 dB), the output frame rate of the image sensor may be changed to 50 frames/second.
In some alternative embodiments, the control unit 230 may shift the optical filter out of the optical path at the third image sampling interval, which is not limited in the embodiments of the present application.
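Putting the gain test and the frame-rate change together (the threshold and rates come from the examples above; the sensor interfaces are hypothetical):

```python
PRESET_GAIN_DB = 36.0     # preset value from the example above

def select_capture_mode(sensor):
    """Treat the scene as low-illuminance when the gain exceeds the
    preset value; double the sensor frame rate so the synthesized
    output rate stays unchanged."""
    if sensor.gain_db() > PRESET_GAIN_DB:
        sensor.set_frame_rate(50)   # two sensor frames per target image
        return "alternating"        # first + second image sampling intervals
    sensor.set_frame_rate(25)       # one sensor frame per target image
    return "direct"                 # third image sampling interval
```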
Therefore, in the embodiments of the present application, a first image including chrominance information is obtained by photoelectrically imaging the light passing through the optical filter at the first image sampling interval, a second image including luminance information is obtained by photoelectrically imaging the light passing through the optical filter at the second image sampling interval, and the first image and the second image are then synthesized to obtain a high-quality color target image. Because the image sensor in the embodiments of the present application can acquire the first image and the second image in different time periods, a single sensor suffices, which reduces material cost, eliminates the need to register multiple image sensors, reduces complexity, and helps reduce device volume.
Fig. 13 shows a schematic flow chart of an image acquisition method 400 provided in an embodiment of the present application. The method 400 may be performed by the image acquisition apparatus 200 of fig. 2 above, or by the image acquisition apparatus 300 of fig. 11. As shown in fig. 13, method 400 includes steps 410 through 440.
It should be noted that fig. 13 shows steps or operations of the image acquisition method, but these steps or operations are merely examples; embodiments of the present application may also perform other operations or variations of the operations in fig. 13. Further, not all operations in fig. 13 necessarily need to be performed.
Optionally, 410, the ambient illuminance is estimated.
For example, the ambient illuminance may be estimated based on the gain of the image sensor. It may then be determined whether to perform step 420 based on the estimated ambient illuminance.
During image acquisition, the gain of the image sensor gradually increases as the ambient illuminance decreases. For example, when the gain of the image sensor is lower than 36dB, it can be determined that the subject environment is a normal illuminance environment. When the gain of the image sensor is higher than 36dB, it can be determined that the subject environment is a low-illuminance environment.
When the subject environment is determined to be a normal-illuminance environment, the optical filter in fig. 2 or 3 may be moved out of the optical path, and the image obtained by the image sensor photoelectrically imaging visible light may be output, displayed, or saved. As an example, the output frame rate of the image sensor at this point may be 25 frames/second.
When it is determined that the subject environment is a low-illuminance environment, step 420 may be performed.
As an example, this step 410 may be performed by the control unit 230 in fig. 2, or the step 410 may be performed by the filter control unit 306 in fig. 11.
420, acquiring a first image and a second image.
Specifically, visible light passing through the filter is acquired at the first image sampling interval, and infrared light passing through the filter is acquired at the second image sampling interval. For the filter, reference may be made to the above description; for brevity, details are not repeated here.
Then, the light passing through the filter at the first image sampling interval is photoelectrically imaged by the image sensor to obtain the first image, and the light passing through the filter at the second image sampling interval is photoelectrically imaged by the image sensor to obtain the second image.
In some embodiments, after the subject environment is determined to be a low-illuminance environment, the output frame rate of the image sensor may be changed from 25 frames/second to 50 frames/second; that is, the acquisition frame rate of the image acquisition device is doubled, ensuring that the output frame rate of the image acquisition device in the low-illuminance environment is the same as that in the normal-illuminance environment.
In some embodiments, when the filter is a circular filter, the motor may be started before step 420 to rotate the circular filter. When the rotational speed of the circular filter reaches the set speed, exposure of the black-and-white frames and color frames begins, and the first image and the second image are obtained.
As an example, the filter and the image sensor may be controlled by the control unit 230 in fig. 2 to acquire the first image and the second image, or the tunable filter and the image sensor may be controlled by the filter control unit in fig. 11 to acquire the first image and the second image. For the specific process of acquiring the first image and the second image, reference may be made to the above description; for brevity, details are not repeated here.
Optionally, after step 420, basic image processing may be performed on the acquired first and second images, for example demosaicing, white balancing, and color correction, which is not limited in this embodiment.
430, synthesizing the first image and the second image to generate the target image.
As an example, the image synthesizing unit may be controlled by the control unit 230 to perform this step 430. For the synthesis process, reference may be made to the above description; for brevity, details are not repeated here.
Optionally, 440, the target image is transmitted, displayed or stored.
For example, the target image may be output to a related service module, and the related service module performs operations such as transmission, display, or storage on the synthesized image according to the service application.
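Tying steps 410 through 440 together, a high-level sketch of method 400 follows; every interface name here is an assumption, and only the control flow reflects the text:

```python
def method_400(device):
    while True:
        if device.sensor.gain_db() <= 36.0:          # 410: normal illuminance
            device.move_filter_out_of_path()
            target = device.sensor.capture_color()
        else:                                        # low illuminance
            device.sensor.set_frame_rate(50)         # keep output rate constant
            first = device.capture("visible")        # 420: first image
            second = device.capture("infrared")      #      second image
            first, second = device.preprocess(first, second)  # demosaic, WB, ...
            target = device.synthesize(first, second)         # 430
        device.output(target)                        # 440: transmit/display/store
```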
Therefore, in the embodiments of the present application, a first image including chrominance information is obtained by photoelectrically imaging the light passing through the optical filter at the first image sampling interval, a second image including luminance information is obtained by photoelectrically imaging the light passing through the optical filter at the second image sampling interval, and the first image and the second image are then synthesized to obtain a high-quality color target image. Because the image sensor in the embodiments of the present application can acquire the first image and the second image in different time periods, a single sensor suffices, which reduces material cost, eliminates the need to register multiple image sensors, reduces complexity, and helps reduce device volume.
It should be noted that the examples in the embodiments of the present application are merely intended to help those skilled in the art understand and implement the embodiments of the present application, not to limit their scope. Those skilled in the art can make equivalent changes and modifications based on the examples given herein, and such changes and modifications shall still fall within the scope of the embodiments of the present application.
It should also be noted that, in the various embodiments of the present invention, the sequence numbers of the foregoing processes do not imply an order of execution; the execution order of the processes should be determined by their functions and internal logic, and the sequence numbers should not constitute any limitation on the implementation of the embodiments of the present invention.
The present application also provides a computer-readable storage medium including a computer program which, when run on a computer, causes the computer to perform the relevant steps of the methods provided in the above embodiments.
Embodiments of the present application also provide a computer program product including instructions which, when run on a computer, cause the computer to perform the relevant steps of the methods provided in the above embodiments.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
It will be clear to those skilled in the art that, for convenience and brevity of description, specific working procedures of the above-described systems, apparatuses and units may refer to corresponding procedures in the foregoing method embodiments, and are not repeated herein.
In the several embodiments provided in this application, it should be understood that the disclosed systems, devices, and methods may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative, e.g., the division of the units is merely a logical function division, and there may be additional divisions when actually implemented, e.g., multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, which may be in electrical, mechanical or other form.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in each embodiment of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present application may be embodied essentially or in a part contributing to the prior art or in a part of the technical solution, in the form of a software product stored in a storage medium, including several instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to perform all or part of the steps of the methods described in the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a random access Memory (Random Access Memory, RAM), a magnetic disk, or an optical disk, or other various media capable of storing program codes.
The foregoing is merely specific embodiments of the present application, but the scope of the present application is not limited thereto, and any person skilled in the art can easily think about changes or substitutions within the technical scope of the present application, and the changes and substitutions are intended to be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (21)

1. An image acquisition apparatus, comprising:
the optical filter is used for gating incident light;
the image sensor is used for carrying out photoelectric imaging on the light rays passing through the optical filter in the incident light;
the control unit is connected with the optical filter and the image sensor and is used for controlling the optical filter to pass visible light in the incident light at a first image sampling interval and block infrared light in the incident light from passing, and pass infrared light in the incident light at a second image sampling interval; and
the image sensor is used for controlling the image sensor to perform photoelectric imaging on the light rays passing through the optical filter at the first image sampling interval in the incident light to obtain a first image, and performing photoelectric imaging on the light rays passing through the optical filter at the second image sampling interval in the incident light to obtain a second image; and
for controlling the image sensor to end the photo-electric imaging at a fourth image sampling interval;
an image synthesis unit, configured to synthesize the first image and the second image, and generate a first target image;
the fourth image sampling interval is a time period except a first image sampling interval and a second image sampling interval in the acquisition period of the first target image;
The optical filter comprises a first optical filtering area and a second optical filtering area, wherein the first optical filtering area is used for passing visible light and blocking infrared light from passing, and the second optical filtering area is used for passing infrared light;
the control unit is specifically configured to filter the incident light using the first filtering region at the first image sampling interval, and filter the incident light using the second filtering region at the second image sampling interval;
the optical filter is a circular optical filter, the first optical filtering area is a first fan-shaped area of the circular optical filter, and the second optical filtering area is a second fan-shaped area of the circular optical filter;
the control unit comprises a motor, wherein the motor is used for controlling the circular optical filter to rotate around the circle center of the circular optical filter, so that the first fan-shaped area covers the photosensitive area of the image sensor at the first image sampling interval, and the second fan-shaped area covers the photosensitive area at the second image sampling interval;
the photosensitive area is a rectangular area, and the circular filter meets the following first condition or second condition under the condition that the duration of the first image sampling interval is the same as the duration of the second image sampling interval:
The first condition is: r is greater than or equal to c+b/(2 sin ((pi- ωt)/2));
the second condition is: r is more than or equal to c+b/(2 sin (pi- ωt));
wherein ω represents an angular velocity of rotation of the circular filter, t represents a duration of the first image sampling interval or the second image sampling interval, R represents a radius of the circular filter, b represents a first side length of a photosensitive region of the image sensor, c represents a second side length of the photosensitive region of the image sensor, and the first side and the second side are perpendicular to each other.
2. The apparatus of claim 1, wherein when the circular filter satisfies the first condition, the central angle corresponding to the first sector area is greater than or equal to ωt + 2arcsin(b/(2(R − c))), or the central angle corresponding to the second sector area is greater than or equal to ωt + 2arcsin(b/(2(R − c))).
3. The apparatus of claim 1, wherein when the circular filter satisfies the second condition, the central angle corresponding to the first sector area is greater than or equal to ωt + 2arcsin(b/(2(R − c))).
4. A device according to any one of claims 1 to 3, wherein the photosensitive area is a rectangular area, and the circular filter satisfies the following condition:
R ≥ c + b/(2 sin(β/2));
wherein R represents the radius of the circular filter, b represents the first side length of the photosensitive area of the image sensor, c represents the second side length of the photosensitive area of the image sensor, β is the central angle of the circular filter opposite to the first side, and the first side and the second side are perpendicular to each other.
5. The apparatus according to claim 4, wherein the angular velocity of rotation of the circular filter is ω, the central angle of the first sector in the circular filter is (2π − γ), the central angle of the second sector is γ, the exposure time of the first image is t1, and the exposure time of the second image is t2;
ω, γ, t1 and t2 satisfy the following formulas:
2π − γ − β ≥ ωt1;
γ + β ≥ ωt2, or γ ≥ ωt2, or γ − β ≥ ωt2.
6. The apparatus of claim 1, wherein the control unit includes a driver and a lever, a first end of the lever being connected to the optical filter, the driver being configured to control movement of the optical filter by the lever such that the first filter region covers a photosensitive region of the image sensor at the first image sampling interval, and the second filter region covers the photosensitive region at the second image sampling interval.
7. The device of claim 1, wherein the optical filter comprises an electronically controlled light absorbing material,
the control unit is specifically configured to apply a voltage to the optical filter, and control the voltage so that the optical filter passes visible light and blocks infrared light at the first image sampling interval, and passes infrared light at the second image sampling interval.
8. The device of claim 7, wherein the electronically controlled light absorbing material comprises an organic color shifting material or a liquid crystal material.
9. The apparatus according to any one of claims 1-3, 5-8, further comprising:
an optical unit including an optical lens for capturing the incident light and imaging the incident light on the image sensor, wherein the optical filter is disposed before or after the optical lens or between two lenses in the optical lens.
10. The device according to any one of claims 1-3, 5-8, characterized in that the control unit is specifically configured to:
and under the condition that the gain of the image sensor is larger than a preset value, controlling the optical filter to pass visible light and block infrared light in the first image sampling interval and pass infrared light in the second image sampling interval, and controlling the image sensor to perform photoelectric imaging on light passing through the optical filter in the first image sampling interval and perform photoelectric imaging on light passing through the optical filter in the second image sampling interval.
11. The apparatus according to any one of claims 1-3, 5-8, wherein the control unit is further configured to:
when the gain of the image sensor is smaller than or equal to a preset value, controlling the image sensor to perform photoelectric imaging on incident light in a third image sampling interval to acquire a second target image, wherein the duration of the third image sampling interval is equal to the sum of the duration of the first image sampling interval and the duration of the second image sampling interval.
12. An image acquisition method, comprising:
controlling the filter to pass visible light in the incident light and block infrared light from passing at a first image sampling interval, and controlling the filter to pass infrared light in the incident light at a second image sampling interval;
performing photoelectric imaging on light rays passing through the optical filter at the first image sampling interval in the incident light through an image sensor to obtain a first image;
performing photoelectric imaging on light rays passing through the optical filter at the second image sampling interval in the incident light through the image sensor to obtain a second image;
controlling the image sensor to finish photoelectric imaging at a fourth image sampling interval;
Synthesizing the first image and the second image through an image synthesizing unit to generate a first target image;
the fourth image sampling interval is a time period except a first image sampling interval and a second image sampling interval in the acquisition period of the first target image;
the optical filter comprises a first optical filtering area and a second optical filtering area, wherein the first optical filtering area is used for passing visible light and blocking infrared light from passing, and the second optical filtering area is used for passing infrared light;
wherein controlling the filter to pass visible light in the incident light and block infrared light from passing at a first image sampling interval and controlling the filter to pass infrared light in the incident light at a second image sampling interval comprises:
controlling the first filtering area to filter the incident light at the first image sampling interval, and controlling the second filtering area to filter the incident light at the second image sampling interval;
the optical filter is a circular optical filter, the first optical filtering area is a first fan-shaped area of the circular optical filter, and the second optical filtering area is a second fan-shaped area of the circular optical filter;
Wherein controlling the first filtering region to filter the incident light at the first image sampling interval and controlling the second filtering region to filter the incident light at the second image sampling interval includes:
controlling the circular optical filter to rotate around the circle center of the circular optical filter through a motor, so that the first fan-shaped area covers a photosensitive area of the image sensor at the first image sampling interval, and the second fan-shaped area covers the photosensitive area at the second image sampling interval;
the photosensitive area is a rectangular area, and the circular filter meets the following first condition or second condition under the condition that the duration of the first image sampling interval is the same as the duration of the second image sampling interval:
the first condition is: R ≥ c + b/(2 sin((π − ωt)/2));
the second condition is: R ≥ c + b/(2 sin(π − ωt));
wherein ω represents the angular velocity of rotation of the circular filter, t represents the duration of the first image sampling interval or the second image sampling interval, R represents the radius of the circular filter, b represents a first side length of the photosensitive area of the image sensor, c represents a second side length of the photosensitive area of the image sensor, and the first side and the second side are perpendicular to each other.
13. The method of claim 12, wherein when the circular filter satisfies the first condition, the central angle corresponding to the first sector is greater than or equal to ωt + 2arcsin(b/(2(R − c))), or the central angle corresponding to the second sector is greater than or equal to ωt + 2arcsin(b/(2(R − c))).
14. The method of claim 12, wherein when the circular filter satisfies the second condition, the central angle corresponding to the first sector area is greater than or equal to ωt + 2arcsin(b/(2(R − c))).
15. The method of any one of claims 12-14, wherein the photosensitive area is a rectangular area and the circular filter satisfies the following condition:
R ≥ c + b/(2 sin(β/2));
wherein R represents the radius of the circular filter, b represents the first side length of the photosensitive area of the image sensor, c represents the second side length of the photosensitive area of the image sensor, β is the central angle of the circular filter opposite to the first side, and the first side and the second side are perpendicular to each other.
16. The method of claim 15, wherein the angular velocity of rotation of the circular filter is ω, the central angle of the first sector in the circular filter is (2π − γ), the central angle of the second sector is γ, the exposure time of the first image is t1, and the exposure time of the second image is t2;
ω, γ, t1 and t2 satisfy the following formulas:
2π − γ − β ≥ ωt1;
γ + β ≥ ωt2, or γ ≥ ωt2, or γ − β ≥ ωt2.
17. The method of claim 12, wherein controlling the first filter region to filter the incident light at the first image sampling interval and controlling the second filter region to filter the incident light at the second image sampling interval comprises:
controlling the optical filter to move through a driver and a lever so that the first optical filtering area covers the photosensitive area of the image sensor at the first image sampling interval to acquire the visible light passing through the optical filter, and the second optical filtering area covers the photosensitive area of the image sensor at the second image sampling interval to acquire the infrared light passing through the optical filter;
wherein the first end of the lever is connected to the optical filter.
18. The method of claim 12, wherein the optical filter includes an electronically controlled light absorbing material,
wherein controlling the filter to pass visible light in the incident light and block infrared light from passing at a first image sampling interval and controlling the filter to pass infrared light in the incident light at a second image sampling interval comprises:
applying a voltage to the optical filter and controlling the voltage such that the optical filter passes visible light and blocks infrared light in the first image sampling interval, thereby obtaining the visible light passing through the optical filter, and passes infrared light in the second image sampling interval, thereby obtaining the infrared light passing through the optical filter.
19. The method of claim 18, wherein the electronically controlled light absorbing material comprises an organic color shifting material or a liquid crystal material.
20. The method of any one of claims 12-14, 16-19, further comprising:
and under the condition that the gain of the image sensor is larger than a preset value, controlling the optical filter to pass visible light and block infrared light in the first image sampling interval and pass infrared light in the second image sampling interval, and controlling the image sensor to perform photoelectric imaging on light passing through the optical filter in the first image sampling interval and perform photoelectric imaging on light passing through the optical filter in the second image sampling interval.
21. The method of any one of claims 12-14, 16-19, further comprising:
And when the gain of the image sensor is smaller than or equal to a preset value, carrying out photoelectric imaging on incident light at a third image sampling interval through the image sensor to obtain a second target image, wherein the duration of the third image sampling interval is equal to the sum of the duration of the first image sampling interval and the duration of the second image sampling interval.
CN202010087243.9A 2020-02-11 2020-02-11 Image acquisition device and image acquisition method Active CN113259546B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202010087243.9A CN113259546B (en) 2020-02-11 2020-02-11 Image acquisition device and image acquisition method
PCT/CN2020/126076 WO2021159768A1 (en) 2020-02-11 2020-11-03 Image acquisition apparatus and image acquisition method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010087243.9A CN113259546B (en) 2020-02-11 2020-02-11 Image acquisition device and image acquisition method

Publications (2)

Publication Number Publication Date
CN113259546A CN113259546A (en) 2021-08-13
CN113259546B true CN113259546B (en) 2023-05-12

Family

ID=77219573

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010087243.9A Active CN113259546B (en) 2020-02-11 2020-02-11 Image acquisition device and image acquisition method

Country Status (2)

Country Link
CN (1) CN113259546B (en)
WO (1) WO2021159768A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023028769A1 (en) * 2021-08-30 2023-03-09 Oppo广东移动通信有限公司 Imaging module, imaging system, image processing method, and terminal
CN114650359A (en) * 2022-03-22 2022-06-21 维沃移动通信有限公司 Camera module and electronic equipment

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH08321977A (en) * 1995-05-25 1996-12-03 Sony Corp Filter disk device and video camera provided with the device
JPH1051796A (en) * 1996-05-31 1998-02-20 Olympus Optical Co Ltd Solid-state image pickup device
CN104661008B (en) * 2013-11-18 2017-10-31 深圳中兴力维技术有限公司 The treating method and apparatus that color image quality is lifted under low light conditions
CN109951646B (en) * 2017-12-20 2021-01-15 杭州海康威视数字技术股份有限公司 Image fusion method and device, electronic equipment and computer readable storage medium
CN109951624B (en) * 2019-04-12 2024-04-19 武汉鸿瑞达信息技术有限公司 Imaging shooting system and method based on filter optical wheel
CN110490041B (en) * 2019-05-31 2022-03-15 杭州海康威视数字技术股份有限公司 Face image acquisition device and method

Also Published As

Publication number Publication date
CN113259546A (en) 2021-08-13
WO2021159768A1 (en) 2021-08-19


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant