CN113259546A - Image acquisition apparatus and image acquisition method - Google Patents

Image acquisition apparatus and image acquisition method

Publication number
CN113259546A
Authority
CN
China
Prior art keywords
image
sampling interval
filter
image sampling
light
Prior art date
Legal status
Granted
Application number
CN202010087243.9A
Other languages
Chinese (zh)
Other versions
CN113259546B (en)
Inventor
陈晓雷
胡红旗
吴志江
赖昌材
郑士胜
李瑞华
Current Assignee
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Priority to CN202010087243.9A
Priority to PCT/CN2020/126076 (published as WO2021159768A1)
Publication of CN113259546A
Application granted
Publication of CN113259546B
Current legal status: Active

Classifications

    • H04N23/50 Constructional details (under H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof)
    • G02B26/00 Optical devices or arrangements for the control of light using movable or deformable optical elements
    • G03B11/00 Filters or other obturators specially adapted for photographic purposes
    • H04N23/54 Mounting of pick-up tubes, electronic image sensors, deviation or focusing coils
    • H04N23/55 Optical parts specially adapted for electronic image sensors; Mounting thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/95 Computational photography systems, e.g. light-field imaging systems
    • H04N5/265 Mixing (under H04N5/262 Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects)


Abstract

The application provides an image acquisition apparatus and method. The image acquisition apparatus includes an optical filter, an image sensor, a control unit, and an image synthesis unit. In the embodiment of the application, the image sensor performs photoelectric imaging on visible light passing through the optical filter in a first image sampling interval to obtain a first image containing chrominance information, and performs photoelectric imaging on infrared light (or infrared light and visible light) passing through the optical filter in a second image sampling interval to obtain a second image containing luminance information; the image synthesis unit then synthesizes the first image and the second image to obtain a target image of higher quality. No beam splitter prism is needed and only one sensor is required, which helps reduce the material cost and complexity of the apparatus.

Description

Image acquisition apparatus and image acquisition method
Technical Field
The present application relates to the field of image applications, and more particularly, to an image acquisition apparatus and an image acquisition method.
Background
With the rapid development of image application technology, increasingly high requirements are imposed on the imaging quality of image acquisition equipment under low illumination. Compared with a normally illuminated scene, a low-illumination environment degrades imaging quality, mainly as reduced image brightness and increased noise. The largest factor causing this degradation is the decrease in the amount of light reaching the imaging sensor, which lowers the signal-to-noise ratio of its output signal. Therefore, increasing the amount of light entering the image sensor under low illumination is the key to improving low-illumination image quality. Supplementary lighting is a common approach, and supplementary light devices can be divided into two types according to the wavelength of the emitted light: visible light supplement and infrared supplement. Visible light supplement conveniently acquires the real color information of a scene, but disturbs people. Infrared supplement does not obviously disturb people, but cannot acquire the real colors of objects well.
In order to simultaneously acquire visible light imaging and infrared imaging of a scene, the current practice in the industry is to use a beam splitter prism. As shown in fig. 1, after passing through the beam splitter prism, the incident light is separated into visible light and infrared light by an optical coating in the prism. A color sensor placed on the visible-light exit surface images the visible light to acquire a visible light image, and an infrared sensor placed on the infrared-light exit surface images the infrared light to acquire an infrared image. The visible light image and the infrared image are then synthesized by an algorithm, and a final color image is output.
However, the above scheme requires a beam splitter prism and two sensors, so the material cost is high; the two sensors also need to be accurately aligned during assembly, so the production complexity is high. In addition, because the beam splitter prism is added behind the lens, the scheme is not compatible with existing standard back-focus lenses, and the optical path is bulkier, which is not conducive to device miniaturization.
Disclosure of Invention
The application provides an image acquisition apparatus and an image acquisition method, which can respectively acquire a first image containing chrominance information and a second image containing luminance information in different image sampling intervals, and synthesize the first image and the second image to obtain a target image. No beam splitter prism is needed and only one sensor is required, which helps reduce the material cost and complexity of the apparatus.
In a first aspect, an image acquisition apparatus is provided that includes an optical filter, an image sensor, a control unit, and an image synthesis unit.
The optical filter is configured to gate the incident light.
The image sensor is configured to perform photoelectric imaging on the light in the incident light that passes through the optical filter.
The control unit is connected to the optical filter and the image sensor, and is configured to: control the optical filter to pass visible light in the incident light and block infrared light in the incident light (that is, the infrared light cannot pass) in a first image sampling interval, and to pass the infrared light in the incident light in a second image sampling interval; and control the image sensor to perform photoelectric imaging on the light passing through the optical filter in the first image sampling interval to obtain a first image, and on the light passing through the optical filter in the second image sampling interval to obtain a second image.
The image synthesis unit is configured to synthesize the first image and the second image to generate a first target image.
Therefore, in the embodiment of the present application, a high-quality target image can be obtained by performing photoelectric imaging on the light passing through the optical filter in a first image sampling interval to obtain a first image containing chrominance information, performing photoelectric imaging on the light passing through the optical filter in a second image sampling interval to obtain a second image containing luminance information, and then synthesizing the first image and the second image. Because the image sensor acquires the first image and the second image in different time periods, a single sensor suffices, which reduces the material cost, avoids registration between two sensors, lowers complexity, and helps reduce the size of the apparatus.
Optionally, the controller may be further connected to the image synthesizing unit, and configured to control the image synthesizing unit to synthesize the first image and the second image.
With reference to the first aspect, in certain implementations of the first aspect, the optical filter includes a first filter region and a second filter region, the first filter region being configured to pass visible light and not pass infrared light, and the second filter region being configured to pass infrared light. Optionally, the second filter region may or may not pass visible light, which is not limited in this embodiment of the application.
The control unit is specifically configured to filter the incident light using the first filtering region at the first image sampling interval, and filter the incident light using the second filtering region at the second image sampling interval.
Therefore, in the embodiment of the application, the first filtering region covers the photosensitive region of the image sensor at the first image sampling interval, the second filtering region covers the photosensitive region of the image sensor at the second image sampling interval, and further, the image sensor can perform photoelectric imaging on the visible light passing through the optical filter at the first image sampling interval, and perform photoelectric imaging on the infrared light (or the infrared light and the visible light) passing through the optical filter at the second image sampling interval.
The above "filtering the incident light using the first filtering region at the first image sampling interval" includes two cases, excluding and not excluding the simultaneous use of the second filtering region to filter the incident light, that is, filtering the incident light using only the first filtering region, or filtering the incident light using both the first filtering region and the second filtering region. Similarly, the above "filtering the incident light using the second filtering region at the second image sampling interval" also includes two cases, excluding and not excluding the filtering the incident light using the first filtering region at the same time, that is, filtering the incident light using only the second filtering region, or filtering the incident light using both the first filtering region and the second filtering region.
With reference to the first aspect, in certain implementations of the first aspect, the optical filter is a circular optical filter, the first filtering region is a first sector region of the circular optical filter, and the second filtering region is a second sector region of the circular optical filter;
the control unit comprises a motor, the motor is used for controlling the circular optical filter to rotate around the circle center of the circular optical filter, so that the first fan-shaped area covers the photosensitive area of the image sensor at the first image sampling interval, and the second fan-shaped area covers the photosensitive area at the second image sampling interval.
Since infrared light greatly interferes with the imaging of RGB images while visible light does not greatly interfere with infrared imaging, the "covering" in "the first sector area covering the photosensitive area of the image sensor in the first image sampling interval" means covering the entire photosensitive area, whereas the "covering" in "the second sector area covering the photosensitive area in the second image sampling interval" means that either the entire photosensitive area or a part of it may be covered.
Therefore, the circular filter rotates around the circle center, so that the spectrum reaching the image sensor behind the circular filter changes periodically, and the image sensor performs color frame exposure and black-and-white frame exposure periodically.
With reference to the first aspect, in certain implementations of the first aspect, the photosensitive region is a rectangular region, and in a case that a duration of the first image sampling interval is the same as a duration of the second image sampling interval, the circular filter satisfies the following condition:
R ≥ c + b/(2sin((π − ωt)/2));
where ω represents the angular velocity of rotation of the circular filter, t represents the duration of the first image sampling interval or the second image sampling interval, R represents the radius of the circular filter, b represents a first side length of the photosensitive region of the image sensor, c represents a second side length of the photosensitive region, and the first side and the second side are perpendicular to each other. Here ωt ≤ π.
It is to be noted that no imaging is performed within an angle of 2β at this time; that is, no photoelectric imaging is performed while the first sector area and the second sector area each partially cover the photosensitive area. Here β is the central angle of the circular filter subtended by the first side.
With reference to the first aspect, in certain implementations of the first aspect, the central angle corresponding to the first sector area is greater than or equal to ωt + 2arcsin(b/(2(R − c))), and/or the central angle corresponding to the second sector area is greater than or equal to ωt + 2arcsin(b/(2(R − c))).
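As a rough numerical illustration (not part of the claims), the radius condition R ≥ c + b/(2sin((π − ωt)/2)) and the central-angle lower bound above can be evaluated in a few lines of Python; all numbers below (sensor size, rotation speed, exposure time) are assumptions chosen only for the example:

    import math

    def min_radius(omega, t, b, c):
        # R >= c + b / (2*sin((pi - omega*t)/2)); requires omega*t < pi.
        return c + b / (2 * math.sin((math.pi - omega * t) / 2))

    def min_sector_angle(omega, t, b, c, R):
        # Lower bound omega*t + 2*arcsin(b/(2*(R - c))) on a sector's central angle.
        return omega * t + 2 * math.asin(b / (2 * (R - c)))

    # Assumed values: one revolution per 0.04 s target period, 0.015 s exposures,
    # and a photosensitive area with b = 4.8 mm, c = 6.4 mm.
    omega = 2 * math.pi / 0.04
    t, b, c = 0.015, 4.8, 6.4
    R = min_radius(omega, t, b, c)                                # ~12.7 mm
    print(R, math.degrees(min_sector_angle(omega, t, b, c, R)))  # angle ~180 degrees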
With reference to the first aspect, in certain implementations of the first aspect, the photosensitive region is a rectangular region, and in a case that a duration of the first image sampling interval is the same as a duration of the second image sampling interval, the circular filter satisfies the following condition:
R ≥ c + b/(2sin(π − ωt));
where ω represents the angular velocity of rotation of the circular filter, t represents the duration of the first image sampling interval or the second image sampling interval, R represents the radius of the circular filter, b represents a first side length of the photosensitive region of the image sensor, c represents a second side length of the photosensitive region, and the first side and the second side are perpendicular to each other.
With reference to the first aspect, in certain implementations of the first aspect, the central angle corresponding to the first sector area is greater than or equal to ωt + 2arcsin(b/(2(R − c))).
With reference to the first aspect, in certain implementations of the first aspect, the photosensitive region is a rectangular region, and the circular filter satisfies the following condition:
R ≥ c + b/(2sin(β/2));
where R represents the radius of the circular filter, b represents a first side length of the photosensitive region of the image sensor, c represents a second side length of the photosensitive region, β is the central angle of the circular filter subtended by the first side, and the first side and the second side are perpendicular to each other.
With reference to the first aspect, in certain implementations of the first aspect, the rotation angular velocity of the circular filter is ω, the central angle of the first sector area in the circular filter is (2π − γ), the central angle of the second sector area is γ, the exposure time of the first image (i.e., the first image sampling interval) is t1, and the exposure time of the second image (i.e., the second image sampling interval) is t2.
Then ω, γ, t1 and t2 satisfy the following formulas:
2π − γ − β ≥ ωt1;
γ + β ≥ ωt2, or γ ≥ ωt2, or γ − β ≥ ωt2.
Optionally, t1 and t2 may be equal or different, and the embodiments of the present application are not limited thereto. When t1 and t2 are equal, this may be denoted as t1 = t2 = t.
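As an illustrative aid (not part of the claims), the three alternative angle conditions above can be checked numerically. A minimal Python sketch, where the case labels are this sketch's own naming:

    import math

    def angles_feasible(omega, gamma, beta, t1, t2, bw_case="gamma_plus_beta"):
        # Color-frame condition from the text: 2*pi - gamma - beta >= omega*t1.
        color_ok = (2 * math.pi - gamma - beta) >= omega * t1
        # Black-and-white condition: one of the three alternatives in the text.
        bw_angle = {"gamma_plus_beta": gamma + beta,
                    "gamma": gamma,
                    "gamma_minus_beta": gamma - beta}[bw_case]
        return color_ok and (bw_angle >= omega * t2)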
With reference to the first aspect, in certain implementations of the first aspect, the control unit includes a driver and a lever, a first end of the lever is connected to the optical filter, and the driver is configured to control the optical filter to move through the lever, so that the first filter region covers a photosensitive region of the image sensor at the first image sampling interval, and the second filter region covers the photosensitive region at the second image sampling interval.
Since infrared light greatly interferes with the imaging of RGB images while visible light does not greatly interfere with infrared imaging, the "covering" in "the first filter region covering the photosensitive area of the image sensor in the first image sampling interval" means covering the entire photosensitive area, whereas the "covering" in "the second filter region covering the photosensitive area in the second image sampling interval" means that either the entire photosensitive area or a part of it may be covered.
In this way, the driver, connected to the optical filter through the lever, drives the optical filter at the end of the lever to move, so that the spectrum reaching the image sensor behind the optical filter changes periodically, and the image sensor periodically performs color frame exposure and black-and-white frame exposure.
With reference to the first aspect, in certain implementations of the first aspect, the filter includes an electrically controllable light absorbing material therein,
the control unit is specifically configured to apply a voltage to the optical filter, and control the magnitude of the voltage so that the optical filter passes visible light and cannot pass infrared light at the first image sampling interval, and passes infrared light at the second image sampling interval.
Therefore, in the embodiment of the application, the electrically controlled light absorption material is arranged in the optical filter, and the absorption peak of the light absorption material in the optical filter can be periodically changed by changing the voltage applied to the optical filter, so that the spectrum reaching the image sensor located behind the optical filter can be periodically changed, and the image sensor can periodically perform color frame exposure and black and white frame exposure.
With reference to the first aspect, in certain implementations of the first aspect, the electrically controllable light absorbing material includes an organic color-changing material or a liquid crystal material. The response time of such an electrically controlled light-absorbing material can reach the millisecond level, enabling fast switching between black-and-white frame exposure and color frame exposure.
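As an illustration of this control sequence, the following Python sketch toggles between two assumed voltage levels; the callbacks and voltage values are hypothetical, since the actual levels depend on the chosen color-changing or liquid crystal material:

    V_COLOR = 3.3   # assumed level: pass visible light, block infrared
    V_IR = 0.0      # assumed level: pass infrared (and possibly visible) light

    def run_one_period(set_voltage, expose, t1, t2):
        # set_voltage and expose are assumed driver callbacks, not a real device API.
        set_voltage(V_COLOR)   # first image sampling interval
        first = expose(t1)     # color frame: chrominance information
        set_voltage(V_IR)      # second image sampling interval
        second = expose(t2)    # black-and-white frame: luminance information
        return first, second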
With reference to the first aspect, in certain implementations of the first aspect, the apparatus further includes:
an optical unit including an optical lens, configured to capture the incident light and image it on the image sensor, where the optical filter is disposed in front of the optical lens, behind it, or between two lenses in the optical lens.
With reference to the first aspect, in certain implementations of the first aspect, the control unit is specifically configured to:
and under the condition that the gain of the image sensor is determined to be larger than a preset value, controlling the optical filter to pass visible light and not pass infrared light at the first image sampling interval and pass infrared light at the second image sampling interval, and controlling the image sensor to perform photoelectric imaging on the light rays passing through the optical filter at the first image sampling interval and perform photoelectric imaging on the light rays passing through the optical filter at the second image sampling interval.
When it is determined that the gain of the image sensor is greater than the preset value, the shooting environment may be determined to be a low-illuminance environment. Therefore, when the shooting environment is sensed to be a low-illuminance environment, the optical filter can be controlled to pass visible light and block infrared light in the first image sampling interval, and to pass infrared light (or infrared light and visible light) in the second image sampling interval.
With reference to the first aspect, in certain implementations of the first aspect, the control unit is further configured to:
and when the gain of the image sensor is determined to be smaller than or equal to the preset value, controlling the image sensor to perform photoelectric imaging on incident light in a third image sampling interval to acquire a second target image, wherein the third image sampling interval is equal to the sum of the first image sampling interval and the second image sampling interval.
In this way, the frame rate of the output image of the image acquisition device in the low-illumination environment can be kept unchanged relative to the frame rate output in the normal environment.
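This mode selection can be summarized in a short Python sketch; every callable and the threshold below are assumptions for illustration, not the patented implementation:

    def acquire_target_image(sensor, filter_ctrl, synthesize, gain_threshold, t1, t2):
        if sensor.gain() > gain_threshold:       # low-illuminance environment
            filter_ctrl.pass_visible_block_ir()  # first image sampling interval
            first = sensor.expose(t1)            # chrominance information
            filter_ctrl.pass_ir()                # second image sampling interval
            second = sensor.expose(t2)           # luminance information
            return synthesize(first, second)     # first target image
        # Normal illuminance: one exposure over the third image sampling interval,
        # t1 + t2, so the output frame rate stays unchanged.
        return sensor.expose(t1 + t2)            # second target image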
In a second aspect, an image acquisition method is provided. The method may be performed by the image acquisition apparatus of the first aspect or any one of the possible implementations of the first aspect.
In the method, a filter is controlled to pass visible light and not infrared light in incident light in a first image sampling interval, and the filter is controlled to pass infrared light in incident light in a second image sampling interval; performing photoelectric imaging on the light rays passing through the optical filter at the first image sampling interval through an image sensor to obtain a first image; performing photoelectric imaging on the light rays passing through the optical filter at the second image sampling interval through the image sensor to obtain a second image; and synthesizing the first image and the second image by an image synthesis unit to generate a first target image.
Therefore, in the embodiment of the present application, a high-quality target image can be obtained by performing photoelectric imaging on the light passing through the optical filter in a first image sampling interval to obtain a first image containing chrominance information, performing photoelectric imaging on the light passing through the optical filter in a second image sampling interval to obtain a second image containing luminance information, and then synthesizing the first image and the second image. Because the image sensor acquires the first image and the second image in different time periods, a single sensor suffices, which reduces the material cost, avoids registration between two sensors, lowers complexity, and helps reduce the size of the apparatus.
The various steps of the method of the second aspect may refer to the various operations of the respective modules of the apparatus of the first aspect and are not repeated here.
In a third aspect, a computer-readable medium is provided for storing a computer program comprising instructions for performing the first or second aspect or any possible implementation of the first or second aspect.
In a fourth aspect, there is also provided a computer program product comprising instructions which, when run on a computer, cause the computer to perform the instructions of the first aspect or the second aspect or any possible implementation of the first aspect or the second aspect.
Drawings
Fig. 1 is a schematic view of imaging using a beam splitter prism.
Fig. 2 is a schematic diagram of an image capturing apparatus according to an embodiment of the present application.
Fig. 3 is another schematic diagram of an image capturing apparatus according to an embodiment of the present application.
Fig. 4 is a specific example of a circular filter provided in an embodiment of the present application.
Fig. 5 is an example of exposure performed by a circular filter according to an embodiment of the present disclosure.
Fig. 6 is a specific example of the connection between the lever and the optical filter according to the embodiment of the present application.
Fig. 7 is an example of exposure performed by the optical filter connected to the lever according to the embodiment of the present application.
Fig. 8 is another example of exposure performed by the optical filter connected to the lever according to the embodiment of the present disclosure.
Fig. 9 is another example of exposure performed by the optical filter connected to the lever according to the embodiment of the present application.
Fig. 10 is another specific example of the optical filter provided in the embodiment of the present application.
Fig. 11 is a schematic view of another image capturing apparatus provided in an embodiment of the present application.
Fig. 12 is a schematic diagram of synthesizing a first image and a second image in YUV color space according to an embodiment of the present application.
Fig. 13 is a schematic flowchart of a method for image acquisition according to an embodiment of the present application.
Detailed Description
The technical solution in the present application will be described below with reference to the accompanying drawings.
Fig. 2 shows a schematic diagram of an image capturing apparatus 200 provided in an embodiment of the present application. By way of example, the image capturing apparatus 200 may be a monitoring camera, a mobile phone or other wearable device with a shooting function, or a camera in a smart home device, and the like, which is not limited in this embodiment of the application.
For example, in a monitoring camera application scenario, a monitoring camera needs to operate for 7 × 24 hours (which means 7 days a week and 24 hours a day). That is, most surveillance cameras operate in low-light environments for approximately half of their operating time. Therefore, the monitoring camera is required to have high imaging quality (such as high brightness and small noise) in a low-illumination environment, so as to be capable of helping to acquire important information (such as information of a human face, a vehicle body color and the like). In addition, in an application scene of photographing, when the camera is used in a low-illumination environment, it is also required that the camera can acquire an image satisfying a business application.
In some possible embodiments, the image capturing apparatus 200 may be an apparatus for capturing images at night, and the relevant units of the apparatus 200 may perform operations such as light gating and corresponding image synthesis at night, but the embodiment of the present application is not limited thereto.
As shown in fig. 2, the image acquisition apparatus 200 includes an optical filter 210, an image sensor 220, a control unit 230, and an image synthesis unit 240.
The filter 210 is used for gating incident light.
And an image sensor 220 for performing a photoelectric imaging on the light passing through the optical filter 210 among the incident lights.
And a control unit 230, connected to the optical filter 210 and the image sensor 220, for controlling the optical filter 210 to pass visible light and block infrared light (i.e. not pass infrared light) in the incident light in a first image sampling interval, and to pass infrared light in the incident light in a second image sampling interval. And, the image sensor 220 is configured to perform photoelectric imaging on the light passing through the optical filter 210 at the first image sampling interval to obtain a first image, and perform photoelectric imaging on the light passing through the optical filter 210 at the second image sampling interval to obtain a second image.
That is, the control unit 230 can control the filter 210 to pass different light types at different image sampling intervals, i.e., to implement a spectrum gating function, and thus can control the type of light reaching the surface of the image sensor 220.
It should be noted that, in the embodiment of the present application, the first image is obtained by imaging visible light of a real scene, and can represent chromaticity information of the real scene, and the second image is mainly obtained by imaging infrared light of the real scene, and can represent luminance information of the real scene. In some embodiments, the first image may be referred to as a color frame image and the second image may be referred to as a black and white frame image.
Since infrared light affects the imaging of visible light, infrared light is not allowed to pass through the filter 210 in the first image sampling interval. However, visible light does not affect the imaging of infrared light, so in the second image sampling interval the optical filter 210 may either pass or block the visible light in the incident light. Allowing the filter 210 to pass visible light in the second image sampling interval can help reduce the complexity of the filter.
The image sensor 220 is configured to receive an optical signal and convert the optical signal into an electrical signal. The image sensor 220 may be, for example, a Charge Coupled Device (CCD), a Complementary Metal Oxide Semiconductor (CMOS), and the like, which is not limited in this embodiment.
That is to say, in the embodiment of the present application, the control unit 230 can implement the first image sampling interval by controlling the optical filter 210 and the image sensor 220, the optical filter 210 allows visible light in the incident light to pass through, and does not allow infrared light in the incident light to pass through, and the light passing through the optical filter 210 performs the photoelectric imaging on the image sensor 220 at this time, so as to obtain the first image; at the second image sampling interval, the optical filter 210 allows the infrared light in the incident light to pass through, or allows the infrared light and the visible light in the incident light to pass through, and the light passing through the optical filter 210 at this time is subjected to photoelectric imaging on the image sensor 220, resulting in a second image.
It should be noted that, since the image sensor 220 respectively performs photoelectric imaging on visible light at the first image sampling interval, performs photoelectric imaging on infrared light at the second image sampling interval, or performs photoelectric imaging on visible light and infrared light at the second image sampling interval, in the embodiment of the present application, one sensor may be adopted to respectively acquire visible light and infrared light in a shot scene at different time periods and perform imaging respectively. That is, in the embodiment of the present application, the image sensor 220 may be time division multiplexed.
Optionally, the time lengths of the first image sampling interval and the second image sampling interval may be the same or different, which is not limited in this embodiment of the application.
In other possible implementation manners, two sensors may also be used to perform visible light imaging and infrared light imaging, which are not limited in this application.
The image synthesis unit 240 is configured to synthesize the first image and the second image acquired by the image sensor 220 to obtain a target image. The chrominance information in the target image mainly comes from the first image, and the luminance information mainly comes from the second image.
In some optional embodiments, the control unit 230 may be further connected to the image synthesizing unit 240, and configured to control the image synthesizing unit 240 to synthesize the first image and the second image.
In the embodiment of the present application, the duration of acquiring one frame of target image may be referred to as a target image acquisition period. The first image and the second image are acquired in a time-sharing manner within one target image acquisition period: the first image is acquired in the first image sampling interval of the period, and the second image is acquired in the second image sampling interval of the period. Then, the target image for that acquisition period may be generated from the first image and the second image.
As an example, when the output frame rate of the apparatus 200 is 25 frames/second, one target image acquisition period is 0.04 s, and the first image sampling interval and the second image sampling interval may each be 0.02 s. As a specific example, the first image may be acquired in the first 0.02 s, and the second image in the last 0.02 s, of the 0.04 s target image acquisition period.
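For instance, the split of one target image acquisition period into the two sampling intervals can be computed as follows (a sketch; the equal split and the color_fraction parameter are assumptions, since the two intervals need not be equal):

    def sampling_intervals(frame_rate_fps, color_fraction=0.5):
        period = 1.0 / frame_rate_fps  # one target image acquisition period
        t1 = period * color_fraction   # first image sampling interval (color frame)
        t2 = period - t1               # second image sampling interval (black-and-white frame)
        return t1, t2

    print(sampling_intervals(25))  # -> (0.02, 0.02), matching the example above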
Therefore, in the embodiment of the present application, a high-quality target image can be obtained by performing photoelectric imaging on the light passing through the optical filter in a first image sampling interval to obtain a first image containing chrominance information, performing photoelectric imaging on the light passing through the optical filter in a second image sampling interval to obtain a second image containing luminance information, and then synthesizing the first image and the second image. Because the image sensor acquires the first image and the second image in different time periods, a single sensor suffices, which reduces the material cost, avoids registration between two sensors, lowers complexity, and helps reduce the size of the apparatus.
In the embodiment of the present application, the image combining unit may be implemented by electronic hardware, or computer software, or a combination of computer software and electronic hardware, and the embodiment of the present application is not limited thereto. As an example, the image synthesizing unit may be implemented by a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), or a specific image processing chip. In some possible implementations, the CPU, the GPU or the image processing chip for image composition may read codes to implement composition of the first image and the second image, which is not limited in this embodiment of the present application.
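As one possible sketch of such code (cf. the YUV-space synthesis of fig. 12), luminance can be taken mainly from the second image and chrominance from the first; the weight w and the assumption of registered float YUV arrays are this example's own, not values given in the application:

    import numpy as np

    def fuse_yuv(first_yuv, second_yuv, w=0.8):
        # first_yuv: color frame (visible light); second_yuv: black-and-white frame
        # (infrared); both assumed to be HxWx3 float arrays in Y, U, V order.
        fused = first_yuv.copy()
        # Y channel: mainly the luminance of the infrared image.
        fused[..., 0] = w * second_yuv[..., 0] + (1.0 - w) * first_yuv[..., 0]
        # U and V channels are kept from the visible-light image (real scene colors).
        return fused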
In some possible implementations, the filter 210 may include a first filter region and a second filter region. The first filter area is used for passing visible light and not passing infrared light, and the second filter area is used for passing infrared light. Optionally, the second filter region may pass through visible light, or may not pass through visible light, which is not limited in this embodiment of the application.
As shown in fig. 3, the filter 210 may include a first filter region 2101 and a second filter region 2102. The control unit 230 may be specifically configured to gate the incident light using the first filtering region in the first image sampling interval, and to gate the incident light using the second filtering region in the second image sampling interval.
As an example, the control unit 230 may control the filter 210 to move such that a first filtering region in the filter 210 covers a photosensitive region of the image sensor 220 at a first image sampling interval and a second filtering region covers a photosensitive region of the image sensor 220 at a second image sampling interval. That is, at a first image sampling interval, the first filter region may be in the optical path before the image sensor, gating visible light in the optical path so that the image sensor receives visible light; at a second image sampling interval, the second filter region is in the optical path before the image sensor, and gates infrared light in the optical path (or gates infrared light and visible light in the optical path) so that the image sensor is controlled to receive infrared light (or infrared light and visible light).
Therefore, in the embodiment of the present application, the first filtering region in the optical filter 210 covers the photosensitive region of the image sensor 220 at the first image sampling interval, and the second filtering region covers the photosensitive region of the image sensor 220 at the second image sampling interval, so that the image sensor can perform photoelectric imaging on the visible light passing through the optical filter at the first image sampling interval, and perform photoelectric imaging on the infrared light (or the infrared light and the visible light) passing through the optical filter at the second image sampling interval.
It should be noted that fig. 3 is a schematic diagram illustrating the filter 210, and therefore only some modules or units in the image capturing apparatus are shown. That is, in addition to the optical filter 210 and the control unit 230 shown in fig. 3, other modules or units (not shown in fig. 3), such as an image sensor and an image synthesis unit, are also included in the image capturing apparatus, which is not limited in this embodiment of the present application.
Two specific implementations of the optical filter including the first filtering region and the second filtering region provided in the embodiments of the present application are described below.
Mode one
The filter 210 may be designed as a circular filter. The first filtering region 2101 is a first sector region of the circular filter, and the second filtering region is a second sector region of the circular filter. Accordingly, the control unit 230 may include a motor, and the motor may be configured to control the circular filter to rotate around a center of the circular filter, so that the first sector area covers the photosensitive area of the image sensor in the first image sampling interval, and the second sector area covers the photosensitive area in the second image sampling interval. In some descriptions, the circular filter may also be referred to as a color wheel, which is not limited in this application.
In some embodiments, the motor may control the circular filter to rotate around the center at a constant speed, or rotate at a variable speed, or control the circular filter to rotate around the center clockwise, or rotate around the center counterclockwise, but the embodiments of the present application are not limited thereto.
Optionally, when the control unit 230 includes a motor, the control unit 230 may further include a control chip, and the control chip may send a control instruction to the motor to control the motor to operate, so as to control the circular filter by the motor. As an example, the control chip may read instructions or code in a memory to enable the acquisition of an image.
The control chip may be different from the CPU, or may be the CPU.
Therefore, the circular filter rotates around the circle center, so that the spectrum reaching the image sensor behind the circular filter changes periodically, and the image sensor performs color frame exposure and black-and-white frame exposure periodically.
Referring to fig. 4, a specific example of a circular filter is shown. The rectangular shaded area is an example of the photosensitive area of the image sensor, with its two sides being b and c. The center of the circular filter is point O, its radius is R, and AO and BO are the boundary lines between the two filtering regions (i.e., the first sector area and the second sector area) of the circular filter.
Illustratively, the sector area OBCA is an example of a first sector area, allowing visible light to pass through and not allowing infrared light to pass through. The sector area OBDA is an example of the second sector area, and allows infrared light to pass therethrough, or may allow infrared light and visible light to pass therethrough. For ease of understanding, the fan-shaped area OBCA is described below as allowing visible light to pass through but not allowing infrared light to pass through, and the fan-shaped area OBDA is described as allowing infrared light to pass through, or allowing infrared light and visible light to pass through.
In this way, when the entire area of the light-sensing region of the image sensor (i.e., the shaded region in fig. 4) is within the sector area OBCA, color frame exposure is possible, and when the entire area or a partial area of the light-sensing region of the image sensor (i.e., the shaded region in fig. 4) is within the sector area OBDA, black-and-white frame exposure is possible.
Typically, the radius of the circular filter is greater than the length of the photosensitive area of the image sensor. At this time, the light sensing area of the image sensor may be disposed at a side away from the center of the circular filter and the circular filter may completely cover the light sensing area of the image sensor, such as the structure shown in fig. 4.
Referring to fig. 4, when the length of the photosensitive region of the image sensor is c and its width is b, the central angle β of the circular filter subtended by the sensor edge of length b (i.e., the angle ∠FOG formed at the center O by the two end points G and F of the length-b edge that is close to the center O) satisfies the following formula:
R ≥ c + b/(2sin(β/2)) (1)
From formula (1), β = 2arcsin(b/(2(R − c)));
where R is the radius of the circular filter.
In some embodiments, the empirical value of c may be expressed as follows:
c≤R/3;
the following describes a process in which the image sensor periodically performs color frame exposure and black-and-white frame exposure as the circular filter rotates, with reference to fig. 4 and 5.
Assuming that the circular filter rotates clockwise about the center of the circle, the intersection of OA with the vertex F of the shaded region may be taken as the start of one target image acquisition cycle, as shown in fig. 4. The target image acquisition cycle ends when OA rotates clockwise one revolution, again intersecting vertex F. During the target image acquisition period, the image sensor may make one black and white frame exposure and one color frame exposure.
Specifically, when OA intersects vertex F of the shadow region (as shown in fig. 4), the infrared light can pass through the second filter region to reach the image sensor (i.e., the photosensitive region of the image sensor), and at this time or before this time, the color frame exposure in the previous image acquisition cycle is finished.
Alternatively, black-and-white frame exposure may begin when OA intersects the vertex F of the shadow region, i.e., when the sector AOBD begins to overlap the photosensitive area of the image sensor. Alternatively, with the rotation of the circular filter, the black-and-white frame exposure may be started when AO intersects the vertex G, i.e., when the sector AOBD completely covers the photosensitive area, or a first time period after that moment. Alternatively, the black-and-white frame exposure may be started at any time between the intersection of OA with the vertex F and the intersection of OA with the vertex G.
It should be noted that, in the period from the intersection of OA with the vertex F to the intersection of OA with the vertex G, the sector AOBD does not yet completely cover the photosensitive area of the image sensor, so less infrared light is gated than when the sector AOBD completely covers the photosensitive area. Therefore, a black-and-white frame imaged in this period has poorer quality than a black-and-white frame imaged when the sector AOBD completely covers the photosensitive area.
As the circular filter continues to rotate, BO will successively intersect vertex F and the other vertex G of the shaded area. The black-and-white frame exposure may be ended when BO intersects vertex F, i.e., when the sector AOBD starts to no longer completely cover the photosensitive area of the image sensor, or at a moment a second duration before that. Alternatively, with the rotation of the circular filter, the black-and-white frame exposure may be ended when BO intersects vertex G (as shown in fig. 5), i.e., when the sector AOBD no longer overlaps the photosensitive area at all. Alternatively, the black-and-white frame exposure may be ended at any time between the intersection of BO with the vertex F and the intersection of BO with the vertex G.
Illustratively, the frame number of the black-and-white frame obtained at this time may be denoted as (2n−1). As the circular filter continues to rotate, a color frame exposure can be performed after BO intersects vertex G (i.e., the case shown in fig. 5) until OA again intersects vertex F. For example, the color frame exposure can be started when BO intersects vertex G, or a third duration after that, and ended when AO again intersects vertex F, or at a moment a fourth duration before that. Illustratively, the frame number of the color frame obtained at this time may be denoted as (2n), where n is an integer greater than or equal to 1.
Three specific examples of the image sensor periodically performing color frame exposure and black-and-white frame exposure are described below. The central angle of the sector area OBDA may be γ, and the central angle of the sector area OBCA may be (2π − γ). The circular filter rotates clockwise around the center O at a constant angular velocity ω.
The first example: when AO just begins to intersect the vertex F of the shadow area (i.e., the photosensitive area of the image sensor), black-and-white frame exposure is turned on. When BO intersects the G point of the shadow area, black-and-white frame exposure ends and color frame exposure starts. When AO again intersects the vertex F of the shadow area, the color frame exposure ends and the black-and-white frame exposure of the next cycle starts. In this case, the black-and-white frame exposure angle is (γ + β), and the color frame exposure angle is (2π − γ − β), which can be written as α (that is, α = 2π − γ − β).
At this time, the following conditions are satisfied:
α = ωt1;
γ + β = ωt2;
where the exposure time of the color frame is t1 and the exposure time of the black-and-white frame is t2. Optionally, t1 and t2 may be equal or different, and the embodiments of the present application are not limited thereto.
When the exposure time of the color frame equals that of the black-and-white frame, i.e., t1 = t2, then α = γ + β = π.
In some possible implementations, the exposure angle of the color frame may also be less than or equal to α, i.e., α ≥ ωt1, and the exposure angle of the black-and-white frame may be less than or equal to (γ + β), i.e., γ + β ≥ ωt2. When designing the circular filter, to ensure a compact structure, it is usually designed such that α = ωt1 and γ + β = ωt2.
The second example: black-and-white frame exposure starts when AO coincides with the central axis of the shadow area, and ends when BO coincides with the central axis. Color frame exposure begins when BO intersects the G point of the shadow area, and ends when AO again intersects the vertex F. When AO coincides with the central axis of the shadow region again, the black-and-white frame exposure of the next cycle starts. At this time, the angle of black-and-white frame exposure is γ, and the angle of color frame exposure is still α.
Here, in the process from the intersection of AO with the vertex F to the point at which AO coincides with the central axis of the shaded area (corresponding to the (β/2) angle of the sector area OBCA near OA), neither black-and-white frame exposure nor color frame exposure is performed. Similarly, neither black-and-white frame exposure nor color frame exposure is performed in the process from the coincidence of BO with the central axis of the shaded region to the intersection of BO with the vertex G (corresponding to the (β/2) angle of the sector area OBCA near OB).
At this time, the following conditions are satisfied:
α = ωt1;
γ = ωt2;
where the exposure time of the color frame is t1 and the exposure time of the black-and-white frame is t2. Optionally, t1 and t2 may be equal or different, and the embodiments of the present application are not limited thereto.
When the exposure time of the color frame equals that of the black-and-white frame, i.e., t1 = t2 = t, then α = γ = π − (β/2).
That is, β = 2(π − α) = 2(π − ωt), in which case the above formula (1) can be modified as follows:
R ≥ c + b/(2sin(π − ωt));
In some possible implementations, the exposure angle of the color frame may also be less than or equal to α, i.e., α ≥ ωt1, and the exposure angle of the black-and-white frame may be less than or equal to γ, i.e., γ ≥ ωt2. When designing the circular filter, to ensure a compact structure, it is usually designed such that α = ωt1 and γ = ωt2.
When t1 = t2 = t, the central angle of the first sector region (i.e., α + β) is greater than or equal to ωt + 2arcsin(b/(2(R − c))).
The third example: black-and-white frame exposure begins when AO intersects the vertex G of the shadow region and ends when BO intersects the vertex F. Color frame exposure begins when BO intersects the G point of the shadow area and ends when AO again intersects the vertex F. When AO again intersects the vertex G of the shadow area, the black-and-white frame exposure of the next cycle starts. At this time, the exposure angle of the black-and-white frame is (γ − β), and the exposure angle of the color frame is still α.
Here, in the process from the intersection of AO with the vertex F to the intersection of AO with the vertex G (corresponding to the β angle of the sector area OBCA near OA), neither black-and-white frame exposure nor color frame exposure is performed. Similarly, neither black-and-white frame exposure nor color frame exposure is performed in the process from the intersection of BO with vertex F to the intersection of BO with vertex G (corresponding to the β angle of the sector area OBCA near OB).
At this time, the following conditions are satisfied:
α = ωt1;
γ − β = ωt2;
where the exposure time of the color frame is t1 and the exposure time of the black-and-white frame is t2. Optionally, t1 and t2 may be equal or different, and the embodiments of the present application are not limited thereto.
When the exposure time of the color frame equals that of the black-and-white frame, i.e., t1 = t2 = t, then α = π − β and γ = π.
That is, β = π − α = π − ωt, in which case the above formula (1) can be modified as follows:
R ≥ c + b/(2sin((π − ωt)/2));
In some possible implementations, the exposure angle of the color frame may also be less than or equal to α, i.e., α ≥ ωt1, and the exposure angle of the black-and-white frame may be less than or equal to (γ − β), i.e., γ − β ≥ ωt2. When designing the circular filter, to ensure a compact structure, it may be designed such that α = ωt1 and γ − β = ωt2.
When t1 = t2 = t, the central angle of the first sector region (i.e., α + β) is greater than or equal to ωt + 2arcsin(b/(2(R − c))), and/or the central angle of the second sector region (i.e., γ) is greater than or equal to ωt + 2arcsin(b/(2(R − c))).
It should be noted that the larger the rotation angular velocity ω of the circular filter, the smaller the exposure time t1 or t2.
In some alternative embodiments, the position of the OA may be detected by a hall sensor. As one implementation, a magnet may be disposed at a position a on the filter, and a hall sensor may be disposed at a first position corresponding to the a position on the image sensor in fig. 4. When the hall sensor detects that the magnetic field strength is at a maximum, it can be determined that OA intersects vertex F.
When the optical filter rotates at a constant speed, the black-and-white frame exposure and the color frame exposure can be controlled by a timer. As one possible implementation, timing may start from the moment OA intersects vertex F. As a specific example, for the first example above, black-and-white frame exposure may be turned on at timer value 0 and ended at timer value t2, at which point color frame exposure starts. After a further duration t1, OA again intersects vertex F, the timer is cleared, and timing restarts.
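For the first example, this timer-based schedule can be written out as follows (a sketch assuming constant angular velocity, with one revolution spanning exactly one black-and-white plus one color exposure):

    import math

    def exposure_schedule(omega, t1, t2):
        # Timer is cleared when the hall sensor detects OA intersecting vertex F
        # (maximum magnetic field strength). Assumes omega*(t1 + t2) = 2*pi.
        assert math.isclose(omega * (t1 + t2), 2 * math.pi)
        black_and_white = (0.0, t2)  # exposure from timer value 0 to t2
        color = (t2, t2 + t1)        # then color frame until OA meets F again
        return black_and_white, color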
Mode two
The control unit 230 may include a driver and a lever. A first end of the lever is connected to the filter 210, and the driver is configured to move the filter 210 through the lever, so that the first filter region 2101 in the filter 210 covers the photosensitive region of the image sensor in the first image sampling interval and the second filter region 2102 in the filter 210 covers the photosensitive region in the second image sampling interval.
Alternatively, the second end of the lever may be fixed and act as a fulcrum, in which case the driver may move the filter 210 connected to the first end of the lever by controlling the movement of a first position on the lever. Alternatively, a second position between the two end points of the lever may be fixed and act as a fulcrum, in which case the driver may move the filter 210 connected to the first end of the lever by controlling the movement of the second end of the lever.
As an example, the driver may be a piezoelectric (piezo) ceramic driver, but the embodiment of the present application is not limited thereto.
Optionally, when the control unit 230 includes a driver and a lever, the control unit 230 may further include a control chip, and the control chip may send a control instruction to the driver to control the driver to operate, so as to implement control of the lever on the optical filter. As an example, the control chip may read instructions or code in a memory to enable the acquisition of an image.
Referring to fig. 6, a specific example of the lever and the filter is shown. Illustratively, JK is the boundary between the first filter region 2101 and the second filter region 2102. The lever 2402 has a first end J connected to the filter 210 and a second end H that is fixed. For example, the second end H may be fixed by a bearing. As a specific implementation, the driver 2401 may include a linear motion part that may be coupled to the lever 2402 at point I via a bearing. As shown in fig. 6, the distance e by which the first end J of the lever 2402 moves up and down and the distance f by which the linear motion part moves up and down satisfy the following relationship:
e/f = (L1 + L2)/L1;
wherein, L1 represents the distance between the end point of the second end H and the point I, and L2 represents the distance between the end point of the first end J and the point I.
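Assuming the geometry above (fulcrum at the fixed end H, drive point at I, filter at the first end J), the displacement relationship can be evaluated as in the following sketch; the stroke and arm lengths are hypothetical example values.

```python
def filter_travel(f, L1, L2):
    """Displacement e of the lever's filter end J produced by a displacement
    f of the linear motion part at point I, with the lever pivoting about
    the fixed end H. By similar triangles about the fulcrum:
        e / f = (L1 + L2) / L1
    where L1 = |HI| and L2 = |IJ|, as defined above."""
    return f * (L1 + L2) / L1

# Hypothetical example: a 0.5 mm actuator stroke with L1 = 5 mm and
# L2 = 20 mm moves the filter end by 2.5 mm.
print(filter_travel(0.5, 5.0, 20.0))   # 2.5
```

This also shows the design lever: choosing L2 larger than L1 amplifies a small actuator stroke into a filter travel large enough to swap the two filter regions in front of the photosensitive area.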
With continued reference to fig. 6, the filter is moved around the fixed end H under the urging of the driver 2401. Illustratively, as the filter moves from bottom to top, when JK intersects one vertex M of the shadow region (as shown in fig. 7), infrared light can pass through the second filter region 2102 to the image sensor, and color frame exposure is stopped at this time. Alternatively, black-and-white frame exposure may begin at this point. As the filter continues to swing upward, it reaches its highest point.
Alternatively, as shown in fig. 8, when the filter is at its highest point, the second filter region 2102 can still be located in front of the shadow region (i.e., the image sensor). In some implementations, by controlling the lengths of L1 and L2 and the displacement of the linear motion part, the second filter region 2102 can still be located in front of the image sensor when the filter reaches the highest point.
After the filter reaches the highest point, it starts to swing downward. As the filter swings, after JK intersects vertex M again (as shown in fig. 9), the second filter region 2102 no longer intersects the shadow portion, and optionally, black-and-white frame exposure may end. The frame number of the black-and-white frame obtained at this time can be noted as (2n-1). After black-and-white frame exposure ends, color frame exposure can begin as the filter continues to swing downward.
Optionally, when the filter reaches the lowest point, the first filter region 2101 can still be located in front of the image sensor. In some implementations, by controlling the lengths of L1 and L2 and the displacement of the linear motion part, the first filter region 2101 can still be located in front of the image sensor when the filter reaches the lowest point.
After the filter reaches the lowest point, it starts to move upward. As the filter moves, the color frame exposure ends when JK intersects vertex M again. Illustratively, the frame number of the color frame obtained at this time may be denoted as (2n).
It should be understood that "up", "down", and "behind" are used in the embodiments of the present application only to aid understanding of the movement of the lever, or of the relative position of the image sensor and the filter, and are not intended to limit the technical solutions of the embodiments of the present application.
Therefore, the lever is connected with the optical filter, and the driver drives the optical filter at the end of the lever to move through the lever, so that the spectrum reaching the image sensor behind the optical filter changes periodically, and the image sensor periodically performs color frame exposure and black-and-white frame exposure.
The embodiment of the application also provides another specific implementation of the optical filter, as shown in mode three below.
Mode three
The filter 210 may include an electrically controllable light absorbing material. In this case, the control unit 230 may be specifically configured to apply a voltage to the optical filter 210 and control the magnitude of the voltage such that the optical filter 210 passes visible light and blocks infrared light in the first time period, and passes infrared light in the second time period.
In some alternative embodiments, the electrically controllable light absorbing material comprises an organic color-changing material or a liquid crystal material. In this way, the response time of the electrically controlled light absorbing material can reach the millisecond level, enabling fast switching between black-and-white frame exposure and color frame exposure.
Referring to fig. 10, a specific example of the optical filter 210 is shown. The electrically controlled light absorbing material may be disposed between two layers of transparent electrodes. For example, the electrically controllable light absorbing material may be sandwiched between the two transparent electrodes. In this way, the control unit 230 may switch the absorption peak of the light absorbing material between the infrared region and the ultraviolet region by changing the voltage applied between the transparent electrodes, thereby implementing the spectrum gating function.
Illustratively, when the control unit 230 controls the light absorbing material to gate infrared light, the infrared light may reach the image sensor through the filter 210, and at this time, black and white frame exposure may be performed. The frame number of the black-and-white frame obtained at this time can be noted as (2 n-1). When the control unit 230 controls the light absorbing material to gate visible light (or visible light and infrared light), color frame exposure may be performed. The frame number of the color frame obtained at this time can be denoted as (2 n).
In addition, in some alternative embodiments, glass may be disposed outside (i.e., on the side away from the electrically controllable light absorbing material) the transparent electrode.
Therefore, in the embodiment of the application, the electrically controlled light absorption material is arranged in the optical filter, and the absorption peak of the light absorption material in the optical filter can be periodically changed by changing the voltage applied to the optical filter, so that the spectrum reaching the image sensor located behind the optical filter can be periodically changed, and the image sensor can periodically perform color frame exposure and black and white frame exposure.
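A rough sketch of this voltage-driven switching is given below; set_voltage, v_visible, and v_infrared are assumed placeholders for the hardware-specific electrode drive interface, and t1 and t2 are the exposure times defined earlier.

```python
import time

def drive_tunable_filter(set_voltage, v_visible, v_infrared, t1, t2, periods):
    """Alternate the voltage across the transparent electrodes so that the
    absorption peak of the light absorbing material switches between the
    infrared and ultraviolet regions (illustrative sketch only)."""
    for n in range(1, periods + 1):
        set_voltage(v_infrared)   # gate infrared light: black-and-white frame (2n-1)
        time.sleep(t2)            # black-and-white frame exposure time
        set_voltage(v_visible)    # gate visible light: color frame (2n)
        time.sleep(t1)            # color frame exposure time
```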
Fig. 11 shows a schematic diagram of another image capturing apparatus 300 provided in the embodiment of the present application. As shown in fig. 11, the apparatus 300 includes an optical unit 301, an infrared fill light unit 302, a tunable filter 303, an image sensor 304, an image processing unit 305, a filter control unit 306, and an image synthesizing unit 307. Here, the tunable filter 303 may be an example of the filter 210, the image sensor 304 may be an example of the image sensor 220, the filter control unit 306 may be an example of the control unit 230, and the image synthesis unit 307 may be an example of the image synthesis unit 240.
It should be understood that fig. 11 shows modules or units of the image capturing apparatus, but these modules or units are merely examples, and the image capturing apparatus of the embodiments of the present application may further include other modules or units, or include variations of the respective modules or units in fig. 11. Furthermore, the image capture device of FIG. 11 may not include all of the modules or elements of FIG. 11.
The optical unit 301 is configured to capture incident light and image the incident light on the image sensor 304. Illustratively, the primary device in the optical unit is an optical lens, which is used for optical imaging. As an example, the optical lens may be a confocal lens for visible light and infrared light. Optionally, the optical unit 301 may further include a filter, which is mainly used for polarizing light.
It should be noted that light is transmitted between the optical unit 301 and the image sensor 304. In some possible descriptions, the transmission of light may be considered to pertain to the transmission of data. Alternatively, in some possible descriptions, the transmission of light may be considered not to pertain to the transmission of data. The embodiments of the present application do not limit this.
It should be noted that the optical filter described in other parts in the embodiments of the present application refers to an optical filter for visible light gating or infrared light gating, such as the optical filter in fig. 2 and 3, and the tunable optical filter shown in fig. 11.
The tunable filter 303 may be disposed in front of the optical lens, or may be disposed behind the optical lens, which is not limited herein. In addition, when the optical lens includes at least two lenses, the tunable filter 303 may be disposed between two of the at least two lenses (in this case, the tunable filter 303 may be said to be disposed in the lens).
When the tunable filter 303 is disposed in front of the optical lens, the optical lens causes visible light or infrared light gated by the tunable filter 303 to be imaged on the image sensor 304 after the incident light is gated by the tunable filter 303. When the tunable filter 303 is disposed behind the optical lens, the tunable filter 303 gates visible light or infrared light therein after the incident light passes through the optical lens, and then the visible light or infrared light passing through the filter is imaged on the image sensor 304. When the tunable filter 303 is disposed in the optical lens, the tunable filter 303 gates visible light or infrared light in incident light during the process of passing through the optical lens, and the light is imaged on the image sensor 304 after passing through the optical lens.
The infrared light supplement unit 302 is configured to supplement light to the photographed environment in a low-illumination environment. As an example, an infrared light emitting diode (LED) may be employed, or another light-emitting device may be used. The central wavelength of the infrared light supplement unit 302 may be 750 nm, 850 nm, or 950 nm, which is not limited in this embodiment.
In some possible implementations, the on/off, the illumination intensity, or the central wavelength of the infrared light supplement unit 302 may be controlled by a switching value or a level signal.
The image processing unit 305 is configured to process the digital image signal acquired by the image sensor 304, such as at least one of demosaicing, auto exposure, auto white balance, and the like. As an example, the image processing unit 305 may be implemented by dedicated hardware, such as an Image Signal Processor (ISP), which is not limited in this embodiment.
The filter control unit 306 is configured to control the tunable filter 303 such that the tunable filter 303 gates visible light in incident light and cannot pass infrared light in the incident light at a first image sampling interval, gates infrared light in the incident light at a second image sampling interval, or gates infrared light and visible light in the incident light at the second image sampling interval.
For example, the filter control unit 306 may send a control signal to the tunable filter 303 to cause the tunable filter 303 to gate visible light in the incident light and block infrared light in the incident light at the first image sampling interval, and to gate infrared light in the incident light, or infrared light and visible light in the incident light, at the second image sampling interval.
Alternatively, the tunable filter 303 may send a feedback signal about the tunable filter 303 to the filter control unit 306, so that the filter control unit 306 knows the gating condition of the tunable filter 303.
In some alternative embodiments, the filter control unit 306 may control the start and the end of the exposure of the image sensor 304 according to the light gating condition of the tunable filter 303. That is, the filter control unit 306 may control the light gating of the tunable filter 303 in synchronization with the exposure of the image sensor 304.
For example, when the filter control unit 306 outputs a control signal to control the tunable filter 303 to gate visible light and not allow infrared light to pass, the control signal can also be used to trigger the image sensor 304 to perform color frame exposure. When the filter control unit 306 outputs a control signal to control the tunable filter 303 to switch to gate the infrared light, the control signal may also be used to trigger the image sensor 304 to perform black and white frame exposure. Based on this, the embodiments of the present application can achieve the synchronization of the tunable filter 303 switching and the image sensor 304 exposure.
Alternatively, the filter control unit 306 may output one or more control signals to stop the exposure of the image sensor 304 after the exposure time is satisfied.
In some alternative embodiments, the filter control unit 306 may control the image synthesis unit to perform image synthesis according to the light gating condition of the tunable filter 303. For example, when the tunable filter 303 is a color wheel, the image synthesis unit is controlled to synthesize the first image and the second image acquired in the period when the color wheel rotates for one cycle, so as to acquire a target image.
Specifically, as the tunable filter 303 switches, the image sensor 304 periodically and alternately outputs a first image containing chrominance information and a second image containing luminance information. As an example, the image sensor 304 may output one frame of the first image and one frame of the second image sequentially in one image acquisition period, where the frame numbers of the first image and the second image in one period are adjacent. As a specific example, an image acquisition period may include the second image of the (2n-1)-th frame (which may be denoted as I(2n-1)) and the first image of the 2n-th frame (which may be denoted as I(2n)), where n is an integer greater than or equal to 1.
In some alternative embodiments, after the first image and the second image in one image acquisition period are processed by the image processing unit 305, the filter control unit 306 controls the image synthesis unit 307 to synthesize two frames of images in one image acquisition period, and acquire the target image.
For example, the UV component in the target image may be a UV component of an even frame image (i.e., the first image), and the Y component in the target image may be a Y component of an odd frame image (i.e., the second image). Wherein the UV component is used to represent chrominance information of the image and the Y component is used to represent luminance information of the image.
For example, the target image may be denoted as I', and the target image I'(n) of the n-th frame may be expressed as follows:
I'_Y(n) = I_Y(2n-1) (2)
I'_UV(n) = I_UV(2n) (3)
where I'_Y(n) denotes the Y component of the target image with frame number n, I_Y(2n-1) denotes the Y component of the second image with frame number (2n-1), I'_UV(n) denotes the UV component of the target image with frame number n, and I_UV(2n) denotes the UV component of the first image with frame number 2n.
As can be seen from the above equations (2) and (3), the second image of the (2n-1)-th frame and the first image of the 2n-th frame can be synthesized by the image synthesizing unit 307 to obtain the target image with frame number n.
Fig. 12 shows a schematic diagram of the image composition unit 307 composing the first image and the second image in YUV color space. The second image with frame number 1 and the first image with frame number 2 are the two images output by the image sensor 304 during the first image acquisition period. The image synthesizing unit 307 can synthesize these two frames into a target image with frame number 1, whose Y component comes from the Y component of the second image with frame number 1 and whose UV component comes from the UV component of the first image with frame number 2. Similarly, the image synthesizing unit 307 may synthesize the second image with frame number 3 and the first image with frame number 4 in the second image acquisition period to obtain the target image with frame number 2, whose Y component comes from the Y component of the second image with frame number 3 and whose UV component comes from the UV component of the first image with frame number 4.
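In code, the per-period composition of fig. 12 can be sketched as follows; the H x W x 3 buffer layout in Y, U, V channel order is an assumption made only for illustration.

```python
import numpy as np

def synthesize_target(second_img, first_img):
    """Compose one target frame per acquisition period as in equations (2)
    and (3): Y from the odd-numbered second image (black-and-white frame),
    UV from the even-numbered first image (color frame). Both inputs are
    assumed to be H x W x 3 uint8 arrays in Y, U, V channel order."""
    target = np.empty_like(first_img)
    target[..., 0] = second_img[..., 0]     # I'_Y(n)  = I_Y(2n-1)
    target[..., 1:] = first_img[..., 1:]    # I'_UV(n) = I_UV(2n)
    return target

# Period 1: frames 1 (second image) and 2 (first image) -> target frame 1;
# period 2: frames 3 and 4 -> target frame 2, and so on.
```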
It should be understood that fig. 12 only shows an example of the first image acquisition cycle and the second image acquisition cycle, after the second image acquisition cycle, the image sensor may further acquire the first image and the second image in a subsequent image acquisition cycle, and the manner of image synthesis in the subsequent image acquisition cycle may be referred to the description in the first image acquisition cycle or the second image acquisition cycle, and is not repeated here for brevity.
As an example, a hall sensor and a timer may be provided in the filter control unit 306 to monitor the position information of the tunable filter 303 and to send control signals to the tunable filter 303, the image sensor, and the image synthesis unit according to the position information and the timer information.
In some possible implementations, the control unit 230 is further configured to determine the ambient illuminance of the subject environment according to the gain of the image sensor 220. In actual use of the image capturing apparatus 200, the gain of the image sensor 220 gradually increases as the illuminance of the subject environment decreases. When the gain of the image sensor 220 is greater than a preset value, it can be determined that the shooting environment is a low-illumination environment. At this time, the optical filter may be controlled to pass visible light and block infrared light at the first image sampling interval and to pass infrared light at the second image sampling interval. As an example, the preset value may be 36 dB.
In some optional embodiments, the control unit 230 is further configured to control the image sensor 220 to perform photoelectric imaging on the visible light at a third image sampling interval to obtain the target image when it is determined that the gain of the image sensor 220 is smaller than or equal to the preset value. Here, in one target image acquisition period, the target image may be directly acquired in the third image sampling interval.
Here, the third image sampling interval may be equal to the sum of the first image sampling interval and the second image sampling interval. For example, assuming that the image acquisition device requires 10 milliseconds (ms) to output an image, the first image sampling interval plus the second image sampling interval should total 10 ms, the two intervals together producing one output image, while the third image sampling interval is itself 10 ms and alone produces one output image.
That is, the output frame rate of the image sensor at the third image sampling interval is half of its output frame rate at the first or second image sampling interval. In other words, the output frame rate of the image sensor when its gain is less than or equal to the preset value (i.e., in a normal environment) is half of its output frame rate when the gain is greater than the preset value (i.e., in a low-illumination environment).
Illustratively, for the apparatus 300, under the control of the filter control unit 306, the output frame rate of the image sensor 304 at the first or second image sampling interval is 2 times its output frame rate at the third image sampling interval.
In this way, after the image synthesis unit synthesizes two frames of images acquired by the image sensor into one frame of image, the frame rate of the output image of the image acquisition device in the low-illumination environment can be kept unchanged relative to the frame rate output in the normal environment.
For example, when the ambient illuminance is normal (for example, the gain of the image sensor is lower than 36 dB), the output frame rate of the image sensor is 25 frames/second. When the shooting environment is detected to be a low-illumination environment (for example, the gain of the image sensor rises to 36 dB or above), the output frame rate of the image sensor can be modified to 50 frames/second.
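A minimal sketch of this gain-based decision follows; the 36 dB threshold and the 25 frame/second base rate are the example values above, and the returned dictionary is only an illustrative stand-in for the device's real configuration interface.

```python
PRESET_GAIN_DB = 36.0   # example preset value from the text

def select_capture_mode(sensor_gain_db, normal_fps=25):
    """Gain-based ambient-illuminance decision (illustrative sketch)."""
    if sensor_gain_db > PRESET_GAIN_DB:
        # Low-illumination environment: double the sensor frame rate and
        # alternate the first/second image sampling intervals; two sensor
        # frames are later synthesized into one output frame.
        return {"mode": "dual_spectrum", "sensor_fps": 2 * normal_fps}
    # Normal illuminance: image visible light directly in the third image
    # sampling interval; each sensor frame is an output frame.
    return {"mode": "visible_only", "sensor_fps": normal_fps}
```

Either way, the output frame rate of the device stays at normal_fps, which is the point of doubling the sensor rate in the low-illumination branch.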
In some alternative embodiments, the control unit 230 may move the filter out of the optical path at the third image sampling interval, which is not limited in this embodiment.
Therefore, in the embodiment of the present application, a first image containing chrominance information is obtained by performing photoelectric imaging on the light passing through the optical filter at the first image sampling interval, a second image containing luminance information is obtained by performing photoelectric imaging on the light passing through the optical filter at the second image sampling interval, and the first image and the second image are then synthesized to obtain a high-quality color target image. Because the image sensor in the embodiment of the present application can acquire the first image and the second image in different time periods, a single sensor can be used, which reduces material cost, eliminates the need to register images from multiple sensors, reduces complexity, and helps reduce the size of the device.
Fig. 13 shows a schematic flowchart of an image acquisition method 400 provided in an embodiment of the present application. The method 400 may be performed by the image acquisition apparatus 200 in fig. 2 above, or by the image acquisition apparatus 300 in fig. 11. As shown in fig. 13, method 400 includes steps 410 through 440.
It should be noted that fig. 13 shows steps or operations of the image acquisition method, but these steps or operations are merely examples, and other operations or variations of the operations in fig. 13 may also be performed in the embodiment of the present application. Further, it is possible that not all of the operations in fig. 13 are performed.
Optionally, 410, ambient illumination is estimated.
For example, the ambient illumination may be estimated based on the gain of the image sensor. Then, it may be determined whether to perform step 420 according to the estimated ambient illuminance.
In the process of image acquisition, the gain of the image sensor gradually increases along with the reduction of the ambient illumination. For example, when the gain of the image sensor is lower than 36dB, it may be determined that the subject environment is a normal illuminance environment. When the gain of the image sensor is higher than 36dB, the subject environment can be determined to be a low light environment.
When the shooting environment is determined to be a normal illumination environment, the optical filter in fig. 2 or fig. 3 may be moved out of the optical path, and an image obtained by performing photoelectric imaging on visible light by the image sensor may be output, displayed, or saved. As an example, the output frame rate of the image sensor at this time may be 25 frames/second.
When it is determined that the subject environment is a low-illuminance environment, step 420 may be performed.
This step 410 may be performed by the control unit 230 in fig. 2, or the filter control unit 306 in fig. 11, as an example.
A first image and a second image are acquired 420.
Specifically, visible light passing through a filter is acquired at a first image sampling interval, and infrared light passing through the filter is acquired at a second image sampling interval. The filter can be referred to the above description, and is not described herein for brevity.
A first image may then be acquired by photo-electrically imaging the light passing through the filter at the first image sampling interval by an image sensor. A second image may be acquired by the image sensor performing a photo-electric imaging of the light passing through the optical filter at the second image sampling interval.
In some embodiments, after the shooting environment is determined to be a low-illumination environment, the output frame rate of the image sensor may be modified from 25 frames/second to 50 frames/second, that is, the acquisition frame rate of the image acquisition device is doubled, so that it can be ensured that the output frame rate of the image acquisition device in the low-illumination environment is the same as the output frame rate in normal illumination.
In some embodiments, when the filter is a circular filter, the motor may be activated to rotate the circular filter before step 420. And when the rotating speed of the circular filter reaches the set rotating speed, starting the exposure of black and white frames and color frames to obtain a first image and a second image.
As an example, the filter and the image sensor may be controlled by the control unit 230 in fig. 2 to acquire the first image and the second image, or the tunable filter and the image sensor may be controlled by the filter control unit in fig. 11 to acquire the first image and the second image. Specifically, for the processes of acquiring the first image and the second image, reference may be made to the description above, which is not repeated here for brevity.
Optionally, after step 420, basic image processing may be performed on the acquired first and second images. For example, demosaicing, white balance, color correction, etc., which are not limited in this embodiment.
430, the first image and the second image are synthesized to generate a target image.
As an example, the image synthesis unit may be controlled by the control unit 230 to perform this step 430. Specifically, the synthesis process can be referred to the above description, and is not repeated herein for brevity.
Optionally, 440, the target image is transmitted, displayed, or stored.
Illustratively, the target image may be output to an associated service module, and the service module performs operations such as transmission, display, or storage on the synthesized image according to the service application.
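Putting steps 410 through 440 together, a hypothetical top-level loop might look as follows; every method on the device object is an assumed placeholder for the units described above, not an interface defined by this embodiment.

```python
def acquisition_loop(device, preset_gain_db=36.0):
    """Illustrative sketch of method 400: estimate illuminance, acquire,
    synthesize, and output."""
    while device.running():
        if device.sensor_gain_db() > preset_gain_db:   # 410: low illuminance
            first = device.capture_color_frame()       # 420: first image
            second = device.capture_mono_frame()       # 420: second image
            target = device.synthesize(second, first)  # 430: target image
        else:
            # Normal illuminance: move the filter out of the optical path
            # and image visible light in the third sampling interval.
            target = device.capture_visible_frame()
        device.output(target)                          # 440: transmit/display/store
```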
Therefore, in the embodiment of the present application, a first image containing chrominance information is obtained by performing photoelectric imaging on the light passing through the optical filter at the first image sampling interval, a second image containing luminance information is obtained by performing photoelectric imaging on the light passing through the optical filter at the second image sampling interval, and the first image and the second image are then synthesized to obtain a high-quality color target image. Because the image sensor in the embodiment of the present application can acquire the first image and the second image in different time periods, a single sensor can be used, which reduces material cost, eliminates the need to register images from multiple sensors, reduces complexity, and helps reduce the size of the device.
It should be noted that the examples in the embodiments of the present application are only for assisting those skilled in the art to understand and implement the embodiments of the present application, and do not limit the scope of the embodiments of the present application. Equivalent alterations and modifications may be made by those skilled in the art based on the examples set forth herein, and such alterations and modifications are intended to be within the scope of the embodiments of the present application.
It should also be noted that, in various embodiments of the present invention, the sequence numbers of the above-mentioned processes do not mean the execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation on the implementation process of the embodiments of the present invention.
Embodiments of the present application further provide a computer-readable storage medium comprising a computer program which, when run on a computer, causes the computer to execute the relevant instructions of the method or apparatus provided by the above embodiments.
Embodiments of the present application further provide a computer program product containing instructions which, when run on a computer, cause the computer to execute the relevant instructions of the method or apparatus provided by the above embodiments.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application or portions thereof that substantially contribute to the prior art may be embodied in the form of a software product stored in a storage medium and including instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present application. And the aforementioned storage medium includes: various media capable of storing program codes, such as a usb disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
The above description is only for the specific embodiments of the present application, but the scope of the present application is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present application, and shall be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (29)

1. An image acquisition apparatus, characterized by comprising:
the optical filter is used for gating the incident light;
the image sensor is used for performing photoelectric imaging on the light rays which pass through the optical filter in the incident light;
the control unit is connected with the optical filter and the image sensor and used for controlling the optical filter to pass visible light in the incident light and block infrared light in the incident light at a first image sampling interval and pass infrared light in the incident light at a second image sampling interval; and
and is used for controlling the image sensor to perform photoelectric imaging on the light rays in the incident light which pass through the optical filter at the first image sampling interval to acquire a first image, and to perform photoelectric imaging on the light rays in the incident light which pass through the optical filter at the second image sampling interval to acquire a second image; and
and the image synthesis unit is used for synthesizing the first image and the second image to generate a first target image.
2. The apparatus of claim 1, wherein the optical filter comprises a first filtering region for passing visible light and blocking infrared light and a second filtering region for passing infrared light;
the control unit is specifically configured to filter the incident light using the first filtering region at the first image sampling interval, and filter the incident light using the second filtering region at the second image sampling interval.
3. The apparatus of claim 2, wherein the filter is a circular filter, the first filtering region is a first sector of the circular filter, and the second filtering region is a second sector of the circular filter;
the control unit comprises a motor, the motor is used for controlling the circular optical filter to rotate around the circle center of the circular optical filter, so that the first fan-shaped area covers the photosensitive area of the image sensor at the first image sampling interval, and the second fan-shaped area covers the photosensitive area at the second image sampling interval.
4. The apparatus of claim 3, wherein the photosensitive region is a rectangular region, and wherein the circular filter satisfies the following condition when the duration of the first image sampling interval is the same as the duration of the second image sampling interval:
R≥c+b/(2sin((π–ωt)/2));
where ω represents an angular velocity of rotation of the circular filter, t represents a duration of the first image sampling interval or the second image sampling interval, R represents a radius of the circular filter, b represents a first side length of a photosensitive region of the image sensor, c represents a second side length of the photosensitive region of the image sensor, and the first side and the second side are perpendicular to each other.
5. The apparatus of claim 4, wherein the first sector area corresponds to a central angle greater than or equal to ω t +2arcsin (b/(2(R-c))), and/or the second sector area corresponds to a central angle greater than or equal to ω t +2arcsin (b/(2 (R-c))).
6. The apparatus of claim 3, wherein the photosensitive region is a rectangular region, and wherein the circular filter satisfies the following condition when the duration of the first image sampling interval is the same as the duration of the second image sampling interval:
R≥c+b/(2sin(π–ωt));
where ω represents an angular velocity of rotation of the circular filter, t represents a duration of the first image sampling interval or the second image sampling interval, R represents a radius of the circular filter, b represents a first side length of a photosensitive region of the image sensor, c represents a second side length of the photosensitive region of the image sensor, and the first side and the second side are perpendicular to each other.
7. The apparatus of claim 6, wherein the first sector area corresponds to a central angle greater than or equal to ω t +2arcsin (b/(2 (R-c))).
8. The device according to any one of claims 3-7, wherein the photosensitive region is a rectangular region, and the circular filter satisfies the following condition:
R≥c+b/(2sin(β/2));
wherein, R represents a radius of the circular filter, b represents a first side length of a photosensitive area of the image sensor, c represents a second side length of the photosensitive area of the image sensor, β is a central angle of the circular filter to which the first side length is opposite, and the first side and the second side are perpendicular to each other.
9. The apparatus according to claim 8, wherein the angular velocity of rotation of the circular filter is ω, the central angle of the first sector in the circular filter is (2π-γ), the central angle of the second sector is γ, the exposure time of the first image is t1, and the exposure time of the second image is t2;
then ω, γ, t1 and t2 satisfy the following formulas:
2π-γ-β≥ωt1;
γ+β≥ωt2, or γ≥ωt2, or γ-β≥ωt2.
10. The apparatus of claim 2, wherein the control unit comprises a driver and a lever, a first end of the lever is connected with the filter, and the driver is configured to control the filter to move through the lever, so that the first filter region covers a photosensitive region of the image sensor at the first image sampling interval and the second filter region covers the photosensitive region at the second image sampling interval.
11. The apparatus of claim 1, wherein the filter comprises an electrically controlled light absorbing material therein,
the control unit is specifically configured to apply a voltage to the optical filter, and control the magnitude of the voltage so that the optical filter passes visible light and blocks infrared light at the first image sampling interval, and passes infrared light at the second image sampling interval.
12. The device of claim 11, wherein the electrically controlled light absorbing material comprises an organic color changing material or a liquid crystal material.
13. The apparatus of any one of claims 1-12, further comprising:
an optical unit including an optical lens for capturing the incident light and imaging the incident light on the image sensor, wherein the optical filter is disposed before, after, or between two lenses in the optical lens.
14. The device according to any of claims 1-13, wherein the control unit is specifically configured to:
and under the condition that the gain of the image sensor is determined to be larger than a preset value, controlling the optical filter to pass through visible light and block infrared light at the first image sampling interval and pass through infrared light at the second image sampling interval, and controlling the image sensor to perform photoelectric imaging on light rays passing through the optical filter at the first image sampling interval and perform photoelectric imaging on light rays passing through the optical filter at the second image sampling interval.
15. The apparatus according to any one of claims 1-14, wherein the control unit is further configured to:
and when the gain of the image sensor is determined to be smaller than or equal to the preset value, controlling the image sensor to perform photoelectric imaging on incident light in a third image sampling interval to acquire a second target image, wherein the duration of the third image sampling interval is equal to the sum of the duration of the first image sampling interval and the duration of the second image sampling interval.
16. An image acquisition method, comprising:
controlling the optical filter to pass visible light and block infrared light in incident light at a first image sampling interval, and controlling the optical filter to pass infrared light in incident light at a second image sampling interval;
performing photoelectric imaging on the light rays which pass through the optical filter at the first image sampling interval in the incident light through an image sensor to obtain a first image;
performing photoelectric imaging on the light rays which pass through the optical filter at the second image sampling interval in the incident light through the image sensor to obtain a second image;
and synthesizing the first image and the second image by an image synthesis unit to generate a first target image.
17. The method of claim 16, wherein the filter comprises a first filter region for passing visible light and blocking infrared light and a second filter region for passing infrared light;
wherein, controlling the optical filter to pass the visible light and block the infrared light in the incident light at the first image sampling interval, and controlling the optical filter to pass the infrared light in the incident light at the second image sampling interval, comprises:
and controlling the first filtering area to filter the incident light at the first image sampling interval, and controlling the second filtering area to filter the incident light at the second image sampling interval.
18. The method of claim 17, wherein the filter is a circular filter, the first filtering region is a first sector of the circular filter, and the second filtering region is a second sector of the circular filter;
wherein controlling the first filtering region to filter the incident light at the first image sampling interval and controlling the second filtering region to filter the incident light at the second image sampling interval includes:
the circular optical filter is controlled by a motor to rotate around the circle center of the circular optical filter, so that the first fan-shaped area covers the photosensitive area of the image sensor at the first image sampling interval, and the second fan-shaped area covers the photosensitive area at the second image sampling interval.
19. The method of claim 18, wherein the photosensitive region is a rectangular region, and wherein the circular filter satisfies the following condition when the duration of the first image sampling interval is the same as the duration of the second image sampling interval:
R≥c+b/(2sin((π–ωt)/2));
where ω represents an angular velocity of rotation of the circular filter, t represents a duration of the first image sampling interval or the second image sampling interval, R represents a radius of the circular filter, b represents a first side length of a photosensitive region of the image sensor, c represents a second side length of the photosensitive region of the image sensor, and the first side and the second side are perpendicular to each other.
20. The method of claim 19, wherein the first sector area corresponds to a central angle greater than or equal to ω t +2arcsin (b/(2(R-c))), and/or the second sector area corresponds to a central angle greater than or equal to ω t +2arcsin (b/(2 (R-c))).
21. The method of claim 18, wherein the photosensitive region is a rectangular region, and wherein the circular filter satisfies the following condition when the duration of the first image sampling interval is the same as the duration of the second image sampling interval:
R≥c+b/(2sin(π–ωt));
where ω represents an angular velocity of rotation of the circular filter, t represents a duration of the first image sampling interval or the second image sampling interval, R represents a radius of the circular filter, b represents a first side length of a photosensitive region of the image sensor, c represents a second side length of the photosensitive region of the image sensor, and the first side and the second side are perpendicular to each other.
22. The method of claim 21, wherein the first sector area corresponds to a central angle greater than or equal to ω t +2arcsin (b/(2 (R-c))).
23. The method of any one of claims 18-22, wherein the photosensitive region is a rectangular region, and the circular filter satisfies the following condition:
R≥c+b/(2sin(β/2));
wherein, R represents a radius of the circular filter, b represents a first side length of a photosensitive area of the image sensor, c represents a second side length of the photosensitive area of the image sensor, β is a central angle of the circular filter to which the first side length is opposite, and the first side and the second side are perpendicular to each other.
24. The method of claim 23, wherein the angular velocity of rotation of the circular filter is ω, the central angle of the first sector in the circular filter is (2π-γ), the central angle of the second sector is γ, the exposure time of the first image is t1, and the exposure time of the second image is t2;
then ω, γ, t1 and t2 satisfy the following formulas:
2π-γ-β≥ωt1;
γ+β≥ωt2, or γ≥ωt2, or γ-β≥ωt2.
25. The method of claim 17, wherein controlling the first filtering region to filter incident light at the first image sampling interval and controlling the second filtering region to filter incident light at the second image sampling interval comprises:
controlling the optical filter to move through a driver and a lever, so that the first filter region covers a photosensitive region of the image sensor at the first image sampling interval to obtain the visible light passing through the optical filter, and the second filter region covers the photosensitive region at the second image sampling interval to obtain the infrared light passing through the optical filter;
wherein the first end of the lever is connected with the optical filter.
26. The method of claim 16, wherein the filter includes an electrically controllable light absorbing material therein,
wherein, controlling the optical filter to pass the visible light and block the infrared light in the incident light at the first image sampling interval, and controlling the optical filter to pass the infrared light in the incident light at the second image sampling interval, comprises:
the method comprises the steps of applying voltage to an optical filter, controlling the size of the voltage to enable the optical filter to pass through visible light and block infrared light at a first image sampling interval, obtaining the visible light passing through the optical filter, and obtaining the infrared light passing through the optical filter at a second image sampling interval.
27. The method of claim 26, wherein the electrically controlled light absorbing material comprises an organic color changing material or a liquid crystal material.
28. The method of any one of claims 16-27, further comprising:
and under the condition that the gain of the image sensor is determined to be larger than a preset value, controlling the optical filter to pass through visible light and block infrared light at the first image sampling interval and pass through infrared light at the second image sampling interval, and controlling the image sensor to perform photoelectric imaging on light rays passing through the optical filter at the first image sampling interval and perform photoelectric imaging on light rays passing through the optical filter at the second image sampling interval.
29. The method of any one of claims 16-28, further comprising:
and when the gain of the image sensor is determined to be smaller than or equal to a preset value, performing photoelectric imaging on incident light through the image sensor at a third image sampling interval to obtain a second target image, wherein the duration of the third image sampling interval is equal to the sum of the duration of the first image sampling interval and the duration of the second image sampling interval.
CN202010087243.9A 2020-02-11 2020-02-11 Image acquisition device and image acquisition method Active CN113259546B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202010087243.9A CN113259546B (en) 2020-02-11 2020-02-11 Image acquisition device and image acquisition method
PCT/CN2020/126076 WO2021159768A1 (en) 2020-02-11 2020-11-03 Image acquisition apparatus and image acquisition method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010087243.9A CN113259546B (en) 2020-02-11 2020-02-11 Image acquisition device and image acquisition method

Publications (2)

Publication Number Publication Date
CN113259546A true CN113259546A (en) 2021-08-13
CN113259546B CN113259546B (en) 2023-05-12

Family

ID=77219573

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010087243.9A Active CN113259546B (en) 2020-02-11 2020-02-11 Image acquisition device and image acquisition method

Country Status (2)

Country Link
CN (1) CN113259546B (en)
WO (1) WO2021159768A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114650359A (en) * 2022-03-22 2022-06-21 维沃移动通信有限公司 Camera module and electronic equipment
WO2023028769A1 (en) * 2021-08-30 2023-03-09 Oppo广东移动通信有限公司 Imaging module, imaging system, image processing method, and terminal

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109951624A (en) * 2019-04-12 2019-06-28 武汉鸿瑞达信息技术有限公司 A kind of imaging camera system and method based on filter halo
CN109951646A (en) * 2017-12-20 2019-06-28 杭州海康威视数字技术股份有限公司 Image interfusion method, device, electronic equipment and computer readable storage medium
CN110490041A (en) * 2019-05-31 2019-11-22 杭州海康威视数字技术股份有限公司 Human face image collecting device and method

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH08321977A (en) * 1995-05-25 1996-12-03 Sony Corp Filter disk device and video camera provided with the device
JPH1051796A (en) * 1996-05-31 1998-02-20 Olympus Optical Co Ltd Solid-state image pickup device
CN104661008B (en) * 2013-11-18 2017-10-31 深圳中兴力维技术有限公司 The treating method and apparatus that color image quality is lifted under low light conditions

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109951646A (en) * 2017-12-20 2019-06-28 杭州海康威视数字技术股份有限公司 Image interfusion method, device, electronic equipment and computer readable storage medium
CN109951624A (en) * 2019-04-12 2019-06-28 武汉鸿瑞达信息技术有限公司 A kind of imaging camera system and method based on filter halo
CN110490041A (en) * 2019-05-31 2019-11-22 杭州海康威视数字技术股份有限公司 Human face image collecting device and method

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023028769A1 (en) * 2021-08-30 2023-03-09 Oppo广东移动通信有限公司 Imaging module, imaging system, image processing method, and terminal
CN114650359A (en) * 2022-03-22 2022-06-21 维沃移动通信有限公司 Camera module and electronic equipment

Also Published As

Publication number Publication date
WO2021159768A1 (en) 2021-08-19
CN113259546B (en) 2023-05-12

Similar Documents

Publication Publication Date Title
CN109788207B (en) Image synthesis method and device, electronic equipment and readable storage medium
US10274706B2 (en) Image capture control methods and apparatus
CN105323474B (en) Picture pick-up device and its control method
EP2380345B1 (en) Improving the depth of field in an imaging system
US8416339B2 (en) Providing multiple video signals from single sensor
US7609291B2 (en) Device and method for producing an enhanced color image using a flash of infrared light
CN102783135A (en) Method and apparatus for providing a high resolution image using low resolution
CN102892008A (en) Dual image capture processing
JP2016541151A (en) Method and apparatus for implementing and / or using a camera device
WO2010059182A1 (en) Extended depth of field for image sensor
WO2012057622A1 (en) System and method for imaging using multi aperture camera
WO2021139401A1 (en) Camera module, imaging method, and imaging apparatus
JP7156352B2 (en) IMAGING DEVICE, IMAGING METHOD, AND PROGRAM
CN113711584B (en) Camera device
CN112839215B (en) Camera module, camera, terminal device, image information determination method and storage medium
WO2008079187A1 (en) Anti-aliasing in imaging device using image stabilization system
JP2002171430A (en) Compound eye imaging system, imaging device and electronic apparatus
CN113259546A (en) Image acquisition apparatus and image acquisition method
CN109005343A (en) Control method, device, imaging device, electronic equipment and readable storage medium storing program for executing
JP2023516410A (en) Image sensor and image light sensing method
JP2011017827A (en) Filter device, imaging lens and imaging device including the same
CN114793271A (en) Image acquisition device and brightness balance method
WO2022078036A1 (en) Camera and control method therefor
KR102429093B1 (en) Apparatus and method for photographing multi-band image
CN113728618A (en) Camera device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant