WO2022188558A1 - Image processing method and apparatus, and camera - Google Patents

Image processing method and apparatus, and camera

Info

Publication number
WO2022188558A1
WO2022188558A1 (PCT/CN2022/072686)
Authority
WO
WIPO (PCT)
Prior art keywords
light
infrared
time
processing chip
image
Prior art date
Application number
PCT/CN2022/072686
Other languages
English (en)
French (fr)
Inventor
朱海燕
Original Assignee
杭州海康威视数字技术股份有限公司 (Hangzhou Hikvision Digital Technology Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 杭州海康威视数字技术股份有限公司 (Hangzhou Hikvision Digital Technology Co., Ltd.)
Publication of WO2022188558A1 publication Critical patent/WO2022188558A1/zh

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/95Computational photography systems, e.g. light-field imaging systems
    • H04N23/951Computational photography systems, e.g. light-field imaging systems by using two or more images to influence resolution, frame rate or aspect ratio
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B15/00Special procedures for taking photographs; Apparatus therefor
    • G03B15/02Illuminating scene
    • G03B15/03Combinations of cameras with lighting apparatus; Flash units
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70Circuitry for compensating brightness variation in the scene
    • H04N23/74Circuitry for compensating brightness variation in the scene by influencing the scene brightness using illuminating means
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/30Transforming light or analogous information into electric information
    • H04N5/33Transforming infrared radiation
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02BCLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO BUILDINGS, e.g. HOUSING, HOUSE APPLIANCES OR RELATED END-USER APPLICATIONS
    • Y02B20/00Energy efficient lighting technologies, e.g. halogen lamps or gas discharge lamps
    • Y02B20/40Control techniques providing energy savings, e.g. smart controller or presence detection

Definitions

  • the present application relates to the technical field of image processing, and in particular, to an image processing method, device and camera.
  • In the related art, the camera is provided with a white-light fill light. In a low-illumination environment, the white-light fill light is turned on to illuminate the target scene, which raises the ambient brightness and turns the low-illumination environment into a high-illumination one, so that the camera can capture a high-brightness, high-definition color image of the target scene.
  • However, the white-light fill light needs to be on all the time, that is, it is always lit. This consumes considerable power, is not conducive to energy saving, and the white light easily causes light pollution.
  • In one aspect, the present application provides a camera including a processing chip, an image sensor, and an infrared fill light;
  • the processing chip is used to determine the on-time interval and the off-time interval corresponding to the infrared fill light, where the on-time interval and the off-time interval alternate;
  • the infrared image is generated by the image sensor based on infrared light, and the color image is generated by the image sensor based on visible light;
  • image fusion is performed on the infrared image and the color image.
  • the present application provides an image processing method, which is applied to a camera, where the camera at least includes a processing chip, an image sensor and an infrared fill light, and the method includes:
  • the processing chip determines an on-time interval and an off-time interval corresponding to the infrared fill light; wherein the on-time interval and the off-time interval appear alternately;
  • the processing chip controls the infrared fill light to illuminate the target scene during the on-time interval, controls the image sensor to collect an infrared image of the target scene during the on-time interval, and controls the image sensor to collect a color image of the target scene during the off-time interval; the infrared image is generated by the image sensor based on infrared light while the infrared fill light illuminates the target scene, and the color image is generated by the image sensor based on visible light while the infrared fill light does not illuminate the target scene;
  • the processing chip performs image fusion on the infrared image and the color image.
  • the present application provides a camera, including: a processing chip, an image sensor, and an infrared fill light;
  • the processing chip is used for:
  • controlling the image sensor to collect an infrared image of the scene during the on time interval, and controlling the image sensor to collect a color image of the scene during the off time interval;
  • the infrared image is generated by the image sensor based on infrared light while the infrared fill light illuminates the scene; the color image is generated by the image sensor based on visible light while the infrared fill light does not illuminate the scene;
  • Image fusion is performed on the infrared image and the color image.
  • In a low-illumination environment, the target scene is illuminated by an infrared fill light whose on-time and off-time intervals alternate; that is, the infrared fill light is not always on. This shortens the fill-light time, lowers power consumption, saves energy, and reduces the light pollution caused by long-term illumination.
  • By fusing the infrared image with the color image, a high-brightness, high-definition color image is obtained, so that such images can be collected even in a low-light environment.
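The alternating capture scheme summarized above can be sketched as a small simulation. This is illustrative only; the names (`capture_sequence`, the frame labels) are not from the patent:

```python
# Illustrative sketch of the alternating capture scheme: a color frame in
# each off-time interval, an infrared frame in each on-time interval.

def capture_sequence(num_pairs):
    """Return the alternating frame sequence as (kind, fill-light state)."""
    frames = []
    for _ in range(num_pairs):
        frames.append(("color", "fill light off"))    # visible light only
        frames.append(("infrared", "fill light on"))  # IR fill light active
    return frames

frames = capture_sequence(3)
print([kind for kind, _ in frames])
# ['color', 'infrared', 'color', 'infrared', 'color', 'infrared']
```

Each adjacent color/infrared pair is then handed to the fusion step described later.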
  • FIG. 1 is a hardware structure diagram of a camera in an embodiment of the present application
  • FIGS. 2A and 2B are schematic diagrams of an on-time interval and an off-time interval
  • FIG. 3 is a hardware structure diagram of a camera in an embodiment of the present application.
  • FIGS. 4A and 4B are schematic diagrams of signal transmission in an embodiment of the present application.
  • FIGS. 5 and 6 are hardware structure diagrams of a camera in an embodiment of the present application.
  • FIG. 7 is a schematic circuit diagram of an infrared fill light drive in an embodiment of the present application.
  • FIG. 8 is a schematic diagram of supplementary light synchronization of multiple cameras in an embodiment of the present application.
  • FIG. 9 is a schematic flowchart of an image processing method in an embodiment of the present application.
  • first, second, third, etc. may be used in the embodiments of the present application to describe various information, the information should not be limited to these terms. These terms are only used to distinguish the same type of information from each other.
  • the first information may also be referred to as the second information, and similarly, the second information may also be referred to as the first information without departing from the scope of the present application.
  • the word "if" may be interpreted as "at the time of", "when", or "in response to determining", depending on the context.
  • An embodiment of the present application proposes a camera, which may include an infrared fill light.
  • In a low-illumination environment (that is, when the ambient brightness is less than a preset brightness threshold), the infrared fill light illuminates the target scene (that is, the camera's field of view, the scene whose images the camera collects) with infrared light, so that high-brightness, high-definition color images can be collected by the camera.
  • In a high-illumination environment (that is, when the ambient brightness is not less than the preset brightness threshold), no fill light is needed.
  • The following takes as an example the case where an infrared fill light is needed to illuminate the target scene in a low-illumination environment; the processing in a high-illumination environment is not repeated.
  • the camera may include a processing chip 11 , an image sensor 12 and an infrared fill light 13 .
  • the processing chip 11 may be implemented by an SoC (system-on-chip) or by another type of chip, which is not limited here.
  • the image sensor 12 may be a CMOS (complementary metal-oxide-semiconductor) image sensor or another type of image sensor, which is not limited here.
  • the infrared fill light 13 is a fill light with an infrared fill function, which can fill the target scene with light.
  • natural light may strike the image sensor 12, so that the image sensor 12 generates an image based on the natural light.
  • the natural light irradiating the image sensor 12 may be infrared light
  • the image sensor 12 can generate an image based on the infrared light
  • the image generated based on infrared light is called an infrared image, that is, a black-and-white or grayscale image.
  • the natural light irradiating the image sensor 12 may also be visible light; the image sensor 12 can generate an image based on the visible light, and the image generated based on visible light is called a color image, that is, an RGB (red, green, blue) image.
  • the infrared image generated based on infrared light does not contain color information, but only contains luminance information, while the color image generated based on visible light contains color information.
  • In a low-illumination environment, the infrared fill light 13 can be used to illuminate the target scene, and its on-time interval and off-time interval alternate; that is, the infrared fill light 13 is not always on, which reduces the fill-light time and the light pollution caused by long-term illumination.
  • the processing chip 11 determines the on-time interval and the off-time interval corresponding to the infrared fill light 13, and the two intervals alternate. The processing chip 11 then controls the infrared fill light 13 to illuminate the target scene during the on-time interval, controls the image sensor 12 to collect an infrared image of the target scene during the on-time interval, and controls the image sensor 12 to collect a color image of the target scene during the off-time interval.
  • the infrared image is generated by the image sensor 12 based on infrared light while the infrared fill light 13 illuminates the target scene; the color image is generated by the image sensor 12 based on visible light while the infrared fill light 13 does not illuminate the target scene.
  • the processing chip 11 performs image fusion on the infrared image and the color image.
  • the on-time interval and the off-time interval alternate. Because both intervals are short, the infrared image and the color image collected by the image sensor 12 in adjacent on-time and off-time intervals have similar content, so the infrared image and the color image can be fused.
  • the processing chip 11 can determine an on time interval and an off time interval corresponding to the infrared fill light 13, and the on time interval can indicate that the infrared fill light 13 is turned on, thereby performing fill light on the target scene.
  • the off time interval may indicate that the infrared fill light 13 is turned off, so that no fill light is performed on the target scene.
  • the duration of the on-time interval and the duration of the off-time interval may be the same or different. FIG. 2A is a schematic diagram of the on-time and off-time intervals; in FIG. 2A, the two durations are the same.
  • the on-time interval and the off-time interval alternate, that is, each on-time interval is followed by an off-time interval, and each off-time interval is followed by an on-time interval.
  • the on-time interval includes a fill-light start moment (the start of the on-time interval) and a fill-light end moment (the end of the on-time interval), and the off-time interval includes a no-fill-light start moment (the start of the off-time interval) and a no-fill-light end moment (the end of the off-time interval).
  • the fill-light end moment of an on-time interval coincides with the no-fill-light start moment of the following off-time interval, and the no-fill-light end moment of an off-time interval coincides with the fill-light start moment of the following on-time interval.
  • the closing time interval a1 is followed by the opening time interval a2
  • the opening time interval a2 is followed by the closing time interval a3
  • the closing time interval a3 is followed by the opening time interval a4.
  • the opening time interval a4 is followed by the closing time interval a5, and so on.
  • the end time of the non-complementary light in the off time interval is the same as the start time of the supplementary light in the on time interval
  • the end time of the supplementary light in the on time interval is the same as the start time of the non-complementary light in the off time interval.
  • FIG. 2B takes the case where an off-time interval comes first and is then followed by an on-time interval as an example.
  • its implementation is otherwise similar to that of FIG. 2A and is not repeated in this embodiment.
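The interval boundaries shown in FIGS. 2A and 2B can be sketched as follows (a minimal helper, assuming integer times in milliseconds; the function name is illustrative, not from the patent):

```python
def build_intervals(t0, on_len, off_len, count, start_with_off=True):
    """Return alternating (state, start, end) intervals as in FIGS. 2A/2B.
    Each interval's end moment equals the next interval's start moment:
    the fill-light end moment is the no-fill-light start moment, and
    vice versa."""
    intervals, t = [], t0
    state = "off" if start_with_off else "on"
    for _ in range(count):
        dur = off_len if state == "off" else on_len
        intervals.append((state, t, t + dur))
        t += dur
        state = "on" if state == "off" else "off"
    return intervals

# FIG. 2B ordering: off (a1), on (a2), off (a3), on (a4); equal durations.
print(build_intervals(0, 20, 20, 4))
# [('off', 0, 20), ('on', 20, 40), ('off', 40, 60), ('on', 60, 80)]
```

With `on_len == off_len` this matches FIG. 2A's equal-duration case; unequal values give the more general schedule.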
  • the duration of the on time interval and the exposure duration of the image sensor 12 may be the same.
  • the duration of the off time interval is also the same as the exposure duration of the image sensor 12 .
  • the exposure duration (ie, exposure time) of the image sensor 12 refers to the time interval from the shutter opening to closing, during which the image sensor 12 captures one frame of image.
  • the processing chip 11 may first obtain the exposure duration of the image sensor 12, determine the duration of the on-time interval and the duration of the off-time interval based on the exposure duration, and then determine the on-time intervals and off-time intervals based on those durations.
  • the processing chip 11 may preconfigure the exposure duration of the image sensor 12 , or may acquire the exposure duration of the image sensor 12 from the image sensor 12 , and there is no limitation on the acquisition method of the exposure duration.
  • the processing chip 11 controls the infrared fill light 13 to fill in the target scene during the on time interval, and controls the image sensor 12 to collect the infrared image of the target scene during the on time interval.
  • the infrared image is generated by the image sensor 12 based on the infrared light in the ON time interval when the infrared fill light 13 fills the target scene.
  • the infrared fill light 13 is controlled not to fill light on the target scene during the off time interval, and the image sensor 12 is controlled to collect a color image of the target scene during the off time interval.
  • the color image is generated by the image sensor 12 based on the visible light in the off time interval when the infrared fill light 13 does not fill the target scene.
  • During the off-time interval, the processing chip 11 controls the infrared fill light 13 not to illuminate the target scene. Since the fill light is off, the natural light illuminating the image sensor 12 is visible light, and the image sensor 12 generates a color image from it; that is, the processing chip 11 controls the image sensor 12 to collect a color image of the target scene.
  • During the on-time interval, the processing chip 11 controls the infrared fill light 13 to illuminate the target scene.
  • the light reaching the sensor is then infrared light, and the image sensor 12 generates an infrared image from it; that is, the processing chip 11 controls the image sensor 12 to collect an infrared image of the target scene.
  • the image sensor 12 in each off time interval, can capture one frame of color image, and in each on time interval, the image sensor 12 can capture one frame of infrared image.
  • For example, the image sensor 12 collects the color image b1 during the off-time interval a1 and sends it to the processing chip 11; collects the infrared image b2 during the on-time interval a2 and sends it to the processing chip 11; collects the color image b3 during the off-time interval a3 and sends it to the processing chip 11; and so on.
  • the image sensor 12 first collects one frame of color image, then another frame of infrared image, then another frame of color image, and then another frame of infrared image, and so on, that is, the color image and the infrared image are collected alternately.
  • the processing chip 11 first controls the infrared fill light 13 not to illuminate the target scene (the image sensor 12 then collects a frame of color image; the fill light is kept off by turning off its switch), then controls the infrared fill light 13 to illuminate the target scene (the image sensor 12 then collects a frame of infrared image; the fill light is turned on by turning on its switch), then controls the infrared fill light 13 not to illuminate the target scene again, and so on.
  • each time the processing chip 11 controls the infrared fill light 13 to illuminate the target scene once, the image sensor 12 collects one frame of color image and one frame of infrared image; that is, every two frames of image correspond to one fill-light pulse. The fill-light frequency of the infrared fill light 13 is therefore half the image-capture frame rate of the image sensor 12.
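The frame-rate relation above is simple arithmetic; a minimal sketch (the function name is illustrative):

```python
def fill_light_frequency_hz(frame_rate_fps):
    """One fill-light pulse covers one infrared frame out of every two
    frames (one infrared + one color), so the pulse frequency is half
    the sensor's capture frame rate."""
    return frame_rate_fps / 2.0

print(fill_light_frequency_hz(50.0))  # 25.0
```

For example, a sensor running at 50 fps would see the infrared fill light pulse at 25 Hz.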
  • the processing chip 11 may perform image fusion (that is, dual-light fusion) on the infrared image and the color image.
  • the processing chip 11 acquires luminance information from the infrared image, acquires color information from the color image, and generates a fused image based on the luminance information and the color information.
  • the fused image is a frame of clear color image with both luminance information and color information, that is, a high-brightness, high-definition color image.
  • the processing chip 11 performs image fusion on the color image b1 and the infrared image b2 to obtain a frame of fusion image, and the fusion image is a high-brightness and high-definition color image.
  • the processing chip 11 performs image fusion on the color image b3 and the infrared image b4 to obtain a frame of fusion image, and so on.
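The patent does not specify the fusion algorithm. One common way to realize "luminance from the infrared image, color from the color image" is a YCbCr recombination; the per-pixel sketch below assumes BT.601 full-range coefficients, and `fuse_pixel` is a hypothetical name:

```python
# Hypothetical per-pixel dual-light fusion: chrominance (Cb, Cr) comes
# from the color frame, luminance (Y) from the infrared frame.

def fuse_pixel(ir_luma, rgb):
    r, g, b = rgb
    # Chrominance of the color pixel (forward BT.601, full range).
    cb = 128.0 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 128.0 + 0.5 * r - 0.418688 * g - 0.081312 * b
    # Replace luminance with the infrared sample, then convert back to RGB.
    y = float(ir_luma)
    r2 = y + 1.402 * (cr - 128.0)
    g2 = y - 0.344136 * (cb - 128.0) - 0.714136 * (cr - 128.0)
    b2 = y + 1.772 * (cb - 128.0)

    def clip(v):
        return max(0, min(255, round(v)))

    return (clip(r2), clip(g2), clip(b2))

# Brightening a dim red pixel with a brighter infrared luminance sample:
print(fuse_pixel(180, (60, 20, 20)))
```

Applying this to every pixel of an aligned infrared/color pair yields a frame whose brightness follows the infrared image while keeping the color image's hues.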
  • In summary, in a low-illumination environment, the target scene is illuminated by an infrared fill light whose on-time and off-time intervals alternate; that is, the infrared fill light is not always on. This shortens the fill-light time, lowers power consumption, saves energy, and reduces the light pollution caused by long-term illumination.
  • By fusing the infrared image with the color image, a high-brightness, high-definition color image is obtained, so that such images can be collected even in a low-light environment.
  • the camera may further include a lens 14, through which natural light reaches the image sensor 12.
  • the infrared light is irradiated to the image sensor 12 through the lens 14, and the image sensor 12 generates an infrared image based on the infrared light.
  • the visible light is irradiated to the image sensor 12 through the lens 14, and the image sensor 12 generates a color image based on the visible light.
  • As described above, the on-time interval includes a fill-light start moment and a fill-light end moment, and the off-time interval includes a no-fill-light start moment and a no-fill-light end moment.
  • the processing chip 11 controls the infrared fill light 13 to illuminate the target scene during the on-time interval, controls the image sensor 12 to collect an infrared image of the target scene during the on-time interval, controls the infrared fill light 13 not to illuminate the target scene during the off-time interval, and controls the image sensor 12 to collect a color image of the target scene during the off-time interval. This may be implemented in, but is not limited to, the following ways:
  • Mode 1: determine the control-signal start time based on the fill-light start moment and the first duration, determine the control-signal end time based on the fill-light end moment and the first duration, and, in the time interval from the control-signal start time to the control-signal end time, send a control signal to the infrared fill light 13, so that the infrared fill light 13 illuminates the target scene after receiving the control signal.
  • the synchronization-signal start time is determined based on the fill-light start moment and the second duration, and a synchronization signal is sent to the image sensor 12 at that time, so that the image sensor 12 collects an infrared image of the target scene within the exposure duration after receiving the synchronization signal.
  • the exposure duration is the exposure duration of the image sensor 12
  • the duration of the ON time interval is the same as the exposure duration
  • the duration of the OFF duration is the same as the exposure duration
  • the first duration is the signal transmission delay between the processing chip 11 and the infrared fill light 13.
  • the first duration may be an empirical value configured in the processing chip 11, or a value measured by the processing chip 11; there is no restriction on how the first duration is obtained.
  • For example, the processing chip 11 sends a request signal to the infrared fill light 13, and the infrared fill light 13 returns a response signal after receiving it.
  • the processing chip 11 then determines the signal transmission delay between itself and the infrared fill light 13 based on the sending time of the request signal and the receiving time of the response signal (for example, half of the difference between the two); this delay is the first duration.
  • the second duration is the signal transmission delay between the processing chip 11 and the image sensor 12 .
  • the second duration may be an empirical value configured by the processing chip 11 or a value measured by the processing chip 11 .
  • For example, the processing chip 11 sends a request signal to the image sensor 12, and the image sensor 12 returns a response signal after receiving it.
  • the processing chip 11 then determines the signal transmission delay between itself and the image sensor 12 based on the sending time of the request signal and the receiving time of the response signal (for example, half of the difference between the two); this delay is the second duration.
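The delay estimate described for both links (half of the request/response round trip) can be written as a one-line helper; the function name and units (milliseconds here) are illustrative:

```python
def one_way_delay(request_sent_at, response_received_at):
    """Estimate the one-way signal transmission delay as half of the
    request/response round trip, as described for both the fill-light
    link (first duration) and the image-sensor link (second duration)."""
    return (response_received_at - request_sent_at) / 2.0

# e.g. request sent at t = 100,000 ms, response received at t = 100,008 ms:
print(one_way_delay(100_000, 100_008))  # 4.0 (milliseconds)
```

This mirrors the usual round-trip/2 clock-offset estimate and assumes the link is symmetric in both directions.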
  • Determining the control-signal start time based on the fill-light start moment and the first duration may include: taking the difference between the fill-light start moment and the first duration as the control-signal start time. When the processing chip 11 starts sending the control signal at that time, the control signal reaches the infrared fill light 13 exactly at the fill-light start moment; that is, the infrared fill light 13 receives the control signal at the fill-light start moment and illuminates the target scene from that moment.
  • Determining the control-signal end time based on the fill-light end moment and the first duration may include: taking the difference between the fill-light end moment and the first duration as the control-signal end time, so that when the processing chip 11 stops sending the control signal at that time, the control signal's arrival at the infrared fill light 13 ends exactly at the fill-light end moment.
  • Determining the termination-signal start time based on the no-fill-light start moment and the first duration may include: taking the difference between the no-fill-light start moment and the first duration as the termination-signal start time. When the processing chip 11 starts sending the termination signal at that time, the termination signal reaches the infrared fill light 13 exactly at the no-fill-light start moment; that is, the infrared fill light 13 receives the termination signal at the no-fill-light start moment and stops illuminating the target scene from that moment.
  • Determining the termination-signal end time based on the no-fill-light end moment and the first duration may include: taking the difference between the no-fill-light end moment and the first duration as the termination-signal end time. When the processing chip 11 stops sending the termination signal at that time, the termination signal's arrival at the infrared fill light 13 ends exactly at the no-fill-light end moment; that is, the infrared fill light 13 receives the termination signal until the no-fill-light end moment and resumes illuminating the target scene from that moment.
  • the control signal may be a high-level signal and the termination signal a low-level signal, or vice versa. Taking the former case as an example: in the time interval from the control-signal start time to the control-signal end time, the processing chip 11 sends a high-level signal to the infrared fill light 13, and after receiving the high-level signal, the infrared fill light 13 illuminates the target scene.
  • in the time interval from the termination-signal start time to the termination-signal end time, the processing chip 11 sends a low-level signal to the infrared fill light 13, and after receiving the low-level signal, the infrared fill light 13 does not illuminate the target scene.
  • the processing chip 11 starts sending the high-level signal at the control-signal start time; the high-level signal reaches the infrared fill light 13 at the fill-light start moment, so the infrared fill light 13 illuminates the scene from the fill-light start moment and keeps illuminating for as long as it receives the high-level signal.
  • the processing chip 11 starts sending the low-level signal at the termination-signal start time (that is, the control-signal end time); the low-level signal reaches the infrared fill light 13 at the no-fill-light start moment (that is, the fill-light end moment), so the infrared fill light 13 stops illuminating from the no-fill-light start moment and stays off for as long as it receives the low-level signal.
  • Determining the synchronization-signal start time based on the fill-light start moment and the second duration may include: taking the difference between the fill-light start moment and the second duration as the synchronization-signal start time. When the processing chip 11 sends the synchronization signal at that time, it reaches the image sensor 12 exactly at the fill-light start moment; that is, the image sensor 12 receives the synchronization signal at the fill-light start moment and captures an image of the target scene within the exposure duration starting from that moment. Since the on-time interval of the infrared fill light 13 equals the exposure duration and the fill light is on throughout it, the image collected within this exposure is an infrared image.
  • Determining the synchronization-signal start time based on the no-fill-light start moment and the second duration may include: taking the difference between the no-fill-light start moment and the second duration as the synchronization-signal start time; the processing chip 11 sends the synchronization signal at that time, so that the image sensor 12 receives it at the no-fill-light start moment and collects a color image within the exposure duration starting from that moment.
  • the synchronization signal may be a pulse signal; that is, at each synchronization-signal start time (determined from the fill-light start moment and the second duration, or from the no-fill-light start moment and the second duration), the processing chip 11 sends one pulse to the image sensor 12.
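The Mode 1 timing rules above all have the same shape: each signal is sent early by the delay of its link so it arrives exactly at the interval boundary. A minimal sketch (names and units are illustrative, not from the patent; times in microseconds):

```python
def mode1_signal_times(fill_start, fill_end, delay_light, delay_sensor):
    """Mode 1 timing: subtract each link's transmission delay (first
    duration for the fill light, second duration for the sensor) so the
    signal arrives exactly at the corresponding boundary moment."""
    return {
        "control_signal_start": fill_start - delay_light,  # arrives at fill-light start
        "control_signal_end": fill_end - delay_light,      # arrival ends at fill-light end
        "sync_signal_start": fill_start - delay_sensor,    # arrives at exposure start
    }

times = mode1_signal_times(fill_start=1000, fill_end=1020,
                           delay_light=3, delay_sensor=1)
print(times)
# {'control_signal_start': 997, 'control_signal_end': 1017, 'sync_signal_start': 999}
```

The termination-signal times follow the same pattern with the no-fill-light start and end moments in place of the fill-light moments.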
  • Mode 2: in the time interval from the fill-light start moment to the fill-light end moment, a control signal is sent to the infrared fill light 13, so that the infrared fill light 13 illuminates the target scene after receiving the control signal. In addition, a synchronization signal is sent to the image sensor 12 at the fill-light start moment, so that after receiving the synchronization signal and delaying for the third duration, the image sensor 12 collects an infrared image of the target scene within the exposure duration.
  • a termination signal is sent to the infrared fill light 13, so that the infrared fill light 13 does not perform fill light on the target scene after receiving the termination signal .
  • a synchronization signal is sent to the image sensor 12 at the no-fill-light start time, so that after receiving the synchronization signal and delaying a third duration, the image sensor 12 captures a color image of the target scene within the exposure duration.
  • the exposure duration is the exposure duration of the image sensor 12
  • the duration of the on-time interval is the same as the exposure duration
  • the duration of the off-time interval is the same as the exposure duration.
  • the first duration is the signal transmission delay between the processing chip 11 and the infrared fill light 13 .
  • the second duration is the signal transmission delay between the processing chip 11 and the image sensor 12 .
  • the third duration is the difference between the first duration and the second duration.
  • the first duration may be greater than the second duration, and the third duration may be a value greater than 0.
  • Since the signal transmission delay between the processing chip 11 and the infrared fill light 13 is the first duration, when the processing chip 11 sends the control signal to the infrared fill light 13 during the interval from the fill-light start time to the fill-light end time, the infrared fill light 13 receives the control signal during the interval from (fill-light start time + first duration) to (fill-light end time + first duration), and fills the target scene with light during that interval.
  • the control signal can be a high-level signal
  • the termination signal can be a low-level signal
  • the processing chip 11 sends a high-level signal to the infrared fill light 13 in the time interval between the start of the fill light and the end of the fill light.
  • the infrared fill light 13 fills the target scene with light.
  • the processing chip 11 sends a low-level signal to the infrared fill light 13; after receiving the low-level signal, the infrared fill light 13 does not fill the target scene with light.
  • the processing chip 11 starts sending a high-level signal at the fill-light start time; the high-level signal reaches the infrared fill light 13 at (fill-light start time + first duration), at which point the infrared fill light 13 begins fill light.
  • the processing chip 11 starts sending a low-level signal at the no-fill-light start time (i.e., the fill-light end time); the low-level signal reaches the infrared fill light 13 at (no-fill-light start time + first duration), at which point the infrared fill light 13 stops fill light.
  • Since the signal transmission delay between the processing chip 11 and the image sensor 12 is the second duration, when the processing chip 11 sends a synchronization signal to the image sensor 12 at the fill-light start time, the image sensor 12 receives the synchronization signal at (fill-light start time + second duration).
  • the third duration is the difference between the first duration and the second duration, that is, the sum of the third duration and the second duration equals the first duration.
  • The moment at which the image sensor 12 has received the synchronization signal and delayed the third duration is therefore (fill-light start time + first duration), so the image sensor 12 collects the image of the target scene within the exposure duration starting from (fill-light start time + first duration).
  • Since the on-time interval of the infrared fill light 13 has the same duration as the exposure, and the infrared fill light 13 starts fill light at (fill-light start time + first duration), the image sensor 12 collects an infrared image within the exposure duration.
  • When the processing chip 11 sends a synchronization signal to the image sensor 12 at the no-fill-light start time, the image sensor 12 receives the synchronization signal at (no-fill-light start time + second duration).
  • The moment at which the image sensor 12 has received the synchronization signal and delayed the third duration is (no-fill-light start time + first duration), so it collects the image of the target scene within the exposure duration starting from that moment. Since the off-time interval of the infrared fill light 13 has the same duration as the exposure, the infrared fill light 13 performs no fill light from (no-fill-light start time + first duration) onward; therefore, the image sensor 12 collects a color image within the exposure duration.
  • the synchronization signal may be a pulse signal, that is to say, the processing chip 11 sends a pulse signal to the image sensor 12 at each start time of supplemental light and each start time of no supplementary light. Each time the image sensor 12 receives the pulse signal and delays for a third time period, an image can be collected within the exposure time period.
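The delay compensation in Method 2 is pure arithmetic, and can be sketched as follows (illustrative only; the variable names `t1`, `t2`, `t3` stand for the first, second, and third durations in the text, and the numbers are made up):

```python
def exposure_start(fill_start, t1, t2):
    """Method 2 sketch: the processing chip sends the sync signal at
    fill_start; the sensor receives it t2 later, waits t3 = t1 - t2,
    and so begins exposing exactly when the fill light (whose control
    signal is delayed by t1) actually turns on. Times share one unit.
    """
    assert t1 >= t2, "third duration would be negative"
    t3 = t1 - t2                  # third duration
    sensor_rx = fill_start + t2   # sync signal arrives at the sensor
    light_on = fill_start + t1    # fill light actually starts emitting
    start = sensor_rx + t3        # sensor delays t3, then exposes
    assert start == light_on      # exposure aligned with illumination
    return start

# e.g. chip->light delay 8, chip->sensor delay 3: exposure starts at 1008
assert exposure_start(1000, 8, 3) == 1008
```

The alignment holds because t2 + t3 = t1, matching the statement above that the sum of the third duration and the second duration is the first duration.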
  • Methods 1 and 2 above are only examples and impose no restriction: any scheme suffices as long as the infrared fill light 13 can be controlled to fill the target scene with light during the on-time interval, the image sensor 12 can be controlled to collect an infrared image of the target scene during the on-time interval, the infrared fill light 13 can be controlled not to fill the target scene during the off-time interval, and the image sensor 12 can be controlled to collect a color image of the target scene during the off-time interval.
  • the camera may further include an infrared fill light driver 15, which is used to turn the infrared fill light 13 on or off: the infrared fill light driver 15 can turn on the infrared fill light 13 so that it fills the target scene with light, or turn it off so that it does not fill the target scene with light.
  • when the processing chip 11 sends a control signal (i.e., a high-level signal) to the infrared fill light 13, it can send the control signal to the infrared fill light driver 15; after receiving the control signal, the infrared fill light driver 15 controls the infrared fill light 13 to fill the target scene with light.
  • when the processing chip 11 sends a termination signal (i.e., a low-level signal) to the infrared fill light 13, it may send the termination signal to the infrared fill light driver 15; after receiving the termination signal, the infrared fill light driver 15 controls the infrared fill light 13 not to fill the target scene with light.
  • the camera may further include a single-chip microcomputer 16 .
  • when the processing chip 11 sends a control signal to the infrared fill light 13, it first sends the control signal to the single-chip microcomputer 16; after receiving the control signal, the single-chip microcomputer 16 sends it on to the infrared fill light driver 15.
  • the infrared fill light driver 15 can control the infrared fill light 13 to fill the target scene with light.
  • When the processing chip 11 sends a termination signal to the infrared fill light 13, it first sends the termination signal to the single-chip microcomputer 16, which forwards it to the infrared fill light driver 15 after receiving it; after receiving the termination signal, the infrared fill light driver 15 controls the infrared fill light 13 not to fill the target scene with light.
  • the single-chip microcomputer 16 may include a GPIO (General Purpose Input/Output) interface connected to the processing chip 11, and a PWM (Pulse Width Modulation) interface connected to the infrared fill light driver 15.
  • the processing chip 11 outputs the control signal or the termination signal to the GPIO interface of the microcontroller, and the microcontroller outputs the control signal or the termination signal to the infrared fill light driver 15 through the PWM interface.
  • when the infrared fill light driver 15 receives the control signal, a drive current is generated between "LED+" and "LED-"; since "LED+" is connected to the positive pole of the infrared fill light 13 and "LED-" to its negative pole, a driving current flows between the positive and negative poles of the infrared fill light 13, the infrared fill light 13 is turned on, and it fills the target scene with light.
  • when the infrared fill light driver 15 receives the termination signal, no drive current is generated between "LED+" and "LED-", that is, no drive current flows between the positive and negative poles of the infrared fill light 13; the infrared fill light 13 is turned off and does not fill the target scene with light.
  • the infrared fill light driver 15 may include resistor R1 (0.25 Ω, 1% tolerance), resistor R2 (0.25 Ω, 1% tolerance), resistor R3 (1 kΩ, 1% tolerance), resistor R5 (10 kΩ), resistor R6 (10 kΩ), capacitor C1 (10 µF), capacitor C2 (100 nF), capacitor C3 (100 nF), capacitor C4 (10 µF), inductor L1 (10 µH) and chip U1.
  • "PWM" denotes the control signal or the termination signal
  • "GND" denotes ground
  • "LED+" is connected to the positive pole of the infrared fill light 13
  • "LED-" is connected to the negative pole of the infrared fill light 13
  • "12V" denotes the input voltage of the infrared fill light driver 15.
  • the chip U1 may include 6 pins; pin 1 is the FB pin, an output-voltage feedback pin.
  • Pin 2 is a DIM pin, which is an input pin for receiving an externally input control signal or a termination signal.
  • Pin 3 is the GND pin, which is the ground pin.
  • Pin 4 is an IN pin and is a power input pin.
  • Pin 5 is an LX pin, which is an output pin, used to control the output of the chip U1, and generate a drive current between "LED+" and "LED-”.
  • Pin 6 is the BS pin, a bootstrap pin for self-boosting.
  • FIG. 7 is only an example of the infrared fill light driver 15 , and the implementation of the infrared fill light driver 15 is not limited, as long as the infrared fill light 13 can be controlled to be turned on and off.
  • supplementary-light synchronization control can be performed across the multiple cameras, so that the on-time intervals determined by the processing chips of all the cameras are the same, and the off-time intervals determined by the processing chips of all the cameras are the same.
  • the on-time interval determined by the processing chip of camera 1 is the same as the on-time interval determined by the processing chip of camera 2
  • the off-time interval determined by the processing chip of camera 1 is the same as the off-time interval determined by the processing chip of camera 2.
  • the on time interval determined by the processing chip of camera 1 is the same as the on time interval determined by the processing chip of camera 3
  • the off time interval determined by the processing chip of camera 1 is the same as the off time interval determined by the processing chip of camera 3, and so on.
  • the principle of light synchronization between multiple cameras can be seen in Figure 8.
  • an additional power adapter can be configured, which sends a power synchronization signal (such as a square-wave signal of a specific frequency) to the multiple cameras at the same time; the power synchronization signal serves as the start signal of each camera. For each camera, only after its processing chip receives the power synchronization signal does it start to determine the on-time interval and off-time interval corresponding to the infrared fill light.
  • Because all cameras start determining the on-time and off-time intervals from the same synchronized moment, the on-time interval determined by the processing chip of each camera is the same, and the off-time interval determined by the processing chip of each camera is the same.
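A toy model of this synchronization (the function name and numbers are ours, not the patent's): if each camera counts its alternating intervals from the shared power-sync pulse instead of its own free-running clock, the schedules computed by all cameras coincide:

```python
def intervals_from_sync(sync_time, interval_len, n):
    """Derive n alternating on/off intervals, counted from the shared
    power synchronization signal received at sync_time."""
    out, t = [], sync_time
    for i in range(n):
        label = "on" if i % 2 == 0 else "off"  # on and off intervals alternate
        out.append((label, t, t + interval_len))
        t += interval_len
    return out

# two cameras receiving the same sync pulse compute identical schedules
cam1 = intervals_from_sync(1000, 20, 4)
cam2 = intervals_from_sync(1000, 20, 4)
assert cam1 == cam2
assert cam1[:2] == [("on", 1000, 1020), ("off", 1020, 1040)]
```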
  • an image processing method is also proposed in this embodiment of the present application, and the method can be applied to a camera.
  • the camera at least includes a processing chip, an image sensor and an infrared fill light, as shown in FIG. 9 ,
  • the method may include:
  • Step 911: the processing chip determines the on-time interval and the off-time interval corresponding to the infrared fill light.
  • the on time interval and the off time interval may alternate.
  • Step 912: the processing chip controls the infrared fill light to fill the target scene with light during the on-time interval, controls the image sensor to collect an infrared image of the target scene during the on-time interval, and controls the image sensor to collect a color image of the target scene during the off-time interval.
  • the infrared image is generated by the image sensor based on infrared light while the infrared fill light fills the target scene; the color image is generated by the image sensor based on visible light while the infrared fill light does not fill the target scene.
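The alternation in step 912 can be pictured as a simple stream of frame types, which step 913 then consumes in infrared/color pairs (purely illustrative; the real control is hardware signaling on the processing chip):

```python
from itertools import islice

def frame_stream():
    """Frame type captured in each successive interval: infrared while the
    fill light is on, color while it is off; the intervals alternate."""
    while True:
        yield "infrared"
        yield "color"

def pair_for_fusion(frames):
    """Group the alternating stream into (infrared, color) pairs for fusion."""
    return list(zip(frames[0::2], frames[1::2]))

frames = list(islice(frame_stream(), 4))
assert pair_for_fusion(frames) == [("infrared", "color")] * 2
```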
  • the turn-on time interval includes the start time of the fill light and the end time of the fill light
  • the processing chip controlling the infrared fill light to fill the target scene with light during the on-time interval may include: determining the start time of the control signal based on the fill-light start time and the first duration, and determining the end time of the control signal based on the fill-light end time and the first duration, where the first duration is the signal transmission delay between the processing chip and the infrared fill light;
  • a control signal is sent to the infrared fill light between those two moments, so that the infrared fill light fills the target scene with light after receiving the control signal.
  • the processing chip controlling the image sensor to collect an infrared image of the target scene during the on-time interval may include: determining the start time of the synchronization signal based on the fill-light start time and the second duration, where the second duration is the signal transmission delay between the processing chip and the image sensor; and sending the synchronization signal to the image sensor at that start time, so that the image sensor collects an infrared image of the target scene within the exposure duration after receiving the synchronization signal; the duration of the on-time interval is the same as the exposure duration.
  • the processing chip controlling the infrared fill light to fill the target scene with light during the on-time interval and controlling the image sensor to collect an infrared image of the target scene during the on-time interval may include: sending a control signal to the infrared fill light in the interval from the fill-light start time to the fill-light end time, so that the infrared fill light fills the target scene with light after receiving the control signal; and sending a synchronization signal to the image sensor at the fill-light start time, so that after receiving the synchronization signal and delaying a third duration, the image sensor collects an infrared image of the target scene within the exposure duration. The third duration is the difference between the first duration and the second duration, where the first duration is the signal transmission delay between the processing chip and the infrared fill light and the second duration is the signal transmission delay between the processing chip and the image sensor; the duration of the on-time interval is the same as the exposure duration.
  • the off-time interval includes the start time of not filling the light and the end moment of not filling the light
  • the processing chip controlling the image sensor to collect a color image of the target scene during the off-time interval may include: determining the start time of the synchronization signal based on the no-fill-light start time and the second duration, where the second duration is the signal transmission delay between the processing chip and the image sensor; and sending the synchronization signal to the image sensor at that start time, so that the image sensor collects a color image of the target scene within the exposure duration after receiving the synchronization signal.
  • alternatively, a synchronization signal is sent to the image sensor at the no-fill-light start time, so that after receiving the synchronization signal and delaying a third duration, the image sensor collects a color image of the target scene within the exposure duration; the third duration is the difference between the first duration and the second duration, where the first duration is the signal transmission delay between the processing chip and the infrared fill light and the second duration is the signal transmission delay between the processing chip and the image sensor;
  • the duration of the closing time interval can be the same as the exposure duration.
  • Step 913: the processing chip performs image fusion on the infrared image and the color image.
  • the processing chip obtains brightness information from an infrared image and color information from a color image, and generates a fusion image based on the brightness information and the color information.
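As a minimal sketch of such a fusion (the YCbCr-style split below is our assumption; the patent does not specify a color space or fusion algorithm), luminance can be taken from the infrared frame and chrominance from the color frame:

```python
import numpy as np

def fuse(infrared_gray, color_rgb):
    """Illustrative fusion: luminance from the infrared image,
    chrominance from the color image (BT.601-style coefficients).
    infrared_gray: HxW float array in [0, 1]
    color_rgb:     HxWx3 float array in [0, 1]
    """
    r, g, b = color_rgb[..., 0], color_rgb[..., 1], color_rgb[..., 2]
    # chrominance of the color frame
    cb = -0.169 * r - 0.331 * g + 0.500 * b
    cr = 0.500 * r - 0.419 * g - 0.081 * b
    y = infrared_gray                      # luminance from the IR frame
    # convert the mixed (y, cb, cr) back to RGB
    out = np.stack([y + 1.402 * cr,
                    y - 0.344 * cb - 0.714 * cr,
                    y + 1.772 * cb], axis=-1)
    return np.clip(out, 0.0, 1.0)

ir = np.full((2, 2), 0.8)                 # bright IR frame
col = np.zeros((2, 2, 3)); col[..., 0] = 0.5   # dim reddish color frame
fused = fuse(ir, col)
assert fused.shape == (2, 2, 3)
```

On a gray color frame (equal R, G, B) the chrominance terms vanish, so the fused result is simply the infrared luminance replicated across channels, which shows that brightness really does come from the IR frame.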
  • the above execution sequence is only an example given for the convenience of description. In practical applications, the execution sequence of the steps can also be changed, and the execution sequence is not limited. Moreover, in other embodiments, the steps of the corresponding methods are not necessarily performed in the order shown and described in this specification, and the methods may include more or less steps than those described in this specification. In addition, a single step described in this specification may be decomposed into multiple steps for description in other embodiments; multiple steps described in this specification may also be combined into a single step for description in other embodiments.
  • the target scene is illuminated with an infrared fill light whose on-time intervals and off-time intervals alternate, that is, the infrared fill light is not always on; this reduces fill-light time, reduces power consumption, saves energy, and reduces the light pollution caused by long-term fill light.
  • by performing image fusion on the infrared image and the color image, a high-brightness, high-definition color image is obtained, so that such images can be collected even in a low-light environment.
  • an embodiment of the present application proposes a camera, including:
  • the processing chip is used for:
  • controlling the image sensor to collect an infrared image of the scene during the on time interval, and controlling the image sensor to collect a color image of the scene during the off time interval;
  • the infrared image is generated by the image sensor based on infrared light when the infrared fill light fills the scene with light; the color image is when the infrared fill light does not fill the scene.
  • the supplementary light is used, it is generated by the image sensor based on visible light;
  • Image fusion is performed on the infrared image and the color image.
  • the processing chip is configured to, from the time obtained by subtracting the first time length from the start time of the supplementary light to the time obtained by subtracting the first time length from the end time of the supplementary light, to the The infrared fill light sends a signal, wherein the first duration is a preset value and indicates the delay required for signal transmission between the processing chip and the infrared fill light;
  • the infrared fill light is used to fill light on the scene after receiving the signal sent by the processing chip.
  • the processing chip is configured to send a signal to the image sensor at a moment obtained by subtracting a second duration from the start moment of the supplementary light, wherein the second duration is a preset value, and Indicate the delay required for signal transmission between the processing chip and the image sensor;
  • the image sensor is configured to collect an infrared image of the scene within a first preset time period after receiving the signal sent by the processing chip, where the first preset time period is the time period of the on-time interval.
  • processing chip is configured to send a signal to the infrared fill light from the start time of the fill light to the end time of the fill light;
  • the infrared fill light is used to fill the scene after receiving the signal sent by the processing chip;
  • the processing chip is used to send a signal to the image sensor at a moment a third duration before the fill-light start time, where the third duration indicates the difference between the delay required for signal transmission between the processing chip and the infrared fill light and the delay required for signal transmission between the processing chip and the image sensor;
  • the image sensor is configured to collect an infrared image of the scene within a first preset period of time after receiving the signal sent by the processing chip, where the first preset period of time is the period of the on-time interval.
  • the off time interval includes a start time of not filling light and an end time of not filling light
  • the processing chip is used to send a signal to the image sensor at a moment a second duration before the no-fill-light start time;
  • the image sensor is configured to collect the color image of the scene within a second preset time period after receiving the signal sent by the processing chip, and the second preset time period is the time length of the closing time interval;
  • the processing chip is configured to send a signal to the image sensor at a time of a third time before the start time of the unfilled light;
  • the image sensor is configured to collect a color image of the scene within a second preset time period after receiving the signal sent by the processing chip;
  • the second duration and the third duration are preset values, the second duration indicates the delay required for signal transmission between the processing chip and the image sensor, and the third duration indicates : the delay difference between the delay time required for signal transmission between the processing chip and the infrared fill light, and the delay time required for signal transmission between the processing chip and the image sensor.
  • processing chip is used for:
  • a fused image is generated based on the luminance information and the color information.
  • the on-time interval determined by the processing chip of each camera among the multiple cameras is the same, and the off-time interval determined by the processing chip of each camera among the multiple cameras is the same.
  • the functions of the camera shown in the embodiment of the present application are similar to the functions of the camera shown in the foregoing embodiment of FIG. 1 , and details are not described herein again.
  • an image processing device is proposed in the embodiment of the present application.
  • the camera at least includes a processing chip, an image sensor and an infrared fill light.
  • the device is applied to the processing chip, and the device includes:
  • a determination module used for determining the corresponding on time interval and off time interval of the infrared fill light; wherein, the on time interval and the off time interval appear alternately;
  • a control module used to control the infrared fill light to fill the target scene in the on time interval; and, control the image sensor to collect the infrared image of the target scene in the on time interval, and control all
  • the image sensor collects a color image of the target scene in the off time interval; wherein, the infrared image is generated by the image sensor based on infrared light when the infrared fill light fills the target scene ; the color image is generated by the image sensor based on visible light when the infrared fill light does not fill the target scene;
  • a processing module configured to perform image fusion on the infrared image and the color image.
  • the target scene is illuminated with an infrared fill light whose on-time intervals and off-time intervals alternate, that is, the infrared fill light is not always on; this reduces fill-light time, reduces power consumption, saves energy, and reduces the light pollution caused by long-term fill light.
  • by performing image fusion on infrared images and color images, high-brightness, high-definition color images are obtained, so that such images can be collected in low-light environments.
  • the turn-on time interval includes the start time of the fill light and the end time of the fill light
  • the control module is specifically used for:
  • the start time of the control signal is determined based on the fill-light start time and the first duration
  • the end time of the control signal is determined based on the fill-light end time and the first duration
  • the first duration is the signal transmission delay between the processing chip and the infrared fill light
  • the turn-on time interval includes the start time of the fill light and the end time of the fill light
  • the control module is specifically used for:
  • the start time of the synchronization signal is determined based on the start time of the fill light and the second duration, and the second duration is the signal transmission delay between the processing chip and the image sensor; at the start time of the synchronization signal sending a synchronization signal to the image sensor, so that after receiving the synchronization signal, the image sensor collects the infrared image of the target scene within the exposure time;
  • the duration of the opening time interval is the same as the exposure duration.
  • the turn-on time interval includes the start time of the fill light and the end time of the fill light
  • the control module is specifically used for:
  • a control signal is sent to the infrared fill light, so that the infrared fill light fills the target scene with light after receiving the control signal.
  • the third duration is the difference between the first duration and the second duration
  • the first duration is the signal transmission between the processing chip and the infrared fill light Delay
  • the second duration is the signal transmission delay between the processing chip and the image sensor
  • the camera further includes an infrared fill light driver, and the control module is specifically used for:
  • the control signal is sent to the infrared fill light driver, so that the infrared fill light driver controls the infrared fill light to fill the target scene after receiving the control signal.
  • the shut-off time interval includes a start time of no light supplementation and an end time of no light supplementation
  • the control module is specifically used for:
  • the start time of the synchronization signal is determined based on the start time of the unfilled light and a second duration, where the second duration is the signal transmission delay between the processing chip and the image sensor; at the start of the synchronization signal Sending a synchronization signal to the image sensor at all times, so that after receiving the synchronization signal, the image sensor acquires a color image of the target scene within the exposure time period;
  • the third duration is the difference between the first duration and the second duration
  • the first duration is the signal transmission delay between the processing chip and the infrared fill light
  • the The second duration is the signal transmission delay between the processing chip and the image sensor
  • the duration of the closing time interval is the same as the exposure duration.
  • processing module is specifically used for:
  • the processing chip obtains luminance information from the infrared image, obtains color information from the color image, and generates a fusion image based on the luminance information and the color information.
  • the on-time interval determined by the processing chip of each camera among the multiple cameras is the same, and the off-time interval determined by the processing chip of each camera among the multiple cameras is the same.
  • a computer-readable storage medium is also provided, in which a computer program is stored; when the computer program is executed by a processor, the steps of any one of the above image processing methods are implemented.
  • the target scene is illuminated with an infrared fill light, and the on-time intervals and off-time intervals of the infrared fill light alternate, that is, the infrared fill light is not always on, which reduces fill-light time, reduces power consumption, saves energy, and reduces light pollution caused by long-term fill light.
  • an infrared fill light is used to fill the target scene with light, and the on-time intervals and off-time intervals of the infrared fill light alternate, that is, the infrared fill light is not always on, which reduces fill-light time, reduces power consumption, saves energy, and reduces light pollution caused by long-term fill light.
  • by performing image fusion on the infrared image and the color image, a high-brightness, high-definition color image is obtained, so that such images can be collected in a low-light environment.
  • a typical implementing device is a computer, which may take the form of a personal computer, laptop computer, cellular phone, camera phone, smartphone, personal digital assistant, media player, navigation device, email device, game console, tablet, wearable device, or a combination of any of these devices.
  • the embodiments of the present application may be provided as a method, a system, or a computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Furthermore, embodiments of the present application may take the form of a computer program product implemented on one or more computer-usable storage media (including but not limited to disk storage, CD-ROM, optical storage, etc.) having computer-usable program code embodied therein.
  • computer-usable storage media including but not limited to disk storage, CD-ROM, optical storage, etc.
  • these computer program instructions may also be stored in a computer-readable memory capable of directing a computer or other programmable data processing apparatus to operate in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means that implement the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.

Abstract

The present application provides an image processing method and apparatus, and a camera. The camera includes a processing chip, an image sensor, and an infrared fill light. The processing chip is configured to: determine the on intervals and off intervals of the infrared fill light, the on intervals and the off intervals appearing alternately; control the infrared fill light to illuminate the target scene during the on intervals; control the image sensor to capture an infrared image of the target scene during the on intervals and a color image of the target scene during the off intervals, where the infrared image is generated by the image sensor from infrared light while the infrared fill light illuminates the target scene, and the color image is generated by the image sensor from visible light while it does not; and fuse the infrared image and the color image. The technical solution of the present application shortens the fill-light time, lowers power consumption, and reduces the light pollution caused by long-term fill light.

Description

一种图像处理方法、装置及摄像机
本申请要求于2021年3月10日提交中国专利局、申请号为202110260871.7、发明名称为“一种图像处理方法、装置及摄像机”的中国专利申请的优先权，其全部内容通过引用结合在本申请中。
技术领域
本申请涉及图像处理技术领域,尤其是一种图像处理方法、装置及摄像机。
背景技术
为了在低照度环境下采集高亮度和高清晰度的彩色图像,为摄像机配置白光补光灯,在低照度环境下,打开白光补光灯,通过白光补光灯对目标场景进行补光,从而提高环境的亮度,将低照度环境变化为高照度环境,这样,在通过摄像机采集目标场景的图像时,就可以采集到高亮度和高清晰度的彩色图像。
但是,在低照度环境下,需要始终打开白光补光灯,即白光补光灯需要常亮,从而消耗大量功耗,不利于能源的节约,且白光补光灯容易造成光污染。
发明内容
第一方面,本申请提供了一种摄像机,包括处理芯片、图像传感器和红外补光灯;
所述处理芯片,用于确定所述红外补光灯对应的开启时间区间和关闭时间区间;其中,所述开启时间区间与所述关闭时间区间交替出现;
控制所述红外补光灯在所述开启时间区间对目标场景进行补光;
控制所述图像传感器在所述开启时间区间采集所述目标场景的红外图像,并控制所述图像传感器在所述关闭时间区间采集所述目标场景的彩色图像;其中,所述红外图像是所述红外补光灯对所述目标场景进行补光时,由所述图像传感器基于红外光生成的;所述彩色图像是所述红外补光灯未对所述目标场景进行补光时,由所述图像传感器基于可见光生成的;
对所述红外图像和所述彩色图像进行图像融合。
第二方面,本申请提供了一种图像处理方法,应用于摄像机,所述摄像机至少包括处理芯片、图像传感器和红外补光灯,所述方法包括:
所述处理芯片确定所述红外补光灯对应的开启时间区间和关闭时间区间;其中,所述开启时间区间与所述关闭时间区间交替出现;
所述处理芯片控制所述红外补光灯在所述开启时间区间对目标场景进行补光,并控制所述图像传感器在所述开启时间区间采集所述目标场景的红外图像,并控制所述图像传感器在所述关闭时间区间采集所述目标场景的彩色图像;其中,所述红外图像是所述红外补光灯对所述目标场景进行补光时,由所述图像传感器基于红外光生成的;所述彩色图像是所述红外补光灯未对所述目标场景进行补光时,由所述图像传感器基于可见光生成的;
所述处理芯片对所述红外图像和所述彩色图像进行图像融合。
第三方面,本申请提供了一种摄像机,包括:处理芯片、图像传感器和红外补光灯;
所述处理芯片,用于:
确定所述红外补光灯的开启时间区间和关闭时间区间;其中,所述开启时间区间与所述关闭时间区间交替设置,其中所述开启时间区间包括补光起始时刻和补光结束时刻;
控制所述红外补光灯在所述开启时间区间对场景进行补光;
控制所述图像传感器在所述开启时间区间采集所述场景的红外图像,并控制所述图像传感器在所述关闭时间区间采集所述场景的彩色图像;
其中,所述红外图像是所述红外补光灯对所述场景进行补光时,由所述图像传感器基于红外光生成的;所述彩色图像是所述红外补光灯未对所述场景进行补光时,由所述图像传感器基于可见光生成的;
对所述红外图像和所述彩色图像进行图像融合。
由以上技术方案可见,本申请实施例提供的方案中,使用红外补光灯对目标场景进行补光,且红外补光灯的开启时间区间与关闭时间区间交替出现,即红外补光灯不是常亮,从而使补光的时间减少,功耗降低,节约能源,减少长时间补光造成的光污染。通过对红外图像和彩色图像进行图像融合,得到高亮度和高清晰度的彩色图像,从而在低照度环境下采集到高亮度和高清晰度的彩色图像。
附图说明
为了更清楚地说明本发明实施例和现有技术的技术方案，下面对实施例和现有技术中所需要使用的附图作简单地介绍。显而易见地，下面描述中的附图仅仅是本发明的一些实施例，对于本领域普通技术人员来讲，还可以根据这些附图获得其他的附图。
图1是本申请一种实施方式中的摄像机的硬件结构图;
图2A和图2B是开启时间区间和关闭时间区间的示意图;
图3是本申请一种实施方式中的摄像机的硬件结构图;
图4A和图4B是本申请一种实施方式中的信号传输的示意图;
图5和图6是本申请一种实施方式中的摄像机的硬件结构图;
图7是本申请一种实施方式中的红外补光灯驱动的电路示意图;
图8是本申请一种实施方式中的多摄像机的补光同步示意图;
图9是本申请一种实施方式中的图像处理方法的流程示意图;
具体实施方式
为使本发明的目的、技术方案、及优点更加清楚明白,以下参照附图并举实施例,对本发明进一步详细说明。显然,所描述的实施例仅仅是本发明一部分实施例,而不是全部的实施例。本领域普通技术人员基于本发明中的实施例所获得的所有其他实施例,都属于本发明保护的范围。
在本申请实施例使用的术语仅仅是出于描述特定实施例的目的,而非限制本申请。本申请和权利要求书中所使用的单数形式的“一种”、“所述”和“该”也旨在包括多数形式,除非上下文清楚地表示其它含义。还应当理解,本文中使用的术语“和/或”是指包含一个或多个相关联的列出项目的任何或所有可能组合。
应当理解，尽管在本申请实施例可能采用术语第一、第二、第三等来描述各种信息，但这些信息不应限于这些术语。这些术语仅用来将同一类型的信息彼此区分开。例如，在不脱离本申请范围的情况下，第一信息也可以被称为第二信息，类似地，第二信息也可以被称为第一信息。此外，取决于语境，所使用的词语“如果”可以被解释成为“在……时”或“当……时”或“响应于确定”。
本申请实施例中提出一种摄像机，该摄像机可以包括红外补光灯，在低照度环境（即环境亮度小于预设亮度阈值）下，比如夜间、阴天、雾天、雨雪天气等，可以使用红外补光灯对目标场景（即摄像机的视野范围，摄像机用于采集目标场景的图像）进行补光，并通过摄像机采集高亮度和高清晰度的彩色图像。在高照度环境（即环境亮度不小于预设亮度阈值）下，比如白天、晴天等，不需要使用红外补光灯对目标场景进行补光，直接通过摄像机采集高亮度和高清晰度的彩色图像。本申请实施例中，以针对低照度环境下，需要使用红外补光灯对目标场景进行补光为例，关于高照度环境的处理过程不再赘述。
参见图1所示,为摄像机的结构示意图,该摄像机可以包括处理芯片11、图像传感器12和红外补光灯13。处理芯片11可以由SOC(System on Chip,系统级芯片)实现,也可以由其它类型的芯片实现,对此不做限制。图像传感器12可以为CMOS(Complementary Metal Oxide Semiconductor,互补金属氧化物半导体)图像传感器,也可以为其它类型的图像传感器,对此不做限制。红外补光灯13是具有红外补光功能的补光灯,可以对目标场景进行补光。
在低照度环境下,自然光可以照射到图像传感器12,使得图像传感器12基于自然光生成图像。本申请实施例中,若使用红外补光灯13对目标场景进行补光,则照射到图像传感器12的自然光可以为红外光,图像传感器12能够基于红外光生成图像,将基于红外光生成的图像称为红外图像,即黑白图像或者灰度图像。若未使用红外补光灯13对目标场景进行补光,则照射到图像传感器12的自然光可以为可见光,图像传感器12能够基于可见光生成图像,将基于可见光生成的图像称为彩色图像,即RGB(Red Green Blue,红绿蓝)图像。
由此可见,基于红外光生成的红外图像中不包含色彩信息,仅包含亮度信息,而基于可见光生成的彩色图像中包含色彩信息。
本申请实施例中,在低照度环境下,可以使用红外补光灯13对目标场景进行补光,且红外补光灯13的开启时间区间与关闭时间区间交替出现,即红外补光灯13不是常亮,从而使补光的时间减少,减少长时间补光造成的光污染。
以下结合具体实施例,对本申请实施例的技术方案进行说明。
本申请实施例中,处理芯片11确定红外补光灯13对应的开启时间区间和关闭时间区间,该开启时间区间与该关闭时间区间交替出现。然后,处理芯片11控制红外补光灯13在该开启时间区间对目标场景进行补光,控制图像传感器12在该开启时间区间采集目标场景的红外图像,并控制图像传感器12在该关闭时间区间采集目标场景的彩色图像,该红外图像是红外补光灯13对目标场景进行补光时,由图像传感器12基于红外光生成的,该彩色图像是红外补光灯13未对目标场景进行补光时,由图像传感器12基于可见光生成的。在得到红外图像和彩色图像后,处理芯片11对该红外图像和该彩色图像进行图像融合。
具体的，开启时间区间与关闭时间区间交替出现，在开启时间区间与关闭时间区间均较短的情况下，相邻的开启时间区间与关闭时间区间中图像传感器12采集的红外图像与彩色图像的图像内容相近，因此能够对红外图像与彩色图像进行图像融合。
示例性的,在低照度环境下,处理芯片11可以确定红外补光灯13对应的开启时间区间和关闭时间区间,该开启时间区间可以表示红外补光灯13开启,从而对目标场景进行补光,该关闭时间区间可以表示红外补光灯13关闭,从而对目标场景不进行补光。其中,开启时间区间的时长与关闭时间区间的时长可以相同,也可以不同,参见图2A所示,为开启时间区间和关闭时间区间的示意图,在图2A中,是以开启时间区间的时长与关闭时间区间的时长相同为例。
开启时间区间与关闭时间区间可以交替出现,也就是说,开启时间区间的后面是关闭时间区间,而关闭时间区间的后面是开启时间区间。
开启时间区间可以包括补光起始时刻（即开启时间区间的起始时刻）和补光结束时刻（即开启时间区间的结束时刻），关闭时间区间可以包括未补光起始时刻（即关闭时间区间的起始时刻）和未补光结束时刻（即关闭时间区间的结束时刻），在此基础上，为了使开启时间区间与关闭时间区间交替出现，则：
开启时间区间的补光结束时刻与关闭时间区间的未补光起始时刻相同,且关闭时间区间的未补光结束时刻与开启时间区间的补光起始时刻相同。
参见图2A所示,以先关闭时间区间后开启时间区间为例,关闭时间区间a1后面是开启时间区间a2,开启时间区间a2后面是关闭时间区间a3,关闭时间区间a3后面是开启时间区间a4,开启时间区间a4后面是关闭时间区间a5,以此类推。显然,关闭时间区间的未补光结束时刻与开启时间区间的补光起始时刻相同,开启时间区间的补光结束时刻与关闭时间区间的未补光起始时刻相同。
在图2A中,是以先关闭时间区间后开启时间区间为例,针对先开启时间区间后关闭时间区间的情况,与图2A的实现过程类似,本申请实施例中不再赘述。
开启时间区间的时长与图像传感器12的曝光时长可以相同,在开启时间区间的时长与关闭时间区间的时长相同时,关闭时间区间的时长也与图像传感器12的曝光时长相同。图像传感器12的曝光时长(即曝光时间)是指:从快门打开到关闭的时间间隔,在这一段时间内,图像传感器12采集一帧图像。
显然,处理芯片11可以先获取图像传感器12的曝光时长,继而基于该曝光时长确定开启时间区间的时长和关闭时间区间的时长,然后,基于开启时间区间的时长和关闭时间区间的时长确定开启时间区间和关闭时间区间。
处理芯片11可以预先配置图像传感器12的曝光时长,也可以从图像传感器12获取图像传感器12的曝光时长,对此曝光时长的获取方式不做限制。
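As a minimal sketch of the scheduling described above (each interval as long as one sensor exposure, with the end of each interval coinciding with the start of the next), the alternating intervals of Fig. 2A can be generated as follows; the function name and the millisecond values are illustrative assumptions, not from the source:

```python
def make_intervals(start_ms, exposure_ms, count, first_on=False):
    """Generate `count` alternating fill-light intervals as
    (state, begin_ms, end_ms) tuples. Each interval lasts exactly one
    exposure, and the end of one interval is the start of the next,
    so on/off intervals alternate with no gap (as in Fig. 2A)."""
    intervals, state, t = [], first_on, start_ms
    for _ in range(count):
        intervals.append(("on" if state else "off", t, t + exposure_ms))
        t += exposure_ms
        state = not state
    return intervals

# Off interval first, as in Fig. 2A: a1 off, a2 on, a3 off, a4 on
seq = make_intervals(start_ms=0, exposure_ms=40, count=4)
```

With a 40 ms exposure this yields boundaries at 0, 40, 80, 120, 160 ms, matching the "off interval followed by on interval" order of the figure.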
示例性的,在低照度环境下,处理芯片11控制红外补光灯13在该开启时间区间对目标场景进行补光,并控制图像传感器12在该开启时间区间采集目标场景的红外图像。显然,该红外图像是红外补光灯13对目标场景进行补光时,由图像传感器12基于开启时间区间的红外光生成的。以及,控制红外补光灯13在该关闭时间区间对目标场景不进行补光,并控制图像传感器12在该关闭时间区间采集目标场景的彩色图像。显然,该彩色图像是红外补光灯13未对目标场景进行补光时,由图像传感器12基于关闭时间区间的可见光生成的。
参见图2B所示,在关闭时间区间a1,处理芯片11控制红外补光灯13对目标场景不进行补光,由于红外补光灯13对目标场景不进行补光,因此,照射到图像传感器12的自然光为可见光,图像传感器12能够基于可见光生成彩色图像,也就是说,处理芯片11控制图像传感器12采集目标场景的彩色图像。
继续参见图2B所示,在开启时间区间a2,处理芯片11控制红外补光灯13对目标场景进行补光,由于红外补光灯13对目标场景进行补光,因此,照射到图像传感器12的自然光为红外光,图像传感器12能够基于红外光生成红外图像,也就是说,处理芯片11控制图像传感器12采集目标场景的红外图像。
以此类推,在每个关闭时间区间,图像传感器12均能够采集到一帧彩色图像,在每个开启时间区间,图像传感器12均能够采集到一帧红外图像。
比如说，图像传感器12在关闭时间区间a1采集到彩色图像b1，并将彩色图像b1发送给处理芯片11，图像传感器12在开启时间区间a2采集到红外图像b2，并将红外图像b2发送给处理芯片11，图像传感器12在关闭时间区间a3采集到彩色图像b3，并将彩色图像b3发送给处理芯片11，以此类推。
综上所述,图像传感器12先采集一帧彩色图像,再采集一帧红外图像,再采集一帧彩色图像,再采集一帧红外图像,以此类推,即彩色图像和红外图像交替采集。处理芯片11先控制红外补光灯13对目标场景不进行补光(此时图像传感器12采集一帧彩色图像,通过关闭红外补光灯13的开关以使红外补光灯13不进行补光),再控制红外补光灯13对目标场景进行补光(此时图像传感器12采集一帧红外图像,通过开启红外补光灯13的开关以使红外补光灯13进行补光),再控制红外补光灯13对目标场景不进行补光,以此类推。
显然,处理芯片11控制红外补光灯13对目标场景进行一次补光时,图像传感器12会采集一帧彩色图像和一帧红外图像,即每2帧图像对应一次补光,因此,红外补光灯13的补光频率为图像传感器12的图像采集帧率的一半。
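The relation stated above (one fill pulse per two captured frames, so the fill frequency is half the frame rate) can be checked numerically; the 50 fps figure below is an assumed example value, not from the source:

```python
def fill_light_frequency_hz(frame_rate_fps):
    """Each fill pulse covers one infrared frame and is followed by one
    unlit color frame, i.e. two frames per pulse, so the fill light
    fires at half the sensor's frame rate."""
    return frame_rate_fps / 2.0

# e.g. an assumed 50 fps sensor would need 25 fill pulses per second
rate = fill_light_frequency_hz(50)
```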
示例性的,在得到红外图像和彩色图像后,处理芯片11可以对该红外图像和该彩色图像进行图像融合(即双光图像融合)。比如说,处理芯片11从该红外图像中获取亮度信息,并从该彩色图像中获取色彩信息,并基于该亮度信息和该色彩信息生成融合图像,对此融合方式不做限制,该融合图像是一帧兼顾亮度信息和色彩信息的清晰彩色图像,即高亮度和高清晰度的彩色图像。
比如说,处理芯片11对彩色图像b1和红外图像b2进行图像融合,得到一帧融合图像,该融合图像是高亮度和高清晰度的彩色图像。处理芯片11对彩色图像b3和红外图像b4进行图像融合,得到一帧融合图像,以此类推。
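The text leaves the fusion method open ("对此融合方式不做限制"). As one hedged sketch only, luminance can be taken from the infrared frame and chrominance from the color frame via a BT.601-style YCbCr split; the coefficients and the NumPy formulation are illustrative assumptions, not the patent's algorithm:

```python
import numpy as np

def fuse(ir_gray, color_rgb):
    """Dual-light fusion sketch: luminance from the infrared image,
    chrominance from the color image (BT.601 YCbCr coefficients)."""
    rgb = color_rgb.astype(np.float64)
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    # Chrominance of the visible-light frame
    cb = -0.169 * r - 0.331 * g + 0.500 * b
    cr = 0.500 * r - 0.419 * g - 0.081 * b
    # Luminance replaced wholesale by the infrared frame
    y = ir_gray.astype(np.float64)
    # Convert back to RGB
    out = np.stack([y + 1.402 * cr,
                    y - 0.344 * cb - 0.714 * cr,
                    y + 1.772 * cb], axis=-1)
    return np.clip(np.rint(out), 0, 255).astype(np.uint8)
```

A quick sanity check: on a neutral-gray color frame (zero chrominance) the fused output simply reproduces the infrared luminance.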
由以上技术方案可见,本申请实施例中,使用红外补光灯对目标场景进行补光,且红外补光灯的开启时间区间与关闭时间区间交替出现,即红外补光灯不是常亮,从而使补光的时间减少,功耗降低,节约能源,减少长时间补光造成的光污染。通过对红外图像和彩色图像进行图像融合,得到高亮度和高清晰度的彩色图像,从而在低照度环境下采集到高亮度和高清晰度的彩色图像。在上述方式中,基于红外隔帧补光和双光图像融合方式,使得只需要单图像传感器的摄像机,就可以在低照度环境下采集到高亮度和高清晰度的彩色图像,摄像机中仅包含一个图像传感器,硬件资源的消耗较低。
在一种可能的实施方式中,参见图3所示,摄像机还可以包括镜头14,自然光通过镜头14照射到图像传感器12。比如说,若使用红外补光灯13对目标场景进行补光,则红外光通过镜头14照射到图像传感器12,图像传感器12基于红外光生成红外图像。若未使用红外补光灯13对目标场景进行补光,则可见光通过镜头14照射到图像传感器12,图像传感器12基于可见光生成彩色图像。
在一种可能的实施方式中,开启时间区间可以包括补光起始时刻和补光结束时刻,关闭时间区间可以包括未补光起始时刻和未补光结束时刻,处理芯片11控制红外补光灯13在该开启时间区间对目标场景进行补光,并控制图像传感器12在该开启时间区间采集目标场景的红外图像,以及,控制红外补光灯13在该关闭时间区间对目标场景不进行补光,并控制图像传感器12在该关闭时间区间采集目标场景的彩色图像,可以包括但不限于如下方式:
方式一、基于补光起始时刻和第一时长确定控制信号起始时刻,基于补光结束时刻和第一时长确定控制信号结束时刻,在控制信号起始时刻至控制信号结束时刻之间的时间区间,向红外补光灯13发送控制信号,以使红外补光灯13在接收到控制信号后,对目标场景进行补光。基于补光起始时刻和第二时长确定同步信号起始时刻,在同步信号起始时刻向图像传感器12发送同步信号,以使图像传感器12在接收到同步信号后,在曝光时长内采集目标场景的红外图像。
基于未补光起始时刻和第一时长确定终止信号起始时刻,基于未补光结束时刻和第一时长确定终止信号结束时刻,在终止信号起始时刻至终止信号结束时刻之间的时间区间,向红外补光灯13发送终止信号,以使红外补光灯13接收到终止信号后,对目标场景不进行补光。基于未补光起始时刻和第二时长确定同步信号起始时刻,在同步信号起始时刻向图像传感器12发送同步信号,以使图像传感器12接收到同步信号后,在曝光时长内采集目标场景的彩色图像。
在上述实施例中,该曝光时长是图像传感器12的曝光时长,该开启时间区间的时长与该曝光时长相同,该关闭时间区间的时长与该曝光时长相同。
在上述实施例中,第一时长是处理芯片11与红外补光灯13之间的信号传输延时,该第一时长可以是处理芯片11配置的经验值,也可以是处理芯片11测量得到的数值,对此第一时长的获取方式不做限制。比如说,处理芯片11向红外补光灯13发送请求信号,红外补光灯13接收到请求信号后,向处理芯片11返回响应信号,处理芯片11基于请求信号的发送时刻和响应信号的接收时刻,确定处理芯片11与红外补光灯13之间的信号传输延时(如响应信号的接收时刻与请求信号的发送时刻之间差值的一半),该信号传输延时就是第一时长。
在上述实施例中,第二时长是处理芯片11与图像传感器12之间的信号传输延时,该第二时长可以是处理芯片11配置的经验值,也可以是处理芯片11测量得到的数值,对此第二时长的获取方式不做限制。比如说,处理芯片11向图像传感器12发送请求信号,图像传感器12接收到请求信号后,向处理芯片11返回响应信号,处理芯片11基于请求信号的发送时刻和响应信号的接收时刻,确定处理芯片11与图像传感器12之间的信号传输延时(如响应信号的接收时刻与请求信号的发送时刻之间差值的一半),该信号传输延时就是第二时长。
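The round-trip measurement described in the two paragraphs above (one-way delay taken as half of "response received minus request sent") can be sketched as follows; `send_request` and `wait_response` are hypothetical callables standing in for the chip's actual signaling, which the source does not specify:

```python
import time

def measure_one_way_delay(send_request, wait_response):
    """Estimate a link's one-way signal delay as half the round-trip
    time, as the text does for both the chip-to-lamp link (first
    duration) and the chip-to-sensor link (second duration)."""
    t_sent = time.monotonic()
    send_request()
    wait_response()
    t_received = time.monotonic()
    return (t_received - t_sent) / 2.0

# Stub link whose round trip takes ~10 ms -> delay estimate ~5 ms
delay = measure_one_way_delay(lambda: None, lambda: time.sleep(0.010))
```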
基于补光起始时刻和第一时长确定控制信号起始时刻,可以包括:将补光起始时刻与第一时长之间的差值,确定为控制信号起始时刻,这样,处理芯片11在该控制信号起始时刻向红外补光灯13发送控制信号时,该控制信号正好在该补光起始时刻到达红外补光灯13,也就是说,红外补光灯13在该补光起始时刻接收到该控制信号,并从该补光起始时刻开始,对目标场景进行补光。
基于补光结束时刻和第一时长确定控制信号结束时刻,可以包括:将补光结束时刻与第一时长之间的差值,确定为控制信号结束时刻,这样,处理芯片11在该控制信号结束时刻向红外补光灯13发送控制信号时,该控制信号正好在该补光结束时刻到达红外补光灯13,也就是说,红外补光灯13在该补光结束时刻接收到该控制信号,并从该补光结束时刻开始,停止对目标场景进行补光。
基于未补光起始时刻和第一时长确定终止信号起始时刻,可以包括:将未补光起始时刻与第一时长之间的差值,确定为终止信号起始时刻,处理芯片11在该终止信号起始时刻向红外补光灯13发送终止信号时,终止信号正好在该未补光起始时刻到达红外补光灯13,也就是说,红外补光灯13在未补光起始时刻接收到该终止信号,并从该未补光起始时刻开始,停止对目标场景进行补光。
基于未补光结束时刻和第一时长确定终止信号结束时刻,可以包括:将未补光结束时刻与第一时长之间的差值,确定为终止信号结束时刻,处理芯片11在该终止信号结束时刻向红外补光灯13发送终止信号时,终止信号正好在该未补光结束时刻到达红外补光灯13,也就是说,红外补光灯13在未补光结束时刻接收到该终止信号,并从该未补光结束时刻开始,对目标场景进行补光。
控制信号可以为高电平信号，终止信号可以为低电平信号，或者，控制信号可以为低电平信号，终止信号可以为高电平信号，后续以控制信号为高电平信号，终止信号为低电平信号为例，因此，在控制信号起始时刻至控制信号结束时刻之间的时间区间，处理芯片11向红外补光灯13发送高电平信号，红外补光灯13在接收到高电平信号后，对目标场景进行补光。在终止信号起始时刻至终止信号结束时刻之间的时间区间，处理芯片11向红外补光灯13发送低电平信号，红外补光灯13在接收到低电平信号后，对目标场景不进行补光。
以下结合图4A对上述过程进行说明,处理芯片11从控制信号起始时刻开始发送高电平信号,高电平信号在补光起始时刻到达红外补光灯13,即红外补光灯13从补光起始时刻开始进行补光,且在高电平信号的接收时段内,红外补光灯13均会进行补光。处理芯片11从终止信号起始时刻(即控制信号结束时刻)开始发送低电平信号,这个低电平信号在未补光起始时刻(即补光结束时刻)到达红外补光灯13,即红外补光灯13从未补光起始时刻开始不进行补光,且在低电平信号的接收时段内,红外补光灯13均不会进行补光。
基于补光起始时刻和第二时长确定同步信号起始时刻,可以包括:将补光起始时刻与第二时长之间的差值,确定为同步信号起始时刻,处理芯片11在同步信号起始时刻向图像传感器12发送同步信号时,该同步信号在该补光起始时刻到达图像传感器12,也就是说,图像传感器12在该补光起始时刻接收到同步信号,并从该补光起始时刻开始,在曝光时长内采集目标场景的图像。由于红外补光灯13的开启时间区间与该曝光时长相同,且红外补光灯13在该开启时间区间进行补光,因此,图像传感器12在该曝光时长内采集的是红外图像。
基于未补光起始时刻和第二时长确定同步信号起始时刻,可以包括:将未补光起始时刻与第二时长之间的差值,确定为同步信号起始时刻,处理芯片11在同步信号起始时刻向图像传感器12发送同步信号时,该同步信号在该未补光起始时刻到达图像传感器12,图像传感器12在该未补光起始时刻接收到同步信号,并从该未补光起始时刻开始,在曝光时长内采集目标场景的图像。由于红外补光灯13的关闭时间区间与该曝光时长相同,且红外补光灯13在该关闭时间区间未进行补光,因此,图像传感器12在该曝光时长内采集的是彩色图像。
同步信号可以为脉冲信号,也就是说,在每个同步信号起始时刻(基于补光起始时刻和第二时长确定,或者基于未补光起始时刻和第二时长确定),处理芯片11向图像传感器12发送一个脉冲信号。图像传感器12每次接收到脉冲信号后,就可以在曝光时长内采集图像(如红外图像或者彩色图像)。
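Method one above pre-compensates each link's transmission delay on the chip side: every signal is sent early by exactly that link's delay, so it takes effect at the nominal instant. A sketch, with function name and time units as assumptions:

```python
def method_one_schedule(fill_start, fill_end, lamp_delay, sensor_delay):
    """Method 1: the processing chip sends each signal early by that
    link's one-way delay, so the control signal reaches the lamp at
    fill_start/fill_end and the sync pulse reaches the sensor at
    fill_start."""
    return {
        "control_signal_start": fill_start - lamp_delay,  # lamp lights at fill_start
        "control_signal_end": fill_end - lamp_delay,      # lamp stops at fill_end
        "sync_signal_start": fill_start - sensor_delay,   # exposure begins at fill_start
    }

s = method_one_schedule(fill_start=100, fill_end=140, lamp_delay=3, sensor_delay=1)
```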
方式二、在补光起始时刻至补光结束时刻之间的时间区间,向红外补光灯13发送控制信号,以使红外补光灯13在接收到控制信号后,对目标场景进行补光。以及,在补光起始时刻向图像传感器12发送同步信号,以使图像传感器12在接收到同步信号并延时第三时长后,在曝光时长内采集目标场景的红外图像。
在未补光起始时刻至未补光结束时刻之间的时间区间,向红外补光灯13发送终止信号,以使红外补光灯13在接收到终止信号后,对目标场景不进行补光。以及,在未补光起始时刻向图像传感器12发送同步信号,以使图像传感器12在接收到同步信号并延时第三时长后,在曝光时长内采集目标场景的彩色图像。
在上述实施例中,该曝光时长是图像传感器12的曝光时长,且开启时间区间的时长与曝光时长相同,关闭时间区间的时长与曝光时长相同。第一时长是处理芯片11与红外补光灯13之间的信号传输延时。第二时长是处理芯片11与图像传感器12之间的信号传输延时。第三时长是第一时长与第二时长的差值,比如说,第一时长会大于第二时长,第三时长可以是大于0的数值。
由于处理芯片11与红外补光灯13之间的信号传输延时为第一时长,因此,从补光起始时刻至补光结束时刻之间的时间区间,处理芯片11向红外补光灯13发送控制信号时,红外补光灯13在补光起始时刻+第一时长至补光结束时刻+第一时长之间的时间区间,会接收到控制信号,并对目标场景进行补光。从未补光起始时刻至未补光结束时刻之间的时间区间,处理芯片11向红外补光灯13发送终止信号时,红外补光灯13在未补光起始时刻+第一时长至未补光结束时刻+第一时长之间的时间区间,会接收到终止信号,并对目标场景不进行补光。
控制信号可以为高电平信号,终止信号可以为低电平信号,在补光起始时刻至补光结束时刻之间的时间区间,处理芯片11向红外补光灯13发送高电平信号,红外补光灯13在接收到高电平信号后,对目标场景进行补光。在未补光起始时刻至未补光结束时刻之间的时间区间,处理芯片11向红外补光灯13发送低电平信号,红外补光灯13在接收到低电平信号后,对目标场景不进行补光。
以下结合图4B进行说明,处理芯片11从补光起始时刻开始发送高电平信号,高电平信号在补光起始时刻+第一时长到达红外补光灯13,在高电平信号的接收时段内,红外补光灯13会进行补光。处理芯片11从未补光起始时刻(即补光结束时刻)开始发送低电平信号,低电平信号在未补光起始时刻+第一时长到达红外补光灯13,在低电平信号的接收时段内,红外补光灯13不会进行补光。
由于处理芯片11与图像传感器12之间的信号传输延时为第二时长,因此,处理芯片11在补光起始时刻向图像传感器12发送同步信号时,图像传感器12在补光起始时刻+第二时长会接收到同步信号。由于第三时长是第一时长与第二时长的差值,即第三时长与第二时长之和为第一时长,因此,图像传感器12接收到同步信号并延时第三时长后的时刻,为补光起始时刻+第一时长,即,图像传感器12从补光起始时刻+第一时长开始,在曝光时长内采集目标场景的图像。由于红外补光灯13的开启时间区间与曝光时长相同,红外补光灯13从补光起始时刻+第一时长开始补光,则图像传感器12在曝光时长内采集的是红外图像。
处理芯片11在未补光起始时刻向图像传感器12发送同步信号时,图像传感器12在未补光起始时刻+第二时长会接收到同步信号。图像传感器12接收到同步信号并延时第三时长后的时刻,为未补光起始时刻+第一时长,即,从未补光起始时刻+第一时长开始,在曝光时长内采集目标场景的图像。由于红外补光灯13的关闭时间区间与曝光时长相同,红外补光灯13从未补光起始时刻+第一时长开始未进行补光,因此,图像传感器12在曝光时长内采集的是彩色图像。
同步信号可以为脉冲信号,也就是说,在每个补光起始时刻和每个未补光起始时刻,处理芯片11向图像传感器12发送一个脉冲信号。图像传感器12每次接收到脉冲信号并延时第三时长后,就可以在曝光时长内采集图像。
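Method two instead sends the signals at the nominal instants and lets the sensor absorb the delay mismatch: the third duration equals the first duration minus the second, so exposure begins exactly when the lamp actually turns on. A sketch under that reading (names and time units are illustrative):

```python
def method_two_exposure_start(fill_start, lamp_delay, sensor_delay):
    """Method 2: the sync pulse is sent at fill_start and arrives at
    fill_start + sensor_delay; the sensor then waits the third
    duration (lamp_delay - sensor_delay), so exposure begins at
    fill_start + lamp_delay, when the lamp actually lights."""
    third_duration = lamp_delay - sensor_delay  # assumes lamp_delay > sensor_delay
    sync_arrival = fill_start + sensor_delay
    return sync_arrival + third_duration

start = method_two_exposure_start(fill_start=100, lamp_delay=5, sensor_delay=2)
```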
上述方式一和方式二只是示例,对此不做限制,只要能够控制红外补光灯13在开启时间区间对目标场景进行补光,控制图像传感器12在开启时间区间采集目标场景的红外图像,控制红外补光灯13在关闭时间区间对目标场景不进行补光,控制图像传感器12在关闭时间区间采集目标场景的彩色图像即可。
在一种可能的实施方式中，参见图5所示，摄像机还可以包括红外补光灯驱动15，红外补光灯驱动15用于控制红外补光灯13的开启或者关闭，即，红外补光灯驱动15可以开启红外补光灯13，使得红外补光灯13对目标场景进行补光，或者，红外补光灯驱动15可以关闭红外补光灯13，使得红外补光灯13对目标场景不进行补光。在此基础上，处理芯片11在向红外补光灯13发送控制信号(即高电平信号)时，处理芯片11可以向红外补光灯驱动15发送控制信号。红外补光灯驱动15在接收到控制信号后，可以控制红外补光灯13对目标场景进行补光。处理芯片11在向红外补光灯13发送终止信号(即低电平信号)时，处理芯片11可以向红外补光灯驱动15发送终止信号。红外补光灯驱动15在接收到终止信号后，可以控制红外补光灯13对目标场景不进行补光。
在一种可能的实施方式中,参见图6所示,摄像机还可以包括单片机16,在此基础上,处理芯片11在向红外补光灯13发送控制信号时,处理芯片11先将控制信号发送给单片机16,单片机16在接收到控制信号后,可以向红外补光灯驱动15发送该控制信号。红外补光灯驱动15在接收到控制信号后,可以控制红外补光灯13对目标场景进行补光。处理芯片11在向红外补光灯13发送终止信号时,处理芯片11先将终止信号发送给单片机16,单片机16在接收到终止信号后,可以向红外补光灯驱动15发送终止信号。红外补光灯驱动15在接收到终止信号后,可以控制红外补光灯13对目标场景不进行补光。
示例性的,单片机16可以包括与处理芯片11连接的GPIO(General Purpose Input Output,通用输入输出)接口,与红外补光灯驱动15连接的PWM(Pulse Width Modulation,脉冲宽度调制)接口。处理芯片11将控制信号或者终止信号输出给单片机的GPIO接口,单片机通过PWM接口将控制信号或者终止信号输出给红外补光灯驱动15。在此基础上,红外补光灯驱动15在接收到该控制信号时,在“LED+”和“LED-”之间产生驱动电流,“LED+”与红外补光灯13的正极连接,“LED-”与红外补光灯13的负极连接,从而在红外补光灯13的正极与负极之间产生驱动电流,开启红外补光灯13,使得红外补光灯13对目标场景进行补光。红外补光灯驱动15在接收到该终止信号时,不在“LED+”和“LED-”之间产生驱动电流,即红外补光灯13的正极与负极之间未产生驱动电流,关闭红外补光灯13,使得红外补光灯13不对目标场景进行补光。
在一种可能的实施方式中,红外补光灯驱动15的电路结构可以参见图7所示,红外补光灯驱动15对红外补光灯13的驱动电流可以为0.1V/0.125Ω=0.8A。
从图7可以看出,红外补光灯驱动15可以包括电阻R1(0.25Ω,误差为1%),电阻R2(0.25Ω,误差为1%),电阻R3(1KΩ,误差为1%),电阻R5(10KΩ),电阻R6(10KΩ),电容C1(10uF),电容C2(100nF),电容C3(100nF),电容C4(10uF),电感L1(10uH)和芯片U1。PWM表示控制信号或终止信号,GND表示地,“LED+”与红外补光灯13的正极连接,“LED-”与红外补光灯13的负极连接,12V表示红外补光灯驱动15的输入电压。
芯片U1可以包括6个管脚(也可以称为引脚),管脚1为FB管脚,是输出电压反馈脚。管脚2为DIM管脚,是输入管脚,用于接收外部输入的控制信号或终止信号。管脚3为GND管脚,是地管脚。管脚4为IN管脚,是电源输入管脚。管脚5为LX管脚,是输出管脚,用于控制芯片U1的输出,在“LED+”和“LED-”之间产生驱动电流。管脚6为BS管脚,是自升压脚。
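The 0.8 A drive current quoted for Fig. 7 follows from the two 0.25 Ω sense resistors in parallel (0.125 Ω) and the 0.1 V feedback reference stated in the text; the arithmetic can be checked directly:

```python
def parallel(r1, r2):
    """Equivalent resistance of two resistors in parallel."""
    return (r1 * r2) / (r1 + r2)

def led_drive_current(v_feedback, r_sense):
    """Constant-current LED driver: the regulator holds v_feedback
    across the sense resistance, so I = v_feedback / r_sense."""
    return v_feedback / r_sense

r_sense = parallel(0.25, 0.25)           # R1 || R2 = 0.125 ohm
i_led = led_drive_current(0.1, r_sense)  # 0.1 V / 0.125 ohm = 0.8 A
```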
当然,图7只是红外补光灯驱动15的示例,对此红外补光灯驱动15的实现方式不做限制,只要能够控制红外补光灯13的开启和关闭即可。
在一种可能的实施方式中,在多个摄像机相邻布置且同时工作时,为了避免多个摄像机之间的补光互相干扰,可以进行多个摄像机的补光同步控制,使得多个摄像机中每个摄像机的处理芯片确定的开启时间区间均相同,并且,多个摄像机中每个摄像机的处理芯片确定的关闭时间区间均相同。
比如说，摄像机1的处理芯片确定的开启时间区间与摄像机2的处理芯片确定的开启时间区间相同，摄像机1的处理芯片确定的关闭时间区间与摄像机2的处理芯片确定的关闭时间区间相同，摄像机1的处理芯片确定的开启时间区间与摄像机3的处理芯片确定的开启时间区间相同，摄像机1的处理芯片确定的关闭时间区间与摄像机3的处理芯片确定的关闭时间区间相同，以此类推。
比如说,多摄像机间补光同步原理可以参见图8所示,在多个摄像机的基础上,可以额外配置电源适配器,电源适配器可以同时向多个摄像机发送电源同步信号(如特定频率的方波信号),该电源同步信号作为每个摄像机的启动信号。针对每个摄像机来说,该摄像机的处理芯片在接收到该电源同步信号之后,才开始确定红外补光灯对应的开启时间区间和关闭时间区间。
由于所有摄像机的处理芯片同时接收到电源同步信号,因此,确定开启时间区间和关闭时间区间的时机同步,这样,可以使得每个摄像机的处理芯片确定的开启时间区间均相同,每个摄像机的处理芯片确定的关闭时间区间均相同。
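Because every camera starts its interval planning from the same power-sync instant, all cameras derive identical interval boundaries. A sketch of this invariant (all names and values are assumed for illustration):

```python
def interval_boundaries(sync_instant_ms, exposure_ms, count):
    """Boundaries of `count` alternating intervals, all derived from a
    shared power-sync instant; cameras that receive the same sync
    signal therefore compute identical schedules."""
    return [sync_instant_ms + i * exposure_ms for i in range(count + 1)]

# Three cameras receive the same sync instant -> identical plans
plans = [interval_boundaries(1000, 40, 4) for _ in range(3)]
```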
基于与上述实施例同样的发明构思,本申请实施例中还提出一种图像处理方法,该方法可以应用于摄像机,摄像机至少包括处理芯片、图像传感器和红外补光灯,参见图9所示,为该方法的流程示意图,该方法可以包括:
步骤911,处理芯片确定红外补光灯对应的开启时间区间和关闭时间区间。
示例性的,开启时间区间与关闭时间区间可以交替出现。
步骤912,处理芯片控制红外补光灯在该开启时间区间对目标场景进行补光,并控制图像传感器在该开启时间区间采集目标场景的红外图像,并控制图像传感器在该关闭时间区间采集目标场景的彩色图像。示例性的,红外图像是红外补光灯对目标场景进行补光时,由图像传感器基于红外光生成的;彩色图像是红外补光灯未对目标场景进行补光时,由图像传感器基于可见光生成的。
示例性的,开启时间区间包括补光起始时刻和补光结束时刻,处理芯片控制红外补光灯在该开启时间区间对目标场景进行补光,可以包括:基于补光起始时刻和第一时长确定控制信号起始时刻,基于补光结束时刻和第一时长确定控制信号结束时刻;第一时长是处理芯片与红外补光灯之间的信号传输延时;在控制信号起始时刻至控制信号结束时刻之间的时间区间,向红外补光灯发送控制信号,以使红外补光灯在接收到控制信号后,对目标场景进行补光。
示例性的,处理芯片控制图像传感器在该开启时间区间采集目标场景的红外图像,可以包括:基于补光起始时刻和第二时长确定同步信号起始时刻,第二时长是处理芯片与图像传感器之间的信号传输延时;在同步信号起始时刻向图像传感器发送同步信号,以使图像传感器在接收到同步信号后,在曝光时长内采集目标场景的红外图像;其中,开启时间区间的时长与曝光时长相同。
示例性的,处理芯片控制红外补光灯在开启时间区间对目标场景进行补光,控制图像传感器在开启时间区间采集目标场景的红外图像,可以包括:在补光起始时刻至补光结束时刻之间的时间区间,向红外补光灯发送控制信号,以使红外补光灯在接收到控制信号后,对目标场景进行补光;在补光起始时刻向图像传感器发送同步信号,以使图像传感器在接收到同步信号并延时第三时长后,在曝光时长内采集目标场景的红外图像;第三时长是第一时长与第二时长的差值,第一时长是处理芯片与红外补光灯之间的信号传输延时,第二时长是处理芯片与图像传感器之间的信号传输延时;开启时间区间的时长与曝光时长相同。
示例性的，关闭时间区间包括未补光起始时刻和未补光结束时刻，处理芯片控制图像传感器在关闭时间区间采集目标场景的彩色图像，可以包括：基于未补光起始时刻和第二时长确定同步信号起始时刻，第二时长是处理芯片与图像传感器之间的信号传输延时；在同步信号起始时刻向图像传感器发送同步信号，以使图像传感器在接收到同步信号后，在曝光时长内采集目标场景的彩色图像。或者，在未补光起始时刻向图像传感器发送同步信号，以使图像传感器在接收到同步信号并延时第三时长后，在曝光时长内采集目标场景的彩色图像；其中，第三时长是第一时长与第二时长的差值，第一时长是处理芯片与红外补光灯之间的信号传输延时，第二时长是处理芯片与图像传感器之间的信号传输延时；示例性的，关闭时间区间的时长与曝光时长可以相同。
步骤913,处理芯片对红外图像和彩色图像进行图像融合。
比如说,处理芯片从红外图像中获取亮度信息,并从彩色图像中获取色彩信息,并基于该亮度信息和该色彩信息生成融合图像。
示例性的,上述执行顺序只是为了方便描述给出的示例,在实际应用中,还可以改变步骤之间的执行顺序,对此执行顺序不做限制。而且,在其它实施例中,并不一定按照本说明书示出和描述的顺序来执行相应方法的步骤,其方法所包括的步骤可以比本说明书所描述的更多或更少。此外,本说明书中所描述的单个步骤,在其它实施例中可能被分解为多个步骤进行描述;本说明书中所描述的多个步骤,在其它实施例也可能被合并为单个步骤进行描述。
由以上技术方案可见,本申请实施例中,使用红外补光灯对目标场景进行补光,且红外补光灯的开启时间区间与关闭时间区间交替出现,即红外补光灯不是常亮,从而使补光的时间减少,功耗降低,节约能源,减少长时间补光造成的光污染。通过对红外图像和彩色图像进行图像融合,得到高亮度和高清晰度的彩色图像,从而在低照度环境下采集到高亮度和高清晰度的彩色图像。
基于与上述方法同样的申请构思,本申请实施例中提出了一种摄像机,包括:
处理芯片、图像传感器和红外补光灯;
所述处理芯片,用于:
确定所述红外补光灯的开启时间区间和关闭时间区间;其中,所述开启时间区间与所述关闭时间区间交替设置,其中所述开启时间区间包括补光起始时刻和补光结束时刻;
控制所述红外补光灯在所述开启时间区间对场景进行补光;
控制所述图像传感器在所述开启时间区间采集所述场景的红外图像,并控制所述图像传感器在所述关闭时间区间采集所述场景的彩色图像;
其中,所述红外图像是所述红外补光灯对所述场景进行补光时,由所述图像传感器基于红外光生成的;所述彩色图像是所述红外补光灯未对所述场景进行补光时,由所述图像传感器基于可见光生成的;
对所述红外图像和所述彩色图像进行图像融合。
示例性的,其中,所述处理芯片用于从所述补光起始时刻减去第一时长得到的时刻起至所述补光结束时刻减去所述第一时长得到的时刻止,向所述红外补光灯发送信号,其中所述第一时长是预设值,且指示所述处理芯片与所述红外补光灯之间的信号传输所需的延时;
所述红外补光灯用于在接收到所述处理芯片发送的信号后,对所述场景进行补光。
示例性的,其中,所述处理芯片用于在所述补光起始时刻减去第二时长得到的时刻,向所述图像传感器发送信号,其中,所述第二时长是预设值,且指示所述处理芯片与所述图像传感器之间的信号传输所需的延时;
所述图像传感器用于从接收到所述处理芯片发送的信号起第一预设时长内采集所述场景的红外图像,其中,所述第一预设时长为所述开启时间区间的时长。
示例性的,其中,所述处理芯片用于从所述补光起始时刻至所述补光结束时刻,向所述红外补光灯发送信号;
所述红外补光灯用于在接收到所述处理芯片发送的信号后对所述场景进行补光;
且,所述处理芯片用于在所述补光起始时刻前第三时长的时刻,向所述图像传感器发送信号,所述第三时长指示:处理芯片与所述红外补光灯之间的信号传输所需的延时、所述处理芯片与所述图像传感器之间的信号传输所需的延时间的延时差值;
所述图像传感器用于从接收到所述处理芯片发送的信号起第一预设时长内采集所述场景的红外图像,所述第一预设时长为所述开启时间区间的时长。
示例性的,其中,所述关闭时间区间包括未补光起始时刻和未补光结束时刻,所述处理芯片用于在所述未补光起始时刻前第二时长的时刻,向所述图像传感器发送信号;
所述图像传感器用于在接收到所述处理芯片发送的信号后第二预设时长内采集所述场景的彩色图像,所述第二预设时长为所述关闭时间区间的时长;
或者,所述处理芯片用于在所述未补光起始时刻前第三时长的时刻,向所述图像传感器发送信号;
所述图像传感器用于在接收到所述处理芯片发送的信号后第二预设时长内采集所述场景的彩色图像;
其中,所述第二时长和所述第三时长是预设值,所述第二时长指示所述处理芯片与所述图像传感器之间的信号传输所需的延时,所述第三时长指示:处理芯片与所述红外补光灯之间的信号传输所需的延时、所述处理芯片与所述图像传感器之间的信号传输所需的延时间的延时差值。
示例性的,其中,所述处理芯片用于:
从所述红外图像中获取亮度信息,并从所述彩色图像中获取色彩信息;
基于所述亮度信息和所述色彩信息生成融合图像。
示例性的,在多个摄像机相邻布置时,所述多个摄像机中每个摄像机的处理芯片确定的开启时间区间均相同,且所述多个摄像机中每个摄像机的处理芯片确定的关闭时间区间均相同。
具体的,本申请实施例所示的摄像机与前文图1实施例所示的摄像机的功能相似,在此不再赘述。
基于与上述方法同样的申请构思,本申请实施例中提出一种图像处理装置,摄像机至少包括处理芯片、图像传感器和红外补光灯,所述装置应用于所述处理芯片,所述装置包括:
确定模块,用于确定所述红外补光灯对应的开启时间区间和关闭时间区间;其中,所述开启时间区间与所述关闭时间区间交替出现;
控制模块,用于控制所述红外补光灯在所述开启时间区间对目标场景进行补光;以及,控制所述图像传感器在所述开启时间区间采集所述目标场景的红外图像,并控制所述图像传感器在所述关闭时间区间采集所述目标场景的彩色图像;其中,所述红外图像是所述红外补光灯对所述目标场景进行补光时,由所述图像传感器基于红外光生成的;所述彩色图像是所述红外补光灯未对所述目标场景进行补光时,由所述图像传感器基于可见光生成的;
处理模块,用于对所述红外图像和所述彩色图像进行图像融合。
由以上技术方案可见，本申请实施例中，使用红外补光灯对目标场景进行补光，且红外补光灯的开启时间区间与关闭时间区间交替出现，即红外补光灯不是常亮，从而使补光的时间减少，功耗降低，节约能源，减少长时间补光造成的光污染。通过对红外图像和彩色图像进行图像融合，得到高亮度和高清晰度的彩色图像，从而在低照度环境下采集到高亮度和高清晰度的彩色图像。
示例性的,所述开启时间区间包括补光起始时刻和补光结束时刻,所述控制模块,具体用于:
基于所述补光起始时刻和第一时长确定控制信号起始时刻,基于所述补光结束时刻和所述第一时长确定控制信号结束时刻;其中,所述第一时长是所述处理芯片与所述红外补光灯之间的信号传输延时;在所述控制信号起始时刻至所述控制信号结束时刻之间的时间区间,向所述红外补光灯发送控制信号,以使所述红外补光灯在接收到所述控制信号后,对目标场景进行补光。
示例性的,所述开启时间区间包括补光起始时刻和补光结束时刻,所述控制模块,具体用于:
基于所述补光起始时刻和第二时长确定同步信号起始时刻,所述第二时长是所述处理芯片与所述图像传感器之间的信号传输延时;在所述同步信号起始时刻向所述图像传感器发送同步信号,以使所述图像传感器在接收到所述同步信号后,在曝光时长内采集所述目标场景的红外图像;
其中,所述开启时间区间的时长与所述曝光时长相同。
示例性的,所述开启时间区间包括补光起始时刻和补光结束时刻,所述控制模块,具体用于:
在所述补光起始时刻至所述补光结束时刻之间的时间区间,向所述红外补光灯发送控制信号,以使所述红外补光灯在接收到所述控制信号后,对目标场景进行补光;以及,在所述补光起始时刻向所述图像传感器发送同步信号,以使所述图像传感器在接收到所述同步信号并延时第三时长后,在曝光时长内采集所述目标场景的红外图像;其中,所述第三时长是第一时长与第二时长的差值,所述第一时长是所述处理芯片与所述红外补光灯之间的信号传输延时,所述第二时长是所述处理芯片与所述图像传感器之间的信号传输延时;其中,所述开启时间区间的时长与所述曝光时长相同。
示例性的,所述摄像机还包括红外补光灯驱动,所述控制模块,具体用于:
向所述红外补光灯驱动发送所述控制信号,以使所述红外补光灯驱动在接收到所述控制信号后,控制所述红外补光灯对目标场景进行补光。
示例性的,所述关闭时间区间包括未补光起始时刻和未补光结束时刻,所述控制模块,具体用于:
基于所述未补光起始时刻和第二时长确定同步信号起始时刻,所述第二时长是所述处理芯片与所述图像传感器之间的信号传输延时;在所述同步信号起始时刻向所述图像传感器发送同步信号,以使所述图像传感器在接收到所述同步信号后,在曝光时长内采集所述目标场景的彩色图像;
或者,在所述未补光起始时刻向所述图像传感器发送同步信号,以使所述图像传感器在接收到所述同步信号并延时第三时长后,在曝光时长内采集所述目标场景的彩色图像;其中,所述第三时长是第一时长与第二时长的差值,所述第一时长是所述处理芯片与所述红外补光灯之间的信号传输延时,所述第二时长是所述处理芯片与所述图像传感器之间的信号传输延时;
其中,所述关闭时间区间的时长与所述曝光时长相同。
示例性的,所述处理模块,具体用于:
所述处理芯片从所述红外图像中获取亮度信息,并从所述彩色图像中获取色彩信息,并基于所述亮度信息和所述色彩信息生成融合图像。
示例性的,在多个摄像机相邻布置时,所述多个摄像机中每个摄像机的处理芯片确定的开启时间区间均相同,且所述多个摄像机中每个摄像机的处理芯片确定的关闭时间区间均相同。
在本申请提供的又一实施例中,还提供了一种计算机可读存储介质,该计算机可读存储介质内存储有计算机程序,所述计算机程序被处理器执行时实现上述任一图像处理方法的步骤。
应用本实施例提供的计算机可读存储介质中存储的计算机程序进行图像处理时,本申请实施例中,使用红外补光灯对目标场景进行补光,且红外补光灯的开启时间区间与关闭时间区间交替出现,即红外补光灯不是常亮,从而使补光的时间减少,功耗降低,节约能源,减少长时间补光造成的光污染。通过对红外图像和彩色图像进行图像融合,得到高亮度和高清晰度的彩色图像,从而在低照度环境下采集到高亮度和高清晰度的彩色图像。
在本申请提供的又一实施例中,还提供了一种包含指令的计算机程序产品,当其在计算机上运行时,使得计算机执行上述实施例中任一图像处理方法。
执行本实施例提供的计算机程序产品进行图像处理时,本申请实施例中,使用红外补光灯对目标场景进行补光,且红外补光灯的开启时间区间与关闭时间区间交替出现,即红外补光灯不是常亮,从而使补光的时间减少,功耗降低,节约能源,减少长时间补光造成的光污染。通过对红外图像和彩色图像进行图像融合,得到高亮度和高清晰度的彩色图像,从而在低照度环境下采集到高亮度和高清晰度的彩色图像。
上述实施例阐明的系统、装置、模块或单元,具体可以由计算机芯片或实体实现,或者由具有某种功能的产品来实现。一种典型的实现设备为计算机,计算机的具体形式可以是个人计算机、膝上型计算机、蜂窝电话、相机电话、智能电话、个人数字助理、媒体播放器、导航设备、电子邮件收发设备、游戏控制台、平板计算机、可穿戴设备或者这些设备中的任意几种设备的组合。
为了描述的方便,描述以上装置时以功能分为各种单元分别描述。当然,在实施本申请时可以把各单元的功能在同一个或多个软件和/或硬件中实现。
本领域内的技术人员应明白,本申请的实施例可提供为方法、系统、或计算机程序产品。因此,本申请可采用完全硬件实施例、完全软件实施例、或结合软件和硬件方面的实施例的形式。而且,本申请实施例可采用在一个或多个其中包含有计算机可用程序代码的计算机可用存储介质(包括但不限于磁盘存储器、CD-ROM、光学存储器等)上实施的计算机程序产品的形式。
本申请是参照根据本申请实施例的方法、设备(系统)、和计算机程序产品的流程图和/或方框图来描述的。应理解可以由计算机程序指令实现流程图和/或方框图中的每一流程和/或方框、以及流程图和/或方框图中的流程和/或方框的结合。可提供这些计算机程序指令到通用计算机、专用计算机、嵌入式处理机或其它可编程数据处理设备的处理器以产生一个机器,使得通过计算机或其它可编程数据处理设备的处理器执行的指令产生用于实现在流程图一个流程或多个流程和/或方框图一个方框或多个方框中指定的功能的装置。
而且,这些计算机程序指令也可以存储在能引导计算机或其它可编程数据处理设备以特定方式工作的计算机可读存储器中,使得存储在该计算机可读存储器中的指令产生包括指令装置的制造品,该指令装置实现在流程图一个流程或者多个流程和/或方框图一个方框或者多个方框中指定的功能。
这些计算机程序指令也可装载到计算机或其它可编程数据处理设备上,使得在计算机或者其它可编程设备上执行一系列操作步骤以产生计算机实现的处理,从而在计算机或其它可编程设备上执行的指令提供用于实现在流程图一个流程或多个流程和/或方框图一个方框或多个方框中指定的功能的步骤。
以上所述仅为本申请的实施例而已,并不用于限制本申请。对于本领域技术人员来说,本申请可以有各种更改和变化。凡在本申请的精神和原理之内所作的任何修改、等同替换、改进等,均应包含在本申请的权利要求范围之内。

Claims (23)

  1. 一种摄像机,包括处理芯片、图像传感器和红外补光灯;
    所述处理芯片,用于确定所述红外补光灯对应的开启时间区间和关闭时间区间;其中,所述开启时间区间与所述关闭时间区间交替出现;
    控制所述红外补光灯在所述开启时间区间对目标场景进行补光;
    控制所述图像传感器在所述开启时间区间采集所述目标场景的红外图像,并控制所述图像传感器在所述关闭时间区间采集所述目标场景的彩色图像;其中,所述红外图像是所述红外补光灯对所述目标场景进行补光时,由所述图像传感器基于红外光生成的;所述彩色图像是所述红外补光灯未对所述目标场景进行补光时,由所述图像传感器基于可见光生成的;
    对所述红外图像和所述彩色图像进行图像融合。
  2. 根据权利要求1所述的摄像机,
    所述开启时间区间包括补光起始时刻和补光结束时刻,所述处理芯片控制所述红外补光灯在所述开启时间区间对目标场景进行补光时具体用于:
    基于所述补光起始时刻和第一时长确定控制信号起始时刻,基于所述补光结束时刻和所述第一时长确定控制信号结束时刻;其中,所述第一时长是所述处理芯片与所述红外补光灯之间的信号传输延时;在所述控制信号起始时刻至所述控制信号结束时刻之间的时间区间,向所述红外补光灯发送控制信号,以使所述红外补光灯在接收到所述控制信号后,对目标场景进行补光。
  3. 根据权利要求1或2所述的摄像机,
    所述开启时间区间包括补光起始时刻和补光结束时刻,所述处理芯片控制所述图像传感器在所述开启时间区间采集所述目标场景的红外图像时具体用于:
    基于所述补光起始时刻和第二时长确定同步信号起始时刻,所述第二时长是所述处理芯片与所述图像传感器之间的信号传输延时;在所述同步信号起始时刻向所述图像传感器发送同步信号,以使所述图像传感器在接收到所述同步信号后,在曝光时长内采集所述目标场景的红外图像;
    其中,所述开启时间区间的时长与所述曝光时长相同。
  4. 根据权利要求1所述的摄像机,
    所述开启时间区间包括补光起始时刻和补光结束时刻,所述处理芯片控制所述红外补光灯在所述开启时间区间对目标场景进行补光,控制所述图像传感器在所述开启时间区间采集所述目标场景的红外图像时具体用于:
    在所述补光起始时刻至所述补光结束时刻之间的时间区间,向所述红外补光灯发送控制信号,以使所述红外补光灯在接收到所述控制信号后,对目标场景进行补光;以及,在所述补光起始时刻向所述图像传感器发送同步信号,以使所述图像传感器在接收到所述同步信号并延时第三时长后,在曝光时长内采集所述目标场景的红外图像;其中,所述第三时长是第一时长与第二时长的差值,所述第一时长是所述处理芯片与所述红外补光灯之间的信号传输延时,所述第二时长是所述处理芯片与所述图像传感器之间的信号传输延时;
    其中,所述开启时间区间的时长与所述曝光时长相同。
  5. 根据权利要求2或4所述的摄像机,所述摄像机还包括红外补光灯驱动,所述处理芯片向所述红外补光灯发送控制信号时具体用于:
    向所述红外补光灯驱动发送所述控制信号,以使所述红外补光灯驱动在接收到所述控制信号后,控制所述红外补光灯对目标场景进行补光。
  6. 根据权利要求1-5中任一项所述的摄像机,所述关闭时间区间包括未补光起始时刻和未补光结束时刻,所述处理芯片控制所述图像传感器在所述关闭时间区间采集所述目标场景的彩色图像时具体用于:
    基于所述未补光起始时刻和第二时长确定同步信号起始时刻,所述第二时长是所述处理芯片与所述图像传感器之间的信号传输延时;在所述同步信号起始时刻向所述图像传感器发送同步信号,以使所述图像传感器在接收到所述同步信号后,在曝光时长内采集所述目标场景的彩色图像;
    或者,在所述未补光起始时刻向所述图像传感器发送同步信号,以使所述图像传感器在接收到所述同步信号并延时第三时长后,在曝光时长内采集所述目标场景的彩色图像;其中,所述第三时长是第一时长与第二时长的差值,所述第一时长是所述处理芯片与所述红外补光灯之间的信号传输延时,所述第二时长是所述处理芯片与所述图像传感器之间的信号传输延时;
    其中,所述关闭时间区间的时长与所述曝光时长相同。
  7. 根据权利要求1-6中任一项所述的摄像机,
    所述处理芯片对所述红外图像和所述彩色图像进行图像融合时具体用于:
    从所述红外图像中获取亮度信息,并从所述彩色图像中获取色彩信息;
    基于所述亮度信息和所述色彩信息生成融合图像。
  8. 根据权利要求1-7中任一项所述的摄像机,在多个摄像机相邻布置时,所述多个摄像机中每个摄像机的处理芯片确定的开启时间区间均相同,且所述多个摄像机中每个摄像机的处理芯片确定的关闭时间区间均相同。
  9. 一种图像处理方法,应用于摄像机,所述摄像机至少包括处理芯片、图像传感器和红外补光灯,所述方法包括:
    所述处理芯片确定所述红外补光灯对应的开启时间区间和关闭时间区间;其中,所述开启时间区间与所述关闭时间区间交替出现;
    所述处理芯片控制所述红外补光灯在所述开启时间区间对目标场景进行补光,并控制所述图像传感器在所述开启时间区间采集所述目标场景的红外图像,并控制所述图像传感器在所述关闭时间区间采集所述目标场景的彩色图像;其中,所述红外图像是所述红外补光灯对所述目标场景进行补光时,由所述图像传感器基于红外光生成的;所述彩色图像是所述红外补光灯未对所述目标场景进行补光时,由所述图像传感器基于可见光生成的;
    所述处理芯片对所述红外图像和所述彩色图像进行图像融合。
  10. 根据权利要求9所述的方法,所述开启时间区间包括补光起始时刻和补光结束时刻,所述处理芯片控制所述红外补光灯在所述开启时间区间对目标场景进行补光,包括:
    所述处理芯片基于所述补光起始时刻和第一时长确定控制信号起始时刻,基于所述补光结束时刻和所述第一时长确定控制信号结束时刻;其中,所述第一时长是所述处理芯片与所述红外补光灯之间的信号传输延时;在所述控制信号起始时刻至所述控制信号结束时刻之间的时间区间,向所述红外补光灯发送控制信号,以使所述红外补光灯在接收到所述控制信号后,对目标场景进行补光。
  11. 根据权利要求9或10所述的方法，所述开启时间区间包括补光起始时刻和补光结束时刻，所述处理芯片控制所述图像传感器在所述开启时间区间采集所述目标场景的红外图像，包括：
    基于所述补光起始时刻和第二时长确定同步信号起始时刻,所述第二时长是所述处理芯片与所述图像传感器之间的信号传输延时;在所述同步信号起始时刻向所述图像传感器发送同步信号,以使所述图像传感器在接收到所述同步信号后,在曝光时长内采集所述目标场景的红外图像;
    其中,所述开启时间区间的时长与所述曝光时长相同。
  12. 根据权利要求9所述的方法,所述开启时间区间包括补光起始时刻和补光结束时刻,所述处理芯片控制所述红外补光灯在所述开启时间区间对目标场景进行补光,并控制所述图像传感器在所述开启时间区间采集所述目标场景的红外图像,包括:
    所述处理芯片在所述补光起始时刻至所述补光结束时刻之间的时间区间,向所述红外补光灯发送控制信号,以使所述红外补光灯在接收到所述控制信号后,对目标场景进行补光;以及,在所述补光起始时刻向所述图像传感器发送同步信号,以使所述图像传感器在接收到所述同步信号并延时第三时长后,在曝光时长内采集所述目标场景的红外图像;其中,所述第三时长是第一时长与第二时长的差值,所述第一时长是所述处理芯片与所述红外补光灯之间的信号传输延时,所述第二时长是所述处理芯片与所述图像传感器之间的信号传输延时;
    其中,所述开启时间区间的时长与所述曝光时长相同。
  13. 根据权利要求10或12所述的方法,所述摄像机还包括红外补光灯驱动,所述处理芯片向所述红外补光灯发送控制信号,包括:
    所述处理芯片向所述红外补光灯驱动发送所述控制信号,以使所述红外补光灯驱动在接收到所述控制信号后,控制所述红外补光灯对目标场景进行补光。
  14. 根据权利要求9-13中任一项所述的方法,所述关闭时间区间包括未补光起始时刻和未补光结束时刻,所述处理芯片控制所述图像传感器在所述关闭时间区间采集所述目标场景的彩色图像,包括:
    所述处理芯片基于所述未补光起始时刻和第二时长确定同步信号起始时刻,所述第二时长是所述处理芯片与所述图像传感器之间的信号传输延时;在所述同步信号起始时刻向所述图像传感器发送同步信号,以使所述图像传感器在接收到所述同步信号后,在曝光时长内采集所述目标场景的彩色图像;
    或者,所述处理芯片在所述未补光起始时刻向所述图像传感器发送同步信号,以使所述图像传感器在接收到所述同步信号并延时第三时长后,在曝光时长内采集所述目标场景的彩色图像;其中,所述第三时长是第一时长与第二时长的差值,所述第一时长是所述处理芯片与所述红外补光灯之间的信号传输延时,所述第二时长是所述处理芯片与所述图像传感器之间的信号传输延时;
    其中,所述关闭时间区间的时长与所述曝光时长相同。
  15. 根据权利要求9-14中任一项所述的方法,所述处理芯片对所述红外图像和所述彩色图像进行图像融合,包括:
    所述处理芯片从所述红外图像中获取亮度信息,并从所述彩色图像中获取色彩信息,并基于所述亮度信息和所述色彩信息生成融合图像。
  16. 根据权利要求9-15中任一项所述的方法,在多个摄像机相邻布置时,所述多个摄像机中每个摄像机的处理芯片确定的开启时间区间均相同,且所述多个摄像机中每个摄像机的处理芯片确定的关闭时间区间均相同。
  17. 一种摄像机,包括:
    处理芯片、图像传感器和红外补光灯;
    所述处理芯片,用于:
    确定所述红外补光灯的开启时间区间和关闭时间区间;其中,所述开启时间区间与所述关闭时间区间交替设置,其中所述开启时间区间包括补光起始时刻和补光结束时刻;
    控制所述红外补光灯在所述开启时间区间对场景进行补光;
    控制所述图像传感器在所述开启时间区间采集所述场景的红外图像,并控制所述图像传感器在所述关闭时间区间采集所述场景的彩色图像;
    其中,所述红外图像是所述红外补光灯对所述场景进行补光时,由所述图像传感器基于红外光生成的;所述彩色图像是所述红外补光灯未对所述场景进行补光时,由所述图像传感器基于可见光生成的;
    对所述红外图像和所述彩色图像进行图像融合。
  18. 根据权利要求17所述的摄像机,其中,所述处理芯片用于从所述补光起始时刻减去第一时长得到的时刻起至所述补光结束时刻减去所述第一时长得到的时刻止,向所述红外补光灯发送信号,其中所述第一时长是预设值,且指示所述处理芯片与所述红外补光灯之间的信号传输所需的延时;
    所述红外补光灯用于在接收到所述处理芯片发送的信号后,对所述场景进行补光。
  19. 根据权利要求17或18所述的摄像机,其中,所述处理芯片用于在所述补光起始时刻减去第二时长得到的时刻,向所述图像传感器发送信号,其中,所述第二时长是预设值,且指示所述处理芯片与所述图像传感器之间的信号传输所需的延时;
    所述图像传感器用于从接收到所述处理芯片发送的信号起第一预设时长内采集所述场景的红外图像,其中,所述第一预设时长为所述开启时间区间的时长。
  20. 根据权利要求17所述的摄像机,其中,所述处理芯片用于从所述补光起始时刻至所述补光结束时刻,向所述红外补光灯发送信号;
    所述红外补光灯用于在接收到所述处理芯片发送的信号后对所述场景进行补光;
    且,所述处理芯片用于在所述补光起始时刻前第三时长的时刻,向所述图像传感器发送信号,所述第三时长指示:处理芯片与所述红外补光灯之间的信号传输所需的延时、所述处理芯片与所述图像传感器之间的信号传输所需的延时间的延时差值;
    所述图像传感器用于从接收到所述处理芯片发送的信号起第一预设时长内采集所述场景的红外图像,所述第一预设时长为所述开启时间区间的时长。
  21. 根据权利要求17-20中任一项所述的摄像机,其中,所述关闭时间区间包括未补光起始时刻和未补光结束时刻,所述处理芯片用于在所述未补光起始时刻前第二时长的时刻,向所述图像传感器发送信号;
    所述图像传感器用于在接收到所述处理芯片发送的信号后第二预设时长内采集所述场景的彩色图像,所述第二预设时长为所述关闭时间区间的时长;
    或者,所述处理芯片用于在所述未补光起始时刻前第三时长的时刻,向所述图像传感器发送信号;
    所述图像传感器用于在接收到所述处理芯片发送的信号后第二预设时长内采集所述场景的彩色图像;
    其中，所述第二时长和所述第三时长是预设值，所述第二时长指示所述处理芯片与所述图像传感器之间的信号传输所需的延时，所述第三时长指示：处理芯片与所述红外补光灯之间的信号传输所需的延时、所述处理芯片与所述图像传感器之间的信号传输所需的延时间的延时差值。
  22. 根据权利要求17-21中任一项所述的摄像机,其中,所述处理芯片用于:
    从所述红外图像中获取亮度信息,并从所述彩色图像中获取色彩信息;
    基于所述亮度信息和所述色彩信息生成融合图像。
  23. 根据权利要求17-22中任一项所述的摄像机,在多个摄像机相邻布置时,所述多个摄像机中每个摄像机的处理芯片确定的开启时间区间均相同,且所述多个摄像机中每个摄像机的处理芯片确定的关闭时间区间均相同。
PCT/CN2022/072686 2021-03-10 2022-01-19 一种图像处理方法、装置及摄像机 WO2022188558A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110260871.7A CN113114926B (zh) 2021-03-10 2021-03-10 一种图像处理方法、装置及摄像机
CN202110260871.7 2021-03-10

Publications (1)

Publication Number Publication Date
WO2022188558A1 true WO2022188558A1 (zh) 2022-09-15

Family

ID=76711447

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/072686 WO2022188558A1 (zh) 2021-03-10 2022-01-19 一种图像处理方法、装置及摄像机

Country Status (2)

Country Link
CN (1) CN113114926B (zh)
WO (1) WO2022188558A1 (zh)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113114926B (zh) * 2021-03-10 2022-11-25 杭州海康威视数字技术股份有限公司 一种图像处理方法、装置及摄像机
WO2023125087A1 (zh) * 2021-12-30 2023-07-06 华为技术有限公司 图像处理方法及相关装置
CN114158156B (zh) * 2022-02-10 2022-05-03 深圳佑驾创新科技有限公司 一种补光灯的pwm调节方法、装置、设备及存储介质

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105144688A (zh) * 2013-04-24 2015-12-09 日立麦克赛尔株式会社 摄像装置和摄像系统
CN107005639A (zh) * 2014-12-10 2017-08-01 索尼公司 图像拾取设备,图像拾取方法,程序和图像处理设备
CN107566747A (zh) * 2017-09-22 2018-01-09 浙江大华技术股份有限公司 一种图像亮度增强方法及装置
CN109951646A (zh) * 2017-12-20 2019-06-28 杭州海康威视数字技术股份有限公司 图像融合方法、装置、电子设备及计算机可读存储介质
CN109963086A (zh) * 2018-07-30 2019-07-02 华为技术有限公司 时分复用补光成像装置和方法
CN110493532A (zh) * 2018-12-12 2019-11-22 杭州海康威视数字技术股份有限公司 一种图像处理方法和系统
CN113114926A (zh) * 2021-03-10 2021-07-13 杭州海康威视数字技术股份有限公司 一种图像处理方法、装置及摄像机

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102932603A (zh) * 2012-08-16 2013-02-13 浙江宇视科技有限公司 一种补光控制方法及摄像机
US20150062347A1 (en) * 2013-08-27 2015-03-05 Semiconductor Components Industries, Llc Image processing methods for visible and infrared imaging
KR102277178B1 (ko) * 2015-03-09 2021-07-14 삼성전자 주식회사 카메라 모듈을 포함하는 전자 장치 및 전자 장치의 이미지 처리 방법
CN106357985B (zh) * 2016-08-22 2019-07-02 Oppo广东移动通信有限公司 一种屏幕补光拍照方法、装置及移动终端
CN107040727B (zh) * 2017-05-11 2020-09-04 成都希格玛光电科技有限公司 摄像机曝光方法及装置
WO2019084806A1 (zh) * 2017-10-31 2019-05-09 深圳市大疆创新科技有限公司 用于闪光灯亮度补偿的方法、无人机及存储介质
CN109005365B (zh) * 2018-08-22 2020-12-08 浙江大华技术股份有限公司 一种控制补光灯开启的方法及装置
US11653101B2 (en) * 2019-05-17 2023-05-16 Samsung Electronics Co., Ltd. Imaging system for generating high dynamic range image

Also Published As

Publication number Publication date
CN113114926B (zh) 2022-11-25
CN113114926A (zh) 2021-07-13


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
Ref document number: 22766085; Country of ref document: EP; Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase
Ref document number: 22766085; Country of ref document: EP; Kind code of ref document: A1