CN110490187B - License plate recognition device and method - Google Patents

License plate recognition device and method

Info

Publication number
CN110490187B
CN110490187B (application CN201910472683.3A)
Authority
CN
China
Prior art keywords
image
exposure
light
image signal
license plate
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910472683.3A
Other languages
Chinese (zh)
Other versions
CN110490187A (en)
Inventor
聂鑫鑫
范蒙
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hangzhou Hikvision Digital Technology Co Ltd
Original Assignee
Hangzhou Hikvision Digital Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hangzhou Hikvision Digital Technology Co Ltd filed Critical Hangzhou Hikvision Digital Technology Co Ltd
Priority to CN201910472683.3A priority Critical patent/CN110490187B/en
Publication of CN110490187A publication Critical patent/CN110490187A/en
Application granted granted Critical
Publication of CN110490187B publication Critical patent/CN110490187B/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/20 Image preprocessing
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/60 Type of objects
    • G06V20/62 Text, e.g. of license plates, overlay texts or captions on TV images
    • G06V20/63 Scene text, e.g. street names
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/60 Type of objects
    • G06V20/62 Text, e.g. of license plates, overlay texts or captions on TV images
    • G06V20/625 License plates

Abstract

The application provides a license plate recognition device and method. The device includes a filter assembly, a single image sensor, a light supplement device, and an image processing unit. The image sensor generates and outputs a first image signal and a second image signal through multiple exposures, where the first image signal is generated according to a first preset exposure and the second image signal is generated according to a second preset exposure. The light supplement device performs near-infrared light supplement, which is present during at least part of the exposure period of the first preset exposure and absent during the exposure period of the second preset exposure. The filter assembly passes light in the visible band and part of the near-infrared light. The image processing unit recognizes the license plate to be recognized according to the first image signal and the second image signal, yielding a good license plate recognition result.

Description

License plate recognition device and method
Technical Field
The application relates to the technical field of image processing, and in particular to a license plate recognition device and a license plate recognition method.
Background
In traffic management, recognizing a vehicle's license plate from captured video images of the vehicle is currently the most common approach. To obtain a high-quality captured image, supplementary lighting is usually applied during capture.
In the related art, a license plate image is obtained from an infrared fill-light image, the license plate background region is segmented, license plate recognition is performed after the image color is adjusted, and the recognized license plate is output.
Disclosure of Invention
The application provides a license plate recognition device and a license plate recognition method, which aim to improve the license plate recognition effect.
In a first aspect, the present application provides a license plate recognition device, including:
the device comprises a light filtering component, an image sensor, a light supplementing device and an image processing unit, wherein the image sensor is positioned on the light emitting side of the light filtering component;
the image sensor is used for generating and outputting a first image signal and a second image signal through multiple exposures, wherein the first image signal is an image signal generated according to a first preset exposure, the second image signal is an image signal generated according to a second preset exposure, and the first preset exposure and the second preset exposure are two exposures of the multiple exposures; the first image signal and the second image signal comprise information of a license plate to be recognized;
the light supplement device comprises a first light supplement device, and the first light supplement device is used for performing near-infrared light supplement in a stroboscopic mode, wherein the near-infrared light supplement is performed at least in a part of the exposure time period of the first preset exposure, and the near-infrared light supplement is not performed in the exposure time period of the second preset exposure;
the filter assembly is configured to pass light in the visible band and part of the near-infrared light;
and the image processing unit is used for identifying the license plate to be identified according to the first image signal and the second image signal.
In a second aspect, the present application provides a license plate recognition method applied to a license plate recognition device, where the device includes an image sensor, a light supplement device, and a filter assembly, the image sensor being located on the light-emitting side of the filter assembly, and the method includes:
performing near-infrared light supplement through a first light supplement device included in the light supplement device, where the near-infrared light supplement is performed during at least part of the exposure period of a first preset exposure and is not performed during the exposure period of a second preset exposure, the first preset exposure and the second preset exposure being two of the multiple exposures of the image sensor;
passing light in the visible band and part of the near-infrared light through the filter assembly;
performing multiple exposures by the image sensor to generate and output a first image signal and a second image signal, the first image signal being an image signal generated according to the first preset exposure, the second image signal being an image signal generated according to the second preset exposure; the first image signal and the second image signal comprise information of a license plate to be recognized;
and recognizing, by the image processing unit, the license plate to be recognized according to the first image signal and the second image signal.
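The steps above can be sketched as a minimal capture loop. `StubIRLight` and `StubSensor` are hypothetical stand-ins for the fill light and the image sensor, not part of the application; a real device would drive hardware with the same timing relationship.

```python
class StubIRLight:
    """Hypothetical near-infrared fill light with a simple on/off state."""
    def __init__(self):
        self.active = False

    def on(self):
        self.active = True

    def off(self):
        self.active = False


class StubSensor:
    """Hypothetical image sensor; expose() returns a dict standing in
    for an image signal, recording whether near-IR fill was active."""
    def __init__(self, ir_light):
        self.ir_light = ir_light

    def expose(self, preset):
        return {"preset": preset, "ir_fill": self.ir_light.active}


def capture_frame_pair(sensor, ir_light):
    """One (first, second) image-signal pair per the method above:
    near-IR fill during the first preset exposure, none during the second."""
    ir_light.on()
    first = sensor.expose("first_preset")
    ir_light.off()
    second = sensor.expose("second_preset")
    return first, second
```

The essential invariant is only the ordering of the fill-light toggles relative to the two exposures; everything else here is illustrative.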
In the license plate recognition device and method provided by the embodiments of the application, the device includes a filter assembly, a single image sensor, a light supplement device, and an image processing unit. The image sensor generates and outputs a first image signal and a second image signal through multiple exposures, the first generated according to a first preset exposure and the second according to a second preset exposure. The light supplement device performs near-infrared light supplement, which is present during at least part of the exposure period of the first preset exposure and absent during the exposure period of the second preset exposure. The filter assembly passes light in the visible band and part of the near-infrared light. The image processing unit recognizes the license plate to be recognized according to the first image signal and the second image signal. Because the first image signal, containing near-infrared light information, and the second image signal, containing visible light information, can both be acquired in any time period through the first and second preset exposures, recognition based on the two signals achieves a better result.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure.
Fig. 1 is a schematic structural diagram of a license plate recognition device according to an embodiment of the present disclosure;
fig. 2 is a schematic diagram illustrating a relationship between a wavelength and a relative intensity of a first light supplement device for performing near-infrared light supplement according to an embodiment of the present disclosure;
fig. 3 is a schematic diagram illustrating a relationship between a wavelength and a transmittance of light passing through a first optical filter according to an embodiment of the present disclosure;
fig. 4 is a schematic structural diagram of another license plate recognition device provided in the embodiment of the present application;
FIG. 5 is a schematic diagram of an RGB sensor provided in an embodiment of the present application;
FIG. 6 is a schematic diagram of an RGBW sensor provided by an embodiment of the present application;
FIG. 7 is a schematic diagram of an RCCB sensor according to an embodiment of the present application;
FIG. 8 is a schematic diagram of a RYYB sensor provided in an embodiment of the present application;
fig. 9 is a schematic diagram of a sensitivity curve of an image sensor according to an embodiment of the present application;
FIG. 10 is a schematic view of a rolling shutter exposure mode according to an embodiment of the present disclosure;
FIG. 11 is a first schematic diagram of the first preset exposure and the second preset exposure provided in an embodiment of the present application;
FIG. 12 is a second schematic diagram of the first preset exposure and the second preset exposure provided in an embodiment of the present application;
FIG. 13 is a third schematic diagram of the first preset exposure and the second preset exposure provided in an embodiment of the present application;
fig. 14 is a schematic view of a first rolling shutter exposure mode and near-infrared light supplement provided in an embodiment of the present application;
fig. 15 is a schematic view of a second rolling shutter exposure mode and near-infrared light supplement provided in an embodiment of the present application;
fig. 17 is a schematic structural diagram of an image processing unit according to an embodiment of the present application;
FIG. 18 is a schematic structural diagram of an image preprocessing unit according to an embodiment of the present disclosure;
FIG. 19 is a schematic structural diagram of another image preprocessing unit provided in an embodiment of the present application;
FIG. 20 is a schematic diagram illustrating an image fusion process according to an embodiment of the present disclosure;
FIG. 21 is a schematic diagram of an image recognition unit according to an embodiment of the present application;
FIG. 22 is a schematic diagram of another image recognition unit provided in the embodiments of the present application;
fig. 23 is a schematic flowchart of a license plate recognition method according to an embodiment of the present disclosure.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The implementations described in the exemplary embodiments below are not intended to represent all implementations consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present disclosure, as detailed in the appended claims.
The terms "comprising" and "having," and any variations thereof, in the description and claims of this application and the drawings described herein are intended to cover non-exclusive inclusions. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those steps or elements listed, but may alternatively include other steps or elements not listed, or inherent to such process, method, article, or apparatus.
Fig. 1 is a schematic structural diagram of a license plate recognition device provided in an embodiment of the present application, and referring to fig. 1, the license plate recognition device includes an image sensor 01, a light supplement device 02, a light filtering component 03, and an image processing unit, where the image sensor 01 is located on a light emitting side of the light filtering component 03. The image sensor 01 is configured to generate and output a first image signal and a second image signal through multiple exposures. The first image signal is an image signal generated according to a first preset exposure, the second image signal is an image signal generated according to a second preset exposure, and the first preset exposure and the second preset exposure are two exposures of the multiple exposures. The light supplement device 02 includes a first light supplement device 021, and the first light supplement device 021 is configured to perform near-infrared light supplement, wherein the near-infrared light supplement is performed at least in a partial exposure time period of the first preset exposure, and the near-infrared light supplement is not performed in an exposure time period of the second preset exposure.
The image processing unit is used for identifying the license plate to be identified according to the first image signal and the second image signal output by the image sensor; the first image signal and the second image signal comprise information of the license plate to be recognized.
Referring to fig. 4, the filter assembly 03 includes a first filter 031, and the first filter 031 passes light in the visible band and part of the near-infrared light, where the intensity of the near-infrared light passing through the first filter 031 when the first light supplement device 021 performs near-infrared light supplement is higher than when it does not. The near-infrared band passing through the first filter 031 may be only part of the near-infrared band.
In the embodiment of the present application, referring to fig. 1, the license plate recognition device may further include a lens 04, in this case, the filter assembly 03 may be located between the lens 04 and the image sensor 01, and the image sensor 01 is located on the light emitting side of the filter assembly 03. Alternatively, the lens 04 is located between the filter assembly 03 and the image sensor 01, and the image sensor 01 is located on the light emitting side of the lens 04. As an example, the first filter 031 may be a filter film, such that the first filter 031 may be attached to a surface of the light-emitting side of the lens 04 when the filter assembly 03 is positioned between the lens 04 and the image sensor 01, or attached to a surface of the light-entering side of the lens 04 when the lens 04 is positioned between the filter assembly 03 and the image sensor 01.
It should be noted that the light supplement device 02 may be located inside or outside the license plate recognition device. That is, the light supplement device 02 may be part of the license plate recognition device or a device independent of it. When the light supplement device 02 is located outside the license plate recognition device, it can be communicatively connected to the device, so as to guarantee a certain relationship between the exposure timing of the image sensor 01 in the license plate recognition device and the near-infrared light supplement timing of the first light supplement device 021 included in the light supplement device 02: for example, near-infrared light supplement is performed during at least part of the exposure period of the first preset exposure and is not performed during the exposure period of the second preset exposure.
In addition, the first light supplement device 021 is a device capable of emitting near-infrared light, such as a near-infrared fill light. It may perform near-infrared light supplement in a stroboscopic manner, or in other manners similar to stroboscopic operation; this embodiment of the application is not limited in this respect. In some examples, when the first light supplement device 021 performs near-infrared light supplement stroboscopically, it may be controlled manually, or by a software program or a dedicated device; the control manner is not limited in this embodiment. The period during which the first light supplement device 021 performs near-infrared light supplement may coincide with the exposure period of the first preset exposure, or may be longer or shorter than it, as long as near-infrared light supplement occurs during all or part of the exposure period of the first preset exposure and does not occur during the exposure period of the second preset exposure.
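The timing rule in the paragraph above reduces to an interval condition, sketched here with intervals as hypothetical `(start, end)` pairs; none of these names come from the application itself.

```python
def overlaps(a, b):
    """True if half-open time intervals a = (start, end) and b overlap."""
    return a[0] < b[1] and b[0] < a[1]


def fill_timing_valid(fill, first_exposure, second_exposure):
    """Check the timing rule stated above: the near-IR fill interval must
    cover at least part of the first preset exposure and must not overlap
    the second preset exposure at all."""
    return overlaps(fill, first_exposure) and not overlaps(fill, second_exposure)
```

Note that the fill interval may be longer or shorter than the first exposure; both pass the check, matching the "whole or part of the exposure period" wording.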
It should be noted that, regarding the absence of near-infrared light supplement during the exposure period of the second preset exposure: for the global exposure mode, the exposure period of the second preset exposure may be the period between the exposure start time and the exposure end time; for the rolling shutter exposure mode, it may be the period between the exposure start time of the first row of the effective image and the exposure end time of the last row of the effective image of the second image signal, but is not limited thereto. For example, the exposure period of the second preset exposure may also be the exposure period corresponding to a target image in the second image signal, where the target image is the multiple rows of the effective image corresponding to a target object or target area in the second image signal; the period between the start and end of exposure of these rows may be regarded as the exposure period of the second preset exposure.
Another point to note is that when the first light supplement device 021 performs near-infrared light supplement on an external scene, near-infrared light incident on an object's surface may be reflected by the object into the first filter 031. Since ambient light normally includes both visible light and near-infrared light, its near-infrared component is likewise reflected by objects into the first filter 031. Therefore, the near-infrared light passing through the first filter 031 during near-infrared light supplement includes both the near-infrared light emitted by the first light supplement device 021 and reflected by objects, and the near-infrared component of the ambient light reflected by objects; the near-infrared light passing through the first filter 031 when no near-infrared light supplement is performed includes only the near-infrared component of the ambient light reflected by objects.
Taking as an example the structure in which the filter assembly 03 is located between the lens 04 and the image sensor 01, with the image sensor 01 on the light-emitting side of the filter assembly 03, the license plate recognition device acquires the first and second image signals as follows. When the image sensor 01 performs the first preset exposure, the first light supplement device 021 performs near-infrared light supplement; the ambient light of the shooting scene, together with the near-infrared light reflected by objects in the scene during the light supplement, passes through the lens 04 and the first filter 031, and the image sensor 01 generates the first image signal through the first preset exposure. When the image sensor 01 performs the second preset exposure, the first light supplement device 021 does not perform near-infrared light supplement; the ambient light of the shooting scene passes through the lens 04 and the first filter 031, and the image sensor 01 generates the second image signal through the second preset exposure. There may be M first preset exposures and N second preset exposures in one frame period of image acquisition, ordered in any combination; the values of M and N, and their relative sizes, may be set according to actual requirements. For example, M and N may be equal or different.
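The M-and-N arrangement described above can be sketched as a small sequence builder. The two pattern names are illustrative assumptions; the text permits any ordering of the M first and N second preset exposures within a frame period.

```python
def exposure_sequence(m, n, pattern="alternating"):
    """Build one frame period's exposure order from M first preset and
    N second preset exposures. 'alternating' interleaves the two kinds;
    'grouped' runs all first exposures, then all second exposures."""
    if pattern == "grouped":
        return ["first"] * m + ["second"] * n
    seq = []
    while m > 0 or n > 0:
        if m > 0:
            seq.append("first")
            m -= 1
        if n > 0:
            seq.append("second")
            n -= 1
    return seq
```

With equal M and N this yields a strict alternation; with unequal counts the surplus kind trails at the end.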
In addition, since the intensity of the near-infrared light in the ambient light is lower than the intensity of the near-infrared light emitted by the first light supplement device 021, the intensity of the near-infrared light passing through the first optical filter 031 when the first light supplement device 021 performs the near-infrared light supplement is higher than the intensity of the near-infrared light passing through the first optical filter 031 when the first light supplement device 021 does not perform the near-infrared light supplement.
The first light supplement device 021 may perform near-infrared light supplement within a second reference band, which may be 700 nm to 800 nm or 900 nm to 1000 nm, so as to reduce the interference caused by the common 850 nm near-infrared fill light. In addition, the band of near-infrared light incident on the first filter 031 may be a first reference band of 650 nm to 1100 nm.
When near-infrared light supplement is performed, the near-infrared light passing through the first filter 031 may include both the near-infrared light reflected by objects while the first light supplement device 021 is active and the near-infrared component of the ambient light reflected by objects; the intensity of the near-infrared light entering the filter assembly 03 is then relatively strong. When no near-infrared light supplement is performed, the near-infrared light passing through the first filter 031 includes only the near-infrared component of the ambient light reflected by objects; with no light supplemented by the first light supplement device 021, its intensity is relatively weak. Therefore, the intensity of near-infrared light included in the first image signal, generated and output according to the first preset exposure, is higher than that included in the second image signal, generated and output according to the second preset exposure.
The center wavelength and/or band of the near-infrared light supplement performed by the first light supplement device 021 can be chosen from multiple options. In this embodiment of the application, in order for the first light supplement device 021 and the first filter 031 to cooperate well, the center wavelength of the near-infrared light supplement can be designed, and the characteristics of the first filter 031 selected, such that when the center wavelength of the light supplement is a set characteristic wavelength, or falls within a set characteristic wavelength range, the center wavelength and/or band width of the near-infrared light passing through the first filter 031 meets a constraint condition. The constraint condition mainly keeps the center wavelength of the near-infrared light passing through the first filter 031 as accurate as possible and its band width as narrow as possible, so as to avoid wavelength interference caused by an overly wide near-infrared band.
The center wavelength of the near-infrared light supplement performed by the first light supplement device 021 may be the average of the wavelength range where the energy in the emitted near-infrared spectrum is maximal, or the wavelength at the middle of the wavelength range where the spectral energy exceeds a certain threshold.
The set characteristic wavelength or set characteristic wavelength range may be preset. As an example, the center wavelength of the near-infrared light supplement performed by the first light supplement device 021 may be any wavelength within 750 ± 10 nm, within 780 ± 10 nm, or within 940 ± 10 nm. That is, the set characteristic wavelength range may be the 750 ± 10 nm range, the 780 ± 10 nm range, or the 940 ± 10 nm range. Illustratively, with a center wavelength of 940 nm, the relationship between the wavelength and the relative intensity of the near-infrared light supplement performed by the first light supplement device 021 is shown in fig. 2. As can be seen from fig. 2, the light supplement band is 900 nm to 1000 nm, and the relative intensity of the near-infrared light is highest at 940 nm.
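The second center-wavelength definition given above (the middle of the range where spectral energy exceeds a threshold) can be sketched from sampled spectrum data. The sample values below are illustrative, not measurements from the application.

```python
def center_wavelength(wavelengths, intensities, threshold_ratio=0.5):
    """Estimate a fill light's center wavelength as the midpoint of the
    wavelength range whose relative intensity exceeds threshold_ratio
    times the peak intensity."""
    peak = max(intensities)
    above = [w for w, i in zip(wavelengths, intensities)
             if i >= threshold_ratio * peak]
    return (min(above) + max(above)) / 2


# Illustrative spectrum peaking at 940 nm within a 900-1000 nm band
wl = [900, 920, 940, 960, 980, 1000]
rel = [0.1, 0.6, 1.0, 0.6, 0.1, 0.0]
```

For this symmetric example the estimate lands on 940 nm, consistent with the fig. 2 description above.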
Since most of the near-infrared light passing through the first optical filter 031 is near-infrared light entering the first optical filter 031 after being reflected by the object when the first fill-in light device 021 performs near-infrared light fill-in, in some embodiments, the constraint conditions may include: the difference between the central wavelength of the near-infrared light passing through the first optical filter 031 and the central wavelength of the near-infrared light supplemented by the first light supplementing device 021 is within a wavelength fluctuation range, which may be 0 to 20 nm, as an example.
The central wavelength of the near-infrared supplementary light passing through the first optical filter 031 may be a wavelength at a peak position in a near-infrared band range in the near-infrared light transmittance curve of the first optical filter 031, or may be a wavelength at a middle position in a near-infrared band range in which a transmittance exceeds a certain threshold in the near-infrared light transmittance curve of the first optical filter 031.
In order to avoid introducing wavelength interference due to too wide band width of the near infrared light passing through the first filter 031, in some embodiments, the constraint conditions may include: the first band width may be less than the second band width. The first wavelength band width refers to the wavelength band width of the near-infrared light passing through the first filter 031, and the second wavelength band width refers to the wavelength band width of the near-infrared light blocked by the first filter 031. It should be understood that the band width refers to the width of the wavelength range in which the wavelength of the light is located. For example, the wavelength of the near infrared light passing through the first filter 031 is in the wavelength range of 700 nm to 800 nm, and then the first wavelength band width is 800 nm minus 700 nm, i.e., 100 nm. In other words, the wavelength band width of the near infrared light passing through the first filter 031 is smaller than the wavelength band width of the near infrared light blocked by the first filter 031.
For example, referring to fig. 3, fig. 3 is a schematic diagram illustrating the relationship between the wavelength of light and its pass rate through the first filter 031. The band of the near-infrared light incident on the first optical filter 031 is 650 nm to 1100 nm; the first optical filter 031 allows visible light with a wavelength of 380 nm to 650 nm to pass, allows near-infrared light with a wavelength of 900 nm to 1000 nm to pass, and blocks near-infrared light with wavelengths of 650 nm to 900 nm and 1000 nm to 1100 nm. That is, the first band width is 1000 nm minus 900 nm, i.e., 100 nm, and the second band width is 900 nm minus 650 nm plus 1100 nm minus 1000 nm, i.e., 350 nm. Since 100 nm is smaller than 350 nm, the band width of the near-infrared light passing through the first optical filter 031 is smaller than the band width of the near-infrared light blocked by it. This relationship is only an example; for different filters, the near-infrared band that can pass through the filter and the near-infrared band that is blocked may differ.
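As an illustration only, the band-width comparison in the fig. 3 example can be sketched in a few lines of Python; the interval lists and the helper function are assumptions made for the sketch, not part of the patent.

```python
# Hypothetical sketch of the first/second band-width constraint.
# The intervals follow the fig. 3 example; names are illustrative only.

def band_width(intervals):
    """Total width in nm of a list of (start, end) wavelength intervals."""
    return sum(end - start for start, end in intervals)

passed_nir = [(900, 1000)]                 # NIR passed by the first filter
blocked_nir = [(650, 900), (1000, 1100)]   # NIR blocked by the first filter

first_band_width = band_width(passed_nir)    # 100 nm
second_band_width = band_width(blocked_nir)  # 350 nm
assert first_band_width < second_band_width  # the stated constraint holds
```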
In order to avoid introducing wavelength interference during periods of non-near-infrared supplementary lighting caused by an excessively wide band of near-infrared light passing through the first filter 031, in some embodiments, the constraint conditions may include: the half bandwidth of the near-infrared light passing through the first filter 031 is less than or equal to 50 nm. The half bandwidth refers to the band width of the near-infrared light whose pass rate exceeds 50%.
In order to avoid wavelength interference caused by an excessively wide band of near-infrared light passing through the first filter 031, in some embodiments, the constraint conditions may include: the third band width may be less than a reference band width. The third band width is the band width of the near-infrared light whose transmittance is greater than a set proportion; as an example, the reference band width may be any value in the range of 50 nm to 100 nm. The set proportion may be any proportion from 30% to 50%, and of course may be set to another proportion according to the use requirement, which is not limited in the embodiments of the present application. In other words, the band width of the near-infrared light whose pass rate is greater than the set proportion may be smaller than the reference band width.
For example, referring to fig. 3, the wavelength band of the near infrared light incident to the first filter 031 is 650 nm to 1100 nm, the set ratio is 30%, and the reference wavelength band width is 100 nm. As can be seen from fig. 3, in the wavelength band of the near-infrared light of 650 nm to 1100 nm, the wavelength band width of the near-infrared light with the transmittance of more than 30% is significantly less than 100 nm.
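Both the half-bandwidth constraint and the third-band-width constraint reduce to measuring how wide the band is in which the pass rate exceeds some proportion. A minimal sketch, assuming a toy transmittance curve sampled every 1 nm (the curve values are invented for illustration, not taken from fig. 3):

```python
# Toy transmittance curve: a narrow NIR passband around 950 nm (invented
# sample values, sampled every 1 nm over the incident 650-1100 nm band).
curve = {wl: (0.9 if 930 <= wl <= 970 else 0.1) for wl in range(650, 1101)}

def width_above(curve, ratio, step_nm=1):
    """Band width (nm) of wavelengths whose pass rate exceeds `ratio`."""
    return step_nm * sum(1 for t in curve.values() if t > ratio)

half_bandwidth = width_above(curve, 0.50)    # pass rate > 50%
third_band_width = width_above(curve, 0.30)  # "set proportion" of 30%

assert half_bandwidth <= 50    # half bandwidth at most 50 nm
assert third_band_width < 100  # below a 100 nm reference band width
```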
Because the first light supplement device 021 provides near-infrared supplementary lighting at least during part of the exposure time period of the first preset exposure and provides none during the entire exposure time period of the second preset exposure, and the first preset exposure and the second preset exposure are two of the multiple exposures of the image sensor 01, the first light supplement device 021 supplements near-infrared light during the exposure time periods of some exposures of the image sensor 01 and not during the exposure time periods of the others. Therefore, the number of supplementary lighting operations of the first light supplement device 021 per unit time can be lower than the number of exposures of the image sensor 01 per unit time, with one or more exposures occurring in each interval between two adjacent supplementary lighting operations.
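As a toy illustration of this counting (the scheme below, in which fill light accompanies every second exposure, is an assumption, not the only arrangement the text allows):

```python
def nir_fill_indices(num_exposures, period=2):
    """Indices of exposures accompanied by NIR fill light, one fill every
    `period` exposures, so adjacent fills are separated by exposures."""
    return [i for i in range(num_exposures) if i % period == 0]

fills = nir_fill_indices(50, period=2)
# Fill count per unit time is lower than the exposure count per unit time.
assert len(fills) < 50
# Each gap between adjacent fills contains at least one exposure.
assert all(b - a >= 2 for a, b in zip(fills, fills[1:]))
```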
Illustratively, since human eyes may confuse the color of the near-infrared supplementary lighting of the first light supplement device 021 with the color of a red traffic light, referring to fig. 4, the light supplement device 02 may further include a second light supplement device 022 for supplementing visible light. In this way, if the second light supplement device 022 provides visible supplementary lighting at least during part of the exposure time period of the first preset exposure, that is, if near-infrared and visible supplementary lighting are both present during at least part of that period, the mixed color of the two kinds of light is distinguishable from the color of a red traffic light, so human eyes are prevented from confusing the color of the near-infrared supplementary lighting with the color of the red light. In addition, if the second light supplement device 022 provides visible supplementary lighting during the exposure time period of the second preset exposure, then since the intensity of visible light during that period is not particularly high, the brightness of visible light in the second image signal can be further improved, further ensuring the quality of image acquisition.
In some embodiments, the second light supplement device 022 may supplement visible light in a normally-on manner; or it may supplement visible light in a stroboscopic manner, in which visible supplementary lighting is present at least during part of the exposure time period of the first preset exposure and absent during the entire exposure time period of the second preset exposure; or it may supplement visible light in a stroboscopic manner, in which visible supplementary lighting is absent during the entire exposure time period of the first preset exposure and present during part of the exposure time period of the second preset exposure. When the second light supplement device 022 supplements visible light in the normally-on manner, human eyes are prevented from confusing the color of the near-infrared supplementary lighting of the first light supplement device 021 with the color of a red traffic light, the brightness of visible light in the second image signal can be improved, and the quality of image acquisition is ensured. When the second light supplement device 022 supplements visible light in the stroboscopic manner, either the color confusion is avoided or the brightness of visible light in the second image signal is improved, likewise ensuring the quality of image acquisition; moreover, the number of supplementary lighting operations of the second light supplement device 022 is reduced, prolonging its service life.
The switching member 033 is configured to switch the second filter 032 onto the light incident side of the image sensor 01; that is, the second filter 032 takes the place of the first filter 031 on the light incident side of the image sensor 01. After the second filter 032 is switched onto the light incident side of the image sensor 01, the first light supplement device 021 may be in an off state or an on state.
In some embodiments, multiple exposure refers to multiple exposures within one frame period; that is, the image sensor 01 performs multiple exposures within one frame period, thereby generating and outputting at least one frame of the first image signal and at least one frame of the second image signal. For example, at 25 frames per second, the image sensor 01 performs multiple exposures within each frame period, generating at least one frame of the first image signal and at least one frame of the second image signal; the first and second image signals generated within one frame period are referred to as a set of image signals, so 25 sets of image signals are generated in the 25 frame periods of one second. The first preset exposure and the second preset exposure may be two adjacent exposures, or two non-adjacent exposures, among the multiple exposures within one frame period, which is not limited in the embodiments of the present application.
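A hypothetical sketch of how the per-frame grouping could be counted; the 25 fps rate and the two-exposure frame pattern are assumptions taken from the example above, not requirements of the patent.

```python
def image_signal_sets(frame_rate_hz, exposures_per_frame=2):
    """One set of image signals (first + second) per frame period."""
    sets = []
    for _frame in range(frame_rate_hz):  # frame periods within one second
        labels = ["first" if i % 2 == 0 else "second"
                  for i in range(exposures_per_frame)]
        sets.append(labels)
    return sets

sets = image_signal_sets(25)
assert len(sets) == 25                 # 25 sets of image signals per second
assert sets[0] == ["first", "second"]  # each set holds both signal types
```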
The first image signal is generated and output by the first preset exposure, and the second image signal is generated and output by the second preset exposure; after being generated and output, the two image signals may be processed. In some cases the first and second image signals serve different purposes, so in some embodiments at least one exposure parameter of the first preset exposure and the second preset exposure may differ. As an example, the at least one exposure parameter may include, but is not limited to, one or more of exposure time, analog gain, digital gain, and aperture size, where the exposure gain comprises the analog gain and/or the digital gain.
In some embodiments, it can be understood that, compared with the second preset exposure, when near-infrared supplementary lighting is performed, the intensity of near-infrared light sensed by the image sensor 01 during the first preset exposure is stronger, and the brightness of the near-infrared light contained in the first image signal generated and output accordingly is higher. However, near-infrared light of excessively high brightness is not conducive to acquiring information about the external scene. Moreover, in some embodiments, the larger the exposure gain, the higher the brightness of the image signal output by the image sensor 01, and the smaller the exposure gain, the lower that brightness. Therefore, to keep the brightness of the near-infrared light contained in the first image signal within a suitable range when at least one exposure parameter of the first preset exposure and the second preset exposure differs, the exposure gain of the first preset exposure may, as an example, be smaller than the exposure gain of the second preset exposure. In this way, when the first light supplement device 021 performs near-infrared supplementary lighting, the brightness of the near-infrared light contained in the first image signal generated and output by the image sensor 01 will not be too high.
In other embodiments, the longer the exposure time, the higher the brightness contained in the image signal obtained by the image sensor 01 and the longer the motion smear of a moving object in the external scene; the shorter the exposure time, the lower the brightness and the shorter the motion smear. Therefore, to keep the brightness of the near-infrared light contained in the first image signal within a suitable range while keeping the motion smear of moving objects in the first image signal short, when at least one exposure parameter of the first preset exposure and the second preset exposure differs, the exposure time of the first preset exposure may, as an example, be shorter than the exposure time of the second preset exposure. In this way, when the first light supplement device 021 performs near-infrared supplementary lighting, the brightness of the near-infrared light contained in the first image signal generated and output by the image sensor 01 will not be too high, and the shorter exposure time makes the motion smear of moving objects in the first image signal shorter, facilitating the identification of moving objects. Illustratively, the exposure time of the first preset exposure is 40 milliseconds, the exposure time of the second preset exposure is 60 milliseconds, and so on.
It is noted that, in some embodiments, when the exposure gain of the first preset exposure is smaller than the exposure gain of the second preset exposure, the exposure time of the first preset exposure may be not only smaller than the exposure time of the second preset exposure, but also equal to the exposure time of the second preset exposure. Similarly, when the exposure time of the first preset exposure is shorter than the exposure time of the second preset exposure, the exposure gain of the first preset exposure may be smaller than or equal to the exposure gain of the second preset exposure.
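The gain and time relationships described above can be expressed as a small validity check; the function below and its threshold logic are assumptions made for illustration, covering the case where the two image signals serve different purposes.

```python
# Hypothetical check of the exposure-parameter relationships described
# above: the first preset exposure should not exceed the second in gain
# or time, and at least one of the two should be strictly smaller.

def params_consistent(first_gain, first_time, second_gain, second_time):
    not_larger = first_gain <= second_gain and first_time <= second_time
    strictly_smaller = first_gain < second_gain or first_time < second_time
    return not_larger and strictly_smaller

assert params_consistent(first_gain=1.0, first_time=40,
                         second_gain=2.0, second_time=60)
assert params_consistent(first_gain=2.0, first_time=40,   # equal gains,
                         second_gain=2.0, second_time=60) # shorter time
assert not params_consistent(first_gain=4.0, first_time=40,
                             second_gain=2.0, second_time=60)
```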
In other embodiments, the first image signal and the second image signal may serve the same purpose. For example, when both are used for intelligent analysis, at least one exposure parameter of the first preset exposure and the second preset exposure may be the same so that a moving face or target has the same definition in both signals. As an example, the exposure time of the first preset exposure may equal the exposure time of the second preset exposure; if the two exposure times differ, motion smear may appear in the image signal with the longer exposure time, resulting in different definition between the two image signals. Likewise, as another example, the exposure gain of the first preset exposure may equal the exposure gain of the second preset exposure.
It is noted that, in some embodiments, when the exposure time of the first preset exposure is equal to the exposure time of the second preset exposure, the exposure gain of the first preset exposure may be smaller than or equal to the exposure gain of the second preset exposure. Similarly, when the exposure gain of the first preset exposure is equal to the exposure gain of the second preset exposure, the exposure time of the first preset exposure may be shorter than the exposure time of the second preset exposure, or may be equal to the exposure time of the second preset exposure.
The image sensor 01 may include a plurality of light sensing channels, each of which may be configured to sense light in at least one visible light band and to sense light in a near infrared band. That is, each photosensitive channel can sense light of at least one visible light band and light of a near-infrared band, so that complete resolution can be ensured in the first image signal and the second image signal, and pixel values are not lost. Illustratively, the plurality of light sensing channels may be configured to sense light in at least two different visible light bands.
In some embodiments, the plurality of photosensitive channels may include at least two of an R photosensitive channel, a G photosensitive channel, a B photosensitive channel, a Y photosensitive channel, a W photosensitive channel, and a C photosensitive channel. The R photosensitive channel is used to sense light in the red and near-infrared bands, the G photosensitive channel to sense light in the green and near-infrared bands, the B photosensitive channel to sense light in the blue and near-infrared bands, and the Y photosensitive channel to sense light in the yellow and near-infrared bands. Since in some embodiments the photosensitive channel for sensing light of the full band may be denoted by W, and in other embodiments by C, when the plurality of photosensitive channels include a channel for sensing light of the full band, that channel may be a W photosensitive channel or a C photosensitive channel; in practical applications, the full-band photosensitive channel can be selected according to the use requirement. Illustratively, the image sensor 01 may be an RGB sensor, an RGBW sensor, an RCCB sensor, or an RYYB sensor. The distribution of the R, G, and B photosensitive channels in the RGB sensor is shown in fig. 5; the distribution of the R, G, B, and W photosensitive channels in the RGBW sensor is shown in fig. 6; the distribution of the R, C, and B photosensitive channels in the RCCB sensor is shown in fig. 7; and the distribution of the R, Y, and B photosensitive channels in the RYYB sensor is shown in fig. 8.
In other embodiments, some of the photosensitive channels may sense only light in the near infrared band and not light in the visible band, so as to ensure complete resolution in the first image signal without missing pixel values. As an example, the plurality of photosensitive channels may include at least two of an R photosensitive channel, a G photosensitive channel, a B photosensitive channel, and an IR photosensitive channel. The R light sensing channel is used for sensing light of a red light wave band and a near infrared wave band, the G light sensing channel is used for sensing light of a green light wave band and a near infrared wave band, the B light sensing channel is used for sensing light of a blue light wave band and a near infrared wave band, and the IR light sensing channel is used for sensing light of a near infrared wave band.
Illustratively, the image sensor 01 may be an RGBIR sensor, in which each IR photosensitive channel senses light in the near-infrared band but not light in the visible band.
When the image sensor 01 is an RGB sensor, the RGB information it acquires is more complete than that acquired by other image sensors such as an RGBIR sensor, some of whose photosensitive channels cannot acquire visible light; the color details of an image acquired by the RGB sensor are therefore more accurate.
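Purely as an illustration, the channel compositions of the sensor types named above can be tabulated. The 2x2 unit mosaics below are common conventions assumed for the sketch; they are not taken from figs. 5 to 8, whose exact channel distributions may differ.

```python
# Assumed 2x2 unit mosaics for the sensor types named above; the exact
# channel distributions in figs. 5-8 may differ from these conventions.
MOSAICS = {
    "RGB":  [["R", "G"], ["G", "B"]],
    "RGBW": [["R", "G"], ["W", "B"]],
    "RCCB": [["R", "C"], ["C", "B"]],
    "RYYB": [["R", "Y"], ["Y", "B"]],
}

def channels(sensor):
    """The set of photosensitive channel types a sensor's mosaic uses."""
    return {ch for row in MOSAICS[sensor] for ch in row}

assert channels("RGBW") == {"R", "G", "B", "W"}
assert "C" in channels("RCCB")  # C stands in for the full-band channel
```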
It is noted that the image sensor 01 may include a plurality of photosensitive channels corresponding to a plurality of sensing curves. Illustratively, referring to fig. 9, an R curve in fig. 9 represents a sensing curve of the image sensor 01 for light in a red wavelength band, a G curve represents a sensing curve of the image sensor 01 for light in a green wavelength band, a B curve represents a sensing curve of the image sensor 01 for light in a blue wavelength band, a W (or C) curve represents a sensing curve of the image sensor 01 for light in a full wavelength band, and an NIR (Near infrared) curve represents a sensing curve of the image sensor 01 for light in a Near infrared wavelength band.
As an example, the image sensor 01 may adopt a global exposure mode or a rolling shutter exposure mode. In the global exposure mode, the exposure start time of every line of the effective image is the same and the exposure end time of every line is the same; in other words, all lines of the effective image begin exposure simultaneously and end exposure simultaneously. In the rolling shutter exposure mode, the exposure periods of different lines of the effective image do not completely overlap: the exposure start time of a line is later than that of the previous line, and its exposure end time is later than that of the previous line. In addition, since in the rolling shutter mode each line can output data as soon as its exposure ends, the time from when the first line of the effective image begins outputting data to when the last line finishes outputting data can be expressed as the readout time.
Illustratively, referring to fig. 10, fig. 10 is a schematic view of the rolling shutter exposure mode. As can be seen from fig. 10, the line-1 effective image starts exposure at time T1 and ends at time T3, while the line-2 effective image starts exposure at time T2 and ends at time T4; line 2's exposure start is delayed after line 1's by the period from T1 to T2, and its exposure end by the period from T3 to T4. The line-1 effective image ends exposure and starts outputting data at time T3, finishing at time T5; the line-n effective image ends exposure and starts outputting data at time T6, finishing at time T7. The time between T3 and T7 is then the readout time.
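The fig. 10 timing can be sketched numerically; the millisecond values and the fixed per-row output duration below are assumptions made for the sketch, not figures from the patent.

```python
def row_exposure_windows(num_rows, exposure_ms, row_delay_ms):
    """(start, end) exposure window per row; each row starts
    `row_delay_ms` after the previous one (rolling shutter)."""
    return [(r * row_delay_ms, r * row_delay_ms + exposure_ms)
            for r in range(num_rows)]

rows = row_exposure_windows(num_rows=4, exposure_ms=10.0, row_delay_ms=2.0)

# Readout time: from the moment row 1 ends exposure and starts outputting
# data (T3) to the moment the last row finishes outputting its data (T7);
# a fixed per-row output duration is assumed for illustration.
row_output_ms = 1.0
readout_ms = (rows[-1][1] + row_output_ms) - rows[0][1]
assert readout_ms == 7.0  # (16 + 1) - 10 with the assumed numbers
```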
In some embodiments, when the image sensor 01 performs multiple exposures in the global exposure mode, then for any near-infrared supplementary lighting, the time period of the near-infrared supplementary lighting has no intersection with the exposure time period of the nearest second preset exposure, and the time period of the near-infrared supplementary lighting is a subset of the exposure time period of the first preset exposure, or intersects the exposure time period of the first preset exposure, or the exposure time period of the first preset exposure is a subset of the time period of the near-infrared supplementary lighting. In this way, near-infrared supplementary lighting is present at least during part of the exposure time period of the first preset exposure and absent throughout the exposure time period of the second preset exposure, so the second preset exposure is not affected.
For example, referring to fig. 11, for any near-infrared supplementary lighting, the time period of the supplementary lighting has no intersection with the exposure time period of the nearest second preset exposure and is a subset of the exposure time period of the first preset exposure. Referring to fig. 12, for any near-infrared supplementary lighting, its time period has no intersection with the exposure time period of the nearest second preset exposure but intersects the exposure time period of the first preset exposure. Referring to fig. 13, for any near-infrared supplementary lighting, its time period has no intersection with the exposure time period of the nearest second preset exposure, and the exposure time period of the first preset exposure is a subset of the time period of the supplementary lighting. Figs. 11 to 13 are merely examples; the order of the first preset exposure and the second preset exposure is not limited to these examples.
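Assuming numeric time intervals, the three permitted relationships for the global exposure mode can be checked mechanically; this is a sketch under those assumptions, not the patent's own method.

```python
def intersects(a, b):
    """True if intervals a=(start, end) and b=(start, end) overlap."""
    return max(a[0], b[0]) < min(a[1], b[1])

def is_subset(inner, outer):
    return outer[0] <= inner[0] and inner[1] <= outer[1]

def fill_light_allowed(fill, first_exp, nearest_second_exp):
    """Global-exposure rule: no overlap with the nearest second preset
    exposure, plus one of the three relationships to the first exposure
    (subset of it, intersecting it, or containing it)."""
    if intersects(fill, nearest_second_exp):
        return False
    return (is_subset(fill, first_exp)
            or intersects(fill, first_exp)
            or is_subset(first_exp, fill))

assert fill_light_allowed((10, 20), first_exp=(5, 25),
                          nearest_second_exp=(30, 50))   # fig. 11 case
assert not fill_light_allowed((28, 35), first_exp=(5, 25),
                              nearest_second_exp=(30, 50))
```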
In other embodiments, when the image sensor 01 performs multiple exposures in a rolling shutter exposure manner, for any one near-infrared supplementary light, there is no intersection between the time period of the near-infrared supplementary light and the exposure time period of the nearest second preset exposure. And the starting time of the near-infrared supplementary lighting is not earlier than the exposure starting time of the last row of effective images in the first preset exposure, and the ending time of the near-infrared supplementary lighting is not later than the exposure ending time of the first row of effective images in the first preset exposure. Or the starting time of the near-infrared supplementary lighting is not earlier than the exposure ending time of the last line of effective images of the nearest second preset exposure before the first preset exposure and not later than the exposure ending time of the first line of effective images in the first preset exposure, and the ending time of the near-infrared supplementary lighting is not earlier than the exposure starting time of the last line of effective images in the first preset exposure and not later than the exposure starting time of the first line of effective images of the nearest second preset exposure after the first preset exposure. Or the starting time of the near-infrared supplementary lighting is not earlier than the exposure ending time of the last line of effective images of the nearest second preset exposure before the first preset exposure and not later than the exposure starting time of the first line of effective images in the first preset exposure, and the ending time of the near-infrared supplementary lighting is not earlier than the exposure ending time of the last line of effective images in the first preset exposure and not later than the exposure starting time of the first line of effective images of the nearest second preset exposure after the first preset exposure.
For example, referring to fig. 14, for any near-infrared supplementary lighting, the time period of the supplementary lighting has no intersection with the exposure time period of the nearest second preset exposure; its start time is not earlier than the exposure start time of the last line of the effective image in the first preset exposure, and its end time is not later than the exposure end time of the first line of the effective image in the first preset exposure. Referring to fig. 15, for any near-infrared supplementary lighting, its time period has no intersection with the exposure time period of the nearest second preset exposure; its start time is not earlier than the exposure end time of the last line of the effective image of the nearest second preset exposure before the first preset exposure and not later than the exposure end time of the first line of the effective image in the first preset exposure, and its end time is not earlier than the exposure start time of the last line of the effective image in the first preset exposure and not later than the exposure start time of the first line of the effective image of the nearest second preset exposure after the first preset exposure. Referring to fig. 16, for any near-infrared supplementary lighting, its time period has no intersection with the exposure time period of the nearest second preset exposure; its start time is not earlier than the exposure end time of the last line of the effective image of the nearest second preset exposure before the first preset exposure and not later than the exposure start time of the first line of the effective image in the first preset exposure, and its end time is not earlier than the exposure end time of the last line of the effective image in the first preset exposure and not later than the exposure start time of the first line of the effective image of the nearest second preset exposure after the first preset exposure. In figs. 14 to 16, for the first preset exposure and the second preset exposure, the oblique dotted line represents the exposure start time and the oblique solid line represents the exposure end time; for the first preset exposure, the vertical dotted lines delimit the time period of the corresponding near-infrared supplementary lighting. Figs. 14 to 16 are only examples, and the order of the first preset exposure and the second preset exposure is not limited to these examples.
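The first rolling-shutter case (fig. 14) requires the fill light to fall inside the window in which every line of the first preset exposure is simultaneously integrating, i.e. from the last row's exposure start to the first row's exposure end. A sketch under assumed row timings (the millisecond values are invented for illustration):

```python
def common_exposure_window(rows):
    """Window during which all rows of one exposure are integrating:
    from the last row's start to the first row's end."""
    return max(s for s, _ in rows), min(e for _, e in rows)

# Assumed rolling timings: 4 rows, 10 ms exposure, 2 ms inter-row delay.
rows = [(2.0 * r, 2.0 * r + 10.0) for r in range(4)]
lo, hi = common_exposure_window(rows)  # (6.0, 10.0)

fill = (6.5, 9.5)  # assumed NIR fill period satisfying the fig. 14 case
assert lo <= fill[0] and fill[1] <= hi  # fill inside the common window
```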
The multiple exposures may include odd number of exposures and even number of exposures, so that the first preset exposure and the second preset exposure may include, but are not limited to, the following modes:
in a first possible implementation, the first preset exposure is one of the odd-numbered exposures and the second preset exposure is one of the even-numbered exposures, so the multiple exposures may include first and second preset exposures arranged in odd-even order. For example, the odd-numbered exposures such as the 1st, 3rd, and 5th exposures are all first preset exposures, and the even-numbered exposures such as the 2nd, 4th, and 6th exposures are all second preset exposures.
In a second possible implementation, the first preset exposure is one of the even-numbered exposures and the second preset exposure is one of the odd-numbered exposures, so the multiple exposures may likewise include first and second preset exposures arranged in odd-even order. For example, the odd-numbered exposures such as the 1st, 3rd, and 5th exposures are all second preset exposures, and the even-numbered exposures such as the 2nd, 4th, and 6th exposures are all first preset exposures.
In a third possible implementation manner, the first preset exposure is one exposure of the designated odd number of exposures, and the second preset exposure is one exposure of the other exposures except the designated odd number of exposures, that is, the second preset exposure may be an odd number of exposures of the multiple exposures or an even number of exposures of the multiple exposures.
In a fourth possible implementation manner, the first preset exposure is one exposure of the designated even-numbered exposures, and the second preset exposure is one exposure of the other exposures except the designated even-numbered exposure, that is, the second preset exposure may be an odd exposure of the multiple exposures or an even exposure of the multiple exposures.
In a fifth possible implementation manner, the first preset exposure is one exposure in the first exposure sequence, and the second preset exposure is one exposure in the second exposure sequence.
In a sixth possible implementation manner, the first preset exposure is one exposure in the second exposure sequence, and the second preset exposure is one exposure in the first exposure sequence.
Here, the multiple exposures comprise a plurality of exposure sequences; the first exposure sequence and the second exposure sequence are the same exposure sequence or two different exposure sequences among the plurality of exposure sequences. Each exposure sequence includes N exposures, which comprise either 1 first preset exposure and N-1 second preset exposures, or 1 second preset exposure and N-1 first preset exposures, where N is a positive integer greater than 2.
For example, each exposure sequence includes 3 exposures, and the 3 exposures may include 1 first preset exposure and 2 second preset exposures, such that the 1 st exposure of each exposure sequence may be the first preset exposure and the 2 nd and 3 rd exposures are the second preset exposures. That is, each exposure sequence may be represented as: the method comprises a first preset exposure, a second preset exposure and a second preset exposure. Alternatively, the 3 exposures may include 1 second preset exposure and 2 first preset exposures, such that the 1 st exposure of each exposure sequence may be the second preset exposure and the 2 nd and 3 rd exposures are the first preset exposures. That is, each exposure sequence may be represented as: second preset exposure, first preset exposure and first preset exposure.
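The sequence arrangement above can be sketched in a few lines of Python (the function name and the "first"/"second" labels are hypothetical illustrations, not terms from the patent):

```python
# Sketch: label each exposure "first" (near-infrared fill light on) or "second"
# (fill light off), for the example pattern where each N-exposure sequence
# starts with one first preset exposure followed by N-1 second preset exposures.
def exposure_schedule(num_exposures, sequence_length=3):
    """Label each exposure by its position within its exposure sequence."""
    labels = []
    for k in range(num_exposures):
        position = k % sequence_length  # position within the current sequence
        labels.append("first" if position == 0 else "second")
    return labels

print(exposure_schedule(6))
# ['first', 'second', 'second', 'first', 'second', 'second']
```

With `sequence_length=2` the same helper reproduces the odd-even alternation of the second implementation.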
The foregoing describes only six possible implementations of the first preset exposure and the second preset exposure; practical applications are not limited to these six, and this application places no limitation here.
In summary, when the intensity of visible light in the ambient light is weak, for example at night, the first light supplement device 021 may be used to perform stroboscopic light supplement, so that the image sensor 01 generates and outputs a first image signal containing near-infrared luminance information and a second image signal containing visible light luminance information. Because both image signals are acquired by the same image sensor 01, the viewpoint of the first image signal is the same as that of the second image signal, and the complete information of the external scene can be obtained from the two signals together. When the visible light intensity is strong, for example in the daytime, the proportion of near-infrared light is also high and the color reproduction of the acquired image would be poor; in this case the image sensor 01 can generate and output a third image signal containing visible light luminance information, so that an image with good color reproduction can be acquired even in the daytime. In this way, the real color information of the external scene can be acquired efficiently and simply regardless of the visible light intensity, that is, regardless of day or night, which improves the flexibility of the image acquisition device and makes it easy to keep compatible with other image acquisition devices.
In this application, the exposure timing of the image sensor is used to control the near-infrared light supplement timing of the light supplement device, so that near-infrared light supplement is performed during the first preset exposure to generate the first image signal, and is not performed during the second preset exposure to generate the second image signal. With this data acquisition mode, the first image signal and the second image signal, which differ in luminance information, can be acquired directly with a simple structure and at reduced cost, and two different image signals are obtained through a single image sensor. This makes the license plate recognition device simpler and the acquisition of the first image signal and the second image signal more efficient. Moreover, since the first image signal and the second image signal are generated and output by the same image sensor, the viewpoint corresponding to the first image signal is the same as the viewpoint corresponding to the second image signal. The information of the external scene can therefore be obtained through the first image signal and the second image signal together, and the image misalignment that would arise if the two viewpoints were different is avoided.
In an embodiment of the present application, referring to fig. 17, an image processing unit includes:
an image preprocessing unit, an image fusion unit and an image recognition unit;
the image preprocessing unit is used for preprocessing the first image signal to generate a gray-scale image and preprocessing the second image signal to generate a color image;
the image fusion unit is used for carrying out fusion processing on the color image and the gray level image to obtain a color fusion image;
the image recognition unit is used for recognizing the license plate to be recognized according to the color fusion image output by the image fusion unit or recognizing the license plate to be recognized according to the gray image.
Specifically, after the first image signal and the second image signal output by the image sensor are acquired, they can be preprocessed by the image preprocessing unit to obtain a preprocessed grayscale image and a preprocessed color image respectively; the color image and the grayscale image are then fused by the image fusion unit to obtain a color fusion image; and the image recognition unit recognizes either the color fusion image output by the image fusion unit or the grayscale image, to obtain the license plate number of the license plate to be recognized.
Because the first image signal containing near-infrared light information and the second image signal containing visible light information can be acquired simultaneously through the first preset exposure and the second preset exposure within any time period, the image obtained by processing the first image signal and the second image signal is of higher quality, and the license plate recognition effect is accordingly better.
For example, in some embodiments, as shown in fig. 18, the image preprocessing unit may further include:
the device comprises a first preprocessing unit, a second preprocessing unit and a combined noise reduction unit which is respectively connected with the first preprocessing unit and the second preprocessing unit;
the first preprocessing unit is used for performing first preprocessing operation on the first image signal to obtain a preprocessed gray-scale image;
the second preprocessing unit is used for performing second preprocessing operation on the second image signal to obtain a color image;
the joint noise reduction unit is used for filtering the color image and the grayscale image to obtain a noise-reduced color image and a noise-reduced grayscale image; or, alternatively,
as shown in fig. 19, the image preprocessing unit includes:
the device comprises a combined noise reduction unit, a first preprocessing unit and a second preprocessing unit, wherein the first preprocessing unit and the second preprocessing unit are respectively connected with the combined noise reduction unit;
the joint noise reduction unit is configured to perform filtering processing on the first image signal and the second image signal to obtain a noise-reduced first image signal and a noise-reduced second image signal;
the first preprocessing unit is used for performing first preprocessing operation on the first image signal subjected to noise reduction to obtain a preprocessed gray image; and the second preprocessing unit is used for performing second preprocessing operation on the second image signal subjected to noise reduction to obtain a color image.
Wherein the first preprocessing operation comprises at least one of the following operations: black level processing, dead pixel correction, brightness calculation, gamma correction, noise reduction and sharpening;
the second preprocessing operation includes at least one of the following operations: black level processing, dead pixel correction, white balance correction, color interpolation, gamma correction, color correction, RGB-to-YUV conversion, noise reduction and sharpening.
The joint noise reduction unit is used for performing joint noise reduction processing, such as joint bilateral filtering, on the first image signal and the second image signal by using the correlation of information between the two signals, so as to obtain a noise-reduced first image signal and a noise-reduced second image signal with improved signal-to-noise ratio.
In some embodiments of the present application, the color image and the grayscale image are respectively subjected to joint filtering processing according to a correlation between the color image and the grayscale image, so as to obtain the color image and the grayscale image after noise reduction;
or, the joint noise reduction unit is specifically configured to:
and respectively carrying out combined filtering processing on the first image signal and the second image signal according to the correlation between the first image signal and the second image signal to obtain the first image signal and the second image signal after noise reduction.
Illustratively, in an embodiment of the present application, when the image sensor is arranged in a Bayer pattern, the output image signal is a mosaic image signal. The first preprocessing unit may therefore also perform demosaicing processing on the first image signal, so as to obtain a restored first image signal.
It should be noted that the demosaicing may be performed using a bilinear interpolation method or an adaptive interpolation method, which is not described in detail here.
Further, the image fusion unit includes: a color extraction unit, a brightness extraction unit and a fusion processing unit, the color extraction unit and the brightness extraction unit being respectively connected to the fusion processing unit;
wherein, the color extracting unit is used for extracting color signals of the color image;
the brightness extraction unit is used for extracting a brightness signal of the color image;
the fusion processing unit is used for carrying out high-pass filtering on the gray level image to obtain a filtered high-frequency signal;
carrying out low-pass filtering on the brightness signal of the color image to obtain a filtered low-frequency signal;
and carrying out fusion processing on the high-frequency signal, the low-frequency signal and the color signal to obtain the color fusion image.
The fusion processing unit is specifically configured to:
carrying out weighted fusion processing on the high-frequency signal and the low-frequency signal to obtain a fusion brightness image;
and carrying out fusion processing on the color signals of the fusion brightness image and the color image to obtain the color fusion image.
Specifically, the image fusion unit in fig. 17 performs weighted fusion on the luminance signal of the color image and the luminance signal of the grayscale image to obtain a fused luminance image, and then synthesizes the fused luminance image with the color signal of the color image to obtain a color fused image.
In fig. 20, before the weighted fusion, the luminance signal may be low-pass filtered to obtain a low-frequency signal, the grayscale image may be high-pass filtered to obtain a high-frequency signal, and the low-frequency signal and the high-frequency signal may be weighted and fused to obtain a fused luminance image.
For example, assume the grayscale image is Y and the luminance signal of the color image is W. High-pass filtering is used to extract the high-frequency signal of the grayscale image Y, and low-pass filtering is used to extract the low-frequency signal of the luminance signal W, as follows:

Y_high = HPF(Y)

W_low = LPF(W)

where HPF denotes a high-pass filter and LPF denotes a low-pass filter, Y_high represents the high-frequency signal of the grayscale image, and W_low represents the low-frequency signal of the luminance signal of the color image. The high-frequency signal of the grayscale image is added to the low-frequency signal of the luminance signal of the color image to obtain the fused luminance image, i.e., FUS = Y_high + W_low. Finally, the fused luminance image is combined with the color signal of the color image to obtain the color fusion image.
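This high/low-frequency luminance fusion can be sketched in a few lines. The sketch below is an assumption-laden illustration: it uses a 3x3 mean filter as the low-pass filter (the patent does not fix a particular filter) and a simple residual as the high-pass, and the helper names are hypothetical.

```python
import numpy as np

def mean_filter3(img):
    """3x3 mean filter with edge replication, used here as a simple low-pass."""
    padded = np.pad(img.astype(float), 1, mode="edge")
    out = np.zeros(img.shape, dtype=float)
    for di in (-1, 0, 1):
        for dj in (-1, 0, 1):
            out += padded[1 + di:1 + di + img.shape[0],
                          1 + dj:1 + dj + img.shape[1]]
    return out / 9.0

def fuse_luminance(gray_y, color_w):
    """Fused luminance = high frequencies of the gray (NIR) image
    plus low frequencies of the color image's luminance."""
    y_high = gray_y - mean_filter3(gray_y)  # high-pass: residual after low-pass
    w_low = mean_filter3(color_w)           # low-pass of the color luminance
    return y_high + w_low
```

On a flat gray image the high-frequency term vanishes, so the fused luminance reduces to the low-passed color luminance, as expected from FUS = Y_high + W_low.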
Illustratively, as shown in fig. 21, in an embodiment of the present application, the image recognition unit includes:
a license plate positioning unit and a license plate recognition unit, the license plate positioning unit being connected with the license plate recognition unit;
the license plate positioning unit is used for extracting a color license plate region image of the license plate to be recognized by adopting a deep learning algorithm according to the color fusion image output by the image fusion unit;
and the license plate recognition unit is used for carrying out character recognition on the color license plate region image and acquiring the license plate number of the license plate to be recognized.
Specifically, when positioning the license plate, the license plate region image can be extracted based on a deep learning algorithm: the color fusion image is first input into the license plate positioning unit, a plurality of candidate regions are generated from the color fusion image, features of the candidate regions are computed respectively, and finally one candidate region is selected by classification as the license plate region image, which at this point is a color license plate region image.
The license plate recognition unit performs character recognition on the color license plate region image; for example, character segmentation can be performed first, followed by character recognition, to obtain the recognized license plate number. Character segmentation can adopt, for example, a vertical projection method: the binarized image is projected vertically and, using the property that the spacing between license plate characters is roughly equal, the character information is segmented out. Character recognition typically uses a template matching method to recognize the final characters.
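The vertical-projection segmentation just described can be sketched as follows (a hypothetical helper, assuming a binarized plate image whose character pixels are 1s):

```python
import numpy as np

def segment_characters(binary_plate):
    """Return (start, end) column ranges where the vertical projection
    (per-column sum of character pixels) is non-zero."""
    projection = binary_plate.sum(axis=0)  # vertical projection profile
    segments, start = [], None
    for col, value in enumerate(projection):
        if value > 0 and start is None:
            start = col                    # a character run begins
        elif value == 0 and start is not None:
            segments.append((start, col))  # run ended at the previous column
            start = None
    if start is not None:                  # run extends to the right edge
        segments.append((start, len(projection)))
    return segments
```

Each returned range can then be cropped out and passed to template matching.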
In another embodiment of the present application, as shown in fig. 22, the image recognition unit includes:
the license plate positioning unit is respectively connected with a license plate recognition unit and an image capturing unit;
the license plate positioning unit is used for extracting a gray license plate region image of the license plate to be recognized by adopting a deep learning algorithm according to the gray image output by the image preprocessing unit and acquiring a license plate position coordinate in the license plate region image;
the image intercepting unit is used for extracting a color license plate region image of the license plate to be recognized according to the color fusion image output by the image fusion unit and the license plate position coordinate output by the license plate positioning unit;
and the license plate recognition unit is used for carrying out character recognition according to the gray-scale license plate area image output by the license plate positioning unit and acquiring the license plate number of the license plate to be recognized.
Specifically, the first image signal and the second image signal pass through the image preprocessing unit and the image fusion unit to produce a grayscale image and a color fusion image. The grayscale image is used for license plate positioning and license plate recognition (the license plate in the grayscale image is clearer, which benefits both positioning and recognition), and the license plate number is output. Meanwhile, the license plate positioning unit inputs the obtained license plate position coordinates into the image capturing unit, which extracts the color license plate region image from the color fusion image; in this way, the license plate number, the color license plate region image, the color fusion image and the like can all be output.
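The capture step above can be sketched minimally: because the two image signals share one viewpoint, plate coordinates found on the grayscale image transfer directly to the color fusion image. The (x1, y1, x2, y2) box convention and the helper name below are assumptions for illustration, not from the patent.

```python
import numpy as np

def crop_plate_region(color_fused, box):
    """Cut the plate region located on the grayscale image out of the
    color fusion image; same viewpoint, so coordinates transfer directly."""
    x1, y1, x2, y2 = box
    return color_fused[y1:y2, x1:x2]
```

For example, a box of (1, 1, 4, 3) on a 4x6 image yields a 2x3 color patch.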
In an embodiment of the present application, the color image is subjected to noise reduction processing according to the following formula (1) to obtain a noise-reduced color image:

img_vis'(x, y) = Σ_{(i,j)∈S} weight(x+i, y+j) · img_vis(x+i, y+j) / Σ_{(i,j)∈S} weight(x+i, y+j)   (1)

and the grayscale image is subjected to noise reduction processing according to the following formula (2) to obtain a noise-reduced grayscale image:

img_nir'(x, y) = Σ_{(i,j)∈S} weight(x+i, y+j) · img_nir(x+i, y+j) / Σ_{(i,j)∈S} weight(x+i, y+j)   (2)

where x and y denote the coordinates of the current pixel; img_vis(x+i, y+j) denotes the pixel value of a pixel in the neighborhood of the current pixel in the color image, and img_vis'(x, y) denotes the noise-reduced pixel value of the current pixel in the color image; img_nir(x+i, y+j) denotes the pixel value of a pixel in the neighborhood of the current pixel in the grayscale image, and img_nir'(x, y) denotes the noise-reduced pixel value of the current pixel in the grayscale image; S denotes the neighborhood of the current pixel; and weight(x+i, y+j) = weight_vis(x+i, y+j) + weight_nir(x+i, y+j), where weight_vis(x+i, y+j) is the weight computed for the current pixel on the color image and weight_nir(x+i, y+j) is the weight computed for the current pixel on the grayscale image. Both weight_nir(x+i, y+j) and weight_vis(x+i, y+j) can be computed as

weight = exp(-((i - x)^2 + (j - y)^2) / (2·δ1^2)) · exp(-(f_ij - f_xy)^2 / (2·δ2^2))

where f_xy denotes the pixel value of the current pixel, f_ij denotes the pixel value of a neighborhood pixel of the current pixel, i and j are the coordinates of the neighborhood pixel, and δ1 and δ2 denote the standard deviations of the Gaussian distributions.
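The per-pixel weight with the Gaussian standard deviations δ1 and δ2 described above is the kernel of a joint bilateral filter. A minimal Python illustration follows; the function name and the exact kernel form (spatial Gaussian times intensity-range Gaussian) are assumptions for illustration, not taken verbatim from the patent.

```python
import math

def bilateral_weight(x, y, i, j, f_xy, f_ij, delta1, delta2):
    """Bilateral kernel: spatial Gaussian on the pixel offset (std delta1)
    times a range Gaussian on the intensity difference (std delta2)."""
    spatial = math.exp(-((i - x) ** 2 + (j - y) ** 2) / (2 * delta1 ** 2))
    rng = math.exp(-((f_ij - f_xy) ** 2) / (2 * delta2 ** 2))
    return spatial * rng
```

The weight is 1 at the center pixel and decays with both spatial distance and intensity difference, which is what lets the filter smooth noise while preserving edges.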
In an embodiment of the present application, the first image signal is subjected to noise reduction processing with reference to the following formula (3) to obtain a noise-reduced first image signal:

img1'(x, y) = Σ_{(i,j)∈S} weight(x+i, y+j) · img1(x+i, y+j) / Σ_{(i,j)∈S} weight(x+i, y+j)   (3)

and the second image signal is subjected to noise reduction processing with reference to the following formula (4) to obtain a noise-reduced second image signal:

img2'(x, y) = Σ_{(i,j)∈S} weight(x+i, y+j) · img2(x+i, y+j) / Σ_{(i,j)∈S} weight(x+i, y+j)   (4)

where x and y denote the coordinates of the current pixel; img1(x+i, y+j) denotes the pixel value of a pixel in the neighborhood of the current pixel in the first image signal, and img1'(x, y) denotes the noise-reduced pixel value of the current pixel in the first image signal; img2(x+i, y+j) denotes the pixel value of a pixel in the neighborhood of the current pixel in the second image signal, and img2'(x, y) denotes the noise-reduced pixel value of the current pixel in the second image signal; S denotes the neighborhood of the current pixel; and weight(x+i, y+j) = weight_vis(x+i, y+j) + weight_nir(x+i, y+j), where weight_nir(x+i, y+j) is the weight computed for the current pixel on the first image signal and weight_vis(x+i, y+j) is the weight computed for the current pixel on the second image signal. Both weights can be computed as

weight = exp(-((i - x)^2 + (j - y)^2) / (2·δ1^2)) · exp(-(f_ij - f_xy)^2 / (2·δ2^2))

where f_xy denotes the pixel value of the current pixel, f_ij denotes the pixel value of a neighborhood pixel of the current pixel, i and j are the coordinates of the neighborhood pixel, and δ1 and δ2 denote the standard deviations of the Gaussian distributions.
The embodiment of the present application further provides a license plate recognition method, which is explained based on the license plate recognition device provided in the embodiments shown in fig. 1 to 22. Referring to fig. 23, the method includes:
Step 240: perform near-infrared light supplement through a first light supplement device included in the light supplement device, where the near-infrared light supplement is performed at least during part of the exposure time period of a first preset exposure and is not performed during the exposure time period of a second preset exposure, the first preset exposure and the second preset exposure being two of the multiple exposures of the image sensor.
Step 241: while the first light supplement device performs near-infrared light supplement, pass light in the visible light band and part of the near-infrared light through the filtering component.
Step 242: after the filtering component passes light in the visible light band and part of the near-infrared light, perform multiple exposures through the image sensor to generate and output a first image signal and a second image signal, where the first image signal is an image signal generated according to the first preset exposure and the second image signal is an image signal generated according to the second preset exposure; the first image signal and the second image signal contain information of the license plate to be recognized.
Step 243: the image processing unit recognizes the license plate to be recognized according to the first image signal and the second image signal.
For example, when the first light supplement device performs near-infrared light supplement, the intensity of near-infrared light passing through the filter assembly is higher than the intensity of near-infrared light passing through the filter assembly when the first light supplement device does not perform near-infrared light supplement.
Exemplarily, the method further includes:
supplementing visible light through a second light supplement device included in the light supplement device.
For example, when the central wavelength of the near-infrared light supplement by the first light supplement device is a set characteristic wavelength or falls within a set characteristic wavelength range, the central wavelength and/or the band width of the near-infrared light passing through the first optical filter reach a constraint condition.
Illustratively, the image processing unit identifies the license plate to be identified according to the first image signal and the second image signal, and includes:
the image preprocessing unit generates a grayscale image after preprocessing the first image signal, and generates a color image after preprocessing the second image signal;
the image fusion unit performs fusion processing on the color image and the gray level image to obtain a color fusion image;
the image recognition unit recognizes the license plate to be recognized according to the color fusion image output by the image fusion unit, or according to the grayscale image output by the image preprocessing unit.
Illustratively, the image preprocessing unit generating a grayscale image by preprocessing the first image signal and generating a color image by preprocessing the second image signal includes:
the first preprocessing unit carries out first preprocessing operation on the first image signal to obtain a preprocessed gray image;
the second preprocessing unit carries out second preprocessing operation on the second image signal to obtain a color image;
the joint noise reduction unit performs filtering processing on the color image and the grayscale image to obtain a noise-reduced color image and a noise-reduced grayscale image; or, alternatively,
the joint noise reduction unit carries out filtering processing on the first image signal and the second image signal to obtain a first image signal and a second image signal after noise reduction;
the first preprocessing unit carries out first preprocessing operation on the first image signal subjected to noise reduction to obtain a preprocessed gray image;
and the second preprocessing unit carries out second preprocessing operation on the second image signal subjected to noise reduction to obtain a color image.
Illustratively, the image recognition unit recognizing the license plate to be recognized according to the color fusion image output by the image fusion unit includes:
the license plate positioning unit extracts a color license plate region image of the license plate to be recognized by adopting a deep learning algorithm according to the color fusion image output by the image fusion unit;
and the license plate recognition unit performs character recognition on the color license plate area image to acquire the license plate number of the license plate to be recognized.
Illustratively, the recognizing the license plate to be recognized by the image recognizing unit according to the grayscale image output by the image preprocessing unit includes:
the license plate positioning unit extracts a gray license plate region image of the license plate to be recognized by adopting a deep learning algorithm according to the gray image output by the image preprocessing unit, and acquires a license plate position coordinate in the license plate region image;
the image intercepting unit extracts a color license plate region image of the license plate to be recognized according to the color fusion image output by the image fusion unit and the license plate position coordinate output by the license plate positioning unit;
and the license plate recognition unit performs character recognition according to the gray license plate area image output by the license plate positioning unit to acquire the license plate number of the license plate to be recognized.
Illustratively, the fusing of the color image and the grayscale image by the image fusion unit to obtain a color fusion image includes:
the color extraction unit extracts color signals of the color image;
the brightness extraction unit extracts a brightness signal of the color image;
the fusion processing unit carries out high-pass filtering on the gray level image to obtain a filtered high-frequency signal; carrying out low-pass filtering on the brightness signal of the color image to obtain a filtered low-frequency signal; and carrying out fusion processing on the high-frequency signal, the low-frequency signal and the color signal to obtain the color fusion image.
Illustratively, the fusion processing unit performing fusion processing on the high-frequency signal, the low-frequency signal and the color signal to obtain the color fusion image includes:
carrying out weighted fusion processing on the high-frequency signal and the low-frequency signal to obtain a fusion brightness image;
and carrying out fusion processing on the color signals of the fusion brightness image and the color image to obtain the color fusion image.
Illustratively, the filtering processing is performed on the color image and the grayscale image by the joint noise reduction unit to obtain a noise-reduced color image and a noise-reduced grayscale image, and the processing includes:
and respectively carrying out combined filtering processing on the color image and the gray image according to the correlation between the color image and the gray image to obtain the color image and the gray image after noise reduction.
Illustratively, the filtering processing, by the joint noise reduction unit, the first image signal and the second image signal to obtain a noise-reduced first image signal and second image signal includes:
and respectively carrying out combined filtering processing on the first image signal and the second image signal according to the correlation between the first image signal and the second image signal to obtain the first image signal and the second image signal after noise reduction.
For example, the filter assembly may further include a second filter and a switching member, and at this time, the second filter may be further switched to the light incident side of the image sensor by the switching member. After the second optical filter is switched to the light incident side of the image sensor, light in a visible light waveband is made to pass through the second optical filter, light in a near infrared light waveband is blocked, and after the second optical filter passes through the light in the visible light waveband and blocks the light in the near infrared light waveband, exposure is carried out through the image sensor, so that a third image signal is generated and output.
Optionally, the wavelength range of the near-infrared light incident to the first optical filter is a first reference wavelength range, and the first reference wavelength range is 650 nm to 1100 nm.
Optionally, the center wavelength of the near-infrared supplementary lighting performed by the first supplementary lighting device is any wavelength within a wavelength range of 750 ± 10 nanometers; or
The center wavelength of the near-infrared supplementary lighting performed by the first supplementary lighting device is any wavelength within the wavelength range of 780 +/-10 nanometers; or
The center wavelength of the near-infrared supplementary lighting performed by the first supplementary lighting device is any wavelength within a wavelength range of 940 +/-10 nanometers.
Optionally, the constraint includes:
the difference value between the central wavelength of the near-infrared light passing through the first optical filter and the central wavelength of the near-infrared light supplemented by the first light supplementing device is within a wavelength fluctuation range, and the wavelength fluctuation range is 0-20 nanometers.
Optionally, the constraint includes:
the half-bandwidth of the near-infrared light passing through the first filter is less than or equal to 50 nanometers.
Optionally, the constraint includes:
the first wave band width is smaller than the second wave band width; the first band width refers to the band width of the near-infrared light passing through the first optical filter, and the second band width refers to the band width of the near-infrared light blocked by the first optical filter.
Optionally, the constraint is:
the third wave band width is smaller than the reference wave band width, the third wave band width is the wave band width of the near infrared light with the passing rate larger than the set proportion, and the reference wave band width is any wave band width in the wave band range of 50 nanometers to 150 nanometers.
Alternatively, the set ratio is any ratio within a ratio range of 30% to 50%.
Optionally, at least one exposure parameter of the first preset exposure and the second preset exposure is different, where the exposure parameter includes one or more of exposure time, exposure gain and aperture size, and the exposure gain includes analog gain and/or digital gain.
Optionally, the exposure gain of the first preset exposure is smaller than the exposure gain of the second preset exposure.
Optionally, at least one exposure parameter of the first preset exposure and the second preset exposure is the same, where the exposure parameter includes one or more of exposure time, exposure gain and aperture size, and the exposure gain includes analog gain and/or digital gain.
Optionally, the exposure time of the first preset exposure is equal to the exposure time of the second preset exposure.
Optionally, the image sensor comprises a plurality of photosensitive channels, each photosensitive channel being configured to sense light in at least one visible wavelength band and to sense light in a near infrared wavelength band.
Optionally, a plurality of light sensing channels are used to sense light in at least two different visible light bands.
Optionally, the plurality of photosensitive channels includes at least two of an R photosensitive channel, a G photosensitive channel, a B photosensitive channel, a Y photosensitive channel, a W photosensitive channel, and a C photosensitive channel;
the light sensing device comprises a light sensing channel, a light sensing channel and a light sensing channel, wherein the light sensing channel is used for sensing light of a red light wave band and a near infrared wave band, the light sensing channel is used for sensing light of a green light wave band and a near infrared wave band, the light sensing channel is used for sensing light of a blue light wave band and a near infrared wave band, the light sensing channel is used for sensing light of a yellow light wave band and a near infrared wave band, the light sensing channel is used for sensing light of a full wave band, and the light sensing channel is used for sensing light of the full wave band.
Optionally, the image sensor is an RGB sensor, an RGBW sensor, an RCCB sensor, or an RYYB sensor.
Optionally, the second light supplement device is configured to perform visible light supplement in a normally bright manner; or
The second light supplement device is configured to perform visible light supplement in a stroboscopic manner, wherein the visible light supplement is performed at least during part of the exposure time period of the first preset exposure and is not performed during the entire exposure time period of the second preset exposure; or
The second light supplement device is configured to perform visible light supplement in a stroboscopic manner, wherein the visible light supplement is not performed at least during the entire exposure time period of the first preset exposure and is performed during part of the exposure time period of the second preset exposure.
Optionally, the number of light supplement operations performed by the first light supplement device per unit time is lower than the number of exposures performed by the image sensor per unit time, wherein one or more exposures occur within each interval between two adjacent light supplement operations.
Optionally, the image sensor performs multiple exposures in a global exposure manner, and for any one near-infrared supplementary light, no intersection exists between the time period of the near-infrared supplementary light and the exposure time period of the nearest second preset exposure; the time period of the near-infrared supplementary light is a subset of the exposure time period of the first preset exposure, or the time period of the near-infrared supplementary light and the exposure time period of the first preset exposure have an intersection, or the exposure time period of the first preset exposure is a subset of the time period of the near-infrared supplementary light.
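As an illustrative sketch (not part of the claims), the global-exposure timing constraints above can be expressed as simple interval checks. Half-open `[start, end)` intervals and the function names are assumptions for illustration:

```python
# Hypothetical interval arithmetic over half-open [start, end) time windows.
def intersects(a, b):
    """True if half-open intervals a and b overlap."""
    return a[0] < b[1] and b[0] < a[1]

def is_subset(a, b):
    """True if interval a lies entirely within interval b."""
    return b[0] <= a[0] and a[1] <= b[1]

def fill_light_timing_ok(fill, first_exposure, second_exposure):
    """No overlap with the second preset exposure; the fill-light window is a
    subset of, intersects, or contains the first preset exposure window."""
    if intersects(fill, second_exposure):
        return False
    return (is_subset(fill, first_exposure)
            or intersects(fill, first_exposure)
            or is_subset(first_exposure, fill))
```

For example, a fill-light pulse entirely inside the first preset exposure passes the check, while a pulse overlapping the second preset exposure fails it.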
Optionally, the image sensor performs multiple exposures in a rolling shutter exposure mode, and for any near-infrared supplementary light, an intersection does not exist between the time period of the near-infrared supplementary light and the exposure time period of the nearest second preset exposure;
the starting time of the near-infrared supplementary lighting is not earlier than the exposure starting time of the last line of effective images in the first preset exposure, and the ending time of the near-infrared supplementary lighting is not later than the exposure ending time of the first line of effective images in the first preset exposure;
or, alternatively,
the starting time of the near-infrared supplementary lighting is not earlier than the exposure ending time of the last line of effective images of the nearest second preset exposure before the first preset exposure and not later than the exposure ending time of the first line of effective images in the first preset exposure, and the ending time of the near-infrared supplementary lighting is not earlier than the exposure starting time of the last line of effective images in the first preset exposure and not later than the exposure starting time of the first line of effective images of the nearest second preset exposure after the first preset exposure; or
The starting time of the near-infrared supplementary lighting is not earlier than the exposure ending time of the last line of effective images of the nearest second preset exposure before the first preset exposure and not later than the exposure starting time of the first line of effective images in the first preset exposure, and the ending time of the near-infrared supplementary lighting is not earlier than the exposure ending time of the last line of effective images in the first preset exposure and not later than the exposure starting time of the first line of effective images of the nearest second preset exposure after the first preset exposure.
Optionally, the multiple exposures include odd and even exposures;
the first preset exposure is one of the odd-numbered exposures, and the second preset exposure is one of the even-numbered exposures; or
The first preset exposure is one of the even-numbered exposures, and the second preset exposure is one of the odd-numbered exposures; or
The first preset exposure is one of designated odd-numbered exposures, and the second preset exposure is one of the exposures other than the designated odd-numbered exposures; or
The first preset exposure is one of designated even-numbered exposures, and the second preset exposure is one of the exposures other than the designated even-numbered exposures; or, alternatively,
the first preset exposure is one exposure in a first exposure sequence, and the second preset exposure is one exposure in a second exposure sequence; or
The first preset exposure is one exposure in a second exposure sequence, and the second preset exposure is one exposure in the first exposure sequence;
the multiple exposures comprise a plurality of exposure sequences, the first exposure sequence and the second exposure sequence being one or two of the plurality of exposure sequences, wherein each exposure sequence comprises N exposures, the N exposures comprising 1 first preset exposure and N-1 second preset exposures, or comprising 1 second preset exposure and N-1 first preset exposures, where N is a positive integer greater than 2.
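The odd/even alternation described above can be sketched as a small labelling routine; this is an illustrative assumption about frame numbering (1-based), not part of the claims:

```python
# Hypothetical sketch: tag each exposure as "first" or "second" preset
# by frame parity, matching the odd/even alternation described above.
def classify_exposures(num_exposures, first_on_odd=True):
    """Label exposures: odd-numbered frames get one preset, even the other."""
    labels = []
    for i in range(1, num_exposures + 1):  # 1-based frame numbering
        is_odd = i % 2 == 1
        labels.append("first" if is_odd == first_on_odd else "second")
    return labels
```

With `first_on_odd=True`, frames 1, 3, 5, ... are first preset exposures; with `first_on_odd=False` the assignment is swapped.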
It should be noted that, since the present embodiment may adopt the same inventive concept as the embodiments shown in fig. 1 to 23, the explanation of the relevant contents in those embodiments also applies to the present embodiment and is not repeated here.
Other embodiments of the present disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the application disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (24)

1. A license plate recognition apparatus, characterized by comprising:
a light filtering component, an image sensor, a light supplement device, and an image processing unit, wherein the image sensor is located on the light-emitting side of the light filtering component;
the image sensor is used for generating and outputting a first image signal and a second image signal through multiple exposures, wherein the first image signal is an image signal generated according to a first preset exposure, the second image signal is an image signal generated according to a second preset exposure, the first preset exposure and the second preset exposure are two exposures of the multiple exposures, the first preset exposure and the second preset exposure are continuous two exposures, and the first preset exposure and the second preset exposure are alternately performed; the first image signal and the second image signal comprise information of a license plate to be recognized, and the first image signal and the second image signal are both one-frame image signals;
the light supplement device comprises a first light supplement device, and the first light supplement device is used for performing near-infrared light supplement, wherein the near-infrared light supplement is performed at least in a part of the exposure time period of the first preset exposure, and the near-infrared light supplement is not performed in the exposure time period of the second preset exposure; for any near-infrared supplementary lighting, no intersection exists between the time period of the near-infrared supplementary lighting and the nearest exposure time period of the second preset exposure;
the filtering component is configured to pass light in the visible light band and part of the near-infrared light;
and the image processing unit is used for identifying the license plate to be identified according to the first image signal and the second image signal.
2. The apparatus of claim 1, wherein the intensity of the near-infrared light passing through the filter assembly when the first fill-in device performs near-infrared fill-in is higher than the intensity of the near-infrared light passing through the filter assembly when the first fill-in device does not perform near-infrared fill-in.
3. The apparatus according to claim 1 or 2, wherein the light supplement device further comprises a second light supplement device, and the second light supplement device is configured to supplement visible light.
4. The apparatus according to claim 3, wherein the second light supplement device is configured to perform visible light supplement in a normally bright manner;
or
The second light supplement device is configured to perform visible light supplement in a stroboscopic manner, wherein the visible light supplement is performed at least during part of the exposure time period of the first preset exposure and is not performed during the entire exposure time period of the second preset exposure;
or
The second light supplement device is configured to perform visible light supplement in a stroboscopic manner, wherein the visible light supplement is not performed at least during the entire exposure time period of the first preset exposure and is performed during part of the exposure time period of the second preset exposure.
5. The apparatus according to claim 1 or 2,
the first preset exposure and the second preset exposure differ in at least one exposure parameter, the at least one exposure parameter being one or more of exposure time, exposure gain, and aperture size, wherein the exposure gain comprises an analog gain and/or a digital gain.
6. The apparatus according to claim 1 or 2,
at least one exposure parameter of the first preset exposure is the same as that of the second preset exposure, the at least one exposure parameter comprising one or more of exposure time, exposure gain, and aperture size, wherein the exposure gain comprises an analog gain and/or a digital gain.
7. The apparatus according to claim 1 or 2,
the image sensor comprises a plurality of photosensitive channels, and each photosensitive channel is used for sensing light of at least one visible light wave band and sensing light of a near infrared wave band.
8. The apparatus according to claim 1 or 2,
the image sensor performs multiple exposures in a global exposure mode, and the time period of the near-infrared supplementary light is a subset of the exposure time period of the first preset exposure, or the time period of the near-infrared supplementary light and the exposure time period of the first preset exposure have an intersection, or the exposure time period of the first preset exposure is a subset of the time period of the near-infrared supplementary light.
9. The apparatus according to claim 1 or 2,
the image sensor adopts a rolling shutter exposure mode to carry out multiple exposure,
the starting time of the near-infrared supplementary lighting is not earlier than the exposure starting time of the last row of effective images in the first preset exposure, and the ending time of the near-infrared supplementary lighting is not later than the exposure ending time of the first row of effective images in the first preset exposure;
or, alternatively,
the starting time of the near-infrared supplementary lighting is not earlier than the exposure ending time of the last line of effective images of the nearest second preset exposure before the first preset exposure and is not later than the exposure ending time of the first line of effective images in the first preset exposure, and the ending time of the near-infrared supplementary lighting is not earlier than the exposure starting time of the last line of effective images in the first preset exposure and is not later than the exposure starting time of the first line of effective images of the nearest second preset exposure after the first preset exposure; or
The starting time of the near-infrared supplementary lighting is not earlier than the exposure ending time of the last line of effective images of the nearest second preset exposure before the first preset exposure and not later than the exposure starting time of the first line of effective images in the first preset exposure, and the ending time of the near-infrared supplementary lighting is not earlier than the exposure ending time of the last line of effective images in the first preset exposure and not later than the exposure starting time of the first line of effective images of the nearest second preset exposure after the first preset exposure.
10. The apparatus according to claim 1 or 2,
when the central wavelength of the near-infrared light supplement performed by the first light supplement device is a set characteristic wavelength or falls within a set characteristic wavelength range, the central wavelength and/or the band width of the near-infrared light passing through the light filtering component satisfies a constraint condition.
11. The apparatus of claim 10, wherein the constraint comprises any one of:
the difference value between the central wavelength of the near-infrared light passing through the light filtering component and the central wavelength of the near-infrared light supplementary lighting performed by the first light supplementary lighting device is within a wavelength fluctuation range, and the wavelength fluctuation range is 0-20 nanometers;
or
The half bandwidth of the near infrared light passing through the filtering component is less than or equal to 50 nanometers;
or
The first wave band width is smaller than the second wave band width; the first wavelength band width refers to the wavelength band width of near infrared light passing through the filter assembly, and the second wavelength band width refers to the wavelength band width of near infrared light blocked by the filter assembly;
or
The third wave band width is smaller than the reference wave band width, the third wave band width refers to the wave band width of the near infrared light with the passing rate larger than the set proportion, the reference wave band width is any wave band width in the wave band range of 50 nanometers to 150 nanometers, and the set proportion is any proportion in the proportion range of 30 percent to 50 percent.
12. The apparatus according to claim 1 or 2, wherein the image processing unit comprises:
an image preprocessing unit, an image fusion unit, and an image recognition unit;
the image preprocessing unit is used for preprocessing the first image signal to generate a gray image and preprocessing the second image signal to generate a color image;
the image fusion unit is used for carrying out fusion processing on the color image and the gray level image to obtain a color fusion image;
the image recognition unit is configured to recognize the license plate to be recognized according to the color fusion image output by the image fusion unit, or to recognize the license plate to be recognized according to the gray image output by the image preprocessing unit.
13. The apparatus of claim 12, wherein the image pre-processing unit comprises:
a first preprocessing unit, a second preprocessing unit, and a joint noise reduction unit;
the first preprocessing unit is used for performing first preprocessing operation on the first image signal to obtain a preprocessed gray-scale image;
the second preprocessing unit is used for performing second preprocessing operation on the second image signal to obtain a color image;
the joint noise reduction unit is configured to perform filtering processing on the color image and the gray level image to obtain a noise-reduced color image and a noise-reduced gray level image; or, alternatively,
the image preprocessing unit includes:
the device comprises a joint noise reduction unit, a first preprocessing unit and a second preprocessing unit;
the joint noise reduction unit is configured to perform filtering processing on the first image signal and the second image signal to obtain a noise-reduced first image signal and a noise-reduced second image signal;
the first preprocessing unit is used for performing first preprocessing operation on the first image signal subjected to noise reduction to obtain a preprocessed gray image;
and the second preprocessing unit is used for performing second preprocessing operation on the second image signal subjected to noise reduction to obtain a color image.
14. The apparatus of claim 12, wherein the image recognition unit comprises:
a license plate positioning unit and a license plate recognition unit;
the license plate positioning unit is used for extracting a color license plate region image of the license plate to be recognized by adopting a deep learning algorithm according to the color fusion image output by the image fusion unit;
and the license plate recognition unit is used for carrying out character recognition on the color license plate region image and acquiring the license plate number of the license plate to be recognized.
15. The apparatus of claim 12, wherein the image recognition unit comprises:
a license plate positioning unit, a license plate recognition unit, and an image intercepting unit;
the license plate positioning unit is used for extracting a gray license plate region image of the license plate to be recognized by adopting a deep learning algorithm according to the gray image output by the image preprocessing unit and acquiring a license plate position coordinate in the license plate region image;
the image intercepting unit is used for extracting a color license plate region image of the license plate to be recognized according to the color fusion image output by the image fusion unit and the license plate position coordinate output by the license plate positioning unit;
and the license plate recognition unit is used for carrying out character recognition according to the gray-scale license plate area image output by the license plate positioning unit and acquiring the license plate number of the license plate to be recognized.
16. The apparatus according to claim 12, wherein the image fusion unit comprises: the color extraction unit and the brightness extraction unit are respectively connected with the fusion processing unit;
wherein, the color extracting unit is used for extracting color signals of the color image;
the brightness extraction unit is used for extracting a brightness signal of the color image;
the fusion processing unit is used for carrying out high-pass filtering on the gray level image to obtain a filtered high-frequency signal;
carrying out low-pass filtering on the brightness signal of the color image to obtain a filtered low-frequency signal;
and carrying out fusion processing on the high-frequency signal, the low-frequency signal and the color signal to obtain the color fusion image.
17. The device according to claim 16, wherein the fusion processing unit is specifically configured to:
carrying out weighted fusion processing on the high-frequency signal and the low-frequency signal to obtain a fusion brightness image;
and carrying out fusion processing on the color signals of the fusion brightness image and the color image to obtain the color fusion image.
18. The device according to claim 13, wherein the joint noise reduction unit is specifically configured to:
perform joint filtering processing on the color image and the gray image respectively, according to the correlation between the color image and the gray image, to obtain the noise-reduced color image and the noise-reduced gray image.
19. The device according to claim 13, wherein the joint noise reduction unit is specifically configured to:
perform joint filtering processing on the first image signal and the second image signal respectively, according to the correlation between the first image signal and the second image signal, to obtain the noise-reduced first image signal and the noise-reduced second image signal.
20. A license plate recognition method, applied to a license plate recognition device, the license plate recognition device comprising an image sensor, a light supplement device, and a filtering component, the image sensor being located on the light-emitting side of the filtering component, wherein the method comprises:
performing near-infrared light supplement through a first light supplement device included in the light supplement device, wherein the near-infrared light supplement is performed at least during part of the exposure time period of a first preset exposure and is not performed during the exposure time period of a second preset exposure, the first preset exposure and the second preset exposure being two of multiple exposures of the image sensor, the first preset exposure and the second preset exposure being two consecutive exposures performed alternately; for any near-infrared light supplement, no intersection exists between the time period of the near-infrared light supplement and the exposure time period of the nearest second preset exposure;
passing light in the visible light band and part of the near-infrared light through the filtering component;
performing multiple exposures by the image sensor to generate and output a first image signal and a second image signal, the first image signal being an image signal generated according to the first preset exposure, the second image signal being an image signal generated according to the second preset exposure; the first image signal and the second image signal comprise information of a license plate to be recognized, and the first image signal and the second image signal are both one-frame image signals;
identifying, by an image processing unit, the license plate to be identified according to the first image signal and the second image signal.
21. The method of claim 20,
the intensity of near infrared light passing through the light filtering component when the first light supplementing device performs near infrared light supplementing is higher than the intensity of near infrared light passing through the light filtering component when the first light supplementing device does not perform near infrared light supplementing.
22. The method of claim 20 or 21, further comprising:
and the light supplement device comprises a second light supplement device for supplementing visible light.
23. The method of claim 20 or 21,
when the central wavelength of the near-infrared light supplement performed by the first light supplement device is a set characteristic wavelength or falls within a set characteristic wavelength range, the central wavelength and/or the band width of the near-infrared light passing through the light filtering component satisfies a constraint condition.
24. The method according to claim 20 or 21, wherein the image processing unit identifies the license plate to be identified according to the first image signal and the second image signal, and comprises:
preprocessing the first image signal to generate a gray image, and preprocessing the second image signal to generate a color image;
fusing the color image and the gray level image to obtain a color fused image;
and identifying the license plate to be identified according to the color fusion image, or identifying the license plate to be identified according to the gray image.
CN201910472683.3A 2019-05-31 2019-05-31 License plate recognition device and method Active CN110490187B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910472683.3A CN110490187B (en) 2019-05-31 2019-05-31 License plate recognition device and method


Publications (2)

Publication Number Publication Date
CN110490187A CN110490187A (en) 2019-11-22
CN110490187B true CN110490187B (en) 2022-04-15

Family

ID=68546315

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910472683.3A Active CN110490187B (en) 2019-05-31 2019-05-31 License plate recognition device and method

Country Status (1)

Country Link
CN (1) CN110490187B (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110490811B (en) * 2019-05-31 2022-09-09 杭州海康威视数字技术股份有限公司 Image noise reduction device and image noise reduction method
CN110493491B (en) * 2019-05-31 2021-02-26 杭州海康威视数字技术股份有限公司 Image acquisition device and camera shooting method
CN110490042B (en) * 2019-05-31 2022-02-11 杭州海康威视数字技术股份有限公司 Face recognition device and entrance guard's equipment
CN110493492B (en) * 2019-05-31 2021-02-26 杭州海康威视数字技术股份有限公司 Image acquisition device and image acquisition method
CN113542613B (en) * 2020-04-14 2023-05-12 华为技术有限公司 Device and method for photographing
CN111601046B (en) * 2020-04-22 2022-03-01 惠州市德赛西威汽车电子股份有限公司 Dark light environment driving state monitoring method

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN202887451U (en) * 2012-10-26 2013-04-17 青岛海信网络科技股份有限公司 Infrared light and visible light filling blended electronic police system
CN106375740A (en) * 2016-09-28 2017-02-01 华为技术有限公司 Method, device and system for generating RGB image
CN107005639A (en) * 2014-12-10 2017-08-01 索尼公司 Image pick up equipment, image pickup method, program and image processing equipment
CN107845083A (en) * 2016-09-19 2018-03-27 杭州海康威视数字技术股份有限公司 It is divided the image capture device of fusion
CN108540736A (en) * 2018-04-03 2018-09-14 深圳新亮智能技术有限公司 Infrared laser illuminates the camera chain of Color License Plate

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2016096430A (en) * 2014-11-13 2016-05-26 パナソニックIpマネジメント株式会社 Imaging device and imaging method


Also Published As

Publication number Publication date
CN110490187A (en) 2019-11-22

Similar Documents

Publication Publication Date Title
CN110490187B (en) License plate recognition device and method
CN110493494B (en) Image fusion device and image fusion method
CN110519489B (en) Image acquisition method and device
CN110505377B (en) Image fusion apparatus and method
CN110493491B (en) Image acquisition device and camera shooting method
CN110490041B (en) Face image acquisition device and method
CN110490811B (en) Image noise reduction device and image noise reduction method
CN110706178B (en) Image fusion device, method, equipment and storage medium
CN110490042B (en) Face recognition device and entrance guard's equipment
CN110493535B (en) Image acquisition device and image acquisition method
CN110493536B (en) Image acquisition device and image acquisition method
CN110490044B (en) Face modeling device and face modeling method
CN110493495B (en) Image acquisition device and image acquisition method
CN110493496B (en) Image acquisition device and method
CN108712608A (en) Terminal device image pickup method and device
CN110493537B (en) Image acquisition device and image acquisition method
CN110493493B (en) Panoramic detail camera and method for acquiring image signal
US11889032B2 (en) Apparatus for acquiring image and method for acquiring image
CN110493532A (en) A kind of image processing method and system
CN110493533B (en) Image acquisition device and image acquisition method
CN105635702A (en) Imaging method, image sensor, imaging device and electronic device
CN113126252B (en) Low-light-level imaging system
JP2012010141A (en) Image processing apparatus
CN114374776B (en) Camera and control method of camera

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant