CN108234897B - Method and device for controlling night vision system, storage medium and processor - Google Patents


Info

Publication number
CN108234897B
CN108234897B (Application CN201810103623.XA)
Authority
CN
China
Prior art keywords
acquisition device
image acquisition
distance
optical axis
image
Prior art date
Legal status (an assumption, not a legal conclusion; Google has not performed a legal analysis)
Active
Application number
CN201810103623.XA
Other languages
Chinese (zh)
Other versions
CN108234897A (en)
Inventor
蒋涛
左昉
王新韬
李泽一
Current Assignee (the listed assignee may be inaccurate; Google has not performed a legal analysis)
BEIJING JIGUANG TONGDA TECHNOLOGY Co.,Ltd.
Original Assignee
蒋涛
左昉
Priority date (an assumption, not a legal conclusion; Google has not performed a legal analysis)
Filing date
Publication date
Application filed by 蒋涛, 左昉
Priority to CN201810103623.XA
Publication of CN108234897A
Application granted
Publication of CN108234897B

Classifications

    • H ELECTRICITY
        • H04 ELECTRIC COMMUNICATION TECHNIQUE
            • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
                • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
                    • H04N23/70 Circuitry for compensating brightness variation in the scene
                        • H04N23/74 Circuitry for compensating brightness variation in the scene by influencing the scene brightness using illuminating means
                    • H04N23/60 Control of cameras or camera modules
                        • H04N23/67 Focus control based on electronic image sensor signals
                • H04N5/00 Details of television systems
                    • H04N5/30 Transforming light or analogous information into electric information
                        • H04N5/33 Transforming infrared radiation

Abstract

The invention discloses a method and a device for controlling a night vision system, together with a storage medium and a processor. The method comprises the following steps: determining a second distance between the optical axis of a first image acquisition device and the optical axis of a laser lens; acquiring, based on the second distance, a first distance between the imaging plane of the first image acquisition device and a target object; determining a target light-emitting angle of the laser lens according to the first distance and the second distance, wherein the laser lens emits laser light that supplements illumination for the first image acquisition device while it photographs the target object; and controlling the laser lens to adjust to the position corresponding to the target light-emitting angle. The invention solves the technical problem that night vision systems in the prior art cannot adjust the angle of the laser lens in real time, which results in low laser illumination efficiency.

Description

Method and device for controlling night vision system, storage medium and processor
Technical Field
The invention relates to the technical field of night vision, in particular to a method and a device for controlling a night vision system, a storage medium and a processor.
Background
Night vision technology is an electro-optical technology that enables observation at night by means of an electro-optical imaging device. It mainly comprises active infrared night vision technology and passive infrared night vision technology. Active infrared night vision supplements the area to be observed with infrared light and forms an image from the infrared light reflected by the observed object, while passive infrared night vision forms a long-wave infrared image by sensing the infrared rays emitted by the observed object itself. Infrared night vision technology plays an important role in forest fire prevention, railway monitoring, police evidence collection and similar applications.
However, in a conventional laser night vision device, because of the separation between the two optical axes, the laser spot cannot be made to fall on the centre of the field of view of the low-illumination camera, so a satisfactory supplementary-lighting effect cannot be achieved. The influence of this optical-axis separation on the laser illumination effect becomes more pronounced as the night vision distance grows longer and the angle of view grows smaller; if the irradiation angle of the laser emitting device cannot be adjusted promptly and accurately, the light-source requirement of the low-illumination camera is difficult to meet, and the resulting image is unclear.
In view of the above problems, no effective solution has been proposed.
Disclosure of Invention
The embodiment of the invention provides a method and a device for controlling a night vision system, a storage medium and a processor, which are used for at least solving the technical problem that the night vision system in the prior art cannot adjust the angle of a laser lens in real time, so that the laser illumination efficiency is low.
According to an aspect of an embodiment of the present invention, there is provided a method of controlling a night vision system, including: determining a second distance between the optical axis of the first image acquisition device and the optical axis of the laser lens; acquiring a first distance between an imaging plane of the first image acquisition device and a target object based on the second distance; determining a target light-emitting angle of the laser lens according to the first distance and the second distance, wherein the laser lens is used for emitting laser for supplementing light for the first image acquisition device when the first image acquisition device shoots the target object; and controlling the laser lens to adjust to the position corresponding to the target light-emitting angle.
Further, before obtaining the first distance and the second distance, the method further includes: detecting whether the optical axis of the first image acquisition device is parallel to the optical axis of the second image acquisition device; if the detection result is negative, adjusting the optical axis of the first image acquisition device to be parallel to the optical axis of the second image acquisition device; and if the detection result is positive, extracting the first image characteristic data acquired by the first image acquisition device and the second image characteristic data acquired by the second image acquisition device.
Further, after extracting the first image feature data collected by the first image collection device and the second image feature data collected by the second image collection device, the method further includes: judging whether the first image characteristic data is matched with the second image characteristic data; and if the first image characteristic data is matched with the second image characteristic data, acquiring a first distance between an imaging plane of the first image acquisition device and a target object and a second distance between an optical axis of the first image acquisition device and an optical axis of the laser lens.
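The criterion for judging whether the first image characteristic data matches the second image characteristic data is not specified in the description; one common stand-in is nearest-neighbour descriptor matching with a ratio test. The sketch below is a hypothetical illustration (the function name, thresholds, and use of NumPy are assumptions, not part of the patent):

```python
import numpy as np

def features_match(desc1: np.ndarray, desc2: np.ndarray,
                   ratio: float = 0.75, min_matches: int = 10) -> bool:
    """Decide whether two descriptor sets plausibly describe the same
    scene, using nearest-neighbour matching with a ratio test.
    Thresholds are illustrative; the patent names no matching criterion."""
    good = 0
    for d in desc1:
        # Euclidean distance from this descriptor to every descriptor in desc2
        dists = np.linalg.norm(desc2 - d, axis=1)
        if len(dists) < 2:
            continue
        i1, i2 = np.argsort(dists)[:2]
        # Accept only if the best match is clearly better than the runner-up
        if dists[i1] < ratio * dists[i2]:
            good += 1
    return good >= min_matches
```

If the test returns true, the controller can proceed to acquire the first and second distances as described above.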
Further, after controlling the laser lens to adjust to the position corresponding to the target light-emitting angle, the method further includes: acquiring first image data acquired by the first image acquisition device and second image data acquired by the second image acquisition device; performing fusion processing on the first image data and the second image data by adopting an image fusion algorithm to obtain third image data; and outputting the third image data.
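The description names an image fusion algorithm without fixing one. As a hedged illustration, the sketch below fuses two registered single-channel images by simple weighted averaging; the weight, function name and NumPy representation are assumptions, not the patent's algorithm:

```python
import numpy as np

def fuse_images(first: np.ndarray, second: np.ndarray, w: float = 0.5) -> np.ndarray:
    """Fuse two registered single-channel images by weighted averaging.

    Assumes both images are already registered to the same shape; a
    weighted average is the simplest stand-in for the unspecified
    "image fusion algorithm" of the description.
    """
    if first.shape != second.shape:
        raise ValueError("images must be registered to the same shape")
    fused = w * first.astype(np.float64) + (1.0 - w) * second.astype(np.float64)
    return np.clip(fused, 0, 255).astype(np.uint8)
```

The fused result corresponds to the "third image data" that the method outputs.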
Further, the first image capturing device is configured to capture image data of the target object in a near infrared band, and the second image capturing device is configured to capture image data of the target object in a far infrared band.
Further, a first distance between the imaging plane of the first image capturing device and the target object is obtained by the following formula:

d = f1·(f2·b - k·x_l) / (f2·x_r + f1·x_l)

wherein d is the first distance; f1 is the focal length of the lens of the first image acquisition device and f2 is the focal length of the lens of the second image acquisition device; k is the distance between the imaging plane of the first image acquisition device and the imaging plane of the second image acquisition device; b is the second distance; x_r is the distance between the image of the energy centre of gravity of the target object in the imaging plane of the first image acquisition device and the optical axis; and x_l is the distance between the image of the energy centre of gravity of the target object in the imaging plane of the second image acquisition device and the optical axis.
Further, the target light-emitting angle is determined from the first distance and the second distance by the following formula:

θ = arctan(b / d)

wherein θ is the target light-emitting angle.
According to another aspect of an embodiment of the present invention, there is also provided an apparatus for controlling a night vision system, including: the first determining module is used for determining a second distance between the optical axis of the first image acquisition device and the optical axis of the laser lens; the acquisition module is used for acquiring a first distance between an imaging plane of the first image acquisition device and the target object based on the second distance; the second determining module is used for determining a target light-emitting angle of the laser lens according to the first distance and the second distance, wherein the laser lens is used for emitting laser for supplementing light for the first image acquisition device when the first image acquisition device shoots a target object; and the control module is used for controlling the laser lens to adjust to the position corresponding to the target light-emitting angle.
According to another aspect of an embodiment of the present invention, there is also provided a storage medium comprising a stored program, wherein, when the program is executed by a computer processor, any one of the above-mentioned methods of controlling a night vision system is implemented.
According to another aspect of an embodiment of the present invention, there is also provided a processor for executing a program, wherein the program is executed to perform any one of the above-mentioned methods of controlling a night vision system.
In the embodiment of the invention, a second distance between the optical axis of the first image acquisition device and the optical axis of the laser lens is determined; a first distance between the imaging plane of the first image acquisition device and a target object is acquired based on the second distance; a target light-emitting angle of the laser lens is determined according to the first distance and the second distance, the laser lens emitting laser light that supplements illumination for the first image acquisition device while it photographs the target object; and the laser lens is controlled to adjust to the position corresponding to the target light-emitting angle. This directs the laser spot onto the centre of the field of view of the low-illumination camera by controlled rotation of the laser lens, achieves the technical effect of improving laser illumination efficiency and the supplementary-lighting rate, and solves the technical problem that night vision systems in the prior art cannot adjust the angle of the laser lens in real time, which results in low laser illumination efficiency.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the invention and together with the description serve to explain the invention without limiting the invention. In the drawings:
FIG. 1 is a flow chart of steps of a method of controlling a night vision system according to an embodiment of the present invention;
FIG. 2 is a flow chart illustrating steps of an alternative method of controlling a night vision system in accordance with an embodiment of the present invention;
FIG. 3 is a schematic diagram of the ranging principle of an alternative method of controlling a night vision system according to an embodiment of the invention;
FIG. 4 is a schematic diagram of an alternative laser lens angle adjustment according to an embodiment of the present invention;
FIG. 5 is a schematic illustration of an alternative laser spot according to an embodiment of the present invention;
FIG. 6 is a flow chart illustrating steps of an alternative method of controlling a night vision system in accordance with an embodiment of the present invention;
FIG. 7 is a flowchart illustrating steps in an alternative method of controlling a night vision system in accordance with an embodiment of the present invention; and
fig. 8 is a schematic structural diagram of an arrangement for controlling a night vision system according to an embodiment of the invention.
Detailed Description
In order to make the technical solutions of the present invention better understood, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description and claims of the present invention and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the invention described herein are capable of operation in sequences other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
First, in order to facilitate understanding of the embodiments of the present invention, some terms or nouns referred to in the present invention will be explained as follows:
digital Signal Processing (DSP): refers to the theory and technique of representing and processing signals digitally.
Example 1
In accordance with an embodiment of the present invention, there is provided an embodiment of a method of controlling a night vision system, it being noted that the steps illustrated in the flowchart of the figure may be performed in a computer system such as a set of computer executable instructions and that, although a logical order is illustrated in the flowchart, in some cases the steps illustrated or described may be performed in an order different than that presented herein.
Fig. 1 is a flow chart of the steps of a method of controlling a night vision system according to an embodiment of the invention, as shown in fig. 1, comprising the steps of:
step S102, determining a second distance between the optical axis of the first image acquisition device and the optical axis of the laser lens;
step S104, acquiring a first distance between the imaging plane of the first image acquisition device and a target object based on the second distance;
step S106, determining a target light-emitting angle of the laser lens according to the first distance and the second distance, wherein the laser lens is used for emitting laser for supplementing light for the first image acquisition device when the first image acquisition device shoots the target object;
and step S108, controlling the laser lens to adjust to the position corresponding to the target light-emitting angle.
In the embodiment of the invention, a second distance between the optical axis of the first image acquisition device and the optical axis of the laser lens is determined; a first distance between the imaging plane of the first image acquisition device and a target object is acquired based on the second distance; a target light-emitting angle of the laser lens is determined according to the first distance and the second distance, the laser lens emitting laser light that supplements illumination for the first image acquisition device while it photographs the target object; and the laser lens is controlled to adjust to the position corresponding to the target light-emitting angle. This directs the laser spot onto the centre of the field of view of the low-illumination camera by controlled rotation of the laser lens, achieves the technical effect of improving laser illumination efficiency and the supplementary-lighting rate, and solves the technical problem that night vision systems in the prior art cannot adjust the angle of the laser lens in real time, which results in low laser illumination efficiency.
It should be clear that, optionally, the current light-emitting angle and the target light-emitting angle may each be any angle between 0° and 45°.
In an optional embodiment, the first image capturing device is configured to capture image data of the target object in a near infrared band, and the second image capturing device is configured to capture image data of the target object in a far infrared band.
Optionally, a laser lens of the laser device is fixedly disposed on the laser angle fine-tuning device, and the laser angle fine-tuning device may be connected to the controller, and is configured to adjust the laser lens to a position corresponding to the target light-emitting angle under the control of the controller, where the laser angle fine-tuning device may adjust a horizontal angle and a vertical angle of the laser lens.
As an alternative embodiment, the first image capturing device may be a low-light camera, and the first image capturing device may include, but is not limited to: a visible light telephoto lens; the low-illumination camera can be used for sensing low-illumination information of visible light of the environment, and can provide images of visible light wave bands for the DSP high-speed fusion circuit under the control of the controller, and the zoom focusing position of the visible light telephoto lens on the low-illumination camera is controlled by the controller.
As an alternative embodiment, the second image capturing device may be an infrared camera. The infrared camera can be used for sensing the temperature field of the environment and imaging in an invisible light wave band (such as a far infrared wave band), and further the infrared camera can provide images in the far infrared wave band for the DSP high-speed fusion circuit under the control of the controller. The second image acquisition device further comprises an infrared zoom lens, wherein the zooming and focusing positions of the infrared zoom lens are controlled by the controller.
In an alternative embodiment, fig. 2 is a flow chart of steps of an alternative method of controlling a night vision system according to an embodiment of the invention, as shown in fig. 2, before the first distance and the second distance are obtained, the method further comprising:
step S202, detecting whether the optical axis of the first image acquisition device is parallel to the optical axis of the second image acquisition device;
step S204, if the detection result is negative, adjusting the optical axis of the first image acquisition device to be parallel to the optical axis of the second image acquisition device;
in step S206, if the detection result is yes, the first image feature data collected by the first image collection device and the second image feature data collected by the second image collection device are extracted.
It should be noted that detecting whether the optical axis of the first image acquisition device is parallel to the optical axis of the second image acquisition device is, in effect, checking whether the two image acquisition devices are optically aligned with each other.
Based on the optional embodiments provided in steps S202 to S206, a precondition for controlling the laser lens to adjust to the position corresponding to the target light-emitting angle is that the optical axis of the first image capturing device is parallel to the optical axis of the second image capturing device. Therefore, when the optical axis of the first image acquisition device is detected to be not parallel to the optical axis of the second image acquisition device, the optical axis of the first image acquisition device is adjusted to be parallel to the optical axis of the second image acquisition device.
As an alternative embodiment, when night vision needs to be performed in fusion mode, the controller in the night vision system may detect whether the optical axis of the preset visible-light camera is parallel to that of the preset infrared camera; when they are parallel, the controller samples the horizontal angle and pitch angle of the infrared angle fine-tuning device at the current moment. By comparing the sampled values with the expected values, the controller drives a stepping motor in the infrared angle fine-tuning device to rotate so that the infrared camera reaches the designated position.
After the infrared camera reaches the designated position, the first image characteristic data acquired by the infrared camera and the second image characteristic data acquired by the low-illumination camera are obtained, a target object present in both sets of characteristic data is selected, and the target object is marked in the first image characteristic data. The corresponding target object is then found in the laser imaging image through edge detection and feature-point matching. As shown in FIG. 3, the energy centres of gravity Pr and Pl of the target object in the two images are calculated by the energy-centre-of-gravity method, and their coordinates are x_r and x_l.
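The energy-centre-of-gravity method used above reduces to an intensity-weighted mean of pixel coordinates. A minimal sketch follows; the function name and the NumPy representation are illustrative assumptions, not part of the patent:

```python
import numpy as np

def energy_centroid(region: np.ndarray) -> tuple:
    """Energy centre of gravity of an image region: the
    intensity-weighted mean of the pixel coordinates, returned as (x, y)."""
    region = region.astype(np.float64)
    total = region.sum()
    if total == 0:
        raise ValueError("region has no energy")
    # Row index is y, column index is x
    ys, xs = np.indices(region.shape)
    return (xs * region).sum() / total, (ys * region).sum() / total
```

Applying this to the marked target region in each image yields the coordinates x_r and x_l used in the ranging formula.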
As also shown in fig. 3: d is the first distance; f1 is the focal length of the lens of the first image acquisition device (the low-illumination camera) and f2 is the focal length of the lens of the second image acquisition device (the infrared camera); k is the distance between the imaging plane of the first image acquisition device and the imaging plane of the second image acquisition device; b is the second distance between the optical axis of the first image acquisition device and the optical axis of the laser lens; x_r is the distance between the image of the energy centre of gravity of the target object in the imaging plane of the first image acquisition device and the optical axis, and x_r1 is the distance between that image and a first preset parallel line; x_l is the distance between the image of the energy centre of gravity of the target object in the imaging plane of the second image acquisition device and the optical axis, and x_l1 is the distance between that image and a second preset parallel line; O1 is the optical axis of the first image acquisition device (the visible-light optical axis) and O2 is the optical axis of the second image acquisition device (the thermal-imaging optical axis).
In an alternative embodiment, the second distance b between the optical axis of the first image acquisition device and the optical axis of the laser lens satisfies the similar-triangle relation:

b = (x_r / f1)·d + (x_l / f2)·(d + k)

Solving this relation for d, the first distance d between the imaging plane of the first image acquisition device and the target object is acquired on the basis of the second distance b:

d = f1·(f2·b - k·x_l) / (f2·x_r + f1·x_l)
it should be noted that, in the above alternative embodiments provided in the present application, only the focal length number f1 of the lens of the first image capturing device, the focal length number f2 of the lens of the second image capturing device, the second distance b between the optical axis of the first image capturing device and the optical axis of the laser lens, the distance k between the imaging plane of the first image capturing device and the imaging plane of the second image capturing device, and the distance x between the imaging plane of the first image capturing device and the optical axis of the energy center of gravity of the target object in the imaging plane of the first image capturing device are requiredrAnd the energy center of gravity of the target objectDistance x between the image of the imaging plane of the second image acquisition device and the optical axisl
In an alternative embodiment, the image acquisition devices in the present application may use either fixed-focus or zoom lenses. If fixed-focus lenses are used, the focal length f1 of the visible-light lens of the low-illumination camera and the focal length f2 of the thermal-imaging lens of the infrared camera are fixed; if zoom lenses are used, the current focal-length parameters of the visible-light lens and the thermal-imaging lens need to be read. The distance k between the imaging plane of the first image acquisition device and the imaging plane of the second image acquisition device can be determined by, but is not limited to, direct measurement. With these quantities, the present application can derive the first distance d between the target object to be observed and the imaging plane of the visible-light camera.
As shown in fig. 4, the target light-emitting angle θ can then be determined from the first distance d and the second distance b:

θ = arctan(b / d)

where θ is the target light-emitting angle.
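As a numerical illustration of the ranging and angle steps, the sketch below computes d and θ. It assumes the similar-triangle relation b = (x_r/f1)·d + (x_l/f2)·(d + k), which is reconstructed here from the symbol definitions in this description rather than quoted from the patent, so both functions are hedged sketches:

```python
import math

def first_distance(f1: float, f2: float, k: float,
                   b: float, x_r: float, x_l: float) -> float:
    """Solve b = (x_r/f1)*d + (x_l/f2)*(d + k) for the object distance d.
    All lengths must share one unit (e.g. millimetres). The relation is
    an assumption reconstructed from the description's symbol list."""
    return f1 * (f2 * b - k * x_l) / (f2 * x_r + f1 * x_l)

def target_angle(d: float, b: float) -> float:
    """Angle (degrees) by which the laser axis must tilt so the spot
    lands on a target at distance d when the axes are offset by b."""
    return math.degrees(math.atan2(b, d))
```

With f1 = f2 = 50, k = 0, b = 100 and x_r = x_l = 0.5 (same unit), the relation gives d = 5000, and the required tilt is just over one degree, consistent with the small light-emitting angles of fig. 4.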
In addition, it should be noted that fig. 4 shows a target object 1 and a target object 2; because the first distance d between the imaging plane of the first image acquisition device and the target object differs between them, the target light-emitting angle varies with the first distance d. The laser lens emits the laser light that supplements illumination for the first image acquisition device when it photographs the target object; the target light-emitting angle is the angle of the laser emitted by the laser lens, and the laser lens can be adjusted to the corresponding position by the laser angle fine-tuning device shown in fig. 4.
In an optional embodiment, if the judgment result is yes, that is, if the current light-emitting angle of the laser lens is already the target light-emitting angle, the laser device is controlled to adjust the laser spot to a preset size.
Fig. 5 is a schematic diagram of an alternative laser spot according to an embodiment of the present invention. As shown in fig. 5, the laser spot in the picture is not a complete circle; it is quite likely that only part of the circle appears in the frame. Since horizontal calibration of the laser spot has already been completed at this point, only the size of the laser spot and its angle in the vertical direction need to be adjusted. The edge of the laser spot is extracted with an edge detection algorithm, the image data are then binarized, and the circle closest to the laser spot, with its radius and centre, is fitted with a circle fitting algorithm.
As an optional embodiment, the optimal size of the laser spot can be calculated from the relationship between the fitted radius and the field-of-view size; the controller then compares the fitted centre and radius with the optimal laser spot size to derive the current control strategy. The apparatus controls the stepping motor on the laser lens according to this strategy: the motor drives the lens group to move along the spiral groove of the laser lens, adjusting the optical system to a suitable position so that the laser spot is at its optimal position and its size matches the target in the field of view.
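The circle-fitting step can be sketched with an algebraic least-squares (Kasa) fit of the edge-point coordinates. This is one possible realisation, not necessarily the patent's algorithm; the function name and use of NumPy are assumptions:

```python
import numpy as np

def fit_circle(xs: np.ndarray, ys: np.ndarray) -> tuple:
    """Least-squares (Kasa) circle fit to edge points.

    Solves x^2 + y^2 = 2*a*x + 2*b*y + c for (a, b, c) in the
    least-squares sense; the centre is (a, b) and the radius is
    sqrt(c + a^2 + b^2). Works even when only an arc is visible."""
    A = np.column_stack([2 * xs, 2 * ys, np.ones_like(xs)])
    rhs = xs**2 + ys**2
    (a, b, c), *_ = np.linalg.lstsq(A, rhs, rcond=None)
    return a, b, np.sqrt(c + a * a + b * b)
```

Because the fit is linear, it recovers the centre and radius even from the partial arc that remains when only part of the spot appears in the frame.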
In an alternative embodiment, the night vision system in the present application may calculate, according to the current field-of-view size, the zoom value and the focus value of the infrared zoom lens that match the visible-light camera. In addition, the night vision system can calculate, from the field-of-view size and the first distance d, the included angle between the optical axis (main optical axis) of the infrared zoom lens and the optical axis (main optical axis) of the visible-light zoom lens. That is, based on the above alternative embodiments, the night vision system can determine the best-matching focus value and zoom value of the infrared zoom lens and the best angle value of the infrared angle fine-tuning device. After this operation is finished, the DSP high-speed fusion circuit transmits the calculated angle value to the controller over a data line using the RS485 level protocol.
In an optional embodiment, the controller receives the angle value calculated by the DSP high-speed fusion circuit and samples the focus value and zoom value of the infrared zoom lens to determine the current light-emitting angle of the laser lens in the laser device. The current light-emitting angle is then compared with the target light-emitting angle in the sampling result to judge whether the two coincide; if they do not, the stepping motor is driven in the direction that reduces the difference between the current light-emitting angle and the target light-emitting angle, until that difference falls within a set range.
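The closed-loop adjustment described in this paragraph can be sketched as a simple stepping loop. The step size, tolerance and function name below are hypothetical, since the patent gives no numeric values:

```python
def drive_to_angle(current_deg: float, target_deg: float,
                   step_deg: float = 0.05, tolerance_deg: float = 0.1,
                   max_steps: int = 10_000) -> float:
    """Step the motor in the direction that shrinks the error until the
    lens angle is within the set tolerance of the target angle.

    step_deg and tolerance_deg are illustrative placeholders for the
    motor resolution and the "set range" mentioned in the description.
    """
    steps = 0
    while abs(target_deg - current_deg) > tolerance_deg and steps < max_steps:
        current_deg += step_deg if target_deg > current_deg else -step_deg
        steps += 1
    return current_deg
```

In a real controller, current_deg would be re-sampled from the lens encoder on each iteration rather than accumulated in software.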
In an alternative embodiment, fig. 6 is a flowchart illustrating steps of an alternative method for controlling a night vision system according to an embodiment of the present invention, as shown in fig. 6, after extracting first image characteristic data acquired by the first image acquisition device and second image characteristic data acquired by the second image acquisition device, the method further includes:
step S302, judging whether the first image characteristic data is matched with the second image characteristic data;
step S304, if the first image characteristic data matches the second image characteristic data, a first distance between an imaging plane of the first image capturing device and a target object and a second distance between an optical axis of the first image capturing device and an optical axis of the laser lens are obtained.
As an alternative embodiment, when it is detected that the optical axis of the first image capturing device is parallel to the optical axis of the second image capturing device, the first image feature data captured by the first image capturing device and the second image feature data captured by the second image capturing device may be obtained, and when it is determined that the first image feature data matches the second image feature data, the first distance between the imaging plane of the first image capturing device and the target object and the second distance between the optical axis of the first image capturing device and the optical axis of the laser lens may be obtained.
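One illustrative way to implement the match decision in the step above is a normalized cross-correlation test between the two sets of feature data. This criterion and the threshold are purely assumptions — the patent does not specify how the feature data are compared:

```python
import numpy as np

def features_match(feat_a: np.ndarray, feat_b: np.ndarray,
                   threshold: float = 0.8) -> bool:
    """Decide whether two feature vectors match by comparing their
    normalized cross-correlation against a threshold (assumed criterion)."""
    a = (feat_a - feat_a.mean()) / (feat_a.std() + 1e-12)
    b = (feat_b - feat_b.mean()) / (feat_b.std() + 1e-12)
    # Mean of the element-wise product of standardized vectors is the
    # Pearson correlation; 1.0 means identical up to scale and offset.
    return float(np.mean(a * b)) >= threshold
```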
In an alternative embodiment, fig. 7 is a flowchart illustrating steps of an alternative method for controlling a night vision system according to an embodiment of the present invention, where, after controlling the laser lens to adjust to a position corresponding to the target light-emitting angle, as shown in fig. 7, the method further includes:
step S402, acquiring first image data acquired by the first image acquisition device and second image data acquired by the second image acquisition device;
step S404, an image fusion algorithm is adopted to perform fusion processing on the first image data and the second image data to obtain third image data; and outputting the third image data.
In an optional implementation, once the laser angle fine-tuning device has adjusted the laser lens to the position corresponding to the target light-emitting angle, or the controller detects that the difference between the current light-emitting angle of the laser lens in the laser device and the target light-emitting angle is within the set range, the controller may send a trigger signal to the DSP high-speed fusion circuit. The DSP high-speed fusion circuit then reads each frame captured by the visible light camera and the infrared thermal imaging camera through the video decoder to obtain the first image data and the second image data.
As an alternative embodiment, the first image data and the second image data may be fused by a pyramid decomposition fusion algorithm or an odd-even segmentation fusion algorithm to obtain third image data, and the third image data may be displayed on a display.
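The pyramid decomposition fusion mentioned above can be sketched as a generic Laplacian-pyramid fusion in NumPy. The downsampling (2×2 mean pooling), upsampling (nearest neighbour), and detail-selection rule (larger absolute coefficient wins) are simplified assumptions, not the patent's actual DSP implementation:

```python
import numpy as np

def _down(img):
    # 2x2 mean pooling (simple stand-in for Gaussian pyramid reduction)
    h, w = img.shape
    return img[:h // 2 * 2, :w // 2 * 2].reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

def _up(img, shape):
    # Nearest-neighbour upsampling back to `shape`
    up = img.repeat(2, axis=0).repeat(2, axis=1)
    return up[:shape[0], :shape[1]]

def pyramid_fuse(a, b, levels=3):
    """Fuse two registered single-channel images: at each Laplacian level
    keep the coefficient with the larger magnitude, average the coarse base."""
    lap_a, lap_b = [], []
    for _ in range(levels):
        da, db = _down(a), _down(b)
        lap_a.append(a - _up(da, a.shape))
        lap_b.append(b - _up(db, b.shape))
        a, b = da, db
    fused = (a + b) / 2.0  # coarsest level: simple average
    for la, lb in zip(reversed(lap_a), reversed(lap_b)):
        detail = np.where(np.abs(la) >= np.abs(lb), la, lb)
        fused = _up(fused, detail.shape) + detail
    return fused
```

In a real system the two frames would first be registered (the optical-axis alignment steps above serve exactly this purpose) before being fused per frame.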
Optionally, the image fusion algorithm may be a pyramid decomposition fusion algorithm, which is also called a tower decomposition fusion algorithm, or an odd-even segmentation fusion algorithm.
As an alternative embodiment, an image-based optical-axis smart-adjustment night vision fusion system further provided in the present application may include, but is not limited to: an infrared camera, an infrared zoom lens, an infrared angle fine-tuning device, a low-illumination camera, a visible light zoom lens, a laser driver, a laser lens, a laser angle fine-tuning device, a controller, a photosensitive controller, a DSP high-speed fusion circuit, and a display.
In an alternative embodiment, the infrared zoom lens and the infrared camera are connected through a connecting ring and fixed on the infrared angle fine-tuning device through a lens bracket. The infrared zoom lens communicates with the control circuit through a position feedback potentiometer. The infrared camera is connected to the DSP high-speed fusion circuit through a signal line and communicates with the control circuit through a signal line. The infrared angle fine-tuning device is connected to the controller through a signal line; it captures the position of the optical axis of the infrared imaging system and transmits it back to the controller, and the controller controls the infrared angle fine-tuning device according to the returned angle.
In an alternative embodiment, the low-light camera and the visible light zoom lens are connected through a connecting ring and fixed on the bottom plate through a hoop. The visible light zoom lens is communicated with the controller through a position feedback potentiometer; the low-illumination camera is connected with the DSP high-speed fusion circuit through a signal line and communicated with the controller through the signal line.
In an alternative embodiment, the laser driver is connected to the laser through a power supply line. The laser output is led out through a switching optical fiber, which is connected to the laser lens through a flange. The optical system in the laser lens adjusts the size of the laser spot and distributes the laser energy uniformly across the spot to homogenize the laser. The laser lens is fixed on the laser angle fine-tuning device through a lens bracket. The laser driver and the laser angle fine-tuning device are each connected to the controller through a signal line, and the laser lens communicates with the controller through a position feedback potentiometer.
In an alternative embodiment, the DSP high-speed fusion circuit may communicate with the controller, and may transmit the fused image data to a display for displaying through a signal line.
Example 2
An embodiment of the present invention further provides an apparatus for implementing the method for controlling a night vision system, fig. 8 is a schematic structural diagram of an apparatus for controlling a night vision system according to an embodiment of the present invention, and as shown in fig. 8, the apparatus for controlling a night vision system includes: a first determination module 10, an acquisition module 12, a second determination module 14, and a control module 16, wherein,
the first determining module 10 is configured to determine a second distance between the optical axis of the first image capturing device and the optical axis of the laser lens; the obtaining module 12 is configured to obtain a first distance between an imaging plane of the first image capturing device and the target object based on the second distance; the second determining module 14 is configured to determine a target light-emitting angle of the laser lens according to the first distance and the second distance, where the laser lens is configured to emit laser light for supplementing light to the first image capturing device when the first image capturing device captures a target object; and the control module 16 is used for controlling the laser lens to adjust to a position corresponding to the target light-emitting angle.
In the embodiment of the present invention, the first determining module 10 is configured to determine a second distance between the optical axis of the first image acquisition device and the optical axis of the laser lens; the acquiring module 12 is configured to acquire a first distance between an imaging plane of the first image acquisition device and the target object based on the second distance; the second determining module 14 is configured to determine a target light-emitting angle of the laser lens according to the first distance and the second distance, where the laser lens is configured to emit laser light for supplementing light to the first image acquisition device when the first image acquisition device captures the target object; and the control module 16 is configured to control the laser lens to adjust to the position corresponding to the target light-emitting angle. In this way, the laser spot can be made to irradiate the center of the field of view of the low-illumination camera by controlling the rotation of the laser lens, which improves the efficiency and light-supplement rate of laser illumination and solves the technical problem in the prior art that a night vision system cannot adjust the angle of the laser lens in real time, resulting in low laser illumination efficiency.
It should be noted that the above modules may be implemented by software or hardware. In the latter case, for example, the modules may all be located in the same processor, or they may be distributed across different processors in any combination.
It should be noted that the first determining module 10, the obtaining module 12, the second determining module 14 and the control module 16 correspond to steps S102 to S108 in embodiment 1, and the modules are the same as the corresponding steps in the implementation example and application scenario, but are not limited to the disclosure in embodiment 1. It should be noted that the modules described above may be implemented in a computer terminal as part of an apparatus.
It should be noted that, reference may be made to the relevant description in embodiment 1 for alternative or preferred embodiments of this embodiment, and details are not described here again.
The above-described arrangement for controlling a night vision system may further comprise a processor and a memory, the above-described first determining module 10, the acquiring module 12, the second determining module 14, the control module 16, etc. being stored in the memory as program elements, the processor executing the above-described program elements stored in the memory to implement the respective functions.
The processor comprises one or more kernels, and a kernel calls the corresponding program unit from the memory. The memory may include volatile memory in a computer-readable medium, such as random access memory (RAM), and/or nonvolatile memory, such as read-only memory (ROM) or flash memory (flash RAM); the memory includes at least one memory chip.
The embodiment of the application also provides a storage medium. Optionally, in this embodiment, the storage medium includes a stored program, and the device on which the storage medium is located is controlled to execute any one of the above methods for controlling a night vision system when the program runs.
Optionally, in this embodiment, the storage medium may be located in any one of computer terminals in a computer terminal group in a computer network, or in any one of mobile terminals in a mobile terminal group.
The embodiment of the application also provides a processor. Optionally, in this embodiment, the processor is configured to execute a program, where the program executes any one of the above methods for controlling a night vision system.
The embodiment of the application provides equipment, the equipment comprises a processor, a memory and a program which is stored on the memory and can run on the processor, and the following steps are realized when the processor executes the program: determining a second distance between the optical axis of the first image acquisition device and the optical axis of the laser lens; acquiring a first distance between an imaging plane of the first image acquisition device and a target object based on the second distance; determining a target light-emitting angle of the laser lens according to the first distance and the second distance, wherein the laser lens is used for emitting laser for supplementing light for the first image acquisition device when the first image acquisition device shoots the target object; and controlling the laser lens to adjust to the position corresponding to the target light-emitting angle.
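The four steps the processor executes can be sketched as a skeleton control cycle. The four callables passed in are hypothetical placeholders for the hardware and DSP routines described elsewhere in the application, not actual APIs:

```python
def run_control_cycle(measure_axis_offset, measure_target_distance,
                      compute_angle, drive_laser_lens):
    """Skeleton of the claimed method: determine the second distance,
    derive the first distance from it, compute the target light-emitting
    angle, and adjust the laser lens accordingly."""
    b = measure_axis_offset()          # step 1: second distance (axis offset)
    d = measure_target_distance(b)     # step 2: first distance, based on b
    theta = compute_angle(d, b)        # step 3: target light-emitting angle
    drive_laser_lens(theta)            # step 4: adjust to the target position
    return d, b, theta
```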
Optionally, when the processor executes a program, it may further detect whether an optical axis of the first image capturing device is parallel to an optical axis of the second image capturing device; if the detection result is negative, adjusting the optical axis of the first image acquisition device to be parallel to the optical axis of the second image acquisition device; and if the detection result is positive, extracting first image characteristic data acquired by the first image acquisition device and second image characteristic data acquired by the second image acquisition device.
Optionally, when the processor executes a program, it may further determine whether the first image feature data matches the second image feature data; and if the first image characteristic data is matched with the second image characteristic data, acquiring a first distance between an imaging plane of the first image acquisition device and a target object and a second distance between an optical axis of the first image acquisition device and an optical axis of the laser lens.
Optionally, when the processor executes a program, first image data acquired by the first image acquisition device and second image data acquired by the second image acquisition device may also be acquired; performing fusion processing on the first image data and the second image data by adopting an image fusion algorithm to obtain third image data; and outputting the third image data.
Optionally, when the processor executes a program, the first image capturing device is configured to capture image data of the target object in a near infrared band, and the second image capturing device is configured to capture image data of the target object in a far infrared band.
Optionally, when the processor executes the program, the first distance between the imaging plane of the first image capturing device and the target object may be obtained according to the following formula:
[Formula published as an image in the original document (GDA0002823344900000131); it gives the first distance d in terms of f1, f2, k, b, xr, and xl.]
wherein d is the first distance; f1 is the focal length of the lens of the first image acquisition device, and f2 is the focal length of the lens of the second image acquisition device; k is the distance between the imaging plane of the first image acquisition device and the imaging plane of the second image acquisition device; b is the second distance between the optical axis of the first image acquisition device and the optical axis of the laser lens; xr is the distance between the imaging of the energy center of gravity of the target object in the imaging plane of the first image acquisition device and the optical axis; and xl is the distance between the imaging of the energy center of gravity of the target object in the imaging plane of the second image acquisition device and the optical axis.
Optionally, when the processor executes the program, the target light-emitting angle may be determined according to the first distance and the second distance by the following formula:
[Formula published as an image in the original document (GDA0002823344900000132); it gives the target light-emitting angle θ in terms of the first distance and the second distance.]
where θ is the target light-emitting angle.
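Since the angle formula itself is published only as an image and is not reproduced in this extraction, the following is purely an assumed geometric stand-in: for a laser axis offset laterally by the second distance b from the camera axis and converging on a target at the first distance d, a plausible relation is θ = arctan(b/d):

```python
import math

def target_light_emitting_angle(d: float, b: float) -> float:
    """Angle (degrees) the laser axis would tilt so that a beam offset by b
    from the camera axis converges on a target at distance d.
    Assumed relation only -- the patent's formula is an unreproduced image."""
    return math.degrees(math.atan2(b, d))
```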
The present application further provides a computer program product adapted to perform a program for initializing the following method steps when executed on a data processing device: determining a second distance between the optical axis of the first image acquisition device and the optical axis of the laser lens; acquiring a first distance between an imaging plane of the first image acquisition device and a target object based on the second distance; determining a target light-emitting angle of the laser lens according to the first distance and the second distance, wherein the laser lens is used for emitting laser for supplementing light for the first image acquisition device when the first image acquisition device shoots the target object; and controlling the laser lens to adjust to the position corresponding to the target light-emitting angle.
Optionally, when the computer program product executes a program, it may further detect whether an optical axis of the first image capturing device is parallel to an optical axis of the second image capturing device; if the detection result is negative, adjusting the optical axis of the first image acquisition device to be parallel to the optical axis of the second image acquisition device; and if the detection result is positive, extracting first image characteristic data acquired by the first image acquisition device and second image characteristic data acquired by the second image acquisition device.
Optionally, when the computer program product executes a program, it may further determine whether the first image feature data matches the second image feature data; and if the first image characteristic data is matched with the second image characteristic data, acquiring a first distance between an imaging plane of the first image acquisition device and a target object and a second distance between an optical axis of the first image acquisition device and an optical axis of the laser lens.
Optionally, when the computer program product executes a program, first image data acquired by the first image acquisition device and second image data acquired by the second image acquisition device may also be acquired; performing fusion processing on the first image data and the second image data by adopting an image fusion algorithm to obtain third image data; and outputting the third image data.
Optionally, when the computer program product executes a program, the first image capturing device is configured to capture image data of the target object in a near infrared band, and the second image capturing device is configured to capture image data of the target object in a far infrared band.
Optionally, when the computer program product executes a program, the first distance between the imaging plane of the first image capturing device and the target object may be obtained according to the following formula:
[Formula published as an image in the original document (GDA0002823344900000141); it gives the first distance d in terms of f1, f2, k, b, xr, and xl.]
wherein d is the first distance; f1 is the focal length of the lens of the first image acquisition device, and f2 is the focal length of the lens of the second image acquisition device; k is the distance between the imaging plane of the first image acquisition device and the imaging plane of the second image acquisition device; b is the second distance between the optical axis of the first image acquisition device and the optical axis of the laser lens; xr is the distance between the imaging of the energy center of gravity of the target object in the imaging plane of the first image acquisition device and the optical axis; and xl is the distance between the imaging of the energy center of gravity of the target object in the imaging plane of the second image acquisition device and the optical axis.
Optionally, when the computer program product executes a program, the target light-emitting angle may be determined according to the first distance and the second distance by the following formula:
[Formula published as an image in the original document (GDA0002823344900000142); it gives the target light-emitting angle θ in terms of the first distance and the second distance.]
where θ is the target light-emitting angle.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
In the above embodiments of the present invention, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
In the embodiments provided in the present application, it should be understood that the disclosed technology can be implemented in other ways. The above-described embodiments of the apparatus are merely illustrative, and for example, the division of the units may be a logical division, and in actual implementation, there may be another division, for example, multiple units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, units or modules, and may be in an electrical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. The aforementioned storage medium includes: a USB flash drive, a read-only memory (ROM), a random access memory (RAM), a removable hard disk, a magnetic disk, an optical disk, and various other media capable of storing program code.
The foregoing is only a preferred embodiment of the present invention, and it should be noted that, for those skilled in the art, various modifications and improvements can be made without departing from the principle of the present invention, and these modifications and improvements should also be regarded as falling within the protection scope of the present invention.

Claims (8)

1. A method of controlling a night vision system, comprising:
determining a second distance between the optical axis of the first image acquisition device and the optical axis of the laser lens;
acquiring a first distance between an imaging plane of the first image acquisition device and a target object based on the second distance;
determining a target light-emitting angle of the laser lens according to the first distance and the second distance, wherein the laser lens is used for emitting laser for supplementing light for the first image acquisition device when the first image acquisition device shoots the target object;
controlling the laser lens to adjust to a position corresponding to the target light-emitting angle;
wherein, prior to obtaining the first distance and the second distance, the method further comprises:
detecting whether the optical axis of the first image acquisition device is parallel to the optical axis of the second image acquisition device;
if the detection result is negative, adjusting the optical axis of the first image acquisition device to be parallel to the optical axis of the second image acquisition device;
if the detection result is yes, extracting first image characteristic data acquired by the first image acquisition device and second image characteristic data acquired by the second image acquisition device;
acquiring a first distance between an imaging plane of the first image acquisition device and a target object based on the second distance:
[Formula published as an image in the original document (FDA0002823344890000011); it gives the first distance d in terms of f1, f2, k, b, xr, and xl.]
wherein d is the first distance; f1 is the focal length of the lens of the first image acquisition device, and f2 is the focal length of the lens of the second image acquisition device; k is the distance between the imaging plane of the first image acquisition device and the imaging plane of the second image acquisition device; b is the second distance; xr is the distance between the imaging of the energy center of gravity of the target object in the imaging plane of the first image acquisition device and the optical axis of the first image acquisition device, and xl is the distance between the imaging of the energy center of gravity of the target object in the imaging plane of the second image acquisition device and the optical axis of the second image acquisition device;
wherein the second image acquisition device comprises the laser lens.
2. The method of claim 1, wherein after extracting first image feature data acquired by the first image acquisition device and second image feature data acquired by the second image acquisition device, the method further comprises:
judging whether the first image characteristic data is matched with the second image characteristic data;
and if the first image characteristic data is matched with the second image characteristic data, executing the determination of the second distance between the optical axis of the first image acquisition device and the optical axis of the laser lens.
3. The method of claim 1, wherein the first image capturing device is configured to capture image data of the target object in a near infrared band, and the second image capturing device is configured to capture image data of the target object in a far infrared band.
4. The method of claim 1, wherein the second distance between the optical axis of the first image capture device and the optical axis of the laser lens is determined by the following equation:
xr + xr1 + xl + xl1 = b;
wherein xr1 is the distance between the imaging of the energy center of gravity of the target object in the imaging plane of the first image acquisition device and a first predetermined parallel line, and xl1 is the distance between the imaging of the energy center of gravity of the target object in the imaging plane of the second image acquisition device and a second predetermined parallel line, wherein the first predetermined parallel line and the second predetermined parallel line are both parallel to the optical axis of the first image acquisition device.
5. The method of claim 4, wherein the target light extraction angle is determined from the first distance and the second distance by the formula:
[Formula published as an image in the original document (FDA0002823344890000021); it gives the target light-emitting angle θ in terms of the first distance and the second distance.]
and theta is the target light-emitting angle.
6. An arrangement for controlling a night vision system, comprising:
the first determining module is used for determining a second distance between the optical axis of the first image acquisition device and the optical axis of the laser lens;
the acquisition module is used for acquiring a first distance between an imaging plane of the first image acquisition device and a target object based on the second distance;
a second determining module, configured to determine a target light-emitting angle of the laser lens according to the first distance and the second distance, where the laser lens is configured to emit laser light for supplementing light to the first image capturing device when the first image capturing device captures the target object;
the control module is used for controlling the laser lens to adjust to a position corresponding to the target light-emitting angle;
wherein the apparatus further comprises:
the detection module is used for detecting whether the optical axis of the first image acquisition device is parallel to the optical axis of the second image acquisition device;
the adjusting module is used for adjusting the optical axis of the first image acquisition device to be parallel to the optical axis of the second image acquisition device if the detection result is negative;
the extraction module is used for extracting first image characteristic data acquired by the first image acquisition device and second image characteristic data acquired by the second image acquisition device if the detection result is positive;
acquiring a first distance between an imaging plane of the first image acquisition device and a target object based on the second distance:
[Formula published as an image in the original document (FDA0002823344890000031); it gives the first distance d in terms of f1, f2, k, b, xr, and xl.]
wherein d is the first distance; f1 is the focal length of the lens of the first image acquisition device, and f2 is the focal length of the lens of the second image acquisition device; k is the distance between the imaging plane of the first image acquisition device and the imaging plane of the second image acquisition device; b is the second distance; xr is the distance between the imaging of the energy center of gravity of the target object in the imaging plane of the first image acquisition device and the optical axis of the first image acquisition device, and xl is the distance between the imaging of the energy center of gravity of the target object in the imaging plane of the second image acquisition device and the optical axis of the second image acquisition device;
wherein the second image acquisition device comprises the laser lens.
7. A storage medium characterized in that it comprises a stored program, wherein the program is executed by a computer processor to implement the method of controlling a night vision system of any one of claims 1 to 5.
8. A processor, characterized in that the processor is adapted to run a program, wherein the program is run to perform the method of controlling a night vision system of any one of claims 1 to 5.
CN201810103623.XA 2018-02-01 2018-02-01 Method and device for controlling night vision system, storage medium and processor Active CN108234897B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810103623.XA CN108234897B (en) 2018-02-01 2018-02-01 Method and device for controlling night vision system, storage medium and processor


Publications (2)

Publication Number Publication Date
CN108234897A CN108234897A (en) 2018-06-29
CN108234897B true CN108234897B (en) 2021-03-16


Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109474789B (en) * 2018-10-30 2020-09-08 维沃移动通信(杭州)有限公司 Method for adjusting field angle of fill-in lamp and mobile terminal
CN109541590B (en) * 2018-12-19 2020-07-10 北京科技大学 Blast furnace burden surface point cloud imaging method
CN112954152A (en) * 2020-12-30 2021-06-11 神思电子技术股份有限公司 System and method for eliminating light reflection of laser camera
CN114339209B (en) * 2021-12-29 2023-11-14 苏州浪潮智能科技有限公司 Method for measuring and calculating image inclination value, image acquisition system, device and medium
CN115499595B (en) * 2022-10-13 2023-06-30 长沙观谱红外科技有限公司 Image acquisition system based on visible light and infrared light dual imaging

Citations (5)

Publication number Priority date Publication date Assignee Title
CN104079883A (en) * 2014-07-07 2014-10-01 山东神戎电子股份有限公司 Laser night-vision device illuminator presetting bit implementation method based on stepping motor
CN104780306A (en) * 2015-04-22 2015-07-15 山东神戎电子股份有限公司 Laser night vision device capable of adjusting light spots at different distances to be aligned with view field center, and adjusting method
CN105301870A (en) * 2015-12-04 2016-02-03 蒋涛 Human eye safe laser lighting night vision system
CN105300175A (en) * 2015-10-30 2016-02-03 北京艾克利特光电科技有限公司 Night-vision sighting device fusing infrared and low-light imaging
CN105676564A (en) * 2016-03-25 2016-06-15 山东华光光电子股份有限公司 Laser night-vision device with target distance measuring and positioning functions

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2017055296A (en) * 2015-09-10 2017-03-16 株式会社東芝 Wearable imaging apparatus

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104079883A (en) * 2014-07-07 2014-10-01 山东神戎电子股份有限公司 Stepping-motor-based method for implementing illuminator preset positions in a laser night-vision device
CN104780306A (en) * 2015-04-22 2015-07-15 山东神戎电子股份有限公司 Laser night-vision device that aligns light spots at different distances with the field-of-view center, and adjustment method
CN105300175A (en) * 2015-10-30 2016-02-03 北京艾克利特光电科技有限公司 Night-vision sighting device fusing infrared and low-light imaging
CN105301870A (en) * 2015-12-04 2016-02-03 蒋涛 Human eye safe laser lighting night vision system
CN105676564A (en) * 2016-03-25 2016-06-15 山东华光光电子股份有限公司 Laser night-vision device with target distance measuring and positioning functions

Also Published As

Publication number Publication date
CN108234897A (en) 2018-06-29

Similar Documents

Publication Publication Date Title
CN108234897B (en) Method and device for controlling night vision system, storage medium and processor
US7181049B2 (en) Authentication object image-pickup method and device therefor
CN104065859B (en) A kind of acquisition methods and camera head of full depth image
CN102036005B (en) The imager of image is caught in process
US9336439B2 (en) System and method for the long range acquisition of iris images from stationary and mobile subjects
KR101716036B1 (en) Fire surveillance apparatus
US20130258089A1 (en) Eye Gaze Based Image Capture
WO2015001550A1 (en) Method and system for selective imaging of objects in a scene to yield enhanced
JP2008527806A (en) Night monitoring system and method
KR100977499B1 (en) Iris image acquisition system using panning and tilting of mirror at a long distance
CN109451233B (en) Device for collecting high-definition face image
JP2010107114A (en) Imaging device and air conditioner
CN205539525U (en) Automatic seek system of camera
CN101430477B (en) Method for judging object distance
CN110769148A (en) Camera automatic control method and device based on face recognition
CN109614909B (en) Iris acquisition equipment and method for extending acquisition distance
CN105376522A (en) Infrared thermal imager with optical warning device and application thereof
CN106599779A (en) Human ear recognition method
CN113627385A (en) Method and device for detecting sight direction, detection system and readable storage medium thereof
CN108449547B (en) Method for controlling a night vision system, storage medium and processor
JP5397078B2 (en) Imaging device
WO2021199168A1 (en) Photographing system, photographing method, and non-transitory computer-readable medium storing photographing program
KR20220023237A (en) Fast zoom and focusing apparatus predicting for high-speed tracking and predicting moving object for camera for providing a high-quality image seamlessly and method for high-speed tracking and predicting moving object for camera using the same
CN108366210B (en) Unmanned aerial vehicle and unmanned aerial vehicle control method
JP6780543B2 (en) Image imaging system and image imaging device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB03 Change of inventor or designer information

Inventor after: Jiang Tao

Inventor after: Zuo Fang

Inventor after: Wang Xintao

Inventor after: Li Zeyi

Inventor before: Zuo Fang

Inventor before: Jiang Tao

Inventor before: Wang Xintao

Inventor before: Li Zeyi

TA01 Transfer of patent application right

Effective date of registration: 20200730

Address after: 100083 No. 30, Xueyuan Road, Haidian District, Beijing

Applicant after: Jiang Tao

Applicant after: Zuo Fang

Address before: 100096 Beijing Haidian District Xisanqi street, Teng Jianhua business building 6 floor 606

Applicant before: BEIJING JIGUANG TONGDA TECHNOLOGY Co.,Ltd.

GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20210729

Address after: 102208 Room 501, building 5, No. 97, Changping Road, science and Technology Park, Changping District, Beijing

Patentee after: BEIJING JIGUANG TONGDA TECHNOLOGY Co.,Ltd.

Address before: 100083 No. 30, Xueyuan Road, Haidian District, Beijing

Patentee before: Jiang Tao

Patentee before: Zuo Fang