CN109887025B - Monocular self-adjusting fire point three-dimensional positioning method and device - Google Patents


Publication number
CN109887025B
Authority
CN
China
Prior art keywords: point, fire, coordinates, calculation, fire point
Legal status
Expired - Fee Related
Application number
CN201910096126.6A
Other languages
Chinese (zh)
Other versions
CN109887025A (en)
Inventor
王剑松
李响
鑫龙
刘美汐
孟嘉
李思毅
Current Assignee
Shenyang Ligong University
Original Assignee
Shenyang Ligong University
Application filed by Shenyang Ligong University
Priority to CN201910096126.6A
Publication of CN109887025A
Application granted
Publication of CN109887025B

Landscapes

  • Fire-Detection Mechanisms (AREA)
  • Image Processing (AREA)

Abstract

A monocular self-adjusting method and device for three-dimensional fire point positioning solve the problems of the existing binocular image method: the binocular positions are fixed, the solving precision is limited, the system cannot adapt when a shooting position is occluded, and the equipment cost is high. The method uses an infrared thermal imager for image acquisition, with a stepping motor driving a ball screw that moves the monocular image acquisition equipment. A controller provides the control means; a control system running on a PC, together with matching algorithms for camera calibration, shooting-distance calculation, and thermal-image fire point three-dimensional coordinate calculation, realizes automatic fire early warning while outputting the three-dimensional spatial coordinates of the fire point. The method and device can be applied to three-dimensional fire point positioning in limited spaces such as indoor spaces, vehicles, and ships; they offer strong environmental adaptability and high positioning accuracy and can effectively save cost.

Description

Monocular self-adjusting fire point three-dimensional positioning method and device
Technical Field
The invention belongs to the technical field of mechanical and electronic engineering, and particularly relates to a monocular self-adjusting fire point three-dimensional positioning method and device that can be applied to three-dimensional fire point positioning in limited spaces such as indoor spaces, vehicles, and ships, offering strong environmental adaptability, high positioning accuracy, and effective cost savings.
Background
For an automatic fire extinguishing system, accurately locating the three-dimensional position of a fire point at the initial stage of a fire is critical: it directly determines the aiming accuracy of the fire extinguishing equipment and plays a key role in controlling the fire while it is still small.
Fire point positioning in limited spaces such as indoors or inside vehicles and ships requires fast system response and accurate position output so as to guide fire extinguishing equipment efficiently and precisely; at the same time, such positioning systems must in most cases be low-cost. Large-space fire point positioning methods and systems are therefore unsuitable, for example: "accurate fire extinguishing system and method based on unmanned aerial vehicle and intelligent fire-fighting robot" (publication number CN107899166A), "infrared monitoring and early warning method of forest adaptive cruise aerial unmanned aerial vehicle" (publication number CN107481465A), and "high voltage power transmission corridor mountain fire positioning method based on cradle head attitude angle" (publication number CN106289531A).
At present, fire point positioning systems suitable for limited spaces fall into two main types: sensing-scanning and image-based. The technical solution disclosed in "a fire detection positioning device" (publication number CN206896650U) is a typical sensing-scanning fire point positioning system. Although it has the advantage of low cost, it can only realize two-dimensional positioning and cannot obtain the three-dimensional spatial position of the fire source; its ability to guide fire extinguishing equipment onto a target is limited, its adaptability to complex environments is low, and its response to changes in the fire is insufficient. Image-based fire point positioning systems mainly use either infrared thermal imaging (such as "fire positioning method based on infrared thermal imaging and laser guidance technology", publication number CN106169217A) or visual identification (such as the precise positioning method for large-space building fires in Journal of Jilin University, 2016, 46(06)), each with its own advantages and disadvantages. The infrared thermography method is more accurate and is the current mainstream; however, since the images acquired by video equipment are two-dimensional, a general monocular image-based fire point positioning system cannot realize three-dimensional spatial positioning.
The established way to realize three-dimensional fire point positioning is the binocular image method, for example "a fire monitoring and positioning device and a fire monitoring and positioning method based on binocular cameras" (publication number CN105741481B). This method uses two image acquisition devices to photograph the fire point and solves for its spatial position by comparing the image difference. Although this type of method can obtain the three-dimensional coordinates of a fire point, it has the following problems: (1) the two camera positions are fixed and generally close together, so the solving precision is limited; (2) in a complex environment, the method cannot adapt when the current shooting position is occluded; (3) infrared thermal imaging equipment is expensive, and arranging two units at a single measuring point increases cost. The prior-art three-dimensional fire location methods therefore need improvement.
Disclosure of Invention
Aiming at the problems, the invention provides a monocular self-adjusting fire point three-dimensional positioning method and a monocular self-adjusting fire point three-dimensional positioning device which can be applied to fire point three-dimensional positioning in limited spaces such as indoor space, vehicle and ship, have strong environmental adaptability and high positioning accuracy and can effectively save cost.
The technical scheme adopted by the invention is as follows: the monocular self-adjusting fire point three-dimensional positioning method comprises the following steps:
firstly, calibrating and calculating a monocular mobile camera; obtaining a position calibration mode suitable for monocular mobile equipment through graphical calculation and derivation, namely, calibrating through ground four-point coordinates; the calibration algorithm does not need camera parameters, and is a camera calibration mode suitable for monocular mobile equipment;
step two, calculating the shooting distance considering precision and size adaptability; in order to respond to the constantly changing fire, the equipment needs to continually acquire images and calculate in real time; the adjustment of the equipment's sampling position is realized by an image sampling point interval calculation that uses the area of the plane overlapped by the pixel sight lines as its standard;
step three, calculating three-dimensional coordinates of the fire points of the monocular mobile thermal image; and solving the corresponding fire point three-dimensional space coordinate through the high-temperature point coordinate in the thermal image under the condition of setting the distance between the sampling points after the calibration is finished.
The first step is that the main purpose of the calibration calculation of the monocular mobile camera is to obtain a homography mapping matrix formed by considering the internal parameters of the camera and the projection transformation parameters;
let Q be an arbitrary point in the world coordinate system, with coordinates denoted Q(X, Y, Z), and let q be its image point in the image plane, with coordinates denoted q(x, y); if f is the focal length, the pinhole camera model gives:

$$x = f\frac{X}{Z},\qquad y = f\frac{Y}{Z}\tag{1}$$
in the camera's actual intrinsic parameters, image point coordinates are measured in pixels, and since actual pixels on the imager are mostly rectangular, the focal lengths in the x and y directions of the image plane coordinate system are generally unequal; meanwhile, because of processing and assembly errors of the camera, the intersection point of the imaging plane and the optical axis is not always the center of the imaging plane; in addition, because the accuracy required for fire point positioning is relatively low, it is reasonable to neglect image plane distortion caused by lens processing errors; accordingly, the camera intrinsic matrix M is introduced, and, leaving projection transformation aside, the homogeneous mapping relationship between the image point q and the entity point Q is:
$$q=\begin{bmatrix}x\\ y\\ w\end{bmatrix}=MQ=\begin{bmatrix}f_x&0&C_x\\ 0&f_y&C_y\\ 0&0&1\end{bmatrix}\begin{bmatrix}X\\ Y\\ Z\end{bmatrix}\tag{2}$$
where $f_x$ and $f_y$ respectively denote the product of the camera's physical focal length and the pixel cell size of the imager in the x and y directions, $C_x$ and $C_y$ denote the offsets of the principal point from the image-plane origin, and $w = Z$ is the homogeneous term.
In the first step, for a specific object image obtained by each camera, the relative position of the specific object image can be described through rotation and translation coordinate transformation; assuming that the rotation matrix is R and the translation vector is t, it can be obtained from graphics:
$$t=[x_t\ \ y_t\ \ z_t]^T\tag{3}$$

$$R=R_xR_yR_z\tag{4}$$

where the rotation angles about the x, y, and z axes are ψ, φ, and θ respectively; then:

$$R_x=\begin{bmatrix}1&0&0\\ 0&\cos\psi&\sin\psi\\ 0&-\sin\psi&\cos\psi\end{bmatrix},\quad R_y=\begin{bmatrix}\cos\varphi&0&-\sin\varphi\\ 0&1&0\\ \sin\varphi&0&\cos\varphi\end{bmatrix},\quad R_z=\begin{bmatrix}\cos\theta&\sin\theta&0\\ -\sin\theta&\cos\theta&0\\ 0&0&1\end{bmatrix}$$
combining equations (3) and (4) into a single matrix W = [R t] and combining with equation (2), one obtains:

$$\tilde q = s\,M\,W\,\tilde Q\tag{5}$$

where $\tilde q=[x\ \ y\ \ 1]^T$ and $\tilde Q=[X\ \ Y\ \ Z\ \ 1]^T$ are the homogeneous coordinate forms and s is a scale factor.
in this algorithm, the coordinate in the height direction of the calibration plane is set to zero (Y = 0), and equation (5) simplifies to:

$$\tilde q = s\,H\,[X\ \ Z\ \ 1]^T\tag{6}$$
wherein, H is a 3 × 3 matrix which is a homography matrix in the algorithm; the homography matrix is composed of 8 independent parameters, each group of mapping points can provide 2 equations, and it can be known that 4 groups of mapping points are needed to be solved to obtain a matrix H;
the four vertex coordinates of a square calibration plate are adopted for calibration, the space coordinates of vertexes A, B, C and D of the calibration plate and the image plane coordinates of corresponding image plane points a, b, c and D are respectively obtained, and the homography matrix H can be obtained by substituting the space coordinates and the image plane coordinates into a formula (6), so that the calibration is realized.
In the second step, the basic idea of calculating the shooting distance considering the accuracy and the size adaptability is as follows: in a limited space range with fire point positioning requirements, the maximum error of the calculation precision in the depth direction is not greater than the calculation error generated by equipment parameters such as the number of pixels and the like, so that the positioning accuracy is ensured;
first, consider the case where the optical center vector of the camera equipment is parallel to the ground; the imaging plane is then perpendicular to the ground. Let GH be the spatial distance corresponding to a single pixel at a certain depth; at the same depth, the spatial horizontal distance corresponding to a single pixel is unchanged. (a), (b), and (c) denote the equipment pixel points corresponding to GH at several camera positions as the equipment moves along the horizontal direction, i.e., from (a) to (c);
the fire point three-dimensional coordinate is obtained by superposing data from two shooting positions; when the distance between the calculated shooting positions is short, the overlap area of the two shots is large and the calculated positioning precision drops; therefore, to improve positioning accuracy, the distance between the two calculation shots should be enlarged; the method uses the overlap area of the two shots as the standard for measuring the calculation error, requiring that this error not exceed a set standard value so as to meet the precision requirement;
then, the height of the triangle corresponding to the shaded part at the positions (a) and (b) can be found as follows:
$$h=\frac{d\,x_i}{l+x_i}\tag{7}$$

where d is the distance from the optical center to the target depth, l is the distance between the two shooting points, and $x_i$ is the horizontal spatial distance corresponding to a single pixel.
Secondly, the height of a triangle corresponding to the shadow part directly corresponds to the precision of the depth direction; the maximum error of the calculation precision in the set depth direction is not more than the calculation error generated by equipment parameters such as the number of pixels, namely the following conditions are met:
$$\frac{d\,x_i}{l+x_i}\le[h]\tag{8}$$
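As an illustration of this sampling-distance rule, the sketch below assumes the overlap-triangle height takes the form h = d·x_i/(l + x_i), which follows from the two-ray geometry described above (the original formula image is not reproduced in this text, so the exact patented form is an inference); the function names and numbers are hypothetical:

```python
def overlap_triangle_height(d, l, x_i):
    """Depth-direction error h for shooting-point spacing l.

    Two rays from optical centers l apart, through the same single-pixel
    segment x_i at depth d, intersect d*x_i/(l + x_i) short of the plane.
    """
    return d * x_i / (l + x_i)

def min_shooting_distance(d, x_i, h_max):
    """Smallest l for which h <= h_max, rearranged from the bound above."""
    return x_i * (d - h_max) / h_max

# e.g. target depth 10 m, 1 cm of space per pixel, allowed depth error 5 cm:
l_min = min_shooting_distance(10.0, 0.01, 0.05)
```

Enlarging l shrinks h, matching the statement that the distance between the two calculation shots should be enlarged to improve positioning accuracy.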
in the second step, because the camera device needs to continuously shoot in the process to adapt to the change of the field situation, the shooting position is selected in consideration of the influence of fire and field change besides the precision influence, namely, the calculation errors caused by various reasons, even the camera is shielded and other extreme situations; therefore, the problem is solved by adopting an iterative solution mode, namely when the error between the three-dimensional coordinate values of the fire point space obtained by continuous secondary tests is smaller than the error limit, an effective result is output, and the moving distance is calculated by taking the precision as the standard again; if the solving error is larger than the error limit, the effective result is temporarily not output; meanwhile, the driving device continues to move for calculation until the standard is reached and the calculation output is recovered;
the fire point position is fixed when the two images are acquired, and the three-dimensional coordinate calculation precision is measured by taking the two norms of the difference between the two fire point coordinate values solved twice as a standard, namely the two norms meet the following requirements:
$$[\varepsilon]\ge\|K_{i+1}-K_i\|_2\tag{9}$$

where $K_i$ denotes the fire point coordinate value obtained from the images at the i-th and (i−1)-th measuring points.
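The two-norm convergence test of equation (9) is straightforward to express in code; a minimal sketch, with an illustrative function name and invented coordinate values:

```python
import math

def positioning_converged(k_prev, k_next, eps):
    """True when ||K_{i+1} - K_i||_2 <= [eps], i.e. the result may be output."""
    return math.dist(k_prev, k_next) <= eps

# Two successive fire-point solutions 4 mm apart, error limit [eps] = 10 mm:
ok = positioning_converged((1.200, 0.800, 3.050), (1.204, 0.800, 3.050), 0.010)
```

When the check fails, the device keeps moving and recomputing, as described above.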
In the second step, because the shot images correspond to different accuracies at different depths, a more reasonable idea is to measure the measurement accuracy by relative errors, namely:
$$[\varepsilon]\ge\frac{\|K_{i+1}-K_i\|_2}{\|\bar K\|_2}\tag{10}$$

where $\bar K$ denotes the direction vector corresponding to the spatial coordinates of the two calculation results.
Calculating three-dimensional coordinates of the fire point of the monocular mobile thermal image, knowing coordinates of two positions of the thermal imager and coordinates of a corresponding high-temperature point in an image plane, and solving the three-dimensional space coordinates of the actual fire point position, wherein the specific method comprises the following steps:
after camera calibration is finished, the homography mapping matrix H of the camera is obtained; since the calibration plate plane has Y = 0, H simplifies to a 3×3 matrix, so once the thermal imaging equipment acquires the imaging-plane pixel coordinates of the high-temperature point, its projection coordinates in the calibration plane can be solved inversely; from equation (6),

$$[X\ \ Z\ \ 1]^T=\frac{1}{s}\,H^{-1}\,\tilde q\tag{11}$$

and the projection coordinate of the actual high-temperature point in the calibration plane is $Q=[X\ \ 0\ \ Z]^T$.
In step three, let $P_1$ and $P_2$ be two effective shooting positions of the camera; the projection coordinates $Q_1$ and $Q_2$ of the corresponding high-temperature point in the calibration plane are obtained through equation (11). The points $P_1$, $Q_1$, $P_2$, $Q_2$ have known coordinates; for the two non-parallel spatial lines $P_1Q_1$ and $P_2Q_2$, let the shortest connecting segment be $K_1K_2$; then

$$\overrightarrow{K_1K_2}\cdot\overrightarrow{P_1Q_1}=0\quad\text{and}\quad\overrightarrow{K_1K_2}\cdot\overrightarrow{P_2Q_2}=0\tag{12}$$

Let the vector $N_1$ be perpendicular to $P_1Q_1$ and $K_1K_2$, and the vector $N_2$ be perpendicular to $P_2Q_2$ and $K_1K_2$; then:

$$N_1=\overrightarrow{P_1Q_1}\times\overrightarrow{K_1K_2},\qquad N_2=\overrightarrow{P_2Q_2}\times\overrightarrow{K_1K_2}$$
Thus the equation of the plane through the three points $P_1$, $Q_1$, $K_2$ and the equation of the plane through the three points $P_2$, $Q_2$, $K_1$ can be obtained, respectively:

$$N_1\cdot(P-P_1)=0\tag{13}$$

$$N_2\cdot(P-P_2)=0\tag{14}$$

where P denotes a general point on the respective plane.
Combining equation (13) with the equation of line $P_2Q_2$ yields the spatial coordinates of point $K_2$; similarly, combining equation (14) with the equation of line $P_1Q_1$ yields the coordinates of $K_1$. The calculated coordinate vector of the high-temperature fire point is thus obtained as:

$$K=\frac{K_1+K_2}{2}\tag{15}$$
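The closest-point construction above can be sketched numerically. The sketch follows the plane-intersection route of equations (12)–(14); reporting the midpoint of $K_1K_2$ as the fire point is an assumption, since the final formula image is not reproduced in the recovered text. Function name and test lines are illustrative:

```python
import numpy as np

def fire_point_from_two_views(P1, Q1, P2, Q2):
    """Closest points K1, K2 of skew lines P1Q1 and P2Q2, midpoint returned."""
    P1, Q1, P2, Q2 = map(np.asarray, (P1, Q1, P2, Q2))
    d1, d2 = Q1 - P1, Q2 - P2
    n = np.cross(d1, d2)                        # direction of segment K1K2
    N1, N2 = np.cross(d1, n), np.cross(d2, n)   # normals of the two planes
    # K2: intersection of line P2Q2 with the plane through line P1Q1 normal to N1
    K2 = P2 + d2 * (np.dot(N1, P1 - P2) / np.dot(N1, d2))
    # K1: intersection of line P1Q1 with the plane through line P2Q2 normal to N2
    K1 = P1 + d1 * (np.dot(N2, P2 - P1) / np.dot(N2, d1))
    return (K1 + K2) / 2.0

# Toy skew lines: the x-axis, and the line {x=0, z=1} along y.
K = fire_point_from_two_views((0, 0, 0), (1, 0, 0), (0, 0, 1), (0, 1, 1))
```

Because the two sight lines rarely intersect exactly in the presence of pixel noise, the common-perpendicular midpoint is the natural single-point estimate.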
The device used by the monocular self-adjusting fire point three-dimensional positioning method consists of a PC-side control program, a controller and its control program, a driver, a driving actuator, camera equipment, and so on. The overall operation flow of the device is as follows: once a high-temperature fire point is found during monitoring, the PC-side program sends a start signal to the controller and, through calculation, supplies motion parameters such as the moving distance of the camera equipment. The controller acquires the data from the computer, converts it into digital signals through a pre-loaded controller program, and outputs them. The driver converts the digital signals from the controller into analog signals the actuator can recognize and sends them to the driving actuator. The actuator drives the mechanical mechanism, moving the monocular camera equipment through the preset motion. Finally, the camera equipment returns the fire point image data acquired at the different positions to the program, which calculates the three-dimensional coordinates of the fire point and thereby guides the fire extinguishing equipment.
The invention has the following beneficial effects. In the monocular self-adjusting fire point three-dimensional positioning method and device, movable monocular image acquisition equipment replaces traditional binocular image positioning: the equipment self-adjusts to acquire images at multiple points while moving and, combined with a matching camera calibration and positioning calculation method, obtains the corresponding fire point three-dimensional coordinates and guides the automatic fire extinguishing equipment. The method and device have the following main characteristics:
(1) the device has the capability of positioning three-dimensional space coordinates, and has high response speed to fire change;
(2) due to the fact that the moving distance is adjustable, the method and the device have good adaptability to different space sizes;
(3) because images are acquired at multiple points and the acquisition positions are self-adjusting, the measuring precision of the method has good expansibility; meanwhile, the multi-position iterative calculation improves the fault tolerance of the device;
(4) because the image acquisition position is not fixed, when some shooting angles are blocked, the method can still obtain effective fire source position images to realize positioning;
(5) infrared thermal imaging equipment offers good detection accuracy but at high cost; compared with an identically configured binocular positioning device, a device based on this method can effectively save cost.
Drawings
Fig. 1 is a schematic diagram of camera calibration calculation in step one of the present invention.
FIG. 2 is a schematic diagram illustrating the effect of the moving distance on the accuracy in step two of the present invention.
Fig. 3 is a flowchart of the iterative calculation of device movement in step two of the present invention.
Fig. 4 is a calculation chart for solving the three-dimensional fire point position in step three of the present invention.
Fig. 5 is a calculation chart of the fire point coordinate for solving two shooting points in step three of the invention.
Fig. 6 is a system block diagram of a monocular mobile fire point locating device of the present invention.
FIG. 7 is a diagram of the UML class of PC program architecture employed by the present invention.
FIG. 8 is the main interface of the upper-computer PC user program of the PC-side control program of the invention.
FIG. 9 is a graph showing the results of the test calculations of the present invention.
Detailed Description
The specific steps of the present invention are described in detail with reference to FIGS. 1 to 5. The monocular self-adjusting fire point three-dimensional positioning method comprises the following steps:
step one, calibrating and calculating the monocular mobile camera. And (4) obtaining a position calibration mode suitable for the monocular mobile equipment through graphical calculation and derivation, namely calibrating through ground four-point coordinates. The calibration algorithm does not need camera parameters, and is a camera calibration mode suitable for monocular mobile equipment.
The main purpose of the monocular mobile camera calibration calculation is to obtain the homography mapping matrix formed by the camera's internal parameters together with the projection transformation parameters. Let Q be an arbitrary point in the world coordinate system, with coordinates denoted Q(X, Y, Z), and let q be its image point in the image plane, with coordinates denoted q(x, y), as shown in fig. 1; if f is the focal length, the pinhole camera model gives:

$$x=f\frac{X}{Z},\qquad y=f\frac{Y}{Z}\tag{1}$$
the unit of the image point coordinate in the camera internal parameter reality is a pixel, and the actual pixel point is mostly a rectangle on the imager, so that the focal lengths in the x direction and the y direction under the image plane coordinate system are not equal generally. Meanwhile, due to processing and assembling errors of the camera, the intersection point (i.e., the principal point) of the imaging plane and the optical axis is not always the center of the imaging plane. In addition, because the accuracy required for locating the fire point is relatively low, it is more reasonable to not consider the image plane distortion of the lens due to processing errors. In summary, the camera reference matrix M is introduced, and the homogeneous mapping relationship between the image point Q and the entity point Q is obtained without considering the projection transformation as follows:
$$q=\begin{bmatrix}x\\ y\\ w\end{bmatrix}=MQ=\begin{bmatrix}f_x&0&C_x\\ 0&f_y&C_y\\ 0&0&1\end{bmatrix}\begin{bmatrix}X\\ Y\\ Z\end{bmatrix}\tag{2}$$
where $f_x$ and $f_y$ respectively denote the product of the camera's physical focal length and the pixel cell size of the imager in the x and y directions, $C_x$ and $C_y$ denote the offsets of the principal point from the image-plane origin, and $w = Z$ is the homogeneous term.
Meanwhile, for the specific object image obtained by each camera, the relative position of the specific object image can be described through rotation and translation coordinate transformation; assuming that the rotation matrix is R and the translation vector is t, it can be obtained from graphics:
$$t=[x_t\ \ y_t\ \ z_t]^T\tag{3}$$

$$R=R_xR_yR_z\tag{4}$$

where the rotation angles about the x, y, and z axes are ψ, φ, and θ respectively; then:

$$R_x=\begin{bmatrix}1&0&0\\ 0&\cos\psi&\sin\psi\\ 0&-\sin\psi&\cos\psi\end{bmatrix},\quad R_y=\begin{bmatrix}\cos\varphi&0&-\sin\varphi\\ 0&1&0\\ \sin\varphi&0&\cos\varphi\end{bmatrix},\quad R_z=\begin{bmatrix}\cos\theta&\sin\theta&0\\ -\sin\theta&\cos\theta&0\\ 0&0&1\end{bmatrix}$$
combining equations (3) and (4) into a single matrix W = [R t] and combining with equation (2), one obtains:

$$\tilde q = s\,M\,W\,\tilde Q\tag{5}$$

where $\tilde q=[x\ \ y\ \ 1]^T$ and $\tilde Q=[X\ \ Y\ \ Z\ \ 1]^T$ are the homogeneous coordinate forms and s is a scale factor.
in this algorithm, the coordinate in the height direction of the calibration plane is set to zero (Y = 0), and equation (5) simplifies to:

$$\tilde q = s\,H\,[X\ \ Z\ \ 1]^T\tag{6}$$
wherein, H is a 3 × 3 matrix which is a homography matrix in the algorithm; the homography matrix is composed of 8 independent parameters, each group of mapping points can provide 2 equations, and it can be known that 4 groups of mapping points are needed to be solved to obtain a matrix H;
Calibration uses the four vertex coordinates of a square calibration plate, as shown in fig. 1; the spatial coordinates of the plate vertices A, B, C, and D and the image plane coordinates of the corresponding image points a, b, c, and d are obtained and substituted into equation (6) to solve the homography matrix H, realizing the calibration. For convenience of operation, the algorithm sets the upper-left corner of the calibration plate as the origin of the world coordinate system and obtains two groups of mapping relations by translating the camera along the horizontal direction (the world coordinate x direction), thereby calibrating the camera; to ensure precision, the image plane coordinates of corresponding points in the two groups of calibration parameters should differ by more than 1 pixel.
And step two, calculating the shooting distance by considering the precision and the size adaptability. In order to respond to the ever changing fire, the equipment must constantly acquire images and perform calculations in real time. The adjustment of the sampling position of the equipment is realized by an image sampling point distance calculation mode based on the area value of the pixel connecting line overlapped plane as a standard.
The basic idea of the shooting distance calculation considering accuracy and size adaptability is as follows: in a limited space range with fire point positioning requirement, the maximum error of the depth direction calculation precision is not more than the calculation error generated by equipment parameters such as pixel quantity and the like, so that the positioning accuracy is ensured.
First, the case where the optical center vector of the image pickup apparatus is parallel to the ground is analyzed. The imaging plane is then perpendicular to the ground (as shown in fig. 2); GH is defined as the spatial distance corresponding to a single pixel at a certain depth, and at the same depth, the spatial horizontal distance corresponding to a single pixel is unchanged. (a), (b), and (c) represent the device pixel points corresponding to GH at several camera positions; the device moves along the horizontal direction, i.e., from (a) to (c).
The fire point three-dimensional coordinate is obtained by superposing data from two shooting positions; when the distance between the calculated shooting positions is short, the overlap area of the two shots (the shaded area in fig. 2) is large and the calculated positioning accuracy drops. Therefore, to improve positioning accuracy, the distance between the two calculation shots should be enlarged. The method uses the overlap area of the two shots (the shaded part in fig. 2) as the standard for measuring the calculation error, requiring that this error not exceed a set standard value so as to meet the precision requirement.
Then, the height of the triangle corresponding to the shaded part at the positions (a) and (b) can be found as follows:
$$h=\frac{d\,x_i}{l+x_i}\tag{7}$$

where d is the distance from the optical center to the target depth, l is the distance between the two shooting points, and $x_i$ is the horizontal spatial distance corresponding to a single pixel.
The height of the triangle corresponding to the shaded portion in fig. 2 directly corresponds to the depth direction accuracy. The maximum error of the calculation precision in the depth direction is set and should not be larger than the calculation error generated by equipment parameters such as the number of pixels, and the like, namely:
$$\frac{d\,x_i}{l+x_i}\le[h]\tag{8}$$
Because the camera equipment must shoot continuously throughout the positioning process to adapt to changes in the field situation, the shooting position is selected considering not only precision but also the influence of the fire and field changes, i.e., calculation errors from various causes, up to extreme situations such as the camera being occluded. The problem is therefore solved iteratively: when the error between the fire point spatial three-dimensional coordinate values obtained by two successive tests is smaller than the error limit, a valid result is output and the moving distance is again calculated with precision as the standard. If the solving error is larger than the error limit (including the case where the measuring point is occluded), no valid result is output for the time being; meanwhile, the driving device continues moving and calculating until the standard is met and calculation output resumes.
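The move-shoot-compare behaviour described above can be sketched as a loop. In this illustration, `solve_at` is a hypothetical callback standing in for the two-image coordinate solution at a given camera position, and the step size and iteration bound are invented for the example:

```python
import math

def iterate_positioning(solve_at, eps, step, max_iters=100):
    """Keep moving the camera; only emit a fix once two successive solutions agree.

    solve_at(pos) -> (x, y, z) fire-point estimate using the images taken
    at `pos` and at the previous measuring point (hypothetical callback).
    """
    pos, prev = 0.0, None
    for _ in range(max_iters):
        k = solve_at(pos)                     # estimate at current position
        if prev is not None and math.dist(k, prev) <= eps:
            return k                          # within error limit: valid output
        prev, pos = k, pos + step             # otherwise keep moving and retry
    return None                               # never converged (e.g. occlusion)

# A stable scene converges immediately; the result is output on the 2nd pass.
fix = iterate_positioning(lambda p: (1.0, 2.0, 3.0), eps=0.01, step=0.1)
```

Returning `None` after the bound models the "no valid output while moving" state rather than any particular recovery policy of the device.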
The fire point position is fixed when the two images are collected, and the three-dimensional coordinate calculation precision in the method is measured by taking the two norms of the difference between the two fire point coordinate values obtained by solving for two times as the standard, namely the two norms meet the following requirements:
$$[\varepsilon]\ge\|K_{i+1}-K_i\|_2\tag{9}$$

where $K_i$ denotes the fire point coordinate value obtained from the images at the i-th and (i−1)-th measuring points.
It should be understood that, because the captured images correspond to different accuracies at different depths, a more reasonable approach is to measure the measurement accuracy by relative error, namely:
Figure BDA0001964589960000121
wherein,
Figure BDA0001964589960000122
represents the direction vector corresponding to the spatial coordinates of the two successive calculation results.
In summary, the moving distance calculation flow of the method is shown in fig. 3.
With this method, because loop-iteration judgment can be set, when a calculation error occurs (even when the camera is blocked) due to the complexity of the fire scene, correction and compensation can be carried out in subsequent calculations, improving both the equipment's response to the fire and its fault tolerance.
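The accept/withhold loop described above can be sketched as follows. This is a minimal illustration of the criterion of formula (9) only, with the driver-movement calls omitted; all function names are ours, not the patent's.

```python
import math

def within_limit(k_prev, k_curr, eps):
    # Formula (9): a solution is valid when the 2-norm of the difference
    # between two successive fire-point solutions is within the limit eps.
    diff = math.sqrt(sum((a - b) ** 2 for a, b in zip(k_prev, k_curr)))
    return diff <= eps

def classify_solutions(solutions, eps):
    # For a stream of successive fire-point solutions, mark each one as
    # valid (output) or withheld; a withheld result simply means the
    # device keeps moving and solving until the criterion is met again.
    prev, flags = None, []
    for k in solutions:
        flags.append(prev is not None and within_limit(prev, k, eps))
        prev = k
    return flags
```

For example, with an error limit of 50 mm, two nearly identical successive solutions validate the second one, while a wildly different third solution (e.g. a blocked shot) is withheld.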
Step three: calculating the three-dimensional coordinates of the fire point from the monocular mobile thermal images. After calibration is finished and the distance between sampling points is set, the corresponding fire-point three-dimensional spatial coordinates are solved from the high-temperature point coordinates in the thermal image.
With the two position coordinates of the thermal imager and the corresponding coordinates of the high-temperature point in the image plane known, the three-dimensional spatial coordinates of the actual fire-point position are solved as follows:
After the camera calibration of step one is completed, the homography mapping matrix H of the camera is obtained. Because the calibration plate lies in the plane Y = 0, H simplifies to a 3×3 matrix, so once the thermal imaging equipment has collected the pixel coordinates of the high-temperature point in the imaging plane, the projection coordinates of the high-temperature point in the calibration plane can be solved in reverse; from equation (6),
Figure BDA0001964589960000123
The projection coordinate of the actual high-temperature point in the calibration plane is Q = [X 0 Z]^T.
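A minimal sketch of this back-projection, assuming formula (6) takes the usual planar-homography form w·[x, y, 1]^T = H·[X, Z, 1]^T for the calibration plane Y = 0; the matrix values and names below are illustrative, not the calibrated ones.

```python
import numpy as np

def backproject(H, pixel_xy):
    # Invert the planar homography: map an image-plane pixel back to its
    # projection Q = [X, 0, Z] in the calibration plane (Y = 0).
    q = np.array([pixel_xy[0], pixel_xy[1], 1.0])
    p = np.linalg.inv(H) @ q
    return np.array([p[0] / p[2], 0.0, p[1] / p[2]])  # dehomogenize
```

A round trip through any non-singular H returns the original plane point, which is a convenient sanity check on a calibrated matrix.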
As shown in FIG. 4, let P_1 and P_2 be two effective shooting positions of the camera; the projection coordinates Q_1 and Q_2 of the corresponding high-temperature point in the calibration plane are obtained from formula (11). Connecting each optical center P with its projection coordinate Q (see FIG. 4) gives the equations of the straight lines P_1Q_1 and P_2Q_2. In theory, solving for their intersection gives the three-dimensional spatial coordinates of the actual fire-point position; in actual calculation, however, because of the pixel unit size and measurement error, the two lines P_1Q_1 and P_2Q_2 very likely do not intersect. Since the influence of measurement error is small under normal conditions, to ensure calculation efficiency the algorithm takes the midpoint of the shortest segment between the lines P_1Q_1 and P_2Q_2 as the calculated coordinate of the high-temperature point. If the measurement error of the shooting position has a large influence, the algorithm adjusts itself automatically through the accuracy-control method of step two.
The fire-point three-dimensional coordinate calculation proceeds as follows. As shown in FIG. 5, let P_1 and P_2 be two effective shooting positions of the camera; the projection coordinates Q_1 and Q_2 of the corresponding high-temperature point in the calibration plane are obtained from formula (11). With the coordinates of points P_1, Q_1, P_2 and Q_2 known, let the shortest segment between the two non-parallel spatial lines P_1Q_1 and P_2Q_2 be K_1K_2; then
Figure BDA0001964589960000131
And
Figure BDA0001964589960000132
Let vector N_1 be perpendicular to P_1Q_1 and K_1K_2, and vector N_2 be perpendicular to P_2Q_2 and K_1K_2; then:
Figure BDA0001964589960000133
Thus the equation of the plane through the three points P_1, Q_1, K_2 and the equation of the plane through the three points P_2, Q_2, K_1 are obtained, respectively:
Figure BDA0001964589960000134
Figure BDA0001964589960000135
Combining (13) with the equation of line P_2Q_2 gives the spatial coordinates of point K_2; similarly, combining (14) with the equation of line P_1Q_1 gives the coordinates of point K_1. The calculated coordinate vector of the high-temperature fire point is then:
Figure BDA0001964589960000136
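The construction of formulas (12)-(15) can be sketched numerically. The version below computes K_1 and K_2 by the standard parametric closest-points formula for two skew lines, an equivalent route to the plane intersections described in the text, and returns the midpoint of K_1K_2 as the fire-point estimate; the names are ours.

```python
import numpy as np

def fire_point(P1, Q1, P2, Q2):
    # Closest points K1 on line P1Q1 and K2 on line P2Q2 (endpoints of
    # the shortest segment between the two non-parallel spatial lines),
    # returning the midpoint of K1K2 as the estimated fire coordinate.
    P1, Q1, P2, Q2 = map(np.asarray, (P1, Q1, P2, Q2))
    d1, d2 = Q1 - P1, Q2 - P2
    r = P2 - P1
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ r, d2 @ r
    denom = a * c - b * b          # zero only for parallel lines
    s = (c * d - b * e) / denom    # parameter of K1 along P1Q1
    t = (b * d - a * e) / denom    # parameter of K2 along P2Q2
    K1, K2 = P1 + s * d1, P2 + t * d2
    return (K1 + K2) / 2.0
```

For two lines that actually intersect, K_1 and K_2 coincide and the midpoint is the intersection itself, matching the theoretical case described above.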
The device used by the monocular self-adjusting fire-point three-dimensional positioning method is based on the system scheme framework shown in figure 6 and comprises five parts: a PC-end control program, a controller with its control program, a driver, a driving actuator, and camera equipment. The overall operating flow of the device is as follows. Once a high-temperature fire point is found during monitoring, the PC-end program sends a corresponding start signal to the controller and at the same time computes motion parameters such as the moving distance of the camera equipment. The controller acquires the data from the computer, converts them into digital signals through the pre-embedded controller program, and outputs them. The driver converts the digital signals obtained from the controller into analog signals the actuator can recognize and sends them to the driving actuator. The actuator drives the mechanical mechanism to move, carrying the monocular camera equipment through the preset movement. Finally, the camera equipment returns the fire-point image data acquired at the different positions to the program, which calculates the three-dimensional coordinates of the fire point and thereby guides the fire-extinguishing equipment. The products and parameters used in the device can be chosen according to the actual conditions on site.
For example, the image acquisition equipment selected by the invention can be a DM10-series thermal infrared imager produced by the Hangzhou Dazhui company, with 160 × 120 pixels, a wavelength range of 8-14 μm, a field angle of 60° × 45°, a temperature measurement range of −20 °C to +150 °C, and a temperature measurement accuracy of the larger of ±2 °C or ±2%; it measures and corrects temperature automatically, and development uses the dynamic link library (dll) provided in the equipment SDK.
The driving actuator can be a ball screw with a single start, a stroke of 400 mm and a pitch of 5 mm. A stepping motor drives the ball screw; considering stroke, torque and other factors comprehensively, a type-42 stepping motor of model 42BYGH40-1.8-22A can be selected, with an output torque of 0.5 N·m and a step angle of 1.8°. According to the current and type of the driving motor, a TB6600 two-phase stepping-motor driver can be selected, with a rated current of 22 A.
The controller main control board is an Arduino board, model Arduino UNO R3. It connects to the computer via USB, communicating at a baud rate of 9600 bps in half-duplex mode. The controller program is written in C++ in an object-oriented (OOP) style.
The Arduino controller program consists of two parts: a communication program and a motor-driving program. The communication program transmits motor position information and receives user instructions; the motor-driving program sends pulse signals to the driver to drive the motor. Considering operational safety, the controller program also includes an emergency-braking function: while the motor is running, the user can send an instruction at any time to stop it. The program specifies the input instruction format as a four-digit decimal code, which the Arduino translates into the corresponding action according to each digit's position; the contents and corresponding functions are shown in the following table.
Controller communication content meaning table
Figure BDA0001964589960000151
The PC-side host computer program is written within the framework shown in FIG. 7; its main interface is shown in FIG. 8. FIG. 8 shows the apparatus performing monocular mobile camera calibration using a high-temperature point.
Fig. 9 shows the display of actual fire calculation results; in the present system, image updates and recalculation are performed once per second. The calculation results are displayed on the interface as three-dimensional coordinate values and recorded in an Excel table. After repeated calibration and actual measurement, with the current equipment and configuration, at a fire-point distance of 5000 mm the maximum three-dimensional coordinate errors are ε_x = −10.62 mm, ε_y = 88.17 mm and ε_z = 44.69 mm, a relative calculation accuracy above 98%, which, considering the coverage of the fire-extinguishing equipment, meets the fire-extinguishing requirement. It should be understood that the pixels of the thermal imaging equipment currently configured in the experimental device are few; if the resolution of the equipment is improved, the test accuracy of the device can be further improved.

Claims (6)

1. A monocular self-adjusting fire point three-dimensional positioning method is characterized in that: the method comprises the following steps:
step one, calibration calculation of the monocular mobile camera; a position calibration mode suitable for monocular mobile equipment is obtained through graphical calculation and derivation, namely calibration through ground four-point coordinates;
step two, calculating the shooting distance considering accuracy and size adaptability; in order to respond to the constantly changing fire, the equipment needs to constantly acquire images and calculate in real time; the adjustment of the equipment's sampling position is realized by calculating the image sampling-point spacing, taking the area of the plane overlapped by the pixel connecting lines as the standard;
in step two, the basic idea of calculating the shooting distance considering accuracy and size adaptability is: within the limited spatial range having fire-point positioning requirements, the maximum error of the depth-direction calculation accuracy is kept smaller than the calculation error generated by equipment parameters such as the number of pixels, so that positioning accuracy is ensured;
firstly, the condition that the optical-center vector of the camera equipment is parallel to the ground is analyzed, in which case the imaging plane is perpendicular to the ground; GH is set as the spatial distance corresponding to a single pixel at a certain depth, and the horizontal spatial distance corresponding to a single pixel is unchanged at the same depth; (a), (b) and (c) represent the equipment pixel points corresponding to GH at several camera positions as the equipment moves along the horizontal direction, namely from (a) to (c);
the fire-point three-dimensional coordinates are obtained by superposing the data of two shooting positions, and the method measures the calculation error by taking the overlapping area of the two shots as the standard, requiring the overlapping area to be larger than a set standard value to meet the accuracy requirement;
then, the height of the triangle corresponding to the shaded part between positions (a) and (b) is found as:
Figure FDA0002897221110000011
wherein d is the distance from the optical center to the target depth, l is the distance between the two shooting points, and x_G and x_H are the horizontal spatial distances corresponding to the pixel points;
secondly, the height of the triangle corresponding to the shaded part directly corresponds to the depth-direction accuracy; the maximum error of the depth-direction calculation accuracy is set to be smaller than the calculation error generated by equipment parameters such as the number of pixels, namely:
Figure FDA0002897221110000021
when the error between the three-dimensional coordinate values of the fire point space obtained by the continuous secondary test is smaller than the error limit, outputting an effective result and calculating the moving distance by taking the precision as the standard again; if the solving error is larger than the error limit, the effective result is temporarily not output; meanwhile, the driving device continues to move for calculation until the standard is reached and the calculation output is recovered;
the fire-point position is fixed while the two images are acquired, and the three-dimensional coordinate calculation accuracy is measured by the 2-norm of the difference between the two solved fire-point coordinate values, namely:
[ε] ≥ ||K_{i+1} − K_i||_2 (9)
where K_i denotes the fire-point coordinate value obtained from the images of the ith and the (i−1)th measuring points;
in the second step, the shot images correspond to different accuracies at different depths, and the measurement accuracy is measured by relative errors, namely:
Figure FDA0002897221110000022
wherein,
Figure FDA0002897221110000023
represents the direction vector corresponding to the spatial coordinates of the two successive calculation results;
step three, calculating the fire-point three-dimensional coordinates from the monocular mobile thermal images; after calibration is finished and the distance between sampling points is set, the corresponding fire-point three-dimensional spatial coordinates are solved from the high-temperature point coordinates in the thermal image.
2. The monocular self-adjusting fire point three-dimensional positioning method according to claim 1, characterized in that: in step one, the aim of the monocular mobile camera calibration calculation is to obtain the homography matrix formed by the camera internal parameters together with the projection transformation parameters;
let point Q be an arbitrary point in the world coordinate system with coordinates Q(X, Y, Z), and q its image point in the image plane with coordinates q(x, y); with focal length f, the pinhole camera model gives:
Figure FDA0002897221110000024
introducing the camera internal-reference matrix M, the homogeneous mapping between image point q and entity point Q, without considering projection transformation, is:
Figure FDA0002897221110000031
wherein f_x and f_y respectively represent the product of the camera's physical focal length and the pixel cell size of the imaging screen, C_x and C_y represent the offsets of the imaging center (principal point), and w = Z is the homogeneous term.
3. The monocular self-adjusting fire point three-dimensional positioning method according to claim 2, characterized in that: in step one, for the specific object image obtained by each camera, its relative position can be described through rotation and translation coordinate transformations; with rotation matrix R and translation vector t, graphics gives:
t=[xt yt zt]T (3)
R=RxRyRz (4)
wherein the rotation angles around the x, y and z axes are ψ, φ and θ respectively; then:
Figure FDA0002897221110000033
Figure FDA0002897221110000034
Figure FDA0002897221110000035
combining formula (3) and formula (4) into one matrix W = [R t], and combining formula (2), one obtains:
Figure FDA0002897221110000036
wherein,
Figure FDA0002897221110000037
and
Figure FDA0002897221110000038
in the form of homogeneous coordinates,
Figure FDA0002897221110000039
in this algorithm, the coordinate in the height direction of the calibration plane is set to zero (Y = 0), and equation (5) simplifies to:
Figure FDA00028972211100000310
wherein H is a 3×3 matrix, the homography matrix of the algorithm; the homography matrix has 8 independent parameters, and each pair of mapping points provides 2 equations, so 4 pairs of mapping points are needed to solve for the matrix H;
calibration uses the four vertex coordinates of a square calibration plate: the spatial coordinates of vertices A, B, C and D and the image-plane coordinates of the corresponding image points a, b, c and d are obtained and substituted into formula (6) to obtain the homography matrix H, thereby completing the calibration.
4. The monocular self-adjusting fire point three-dimensional positioning method according to claim 1, characterized in that: calculating three-dimensional coordinates of the fire point of the monocular mobile thermal image, knowing coordinates of two positions of the thermal imager and coordinates of a corresponding high-temperature point in an image plane, and solving the three-dimensional space coordinates of the actual fire point position, wherein the specific method comprises the following steps:
after the camera calibration is completed, the homography matrix H of the camera is obtained; because the calibration plate lies in the plane Y = 0, H simplifies to a 3×3 matrix, so once the thermal imaging equipment has collected the pixel coordinates of the high-temperature point in the imaging plane, the projection coordinates of the high-temperature point in the calibration plane can be solved in reverse; from equation (6),
Figure FDA0002897221110000041
the projection coordinate of the actual high-temperature point in the calibration plane is Q = [X 0 Z]^T.
5. The monocular self-adjusting fire point three-dimensional positioning method according to claim 4, characterized in that: in step three, let P_1 and P_2 be two effective shooting positions of the camera; the projection coordinates Q_1 and Q_2 of the corresponding high-temperature point in the calibration plane are obtained from formula (11); with the coordinates of points P_1, Q_1, P_2 and Q_2 known, let the shortest segment between the two non-parallel spatial lines P_1Q_1 and P_2Q_2 be K_1K_2; then
Figure FDA0002897221110000042
And
Figure FDA0002897221110000043
let vector N_1 be perpendicular to P_1Q_1 and K_1K_2, and vector N_2 be perpendicular to P_2Q_2 and K_1K_2; then:
Figure FDA0002897221110000044
thus the equation of the plane through the three points P_1, Q_1, K_2 and the equation of the plane through the three points P_2, Q_2, K_1 are obtained, respectively:
Figure FDA0002897221110000051
Figure FDA0002897221110000052
combining (13) with the equation of line P_2Q_2 gives the spatial coordinates of point K_2; similarly, combining (14) with the equation of line P_1Q_1 gives the coordinates of point K_1; the calculated coordinate vector of the high-temperature fire point is then:
Figure FDA0002897221110000053
6. A device for the monocular self-adjusting fire point three-dimensional positioning method according to claim 1, characterized in that: the device consists of a PC-end control program, a controller and its control program, a driver, a driving actuator and camera equipment; the overall operating flow of the device is as follows: once a high-temperature fire point is found during monitoring, the PC-end program sends a corresponding start signal to the controller and at the same time computes the motion parameter of the camera equipment's moving distance; the controller acquires data from the computer, converts them into digital signals through the pre-embedded controller program and outputs them; the driver converts the digital signals obtained from the controller into analog signals recognizable by the actuator and sends them to the driving actuator; the actuator drives the mechanical mechanism to move, carrying the monocular camera equipment through the preset movement; finally, the camera equipment returns the fire-point image data acquired at the different positions to the program, which calculates the three-dimensional coordinates of the fire point and thereby guides the fire-extinguishing equipment to extinguish the fire.
CN201910096126.6A 2019-01-31 2019-01-31 Monocular self-adjusting fire point three-dimensional positioning method and device Expired - Fee Related CN109887025B (en)
