CN107909615A - A kind of fire monitor localization method based on binocular vision - Google Patents

A kind of fire monitor localization method based on binocular vision Download PDF

Info

Publication number
CN107909615A
Authority
CN
China
Prior art keywords
point
fire
image
positioning
matching
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201711364149.8A
Other languages
Chinese (zh)
Inventor
孔祥明
林邓平
Current Assignee
Guangdong Industry Kaiyuan Science And Technology Co Ltd
Original Assignee
Guangdong Industry Kaiyuan Science And Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Guangdong Industry Kaiyuan Science And Technology Co Ltd filed Critical Guangdong Industry Kaiyuan Science And Technology Co Ltd
Priority to CN201711364149.8A priority Critical patent/CN107909615A/en
Publication of CN107909615A publication Critical patent/CN107909615A/en

Classifications

    • G06T7/73 Determining position or orientation of objects or cameras using feature-based methods
    • G06T5/70
    • G06T7/11 Region-based segmentation
    • G06T7/136 Segmentation; Edge detection involving thresholding
    • G06T7/90 Determination of colour characteristics
    • G06V10/462 Salient features, e.g. scale invariant feature transforms [SIFT]
    • G06V10/757 Matching configurations of points or features
    • G08B17/125 Actuation by presence of radiation or particles by using a video camera to detect fire or smoke
    • G06T2207/10016 Video; Image sequence
    • G06T2207/20032 Median filtering
    • G06T2207/30232 Surveillance

Abstract

The invention discloses a fire monitor positioning method based on binocular vision, comprising: after the image detection system raises an alarm, high-definition images are collected; the fire image is positioned and measured, including flame image segmentation, flame image matching, flame position measurement and positioning data filtering; if a fire is confirmed to have occurred, an audible and visual alarm is given, and the spatial position of the fire calculated in step 2 is output as positioning data to the fire monitor for extinguishing. The invention uses multiple groups of intersecting vision measurements to achieve seamless coverage of the interior space, with a degree of redundancy, so the system can position the fire source in real time, improving its ability to adapt to events and its robustness. Within the allowable error range, the visual positioning of the fire monitor can be improved as far as possible by laying out the protection zone reasonably and adjusting the installation parameters. The positioning results are filtered to remove data with large errors, making the result more accurate.

Description

Fire monitor positioning method based on binocular vision
Technical Field
The invention relates to the technical field of fire detection and automatic fire extinguishing, in particular to a binocular vision-based fire monitor positioning method.
Background
Large-space buildings generally adopt steel structures as the main frame, which offer high strength, large spans, convenient hoisting and short construction periods. However, putting these buildings into service also raises new problems for fire protection: large-space buildings are more than 12m high, span more than 60m and cover huge areas, lack effective fire compartments and house dense equipment, so a fire spreads very easily once it breaks out; the strength of a steel structure drops rapidly as the temperature rises and it can collapse in a short time, making the fire-fighting task extremely critical. Traditional heat- and smoke-sensing fire detection and automatic water-mist spraying technology cannot meet the fire-protection requirements of large-space buildings.
The robot vision system simulates the perceptual function of human vision and has advantages such as a wide detection range and complete target information; it is one of the key intelligent technologies for mobile robots. Binocular stereo vision imitates how human binocular vision perceives distance in order to recover three-dimensional information: using a two-viewpoint measurement approach, two cameras image the scene from different positions, and distance information is recovered from the parallax. It is an important branch of robot vision research, and binocular vision has become an essential component of most autonomous mobile robots. Because a binocular vision device has a simple structure, acquires data conveniently and rapidly, measures without contact and is widely applicable, including in complex and harsh environments, it is widely used in many fields.
Binocular vision locates with two cameras. For a feature point on an object, images of the object are shot with two cameras fixed at different positions, and the coordinates of the point on the two cameras' image planes are obtained. As long as the precise relative position of the two cameras is known, the coordinates of the feature point in a coordinate system fixed to one camera can be obtained geometrically, i.e. the position of the feature point is determined.
A fire is a disaster uncontrolled in time and space, and its position, volume and shape change nonlinearly with time, so image-based positioning of fires in large-space buildings has always been difficult. For a binocular vision positioning system, errors mainly come from two sources: system parameter calibration errors and the actual measurement. The prior art can ensure that the calibration precision of the system structural parameters reaches better than 0.01%, so its influence can basically be ignored. In fire visual positioning, the largest errors come from image matching error and lens distortion error. GB15631-2008 "Special fire detectors" requires that, in a class-one fire protection environment, a lens with a focal length of 12mm, a horizontal field angle of 22 degrees and a vertical field angle of 17 degrees position a fire of size 0.15m x 0.15m at a distance of 100m with an x error of less than 0.612m and a y error of less than 3.360m. CNCA/CTS0014-2010, the fire extinguishing device certification technical specification for auto-seeking water-spraying fire extinguishing devices issued by the fire-fighting product qualification evaluation centre of the Ministry of Public Security, requires that an auto-seeking cylindrical water-spraying fire extinguishing device detect a fire at the maximum protection radius and aim the water jet at the fire source, with the jet covering an oil pan of 400mm diameter while the jet coverage area does not exceed a circular area of 1.2m diameter centred on the oil pan. These place high demands on the mechanical performance and positioning accuracy of a fire monitor.
Although patent No. 201510390154.0 proposes a monocular positioning method in which the relationship between the image coordinate system and the space coordinate system is established when the camera's internal parameters are known, that method is limited to monitoring the position, angle and swing of the ignition point. In a complex, changing fire-scene environment, such methods provide only shallow monitoring and positioning: the hardware requirements are high, the algorithm is simple, the accuracy is low and large errors exist.
Disclosure of Invention
In view of the above-mentioned defects in the prior art, the technical problem to be solved by the present invention is to provide a method for positioning a fire monitor based on binocular vision, so as to solve the deficiencies in the prior art.
In order to achieve the purpose, the invention provides a binocular vision-based fire monitor positioning method, which comprises the following steps of:
step 1, a camera collects monitoring video images, and a high-resolution image is collected after an image detection system gives an alarm;
step 2, positioning and measuring the fire image, and judging whether a fire occurs or not, wherein the positioning and measuring comprises flame image segmentation, flame image matching, flame position measurement and positioning data filtering;
step 3, if a fire is confirmed to have occurred, give an audible and visual alarm, output the positioning data to the fire monitor according to the fire spatial position calculated in step 2, let the fire monitor spray extinguishing agent onto the fire area to complete extinguishing, and cancel the alarm after the fire is out.
Further, the flame image segmentation in the step 2 is implemented by the following steps:
step 21, counting color information and calculating similarity, namely firstly calculating the average value of the color information of pixel points in the image to be used as a color template of the image, and then calculating the color distance and the similarity between each pixel point in the image and the color template;
step 22, image segmentation, namely setting a corresponding threshold, setting the gray value of the pixel point with the similarity larger than the threshold as 0, and setting the gray value of the pixel point with the similarity smaller than or equal to the threshold as 255 to finish the image segmentation;
and step 23, after the image segmentation is finished, further processing the image by using a median filtering algorithm.
Further, the flame image matching in the step 2 specifically comprises the following steps:
step 24, selecting a space matching point:
1) For the collected left and right images, use the SIFT feature matching algorithm to match each image against the flame template image, obtaining the feature point pair sets S_L and S_R; from these 2 feature point pair sets, extract the elements whose template-image feature point coordinates are the same, i.e. find the feature points in the left and right images that match the same point in the template image, and form a new feature point pair set S_N from those matching points in the left and right images;
2) Verify the feature point pair set S_N: match the left and right images with the SIFT feature matching algorithm to generate a feature point pair set S_LAR, and search for each element of S_N in S_LAR. If the same element is found, the feature point pair is also matched between the left and right images, i.e. it is a spatial matching point; it is retained, finally generating a set of spatial matching point pairs;
Step 25, coordinate calculation of the calibration point:
setting the total number of elements in the space matching point set as n, (χ) ii ) Is the coordinate of the point set, the center coordinate of the point set is
And taking the integral central coordinate as the coordinate of the calibration point image.
Further, the step 2 of measuring the flame position specifically comprises the following steps:
flame position space point P in camera coordinate system X c1 Y c1 Z c1 O c1 Three-dimensional coordinate of (x) pc1pc1 ,z pc1 ) Can be written as
Spatial point P in camera coordinate system X c2 Y c2 Z c2 O c2 Three-dimensional coordinate of (x) pc2pc2 ,z pc2 ) Is composed of
According to the above formula, the three-dimensional coordinates of the target point can be calibrated in the camera coordinate system by acquiring the amount of bits of the target point in the 2 images.
Further, the step 2 of positioning data filtering specifically comprises the following steps:
firstly, the data are fitted with a minimum fitting circle to obtain the actual width and height of the fire in the image; the fire area is taken as a square, and any point beyond the maximum range of the square is treated as a random error and directly excluded;
if the resulting data are still large, the fire range computed at the average distance is used;
the data are then processed once more: the joint probability distribution function F(x, y) of all remaining positioning data points is solved to find the densest region of the X, Y joint probability distribution; the densest point is taken as an approximate fire point and as the initial circle centre, a circle is drawn with the system precision Δd as radius, and the centre of the circle that finally covers the most positioning data is the optimal fire point.
The invention has the beneficial effects that:
the invention adopts a plurality of groups of intersection vision measurement modes to complete seamless coverage of the internal space, has certain redundancy, can realize real-time positioning of the fire source position, and improves the capability and robustness of the system for adapting to the accident. The visual positioning of the fire monitor can be realized by reasonably arranging the protection area within the error allowable range and adjusting the installation parameters, so that the positioning precision is improved as much as possible. And the positioning result is processed through filtering, and data with larger errors are removed, so that the result is more accurate. The invention solves the problems of higher requirement on hardware and lower positioning precision of the existing binocular vision positioning algorithm according to the optimized binocular vision positioning algorithm.
The conception, the specific structure and the technical effects of the present invention will be further described with reference to the accompanying drawings to fully understand the objects, the features and the effects of the present invention.
Drawings
FIG. 1 is a flow chart of the present invention.
FIG. 2 is a schematic diagram of the flame position measurement of the present invention.
Detailed Description
As shown in fig. 1, the method for positioning a fire monitor based on binocular vision provided by the invention comprises the following steps:
step 1, a camera collects monitoring video images, and a high-resolution image is collected after an image detection system gives an alarm;
step 2, positioning and measuring the fire image, and judging whether a fire occurs or not, wherein the positioning and measuring comprises flame image segmentation, flame image matching, flame position measurement and positioning data filtering;
step 3, if a fire is confirmed to have occurred, give an audible and visual alarm, output the positioning data to the fire monitor according to the fire spatial position calculated in step 2, let the fire monitor spray extinguishing agent onto the fire area to complete extinguishing, and cancel the alarm after the fire is out.
The system mainly comprises an image-type fire detection and positioning part, a fire extinguishing part, a dedicated high-pressure fire-fighting water supply pipe network and a control centre. The image-type fire detection and positioning part is the core of the intelligent fire monitor system: it continuously and seamlessly monitors the protection zone in real time through image sensors. The images are analysed against various fire criteria to judge whether a fire has occurred; if so, an audible and visual alarm is given, the spatial position of the fire is calculated by the vision measurement algorithm and sent to the fire monitor, and the fire monitor sprays extinguishing agent onto the fire area to complete extinguishing. Fire image positioning measurement judges whether a fire has occurred; its specific steps are analysed as follows:
1. flame image segmentation
Performing threshold segmentation on the image, and extracting a target object with specific color characteristics from a complex background, wherein the specific implementation steps are as follows:
(1) Statistics of color information and similarity calculation
Let the size of the picture shot by the camera be m×n. First calculate the average of the pixel colour information over the image, as shown in Formula 1, and use it as the colour template C_1 = {H_1, S_1, V_1} of the image. Then calculate the colour distance and the similarity between each pixel in the image and the colour template.
Formula 1: H_1 = (1/(m·n)) Σ_{i=1}^{m} Σ_{j=1}^{n} H(i,j), and likewise for S_1 and V_1 over the S and V channels
(2) Image segmentation
Set a corresponding threshold: the grey value of a pixel whose similarity is larger than the threshold is set to 0, and the grey value of a pixel whose similarity is smaller than or equal to the threshold is set to 255, as shown in Formula 2, completing the segmentation of the image. The main factor influencing the segmentation effect is the choice of threshold, so determining an optimal threshold is the key to the segmentation. Choosing the threshold too high raises the required colour purity and greatly reduces the anti-interference capability; conversely, choosing it too low lets mixed colours degrade the segmentation. In the invention the similarity threshold ranges from 0.7 to 0.9.
Formula 2: P_ij = 0 if sim(i,j) > threshold; P_ij = 255 if sim(i,j) ≤ threshold
In Formula 2, P_ij represents the pixel value of pixel (i, j) after segmentation and threshold is the similarity threshold.
(3) Median filtering
After the preliminary segmentation, a small number of impulse-noise points or tiny holes may remain in the image; the image is further processed with median filtering so that the segmentation result is smoother.
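As a sketch, the segmentation pipeline above (colour template, similarity thresholding, median filtering) can be written as follows. The patent's exact colour-distance and similarity formulas are not reproduced in this text, so Euclidean distance in a normalised HSV cube, mapped to a similarity in [0, 1], is assumed here:

```python
import numpy as np

def segment_flame(hsv, threshold=0.8):
    """Colour-template threshold segmentation (steps (1) and (2)).

    hsv: float array of shape (m, n, 3); channels assumed scaled to [0, 1].
    The colour-distance/similarity formula is an assumption, since the
    image of Formula 1 is not reproduced in the text.
    """
    # (1) colour template C1 = {H1, S1, V1}: per-channel mean of the image
    template = hsv.reshape(-1, 3).mean(axis=0)
    # colour distance of every pixel to the template, mapped to a
    # similarity in [0, 1] (distance 0 -> similarity 1)
    dist = np.linalg.norm(hsv - template, axis=2)
    sim = 1.0 - dist / np.sqrt(3.0)  # max distance in the unit cube is sqrt(3)
    # (2) Formula 2: similarity > threshold -> 0, otherwise 255
    return np.where(sim > threshold, 0, 255).astype(np.uint8)

def median_filter3(img):
    """(3) 3x3 median filter to remove impulse noise points and tiny holes."""
    padded = np.pad(img, 1, mode='edge')
    windows = np.stack([padded[i:i + img.shape[0], j:j + img.shape[1]]
                        for i in range(3) for j in range(3)])
    return np.median(windows, axis=0).astype(np.uint8)
```

In practice `cv2.medianBlur` would replace the hand-rolled filter; the explicit version only illustrates the operation.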
2. Flame image matching
For flame target positioning, acquiring three-dimensional information of the target object requires target images taken from different positions; the three-dimensional information of the same point on the target in the 2 images is recovered using the parallax principle. That point is called the calibration point of the target, and how well the calibration point positions match across the 2 images determines the accuracy of the target positioning. SIFT feature points can be obtained in the 2 images with the SIFT feature matching algorithm, but because the 2 images are shot under different conditions, the SIFT feature points obtained in them are not spatially consistent; that is, the feature points do not all correspond to the same positions on the target, and calibration points with spatial consistency can only be obtained by selection.
(1) Spatial matching point selection
In order to obtain the same matching points in the 2 images, matching point selection combines a reverse search of matching points with SIFT feature matching of the images. With the left and right images collected, the spatial point matching process is divided into the following steps:
1) Use the SIFT feature matching algorithm to match the left and right images against the flame template image, obtaining the feature point pair sets S_L and S_R. From these 2 feature point pair sets, extract the elements whose template-image feature point coordinates are the same, i.e. find the feature points in the left and right images that match the same point in the template image, and form a new feature point pair set S_N from those matching points in the left and right images.
2) Verify the feature point pair set S_N: match the left and right images with the SIFT feature matching algorithm to generate a feature point pair set S_LAR, and search for each element of S_N in S_LAR. If the same element is found, the feature point pair is also matched between the left and right images; it is called a spatial matching point and is retained. Finally, a set of spatial matching point pairs is generated.
The feature points in the set of spatial matching point pairs obtained by this method correspond well to the same positions on the target object and have good spatial position matching performance.
SIFT features are based on local appearance at interest points on an object and are independent of image scale and rotation. Their tolerance to changes in illumination, noise and small changes in viewing angle is also quite high. Owing to these characteristics they are highly distinctive and relatively easy to retrieve: objects are easily identified and rarely misidentified, even in large feature databases. The detection rate for partially occluded objects using SIFT descriptors is also quite high; even 3 or more SIFT features are enough to compute position and orientation.
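The spatial matching point selection in steps 1)-2) is essentially set intersection over match pairs. A minimal sketch of that logic, assuming the SIFT matches have already been extracted (e.g. with OpenCV) and are represented as plain coordinate tuples; the dict/set representation is illustrative, not the patent's:

```python
def spatial_matching_points(S_L, S_R, S_LAR):
    """Select spatial matching points.

    S_L, S_R : template -> left/right image matches, as dicts mapping a
               template feature point to its matched image point.
    S_LAR    : direct left-right SIFT matches, as a set of
               (left_point, right_point) pairs.
    """
    # 1) template points matched in BOTH images yield the candidate
    #    left-right pair set S_N
    S_N = {(S_L[t], S_R[t]) for t in S_L.keys() & S_R.keys()}
    # 2) verify S_N against the direct left-right matching S_LAR;
    #    pairs present in both sets are the retained spatial matching points
    return S_N & S_LAR
```

In practice S_L, S_R and S_LAR would come from three SIFT matching runs (left-template, right-template and left-right).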
(2) Calibration point coordinate calculation
Let the total number of elements in the spatial matching point set be n and (χ_i, γ_i) be the coordinates of the i-th point; the centre coordinates of the point set are given by Formula 3.
Formula 3: (χ̄, γ̄) = ((1/n) Σ_{i=1}^{n} χ_i, (1/n) Σ_{i=1}^{n} γ_i)
This overall centre coordinate is taken as the image coordinate of the calibration point.
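Formula 3 is a plain centroid over the spatial matching points; a minimal sketch:

```python
def calibration_point(points):
    """Formula 3: centre (mean) of the n spatial matching points, taken
    as the image coordinate of the calibration point."""
    n = len(points)
    x_bar = sum(x for x, _ in points) / n
    y_bar = sum(y for _, y in points) / n
    return x_bar, y_bar
```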
3. Flame position measurement
The most convenient binocular vision system is the parallel, equal-height binocular vision system, which is consistent with the structure of human eyes. The two identical cameras of the binocular vision system are mounted in parallel, with their front ends flush and at the same height. The abscissa axes of the two camera coordinate systems coincide, and the distance b between the coordinate origins is called the baseline, as shown in fig. 2.
The three-dimensional coordinates (x_pc1, y_pc1, z_pc1) of the flame position space point P in the camera coordinate system X_c1 Y_c1 Z_c1 O_c1 can be written as
Formula 4: x_pc1 = b·χ_1/(χ_1 − χ_2), y_pc1 = b·γ_1/(χ_1 − χ_2), z_pc1 = b·f/(χ_1 − χ_2)
The three-dimensional coordinates (x_pc2, y_pc2, z_pc2) of the space point P in the camera coordinate system X_c2 Y_c2 Z_c2 O_c2 are
Formula 5: x_pc2 = b·χ_2/(χ_1 − χ_2), y_pc2 = b·γ_2/(χ_1 − χ_2), z_pc2 = b·f/(χ_1 − χ_2)
Here (χ_1, γ_1) and (χ_2, γ_2) are the physical coordinates of the space point P on the 2 camera imaging planes, which can be computed from the pixel coordinates in the images using the known camera internal parameters; f is the focal length and b the baseline. According to these formulas, the three-dimensional coordinates of the target point can be calibrated in the camera coordinate system by acquiring the positions of the target point in the 2 images.
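Because the images of the original formulas are not reproduced in this text, the following sketch uses the textbook triangulation result for a parallel equal-height rig (disparity d = χ1 − χ2, depth z = b·f/d), which is what such a configuration reduces to:

```python
def triangulate(chi1, gamma1, chi2, gamma2, b, f):
    """Parallel equal-height stereo triangulation (standard parallel-rig
    result, assumed here in place of the patent's formula images).

    (chi1, gamma1), (chi2, gamma2): physical image-plane coordinates of
    the space point P in the two cameras; b: baseline; f: focal length
    (units arbitrary but consistent). Returns P's coordinates in each
    camera's coordinate system.
    """
    d = chi1 - chi2            # disparity (> 0 for a point in front of the rig)
    z = b * f / d              # depth is the same in both camera frames
    p_c1 = (b * chi1 / d, b * gamma1 / d, z)
    p_c2 = (b * chi2 / d, b * gamma2 / d, z)   # note x_pc2 = x_pc1 - b
    return p_c1, p_c2
```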
4. Positioning data filtering
The positioning data mostly lie near the true value, with a small amount of error data scattered randomly. Usually either all positioning data points are covered with a minimum fitting circle and its centre taken as the fire point coordinates, or the average of all data points is computed; that is, the data are not filtered. If the proportion of error data is very low, or the error data are distributed uniformly and symmetrically around the true value, the result is hardly affected. In practice, however, fire positioning in large-space buildings is affected by many factors, the error data are more numerous and larger, seriously degrading the accuracy of the positioning result, so the data must be filtered.
Firstly, the data are fitted with a minimum fitting circle of diameter r and centre h_1. If the system precision requirement is Δd, then when r ≤ Δd the data meet the measurement precision requirement and the centre h_1 can be taken directly as the fire point. When r > Δd, and especially when r is much larger than Δd, there may be a large error, or a large fire. Let the farthest distances from the fire point to the two cameras be d_max1 and d_max2 and the average distances d_mean1 and d_mean2, and estimate the fire range by combining these with the pixel count of the flame image. Let the horizontal field angle of the camera be θ, the vertical field angle A, and the pixel resolution M×N; the horizontal widths of the flame image acquired by the two cameras are w_1, w_2 and the vertical heights v_1, v_2. The actual width and height of the fire in the image are L, H, see Formula 6.
Formula 6
The fire area is taken as a square of side length D_max = max(L_max, H_max). Points beyond this maximum range are random errors and are directly excluded. If the resulting data are still large, the fire range at the average distance D_mean is used.
The data are then processed once more, solving a probability distribution for all remaining positioning data. Because the root of the fire is the point at which to suppress it, attention generally focuses on its in-plane coordinates, here only (X, Y). The joint probability distribution function F(x, y) of all positioning data points is obtained to find the densest region of the X, Y joint probability distribution. The densest point is taken as an approximate fire point and as the initial circle centre; a circle is drawn with that point as centre and the precision Δd as radius, and the centre is moved until the circle covers the largest number of positioning data; the final centre is the optimal fire point. To ensure generality and correctness, the circle should cover more than 50% of the data; this parameter can be adjusted to the actual situation.
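A hedged sketch of this final filtering stage. The patent solves a joint density F(x, y); here the densest point is approximated by the sample with the most neighbours within Δd, and the circle centre is then moved greedily (to the mean of the covered points) until coverage stops improving — a stand-in for the density maximisation described above, not the patent's exact procedure:

```python
import math

def optimal_fire_point(points, delta_d, min_cover=0.5):
    """Filter (X, Y) positioning data and return the optimal fire point.

    points : list of (x, y) positioning samples; delta_d : system
    precision, used as the covering-circle radius; min_cover : required
    covered share of the data (the patent suggests more than 50%).
    """
    def covered(centre):
        return [p for p in points if math.dist(p, centre) <= delta_d]

    # approximate the densest point of the (X, Y) distribution
    centre = max(points, key=lambda p: len(covered(p)))
    while True:
        inside = covered(centre)
        candidate = (sum(x for x, _ in inside) / len(inside),
                     sum(y for _, y in inside) / len(inside))
        if len(covered(candidate)) <= len(inside):
            break  # moving the circle no longer covers more data
        centre = candidate
    if len(covered(centre)) < min_cover * len(points):
        raise ValueError("covering circle holds less than the required share of data")
    return centre
```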
The invention uses multiple groups of intersecting vision measurements to achieve seamless coverage of the interior space, with a degree of redundancy; the system can position the fire source in real time, improving its ability to adapt to events and its robustness. Within the allowable error range, the visual positioning of the fire monitor can be achieved by laying out the protection zone reasonably and adjusting the installation parameters, so that positioning accuracy is improved as far as possible. The positioning results are filtered to remove data with large errors, making the result more accurate. With its optimised binocular vision positioning algorithm, the invention solves the problems of high hardware requirements and low positioning accuracy of existing binocular vision positioning algorithms.
The foregoing detailed description of the preferred embodiments of the invention has been presented. It should be understood that numerous modifications and variations could be devised by those skilled in the art in light of the present teachings without departing from the inventive concepts. Therefore, the technical solutions available to those skilled in the art through logic analysis, reasoning and limited experiments based on the prior art according to the concept of the present invention should be within the scope of protection defined by the claims.

Claims (5)

1. A fire monitor positioning method based on binocular vision is characterized by comprising the following steps:
step 1, a camera collects monitoring video images, and a high-resolution image is collected after an image detection system gives an alarm;
step 2, positioning and measuring the fire image, and judging whether a fire occurs or not, wherein the positioning and measuring comprises flame image segmentation, flame image matching, flame position measurement and positioning data filtering;
step 3, if a fire is confirmed to have occurred, give an audible and visual alarm, output the positioning data to the fire monitor according to the fire spatial position calculated in step 2, let the fire monitor spray extinguishing agent onto the fire area to complete extinguishing, and cancel the alarm after the fire is out.
2. The binocular vision-based fire monitor positioning method of claim 1, wherein: the flame image segmentation in the step 2 is realized by the following steps:
step 21, counting color information and calculating similarity: first compute the mean of the color information of the pixels in the image and use it as the image's color template, then compute the color distance and similarity between each pixel in the image and the color template;
step 22, image segmentation: set a corresponding threshold, set the gray value of pixels whose similarity exceeds the threshold to 0 and of pixels whose similarity is less than or equal to the threshold to 255, completing the segmentation;
and step 23, after segmentation is finished, further process the image with a median filtering algorithm.
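As an illustration, steps 21 to 23 can be sketched as follows. This is a minimal NumPy sketch, not the patent's implementation: the claim does not fix the similarity function or the median window, so the form 1/(1 + distance) and the 3×3 window are assumptions.

```python
import numpy as np

def segment_flame(rgb, threshold):
    """Steps 21-22: build a color template from the image mean, score each
    pixel by its color distance to the template, and threshold the result."""
    h, w, _ = rgb.shape
    pixels = rgb.reshape(-1, 3).astype(np.float64)
    template = pixels.mean(axis=0)                    # step 21: color template
    dist = np.linalg.norm(pixels - template, axis=1)  # color distance
    similarity = 1.0 / (1.0 + dist)                   # assumed similarity form
    # step 22: similar pixels -> gray 0, dissimilar pixels -> gray 255
    gray = np.where(similarity > threshold, 0, 255).astype(np.uint8)
    return gray.reshape(h, w)

def median3x3(img):
    """Step 23: 3x3 median filter (border pixels left unchanged)."""
    h, w = img.shape
    out = img.copy()
    windows = np.stack([img[di:di + h - 2, dj:dj + w - 2]
                        for di in range(3) for dj in range(3)])
    out[1:-1, 1:-1] = np.median(windows, axis=0).astype(img.dtype)
    return out
```

A single bright pixel against a dark background ends up far from the mean-color template, so it is marked 255, while isolated noise in the binary mask is suppressed by the median pass.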
3. The binocular vision-based fire monitor positioning method of claim 1, wherein the flame image matching in the step 2 is implemented by the following steps:
step 24, selecting a space matching point:
1) Collect the left and right images and match each of them against the flame template image with the SIFT feature matching algorithm, obtaining the feature point pair sets S_L and S_R; from these two sets, extract the elements whose template-image feature point coordinates coincide, i.e. find the feature points in the left and right images that match the same point in the template image, and form a new feature point pair set S_N from those matching points in the left and right images;
2) Verify the feature point pair set S_N: match the left image against the right image with the SIFT feature matching algorithm to generate a feature point pair set S_LAR, and search for each element of S_N in S_LAR; if the same element is found, that feature point pair is matched between the left and right images, i.e. it is a spatial matching point, and it is retained, finally generating the set of spatial matching point pairs.
Step 25, coordinate calculation of the calibration point:
setting the total number of elements in the space matching point set as n, (χ) ii ) Is the coordinate of the point set, the center coordinate of the point set is
And taking the integral central coordinate as the coordinate of the calibration point image.
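The set logic of steps 24(1)-(2) and the centroid of step 25 can be sketched in plain Python. This is a hedged sketch: the SIFT matching itself is represented abstractly as mappings and pairs of points, since the claim only fixes the set operations, not the matcher.

```python
def spatial_matching_points(s_l, s_r, s_lar):
    """Steps 24(1)-(2). s_l / s_r map template feature points to their match
    in the left / right image (sets S_L, S_R); s_lar holds the (left, right)
    pairs from matching the left image directly against the right (S_LAR)."""
    # (1) template points matched in BOTH images -> candidate pair set S_N
    s_n = {(s_l[t], s_r[t]) for t in s_l.keys() & s_r.keys()}
    # (2) keep only candidates confirmed by the direct left-right matching
    return s_n & set(s_lar)

def calibration_point(points):
    """Step 25: centroid (x-bar, y-bar) of one image's matching points."""
    n = len(points)
    return (sum(x for x, _ in points) / n, sum(y for _, y in points) / n)
```

The verification step discards pairs that agree with the template but not with each other, which is what makes the surviving points usable as spatial matching points.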
4. The binocular vision based fire monitor positioning method as recited in claim 1, wherein the step 2 of measuring the flame position specifically comprises the following steps:
flame position space point P in camera coordinate system X c1 Y c1 Z c1 O c1 InThree-dimensional coordinate (X) pc1pc1 ,z pc1 ) Can be written as
Spatial point P in camera coordinate system X c2 Y c2 Z c2 O c2 Three-dimensional coordinate of (x) pc2pc2 ,z pc2 ) Is composed of
According to the above formula, the three-dimensional coordinates of the target point can be calibrated in the camera coordinate system by obtaining the amount of bits of the target point in the 2 images.
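The publication gives the coordinate formulas only as figures. As a hedged illustration, the standard triangulation relation for a rectified (parallel-axis) binocular rig recovers a 3-D point from its two image positions; the focal length f (in pixels), baseline b, and principal point (cx, cy) are assumed parameters, not values from the patent.

```python
def triangulate(u1, v1, u2, f, b, cx, cy):
    """Standard parallel-axis binocular triangulation sketch: depth follows
    from the disparity, lateral coordinates from the left view."""
    d = u1 - u2            # disparity between the left and right image
    z = f * b / d          # depth along the optical axis
    x = (u1 - cx) * z / f  # lateral offset
    y = (v1 - cy) * z / f  # vertical offset
    return x, y, z
```

For converging cameras, as in the intersection measurement the description mentions, the same idea applies after accounting for the relative rotation between the two camera coordinate systems.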
5. The binocular vision based fire monitor positioning method of claim 1, wherein the step 2 positioning data filtering is implemented by the following steps:
first, fit the data with a minimum fitting circle to obtain the actual width and height of the fire in the image; the fire region is taken as a square, and any point beyond the maximum extent of the square is a random error and is rejected directly;
if the resulting data still deviate too much, the fire range based on the average distance is used instead;
then process the data again: compute the joint probability distribution function F(x, y) over all remaining positioning data points to find the region where the X, Y joint probability distribution is densest, and take the densest point as the approximate fire point; with this point as the center and the system accuracy Δd as the radius, draw a circle; starting from this center and moving it, the final center whose circle covers the largest number of positioning data points is taken as the optimal fire point.
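The filtering above can be sketched as follows, under two stated simplifications that the claim does not fix: the square fire region is centered on the data centroid, and the circle center is moved only over the data points themselves rather than continuously.

```python
import math

def filter_positions(points, width, height, delta_d):
    """Reject random errors outside the square fire region, then choose the
    point whose delta_d-radius circle covers the most remaining data points."""
    cx = sum(x for x, _ in points) / len(points)
    cy = sum(y for _, y in points) / len(points)
    # points beyond the maximum extent of the square are random errors
    kept = [(x, y) for x, y in points
            if abs(x - cx) <= width / 2 and abs(y - cy) <= height / 2]
    def coverage(c):
        return sum(math.hypot(x - c[0], y - c[1]) <= delta_d for x, y in kept)
    return max(kept, key=coverage)  # approximate optimal fire point
```

Restricting candidate centers to the data points is a common discretization of this densest-circle search; it keeps the sketch short at the cost of a small loss of optimality.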
CN201711364149.8A 2017-12-18 2017-12-18 A kind of fire monitor localization method based on binocular vision Pending CN107909615A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711364149.8A CN107909615A (en) 2017-12-18 2017-12-18 A kind of fire monitor localization method based on binocular vision

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201711364149.8A CN107909615A (en) 2017-12-18 2017-12-18 A kind of fire monitor localization method based on binocular vision

Publications (1)

Publication Number Publication Date
CN107909615A true CN107909615A (en) 2018-04-13

Family

ID=61869252

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711364149.8A Pending CN107909615A (en) 2017-12-18 2017-12-18 A kind of fire monitor localization method based on binocular vision

Country Status (1)

Country Link
CN (1) CN107909615A (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101574567A (en) * 2009-06-08 2009-11-11 南京航空航天大学 Computer vision technique based method and system for detecting and extinguishing fire disaster intelligently
CN201600330U (en) * 2009-09-23 2010-10-06 中国农业大学 System for recognizing and locating mature pineapples
EP2757356A1 (en) * 2013-01-22 2014-07-23 SwissOptic AG Multispectral optical device
CN106934809A (en) * 2017-03-29 2017-07-07 厦门大学 Unmanned plane based on binocular vision autonomous oiling rapid abutting joint air navigation aid in the air

Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
JEROME VICENTE et al.: "An image processing technique for automatically detecting forest fire", International Journal of Thermal Sciences *
MENG Hao et al.: "Binocular vision positioning based on SIFT feature points", Journal of Harbin Engineering University *
SHANG Qian et al.: "Target recognition and localization based on binocular stereo vision", CAAI Transactions on Intelligent Systems *
XU Fei et al.: "CCD-based positioning technology for intelligent fire monitors in large-space buildings", Fire Science and Technology *
YANG Fan (ed.): "Digital Image Processing and Analysis", 31 May 2015, Beijing: Beihang University Press *

Cited By (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110501914A (en) * 2018-05-18 2019-11-26 佛山市顺德区美的电热电器制造有限公司 A kind of method for safety monitoring, equipment and computer readable storage medium
CN110501914B (en) * 2018-05-18 2023-08-11 佛山市顺德区美的电热电器制造有限公司 Security monitoring method, equipment and computer readable storage medium
US20200368568A1 (en) * 2018-06-04 2020-11-26 Gaoli Ge Electric heater fire extinguishing strategy customization mechanism
CN108876856A (en) * 2018-06-29 2018-11-23 北京航空航天大学 A kind of heavy construction fire fire source recognition positioning method and system
CN108876856B (en) * 2018-06-29 2020-10-09 北京航空航天大学 Fire source identification and positioning method and system for large building
CN109785574A (en) * 2019-01-21 2019-05-21 五邑大学 A kind of fire detection method based on deep learning
CN111494853B (en) * 2020-04-10 2021-05-11 中国矿业大学 Multi-mode visual servo control fire-fighting system and working method thereof
CN111494853A (en) * 2020-04-10 2020-08-07 中国矿业大学 Multi-mode visual servo control fire-fighting system and working method thereof
CN112206441A (en) * 2020-10-12 2021-01-12 江西省智能产业技术创新研究院 Cooperative scheduling method of fire-fighting robot scheduling system
CN112633060A (en) * 2020-11-18 2021-04-09 合肥中科贝伦科技有限公司 Fire source positioning algorithm based on automatic aiming system of foam water mist turbofan cannon
CN113299035A (en) * 2021-05-21 2021-08-24 上海电机学院 Fire identification method and system based on artificial intelligence and binocular vision
CN113413564A (en) * 2021-05-28 2021-09-21 浙江工业大学 Fire source positioning and fire extinguishing control method for fire-fighting robot
CN113705539A (en) * 2021-09-29 2021-11-26 内江师范学院 Intelligent fire monitor fire extinguishing control method and control system
CN114534146A (en) * 2021-11-25 2022-05-27 北京南瑞怡和环保科技有限公司 Control method and system for automatically searching for ground based on flame of mobile electric fire-fighting robot
CN115797439A (en) * 2022-11-11 2023-03-14 中国消防救援学院 Flame space positioning system and method based on binocular vision
CN116884167A (en) * 2023-09-08 2023-10-13 山东舒尔智能工程有限公司 Intelligent fire control video monitoring and alarm linkage control system
CN116884167B (en) * 2023-09-08 2023-12-05 山东舒尔智能工程有限公司 Intelligent fire control video monitoring and alarm linkage control system

Similar Documents

Publication Publication Date Title
CN107909615A (en) A kind of fire monitor localization method based on binocular vision
CA3022839C (en) Flight parameter measuring apparatus and flight parameter measuring method
CN104751593B (en) Method and system for fire detection, warning, positioning and extinguishing
CN101574567B (en) Computer vision technique based method and system for detecting and extinguishing fire disaster intelligently
CN104935893B (en) Monitor method and apparatus
CN109190628A (en) A kind of plate camber detection method based on machine vision
CN103700140B (en) Spatial modeling method used for linkage of single gun camera and multiple dome cameras
CN104902246A (en) Video monitoring method and device
CN105550670A (en) Target object dynamic tracking and measurement positioning method
CN108038867A (en) Fire defector and localization method based on multiple features fusion and stereoscopic vision
CN101574566A (en) Monocular vision technique based fire monitor control method for adjusting relative positions of fire point and water-drop point
CN104036488A (en) Binocular vision-based human body posture and action research method
CN102298816B (en) Fire early warning method for marine engine room based on multi-source fusion
CN110021133B (en) All-weather fire-fighting fire patrol early-warning monitoring system and fire image detection method
CN106210634A (en) A kind of wisdom gold eyeball identification personnel fall down to the ground alarm method and device
CN104966062A (en) Video monitoring method and device
CN106846375A (en) A kind of flame detecting method for being applied to autonomous firefighting robot
CN111241667B (en) Method for identifying plasma configuration based on image processing and probe data processing
CN113299035A (en) Fire identification method and system based on artificial intelligence and binocular vision
CN109830078B (en) Intelligent behavior analysis method and intelligent behavior analysis equipment suitable for narrow space
WO2022127181A1 (en) Passenger flow monitoring method and apparatus, and electronic device and storage medium
CN108537829A (en) A kind of monitor video personnel state recognition methods
CN112906674A (en) Mine fire identification and fire source positioning method based on binocular vision
CN116832380B (en) Fire extinguishing method of fire extinguishing system based on multi-fire-source point evaluation decision
CN111539264A (en) Ship flame detection positioning system and detection positioning method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20180413
