CN116665034A: Three-dimensional matching and flame space positioning method based on edge characteristics

Publication number: CN116665034A (application CN202211412606.7A); granted as CN116665034B
Authority: CN (China); original language: Chinese (zh)
Legal status: granted, active
Inventors: 郭赞权, 王佩, 王建飞
Applicant and assignee: China Fire Rescue College

Classifications

    • G06V 20/10 Terrestrial scenes (scenes; scene-specific elements)
    • G06T 7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters (camera calibration)
    • G06V 10/22 Image preprocessing by selection of a specific region containing or referencing a pattern
    • G06V 10/44 Local feature extraction (edges, contours, corners, strokes, intersections; connectivity analysis)
    • G06V 10/761 Proximity, similarity or dissimilarity measures
    • G06T 2207/30181 Earth observation
    • G06T 2207/30244 Camera pose
    • Y02P 90/30 Computing systems specially adapted for manufacturing

Abstract

The invention discloses a stereo matching and flame space positioning method based on edge features, wherein the stereo matching comprises the following steps: S1, determining the matching primitive, the matching constraint criteria, the similarity measure and the matching strategy; S2, taking the pixel points on the flame edge as matching primitives and screening out best matching points, candidate matching points and unmatched points by double-threshold discrimination, the best matches being taken as correct matches and the candidate matches screened in the next step; S3, further searching for best matches using the parallax compatibility principle and eliminating mismatches; S4, after the unique matches are output, obtaining sub-pixel parallax by binary search. The flame space positioning comprises: S1', determining the flame target space position; S2', obtaining the coordinates of the flame center based on the camera parameters and realizing the positioning of the flame target. Corresponding systems, electronic devices and computer-readable storage media are also disclosed.

Description

Three-dimensional matching and flame space positioning method based on edge characteristics
Technical Field
The invention belongs to the technical field of safety science and engineering, and particularly relates to a three-dimensional matching and flame space positioning method based on edge characteristics.
Background
Detecting a fire in a timely and accurate manner includes both detecting the fire and raising the alarm in time, and detecting the fire accurately in space so that it can be extinguished. Timely detection and early warning alone are far from sufficient: if the fire cannot be located accurately, subsequent extinguishing and investigation become very difficult. Timely fire detection and early warning have long been research hot spots in the field of fire detection. Traditional detection technologies such as temperature sensing, smoke sensing and light sensing are mature, and new image-based detection techniques are being studied in depth, with corresponding products already applied in engineering. However, both conventional detectors and image detectors have mainly addressed the problem of detecting fires in time; there has been little research on how to locate the flame position accurately in space.
Disclosure of Invention
The invention aims to provide a three-dimensional matching and flame space positioning method based on edge features. The edge-feature-based flame image matching method uses double-threshold discrimination and the compatibility principle to match the flame edge feature points, and uses binary search to obtain parallax at the sub-pixel level, laying the foundation for flame positioning. A flame positioning method is realized and a positioning error model is established. The model shows that the positioning error increases with the distance D of the target relative to the camera, decreases with increasing baseline distance b of the system, and decreases with increasing focal length f of the camera; the positioning precision is high.
Referring to fig. 1, in one aspect, the invention provides a stereo matching and flame space positioning method based on edge features, which comprises performing stereo matching of flames based on the edge features and performing flame space positioning based on the stereo matching of the flames; the stereo matching of the flames comprises establishing a correspondence between the pixels of images of the same scene acquired from different angles by two or more cameras;
the method for stereo matching of the flame comprises the following steps:
s1, selecting a matching primitive, determining a matching constraint criterion based on the matching primitive, establishing a similarity measurement function based on the matching constraint criterion, determining a similarity measure based on the similarity measurement function, and determining a search strategy of stereo matching based on the measure;
s2, taking pixel points on the flame edge as matching elements, and screening out optimal matching points, candidate matching points and unmatched points by utilizing double-threshold discrimination; the best match is considered to be the correct match, and the candidate match is screened in the next step;
s3, searching for optimal matching further by utilizing a parallax compatibility principle, and eliminating mismatching;
s4, obtaining sub-pixel parallax according to a binary search method after outputting unique matching;
The flame space positioning includes:
S1', determining a flame target space position;
Through stereo matching, the best matching points of the flame edge features are preliminarily obtained. Assume that points (x_l, y_l) and (x_r, y_r) are a pair of best matching points with parallax d; converting the image coordinates into camera coordinates gives equations (16) and (17).
After calibration and correction of the system, the binocular image pair acquired by the system is forward-parallel aligned in the mathematical sense, so the rotation matrix between the two cameras reduces to the identity matrix and the translation vector is T = [b 0 0]^T, where b is the baseline distance. If the left camera coordinate system is taken as the world coordinate system, the relationship between the two camera coordinate systems can be expressed as equation (18).
According to equations (16), (17) and (18), the spatial three-dimensional coordinates of the point, with the left camera coordinate system as the world coordinate system, are obtained as equation (19).
Taking a single value f as the system focal length, equation (19) reduces to equation (20).
The specific location of the flame is described by the geometric center of the flame region, giving the flame center coordinates of equation (21): x_0 = (1/n)Σx_i, y_0 = (1/n)Σy_i, z_0 = (1/n)Σz_i, where n is the number of matched edge feature points, (x_i, y_i, z_i) are the three-dimensional space coordinates of the matched edge feature points, and (x_0, y_0, z_0) are the three-dimensional space coordinates of the flame center.
Converting the coordinates in the three-dimensional rectangular coordinate system into coordinates in a spherical coordinate system with the imaging center of the left camera as the origin gives the distance D, horizontal deflection angle α and vertical deflection angle β of the fire source center relative to the camera, as in equation (22).
S2', obtaining the coordinates of the flame center based on the camera parameters and realizing the positioning of the flame target, comprising: combining the edge-feature-based flame image stereo matching method and the determination of the flame target space position with the previously acquired camera parameters, so that the coordinates of the flame center are obtained and the flame target is positioned.
Preferably, the matching primitive is the image feature used for stereo matching, and its selection determines which information in the image is used for matching. The matching primitive includes one or more of: the original gray values of the image, salient features in the image, statistical features formed by measurement and evaluation of a preset region in the image, and high-level shape descriptors. The salient features include corner points, intersection points, edges, contours and local curvature maximum points; the statistical features include geometric invariant moments and the centroid. The original gray value of the image is the most basic image information and the most direct and simplest matching primitive, obtainable directly from the imaging system; salient features represent most of the internal structural information in the image and have good invariance to image distortion and good uniqueness; if no obvious salient features are found in the image, high-level shape descriptors, including topology descriptors, morphology descriptors and Fourier descriptors, have better uniqueness and are used as matching primitives; the statistical features of a region describe the statistical information of that region in the image, and the values of the centroid and geometric invariant moments are independent of any particular coordinate system.
Preferably, the matching constraint criteria apply when, after a proper matching primitive has been selected, the image to be matched is searched for the image features corresponding to those selected in the reference image; the matching constraint criteria include:
(1) Uniqueness constraint criteria: each matching primitive in the reference image can only correspond to a unique one of the matching primitives in the image to be matched at most, but in the presence of occlusion, there will be no corresponding point;
(2) Epipolar constraint criteria: according to epipolar geometry, the matching point of a given point in the reference image is confined to its conjugate epipolar line in the image to be matched; during the match search, only the horizontal scanning line of the other image needs to be searched;
(3) Similarity constraint criteria: mutually matched points are projections of the same spatial point, so the points and their respective neighborhoods are similar in contour, gray level and gray-level gradient;
(4) Continuity constraint criteria: the matched disparities should be smooth except for occlusion areas and disparity discontinuity areas caused by boundaries;
(5) Order consistency constraint criteria: the sequence of points on an epipolar line of the reference image and the corresponding sequence of points in the image to be matched have the same ordering;
(6) Mutuality constraint criteria: matching is mutual; assume the left image is first taken as the reference image, and the right image is searched for the matching primitive p_r corresponding to the primitive p_l in the left image, establishing the correspondence p_l → p_r; the right image is then taken as the reference in turn, and the left image is searched for the matching primitive p_l corresponding to p_r, giving p_r → p_l; that is, the correspondence is bidirectional and mutual; if a matching correspondence is not bidirectional, it is deemed unreliable.
Preferably, the similarity measure is a measure describing the degree of similarity between the matching primitives of the image pair, and comprises: first constructing a function describing the difference between the matching primitives; when this difference function attains its minimum, the difference between the two matching primitives is considered smallest, i.e., the similarity is best. Let I_l(x, y) and I_r(x, y) be the image pair to be matched acquired by the system, (x_l, y_l) a point to be matched in the reference image, W a template neighborhood of size m × n centered on that point, and d the parallax. The similarity measure functions include:
the sum of absolute differences (SAD) of pixel gray levels;
the sum of squared differences (SSD) of pixel gray levels;
the zero-mean sum of absolute differences (ZSAD) of pixel gray levels;
the normalized cross-correlation (NCC) function.
Preferably, the stereo matching search strategy includes:
(1) Global optimum search strategy: under the multi-constraint criterion, searching the global by taking the minimum difference function as a target, wherein the global optimal matching strategy comprises the following steps: the dynamic programming method is to implement a global optimal matching strategy on a point set of the reference image and the image to be matched along two corresponding epipolar lines to find a minimum matching difference path; the relaxation method is that on the basis of a similarity constraint criterion, the matching probability of each point is calculated, and then the probability is adjusted by a nonlinear optimization algorithm under a multi-constraint condition so as to find the optimal matching corresponding relation;
(2) Hierarchical matching search strategy: the image is sampled at multiple levels to generate a series of image samples of different resolutions, which are processed in layers by resolution; the match search is first carried out on the low-resolution image layer, yielding a relatively coarse matching position; this position is then used as a template to find the corresponding window in the image of the next layer, where the match search continues to obtain a progressively finer matching position, until the corresponding matching position is found in the original image pair.
Preferably, the step S2 of taking the pixel points on the flame edge as matching primitives, screening out best matching points, candidate matching points and unmatched points by double-threshold discrimination, treating the best matches as correct matches and then screening the candidate matches further includes:
set I 1 and I2 Respectively two images in the image pair to be matched, wherein the sets of flame edge characteristic points in the images are respectively recorded as and />The corresponding gray value is denoted as g i 1 and gj 2 Each point on the edge feature in the image pair is marked with a sitting sign (x 1 ,y 1) and (x2 ,y 2 ) The neighborhood W is defined as a square window with the characteristic point as the center and the size of (2n+1) x (2n+1); in image I 2 And image I 1 Searching for the best match correspondence>The specific steps of (a) are as follows:
(1) Determining a search range and calculating the similarity degree of each group of possible matches;
(2) Carrying out double-threshold judgment on each group of possible matches, and screening the best matching point and the candidate matching point;
(3) Mutuality detection: the above steps are likewise applied to the feature point I_j^2 of image I^2, obtaining the set L_2(I_j^2) of best matching points and candidate matching points of I_j^2; according to the mutuality constraint criterion, if I_i^1 and I_j^2 are each other's best match, the matching relationship (I_i^1, I_j^2) is taken as a best match; the output consists of some best matching points together with the candidate matching point sets L_1(I_i^1) and L_2(I_j^2).
Preferably, the step S3 of further searching for best matches using the parallax compatibility principle and eliminating mismatches comprises: carrying out the next search on the candidate matching set using the parallax compatibility principle, and defining an evaluation function for the matches in the candidate matching set, the magnitude of whose value reflects the degree of compatibility between each pair of matches in the neighborhood, so that the best parallax-compatible matches can be obtained with this function; in the evaluation function, d_ij and d_mn are the parallaxes corresponding to the matches (I_i^1, I_j^2) and (I_m^1, I_n^2) respectively, and a and b are positive constants with a + b = 1; if a pair of candidate matching points satisfies the corresponding discrimination condition, the match is considered a best match; the above two-step processing is applied to all candidate matching points of the two images, removing from the candidate sets all points already contained in unique best matches, and the process is repeated until the number of best matching points no longer changes or the number of iterations reaches a preset maximum; at that point, all best matches and their integer-level disparities are output.
Preferably, the step S4 of obtaining the sub-pixel parallax by binary search after the unique matches are output comprises: defining the normalized correlation coefficient of equation (13) between a reference window W_1(x) and a matching window W_2(x + d_ij); when d_ij is not an integer, calculating the floating-point gray value W_2(x + d_ij) of the matching window by the linear interpolation formula (14), where [d_ij] denotes the largest integer not exceeding d_ij; and computing the dichotomy search according to equation (15), where A is the number of iterations.
Preferably, the method further comprises verifying the effect of stereo matching through a positioning error model; the position of any point in space, expressed in terms of the distance D from the left camera, the horizontal offset angle α and the vertical offset angle β, is given by equation (22), and substituting equation (20) into equation (22) gives equation (23);
establishing an error model for the distance D obtained by the system;
first, differentiating D with respect to the parallax d gives equation (24), where the minus sign indicates only the direction; according to equation (24), a positioning error model can be built as equations (25) and (26), wherein b is the baseline distance, x_l and y_l are the coordinates of the target point in the left image, f is the focal length of the camera, ΔD = |D - D_0| is the absolute error, D is the distance measured by the positioning system, D_0 is the actually measured distance, ΔD/D is the relative error, Δd = |d - d_0| is the matching error, d is the parallax calculated by the stereo matching algorithm, and d_0 is the theoretical parallax; since the coordinate values x_l, y_l of the target point in the left image are far smaller than the focal length f of the camera, √(x_l² + y_l² + f²) ≈ f, and the error model simplifies to equations (27) and (28).
a second aspect of the present invention provides a stereo matching and flame space localization system based on edge features, comprising:
the stereo matching module is used for carrying out stereo matching of flame based on the edge characteristics; and
and the space positioning module is used for performing flame space positioning based on the three-dimensional matching of the flames.
A third aspect of the invention provides an electronic device comprising a processor and a memory, the memory storing a plurality of instructions, the processor being for reading the instructions and performing the method according to the first aspect.
A fourth aspect of the invention provides a computer readable storage medium storing a plurality of instructions readable by a processor and for performing the method of the first aspect.
The method, the device, the electronic equipment and the computer readable storage medium provided by the invention have the following beneficial technical effects:
(1) Based on bionics principles and combining digital image processing, stereoscopic vision and related technologies, a monitoring technology for flame space positioning is realized: two cameras monitor the target environment simultaneously; if a fire is found, binocular images of the flames are obtained in time, the space coordinates of the flame target are obtained according to the binocular stereoscopic vision principle, and the fire is detected accurately in space. This guides fire-fighting facilities and firefighters in extinguishing work and can provide reference data for fire investigation, and it has outstanding theoretical significance and application value.
(2) The edge-feature-based flame image matching method uses double-threshold discrimination and the compatibility principle to match the flame edge feature points, and uses binary search to obtain sub-pixel-level parallax, laying the foundation for flame positioning. A flame positioning method is realized and a positioning error model is established. The model shows that the positioning error increases with the distance D of the target relative to the camera, decreases with increasing baseline distance b of the system and decreases with increasing focal length f of the camera; the positioning accuracy is high.
Drawings
FIG. 1 is a flow chart of a stereo matching and flame spatial localization method based on edge features according to a preferred embodiment of the present invention;
FIG. 2 is a schematic view of camera spherical coordinates according to a preferred embodiment of the present invention;
FIG. 3 is a graph showing experimental and theoretical errors at a focal length of 8mm and a baseline distance of 0.3m, according to a preferred embodiment of the present invention;
FIG. 4 is a graph showing experimental and theoretical errors for a focal length of 6mm and an actual distance of 6m, according to a preferred embodiment of the present invention;
FIG. 5 is a graph showing experimental and theoretical errors at an actual distance of 6m and a baseline distance of 0.3m, in accordance with a preferred embodiment of the present invention;
FIG. 6 is a graph showing experimental error at a focal length of 6mm, according to a preferred embodiment of the present invention;
FIG. 7 is a graph showing experimental error at a focal length of 8mm according to a preferred embodiment of the present invention;
FIG. 8 is a graph showing experimental error at a focal length of 10mm, according to a preferred embodiment of the present invention;
fig. 9 is a schematic structural diagram of an embodiment of an electronic device according to the present invention.
Detailed Description
The following describes in further detail the embodiments of the present invention with reference to the drawings and examples. The following examples are illustrative of the invention and are not intended to limit the scope of the invention.
Example 1
Referring to fig. 1, a stereo matching and flame space positioning method based on edge features is provided, which comprises stereo matching of flames based on the edge features and flame space positioning based on the stereo matching of the flames.
As a preferred embodiment, the content of the stereo matching of the flame includes:
when two or even more cameras acquire images of the same scene in a scene from different angles, the scene will be very different in the images of the different cameras due to the different viewing angles, the geometric and physical characteristics of the object, the environmental noise and the like, and the stereo matching is to establish a corresponding relationship between the pixels of the images. In the prior art, many researches on stereo matching exist, but a general algorithm is not formed, and four problems are inevitably solved no matter what stereo matching method is adopted: i.e. selection of matching primitives, matching constraint criteria, establishment of similarity metric functions, search strategy for stereo matching.
The method for stereo matching of the flame comprises the following steps:
s1, selecting a matching primitive, determining a matching constraint criterion based on the matching primitive, establishing a similarity measurement function based on the matching constraint criterion, determining a similarity measure based on the similarity measurement function, and determining a search strategy of stereo matching based on the measure;
s2, taking pixel points on the flame edge as matching elements, and screening out optimal matching points, candidate matching points and unmatched points by utilizing double-threshold discrimination; the best match is considered to be the correct match, and the candidate match is screened in the next step;
s3, searching for optimal matching further by utilizing a parallax compatibility principle, and eliminating mismatching;
and S4, after the unique matching is output, obtaining the sub-pixel parallax according to a binary search method.
As a preferred embodiment, the matching primitive is the image feature used for stereo matching, and its selection determines which information in the image is used for matching, so a suitable matching primitive is the basis for carrying out stereo matching smoothly. There is no general theory on how to select matching primitives, which results in a diversity of choices. Many features are available in practice: they may be the original gray values of the image, or prominent features in the image, such as corner points, edges, contours and local curvature maximum points; they may also be statistical features (measurements and evaluations of a predetermined area in the image), such as geometric invariant moments and the centroid.
(1) Image gray scale
The original gray level of the image is the most basic image information, and is also the most direct and simplest matching primitive, and can be directly obtained by an imaging system.
(2) Salient features
Salient features in the image, such as corner points, intersection points, edges, contours and local curvature maximum points, often represent most of the internal structural information in the image; because they have good invariance to image distortion and good uniqueness, they greatly reduce the complexity of a matching algorithm, are frequently selected as matching primitives and are widely applied in stereo matching. In addition, if no obvious salient features are found in the image, high-level shape descriptors, such as topology descriptors, morphology descriptors and Fourier descriptors, are often used as matching primitives owing to their better uniqueness.
(3) Statistical features
The statistical features of a region in an image are also often used as matching primitives; they describe the statistical information of that region. Common statistical features are the centroid and the geometric invariant moments, whose values are independent of any particular coordinate system. However, they lack the concrete spatial meaning of features such as corner points, so the whole matching problem is converted into maximizing the similarity measure of the respective feature values in the two images, and the intuitive physical meaning of the matching problem itself is lost. Research has therefore suggested using special shape points, such as the centroid and radius-weighted averages, to simplify shape matching: such features are easier to compute, have some robustness to noise and, more importantly, are no longer abstract but have intuitive spatial meaning, so their practicability is higher.
As a preferred embodiment, the matching constraint criteria are another key element of the stereo matching problem: after a proper matching primitive is selected, the image to be matched should be searched for the image features corresponding to those selected in the reference image. When a scene is projected into images, it may look very different at different viewing angles owing to differences in viewpoint, the geometric and physical characteristics of objects, environmental noise and so on. A matching primitive in the reference image may therefore have several similar candidate primitives in the image to be matched, i.e., a one-to-many situation, whereas in practice only one feature in the image pair should correspond to the match. Constraint criteria must thus be introduced into the stereo matching algorithm to narrow the match search and determine the correct correspondence of feature primitives in the image pair. In other words, stereo matching is essentially an optimal search problem under multiple constraints. Several common matching constraint criteria follow:
(1) Uniqueness constraint criteria
The uniqueness constraint is the most fundamental constraint in stereo matching. In short, each matching primitive in the reference image can at most correspond to only one matching primitive in the image to be matched, but in the presence of occlusion there will be no corresponding point.
(2) Epipolar constraint criteria
From epipolar geometry, the matching point of a given point in the reference image must lie on its conjugate epipolar line in the image to be matched. This constraint greatly reduces the complexity of the matching problem, reducing the match search from a two-dimensional global image to a one-dimensional conjugate line. In particular, when the binocular image pair is in forward-parallel alignment, parallax between corresponding image points in the Y-axis direction is effectively eliminated, so the match search need only proceed along the horizontal scanning line of the other image, greatly reducing the matching difficulty.
(3) Similarity constraint criteria
The similarity constraint criterion is the core of a stereo matching algorithm: mutually matched points are projections of the same spatial point, so the points and their respective neighborhoods are similar in appearance contour (e.g., geometric shape) or in certain physical quantities (e.g., regional gray level, gray-level gradient). This is also the basis for constructing the objective function during the match search.
(4) Continuity constraint criterion
The matched disparities should be smooth except in occlusion areas and in disparity discontinuity areas caused by boundaries. Specifically, if two points P and Q are close together on the same surface, their projections in the image pair are generally also close; thus, once the match between p_l and p_r is established, the point q_r matching q_l should lie near p_r.
(5) Order consistency constraint criteria
The order consistency constraint criterion states that the sequence of points on an epipolar line of the reference image and the corresponding sequence of points in the image to be matched have the same ordering.
(6) Mutuality constraint criteria
The mutuality constraint criteria describe the mutual nature of matching. Specifically: assume the left image is first taken as the reference image, and the right image is searched for the matching primitive p_r corresponding to the primitive p_l in the left image, establishing the correspondence p_l → p_r; the right image is then taken as the reference in turn, and the left image is searched for the matching primitive p_l corresponding to p_r, giving p_r → p_l; that is, the correspondence is bidirectional and mutual. If a matching correspondence is not bidirectional, it is deemed unreliable.
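As an illustration of this criterion, here is a minimal sketch of a left-right consistency check (a generic implementation for illustration, not the patent's own code; the match arrays are hypothetical inputs produced by whatever similarity search is used):

```python
def mutual_consistency(matches_lr, matches_rl):
    """Keep only bidirectional correspondences.

    matches_lr[i]: index j of the right-image primitive matched to left
    primitive i, or None if unmatched; matches_rl is the reverse mapping.
    """
    reliable = []
    for i, j in enumerate(matches_lr):
        # p_l -> p_r is kept only when confirmed by p_r -> p_l
        if j is not None and matches_rl[j] == i:
            reliable.append((i, j))
    return reliable
```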
As a preferred embodiment, the similarity measure is a measure describing the degree of similarity between the matching primitives of an image pair; in short, it determines how the correspondence between matching primitives is decided. The general idea is first to construct a function describing the difference between matching primitives; when this difference function attains its minimum, the difference between the two primitives is considered smallest, i.e., the similarity is best. In general, the form of the similarity measure varies from algorithm to algorithm.
Let I_l(x, y) and I_r(x, y) be the image pair to be matched acquired by the system, (x_l, y_l) a point to be matched in the reference image, W a template neighborhood of size m × n centered on that point, and d the parallax. Several similarity measure functions are common:
the sum of absolute differences (SAD) of pixel gray levels;
the sum of squared differences (SSD) of pixel gray levels;
the zero-mean sum of absolute differences (ZSAD) of pixel gray levels;
the normalized cross-correlation (NCC) function.
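For reference, the standard textbook forms of these four measures, written in the notation just defined (a reconstruction consistent with the surrounding definitions; the patent's own equations for these measures are not reproduced and may differ in detail):

```latex
\begin{aligned}
C_{\mathrm{SAD}}(d)  &= \sum_{(u,v)\in W}\bigl|I_l(x+u,\,y+v)-I_r(x+u+d,\,y+v)\bigr| \\
C_{\mathrm{SSD}}(d)  &= \sum_{(u,v)\in W}\bigl(I_l(x+u,\,y+v)-I_r(x+u+d,\,y+v)\bigr)^{2} \\
C_{\mathrm{ZSAD}}(d) &= \sum_{(u,v)\in W}\bigl|\bigl(I_l(x+u,\,y+v)-\bar I_l\bigr)-\bigl(I_r(x+u+d,\,y+v)-\bar I_r\bigr)\bigr| \\
C_{\mathrm{NCC}}(d)  &= \frac{\sum_{(u,v)\in W}\bigl(I_l-\bar I_l\bigr)\bigl(I_r-\bar I_r\bigr)}{\sqrt{\sum_{(u,v)\in W}\bigl(I_l-\bar I_l\bigr)^{2}\,\sum_{(u,v)\in W}\bigl(I_r-\bar I_r\bigr)^{2}}}
\end{aligned}
```

Here Ī_l and Ī_r denote the mean gray values over the respective windows; the first three measures are minimized over the parallax d, while NCC is maximized.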
As a preferred embodiment, the stereo matching search strategy includes the following. From the foregoing analysis, stereo matching is essentially an optimal search problem under multiple constraints: whatever matching primitive is selected, whatever constraint criteria are imposed and however the differences between matching primitives are evaluated, the matching finally comes down to a search, and a proper search strategy greatly improves the effect and efficiency of stereo matching.
(1) Global optimum search strategy
The global optimal search strategy, as the name implies, searches globally with the minimum of the difference function as the target under multi-constraint criteria; such a match search effectively avoids the influence of interference sources and the trap of local optimal solutions, but it is computationally heavy, slow and inefficient. The most common global optimal matching strategies are dynamic programming and relaxation. Dynamic programming implements the global optimal matching strategy on the point sets of the reference image and the image to be matched along two corresponding epipolar lines, finding the path of minimum matching difference; relaxation computes the matching probability of each point on the basis of the similarity constraint criterion and then adjusts the probabilities with a nonlinear optimization algorithm under multiple constraints to find the optimal matching correspondence.
(2) Hierarchical matching search strategy
The hierarchical matching search strategy is a coarse-to-fine hierarchical search process. Its basic idea is: sample the image at multiple levels to generate a series of image samples of different resolutions, and process them in layers by resolution; first carry out the match search on the low-resolution image layer, obtaining a relatively coarse matching position; then use this position as a template to find the corresponding window in the next image layer and continue the match search within that window, obtaining a progressively finer matching position until the corresponding matching position is found in the original image pair; a sketch of this strategy follows.
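A minimal sketch of this coarse-to-fine strategy, built on an image pyramid (illustrative only; match_in_window is a hypothetical helper returning the disparity that optimizes the chosen similarity measure within a search interval):

```python
import cv2

def pyramid_match(ref, tgt, point, levels=3, radius=32):
    """Coarse-to-fine disparity search for one reference point."""
    refs, tgts = [ref], [tgt]
    for _ in range(levels - 1):            # index 0 = full resolution
        refs.append(cv2.pyrDown(refs[-1]))
        tgts.append(cv2.pyrDown(tgts[-1]))

    d = 0
    for lvl in range(levels - 1, -1, -1):  # start at the coarsest layer
        x, y = point[0] >> lvl, point[1] >> lvl
        # the estimate from the coarser layer doubles at each finer layer
        # and is refined within a shrinking search window
        d = match_in_window(refs[lvl], tgts[lvl], (x, y),
                            center=2 * d, r=max(1, radius >> lvl))
    return d
```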
As mentioned above, stereo matching is essentially an optimal search problem under multiple constraints, and the process involves several key issues: the selection of matching primitives, the matching constraint criteria, the establishment of a similarity metric function and the search strategy for stereo matching. The edge contour, one of the common features of a flame image, carries much important information about the flame and is relatively simple to process, so selecting the edge features of the flame image as matching primitives is highly practical. Combining the characteristics of flame images, this embodiment provides a flame image matching method based on edge features. The basic idea is: first, take the pixel points on the flame edge as matching primitives and screen out best matching points, candidate matching points and unmatched points by double-threshold discrimination; treat the best matches as correct matches and screen the candidate matches in the next step, where the parallax compatibility principle is used to search further for best matches and eliminate mismatches; finally, output the unique matches and obtain sub-pixel parallax by binary search.
As a preferred embodiment, the step S2 of taking the pixel points on the flame edge as matching primitives, screening out best matching points, candidate matching points and unmatched points by double-threshold discrimination, treating the best matches as correct matches and then screening the candidate matches further includes:
set I 1 and I2 Respectively two images in the image pair to be matched, wherein the sets of flame edge characteristic points in the images are respectively marked as { I } i 1} and {Ij 2 Corresponding gray values are noted g i 1 and gj 2 Each point on the edge feature in the image pair is marked with a sitting sign (x 1 ,y 1) and (x2 ,y 2 ) The neighborhood W is defined as a square window of (2n+1) × (2n+1) size centered on the feature point. Then, in image I 2 And image I 1 Searching for the best match correspondence (I) i 1 ,I j 2 ) The specific steps of (a) are as follows:
(1) Determining search range and calculating similarity degree of each group of possible matches
The search range S is centered on the point (x_1, y_1) and has size (2H_max + 1) × (2V_max + 1), where H_max and V_max are the horizontal and vertical maximum disparities respectively; clearly the coordinates (x_2, y_2) of I_j^2 should satisfy:
{(x_2, y_2) | x_1 - H_max ≤ x_2 ≤ x_1 + H_max, y_1 - V_max ≤ y_2 ≤ y_1 + V_max}   (5)
Since the system is forward-parallel aligned, the vertical parallax should theoretically be 0 according to epipolar geometry; however, in view of various uncertainties, a small V_max is still taken as the vertical maximum parallax, which enlarges the match search range and improves the accuracy of stereo matching.
For each group of possible matches, the degree of similarity is calculated, here using the normalized cross-correlation function of equation (6); the neighborhood W is typically a 9 × 9 square window.
(2) Performing double-threshold discrimination on each group of possible matches, and screening the best matching point and the candidate matching point
In stereo matching, the match with the largest cross-correlation value is generally taken as the best matching point; in many cases, however, the maximum of the cross-correlation function differs little from the second maximum, and mismatching easily occurs. Double-threshold discrimination is therefore adopted here, with two levels of thresholds used to screen out the best matching points and the candidate matching points respectively. Let C_max(I_i^1, I_p^2) and C_sec(I_i^1, I_q^2) be the maximum and second maximum of the cross-correlation function, and let T_1 and T_2 be the thresholds on the maximum and the second maximum respectively. If a match satisfies the first discrimination condition, the matching point is considered the best matching point; if it satisfies only the second condition, it is considered a candidate matching point and recorded in L_1(I_i^1); the remaining points are unmatched points.
(3) Mutuality detection
The above steps are likewise applied to the feature point I_j^2 of image I^2, obtaining the set L_2(I_j^2) of best matching points and candidate matching points of I_j^2. According to the mutuality constraint criterion, if I_i^1 and I_j^2 are each other's best match, the matching relationship (I_i^1, I_j^2) is taken as a best match. The output at this point consists of some best matching points together with the candidate matching point sets L_1(I_i^1) and L_2(I_j^2).
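The double-threshold screening of step (2) can be rendered as a short sketch (illustrative only; ncc_scores maps each candidate point in the search range to its correlation value from equation (6), and the exact inequalities of the two threshold tests are assumptions here, since the conditions themselves are not reproduced above):

```python
def screen_matches(ncc_scores, T1, T2):
    """Return (best, candidates) for one left-image feature point.

    ncc_scores: dict mapping candidate point -> correlation value.
    Assumed tests: a best match needs a high maximum that clearly
    dominates the second maximum; weaker peaks become candidates.
    """
    ranked = sorted(ncc_scores.items(), key=lambda kv: kv[1], reverse=True)
    if not ranked:
        return None, []
    c_max = ranked[0][1]
    c_sec = ranked[1][1] if len(ranked) > 1 else 0.0
    if c_max >= T1 and c_max - c_sec >= T2:
        return ranked[0][0], []         # unique, confident best match
    candidates = [p for p, c in ranked if c >= T2]
    return None, candidates             # defer to the compatibility stage
```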
As a preferred embodiment, the step S3 of further searching for best matches using the parallax compatibility principle and eliminating mismatches includes:
(4) Searching the candidate matching set further using the parallax compatibility principle
Here, an evaluation function can be defined for the matches in the candidate matching set; the magnitude of its value reflects the degree of compatibility between pairs of matches in the neighborhood, and the best parallax-compatible match can be obtained with it. In the evaluation function, d_ij and d_mn are the parallaxes corresponding to the matches (I_i^1, I_j^2) and (I_m^1, I_n^2) respectively, and a and b are positive constants with a + b = 1.
If a pair of candidate matching points satisfies the corresponding discrimination condition, the match is considered the best match.
The above two-step processing is applied to all candidate matching points of the two images, removing from the candidate sets all points already contained in unique best matches, and the process is repeated until the number of best matching points no longer changes or the number of iterations reaches a preset maximum. At that point, all best matches and their integer-level disparities are output.
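The iteration just described can be sketched schematically (the evaluation function is not reproduced above, so evaluate and conflicts are hypothetical callables: evaluate scores a candidate's disparity compatibility against the current best matches, and conflicts tests whether two matches share a feature point):

```python
def refine_by_compatibility(best, candidates, evaluate, conflicts, max_iter=100):
    """Iteratively promote parallax-compatible candidates to best matches."""
    for _ in range(max_iter):
        n_before = len(best)
        for m in list(candidates):
            rivals = [c for c in candidates if c is not m and conflicts(c, m)]
            # promote m only if it beats every rival sharing an endpoint
            if all(evaluate(m, best) > evaluate(c, best) for c in rivals):
                best.append(m)
                candidates = [c for c in candidates
                              if c is not m and not conflicts(c, m)]
        if len(best) == n_before:       # best-match count unchanged: stop
            break
    return best
```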
As a preferred embodiment, the step S4 of obtaining the sub-pixel parallax according to the binary search method after outputting the unique match includes:
(5) Acquiring sub-pixel disparities by binary search [48]
The previous search has established the best matching set and obtained the matched integer-level disparities; here the normalized correlation coefficient of equation (13) is defined between a reference window W_1(x) and a matching window W_2(x + d_ij).
When d_ij is not an integer, the floating-point gray value W_2(x + d_ij) of the matching window is calculated by the linear interpolation formula:
W_2(x + d_ij) = ([d_ij] + 1 - d_ij) · W_2(x + [d_ij]) + (d_ij - [d_ij]) · W_2(x + [d_ij] + 1)   (14)
where [d_ij] denotes the largest integer not exceeding d_ij.
The dichotomy search is computed according to equation (15), where A is the number of iterations.
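A sketch of the sub-pixel refinement under stated assumptions: corr evaluates the normalized correlation of equation (13) at a real-valued disparity using the interpolation of equation (14), and since the exact update rule of equation (15) is not reproduced above, a generic interval-halving search around the integer disparity is shown instead:

```python
import numpy as np

def interp_window(row, x, d):
    """Floating-point window value via the linear interpolation of eq. (14);
    a corr implementation would use this to sample W_2 at x + d."""
    lo = int(np.floor(d))
    frac = d - lo
    return (1.0 - frac) * row[x + lo] + frac * row[x + lo + 1]

def subpixel_disparity(corr, d0, iters=10):
    """Halve the interval around the integer disparity d0, keeping the
    half whose probe point correlates better (iters = A iterations)."""
    left, right = d0 - 0.5, d0 + 0.5
    for _ in range(iters):
        mid = 0.5 * (left + right)
        step = 0.25 * (right - left)
        if corr(mid - step) > corr(mid + step):
            right = mid
        else:
            left = mid
    return 0.5 * (left + right)
```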
As a preferred embodiment, the flame space positioning comprises:
S1', determining the flame target space position;
Through stereo matching, the best matching points of the flame edge features are preliminarily obtained. Assume that points (x_l, y_l) and (x_r, y_r) are a pair of best matching points with parallax d; converting the image coordinates into camera coordinates gives equations (16) and (17).
After calibration and correction of the system, the binocular image pair acquired by the system is forward-parallel aligned in the mathematical sense, so the rotation matrix between the two cameras reduces to the identity matrix and the translation vector is T = [b 0 0]^T, where b is the baseline distance. If the left camera coordinate system is taken as the world coordinate system, the relationship between the two camera coordinate systems can be expressed as equation (18).
According to equations (16), (17) and (18), the spatial three-dimensional coordinates of the point, with the left camera coordinate system as the world coordinate system, are readily obtained as equation (19).
although the system uses cameras of the same focal length as much as possible, the focal length of the camera obtained by calibration is often not exactly the same. And in fact there is f at this time r ≈f l Here, for the purpose of convenient calculation, takeAs the system focal length, then equation (19) can be reduced to:
to this end, three-dimensional coordinates of a series of edge feature points have been obtained, and such coordinates do not describe the specific location of the flame concisely and clearly, and therefore, the specific location of the flame is generally described by the geometric center of the flame region, then there are flame center location coordinates:
wherein n is the number of edge features corresponding to the matching (x) i ,y i ,z i ) To match the three-dimensional space coordinates of the corresponding edge feature points, (x) 0 ,y 0 ,z 0 ) Is in the center of flameThree-dimensional space coordinates.
Considering that coordinates in the three-dimensional rectangular coordinate system cannot intuitively represent the position of the flame relative to the camera, they are converted into coordinates in a spherical coordinate system with the imaging center of the left camera as the origin. From the geometric relationship of the camera spherical coordinates in Fig. 2, the distance D, horizontal deflection angle α and vertical deflection angle β of the fire source center relative to the camera are easily obtained, as given by equation (22).
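One consistent reconstruction of equation (22), for the flame center (x_0, y_0, z_0) of equation (21) (the axis and sign conventions of Fig. 2 are assumed here rather than confirmed by the text):

```latex
D = \sqrt{x_0^{2}+y_0^{2}+z_0^{2}}, \qquad
\alpha = \arctan\frac{x_0}{z_0}, \qquad
\beta  = \arctan\frac{y_0}{\sqrt{x_0^{2}+z_0^{2}}}
```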
S2', obtaining the coordinates of the flame center based on the camera parameters and realizing the positioning of the flame target, comprising: combining the edge-feature-based flame image stereo matching method and the determination of the flame target space position with the previously acquired camera parameters, so that the coordinates of the flame center are obtained and the flame target is positioned.
As a preferred embodiment, a positioning error model is established.
In flame target positioning, the properties of the cameras and the imaging transformation relationship are determined by camera calibration, so stereo matching becomes the key factor affecting the positioning effect. The result of stereo matching ultimately appears as the parallax of the matched target. Assuming the parallax error obtained by stereo matching is Δd, an error model for target localization can be established by simple derivation.
The position of any point in space, expressed in terms of the distance D from the left camera, the horizontal offset angle α and the vertical offset angle β, is given by equation (22); substituting equation (20) into equation (22) gives equation (23).
as can be seen from equation (23), the horizontal offset angle α and the vertical offset angle β can be directly obtained from the pixel coordinates of the image, and are irrelevant to the binocular system, and the error of the binocular system matching only affects the distance D between the target and the camera, because the image point, the object point and the imaging center are located on the same ray, and therefore, the same horizontal offset angle α and the same vertical offset angle β exist under the spherical coordinate system of the camera: the position of the target can be uniquely determined by a two-dimensional image, but the depth of the target cannot be acquired. The binocular parallax exists, so that the binocular system can accurately acquire the depth information of the target, and the essence of binocular positioning is to acquire the depth information of the target, namely the measurement of the distance D. Thus, the present embodiment builds an error model for the distance D obtained by the system.
First, differentiating D with respect to the parallax d gives equation (24),
where the minus sign indicates only the direction. According to equation (24), a positioning error model can be built as equations (25) and (26), wherein b is the baseline distance, x_l and y_l are the coordinates of the target point in the left image, f is the focal length of the camera, ΔD = |D - D_0| is the absolute error, D is the distance measured by the positioning system, D_0 is the actually measured distance, ΔD/D is the relative error, Δd = |d - d_0| is the matching error, d is the parallax calculated by the stereo matching algorithm of this embodiment, and d_0 is the theoretical parallax. Since the coordinate values x_l, y_l of the target point in the left image are far smaller than the focal length f of the camera, √(x_l² + y_l² + f²) ≈ f, and the error model simplifies to equations (27) and (28).
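For reference, a reconstruction of the error model consistent with equation (20) and with the conclusions drawn below (a sketch from the surrounding definitions, using D ≈ bf/d once x_l, y_l ≪ f):

```latex
\frac{\partial D}{\partial d} = -\frac{b\sqrt{x_l^{2}+y_l^{2}+f^{2}}}{d^{2}}
\;\approx\; -\frac{b f}{d^{2}}, \qquad
\Delta D \approx \frac{D^{2}}{b f}\,\Delta d, \qquad
\frac{\Delta D}{D} \approx \frac{D}{b f}\,\Delta d
```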
from the formulas (27) and (28), it can be derived that:
(1) Under a certain focal length f and a baseline distance b, the positioning error is in a direct proportion to the square of the distance D between the target and the camera, and the larger the distance D between the target and the camera is, the larger the positioning error is;
(2) Under a certain focal length f and a distance D between the target and the cameras, the positioning error is inversely related to a base line distance b between the two cameras, and the smaller the base line distance b is, the larger the positioning error is;
(3) At a certain baseline distance b and a distance D between the target and the camera, the larger the focal length f of the camera is, the higher the positioning accuracy of the system is, and the smaller the positioning error is.
Experiment and result analysis
In order to verify the effectiveness and positioning accuracy of the system, this embodiment carries out a binocular-vision-based flame positioning experiment, testing the system with cameras of different focal lengths under different baseline distances b and different distances D.
1. Experimental equipment
(a) LHY digital infrared cameras with different focal lengths
The main parameters of the camera are shown in Table 1:
table 1 main parameters of camera
(b) Two space-sensitive image acquisition cards
(c) Wo Shi-7904D high-definition hard disk video recorder
(d) A black-and-white checkerboard calibration reference of 6 × 9 squares
(e) High-performance computer
The main parameters of the computer are shown in Table 2:
TABLE 2 principal parameters of computer
2. Experimental procedure
(1) Building parallel alignment binocular vision system
In the experiment, two LHY digital infrared cameras of identical model and parameters are used; the two cameras, with the same focal length, are placed in parallel alignment at the same height as far as possible, and the distance between them is adjusted so that the baseline distance equals 0.1 m, 0.2 m, 0.3 m, 0.4 m, 0.5 m and 0.6 m in turn.
(2) Camera calibration
Using the built binocular vision system, 12 image pairs of the calibration reference object in different orientations are obtained by the binocular image acquisition method; the system is calibrated according to the method of Chapter 3, solving the parameters of the two cameras and the rotation matrix R and displacement vector T of the right camera relative to the left camera; these parameters are used to stereo-rectify the calibration reference images, and the rectified calibration reference images are stereo-calibrated again to obtain the corrected rotation matrix R* and translation vector T*; comparative analysis of the rotation matrices R, R* and translation vectors T, T* before and after correction shows a good correction effect.
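This calibration and rectification step can be sketched with OpenCV (an illustrative pipeline under assumed variable names such as obj_pts, left_pts, right_pts and image_size; it stands in for, and is not, the Chapter 3 method):

```python
import cv2

# obj_pts: per-view (N, 3) checkerboard corner coordinates in world units;
# left_pts / right_pts: matching (N, 2) image corners from the 12 pairs;
# K1, D1, K2, D2: intrinsics from a prior monocular calibration of each camera.
err, K1, D1, K2, D2, R, T, E, F = cv2.stereoCalibrate(
    obj_pts, left_pts, right_pts, K1, D1, K2, D2, image_size,
    flags=cv2.CALIB_FIX_INTRINSIC)

# R1, R2 rotate the two cameras into a common forward-parallel frame;
# P1, P2 are the rectified projection matrices.
R1, R2, P1, P2, Q, roi1, roi2 = cv2.stereoRectify(
    K1, D1, K2, D2, image_size, R, T)

map1x, map1y = cv2.initUndistortRectifyMap(K1, D1, R1, P1, image_size, cv2.CV_32FC1)
map2x, map2y = cv2.initUndistortRectifyMap(K2, D2, R2, P2, image_size, cv2.CV_32FC1)
left_rect = cv2.remap(left_img, map1x, map1y, cv2.INTER_LINEAR)
right_rect = cv2.remap(right_img, map2x, map2y, cv2.INTER_LINEAR)
```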
(3) Capturing images of flames
Using the binocular image acquisition method, flame images at distances of 2 m, 4 m, 6 m, 8 m, 10 m and 12 m from the cameras are recorded synchronously to acquire synchronous flame image pairs.
(4) Stereoscopic correction of binocular image pairs
The flame images obtained in step (3) are stereo-rectified using the camera parameters and the rotation matrix R and translation vector T between the two cameras obtained in step (2), so that the left and right images are strictly parallel-aligned in the mathematical sense.
(5) Segmentation of flame regions to extract flame boundaries
The binocular image pair is segmented using the image processing method introduced in Chapter 4 to obtain the flame target region and its edge contour.
(6) Three-dimensional matching is carried out on flame boundary characteristics and flame center position coordinates are calculated
Stereo matching is performed using the edge-feature-based flame image matching method, the parallax is solved, and the three-dimensional space coordinates of the flame center and the distance D, horizontal deflection angle α and vertical deflection angle β relative to the left camera are then calculated.
(7) Comparative analysis of results
The distance D of the flame target relative to the left camera obtained in step (6) is compared with the actually measured distance.
3. Experimental results and analysis
Through repeated experiments, the following were obtained for each configuration: the camera focal length, the rotation matrices R, R* and translation vectors T, T* of the right camera relative to the left camera before and after stereo rectification, and the flame center coordinates at the different distances.
Given the complexity of the system, a large amount of intermediate data is generated during the experiment; the experimental results are therefore analyzed here from two perspectives: the effectiveness of the algorithm itself and the positioning performance of the system.
(a) Validity analysis of the algorithm itself
Because the system and the experimental procedure are complex and generate much intermediate data, such as the camera calibration results, the stereo rectification results and the final positioning results, it is necessary to check whether the intermediate stages of the algorithm are reasonable and effective.
Taking the monocular calibration of the system's camera at a 6 mm focal length and 0.1 m baseline distance as an example, the results are shown in Table 3. The results show a good calibration effect: the calibrated focal length differs little from the actual focal length of 6 mm, with errors within 1%.
Table 3 Camera focal-length calibration data at a 6 mm focal length
Taking the binocular stereo calibration and stereo rectification of the system at a 6 mm focal length and 0.1 m baseline distance as an example, the results are shown in Table 4. The results show that, after stereo calibration and rectification of the binocular image pair, the rotation matrix is close to the ideal identity matrix, the translation vector is close to the ideal translation vector, and the computed baseline distance differs only slightly from the actual 0.1 m, with an error within 1%.
Table 4 Stereo calibration and rectification data at a 6 mm focal length
An error model has been established for the stereo matching algorithm; here the actual error is compared against the theoretical error models for Δd = 0.5 pixel and Δd = 1 pixel. Substituting the matching errors Δd = 0.5 pixel and Δd = 1 pixel into equation (27) yields theoretical error curves, which are compared with the actual error. Figures 3 to 5 show this comparison for three different cases (for reasons of space, only these examples are shown). The actual error in the experiment falls essentially within the Δd = 1 pixel theoretical curve and lies close to the Δd = 0.5 pixel curve; under otherwise fixed conditions, the positioning error increases with distance, decreases with increasing baseline distance, and decreases with increasing camera focal length. The stereo matching of the system therefore performs well, with a matching error of roughly 0.5 pixel, meeting the sub-pixel matching requirement. A sketch of the theoretical-curve computation follows.
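The exact form of equation (27) is not reproduced in this text; the sketch below assumes the simplified model ΔD ≈ D²·Δd/(b·f), which follows from D ≈ b·f/d, and the baseline and pixel focal-length values are arbitrary placeholders.

```python
import numpy as np

def theoretical_error(D, b, f_pix, delta_d):
    """Theoretical positioning error (m) at target distance D (m), for
    baseline b (m), focal length f_pix (pixels), matching error delta_d (px)."""
    return (D ** 2) * delta_d / (b * f_pix)

D = np.arange(2.0, 12.1, 2.0)   # the test distances used in the experiment
for dd in (0.5, 1.0):
    print(f"delta_d = {dd} px:", theoretical_error(D, b=0.3, f_pix=1500.0, delta_d=dd))
```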
(b) System positioning effect analysis
The spatial positioning of the flame is ultimately expressed as the determination of the distance D of the flame target relative to the camera. When neither the baseline nor the focal length is too small, the positioning error is essentially within 0.2 m, a good result. When both the baseline and the focal length are small, positioning at short range remains good while the error at longer range is relatively higher; this follows from the nature of the system itself: by equation (27), if the focal length and the baseline distance are both small and the target is far away, even a small matching error produces a large positioning error. Taken as a whole, however, the positioning error of the system is within 1%, which is a good result.
In conclusion, the algorithm of the system is reasonable and effective, the final positioning performance is good, and the accuracy and reliability are high.
Example two
A stereo matching and flame space localization system based on edge features, comprising:
a stereo matching module, configured to perform stereo matching of the flame based on the edge features; and
a spatial positioning module, configured to perform flame spatial positioning based on the stereo matching of the flame.
The invention also provides a memory storing a plurality of instructions for implementing the method according to embodiment one.
As shown in fig. 9, the present invention further provides an electronic device, comprising a processor 301 and a memory 302 connected to the processor 301, the memory 302 storing a plurality of instructions that can be loaded and executed by the processor so that the processor can perform the method according to embodiment one.
While preferred embodiments of the present invention have been described, additional variations and modifications in those embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. It is therefore intended that the following claims be interpreted as including the preferred embodiments and all such alterations and modifications as fall within the scope of the invention. It will be apparent to those skilled in the art that various modifications and variations can be made to the present invention without departing from the spirit or scope of the invention. Thus, it is intended that the present invention also include such modifications and alterations insofar as they come within the scope of the appended claims or the equivalents thereof.

Claims (10)

1. A stereo matching and flame space positioning method based on edge features, characterized by comprising: performing stereo matching of flames based on edge features, and performing flame space positioning based on the stereo matching of the flames; the stereo matching of the flame comprises: establishing a correspondence between the pixels of images of the same scene acquired from different angles by two or more cameras;
the method for stereo matching of the flame comprises the following steps:
s1, selecting a matching primitive, determining a matching constraint criterion based on the matching primitive, establishing a similarity measurement function based on the matching constraint criterion, determining a similarity measure based on the similarity measurement function, and determining a search strategy of stereo matching based on the measure;
s2, taking pixel points on the flame edge as matching elements, and screening out optimal matching points, candidate matching points and unmatched points by utilizing double-threshold discrimination; the best match is considered to be the correct match, and the candidate match is screened in the next step;
s3, searching for optimal matching further by utilizing a parallax compatibility principle, and eliminating mismatching;
s4, obtaining sub-pixel parallax according to a binary search method after outputting unique matching;
The flame space positioning includes:
s1', determining a flame target space position;
through stereo matching, the best matching points of the flame edge features are preliminarily obtained; suppose (x_l, y_l) and (x_r, y_r) are a pair of best matching points with parallax d; converting the image coordinates into camera coordinates gives equations (16) and (17);
after the calibration and rectification of the system, the binocular image pairs acquired by the system are row-aligned in the mathematical sense; the rotation matrix between the two cameras is then the identity matrix, R = I, and the translation vector is T = [b 0 0]^T, where b is the baseline distance; if the left camera coordinate system is taken as the world coordinate system, the relationship between the two camera coordinate systems can be expressed as equation (18); according to equations (16), (17) and (18), the spatial three-dimensional coordinates of the point, with the left camera coordinate system as the world coordinate system, are obtained as equation (19); taking the mean of the two calibrated focal lengths as the system focal length f, equation (19) reduces to equation (20): x = b·x_l/d, y = b·y_l/d, z = b·f/d;
describing the specific location of the flame by the geometric center of the flame region, the flame center position coordinates are given by equation (21): x_0 = (1/n)·Σx_i, y_0 = (1/n)·Σy_i, z_0 = (1/n)·Σz_i;
wherein n is the number of matched edge feature points, (x_i, y_i, z_i) are the three-dimensional space coordinates of the matched edge feature points, and (x_0, y_0, z_0) are the three-dimensional space coordinates of the flame center;
converting the coordinates in the three-dimensional rectangular coordinate system into coordinates in a spherical coordinate system with the imaging center of the left camera as the origin gives the distance D, the horizontal deflection angle α and the vertical deflection angle β of the fire-source center relative to the camera, as in equation (22);
S2', obtaining the coordinates of the flame center based on the camera parameters and positioning the flame target, comprising: combining the edge-feature-based flame image stereo matching method and the determination of the flame target spatial position with the previously acquired camera parameters to obtain the coordinates of the flame center, thereby positioning the flame target.
2. The stereo matching and flame space positioning method based on edge features according to claim 1, wherein the matching primitive is the image feature used for stereo matching, and the choice of matching primitive determines which information in the image is used for matching, the matching primitive comprising one or more of: the original gray values of the image, salient features in the image, statistical features formed by measurement and evaluation over a preset region of the image, and high-level shape descriptors; the salient features comprise corner points, intersection points, edges, contours and points of local curvature maxima; the statistical features comprise geometric invariant moments and the centroid; the original gray values are the most basic image information and the most direct and simplest matching primitive, obtainable directly from the imaging system; the salient features represent most of the internal structural information of the image and have good invariance to image distortion as well as good uniqueness; if no obvious salient features are found in the image, the high-level shape descriptors, which have better uniqueness, are used as matching primitives, including topology descriptors, morphology descriptors and Fourier descriptors; the statistical features of a region of the image are also often used as matching primitives and describe the statistical information over that region, the values of the centroid and of the geometric invariant moments being independent of any particular coordinate system.
3. The stereo matching and flame space positioning method based on edge features according to claim 2, wherein, after a suitable matching primitive is selected, the image features corresponding to those selected in the reference image are searched for in the image to be matched under matching constraint criteria, the matching constraint criteria comprising:
(1) Uniqueness constraint criterion: each matching primitive in the reference image corresponds to at most one matching primitive in the image to be matched; in the presence of occlusion there may be no corresponding point;
(2) Epipolar constraint criterion: according to the principle of epipolar geometry, the match of a point in the reference image is confined to its conjugate epipolar line in the image to be matched, so that in a rectified pair the matching search need only proceed along the horizontal scanline of the other image;
(3) Similarity constraint criterion: mutually matched points are projections of the same spatial point, so the points and their respective neighborhoods are similar in contour, gray level and gray-level gradient;
(4) Continuity constraint criterion: the matched parallaxes should vary smoothly, except in occlusion areas and at parallax discontinuities caused by boundaries;
(5) Ordering consistency constraint criterion: the sequence of points on an epipolar line of the reference image and the corresponding sequence of points in the image to be matched have the same ordering;
(6) Mutual correspondence constraint criterion: matching is mutual; if the left image is first taken as the reference image and the right image is searched for the matching primitive p_r corresponding to the primitive p_l of the left image, the correspondence p_l → p_r is established; if the right image is then taken as the reference image and the left image is searched for the matching primitive p_l corresponding to p_r, the correspondence p_r → p_l should be obtained; that is, the correspondence is bidirectional and mutual; if a matching correspondence is not bidirectional, it is deemed unreliable.
4. The stereo matching and flame space positioning method based on edge features according to claim 3, wherein the similarity measure describes the degree of similarity between the matching primitives of an image pair: a function describing the difference between the matched primitives is constructed, and when this difference function attains its minimum the difference between the two primitives is smallest, i.e. the similarity is best; let I_l(x, y) and I_r(x, y) be the image pair to be matched acquired by the system, (x_l, y_l) a point to be matched in the reference image, the template W an m×n neighborhood centered on that point, and d the parallax; the similarity measure functions comprise:
(1) Sum of absolute differences (SAD) of pixel gray levels:
C_SAD(x_l, y_l, d) = Σ_{(u,v)∈W} | I_l(x_l+u, y_l+v) − I_r(x_l+u+d, y_l+v) |
(2) Sum of squared differences (SSD) of pixel gray levels:
C_SSD(x_l, y_l, d) = Σ_{(u,v)∈W} [ I_l(x_l+u, y_l+v) − I_r(x_l+u+d, y_l+v) ]²
(3) Zero-mean sum of absolute differences (ZSAD) of pixel gray levels: as SAD, with the mean gray value of each window first subtracted from that window;
(4) Normalized cross-correlation (NCC):
C_NCC(x_l, y_l, d) = Σ_{(u,v)∈W} (I_l − Ī_l)(I_r − Ī_r) / √( Σ_{(u,v)∈W} (I_l − Ī_l)² · Σ_{(u,v)∈W} (I_r − Ī_r)² )
where Ī_l and Ī_r are the mean gray levels of the two windows.
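For illustration only (not part of the claims): a minimal Python sketch of the four measures acting on two already-extracted gray-level windows Wl and Wr of equal size; the helper casts to float so that unsigned-integer image windows do not overflow.

```python
import numpy as np

def _f(w):
    return np.asarray(w, dtype=np.float64)

def sad(Wl, Wr):                      # (1) sum of absolute differences
    return np.abs(_f(Wl) - _f(Wr)).sum()

def ssd(Wl, Wr):                      # (2) sum of squared differences
    return ((_f(Wl) - _f(Wr)) ** 2).sum()

def zsad(Wl, Wr):                     # (3) zero-mean sum of absolute differences
    a, c = _f(Wl), _f(Wr)
    return np.abs((a - a.mean()) - (c - c.mean())).sum()

def ncc(Wl, Wr):                      # (4) normalized cross-correlation
    a, c = _f(Wl), _f(Wr)
    a, c = a - a.mean(), c - c.mean()
    return (a * c).sum() / np.sqrt((a ** 2).sum() * (c ** 2).sum())
```

SAD, SSD and ZSAD are difference measures (smaller is better), while NCC is a correlation measure (closer to 1 is better).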
5. The stereo matching and flame space positioning method based on edge features according to claim 4, wherein the search strategy for stereo matching comprises:
(1) Global optimum search strategy: under multiple constraint criteria, the whole image is searched with minimization of the difference function as the objective; global optimal matching strategies include the dynamic programming method, which applies a global optimal matching strategy to the point sets of the reference image and the image to be matched along two corresponding epipolar lines to find the path of minimum matching difference, and the relaxation method, which, on the basis of the similarity constraint criterion, computes the matching probability of each point and then adjusts the probabilities with a nonlinear optimization algorithm under multiple constraints to find the optimal matching correspondence;
(2) Hierarchical matching search strategy: the image is sampled at multiple levels to produce a series of image samples of different resolutions, which are processed layer by layer according to resolution; the matching search is performed first on the low-resolution layer, yielding, owing to the resolution, a relatively coarse matching position; that position is then used as a template to locate the corresponding window in the next layer, where the matching search continues within the window to obtain a finer matching position, and so on until the corresponding matching position is found in the original image pair.
6. The stereo matching and flame space positioning method based on edge features according to claim 1, wherein in step S2 the pixel points on the flame edge are used as matching primitives and double-threshold discrimination is used to screen out the best matching points, candidate matching points and unmatched points; the best matches are regarded as correct matches, and the candidate matches are then screened further, the steps comprising:
let I^1 and I^2 be the two images of the image pair to be matched, their sets of flame edge feature points denoted {I_i^1} and {I_j^2} with corresponding gray values g_i^1 and g_j^2; the coordinates of the edge feature points in the image pair are written (x_1, y_1) and (x_2, y_2), and the neighborhood W is defined as a square window of size (2n+1)×(2n+1) centered on the feature point; the specific steps of searching image I^2 for the best match correspondence (I_i^1, I_j^2) of a feature point of image I^1 are as follows:
(1) Determining a search range and calculating the similarity degree of each group of possible matches;
(2) Carrying out double-threshold judgment on each group of possible matches, and screening the best matching point and the candidate matching point;
(3) Mutual correspondence check: the above steps are applied likewise to the feature point I_j^2 of image I^2, obtaining its set L^2(I_j^2) of best and candidate matching points; by the mutual correspondence constraint criterion, if I_i^1 and I_j^2 are each other's best match, the matching relation (I_i^1, I_j^2) is taken as a best match; the output is a number of best matches together with the candidate matching point sets L^1(I_i^1) and L^2(I_j^2).
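For illustration only (not part of the claims): a sketch of the double-threshold screening for one left-image edge point; the threshold values t_best and t_cand are hypothetical, as the claim does not fix them numerically.

```python
import numpy as np

def screen(scores, t_best=0.9, t_cand=0.7):
    """scores: NCC similarity of one left edge point against every right-image
    candidate on the same scanline. Returns (kind, candidate indices)."""
    order = np.argsort(scores)[::-1]          # candidates, best score first
    s1 = scores[order[0]]
    s2 = scores[order[1]] if len(scores) > 1 else -1.0
    if s1 >= t_best and s2 < t_cand:          # one clear winner: best match
        return "best", [int(order[0])]
    cands = [int(i) for i in order if scores[i] >= t_cand]
    return ("candidate", cands) if cands else ("none", [])
```

The mutual check then repeats the screening with the right image as reference and keeps a pair (I_i^1, I_j^2) as a best match only when each point is the other's winner.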
7. The stereo matching and flame space positioning method based on edge features according to claim 1, wherein the step S3 of further searching for best matches using the parallax compatibility principle and eliminating mismatches comprises: searching the candidate matching sets further using the parallax compatibility principle; an evaluation function is defined over the matches in a candidate set, the value of which reflects the degree of compatibility between the pairs of matches within a neighborhood, so that the parallax-compatible best match can be obtained with this function; the evaluation function has the following form:
wherein d_ij and d_mn are the parallaxes corresponding to the matches (I_i^1, I_j^2) and (I_m^1, I_n^2) respectively, and a and b are positive constants with a + b = 1;
if a pair of candidate matching points satisfies:
then the match is considered to be the best match;
the above two steps are applied to all candidate matching points of the two images, removing from the candidate sets all candidates resolved by a unique best match; the process is repeated until the number of best matches no longer changes or the number of iterations reaches a preset maximum; at that point, all best matches and their integer-level parallaxes are output.
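The evaluation function itself is not reproduced above, so the sketch below is only a hedged surrogate for the idea: a candidate match scores higher the more its parallax agrees with the parallaxes of already-accepted matches in its neighborhood, and the highest-scoring candidate in each set is promoted iteratively.

```python
import numpy as np

def compatibility(d_ij, neighbour_disps, a=0.5, b=0.5):
    """Surrogate compatibility score of a candidate with parallax d_ij against
    the parallaxes d_mn of accepted matches in its neighbourhood (a + b = 1)."""
    if len(neighbour_disps) == 0:
        return 0.0
    return float(np.mean([a / (b + abs(d_ij - d_mn)) for d_mn in neighbour_disps]))

# Iteration skeleton: promote, in every candidate set, the candidate with the
# highest compatibility, then recompute, until the set of best matches stops
# changing or a preset maximum number of iterations is reached.
```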
8. The stereo matching and flame space positioning method based on edge features according to claim 1, wherein the step of obtaining sub-pixel parallax by binary search after the unique matches are output comprises: defining the normalized correlation coefficient between a reference window W_1(x) and a matching window W_2(x+d_ij):
when d_ij is not an integer, the floating-point gray values W_2(x+d_ij) of the matching window are computed by linear interpolation:
W_2(x+d_ij) = (1 − (d_ij − [d_ij]))·W_2(x + [d_ij]) + (d_ij − [d_ij])·W_2(x + [d_ij] + 1)   (14)
where [d_ij] denotes the greatest integer not exceeding d_ij;
the update formula of the binary search is given by equation (15), wherein A is the iteration number.
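For illustration only (not part of the claims): a sketch of the sub-pixel refinement, halving the search step around the integer parallax and keeping the shift with the larger normalized correlation; it reuses the ncc function sketched after claim 4, and row_l, row_r are corresponding scanlines with xs the integer column indices of the reference window.

```python
import numpy as np

def frac_window(row, xs, d):
    """Window gray values shifted by a fractional parallax d, using the
    linear interpolation of equation (14)."""
    lo = int(np.floor(d))
    t = d - lo
    return (1.0 - t) * row[xs + lo].astype(np.float64) \
         + t * row[xs + lo + 1].astype(np.float64)

def subpixel_parallax(row_l, row_r, xs, d0, iters=6):
    """Refine the integer parallax d0 by binary search on the correlation."""
    W1 = row_l[xs].astype(np.float64)
    d, step = float(d0), 0.5
    for _ in range(iters):
        cands = (d - step, d, d + step)
        d = max(cands, key=lambda dd: ncc(W1, frac_window(row_r, xs, dd)))
        step *= 0.5
    return d
```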
9. The stereo matching and flame space positioning method based on edge features according to claim 1, further comprising verifying the effect of the stereo matching through a positioning error model; the position of any point in space, expressed in terms of the distance D from the left camera, the horizontal deflection angle α and the vertical deflection angle β, is given by equation (22); substituting equation (20) into equation (22) gives:
establishing an error model for the distance D obtained by the system;
first, differentiating D with respect to the parallax d gives:
where the minus sign only indicates direction; according to equation (24), a positioning error model can be established:
wherein b is the baseline distance; x_l, y_l are the coordinate values of the target point in the left image; f is the camera focal length; ΔD = |D − D_0| is the absolute error, D being the distance measured by the positioning system and D_0 the actually measured distance; ΔD/D is the relative error; Δd = |d − d_0| is the matching error, d being the parallax computed by the stereo matching algorithm and d_0 the theoretical parallax; since the coordinate values x_l, y_l of the target point in the left image are far smaller than the camera focal length f, the above error model can be simplified to the following equations (25) and (26):
10. a stereo matching and flame space localization system based on edge features, characterized in that it implements the method according to any one of claims 1-9, comprising:
a stereo matching module, configured to perform stereo matching of the flame based on the edge features; and
a spatial positioning module, configured to perform flame spatial positioning based on the stereo matching of the flame.
CN202211412606.7A 2022-11-11 2022-11-11 Three-dimensional matching and flame space positioning method based on edge characteristics Active CN116665034B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211412606.7A CN116665034B (en) 2022-11-11 2022-11-11 Three-dimensional matching and flame space positioning method based on edge characteristics

Publications (2)

Publication Number Publication Date
CN116665034A (en) 2023-08-29
CN116665034B (en) 2023-10-31





Legal Events

PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant