CN103065323A - Subsection space aligning method based on homography transformational matrix - Google Patents


Info

Publication number
CN103065323A
CN103065323A CN2013100130458A CN201310013045A
Authority
CN
China
Prior art keywords
millimetre-wave radar
coordinate system
distance
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN2013100130458A
Other languages
Chinese (zh)
Other versions
CN103065323B (en)
Inventor
付梦印
靳璐
杨毅
宗民
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Institute of Technology BIT
Original Assignee
Beijing Institute of Technology BIT
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Institute of Technology BIT filed Critical Beijing Institute of Technology BIT
Priority to CN201310013045.8A priority Critical patent/CN103065323B/en
Publication of CN103065323A publication Critical patent/CN103065323A/en
Application granted granted Critical
Publication of CN103065323B publication Critical patent/CN103065323B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Radar Systems Or Details Thereof (AREA)

Abstract

The invention discloses a subsection (segmented) space alignment method based on homography transformation matrices. The large calibration distance is divided into segments, and a homography transformation matrix between the camera coordinate system and the millimeter-wave radar coordinate system is obtained for each segment. This avoids the error introduced in the prior art by representing the coordinate relation of the two sensors over the whole range with a single homography transformation matrix, so that spatial alignment for target detection over a large calibration distance can be achieved. The relations between the different coordinate systems of the camera and the millimeter-wave radar are derived, and finally the relation between the two is represented by a single homography transformation matrix N. Target data acquired by the two sensors are used to solve N directly, which avoids estimating the camera intrinsic parameter matrix (composed of the scale factor, focal length, etc.) and the extrinsic parameter matrix (composed of the rotation matrix and translation vector). The computation is therefore greatly simplified and computing time is saved.

Description

Segmented space alignment method based on a homography transformation matrix
Technical field
The present invention relates to the field of multi-sensor information fusion for unmanned ground vehicles, and in particular to a segmented space alignment method based on homography transformation matrices.
Background art
An unmanned ground vehicle, also called an outdoor intelligent autonomous mobile robot, is a highly intelligent multi-function device integrating environment perception, dynamic decision and planning, and behavior control and execution. The speed and accuracy of its environment perception are inseparable from multi-sensor information fusion. In multi-sensor information fusion, a computer makes full use of all sensor resources: by reasonably managing and using the various measurements, it combines their redundant and complementary information in space and time according to an optimality criterion, producing a consistent interpretation or description of the observed environment together with new fused results. In the environment perception module, vision sensors and millimeter-wave radar are two commonly used sensors. A vision sensor has a wide sensing range and can obtain the size and contour of targets in the external environment, but it is affected by external factors and suffers from target-loss problems. Millimeter-wave radar has high resolution and strong anti-interference capability and can accurately obtain the distance, relative velocity and azimuth of targets under all weather conditions, but it cannot recognize the shape and size of a target. Fusing the information of the two sensors by exploiting this complementarity therefore yields more comprehensive and reliable environment information, and spatial alignment is the prerequisite of such fusion. In essence, spatial alignment estimates the transformation matrix between the camera and radar coordinate systems. In the traditional spatial alignment method, points on a target are probed at random at different range points within a calibration distance of 20 meters; the expressions of the target in the camera coordinate system and in the radar coordinate system are obtained, and from the data acquired by the two sensors the camera intrinsic parameter matrix (composed of the scale factor, focal length, etc.) and the extrinsic parameter matrix (composed of the rotation matrix and translation vector) are estimated. This computation is cumbersome and easily introduces errors. Moreover, when the transformation matrix is solved by the above algorithm for targets beyond the 20-meter calibration distance, the larger range produces huge errors and the spatial alignment fails.
Summary of the invention
In view of this, the present invention provides a segmented space alignment method based on homography transformation matrices, which realizes the spatial alignment of the camera and the millimeter-wave radar mounted on an unmanned vehicle over a large calibration distance range, while also simplifying the computation of the homography transformation matrices.
The segmented space alignment method based on homography transformation matrices of the present invention comprises the following steps:
Step 1: establish the homography-based relation between the camera coordinate system and the millimeter-wave radar coordinate system:
Define the image coordinate system O′uv of the camera, where O′ is located at the upper-left corner of the camera imaging plane, the u axis is parallel to the camera scan-line direction, and the v axis is perpendicular to the scan-line direction;
Define O″″ρθ as the millimeter-wave radar polar coordinate system, where O″″ is the center of the radar surface, ρ is the straight-line distance between the target and the radar, and θ is the angle by which the target deviates from the center line of the radar scanning plane; the relation between the camera image coordinate system O′uv and the radar polar coordinate system O″″ρθ is then expressed as:

$$\begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = N \begin{bmatrix} \rho\sin\theta \\ \rho\cos\theta \\ 1 \end{bmatrix} \quad (7)$$

where $N = \begin{bmatrix} n_{11} & n_{12} & n_{13} \\ n_{21} & n_{22} & n_{23} \\ n_{31} & n_{32} & n_{33} \end{bmatrix}$ is defined as the homography transformation matrix;
Step 2: determine a suitable calibration distance between the unmanned vehicle and the calibration target:
Define O″″X_rY_rZ_r as the millimeter-wave radar rectangular coordinate system, where O″″ is the center of the radar surface; the Y_r axis is the center line of the radar scanning plane, perpendicular to the radar surface and pointing straight ahead; the X_r axis is perpendicular to Y_r and points to the right; the Z_r axis is perpendicular to the plane determined by X_r and Y_r and points upward;
The relation between the radar rectangular coordinate system and the radar polar coordinate system is then:

$$\begin{bmatrix} X_r \\ Y_r \\ 1 \end{bmatrix} = \begin{bmatrix} \rho\sin\theta \\ \rho\cos\theta \\ 1 \end{bmatrix} \quad (7)'$$

The projection of the distance between the calibration target and the unmanned vehicle onto the longitudinal axis Y_r of the radar rectangular coordinate system is called the calibration distance. Within the detection range of the radar, a suitable calibration distance L is determined according to the maximum speed of the unmanned vehicle. The calibration distance L is divided, from near to far, into a short range L1 and a long range L2; L1 is divided into m1 segments and L2 into m2 segments, ensuring that L1/m1 is less than L2/m2;
Step 3: acquire images and data of the calibration target with the camera and the millimeter-wave radar mounted on the unmanned vehicle:
The calibration target is placed in turn at each of the segments into which the calibration distance L was divided in step 2, and the millimeter-wave radar and the camera detect the target at each of the m1+m2 segment distances. For the target at each segment distance, the target region is divided into m rows along the Y_r axis, and each row is divided into h cells along the X_r axis; the radar is controlled to obtain the coordinate data $(X_{rk}^M, Y_{rk}^M)$ of each cell, and the camera is controlled to capture the image data $f_k^M$ of each cell, where M = 1, …, (m1+m2) and k = 1, 2, …, mh;
Step 4: for the image data $f_k^M$ of each cell in each segment acquired by the camera in step 3, compute the centroid coordinates $(u_k^M, v_k^M)$ of each image;
Step 5: solve the homography spatial transformation matrices expressing the relation between the millimeter-wave radar coordinate system and the camera coordinate system:
For each of the segment distances into which the whole calibration distance L is divided, the radar coordinate data $(X_{rk}^M, Y_{rk}^M)$ of all cells and the corresponding camera image data $(u_k^M, v_k^M)$ form the data set of that segment. Substituting each data set into formulas (7) and (7)' gives:

$$\begin{bmatrix} u_1^M \\ \vdots \\ u_k^M \end{bmatrix} = \begin{bmatrix} X_{r1}^M & Y_{r1}^M & 1 \\ \vdots & \vdots & \vdots \\ X_{rk}^M & Y_{rk}^M & 1 \end{bmatrix} \begin{bmatrix} n_{11}^M \\ n_{12}^M \\ n_{13}^M \end{bmatrix} \quad (8)$$

$$\begin{bmatrix} v_1^M \\ \vdots \\ v_k^M \end{bmatrix} = \begin{bmatrix} X_{r1}^M & Y_{r1}^M & 1 \\ \vdots & \vdots & \vdots \\ X_{rk}^M & Y_{rk}^M & 1 \end{bmatrix} \begin{bmatrix} n_{21}^M \\ n_{22}^M \\ n_{23}^M \end{bmatrix} \quad (9)$$

and

$$\begin{bmatrix} 1 \\ \vdots \\ 1 \end{bmatrix} = \begin{bmatrix} X_{r1}^M & Y_{r1}^M & 1 \\ \vdots & \vdots & \vdots \\ X_{rk}^M & Y_{rk}^M & 1 \end{bmatrix} \begin{bmatrix} n_{31}^M \\ n_{32}^M \\ n_{33}^M \end{bmatrix} \quad (10)$$

Define $P^M = \begin{bmatrix} X_{r1}^M & Y_{r1}^M & 1 \\ \vdots & \vdots & \vdots \\ X_{rk}^M & Y_{rk}^M & 1 \end{bmatrix}$, $N^M = \begin{bmatrix} n_{11}^M & n_{12}^M & n_{13}^M & n_{21}^M & n_{22}^M & n_{23}^M & n_{31}^M & n_{32}^M & n_{33}^M \end{bmatrix}^T$, $U^M = \begin{bmatrix} u_1^M & \cdots & u_k^M \end{bmatrix}^T$, $V^M = \begin{bmatrix} v_1^M & \cdots & v_k^M \end{bmatrix}^T$ and $I_{k\times 1} = \begin{bmatrix} 1 & \cdots & 1 \end{bmatrix}^T$. The least-squares solution of the homography spatial transformation matrix $N^M$ can then be expressed as $N^M = \begin{bmatrix} {N_1^M}^T & {N_2^M}^T & {N_3^M}^T \end{bmatrix}^T$, where $N_1^M$, $N_2^M$ and $N_3^M$ are respectively:

$$N_1^M = ({P^M}^T P^M)^{-1} {P^M}^T U^M, \quad N_2^M = ({P^M}^T P^M)^{-1} {P^M}^T V^M, \quad N_3^M = ({P^M}^T P^M)^{-1} {P^M}^T I_{k\times 1};$$
Step 6: realize the spatial alignment of the vision sensor and the millimeter-wave radar:
According to the actual distance of the calibration target scanned by the millimeter-wave radar, determine which of the segments of step 2 the distance falls in, look up the corresponding homography spatial transformation matrix among the m1+m2 results computed in step 5, and perform the spatial alignment.
In step 2, when the calibration distance is 50 meters, 0–20 m is the short range and is divided into 4 segments, and 20–50 m is the long range and is divided into 3 segments.
The method for computing the centroid coordinates of the image of each cell of the calibration target in step 4 is as follows:
S40: manually select a candidate region containing the calibration target;
S41: apply median filtering to the candidate-region image to remove noise;
S42: apply Sobel edge detection to the candidate-region image to obtain a binarized edge image of the calibration target;
S43: in the image coordinate system in pixels, find the pixel coordinates $u_{\min}$ and $u_{\max}$ of the points with the minimum and maximum coordinates along the u axis in the edge image of the calibration target, and the pixel coordinates $v_{\min}$ and $v_{\max}$ of the points with the minimum and maximum coordinates along the v axis; connect these four points with straight lines, clockwise or counterclockwise, to form a quadrilateral region, and within this region compute the centroid coordinates $(u_k^M, v_k^M)$ of the calibration target by

$$u_k^M = \frac{\sum_i \sum_j i\, f_k^M(i,j)}{\sum_i \sum_j f_k^M(i,j)} \qquad \text{and} \qquad v_k^M = \frac{\sum_i \sum_j j\, f_k^M(i,j)}{\sum_i \sum_j f_k^M(i,j)},$$

where $f_k^M(i,j)$ denotes the gray value of pixel $(i,j)$ in the quadrilateral region corresponding to the k-th cell of the target at the M-th segment distance.
The present invention has the following beneficial effects:
1) By dividing the large calibration distance into segments and solving the homography transformation matrix between the camera and millimeter-wave radar coordinate systems separately for each segment, the error caused in the prior art by expressing the coordinate relation of the two sensors with a single homography transformation matrix is avoided, so spatial alignment for target detection over a large calibration distance can be realized;
2) By deriving the relations between the different coordinate systems of the camera and the millimeter-wave radar, finally characterizing them with a single homography transformation matrix N, and solving N from the target data acquired by the two sensors, the invention avoids estimating the camera intrinsic parameter matrix (composed of the scale factor, focal length, etc.) and the extrinsic parameter matrix (composed of the rotation matrix and translation vector), which greatly simplifies the computation and saves computing time;
3) Because the sensors of the unmanned vehicle pay different degrees of attention to near and distant targets, different segmentation finenesses are used for the short and long ranges, which reduces the amount of computation while guaranteeing the spatial alignment;
4) In the image coordinate system, after the edge image of the target is obtained, the pixels with the maximum and minimum values along the two axes are found, these four pixels are joined into a quadrilateral, and the centroid of this quadrilateral region is computed. This method determines the centroid of each target edge image quickly, which simplifies the computation and also makes the spatial alignment more accurate.
Description of drawings
Fig. 1 is a schematic diagram of the camera pinhole model;
Fig. 2 is a schematic diagram of the millimeter-wave radar coordinate systems;
Fig. 3 is a schematic diagram of the mapping relation between the image coordinate system and the millimeter-wave radar rectangular coordinate system.
Embodiment
The present invention is described below with reference to the accompanying drawings and an embodiment.
The invention provides a segmented space alignment method based on homography transformation matrices, comprising the following steps:
Step 1: establish the homography-based relation between the camera coordinate system and the millimeter-wave radar coordinate system:
As shown in Fig. 1, OX_cY_cZ_c denotes the camera coordinate system, whose origin O is located at the optical center of the camera; the X_c axis is parallel to the camera scan-line direction and points in the direction of increasing scan elements; the Y_c axis is perpendicular to the scan-line direction and points in the direction of increasing scan lines; the Z_c axis is perpendicular to the imaging plane and points along the camera viewing direction. O′uv denotes the image coordinate system in pixels, with O′ at the upper-left corner of the imaging plane; the u axis is parallel to X_c and the v axis is parallel to Y_c. O″xy denotes the image coordinate system in millimeters, with O″ at the principal point of the imaging plane; the x axis is parallel to X_c and the y axis is parallel to Y_c. f is the focal length of the camera and I denotes the imaging plane. Suppose a point P has coordinates $(X_c, Y_c, Z_c)$, $(x, y)$ and $(u, v)$ in the coordinate systems OX_cY_cZ_c, O″xy and O′uv respectively. The geometric proportion between OX_cY_cZ_c and O″xy gives

$$x = \frac{f X_c}{Z_c}, \qquad y = \frac{f Y_c}{Z_c}$$

Expressed in homogeneous form, this relation becomes:

$$Z_c \begin{bmatrix} x \\ y \\ 1 \end{bmatrix} = \begin{bmatrix} f & 0 & 0 & 0 \\ 0 & f & 0 & 0 \\ 0 & 0 & 1 & 0 \end{bmatrix} \begin{bmatrix} X_c \\ Y_c \\ Z_c \\ 1 \end{bmatrix} \quad (1)$$

The zoom-and-translation relation of point P between O″xy and O′uv is $x = S(u - u_0)$, $y = S(v - v_0)$, which in homogeneous form is:

$$\begin{bmatrix} x \\ y \\ 1 \end{bmatrix} = S \begin{bmatrix} 1 & 0 & -u_0 \\ 0 & 1 & -v_0 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} u \\ v \\ 1 \end{bmatrix} \quad (2)$$

where S is the scale factor and $(u_0, v_0)$ are the coordinates of the origin O″ of O″xy in the coordinate system O′uv. Suppose a world coordinate system O‴X_wY_wZ_w in which point P has coordinates $(X_w, Y_w, Z_w)$; the relation between the two coordinate systems in homogeneous form is:

$$\begin{bmatrix} X_c \\ Y_c \\ Z_c \\ 1 \end{bmatrix} = \begin{bmatrix} R & T \\ 0 & 1 \end{bmatrix} \begin{bmatrix} X_w \\ Y_w \\ Z_w \\ 1 \end{bmatrix} \quad (3)$$

where R and T denote the rotation matrix and translation vector respectively. Combining formulas (1), (2) and (3) gives:

$$\begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = \frac{1}{\beta} \begin{bmatrix} 1 & 0 & u_0 \\ 0 & 1 & v_0 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} f & 0 & 0 & 0 \\ 0 & f & 0 & 0 \\ 0 & 0 & 1 & 0 \end{bmatrix} \begin{bmatrix} R & T \\ 0 & 1 \end{bmatrix} \begin{bmatrix} X_w \\ Y_w \\ Z_w \\ 1 \end{bmatrix} \quad (4)$$

where $\beta = Z_c S$.
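The chain of formulas (1)–(4) can be checked numerically. The following Python/NumPy sketch is an editor's illustration, not part of the patent: the focal length f, scale factor S, principal point (u0, v0), rotation R and translation T are all made-up example values, and `project` simply composes formulas (3), (1) and the inverse of (2) to map a world point to pixel coordinates.

```python
import numpy as np

# Illustrative (assumed) camera parameters, not values from the patent.
f, S = 0.008, 1e-5            # focal length [m], pixel scale [m/pixel]
u0, v0 = 320.0, 240.0         # principal point [pixels]
R = np.eye(3)                 # rotation: world axes aligned with camera axes
T = np.array([0.0, 0.0, 0.0]) # translation: coincident origins

def project(Pw):
    """Map a world point (X_w, Y_w, Z_w) to pixel coordinates per formula (4)."""
    Pc = R @ Pw + T                              # formula (3): world -> camera
    x, y = f * Pc[0] / Pc[2], f * Pc[1] / Pc[2]  # formula (1): perspective projection
    return x / S + u0, y / S + v0                # formula (2) inverted: mm -> pixels

print(project(np.array([1.0, 0.5, 10.0])))       # e.g. (400.0, 280.0)
```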
As shown in Fig. 2, O″″X_rY_rZ_r denotes the millimeter-wave radar rectangular coordinate system, where O″″ is the center of the radar surface; the Y_r axis is the center line of the radar scanning plane, perpendicular to the radar surface and pointing straight ahead; the X_r axis is perpendicular to Y_r and points to the right; the Z_r axis is perpendicular to the plane determined by X_r and Y_r and points upward. O″″ρθ denotes the radar polar coordinate system, whose origin coincides with that of O″″X_rY_rZ_r; ρ is the straight-line distance between the target and the radar; θ is the angle by which the target deviates from the center line of the radar scanning plane. A point P has coordinates (ρ, θ) and $(X_r, Y_r, Z_r)$ in O″″ρθ and O″″X_rY_rZ_r respectively. Since the scanning plane of the radar is a two-dimensional plane in the coordinate system O″″ρθ, $Z_r = 0$, and the trigonometric relation of P between O″″ρθ and O″″X_rY_rZ_r is expressed as:

$$\begin{bmatrix} X_r \\ Y_r \\ Z_r \end{bmatrix} = \begin{bmatrix} \rho\sin\theta \\ \rho\cos\theta \\ 0 \end{bmatrix} \quad (5)$$

Taking the radar rectangular coordinate system as the world coordinate system, and combining formulas (4) and (5) with the world coordinates as intermediate variables, the relation between the image coordinate system O′uv and the millimeter-wave radar polar coordinate system O″″ρθ is expressed as:

$$\begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = \frac{1}{\beta} \begin{bmatrix} 1 & 0 & u_0 \\ 0 & 1 & v_0 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} f & 0 & 0 & 0 \\ 0 & f & 0 & 0 \\ 0 & 0 & 1 & 0 \end{bmatrix} \begin{bmatrix} R & T \\ 0 & 1 \end{bmatrix} \begin{bmatrix} \rho\sin\theta \\ \rho\cos\theta \\ 0 \\ 1 \end{bmatrix} \quad (6)$$

As shown in Fig. 3, when the camera and the millimeter-wave radar observe the same target, the target points scanned by the radar can be projected into the image acquired by the camera through formula (6). This, however, requires estimating camera intrinsic parameters such as the scale factor S and the focal length f, as well as extrinsic parameters such as the rotation matrix R and the translation vector T, and the computation is complex. To simplify it, an equivalent transformation N is used to represent the relation between O′uv and O″″ρθ:

$$\begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = N \begin{bmatrix} \rho\sin\theta \\ \rho\cos\theta \\ 1 \end{bmatrix} \quad (7)$$

where $N = \begin{bmatrix} n_{11} & n_{12} & n_{13} \\ n_{21} & n_{22} & n_{23} \\ n_{31} & n_{32} & n_{33} \end{bmatrix}$ is called the homography transformation matrix.
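Once N is known, formula (7) replaces the whole parameter chain of formula (6). A minimal sketch (with an assumed, uncalibrated example matrix N) of mapping a radar measurement (ρ, θ) to pixel coordinates:

```python
import numpy as np

N = np.array([[50.0, 0.0, 320.0],   # assumed example homography,
              [ 0.0, 5.0, 400.0],   # not a calibrated matrix
              [ 0.0, 0.0,   1.0]])

def radar_to_pixel(rho, theta):
    """Apply formula (7): [u, v, 1]^T = N [rho*sin(theta), rho*cos(theta), 1]^T."""
    p = N @ np.array([rho * np.sin(theta), rho * np.cos(theta), 1.0])
    return p[0] / p[2], p[1] / p[2]  # dehomogenize for generality

u, v = radar_to_pixel(rho=15.0, theta=np.deg2rad(3.0))
```

The division by the third component is the usual dehomogenization; under constraint (10) it equals 1 for calibrated data.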
Step 2: determine a suitable calibration distance and divide it reasonably into segments:
The distance ρ of the calibration target measured in the millimeter-wave radar polar coordinate system is called the calibration distance. The calibration distance L is divided, from near to far, into a short range L1 and a long range L2; L1 is divided into m1 segments and L2 into m2 segments. Because the unmanned vehicle must detect nearby targets accurately during driving while distant targets only need to be judged roughly, the long-range segments may be longer than the short-range segments, i.e. L1/m1 is less than L2/m2;
In the present embodiment the maximum speed of the unmanned vehicle is specified as 36 km/h. From the known relation between the distance region of interest during driving and the vehicle speed, 50 m is determined to be a suitable calibration distance. Experiments show that within the 50 m calibration distance a single homography spatial transformation matrix causes the alignment to fail; therefore, according to the speed requirement of the unmanned vehicle and the distance regions of interest, the calibration distance from 0 to 20 m is divided into segments of 5 m each (4 segments in total), and from 20 to 50 m into segments of 10 m each (3 segments in total).
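For illustration, the 4 + 3 = 7 segments of this embodiment can be encoded as a list of upper bounds, with the segment index M recovered by binary search. This is a sketch under the embodiment's 5 m / 10 m split; the function name is illustrative.

```python
import bisect

# Upper bounds [m] of the 7 segments: 0-20 m in 5 m steps, 20-50 m in 10 m steps.
SEGMENT_UPPER_BOUNDS = [5, 10, 15, 20, 30, 40, 50]

def segment_index(distance_m):
    """Return the segment index M in 0..6 for a calibration distance in 0-50 m."""
    if not 0 <= distance_m <= 50:
        raise ValueError("outside the 50 m calibration distance")
    return min(bisect.bisect_left(SEGMENT_UPPER_BOUNDS, distance_m), 6)
```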
Step 3: acquire images and data of the calibration target with the camera and the millimeter-wave radar mounted on the unmanned vehicle:
To guarantee the accuracy of the spatial alignment, in each segment five rows are taken vertically, and in each row seven groups of corresponding image and data information of the calibration target are taken horizontally. The camera with an IEEE 1394 interface and the millimeter-wave radar with a CAN bus interface mounted on the unmanned vehicle acquire the images and data of the calibration target in the same scene and transmit them to an industrial computer.
Step 4: for the image data $f_k^M$ of each cell in each segment acquired by the camera in step 3, compute the centroid coordinates $(u_k^M, v_k^M)$ of each image;
In general, the clustering method adopted by the millimeter-wave radar is the nearest-neighbor method; the coordinate value scanned by the radar in the radar coordinate system is therefore taken to correspond to the centroid position of the calibration target in the image coordinate system. For a digital image f, f(i, j) denotes the gray value of point (i, j) of the image region. The centroid coordinates of the calibration target in the image are computed by the following steps:
S40: manually select a candidate region containing the calibration target, to reduce the interference of the background with the centroid computation of the target.
S41: apply median filtering to the candidate region to remove noise. The median filter is realized by the expression $f(i,j) = \mathrm{Median}\{f(i-k, j-l)\}$, where $(k, l) \in \omega$ and ω is a 3×3 neighborhood of the pixel.
S42: apply Sobel edge detection to the candidate-region image to obtain an edge image of the candidate region containing the calibration target. The Sobel operator uses the template $\mathrm{mask} = \begin{bmatrix} -1 & -2 & -1 \\ 0 & 0 & 0 \\ 1 & 2 & 1 \end{bmatrix}$; convolving the candidate-region image with the mask yields the binary image of the candidate region.
S43: in the image coordinate system in pixels, find the pixel coordinates $u_{\min}$ and $u_{\max}$ of the points with the minimum and maximum coordinates along the u axis in the edge image of the calibration target, and the pixel coordinates $v_{\min}$ and $v_{\max}$ of the points with the minimum and maximum coordinates along the v axis; connect these four points with straight lines, clockwise or counterclockwise, to form a quadrilateral region, and within this region compute the centroid coordinates $(u_k^M, v_k^M)$ of the calibration target by

$$u_k^M = \frac{\sum_i \sum_j i\, f_k^M(i,j)}{\sum_i \sum_j f_k^M(i,j)} \qquad \text{and} \qquad v_k^M = \frac{\sum_i \sum_j j\, f_k^M(i,j)}{\sum_i \sum_j f_k^M(i,j)},$$

where $f_k^M(i,j)$ denotes the gray value of pixel $(i,j)$ in the quadrilateral region corresponding to the k-th cell of the target at the M-th segment distance;
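Steps S40–S43 can be sketched with OpenCV and NumPy as below. This is an illustrative reconstruction rather than the patented implementation: the 3×3 median filter and the Sobel template follow S41–S42, the quadrilateral is built from the four extreme edge pixels of S43, and the (i, j)-to-(u, v) index convention (i = row, j = column) is an assumption.

```python
import cv2
import numpy as np

def target_centroid(region):
    """S40-S43 sketch: `region` is the manually chosen grayscale candidate
    region (uint8). Returns the centroid (u, v) of the calibration target."""
    den = cv2.medianBlur(region, 3)                  # S41: 3x3 median filter
    gy = cv2.Sobel(den, cv2.CV_64F, 0, 1, ksize=3)   # S42: [-1 -2 -1; 0 0 0; 1 2 1]
    edges = (np.abs(gy) > 0.5 * np.abs(gy).max()).astype(np.uint8)  # binarize

    vs, us = np.nonzero(edges)                       # edge pixel rows (v), cols (u)
    # S43: extreme points along u and v, joined into a quadrilateral
    pts = np.array([[us.min(), vs[us.argmin()]],     # leftmost  (u_min)
                    [us[vs.argmin()], vs.min()],     # topmost   (v_min)
                    [us.max(), vs[us.argmax()]],     # rightmost (u_max)
                    [us[vs.argmax()], vs.max()]],    # bottommost(v_max)
                   dtype=np.int32)
    mask = np.zeros_like(region)
    cv2.fillPoly(mask, [pts], 1)

    # Gray-value-weighted centroid inside the quadrilateral region
    g = den.astype(float) * mask
    jj, ii = np.meshgrid(np.arange(region.shape[1]), np.arange(region.shape[0]))
    return (jj * g).sum() / g.sum(), (ii * g).sum() / g.sum()
```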
Step 5: solve the homography spatial transformation matrices expressing the relation between the millimeter-wave radar coordinate system and the camera coordinate system:
For each of the segment distances into which the whole calibration distance L is divided, the radar coordinate data $(X_{rk}^M, Y_{rk}^M)$ of all cells and the corresponding camera image data $(u_k^M, v_k^M)$ form the data set of that segment. Substituting each data set into formulas (5) and (7) gives:

$$\begin{bmatrix} u_1^M \\ \vdots \\ u_k^M \end{bmatrix} = \begin{bmatrix} X_{r1}^M & Y_{r1}^M & 1 \\ \vdots & \vdots & \vdots \\ X_{rk}^M & Y_{rk}^M & 1 \end{bmatrix} \begin{bmatrix} n_{11}^M \\ n_{12}^M \\ n_{13}^M \end{bmatrix} \quad (8)$$

$$\begin{bmatrix} v_1^M \\ \vdots \\ v_k^M \end{bmatrix} = \begin{bmatrix} X_{r1}^M & Y_{r1}^M & 1 \\ \vdots & \vdots & \vdots \\ X_{rk}^M & Y_{rk}^M & 1 \end{bmatrix} \begin{bmatrix} n_{21}^M \\ n_{22}^M \\ n_{23}^M \end{bmatrix} \quad (9)$$

and

$$\begin{bmatrix} 1 \\ \vdots \\ 1 \end{bmatrix} = \begin{bmatrix} X_{r1}^M & Y_{r1}^M & 1 \\ \vdots & \vdots & \vdots \\ X_{rk}^M & Y_{rk}^M & 1 \end{bmatrix} \begin{bmatrix} n_{31}^M \\ n_{32}^M \\ n_{33}^M \end{bmatrix} \quad (10)$$

Define $P^M = \begin{bmatrix} X_{r1}^M & Y_{r1}^M & 1 \\ \vdots & \vdots & \vdots \\ X_{rk}^M & Y_{rk}^M & 1 \end{bmatrix}$, $N^M = \begin{bmatrix} n_{11}^M & n_{12}^M & n_{13}^M & n_{21}^M & n_{22}^M & n_{23}^M & n_{31}^M & n_{32}^M & n_{33}^M \end{bmatrix}^T$, $U^M = \begin{bmatrix} u_1^M & \cdots & u_k^M \end{bmatrix}^T$, $V^M = \begin{bmatrix} v_1^M & \cdots & v_k^M \end{bmatrix}^T$ and $I_{k\times 1} = \begin{bmatrix} 1 & \cdots & 1 \end{bmatrix}^T$. The least-squares solution of the homography spatial transformation matrix $N^M$ can then be expressed as $N^M = \begin{bmatrix} {N_1^M}^T & {N_2^M}^T & {N_3^M}^T \end{bmatrix}^T$, where $N_1^M$, $N_2^M$ and $N_3^M$ are respectively:

$$N_1^M = ({P^M}^T P^M)^{-1} {P^M}^T U^M, \quad N_2^M = ({P^M}^T P^M)^{-1} {P^M}^T V^M, \quad N_3^M = ({P^M}^T P^M)^{-1} {P^M}^T I_{k\times 1};$$
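A minimal NumPy sketch of the per-segment least-squares solve of formulas (8)–(10): `radar_xy` holds the k radar coordinates (X_r, Y_r) of one segment and `pixels_uv` the matching image centroids, and `np.linalg.lstsq` is used in place of the explicit normal equations (names and array layout are illustrative).

```python
import numpy as np

def solve_homography_segment(radar_xy, pixels_uv):
    """Least-squares estimate of the 3x3 homography N^M for one distance
    segment, from k radar points (X_r, Y_r) and matching centroids (u, v)."""
    X = np.asarray(radar_xy, dtype=float)    # shape (k, 2)
    uv = np.asarray(pixels_uv, dtype=float)  # shape (k, 2)
    k = X.shape[0]
    P = np.hstack([X, np.ones((k, 1))])      # P^M, shape (k, 3)
    # Rows of N solved independently: P n1 = U (8), P n2 = V (9), P n3 = 1 (10)
    n1, *_ = np.linalg.lstsq(P, uv[:, 0], rcond=None)
    n2, *_ = np.linalg.lstsq(P, uv[:, 1], rcond=None)
    n3, *_ = np.linalg.lstsq(P, np.ones(k), rcond=None)
    return np.vstack([n1, n2, n3])           # the 3x3 matrix N^M
```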
Step 6: realize the spatial alignment of the vision sensor and the millimeter-wave radar:
According to the actual distance of the calibration target scanned by the millimeter-wave radar, first determine which of the segments of step 2 the distance falls in, then perform the spatial alignment with the homography spatial transformation matrix $N^M$ (M = 1, 2, …, 7) of that segment computed in step 5.
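Combining the segment lookup with formula (7), the runtime alignment of step 6 can be sketched as follows; `segment_index` is the assumed helper sketched under step 2 above, and `matrices` is the list of the seven per-segment matrices N^M:

```python
import numpy as np

def align_detection(rho, theta, matrices, segment_index):
    """Step 6 sketch: choose the homography of the segment the target falls in,
    then project the radar point into the image via formula (7)."""
    M = segment_index(rho * np.cos(theta))  # calibration distance = Y_r projection (step 2)
    p = matrices[M] @ np.array([rho * np.sin(theta), rho * np.cos(theta), 1.0])
    return p[0] / p[2], p[1] / p[2]         # pixel coordinates (u, v)
```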
In summary, the above is only a preferred embodiment of the present invention and is not intended to limit its protection scope. Any modification, equivalent replacement, improvement, etc. made within the spirit and principles of the present invention shall fall within the protection scope of the present invention.

Claims (3)

1. A segmented space alignment method based on homography transformation matrices, characterized by comprising the following steps:
Step 1: establish the homography-based relation between the camera coordinate system and the millimeter-wave radar coordinate system:
Define the image coordinate system O′uv of the camera, where O′ is located at the upper-left corner of the camera imaging plane, the u axis is parallel to the camera scan-line direction, and the v axis is perpendicular to the scan-line direction;
Define O″″ρθ as the millimeter-wave radar polar coordinate system, where O″″ is the center of the radar surface, ρ is the straight-line distance between the target and the radar, and θ is the angle by which the target deviates from the center line of the radar scanning plane; the relation between the camera image coordinate system O′uv and the radar polar coordinate system O″″ρθ is then expressed as:

$$\begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = N \begin{bmatrix} \rho\sin\theta \\ \rho\cos\theta \\ 1 \end{bmatrix} \quad (7)$$

where $N = \begin{bmatrix} n_{11} & n_{12} & n_{13} \\ n_{21} & n_{22} & n_{23} \\ n_{31} & n_{32} & n_{33} \end{bmatrix}$ is defined as the homography transformation matrix;
Step 2: determine a suitable calibration distance between the unmanned vehicle and the calibration target:
Define O″″X_rY_rZ_r as the millimeter-wave radar rectangular coordinate system, where O″″ is the center of the radar surface; the Y_r axis is the center line of the radar scanning plane, perpendicular to the radar surface and pointing straight ahead; the X_r axis is perpendicular to Y_r and points to the right; the Z_r axis is perpendicular to the plane determined by X_r and Y_r and points upward;
The relation between the radar rectangular coordinate system and the radar polar coordinate system is then:

$$\begin{bmatrix} X_r \\ Y_r \\ 1 \end{bmatrix} = \begin{bmatrix} \rho\sin\theta \\ \rho\cos\theta \\ 1 \end{bmatrix} \quad (7)'$$

The projection of the distance between the calibration target and the unmanned vehicle onto the longitudinal axis Y_r of the radar rectangular coordinate system is called the calibration distance. Within the detection range of the radar, a suitable calibration distance L is determined according to the maximum speed of the unmanned vehicle. The calibration distance L is divided, from near to far, into a short range L1 and a long range L2; L1 is divided into m1 segments and L2 into m2 segments, ensuring that L1/m1 is less than L2/m2;
Step 3: acquire images and data of the calibration target with the camera and the millimeter-wave radar mounted on the unmanned vehicle:
The calibration target is placed in turn at each of the segments into which the calibration distance L was divided in step 2, and the millimeter-wave radar and the camera detect the target at each of the m1+m2 segment distances. For the target at each segment distance, the target region is divided into m rows along the Y_r axis, and each row is divided into h cells along the X_r axis; the radar is controlled to obtain the coordinate data $(X_{rk}^M, Y_{rk}^M)$ of each cell, and the camera is controlled to capture the image data $f_k^M$ of each cell, where M = 1, …, (m1+m2) and k = 1, 2, …, mh;
Step 4: for the image data $f_k^M$ of each cell in each segment acquired by the camera in step 3, compute the centroid coordinates $(u_k^M, v_k^M)$ of each image;
Step 5: solve the homography spatial transformation matrices expressing the relation between the millimeter-wave radar coordinate system and the camera coordinate system:
For each of the segment distances into which the whole calibration distance L is divided, the radar coordinate data $(X_{rk}^M, Y_{rk}^M)$ of all cells and the corresponding camera image data $(u_k^M, v_k^M)$ form the data set of that segment. Substituting each data set into formulas (7) and (7)' gives:

$$\begin{bmatrix} u_1^M \\ \vdots \\ u_k^M \end{bmatrix} = \begin{bmatrix} X_{r1}^M & Y_{r1}^M & 1 \\ \vdots & \vdots & \vdots \\ X_{rk}^M & Y_{rk}^M & 1 \end{bmatrix} \begin{bmatrix} n_{11}^M \\ n_{12}^M \\ n_{13}^M \end{bmatrix} \quad (8)$$

$$\begin{bmatrix} v_1^M \\ \vdots \\ v_k^M \end{bmatrix} = \begin{bmatrix} X_{r1}^M & Y_{r1}^M & 1 \\ \vdots & \vdots & \vdots \\ X_{rk}^M & Y_{rk}^M & 1 \end{bmatrix} \begin{bmatrix} n_{21}^M \\ n_{22}^M \\ n_{23}^M \end{bmatrix} \quad (9)$$

and

$$\begin{bmatrix} 1 \\ \vdots \\ 1 \end{bmatrix} = \begin{bmatrix} X_{r1}^M & Y_{r1}^M & 1 \\ \vdots & \vdots & \vdots \\ X_{rk}^M & Y_{rk}^M & 1 \end{bmatrix} \begin{bmatrix} n_{31}^M \\ n_{32}^M \\ n_{33}^M \end{bmatrix} \quad (10)$$

Define $P^M = \begin{bmatrix} X_{r1}^M & Y_{r1}^M & 1 \\ \vdots & \vdots & \vdots \\ X_{rk}^M & Y_{rk}^M & 1 \end{bmatrix}$, $N^M = \begin{bmatrix} n_{11}^M & n_{12}^M & n_{13}^M & n_{21}^M & n_{22}^M & n_{23}^M & n_{31}^M & n_{32}^M & n_{33}^M \end{bmatrix}^T$, $U^M = \begin{bmatrix} u_1^M & \cdots & u_k^M \end{bmatrix}^T$, $V^M = \begin{bmatrix} v_1^M & \cdots & v_k^M \end{bmatrix}^T$ and $I_{k\times 1} = \begin{bmatrix} 1 & \cdots & 1 \end{bmatrix}^T$. The least-squares solution of the homography spatial transformation matrix $N^M$ can then be expressed as $N^M = \begin{bmatrix} {N_1^M}^T & {N_2^M}^T & {N_3^M}^T \end{bmatrix}^T$, where $N_1^M$, $N_2^M$ and $N_3^M$ are respectively:

$$N_1^M = ({P^M}^T P^M)^{-1} {P^M}^T U^M, \quad N_2^M = ({P^M}^T P^M)^{-1} {P^M}^T V^M, \quad N_3^M = ({P^M}^T P^M)^{-1} {P^M}^T I_{k\times 1};$$
Step 6: realize the spatial alignment of the vision sensor and the millimeter-wave radar:
According to the actual distance of the calibration target scanned by the millimeter-wave radar, determine which of the segments of step 2 the distance falls in, look up the corresponding homography spatial transformation matrix among the m1+m2 results computed in step 5, and perform the spatial alignment.
2. The segmented space alignment method based on homography transformation matrices according to claim 1, characterized in that when the calibration distance of step 2 is 50 meters, 0–20 m is the short range and is divided into 4 segments, and 20–50 m is the long range and is divided into 3 segments.
3. The segmented space alignment method based on homography transformation matrices according to claim 1, characterized in that the method for computing the centroid coordinates of the image of each cell of the calibration target in step 4 is as follows:
S40: manually select a candidate region containing the calibration target;
S41: apply median filtering to the candidate-region image to remove noise;
S42: apply Sobel edge detection to the candidate-region image to obtain a binarized edge image of the calibration target;
S43: in the image coordinate system in pixels, find the pixel coordinates $u_{\min}$ and $u_{\max}$ of the points with the minimum and maximum coordinates along the u axis in the edge image of the calibration target, and the pixel coordinates $v_{\min}$ and $v_{\max}$ of the points with the minimum and maximum coordinates along the v axis; connect these four points with straight lines, clockwise or counterclockwise, to form a quadrilateral region, and within this region compute the centroid coordinates $(u_k^M, v_k^M)$ of the calibration target by

$$u_k^M = \frac{\sum_i \sum_j i\, f_k^M(i,j)}{\sum_i \sum_j f_k^M(i,j)} \qquad \text{and} \qquad v_k^M = \frac{\sum_i \sum_j j\, f_k^M(i,j)}{\sum_i \sum_j f_k^M(i,j)},$$

where $f_k^M(i,j)$ denotes the gray value of pixel $(i,j)$ in the quadrilateral region corresponding to the k-th cell of the target at the M-th segment distance.
CN201310013045.8A 2013-01-14 2013-01-14 Subsection space aligning method based on homography transformational matrix Active CN103065323B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201310013045.8A CN103065323B (en) 2013-01-14 2013-01-14 Subsection space aligning method based on homography transformational matrix

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201310013045.8A CN103065323B (en) 2013-01-14 2013-01-14 Subsection space aligning method based on homography transformational matrix

Publications (2)

Publication Number Publication Date
CN103065323A (en) 2013-04-24
CN103065323B CN103065323B (en) 2015-07-15

Family

ID=48107940

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201310013045.8A Active CN103065323B (en) 2013-01-14 2013-01-14 Subsection space aligning method based on homography transformational matrix

Country Status (1)

Country Link
CN (1) CN103065323B (en)

Cited By (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104200483A (en) * 2014-06-16 2014-12-10 南京邮电大学 Human body central line based target detection method under multi-camera environment
CN104280019A (en) * 2013-07-10 2015-01-14 德尔福电子(苏州)有限公司 All-round looking system calibration device based on flexible calibration plate
CN104464173A (en) * 2014-12-03 2015-03-25 国网吉林省电力有限公司白城供电公司 Power transmission line external damage protection system based on space image three-dimensional measurement
CN104965202A (en) * 2015-06-18 2015-10-07 奇瑞汽车股份有限公司 Barrier detection method and device
CN105818763A (en) * 2016-03-09 2016-08-03 乐卡汽车智能科技(北京)有限公司 Method, device and system for confirming distance of object around vehicle
CN106730106A (en) * 2016-11-25 2017-05-31 哈尔滨工业大学 The coordinate scaling method of the micro-injection system of robot assisted
CN108109173A (en) * 2016-11-25 2018-06-01 宁波舜宇光电信息有限公司 Vision positioning method, camera system and automation equipment
CN108226906A (en) * 2017-11-29 2018-06-29 深圳市易成自动驾驶技术有限公司 A kind of scaling method, device and computer readable storage medium
CN109471096A (en) * 2018-10-31 2019-03-15 奇瑞汽车股份有限公司 Multi-Sensor Target matching process, device and automobile
CN110660186A (en) * 2018-06-29 2020-01-07 杭州海康威视数字技术股份有限公司 Method and device for identifying target object in video image based on radar signal
CN110658518A (en) * 2018-06-29 2020-01-07 杭州海康威视数字技术股份有限公司 Target intrusion detection method and device
CN110879598A (en) * 2019-12-11 2020-03-13 北京踏歌智行科技有限公司 Information fusion method and device of multiple sensors for vehicle
CN111429530A (en) * 2020-04-10 2020-07-17 浙江大华技术股份有限公司 Coordinate calibration method and related device
CN111538008A (en) * 2019-01-18 2020-08-14 杭州海康威视数字技术股份有限公司 Transformation matrix determining method, system and device
CN112162252A (en) * 2020-09-25 2021-01-01 南昌航空大学 Data calibration method for millimeter wave radar and visible light sensor
CN112348863A (en) * 2020-11-09 2021-02-09 Oppo广东移动通信有限公司 Image alignment method, image alignment device and terminal equipment
US20210318426A1 (en) * 2018-05-21 2021-10-14 Johnson Controls Tyco IP Holdings LLP Building radar-camera surveillance system

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101033972A (en) * 2007-02-06 2007-09-12 华中科技大学 Method for obtaining three-dimensional information of space non-cooperative object
CN101299270A (en) * 2008-05-27 2008-11-05 东南大学 Multiple video cameras synchronous quick calibration method in three-dimensional scanning system
US20110025841A1 (en) * 2009-07-29 2011-02-03 Ut-Battelle, Llc Estimating vehicle height using homographic projections
CN102062576A (en) * 2010-11-12 2011-05-18 浙江大学 Device for automatically marking additional external axis robot based on laser tracking measurement and method thereof

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101033972A (en) * 2007-02-06 2007-09-12 华中科技大学 Method for obtaining three-dimensional information of space non-cooperative object
CN101299270A (en) * 2008-05-27 2008-11-05 东南大学 Multiple video cameras synchronous quick calibration method in three-dimensional scanning system
US20110025841A1 (en) * 2009-07-29 2011-02-03 Ut-Battelle, Llc Estimating vehicle height using homographic projections
CN102062576A (en) * 2010-11-12 2011-05-18 浙江大学 Device for automatically marking additional external axis robot based on laser tracking measurement and method thereof

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
XIANRU LIU ET AL.: "Advanced Obstacles Detection and Tracking by Fusing Millimeter Wave Radar and Image Sensor Data", 《INTERNATIONAL CONFERENCE ON CONTROL, AUTOMATION AND SYSTEMS 2010》 *

Cited By (28)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104280019A (en) * 2013-07-10 2015-01-14 德尔福电子(苏州)有限公司 All-round looking system calibration device based on flexible calibration plate
CN104200483B (en) * 2014-06-16 2018-05-18 南京邮电大学 Object detection method based on human body center line in multi-cam environment
CN104200483A (en) * 2014-06-16 2014-12-10 南京邮电大学 Human body central line based target detection method under multi-camera environment
CN104464173A (en) * 2014-12-03 2015-03-25 国网吉林省电力有限公司白城供电公司 Power transmission line external damage protection system based on space image three-dimensional measurement
CN104965202A (en) * 2015-06-18 2015-10-07 奇瑞汽车股份有限公司 Barrier detection method and device
CN105818763A (en) * 2016-03-09 2016-08-03 乐卡汽车智能科技(北京)有限公司 Method, device and system for confirming distance of object around vehicle
CN105818763B (en) * 2016-03-09 2018-06-22 睿驰智能汽车(广州)有限公司 A kind of method, apparatus and system of determining vehicle periphery object distance
CN106730106A (en) * 2016-11-25 2017-05-31 哈尔滨工业大学 The coordinate scaling method of the micro-injection system of robot assisted
CN108109173A (en) * 2016-11-25 2018-06-01 宁波舜宇光电信息有限公司 Vision positioning method, camera system and automation equipment
CN106730106B (en) * 2016-11-25 2019-10-08 哈尔滨工业大学 The coordinate scaling method of the micro-injection system of robot assisted
CN108109173B (en) * 2016-11-25 2022-06-28 宁波舜宇光电信息有限公司 Visual positioning method, camera system and automation equipment
CN108226906A (en) * 2017-11-29 2018-06-29 深圳市易成自动驾驶技术有限公司 A kind of scaling method, device and computer readable storage medium
CN108226906B (en) * 2017-11-29 2019-11-26 深圳市易成自动驾驶技术有限公司 A kind of scaling method, device and computer readable storage medium
US11733370B2 (en) * 2018-05-21 2023-08-22 Johnson Controls Tyco IP Holdings LLP Building radar-camera surveillance system
US20210318426A1 (en) * 2018-05-21 2021-10-14 Johnson Controls Tyco IP Holdings LLP Building radar-camera surveillance system
CN110658518A (en) * 2018-06-29 2020-01-07 杭州海康威视数字技术股份有限公司 Target intrusion detection method and device
CN110658518B (en) * 2018-06-29 2022-01-21 杭州海康威视数字技术股份有限公司 Target intrusion detection method and device
CN110660186B (en) * 2018-06-29 2022-03-01 杭州海康威视数字技术股份有限公司 Method and device for identifying target object in video image based on radar signal
CN110660186A (en) * 2018-06-29 2020-01-07 杭州海康威视数字技术股份有限公司 Method and device for identifying target object in video image based on radar signal
CN109471096B (en) * 2018-10-31 2023-06-27 奇瑞汽车股份有限公司 Multi-sensor target matching method and device and automobile
CN109471096A (en) * 2018-10-31 2019-03-15 奇瑞汽车股份有限公司 Multi-Sensor Target matching process, device and automobile
CN111538008A (en) * 2019-01-18 2020-08-14 杭州海康威视数字技术股份有限公司 Transformation matrix determining method, system and device
CN110879598A (en) * 2019-12-11 2020-03-13 北京踏歌智行科技有限公司 Information fusion method and device of multiple sensors for vehicle
CN111429530A (en) * 2020-04-10 2020-07-17 浙江大华技术股份有限公司 Coordinate calibration method and related device
CN111429530B (en) * 2020-04-10 2023-06-02 浙江大华技术股份有限公司 Coordinate calibration method and related device
CN112162252A (en) * 2020-09-25 2021-01-01 南昌航空大学 Data calibration method for millimeter wave radar and visible light sensor
CN112162252B (en) * 2020-09-25 2023-07-18 南昌航空大学 Data calibration method for millimeter wave radar and visible light sensor
CN112348863A (en) * 2020-11-09 2021-02-09 Oppo广东移动通信有限公司 Image alignment method, image alignment device and terminal equipment

Also Published As

Publication number Publication date
CN103065323B (en) 2015-07-15

Similar Documents

Publication Publication Date Title
CN103065323B (en) Subsection space aligning method based on homography transformational matrix
CN107272021B (en) Object detection using radar and visually defined image detection areas
CN108369743B (en) Mapping a space using a multi-directional camera
CN108419446B (en) System and method for laser depth map sampling
Siegemund et al. A temporal filter approach for detection and reconstruction of curbs and road surfaces based on conditional random fields
CN102917171B (en) Based on the small target auto-orientation method of pixel
Fruh et al. Fast 3D model generation in urban environments
CN102254318A (en) Method for measuring speed through vehicle road traffic videos based on image perspective projection transformation
CN113657224A (en) Method, device and equipment for determining object state in vehicle-road cooperation
JP2004086779A (en) Obstacle detection device and its method
CN113865580A (en) Map construction method and device, electronic equipment and computer readable storage medium
CN109828267A (en) The Intelligent Mobile Robot detection of obstacles and distance measuring method of Case-based Reasoning segmentation and depth camera
CN113223075A (en) Ship height measuring system and method based on binocular camera
Li et al. Automatic targetless LiDAR–camera calibration: a survey
CN114743021A (en) Fusion method and system of power transmission line image and point cloud data
CN105116886A (en) Robot autonomous walking method
CN114325634A (en) Method for extracting passable area in high-robustness field environment based on laser radar
KR101030317B1 (en) Apparatus for tracking obstacle using stereo vision and method thereof
CN114413958A (en) Monocular vision distance and speed measurement method of unmanned logistics vehicle
Gokhool et al. A dense map building approach from spherical RGBD images
Jiang et al. Bridge Deformation Measurement Using Unmanned Aerial Dual Camera and Learning‐Based Tracking Method
CN111780744A (en) Mobile robot hybrid navigation method, equipment and storage device
Chen et al. Low cost and efficient 3D indoor mapping using multiple consumer RGB-D cameras
Cigla et al. Image-based visual perception and representation for collision avoidance
Hasegawa et al. Real-time interpolation method for sparse lidar point cloud using rgb camera

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant