CN103473761A - Automobile chassis three-dimensional elevation determination method based on binocular linear array CCD - Google Patents

Automobile chassis three-dimensional elevation determination method based on binocular linear array CCD Download PDF

Info

Publication number
CN103473761A
CN103473761A · CN2013103591206A · CN201310359120A
Authority
CN
China
Prior art keywords
lab
chara
characteristic
image
characteristic curve
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN2013103591206A
Other languages
Chinese (zh)
Other versions
CN103473761B (en
Inventor
朱虹
俞帅男
王栋
王芙
张喜
王佳
高磊
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Xian University of Technology
Original Assignee
Xian University of Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Xian University of Technology filed Critical Xian University of Technology
Priority to CN201310359120.6A priority Critical patent/CN103473761B/en
Publication of CN103473761A publication Critical patent/CN103473761A/en
Application granted granted Critical
Publication of CN103473761B publication Critical patent/CN103473761B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Image Analysis (AREA)
  • Image Processing (AREA)

Abstract

The invention discloses an automobile chassis three-dimensional elevation determination method based on a binocular linear array CCD. The method is implemented in the following steps: 1. apply a gray-scale closing operation to the automobile chassis images acquired by the binocular linear array CCD; 2. extract features from the gray-scale closing result obtained in step 1; 3. perform feature matching between the left and right views; 4. obtain the elevation information of the three-dimensional objects. By selecting matching features suited to the nature of the objects found on an automobile chassis, and working within a real-time automobile chassis safety monitoring system, the method displays the system's detection result in three dimensions: sparse three-dimensional elevation data are obtained on the basis of correctly matched binocular features, from which dense three-dimensional elevation data can then be derived. The restrictions of orientation-map-based methods are thus overcome, and the geometric constraint relations between different views are recovered relying only on feature matching between the images, so the method has good application prospects.

Description

Method for determining the three-dimensional elevation of an automobile chassis based on a binocular linear array CCD
Technical field
The invention belongs to the technical field of binocular linear array CCD three-dimensional imaging, and specifically relates to a method for determining the three-dimensional elevation of an automobile chassis based on a binocular linear array CCD.
Background technology
Detection of safety threats on automobile chassis is an important research direction in intelligent traffic safety management systems. At existing vehicle security check sites, chassis inspection has long relied mainly on manual checks and hand-held instruments; at some sites staff also use mirrors to observe whether contraband is attached to the chassis, and some departments rely on police dogs. These methods are time-consuming, labor-intensive and inefficient.
Summary of the invention
The purpose of the invention is to provide a method for determining the three-dimensional elevation of an automobile chassis based on a binocular linear array CCD, which solves the problem that the manual observation of automobile chassis used in the prior art is time-consuming, labor-intensive and inefficient.
The technical solution adopted by the present invention is a method for determining the three-dimensional elevation of an automobile chassis based on a binocular linear array CCD, implemented in the following steps:
Step 1: apply a gray-scale closing operation to the automobile chassis images collected by the binocular linear array CCD;
Step 2: extract features from the gray-scale closing result obtained in step 1;
Step 3: perform feature matching between the left and right views;
Step 4: obtain the elevation information of the three-dimensional objects.
The beneficial effect of the invention is that, by selecting matching features suited to the objects found on an automobile chassis and working within a real-time automobile chassis safety monitoring system, the detection result of the system is displayed in three dimensions; sparse three-dimensional elevation data are obtained on the basis of correctly matched binocular features, from which dense three-dimensional elevation data can then be derived.
Description of the drawings
Fig. 1 is a schematic diagram of the binocular camera imaging model used in the method of the invention.
Embodiment
The present invention is described in detail below in conjunction with the drawings and specific embodiments.
The method of the present invention for determining the three-dimensional elevation of an automobile chassis based on a binocular linear array CCD is implemented in the following steps:
Step 1: apply a gray-scale closing operation to the automobile chassis images collected by the binocular linear array CCD.
Let the images collected by the binocular linear array CCD be of size m × n, and name the left and right views [I_L(x,y)]_{m×n} and [I_R(x,y)]_{m×n} respectively. A Gaussian template is used to apply the gray-scale closing to each view. The Gaussian template used for the closing is [G(x,y)]_{m×n}, computed as:
G(x,y) = K · (1/(2πσ²)) · exp(−(x² + y²)/(2σ²)),  x = 1,2,…,m; y = 1,2,…,n   (1)
where K is a scale factor whose range is K ∈ [80,100] for images with 256 gray levels, and σ is an empirical value with range σ ∈ [10,20];
Using the Gaussian template obtained from formula (1), apply gray-scale dilation to the left view [I_L(x,y)]_{m×n} and the right view [I_R(x,y)]_{m×n}.
Note that [G(x,y)]_{m×n} and G(x,y) are not the same concept: [G(x,y)]_{m×n} denotes the m × n matrix, while formula (1) gives each element G(x,y), x = 1,2,…,m; y = 1,2,…,n, of that matrix. First, compute the gray-scale sum with the template [G(x,y)]_{m×n} according to formula (2), which yields Ĩ_L(x,y) and Ĩ_R(x,y):
Ĩ_k(x,y) = I_k(x,y) + G(x,y),  x = 1,2,…,m; y = 1,2,…,n; k = L,R   (2)
Then, from Ĩ_L(x,y) and Ĩ_R(x,y), obtain the gray-scale dilation results Î_L(x,y) and Î_R(x,y) according to formula (3), x = 1,2,…,m; y = 1,2,…,n; k = L,R (formula (3) is reproduced only as an image in the source).
The dilated left and right views are then gray-scale eroded.
First, according to formula (4), compute the gray-scale difference between the dilation results Î_L(x,y), Î_R(x,y) obtained from formula (3) and the template [G(x,y)]_{m×n}, which yields Ī_L(x,y) and Ī_R(x,y):
Ī_k(x,y) = Î_k(x,y) − G(x,y),  x = 1,2,…,m; y = 1,2,…,n; k = L,R   (4)
Then, from Ī_L(x,y) and Ī_R(x,y), obtain the gray-scale closing results [I_L^close(x,y)]_{m×n} and [I_R^close(x,y)]_{m×n} according to formula (5), x = 1,2,…,m; y = 1,2,…,n; k = L,R (formula (5) is reproduced only as an image in the source).
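For illustration only (not part of the original disclosure), the gray-scale closing of step 1 can be sketched in Python with SciPy's gray-scale morphology; the local window size ksize and the values of K and σ below are assumptions chosen within the ranges stated above.

```python
import numpy as np
from scipy import ndimage

def gaussian_structuring_element(ksize=31, sigma=15.0, K=90.0):
    # Gaussian-shaped, non-flat structuring element in the spirit of formula (1);
    # ksize is an assumed local window size (the patent defines G over the full m x n grid).
    ax = np.arange(ksize) - ksize // 2
    xx, yy = np.meshgrid(ax, ax)
    return K * np.exp(-(xx ** 2 + yy ** 2) / (2.0 * sigma ** 2)) / (2.0 * np.pi * sigma ** 2)

def grayscale_closing(img, ksize=31, sigma=15.0, K=90.0):
    # Gray-scale closing = dilation (local max of pixel + G) followed by
    # erosion (local min of pixel - G) with the same structuring element.
    g = gaussian_structuring_element(ksize, sigma, K)
    return ndimage.grey_closing(img.astype(np.float64), structure=g)

# usage: I_close_L = grayscale_closing(I_L); I_close_R = grayscale_closing(I_R)
```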
Step 2: extract features from the gray-scale closing result obtained in step 1.
Step 2a: extract the Canny edges of the images [I_L^close(x,y)]_{m×n} and [I_R^close(x,y)]_{m×n} produced by step 1 to obtain the edge images [I_L^edge(x,y)]_{m×n} and [I_R^edge(x,y)]_{m×n}, and then remove small-area interference (the Canny edge extraction method is described in detail in standard textbooks and is not repeated here).
Apply eight-connected-component labelling to the images [I_L^edge(x,y)]_{m×n} and [I_R^edge(x,y)]_{m×n} (labelling methods are described in detail in standard textbooks and are not repeated here). Suppose that after labelling, image [I_L^edge(x,y)]_{m×n} yields N_L connected components with areas Area_i^L, where i is the label number, i = 1,2,…,N_L, and image [I_R^edge(x,y)]_{m×n} yields N_R connected components with areas Area_i^R, i = 1,2,…,N_R.
Then set a noise-elimination threshold Area_Th (its size is determined by the image resolution, the image acquisition quality and the size of the noise that may occur) and process all connected components: if the area of a component is greater than Area_Th, keep it; if it is smaller than Area_Th, treat it as noise and remove it.
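A minimal sketch of step 2a using OpenCV follows; the Canny thresholds and the noise threshold Area_Th take illustrative values only (the patent leaves them to the image resolution and acquisition quality).

```python
import cv2
import numpy as np

def edge_features(img_close, canny_lo=50, canny_hi=150, area_th=30):
    # Canny edges, 8-connected labelling, and removal of small-area components.
    edges = cv2.Canny(img_close.astype(np.uint8), canny_lo, canny_hi)
    n, labels, stats, _ = cv2.connectedComponentsWithStats(edges, connectivity=8)
    kept = np.zeros_like(edges)
    for i in range(1, n):                       # label 0 is the background
        if stats[i, cv2.CC_STAT_AREA] > area_th:
            kept[labels == i] = 255
    return kept
```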
Step 2b: remove the transverse edges from the edge images.
Since three-dimensional modelling only requires the left-right parallax, the transverse edges are removed in order to suppress their interference.
A rectangular template [M(x,y)]_{p×q} of size p × q is used, with M(x,y) = 1, x = 1,2,…,p; y = 1,2,…,q. The value of p is odd and is used to judge the thickness of a transverse edge, with range p ∈ {3,5,7}; the value of q is odd and is used to judge the minimum length of a transverse edge, with range q ∈ {5,7,9,11}.
Apply the template [M(x,y)]_{p×q} to the two images [I_L^edge(x,y)]_{m×n} and [I_R^edge(x,y)]_{m×n} obtained in step 2a: with the centre of the template placed on the pixel to be processed, perform the following operation to obtain the feature images [MI_L^edge(x,y)]_{m×n} and [MI_R^edge(x,y)]_{m×n} with the transverse edges removed:
MI_k^edge(x,y) = 0 if ∏_{i=x−(p−1)/2}^{x+(p−1)/2} ∑_{j=y−(q−1)/2}^{y+(q−1)/2} I_k^edge(x+i, y+j) · M(i,j) = 0,
MI_k^edge(x,y) = 1 if ∏_{i=x−(p−1)/2}^{x+(p−1)/2} ∑_{j=y−(q−1)/2}^{y+(q−1)/2} I_k^edge(x+i, y+j) · M(i,j) ≠ 0,   (6)
where x = (p+1)/2, (p+1)/2+1, …, m−(p+1)/2; y = (q+1)/2, (q+1)/2+1, …, n−(q+1)/2; k = L,R;
MI_k^edge(x,y) = 0,   (7)
where x = 1,2,…,(p+1)/2−1; y = 1,2,…,(q+1)/2−1; k = L,R.
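A minimal sketch of the transverse-edge test of formulas (6) and (7), reading the template indices as offsets about the centre pixel: a pixel is kept only when every one of the p rows of the p × q window centred on it contains at least one edge pixel, so edges that do not continue vertically across the window (transverse edges) are dropped. The p and q values below are assumptions taken from the stated ranges.

```python
import numpy as np

def remove_transverse_edges(edge_img, p=3, q=7):
    # Keep pixel (x, y) only if every row of the p x q window centred on it
    # contains at least one edge pixel; border pixels stay 0 as in formula (7).
    e = (edge_img > 0).astype(np.uint8)
    m, n = e.shape
    out = np.zeros_like(e)
    hp, hq = (p - 1) // 2, (q - 1) // 2
    for x in range(hp, m - hp):
        for y in range(hq, n - hq):
            window = e[x - hp:x + hp + 1, y - hq:y + hq + 1]
            row_sums = window.sum(axis=1)       # one sum per window row
            out[x, y] = 1 if np.all(row_sums > 0) else 0
    return out
```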
Step 2c: as in step 2a, remove the small-area regions from the images [MI_L^edge(x,y)]_{m×n} and [MI_R^edge(x,y)]_{m×n}; for convenience of description, the images with the small areas removed are still denoted [MI_L^edge(x,y)]_{m×n} and [MI_R^edge(x,y)]_{m×n}.
Step 2d: remove unreliable edge information.
Because the depth information of a three-dimensional object in the image is reflected in the parallax between the left view [I_L(x,y)]_{m×n} and the right view [I_R(x,y)]_{m×n}, the parallax is used to mark three-dimensional objects.
Assume the left and right views have been registered (registration methods for two images are described in detail in textbooks and are not repeated here). Because linear array CCD imaging is used, the same object in the left and right views has no rotational relationship; let the position offsets of the registered reference plane between the left and right views be Δx and Δy. The disparity map [∇I_{L,R}(x,y)]_{m×n} used to detect three-dimensional objects is computed as:
∇I_{L,R}(x,y) = 0 if |I_L(x,y) − I_R(x+Δx, y+Δy)| < ε,
∇I_{L,R}(x,y) = 1 if |I_L(x,y) − I_R(x+Δx, y+Δy)| ≥ ε,   x = 1,2,…,m; y = 1,2,…,n   (8)
where ε is an adjustment parameter for the parallax value; taking into account the influence of the lighting environment in practical applications, the parallax of non-three-dimensional objects also shows some difference, so ε is set empirically according to the environment, with range ε ∈ [15,30].
Then, on the disparity map [∇I_{L,R}(x,y)]_{m×n}, the parts of the edge features obtained in step 2 that do not belong to three-dimensional objects are removed.
Because the edge lines of a three-dimensional object are, in principle, distributed over positions where the disparity map [∇I_{L,R}(x,y)]_{m×n} equals 1, [MI_L^edge(x,y)]_{m×n} and [MI_R^edge(x,y)]_{m×n} are each ANDed with [∇I_{L,R}(x,y)]_{m×n} to obtain the left and right boundary feature images of the three-dimensional objects, [I_L^chara(x,y)]_{m×n} and [I_R^chara(x,y)]_{m×n}:
I_k^chara(x,y) = MI_k^edge(x,y) · ∇I_{L,R}(x,y),   (9)
where x = 1,2,…,m; y = 1,2,…,n; k = L,R.
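A minimal numpy sketch of formulas (8) and (9); the registration offsets Δx, Δy and the threshold ε are assumed to be known beforehand, and the values below are placeholders only.

```python
import numpy as np

def boundary_features(I_L, I_R, MI_edge_L, MI_edge_R, dx=0, dy=0, eps=20):
    # Binary disparity map of formula (8), then the AND of formula (9).
    shifted_R = np.roll(np.roll(I_R, -dx, axis=0), -dy, axis=1)   # I_R(x+dx, y+dy)
    disp = (np.abs(I_L.astype(np.int32) - shifted_R.astype(np.int32)) >= eps).astype(np.uint8)
    chara_L = (MI_edge_L > 0).astype(np.uint8) & disp
    chara_R = (MI_edge_R > 0).astype(np.uint8) & disp
    return chara_L, chara_R, disp
```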
Step 3: perform feature matching between the left and right views.
Step 3a: compute the correlation between the features of the left and right feature images [I_L^chara(x,y)]_{m×n} and [I_R^chara(x,y)]_{m×n}.
Label the left and right feature images [I_L^chara(x,y)]_{m×n} and [I_R^chara(x,y)]_{m×n}, assigning label numbers in left-to-right order of the column coordinate of each connected component's centre, so that the labels of [I_L^chara(x,y)]_{m×n} are 1,2,…,Lab_L and the labels of [I_R^chara(x,y)]_{m×n} are 1,2,…,Lab_R. Then compute the feature correlation between the left and right feature images.
The feature correlation comprises the following four correlation parameters:
1) The number of common rows N_Match(i,j) between characteristic line i of the left feature image [I_L^chara(x,y)]_{m×n} and characteristic line j of the right feature image [I_R^chara(x,y)]_{m×n}, i = 1,2,…,Lab_L, j = 1,2,…,Lab_R.
First, compute the common row end coordinate b(i,j) between the characteristic lines of the left and right feature images according to formula (10):
b(i,j) = min{ b_L(i), b_R(j) },  i = 1,2,…,Lab_L, j = 1,2,…,Lab_R,   (10)
where b_L(i) is the row end coordinate of characteristic line i in the left feature image [I_L^chara(x,y)]_{m×n}, and b_R(j) is the row end coordinate of characteristic line j in the right feature image [I_R^chara(x,y)]_{m×n}.
Then compute the common row start coordinate t(i,j) between the characteristic lines of the left and right feature images according to formula (11):
t(i,j) = max{ t_L(i), t_R(j) },  i = 1,2,…,Lab_L, j = 1,2,…,Lab_R,   (11)
where t_L(i) is the row start coordinate of characteristic line i in the left feature image [I_L^chara(x,y)]_{m×n}, and t_R(j) is the row start coordinate of characteristic line j in the right feature image [I_R^chara(x,y)]_{m×n}.
From the row start coordinates t(i,j) and row end coordinates b(i,j) obtained from formulas (10) and (11), the number of common rows N_Match(i,j) between characteristic line i of the left feature image and characteristic line j of the right feature image is obtained from formula (12):
N_Match(i,j) = b(i,j) − t(i,j),  i = 1,2,…,Lab_L, j = 1,2,…,Lab_R,   (12)
2) From the number of common rows N_Match(i,j) obtained from formula (12), obtain from formula (13) the ratio L_Match(i,j) of the number of common rows to the length of the shorter characteristic line, for characteristic line i of the left feature image and characteristic line j of the right feature image, i = 1,2,…,Lab_L, j = 1,2,…,Lab_R:
L_Match(i,j) = N_Match(i,j) / L(i,j),  i = 1,2,…,Lab_L, j = 1,2,…,Lab_R,   (13)
where:
L(i,j) = min{ b_L(i) − t_L(i), b_R(j) − t_R(j) },  i = 1,2,…,Lab_L, j = 1,2,…,Lab_R,   (14)
3) Compute according to formula (15) the slope difference S_Match(i,j) between characteristic line i of the left feature image [I_L^chara(x,y)]_{m×n} and characteristic line j of the right feature image [I_R^chara(x,y)]_{m×n}, i = 1,2,…,Lab_L, j = 1,2,…,Lab_R.
Let (x_t^L(i), y_t^L(i)) be the start coordinate of characteristic line i in the left feature image and (x_b^L(i), y_b^L(i)) its end coordinate; let (x_t^R(j), y_t^R(j)) be the start coordinate of characteristic line j in the right feature image [I_R^chara(x,y)]_{m×n} and (x_b^R(j), y_b^R(j)) its end coordinate:
S_Match(i,j) = | (y_b^L(i) − y_t^L(i)) / (x_b^L(i) − x_t^L(i)) − (y_b^R(j) − y_t^R(j)) / (x_b^R(j) − x_t^R(j)) |,  i = 1,2,…,Lab_L, j = 1,2,…,Lab_R,   (15)
4) Compute according to formula (16) the accumulated gray-scale difference R_Match(i,j) of the feature blocks around characteristic line i of the left feature image [I_L^chara(x,y)]_{m×n} in the corresponding left view [I_L(x,y)]_{m×n} and characteristic line j of the right feature image [I_R^chara(x,y)]_{m×n} in the corresponding right view [I_R(x,y)]_{m×n}, i = 1,2,…,Lab_L, j = 1,2,…,Lab_R:
R_Match(i,j) = Σ_{(x,y)∈Ω_{i,j}} |I_L(x,y) − I_R(x,y)|,  i = 1,2,…,Lab_L, j = 1,2,…,Lab_R,   (16)
where Ω_{i,j} is the region obtained by expanding characteristic line i of the left feature image [I_L^chara(x,y)]_{m×n} by Δi to each side and characteristic line j of the right feature image [I_R^chara(x,y)]_{m×n} by Δj to each side; Δi and Δj are expansion ranges taken empirically, with ranges Δi ∈ [10,30] and Δj ∈ [10,30].
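A minimal sketch of correlation parameters (10)–(15) for one left/right line pair; the dictionary keys are illustrative names for each line's row span and end points, and formulas (10), (11) and (14) follow the common-row reading given above.

```python
def correlation_params(line_L, line_R):
    # line_X: {"t": start row, "b": end row, "x_t": .., "y_t": .., "x_b": .., "y_b": ..}
    t = max(line_L["t"], line_R["t"])            # common start row, formula (11)
    b = min(line_L["b"], line_R["b"])            # common end row, formula (10)
    n_match = b - t                              # common rows, formula (12)
    shorter = min(line_L["b"] - line_L["t"], line_R["b"] - line_R["t"])
    l_match = n_match / shorter if shorter > 0 else 0.0          # formulas (13)-(14)
    slope = lambda ln: (ln["y_b"] - ln["y_t"]) / (ln["x_b"] - ln["x_t"] + 1e-9)
    s_match = abs(slope(line_L) - slope(line_R))                 # formula (15)
    return n_match, l_match, s_match
```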
Step 3b: find the best matching pairs from the correlation coefficients obtained in step 3a.
1) According to formula (17), obtain the maximum N_Match^max of the common-row counts N_Match(i,j) obtained from formula (12):
N_Match^max = max{ N_Match(i,j) | i = 1,2,…,Lab_L; j = 1,2,…,Lab_R },   (17)
Then, according to formula (18), obtain the maximum L_Match^max of the common-row ratios L_Match(i,j) obtained from formula (13):
L_Match^max = max{ L_Match(i,j) | i = 1,2,…,Lab_L; j = 1,2,…,Lab_R },   (18)
2) According to formula (19), obtain the minimum S_Match^min of the slope differences S_Match(i,j) obtained from formula (15):
S_Match^min = min{ S_Match(i,j) | i = 1,2,…,Lab_L; j = 1,2,…,Lab_R },   (19)
Build a matrix [a(i,j)] that marks the left/right view feature pairs that cannot be paired; its values are given by formula (20) (reproduced only as an image in the source), with a(i,j) = 0 when the pair (i,j) remains admissible for matching and a(i,j) = 1 otherwise, i = 1,2,…,Lab_L, j = 1,2,…,Lab_R,
where k_max and k_min are characteristic-line matching-degree parameters, assigned according to the acceptable degree of similarity between characteristic lines; they are generally taken empirically, with ranges k_max ∈ [0.5,0.7] and k_min ∈ [2,6];
3) From the pairing-mark matrix and the accumulated gray-scale differences R_Match(i,j) obtained from formula (16) between the feature blocks around characteristic line i of the left feature image [I_L^chara(x,y)]_{m×n} in the left view [I_L(x,y)]_{m×n} and characteristic line j of the right feature image [I_R^chara(x,y)]_{m×n} in the right view [I_R(x,y)]_{m×n}, obtain the index pair (i*, j*) of the currently matched characteristic lines that correspond to the same object in the left and right feature images, computed as:
(i*, j*) = arg min_{i,j} { R_Match(i,j) | i = 1,2,…,Lab_L; j = 1,2,…,Lab_R; a(i,j) = 0 },   (21)
Step 3c: repeat step 3b until all characteristic lines have been matched.
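A minimal sketch of the greedy selection of steps 3b(3) and 3c; the admissibility matrix a is assumed to have been built beforehand from formula (20) (which is only an image in the source), with a(i,j) = 0 for admissible pairs and 1 otherwise.

```python
import numpy as np

def greedy_match(R, a):
    # Repeatedly pick the admissible pair with the smallest accumulated
    # gray-scale difference R (formula (21)), then retire both lines.
    cost = np.where(a.astype(bool), np.inf, R.astype(np.float64))
    pairs = []
    while np.isfinite(cost).any():
        i, j = np.unravel_index(np.argmin(cost), cost.shape)
        pairs.append((int(i), int(j)))
        cost[i, :] = np.inf          # each left line is matched at most once
        cost[:, j] = np.inf          # each right line is matched at most once
    return pairs
```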
Step 4: obtain the elevation information of the three-dimensional objects.
Step 4a: obtain the elevation of the characteristic lines.
Traverse the left feature image [I_L^chara(x,y)]_{m×n} and find, in turn, all successfully matched characteristic line pairs in the right feature image [I_R^chara(x,y)]_{m×n}; renumber them as pairs (i,i), i = 1,2,…,Lab_LR, where Lab_LR is the number of successfully matched characteristic line pairs.
For the characteristic line of the i-th matched pair (i,i), i = 1,2,…,Lab_LR, let its row coordinate in the left feature image [I_L^chara(x,y)]_{m×n} be x_i^L and its column coordinate be y_i^L; in the right feature image [I_R^chara(x,y)]_{m×n}, the column coordinate of the matched characteristic line at the same row coordinate as characteristic line i in the left feature image (i.e. x_R = x_i^L) is y_i^R. The quantity e_i = y_i^R − y_i^L is called its parallax, i = 1,2,…,Lab_LR.
The elevation of a three-dimensional object is obtained from the parallax. Define the elevation matrix [h(x,y)]_{m×n}, initialized as an all-zero matrix.
With reference to Fig. 1, the schematic diagram of the binocular camera imaging model, O_L is the left camera and O_R is the right camera. Suppose a spatial pixel point (x,y,z) is imaged on the matched characteristic lines of the left feature image [I_L^chara(x,y)]_{m×n} and the right feature image [I_R^chara(x,y)]_{m×n}; the column coordinate of the imaged point in the left feature image is y_l and in the right feature image is y_r, corresponding to its i-th matched pair (i,i), y_l = y_i^L, y_r = y_i^R. Let f be the camera focal length, z the distance of the spatial point from the camera plane, and b the distance between the left and right cameras. From the imaging model:
b / z = ((b + y_r) − y_l) / (z − f),  that is:  z = b · f / (y_l − y_r),   (22)
h(x,y) = C − z,   (23)
where C is the fixed distance from the camera plane to the automobile chassis reference plane, and h(x,y) is the elevation on the detected characteristic line of the automobile chassis three-dimensional object.
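A minimal sketch of formulas (22) and (23); the focal length f, baseline b and camera-to-reference-plane distance C are calibration constants of the concrete installation, and the example numbers in the usage line are placeholders only.

```python
def line_elevation(y_l, y_r, f, b, C):
    # Depth of the matched line from its parallax (formula (22)),
    # then elevation above the chassis reference plane (formula (23)).
    z = b * f / (y_l - y_r)
    return C - z

# usage (placeholder values): h = line_elevation(y_l=420.0, y_r=380.0, f=35.0, b=120.0, C=900.0)
```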
Step 4b: find the paired elevation characteristic lines.
After the transverse edges have been removed in step 2, a three-dimensional object is bounded by two vertical edges. Therefore, according to step 4a, the elevation matrix [h(x,y)]_{m×n} is filled from the characteristic line pairs successfully matched in step 3, each pair giving the elevation of its characteristic line. Then, for each characteristic line i of the left feature image [I_L^chara(x,y)]_{m×n}, i = 1,2,…,Lab_LR, find its elevation at the corresponding position in the matrix [h(x,y)]_{m×n}. Using the principle that, within the elevation matrix, the two characteristic lines belonging to the same object have the closest elevations and the largest number of row coordinates in common, detect the two characteristic lines in the elevation map that belong to the same object; these two characteristic lines serve as the left and right boundary lines of that three-dimensional object.
Step 4c: repeat step 4b until all elevation characteristic lines satisfying the criterion of step 4b have been paired.
Step 4d: take the central region enclosed by the elevation characteristic lines judged to belong to the same object as the object region, and assign to it the height value of the paired elevation lines, thereby completing the construction of the three-dimensional elevation map of the three-dimensional objects. A sketch of this region fill follows below.
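A minimal sketch of the region fill of step 4d; the argument names are illustrative, giving for each shared row the columns of the paired left and right boundary lines.

```python
import numpy as np

def fill_object_region(h, rows, left_cols, right_cols, height_value):
    # Assign the paired lines' height to every column between the left and
    # right boundary lines on each shared row of the elevation matrix h.
    for r, c_l, c_r in zip(rows, left_cols, right_cols):
        h[r, c_l:c_r + 1] = height_value
    return h
```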
Through the above steps, accurate elevation detection of three-dimensional objects on an automobile chassis is realized. The method of the invention can also be extended to elevation detection of other objects based on binocular linear array CCD imaging.

Claims (5)

1. A method for determining the three-dimensional elevation of an automobile chassis based on a binocular linear array CCD, characterized in that it is implemented in the following steps:
Step 1: apply a gray-scale closing operation to the automobile chassis images collected by the binocular linear array CCD;
Step 2: extract features from the gray-scale closing result obtained in step 1;
Step 3: perform feature matching between the left and right views;
Step 4: obtain the elevation information of the three-dimensional objects.
2. The method for determining the three-dimensional elevation of an automobile chassis based on a binocular linear array CCD according to claim 1, characterized in that the detailed steps of step 1 are:
let the images collected by the binocular linear array CCD be of size m × n, and name the left and right views [I_L(x,y)]_{m×n} and [I_R(x,y)]_{m×n} respectively; a Gaussian template is used to apply the gray-scale closing to each view; the Gaussian template used for the closing is [G(x,y)]_{m×n}, computed as:
G(x,y) = K · (1/(2πσ²)) · exp(−(x² + y²)/(2σ²)),  x = 1,2,…,m; y = 1,2,…,n   (1)
where K is a scale factor with K ∈ [80,100] for images with 256 gray levels, and σ ∈ [10,20];
using the Gaussian template obtained from formula (1), apply gray-scale dilation to the left view [I_L(x,y)]_{m×n} and the right view [I_R(x,y)]_{m×n}:
first compute the gray-scale sum with the template [G(x,y)]_{m×n} according to formula (2), which yields Ĩ_L(x,y) and Ĩ_R(x,y):
Ĩ_k(x,y) = I_k(x,y) + G(x,y),  x = 1,2,…,m; y = 1,2,…,n; k = L,R   (2)
then, from Ĩ_L(x,y) and Ĩ_R(x,y), obtain the gray-scale dilation results Î_L(x,y) and Î_R(x,y) according to formula (3), x = 1,2,…,m; y = 1,2,…,n; k = L,R (formula (3) is reproduced only as an image in the source);
the dilated left and right views are then gray-scale eroded:
first, according to formula (4), compute the gray-scale difference between the dilation results of formula (3) and the template [G(x,y)]_{m×n}, which yields Ī_L(x,y) and Ī_R(x,y):
Ī_k(x,y) = Î_k(x,y) − G(x,y),  x = 1,2,…,m; y = 1,2,…,n; k = L,R   (4)
then, from Ī_L(x,y) and Ī_R(x,y), obtain the gray-scale closing results [I_L^close(x,y)]_{m×n} and [I_R^close(x,y)]_{m×n} according to formula (5), x = 1,2,…,m; y = 1,2,…,n; k = L,R (formula (5) is reproduced only as an image in the source).
3. The method for determining the three-dimensional elevation of an automobile chassis based on a binocular linear array CCD according to claim 1, characterized in that the detailed steps of step 2 are:
Step 2a: extract the Canny edges of the images [I_L^close(x,y)]_{m×n} and [I_R^close(x,y)]_{m×n} produced by step 1 to obtain the edge images [I_L^edge(x,y)]_{m×n} and [I_R^edge(x,y)]_{m×n}, then remove small-area interference;
apply eight-connected-component labelling to [I_L^edge(x,y)]_{m×n} and [I_R^edge(x,y)]_{m×n}; suppose that after labelling, [I_L^edge(x,y)]_{m×n} yields N_L connected components with areas Area_i^L, where i is the label number, i = 1,2,…,N_L, and [I_R^edge(x,y)]_{m×n} yields N_R connected components with areas Area_i^R, i = 1,2,…,N_R;
then set a noise-elimination threshold Area_Th and process all connected components: if the area of a component is greater than Area_Th, keep it; if it is smaller than Area_Th, treat it as noise and remove it;
Step 2b: remove the transverse edges from the edge images;
a rectangular template [M(x,y)]_{p×q} of size p × q is used, with M(x,y) = 1, x = 1,2,…,p; y = 1,2,…,q; p is odd with range p ∈ {3,5,7}; q is odd with range q ∈ {5,7,9,11};
apply the template [M(x,y)]_{p×q} to the two images [I_L^edge(x,y)]_{m×n} and [I_R^edge(x,y)]_{m×n} obtained in step 2a: with the centre of the template on the pixel to be processed, perform the following operation to obtain the feature images [MI_L^edge(x,y)]_{m×n} and [MI_R^edge(x,y)]_{m×n} with the transverse edges removed:
MI_k^edge(x,y) = 0 if ∏_{i=x−(p−1)/2}^{x+(p−1)/2} ∑_{j=y−(q−1)/2}^{y+(q−1)/2} I_k^edge(x+i, y+j) · M(i,j) = 0,
MI_k^edge(x,y) = 1 if ∏_{i=x−(p−1)/2}^{x+(p−1)/2} ∑_{j=y−(q−1)/2}^{y+(q−1)/2} I_k^edge(x+i, y+j) · M(i,j) ≠ 0,   (6)
where x = (p+1)/2, (p+1)/2+1, …, m−(p+1)/2; y = (q+1)/2, (q+1)/2+1, …, n−(q+1)/2; k = L,R;
MI_k^edge(x,y) = 0,   (7)
where x = 1,2,…,(p+1)/2−1; y = 1,2,…,(q+1)/2−1; k = L,R;
Step 2c: as in step 2a, remove the small-area regions from the images [MI_L^edge(x,y)]_{m×n} and [MI_R^edge(x,y)]_{m×n}; the images with the small areas removed are still denoted [MI_L^edge(x,y)]_{m×n} and [MI_R^edge(x,y)]_{m×n};
Step 2d: remove unreliable edge information;
assume the left and right views have been registered; let the position offsets of the registered reference plane between the left and right views be Δx and Δy; the disparity map [∇I_{L,R}(x,y)]_{m×n} used to detect three-dimensional objects is computed as:
∇I_{L,R}(x,y) = 0 if |I_L(x,y) − I_R(x+Δx, y+Δy)| < ε,
∇I_{L,R}(x,y) = 1 if |I_L(x,y) − I_R(x+Δx, y+Δy)| ≥ ε,   x = 1,2,…,m; y = 1,2,…,n   (8)
where ε is an adjustment parameter for the parallax value, with range ε ∈ [15,30];
then, on the disparity map [∇I_{L,R}(x,y)]_{m×n}, remove the parts of the edge features obtained in step 2 that do not belong to three-dimensional objects;
AND [MI_L^edge(x,y)]_{m×n} and [MI_R^edge(x,y)]_{m×n} respectively with [∇I_{L,R}(x,y)]_{m×n} to obtain the left and right boundary feature images of the three-dimensional objects, [I_L^chara(x,y)]_{m×n} and [I_R^chara(x,y)]_{m×n}:
I_k^chara(x,y) = MI_k^edge(x,y) · ∇I_{L,R}(x,y),   (9)
where x = 1,2,…,m; y = 1,2,…,n; k = L,R.
4. The method for determining the three-dimensional elevation of an automobile chassis based on a binocular linear array CCD according to claim 1, characterized in that the detailed steps of step 3 are:
Step 3a: compute the correlation between the features of the left and right feature images [I_L^chara(x,y)]_{m×n} and [I_R^chara(x,y)]_{m×n};
label the left and right feature images [I_L^chara(x,y)]_{m×n} and [I_R^chara(x,y)]_{m×n}, assigning label numbers in left-to-right order of the column coordinate of each connected component's centre, so that the labels of [I_L^chara(x,y)]_{m×n} are 1,2,…,Lab_L and the labels of [I_R^chara(x,y)]_{m×n} are 1,2,…,Lab_R; then compute the feature correlation between the left and right feature images;
the feature correlation comprises the following four correlation parameters:
1) the number of common rows N_Match(i,j) between characteristic line i of the left feature image [I_L^chara(x,y)]_{m×n} and characteristic line j of the right feature image [I_R^chara(x,y)]_{m×n}, i = 1,2,…,Lab_L, j = 1,2,…,Lab_R;
first, compute the common row end coordinate b(i,j) between the characteristic lines of the left and right feature images according to formula (10):
b(i,j) = min{ b_L(i), b_R(j) },  i = 1,2,…,Lab_L, j = 1,2,…,Lab_R,   (10)
where b_L(i) is the row end coordinate of characteristic line i in the left feature image [I_L^chara(x,y)]_{m×n} and b_R(j) is the row end coordinate of characteristic line j in the right feature image [I_R^chara(x,y)]_{m×n};
then compute the common row start coordinate t(i,j) between the characteristic lines of the left and right feature images according to formula (11):
t(i,j) = max{ t_L(i), t_R(j) },  i = 1,2,…,Lab_L, j = 1,2,…,Lab_R,   (11)
where t_L(i) is the row start coordinate of characteristic line i in the left feature image [I_L^chara(x,y)]_{m×n} and t_R(j) is the row start coordinate of characteristic line j in the right feature image [I_R^chara(x,y)]_{m×n};
from the row start coordinates t(i,j) and row end coordinates b(i,j) obtained from formulas (10) and (11), obtain from formula (12) the number of common rows N_Match(i,j):
N_Match(i,j) = b(i,j) − t(i,j),  i = 1,2,…,Lab_L, j = 1,2,…,Lab_R,   (12)
2) from the N_Match(i,j) obtained from formula (12), obtain from formula (13) the ratio L_Match(i,j) of the number of common rows to the length of the shorter characteristic line:
L_Match(i,j) = N_Match(i,j) / L(i,j),  i = 1,2,…,Lab_L, j = 1,2,…,Lab_R,   (13)
where:
L(i,j) = min{ b_L(i) − t_L(i), b_R(j) − t_R(j) },  i = 1,2,…,Lab_L, j = 1,2,…,Lab_R,   (14)
3) compute according to formula (15) the slope difference S_Match(i,j) between characteristic line i of the left feature image [I_L^chara(x,y)]_{m×n} and characteristic line j of the right feature image [I_R^chara(x,y)]_{m×n}, i = 1,2,…,Lab_L, j = 1,2,…,Lab_R;
let (x_t^L(i), y_t^L(i)) be the start coordinate of characteristic line i in the left feature image [I_L^chara(x,y)]_{m×n} and (x_b^L(i), y_b^L(i)) its end coordinate; let (x_t^R(j), y_t^R(j)) be the start coordinate of characteristic line j in the right feature image [I_R^chara(x,y)]_{m×n} and (x_b^R(j), y_b^R(j)) its end coordinate:
S_Match(i,j) = | (y_b^L(i) − y_t^L(i)) / (x_b^L(i) − x_t^L(i)) − (y_b^R(j) − y_t^R(j)) / (x_b^R(j) − x_t^R(j)) |,  i = 1,2,…,Lab_L, j = 1,2,…,Lab_R,   (15)
4) compute according to formula (16) the accumulated gray-scale difference R_Match(i,j) of the feature blocks around characteristic line i of the left feature image [I_L^chara(x,y)]_{m×n} in the corresponding left view [I_L(x,y)]_{m×n} and characteristic line j of the right feature image [I_R^chara(x,y)]_{m×n} in the corresponding right view [I_R(x,y)]_{m×n}, i = 1,2,…,Lab_L, j = 1,2,…,Lab_R:
R_Match(i,j) = Σ_{(x,y)∈Ω_{i,j}} |I_L(x,y) − I_R(x,y)|,  i = 1,2,…,Lab_L, j = 1,2,…,Lab_R,   (16)
where Ω_{i,j} is the region obtained by expanding characteristic line i of the left feature image [I_L^chara(x,y)]_{m×n} by Δi to each side and characteristic line j of the right feature image [I_R^chara(x,y)]_{m×n} by Δj to each side, with Δi ∈ [10,30] and Δj ∈ [10,30];
Step 3b: find the best matching pairs from the correlation coefficients obtained in step 3a;
1) according to formula (17), obtain the maximum N_Match^max of the common-row counts N_Match(i,j) obtained from formula (12):
N_Match^max = max{ N_Match(i,j) | i = 1,2,…,Lab_L; j = 1,2,…,Lab_R },   (17)
then, according to formula (18), obtain the maximum L_Match^max of the common-row ratios L_Match(i,j) obtained from formula (13):
L_Match^max = max{ L_Match(i,j) | i = 1,2,…,Lab_L; j = 1,2,…,Lab_R },   (18)
2) according to formula (19), obtain the minimum S_Match^min of the slope differences S_Match(i,j) obtained from formula (15):
S_Match^min = min{ S_Match(i,j) | i = 1,2,…,Lab_L; j = 1,2,…,Lab_R },   (19)
build a matrix [a(i,j)] that marks the left/right view feature pairs that cannot be paired; its values are given by formula (20) (reproduced only as an image in the source), with a(i,j) = 0 when the pair (i,j) remains admissible for matching and a(i,j) = 1 otherwise, i = 1,2,…,Lab_L, j = 1,2,…,Lab_R,
where k_max and k_min are characteristic-line matching-degree parameters with ranges k_max ∈ [0.5,0.7] and k_min ∈ [2,6];
3) from the pairing-mark matrix and the accumulated gray-scale differences R_Match(i,j) obtained from formula (16), obtain the index pair (i*, j*) of the currently matched characteristic lines that correspond to the same object in the left and right feature images, computed as:
(i*, j*) = arg min_{i,j} { R_Match(i,j) | i = 1,2,…,Lab_L; j = 1,2,…,Lab_R; a(i,j) = 0 },   (21)
Step 3c: repeat step 3b until all characteristic lines have been matched.
5. The method for determining the three-dimensional elevation of an automobile chassis based on a binocular linear array CCD according to claim 1, characterized in that the detailed steps of step 4 are:
Step 4a: obtain the elevation of the characteristic lines;
traverse the left feature image [I_L^chara(x,y)]_{m×n} and find, in turn, all successfully matched characteristic line pairs in the right feature image [I_R^chara(x,y)]_{m×n}; renumber them as pairs (i,i), i = 1,2,…,Lab_LR, where Lab_LR is the number of successfully matched characteristic line pairs;
for the characteristic line of the i-th matched pair (i,i), i = 1,2,…,Lab_LR, let its row coordinate in the left feature image [I_L^chara(x,y)]_{m×n} be x_i^L and its column coordinate be y_i^L; in the right feature image [I_R^chara(x,y)]_{m×n}, the column coordinate of the matched characteristic line at the same row coordinate as characteristic line i in the left feature image (i.e. x_R = x_i^L) is y_i^R; e_i = y_i^R − y_i^L is called its parallax, i = 1,2,…,Lab_LR;
the elevation of a three-dimensional object is obtained from the parallax; define the elevation matrix [h(x,y)]_{m×n}, initialized as an all-zero matrix;
let O_L be the left camera and O_R the right camera; suppose a spatial pixel point (x,y,z) is imaged on the matched characteristic lines of the left feature image [I_L^chara(x,y)]_{m×n} and the right feature image [I_R^chara(x,y)]_{m×n}; the column coordinate of the imaged point in the left feature image is y_l and in the right feature image is y_r, corresponding to its i-th matched pair (i,i), y_l = y_i^L, y_r = y_i^R; f is the camera focal length, z the distance of the spatial point from the camera plane, and b the distance between the left and right cameras; from the imaging model:
b / z = ((b + y_r) − y_l) / (z − f),  that is:  z = b · f / (y_l − y_r),   (22)
h(x,y) = C − z,   (23)
where C is the fixed distance from the camera plane to the automobile chassis reference plane, and h(x,y) is the elevation on the detected characteristic line of the automobile chassis three-dimensional object;
Step 4b: find the paired elevation characteristic lines;
after the transverse edges have been removed in step 2, a three-dimensional object is bounded by two vertical edges; therefore, according to step 4a, the elevation matrix [h(x,y)]_{m×n} is filled from the characteristic line pairs successfully matched in step 3, each pair giving the elevation of its characteristic line; then, for each characteristic line i of the left feature image [I_L^chara(x,y)]_{m×n}, i = 1,2,…,Lab_LR, find its elevation at the corresponding position in the matrix [h(x,y)]_{m×n}; using the principle that, within the elevation matrix, the two characteristic lines belonging to the same object have the closest elevations and the largest number of row coordinates in common, detect the two characteristic lines in the elevation map that belong to the same object; these two characteristic lines serve as the left and right boundary lines of that three-dimensional object;
Step 4c: repeat step 4b until all elevation characteristic lines satisfying the criterion of step 4b have been paired;
Step 4d: take the central region enclosed by the elevation characteristic lines judged to belong to the same object as the object region, and assign to it the height value of the paired elevation lines, thereby completing the construction of the three-dimensional elevation map of the three-dimensional objects.
CN201310359120.6A 2013-08-16 2013-08-16 Automobile chassis three-dimensional elevation determination method based on binocular line array CCD Expired - Fee Related CN103473761B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201310359120.6A CN103473761B (en) 2013-08-16 2013-08-16 Automobile chassis three-dimensional elevation determination method based on binocular line array CCD

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201310359120.6A CN103473761B (en) 2013-08-16 2013-08-16 Automobile chassis three-dimensional elevation determination method based on binocular line array CCD

Publications (2)

Publication Number Publication Date
CN103473761A true CN103473761A (en) 2013-12-25
CN103473761B CN103473761B (en) 2016-06-22

Family

ID=49798595

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201310359120.6A Expired - Fee Related CN103473761B (en) 2013-08-16 2013-08-16 Automobile chassis three-dimensional elevation determination method based on binocular line array CCD

Country Status (1)

Country Link
CN (1) CN103473761B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107808378A (en) * 2017-11-20 2018-03-16 浙江大学 Complicated structure casting latent defect detection method based on vertical co-ordination contour feature


Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008059513A (en) * 2006-09-04 2008-03-13 Hitachi Information & Communication Engineering Ltd Device and method for monitoring bottom face of vehicle
CN102565873A (en) * 2012-01-14 2012-07-11 张森 Safety check system for three-dimensional vehicle chassis
CN102941864A (en) * 2012-11-09 2013-02-27 武汉翔翼科技有限公司 Train loading state high-definition monitoring and overloading detection method

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
刘洪波 (Liu Hongbo): "双目视觉的立体匹配" (Stereo matching in binocular vision), 《计算机工程应用技术》 (Computer Engineering Application Technology) *


Also Published As

Publication number Publication date
CN103473761B (en) 2016-06-22

Similar Documents

Publication Publication Date Title
CN104063702B (en) Three-dimensional gait recognition based on shielding recovery and partial similarity matching
CN104200521B (en) High Resolution SAR Images building target three-dimensional rebuilding method based on model priori
CN109460709A (en) The method of RTG dysopia analyte detection based on the fusion of RGB and D information
CN105225482A (en) Based on vehicle detecting system and the method for binocular stereo vision
CN104700414A (en) Rapid distance-measuring method for pedestrian on road ahead on the basis of on-board binocular camera
CN101398886A (en) Rapid three-dimensional face identification method based on bi-eye passiveness stereo vision
CN103177236A (en) Method and device for detecting road regions and method and device for detecting separation lines
CN110288659B (en) Depth imaging and information acquisition method based on binocular vision
CN105005999A (en) Obstacle detection method for blind guiding instrument based on computer stereo vision
CN105975957B (en) A kind of road plane detection method based on lane line edge
Oniga et al. Curb detection based on a multi-frame persistence map for urban driving scenarios
CN104463099A (en) Multi-angle gait recognizing method based on semi-supervised coupling measurement of picture
CN106156752A (en) A kind of model recognizing method based on inverse projection three-view diagram
CN106446785A (en) Passable road detection method based on binocular vision
CN103903238A (en) Method for fusing significant structure and relevant structure of characteristics of image
CN106183995A (en) A kind of visual parking device method based on stereoscopic vision
CN103632376A (en) Method for suppressing partial occlusion of vehicles by aid of double-level frames
Kwak et al. Registration of aerial imagery and aerial LiDAR data using centroids of plane roof surfaces as control information
Hautière et al. Road scene analysis by stereovision: a robust and quasi-dense approach
CN105761507A (en) Vehicle counting method based on three-dimensional trajectory clustering
CN110675442A (en) Local stereo matching method and system combined with target identification technology
CN103810489A (en) LiDAR point cloud data overwater bridge extraction method based on irregular triangulated network
Ortigosa et al. Obstacle-free pathway detection by means of depth maps
CN103473761B (en) Automobile chassis three-dimensional elevation determination method based on binocular line array CCD
CN105716530A (en) Method for measuring geometric dimension of vehicle based on binocular stereoscopic vision

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20160622

Termination date: 20200816