CN101029824A - Method and apparatus for positioning vehicle based on characteristics - Google Patents

Method and apparatus for positioning vehicle based on characteristics

Info

Publication number
CN101029824A
Authority
CN
China
Prior art keywords
mentioned
vehicle
symmetry
right edges
axis
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN 200610055053
Other languages
Chinese (zh)
Other versions
CN101029824B (en)
Inventor
刘威
郑烨
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Neusoft Corp
Alpine Electronics Inc
Original Assignee
Neusoft Corp
Alpine Electronics Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Neusoft Corp, Alpine Electronics Inc
Priority to CN2006100550539A
Priority to JP2007043721A
Publication of CN101029824A
Application granted
Publication of CN101029824B
Expired - Fee Related

Landscapes

  • Image Processing (AREA)
  • Image Analysis (AREA)
  • Traffic Control Systems (AREA)
  • Closed-Circuit Television Systems (AREA)

Abstract

A vehicle positioning method based on vehicle characteristics includes: calculating one or more symmetry axes of the vehicle along the height direction; calculating one or more pairs of candidate left and right edges of the vehicle; calculating the top and bottom edges of the vehicle by edge projection; calculating the bottom edge of the vehicle from gray levels; calculating the top edge of the vehicle from color; and jointly using the various determined edges to establish the position of the vehicle in the image.

Description

Vehicle positioning method and device based on vehicle characteristics
Technical field
The present invention relates to a vehicle positioning method and device for a machine-vision-based vehicle recognition system, and more particularly to a vehicle positioning method and device that accurately determine the position of a vehicle within an ROI (region of interest), i.e. an image region that may contain a potential vehicle, based on vehicle characteristics extracted from that ROI.
Background art
In a machine-vision-based vehicle recognition system, the original image captured by a camera (such as a video camera) mounted on the host vehicle (hereinafter simply "ego vehicle") or on another moving or fixed object is processed, for example by segmentation, to obtain image regions that may contain potential vehicles, the so-called ROIs (regions of interest). Accurately locating the vehicle within the ROI helps to improve the vehicle recognition rate of the system. In particular, accurately locating the bottom of the vehicle (the position where the wheels meet the road surface) and the width of the vehicle is essential for obtaining the relative position and relative distance of surrounding vehicles with respect to the ego vehicle. Accurate vehicle positioning is also very important for vehicle tracking, inter-vehicle distance calculation, vehicle verification, and vehicle color recognition.
At present, vehicle positioning mainly exploits the horizontal edges, vertical edges, and symmetry of the vehicle. Fig. 1 shows the main steps of positioning a vehicle in the prior art. As shown in Fig. 1, determining the position of a vehicle within an ROI mainly comprises the following steps: (1) calculating the symmetry axis along the vehicle height direction; (2) determining the left, right, top, and bottom edges of the vehicle.
First, method (1), calculating the vehicle symmetry axis, is described.
In the past, only one kind of symmetry axis was generally calculated when positioning a vehicle, namely either a contour symmetry axis or a gray-level symmetry axis. Contour symmetry is not easily affected by illumination, but it is easily disturbed by background such as the road surface or buildings, and a background with strong symmetry of its own affects the result especially severely. Moreover, when part of the vehicle is occluded, the contour symmetry is destroyed. As shown in Fig. 2A, because a building in the background exhibits more pronounced symmetry than the vehicle, the symmetry axis of the building is mistakenly taken as the symmetry axis of the vehicle. Gray-level symmetry is not easily affected by the background, but it is easily affected by illumination. Therefore, calculating only one kind of symmetry axis is inevitably susceptible to the background, the illumination conditions, or an incomplete vehicle image (the vehicle is occluded or part of the body lies outside the image), so the symmetry axis is computed inaccurately and the positioning precision suffers.
The prior art also discloses methods that calculate several symmetry axes and obtain the final symmetry axis mainly by combining the individual results. However, the symmetries used are still contour symmetry, gray-level symmetry, horizontal-edge symmetry, and vertical-edge symmetry, and horizontal-edge and vertical-edge symmetry are in principle variants of contour symmetry; the influence of the background, the illumination conditions, and an incomplete vehicle image therefore still cannot be avoided.
In addition, in the prior art the image region used for computing the symmetry axis is the entire ROI. This causes the following problems: the computational load is large, the result is easily affected by parts of the background, and the symmetry axis is computed inaccurately.
Next, method (2) of the prior art, determining the left, right, top, and bottom edges of the vehicle, is described.
The prior art generally determines the left and right edges of the vehicle by vertical-edge projection and the top and bottom edges by horizontal-edge projection. "Vertical-edge projection" means extracting the vertical edges of the image with an edge detection operator and projecting them vertically, i.e. summing all non-zero pixels of the vertical-edge image column by column; the columns with the largest projection values correspond to the left and right edges of the vehicle. "Horizontal-edge projection" is analogous: the horizontal edges of the image are extracted with an edge detection operator and projected horizontally, i.e. all non-zero pixels of the horizontal-edge image are summed row by row, and the rows with the largest projection values correspond to the top and bottom edges of the vehicle.
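As a rough illustration of this projection idea, the following Python sketch (OpenCV and NumPy, with a hypothetical gray-scale `roi` image) extracts Sobel edges, binarizes them, and takes the column and row sums as the projection profiles; the strongest columns and rows would serve as edge candidates. The threshold and operator choice are assumptions for the example, not values fixed by the patent.

```python
import cv2
import numpy as np

def edge_projections(roi_gray, thresh=60):
    """Return vertical (per-column) and horizontal (per-row) edge projections
    of a gray-scale ROI, as used for candidate left/right and top/bottom edges."""
    # Vertical edges respond to horizontal intensity changes (dx), and vice versa.
    v_edges = np.abs(cv2.Sobel(roi_gray, cv2.CV_32F, 1, 0, ksize=3))
    h_edges = np.abs(cv2.Sobel(roi_gray, cv2.CV_32F, 0, 1, ksize=3))

    v_bin = (v_edges > thresh).astype(np.uint8)
    h_bin = (h_edges > thresh).astype(np.uint8)

    col_proj = v_bin.sum(axis=0)   # one value per column -> left/right candidates
    row_proj = h_bin.sum(axis=1)   # one value per row    -> top/bottom candidates
    return col_proj, row_proj

# Example usage (naively taking the two strongest columns/rows as edges):
# roi = cv2.imread("roi.png", cv2.IMREAD_GRAYSCALE)
# col_proj, row_proj = edge_projections(roi)
# left, right = sorted(np.argsort(col_proj)[-2:])
# top, bottom = sorted(np.argsort(row_proj)[-2:])
```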
Because the background in the ROI may also contain vertical and horizontal edges, their presence inevitably affects the determination of the left, right, top, and bottom edges of the vehicle. Furthermore, under different illumination conditions (cloudy or sunny weather, etc.), or when part of the vehicle is occluded or lies outside the image, some vehicle edges may be missing; this also affects the determination of the left, right, top, and bottom edges and ultimately degrades the positioning precision. For example, Fig. 2B schematically shows a case in which the left and right edges of the vehicle are computed incorrectly because of the vertical edges of a building in the background. Fig. 2C schematically shows a case in which the top edge of the vehicle is computed incorrectly because of the horizontal edges of a guide sign in the background.
In the prior art (Japanese Laid-Open Patent Publication No. H7-334799), a method has also been proposed that uses the under-vehicle shadow to determine the bottom edge of the vehicle. The concrete procedure is: first find the under-vehicle shadow, then determine the boundary formed by the shadow and the road surface, and finally take this boundary as the bottom edge of the vehicle. However, depending on the position of the sun (or another light source), the camera mounting height, and the road gradient, the under-vehicle shadow sometimes cannot be detected in the image, for example when light leaks under the vehicle or in backlighting; in that case the bottom edge of the vehicle cannot be determined. For example, Fig. 2D schematically shows a case in which the under-vehicle shadow cannot be detected because of light leaking under the vehicle or backlighting.
Summary of the invention
The present invention has been made in view of the above problems of the prior art. Its object is to provide a vehicle positioning method and device that accurately determine the left and right edges of a vehicle based on vehicle characteristics extracted from an ROI, i.e. an image region that may contain a potential vehicle, and thereby accurately determine the position of the vehicle in the ROI.
Another object of the present invention is to accurately determine the bottom edge of the vehicle and thereby accurately determine the position of the vehicle in the above ROI.
A further object of the present invention is to accurately determine the position of the vehicle in the above ROI by jointly using the symmetry, the horizontal and vertical edges, the under-vehicle shadow, and the body color (or gray-level) characteristics of the vehicle.
To achieve these objects, the vehicle positioning method of the present invention determines the position of a vehicle within an ROI, i.e. an image region that may contain the vehicle, and is characterized by the following steps: a left/right edge determination step of calculating one or more pairs of candidate left and right edges of the vehicle using one or more vehicle characteristics extracted from predetermined areas of the ROI, and determining the left and right edges of the vehicle from the candidate left and right edges, wherein the vehicle characteristics include at least an under-vehicle shadow characteristic and/or a body color characteristic; and a positioning step of determining the position of the vehicle in the ROI using the determined edges.
In the described vehicle positioning method, the following features may also be provided. In the left/right edge determination step, when only one pair of candidate left and right edges is calculated, or when several calculated pairs are identical, the calculated candidate left and right edges are taken as the left and right edges of the vehicle; when several calculated pairs differ, the left and right edges of the vehicle are determined according to a predetermined left/right edge fusion rule. In the left/right edge determination step, when the left and right edges of the vehicle are calculated using the under-vehicle shadow characteristic, the candidate left and right edges are determined from the gray levels of the pixels in a predetermined under-vehicle-shadow area of the ROI. In the left/right edge determination step, based on the mean gray level of the pixels in the predetermined under-vehicle-shadow area of the ROI, the pixel columns in which the number of pixels whose gray level is below that mean is greater than or equal to a predetermined value are taken as the candidate left and right edges of the vehicle. In the left/right edge determination step, when the left and right edges of the vehicle are calculated from the body color characteristic, the left and right end points of the uniform-color horizontal line found in a predetermined body-color area of the ROI are taken as the candidate left and right edges of the vehicle. The left/right edge determination step may also include a step of calculating the left and right edges of the vehicle from the projection of the vertical edges extracted from the ROI.
In this way, the vehicle positioning method according to the invention can accurately determine the left and right edges of the vehicle and thereby accurately determine the position of the vehicle in the ROI.
The described vehicle positioning method may also include a top/bottom edge determination step of calculating and determining candidate top and bottom edges of the vehicle from the projection of the horizontal edges extracted from the ROI. In the top/bottom edge determination step, a candidate bottom edge of the vehicle is also determined from the gray levels of the pixels in the predetermined under-vehicle-shadow area of the ROI; when this bottom edge differs from the bottom edge obtained from the horizontal-edge projection, the bottom edge of the vehicle is determined according to a predetermined bottom-edge fusion rule. In the bottom edge determination step, based on the mean gray level of the pixels in the predetermined under-vehicle-shadow area of the ROI, the pixel row in which the number of pixels whose gray level is below that mean is greater than or equal to a predetermined value is taken as the candidate bottom edge of the vehicle.
In this way, the vehicle positioning method according to the invention can accurately determine the bottom edge of the vehicle and thereby accurately determine the position of the vehicle in the ROI.
In the described vehicle positioning method, in the top/bottom edge determination step, a candidate top edge of the vehicle is also calculated from the horizontal layered change of the body color; when this top edge differs from the top edge obtained from the horizontal-edge projection, the top edge of the vehicle is determined according to a predetermined top-edge fusion rule. In the top edge determination step, when the top edge of the vehicle is calculated from the horizontal layered change of the body color, the candidate top edge of the vehicle is determined from the gray-level or color difference between pixel rows in a predetermined body area of the ROI.
The described vehicle positioning method may also include a symmetry axis determination step of calculating one or more candidate symmetry axes along the height direction of the vehicle from one or more symmetry features extracted from the ROI, and determining the symmetry axis of the vehicle from the candidate symmetry axes. In the symmetry axis determination step, when only one candidate symmetry axis is calculated, or when several calculated candidate axes are identical, the calculated candidate axis is taken as the symmetry axis of the vehicle; when several calculated candidate axes differ, the symmetry axis of the vehicle is determined according to a predetermined symmetry axis fusion rule. The symmetry features include at least one of the contour symmetry, gray-level symmetry, and S-component symmetry extracted from the ROI. The image region used for determining the contour symmetry axis is the entire ROI, while the image region used for determining the gray-level and S-component symmetry axes is a region of the ROI whose width is the width of the under-vehicle shadow and whose height is proportional to that width.
In this way, the vehicle positioning method of the present invention can jointly use the symmetry, the horizontal and vertical edges, the under-vehicle shadow, and the body color (or gray-level) characteristics of the vehicle to accurately determine the position of the vehicle in the ROI.
To achieve these objects, the vehicle positioning device of the present invention determines the position of a vehicle within an ROI that may contain the vehicle, and is characterized by the following units: a left/right edge determination unit that calculates one or more pairs of candidate left and right edges of the vehicle using one or more vehicle characteristics extracted from different areas of the ROI and determines the left and right edges of the vehicle from the candidate left and right edges, wherein the vehicle characteristics include at least an under-vehicle shadow characteristic and/or a body color characteristic; and a positioning unit that determines the position of the vehicle in the ROI using the determined edges.
The described vehicle positioning device corresponds to the above vehicle positioning method.
The present invention has the following effects:
With the vehicle positioning method and device according to the invention, even when the computed symmetry axis is inaccurate or affected by illumination, background and the like, the position of the vehicle in the ROI can still be determined accurately. In addition, cases where the vehicle image in the ROI is incomplete, or where the vehicle front is tilted because the vehicle is turning, can also be handled.
Description of drawings
Fig. 1 is a flowchart showing the main steps of positioning a vehicle in the prior art.
Fig. 2A shows a case in the prior art in which, when the vehicle symmetry axis is calculated from contour symmetry, the symmetry axis of a building in the background is mistakenly computed as the symmetry axis of the vehicle because the building has stronger symmetry than the vehicle.
Fig. 2B shows a case in the prior art in which, when the left and right edges of the vehicle are calculated by vertical-edge projection, the left and right edges of a building are mistakenly computed as the left and right edges of the vehicle because of the influence of the building's vertical edges in the background.
Fig. 2C shows a case in the prior art in which, when the top and bottom edges of the vehicle are calculated by horizontal-edge projection, the top edge of the vehicle is computed incorrectly because of the horizontal edges of a signal light in the background.
Fig. 2D shows a case in which the under-vehicle shadow cannot be detected because of light leaking under the vehicle or backlighting.
Fig. 3 is a flowchart showing the main steps of positioning a vehicle according to the present invention.
Fig. 4 is a flowchart showing the fusion of three candidate symmetry axes in Embodiment 1 of the present invention.
Fig. 5A is a flowchart of calculating the left and right edges of the vehicle from the body color.
Fig. 5B is a schematic diagram of the small-window setting used to determine the left and right edges of the vehicle from the body color.
Fig. 5C is a schematic diagram of determining the left and right edges of the vehicle from the body color.
Fig. 6 is a flowchart showing how the present invention fuses three pairs of candidate left and right edges to determine the left and right edges of the vehicle.
Fig. 7 is a schematic diagram showing the horizontal layered change of the vehicle color.
Fig. 8 is a block diagram of the main components of the vehicle positioning device according to Embodiment 2 of the present invention.
Embodiments
Hereinafter, embodiments of the present invention are described with reference to the accompanying drawings.
Embodiment 1
Hereinafter, the vehicle positioning method of Embodiment 1 of the present invention is described with reference to Figs. 3 to 7.
Fig. 3 is a flowchart showing the main steps of positioning a vehicle in the vehicle positioning method of Embodiment 1 of the present invention. These main steps are described in detail below with reference to Fig. 3.
<Calculating and determining the symmetry axis along the vehicle height direction>
In step S1, the contour symmetry, the gray-level symmetry, and the S-component symmetry of the HSV color space are used to calculate three symmetry axes of the vehicle along the height direction, i.e. in the vertical direction: the contour symmetry axis, the gray-level symmetry axis, and the S-component symmetry axis, which serve as candidate symmetry axes of the vehicle.
Because calculating the contour symmetry axis from contour symmetry and the gray-level symmetry axis from gray-level symmetry are techniques well known in the art, their description is omitted here; the calculation of the S-component symmetry axis is described in detail below.
In general, the HSV color space represents a color by the three elements hue (H), saturation (S), and value (V), and is a color space well suited to human visual perception. The S component, i.e. the saturation component, is related to material properties.
Through their studies, the inventors found that the S-component symmetry axis calculated from the S component is not easily affected by the background, the illumination conditions, and so on, so the symmetry axis of the vehicle can be determined more accurately under the special circumstances mentioned above.
When the S-component symmetry axis is calculated from the S component, an image region of the ROI suited to this calculation is first set. This region is the part of the ROI whose width is the width of the under-vehicle shadow and whose height is proportional to that width. For example, the base of the region is set to the row Y_bc of the bottom edge of the under-vehicle shadow, the left and right boundaries of the region are set to the left and right boundaries X_l and X_r of the under-vehicle shadow, the width of the under-vehicle shadow is W = X_r - X_l, and the height of the region is H = 0.9W. The calculation of the row Y_bc is detailed in step S7, and the calculation of X_l and X_r is detailed in step S4.
Then, within this set image region of the ROI, the sum S(j) of the absolute differences of the S component between the left and right sides of each pixel column over the range Δx is calculated according to formula (1); the pixel column with the smallest difference sum is the S-component symmetry axis calculated from the S component.
S(j) = Σ_{i=Y_bc}^{Y_bc+H} Σ_{Δx=1}^{W/2} | P(j+Δx, i) - P(j-Δx, i) |    (1)
where P(x, y) is the S-component value of the image at pixel (x, y).
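A minimal Python sketch of this S-component symmetry search follows (OpenCV/NumPy assumed; the shadow bounds X_l, X_r, Y_bc are taken as given). It mirrors formula (1): for each candidate column j it sums the absolute S-component differences between mirrored columns and keeps the column with the smallest sum. The vertical extent is taken to run upward from the shadow base under the usual image-coordinate convention, which is an assumption of the sketch.

```python
import cv2
import numpy as np

def s_component_symmetry_axis(roi_bgr, x_l, x_r, y_bc):
    """Column within [x_l, x_r] minimizing the mirrored S-component difference
    of formula (1). Coordinates are assumed to be valid for the ROI."""
    s = cv2.cvtColor(roi_bgr, cv2.COLOR_BGR2HSV)[:, :, 1].astype(np.float32)

    w = x_r - x_l                     # shadow width
    h = int(0.9 * w)                  # region height, proportional to the width
    y_top = max(0, y_bc - h)          # region extends upward from the shadow base
    half = w // 2

    best_j, best_score = None, np.inf
    for j in range(x_l + half, x_r - half + 1):
        rows = s[y_top:y_bc + 1]
        left = rows[:, j - half:j][:, ::-1]    # columns j-1, j-2, ..., j-half
        right = rows[:, j + 1:j + half + 1]    # columns j+1, j+2, ..., j+half
        score = np.abs(right - left).sum()
        if score < best_score:
            best_score, best_j = score, j
    return best_j
```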
The above is only a preferred way of calculating the S-component symmetry axis; the method of the present invention for calculating the S-component symmetry axis is not limited to it and may be varied as the case requires.
Then, in step S2, the three candidate symmetry axes calculated in step S1 — the contour symmetry axis, the gray-level symmetry axis, and the S-component symmetry axis — are fused according to a predetermined symmetry axis fusion rule to determine the symmetry axis along the vehicle height direction.
The concrete method of fusing and determining the symmetry axis can be predetermined as required. An example of a symmetry axis fusion rule is described below with reference to Fig. 4, but the method of the present invention for fusing several candidate symmetry axes is not limited to it.
Fig. 4 is a flowchart showing how the present invention fuses three candidate symmetry axes to determine the symmetry axis along the height direction of the vehicle.
When the three candidate axes are identical (i.e. fully coincident, x_1 = x_2 = x_3) or approximately identical (the distance between any two of them is less than or equal to a predetermined value), the middle one of the three candidate axes is taken as the symmetry axis of the vehicle.
When two of the three candidate axes are identical (coincident, e.g. x_1 = x_2) or approximately identical (the distance between these two candidates is less than or equal to a predetermined value, e.g. |x_1 - x_2| ≤ Δ, where Δ denotes the predetermined value), the center of these two candidate axes is taken as the symmetry axis of the vehicle.
When all three candidate axes differ, i.e. the distance between any two candidate axes is greater than the predetermined value, the contour symmetry axis is taken as the symmetry axis of the vehicle.
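The rule above can be condensed into a few lines of Python; this is only an illustrative sketch of the example fusion rule, with `delta` standing for the predetermined distance threshold.

```python
def fuse_symmetry_axes(x_contour, x_gray, x_s, delta):
    """Fuse three candidate symmetry axes (column indices) into one."""
    xs = sorted([x_contour, x_gray, x_s])
    # All three (approximately) coincide: take the middle one.
    if xs[2] - xs[0] <= delta:
        return xs[1]
    # Exactly two (approximately) coincide: take their midpoint.
    for a, b in ((x_contour, x_gray), (x_contour, x_s), (x_gray, x_s)):
        if abs(a - b) <= delta:
            return (a + b) / 2.0
    # All three differ: fall back to the contour symmetry axis.
    return x_contour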
As described above, candidate symmetry axes — in particular the S-component symmetry axis — are calculated first and then fused by a predetermined fusion rule to determine the symmetry axis of the vehicle. The result is thus less susceptible to the background, the illumination conditions, and so on, and the accuracy of determining the vehicle symmetry axis is improved.
In addition, in the prior art the region used to calculate the different candidate symmetry axes is generally the entire ROI. Because the entire ROI may contain part of the background, calculating the symmetry axis over the whole ROI easily leads to inaccurate results. In the present invention, different ROI regions are used for the different symmetry axes as appropriate. For example, the image region used to calculate the contour symmetry axis is the entire ROI, whereas the gray-level and S-component symmetry axes are calculated within a region of the ROI whose width is the width of the under-vehicle shadow and whose height is proportional to that width; this both reduces the influence of the background in the ROI on the symmetry axis and reduces the computational load.
Further, although an example has been given here in which three candidate symmetry axes are calculated and the final symmetry axis of the vehicle is then determined, the invention is not limited to this and may be varied according to circumstances; for example, only one symmetry axis may be calculated, or more symmetry axes may be calculated, and a suitable symmetry axis fusion rule may be chosen as required.
<Calculating and determining the left and right edges of the vehicle>
In step S3, the left and right edges of the vehicle, i.e. the vertical edges along the vehicle height direction, are calculated by the vertical-edge projection algorithm. First, vertical edges are extracted in the ROI with an edge detection operator such as Sobel. Then, based on a symmetry constraint, pixels that are not symmetric about the symmetry axis are filtered out, because they are likely noise pixels that do not belong to the vehicle contour. From the filtered vertical-edge image, the left and right edges of the vehicle are computed by vertical projection and used as candidate left and right edges.
Then, in step S4, the left and right edges of the vehicle are calculated from the under-vehicle shadow and from the body color characteristic, respectively, as further candidate left and right edges. Although left and right edges have already been calculated from the vertical-edge projection in step S3, in practice, because of the complexity of the environment and the background, determining the left and right edges of the vehicle with the vertical-edge projection algorithm still has the following problems. The vertical-edge projection algorithm relies on the symmetry axis along the vehicle height direction, so if the determined symmetry axis is inaccurate, the left and right edges computed from the vertical projection are also inaccurate; moreover, even when the symmetry axis is accurate, the vertical-edge projection algorithm is easily disturbed by vertical edges in the background, so the left and right edges of the vehicle sometimes cannot be determined accurately.
The present invention therefore proposes methods of calculating candidate left and right edges of the vehicle from the left and right boundaries of the under-vehicle shadow and from the body color characteristic, respectively.
The method of calculating the left and right edges of the vehicle from the left and right boundaries of the under-vehicle shadow is described first.
The "under-vehicle shadow" is the shadow cast at the bottom of the vehicle. Its left and right edges are taken as the left and right edges of the vehicle. The under-vehicle shadow is usually a more stable vehicle characteristic than the vehicle's vertical edges, so the left and right edges calculated from it are more stable and accurate.
First, the image region of the ROI used to calculate the under-vehicle shadow, the so-called "predetermined under-vehicle-shadow area", is set. The calculation could be done over the whole ROI, but choosing a suitable range reduces the computational load and makes the calculation more accurate. For example, the lower half of the ROI is set as the image region used when calculating the left and right edges of the vehicle. Then the mean gray level in this region is calculated, and the pixels in the region whose gray level is below this mean are flagged.
Next, the left and right edges of the under-vehicle shadow are calculated. First, the column in the above region with the largest number of flagged pixels is found, and that number is used as the threshold T_c for searching for the left and right edges of the under-vehicle shadow. With C_c denoting the number of flagged pixels in a column, the set region is searched column by column from left to right; the first column satisfying C_c ≥ T_c·α is the left edge of the under-vehicle shadow, denoted X_l, where α is a constant (0 < α < 1) that can be set appropriately from experience. The right edge of the under-vehicle shadow is determined in the same way as the left edge and is denoted X_r.
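A compact Python sketch of this shadow-based left/right edge search follows, assuming the lower half of the ROI as the predetermined shadow area and an illustrative α; the threshold and region choices are assumptions of the sketch, not values fixed by the patent.

```python
import numpy as np

def shadow_left_right_edges(roi_gray, alpha=0.5):
    """Candidate left/right vehicle edges from the under-vehicle shadow,
    computed on the lower half of a gray-scale ROI (NumPy array)."""
    region = roi_gray[roi_gray.shape[0] // 2:, :].astype(np.float32)
    dark = region < region.mean()            # flag pixels darker than the mean
    col_counts = dark.sum(axis=0)            # flagged pixels per column
    t_c = col_counts.max()                   # threshold from the densest column

    cols = np.where(col_counts >= alpha * t_c)[0]
    if cols.size == 0:
        return None, None                    # shadow not detected (e.g. backlight)
    x_l, x_r = int(cols[0]), int(cols[-1])   # first qualifying column from each side
    return x_l, x_r
```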
The method of calculating the left and right edges of the vehicle from the under-vehicle shadow has been described above. Next, the method of calculating the left and right edges of the vehicle from the body color is described with reference to Fig. 5A. Fig. 5A is a flowchart of calculating the left and right edges of the vehicle from the body color.
"Determining the left and right edges of the vehicle from the body color characteristic" means, in short, finding in the ROI the horizontal left and right end points of the longest horizontal run of uniform color on the vehicle body.
As shown in Fig. 5A, first, in step 420, the image region of the ROI used for the body color calculation, the so-called "predetermined body-color area", is set. The calculation could be done over the whole ROI, but choosing a suitable range reduces the computational load and makes the calculation more accurate. The base of this region is set to the pixel row Y_tc of the top edge of the under-vehicle shadow, its left and right boundaries are set to the left and right boundaries X_l and X_r of the under-vehicle shadow, and its height is set to H = 0.6W, where W = X_r - X_l is the width of the under-vehicle shadow. The calculation of the row Y_tc is detailed in step S7, and the calculation of X_l and X_r is detailed in step S4.
Then, in step 421, a small window of predetermined size used to search for horizontal runs of the same color is set in the above image region, namely in the area to the left of the vehicle symmetry axis within the predetermined body-color area (the window setting is shown in Fig. 5B). In this window, the mean gray level of each pixel column is calculated and the means of adjacent columns are compared; if all the differences are below a preset threshold, the two columns are considered to have the same color. The longest run of consecutive columns of identical color in the window is then found; this run represents the uniform-color horizontal line found in the window, and its first and last columns are its left and right end points. By traversing the image region with this window (moving the window from bottom to top), one longest horizontal line is obtained for each window position, and the longest of all of them is the longest horizontal line found in the area to the left of the symmetry axis. The longest horizontal line on the right side of the symmetry axis is found in the same way as on the left.
Finally, in step 422, the left end point of the longest horizontal line on the left side of the symmetry axis is taken as the left edge of the vehicle, and the right end point of the longest horizontal line on the right side of the symmetry axis is taken as the right edge of the vehicle. Fig. 5C schematically shows the determination of the left and right edges of the vehicle from the body color.
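The following Python sketch illustrates one way to look for the longest uniform-color horizontal run inside a single small window, as an illustration of step 421; the window size, the color threshold, and the surrounding traversal loop are assumptions made for the example.

```python
import numpy as np

def longest_uniform_run(window_gray, color_thresh=8.0):
    """Longest run of adjacent columns with (nearly) equal mean gray level.
    Returns (start_col, end_col, length) within the window."""
    col_means = window_gray.astype(np.float32).mean(axis=0)
    best = (0, 0, 1)
    start = 0
    for c in range(1, col_means.size):
        if abs(col_means[c] - col_means[c - 1]) >= color_thresh:
            start = c                          # color changes: start a new run
        length = c - start + 1
        if length > best[2]:
            best = (start, c, length)
    return best

# Sliding the window bottom-up over the left (and then the right) half of the
# predetermined body-color area and keeping the overall longest run gives the
# left (right) end point used as the candidate vehicle edge.
```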
The methods of calculating the left and right edges of the vehicle from the under-vehicle shadow and from the body color characteristic have been described above. Although the described body-color method uses the vehicle symmetry axis as an aid, the symmetry axis need not be used: instead of splitting the region into left and right parts about the symmetry axis, the horizontal left and right boundaries of the longest uniform-color run on the body can be computed over the entire image region.
Returning to Fig. 3, the method of finally determining the left and right edges of the vehicle from the three calculated pairs of candidate left and right edges according to a predetermined left/right edge fusion rule is described next.
In step S5, the three pairs of candidate left and right edges calculated in steps S3 and S4 are fused according to a predetermined left/right edge fusion rule to obtain the final left and right edges of the vehicle. An example of a left/right edge fusion rule is described below with reference to Fig. 6, but the left/right edge fusion rule of the present invention is not limited to it and may be varied according to circumstances.
Fig. 6 is a flowchart showing how the present invention fuses three pairs of candidate left and right edges to determine the left and right edges of the vehicle. The flow of the left/right edge fusion rule shown in Fig. 6 is similar to that of the symmetry axis fusion rule shown in Fig. 4, so its detailed description is omitted here.
<Calculating and determining the top and bottom edges of the vehicle>
In step S6, candidate top and bottom edges of the vehicle are calculated by horizontal-edge projection. The concrete method is as follows: after the left and right edges of the vehicle have been determined, the vertical strip region between them is obtained. In this strip, similarly to locating the left and right edges, the horizontal edges of the image are extracted in the ROI with an edge detection operator such as a horizontal Sobel operator, and the candidate top and bottom edges of the vehicle are calculated by the horizontal projection method of the prior art.
Then, in step S7, the bottom edge of the vehicle is calculated from the under-vehicle shadow: the bottom edge of the shadow is taken as the bottom edge of the vehicle. To determine the bottom edge of the under-vehicle shadow, first the row in the predetermined area with the largest number of flagged pixels is found, and that number is used as the threshold T_r for searching for the bottom edge of the shadow. With C_r denoting the number of flagged pixels in a row, the set region is searched row by row from bottom to top; the first row satisfying C_r ≥ T_r·α_1 is the bottom edge of the under-vehicle shadow, denoted Y_bc, where α_1 is a constant (0 < α_1 < 1) that can be set appropriately according to circumstances.
Here, the method of calculating the top edge of the under-vehicle shadow is briefly described. Starting from the row with the largest number of flagged pixels, the search proceeds upward; the first row satisfying C_r ≤ T_r·β is the top edge of the under-vehicle shadow, denoted Y_tc, where β is a constant (0 < β < 1) that can be set appropriately from experience. Then, in step S8, a candidate top edge of the vehicle is calculated from the horizontal layered change of the body color. When there is a small billboard, a guide sign, or other background above the vehicle, its horizontal edges are easily mistaken for the top edge of the vehicle. To solve this problem, the present invention exploits the fact that a vehicle has a pronounced horizontal structure and, in particular, that the color changes across these horizontal layers, and adjusts the top edge accordingly. A schematic diagram of the horizontal layered change of the vehicle color is shown in Fig. 7.
The method of calculating a candidate top edge of the vehicle from the horizontal layered change of the body color is briefly described below.
First, the image region of the ROI used to calculate the horizontal layered change of the body color is set. The calculation could be done over the whole ROI, but choosing a suitable range reduces the computational load and makes the calculation more accurate. The base of this region is the row Y_tc of the top edge of the under-vehicle shadow, its left and right boundaries are the left and right vehicle edges V_l and V_r determined in step S5, and its top edge is I_t = Y_tc - V_w, where V_w = V_r - V_l is the vehicle width. Then the mean gray level of every pixel row in this region is calculated.
Finally, the rows where the mean gray level changes abruptly are found as follows. Within the set image region, from top to bottom, the mean gray levels of row i and row i+Δ are compared. If the difference of the two means is greater than a preset threshold T_1, the two rows are compared pixel by pixel and the number of pixels whose gray-level difference exceeds the threshold T_1 is counted; if this count is greater than a preset threshold T_2, row i is taken as a candidate abrupt-change row, and if there is no similar abrupt-change row within a certain interval above and below row i, row i is confirmed as a gray-level abrupt-change row. The topmost abrupt-change row in the image region is taken as the top edge of the vehicle.
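A simplified Python sketch of this abrupt-change search follows; the row step `delta`, the thresholds `t1` and `t2`, and the isolation interval are placeholders chosen only for illustration.

```python
import numpy as np

def color_layer_top_edge(region_gray, delta=3, t1=20.0, t2=30, interval=5):
    """Topmost row where the gray level changes abruptly between row i and i+delta.
    Returns the row index within the region, or None if no such row is found."""
    region = region_gray.astype(np.float32)
    rows = region.shape[0]
    candidates = []
    for i in range(rows - delta):
        if abs(region[i].mean() - region[i + delta].mean()) > t1:
            # count pixels whose individual difference also exceeds t1
            n_big = int((np.abs(region[i] - region[i + delta]) > t1).sum())
            if n_big > t2:
                candidates.append(i)
    # keep only rows with no other candidate within the isolation interval
    confirmed = [i for i in candidates
                 if not any(0 < abs(i - j) <= interval for j in candidates)]
    picked = confirmed if confirmed else candidates   # fallback is an assumption
    return picked[0] if picked else None
```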
The calculation of the candidate top and bottom edges of the vehicle from the horizontal-edge projection, the under-vehicle shadow, and the horizontal layered change of the body color has been described above. Returning to Fig. 3, the method of finally determining the top and bottom edges of the vehicle from these candidates is described next.
In step S9, the two calculated candidate bottom edges are fused according to a predetermined bottom-edge fusion rule to determine the final bottom edge of the vehicle. An example of a bottom-edge fusion rule is given below, but the bottom-edge fusion rule of the present invention is not limited to it:
(1) if the under-vehicle shadow is not detected, the bottom edge determined by the horizontal projection is taken as the bottom edge of the vehicle;
(2) if the bottom edge of the under-vehicle shadow lies below the vehicle's center of gravity, the bottom edge of the under-vehicle shadow is taken as the bottom edge of the vehicle; otherwise, the bottom edge determined by the horizontal projection is taken as the bottom edge of the vehicle.
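Expressed as code, the example rule reads as below; `center_of_gravity_row` stands for whatever vertical reference is used for the vehicle's center of gravity and is an assumption of the sketch, as is the image-coordinate convention that rows grow downward.

```python
def fuse_bottom_edge(shadow_bottom, projection_bottom, center_of_gravity_row):
    """Example bottom-edge fusion rule (row indices grow downward)."""
    if shadow_bottom is None:                     # under-vehicle shadow not detected
        return projection_bottom
    if shadow_bottom > center_of_gravity_row:     # shadow bottom lies below the center of gravity
        return shadow_bottom
    return projection_bottom
```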
Calculating the bottom edge according to the present invention has the following effect. When the bottom of the vehicle is relatively dark, an obvious horizontal bottom edge may not be found, or a wrong one may be found, so the horizontal projection method of step S6 alone may fail to determine the bottom of the vehicle or may determine a wrong bottom; conversely, the method of determining the bottom edge from the under-vehicle shadow fails when the shadow cannot be detected in the image. To solve these problems, the present invention proposes determining the bottom edge of the vehicle by combining the horizontal-edge projection with the under-vehicle shadow.
Then, in step S10, the two calculated candidate top edges are fused according to a predetermined top-edge fusion rule to determine the final top edge of the vehicle. An example of a top-edge fusion rule is given below, but the top-edge fusion rule of the present invention is not limited to it.
(1) when the top edge determined from the horizontal layered change of the body color is greater than or equal to the top edge determined by the horizontal-edge projection:
if the vehicle aspect-ratio requirement is satisfied, the top edge determined from the horizontal layered change of the body color is taken as the top edge of the vehicle;
if the vehicle aspect-ratio requirement is not satisfied, the top edge determined by the horizontal-edge projection is taken as the top edge of the vehicle;
(2) when the top edge determined by the horizontal-edge projection is greater than or equal to the top edge determined from the horizontal layered change of the body color:
if the vehicle aspect-ratio requirement is satisfied, the top edge determined by the horizontal-edge projection is taken as the top edge of the vehicle;
if the vehicle aspect-ratio requirement is not satisfied, the top edge determined from the horizontal layered change of the body color is taken as the top edge of the vehicle.
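Read together, the two cases simply keep the candidate named in the triggering case when it yields a plausible aspect ratio and otherwise fall back to the other candidate. The sketch below reflects that reading, which is an interpretation; the `aspect_ratio_ok` predicate is hypothetical and not defined by the patent.

```python
def fuse_top_edge(color_top, proj_top, bottom, width, aspect_ratio_ok):
    """Example top-edge fusion. aspect_ratio_ok(height, width) -> bool is a
    hypothetical predicate checking the vehicle height/width ratio."""
    # Case (1): the color-layer candidate is >= the projection candidate.
    if color_top >= proj_top:
        preferred, fallback = color_top, proj_top
    # Case (2): the projection candidate is >= the color-layer candidate.
    else:
        preferred, fallback = proj_top, color_top
    # Keep the preferred candidate only if it gives a plausible aspect ratio.
    if aspect_ratio_ok(bottom - preferred, width):
        return preferred
    return fallback
```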
<Determining the position of the vehicle>
In step S11, the position of the vehicle in the ROI is determined from the determined edges of the vehicle.
The flowchart of vehicle positioning shown in Fig. 3 is only an example; the vehicle positioning method of the present invention is not limited to it and may be varied according to circumstances. For example, the flow of Fig. 3 includes steps of determining the vehicle symmetry axis, the left and right edges, and the top and bottom edges, but the position of the vehicle in the ROI may also be located by other methods: for instance, the method of the present invention may be used to determine the left and right edges of the vehicle while the symmetry axis and/or the top and bottom edges are determined by prior-art methods, or only the left and right edges or only the top and bottom edges may be calculated to locate the vehicle in the ROI.
(embodiment 2)
Hereinafter, the vehicle positioning device of Embodiment 2 of the present invention is described with reference to Fig. 8.
Fig. 8 shows the main components of the vehicle positioning device 100 of Embodiment 2 of the present invention. The main components of the vehicle positioning device 100 are described in detail below with reference to Fig. 8.
As shown in Fig. 8, the vehicle positioning device 100 has a symmetry axis determination unit 101, a left/right edge determination unit 102, a top/bottom edge determination unit 103, and a positioning unit 104.
The configuration of the vehicle positioning device shown in Fig. 8 is only an example; the vehicle positioning device of the present invention is not limited to it and may be varied according to circumstances. For example, Fig. 8 includes a symmetry axis determination unit, a left/right edge determination unit, and a top/bottom edge determination unit, but the vehicle positioning device may also omit the symmetry axis determination unit and include only the left/right edge determination unit and/or the top/bottom edge determination unit.
The vehicle positioning device 100 of Embodiment 2 essentially corresponds to the vehicle positioning method of Embodiment 1. The device 100 is therefore described only briefly below, and explanations that would repeat Embodiment 1 are omitted; for example, units corresponding to the steps of the above vehicle positioning method, such as the various fusion units, are provided as required.
As shown in Fig. 8, an ROI input device 200 inputs the image of an ROI, i.e. an image region that may contain a vehicle, into the vehicle positioning device 100, and the vehicle positioning device 100 outputs the vehicle positioning result.
The symmetry axis determination unit 101 calculates one or more candidate symmetry axes along the height direction of the vehicle from one or more symmetry features extracted from the ROI, and determines the symmetry axis of the vehicle from the candidate symmetry axes. Preferably, the symmetry features include at least one of the contour symmetry, gray-level symmetry, and S-component symmetry extracted from the ROI. Also preferably, the image region used to determine the contour symmetry axis is the entire ROI, while the image region used to determine the gray-level and S-component symmetry axes is a region of the ROI whose width is the width of the under-vehicle shadow and whose height is proportional to that width.
The left/right edge determination unit 102 calculates one or more pairs of candidate left and right edges of the vehicle using one or more vehicle characteristics extracted from predetermined areas of the ROI, and determines the left and right edges of the vehicle from the candidate left and right edges. The vehicle characteristics include at least an under-vehicle shadow characteristic and/or a body color characteristic.
Further, when the left/right edge determination unit 102 calculates the left and right edges of the vehicle from the under-vehicle shadow characteristic, it determines the candidate left and right edges from the gray levels of the pixels in the predetermined under-vehicle-shadow area of the ROI. When the left/right edge determination unit 102 calculates the left and right edges of the vehicle from the body color characteristic, it determines the candidate left and right edges from the left and right end points of the uniform-color horizontal line found in the predetermined body-color area of the ROI. The left/right edge determination unit 102 may also include a unit that calculates the left and right edges of the vehicle from the projection of the vertical edges extracted from the ROI.
The top/bottom edge determination unit 103 calculates and determines candidate top and bottom edges of the vehicle from the projection of the horizontal edges extracted from the ROI. The top/bottom edge determination unit 103 also determines a candidate bottom edge of the vehicle from the gray levels of the pixels in the predetermined under-vehicle-shadow area of the ROI; when this bottom edge differs from the bottom edge obtained from the horizontal-edge projection, it determines the bottom edge of the vehicle according to a predetermined bottom-edge fusion rule. Based on the mean gray level of the pixels in the predetermined under-vehicle-shadow area of the ROI, the top/bottom edge determination unit 103 takes the pixel row in which the number of pixels whose gray level is below that mean is greater than or equal to a predetermined value as the candidate bottom edge of the vehicle.
The top/bottom edge determination unit 103 also calculates a candidate top edge of the vehicle from the horizontal layered change of the body color; when this top edge differs from the top edge obtained from the horizontal-edge projection, it determines the top edge of the vehicle according to a predetermined top-edge fusion rule. When calculating the top edge of the vehicle from the horizontal layered change of the body color, the top/bottom edge determination unit 103 determines the candidate top edge from the gray-level or color difference between pixel rows in the predetermined body area of the ROI.
The positioning unit 104 determines the position of the vehicle in the image using the determined edges.
The positioning method and device of the present invention can accurately determine the symmetry axis along the vehicle height direction. Because the under-vehicle shadow and the body color characteristic are combined, a correct pair of left and right vehicle edges can be obtained even when the symmetry axis is computed inaccurately. Finally, because the horizontal-edge projection of the vehicle, the under-vehicle shadow, and the horizontal layered color change of the body are combined, the top and bottom edges are located more accurately. In addition, the positioning method and device of the present invention are widely applicable, are not easily affected by illumination or background, and can handle cases where the vehicle image is incomplete or the vehicle front is tilted because the vehicle is turning.

Claims (30)

1. A vehicle positioning method that determines the position of a vehicle within an ROI, i.e. an image region that may contain the vehicle, characterized by comprising the following steps:
a left/right edge determination step of calculating one or more pairs of candidate left and right edges of the vehicle using one or more vehicle characteristics extracted from predetermined areas of the ROI, and determining the left and right edges of the vehicle from the candidate left and right edges, wherein the vehicle characteristics include at least an under-vehicle shadow characteristic and/or a body color characteristic;
a positioning step of determining the position of the vehicle in the ROI using the determined edges.
2. The vehicle positioning method according to claim 1, further comprising the following step:
in the left/right edge determination step, when only one pair of candidate left and right edges is calculated, or when several calculated pairs of candidate left and right edges are identical, taking the calculated candidate left and right edges as the left and right edges of the vehicle, and when several calculated pairs of candidate left and right edges differ, determining the left and right edges of the vehicle according to a predetermined left/right edge fusion rule.
3. The vehicle positioning method according to claim 2, wherein
in the left/right edge determination step, when the left and right edges of the vehicle are calculated using the under-vehicle shadow characteristic, the candidate left and right edges of the vehicle are determined from the gray levels of the pixels in a predetermined under-vehicle-shadow area of the ROI.
4. The vehicle positioning method according to claim 3, wherein
in the left/right edge determination step, based on the mean gray level of the pixels in the predetermined under-vehicle-shadow area of the ROI, the pixel columns in which the number of pixels whose gray level is below the mean is greater than or equal to a predetermined value are determined as the candidate left and right edges of the vehicle.
5. The vehicle positioning method according to claim 2, wherein
in the left/right edge determination step, when the left and right edges of the vehicle are calculated from the body color characteristic, the left and right end points of the uniform-color horizontal line found in a predetermined body-color area of the ROI are determined as the candidate left and right edges of the vehicle.
6. The vehicle positioning method according to any one of claims 1 to 5, wherein
the left/right edge determination step further comprises a step of calculating the left and right edges of the vehicle from the projection of the vertical edges extracted from the ROI.
7. The vehicle positioning method according to any one of claims 1 to 5, wherein
the method further comprises a top/bottom edge determination step of calculating and determining candidate top and bottom edges of the vehicle from the projection of the horizontal edges extracted from the ROI.
8. The vehicle positioning method according to claim 7, wherein
in the top/bottom edge determination step, a candidate bottom edge of the vehicle is also determined from the gray levels of the pixels in the predetermined under-vehicle-shadow area of the ROI, and when this bottom edge differs from the bottom edge calculated from the projection of the horizontal edges, the bottom edge of the vehicle is determined according to a predetermined bottom-edge fusion rule.
9. The vehicle positioning method according to claim 8, wherein
in the bottom edge determination step, based on the mean gray level of the pixels in the predetermined under-vehicle-shadow area of the ROI, the pixel row in which the number of pixels whose gray level is below the mean is greater than or equal to a predetermined value is determined as the candidate bottom edge of the vehicle.
10. The vehicle positioning method according to claim 7, wherein
in the top/bottom edge determination step, a candidate top edge of the vehicle is also calculated from the horizontal layered change of the body color, and when this top edge differs from the top edge calculated from the projection of the horizontal edges, the top edge of the vehicle is determined according to a predetermined top-edge fusion rule.
11. The vehicle positioning method according to claim 10, wherein
in the top edge determination step, when the top edge of the vehicle is calculated from the horizontal layered change of the body color, the candidate top edge of the vehicle is determined from the gray-level or color difference between pixel rows in a predetermined body area of the ROI.
12. The vehicle positioning method according to claim 2, wherein
the method further comprises a symmetry-axis determining step of calculating one or more candidate symmetry axes of the vehicle in the height direction according to one or more symmetry features extracted from the ROI, and determining the symmetry axis of the vehicle according to the candidate symmetry axes.
13. The vehicle positioning method according to claim 12, wherein
in the symmetry-axis determining step, when only one candidate symmetry axis is calculated, or when a plurality of calculated candidate symmetry axes are identical, the calculated candidate symmetry axis is determined as the symmetry axis of the vehicle; when the plurality of calculated candidate symmetry axes differ, the symmetry axis of the vehicle is determined according to a predetermined symmetry-axis fusion rule.
14. The vehicle positioning method according to claim 12, wherein
in the symmetry-axis determining step, the symmetry features comprise at least one of contour symmetry, gray-level symmetry and S-component symmetry extracted from the ROI.
15. The vehicle positioning method according to claim 14, wherein
the image region used to determine the contour symmetry axis is the whole ROI, and the image region used to determine the gray-level symmetry axis and the S-component symmetry axis is an image region within the ROI whose width is the width of the under-vehicle shadow and whose height is in a predetermined proportion to that width.
16. A vehicle positioning device which determines the position of a vehicle within an image region that may contain the vehicle, i.e. a ROI, characterized by comprising:
a left/right edge determining unit which calculates one or more pairs of candidate left and right edges of the vehicle using one or more vehicle features extracted from predetermined regions of the ROI, and determines the left and right edges of the vehicle according to the candidate left and right edges, wherein the vehicle features comprise at least an under-vehicle shadow feature and/or a vehicle-body color feature; and
a positioning unit which determines the position of the vehicle within the ROI using the determined edges.
17. The vehicle positioning device according to claim 16, wherein
in the left/right edge determining unit, when only one pair of candidate left and right edges is calculated, or when a plurality of calculated pairs of candidate left and right edges are identical, the calculated candidate left and right edges are determined as the left and right edges of the vehicle; when the plurality of calculated pairs of candidate left and right edges differ, the left and right edges of the vehicle are determined according to a predetermined left/right-edge fusion rule.
18. The vehicle positioning device according to claim 17, wherein
in the left/right edge determining unit, when the left and right edges of the vehicle are calculated using the under-vehicle shadow feature, the candidate left and right edges of the vehicle are determined according to the gray levels of the pixels in a predetermined under-vehicle shadow region of the ROI.
19. The vehicle positioning device according to claim 18, wherein
in the left/right edge determining unit, pixel columns in which the number of pixels whose gray value is below the mean gray level of the predetermined under-vehicle shadow region of the ROI is greater than or equal to a predetermined value are determined as the candidate left and right edges of the vehicle.
20. The vehicle positioning device according to claim 17, wherein
in the left/right edge determining unit, when the left and right edges of the vehicle are calculated according to the vehicle-body color feature, the left and right edges of the body-color horizontal lines appearing in a predetermined vehicle-body color region of the ROI are determined as the candidate left and right edges of the vehicle.
21. The vehicle positioning device according to any one of claims 16 to 20, wherein
the left/right edge determining unit further comprises a unit which calculates the left and right edges of the vehicle using the projection of the vertical edges extracted from the ROI.
22. The vehicle positioning device according to any one of claims 16 to 20, wherein
the device further comprises an upper/lower edge determining unit which calculates and determines candidate upper and lower edges of the vehicle using the projection of the horizontal edges extracted from the ROI.
23. The vehicle positioning device according to claim 22, wherein
in the upper/lower edge determining unit, a candidate lower edge of the vehicle is also determined according to the gray levels of the pixels in the predetermined under-vehicle shadow region of the ROI, and when this lower edge differs from the lower edge calculated from the projection of the horizontal edges, the lower edge of the vehicle is determined according to a predetermined lower-edge fusion rule.
24. The vehicle positioning device according to claim 23, wherein
in the upper/lower edge determining unit, pixel rows in which the number of pixels whose gray value is below the mean gray level of the predetermined under-vehicle shadow region of the ROI is greater than or equal to a predetermined value are determined as candidate lower edges of the vehicle.
25. The vehicle positioning device according to claim 22, wherein
in the upper/lower edge determining unit, a candidate upper edge of the vehicle is also calculated using the horizontal-slice change feature of the vehicle-body color, and when this upper edge differs from the upper edge calculated from the projection of the horizontal edges, the upper edge of the vehicle is determined according to a predetermined upper-edge fusion rule.
26. The vehicle positioning device according to claim 25, wherein
in the upper/lower edge determining unit, when the upper edge of the vehicle is calculated according to the horizontal-slice change feature of the vehicle-body color, the candidate upper edge of the vehicle is determined according to the gray difference or color difference between pixel rows in a predetermined vehicle-body region of the ROI.
27. The vehicle positioning device according to claim 16, wherein
the device further comprises a symmetry-axis determining unit which calculates one or more candidate symmetry axes of the vehicle in the height direction according to one or more symmetry features extracted from the ROI, and determines the symmetry axis of the vehicle according to the candidate symmetry axes.
28. The vehicle positioning device according to claim 27, wherein
in the symmetry-axis determining unit, when only one candidate symmetry axis is calculated, or when a plurality of calculated candidate symmetry axes are identical, the calculated candidate symmetry axis is determined as the symmetry axis of the vehicle; when the plurality of calculated candidate symmetry axes differ, the symmetry axis of the vehicle is determined according to a predetermined symmetry-axis fusion rule.
29. The vehicle positioning device according to claim 27, wherein
in the symmetry-axis determining unit, the symmetry features comprise at least one of contour symmetry, gray-level symmetry and S-component symmetry extracted from the ROI.
30. The vehicle positioning device according to claim 29, wherein
the image region used to determine the contour symmetry axis is the whole ROI, and the image region used to determine the gray-level symmetry axis and the S-component symmetry axis is an image region within the ROI whose width is the width of the under-vehicle shadow and whose height is in a predetermined proportion to that width.
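The column-counting test of claims 4 and 19 can be sketched in a few lines. The snippet below is a minimal Python/NumPy illustration of the described idea, not an implementation from the patent: the region, the threshold name and the return convention (here, simply the set of qualifying columns) are illustrative assumptions.

```python
import numpy as np

def shadow_edge_candidates(shadow_region: np.ndarray, min_count: int) -> np.ndarray:
    """Candidate left/right edge columns from the under-vehicle shadow region.

    shadow_region: 2-D uint8 gray image of the predetermined shadow area of the ROI.
    min_count: predetermined minimum number of dark pixels a column must contain.
    Returns the indices of columns whose dark-pixel count reaches the threshold.
    """
    mean_gray = shadow_region.mean()              # mean gray level of the shadow region
    dark = shadow_region < mean_gray              # pixels darker than the mean
    counts = dark.sum(axis=0)                     # dark-pixel count per column
    return np.nonzero(counts >= min_count)[0]     # columns that qualify as candidates
```

Applying the same counting along axis=1 (per row instead of per column) gives the candidate lower edge described in claims 9 and 24.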
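Claims 6, 7, 21 and 22 rely on projections of vertical and horizontal edge maps. The following hypothetical sketch shows only the projection step; the choice of edge detector and the peak-picking rule are assumptions, since the claims require only the projections themselves.

```python
import numpy as np

def edge_projections(vertical_edges: np.ndarray, horizontal_edges: np.ndarray):
    """Projections of binary edge maps of the ROI.

    vertical_edges / horizontal_edges: 2-D boolean maps, e.g. thresholded Sobel-x
    and Sobel-y responses.  Strong peaks in the column projection of the vertical
    edges suggest candidate left/right edges; strong peaks in the row projection
    of the horizontal edges suggest candidate upper/lower edges.
    """
    col_projection = vertical_edges.sum(axis=0)    # one value per column
    row_projection = horizontal_edges.sum(axis=1)  # one value per row
    return col_projection, row_projection
```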
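For the upper-edge test of claims 11 and 26, a minimal sketch follows. The top-down scan direction and the use of a mean absolute row difference are assumptions; the claims state only that the candidate upper edge is located from the gray or colour difference between pixel rows of the predetermined vehicle-body region.

```python
import numpy as np

def candidate_upper_edge(body_region: np.ndarray, diff_threshold: float):
    """Candidate upper vehicle edge from row-to-row gray change in the body region.

    body_region: 2-D gray image of the predetermined vehicle-body region of the
                 ROI, rows ordered from top to bottom.
    diff_threshold: predetermined minimum mean absolute difference between rows.
    Returns the first row index where the change exceeds the threshold, else None.
    """
    rows = body_region.astype(np.int32)
    for y in range(1, rows.shape[0]):
        if np.abs(rows[y] - rows[y - 1]).mean() >= diff_threshold:
            return y
    return None
```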
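The gray-level symmetry axis of claims 14/15 and 29/30 can be illustrated as a search for the column that minimizes the mirror difference of the region around it. This is only a sketch under assumed names and window handling; contour symmetry would run the same search on a binary edge map of the whole ROI, and S-component symmetry on the saturation channel of the sub-region sized from the under-vehicle shadow width.

```python
import numpy as np

def gray_symmetry_axis(region: np.ndarray, half_width: int) -> int:
    """Column index of the best vertical (height-direction) symmetry axis.

    region: 2-D gray image of the area searched for symmetry.
    half_width: number of columns compared on each side of a candidate axis.
    """
    _, w = region.shape
    best_col, best_score = half_width, np.inf
    for x in range(half_width, w - half_width):
        left = region[:, x - half_width:x].astype(np.int32)
        right = region[:, x + 1:x + 1 + half_width].astype(np.int32)[:, ::-1]
        score = np.abs(left - right).mean()        # mirror difference around column x
        if score < best_score:
            best_score, best_col = score, x
    return best_col
```

When the axes obtained from the different symmetry features disagree, claims 13 and 28 leave the final choice to a predetermined symmetry-axis fusion rule.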
CN2006100550539A 2006-02-28 2006-02-28 Method and apparatus for positioning vehicle based on characteristics Expired - Fee Related CN101029824B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN2006100550539A CN101029824B (en) 2006-02-28 2006-02-28 Method and apparatus for positioning vehicle based on characteristics
JP2007043721A JP4942509B2 (en) 2006-02-28 2007-02-23 Vehicle position detection method and apparatus

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN2006100550539A CN101029824B (en) 2006-02-28 2006-02-28 Method and apparatus for positioning vehicle based on characteristics

Publications (2)

Publication Number Publication Date
CN101029824A true CN101029824A (en) 2007-09-05
CN101029824B CN101029824B (en) 2011-10-26

Family

ID=38556003

Family Applications (1)

Application Number Title Priority Date Filing Date
CN2006100550539A Expired - Fee Related CN101029824B (en) 2006-02-28 2006-02-28 Method and apparatus for positioning vehicle based on characteristics

Country Status (2)

Country Link
JP (1) JP4942509B2 (en)
CN (1) CN101029824B (en)

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101458814B (en) * 2007-12-13 2012-02-01 东软集团股份有限公司 Method and apparatus for separating an object's region of interest from an image
JP4856656B2 (en) * 2008-01-22 2012-01-18 富士重工業株式会社 Vehicle detection device
JP5651414B2 (en) * 2010-09-16 2015-01-14 株式会社東芝 Vehicle detection device
WO2013129380A1 (en) * 2012-03-01 2013-09-06 日産自動車株式会社 Vehicle detector and vehicle detection method
JP5911062B2 (en) 2012-04-26 2016-04-27 株式会社メガチップス Object detection apparatus and program
JP5911063B2 (en) 2012-04-27 2016-04-27 株式会社メガチップス Object detection apparatus and program
JP6169366B2 (en) * 2013-02-08 2017-07-26 株式会社メガチップス Object detection device, program, and integrated circuit
JP6140478B2 (en) * 2013-03-04 2017-05-31 株式会社メガチップス Object detection device, program, and integrated circuit
KR102069843B1 (en) * 2018-08-31 2020-01-23 서강대학교 산학협력단 Apparatus amd method for tracking vehicle
CN112215240B (en) * 2020-10-13 2024-02-20 珠海博明视觉科技有限公司 Optimization method for improving 2D complex edge detection precision
CN114998618A (en) * 2022-01-13 2022-09-02 山东高速股份有限公司 Truck color identification method based on convolutional neural network model
CN116958099B (en) * 2023-07-27 2024-05-24 微牌科技(浙江)有限公司 Cable abrasion detection method, system, device and computer equipment

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH08138036A (en) * 1994-11-11 1996-05-31 Nissan Motor Co Ltd Preceding vehicle recognition device
JP3072730B2 (en) * 1998-10-09 2000-08-07 日本電気株式会社 Vehicle detection method and device
JP4205825B2 (en) * 1999-11-04 2009-01-07 本田技研工業株式会社 Object recognition device
KR100553431B1 (en) * 2003-04-21 2006-02-20 주식회사 팬택 Method for concluding threshold for image division
JP2005149143A (en) * 2003-11-14 2005-06-09 Konica Minolta Holdings Inc Object detecting device and method, and computer program

Cited By (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8098933B2 (en) 2006-11-17 2012-01-17 Alpine Electronics, Inc. Method and apparatus for partitioning an object from an image
CN101436253B (en) * 2007-11-14 2012-04-25 东软集团股份有限公司 Method and device for verifying interested area of vehicle
CN101739550B (en) * 2009-02-11 2012-02-22 北京智安邦科技有限公司 Method and system for detecting moving objects
CN101739686B (en) * 2009-02-11 2012-05-30 北京智安邦科技有限公司 Moving object tracking method and system thereof
CN102314601A (en) * 2010-07-06 2012-01-11 通用汽车环球科技运作有限责任公司 Shadow removal in an image captured by a vehicle-based camera using a nonlinear illumination-constant kernel
CN102376090A (en) * 2010-08-19 2012-03-14 索尼公司 Image processing device, method, and program
CN102376090B (en) * 2010-08-19 2016-03-16 索尼公司 Image processing apparatus and method
CN104205170A (en) * 2012-03-28 2014-12-10 株式会社巨晶片 Object detection device and program
US10325171B2 (en) 2014-05-19 2019-06-18 Honda Motor Co., Ltd. Object detection device, driving assistance device, object detection method, and object detection program
CN105096655A (en) * 2014-05-19 2015-11-25 本田技研工业株式会社 Object detection device, driving assistance device, object detection method, and object detection program
CN104766308B (en) * 2015-03-19 2018-08-10 杭州电子科技大学 A kind of road vehicle shadow character extracting method
CN104766308A (en) * 2015-03-19 2015-07-08 杭州电子科技大学 Road vehicle shadow feature extraction method
CN105574542A (en) * 2015-12-15 2016-05-11 中国北方车辆研究所 Multi-vision feature vehicle detection method based on multi-sensor fusion
CN106650726A (en) * 2016-12-05 2017-05-10 渤海大学 License plate recognition method
CN109191492A (en) * 2018-07-11 2019-01-11 东南大学 A kind of intelligent video black smoke vehicle detection method based on edge analysis
CN109815812B (en) * 2018-12-21 2020-12-04 辽宁石油化工大学 Vehicle bottom edge positioning method based on horizontal edge information accumulation
CN109815812A (en) * 2018-12-21 2019-05-28 辽宁石油化工大学 A kind of vehicle bottom localization method based on horizontal edge information accumulation
CN110059566A (en) * 2019-03-20 2019-07-26 东软睿驰汽车技术(沈阳)有限公司 A kind of image-recognizing method and device
WO2020187311A1 (en) * 2019-03-20 2020-09-24 东软睿驰汽车技术(沈阳)有限公司 Image recognition method and device
WO2020186603A1 (en) * 2019-03-20 2020-09-24 东软睿驰汽车技术(沈阳)有限公司 Image recognition method and apparatus
CN110285870A (en) * 2019-07-22 2019-09-27 深圳市卓城科技有限公司 Vehicle spindle-type and wheel number determination method and its system
CN112565614A (en) * 2021-02-22 2021-03-26 四川赛狄信息技术股份公司 Signal processing module and method
CN112581473A (en) * 2021-02-22 2021-03-30 常州微亿智造科技有限公司 Method for realizing surface defect detection gray level image positioning algorithm
CN115984836A (en) * 2023-03-20 2023-04-18 山东杨嘉汽车制造有限公司 Tank opening identification and positioning method for railway tank wagon

Also Published As

Publication number Publication date
JP4942509B2 (en) 2012-05-30
CN101029824B (en) 2011-10-26
JP2007235950A (en) 2007-09-13

Similar Documents

Publication Publication Date Title
CN101029824A (en) Method and apparatus for positioning vehicle based on characteristics
US10970566B2 (en) Lane line detection method and apparatus
JP5896027B2 (en) Three-dimensional object detection apparatus and three-dimensional object detection method
CN102156868B (en) Image binaryzation method and device
CN111968144B (en) Image edge point acquisition method and device
CN1237327C (en) System and method for discriminating road gap
US8050456B2 (en) Vehicle and road sign recognition device
JP5867596B2 (en) Three-dimensional object detection apparatus and three-dimensional object detection method
US8036427B2 (en) Vehicle and road sign recognition device
JP6254084B2 (en) Image processing device
US10552706B2 (en) Attachable matter detection apparatus and attachable matter detection method
CN1991865A (en) Device, method, program and media for extracting text from document image having complex background
CN104899554A (en) Vehicle ranging method based on monocular vision
CN1945596A (en) Vehicle lane Robust identifying method for lane deviation warning
CN1924899A (en) Precise location method of QR code image symbol region at complex background
JP2006331193A (en) Vehicle, image processing system, image processing method, and image processing program
JP6529315B2 (en) Main subject detection method, main subject detection device and program
CN1822027A (en) Precise dividing device and method for grayscale character
JP5874831B2 (en) Three-dimensional object detection device
CN101029823A (en) Method for tracking vehicle based on state and classification
CN107644538B (en) Traffic signal lamp identification method and device
CN1790378A (en) Binary method and system for image
CN101599175A (en) Determine the detection method and the image processing equipment of alteration of shooting background
CN104156727A (en) Lamplight inverted image detection method based on monocular vision
CN109800641B (en) Lane line detection method based on threshold value self-adaptive binarization and connected domain analysis

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20111026

Termination date: 20210228

CF01 Termination of patent right due to non-payment of annual fee