CN112766077B - Front vehicle rollover recognition method based on self-vehicle camera perception information - Google Patents


Info

Publication number
CN112766077B
CN112766077B CN202011638511.8A
Authority
CN
China
Prior art keywords
vehicle
rollover
feature point
feature
frame
Prior art date
Legal status
Active
Application number
CN202011638511.8A
Other languages
Chinese (zh)
Other versions
CN112766077A (en)
Inventor
罗禹贡
向云丰
贺岩松
陈健
卢家怿
曹礼鹏
王博
古谚谌
Current Assignee
Tsinghua University
Chongqing University
Original Assignee
Tsinghua University
Chongqing University
Priority date
Filing date
Publication date
Application filed by Tsinghua University and Chongqing University
Priority to CN202011638511.8A
Publication of CN112766077A
Application granted
Publication of CN112766077B
Legal status: Active


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 Scenes; Scene-specific elements
    • G06V 20/50 Context or environment of the image
    • G06V 20/56 Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/24 Classification techniques
    • G06F 18/241 Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F 18/2413 Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on distances to training or reference patterns
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V 10/74 Image or video pattern matching; Proximity measures in feature spaces
    • G06V 10/75 Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
    • G06V 10/751 Comparing pixel values or logical combinations thereof, or feature values having positional relevance, e.g. template matching

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Data Mining & Analysis (AREA)
  • Multimedia (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Computation (AREA)
  • General Engineering & Computer Science (AREA)
  • Evolutionary Biology (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Health & Medical Sciences (AREA)
  • Computing Systems (AREA)
  • Databases & Information Systems (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Image Analysis (AREA)
  • Traffic Control Systems (AREA)

Abstract

The invention discloses a front vehicle rollover recognition method based on self-vehicle camera perception information. The method first obtains an image of the front area environment through a self-vehicle-mounted camera and recognizes the vehicle region in the image; it then extracts feature points from the vehicle regions of two consecutive frames and performs feature matching. An adaptive threshold culling method removes all feature point matching pairs whose feature points do not lie on the set feature plane. From the remaining feature point matching pairs, the rollover characteristic angle is calculated, together with the sum of the absolute values of the rollover characteristic angles over the previous several frames of front environment images, and whether the front vehicle has rolled over is judged according to whether the sum of the rollover characteristic angles exceeds a rollover threshold. Because the vehicle-mounted camera alone supplies the image information of the front area environment from which the rollover state of the front vehicle is judged, the invention solves the problem that existing unmanned vehicles cannot identify the rollover state of a front vehicle.

Description

Front vehicle rollover recognition method based on self-vehicle camera perception information
Technical Field
The invention belongs to the technical field of autonomous decision-making for unmanned vehicles, and particularly relates to a method for identifying the rollover state of a vehicle ahead on a highway, which helps the self vehicle better understand the state of the vehicle ahead and make better behavior decisions and trajectory plans.
Background
While an unmanned vehicle drives on an expressway, its decisions must account for the different behaviors of surrounding vehicles. For the unmanned vehicle to decide correctly, it must therefore accurately identify the behavior and driving state of the surrounding vehicles. At present, unmanned vehicles can recognize braking, lane changing, lane keeping, overtaking, steering, and similar behaviors of surrounding vehicles well, but there are few methods for identifying the rollover state of the vehicle ahead. When the vehicle ahead rolls over, its motion is no longer controlled by its driver, which seriously threatens the safety of the unmanned vehicle. Therefore, to improve driving safety, the unmanned vehicle must correctly identify the state of the vehicle ahead (whether it is rolling over) from the information acquired by its sensors, so that it can better predict the future trajectory of that vehicle and make reasonable behavior decisions and trajectory plans, avoiding or reducing the influence of the rollover on the self vehicle.
Disclosure of Invention
The invention aims to provide an online rollover recognition method for surrounding vehicles in the mixed traffic flow encountered by unmanned vehicles on a highway. The invention judges whether the vehicle ahead has rolled over from the information acquired by the on-board sensors, effectively solving the problem that the rollover state of a vehicle ahead is difficult to identify or monitor.
In order to achieve the purpose, the invention adopts the following technical scheme:
the invention provides a method for identifying rollover of a front vehicle based on self-vehicle camera perception information, which is characterized by comprising the following steps of:
1) acquiring the Nth frame and the (N-1)th frame of front environment images through a vehicle-mounted camera of the self vehicle, performing feature matching on the vehicle regions in the two frames of front environment images, and storing all feature point matching pairs in a set X;
2) using an adaptive threshold culling method to remove the feature points that are not on the set feature plane from all the feature point matching pairs stored in the set X, and storing the feature point matching pairs that are not removed in a set K;
3) calculating a rollover characteristic angle θ_N between the Nth and (N-1)th frame front environment images according to the feature point matching pairs in the set K;
4) according to the sum ss_N of the absolute values of the rollover characteristic angles over the previous n frames of front environment images and the suspected-rollover threshold th_s, updating the sum S_N of the rollover characteristic angles of the vehicle for the Nth frame front environment image, specifically:
if ss_N < th_s, let S_N = Σ_{ii=N-n}^{N-1} θ_ii, the sum of the rollover characteristic angles of the previous n frames;
if ss_N ≥ th_s, let S_N = S_{N-1} + θ_N; here th_s is the suspected-rollover threshold and θ_ii is a rollover characteristic angle within the previous n frames of front environment images;
5) judging whether the vehicle in the Nth frame front environment image has rolled over according to the absolute value of the sum S_N of the rollover characteristic angles and the rollover threshold th, specifically:
if abs(S_N) ≥ th, it is judged that the vehicle in the Nth frame front environment image has rolled over; the self vehicle is informed that a vehicle ahead has rolled over, and a response is requested; if abs(S_N) < th, it is judged that the vehicle in the Nth frame front environment image has not rolled over; let N = N + 1, return to step 1), and continue monitoring the rollover state of the vehicle ahead until the driving task of the self vehicle is completed.
Further, the step 2) specifically comprises the following steps:
21) let ω_{j,i} be the weight of the ith feature point matching pair at the jth extraction from the set X, and pp_j(i) its sampling probability; construct a set S_j for storing the a_j feature point matching pairs extracted from the set X at the jth time according to the probability distribution over the feature point matching pairs, ensuring that the feature points of the a_j extracted pairs lie in a common plane, recorded as the feature point plane π_j;
22) let j = 1, initialize ω_{j,i} = 1/M, i = 1, 2, ..., M, and initialize the sampling probability pp_j(i) = ω_{j,i}, i = 1, 2, ..., M; M is the total number of feature point matching pairs stored in the set X;
23) according to the sampling probabilities pp_j, extract a_j feature point matching pairs from the set X and store them in the set S_j; determine the feature point plane π_j from the feature point matching pairs in S_j and calculate the homography matrix H_j, a 3×3 matrix;
24) using the homography matrix H_j, calculate for each feature point matching pair in the set X the error e_{j,i} and the adaptive threshold th_{j,i}, i = 1, 2, ..., M; compare e_{j,i} with th_{j,i}, and store every ith feature point matching pair with e_{j,i} ≤ th_{j,i} into a subset K_j, wherein:
the error e_{j,i} is calculated as:
e_{j,i} = ||ps_{i,N} - H_j ps_{i,N-1}||_2
where ps_{i,N-1} and ps_{i,N} are the homogeneous coordinates of the feature points p_{i,N-1} and p_{i,N} of the ith feature point matching pair in the (N-1)th and Nth frame front environment images, respectively;
the adaptive threshold th_{j,i} is calculated as:
[equation image BDA0002879267650000031]
where Δd_j is the maximum allowed distance from a feature point to the feature point plane π_j, i.e., if the distance of a feature point to π_j is no more than Δd_j the point is considered to lie on π_j, and otherwise it is not; ΔV is the relative speed between the self vehicle and the vehicle corresponding to the vehicle region in the front environment image; Δt is the time interval between the Nth and (N-1)th frame front environment images; d_j is the distance from the self-vehicle camera to the feature point plane π_j; Δu_{j,i} and Δv_{j,i} are the differences of the horizontal and vertical pixel coordinates of the feature point p_{i,N-1} between the Nth and (N-1)th frames, calculated as:
Δu_{j,i} = u_{i,N} - u_{i,N-1}, Δv_{j,i} = v_{i,N} - v_{i,N-1}
25) let j = j + 1; from the error e_{j-1,i}, calculate the updated weight ω_{j,i} and sampling probability pp_j(i), wherein:
the weight ω_{j,i} is calculated as:
[equation image BDA0002879267650000033]
and the sampling probability pp_j(i) is calculated as:
[equation image BDA0002879267650000034]
26) if the change in sampling probability Δpp_j(i) = ||pp_j(i) - pp_{j-1}(i)||_2 is greater than or equal to a set threshold th_pp and the number of extractions j has not reached the set upper limit th_j, i.e., j < th_j, return to step 23); if Δpp_j(i) is less than th_pp, or j has reached the upper limit, i.e., j = th_j, execute step 27);
27) take the subset K_rr, rr = 1, 2, ..., j-1, containing the most elements as the set K from which the feature points not on the set feature plane have finally been removed, and execute step 3).
Further, the step 3) specifically comprises the following steps:
let A_{N-1,1}B_{N-1,1} be a plumb line on the vehicle region plane in the (N-1)th frame front environment image, and let A_{N-1,1} and B_{N-1,1} be the pixel points with pixel coordinates (0, 0) and (0, 1) in the (N-1)th frame front environment image, respectively; after the plumb line A_{N-1,1}B_{N-1,1} is mapped by the homography matrix H_ZN, the straight line A_{N-1,2}B_{N-1,2} corresponding to it in the Nth frame front environment image is obtained, with the specific calculation formulas:
As_{N-1,2} = H_ZN As_{N-1,1}
Bs_{N-1,2} = H_ZN Bs_{N-1,1}
H_ZN = [h_N1 h_N2 h_N3; h_N4 h_N5 h_N6; h_N7 h_N8 h_N9]
where As_{N-1,1}, Bs_{N-1,1}, As_{N-1,2}, Bs_{N-1,2} are the homogeneous coordinates of the pixel points A_{N-1,1}, B_{N-1,1}, A_{N-1,2}, B_{N-1,2}, respectively; the homography matrix H_ZN is calculated from the feature point matching pairs in the set K, and h_N1 to h_N9 are the first to ninth elements of H_ZN;
the rollover characteristic angle θ_N of the vehicle in the Nth frame front environment image is determined by the following formula:
θ_N = arctan(h_N2 / h_N5)
where θ_N is the angle between the vector from pixel point A_{N-1,1} to pixel point B_{N-1,1} in the (N-1)th frame image and the vector from pixel point A_{N-1,2} to pixel point B_{N-1,2} in the Nth frame front environment image.
The invention has the following characteristics and beneficial effects:
the invention relates to a method for judging whether other vehicles turn over or not based on self-vehicle perception information. According to the method, an environment image is obtained through a vehicle-mounted camera, a front vehicle is identified by using shadow features of the bottom of the vehicle, a homography matrix and rollover characteristic angles are calculated through vehicle region feature matching of front and rear frames of images, and the sum of n rollover characteristic angles before the current moment is calculated. And finally, judging whether the front vehicle turns over according to whether the sum of the rollover characteristic angles is larger than a rollover threshold value. The problem of present unmanned vehicle can't have the discernment of the state of turning on one's side to the vehicle in the front is solved, safe driving in the environment that unmanned vehicle can have the vehicle of turning on one's side in the front has laid the basis for unmanned vehicle.
Drawings
Fig. 1 is a schematic view of a driving scene of an unmanned vehicle to which the present invention is applied.
Fig. 2 is a flow chart of the method for recognizing rollover of a vehicle ahead according to the present invention.
FIG. 3 is a schematic diagram of vehicle identification through vehicle bottom shading features in the present invention.
FIG. 4 is a schematic diagram of matching vehicle region feature points of the Nth frame and the N-1 th frame in the present invention.
FIG. 5 is a diagram illustrating feature points not on a plane using adaptive threshold culling in accordance with the present invention.
FIG. 6 is a schematic diagram of matching the feature points of the vehicle region after the feature point pairs are removed by adaptive threshold.
FIG. 7 is a schematic diagram of the calculation of a rollover characteristic angle according to the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is further described in detail below with reference to the accompanying drawings and embodiments. It should be understood that the detailed description and specific examples are intended for purposes of illustration only and are not intended to limit the scope of the invention.
For a better understanding of the present invention, an application example of the method for recognizing a rollover of a front vehicle based on the perception information of the self-vehicle camera is described in detail below.
The application scene and flow chart of the front rollover vehicle identification method based on the self-vehicle camera are shown in Figs. 1 and 2, respectively. The method is explained below for any one vehicle ahead; the other vehicles ahead undergo rollover identification by the same method. The method comprises the following steps:
1) acquiring front environment images of the current moment and the previous moment through a vehicle-mounted camera of a vehicle, performing feature matching on vehicle regions in the two frames of front environment images, and storing all feature point matching pairs in a set X, wherein the specific steps are as follows:
Firstly, the front environment images at the current moment and the previous moment, i.e., the Nth and (N-1)th frame front environment images, are acquired with the vehicle-mounted camera of the self vehicle. Then the vehicle regions in the Nth and (N-1)th frame front environment images are identified (the specific identification methods include, but are not limited to, vehicle bottom shadow feature identification, template-based vehicle identification, and neural-network-based vehicle identification, all of which are prior art); in this embodiment the vehicle regions are identified using the vehicle bottom shadow feature method, and the identification result for one frame is shown in Fig. 3. Next, the feature points of the vehicle region (i.e., the region covered by the vehicle ahead) are extracted from the Nth and (N-1)th frame front environment images and feature matching is performed to obtain feature point matching pairs; Fig. 4 is a schematic diagram of the matching result for a group of feature points in this embodiment. All matched feature point pairs of the vehicle region in the Nth and (N-1)th frame front environment images are stored in a set X; let the set X contain M feature point matching pairs, and denote the feature points of each matching pair in the (N-1)th and Nth frame front environment images as p_{i,N-1} and p_{i,N}, i = 1, 2, ..., M.
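The matched pairs can be held as homogeneous pixel coordinates, the representation the error formula of step 24) operates on. A minimal sketch follows; the point values and the simulated inter-frame shift are illustrative, and a real pipeline would obtain the matches from a detector/matcher (e.g. ORB restricted to the detected vehicle region, a choice the patent leaves open):

```python
import numpy as np

# Hypothetical matched feature points (pixel coordinates) of the vehicle
# region in frames N-1 and N; real values would come from feature matching.
pts_prev = np.array([[10.0, 20.0], [30.0, 40.0], [50.0, 60.0]])
pts_curr = pts_prev + np.array([1.0, 2.0])  # simulated inter-frame motion

def to_homogeneous(pts):
    """ps_i = [u v 1]^T, the form used by the homography error in step 24)."""
    return np.hstack([pts, np.ones((len(pts), 1))])

# Set X: all feature point matching pairs (ps_{i,N-1}, ps_{i,N}), i = 1..M.
X = list(zip(to_homogeneous(pts_prev), to_homogeneous(pts_curr)))
```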
2) The feature points that are not on the set feature plane are removed from all the feature point matching pairs stored in the set X using the adaptive threshold culling method, specifically:
21) Let ω_{j,i} be the weight of the ith feature point matching pair at the jth extraction from the set X, and pp_j(i) its sampling probability. Construct a set S_j for storing the a_j feature point matching pairs extracted from the set X at the jth time according to the probability distribution over the feature point matching pairs, with 4 ≤ a_j ≤ M; the number of feature point matching pairs extracted each time may be the same or different, and the feature points of the a_j pairs extracted at the jth time must lie in one plane, recorded as the feature point plane π_j. Preferably a_j = 4, which minimizes the amount of calculation while keeping the feature points of the 4 extracted matching pairs in one plane as far as possible.
22) Let j = 1, initialize ω_{j,i} = 1/M, i = 1, 2, ..., M, and initialize the sampling probability pp_j(i) = ω_{j,i}, i = 1, 2, ..., M.
23) According to the sampling probabilities pp_j, extract a_j feature point matching pairs from the set X and store them in the set S_j; determine the feature point plane π_j from the feature point matching pairs in S_j and calculate the homography matrix H_j, a 3×3 matrix.
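Step 23) needs a 3×3 homography H_j from the a_j = 4 sampled matching pairs. The patent does not fix the estimation method; a standard choice is the direct linear transform (DLT), sketched below with numpy (the function name is illustrative):

```python
import numpy as np

def estimate_homography(pts_src, pts_dst):
    """DLT estimate of H mapping pts_src -> pts_dst (>= 4 point pairs).

    Builds the usual 2-rows-per-correspondence system A h = 0 and takes
    the right singular vector belonging to the smallest singular value.
    """
    A = []
    for (u, v), (x, y) in zip(pts_src, pts_dst):
        A.append([u, v, 1, 0, 0, 0, -x * u, -x * v, -x])
        A.append([0, 0, 0, u, v, 1, -y * u, -y * v, -y])
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]  # normalise so the ninth element equals 1
```

With 4 pairs this is an exact fit, matching the preferred a_j = 4 of step 21); with more pairs it becomes a least-squares fit.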
24) Referring to Fig. 5, use the homography matrix H_j to calculate, for each feature point matching pair in the set X, the error e_{j,i} and the adaptive threshold th_{j,i}, i = 1, 2, ..., M; compare e_{j,i} with th_{j,i}, and store every ith feature point matching pair with e_{j,i} ≤ th_{j,i} into a subset K_j, wherein:
the error e_{j,i} is calculated as:
e_{j,i} = ||ps_{i,N} - H_j ps_{i,N-1}||_2
where ps_{i,N-1} and ps_{i,N} are the homogeneous coordinates of the feature points p_{i,N-1} and p_{i,N} of the ith feature point matching pair in the (N-1)th and Nth frame front environment images; for example, p_{i,N-1} = [u_{i,N-1} v_{i,N-1}]^T, where u_{i,N-1} and v_{i,N-1} are the horizontal and vertical pixel coordinates of the feature point p_{i,N-1} in the (N-1)th frame front environment image, and the corresponding homogeneous coordinate is ps_{i,N-1} = [u_{i,N-1} v_{i,N-1} 1]^T.
The adaptive threshold th_{j,i} is calculated as:
[equation image BDA0002879267650000061]
where Δd_j is the maximum allowed distance from a feature point to the feature point plane π_j, i.e., if the distance of a feature point to π_j is no more than Δd_j, the feature point is considered to lie on the feature point plane π_j, and otherwise it does not. ΔV is the relative speed between the self vehicle and the vehicle corresponding to the vehicle region identified in the front environment image, Δt is the time interval between the Nth and (N-1)th frame front environment images, and d_j is the distance from the self-vehicle camera to the feature point plane π_j. Δu_{j,i} and Δv_{j,i} are the differences of the horizontal and vertical pixel coordinates of the feature point p_{i,N-1} between the Nth and (N-1)th frames, calculated as:
Δu_{j,i} = u_{i,N} - u_{i,N-1}, Δv_{j,i} = v_{i,N} - v_{i,N-1}
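The per-pair error of step 24) can be sketched as follows. One modelling point is hedged here: the error formula applies the norm to ps_{i,N} - H_j ps_{i,N-1} directly, while this sketch dehomogenises the mapped point first so the comparison happens in pixel coordinates (a common reading, but an assumption):

```python
import numpy as np

def match_error(H, ps_prev, ps_curr):
    """e_{j,i} = || ps_{i,N} - H_j ps_{i,N-1} ||_2 in pixel coordinates."""
    mapped = H @ np.asarray(ps_prev, dtype=float)
    mapped = mapped / mapped[2]  # dehomogenise (assumption, see lead-in)
    return float(np.linalg.norm(np.asarray(ps_curr, dtype=float) - mapped))
```

A pair is kept in the subset K_j when its error is at most the adaptive threshold th_{j,i}.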
25) Let j = j + 1; from the error e_{j-1,i}, calculate the updated weight ω_{j,i} and sampling probability pp_j(i), wherein:
the weight ω_{j,i} is calculated as:
[equation image BDA0002879267650000063]
and the sampling probability pp_j(i) is calculated as:
[equation image BDA0002879267650000064]
26) If the change in sampling probability Δpp_j(i) = ||pp_j(i) - pp_{j-1}(i)||_2 is greater than or equal to a set threshold th_pp (preferably th_pp = 10^-6) and the number of extractions j has not reached the set upper limit, i.e., j < th_j (preferably th_j = 1000), return to step 23); if Δpp_j(i) is less than th_pp, or j has reached the set upper limit, i.e., j = th_j, execute step 27).
27) Take the subset K_rr, rr = 1, 2, ..., j-1, containing the most elements as the set K after the feature points not on the set feature plane have finally been removed, and execute step 3). In this embodiment, a schematic diagram of the matching result for a group of feature points after culling of the feature point matching pairs is shown in Fig. 6.
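Steps 21)-27) form an iterated, weighted RANSAC-like consensus loop. The exact weight formula appears only as an equation image in this text, so the sketch below uses an inverse-exponential of the previous error as a stand-in (an assumption), showing only the weight/probability update of step 25) and the stopping test on Δpp of step 26):

```python
import numpy as np

def update_sampling_probabilities(errors, prev_pp, sigma=1.0, th_pp=1e-6):
    """One iteration of steps 25)-26): reweight pairs by their last error.

    errors:  e_{j-1,i} for every pair in the set X
    prev_pp: pp_{j-1}(i)
    The weight rule exp(-e/sigma) is a stand-in for the patent's omega
    formula (shown only as an image); small-error pairs are favoured.
    """
    w = np.exp(-np.asarray(errors, dtype=float) / sigma)
    pp = w / w.sum()                              # pp_j(i), a distribution
    delta = np.linalg.norm(pp - np.asarray(prev_pp, dtype=float))
    converged = bool(delta < th_pp)               # stop condition, step 26)
    return pp, converged
```

Pairs that repeatedly land near the dominant plane keep their sampling probability high, so successive extractions concentrate on that plane until the distribution stops changing.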
3) The rollover characteristic angle θ_N between the vehicle regions in the (N-1)th and Nth frame front environment images is calculated according to the feature point matching pairs in the set K, with the following specific steps:
Referring to Fig. 7, let A_{N-1,1}B_{N-1,1} be a plumb line on the vehicle region plane in the (N-1)th frame front environment image, and let A_{N-1,1} and B_{N-1,1} be the pixel points with pixel coordinates (0, 0) and (0, 1) in the (N-1)th frame front environment image, respectively; after the plumb line A_{N-1,1}B_{N-1,1} is mapped by the homography matrix H_ZN, the straight line A_{N-1,2}B_{N-1,2} corresponding to it in the Nth frame front environment image is obtained, with the specific calculation formulas:
As_{N-1,2} = H_ZN As_{N-1,1}
Bs_{N-1,2} = H_ZN Bs_{N-1,1}
H_ZN = [h_N1 h_N2 h_N3; h_N4 h_N5 h_N6; h_N7 h_N8 h_N9]
where As_{N-1,1}, Bs_{N-1,1}, As_{N-1,2}, Bs_{N-1,2} are the homogeneous coordinates of the pixel points A_{N-1,1}, B_{N-1,1}, A_{N-1,2}, B_{N-1,2}, respectively; the homography matrix H_ZN is calculated from the feature point matching pairs in the set K, and h_N1 to h_N9 are its first to ninth elements.
The rollover characteristic angle θ_N of the vehicle in the Nth frame front environment image (θ_N is the angle between the plumb line A_{N-1,1}B_{N-1,1} and the straight line A_{N-1,2}B_{N-1,2}) is determined by the following formula:
θ_N = arctan(h_N2 / h_N5)
where h_N2 and h_N5 are the second and fifth elements of the homography matrix H_ZN, respectively.
Here the vector from pixel point A_{N-1,1} to pixel point B_{N-1,1} is taken in the (N-1)th frame image, and the vector from pixel point A_{N-1,2} to pixel point B_{N-1,2} is taken in the Nth frame front environment image.
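The mapping of the unit vertical segment through H_ZN can also be carried out numerically. The sketch below dehomogenises both endpoints and measures the signed angle of the mapped direction against the image vertical; for an affine-like H (h_N7 = h_N8 = 0, h_N9 = 1) this reduces to arctan(h_N2 / h_N5), consistent with the use of h_N2 and h_N5 noted above:

```python
import numpy as np

def rollover_angle_deg(H):
    """Signed angle (degrees) between the image vertical and the image of
    the plumb segment A = (0, 0), B = (0, 1) under the homography H."""
    A2 = H @ np.array([0.0, 0.0, 1.0])
    B2 = H @ np.array([0.0, 1.0, 1.0])
    d = B2[:2] / B2[2] - A2[:2] / A2[2]   # mapped direction A2 -> B2
    return float(np.degrees(np.arctan2(d[0], d[1])))  # 0 deg = still vertical
```

For an upright, merely translating vehicle the homography maps the vertical to a vertical and θ_N stays near zero; a rolling vehicle rotates the vehicle plane and θ_N grows frame by frame.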
4) The sum ss_N of the absolute values of the rollover characteristic angles of the n frames before the current moment is calculated:
ss_N = Σ_{ii=N-n}^{N-1} |θ_ii|
where n is the number of frames in the set range before the Nth frame front environment image, generally the number of frames corresponding to 1 s to 3 s before the Nth frame front environment image, and θ_ii is a rollover characteristic angle within the previous n frames of front environment images.
According to the sum ss_N of the absolute values of the rollover characteristic angles and the suspected-rollover threshold th_s, the sum S_N of the rollover characteristic angles of the vehicle ahead at the current moment is updated by the following rule:
if ss_N < th_s, let S_N = Σ_{ii=N-n}^{N-1} θ_ii;
if ss_N ≥ th_s, let S_N = S_{N-1} + θ_N; here th_s is the suspected-rollover threshold, preferably in the range of 8° to 15°.
When ss_N is less than th_s, the probability that the vehicle ahead is rolling over is low; to prevent S_N from growing too large through accumulated error, S_N is taken as only the sum of the previous n rollover characteristic angles. When ss_N is greater than or equal to th_s, the probability that the vehicle ahead is rolling over is high, and the sum of the rollover characteristic angles continues to accumulate from the previous value, S_N = S_{N-1} + θ_N.
5) Whether the vehicle ahead has rolled over is judged according to the absolute value of the sum S_N of the rollover characteristic angles for the Nth frame front environment image and the rollover threshold th, specifically:
if abs(S_N) ≥ th, it is judged that the vehicle ahead has rolled over; the self vehicle is informed that a vehicle ahead has rolled over and a response is requested, and the method ends. If abs(S_N) < th, it is judged that the vehicle ahead is not in a rollover state; let N = N + 1, return to step 1), and continue monitoring the rollover state of the vehicle ahead until the driving task of the self vehicle is finished, whereupon the method ends.
The larger the rollover threshold th, the lower the false-alarm rate of the method, but the higher the probability of a missed judgment and the later the rollover of the vehicle ahead is identified. The preferred range of the rollover threshold th is 20° to 40°.
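Steps 4) and 5) can be combined into a small decision routine. The exact window indexing (whether the n-frame window includes the current frame) is an assumption here, since the corresponding formula appears only as an equation image; the thresholds use the preferred ranges quoted in the text:

```python
def detect_rollover(thetas, n=5, th_s=10.0, th=30.0):
    """Return the frame index at which rollover is declared, else None.

    thetas: rollover characteristic angles theta_N per frame (degrees).
    n:      window length (frames covering roughly 1-3 s in practice).
    th_s:   suspected-rollover threshold (preferred 8-15 deg).
    th:     rollover threshold (preferred 20-40 deg).
    """
    S_prev = 0.0
    for N, theta in enumerate(thetas):
        window = thetas[max(0, N - n):N]          # previous n angles
        ss = sum(abs(t) for t in window)
        if ss < th_s:
            S = sum(window)       # low likelihood: drop the accumulation
        else:
            S = S_prev + theta    # suspected rollover: keep accumulating
        if abs(S) >= th:
            return N              # rollover declared for frame N
        S_prev = S
    return None
```

A stream of persistently large characteristic angles triggers the declaration a few frames after the roll begins, while isolated noisy angles are flushed by the ss_N < th_s branch.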

Claims (6)

1. A front vehicle rollover recognition method based on self-vehicle camera perception information, characterized by comprising the following steps:
1) acquiring the Nth frame and the (N-1)th frame of front environment images through a vehicle-mounted camera of the self vehicle, performing feature matching on the vehicle regions in the two frames of front environment images, and storing all feature point matching pairs in a set X;
2) using an adaptive threshold culling method to remove the feature points that are not on the set feature plane from all the feature point matching pairs stored in the set X, and storing the feature point matching pairs that are not removed in a set K;
3) calculating a rollover characteristic angle θ_N between the Nth and (N-1)th frame front environment images according to the feature point matching pairs in the set K, specifically:
let A_{N-1,1}B_{N-1,1} be a plumb line on the vehicle region plane in the (N-1)th frame front environment image, and let A_{N-1,1} and B_{N-1,1} be the pixel points with pixel coordinates (0, 0) and (0, 1) in the (N-1)th frame front environment image, respectively; after the plumb line A_{N-1,1}B_{N-1,1} is mapped by the homography matrix H_ZN, the straight line A_{N-1,2}B_{N-1,2} corresponding to it in the Nth frame front environment image is obtained, with the specific calculation formulas:
As_{N-1,2} = H_ZN As_{N-1,1}
Bs_{N-1,2} = H_ZN Bs_{N-1,1}
H_ZN = [h_N1 h_N2 h_N3; h_N4 h_N5 h_N6; h_N7 h_N8 h_N9]
where As_{N-1,1}, Bs_{N-1,1}, As_{N-1,2}, Bs_{N-1,2} are the homogeneous coordinates of the pixel points A_{N-1,1}, B_{N-1,1}, A_{N-1,2}, B_{N-1,2}, respectively; the homography matrix H_ZN is calculated from the feature point matching pairs in the set K, and h_N1 to h_N9 are the first to ninth elements of H_ZN;
determining the rollover characteristic angle θ_N of the vehicle in the Nth frame front environment image as the angle between the following two vectors:

θ_N = ∠(vector A_{N-1,1}B_{N-1,1}, vector A_{N-1,2}B_{N-1,2})

where vector A_{N-1,1}B_{N-1,1} points from pixel point A_{N-1,1} to pixel point B_{N-1,1} in the (N-1)th frame image, and vector A_{N-1,2}B_{N-1,2} points from pixel point A_{N-1,2} to pixel point B_{N-1,2} in the Nth frame front environment image;
4) updating the sum S_N of the rollover characteristic angles of the vehicle in the Nth frame front environment image according to the magnitude relation between ss_N, the sum of the absolute values of the rollover characteristic angles over the n frames of front environment images preceding the Nth frame, and the suspected-rollover threshold th_s, specifically:

if ss_N < th_s, let

Figure FDA0003756867940000021

if ss_N ≥ th_s, let S_N = S_{N-1} + θ_N, where th_s is the suspected-rollover threshold and θ_i are the rollover characteristic angles in the n frames of front environment images preceding the Nth frame;
5) judging, from the magnitude relation between abs(S_N), the absolute value of the sum of the rollover characteristic angles of the Nth frame front environment image, and the rollover threshold th, whether the vehicle in the Nth frame front environment image has rolled over, specifically:

if abs(S_N) ≥ th, it is judged that the vehicle in the Nth frame front environment image has rolled over; the ego vehicle is informed that the vehicle ahead has rolled over, and a response is requested; if abs(S_N) < th, it is judged that the vehicle in the Nth frame front environment image has not rolled over; let N = N + 1, return to step 1), and continue monitoring the rollover state of the front vehicle until the driving task of the ego vehicle is completed.
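The geometric core of step 3) of claim 1 — map the plumb line through pixels (0, 0) and (0, 1) with the homography and measure how far it tilts — can be sketched in a few lines of numpy. The function names, the signed-angle convention, and the rotation-homography helper are illustrative assumptions, not taken from the patent; the feature matching and homography estimation of steps 1) and 2) are not shown here.

```python
import numpy as np

def plumb_angle(H):
    """Rollover characteristic angle theta_N in degrees: signed tilt of the
    plumb line through pixels (0,0) and (0,1) after mapping by the 3x3
    homography H."""
    a = H @ np.array([0.0, 0.0, 1.0])    # homogeneous coords of mapped A
    b = H @ np.array([0.0, 1.0, 1.0])    # homogeneous coords of mapped B
    v = (b / b[2])[:2] - (a / a[2])[:2]  # mapped direction A'B'
    u = np.array([0.0, 1.0])             # original plumb direction AB
    # signed angle from u to v: atan2(cross, dot)
    cross = u[0] * v[1] - u[1] * v[0]
    dot = u @ v
    return np.degrees(np.arctan2(cross, dot))

def rotation_homography(deg):
    """In-plane rotation expressed as a homography (for illustration)."""
    r = np.radians(deg)
    c, s = np.cos(r), np.sin(r)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])
```

For an identity homography the angle is 0; for a pure in-plane rotation the recovered angle equals the rotation, which is exactly the rollover cue the claim accumulates.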
2. The method for recognizing rollover of a preceding vehicle as claimed in claim 1, wherein step 2) specifically comprises the following steps:

21) let ω_{j,i} be the weight of the ith feature point matching pair at the jth extraction from set X, and pp_j(i) be its sampling probability at the jth extraction; construct a set S_j to store the a_j feature point matching pairs extracted from set X at the jth time according to the sampling probability distribution, ensuring that the feature points of the a_j matching pairs extracted each time lie on a common plane, which is taken as the feature point plane π_j;

22) let j = 1, initialize ω_{j,i} = 1/M, i = 1, 2, ..., M, and initialize the sampling probability pp_j(i) = ω_{j,i}, i = 1, 2, ..., M, where M is the total number of feature point matching pairs stored in set X;

23) extract a_j feature point matching pairs from set X according to the sampling probability pp_j and store them in set S_j; determine the feature point plane π_j from the feature point matching pairs in set S_j and calculate the homography matrix H_j, a 3 × 3 matrix;
24) using the homography matrix H_j, calculate the error e_{j,i} and the adaptive threshold th_{j,i}, i = 1, 2, ..., M, for each feature point matching pair in set X at the jth extraction; compare e_{j,i} with th_{j,i}, and store each ith feature point matching pair of set X with e_{j,i} ≤ th_{j,i} into a subset K_j, where:

the error e_{j,i} is calculated as:

e_{j,i} = ||ps_{i,N} − H_j ps_{i,N−1}||²

where ps_{i,N−1} and ps_{i,N} are the homogeneous coordinates of the feature points p_{i,N−1} and p_{i,N} of the ith feature point matching pair in the (N-1)th and Nth frame front environment images respectively;

the adaptive threshold th_{j,i} is calculated as:

Figure FDA0003756867940000031

where Δd_j is the maximum allowed distance from a feature point to the feature point plane π_j, i.e., if the distance from a feature point to the feature point plane π_j is less than or equal to Δd_j, the feature point is considered to lie on the feature point plane π_j; otherwise, the feature point does not lie on the feature point plane π_j and is removed; ΔV is the relative speed between the ego vehicle and the vehicle corresponding to the vehicle region in the front environment image; Δt is the time interval between the Nth and (N-1)th frame front environment images; D_j is the distance from the camera to the feature point plane π_j; Δu_{j,i} and Δv_{j,i} are respectively the differences of the horizontal and vertical pixel coordinates of feature point p_{i,N−1} between the Nth frame and the (N-1)th frame images, calculated as:

Δu_{j,i} = u_{i,N} − u_{i,N−1},  Δv_{j,i} = v_{i,N} − v_{i,N−1}
25) let j = j + 1, and calculate the updated weights ω_{j,i} and sampling probabilities pp_j(i) from the errors e_{j−1,i}, where:

the weight ω_{j,i} is calculated as:

Figure FDA0003756867940000033

and the sampling probability pp_j(i) is obtained by normalizing the weights:

pp_j(i) = ω_{j,i} / Σ_{k=1..M} ω_{j,k}
26) if the change in sampling probability Δpp_j(i) = ||pp_j(i) − pp_{j−1}(i)||² is greater than or equal to the set threshold th_pp and the number of extractions j has not reached the set upper limit th_j, i.e., j < th_j, return to step 23); if Δpp_j(i) = ||pp_j(i) − pp_{j−1}(i)||² is less than the set threshold th_pp, or the number of extractions j has reached the set upper limit th_j, i.e., j = th_j, execute step 27);

27) take the subset K_rr, rr ∈ {1, 2, ..., j−1}, that contains the most elements as the final set K after removing the feature points not on the set feature plane, and execute step 3).
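The iteration of claim 2 behaves like a weighted RANSAC: sample a_j matching pairs by probability, fit a homography, score every pair against a threshold, then reweight by error and resample until the probabilities converge. A schematic numpy sketch follows; since the adaptive-threshold formula th_{j,i} and the weight update are given only as images in the source, the fixed threshold th_fixed and the exp(−e) reweighting below are stand-in assumptions, as is the injected fit_h callback for homography fitting.

```python
import numpy as np

def weighted_plane_consensus(src, dst, fit_h, a_j=4, th_fixed=3.0,
                             max_iter=50, th_pp=1e-4, rng=None):
    """Iteratively sample a_j correspondences with adaptive probabilities,
    fit a homography via fit_h(src4, dst4) -> 3x3 H, and keep the largest
    inlier subset.  src, dst: (M, 2) arrays of matched pixel coordinates.
    th_fixed stands in for the patent's adaptive threshold th_{j,i}."""
    if rng is None:
        rng = np.random.default_rng(0)
    M = len(src)
    src_h = np.hstack([src, np.ones((M, 1))])   # homogeneous coordinates
    pp = np.full(M, 1.0 / M)                    # uniform init (step 22)
    best_inliers = np.zeros(M, dtype=bool)
    for _ in range(max_iter):
        idx = rng.choice(M, size=a_j, replace=False, p=pp)   # step 23
        H = fit_h(src[idx], dst[idx])
        proj = src_h @ H.T
        proj = proj[:, :2] / proj[:, 2:3]       # dehomogenize
        err = np.sum((dst - proj) ** 2, axis=1) # e_{j,i} (step 24)
        inliers = err <= th_fixed
        if inliers.sum() > best_inliers.sum():
            best_inliers = inliers              # largest subset K_rr (step 27)
        w = np.exp(-err)                        # reweighting (assumed form)
        pp_new = w / w.sum()                    # normalize to probabilities
        converged = np.sum((pp_new - pp) ** 2) < th_pp   # step 26 criterion
        pp = pp_new
        if converged:
            break
    return best_inliers
```

With identical point sets plus a few gross outliers, the loop converges in a couple of iterations and the outliers receive near-zero sampling probability.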
3. The method for recognizing rollover of a preceding vehicle as claimed in claim 2, wherein the number a_j of feature point matching pairs extracted from set X each time is a_j = 4.
4. The method for recognizing rollover of a preceding vehicle as claimed in claim 1, wherein n is the number of frames corresponding to 1 s to 3 s before the Nth frame front environment image.
5. The method for recognizing rollover of a preceding vehicle as claimed in claim 1, wherein in step 4), the value of the suspected-rollover threshold th_s ranges from 8° to 15°.
6. The method for recognizing rollover of a preceding vehicle according to claim 1, wherein in step 5), the rollover threshold th has a value ranging from 20 ° to 40 °.
CN202011638511.8A 2020-12-31 2020-12-31 Front vehicle rollover recognition method based on self-vehicle camera perception information Active CN112766077B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011638511.8A CN112766077B (en) 2020-12-31 2020-12-31 Front vehicle rollover recognition method based on self-vehicle camera perception information

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011638511.8A CN112766077B (en) 2020-12-31 2020-12-31 Front vehicle rollover recognition method based on self-vehicle camera perception information

Publications (2)

Publication Number Publication Date
CN112766077A CN112766077A (en) 2021-05-07
CN112766077B true CN112766077B (en) 2022-09-06

Family

ID=75698263

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011638511.8A Active CN112766077B (en) 2020-12-31 2020-12-31 Front vehicle rollover recognition method based on self-vehicle camera perception information

Country Status (1)

Country Link
CN (1) CN112766077B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102289821A (en) * 2011-08-25 2011-12-21 西北工业大学 Image detection method for side-slipping motion of vehicle
CN103473774A (en) * 2013-09-09 2013-12-25 长安大学 Vehicle locating method based on matching of road surface image characteristics
CN106197374A (en) * 2016-08-15 2016-12-07 临沂大学 Car body angle excursion measuring method
WO2017104713A1 (en) * 2015-12-14 2017-06-22 ヤマハ発動機株式会社 Vehicle roll angle estimation system, vehicle, vehicle roll angle estimation method, and program
CN107054358A (en) * 2015-12-03 2017-08-18 罗伯特·博世有限公司 The inclination identification of two wheeler


Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Vision-Based Autonomous Path Following Using a Human Driver Control Model With Reliable Input-Feature Value Estimation; Kazuhide Okamoto, et al.; IEEE Transactions on Intelligent Vehicles; 2019-09; full text *
Rollover warning system for heavy vehicles based on an improved TTR algorithm; Zhu Tianjun et al.; Journal of Mechanical Engineering; 2011-05-20 (No. 10); full text *
Vehicle rollover warning based on zero-moment-point index and time-to-rollover algorithm; Jin Liqiang et al.; Automotive Engineering; 2017-03-25 (No. 03); full text *

Also Published As

Publication number Publication date
CN112766077A (en) 2021-05-07

Similar Documents

Publication Publication Date Title
Andrade et al. A novel strategy for road lane detection and tracking based on a vehicle’s forward monocular camera
CN110745140B (en) Vehicle lane change early warning method based on continuous image constraint pose estimation
CN107798335B (en) Vehicle logo identification method fusing sliding window and Faster R-CNN convolutional neural network
CN104573646B (en) Chinese herbaceous peony pedestrian detection method and system based on laser radar and binocular camera
CN110443225B (en) Virtual and real lane line identification method and device based on feature pixel statistics
US11373532B2 (en) Pothole detection system
CN111667512B (en) Multi-target vehicle track prediction method based on improved Kalman filtering
CN110298307B (en) Abnormal parking real-time detection method based on deep learning
CN110979313B (en) Automatic parking positioning method and system based on space map
EP3029538B1 (en) Vehicle position/bearing estimation device and vehicle position/bearing estimation method
CN108764108A (en) A kind of Foregut fermenters method based on Bayesian inference
CN112053589A (en) Target vehicle lane changing behavior adaptive identification model construction method
CN110738081B (en) Abnormal road condition detection method and device
CN111381248A (en) Obstacle detection method and system considering vehicle bump
CN109002797B (en) Vehicle lane change detection method, device, storage medium and computer equipment
CN106569214A (en) Method and system for processing vehicle-mounted radar data of adaptive cruise vehicle in conjunction with navigation information
CN110588623A (en) Large automobile safe driving method and system based on neural network
CN113942524B (en) Vehicle running control method, system and computer readable storage medium
CN105300390B (en) The determination method and device of obstructing objects movement locus
CN112766077B (en) Front vehicle rollover recognition method based on self-vehicle camera perception information
JP2007299045A (en) Lane recognition device
JP3319383B2 (en) Roadway recognition device
CN111414857A (en) Front vehicle detection method based on vision multi-feature fusion
DE112020002753T5 (en) VEHICLE CONTROL METHOD, VEHICLE CONTROL DEVICE AND VEHICLE CONTROL SYSTEM INCLUDING THEM
CN113486837B (en) Automatic driving control method for low-pass obstacle

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant