CN107021104A - Lane identification compensation method and device
Lane identification compensation method and device
- Publication number
- CN107021104A (application CN201710301941.2A)
- Authority
- CN
- China
- Prior art keywords
- vehicle
- lane
- information
- road
- position information
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W40/00—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
- B60W40/02—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to ambient conditions
- B60W40/06—Road conditions
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/588—Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Theoretical Computer Science (AREA)
- Automation & Control Theory (AREA)
- Mathematical Physics (AREA)
- Transportation (AREA)
- Mechanical Engineering (AREA)
- Traffic Control Systems (AREA)
Abstract
The invention provides a lane identification compensation method and device. The method includes: acquiring lane information of the lane in which a vehicle is located; determining lane identification parameters of the lane according to the lane information, the lane identification parameters including road curvature, yaw angle and lateral deviation distance; when the lane identification parameters cannot be determined, determining the driving speed information and vehicle position information of the vehicle through a positioning module arranged on the vehicle, and determining the road curvature of the lane in which the vehicle is located according to the vehicle position information and a preset road design information list; and acquiring the turning speed information of the vehicle through a combined inertial sensor arranged on the vehicle, and calculating the yaw angle and the lateral deviation distance of the vehicle from the turning speed information and the driving speed information. The invention also discloses a corresponding lane identification compensation device.
Description
Technical Field
The invention relates to the technical field of lane recognition, in particular to a lane recognition compensation method and a lane recognition compensation device.
Background
Lane recognition is an important link in intelligent vehicle path planning and decision control. In lane recognition, a monocular vision sensor is usually used to recognize lane lines: a real-time road scene image is obtained through a monocular camera, and lane line profiles are extracted with image recognition technology so as to recognize the road curvature of the lane ahead, the yaw angle (the included angle between the road center line and the vehicle heading) and the lateral deviation distance (the lateral offset between the road center line and the vehicle center line). However, in existing technical schemes, on complex roads such as intersections where lane lines are broken or congested roads, or when lane lines are unclear or visibility is low because of haze, rain, snow and other weather, the lane recognition algorithm fails and the sensor cannot recognize lane information for a certain period of time, so no effective lane line feature signal is transmitted to the vehicle. The resulting interruption of lane recognition information affects the driving decisions of the intelligent vehicle and negatively affects its safety and reliability.
Disclosure of Invention
In view of the above, it is an object of the present invention to provide a lane recognition compensation method and apparatus in an attempt to solve or at least alleviate the above existing problems.
In a first aspect, an embodiment of the present invention provides a lane identification compensation method, where the method includes:
acquiring lane information of a lane where a vehicle is located;
determining lane identification parameters of the lane according to the lane information, wherein the lane identification parameters comprise road curvature, yaw angle and transverse deviation distance;
when the lane recognition parameter cannot be determined,
determining running speed information and vehicle position information of the vehicle through a positioning module arranged on the vehicle, and determining the road curvature of a lane where the vehicle is located according to the vehicle position information and a preset road design information list;
acquiring turning speed information of the vehicle through a combined inertial sensor arranged on the vehicle, and calculating the yaw angle and the lateral deviation distance of the vehicle through the turning speed information and the running speed information.
Optionally, in the method according to the present invention, before the determining the lane identification parameter of the lane according to the lane information, the method includes:
acquiring lane information of a lane where the vehicle is located through a visual sensor arranged on the vehicle;
determining lane identification parameters of the lane according to the lane information includes:
and processing the lane information by using an image processing algorithm to obtain the lane identification parameter.
Optionally, in the method according to the present invention, the determining the road curvature of the lane where the vehicle is located according to the vehicle position information and a preset road design information list includes:
traversing the road design information list, and determining the position information of the target road marking point consistent with the vehicle position information from the road design information list according to the vehicle position information;
and after the position information of the target road marking point is determined to exist in the road design information list, marking the road design curvature corresponding to the position information of the target road marking point as the road curvature of the lane where the vehicle is located.
Optionally, in the method according to the present invention, when determining, from the road design information list, position information of a target road marking point that is consistent with the position information according to the vehicle position information, the method further includes:
and after the target road marking point position information does not exist in the road design information list, calculating a weighted average value of road design curvatures corresponding to at least two pieces of road marking point position information adjacent to the vehicle position information in the road design information list, and marking the weighted average value as the road curvature of a lane where the vehicle is located.
Optionally, in the method according to the invention, the turning speed information comprises yaw rate, the calculating of the yaw angle and lateral deviation distance of the vehicle from the turning speed information and the travel speed information comprises:
determining relative position information of the vehicle according to the yaw angular velocity and the running speed information;
calculating the yaw angle and the lateral deviation distance of the vehicle using a specific equation set according to the relative position information.
In a second aspect, the present invention provides a lane recognition compensation apparatus, comprising:
the lane information acquisition unit is used for acquiring lane information of a lane where the vehicle is located;
the parameter determining unit is used for determining lane identification parameters of the lane according to the lane information, wherein the lane identification parameters comprise road curvature, yaw angle and transverse deviation distance;
the parameter calculation unit is used for determining the running speed information and the vehicle position information of the vehicle through a positioning module arranged on the vehicle when the lane identification parameter cannot be determined, and determining the road curvature of the lane where the vehicle is located according to the vehicle position information and a preset road design information list; acquiring turning speed information of the vehicle through a combined inertial sensor arranged on the vehicle, and calculating the yaw angle and the lateral deviation distance of the vehicle through the turning speed information and the running speed information.
Optionally, in the apparatus according to the present invention, the lane information acquiring unit is further configured to acquire lane information of a lane in which the vehicle is located through a visual sensor provided on the vehicle;
the parameter determining unit is further configured to process the lane information by using an image processing algorithm to obtain the lane identification parameter.
Optionally, in the apparatus according to the present invention, the road design information list at least includes road marking point position information and a road design curvature corresponding to the marking point, and the parameter calculation unit is further configured to:
traversing the road design information list, and determining the position information of the target road marking point consistent with the vehicle position information from the road design information list according to the vehicle position information;
and after the position information of the target road marking point is determined to exist in the road design information list, marking the road design curvature corresponding to the position information of the target road marking point as the road curvature of the lane where the vehicle is located.
Optionally, in the apparatus according to the present invention, the parameter calculating unit is further configured to:
and after the target road marking point position information does not exist in the road design information list, calculating a weighted average value of the road design curvatures corresponding to the position information of two road marking points adjacent to the vehicle position information in the road design information list, and marking the weighted average value as the road curvature of the lane where the vehicle is located.
Optionally, in the apparatus according to the present invention, the turning speed information comprises yaw angular velocity, the parameter calculation unit is further configured to:
determining relative position information of the vehicle according to the yaw angular velocity and the running speed information;
calculating the yaw angle and the lateral deviation distance of the vehicle using a specific equation set according to the relative position information.
According to the technical scheme of the invention, when the lane identification parameters cannot be determined, information such as the turning speed information and position information during driving is obtained through the combined inertial sensor and the positioning module, and the lane identification parameters are calculated from it. This prevents the interruption of information caused by the inability to obtain the lane identification parameters from affecting the driving decisions of the vehicle, and improves the safety and reliability of the vehicle during driving.
In order to make the aforementioned and other objects, features and advantages of the present invention comprehensible, preferred embodiments accompanied with figures are described in detail below.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings needed to be used in the embodiments will be briefly described below, it should be understood that the following drawings only illustrate some embodiments of the present invention and therefore should not be considered as limiting the scope, and for those skilled in the art, other related drawings can be obtained according to the drawings without inventive efforts.
Fig. 1 is a flowchart illustrating a lane recognition compensation method according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of a lane image provided by an embodiment of the present invention;
FIG. 3 is a schematic diagram illustrating a coordinate system for calculating lane identification parameters according to an embodiment of the present invention;
fig. 4 is a block diagram illustrating a lane recognition compensating apparatus according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. The components of embodiments of the present invention generally described and illustrated in the figures herein may be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the present invention, presented in the figures, is not intended to limit the scope of the invention, as claimed, but is merely representative of selected embodiments of the invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments of the present invention without making any creative effort, shall fall within the protection scope of the present invention.
The invention obtains the information related to the lane by a monocular vision sensor, a vehicle-mounted combined inertial sensor (IMU), a vehicle-mounted locator (such as a GPS sensor) and the like arranged on the vehicle, and compensates and predicts the lane identification parameter in the lane information calculation model according to the obtained information related to the lane. The lane information calculation model is a quadratic function, and the formula is as follows:
f(x) = (1/2)·c_2·x² + c_1·x + c_0
where c_2 represents the road curvature of the lane, c_1 represents the yaw angle, and c_0 represents the lateral offset distance. Finally, the obtained lane identification parameters are transmitted to a driving decision system or a vehicle control unit (ECU). The details are as follows.
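To make the role of the three parameters concrete, the following is a minimal sketch (Python, with assumed variable names) that evaluates this lane model at a longitudinal distance x ahead of the vehicle; it only illustrates the formula above and is not part of the patented method.

```python
def lane_model(x, c2, c1, c0):
    """Evaluate f(x) = 0.5*c2*x^2 + c1*x + c0.

    c2: road curvature of the lane (1/m)
    c1: yaw angle between road center line and vehicle heading (rad, small-angle)
    c0: lateral offset between road center line and vehicle center line (m)
    x : longitudinal look-ahead distance (m)
    Returns the lateral coordinate of the lane model at distance x (m).
    """
    return 0.5 * c2 * x**2 + c1 * x + c0

# Example: gentle curve, small heading error, 0.2 m lateral offset
print(lane_model(20.0, c2=0.002, c1=0.01, c0=0.2))
```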
Fig. 1 shows a flowchart of a lane recognition compensation method according to an embodiment of the present invention. As shown in fig. 1, the method begins at step S110.
In step S110, lane information of a lane in which the vehicle is located is acquired. The lane information may be, but is not limited to, a lane image, etc. For example, the vehicle acquires a lane image of a lane ahead of the traveling direction of the vehicle by a monocular vision sensor such as a camera provided on the vehicle (refer to fig. 2).
In step S120, lane recognition parameters of the lane are determined according to the lane information, and the lane recognition parameters include a road curvature, a yaw angle, and a lateral deviation distance.
In one embodiment, lane information, such as a lane image, of a lane in which the vehicle is located is acquired through a visual sensor disposed on the vehicle, and the lane information, such as the lane image, is processed by using an image processing algorithm to obtain lane parameter information. The visual sensor may be, but is not limited to, a monocular visual sensor, a monocular camera, a sensor capable of acquiring lane information, and the like.
Road curvature is the rate of rotation of the tangent angle with respect to arc length at a point on the road. The road curvature at a given point is the reciprocal of the turning radius of the road at that point (road curvature = 1 / turning radius), and its unit is 1/m. It indicates how far the road deviates from a straight line and is an important factor in traffic safety: a vehicle passing through a curve at high speed can easily roll over or sideslip, causing serious traffic accidents, so fast and effective calculation of road curvature is of great significance in intelligent traffic systems. For the yaw angle calculation, referring to fig. 3, the two-dimensional vehicle coordinate system is defined as follows: the center of mass of the vehicle is the origin, the longitudinal center line of the vehicle body is the X axis with the forward direction of the vehicle head as positive, and the direction perpendicular to the X axis in the horizontal plane is the Y axis. The yaw angle is the included angle between the tangent direction of the road center line at the vehicle position and the X axis of the vehicle coordinate system; it can also be regarded as the angle between the heading direction and the tangent of the road center line at the vehicle position, positive when the vehicle yaws to the right and negative in the opposite direction. The lateral deviation distance is the distance by which the vehicle deviates from the road center line in the direction perpendicular to its driving direction, where the road center line is the virtual center line between the left and right lane lines of the same lane.
After the lane image is acquired, image enhancement is first performed, for example by histogram equalization, to adjust the gray levels of the lane image and highlight the lane lines. Lane line edges are then detected from the gray-value gradients between the lane lines and adjacent areas, with smoothing filtering and differential calculation of the image gray levels performed by a Sobel operator. Finally, the lane edge line is fitted to a quadratic function by the least-squares method to obtain c_2, c_1 and c_0.
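A compact sketch of this vision pipeline is given below, assuming OpenCV and NumPy are available; the edge threshold, the pixel-to-meter calibration factors and the region handling are illustrative assumptions, not values given in the patent.

```python
import cv2
import numpy as np

def fit_lane_model(gray_image):
    """Estimate (c2, c1, c0) from an 8-bit grayscale road image.

    Steps mirror the text: histogram equalization, Sobel edge
    detection, then a least-squares quadratic fit of edge points.
    Threshold and calibration factors below are assumptions.
    """
    enhanced = cv2.equalizeHist(gray_image)                  # image enhancement
    edges = cv2.Sobel(enhanced, cv2.CV_64F, 1, 0, ksize=3)   # horizontal gradient
    mask = np.abs(edges) > 100.0                             # assumed edge threshold

    ys, xs = np.nonzero(mask)                                # edge pixel coordinates
    if len(xs) < 3:
        return None                                          # recognition failed

    # Convert pixels to road coordinates (assumed calibration factors).
    x_long = (gray_image.shape[0] - ys) * 0.1                # meters ahead of vehicle
    y_lat = (xs - gray_image.shape[1] / 2) * 0.01            # meters left/right

    # Fit y_lat = 0.5*c2*x^2 + c1*x + c0 by least squares.
    a2, a1, a0 = np.polyfit(x_long, y_lat, 2)
    return 2.0 * a2, a1, a0                                  # c2, c1, c0
```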
In one embodiment, in clear weather the acquired lane images are generally clear and the finally output lane parameters are the lane identification parameters, whereas lane images acquired in haze, rain or snow are blurred and the lane identification parameters may not be obtainable after the lane images are processed.
In step S130, when the lane identification parameter cannot be determined, the vehicle position information and the driving speed information of the vehicle are determined by a positioning module provided on the vehicle, and the road curvature of the lane in which the vehicle is located is determined according to the vehicle position information and a preset road design information list. The road design information list at least includes the position information of the road marking point and the road design curvature corresponding to the marking point, and the driving speed information generally refers to the driving speed of the current position of the vehicle or the average driving speed in a certain period of time. The positioning module may be, but is not limited to, a GPS module, a navigator with positioning function, a computing device with positioning function, and the like.
In one embodiment, the road design information list is traversed, and the position information of the target road marking point consistent with the vehicle position information is determined from the road design information list according to the vehicle position information. And after the position information of the target road marking point is determined to exist in the road design information list, marking the road design curvature corresponding to the position information of the target road marking point as the road curvature of the lane where the vehicle is located.
The road marking points in the road design information list are selected at equal intervals along the lane direction, generally about 30 meters apart; for example, the marking point (x_k, y_k) corresponds to the road curvature c_{2,k}, where k is a positive integer. The road design information list may be loaded as a data table into a vehicle controller (ECU) in which the lane recognition system is installed. The road design information list is shown in Table 1 below:
TABLE 1 Road design information list

| Vehicle location coordinates | Road curvature |
| --- | --- |
| (x_1, y_1) | c_{2,1} |
| (x_2, y_2) | c_{2,2} |
| … | … |
| (x_k, y_k) | c_{2,k} |
| (x_{k+1}, y_{k+1}) | c_{2,k+1} |
In one embodiment, after determining that the target road marking point position information does not exist in the road design information list, calculating a weighted average value of the road design curvatures corresponding to two pieces of road marking point position information adjacent to the vehicle position information in the road design information list, and marking the weighted average value as the road curvature of the lane where the vehicle is located.
In one embodiment, the sampling time immediately before the lane parameter information can no longer be acquired is defined as the 0-th time, at which the vehicle coordinate position is (x(0), y(0)), and the real-time vehicle coordinate position obtained by, for example, GPS positioning is (x(i), y(i)), where the sampling time is defined as the least common multiple of the GPS sampling intervals. If the coordinates (x(i), y(i)) of the current vehicle position coincide with the coordinates of a marking point in the road design information list, the road curvature corresponding to those coordinates in the list is taken as the road curvature at the current vehicle position. If the vehicle position coordinates (x(i), y(i)) lie between the k-th and (k+1)-th adjacent marking points (x_k, y_k) and (x_{k+1}, y_{k+1}) in the road design information list, a distance-weighted average of the road curvatures corresponding to the two marking points is calculated as
c_2(i) = (d_{k+1} · c_{2,k} + d_k · c_{2,k+1}) / (d_k + d_{k+1})
where c_2(i) is the road curvature corresponding to the vehicle position at time i, d_k and d_{k+1} are the distances between the current vehicle position and the marking points (x_k, y_k) and (x_{k+1}, y_{k+1}) respectively, c_{2,k} is the road curvature corresponding to the k-th marking point in the road design information list, and c_{2,k+1} is the road curvature corresponding to the (k+1)-th marking point.
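As an illustration of this lookup, the sketch below (Python; the list layout, example values and matching tolerance are assumptions) first tries to match the GPS position against a marking point of Table 1 and otherwise interpolates between the two nearest points:

```python
import math

# Assumed form of the road design information list of Table 1:
# marking points ordered along the lane, each as ((x, y), curvature).
ROAD_DESIGN_LIST = [
    ((0.0, 0.0), 0.0010),
    ((30.0, 1.2), 0.0012),
    ((60.0, 3.1), 0.0015),
]

def road_curvature(x, y, tol=1.0):
    """Return the road curvature for vehicle position (x, y).

    tol is an assumed matching tolerance in meters for deciding that the
    vehicle position coincides with a marking point.
    """
    dists = [math.hypot(x - px, y - py) for (px, py), _ in ROAD_DESIGN_LIST]

    # Match (within tolerance) with the nearest marking point.
    k = min(range(len(dists)), key=dists.__getitem__)
    if dists[k] <= tol:
        return ROAD_DESIGN_LIST[k][1]

    # Otherwise interpolate between the two nearest adjacent marking points.
    k2 = k + 1 if k + 1 < len(dists) else k - 1
    d_k, d_k1 = dists[k], dists[k2]
    c_k, c_k1 = ROAD_DESIGN_LIST[k][1], ROAD_DESIGN_LIST[k2][1]
    return (d_k1 * c_k + d_k * c_k1) / (d_k + d_k1)

print(road_curvature(45.0, 2.0))  # vehicle between the 2nd and 3rd points
```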
In step S140, when the lane recognition parameter cannot be determined, the turning speed information of the vehicle is acquired by a combined inertial sensor arranged on the vehicle, and the yaw angle and the lateral deviation distance of the vehicle are calculated from the turning speed information and the running speed information. The turning speed information includes the yaw angular velocity, lateral acceleration and the like. The combined inertial sensor (Inertial Measurement Unit, IMU) comprises three single-axis accelerometers and three single-axis gyroscopes: the accelerometers detect the acceleration signals of the object along the three independent axes of the carrier coordinate system, and the gyroscopes detect the angular velocity signals of the carrier relative to the navigation coordinate system. By measuring the angular velocity and acceleration of the object in three-dimensional space, the attitude of the object is calculated from the angular velocity signals, which has important application value in navigation. In addition, the turning speed information may also be acquired by an inertial sensor provided in the vehicle body Electronic Stability Program (ESP). It should be understood, however, that the present invention is not limited by the instrument used to acquire the turning speed information; any sensor capable of acquiring the turning speed information falls within the scope of the present invention.
In one embodiment, the relative position information of the vehicle is determined from the yaw rate and the driving speed information. Based on the relative position information, a yaw angle and a lateral deviation distance of the vehicle are calculated using a specific set of equations. The relative position information refers to a position of the current position of the vehicle relative to an initial point, and the initial point may be a position of the vehicle at a certain time marked in the moving process of the vehicle.
The specific calculation procedure is as follows.
The position (x_V(i), y_V(i)) of the vehicle at the i-th time point relative to the 0-th time point is estimated from the yaw rate by accumulating, over each sampling interval T, the displacement travelled at the vehicle speed along the heading obtained by integrating the yaw rate, where:
x_V(i) is the abscissa of the vehicle position at the i-th moment in the coordinate system whose origin is the vehicle position at the 0-th moment;
y_V(i) is the ordinate of the vehicle position at the i-th moment in the same coordinate system;
v(i) is the vehicle speed at the i-th moment, i being a natural number;
γ(i) is the yaw rate at the i-th moment;
T is the least common multiple of the system sampling intervals, e.g., the sampling interval of the combined inertial sensor.
The following equation set is solved to obtain the lateral deviation distance and the yaw angle at the current moment, one equation of which is
y_L(i) = s · x_L(i) + (y_V(i) − s · x_V(i))
where:
the estimated yaw angle at time i and the estimated lateral offset distance at time i are the unknowns solved for;
s is a first intermediate variable in the process of calculating the estimated lateral deviation distance and the estimated yaw angle;
γ(i) is the yaw rate at the i-th moment;
T is the least common multiple of the system sampling intervals, e.g., the sampling interval of the combined inertial sensor;
x_V(i) is the abscissa of the vehicle position at the i-th moment in the coordinate system whose origin is the vehicle position at the 0-th moment;
y_V(i) is the ordinate of the vehicle position at the i-th moment in the same coordinate system;
m_1 is a fourth intermediate variable in the process of calculating the lateral deviation distance and the yaw angle;
m_2 is a fifth intermediate variable in the process of calculating the lateral deviation distance and the yaw angle.
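The patent's full equation set is not reproduced above, so the following Python sketch only illustrates one plausible way to update the yaw angle and lateral offset from the dead-reckoned pose, assuming the road center line is locally straight in the time-0 frame along the direction the vision system last reported; the straight-line assumption and all variable names are illustrative, not the patented formulas.

```python
import math

def compensate_lane_parameters(x_v, y_v, heading, c1_0, c0_0):
    """Estimate yaw angle and lateral offset at time i without camera data.

    x_v, y_v : dead-reckoned vehicle position at time i in the time-0 frame (m)
    heading  : dead-reckoned vehicle heading at time i in the time-0 frame (rad)
    c1_0     : yaw angle last reported by the vision system at time 0 (rad)
    c0_0     : lateral offset last reported by the vision system at time 0 (m)

    Assumption (illustrative only): the road center line is a straight line in
    the time-0 frame, passing through the point (0, c0_0) with direction c1_0.
    """
    # Yaw angle: angle between the center line direction and the current heading.
    c1_i = c1_0 - heading

    # Lateral offset: signed perpendicular distance from the vehicle to the line.
    ux, uy = math.cos(c1_0), math.sin(c1_0)   # unit vector along the center line
    wx, wy = x_v - 0.0, y_v - c0_0            # vector from a line point to the vehicle
    c0_i = ux * wy - uy * wx                  # 2D cross product = signed distance

    return c1_i, c0_i

# Example: after dead reckoning, the vehicle drifted 0.5 m laterally and yawed 0.02 rad.
print(compensate_lane_parameters(25.0, 0.5, 0.02, c1_0=0.0, c0_0=0.2))
```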
After the lane identification parameters, i.e. the road curvature, the yaw angle and the lateral deviation distance, are obtained, they may be transmitted to, for example, a driving decision system via a controller area network such as a CAN bus. CAN is the abbreviation of Controller Area Network, a serial communication protocol standardized by ISO. CAN belongs to the category of field buses and is a serial communication network that effectively supports distributed or real-time control. Compared with distributed control systems built on RS-485, a distributed control system based on the CAN bus has obvious advantages: data communication between the network nodes has strong real-time performance, the development period is short, and CAN is an internationally standardized field bus.
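As an illustration of transmitting the three parameters on a CAN bus, the sketch below uses the python-can package; the arbitration ID, the fixed-point scaling factors and the SocketCAN channel name are assumptions for illustration, since a real ECU interface would follow the vehicle's own message definitions.

```python
import struct
import can  # python-can package

def send_lane_parameters(bus, c2, c1, c0, arbitration_id=0x321):
    """Pack the lane identification parameters into one classic CAN frame.

    Scaling factors and the arbitration ID are assumed values; the payload
    encodes c2, c1, c0 as three signed 16-bit fixed-point integers (6 bytes).
    """
    payload = struct.pack(
        "<hhh",
        int(c2 * 1e5),    # curvature in 1e-5 1/m
        int(c1 * 1e4),    # yaw angle in 1e-4 rad
        int(c0 * 1e3),    # lateral offset in mm
    )
    msg = can.Message(arbitration_id=arbitration_id, data=payload,
                      is_extended_id=False)
    bus.send(msg)

# Example (assumes a SocketCAN interface named can0 is available):
# bus = can.interface.Bus(channel="can0", bustype="socketcan")
# send_lane_parameters(bus, c2=0.002, c1=0.01, c0=0.2)
```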
According to the technical scheme of the invention, when the lane identification parameters cannot be determined, information such as the turning speed information and position information during driving is obtained through the combined inertial sensor and the positioning module, and the lane identification parameters are calculated from it. This prevents the interruption of information caused by the inability to obtain the lane identification parameters from affecting the driving decisions of the vehicle, and improves the safety and reliability of the vehicle during driving.
Fig. 4 is a structural diagram illustrating a lane recognition compensating apparatus according to an embodiment of the present invention. As shown in fig. 4, the apparatus includes: a lane information acquisition unit 410, a parameter determination unit 420, a parameter calculation unit 430, and a transmission unit 440.
The lane information acquiring unit 410 is used for acquiring lane information of a lane in which the vehicle is located. The lane information acquiring unit 410 is further configured to acquire lane information of a lane in which the vehicle is located through a visual sensor disposed on the vehicle.
The parameter determining unit 420 is configured to determine lane identification parameters of the lane according to the lane information, where the lane identification parameters include a road curvature, a yaw angle, and a lateral deviation distance. The parameter determining unit 420 is further configured to process the lane information by using an image processing algorithm to obtain the lane identification parameter.
The parameter calculation unit 430 is configured to determine, by a positioning module disposed on the vehicle, driving speed information and vehicle position information of the vehicle when the lane identification parameter cannot be determined, and determine a road curvature of a lane where the vehicle is located according to the vehicle position information and a preset road design information list; the method comprises the steps of obtaining turning speed information of the vehicle through a combined inertial sensor arranged on the vehicle, and calculating a yaw angle and a lateral deviation distance of the vehicle through the turning speed information and the running speed information.
In one embodiment, the road design information list at least includes road marking point position information and a road design curvature corresponding to the marking point, and the parameter calculation unit 430 is further configured to traverse the road design information list, and determine target road marking point position information consistent with the vehicle position information from the road design information list according to the vehicle position information; and after the position information of the target road marking point is determined to exist in the road design information list, marking the road design curvature corresponding to the position information of the target road marking point as the road curvature of the lane where the vehicle is located.
In one embodiment, the parameter calculating unit 430 is further configured to, after determining that the target road marking point position information does not exist in the road design information list, calculate a weighted average of road design curvatures corresponding to two pieces of road marking point position information adjacent to the vehicle position information in the road design information list, and mark the weighted average as the road curvature of the lane where the vehicle is located.
In an embodiment, the turning speed information comprises yaw angular velocity, the parameter calculation unit 430 is further configured to determine relative position information of the vehicle from the yaw angular velocity and the travel speed information; calculating the yaw angle and the lateral deviation distance of the vehicle using a specific equation set according to the relative position information.
The transmission unit 440 is configured to transmit the road curvature, the yaw angle and the lateral deviation distance of the vehicle to a driving decision system.
The lane recognition compensation device provided by the embodiment of the invention can be specific hardware on equipment or software or firmware installed on the equipment. The device provided by the embodiment of the present invention has the same implementation principle and technical effect as the method embodiments, and for the sake of brief description, reference may be made to the corresponding contents in the method embodiments without reference to the device embodiments. It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the foregoing systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the embodiments provided in the present invention, it should be understood that the disclosed apparatus and method may be implemented in other ways. The above-described embodiments of the apparatus are merely illustrative, and for example, the division of the units is only one logical division, and there may be other divisions when actually implemented, and for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection of devices or units through some communication interfaces, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments provided by the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus once an item is defined in one figure, it need not be further defined and explained in subsequent figures, and moreover, the terms "first", "second", "third", etc. are used merely to distinguish one description from another and are not to be construed as indicating or implying relative importance.
Finally, it should be noted that: the above-mentioned embodiments are only specific embodiments of the present invention, which are used for illustrating the technical solutions of the present invention and not for limiting the same, and the protection scope of the present invention is not limited thereto, although the present invention is described in detail with reference to the foregoing embodiments, those skilled in the art should understand that: any person skilled in the art can modify or easily conceive the technical solutions described in the foregoing embodiments or equivalent substitutes for some technical features within the technical scope of the present disclosure; such modifications, changes or substitutions do not depart from the spirit and scope of the present invention in its spirit and scope. Are intended to be covered by the scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.
Claims (10)
1. A lane recognition compensation method, characterized in that the method comprises:
acquiring lane information of a lane where a vehicle is located;
determining lane identification parameters of the lane according to the lane information, wherein the lane identification parameters comprise road curvature, yaw angle and transverse deviation distance;
when the lane recognition parameter cannot be determined,
determining running speed information and vehicle position information of the vehicle through a positioning module arranged on the vehicle, and determining the road curvature of a lane where the vehicle is located according to the vehicle position information and a preset road design information list;
acquiring turning speed information of the vehicle through a combined inertial sensor arranged on the vehicle, and calculating the yaw angle and the lateral deviation distance of the vehicle through the turning speed information and the running speed information.
2. The method of claim 1, prior to said determining lane identification parameters for said lane from said lane information, comprising:
acquiring lane information of a lane where the vehicle is located through a visual sensor arranged on the vehicle;
determining lane identification parameters of the lane according to the lane information includes:
and processing the lane information by using an image processing algorithm to obtain the lane identification parameter.
3. The method as claimed in claim 1, wherein the road design information list at least includes road marking point position information and road design curvature corresponding to the marking point, and the determining the road curvature of the lane in which the vehicle is located according to the vehicle position information and the preset road design information list comprises:
traversing the road design information list, and determining the position information of the target road marking point consistent with the vehicle position information from the road design information list according to the vehicle position information;
and after the position information of the target road marking point is determined to exist in the road design information list, marking the road design curvature corresponding to the position information of the target road marking point as the road curvature of the lane where the vehicle is located.
4. The method according to claim 3, wherein, when determining target road marking point position information that coincides with the position information from the road design information list based on the vehicle position information, further comprising:
and after the target road marking point position information does not exist in the road design information list, calculating a weighted average value of road design curvatures corresponding to at least two pieces of road marking point position information adjacent to the vehicle position information in the road design information list, and marking the weighted average value as the road curvature of a lane where the vehicle is located.
5. The method of claim 1, wherein the turning speed information comprises yaw rate, the calculating a yaw angle and a lateral deviation distance of the vehicle from the turning speed information and the travel speed information comprising:
determining relative position information of the vehicle according to the yaw angular velocity and the running speed information;
calculating the yaw angle and the lateral deviation distance of the vehicle using a specific equation set according to the relative position information.
6. A lane recognition compensation apparatus, characterized in that the apparatus comprises:
the lane information acquisition unit is used for acquiring lane information of a lane where the vehicle is located;
the parameter determining unit is used for determining lane identification parameters of the lane according to the lane information, wherein the lane identification parameters comprise road curvature, yaw angle and transverse deviation distance;
the parameter calculation unit is used for determining the running speed information and the vehicle position information of the vehicle through a positioning module arranged on the vehicle when the lane identification parameter cannot be determined, and determining the road curvature of the lane where the vehicle is located according to the vehicle position information and a preset road design information list; acquiring turning speed information of the vehicle through a combined inertial sensor arranged on the vehicle, and calculating the yaw angle and the lateral deviation distance of the vehicle through the turning speed information and the running speed information.
7. The apparatus of claim 6, wherein the lane information acquiring unit is further configured to acquire lane information of a lane in which the vehicle is located through a visual sensor provided on the vehicle;
the parameter determining unit is further configured to process the lane information by using an image processing algorithm to obtain the lane identification parameter.
8. The apparatus of claim 6, wherein the road design information list includes at least road marking point position information and road design curvatures corresponding to the marking points, the parameter calculation unit is further configured to:
traversing the road design information list, and determining the position information of the target road marking point consistent with the vehicle position information from the road design information list according to the vehicle position information;
and after the position information of the target road marking point is determined to exist in the road design information list, marking the road design curvature corresponding to the position information of the target road marking point as the road curvature of the lane where the vehicle is located.
9. The apparatus of claim 8, wherein the parameter calculation unit is further to:
and after the target road marking point position information does not exist in the road design information list, calculating a weighted average value of the road design curvatures corresponding to the position information of two road marking points adjacent to the vehicle position information in the road design information list, and marking the weighted average value as the road curvature of the lane where the vehicle is located.
10. The apparatus of claim 6, wherein the turning speed information comprises yaw rate, the parameter calculation unit further to:
determining relative position information of the vehicle according to the yaw angular velocity and the running speed information;
calculating the yaw angle and the lateral deviation distance of the vehicle using a specific equation set according to the relative position information.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710267571 | 2017-04-21 | ||
CN2017102675715 | 2017-04-21 |
Publications (1)
Publication Number | Publication Date |
---|---|
CN107021104A true CN107021104A (en) | 2017-08-08 |
Family
ID=59527410
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710301941.2A Pending CN107021104A (en) | 2017-04-21 | 2017-05-02 | A kind of lane identification compensation method and device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN107021104A (en) |
- 2017-05-02 CN CN201710301941.2A patent/CN107021104A/en active Pending
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20120314055A1 (en) * | 2011-06-08 | 2012-12-13 | Toyota Jidosha Kabushiki Kaisha | Lane departure prevention support apparatus, method of displaying a lane boundary line and program |
CN103101535A (en) * | 2011-11-09 | 2013-05-15 | 现代摩比斯株式会社 | Apparatus for compensating vehicle lane of lane keeping assistance system and method |
CN104029676A (en) * | 2013-03-05 | 2014-09-10 | 通用汽车环球科技运作有限责任公司 | Vehicle Lane Determination |
CN106323308A (en) * | 2015-07-02 | 2017-01-11 | 集奥数字国际有限公司 | Attributed roadway trajectories for self-driving vehicles |
Cited By (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2019052567A1 (en) * | 2017-09-18 | 2019-03-21 | 中车株洲电力机车研究所有限公司 | Virtual turnout system and method for virtual rail vehicle |
CN109460739A (en) * | 2018-11-13 | 2019-03-12 | 广州小鹏汽车科技有限公司 | Method for detecting lane lines and device |
CN110221324A (en) * | 2019-05-28 | 2019-09-10 | 上海车轮互联网服务有限公司 | Data processing method used for positioning and device |
CN110221324B (en) * | 2019-05-28 | 2021-04-27 | 上海车轮互联网服务有限公司 | Data processing method and device for positioning |
WO2021047275A1 (en) * | 2019-09-09 | 2021-03-18 | 华为技术有限公司 | Method and device for determining traffic-lane line information |
CN111674393A (en) * | 2020-05-12 | 2020-09-18 | 坤泰车辆系统(常州)有限公司 | Control method for long-curve running of automatic driving system with lane centering auxiliary function |
CN111674393B (en) * | 2020-05-12 | 2021-12-07 | 坤泰车辆系统(常州)有限公司 | Control method for long-curve running of automatic driving system with lane centering auxiliary function |
CN114537446A (en) * | 2022-03-28 | 2022-05-27 | 重庆长安汽车股份有限公司 | Lane dividing method and storage medium for target vehicle behind vehicle |
CN114537446B (en) * | 2022-03-28 | 2024-09-24 | 重庆长安汽车股份有限公司 | Lane dividing method and storage medium for target vehicle behind host vehicle |
CN114913500A (en) * | 2022-07-12 | 2022-08-16 | 福思(杭州)智能科技有限公司 | Pose determination method and device, computer equipment and storage medium |
CN115782926A (en) * | 2022-12-29 | 2023-03-14 | 苏州市欧冶半导体有限公司 | Vehicle motion prediction method and device based on road information |
CN115782926B (en) * | 2022-12-29 | 2023-12-22 | 苏州市欧冶半导体有限公司 | Vehicle motion prediction method and device based on road information |
Legal Events
| Date | Code | Title | Description |
| --- | --- | --- | --- |
| | PB01 | Publication | |
| | SE01 | Entry into force of request for substantive examination | |
| | RJ01 | Rejection of invention patent application after publication | Application publication date: 20170808 |