CN116400362A - Driving boundary detection method, device, storage medium and equipment - Google Patents

Driving boundary detection method, device, storage medium and equipment

Info

Publication number
CN116400362A
CN116400362A (application CN202310674370.2A)
Authority
CN
China
Prior art keywords
previous frame
boundary
value
obstacle
obstacle distance
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion.)
Granted
Application number
CN202310674370.2A
Other languages
Chinese (zh)
Other versions
CN116400362B (en)
Inventor
罗宇亮
陈春光
李冬磊
黄以佳
彭易锦
Current Assignee
GAC Aion New Energy Automobile Co Ltd
Original Assignee
GAC Aion New Energy Automobile Co Ltd
Priority date
Filing date
Publication date
Application filed by GAC Aion New Energy Automobile Co Ltd filed Critical GAC Aion New Energy Automobile Co Ltd
Priority to CN202310674370.2A
Publication of CN116400362A
Application granted
Publication of CN116400362B
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 15/00 Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
    • G01S 15/88 Sonar systems specially adapted for specific applications
    • G01S 15/93 Sonar systems specially adapted for specific applications for anti-collision purposes
    • G01S 15/931 Sonar systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C 21/26 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C 21/34 Route searching; Route guidance
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 17/00 Digital computing or data processing equipment or methods, specially adapted for specific functions
    • G06F 17/10 Complex mathematical operations
    • G06F 17/15 Correlation function computation including computation of convolution operations
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/22 Matching criteria, e.g. proximity measures
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02T CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T 10/00 Road transport of goods or passengers
    • Y02T 10/10 Internal combustion engine [ICE] based vehicles
    • Y02T 10/40 Engine management systems

Abstract

The application discloses a driving boundary detection method, device, storage medium and equipment. The method detects the boundary with ultrasonic sensors mounted on both sides of a vehicle to obtain an obstacle distance value and a confidence coefficient; corrects the obstacle distance value of the previous frame using the obstacle distance values of the current frame and the frame before last together with the confidence coefficient of the previous frame; converts the point coordinates of the corresponding obstacle from the vehicle coordinate system of the current frame to the world coordinate system to obtain the coordinates of boundary points; and, when the number of extracted boundary points reaches a preset threshold, performs curve fitting with the coordinates of the boundary points to generate a boundary curve. In this way, high-precision boundary detection can be achieved under poor light, on uphill and downhill slopes, in curved channels and the like, without support from expensive equipment, so the cost is low.

Description

Driving boundary detection method, device, storage medium and equipment
Technical Field
The application relates to the technical field of automatic driving, in particular to a driving boundary detection method, a driving boundary detection device, a storage medium and driving boundary detection equipment.
Background
Driving boundary detection is an important research direction in the field of intelligent automobiles. For automatic driving in particular, the vehicle must plan its path according to the boundary of its drivable area in order to complete automatic driving safely. Current driving boundary detection is mainly realized by visual detection or laser radar detection. In visual detection, image data are acquired by vehicle-mounted side cameras and the boundary is then detected from the image data by a deep learning model; in laser radar detection, laser point cloud data are collected and the boundary is then detected by a segmentation algorithm. However, visual detection often suffers from false detection or low precision under poor light or on uphill and downhill slopes, while laser radar detection requires high equipment cost, making it difficult to meet the requirements of most vehicles. Therefore, there is a need in the market for a boundary detection scheme that remains highly accurate under poor light or on slopes while keeping the cost low.
Disclosure of Invention
The invention aims to provide a driving boundary detection method, device, storage medium and equipment, so as to solve the problems that driving boundary detection in the related art often suffers from false detection or low precision under poor light or on uphill and downhill slopes, or requires high equipment cost, making it difficult to meet the requirements of most vehicles.
In a first aspect, the driving boundary detection method provided in the present application includes: acquiring boundary data detected by ultrasonic sensors mounted on both sides of a vehicle, where the ultrasonic sensor generates a preset number of frames of boundary data every second and the boundary data include an obstacle distance value and a confidence coefficient; correcting the obstacle distance value of the previous frame according to the obstacle distance values of the current frame and the frame before last (i.e., two frames before the current frame) and the confidence coefficient of the previous frame, converting the point coordinates of the obstacle corresponding to the corrected boundary data of the previous frame from the vehicle coordinate system of the current frame to the world coordinate system, and storing the converted point coordinates as coordinates of boundary points; and, when the number of stored boundary points reaches a preset threshold, performing curve fitting with the coordinates of all boundary points to generate a boundary curve.
In the implementation process, the ultrasonic sensors mounted on both sides of the vehicle detect the boundary to obtain an obstacle distance value and a confidence coefficient; the obstacle distance value of the previous frame is corrected using the obstacle distance values of the current frame and the frame before last and the confidence coefficient of the previous frame; the point coordinates of the corresponding obstacle are transferred from the vehicle coordinate system of the current frame to the world coordinate system to obtain the coordinates of boundary points; and when the number of extracted boundary points reaches a preset threshold, curve fitting is performed with the coordinates of the boundary points to generate a boundary curve. In this way, high-precision boundary detection can be achieved under poor light, on uphill and downhill slopes, in curved channels and the like, without support from expensive equipment, so the cost is low.
Further, in some embodiments, the obstacle distance value is a distance of the obstacle perpendicular to the direction of the vehicle body; the obstacle distance value is calculated based on the distance value detected by the ultrasonic sensor and the signal angle; the signal angle characterizes an installation angle of the ultrasonic sensor on a vehicle.
In the implementation process, the distance of the obstacle perpendicular to the vehicle body direction is obtained from the distance value detected by the ultrasonic sensor combined with the signal angle obtained in the sensor calibration test, laying a good data foundation for subsequent boundary detection.
Further, in some embodiments, correcting the obstacle distance value of the previous frame according to the obstacle distance values of the current frame and the frame before last and the confidence coefficient of the previous frame includes: comparing the obstacle distance values of the current frame and the frame before last to obtain a larger value and a smaller value; if the difference between the obstacle distance value of the previous frame and the preset distance threshold is greater than the larger value, correcting the obstacle distance value of the previous frame based on the confidence coefficient of the previous frame and the larger value; and if the sum of the obstacle distance value of the previous frame and the preset distance threshold is smaller than the smaller value, correcting the obstacle distance value of the previous frame based on the confidence coefficient of the previous frame and the smaller value.
In the implementation process, whether the boundary data of the previous frame is abnormally convex or concave is judged from the boundary data of the frame before last, the previous frame and the current frame, and when the judgment is yes, the abnormal data is corrected, so that the extracted boundary points are smoother.
Further, in some embodiments, the obstacle distance value in the corrected boundary data of the previous frame is calculated based on the following formula:

d' = c · d + (1 − c) · M

where d' is the corrected obstacle distance value of the previous frame; d is the obstacle distance value of the previous frame before correction; c is the confidence coefficient of the previous frame; if the difference between the obstacle distance value of the previous frame and the preset distance threshold is greater than the larger value, M is the larger value; and if the sum of the obstacle distance value of the previous frame and the preset distance threshold is smaller than the smaller value, M is the smaller value.
In the implementation process, a specific way of correcting the obstacle distance value is provided.
Further, in some embodiments, correcting the obstacle distance value of the previous frame according to the obstacle distance values of the current frame and the frame before last and the confidence coefficient of the previous frame further includes: if the difference between the obstacle distance value of the previous frame and the preset distance threshold is not greater than the larger value, and the sum of the obstacle distance value of the previous frame and the preset distance threshold is not smaller than the smaller value, setting M in the above formula to the obstacle distance value of the previous frame before correction.
In the above implementation, when it is confirmed that the boundary data of the previous frame is neither abnormally convex nor abnormally concave, M is taken as the original obstacle distance value, so that the corrected obstacle distance value is consistent with the original one (since c · d + (1 − c) · d = d). Thus all cases of the boundary data of the previous frame can be processed with the same formula, which improves processing efficiency.
Further, in some embodiments, converting the point coordinates of the obstacle corresponding to the corrected boundary data of the previous frame from the vehicle coordinate system of the current frame into the world coordinate system includes: converting the point coordinates of the obstacle corresponding to the corrected boundary data of the previous frame in the vehicle coordinate system of the current frame into the point coordinates of the obstacle in the world coordinate system according to the vehicle pose data, the external parameter matrix of the ultrasonic sensor and the external parameter matrix of the vehicle.
In the implementation process, according to the vehicle pose data, the coordinate transformation relation of the vehicle coordinate system of the current frame relative to the world coordinate system is determined by combining the sensor external parameters and the vehicle external parameters, and then according to the coordinate transformation relation, the point coordinates in the vehicle coordinate system of the current frame are converted into the point coordinates in the world coordinate system, so that the coordinate conversion is realized.
Further, in some embodiments, before the curve fitting using the coordinates of all the boundary points, the method includes: and converting the coordinates of all the boundary points into the current vehicle coordinate system through the current vehicle pose data.
In the implementation process, the stored boundary points are transferred to the current vehicle coordinate system according to the pose information of the current vehicle, and curve fitting is carried out by using the coordinates of the boundary points, so that the generated boundary curve can correspond to the current vehicle coordinate system, and the accuracy and the efficiency of automatic driving path planning are improved.
In a second aspect, the present application provides a driving boundary detection apparatus, including: an acquisition module for acquiring boundary data detected by ultrasonic sensors mounted on both sides of the vehicle, where the ultrasonic sensor generates a preset number of frames of boundary data every second and the boundary data include an obstacle distance value and a confidence coefficient; a conversion module for correcting the obstacle distance value of the previous frame using the obstacle distance values of the current frame and the frame before last and the confidence coefficient of the previous frame, converting the point coordinates of the obstacle corresponding to the corrected boundary data of the previous frame from the vehicle coordinate system of the current frame into the world coordinate system, and storing the converted point coordinates as coordinates of boundary points; and a generating module for performing curve fitting with the coordinates of all boundary points when the number of stored boundary points reaches a preset threshold, so as to generate a boundary curve.
In a third aspect, the present application provides an electronic device, including: a memory, a processor and a computer program stored in the memory and executable on the processor, the processor implementing the steps of the method according to any one of the first aspects when the computer program is executed.
In a fourth aspect, the present application provides a computer readable storage medium having instructions stored thereon, which when run on a computer, cause the computer to perform the method according to any of the first aspects.
In a fifth aspect, the present application provides a computer program product which, when run on a computer, causes the computer to perform the method according to any one of the first aspects.
Additional features and advantages of the disclosure will be set forth in the description which follows, or in part will be obvious from the description, or may be learned by practice of the techniques disclosed herein.
In order to make the above objects, features and advantages of the present application more comprehensible, preferred embodiments accompanied with figures are described in detail below.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed in the embodiments of the present application will be briefly described below, it should be understood that the following drawings only illustrate some embodiments of the present application and should not be considered as limiting the scope, and other related drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a flowchart of a driving boundary detection method provided in an embodiment of the present application;
FIG. 2 is a schematic diagram of an ultrasonic sensor deployment scenario based on an ultrasonic signal boundary detection scheme provided in an embodiment of the present application;
fig. 3 is a schematic diagram of an effect of final road boundary generation according to a boundary detection scheme based on an ultrasonic signal according to an embodiment of the present application;
fig. 4 is a block diagram of a driving boundary detection device provided in an embodiment of the present application;
fig. 5 is a block diagram of an electronic device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described below with reference to the drawings in the embodiments of the present application.
It should be noted that: like reference numerals and letters denote like items in the following figures, and thus once an item is defined in one figure, no further definition or explanation thereof is necessary in the following figures. Meanwhile, in the description of the present application, the terms "first", "second", and the like are used only to distinguish the description, and are not to be construed as indicating or implying relative importance.
As described in the background art, driving boundary detection methods in the related art often suffer from false detection or low precision under poor light or on uphill and downhill slopes, or require high equipment cost, making it difficult to meet the requirements of most vehicles. Based on this, the embodiment of the application provides a new driving boundary detection scheme to solve the above problems.
The embodiments of the present application are described below:
as shown in fig. 1, fig. 1 is a flowchart of a driving boundary detection method provided in an embodiment of the present application, where the method may be applied to a controller on a vehicle, such as a whole vehicle controller (Vehicle Control Unit, VCU) or a domain controller (Domain Control Unit, DCU), and may also be applied to a server that establishes a communication connection with the vehicle.
The method comprises the following steps:
in step 101, boundary data detected by ultrasonic sensors mounted on both sides of a vehicle are acquired; the ultrasonic sensor generates boundary data of a preset frame number every second, wherein the boundary data comprises obstacle distance values and confidence degrees;
the ultrasonic sensor, also called ultrasonic sensor, is a sensor which converts ultrasonic signals into other energy signals (usually electric signals), and mainly comprises a transmitter, a receiver, a control part and a power supply part, wherein when the ultrasonic sensor works, the transmitter can emit ultrasonic waves with a certain frequency, the ultrasonic waves are transmitted to the surface of a measured object through air, the ultrasonic waves reflected by the surface of the measured object are received by the receiver and converted into the electric signals, and the control part can calculate parameters such as the distance, the position and the like of the measured object according to the propagation time and the speed of the ultrasonic waves.
In the scheme of this embodiment, an ultrasonic sensor is mounted on each of the left and right sides of the vehicle. The sensors on both sides detect in real time while the vehicle is running; if a boundary exists within the detection range, the sensor signal gives a corresponding distance value together with a confidence coefficient whose value range is [0, 1]. The obstacle distance value can be obtained from the distance value given in the sensor signal, and one obstacle distance value with its corresponding confidence coefficient forms one group of boundary data; assuming the working frequency of the ultrasonic sensor is f hertz, the sensor generates f groups of boundary data within one second. The preset number of frames referred to in this step can be obtained from the working frequency of the ultrasonic sensor, one frame corresponding to the time in which the ultrasonic sensor generates one group of boundary data.
In some embodiments, the obstacle distance value referred to in this step is the distance of the obstacle perpendicular to the vehicle body direction; it is calculated from the distance value detected by the ultrasonic sensor and the signal angle, where the signal angle characterizes the installation angle of the ultrasonic sensor on the vehicle. In practical applications, the signal emission direction of an ultrasonic sensor mounted on a vehicle may not be perpendicular to the vehicle body direction, and the distance value detected by the sensor is actually the distance between the obstacle and the sensor. Therefore, a signal angle, that is, the angle of the sensor's signal emission direction relative to the direction perpendicular to the vehicle body, is obtained from the calibration test of the ultrasonic sensor, and the distance of the obstacle perpendicular to the vehicle body direction, that is, the obstacle distance value, is then calculated from the detected distance value and the signal angle using a trigonometric relation. Of course, when the signal emission direction of the ultrasonic sensor is perpendicular to the vehicle body direction, the signal angle is 0 and the obstacle distance value is simply the distance value detected by the sensor. This lays a good data foundation for subsequent boundary detection.
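A minimal sketch of this projection, assuming the signal angle is measured from the direction perpendicular to the vehicle body (as the zero-angle case above implies; the names are illustrative):

```python
import math

def perpendicular_distance(detected_distance: float, signal_angle_rad: float) -> float:
    """Project the raw sensor range onto the direction perpendicular to the body.

    detected_distance: range reported by the ultrasonic sensor, in meters.
    signal_angle_rad: calibrated angle between the sensor's emission direction
        and the perpendicular to the vehicle body, in radians; 0 means the
        sensor fires straight out to the side.
    """
    return detected_distance * math.cos(signal_angle_rad)

# With a 10 degree mounting skew, a 4.50 m echo maps to about 4.43 m.
print(perpendicular_distance(4.50, math.radians(10)))
```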
In step 102, the obstacle distance value of the previous frame is corrected according to the obstacle distance values of the current frame and the frame before last and the confidence coefficient of the previous frame; the point coordinates of the obstacle corresponding to the corrected boundary data of the previous frame are converted from the vehicle coordinate system of the current frame to the world coordinate system, and the converted point coordinates are stored as coordinates of boundary points;
when detecting a boundary, if the extracted boundary points are not smooth, the fitted boundary curves may have larger differences, so in the scheme of the embodiment, the abnormal convex or concave points are screened and corrected through the boundary data at adjacent moments, and the filtering function is realized, so that the boundary curves generated later are more accurate. The first frame refers to a last frame before the current time, for example, assuming that a time interval for generating a set of boundary data by the ultrasonic sensor is 10ms, the boundary data acquired by the ultrasonic sensor at a time point corresponding to 10ms before the current time point is the boundary data of the previous frame, and the boundary data acquired by the ultrasonic sensor at a time point corresponding to 20ms before the current time point is the boundary data of the previous frame.
Specifically, in some embodiments, correcting the obstacle distance value of the previous frame using the obstacle distance values of the current frame and the frame before last and the confidence coefficient of the previous frame may include: comparing the obstacle distance values of the current frame and the frame before last to obtain a larger value and a smaller value; if the difference between the obstacle distance value of the previous frame and the preset distance threshold is greater than the larger value, correcting the obstacle distance value of the previous frame based on the confidence coefficient of the previous frame and the larger value; and if the sum of the obstacle distance value of the previous frame and the preset distance threshold is smaller than the smaller value, correcting the obstacle distance value of the previous frame based on the confidence coefficient of the previous frame and the smaller value. That is, if the difference obtained by subtracting the preset distance threshold from the obstacle distance value of the previous frame is greater than the larger of the obstacle distance values of the current frame and the frame before last, or the sum of the obstacle distance value of the previous frame and the preset distance threshold is smaller than the smaller of those two values, the boundary data of the previous frame is confirmed to be abnormally convex or concave and is then corrected. For example, if the boundary data of the frame before last is (4.5 m, 0.95), the boundary data of the previous frame is (4.2 m, 0.80) and the boundary data of the current frame is (4.7 m, 0.95), comparison gives a larger value of 4.7 and a smaller value of 4.5. With a preset distance threshold of 0.2, since 4.5 > (4.2 + 0.2), the boundary data of the previous frame is judged to be abnormal (concave) data, and it is corrected using the confidence coefficient of the previous frame and the smaller value, so that the extracted boundary points are smoother. The preset distance threshold may be set differently according to the requirements of different scenes, which is not limited in this application.
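The screening condition can be sketched as follows (a hypothetical helper; threshold plays the role of the preset distance threshold):

```python
def classify_previous_frame(d_before_last: float, d_prev: float,
                            d_current: float, threshold: float) -> str:
    """Label the previous frame's obstacle distance as convex, concave or normal."""
    larger = max(d_before_last, d_current)
    smaller = min(d_before_last, d_current)
    if d_prev - threshold > larger:
        return "convex"   # sticks out beyond both neighbouring frames
    if d_prev + threshold < smaller:
        return "concave"  # dips below both neighbouring frames
    return "normal"

# The worked example above: 4.2 + 0.2 = 4.4 < 4.5, so the frame is concave.
print(classify_previous_frame(4.5, 4.2, 4.7, 0.2))
```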
Further, in some embodiments, the corrected obstacle distance value of the previous frame may be calculated based on the following formula:

d' = c · d + (1 − c) · M

where d' is the corrected obstacle distance value of the previous frame; d is the obstacle distance value of the previous frame before correction; c is the confidence coefficient of the previous frame; if the difference between the obstacle distance value of the previous frame and the preset distance threshold is greater than the larger value, M is the larger value; and if the sum of the obstacle distance value of the previous frame and the preset distance threshold is smaller than the smaller value, M is the smaller value. Following the previous example, M is the smaller value 4.5, so the corrected obstacle distance value of the previous frame is 0.80 × 4.2 + 0.20 × 4.5 = 4.26 m.
Still further, in some embodiments, if the difference between the obstacle distance value of the previous frame and the preset distance threshold is not greater than the larger value, and the sum of the obstacle distance value of the previous frame and the preset distance threshold is not smaller than the smaller value, M is taken to be the obstacle distance value of the previous frame before correction. That is, when it is confirmed that the boundary data of the previous frame is neither abnormally convex nor abnormally concave, M is set to the original obstacle distance value, so that the corrected obstacle distance value is consistent with the original one. Thus all cases of the boundary data of the previous frame can be processed with the same formula, which improves processing efficiency.
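Putting the three cases together, the whole correction step can be sketched as below. The blend direction, weighting the original measurement by its confidence and the neighbouring value by the remainder, is an assumption consistent with the passthrough case just described, in which M equal to the original value leaves the distance unchanged:

```python
def correct_previous_frame(d_before_last: float, d_prev: float, c_prev: float,
                           d_current: float, threshold: float) -> float:
    """Correct the previous frame's obstacle distance value.

    d_before_last, d_current: obstacle distances of the neighbouring frames (m).
    d_prev, c_prev: obstacle distance (m) and confidence in [0, 1] of the
        previous frame, the one being checked.
    threshold: preset distance threshold (m).
    """
    larger = max(d_before_last, d_current)
    smaller = min(d_before_last, d_current)
    if d_prev - threshold > larger:       # abnormally convex
        m = larger
    elif d_prev + threshold < smaller:    # abnormally concave
        m = smaller
    else:                                 # normal: the formula returns d_prev
        m = d_prev
    return c_prev * d_prev + (1.0 - c_prev) * m

# Worked example from the text: 0.80 * 4.2 + 0.20 * 4.5 = 4.26 m.
print(correct_previous_frame(4.5, 4.2, 0.80, 4.7, 0.2))
```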
After the boundary data of the previous frame is corrected, the point coordinates of the corresponding obstacle in the vehicle coordinate system of the previous frame can be obtained from the boundary data and the installation position of the ultrasonic sensor on the vehicle. The vehicle coordinate system is established at the center point of the rear axle of the vehicle, with the X axis pointing forward along the direction of travel and the Y axis pointing to the right side of the vehicle. Next, the rotation and translation parameters of the vehicle coordinate system of the current frame relative to the vehicle coordinate system of the previous frame can be obtained from the GPS (Global Positioning System) and IMU (Inertial Measurement Unit) mounted on the vehicle; based on these parameters, the point coordinates of the obstacle corresponding to the boundary data of the previous frame in the vehicle coordinate system of the current frame can be obtained. Considering that the vehicle is likely to be driving throughout the process of accumulating boundary points, the vehicle coordinate system changes accordingly; therefore, the point coordinates in the vehicle coordinate system of the current frame are transferred to the world coordinate system and stored as the coordinates of boundary points, which avoids the computation of updating the stored point coordinates in real time every time the vehicle coordinate system changes.
In some embodiments, the converting, in the step, the point coordinates of the obstacle corresponding to the boundary data of the corrected previous frame in the vehicle coordinate system of the current frame into the world coordinate system may include: and converting the point coordinates of the obstacle corresponding to the boundary data of the corrected previous frame under the vehicle coordinate system of the current frame into the point coordinates of the obstacle under the world coordinate system according to the vehicle pose data, the external parameter matrix of the ultrasonic sensor and the external parameter matrix of the vehicle. That is, according to vehicle pose data such as the position and heading angle of the vehicle, a coordinate transformation relation of the vehicle coordinate system of the current frame relative to the world coordinate system is determined by combining the sensor external parameters and the vehicle external parameters, and then according to the coordinate transformation relation, point coordinates in the vehicle coordinate system of the current frame are converted into point coordinates in the world coordinate system, so that coordinate conversion is realized.
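A minimal 2D sketch of this conversion, representing the pose as a position plus heading angle; a full implementation would also fold in the external parameter matrices of the sensor and the vehicle, which are omitted here for brevity:

```python
import math

def vehicle_to_world(x_v: float, y_v: float,
                     veh_x: float, veh_y: float, veh_yaw: float) -> tuple:
    """Transform a point from the current vehicle frame to the world frame.

    The vehicle frame has X forward and Y to the right, as defined in the
    text; (veh_x, veh_y, veh_yaw) is the vehicle pose in the world frame,
    with yaw measured counter-clockwise from the world X axis.
    """
    x_w = veh_x + x_v * math.cos(veh_yaw) + y_v * math.sin(veh_yaw)
    y_w = veh_y + x_v * math.sin(veh_yaw) - y_v * math.cos(veh_yaw)
    return (x_w, y_w)

# An obstacle 4.26 m to the right of a vehicle at (100.0, 50.0) heading
# along the world X axis lands at (100.0, 45.74).
print(vehicle_to_world(0.0, 4.26, 100.0, 50.0, 0.0))
```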
In step 103, when the number of stored boundary points reaches a preset threshold, curve fitting is performed using the coordinates of all boundary points to generate a boundary curve.
In other words, after a period of acquisition a boundary point sequence is obtained, and when the number of boundary points in the sequence is detected to be greater than or equal to the preset threshold, curve fitting is performed with the coordinates of all boundary points to obtain a polynomial curve equation y = f(x). From this polynomial curve equation, besides the boundary points themselves, the boundary extending forward can also be predicted, i.e., the boundary curve at this point in time. When the driving environment is dim or the curvature is large, for example when the vehicle drives at low speed on an uphill or downhill slope or in an arc-shaped channel with dark light and obvious boundaries (such as walls, road edges, etc.), the generated boundary curve can guide the planning and control link and automatic driving can be completed. The preset threshold here may be set according to the accuracy requirement, which is not limited in this application.
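A sketch of the fitting step using NumPy's polynomial fit; the cubic degree and the point threshold are illustrative choices, not values fixed by the patent:

```python
import numpy as np

N_MIN_POINTS = 30  # preset threshold on the number of boundary points (assumed)
POLY_DEGREE = 3    # order of the fitted polynomial curve (assumed)

def fit_boundary(points_xy: np.ndarray):
    """Fit a polynomial y = f(x) through the accumulated boundary points.

    points_xy: array of shape (n, 2) holding boundary point coordinates in
        the current vehicle frame. Returns the polynomial coefficients,
        highest power first, or None if too few points have been stored.
    """
    if len(points_xy) < N_MIN_POINTS:
        return None
    return np.polyfit(points_xy[:, 0], points_xy[:, 1], POLY_DEGREE)

# The boundary ahead can then be predicted with np.polyval, e.g.:
# coeffs = fit_boundary(points); y_ahead = np.polyval(coeffs, x_newest + 5.0)
```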
In practical applications, automatic driving path planning is usually carried out in the current own vehicle coordinate system. Therefore, in some embodiments, before curve fitting is performed with the coordinates of all boundary points, the method may include: converting the coordinates of all boundary points into the current vehicle coordinate system using the current vehicle pose data. That is, the stored boundary points are transferred into the current vehicle coordinate system according to the current pose information of the vehicle, and curve fitting is then performed with these coordinates, so that the generated boundary curve corresponds to the current vehicle coordinate system, improving the accuracy and efficiency of automatic driving path planning.
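Converting the stored points back is simply the inverse of the transform sketched earlier, under the same pose conventions:

```python
import math

def world_to_vehicle(x_w: float, y_w: float,
                     veh_x: float, veh_y: float, veh_yaw: float) -> tuple:
    """Transform a stored world-frame boundary point into the current
    vehicle frame (X forward, Y to the right), inverting vehicle_to_world."""
    dx, dy = x_w - veh_x, y_w - veh_y
    x_v = dx * math.cos(veh_yaw) + dy * math.sin(veh_yaw)  # forward component
    y_v = dx * math.sin(veh_yaw) - dy * math.cos(veh_yaw)  # rightward component
    return (x_v, y_v)

# Recovers the earlier example point: (0.0, 4.26).
print(world_to_vehicle(100.0, 45.74, 100.0, 50.0, 0.0))
```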
According to the method, the boundary is detected by the ultrasonic sensors mounted on both sides of the vehicle to obtain an obstacle distance value and a confidence coefficient; the obstacle distance value of the previous frame is corrected using the obstacle distance values of the current frame and the frame before last and the confidence coefficient of the previous frame; the point coordinates of the corresponding obstacle are transferred from the vehicle coordinate system of the current frame to the world coordinate system to obtain the coordinates of boundary points; and when the number of extracted boundary points reaches a preset threshold, curve fitting is performed with the coordinates of the boundary points to generate a boundary curve. In this way, high-precision boundary detection can be achieved under poor light, on uphill and downhill slopes, in curved channels and the like, without support from expensive equipment, so the cost is low.
For a more detailed description of the solution of the present application, a specific embodiment is described below:
the embodiment provides a boundary detection scheme based on ultrasonic signals. In this scheme, as shown in fig. 2, four short-distance ultrasonic sensors 22 are respectively installed on the front and rear sides of a vehicle 21, and one long-distance ultrasonic sensor 23 is respectively installed on the left and right sides, and the detection range is K meters. The sensor signal gives a corresponding distance value and carries a confidence, and the confidence has a value range of [0,1].
The boundary detection flow in the scheme comprises the following steps:
s201, detecting ultrasonic sensors at two sides in real time during running of a vehicle, and obtaining an obstacle distance value and a confidence degree at the moment from the ultrasonic sensors, wherein the obstacle distance value and the confidence degree are marked as distance and confidence and are called dc data; assuming that the working frequency of the sensor is f hertz, generating f groups of dc data by the sensor within one second;
s202, obtaining a signal angle according to a sensor calibration test
Figure SMS_21
The distance +.of the obstacle perpendicular to the direction of the vehicle body can be calculated>
Figure SMS_22
S203, a vehicle coordinate system is established at the center point of the rear axle of the vehicle, with the X axis pointing forward along the direction of travel and the Y axis pointing to the right side of the vehicle; in this way, whenever a boundary exists within the detection range of K meters while the vehicle is running, a group of dc values is obtained, and the point coordinate p_car of the corresponding obstacle in the own vehicle coordinate system can be derived;
s204, the data stored by the previous step
Figure SMS_24
And the corresponding confidence, the process includes: taking the first two points p_car [ i-2 ]]、p_car[i-1]And the current point p_car [ i ]]According to a preset threshold value threshold, a debounce algorithm is adopted for p_car [ i-1 ]]Judging whether it is an abnormal convex or concave point, and correcting it if the judgment is yes, specifically, correcting p_car [ i-2 ]]And p_car [ i ]]Two->
Figure SMS_27
The larger value of (2) is denoted as max and the smaller value is denoted as min, if +.>
Figure SMS_29
Then for p_car [ i-1 ]]Is->
Figure SMS_25
Its corrected value
Figure SMS_26
The method comprises the steps of carrying out a first treatment on the surface of the If it is
Figure SMS_28
Then for p_car [ i-1 ]]Is->
Figure SMS_30
Its corrected value->
Figure SMS_23
Then, according to the vehicle pose data and the ultrasonic sensor and vehicle external parameter matrix, turning the point coordinate corresponding to the calDistance to the world coordinate system to obtain p_world;
s205, acquiring a boundary point p_cal and a sequence p_cal_list through a period of acquisition; when N p-world boundary points are detected to be greater than or equal to N, converting the stored N p-world boundary points into a current own vehicle coordinate system through pose information of a vehicle at present, and performing curve fitting by using the N boundary points to obtain a polynomial curve equation
Figure SMS_31
The method comprises the steps of carrying out a first treatment on the surface of the From this polynomial curve equation, in addition to the boundary points, defects extending forward are also predictable, i.e. the boundary curve at this point in time.
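Run over the whole point stream, the debounce of S204 can be sketched as follows, reusing the correction rule assumed above:

```python
def debounce(cal_distances: list, confidences: list, threshold: float) -> list:
    """Apply the S204 debounce to a sequence of calDistance values.

    Each interior value at index i-1 is checked against its neighbours at
    i-2 and i, and blended toward the nearer neighbour bound when abnormal.
    """
    corrected = list(cal_distances)
    for i in range(2, len(cal_distances)):
        lo = min(cal_distances[i - 2], cal_distances[i])
        hi = max(cal_distances[i - 2], cal_distances[i])
        d, c = cal_distances[i - 1], confidences[i - 1]
        if d - threshold > hi:                  # abnormally convex
            corrected[i - 1] = c * d + (1 - c) * hi
        elif d + threshold < lo:                # abnormally concave
            corrected[i - 1] = c * d + (1 - c) * lo
    return corrected

# The running example: the middle value 4.2 is lifted to 4.26.
print(debounce([4.5, 4.2, 4.7], [0.95, 0.80, 0.95], 0.2))
```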
In this embodiment, the effect of the final road boundary generation is shown in fig. 3. When the vehicle drives at low speed on an uphill or downhill slope or in an arc-shaped channel with dark light and obvious boundaries (such as walls, road edges, etc.), the generated boundary curve can guide the planning and control link and automatic driving can be completed.
Corresponding to the embodiment of the method, the application also provides an embodiment of the driving boundary detection device and the terminal applied by the driving boundary detection device:
as shown in fig. 4, fig. 4 is a block diagram of a driving boundary detection device provided in an embodiment of the present application, where the device includes:
an acquisition module 41 for acquiring boundary data detected by ultrasonic sensors mounted on both sides of the vehicle; the ultrasonic sensor generates boundary data of a preset frame number every second, wherein the boundary data comprises obstacle distance values and confidence degrees;
the conversion module 42 is configured to correct the obstacle distance value of the previous frame according to the obstacle distance values of the current frame and the previous two frames and the confidence coefficient of the previous frame, convert the point coordinates of the obstacle corresponding to the boundary data of the corrected previous frame in the vehicle coordinate system of the current frame into the world coordinate system, and store the converted point coordinates as coordinates of the boundary points;
and the generating module 43 is configured to perform curve fitting by using coordinates of all the boundary points when the number of the stored boundary points reaches a preset threshold value, so as to generate a boundary curve.
The implementation process of the functions and roles of each module in the above device is specifically shown in the implementation process of the corresponding steps in the above method, and will not be described herein again.
The application further provides an electronic device, please refer to fig. 5, and fig. 5 is a block diagram of an electronic device according to an embodiment of the application. The electronic device may include a processor 510, a communication interface 520, a memory 530, and at least one communication bus 540. Wherein the communication bus 540 is used to enable direct connection communication for these components. The communication interface 520 of the electronic device in the embodiment of the present application is used for performing signaling or data communication with other node devices. Processor 510 may be an integrated circuit chip with signal processing capabilities.
The processor 510 may be a general-purpose processor, including a central processing unit (CPU, Central Processing Unit), a network processor (NP, Network Processor), etc.; it may also be a digital signal processor (DSP), an application specific integrated circuit (ASIC), a field programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components. The methods, steps and logic blocks disclosed in the embodiments of the present application may be implemented or performed by it. A general-purpose processor may be a microprocessor, or the processor 510 may be any conventional processor or the like.
The memory 530 may be, but is not limited to, a random access memory (RAM, Random Access Memory), a read-only memory (ROM, Read-Only Memory), a programmable read-only memory (PROM, Programmable Read-Only Memory), an erasable programmable read-only memory (EPROM, Erasable Programmable Read-Only Memory), an electrically erasable programmable read-only memory (EEPROM, Electrically Erasable Programmable Read-Only Memory), and the like. The memory 530 stores computer readable instructions which, when executed by the processor 510, cause the electronic device to perform the steps of the method embodiment of fig. 1 described above.
Optionally, the electronic device may further include a storage controller, an input-output unit.
The memory 530, the memory controller, the processor 510, the peripheral interface, and the input/output unit are electrically connected directly or indirectly to each other, so as to realize data transmission or interaction. For example, the elements may be electrically coupled to each other via one or more communication buses 540. The processor 510 is configured to execute executable modules stored in the memory 530, such as software functional modules or computer programs included in the electronic device.
The input-output unit is used for interaction between the user and the server, for example allowing the user to create a task and to set an optional start period or a preset execution time for the task. The input/output unit may be, but is not limited to, a mouse, a keyboard, and the like.
It will be appreciated that the configuration shown in fig. 5 is merely illustrative, and that the electronic device may also include more or fewer components than shown in fig. 5, or have a different configuration than shown in fig. 5. The components shown in fig. 5 may be implemented in hardware, software, or a combination thereof.
The embodiment of the application further provides a storage medium on which instructions are stored; when the instructions are run on a computer, the stored computer program is executed by a processor so as to implement the method described in the method embodiment. To avoid repetition, details are not described here again.
The present application also provides a computer program product which, when run on a computer, causes the computer to perform the method of the method embodiments.
In the several embodiments provided in this application, it should be understood that the disclosed apparatus and method may be implemented in other manners as well. The apparatus embodiments described above are merely illustrative, for example, flow diagrams and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of apparatus, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
In addition, the functional modules in the embodiments of the present application may be integrated together to form a single part, or each module may exist alone, or two or more modules may be integrated to form a single part.
The functions, if implemented in the form of software functional modules and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present application may be embodied essentially or in a part contributing to the prior art or in a part of the technical solution, in the form of a software product stored in a storage medium, including several instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to perform all or part of the steps of the methods described in the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a random access Memory (RAM, random Access Memory), a magnetic disk, or an optical disk, or other various media capable of storing program codes.
The foregoing is merely exemplary embodiments of the present application and is not intended to limit the scope of the present application; various modifications and variations may occur to those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principles of the present application shall be included in the protection scope of the present application.
The foregoing is merely specific embodiments of the present application, but the scope of the present application is not limited thereto, and any person skilled in the art can easily think about changes or substitutions within the technical scope of the present application, and the changes and substitutions are intended to be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.
It is noted that relational terms such as first and second, and the like are used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Moreover, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising one … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.

Claims (10)

1. A driving boundary detection method, characterized by comprising:
acquiring boundary data detected by ultrasonic sensors arranged on two sides of a vehicle; the ultrasonic sensor generates boundary data of a preset frame number every second, wherein the boundary data comprises obstacle distance values and confidence degrees;
correcting the obstacle distance value of the previous frame according to the obstacle distance values of the current frame and the frame before last and the confidence coefficient of the previous frame, converting the point coordinates of the obstacle corresponding to the corrected boundary data of the previous frame from the vehicle coordinate system of the current frame to the world coordinate system, and storing the converted point coordinates as coordinates of boundary points;
when the number of the stored boundary points reaches a preset threshold value, curve fitting is performed by utilizing the coordinates of all the boundary points, and a boundary curve is generated.
2. The method of claim 1, wherein the obstacle distance value is a distance of an obstacle perpendicular to a vehicle body direction; the obstacle distance value is calculated based on the distance value detected by the ultrasonic sensor and the signal angle; the signal angle characterizes an installation angle of the ultrasonic sensor on a vehicle.
3. The method of claim 2, wherein the correcting the obstacle distance value of the previous frame according to the obstacle distance values of the current frame and the frame before last and the confidence coefficient of the previous frame comprises:
comparing the obstacle distance values of the current frame and the frame before last to obtain a larger value and a smaller value;
if the difference value between the obstacle distance value of the previous frame and the preset distance threshold is larger than the larger value, correcting the obstacle distance value of the previous frame based on the confidence coefficient of the previous frame and the larger value;
and if the sum of the obstacle distance value of the previous frame and the preset distance threshold is smaller than the smaller value, correcting the obstacle distance value of the previous frame based on the confidence of the previous frame and the smaller value.
4. The method according to claim 3, wherein the obstacle distance value in the corrected boundary data of the previous frame is calculated based on the following formula:

d' = c · d + (1 − c) · M

wherein d' is the corrected obstacle distance value of the previous frame; d is the obstacle distance value of the previous frame before correction; c is the confidence coefficient of the previous frame; if the difference between the obstacle distance value of the previous frame and the preset distance threshold is greater than the larger value, M is the larger value; and if the sum of the obstacle distance value of the previous frame and the preset distance threshold is smaller than the smaller value, M is the smaller value.
5. The method of claim 4, wherein correcting the obstacle distance value of the previous frame according to the obstacle distance values of the current frame and the frame before last and the confidence coefficient of the previous frame further comprises:
if the difference between the obstacle distance value of the previous frame and the preset distance threshold is not greater than the larger value, and the sum of the obstacle distance value of the previous frame and the preset distance threshold is not smaller than the smaller value, determining M to be the obstacle distance value of the previous frame before correction.
6. The method according to claim 1, wherein converting the point coordinates of the obstacle corresponding to the boundary data of the corrected previous frame in the vehicle coordinate system of the current frame into the world coordinate system includes:
and converting the point coordinates of the obstacle corresponding to the boundary data of the corrected previous frame under the vehicle coordinate system of the current frame into the point coordinates of the obstacle under the world coordinate system according to the vehicle pose data, the external parameter matrix of the ultrasonic sensor and the external parameter matrix of the vehicle.
7. The method according to claim 1, wherein, before curve fitting is performed using the coordinates of all boundary points, the method further comprises:
and converting the coordinates of all the boundary points into the current vehicle coordinate system through the current vehicle pose data.
8. A traffic boundary detection device, characterized by comprising:
the acquisition module is used for acquiring boundary data detected by ultrasonic sensors arranged on two sides of the vehicle; the ultrasonic sensor generates boundary data of a preset frame number every second, wherein the boundary data comprises obstacle distance values and confidence degrees;
the conversion module is used for correcting the obstacle distance value of the previous frame according to the obstacle distance values of the current frame and the frame before last and the confidence coefficient of the previous frame, converting the point coordinates of the obstacle corresponding to the corrected boundary data of the previous frame from the vehicle coordinate system of the current frame into the world coordinate system, and storing the converted point coordinates as coordinates of boundary points;
and the generating module is used for performing curve fitting by utilizing the coordinates of all the boundary points when the number of the stored boundary points reaches a preset threshold value, so as to generate a boundary curve.
9. A computer readable storage medium, characterized in that a computer program is stored thereon, which computer program, when being executed by a processor, implements the method according to any of claims 1 to 7.
10. An electronic device comprising a processor, a memory and a computer program stored on the memory and executable on the processor, wherein the processor implements the method of any one of claims 1 to 7 when the computer program is executed by the processor.
CN202310674370.2A 2023-06-08 2023-06-08 Driving boundary detection method, device, storage medium and equipment Active CN116400362B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310674370.2A CN116400362B (en) 2023-06-08 2023-06-08 Driving boundary detection method, device, storage medium and equipment

Publications (2)

Publication Number Publication Date
CN116400362A (en) 2023-07-07
CN116400362B (en) 2023-08-08

Family

ID=87010938

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310674370.2A Active CN116400362B (en) 2023-06-08 2023-06-08 Driving boundary detection method, device, storage medium and equipment

Country Status (1)

Country Link
CN (1) CN116400362B (en)


Patent Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001066361A (en) * 1999-08-30 2001-03-16 Denso Corp Calculating device and correcting device for center axis deflection quantity of obstacle detecting device for vehicle, and neutral learning device and inter-vehicle interval controller
CN101900814A (en) * 2010-07-09 2010-12-01 深圳市豪恩电子科技股份有限公司 Reversing radar system and detection method
CN110573905A (en) * 2017-04-28 2019-12-13 株式会社电装 Obstacle detection device
CN108931246A (en) * 2017-05-26 2018-12-04 杭州海康机器人技术有限公司 A kind of method and apparatus for the barrier existing probability detecting unknown position
WO2019058507A1 (en) * 2017-09-22 2019-03-28 三菱電機株式会社 Obstacle detection device and obstacle detection method
CN108007451A (en) * 2017-11-10 2018-05-08 未来机器人(深圳)有限公司 Detection method, device, computer equipment and the storage medium of cargo carrying device pose
CN112105953A (en) * 2018-11-26 2020-12-18 华为技术有限公司 Obstacle detection method and device
CN112084810A (en) * 2019-06-12 2020-12-15 杭州海康威视数字技术股份有限公司 Obstacle detection method and device, electronic equipment and storage medium
CN111257893A (en) * 2020-01-20 2020-06-09 珠海上富电技股份有限公司 Parking space detection method and automatic parking method
CN113552575A (en) * 2021-07-16 2021-10-26 铁将军汽车电子股份有限公司 Parking obstacle detection method and device
CN115113189A (en) * 2022-06-30 2022-09-27 东风汽车有限公司东风日产乘用车公司 Method, device and equipment for calibrating obstacle detection system and storage medium
CN115540896A (en) * 2022-12-06 2022-12-30 广汽埃安新能源汽车股份有限公司 Path planning method, path planning device, electronic equipment and computer readable medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
张奇, 顾伟康 (Zhang Qi, Gu Weikang): "Research on a real-time obstacle detection algorithm for laser range-finding radar distance images and error analysis", Robot (机器人), no. 02, pages 122-133 *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117698711A (en) * 2024-02-06 2024-03-15 江苏日盈电子股份有限公司 Intelligent automobile radar ranging control system based on Internet of things
CN117698711B (en) * 2024-02-06 2024-04-26 江苏日盈电子股份有限公司 Intelligent automobile radar ranging control system based on Internet of things

Also Published As

Publication number Publication date
CN116400362B (en) 2023-08-08

Similar Documents

Publication Publication Date Title
CN110867132B (en) Environment sensing method, device, electronic equipment and computer readable storage medium
WO2018212346A1 (en) Control device, scanning system, control method, and program
CN109191487B (en) Unmanned vehicle-based collision detection method, device, equipment and storage medium
CN116400362B (en) Driving boundary detection method, device, storage medium and equipment
US10269245B2 (en) Server, system, and method for determining a position of an end of a traffic jam
EP4089659A1 (en) Map updating method, apparatus and device
JP2018180735A (en) Operation range determination device
CN112748421B (en) Laser radar calibration method based on automatic driving of straight road section
US11912293B2 (en) Method, system, and computer program product for determining a blockage of a sensor of a plurality of sensors of an ego vehicle
CN112147651B (en) Asynchronous multi-vehicle cooperative target state robust estimation method
GB2576206A (en) Sensor degradation
US20220128377A1 (en) Travel route setting system, travel route setting method, and program
KR20170068937A (en) Autonomous driving vehicle navigation system using the tunnel lighting
JP2019184566A (en) Vehicle and vehicle position estimation device
US20190257936A1 (en) Apparatus and method for determining a speed of a vehicle
EP3712642B1 (en) Light signal detection device, range finding device, and detection method
JP2019069734A (en) Vehicle control device
CN110986966B (en) Automatic driving positioning method and system for long-distance tunnel
CN110426714B (en) Obstacle identification method
WO2022098516A1 (en) Systems and methods for radar false track mitigation with camera
CN109827610B (en) Method and device for verifying sensor fusion result
CN114396958B (en) Lane positioning method and system based on multiple lanes and multiple sensors and vehicle
CN112835029A (en) Unmanned-vehicle-oriented multi-sensor obstacle detection data fusion method and system
CN113325415B (en) Fusion method and system of vehicle radar data and camera data
JP2016115211A (en) Position recognition method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant