CN110068814B - Method and device for measuring distance of obstacle - Google Patents


Info

Publication number
CN110068814B
CN110068814B (application CN201910238525.1A)
Authority
CN
China
Prior art keywords
radar
obstacle
coordinate system
distance
vehicle
Prior art date
Legal status
Active
Application number
CN201910238525.1A
Other languages
Chinese (zh)
Other versions
CN110068814A (en)
Inventor
张时嘉
Current Assignee
Neusoft Reach Automotive Technology Shenyang Co Ltd
Original Assignee
Neusoft Reach Automotive Technology Shenyang Co Ltd
Priority date
Filing date
Publication date
Application filed by Neusoft Reach Automotive Technology Shenyang Co Ltd filed Critical Neusoft Reach Automotive Technology Shenyang Co Ltd
Priority to CN201910238525.1A priority Critical patent/CN110068814B/en
Publication of CN110068814A publication Critical patent/CN110068814A/en
Application granted granted Critical
Publication of CN110068814B publication Critical patent/CN110068814B/en


Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S 13/86 Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G01S 13/867 Combination of radar systems with cameras
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S 13/88 Radar or analogous systems specially adapted for specific applications
    • G01S 13/93 Radar or analogous systems specially adapted for specific applications for anti-collision purposes
    • G01S 13/931 Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00 Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D 1/02 Control of position or course in two dimensions
    • G05D 1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D 1/0212 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
    • G05D 1/0214 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory in accordance with safety or protection criteria, e.g. avoiding hazardous areas
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00 Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D 1/02 Control of position or course in two dimensions
    • G05D 1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D 1/0231 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D 1/0246 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00 Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D 1/02 Control of position or course in two dimensions
    • G05D 1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D 1/0257 Control of position or course in two dimensions specially adapted to land vehicles using a radar

Abstract

The application provides a method for measuring the distance to an obstacle, which comprises: acquiring a position frame of the obstacle in a shot image, the shot image being a real-time image shot by a camera on the vehicle in the current driving direction of the vehicle; converting the coordinate system of the radar into the coordinate system of the shot image; acquiring the radar point cloud area corresponding to the position frame of the obstacle in the radar coordinate system; and clustering the data points in the radar point cloud area according to their distance to the origin of the radar coordinate system to obtain the distance between the vehicle and the obstacle. By this method, the distance between the vehicle and the obstacle can be obtained accurately, improving safety during automatic driving. The application also provides a corresponding device for measuring the distance to an obstacle.

Description

Method and device for measuring distance of obstacle
Technical Field
The application relates to the technical field of vehicle control, and in particular to a method and a device for measuring the distance to an obstacle.
Background
Automatic driving is currently an important technology in the motor-vehicle field and a popular research direction for many manufacturers. It relies on the cooperation of artificial intelligence, visual computing, radar, monitoring devices, a global positioning system and the like, so that the vehicle-mounted computer can operate the motor vehicle automatically and safely without any active human operation. Obstacle recognition, as an important link in automatic driving, directly affects safety during the automatic driving process.
Existing automatic driving technology obtains the distance between the vehicle and an obstacle in front of the current road by radar. However, when several obstacles exist in an area and partially occlude one another, the radar obtains only the position of the obstacle closest to the vehicle and ignores the other, partially occluded obstacles, which leaves a safety hazard in the vehicle's driving strategy during automatic driving. For example, when a pedestrian is partially hidden by a barrier (or a green belt, a building wall, a sign, or another facility), the vehicle's radar may identify only the barrier on the right side as the obstacle in that direction and ignore the pedestrian. The distance between the vehicle and the pedestrian then cannot be obtained, creating safety hazards in the vehicle's driving strategy, such as route selection, driving speed and braking time.
Disclosure of Invention
In order to solve the above technical problems in the prior art, the application provides a method and a device for measuring the distance to an obstacle, which can accurately obtain the distance between a vehicle and an obstacle and improve safety during automatic driving.
An embodiment of the application provides a method for measuring the distance to an obstacle, which comprises the following steps:
acquiring a position frame of an obstacle in a shot image; the shot image is a real-time image shot by a camera on the vehicle in the current driving direction of the vehicle;
converting a coordinate system of the radar into a coordinate system of the shot image;
acquiring a radar point cloud area corresponding to the position frame of the obstacle in the radar coordinate system;
and clustering the data points according to the distance between the data points in the radar point cloud area and the origin of the coordinate system of the radar to obtain the distance between the vehicle and the obstacle.
Optionally, the converting the coordinate system of the radar into the coordinate system of the shot image includes:
converting the coordinate system of the radar into the coordinate system of the camera according to the relative position of the camera and the radar;
and converting the coordinate system of the camera into the coordinate system of the shot image.
Optionally, the acquiring a radar point cloud area corresponding to the position frame of the obstacle in the radar coordinate system specifically includes:
determining a radar point cloud area corresponding to the position frame of the obstacle in the radar coordinate system;
and reserving the radar point cloud area corresponding to the position frame of the obstacle and cutting off other radar point cloud areas.
Optionally, the clustering the data points according to the distance between the data point in the radar point cloud area and the origin of the coordinate system of the radar to obtain the distance between the vehicle and the obstacle specifically includes:
clustering data points in the radar point cloud area according to the distance between the data points and the origin of the coordinate system of the radar to obtain the number of the obstacles;
and acquiring the distance between the vehicle and each obstacle.
Optionally, the method further includes:
and acquiring the projection distance of the obstacle in the current driving direction of the vehicle according to the distance between the vehicle and the obstacle and the angle of the position frame deviating from the current driving direction of the vehicle.
An embodiment of the present application further provides a device for measuring the distance to an obstacle, the device comprising: a first acquisition unit, a coordinate conversion unit, a second acquisition unit and a third acquisition unit;
the first acquisition unit is used for acquiring a position frame of an obstacle in a shot image; the shot image is a real-time image shot by a camera on the vehicle in the current driving direction of the vehicle;
the coordinate conversion unit is used for converting a coordinate system of the radar into a coordinate system of the shot image;
the second acquisition unit is used for acquiring a radar point cloud area corresponding to the position frame of the obstacle in the radar coordinate system;
the third obtaining unit is configured to cluster the data points according to a distance between the data point in the radar point cloud area and an origin of a coordinate system of the radar to obtain a distance between the vehicle and the obstacle.
Optionally, the coordinate conversion unit specifically includes: a first conversion subunit and a second conversion subunit;
the first conversion subunit is configured to convert the coordinate system of the radar into the coordinate system of the camera according to the relative position of the camera and the radar;
and the second conversion subunit is used for converting the coordinate system of the camera into the coordinate system of the shot image.
Optionally, the second obtaining unit specifically includes: a position determining subunit and a clipping subunit;
the position determining subunit is configured to determine a radar point cloud area corresponding to the position frame of the obstacle in the radar coordinate system;
and the cutting subunit is used for reserving the radar point cloud area corresponding to the position frame of the obstacle and cutting off other radar point cloud areas.
Optionally, the third obtaining unit specifically includes: a number determination subunit and a distance acquisition subunit;
the number determining subunit is configured to cluster the data points according to a distance between the data point in the radar point cloud area and an origin of a coordinate system of the radar to obtain the number of the obstacles;
the distance obtaining subunit is configured to obtain a distance between the vehicle and each obstacle.
Optionally, the apparatus further comprises: a fourth acquisition unit;
the fourth obtaining unit is configured to obtain a projection distance of the obstacle in the current driving direction of the vehicle according to the distance between the vehicle and the obstacle and an angle of the position frame deviating from the current driving direction of the vehicle.
The method has the following advantages:
the application provides a method for measuring obstacle distance, which comprises the following steps: acquiring a position frame of an obstacle in a shot image, and determining the direction of the obstacle relative to the vehicle in the shot image; the shot image is a real-time image shot by a camera on the vehicle in the current driving direction of the vehicle; converting a coordinate system of a radar into a coordinate system of a shot image to obtain the position of a point cloud in the radar coordinate system in the coordinate system of the shot image; acquiring a radar point cloud area corresponding to the position frame of the obstacle in the radar coordinate system; clustering data points in the radar point cloud area according to the distance between the data points in the radar point cloud area and the origin of the coordinate system of the radar to obtain the distance between the vehicle and the obstacle, wherein the obstacle exists in the point cloud area, the radar represents the obstacle by using point cloud consisting of a plurality of data points, the data points representing the same obstacle are used as the same class, the distances from the data points in the radar point cloud area to the origin of the coordinate system on the coordinate system of the radar are the same, and the distance of the class is the distance between the vehicle and the obstacle. By the method, the distance between the vehicle and the obstacle can be accurately obtained, and safety and fluency in the automatic driving process are improved.
Drawings
In order to illustrate the embodiments of the present application or the technical solutions in the prior art more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description show only some embodiments described in the present application, and other drawings can be obtained from them by those skilled in the art without creative effort.
Fig. 1 is a flowchart of a method for measuring a distance to an obstacle according to an embodiment of the present disclosure;
fig. 2 is a schematic view of a scene of a method for measuring a distance to an obstacle according to an embodiment of the present disclosure;
fig. 3 is a schematic view of another scenario of a method for measuring a distance to an obstacle according to an embodiment of the present application;
fig. 4 is a flowchart of another method for measuring a distance to an obstacle according to the second embodiment of the present application;
fig. 5 is a schematic scene diagram of a method for measuring a distance to an obstacle according to a second embodiment of the present application;
fig. 6 is a schematic view of an apparatus for measuring a distance to an obstacle according to a second embodiment of the present disclosure;
fig. 7 is a schematic view of another apparatus for measuring a distance to an obstacle according to the second embodiment of the present application.
Detailed Description
In order to make the technical solutions of the present application better understood, the technical solutions in the embodiments of the present application are described below clearly and completely with reference to the accompanying drawings. Obviously, the described embodiments are only a part of the embodiments of the present application, not all of them. All other embodiments obtained by a person skilled in the art from the embodiments given herein without creative effort shall fall within the protection scope of the present application.
The first embodiment is as follows:
the embodiment of the present application provides a method for measuring a distance between obstacles, which is specifically described below with reference to the accompanying drawings.
Referring to fig. 1, which is a flowchart illustrating a method for measuring a distance to an obstacle according to an embodiment of the present disclosure.
Fig. 2 is a schematic view of a scene of a method for measuring a distance to an obstacle according to an embodiment of the present application.
The method comprises the following steps:
s101: a position frame of an obstacle in a captured image is acquired.
The shot image is a real-time image shot by a camera on the vehicle in the current driving direction of the vehicle. The shot image is input into a preset deep-learning neural network to judge whether an obstacle is present; if so, the obstacle is enclosed in a rectangular frame. Referring to the scene shown in fig. 2, the position frame is the smallest rectangular frame that completely contains the obstacle. The number of obstacles is not specifically limited, so a plurality of position frames may be acquired from one shot image.
In addition, in one possible scene there is partial occlusion between multiple obstacles: for example, two obstacles exist in front of the vehicle and the first occludes the second, as in the scene shown in fig. 3. Because the shot image is a two-dimensional image, it contains only the position of an obstacle in the coordinate system of the shot image and no information about the distance between the obstacle and the vehicle, so one position frame may contain several obstacles. The number of obstacles in a position frame is not specifically limited in the present application.
It should be noted that the camera may be disposed at the top of the vehicle or at the rear windshield of the vehicle, for capturing real-time images during forward and reverse movement, and may also be disposed at other suitable positions; this is not specifically limited in the present application.
S102: and converting the coordinate system of the radar into the coordinate system of the shot image.
The coordinate system of the shot image is a two-dimensional coordinate system established on the shot image. In one possible implementation, it takes the midpoint of the shot image as the coordinate origin and establishes a plane rectangular coordinate system with the horizontal and vertical directions as the two coordinate axes.
The radar coordinate system is a three-dimensional coordinate system established on the radar. In one possible implementation, it takes the radar as the coordinate origin and establishes a spatial rectangular coordinate system with the horizontal direction, the vertical direction and the current advancing direction of the vehicle as the three coordinate axes.
Since the shot image is two-dimensional and contains only the position of the obstacle in the coordinate system of the shot image, the distance between the obstacle and the vehicle cannot be determined from the shot image alone. The radar point cloud obtained by scanning is distributed in three-dimensional space and contains the distance between the obstacle and the vehicle, so the point cloud corresponding to the obstacle can be selected from the scanned point cloud according to the obstacle's position frame in the shot image. Because the coordinate system of the shot image is two-dimensional while the coordinate system of the radar is three-dimensional, the coordinate system of the radar must first be converted into the coordinate system of the shot image.
The following describes the coordinate transformation process specifically:
a: and converting the coordinate system of the radar into the coordinate system of the camera according to the relative position of the camera and the radar.
b: and converting the coordinate system of the camera into the coordinate system of the shot image.
One method of implementing the coordinate transformation is described below:
The camera is used as the origin of the coordinate system of the real physical environment. In one possible implementation, the camera is taken as the origin of the camera's coordinate system, and a plane rectangular coordinate system is established with the horizontal and vertical directions as the two coordinate axes.
The coordinate system of the shot image and the coordinate system of the camera are both two-dimensional, and the conversion between them is a proportional scaling: points on the coordinate system of the shot image can be mapped onto the coordinate system of the camera. For a point P(x, y) on the coordinate system of the shot image, let its mapping on the coordinate system of the camera be P1(x1, y1); then:

x1 = k·x,  y1 = k·y    (1)

In formula (1), k is a proportionality coefficient determined by the performance parameters of the camera. The value of k can be measured in advance, and it is fixed while the working state of the camera is fixed.
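A minimal sketch of the scaling in formula (1). The function name, signature and the assumption that k has been measured in advance are illustrative, not from the patent:

```python
def image_to_camera(x, y, k):
    # Map a point P(x, y) on the shot image's coordinate system to its
    # mapping P1(x1, y1) on the camera's coordinate system, per formula (1).
    # k is the camera's proportionality coefficient, measured in advance.
    return k * x, k * y
```
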
Firstly, an auxiliary three-dimensional rectangular coordinate system is established with the camera as the coordinate origin and with the horizontal direction, the vertical direction and the advancing direction of the vehicle as the coordinate axes. The point P1(x1, y1) on the camera's original two-dimensional coordinate system is mapped to P2(x1, y1, z1) in the auxiliary three-dimensional rectangular coordinate system, where z1 is the projected distance between the camera and point P1 in the advancing direction of the vehicle.
After the positions of the camera and the radar on the vehicle are determined, the relative position of the camera and the radar is also determined. The position (x0, y0, z0) of the radar in the auxiliary three-dimensional rectangular coordinate system represents the relative position of the radar and the camera, where x0, y0 and z0 are known. P2(x1, y1, z1) in the auxiliary three-dimensional rectangular coordinate system is mapped to P3(x2, y2, z2) in the coordinate system of the radar, where z2 is the projected distance between the radar and point P3 in the advancing direction of the vehicle:
x2=x1-x0 (2)
y2=y1-y0 (3)
z2=z1-z0 (4)
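The translation in formulas (2)-(4) can be sketched as follows; the function name and tuple-based interface are illustrative:

```python
def camera_to_radar(p2, radar_origin):
    # Translate P2(x1, y1, z1) in the auxiliary three-dimensional
    # coordinate system into P3(x2, y2, z2) in the radar's coordinate
    # system, per formulas (2)-(4). radar_origin is the radar's position
    # (x0, y0, z0) in the auxiliary coordinate system.
    x1, y1, z1 = p2
    x0, y0, z0 = radar_origin
    return (x1 - x0, y1 - y0, z1 - z0)
```
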
the position of the vehicle on which the radar is mounted is not particularly limited, and in one possible implementation, the radar may be mounted at the forefront and the rearmost of the vehicle for detecting obstacles on the vehicle travel path in forward and reverse, respectively. In addition, the coordinate conversion process may be completed on a vehicle-mounted computer of the vehicle, or may be completed remotely at a terminal, which is not specifically limited in this application.
The coordinate transformation may also be implemented by other methods, which are not specifically limited in this application.
S103: and acquiring a radar point cloud area corresponding to the position frame of the obstacle in the radar coordinate system.
After the coordinate conversion is completed, the radar point cloud area corresponding to the points in the position frame, as mapped into the radar coordinate system, is acquired; that is, the horizontal and vertical position of the radar point cloud area in the radar coordinate system is determined.
Because the point cloud obtained by radar scanning is distributed in three-dimensional space, many non-obstacle objects may also be scanned by the radar. Therefore, to improve the efficiency of the subsequent clustering of data points, the point cloud obtained by radar scanning can be cropped.
In one possible implementation, after cropping, only the radar point cloud area corresponding to the position frame of the obstacle is retained, which removes the influence of radar returns from other, non-obstacle objects. As shown in fig. 2, the area bounded by the dashed lines connecting the four vertices of the obstacle's position frame with the radar is the radar point cloud area corresponding to the position frame.
In another possible implementation, the point cloud obtained by radar scanning is cropped according to the shooting angle range of the camera, so that the retained radar point cloud area is the camera's shooting area. This cropping can be completed in advance once the shooting angle of the camera is confirmed. If the obstacle is shot by the camera, the radar point cloud area corresponding to its position frame necessarily lies inside the cropped point cloud area; the influence of radar returns from non-obstacle objects is removed, and the time for cropping the radar point cloud is saved.
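The first cropping variant, retaining only points that project into the position frame, can be sketched as follows. The data layout (each radar point paired with its precomputed image-plane projection) is an assumption for illustration:

```python
def crop_point_cloud(projected_points, frame):
    # Keep only the radar data points whose mapping into the shot image's
    # coordinate system falls inside the obstacle's position frame.
    # projected_points: iterable of ((u, v), point3d) pairs, where (u, v)
    # is the point's projection in image coordinates.
    # frame: (u_min, v_min, u_max, v_max) of the position frame.
    u_min, v_min, u_max, v_max = frame
    return [p for (u, v), p in projected_points
            if u_min <= u <= u_max and v_min <= v <= v_max]
```
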
S104: and clustering the data points according to the distance between the data points in the radar point cloud area and the origin of the coordinate system of the radar to obtain the distance between the vehicle and the obstacle.
An obstacle in the point cloud area is represented by the radar as a point cloud consisting of a plurality of data points, and the data points representing the same obstacle can be grouped into the same class: they lie at approximately the same distance from the origin of the radar coordinate system, and that distance is the distance between the vehicle and the obstacle. Therefore, by clustering the data points according to their distance to the origin of the radar coordinate system, the distance between the vehicle and the obstacle corresponding to each class can be obtained.
In one possible implementation, if a position frame actually contains several obstacles because they occlude one another (see the scene shown in fig. 3), clustering the data points in the radar point cloud area corresponding to that position frame by their distance to the vehicle yields several classes. Each class corresponds to one obstacle, and the distance corresponding to each class is the distance between that obstacle and the vehicle.
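One simple way to realize the clustering by range is the single-pass grouping sketched below. The patent does not specify a clustering algorithm or threshold; the `gap` parameter and mean-range readout are illustrative assumptions:

```python
import math

def cluster_by_range(points, gap=1.0):
    # Cluster radar data points by their distance to the origin of the
    # radar's coordinate system: sorted ranges that differ by less than
    # `gap` metres are treated as the same class (one obstacle per class).
    # `gap` is an illustrative threshold, not specified by the patent.
    ranges = sorted(math.dist((0.0, 0.0, 0.0), p) for p in points)
    clusters = []
    for r in ranges:
        if clusters and r - clusters[-1][-1] < gap:
            clusters[-1].append(r)
        else:
            clusters.append([r])
    # one vehicle-obstacle distance per class (mean range of the class)
    return [sum(c) / len(c) for c in clusters]
```

Two partially occluding obstacles inside one position frame then come out as two classes with two distances, which is exactly the multi-obstacle case of fig. 3.
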
It should be noted that the above steps are only for convenience of describing the flow of the method described in the present application, and do not constitute a limitation to the method described in the present application.
An embodiment of the application provides a method for measuring the distance to an obstacle. A position frame of the obstacle in a shot image is acquired, which determines the direction of the obstacle relative to the vehicle; the shot image is a real-time image shot by a camera on the vehicle in the current driving direction of the vehicle. Because the shot image is two-dimensional, it can determine only the direction of the obstacle relative to the vehicle, not the real distance between them, so the positional relationship in the shot image is related to the radar coordinate system by coordinate conversion. The radar point cloud area corresponding to the position frame of the obstacle in the radar coordinate system is then acquired, and the data points in that area are clustered according to their distance to the origin of the radar coordinate system: the radar represents an obstacle by a point cloud consisting of a plurality of data points, data points representing the same obstacle are grouped into the same class, and the distance between the data points of a class and the origin of the radar coordinate system represents the distance between the vehicle and the corresponding obstacle. By this method, the distance between the vehicle and the obstacle can be obtained accurately, improving safety and smoothness during automatic driving.
Example two:
the second embodiment of the present application further provides another method for measuring a distance to an obstacle, which is specifically described below with reference to the accompanying drawings.
Referring to fig. 4, it is a flowchart of another method for measuring an obstacle distance according to the second embodiment of the present application.
Fig. 5 is a schematic view of a scene of a method for measuring an obstacle distance according to a second embodiment of the present application.
The method comprises the following steps:
s101: a position frame of an obstacle in a captured image is acquired.
S102 a: and converting the coordinate system of the radar into the coordinate system of the camera according to the relative position of the camera and the radar.
S102 b: and converting the coordinate system of the camera into the coordinate system of the shot image.
S103 a: and determining a corresponding radar point cloud area of the position frame of the obstacle in the radar coordinate system.
S103 b: and reserving the radar point cloud area corresponding to the position frame of the obstacle and cutting off other radar point cloud areas.
S104 a: clustering data points in the radar point cloud area according to the distance between the data points and the origin of the coordinate system of the radar to obtain the number of the obstacles.
When a position frame actually contains several obstacles because they occlude one another, clustering the data points in the corresponding radar point cloud area by their distance to the vehicle yields several classes, each corresponding to one obstacle. Determining the number of classes therefore determines the number of obstacles, and the distance corresponding to each class is the distance between that obstacle and the vehicle.
S104b: acquire the distance between the vehicle and each obstacle.
S105: acquire the projection distance of the obstacle in the current driving direction of the vehicle according to the distance between the vehicle and the obstacle and the angle of the position frame deviating from the current driving direction of the vehicle.
Specifically, as shown in fig. 5, L is the distance between the vehicle and the obstacle, β is the angle of the position frame deviating from the current driving direction of the vehicle, and D is the projection distance of the obstacle in the current driving direction of the vehicle. Since the obstacle is often not located directly in front of the vehicle, L > D in general. Because the current driving direction of the vehicle differs from the direction of the line connecting the vehicle and the obstacle, a driving strategy based on L may carry a safety hazard when there is a potential collision risk between the vehicle and the obstacle; for example, the vehicle may brake too late. By obtaining the projection distance D, the vehicle can make a more accurate and safe driving strategy according to the projection distance.
The projection distance can be determined by the following formula:
D = L × cos β (5)
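A worked instance of formula (5), using illustrative values that are not taken from the patent:

```python
import math

L = 20.0                    # distance between vehicle and obstacle, metres (assumed)
beta = math.radians(30.0)   # angle of the position frame off the driving direction
D = L * math.cos(beta)      # formula (5): projection onto the driving direction
# Since D < L, a braking decision based on D reacts earlier than one
# based on L alone, avoiding the late-braking hazard described above.
```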
It should be noted that the above steps are only for convenience of describing the flow of the method described in the present application, and do not constitute a limitation on the method; the steps may also be appropriately adjusted to obtain other possible implementations. For example, in one possible implementation, the original S102a may be modified to "convert the coordinate system of the shot image into the coordinate system of the camera", and the original S102b may be adjusted to "convert the coordinate system of the camera into the coordinate system of the radar". The adjusted coordinate conversion process is the opposite of the one before adjustment, and it can likewise determine the radar point cloud area corresponding to the position frame of the obstacle in the radar coordinate system.
By utilizing the method provided by the embodiment of the application, the distance between the vehicle and the obstacle can be accurately obtained, meanwhile, the projection distance of the obstacle in the current driving direction of the vehicle can be obtained according to the angle of the position frame deviating from the current driving direction of the vehicle, so that the vehicle can make a more correct and safe driving strategy according to the projection distance, and the safety and the fluency in the automatic driving process are further improved.
Example three:
based on the method for measuring the distance between the obstacles provided by the above embodiment, a third embodiment of the present application further provides a device for measuring the distance between the obstacles, which is specifically described below with reference to the accompanying drawings.
Referring to fig. 6, which is a structural diagram of an apparatus for measuring a distance to an obstacle according to a third embodiment of the present application.
The device of the embodiment of the application comprises: a first acquisition unit 601, a coordinate conversion unit 602, a second acquisition unit 603, and a third acquisition unit 604.
The first acquisition unit 601 is configured to acquire a position frame of an obstacle in a captured image; the shot image is a real-time image shot by a camera on the vehicle in the current driving direction of the vehicle.
The coordinate conversion unit 602 is configured to convert a coordinate system of the radar into a coordinate system of the captured image.
The second obtaining unit 603 is configured to obtain a radar point cloud area corresponding to the position frame of the obstacle in the radar coordinate system.
The third obtaining unit 604 is configured to cluster the data points in the radar point cloud area according to the distance between each data point and the origin of the coordinate system of the radar, so as to obtain the distance between the vehicle and the obstacle.
It should be noted that the words "first", "second", and "third" are used in the embodiments of the present application only for convenience of description of the devices, and do not limit the devices.
The embodiment of the application provides a device for measuring the distance to an obstacle. The device acquires a position frame of the obstacle in a shot image through the first acquisition unit and thereby determines the direction of the obstacle relative to the vehicle in the shot image; the shot image is a real-time image shot by a camera on the vehicle in the current driving direction of the vehicle. The coordinate conversion unit converts the coordinate system of the radar into the coordinate system of the shot image; the second acquisition unit acquires the radar point cloud area corresponding to the position frame of the obstacle in the radar coordinate system; and the third acquisition unit clusters the data points in the radar point cloud area according to their distance from the origin of the coordinate system of the radar to obtain the distance between the vehicle and the obstacle. Because the radar represents an obstacle in the point cloud area by a point cloud consisting of a plurality of data points, the data points representing the same obstacle fall into the same class after clustering, and the distance of that class from the origin is the distance between the vehicle and the obstacle. With the device provided by the embodiment of the application, the distance between the vehicle and the obstacle can be accurately obtained, and the safety and fluency of the automatic driving process are improved.
Example four:
the fourth embodiment of the present application further provides another device for measuring a distance to an obstacle, which is specifically described below with reference to the accompanying drawings.
Referring to fig. 7, it is a structural diagram of another apparatus for measuring a distance to an obstacle according to the fourth embodiment of the present application.
On the basis of the device in the third embodiment, the device in the embodiment of the present application further includes: a fourth acquisition unit 705.
The fourth obtaining unit 705 is configured to obtain a projection distance of the obstacle in the current driving direction of the vehicle according to the distance between the vehicle and the obstacle and an angle of the position frame deviating from the current driving direction of the vehicle.
The conversion unit of the apparatus in this embodiment specifically includes: a first conversion sub-unit 602a and a second conversion sub-unit 602 b.
The first converting subunit 602a is configured to convert the coordinate system of the radar into the coordinate system of the camera according to the relative position between the camera and the radar.
The second converting subunit 602b is configured to convert the coordinate system of the camera into the coordinate system of the captured image.
The second obtaining unit of the apparatus in this embodiment specifically includes: a position determining subunit 603a and a clipping subunit 603b.
The position determining subunit 603a is configured to determine a radar point cloud area corresponding to the position frame of the obstacle in the radar coordinate system.
The clipping subunit 603b is configured to reserve the radar point cloud area corresponding to the position frame of the obstacle and clip off the other radar point cloud areas.
The third obtaining unit of the apparatus in this embodiment specifically includes: a number determination subunit 604a and a distance acquisition subunit 604 b.
The number determining subunit 604a is configured to cluster the data points according to the distance between each data point in the radar point cloud area and the origin of the coordinate system of the radar, so as to obtain the number of the obstacles.
The distance obtaining subunit 604b is configured to obtain a distance between the vehicle and each obstacle.
It should be noted that the words "first", "second", "third", and "fourth" in the embodiments of the present application are only for convenience of description of the devices, and do not constitute a limitation on the devices.
The device provided by the embodiment of the application can not only accurately obtain the distance between the vehicle and the obstacle; the fourth acquisition unit also uses the distance between the vehicle and the obstacle and the angle of the position frame deviating from the current driving direction of the vehicle to acquire the projection distance of the obstacle in the current driving direction, so that the vehicle can make a more accurate and safe driving strategy according to the projection distance, further improving the safety and fluency of the automatic driving process.
It should be understood that in the present application, "at least one" means one or more, "a plurality" means two or more. "and/or" for describing an association relationship of associated objects, indicating that there may be three relationships, e.g., "a and/or B" may indicate: only A, only B and both A and B are present, wherein A and B may be singular or plural. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship. "at least one of the following" or similar expressions refer to any combination of these items, including any combination of single item(s) or plural items. For example, at least one (one) of a, b, or c, may represent: a, b, c, "a and b", "a and c", "b and c", or "a and b and c", wherein a, b, c may be single or plural.
The embodiments in the present specification are described in a progressive manner, and the same and similar parts among the embodiments are referred to each other, and each embodiment focuses on the differences from the other embodiments. In particular, for the apparatus embodiment, since it is substantially similar to the method embodiment, it is relatively simple to describe, and reference may be made to some descriptions of the method embodiment for relevant points. The above-described apparatus embodiments are merely illustrative, and the units and modules described as separate components may or may not be physically separate. In addition, some or all of the units and modules may be selected according to actual needs to achieve the purpose of the solution of the embodiment. One of ordinary skill in the art can understand and implement it without inventive effort.
The foregoing is directed to embodiments of the present application and it is noted that numerous modifications and adaptations may be made by those skilled in the art without departing from the principles of the present application and are intended to be within the scope of the present application.

Claims (8)

1. A method of measuring a distance to an obstacle, the method comprising:
acquiring a position frame of an obstacle in a shot image; the shot image is a real-time image shot by a camera on the vehicle in the current driving direction of the vehicle;
converting a coordinate system of the radar into a coordinate system of the shot image;
acquiring a radar point cloud area corresponding to the position frame of the obstacle in the radar coordinate system;
clustering data points in the radar point cloud area according to the distance between the data points and the origin of the coordinate system of the radar to obtain the distance between the vehicle and the obstacle;
the converting the coordinate system of the radar into the coordinate system of the photographed image includes:
converting the coordinate system of the radar into the coordinate system of the camera according to the relative position of the camera and the radar;
and converting the coordinate system of the camera into the coordinate system of the shot image.
2. The method according to claim 1, wherein the acquiring a radar point cloud area corresponding to the position frame of the obstacle in the radar coordinate system includes:
determining a radar point cloud area corresponding to the position frame of the obstacle in the radar coordinate system;
and reserving the radar point cloud area corresponding to the position frame of the obstacle and cutting off other radar point cloud areas.
3. The method according to claim 2, wherein the clustering of the data points in the radar point cloud area according to the distance between the data points and the origin of the coordinate system of the radar to obtain the distance between the vehicle and the obstacle specifically comprises:
clustering data points in the radar point cloud area according to the distance between the data points in the radar point cloud area and the origin of the coordinate system of the radar to obtain the number of the obstacles;
and acquiring the distance between the vehicle and each obstacle.
4. The method of measuring an obstacle distance of claim 1, further comprising:
and acquiring the projection distance of the obstacle in the current driving direction of the vehicle according to the distance between the vehicle and the obstacle and the angle of the position frame deviating from the current driving direction of the vehicle.
5. An apparatus for measuring a distance to an obstacle, the apparatus comprising: the device comprises a first acquisition unit, a coordinate conversion unit, a second acquisition unit and a third acquisition unit;
the first acquisition unit is used for acquiring a position frame of an obstacle in a shot image; the shot image is a real-time image shot by a camera on the vehicle in the current driving direction of the vehicle;
the coordinate conversion unit is used for converting a coordinate system of the radar into a coordinate system of the shot image;
the second acquisition unit is used for acquiring a radar point cloud area corresponding to the position frame of the obstacle in the radar coordinate system;
the third acquisition unit is used for clustering the data points according to the distance between the data points in the radar point cloud area and the origin of the coordinate system of the radar so as to acquire the distance between the vehicle and the obstacle;
the coordinate conversion unit specifically includes: a first conversion subunit and a second conversion subunit;
the first conversion subunit is configured to convert a coordinate system of the radar into a coordinate system of the camera according to a relative position between the camera and the radar;
and the second conversion subunit is used for converting the coordinate system of the camera into the coordinate system of the shot image.
6. The apparatus for measuring the obstacle distance according to claim 5, wherein the second acquiring unit specifically comprises: a position determining subunit and a clipping subunit;
the position determining subunit is configured to determine a radar point cloud area corresponding to the position frame of the obstacle in the radar coordinate system;
and the cutting subunit is used for reserving the radar point cloud area corresponding to the position frame of the obstacle and cutting off other radar point cloud areas.
7. The apparatus for measuring the obstacle distance according to claim 6, wherein the third acquiring unit specifically comprises: a number determination subunit and a distance acquisition subunit;
the number determining subunit is configured to cluster the data points according to a distance between the data point in the radar point cloud area and an origin of a coordinate system of the radar to obtain the number of the obstacles;
the distance obtaining subunit is configured to obtain a distance between the vehicle and each obstacle.
8. The apparatus for measuring obstacle distance according to claim 5, further comprising: a fourth acquisition unit;
the fourth obtaining unit is configured to obtain a projection distance of the obstacle in the current driving direction of the vehicle according to the distance between the vehicle and the obstacle and an angle of the position frame deviating from the current driving direction of the vehicle.
CN201910238525.1A 2019-03-27 2019-03-27 Method and device for measuring distance of obstacle Active CN110068814B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910238525.1A CN110068814B (en) 2019-03-27 2019-03-27 Method and device for measuring distance of obstacle


Publications (2)

Publication Number Publication Date
CN110068814A CN110068814A (en) 2019-07-30
CN110068814B true CN110068814B (en) 2021-08-24

Family

ID=67366627

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910238525.1A Active CN110068814B (en) 2019-03-27 2019-03-27 Method and device for measuring distance of obstacle

Country Status (1)

Country Link
CN (1) CN110068814B (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112445215A (en) * 2019-08-29 2021-03-05 阿里巴巴集团控股有限公司 Automatic guided vehicle driving control method, device and computer system
CN112668371B (en) * 2019-10-16 2024-04-09 北京京东乾石科技有限公司 Method and device for outputting information
CN112364888A (en) * 2020-10-16 2021-02-12 爱驰汽车(上海)有限公司 Point cloud data processing method and device, computing equipment and computer storage medium
CN112379674B (en) * 2020-11-26 2022-06-21 中国第一汽车股份有限公司 Automatic driving equipment and system
CN112802092B (en) * 2021-01-29 2024-04-09 深圳一清创新科技有限公司 Obstacle sensing method and device and electronic equipment
CN113223076B (en) * 2021-04-07 2024-02-27 东软睿驰汽车技术(沈阳)有限公司 Coordinate system calibration method, device and storage medium for vehicle and vehicle-mounted camera
CN113376643A (en) * 2021-05-10 2021-09-10 广州文远知行科技有限公司 Distance detection method and device and electronic equipment

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH09142236A (en) * 1995-11-17 1997-06-03 Mitsubishi Electric Corp Periphery monitoring method and device for vehicle, and trouble deciding method and device for periphery monitoring device
CN106774296A (en) * 2016-10-24 2017-05-31 中国兵器装备集团自动化研究所 A kind of disorder detection method based on laser radar and ccd video camera information fusion
CN109270534B (en) * 2018-05-07 2020-10-27 西安交通大学 Intelligent vehicle laser sensor and camera online calibration method

Also Published As

Publication number Publication date
CN110068814A (en) 2019-07-30

Similar Documents

Publication Publication Date Title
CN110068814B (en) Method and device for measuring distance of obstacle
EP3418943B1 (en) Object detecting apparatus, object detecting method, and computer-readable medium
KR102275310B1 (en) Mtehod of detecting obstacle around vehicle
JP7160040B2 (en) Signal processing device, signal processing method, program, moving object, and signal processing system
US11100808B2 (en) System and method for vehicle convoys
WO2018020954A1 (en) Database construction system for machine-learning
JP2019096072A (en) Object detection device, object detection method and program
WO2021046716A1 (en) Method, system and device for detecting target object and storage medium
WO2005088971A1 (en) Image generation device, image generation method, and image generation program
CN110068819B (en) Method and device for extracting position information of obstacle
JP2021510227A (en) Multispectral system for providing pre-collision alerts
CN110341621B (en) Obstacle detection method and device
CN109211260B (en) Intelligent vehicle driving path planning method and device and intelligent vehicle
US11514683B2 (en) Outside recognition apparatus for vehicle
CN111731304B (en) Vehicle control device, vehicle control method, and storage medium
CN110780287A (en) Distance measurement method and distance measurement system based on monocular camera
CN116935281A (en) Method and equipment for monitoring abnormal behavior of motor vehicle lane on line based on radar and video
CN113706633A (en) Method and device for determining three-dimensional information of target object
JP7115869B2 (en) Map generation system
JP6839642B2 (en) Vehicle control devices, vehicle control methods, and programs
CN110727269A (en) Vehicle control method and related product
CN116892949A (en) Ground object detection device, ground object detection method, and computer program for ground object detection
CN107292818B (en) Automatic positioning system and method for line capture device based on panoramic camera
EP3223188A1 (en) A vehicle environment mapping system
KR102174423B1 (en) Method And Apparatus for Detection of Parking Loss for Automatic Parking

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant