CN114442101A - Vehicle navigation method, device, equipment and medium based on imaging millimeter wave radar - Google Patents
- Publication number
- CN114442101A (application number CN202210108581.5A)
- Authority
- CN
- China
- Prior art keywords
- point cloud
- vehicle
- target
- information
- continuous
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/88—Radar or analogous systems specially adapted for specific applications
- G01S13/93—Radar or analogous systems specially adapted for specific applications for anti-collision purposes
- G01S13/931—Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/66—Radar-tracking systems; Analogous systems
- G01S13/72—Radar-tracking systems; Analogous systems for two-dimensional tracking, e.g. combination of angle and range tracking, track-while-scan radar
- G01S13/723—Radar-tracking systems; Analogous systems for two-dimensional tracking, e.g. combination of angle and range tracking, track-while-scan radar by using numerical data
- G01S13/726—Multiple target tracking
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/88—Radar or analogous systems specially adapted for specific applications
- G01S13/89—Radar or analogous systems specially adapted for specific applications for mapping or imaging
Abstract
The application discloses a vehicle navigation method, device, equipment and medium based on imaging millimeter wave radar. The method comprises the following steps: converting the point clouds obtained by the imaging millimeter wave radars into a vehicle coordinate system and splicing them to obtain a continuous point cloud; determining ground-relative point cloud information of the continuous point cloud according to the vehicle motion information and the continuous point cloud, and distinguishing static point cloud from dynamic point cloud based on that information; clustering the dynamic point cloud to obtain a clustering target, and tracking the clustering target to obtain tracking target information; and determining a travelable area according to the static point cloud, then planning a path based on the travelable area and the tracking target information to navigate the vehicle. In this way, the imaging millimeter wave radar enables all-weather operation: dynamic and static point clouds are distinguished within the continuous point cloud obtained by point cloud splicing, tracking target information is obtained from the dynamic point cloud, the travelable area is obtained from the static point cloud, and path planning can then be carried out successfully to navigate the vehicle.
Description
Technical Field
The invention relates to the technical field of automatic driving, in particular to a vehicle navigation method, a vehicle navigation device, vehicle navigation equipment and a vehicle navigation medium based on an imaging millimeter wave radar.
Background
Currently, with the development of science and technology, more and more things once confined to science fiction are gradually becoming reality, such as automatic driving of road vehicles, subways, and special-scenario logistics vehicles. A mature autonomous vehicle frees the driver from repetitive labor, high stress, and the risk of fatigued driving. Moreover, as the population structure and industrial structure change with social and economic development and urbanization, the proportion of young labor decreases while large numbers of people move into cities, so drivers for long-distance freight, remote mine-site mining trucks, and the like are increasingly difficult to recruit. Intelligent automatic driving of these vehicles can help operators alleviate the labor shortage, and generally brings higher operating efficiency and lower operating costs.
In order to realize automatic driving of a vehicle such as a car or a ship, the vehicle must be provided, in addition to conventional sensors and other electronic and mechanical devices, with external environment perception sensor units analogous to human eyes and ears, a central decision unit analogous to the human brain, and operation execution units analogous to hands and feet. The decision unit is generally an embedded computer selected according to the overall configuration of the system and the complexity of the algorithms. The decision unit receives information from the environment sensors and internal sensors (such as vehicle speed, yaw rate, steering wheel angle, gear, inclination angle, and acceleration), acquires and evaluates the environment and the vehicle's own condition, then decides the next action and, through an output interface, controls devices such as the brake, accelerator, and steering wheel to avoid danger, change paths normally, and so on. Current tools for collecting environmental data include lidar and cameras, but both have poor adaptability and cannot work in all weather.
In summary, how to acquire environmental data with a suitable acquisition tool, so as to make accurate path plans for navigating a vehicle and realize automatic driving, is a problem to be solved urgently at present.
Disclosure of Invention
In view of this, an object of the present invention is to provide a vehicle navigation method, apparatus, device and medium based on an imaging millimeter wave radar, which can acquire environmental information with a suitable acquisition tool so as to make accurate path plans for navigating a vehicle, thereby implementing automatic driving. The specific scheme is as follows:
in a first aspect, the application discloses a vehicle navigation method based on an imaging millimeter wave radar, comprising:
converting point clouds, which are obtained by the imaging millimeter wave radars and are located in radar coordinate systems, into a vehicle coordinate system so as to carry out point cloud splicing to obtain continuous point clouds; the continuous point cloud comprises a continuous point cloud speed and continuous point cloud coordinates of the continuous point cloud relative to the vehicle; each radar coordinate system and each vehicle coordinate system are rectangular coordinate systems;
determining the ground-relative point cloud information of the continuous point cloud according to the detected vehicle motion information of the vehicle relative to the ground and the continuous point cloud, and distinguishing static point cloud and dynamic point cloud from the continuous point cloud based on the ground-relative point cloud information;
clustering the dynamic point cloud according to the information of the point cloud to obtain a clustering target, and tracking the clustering target according to the speed and the moving direction of the clustering target contained in the clustering target to obtain tracking target information corresponding to the clustering target;
and determining a travelable area according to the static point cloud, and planning a path based on the travelable area and the tracking target information so as to navigate the vehicle.
Optionally, the point cloud located in each radar coordinate system and obtained by each imaging millimeter wave radar is converted into a vehicle coordinate system, so as to perform point cloud splicing to obtain a continuous point cloud, including:
acquiring radar installation information comprising the coordinate information of each imaging millimeter wave radar in the vehicle coordinate system and the installation yaw angle of each imaging millimeter wave radar;
calculating coordinates of the point clouds in the imaging millimeter wave radar coordinate systems in the vehicle coordinate system according to the radar installation information to obtain multiple groups of point cloud coordinates corresponding to different imaging millimeter wave radars in the vehicle coordinate system;
and splicing the multiple groups of point cloud coordinates to obtain corresponding continuous point clouds.
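As a hedged sketch of the conversion-and-splicing step above (the function names, point-tuple layout, and counter-clockwise rotation convention are illustrative assumptions, not the patent's implementation):

```python
import math

def radar_to_vehicle(points, x_radar, y_radar, yaw):
    """Rotate each radar-frame point by the installation yaw angle,
    then translate by the radar's mounting coordinates, giving the
    point's position in the vehicle coordinate system."""
    converted = []
    for x, y in points:
        xv = x * math.cos(yaw) - y * math.sin(yaw) + x_radar
        yv = x * math.sin(yaw) + y * math.cos(yaw) + y_radar
        converted.append((xv, yv))
    return converted

def splice(groups):
    """Concatenate per-radar point lists (already expressed in the
    vehicle coordinate system) into one continuous point cloud."""
    cloud = []
    for g in groups:
        cloud.extend(g)
    return cloud
```

Each radar's group is converted with its own mounting coordinates and yaw before splicing, so all points share one frame.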
Optionally, before determining the ground-relative point cloud information of the continuous point cloud according to the detected vehicle motion information of the vehicle relative to the ground and the continuous point cloud, the method further includes:
if the point cloud quality parameters corresponding to the continuous point cloud meet the preset parameter threshold value filtering condition, determining the continuous point cloud as a false point cloud, and then deleting the false point cloud;
and/or if the continuous point cloud appears in the current scanning period but cannot be continuously detected in a first preset coordinate range in a first preset number of continuous scanning periods, determining the continuous point cloud as a false point cloud, and deleting the false point cloud.
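The two false-point-cloud filters above (a quality-parameter threshold and a persistence check across scan cycles) can be sketched as follows; the threshold values, radius, and point layout are assumed examples, not the patent's preset parameters:

```python
def is_false_point(point, later_cycles, quality_min=0.3, radius=0.5, min_cycles=3):
    """Flag a detection as a false point cloud if its quality score
    fails the threshold, or if no nearby detection recurs within
    `radius` during the next `min_cycles` scan cycles."""
    x, y, quality = point
    if quality < quality_min:
        return True
    hits = 0
    for cycle in later_cycles[:min_cycles]:
        # A hit means some point in that cycle falls inside the radius.
        if any((cx - x) ** 2 + (cy - y) ** 2 <= radius ** 2
               for cx, cy, _ in cycle):
            hits += 1
    return hits == 0
```

Points flagged by either rule would be deleted before the static/dynamic split.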
Optionally, the determining, according to the detected vehicle motion information of the vehicle relative to the ground and the continuous point cloud, the ground-relative point cloud information of the continuous point cloud, and distinguishing a static point cloud and a dynamic point cloud from the continuous point cloud based on the ground-relative point cloud information includes:
detecting the vehicle motion information including vehicle motion speed and vehicle yaw rate of the vehicle relative to the ground;
determining the ground point cloud speed of the continuous point cloud relative to the ground according to the vehicle motion information and the continuous point cloud speed in the continuous point cloud;
and if the ground-relative point cloud speed of the continuous point cloud is less than a preset speed threshold, determining the continuous point cloud as a static point cloud, and if the ground-relative point cloud speed of the continuous point cloud is not less than the preset speed threshold, determining the continuous point cloud as a dynamic point cloud.
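A minimal sketch of the static/dynamic split, assuming each point carries a planar velocity measured relative to the vehicle; the 0.5 m/s value stands in for the patent's preset speed threshold:

```python
import math

def split_static_dynamic(cloud, ego_vx, ego_vy, speed_threshold=0.5):
    """Each point is (x, y, vx, vy) with velocity relative to the
    vehicle; adding the ego velocity gives the ground-relative speed
    used to classify the point as static or dynamic."""
    static, dynamic = [], []
    for x, y, vx, vy in cloud:
        ground_speed = math.hypot(vx + ego_vx, vy + ego_vy)
        if ground_speed < speed_threshold:
            static.append((x, y))
        else:
            dynamic.append((x, y))
    return static, dynamic
```

A stationary obstacle appears to move at the negated ego velocity relative to the vehicle, so its ground-relative speed is near zero and it lands in the static set.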
Optionally, the clustering the dynamic point cloud according to the information of the point cloud to obtain a clustering target includes:
determining a target dynamic point cloud with similar point cloud information from the dynamic point cloud within a second preset coordinate range;
and clustering the target dynamic point cloud according to preset contour information to obtain a clustering target, and determining a clustering point corresponding to the clustering target and a coordinate of the clustering point.
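A simple stand-in for the clustering step above — greedy single-linkage grouping by coordinate proximity; the distance threshold and the omission of the preset contour matching are simplifying assumptions:

```python
import math

def cluster_dynamic_points(points, dist_max=1.5):
    """Group points so that any point within `dist_max` of an existing
    cluster member joins that cluster; otherwise it seeds a new one."""
    clusters = []
    for p in points:
        target = None
        for c in clusters:
            if any(math.hypot(p[0] - q[0], p[1] - q[1]) <= dist_max for q in c):
                target = c
                break
        if target is None:
            clusters.append([p])
        else:
            target.append(p)
    return clusters

def centroid(cluster):
    """The clustering point: the mean coordinate of the cluster."""
    n = len(cluster)
    return (sum(p[0] for p in cluster) / n, sum(p[1] for p in cluster) / n)
```

The centroid serves as the clustering point whose coordinates feed the tracking step.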
Optionally, the tracking the clustered target according to the speed and the moving direction of the clustered target included in the clustered target to obtain tracking target information corresponding to the clustered target includes:
predicting a target coordinate range of the clustering target in the next scanning according to the clustering target speed, the clustering target motion direction and the clustering point coordinate included in the clustering target;
and if the point cloud to be judged with the clustering target speed and the clustering target motion direction similar to the clustering target is detected in the target coordinate range during the next scanning, judging the point cloud to be judged as the clustering target so as to realize target tracking and obtain tracking target information corresponding to the clustering target.
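The predict-then-associate tracking step can be sketched as follows; the circular gate, the tolerance values, and the candidate tuple layout are illustrative assumptions:

```python
import math

def predict_gate(cx, cy, speed, heading, dt, gate_radius):
    """Predict the cluster centre's position at the next scan from its
    speed and heading, returning a circular association gate."""
    return (cx + speed * math.cos(heading) * dt,
            cy + speed * math.sin(heading) * dt,
            gate_radius)

def associate(candidates, gate, speed, heading, dv_max=2.0, dh_max=0.5):
    """Accept the first candidate inside the gate whose speed and
    heading are close to the track's; otherwise report no match."""
    px, py, r = gate
    for x, y, v, h in candidates:
        if (math.hypot(x - px, y - py) <= r
                and abs(v - speed) <= dv_max
                and abs(h - heading) <= dh_max):
            return (x, y, v, h)
    return None
```

A matched candidate is judged to be the same clustering target, continuing the track; no match would end or coast the track.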
Optionally, the vehicle navigation method based on the imaging millimeter wave radar further includes:
and if it is monitored that the point cloud information of all continuous point clouds within a third preset coordinate range has not changed over a second preset number of continuous scanning periods, determining the third preset coordinate range as a non-drivable area.
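A hedged sketch of the non-drivable-area check, assuming the points observed in the coordinate range each scan cycle are captured as an immutable snapshot; the cycle count is an example value, not the patent's preset number:

```python
def is_non_drivable(region_snapshots, cycles_required=5):
    """Mark the region non-drivable when its point-cloud snapshot is
    identical across the required number of most recent scan cycles."""
    if len(region_snapshots) < cycles_required:
        return False
    recent = region_snapshots[-cycles_required:]
    return all(s == recent[0] for s in recent)
```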
In a second aspect, the present application discloses a vehicle navigation device based on imaging millimeter wave radar, comprising:
the point cloud splicing module is used for converting point clouds, obtained by the imaging millimeter wave radars, located in radar coordinate systems into a vehicle coordinate system so as to carry out point cloud splicing to obtain continuous point clouds; the continuous point cloud comprises a continuous point cloud speed and continuous point cloud coordinates of the continuous point cloud relative to the vehicle; each radar coordinate system and each vehicle coordinate system are rectangular coordinate systems;
the point cloud distinguishing module is used for determining the ground-relative point cloud information of the continuous point cloud according to the detected vehicle motion information of the vehicle relative to the ground and the continuous point cloud, and distinguishing a static point cloud and a dynamic point cloud from the continuous point cloud based on the ground-relative point cloud information;
the clustering module is used for clustering the dynamic point cloud according to the information of the point cloud to obtain a clustering target, and tracking the clustering target according to the speed and the motion direction of the clustering target contained in the clustering target to obtain tracking target information corresponding to the clustering target;
and the path planning module is used for determining a travelable area according to the static point cloud and planning a path based on the travelable area and the tracking target information so as to navigate the vehicle.
In a third aspect, the present application discloses an electronic device comprising a processor and a memory; wherein the processor, when executing the computer program stored in the memory, implements the imaging millimeter wave radar-based vehicle navigation method disclosed above.
In a fourth aspect, the present application discloses a computer readable storage medium for storing a computer program; wherein the computer program when executed by a processor implements the imaging millimeter wave radar-based vehicle navigation method disclosed above.
Therefore, the point clouds located in each radar coordinate system and obtained by each imaging millimeter wave radar are converted into a vehicle coordinate system, so that point cloud splicing is carried out to obtain a continuous point cloud; the continuous point cloud comprises a continuous point cloud speed and continuous point cloud coordinates of the continuous point cloud relative to the vehicle; each radar coordinate system and the vehicle coordinate system are rectangular coordinate systems. The ground-relative point cloud information of the continuous point cloud is determined according to the detected vehicle motion information of the vehicle relative to the ground and the continuous point cloud, and static point cloud and dynamic point cloud are distinguished from the continuous point cloud based on the ground-relative point cloud information. The dynamic point cloud is clustered according to the point cloud information to obtain a clustering target, and the clustering target is tracked according to its speed and moving direction to obtain the corresponding tracking target information. Finally, a travelable area is determined according to the static point cloud, and a path is planned based on the travelable area and the tracking target information so as to navigate the vehicle.
In addition, the method and the device can splice point clouds obtained by different imaging millimeter wave radars into a continuous point cloud representing the vehicle's surroundings, distinguish dynamic and static point clouds according to the ground-relative point cloud information determined from the vehicle motion information and the continuous point cloud, obtain tracking target information from the dynamic point cloud, determine the travelable area from the static point cloud, and perform path planning according to the tracking target information and the travelable area to navigate the vehicle.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description are only some embodiments of the present invention; for those skilled in the art, other drawings can be obtained from the provided drawings without creative effort.
FIG. 1 is a flowchart of a vehicle navigation method based on an imaging millimeter wave radar according to the present disclosure;
FIG. 2 is a schematic illustration of a radar installation of a vehicle according to the present application;
FIG. 3 is a schematic view of a coordinate system provided herein;
FIG. 4 is a schematic diagram of a coordinate system transformation provided herein;
FIG. 5 is a flowchart of a specific imaging millimeter wave radar-based vehicle navigation method provided by the present application;
FIG. 6 is a schematic diagram of an automatic vehicle driving system provided herein;
FIG. 7 is a schematic structural diagram of a vehicle navigation device based on an imaging millimeter wave radar according to the present application;
FIG. 8 is a structural block diagram of an electronic device provided by the present application.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Current environmental data acquisition tools have poor adaptability and cannot work in all weather, which is not conducive to making accurate path plans and realizing automatic driving.
In order to overcome these problems, the application provides a vehicle navigation scheme based on an imaging millimeter wave radar, which can acquire environmental information with a suitable acquisition tool so as to make accurate path plans for navigating a vehicle and realize automatic driving.
Referring to fig. 1, an embodiment of the application discloses a vehicle navigation method based on an imaging millimeter wave radar, which includes:
step S11: converting point clouds located in radar coordinate systems and obtained by imaging millimeter wave radars into a vehicle coordinate system so as to carry out point cloud splicing to obtain continuous point clouds; the continuous point cloud comprises a continuous point cloud speed and continuous point cloud coordinates of the continuous point cloud relative to the vehicle; and each radar coordinate system and the vehicle coordinate system are rectangular coordinate systems.
In the embodiment of the application, surrounding environment data are obtained by means of imaging millimeter wave radars to produce point clouds. At present, the common external environment perception sensors for acquiring surrounding environment data include cameras, lidar, ultrasonic radar, millimeter wave radar, and the like, and each sensor exhibits markedly different performance depending on its specific configuration. The camera is most similar to the human eye and is a passive device: it receives light emitted or reflected by objects and forms images in its imaging module. With technical progress, cameras can obtain ultra-high-resolution environmental images, and binocular stereo cameras can even recover depth of field. However, like the human eye, the lens is easily affected by air visibility (haze and the like) and lighting conditions, and keeping the lens clean is troublesome in dusty scenes. In addition, the camera's ranging accuracy for a target gradually decreases as distance increases, which affects the quality of the system's decisions to some degree. Lidar generally uses several infrared laser transmitters to emit laser beams; the beams are reflected by a target and received by a receiver, and the target distance is calculated from the time interval between transmission and reception and the constant speed of light. The target azimuth is obtained from the azimuth angle of the lidar transmitter at the moment of emission. Compared with the camera, lidar can therefore acquire accurate distance information for targets at different ranges, does not depend on an ambient light source, and can measure distance both day and night.
However, lidar's weather adaptability is still poor, since it is affected by direct sunlight, haze, rain, snow and other visibility conditions, as well as by lens cleanliness. Conventional millimeter wave radars are used for driving assistance such as AEB (Automatic Emergency Braking) and BSD (Blind Spot Detection); limited by cost, the technology of their era, and CAN (Controller Area Network) bus interfaces with limited bandwidth, their data are generally sparse and the angular resolution of target measurement is low, so the environment cannot be finely perceived and only assisted driving functions that do not pursue recognition rate can be provided. Automatic driving, however, requires fine perception of the environment to detect the positions and states of stationary and moving objects at different distances, not only to avoid collisions but also to plan the route to the destination.
It should be noted that, in the past, millimeter wave radars for driving assistance such as AEB and BSD were subject to cost, the technical limits of their time, and the use of bandwidth-limited CAN bus interfaces, so their data were generally sparse and the angular resolution of target measurement was low; the environment could not be finely perceived, and only assisted driving functions that do not pursue recognition rate could be provided. Automatic driving vehicles, meanwhile, generally depend on cameras and lidar with high perception precision, but these two sensors are easily affected by visibility and cannot meet the requirement of all-weather operation in outdoor industrial and mining scenes; lidar also suffers from easily damaged mechanical moving parts and high prices. Compared with environment perception sensors such as the camera and lidar, the millimeter wave radar works in all weather and is not easily affected by illumination, air visibility, or lens cleanliness, a characteristic of great significance for improving the weather adaptability of assisted driving and even automatic driving. Therefore, the imaging millimeter wave radar is adopted here, and an Ethernet interface is used to output more surrounding environment data with higher precision.
It can be understood that a 77 GHz imaging millimeter wave radar adopting the latest millimeter wave radar technology is used as the sensor. With more antenna combinations it realizes more physical and virtual channels, greatly improving angular resolution, which has long been the weak point of millimeter wave radar, so that finer environment perception information, namely the surrounding environment data, can be obtained — much as eyes corrected for severe myopia can see the environment clearly, letting their owner move with more confidence. The imaging radar also has a larger detection range, and paired with an Ethernet interface of larger communication bandwidth it can output more target spatial and attribute information at higher measurement accuracy. Targets such as vehicles and pedestrians then correspond to denser points presented in a cloud form similar to lidar output, also called a point cloud; only at point cloud density can the state of the environment be seen clearly, approaching the imaging effect of a camera and thus better suiting environmental data acquisition for automatic driving. Specifically, the imaging millimeter wave radar sensor actively emits electromagnetic waves in the millimeter wave band for scanning, receives the electromagnetic waves reflected by targets in its field of view, and calculates the relative speed, relative distance and azimuth angle, and even the pitch angle, of each target using the pulse compression principle, the Doppler effect, and Fourier transform algorithms. Environmental information is detected at a high frequency of 10 to 30 scans per second, providing the data refresh rate required for automatic driving of the vehicle. Because the wavelength of the 77 GHz millimeter wave radar is about 4 mm, the waves easily diffract around fine dust, water vapor and the like, so the radar is not affected by light or haze, ensuring all-weather operation of the vehicle.
In the embodiment of the application, two rectangular coordinate systems are used: the radar coordinate system and the vehicle coordinate system. After each imaging millimeter wave radar acquires environmental information, that information exists as a point cloud in the radar's own coordinate system. Because the field of view of a single imaging millimeter wave radar is limited, about 120 degrees horizontally, a vehicle may carry several imaging millimeter wave radars; by splicing the point cloud information of multiple radars installed around the vehicle, seamless 360-degree perception around the vehicle, or coverage of a locally large field of view to the front and sides, can be realized. Specifically, several imaging millimeter wave radars are arranged at the front, the four corners, the rear, and even the sides of the vehicle according to the required monitoring range, ensuring overlapping fields of view and taking the vehicle type and length into account: a passenger vehicle can cover its full 360-degree surroundings well with 6 imaging radars at the front, rear and four corners, while a longer freight vehicle may need 8 imaging millimeter wave radars, i.e., 1 additional radar on each of the left and right sides. Because multiple radars are used, the point clouds obtained by the imaging millimeter wave radars need to be converted into the vehicle coordinate system so that point cloud splicing can be performed to obtain the continuous point cloud.
Specifically, by collecting from the multiple imaging millimeter wave radars installed around the vehicle and completing the splicing, all-weather high-accuracy point cloud imaging of the vehicle's surroundings at 10 to 30 frames per second can be realized, which facilitates surrounding environment data acquisition. As shown in fig. 2, simple information about the radar mounting positions and the vehicle is illustrated, specifically including the mounting positions of radar 1, radar 2, radar 3, radar 4, radar 5 and radar 6, the normal line of radar 1 and the corresponding radar 1 yaw angle, as well as the origin and coordinate axes (X axis and Y axis) of the vehicle coordinate system, the vehicle central axis, a line parallel to the central axis, and the driving direction.
In the embodiment of the present application, the specific process of converting the point clouds into the vehicle coordinate system is as follows: acquire the radar installation information, comprising the coordinate information of each imaging millimeter wave radar in the vehicle coordinate system and the yaw angle of each imaging millimeter wave radar; calculate, according to the radar installation information, the coordinates in the vehicle coordinate system of the point clouds located in each imaging millimeter wave radar coordinate system, obtaining multiple groups of point cloud coordinates in the vehicle coordinate system corresponding to the different imaging millimeter wave radars; then splice the multiple groups of point cloud coordinates to obtain the corresponding continuous point cloud. The point cloud in a radar coordinate system comprises the radar point cloud distance, radar point cloud speed and radar point cloud azimuth angle of the radar point cloud relative to that radar. It should be noted that the radar installation information is acquired not by the imaging millimeter wave radar itself but by other sensors installed on the vehicle.
It should be noted that, as shown in fig. 3, a radar mounting position is generally not at the origin of the vehicle coordinate system; in this case the coordinates must be offset in the transverse and longitudinal directions, corresponding to a translation of the coordinate matrix in the data processing algorithm, after which the coordinate matrix is rotated according to the radar installation yaw angle. As shown in fig. 3, the vehicle coordinate system generally takes the center of the vehicle's rear axle as the origin O, the central axis of the vehicle as the longitudinal axis Y, and the straight line of the rear axle as the transverse axis X, forming a rectangular coordinate system whose positive longitudinal direction is the vehicle's forward direction and whose positive transverse direction is the driver's right. Let the coordinates at which radar 1 is mounted in the vehicle coordinate system be (x_radar, y_radar), let its installation yaw angle be θ, and let the coordinates of a detected target point P in the radar coordinate system be (x_1, y_1); we need to find its coordinates (x_2, y_2) in the vehicle coordinate system. First, the origins of the radar coordinate system and the vehicle coordinate system are made to coincide by translation, as shown in fig. 4. Assuming the coordinates of the point P after the radar coordinate system is translated are (x_0, y_0), then
x_0 = x_1 - x_radar;
y_0 = y_1 - y_radar;
In the embodiment of the present application, because there is a yaw angle θ between the radar installation direction and the central axis of the vehicle, a rotation is further required. Specifically, by the rules for complementary angles and trigonometric functions:
x_2 = x_0·cosθ - y_0·sinθ = (x_1 - x_radar)·cosθ - (y_1 - y_radar)·sinθ;
y_2 = x_0·sinθ + y_0·cosθ = (x_1 - x_radar)·sinθ + (y_1 - y_radar)·cosθ;
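The translate-then-rotate steps above can be sketched in a few lines of Python; the function names, the degree convention for θ, and the tuple-based point representation are illustrative assumptions rather than the patent's prescribed implementation:

```python
import math

def radar_to_vehicle(point, radar_xy, yaw_deg):
    """Transform one radar-frame point (x1, y1) into the vehicle frame:
    first translate so the two origins coincide, then rotate by the
    radar's mounting yaw angle theta."""
    x1, y1 = point
    xr, yr = radar_xy
    th = math.radians(yaw_deg)
    # Translation: x0 = x1 - x_radar, y0 = y1 - y_radar
    x0, y0 = x1 - xr, y1 - yr
    # Rotation: x2 = x0*cos(th) - y0*sin(th), y2 = x0*sin(th) + y0*cos(th)
    return (x0 * math.cos(th) - y0 * math.sin(th),
            x0 * math.sin(th) + y0 * math.cos(th))

def stitch(radars):
    """Merge per-radar point lists into one continuous point cloud in the
    single vehicle coordinate system. Each entry of `radars` is
    (mount_xy, yaw_deg, points)."""
    cloud = []
    for mount_xy, yaw_deg, points in radars:
        cloud.extend(radar_to_vehicle(p, mount_xy, yaw_deg) for p in points)
    return cloud
```

Applying `radar_to_vehicle` to every point of every radar and concatenating the results, as `stitch` does, yields the continuous point cloud expressed in one coordinate system, ready for splicing-based 360-degree coverage.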
It can be understood that, once the coordinates of all radar point clouds correspond to the vehicle coordinate system, the different radar point clouds can be processed in a single coordinate system. If overlap between adjacent radar fields of view is guaranteed when the imaging millimeter wave radars are installed, the spliced point cloud is guaranteed to have no gaps, achieving an overall stitching effect like that of a vehicle's 360-degree surround-view cameras.
Step S12: determining ground-relative point cloud information of the continuous point cloud with respect to the ground according to the detected vehicle motion information of the vehicle relative to the ground and the continuous point cloud, and distinguishing static point clouds and dynamic point clouds within the continuous point cloud based on the ground-relative point cloud information.
In the embodiment of the application, sensors of different principles can all generate erroneous detection information due to internal or environmental factors; such information is filtered out, or its influence reduced, according to the characteristics of the sensor and test experience, so that false targets do not participate in later operations and produce wrong results. For the imaging millimeter wave radar in particular, false point clouds appear because of angle measurement ambiguity, velocity measurement ambiguity, environmental background noise, multipath effects and the like. Therefore, before the ground-relative information of the continuous point cloud is determined from the detected vehicle motion information and the continuous point cloud, false point clouds need to be identified in the continuous point cloud and deleted. Specifically, if the point cloud quality parameters corresponding to a continuous point cloud satisfy a preset parameter threshold filtering condition, that continuous point cloud is determined to be a false point cloud and deleted; and/or, if a continuous point cloud appears in the current scanning period but cannot be continuously detected within a first preset coordinate range over a first preset number of consecutive scanning periods, that continuous point cloud is determined to be a false point cloud and deleted.
It should be noted that the point cloud quality parameters include quality information output by the millimeter wave radar such as the target RCS (Radar Cross Section) and ambiguity, and the preset parameter threshold filtering condition is determined by finding reproducible rules through a large number of actual scene tests in the early stage of the corresponding application project. It can be understood that if a continuous point cloud appears in the current scanning period but cannot be continuously detected within the first preset coordinate range over the first preset number of consecutive scanning periods, the confidence of the point cloud in that range is low and it is deleted as a false point cloud; the first preset number and the first preset coordinate range are obtained and verified through early testing experience in the specific application scenario.
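A minimal sketch of the two deletion rules follows, under the simplifying assumptions that the quality parameter is a single RCS value per point and that persistence is checked against the most recent past scan cycles (the patent's rule watches subsequent cycles); all names, thresholds and the nearness radius are illustrative:

```python
def filter_false_points(clouds_by_cycle, rcs_min, persist_n, radius):
    """Drop false points from the newest scan: (a) points whose assumed
    quality parameter (RCS) fails a preset threshold, and (b) points that
    were not detected within `radius` of their position in each of the
    previous `persist_n` cycles. `clouds_by_cycle` is ordered oldest to
    newest; each point is (x, y, rcs)."""
    *history, current = clouds_by_cycle

    def persisted(p):
        # Require a nearby detection in every one of the last persist_n cycles.
        for cloud in history[-persist_n:]:
            if not any((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2 <= radius ** 2
                       for q in cloud):
                return False
        return True

    return [p for p in current if p[2] >= rcs_min and persisted(p)]
```

In practice the thresholds would come from the early-stage scene tests the text describes, not from fixed constants.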
In the embodiment of the application, after the false targets are deleted, the ground-relative point cloud information of the continuous point cloud with respect to the ground is determined according to the detected vehicle motion information of the vehicle relative to the ground and the continuous point cloud, and static point clouds and dynamic point clouds are distinguished within the continuous point cloud based on that information. It can be understood that continuous point clouds whose information does not change after detection by the imaging millimeter wave radar can be classified as static point clouds, including walls, railings and parked vehicles, which are obstacles to be avoided when a travel route is planned later; continuous point clouds whose information changes after detection can be classified as dynamic point clouds, and a specific action may be decided in combination with the dynamic point cloud when a travel route is planned later. For example, when it is judged that a target corresponding to a dynamic point cloud will pass through the planned route of the vehicle, the vehicle may need to stop and wait for it to pass, or detour.
Step S13: clustering the dynamic point cloud according to the ground-relative point cloud information to obtain cluster targets, and tracking each cluster target according to the cluster target speed and cluster target motion direction it contains, to obtain tracking target information corresponding to the cluster target.
In the embodiment of the present application, the clustering process is: determining, within a second preset coordinate range, target dynamic point clouds with similar point cloud information from among the dynamic point clouds; then clustering the target dynamic point clouds according to preset contour information to obtain a cluster target, and determining the cluster point corresponding to the cluster target and the coordinates of that cluster point. It can be understood that similar point cloud information includes similar ground-relative point cloud speeds; the preset contour information is the contour and contour size of targets common in the specific scene, such targets including traffic participants like vehicles and pedestrians. Using preset contours avoids, as far as possible, clustering into 1 cluster point the dynamic point clouds of different individuals that are at close range with approximately the same speed and motion direction. The cluster point is the position where the corresponding cluster target is closest to the vehicle, or the central point of the cluster target.
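One way the grouping described above could look in Python. The patent does not name a clustering algorithm, so the greedy single-link scheme, the parameter names, and the centroid-based cluster point below are all assumptions:

```python
def cluster_dynamic_points(points, dist_max, speed_max, size_max):
    """Greedily group dynamic points: a point joins a cluster when it is
    within dist_max of some member and their speeds differ by at most
    speed_max, and a preset contour size size_max caps the cluster extent
    so nearby distinct targets are not merged. Each point is (x, y, v)."""
    clusters = []
    for p in points:
        placed = False
        for c in clusters:
            near = any(abs(p[0] - q[0]) <= dist_max
                       and abs(p[1] - q[1]) <= dist_max
                       and abs(p[2] - q[2]) <= speed_max for q in c)
            xs = [q[0] for q in c] + [p[0]]
            ys = [q[1] for q in c] + [p[1]]
            fits = (max(xs) - min(xs) <= size_max
                    and max(ys) - min(ys) <= size_max)
            if near and fits:
                c.append(p)
                placed = True
                break
        if not placed:
            clusters.append([p])
    return clusters

def cluster_point(cluster):
    """Cluster point as the centroid (the text also allows the member
    nearest the vehicle)."""
    n = len(cluster)
    return (sum(q[0] for q in cluster) / n, sum(q[1] for q in cluster) / n)
```

The `size_max` cap plays the role of the preset contour size: two pedestrians walking side by side at the same speed stay separate clusters once their joint extent exceeds a pedestrian-sized contour.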
In the embodiment of the application, after clustering is finished, the cluster target is further tracked: the target coordinate range of the cluster target at the next scan is predicted from the cluster target speed and the cluster point coordinates it contains; if, at the next scan, a point cloud to be judged is detected within that target coordinate range with a speed and motion direction similar to the cluster target's, the point cloud to be judged is identified as the same cluster target, thereby realizing target tracking and obtaining the tracking target information corresponding to the cluster target.
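The prediction-and-gate association step might be sketched as follows, assuming a constant-velocity prediction and a square gate; the function names, gate shape and tolerances are illustrative:

```python
def predict_gate(track, dt, gate_half):
    """Predict where a cluster target will be at the next scan and return
    a square gate around the constant-velocity prediction.
    `track` is (x, y, vx, vy); dt is the scan interval in seconds."""
    x, y, vx, vy = track
    px, py = x + vx * dt, y + vy * dt
    return (px - gate_half, px + gate_half, py - gate_half, py + gate_half)

def associate(track, candidates, dt, gate_half, dv_max):
    """Keep the track alive if some candidate cluster point falls inside
    the predicted gate with a similar velocity; return it, else None."""
    x0, x1, y0, y1 = predict_gate(track, dt, gate_half)
    for (x, y, vx, vy) in candidates:
        inside = x0 <= x <= x1 and y0 <= y <= y1
        similar = abs(vx - track[2]) <= dv_max and abs(vy - track[3]) <= dv_max
        if inside and similar:
            return (x, y, vx, vy)
    return None
```

With scan intervals of only tens of milliseconds, the constant-velocity assumption inside one interval matches the text's observation that a normally running vehicle's motion state has no large sudden change over that period.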
It should be noted that determining the cluster points and their coordinates during clustering reduces the computation load of the tracking process and of computing the tracking target information; the target coordinate range may be called a "wave gate". In addition, the radar scanning interval is only tens of milliseconds, which is fast compared with how quickly the motion state of a vehicle on the road changes; especially for automatic driving at low speed in specific scenes such as mining trucks and port trucks, the motion state of a normally running vehicle undergoes no large sudden change within that period, so the accuracy is high.
Step S14: and determining a travelable area according to the static point cloud, and planning a path based on the travelable area and the tracking target information so as to navigate the vehicle.
In the embodiment of the application, the area where the static point cloud is located is determined to be a non-drivable area, and the other areas are drivable areas. In addition, a non-drivable area may also be determined as follows: if it is monitored that the ground-relative point cloud information of all the continuous point clouds within a third preset coordinate range does not change over a second preset number of consecutive scanning periods, that third preset coordinate range is determined to be a non-drivable area. It should be noted that the second preset number may be 5, or may be appropriately reduced or increased according to the specific application scene, target running speed, real-time requirements and the like, so as to adjust sensitivity; the third preset coordinate range defaults to 1 meter in both length and width, and its size can likewise be adjusted to the needs of the specific application.
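A simple occupancy-grid reading of this rule, where each cell of a grid around the vehicle is marked non-drivable if any static point falls inside it; the grid representation and parameter names are assumptions, with the 1 m cell echoing the default third preset coordinate range:

```python
def drivable_grid(static_points, x_range, y_range, cell):
    """Build a coarse occupancy grid around the vehicle: cells containing
    any static point become non-drivable (False), all others drivable
    (True). `static_points` holds (x, y) pairs in the vehicle frame."""
    nx = int((x_range[1] - x_range[0]) / cell)
    ny = int((y_range[1] - y_range[0]) / cell)
    grid = [[True] * nx for _ in range(ny)]
    for x, y in static_points:
        ix = int((x - x_range[0]) / cell)
        iy = int((y - y_range[0]) / cell)
        if 0 <= ix < nx and 0 <= iy < ny:
            grid[iy][ix] = False  # static obstacle occupies this cell
    return grid
```

A path planner would then search only the True cells, treating the False ones as the obstacles (walls, railings, parked vehicles) that the static point cloud represents.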
In the embodiment of the application, for road conditions such as steps or cliffs lower than the current road surface, the imaging millimeter wave radar cannot accurately identify the road condition, and other sensors are needed to assist. When the method is applied to automatic driving in a specific scene, for example mining trucks in an open-pit mine, a map of the collected operation area is generally used in combination with the Beidou satellite positioning system to prevent the vehicle from mistakenly driving off a cliff.
Therefore, the point clouds located in the respective radar coordinate systems and obtained by the respective imaging millimeter wave radars are converted into the vehicle coordinate system, so that point cloud splicing is carried out to obtain a continuous point cloud, the continuous point cloud comprising the continuous point cloud speed and continuous point cloud coordinates of the continuous point cloud relative to the vehicle, and each radar coordinate system and the vehicle coordinate system being rectangular coordinate systems; the ground-relative point cloud information of the continuous point cloud with respect to the ground is determined according to the detected vehicle motion information of the vehicle relative to the ground and the continuous point cloud, and static point clouds and dynamic point clouds are distinguished within the continuous point cloud based on that information; the dynamic point cloud is clustered according to the ground-relative point cloud information to obtain cluster targets, and each cluster target is tracked according to the cluster target speed and motion direction it contains, to obtain the corresponding tracking target information; and a drivable area is determined according to the static point cloud, and path planning is performed based on the drivable area and the tracking target information so as to navigate the vehicle.
In this way, the method splices the point clouds obtained by different imaging millimeter wave radars into a continuous point cloud representing the vehicle's surroundings, distinguishes dynamic and static point clouds according to the ground-relative point cloud information determined from the vehicle motion information and the continuous point cloud, clusters the dynamic point cloud into cluster targets, tracks the cluster targets to obtain tracking target information, determines the drivable area using the static point cloud, and then performs path planning according to the tracking target information and the drivable area to navigate the vehicle.
Referring to fig. 5, an embodiment of the present application discloses a specific imaging millimeter wave radar-based vehicle navigation method, which includes:
step S21: converting point clouds located in radar coordinate systems and obtained by imaging millimeter wave radars into a vehicle coordinate system so as to carry out point cloud splicing to obtain continuous point clouds; the continuous point cloud comprises a continuous point cloud speed and continuous point cloud coordinates of the continuous point cloud relative to the vehicle; and each radar coordinate system and the vehicle coordinate system are rectangular coordinate systems.
For a more specific processing procedure of step S21, reference may be made to corresponding contents disclosed in the foregoing embodiments, and details are not repeated here.
Step S22: detecting the vehicle motion information of the vehicle relative to the ground, including the vehicle motion speed and the vehicle yaw rate; determining the ground-relative point cloud speed of the continuous point cloud with respect to the ground according to the vehicle motion information and the continuous point cloud speed in the continuous point cloud; and if the ground-relative point cloud speed of the continuous point cloud is less than a preset speed threshold, determining the continuous point cloud to be a static point cloud, and if it is not less than the preset speed threshold, determining the continuous point cloud to be a dynamic point cloud.
In the embodiment of the application, the radar point cloud information is obtained, including the radar point cloud coordinates, radar point cloud distance, radar point cloud speed and radar point cloud azimuth angle of the radar point cloud relative to the radar; the vehicle motion information of the vehicle relative to the ground is obtained, including the vehicle motion speed and vehicle yaw rate; and the radar installation information is obtained, including the coordinate information of each imaging millimeter wave radar in the vehicle coordinate system and the yaw angle of each imaging millimeter wave radar. Through the above radar point cloud information, vehicle motion information and radar installation information, the relative motion relations among the radar point cloud, the vehicle and the ground are obtained, the ground-relative point cloud information is deduced, and static and dynamic point clouds are then further distinguished according to that information. It should be noted that in the above process the continuous point cloud is obtained according to the radar installation information and the radar point cloud information, and comprises the continuous point cloud information of the continuous point cloud relative to the vehicle; the ground-relative point cloud information can be deduced using the continuous point cloud and the vehicle motion information, after which the static and dynamic point clouds are distinguished. Specifically, the ground-relative point cloud speed of the continuous point cloud with respect to the ground is determined according to the vehicle motion information and the continuous point cloud speed in the continuous point cloud.
If the ground-relative point cloud speed of the continuous point cloud is less than the preset speed threshold, the continuous point cloud is determined to be a static point cloud; if it is not less than the preset speed threshold, the continuous point cloud is determined to be a dynamic point cloud. It will be appreciated that the vehicle yaw rate is the physical quantity characterizing how fast the vehicle's heading angle changes, in degrees/second or radians/second.
It should be noted that, since the ground-relative point cloud speed includes a speed component in the X-axis direction and one in the Y-axis direction, the preset speed threshold includes a first preset speed threshold for the X-axis direction and a second preset speed threshold for the Y-axis direction. When the X-direction ground-relative speed of a continuous point cloud is less than the first preset speed threshold and its Y-direction ground-relative speed is less than the second preset speed threshold, the continuous point cloud is determined to be a static point cloud; otherwise, it is determined to be a dynamic point cloud.
Specifically, the yaw rate of the vehicle, also called the angular velocity, is denoted ω. For a target in circular motion of radius r, the corresponding linear velocity is v = ω·r, where x and y are the projections of r onto the axes of a rectangular coordinate system whose origin is the center of the circle. Because this linear velocity is tangential, perpendicular to the straight line determined by the origin and the coordinate point, it is decomposed for convenience into 2 components along the axes of the vehicle coordinate system, v_line_x and v_line_y, which can be calculated as:
v_line_x = -ω·y;
v_line_y = ω·x;
It can be understood that the radar point cloud speed v_pc of the radar point cloud relative to the radar has component v_pc_x on the X axis and component v_pc_y on the Y axis. Performing vector addition and subtraction with v_pc_x and v_pc_y, the vehicle speed v_vehicle, and the linear velocity components v_line_x and v_line_y yields the ground-relative point cloud speed v_ground. For example, when the vehicle drives straight, v_line = 0; if the transverse relative velocity v_pc_x of the radar point cloud is also 0, adding the longitudinal speeds gives the ground-relative speed of the continuous point cloud v_ground = v_ground_y = v_vehicle + v_pc_y. When the continuous point cloud is a stationary point cloud, v_pc_y is equal in magnitude to v_vehicle but opposite in direction; that is, whether v_vehicle + v_pc_y equals 0 determines whether the continuous point cloud is static. Of course, if v_vehicle + v_pc_y = 0 but v_pc_x ≠ 0, the continuous point cloud is still a dynamic point cloud. When the vehicle is turning, v_ground_x = v_pc_x + v_line_x and v_ground_y = v_vehicle + v_pc_y + v_line_y; if both are equal to 0, the continuous point cloud is judged to be a static point cloud, otherwise a dynamic point cloud.
In practice the radar's detection data jitters within a certain range, so when |v_ground_x| is less than the first preset speed threshold and |v_ground_y| is less than the second preset speed threshold, the continuous point cloud is judged to be a static point cloud, and in all other cases a dynamic point cloud.
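As a compact illustration of this velocity composition, the Python sketch below computes a point's ground-relative velocity from its radar-relative components, the vehicle speed and the yaw rate, then applies the two per-axis thresholds. The function names, the counterclockwise sign convention for ω, and the threshold handling are illustrative assumptions, not the patent's prescribed implementation:

```python
def ground_velocity(point_xy, v_pc, v_vehicle, omega):
    """Ground-relative velocity of one point-cloud detection, composed as
    in the text: radar-relative components (v_pc_x, v_pc_y) plus vehicle
    speed plus the turning linear-velocity components v_line_x = -omega*y
    and v_line_y = omega*x evaluated at the point (x, y)."""
    x, y = point_xy
    v_pc_x, v_pc_y = v_pc
    v_line_x, v_line_y = -omega * y, omega * x
    return (v_pc_x + v_line_x, v_vehicle + v_pc_y + v_line_y)

def is_static(point_xy, v_pc, v_vehicle, omega, thr_x, thr_y):
    """Static when both ground-relative components stay inside the preset
    per-axis thresholds that absorb measurement jitter."""
    gx, gy = ground_velocity(point_xy, v_pc, v_vehicle, omega)
    return abs(gx) < thr_x and abs(gy) < thr_y
```

For example, a wall 20 m ahead of a vehicle driving straight at 10 m/s appears with v_pc_y = -10 m/s, so its ground-relative speed composes to zero and it is classified static.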
Step S23: clustering the dynamic point cloud according to the ground-relative point cloud information to obtain cluster targets, and tracking each cluster target according to the cluster target speed and motion direction it contains, to obtain the tracking target information corresponding to the cluster target.
For a more specific processing procedure of step S23, reference may be made to corresponding contents disclosed in the foregoing embodiments, and details are not repeated here.
Step S24: and determining a travelable area according to the static point cloud, and planning a path based on the travelable area and the tracking target information so as to navigate the vehicle.
For a more specific processing procedure of step S24, reference may be made to corresponding contents disclosed in the foregoing embodiments, and details are not repeated here.
Therefore, the point clouds located in the respective radar coordinate systems and obtained by the respective imaging millimeter wave radars are converted into the vehicle coordinate system, so that point cloud splicing is carried out to obtain a continuous point cloud, the continuous point cloud comprising the continuous point cloud speed and continuous point cloud coordinates relative to the vehicle, and each radar coordinate system and the vehicle coordinate system being rectangular coordinate systems; the vehicle motion information of the vehicle relative to the ground, including the vehicle motion speed and vehicle yaw rate, is detected; the ground-relative point cloud speed of the continuous point cloud with respect to the ground is determined according to the vehicle motion information and the continuous point cloud speed; if the ground-relative point cloud speed of a continuous point cloud is less than the preset speed threshold, it is determined to be a static point cloud, and otherwise a dynamic point cloud; the dynamic point cloud is clustered according to the ground-relative point cloud information to obtain cluster targets, and each cluster target is tracked according to the cluster target speed and motion direction it contains, to obtain the corresponding tracking target information; and a drivable area is determined according to the static point cloud, and path planning is performed based on the drivable area and the tracking target information so as to navigate the vehicle.
In this way, the method splices the point clouds obtained by different imaging millimeter wave radars into a continuous point cloud representing the vehicle's surroundings, distinguishes dynamic and static point clouds according to the ground-relative point cloud speed determined from the vehicle motion information and the continuous point cloud, clusters the dynamic point cloud into cluster targets, tracks the cluster targets to obtain tracking target information, determines the drivable area using the static point cloud, and then performs path planning according to the tracking target information and the drivable area to navigate the vehicle.
As shown in fig. 6, a vehicle automatic driving system used in the embodiment of the present application comprises an imaging millimeter wave radar subsystem and other on-vehicle equipment such as a combined navigation system and a power supply; the system belongs to a subset of the whole automatic driving system and outputs preliminarily processed targets and judgment result data to the main control unit of the automatic driving system for fusion operations. The other equipment includes other sensors, and the vehicle automatic driving system includes the millimeter wave radar sensors, a radar gateway and a processor. In the imaging millimeter wave radar subsystem, radar 1, radar 2, radar 3, radar 4, radar 5 and radar 6 send point clouds to the radar gateway over Ethernet; a network switch built into the radar gateway collects the point clouds output through each imaging millimeter wave radar's Ethernet interface and forwards them to the processor through a gigabit network interface. The power supply among the other equipment powers the processor through a 12V DC/10A supply line, the other sensors transmit the vehicle speed and vehicle yaw rate to the processor through a CAN bus or RS232, and the processor transmits the processed results to the other equipment through an output interface for path planning, so as to navigate the vehicle.
It should be noted that converting the point clouds in the radar coordinate systems into the vehicle coordinate system, splicing them into a continuous point cloud, determining and deleting false targets, distinguishing dynamic point clouds from static point clouds, clustering the target point clouds into cluster targets, tracking the cluster targets to obtain tracking target information, determining the drivable area using the static point cloud, and so on, are all completed by the processor, wherein the processor includes but is not limited to an embedded processor.
Referring to fig. 7, an embodiment of the present application discloses a vehicle navigation device based on an imaging millimeter wave radar, including:
the point cloud splicing module 11 is used for converting point clouds, obtained by the imaging millimeter wave radars, located in radar coordinate systems into a vehicle coordinate system so as to perform point cloud splicing to obtain continuous point clouds; the continuous point cloud comprises a continuous point cloud speed and continuous point cloud coordinates of the continuous point cloud relative to the vehicle; each radar coordinate system and each vehicle coordinate system are rectangular coordinate systems;
the point cloud distinguishing module 12 is configured to determine the ground-relative point cloud information of the continuous point cloud with respect to the ground according to the detected vehicle motion information of the vehicle relative to the ground and the continuous point cloud, and to distinguish static point clouds and dynamic point clouds within the continuous point cloud based on the ground-relative point cloud information;
the clustering module 13 is configured to cluster the dynamic point cloud according to the ground-relative point cloud information to obtain cluster targets, and to track each cluster target according to the cluster target speed and cluster target motion direction it contains, obtaining the tracking target information corresponding to the cluster target;
and the path planning module 14 is configured to determine a travelable area according to the static point cloud, and perform path planning based on the travelable area and the tracking target information, so as to navigate the vehicle.
For more specific working processes of the modules, reference may be made to corresponding contents disclosed in the foregoing embodiments, and details are not repeated here.
Therefore, the point clouds located in the respective radar coordinate systems and obtained by the respective imaging millimeter wave radars are converted into the vehicle coordinate system, so that point cloud splicing is carried out to obtain a continuous point cloud, the continuous point cloud comprising the continuous point cloud speed and continuous point cloud coordinates of the continuous point cloud relative to the vehicle, and each radar coordinate system and the vehicle coordinate system being rectangular coordinate systems; the ground-relative point cloud information of the continuous point cloud with respect to the ground is determined according to the detected vehicle motion information of the vehicle relative to the ground and the continuous point cloud, and static point clouds and dynamic point clouds are distinguished within the continuous point cloud based on that information; the dynamic point cloud is clustered according to the ground-relative point cloud information to obtain cluster targets, and each cluster target is tracked according to the cluster target speed and motion direction it contains, to obtain the corresponding tracking target information; and a drivable area is determined according to the static point cloud, and path planning is performed based on the drivable area and the tracking target information so as to navigate the vehicle.
In this way, the device splices the point clouds obtained by different imaging millimeter wave radars into a continuous point cloud representing the vehicle's surroundings, distinguishes dynamic and static point clouds according to the ground-relative point cloud information determined from the vehicle motion information and the continuous point cloud, clusters the dynamic point cloud into cluster targets, tracks the cluster targets to obtain tracking target information, determines the drivable area using the static point cloud, and then performs path planning according to the tracking target information and the drivable area to navigate the vehicle.
Further, an embodiment of the present application provides an electronic device. Fig. 8 is a schematic structural diagram of an electronic device 20 according to an exemplary embodiment, and nothing in the figure should be construed as limiting the scope of the application. The electronic device 20 may specifically include: at least one processor 21, at least one memory 22, a power supply 23, an input/output interface 24, a communication interface 25, and a communication bus 26. The memory 22 is used for storing a computer program, which is loaded and executed by the processor 21 to implement the relevant steps of the imaging millimeter wave radar-based vehicle navigation method disclosed in any of the foregoing embodiments.
In this embodiment, the power supply 23 is configured to provide a working voltage for each hardware device on the electronic device 20; the communication interface 25 can create a data transmission channel between the electronic device 20 and an external device, and the communication protocol it follows may be any communication protocol applicable to the technical solution of the present application, which is not specifically limited herein; the input/output interface 24 is configured to obtain external input data or to output data to the outside, and its specific interface type may be selected according to the application requirements, which is likewise not specifically limited herein.
In addition, the memory 22 serves as a carrier for resource storage and may be a read-only memory, a random access memory, a magnetic disk, an optical disk, or the like; it may include a random access memory serving as running memory and a non-volatile external memory for persistent storage. The resources stored thereon include an operating system 221, a computer program 222, and the like, and the storage may be transient or permanent.
The operating system 221 is used for managing and controlling each hardware device and the computer program 222 on the electronic device 20, and may be Windows, Unix, Linux, or the like. In addition to the computer program that performs the imaging millimeter wave radar-based vehicle navigation method executed by the electronic device 20 disclosed in any of the foregoing embodiments, the computer program 222 may further include a computer program for performing other specific tasks.
In this embodiment, the input/output interface 24 may specifically include, but is not limited to, a USB interface, a hard disk reading interface, a serial interface, a voice input interface, a fingerprint input interface, and the like.
Further, the embodiment of the application also discloses a computer readable storage medium for storing a computer program; wherein the computer program when executed by a processor implements the imaging millimeter wave radar-based vehicle navigation method disclosed above.
For the specific steps of the method, reference may be made to the corresponding contents disclosed in the foregoing embodiments, and details are not repeated here.
A computer-readable storage medium as referred to herein includes a random access memory (RAM), a memory, a read-only memory (ROM), an electrically programmable ROM, an electrically erasable programmable ROM, a register, a hard disk, a magnetic disk, an optical disk, or any other form of storage medium known in the art.
The embodiments are described in a progressive manner; each embodiment focuses on its differences from the others, and for the same or similar parts reference may be made between the embodiments. Since the device disclosed in an embodiment corresponds to the vehicle navigation method based on the imaging millimeter wave radar disclosed in an embodiment, its description is relatively brief, and for relevant details reference may be made to the description of the method.
Those of skill would further appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or combinations of both. To clearly illustrate this interchangeability of hardware and software, the various illustrative components and steps have been described above generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
The steps of an algorithm described in connection with the embodiments disclosed herein may be embodied directly in hardware, in a software module executed by a processor, or in a combination of the two. A software module may reside in Random Access Memory (RAM), memory, Read Only Memory (ROM), electrically programmable ROM, electrically erasable programmable ROM, registers, hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art.
Finally, it should also be noted that, herein, relational terms such as first and second may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Moreover, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
The method, device, equipment, and medium for vehicle navigation based on the imaging millimeter wave radar provided by the present application have been described in detail above. Specific examples are used herein to explain the principle and implementation of the invention, and the description of the embodiments is only intended to help understand the method and its core idea. Meanwhile, for a person skilled in the art, there may be variations in the specific implementation and application scope according to the idea of the present invention. In summary, the content of this specification should not be construed as limiting the present invention.
Claims (10)
1. A vehicle navigation method based on an imaging millimeter wave radar is characterized by comprising the following steps:
converting point clouds located in radar coordinate systems and obtained by imaging millimeter wave radars into a vehicle coordinate system so as to carry out point cloud splicing to obtain continuous point clouds; the continuous point cloud comprises a continuous point cloud speed and continuous point cloud coordinates of the continuous point cloud relative to the vehicle; each radar coordinate system and each vehicle coordinate system are rectangular coordinate systems;
determining ground-relative point cloud information of the continuous point cloud according to the detected vehicle motion information of the vehicle relative to the ground and the continuous point cloud, and distinguishing a static point cloud and a dynamic point cloud from the continuous point cloud based on the ground-relative point cloud information;
clustering the dynamic point cloud according to the point cloud information to obtain a clustering target, and tracking the clustering target according to the clustering target speed and the clustering target moving direction contained in the clustering target to obtain tracking target information corresponding to the clustering target;
and determining a travelable area according to the static point cloud, and planning a path based on the travelable area and the tracking target information so as to navigate the vehicle.
2. The imaging millimeter wave radar-based vehicle navigation method according to claim 1, wherein the step of converting the point clouds obtained by the imaging millimeter wave radars in their respective radar coordinate systems into the vehicle coordinate system for point cloud splicing to obtain the continuous point cloud comprises:
acquiring radar installation information comprising the coordinate information of each imaging millimeter wave radar in the vehicle coordinate system and the yaw angle of each imaging millimeter wave radar;
calculating coordinates of the point clouds in the imaging millimeter wave radar coordinate systems in the vehicle coordinate system according to the radar installation information to obtain multiple groups of point cloud coordinates corresponding to different imaging millimeter wave radars in the vehicle coordinate system;
and splicing the multiple groups of point cloud coordinates to obtain corresponding continuous point clouds.
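As an illustration only (not part of the claims), the conversion and splicing steps above can be sketched in Python for a planar (x, y) case; the function names, the two-dimensional rigid-transform model, and the example mounting values are assumptions:

```python
import math

def radar_to_vehicle(points, mount_x, mount_y, yaw):
    """Rotate each radar-frame point by the radar's mounting yaw angle and
    translate by its mounting position to express it in the vehicle frame."""
    c, s = math.cos(yaw), math.sin(yaw)
    out = []
    for (x, y) in points:
        out.append((mount_x + c * x - s * y,
                    mount_y + s * x + c * y))
    return out

def stitch(clouds):
    """Concatenate per-radar vehicle-frame clouds into one continuous cloud."""
    merged = []
    for cloud in clouds:
        merged.extend(cloud)
    return merged
```

For example, a front radar mounted at (3.8, 0) with zero yaw maps its point (1, 0) to (4.8, 0) in the vehicle frame, and the per-radar results are then merged into a single continuous point cloud.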
3. The imaging millimeter wave radar-based vehicle navigation method according to claim 1, further comprising, before determining ground-relative point cloud information of the continuous point cloud according to the detected vehicle motion information of the vehicle relative to the ground and the continuous point cloud:
if a point cloud quality parameter corresponding to the continuous point cloud satisfies a preset parameter threshold filtering condition, determining the continuous point cloud as a false point cloud, and then deleting the false point cloud;
and/or if the continuous point cloud appears in the current scanning period but cannot be continuously detected in a first preset coordinate range in a first preset number of continuous scanning periods, determining the continuous point cloud as a false point cloud, and deleting the false point cloud.
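The two false-point rules above might look like the following sketch (illustrative only; the quality threshold, the coordinate radius standing in for the "first preset coordinate range", and the scan-history representation are assumptions):

```python
import math

def near(p, q, radius):
    """True if two (x, y) returns lie within the watched coordinate radius."""
    return math.hypot(p[0] - q[0], p[1] - q[1]) <= radius

def is_false_point(quality, quality_thresh, point, later_scans, radius):
    # Rule 1: a quality metric below the preset threshold marks a false point.
    if quality < quality_thresh:
        return True
    # Rule 2: the point must reappear within `radius` in every one of the
    # following scan periods; a single miss marks it as a flickering return.
    for scan in later_scans:
        if not any(near(point, q, radius) for q in scan):
            return True
    return False
```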
4. The imaging millimeter wave radar-based vehicle navigation method according to claim 1, wherein determining ground-relative point cloud information of the continuous point cloud according to the detected vehicle motion information of the vehicle relative to the ground and the continuous point cloud, and distinguishing the static point cloud and the dynamic point cloud from the continuous point cloud based on the ground-relative point cloud information comprises:
detecting the vehicle motion information including vehicle motion speed and vehicle yaw rate of the vehicle relative to the ground;
determining the ground-relative point cloud speed of the continuous point cloud according to the vehicle motion information and the continuous point cloud speed in the continuous point cloud;
and if the ground-relative point cloud speed of the continuous point cloud is less than a preset speed threshold, determining the continuous point cloud as a static point cloud, and if the ground-relative point cloud speed of the continuous point cloud is not less than the preset speed threshold, determining the continuous point cloud as a dynamic point cloud.
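The ego-motion compensation behind this static/dynamic split can be illustrated with a planar rigid-body model (point velocity = ego velocity + yaw rate × position); this is a sketch under assumed names and a 2D radial-speed simplification, not the patent's exact formulation:

```python
import math

def ground_radial_speed(px, py, radial_speed, ego_speed, yaw_rate):
    """Compensate a point's measured radial speed (relative to the vehicle)
    with the ego motion at that point to estimate its speed over ground."""
    # Ego-induced velocity of a ground-fixed point at (px, py), vehicle frame.
    vx = ego_speed - yaw_rate * py
    vy = yaw_rate * px
    # Project the ego velocity onto the line of sight and add it back.
    r = math.hypot(px, py) or 1.0
    return radial_speed + (vx * px + vy * py) / r

def split_static_dynamic(cloud, ego_speed, yaw_rate, speed_thresh):
    """Classify (x, y, radial_speed) points as static or dynamic by comparing
    the ground-relative speed against a preset speed threshold."""
    static, dynamic = [], []
    for (px, py, vr) in cloud:
        if abs(ground_radial_speed(px, py, vr, ego_speed, yaw_rate)) < speed_thresh:
            static.append((px, py, vr))
        else:
            dynamic.append((px, py, vr))
    return static, dynamic
```

A guardrail 10 m ahead of a vehicle driving at 5 m/s is measured closing at -5 m/s; after compensation its ground-relative speed is 0, so it is classified as static.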
5. The imaging millimeter wave radar-based vehicle navigation method according to claim 1, wherein the clustering the dynamic point cloud according to the point cloud information to obtain a clustering target comprises:
determining a target dynamic point cloud with similar point cloud information from the dynamic point cloud within a second preset coordinate range;
and clustering the target dynamic point cloud according to preset contour information to obtain a clustering target, and determining a clustering point corresponding to the clustering target and a coordinate of the clustering point.
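The proximity-and-similarity clustering above can be approximated by a simple region-growing pass; the distance and speed thresholds and the (x, y, radial speed) point representation are assumptions:

```python
import math
from collections import deque

def cluster(points, dist_thresh, speed_thresh):
    """Greedy region growing: points within `dist_thresh` of each other whose
    radial speeds differ by less than `speed_thresh` share a cluster."""
    unvisited = set(range(len(points)))
    clusters = []
    while unvisited:
        seed = unvisited.pop()
        group, queue = [seed], deque([seed])
        while queue:
            i = queue.popleft()
            for j in list(unvisited):
                dx = points[i][0] - points[j][0]
                dy = points[i][1] - points[j][1]
                if (math.hypot(dx, dy) <= dist_thresh
                        and abs(points[i][2] - points[j][2]) <= speed_thresh):
                    unvisited.remove(j)
                    group.append(j)
                    queue.append(j)
        clusters.append(group)
    return clusters
```

A cluster point (for example the centroid of each group) and its coordinates can then be derived per cluster, matching the claim's clustering-point step.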
6. The imaging millimeter wave radar-based vehicle navigation method according to claim 5, wherein tracking the clustering target according to the clustering target speed and the clustering target moving direction contained in the clustering target to obtain the tracking target information corresponding to the clustering target comprises:
predicting the target coordinate range where the clustering target will be located in the next scan according to the clustering target speed, the clustering target moving direction, and the clustering point coordinates contained in the clustering target;
and if, within the target coordinate range during the next scan, a point cloud to be judged is detected whose speed and motion direction are similar to those of the clustering target, judging the point cloud to be judged as the clustering target, so as to realize target tracking and obtain the tracking target information corresponding to the clustering target.
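The gate-based association above (predict where the cluster will be on the next scan, then accept a similar return inside that range) might be sketched as follows; the circular gate shape and the tolerance parameters are assumptions:

```python
import math

def predict_gate(cx, cy, vx, vy, dt, gate_radius):
    """Predicted coordinate range (modeled here as a circular gate) where the
    clustering target should reappear on the next scan."""
    return (cx + vx * dt, cy + vy * dt, gate_radius)

def associate(gate, candidates, speed, heading, speed_tol, heading_tol):
    """A candidate (x, y, speed, heading) inside the gate with similar speed
    and heading is taken to be the same target on the next scan."""
    gx, gy, r = gate
    for (x, y, s, h) in candidates:
        if (math.hypot(x - gx, y - gy) <= r
                and abs(s - speed) <= speed_tol
                and abs(h - heading) <= heading_tol):
            return (x, y, s, h)
    return None
```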
7. The imaging millimeter wave radar-based vehicle navigation method according to any one of claims 1 to 6, further comprising:
and if it is monitored that the point cloud information of all continuous point clouds within a third preset coordinate range does not change over a second preset number of continuous scanning periods, determining the third preset coordinate range as a non-drivable area.
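The unchanged-point-cloud check above could be sketched as comparing the monitored range's point set across consecutive scan periods; the tolerance and the per-scan list-of-(x, y) layout are assumptions:

```python
def is_non_drivable(scans, tol):
    """If the point set inside the watched coordinate range stays unchanged
    (within `tol`) over all monitored scan periods, flag it non-drivable."""
    first = sorted(scans[0])
    for scan in scans[1:]:
        if len(scan) != len(first):
            return False
        if any(abs(a[0] - b[0]) > tol or abs(a[1] - b[1]) > tol
               for a, b in zip(sorted(scan), first)):
            return False
    return True
```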
8. A vehicle navigation device based on imaging millimeter wave radar, characterized by comprising:
the point cloud splicing module is used for converting point clouds, obtained by the imaging millimeter wave radars, located in radar coordinate systems into a vehicle coordinate system so as to carry out point cloud splicing to obtain continuous point clouds; the continuous point cloud comprises a continuous point cloud speed and continuous point cloud coordinates of the continuous point cloud relative to the vehicle; each radar coordinate system and each vehicle coordinate system are rectangular coordinate systems;
the point cloud distinguishing module is used for determining ground-relative point cloud information of the continuous point cloud according to the detected vehicle motion information of the vehicle relative to the ground and the continuous point cloud, and distinguishing a static point cloud and a dynamic point cloud from the continuous point cloud based on the ground-relative point cloud information;
the clustering module is used for clustering the dynamic point cloud according to the point cloud information to obtain a clustering target, and tracking the clustering target according to the clustering target speed and the clustering target motion direction contained in the clustering target to obtain tracking target information corresponding to the clustering target;
and the path planning module is used for determining a travelable area according to the static point cloud and planning a path based on the travelable area and the tracking target information so as to navigate the vehicle.
9. An electronic device comprising a processor and a memory; wherein the processor, when executing the computer program stored in the memory, implements the imaging millimeter wave radar-based vehicle navigation method of any of claims 1 to 7.
10. A computer-readable storage medium for storing a computer program; wherein the computer program when executed by a processor implements the imaging millimeter wave radar-based vehicle navigation method of any of claims 1 to 7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210108581.5A CN114442101B (en) | 2022-01-28 | 2022-01-28 | Vehicle navigation method, device, equipment and medium based on imaging millimeter wave radar |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210108581.5A CN114442101B (en) | 2022-01-28 | 2022-01-28 | Vehicle navigation method, device, equipment and medium based on imaging millimeter wave radar |
Publications (2)
Publication Number | Publication Date |
---|---|
CN114442101A true CN114442101A (en) | 2022-05-06 |
CN114442101B CN114442101B (en) | 2023-11-14 |
Family
ID=81372193
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202210108581.5A Active CN114442101B (en) | 2022-01-28 | 2022-01-28 | Vehicle navigation method, device, equipment and medium based on imaging millimeter wave radar |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114442101B (en) |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114637006A (en) * | 2022-05-07 | 2022-06-17 | 长沙莫之比智能科技有限公司 | Early warning area self-adaptive adjustment method based on millimeter wave personnel fall detection radar |
CN114690134A (en) * | 2022-03-14 | 2022-07-01 | 重庆长安汽车股份有限公司 | Fidelity testing method for millimeter wave radar model and readable storage medium |
CN115017467A (en) * | 2022-08-08 | 2022-09-06 | 北京主线科技有限公司 | Method and device for compensating following target and storage medium |
CN115290104A (en) * | 2022-07-14 | 2022-11-04 | 襄阳达安汽车检测中心有限公司 | Simulation map generation method, device, equipment and readable storage medium |
CN116953704A (en) * | 2022-12-23 | 2023-10-27 | 河北德冠隆电子科技有限公司 | Wisdom is adjustable omnidirectionally scanning millimeter wave radar of multidimension angle for transportation |
CN117148837A (en) * | 2023-08-31 | 2023-12-01 | 上海木蚁机器人科技有限公司 | Dynamic obstacle determination method, device, equipment and medium |
Citations (39)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108226924A (en) * | 2018-01-11 | 2018-06-29 | 李烜 | Running car environment detection method, apparatus and its application based on millimetre-wave radar |
CN108345007A (en) * | 2017-01-23 | 2018-07-31 | 郑州宇通客车股份有限公司 | A kind of obstacle recognition method and device |
CN108345823A (en) * | 2017-01-23 | 2018-07-31 | 郑州宇通客车股份有限公司 | A kind of barrier tracking and device based on Kalman filtering |
CN108710828A (en) * | 2018-04-18 | 2018-10-26 | 北京汽车集团有限公司 | The method, apparatus and storage medium and vehicle of identification object |
CN109740628A (en) * | 2018-12-03 | 2019-05-10 | 深圳市华讯方舟太赫兹科技有限公司 | Point cloud clustering method, image processing equipment and the device with store function |
CN109870680A (en) * | 2018-10-26 | 2019-06-11 | 北京润科通用技术有限公司 | A kind of objective classification method and device |
CN110084840A (en) * | 2019-04-24 | 2019-08-02 | 百度在线网络技术(北京)有限公司 | Point cloud registration method, device, server and computer-readable medium |
CN110208793A (en) * | 2019-04-26 | 2019-09-06 | 纵目科技(上海)股份有限公司 | DAS (Driver Assistant System), method, terminal and medium based on millimetre-wave radar |
CN110310294A (en) * | 2019-07-08 | 2019-10-08 | 江苏易图地理信息科技股份有限公司 | A kind of point cloud segmentation method using adaptivenon-uniform sampling face fuzzy C-means clustering |
CN110412516A (en) * | 2019-08-20 | 2019-11-05 | 河北德冠隆电子科技有限公司 | Detection method and device of the millimetre-wave radar to stationary object and slow changing object |
CN110914703A (en) * | 2017-07-31 | 2020-03-24 | 深圳市大疆创新科技有限公司 | Correction of motion-based inaccuracies in point clouds |
CN110969855A (en) * | 2019-12-13 | 2020-04-07 | 长沙莫之比智能科技有限公司 | Traffic flow monitoring system based on millimeter wave radar |
EP3633404A1 (en) * | 2018-10-02 | 2020-04-08 | Ibeo Automotive Systems GmbH | Method and apparatus for optical distance measurements |
CN111239766A (en) * | 2019-12-27 | 2020-06-05 | 北京航天控制仪器研究所 | Water surface multi-target rapid identification and tracking method based on laser radar |
CN111260683A (en) * | 2020-01-09 | 2020-06-09 | 合肥工业大学 | Target detection and tracking method and device for three-dimensional point cloud data |
CN111582352A (en) * | 2020-04-30 | 2020-08-25 | 上海高仙自动化科技发展有限公司 | Object-based sensing method and device, robot and storage medium |
CN111797734A (en) * | 2020-06-22 | 2020-10-20 | 广州视源电子科技股份有限公司 | Vehicle point cloud data processing method, device, equipment and storage medium |
CN111857168A (en) * | 2020-07-03 | 2020-10-30 | 北京二郎神科技有限公司 | Unmanned aerial vehicle positioning method and device and unmanned aerial vehicle parking attitude adjusting method and device |
WO2021017314A1 (en) * | 2019-07-29 | 2021-02-04 | 浙江商汤科技开发有限公司 | Information processing method, information positioning method and apparatus, electronic device and storage medium |
JPWO2021053811A1 (en) * | 2019-09-20 | 2021-03-25 | ||
CN112847343A (en) * | 2020-12-29 | 2021-05-28 | 深圳市普渡科技有限公司 | Dynamic target tracking and positioning method, device, equipment and storage medium |
CN113031005A (en) * | 2021-02-22 | 2021-06-25 | 江苏大学 | Crane dynamic obstacle identification method based on laser radar |
CN113139607A (en) * | 2021-04-27 | 2021-07-20 | 苏州挚途科技有限公司 | Obstacle detection method and device |
CN113156414A (en) * | 2020-12-16 | 2021-07-23 | 中国人民解放军陆军工程大学 | Intelligent sensing and path planning transportation system based on MIMO millimeter wave radar |
CN113313200A (en) * | 2021-06-21 | 2021-08-27 | 中国科学院自动化研究所苏州研究院 | Point cloud fine matching method based on normal constraint |
CN113391270A (en) * | 2021-06-11 | 2021-09-14 | 森思泰克河北科技有限公司 | False target suppression method and device for multi-radar point cloud fusion and terminal equipment |
JPWO2021181647A1 (en) * | 2020-03-13 | 2021-09-16 | ||
CN113479218A (en) * | 2021-08-09 | 2021-10-08 | 哈尔滨工业大学 | Roadbed automatic driving auxiliary detection system and control method thereof |
CN113537316A (en) * | 2021-06-30 | 2021-10-22 | 南京理工大学 | Vehicle detection method based on 4D millimeter wave radar point cloud |
CN113589288A (en) * | 2021-06-24 | 2021-11-02 | 广西综合交通大数据研究院 | Target screening method, device and equipment based on millimeter wave radar and storage medium |
CN113671481A (en) * | 2021-07-21 | 2021-11-19 | 西安电子科技大学 | 3D multi-target tracking processing method based on millimeter wave radar |
CN113674355A (en) * | 2021-07-06 | 2021-11-19 | 中国北方车辆研究所 | Target identification and positioning method based on camera and laser radar |
CN113689471A (en) * | 2021-09-09 | 2021-11-23 | 中国联合网络通信集团有限公司 | Target tracking method and device, computer equipment and storage medium |
CN113721234A (en) * | 2021-08-30 | 2021-11-30 | 南京慧尔视智能科技有限公司 | Vehicle-mounted millimeter wave radar point cloud data dynamic and static separation filtering method and device |
CN113761238A (en) * | 2021-08-27 | 2021-12-07 | 广州文远知行科技有限公司 | Point cloud storage method, device, equipment and storage medium |
CN113792699A (en) * | 2021-09-24 | 2021-12-14 | 北京易航远智科技有限公司 | Object-level rapid scene recognition method based on semantic point cloud |
CN113807168A (en) * | 2021-08-05 | 2021-12-17 | 北京蜂云科创信息技术有限公司 | Vehicle driving environment sensing method, vehicle-mounted equipment and storage medium |
CN113848545A (en) * | 2021-09-01 | 2021-12-28 | 电子科技大学 | Fusion target detection and tracking method based on vision and millimeter wave radar |
CN113888748A (en) * | 2021-09-27 | 2022-01-04 | 北京经纬恒润科技股份有限公司 | Point cloud data processing method and device |
- 2022-01-28: CN202210108581.5A patent/CN114442101B/en active Active
Patent Citations (39)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108345007A (en) * | 2017-01-23 | 2018-07-31 | 郑州宇通客车股份有限公司 | A kind of obstacle recognition method and device |
CN108345823A (en) * | 2017-01-23 | 2018-07-31 | 郑州宇通客车股份有限公司 | A kind of barrier tracking and device based on Kalman filtering |
CN110914703A (en) * | 2017-07-31 | 2020-03-24 | 深圳市大疆创新科技有限公司 | Correction of motion-based inaccuracies in point clouds |
CN108226924A (en) * | 2018-01-11 | 2018-06-29 | 李烜 | Running car environment detection method, apparatus and its application based on millimetre-wave radar |
CN108710828A (en) * | 2018-04-18 | 2018-10-26 | 北京汽车集团有限公司 | The method, apparatus and storage medium and vehicle of identification object |
EP3633404A1 (en) * | 2018-10-02 | 2020-04-08 | Ibeo Automotive Systems GmbH | Method and apparatus for optical distance measurements |
CN109870680A (en) * | 2018-10-26 | 2019-06-11 | 北京润科通用技术有限公司 | A kind of objective classification method and device |
CN109740628A (en) * | 2018-12-03 | 2019-05-10 | 深圳市华讯方舟太赫兹科技有限公司 | Point cloud clustering method, image processing equipment and the device with store function |
CN110084840A (en) * | 2019-04-24 | 2019-08-02 | 百度在线网络技术(北京)有限公司 | Point cloud registration method, device, server and computer-readable medium |
CN110208793A (en) * | 2019-04-26 | 2019-09-06 | 纵目科技(上海)股份有限公司 | DAS (Driver Assistant System), method, terminal and medium based on millimetre-wave radar |
CN110310294A (en) * | 2019-07-08 | 2019-10-08 | 江苏易图地理信息科技股份有限公司 | A kind of point cloud segmentation method using adaptivenon-uniform sampling face fuzzy C-means clustering |
WO2021017314A1 (en) * | 2019-07-29 | 2021-02-04 | 浙江商汤科技开发有限公司 | Information processing method, information positioning method and apparatus, electronic device and storage medium |
CN110412516A (en) * | 2019-08-20 | 2019-11-05 | 河北德冠隆电子科技有限公司 | Detection method and device of the millimetre-wave radar to stationary object and slow changing object |
JPWO2021053811A1 (en) * | 2019-09-20 | 2021-03-25 | ||
CN110969855A (en) * | 2019-12-13 | 2020-04-07 | 长沙莫之比智能科技有限公司 | Traffic flow monitoring system based on millimeter wave radar |
CN111239766A (en) * | 2019-12-27 | 2020-06-05 | 北京航天控制仪器研究所 | Water surface multi-target rapid identification and tracking method based on laser radar |
CN111260683A (en) * | 2020-01-09 | 2020-06-09 | 合肥工业大学 | Target detection and tracking method and device for three-dimensional point cloud data |
JPWO2021181647A1 (en) * | 2020-03-13 | 2021-09-16 | ||
CN111582352A (en) * | 2020-04-30 | 2020-08-25 | 上海高仙自动化科技发展有限公司 | Object-based sensing method and device, robot and storage medium |
CN111797734A (en) * | 2020-06-22 | 2020-10-20 | 广州视源电子科技股份有限公司 | Vehicle point cloud data processing method, device, equipment and storage medium |
CN111857168A (en) * | 2020-07-03 | 2020-10-30 | 北京二郎神科技有限公司 | Unmanned aerial vehicle positioning method and device and unmanned aerial vehicle parking attitude adjusting method and device |
CN113156414A (en) * | 2020-12-16 | 2021-07-23 | 中国人民解放军陆军工程大学 | Intelligent sensing and path planning transportation system based on MIMO millimeter wave radar |
CN112847343A (en) * | 2020-12-29 | 2021-05-28 | 深圳市普渡科技有限公司 | Dynamic target tracking and positioning method, device, equipment and storage medium |
CN113031005A (en) * | 2021-02-22 | 2021-06-25 | 江苏大学 | Crane dynamic obstacle identification method based on laser radar |
CN113139607A (en) * | 2021-04-27 | 2021-07-20 | 苏州挚途科技有限公司 | Obstacle detection method and device |
CN113391270A (en) * | 2021-06-11 | 2021-09-14 | 森思泰克河北科技有限公司 | False target suppression method and device for multi-radar point cloud fusion and terminal equipment |
CN113313200A (en) * | 2021-06-21 | 2021-08-27 | 中国科学院自动化研究所苏州研究院 | Point cloud fine matching method based on normal constraint |
CN113589288A (en) * | 2021-06-24 | 2021-11-02 | 广西综合交通大数据研究院 | Target screening method, device and equipment based on millimeter wave radar and storage medium |
CN113537316A (en) * | 2021-06-30 | 2021-10-22 | 南京理工大学 | Vehicle detection method based on 4D millimeter wave radar point cloud |
CN113674355A (en) * | 2021-07-06 | 2021-11-19 | 中国北方车辆研究所 | Target identification and positioning method based on camera and laser radar |
CN113671481A (en) * | 2021-07-21 | 2021-11-19 | 西安电子科技大学 | 3D multi-target tracking processing method based on millimeter wave radar |
CN113807168A (en) * | 2021-08-05 | 2021-12-17 | 北京蜂云科创信息技术有限公司 | Vehicle driving environment sensing method, vehicle-mounted equipment and storage medium |
CN113479218A (en) * | 2021-08-09 | 2021-10-08 | 哈尔滨工业大学 | Roadbed automatic driving auxiliary detection system and control method thereof |
CN113761238A (en) * | 2021-08-27 | 2021-12-07 | 广州文远知行科技有限公司 | Point cloud storage method, device, equipment and storage medium |
CN113721234A (en) * | 2021-08-30 | 2021-11-30 | 南京慧尔视智能科技有限公司 | Vehicle-mounted millimeter wave radar point cloud data dynamic and static separation filtering method and device |
CN113848545A (en) * | 2021-09-01 | 2021-12-28 | 电子科技大学 | Fusion target detection and tracking method based on vision and millimeter wave radar |
CN113689471A (en) * | 2021-09-09 | 2021-11-23 | 中国联合网络通信集团有限公司 | Target tracking method and device, computer equipment and storage medium |
CN113792699A (en) * | 2021-09-24 | 2021-12-14 | 北京易航远智科技有限公司 | Object-level rapid scene recognition method based on semantic point cloud |
CN113888748A (en) * | 2021-09-27 | 2022-01-04 | 北京经纬恒润科技股份有限公司 | Point cloud data processing method and device |
Non-Patent Citations (6)
Title |
---|
Zhou Xiaoyu: "Single-Target Tracking in LiDAR Point Clouds Fused with an Auxiliary Neural Network", Chinese Journal of Lasers, vol. 48, no. 21, pages 2110001 - 145 *
Zhan Yinze: "Vehicle Target Recognition Algorithm Based on Fusion of LiDAR and Infrared Images", Laser & Infrared, vol. 51, no. 9, pages 1238 - 1242 *
Wang Xiaochu: "3D Building Modeling Combining Terrestrial Laser Point Clouds with Photogrammetry", Journal of Anhui Normal University (Natural Science Edition), vol. 40, no. 6, pages 569 - 573 *
Cheng Yanyun et al.: "Improved K-Means Clustering Algorithm with Dynamically Assigned Cluster Centers", Computer Technology and Development, vol. 27, no. 2, pages 33 - 37 *
Luo Ge: "Dynamic Target Recognition in Moving Images of 3D Laser Point Clouds", Laser Journal, vol. 42, no. 9, pages 134 - 138 *
Di Huijun et al.: "Target Detection and Motion Tracking for Unmanned Vehicles", Beijing Institute of Technology Press, pages 143 - 145 *
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114690134A (en) * | 2022-03-14 | 2022-07-01 | 重庆长安汽车股份有限公司 | Fidelity testing method for millimeter wave radar model and readable storage medium |
CN114637006A (en) * | 2022-05-07 | 2022-06-17 | 长沙莫之比智能科技有限公司 | Early warning area self-adaptive adjustment method based on millimeter wave personnel fall detection radar |
CN115290104A (en) * | 2022-07-14 | 2022-11-04 | 襄阳达安汽车检测中心有限公司 | Simulation map generation method, device, equipment and readable storage medium |
CN115017467A (en) * | 2022-08-08 | 2022-09-06 | 北京主线科技有限公司 | Method and device for compensating following target and storage medium |
CN116953704A (en) * | 2022-12-23 | 2023-10-27 | 河北德冠隆电子科技有限公司 | Wisdom is adjustable omnidirectionally scanning millimeter wave radar of multidimension angle for transportation |
CN117148837A (en) * | 2023-08-31 | 2023-12-01 | 上海木蚁机器人科技有限公司 | Dynamic obstacle determination method, device, equipment and medium |
Also Published As
Publication number | Publication date |
---|---|
CN114442101B (en) | 2023-11-14 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN114442101B (en) | Vehicle navigation method, device, equipment and medium based on imaging millimeter wave radar | |
CN109556615B (en) | Driving map generation method based on multi-sensor fusion cognition of automatic driving | |
US10705220B2 (en) | System and method for ground and free-space detection | |
CN109927719B (en) | Auxiliary driving method and system based on obstacle trajectory prediction | |
EP3779922B1 (en) | Method for estimating driving road and driving road estimation system | |
CN112665556B (en) | Generating a three-dimensional map of a scene using passive and active measurements | |
Ogawa et al. | Pedestrian detection and tracking using in-vehicle lidar for automotive application | |
US8605947B2 (en) | Method for detecting a clear path of travel for a vehicle enhanced by object detection | |
US10705534B2 (en) | System and method for ground plane detection | |
CN112241007A (en) | Calibration method and arrangement structure of automatic driving environment perception sensor and vehicle | |
US11544940B2 (en) | Hybrid lane estimation using both deep learning and computer vision | |
CN113850102B (en) | Vehicle-mounted vision detection method and system based on millimeter wave radar assistance | |
CN109828571A (en) | Automatic driving vehicle, method and apparatus based on V2X | |
CN110807412B (en) | Vehicle laser positioning method, vehicle-mounted equipment and storage medium | |
DE112018004891T5 (en) | IMAGE PROCESSING DEVICE, IMAGE PROCESSING PROCESS, PROGRAM AND MOBILE BODY | |
US11754415B2 (en) | Sensor localization from external source data | |
US11796331B2 (en) | Associating perceived and mapped lane edges for localization | |
Valldorf et al. | Advanced Microsystems for Automotive Applications 2007 | |
US11210941B2 (en) | Systems and methods for mitigating anomalies in lane change detection | |
US11845429B2 (en) | Localizing and updating a map using interpolated lane edge data | |
CN114084129A (en) | Fusion-based vehicle automatic driving control method and system | |
CN110427034B (en) | Target tracking system and method based on vehicle-road cooperation | |
US20240192021A1 (en) | Handling Road Marking Changes | |
CN113734197A (en) | Unmanned intelligent control scheme based on data fusion | |
CN111766601A (en) | Recognition device, vehicle control device, recognition method, and storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
GR01 | Patent grant |