CN114442101B - Vehicle navigation method, device, equipment and medium based on imaging millimeter wave radar - Google Patents


Publication number
CN114442101B
Authority
CN
China
Prior art keywords
point cloud
clustered
vehicle
target
continuous
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202210108581.5A
Other languages
Chinese (zh)
Other versions
CN114442101A
Inventor
王升亮
顾超
许孝勇
仇世豪
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanjing Hurys Intelligent Technology Co Ltd
Original Assignee
Nanjing Hurys Intelligent Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanjing Hurys Intelligent Technology Co Ltd filed Critical Nanjing Hurys Intelligent Technology Co Ltd
Priority to CN202210108581.5A priority Critical patent/CN114442101B/en
Publication of CN114442101A publication Critical patent/CN114442101A/en
Application granted granted Critical
Publication of CN114442101B publication Critical patent/CN114442101B/en


Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88 Radar or analogous systems specially adapted for specific applications
    • G01S13/93 Radar or analogous systems specially adapted for specific applications for anti-collision purposes
    • G01S13/931 Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/66 Radar-tracking systems; Analogous systems
    • G01S13/72 Radar-tracking systems; Analogous systems for two-dimensional tracking, e.g. combination of angle and range tracking, track-while-scan radar
    • G01S13/723 Radar-tracking systems; Analogous systems for two-dimensional tracking, e.g. combination of angle and range tracking, track-while-scan radar by using numerical data
    • G01S13/726 Multiple target tracking
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88 Radar or analogous systems specially adapted for specific applications
    • G01S13/89 Radar or analogous systems specially adapted for specific applications for mapping or imaging

Abstract

The application discloses a vehicle navigation method, device, equipment and medium based on imaging millimeter wave radar, wherein the method comprises the following steps: converting the point clouds obtained by the imaging millimeter wave radars into a vehicle coordinate system and stitching them to obtain a continuous point cloud; determining ground-relative point cloud information of the continuous point cloud according to the vehicle motion information and the continuous point cloud, and distinguishing static point clouds from dynamic point clouds based on the ground-relative point cloud information; clustering the dynamic point cloud to obtain clustered targets, and tracking the clustered targets to obtain tracking target information; and determining a drivable area according to the static point cloud, then planning a path based on the drivable area and the tracking target information to navigate the vehicle. In this way, all-weather operation is achieved with imaging millimeter wave radar: dynamic and static point clouds are distinguished within the continuous point cloud obtained by point cloud stitching, tracking target information is derived from the dynamic point cloud, a drivable area is derived from the static point cloud, and a path plan can then be made to navigate the vehicle.

Description

Vehicle navigation method, device, equipment and medium based on imaging millimeter wave radar
Technical Field
The invention relates to the technical field of automatic driving, in particular to a vehicle navigation method, device, equipment and medium based on imaging millimeter wave radar.
Background
Currently, with the development of science and technology, more and more things once confined to science fiction are gradually becoming reality, such as automatic driving of road vehicles, automatic driving of subways, and automatic driving of logistics vehicles in specific scenes. Autonomous vehicles free drivers from repetitive labor, high strain, and the hazards of fatigued driving. Moreover, as population and industry structures change with social, economic, and urban development, the proportion of young labor is shrinking and large numbers of people are moving into cities, so drivers such as long-distance freight drivers and mining-truck drivers in remote areas are increasingly difficult to recruit. Intelligent automatic driving of vehicles can help operators relieve this labor shortage, and generally brings higher operating efficiency and lower operating costs.
In order to realize automatic driving of vehicles such as cars and ships, conventional sensors and other electronic and mechanical devices must be provided, and the vehicle must also have external environment sensing units analogous to human eyes and ears, a central decision unit analogous to the human brain, and action execution units analogous to hands and feet. The decision unit is typically an embedded computer whose configuration is selected based on the overall system configuration and the complexity of the algorithms. The decision unit receives information from environment sensors and internal sensors (such as speed, yaw rate, steering wheel angle, gear, inclination angle, acceleration and the like), acquires and evaluates the conditions of the environment and the vehicle, then decides the next action, and controls devices such as the brake, accelerator and steering wheel of the vehicle through an output interface to realize danger avoidance, normal path changes and the like. Current tools for collecting environmental data include the laser radar and the camera, but both adapt poorly to adverse conditions and cannot work around the clock.
In summary, how to acquire environmental data with a suitable acquisition tool, so as to make an accurate path plan to navigate the vehicle and realize automatic driving, is an urgent problem to be solved.
Disclosure of Invention
In view of the above, the present application aims to provide a vehicle navigation method, device, equipment and medium based on imaging millimeter wave radar, which can acquire environmental information by using a suitable environmental information acquisition tool so as to make accurate path planning to navigate a vehicle and realize automatic driving. The specific scheme is as follows:
in a first aspect, the present application discloses a vehicle navigation method based on imaging millimeter wave radar, comprising:
converting the point clouds, obtained by the imaging millimeter wave radars and located in their respective radar coordinate systems, into a vehicle coordinate system so as to perform point cloud stitching to obtain a continuous point cloud; the continuous point cloud comprises a continuous point cloud speed and continuous point cloud coordinates of the continuous point cloud relative to the vehicle; the radar coordinate systems and the vehicle coordinate system are rectangular coordinate systems;
determining ground-relative point cloud information of the continuous point cloud according to the detected vehicle motion information of the vehicle relative to the ground and the continuous point cloud, and distinguishing static point clouds and dynamic point clouds within the continuous point cloud based on the ground-relative point cloud information;
clustering the dynamic point cloud according to the ground-relative point cloud information to obtain clustered targets, and tracking the clustered targets according to the clustered target speed and moving direction contained in each clustered target to obtain tracking target information corresponding to the clustered targets;
and determining a drivable area according to the static point cloud, and planning a path based on the drivable area and the tracking target information so as to navigate the vehicle.
Optionally, the converting the point cloud obtained by each imaging millimeter wave radar and located in each radar coordinate system into the vehicle coordinate system so as to perform point cloud stitching to obtain a continuous point cloud includes:
acquiring radar installation information, namely the coordinate information of each imaging millimeter wave radar in the vehicle coordinate system and the yaw angle of each imaging millimeter wave radar;
calculating, according to the radar installation information, the coordinates in the vehicle coordinate system of the point clouds located in each imaging millimeter wave radar coordinate system, so as to obtain a plurality of groups of point cloud coordinates in the vehicle coordinate system corresponding to the different imaging millimeter wave radars;
and splicing the plurality of groups of point cloud coordinates to obtain corresponding continuous point clouds.
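The conversion-and-stitching step above can be sketched roughly as follows. The function names, the point format, and the rotate-then-translate convention are illustrative assumptions, not taken from the patent:

```python
import math

def radar_to_vehicle(points, mount_x, mount_y, yaw):
    """Convert radar-frame (x, y) points into the vehicle frame.

    Assumed convention: rotate by the radar's mounting yaw angle, then
    translate by its mounting position in the vehicle coordinate system.
    """
    c, s = math.cos(yaw), math.sin(yaw)
    return [(x * c - y * s + mount_x, x * s + y * c + mount_y)
            for x, y in points]

def stitch(per_radar_points):
    """Concatenate the per-radar point lists (all already in the vehicle
    frame) into one continuous point cloud."""
    stitched = []
    for pts in per_radar_points:
        stitched.extend(pts)
    return stitched
```

A radar mounted at (2, 3) with a 90-degree yaw maps its local point (1, 0) to (2, 4) in the vehicle frame.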
Optionally, before determining the ground-relative point cloud information of the continuous point cloud according to the detected vehicle motion information of the vehicle relative to the ground and the continuous point cloud, the method further includes:
if the point cloud quality parameters corresponding to the continuous point cloud meet preset parameter-threshold filtering conditions, determining the continuous point cloud as a false point cloud, and deleting the false point cloud;
and/or if the continuous point cloud appears in the current scanning period but cannot be continuously detected within a first preset coordinate range during a first preset number of consecutive scanning periods, determining the continuous point cloud as a false point cloud, and deleting the false point cloud.
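The two false-point filters described above can be sketched as follows; the point format, thresholds, and function names are assumptions for illustration:

```python
def drop_low_quality(points, min_quality):
    """First filter: discard points whose quality parameter (for example,
    a signal-to-noise figure) falls below a threshold.
    Each point is a (x, y, quality) tuple."""
    return [p for p in points if p[2] >= min_quality]

def confirmed(candidate, later_scans, radius, min_hits):
    """Second filter: a point first seen in the current scan counts as
    real only if something reappears within `radius` of it in at least
    `min_hits` of the following scans; otherwise it is a false point."""
    cx, cy = candidate
    hits = sum(
        1 for scan in later_scans
        if any((x - cx) ** 2 + (y - cy) ** 2 <= radius ** 2 for x, y in scan)
    )
    return hits >= min_hits
```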
Optionally, the determining, according to the detected vehicle motion information of the vehicle relative to the ground and the continuous point cloud, of the ground-relative point cloud information of the continuous point cloud, and the distinguishing, based on the ground-relative point cloud information, of static point clouds from dynamic point clouds within the continuous point cloud include:
detecting the vehicle motion information, including a vehicle motion speed and a vehicle yaw rate of the vehicle relative to the ground;
determining the ground-relative speed of the continuous point cloud according to the vehicle motion information and the continuous point cloud speed in the continuous point cloud;
and if the ground-relative speed of the continuous point cloud is smaller than a preset speed threshold, determining the continuous point cloud as a static point cloud; if the ground-relative speed is not smaller than the preset speed threshold, determining the continuous point cloud as a dynamic point cloud.
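A minimal sketch of this static/dynamic split, under simplifying assumptions not stated in the patent (the yaw rate is ignored, points carry only a radial speed, and the vehicle moves straight along its Y axis):

```python
import math

def classify_points(points, ego_speed, speed_threshold):
    """Split points into static vs dynamic by ground-relative speed.

    Each point: (x, y, radial_speed), where radial_speed is measured
    relative to the moving vehicle (negative = approaching). For a
    forward-moving vehicle, a stationary object's expected radial speed
    is -ego_speed * cos(azimuth); the residual approximates the point's
    own speed over the ground.
    """
    static, dynamic = [], []
    for x, y, vr in points:
        azimuth = math.atan2(x, y)  # angle off the vehicle's forward (Y) axis
        expected_static = -ego_speed * math.cos(azimuth)
        ground_speed = abs(vr - expected_static)
        (static if ground_speed < speed_threshold else dynamic).append((x, y, vr))
    return static, dynamic
```

With the ego vehicle at 10 m/s, a point straight ahead closing at 10 m/s is classified static, while one with zero radial speed (moving with the vehicle) is dynamic.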
Optionally, the clustering of the dynamic point cloud according to the ground-relative point cloud information to obtain a clustered target includes:
determining, within a second preset coordinate range, target dynamic point clouds with similar ground-relative point cloud information from among the dynamic point clouds;
clustering the target dynamic point clouds according to preset contour information to obtain a clustered target, and determining a clustered point corresponding to the clustered target and the coordinates of the clustered point.
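One simple way to group nearby dynamic points with similar speeds, sketched here as greedy single-link clustering with the cluster centroid standing in for the clustered point (the thresholds and the centroid choice are assumptions; the patent's "preset contour information" is not specified):

```python
def cluster_dynamic_points(points, max_gap, max_speed_diff):
    """Greedily cluster (x, y, speed) points: a point joins a cluster if
    it is within max_gap (per axis) and max_speed_diff of some member."""
    clusters = []
    for p in points:
        placed = False
        for c in clusters:
            if any(abs(p[0] - q[0]) <= max_gap and abs(p[1] - q[1]) <= max_gap
                   and abs(p[2] - q[2]) <= max_speed_diff for q in c):
                c.append(p)
                placed = True
                break
        if not placed:
            clusters.append([p])
    return clusters

def centroid(cluster):
    """Use the cluster centroid as the clustered point's coordinates."""
    n = len(cluster)
    return (sum(p[0] for p in cluster) / n, sum(p[1] for p in cluster) / n)
```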
Optionally, the tracking of the clustered targets according to the clustered target speed and moving direction contained in the clustered targets, to obtain tracking target information corresponding to the clustered targets, includes:
predicting a target coordinate range of the clustered target in the next scan according to the clustered target speed, the clustered target moving direction, and the clustered point coordinates contained in the clustered target;
and if a point cloud to be judged, with a speed and moving direction similar to those of the clustered target, is detected within the target coordinate range during the next scan, determining that the point cloud to be judged belongs to the clustered target, thereby realizing target tracking and obtaining the tracking target information corresponding to the clustered target.
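This predict-then-gate association can be sketched as follows; the rectangular gate, the tolerance parameters, and the first-match rule are illustrative assumptions:

```python
def predict_gate(cx, cy, vx, vy, dt, gate):
    """Predict the clustered target's coordinate range for the next scan
    from its clustered point (cx, cy) and velocity (vx, vy)."""
    px, py = cx + vx * dt, cy + vy * dt
    return (px - gate, px + gate, py - gate, py + gate)

def associate(gate_box, detections, speed, heading, speed_tol, heading_tol):
    """Return the first detection inside the gate whose speed and heading
    are similar to the track's; None means the track was not confirmed.
    Each detection: (x, y, speed, heading)."""
    x0, x1, y0, y1 = gate_box
    for d in detections:
        if (x0 <= d[0] <= x1 and y0 <= d[1] <= y1
                and abs(d[2] - speed) <= speed_tol
                and abs(d[3] - heading) <= heading_tol):
            return d
    return None
```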
Optionally, the vehicle navigation method based on the imaging millimeter wave radar further includes:
and if no change in the ground-relative point cloud information of all the continuous point clouds is detected within a third preset coordinate range over a second preset number of consecutive scanning periods, determining the third preset coordinate range as a non-drivable area.
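The unchanged-over-N-scans test can be expressed compactly; the snapshot representation (one set of observed point info per scan, restricted to the candidate range) is an assumption:

```python
def is_non_drivable(snapshots, min_periods):
    """snapshots: one entry per scanning period, each the set of
    ground-relative point cloud info observed inside the candidate
    coordinate range. The range is flagged non-drivable when that info
    is unchanged over the last min_periods consecutive periods."""
    if len(snapshots) < min_periods:
        return False
    recent = snapshots[-min_periods:]
    return all(s == recent[0] for s in recent)
```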
In a second aspect, the present application discloses a vehicle navigation device based on imaging millimeter wave radar, comprising:
the point cloud stitching module is used for converting the point clouds, obtained by each imaging millimeter wave radar and located in each radar coordinate system, into a vehicle coordinate system so as to perform point cloud stitching to obtain a continuous point cloud; the continuous point cloud comprises a continuous point cloud speed and continuous point cloud coordinates of the continuous point cloud relative to the vehicle; the radar coordinate systems and the vehicle coordinate system are rectangular coordinate systems;
the point cloud distinguishing module is used for determining ground-relative point cloud information of the continuous point cloud according to the detected vehicle motion information of the vehicle relative to the ground and the continuous point cloud, and for distinguishing static point clouds and dynamic point clouds within the continuous point cloud based on the ground-relative point cloud information;
the clustering module is used for clustering the dynamic point cloud according to the ground-relative point cloud information to obtain clustered targets, and for tracking the clustered targets according to the clustered target speed and moving direction contained in each clustered target to obtain the tracking target information corresponding to the clustered targets;
and the path planning module is used for determining a drivable area according to the static point cloud, and for planning a path based on the drivable area and the tracking target information so as to navigate the vehicle.
In a third aspect, the application discloses an electronic device comprising a processor and a memory; wherein the processor implements the imaging millimeter wave radar-based vehicle navigation method disclosed above when executing the computer program stored in the memory.
In a fourth aspect, the present application discloses a computer-readable storage medium for storing a computer program; wherein the computer program when executed by a processor implements the aforementioned disclosed imaging millimeter wave radar-based vehicle navigation method.
Therefore, the application converts the point clouds, obtained by each imaging millimeter wave radar and located in each radar coordinate system, into a vehicle coordinate system so as to perform point cloud stitching to obtain a continuous point cloud, where the continuous point cloud comprises a continuous point cloud speed and continuous point cloud coordinates relative to the vehicle, and the radar coordinate systems and the vehicle coordinate system are rectangular coordinate systems; determines ground-relative point cloud information of the continuous point cloud according to the detected vehicle motion information of the vehicle relative to the ground and the continuous point cloud, and distinguishes static point clouds and dynamic point clouds within the continuous point cloud based on the ground-relative point cloud information; clusters the dynamic point cloud according to the ground-relative point cloud information to obtain clustered targets, and tracks the clustered targets according to the clustered target speed and moving direction contained in each clustered target to obtain corresponding tracking target information; and determines a drivable area according to the static point cloud, and plans a path based on the drivable area and the tracking target information so as to navigate the vehicle.
In this way, the imaging millimeter wave radar, with its strong weather adaptability and no fragile glass lens, enables all-weather operation. In addition, the application stitches the point clouds acquired by the different imaging millimeter wave radars into a continuous point cloud depicting the vehicle's surroundings, distinguishes dynamic from static point clouds according to the ground-relative point cloud information determined from the vehicle motion information and the continuous point cloud, then obtains tracking target information from the dynamic point cloud, determines a drivable area from the static point cloud, and finally makes a path plan according to the tracking target information and the drivable area so as to navigate the vehicle.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings that are required to be used in the embodiments or the description of the prior art will be briefly described below, and it is obvious that the drawings in the following description are only embodiments of the present application, and that other drawings can be obtained according to the provided drawings without inventive effort for a person skilled in the art.
FIG. 1 is a flow chart of a vehicle navigation method based on imaging millimeter wave radar provided by the application;
FIG. 2 is a schematic view of a radar installation of a vehicle according to the present application;
FIG. 3 is a schematic diagram of a coordinate system according to the present application;
FIG. 4 is a schematic diagram of coordinate system transformation according to the present application;
FIG. 5 is a flowchart of a specific vehicle navigation method based on imaging millimeter wave radar provided by the application;
FIG. 6 is a schematic diagram of an automated driving system for a vehicle according to the present application;
FIG. 7 is a schematic diagram of a vehicle navigation device based on imaging millimeter wave radar;
FIG. 8 is a block diagram of an electronic device according to the present application.
Detailed Description
The following description of the embodiments of the present application will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present application, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the application without making any inventive effort, are intended to be within the scope of the application.
Current environmental data acquisition tools adapt poorly to adverse conditions and cannot work around the clock, which hinders accurate path planning and the realization of automatic driving.
In order to overcome the problems, the application provides a vehicle navigation scheme based on imaging millimeter wave radar, which can acquire environment information by utilizing a proper environment information acquisition tool so as to make accurate path planning to navigate a vehicle and realize automatic driving.
Referring to fig. 1, an embodiment of the application discloses a vehicle navigation method based on imaging millimeter wave radar, which comprises the following steps:
step S11: converting the point clouds which are obtained by the imaging millimeter wave radars and are positioned in the radar coordinate systems into a vehicle coordinate system so as to splice the point clouds to obtain continuous point clouds; the continuous point cloud comprises a continuous point cloud speed and continuous point cloud coordinates of the continuous point cloud relative to the vehicle; the radar coordinate systems and the vehicle coordinate system are rectangular coordinate systems.
In the embodiment of the application, surrounding environment data is acquired by each imaging millimeter wave radar to obtain the point cloud. Common external environment sensing sensors for acquiring environmental data currently include cameras, laser radars, ultrasonic radars, millimeter wave radars and the like; each has a different configuration and different strengths. The camera is most similar to the human eye: it is a passive device whose imaging module receives light emitted or reflected by objects to form images. With technological progress, cameras can obtain ultra-high-resolution environmental images, and binocular stereo cameras can even recover depth of field. However, the camera shares weaknesses with the human eye: it is easily affected by air visibility conditions such as haze and by lighting, and keeping the lens clean is troublesome in dusty scenes. In addition, the camera's ranging precision for a target gradually degrades as distance increases, which affects the quality of the system's decisions to some extent. The laser radar generally uses several infrared laser transmitters to emit beams that are reflected by a target and received by a receiver; the target distance is calculated from the time interval between transmission and reception and the speed-of-light constant, and the target azimuth is obtained from the azimuth of the transmitter at the time of emission. The laser radar can therefore obtain accurate distance information for targets at different ranges, does not depend on an ambient light source, and works both day and night.
However, it too is easily affected by direct sunlight, haze, rain and snow, air visibility, and lens cleanliness, so its weather adaptability is poor. Millimeter wave radars have been used for assisted driving such as AEB (Autonomous Emergency Braking) and BSD (Blind Spot Detection); limited by cost, the pace of technological development, the bandwidth-limited CAN (Controller Area Network) bus interface and other factors, their data are generally sparse and their angular resolution in target measurement is low, so they cannot finely perceive the environment and can only provide assisted-driving functions that do not pursue a high recognition rate. Automatic driving, by contrast, must solve not only collision avoidance but also path planning to a destination, and needs fine environmental perception to ascertain the positions and states of stationary and moving targets at different distances.
It should be noted that, for these reasons, existing automatic driving vehicles generally depend on cameras and laser radars for fine perception, but both sensors are easily affected by visibility and cannot meet the all-weather operating requirements of outdoor industrial and mining scenes; the laser radar additionally has mechanical moving parts that are easily damaged, and a high cost. Compared with environment sensing sensors such as cameras and laser radars, the millimeter wave radar operates in all weather and is not easily affected by illumination, air visibility, or the cleanliness of its own lens, which is significant for improving the weather adaptability of assisted and even automatic driving. Therefore, the application adopts the imaging millimeter wave radar and outputs richer, higher-precision surrounding environment data through an Ethernet interface.
It can be understood that a 77 GHz imaging millimeter wave radar using the latest millimeter wave radar technology is adopted as the sensor. More antenna combinations provide more physical and virtual channels, greatly improving the angular resolution that has long been the weak point of millimeter wave radar and yielding finer environmental perception information; like a highly myopic eye after correction, the surrounding environment can now be seen clearly. The imaging radar also has a larger detection range and, paired with an Ethernet interface of larger communication bandwidth, outputs more target spatial information and attribute information with higher measurement accuracy. Targets such as vehicles and pedestrians then correspond to denser points which, similar to a laser radar's output, take on a cloud-like shape, hence the name point cloud (Point Cloud); this makes the imaging radar better suited to environmental data acquisition for automatic driving. Specifically, the imaging millimeter wave radar sensor actively emits millimeter wave electromagnetic waves to scan, receives the electromagnetic waves reflected by targets in its field of view, and calculates each target's relative speed, relative distance, and azimuth angle, even the pitch angle, using the pulse compression principle, the Doppler effect, and Fourier transform algorithms. It scans 10-30 times per second, detecting environmental information at high frequency and providing the data refresh rate required for automatic driving.
Because the 77 GHz millimeter wave radar has a wavelength of about 4 mm, its waves easily diffract around fine dust, water vapor and the like, ensuring all-weather vehicle operation unaffected by light and haze.
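The "about 4 mm" figure follows directly from the carrier frequency; a quick check:

```python
# Wavelength of a 77 GHz millimeter wave: wavelength = c / f
c = 299_792_458.0    # speed of light, m/s
f = 77e9             # radar carrier frequency, Hz
wavelength_mm = c / f * 1000.0
print(round(wavelength_mm, 2))   # about 3.89 mm, i.e. roughly 4 mm as stated
```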
In the embodiment of the application, two rectangular coordinate systems are used: the radar coordinate system and the vehicle coordinate system. After each imaging millimeter wave radar acquires environment information, that information exists as a point cloud in the radar's own coordinate system. Because the field of view of a single imaging millimeter wave radar is limited to about 120 degrees, one vehicle may carry several imaging millimeter wave radars; by stitching the point cloud information of multiple radars installed around the vehicle, seamless 360-degree perception around the vehicle, or partial wide-angle perception covering the front and front sides, can be realized. Specifically, several imaging millimeter wave radars are installed at the front, four corners, rear, and even sides of the vehicle according to the range to be monitored, ensuring overlapping fields of view and accounting for the vehicle's type and length. For example, a passenger vehicle can cover its full 360-degree surroundings well with 6 imaging radars at the front, rear, and four corners, while a longer freight vehicle may need 8 imaging millimeter wave radars, that is, 1 additional radar on each of the left and right sides. Because multiple radars are used, the point cloud obtained by each imaging millimeter wave radar needs to be converted into the vehicle coordinate system so the point clouds can be stitched into a continuous point cloud.
Specifically, after stitching is completed, all-weather, high-precision point cloud imaging of the vehicle's surroundings at 10-30 frames per second can be achieved by collecting data from the imaging millimeter wave radars installed around the vehicle, facilitating surrounding environment data acquisition. Fig. 2 shows the radar installation positions and basic information for a vehicle: radars 1 through 6, the normal and yaw angle of radar 1, the origin and coordinate axes (X axis and Y axis) of the vehicle coordinate system, the central axis of the vehicle, lines parallel to the central axis, and the driving direction.
In the embodiment of the application, the specific process of converting the point cloud into the vehicle coordinate system is as follows: acquire the radar installation information, namely the coordinate information of each imaging millimeter wave radar in the vehicle coordinate system and each radar's yaw angle; according to the radar installation information, calculate the coordinates in the vehicle coordinate system of the point clouds located in each imaging millimeter wave radar coordinate system, obtaining a plurality of groups of point cloud coordinates in the vehicle coordinate system corresponding to the different imaging millimeter wave radars; then stitch the groups of point cloud coordinates to obtain the corresponding continuous point cloud. The point cloud in the radar coordinate system comprises a radar point cloud distance, a radar point cloud speed, and a radar point cloud azimuth angle relative to the radar. It is to be noted that the radar installation information is acquired not by the imaging millimeter wave radar itself but by other sensors installed in the vehicle.
It should be noted that, as shown in fig. 3, the radar installation position is not at the origin of the vehicle coordinate system; in this case, addition and subtraction of the coordinates in the transverse and longitudinal directions are required, which corresponds to a translation of the coordinate matrix in the data processing algorithm; then, according to the radar installation yaw angle, a rotation of the coordinate matrix is performed in the data processing algorithm. As shown in fig. 3, the vehicle coordinate system generally takes the center of the rear axle of the vehicle as the origin O, the central axis of the vehicle as the longitudinal axis Y, and the straight line of the rear axle as the transverse axis X, forming a rectangular coordinate system, with the forward direction of the vehicle as the positive direction of the longitudinal axis and the driver's right-hand direction as the positive direction of the transverse axis. Let the coordinates of the installation position of radar 1 in the vehicle coordinate system be (x_radar, y_radar), let the installation yaw angle be θ, and let the coordinates of the detected target point trace P in the radar coordinate system be (x_1, y_1); the coordinates (x_2, y_2) of P in the vehicle coordinate system are to be found. First, the origins of the radar coordinate system and the vehicle coordinate system are superimposed by translation; as shown in fig. 4, this superposition is a simple numerical addition and subtraction. Assuming that the coordinates of the trace P after translating the radar coordinate system are (x_0, y_0), then
x_0 = x_1 - x_radar;
y_0 = y_1 - y_radar.
In the embodiment of the application, because there is a yaw angle θ between the radar installation direction and the central axis of the vehicle, a rotation is also required. Specifically, according to the angle-sum and trigonometric function rules, the following formulas hold:
x_2 = x_0·cosθ - y_0·sinθ = (x_1 - x_radar)cosθ - (y_1 - y_radar)sinθ;
y_2 = x_0·sinθ + y_0·cosθ = (x_1 - x_radar)sinθ + (y_1 - y_radar)cosθ.
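The translation-then-rotation transform above can be sketched as follows. This is an illustrative Python sketch, not part of the application; the function name and argument order are assumptions, and the angle is taken in radians:

```python
import math

def radar_to_vehicle(x1, y1, x_radar, y_radar, theta):
    """Convert a point trace P from a radar's coordinate system into the
    vehicle coordinate system, following the order described above:
    first translate so the two origins coincide, then rotate by the
    radar's installation yaw angle theta (radians)."""
    # Translation step: x_0 = x_1 - x_radar, y_0 = y_1 - y_radar
    x0 = x1 - x_radar
    y0 = y1 - y_radar
    # Rotation step: standard 2-D rotation by theta
    x2 = x0 * math.cos(theta) - y0 * math.sin(theta)
    y2 = x0 * math.sin(theta) + y0 * math.cos(theta)
    return x2, y2
```

With a zero yaw angle only the translation remains; with a 90-degree yaw angle the point is rotated a quarter turn about the common origin.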
it can be understood that after coordinates of all radar point clouds correspond to a vehicle coordinate system, different radar point clouds can be processed in one coordinate system, if overlapping of adjacent radar fields is ensured when the imaging millimeter wave radar is installed, no gap of the spliced point clouds can be ensured, and the overall splicing effect similar to that of a 360-degree looking-around camera of a vehicle is achieved.
Step S12: determining the ground point cloud information of the continuous point cloud relative to the ground according to the detected vehicle motion information of the vehicle relative to the ground and the continuous point cloud, and distinguishing static point clouds and dynamic point clouds in the continuous point cloud based on the ground point cloud information.
In the embodiment of the application, sensors of different principles may generate erroneous detection information due to factors of the sensor itself or of the environment; such erroneous detection information is filtered out, or its influence reduced, according to the characteristics of the sensor and experimental experience, so as to prevent false targets from participating in later operations and causing erroneous results. In particular, the imaging millimeter wave radar itself can produce false point clouds due to angle-measurement ambiguity, speed-measurement ambiguity, environmental background noise, multipath effects and the like. Therefore, before determining the ground point cloud information of the continuous point cloud according to the detected vehicle motion information of the vehicle relative to the ground and the continuous point cloud, false point clouds are determined among the continuous point clouds and deleted. Specifically, if the point cloud quality parameters corresponding to a continuous point cloud meet the preset parameter threshold filtering conditions, that continuous point cloud is determined to be a false point cloud and is then deleted; and/or, if a continuous point cloud appears in the current scanning period but cannot be continuously detected within a first preset coordinate range during a first preset number of consecutive scanning periods, that continuous point cloud is determined to be a false point cloud and is deleted.
It should be noted that the point cloud quality parameters include quality information output by the millimeter wave radar, such as the target RCS (Radar Cross Section) and the ambiguity, and the preset parameter threshold filtering conditions are determined by finding reproducible rules through a large number of actual scene tests during the early phase of the corresponding application project. It will be appreciated that if a continuous point cloud occurs during the current scanning period but cannot be continuously detected within the first preset coordinate range during the first preset number of consecutive scanning periods, the point cloud confidence within that first preset coordinate range is low and the point cloud is deleted as a false point cloud; the first preset number and the first preset coordinate range are obtained and verified through early test experience in the specific application scenario.
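The two false-point-cloud checks described above can be sketched as follows. This is an illustrative Python sketch; the field names (`rcs`, `ambiguity`, `x`, `y`), the dictionary representation of a point, and the concrete thresholds are all assumptions, since the application only states that such thresholds are tuned from scene tests:

```python
def is_false_point(point, min_rcs, max_ambiguity):
    """Quality-threshold filter: treat a point as false if its RCS is
    below the preset threshold or its ambiguity indicator exceeds the
    preset threshold (field names and thresholds are illustrative)."""
    return point["rcs"] < min_rcs or point["ambiguity"] > max_ambiguity


def persists(point, later_scans, radius, n_required):
    """Persistence check: keep the point only if some point reappears
    within `radius` of it (the first preset coordinate range) in each of
    the next `n_required` scan periods (the first preset number)."""
    for scan in later_scans[:n_required]:
        near = any((p["x"] - point["x"]) ** 2 + (p["y"] - point["y"]) ** 2
                   <= radius ** 2 for p in scan)
        if not near:
            return False
    return True
```

A point failing either check would be deleted before the static/dynamic distinction is made.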
In the embodiment of the application, after the false targets are deleted, the ground point cloud information of the continuous point cloud relative to the ground is determined according to the detected vehicle motion information of the vehicle relative to the ground and the continuous point cloud, and static point clouds and dynamic point clouds are distinguished in the continuous point cloud based on the ground point cloud information. It can be understood that point clouds whose information does not change after detection by the imaging millimeter wave radar can be classified as static point clouds, including walls, railings and parked vehicles, which are obstacles to be avoided when planning the travel route later; point clouds whose information changes after detection can be classified as dynamic point clouds, which are taken into account to determine specific actions when planning the travel route later. For example, when it is determined that the target corresponding to a dynamic point cloud will cross the planned route of the vehicle, the vehicle can stop and wait for the target to pass, or bypass it.
Step S13: and clustering the dynamic point cloud according to the point cloud information to obtain a clustered target, and tracking the clustered target according to the clustered target speed and the clustered target moving direction contained in the clustered target to obtain tracking target information corresponding to the clustered target.
In the embodiment of the application, the clustering process is as follows: determining, within a second preset coordinate range, target dynamic point clouds with similar ground point cloud information from among the dynamic point clouds; clustering the target dynamic point clouds according to preset contour information to obtain a clustered target, and determining the cluster point corresponding to the clustered target and the coordinates of the cluster point. It can be understood that similar ground point cloud information includes similar ground point cloud speeds; the preset contour information consists of the contours and outline sizes of targets common to the specific scene, the targets including traffic participants such as vehicles and pedestrians; using the preset contours can prevent, as far as possible, the dynamic point clouds of different nearby individuals with approximate speeds and moving directions from being clustered into 1 cluster point. The cluster point is the position of the corresponding clustered target closest to the vehicle, or the central point of the clustered target.
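The contour-constrained clustering step can be sketched as follows. This is a minimal greedy sketch, not the application's actual algorithm: the dictionary point format, the single `max_extent` box standing in for the preset contour information, and all parameter names are illustrative assumptions:

```python
def cluster_dynamic_points(points, max_gap, max_extent, v_tol):
    """Greedy single-pass clustering sketch: a dynamic point joins an
    existing cluster only if it is within max_gap of some member, has a
    similar velocity, and would not stretch the cluster beyond the
    preset contour size max_extent -- so two nearby targets with similar
    motion are not merged into one cluster point."""
    clusters = []
    for p in points:
        for c in clusters:
            ref = c[0]  # first member anchors the contour-extent check
            if (abs(p["x"] - ref["x"]) <= max_extent
                    and abs(p["y"] - ref["y"]) <= max_extent
                    and abs(p["vx"] - ref["vx"]) <= v_tol
                    and abs(p["vy"] - ref["vy"]) <= v_tol
                    and any(abs(p["x"] - q["x"]) + abs(p["y"] - q["y"]) <= max_gap
                            for q in c)):
                c.append(p)
                break
        else:
            clusters.append([p])
    return clusters


def cluster_point(cluster):
    """Use the centroid as the cluster point; the point nearest the
    vehicle is the other option mentioned above."""
    n = len(cluster)
    return (sum(p["x"] for p in cluster) / n,
            sum(p["y"] for p in cluster) / n)
```

Here the extent check against the first member plays the role of the preset contour: a point far outside a plausible vehicle/pedestrian outline starts a new cluster even if its speed matches.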
In the embodiment of the application, after the clustering is completed, the clustered target is further tracked: the target coordinate range where the clustered target will be located in the next scan is predicted according to the clustered target speed and the cluster point coordinates contained in the clustered target; if a point cloud to be judged, with a speed and moving direction similar to those of the clustered target, is detected within the target coordinate range during the next scan, the point cloud to be judged is judged to belong to the clustered target, thereby realizing target tracking and obtaining the tracking target information corresponding to the clustered target.
It should be pointed out that the cluster points and their coordinates are determined during clustering, which reduces the calculation amount of the tracking process and of computing the tracking target information; the target coordinate range may be referred to as a "wave gate". In addition, the radar scanning interval is only tens of milliseconds, within which the motion state of a vehicle on the road cannot change very much; in particular, in specific scenes such as mine trucks and port container trucks driving automatically at low speed, the motion state of the vehicle during normal running undergoes no large abrupt change within this interval, so the accuracy is very high.
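The wave-gate prediction and association described above can be sketched as follows. This is an illustrative Python sketch; the square gate shape, the scan interval `dt`, the `margin` half-width, and all field names are assumptions not fixed by the application:

```python
def predict_gate(cluster_xy, cluster_v, dt, margin):
    """Predict the 'wave gate' (target coordinate range) for the next
    scan: advance the cluster point by its velocity over one scan
    interval dt, then open a square range of +/- margin around the
    predicted position."""
    px = cluster_xy[0] + cluster_v[0] * dt
    py = cluster_xy[1] + cluster_v[1] * dt
    return (px - margin, px + margin, py - margin, py + margin)


def same_target(gate, candidate, v_ref, v_tol):
    """Association rule: a point detected in the next scan is judged to
    belong to the clustered target if it lies inside the gate and its
    velocity is similar to the cluster velocity."""
    x_lo, x_hi, y_lo, y_hi = gate
    return (x_lo <= candidate["x"] <= x_hi
            and y_lo <= candidate["y"] <= y_hi
            and abs(candidate["vx"] - v_ref[0]) <= v_tol
            and abs(candidate["vy"] - v_ref[1]) <= v_tol)
```

Because the scan interval is tens of milliseconds, the gate can stay small, which is what keeps the association cheap and reliable.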
Step S14: and determining a travelable area according to the static point cloud, and planning a path based on the travelable area and the tracking target information so as to navigate the vehicle.
In the embodiment of the application, the area where the static point cloud is located is determined as a non-drivable area, and the other areas are drivable areas. In addition, a non-drivable area may also be determined as follows: if no change in the point cloud information of all the continuous point clouds is detected within a third preset coordinate range over a second preset number of consecutive scanning periods, the third preset coordinate range is determined to be a non-drivable area. It should be noted that the second preset number may be 5, and may be reduced or increased appropriately to adjust sensitivity according to the requirements of the specific application scene, such as vehicle and target running speeds and real-time performance; the third preset coordinate range defaults to 1 m in both length and width, and its size can be adjusted according to the needs of the specific application.
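The drivable-area determination can be sketched as a simple occupancy grid. This is an illustrative Python sketch; representing the areas as square cells, and the function and parameter names, are assumptions (the 1 m default cell matches the default third preset coordinate range above):

```python
import math

def drivable_cells(static_points, x_min, x_max, y_min, y_max, cell=1.0):
    """Occupancy-grid sketch: divide the plane into square cells and
    mark every cell containing a static point as non-drivable; the
    remaining cells within the bounds form the drivable area."""
    occupied = {(math.floor(x / cell), math.floor(y / cell))
                for x, y in static_points}
    drivable = []
    for i in range(math.floor(x_min / cell), math.ceil(x_max / cell)):
        for j in range(math.floor(y_min / cell), math.ceil(y_max / cell)):
            if (i, j) not in occupied:
                drivable.append((i, j))
    return drivable
```

A path planner would then search only over the returned free cells, combined with the tracked dynamic targets.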
In the embodiment of the application, for road conditions such as steps and cliffs lower than the current road section, the imaging millimeter wave radar cannot accurately identify them, and other sensors are needed to assist in identifying such road conditions. If the method is applied to automatic driving in specific scenes, such as mine trucks in open-pit mining areas, a map of the operation area is generally used in combination with the Beidou satellite positioning system to prevent the vehicle from mistakenly driving off a cliff.
Therefore, the application converts the point clouds which are obtained by each imaging millimeter wave radar and are positioned in each radar coordinate system into the vehicle coordinate system so as to splice the point clouds to obtain continuous point clouds; the continuous point cloud comprises a continuous point cloud speed and continuous point cloud coordinates of the continuous point cloud relative to the vehicle; the radar coordinate systems and the vehicle coordinate system are rectangular coordinate systems; determining ground point cloud information of the continuous point cloud relative to the ground according to the detected vehicle motion information of the vehicle relative to the ground and the continuous point cloud, and distinguishing static point cloud and dynamic point cloud from the continuous point cloud based on the ground point cloud information; clustering the dynamic point cloud according to the point cloud information to obtain a clustered target, and tracking the clustered target according to the clustered target speed and the clustered target moving direction contained in the clustered target to obtain tracking target information corresponding to the clustered target; and determining a travelable area according to the static point cloud, and planning a path based on the travelable area and the tracking target information so as to navigate the vehicle. 
In this way, the imaging millimeter wave radar, which has strong weather adaptability and no fragile glass lens, can realize all-weather operation. In addition, the application splices the point clouds acquired by the different imaging millimeter wave radars to obtain a continuous point cloud representing the surrounding environment of the vehicle, distinguishes dynamic point clouds from static point clouds according to the ground point cloud information determined from the vehicle motion information and the continuous point cloud, clusters the dynamic point clouds to obtain clustered targets, tracks the clustered targets to obtain tracking target information, and determines the drivable area by utilizing the static point clouds.
Referring to fig. 5, an embodiment of the present application discloses a specific vehicle navigation method based on imaging millimeter wave radar, which includes:
step S21: converting the point clouds which are obtained by the imaging millimeter wave radars and are positioned in the radar coordinate systems into a vehicle coordinate system so as to splice the point clouds to obtain continuous point clouds; the continuous point cloud comprises a continuous point cloud speed and continuous point cloud coordinates of the continuous point cloud relative to the vehicle; the radar coordinate systems and the vehicle coordinate system are rectangular coordinate systems.
For more specific processing in step S21, reference may be made to the corresponding content disclosed in the foregoing embodiment, and a detailed description is omitted herein.
Step S22: detecting the vehicle motion information including a vehicle motion speed and a vehicle yaw rate of the vehicle relative to the ground; determining the speed of the continuous point cloud relative to the ground according to the vehicle motion information and the speed of the continuous point cloud in the continuous point cloud; and if the speed of the continuous point cloud to the ground point cloud is smaller than a preset speed threshold, determining the continuous point cloud as a static point cloud, and if the speed of the continuous point cloud to the ground point cloud is not smaller than the preset speed threshold, determining the continuous point cloud as a dynamic point cloud.
According to the embodiment of the application, radar point cloud information is obtained, including the radar point cloud coordinates, radar point cloud distance, radar point cloud speed and radar point cloud azimuth angle of the radar point cloud relative to the radar; vehicle motion information is obtained, including the vehicle motion speed and vehicle yaw rate of the vehicle relative to the ground; and radar installation information is obtained, including the coordinate information of each imaging millimeter wave radar in the vehicle coordinate system and the yaw angle of each imaging millimeter wave radar. From the radar point cloud information, the vehicle motion information and the radar installation information, the relative motion relations among the radar point cloud, the vehicle and the ground are obtained, so that the ground point cloud information is deduced, and static point clouds and dynamic point clouds are then further distinguished according to the ground point cloud information. It should be noted that, in the above process, the continuous point cloud has already been obtained from the radar installation information and the radar point cloud information; the continuous point cloud includes continuous point cloud information relative to the vehicle, and the continuous point cloud and the vehicle motion information are used to deduce the ground point cloud information, after which static and dynamic point clouds are distinguished. Specifically, the speed of the continuous point cloud relative to the ground is determined according to the vehicle motion information and the continuous point cloud speed in the continuous point cloud.
If the ground point cloud speed of the continuous point cloud is smaller than a preset speed threshold, the continuous point cloud is determined to be a static point cloud; if it is not smaller than the preset speed threshold, the continuous point cloud is determined to be a dynamic point cloud. It is understood that the vehicle yaw rate is a physical quantity characterizing how quickly the heading angle of the vehicle changes, expressed in degrees/second or radians/second.
It should be noted that, since a point cloud speed includes an X-axis direction speed and a Y-axis direction speed, the preset speed threshold includes a first preset speed threshold for the X-axis direction and a second preset speed threshold for the Y-axis direction. When the X-axis direction speed of the continuous point cloud is smaller than the first preset speed threshold and the Y-axis direction speed is smaller than the second preset speed threshold, the continuous point cloud is determined to be a static point cloud; otherwise it is determined to be a dynamic point cloud.
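The per-axis threshold test can be sketched as follows; this is a trivial illustrative helper, with argument names that are assumptions:

```python
def classify_point(vg_x, vg_y, thresh_x, thresh_y):
    """Per-axis threshold test: a continuous point cloud is static only
    when both ground-relative velocity components are below their
    respective preset thresholds; otherwise it is dynamic. The small
    thresholds absorb measurement jitter in the radar data."""
    if abs(vg_x) < thresh_x and abs(vg_y) < thresh_y:
        return "static"
    return "dynamic"
```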
Specifically, the yaw rate of the vehicle is also called the angular velocity, represented by ω. According to the law of circular motion, a target at radius r has the corresponding linear velocity v = ωr.
Let x and y be the projections of r on the coordinate axes of a rectangular coordinate system whose origin is the center of the circle. Differentiating the position with respect to time gives the linear velocity at the coordinates; since the linear velocity is tangential and perpendicular to the straight line defined by the origin and the coordinate point, it is decomposed into 2 components along the coordinate axes of the vehicle coordinate system, denoted v_line_x and v_line_y for convenience, which can be calculated as:
v_line_x = -ω·y;
v_line_y = ω·x.
It will be appreciated that the radar point cloud velocity v_pointcloud relative to the radar has a component v_pointcloud_x on the X axis and a component v_pointcloud_y on the Y axis. By performing vector addition and subtraction of v_pointcloud_x and v_pointcloud_y with the vehicle speed v_vehicle and the linear velocity components v_line_x and v_line_y, the ground point cloud speed v_ground can be calculated. For example, when the vehicle travels straight, v_line is 0; if the transverse relative velocity v_pointcloud_x of the radar point cloud is also 0, the continuous point cloud ground speed is obtained by adding the longitudinal speeds: v_ground = v_ground_y = v_vehicle + v_pointcloud_y. When the continuous point cloud is a static point cloud, v_pointcloud_y is equal in magnitude to v_vehicle but opposite in direction; that is, whether v_vehicle + v_pointcloud_y equals 0 can be used to judge whether the continuous point cloud is a static point cloud. Of course, if v_vehicle + v_pointcloud_y = 0 but v_pointcloud_x ≠ 0, the continuous point cloud is still a dynamic point cloud. When the vehicle turns, v_ground_x = v_pointcloud_x + v_line_x and v_ground_y = v_vehicle + v_pointcloud_y + v_line_y; if both are equal to 0, the continuous point cloud is judged to be a static point cloud, otherwise it is judged to be a dynamic point cloud.
Because the data actually detected by the radar jitters to a certain extent, as long as v_ground_x is less than the first preset speed threshold and v_ground_y is less than the second preset speed threshold, the continuous point cloud is judged to be a static point cloud; otherwise it is judged to be a dynamic point cloud.
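The composition of the ground-relative velocity can be sketched as follows. This is an illustrative Python sketch; the tangential component is taken as (-ω·y, ω·x), which assumes a counter-clockwise-positive yaw rate, a sign convention the application does not state explicitly, and all names are illustrative:

```python
def ground_velocity(v_pc_x, v_pc_y, x, y, v_vehicle, omega):
    """Ground-relative velocity of a point detected at vehicle-frame
    coordinates (x, y). The vehicle's rotation at yaw rate omega adds a
    tangential component (-omega*y, omega*x) at the point (assumed sign
    convention); this is summed with the measured relative velocity
    (v_pc_x, v_pc_y) and the vehicle's own speed along the Y axis."""
    v_line_x = -omega * y
    v_line_y = omega * x
    return v_pc_x + v_line_x, v_vehicle + v_pc_y + v_line_y
```

For a stationary object straight ahead while the vehicle drives straight (omega = 0), the measured relative longitudinal speed cancels the vehicle speed and both returned components are 0, matching the static-point criterion above.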
Step S23: clustering the dynamic point cloud according to the point cloud information to obtain a clustered target, and tracking the clustered target according to the clustered target speed and the clustered target moving direction contained in the clustered target to obtain tracking target information corresponding to the clustered target.
For more specific processing in step S23, reference may be made to the corresponding content disclosed in the foregoing embodiment, and a detailed description is omitted herein.
Step S24: and determining a travelable area according to the static point cloud, and planning a path based on the travelable area and the tracking target information so as to navigate the vehicle.
For more specific processing in step S24, reference may be made to the corresponding content disclosed in the foregoing embodiment, and a detailed description is omitted herein.
Therefore, the application converts the point clouds which are obtained by each imaging millimeter wave radar and are positioned in each radar coordinate system into the vehicle coordinate system so as to splice the point clouds to obtain continuous point clouds; the continuous point cloud comprises a continuous point cloud speed and continuous point cloud coordinates of the continuous point cloud relative to the vehicle; the radar coordinate systems and the vehicle coordinate system are rectangular coordinate systems; detecting the vehicle motion information including a vehicle motion speed and a vehicle yaw rate of the vehicle relative to the ground; determining the speed of the continuous point cloud relative to the ground according to the vehicle motion information and the speed of the continuous point cloud in the continuous point cloud; if the speed of the continuous point cloud to the ground point cloud is smaller than a preset speed threshold, determining the continuous point cloud as a static point cloud, and if the speed of the continuous point cloud to the ground point cloud is not smaller than the preset speed threshold, determining the continuous point cloud as a dynamic point cloud; clustering the dynamic point cloud according to the point cloud information to obtain a clustered target, and tracking the clustered target according to the clustered target speed and the clustered target moving direction contained in the clustered target to obtain tracking target information corresponding to the clustered target; and determining a travelable area according to the static point cloud, and planning a path based on the travelable area and the tracking target information so as to navigate the vehicle. 
In this way, the imaging millimeter wave radar, which has strong weather adaptability and no fragile glass lens, can realize all-weather operation. In addition, the application splices the point clouds acquired by the different imaging millimeter wave radars to obtain a continuous point cloud representing the surrounding environment of the vehicle, distinguishes dynamic point clouds from static point clouds according to the point cloud speed determined from the vehicle motion information and the continuous point cloud, clusters the dynamic point clouds to obtain clustered targets, tracks the clustered targets to obtain tracking target information, and determines the drivable area by utilizing the static point clouds.
As shown in fig. 6, a vehicle automatic driving system used in the embodiment of the application includes an imaging millimeter wave radar subsystem and other on-vehicle devices such as integrated navigation and a power supply. The system belongs to a subset of the whole automatic driving system and outputs the primarily processed target and judgment result data to a main control unit of the automatic driving system for fusion operation and use; the other devices include other sensors, and the system comprises millimeter wave radar sensors, a radar gateway and a processor. In the imaging millimeter wave radar subsystem, radar 1, radar 2, radar 3, radar 4, radar 5 and radar 6 send point clouds to the radar gateway over Ethernet; a network switch inside the radar gateway collects the point clouds output by each imaging millimeter wave radar through its Ethernet interfaces and outputs them to the processor through one gigabit network interface. The power supply among the other devices supplies power to the processor through a 12 V DC / 10 A power supply line; the other sensors among the other devices transmit the vehicle speed and the vehicle yaw rate to the processor through a CAN bus or RS232; and the processor transmits the processed result to the other devices through an output interface, so as to carry out path planning and navigate the vehicle.
It should be noted that the steps of converting the point clouds in the radar coordinate systems into the vehicle coordinate system, splicing the point clouds to obtain the continuous point cloud, determining and deleting false targets, distinguishing static point clouds from dynamic point clouds, clustering the target point clouds to obtain clustered targets, tracking the clustered targets to obtain tracking target information, determining the drivable area by utilizing the static point clouds, and the like, are all completed by the processor. The processor includes, but is not limited to, an embedded processor.
Referring to fig. 7, an embodiment of the present application discloses a vehicle navigation device based on imaging millimeter wave radar, comprising:
the point cloud splicing module 11 is used for converting the point cloud, which is obtained by each imaging millimeter wave radar and is positioned in each radar coordinate system, into a vehicle coordinate system so as to splice the point cloud to obtain continuous point cloud; the continuous point cloud comprises a continuous point cloud speed and continuous point cloud coordinates of the continuous point cloud relative to the vehicle; the radar coordinate systems and the vehicle coordinate system are rectangular coordinate systems;
a point cloud distinguishing module 12, configured to determine, according to the detected vehicle motion information of the vehicle relative to the ground and the continuous point cloud, the ground point cloud information of the continuous point cloud relative to the ground, and to distinguish static point clouds and dynamic point clouds based on the ground point cloud information;
the clustering module 13 is configured to cluster the dynamic point cloud according to the point cloud information to obtain a clustered target, and track the clustered target according to a clustered target speed and a clustered target moving direction contained in the clustered target to obtain tracking target information corresponding to the clustered target;
And the path planning module 14 is used for determining a drivable area according to the static point cloud and carrying out path planning based on the drivable area and the tracking target information so as to navigate the vehicle.
The more specific working process of each module may refer to the corresponding content disclosed in the foregoing embodiment, and will not be described herein.
Therefore, the application converts the point clouds which are obtained by each imaging millimeter wave radar and are positioned in each radar coordinate system into the vehicle coordinate system so as to splice the point clouds to obtain continuous point clouds; the continuous point cloud comprises a continuous point cloud speed and continuous point cloud coordinates of the continuous point cloud relative to the vehicle; the radar coordinate systems and the vehicle coordinate system are rectangular coordinate systems; determining ground point cloud information of the continuous point cloud relative to the ground according to the detected vehicle motion information of the vehicle relative to the ground and the continuous point cloud, and distinguishing static point cloud and dynamic point cloud from the continuous point cloud based on the ground point cloud information; clustering the dynamic point cloud according to the point cloud information to obtain a clustered target, and tracking the clustered target according to the clustered target speed and the clustered target moving direction contained in the clustered target to obtain tracking target information corresponding to the clustered target; and determining a travelable area according to the static point cloud, and planning a path based on the travelable area and the tracking target information so as to navigate the vehicle. 
In this way, the imaging millimeter wave radar, which has strong weather adaptability and no fragile glass lens, can realize all-weather operation. In addition, the application splices the point clouds acquired by the different imaging millimeter wave radars to obtain a continuous point cloud representing the surrounding environment of the vehicle, distinguishes dynamic point clouds from static point clouds according to the ground point cloud information determined from the vehicle motion information and the continuous point cloud, clusters the dynamic point clouds to obtain clustered targets, tracks the clustered targets to obtain tracking target information, and determines the drivable area by utilizing the static point clouds.
Further, the embodiment of the present application further provides an electronic device, and fig. 8 is a block diagram of an electronic device 20 according to an exemplary embodiment, where the content of the diagram is not to be considered as any limitation on the scope of use of the present application.
Fig. 8 is a schematic structural diagram of an electronic device 20 according to an embodiment of the present application. The electronic device 20 may specifically include: at least one processor 21, at least one memory 22, a power supply 23, an input-output interface 24, a communication interface 25, and a communication bus 26. Wherein the memory 22 is for storing a computer program that is loaded and executed by the processor 21 to implement the relevant steps of the imaging millimeter wave radar-based vehicle navigation method disclosed in any of the foregoing embodiments.
In this embodiment, the power supply 23 is configured to provide an operating voltage for each hardware device on the electronic device 20; the communication interface 25 can create a data transmission channel between the electronic device 20 and an external device, and the communication protocol to be followed is any communication protocol applicable to the technical solution of the present application, which is not specifically limited herein; the input/output interface 24 is used for obtaining external input data or outputting external output data, and the specific interface type thereof may be selected according to the specific application needs, which is not limited herein.
The memory 22, as a carrier for storing resources, may be a read-only memory, a random access memory, a magnetic disk, or an optical disk; it may comprise a random access memory serving as running memory together with a nonvolatile external memory for storage purposes. The resources stored on it include an operating system 221, a computer program 222, and the like, and the storage may be temporary or permanent.
The operating system 221 is used to manage and control the hardware devices on the electronic device 20 and the computer program 222, and may be Windows, Unix, Linux, or the like. In addition to the computer program that performs the imaging millimeter wave radar-based vehicle navigation method disclosed in any of the foregoing embodiments and executed by the electronic device 20, the computer program 222 may further include computer programs that perform other specific tasks.
In this embodiment, the input/output interface 24 may specifically include, but is not limited to, a USB interface, a hard disk read interface, a serial interface, a voice input interface, a fingerprint input interface, and the like.
Further, the embodiment of the application also discloses a computer readable storage medium for storing a computer program; wherein the computer program when executed by a processor implements the aforementioned disclosed imaging millimeter wave radar-based vehicle navigation method.
For specific steps of the method, reference may be made to the corresponding contents disclosed in the foregoing embodiments, and detailed description thereof will not be repeated here.
The computer-readable storage medium referred to herein includes random access memory (RAM), read-only memory (ROM), electrically programmable ROM, electrically erasable programmable ROM, registers, a hard disk, a magnetic or optical disk, or any other form of storage medium known in the art. The computer program, when executed by the processor, implements the aforementioned imaging millimeter wave radar-based vehicle navigation method. For specific steps of the method, reference may be made to the corresponding contents disclosed in the foregoing embodiments, which are not repeated here.
In this specification, the embodiments are described in a progressive manner, each embodiment focusing on its differences from the others; for the same or similar parts among the embodiments, reference may be made to one another. As the apparatus disclosed in an embodiment corresponds to the imaging millimeter wave radar-based vehicle navigation method disclosed in the same embodiment, its description is relatively brief, and the relevant points can be found in the description of the method.
Those of skill would further appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or combinations of both, and that the various illustrative elements and steps are described above generally in terms of functionality in order to clearly illustrate the interchangeability of hardware and software. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
The steps of an algorithm described in connection with the embodiments disclosed herein may be embodied directly in hardware, in a software module executed by a processor, or in a combination of the two. The software module may reside in random access memory (RAM), read-only memory (ROM), electrically programmable ROM, electrically erasable programmable ROM, registers, a hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art.
Finally, it is further noted that relational terms such as first and second are used solely to distinguish one entity or action from another, and do not necessarily require or imply any actual relationship or order between such entities or actions. Moreover, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises that element.
The vehicle navigation method, device, equipment and medium based on imaging millimeter wave radar have been described in detail above. Specific examples are used herein to illustrate the principle and implementation of the invention, and the description of these examples is intended only to help in understanding the method and its core idea. Meanwhile, those skilled in the art may make changes to the specific embodiments and the scope of application in accordance with the idea of the present invention. In summary, the contents of this description should not be construed as limiting the present invention.

Claims (8)

1. A vehicle navigation method based on imaging millimeter wave radar, characterized by comprising:
converting the point clouds, which are obtained by the imaging millimeter wave radars and located in the respective radar coordinate systems, into a vehicle coordinate system so as to splice the point clouds into a continuous point cloud; wherein the continuous point cloud comprises a continuous point cloud speed and continuous point cloud coordinates of the continuous point cloud relative to the vehicle; and the radar coordinate systems and the vehicle coordinate system are rectangular coordinate systems;
determining ground point cloud information of the continuous point cloud relative to the ground according to the detected vehicle motion information of the vehicle relative to the ground and the continuous point cloud, and distinguishing static point cloud and dynamic point cloud from the continuous point cloud based on the ground point cloud information;
clustering the dynamic point cloud according to the point cloud information to obtain a clustered target, and tracking the clustered target according to the clustered target speed and the clustered target moving direction contained in the clustered target to obtain tracking target information corresponding to the clustered target;
determining a drivable area according to the static point cloud, and planning a path based on the drivable area and the tracking target information so as to navigate the vehicle;
the step of clustering the dynamic point cloud according to the point cloud information to obtain a clustered target includes:
determining, from the dynamic point clouds within a second preset coordinate range, target dynamic point clouds with similar point cloud information;
clustering the target dynamic point cloud according to preset contour information to obtain a clustered target, and determining a clustered point corresponding to the clustered target and coordinates of the clustered point;
the method for tracking the clustered targets according to the clustered target speed and the clustered target moving direction contained in the clustered targets to obtain tracking target information corresponding to the clustered targets comprises the following steps:
predicting a target coordinate range of the clustered targets in the next scanning according to the clustered target speed, the clustered target moving direction and the clustered point coordinates contained in the clustered targets;
and if a point cloud to be judged, which has a speed and a moving direction similar to those of the clustered target, is detected within the target coordinate range during the next scanning, determining the point cloud to be judged as the clustered target, thereby realizing target tracking and obtaining tracking target information corresponding to the clustered target.
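The predict-then-associate tracking step of claim 1 can be sketched as follows: advance the cluster point along the target's moving direction at the clustered target speed to obtain the expected coordinate range for the next scan, then accept a candidate point cloud as the same target only if it falls inside that range with similar speed and heading. All names and thresholds (`predict_gate`, `gate_radius`, `dv_max`, `dh_max`) are illustrative, not taken from the patent:

```python
import math

def predict_gate(cluster, dt, gate_radius=1.5):
    """Predict where a clustered target should reappear at the next
    scan (dt seconds later) and return the predicted centre together
    with a search radius around it."""
    px = cluster['x'] + cluster['speed'] * math.cos(cluster['heading']) * dt
    py = cluster['y'] + cluster['speed'] * math.sin(cluster['heading']) * dt
    return (px, py), gate_radius

def associate(cluster, candidate, dt, dv_max=1.0, dh_max=0.3):
    """Treat a candidate point cloud as the same clustered target when
    it lies inside the predicted coordinate range and has a similar
    speed and moving direction (heading in radians)."""
    (px, py), r = predict_gate(cluster, dt)
    in_gate = math.hypot(candidate['x'] - px, candidate['y'] - py) <= r
    similar = (abs(candidate['speed'] - cluster['speed']) <= dv_max
               and abs(candidate['heading'] - cluster['heading']) <= dh_max)
    return in_gate and similar
```

A full tracker would typically wrap this gating in a Kalman filter with per-target state; the sketch shows only the coordinate-range prediction and the speed/direction similarity test the claim describes.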
2. The imaging millimeter wave radar-based vehicle navigation method according to claim 1, wherein the converting the point cloud in each radar coordinate system obtained by each imaging millimeter wave radar into the vehicle coordinate system so as to perform point cloud stitching to obtain continuous point cloud comprises:
acquiring radar installation information comprising coordinate information of each imaging millimeter wave radar in the vehicle coordinate system and the yaw angle of each imaging millimeter wave radar;
calculating, according to the radar installation information, the coordinates in the vehicle coordinate system of the point clouds located in each imaging millimeter wave radar coordinate system, so as to obtain a plurality of groups of point cloud coordinates in the vehicle coordinate system corresponding to the different imaging millimeter wave radars;
and splicing the plurality of groups of point cloud coordinates to obtain corresponding continuous point clouds.
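The coordinate conversion of claim 2 is a planar rigid transform: rotate each radar-frame point by the radar's mounting yaw angle and translate by the radar's position in the vehicle frame, then concatenate the converted point sets. The sketch below assumes 2D rectangular coordinates and illustrative parameter names (`mount_x`, `mount_y`, `yaw_deg`):

```python
import math

def radar_to_vehicle(x_r, y_r, mount_x, mount_y, yaw_deg):
    """Transform a point from a radar coordinate system into the
    vehicle coordinate system, given the radar's mounting position
    (mount_x, mount_y) in the vehicle frame and its yaw angle in
    degrees. Both frames are rectangular, as in the claim."""
    yaw = math.radians(yaw_deg)
    x_v = mount_x + x_r * math.cos(yaw) - y_r * math.sin(yaw)
    y_v = mount_y + x_r * math.sin(yaw) + y_r * math.cos(yaw)
    return x_v, y_v

def splice(point_sets):
    """Concatenate vehicle-frame point sets from several radars into
    one continuous point cloud. point_sets is a list of
    ((mount_x, mount_y), yaw_deg, [(x_r, y_r), ...]) tuples."""
    merged = []
    for (mount_x, mount_y), yaw_deg, pts in point_sets:
        merged.extend(radar_to_vehicle(x, y, mount_x, mount_y, yaw_deg)
                      for x, y in pts)
    return merged
```

For example, a point 1 m ahead of a radar mounted at (2, 0) and yawed 90° lands at (2, 1) in the vehicle frame.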
3. The imaging millimeter wave radar-based vehicle navigation method according to claim 1, wherein before determining the ground point cloud information of the continuous point cloud relative to the ground according to the detected vehicle motion information of the vehicle relative to the ground and the continuous point cloud, the method further comprises:
if the point cloud quality parameters corresponding to the continuous point clouds meet the preset parameter threshold filtering conditions, determining the continuous point clouds as false point clouds, and deleting the false point clouds;
and/or if the continuous point cloud appears in the current scanning period but cannot be continuously detected within a first preset coordinate range during a first preset number of consecutive scanning periods, determining the continuous point cloud as a false point cloud, and deleting the false point cloud.
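The two false-point-cloud conditions of claim 3 (a quality parameter failing a preset threshold, or failure to re-detect the point within the preset coordinate range over the first preset number of consecutive scans) can be expressed as a simple predicate. Names and thresholds (`q_min`, `n_scans`) are illustrative assumptions:

```python
def is_false_point(quality, detections, q_min=0.3, n_scans=3):
    """Return True if a point cloud should be deleted as false.

    quality: point cloud quality parameter for the current scan.
    detections: booleans, one per subsequent scan period, indicating
    whether the point was re-detected within the preset coordinate
    range. The point is false if quality is below q_min, or if it is
    missed in any of the first n_scans consecutive periods."""
    if quality < q_min:
        return True
    return not all(detections[:n_scans])
```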
4. The imaging millimeter wave radar-based vehicle navigation method according to claim 1, wherein the determining the ground point cloud information of the continuous point cloud relative to the ground according to the detected vehicle motion information of the vehicle relative to the ground and the continuous point cloud, and distinguishing a static point cloud from a dynamic point cloud based on the ground point cloud information, comprises:
detecting the vehicle motion information, which includes a vehicle motion speed and a vehicle yaw rate of the vehicle relative to the ground;
determining the ground-relative speed of the continuous point cloud according to the vehicle motion information and the continuous point cloud speed contained in the continuous point cloud;
and if the ground-relative speed of the continuous point cloud is smaller than a preset speed threshold, determining the continuous point cloud as a static point cloud; if the ground-relative speed is not smaller than the preset speed threshold, determining the continuous point cloud as a dynamic point cloud.
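The ego-motion compensation of claim 4 can be sketched as follows: add the vehicle's translational speed and the rigid-body rotation term from the yaw rate back to the point's vehicle-relative velocity, and classify by thresholding the resulting ground-relative speed. The sketch assumes a planar model with the vehicle moving along +x and rotating about the vehicle-frame origin; all names and the 0.5 m/s threshold are illustrative:

```python
import math

def ground_velocity(vx_rel, vy_rel, v_ego, yaw_rate, x, y):
    """Ground-relative speed of a point at vehicle-frame position
    (x, y) with velocity (vx_rel, vy_rel) relative to the vehicle,
    given vehicle speed v_ego (m/s, along +x) and yaw rate (rad/s).
    The terms (-yaw_rate * y, yaw_rate * x) undo the apparent motion
    induced by the vehicle's rotation."""
    vx_g = vx_rel + v_ego - yaw_rate * y
    vy_g = vy_rel + yaw_rate * x
    return math.hypot(vx_g, vy_g)

def classify(vx_rel, vy_rel, v_ego, yaw_rate, x, y, v_thresh=0.5):
    """Static if the ground-relative speed is below the preset
    threshold, dynamic otherwise."""
    v_g = ground_velocity(vx_rel, vy_rel, v_ego, yaw_rate, x, y)
    return 'static' if v_g < v_thresh else 'dynamic'
```

For instance, a stationary obstacle 20 m ahead of a vehicle driving straight at 10 m/s appears to approach at -10 m/s, yet its ground-relative speed evaluates to zero, so it is classified as static.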
5. The imaging millimeter wave radar-based vehicle navigation method according to any one of claims 1 to 4, characterized by further comprising:
and if no change in the point cloud information of any of the continuous point clouds is detected within a third preset coordinate range during a second preset number of consecutive scanning periods, determining the third preset coordinate range as a non-drivable area.
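The non-drivable-area test of claim 5 reduces to checking that the point cloud summary for a coordinate range is unchanged across the required number of consecutive scan periods. The snapshot representation and the name `n_periods` below are illustrative assumptions:

```python
def is_non_driving_area(snapshots, n_periods=5):
    """Mark a preset coordinate range as non-drivable when the point
    cloud information inside it shows no change across n_periods
    consecutive scan periods.

    snapshots: one hashable summary of the range's point cloud per
    scan period, oldest first."""
    if len(snapshots) < n_periods:
        return False  # not enough history to decide
    recent = snapshots[-n_periods:]
    return all(s == recent[0] for s in recent)
```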
6. A vehicle navigation device based on imaging millimeter wave radar, characterized by comprising:
the point cloud splicing module is used for converting the point cloud which is obtained by each imaging millimeter wave radar and is positioned in each radar coordinate system into a vehicle coordinate system so as to splice the point cloud to obtain continuous point cloud; the continuous point cloud comprises a continuous point cloud speed and continuous point cloud coordinates of the continuous point cloud relative to the vehicle; the radar coordinate systems and the vehicle coordinate system are rectangular coordinate systems;
the point cloud distinguishing module is used for determining the ground point cloud information of the continuous point cloud relative to the ground according to the detected vehicle motion information of the vehicle relative to the ground and the continuous point cloud, and distinguishing static point cloud and dynamic point cloud from the continuous point cloud based on the ground point cloud information;
the clustering module is used for clustering the dynamic point cloud according to the point cloud information to obtain a clustered target, and tracking the clustered target according to the clustered target speed and the clustered target moving direction contained in the clustered target to obtain tracking target information corresponding to the clustered target;
the path planning module is used for determining a drivable area according to the static point cloud, and carrying out path planning based on the drivable area and the tracking target information so as to navigate the vehicle;
wherein, the clustering module is used for:
determining, from the dynamic point clouds within a second preset coordinate range, target dynamic point clouds with similar point cloud information;
clustering the target dynamic point cloud according to preset contour information to obtain a clustered target, and determining a clustered point corresponding to the clustered target and coordinates of the clustered point;
wherein, the clustering module is further configured to:
predicting a target coordinate range of the clustered targets in the next scanning according to the clustered target speed, the clustered target moving direction and the clustered point coordinates contained in the clustered targets;
and if a point cloud to be judged, which has a speed and a moving direction similar to those of the clustered target, is detected within the target coordinate range during the next scanning, determining the point cloud to be judged as the clustered target, thereby realizing target tracking and obtaining tracking target information corresponding to the clustered target.
7. An electronic device comprising a processor and a memory; wherein the processor, when executing the computer program stored in the memory, implements the imaging millimeter wave radar-based vehicle navigation method according to any one of claims 1 to 5.
8. A computer-readable storage medium storing a computer program; wherein the computer program, when executed by a processor, implements the imaging millimeter wave radar-based vehicle navigation method according to any one of claims 1 to 5.
CN202210108581.5A 2022-01-28 2022-01-28 Vehicle navigation method, device, equipment and medium based on imaging millimeter wave radar Active CN114442101B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210108581.5A CN114442101B (en) 2022-01-28 2022-01-28 Vehicle navigation method, device, equipment and medium based on imaging millimeter wave radar

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210108581.5A CN114442101B (en) 2022-01-28 2022-01-28 Vehicle navigation method, device, equipment and medium based on imaging millimeter wave radar

Publications (2)

Publication Number Publication Date
CN114442101A CN114442101A (en) 2022-05-06
CN114442101B true CN114442101B (en) 2023-11-14

Family

ID=81372193

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210108581.5A Active CN114442101B (en) 2022-01-28 2022-01-28 Vehicle navigation method, device, equipment and medium based on imaging millimeter wave radar

Country Status (1)

Country Link
CN (1) CN114442101B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114637006B (en) * 2022-05-07 2023-03-10 长沙莫之比智能科技有限公司 Early warning area self-adaptive adjustment method based on millimeter wave personnel fall detection radar
CN115290104A (en) * 2022-07-14 2022-11-04 襄阳达安汽车检测中心有限公司 Simulation map generation method, device, equipment and readable storage medium
CN115017467B (en) * 2022-08-08 2022-11-15 北京主线科技有限公司 Method and device for compensating following target and storage medium
CN116953704A (en) * 2022-12-23 2023-10-27 河北德冠隆电子科技有限公司 Multi-dimensional-angle adjustable omnidirectional scanning millimeter wave radar for intelligent transportation

Citations (39)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108226924A (en) * 2018-01-11 2018-06-29 李烜 Vehicle driving environment detection method and apparatus based on millimeter wave radar, and application thereof
CN108345823A (en) * 2017-01-23 2018-07-31 郑州宇通客车股份有限公司 Obstacle tracking method and device based on Kalman filtering
CN108345007A (en) * 2017-01-23 2018-07-31 郑州宇通客车股份有限公司 Obstacle recognition method and device
CN108710828A (en) * 2018-04-18 2018-10-26 北京汽车集团有限公司 Object recognition method, apparatus, storage medium, and vehicle
CN109740628A (en) * 2018-12-03 2019-05-10 深圳市华讯方舟太赫兹科技有限公司 Point cloud clustering method, image processing device, and apparatus with storage function
CN109870680A (en) * 2018-10-26 2019-06-11 北京润科通用技术有限公司 Target classification method and device
CN110084840A (en) * 2019-04-24 2019-08-02 百度在线网络技术(北京)有限公司 Point cloud registration method, device, server and computer-readable medium
CN110208793A (en) * 2019-04-26 2019-09-06 纵目科技(上海)股份有限公司 Driver assistance system, method, terminal and medium based on millimeter wave radar
CN110310294A (en) * 2019-07-08 2019-10-08 江苏易图地理信息科技股份有限公司 Point cloud segmentation method using adaptive non-uniform sampling surface fuzzy C-means clustering
CN110412516A (en) * 2019-08-20 2019-11-05 河北德冠隆电子科技有限公司 Method and device for millimeter wave radar detection of stationary and slowly-changing objects
CN110914703A (en) * 2017-07-31 2020-03-24 深圳市大疆创新科技有限公司 Correction of motion-based inaccuracies in point clouds
CN110969855A (en) * 2019-12-13 2020-04-07 长沙莫之比智能科技有限公司 Traffic flow monitoring system based on millimeter wave radar
EP3633404A1 (en) * 2018-10-02 2020-04-08 Ibeo Automotive Systems GmbH Method and apparatus for optical distance measurements
CN111239766A (en) * 2019-12-27 2020-06-05 北京航天控制仪器研究所 Water surface multi-target rapid identification and tracking method based on laser radar
CN111260683A (en) * 2020-01-09 2020-06-09 合肥工业大学 Target detection and tracking method and device for three-dimensional point cloud data
CN111582352A (en) * 2020-04-30 2020-08-25 上海高仙自动化科技发展有限公司 Object-based sensing method and device, robot and storage medium
CN111797734A (en) * 2020-06-22 2020-10-20 广州视源电子科技股份有限公司 Vehicle point cloud data processing method, device, equipment and storage medium
CN111857168A (en) * 2020-07-03 2020-10-30 北京二郎神科技有限公司 Unmanned aerial vehicle positioning method and device and unmanned aerial vehicle parking attitude adjusting method and device
WO2021017314A1 (en) * 2019-07-29 2021-02-04 浙江商汤科技开发有限公司 Information processing method, information positioning method and apparatus, electronic device and storage medium
JPWO2021053811A1 (en) * 2019-09-20 2021-03-25
CN112847343A (en) * 2020-12-29 2021-05-28 深圳市普渡科技有限公司 Dynamic target tracking and positioning method, device, equipment and storage medium
CN113031005A (en) * 2021-02-22 2021-06-25 江苏大学 Crane dynamic obstacle identification method based on laser radar
CN113139607A (en) * 2021-04-27 2021-07-20 苏州挚途科技有限公司 Obstacle detection method and device
CN113156414A (en) * 2020-12-16 2021-07-23 中国人民解放军陆军工程大学 Intelligent sensing and path planning transportation system based on MIMO millimeter wave radar
CN113313200A (en) * 2021-06-21 2021-08-27 中国科学院自动化研究所苏州研究院 Point cloud fine matching method based on normal constraint
CN113391270A (en) * 2021-06-11 2021-09-14 森思泰克河北科技有限公司 False target suppression method and device for multi-radar point cloud fusion and terminal equipment
JPWO2021181647A1 (en) * 2020-03-13 2021-09-16
CN113479218A (en) * 2021-08-09 2021-10-08 哈尔滨工业大学 Roadbed automatic driving auxiliary detection system and control method thereof
CN113537316A (en) * 2021-06-30 2021-10-22 南京理工大学 Vehicle detection method based on 4D millimeter wave radar point cloud
CN113589288A (en) * 2021-06-24 2021-11-02 广西综合交通大数据研究院 Target screening method, device and equipment based on millimeter wave radar and storage medium
CN113671481A (en) * 2021-07-21 2021-11-19 西安电子科技大学 3D multi-target tracking processing method based on millimeter wave radar
CN113674355A (en) * 2021-07-06 2021-11-19 中国北方车辆研究所 Target identification and positioning method based on camera and laser radar
CN113689471A (en) * 2021-09-09 2021-11-23 中国联合网络通信集团有限公司 Target tracking method and device, computer equipment and storage medium
CN113721234A (en) * 2021-08-30 2021-11-30 南京慧尔视智能科技有限公司 Vehicle-mounted millimeter wave radar point cloud data dynamic and static separation filtering method and device
CN113761238A (en) * 2021-08-27 2021-12-07 广州文远知行科技有限公司 Point cloud storage method, device, equipment and storage medium
CN113792699A (en) * 2021-09-24 2021-12-14 北京易航远智科技有限公司 Object-level rapid scene recognition method based on semantic point cloud
CN113807168A (en) * 2021-08-05 2021-12-17 北京蜂云科创信息技术有限公司 Vehicle driving environment sensing method, vehicle-mounted equipment and storage medium
CN113848545A (en) * 2021-09-01 2021-12-28 电子科技大学 Fusion target detection and tracking method based on vision and millimeter wave radar
CN113888748A (en) * 2021-09-27 2022-01-04 北京经纬恒润科技股份有限公司 Point cloud data processing method and device

Non-Patent Citations (6)

* Cited by examiner, † Cited by third party
Title
Dynamic target recognition in moving images of 3D laser point clouds; Luo Ge; Laser Journal; Vol. 42, No. 9; pp. 134-138 *
Improved K-means clustering algorithm with dynamic allocation of cluster centers; Cheng Yanyun et al.; Computer Technology and Development; Vol. 27, No. 2; pp. 33-37 *
3D building modeling combining terrestrial laser point clouds with photogrammetry; Wang Xiaochu; Journal of Anhui Normal University (Natural Science Edition); Vol. 40, No. 6; pp. 569-573 *
Vehicle target recognition algorithm based on fusion of lidar and infrared images; Zhan Yinze; Laser & Infrared; Vol. 51, No. 9; pp. 1238-1242 *
Single-target tracking in lidar point clouds fused with an auxiliary neural network; Zhou Xiaoyu; Chinese Journal of Lasers; Vol. 48, No. 21; pp. 2110001-1 to 2110001-13 *
Di Huijun et al. Target Detection and Motion Tracking for Unmanned Vehicles. Beijing Institute of Technology Press, 2021, pp. 143-145. *

Also Published As

Publication number Publication date
CN114442101A (en) 2022-05-06

Similar Documents

Publication Publication Date Title
CN109927719B (en) Auxiliary driving method and system based on obstacle trajectory prediction
CN114442101B (en) Vehicle navigation method, device, equipment and medium based on imaging millimeter wave radar
CN108572663B (en) Target tracking
EP3580581B1 (en) Using wheel orientation to determine future heading
CN112665556B (en) Generating a three-dimensional map of a scene using passive and active measurements
US10705220B2 (en) System and method for ground and free-space detection
US11004000B1 (en) Predicting trajectory intersection by another road user
CN113002396B Environmental perception system for an autonomous driving mining vehicle, and mining vehicle
US20220155415A1 (en) Detecting Spurious Objects For Autonomous Vehicles
JPH11212640A (en) Autonomously traveling vehicle and method for controlling autonomously traveling vehicle
US10380757B2 (en) Detecting vehicle movement through wheel movement
CN113085896B (en) Auxiliary automatic driving system and method for modern rail cleaning vehicle
KR20190126024A (en) Traffic Accident Handling Device and Traffic Accident Handling Method
US11673581B2 (en) Puddle occupancy grid for autonomous vehicles
US20230152458A1 (en) Lidar System with Gyroscope-Aided Focus Steering
CN114518113A (en) Filtering return points in a point cloud based on radial velocity measurements
CN112485784A (en) Method and device for determining danger coefficient of target in inner wheel difference region, electronic equipment and storage medium
US11590978B1 (en) Assessing perception of sensor using known mapped objects
CN114348018A (en) Automatic driving system and method for commercial vehicle
CN114084129A (en) Fusion-based vehicle automatic driving control method and system
EP4187277A1 (en) A method to detect radar installation error for pitch angle on autonomous vehicles
RU2775817C2 (en) Method and system for training machine learning algorithm for detecting objects at a distance
CN215495425U (en) Compound eye imaging system and vehicle using same
US20240029450A1 (en) Automated driving management system and automated driving management method
US20230152466A1 (en) Lidar System with Scene Dependent Focus Intensity

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant