WO2021259344A1 - Vehicle detection method and device, vehicle, and storage medium
- Publication number
- WO2021259344A1 (PCT/CN2021/101971)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- vehicle
- radar
- target
- coordinates
- millimeter wave
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/88—Radar or analogous systems specially adapted for specific applications
- G01S13/93—Radar or analogous systems specially adapted for specific applications for anti-collision purposes
- G01S13/931—Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/21—Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
- G06F18/214—Generating training patterns; Bootstrap methods, e.g. bagging or boosting
- G06F18/2148—Generating training patterns; Bootstrap methods, e.g. bagging or boosting characterised by the process organisation or structure, e.g. boosting cascade
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/24—Classification techniques
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/25—Fusion techniques
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V2201/00—Indexing scheme relating to image or video recognition or understanding
- G06V2201/08—Detecting or categorising vehicles
Definitions
- This application relates to the field of vehicle technology, for example, to a vehicle detection method, device, vehicle, and storage medium.
- Environment perception technology is the foundation and prerequisite for the development of intelligent driving vehicles.
- The sensors commonly used for environment perception include millimeter-wave radar, cameras, and lidar.
- Millimeter-wave radar is not susceptible to weather factors such as rain, snow, and fog, adapts well to the environment, and has clear advantages in extracting the range and speed information of a target.
- Camera-based visual recognition has clear advantages in target recognition and classification and can perceive rich spatial information. Millimeter-wave radars and cameras are therefore often used together to perceive and fuse environmental information to assist autonomous driving.
- However, when environmental information is perceived based on a millimeter-wave radar and a camera, the amount of fused data is large, the efficiency is low, and the real-time performance of the perceived environmental information is poor, which is not conducive to assisted autonomous driving.
- This application provides a vehicle detection method, device, vehicle, and storage medium to address the problem that, in the process of perceiving environmental information based on a millimeter-wave radar and a camera, the amount of fused data is large, the efficiency is low, and the real-time performance of the perceived environmental information is poor, which is not conducive to assisted autonomous driving.
- a vehicle detection method is provided, which is applied to a host vehicle equipped with a millimeter-wave radar and a camera, and includes: collecting radar data through the millimeter-wave radar and collecting images through the camera on the host vehicle; establishing a dangerous area of the host vehicle; determining, according to the radar data, a target vehicle located in the dangerous area and first detection data of the target vehicle; inputting the image into a pre-trained target detection model to obtain second detection data of the target vehicle; and
- fusing the first detection data and the second detection data to obtain the detection data of the target vehicle based on the projection from the radar coordinate system of the millimeter-wave radar to the pixel coordinate system of the image.
- the establishing of the dangerous area of the host vehicle includes:
- establishing a rectangular dangerous area in the driving direction of the host vehicle and in front of the host vehicle, where the width of the dangerous area is M vehicle widths of the host vehicle, the length of the dangerous area is N vehicle lengths of the host vehicle, and the width center line of the dangerous area coincides with the vehicle-width center line of the host vehicle.
- the determining of the target vehicle located in the dangerous area and the first detection data of the target vehicle according to the radar data includes:
- performing Kalman filtering on the radar data to obtain filtered radar data, determining the targets detected by the millimeter-wave radar based on the filtered radar data, removing false targets to obtain the vehicles detected by the millimeter-wave radar, determining from these vehicles a vehicle located in the dangerous area as the target vehicle, and
- acquiring first detection data of the target vehicle, where the first detection data includes at least one of the radar coordinates, speed, acceleration, vehicle width, and vehicle length of the target vehicle in the radar coordinate system.
- the removing false targets from the targets detected by the millimeter wave radar to obtain the vehicle detected by the millimeter wave radar includes:
- the false target is determined by the following formula:
- Δv_x = |(x(n+1) − x(n))/T_r − v_x(n)| ≤ a_x·T_r and Δv_y = |(y(n+1) − y(n))/T_r − v_y(n)| ≤ a_y·T_r
- where n is the sampling index of the millimeter-wave radar, T_r is the sampling period of the millimeter-wave radar, x(n+1) and y(n+1) are the longitudinal and lateral coordinates of the target at sampling point n+1, x(n) and y(n) are the longitudinal and lateral coordinates of the target at sampling point n, v_x(n) and v_y(n) are the longitudinal and lateral velocities of the target at sampling point n, a_x and a_y are the maximum values of the longitudinal and lateral accelerations, Δv_x is the longitudinal velocity difference, Δv_y is the lateral velocity difference, a_x·T_r is the longitudinal velocity difference threshold, and a_y·T_r is the lateral velocity difference threshold.
- when the above formula holds, the target is a real target; otherwise, the target is a false target;
- the vehicle detected by the millimeter wave radar is obtained by removing the false target from the target detected by the millimeter wave radar.
- the first detection data includes radar coordinates of the target vehicle in the radar coordinate system
- the second detection data includes second pixel coordinates of the target vehicle in the pixel coordinate system
- the fusing of the first detection data and the second detection data to obtain the detection data of the target vehicle, based on the projection from the radar coordinate system of the millimeter-wave radar to the pixel coordinate system of the image, includes:
- projecting the radar coordinates from the radar coordinate system into the pixel coordinate system to obtain the target pixel of the geometric center of the target vehicle in the image, establishing a region of interest centered on the target pixel, and correcting the radar coordinates with the second pixel coordinates of the vehicle contained in the region of interest to obtain the final coordinates of the target vehicle.
- the projecting of the radar coordinates from the radar coordinate system into the pixel coordinate system to obtain the target pixel of the geometric center of the target vehicle in the image includes projecting the radar coordinates through a conversion formula from the radar coordinate system to the pixel coordinate system to obtain a first pixel coordinate, where:
- H_c is the height of the camera installation position above the ground
- α is the pitch angle of the camera installation
- (x_r, y_r, z_r) are the radar coordinates of the target vehicle in the radar coordinate system
- W_r is the distance between the installation position of the millimeter-wave radar and the longitudinal symmetry plane of the host vehicle
- H_r is the height of the millimeter-wave radar installation position above the ground
- ΔL is the absolute value of the difference between the installation positions of the millimeter-wave radar and the camera along the driving direction of the host vehicle
- (u, v) is the first pixel coordinate of the target pixel of the target vehicle in the image
- (u_0, v_0) is the pixel coordinate of the center point of the image
- f_x and f_y are the equivalent focal lengths of the camera
- the pixel point corresponding to the first pixel coordinate is the target pixel point of the geometric center of the target vehicle in the image.
- the correcting of the radar coordinates with the second pixel coordinates of the vehicle contained in the region of interest to obtain the final coordinates of the target vehicle includes:
- correcting the radar coordinates through a correction formula to obtain the final coordinates of the target vehicle, where (x_a, y_a, z_a) are the corrected final coordinates and (u_r, v_r) are the second pixel coordinates of the target vehicle contained in the region of interest.
- a vehicle detection device is also provided, which is applied to a vehicle equipped with a millimeter wave radar and a camera, and includes:
- the radar data and image acquisition module is set to collect radar data through millimeter wave radar and images through the camera on the vehicle;
- the dangerous area establishment module is set to establish the dangerous area of the vehicle
- a first detection data determination module configured to determine a target vehicle located in the dangerous area and first detection data of the target vehicle according to the radar data
- a second detection data acquisition module configured to input the image into a pre-trained target detection model to obtain second detection data of the target vehicle
- the fusion module is configured to fuse the first detection data and the second detection data to obtain the detection data of the target vehicle based on the projection of the radar coordinate system of the millimeter wave radar to the pixel coordinate system of the image.
- a vehicle including:
- a millimeter-wave radar and a camera, where the millimeter-wave radar is configured to collect radar data and the camera is configured to collect images;
- one or more processors;
- a storage device configured to store one or more programs;
- when the one or more programs are executed by the one or more processors, the one or more processors implement the vehicle detection method according to any embodiment of the present application.
- a computer-readable storage medium which stores a computer program, and when the computer program is executed by a processor, the vehicle detection method according to any one of the embodiments of the present application is implemented.
- FIG. 1 is a flow chart of the steps of a vehicle detection method provided by Embodiment 1 of the present application;
- FIG. 2A is a flowchart of the steps of a vehicle detection method provided in the second embodiment of the present application.
- FIG. 2B is a schematic diagram of the host vehicle and the target vehicle in an embodiment of the present application.
- FIG. 2C is a schematic diagram of the training and application of the target detection model in an embodiment of the present application.
- FIG. 3 is a structural block diagram of a vehicle detection device provided in Embodiment 3 of the present application.
- FIG. 4 is a schematic diagram of a vehicle provided in Embodiment 4 of the present application.
- Figure 1 is a flow chart of the steps of a vehicle detection method provided in the first embodiment of the application.
- the embodiment of the application can be applied to a situation where the vehicle detects surrounding vehicles during driving.
- the vehicle detection device may be implemented by hardware or software and integrated in the vehicle provided in the embodiment of the present application.
- the vehicle detection method of the embodiment of the present application may include the following steps.
- S101 Collect radar data through millimeter wave radar and collect images through a camera on the own vehicle.
- the host vehicle may be a vehicle that requires assisted driving.
- the host vehicle needs to perceive the surrounding environment through sensors mounted on the vehicle body and obtain data of vehicles traveling in the surrounding environment, so as to make assisted-driving decisions based on that data.
- the vehicle can be equipped with millimeter-wave radar and camera.
- the millimeter-wave radar transmits radar waves to the front of the driving direction and receives the reflected radar waves.
- the transmitted radar waves and the received radar waves are mixed to obtain a radar signal, which serves as the radar data.
- the camera takes pictures in the direction of travel to obtain images.
- the millimeter-wave radar and the camera can collect data at the same sampling rate or at different sampling rates. When they collect data at different sampling rates, the sampling time of the sensor with the lower sampling rate is used as the time base for acquiring radar data and an image at each moment;
- that is, after the sensor with the lower sampling rate acquires one frame at a sampling time, the frame collected by the sensor with the higher sampling rate closest to that sampling time is taken as the matching sample (a sketch of this nearest-timestamp matching is given below).
- for example, the camera collects 25 frames of images per second
- and the millimeter-wave radar collects 35 frames of radar data per second,
- so the sampling rate of the camera is lower;
- each time the camera collects one frame of image, the radar frame collected by the millimeter-wave radar closest to that time is selected.
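As a minimal illustration of the nearest-timestamp matching just described (not from the patent text; the function and variable names are assumptions), the lower-rate sensor drives the time base and the closest higher-rate frame is selected:

```python
from bisect import bisect_left

def match_nearest(camera_stamps, radar_stamps):
    """For each camera timestamp, return the index of the closest radar frame."""
    matches = []
    for t in camera_stamps:
        i = bisect_left(radar_stamps, t)
        # Compare the neighbours on both sides of the insertion point.
        candidates = [j for j in (i - 1, i) if 0 <= j < len(radar_stamps)]
        matches.append(min(candidates, key=lambda j: abs(radar_stamps[j] - t)))
    return matches

# Example: camera at 25 Hz (lower rate), radar at 35 Hz; timestamps in seconds.
camera_stamps = [k / 25.0 for k in range(5)]
radar_stamps = [k / 35.0 for k in range(7)]
print(match_nearest(camera_stamps, radar_stamps))  # [0, 1, 3, 4, 6]
```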
- in this way, the radar data collected by the millimeter-wave radar and the image collected by the camera are obtained at matching times, and the target vehicle detected from the radar data and the target vehicle recognized from the image are fused to obtain the final detection data of the target vehicle.
- the dangerous area is a potentially dangerous area during driving.
- it can be an area close to the vehicle in front of the vehicle in the driving direction of the vehicle.
- the dangerous area may be a rectangular area
- the width of the rectangular area may be two vehicle-body widths,
- the length may be N vehicle-body lengths or may be determined according to the braking distance of the host vehicle,
- and the width center line of the rectangular area coincides with the longitudinal center line of the vehicle body.
- the radar data can be the radar signal obtained after the transmitted radar wave and the received radar wave are mixed. After detection processing is performed on the radar signal, the multiple targets in front of the host vehicle detected by the millimeter-wave radar can be obtained, together with the data of these targets relative to the host vehicle.
- the target detection model can be trained offline.
- the target detection model can recognize the types of multiple targets in the input image and the pixel coordinates of the geometric center of the target in the image.
- the target detection model can be based on a combination of Haar-like features and Adaboost, or it can be a neural network such as a Convolutional Neural Network (CNN), a Recurrent Neural Network (RNN), or a Deep Neural Network (DNN).
- the embodiment of the present application does not impose restrictions on the target detection model.
- the image collected by the camera can be input into the target detection model to obtain the second detection data of multiple target vehicles in the image.
- the second detection data may include the type of the target vehicle and the pixel coordinates of the geometric center of the target vehicle in the image.
- based on the relative positions of the camera and the millimeter-wave radar, the equivalent focal length of the camera, the pitch angle of the installed camera, the heights of the camera and the millimeter-wave radar above the ground, the pixel coordinates of the center point of the image, and so on, a conversion formula from the radar coordinate system of the millimeter-wave radar to the pixel coordinate system of the image can be established. Through this conversion formula, the radar coordinates of the geometric center of the target vehicle in the radar coordinate system are converted into the pixel coordinate system of the image to obtain the target pixel corresponding to the geometric center of the target vehicle in the image, and a region of interest is established based on the target pixel.
- if a vehicle is identified within the region of interest, the identified vehicle and the target vehicle detected by the millimeter-wave radar belong to the same target vehicle, so the radar coordinates can be corrected by the pixel coordinates of the geometric center of the target vehicle identified in the image to obtain the final detection data of the target vehicle, for example, the lateral distance, longitudinal distance, speed, and acceleration of the target vehicle relative to the host vehicle, as well as the
- vehicle identifier (ID), vehicle width, vehicle length, and so on. The detection data of the target vehicle is output to the driving assistance system, so that the driving assistance system makes assisted-driving decisions based on the detection data of the target vehicle.
- radar data is collected by the millimeter-wave radar and images are collected by the camera on the host vehicle, the dangerous area of the host vehicle is established, and the target vehicle located in the dangerous area and the first detection data of the target vehicle are determined according to the radar data;
- the second detection data of the target vehicle is obtained by inputting the image into the pre-trained target detection model; and, based on the projection from the radar coordinate system of the millimeter-wave radar to the pixel coordinate system of the image, the first detection data and the second detection data are fused to obtain the detection data of the target vehicle.
- the embodiment of the application fuses only the first detection data and the second detection data of the target vehicles located in the dangerous area to obtain the detection data of the target vehicles, which greatly reduces the amount of vehicle data that needs to be fused and improves the efficiency of data fusion; the finally obtained detection data of the target vehicle has high real-time performance, which facilitates timely decision-making by the automatic driving assistance system.
- FIG. 2A is a flow chart of the steps of a vehicle detection method provided in the second embodiment of the application, and the embodiment of the present application is described on the basis of the foregoing first embodiment.
- the vehicle detection method of the embodiment of the present application may include the following steps.
- S201 Collect radar data through millimeter wave radar and collect images through a camera on the own vehicle.
- a millimeter-wave radar can be installed at the air intake grid in front of the vehicle, and a camera can be installed inside the windshield. Both the millimeter-wave radar and the camera are installed on the longitudinal symmetry plane F of the vehicle. The millimeter-wave radar can also be offset with respect to the longitudinal symmetry plane F, and both the camera and the millimeter-wave radar collect data in the driving direction of the vehicle.
- the width of the dangerous area is M vehicle widths of the host vehicle, the length of the dangerous area is N vehicle lengths of the host vehicle, and the width center line of the dangerous area coincides with the vehicle-width center line of the host vehicle.
- the dangerous area is a rectangular area, and the rectangular area is located in front of the host vehicle in its driving direction.
- the dangerous area A is in front of the host vehicle in its driving direction.
- the width of the dangerous area is 2d and the length is L.
- the width center line of the dangerous area A coincides with the longitudinal symmetry plane F of the host vehicle, where d may be equal to one vehicle width of the host vehicle.
- L may be equal to N body lengths of the host vehicle; for example, N may be equal to 3.
- L may also be equal to the braking distance of the host vehicle,
- and the braking distance can be determined according to the current driving speed of the host vehicle.
- the dangerous area can also be set according to the actual situation, which is not limited in the embodiments of the application. A sketch of a simple check of whether a detected vehicle lies inside this rectangular area is given below.
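The following is a minimal sketch (an assumption for illustration, not the patent's code) of testing whether a radar-detected point lies inside the rectangular dangerous area described above, using host-vehicle coordinates with x pointing forward and y pointing left:

```python
def in_danger_zone(x_long, y_lat, d, length):
    """Return True if the point (x_long, y_lat), expressed in host-vehicle
    coordinates (x forward, y left), lies inside the rectangular dangerous
    area of half-width d and length `length` ahead of the host vehicle."""
    return 0.0 <= x_long <= length and abs(y_lat) <= d

# Example: d = one vehicle width (1.8 m), L = 3 body lengths (3 * 4.8 m).
targets = [(12.0, 0.5), (30.0, 3.0), (8.0, -1.2)]
print([t for t in targets if in_danger_zone(t[0], t[1], d=1.8, length=3 * 4.8)])
# -> [(12.0, 0.5), (8.0, -1.2)]
```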
- S203 Perform Kalman filtering on the radar data to obtain filtered radar data.
- the radar data may be a mixed radar signal of the transmitted radar wave and the received radar wave.
- the mixed radar signal is disturbed by the environment and by internal circuits and therefore contains noise; the radar signal can be filtered by Kalman filtering to remove the noise and interference and obtain the filtered radar data (a minimal Kalman filter sketch is given below).
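A minimal sketch of such a filtering step, assuming a simple constant-velocity model applied to a noisy one-dimensional range measurement (an illustration only, not the patent's filter design):

```python
import numpy as np

def kalman_cv(measurements, dt, meas_var=1.0, accel_var=0.5):
    """Filter noisy range measurements with a constant-velocity Kalman filter."""
    F = np.array([[1.0, dt], [0.0, 1.0]])            # state transition (range, range rate)
    H = np.array([[1.0, 0.0]])                       # only the range is measured
    Q = accel_var * np.array([[dt**4 / 4, dt**3 / 2],
                              [dt**3 / 2, dt**2]])   # process noise
    R = np.array([[meas_var]])                       # measurement noise
    x = np.array([[measurements[0]], [0.0]])         # initial state
    P = np.eye(2) * 10.0                             # initial covariance
    filtered = []
    for z in measurements:
        x = F @ x                                    # predict
        P = F @ P @ F.T + Q
        y = np.array([[z]]) - H @ x                  # update
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ y
        P = (np.eye(2) - K @ H) @ P
        filtered.append(float(x[0, 0]))
    return filtered

# Example: a target closing in, sampled at 35 Hz with noisy range readings.
noisy = [50.0 - 0.5 * k + np.random.normal(0.0, 0.3) for k in range(10)]
print(kalman_cv(noisy, dt=1.0 / 35.0))
```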
- S204 Determine a target detected by the millimeter wave radar based on the filtered radar data.
- the filtered radar signal can be detected and processed to obtain all the targets detected by the millimeter-wave radar.
- the targets detected by the millimeter-wave radar may include false targets caused by the disturbance of the road surface and the moving vehicle.
- the target detection algorithm of the millimeter wave radar can be used to determine the target detected by the millimeter wave radar based on the radar data. In the embodiment of the present application, the determination of the target detected by the millimeter wave radar based on the radar data will not be described in detail.
- the false target can be determined by the following formula:
- Δv_x = |(x(n+1) − x(n))/T_r − v_x(n)| ≤ a_x·T_r and Δv_y = |(y(n+1) − y(n))/T_r − v_y(n)| ≤ a_y·T_r
- where n is the sampling index of the millimeter-wave radar,
- T_r is the sampling period of the millimeter-wave radar,
- x(n+1) and y(n+1) are the longitudinal and lateral coordinates of the target at sampling point n+1, and x(n) and y(n) are the longitudinal and lateral coordinates of the target at sampling point n,
- v_x(n) and v_y(n) are the longitudinal and lateral velocities of the target at sampling point n,
- a_x and a_y are the maximum values of the longitudinal and lateral accelerations,
- Δv_x is the longitudinal velocity difference and Δv_y is the lateral velocity difference,
- a_x·T_r is the longitudinal velocity difference threshold,
- a_y·T_r is the lateral velocity difference threshold.
- the target is a real target when the above formula holds; otherwise the target is a false target, that is, when the lateral velocity difference is greater than the lateral velocity difference threshold
- or the longitudinal velocity difference is greater than the longitudinal velocity difference threshold, the target is determined to be a false target.
- after the false targets detected by the millimeter-wave radar are determined, the false targets can be removed from the targets detected by the millimeter-wave radar to obtain the vehicles detected by the millimeter-wave radar. This avoids fusing false targets with the image data and reduces the amount of fused data,
- which improves the efficiency of vehicle detection. A sketch of this false-target check is given below.
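A minimal sketch of the false-target check described above (an illustration under the stated velocity-difference criterion; the function name and the acceleration limits are assumptions):

```python
def is_real_target(x_prev, y_prev, x_next, y_next, vx, vy, t_r, a_x, a_y):
    """Return True if the motion between two consecutive radar samples is
    consistent with the measured velocity within the thresholds a_x*T_r
    and a_y*T_r, i.e. the target is treated as real."""
    dvx = abs((x_next - x_prev) / t_r - vx)  # longitudinal velocity difference
    dvy = abs((y_next - y_prev) / t_r - vy)  # lateral velocity difference
    return dvx <= a_x * t_r and dvy <= a_y * t_r

# Example with a 35 Hz radar (T_r ~ 0.0286 s) and assumed acceleration limits.
t_r = 1.0 / 35.0
print(is_real_target(20.0, 0.5, 20.0 + 10.0 * t_r, 0.5, vx=10.0, vy=0.0,
                     t_r=t_r, a_x=8.0, a_y=5.0))  # True: consistent motion
print(is_real_target(20.0, 0.5, 25.0, 0.5, vx=10.0, vy=0.0,
                     t_r=t_r, a_x=8.0, a_y=5.0))  # False: implausible jump
```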
- when the millimeter-wave radar detects a vehicle, it can also obtain the radar coordinates of the geometric center of the detected vehicle and the lateral and longitudinal distances of the geometric center of the detected vehicle relative to the host vehicle, and whether the vehicle is in the dangerous area can be determined from these.
- exemplarily, it can be determined whether the radar coordinates of the detected vehicle are located in the dangerous area, or whether the vehicle is located in the dangerous area can be determined from the lateral and longitudinal distances of the detected vehicle relative to the host vehicle. A vehicle determined to be located in the dangerous area is taken as a target vehicle.
- the detection range of millimeter-wave radar is relatively wide, and the detection distance can usually reach 100 meters.
- the dangerous area includes only target vehicle 1 and target vehicle 2, which greatly reduces the number of target vehicles, reduces the amount of fused data, and improves the efficiency of vehicle detection.
- S207 Acquire first detection data of the target vehicle, where the first detection data includes at least one of radar coordinates, speed, acceleration, vehicle width, and vehicle length of the target vehicle in the radar coordinate system.
- the millimeter wave radar can not only detect the target vehicle, but also obtain the radar coordinates of the target vehicle, the horizontal and longitudinal distance relative to the own vehicle, the speed, acceleration, vehicle width, and length of the target vehicle, etc.
- the first detection data of the above-mentioned target vehicle can be obtained by any kind of millimeter wave radar detection target algorithm, which will not be described in detail here.
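As a minimal illustration of how the first detection data described above might be grouped per radar target (the container and field names are assumptions, not taken from the patent):

```python
from dataclasses import dataclass

@dataclass
class RadarDetection:
    x_r: float           # longitudinal radar coordinate (m)
    y_r: float           # lateral radar coordinate (m)
    z_r: float           # vertical radar coordinate (m)
    speed: float         # relative speed (m/s)
    acceleration: float  # relative acceleration (m/s^2)
    width: float         # estimated vehicle width (m)
    length: float        # estimated vehicle length (m)

target = RadarDetection(x_r=25.4, y_r=-0.8, z_r=0.0, speed=-1.2,
                        acceleration=0.1, width=1.8, length=4.6)
```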
- the target detection model may be a target detection model based on a combination of Haar-like features and the Adaboost algorithm. As shown in FIG. 2C, the training and application process of a target detection model based on the combination of Haar-like features and the Adaboost algorithm is as follows.
- in the offline training phase, vehicle sample images and non-vehicle sample images are collected, normalized, and converted to grayscale; Haar-like features are extracted from the processed images using the integral image, and the extracted features are input into a cascaded classifier based on the Adaboost algorithm, which is trained iteratively to obtain the final cascaded classifier.
- in the online detection phase, a sample to be detected, such as the image collected by the camera in the embodiment of this application, is processed in real time: Haar-like features are extracted from the image using the integral image and input into the trained
- cascade classifier, which recognizes the targets in the image and obtains the target category and the pixel coordinates of the geometric center of each target.
- the recognized target may be a car, a bus, a truck, a motorcycle, a bicycle, and so on.
- the target detection model can output the types of the multiple vehicles in the image and the pixel coordinates of the geometric centers of the vehicles (a sketch of the detection step with a trained cascade is given below).
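A minimal sketch of the online detection step using OpenCV's cascade classifier (the cascade file name is hypothetical, e.g. one produced by an offline training step such as the one described above; this is an illustration, not the patent's implementation):

```python
import cv2

def detect_vehicle_centers(image_bgr, cascade_path="vehicle_cascade.xml"):
    """Return (center_u, center_v) for each detection found by the cascade."""
    cascade = cv2.CascadeClassifier(cascade_path)
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    boxes = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=3)
    # The cascade returns bounding boxes; the geometric centers are derived from them.
    return [(x + w // 2, y + h // 2) for (x, y, w, h) in boxes]

# Usage: centers = detect_vehicle_centers(cv2.imread("frame.png"))
```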
- S209 Project the radar coordinates from the radar coordinate system to the pixel coordinate system to obtain the target pixel point of the geometric center of the target vehicle in the image.
- the radar coordinates of the target vehicle can be converted to camera coordinates through the following conversion formula:
- the camera is installed on the longitudinal symmetry plane F of the vehicle.
- the camera is fixed in the windshield of the vehicle or the upper part of the front air intake grille, and the horizontal angle and yaw angle of the camera are 0°
- the pitch angle can be determined according to the front detection range of the vehicle;
- the millimeter wave radar is installed on the longitudinal symmetry plane F of the vehicle, or installed at a certain distance from the longitudinal symmetry plane F.
- the millimeter-wave radar is installed at the lower air intake grille, and the horizontal angle, yaw angle, and pitch angle of the millimeter-wave radar installation are all 0°. According to the installation positions of the camera and the millimeter-wave radar, the coordinate systems of the camera and the millimeter-wave radar can be established.
- Oc-Xc-Yc-Zc is the camera coordinate system
- Or-Xr-Yr-Zr is the radar coordinate system of the millimeter wave radar.
- Oc is the origin of the camera coordinate system, that is, the optical center of the camera
- the OcXc axis is perpendicular to the longitudinal symmetry plane F of the own vehicle and points to the right side of the direction of travel of the own vehicle
- the OcYc axis (not shown) lies in the longitudinal symmetry plane F of the host vehicle and is perpendicular to the optical axis of the camera
- the OcZc axis coincides with the optical axis of the camera.
- Or-Xr-Yr-Zr is the radar coordinate system, where Or is the geometric center of the millimeter-wave radar's working surface; the OrXr axis is perpendicular to the millimeter-wave radar's working surface and points in the driving direction of the host vehicle;
- the OrYr axis is perpendicular to the longitudinal symmetry plane F of the host vehicle and points to the left side of the forward direction of the host vehicle, and the OrZr axis (not shown) is perpendicular to the ground and points toward the roof of the host vehicle.
- H_c is the height of the camera installation position above the ground
- α is the pitch angle of the camera installation
- (x_r, y_r, z_r) are the radar coordinates of the target vehicle in the radar coordinate system
- W_r is the distance between the installation position of the millimeter-wave radar and the longitudinal symmetry plane F of the host vehicle
- H_r is the height of the millimeter-wave radar installation position above the ground
- ΔL is the absolute value of the difference between the installation positions of the millimeter-wave radar and the camera along the driving direction of the host vehicle
- (x_c, y_c, z_c) are the camera coordinates of the target vehicle in the camera coordinate system
- (u, v) is the first pixel coordinate of the target pixel of the target vehicle in the image
- (u_0, v_0) is the pixel coordinate of the center point of the image
- f_x and f_y are the equivalent focal lengths of the camera
- the radar coordinates of the center point of the target vehicle can be projected to pixel coordinates in the image through the above formula (3); the pixel point corresponding to that pixel coordinate is the target pixel corresponding to the center point of the target vehicle. A sketch of such a radar-to-pixel projection is given below.
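The patent's conversion formulas are published as images and are not reproduced here; the sketch below is only an illustrative pinhole-camera projection under simplifying assumptions about the frame conventions (radar X forward, Y left, Z up; camera X right, Y down, Z along the optical axis pitched by α) and may differ from the patent's exact formulas:

```python
import numpy as np

def radar_to_pixel(p_radar, Hc, Hr, dL, alpha, fx, fy, u0, v0, Wr=0.0):
    """Project a radar-frame point onto the image using an assumed pinhole model."""
    xr, yr, zr = p_radar
    # Express the radar point in a level, camera-aligned frame (X right, Y down,
    # Z forward), accounting for the mounting offsets; the sign of Wr depends on
    # which side of the symmetry plane the radar is offset to.
    x = -(yr + Wr)                 # radar Y (left) -> camera X (right)
    y = (Hc - Hr) - zr             # height difference, camera Y pointing down
    z = xr - dL                    # radar X (forward) -> camera Z (forward)
    # Apply the camera pitch alpha (rotation about the camera X axis).
    ca, sa = np.cos(alpha), np.sin(alpha)
    yc = ca * y - sa * z
    zc = sa * y + ca * z
    # Pinhole projection with the equivalent focal lengths fx, fy.
    return u0 + fx * x / zc, v0 + fy * yc / zc

# Example: a target 30 m ahead and 1 m to the left of the radar.
print(radar_to_pixel((30.0, 1.0, 0.0), Hc=1.3, Hr=0.5, dL=1.2,
                     alpha=np.radians(2.0), fx=1000.0, fy=1000.0, u0=640.0, v0=360.0))
```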
- S210 Establish a region of interest with the target pixel as the center.
- a rectangular area is established as the region of interest.
- S211 Correct the radar coordinates by using the second pixel coordinates of the vehicles included in the region of interest to obtain the final coordinates of the target vehicle.
- the millimeter-wave radar detects the radar coordinates of the geometric center of the target vehicle.
- the radar coordinates are projected into the pixel coordinate system to obtain a first pixel coordinate.
- the image is input into the pre-trained target detection model.
- the second pixel coordinates of the geometric center of the target vehicle are obtained, and the position of the target vehicle can be corrected through the conversion of the second pixel coordinates to the radar coordinate system.
- the radar coordinates can be corrected by a correction formula to obtain the final coordinates of the target vehicle,
- where (x_a, y_a, z_a) are the corrected final coordinates and (u_r, v_r) are the second pixel coordinates of the geometric center of the target vehicle obtained through the image-based detection. A sketch of one possible correction scheme is given below.
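The correction formula itself is published as an image. As an illustration only, one common fusion scheme keeps the radar's longitudinal range and recomputes the lateral position from the camera pixel by inverting the pinhole projection; the sketch below assumes that scheme and the simplified frames of the projection sketch above, and is not the patent's exact formula:

```python
def correct_with_pixel(x_r, z_r, u_r, fx, u0, dL):
    """Return corrected coordinates (x_a, y_a, z_a) in the radar frame, keeping
    the radar range and recomputing the lateral offset from the pixel u_r
    (small camera pitch ignored for simplicity)."""
    z_cam = x_r - dL                 # approximate depth along the optical axis
    x_cam = (u_r - u0) * z_cam / fx  # lateral offset implied by the pixel
    y_a = -x_cam                     # camera X (right) -> radar Y (left)
    return x_r, y_a, z_r             # longitudinal range and height kept from the radar

# Example: radar reports 30 m ahead; the image places the target slightly left of centre.
print(correct_with_pixel(x_r=30.0, z_r=0.0, u_r=605.0, fx=1000.0, u0=640.0, dL=1.2))
```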
- the final coordinates, together with the vehicle width, vehicle length, distance relative to the host vehicle, speed, acceleration, and so on detected by the millimeter-wave radar, can be output to the driving assistance system as the final detection data,
- so that the driving assistance system makes assisted-driving decisions based on the detection data of the target vehicle.
- radar data is collected by the millimeter-wave radar and images are collected by the camera on the host vehicle, a rectangular dangerous area is established in front of the host vehicle in its driving direction, and the radar data is filtered to obtain the targets
- detected by the millimeter-wave radar. After false targets are removed from the detected targets, the target vehicles located in the dangerous area are determined, the first detection data of the target vehicles is acquired, and the image is input into the target detection model to obtain the second detection data.
- the radar coordinates of the geometric center of the target vehicle in the first detection data are projected into the image to obtain the target pixel, and the region of interest is established based on the target pixel.
- the radar coordinates are corrected with the second pixel coordinates of the vehicle contained in the region of interest to obtain the final coordinates of the target vehicle, and detection data including the final coordinates is output.
- in the fusion of radar data and camera vision data, filtering out false targets, establishing a dangerous area, and identifying the target vehicles located in the dangerous area greatly reduce the number of vehicles that need to be fused, reduce the amount of data processing, and improve the vehicle detection effect.
- FIG. 3 is a structural block diagram of a vehicle detection device provided in the third embodiment of the present application. As shown in FIG. 3, the vehicle detection device of the embodiment of the present application is applied to a vehicle equipped with a millimeter-wave radar and a camera, and may include the following modules.
- the radar data and image acquisition module 301 is configured to collect radar data through millimeter wave radar and collect images through a camera on the vehicle;
- the dangerous area establishment module 302 is configured to establish the dangerous area of the own vehicle
- the first detection data determining module 303 is configured to determine the target vehicle located in the dangerous area and the first detection data of the target vehicle according to the radar data;
- the second detection data acquisition module 304 is configured to input the image into a pre-trained target detection model to obtain second detection data of the target vehicle;
- the fusion module 305 is configured to merge the first detection data and the second detection data to obtain the detection data of the target vehicle based on the projection of the radar coordinate system of the millimeter wave radar to the pixel coordinate system of the image.
- the dangerous area establishing module 302 includes: a dangerous area establishing sub-module.
- the dangerous area establishment sub-module is set to establish a rectangular dangerous area in the driving direction of the host vehicle and in front of the host vehicle, and the width of the dangerous area is M vehicle widths of the host vehicle, The length of the dangerous area is N vehicle lengths of the host vehicle, and the width center line of the dangerous area coincides with the vehicle width center line of the host vehicle.
- the first detection data determining module 303 includes the following modules.
- a filtering sub-module configured to perform Kalman filtering on the radar data to obtain filtered radar data
- a target determination sub-module configured to determine the target detected by the millimeter wave radar based on the filtered radar data
- a false target removal sub-module configured to remove false targets from the targets detected by the millimeter wave radar to obtain the vehicle detected by the millimeter wave radar;
- the target vehicle determination sub-module is configured to determine a vehicle located in the dangerous area from the vehicles detected by the millimeter wave radar and use the determined vehicle as a target vehicle;
- the first detection data acquisition sub-module is configured to acquire first detection data of the target vehicle, the first detection data including at least one of the radar coordinates, speed, acceleration, vehicle width, and vehicle length of the target vehicle in the radar coordinate system.
- the false target removal sub-module includes the following modules.
- the false target determination unit is set to determine the false target through the following formula:
- Δv_x = |(x(n+1) − x(n))/T_r − v_x(n)| ≤ a_x·T_r and Δv_y = |(y(n+1) − y(n))/T_r − v_y(n)| ≤ a_y·T_r
- where n is the sampling index of the millimeter-wave radar, T_r is the sampling period of the millimeter-wave radar, x(n+1) and y(n+1) are the longitudinal and lateral coordinates of the target at sampling point n+1, x(n) and y(n) are the longitudinal and lateral coordinates of the target at sampling point n, v_x(n) and v_y(n) are the longitudinal and lateral velocities of the target at sampling point n, a_x and a_y are the maximum values of the longitudinal and lateral accelerations, Δv_x is the longitudinal velocity difference, Δv_y is the lateral velocity difference, a_x·T_r is the longitudinal velocity difference threshold, and a_y·T_r is the lateral velocity difference threshold.
- when the above formula holds, the target is a real target; otherwise the target is a false target;
- the false target removing unit is configured to remove the false target from the target detected by the millimeter wave radar to obtain the vehicle detected by the millimeter wave radar.
- the first detection data includes radar coordinates of the target vehicle
- the second detection data includes second pixel coordinates of the target vehicle
- the fusion module 305 includes the following modules.
- a coordinate projection sub-module configured to project the radar coordinates from the radar coordinate system into the pixel coordinate system to obtain the target pixel with the geometric center of the target vehicle in the image;
- the region of interest establishment sub-module is set to establish a region of interest with the target pixel as the center;
- the coordinate correction sub-module is configured to correct the radar coordinates to obtain the final coordinates of the target vehicle by using the second pixel coordinates of the vehicle contained in the region of interest.
- the coordinate projection sub-module includes the following modules.
- the coordinate projection unit is set to project the radar coordinates from the radar coordinate system to the pixel coordinate system to obtain the first pixel coordinate through the following conversion formula from the radar coordinate system to the pixel coordinate system:
- H_c is the height of the camera installation position above the ground
- α is the pitch angle of the camera installation
- (x_r, y_r, z_r) are the radar coordinates of the target vehicle in the radar coordinate system
- W_r is the distance between the installation position of the millimeter-wave radar and the longitudinal symmetry plane of the host vehicle
- H_r is the height of the millimeter-wave radar installation position above the ground
- ΔL is the absolute value of the difference between the installation positions of the millimeter-wave radar and the camera along the driving direction of the host vehicle
- (u, v) is the first pixel coordinate of the target pixel of the target vehicle in the image
- (u_0, v_0) is the pixel coordinate of the center point of the image
- f_x and f_y are the equivalent focal lengths of the camera
- the target pixel point determining unit is configured to determine the pixel point corresponding to the first pixel coordinate as the target pixel point of the geometric center of the target vehicle in the image.
- the coordinate correction sub-module includes the following modules.
- the coordinate correction unit is set to correct the radar coordinates through the following formula to obtain the final coordinates of the target vehicle:
- the vehicle detection device provided in the embodiment of the present application can execute the vehicle detection method provided in the first and second embodiments of the present application, and has the functional modules and effects corresponding to the execution method.
- the vehicle may include: a processor 400, a memory 401, a display screen 402 with a touch function, an input device 403, an output device 404, a communication device 405, a camera 406, and a millimeter wave radar 407.
- the number of processors 400 in the vehicle may be one or more, and one processor 400 is taken as an example in FIG. 4.
- the number of memories 401 in the vehicle may be one or more. In FIG. 4, one memory 401 is taken as an example.
- the processor 400, the memory 401, the display screen 402, the input device 403, the output device 404, the communication device 405, the camera 406 and the millimeter wave radar 407 of the vehicle may be connected by a bus or other means.
- in FIG. 4, the bus connection is taken as an example.
- the memory 401 can be configured to store software programs, computer-executable programs, and modules, such as program instructions/modules corresponding to the vehicle detection method described in any embodiment of the present application (for example, the vehicle detection device described above)
- the storage program area can store an operating system and application programs required by at least one function; the storage data area can store data created according to the use of the device, etc.
- the memory 401 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or other non-volatile solid-state storage devices.
- the memory 401 may include a memory remotely provided with respect to the processor 400, and these remote memories may be connected to the device through a network. Examples of the aforementioned networks include, but are not limited to, the Internet, corporate intranets, local area networks, mobile communication networks, and combinations thereof.
- the display screen 402 is a display screen 402 with a touch function, and may be a capacitive screen, an electromagnetic screen or an infrared screen.
- the display screen 402 is set to display data according to instructions of the processor 400, and is also set to receive touch operations on the display screen 402 and send corresponding signals to the processor 400 or other devices.
- when the display screen 402 is an infrared screen, it also includes an infrared touch frame.
- the infrared touch frame is arranged around the display screen 402 and can be configured to receive infrared signals and send the infrared signals to the processor 400 or other devices.
- the communication device 405 is configured to establish a communication connection with other devices, and it may be a wired communication device and/or a wireless communication device.
- the input device 403 may be configured to receive input digital or character information, and generate key signal input related to user settings and function control of the device.
- the output device 404 may include audio equipment such as a speaker. The composition of the input device 403 and the output device 404 can be set according to actual conditions.
- the camera 406 is configured to collect images around the vehicle, and the millimeter wave radar 407 is configured to emit radar signals around the vehicle to collect radar data.
- the processor 400 executes various functional applications and data processing of the device by running the software programs, instructions, and modules stored in the memory 401, thereby realizing the aforementioned vehicle detection method.
- the vehicle detection method provided in the embodiment of the present application can be implemented.
- the embodiment of the present application also provides a computer-readable storage medium.
- when the instructions in the storage medium are executed by a processor of the device, the device can execute the vehicle detection method described in the above-mentioned embodiments.
Abstract
A vehicle detection method and device, a vehicle, and a storage medium. The vehicle detection method comprises: on the present vehicle, acquiring radar data by means of a millimeter-wave radar and acquiring an image by means of a camera (S101); establishing a dangerous area of the present vehicle (S102); determining, according to the radar data, a target vehicle located in the dangerous area and first detection data of the target vehicle (S103); inputting the image into a target detection model to obtain second detection data of the target vehicle (S104); and on the basis of a projection of a radar coordinate system of the millimeter-wave radar onto a pixel coordinate system of the image, fusing the first detection data and the second detection data to obtain detection data of the target vehicle (S105).
Description
This application claims the priority of the Chinese patent application No. 202010591915.X filed with the Chinese Patent Office on June 24, 2020, the entire content of which is incorporated herein by reference.
This application relates to the field of vehicle technology, for example, to a vehicle detection method, device, vehicle, and storage medium.
With the development of autonomous driving technology, autonomous driving is gradually becoming part of people's lives, and being equipped with driving assistance functions has become one of the important considerations when people choose to buy a vehicle.
In autonomous driving technology, environment perception technology is the foundation and prerequisite for the development of intelligent driving vehicles. The sensors commonly used for environment perception include millimeter-wave radar, cameras, and lidar. However, it is difficult for a single sensor to meet the all-weather, multi-information, high-precision, and high-robustness environment perception requirements of intelligent driving. Millimeter-wave radar is not susceptible to weather factors such as rain, snow, and fog, adapts well to the environment, and has clear advantages in extracting the range and speed information of a target, while camera-based visual recognition has clear advantages in target recognition and classification and can perceive rich spatial information. Therefore, millimeter-wave radars and cameras are often used together to perceive and fuse environmental information to assist autonomous driving. However, in the process of perceiving environmental information based on a millimeter-wave radar and a camera, the amount of fused data is large, the efficiency is low, and the real-time performance of the perceived environmental information is poor, which is not conducive to assisted autonomous driving.
Summary of the Invention
This application provides a vehicle detection method, device, vehicle, and storage medium to address the problem that, in the process of perceiving environmental information based on a millimeter-wave radar and a camera, the amount of fused data is large, the efficiency is low, and the real-time performance of the perceived environmental information is poor, which is not conducive to assisted autonomous driving.
A vehicle detection method is provided, which is applied to a host vehicle equipped with a millimeter-wave radar and a camera, and includes:
collecting radar data through the millimeter-wave radar and collecting images through the camera on the host vehicle;
establishing a dangerous area of the host vehicle;
determining, according to the radar data, a target vehicle located in the dangerous area and first detection data of the target vehicle;
inputting the image into a pre-trained target detection model to obtain second detection data of the target vehicle; and
fusing the first detection data and the second detection data to obtain detection data of the target vehicle based on a projection from a radar coordinate system of the millimeter-wave radar to a pixel coordinate system of the image.
Optionally, establishing the dangerous area of the host vehicle includes:
establishing a rectangular dangerous area in the driving direction of the host vehicle and in front of the host vehicle, where the width of the dangerous area is M vehicle widths of the host vehicle, the length of the dangerous area is N vehicle lengths of the host vehicle, and the width center line of the dangerous area coincides with the vehicle-width center line of the host vehicle.
Optionally, determining, according to the radar data, the target vehicle located in the dangerous area and the first detection data of the target vehicle includes:
performing Kalman filtering on the radar data to obtain filtered radar data;
determining targets detected by the millimeter-wave radar based on the filtered radar data;
removing false targets from the targets detected by the millimeter-wave radar to obtain vehicles detected by the millimeter-wave radar;
determining, from the vehicles detected by the millimeter-wave radar, a vehicle located in the dangerous area and using the determined vehicle as the target vehicle; and
acquiring first detection data of the target vehicle, where the first detection data includes at least one of radar coordinates, speed, acceleration, vehicle width, and vehicle length of the target vehicle in the radar coordinate system.
Optionally, removing false targets from the targets detected by the millimeter-wave radar to obtain the vehicles detected by the millimeter-wave radar includes:
determining a false target by the following formula:
Δv_x = |(x(n+1) − x(n))/T_r − v_x(n)| ≤ a_x·T_r and Δv_y = |(y(n+1) − y(n))/T_r − v_y(n)| ≤ a_y·T_r
where n is the sampling index of the millimeter-wave radar, T_r is the sampling period of the millimeter-wave radar, x(n+1) and y(n+1) are the longitudinal and lateral coordinates of the target at sampling point n+1, x(n) and y(n) are the longitudinal and lateral coordinates of the target at sampling point n, v_x(n) and v_y(n) are the longitudinal and lateral velocities of the target at sampling point n, a_x and a_y are the maximum values of the longitudinal and lateral accelerations, Δv_x is the longitudinal velocity difference, Δv_y is the lateral velocity difference, a_x·T_r is the longitudinal velocity difference threshold, and a_y·T_r is the lateral velocity difference threshold; when the above formula holds, the target is a real target, otherwise the target is a false target; and
removing the false targets from the targets detected by the millimeter-wave radar to obtain the vehicles detected by the millimeter-wave radar.
Optionally, the first detection data includes radar coordinates of the target vehicle in the radar coordinate system, the second detection data includes second pixel coordinates of the target vehicle in the pixel coordinate system, and fusing the first detection data and the second detection data to obtain the detection data of the target vehicle based on the projection from the radar coordinate system of the millimeter-wave radar to the pixel coordinate system of the image includes:
projecting the radar coordinates from the radar coordinate system into the pixel coordinate system to obtain a target pixel of the geometric center of the target vehicle in the image;
establishing a region of interest centered on the target pixel; and
correcting the radar coordinates with the second pixel coordinates of the vehicle contained in the region of interest to obtain final coordinates of the target vehicle.
Optionally, projecting the radar coordinates from the radar coordinate system into the pixel coordinate system to obtain the target pixel of the geometric center of the target vehicle in the image includes:
projecting the radar coordinates from the radar coordinate system to the pixel coordinate system through a conversion formula from the radar coordinate system to the pixel coordinate system to obtain a first pixel coordinate, where H_c is the height of the camera installation position above the ground, α is the pitch angle of the camera installation, (x_r, y_r, z_r) are the radar coordinates of the target vehicle in the radar coordinate system, W_r is the distance between the installation position of the millimeter-wave radar and the longitudinal symmetry plane of the host vehicle, H_r is the height of the millimeter-wave radar installation position above the ground, ΔL is the absolute value of the difference between the installation positions of the millimeter-wave radar and the camera along the driving direction of the host vehicle, (u, v) is the first pixel coordinate of the target pixel of the target vehicle in the image, (u_0, v_0) is the pixel coordinate of the center point of the image, and f_x and f_y are the equivalent focal lengths of the camera; and
determining the pixel point corresponding to the first pixel coordinate as the target pixel of the geometric center of the target vehicle in the image.
Optionally, correcting the radar coordinates with the second pixel coordinates of the vehicle contained in the region of interest to obtain the final coordinates of the target vehicle includes:
correcting the radar coordinates through a correction formula to obtain the final coordinates of the target vehicle, where (x_a, y_a, z_a) are the corrected final coordinates and (u_r, v_r) are the second pixel coordinates of the target vehicle contained in the region of interest.
A vehicle detection device is also provided, which is applied to a host vehicle equipped with a millimeter-wave radar and a camera, and includes:
a radar data and image acquisition module, configured to collect radar data through the millimeter-wave radar and collect images through the camera on the host vehicle;
a dangerous area establishment module, configured to establish a dangerous area of the host vehicle;
a first detection data determination module, configured to determine, according to the radar data, a target vehicle located in the dangerous area and first detection data of the target vehicle;
a second detection data acquisition module, configured to input the image into a pre-trained target detection model to obtain second detection data of the target vehicle; and
a fusion module, configured to fuse the first detection data and the second detection data to obtain detection data of the target vehicle based on a projection from a radar coordinate system of the millimeter-wave radar to a pixel coordinate system of the image.
A vehicle is also provided, including:
a millimeter-wave radar and a camera, where the millimeter-wave radar is configured to collect radar data and the camera is configured to collect images;
one or more processors; and
a storage device configured to store one or more programs,
where, when the one or more programs are executed by the one or more processors, the one or more processors implement the vehicle detection method according to any embodiment of the present application.
A computer-readable storage medium is also provided, which stores a computer program, and when the computer program is executed by a processor, the vehicle detection method according to any one of the embodiments of the present application is implemented.
FIG. 1 is a flowchart of the steps of a vehicle detection method provided by Embodiment 1 of the present application;
FIG. 2A is a flowchart of the steps of a vehicle detection method provided by Embodiment 2 of the present application;
FIG. 2B is a schematic diagram of the host vehicle and target vehicles in an embodiment of the present application;
FIG. 2C is a schematic diagram of training and applying the target detection model in an embodiment of the present application;
FIG. 3 is a structural block diagram of a vehicle detection device provided by Embodiment 3 of the present application;
FIG. 4 is a schematic diagram of a vehicle provided by Embodiment 4 of the present application.
The present application is described below with reference to the drawings and embodiments. The embodiments described here are only used to explain the present application, not to limit it. For ease of description, the drawings show only the parts related to the present application rather than the entire structure. In the absence of conflict, the embodiments of the present application and the features in the embodiments may be combined with each other.
Embodiment 1
FIG. 1 is a flowchart of the steps of a vehicle detection method provided by Embodiment 1 of the present application. The embodiment of the present application is applicable to a situation in which a host vehicle detects surrounding vehicles during driving. The method may be executed by the vehicle detection device of the embodiment of the present application, which may be implemented by hardware or software and integrated in the vehicle provided by the embodiment of the present application. As shown in FIG. 1, the vehicle detection method of the embodiment of the present application may include the following steps.
S101. Collect radar data through the millimeter wave radar and collect images through the camera on the host vehicle.
In the embodiments of the present application, the host vehicle may be a vehicle that requires driving assistance. The host vehicle needs to perceive the surrounding environment through sensors mounted on the vehicle body and obtain data of vehicles traveling in the surrounding environment, so as to make driving assistance decisions based on the data.
The host vehicle may be equipped with a millimeter wave radar and a camera. The millimeter wave radar transmits radar waves ahead in the driving direction and receives the reflected radar waves; the transmitted and received radar waves are mixed to obtain a radar signal as the radar data. At the same time, the camera photographs the scene ahead in the driving direction to obtain images. In practical applications, the millimeter wave radar and the camera may collect data at the same sampling rate or at different sampling rates. When they collect data at different sampling rates, the sampling time of the sensor with the lower sampling rate is used as the time reference for acquiring the radar data and the image at a given moment; that is, after the sensor with the lower sampling rate collects a frame of data at a sampling moment, the frame collected by the sensor with the higher sampling rate that is closest to that sampling moment is taken as the sampled data.
For example, the camera collects 25 frames of images per second and the millimeter wave radar collects 35 frames of radar data per second, so the sampling rate of the camera is lower. After the camera collects a frame at a given moment, the radar frame collected closest to that moment is determined, so that a matched pair of radar data and camera image is obtained. The target vehicle detected from the radar data and the target vehicle recognized in the image are then fused to obtain the final detection data of the target vehicle.
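For illustration only, a minimal sketch of this nearest-frame alignment is shown below; the buffer structure, timestamps, and function names are assumptions, not part of the embodiment.

```python
from bisect import bisect_left

def nearest_radar_frame(camera_ts, radar_frames):
    """Return the radar frame whose timestamp is closest to camera_ts.

    radar_frames: list of (timestamp, frame) tuples sorted by timestamp.
    """
    timestamps = [ts for ts, _ in radar_frames]
    i = bisect_left(timestamps, camera_ts)
    candidates = [c for c in (i - 1, i) if 0 <= c < len(radar_frames)]
    best = min(candidates, key=lambda c: abs(timestamps[c] - camera_ts))
    return radar_frames[best]

# Usage: the camera at 25 Hz triggers the pairing; the radar runs at 35 Hz.
# radar_buffer = [(t0, f0), (t1, f1), ...]
# paired_radar = nearest_radar_frame(image_timestamp, radar_buffer)
```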
S102. Establish a dangerous area of the host vehicle.
Relative to the host vehicle, the dangerous area is a potentially dangerous region during driving. For example, it may be a region ahead of the host vehicle in its driving direction and close to the host vehicle, and this region may be determined according to the body length, body width, and braking distance of the host vehicle. Optionally, the dangerous area may be a rectangular region whose width is two vehicle body widths and whose length is N vehicle body lengths or is determined according to the braking distance of the host vehicle, and the center line of the rectangular region in the length direction coincides with the center line of the vehicle body in the body length direction.
S103. Determine, according to the radar data, a target vehicle located in the dangerous area and first detection data of the target vehicle.
The radar data may be the radar signal obtained by mixing the transmitted and received radar waves. After the radar signal is detected and verified, multiple targets ahead of the host vehicle detected by the millimeter wave radar can be obtained, together with first detection data such as the radar coordinates and distance of the targets relative to the host vehicle and the driving speed, acceleration, width, and length of the targets. It can then be determined, according to the radar coordinates of the multiple targets, whether each target is located in the dangerous area; if so, the target vehicle located in the dangerous area is determined and the first detection data of the target vehicle is obtained.
S104. Input the image into a pre-trained target detection model to obtain second detection data of the target vehicle.
In the embodiments of the present application, the target detection model may be trained offline. The target detection model can recognize the types of multiple targets in the input image and the pixel coordinates of the geometric centers of the targets in the image. The target detection model may be a target detection model based on the combination of Haar-like features and Adaboost, or a neural network such as a Convolutional Neural Network (CNN), a Recurrent Neural Network (RNN), or a Deep Neural Network (DNN); the embodiment of the present application does not limit the target detection model.
After the target detection model is trained offline, the image collected by the camera can be input into the target detection model to obtain second detection data of multiple target vehicles in the image. The second detection data may include the type of the target vehicle, the pixel coordinates of the geometric center of the target vehicle in the image, and so on.
S105. Based on the projection from the radar coordinate system of the millimeter wave radar to the image coordinate system of the image, fuse the first detection data and the second detection data to obtain detection data of the target vehicle.
In an optional embodiment of the present application, a conversion formula from the radar coordinate system of the millimeter wave radar to the image coordinate system of the image can be established from the relative positions of the camera and the millimeter wave radar, the equivalent focal lengths of the camera, the pitch angle of the camera after installation, the heights of the camera and of the millimeter wave radar above the ground, the pixel coordinates of the center point of the image, and so on. Through this conversion formula, the radar coordinates of the geometric center of the target vehicle in the radar coordinate system are converted into the image coordinate system to obtain the target pixel corresponding to the geometric center of the target vehicle in the image, and a region of interest is established around this target pixel. A target recognized within the region of interest and the target vehicle detected by the millimeter wave radar belong to the same target vehicle, so the radar coordinates can be corrected by the pixel coordinates of the geometric center of the target vehicle recognized in the image to obtain the final detection data of the target vehicle, for example, the lateral distance, longitudinal distance, speed, and acceleration of the target vehicle relative to the host vehicle, the target vehicle identifier (ID), the vehicle width, the vehicle length, and so on. The detection data of the target vehicle is output to the driving assistance system, so that the driving assistance system makes driving assistance decisions based on the detection data of the target vehicle.
In the embodiment of the present application, radar data is collected by the millimeter wave radar and images are collected by the camera on the host vehicle, a dangerous area of the host vehicle is established, and a target vehicle located in the dangerous area and first detection data of the target vehicle are determined according to the radar data; the image is input into a pre-trained target detection model to obtain second detection data of the target vehicle; and, based on the projection from the radar coordinate system of the millimeter wave radar to the pixel coordinate system of the image, the first detection data and the second detection data are fused to obtain detection data of the target vehicle. The embodiment of the present application fuses only the first detection data and the second detection data of target vehicles in the dangerous area to obtain the target detection data, which greatly reduces the amount of vehicle data that needs to be fused and improves the efficiency of data fusion. The finally obtained detection data of the target vehicle has high real-time performance, which helps the automatic driving assistance system make timely decisions.
Embodiment 2
FIG. 2A is a flowchart of the steps of a vehicle detection method provided by Embodiment 2 of the present application, described on the basis of the foregoing Embodiment 1. As shown in FIG. 2A, the vehicle detection method of the embodiment of the present application may include the following steps.
S201. Collect radar data through the millimeter wave radar and collect images through the camera on the host vehicle.
As shown in FIG. 2B, the millimeter wave radar may be installed at the air intake grille at the front of the host vehicle, and the camera may be installed on the inner side of the windshield. Both the millimeter wave radar and the camera may be installed on the longitudinal symmetry plane F of the host vehicle; the millimeter wave radar may also be offset relative to the longitudinal symmetry plane F. Both the camera and the millimeter wave radar collect data toward the driving direction of the host vehicle.
S202. Establish a rectangular dangerous area in the driving direction of the host vehicle, in front of the host vehicle, where the width of the dangerous area is M vehicle widths of the host vehicle, the length of the dangerous area is N vehicle lengths of the host vehicle, and the width center line of the dangerous area coincides with the vehicle width center line of the host vehicle.
As shown in FIG. 2B, in an optional embodiment of the present application, the dangerous area is a rectangular region located in front of the host vehicle in its driving direction. The dangerous area A is in front of the host vehicle in the driving direction, its width is 2d, and its length is L, where the width center line of the dangerous area A coincides with the longitudinal symmetry plane F of the host vehicle, d is the vehicle width of the host vehicle, and L is equal to N body lengths of the host vehicle, for example, N may be equal to 3. L may also be equal to the braking distance of the host vehicle, which can be determined according to the current driving speed of the host vehicle. In practical applications, the dangerous area may also be set according to the actual situation, which is not limited in the embodiment of the present application.
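For illustration only, the sketch below checks whether a radar-detected point lies inside such a rectangular zone. It assumes the radar axis convention used later in this embodiment (OrXr pointing forward, OrYr pointing to the left); the class and parameter names are not part of the embodiment.

```python
from dataclasses import dataclass

@dataclass
class DangerZone:
    half_width: float   # d: one vehicle width, so the zone is 2d wide
    length: float       # L: N body lengths or the current braking distance

    def contains(self, x_forward: float, y_left: float) -> bool:
        """True if a point (radar frame: x forward, y to the left) lies inside the zone."""
        return 0.0 <= x_forward <= self.length and abs(y_left) <= self.half_width

# Example: vehicle width 1.9 m, body length 4.8 m, N = 3
zone = DangerZone(half_width=1.9, length=3 * 4.8)
targets_in_zone = [t for t in [(6.0, 0.4), (20.0, 3.5)] if zone.contains(*t)]
```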
S203. Perform Kalman filtering on the radar data to obtain filtered radar data.
In the embodiment of the present application, the radar data may be the mixed radar signal of the transmitted radar waves and the received radar waves. The mixed radar signal is interfered with by the environment and internal circuits and therefore contains noise. The radar signal can be filtered by Kalman filtering to remove the noise and interference in the radar signal and obtain filtered radar data.
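The embodiment does not specify the state model of the Kalman filter; purely as a sketch under the assumption of a constant-velocity model per tracked target (state [x, y, vx, vy]), one filtering step could look as follows.

```python
import numpy as np

def kalman_step(x, P, z, dt, q=0.5, r=1.0):
    """One predict/update cycle of a constant-velocity Kalman filter.

    x: state estimate [x, y, vx, vy]; P: state covariance;
    z: measured position [x, y]; dt: radar sampling period T_r.
    q, r: assumed process/measurement noise levels (tuning parameters).
    """
    F = np.array([[1, 0, dt, 0],
                  [0, 1, 0, dt],
                  [0, 0, 1, 0],
                  [0, 0, 0, 1]], dtype=float)
    H = np.array([[1, 0, 0, 0],
                  [0, 1, 0, 0]], dtype=float)
    Q = q * np.eye(4)
    R = r * np.eye(2)

    # Predict
    x = F @ x
    P = F @ P @ F.T + Q
    # Update with the radar position measurement
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (np.asarray(z, dtype=float) - H @ x)
    P = (np.eye(4) - K @ H) @ P
    return x, P
```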
S204. Determine the targets detected by the millimeter wave radar based on the filtered radar data.
The filtered radar signal can be subjected to detection and verification processing to obtain all the targets detected by the millimeter wave radar. The targets detected by the millimeter wave radar may include false targets caused by the road surface or by moving vehicles disturbing the radar waves. In practical applications, the targets detected by the millimeter wave radar can be determined from the radar data by any millimeter wave radar target detection algorithm; determining these targets from the radar data is not described in detail in the embodiment of the present application.
S205. Remove false targets from the targets detected by the millimeter wave radar to obtain the vehicles detected by the millimeter wave radar.
In an optional embodiment of the present application, a false target can be identified by the following formula:
In the above formula, n is the sampling index of the millimeter wave radar, T_r is the sampling period of the millimeter wave radar, x(n+1) and y(n+1) are the longitudinal coordinate (along OrXr) and lateral coordinate (along OrYr) of the target at sampling point n+1, x(n) and y(n) are the longitudinal and lateral coordinates of the target at sampling point n, v_x(n) and v_y(n) are the longitudinal and lateral velocities of the target at sampling point n, and a_x and a_y are the maximum longitudinal and lateral accelerations. The two quantities compared in the formula are the longitudinal velocity difference and the lateral velocity difference, a_x·T_r is the longitudinal velocity difference threshold, and a_y·T_r is the lateral velocity difference threshold. When the formula holds, the target is a real target; otherwise the target is a false target, that is, when the lateral velocity difference exceeds the lateral velocity difference threshold or the longitudinal velocity difference exceeds the longitudinal velocity difference threshold, the target is a false target.
After the false targets detected by the millimeter wave radar are determined, the false targets can be removed from the targets detected by the millimeter wave radar to obtain the vehicles detected by the millimeter wave radar, which avoids fusing image data for false targets, reduces the amount of data to be fused, and can improve the efficiency of vehicle detection.
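The formula itself is not reproduced in this text. The sketch below implements the check under the assumption that the velocity differences are the position increments over one sampling period minus the measured velocities; the function name and the exact form of the comparison are assumptions, not taken from the embodiment.

```python
def is_real_target(x, y, vx, vy, n, T_r, a_x, a_y):
    """Plausibility check for one radar target, comparing the change in
    position over one sampling period against the measured velocity.

    x, y, vx, vy: sequences indexed by sampling point n.
    a_x, a_y: maximum longitudinal/lateral accelerations.
    """
    dvx = abs((x[n + 1] - x[n]) / T_r - vx[n])  # longitudinal velocity difference (assumed form)
    dvy = abs((y[n + 1] - y[n]) / T_r - vy[n])  # lateral velocity difference (assumed form)
    return dvx <= a_x * T_r and dvy <= a_y * T_r  # exceeding either threshold marks a false target

# Targets for which is_real_target(...) returns False are removed before fusion.
```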
S206. Determine, from the vehicles detected by the millimeter wave radar, the vehicles located in the dangerous area, and take the determined vehicles as target vehicles.
When the millimeter wave radar detects a vehicle, it can also obtain the radar coordinates of the geometric center of the detected vehicle and the lateral and longitudinal distances of that geometric center relative to the host vehicle, and whether the vehicle is located in the dangerous area can be determined from the radar coordinates and the lateral and longitudinal distances. For example, it can be determined whether the radar coordinates of the detected vehicle fall within the dangerous area, or whether the vehicle is in the dangerous area can be determined from its lateral and longitudinal distances relative to the host vehicle; if so, the vehicle is determined to be a vehicle located in the dangerous area and is taken as a target vehicle. As shown in FIG. 2B, the detection range of the millimeter wave radar is wide and the detection distance can usually reach 100 meters, so on roads with dense traffic the millimeter wave radar can detect a relatively large number of vehicles. The target vehicles located in the dangerous area are determined from all the detected vehicles; as shown in FIG. 2B, the dangerous area contains target vehicle 1 and target vehicle 2. This greatly reduces the number of target vehicles, reduces the amount of data to be fused, and improves the efficiency of vehicle detection.
S207. Acquire the first detection data of the target vehicle, where the first detection data includes at least one of the radar coordinates of the target vehicle in the radar coordinate system, speed, acceleration, vehicle width, and vehicle length.
In the embodiment of the present application, the millimeter wave radar can not only detect the target vehicle but also obtain the radar coordinates of the target vehicle, its lateral and longitudinal distances relative to the host vehicle, and the speed, acceleration, vehicle width, and vehicle length of the target vehicle, and these data are taken as the first detection data. The first detection data of the target vehicle can be obtained by any millimeter wave radar target detection algorithm and is not described in detail here.
S208. Input the image into a pre-trained target detection model to obtain the second detection data of the target vehicle, where the second detection data includes the second pixel coordinates of the geometric center of the target vehicle.
In an optional embodiment of the present application, the target detection model may be a target detection model based on the combination of Haar-like features and the Adaboost algorithm. As shown in FIG. 2C, the training and application process of this model is as follows.
In the offline training phase, vehicle sample images and non-vehicle sample images are collected, the sample images are normalized and converted to grayscale, features are extracted from the processed images using integral-image-based Haar-like features, and the extracted features are input into a cascade classifier built on the Adaboost algorithm, which is trained iteratively to obtain the final cascade classifier.
In the online recognition phase, the camera detects the sample to be recognized in real time, for example the image collected by the camera in the embodiment of the present application. Integral-image-based Haar-like features are extracted from the image collected by the camera, and the extracted features are input into the trained cascade classifier to recognize the targets in the image and obtain the target category and the pixel coordinates of the geometric center of each target. In the embodiment of the present application, the recognized targets are cars, buses, trucks, motorcycles, bicycles, and so on.
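As a rough illustration of this online stage, the sketch below applies a trained Haar cascade with OpenCV and derives the geometric center of each detection; the cascade file path and parameter values are assumptions, and the embodiment's own classifier is not reproduced here.

```python
import cv2

def detect_vehicle_centers(image_bgr, cascade_path="vehicle_cascade.xml"):
    """Run a trained Haar cascade on one frame and return (u, v) centers."""
    classifier = cv2.CascadeClassifier(cascade_path)
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    boxes = classifier.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=3)
    # Each detection is (x, y, w, h) in pixels; its center approximates the
    # second pixel coordinates (u_r, v_r) of a vehicle's geometric center.
    return [(x + w / 2.0, y + h / 2.0) for (x, y, w, h) in boxes]
```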
In the embodiment of the present application, after the image collected by the camera is input into the trained target detection model, the target detection model can output the categories of the multiple vehicles in the image and the pixel coordinates of the geometric centers of the vehicles.
S209. Project the radar coordinates from the radar coordinate system into the pixel coordinate system to obtain the target pixel of the geometric center of the target vehicle in the image.
The radar coordinates of the target vehicle can first be converted into camera coordinates through the following conversion formula:
As shown in FIG. 2B, the camera is installed on the longitudinal symmetry plane F of the vehicle, for example fixed inside the windshield of the host vehicle or above the front air intake grille, with a horizontal angle and yaw angle of 0°; the pitch angle can be determined according to the forward detection range of the host vehicle. The millimeter wave radar is installed on the longitudinal symmetry plane F of the host vehicle or offset from the longitudinal symmetry plane F by a certain distance; for example, the millimeter wave radar is installed at the lower air intake grille, and the horizontal angle, yaw angle, and pitch angle of the millimeter wave radar installation are all 0°. The coordinate systems of the camera and the millimeter wave radar can be established according to their installation positions.
As shown in FIG. 2B, Oc-Xc-Yc-Zc is the camera coordinate system and Or-Xr-Yr-Zr is the radar coordinate system of the millimeter wave radar. Oc is the origin of the camera coordinate system, that is, the optical center of the camera; the OcXc axis is perpendicular to the longitudinal symmetry plane F of the host vehicle and points to the right of the driving direction of the host vehicle; the OcYc axis (not shown) lies in the longitudinal symmetry plane F of the host vehicle and is perpendicular to the optical axis of the camera; and the OcZc axis coincides with the optical axis of the camera. In the radar coordinate system Or-Xr-Yr-Zr, Or is the geometric center of the working surface of the millimeter wave radar, the OrXr axis is perpendicular to the working surface of the millimeter wave radar and points in the driving direction of the host vehicle, the OrYr axis is perpendicular to the longitudinal symmetry plane F of the host vehicle and points to the left of the forward direction of the host vehicle, and the OrZr axis (not shown) is perpendicular to the ground and points toward the roof of the host vehicle.
In the above conversion formula from radar coordinates to camera coordinates, H_c is the height of the camera installation position above the ground, α is the pitch angle of the camera installation, (x_r, y_r, z_r) are the radar coordinates of the target vehicle in the radar coordinate system, W_r is the distance between the installation position of the millimeter wave radar and the longitudinal symmetry plane F of the host vehicle, H_r is the height of the millimeter wave radar installation position above the ground, ΔL is the absolute value of the difference between the installation positions of the millimeter wave radar and the camera along the driving direction of the host vehicle, and (x_c, y_c, z_c) are the camera coordinates of the target vehicle in the camera coordinate system.
The camera coordinates of the target vehicle are then converted into the first pixel coordinates in the pixel coordinate system as follows:
In the above formula (2), (u, v) are the first pixel coordinates of the target pixel of the target vehicle in the image, (u_0, v_0) are the pixel coordinates of the center point of the image, and f_x and f_y are the equivalent focal lengths of the camera.
Combining the above formulas (1) and (2), the conversion formula from the radar coordinate system to the pixel coordinate system is obtained as follows:
After the radar coordinates of the geometric center of the target vehicle are obtained, the radar coordinates of the center point of the target vehicle can be projected onto the pixel coordinates of the image through the above formula (3), and the pixel corresponding to these pixel coordinates is the target pixel corresponding to the center point of the target vehicle.
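Formulas (1) and (3) are likewise not reproduced in this text. Purely as an illustration of the chain radar → camera → pixel, the sketch below composes a rigid transform (mounting offsets W_r, H_r − H_c, ΔL and pitch α) with the pinhole projection above; the sign conventions and the exact form of the rigid transform are assumptions and may differ from the embodiment's formula (1).

```python
import math

def radar_to_pixel(x_r, y_r, z_r, W_r, H_r, H_c, dL, alpha, f_x, f_y, u_0, v_0):
    """Project a radar-frame point onto the image (illustrative conventions only).

    Radar frame: x forward, y left, z up. Camera frame: x right, y down,
    z along the optical axis, pitched by alpha about the camera x axis.
    """
    # Assumed rigid transform from the radar frame to an unpitched camera frame
    x_cam = -(y_r + W_r)                 # right = -left, plus the lateral mounting offset
    y_cam = (H_c - H_r) - z_r            # down, relative sensor heights
    z_cam = x_r + dL                     # forward, longitudinal mounting offset
    # Apply the camera pitch about its x axis
    y_c = y_cam * math.cos(alpha) - z_cam * math.sin(alpha)
    z_c = y_cam * math.sin(alpha) + z_cam * math.cos(alpha)
    x_c = x_cam
    # Pinhole projection (formula (2) above)
    u = f_x * x_c / z_c + u_0
    v = f_y * y_c / z_c + v_0
    return u, v
```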
S210. Establish a region of interest centered on the target pixel.
A rectangular region is established as the region of interest, with the target pixel of the target vehicle in the image as its center, the vehicle width of the target vehicle as the rectangle width, and the vehicle length of the target vehicle as the rectangle length.
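For illustration, the sketch below builds such a region of interest and checks whether a camera detection falls inside it. Converting the metric vehicle size to a pixel size via the equivalent focal length and the target depth is an assumption, since the embodiment does not spell out that scaling.

```python
def build_roi(u, v, vehicle_width, vehicle_length, f_x, f_y, z_c):
    """Rectangle (u_min, v_min, u_max, v_max) around the projected target pixel.

    The metric width/length are converted to pixels using the pinhole scale
    f / z_c (assumption); the target pixel (u, v) is the rectangle center.
    """
    half_w = 0.5 * vehicle_width * f_x / z_c
    half_h = 0.5 * vehicle_length * f_y / z_c
    return (u - half_w, v - half_h, u + half_w, v + half_h)

def in_roi(roi, u_r, v_r):
    """True if the detector's center (u_r, v_r) lies inside the region of interest."""
    u_min, v_min, u_max, v_max = roi
    return u_min <= u_r <= u_max and v_min <= v_r <= v_max
```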
S211. Correct the radar coordinates by using the second pixel coordinates of the vehicle contained in the region of interest to obtain the final coordinates of the target vehicle.
In practical applications, the millimeter wave radar detects the radar coordinates of the geometric center of the target vehicle, and these radar coordinates are projected into the pixel coordinate system to obtain a first pixel coordinate. At the same time, inputting the image into the pre-trained target detection model also yields the second pixel coordinates of the geometric center of the target vehicle, and the position of the target vehicle can be corrected through the conversion of the second pixel coordinates back to the radar coordinate system.
The radar coordinates can be corrected by the following formula to obtain the final coordinates of the target vehicle:
Here, (x_a, y_a, z_a) are the corrected final coordinates and (u_r, v_r) are the second pixel coordinates of the geometric center of the target vehicle obtained by the target detection model.
After the final coordinates of the target vehicle are obtained, the final coordinates, together with the vehicle width and vehicle length detected by the millimeter wave radar and the distance, speed, and acceleration relative to the host vehicle, can be output to the driving assistance system as the final detection data, so that the driving assistance system makes driving assistance decisions based on the detection data of the target vehicle.
In the embodiment of the present application, radar data is collected by the millimeter wave radar and images are collected by the camera on the host vehicle, a rectangular dangerous area is established in front of the host vehicle in its driving direction, the radar data is filtered to obtain the targets detected by the millimeter wave radar, false targets are filtered out of the detected targets, and the target vehicles located in the dangerous area are determined. The first detection data of the target vehicle is acquired and the image is input into the target detection model to obtain the second detection data. The radar coordinates of the geometric center of the target vehicle in the first detection data are projected into the image to obtain the target pixel, a region of interest is established based on the target pixel, the radar coordinates are corrected by the second pixel coordinates of the geometric center of the target vehicle in the region of interest to obtain the final coordinates of the target vehicle, and the detection data including the final coordinates is output. Performing the fusion of radar data and camera vision data after filtering out false targets, establishing a dangerous area, and determining the target vehicles located in the dangerous area greatly reduces the number of vehicles that need to be fused, reduces the amount of data processing, and improves the vehicle detection effect.
Embodiment 3
FIG. 3 is a structural block diagram of a vehicle detection device provided by Embodiment 3 of the present application. As shown in FIG. 3, the vehicle detection device of the embodiment of the present application is applied to a host vehicle equipped with a millimeter wave radar and a camera and may include the following modules.
A radar data and image acquisition module 301, configured to collect radar data through the millimeter wave radar and collect images through the camera on the host vehicle;
a dangerous area establishment module 302, configured to establish a dangerous area of the host vehicle;
a first detection data determination module 303, configured to determine, according to the radar data, a target vehicle located in the dangerous area and first detection data of the target vehicle;
a second detection data acquisition module 304, configured to input the image into a pre-trained target detection model to obtain second detection data of the target vehicle; and
a fusion module 305, configured to fuse the first detection data and the second detection data to obtain detection data of the target vehicle based on the projection from the radar coordinate system of the millimeter wave radar to the pixel coordinate system of the image.
Optionally, the dangerous area establishment module 302 includes a dangerous area establishment submodule.
The dangerous area establishment submodule is configured to establish a rectangular dangerous area in the driving direction of the host vehicle, in front of the host vehicle, where the width of the dangerous area is M vehicle widths of the host vehicle, the length of the dangerous area is N vehicle lengths of the host vehicle, and the width center line of the dangerous area coincides with the vehicle width center line of the host vehicle.
Optionally, the first detection data determination module 303 includes the following submodules.
A filtering submodule, configured to perform Kalman filtering on the radar data to obtain filtered radar data;
a target determination submodule, configured to determine the targets detected by the millimeter wave radar based on the filtered radar data;
a false target removal submodule, configured to remove false targets from the targets detected by the millimeter wave radar to obtain the vehicles detected by the millimeter wave radar;
a target vehicle determination submodule, configured to determine, from the vehicles detected by the millimeter wave radar, the vehicles located in the dangerous area and take the determined vehicles as target vehicles; and
a first detection data acquisition submodule, configured to acquire the first detection data of the target vehicle, where the first detection data includes at least one of the radar coordinates of the target vehicle in the radar coordinate system, speed, acceleration, vehicle width, and vehicle length.
Optionally, the false target removal submodule includes the following units.
A false target determination unit, configured to determine a false target by the following formula:
where n is the sampling index of the millimeter wave radar, T_r is the sampling period of the millimeter wave radar, x(n+1) and y(n+1) are the longitudinal and lateral coordinates of the target at sampling point n+1, x(n) and y(n) are the longitudinal and lateral coordinates of the target at sampling point n, v_x(n) and v_y(n) are the longitudinal and lateral velocities of the target at sampling point n, a_x and a_y are the maximum longitudinal and lateral accelerations, a_x·T_r is the longitudinal velocity difference threshold, and a_y·T_r is the lateral velocity difference threshold; when the formula holds, the target is a real target, otherwise the target is a false target; and
a false target removal unit, configured to remove the false targets from the targets detected by the millimeter wave radar to obtain the vehicles detected by the millimeter wave radar.
Optionally, the first detection data includes the radar coordinates of the target vehicle, the second detection data includes the second pixel coordinates of the target vehicle, and the fusion module 305 includes the following submodules.
A coordinate projection submodule, configured to project the radar coordinates from the radar coordinate system into the pixel coordinate system to obtain the target pixel of the geometric center of the target vehicle in the image;
a region of interest establishment submodule, configured to establish a region of interest centered on the target pixel; and
a coordinate correction submodule, configured to correct the radar coordinates by using the second pixel coordinates of the vehicle contained in the region of interest to obtain the final coordinates of the target vehicle.
Optionally, the coordinate projection submodule includes the following units.
A coordinate projection unit, configured to project the radar coordinates from the radar coordinate system to the pixel coordinate system through the following conversion formula from the radar coordinate system to the pixel coordinate system to obtain the first pixel coordinates:
where H_c is the height of the camera installation position above the ground, α is the pitch angle of the camera installation, (x_r, y_r, z_r) are the radar coordinates of the target vehicle in the radar coordinate system, W_r is the distance between the installation position of the millimeter wave radar and the longitudinal symmetry plane of the host vehicle, H_r is the height of the millimeter wave radar installation position above the ground, ΔL is the absolute value of the difference between the installation positions of the millimeter wave radar and the camera along the driving direction of the host vehicle, (u, v) are the first pixel coordinates of the target pixel of the target vehicle in the image, (u_0, v_0) are the pixel coordinates of the center point of the image, and f_x and f_y are the equivalent focal lengths of the camera; and
a target pixel determination unit, configured to determine that the pixel corresponding to the first pixel coordinates is the target pixel of the geometric center of the target vehicle in the image.
Optionally, the coordinate correction submodule includes the following unit.
A coordinate correction unit, configured to correct the radar coordinates by the following formula to obtain the final coordinates of the target vehicle:
where (x_a, y_a, z_a) are the corrected final coordinates and (u_r, v_r) are the second pixel coordinates of the target vehicle contained in the region of interest.
The vehicle detection device provided by the embodiment of the present application can execute the vehicle detection method provided by Embodiment 1 and Embodiment 2 of the present application and has the functional modules and effects corresponding to the executed method.
Embodiment 4
Referring to FIG. 4, a schematic structural diagram of a vehicle in an example of the present application is shown. As shown in FIG. 4, the vehicle may include a processor 400, a memory 401, a display screen 402 with a touch function, an input device 403, an output device 404, a communication device 405, a camera 406, and a millimeter wave radar 407. The number of processors 400 in the vehicle may be one or more, and one processor 400 is taken as an example in FIG. 4. The number of memories 401 in the vehicle may be one or more, and one memory 401 is taken as an example in FIG. 4. The processor 400, the memory 401, the display screen 402, the input device 403, the output device 404, the communication device 405, the camera 406, and the millimeter wave radar 407 of the vehicle may be connected by a bus or in other ways; connection by a bus is taken as an example in FIG. 4.
As a computer-readable storage medium, the memory 401 may be configured to store software programs, computer-executable programs, and modules, such as the program instructions/modules corresponding to the vehicle detection method described in any embodiment of the present application (for example, the radar data and image acquisition module 301, the dangerous area establishment module 302, the first detection data determination module 303, the second detection data acquisition module 304, and the fusion module 305 in the vehicle detection device described above). The memory 401 may include a program storage area and a data storage area, where the program storage area may store an operating system and the application programs required by at least one function, and the data storage area may store data created according to the use of the device. In addition, the memory 401 may include a high-speed random access memory and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device. In some examples, the memory 401 may include memories arranged remotely relative to the processor 400, and these remote memories may be connected to the device through a network. Examples of such networks include, but are not limited to, the Internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The display screen 402 is a display screen with a touch function and may be a capacitive screen, an electromagnetic screen, or an infrared screen. Generally, the display screen 402 is configured to display data according to instructions of the processor 400, and is also configured to receive touch operations acting on the display screen 402 and send corresponding signals to the processor 400 or other devices. Optionally, when the display screen 402 is an infrared screen, it further includes an infrared touch frame arranged around the display screen 402, which may also be configured to receive infrared signals and send the infrared signals to the processor 400 or other devices.
The communication device 405 is configured to establish a communication connection with other devices and may be a wired communication device and/or a wireless communication device.
The input device 403 may be configured to receive input digital or character information and to generate key signal input related to user settings and function control of the device. The output device 404 may include audio equipment such as a speaker. The composition of the input device 403 and the output device 404 can be set according to actual conditions.
The camera 406 is configured to collect images around the vehicle, and the millimeter wave radar 407 is configured to emit radar signals around the vehicle to collect radar data.
The processor 400 executes the various functional applications and data processing of the device by running the software programs, instructions, and modules stored in the memory 401, thereby implementing the vehicle detection method described above.
In an embodiment, when the processor 400 executes one or more programs stored in the memory 401, the vehicle detection method provided by the embodiments of the present application can be implemented.
An embodiment of the present application further provides a computer-readable storage medium. When the instructions in the storage medium are executed by a processor of a device, the device is enabled to execute the vehicle detection method described in the above embodiments.
As for the embodiments of the device, the vehicle, and the storage medium, since they are basically similar to the method embodiments, the description is relatively brief, and reference may be made to the relevant parts of the description of the method embodiments.
In the description of this specification, a description with reference to the terms "one embodiment", "some embodiments", "example", "specific example", or "some examples" means that the feature, structure, material, or characteristic described in conjunction with the embodiment or example is included in at least one embodiment or example of the present application. In this specification, schematic representations of the above terms do not necessarily refer to the same embodiment or example. Moreover, the described features, structures, materials, or characteristics may be combined in any suitable manner in any one or more embodiments or examples.
Claims (10)
- A vehicle detection method, applied to a host vehicle provided with a millimeter wave radar and a camera, comprising: collecting radar data through the millimeter wave radar and collecting images through the camera on the host vehicle; establishing a dangerous area of the host vehicle; determining, according to the radar data, a target vehicle located in the dangerous area and first detection data of the target vehicle; inputting the image into a pre-trained target detection model to obtain second detection data of the target vehicle; and, based on a projection from a radar coordinate system of the millimeter wave radar to a pixel coordinate system of the image, fusing the first detection data and the second detection data to obtain detection data of the target vehicle.
- The method according to claim 1, wherein establishing the dangerous area of the host vehicle comprises: establishing a rectangular dangerous area in a driving direction of the host vehicle and in front of the host vehicle, wherein a width of the dangerous area is M vehicle widths of the host vehicle, a length of the dangerous area is N vehicle lengths of the host vehicle, and a width center line of the dangerous area coincides with a vehicle width center line of the host vehicle.
- The method according to claim 1 or 2, wherein determining, according to the radar data, the target vehicle located in the dangerous area and the first detection data of the target vehicle comprises: performing Kalman filtering on the radar data to obtain filtered radar data; determining targets detected by the millimeter wave radar based on the filtered radar data; removing false targets from the targets detected by the millimeter wave radar to obtain vehicles detected by the millimeter wave radar; determining, from the vehicles detected by the millimeter wave radar, a vehicle located in the dangerous area and taking the determined vehicle as the target vehicle; and acquiring the first detection data of the target vehicle, wherein the first detection data comprises at least one of the following: radar coordinates of the target vehicle in the radar coordinate system, speed, acceleration, vehicle width, and vehicle length.
- The method according to claim 3, wherein removing the false targets from the targets detected by the millimeter wave radar to obtain the vehicles detected by the millimeter wave radar comprises: determining the false target by the following formula: wherein n is a sampling index of the millimeter wave radar, T_r is a sampling period of the millimeter wave radar, x(n+1) and y(n+1) are a longitudinal coordinate and a lateral coordinate of a target at sampling point n+1, x(n) and y(n) are the longitudinal and lateral coordinates of the target at sampling point n, v_x(n) and v_y(n) are a longitudinal velocity and a lateral velocity of the target at sampling point n, a_x and a_y are maximum values of longitudinal acceleration and lateral acceleration, a_x·T_r is a longitudinal velocity difference threshold, and a_y·T_r is a lateral velocity difference threshold; when the formula holds, the target is a real target, and when either part of the formula does not hold, the target is the false target; and removing the false target from the targets detected by the millimeter wave radar to obtain the vehicles detected by the millimeter wave radar.
- The method according to claim 1, wherein the first detection data comprises radar coordinates of the target vehicle in the radar coordinate system, the second detection data comprises second pixel coordinates of the target vehicle in the pixel coordinate system, and fusing the first detection data and the second detection data based on the projection from the radar coordinate system of the millimeter wave radar to the pixel coordinate system of the image to obtain the detection data of the target vehicle comprises: projecting the radar coordinates from the radar coordinate system into the pixel coordinate system to obtain a target pixel of a geometric center of the target vehicle in the image; establishing a region of interest centered on the target pixel; and correcting the radar coordinates by using the second pixel coordinates of the target vehicle contained in the region of interest to obtain final coordinates of the target vehicle.
- The method according to claim 5, wherein projecting the radar coordinates from the radar coordinate system into the pixel coordinate system to obtain the target pixel of the geometric center of the target vehicle in the image comprises: projecting the radar coordinates from the radar coordinate system to the pixel coordinate system through the following conversion formula from the radar coordinate system to the pixel coordinate system to obtain first pixel coordinates: wherein H_c is a height of an installation position of the camera above the ground, α is a pitch angle of the camera installation, (x_r, y_r, z_r) are the radar coordinates of the target vehicle in the radar coordinate system, W_r is a distance between an installation position of the millimeter wave radar and a longitudinal symmetry plane of the host vehicle, H_r is a height of the installation position of the millimeter wave radar above the ground, ΔL is an absolute value of a difference between installation positions of the millimeter wave radar and the camera along a driving direction of the host vehicle, (u, v) are the first pixel coordinates of the target pixel of the target vehicle in the image, (u_0, v_0) are pixel coordinates of a center point of the image, and f_x and f_y are equivalent focal lengths of the camera; and determining that a pixel corresponding to the first pixel coordinates is the target pixel of the geometric center of the target vehicle in the image.
- The method according to claim 6, wherein correcting the radar coordinates by using the second pixel coordinates of the vehicle contained in the region of interest to obtain the final coordinates of the target vehicle comprises: correcting the radar coordinates by the following formula to obtain the final coordinates of the target vehicle: wherein (x_a, y_a, z_a) are the corrected final coordinates, and (u_r, v_r) are the second pixel coordinates of the target vehicle contained in the region of interest.
- A vehicle detection device, applied to a host vehicle provided with a millimeter wave radar and a camera, comprising: a radar data and image acquisition module, configured to collect radar data through the millimeter wave radar and collect images through the camera on the host vehicle; a dangerous area establishment module, configured to establish a dangerous area of the host vehicle; a first detection data determination module, configured to determine, according to the radar data, a target vehicle located in the dangerous area and first detection data of the target vehicle; a second detection data acquisition module, configured to input the image into a pre-trained target detection model to obtain second detection data of the target vehicle; and a fusion module, configured to fuse the first detection data and the second detection data to obtain detection data of the target vehicle based on a projection from a radar coordinate system of the millimeter wave radar to a pixel coordinate system of the image.
- A vehicle, comprising: a millimeter wave radar configured to collect radar data; a camera configured to collect images; one or more processors; and a storage device configured to store one or more programs, wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the vehicle detection method according to any one of claims 1-7.
- A computer-readable storage medium storing a computer program, wherein the computer program, when executed by a processor, implements the vehicle detection method according to any one of claims 1-7.
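The conversion formula referenced in claim 6 is published as an image and is not reproduced in the text above. The sketch below is a minimal, hypothetical Python illustration of a standard radar-to-pixel projection that uses the same installation parameters (H_c, H_r, W_r, ΔL, α, f_x, f_y, u_0, v_0); it assumes a pinhole camera pitched down by α, a radar frame with x forward, y to the left and z up, the camera mounted on the host vehicle's longitudinal symmetry plane, and the radar mounted W_r to the right of that plane. The patent's actual formula may differ in its sign conventions and approximations.

```python
import math

def radar_to_pixel(x_r, y_r, z_r,
                   H_c, H_r, W_r, delta_L, alpha,
                   f_x, f_y, u_0, v_0):
    """Project a radar-frame point (x_r, y_r, z_r) to pixel coordinates (u, v).

    Sign conventions are assumptions for this sketch, not taken from the
    patent text: radar x forward, y left, z up; camera x right, y down,
    z forward, pitched down by alpha; camera on the symmetry plane and
    radar offset W_r to its right; camera mounted delta_L behind the radar.
    """
    # Express the point in a level, camera-centred frame (x right, y down, z forward).
    x_c = W_r - y_r                 # lateral offset between the two mounts
    y_c = H_c - (H_r + z_r)         # height difference, measured downwards
    z_c = x_r - delta_L             # longitudinal offset along the driving direction

    # Rotate by the camera pitch alpha (rotation about the camera x-axis).
    y_p = math.cos(alpha) * y_c - math.sin(alpha) * z_c
    z_p = math.sin(alpha) * y_c + math.cos(alpha) * z_c
    if z_p <= 0:
        raise ValueError("target lies behind the camera under these conventions")

    # Pinhole projection with equivalent focal lengths and principal point.
    u = f_x * x_c / z_p + u_0
    v = f_y * y_p / z_p + v_0
    return u, v
```

As a quick sanity check, a target on the host vehicle's longitudinal symmetry plane (y_r = W_r under these conventions) projects to u = u_0, i.e. the image centre column.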
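Claim 7's correction formula is likewise published as an image. One plausible reading, shown below as a hedged sketch, is to keep the radar's longitudinal measurement (which millimeter wave radar estimates accurately) and back-project the vision detection's pixel centre (u_r, v_r) at that depth to refine the lateral and vertical coordinates. The helper reuses the conventions of the projection sketch above; the patent's actual formula may differ.

```python
import math

def correct_with_pixel(x_r, u_r, v_r,
                       H_c, H_r, W_r, delta_L, alpha,
                       f_x, f_y, u_0, v_0):
    """Refine a radar target using the matched image detection (u_r, v_r).

    Hypothetical sketch: the radar depth x_r is kept, the pinhole model is
    inverted at that depth, and the result is mapped back to the radar frame.
    """
    # Depth in the pitched camera frame, approximating the target as lying
    # close to the optical axis (small vertical contribution).
    z_p = (x_r - delta_L) * math.cos(alpha)

    # Invert the pinhole projection at that depth.
    x_c = (u_r - u_0) * z_p / f_x
    y_p = (v_r - v_0) * z_p / f_y

    # Undo the camera pitch to return to the level camera frame.
    y_c = math.cos(alpha) * y_p + math.sin(alpha) * z_p

    # Map back into the radar frame (same assumed conventions as above).
    x_a = x_r                       # trust the radar's longitudinal distance
    y_a = W_r - x_c                 # corrected lateral coordinate
    z_a = H_c - H_r - y_c           # corrected height relative to the radar mount
    return x_a, y_a, z_a
```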
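The device of claim 8 maps naturally onto a small processing pipeline. The skeleton below is an illustrative sketch of that module structure; all class, attribute and parameter names (DangerZone, RadarTarget, run_once, and the injected callables) are hypothetical and not taken from the patent.

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class DangerZone:
    """Rectangular danger area ahead of the host vehicle (radar frame, metres)."""
    max_forward: float = 50.0
    half_width: float = 2.0

    def contains(self, x: float, y: float) -> bool:
        return 0.0 <= x <= self.max_forward and abs(y) <= self.half_width

@dataclass
class RadarTarget:
    x: float       # longitudinal distance
    y: float       # lateral offset
    z: float       # height
    speed: float

class VehicleDetectionDevice:
    """One processing cycle mirroring the five claimed modules."""

    def __init__(self, read_radar: Callable, capture_image: Callable,
                 detector: Callable, fuse: Callable):
        self.read_radar = read_radar          # radar data acquisition
        self.capture_image = capture_image    # image acquisition
        self.detector = detector              # pre-trained target detection model
        self.fuse = fuse                      # radar/vision fusion strategy
        self.zone = DangerZone()              # danger area establishment

    def run_once(self):
        radar_data: List[RadarTarget] = self.read_radar()
        image = self.capture_image()
        # First detection data: radar targets inside the danger area.
        first = [t for t in radar_data if self.zone.contains(t.x, t.y)]
        # Second detection data: vision detections from the image.
        second = self.detector(image)
        # Fusion based on the radar-to-pixel projection (see sketches above).
        return self.fuse(first, second)
```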
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010591915.X | 2020-06-24 | ||
CN202010591915.XA CN111797741A (en) | 2020-06-24 | 2020-06-24 | Vehicle detection method, device, vehicle and storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2021259344A1 (en) | 2021-12-30 |
Family
ID=72804185
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2021/101971 WO2021259344A1 (en) | 2020-06-24 | 2021-06-24 | Vehicle detection method and device, vehicle, and storage medium |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN111797741A (en) |
WO (1) | WO2021259344A1 (en) |
Families Citing this family (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111797741A (en) * | 2020-06-24 | 2020-10-20 | 中国第一汽车股份有限公司 | Vehicle detection method, device, vehicle and storage medium |
CN112578792B (en) * | 2020-11-12 | 2022-05-31 | 东风汽车集团有限公司 | Crossroad auxiliary control method and storage medium |
CN112379674B (en) * | 2020-11-26 | 2022-06-21 | 中国第一汽车股份有限公司 | Automatic driving equipment and system |
CN112465868B (en) * | 2020-11-30 | 2024-01-12 | 浙江华锐捷技术有限公司 | Target detection tracking method and device, storage medium and electronic device |
CN112622893A (en) * | 2020-12-25 | 2021-04-09 | 北京理工大学前沿技术研究院 | Multi-sensor fusion target vehicle automatic driving obstacle avoidance method and system |
CN112799055B (en) * | 2020-12-28 | 2024-07-09 | 深圳承泰科技有限公司 | Method and device for detecting detected vehicle and electronic equipment |
CN113682259B (en) * | 2021-09-22 | 2023-07-04 | 海南大学 | Door opening early warning anti-collision system for vehicle and control method |
CN113949996B (en) * | 2021-10-11 | 2022-10-28 | 重庆大学 | Data processing method and system for acquisition and playback of vehicle-road cooperative test scene |
CN114266859B (en) * | 2021-12-02 | 2022-09-06 | 国汽智控(北京)科技有限公司 | Data processing method, device, equipment and storage medium |
CN114872735B (en) * | 2022-07-10 | 2022-10-04 | 成都工业职业技术学院 | Neural network algorithm-based decision-making method and device for automatically-driven logistics vehicles |
CN115598644B (en) * | 2022-11-28 | 2023-03-28 | 北京木牛领航科技有限公司 | Collision detection method, device, medium and equipment |
Family Cites Families (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102975662A (en) * | 2012-11-13 | 2013-03-20 | 上海交通大学 | Ultrasonic wave reversing radar system based on time difference of arrival (TDOA) |
JP5929870B2 (en) * | 2013-10-17 | 2016-06-08 | 株式会社デンソー | Target detection device |
CN106908783B (en) * | 2017-02-23 | 2019-10-01 | 苏州大学 | Based on obstacle detection method combined of multi-sensor information |
CN106891889A (en) * | 2017-03-10 | 2017-06-27 | 南京沃杨机械科技有限公司 | Agricultural machinery is unmanned to use farm environment cognitive method |
CN107862287A (en) * | 2017-11-08 | 2018-03-30 | 吉林大学 | A kind of front zonule object identification and vehicle early warning method |
CN109254289B (en) * | 2018-11-01 | 2021-07-06 | 百度在线网络技术(北京)有限公司 | Detection method and detection equipment for road guardrail |
CN110515041B (en) * | 2019-08-30 | 2023-12-01 | 的卢技术有限公司 | Vehicle distance measurement control method and system based on Kalman filtering technology |
CN111284485B (en) * | 2019-10-10 | 2021-06-18 | 中国第一汽车股份有限公司 | Method and device for predicting driving behavior of obstacle vehicle, vehicle and storage medium |
CN110940979A (en) * | 2019-10-28 | 2020-03-31 | 杭州飞步科技有限公司 | Obstacle detection method, apparatus, device, and storage medium |
CN110949402B (en) * | 2019-11-15 | 2021-06-01 | 中国第一汽车股份有限公司 | Alarm area determination method and device, storage medium and vehicle |
- 2020-06-24: CN application CN202010591915.XA (published as CN111797741A), legal status: active, Pending
- 2021-06-24: WO application PCT/CN2021/101971 (published as WO2021259344A1), legal status: active, Application Filing
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20150219758A1 (en) * | 2014-01-31 | 2015-08-06 | Applied Concepts, Inc. | Mobile radar and visual tracking coordinate transformation |
CN107609522A (en) * | 2017-09-19 | 2018-01-19 | 东华大学 | A kind of information fusion vehicle detecting system based on laser radar and machine vision |
CN107991671A (en) * | 2017-11-23 | 2018-05-04 | 浙江东车智能科技有限公司 | A kind of method based on radar data and vision signal fusion recognition risk object |
CN110532896A (en) * | 2019-08-06 | 2019-12-03 | 北京航空航天大学 | A kind of road vehicle detection method merged based on trackside millimetre-wave radar and machine vision |
CN111797741A (en) * | 2020-06-24 | 2020-10-20 | 中国第一汽车股份有限公司 | Vehicle detection method, device, vehicle and storage medium |
Cited By (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114942437A (en) * | 2022-04-25 | 2022-08-26 | 东风汽车集团股份有限公司 | Front side target detection method for front vehicle |
CN115063772A (en) * | 2022-05-09 | 2022-09-16 | 厦门金龙联合汽车工业有限公司 | Vehicle formation rear vehicle detection method, terminal device and storage medium |
CN115063772B (en) * | 2022-05-09 | 2024-04-16 | 厦门金龙联合汽车工业有限公司 | Method for detecting vehicles after formation of vehicles, terminal equipment and storage medium |
CN115327524A (en) * | 2022-07-29 | 2022-11-11 | 江苏集萃深度感知技术研究所有限公司 | Road side end target detection method and device based on millimeter wave radar and vision fusion |
CN115327532A (en) * | 2022-08-04 | 2022-11-11 | 澳克诺(上海)汽车科技有限公司 | Method and system for sensor data fusion |
CN115981337A (en) * | 2023-01-18 | 2023-04-18 | 中国矿业大学 | Underground unmanned vehicle decision making system and method based on multi-source information |
CN115981337B (en) * | 2023-01-18 | 2023-11-17 | 中国矿业大学 | Underground unmanned vehicle decision-making system and method based on multi-source information |
CN116148801A (en) * | 2023-04-18 | 2023-05-23 | 深圳市佰誉达科技有限公司 | Millimeter wave radar-based target detection method and system |
CN116757981A (en) * | 2023-06-19 | 2023-09-15 | 北京拙河科技有限公司 | Multi-terminal image fusion method and device |
CN117214966A (en) * | 2023-08-01 | 2023-12-12 | 珠海微度芯创科技有限责任公司 | Image mapping method, device, equipment and medium of millimeter wave security inspection imaging equipment |
CN117214966B (en) * | 2023-08-01 | 2024-04-05 | 珠海微度芯创科技有限责任公司 | Image mapping method, device, equipment and medium of millimeter wave security inspection imaging equipment |
Also Published As
Publication number | Publication date |
---|---|
CN111797741A (en) | 2020-10-20 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2021259344A1 (en) | Vehicle detection method and device, vehicle, and storage medium | |
CN109435942B (en) | Information fusion-based parking space line and parking space recognition method and device | |
US11373532B2 (en) | Pothole detection system | |
JP4676373B2 (en) | Peripheral recognition device, peripheral recognition method, and program | |
CN108638999B (en) | Anti-collision early warning system and method based on 360-degree look-around input | |
WO2021056895A1 (en) | Multi-target detection and recognition method and assisted driving method and system | |
CN112215306B (en) | Target detection method based on fusion of monocular vision and millimeter wave radar | |
US11577748B1 (en) | Real-time perception system for small objects at long range for autonomous vehicles | |
CN113192091B (en) | Long-distance target sensing method based on laser radar and camera fusion | |
CN111369541A (en) | Vehicle detection method for intelligent automobile under severe weather condition | |
CN113850102B (en) | Vehicle-mounted vision detection method and system based on millimeter wave radar assistance | |
WO2020154990A1 (en) | Target object motion state detection method and device, and storage medium | |
CN108944668B (en) | Auxiliary driving early warning method based on vehicle-mounted 360-degree look-around input | |
JP2006018751A (en) | Image processor for vehicle | |
CN113297881B (en) | Target detection method and related device | |
JP2005148817A (en) | Image processor for vehicle | |
CN108645375B (en) | Rapid vehicle distance measurement optimization method for vehicle-mounted binocular system | |
CN110458050B (en) | Vehicle cut-in detection method and device based on vehicle-mounted video | |
KR20110001427A (en) | High speed road lane detection method based on extraction of roi-lb | |
CN114415171A (en) | Automobile travelable area detection method based on 4D millimeter wave radar | |
JP4296287B2 (en) | Vehicle recognition device | |
JP2009245042A (en) | Traffic flow measurement device and program | |
JP4956099B2 (en) | Wall detector | |
TWI680898B (en) | Light reaching detection device and method for close obstacles | |
Ramaiah et al. | Stereo Vision Based Pothole Detection System for Improved Ride Quality |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 21828828; Country of ref document: EP; Kind code of ref document: A1 |
| NENP | Non-entry into the national phase | Ref country code: DE |
| 122 | Ep: pct application non-entry in european phase | Ref document number: 21828828; Country of ref document: EP; Kind code of ref document: A1 |