US20180335787A1 - Six-dimensional point cloud system for a vehicle - Google Patents
- Publication number
- US20180335787A1 (application US 15/597,994)
- Authority
- US
- United States
- Prior art keywords
- point
- point cloud
- points
- processor
- dimensional
- Prior art date
- Legal status: Granted (the status listed is an assumption, not a legal conclusion)
Classifications
- G05D1/0274 — control of position or course in two dimensions, specially adapted to land vehicles, using internal positioning means with mapping information stored in a memory device
- B60W30/12 — lane keeping
- B60W30/16 — control of distance between vehicles, e.g. keeping a distance to a preceding vehicle
- G01S13/931 — radar or analogous systems specially adapted for anti-collision purposes in land vehicles
- G01S17/86 — combinations of lidar systems with systems other than lidar, radar or sonar (formerly G01S17/023)
- G01S7/41 — target characterisation using analysis of the echo signal; target signature; target cross-section
- G01S7/415 — identification of targets based on measurements of movement associated with the target
- G05D1/0214 — defining a desired trajectory in accordance with safety or protection criteria, e.g. avoiding hazardous areas
- G05D1/0223 — defining a desired trajectory involving speed control of the vehicle
- G05D1/0246 — optical position detecting means using a video camera in combination with image processing means
- G05D1/0255 — control of position or course using acoustic signals, e.g. ultrasonic signals
- B60T2201/08 — lane monitoring; lane keeping systems
- G01S7/4026 — antenna boresight monitoring or calibration
Definitions
- the technical field generally relates to vehicles, and more particularly relates to systems and methods for operating a radar system in a vehicle.
- Vehicles often utilize various sensors and systems to detect the environment around the vehicle, either to navigate the vehicle, in the case of a self-driving vehicle, or to aid a driver in navigating through a driving environment.
- most current sensors and systems are more effective in predictable environments, such as driving along a highway.
- urban environments, such as city driving, are more difficult to navigate, as there are more variables that the vehicle must consider.
- a six-dimensional point cloud system for a vehicle is provided.
- the system may include, but is not limited to, a multiple input multiple output radar system, a memory, and a processor communicatively connected to the multiple input multiple output radar system and the memory, the processor configured to receive, from the multiple input multiple output radar system, a current data frame comprising a point cloud, the point cloud including three-dimensional position information and Doppler data corresponding to an object detected by the multiple input multiple output radar system at each point in the point cloud, calculate, for each point in the point cloud, a three-dimensional velocity of the respective point in the point cloud based upon the three-dimensional position information and Doppler data associated with the point in the current data frame and data from a previous data frame stored in the memory, segment each point in the point cloud into one or more clusters of points based upon the three-dimensional position information associated with each respective point and the calculated three-dimensional velocity of each respective point, compute a kinematic state of each of the one or more clusters of points, the kinematic state including a center of mass for the respective cluster of points, a reference velocity for the respective cluster of points, an angular velocity for the respective cluster of points, and contour points for the respective cluster of points, determine an object type associated with each of the one or more clusters, and determine a threat level and a response command based upon the determined object type and the kinematic state of each of the one or more clusters of points.
- a method for operating a six-dimensional point cloud system for a vehicle is also provided. The method may include, but is not limited to, receiving, by a processor from a multiple input multiple output radar system, a current data frame comprising a point cloud, the point cloud including three-dimensional position information and Doppler data corresponding to an object detected by the multiple input multiple output radar system at each point in the point cloud, calculating, by the processor for each point in the point cloud, a three-dimensional velocity of the respective point in the point cloud based upon the three-dimensional position information and Doppler data associated with the point in the current data frame and data from a previous data frame stored in a memory, segmenting, by the processor, each point in the point cloud into one or more clusters of points based upon the three-dimensional position information associated with each respective point and the calculated three-dimensional velocity of each respective point, computing, by the processor, a kinematic state of each of the one or more clusters of points, the kinematic state including a center of mass for the respective cluster of points, a reference velocity for the respective cluster of points, an angular velocity for the respective cluster of points, and contour points for the respective cluster of points, determining, by the processor, an object type associated with each of the one or more clusters, and determining, by the processor, a threat level and a response command based upon the determined object type and the kinematic state of each of the one or more clusters of points.
- FIG. 1 is a block diagram of a vehicle, in accordance with an embodiment
- FIG. 2 is a flow diagram illustrating an exemplary method for operating the 6D point cloud system, in accordance with an embodiment
- FIG. 3 illustrates data points in a current data frame and a previous data frame in an exemplary Cartesian coordinate system.
- a six-dimensional point cloud system which utilizes a multiple input multiple output radar system for accurately identifying and tracking objects around a vehicle in both urban and highway environments.
- FIG. 1 is a block diagram of a vehicle 100 , in accordance with an embodiment.
- the vehicle 100 may be an automobile.
- the vehicle 100 may be, for example, an aircraft, a spacecraft, a watercraft or the like, or any combination thereof.
- the vehicle 100 includes a six-dimensional (6D) point cloud system 110 .
- the 6D point cloud system 110 generates three-dimensional (3D) velocity data and 3D position data for objects to accurately identify an object near the vehicle and to accurately track the object, as discussed in further detail below.
- the objects may be, for example, another vehicle, a motorcycle, a bicycle, a pedestrian, a street light, a street sign, construction/warning objects (e.g., barricades, cones, drums, etc.), or the like.
- the 6D point cloud system 110 includes a multiple input multiple output (MIMO) radar system 120 .
- the MIMO radar system 120 may include a single antenna, or multiple co-located or distributed antennas, capable of simultaneously transmitting and receiving multiple frequencies of radar signals in the same field of view. Accordingly, the MIMO radar system 120 is capable of generating a dense point cloud of data. Every time an object reflects the radar signal, a point in the point cloud is created. The number of points in the point cloud that correspond to a single object can vary depending upon the size of the object and the distance of the object from the vehicle. As discussed in further detail below, the MIMO radar system 120 outputs multiple data points for each point in the point cloud, which the 6D point cloud system 110 utilizes to identify and track objects.
- the 6D point cloud system 110 further includes at least one processor 130 .
- the at least one processor 130 may be a central processing unit (CPU), a graphics processing unit (GPU), a physics processing unit (PPU), an application specific integrated circuit (ASIC), a field programmable gate array (FPGA), a microcontroller, or any other logic device or any combination thereof. While the 6D point cloud system 110 may utilize multiple processors 130 , the at least one processor 130 hereinafter is referred to as a singular processor 130 for simplicity.
- the 6D point cloud system 110 further includes a memory 140 .
- the memory may be any combination of volatile and non-volatile memory.
- the memory 140 may store frames of data from the MIMO radar system 120 for processing, as discussed in further detail below.
- the memory 140 may also store non-transitory computer readable instructions for implementing the 6D point cloud system 110 as discussed herein.
- the vehicle 100 further includes at least one control system 150 which can be controlled by the 6D point cloud system 110 .
- the control system 150 may be, for example, a braking system, an acceleration system, a steering system, a lighting system, or the like.
- the 6D point cloud system 110 may generate a control signal to activate the control system to generate a warning, avoid the object or minimize an impact with the object.
- FIG. 2 is a flow diagram illustrating an exemplary method 200 for operating the 6D point cloud system 110 , in accordance with an embodiment.
- the processor 130 receives a data frame from the MIMO radar system 120 .
- the data frame includes data for each data point in the point cloud of the radar image at a point in time.
- the data frame includes four data points for each data point in the point cloud.
- the four data points include three location data points and a Doppler data point.
- the intensity data point may be generated based upon one or more of the four data points. For example, an intensity may be based upon a signal strength of the received location or Doppler data points.
- the radar may output the three location data points as a range, an azimuth angle, and an elevation angle relative to the location of the MIMO radar system 120 .
- the radar may output the three location data points as Cartesian coordinates (i.e., X, Y and Z).
- the data points may be measured in, for example, watts or decibels for intensity, meters for range, meters per second for velocity, hertz for Doppler, and degrees for angles.
- the Doppler data point includes a range and a range rate. In other words, the Doppler data point includes data corresponding to whether a point in the point cloud, which corresponds to an object, is getting closer to or further from the vehicle 100 as well as the distance from the vehicle 100 .
- the processor 130 then computes an estimated three-dimensional velocity for each data point in the point cloud where an object was detected by the MIMO radar system 120 . (Step 210 ).
- in order to determine the estimated three-dimensional velocity for each data point in the point cloud, the processor 130 must determine which point in the point cloud corresponds to which point in the point cloud of a previous data frame.
- FIG. 3 illustrates data points in a current data frame and a previous data frame in an exemplary Cartesian coordinate system.
- the circles 300 , 310 and 320 correspond to points in a point cloud in the previous data frame and the solid circles 330 , 340 and 350 correspond to points in a point cloud in the current data frame.
- in order to determine the velocity for point 330 , the processor 130 must determine which of the points 300 - 320 corresponds to point 330 in the previous data frame and then use the relative distance between the determined point and point 330 to determine the velocity of point 330 . In one embodiment, for example, the processor may simultaneously determine both the corresponding point in the previous frame for each point in the current frame and the velocity of each point in the current frame according to Equation 1:
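- Equation 1 does not render in this copy; a reconstruction consistent with the terms defined below (an interpretation, not the patent's verbatim formula) is:

$$ v_k = \underset{v_k}{\operatorname{argmin}} \sum_{j=1}^{N_k} \alpha_{jk} \left[ \lambda_1 \left\| s_k - \left( m_j + v_k\, \Delta t \right) \right\|^2 + \lambda_2 \left( d_k - n_k \cdot v_k \right)^2 \right] \qquad \text{(Equation 1)} $$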
- the processor 130 determines the three-dimensional velocity v k for each point k in the point cloud of the current frame by determining the minimum velocity v k (i.e., argmin vk ) between the respective point in the point cloud of the current data frame and every point in the point cloud of the previous frame.
- in Equation 1, s k is the multidimensional location data received from the MIMO radar system 120 (i.e., the three location points discussed above) for the point k in the current data frame for which the velocity is being determined, m j is the location data for one of the points in the point cloud of the previous frame, d k is the Doppler measurement for the point in the current frame (i.e., the Doppler measurement for s k ), n k is the unit direction from the center of the MIMO radar system 120 to the point s k , λ 1 and λ 2 are calibration parameters, N k is the number of radar reflection points in the current time frame that match a point m j detected in the previous time frame, and α jk is the probability that m j and s k are associated.
- α jk may be calculated according to Equation 2:
$$ \alpha_{jk} = c\, e^{-\frac{\left\| s_k - \left( m_j + v_k\, \Delta t \right) \right\|^2}{\sigma_1^2}}\; e^{-\frac{\left| d_k - d_j \right|^2}{\sigma_2^2}}\; e^{-\frac{\left| I_k - I_j \right|^2}{\sigma_3^2}} \qquad \text{(Equation 2)} $$
- I k is an intensity value for the point s k
- I j is an intensity value for the point m j
- c is a normalization factor chosen so that the sum of all α jk — that is, the sum of the probabilities that any given point in the point cloud of the previous data frame corresponds to the point s k in the point cloud of the current data frame — is equal to 1.
- Δt is the difference in time between the respective data frames, and σ 1 , σ 2 and σ 3 are calibration parameters equal to the standard deviations of the position measurement, Doppler measurement, and intensity measurement of a calibrated target (e.g., a corner reflector), respectively.
- when performing the calculations for Equations 1 and 2, the processor 130 may initially set the velocity v k of s k as Σ j α jk v j .
- the velocity v k of s k is assumed to be a sum of the probability each respective point in the point cloud of the previous frame corresponds to the point s k in the current frame multiplied by the velocity of the respective point in the point cloud of the previous frame determined in the previous three-dimensional velocity calculation (i.e., v k for the point in the point cloud of the previous frame determined in a previous calculation).
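- As a concrete sketch, the association weights and velocity estimate of Equations 1 and 2 can be alternated as follows. This is a hypothetical implementation: the parameter values, the omission of the intensity term, and the use of n_j · v_j as the previous-frame Doppler d_j are all assumptions, not taken from the patent.

```python
import numpy as np

def estimate_point_velocity(s_k, d_k, prev_pts, prev_vels, dt,
                            lam1=1.0, lam2=1.0, sigma1=1.0, sigma2=1.0,
                            n_iter=5):
    """Iteratively estimate the 3D velocity v_k of a current-frame point.

    s_k       : (3,) position of the current-frame point
    d_k       : Doppler (range-rate) measurement for s_k
    prev_pts  : (N, 3) positions m_j from the previous frame
    prev_vels : (N, 3) velocities v_j estimated for the previous frame
    """
    n_k = s_k / np.linalg.norm(s_k)            # unit direction radar -> s_k
    # Initialize v_k as the probability-weighted sum of previous velocities.
    alpha = np.full(len(prev_pts), 1.0 / len(prev_pts))
    v_k = alpha @ prev_vels
    for _ in range(n_iter):
        # Equation-2-style Gaussian association weights (intensity term dropped;
        # the previous-frame Doppler d_j is approximated here by n_j . v_j).
        n_j = prev_pts / np.linalg.norm(prev_pts, axis=1, keepdims=True)
        d_j = np.einsum('ij,ij->i', n_j, prev_vels)
        r = np.linalg.norm(s_k - (prev_pts + v_k * dt), axis=1)
        alpha = np.exp(-r**2 / sigma1**2) * np.exp(-(d_k - d_j)**2 / sigma2**2)
        alpha /= alpha.sum()
        # The Equation-1-style weighted cost
        #   sum_j alpha_j * lam1 * ||s_k - m_j - v*dt||^2 + lam2 * (d_k - n_k.v)^2
        # is quadratic in v, so its minimizer solves a 3x3 linear system.
        A = lam1 * dt**2 * np.eye(3) + lam2 * np.outer(n_k, n_k)
        b = lam1 * dt * (alpha @ (s_k - prev_pts)) + lam2 * d_k * n_k
        v_k = np.linalg.solve(A, b)
    return v_k
```

Alternating the weight update with the closed-form velocity solve mirrors the simultaneous association-and-estimation the text describes.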
- the processor 130 then removes noisy resolution points from the data frame (Step 215 ).
- the processor 130 may remove data points from subsequent calculations based upon temporal persistency. In other words, points in a current point cloud which could not be correlated with points in a previous point cloud are not considered in any subsequent calculations.
- the processor 130 then classifies each point in the point cloud of the current frame as stationary or dynamic. (Step 220 ).
- a point in a point cloud is assumed to be stationary when an estimated relative velocity for the point is below a predetermined threshold.
- the threshold may be, for example, ten decibels (dB) above a noise level.
- the processor 130 determines the relative velocity of each point in the point cloud of the current frame by comparing the velocity of the point to the vehicle speed after compensating the velocity v k for the point (determined in Step 210 ) with the yaw, pitch and roll of the vehicle and compensating the position of the point (i.e., the three location points from the MIMO radar system 120 ) based upon a position of the MIMO radar system 120 as well as the yaw, pitch and roll of the vehicle.
- the processor may determine a classification for each point in the point cloud based upon Equation 3:
- v k is the velocity of the point in the current data frame
- ω H is a pitch rate of the vehicle 100
- s k is the position of the point in the current data frame
- V H is the speed of the vehicle 100
- R is a rotational matrix that compensates the velocity v k and position s k for the yaw, pitch and roll of the vehicle 100 .
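- Equation 3 itself does not survive in this copy; one reconstruction consistent with the terms above and with the norm test described in the text (treating ω H and V H as the host angular-rate and velocity vectors, which is an assumption) is:

$$ \left\| R\, v_k + V_H + \omega_H \times \left( R\, s_k \right) \right\| < \epsilon \qquad \text{(Equation 3)} $$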
- the rotational matrix R may be calculated according to Equation 4:
- ψ is a yaw of the vehicle 100
- θ is a pitch of the vehicle 100
- φ is a roll of the vehicle 100
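- Equation 4 is likewise not legible here; a standard yaw-pitch-roll factorization (one common convention, assumed rather than taken from the patent) is:

$$ R = R_z(\psi)\, R_y(\theta)\, R_x(\phi) = \begin{bmatrix} \cos\psi & -\sin\psi & 0 \\ \sin\psi & \cos\psi & 0 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} \cos\theta & 0 & \sin\theta \\ 0 & 1 & 0 \\ -\sin\theta & 0 & \cos\theta \end{bmatrix} \begin{bmatrix} 1 & 0 & 0 \\ 0 & \cos\phi & -\sin\phi \\ 0 & \sin\phi & \cos\phi \end{bmatrix} \qquad \text{(Equation 4)} $$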
- ε represents a predetermined stationary speed threshold. In one embodiment, for example, ε may be less than 0.5 kilometers per hour (KPH). Accordingly, if the norm of the sum of the velocities represented on the left side of Equation 3 is less than ε, the processor 130 classifies the point in the point cloud as stationary. However, if the norm is greater than ε, the processor 130 classifies the point as dynamic. The processor 130 , in Step 220 , performs this classification for every point in the point cloud of the current data frame.
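- The stationary/dynamic test can be sketched as follows. The Z-Y-X rotation convention, the vector treatment of V_H and ω_H, and the residual form are interpretations of Equation 3, not the patent's verbatim method.

```python
import numpy as np

def rotation_matrix(yaw, pitch, roll):
    """Z-Y-X (yaw-pitch-roll) rotation; the patent's exact convention is
    not stated here, so this ordering is an assumption."""
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cr, sr = np.cos(roll), np.sin(roll)
    Rz = np.array([[cy, -sy, 0.0], [sy, cy, 0.0], [0.0, 0.0, 1.0]])
    Ry = np.array([[cp, 0.0, sp], [0.0, 1.0, 0.0], [-sp, 0.0, cp]])
    Rx = np.array([[1.0, 0.0, 0.0], [0.0, cr, -sr], [0.0, sr, cr]])
    return Rz @ Ry @ Rx

def classify_point(v_k, s_k, V_H, omega_H, yaw, pitch, roll, eps=0.5 / 3.6):
    """Classify one point as 'stationary' or 'dynamic': the point is
    stationary when its ego-motion-compensated velocity falls below eps
    (0.5 km/h expressed in m/s, per the embodiment in the text).
    V_H and omega_H are taken as the host velocity and angular-rate
    vectors, an interpretation of Equation 3."""
    R = rotation_matrix(yaw, pitch, roll)
    residual = R @ v_k + V_H + np.cross(omega_H, R @ s_k)
    return 'stationary' if np.linalg.norm(residual) < eps else 'dynamic'
```

A point closing on the host at exactly the host's speed compensates to zero residual and classifies as stationary.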
- the processor 130 may analyze the points in the point cloud classified as stationary to correct a radar angle alignment. (Step 225 ). In other words, the processor 130 may analyze the points classified as stationary to determine whether the angle of the MIMO radar system 120 has become misaligned from an intended angle. The misalignment could result from an impact to the vehicle 100 or from tolerances during installation of the MIMO radar system 120 .
- a pitch correction θ, a roll correction φ, and an azimuth correction ψ for the radar angle alignment may be calculated according to Equations 5, 6, and 7.
- U is the left orthogonal matrix and V is the right orthogonal matrix of the singular value decomposition (SVD) of a data matrix constructed from the stationary points, and det(UV T ) is the determinant of the matrix UV T .
- the calculated pitch correction θ, roll correction φ, and azimuth correction ψ may then be used to correct subsequent radar data by compensating that data for the calculated misalignment.
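- The SVD-based alignment step can be sketched as a Kabsch-style fit. Because the patent's matrices are not legible in this copy, the cross-covariance construction, the Z-Y-X Euler extraction, and the function name are all illustrative assumptions.

```python
import numpy as np

def misalignment_correction(measured, predicted):
    """Estimate the radar mounting rotation from stationary points.

    measured  : (N, 3) stationary-point positions as seen by the radar
    predicted : (N, 3) the same points in the expected (aligned) frame
    Returns (pitch, roll, azimuth) corrections in radians for a Z-Y-X
    (yaw-pitch-roll) convention.
    """
    H = measured.T @ predicted                  # 3x3 cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    # det(U V^T) guards against a reflection (improper rotation) solution.
    d = np.sign(np.linalg.det(U @ Vt))
    R = (U @ np.diag([1.0, 1.0, d]) @ Vt).T     # rotation: measured -> predicted
    # Recover Z-Y-X Euler angles from R (Equations 5-7 in spirit).
    pitch = np.arcsin(-R[2, 0])
    roll = np.arctan2(R[2, 1], R[2, 2])
    azimuth = np.arctan2(R[1, 0], R[0, 0])
    return pitch, roll, azimuth
```

The sign guard via det(UV^T) is the role the text assigns to that determinant: it keeps the recovered matrix a proper rotation.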
- the processor 130 may then segment the points in the point cloud from the current frame into clusters. (Step 230 ). Any two points of the point cloud may be grouped into the same cluster according to Equation 8:
- when Equation 8 is satisfied, the processor 130 associates the points j and k into the same cluster.
- the three-dimensional velocity for a point k can be calculated according to Equation 1. The three-dimensional position is determined based upon the data from the MIMO radar system 120 .
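- Equation 8 is not legible in this copy; one plausible reading is that two points join the same cluster when both their positions and their 3D velocities are close. The thresholds and the breadth-first connected-components growth below are illustrative assumptions.

```python
import numpy as np
from collections import deque

def segment_clusters(positions, velocities, pos_thresh=2.0, vel_thresh=1.0):
    """Segment point-cloud points into clusters by position and velocity
    proximity. Returns an integer cluster label per point."""
    n = len(positions)
    labels = np.full(n, -1)
    cluster = 0
    for seed in range(n):
        if labels[seed] != -1:
            continue
        labels[seed] = cluster
        queue = deque([seed])
        while queue:                            # grow the cluster breadth-first
            k = queue.popleft()
            for j in range(n):
                if (labels[j] == -1
                        and np.linalg.norm(positions[j] - positions[k]) < pos_thresh
                        and np.linalg.norm(velocities[j] - velocities[k]) < vel_thresh):
                    labels[j] = cluster
                    queue.append(j)
        cluster += 1
    return labels
```

Clustering on velocity as well as position keeps two nearby objects moving differently (e.g., a pedestrian beside a parked car) in separate clusters.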
- the processor 130 may then compute a kinematic state for each cluster. (Step 235 ).
- the kinematic state may include, for example, a center of mass for the cluster, a reference velocity and angular velocity for the cluster and contour points for the cluster.
- the contour of the cluster corresponds to an outline of the cluster.
- the center of mass p cm of the cluster may be determined by the processor 130 according to Equation 9:
- in Equation 9, C corresponds to the number of points in the cluster and p k corresponds to the position of each point in the cluster.
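- Written out from the terms above, Equation 9 is the ordinary centroid:

$$ p_{cm} = \frac{1}{C} \sum_{k=1}^{C} p_k \qquad \text{(Equation 9)} $$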
- the processor 130 may then determine the reference velocity u ref and a reference angular velocity ⁇ ref according to Equation 10:
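- Equation 10 is not legible in this copy; a least-squares rigid-body fit consistent with the quantities named (an interpretation, not the patent's verbatim formula) is:

$$ \left( u_{ref},\, \omega_{ref} \right) = \underset{u,\,\omega}{\operatorname{argmin}} \sum_{k=1}^{C} \left\| v_k - u - \omega \times \left( p_k - p_{cm} \right) \right\|^2 \qquad \text{(Equation 10)} $$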
- the processor 130 may use a Kalman filter on the kinematic state data (i.e., the center of mass for the cluster, the reference velocity, the reference angular velocity, and the contour points for the cluster) to track and report a smoothed kinematic state for the cluster over multiple radar data frames.
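- A minimal sketch of the tracking filter follows. The patent does not specify its state model, so the six-state constant-velocity formulation, the noise values, and the class name are assumptions.

```python
import numpy as np

class ClusterTracker:
    """Constant-velocity Kalman filter over a cluster's kinematic state
    (position and velocity of the center of mass)."""

    def __init__(self, p0, v0, dt):
        self.x = np.hstack([p0, v0])            # state: [px, py, pz, vx, vy, vz]
        self.P = np.eye(6)                      # state covariance
        self.F = np.eye(6)                      # constant-velocity transition
        self.F[:3, 3:] = dt * np.eye(3)
        self.H = np.eye(6)                      # position and velocity both observed
        self.Q = 0.01 * np.eye(6)               # process noise (illustrative)
        self.R = 0.1 * np.eye(6)                # measurement noise (illustrative)

    def step(self, measured_p, measured_v):
        # Predict forward one frame.
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        # Update with the cluster's measured center of mass and velocity.
        z = np.hstack([measured_p, measured_v])
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + K @ (z - self.H @ self.x)
        self.P = (np.eye(6) - K @ self.H) @ self.P
        return self.x[:3], self.x[3:]
```

Filtering the per-frame cluster measurements yields the smoothed kinematic state the text describes reporting across multiple radar data frames.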
- the processor 130 may then classify an object to be associated with each cluster. (Step 240 ). In other words, the processor 130 determines an object type for the cluster.
- a cluster may be classified, for example, as a vehicle, a pedestrian, a barricade, a street light, a stop sign, or any other object a vehicle could encounter on the road.
- the classification may be performed using a deep learning neural network (DLNN).
- a separate processor 130 , such as a GPU or an array of FPGAs, may be used for this process alone.
- the number of processors 130 in the 6D point cloud system 110 may vary depending upon a performance level of the processor and the desired response time of the 6D point cloud system 110 .
- the processor 130 may input a normalized intensity map and a time sliding window of micro-Doppler signals to the DLNN for each cluster.
- the MIMO radar system 120 may output an intensity associated with each data point.
- the normalized intensity map is created by dividing the intensity values by the total number of values.
- the time sliding window of micro-Doppler signals corresponds to a frequency in hertz received by the MIMO radar system 120 corresponding to the cluster over a period of data frames.
- the DLNN creates feature maps for each cluster by extracting features from the intensity map and the time sliding window of micro-Doppler signals.
- the processor 130 classifies a cluster based upon the reference velocity associated with the cluster and the contour of the cluster.
- Each type of target for example, a pedestrian or a vehicle, has a typical micro-Doppler signature, and thus when analyzing the spectrogram (i.e., a spectrum change over time corresponding to the target), the processor 130 can classify the target based upon which signature the target most closely resembles.
- the processor 130 then tracks each classified cluster in subsequent passes through the process. (Step 245 ).
- each object (e.g., vehicle, pedestrian, etc.) is represented by a cluster, so the processor 130 does not need to track each detection point individually, but rather the group of points clustered together to represent the object.
- the processor 130 tracks a change in the center of the cluster and parameters (e.g., position and velocity) of the cluster. The updated center and other parameters are used in subsequent threat level determinations.
- the processor 130 determines a threat level and response commands based upon the kinematic state data for the object determined in Steps 235 and 245 (i.e., the center of mass, reference velocity, reference angular velocity and contour), the object type determined in Step 240 , and the object size (i.e., the number of points in the point cloud corresponding to the object). (Step 250 ).
- the response commands may include generating a command for a vehicle control system 150 to avoid an object, to minimize an impact with the object, or to generate a warning, or the like.
- the command may be, for example, to brake, accelerate, steer, flash lights, or the like, based upon the cluster velocity and the cluster proximity.
- the 6D point cloud system 110 can accurately identify and track objects in an urban environment using a small sensor array.
Description
- The technical field generally relates to vehicles, and more particularly relates to systems and methods for operating a radar system in a vehicle.
- Vehicles often utilize various sensors and systems to detect the environment around the vehicle, either to navigate the vehicle, in the case of a self-driving vehicle, or to aid a driver in navigating through a driving environment. In general, most current sensors and systems are more effective in predictable environments, such as driving along a highway. In contrast, urban environments, such as city driving, are more difficult to navigate, as there are more variables that the vehicle must consider.
- Accordingly, it is desirable to include a sensor system in a vehicle capable of accurately identifying objects and tracking objects in both urban and highway environments. Furthermore, other desirable features and characteristics of the present invention will become apparent from the subsequent detailed description of the invention and the appended claims, taken in conjunction with the accompanying drawings and the background of the invention.
- In one embodiment, for example, a six-dimensional point cloud system for a vehicle is provided. The system may include, but is not limited to, a multiple input multiple output radar system, a memory, and a processor communicatively connected to the multiple input multiple output radar system and the memory, the processor configured to receive, from the multiple input multiple output radar system, a current data frame comprising a point cloud, the point cloud including three-dimensional position information and Doppler data corresponding to an object detected by the multiple input multiple output radar system at each point in the point cloud, calculate, for each point in the point cloud, a three-dimensional velocity of the respective point in the point cloud based upon the three-dimensional position information and Doppler data associated with the point in the current data frame and data from a previous data frame stored in the memory, segment each point in the point cloud into one or more clusters of points based upon the three-dimensional position information associated with each respective point and the calculated three-dimensional velocity of each respective point, compute a kinematic state of each of the one or more clusters of points, the kinematic state including a center of mass for the respective cluster of points, a reference velocity for the respective cluster of points, an angular velocity for the respective cluster of points, and contour points for the respective cluster of points, determine an object type associated with each of the one or more clusters, and determine a threat level and a response command based upon the determined object type and the kinematic state of each of the one or more clusters of points.
- In another embodiment, for example, a method for operating a six-dimensional point cloud system for a vehicle is provided. The method may include, but is not limited to, receiving, by a processor from a multiple input multiple output radar system, a current data frame comprising a point cloud, the point cloud including three-dimensional position information and Doppler data corresponding to an object detected by the multiple input multiple output radar system at each point in the point cloud, calculating, by the processor for each point in the point cloud, a three-dimensional velocity of the respective point in the point cloud based upon the three-dimensional position information and Doppler data associated with the point in the current data frame and data from a previous data frame stored in a memory, segmenting, by the processor, each point in the point cloud into one or more clusters of points based upon the three-dimensional position information associated with each respective point and the calculated three-dimensional velocity of each respective point, computing, by the processor, a kinematic state of each of the one or more clusters of points, the kinematic state including a center of mass for the respective cluster of points, a reference velocity for the respective cluster of points, an angular velocity for the respective cluster of points, and contour points for the respective cluster of points, determining, by the processor, an object type associated with each of the one or more clusters, and determining, by the processor, a threat level and a response command based upon the determined object type and the kinematic state of each of the one or more clusters of points.
- The exemplary embodiments will hereinafter be described in conjunction with the following drawing figures, wherein like numerals denote like elements, and wherein:
-
FIG. 1 is a block diagram of a vehicle, in accordance with an embodiment; -
FIG. 2 is a flow diagram illustrating an exemplary method for operating the 6D point cloud system, in accordance with an embodiment; and -
FIG. 3 illustrates data points in a current data frame and a previous data frame in an exemplary Cartesian coordinate system. - The following detailed description is merely exemplary in nature and is not intended to limit the application and uses. Furthermore, there is no intention to be bound by any expressed or implied theory presented in the preceding technical field, background, brief summary or the following detailed description.
- As discussed above, city environments are a challenging environment for a vehicle to navigate due to the large number of variable situations the vehicle can encounter. Accordingly, as discussed in further detail below, a six-dimensional point cloud system is provided which utilizes a multiple input multiple output radar system for accurately identifying and tracking objects around a vehicle in both urban and highway environments.
-
FIG. 1 is a block diagram of a vehicle 100, in accordance with an embodiment. In one embodiment, for example, the vehicle 100 may be an automobile. However, in other embodiments, the vehicle 100 may be, for example, an aircraft, a spacecraft, a watercraft or the like, or any combination thereof. - The
vehicle 100 includes a six-dimensional (6D) point cloud system 110. The 6D point cloud system 110 generates three-dimensional (3D) velocity data and 3D position data for objects to accurately identify an object near the vehicle and to accurately track the object, as discussed in further detail below. The objects may be, for example, another vehicle, a motorcycle, a bicycle, a pedestrian, a street light, a street sign, construction/warning objects (e.g., barricades, cones, drums, etc.), or the like. - The 6D
point cloud system 110 includes a multiple input multiple output (MIMO) radar system 120. The MIMO radar system 120 may include a single antenna, or multiple co-located or distributed antennas, capable of simultaneously transmitting and receiving multiple frequencies of radar signals in the same field of view. Accordingly, the MIMO radar system 120 is capable of generating a dense point cloud of data. Every time an object reflects the radar signal, a point in the point cloud is created. The number of points in the point cloud that correspond to a single object can vary depending upon the size of the object and the distance of the object from the vehicle. As discussed in further detail below, the MIMO radar system 120 outputs multiple data points for each point in the point cloud which the 6D point cloud system 110 utilizes to identify and track objects. - The 6D
point cloud system 110 further includes at least one processor 130. The at least one processor 130 may be a central processing unit (CPU), a graphics processing unit (GPU), a physics processing unit (PPU), an application specific integrated circuit (ASIC), a field programmable gate array (FPGA), a microcontroller, or any other logic device, or any combination thereof. While the 6D point cloud system 110 may utilize multiple processors 130, the at least one processor 130 is hereinafter referred to as a singular processor 130 for simplicity. - The 6D
point cloud system 110 further includes a memory 140. The memory may be any combination of volatile and non-volatile memory. The memory 140 may store frames of data from the MIMO radar system 120 for processing, as discussed in further detail below. The memory 140 may also store non-transitory computer readable instructions for implementing the 6D point cloud system 110 as discussed herein. - The
vehicle 100 further includes at least one control system 150 which can be controlled by the 6D point cloud system 110. The control system 150 may be, for example, a braking system, an acceleration system, a steering system, a lighting system, or the like. When the 6D point cloud system 110 identifies an object which may impact or come near the vehicle 100, the 6D point cloud system 110 may generate a control signal to activate the control system to generate a warning, avoid the object, or minimize an impact with the object. -
FIG. 2 is a flow diagram illustrating an exemplary method 200 for operating the 6D point cloud system 110, in accordance with an embodiment. The processor 130 receives a data frame from the MIMO radar system 120. (Step 205). The data frame includes data for each data point in the point cloud of the radar image at a point in time. The data frame includes four data points for each data point in the point cloud. The four data points include three location data points and a Doppler data point. The intensity data point may be generated based upon one or more of the four data points. For example, an intensity may be based upon a signal strength of the received location or Doppler data points. In one embodiment, for example, the radar may output the three location data points as a range, an azimuth angle, and an elevation angle relative to the location of the MIMO radar system 120. In another embodiment, for example, the radar may output the three location data points as Cartesian coordinates (i.e., X, Y and Z). - The data points may be measured in, for example, watts or decibels for intensity, meters for range, meters per second for velocity, hertz for Doppler, and degrees for angles, or the like. The Doppler data point includes a range and a range rate. In other words, the Doppler data point includes data corresponding to whether a point in the point cloud, which corresponds to an object, is getting closer to or further from the
vehicle 100, as well as the distance from the vehicle 100. - The
processor 130 then computes an estimated three-dimensional velocity for each data point in the point cloud where an object was detected by the MIMO radar system 120. (Step 210). In order to determine the estimated three-dimensional velocity for each data point in the point cloud, the processor 130 must determine which point in the point cloud corresponds to which point in the point cloud of a previous data frame. FIG. 3 illustrates data points in a current data frame and a previous data frame in an exemplary Cartesian coordinate system. In this example, the circles 300-320 represent points in the point cloud of the previous data frame and the solid circle represents the point 330 in the point cloud of the current data frame. To determine the velocity of the point 330, the processor 130 must determine which of the points 300-320 the point 330 corresponds to in the previous data frame and then use the relative distance between the determined point and point 330 to determine the velocity of the point 330. In one embodiment, for example, the processor may simultaneously determine both the corresponding point in the previous frame for each point in the current frame and the velocity of each point in the current frame according to Equation 1:
vk = argminvk Σj=1..Nk ajk·(∥sk − mj − vk·Δt∥²/σ1² + (dk − nk·vk)²/σ2²)   (Equation 1)
processor 130 determines the three-dimensional velocity vk for each point k in the point cloud of the current frame by determining the minimum velocity vk (i.e., argminvk) between the respective point in the point cloud of the current data frame and every point in the point cloud of the previous frame. In Equation 1, sk is the multidimensional location data received from the MIMO radar system 120 (i.e., the three location points discussed above) for the point k in the current data frame for which the velocity is being determined for, mj is the location data from one of the points in the point cloud of the previous frame, dk is the Doppler measurement for the point in the current frame (i.e., the Doppler measurement for sk), nk is the unit direction from the center of theMIMO radar system 120 to the point sk, σ1 and σ2 are calibration parameters, Nk is the number of radar reflection points in current time frame matches with a point mj detected in previous time frame, and σjk is the probability that mj and sk are associated. In one embodiment, for example, σjk may be calculated according to Equation 2: -
- where Ik is an intensity value for the point sk, Ij is an intensity value for the point mj, and c is a normalization factor such that a sum of all σjk is equal to 1. In other words, c is a normalization factor such that a sum of all the probabilities that any given point in the point cloud of the previous data frame corresponds to the point sk in the point cloud of the current data frame. In equations 1 and 2 Δt is the difference in time between the respective data frames and σ1, σ2 and σ3 are calibration parameters are the standard deviation of position measurement, Doppler measurement, and intensity measurement of a calibrated target (e.g., corner reflector), respectively. The
processor 130, when performing the calculations for Equations 1 and 2, may initially set the velocity vk of sk as Σj Nk ajk·vj. In other words, the velocity vk of sk is assumed to be a sum of the probability each respective point in the point cloud of the previous frame corresponds to the point sk in the current frame multiplied by the velocity of the respective point in the point cloud of the previous frame determined in the previous three-dimensional velocity calculation (i.e., vk for the point in the point cloud of the previous frame determined in a previous calculation). - Returning to
FIG. 2, the processor 130 then removes noisy resolution points from the data frame (Step 215). In one embodiment, for example, the processor 130 may remove data points from subsequent calculations based upon temporal persistency. In other words, points in a current point cloud which could not be correlated with points in a previous point cloud are not considered in any subsequent calculations. - The
processor 130 then classifies each point in the point cloud of the current frame as stationary or dynamic. (Step 220). A point in a point cloud is assumed to be stationary when an estimated relative velocity for the point is below a predetermined threshold. The threshold may be, for example, ten decibels (dB) above a noise level. The processor 130 determines the relative velocity of each point in the point cloud of the current frame by comparing the velocity of the point to the vehicle speed after compensating the velocity vk for the point (determined in Step 210) with the yaw, pitch and roll of the vehicle and compensating the position of the point (i.e., the three location points from the MIMO radar system 120) based upon a position of the MIMO radar system 120 as well as the yaw, pitch and roll of the vehicle. In one embodiment, for example, the processor may determine a classification for each point in the point cloud based upon Equation 3:
∥R·vk + (ωH·R·sk + VH)∥ < ε   (Equation 3)
vehicle 100, sk is the position of the point in the current data frame, VH Is the speed of thevehicle 100 and R is a rotational matrix that compensates the velocity vk and position sk for the yaw, pitch and roll of thevehicle 100. In one embodiment, for example, the rotational matrix R may be calculated according to Equation 4: -
R = [ cψcθ   cψsθsΦ − sψcΦ   cψsθcΦ + sψsΦ ; sψcθ   sψsθsΦ + cψcΦ   sψsθcΦ − cψsΦ ; −sθ   cθsΦ   cθcΦ ]   (Equation 4)
vehicle 100, θ is a pitch of thevehicle 100, Φ is a roll of the vehicle, c represents a cosine function (i.e., cθ=cos(θ)) and s represents a sin function (i.e., sΦ=sin(Φ)). - In equation 3 ε represents a predetermined stationary speed. In one embodiment, for example, ε may be less than 0.5 kilometers per hour (KPH). Accordingly, if the norm of the sum of the velocities represented on the left side of Equation 3 is less than ε, the
processor 130 classifies the point in the point cloud as stationary. However, if the norm of the sum of the velocities represented on the left side of Equation 3 is greater than ε, the processor classifies the point in the point cloud as dynamic. The processor 130, in Step 220, performs this classification for every point in the point cloud of the current data frame. - In one embodiment, for example, the
processor 130 may analyze the points in the point cloud classified as stationary to correct a radar angle alignment. (Step 225). In other words, the processor 130 may analyze the points in the point cloud classified as stationary to determine whether the angle of the MIMO radar system 120 has become misaligned from an intended angle. The misalignment could be from an impact to the vehicle 100 or from installation tolerances during installation of the MIMO radar system 120. - Consider a stationary radar point sk; then Rvk=uk, where uk=−(ωH×pk+vH), for k=1, . . . , K. Let α=[v1 . . . vK] and β=[u1 . . . uK]. The matrices α and β may include radar points from current and past time frames, for example, within one hour. The unknown rotation matrix R of the radar with respect to the vehicle frame is equal to UCVT, where C is the matrix:
C = [ 1 0 0 ; 0 1 0 ; 0 0 det(UVT) ]
- The det(UVT) is the determinant of the matrix UVT. U is the left orthogonal matrix and V is the right orthogonal matrix of the singular value decomposition (SVD) of the 3×3 matrix βαT.
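This is the classical orthogonal Procrustes construction for a best-fit rotation. A brief numerical sketch with synthetic data (the function name is illustrative, and the 3×3 product βαT is used so that U and V are 3×3 orthogonal matrices):

```python
import numpy as np

def recover_rotation(alpha, beta):
    """Given paired 3xK matrices alpha = [v1 ... vK] and beta = [u1 ... uK]
    with u_k ~= R v_k, return R = U C V^T, where U and V come from the SVD
    of the 3x3 matrix beta @ alpha.T and C = diag(1, 1, det(U V^T)) forces
    a proper rotation (determinant +1)."""
    U, _, Vt = np.linalg.svd(beta @ alpha.T)
    C = np.diag([1.0, 1.0, np.linalg.det(U @ Vt)])
    return U @ C @ Vt

# Synthetic check: a known rotation about the z-axis is recovered.
angle = 0.1
R_true = np.array([[np.cos(angle), -np.sin(angle), 0.0],
                   [np.sin(angle), np.cos(angle), 0.0],
                   [0.0, 0.0, 1.0]])
alpha = np.random.default_rng(0).normal(size=(3, 40))   # columns v_1..v_K
beta = R_true @ alpha                                   # columns u_k = R v_k
R_est = recover_rotation(alpha, beta)
```

With noise-free pairs the rotation is recovered to machine precision; with noisy radar points the same computation yields the least-squares best-fit rotation.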
- A pitch correction θ, a roll correction Φ, and an azimuth correction ψ for the radar angle alignment may be calculated according to Equations 5, 6, and 7:
θ = −sin−1(R3,1)   (Equation 5)
Φ = tan−1(R3,2/R3,3)   (Equation 6)
ψ = tan−1(R2,1/R1,1)   (Equation 7)
- where 1, 2, and 3 refer to row and column coordinates in the matrix R. The calculated pitch correction θ, roll correction Φ, and azimuth correction ψ may then be used to correct subsequent radar data input by compensating that input by the calculated corrections.
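Assuming the yaw-pitch-roll composition of Equation 4, the angle extraction can be sketched as follows (the function name is illustrative):

```python
import numpy as np

def alignment_corrections(R):
    """Extract pitch, roll and azimuth corrections from a rotation matrix R
    whose entries follow the yaw-pitch-roll composition of Equation 4."""
    pitch = -np.arcsin(R[2, 0])              # Equation 5: R(3,1) = -sin(theta)
    roll = np.arctan2(R[2, 1], R[2, 2])      # Equation 6: R(3,2)/R(3,3) = tan(phi)
    azimuth = np.arctan2(R[1, 0], R[0, 0])   # Equation 7: R(2,1)/R(1,1) = tan(psi)
    return pitch, roll, azimuth

# Build R from known misalignment angles and check that they are recovered.
psi, theta, phi = 0.05, -0.02, 0.03          # yaw, pitch, roll in radians
c, s = np.cos, np.sin
Rz = np.array([[c(psi), -s(psi), 0], [s(psi), c(psi), 0], [0, 0, 1]])
Ry = np.array([[c(theta), 0, s(theta)], [0, 1, 0], [-s(theta), 0, c(theta)]])
Rx = np.array([[1, 0, 0], [0, c(phi), -s(phi)], [0, s(phi), c(phi)]])
pitch, roll, azimuth = alignment_corrections(Rz @ Ry @ Rx)
```

Using arctan2 rather than a plain arctangent keeps the corrections in the correct quadrant.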
- The matrices α and β need not be stored, since αβT can be computed recursively. For example, in the case of deleting a sample (v1, u1) and adding a sample (vK+1, uK+1), the new α′β′T is computed as α′β′T = αβT − v1u1T + vK+1uK+1T.
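A minimal sketch of this sliding-window recursion, checking the rank-one update against a full recomputation:

```python
import numpy as np

rng = np.random.default_rng(1)
# Columns of alpha/beta are the paired vectors v_k / u_k from the description.
alpha = rng.normal(size=(3, 5))
beta = rng.normal(size=(3, 5))
M = alpha @ beta.T                      # running 3x3 product "alpha beta^T"

# Slide the window: drop sample (v_1, u_1), add sample (v_{K+1}, u_{K+1}).
v_new = rng.normal(size=3)
u_new = rng.normal(size=3)
M = M - np.outer(alpha[:, 0], beta[:, 0]) + np.outer(v_new, u_new)

# The recursion matches a full recomputation over the updated window.
alpha2 = np.column_stack([alpha[:, 1:], v_new])
beta2 = np.column_stack([beta[:, 1:], u_new])
assert np.allclose(M, alpha2 @ beta2.T)
```

Only the 3×3 accumulator and the samples entering or leaving the window are needed, regardless of the window length.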
- The
processor 130 may then segment the points in the point cloud from the current frame into clusters. (Step 230). Any two points of the point cloud may be grouped into the same cluster according to Equation 8:
∥pk − pj∥²/σ1² + ∥uk − uj∥²/σ2² < 1   (Equation 8)
processor 130 associates the points j and k into the same cluster. As discussed above, the three dimensional velocity for a point k can be calculated according to Equation 1. The three dimensional position is determined based upon the data from theMIMO radar system 120. - The
processor 130 may then compute a kinematic state for each cluster. (Step 235). The kinematic state may include, for example, a center of mass for the cluster, a reference velocity and angular velocity for the cluster, and contour points for the cluster. The contour of the cluster corresponds to an outline of the cluster. The center of mass pcm of the cluster may be determined by the processor 130 according to Equation 9:
pcm = (1/C)·Σk=1..C pk   (Equation 9)
processor 130 may then determine the reference velocity uref and a reference angular velocity ωref according to Equation 10: -
- Where w is the radial frequency 2πf. The radial frequency is a function of target velocity and the measured frequency shift induced by target motion. The
processor 130 may determine the contour points according to pk−pcm for each point k=1, 2, . . . C in the cluster. In other words, theprocessor 130 determines a contour point for each point in the cluster corresponding to the difference in the three-dimensional position of each respective point k in the cluster and the three-dimensional position of the center of mass for the cluster. - In one embodiment, for example, the
processor 130 may use a Kalman filter on the kinematic state data (i.e., the center of mass for the cluster, the reference velocity, the reference angular velocity, and the contour points for the cluster) to track and report a smoothed kinematic state for the cluster over multiple radar data frames. One benefit of using the Kalman filter over multiple frames is that any anomalies in the kinematic state data are averaged out. - The
processor 130 may then classify an object to be associated with each cluster. (Step 240). In other words, the processor 130 determines an object type for the cluster. A cluster may be classified, for example, as a vehicle, a pedestrian, a barricade, a street light, a stop sign, or any other object a vehicle could encounter on the road. In one embodiment, for example, a deep learning neural network (DLNN) may perform the object classification. As the object classification is a complex process, a separate processor 130, such as a GPU or an array of FPGAs, may be used for this process alone. However, as discussed above, the number of processors 130 in the 6D point cloud system 110 may vary depending upon a performance level of the processors and the desired response time of the 6D point cloud system 110. - The
processor 130 may input a normalized intensity map and a time sliding window of micro-Doppler signals to the DLNN for each cluster. As discussed above, the MIMO radar system 120 may output an intensity associated with each data point. The normalized intensity map is created by dividing the intensity values by the total number of values. The time sliding window of micro-Doppler signals corresponds to a frequency in hertz received by the MIMO radar system 120 corresponding to the cluster over a period of data frames. The DLNN creates feature maps for each cluster by extracting features from the intensity map and the time sliding window of micro-Doppler signals. - The
processor 130 classifies a cluster based upon the reference velocity associated with the cluster and the contour of the cluster. Each type of target, for example, a pedestrian or a vehicle, has a typical micro-Doppler signature, and thus, when analyzing the spectrogram (i.e., a spectrum change over time corresponding to the target), the processor 130 can classify the target based upon which signature the target most closely resembles. - The
processor 130 then tracks each classified cluster in subsequent passes through the process. (Step 245). Each object (e.g., vehicle, pedestrian, etc.) has multiple detection points in the point cloud. Accordingly, the processor 130 does not need to track each detection point individually, but rather the group of points clustered together to represent an object. On each subsequent pass through the method 200, the processor 130 tracks a change in the center of the cluster and parameters (e.g., position and velocity) of the cluster. The updated center and other parameters are used in subsequent threat level determinations. - The
processor 130 then determines a threat level and response commands based upon the kinematic state data for the object determined in Steps 235 and 245 (i.e., the center of mass, reference velocity, reference angular velocity and contour), the object type determined in Step 240, and the object size (i.e., the number of points in the point cloud corresponding to the object). (Step 250). The response commands may include generating a command for a vehicle control system 150 to avoid an object, to minimize an impact with the object, to warn the object, or the like. The command may be, for example, to brake, accelerate, steer, flash lights, or the like, based upon the cluster velocity and the cluster proximity. - One benefit of the 6D
point cloud system 110 is that by generating a 6D point cloud from the four-dimensional data received from the MIMO radar system 120, the 6D point cloud system 110 can accurately identify and track objects in an urban environment using a small sensor array. - While at least one exemplary embodiment has been presented in the foregoing detailed description, it should be appreciated that a vast number of variations exist. It should also be appreciated that the exemplary embodiment or exemplary embodiments are only examples, and are not intended to limit the scope, applicability, or configuration of the disclosure in any way. Rather, the foregoing detailed description will provide those skilled in the art with a convenient road map for implementing the exemplary embodiment or exemplary embodiments. It should be understood that various changes can be made in the function and arrangement of elements without departing from the scope of the disclosure as set forth in the appended claims and the legal equivalents thereof.
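Taken together, the per-point velocity estimate (Equation 1), the stationary test (Equation 3), the clustering rule (Equation 8), and the cluster center of mass (Equation 9) can be sketched as follows. This is a simplified illustration under assumptions noted in the comments (a single association weight, ωH treated as a host angular-rate vector, hypothetical function names), not the patented implementation:

```python
import numpy as np

def point_velocity(s_k, d_k, n_k, matches, dt, sig1, sig2):
    """Closed-form minimizer of the Equation-1 cost, which is quadratic in v_k.
    `matches` holds (a_jk, m_j) pairs: association weight and previous position."""
    A, b = np.zeros((3, 3)), np.zeros(3)
    for a_jk, m_j in matches:
        A += a_jk * ((dt**2 / sig1**2) * np.eye(3) + np.outer(n_k, n_k) / sig2**2)
        b += a_jk * ((dt / sig1**2) * (s_k - m_j) + (d_k / sig2**2) * n_k)
    return np.linalg.solve(A, b)

def is_stationary(v_k, s_k, R, omega_h, v_h, eps):
    """Equation 3: host-motion-compensated speed below the threshold eps.
    omega_h is treated here as a host angular-rate vector (an assumption)."""
    return np.linalg.norm(R @ v_k + (np.cross(omega_h, R @ s_k) + v_h)) < eps

def cluster(points, vels, sig1, sig2):
    """Equation 8 applied pairwise, merged with a tiny union-find."""
    parent = list(range(len(points)))
    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i
    for j in range(len(points)):
        for k in range(j + 1, len(points)):
            d = (np.sum((points[k] - points[j]) ** 2) / sig1**2
                 + np.sum((vels[k] - vels[j]) ** 2) / sig2**2)
            if d < 1.0:
                parent[find(j)] = find(k)
    return [find(i) for i in range(len(points))]

# A point that moved by v_true between two frames dt apart:
dt, v_true = 0.05, np.array([10.0, 0.0, 0.0])
m_j = np.array([20.0, 5.0, 0.0])                 # previous-frame position
s_k = m_j + v_true * dt                          # current-frame position
n_k = s_k / np.linalg.norm(s_k)                  # unit direction from the radar
d_k = n_k @ v_true                               # Doppler (radial speed)
v_est = point_velocity(s_k, d_k, n_k, [(1.0, m_j)], dt, 0.1, 0.1)

# Stationary test with a stationary host (eps of 0.5 KPH is about 0.14 m/s):
moving = not is_stationary(v_est, s_k, np.eye(3), np.zeros(3), np.zeros(3), 0.14)

# Cluster three points and take the center of mass (Equation 9) of one cluster:
pts = [s_k, s_k + np.array([0.05, 0.0, 0.0]), s_k + np.array([50.0, 0.0, 0.0])]
vels = [v_est, v_est, -v_est]
labels = cluster(pts, vels, sig1=1.0, sig2=5.0)
p_cm = np.mean([pts[0], pts[1]], axis=0)
```

With exact synthetic measurements, the closed-form solve reproduces the true point velocity, the point is classified as dynamic, and the two nearby points fall into one cluster while the distant point forms its own.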
Claims (16)
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/597,994 US10139833B1 (en) | 2017-05-17 | 2017-05-17 | Six-dimensional point cloud system for a vehicle |
CN201810434983.8A CN108931773B (en) | 2017-05-17 | 2018-05-09 | Six-dimensional point cloud system for vehicle |
DE102018111626.8A DE102018111626A1 (en) | 2017-05-17 | 2018-05-15 | SIX-DIMENSIONAL SPOT POINT SYSTEM FOR ONE VEHICLE |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/597,994 US10139833B1 (en) | 2017-05-17 | 2017-05-17 | Six-dimensional point cloud system for a vehicle |
Publications (2)
Publication Number | Publication Date |
---|---|
US20180335787A1 true US20180335787A1 (en) | 2018-11-22 |
US10139833B1 US10139833B1 (en) | 2018-11-27 |
Family
ID=64272292
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/597,994 Active US10139833B1 (en) | 2017-05-17 | 2017-05-17 | Six-dimensional point cloud system for a vehicle |
Country Status (3)
Country | Link |
---|---|
US (1) | US10139833B1 (en) |
CN (1) | CN108931773B (en) |
DE (1) | DE102018111626A1 (en) |
Cited By (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20200292706A1 (en) * | 2019-03-14 | 2020-09-17 | Aeva, Inc. | Velocity estimation using doppler per point lidar systems |
CN112327842A (en) * | 2020-10-29 | 2021-02-05 | 深圳市普渡科技有限公司 | Method and system for positioning charging pile by robot |
CN112363500A (en) * | 2020-10-29 | 2021-02-12 | 深圳市普渡科技有限公司 | Automatic recharging moving method and system |
US20210318412A1 (en) * | 2020-04-09 | 2021-10-14 | Robert Bosch Gmbh | Apparatus and method for processing radar data and radar system |
WO2022055847A1 (en) * | 2020-09-10 | 2022-03-17 | Argo AI, LLC | Systems and methods for simultaneous range-rate unwrapping and outlier removal for radar |
US11300677B2 (en) * | 2019-07-08 | 2022-04-12 | GM Global Technology Operations LLC | Automated driving systems and control logic for host vehicle velocity estimation using wide aperture radar |
WO2022094430A1 (en) * | 2020-11-02 | 2022-05-05 | Waymo Llc | Point cloud segmentation using a coherent lidar for autonomous vehicle applications |
WO2022093513A1 (en) * | 2020-11-02 | 2022-05-05 | Argo AI, LLC | Systems and methods for range-rate dealiasing using position consistency |
CN114966673A (en) * | 2022-05-31 | 2022-08-30 | 上海海拉电子有限公司 | Radar-based trailer detection method and system and vehicle |
US11585896B2 (en) * | 2019-02-04 | 2023-02-21 | Metawave Corporation | Motion-based object detection in a vehicle radar using convolutional neural network systems |
US20230085887A1 (en) * | 2020-06-15 | 2023-03-23 | Gm Cruise Holdings Llc | Systems and methods for high velocity resolution high update rate radar for autonomous vehicles |
US11656629B1 (en) | 2020-12-08 | 2023-05-23 | Waymo Llc | Detection of particulate matter in autonomous vehicle applications |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111722196A (en) * | 2019-03-19 | 2020-09-29 | 富士通株式会社 | Radar reflection point extraction method and device |
US11361554B2 (en) | 2019-10-22 | 2022-06-14 | Robert Bosch Gmbh | Performing object and activity recognition based on data from a camera and a radar sensor |
CN112068125B (en) * | 2020-08-18 | 2023-09-08 | 南京六季光电技术研究院有限公司 | Six-dimensional ground penetrating radar system |
Family Cites Families (26)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5959574A (en) * | 1993-12-21 | 1999-09-28 | Colorado State University Research Foundation | Method and system for tracking multiple regional objects by multi-dimensional relaxation |
US7783403B2 (en) * | 1994-05-23 | 2010-08-24 | Automotive Technologies International, Inc. | System and method for preventing vehicular accidents |
AU2001290533A1 (en) * | 2000-08-16 | 2002-02-25 | Raytheon Company | Automotive radar systems and techniques |
TW536637B (en) * | 2001-11-20 | 2003-06-11 | Ching-Fang Lin | Improved positioning and data integrating method and system thereof |
EP2306433A1 (en) * | 2009-10-05 | 2011-04-06 | Nederlandse Organisatie voor toegepast -natuurwetenschappelijk onderzoek TNO | Collision avoidance system and method for a road vehicle and respective computer program product |
US8565481B1 (en) * | 2011-05-26 | 2013-10-22 | Google Inc. | System and method for tracking objects |
DE102011113015A1 (en) * | 2011-09-09 | 2013-03-14 | Astyx Gmbh | Imaging radar sensor with synthetic magnification of the antenna taper and two-dimensional beam sweep |
US8724906B2 (en) * | 2011-11-18 | 2014-05-13 | Microsoft Corporation | Computing pose and/or shape of modifiable entities |
CN102722886B (en) * | 2012-05-21 | 2015-12-09 | 浙江捷尚视觉科技股份有限公司 | A kind of video frequency speed-measuring method based on three-dimensional scaling and Feature Points Matching |
US9250324B2 (en) * | 2013-05-23 | 2016-02-02 | GM Global Technology Operations LLC | Probabilistic target selection and threat assessment method and application to intersection collision alert system |
CN103576137B (en) * | 2013-09-27 | 2015-05-27 | 电子科技大学 | Multi-sensor multi-target location method based on imaging strategies |
US9820233B2 (en) * | 2013-11-25 | 2017-11-14 | Qualcomm Incorporated | Motion state based mobile device positioning |
US9255988B2 (en) * | 2014-01-16 | 2016-02-09 | GM Global Technology Operations LLC | Object fusion system of multiple radar imaging sensors |
US10185030B2 (en) * | 2014-09-05 | 2019-01-22 | GM Global Technology Operations LLC | Object boundary detection for automotive radar imaging |
US10088561B2 (en) * | 2014-09-19 | 2018-10-02 | GM Global Technology Operations LLC | Detection of a distributed radar target based on an auxiliary sensor |
US10539669B2 (en) * | 2014-10-08 | 2020-01-21 | Texas Instruments Incorporated | Three dimensional (3D) tracking of objects in a radar system |
CN104267728B (en) * | 2014-10-16 | 2016-09-14 | 哈尔滨工业大学 | A kind of moving robot obstacle avoiding method based on range coverage centroid vector |
CN104573646B (en) * | 2014-12-29 | 2017-12-12 | 长安大学 | Chinese herbaceous peony pedestrian detection method and system based on laser radar and binocular camera |
US10088549B2 (en) * | 2015-06-25 | 2018-10-02 | Appropolis Inc. | System and a method for tracking mobile objects using cameras and tag devices |
CN105222772B (en) * | 2015-09-17 | 2018-03-16 | 泉州装备制造研究所 | A kind of high-precision motion track detection system based on Multi-source Information Fusion |
JP6650344B2 (en) * | 2015-10-02 | 2020-02-19 | パナソニック株式会社 | Object detection device and object detection method |
CN105809668B (en) * | 2016-01-15 | 2018-10-02 | 武汉武大卓越科技有限责任公司 | The body surface deformation behaviour extracting method of three-dimensional point cloud is scanned based on line |
CN105913489B (en) * | 2016-04-19 | 2019-04-23 | 东北大学 | A kind of indoor three-dimensional scenic reconstructing method using plane characteristic |
CN106096516A (en) * | 2016-06-01 | 2016-11-09 | 常州漫道罗孚特网络科技有限公司 | The method and device that a kind of objective is followed the tracks of |
CN106384359B (en) * | 2016-09-23 | 2019-06-25 | 青岛海信电器股份有限公司 | Motion target tracking method and TV |
CN106570454B (en) * | 2016-10-10 | 2019-06-11 | 同济大学 | Pedestrian traffic parameter extracting method based on mobile laser scanning |
2017
- 2017-05-17 US US15/597,994 patent/US10139833B1/en active Active
2018
- 2018-05-09 CN CN201810434983.8A patent/CN108931773B/en active Active
- 2018-05-15 DE DE102018111626.8A patent/DE102018111626A1/en active Pending
Cited By (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11585896B2 (en) * | 2019-02-04 | 2023-02-21 | Metawave Corporation | Motion-based object detection in a vehicle radar using convolutional neural network systems |
US11828848B2 (en) * | 2019-03-14 | 2023-11-28 | Aeva, Inc. | Velocity estimation using doppler per point LiDAR systems |
US20200292706A1 (en) * | 2019-03-14 | 2020-09-17 | Aeva, Inc. | Velocity estimation using doppler per point lidar systems |
US11300677B2 (en) * | 2019-07-08 | 2022-04-12 | GM Global Technology Operations LLC | Automated driving systems and control logic for host vehicle velocity estimation using wide aperture radar |
US20210318412A1 (en) * | 2020-04-09 | 2021-10-14 | Robert Bosch Gmbh | Apparatus and method for processing radar data and radar system |
US20230085887A1 (en) * | 2020-06-15 | 2023-03-23 | Gm Cruise Holdings Llc | Systems and methods for high velocity resolution high update rate radar for autonomous vehicles |
WO2022055847A1 (en) * | 2020-09-10 | 2022-03-17 | Argo AI, LLC | Systems and methods for simultaneous range-rate unwrapping and outlier removal for radar |
US11448748B2 (en) | 2020-09-10 | 2022-09-20 | Argo AI, LLC | Systems and methods for simultaneous range-rate unwrapping and outlier removal for radar |
CN112363500A (en) * | 2020-10-29 | 2021-02-12 | 深圳市普渡科技有限公司 | Automatic recharging moving method and system |
CN112327842A (en) * | 2020-10-29 | 2021-02-05 | 深圳市普渡科技有限公司 | Method and system for positioning charging pile by robot |
US20220137227A1 (en) * | 2020-11-02 | 2022-05-05 | Waymo Llc | Point cloud segmentation using a coherent lidar for autonomous vehicle applications |
WO2022093513A1 (en) * | 2020-11-02 | 2022-05-05 | Argo AI, LLC | Systems and methods for range-rate dealiasing using position consistency |
WO2022094430A1 (en) * | 2020-11-02 | 2022-05-05 | Waymo Llc | Point cloud segmentation using a coherent lidar for autonomous vehicle applications |
US11662454B2 (en) | 2020-11-02 | 2023-05-30 | Ford Global Technologies, Llc | Systems and methods for range-rate dealiasing using position consistency |
US11841439B2 (en) * | 2020-11-02 | 2023-12-12 | Waymo Llc | Point cloud segmentation using a coherent lidar for autonomous vehicle applications |
US11656629B1 (en) | 2020-12-08 | 2023-05-23 | Waymo Llc | Detection of particulate matter in autonomous vehicle applications |
CN114966673A (en) * | 2022-05-31 | 2022-08-30 | 上海海拉电子有限公司 | Radar-based trailer detection method and system and vehicle |
Also Published As
Publication number | Publication date |
---|---|
CN108931773B (en) | 2023-01-13 |
US10139833B1 (en) | 2018-11-27 |
DE102018111626A1 (en) | 2018-11-22 |
CN108931773A (en) | 2018-12-04 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10139833B1 (en) | Six-dimensional point cloud system for a vehicle | |
JP7394582B2 (en) | Apparatus and method for processing radar data | |
US9599706B2 (en) | Fusion method for cross traffic application using radars and camera | |
CN104943694B (en) | For the system and method for the misalignment for determining and compensating for sensor | |
US9784829B2 (en) | Wheel detection and its application in object tracking and sensor registration | |
EP3415948B1 (en) | A method of determining the de-aliased range rate of a target | |
US11131766B2 (en) | Method for the recognition of an object | |
US9255988B2 (en) | Object fusion system of multiple radar imaging sensors | |
CN107144839B (en) | Detecting long objects by sensor fusion | |
US6687577B2 (en) | Simple classification scheme for vehicle/pole/pedestrian detection | |
US20140347207A1 (en) | Probabilistic target selection and threat assessment method and application to intersection collision alert system | |
US20150336575A1 (en) | Collision avoidance with static targets in narrow spaces | |
JP2017067756A (en) | Object detection device and object detection method | |
CN107103275B (en) | Wheel-based vehicle detection and tracking using radar and vision | |
US20220128995A1 (en) | Velocity estimation and object tracking for autonomous vehicle applications | |
US20220003860A1 (en) | Determining the orientation of objects using radar or through the use of electromagnetic interrogation radiation | |
US11709260B2 (en) | Data driven resolution function derivation | |
CN114518574A (en) | Kurtosis-based pruning for sensor fusion systems | |
US11269059B2 (en) | Locating and/or classifying objects based on radar data, with improved reliability at different distances | |
US20230034560A1 (en) | Method for tracking a remote target vehicle in an area surrounding a motor vehicle by means of a collision detection device | |
US20050004719A1 (en) | Device and method for determining the position of objects in the surroundings of a motor vehicle | |
US11035943B2 (en) | Radar based tracking of slow moving objects | |
US11667295B2 (en) | Apparatus and method for recognizing object | |
Xie et al. | On-bicycle vehicle tracking at traffic intersections using inexpensive low-density lidar | |
RU2814813C1 (en) | Device and method for tracking objects |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: GM GLOBAL TECHNOLOGY OPERATIONS LLC, MICHIGAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:ZENG, SHUQING;BILIK, IGIL;SIGNING DATES FROM 20170425 TO 20170430;REEL/FRAME:042489/0755 |
AS | Assignment |
Owner name: GM GLOBAL TECHNOLOGY OPERATIONS LLC, MICHIGAN Free format text: CORRECTIVE ASSIGNMENT TO CORRECT THE THE SPELLING OF ASSIGNOR'S NAME SHOULD BE CORRECTED TO IGAL BILIK . PREVIOUSLY RECORDED AT REEL: 42489 FRAME: 0755. ASSIGNOR(S) HEREBY CONFIRMS THE ASSIGNMENT;ASSIGNORS:ZENG, SHUQING;BILIK, IGAL;SIGNING DATES FROM 20170425 TO 20170725;REEL/FRAME:043358/0638 |
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 4 |