CN109870692A - A kind of radar viewing system and data processing method - Google Patents
- Publication number: CN109870692A (application number CN201910301738.4A)
- Authority
- CN
- China
- Prior art keywords
- radar
- data
- processing unit
- central processing
- target
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Landscapes
- Radar Systems Or Details Thereof (AREA)
Abstract
The invention discloses a radar viewing system and data processing method. The system comprises a central processing unit and several radar front ends connected to it by a high-speed bus and a low-speed bus. The method comprises environmental data collection; output of target raw data points; coordinate system conversion; raw-data-level fusion; and data splicing. Each radar front end of the present invention outputs the unprocessed original signal, retaining complete environmental information to the greatest extent, which facilitates extracting the rich and highly variable environmental information around the vehicle body. Not only can object-level or higher-level data fusion be used, but cluster-level fusion can also be performed, which better supports comprehensive characterization of the environment and target recognition and tracking, enriching the application scenarios. A single central processing unit controls multiple radar front ends simultaneously, achieving unified and efficient management and configuration, improving data transmission efficiency and system flexibility, while greatly reducing hardware cost.
Description
Technical field
The present invention relates to a radar viewing system and data processing method, and belongs to the field of vehicle safety monitoring.
Background technique
In recent years, the rise of unmanned driving and advanced driver assistance systems (ADAS) has provided more options and possibilities for people's increasingly diversified travel modes. The complex and changeable road environment places higher demands on a vehicle's environment-sensing capability. In addition, factors such as the system's high real-time requirements, cost constraints on system deployment, and mandatory traffic regulations for ADAS functions pose many challenges to the validity and reliability of sensor solutions. Among these, multi-sensor fusion has attracted wide attention for its flexibility, reliability, and the complementary advantages between sensors, and is an inevitable trend in the future development of environment-aware systems.
The current mainstream sensor integration scheme is as follows: multiple independent sensor units (such as radars and cameras) are deployed around the host vehicle body; the host vehicle's central controller receives the output data of each sensor unit, performs coordinate system conversion on each sensor's data, and, after unifying them into the vehicle coordinate system, integrates all sensor data characterizing the same target with a fusion algorithm to obtain a consistent conclusion about the target threat. However, such systems may have the following problems:
1) In the legacy system, each independent sensor unit contains a complete radio-frequency front end and signal processing chip, so its output is object data (tracked target data). Object data are produced by signal processing on top of the raw data and are not raw data themselves; they therefore carry far less environment-characterizing information than the raw data, which limits some system functions and may cause them to fail in certain scenarios.
2) Because the multiple independent sensor units in the legacy system output object data points, fusion is generally performed at the object (tracked-target) level or higher; cluster-level (signal-level) fusion is generally not possible.
3) The multiple independent sensor units in the legacy system are essentially isolated from one another, which hinders unified and efficient management and configuration of all sensor units, lowers data transmission efficiency, reduces system flexibility, and greatly increases hardware cost.
In view of this, the present inventors studied the problem and specifically developed a radar viewing system and data processing method, from which this application arises.
Summary of the invention
The object of the present invention is to provide a radar viewing system and data processing method.
To achieve the above goal, the solution of the invention is as follows:
A radar viewing system comprises a central processing unit and several radar front ends connected to the central processing unit by a high-speed bus and a low-speed bus.
Preferably, the central processing unit includes a processing module and an interface conversion module. The processing module communicates with each radar front end over the low-speed bus on one side, and on the other side receives the original signals collected by each radar front end through the high-speed bus and the interface conversion module.
Preferably, the processing module is controlled by a DSP; the low-speed bus is a CAN bus, and the high-speed bus is a coaxial cable or Ethernet.
Preferably, the interface conversion module is a coaxial-to-MIPI module, which converts the coaxial signal uploaded by each radar front end into a MIPI signal that the processing module can recognize and process.
Preferably, the radar front end is equipped with an analog-to-digital converter (ADC); the radar front end outputs the original signal through the ADC, and the original signal is input to the processing module for data processing via the coaxial cable and the coaxial-to-MIPI module.
Preferably, the radar front end includes a millimeter-wave radar front end.
Preferably, the central processing unit further includes an output module, which provides a CAN interface and an SPI/I2C interface.
A data processing method of the radar viewing system includes the following steps:
collecting vehicle environmental data through multiple radar front ends, and inputting the environmental-data original signals to the central processing unit;
the central processing unit performs preliminary processing on the original signals to obtain several raw data points containing environment and target information, each raw data point including multiple attribute values;
the coordinate systems of the raw data points are uniformly converted into the vehicle coordinate system;
the central processing unit successively traverses the raw data points lying in the overlapping regions of the radar front ends' detection ranges, performs cluster association on raw data points with similar characteristic information, applies fusion processing to such raw data points, and superposes them to construct one fused new data point;
with the host vehicle as the center, the central processing unit successively splices the boundaries of the raw-data-point regions corresponding to each radar front end, and the raw data in the overlapping detection regions are computed using the fused new data points, thereby constructing a radar-based environment and target map of the area surrounding the vehicle body.
Preferably, the data processing method further includes: for the target map generated after data splicing, first aggregating the raw data points that characterize the same target in the target map with a clustering algorithm, and then performing cluster tracking of the target based on Kalman filtering.
Preferably, the attribute values include, but are not limited to, the target's speed, distance, azimuth, and radar cross section.
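The attribute set of a raw data point ("cluster point") can be sketched as a simple record; the field names below are illustrative and not taken from the patent, which only lists the attributes themselves (speed, distance, azimuth, radar cross section):

```python
from dataclasses import dataclass

@dataclass
class RawDataPoint:
    """One detection-level 'cluster point' output by a radar front end.

    Field names are hypothetical; they follow the attributes the text
    lists: radial speed, distance, azimuth, and radar cross section.
    """
    speed_mps: float      # radial velocity of the target, m/s
    range_m: float        # distance from the radar, m
    azimuth_deg: float    # azimuth angle in the radar's own frame, degrees
    rcs_dbsm: float       # radar cross section, dBsm

point = RawDataPoint(speed_mps=2.5, range_m=14.0, azimuth_deg=-12.0, rcs_dbsm=3.2)
```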
In the radar viewing system and data processing method of the present invention, each radar front end outputs the unprocessed original signal, retaining complete environmental information to the maximum extent. This makes it easier to extract the rich and highly variable environmental information around the vehicle body and minimizes misjudgments of the surrounding environment caused by insufficient data. Not only can object-level or higher-level data fusion be used, but cluster-level (signal-level) fusion can also be performed, which better supports comprehensive characterization of the environment and target recognition and tracking, enriching the application scenarios. In addition, a single central processing unit controls multiple radar front ends simultaneously, achieving unified and efficient management and configuration, improving data transmission efficiency and system flexibility, while greatly reducing hardware cost.
The invention is described in further detail below with reference to the drawings and specific embodiments.
Detailed description of the invention
Fig. 1 is the layout of the radar viewing system of the present embodiment;
Fig. 2 is the functional block diagram of the radar viewing system of the present embodiment;
Fig. 3 is the flow chart of the data processing method of the radar viewing system of the present embodiment;
Fig. 4 shows the fusion result of multiple independent sensor units in a legacy system;
Fig. 5 shows the fusion result of the radar viewing system of the present embodiment.
Specific embodiment
As shown in Figs. 1-2, a radar viewing system applied to a vehicle includes one central processing unit 1 and several radar front ends 2 arranged on the vehicle 5. The central processing unit 1 is connected to the radar front ends 2 by a low-speed bus 3 and a high-speed bus 4; the low-speed bus 3 can be a CAN bus, and the high-speed bus 4 can be a coaxial cable or Ethernet. In the present embodiment, the high-speed bus 4 is a coaxial cable. The central processing unit 1 includes a processing module 11 and an interface conversion module 12. The interface conversion module 12 performs signal conversion; specifically, it is a coaxial-to-MIPI module that converts the coaxial signal uploaded by each radar front end 2 into a MIPI signal that the processing module 11 can recognize and process. The processing module 11 communicates with each radar front end 2 over the CAN bus on one side, and on the other side receives the original signals collected by each radar front end 2 through the coaxial cable and the coaxial-to-MIPI module.
In addition to the processing module 11 and the interface conversion module 12, the central processing unit 1 further includes a CAN module 13 and an output module 14. The processing module 11 is controlled by a high-performance DSP. The CAN module 13 handles CAN bus transmission. The output module 14 reads the DSP processing results on one side and provides rich external interfaces on the other: a CAN interface that can be used for communication between the device and a display canvas, conveniently outputting processing results for graphical display; an SPI/I2C interface for power-management IC control; and a four-channel Serial Peripheral Interface (SPI) that can be used to download code directly from serial flash.
The radar front end 2 is equipped with an analog-to-digital converter (ADC); the radar front end 2 outputs the original signal through the ADC, and the original signal is input to the processing module 11 for data processing via the coaxial cable and the coaxial-to-MIPI module. Each radar front end 2 operates in the 24 GHz or 76-81 GHz band and uses a multiple-input multiple-output (MIMO) architecture; by exploiting the MIMO radar's ability to construct a virtual array, the receiver aperture is enlarged, improving azimuth angle resolution. In the present embodiment, each radar front end 2 is externally grounded and powered at 12 V, and each radar front end 2 is connected to the processing module 11 by the CAN bus and the coaxial cable. The high-speed bus (coaxial cable, Ethernet, etc.) transmits the ADC sample data output by each radar front end 2 to the central processing unit 1. Compared with a traditional millimeter-wave radar, this provides higher communication channel bandwidth, accelerates raw data transmission, and increases the number of samples available to the processing module 11, yielding a detailed description of the environment around the vehicle body. The low-speed bus (CAN bus) is used to set the parameters of each radar front end 2 (transmit/receive carrier waveform, etc.) and to allocate the working time of each radar front end 2, so that resources are used rationally in different scenarios.
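The aperture-enlarging effect of the MIMO virtual array mentioned above can be sketched numerically. This is a generic MIMO property, not a layout disclosed in the patent: with Nt transmit and Nr receive antennas, the virtual array has Nt*Nr elements at the pairwise position sums; the spacings below (half-wavelength RX, Nr*lambda/2 TX) are illustrative assumptions that yield a filled linear array.

```python
# Sketch: a MIMO radar with n_tx TX and n_rx RX antennas synthesizes
# n_tx * n_rx virtual receive elements at positions tx_i + rx_j,
# enlarging the effective aperture and hence azimuth resolution.
WAVELENGTH = 3e8 / 77e9  # ~3.9 mm carrier wavelength in the 76-81 GHz band

def virtual_array(n_tx, n_rx, wavelength=WAVELENGTH):
    d_rx = wavelength / 2           # RX element spacing (half wavelength)
    d_tx = n_rx * d_rx              # TX spacing chosen so the virtual array is filled
    tx = [i * d_tx for i in range(n_tx)]
    rx = [j * d_rx for j in range(n_rx)]
    return sorted(t + r for t in tx for r in rx)

elems = virtual_array(3, 4)         # 3 TX x 4 RX -> 12 virtual elements
aperture = elems[-1] - elems[0]     # 11 half-wavelengths instead of 3
```

With only the 4 physical RX elements the aperture would span 3 half-wavelengths; the virtual array spans 11, which is the resolution gain the text refers to.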
The central processing unit 1 is responsible for controlling the peripheral modules, acquiring signals, storing data, processing signals, and displaying results. Signal processing applies a sequence of algorithms to the ADC output signals of each radar front end 2 held in the buffer of the processing module 11, including Fourier transforms (FFT) along the range, velocity, and angle dimensions, to obtain the key information of each target point, such as distance, speed, azimuth, and radar cross section (RCS).
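The first stage of that FFT chain, the range FFT over one chirp's fast-time ADC samples, can be sketched as follows (velocity and angle FFTs follow the same pattern over chirps and virtual channels). The FMCW parameters and the simulated single target are illustrative assumptions, not values from the patent:

```python
import numpy as np

# Illustrative FMCW parameters (assumed, not from the patent).
fs = 10e6            # ADC sample rate, Hz
n_samples = 256      # fast-time samples per chirp
slope = 30e12        # chirp slope, Hz/s
c = 3e8              # speed of light, m/s

# Simulate the beat signal of a single point target at 20 m.
target_range = 20.0
f_beat = 2 * slope * target_range / c        # beat frequency for that range
t = np.arange(n_samples) / fs
adc = np.cos(2 * np.pi * f_beat * t)

# Range FFT: the peak bin index maps back to the target distance.
spectrum = np.abs(np.fft.rfft(adc * np.hanning(n_samples)))
peak_bin = int(np.argmax(spectrum[1:])) + 1  # skip the DC bin
bin_hz = fs / n_samples
est_range = peak_bin * bin_hz * c / (2 * slope)
```

The estimate lands within one range bin (here about 0.2 m) of the true 20 m; a Doppler FFT across chirps and an angle FFT across virtual-array channels would add the speed and azimuth attributes the text lists.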
The data processing method based on the above radar viewing system, as shown in Fig. 3, includes the following steps:
S101, environmental data collection:
Environmental data of the vehicle 5 are collected by the multiple radar front ends 2, and the corresponding environmental-data original signals are output through the radar front ends' ADCs to the central processing unit 1.
Specifically: the radar front ends in the active working mode transmit and receive radar detection signals; the echo signals containing target and environment information are output after sampling by each front end's ADC, and each front end's ADC output signal is synchronously transmitted directly over the high-speed coaxial cable to the buffer of the processing module. Once the buffers corresponding to all radar front ends contain valid data within one detection cycle, the signal processing unit of the processing module 11 can carry out the subsequent algorithm processing.
S102, output of target raw data points:
The processing module 11 of the central processing unit 1 performs preliminary processing on the above original signals to obtain several raw data points (cluster points) containing environment and target information; each raw data point includes multiple attribute values.
In the central processing unit 1, the ADC output signals of each radar front end in the buffer are processed in sequence, including Fourier transforms (FFT) along the range, velocity, and angle dimensions, to obtain the key information of each target point, such as distance, speed, azimuth, and radar cross section (RCS). The output is thus a set of cluster points containing environment and target information, each cluster point including attribute values such as target speed, distance, azimuth, and RCS.
S103, coordinate system conversion:
The coordinate systems of the raw data points are uniformly converted into the vehicle coordinate system.
The raw data points containing target and environment information are expressed in radar coordinate systems, so each radar front end's raw data points are based on its own radar coordinate system. These are not unified and must be converted into the vehicle coordinate system by a geometric operation.
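The geometric operation for one point can be sketched as a polar-to-Cartesian conversion followed by a rigid transform using the radar's mounting pose on the vehicle. The mounting values in the example are hypothetical; the patent does not specify sensor positions:

```python
import math

def radar_to_vehicle(range_m, azimuth_rad, mount_x, mount_y, mount_yaw_rad):
    """Convert a polar detection in a radar's own frame to Cartesian
    vehicle coordinates. (mount_x, mount_y) is the sensor position on
    the vehicle and mount_yaw_rad its boresight rotation relative to
    the vehicle axis; these are illustrative inputs."""
    # Polar -> Cartesian in the radar frame.
    xr = range_m * math.cos(azimuth_rad)
    yr = range_m * math.sin(azimuth_rad)
    # Rotate by the mounting yaw, then translate to the vehicle origin.
    xv = mount_x + xr * math.cos(mount_yaw_rad) - yr * math.sin(mount_yaw_rad)
    yv = mount_y + xr * math.sin(mount_yaw_rad) + yr * math.cos(mount_yaw_rad)
    return xv, yv

# A front-left radar mounted 3.5 m ahead, 0.8 m left, rotated 45 degrees.
x, y = radar_to_vehicle(10.0, 0.0, 3.5, 0.8, math.radians(45))
```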
S104, raw-data-level fusion:
The central processing unit 1 successively traverses the raw data points lying in the overlapping regions of the radar front ends' detection ranges, performs cluster association on raw data points with similar characteristic information, applies fusion processing to such raw data points, and superposes them to construct one fused new data point.
Specifically: after coordinate conversion, the central processing unit 1 successively traverses the raw data points in the overlapping regions of the front ends' detection ranges and performs cluster association on raw data points with similar characteristic information. These raw data points require processing such as a Bayesian fusion algorithm: in an overlapping detection region there are two or more raw data points characterizing the same target point, and the Bayesian algorithm assigns each raw data point a weight; the points are then superposed to construct one fused new data point.
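The weighting step can be sketched with inverse-variance weighting, a common Bayesian fusion rule for combining measurements of the same quantity. This is only a stand-in for the unspecified "Bayesian fusion algorithm" in the text; the patent does not fix a concrete formula, and the variances below are illustrative:

```python
def fuse_points(points):
    """Fuse overlapping raw data points that characterize the same target.

    Sketch of the weighting idea only: each point carries a measurement
    variance, and weights are the normalized inverse variances. Each
    point is a (range_m, variance) pair; other attributes (azimuth,
    speed) could be fused the same way.
    """
    weights = [1.0 / var for _, var in points]
    total = sum(weights)
    fused = sum(w * r for w, (r, _) in zip(weights, points)) / total
    fused_var = 1.0 / total          # fused estimate is tighter than either input
    return fused, fused_var

# Two front ends see the same target at slightly different ranges.
fused_range, fused_var = fuse_points([(10.2, 0.04), (9.9, 0.08)])
```

The more confident measurement (variance 0.04) pulls the fused range toward itself, and the fused variance is smaller than either input's, which is the point of superposing the overlapping detections.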
S105, data splicing:
The raw data points corresponding to each radar front end 2 output by the central processing unit 1 characterize the environment region that front end covers. Therefore, with the host vehicle as the center, the central processing unit 1 successively splices the boundaries of the raw-data-point regions corresponding to each radar front end, and the raw data in the overlapping detection regions are computed using the fused new data points, thereby constructing a radar-based environment and target map of the area surrounding the vehicle body.
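The splicing step can be sketched as follows, under the simplifying assumption that overlap membership is already known from the fusion step (the patent does not specify a data layout): non-overlap points from each front end are kept, and overlap points are represented by their fused replacements.

```python
def splice(per_radar_points, overlap_fused):
    """Stitch per-front-end point sets into one surround map (sketch).

    per_radar_points: list of (points, overlap_idx) pairs, where
    overlap_idx marks which indices fell in an overlap region.
    overlap_fused: the fused replacement points for those regions.
    """
    surround_map = []
    for points, overlap_idx in per_radar_points:
        surround_map.extend(p for i, p in enumerate(points) if i not in overlap_idx)
    surround_map.extend(overlap_fused)
    return surround_map

front = ([(12.0, 1.0), (8.0, -2.0)], {1})   # index 1 lies in an overlap zone
left = ([(8.1, -2.1), (3.0, 4.0)], {0})     # index 0 is the same physical target
fused = [(8.05, -2.05)]                     # one fused point replaces both
m = splice([front, left], fused)
```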
S106, target tracking and identification:
For the target map generated after data splicing, the raw data points characterizing the same target in the target map are first aggregated with a clustering algorithm, and cluster tracking of the target is then performed based on Kalman filtering.
Specifically: for the target map generated after data splicing, a clustering algorithm such as DBSCAN can first be used to aggregate the raw data points that characterize the same target in the map, and cluster tracking of the target is then performed based on Kalman filtering. Compared with traditional single-point tracking, cluster tracking better characterizes the true motion state of the target. Based on the rich information of the cluster points, features such as target contour, speed, position, and RCS can serve as classification criteria; a classification algorithm such as a decision tree or support vector machine, or a deep neural network, can classify targets into categories such as pedestrian and vehicle, completing target identification.
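The aggregation step can be sketched with a minimal DBSCAN over 2-D map points. This is a simplified stand-in for the DBSCAN step named in the text: density-connected points receive one cluster label, and isolated points are marked noise (-1). The eps/min_pts values are illustrative:

```python
import math

def dbscan(points, eps, min_pts):
    """Minimal DBSCAN over 2-D points (illustrative parameters)."""
    labels = [None] * len(points)

    def neighbors(i):
        return [j for j, q in enumerate(points) if math.dist(points[i], q) <= eps]

    cluster = 0
    for i in range(len(points)):
        if labels[i] is not None:
            continue
        nbrs = neighbors(i)
        if len(nbrs) < min_pts:          # too sparse: mark as noise for now
            labels[i] = -1
            continue
        labels[i] = cluster              # new core point starts a cluster
        seeds = list(nbrs)
        while seeds:                     # expand the cluster density-wise
            j = seeds.pop()
            if labels[j] == -1:          # noise absorbed as a border point
                labels[j] = cluster
            if labels[j] is not None:
                continue
            labels[j] = cluster
            jn = neighbors(j)
            if len(jn) >= min_pts:       # only core points keep expanding
                seeds.extend(jn)
        cluster += 1
    return labels

pts = [(0.0, 0.0), (0.3, 0.1), (0.2, -0.2),  # one dense target
       (5.0, 5.0)]                            # an isolated noise point
labels = dbscan(pts, eps=0.5, min_pts=2)
```

The three nearby returns are aggregated into one cluster whose centroid, contour, and velocity can then feed the Kalman-based cluster tracking described above.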
In the present embodiment, what the central processing unit 1 directly processes is the original signal output by each radar front end's analog-to-digital converter (ADC), retaining complete environmental information to the maximum extent. In a traditional scheme, each radar's signal processing chain processes the original signal independently and outputs object data. Object data can be regarded as a refinement of the surrounding-environment information, especially of the key target information, and the amount of information they provide to the subsequent fusion stage is far smaller than that of the original signal. The radar viewing system described in this embodiment therefore better extracts the rich and highly variable environmental information around the vehicle body, minimizing misjudgments of the surrounding environment caused by insufficient data.
The radar viewing system described in this embodiment can be regarded as one cooperative sensor: what is deployed around the vehicle body are radar front ends rather than independent radars with complete target-detection functions. Through the CAN bus, the central processing unit 1 can conveniently and flexibly configure each radar front end 2. For example, before the vehicle is driven, the central processing unit 1 can flexibly select the combination of radar front ends to use, including how many front ends and at which positions on the vehicle body; it can also uniformly configure the radio-frequency parameters, signal processing mode, and operating mode of the radar front ends 2, and complete self-test before vehicle start-up. While the vehicle is driving, it monitors in real time all radar front ends 2 in the working state and transmits synchronized transmit/receive instructions and other control instructions. Efficient configuration comprehensively improves system flexibility, scalability, and reliability. Moreover, since each independent radar unit requires its own microprocessor chip, cost rises sharply as the number of radars arranged on the vehicle body increases; this embodiment needs only one unified central processing unit 1, effectively controlling the cost of the whole system.
The data processing method of the present invention can not only use object-level or higher-level data fusion, but can also perform cluster-level (signal-level) fusion. Because signal-level fusion extracts richer information, its result better characterizes the external environment. As shown in Fig. 4, the circles and crosses indicate the object outputs of different sensors; the data fusion result contains few data points, which is unfavorable for comprehensively characterizing the environment. As shown in Fig. 5, the circles and crosses indicate the cluster-point outputs of different sensors; the data fusion result contains many more data points, which better supports comprehensive characterization of the environment and target recognition and tracking. The application scenarios of the present invention are therefore also richer.
The following describes the data processing method of the radar viewing system for vehicle automatic parking. Based on the rich environmental information near the host vehicle output by the system, applications such as automatic parking in a parking lot can be implemented with SLAM-related algorithms. The method includes the following steps:
1. Environmental data collection
When the vehicle drives into a parking lot and enters automatic parking mode, the radar front ends in the active working mode transmit and receive radar detection signals. The echo signals containing information about the parking-lot environment and surrounding pedestrians and vehicles are output after sampling by each front end's ADC, and each front end's ADC output signal is synchronously transmitted directly over the high-speed coaxial cable to the buffer of the central processing unit. Once the buffers corresponding to all front ends contain valid data within one detection cycle, the signal processing unit of the central processing unit can carry out the subsequent algorithm processing.
2. Output of target raw data points
In the central processing unit, the ADC output signals of each radar front end in the buffer are processed in sequence, including Fourier transforms (FFT) along the range, velocity, and angle dimensions, to obtain the key information of each target point, such as distance, speed, azimuth, and radar cross section (RCS). The output is thus a set of raw data points (cluster points) containing environment and target information, each cluster point including attribute values such as target speed, distance, azimuth, and RCS.
3. Coordinate system conversion
The raw data points containing target and environment information are expressed in radar coordinate systems, so each front end's raw data points are based on its own radar coordinate system. These are not unified and must be converted into the vehicle coordinate system by a simple geometric operation.
4. Raw-data-level fusion
After coordinate conversion, the system successively traverses the raw data points in the overlapping regions of the front ends' detection ranges and performs cluster association on raw data points with similar characteristic information. These raw data points require processing such as a Bayesian fusion algorithm: in an overlapping detection region there are two or more raw data points characterizing the same target point, and the Bayesian algorithm assigns each raw data point a weight; the points are then superposed to construct one fused new data point.
5. Data splicing
The raw data points corresponding to each radar front end output by the central processing unit characterize the environment region that front end covers. Therefore, with the host vehicle as the center, the boundaries of the raw-data-point regions corresponding to each front end are successively spliced, and the raw data in the overlapping detection regions are computed using the fused new data points, thereby constructing a radar-based parking environment map of the area surrounding the vehicle body.
6. Target tracking and identification
For the target map generated after data splicing, a clustering algorithm such as DBSCAN can first aggregate the raw data points that characterize the same target in the map, and cluster tracking of the target is then performed based on Kalman filtering. The real-time tracking results are fed back and combined with the target classification results so that collisions with other objects or pedestrians are avoided during parking. In addition, based on the generated map, parking spaces are detected, a drivable region is generated, and automatic parking is completed based on a SLAM algorithm.
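The cluster tracking named above can be sketched with a one-dimensional constant-velocity Kalman filter applied to cluster-centroid positions. This is a generic sketch, not the patent's filter: the state model, noise levels q and r, and the measurement sequence are all illustrative assumptions.

```python
def kalman_track(measurements, dt=0.1, q=0.01, r=0.25):
    """Constant-velocity Kalman filter over 1-D cluster-centroid positions.

    Tracks the centroid of an aggregated cluster instead of a single
    raw point; q (process noise) and r (measurement noise) are
    illustrative.
    """
    x, v = measurements[0], 0.0          # state: position and velocity
    p = [[1.0, 0.0], [0.0, 1.0]]         # state covariance
    track = []
    for z in measurements[1:]:
        # Predict with the constant-velocity model x' = x + v*dt.
        x += v * dt
        p00 = p[0][0] + dt * (p[1][0] + p[0][1]) + dt * dt * p[1][1] + q
        p01 = p[0][1] + dt * p[1][1]
        p10 = p[1][0] + dt * p[1][1]
        p11 = p[1][1] + q
        # Update with the measured centroid position z.
        k0 = p00 / (p00 + r)             # Kalman gain for position
        k1 = p10 / (p00 + r)             # Kalman gain for velocity
        y = z - x                        # innovation
        x += k0 * y
        v += k1 * y
        p = [[(1 - k0) * p00, (1 - k0) * p01],
             [p10 - k1 * p00, p11 - k1 * p01]]
        track.append(x)
    return track

# Centroids of a target approaching at roughly 0.1 m per step.
track = kalman_track([10.0, 9.9, 9.8, 9.7, 9.6], dt=1.0)
```

The filtered positions converge onto the approaching target's true range, smoothing measurement noise while the estimated velocity captures the motion trend used for collision avoidance.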
The following describes the data processing method of the radar viewing system for vehicle local positioning. The system can detect targets such as guardrails on both sides of the road well, and its larger number of output points better describes the boundary features of the road; based on related algorithms such as Kalman filtering, it can be used for local positioning of the vehicle. The specific steps are as follows:
1. Environmental data collection
In structured road scenes such as highways, the road boundaries are relatively clear. The radar front ends transmit and receive radar detection signals; the echo signals containing target and environment information are output after sampling by each front end's ADC, and each front end's ADC output signal is synchronously transmitted directly over the high-speed coaxial cable to the buffer of the central processing unit. Once the buffers corresponding to all front ends contain valid data within one detection cycle, the signal processing unit of the central processing unit can carry out the subsequent algorithm processing.
2. Output of target raw data points
In the central processing unit, the ADC output signals of each radar front end in the buffer are processed in sequence, including Fourier transforms (FFT) along the range, velocity, and angle dimensions, to obtain the key information of each target point, such as distance, speed, azimuth, and radar cross section (RCS). The output is thus a set of raw data points (cluster points) containing environment and target information, each cluster point including attribute values such as target speed, distance, azimuth, and RCS.
3. Coordinate system conversion
The raw data points containing target and environment information are expressed in radar coordinate systems, so each front end's raw data points are based on its own radar coordinate system. These are not unified and must be converted into the vehicle coordinate system by a simple geometric operation.
4. Raw-data-level fusion
After coordinate conversion, the system successively traverses the raw data points in the overlapping regions of the front ends' detection ranges and performs cluster association on raw data points with similar characteristic information. These raw data points require processing such as a Bayesian fusion algorithm: in an overlapping detection region there are two or more raw data points characterizing the same target point, and the Bayesian algorithm assigns each raw data point a weight; the points are then superposed to construct one fused new data point.
5. Data splicing
The raw data points corresponding to each radar front end output by the central processing unit characterize the environment region that front end covers. Therefore, with the host vehicle as the center, the boundaries of the raw-data-point regions corresponding to each front end are successively spliced, and the clusters in the overlapping detection regions are computed using the fused new data points, thereby constructing a radar-based environment and target map of the area surrounding the vehicle body.
6. Target tracking and identification
For the map generated after data splicing, the static points characterizing road-boundary information in the map are screened, the actual lane line is generated by least-squares fitting, and the lane line is tracked based on related algorithms such as Kalman filtering, completing the local positioning of the vehicle and providing early warning when the vehicle departs from the lane.
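The least-squares fit of a boundary line through the screened static points can be sketched as follows. The patent does not fix the model order; a straight line y = a*x + b is assumed here, and the guardrail points are illustrative:

```python
def fit_lane_line(points):
    """Least-squares line fit y = a*x + b through static points screened
    as road-boundary returns (pure-Python normal-equation sketch; a
    straight-line model is an assumption, not fixed by the patent)."""
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    sxy = sum(x * y for x, y in points)
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)   # slope
    b = (sy - a * sx) / n                           # intercept
    return a, b

# Guardrail returns roughly 2 m to the left, parallel to the travel direction.
boundary = [(0.0, 2.0), (5.0, 2.1), (10.0, 1.9), (15.0, 2.0)]
a, b = fit_lane_line(boundary)
```

The near-zero slope and ~2 m intercept recover the boundary's lateral offset, which is the quantity a lane-departure warning would monitor.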
Those skilled in the art will readily conceive of other embodiments of the invention after considering the specification and practicing the invention disclosed herein. This application is intended to cover any variations, uses, or adaptations of the invention that follow its general principles and include common knowledge or conventional techniques in the art not disclosed herein. The specification and examples are to be considered exemplary only, with the true scope and spirit of the invention indicated by the claims.
Claims (10)
1. A radar viewing system, characterized by: comprising a central processing unit, and several radar front ends connected to the central processing unit by a high-speed bus and a low-speed bus.
2. The radar viewing system of claim 1, characterized in that: the central processing unit includes a processing module and an interface conversion module; the processing module communicates with each radar front end over the low-speed bus on one side, and on the other side receives the original signals collected by each radar front end through the high-speed bus and the interface conversion module.
3. The radar viewing system of claim 1 or 2, characterized in that: the processing module is controlled by a DSP; the low-speed bus is a CAN bus, and the high-speed bus is a coaxial cable or Ethernet.
4. The radar viewing system of claim 3, characterized in that: the interface conversion module is a coaxial-to-MIPI module.
5. The radar viewing system of claim 4, characterized in that: the radar front end is equipped with an analog-to-digital converter; the radar front end outputs the original signal through the analog-to-digital converter, and the original signal is input to the processing module for data processing via the coaxial cable and the coaxial-to-MIPI module.
6. The radar viewing system of claim 1 or 2, characterized in that: the radar front end includes a millimeter-wave radar front end.
7. The radar viewing system of claim 2, characterized in that: the central processing unit further includes an output module, the output module providing a CAN interface and an SPI/I2C interface.
8. A data processing method for a radar viewing system, comprising the following steps:
collecting vehicle environment data with a plurality of radar front ends, and inputting the original environment-data signals to a central processing unit;
the central processing unit performing preliminary processing on the original signals to obtain a number of raw data points containing environment and target information, each raw data point comprising a plurality of attribute values;
uniformly converting the coordinate system of each raw data point into the vehicle coordinate system;
the central processing unit traversing in turn the raw data points located in the overlapping regions of the detection ranges of the radar front ends, performing cluster association on raw data points with similar characteristic information, fusing those raw data points, and superimposing them to construct a fused new data point;
the central processing unit, taking the host vehicle as the center, splicing in turn the boundaries of the raw-data-point regions corresponding to the radar front ends, and computing the raw data in the overlapping regions covered by the radar front ends according to the fused new data points, thereby constructing a radar-based environment and target map of the surroundings of the vehicle body.
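The two central steps of claim 8 — converting each detection into the common vehicle coordinate system, then associating and fusing points from overlapping detection ranges — can be sketched as follows. This is a minimal illustration, not the patented implementation: the mounting parameters, the nearest-neighbour gating, the averaging fusion rule, and the `gate` threshold are all hypothetical choices made here for clarity.

```python
import math

def to_vehicle_frame(r, azimuth_deg, mount_x, mount_y, mount_yaw_deg):
    """Convert a (range, azimuth) detection from one radar front end's own
    frame into the shared vehicle coordinate system, using that radar's
    (hypothetical) mounting position and yaw on the vehicle body."""
    a = math.radians(azimuth_deg + mount_yaw_deg)
    return (mount_x + r * math.cos(a), mount_y + r * math.sin(a))

def fuse_overlap(points_a, points_b, gate=0.5):
    """Associate vehicle-frame points from two radars whose positions lie
    within `gate` metres of each other and replace each matched pair with
    one fused point (here: a simple average); unmatched points pass through."""
    fused, used_b = [], set()
    for pa in points_a:
        best, best_d = None, gate
        for j, pb in enumerate(points_b):
            if j in used_b:
                continue
            d = math.hypot(pa[0] - pb[0], pa[1] - pb[1])
            if d < best_d:
                best, best_d = j, d
        if best is not None:
            pb = points_b[best]
            used_b.add(best)
            fused.append(((pa[0] + pb[0]) / 2, (pa[1] + pb[1]) / 2))
        else:
            fused.append(pa)
    fused.extend(pb for j, pb in enumerate(points_b) if j not in used_b)
    return fused
```

Splicing the per-radar regions around the host vehicle would then amount to concatenating the non-overlapping points with the fused points produced for each overlap region.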
9. The data processing method for a radar viewing system according to claim 8, characterized in that the method further comprises: for the target map generated after data splicing, first aggregating the raw data points that characterize the same target in the target map with a clustering algorithm, and then performing cluster tracking of the targets based on Kalman filtering.
10. The data processing method for a radar viewing system according to claim 8 or 9, characterized in that the attribute values include, but are not limited to, the target's speed, distance, azimuth and radar cross section.
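A raw data point carrying the attribute values named in claim 10, together with a similarity gate of the kind that cluster association (claim 8) would apply, might look like the sketch below. The field units and the thresholds `dv`, `dr`, `daz` are hypothetical; the patent does not specify them.

```python
from dataclasses import dataclass

@dataclass
class RawDataPoint:
    """One radar detection with the attribute values of claim 10."""
    speed: float      # radial speed of the target, m/s (assumed unit)
    distance: float   # range to the target, m (assumed unit)
    azimuth: float    # azimuth angle, degrees (assumed unit)
    rcs: float        # radar cross section, dBsm (assumed unit)

def similar(a, b, dv=0.5, dr=1.0, daz=3.0):
    """Gate test for cluster association: two points are fusion candidates
    when their speed, distance and azimuth each differ by less than the
    (hypothetical) thresholds dv, dr and daz."""
    return (abs(a.speed - b.speed) < dv
            and abs(a.distance - b.distance) < dr
            and abs(a.azimuth - b.azimuth) < daz)
```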
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910301738.4A CN109870692B (en) | 2019-04-16 | 2019-04-16 | Radar looking around system and data processing method |
Publications (2)
Publication Number | Publication Date |
---|---|
CN109870692A (en) | 2019-06-11
CN109870692B (en) | 2023-10-20
Family
ID=66922587
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910301738.4A Active CN109870692B (en) | 2019-04-16 | 2019-04-16 | Radar looking around system and data processing method |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN109870692B (en) |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20060238406A1 (en) * | 2005-04-20 | 2006-10-26 | Sicom Systems Ltd | Low-cost, high-performance radar networks |
CN102253391A (en) * | 2011-04-19 | 2011-11-23 | 浙江大学 | Multi-laser-radar-based pedestrian target tracking method |
CN105572664A (en) * | 2015-12-31 | 2016-05-11 | 上海广电通信技术有限公司 | Networking navigation radar target tracking system based on data fusion |
CN205427184U (en) * | 2015-12-31 | 2016-08-03 | 上海广电通信技术有限公司 | Network deployment navigation radar target tracker based on data fusion |
CN107966700A (en) * | 2017-11-20 | 2018-04-27 | 天津大学 | A kind of front obstacle detecting system and method for pilotless automobile |
CN108509972A (en) * | 2018-01-16 | 2018-09-07 | 天津大学 | A kind of barrier feature extracting method based on millimeter wave and laser radar |
CN109521427A (en) * | 2018-11-15 | 2019-03-26 | 上海赫千电子科技有限公司 | Vehicle-mounted Ethernet radar system |
- 2019-04-16: Application CN201910301738.4A filed (CN); granted as CN109870692B, status Active
Non-Patent Citations (1)
Title |
---|
Lin Hui et al.: "Digital Photogrammetry" (数字摄影测量学), China University of Mining and Technology Press, pages: 275 - 277 *
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110596654A (en) * | 2019-10-18 | 2019-12-20 | 富临精工先进传感器科技(成都)有限责任公司 | Data synchronous acquisition system based on millimeter wave radar |
CN111123228A (en) * | 2020-01-02 | 2020-05-08 | 浙江力邦合信智能制动系统股份有限公司 | Vehicle-mounted radar integration test system and method |
CN111487596A (en) * | 2020-04-20 | 2020-08-04 | 航天新气象科技有限公司 | Wind field detection data fusion method and device and electronic equipment |
CN111487596B (en) * | 2020-04-20 | 2022-06-21 | 航天新气象科技有限公司 | Wind field detection data fusion method and device and electronic equipment |
CN112261036A (en) * | 2020-10-20 | 2021-01-22 | 苏州矽典微智能科技有限公司 | Data transmission method and device |
CN112261036B (en) * | 2020-10-20 | 2021-09-24 | 苏州矽典微智能科技有限公司 | Data transmission method and device |
CN117008122A (en) * | 2023-08-04 | 2023-11-07 | 江苏苏港智能装备产业创新中心有限公司 | Method and system for positioning surrounding objects of engineering mechanical equipment based on multi-radar fusion |
Also Published As
Publication number | Publication date |
---|---|
CN109870692B (en) | 2023-10-20 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109870692A (en) | Radar viewing system and data processing method | |
Li et al. | A hardware platform framework for an intelligent vehicle based on a driving brain |
US11922341B2 (en) | Context-based remote autonomous vehicle assistance
CN109358614A (en) | Automatic driving method, system and device, and readable storage medium | |
CN107479532A (en) | Domain controller test system and method for an intelligent automobile | |
CN102837658A (en) | Intelligent vehicle multi-laser-radar data integration system and method thereof | |
CN110031845A (en) | Device and method for controlling a radar | |
CN206493897U (en) | Vehicle environment perception system and autonomous driving vehicle | |
CN105620391A (en) | Intelligent vehicle assistant system | |
CN206400780U (en) | Radar module for parking space recognition and parking space recognition system | |
CN105931491A (en) | Parking space identification method, radar module group and parking space identification system | |
WO2007053350A2 (en) | Systems and methods for configuring intersection detection zones | |
CN110441790A (en) | Method and apparatus for crosstalk and multipath noise reduction in lidar systems | |
US11852749B2 (en) | Method and apparatus for object detection using a beam steering radar and a decision network | |
CN108566439A (en) | Distributed communication network topology for vehicles | |
CN109726795A (en) | Method for training a central artificial intelligence module | |
CN110083099A (en) | Autonomous driving architecture system meeting automotive functional safety standards and working method | |
CN104570770A (en) | Traffic flow simulation experiment platform based on micro intelligent vehicles | |
CN103935320A (en) | Sensor device for a hands-free trunk access system and operating method thereof | |
CN110428647A (en) | Vehicle cooperative passing method, apparatus, device and storage medium for intersections | |
CN206734223U (en) | Vehicle data processing system | |
Kheder et al. | Real-time traffic monitoring system using IoT-aided robotics and deep learning techniques | |
CN108304852 (en) | Method, apparatus, storage medium and electronic device for determining road section type | |
CN206961317U (en) | Road vehicle traffic flow information collection and control system | |
CN109747655A (en) | Steering instruction generation method and device for autonomous vehicles |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||