CN109507689A - Multi-lidar data fusion method with obstacle memory function - Google Patents

Multi-lidar data fusion method with obstacle memory function

Info

Publication number
CN109507689A
CN109507689A (application CN201811596732.6A)
Authority
CN
China
Prior art keywords
data
point
obstacle
coordinate
lidar
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201811596732.6A
Other languages
Chinese (zh)
Inventor
肖湘江
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Priority to CN201811596732.6A priority Critical patent/CN109507689A/en
Publication of CN109507689A publication Critical patent/CN109507689A/en
Pending legal-status Critical Current

Links

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88Lidar systems specially adapted for specific applications
    • G01S17/93Lidar systems specially adapted for specific applications for anti-collision purposes

Abstract

The present invention provides a multi-lidar data fusion method with an obstacle memory function, comprising the steps of: setting a main lidar and an auxiliary lidar; when an obstacle is visible only to the auxiliary lidar, converting the point cloud coordinates of the obstacle measured by the auxiliary lidar into global map coordinates; judging whether the global map coordinate corresponding to the point already exists in a global data linked list — if not, saving it to the list and creating a life cycle for the point; if so, incrementing the weight of the corresponding global map coordinate by one; each time the auxiliary lidar completes one scan cycle, reducing the weight of the point's global map coordinate and correspondingly reducing its life cycle; and checking the weight, the life cycle, and the distance to the robot of each global map coordinate stored in the list, and accordingly either merely retaining the point data, deleting the point, or converting it into fused laser data. The method has an obstacle memory function, can effectively fuse multi-lidar data in real time, and effectively filters noise.

Description

Multi-lidar data fusion method with obstacle memory function
Technical field
The present invention relates to the field of lidar technology, and in particular to a multi-lidar data fusion method with an obstacle memory function.
Background technique
A lidar is, in effect, the eyes of a robot and a key device for robot navigation: in obstacle avoidance, the real-time performance and accuracy of the laser data determine whether the robot can avoid obstacles safely. Owing to structural occlusion, robot mounting height, and similar issues, a single lidar has a limited field of view and coverage; multiple lidars are therefore needed to work in coordination.
The most important problem in multi-lidar coordination is the fusion of data from the several lidars. A lidar scans at high speed: taking a 10 Hz laser producing 2000 points per revolution as an example, 2000 data points must be processed every 0.1 s; each point has a specific angle and timestamp, and the time difference between adjacent points is only 1/20000 s. Within so short an interval, different devices can hardly be time-synchronised, and device performance also differs, so when the robot or an obstacle moves, different lidars detect different obstacle information. Directly merging the data of different lidars therefore produces a large amount of noise, and the faster the obstacle moves relative to the radar, the larger the error.
Two data fusion methods exist in the prior art. The first is the adaptive message synchronisation method: this is equivalent to a synchronising timer that, after receiving messages from multiple laser receivers, outputs each lidar's point cloud only when all lidars have messages with identical timestamps. Because synchronisation is forced by waiting, the message publication frequency drops. The second is the direct fusion method, which simply merges the data of the several lidars; its drawbacks are obvious: timing differences lead to errors and a large amount of noise, and no obstacle memory function is provided.
Summary of the invention
In view of the above problems, the present invention proposes a multi-lidar data fusion method with an obstacle memory function, pioneering the concept of an obstacle life cycle to remember discovered obstacles. It effectively solves not only the noise problem caused by motion, but also the phenomenon of short obstacles becoming invisible late in the obstacle avoidance process.
The present invention provides a multi-lidar data fusion method with an obstacle memory function, specifically comprising the following steps:
Step S1: set a main lidar and an auxiliary lidar;
Step S2: if an obstacle is invisible to the main lidar but visible to the auxiliary lidar, convert the point cloud coordinate (x′, y′) of the obstacle measured by the auxiliary lidar into the global map coordinate (x, y);
Step S3: judge whether the global map coordinate (x, y) corresponding to the point cloud coordinate (x′, y′) already exists in the global data linked list; if not, save it to the global data linked list and create a life cycle T for the point; if so, increment by one the weight of the global map coordinate (x, y) corresponding to (x′, y′);
Step S4: each time the auxiliary lidar completes one scan cycle, reduce the weight of the global map coordinate (x, y) corresponding to (x′, y′) by an amount μ, and correspondingly reduce the life cycle T by an amount ν;
Step S5: check the weight, the life cycle T, and the distance to the robot of each global map coordinate (x, y) stored in the global data linked list: when the weight < 2, only retain the point data without converting it into fused laser data; when the life cycle T < 0, delete the point; when the distance from the point to the robot exceeds a threshold, delete the point; in all other cases, convert the point into fused laser data.
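The decision rule of step S5 can be sketched as follows. This is a minimal illustration, not the patent's implementation: the `ObstaclePoint` record, the order of the checks, and the distance handling are assumptions not fixed by the text.

```python
import math
from dataclasses import dataclass

@dataclass
class ObstaclePoint:
    x: float          # global map x
    y: float          # global map y
    weight: float     # W, incremented each time the point is re-observed
    lifecycle: float  # T, remaining life of the memorised point

def classify(p: ObstaclePoint, robot_xy: tuple, dist_threshold: float) -> str:
    """Return the fate of one memorised point: 'keep' (store only),
    'delete', or 'fuse' (convert into fused laser data)."""
    if p.weight < 2:
        return "keep"      # observed too rarely to trust yet
    if p.lifecycle < 0:
        return "delete"    # life cycle expired
    d = math.hypot(p.x - robot_xy[0], p.y - robot_xy[1])
    if d > dist_threshold:
        return "delete"    # too far from the robot to matter
    return "fuse"
```

A rarely seen point is thus remembered but withheld from the fused output, while stale or distant points are dropped.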
In a preferred embodiment of the multi-lidar data fusion method with an obstacle memory function provided by the invention, the point cloud coordinate (x′, y′) of step S2 is converted into the global map coordinate (x, y) by:
x = x′·cos α − y′·sin α + x₀
y = x′·sin α + y′·cos α + y₀
where α is the angle of the auxiliary lidar point cloud coordinate system relative to the global map coordinate system, and (x₀, y₀) is the initial position of the origin of the auxiliary lidar point cloud coordinate system relative to the global map coordinate system.
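Assuming the conversion is the standard planar rotation-plus-translation implied by the definitions of α and (x₀, y₀), it can be sketched as:

```python
import math

def aux_to_global(xp: float, yp: float,
                  alpha: float, x0: float, y0: float) -> tuple:
    """Map an auxiliary-lidar point (x', y') to the global map frame:
    rotate by alpha, then translate by the origin offset (x0, y0)."""
    x = xp * math.cos(alpha) - yp * math.sin(alpha) + x0
    y = xp * math.sin(alpha) + yp * math.cos(alpha) + y0
    return x, y
```

For example, with α = π/2 and origin offset (2, 3), the auxiliary point (1, 0) lands at (2, 4) in the global map.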
In a preferred embodiment of the multi-lidar data fusion method with an obstacle memory function provided by the invention, the life cycle T created in step S3 is specifically:
T = a·W·(time constant) + b·L/V;
where L is the current distance from the obstacle to the robot, V is the robot's obstacle avoidance speed, a and b are weight coefficients with a + b = 1, and W is the current weight.
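As a worked example of the life-cycle formula (the coefficient values a = b = 0.5 and the unit time constant are illustrative assumptions, not values given by the text):

```python
def life_cycle(W: float, L: float, V: float,
               a: float = 0.5, b: float = 0.5,
               time_constant: float = 1.0) -> float:
    """T = a*W*time_constant + b*L/V, with the constraint a + b = 1."""
    assert abs(a + b - 1.0) < 1e-9, "weight coefficients must satisfy a + b = 1"
    return a * W * time_constant + b * L / V
```

A point with weight W = 2, at L = 4 m from a robot avoiding at V = 2 m/s, gets T = 0.5·2·1 + 0.5·4/2 = 2: heavier (often re-observed) and farther points live longer.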
In a preferred embodiment of the multi-lidar data fusion method with an obstacle memory function provided by the invention, the data fusion method is equally applicable to multiple ultrasonic sensors, multiple infrared ranging sensors, multiple millimetre-wave radars, or any combination of one or more of them.
Compared with the prior art, the multi-lidar data fusion method provided by the invention has an obstacle memory function, can effectively fuse the data of multiple lidars in real time, and effectively filters noise. It avoids the drop in message publication frequency caused by the forced waiting of the adaptive message synchronisation method, and the large amount of noise produced by the direct fusion method. Moreover, the method applies equally to ranging sensors of similar principle, such as ultrasonic sensors, infrared ranging sensors, millimetre-wave radars, or any combination thereof.
Brief description of the drawings
To describe the technical solutions in the embodiments of the present invention more clearly, the drawings needed in the description of the embodiments are briefly introduced below. The drawings described below are obviously only some embodiments of the invention; those of ordinary skill in the art can obtain other drawings from them without creative effort.
Fig. 1 is a flow chart of the multi-lidar data fusion method with an obstacle memory function provided by the invention;
Fig. 2 is a schematic diagram of the signal areas of multiple lidars at different heights in Embodiment 1;
Fig. 3 is a schematic diagram of the signal areas of multiple lidars in the same horizontal plane;
Fig. 4 is a Cartesian coordinate transformation diagram;
Fig. 5(a) shows the obstacle avoidance effect of a conventional data fusion method;
Fig. 5(b) shows the obstacle avoidance effect of the multi-lidar data fusion method with an obstacle memory function provided by the invention.
Specific embodiment
The technical solutions in the embodiments of the present invention will be described below clearly and completely with reference to the drawings in the embodiments. The described embodiments are obviously only a part, not all, of the embodiments of the present invention.
Referring to Fig. 1, which is a flow chart of the multi-lidar data fusion method with an obstacle memory function provided by the invention, the method specifically comprises the following steps:
Step S1: set a main lidar and an auxiliary lidar. Distinguishing main and auxiliary lidars gives robot navigation a reference: under normal conditions the robot navigates and avoids obstacles according to the navigation data of the main lidar.
Step S2: if an obstacle is invisible to the main lidar but visible to the auxiliary lidar, convert the point cloud coordinate (x′, y′) of the obstacle measured by the auxiliary lidar into the global map coordinate (x, y). The global map is a stable, unchanging frame of reference for robot navigation and obstacle avoidance; transforming point cloud coordinates into global map coordinates makes the obstacle's position in the physical environment immediately apparent.
Step S3: judge whether the global map coordinate (x, y) corresponding to the point cloud coordinate (x′, y′) already exists in the global data linked list. If not, save it to the list and create a life cycle T for the point; if so, increment by one the weight W of the global map coordinate (x, y) corresponding to (x′, y′). The initial weight W of a global map coordinate is 0. The life cycle is T = a·W·(time constant) + b·L/V, where L is the distance from the obstacle to the robot, V is the robot's obstacle avoidance speed, a and b are weight coefficients with a + b = 1, and W is the current weight of the point. The time constant is determined by the robot's obstacle avoidance characteristics.
Step S4: each time the auxiliary lidar completes one scan cycle, the weight W of the global map coordinate (x, y) corresponding to (x′, y′) is reduced by an amount μ, and the life cycle T is correspondingly reduced by an amount ν. In the specific embodiment μ is determined by the actual situation and is normally about 0.1, chosen according to the complexity of the obstacles and road conditions: the more complex the conditions, the smaller the reduction, so that obstacles are not mistakenly deleted in the subsequent decision process; but too small a value readily leaves more noise. That is:
W(n+1) = W(n) − μ;  T(n+1) = T(n) − ν,
where n is the scan cycle index of the auxiliary lidar.
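The per-cycle decay can be simulated as follows. The original text's assignment of μ and ν to the two quantities varies, so this sketch simply takes both decrements as the suggested rough value of 0.1 and shows that a point that is never re-observed eventually expires:

```python
def decay_step(W: float, T: float, mu: float = 0.1, nu: float = 0.1):
    """One auxiliary-lidar scan cycle: both the weight and the life
    cycle shrink by a small, tunable amount."""
    return W - mu, T - nu

# A point that is never re-observed fades out of memory:
W, T = 1.0, 0.5
cycles = 0
while T >= 0:
    W, T = decay_step(W, T)
    cycles += 1
# T is now below 0, so the point would be deleted in step S5
```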
Step S5: check the weight, the life cycle T, and the distance to the robot of each global map coordinate (x, y) stored in the global data linked list. When the weight < 2, only retain the point data without converting it into fused laser data; when the life cycle T < 0, delete the point; when the distance from the point to the robot exceeds a threshold, delete the point. In practice the threshold is preferably chosen slightly larger than the robot's maximum cross-sectional radius.
Specifically, the point cloud coordinate (x′, y′) of step S2 is converted into the global map coordinate (x, y) by:
x = x′·cos α − y′·sin α + x₀
y = x′·sin α + y′·cos α + y₀
where α is the angle of the auxiliary lidar point cloud coordinate system relative to the global map coordinate system, and (x₀, y₀) is the initial position of the origin of the auxiliary lidar point cloud coordinate system relative to the global map coordinate system.
In a specific embodiment, the data fusion method is equally applicable to multiple ultrasonic sensors, multiple infrared ranging sensors, multiple millimetre-wave radars, or any combination of one or more of them.
In a specific embodiment, when the robot carries multiple lidars, the choice of main and auxiliary lidar is in principle not mandatory, but choosing an unobscured lidar as the main lidar usually gives a better obstacle avoidance effect while the robot cruises. For example, if the lidars are mounted at different heights, the higher-mounted lidar with the more open field of view is chosen as the main lidar, and the rest serve as auxiliary lidars.
Embodiment 1
As shown in Fig. 2, lidars A and B are mounted at different heights. An obstacle taller than lidar A is visible to both lidars A and B; an object lower than lidar A but taller than lidar B is visible only to lidar B, not to lidar A; an obstacle lower than lidar B is visible to lidar B while it is far enough away, but once the distance between the robot and the obstacle shrinks beyond a distance threshold the obstacle becomes invisible.
The robot's surroundings can thus be divided into three regions: regions visible to a single lidar alone, the region visible to all lidars, and the region invisible to all lidars. Their projected distribution in the horizontal plane is shown in Fig. 3.
For a region visible to a single lidar alone, that lidar's data is used directly and no data fusion is needed; the point cloud of region k of this kind is denoted Pₖ(alone), and the combined single-lidar point cloud set is P(alone).
For the region invisible to all lidars, the point cloud set of this region is P(invisible) = ∅ (empty).
For the region visible to both lidars, the point cloud set of this region is P(common). Ideally, because obstacles differ in height and distance, the common region can be divided into three cases, as shown in Fig. 3:
(1) Lidar A sees the obstacle but lidar B does not, e.g. the obstacle is relatively far away or is a protruding obstacle above the ground.
(2) Lidar B sees the obstacle but lidar A does not, e.g. the obstacle is lower than the height of lidar A but higher than the height of lidar B.
(3) Both lidars A and B see the obstacle, e.g. the obstacle is higher than lidar A.
In cases (1) and (3), the point cloud of this region is taken directly from the point cloud set P(A) of lidar A.
In case (2), the point cloud of this region is taken directly from the point cloud set P(B) of lidar B.
Thus, P(common) = P(A) + P(B).
Finally, the ideally fused lidar point cloud set is:
P = P(alone) + P(invisible) + P(common).
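Under these ideal assumptions, the composition P = P(alone) + P(invisible) + P(common), with P(common) = P(A) + P(B), amounts to a union of point sets. A sketch, modelling points as (x, y) tuples:

```python
def fused_cloud(p_alone: set, p_a: set, p_b: set) -> set:
    """Ideal fusion: P = P(alone) | P(invisible) | P(common),
    where P(invisible) is empty and P(common) = P(A) | P(B)."""
    p_invisible: set = set()     # no lidar sees this region
    p_common = p_a | p_b         # cases (1)/(3) come from A, case (2) from B
    return set(p_alone) | p_invisible | p_common
```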
In reality, without an obstacle memory function there is a safety risk. Suppose, for example, that the robot is stationary and an obstacle is moving: when the obstacle is at point D, lidar B sees it but lidar A does not; after it moves from D to C, the obstacle is invisible to both A and B, yet it has not actually disappeared. To guarantee safe obstacle avoidance in every situation, the data fusion of the present invention therefore introduces an obstacle memory function.
The specific method is as follows:
Step 1: set the main and auxiliary lidars. Lidar A is unobscured and therefore serves as the main lidar; lidar B serves as the auxiliary lidar.
Step 2: if the obstacle is invisible to main lidar A but visible to auxiliary lidar B, convert the point cloud coordinate (x′, y′) of the obstacle measured by lidar B into the global map coordinate (x, y). The specific conversion formula is:
x = x′·cos α − y′·sin α + x₀
y = x′·sin α + y′·cos α + y₀
where α is the angle of the point cloud coordinate system of auxiliary lidar B relative to the global map coordinate system, and (x₀, y₀) is the initial position of the origin of lidar B's point cloud coordinate system relative to the global map coordinate system, as shown in Fig. 4. The coordinate system formed by the x and y axes is the global map coordinate system; the one formed by the x′ and y′ axes is the point cloud coordinate system of auxiliary lidar B.
The above formula converts the point cloud coordinate (x′, y′) of an obstacle measured by auxiliary lidar B into the global map coordinate (x, y); solving for x′ and y′ likewise gives the inverse transformation:
x′ = (x − x₀)·cos α + (y − y₀)·sin α
y′ = −(x − x₀)·sin α + (y − y₀)·cos α
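The forward and inverse transforms can be checked against each other. The sketch below assumes α and (x₀, y₀) are known from calibration:

```python
import math

def to_global(xp: float, yp: float, alpha: float, x0: float, y0: float):
    """Forward transform: auxiliary-lidar frame -> global map frame."""
    return (xp * math.cos(alpha) - yp * math.sin(alpha) + x0,
            xp * math.sin(alpha) + yp * math.cos(alpha) + y0)

def to_aux(x: float, y: float, alpha: float, x0: float, y0: float):
    """Inverse transform: global map frame -> auxiliary-lidar frame."""
    dx, dy = x - x0, y - y0
    return (dx * math.cos(alpha) + dy * math.sin(alpha),
            -dx * math.sin(alpha) + dy * math.cos(alpha))
```

Composing the two should return the original point, which makes a convenient sanity check on the calibration values.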
Step 3: judge whether the global map coordinate (x, y) corresponding to the point cloud coordinate (x′, y′) already exists in the global data linked list. If not, save it to the list and create a life cycle T for the point; if so, increment by one the weight W of the corresponding global map coordinate (x, y).
Specifically, the life cycle is T = a·W·(time constant) + b·L/V;
where L is the distance to the obstacle, V is the robot's obstacle avoidance speed, a and b are weight coefficients with a + b = 1, and W is the current weight of the point.
Step 4: on every scan cycle of auxiliary lidar B, the weight W of the global map coordinate (x, y) corresponding to the point cloud coordinate (x′, y′) is reduced by μ, and the life cycle T is correspondingly reduced by ν:
W(n+1) = W(n) − μ;
T(n+1) = T(n) − ν.
Step 5: check the weight W, the life cycle T, and the distance to the robot of every global map coordinate (x, y) stored in the global data linked list. When the weight < 2, only retain the point data without converting it into fused laser data; when the life cycle T < 0, delete the point; when the distance from the point to the robot exceeds the threshold, delete the point.
This patent introduces the concepts of obstacle point weight W and life cycle T, and step 5 avoids another problem that obstacle memory could cause. Because both the robot and the obstacle may be moving, consider the case in Fig. 3 of an obstacle moving from point p to point q. The obstacle was originally in the region visible to both lidars, but because the laser scans are not synchronised it has effectively entered a single-lidar region (at p, main lidar A sees it and auxiliary lidar B does not; at q, main lidar A does not see it and auxiliary lidar B does). The point at q is then noise: if q were memorised, added to the global data linked list, and later converted into laser data, noise would appear all along the motion trajectory and persist; if such noise completely surrounded the robot, obstacle avoidance would fail.
The case of point q in Fig. 3 is handled by step 5. It can be resolved by setting a reasonable threshold on the distance from a point to the robot, generally based on the robot's maximum cross-sectional radius: only when main lidar A detects no obstacle within the decision range of point q is the point added to the global data linked list. Moreover, this decision step means that noise occurs rarely, carries a low weight, and has a short life cycle, so misjudged points disappear quickly and do not affect the robot's obstacle avoidance.
For the case in Fig. 2 in which the obstacle moves from D to C, the robot measures this point cloud a great many times during the motion, so the weight W is large and the life cycle T is long, fully meeting the time demands of obstacle avoidance. Fig. 5(b) shows the method of this patent: the life cycle of noise is short, so noise disappears immediately after a brief appearance. Fig. 5(a) shows conventional fusion: if obstacles are simply memorised, a passing pedestrian generates noise that never disappears.
Compared with the prior art, the multi-lidar data fusion method with an obstacle memory function provided by the invention can effectively fuse multi-lidar data in real time while effectively filtering noise, and it applies equally to ranging sensors of similar principle, such as ultrasonic sensors, infrared ranging sensors, and millimetre-wave radars.

Claims (4)

1. A multi-lidar data fusion method with an obstacle memory function, characterised by comprising the following steps:
Step S1: set a main lidar and an auxiliary lidar;
Step S2: if an obstacle is invisible to the main lidar but visible to the auxiliary lidar, convert the point cloud coordinate (x′, y′) of the obstacle measured by the auxiliary lidar into the global map coordinate (x, y);
Step S3: judge whether the global map coordinate (x, y) corresponding to the point cloud coordinate (x′, y′) exists in the global data linked list; if not, save it to the global data linked list and create a life cycle T for the point; if so, increment by one the weight of the global map coordinate (x, y) corresponding to (x′, y′);
Step S4: each time the auxiliary lidar completes one scan cycle, reduce the weight of the global map coordinate (x, y) corresponding to (x′, y′) by an amount μ, and correspondingly reduce the life cycle T by an amount ν;
Step S5: check the weight, the life cycle T, and the distance to the robot of the global map coordinate (x, y) corresponding to each stored point cloud coordinate (x′, y′); when the weight < 2, only retain the point data without converting it into fused laser data; when the life cycle T < 0, delete the point; when the distance from the point to the robot exceeds a threshold, delete the point; in all other cases, convert the point into fused laser data.
2. The multi-lidar data fusion method with an obstacle memory function according to claim 1, characterised in that the point cloud coordinate (x′, y′) of step S2 is converted into the global map coordinate (x, y) by:
x = x′·cos α − y′·sin α + x₀
y = x′·sin α + y′·cos α + y₀
where α is the angle of the auxiliary lidar point cloud coordinate system relative to the global map coordinate system, and (x₀, y₀) is the initial position of the origin of the auxiliary lidar point cloud coordinate system relative to the global map coordinate system.
3. The multi-lidar data fusion method with an obstacle memory function according to claim 2, characterised in that in step S3 the created life cycle T is specifically:
T = a·W·(time constant) + b·L/V;
where L is the current distance from the obstacle to the robot, V is the robot's obstacle avoidance speed, a and b are weight coefficients with a + b = 1, and W is the current weight.
4. The multi-lidar data fusion method with an obstacle memory function according to claim 3, characterised in that the data fusion method is equally applicable to multiple ultrasonic sensors, multiple infrared ranging sensors, multiple millimetre-wave radars, or any combination of one or more of them.
CN201811596732.6A 2018-12-25 2018-12-25 Multi-lidar data fusion method with obstacle memory function Pending CN109507689A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811596732.6A CN109507689A (en) 2018-12-25 2018-12-25 Multi-lidar data fusion method with obstacle memory function

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811596732.6A CN109507689A (en) 2018-12-25 2018-12-25 Multi-lidar data fusion method with obstacle memory function

Publications (1)

Publication Number Publication Date
CN109507689A true CN109507689A (en) 2019-03-22

Family

ID=65755434

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811596732.6A Pending CN109507689A (en) 2018-12-25 2018-12-25 Multi-lidar data fusion method with obstacle memory function

Country Status (1)

Country Link
CN (1) CN109507689A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111722201A (en) * 2020-06-23 2020-09-29 常州市贝叶斯智能科技有限公司 Data denoising method for indoor robot laser radar

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004212129A (en) * 2002-12-27 2004-07-29 Ishikawajima Harima Heavy Ind Co Ltd Environmental condition grasping device
CN104002747A (en) * 2014-06-10 2014-08-27 北京联合大学 Multiple-laser radar raster map merging system based on pilotless automobile
CN105404844A (en) * 2014-09-12 2016-03-16 广州汽车集团股份有限公司 Road boundary detection method based on multi-line laser radar
WO2016197986A1 (en) * 2015-06-12 2016-12-15 北京中飞艾维航空科技有限公司 High-precision autonomous obstacle-avoidance flying method for unmanned plane
US20170205829A1 (en) * 2010-11-19 2017-07-20 Bradley Tyers Automatic Location Placement System
US20180017682A1 (en) * 2016-07-13 2018-01-18 Baidu Online Network Technology (Beijing) Co., Ltd. Method and apparatus for determining unmanned vehicle positioning accuracy
US20180088228A1 (en) * 2016-09-23 2018-03-29 Baidu Online Network Technology (Beijing) Co., Ltd. Obstacle detection method and apparatus for vehicle-mounted radar system
CN108320329A (en) * 2018-02-02 2018-07-24 维坤智能科技(上海)有限公司 A kind of 3D map creating methods based on 3D laser


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Jerry Silvious et al.: "Automotive GMTI Radar for Object and Human Avoidance", 2011 IEEE RadarCon (RADAR) *
叶炜垚 et al.: "Mobile Robot Path Planning Method Based on Virtual Obstacles", Robot *


Similar Documents

Publication Publication Date Title
US11915470B2 (en) Target detection method based on fusion of vision, lidar, and millimeter wave radar
EP3885794A1 (en) Track and road obstacle detecting method
CN106570454B (en) Pedestrian traffic parameter extracting method based on mobile laser scanning
KR102270401B1 (en) Real-time warning for distracted pedestrians with smartphones
CN108509972A (en) A kind of barrier feature extracting method based on millimeter wave and laser radar
CN110361727A (en) A kind of millimetre-wave radar multi-object tracking method
CN104215951B (en) System and method for detecting low-speed small target under sea cluster background
CN102081801A (en) Multi-feature adaptive fused ship tracking and track detecting method
CN103245976B (en) Based on human body target and the surrounding environment structure compatible detection method of UWB bioradar
CN105915846A (en) Monocular and binocular multiplexed invading object monitoring method and system
JP3520326B2 (en) Running vehicle detection method using millimeter wave radar
CN106156758B (en) A kind of tidal saltmarsh method in SAR seashore image
CN103675791B (en) Based on the Mie scattering lidar cloud recognition methods of numeric distribution equalization
CN107103275A (en) The vehicle detection carried out using radar and vision based on wheel and tracking
CN101520892A (en) Detection method of small objects in visible light image
CN105954733A (en) Time-domain filtering method based on photon flight time correlation
CN109188430A (en) A kind of target extraction method based on ground surveillance radar system
CN108958284A (en) A kind of unmanned plane obstacle avoidance system and method
CN113391282A (en) Human body posture recognition method based on radar multi-dimensional feature fusion
CN109839634A (en) A kind of subject fusion method of vehicle-mounted camera and radar
CN109507689A (en) Multi-lidar data fusion method with obstacle memory function
CN110147748A (en) A kind of mobile robot obstacle recognition method based on road-edge detection
CN114005246A (en) Old man falling detection method and device based on frequency modulation continuous wave millimeter wave radar
WO2003007012A8 (en) Apparatus and method of tracking objects in flight
CN113734176A (en) Environment sensing system and method for intelligent driving vehicle, vehicle and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20190322
