CN110346799A - A kind of obstacle detection method and equipment - Google Patents
A kind of obstacle detection method and equipment
- Publication number
- CN110346799A, CN201910596133.2A
- Authority
- CN
- China
- Prior art keywords
- barrier
- mobile device
- obstacle
- target mobile
- blind area
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/88—Radar or analogous systems specially adapted for specific applications
- G01S13/93—Radar or analogous systems specially adapted for specific applications for anti-collision purposes
- G01S13/931—Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/58—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/88—Radar or analogous systems specially adapted for specific applications
- G01S13/93—Radar or analogous systems specially adapted for specific applications for anti-collision purposes
- G01S13/931—Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
- G01S2013/9315—Monitoring blind spots
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/88—Radar or analogous systems specially adapted for specific applications
- G01S13/93—Radar or analogous systems specially adapted for specific applications for anti-collision purposes
- G01S13/931—Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
- G01S2013/9327—Sensor installation details
- G01S2013/93273—Sensor installation details on the top of the vehicles
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- General Physics & Mathematics (AREA)
- Electromagnetism (AREA)
- Computer Networks & Wireless Communication (AREA)
- Multimedia (AREA)
- Theoretical Computer Science (AREA)
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
Abstract
This application discloses an obstacle detection method and device in the field of automatic control technology, to solve the problem that, when obstacles in the environment of a mobile device are detected, it cannot be judged whether a detected obstacle really poses a threat to the device. The method comprises: determining the obstacles that are currently dangerous to a target mobile device as reference obstacles and marking them; determining the obstacle blind area formed by each reference obstacle relative to the target mobile device; and, based on the obstacle blind area, determining the unmarked obstacles located in the obstacle blind area as non-dangerous obstacles that pose no threat to the target device and deleting them. The application can detect and delete the obstacles that currently pose no threat to the target mobile device, so that an optimal obstacle-avoidance travel route can be planned for the target mobile device.
Description
Technical field
This application relates to the field of automatic control technology, and in particular to an obstacle detection method and device.
Background technique
In automatic driving scenarios, detecting and identifying the obstacles on the road helps the vehicle plan a suitable travel route and evade potential collisions with obstacles while driving. When many obstacles are present on the road, however, detecting and identifying all of them may produce data that interferes with route planning and multi-obstacle avoidance, causing data congestion and slow data transmission.
In the prior art, the point cloud information of obstacles is usually acquired by lidar, and the obstacle contours are then fitted from the point cloud, optionally fused with depth-vision information, so as to obtain the contours and positions of all obstacles in the sensor's field of view. While the mobile device (for example an automobile) is moving, its processor performs avoidance according to all detected obstacle positions, based on a near-to-far obstacle route-planning strategy. This approach not only requires substantial computing power, but the large amount of data to be processed easily causes data congestion, so that the planned travel route of the mobile device is unstable (the device may even sway from side to side while following the route). A method for detecting whether an obstacle actually threatens the mobile device is therefore urgently needed.
Summary of the invention
The application provides an obstacle detection method and device, to solve the problem in the prior art that, when obstacles in the environment of a mobile device are detected, only the obstacle contours can be obtained; it cannot be judged whether a detected obstacle really threatens the mobile device, and therefore an optimal avoidance route cannot be planned according to the threat posed by each obstacle.
In a first aspect, the application provides an obstacle detection method, the method comprising:
determining the obstacles that are currently dangerous to a target mobile device as reference obstacles and marking them;
determining the obstacle blind area formed by each reference obstacle relative to the target mobile device;
based on the obstacle blind area, determining the unmarked obstacles located in the obstacle blind area as non-dangerous obstacles that pose no threat to the target device, and deleting them.
With the above method, the obstacles that currently pose no threat to the target mobile device can be detected; after these non-dangerous obstacles are deleted, an optimal obstacle-avoidance travel route can be planned for the target mobile device based on the positions of the remaining obstacles.
In one possible implementation, determining the obstacles that are currently dangerous to the target mobile device as reference obstacles and marking them comprises:
determining the obstacles currently within a preset distance range of the target mobile device as the reference obstacles; and/or
determining the obstacles currently on the moving heading of the target mobile device as the reference obstacles.
In one possible implementation, determining the obstacles currently within a preset distance range of the target mobile device as the reference obstacles comprises:
determining the obstacles whose center point is currently no farther than a first preset distance from the device center point as the reference obstacles, where the device center point is the center point of the target mobile device and each obstacle center point is the center point of that obstacle's two-dimensional contour.
In one possible implementation, determining the obstacles currently on the moving heading of the target mobile device as the reference obstacles comprises:
determining the obstacles whose contour points currently have a shortest distance to the device's moving heading of no more than a second preset distance as the reference obstacles, where the obstacle center point is the center point of each obstacle's two-dimensional contour and the device's moving heading is a ray drawn from the center point of the target mobile device.
In one possible implementation, determining the obstacle blind area formed by a reference obstacle relative to the target mobile device comprises:
taking the device center point of the target mobile device as the starting point, drawing two rays tangent to the two-dimensional contour of any reference obstacle;
determining the region enclosed by the two rays tangent to the two-dimensional contour of that reference obstacle as the obstacle blind area formed by that reference obstacle with respect to the target mobile device.
In one possible implementation, whether any unmarked obstacle is in the obstacle blind area formed by a reference obstacle is determined as follows:
determining the obstacle region enclosed by the two-dimensional contour of the obstacle;
when a preset proportion of the obstacle region is determined to be within the obstacle blind area formed by the reference obstacle, determining that the obstacle is in the obstacle blind area formed by the reference obstacle.
In a second aspect, the application provides an obstacle detection device comprising a processor and a memory, where the memory stores an executable program and, when the program is executed, the processor implements the following process:
determining the obstacles that are currently dangerous to a target mobile device as reference obstacles and marking them;
determining the obstacle blind area formed by each reference obstacle relative to the target mobile device;
based on the obstacle blind area, determining the unmarked obstacles located in the obstacle blind area as non-dangerous obstacles that pose no threat to the target device, and deleting them.
In a third aspect, the application provides an obstacle detection apparatus, the apparatus comprising:
a reference obstacle determination unit, configured to determine the obstacles that are currently dangerous to a target mobile device as reference obstacles and mark them;
an obstacle blind area determination unit, configured to determine the obstacle blind area formed by each reference obstacle relative to the target mobile device;
a non-dangerous obstacle determination unit, configured to, based on the obstacle blind area, determine the unmarked obstacles located in the obstacle blind area as non-dangerous obstacles that pose no threat to the target device and delete them.
In a fourth aspect, the application also provides a computer storage medium on which a computer program is stored; when the program is executed by a processing unit, the steps of the method of the first aspect are implemented.
In addition, for the technical effects of any implementation of the second to fourth aspects, reference can be made to the technical effects of the corresponding implementations of the first aspect, which are not repeated here.
Detailed description of the invention
In order to more clearly explain the technical solutions in the embodiments of the present application, make required in being described below to embodiment
Attached drawing is briefly introduced, it should be apparent that, the drawings in the following description are only some examples of the present application, for this
For the those of ordinary skill in field, without any creative labor, it can also be obtained according to these attached drawings
His attached drawing.
Fig. 1 is a flow diagram of an obstacle detection method provided by an embodiment of the present application;
Fig. 2 is a schematic diagram of an environment point cloud map provided by an embodiment of the present application;
Fig. 3 is a flow diagram of obtaining the two-dimensional contour of an obstacle provided by an embodiment of the present application;
Fig. 4 is a schematic diagram of a representation of point cloud data provided by an embodiment of the present application;
Fig. 5 is a schematic diagram of visualized three-dimensional point cloud data provided by an embodiment of the present application;
Fig. 6 is a schematic diagram of point cloud data after obstacle segmentation provided by an embodiment of the present application;
Fig. 7 is a schematic diagram of converting the point cloud data of obstacles into a bird's-eye view provided by an embodiment of the present application;
Fig. 8 is a schematic diagram of constructing an obstacle contour provided by an embodiment of the present application;
Fig. 9 is a schematic diagram of a first way of determining reference obstacles provided by an embodiment of the present application;
Fig. 10 is a schematic diagram of a second way of determining reference obstacles provided by an embodiment of the present application;
Fig. 11 is a schematic diagram of a third way of determining reference obstacles provided by an embodiment of the present application;
Fig. 12 is a schematic diagram of an obstacle blind area provided by an embodiment of the present application;
Fig. 13 is a schematic diagram of determining non-dangerous obstacles according to obstacle blind areas provided by an embodiment of the present application;
Fig. 14 is a schematic diagram of an obstacle detection device provided by an embodiment of the present application;
Fig. 15 is a schematic diagram of an obstacle detection apparatus provided by an embodiment of the present application.
Specific embodiment
In order to keep the purposes, technical schemes and advantages of the application clearer, below in conjunction with attached drawing to the application make into
It is described in detail to one step, it is clear that described embodiment is only the application some embodiments, rather than whole implementation
Example.Based on the embodiment in the application, obtained by those of ordinary skill in the art without making creative efforts
All other embodiment, shall fall in the protection scope of this application.
Some terms appearing in the text are explained below:
1. The term "and/or" in the embodiments of the application describes an association relationship between associated objects and indicates that three relationships may exist; for example, A and/or B may mean: A exists alone, A and B exist simultaneously, or B exists alone. The character "/" generally indicates an "or" relationship between the associated objects.
2. The "mobile device" and "target mobile device" in the present embodiments are devices capable of moving, such as automobiles traveling on the road, food-delivery carts in restaurants, drones, and balance scooters.
The application scenarios described in the embodiments of the application are intended to illustrate the technical solutions of the embodiments more clearly and do not limit the technical solutions provided by the embodiments; those of ordinary skill in the art will appreciate that, as new application scenarios emerge, the technical solutions provided by the embodiments are equally applicable to similar technical problems. In the description of this application, unless otherwise indicated, "plurality" means two or more.
Before a mobile device is navigated, the scene of the road ahead needs to be captured in real time, and a navigation route is then provided to the user in combination with the current positioning of the vehicle and map navigation information.
In existing navigation, if an obstacle is detected ahead of the navigating mobile device in its driving direction, a route that evades the dangerous obstacle needs to be provided, i.e., the navigation system needs to provide an avoidance route. A mobile device with an Advanced Driver Assistance System (ADAS) can acquire the obstacles on the road in real time through sensors such as cameras and radar mounted on the mobile device, but it usually only detects the positions and contours of the obstacles; it cannot determine which obstacles really threaten the mobile device (dangerous obstacles) and which pose no threat (non-dangerous obstacles), and therefore cannot provide an optimal avoidance route for the mobile device.
To address the above problem, the application provides an obstacle detection method and device that detect the dangerous obstacles and the non-dangerous obstacles for a target mobile device, so that an optimal avoidance route can be planned for the target mobile device according to the detected dangerous and non-dangerous obstacles.
In the method provided by the application, the environment point cloud data on the travel road of the target mobile device is collected first and processed to obtain the position of the target mobile device as well as the positions and two-dimensional contours of the obstacles around it. Then, according to the position of the target mobile device and the positions and two-dimensional contours of the obstacles, the obstacles that are dangerous to the target mobile device are determined as reference obstacles and marked, the obstacle blind area formed by each reference obstacle relative to the target mobile device is determined, all obstacles are judged against these obstacle blind areas, and the unmarked obstacles located in the blind areas are determined as non-dangerous obstacles that pose no threat to the target mobile device and are deleted.
The solution proposed by the application is described in detail below with reference to the drawings.
As shown in Fig. 1, the application provides an obstacle detection method, which specifically comprises the following steps:
Step S1: determining the position of the target mobile device, and the positions and two-dimensional contours of the obstacles around the target mobile device.
It should be noted that the target mobile device in this embodiment is illustrated with a vehicle only; the obstacle detection method provided in this embodiment can also be used for other mobile devices (such as food-delivery carts, drones, and balance scooters).
In this step, a point cloud map of the current target vehicle's surroundings can be obtained, based on a fusion algorithm, by means of (but not limited to) radar or other sensors such as cameras arranged on the vehicle roof. The environment point cloud map is shown in Fig. 2, where the part circled by the rectangular frame is the point cloud data of an obstacle around the current target vehicle.
As shown in Fig. 3, the two-dimensional contours of the target mobile device and each obstacle can be obtained by the following method:
Step S31: obtaining and processing point cloud data.
Point cloud data is obtained, for example, from the KITTI data set. The KITTI data set contains four cameras; mainly the pictures, calibration parameters and label files of the third camera (serial number 02) are used.
Point cloud data is typically expressed as a numpy array of N rows (N is a positive integer) and at least three columns, where each row corresponds to a single point represented by its spatial position (X, Y, Z) using at least three values; see Fig. 4 for details.
The KITTI data has an additional value, "reflectance", which measures how much of the laser beam was reflected back at a position, so the point cloud data in KITTI is an N × 4 matrix.
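As a minimal sketch (not part of the original disclosure) of reading such a point cloud into an N × 4 numpy array, assuming the standard KITTI Velodyne .bin layout of four float32 values per point; the file name is illustrative:

```python
import numpy as np

def load_kitti_point_cloud(bin_path):
    """Load one KITTI Velodyne scan as an (N, 4) array of [x, y, z, reflectance]."""
    points = np.fromfile(bin_path, dtype=np.float32)
    return points.reshape(-1, 4)

# cloud = load_kitti_point_cloud("velodyne/000000.bin")  # hypothetical path
# print(cloud.shape)  # (N, 4)
```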
Step S32: visualizing the three-dimensional point cloud data.
The three-dimensional point cloud can be visualized in a visualization tool such as MATLAB; the visualization effect is shown in Fig. 5.
The visualization of the three-dimensional point cloud data can also be realized with mayavi in the Python programming language; mayavi is a Python tool specialized for drawing three-dimensional figures. In addition, the three-dimensional point cloud data can be visualized as a bird's-eye view or a front view of the point cloud.
Step S33: obtaining the point cloud data that contains only obstacles.
The drivable areas, including road surfaces and crossings, are retrieved from a high-precision map, the three-dimensional point cloud data visualized in step S32 is pre-processed accordingly, and only the point cloud data of the obstacles on the road surface is kept, while the remaining background obstacles and ground points are removed.
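The patent filters with a high-precision map of the drivable area; as a simplified, hypothetical stand-in, the sketch below keeps only points inside a rectangular region of interest and above an assumed ground height:

```python
import numpy as np

def keep_obstacle_points(cloud, x_range=(0.0, 40.0), y_range=(-10.0, 10.0), ground_z=-1.5):
    """Crude stand-in for map-based filtering: crop to a region of interest in
    front of the vehicle and drop points at or below the assumed ground plane."""
    x, y, z = cloud[:, 0], cloud[:, 1], cloud[:, 2]
    mask = (
        (x >= x_range[0]) & (x <= x_range[1])
        & (y >= y_range[0]) & (y <= y_range[1])
        & (z > ground_z)
    )
    return cloud[mask]
```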
Step S34: obstacle segmentation based on the point cloud data.
The point cloud data that contains only obstacles is then segmented, so that individual obstacles are detected and separated. In this embodiment the target mobile device is a vehicle, and the obstacles may include, but are not limited to, pedestrians and other vehicles; the point cloud data of each individual obstacle, such as a vehicle or a pedestrian, is separated here. The point cloud data after obstacle segmentation is shown in Fig. 6.
In this step, the obstacle segmentation can be performed with, for example, the following point-cloud-based segmentation method: a grid method is used to construct a two-dimensional top-view grid of the obstacle point cloud data (projecting it onto a two-dimensional plane), i.e., the point cloud data of the obstacles is converted into a bird's-eye view (as shown in Fig. 7), and the obstacle bounding boxes on the two-dimensional image are then obtained, i.e., the two-dimensional contour of each obstacle, where the size of the two-dimensional grid is determined by the extent of the point cloud.
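A minimal sketch of the grid projection described above (cell size and ranges are illustrative assumptions); individual obstacles can then be separated by grouping connected occupied cells, e.g. with a connected-component labeling routine:

```python
import numpy as np

def to_birds_eye_grid(points, cell_size=0.1, x_range=(0.0, 40.0), y_range=(-10.0, 10.0)):
    """Project (N, >=3) obstacle points onto a 2-D occupancy grid (bird's-eye view)."""
    cols = int((x_range[1] - x_range[0]) / cell_size)
    rows = int((y_range[1] - y_range[0]) / cell_size)
    grid = np.zeros((rows, cols), dtype=np.uint8)
    ix = ((points[:, 0] - x_range[0]) / cell_size).astype(int)
    iy = ((points[:, 1] - y_range[0]) / cell_size).astype(int)
    valid = (ix >= 0) & (ix < cols) & (iy >= 0) & (iy < rows)
    grid[iy[valid], ix[valid]] = 1  # mark cells that contain at least one point
    return grid
```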
Step S35: constructing the obstacle contours.
After the point cloud of each obstacle has been obtained as above, the two-dimensional bounding polygon of each obstacle needs to be drawn. Here the minimum convex hull method is used to find the minimum-area polygonal bounding frame that encloses each obstacle's point cloud, as shown in Fig. 8.
In step S1, the position, heading, attributes and so on of dynamic obstacles can also be obtained by a deep-learning neural network algorithm or other algorithms, in order to determine motion states and attributes such as whether an obstacle is moving away or approaching, and to estimate and accurately locate the three-dimensional multi-class bounding boxes of the obstacles.
Step S2: determining the obstacles that are currently dangerous to the target mobile device as reference obstacles and marking them.
The obstacles that are currently dangerous to the target mobile device can be determined as reference obstacles in, but not limited to, any of the following ways:
Reference obstacle determination method 1: determining the obstacles currently within a preset distance range of the target mobile device as reference obstacles.
For example, an obstacle whose center point is currently no farther than a first preset distance from the device center point can be determined as a reference obstacle, where the device center point is the center point of the target mobile device and each obstacle center point is the center point of that obstacle's two-dimensional contour.
The first preset distance is not unduly restricted here; those skilled in the art can set it according to actual needs, for example to 5 meters or 10 meters in the real world. In that case, an obstacle whose center point is no farther than 5 meters or 10 meters from the device center point is determined as a reference obstacle. As shown in Fig. 9, closed curve 901 is the two-dimensional contour of the target device, and closed curves 902, 903, 904 and 905 are the two-dimensional contours of obstacles A, B, C and D, respectively. Point P is the center point of the two-dimensional contour of the target device, and points O_A, O_B, O_C and O_D are the center points of the two-dimensional contours of obstacles A, B, C and D, respectively. When the reference obstacles are determined, among the line segments PO_A, PO_B, PO_C and PO_D, only the length of PO_A is less than the first preset distance r, so at this moment obstacle A poses a great threat to the target mobile device and is determined as a reference obstacle and marked.
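A minimal sketch of determination method 1 (the 5-meter threshold and the use of the contour centroid as the obstacle center point are illustrative assumptions):

```python
import numpy as np

def is_reference_by_distance(device_center, obstacle_contour, first_preset_distance=5.0):
    """Method 1: an obstacle is a reference obstacle when the distance between its
    contour center point and the device center point is within the first preset distance."""
    obstacle_center = obstacle_contour.mean(axis=0)  # center of the 2-D contour (assumed centroid)
    return np.linalg.norm(obstacle_center - device_center) <= first_preset_distance
```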
Reference obstacle determination method 2: determining the obstacles currently on the moving heading of the target mobile device as reference obstacles.
An obstacle whose contour points currently have a shortest distance to the device's moving heading of no more than a second preset distance is determined as a reference obstacle, where the obstacle center point is the center point of each obstacle's two-dimensional contour and the device's moving heading is a ray drawn from the center point of the target mobile device; see the ray PP_1 in Fig. 10.
The second preset distance is not unduly restricted here; those skilled in the art can set it according to actual needs, for example to 1 meter in the real world. Referring to Fig. 10, closed curve 901 is the two-dimensional contour of the target device, and closed curves 902, 903, 904, 905 and 1006 are the two-dimensional contours of obstacles A, B, C, D and E, respectively. Point P is the center point of the two-dimensional contour of the target device, points O_A, O_B, O_C and O_D are the center points of the two-dimensional contours of obstacles A, B, C and D, respectively, and ray PP_1 is the moving heading of the target mobile device. Points A_1, B_1, C_1, D_1 and E_1 are the points on the two-dimensional contours of obstacles A, B, C, D and E that are closest to ray PP_1, and line segments P_B B_1, P_C C_1, P_D D_1 and P_E E_1 are the shortest distances from the two-dimensional contour points of obstacles B, C, D and E to ray PP_1. As can be seen from Fig. 10, point A_1 lies on ray PP_1, and the lengths of line segments P_B B_1 and P_C C_1 are less than the second preset distance d; therefore obstacles A, B and C are determined as reference obstacles of the target mobile device.
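A minimal sketch of determination method 2, computing the shortest distance from an obstacle's contour points to the heading ray PP_1 (the contour is assumed to be an (M, 2) array; the 1-meter threshold is the example value above):

```python
import numpy as np

def is_reference_on_heading(device_center, heading_dir, obstacle_contour, second_preset_distance=1.0):
    """Method 2: an obstacle is a reference obstacle when the shortest distance from
    any of its contour points to the heading ray is within the second preset distance."""
    heading_dir = heading_dir / np.linalg.norm(heading_dir)  # unit vector of ray PP1
    rel = obstacle_contour - device_center                   # contour points relative to P
    along = rel @ heading_dir                                 # signed projection onto the ray
    perp = np.linalg.norm(rel - np.outer(along, heading_dir), axis=1)
    # Distance to a ray: perpendicular distance when the projection falls on the ray,
    # otherwise the distance back to the ray origin P.
    dist = np.where(along >= 0, perp, np.linalg.norm(rel, axis=1))
    return dist.min() <= second_preset_distance
```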
Reference obstacle determination method 3: determining the obstacles currently within a preset distance range of the target mobile device, together with the obstacles on the moving heading of the target mobile device, as reference obstacles.
The way in which the obstacles within the preset distance range of the target mobile device and the obstacles on the moving heading of the target mobile device are determined as reference obstacles follows the same principles as reference obstacle determination methods 1 and 2 above, and is not repeated here.
It should be noted that when method 3 is used to determine the reference obstacles, as shown in Fig. 11, closed curve 901 is the two-dimensional contour of the target device, and closed curves 902, 903, 904, 905 and 1006 are the two-dimensional contours of obstacles A, B, C, D and E, respectively. Point P is the center point of the two-dimensional contour of the target device, points O_A, O_B, O_C and O_D are the center points of the two-dimensional contours of obstacles A, B, C and D, respectively, and ray PP_1 is the moving heading of the target mobile device. Points A_1, B_1, C_1, D_1 and E_1 are the points on the two-dimensional contours of obstacles A, B, C, D and E that are closest to ray PP_1, and line segments P_B B_1, P_C C_1, P_D D_1 and P_E E_1 are the shortest distances from the two-dimensional contour points of obstacles B, C, D and E to ray PP_1. According to the first preset distance r, obstacles A and D are determined as reference obstacles; according to the moving heading PP_1 and the second preset distance d, obstacles A, B and C are determined as reference obstacles. That is, in Fig. 11, obstacles A, B, C and D are all confirmed as reference obstacles.
Step S3: determining the obstacle blind area formed by each reference obstacle relative to the target mobile device.
The obstacle blind area is the part of the field of view of the target mobile device that is blocked by an obstacle, i.e., the region in the two-dimensional bird's-eye view that is blocked by a reference obstacle as seen from the target device. Specifically, based on the reference obstacles determined by reference obstacle determination methods 1-3 above, the blind area formed by any reference obstacle is determined by the following method:
taking the device center point of the target mobile device as the starting point, drawing two rays tangent to the two-dimensional contour of the reference obstacle;
determining the region enclosed by the two rays tangent to the two-dimensional contour of the reference obstacle as the obstacle blind area formed by that reference obstacle with respect to the target mobile device.
For example, the obstacle blind area formed by obstacle A is the region enclosed by rays PA_1 and PA_2 in Fig. 12. In Fig. 12, closed curve 901 is the two-dimensional contour of the target device, and closed curves 902, 903, 904 and 905 are the two-dimensional contours of obstacles A, B, C and D, respectively. Point P is the center point of the two-dimensional contour of the target device and the starting point of the field of view of the target mobile device, and points O_A, O_B, O_C and O_D are the center points of the two-dimensional contours of obstacles A, B, C and D, respectively. The field of view enclosed by rays PA_1 and PA_2 is the obstacle blind area formed by obstacle A.
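A minimal geometric sketch (an assumption, not the literal disclosure) that represents the blind area as the angular sector, seen from the device center point P, bounded by the two rays tangent to a reference obstacle's contour:

```python
import numpy as np

def blind_area_sector(device_center, ref_contour):
    """Return (start_angle, end_angle) of the angular sector occluded by one
    reference obstacle's 2-D contour, as seen from the device center P."""
    rel = ref_contour - device_center
    angles = np.sort(np.arctan2(rel[:, 1], rel[:, 0]))
    # The occluded sector is the complement of the largest angular gap between
    # consecutive contour-vertex directions (this also handles the +/- pi wrap-around).
    gaps = np.diff(np.concatenate([angles, [angles[0] + 2 * np.pi]]))
    k = int(np.argmax(gaps))
    start = angles[(k + 1) % len(angles)]   # one tangent ray (e.g. PA1)
    width = 2 * np.pi - gaps[k]             # angular width of the blind sector
    return start, start + width             # the other tangent ray (e.g. PA2)
```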
Step S4: based on the obstacle blind areas, determining the unmarked obstacles located in the obstacle blind areas as non-dangerous obstacles that pose no threat to the target device, and deleting them.
In this embodiment, whether any unmarked obstacle is in the obstacle blind area formed by a reference obstacle is determined by the following method:
determining the obstacle region enclosed by the two-dimensional contour of the obstacle;
when a preset proportion of the obstacle region is determined to be within the obstacle blind area formed by the reference obstacle, determining that the obstacle is in the obstacle blind area formed by the reference obstacle.
The preset proportion can be, but is not limited to, any value or sub-interval in the range of 50% to 100% of the obstacle region; for example, when at least 50% of an obstacle's region, as seen from the field of view of the target device, lies within a blind area, that obstacle is determined to be in the blind area.
Specifically, it is possible (but not limited) to determine the number of two-dimensional contour points of an obstacle that lie within any obstacle blind area, and then determine the ratio of that number to the total number of contour points of the obstacle; when the ratio is not less than a preset ratio value, the obstacle is determined to be in that obstacle blind area, i.e., the obstacle is determined as a non-dangerous obstacle that currently poses no threat to the target mobile device and is marked accordingly.
Referring specifically to Fig. 13, closed curve 901 is the two-dimensional contour of the target device, and closed curves 902, 903, 904 and 905 are the two-dimensional contours of obstacles A, B, C and D, respectively; point P is the center point of the two-dimensional contour of the target device. The region formed by rays PA_1 and PA_2, which start from the device center point P, is the obstacle blind area formed by obstacle A. All of the two-dimensional contour points of obstacle B are covered by the blind area formed by rays PA_1 and PA_2, so the two-dimensional contour of obstacle B cannot be observed from within the field of view of the target device; obstacle B is therefore determined as a non-dangerous obstacle for the current target mobile device. More than 50% of the two-dimensional contour points of obstacle C are covered by the blind area formed by rays PA_1 and PA_2, and only 20% of the two-dimensional contour of obstacle C can be observed from within the field of view of the target device; obstacle C is therefore also determined as a non-dangerous obstacle for the current target mobile device and is deleted.
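A minimal sketch of this contour-point-ratio check, following the simplified sector representation above (it does not separately test that the candidate obstacle lies behind the reference obstacle; the thresholds are illustrative):

```python
import numpy as np

def fraction_in_blind_area(device_center, contour, sector):
    """Fraction of an obstacle's contour points that fall inside one blind-area
    sector, where sector = (start, end) as returned by blind_area_sector()."""
    start, end = sector
    rel = contour - device_center
    angles = np.arctan2(rel[:, 1], rel[:, 0])
    inside = ((angles - start) % (2 * np.pi)) <= (end - start)
    return float(inside.mean())

def is_non_dangerous(device_center, contour, sectors, preset_ratio=0.5):
    """An unmarked obstacle is non-dangerous when at least the preset ratio of its
    contour points lies inside the blind area of some reference obstacle."""
    return any(fraction_in_blind_area(device_center, contour, s) >= preset_ratio for s in sectors)
```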
After the non-dangerous obstacles have been deleted in step S4, the travel route of the target mobile device is determined based on the positions and two-dimensional contours of the remaining obstacles.
With the above scheme, whether each obstacle really threatens the target device is judged by means of the obstacle blind areas formed by the reference obstacles; the method is simple, and after the non-dangerous obstacles that pose no threat to the target mobile device have been deleted, an optimal avoidance route can be provided for the target mobile device according to the positions and two-dimensional contours of the other obstacles.
As shown in Fig. 14, based on the same inventive concept, this embodiment provides an obstacle detection device comprising a processor 1401 and a memory 1402, where the memory stores an executable program and, when the program is executed, the processor implements the following process:
determining the obstacles that are currently dangerous to a target mobile device as reference obstacles and marking them;
determining the obstacle blind area formed by each reference obstacle relative to the target mobile device;
based on the obstacle blind area, determining the unmarked obstacles located in the obstacle blind area as non-dangerous obstacles that pose no threat to the target device, and deleting them.
Optionally, the processor is specifically configured to determine the obstacles currently within a preset distance range of the target mobile device as the reference obstacles; and/or to determine the obstacles currently on the moving heading of the target mobile device as the reference obstacles.
Optionally, the processor is specifically configured to determine the obstacles whose center point is currently no farther than a first preset distance from the device center point as the reference obstacles, where the device center point is the center point of the target mobile device and each obstacle center point is the center point of that obstacle's two-dimensional contour.
Optionally, the processor is specifically configured to determine the obstacles whose contour points currently have a shortest distance to the device's moving heading of no more than a second preset distance as the reference obstacles, where the obstacle center point is the center point of each obstacle's two-dimensional contour and the device's moving heading is a ray drawn from the center point of the target mobile device.
Optionally, the processor is specifically configured to take the device center point of the target mobile device as the starting point and draw two rays tangent to the two-dimensional contour of any reference obstacle, and to determine the region enclosed by the two rays tangent to the two-dimensional contour of that reference obstacle as the obstacle blind area formed by that reference obstacle with respect to the target mobile device.
Optionally, the processor is specifically configured to determine whether any unmarked obstacle is in the obstacle blind area formed by a reference obstacle by the following method:
determining the obstacle region enclosed by the two-dimensional contour of the obstacle;
when a preset proportion of the obstacle region is determined to be within the obstacle blind area formed by the reference obstacle, determining that the obstacle is in the obstacle blind area formed by the reference obstacle.
As shown in Fig. 15, based on the same inventive concept, this embodiment also provides an obstacle detection apparatus, the apparatus comprising:
a reference obstacle determination unit 1501, configured to determine the obstacles that are currently dangerous to a target mobile device as reference obstacles and mark them;
an obstacle blind area determination unit 1502, configured to determine the obstacle blind area formed by each reference obstacle relative to the target mobile device;
a non-dangerous obstacle determination unit 1503, configured to, based on the obstacle blind area, determine the unmarked obstacles located in the obstacle blind area as non-dangerous obstacles that pose no threat to the target device and delete them.
The reference obstacle determination unit is configured to determine the obstacles currently within a preset distance range of the target mobile device as the reference obstacles; and/or to determine the obstacles currently on the moving heading of the target mobile device as the reference obstacles.
The reference obstacle determination unit is configured to determine the obstacles whose center point is currently no farther than a first preset distance from the device center point as the reference obstacles, where the device center point is the center point of the target mobile device and each obstacle center point is the center point of that obstacle's two-dimensional contour.
The reference obstacle determination unit is configured to determine the obstacles whose contour points currently have a shortest distance to the device's moving heading of no more than a second preset distance as the reference obstacles, where the obstacle center point is the center point of each obstacle's two-dimensional contour and the device's moving heading is a ray drawn from the center point of the target mobile device.
The obstacle blind area determination unit is configured to take the device center point of the target mobile device as the starting point and draw two rays tangent to the two-dimensional contour of any reference obstacle, and to determine the region enclosed by the two rays tangent to the two-dimensional contour of that reference obstacle as the obstacle blind area formed by that reference obstacle with respect to the target mobile device.
The non-dangerous obstacle determination unit is configured to determine whether any unmarked obstacle is in the obstacle blind area formed by a reference obstacle by the following method:
determining the obstacle region enclosed by the two-dimensional contour of the obstacle;
when a preset proportion of the obstacle region is determined to be within the obstacle blind area formed by the reference obstacle, determining that the obstacle is in the obstacle blind area formed by the reference obstacle.
Based on the same inventive concept, the embodiments of the application also provide a computer-readable non-volatile storage medium comprising program code; when the program code runs on a computing terminal, the program code causes the computing terminal to execute the steps of the method provided by the application and described above.
The application has been described above with reference to block diagrams and/or flowcharts of methods, apparatuses (systems) and/or computer program products according to embodiments of the application. It should be understood that blocks of the block diagrams and/or flowchart illustrations, and combinations of blocks in the block diagrams and/or flowchart illustrations, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general-purpose computer, a special-purpose computer and/or another programmable data processing apparatus to produce a machine, such that the instructions executed via the computer processor and/or the other programmable data processing apparatus create means for implementing the functions/acts specified in the blocks of the block diagrams and/or flowcharts.
Accordingly, the application may also be implemented in hardware and/or software (including firmware, resident software, microcode, etc.). Furthermore, the application may take the form of a computer program product on a computer-usable or computer-readable storage medium having computer-usable or computer-readable program code embodied in the medium for use by, or in connection with, an instruction execution system. In the present context, a computer-usable or computer-readable medium may be any medium that can contain, store, communicate, propagate or transmit the program for use by, or in connection with, an instruction execution system, apparatus or device.
Obviously, those skilled in the art can make various modifications and variations to the application without departing from the spirit and scope of the application. Thus, if these modifications and variations of the application fall within the scope of the claims of the application and their technical equivalents, the application is also intended to include these modifications and variations.
Claims (11)
1. An obstacle detection method, characterized in that the method comprises:
determining the obstacles that are currently dangerous to a target mobile device as reference obstacles and marking them;
determining the obstacle blind area formed by each reference obstacle relative to the target mobile device;
based on the obstacle blind area, determining the unmarked obstacles located in the obstacle blind area as non-dangerous obstacles that pose no threat to the target device, and deleting them.
2. The method according to claim 1, characterized in that determining the obstacles that are currently dangerous to the target mobile device as reference obstacles and marking them comprises:
determining the obstacles currently within a preset distance range of the target mobile device as the reference obstacles; and/or
determining the obstacles currently on the moving heading of the target mobile device as the reference obstacles.
3. The method according to claim 2, characterized in that determining the obstacles currently within a preset distance range of the target mobile device as the reference obstacles comprises:
determining the obstacles whose center point is currently no farther than a first preset distance from the device center point as the reference obstacles, wherein the device center point is the center point of the target mobile device and each obstacle center point is the center point of that obstacle's two-dimensional contour.
4. The method according to claim 2, characterized in that determining the obstacles currently on the moving heading of the target mobile device as the reference obstacles comprises:
determining the obstacles whose contour points currently have a shortest distance to the device's moving heading of no more than a second preset distance as the reference obstacles, wherein the obstacle center point is the center point of each obstacle's two-dimensional contour and the device's moving heading is a ray drawn from the center point of the target mobile device.
5. The method according to claim 3 or 4, characterized in that determining the obstacle blind area formed by a reference obstacle relative to the target mobile device comprises:
taking the device center point of the target mobile device as the starting point, drawing two rays tangent to the two-dimensional contour of any reference obstacle;
determining the region enclosed by the two rays tangent to the two-dimensional contour of that reference obstacle as the obstacle blind area formed by that reference obstacle with respect to the target mobile device.
6. The method according to claim 5, characterized in that whether any unmarked obstacle is in the obstacle blind area formed by a reference obstacle is determined by the following method:
determining the obstacle region enclosed by the two-dimensional contour of the obstacle;
when a preset proportion of the obstacle region is determined to be within the obstacle blind area formed by the reference obstacle, determining that the obstacle is in the obstacle blind area formed by the reference obstacle.
7. An obstacle detection device, characterized in that the device comprises a processor and a memory, wherein the memory stores an executable program and, when the program is executed, the processor implements the following process:
determining the obstacles that are currently dangerous to a target mobile device as reference obstacles and marking them;
determining the obstacle blind area formed by each reference obstacle relative to the target mobile device;
based on the obstacle blind area, determining the unmarked obstacles located in the obstacle blind area as non-dangerous obstacles that pose no threat to the target device, and deleting them.
8. The device according to claim 7, characterized in that the processor is specifically configured to:
determine the obstacles currently within a preset distance range of the target mobile device as the reference obstacles; and/or
determine the obstacles currently on the moving heading of the target mobile device as the reference obstacles.
9. The device according to claim 7, characterized in that the processor is specifically configured to:
take the device center point of the target mobile device as the starting point and draw two rays tangent to the two-dimensional contour of any reference obstacle;
determine the region enclosed by the two rays tangent to the two-dimensional contour of that reference obstacle as the obstacle blind area formed by that reference obstacle with respect to the target mobile device.
10. An obstacle detection apparatus, characterized in that the apparatus comprises:
a reference obstacle determination unit, configured to determine the obstacles that are currently dangerous to a target mobile device as reference obstacles and mark them;
an obstacle blind area determination unit, configured to determine the obstacle blind area formed by each reference obstacle relative to the target mobile device;
a non-dangerous obstacle determination unit, configured to, based on the obstacle blind area, determine the unmarked obstacles located in the obstacle blind area as non-dangerous obstacles that pose no threat to the target device and delete them.
11. A computer storage medium on which a computer program is stored, characterized in that, when executed by a processor, the program implements the steps of the method according to any one of claims 1 to 6.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910596133.2A CN110346799A (en) | 2019-07-03 | 2019-07-03 | A kind of obstacle detection method and equipment |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910596133.2A CN110346799A (en) | 2019-07-03 | 2019-07-03 | A kind of obstacle detection method and equipment |
Publications (1)
Publication Number | Publication Date |
---|---|
CN110346799A true CN110346799A (en) | 2019-10-18 |
Family
ID=68177821
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910596133.2A Pending CN110346799A (en) | 2019-07-03 | 2019-07-03 | A kind of obstacle detection method and equipment |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110346799A (en) |
Patent Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105261224A (en) * | 2015-09-02 | 2016-01-20 | 奇瑞汽车股份有限公司 | Intelligent vehicle control method and apparatus |
CN106143309A (en) * | 2016-07-18 | 2016-11-23 | 乐视控股(北京)有限公司 | A kind of vehicle blind zone based reminding method and system |
CN109844671A (en) * | 2016-08-26 | 2019-06-04 | 克朗设备公司 | The path confirmation and dynamic route modification of materials handling vehicle |
CN106872994A (en) * | 2017-04-14 | 2017-06-20 | 北京佳讯飞鸿电气股份有限公司 | A kind of Laser Radar Scanning method and device |
WO2018217498A1 (en) * | 2017-05-22 | 2018-11-29 | Pcms Holdings, Inc. | Method and apparatus for in-vehicle augmented reality visualization of sensor range and field-of-view |
CN107831777A (en) * | 2017-09-26 | 2018-03-23 | 中国科学院长春光学精密机械与物理研究所 | A kind of aircraft automatic obstacle avoiding system, method and aircraft |
CN109598187A (en) * | 2018-10-15 | 2019-04-09 | 西北铁道电子股份有限公司 | Obstacle recognition method, differentiating obstacle and railcar servomechanism |
CN109910008A (en) * | 2019-03-14 | 2019-06-21 | 烟台市广智微芯智能科技有限责任公司 | Avoidance early warning system and method for early warning for data type laser radar robot |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111598034A (en) * | 2020-05-22 | 2020-08-28 | 知行汽车科技(苏州)有限公司 | Obstacle detection method, obstacle detection device and storage medium |
CN113859228A (en) * | 2020-06-30 | 2021-12-31 | 上海商汤智能科技有限公司 | Vehicle control method and device, electronic equipment and storage medium |
WO2022001322A1 (en) * | 2020-06-30 | 2022-01-06 | 上海商汤智能科技有限公司 | Vehicle control method and apparatus, and electronic device and storage medium |
CN112132929A (en) * | 2020-09-01 | 2020-12-25 | 北京布科思科技有限公司 | Grid map marking method based on depth vision and single line laser radar |
CN112132929B (en) * | 2020-09-01 | 2024-01-26 | 北京布科思科技有限公司 | Grid map marking method based on depth vision and single-line laser radar |
WO2023093056A1 (en) * | 2021-11-29 | 2023-06-01 | 上海商汤智能科技有限公司 | Vehicle control |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109017570B (en) | Vehicle surrounding scene presenting method and device and vehicle | |
CN110346799A (en) | A kind of obstacle detection method and equipment | |
US11656620B2 (en) | Generating environmental parameters based on sensor data using machine learning | |
CN110674705B (en) | Small-sized obstacle detection method and device based on multi-line laser radar | |
EP2372310B1 (en) | Image processing system and position measurement system | |
US11670087B2 (en) | Training data generating method for image processing, image processing method, and devices thereof | |
CN109791052A (en) | For generate and using locating reference datum method and system | |
US11280630B2 (en) | Updating map data | |
CN107850448A (en) | Method and system for generating and using locating reference datum | |
EP2372605A2 (en) | Image processing system and position measurement system | |
CN110531376A (en) | Detection of obstacles and tracking for harbour automatic driving vehicle | |
CN104376297A (en) | Detection method and device for linear indication signs on road | |
CN111338382B (en) | Unmanned aerial vehicle path planning method guided by safety situation | |
CN114812581A (en) | Cross-country environment navigation method based on multi-sensor fusion | |
CN108108750A (en) | Metric space method for reconstructing based on deep learning and monocular vision | |
CN109271857A (en) | A kind of puppet lane line elimination method and device | |
CN111595357A (en) | Display method and device of visual interface, electronic equipment and storage medium | |
CN110389587A (en) | A kind of robot path planning's new method of target point dynamic change | |
CN111881245B (en) | Method, device, equipment and storage medium for generating visibility dynamic map | |
CN114765972A (en) | Display method, computer program, controller and vehicle for representing a model of the surroundings of a vehicle | |
KR101868898B1 (en) | Method and apparatus of identifying lane for self-driving car | |
CN114170499A (en) | Target detection method, tracking method, device, visual sensor and medium | |
CN116142172A (en) | Parking method and device based on voxel coordinate system | |
CN114212106A (en) | Method and device for determining safety probability in driving area of vehicle | |
US20240257376A1 (en) | Method and system for detection a line above ground from a helicopter |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |
RJ01 | Rejection of invention patent application after publication | Application publication date: 20191018 |