CN110751336B - Obstacle avoidance method and obstacle avoidance device of unmanned carrier and unmanned carrier - Google Patents
- Publication number: CN110751336B (application CN201911006620.5A)
- Authority
- CN
- China
- Legal status: Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q10/00—Administration; Management
- G06Q10/04—Forecasting or optimisation specially adapted for administrative or management purposes, e.g. linear programming or "cutting stock problem"
- G06Q10/047—Optimisation of routes or paths, e.g. travelling salesman problem
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/20—Instruments for performing navigational calculations
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
Abstract
The embodiment of the invention relates to an obstacle avoidance method and an obstacle avoidance device of an unmanned vehicle and the unmanned vehicle. The method comprises the following steps: determining a first detection distance according to the maximum detection range of the unmanned vehicle; determining a second detection distance according to the first detection distance, and determining a second detection range according to the second detection distance; and when an obstacle is detected in the second detection range, planning an obstacle avoidance path according to the second detection range and the maximum detection range. The method determines the first detection distance from the maximum detection range of the unmanned vehicle, and then determines from it the second detection distance and the second detection range corresponding to the second detection distance. Because the second detection range corresponding to the second detection distance is smaller than the maximum detection range corresponding to the first detection distance, more complete image information of the obstacle can be obtained from the second detection range and the maximum detection range, and a more accurate obstacle avoidance path can then be determined from that more complete image information.
Description
[ technical field ]
The invention relates to the technical field of unmanned vehicles, in particular to an obstacle avoidance method and an obstacle avoidance device of an unmanned vehicle and the unmanned vehicle.
[ background of the invention ]
An unmanned vehicle is a vehicle that carries no human on board. Unmanned vehicles can generally be roughly classified into the following types: Unmanned Ground Vehicles (UGVs), which operate on the ground; Unmanned Aerial Vehicles (UAVs), often referred to as drones; Unmanned Surface Vehicles (USVs), which operate on the water surface; and Unmanned Underwater Vehicles (UUVs), which operate under water. An unmanned vehicle is generally controlled by remote control, guidance, or automatic driving, needs no crew on board, and can be used for scientific research, military, and leisure and entertainment purposes.
In recent years, unmanned vehicles have been increasingly widely applied in fields such as surveying and mapping, search and rescue, real estate, and agriculture, owing to their flexibility, portability, and strong spatial maneuverability, and they are ever more popular with consumers for aerial photography and entertainment. This popularization requires that unmanned vehicles work safely and reliably in a variety of environments. Improving the safety and environmental adaptability of unmanned vehicles, so that they can recognize surrounding environment information and effectively avoid obstacles, is a key focus of unmanned vehicle research and a technical difficulty that urgently needs to be overcome.
In the process of implementing the present invention, the inventors found that the related art has at least the following problems: when an existing unmanned vehicle uses a depth sensor to detect obstacles and plan a path, the detection range of the depth sensor is limited, so for a given braking performance an upper speed limit must be set to ensure that the vehicle can detour or brake in time when an obstacle is detected. However, during obstacle avoidance, if path planning is performed using only the partial region of the obstacle that has just entered the detection range, the obstacle detection is incomplete and the planned route cannot be computed well, resulting in an inaccurate path planning result.
[ summary of the invention ]
The embodiment of the invention provides an unmanned vehicle obstacle avoidance method, an obstacle avoidance device and an unmanned vehicle, which can solve the technical problems of incomplete obstacle detection and low accuracy of a path planning result.
In order to solve the above technical problem, the embodiments of the present invention provide the following technical solutions: an obstacle avoidance method for an unmanned vehicle. The unmanned vehicle obstacle avoidance method comprises the following steps:
determining a first detection distance according to a maximum detection range of the unmanned vehicle;
determining a second detection distance according to the first detection distance, wherein the second detection distance is smaller than the first detection distance;
determining a second detection range according to the second detection distance;
and when an obstacle is detected in a second detection range, planning an obstacle avoidance path according to the second detection range and the maximum detection range.
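Taken together, the claimed steps amount to a simple pipeline: derive a first detection distance from the sensor's maximum detection range, shorten it to a second detection distance, and trigger path planning only when an obstacle enters the smaller range. A minimal Python sketch (function names and the numeric values are illustrative assumptions, not from the patent):

```python
def first_detection_distance(max_detection_range: float) -> float:
    # The first detection distance is derived from the depth sensor's
    # maximum detection range (here, the perpendicular distance to the
    # maximum detection threshold).
    return max_detection_range

def second_detection_distance(d1: float, preset_distance: float) -> float:
    # The second detection distance is the first shortened by a preset
    # distance, so it is strictly smaller than the first.
    return d1 - preset_distance

def on_detection(obstacle_distance: float, d2: float) -> str:
    # Path planning is triggered only once an obstacle enters the
    # second detection range; otherwise the vehicle keeps moving.
    return "expand_and_plan" if obstacle_distance <= d2 else "continue"

d1 = first_detection_distance(10.0)      # assumed 10 m sensor limit
d2 = second_detection_distance(d1, 2.0)  # assumed 2 m preset distance
print(on_detection(7.5, d2))             # obstacle inside the 8 m range
```

Because the trigger range is smaller than the sensor's full range, there is always some headroom left in which the obstacle can be imaged more completely before a path is committed.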
Optionally, the planning an obstacle avoidance path according to the second detection range and the maximum detection range includes:
extending the second detection range to a maximum detection range;
acquiring a depth image of the obstacle within a maximum detection range;
and planning an obstacle avoidance path according to the depth image.
Optionally, the planning an obstacle avoidance path according to the second detection range and the maximum detection range includes:
expanding the second detection range to a preset detection range, wherein the preset detection range corresponds to a preset detection distance, the preset detection distance is greater than the second detection distance, and the preset detection distance is smaller than the first detection distance;
acquiring a depth image of the obstacle within a preset detection range;
and planning an obstacle avoidance path according to the depth image.
Optionally, the planning an obstacle avoidance path according to the depth image includes:
judging whether the depth image is a complete depth image of the obstacle or not;
if so, planning an obstacle avoidance path according to the depth image;
if not, expanding the preset detection range to the maximum detection range, acquiring the depth image of the obstacle in the maximum detection range, and planning an obstacle avoidance path according to the depth image of the obstacle.
Optionally, the planning an obstacle avoidance path according to the depth image includes:
judging whether the depth image is a complete depth image of the obstacle or not;
if so, planning an obstacle avoidance path according to the depth image;
if not, increasing the preset detection distance corresponding to the preset detection range by a preset distance, and continuously acquiring the depth image of the obstacle in the expanded preset detection range.
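The two branches above describe the same loop: grow the detection range step by step until the obstacle's depth image is judged complete or the maximum range is reached. A hedged sketch (the completeness check is passed in as a callable because the patent only specifies its outcome; the names and values are illustrative):

```python
def expand_until_complete(preset_d: float, step: float, max_d: float,
                          depth_image_complete) -> float:
    # Starting from the preset detection distance, repeatedly enlarge it
    # by the preset step until the depth image of the obstacle is judged
    # complete or the next step would exceed the maximum detection
    # distance.
    d = preset_d
    while not depth_image_complete(d) and d + step <= max_d:
        d += step
    return d

# Example: the depth image becomes complete once the range reaches 6 m.
print(expand_until_complete(3.0, 1.0, 10.0, lambda d: d >= 6.0))
```

Stepping in small increments, rather than jumping straight to the maximum range, keeps the amount of depth data processed per frame close to the minimum needed for a complete view of the obstacle.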
Optionally, the depth image comprises a three-dimensional point cloud of the obstacle.
Optionally, after the obtaining the depth image of the obstacle, the method further includes:
and carrying out filtering and/or smoothing processing on the depth image.
Optionally, before planning the obstacle avoidance path, the method further includes: and decelerating the unmanned vehicle to a preset safe speed.
Optionally, the decelerating the unmanned vehicle to a preset safe speed includes:
decelerating the unmanned vehicle to a preset safe speed at a preset acceleration;
calculating the preset acceleration by the following formula:

a = (V^2 - V_max^2) / (2d)

where V is the current speed, V_max is the preset safe speed, d is the second detection distance, and a is the preset acceleration.
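The listed variables pin down the relation by constant-deceleration kinematics (V_max^2 = V^2 - 2ad over the second detection distance d). A sketch under that kinematic assumption:

```python
def preset_acceleration(v: float, v_max: float, d: float) -> float:
    # Constant deceleration that brings the vehicle from its current
    # speed v down to the preset safe speed v_max within the second
    # detection distance d: a = (v^2 - v_max^2) / (2 * d).
    return (v ** 2 - v_max ** 2) / (2 * d)

# Slowing from 8 m/s to 4 m/s over a 6 m second detection distance.
print(preset_acceleration(8.0, 4.0, 6.0))
```

The result is positive when V exceeds V_max, i.e. it is the magnitude of the required deceleration.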
In order to solve the above technical problems, embodiments of the present invention further provide the following technical solution: an obstacle avoidance device applied to an unmanned vehicle. The obstacle avoidance device includes: a first detection distance determining module, configured to determine a first detection distance according to a maximum detection range of the unmanned vehicle;
a second detection distance determining module, configured to determine a second detection distance according to the first detection distance, where the second detection distance is smaller than the first detection distance;
a second detection range determining module, configured to determine a second detection range according to the second detection distance;
and the obstacle avoidance path planning module is used for planning an obstacle avoidance path according to the second detection range and the maximum detection range when an obstacle is detected in the second detection range.
Optionally, the obstacle avoidance path planning module includes a detection range expansion unit, a depth image acquisition unit, and an obstacle avoidance path planning unit;
the detection range extending unit is used for extending the second detection range to a maximum detection range;
the depth image acquisition unit is used for acquiring a depth image of the obstacle in a maximum detection range;
and the obstacle avoidance path planning unit is used for planning an obstacle avoidance path according to the depth image.
In order to solve the above technical problems, embodiments of the present invention further provide the following technical solutions: an unmanned vehicle. The unmanned vehicle includes:
an unmanned vehicle body;
a depth sensor arranged on the unmanned vehicle body and configured to acquire a depth image of an obstacle for planning an obstacle avoidance path;
a processor disposed within the unmanned vehicle body and communicatively connected to the depth sensor; and,
a memory communicatively coupled to the processor; wherein,
the memory stores instructions executable by the processor to cause the processor to perform the method described above.
Compared with the prior art, the obstacle avoidance method of the unmanned vehicle provided by the embodiment of the invention determines a first detection distance according to the maximum detection range of the unmanned vehicle, and then determines, according to the first detection distance, a second detection distance and the second detection range corresponding to the second detection distance. Because the second detection range corresponding to the second detection distance is smaller than the maximum detection range corresponding to the first detection distance, more complete image information of the obstacle can be obtained according to the second detection range and the maximum detection range, and a more accurate obstacle avoidance path can be further determined according to the more complete image information of the obstacle.
[ description of the drawings ]
One or more embodiments are illustrated by way of example in the accompanying drawings, in which like reference numerals refer to similar elements and in which the figures are not to scale unless otherwise specified.
FIG. 1 is a schematic diagram of an application environment of an embodiment of the present invention;
fig. 2 is a schematic flow chart of an obstacle avoidance method according to an embodiment of the present invention;
fig. 3 is a schematic structural diagram of an obstacle avoidance method according to an embodiment of the present invention;
fig. 4 is a schematic structural diagram of an obstacle avoidance method according to another embodiment of the present invention;
FIG. 5 is a schematic flow chart of one embodiment of S40 of FIG. 2;
fig. 6 is a schematic structural diagram of an obstacle avoidance method according to another embodiment of the present invention;
FIG. 7 is a schematic flow chart of another embodiment of S40 of FIG. 2;
fig. 8 is a schematic structural diagram of an obstacle avoidance method according to yet another embodiment of the present invention;
FIG. 9 is a schematic flow chart of one embodiment of S46 of FIG. 7;
FIG. 10 is a schematic flow chart of another embodiment of S46 of FIG. 7;
fig. 11 is a block diagram of an obstacle avoidance apparatus for an unmanned vehicle according to an embodiment of the present invention;
fig. 12 is a block diagram of an unmanned vehicle according to an embodiment of the present invention.
[ detailed description ]
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
It should be noted that, where no conflict arises, the various features of the embodiments of the present application may be combined with each other within the scope of protection of the present application. Additionally, although functional modules are divided in the apparatus schematics and logical sequences are shown in the flowcharts, in some cases the steps shown or described may be performed in an order different from the module division in the apparatus or the sequence in the flowcharts. Furthermore, the terms "first," "second," "third," and the like used herein do not limit the data or the execution order, but merely distinguish similar items having substantially the same function or effect.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. The terminology used in the description of the invention herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used herein, the term "and/or" includes any and all combinations of one or more of the associated listed items.
Furthermore, the technical features mentioned in the different embodiments of the invention described below can be combined with each other as long as they do not conflict with each other.
The embodiment of the invention provides an obstacle avoidance method and device for an unmanned vehicle. The method and device determine a first detection distance according to the maximum detection range of the unmanned vehicle, and then determine, according to the first detection distance, a second detection distance and the second detection range corresponding to the second detection distance. Because the second detection range corresponding to the second detection distance is smaller than the maximum detection range corresponding to the first detection distance, more complete image information of an obstacle can be obtained according to the second detection range and the maximum detection range, and a more accurate obstacle avoidance path can be further determined according to the more complete image information of the obstacle.
The following illustrates an application environment of the unmanned vehicle obstacle avoidance method and device.
Fig. 1 is a schematic view of an application environment of an unmanned vehicle obstacle avoidance system according to an embodiment of the present invention; as shown in fig. 1, the application scenario includes an unmanned vehicle 10, a wireless network 20, a smart terminal 30, and a user 40. A user 40 may operate a smart terminal 30 to maneuver the unmanned vehicle 10 over the wireless network 20.
The unmanned vehicle 10 may have a corresponding volume or power according to the needs of actual conditions, so as to provide a load capacity, a flight speed, a flight endurance, and the like that can meet the use needs. One or more functional modules can be added to the unmanned vehicle 10, so that the unmanned vehicle 10 can implement corresponding functions.
For example, in the present embodiment, the unmanned vehicle 10 is provided with at least one sensor of a depth sensor, a distance sensor, a magnetometer, a GPS navigator, and a vision sensor. Correspondingly, the unmanned vehicle 10 is provided with an information receiving device for receiving and processing the information collected by the at least one sensor.
The unmanned vehicle 10 includes at least one main control chip as a control core for unmanned vehicle motion, data transmission, and the like, and integrates one or more modules to execute corresponding logic control programs.
For example, in some embodiments, the main control chip may include an unmanned vehicle obstacle avoidance device 50 for selecting and processing an obstacle avoidance path.
The smart terminal 30 may be any type of smart device, such as a mobile phone, a tablet computer, or a smart remote controller, for establishing a communication connection with the unmanned vehicle 10. The intelligent terminal 30 may be equipped with one or more different user 40 interaction means for collecting user 40 instructions or presenting and feeding information to the user 40.
These interaction means include, but are not limited to: buttons, a display screen, a touch screen, a speaker, and remote-control sticks. For example, the intelligent terminal 30 may be equipped with a touch display screen through which remote control instructions from the user 40 to the unmanned vehicle 10 are received and through which image information obtained by the unmanned vehicle is displayed to the user 40; the user 40 may also switch the image information currently shown on the display screen through the touch screen.
In some embodiments, the unmanned vehicle 10 and the intelligent terminal 30 may further integrate existing image vision processing technology to provide more intelligent services. For example, the unmanned vehicle 10 may collect images with a dual-light camera and the intelligent terminal 30 may analyze those images, so as to realize gesture control of the unmanned vehicle 10 by the user 40.
The wireless network 20 may be a wireless communication network based on any type of data transmission principle for establishing a data transmission channel between two nodes, such as a bluetooth network, a WiFi network, a wireless cellular network or a combination thereof located in different signal frequency bands.
Fig. 2 is a diagram illustrating an embodiment of an obstacle avoidance method for an unmanned vehicle according to an embodiment of the present invention. As shown in fig. 2, the unmanned vehicle obstacle avoidance method includes the following steps:
s10: and determining a first detection distance according to the maximum detection range of the unmanned vehicle.
In particular, the maximum detection range may be acquired by a depth sensor onboard the unmanned vehicle. The maximum detection range may be a physical detection limit range of the depth sensor, or may be a maximum stable working range in an actual application process.
The depth sensor is a general term of sensors capable of acquiring environmental depth, and different sensors are selected according to different environments and required accuracy.
In this embodiment, the depth sensor is configured to detect obstacle information on the path of the unmanned vehicle. One specific implementation is the Kinect depth sensor, whose system architecture includes a physical layer, a driver and interface layer, and an application layer. The SDK and API provided by Kinect can implement multiple types of functions, chief among them obtaining depth data: the infrared emitter projects a scattered dot matrix of infrared light into the environment, the infrared receiver receives the light reflected by obstacles in the environment, and the Kinect computes from these data the depth of each pixel in the received image, thereby generating a three-dimensional depth image.
The depth sensor may also be a binocular camera, an RGBD camera, structured light, TOF, lidar, or the like.
Specifically, different sensors are selected for different environments and required accuracy, and after the type of the depth sensor of the unmanned vehicle is determined, the first detection distance can be determined according to the maximum detection range of the depth sensor.
Specifically, for example, as shown in fig. 3, the maximum detection threshold L1 corresponding to the maximum detection range 112 of the depth sensor 11 is first determined, and then a first perpendicular line is drawn from the depth sensor 11 perpendicular to the maximum detection threshold L1. The first perpendicular line intersects the maximum detection threshold L1 at a point, and the distance between this intersection and the depth sensor 11 is the first detection distance D1.
In some embodiments, the first detection distance may also be obtained by other means, for example, according to the maximum detection angle corresponding to the maximum detection range, as long as the acquired first detection distance is a unique value once the maximum detection range of the depth sensor is determined.
S20: a second detection range is determined from the first detection range, wherein the second detection range is less than the first detection range.
Specifically, after a first detection distance is obtained through the maximum detection range of the depth sensor, the first detection distance is shortened by a preset distance to obtain the second detection distance, where the preset distance may be determined according to a size of a pre-stored obstacle, for example, the preset distance is equal to or greater than the size of the pre-stored obstacle.
During historical driving, the unmanned vehicle stores the sizes of detected obstacles in a designated memory, and the size of the pre-stored obstacle can be obtained from the obstacle sizes stored there. For example, the average of the stored obstacle sizes may be calculated and the size of the pre-stored obstacle taken to equal that average, that is, the preset distance is equal to or greater than the average obstacle size.
Specifically, for example, as shown in fig. 4, after the first detection distance D1 is acquired through the maximum detection range 112 of the depth sensor 11, the second detection distance D2 is obtained by shortening the first detection distance D1 by the preset distance Δ D. The direction of shortening the first detection distance D1 by the preset distance is opposite to the extending direction of the first perpendicular line.
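The shortening step D2 = D1 - ΔD, with ΔD drawn from the pre-stored obstacle sizes as described above, can be sketched as follows (a minimal illustration; the names and values are not from the patent, and the preset distance is taken as exactly the average stored obstacle size):

```python
def average_obstacle_size(stored_sizes: list[float]) -> float:
    # Average of the obstacle sizes recorded along historical driving
    # paths in the designated memory.
    return sum(stored_sizes) / len(stored_sizes)

def second_detection_distance(d1: float, stored_sizes: list[float]) -> float:
    # D2 = D1 minus the preset distance, with the preset distance here
    # taken as exactly the average stored obstacle size.
    return d1 - average_obstacle_size(stored_sizes)

# D1 = 10 m with previously recorded obstacle sizes of 1 m, 2 m, and 3 m.
print(second_detection_distance(10.0, [1.0, 2.0, 3.0]))
```

Making the margin at least one typical obstacle size means an obstacle that crosses the second detection threshold is likely to fit entirely between that threshold and the maximum one.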
S30: and determining a second detection range according to the second detection distance.
Specifically, after the first detection distance is shortened by the preset distance to obtain the second detection distance, the second detection range may be uniquely determined according to the second detection distance.
Specifically, for example, as shown in fig. 4, after the first detection distance D1 is shortened by a preset distance Δ D to obtain the second detection distance D2, a perpendicular line is made to perpendicularly intersect with one end of the second detection distance D2, the perpendicular line intersects with an edge of the maximum detection range to form two intersection points, the two intersection points are connected to obtain a second detection threshold L2 corresponding to the second detection distance D2, and the second detection threshold L2 and the edge of the maximum detection range form a second detection range 114 corresponding to the second detection distance D2.
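Modelling the sensor's field of view as a symmetric cone (an assumption for illustration; the patent only shows the geometry in fig. 4), the length of the second detection threshold L2 at distance D2 follows from simple trigonometry:

```python
import math

def threshold_width(d2: float, fov_degrees: float) -> float:
    # With a symmetric field of view of fov_degrees, the perpendicular
    # through the end of D2 meets the two range edges 2 * d2 * tan(fov/2)
    # apart, which is the length of the second detection threshold L2.
    return 2 * d2 * math.tan(math.radians(fov_degrees) / 2)

# Assumed 8 m second detection distance and a 60-degree field of view.
print(round(threshold_width(8.0, 60.0), 3))
```

The same expression with D1 gives the maximum detection threshold L1, so L2 is strictly shorter than L1 whenever D2 < D1.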
In some embodiments, the second detection range may be obtained by other methods, as long as the second detection range determined is the unique range corresponding to the second detection distance.
S40: and when an obstacle is detected in a second detection range, planning an obstacle avoidance path according to the second detection range and the maximum detection range.
Specifically, the unmanned vehicle can detect whether an obstacle exists in the second detection range by means of a visual sensor, ultrasonic waves, infrared rays, millimeter-wave radar, or the like.
Specifically, in the present embodiment, a vision sensor is used in place of human eyes to capture objective object information, and contour information, depth information, position information, and the like of an object are acquired through related visual image processing algorithms to detect whether an obstacle exists in the second detection range. Unlike sensors such as ultrasonic and laser radar, the vision sensor passively receives light information (a passive perception sensor), and the information it obtains is rich. Ultrasonic and laser radar sensors actively emit sound or light waves (active perception sensors) and receive the reflected information, so the information they acquire is simpler.
The vision sensor comprises a lens, an image sensor, an analog-to-digital converter, an image processor, a memory, and the like. During imaging, light from an obstacle in three-dimensional space is projected onto a two-dimensional plane through the lens by geometric image transformation; the image sensor collects the two-dimensional optical signal to obtain an analog image signal; the analog-to-digital converter encodes the analog signal into a digital image; and finally the image processor re-encodes the digital image and stores it in the memory. In this embodiment, a high-end camera with a CCD image sensor is used as the vision sensor, since CCDs offer advantages such as high image quality and strong noise immunity.
Specifically, for example, as shown in fig. 4, when it is detected that the obstacle 12 exists in the second detection range 114, an obstacle avoidance path is planned according to the second detection range 114 and the maximum detection range 112.
In this embodiment, by determining the first detection distance according to the maximum detection range of the unmanned vehicle, and then determining the second detection distance and the second detection range corresponding to it according to the first detection distance, since the second detection range corresponding to the second detection distance is smaller than the maximum detection range corresponding to the first detection distance, more complete image information of the obstacle can be obtained according to the second detection range and the maximum detection range, and a more accurate obstacle avoidance path can be further determined according to that more complete image information.
In some embodiments, to ensure that the unmanned vehicle can detour or brake more safely and smoothly to avoid the obstacle, planning the obstacle avoidance path further comprises the following step:
and decelerating the unmanned vehicle to a preset safe speed.
Specifically, decelerating the unmanned vehicle to a preset safe speed at a preset acceleration;
calculating the preset acceleration by the following formula: a = (V^2 - V_max^2) / (2d),
where V is the current speed, V_max is the preset safe speed, d is the second detection distance, and a is the preset acceleration.
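The quantities above are related by the standard uniform-deceleration kinematic identity V_max^2 = V^2 - 2ad. A minimal sketch of the computation (variable names are illustrative, not taken from the patent):

```python
def preset_acceleration(v, v_max, d):
    """Deceleration needed to slow from the current speed v to the
    preset safe speed v_max within the second detection distance d,
    from the uniform-deceleration relation v_max^2 = v^2 - 2*a*d."""
    if d <= 0:
        raise ValueError("detection distance must be positive")
    return (v ** 2 - v_max ** 2) / (2.0 * d)
```

For instance, slowing from 10 m/s to a safe speed of 4 m/s over a 14 m second detection distance requires (100 - 16) / 28 = 3 m/s^2 of deceleration.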
In order to provide more complete obstacle information to ensure the accuracy of obstacle avoidance path planning, in some embodiments, as shown in fig. 5, S40 includes the following steps:
s41: extending the second detection range to a maximum detection range.
Specifically, as shown in fig. 6, when it is detected that the obstacle 12 exists within the second detection range 114, the second detection range 114 corresponding to the second detection distance D2 is expanded to the maximum detection range 112 corresponding to the first detection distance D1. Once the obstacle 12 is detected within the second detection range 114, the second detection range 114 is expanded to the maximum detection range 112, so that more complete obstacle information can be acquired and a more accurate obstacle avoidance path can be determined from it.
S43: acquiring a depth image of the obstacle within a maximum detection range.
Specifically, as shown in fig. 6, when the second detection range 114 corresponding to the second detection distance D2 is expanded to the maximum detection range 112 corresponding to the first detection distance D1, the depth image of the obstacle within the maximum detection range 112 is acquired.
Specifically, a three-dimensional point cloud of the obstacle can be generally obtained through a depth sensor, and a depth image of the obstacle is obtained by performing feature extraction and matching processing on the three-dimensional point cloud.
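As a rough sketch of this point-cloud-to-depth-image step, the simplest projection keeps the nearest depth per grid cell. This is an orthographic toy model; a real depth sensor would apply its intrinsic camera model, and all names here are illustrative assumptions:

```python
def point_cloud_to_depth(points, w, h, cell):
    """Project a 3-D point cloud of (x, y, z) tuples onto a w x h depth
    grid, keeping per cell the nearest z value (orthographic sketch)."""
    inf = float("inf")
    depth = [[inf] * w for _ in range(h)]
    for x, y, z in points:
        col, row = int(x // cell), int(y // cell)
        if 0 <= col < w and 0 <= row < h:
            # a nearer return for the same cell overwrites a farther one
            depth[row][col] = min(depth[row][col], z)
    return depth
```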
In some embodiments, the depth image of the obstacle may also be acquired based on an inter-frame difference algorithm, a background difference algorithm, an optical flow method, or an image segmentation algorithm. For example, the inter-frame difference algorithm obtains the edge contour of a moving obstacle from the difference between consecutive frames of a temporally continuous image sequence, exploiting the change in the obstacle's position from one frame to the next. The inter-frame difference algorithm has the advantages of simple operation, high detection speed, strong extensibility and low implementation complexity. As another example, the background difference algorithm detects an obstacle from the difference between an image in which the obstacle appears and a stored fixed background; the method therefore needs to store the fixed scene first, and likewise offers simple operation, high detection speed and low implementation complexity when detecting the position information of the obstacle.
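A minimal sketch of the inter-frame difference idea, with grayscale frames as nested lists (the threshold value is an illustrative assumption):

```python
def frame_difference(prev, curr, threshold=10):
    """Inter-frame difference: mark pixels whose intensity changes
    between two consecutive frames by more than `threshold`.
    Frames are equal-sized 2-D lists of grayscale values."""
    return [
        [1 if abs(c - p) > threshold else 0 for p, c in zip(prow, crow)]
        for prow, crow in zip(prev, curr)
    ]

prev = [[0, 0, 0], [0, 0, 0]]
curr = [[0, 50, 0], [0, 0, 60]]
mask = frame_difference(prev, curr)  # 1 marks pixels where motion occurred
```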
In order to obtain a clearer and more accurate depth image for more accurate planning of an obstacle avoidance path, in some embodiments, after obtaining the depth image of the obstacle, the method further includes:
and carrying out filtering and/or smoothing processing on the depth image.
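For example, a median filter of the kind often used to suppress speckle noise in depth data might look like the following sketch, applied here to one image row (window size and names are illustrative assumptions):

```python
def median_filter_row(row, k=3):
    """Median-filter one row of a depth image to suppress isolated
    speckle noise; window size k must be odd. Edges use a truncated
    window rather than padding."""
    half = k // 2
    out = []
    for i in range(len(row)):
        window = row[max(0, i - half): i + half + 1]
        out.append(sorted(window)[len(window) // 2])
    return out
```

A single spurious spike such as the 90 in `[5, 5, 90, 5, 5]` is replaced by the neighborhood median, while the row length is preserved.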
S45: and planning an obstacle avoidance path according to the depth image.
Specifically, after the depth image of the obstacle in the maximum detection range is acquired, the obstacle avoidance path can be planned from the acquired depth image by using a path planning algorithm. Algorithms for path planning include, but are not limited to, VFH+, a trajectory library, and the like.
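VFH+ builds a polar histogram of obstacle density around the vehicle and steers toward a low-density sector. A heavily simplified sketch of that core step follows; the sector count and names are illustrative assumptions, not the full algorithm:

```python
import math

def pick_heading(obstacle_angles, sectors=8):
    """Toy VFH-style step: bin obstacle bearings (radians) into angular
    sectors and steer toward the center of the emptiest sector."""
    counts = [0] * sectors
    width = 2 * math.pi / sectors
    for a in obstacle_angles:
        counts[int(a % (2 * math.pi) // width)] += 1
    best = min(range(sectors), key=lambda s: counts[s])
    return (best + 0.5) * width  # heading at the sector center
```

With all detected obstacles clustered near bearing 0, the chosen heading falls in an adjacent empty sector rather than straight ahead.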
In order to provide more complete obstacle information to ensure the accuracy of obstacle avoidance path planning, in some embodiments, as shown in fig. 7, S40 includes the following steps in place of steps S41, S43 and S45 above:
s42: and expanding the second detection range to a preset detection range, wherein the preset detection range corresponds to a preset detection distance, the preset detection distance is greater than the second detection distance, and the preset detection distance is smaller than the first detection distance.
Specifically, as shown in fig. 8, the second detection distance D2 corresponding to the second detection range 114 is increased to a preset detection distance D3, and the preset detection distance D3 corresponds to a preset detection range 116, so that the second detection range 114 is extended to the preset detection range 116, the preset detection distance D3 is greater than the second detection distance D2, and the preset detection distance D3 is smaller than the first detection distance D1.
Once the obstacle 12 is detected within the second detection range 114, the second detection range 114 is expanded to the preset detection range 116, so that more complete obstacle information can be acquired, and a more accurate obstacle avoidance path can be determined from it.
S44: and acquiring a depth image of the obstacle within a preset detection range.
Specifically, as shown in fig. 8, when the second detection range 114 corresponding to the second detection distance D2 is expanded to the preset detection range 116 corresponding to the preset detection distance D3, the depth image of the obstacle in the preset detection range 116 is acquired.
Specifically, a three-dimensional point cloud of the obstacle can be generally obtained through a depth sensor, and a depth image of the obstacle is obtained by performing feature extraction and matching processing on the three-dimensional point cloud.
S46: and planning an obstacle avoidance path according to the depth image.
Specifically, after the depth image of the obstacle in the preset detection range is acquired, the obstacle avoidance path can be planned from the acquired depth image by using a path planning algorithm. Algorithms for path planning include, but are not limited to, VFH+, a trajectory library, and the like.
In order to obtain a more complete depth image of the obstacle and further more accurately plan an obstacle avoidance path according to the depth image, in some embodiments, as shown in fig. 9, S46 includes the following steps:
S461: Judging whether the depth image is a complete depth image of the obstacle.
Specifically, when the second detection range 114 corresponding to the second detection distance D2 is expanded to the preset detection range 116 corresponding to the preset detection distance D3, the acquired depth image of the obstacle in the preset detection range 116 may still be smaller than the actual size of the obstacle; that is, the acquired depth image is not the complete depth image of the obstacle, and an obstacle avoidance path determined from it would be inaccurate. It is therefore necessary to further determine whether the acquired depth image of the obstacle in the preset detection range 116 is the complete depth image of the obstacle.
S463: and if so, planning an obstacle avoidance path according to the depth image.
Specifically, if the acquired depth image of the obstacle in the preset detection range is equal to the actual size of the obstacle, that is, the acquired depth image of the obstacle is a complete depth image of the obstacle, an obstacle avoidance path can be directly planned according to the depth image. By acquiring the obstacle depth image within the preset detection range instead of directly acquiring the obstacle depth image within the maximum detection range, the operation flow of the depth sensor can be optimized, the acquisition speed of the depth image is increased, and therefore an obstacle avoidance path can be planned in time according to the depth image of the obstacle.
Specifically, the obstacle avoidance path can be planned from the acquired depth image by using a path planning algorithm. Algorithms for path planning include, but are not limited to, VFH+, a trajectory library, and the like.
S465: if not, expanding the preset detection range to the maximum detection range, acquiring the depth image of the obstacle in the maximum detection range, and planning an obstacle avoidance path according to the depth image of the obstacle.
Specifically, if the acquired depth image of the obstacle in the preset detection range is smaller than the actual size of the obstacle, that is, the acquired depth image is not the complete depth image of the obstacle, the preset detection range is expanded to the maximum detection range and the depth image of the obstacle within the maximum detection range is acquired; this depth image contains the complete obstacle information, and a more accurate obstacle avoidance path is then determined from it.
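One plausible way to implement the completeness test of S461/S465 is to check whether the obstacle's mask touches the border of the depth image, which would indicate that the obstacle is clipped by the current detection range. This is a heuristic sketch; the patent does not specify the test:

```python
def is_complete(mask):
    """Heuristic completeness test: treat the obstacle's depth-image
    mask (2-D list of 0/1) as complete when no obstacle pixel lies on
    the image border, i.e. the obstacle is not clipped by the current
    detection range."""
    if not mask:
        return True
    h, w = len(mask), len(mask[0])
    for x in range(w):
        if mask[0][x] or mask[h - 1][x]:
            return False
    for y in range(h):
        if mask[y][0] or mask[y][w - 1]:
            return False
    return True
```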
In order to obtain a more complete depth image of the obstacle and further more accurately plan an obstacle avoidance path according to the depth image, in some embodiments, as shown in fig. 10, S46 includes the following steps:
S462: Judging whether the depth image is a complete depth image of the obstacle.
S464: and if so, planning an obstacle avoidance path according to the depth image.
S466: if not, increasing the preset detection distance corresponding to the preset detection range by a preset distance, and continuously acquiring the depth image of the obstacle in the expanded preset detection range.
Specifically, the preset distance may be determined according to the accuracy of the depth sensor, or may be preset according to the external environment in which the unmanned vehicle executes its task. For example, when the external environment is complex and the obstacles are numerous and large, the preset distance is set to 5-10 m; when the environment is simple and the obstacles are few and small, the preset distance is set to 2-3 m. This optimizes the operation flow of the depth sensor and increases the speed of acquiring the depth image, so that the obstacle avoidance path can be planned in time according to the depth image of the obstacle.
Further, after the preset detection distance corresponding to the preset detection range is increased by the preset distance, the preset detection range is correspondingly enlarged, so that the depth image of the obstacle in the enlarged preset detection range can be acquired; the acquired depth image then contains the complete obstacle information, and a more accurate obstacle avoidance path is determined from it.
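The incremental expansion of steps S462-S466 can be sketched as a loop that widens the detection distance until the captured image is complete or the maximum detection distance is reached. The `capture` and `complete` callables are stand-ins for the depth sensor and the completeness check; this is illustrative, not the patent's implementation:

```python
def expand_until_complete(d_preset, d_max, step, capture, complete):
    """Expand the detection distance by `step` until the captured depth
    image passes the completeness test or the maximum detection
    distance d_max is reached. `capture(d)` returns a depth image at
    distance d; `complete(img)` tests its completeness."""
    d = d_preset
    img = capture(d)
    while not complete(img) and d < d_max:
        d = min(d + step, d_max)  # never exceed the maximum range
        img = capture(d)
    return d, img
```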
It should be noted that, in the foregoing embodiments, a certain order does not necessarily exist between the foregoing steps, and it can be understood by those skilled in the art from the description of the embodiments of the present application that, in different embodiments, the foregoing steps may have different execution orders, that is, may be executed in parallel, may also be executed in an exchange manner, and the like.
As another aspect of the embodiments of the present application, an obstacle avoidance apparatus 50 for an unmanned vehicle is provided. Referring to fig. 11, the unmanned vehicle obstacle avoidance device 50 includes: a first detection distance determining module 51, a second detection distance determining module 52, a second detection range determining module 53 and an obstacle avoidance path planning module 54.
The first detection distance determining module 51 is configured to determine a first detection distance according to a maximum detection range of the unmanned vehicle.
The second detection distance determining module 52 is configured to determine a second detection distance according to the first detection distance, wherein the second detection distance is smaller than the first detection distance.
The second detection range determining module 53 is configured to determine a second detection range according to the second detection distance.
The obstacle avoidance path planning module 54 is configured to plan an obstacle avoidance path according to the second detection range and the maximum detection range when an obstacle is detected in the second detection range.
In this embodiment, the first detection distance is determined according to the maximum detection range of the unmanned vehicle, and then the second detection distance and the second detection range corresponding to the second detection distance are determined according to the first detection distance. Since the second detection range corresponding to the second detection distance is smaller than the maximum detection range corresponding to the first detection distance, more complete image information of the obstacle can be obtained according to the second detection range and the maximum detection range, and a more accurate obstacle avoidance path can then be determined from this more complete image information.
In some embodiments, unmanned vehicle obstacle avoidance device 50 further includes a deceleration module 55, where deceleration module 55 is configured to decelerate the unmanned vehicle to a preset safe speed.
Specifically, decelerating the unmanned vehicle to a preset safe speed at a preset acceleration;
calculating the preset acceleration according to the following formula: a = (V^2 - V_max^2) / (2d),
where V is the current speed, V_max is the preset safe speed, d is the second detection distance, and a is the preset acceleration.
In some embodiments, the unmanned vehicle obstacle avoidance device 50 further includes an image post-processing module 56, and the image post-processing module 56 is configured to filter and/or smooth the depth image.
In some embodiments, the obstacle avoidance path planning module 54 includes a detection range expanding unit, a depth image acquiring unit, and an obstacle avoidance path planning unit.
The detection range extending unit is configured to extend the second detection range to a maximum detection range.
In some embodiments, the detection range extending unit is further configured to extend the second detection range to a preset detection range, where the preset detection range corresponds to a preset detection distance, the preset detection distance is greater than the second detection distance, and the preset detection distance is smaller than the first detection distance.
In some embodiments, the detection range extending unit is further configured to increase a preset detection distance corresponding to the preset detection range by a preset distance.
The depth image acquisition unit is used for acquiring a depth image of the obstacle within a maximum detection range.
In some embodiments, the depth image acquiring unit is further configured to acquire a depth image of the obstacle within a preset detection range.
In some embodiments, the depth image acquiring unit is further configured to acquire a depth image of the obstacle within the extended preset detection range.
And the obstacle avoidance path planning unit is used for planning an obstacle avoidance path according to the depth image.
In some embodiments, the obstacle avoidance path planning unit includes an image integrity determination subunit, a path planning subunit, and a detection range expansion subunit, and the image integrity determination subunit is configured to determine whether the depth image is an integral depth image of the obstacle. And the path planning subunit is used for planning an obstacle avoidance path according to the depth image.
It should be noted that the obstacle avoidance apparatus for an unmanned vehicle can execute the obstacle avoidance method for an unmanned vehicle provided by the embodiments of the present invention, and has functional modules and beneficial effects corresponding to the execution method. For technical details that are not described in detail in the embodiment of the obstacle avoidance device for an unmanned vehicle, reference may be made to the obstacle avoidance method for an unmanned vehicle provided in the embodiment of the present invention.
Fig. 12 is a block diagram of an unmanned vehicle 10 according to an embodiment of the present invention. The unmanned vehicle 10 can be used to implement all or part of the functions of the main control chip. As shown in fig. 12, the unmanned vehicle 10 may include: an unmanned vehicle body, a depth sensor, a distance sensor, a processor 110, a memory 120, and a communication module 130.
The depth sensor is arranged on the unmanned vehicle main body and used for acquiring a depth image of an obstacle and planning an obstacle avoidance path; the distance sensor is arranged on the unmanned vehicle main body and used for detecting the distance information between the unmanned vehicle and the obstacle; the processor is arranged in the unmanned vehicle body and is respectively in communication connection with the depth sensor and the distance sensor.
The processor 110, the memory 120 and the communication module 130 establish a communication connection therebetween by way of a bus.
The processor 110 may be of any type, including a processor having one or more processing cores. It can execute single-threaded or multi-threaded operations and is used to parse instructions and perform operations such as fetching data, executing logical functions, and issuing operation results.
The memory 120 is a non-transitory computer-readable storage medium, and can be used to store non-transitory software programs, non-transitory computer-executable programs, and modules, such as program instructions/modules corresponding to the unmanned vehicle obstacle avoidance method in the embodiment of the present invention (for example, the first detection distance determining module 51, the second detection distance determining module 52, the second detection range determining module 53, and the obstacle avoidance path planning module 54 shown in fig. 11). The processor 110 executes various functional applications and data processing of the unmanned vehicle obstacle avoidance device 50 by running the non-transitory software program, instructions and modules stored in the memory 120, so as to implement the unmanned vehicle obstacle avoidance method in any one of the above method embodiments.
The memory 120 may include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required for at least one function; the storage data area may store data created according to the use of the unmanned vehicle obstacle avoidance device 50, and the like. Further, the memory 120 may include high speed random access memory, and may also include non-transitory memory, such as at least one magnetic disk storage device, flash memory device, or other non-transitory solid state storage device. In some embodiments, memory 120 optionally includes memory located remotely from processor 110, which may be connected to unmanned vehicle 10 via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The memory 120 stores instructions executable by the at least one processor 110; the at least one processor 110 is configured to execute the instructions to implement the unmanned vehicle obstacle avoidance method in any of the method embodiments described above, for example, to execute the above-described method steps 10, 20, 30, 40, and so on, to implement the functions of the modules 51 to 54 in fig. 11.
The communication module 130 is a functional module for establishing a communication connection and providing a physical channel. The communication module 130 may be any type of wireless or wired communication module 130 including, but not limited to, a WiFi module or a bluetooth module, etc.
Further, embodiments of the present invention also provide a non-transitory computer-readable storage medium storing computer-executable instructions, which are executed by one or more processors 110, for example, by one processor 110 in fig. 12, and may cause the one or more processors 110 to execute the unmanned vehicle obstacle avoidance method in any of the above method embodiments, for example, to execute the above described method steps 10, 20, 30, 40, and so on, to implement the functions of the modules 51 to 54 in fig. 11.
The above-described embodiments of the apparatus are merely illustrative, and the units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the present embodiment.
Through the above description of the embodiments, those skilled in the art will clearly understand that each embodiment can be implemented by software plus a general hardware platform, and certainly can also be implemented by hardware. It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above may be implemented by associated hardware as a computer program in a computer program product, the computer program being stored in a non-transitory computer-readable storage medium, the computer program comprising program instructions that, when executed by an associated apparatus, cause the associated apparatus to perform the processes of the embodiments of the methods described above. The storage medium may be a magnetic disk, an optical disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), or the like.
The product can execute the unmanned vehicle obstacle avoidance method provided by the embodiment of the invention, and has the corresponding functional modules and beneficial effects of executing the unmanned vehicle obstacle avoidance method. For details of the unmanned vehicle obstacle avoidance method provided in the embodiments of the present invention, reference may be made to the following description.
Finally, it should be noted that: the above examples are only intended to illustrate the technical solution of the present invention, but not to limit it; within the idea of the invention, also technical features in the above embodiments or in different embodiments may be combined, steps may be implemented in any order, and there are many other variations of the different aspects of the invention as described above, which are not provided in detail for the sake of brevity; although the present invention has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and the modifications or the substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present invention.
Claims (12)
1. An obstacle avoidance method applied to an unmanned vehicle, characterized by comprising the following steps:
determining a first detection distance according to the maximum detection range of the unmanned vehicle;
determining a second detection distance according to the first detection distance, wherein the second detection distance is smaller than the first detection distance, and the difference between the first detection distance and the second detection distance is greater than or equal to a preset distance;
determining a second detection range according to the second detection distance;
and when an obstacle is detected in the second detection range, planning an obstacle avoidance path according to the second detection range and the maximum detection range, wherein at least an obstacle with a size corresponding to the preset distance can be detected.
2. The method of claim 1, wherein planning an obstacle avoidance path according to the second detection range and the maximum detection range comprises:
extending the second detection range to the maximum detection range;
acquiring a depth image of the obstacle within the maximum detection range;
and planning an obstacle avoidance path according to the depth image.
3. The method according to claim 1, wherein planning an obstacle avoidance path according to the second detection range and the maximum detection range comprises:
expanding the second detection range to a preset detection range, wherein the preset detection range corresponds to a preset detection distance, the preset detection distance is greater than the second detection distance, and the preset detection distance is smaller than the first detection distance;
acquiring a depth image of the obstacle within the preset detection range;
and planning an obstacle avoidance path according to the depth image.
4. The method of claim 3, wherein planning an obstacle avoidance path according to the depth image comprises:
judging whether the depth image is a complete depth image of the obstacle or not;
if so, planning an obstacle avoidance path according to the depth image;
if not, expanding the preset detection range to the maximum detection range, acquiring the depth image of the obstacle in the maximum detection range, and planning an obstacle avoidance path according to the depth image of the obstacle.
5. The method of claim 3, wherein planning an obstacle avoidance path according to the depth image comprises:
judging whether the depth image is a complete depth image of the obstacle;
if so, planning an obstacle avoidance path according to the depth image;
if not, increasing the preset detection distance corresponding to the preset detection range by a preset distance, and continuously acquiring the expanded depth image of the obstacle in the preset detection range.
6. The method according to any one of claims 2 to 5,
the depth image includes a three-dimensional point cloud of the obstacle.
7. The method according to any one of claims 2 to 6,
after the obtaining of the depth image of the obstacle, the method further includes:
and carrying out filtering and/or smoothing processing on the depth image.
8. The method according to any one of claims 2 to 7,
before planning the obstacle avoidance path, the method further comprises:
and decelerating the unmanned vehicle to a preset safe speed.
9. The method of claim 8,
the decelerating the unmanned vehicle to a preset safe speed comprises:
decelerating the unmanned vehicle to the preset safe speed at a preset acceleration;
calculating the preset acceleration by the following formula: a = (V^2 - V_max^2) / (2d),
wherein V is the current speed, V_max is the preset safe speed, d is the second detection distance, and a is the preset acceleration.
10. An obstacle avoidance apparatus applied to an unmanned vehicle, characterized by comprising:
a first detection distance determination module for determining a first detection distance according to a maximum detection range of the unmanned vehicle;
a second detection distance determining module, configured to determine a second detection distance according to the first detection distance, where the second detection distance is smaller than the first detection distance, and a difference between the first detection distance and the second detection distance is greater than or equal to a preset distance;
a second detection range determining module, configured to determine a second detection range according to the second detection distance;
and the obstacle avoidance path planning module is used for planning an obstacle avoidance path according to the second detection range and the maximum detection range when an obstacle is detected in the second detection range, wherein at least the obstacle with the size corresponding to the preset distance is detected.
11. An obstacle avoidance apparatus according to claim 10,
the obstacle avoidance path planning module comprises a detection range expansion unit, a depth image acquisition unit and an obstacle avoidance path planning unit;
the detection range extending unit is configured to extend the second detection range to the maximum detection range;
the depth image acquisition unit is used for acquiring a depth image of the obstacle in a maximum detection range;
and the obstacle avoidance path planning unit is used for planning an obstacle avoidance path according to the depth image.
12. An unmanned vehicle, comprising:
an unmanned vehicle body;
the depth sensor is arranged on the unmanned vehicle main body and used for acquiring a depth image of an obstacle and planning an obstacle avoidance path;
a processor disposed within the unmanned vehicle body and in communication with the depth sensors, respectively; and the number of the first and second groups,
a memory communicatively coupled to the processor; wherein,
the memory stores instructions executable by the processor to enable the processor to perform the method of any one of claims 1-9.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201911006620.5A CN110751336B (en) | 2019-10-22 | 2019-10-22 | Obstacle avoidance method and obstacle avoidance device of unmanned carrier and unmanned carrier |
PCT/CN2020/118850 WO2021078003A1 (en) | 2019-10-22 | 2020-09-29 | Obstacle avoidance method and device for unmanned vehicle, and unmanned vehicle |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201911006620.5A CN110751336B (en) | 2019-10-22 | 2019-10-22 | Obstacle avoidance method and obstacle avoidance device of unmanned carrier and unmanned carrier |
Publications (2)
Publication Number | Publication Date |
---|---|
CN110751336A CN110751336A (en) | 2020-02-04 |
CN110751336B true CN110751336B (en) | 2023-04-14 |
Family
ID=69279373
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201911006620.5A Active CN110751336B (en) | 2019-10-22 | 2019-10-22 | Obstacle avoidance method and obstacle avoidance device of unmanned carrier and unmanned carrier |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN110751336B (en) |
WO (1) | WO2021078003A1 (en) |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110751336B (en) * | 2019-10-22 | 2023-04-14 | 深圳市道通智能航空技术股份有限公司 | Obstacle avoidance method and obstacle avoidance device of unmanned carrier and unmanned carrier |
CN114701526A (en) * | 2022-04-02 | 2022-07-05 | 广东电网有限责任公司惠州供电局 | Automatic control method and unmanned control transmission line rail transportation equipment |
CN114994604A (en) * | 2022-04-21 | 2022-09-02 | 深圳市倍思科技有限公司 | Human-computer interaction position determining method and device, robot and storage medium |
CN116047909B (en) * | 2023-01-13 | 2023-09-05 | 大连海事大学 | Unmanned plane-ship cooperative robust self-adaptive control method for maritime parallel search |
CN118015369B (en) * | 2024-02-20 | 2024-08-13 | 常州市双爱家私股份有限公司 | Obstacle detection system and method and electric lifting table |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106802668A (en) * | 2017-02-16 | 2017-06-06 | Shanghai Jiao Tong University | UAV three-dimensional collision avoidance method and system based on fusion of binocular vision and ultrasonic sensing |
CN107831777A (en) * | 2017-09-26 | 2018-03-23 | Changchun Institute of Optics, Fine Mechanics and Physics, Chinese Academy of Sciences | Automatic obstacle avoidance system and method for an aircraft, and aircraft |
CN108268036A (en) * | 2018-01-19 | 2018-07-10 | Liu Jinyu | Intelligent obstacle avoidance system for a robot |
CN109857112A (en) * | 2019-02-21 | 2019-06-07 | Guangdong Zhiji Technology Co., Ltd. | Obstacle avoidance method and device |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4189858B2 (en) * | 2004-11-16 | 2008-12-03 | Honda Access Corp. | Obstacle detection device |
CN107480638B (en) * | 2017-08-16 | 2020-06-30 | Beijing Jingdong Shangke Information Technology Co., Ltd. | Vehicle obstacle avoidance method, controller, device and vehicle |
CN107650908B (en) * | 2017-10-18 | 2023-07-14 | Changsha Bingyan Electronic Technology Co., Ltd. | Unmanned vehicle environment sensing system |
CN110231832B (en) * | 2018-03-05 | 2022-09-06 | Beijing Jingdong Qianshi Technology Co., Ltd. | Obstacle avoidance method and obstacle avoidance device for unmanned aerial vehicle |
CN109917420B (en) * | 2019-02-27 | 2024-09-06 | Ecovacs Commercial Robotics Co., Ltd. | Automatic walking device and robot |
CN110751336B (en) * | 2019-10-22 | 2023-04-14 | Shenzhen Daotong Intelligent Aviation Technology Co., Ltd. | Obstacle avoidance method and obstacle avoidance device of unmanned carrier and unmanned carrier |
Application events:

- 2019-10-22: Chinese application CN201911006620.5A filed; granted as patent CN110751336B (status: Active)
- 2020-09-29: International application PCT/CN2020/118850 filed; published as WO2021078003A1 (status: Application Filing)
Also Published As
Publication number | Publication date |
---|---|
WO2021078003A1 (en) | 2021-04-29 |
CN110751336A (en) | 2020-02-04 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110751336B (en) | Obstacle avoidance method and obstacle avoidance device of unmanned carrier and unmanned carrier | |
US11915502B2 (en) | Systems and methods for depth map sampling | |
CN111448476B (en) | Technique for sharing mapping data between unmanned aerial vehicle and ground vehicle | |
CN110765894B (en) | Target detection method, device, equipment and computer readable storage medium | |
US10520943B2 (en) | Unmanned aerial image capture platform | |
KR102614323B1 (en) | Create a 3D map of a scene using passive and active measurements | |
CN111670339B (en) | Techniques for collaborative mapping between unmanned aerial vehicles and ground vehicles | |
JP2019011971A (en) | Estimation system and automobile | |
WO2020186444A1 (en) | Object detection method, electronic device, and computer storage medium | |
CN111784748A (en) | Target tracking method and device, electronic equipment and mobile carrier | |
CN113561963A (en) | Parking method and device and vehicle | |
CN112740268A (en) | Target detection method and device | |
CN111216127A (en) | Robot control method, device, server and medium | |
CN112379681A (en) | Unmanned aerial vehicle obstacle avoidance flight method and device and unmanned aerial vehicle | |
CN112378397A (en) | Unmanned aerial vehicle target tracking method and device and unmanned aerial vehicle | |
CN110568857A (en) | unmanned aerial vehicle flight control method and unmanned aerial vehicle | |
EP4155759A1 (en) | Systems and method for lidar grid velocity estimation | |
KR20200070100A (en) | A method for detecting vehicle and device for executing the method | |
JP2020106986A (en) | Information processing device, information processing method, and mobile | |
CN113661513A (en) | Image processing method, image processing device, image processing system and storage medium | |
JP7242822B2 (en) | Estimation system and car | |
US20230150543A1 (en) | Systems and methods for estimating cuboid headings based on heading estimations generated using different cuboid defining techniques | |
KR102719930B1 (en) | Method and apparatus for generating an image representing an object around a vehicle | |
CN116457843A (en) | Time-of-flight object detection circuit and time-of-flight object detection method | |
CN117542042A (en) | Three-dimensional object detection method and device, electronic equipment and storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
CB02 | Change of applicant information | ||
Address after: B1, 9th floor, Chi Yuen Building, No. 1001 Xueyuan Road, Xili Street, Nanshan District, Shenzhen, Guangdong 518055
Applicant after: Shenzhen Daotong Intelligent Aviation Technology Co., Ltd.
Address before: B1, 9th floor, Chi Yuen Building, No. 1001 Xueyuan Road, Xili Street, Nanshan District, Shenzhen, Guangdong 518055
Applicant before: AUTEL ROBOTICS Co., Ltd.
GR01 | Patent grant | ||