US12202473B2 - Obstacle tracking method, storage medium and unmanned driving device - Google Patents
- Publication number
- US12202473B2 (application US17/669,355)
- Authority
- US
- United States
- Prior art keywords
- obstacle
- point cloud
- aggregated
- determining
- matching
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active, expires
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/246—Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
- G06T7/248—Analysis of motion using feature-based methods, e.g. the tracking of corners or segments involving reference images or patches
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W60/00—Drive control systems specially adapted for autonomous road vehicles
- B60W60/001—Planning or execution of driving tasks
- B60W60/0027—Planning or execution of driving tasks using trajectory prediction for other traffic participants
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W30/00—Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
- B60W30/08—Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
- B60W30/095—Predicting travel path or likelihood of collision
- B60W30/0956—Predicting travel path or likelihood of collision the prediction being responsive to traffic or environmental parameters
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W40/00—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
- B60W40/02—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to ambient conditions
- B60W40/04—Traffic conditions
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2420/00—Indexing codes relating to the type of sensors based on the principle of their operation
- B60W2420/40—Photo, light or radio wave sensitive means, e.g. infrared sensors
- B60W2420/408—Radar; Laser, e.g. lidar
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2554/00—Input parameters relating to objects
- B60W2554/40—Dynamic objects, e.g. animals, windblown objects
- B60W2554/404—Characteristics
- B60W2554/4041—Position
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2554/00—Input parameters relating to objects
- B60W2554/80—Spatial relation or speed relative to objects
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2556/00—Input parameters relating to data
- B60W2556/45—External transmission of data to or from the vehicle
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10028—Range image; Depth image; 3D point clouds
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30248—Vehicle exterior or interior
- G06T2207/30252—Vehicle exterior; Vicinity of vehicle
- G06T2207/30261—Obstacle
Definitions
- This application relates to the field of unmanned driving technologies and in particular to an obstacle tracking method and apparatus, a storage medium and an unmanned driving device.
- an unmanned driving device needs to perform path planning in real time according to its own position and the position of each surrounding obstacle, so as to drive while avoiding obstacles.
- each dynamic obstacle further needs to be tracked, so that a moving track of each dynamic obstacle can be predicted according to a tracking result and a driving path of the unmanned driving device can be further planned.
- obstacle detection may be first performed for every two consecutive frames of laser point clouds acquired by the unmanned driving device, to determine each obstacle included in the two consecutive frames of laser point clouds. Then, according to obstacle information of each obstacle in the two consecutive frames of laser point clouds, for example, information such as a position of a central point of the obstacle and a size of the obstacle, a matching between every two obstacles in the two frames of laser point clouds is performed, to determine a matching result. Finally, an obstacle tracking result is determined according to the matching result.
- an obstacle matching may be performed on every two consecutive frames of laser point clouds acquired by the unmanned driving device in a surrounding environment, a position change of a same obstacle in every two consecutive frames of the laser point clouds is tracked, to continuously track the same obstacle.
- an obstacle matching may alternatively be performed on two frames of laser point clouds at intervals acquired by the unmanned driving device. For example, an obstacle matching is performed on an acquired first frame of laser point cloud and an acquired third frame of laser point cloud. This is not limited in this disclosure, and may be set as desired.
- the obstacle matching in examples herein is intended to determine whether obstacles are a same obstacle.
- the obstacle tracking method may be performed by the unmanned driving device, or may be performed by a server that controls the driving of the unmanned driving device.
- the server may be a single server, or may be a system including a plurality of servers such as a distributed server cluster, or may be a physical server device, or may be a cloud server. This is not limited in this disclosure, and may be specifically set as required.
- S102: Determine an obstacle aggregation region in the first point cloud according to position information of each obstacle in the first point cloud.
- a matched obstacle group may be further limited, to improve the matching accuracy of the aggregated obstacles. Therefore, in this disclosure, whether there is an obstacle aggregation region in the first point cloud acquired by the unmanned driving device may be first detected. When it is detected that there is the obstacle aggregation region in the first point cloud, the matched obstacle group is further limited.
- the obstacle aggregation region in the first point cloud may alternatively be determined in another manner, for example, by using a clustering algorithm for clustering.
- a specific adopted manner is not limited in this disclosure, and may be set as required, provided that the region in which the obstacles are densely distributed in the first point cloud is determined.
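As a concrete illustration of this step, the sketch below finds a bounding box of densely distributed obstacle centers with a simple neighbor-count rule. It is only an assumption-laden stand-in: the function name `find_aggregation_region`, the neighbor radius, and the minimum-neighbor count are not from this disclosure, and a real system might instead use a clustering algorithm as mentioned above.

```python
def find_aggregation_region(centers, radius=5.0, min_neighbors=2):
    """Return the axis-aligned bounding box (xmin, ymin, xmax, ymax) of
    densely distributed obstacle centers in the first point cloud, or
    None when no region of densely distributed obstacles exists.

    A center counts as 'dense' if at least `min_neighbors` other centers
    lie within `radius` of it (a simplistic stand-in for clustering).
    """
    dense = []
    for i, (x, y) in enumerate(centers):
        neighbors = sum(
            1 for j, (u, v) in enumerate(centers)
            if j != i and (x - u) ** 2 + (y - v) ** 2 <= radius ** 2
        )
        if neighbors >= min_neighbors:
            dense.append((x, y))
    if not dense:
        return None
    xs = [p[0] for p in dense]
    ys = [p[1] for p in dense]
    return (min(xs), min(ys), max(xs), max(ys))

centers = [(0, 0), (1, 0), (0, 1), (20, 20)]
print(find_aggregation_region(centers))  # (0, 0, 1, 1)
```

The three nearby obstacles form the region; the isolated obstacle at (20, 20) is excluded.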
- S104: Determine, according to a position of the obstacle aggregation region, at least one obstacle in the second point cloud that falls within the obstacle aggregation region as at least one aggregated obstacle, and determine each remaining obstacle in the second point cloud as a non-aggregated obstacle.
- an aggregated obstacle in an aggregation region and a non-aggregated obstacle in a non-aggregation region in the second point cloud may be determined.
- the unmanned driving device may determine whether the obstacle is located within the obstacle aggregation region of the first point cloud. When it is determined that the obstacle is located within the obstacle aggregation region, the obstacle is determined as an aggregated obstacle. When it is determined that the obstacle is not located within the obstacle aggregation region, the obstacle is determined as a non-aggregated obstacle.
- a left side of FIG. 2 shows a first point cloud acquired by the unmanned driving device
- a right side shows a second point cloud acquired by the unmanned driving device, where cylinders connected by dotted lines represent obstacles, and each obstacle travels to the right.
- an obstacle aggregation region with densely distributed obstacles in the first point cloud may be determined, as shown in a three-dimensional (3D) box in FIG. 2 .
- an obstacle in the second point cloud that falls within the obstacle aggregation region (the 3D box in FIG. 2 ) is determined as an aggregated obstacle, and an obstacle in the second point cloud that does not fall within the obstacle aggregation region is a non-aggregated obstacle.
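The split shown in FIG. 2 can be sketched as follows, modeling the aggregation region as a 2D bounding box; the function and field names are illustrative, not from this disclosure:

```python
def split_obstacles(obstacles, region):
    """Partition second-point-cloud obstacles by whether their center
    falls inside the obstacle aggregation region (xmin, ymin, xmax, ymax)."""
    xmin, ymin, xmax, ymax = region
    aggregated, non_aggregated = [], []
    for ob in obstacles:
        x, y = ob["center"]
        if xmin <= x <= xmax and ymin <= y <= ymax:
            aggregated.append(ob)
        else:
            non_aggregated.append(ob)
    return aggregated, non_aggregated

obs = [{"id": 1, "center": (0.5, 0.5)}, {"id": 2, "center": (10, 10)}]
agg, non_agg = split_obstacles(obs, (0, 0, 1, 1))
print([o["id"] for o in agg], [o["id"] for o in non_agg])  # [1] [2]
```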
- S106: For each of the at least one aggregated obstacle, respectively determine matching parameter values between the aggregated obstacle and each obstacle in the first point cloud by using a group matching rule, and determine a matching result of the aggregated obstacle according to the matching parameter values between the aggregated obstacle and each obstacle in the first point cloud.
- matching parameter values between the aggregated obstacle and each obstacle in the first point cloud may be respectively determined by using a group matching rule.
- the obstacle information includes at least a position, a size, a quantity of included points, a type, and the like of the obstacle.
- a larger matching parameter value between two obstacles indicates a lower matching degree (that is, a lower similarity) between the two obstacles.
- the group matching rule includes matching rules corresponding to different feature dimensions, such as a distance dimension, a size dimension, a point quantity dimension, and a type dimension.
- a distance between the aggregated obstacle and each obstacle in the first point cloud may be determined according to the position of the aggregated obstacle and the position of each obstacle in the first point cloud. Then, for each obstacle in the first point cloud, a matching parameter value between the aggregated obstacle and the obstacle in the distance dimension is determined according to a distance between the aggregated obstacle and the obstacle.
- the position of the obstacle is a position of a center point of the obstacle. A smaller distance between the aggregated obstacle and the obstacle indicates a smaller matching parameter value, which in turn indicates a higher similarity between the aggregated obstacle and the obstacle in the distance dimension. That is, the matching parameter value is negatively correlated with the similarity between the two obstacles.
- the matching parameter value between the two obstacles in the distance dimension is determined according to the distance between the two obstacles, to distinguish matching degrees between obstacles based on their distance differences. The matching parameter value between the two obstacles may be determined by using the following formula:
- f(t) = x1·Δt, when Δt ≤ t0; f(t) = x2·Δt, when Δt > t0
- Δt represents the distance between the two obstacles
- t0 represents a second predetermined threshold, and may be specifically set as required
- x1 represents a first weight
- x2 represents a second weight
- f(t) represents the matching parameter value between the two obstacles in the distance dimension.
- when the distance between the two obstacles is greater than the second predetermined threshold, it indicates a lower matching degree between the two obstacles, and it may be determined that the matching parameter value between the two obstacles in the distance dimension is x2·Δt.
- the second weight x2 is often set to be greater than the first weight x1.
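The piecewise distance formula above translates directly into code; the weight and threshold values here are purely illustrative, not values specified in this disclosure:

```python
def distance_cost(delta_t, x1=0.5, x2=2.0, t0=3.0):
    """Matching parameter value in the distance dimension.

    delta_t: distance between the two obstacle centers.
    Distances above the threshold t0 are penalized with the larger
    second weight x2, per f(t) = x1*dt (dt <= t0), x2*dt (dt > t0).
    """
    return x1 * delta_t if delta_t <= t0 else x2 * delta_t

print(distance_cost(2.0))  # 1.0  (within threshold, first weight)
print(distance_cost(4.0))  # 8.0  (beyond threshold, second weight)
```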
- a matching parameter value between the aggregated obstacle and the obstacle in the size dimension may be determined according to a size of the aggregated obstacle and a size of the obstacle.
- the matching parameter value between the two obstacles in the size dimension is determined according to the sizes of the two obstacles
- the matching parameter value of the aggregated obstacle in the second point cloud and the obstacle in the first point cloud in the size dimension may be determined according to a size ratio, such as a length ratio or width ratio, between the two obstacles and a third weight corresponding to the size dimension.
- the third weight may be increased, to further obtain a larger matching parameter value, which represents a lower similarity between the two obstacles.
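One possible reading of the size-dimension rule is sketched below, assuming the cost grows as the size ratio deviates from 1; the exact functional form and the third-weight value are assumptions, not given in this disclosure:

```python
def size_cost(len_a, len_b, x3=1.0):
    """Matching parameter value in the size dimension: the more the
    length (or width) ratio of the two obstacles deviates from 1,
    the larger the cost, scaled by the third weight x3."""
    ratio = min(len_a, len_b) / max(len_a, len_b)
    return x3 * (1.0 - ratio)

print(size_cost(4.0, 4.0))  # 0.0  identical sizes
print(size_cost(2.0, 4.0))  # 0.5  one obstacle twice as long
```

Increasing x3, as described above, uniformly enlarges this cost and thereby represents a lower similarity for the same size ratio.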
- a matching parameter value between the aggregated obstacle and the obstacle in the point quantity dimension may be determined according to a quantity of points included in the aggregated obstacle in the second point cloud and a quantity of points included in the obstacle in the first point cloud.
- the matching parameter value between the two obstacles in the point quantity dimension may be determined according to the quantity of points included in the two obstacles.
- g(O) = x4·(O2 − O1)/O2, when O1 < O2; g(O) = x4·(O1 − O2)/O1, when O1 ≥ O2
- O1 represents the quantity of points included in the obstacle in the first point cloud
- O2 represents the quantity of points included in the aggregated obstacle in the second point cloud
- x4 represents a fourth weight corresponding to the point quantity dimension
- g(O) represents the matching parameter value between the two obstacles in the point quantity dimension.
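The point-quantity formula g(O) above can be sketched as follows; the weight value is illustrative:

```python
def point_count_cost(o1, o2, x4=1.0):
    """Matching parameter value in the point quantity dimension:
    g(O) = x4*(O2-O1)/O2 if O1 < O2, else x4*(O1-O2)/O1.
    o1: points in the first-cloud obstacle; o2: points in the
    aggregated obstacle in the second cloud."""
    if o1 < o2:
        return x4 * (o2 - o1) / o2
    return x4 * (o1 - o2) / o1

print(point_count_cost(50, 100))   # 0.5  point counts differ by half
print(point_count_cost(100, 100))  # 0.0  identical point counts
```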
- a matching parameter value between the aggregated obstacle and the obstacle in the type dimension may be determined according to a type of the aggregated obstacle in the second point cloud and a type of the obstacle in the first point cloud. For example, when it is determined that the types of the two obstacles are the same, it may be determined that the matching parameter value between the two obstacles in the type dimension is 0, and when it is determined that the types of the two obstacles are different, it may be determined that the matching parameter value between the two obstacles in the type dimension is 1.
- a matching parameter value between the aggregated obstacle and the obstacle in the first point cloud may be determined according to a matching parameter value between the aggregated obstacle and the obstacle in the first point cloud in at least one of the feature dimensions.
- a matching result of each aggregated obstacle may be determined according to each matching parameter value by using a matching algorithm.
- the matching algorithm may be a Kuhn-Munkres (KM) algorithm, a Gale-Shapley algorithm, or the like. This is not limited in this disclosure, and may be specifically set as required.
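Putting the dimensions together, a cost matrix over (second-cloud obstacle, first-cloud obstacle) pairs would normally be fed to an assignment algorithm such as Kuhn-Munkres. The greedy assignment below is only a dependency-free stand-in for such an algorithm, not the method of this disclosure:

```python
def greedy_match(cost):
    """cost[i][j]: matching parameter value between second-cloud
    obstacle i and first-cloud obstacle j (lower = more similar).
    Returns a dict {i: j}. In practice the KM algorithm (e.g. via
    scipy.optimize.linear_sum_assignment) gives an optimal matching."""
    pairs = sorted(
        (c, i, j)
        for i, row in enumerate(cost)
        for j, c in enumerate(row)
    )
    used_i, used_j, match = set(), set(), {}
    for c, i, j in pairs:
        if i not in used_i and j not in used_j:
            match[i] = j
            used_i.add(i)
            used_j.add(j)
    return match

cost = [[0.1, 5.0], [4.0, 0.2]]
print(greedy_match(cost))  # {0: 0, 1: 1}
```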
- S108: For each of the at least one non-aggregated obstacle, respectively determine matching parameter values between the non-aggregated obstacle and each obstacle in the first point cloud by using a non-group matching rule, and determine a matching result of the non-aggregated obstacle according to the matching parameter values between the non-aggregated obstacle and each obstacle in the first point cloud.
- for the non-aggregated obstacles, a matching rule with a relatively weak constraint may be used for obstacle matching. That is, the constraint of the group matching rule used for matching the aggregated obstacles is stronger than that of the non-group matching rule used for matching the non-aggregated obstacles.
- matching parameter values between the non-aggregated obstacle and each obstacle in the first point cloud may be respectively determined by using a non-group matching rule.
- the obstacle information includes at least a position, an orientation, a size, a quantity of included points, a type, and the like of the obstacle.
- a larger matching parameter value between two obstacles indicates a lower matching degree between the two obstacles.
- the non-group matching rule includes matching rules corresponding to different feature dimensions, such as a distance dimension, a size dimension, a point quantity dimension, a type dimension, and an orientation dimension.
- when each non-aggregated obstacle and each obstacle in the first point cloud are matched based on the matching rule of each feature dimension in the non-group matching rule, reference may be made to the method in the foregoing step S106, to respectively determine matching parameter values between the non-aggregated obstacle and each obstacle in the first point cloud in the distance dimension, the size dimension, the point quantity dimension, or the type dimension.
- the aggregated obstacle group is more stable in the environment, that is, position changes of the aggregated obstacles are relatively small. However, the non-aggregated obstacles are less stable in the environment, and position changes are relatively large. Therefore, when the aggregated obstacles are matched, a stronger constraint may be set on the distance dimension. That is, the first weight set for the distance dimension in the group matching rule is greater than a first weight set for the distance dimension in the non-group matching rule. The second weight set for the distance dimension in the group matching rule is also greater than a second weight set for the distance dimension in the non-group matching rule.
- the aggregated obstacles are relatively densely distributed, and situations such as under-segmentation may occur. For example, two obstacles are connected to form one obstacle, resulting in a large size of the obstacle and a relatively large quantity of point clouds.
- because the non-aggregated obstacles are relatively sparsely distributed, a single obstacle is relatively easy to obtain through segmentation. Therefore, when the aggregated obstacles are matched, a weaker constraint may be set on the size dimension and the point quantity dimension. That is, the third weight set for the size dimension in the group matching rule is less than the third weight set for the size dimension in the non-group matching rule. The fourth weight set for the point quantity dimension in the group matching rule is also less than the fourth weight set for the point quantity dimension in the non-group matching rule.
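The relative weight constraints described above might be encoded as two rule configurations. Only the inequalities between corresponding weights come from the text; the numbers themselves are purely illustrative:

```python
# Group rule (aggregated obstacles): stronger distance constraint
# (larger x1, x2), weaker size/point-count constraints (smaller x3, x4).
GROUP_RULE = {"x1": 1.0, "x2": 4.0, "x3": 0.5, "x4": 0.5}
# Non-group rule (non-aggregated obstacles): the reverse.
NON_GROUP_RULE = {"x1": 0.5, "x2": 2.0, "x3": 1.0, "x4": 1.0}

# The inequalities stated in the text hold for these example values:
assert GROUP_RULE["x1"] > NON_GROUP_RULE["x1"]
assert GROUP_RULE["x2"] > NON_GROUP_RULE["x2"]
assert GROUP_RULE["x3"] < NON_GROUP_RULE["x3"]
assert GROUP_RULE["x4"] < NON_GROUP_RULE["x4"]
```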
- a matching parameter value between the non-aggregated obstacle and the obstacle in the orientation dimension may be determined according to an orientation of the non-aggregated obstacle and an orientation of the obstacle.
- a matching parameter value between the non-aggregated obstacle and the obstacle in the first point cloud is determined according to a matching parameter value between the non-aggregated obstacle and the obstacle in the first point cloud in at least one of the feature dimensions.
- a matching result of each non-aggregated obstacle may be determined according to each matching parameter value by using the matching algorithm.
- the matching algorithm may be a Kuhn-Munkres (KM) algorithm, a Gale-Shapley algorithm, or the like. This is not limited in this disclosure, and may be specifically set as required.
- S110: Determine an obstacle tracking result according to the matching result of each of the at least one aggregated obstacle and the matching result of each of the at least one non-aggregated obstacle.
- each obstacle that is successfully tracked may be determined.
- a speed of the obstacle may be updated according to position information of the same obstacle in the first point cloud and the second point cloud respectively and a time interval of acquiring the two frames of laser point clouds.
- a following driving path of the unmanned driving device is planned according to updated information such as a speed of each obstacle.
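The speed update described above amounts to a finite difference over the acquisition interval of the two frames; the function and argument names here are assumptions:

```python
def update_speed(pos_first, pos_second, dt):
    """Per-axis speed of a tracked obstacle, from its center positions
    in the first and second point clouds acquired dt seconds apart."""
    return tuple((b - a) / dt for a, b in zip(pos_first, pos_second))

# Obstacle moved from (0, 0) to (1, 2) meters in 0.5 s.
print(update_speed((0.0, 0.0), (1.0, 2.0), 0.5))  # (2.0, 4.0)
```

An acceleration can be obtained the same way, as a finite difference between a historically determined speed and the newly updated one.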
- an obstacle aggregation region in the first point cloud may be determined according to position information of each obstacle in the first point cloud acquired by the unmanned driving device. Then, an aggregated obstacle and a non-aggregated obstacle in the second point cloud acquired by the unmanned driving device are determined according to the obstacle aggregation region. In addition, a matching result of each aggregated obstacle is respectively determined based on a group matching rule, and a matching result of each non-aggregated obstacle is respectively determined based on a non-group matching rule. Finally, an obstacle tracking result is determined according to the matching result of each obstacle in the second point cloud. Based on different matching rules, the aggregated obstacle and the non-aggregated obstacle are respectively matched, which improves the matching accuracy of the aggregated obstacle, so that the obstacle tracking result is more accurate.
- the matching parameter value is negatively correlated with a similarity between the obstacle in the second point cloud and the obstacle in the first point cloud in a predetermined dimension.
- the predetermined dimension includes a plurality of dimensions
- the matching parameter value may be obtained based on a plurality of matching parameter values corresponding to the plurality of dimensions.
- a dimension value between the two obstacles in the dimension may be determined according to obstacle information of the obstacle in the second point cloud and the obstacle in the first point cloud, and the matching parameter value is then determined according to the dimension value.
- the final matching parameter value determined according to the dimension values between the two obstacles in the dimensions by using the non-group matching rule is relatively small, that is, it indicates that the similarity between the obstacle in the second point cloud and the obstacle in the first point cloud is relatively high. That is, the constraint of the group matching rule is stronger than the constraint of the non-group matching rule.
- the matching parameter value is negatively correlated with the similarity.
- in step S106, if the aggregated obstacle is matched against each obstacle in the first point cloud but no matched obstacle is obtained, the aggregated obstacle may be changed to a non-aggregated obstacle, and matching is performed again by using the non-group matching rule.
- in step S106 and step S108, to further improve the matching accuracy of each obstacle, when the matching parameter value between two obstacles is greater than a predetermined value, it may be determined that the two obstacles are extremely dissimilar, that is, the similarity is relatively low, and it may therefore be determined that the two obstacles do not match. A matching result of each obstacle may then be determined by using a matching algorithm, according to only the obstacles whose matching parameter values are less than the predetermined value.
- a sequence in which the foregoing step S106 and step S108 are performed is not limited.
- the aggregated obstacle may be first matched, or the non-aggregated obstacle may be first matched, or the two obstacles are matched simultaneously. This is not limited in this disclosure, and may be set as required.
- each aggregated obstacle may alternatively be first matched.
- from each obstacle included in the first point cloud, obstacles that do not match any aggregated obstacle in the second point cloud may be determined as a first obstacle set.
- matching parameter values between the non-aggregated obstacle and each obstacle in the first obstacle set are respectively determined by using a non-group matching rule.
- a matching result of each non-aggregated obstacle is determined according to the matching parameter values between each non-aggregated obstacle and each obstacle in the first obstacle set.
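The two-stage flow above (match aggregated obstacles first, then match non-aggregated obstacles only against the leftover first-cloud obstacles) amounts to a set difference; the identifiers below are assumed:

```python
def first_obstacle_set(first_cloud_ids, matched_first_ids):
    """First-cloud obstacles not matched to any aggregated obstacle;
    only these are candidates in the non-aggregated matching stage."""
    matched = set(matched_first_ids)
    return [i for i in first_cloud_ids if i not in matched]

# Obstacles 2 and 4 were already matched to aggregated obstacles.
print(first_obstacle_set([1, 2, 3, 4], [2, 4]))  # [1, 3]
```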
- according to the position of the matched obstacle, it may be determined that the obstacle is a static obstacle with a moving speed of 0. If the speed of the obstacle has been determined according to a laser point cloud acquired during historical driving, an acceleration of the obstacle may be determined according to the historically determined speed, the positions of the obstacle in the first point cloud and the second point cloud, and the time interval of acquiring the two frames of laser point clouds.
- an obstacle that fails to match in the second point cloud, that is, an obstacle for which there is no matched obstacle in the first point cloud
- the foregoing group matching rule set for the aggregated obstacle and the non-group matching rule set for the non-aggregated obstacle are merely an exemplary implementation.
- another matching rule may alternatively be set for the aggregated obstacle and the non-aggregated obstacle.
- a specific setting manner of another matching rule is not limited in this disclosure, and may be set as required, provided that the aggregated obstacle may be tracked more accurately.
- the unmanned driving device acquires laser point clouds in a surrounding environment, and performs obstacle tracking based on the laser point clouds.
- the unmanned driving device may alternatively track each obstacle in the surrounding environment in a manner of acquiring environment images. If the unmanned driving device performs obstacle tracking in the manner of acquiring environment images, two frames of environment images continuously acquired by the unmanned driving device may be obtained, and obstacle tracking is performed in the manner described in the foregoing step S102 to step S110. Because the foregoing steps have been described in detail, details are not described again in this disclosure.
- the unmanned driving device may alternatively comprehensively perform obstacle tracking with reference to the acquired environment images and laser point clouds. This is not limited in this disclosure, and may be specifically set as required.
- the obstacle tracking method shown in this disclosure may be used in an unmanned delivery process.
- the unmanned driving device may track each surrounding obstacle by using the obstacle tracking method in this disclosure, and plan a delivery path according to obstacle information of each obstacle tracked in real time, to perform the delivery task according to the planned path.
- an embodiment in accordance with this disclosure further correspondingly provides a schematic structural diagram of an obstacle tracking apparatus shown in FIG. 3 .
- FIG. 3 is a schematic structural diagram of an obstacle tracking apparatus according to an embodiment of this disclosure.
- the apparatus includes:
- the first matching module 206 is specifically configured to perform at least one of the following: respectively determine the matching parameter value between the aggregated obstacle and each obstacle in the first point cloud according to a distance between the aggregated obstacle and each obstacle in the first point cloud; respectively determine the matching parameter value between the aggregated obstacle and each obstacle in the first point cloud according to a size of the aggregated obstacle and a size of each obstacle in the first point cloud; respectively determine the matching parameter value between the aggregated obstacle and each obstacle in the first point cloud according to a quantity of points included in the aggregated obstacle in the second point cloud and a quantity of points included in each obstacle in the first point cloud; or respectively determine the matching parameter value between the aggregated obstacle and each obstacle in the first point cloud according to a type of the aggregated obstacle and a type of each obstacle in the first point cloud.
- the second matching module 208 is specifically configured to perform at least one of the following: respectively determine the matching parameter value between the non-aggregated obstacle and each obstacle in the first point cloud according to a distance between the non-aggregated obstacle and each obstacle in the first point cloud; respectively determine the matching parameter value between the non-aggregated obstacle and each obstacle in the first point cloud according to a size of the non-aggregated obstacle and a size of each obstacle in the first point cloud; respectively determine the matching parameter value between the non-aggregated obstacle and each obstacle in the first point cloud according to a quantity of points included in the non-aggregated obstacle in the second point cloud and a quantity of points included in each obstacle in the first point cloud; respectively determine the matching parameter value between the non-aggregated obstacle and each obstacle in the first point cloud according to a type of the non-aggregated obstacle and a type of each obstacle in the first point cloud; or respectively determine the matching parameter value between the non-aggregated obstacle and each obstacle in the first point cloud according to an orientation of the non-aggregated obstacle and an orientation of each obstacle in the first point cloud.
- the aggregated obstacle is changed to a non-aggregated obstacle in a case that the matching result of the aggregated obstacle is not determined.
- the second matching module 208 is specifically configured to: determine, from each obstacle included in the first point cloud, obstacles that do not match any aggregated obstacle in the second point cloud as a first obstacle set; and for each non-aggregated obstacle, respectively determine a matching parameter value between the non-aggregated obstacle and each obstacle in the first obstacle set by using a non-group matching rule, and determine a matching result of the non-aggregated obstacle according to the matching parameter value between the non-aggregated obstacle and each obstacle in the first obstacle set.
- the determining module 202 is further configured to determine that each obstacle included in the second point cloud is a non-aggregated obstacle, when the first point cloud does not include the obstacle aggregation region.
- An embodiment in accordance with this disclosure further provides a computer-readable storage medium, storing a computer program.
- the computer program may be used for performing the foregoing obstacle tracking method provided in FIG. 1 .
- an embodiment in accordance with this disclosure further provides an unmanned driving device, a schematic structural diagram of which is shown in FIG. 4 .
- the unmanned driving device includes a processor, an internal bus, a network interface, an internal memory, and a non-volatile memory, and may certainly further include hardware required for other services.
- the processor reads a corresponding computer program from the non-volatile memory into the internal memory and then runs the computer program, to implement the foregoing obstacle tracking method shown in FIG. 1 .
- this disclosure does not exclude other implementations, for example, a logic device or a combination of software and hardware.
- an entity executing the following processing procedure is not limited to the logic units, and may also be hardware or logic devices.
- improvements of a technology can be clearly distinguished as hardware improvements (for example, improvements to a circuit structure such as a diode, a transistor, or a switch) or software improvements (improvements to a method procedure).
- improvements of many method procedures can be considered as direct improvements of hardware circuit structures.
- Designers usually obtain a corresponding hardware circuit structure by programming an improved method procedure into a hardware circuit. Therefore, it cannot be concluded that an improvement of a method procedure cannot be implemented by using a hardware entity module.
- a programmable logic device such as a field programmable gate array (FPGA) is a type of integrated circuit whose logic function is determined by a user by programming the device.
- The programming is mostly implemented by using hardware description languages (HDLs). There are many HDLs, for example, advanced Boolean expression language (ABEL), Altera hardware description language (AHDL), Confluence, Cornell university programming language (CUPL), HDCal, Java hardware description language (JHDL), Lava, Lola, MyHDL, PALASM, Ruby hardware description language (RHDL), and the like. Currently, very-high-speed integrated circuit hardware description language (VHDL) and Verilog are most commonly used.
- the controller can be implemented in any suitable manner. For example, the controller can take the form of a microprocessor or processor and a computer-readable medium storing computer-readable program code (for example, software or firmware) executable by the processor, a logic gate, a switch, an application-specific integrated circuit (ASIC), a programmable logic controller, or an embedded microcontroller.
- Examples of the controller include, but are not limited to, the following microcontrollers: ARC 625D, Atmel AT91SAM, Microchip PIC18F26K20, and Silicon Labs C8051F320.
- the memory controller can also be implemented as part of the memory control logic.
- In addition to implementing the controller in the form of pure computer-readable program code, it is also possible, by logically programming the method steps, to implement the controller in the form of a logic gate, a switch, an ASIC, a programmable logic controller, an embedded microcontroller, or other forms to achieve the same functions.
- a controller can thus be considered as a hardware component and apparatuses included therein for implementing various functions can also be considered as structures inside the hardware component.
- apparatuses configured to implement various functions can be considered as both software modules implementing the method and structures inside the hardware component.
- the system, the apparatus, the module or the unit described in the foregoing embodiments may be implemented by a computer chip or an entity, or implemented by a product having a certain function.
- a typical implementation device is a computer.
- the computer may be, for example, a personal computer, a laptop computer, a cellular phone, a camera phone, a smartphone, a personal digital assistant, a media player, a navigation device, an email device, a game console, a tablet computer, a wearable device, or a combination of any of these devices.
- the apparatus is divided into units according to functions, which are respectively described.
- the functions of the units may be implemented in the same piece of or a plurality of pieces of software and/or hardware.
- the embodiments of the present invention may be provided as a method, a system, or a computer program product. Therefore, the present invention may be in the form of complete hardware embodiments, complete software embodiments, or embodiments combining software and hardware. Moreover, the present invention may use a form of a computer program product that is implemented on one or more computer-usable storage media (including but not limited to a disk memory, a compact disc read-only memory (CD-ROM), an optical memory, and the like) that include computer-usable program code.
- These computer program instructions may be provided to a general-purpose computer, a special-purpose computer, an embedded processor, or a processor of another programmable data processing device to generate a machine, so that an apparatus configured to implement functions specified in one or more procedures in the flowcharts and/or one or more blocks in the block diagrams is generated by using instructions executed by the general-purpose computer or the processor of another programmable data processing device.
- These computer program instructions may alternatively be stored in a computer-readable memory that can instruct a computer or another programmable data processing device to work in a specific manner, so that the instructions stored in the computer-readable memory generate an artifact that includes an instruction apparatus.
- the instruction apparatus implements a specific function in one or more procedures in the flowcharts and/or in one or more blocks in the block diagrams.
- These computer program instructions may also be loaded onto a computer or another programmable data processing device, so that a series of operations and steps are performed on the computer or the another programmable device, thereby generating computer-implemented processing. Therefore, the instructions executed on the computer or the another programmable device provide steps for implementing a specific function in one or more processes in the flowcharts and/or in one or more blocks in the block diagrams.
- a computing device includes one or more processors (CPUs), an input/output interface, a network interface, and an internal memory.
- the memory may include a form such as a volatile memory, a random-access memory (RAM) and/or a non-volatile memory such as a read-only memory (ROM) or a flash RAM in a computer-readable medium.
- the memory is an example of the computer-readable medium.
- the computer-readable medium includes a non-volatile medium and a volatile medium, a removable medium and a non-removable medium, which may implement storage of information by using any method or technology.
- the information may be a computer-readable instruction, a data structure, a program module, or other data.
- An example of a computer storage medium includes, but is not limited to, a phase-change memory (PRAM), a static random access memory (SRAM), a dynamic random access memory (DRAM), another type of RAM, a ROM, an electrically erasable programmable read-only memory (EEPROM), a flash memory or another memory technology, a compact disc read-only memory (CD-ROM), a digital versatile disc (DVD) or another optical memory, a cassette, a magnetic tape, a magnetic disk memory or another magnetic storage device, or any other non-transmission medium, and may be used to store information accessible by the computing device.
- the computer-readable medium does not include transitory computer-readable media (transitory media), such as a modulated data signal or a carrier wave.
- this disclosure may be provided as a method, a system, or a computer program product. Therefore, this disclosure may use a form of hardware only embodiments, software only embodiments, or embodiments with a combination of software and hardware. Moreover, this disclosure may use a form of a computer program product that is implemented on one or more computer-usable storage media (including but not limited to a disk memory, a compact disc read-only memory (CD-ROM), an optical memory, and the like) that include computer-usable program code.
- the program module includes a routine, a program, an object, a component, a data structure, and the like for executing a particular task or implementing a particular abstract data type.
- This disclosure may also be implemented in a distributed computing environment in which tasks are performed by remote processing devices connected by using a communication network.
- the program module may be located in both local and remote computer storage media including storage devices.
Landscapes
- Engineering & Computer Science (AREA)
- Automation & Control Theory (AREA)
- Transportation (AREA)
- Mechanical Engineering (AREA)
- Physics & Mathematics (AREA)
- Mathematical Physics (AREA)
- Human Computer Interaction (AREA)
- Multimedia (AREA)
- Computer Vision & Pattern Recognition (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Optical Radar Systems And Details Thereof (AREA)
- Traffic Control Systems (AREA)
Abstract
Description
- obtaining two frames of laser point clouds acquired by an unmanned driving device as a first point cloud and a second point cloud;
- determining an obstacle aggregation region in the first point cloud according to position information of each obstacle in the first point cloud;
- determining, according to a position of the obstacle aggregation region, at least one obstacle in the second point cloud that falls within the obstacle aggregation region as at least one aggregated obstacle, and at least one another obstacle in the second point cloud except the at least one aggregated obstacle as at least one non-aggregated obstacle;
- for each of the at least one aggregated obstacle, respectively determining matching parameter values between the aggregated obstacle and each obstacle in the first point cloud by using a group matching rule, and determining a matching result of the aggregated obstacle according to the matching parameter values between the aggregated obstacle and each obstacle in the first point cloud;
- for each non-aggregated obstacle in the at least one non-aggregated obstacle, respectively determining matching parameter values between the non-aggregated obstacle and each obstacle in the first point cloud by using a non-group matching rule, and determining a matching result of the non-aggregated obstacle according to the matching parameter values between the non-aggregated obstacle and each obstacle in the first point cloud, where a constraint condition of the group matching rule is stronger than a constraint condition of the non-group matching rule; and
- determining an obstacle tracking result according to a matching result of each of the at least one aggregated obstacle and a matching result of each non-aggregated obstacle in the at least one non-aggregated obstacle.
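The classification-then-match flow of the steps above can be sketched as follows. The `Obstacle` class, the `in_region` predicate, and the `group_match`/`non_group_match` callables are illustrative placeholders for the aggregation-region test and the two matching rules, whose internals the disclosure describes separately:

```python
from dataclasses import dataclass


@dataclass
class Obstacle:
    oid: int
    position: tuple  # (x, y) in a common frame for both point clouds


def track(first_cloud, second_cloud, in_region, group_match, non_group_match):
    # Classify second-point-cloud obstacles by whether they fall within the
    # obstacle aggregation region determined from the first point cloud.
    aggregated = [o for o in second_cloud if in_region(o.position)]
    non_aggregated = [o for o in second_cloud if not in_region(o.position)]
    results = {}
    for o in aggregated:        # group matching rule (stronger constraint)
        results[o.oid] = group_match(o, first_cloud)
    for o in non_aggregated:    # non-group matching rule (weaker constraint)
        results[o.oid] = non_group_match(o, first_cloud)
    return results              # per-obstacle matching results for tracking
```

Splitting the matching into two rules lets densely clustered obstacles be matched under tighter constraints than isolated ones, which is the core of the claimed method.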
-
- an obtaining module, configured to obtain two frames of laser point clouds acquired by an unmanned driving device as a first point cloud and a second point cloud respectively;
- a determining module, configured to determine an obstacle aggregation region in the first point cloud according to position information of each obstacle in the first point cloud;
- a classification module, configured to determine, according to a position of the obstacle aggregation region, at least one obstacle in the second point cloud that falls within the obstacle aggregation region as at least one aggregated obstacle, and at least one another obstacle in the second point cloud except the at least one aggregated obstacle as at least one non-aggregated obstacle;
- a first matching module, configured to, for each of the at least one aggregated obstacle, respectively determine matching parameter values between the aggregated obstacle and each obstacle in the first point cloud by using a group matching rule, and determine a matching result of the aggregated obstacle according to the matching parameter values between the aggregated obstacle and each obstacle in the first point cloud;
- a second matching module, configured to, for each non-aggregated obstacle in the at least one non-aggregated obstacle, respectively determine matching parameter values between the non-aggregated obstacle and each obstacle in the first point cloud by using a non-group matching rule, and determine a matching result of the non-aggregated obstacle according to the matching parameter values between the non-aggregated obstacle and each obstacle in the first point cloud, where a constraint condition of the group matching rule is stronger than a constraint condition of the non-group matching rule; and
- a tracking module, configured to determine an obstacle tracking result according to a matching result of each of the at least one aggregated obstacle and a matching result of each non-aggregated obstacle in the at least one non-aggregated obstacle.
where Δt represents the distance between the two obstacles, t0 represents a second predetermined threshold, and may be specifically set as required, x1 represents a first weight, x2 represents a second weight, and f(t) represents the matching parameter value between the two obstacles in the distance dimension. When the distance between the two obstacles is greater than the second predetermined threshold, it indicates a lower matching degree between the two obstacles, and it may be determined that the matching parameter value between the two obstacles in the distance dimension is x2Δt. When the distance between the two obstacles is less than the second predetermined threshold, it indicates a higher matching degree between the two obstacles, and it may be determined that the matching parameter value between the two obstacles in the distance dimension is x1Δt. Moreover, because the matching parameter value is negatively correlated with the similarity between the two obstacles, the second weight x2 is often set to be greater than the first weight x1.
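The piecewise f(t) described above translates directly to code; assigning the boundary case Δt = t0 to the heavier branch is a convention of this sketch, since the description does not specify it:

```python
def distance_match_param(delta_t: float, t0: float, x1: float, x2: float) -> float:
    # f(t): piecewise-weighted distance between two obstacles. Lower values
    # indicate higher similarity, so once the distance exceeds the second
    # predetermined threshold t0, the heavier weight x2 (> x1) applies.
    return x1 * delta_t if delta_t < t0 else x2 * delta_t
```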
where O1 represents the quantity of points included in the obstacle in the first point cloud, O2 represents the quantity of points included in the aggregated obstacle in the second point cloud, x4 represents a fourth weight corresponding to the point quantity dimension, and g(O) represents the matching parameter value between the two obstacles in the point quantity dimension.
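The excerpt names the operands of g(O) (the point quantities O1 and O2 and the fourth weight x4) but not its exact form; a weighted normalized difference in point counts is one plausible, purely hypothetical realization:

```python
def point_quantity_match_param(o1: int, o2: int, x4: float) -> float:
    # Hypothetical g(O): weighted, normalized difference in the number of
    # points of the two obstacles. Identical counts yield 0 (best match);
    # the exact formula is not reproduced in this excerpt.
    return x4 * abs(o1 - o2) / max(o1, o2)
```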
- an obtaining module 200, configured to obtain two frames of laser point clouds acquired by an unmanned driving device as a first point cloud and a second point cloud respectively;
- a determining module 202, configured to determine an obstacle aggregation region in the first point cloud according to position information of each obstacle in the first point cloud;
- a classification module 204, configured to determine, according to a position of the obstacle aggregation region, at least one obstacle in the second point cloud that falls within the obstacle aggregation region as at least one aggregated obstacle, and at least one another obstacle in the second point cloud except the at least one aggregated obstacle as at least one non-aggregated obstacle;
- a first matching module 206, configured to, for each of the at least one aggregated obstacle, respectively determine a matching parameter value between the aggregated obstacle and each obstacle in the first point cloud by using a group matching rule, and determine a matching result of the aggregated obstacle according to the matching parameter value between the aggregated obstacle and each obstacle in the first point cloud;
- a second matching module 208, configured to, for each non-aggregated obstacle in the at least one non-aggregated obstacle, respectively determine a matching parameter value between the non-aggregated obstacle and each obstacle in the first point cloud by using a non-group matching rule, and determine a matching result of the non-aggregated obstacle according to the matching parameter value between the non-aggregated obstacle and each obstacle in the first point cloud, where a constraint condition of the group matching rule is stronger than a constraint condition of the non-group matching rule; and
- a tracking module 210, configured to determine an obstacle tracking result according to a matching result of each of the at least one aggregated obstacle and a matching result of each non-aggregated obstacle in the at least one non-aggregated obstacle.
Claims (20)
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202110364576.6A CN112734810B (en) | 2021-04-06 | 2021-04-06 | Obstacle tracking method and device |
| CN202110364576.6 | 2021-04-06 |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| US20220314980A1 US20220314980A1 (en) | 2022-10-06 |
| US12202473B2 true US12202473B2 (en) | 2025-01-21 |
Family
ID=75596440
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US17/669,355 Active 2043-08-03 US12202473B2 (en) | 2021-04-06 | 2022-02-11 | Obstacle tracking method, storage medium and unmanned driving device |
Country Status (2)
| Country | Link |
|---|---|
| US (1) | US12202473B2 (en) |
| CN (1) | CN112734810B (en) |
Cited By (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20250061718A1 (en) * | 2023-08-14 | 2025-02-20 | Suzhou TongRuiXing Technology Co., LTD | Rail area extraction method based on laser point cloud data |
Families Citing this family (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN113807184B (en) * | 2021-08-17 | 2024-06-21 | 北京百度网讯科技有限公司 | Obstacle detection method, device, electronic equipment and autonomous driving vehicle |
| CN117008599A (en) * | 2023-02-27 | 2023-11-07 | 北京石头创新科技有限公司 | Path planning method, related device and robot |
| CN117671643B (en) * | 2023-12-19 | 2025-04-18 | 北京百度网讯科技有限公司 | Obstacle detection method and device |
| CN119538073B (en) * | 2025-01-22 | 2025-05-13 | 之江实验室 | Deep learning network modulation method for sonar point cloud semantic understanding for deep sea scenes |
Citations (7)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN104318206A (en) | 2014-09-30 | 2015-01-28 | 东软集团股份有限公司 | Barrier detection method and apparatus |
| US20190080183A1 (en) | 2017-09-14 | 2019-03-14 | Baidu Online Network Technology (Beijing) Co., Ltd . | Dynamic obstacle point cloud annotating method and apparatus, device and readable medium |
| CN109509210A (en) | 2017-09-15 | 2019-03-22 | 百度在线网络技术(北京)有限公司 | Barrier tracking and device |
| CN110458055A (en) | 2019-07-29 | 2019-11-15 | 江苏必得科技股份有限公司 | A kind of obstacle detection method and system |
| CN112347999A (en) | 2021-01-07 | 2021-02-09 | 深圳市速腾聚创科技有限公司 | Obstacle recognition model training method, obstacle recognition method, device and system |
| CN112348897A (en) | 2020-11-30 | 2021-02-09 | 上海商汤临港智能科技有限公司 | Pose determination method and device, electronic equipment and computer readable storage medium |
| CN112393735A (en) | 2019-08-15 | 2021-02-23 | 纳恩博(北京)科技有限公司 | Positioning method and device, storage medium and electronic device |
-
2021
- 2021-04-06 CN CN202110364576.6A patent/CN112734810B/en active Active
-
2022
- 2022-02-11 US US17/669,355 patent/US12202473B2/en active Active
Patent Citations (7)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN104318206A (en) | 2014-09-30 | 2015-01-28 | 东软集团股份有限公司 | Barrier detection method and apparatus |
| US20190080183A1 (en) | 2017-09-14 | 2019-03-14 | Baidu Online Network Technology (Beijing) Co., Ltd . | Dynamic obstacle point cloud annotating method and apparatus, device and readable medium |
| CN109509210A (en) | 2017-09-15 | 2019-03-22 | 百度在线网络技术(北京)有限公司 | Barrier tracking and device |
| CN110458055A (en) | 2019-07-29 | 2019-11-15 | 江苏必得科技股份有限公司 | A kind of obstacle detection method and system |
| CN112393735A (en) | 2019-08-15 | 2021-02-23 | 纳恩博(北京)科技有限公司 | Positioning method and device, storage medium and electronic device |
| CN112348897A (en) | 2020-11-30 | 2021-02-09 | 上海商汤临港智能科技有限公司 | Pose determination method and device, electronic equipment and computer readable storage medium |
| CN112347999A (en) | 2021-01-07 | 2021-02-09 | 深圳市速腾聚创科技有限公司 | Obstacle recognition model training method, obstacle recognition method, device and system |
Non-Patent Citations (2)
| Title |
|---|
| Fei Yang et al., "Real-time dynamic obstacle detection and tracking using 3D Lidar", Sep. 30, 2012, 7 pages. |
| State Intellectual Property Office of the People's Republic of China, Office Action and Search Report Issued in Application No. 2021103645766, May 14, 2021, 10 pages. (Submitted with Machine/Partial Translation). |
Cited By (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20250061718A1 (en) * | 2023-08-14 | 2025-02-20 | Suzhou TongRuiXing Technology Co., LTD | Rail area extraction method based on laser point cloud data |
| US12283105B2 (en) * | 2023-08-14 | 2025-04-22 | Suzhou TongRuiXing Technology Co., LTD | Rail area extraction method based on laser point cloud data |
Also Published As
| Publication number | Publication date |
|---|---|
| US20220314980A1 (en) | 2022-10-06 |
| CN112734810A (en) | 2021-04-30 |
| CN112734810B (en) | 2021-07-02 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US12202473B2 (en) | Obstacle tracking method, storage medium and unmanned driving device | |
| US20220324483A1 (en) | Trajectory prediction method and apparatus, storage medium, and electronic device | |
| US20220319189A1 (en) | Obstacle tracking method, storage medium, and electronic device | |
| US20210154835A1 (en) | Robot path planning method and apparatus and robot using the same | |
| CN112068553A (en) | Robot obstacle avoidance processing method and device and robot | |
| CN112987760B (en) | Trajectory planning method and device, storage medium and electronic equipment | |
| CN111882611A (en) | Map construction method and device | |
| US12019447B2 (en) | Method and apparatus for trajectory planning, storage medium, and electronic device | |
| US10647315B2 (en) | Accident probability calculator, accident probability calculation method, and non-transitory computer-readable medium storing accident probability calculation program | |
| CN114332201B (en) | Model training and target detection method and device | |
| CN115880685B (en) | A three-dimensional target detection method and system based on votenet model | |
| CN113139696B (en) | Trajectory prediction model construction method and trajectory prediction method and device | |
| CN111062372B (en) | Method and device for predicting obstacle track | |
| CN111797711A (en) | Model training method and device | |
| US12387371B2 (en) | Pose determining | |
| CN113642616B (en) | Training sample generation method and device based on environment data | |
| CN114623824A (en) | Method and device for determining barrier speed | |
| CN114545940A (en) | Unmanned equipment control method and device and electronic equipment | |
| CN118053153B (en) | Point cloud data identification method and device, storage medium and electronic equipment | |
| CN117235195B (en) | Pedestrian movement area prediction method, device and equipment | |
| CN117333509A (en) | Target tracking method, device and equipment | |
| CN117008615A (en) | A strategy switching unmanned vehicle trajectory planning method and system | |
| CN114549579A (en) | Target tracking method and device | |
| EP4571250A1 (en) | Methods and apparatus for navigation with point mass filters | |
| CN115014375B (en) | Collision detection method and device, electronic equipment and storage medium |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| AS | Assignment |
Owner name: BEIJING SANKUAI ONLINE TECHNOLOGY CO., LTD., CHINA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:XIA, HUAXIA;CAI, SHANBO;REEL/FRAME:058985/0822 Effective date: 20211230 |
|
| FEPP | Fee payment procedure |
Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
| AS | Assignment |
Owner name: BEIJING SANKUAI ONLINE TECHNOLOGY CO., LTD., CHINA Free format text: CORRECTIVE ASSIGNMENT TO CORRECT THE ASSIGNEE ADDRESS PREVIOUSLY RECORDED ON REEL 058985 FRAME 0822. ASSIGNOR(S) HEREBY CONFIRMS THE ADDRESS IS:ROOM 2106-030, NO.9 WEST NORTH 4TH RING ROAD, HAIDIAN DISTRICT;ASSIGNORS:XIA, HUAXIA;CAI, SHANBO;REEL/FRAME:059353/0501 Effective date: 20211230 |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS |
|
| ZAAB | Notice of allowance mailed |
Free format text: ORIGINAL CODE: MN/=. |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED |
|
| STCF | Information on status: patent grant |
Free format text: PATENTED CASE |