CN114623824A - Method and device for determining obstacle speed - Google Patents

Method and device for determining obstacle speed

Info

Publication number
CN114623824A
Authority
CN
China
Prior art keywords
obstacle
speed
determining
target
bounding box
Prior art date
Legal status (assumed; not a legal conclusion)
Withdrawn
Application number
CN202210115877.XA
Other languages
Chinese (zh)
Inventor
顾晨岚 (Gu Chenlan)
Current Assignee
Beijing Sankuai Online Technology Co Ltd
Original Assignee
Beijing Sankuai Online Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Sankuai Online Technology Co Ltd filed Critical Beijing Sankuai Online Technology Co Ltd
Priority to CN202210115877.XA
Publication of CN114623824A
Legal status: Withdrawn

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 21/00: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C 21/10: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C 21/20: Instruments for performing navigational calculations

Abstract

The specification discloses a method and a device for determining the speed of an obstacle. Target obstacles whose undetermined speed is abnormal are identified. For each target obstacle, the polygon bounding boxes corresponding to the current time and to each historical time are determined, and the bounding-box displacements between several pairs of polygon bounding boxes at non-consecutive times are computed. When the target obstacle is re-verified as a dynamic obstacle, a target time is selected from the historical times, and the final speed of the target obstacle at the current time is determined from the bounding-box displacement between its polygon bounding boxes at the current time and at the target time. In this way, the target obstacles whose speed needs to be verified again can be screened out, the displacement of a target obstacle between two non-consecutive times can be determined more accurately from a polygon bounding box that depicts the obstacle's shape, errors are reduced, and a more accurate final speed of a dynamic target obstacle at the current time is obtained.

Description

Method and device for determining obstacle speed
Technical Field
The present disclosure relates to the field of unmanned driving, and more particularly, to a method and apparatus for determining a speed of an obstacle.
Background
In some fields, it is necessary to identify the speed of a target object based on data collected by a sensor to perform a corresponding task. For example, in the field of unmanned driving, when an unmanned device moves in an environment, it is necessary to identify obstacles in the environment based on environmental data collected by a sensor and estimate the speed of the obstacles, to perform path planning, determine a control strategy, and the like based further on the speed of the obstacles.
At present, the current speed of an obstacle is usually calculated by performing matching tracking on the obstacle in two adjacent frames of environmental data acquired by a sensor and determining the corresponding relationship between the obstacles in the two adjacent frames of environmental data.
However, in the prior art it is difficult to distinguish low-speed obstacles from static obstacles, and the accuracy of the speed determined for a low-speed obstacle is low.
A method for optimizing the determined speed of low-speed obstacles is therefore needed.
Disclosure of Invention
The present disclosure provides a method and apparatus for determining a speed of an obstacle, which partially solve the above problems of the prior art.
The technical scheme adopted by the specification is as follows:
the present specification provides a method of determining the velocity of an obstacle, comprising:
determining an obstacle with abnormal undetermined speed as a target obstacle according to the undetermined speed of each obstacle at the current moment and the historical speed of each obstacle at each historical moment;
for each target obstacle, determining, according to a preset first time period, a polygon bounding box depicting the shape of the target obstacle at each of a number of times including the current time;
determining, according to the polygon bounding boxes determined for the respective times, the bounding-box displacements of a plurality of pairs of polygon bounding boxes corresponding to non-consecutive times;
judging, according to the determined bounding-box displacements, whether the target obstacle is a dynamic obstacle;
if so, determining, from the respective times, a target time whose time difference from the current time is larger than a preset first time difference, and determining the final speed of the target obstacle at the current time according to the bounding-box displacement between the polygon bounding boxes of the target obstacle at the current time and at the target time.
Optionally, the method for determining the obstacle with abnormal undetermined speed according to the undetermined speed of each obstacle at the current time and the historical speed of each obstacle at each historical time specifically includes:
determining a low-speed obstacle with an undetermined speed smaller than a preset speed threshold value from all obstacles;
determining each historical time before the current time according to a preset second time period, and determining the historical speed of each low-speed obstacle at each historical time;
for each low-speed obstacle, judging whether the undetermined speed of the low-speed obstacle and its historical speeds are all larger than a preset lower speed limit, and judging whether, within the second time period, the difference between the speed directions of the low-speed obstacle at two adjacent times is within a preset angle range;
if the judgment results are yes, determining that the low-speed obstacle is an obstacle with normal undetermined speed, and taking the undetermined speed of the low-speed obstacle as the final speed of the current time of the low-speed obstacle;
and if any judgment result is negative, determining that the low-speed obstacle is an obstacle with abnormal undetermined speed.
Optionally, determining, according to the polygon bounding boxes determined for the respective times, the bounding-box displacements of a plurality of pairs of polygon bounding boxes corresponding to non-consecutive times specifically includes:
determining the current speed direction of the target obstacle;
determining, according to a preset second time difference, a plurality of pairs of polygon bounding boxes of the target obstacle corresponding to non-consecutive times, the second time difference being smaller than the first time difference;
and determining bounding box displacement between each pair of polygonal bounding boxes according to the distance of the pair of polygonal bounding boxes in the speed direction.
Optionally, determining the current speed direction of the target obstacle specifically includes:
projecting the polygon bounding box of the target obstacle corresponding to each historical moment onto the polygon bounding box of the target obstacle corresponding to the current moment;
determining a representative bounding box from the polygon bounding boxes at each historical moment;
determining an offset vector corresponding to the target obstacle according to the identification point of the representative bounding box and the identification point of the polygon bounding box at the current time;
determining a rectangular bounding box corresponding to the target obstacle at the current time, and determining a center line of the rectangular bounding box;
and determining the direction of the vector corresponding to the center line as the current speed direction of the target obstacle when the included angle between the vector corresponding to the center line and the offset vector is smaller than a preset angle threshold.
Optionally, determining the bounding-box displacement between the pair of polygon bounding boxes according to the distance of the pair of polygon bounding boxes in the speed direction specifically includes:
determining a start bounding box and an end bounding box in the pair of polygon bounding boxes according to the time order;
determining a head edge point and a tail edge point of the start bounding box, and a head edge point and a tail edge point of the end bounding box, according to the intersection points of the center line with the start bounding box and with the end bounding box;
and determining the bounding-box displacement between the pair of polygon bounding boxes in the speed direction according to at least one of the distance between the head edge points of the start and end bounding boxes and the distance between their tail edge points.
Optionally, judging whether the target obstacle is a dynamic obstacle according to the determined bounding-box displacements specifically includes:
judging, for each bounding-box displacement, whether the displacement is larger than a preset displacement threshold;
when a bounding-box displacement is larger than the displacement threshold, determining that it is an accurate displacement;
and when the number of accurate displacements among all the bounding-box displacements is larger than a preset number, determining that the target obstacle is a dynamic obstacle within the first time period.
Optionally, determining an obstacle with an abnormal undetermined speed as a target obstacle according to the undetermined speed of each obstacle at the current time and the historical speed of each obstacle at each historical time specifically includes:
for each obstacle in the current environment, determining the currently undetermined speed and type of the obstacle and the distance between the obstacle and the unmanned equipment;
determining a low-speed obstacle which belongs to a preset type and has a distance with the unmanned equipment smaller than a preset distance threshold value according to the current undetermined speed and type of each obstacle and the distance with the unmanned equipment, and taking the low-speed obstacle as a candidate obstacle;
and determining candidate obstacles with abnormal undetermined speed as target obstacles according to the undetermined speed of each candidate obstacle at the current moment and the historical speed of each candidate obstacle at each historical moment.
The present specification provides an apparatus for determining the velocity of an obstacle, comprising:
the target determining module is used for determining the obstacle with abnormal undetermined speed as the target obstacle according to the undetermined speed of each obstacle at the current moment and the historical speed of each obstacle at each historical moment;
the bounding box determining module is used for determining, for each target obstacle and according to a preset first time period, a polygon bounding box depicting the shape of the target obstacle at each of the times including the current time;
the displacement determining module is used for determining, according to the polygon bounding boxes determined for the respective times, the bounding-box displacements of a plurality of pairs of polygon bounding boxes corresponding to non-consecutive times;
the judging module is used for judging, according to the determined bounding-box displacements, whether the target obstacle is a dynamic obstacle;
and if so, determining, from the respective times, a target time whose time difference from the current time is larger than a preset first time difference, and determining the final speed of the target obstacle at the current time according to the bounding-box displacement between the polygon bounding boxes of the target obstacle at the current time and at the target time.
The present specification provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the above-described method of determining velocity of an obstacle.
The present specification provides an electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, the processor implementing the above method of determining the velocity of an obstacle when executing the program.
The technical solution adopted by this specification can achieve the following beneficial effects:
In the method for determining the speed of an obstacle provided by this specification, the target obstacles whose undetermined speed is abnormal are determined. For each target obstacle, the polygon bounding boxes corresponding to the current time and to each historical time are determined, so that the bounding-box displacements of a plurality of pairs of polygon bounding boxes corresponding to non-consecutive times can be computed. When the target obstacle is re-verified as a dynamic obstacle, a target time is determined from the historical times, and the final speed of the target obstacle at the current time is determined according to the bounding-box displacement between the polygon bounding boxes of the target obstacle at the current time and at the target time.
The method can screen out the target obstacles whose speed needs to be verified again, and determines the bounding-box displacement of a target obstacle between two non-consecutive times more accurately, based on a polygon bounding box that depicts the shape of the obstacle. Moreover, because the displacement calculated between two non-consecutive times is little affected by noise, and whether the obstacle is a dynamic obstacle is verified again on the basis of the bounding-box displacements corresponding to a plurality of non-consecutive time pairs, error interference is further reduced and the final speed of a dynamic target obstacle at the current time can be determined more accurately.
Drawings
The accompanying drawings, which are included to provide a further understanding of the specification and are incorporated in and constitute a part of this specification, illustrate embodiments of the specification and, together with the description, serve to explain its principles; they are not intended to limit the specification. In the drawings:
FIG. 1 is a schematic flow chart of a method for determining the velocity of an obstacle according to the present disclosure;
FIG. 2 is a schematic illustration of a first time difference provided herein;
FIG. 3 is a schematic diagram of an offset vector of a target obstacle provided herein;
FIG. 4 is a schematic illustration of determining a velocity direction of a target obstacle as provided herein;
FIG. 5 is a schematic illustration of determining a velocity direction of a target obstacle as provided herein;
FIG. 6 is a schematic diagram of head edge points and tail edge points provided in the present specification;
FIG. 7 is a schematic diagram of an apparatus for determining velocity of an obstacle as provided herein;
fig. 8 is a schematic structural diagram of an electronic device provided in this specification.
Detailed Description
At present, an unmanned device generally detects environment data collected by a sensor (such as a vision sensor, a radar, etc.) through a target detection algorithm, and identifies each obstacle in the environment at a time corresponding to the environment data. Generally, each obstacle identified based on target detection is identified by a Bounding Box (B-Box), that is, a rectangular Bounding Box, and the position of the obstacle is also determined based on the position of the rectangular Bounding Box of the obstacle.
For each obstacle in the environment data acquired at the current time, the obstacle is matched, based on its rectangular bounding box, against the obstacles in the environment data acquired at the previous time, and the correspondence between them is determined. That is, by matching two adjacent frames of environment data, it is determined which obstacle in the previous frame and which obstacle in the current frame correspond to the same obstacle in the environment.
According to the matching relation of an obstacle between the current frame and the previous frame of environment data, the position difference of the obstacle between the two adjacent frames (determined from the center positions of its rectangular bounding boxes), and the time difference between collecting the two frames, the current speed of the obstacle can be calculated.
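As a rough illustration of this baseline approach (not part of the claimed method), the following sketch estimates an obstacle's speed from the center points of its rectangular bounding boxes in two adjacent frames; the box representation and the 0.1 s frame interval are assumptions made for the example.

```python
import math

def center_of_bbox(bbox):
    """Center (x, y) of an axis-aligned rectangular bounding box (x_min, y_min, x_max, y_max)."""
    x_min, y_min, x_max, y_max = bbox
    return ((x_min + x_max) / 2.0, (y_min + y_max) / 2.0)

def baseline_speed(prev_bbox, curr_bbox, frame_interval=0.1):
    """Speed estimate from the center displacement between two adjacent frames."""
    (x0, y0), (x1, y1) = center_of_bbox(prev_bbox), center_of_bbox(curr_bbox)
    displacement = math.hypot(x1 - x0, y1 - y0)   # meters
    return displacement / frame_interval          # meters per second

# Example: the same matched obstacle in two consecutive frames, 0.1 s apart.
prev_box = (10.0, 2.0, 14.5, 4.0)
curr_box = (10.2, 2.0, 14.7, 4.1)
print(round(baseline_speed(prev_box, curr_box), 2))  # ~2.06 m/s from a ~0.21 m center shift
```

With jittery rectangular boxes, the 0.21 m center shift above could easily be detection noise rather than motion, which is exactly the weakness discussed next.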
However, when the speed of an obstacle is estimated with the above method during the movement of the unmanned device, the rectangular bounding box obtained by target detection only marks the obstacle in the environment data and encloses its approximate position, so the accuracy with which it localizes the obstacle is not high. The rectangular bounding box also jitters severely: for the same obstacle, the rectangular bounding boxes identified from different frames of environment data differ considerably in size and position, and their stability is poor. As a result, the position of the obstacle determined from the rectangular bounding box is not accurate, which in turn makes the speed determined from the rectangular bounding box inaccurate for some low-speed obstacles, such as vehicles or non-motor vehicles that are slowly crossing the road or turning; for these obstacles, their position differences at different times must be determined accurately in order to determine their speed accurately and distinguish them from static obstacles.
In addition, in the method that determines the speed of a target obstacle based on the rectangular bounding box, if an obstacle is occluded by other obstacles during its movement, the sizes of its rectangular bounding boxes before and after the occlusion differ greatly, the positions of the two bounding boxes' center points on the obstacle body also differ greatly, and the obstacle displacement determined from those center points therefore has a large error.
Moreover, the existing methods for estimating obstacle speed have difficulty distinguishing low-speed obstacles from static obstacles. Inaccurate discrimination between static and dynamic obstacles and inaccurate speeds for low-speed obstacles adversely affect operations such as path planning of the unmanned device and put the unmanned device at risk.
Therefore, a method that can determine the speed of a low-speed moving obstacle more accurately is urgently needed.
In this specification, in order to identify the position of an obstacle in the environment data more accurately, in addition to determining a rectangular bounding box of the obstacle through target detection, a polygon bounding box that depicts the shape of the obstacle and follows its outline more closely may be determined. After the speed of the obstacle is obtained by a preliminary calculation, that speed is not trusted directly; it is treated as an undetermined speed and is verified against the speed of the obstacle at the current time and its historical speeds, so that the obstacles whose undetermined speed is abnormal are determined as the target obstacles. Each target obstacle then needs to be verified again, that is, it must be determined whether it is a dynamic obstacle moving at a low speed or a static obstacle mistakenly recognized as a dynamic one. When the target obstacle is indeed a low-speed dynamic obstacle, its speed is re-determined, as the final speed at the current time, based on the bounding-box displacement of the polygon bounding box that identifies its position more accurately.
In other words, the approach of calculating the position difference of an obstacle between frames of environment data from its rectangular bounding box is abandoned, and, to reduce noise interference and errors, the speed of the obstacle at the current frame is not refined using only two adjacent frames of environment data.
In order to make the objects, technical solutions and advantages of the present disclosure more clear, the technical solutions of the present disclosure will be clearly and completely described below with reference to the specific embodiments of the present disclosure and the accompanying drawings. It is to be understood that the embodiments described are only a few embodiments of the present disclosure, and not all embodiments. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments in the present specification without making any creative effort belong to the protection scope of the present specification.
The technical solutions provided by the embodiments of the present description are described in detail below with reference to the accompanying drawings.
Fig. 1 is a schematic flow chart of a method for determining a speed of an obstacle in this specification, which specifically includes the following steps:
s100: and determining the obstacle with abnormal undetermined speed as the target obstacle according to the undetermined speed of each obstacle at the current time and the historical speed of each obstacle at each historical time.
In this specification, the method of determining the speed of an obstacle may be performed by a driving device, which may be an unmanned device (e.g., an unmanned vehicle), a manned device with unmanned-driving functionality, or another device that needs to identify the speed of obstacles in the environment. The following description takes execution by an unmanned device as an example.
In one or more embodiments of this specification, the unmanned device collects environment data at regular time intervals while driving in the environment, acquiring one frame of environment data at each time. For each acquired frame of environment data in turn, the unmanned device can determine, by the method for determining the speed of an obstacle, the final speed of each target obstacle in the environment at the time corresponding to that frame. That is, as time passes, the unmanned device can determine in real time, through this method, the final speed of each target obstacle in the environment at the current time.
Firstly, the unmanned equipment can identify each obstacle in the environment based on the environment data collected at the current moment so as to determine the target obstacle.
Wherein the environmental data may be a point cloud acquired by radar.
In this specification, when each obstacle in the environment is determined by target detection, an existing target detection algorithm may be used; this specification does not limit which one. By performing target detection on the environment data, the bounding box of each obstacle and the type of the obstacle can be determined. The bounding box may be a rectangular bounding box (B-Box) or a polygon bounding box (Polygon).
Various obstacles are generally present in the environment. By speed, they can be classified into high-speed obstacles and low-speed obstacles; by type, into motor vehicles, non-motor vehicles, pedestrians, animals, and so on. The unmanned device identifies obstacles in the environment and determines their speed in order to avoid collisions and danger, and the risk posed by an obstacle varies with its speed and type. For a high-speed obstacle, because it moves quickly, its position differs greatly between two adjacent frames, or between two frames separated by some acquisition interval; the influence of noise on its calculated speed is small, it is easy to distinguish from a static obstacle, and its determined speed is usually accurate. Therefore, the target obstacles in the method for determining the speed of an obstacle provided by this specification can be selected from the low-speed obstacles. Of course, the final speed of a high-speed obstacle at the current time may also be determined by the method provided in this specification; this specification is not limited in this respect.
In addition, some types of obstacles, such as animals, generally have good active-avoidance capability and do not stay on the road for long, so such obstacles are not targets of the method for determining the speed of an obstacle provided in this specification.
In one or more embodiments of this specification, after determining each obstacle in the environment, the unmanned device may determine the undetermined speed of each obstacle at the current time. Then, according to the undetermined speed of each obstacle at the current time and the historical speed of each obstacle at each historical time, the obstacles with abnormal undetermined speeds can be determined as the target obstacles. The undetermined speed is the preliminarily calculated speed of the obstacle, which is not necessarily accurate.
In this specification, the method of determining the undetermined velocity of the obstacle is not limited. For example, the unmanned device may calculate a displacement of the obstacle between two times to obtain a pending speed of the obstacle at the current time based on a center point position of a rectangular enclosure frame of the obstacle at the current time and a center point position of the rectangular enclosure frame of the obstacle at the historical time. Alternatively, the undetermined speed of the obstacle at the current moment can be obtained based on the displacement of the center point of the polygon enclosing frame obtained by identification between two moments. Or, other methods may also be adopted, for example, the observation speed of the obstacle at the current time may be determined based on environmental data acquired by sensors such as a radar and a visual sensor, the predicted speed of the obstacle at the current time may be obtained based on the speed of the obstacle determined at a plurality of historical times, and the undetermined speed of the obstacle at the current time may be obtained through kalman filtering according to the observation speed and the predicted speed of the obstacle at the current time.
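One of the options mentioned above fuses an observed speed with a predicted speed via Kalman filtering. The sketch below is a minimal scalar Kalman update along those lines; the noise variances and the assumption that prediction and observation are single scalar speeds are illustrative, not values prescribed by the specification.

```python
def fuse_pending_speed(predicted_speed, predicted_var, observed_speed, observation_var):
    """One scalar Kalman update: blend the predicted speed (from historical speeds)
    with the observed speed (from the current frame) to obtain the undetermined speed."""
    kalman_gain = predicted_var / (predicted_var + observation_var)
    pending_speed = predicted_speed + kalman_gain * (observed_speed - predicted_speed)
    pending_var = (1.0 - kalman_gain) * predicted_var
    return pending_speed, pending_var

# Example: history suggests ~0.8 m/s, while the current frame observes 1.4 m/s.
speed, var = fuse_pending_speed(predicted_speed=0.8, predicted_var=0.04,
                                observed_speed=1.4, observation_var=0.25)
print(round(speed, 3), round(var, 4))  # the result leans toward the more certain prediction
```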
Not all of the detected obstacles need their speed to be determined again; some of them may already have an accurate undetermined speed. Therefore, after the obstacles are determined, their undetermined speeds can be verified to check whether they are abnormal, so that the target obstacles can be selected from among the obstacles.
A target obstacle is an obstacle whose speed at the current time needs to be re-determined in order to obtain its final speed at the current time.
S102: and determining a polygonal surrounding frame which is corresponding to each time including the current time and is used for describing the shape of the target obstacle according to a preset first time period.
As described above, the polygonal bounding box of the obstacle better fits the outline of the obstacle, so that the shape of the obstacle and the position of the obstacle can be better described, and the position of the obstacle and the displacement of the obstacle within a certain time, which are determined based on the polygonal bounding box, are more accurate.
Therefore, in one or more embodiments of this specification, after the target obstacles are determined, the unmanned device may determine, for each target obstacle and according to a preset first time period, each time including the current time, and determine the polygon bounding box depicting the shape of the target obstacle at each of those times. In other words, polygon bounding boxes of the target obstacle are determined for the current time and for a plurality of historical times within the first time period before the current time, so that in a subsequent step the bounding-box displacement of the target obstacle between different times can be determined from these polygon bounding boxes.
Wherein the first time period can be set as required.
In this specification, the time interval between consecutive times corresponds to the frame rate at which the unmanned device collects environment data. For example, if the unmanned device collects 10 frames of environment data per second, the time interval between consecutive times may be 1/10 second; that is, 1 second contains 10 times, and each time corresponds to one frame of environment data. Assuming the first time period corresponds to a duration of 1 second, the unmanned device may determine, according to the first time period, 10 times consisting of the current time and the 9 historical times before it, together with the polygon bounding boxes of the target obstacle at each of those 10 times.
S104: and determining the enclosure frame displacement of the polygon enclosure frames corresponding to a plurality of pairs of discontinuous moments according to the determined polygon enclosure frames corresponding to the moments.
In one or more embodiments of the present disclosure, after determining, according to the first time period, each polygon bounding box of the target obstacle at each time in the first time period, the unmanned device may determine, according to the determined polygon bounding boxes corresponding to each time, bounding box displacements of a plurality of pairs of polygon bounding boxes corresponding to non-consecutive times.
Specifically, the drone may first determine a current speed direction of the target obstacle. And then, determining a plurality of pairs of polygonal surrounding frames of the target obstacle corresponding to the discontinuous moments according to a preset second time difference. And finally, for each pair of polygonal surrounding frames, determining the surrounding frame displacement between the polygonal surrounding frames according to the distance of the polygonal surrounding frames in the speed direction.
The second time difference is the time difference between the times corresponding to the two polygon bounding boxes in a pair, i.e., the length of the time interval between them, and can be set as required. For example, the second time difference may correspond to the length of the two time intervals spanning three times, or to another number of time intervals. In short, it is sufficient that the pair of polygon bounding boxes determined based on the second time difference does not correspond to two consecutive frames of environment data; that is, the two polygon bounding boxes in the pair are separated by at least one time.
For example, assuming the time interval between two adjacent times is 1 second, the second time difference may be at least 2 seconds. If a total of 6 times including the current time are determined based on the first time period, corresponding to 6 frames of environment data, the determined pairs of polygon bounding boxes may be: the pair corresponding to the first and third times, the pair corresponding to the second and fourth times, the pair corresponding to the third and fifth times, and the pair corresponding to the fourth and sixth times. There are thus four pairs of polygon bounding boxes, corresponding to four pairs of frames of environment data.
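The pairing in the example above can be reproduced with a small helper that pairs time indices separated by the second time difference; the indices and the gap of two intervals follow that example, everything else is an assumed illustration.

```python
def non_consecutive_pairs(num_times, gap):
    """Pair each time index i with index i + gap, so the two polygon bounding
    boxes in a pair are separated by at least one intermediate time (gap >= 2)."""
    if gap < 2:
        raise ValueError("gap must be >= 2 so that paired times are non-consecutive")
    return [(i, i + gap) for i in range(num_times - gap)]

# Six times within the first time period, paired two time intervals apart, reproduce
# the four pairs described above: (1st, 3rd), (2nd, 4th), (3rd, 5th), (4th, 6th).
print(non_consecutive_pairs(num_times=6, gap=2))  # [(0, 2), (1, 3), (2, 4), (3, 5)]
```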
S106: and judging whether the target barrier is a dynamic barrier or not according to the determined displacement of each enclosure frame, and if so, executing step S108.
Since the determined undetermined speed of the target obstacle may not be accurate, the target obstacle may actually be a static obstacle that was recognized as a slowly moving dynamic obstacle because of noise or similar effects. Therefore, the target obstacle can be verified again based on its determined bounding-box displacements: whether it is a dynamic obstacle is judged by verifying whether it keeps moving during the first time period.
Although the position of the target obstacle determined from the polygon bounding box is more accurate than that determined from the rectangular bounding box, so that the resulting bounding-box displacement is also more accurate, the polygon bounding box may still be somewhat unstable. The bounding-box displacement of the target obstacle between two times may therefore contain an error, and judging whether the target obstacle is dynamic from the displacement between just two times would be one-sided.
For this reason, this specification judges whether the target obstacle is a dynamic obstacle from its motion between a plurality of non-adjacent times, based on the bounding-box displacements of the polygon bounding boxes within the first time period, thereby reducing the interference caused by polygon bounding-box instability. That is, whether the target obstacle is a dynamic obstacle is verified based on its displacement over a period of time.
Therefore, in one or more embodiments of the present disclosure, after determining the displacement of each bounding box, the unmanned device may determine whether the target obstacle is a dynamic obstacle again according to the determined displacement of each bounding box.
If it is determined that the target obstacle is a dynamic obstacle within the first time period, step S108 is executed, and if not, the target obstacle is determined to be a static obstacle, and the final speed of the target obstacle at the current time is determined to be zero.
S108: and determining a target time with a time difference larger than a preset first time difference from each time, and determining the final speed of the target obstacle at the current time according to the current time and the bounding box displacement of the polygonal bounding box corresponding to the target time of the target obstacle.
In one or more embodiments of the present disclosure, after re-determining that the target obstacle is a dynamic obstacle, the unmanned device may determine, from among times, that is, historical times within a first time period, a target time having a time difference greater than a preset first time difference from a current time, and determine a final speed of the target obstacle at the current time according to the current time of the target obstacle and a bounding box displacement of a polygonal bounding box corresponding to the target time. Since the time difference between the current time and the target time is known, the unmanned device may determine, by combining the determined frame displacement of the target obstacle at the current time and the frame displacement of the polygonal frame corresponding to the target time, a final velocity of the target obstacle at the current time.
The first time difference is a time difference lower limit between two time points for calculating a final speed of the target obstacle at the current time. And, the second time difference for determining each pair of polygon bounding boxes in step S104 is smaller than the first time difference. The first time difference may be reasonably set as needed, for example, assuming that the first time interval corresponds to 10 time instants including the current time instant, and the first time difference may correspond to a length of 9 time intervals, the target time instant determined based on the first time difference is a first time instant of the 10 time instants. The current time is the last time of the 10 times. Then the bounding box of the polygon bounding box corresponding to the target obstacle at the current time and the target time is displaced, that is, the bounding box of the polygon bounding box of the target obstacle is displaced between the first time and the last time in the first period.
Therefore, after the unmanned device determines the target obstacle as the dynamic obstacle again, the final speed of the target obstacle at the current moment can be determined according to the bounding box displacement of the polygonal bounding box of the target obstacle between the head moment and the tail moment in the first period.
Alternatively, the first time difference may be another time difference which is larger than the second time difference, for example, the first time difference may correspond to a length of 7 time intervals, as shown in fig. 2.
Fig. 2 is a schematic diagram of a first time difference provided in this specification. As shown in fig. 2, the first time period includes 10 times t1, t2, t3, t4, t5, t6, t7, t8, t9 and t10, with t10 being the current time. As indicated by the bracket in fig. 2, based on the first time difference the unmanned device may select, as the target time, either of t1 and t2, whose time difference from t10 is greater than the length of 7 time intervals. In other words, besides the first time t1 of the first time period, the unmanned device may also take the second time t2 as the target time.
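To make the relationship between the first time difference, the target time and the final speed concrete, here is a minimal sketch; the 10 Hz frame interval, the displacement value and the thresholds are assumptions for the example, and the bounding-box displacement is taken as already computed in the manner of step S104.

```python
FRAME_INTERVAL = 0.1  # seconds between two consecutive times (assumed 10 Hz)

def select_target_indices(num_times, min_gap_intervals):
    """Indices of historical times whose gap from the current (last) time
    exceeds the first time difference, expressed as a number of intervals."""
    current = num_times - 1
    return [i for i in range(current) if current - i > min_gap_intervals]

def final_speed(bbox_displacement, gap_intervals):
    """Final speed = bounding-box displacement between the polygon bounding boxes
    at the target time and the current time, divided by the known time difference."""
    return bbox_displacement / (gap_intervals * FRAME_INTERVAL)

# Ten times t1..t10; a first time difference of 7 intervals leaves only t1 and t2
# (indices 0 and 1) as possible target times, as in Fig. 2.
print(select_target_indices(num_times=10, min_gap_intervals=7))    # [0, 1]
print(round(final_speed(bbox_displacement=0.72, gap_intervals=8), 3))  # 0.9 m/s for target time t2
```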
Based on the method for determining the speed of an obstacle shown in fig. 1, the target obstacles whose undetermined speed is abnormal are determined. For each target obstacle, the polygon bounding boxes corresponding to the current time and to each historical time are determined, so that the bounding-box displacements of a plurality of pairs of polygon bounding boxes corresponding to non-consecutive times can be computed. When the target obstacle is re-verified as a dynamic obstacle, a target time is determined from the historical times, and the final speed of the target obstacle at the current time is determined according to the bounding-box displacement between the polygon bounding boxes of the target obstacle at the current time and at the target time.
The method can screen out the target obstacles whose speed needs to be verified again, and determines the bounding-box displacement of a target obstacle between two non-consecutive times more accurately, based on a polygon bounding box that depicts the shape of the obstacle. Moreover, because the displacement calculated between two non-consecutive times is little affected by noise, and whether the obstacle is a dynamic obstacle is verified again on the basis of the bounding-box displacements corresponding to a plurality of non-consecutive time pairs, error interference is further reduced and the final speed of a dynamic target obstacle at the current time can be determined more accurately.
In addition, as described above, the target obstacles in this specification are selected from the low-speed obstacles. Therefore, in one or more embodiments provided by this specification, when determining the obstacles with abnormal undetermined speeds in step S100, the unmanned device may first determine, from the obstacles in the environment, the low-speed obstacles whose undetermined speed is smaller than a preset speed threshold. It may then determine each historical time before the current time according to a preset second time period, and determine the historical speed of each low-speed obstacle at each of those historical times.
Because a low-speed obstacle moves slowly, its positions at several adjacent times differ only slightly; compared with a static obstacle mistakenly identified as a low-speed obstacle, the speed of a correctly identified low-speed obstacle remains stable over a period of time.
Therefore, after determining the historical speed at each historical time within the second time period, the unmanned device may judge, for each low-speed obstacle, whether its undetermined speed and its historical speeds are all greater than a preset lower speed limit, and judge whether, within the second time period, the difference between its speed directions at two adjacent times is within a preset angle range. If both judgments are affirmative, the low-speed obstacle can be determined to be an obstacle with a normal undetermined speed, and its undetermined speed is taken as its final speed at the current time. If either judgment is negative, the low-speed obstacle can be determined to be an obstacle with an abnormal undetermined speed, i.e., a target obstacle.
Here, the historical speed is the undetermined speed of the low-speed obstacle at each historical time. The lower speed limit is the lower speed limit of a dynamic obstacle under the condition that the undetermined speed may contain errors; when the undetermined speed of the target obstacle is below this lower limit, the target obstacle may be a static obstacle misidentified as a dynamic one.
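A sketch of the check described above, under the assumption that speeds are 2-D vectors; the lower speed limit and the allowed direction change used here are illustrative values, not ones given by the specification.

```python
import math

def heading_deg(vx, vy):
    return math.degrees(math.atan2(vy, vx))

def angle_diff_deg(a, b):
    """Smallest absolute difference between two headings, in degrees."""
    d = abs(a - b) % 360.0
    return min(d, 360.0 - d)

def pending_speed_is_normal(speeds, lower_speed_limit=0.3, max_direction_change=30.0):
    """speeds: (vx, vy) at each time of the second time period, oldest first, with the
    undetermined speed at the current time last. True if the undetermined speed is normal
    (all magnitudes above the lower limit, heading changes smoothly between adjacent times)."""
    magnitudes = [math.hypot(vx, vy) for vx, vy in speeds]
    if any(m <= lower_speed_limit for m in magnitudes):
        return False  # speed dips below the dynamic-obstacle lower limit
    headings = [heading_deg(vx, vy) for vx, vy in speeds]
    for prev, curr in zip(headings, headings[1:]):
        if angle_diff_deg(prev, curr) > max_direction_change:
            return False  # heading jumps between adjacent times
    return True

# A slowly but steadily moving obstacle keeps its undetermined speed as the final speed...
print(pending_speed_is_normal([(0.5, 0.05), (0.52, 0.08), (0.5, 0.1)]))   # True
# ...while a jittery near-zero track is flagged as a target obstacle.
print(pending_speed_is_normal([(0.1, 0.0), (-0.05, 0.1), (0.02, -0.08)])) # False
```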
In one or more embodiments of the present description, the second period of time may also include the current time of day.
The second time period may be the same as or different from the first time period, and may be specifically set according to needs, and this specification is not limited herein. Each of the historical times before the current time determined based on the first period may be the same as or different from each of the historical times before the current time determined based on the second period.
In another embodiment of the present description, the historical speed may also be the final speed of each target obstacle at each historical time. When the obstacle with abnormal undetermined speed is determined, the unmanned equipment can also determine each historical time before the current time according to a preset second time period after the low-speed obstacle is determined, and determine the historical speed of each low-speed obstacle at each historical time. Then, for each low-speed obstacle, the speed magnitude variation range and the speed direction variation range of the low-speed obstacle are determined according to the historical speeds of the low-speed obstacle. And judging whether the undetermined speed of the low-speed obstacle at the current moment is within the speed size change range or not, and judging whether the direction of the undetermined speed of the low-speed obstacle at the current moment is within the speed direction change range or not. If the judgment result is yes, the low-speed obstacle can be determined to be the obstacle with normal undetermined speed, and the undetermined speed of the low-speed obstacle is used as the final speed of the current time of the low-speed obstacle. If any judgment result is negative, the low-speed obstacle can be determined to be the obstacle with abnormal undetermined speed.
In one or more embodiments of this specification, when determining, as target obstacles, the obstacles with abnormal undetermined speeds according to the undetermined speed of each obstacle at the current time and the historical speed of each obstacle at each historical time, the unmanned device may also determine, for each obstacle in the current environment, its current undetermined speed, its type, and its distance from the unmanned device. According to these, it may determine, as candidate obstacles, the low-speed obstacles that belong to a preset type and whose distance from the unmanned device is smaller than a preset distance threshold. Finally, according to the undetermined speed of each candidate obstacle at the current time and its historical speed at each historical time, the candidate obstacles with abnormal undetermined speeds are determined as the target obstacles.
The preset type may be set as desired, for example, the preset type may include an automobile. Of course, other types, such as non-motor vehicles, etc., are also included, and the description is not limited herein.
The unmanned device can take, as a low-speed obstacle, an obstacle whose undetermined speed is smaller than the speed threshold, which belongs to the preset type, and whose distance from the unmanned device is smaller than the preset distance threshold.
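To illustrate the screening of candidate obstacles, the sketch below filters by type, distance to the unmanned device, and undetermined speed; the obstacle record layout and the thresholds are assumptions made for the example.

```python
from dataclasses import dataclass

@dataclass
class Obstacle:
    obstacle_id: int
    obstacle_type: str      # e.g. "vehicle", "pedestrian", "animal"
    pending_speed: float    # undetermined speed at the current time, m/s
    distance: float         # distance to the unmanned device, m

def candidate_obstacles(obstacles, preset_types=("vehicle", "non_motor_vehicle"),
                        distance_threshold=50.0, speed_threshold=2.0):
    """Low-speed obstacles of a preset type that are close enough to the unmanned
    device; only these candidates have their undetermined speed checked for abnormality."""
    return [o for o in obstacles
            if o.obstacle_type in preset_types
            and o.distance < distance_threshold
            and o.pending_speed < speed_threshold]

obstacles = [
    Obstacle(1, "vehicle", 1.2, 20.0),    # low-speed, nearby vehicle -> candidate
    Obstacle(2, "vehicle", 9.5, 15.0),    # fast vehicle -> its speed is usually reliable
    Obstacle(3, "animal", 0.8, 10.0),     # not a preset type
    Obstacle(4, "vehicle", 0.5, 120.0),   # too far away
]
print([o.obstacle_id for o in candidate_obstacles(obstacles)])  # [1]
```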
In addition, in one or more embodiments of this specification, when the current speed direction of the target obstacle is determined in step S104, it may be determined based on the orientation of the rectangular bounding box of the target obstacle at the current time.
Specifically, the unmanned device may first project the polygon bounding boxes of the target obstacle at each historical time onto the polygon bounding box of the target obstacle at the current time. It may then determine a representative bounding box from the polygon bounding boxes at the historical times, and determine an offset vector for the target obstacle from the identification point of the representative bounding box and the identification point of the polygon bounding box at the current time. Next, the rectangular bounding box of the target obstacle at the current time is determined, together with its center line. When the included angle between the vector corresponding to the center line and the offset vector is smaller than a preset angle threshold, the unmanned device may take the direction of the vector corresponding to the center line as the current speed direction of the target obstacle.
The identification point may be a central point, or may also be a non-edge point other than the central point inside the polygon enclosure frame. The angle threshold may be set as desired, for example, the angle threshold may be 90 °. Of course, the angle threshold may be other values, for example, an angle which is smaller than 90 ° such as 60 ° and which can reasonably determine the direction of the vector corresponding to the center line.
For the projection, each polygon bounding box may be projected into the same coordinate system, for example the radar coordinate system corresponding to the current time, the device coordinate system of the unmanned device (e.g., the vehicle coordinate system of an unmanned vehicle), or the world coordinate system. Correspondingly, after projection, the rectangular bounding box of the target obstacle at the current time lies in the same coordinate system as the polygon bounding boxes.
In one or more embodiments of this specification, the polygon bounding box obtained by target detection may be three-dimensional or two-dimensional. When the polygon bounding box is three-dimensional, the polygon bounding boxes of the target obstacle at the historical times can be projected onto the ground during projection, yielding two-dimensional polygon bounding boxes in a top-down view.
When the polygon bounding box is two-dimensional, the two-dimensional, top-down polygon bounding box of the obstacle can be obtained directly through target detection.
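For the three-dimensional case, projecting a polygon bounding box to the ground for a top-down view can be as simple as dropping the height coordinate, assuming the points are already expressed in a ground-aligned frame (e.g. the vehicle or world coordinate system); this is only an illustrative simplification.

```python
def project_to_ground(polygon_3d):
    """Top-down (bird's-eye) projection of a 3-D polygon bounding box:
    keep (x, y) and drop the height coordinate z."""
    return [(x, y) for x, y, _z in polygon_3d]

polygon_3d = [(2.0, 1.0, 0.0), (6.0, 1.1, 0.0), (6.2, 3.0, 1.5), (2.1, 2.9, 1.5)]
print(project_to_ground(polygon_3d))  # [(2.0, 1.0), (6.0, 1.1), (6.2, 3.0), (2.1, 2.9)]
```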
Since the position and speed of a low-speed obstacle usually change little within the first time period, the included angle between the current speed direction of the target obstacle and the offset vector determined from the polygon bounding box at any one of the historical times and the polygon bounding box at the current time generally stays within the angle threshold. The representative bounding box can therefore be selected at random from the polygon bounding boxes of the target obstacle at the historical times.
Alternatively, the target obstacle may be turning during the first time period, in which case the angle between the determined offset vector and the current speed direction might not be smaller than the angle threshold. To avoid this, and because the offset vector between the polygon bounding boxes at the current time and at the time immediately before it is closest in direction to the current speed direction, the polygon bounding box of the target obstacle at the time immediately before the current time can be selected as the representative bounding box, making it easier to determine the speed direction from the included angle between the offset vector and the center line.
Fig. 3 is a schematic diagram of the offset vector of a target obstacle provided in this specification. In fig. 3, the rectangle represents the rectangular bounding box of the target obstacle at the current time. Of the two similarly shaped closed figures other than the rectangle, the solid-line figure represents the polygon bounding box of the target obstacle at the current time, and the dashed-line figure represents the representative bounding box of the target obstacle. Of the two circles in the figure, the circle pointed to by the arrow indicates the identification point of the polygon bounding box at the current time, and the other circle indicates the identification point of the representative bounding box; both circles are the center points of the respective polygon bounding boxes. The arrow is the offset vector.
Figs. 4 and 5 are schematic diagrams illustrating the determination of the speed direction of a target obstacle provided by this specification. The rectangles in figs. 4 and 5 again represent the rectangular bounding box of the target obstacle, the dashed arrows represent the vectors corresponding to the center line of the rectangular bounding box, and the solid arrows represent the extended offset vector. Assume the preset angle threshold is 90°. The direction of the center-line vector is chosen as in fig. 4, where its angle ∠a with the offset vector is smaller than 90°, rather than as in fig. 5, where the angle ∠b between the center-line vector and the offset vector is greater than 90°. The direction of the center-line vector of the rectangular bounding box shown in fig. 4 is then the current speed direction of the target obstacle.
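A geometric sketch of this disambiguation step: the rectangular bounding box's center line gives an axis, and the offset vector between the identification points (here, polygon centroids) of the representative bounding box and the current bounding box picks which of the two opposite directions along that axis is the current speed direction. The centroid as identification point and the 90° threshold follow the examples above; the vector arithmetic itself is an illustration, not the claimed implementation.

```python
import math

def centroid(polygon):
    """Identification point used here: the mean of the polygon's vertices (x, y)."""
    xs, ys = zip(*polygon)
    return (sum(xs) / len(xs), sum(ys) / len(ys))

def speed_direction(representative_polygon, current_polygon, centerline_vector,
                    angle_threshold_deg=90.0):
    """Return a unit vector along the rectangular box's center line, oriented so that
    its angle with the offset vector is below the threshold (the Fig. 4 case)."""
    rx, ry = centroid(representative_polygon)
    cx, cy = centroid(current_polygon)
    offset = (cx - rx, cy - ry)                   # representative -> current
    lx, ly = centerline_vector
    norm = math.hypot(lx, ly)
    unit = (lx / norm, ly / norm)
    cos_angle = (unit[0] * offset[0] + unit[1] * offset[1]) / math.hypot(*offset)
    angle = math.degrees(math.acos(max(-1.0, min(1.0, cos_angle))))
    # If the angle exceeds the threshold, the opposite center-line direction is taken.
    return unit if angle < angle_threshold_deg else (-unit[0], -unit[1])

previous_box = [(0.0, 0.0), (4.0, 0.0), (4.2, 1.8), (0.1, 2.0)]
current_box = [(0.6, 0.1), (4.6, 0.1), (4.8, 1.9), (0.7, 2.1)]
print(speed_direction(previous_box, current_box, centerline_vector=(-1.0, 0.0)))
# -> (1.0, -0.0): the center-line direction is flipped to agree with the offset vector
```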
In one or more embodiments of this specification, when determining, for each pair of polygon bounding boxes, the bounding-box displacement between them according to their distance in the current speed direction of the target obstacle, the unmanned device may first determine, according to the time order, the start bounding box and the end bounding box of the pair. From the intersection points of the center line with the start bounding box and with the end bounding box, it may determine the head edge point and the tail edge point of the start bounding box and the head edge point and the tail edge point of the end bounding box. It may then determine the bounding-box displacement of the pair in the speed direction according to at least one of the distance between the head edge points of the start and end bounding boxes and the distance between their tail edge points.
When the bounding-box displacement in the speed direction is determined from both the distance between the head edge points and the distance between the tail edge points of the start and end bounding boxes, the unmanned device may take the average of these two distances as the bounding-box displacement of the pair in the speed direction.
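A sketch of that displacement computation, assuming the head and tail edge points of the start and end bounding boxes have already been found from the center-line intersections as described above; projecting the point differences onto the unit speed direction is an illustrative way of measuring "distance in the speed direction".

```python
def project_onto(direction, vector):
    """Signed length of `vector` along the unit vector `direction`."""
    return direction[0] * vector[0] + direction[1] * vector[1]

def bounding_box_displacement(start_head, start_tail, end_head, end_tail, speed_dir):
    """Average of the head-point distance and the tail-point distance,
    both measured along the current speed direction of the target obstacle."""
    head_dist = project_onto(speed_dir, (end_head[0] - start_head[0],
                                         end_head[1] - start_head[1]))
    tail_dist = project_onto(speed_dir, (end_tail[0] - start_tail[0],
                                         end_tail[1] - start_tail[1]))
    return (head_dist + tail_dist) / 2.0

# Head/tail edge points of the start and end polygon bounding boxes (already extracted).
displacement = bounding_box_displacement(start_head=(4.2, 1.0), start_tail=(0.1, 1.0),
                                         end_head=(4.9, 1.0), end_tail=(0.9, 1.0),
                                         speed_dir=(1.0, 0.0))
print(round(displacement, 3))  # (0.7 + 0.8) / 2 = 0.75 m along the speed direction
```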
Fig. 6 is a schematic diagram of head edge points and tail edge points provided in this specification. In the figure, the solid-line closed figure represents the start bounding box, the dashed-line closed figure represents the end bounding box, and the arrow indicates the speed direction of the target obstacle. The circles that the arrow passes through are, in order along the direction opposite to the speed direction, the head edge point of the start bounding box, the head edge point of the end bounding box, the tail edge point of the start bounding box, and the tail edge point of the end bounding box.
It should be noted that determining the leading edge point and the trailing edge point of the start bounding box and the leading edge point and the trailing edge point of the end bounding box based on the center line of the rectangular bounding box are only examples. But may also be determined in other ways. For example, the drone may also determine other straight lines that pass through the start and end bounding boxes before being parallel to the centerline.
Specifically, for example, the unmanned device may determine a first intersection line that is perpendicular to the centerline and passes through the start bounding box, and a second intersection line that is perpendicular to the centerline and passes through the end bounding box. Among the edge points of the start bounding box, the edge points farthest from the first intersection line toward the head and toward the tail are taken as the pending head point and the pending tail point of the start bounding box, respectively. Likewise, among the edge points of the end bounding box, the edge points farthest from the second intersection line toward the head and toward the tail are taken as the pending head point and the pending tail point of the end bounding box, respectively. One of the pending head points is then selected as the head edge point of the corresponding polygonal bounding box, or one of the pending tail points is selected as the tail edge point of the corresponding polygonal bounding box; the selected point is recorded as the representative point. The unmanned device may then determine a straight line that is parallel to the centerline and passes through the representative point as the representative line, and determine the remaining edge points from the intersection points of the representative line with the start bounding box and with the end bounding box.
If the representative point is the head edge point of the start bounding box, the tail edge point of the start bounding box and the head edge point and tail edge point of the end bounding box can be determined from the intersection points of the representative line with the start bounding box and with the end bounding box.
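As an illustrative approximation of the edge-point selection (not the specification's exact construction; names are hypothetical), picking the polygon vertex farthest from a line perpendicular to the centerline is equivalent to taking the vertices with the extreme projections onto the speed direction:

import numpy as np

def head_tail_edge_points(polygon_vertices, speed_dir):
    """Treat the vertex with the largest projection onto the speed direction as
    the head edge point and the one with the smallest projection as the tail
    edge point of a polygonal bounding box."""
    verts = np.asarray(polygon_vertices, dtype=float)
    d = np.asarray(speed_dir, dtype=float)
    d = d / (np.linalg.norm(d) + 1e-9)
    proj = verts @ d
    return verts[np.argmax(proj)], verts[np.argmin(proj)]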
In addition, in one or more embodiments of the present specification, in step S106, when judging whether the target obstacle is a dynamic obstacle according to the determined bounding box displacements, the unmanned device may judge, for each bounding box displacement, whether it is greater than a preset displacement threshold. When a bounding box displacement is greater than the displacement threshold, it is determined to be an accurate displacement. When the number of accurate displacements among all bounding box displacements is greater than a preset displacement count, the target obstacle is determined to be a dynamic obstacle within the first time period.
Alternatively, in one or more embodiments of the present specification, the unmanned device may judge, for each bounding box displacement, whether it is greater than the preset displacement threshold, and determine a displacement greater than the threshold to be an accurate displacement. When all bounding box displacements are accurate displacements, the target obstacle is determined to be a dynamic obstacle within the first time period.
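A minimal sketch of the two judgment variants above (count-based and all-accurate); this is an assumption-laden illustration, not the specification's implementation, and the parameter names are hypothetical:

def is_dynamic_obstacle(box_displacements, displacement_threshold, min_accurate_count=None):
    """Count displacements above the threshold ("accurate displacements").
    If min_accurate_count is None, require all displacements to be accurate;
    otherwise require the accurate count to exceed min_accurate_count."""
    accurate = sum(1 for d in box_displacements if d > displacement_threshold)
    if min_accurate_count is None:
        return accurate == len(box_displacements)
    return accurate > min_accurate_count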
In one or more embodiments of the present specification, when the bounding box displacement of each pair of polygonal bounding boxes is determined from only one of the two distances, that is, either the distance between the head edge points or the distance between the tail edge points of the start and end bounding boxes, the unmanned device may further verify the judgment of whether the target obstacle is a dynamic obstacle using the other distance, i.e., the one not used to determine the bounding box displacement.
The following description takes as an example the case where, for each pair of polygonal bounding boxes, the bounding box displacement is determined from the distance between the head edge points of the start and end bounding boxes. The unmanned device may judge, for each bounding box displacement, whether it is greater than the preset displacement threshold, and determine a displacement greater than the threshold to be an accurate displacement.
When the target obstacle is determined to be moving forward according to the time sequence and the distances between the tail edge points of the start and end bounding boxes corresponding to each of the bounding box displacements, the unmanned device may determine that the target obstacle is a dynamic obstacle within the first time period; otherwise, it is determined to be a static obstacle.
Alternatively, the unmanned device may judge, for each bounding box displacement, whether it is greater than the preset displacement threshold, and determine a displacement greater than the threshold to be an accurate displacement. For the pair of polygonal bounding boxes corresponding to each bounding box displacement, it may also judge whether the distance between the tail edge points of the start and end bounding boxes is smaller than the displacement threshold, and if so, mark the pair as a pending bounding box pair. When the number of accurate displacements among all bounding box displacements is greater than the preset displacement count, and the number of pending bounding box pairs among all pairs is not greater than a preset pending count, the target obstacle is determined to be a dynamic obstacle within the first time period; otherwise, it is determined to be a static obstacle.
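The cross-check in this alternative can be sketched as follows (illustrative only; the displacement is assumed to come from the head-edge distances, the tail-edge distances serve only for verification, and all names and thresholds are hypothetical):

def is_dynamic_with_cross_check(head_dists, tail_dists, displacement_threshold,
                                min_accurate_count, max_pending_pairs):
    """Displacements are taken from head-edge distances; a tail-edge distance
    below the threshold marks its pair as "pending". The obstacle is judged
    dynamic when enough head-edge displacements are accurate and not too many
    pairs are pending."""
    accurate = sum(1 for d in head_dists if d > displacement_threshold)
    pending = sum(1 for d in tail_dists if d < displacement_threshold)
    return accurate > min_accurate_count and pending <= max_pending_pairs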
In the present specification, the bounding box displacement in the speed direction between a pair of polygonal bounding boxes is determined from at least one of the distance between the head edge points and the distance between the tail edge points of the start and end bounding boxes. When both distances are used, combining the two kinds of edge-point distances balances the errors of the bounding boxes. When only one distance is used to determine the displacement and the other distance is used only for verification, the interference caused by unstable bounding boxes of the same obstacle, for example, large changes in bounding box size at different moments due to occlusion, can be reduced, so that the verification of whether the obstacle is a dynamic obstacle is more accurate.
In addition, in one or more embodiments of the present specification, the undetermined speed of an obstacle at the current moment may also be left undetermined, and the unmanned device may determine each low-speed obstacle according to the historical speed of each obstacle. For each low-speed obstacle, a polygonal bounding box depicting the shape of the low-speed obstacle is determined, according to a preset first time period, for each moment including the current moment. Then, according to the determined polygonal bounding boxes at the respective moments, the bounding box displacements of a plurality of pairs of polygonal bounding boxes corresponding to non-consecutive moments are determined. Next, a target moment whose time difference from the current moment is greater than a preset first time difference is determined from the moments, and the final speed of the obstacle at the current moment is determined according to the bounding box displacement of the polygonal bounding boxes corresponding to the current moment and the target moment.
Further, since the historical speed of each obstacle needs to be available at the current moment, the correspondence between each obstacle at the current moment and each obstacle determined at the historical moments needs to be established. Therefore, in one or more embodiments of the present specification, the correspondence between each target obstacle at the current moment and each obstacle at each historical moment may be determined by matching. For example, for each obstacle, matching may be performed based on one or a combination of the Intersection over Union (IoU), the area difference, the distance difference, and the like between the rectangular bounding box and/or polygonal bounding box of the obstacle at the current moment and those of each obstacle at each historical moment, so as to determine, for each historical moment, at most one obstacle most similar to the obstacle as its match.
Based on the matching result, each obstacle can be tracked at the current moment. The data of each successfully tracked obstacle at each moment can be recorded in its corresponding tracking data, and the currently obtained data of each obstacle, such as the undetermined speed, the position, the final speed, and the position and area of the polygonal bounding box, can also be updated into the corresponding tracking data.
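The matching step can be illustrated with a simple greedy IoU association (a sketch under assumptions, not the specification's method: axis-aligned boxes are used for brevity, the area and distance differences are omitted, and all names and thresholds are hypothetical):

def iou_axis_aligned(box_a, box_b):
    """IoU of two axis-aligned boxes given as (x_min, y_min, x_max, y_max)."""
    ax0, ay0, ax1, ay1 = box_a
    bx0, by0, bx1, by1 = box_b
    ix = max(0.0, min(ax1, bx1) - max(ax0, bx0))
    iy = max(0.0, min(ay1, by1) - max(ay0, by0))
    inter = ix * iy
    union = (ax1 - ax0) * (ay1 - ay0) + (bx1 - bx0) * (by1 - by0) - inter
    return inter / union if union > 0 else 0.0

def match_obstacles(current_boxes, history_boxes, iou_threshold=0.3):
    """Greedily match each current obstacle (id -> box) to at most one
    historical obstacle with the highest IoU above the threshold."""
    matches, used = {}, set()
    for cur_id, cur_box in current_boxes.items():
        best_id, best_iou = None, iou_threshold
        for hist_id, hist_box in history_boxes.items():
            if hist_id in used:
                continue
            iou = iou_axis_aligned(cur_box, hist_box)
            if iou > best_iou:
                best_id, best_iou = hist_id, iou
        if best_id is not None:
            matches[cur_id] = best_id
            used.add(best_id)
    return matches

A matched pair's tracking record would then be extended with the current moment's bounding box, speed, and position data, while unmatched current obstacles start new tracks.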
Based on the same idea, the present specification also provides a corresponding device for determining the speed of an obstacle, as shown in fig. 7.
Fig. 7 is a schematic diagram of an apparatus for determining a speed of an obstacle provided by the present specification, the apparatus including:
the target determining module 200 is configured to determine an obstacle with an abnormal undetermined speed as a target obstacle according to the undetermined speed of each obstacle at the current time and the historical speed of each obstacle at each historical time;
an enclosing frame determining module 201, configured to determine, according to a preset first time period, a polygonal enclosing frame corresponding to each time including the current time and used for depicting the shape of the target obstacle, for each target obstacle;
a displacement determining module 202, configured to determine, according to the determined polygon enclosure frames corresponding to the respective moments, enclosure frame displacements of a plurality of pairs of polygon enclosure frames corresponding to the non-continuous moments;
the judging module 203 is configured to judge whether the target obstacle is a dynamic obstacle according to the determined displacement of each bounding box;
and a speed determining module 204, configured to determine, if yes, a target time at which a time difference with the current time is greater than a preset first time difference from each time, and determine a final speed of the target obstacle at the current time according to the current time of the target obstacle and a bounding box displacement of a polygonal bounding box corresponding to the target time.
Optionally, the target determining module 200 is further configured to determine, from among the obstacles, a low-speed obstacle whose undetermined speed is smaller than a preset speed threshold; determine each historical moment before the current moment according to a preset second time period, and determine the historical speed of each low-speed obstacle at each historical moment; for each low-speed obstacle, judge whether the undetermined speed and each historical speed of the low-speed obstacle are both greater than a preset lower speed limit, and judge whether the difference in speed direction of the low-speed obstacle between every two adjacent moments within the second time period is within a preset angle range; if both judgment results are yes, determine that the low-speed obstacle is an obstacle with a normal undetermined speed and take the undetermined speed of the low-speed obstacle as the final speed of the low-speed obstacle at the current moment; and if either judgment result is no, determine that the low-speed obstacle is an obstacle with an abnormal undetermined speed.
Optionally, the displacement determining module 202 is further configured to determine the current speed direction of the target obstacle, and determine, according to a preset second time difference, a plurality of pairs of polygonal bounding boxes of the target obstacle corresponding to non-consecutive moments, the second time difference being smaller than the first time difference; and determine, for each pair of polygonal bounding boxes, the bounding box displacement between the pair according to the distance of the pair of polygonal bounding boxes in the speed direction.
Optionally, the displacement determining module 202 is further configured to project the polygonal bounding box of the target obstacle corresponding to each historical moment onto the polygonal bounding box of the target obstacle corresponding to the current moment, determine a representative bounding box from the polygonal bounding boxes at the historical moments, determine the offset vector corresponding to the target obstacle according to the identification point of the representative bounding box and the identification point of the polygonal bounding box at the current moment, determine the rectangular bounding box corresponding to the target obstacle at the current moment and the centerline of the rectangular bounding box, and determine the direction of the vector corresponding to the centerline as the current speed direction of the target obstacle when the included angle between the vector corresponding to the centerline and the offset vector is smaller than a preset angle threshold.
Optionally, the displacement determining module 202 is further configured to determine a start bounding box and an end bounding box within the pair of polygonal bounding boxes according to the time sequence, determine the head edge point and the tail edge point of the start bounding box and the head edge point and the tail edge point of the end bounding box according to the intersection points of the centerline with the start bounding box and the end bounding box, and determine the bounding box displacement between the pair of polygonal bounding boxes in the speed direction according to at least one of the distance between the head edge points of the start and end bounding boxes and the distance between the tail edge points of the start and end bounding boxes.
Optionally, the determining module 203 is further configured to determine, for each bounding box displacement, whether the bounding box displacement is greater than a preset displacement threshold, determine that the bounding box displacement is an accurate displacement when it is determined that the bounding box displacement is greater than the displacement threshold, and determine that the target obstacle is a dynamic obstacle in the first time period when the number of accurate displacements in all the bounding box displacements is greater than a preset displacement number.
Optionally, the target determining module 200 is further configured to determine, for each obstacle in the current environment, a current undetermined speed, a current type, and a distance between the obstacle and the unmanned device, determine, according to the current undetermined speed, the current type, and the distance between the obstacle and the unmanned device, a low-speed obstacle that belongs to a preset type and has a distance between the obstacle and the unmanned device that is less than a preset distance threshold as a candidate obstacle, and determine, according to the undetermined speed of each candidate obstacle at the current time and a historical speed of each candidate obstacle at each historical time, a candidate obstacle with an abnormal undetermined speed as a target obstacle.
The present specification also provides a computer-readable storage medium storing a computer program which can be used to perform the method for determining the speed of an obstacle provided above in fig. 1.
The present specification also provides a schematic structural diagram of the electronic device shown in fig. 8. As shown in fig. 8, at the hardware level, the electronic device includes a processor, an internal bus, a memory, and a non-volatile memory, and may of course also include hardware required by other services. The processor reads the corresponding computer program from the non-volatile memory into the memory and then runs it to implement the method for determining the speed of an obstacle provided above in fig. 1.
Of course, besides the software implementation, the present specification does not exclude other implementations, such as logic devices or a combination of software and hardware; that is, the execution subject of the foregoing processing flow is not limited to logic units, and may also be hardware or logic devices.
In the 1990s, an improvement in a technology could be clearly distinguished as either an improvement in hardware (for example, an improvement in a circuit structure such as a diode, a transistor, or a switch) or an improvement in software (an improvement in a method flow). However, as technology develops, many of today's improvements to method flows can be regarded as direct improvements to hardware circuit structures. Designers almost always obtain a corresponding hardware circuit structure by programming an improved method flow into a hardware circuit. Therefore, it cannot be said that an improvement of a method flow cannot be realized by a hardware entity module. For example, a Programmable Logic Device (PLD), such as a Field Programmable Gate Array (FPGA), is an integrated circuit whose logic functions are determined by the user's programming of the device. A designer "integrates" a digital system onto a single PLD by programming it, without asking a chip manufacturer to design and fabricate a dedicated integrated circuit chip. Moreover, nowadays, instead of manually fabricating integrated circuit chips, such programming is mostly implemented with "logic compiler" software, which is similar to the software compiler used in program development; the original code to be compiled is written in a specific programming language called a Hardware Description Language (HDL). There is not just one HDL but many, such as ABEL (Advanced Boolean Expression Language), AHDL (Altera Hardware Description Language), Confluence, CUPL (Cornell University Programming Language), HDCal, JHDL (Java Hardware Description Language), Lava, Lola, MyHDL, PALASM, and RHDL (Ruby Hardware Description Language), among which VHDL (Very-High-Speed Integrated Circuit Hardware Description Language) and Verilog are currently the most commonly used. It will also be apparent to those skilled in the art that a hardware circuit implementing the logical method flow can be readily obtained simply by slightly logically programming the method flow in one of the above hardware description languages and programming it into an integrated circuit.
The controller may be implemented in any suitable manner. For example, the controller may take the form of a microprocessor or processor together with a computer-readable medium storing computer-readable program code (for example, software or firmware) executable by the (micro)processor, logic gates, switches, an Application Specific Integrated Circuit (ASIC), a programmable logic controller, or an embedded microcontroller. Examples of controllers include, but are not limited to, the following microcontrollers: ARC 625D, Atmel AT91SAM, Microchip PIC18F26K20, and Silicon Labs C8051F320. A memory controller may also be implemented as part of the control logic of a memory. Those skilled in the art also know that, in addition to implementing the controller purely in computer-readable program code, the method steps can be logically programmed so that the controller realizes the same functions in the form of logic gates, switches, application-specific integrated circuits, programmable logic controllers, embedded microcontrollers, and the like. Such a controller may therefore be regarded as a hardware component, and the means included in it for realizing various functions may also be regarded as structures within the hardware component. Or even the means for realizing various functions may be regarded both as software modules for implementing the method and as structures within the hardware component.
The systems, apparatuses, modules or units described in the above embodiments may be specifically implemented by a computer chip or an entity, or implemented by a product with certain functions. One typical implementation device is a computer. In particular, the computer may be, for example, a personal computer, a laptop computer, a cellular telephone, a camera phone, a smartphone, a personal digital assistant, a media player, a navigation device, an email device, a game console, a tablet computer, a wearable device, or a combination of any of these devices.
For convenience of description, the above devices are described as being divided into various units by function, and are described separately. Of course, the functions of the various elements may be implemented in the same one or more software and/or hardware implementations of the present description.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include a volatile memory, a random access memory (RAM), and/or a non-volatile memory among computer-readable media, such as a read-only memory (ROM) or a flash memory (flash RAM). The memory is an example of a computer-readable medium.
Computer-readable media, including both permanent and non-permanent, removable and non-removable media, may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, modules of a program, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape or magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media such as modulated data signals and carrier waves.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such a process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a/an ..." does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
As will be appreciated by one skilled in the art, embodiments of the present description may be provided as a method, system, or computer program product. Accordingly, the description may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the description may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
This description may be described in the general context of computer-executable instructions, such as program modules, being executed by a computer. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. The specification may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in both local and remote computer storage media including memory storage devices.
The embodiments in the present specification are described in a progressive manner, and the same and similar parts among the embodiments are referred to each other, and each embodiment focuses on the differences from the other embodiments. In particular, for the system embodiment, since it is substantially similar to the method embodiment, the description is simple, and for the relevant points, reference may be made to the partial description of the method embodiment.
The above description is only an example of the present disclosure, and is not intended to limit the present disclosure. Various modifications and alterations to this description will become apparent to those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present specification should be included in the scope of the claims of the present specification.

Claims (10)

1. A method of determining a velocity of an obstacle, comprising:
determining an obstacle with abnormal undetermined speed as a target obstacle according to the undetermined speed of each obstacle at the current moment and the historical speed of each obstacle at each historical moment;
for each target obstacle, determining a polygonal surrounding frame which is respectively corresponding to each moment including the current moment and is used for describing the shape of the target obstacle according to a preset first time period;
determining the bounding box displacement of a plurality of pairs of polygon bounding boxes corresponding to discontinuous moments according to the determined polygon bounding boxes corresponding to the moments respectively;
judging whether the target obstacle is a dynamic obstacle or not according to the determined displacement of each bounding box;
if so, determining a target time with a time difference larger than a preset first time difference from each time, and determining the final speed of the target obstacle at the current time according to the current time of the target obstacle and the bounding box displacement of the polygonal bounding box corresponding to the target time.
2. The method according to claim 1, wherein determining an obstacle with an abnormal undetermined speed as the target obstacle according to the undetermined speed of each obstacle at the current moment and the historical speed of each obstacle at each historical moment specifically comprises the following steps:
determining a low-speed obstacle with an undetermined speed smaller than a preset speed threshold value from all obstacles;
determining each historical time before the current time according to a preset second time period, and determining the historical speed of each low-speed obstacle at each historical time;
for each low-speed obstacle, judging whether the undetermined speed of the low-speed obstacle and the historical speed of the low-speed obstacle are both larger than a preset lower speed limit, and judging whether the difference of the speed directions of the low-speed obstacle between two adjacent moments is within a preset angle range in the second time interval;
if the judgment results are yes, determining that the low-speed obstacle is an obstacle with normal undetermined speed, and taking the undetermined speed of the low-speed obstacle as the final speed of the current time of the low-speed obstacle;
and if any judgment result is negative, determining that the low-speed obstacle is an obstacle with abnormal undetermined speed.
3. The method according to claim 1, wherein determining bounding box displacements of a plurality of pairs of polygon bounding boxes corresponding to non-consecutive moments according to the determined polygon bounding boxes corresponding to respective moments comprises:
determining the current speed direction of the target obstacle;
determining a plurality of pairs of polygonal surrounding frames of the target obstacle corresponding to the discontinuous moments according to a preset second time difference; the second time difference is less than the first time difference;
and determining bounding box displacement between each pair of polygonal bounding boxes according to the distance of the pair of polygonal bounding boxes in the speed direction.
4. The method of claim 3, wherein determining the current velocity direction of the target obstacle comprises:
projecting the polygon bounding box of the target obstacle corresponding to each historical moment onto the polygon bounding box of the target obstacle corresponding to the current moment;
determining a representative bounding box from the polygon bounding boxes at each historical moment;
determining an offset vector corresponding to the target barrier according to the identification point of the representative enclosure frame and the identification point of the polygon enclosure frame at the current moment;
determining a rectangular surrounding frame corresponding to the target obstacle at the current moment, and determining a central line of the rectangular surrounding frame;
and determining the direction of the vector corresponding to the central line as the current speed direction of the target obstacle under the condition that the included angle between the vector corresponding to the central line and the offset vector is smaller than a preset angle threshold.
5. The method of claim 4, wherein determining bounding box displacement between the pair of polygon bounding boxes based on the distance of the pair of polygon bounding boxes in the velocity direction comprises:
determining a starting bounding box and an end bounding box in the pair of polygon bounding boxes according to the time sequence;
determining a head edge point and a tail edge point of the initial enclosing frame and determining a head edge point and a tail edge point of the terminal enclosing frame according to intersection points of the central line, the initial enclosing frame and the terminal enclosing frame;
and determining the displacement of the bounding box between the pair of polygonal bounding boxes in the speed direction according to at least one of the distance between the head edge points of the starting bounding box and the end-point bounding box and the distance between the tail edge points of the starting bounding box and the end-point bounding box.
6. The method according to claim 1, wherein determining whether the target obstacle is a dynamic obstacle according to the determined displacement of each bounding box specifically comprises:
judging whether the displacement of each bounding box is larger than a preset displacement threshold value or not;
when the bounding box displacement is determined to be larger than the displacement threshold value, determining that the bounding box displacement is accurate displacement;
and when the number of accurate displacements in all the displacements of the surrounding frame is greater than the preset displacement times, determining that the target obstacle is a dynamic obstacle in the first time period.
7. The method according to claim 1, wherein the step of determining the obstacle with abnormal undetermined speed according to the undetermined speed of each obstacle at the current time and the historical speed of each obstacle at each historical time as the target obstacle specifically comprises:
aiming at each obstacle in the current environment, determining the current undetermined speed and type of the obstacle and the distance between the obstacle and the unmanned equipment;
determining a low-speed obstacle which belongs to a preset type and has a distance with the unmanned equipment smaller than a preset distance threshold value according to the current undetermined speed and type of each obstacle and the distance with the unmanned equipment, and taking the low-speed obstacle as a candidate obstacle;
and determining candidate obstacles with abnormal undetermined speed as target obstacles according to the undetermined speed of each candidate obstacle at the current moment and the historical speed of each candidate obstacle at each historical moment.
8. An apparatus for determining a velocity of an obstacle, comprising:
the target determining module is used for determining the obstacle with abnormal undetermined speed as the target obstacle according to the undetermined speed of each obstacle at the current moment and the historical speed of each obstacle at each historical moment;
the bounding box determining module is used for determining a polygonal bounding box which is corresponding to each moment including the current moment and is used for depicting the shape of the target obstacle according to a preset first time interval for each target obstacle;
the displacement determining module is used for determining the displacement of the surrounding frames of the polygonal surrounding frames corresponding to a plurality of pairs of discontinuous moments according to the determined polygonal surrounding frames corresponding to the moments respectively;
the judging module is used for judging whether the target obstacle is a dynamic obstacle or not according to the determined displacement of each bounding box;
and the speed determining module is used for, if so, determining a target time with a time difference from the current time larger than a preset first time difference from each time, and determining the final speed of the target obstacle at the current time according to the current time of the target obstacle and the bounding box displacement of the polygonal bounding box corresponding to the target time.
9. A computer-readable storage medium, characterized in that the storage medium stores a computer program which, when executed by a processor, implements the method of any of the preceding claims 1 to 7.
10. An electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor implements the method of any of claims 1 to 7 when executing the program.
CN202210115877.XA 2022-02-07 2022-02-07 Method and device for determining barrier speed Withdrawn CN114623824A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210115877.XA CN114623824A (en) 2022-02-07 2022-02-07 Method and device for determining barrier speed

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210115877.XA CN114623824A (en) 2022-02-07 2022-02-07 Method and device for determining barrier speed

Publications (1)

Publication Number Publication Date
CN114623824A true CN114623824A (en) 2022-06-14

Family

ID=81898246

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210115877.XA Withdrawn CN114623824A (en) 2022-02-07 2022-02-07 Method and device for determining barrier speed

Country Status (1)

Country Link
CN (1) CN114623824A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116540252A (en) * 2023-07-06 2023-08-04 上海云骥跃动智能科技发展有限公司 Laser radar-based speed determination method, device, equipment and storage medium
CN116540252B (en) * 2023-07-06 2023-09-26 上海云骥跃动智能科技发展有限公司 Laser radar-based speed determination method, device, equipment and storage medium

Similar Documents

Publication Publication Date Title
CN112001456B (en) Vehicle positioning method and device, storage medium and electronic equipment
CN112799411B (en) Control method and device of unmanned equipment
CN112987760B (en) Trajectory planning method and device, storage medium and electronic equipment
US11789141B2 (en) Omnidirectional sensor fusion system and method and vehicle including the same
CN111508258A (en) Positioning method and device
CN111062372B (en) Method and device for predicting obstacle track
CN111126362A (en) Method and device for predicting obstacle track
CN112327864A (en) Control method and control device of unmanned equipment
CN110660103A (en) Unmanned vehicle positioning method and device
CN113074748B (en) Path planning method and device for unmanned equipment
CN113968243B (en) Obstacle track prediction method, device, equipment and storage medium
CN111127551A (en) Target detection method and device
CN114623824A (en) Method and device for determining barrier speed
CN112818968A (en) Target object classification method and device
US20220319189A1 (en) Obstacle tracking method, storage medium, and electronic device
US20220314980A1 (en) Obstacle tracking method, storage medium and unmanned driving device
CN113340311B (en) Path planning method and device for unmanned equipment
CN112987762B (en) Trajectory planning method and device, storage medium and electronic equipment
CN114549579A (en) Target tracking method and device
CN114332201A (en) Model training and target detection method and device
CN112712561A (en) Picture construction method and device, storage medium and electronic equipment
CN111798489A (en) Feature point tracking method, device, medium and unmanned device
CN111104908A (en) Road edge determination method and device
CN112631312B (en) Unmanned equipment control method and device, storage medium and electronic equipment
CN114460940A (en) Unmanned equipment control method and device, storage medium and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WW01 Invention patent application withdrawn after publication

Application publication date: 20220614