CN114590249A - Unmanned equipment control method, device, equipment and storage medium - Google Patents

Unmanned equipment control method, device, equipment and storage medium

Info

Publication number
CN114590249A
Authority
CN
China
Prior art keywords
obstacle
unmanned equipment
unmanned
speed
equipment
Prior art date
Legal status
Pending
Application number
CN202210210420.7A
Other languages
Chinese (zh)
Inventor
周小红
姜訢
周末
Current Assignee
Beijing Sankuai Online Technology Co Ltd
Original Assignee
Beijing Sankuai Online Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Sankuai Online Technology Co Ltd filed Critical Beijing Sankuai Online Technology Co Ltd
Priority to CN202210210420.7A priority Critical patent/CN114590249A/en
Publication of CN114590249A publication Critical patent/CN114590249A/en
Pending legal-status Critical Current

Classifications

    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60W - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00 - Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units, or advanced driver assistance systems for ensuring comfort, stability and safety or drive control systems for propelling or retarding the vehicle
    • B60W30/08 - Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
    • B60W30/09 - Taking automatic action to avoid collision, e.g. braking and steering
    • B60W60/00 - Drive control systems specially adapted for autonomous road vehicles
    • B60W60/001 - Planning or execution of driving tasks
    • B60W60/0011 - Planning or execution of driving tasks involving control alternatives for a single driving scenario, e.g. planning several paths to avoid obstacles
    • B60W60/0015 - Planning or execution of driving tasks specially adapted for safety
    • B60W60/0016 - Planning or execution of driving tasks specially adapted for safety of the vehicle or its occupants
    • B60W2420/408
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 - Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88 - Radar or analogous systems specially adapted for specific applications
    • G01S13/93 - Radar or analogous systems specially adapted for specific applications for anti-collision purposes
    • G01S13/931 - Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles

Abstract

This specification discloses a method, apparatus, device, and storage medium for controlling an unmanned device. The current speed of the unmanned device is monitored; when it is less than a predetermined speed threshold, an ultrasonic radar is enabled to acquire state information of an obstacle. The risk of a collision between the unmanned device and the obstacle is then judged from the state information of the unmanned device and the state information of the obstacle, and the unmanned device is controlled according to the determined collision risk. Enabling the ultrasonic radar below the speed threshold improves the accuracy of obstacle detection during low-speed driving, avoids collisions caused by obstacles the unmanned device could not otherwise detect accurately, and improves the safety of the unmanned device. Moreover, the AEB system becomes usable in the low-speed state of the unmanned device, expanding the AEB system's application scenarios.

Description

Unmanned equipment control method, device, equipment and storage medium
Technical Field
The present disclosure relates to the field of unmanned driving technologies, and in particular, to an unmanned device control method, apparatus, device, and storage medium.
Background
Unmanned devices are now widely used in fields ranging from national defense to the civil economy, and as technology advances they continue to develop, bringing more convenience to daily life. An unmanned device can perceive the road environment, automatically plan a route accordingly, and reach a predetermined target. To ensure driving safety, an Automatic Emergency Braking (AEB) system is configured on the unmanned device.
In the prior art, an AEB system judges the risk of collision between the unmanned device and an obstacle using obstacle information collected by a millimeter-wave radar and a vision sensor, and, once a collision risk is detected, controls the unmanned device to take braking measures to avoid the collision.
However, while the millimeter-wave radar and vision sensor used by the AEB system detect obstacles accurately when the unmanned device travels at high speed, they cannot accurately detect obstacles around the unmanned device at low speed. As a result, the unmanned device may fail to brake in time, reducing its safety.
Disclosure of Invention
The present specification provides an unmanned device control method, apparatus, device and storage medium, to partially solve the above problems in the prior art.
The technical solution adopted by this specification is as follows:
the present specification provides an unmanned device control method including:
monitoring a current speed of the unmanned device;
when the current speed of the unmanned equipment is smaller than a predetermined speed threshold value, acquiring state information of an obstacle through an ultrasonic radar configured on the unmanned equipment;
judging whether collision risk exists between the unmanned equipment and the obstacle or not according to the current state information of the unmanned equipment and the state information of the obstacle;
and controlling the unmanned equipment according to the judgment result.
Optionally, the method further comprises:
and when the current speed of the unmanned equipment is not less than the speed threshold value, acquiring the state information of the obstacle through a millimeter wave radar and/or a vision sensor configured on the unmanned equipment.
Optionally, the determining of the speed threshold specifically includes:
presetting a plurality of speed test values;
for each speed test value, controlling the unmanned equipment to run at the speed test value and testing the ultrasonic radar configured on the unmanned equipment, to obtain the accuracy with which the ultrasonic radar detects obstacle state information at that speed test value;
and determining a speed threshold according to the accuracy corresponding to each speed test value.
Optionally, the speed threshold is a preset value.
Optionally, judging whether a collision risk exists between the unmanned equipment and the obstacle according to the current state information of the unmanned equipment and the state information of the obstacle specifically includes:
determining a motion track of the unmanned equipment according to the state information of the unmanned equipment;
determining the movement track of the obstacle according to the state information of the obstacle;
determining a track intersection point according to the motion track of the unmanned equipment and the motion track of the obstacle;
and judging whether collision risk exists between the unmanned equipment and the obstacle or not according to the track intersection point.
Optionally, judging whether a collision risk exists between the unmanned equipment and the obstacle according to the track intersection point specifically includes:
determining a first time for the unmanned equipment to reach the track intersection point according to the current speed of the unmanned equipment;
determining a second time for the obstacle to reach the track intersection point according to the speed of the obstacle;
taking the difference between the first time and the second time as the time difference between the unmanned equipment and the obstacle respectively reaching the track intersection point;
and judging whether collision risks exist between the unmanned equipment and the obstacle or not according to the time difference.
Optionally, the method further comprises:
when no track intersection point exists between the motion track of the unmanned equipment and the motion track of the obstacle, judging whether the transverse distance between the unmanned equipment and the obstacle is smaller than a preset distance threshold value;
if yes, judging whether collision risks exist between the unmanned equipment and the obstacle or not according to the current speed of the unmanned equipment, the speed of the obstacle and the longitudinal distance between the unmanned equipment and the obstacle.
Optionally, judging whether a collision risk exists between the unmanned equipment and the obstacle according to the time difference specifically includes:
if the time difference is not smaller than a first threshold value, judging that the collision risk existing between the unmanned equipment and the obstacle is a first risk;
if the time difference is smaller than the first threshold value and not smaller than a second threshold value, judging that the collision risk existing between the unmanned equipment and the obstacle is a second risk; wherein the second threshold is less than the first threshold;
if the time difference is smaller than the second threshold, judging that the collision risk existing between the unmanned equipment and the obstacle is a third risk;
controlling the unmanned equipment according to the judgment result, which specifically comprises:
when the judgment result shows that the collision risk existing between the unmanned equipment and the obstacle is a first risk, controlling the unmanned equipment to keep a current driving state;
when the judgment result shows that the collision risk existing between the unmanned equipment and the obstacle is a second risk, controlling the unmanned equipment to avoid the obstacle;
and when the judgment result shows that the collision risk existing between the unmanned equipment and the obstacle is a third risk, controlling the unmanned equipment to brake.
This specification provides an unmanned equipment control device, including:
the monitoring module is used for monitoring the current speed of the unmanned equipment;
the acquiring module is used for acquiring the state information of the obstacle through an ultrasonic radar configured on the unmanned equipment when the current speed of the unmanned equipment is smaller than a predetermined speed threshold;
the judging module is used for judging whether collision risks exist between the unmanned equipment and the obstacles or not according to the current state information of the unmanned equipment and the state information of the obstacles;
and the control module is used for controlling the unmanned equipment according to the judgment result.
The present specification provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the above unmanned device control method.
The present specification provides an electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, the processor implementing the above-mentioned unmanned device control method when executing the program.
The technical solution adopted by this specification can achieve the following beneficial effects:
according to the method, the current speed of the unmanned equipment is detected, when the current speed of the unmanned equipment is smaller than a predetermined speed threshold, the ultrasonic radar is started to obtain the state information of the obstacle, and then the collision risk of collision between the unmanned equipment and the obstacle is judged according to the state information of the unmanned equipment and the state information of the obstacle, so that the unmanned equipment is controlled according to the determined collision risk. Therefore, when the current speed of the unmanned equipment is smaller than the predetermined speed threshold value, the ultrasonic radar is started, the precision of detecting the obstacle when the unmanned equipment runs at the low speed is improved, the condition that the unmanned equipment collides with the obstacle due to the fact that the unmanned equipment cannot accurately detect the obstacle is avoided, and the safety of the unmanned equipment is improved. Moreover, when the current speed of the unmanned equipment is smaller than a predetermined speed threshold value, the ultrasonic radar is started to sense the obstacle, so that the AEB system is available in the low-speed state of the unmanned equipment, and the application scene of the AEB system is expanded.
Drawings
The accompanying drawings, which are included to provide a further understanding of the specification and are incorporated in and constitute a part of it, illustrate embodiments of the specification and together with the description serve to explain the specification without limiting it. In the drawings:
fig. 1 is a schematic flow chart of an unmanned device control method in this specification;
fig. 2 is a schematic flow chart of an unmanned device control method in this specification;
fig. 3A is a schematic diagram of an unmanned device colliding with an obstacle provided in this specification;
fig. 3B is a schematic diagram of an unmanned device colliding with an obstacle provided in this specification;
fig. 3C is a schematic diagram of an unmanned device colliding with an obstacle provided in this specification;
fig. 4 is a schematic diagram of an unmanned device control apparatus provided in this specification;
fig. 5 is a schematic diagram of an electronic device corresponding to fig. 1 provided in this specification.
Detailed Description
In order to make the objects, technical solutions and advantages of the present disclosure more clear, the technical solutions of the present disclosure will be clearly and completely described below with reference to the specific embodiments of the present disclosure and the accompanying drawings. It is to be understood that the embodiments described are only a few embodiments of the present disclosure, and not all embodiments. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments in the present specification without any creative effort belong to the protection scope of the present specification.
In addition, it should be noted that all the actions of acquiring signals, information or data in the present invention are performed under the premise of complying with the corresponding data protection regulation policy of the country of the location and obtaining the authorization given by the owner of the corresponding device.
To address the driving-safety problem of unmanned devices, an automatic emergency braking (AEB) system is configured on the unmanned device. The AEB system provides an active safety-assistance function. A conventional AEB system measures the relative distance between the vehicle and a preceding vehicle or obstacle using a millimeter-wave radar, a vision sensor, and the like, compares the measured relative distance with the braking distance, and brakes automatically when the relative distance is smaller than the safe distance, thereby ensuring the driving safety of the unmanned device. Fusing the millimeter-wave radar and the vision sensor in the AEB system yields more accurate environmental data, improves redundancy, and ensures the driving stability and safety of the unmanned device to the greatest extent.
However, when the unmanned device travels at low speed, the accuracy with which the millimeter-wave radar and the vision sensor detect obstacles falls short of expectations, so the AEB system cannot take braking measures in time to avoid a collision between the unmanned device and an obstacle. To solve this problem, this specification introduces an ultrasonic radar: during low-speed driving, the ultrasonic radar detects the state information of the obstacle, and the risk of a collision between the unmanned device and the obstacle is then judged, thereby ensuring the driving safety of the unmanned device in both high-speed and low-speed states.
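As an illustration of the conventional AEB braking decision described above, the following Python sketch compares a measured relative distance against a kinematic braking distance. It is not code from the patent; the deceleration and safety-margin values are illustrative assumptions.

```python
# Hedged sketch of a conventional AEB decision: brake when the measured
# gap is below the braking distance plus a safety margin. All numeric
# defaults are assumptions, not values from this specification.

def braking_distance(speed_mps: float, decel_mps2: float = 6.0) -> float:
    # Constant-deceleration stopping distance: v^2 / (2 * a).
    return speed_mps ** 2 / (2.0 * decel_mps2)

def should_brake(relative_distance_m: float, speed_mps: float,
                 safety_margin_m: float = 2.0) -> bool:
    # Compare the measured relative distance with the safe distance.
    return relative_distance_m < braking_distance(speed_mps) + safety_margin_m
```

For example, at 10 m/s the sketch brakes for a 5 m gap but not for a 50 m gap.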
The technical solutions provided by the embodiments of the present description are described in detail below with reference to the accompanying drawings.
In the embodiments of this specification, the unmanned device control method shown in fig. 1 may be applied to an unmanned device, which may be an unmanned vehicle or an unmanned aerial vehicle. The method may be executed by a system or module with an emergency-braking function within the unmanned driving system of the unmanned device, or by an AEB system provided on the unmanned device independently of the unmanned driving system; this specification does not limit this.
Fig. 1 is a schematic flow chart of an unmanned device control method in this specification, which specifically includes the following steps:
s100: the current speed of the drone is monitored.
Since the ultrasonic radar introduced in the embodiments of this specification is mainly applied when the unmanned device travels at low speed, the current speed of the unmanned device needs to be monitored so that the ultrasonic radar can acquire obstacle state information during low-speed driving.
In this step, the speed of the unmanned device may include longitudinal speed, lateral speed, and angular speed. The current speed may be acquired in real time or sampled periodically; this specification limits neither the type of speed nor the monitoring mode.
S102: and when the current speed of the unmanned equipment is smaller than a predetermined speed threshold value, acquiring the state information of the obstacle through an ultrasonic radar configured on the unmanned equipment.
In practical applications, the sensors for sensing the state of obstacles around the unmanned device may also include a millimeter-wave radar, a vision sensor, and the like. These sensors sense the obstacle state more accurately when the unmanned device travels at high speed. A speed threshold can therefore be set in advance: when the current speed of the unmanned device is not less than the threshold, obstacle state information is acquired through the millimeter-wave radar and/or vision sensor configured on the unmanned device; when the current speed is less than the threshold, it is acquired through the ultrasonic radar configured on the unmanned device. In general, the speed threshold is determined in advance from the accuracy with which the sensors configured on the unmanned device (millimeter-wave radar, ultrasonic radar, vision sensor, etc.) sense obstacle information.
In this step, although the ultrasonic radar is enabled to sense the environment around the unmanned device when its current speed is less than the predetermined speed threshold, this does not mean that the millimeter-wave radar and/or vision sensor stop working. That is, when the current speed is below the threshold, the ultrasonic radar collects obstacle state information in addition to the sensing performed by the millimeter-wave radar and/or vision sensor, compensating for their reduced accuracy in the low-speed driving state.
In addition, the millimeter-wave radar generally detects metal obstacles such as vehicles with high accuracy but resolves obstacles such as pedestrians poorly, and since it cannot form an image it cannot recognize color. The vision sensor can form and process images, but due to environmental factors or the state of the obstacle itself, an obstacle may blend into its surroundings (its edges become unclear), and the vision sensor detects such obstacles with low accuracy. Enabling the ultrasonic radar together with the millimeter-wave radar and/or vision sensor in the AEB system therefore yields more accurate environmental data for various types of obstacles, improves redundancy, and ensures the driving stability and safety of the unmanned device to the greatest extent.
It should be noted that, in the embodiments of this specification, speed is used only as the criterion for whether to enable the ultrasonic radar to sense obstacles; this does not mean the ultrasonic radar applies only to the low-speed state of the unmanned device. It also applies, for example, when an obstacle is close to the unmanned device.
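The speed-gated sensor selection described above can be sketched as follows. This is a minimal illustration; the sensor names are assumptions, and the key point is that the ultrasonic radar is enabled in addition to, not instead of, the other sensors.

```python
def select_sensors(current_speed: float, speed_threshold: float) -> list[str]:
    # Millimeter-wave radar and vision sensor stay enabled at all speeds;
    # below the threshold the ultrasonic radar is enabled in addition,
    # compensating for their reduced low-speed accuracy.
    sensors = ["millimeter_wave_radar", "vision_sensor"]
    if current_speed < speed_threshold:
        sensors.append("ultrasonic_radar")
    return sensors
```

Note that a speed exactly equal to the threshold counts as "not less than", so the ultrasonic radar stays off, matching the wording of step S102.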
Obstacles may include dynamic obstacles perceivable in the vicinity of the unmanned device, such as other unmanned devices, motor vehicles, non-motor vehicles, and pedestrians, as well as static obstacles, such as road edges and traffic facilities. The acquired obstacle state information may include the obstacle's speed, traveling direction, position relative to the unmanned device, and the like.
S104: and determining the collision risk of collision between the unmanned equipment and the obstacle according to the current state information of the unmanned equipment and the state information of the obstacle.
In practical applications, when determining the risk of a collision between the unmanned device and an obstacle, the perceived obstacles may first be risk-graded according to the closest-vehicle-in-the-same-lane principle, and the obstacles with higher risk levels selected as target obstacles, so that the collision risk between the unmanned device and each target obstacle is determined preferentially. Under this principle, the lane in which the unmanned device is located is taken as the main lane, and an obstacle traveling in the main lane has a higher risk level than one traveling in a lane adjacent to it. For obstacles in the main lane, the longitudinal distance to the unmanned device is the grading criterion: the shorter the longitudinal distance, the higher the risk level. One or more target obstacles may be selected; this specification does not limit the number.
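A minimal sketch of the closest-vehicle-in-the-same-lane grading, assuming each perceived obstacle is represented by a hypothetical dict with an `in_main_lane` flag and a `longitudinal_distance` field (this representation is not specified by the patent):

```python
def rank_obstacles(obstacles: list[dict]) -> list[dict]:
    # Main-lane obstacles outrank adjacent-lane ones; within a lane,
    # a shorter longitudinal distance means a higher risk level.
    return sorted(
        obstacles,
        key=lambda o: (not o["in_main_lane"], o["longitudinal_distance"]),
    )

def select_target(obstacles: list[dict]) -> dict:
    # The highest-risk obstacle is taken as the target obstacle.
    return rank_obstacles(obstacles)[0]
```

With this ordering, a main-lane obstacle 6 m ahead outranks both a main-lane obstacle 12 m ahead and any obstacle in an adjacent lane.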
To avoid a collision between the unmanned device and an obstacle, the unmanned device needs to be controlled to change its driving strategy to avoid the obstacle, and to brake urgently when it is judged unable to avoid the obstacle. The determined collision risk can therefore be classified so that the control strategy of the unmanned device can be chosen according to the risk category. For example, a first threshold and a second threshold are set in advance for determining the collision-risk category, where the second threshold is smaller than the first. The risk categories are preset as a first risk, a second risk, and a third risk: the first risk corresponds to no collision risk between the unmanned device and the obstacle, the second risk to a collision risk that is not urgent, and the third risk to a collision risk that is urgent. That is, the risk level of the first risk is lower than that of the second risk, which is in turn lower than that of the third risk.
S106: controlling the unmanned aerial vehicle according to the collision risk.
Optionally, a correspondence between collision-risk categories and control strategies for the unmanned device is preset. The control strategy is then determined from the risk category judged in step S104 and this preset correspondence. If the risk category is the first risk, there is no collision risk and the unmanned device is controlled to keep its current driving state; if it is the second risk, there is a collision risk but it is not urgent, and the unmanned device is controlled to avoid the obstacle; if it is the third risk, there is an urgent collision risk, and the unmanned device is controlled to brake urgently.
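The two-threshold risk classification and the risk-to-strategy correspondence can be sketched as follows. The threshold values are illustrative assumptions; the specification prescribes no concrete numbers, only that the second threshold is smaller than the first.

```python
FIRST_THRESHOLD_S = 4.0   # illustrative value; the patent names no number
SECOND_THRESHOLD_S = 2.0  # must be smaller than the first threshold

def classify_risk(time_diff_s: float) -> str:
    # time_diff_s: difference between the arrival times of the unmanned
    # device and the obstacle at the track intersection point.
    if time_diff_s >= FIRST_THRESHOLD_S:
        return "first_risk"    # no collision risk
    if time_diff_s >= SECOND_THRESHOLD_S:
        return "second_risk"   # collision risk, but not urgent
    return "third_risk"        # urgent collision risk

# Preset correspondence between risk categories and control strategies.
CONTROL_STRATEGY = {
    "first_risk": "keep_current_driving_state",
    "second_risk": "avoid_obstacle",
    "third_risk": "emergency_brake",
}

def control_action(time_diff_s: float) -> str:
    return CONTROL_STRATEGY[classify_risk(time_diff_s)]
```

The boundary handling mirrors the claims: a time difference not smaller than the first threshold is the first risk, one smaller than the first but not smaller than the second is the second risk, and one smaller than the second is the third risk.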
According to this method, the current speed of the unmanned device is monitored; when it is less than a predetermined speed threshold, the ultrasonic radar is enabled to acquire obstacle state information, the risk of a collision between the unmanned device and the obstacle is judged from the state information of both, and the unmanned device is controlled according to the determined collision risk. Enabling the ultrasonic radar below the speed threshold improves the accuracy of obstacle detection during low-speed driving, avoids collisions caused by obstacles the unmanned device could not otherwise detect accurately, and improves the safety of the unmanned device. Moreover, enabling the ultrasonic radar to sense obstacles below the speed threshold makes the AEB system usable in the low-speed state of the unmanned device, expanding the AEB system's application scenarios.
In the embodiments of this specification, whether to enable the ultrasonic radar to acquire obstacle state information is decided using the predetermined speed threshold, as shown in step S102 of fig. 1. The speed threshold may be determined as follows:
first, a plurality of speed test values are set in advance. The speed test value is used for testing the accuracy of the ultrasonic radar, the millimeter wave radar and/or the vision sensor which are configured on the unmanned equipment for detecting the obstacle state information when the unmanned equipment runs under a plurality of different speed test values.
And then controlling the unmanned equipment to run at each speed test value, testing the ultrasonic radar configured on the unmanned equipment, and obtaining the accuracy of the ultrasonic radar detection obstacle state information corresponding to each speed test value.
Finally, the speed threshold is determined from the accuracy corresponding to each speed test value. Among the accuracies of the ultrasonic radar at the various speed test values, the highest accuracy is taken as the target accuracy, and the speed test value corresponding to it is used as the speed threshold. Thus, during actual driving, when the current speed of the unmanned device is less than this threshold, obstacle state information is detected by the ultrasonic radar configured on the unmanned device.
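A sketch of selecting the speed threshold from the per-speed test results, assuming they are collected into a hypothetical mapping from speed test value to measured detection accuracy:

```python
def determine_speed_threshold(accuracy_by_speed: dict) -> float:
    # accuracy_by_speed maps each speed test value to the measured accuracy
    # of the ultrasonic radar at that speed; the speed test value with the
    # highest accuracy is returned as the speed threshold.
    return max(accuracy_by_speed, key=accuracy_by_speed.get)
```

For example, if the ultrasonic radar is most accurate at the lowest tested speed, that speed becomes the threshold below which the radar is enabled.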
Therefore, by comparing the current speed of the unmanned device with the predetermined speed threshold, the ultrasonic radar is activated only in the regime where it detects obstacle state information with high accuracy. This makes more effective use of the ultrasonic radar's higher detection accuracy in the low-speed state, further avoids collisions caused by inaccurate obstacle detection, and improves the safety of the unmanned device.
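The calibration procedure described above (driving at each speed test value, measuring detection accuracy, and taking the speed with the highest accuracy as the threshold) might be sketched as follows. This is a hypothetical illustration; the function and variable names are not taken from the specification.

```python
def determine_speed_threshold(accuracy_by_speed):
    """Pick the speed test value whose measured ultrasonic-radar detection
    accuracy is highest (the 'target accuracy') and use it as the threshold.

    accuracy_by_speed: mapping of speed test value -> measured accuracy.
    """
    if not accuracy_by_speed:
        raise ValueError("no calibration measurements")
    # The speed test value with the maximum accuracy becomes the threshold.
    return max(accuracy_by_speed, key=accuracy_by_speed.get)
```

For example, with measured accuracies `{1.0: 0.98, 2.0: 0.95, 3.0: 0.90}` (m/s to accuracy), the threshold would be `1.0`.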
In addition, the speed threshold for determining whether to activate the ultrasonic radar may simply be preset; this specification does not limit how it is set.
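The speed-gated sensor selection of step S102 can be sketched as follows; the sensor labels are illustrative strings, not APIs from the specification.

```python
def select_sensors(current_speed, speed_threshold):
    """Below the threshold the ultrasonic radar supplies obstacle state
    information; otherwise the millimeter wave radar and/or vision sensor do."""
    if current_speed < speed_threshold:
        return ["ultrasonic_radar"]
    return ["millimeter_wave_radar", "vision_sensor"]
```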
In this embodiment of the present specification, as shown in step S104 of fig. 1, whether a collision risk exists between the unmanned device and the obstacle is judged according to the current state information of the unmanned device and the state information of the obstacle. The collision risk may be represented as Time To Collision (TTC): the time until a collision would occur if both the unmanned device and the obstacle continued to travel in their current states (speed and trajectory). Clearly, when the collision risk is represented by TTC, the overlap of the two motion trajectories must be considered in calculating the TTC; that is, it must be determined whether a trajectory intersection point exists between the motion trajectory of the unmanned device and the motion trajectory of the obstacle.
It should be noted that the representation of the collision risk between the unmanned device and the obstacle is not limited to TTC; for ease of understanding, the embodiments of this specification use TTC as an example when describing the specific technical solution. As shown in fig. 2, the method specifically comprises the following steps:
S200: determine the motion trajectory of the unmanned device according to the state information of the unmanned device, and determine the motion trajectory of the obstacle according to the state information of the obstacle.
In practical applications, the unmanned device includes a prediction module that predicts the motion trajectory of an obstacle from its state information (speed and position relative to the unmanned device) acquired by sensors such as the ultrasonic radar, millimeter wave radar and vision sensor, and a planning module that plans the device's own motion trajectory from the state information of the unmanned device.
S202: and judging whether a track intersection point exists between the motion track of the unmanned equipment and the motion track of the obstacle. If so, go to step S204, otherwise, go to step S206.
Since the collision risk between the unmanned device and the obstacle is determined by calculating the TTC between them, it is necessary to determine whether a trajectory intersection point exists between the motion trajectory of the unmanned device and that of the obstacle.
If a trajectory intersection point exists between the motion trajectory of the unmanned device and that of the obstacle, the two may collide at that intersection. Optionally, because the calculation model used to locate the intersection point and to determine the time difference with which the unmanned device and the obstacle reach it depends on their directions of motion, the relative motion of the two can be classified into four types: the unmanned device moves straight while the obstacle moves diagonally relative to it; the unmanned device moves straight while the obstacle moves transversely relative to it; the unmanned device turns while the obstacle moves diagonally relative to it; and the unmanned device turns while the obstacle moves transversely relative to it.
If no trajectory intersection point exists between the two motion trajectories, the trajectory of the unmanned device is parallel to or coincident with that of the obstacle, and the two travel either in the same direction or in opposite directions. In that case the TTC can be calculated for situations such as the unmanned device meeting the obstacle head-on or one pursuing the other.
S204: determining a first time for the drone to reach the trajectory intersection based on the current velocity of the drone. Determining a second time for the obstacle to reach the trajectory intersection based on the speed of the obstacle. And taking the difference value of the first time and the second time as the time difference of the unmanned device and the obstacle reaching the track intersection respectively. Step S210 is performed.
For example, as shown in fig. 3A, the motion trajectory L_A of the unmanned device A (solid line) and the motion trajectory L_B of the obstacle B (dotted line) have a trajectory intersection point C. The difference between the time at which the unmanned device A, following L_A at its current speed v_A, reaches point C and the time at which the obstacle B, following L_B at its current speed v_B, reaches point C is the difference between the first time and the second time, i.e. the time difference with which the unmanned device A and the obstacle B reach the trajectory intersection point C.
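The time-difference computation of step S204 and fig. 3A can be sketched as follows, under the simplifying assumption of straight-line trajectories and constant speeds; all names and coordinates are illustrative.

```python
import math

def time_difference_at_intersection(pos_a, v_a, pos_b, v_b, intersection):
    """Difference between the first time (device A reaching C along L_A at v_A)
    and the second time (obstacle B reaching C along L_B at v_B).
    Simplification: each path is treated as a straight line."""
    def dist(p, q):
        return math.hypot(q[0] - p[0], q[1] - p[1])
    t_first = dist(pos_a, intersection) / v_a    # first time
    t_second = dist(pos_b, intersection) / v_b   # second time
    return t_first - t_second
```

A small result (in absolute value) means both parties reach C at nearly the same moment, i.e. a higher collision risk.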
S206: and judging whether the transverse distance between the unmanned equipment and the obstacle is smaller than a preset distance threshold value. If yes, go to step S208, otherwise go to step S214.
The lateral distance between the unmanned device and the obstacle is used as one basis for judging collision risk because it distinguishes whether the motion trajectory of the unmanned device coincides with, or is merely parallel to, that of the obstacle. In general, the distance of an obstacle sensed by the ultrasonic radar, millimeter wave radar and vision sensor provided on the unmanned device is measured from the edge of the recognized obstacle to the center of the unmanned device, so a distance threshold can be determined in advance from the width of the unmanned device in order to make this distinction.
If the motion trajectory of the unmanned device is parallel to that of the obstacle, no collision occurs whether the two meet while moving in opposite directions or one pursues the other while moving in the same direction; the collision risk is then directly judged to be the first risk, i.e. no risk of collision. If the two motion trajectories coincide, the collision risk must be judged for the cases of meeting while moving in opposite directions and pursuit while moving in the same direction.
S208: and determining a collision position where the unmanned equipment and the obstacle are likely to collide according to the current speed of the unmanned equipment, the speed of the obstacle and the longitudinal distance between the unmanned equipment and the obstacle. And determining the time difference of the unmanned equipment and the barrier respectively reaching the collision position.
The collision position where the unmanned aerial vehicle and the obstacle are likely to collide is determined according to the two situations of meeting when the unmanned aerial vehicle and the obstacle move in opposite directions and pursuing when the unmanned aerial vehicle and the obstacle move in the same direction.
For example, as shown in fig. 3B, the unmanned device A travels toward the obstacle B, where the motion trajectory of A is L_A (solid line) with speed v_A, and the motion trajectory of B is L_B (dotted line) with speed v_B. Since no trajectory intersection point exists between the two trajectories and the lateral distance between A and B is less than half the width of the unmanned device, A will collide with B once they meet, and the collision position C can be determined from the relative distance and relative speed between A and B. As shown in fig. 3C, when the unmanned device A travels in the same direction as the obstacle B, likewise with no trajectory intersection point and a lateral distance less than half the device width, A collides with B once B catches up with A, and the collision position C can again be determined from the relative distance and relative speed between A and B.
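Under the assumptions of figs. 3B and 3C (coincident trajectories, constant speeds), the TTC for the meeting and pursuit cases might be sketched as follows; the function name and parameterization are illustrative, not from the specification.

```python
import math

def ttc_coincident(longitudinal_gap, v_device, v_obstacle, head_on):
    """TTC when the trajectories coincide and there is no intersection point.
    head_on=True: the two meet while moving in opposite directions (fig. 3B),
    so the closing speed is the sum of the speeds.
    head_on=False: same-direction pursuit in which the obstacle catches up
    with the device (fig. 3C), closing speed v_obstacle - v_device.
    Returns math.inf when the gap is not closing (no collision)."""
    closing = v_device + v_obstacle if head_on else v_obstacle - v_device
    return longitudinal_gap / closing if closing > 0 else math.inf
```

For instance, a 10 m gap closed head-on at 2 m/s and 3 m/s yields a TTC of 2 s, while a faster device pursued by a slower obstacle yields an infinite TTC (no collision).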
S210: and judging whether the time difference is smaller than a first threshold value. If so, go to step S212, otherwise go to step S214.
S212: and judging whether the time difference is smaller than a second threshold value. If so, go to step S218, otherwise go to step S216.
The first threshold value and the second threshold value are used for judging the category of the collision risk, wherein the second threshold value is smaller than the first threshold value.
S214: and judging that the collision risk existing between the unmanned equipment and the obstacle is a first risk.
S216: and judging the collision risk existing between the unmanned equipment and the obstacle as a second risk.
S218: and judging that the collision risk existing between the unmanned equipment and the obstacle is a third risk.
Specifically, if the time difference is not smaller than a first threshold, it is determined that a collision risk existing between the unmanned aerial vehicle and the obstacle is a first risk. And the control strategy corresponding to the first risk is to control the unmanned equipment to keep the current driving state.
If the time difference is smaller than the first threshold but not smaller than the second threshold, the collision risk between the unmanned device and the obstacle is judged to be the second risk. The control strategy corresponding to the second risk is to control the unmanned device to avoid the trajectory intersection point or the collision position. It should be noted that the second risk means the unmanned device still has enough time to re-plan its trajectory according to the state information of the obstacle and to avoid the obstacle along the re-planned motion trajectory. If the unmanned device cannot re-plan within a certain time, for example because the unmanned driving system configured on it fails, the AEB system re-judges the collision risk between the unmanned device and the obstacle; if the risk is judged to have changed (to the first risk or the third risk), the unmanned device is controlled according to the control strategy corresponding to the newly judged risk.
And if the time difference is smaller than a second threshold value, judging that the collision risk existing between the unmanned equipment and the obstacle is a third risk. The control strategy corresponding to the third risk is to control the braking of the unmanned equipment, such as deceleration, stop running and the like.
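The three-way classification of steps S210 to S218 and the corresponding control strategies can be sketched as follows. This is a hypothetical illustration: the function name and risk labels are not from the specification, and taking the absolute value of the time difference is an assumption, since the specification only speaks of "the time difference".

```python
def classify_risk(time_difference, first_threshold, second_threshold):
    """Map the time difference to one of the three risks (S210-S218).
    Requires second_threshold < first_threshold; a smaller time difference
    means the two parties arrive closer together, i.e. a higher risk."""
    t = abs(time_difference)  # assumption: magnitude of the time difference
    if t >= first_threshold:
        return "first_risk"   # keep the current driving state
    if t >= second_threshold:
        return "second_risk"  # re-plan to avoid the intersection/collision position
    return "third_risk"       # brake: decelerate or stop
```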
Optionally, in another embodiment of the present specification, when determining whether there is a collision risk between the unmanned aerial vehicle and the obstacle according to the current state information of the unmanned aerial vehicle and the state information of the obstacle, the number of the obstacles may be multiple.
First, for each obstacle, the time difference with which the unmanned device and that obstacle reach the trajectory intersection point is determined according to the obstacle's state information and the current state information of the unmanned device. Then, the smallest of the time differences corresponding to the individual obstacles is selected as the designated time difference, and whether a collision risk exists between the unmanned device and the obstacle corresponding to the designated time difference is judged according to it.
The smallest time difference is selected as the designated time difference so that, when judging whether a collision risk exists, the obstacle posing the highest risk of collision with the unmanned device is considered first: among all obstacles, the one corresponding to the designated time difference carries the greatest collision risk. If no collision risk exists between that obstacle and the unmanned device, then none exists between the unmanned device and any of the other obstacles.
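The multi-obstacle selection described above can be sketched as follows; the dictionary keys are illustrative obstacle identifiers, and the specification does not prescribe any particular data structure.

```python
def designated_time_difference(time_diff_by_obstacle):
    """Select the smallest time difference among all obstacles as the
    designated time difference; the obstacle it belongs to is the one with
    the greatest collision risk and is judged first."""
    obstacle = min(time_diff_by_obstacle,
                   key=lambda k: abs(time_diff_by_obstacle[k]))
    return obstacle, time_diff_by_obstacle[obstacle]
```

If the risk for this obstacle turns out to be the first (no-collision) risk, no other obstacle needs to be checked.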
Based on the same idea, the present specification further provides a corresponding unmanned aerial vehicle control apparatus, as shown in fig. 4.
Fig. 4 is a schematic diagram of an unmanned equipment control device provided in this specification, which specifically includes:
a monitoring module 300, configured to monitor the current speed of the unmanned device;
an obtaining module 302, configured to obtain the state information of an obstacle through the ultrasonic radar configured on the unmanned device when the current speed of the unmanned device is smaller than a predetermined speed threshold;
a judging module 304, configured to judge whether a collision risk exists between the unmanned device and the obstacle according to the current state information of the unmanned device and the state information of the obstacle;
and a control module 306, configured to control the unmanned device according to the judgment result.
Optionally, the obtaining module 302 is further configured to obtain the state information of the obstacle through a millimeter wave radar and/or a visual sensor configured on the unmanned device when the current speed of the unmanned device is not less than a predetermined speed threshold.
Optionally, the apparatus further comprises:
a speed threshold determination module 308 for presetting a plurality of speed test values; controlling the unmanned equipment to run at each speed test value, testing the ultrasonic radar configured on the unmanned equipment, and obtaining the accuracy of the ultrasonic radar for detecting the obstacle state information corresponding to each speed test value; and determining the speed threshold according to the accuracy corresponding to each speed test value.
Optionally, the speed threshold is a preset value.
Optionally, the determining module 304 is specifically configured to determine a motion trajectory of the unmanned device according to the state information of the unmanned device; determining the movement track of the obstacle according to the state information of the obstacle; determining a track intersection point according to the motion track of the unmanned equipment and the motion track of the obstacle; and judging whether collision risk exists between the unmanned equipment and the obstacle or not according to the track intersection point.
Optionally, the determining module 304 is specifically configured to determine, according to the current speed of the unmanned aerial vehicle, a first time when the unmanned aerial vehicle reaches the track intersection; determining a second time for the obstacle to reach the trajectory intersection point according to the speed of the obstacle; taking a difference value between the first time and the second time as a time difference between the unmanned device and the obstacle reaching the track intersection point respectively; and judging whether collision risks exist between the unmanned equipment and the obstacle or not according to the time difference.
Optionally, the determining module 304 is further configured to determine whether a lateral distance between the unmanned aerial vehicle and the obstacle is smaller than a preset distance threshold when there is no track intersection point between the motion track of the unmanned aerial vehicle and the motion track of the obstacle; if yes, judging whether collision risks exist between the unmanned equipment and the obstacle or not according to the current speed of the unmanned equipment, the speed of the obstacle and the longitudinal distance between the unmanned equipment and the obstacle.
Optionally, the determining module 304 is specifically configured to determine, if the time difference is not smaller than a first threshold, that a collision risk existing between the unmanned aerial vehicle and the obstacle is a first risk; if the time difference is smaller than the first threshold and the time difference is not smaller than a second threshold, judging that the collision risk existing between the unmanned equipment and the obstacle is a second risk; wherein the second threshold is less than the first threshold; if the time difference is smaller than the second threshold, judging that the collision risk existing between the unmanned equipment and the obstacle is a third risk;
Optionally, the control module 306 is specifically configured to control the unmanned device to maintain its current driving state when the judgment result indicates that the collision risk between the unmanned device and the obstacle is the first risk; to control the unmanned device to avoid the trajectory intersection point or the collision position when the judgment result indicates that the collision risk is the second risk; and to control the unmanned device to brake when the judgment result indicates that the collision risk is the third risk.
With this apparatus, as with the method, the current speed of the unmanned device is monitored; when it falls below the predetermined speed threshold, the ultrasonic radar is activated to acquire the state information of an obstacle, the collision risk between the unmanned device and the obstacle is judged from the state information of both, and the unmanned device is controlled according to the determined risk. Activating the ultrasonic radar below the speed threshold improves the accuracy of obstacle detection during low-speed travel, avoids collisions caused by inaccurate obstacle detection, and improves the safety of the unmanned device; it also keeps the AEB system available in the low-speed state, which expands the application scenarios of the AEB system.
The present specification also provides a computer-readable storage medium storing a computer program operable to execute the above-described unmanned aerial device control method provided in fig. 1.
This specification also provides a schematic block diagram of the electronic device shown in fig. 5. As shown in fig. 5, at the hardware level, the electronic device includes a processor, an internal bus, a network interface, a memory, and a non-volatile memory, but may also include hardware required for other services. The processor reads a corresponding computer program from the nonvolatile memory into the memory and then runs the computer program to implement the above-described unmanned device control method of fig. 1. Of course, besides the software implementation, this specification does not exclude other implementations, such as logic devices or combination of software and hardware, and so on, that is, the execution subject of the following processing flow is not limited to each logic unit, and may be hardware or logic devices.
In the 1990s, an improvement in a technology could be clearly distinguished as an improvement in hardware (for example, an improvement in a circuit structure such as a diode, transistor or switch) or an improvement in software (an improvement in a method flow). With the development of technology, however, many of today's improvements to method flows can be regarded as direct improvements to hardware circuit structures: designers almost always obtain a corresponding hardware circuit structure by programming an improved method flow into a hardware circuit. Therefore, it cannot be said that an improvement in a method flow cannot be realized by a hardware entity module. For example, a Programmable Logic Device (PLD), such as a Field Programmable Gate Array (FPGA), is an integrated circuit whose logic functions are determined by the user's programming of the device. A designer "integrates" a digital system onto a single PLD by programming it, without asking a chip manufacturer to design and fabricate an application-specific integrated circuit chip. Moreover, this programming is now mostly implemented with "logic compiler" software rather than by manually making integrated circuit chips. Such software is similar to the compilers used in program development, and the source code to be compiled must be written in a particular programming language called a Hardware Description Language (HDL). There is not just one HDL but many, such as ABEL (Advanced Boolean Expression Language), AHDL (Altera Hardware Description Language), Confluence, CUPL (Cornell University Programming Language), HDCal, JHDL (Java Hardware Description Language), Lava, Lola, MyHDL, PALASM and RHDL (Ruby Hardware Description Language); at present, VHDL (Very-High-Speed Integrated Circuit Hardware Description Language) and Verilog are the most commonly used.
It will also be apparent to those skilled in the art that hardware circuitry that implements the logical method flows can be readily obtained by merely slightly programming the method flows into an integrated circuit using the hardware description languages described above.
The controller may be implemented in any suitable manner. For example, the controller may take the form of a microprocessor or processor together with a computer-readable medium storing computer-readable program code (such as software or firmware) executable by the (micro)processor, or of logic gates, switches, an Application Specific Integrated Circuit (ASIC), a programmable logic controller or an embedded microcontroller; examples include, but are not limited to, the following microcontrollers: the ARC 625D, Atmel AT91SAM, Microchip PIC18F26K20 and Silicon Labs C8051F320. A memory controller may also be implemented as part of the control logic of the memory. Those skilled in the art also know that, besides implementing the controller purely as computer-readable program code, the method steps can be logically programmed so that the controller achieves the same functions in the form of logic gates, switches, application-specific integrated circuits, programmable logic controllers, embedded microcontrollers and the like. Such a controller may therefore be regarded as a hardware component, and the means included in it for realizing various functions may also be regarded as structures within the hardware component. Indeed, means for realizing various functions may even be regarded both as software modules implementing the method and as structures within the hardware component.
The systems, devices, modules or units illustrated in the above embodiments may be implemented by a computer chip or an entity, or by a product with certain functions. One typical implementation device is a computer. In particular, the computer may be, for example, a personal computer, a laptop computer, a cellular telephone, a camera phone, a smartphone, a personal digital assistant, a media player, a navigation device, an email device, a game console, a tablet computer, a wearable device, or a combination of any of these devices.
For convenience of description, the above devices are described as being divided into various units by function, and are described separately. Of course, the functions of the various elements may be implemented in the same one or more software and/or hardware implementations of the present description.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include volatile memory in a computer-readable medium, such as random access memory (RAM), and/or non-volatile memory, such as read-only memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media, including both permanent and non-permanent, removable and non-removable media, may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, modules of a program, or other data. Examples of computer storage media include, but are not limited to, phase change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium, which can be used to store information that can be accessed by a computing device. As defined herein, a computer-readable medium does not include a transitory computer-readable medium such as a modulated data signal or a carrier wave.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but also other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in the process, method, article, or apparatus that comprises the element.
As will be appreciated by one skilled in the art, embodiments of the present description may be provided as a method, system, or computer program product. Accordingly, the description may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the description may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and so forth) having computer-usable program code embodied therein.
This description may be described in the general context of computer-executable instructions, such as program modules, being executed by a computer. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. The specification may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in both local and remote computer storage media including memory storage devices.
The embodiments in this specification are described in a progressive manner; for identical or similar parts, the embodiments may be referred to one another, and each embodiment focuses on its differences from the others. In particular, the system embodiment is described briefly because it is substantially similar to the method embodiment; for relevant details, refer to the corresponding parts of the method embodiment description.
The above description is only an example of the present specification and is not intended to limit it. Various modifications and alterations to this specification will be apparent to those skilled in the art. Any modification, equivalent replacement, improvement, or the like made within the spirit and principles of this specification shall fall within the scope of its claims.

Claims (11)

1. An unmanned equipment control method, comprising:
monitoring a current speed of the unmanned device;
when the current speed of the unmanned equipment is smaller than a predetermined speed threshold value, acquiring state information of an obstacle through an ultrasonic radar configured on the unmanned equipment;
judging whether collision risk exists between the unmanned equipment and the obstacle or not according to the current state information of the unmanned equipment and the state information of the obstacle;
and controlling the unmanned equipment according to the judgment result.
2. The method of claim 1, wherein the method further comprises:
and when the current speed of the unmanned equipment is not less than the speed threshold value, acquiring the state information of the obstacle through a millimeter wave radar and/or a vision sensor configured on the unmanned equipment.
3. The method of claim 1 or 2, wherein the determining of the speed threshold comprises:
presetting a plurality of speed test values;
controlling the unmanned equipment to run at each speed test value, testing the ultrasonic radar configured on the unmanned equipment, and obtaining, for each speed test value, the accuracy with which the ultrasonic radar detects obstacle state information;
and determining the speed threshold according to the accuracy corresponding to each speed test value.
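The threshold-selection procedure of claim 3 can be sketched as follows: pick the highest tested speed at which the ultrasonic radar's measured detection accuracy still meets a required level. This is a minimal illustration; the function name, the `min_accuracy` parameter, and the sample data are assumptions, not from the patent.

```python
def determine_speed_threshold(accuracy_by_speed, min_accuracy=0.95):
    """Pick the highest tested speed at which the ultrasonic radar's
    obstacle-detection accuracy still meets the required level.

    accuracy_by_speed: dict mapping a speed test value (m/s) to the
    measured detection accuracy (0.0-1.0) at that speed.
    """
    qualifying = [v for v, acc in accuracy_by_speed.items() if acc >= min_accuracy]
    if not qualifying:
        raise ValueError("no test speed met the required accuracy")
    return max(qualifying)

# Hypothetical test data: accuracy degrades as speed rises.
print(determine_speed_threshold({1.0: 0.99, 2.0: 0.97, 3.0: 0.91, 4.0: 0.80}))  # 2.0
```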
4. A method according to claim 1 or 2, wherein the speed threshold is a preset value.
5. The method according to claim 1, wherein determining whether there is a risk of collision between the unmanned equipment and the obstacle according to the current state information of the unmanned equipment and the state information of the obstacle specifically comprises:
determining the motion trail of the unmanned equipment according to the state information of the unmanned equipment;
determining the movement track of the obstacle according to the state information of the obstacle;
determining a track intersection point according to the motion track of the unmanned equipment and the motion track of the obstacle;
and judging whether collision risk exists between the unmanned equipment and the obstacle or not according to the track intersection point.
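The trajectory-intersection step of claim 5 can be illustrated by approximating both motion trajectories as straight 2-D line segments and applying the standard segment-intersection formula. All names and coordinates below are illustrative assumptions; the patent does not specify how trajectories are represented.

```python
def segment_intersection(p1, p2, q1, q2):
    """Intersection point of segments p1-p2 (device trajectory) and
    q1-q2 (obstacle trajectory), or None if they do not cross.

    Each point is an (x, y) tuple; trajectories are approximated here
    as straight segments for illustration.
    """
    (x1, y1), (x2, y2) = p1, p2
    (x3, y3), (x4, y4) = q1, q2
    denom = (x1 - x2) * (y3 - y4) - (y1 - y2) * (x3 - x4)
    if denom == 0:  # parallel or collinear: no single crossing point
        return None
    t = ((x1 - x3) * (y3 - y4) - (y1 - y3) * (x3 - x4)) / denom
    u = ((x1 - x3) * (y1 - y2) - (y1 - y3) * (x1 - x2)) / denom
    if 0 <= t <= 1 and 0 <= u <= 1:
        return (x1 + t * (x2 - x1), y1 + t * (y2 - y1))
    return None

# Device heading along the x-axis, obstacle crossing it vertically.
print(segment_intersection((0, 0), (4, 0), (2, -2), (2, 2)))  # (2.0, 0.0)
```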
6. The method of claim 5, wherein determining whether a collision risk exists between the unmanned equipment and the obstacle based on the trajectory intersection point comprises:
determining a first time for the unmanned equipment to reach the trajectory intersection point based on the current speed of the unmanned equipment;
determining a second time for the obstacle to reach the trajectory intersection point according to the speed of the obstacle;
taking a difference value between the first time and the second time as a time difference between the unmanned device and the obstacle reaching the track intersection point respectively;
and judging whether collision risks exist between the unmanned equipment and the obstacle or not according to the time difference.
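The time-difference computation of claim 6 reduces to two time-to-arrival estimates and their absolute difference. The sketch below assumes straight-line travel at constant speed; the function name and distance parameters are illustrative, not from the patent.

```python
import math

def arrival_time_gap(ego_speed, obstacle_speed, ego_dist, obstacle_dist):
    """Absolute difference between the times at which the unmanned
    device and the obstacle reach the trajectory intersection point.

    ego_dist / obstacle_dist: path distances (m) to the intersection.
    Speeds are in m/s; a non-positive speed means that party never arrives.
    """
    if ego_speed <= 0 or obstacle_speed <= 0:
        return math.inf
    t1 = ego_dist / ego_speed            # first time: device reaches the point
    t2 = obstacle_dist / obstacle_speed  # second time: obstacle reaches the point
    return abs(t1 - t2)

print(arrival_time_gap(2.0, 1.0, 10.0, 8.0))  # |5.0 - 8.0| = 3.0
```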
7. The method of claim 5, wherein the method further comprises:
when no track intersection point exists between the motion track of the unmanned equipment and the motion track of the obstacle, judging whether the transverse distance between the unmanned equipment and the obstacle is smaller than a preset distance threshold value;
if yes, judging whether collision risks exist between the unmanned equipment and the obstacle or not according to the current speed of the unmanned equipment, the speed of the obstacle and the longitudinal distance between the unmanned equipment and the obstacle.
8. The method according to claim 6, wherein determining whether there is a risk of collision between the unmanned equipment and the obstacle based on the time difference comprises:
if the time difference is not smaller than a first threshold value, judging that the collision risk existing between the unmanned equipment and the obstacle is a first risk;
if the time difference is smaller than the first threshold and the time difference is not smaller than a second threshold, judging that the collision risk existing between the unmanned equipment and the obstacle is a second risk; wherein the second threshold is less than the first threshold;
if the time difference is smaller than the second threshold, judging that the collision risk existing between the unmanned equipment and the obstacle is a third risk;
the controlling the unmanned equipment according to the judgment result specifically includes:
when the judgment result shows that the collision risk existing between the unmanned equipment and the obstacle is a first risk, controlling the unmanned equipment to keep a current driving state;
when the judgment result is that the collision risk existing between the unmanned equipment and the obstacle is a second risk, controlling the unmanned equipment to avoid the track intersection point;
and when the judgment result shows that the collision risk existing between the unmanned equipment and the obstacle is a third risk, controlling the unmanned equipment to brake.
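The tiered control logic of claim 8 maps the arrival-time gap to one of three risk levels and a corresponding action. The threshold values below are hypothetical placeholders; the patent only requires that the second threshold be smaller than the first.

```python
FIRST_THRESHOLD = 4.0   # seconds; hypothetical value
SECOND_THRESHOLD = 1.5  # seconds; must be smaller than FIRST_THRESHOLD

def decide_action(time_gap):
    """Map the arrival-time gap to the three risk tiers of claim 8."""
    if time_gap >= FIRST_THRESHOLD:
        return "keep current driving state"       # first risk
    if time_gap >= SECOND_THRESHOLD:
        return "steer to avoid the intersection"  # second risk
    return "brake"                                # third risk

print(decide_action(5.0))  # keep current driving state
print(decide_action(2.0))  # steer to avoid the intersection
print(decide_action(1.0))  # brake
```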
9. An unmanned aerial device control apparatus, comprising:
the monitoring module is used for monitoring the current speed of the unmanned equipment;
the acquiring module is used for acquiring the state information of the obstacle through an ultrasonic radar configured on the unmanned equipment when the current speed of the unmanned equipment is smaller than a predetermined speed threshold;
the judging module is used for judging whether collision risks exist between the unmanned equipment and the obstacles or not according to the current state information of the unmanned equipment and the state information of the obstacles;
and the control module is used for controlling the unmanned equipment according to the judgment result.
10. A computer-readable storage medium, characterized in that the storage medium stores a computer program which, when executed by a processor, implements the method of any of the preceding claims 1 to 8.
11. An electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor implements the method of any of claims 1 to 8 when executing the program.
CN202210210420.7A 2022-03-04 2022-03-04 Unmanned equipment control method, device, equipment and storage medium Pending CN114590249A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210210420.7A CN114590249A (en) 2022-03-04 2022-03-04 Unmanned equipment control method, device, equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210210420.7A CN114590249A (en) 2022-03-04 2022-03-04 Unmanned equipment control method, device, equipment and storage medium

Publications (1)

Publication Number Publication Date
CN114590249A (en) 2022-06-07

Family

ID=81807794

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210210420.7A Pending CN114590249A (en) 2022-03-04 2022-03-04 Unmanned equipment control method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN114590249A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117382593A (en) * 2023-12-08 2024-01-12 之江实验室 Vehicle emergency braking method and system based on laser point cloud filtering
CN117382593B (en) * 2023-12-08 2024-04-05 之江实验室 Vehicle emergency braking method and system based on laser point cloud filtering

Similar Documents

Publication Publication Date Title
CN110709911B (en) Travel assist method for travel assist device and travel assist device
CN111547053B (en) Automatic driving control method and system based on vehicle-road cooperation
CN108688660B (en) Operating range determining device
CN109416883B (en) Vehicle control method and vehicle control device
US20170320521A1 (en) Drive Assist Device
CN111033510A (en) Method and device for operating a driver assistance system, driver assistance system and motor vehicle
US20220227392A1 (en) Vehicle control device, vehicle control method, and automatic driving method
JP2015061776A (en) Consistent behavior generation of predictive advanced drive support system
US11472439B2 (en) Vehicle control system and vehicle control method
US20200174113A1 (en) Omnidirectional sensor fusion system and method and vehicle including the same
US20220153266A1 (en) Vehicle adaptive cruise control system, method and computer readable medium for implementing the method
JP7037956B2 (en) Vehicle course prediction method, vehicle travel support method, and vehicle course prediction device
CN113895456A (en) Intersection driving method and device for automatic driving vehicle, vehicle and medium
US20220274594A1 (en) Control system and control method
CN113968243B (en) Obstacle track prediction method, device, equipment and storage medium
CN114590249A (en) Unmanned equipment control method, device, equipment and storage medium
JPWO2020058740A1 (en) Vehicle behavior prediction method and vehicle behavior prediction device
CN113074748A (en) Path planning method and device for unmanned equipment
JP2020019301A (en) Behavior decision device
JP7223588B2 (en) Driving characteristic estimation method and driving characteristic estimation device
JP7143893B2 (en) Vehicle behavior prediction method and vehicle behavior prediction device
CN112365730A (en) Automatic driving method, device, equipment, storage medium and vehicle
CN117382593B (en) Vehicle emergency braking method and system based on laser point cloud filtering
US20230311879A1 (en) Autonomous driving control apparatus and method thereof
CN113715846B (en) Lane borrowing control method and device, storage medium and vehicle

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination