CN113966496A - Control method, control device, movable platform and computer readable storage medium - Google Patents


Info

Publication number
CN113966496A
CN113966496A (application CN202080039444.XA)
Authority
CN
China
Prior art keywords
parameter condition
parameter
position information
echo energy
scene
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202080039444.XA
Other languages
Chinese (zh)
Inventor
王俊喜
王石荣
祝煌剑
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
SZ DJI Technology Co Ltd
Original Assignee
SZ DJI Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by SZ DJI Technology Co Ltd filed Critical SZ DJI Technology Co Ltd
Publication of CN113966496A
Legal status: Pending

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64C AEROPLANES; HELICOPTERS
    • B64C1/00 Fuselages; Constructional features common to fuselages, wings, stabilising surfaces or the like
    • B64C1/36 Fuselages; Constructional features common to fuselages, wings, stabilising surfaces or the like adapted to receive antennas or radomes
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88 Radar or analogous systems specially adapted for specific applications
    • G01S13/93 Radar or analogous systems specially adapted for specific applications for anti-collision purposes
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/10 Simultaneous control of position or course in three dimensions
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/12 Target-seeking control

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Mechanical Engineering (AREA)
  • Electromagnetism (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Traffic Control Systems (AREA)

Abstract

Disclosed is a control method including: acquiring echo energy and position information corresponding to each scanning point in a plurality of scanning points obtained by scanning an environment with a radar (630) of a movable platform; acquiring a set parameter condition set, wherein the parameter condition set includes an echo energy parameter condition; determining target scanning points among the plurality of scanning points according to the parameter condition set, wherein the echo energy of a target scanning point meets the echo energy parameter condition; and controlling the movable platform to execute obstacle avoidance movement according to the position information of the target scanning points. The method solves the technical problem that an unmanned aerial vehicle reduces its operation efficiency by performing obstacle avoidance against weak, small obstacles.

Description

Control method, control device, movable platform and computer readable storage medium
Technical Field
The present application relates to the field of motion control technologies, and in particular, to a control method, a control device, a movable platform, and a computer-readable storage medium.
Background
The obstacle avoidance function is an important function of an unmanned aerial vehicle. For an agricultural unmanned aerial vehicle, the obstacle avoidance function is usually realized on the basis of radar obstacle avoidance technology.
To guarantee the flight safety of an agricultural unmanned aerial vehicle, the sensitivity of the radar is usually fixed at a high level. At this sensitivity, the point cloud obtained when the radar scans the environment is comparatively comprehensive and includes the scanning points corresponding to various objects. By clustering the point cloud, point cloud clusters corresponding to different objects can be obtained; according to the position information of each point cloud cluster, the unmanned aerial vehicle can identify the obstacles among them and perform obstacle avoidance movement with respect to those obstacles.
Disclosure of Invention
To solve the technical problem that an unmanned aerial vehicle reduces its operation efficiency by performing obstacle avoidance against weak, small obstacles, the embodiments of the present application provide a control method, a control device, a movable platform and a computer-readable storage medium.
A first aspect of an embodiment of the present application provides a control method, including:
acquiring echo energy and position information corresponding to each scanning point in a plurality of scanning points obtained by environment scanning of a radar of a movable platform;
acquiring a set parameter condition set; wherein the set of parameter conditions comprises echo energy parameter conditions;
determining a target scanning point in the plurality of scanning points according to the parameter condition set; wherein the echo energy of the target scanning point conforms to the echo energy parameter condition;
and controlling the movable platform to execute obstacle avoidance movement according to the position information of the target scanning point.
A second aspect of the embodiments of the present application provides a control apparatus, including: a processor and a memory storing a computer program;
the processor, when executing the computer program, implements the steps of:
acquiring echo energy and position information corresponding to each scanning point in a plurality of scanning points obtained by environment scanning of a radar of a movable platform;
acquiring a set parameter condition set; wherein the set of parameter conditions comprises echo energy parameter conditions;
determining a target scanning point in the plurality of scanning points according to the parameter condition set; wherein the echo energy of the target scanning point conforms to the echo energy parameter condition;
and controlling the movable platform to execute obstacle avoidance movement according to the position information of the target scanning point.
A third aspect of embodiments of the present application provides a movable platform, including: the radar detection device comprises a machine body, a power device connected with the machine body, a radar arranged on the machine body, a processor and a memory storing a computer program;
the processor, when executing the computer program, implements the steps of:
acquiring echo energy and position information corresponding to each scanning point in a plurality of scanning points obtained by the environment scanning of the radar;
acquiring a set parameter condition set; wherein the set of parameter conditions comprises echo energy parameter conditions;
determining a target scanning point in the plurality of scanning points according to the parameter condition set; wherein the echo energy of the target scanning point conforms to the echo energy parameter condition;
and controlling the movable platform to execute obstacle avoidance movement according to the position information of the target scanning point.
A fourth aspect of embodiments of the present application provides a computer-readable storage medium, which stores a computer program, and when the computer program is executed by a processor, the computer program implements any one of the control methods provided in the first aspect.
According to the control method provided by the embodiments of the present application, scanning points whose echo energy does not meet the echo energy parameter condition are filtered out of the plurality of scanning points obtained by radar scanning. The filtered-out scanning points correspond to negligible, weak obstacles; that is, among the retained target scanning points, the scanning points corresponding to obstacles are those of non-negligible obstacles. An obstacle avoidance function realized on the basis of the target scanning points therefore does not take weak obstacles into account, and the weak obstacles can be ignored during obstacle avoidance.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed to be used in the description of the embodiments are briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present application, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without inventive labor.
Fig. 1 is a schematic view of an operation scene of an unmanned aerial vehicle provided in an embodiment of the present application.
Fig. 2 is a flowchart of a control method according to an embodiment of the present application.
Fig. 3 is a schematic view of another operation scenario of an unmanned aerial vehicle according to an embodiment of the present application.
Fig. 4 is a user interaction interface provided in an embodiment of the present application.
Fig. 5 is a schematic structural diagram of a control device according to an embodiment of the present application.
Fig. 6 is a schematic structural diagram of a movable platform according to an embodiment of the present disclosure.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
Obstacle avoidance is an important function of unmanned aerial vehicles. When the obstacle avoidance function is enabled, the unmanned aerial vehicle can automatically detect obstacles on its flight path and evade them. There are various methods of obstacle avoidance, such as visual obstacle avoidance and radar obstacle avoidance. An agricultural unmanned aerial vehicle relies more on radar when realizing the obstacle avoidance function. Because many operation tasks of an agricultural unmanned aerial vehicle involve spraying agricultural materials such as pesticides and fertilizers, the water mist formed during spraying can greatly interfere with vision and sharply reduce the effectiveness of visual obstacle avoidance. On the other hand, the working hours of an agricultural unmanned aerial vehicle are not necessarily in the daytime: it may begin operation before dawn or continue after dark, and insufficient light likewise degrades visual obstacle avoidance to a great extent.
When the obstacle avoidance function is realized on the basis of radar, the sensitivity of the radar is usually fixed at a high level to ensure that the unmanned aerial vehicle does not collide with obstacles. In some scenarios, however, the applicant has found that users do not want such a high radar sensitivity. Taking an agricultural unmanned aerial vehicle as an example, various obstacles exist in its working environment, but in the user's view not all of them need to be avoided.
Referring to fig. 1, fig. 1 is a schematic view of an operation scene of an unmanned aerial vehicle provided in an embodiment of the present application. As shown in fig. 1, a reed rises above a paddy field and falls into the flight path of the drone, so the existing high-sensitivity radar accurately detects the reed and treats it as an obstacle to be avoided. For some users, however, although the reed is located on the flight path, they believe the drone can simply ignore the reed and fly straight past: the weak reed will not affect the drone's normal flight, the risk of a crash is extremely low, and performing obstacle avoidance only reduces the operation efficiency. Such users are therefore likely to turn off the obstacle avoidance function of the drone during operation, and when a non-negligible obstacle (such as the telegraph pole in the figure) is encountered, a crash will occur.
In view of the above problems, the applicant has found through research that, in the point cloud data obtained by radar scanning, negligible and non-negligible obstacles differ greatly in the echo energy of their scanning points; specifically, the echo energy corresponding to a non-negligible obstacle is consistently higher than that corresponding to a negligible one. For example, when a reed and a telegraph pole are both obstacles, the echo energy of the scanning points corresponding to the telegraph pole is obviously higher than that of the scanning points corresponding to the reed. Based on this, the applicant proposes the following idea: by setting a condition on echo energy, the plurality of scanning points obtained by radar scanning can be filtered so that the scanning points corresponding to weak obstacles are removed, and the weak obstacles can be ignored during obstacle avoidance.
Referring to fig. 2, fig. 2 is a flowchart of a control method provided in an embodiment of the present application. The method can be applied to the obstacle avoidance function of a movable platform. The movable platform can be any object capable of moving: a small electronic device such as an unmanned aerial vehicle, a small robot or a remote-control car, or a large machine such as an automobile, an airplane or a ship. The movable platform may be equipped with a radar, which may be any radar with environment scanning capability, such as a laser radar or an electromagnetic wave (e.g. millimeter wave) radar, and which may be fixedly or detachably connected to the movable platform.
The control method provided by the embodiment of the application can comprise the following steps:
step S201, obtaining echo energy and position information corresponding to each scanning point in a plurality of scanning points obtained by scanning an environment with a radar of a movable platform.
And step S202, acquiring a set parameter condition set.
Wherein the set of parameter conditions includes echo energy parameter conditions.
And S203, determining a target scanning point with echo energy meeting the echo energy parameter condition from the plurality of scanning points according to the parameter condition set.
And S204, controlling the movable platform to execute obstacle avoidance movement according to the position information of the target scanning point.
By rotationally scanning the environment with the radar carried on the movable platform, a plurality of scanning points corresponding to the environment can be obtained. The set of scanning points acquired by the radar may also be referred to as a point cloud, and each scanning point in the point cloud may correspond to several items of data, including echo energy and position information and possibly other information or data as well.
The parameter condition set is a set of parameter conditions and includes an echo energy parameter condition. The echo energy parameter condition can be used to determine the target scanning points, specifically to determine, from the plurality of scanning points acquired by the radar, the target scanning points whose echo energy meets the echo energy parameter condition. This process in effect filters out, or discards, a portion of the scanning points collected by the radar; when the echo energy parameter condition is set properly, the filtered-out scanning points correspond to the scanning points of negligible, weak obstacles.
The echo energy parameter condition can take various forms. In one embodiment, the echo energy parameter condition may be that the echo energy corresponding to the scanning point is greater than an echo energy threshold: with E denoting the echo energy of a scanning point, the condition may be written E > E_set, where E_set represents the echo energy threshold. In another embodiment, the echo energy parameter condition may be that the echo energy corresponding to the scanning point falls within an echo energy range, for example E_min < E < E_max, where E_min represents the minimum and E_max the maximum of the echo energy range.
After the target scanning point is determined according to the echo energy parameter condition, the obstacle can be determined according to the position information of the target scanning point, and the movable platform is controlled to perform obstacle avoidance movement on the obstacle.
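The two condition forms and the filtering step above can be sketched as follows; the `ScanPoint` type and all function names are illustrative, not part of the application:

```python
from dataclasses import dataclass

@dataclass
class ScanPoint:
    energy: float    # echo energy E of the scanning point
    position: tuple  # (x, y, z) position information in some reference frame

def filter_by_energy(points, e_set=None, e_range=None):
    """Return the target scanning points whose echo energy meets the
    threshold condition E > E_set and/or the range condition
    E_min < E < E_max (whichever is supplied)."""
    def ok(p):
        if e_set is not None and not (p.energy > e_set):
            return False
        if e_range is not None:
            e_min, e_max = e_range
            if not (e_min < p.energy < e_max):
                return False
        return True
    return [p for p in points if ok(p)]
```

Applied to a point cloud, only the high-energy (non-negligible) returns survive, while weak returns such as reeds are dropped before any obstacle is declared.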
According to the control method provided by the embodiment of the application, scanning points with echo energy which is not in accordance with the echo energy parameter condition are filtered from a plurality of scanning points obtained by radar scanning, the part of the filtered scanning points correspond to negligible scanning points of weak obstacles, that is, scanning points corresponding to obstacles in reserved target scanning points are scanning points of non-negligible obstacles, so that the weak obstacles cannot be considered in the obstacle avoidance function realized based on the target scanning points, and the weak obstacles can be ignored in the obstacle avoidance function.
The echo energy parameter condition only filters out, on the basis of echo energy, the scanning points corresponding to negligible obstacles; determining an obstacle also requires attention to where a scanning point is located. For example, among the scanning points obtained by scanning, the scanning points corresponding to a stone pillar may meet the echo energy parameter condition, but the pillar may be very low. If the movable platform is an unmanned aerial vehicle that flies at a level higher than the stone pillar during operation, the stone pillar should not be considered an obstacle, or at least not an obstacle the unmanned aerial vehicle needs to pay attention to.
Based on the above, an obstacle should not only have sufficient strength (distinguishable by the echo energy parameter condition) but should also fall within a safe distance of the movable platform, on its path of movement or on its operating plane. Thus, in one embodiment, the parameter condition set may also include a position information parameter condition. When determining target scanning points according to the parameter condition set, it is then necessary to determine both whether the echo energy of a scanning point meets the echo energy parameter condition and whether its position information meets the position information parameter condition; only a scanning point whose echo energy meets the echo energy parameter condition and whose position information also meets the position information parameter condition is determined as a target scanning point.
The form of the position information parameter condition corresponds to the position information of the scanning points. In the point cloud collected by the radar, the position information of a scanning point can include one or more of the following: distance, azimuth angle, pitch angle, coordinates in the body coordinate system, coordinates in the coordinate system corresponding to the horizontal plane where the body is located, height relative to the ground plane, and height relative to the plants. Regarding the body coordinate system and the coordinate system corresponding to the horizontal plane where the body is located, reference is made to fig. 3, which is another schematic view of an operation scene of an unmanned aerial vehicle provided in an embodiment of the present application. As shown in fig. 3, the body coordinate system is composed of the three coordinate axes X0, Y0 and Z0 (the Y0 axis is not shown); it is established on the basis of the drone's body, is fixed to the body, and changes as the drone's attitude changes. The coordinate system corresponding to the horizontal plane where the body is located may also be referred to as the horizontal plane coordinate system; as shown in fig. 3, it comprises the three coordinate axes X1, Y1 and Z1 (the Y1 axis is not shown), where the plane spanned by the X1 and Y1 axes is the horizontal plane where the drone is located. The height of a scanning point relative to the ground plane and its height relative to the plants can be calculated by a terrain detection algorithm, which is prior art and is not explained here.
Corresponding to the fact that the position information may include several different items of data, the position information parameter condition may also include conditions on those different items. For example, the coordinates of a scanning point in the horizontal plane coordinate system may be written (x1, y1, z1); in one example, the position information parameter condition may include z1 > -0.5, which means that a target scanning point should fall within the space extending 0.5 m below the horizontal plane of the body. As another example, with h0 denoting the height of a scanning point relative to the ground plane and h1 its height relative to the plants, the position information parameter condition may further include conditions such as h0 > 1 and h1 > 0.5.
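As a minimal sketch of the joint check (function and parameter names are invented; the default thresholds follow the example values above):

```python
def is_target_point(energy, z1, h0, h1,
                    e_set=10000.0, z1_min=-0.5, h0_min=1.0, h1_min=0.5):
    """A scanning point is a target point only when its echo energy and
    all of its position data meet their respective conditions."""
    return (energy > e_set     # echo energy parameter condition E > E_set
            and z1 > z1_min    # within 0.5 m below the body's horizontal plane
            and h0 > h0_min    # at least 1 m above the ground plane
            and h1 > h1_min)   # at least 0.5 m above the plants
```

A strong return far below the drone, or a weak return right on its path, both fail the combined test.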
It should be noted that in the above embodiment the position information parameter condition is an element of the parameter condition set; in actual implementation, however, it does not have to belong to the parameter condition set and may be acquired in a separate step, independently of the parameter condition set.
When the movable platform is controlled to execute obstacle avoidance movement according to the position information of the target scanning points, the target scanning points can first be clustered so that the scanning points corresponding to different obstacles form corresponding point cloud clusters. The position information of a point cloud cluster can then be determined from the position information of the scanning points within it, and may include the coordinates of the cluster's center point and its three-dimensional size. Finally, the movable platform is controlled to avoid the point cloud clusters according to their position information, thereby realizing obstacle avoidance.
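The clustering and cluster-description steps can be illustrated with a naive single-link sketch; the radius value and function names are assumptions, and a real implementation would use a proper point cloud clustering algorithm:

```python
from collections import deque

def cluster_points(points, radius=1.0):
    """Naive single-link clustering: points closer than `radius` join a cluster."""
    def close(a, b):
        return sum((u - v) ** 2 for u, v in zip(a, b)) <= radius ** 2
    unvisited, clusters = set(range(len(points))), []
    while unvisited:
        seed = unvisited.pop()
        queue, cluster = deque([seed]), [seed]
        while queue:
            i = queue.popleft()
            neighbours = [j for j in unvisited if close(points[i], points[j])]
            for j in neighbours:
                unvisited.remove(j)
                queue.append(j)
            cluster.extend(neighbours)
        clusters.append([points[i] for i in cluster])
    return clusters

def cluster_info(cluster):
    """Center point and three-dimensional size of one point cloud cluster."""
    lo = [min(p[k] for p in cluster) for k in range(3)]
    hi = [max(p[k] for p in cluster) for k in range(3)]
    center = tuple((l + h) / 2 for l, h in zip(lo, hi))
    size = tuple(h - l for l, h in zip(lo, hi))
    return center, size
```

The avoidance controller would then treat each cluster's center and extent as one obstacle to steer around.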
By screening the plurality of scanning points acquired by the radar with the parameter condition set (comprising the echo energy parameter condition and the position information parameter condition), the scanning points corresponding to non-negligible obstacles, i.e. the target scanning points, can be determined. Considering that which obstacles are negligibly weak and which are non-negligibly strong differs between environments, in one embodiment a different parameter condition set may be set for each environment when the specific conditions in the parameter condition set are determined; specifically, the parameter condition set may be determined according to the scene category of the environment where the movable platform is located.
The parameter condition sets corresponding to different scene categories may be determined in advance. For example, the obstacle avoidance function of the movable platform can be tuned beforehand in the environments corresponding to different scene categories, so that the parameter condition set best suited to each scene category is obtained. "Best suited" means that when the parameter condition set is applied, the movable platform automatically ignores small, weak obstacles in the environment of that scene category while maintaining its obstacle avoidance capability for strong obstacles (for convenience of description, this may be called the more intelligent obstacle avoidance function). For example, in the application scenario of an agricultural unmanned aerial vehicle, the scene categories can be corn field, paddy field, sorghum field and so on, and the movable platform can be a drone. Test operations can then be carried out in a corn field, a paddy field and a sorghum field respectively, the environment analysed in combination with the experience of pilots, experts and other personnel, the negligible and non-negligible obstacles in each environment determined, and the parameter condition set adjusted with the more intelligent obstacle avoidance function as the goal, thereby determining the most suitable parameter condition set for the current scene category. After the most suitable parameter condition set for each scene category has been determined, a correspondence between scene categories and parameter condition sets can be established, so that in application the corresponding parameter condition set can be matched directly from the scene category.
The tuning process described above can be illustrated with a specific example. For the corn field scene category, when the corresponding parameter condition set is determined in advance, the corn field can be scanned by the radar carried on the drone to obtain the point cloud (a plurality of scanning points) corresponding to the corn field. The point cloud can be clustered, and after point cloud clusters corresponding to different objects have been obtained, the clusters corresponding to obstacles can be analysed. The obstacles in the corn field include, for example, telegraph poles, reeds and corn leaves, of which reeds and tall corn leaves can be considered negligible obstacles and telegraph poles non-negligible ones. To determine the echo energy parameter condition, the echo energy of the scanning points in the point cloud clusters corresponding to telegraph poles, reeds and corn leaves can be analysed. For example, the analysis result may be that the echo energy of the scanning points corresponding to the telegraph pole is greater than 10000, while the echo energy of the scanning points corresponding to the reed is in the range of 5000–… . To realize the more intelligent obstacle avoidance function, the echo energy parameter condition in the parameter condition set can then be set to E > 10000 (E representing the echo energy of a scanning point); when this parameter condition set is applied, only the telegraph pole is treated as an obstacle to be avoided, and the reeds and corn leaves are ignored.
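Under the assumption that the measured energy ranges of the two obstacle classes do not overlap (as in the example above), the tuning step of choosing a threshold between them could be sketched as:

```python
def pick_energy_threshold(negligible_energies, strong_energies):
    """Choose E_set midway between the strongest negligible echo and the
    weakest non-negligible echo, so the two classes are cleanly separated."""
    upper = max(negligible_energies)   # strongest reed / corn-leaf return
    lower = min(strong_energies)       # weakest telegraph-pole return
    if lower <= upper:
        raise ValueError("energy ranges overlap; no clean threshold exists")
    return (upper + lower) / 2
```

The function name and the midpoint rule are illustrative; in practice the threshold would be refined with pilot and expert feedback as the text describes.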
The position information parameter condition can be determined flexibly according to actual needs, the flight experience of pilots, and so on. For example, if the drone needs to fly more safely, then taking the condition on z1 (the height of a scanning point relative to the horizontal plane of the drone) as an example, z1 > -1 may be set, i.e. the space extending 1 m below the horizontal plane of the drone is included in the observation range, and scanning points falling in this range may be identified as target scanning points. If the drone does not need to watch so large a range, for example in a scene where it operates at a fixed height of 0.5 m above the plants (a height of 0.5 m above the plants gives a better spraying effect than 1 m), then z1 > -1 is obviously too large an observation range; z1 > -0.5 can be set instead, taking into the observation range the space extending 0.5 m below the horizontal plane where the drone is located.
When the parameter condition set corresponding to a scene category is determined in advance, in one embodiment this can also be implemented with a neural network model. Specifically, a training sample set may be built from training samples whose input is the environment data corresponding to each of a number of already-determined scene categories and whose output is the parameter condition set corresponding to that scene category, and the neural network model may be trained with this training sample set. To determine the parameter condition set for a new scene category, environment data corresponding to that scene category can be collected and fed into the trained neural network model, which then outputs the parameter condition set for the new scene category.
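The text describes a neural network model; as a simple illustrative stand-in (not the patent's model), a 1-nearest-neighbour lookup over the same kind of training samples shows the input/output shape involved. All feature vectors and parameter values here are invented:

```python
import math

# (environment feature vector -> parameter condition set) training samples;
# a trained neural network would generalise over such pairs.
TRAINING_SAMPLES = [
    ((2.2, 0.8), {"E_set": 10000, "z1_min": -0.5}),  # e.g. corn-field-like data
    ((0.4, 0.2), {"E_set": 6000,  "z1_min": -1.0}),  # e.g. paddy-field-like data
]

def predict_parameter_set(features):
    """Return the parameter set of the closest known environment."""
    _, params = min(TRAINING_SAMPLES,
                    key=lambda sample: math.dist(sample[0], features))
    return params
```

The nearest-neighbour rule is only a stand-in for the trained model's inference step; the mapping it represents (environment data in, parameter condition set out) is the point.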
Considering that some users may want the drone's obstacle avoidance function (a drone is taken as the example, but the movable platform is not limited to drones) to ignore weak obstacles, while other users put the drone's flight safety first and expect it to collide with no obstacle at all, at least two different parameter condition sets can be determined for each scene category during tuning. Continuing the corn field example above, two parameter condition sets may be set for the corn field scene category: the first may include the echo energy parameter condition E > 5000 and the position information parameter conditions z1 > -1, h0 > 1 and h1 > 0.5, and the second may include the echo energy parameter condition E > 10000 and the position information parameter conditions z1 > -0.5, h0 > 1 and h1 > 0.5.
It can be understood that the first parameter condition set corresponds to a high-sensitivity obstacle avoidance function: when it is applied, the drone avoids obstacles with low echo energy, such as reeds and corn leaves, and scanning points within 1 m below the drone are considered scanning points corresponding to obstacles. The second parameter condition set corresponds to a low-sensitivity obstacle avoidance function: when it is applied, the drone ignores obstacles with low echo energy, such as reeds and corn leaves, and only scanning points within 0.5 m below the drone are considered scanning points corresponding to obstacles.
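The two sensitivity levels above can be sketched as follows. The thresholds are the example values from the text; the dict representation and the simplified two-condition obstacle test are assumptions made for illustration only.

```python
# Sketch of the two parameter condition sets for the corn-field scene.
CORN_FIELD_PARAMETER_SETS = {
    "high_sensitivity": {"E": 5000,  "z1": -1.0, "h0": 1.0, "h1": 0.5},
    "low_sensitivity":  {"E": 10000, "z1": -0.5, "h0": 1.0, "h1": 0.5},
}

def is_obstacle_point(point, params):
    """A scanning point counts as an obstacle point when its echo energy
    and its height relative to the drone both meet the conditions."""
    return point["E"] > params["E"] and point["z1"] > params["z1"]

# A reed-like point with low echo energy, 0.3 m below the drone.
reed = {"E": 6000, "z1": -0.3}
```

Under the high-sensitivity set the reed-like point is treated as an obstacle; under the low-sensitivity set its echo energy falls below the threshold and it is ignored.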
Since at least two different parameter condition sets are determined for each scene category, in application, after the scene category of the current environment is determined, the at least two parameter condition sets corresponding to that scene category can be provided for the user to choose from, and one of them can be determined according to the user's selection instruction as the parameter condition set used by the drone during operation. Referring to fig. 4, fig. 4 shows a possible user interaction interface in which the user can switch the sensitivity level of the obstacle avoidance function to any one of high, medium, and low by touching the sensitivity control.
When determining target scanning points, although a target scanning point satisfies both the echo energy parameter condition and the position information parameter condition, this does not necessarily mean that every position information condition in the set must be met. For example, the parameter condition set may include the echo energy parameter condition E > 10000 and the position information parameter conditions z1 > -0.5, h0 > 1, and h1 > 0.5; then a scanning point satisfying E > 10000 and z1 > -0.5 may be taken as a target scanning point, and a scanning point satisfying E > 10000, h0 > 1, and h1 > 0.5 may also be taken as a target scanning point. That is, which of the echo energy parameter condition and the position information parameter conditions a target scanning point must specifically meet can be set flexibly as required.
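The flexible grouping just described can be sketched as a filter in which the echo energy condition is always required but only one group of position information conditions needs to hold. The grouping into a "z1" mode versus an "h0/h1" mode follows the example above; the mode names and thresholds-as-constants form are assumptions.

```python
# Sketch of target scanning point determination with selectable
# position information condition groups.

def is_target_point(point, mode="z1"):
    if point["E"] <= 10000:          # echo energy parameter condition
        return False
    if mode == "z1":                 # height relative to the drone's plane
        return point["z1"] > -0.5
    # alternative grouping: heights relative to ground plane and plants
    return point["h0"] > 1 and point["h1"] > 0.5

p = {"E": 12000, "z1": -0.8, "h0": 1.2, "h1": 0.6}
```

The sample point fails the z1 grouping (it lies 0.8 m below the drone) but qualifies under the h0/h1 grouping.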
There may be various embodiments for determining the scene category corresponding to the environment. In one embodiment, the scene category corresponding to the current working environment can be determined through interaction with the user, according to an instruction input by the user. In another embodiment, the environment may be automatically identified to determine the corresponding scene category. Specifically, when identifying the environment, in one embodiment, the point cloud data obtained by scanning the environment with the radar may be used for identification. For example, a scene recognition model that takes point cloud data as input and a scene category as output may be trained in advance, and the scene category corresponding to the environment may be determined by this model.
In another embodiment, the scene category may also be identified by image recognition techniques. Specifically, the movable platform may be equipped with a camera; the current working environment is photographed by the camera to obtain a scene image corresponding to the environment, and the scene category corresponding to the environment can be determined by recognizing the scene image. When recognizing the scene image, it may be input into a pre-trained scene recognition model, which outputs the scene category corresponding to the scene image, thereby determining the scene category corresponding to the environment. The scene recognition model may be a convolutional neural network model; other models are of course optional and are not described here again.
Although the parameter condition set applied by the obstacle avoidance function may be determined according to the scene category corresponding to the environment, in some scenes the determined parameter condition set may still not meet the user's requirements. Owing to the complexity and variability of working environments, the parameter condition set determined for a scene category in the preliminary work often corresponds only to the most common environment of that category, and the actual working environment may well differ from it. For example, the actual environment may contain special obstacles not usually found in the corresponding scene category: the obstacles in a corn field are usually high-stalk corn leaves or telegraph poles, but the corn field cultivated by a particular user may additionally contain bamboo poles. If, for this user, the bamboo poles are weak and small obstacles that need not be avoided, yet the parameter condition set determined for corn fields in the preliminary work did not consider the special case of bamboo poles, then a drone operating with that parameter condition set will likely avoid the bamboo poles, which is not intelligent enough and fails to meet the user's requirements.
In view of the foregoing situation, embodiments of the present application provide an implementation in which the obstacles the user wants to ignore are determined through interaction with the user, and the current parameter condition set is modified according to the negligible obstacle selected by the user. It can be understood that the current parameter condition set may also be regarded as an initial parameter condition set, which may be a default parameter condition set provided by the system when the drone is started, or may be the parameter condition set determined according to the scene category corresponding to the environment as mentioned above.
Specifically, during the modification, the object category of the negligible obstacle selected by the user may be determined; in one example the object category may be bamboo pole, reed, or the like. After the object category of the negligible obstacle is determined, the characteristic parameters corresponding to that object category can be further determined, and the initial parameter condition set corrected according to these characteristic parameters. When determining the characteristic parameters corresponding to an object category, in one embodiment, matching may be performed based on a pre-configured correspondence; specifically, the pre-configured correspondence may include the characteristic parameters corresponding to various object categories, for example those corresponding to bamboo poles, those corresponding to reeds, and so on. This correspondence may be determined in advance in the preliminary work; for example, point cloud data may be acquired for each object category by scanning an obstacle of that category with the radar. After the point cloud data corresponding to the object category is obtained, feature extraction can be performed on it to obtain the corresponding characteristic parameters.
In another embodiment, the characteristic parameters corresponding to the object category may also be extracted in real time. For example, after the negligible obstacle selected by the user is determined, the currently acquired point cloud data corresponding to that obstacle is analyzed, its characteristic parameters are extracted in real time, and the current parameter condition set is corrected according to them.
For ease of understanding, the following example illustrates how the parameter condition set may be corrected according to the characteristic parameters. Suppose that, in the parameter condition set determined for the corn-field scene in the preliminary work, the echo energy parameter condition includes E > 6000, and the negligible obstacle selected by the user is a bamboo pole whose characteristic parameter indicates a corresponding echo energy of 6500. When the parameter condition set is corrected according to this characteristic parameter, the echo energy parameter condition may be raised to E > 6600, so that the obstacle avoidance function ignores the bamboo pole and does not avoid it.
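The correction step in this example can be sketched as follows. The additive margin of 100 merely reproduces the example values from the text (6000 raised to 6600 for a bamboo pole with echo energy 6500) and is an assumption, not a prescribed value.

```python
# Sketch of correcting the echo energy parameter condition so that a
# negligible obstacle's scanning points no longer satisfy it.

def corrected_echo_threshold(current_threshold, obstacle_echo_energy,
                             margin=100):
    """Raise the threshold just above the obstacle's characteristic echo
    energy; never lower an already-higher threshold."""
    return max(current_threshold, obstacle_echo_energy + margin)
```

With the text's values, a threshold of 6000 and a bamboo-pole echo energy of 6500 yield a corrected threshold of 6600, so the bamboo pole's points are filtered out.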
There are several alternative embodiments for determining the object category corresponding to the negligible obstacle selected by the user. In one embodiment, the object category may be determined through interaction with the user; for example, the user may be provided with a selection page including a plurality of object categories, so that the object category corresponding to the negligible obstacle can be determined according to the user's selection. In another embodiment, the object category corresponding to the negligible obstacle may also be determined by image recognition techniques. For example, the working scene can be photographed by a camera mounted on the movable platform; the user can then frame-select the obstacle to be ignored in the captured picture, and the image corresponding to the negligible obstacle can be cropped out according to the frame selection instruction input by the user, so that the object category corresponding to the negligible obstacle can be determined by recognizing that image.
When recognizing the image corresponding to the negligible obstacle, similarly to the recognition of the scene image described above, a corresponding obstacle recognition model may be trained in advance, so that in application the frame-selected image corresponding to the negligible obstacle need only be input into the obstacle recognition model to obtain the object category it outputs. Similarly, the obstacle recognition model may be a convolutional neural network model.
The above is a detailed description of the control method provided in the embodiments of the present application. According to this method, scanning points whose echo energy does not meet the echo energy parameter condition are filtered out of the plurality of scanning points obtained by radar scanning. The filtered-out scanning points are the negligible scanning points corresponding to weak and small obstacles; that is, among the retained target scanning points, the scanning points corresponding to obstacles are those of non-negligible obstacles. Weak and small obstacles are therefore not considered in the obstacle avoidance function realized based on the target scanning points, and are thus ignored by that function.
In addition, the method provided by the embodiments of the present application can provide the user with the parameter condition set most suitable for the specific working environment; multiple parameter condition sets may be provided, each corresponding to a different sensitivity level, so as to meet various users' requirements for the obstacle avoidance function. Moreover, the parameter condition set can be adjusted in a personalized manner according to the obstacles the user wants to ignore, so that the obstacle avoidance function meets the user's requirements to the greatest extent, ignores the obstacles selected by the user, and achieves the intelligence the user expects.
Referring to fig. 5, fig. 5 is a schematic structural diagram of a control device according to an embodiment of the present disclosure. The control device provided by the embodiment of the application may take various specific product forms. For example, the control device may itself be a stand-alone product, or may be part of other electronic equipment such as a radar or a drone controller. The control device includes: a processor 510 and a memory 520 storing a computer program;
the processor 510, when executing the computer program, realizes the following steps:
acquiring echo energy and position information corresponding to each scanning point in a plurality of scanning points obtained by environment scanning of a radar of a movable platform;
acquiring a set parameter condition set; wherein the set of parameter conditions comprises echo energy parameter conditions;
determining a target scanning point in the plurality of scanning points according to the parameter condition set; wherein the echo energy of the target scanning point conforms to the echo energy parameter condition;
and controlling the movable platform to execute obstacle avoidance movement according to the position information of the target scanning point.
Optionally, the parameter condition set further includes a location information parameter condition; and the determined position information of the target scanning point conforms to the position information parameter condition.
Optionally, the processor is specifically configured to cluster the target scanning points to obtain a point cloud cluster when executing the step of controlling the movable platform to execute obstacle avoidance motion according to the position information of the target scanning points; determining the position information of the point cloud cluster according to the position information of the target scanning point in the point cloud cluster; and controlling the movable platform to execute obstacle avoidance movement according to the position information of the point cloud cluster.
Optionally, the position information of the point cloud cluster includes a center point coordinate of the point cloud cluster and a three-dimensional size of the point cloud cluster.
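As an illustrative aside, the clustering step and the cluster position information described above can be sketched as follows. The greedy single-linkage grouping, the distance threshold, and the axis-aligned extent as the "three-dimensional size" are assumptions made for illustration; the patent does not prescribe a particular clustering algorithm.

```python
# Sketch: group target scanning points into point cloud clusters, then
# describe each cluster by a centre point and a three-dimensional size.

def cluster_points(points, max_dist=1.0):
    """Greedy grouping: a point joins the first cluster containing a
    point within max_dist of it, else starts a new cluster."""
    clusters = []
    for p in points:
        for c in clusters:
            if any(sum((a - b) ** 2 for a, b in zip(p, q)) <= max_dist ** 2
                   for q in c):
                c.append(p)
                break
        else:
            clusters.append([p])
    return clusters

def cluster_position(cluster):
    """Centre point coordinates and axis-aligned extent of a cluster."""
    n = len(cluster)
    center = tuple(sum(p[i] for p in cluster) / n for i in range(3))
    size = tuple(max(p[i] for p in cluster) - min(p[i] for p in cluster)
                 for i in range(3))
    return center, size

clusters = cluster_points([(0.0, 0.0, 0.0), (0.5, 0.0, 0.0), (5.0, 5.0, 5.0)])
```

The two nearby points form one cluster representing a single obstacle, and the far point forms another; obstacle avoidance then uses the per-cluster centre and size rather than individual points.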
Optionally, the echo energy parameter condition includes that the echo energy corresponding to the scanning point is greater than an echo energy threshold.
Optionally, the parameter condition set is determined according to a scene category corresponding to the environment.
Optionally, when the step of determining the parameter condition set according to the scene category is executed, the processor is specifically configured to determine the parameter condition set corresponding to the scene category according to a preset correspondence between the scene category and the parameter condition set.
Optionally, in the correspondence, one scene category corresponds to at least two parameter condition sets.
Optionally, the parameter condition sets are determined from the at least two parameter condition sets according to a selection instruction of a user.
Optionally, a camera is mounted on the movable platform, and the scene type is determined by identifying a scene image captured by the camera.
Optionally, when the step of recognizing the scene image is executed, the processor is specifically configured to input the scene image into a pre-trained scene recognition model, so as to obtain a scene category corresponding to the scene image output by the scene recognition model; wherein the scene recognition model is a convolutional neural network model.
Optionally, the parameter condition set is obtained by modifying the initial parameter condition set according to the negligible obstacle selected by the user.
Optionally, when the step of correcting the initial parameter condition set according to the negligible obstacle is executed, the processor is specifically configured to determine an object class corresponding to the negligible obstacle, determine a feature parameter corresponding to the object class, and correct the initial parameter condition set according to the feature parameter; the characteristic parameters are obtained by performing characteristic extraction on radar scanning data corresponding to the object type in advance.
Optionally, the object class corresponding to the negligible obstacle is determined by identifying an image corresponding to the negligible obstacle.
Optionally, a camera is mounted on the movable platform, and the image corresponding to the negligible obstacle is obtained by capturing the image captured by the camera according to a frame selection instruction input by a user.
Optionally, the position information corresponding to the scanning point includes one or more of the following: the distance, the azimuth angle, the pitch angle, the coordinates under a machine body coordinate system, the coordinates under a coordinate system corresponding to a horizontal plane where the machine body is located, the height relative to the ground plane and the height relative to the plant.
Optionally, the radar is a millimeter wave radar.
For the specific implementation of the control device in the above various embodiments, reference may be made to the corresponding description of the control method in the foregoing, and details are not described here again.
According to the echo energy parameter condition, the control device provided by the embodiments of the present application can filter out, from the plurality of scanning points obtained by radar scanning, those whose echo energy does not meet that condition. The filtered-out scanning points are the negligible scanning points corresponding to weak and small obstacles; that is, among the retained target scanning points, the scanning points corresponding to obstacles are those of non-negligible obstacles. Weak and small obstacles are therefore not considered in the obstacle avoidance function realized based on the target scanning points, and are thus ignored by that function.
In addition, the control device can provide the user with the parameter condition set most suitable for the specific working environment; multiple parameter condition sets may be provided, each corresponding to a different sensitivity level, so as to meet various users' requirements for the obstacle avoidance function. Moreover, the parameter condition set can be adjusted in a personalized manner according to the obstacles the user wants to ignore, so that the obstacle avoidance function meets the user's requirements to the greatest extent, ignores the obstacles selected by the user, and achieves the intelligence the user expects.
Referring to fig. 6, fig. 6 is a schematic structural diagram of a movable platform according to an embodiment of the present disclosure. The movable platform may include: a main body 610, a power unit 620 connected to the main body, a radar 630 mounted on the main body 610, a processor 611, and a memory 612 in which a computer program is stored;
the processor 611, when executing the computer program, performs the steps of:
acquiring echo energy and position information corresponding to each scanning point in a plurality of scanning points obtained by the environment scanning of the radar 630;
acquiring a set parameter condition set; wherein the set of parameter conditions comprises echo energy parameter conditions;
determining a target scanning point in the plurality of scanning points according to the parameter condition set; wherein the echo energy of the target scanning point conforms to the echo energy parameter condition;
and controlling the movable platform to execute obstacle avoidance movement according to the position information of the target scanning point.
Optionally, the parameter condition set further includes a location information parameter condition; and the determined position information of the target scanning point conforms to the position information parameter condition.
Optionally, the processor is specifically configured to cluster the target scanning points to obtain a point cloud cluster when executing the step of controlling the movable platform to execute obstacle avoidance motion according to the position information of the target scanning points; determining the position information of the point cloud cluster according to the position information of the target scanning point in the point cloud cluster; and controlling the movable platform to execute obstacle avoidance movement according to the position information of the point cloud cluster.
Optionally, the position information of the point cloud cluster includes a center point coordinate of the point cloud cluster and a three-dimensional size of the point cloud cluster.
Optionally, the echo energy parameter condition includes an echo energy threshold, and the echo energy of the target scanning point is higher than the echo energy threshold.
Optionally, the parameter condition set is determined according to a scene category corresponding to the environment.
Optionally, when the step of determining the parameter condition set according to the scene category is executed, the processor is specifically configured to determine the parameter condition set corresponding to the scene category according to a preset correspondence between the scene category and the parameter condition set.
Optionally, in the correspondence, one scene category corresponds to at least two parameter condition sets.
Optionally, the parameter condition sets are determined from the at least two parameter condition sets according to a selection instruction of a user.
Optionally, a camera is mounted on the movable platform, and the scene type is determined by identifying a scene image captured by the camera.
Optionally, when the step of recognizing the scene image is executed, the processor is specifically configured to input the scene image into a pre-trained scene recognition model, so as to obtain a scene category corresponding to the scene image output by the scene recognition model; wherein the scene recognition model is a convolutional neural network model.
Optionally, the parameter condition set is obtained by modifying the initial parameter condition set according to the negligible obstacle selected by the user.
Optionally, when the step of correcting the initial parameter condition set according to the negligible obstacle is executed, the processor is specifically configured to determine an object class corresponding to the negligible obstacle, determine a feature parameter corresponding to the object class, and correct the initial parameter condition set according to the feature parameter; the characteristic parameters are obtained by performing characteristic extraction on radar scanning data corresponding to the object type in advance.
Optionally, the object class corresponding to the negligible obstacle is determined by identifying an image corresponding to the negligible obstacle.
Optionally, a camera is mounted on the movable platform, and the image corresponding to the negligible obstacle is obtained by capturing the image captured by the camera according to a frame selection instruction input by a user.
Optionally, the position information corresponding to the scanning point includes one or more of the following: the distance, the azimuth angle, the pitch angle, the coordinates under a machine body coordinate system, the coordinates under a coordinate system corresponding to a horizontal plane where the machine body is located, the height relative to the ground plane and the height relative to the plant.
Optionally, the radar is a millimeter wave radar.
Optionally, the movable platform comprises a drone.
For the specific implementation of the movable platform in the above various embodiments, reference may be made to the corresponding description of the control method in the foregoing, and details are not described here again.
According to the echo energy parameter condition, the movable platform provided by the embodiments of the present application can filter out, from the plurality of scanning points obtained by radar scanning, those whose echo energy does not meet that condition. The filtered-out scanning points are the negligible scanning points corresponding to weak and small obstacles; that is, among the retained target scanning points, the scanning points corresponding to obstacles are those of non-negligible obstacles. Weak and small obstacles are therefore not considered in the obstacle avoidance function realized based on the target scanning points, and are thus ignored by that function.
In addition, the movable platform can provide the user with the parameter condition set most suitable for the specific working environment; multiple parameter condition sets may be provided, each corresponding to a different sensitivity level, so as to meet various users' requirements for the obstacle avoidance function. Moreover, the parameter condition set can be adjusted in a personalized manner according to the obstacles the user wants to ignore, so that the obstacle avoidance function meets the user's requirements to the greatest extent, ignores the obstacles selected by the user, and achieves the intelligence the user expects.
The embodiment of the present application further provides a computer-readable storage medium, where a computer program is stored, and when the computer program is executed by a processor, the computer program implements any one of the control methods provided in the embodiment of the present application.
As long as there is no conflict or contradiction between the technical features provided in the above embodiments, a person skilled in the art may combine these technical features according to actual situations to form various embodiments. For reasons of space, such combined embodiments are not described one by one herein, but it should be understood that they also fall within the scope of the disclosure of the embodiments of the present application.
Embodiments of the present application may take the form of a computer program product embodied on one or more storage media including, but not limited to, disk storage, CD-ROM, optical storage, and the like, in which program code is embodied. Computer-usable storage media include permanent and non-permanent, removable and non-removable media, and information storage may be implemented by any method or technology. The information may be computer readable instructions, data structures, modules of a program, or other data. Examples of the storage medium of the computer include, but are not limited to: phase change memory (PRAM), Static Random Access Memory (SRAM), Dynamic Random Access Memory (DRAM), other types of Random Access Memory (RAM), Read Only Memory (ROM), Electrically Erasable Programmable Read Only Memory (EEPROM), flash memory or other memory technologies, compact disc read only memory (CD-ROM), Digital Versatile Discs (DVD) or other optical storage, magnetic tape storage or other magnetic storage devices, or any other non-transmission medium, may be used to store information that may be accessed by a computing device.
It is noted that, herein, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. The terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other identical elements in a process, method, article, or apparatus that comprises the element.
The method provided by the embodiment of the present application is described in detail above, and the principle and the implementation of the present application are explained in the present application by applying specific examples, and the description of the above embodiment is only used to help understanding the method and the core idea of the present application; meanwhile, for a person skilled in the art, according to the idea of the present application, there may be variations in the specific embodiments and the application scope, and in summary, the content of the present specification should not be construed as a limitation to the present application.

Claims (36)

  1. A control method, comprising:
    acquiring echo energy and position information corresponding to each scanning point in a plurality of scanning points obtained by environment scanning of a radar of a movable platform;
    acquiring a set parameter condition set; wherein the set of parameter conditions comprises echo energy parameter conditions;
    determining a target scanning point in the plurality of scanning points according to the parameter condition set; wherein the echo energy of the target scanning point conforms to the echo energy parameter condition;
    and controlling the movable platform to execute obstacle avoidance movement according to the position information of the target scanning point.
  2. The control method of claim 1, wherein the set of parameter conditions further comprises a location information parameter condition; and the determined position information of the target scanning point conforms to the position information parameter condition.
  3. The control method according to claim 1, wherein the controlling the movable platform to perform obstacle avoidance motion according to the position information of the target scanning point comprises:
    clustering the target scanning points to obtain a point cloud cluster;
    determining the position information of the point cloud cluster according to the position information of the target scanning point in the point cloud cluster;
    and controlling the movable platform to execute obstacle avoidance movement according to the position information of the point cloud cluster.
  4. The control method according to claim 3, wherein the position information of the point cloud cluster includes coordinates of a center point of the point cloud cluster and a three-dimensional size of the point cloud cluster.
  5. The control method of claim 1, wherein the echo energy parameter condition comprises an echo energy corresponding to a scan point being greater than an echo energy threshold.
  6. The control method according to claim 1, wherein the parameter condition set is determined according to a scene category corresponding to the environment.
  7. The control method of claim 6, wherein determining the set of parameter conditions based on the scene category comprises:
    and determining the parameter condition set corresponding to the scene category according to the corresponding relation between the pre-configured scene category and the parameter condition set.
  8. The control method according to claim 7, wherein in the correspondence, one scene category corresponds to at least two parameter condition sets.
  9. The control method according to claim 8, wherein the parameter condition set is determined from the at least two parameter condition sets according to a selection instruction of a user.
  10. The control method according to claim 6, wherein a camera is mounted on the movable platform, and the scene category is determined by recognizing a scene image captured by the camera.
  11. The control method of claim 10, wherein identifying the scene image comprises:
    inputting the scene image into a pre-trained scene recognition model to obtain a scene category corresponding to the scene image output by the scene recognition model; wherein the scene recognition model is a convolutional neural network model.
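Claim 11 feeds the scene image through a pre-trained convolutional neural network to obtain a scene category. The toy forward pass below only illustrates the shape of that inference step (convolution, ReLU, pooling, classification); the category labels and random weights are stand-ins, and a real model would be trained rather than constructed this way.

```python
import numpy as np

SCENE_CATEGORIES = ["farmland", "orchard", "open_water"]  # hypothetical labels

def conv2d(img, kernels):
    """Naive valid-mode 2-D convolution: img (H, W), kernels (K, kh, kw)."""
    K, kh, kw = kernels.shape
    H, W = img.shape
    out = np.empty((K, H - kh + 1, W - kw + 1))
    for k in range(K):
        for i in range(H - kh + 1):
            for j in range(W - kw + 1):
                out[k, i, j] = np.sum(img[i:i + kh, j:j + kw] * kernels[k])
    return out

def classify_scene(img, kernels, fc_weights):
    """Toy CNN forward pass: conv -> ReLU -> global average pool -> linear -> argmax."""
    feat = np.maximum(conv2d(img, kernels), 0.0)      # ReLU activation
    pooled = feat.mean(axis=(1, 2))                   # global average pooling
    logits = fc_weights @ pooled                      # fully connected layer
    return SCENE_CATEGORIES[int(np.argmax(logits))]
```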
  12. The control method of claim 1, wherein the parameter condition set is obtained by modifying an initial parameter condition set according to a negligible obstacle selected by a user.
  13. The control method of claim 12, wherein modifying the initial parameter condition set according to the negligible obstacle comprises:
    determining an object category corresponding to the negligible obstacle, determining a characteristic parameter corresponding to the object category, and modifying the initial parameter condition set according to the characteristic parameter; wherein the characteristic parameter is obtained by performing feature extraction in advance on radar scanning data corresponding to the object category.
  14. The control method according to claim 13, wherein the object category corresponding to the negligible obstacle is determined by recognizing an image corresponding to the negligible obstacle.
  15. The control method according to claim 14, wherein a camera is mounted on the movable platform, and the image corresponding to the negligible obstacle is extracted from an image captured by the camera according to a frame selection instruction input by the user.
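Claims 12-13 can be sketched as follows: given a characteristic parameter extracted in advance for the ignored object category, raise the echo energy threshold just above that category's typical return so its points no longer qualify as targets. The per-category feature table, the `margin` factor, and the specific correction rule are assumptions for illustration.

```python
# Hypothetical characteristic parameters, extracted in advance from radar
# scanning data for each object category (claim 13).
CLASS_FEATURES = {
    "crop": {"typical_echo_energy": 6.0},
    "wire": {"typical_echo_energy": 15.0},
}

def modify_param_set(initial_set, negligible_category, margin=1.2):
    """Modify the initial parameter condition set so that the echo energy
    threshold sits above the typical echo energy of the object category the
    user chose to ignore (one possible reading of claims 12-13)."""
    feat = CLASS_FEATURES[negligible_category]
    corrected = dict(initial_set)  # leave the initial set untouched
    corrected["echo_energy_threshold"] = max(
        corrected["echo_energy_threshold"],
        feat["typical_echo_energy"] * margin,
    )
    return corrected
```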
  16. The control method according to claim 1, wherein the position information corresponding to a scanning point comprises one or more of: a distance, an azimuth angle, a pitch angle, coordinates in a body coordinate system, coordinates in a coordinate system corresponding to the horizontal plane in which the body is located, a height relative to the ground plane, and a height relative to plants.
  17. The control method according to claim 1, wherein the radar is a millimeter-wave radar.
  18. A control device, comprising: a processor and a memory storing a computer program;
    the processor, when executing the computer program, implements the steps of:
    acquiring echo energy and position information corresponding to each of a plurality of scanning points obtained by a radar of a movable platform scanning an environment;
    acquiring a preset parameter condition set; wherein the parameter condition set comprises an echo energy parameter condition;
    determining target scanning points among the plurality of scanning points according to the parameter condition set; wherein the echo energy of the target scanning points conforms to the echo energy parameter condition;
    and controlling the movable platform to perform obstacle avoidance motion according to the position information of the target scanning points.
  19. The control device of claim 18, wherein the parameter condition set further comprises a position information parameter condition; and the position information of the determined target scanning point conforms to the position information parameter condition.
  20. The control device according to claim 18, wherein, when performing the step of controlling the movable platform to perform obstacle avoidance motion according to the position information of the target scanning points, the processor is configured to: cluster the target scanning points to obtain a point cloud cluster; determine position information of the point cloud cluster according to the position information of the target scanning points in the point cloud cluster; and control the movable platform to perform obstacle avoidance motion according to the position information of the point cloud cluster.
  21. The control apparatus of claim 20, wherein the position information of the point cloud cluster comprises coordinates of a center point of the point cloud cluster and a three-dimensional size of the point cloud cluster.
  22. The control device of claim 18, wherein the echo energy parameter condition comprises a condition that the echo energy corresponding to a scanning point is greater than an echo energy threshold.
  23. The control apparatus of claim 18, wherein the set of parameter conditions is determined according to a scene category corresponding to the environment.
  24. The control device according to claim 23, wherein the processor, when performing the step of determining the parameter condition set according to the scene category, is specifically configured to determine the parameter condition set corresponding to the scene category according to a preconfigured correspondence between scene categories and parameter condition sets.
  25. The control device according to claim 24, wherein in the correspondence, one scene category corresponds to at least two parameter condition sets.
  26. The control device of claim 25, wherein the parameter condition set is determined from the at least two parameter condition sets according to a selection instruction of a user.
  27. The control device according to claim 23, wherein a camera is mounted on the movable platform, and the scene category is determined by recognizing a scene image captured by the camera.
  28. The control device according to claim 27, wherein the processor, when performing the step of recognizing the scene image, is configured to input the scene image into a pre-trained scene recognition model to obtain a scene category corresponding to the scene image output by the scene recognition model; wherein the scene recognition model is a convolutional neural network model.
  29. The control device of claim 18, wherein the parameter condition set is obtained by modifying an initial parameter condition set according to a negligible obstacle selected by a user.
  30. The control device according to claim 29, wherein the processor, when performing the step of modifying the initial parameter condition set according to the negligible obstacle, is specifically configured to determine an object category corresponding to the negligible obstacle, determine a characteristic parameter corresponding to the object category, and modify the initial parameter condition set according to the characteristic parameter; wherein the characteristic parameter is obtained by performing feature extraction in advance on radar scanning data corresponding to the object category.
  31. The control device of claim 30, wherein the object category corresponding to the negligible obstacle is determined by recognizing an image corresponding to the negligible obstacle.
  32. The control device according to claim 31, wherein a camera is mounted on the movable platform, and the image corresponding to the negligible obstacle is extracted from an image captured by the camera according to a frame selection instruction input by the user.
  33. The control device of claim 18, wherein the position information corresponding to a scanning point comprises one or more of: a distance, an azimuth angle, a pitch angle, coordinates in a body coordinate system, coordinates in a coordinate system corresponding to the horizontal plane in which the body is located, a height relative to the ground plane, and a height relative to plants.
  34. The control device of claim 18, wherein the radar is a millimeter-wave radar.
  35. A movable platform, comprising: a body, a power device connected to the body, a radar provided on the body, a processor, and a memory storing a computer program;
    the processor, when executing the computer program, implements the steps of:
    acquiring echo energy and position information corresponding to each of a plurality of scanning points obtained by the radar scanning an environment;
    acquiring a preset parameter condition set; wherein the parameter condition set comprises an echo energy parameter condition;
    determining target scanning points among the plurality of scanning points according to the parameter condition set; wherein the echo energy of the target scanning points conforms to the echo energy parameter condition;
    and controlling the movable platform to perform obstacle avoidance motion according to the position information of the target scanning points.
  36. A computer-readable storage medium, wherein the computer-readable storage medium stores a computer program which, when executed by a processor, implements the control method according to any one of claims 1 to 17.
CN202080039444.XA 2020-05-21 2020-05-21 Control method, control device, movable platform and computer readable storage medium Pending CN113966496A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2020/091587 WO2021232359A1 (en) 2020-05-21 2020-05-21 Control method, control device, movable platform, and computer-readable storage medium

Publications (1)

Publication Number Publication Date
CN113966496A 2022-01-21

Family

ID=78708968

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202080039444.XA Pending CN113966496A (en) 2020-05-21 2020-05-21 Control method, control device, movable platform and computer readable storage medium

Country Status (2)

Country Link
CN (1) CN113966496A (en)
WO (1) WO2021232359A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114693187B (en) * 2022-05-31 2022-10-21 杭州未名信科科技有限公司 Operation analysis method and device of tower crane cluster, storage medium and terminal
CN117472069B (en) * 2023-12-28 2024-03-26 烟台宇控软件有限公司 Robot control method and system for power transmission line detection

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102508246A (en) * 2011-10-13 2012-06-20 吉林大学 Method for detecting and tracking obstacles in front of vehicle
CN109344715A (en) * 2018-08-31 2019-02-15 北京达佳互联信息技术有限公司 Intelligent composition control method, device, electronic equipment and storage medium
CN110390814A (en) * 2019-06-04 2019-10-29 深圳市速腾聚创科技有限公司 Monitoring system and method
CN110865365A (en) * 2019-11-27 2020-03-06 江苏集萃智能传感技术研究所有限公司 Parking lot noise elimination method based on millimeter wave radar
CN110892285A (en) * 2018-11-26 2020-03-17 深圳市大疆创新科技有限公司 Microwave radar and unmanned vehicles

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113238581A (en) * 2016-02-29 2021-08-10 星克跃尔株式会社 Method and system for flight control of unmanned aerial vehicle
CN105892489B (en) * 2016-05-24 2019-09-10 国网山东省电力公司电力科学研究院 A kind of automatic obstacle avoiding UAV system and control method based on Multi-sensor Fusion
CN107121677B (en) * 2017-06-02 2019-10-11 太原理工大学 Avoidance radar method and device based on ultra wide band cognition CPPM signal
JP7077013B2 (en) * 2017-12-27 2022-05-30 株式会社トプコン 3D information processing unit, device equipped with 3D information processing unit, unmanned aerial vehicle, notification device, moving object control method using 3D information processing unit, and program for moving object control processing
IL257010B (en) * 2018-01-18 2021-10-31 Israel Aerospace Ind Ltd Automatic camera driven aircraft control for rader activation
CN212135234U (en) * 2020-01-19 2020-12-11 国网江苏省电力有限公司 A flight auxiliary device for transmission line patrols and examines unmanned aerial vehicle


Also Published As

Publication number Publication date
WO2021232359A1 (en) 2021-11-25

Similar Documents

Publication Publication Date Title
CN111932588B (en) Tracking method of airborne unmanned aerial vehicle multi-target tracking system based on deep learning
CN109144097B (en) Obstacle or ground recognition and flight control method, device, equipment and medium
CN108122553B (en) Unmanned aerial vehicle control method and device, remote control equipment and unmanned aerial vehicle system
US20050041102A1 (en) Automatic target detection and motion analysis from image data
CN111679695B (en) Unmanned aerial vehicle cruising and tracking system and method based on deep learning technology
CN107272734A (en) Unmanned plane during flying task executing method, unmanned plane and computer-readable recording medium
CN113966496A (en) Control method, control device, movable platform and computer readable storage medium
CN111831010A (en) Unmanned aerial vehicle obstacle avoidance flight method based on digital space slice
CN116907282B (en) Unmanned target aircraft ultra-low altitude flight control method based on artificial intelligence algorithm
CN112378397A (en) Unmanned aerial vehicle target tracking method and device and unmanned aerial vehicle
CN115649501A (en) Night driving illumination system and method for unmanned aerial vehicle
CN114325642A (en) Laser radar scanning method, scanning apparatus, and computer-readable storage medium
WO2022061632A1 (en) Obstacle detection method and apparatus, and unmanned aerial vehicle and storage medium
CN113568428A (en) Campus security method and system based on multi-unmanned aerial vehicle cooperation
CN110501680B (en) Target monitoring system and target monitoring method based on radar system
CN112380933A (en) Method and device for identifying target by unmanned aerial vehicle and unmanned aerial vehicle
CN109977884B (en) Target following method and device
Geyer et al. Prototype sense-and-avoid system for UAVs
CN113574487A (en) Unmanned aerial vehicle control method and device and unmanned aerial vehicle
US20230367319A1 (en) Intelligent obstacle avoidance method and apparatus based on binocular vision, and non-transitory computer-readable storage medium
Tang et al. sUAS and Machine Learning Integration in Waterfowl Population Surveys
CN113433965B (en) Unmanned aerial vehicle obstacle avoidance method and device, storage medium and electronic equipment
CN114355960B (en) Unmanned aerial vehicle defense intelligent decision-making method and system, server and medium
CN114638975A (en) Bird and non-bird repelling method and system for airport
WO2022004333A1 (en) Information processing device, information processing system, information processing method, and program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination