WO2021168708A1 - Control method for movable platform, movable platform, device, and storage medium - Google Patents


Info

Publication number
WO2021168708A1
WO2021168708A1 (PCT/CN2020/076845, CN2020076845W)
Authority
WO
WIPO (PCT)
Prior art keywords
point cloud data
radar
distance
target
Prior art date
Application number
PCT/CN2020/076845
Other languages
English (en)
Chinese (zh)
Inventor
王石荣
陈文平
王俊喜
Original Assignee
深圳市大疆创新科技有限公司 (SZ DJI Technology Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳市大疆创新科技有限公司 (SZ DJI Technology Co., Ltd.)
Priority to PCT/CN2020/076845 (published as WO2021168708A1)
Priority to CN202080004213.5A (published as CN112585553A)
Publication of WO2021168708A1


Classifications

    • G05D 1/00: Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D 1/0808: Control of attitude (roll, pitch, or yaw) specially adapted for aircraft
    • G05D 1/0206: Control of position or course in two dimensions specially adapted to water vehicles
    • G05D 1/0223: Control of position or course in two dimensions specially adapted to land vehicles, with means for defining a desired trajectory involving speed control of the vehicle
    • G05D 1/0257: Control of position or course in two dimensions specially adapted to land vehicles, using a radar
    • G05D 1/0259: Control of position or course in two dimensions specially adapted to land vehicles, using magnetic or electromagnetic means
    • G05D 1/0276: Control of position or course in two dimensions specially adapted to land vehicles, using signals provided by a source external to the vehicle
    • G05D 1/101: Simultaneous control of position or course in three dimensions specially adapted for aircraft
    • G06T 5/70: Denoising; Smoothing
    • G06T 7/73: Determining position or orientation of objects or cameras using feature-based methods
    • G06V 10/25: Determination of region of interest [ROI] or a volume of interest [VOI]
    • G06V 20/13: Satellite images
    • G06T 2207/10028: Range image; Depth image; 3D point clouds

Definitions

  • The present invention relates to the field of device control, and in particular to a control method for a movable platform, a movable platform, a device, and a storage medium.
  • A commonly used obstacle avoidance method is as follows: the movable platform first uses its own radar to observe the surrounding environment, and then recognizes the obstacles contained in the surrounding environment based on the point cloud data obtained from the observation, thereby realizing obstacle avoidance.
  • The present invention provides a control method for a movable platform, a movable platform, a device, and a storage medium, which are used to accurately identify obstacles and ensure the normal movement of the movable platform.
  • The first aspect of the present invention is to provide a control method for a movable platform, the method including: acquiring first point cloud data detected by a radar configured on the movable platform; filtering out noise point cloud data corresponding to non-obstacles in the first point cloud data according to at least one piece of feature information of the first point cloud data, to obtain second point cloud data; performing obstacle recognition according to the second point cloud data; and controlling the movement state of the movable platform according to the obstacle recognition result.
  • the second aspect of the present invention is to provide a movable platform, which at least includes: a body, a radar, a power system, and a control device;
  • the radar is arranged on the body for detecting point cloud data
  • the power system is arranged on the body and used to provide power for the movable platform
  • the control device includes a memory and a processor
  • the memory is used to store a computer program
  • the processor is configured to run the computer program stored in the memory, so as to acquire the first point cloud data detected by the radar, filter it to obtain the second point cloud data, perform obstacle recognition, and control the movement state of the movable platform according to the obstacle recognition result.
  • the third aspect of the present invention is to provide a control device for a movable platform, the device including:
  • the memory is used to store a computer program
  • the processor is configured to run the computer program stored in the memory, so as to acquire the first point cloud data detected by the radar, filter it to obtain the second point cloud data, perform obstacle recognition, and control the movement state of the movable platform according to the obstacle recognition result.
  • The fourth aspect of the present invention is to provide a computer-readable storage medium, the computer-readable storage medium storing program instructions, and the program instructions are used to implement the control method of the first aspect.
  • FIG. 1 is a schematic flowchart of a control method for a movable platform according to an embodiment of the present invention
  • FIG. 2 is a schematic flowchart of a method for filtering noise point cloud data according to an embodiment of the present invention
  • FIG. 3 is a schematic flowchart of another method for filtering noise point cloud data according to an embodiment of the present invention.
  • FIG. 4 is a schematic flowchart of an obstacle recognition method provided by an embodiment of the present invention.
  • FIG. 5 shows the curve of the first reference distance provided by an embodiment of the present invention.
  • FIG. 6 is a schematic flowchart of another obstacle recognition method provided by an embodiment of the present invention.
  • FIG. 7 shows the curve of the second reference distance provided by an embodiment of the present invention.
  • FIG. 8 is a schematic structural diagram of a control device for a movable platform provided by an embodiment of the present invention.
  • FIG. 9 is a schematic structural diagram of a movable platform provided by an embodiment of the present invention.
  • FIG. 10 is a schematic structural diagram of a control device for a movable platform provided by an embodiment of the present invention.
  • The control method for a movable platform, the movable platform, the device, and the storage medium provided by the present invention first obtain the first point cloud data detected by the radar configured on the movable platform, and then filter the first point cloud data according to at least one piece of feature information included in the first point cloud data, so as to filter out the noise point cloud data corresponding to non-obstacles. Then, obstacles are identified according to the filtered second point cloud data, and the movement state of the movable platform is controlled according to the positions of the obstacles.
  • Because the noise point cloud data are filtered out using multi-dimensional feature information in the point cloud data, the second point cloud data corresponding to obstacles can be obtained accurately; according to the second point cloud data, obstacles can be accurately identified, which further ensures the normal movement of the movable platform.
  • An embodiment of the present invention provides a control method for a movable platform, and the method includes: acquiring first point cloud data detected by a radar configured on the movable platform; filtering out noise point cloud data corresponding to non-obstacles in the first point cloud data according to at least one piece of feature information of the first point cloud data, to obtain second point cloud data; performing obstacle recognition according to the second point cloud data; and controlling the movement state of the movable platform according to the obstacle recognition result.
  • the embodiment of the present invention also provides a movable platform, which at least includes: a body, a power system, and a control device;
  • the power system is arranged on the body and used to provide power for the movable platform
  • the control device includes a memory and a processor
  • the memory is used to store a computer program
  • the processor is configured to run the computer program stored in the memory, so as to acquire the first point cloud data detected by the radar, filter it to obtain the second point cloud data, perform obstacle recognition, and control the movement state of the movable platform according to the obstacle recognition result.
  • the embodiment of the present invention also provides a control device for a movable platform, the device including:
  • the memory is used to store a computer program
  • the processor is configured to run the computer program stored in the memory, so as to acquire the first point cloud data detected by the radar, filter it to obtain the second point cloud data, perform obstacle recognition, and control the movement state of the movable platform according to the obstacle recognition result.
  • The embodiment of the present invention also provides a computer-readable storage medium, the computer-readable storage medium storing program instructions, and the program instructions are used to implement the above-mentioned control method for the movable platform.
  • FIG. 1 is a schematic flowchart of a control method for a movable platform provided by an embodiment of the present invention.
  • The execution subject of the control method for the movable platform is a control device. It can be understood that the control device can be implemented as software, or as a combination of software and hardware.
  • The control device executes the control method for the movable platform to realize control of the movement state of the movable platform.
  • The movable platform in this embodiment and the following embodiments may be any movable platform such as a drone, an automobile, or a ship.
  • The automobile may be an unmanned vehicle or an ordinary automobile, and the ship may be an unmanned ship or an ordinary ship.
  • Taking the case where the movable platform is a drone as an example, the following embodiments are described.
  • the method may include:
  • S101 Acquire first point cloud data detected by radar.
  • the point cloud data detected at this time can be referred to as the first point cloud data.
  • The first point cloud data can reflect the distribution of obstacles in the flying environment of the drone.
  • An alternative way is to set up multiple radars around the fuselage of the drone, so that omnidirectional point cloud data around the drone can be detected.
  • the radar configured on the UAV can be a rotating radar, which can detect a full range of point cloud data through rotation.
  • S102 Filter out noise point cloud data corresponding to non-obstacles in the first point cloud data according to at least one feature information of the first point cloud data to obtain second point cloud data.
  • Each point cloud data detected by the radar may contain at least one type of characteristic information.
  • Feature information of different dimensions can be used alone or in combination to filter the first point cloud data. The at least one type of feature information is described below.
  • The at least one type of feature information may include: azimuth information relative to the radar, movement speed relative to the UAV, received echo signal energy, and so on.
  • The azimuth information may specifically include: the distance relative to the radar, the angle relative to the radar, the position relative to the radar, and so on. Since a point cloud data point is formed when the electromagnetic wave signal emitted by the radar illuminates an object and is reflected, the distance in the azimuth information is actually the distance between the object and the radar, and the angle in the azimuth information is the angle between the object and the radar.
  • The position in the azimuth information can be expressed as a three-dimensional coordinate, that is, the position of the object relative to the radar is indicated in the form of three-dimensional coordinates.
  • The first point cloud data can be filtered according to the distance relative to the radar in the azimuth information.
  • For any point cloud data A in the first point cloud data, first determine whether the distance relative to the radar in the point cloud data A is less than a preset distance. If the distance is less than the preset distance, it indicates that the object corresponding to the point cloud data A is close to the radar, and this object is usually the drone body itself. The body is obviously not an obstacle that affects the flight of the UAV; in other words, the point cloud data A corresponding to the body is noise point cloud data and should be filtered out.
  • The point cloud data A may be any one of the first point cloud data. After filtering the first point cloud data in the above manner, the second point cloud data can be obtained.
  • Because the UAV and the radar are usually rigidly connected, part of the airframe blocks the radar, so the radar detects noise point cloud data A corresponding to the airframe.
  • The above-mentioned preset distance can be understood as a lower distance limit: point cloud data closer than this limit are all regarded as noise point cloud data.
  • When setting the preset distance, the size of the drone can be taken into account; for example, the larger the drone, the larger the preset distance.
  • The above method of filtering point cloud data according to the preset distance can be understood as planning a sphere area with the radar as the center and the preset distance as the radius.
  • This sphere area can roughly represent the spatial range of the drone body. That is to say, the point cloud data falling in this area can be considered to correspond to the body of the drone, and such point cloud data need to be filtered out.
  • The above-mentioned sphere area can be replaced with a cube area, a cone area, a triangle area, and so on, and different preset distances can be set according to the shape of the area.
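As an informal illustration (not part of the patent text), the sphere-area filter described above could be sketched as follows; the function and parameter names are hypothetical, and NumPy is assumed:

```python
import numpy as np

def filter_by_distance(points, preset_distance):
    """Sketch of the sphere-area filter: drop every point whose Euclidean
    distance to the radar (origin of the radar frame) is below the preset
    distance, treating such points as returns from the drone body."""
    pts = np.asarray(points, dtype=float)   # shape (N, 3), radar frame
    dist = np.linalg.norm(pts, axis=1)      # range of each point to the radar
    return pts[dist >= preset_distance]     # keep points outside the sphere
```

For example, with a preset distance of 1 m, a return at 0.1 m from the radar would be dropped while a return at 10 m would be kept.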
  • However, the area planned in the above manner is relatively rough, and there is still a certain gap between this area and the real spatial range of the UAV body. In this case, non-noise point cloud data falling within this range may easily be filtered out, reducing the accuracy of obstacle recognition.
  • The azimuth information may include a position relative to the radar, and this position is specifically expressed as a three-dimensional coordinate.
  • The first point cloud data can therefore also be filtered according to the position relative to the radar.
  • The UAV can be scanned in advance to obtain actual measurement data, which can be called reference point cloud data, and these reference point cloud data can form a preset spatial range. Then, following the example of the point cloud data A mentioned above, it is determined whether the position relative to the radar in the point cloud data A is within the preset spatial range. If it is within this preset spatial range, indicating that the object corresponding to the point cloud data A is the body of the drone, the point cloud data A is filtered out.
  • Filtering in the above manner can avoid filtering out point cloud data that do not correspond to the body in the first point cloud data.
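A rough sketch of filtering against the pre-scanned body envelope (hypothetical names; an axis-aligned bounding box around the reference point cloud stands in for the preset spatial range, which in practice may be a finer-grained region):

```python
import numpy as np

def filter_by_body_envelope(points, reference_points, margin=0.05):
    """Drop points that fall inside the spatial range spanned by the
    pre-scanned reference point cloud of the drone body. The range is
    approximated here by the reference cloud's bounding box plus a margin."""
    pts = np.asarray(points, dtype=float)
    ref = np.asarray(reference_points, dtype=float)
    lo = ref.min(axis=0) - margin           # lower corner of the envelope
    hi = ref.max(axis=0) + margin           # upper corner of the envelope
    inside = np.all((pts >= lo) & (pts <= hi), axis=1)
    return pts[~inside]                     # keep points outside the body
```

Because the envelope is measured rather than guessed, points near the body but outside it are preserved for obstacle recognition.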
  • The above filtering method actually filters out the point cloud data describing the drone body in the first point cloud data. The above only lists two ways of filtering out noise point cloud data based on feature information; of course, the noise point cloud data can also be filtered in other ways. For the specific process, refer to the detailed descriptions in the embodiments shown in FIG. 2 to FIG. 3 below.
  • S103 Perform obstacle recognition according to the second point cloud data.
  • The second point cloud data can be clustered, and each resulting cluster can be considered an obstacle; that is, the distribution of obstacles in the current flight environment of the UAV is identified.
  • the flight path of the drone can be planned according to the location of the obstacle, and the flight of the drone can be controlled.
  • The control method for a movable platform first acquires the first point cloud data detected by a radar configured on the movable platform, and then filters the first point cloud data based on at least one piece of feature information included in the first point cloud data, so as to filter out noise point cloud data corresponding to non-obstacles. Then, obstacles are identified according to the filtered second point cloud data, and the movement state of the movable platform is controlled according to the positions of the obstacles. It can be seen that in the control method provided by the present invention, by using multi-dimensional feature information in the point cloud data to filter out the noise point cloud data, the second point cloud data corresponding to obstacles can be accurately obtained; the second point cloud data then allow obstacles to be accurately identified, further ensuring the normal movement of the movable platform.
  • The first point cloud data collected by the radar usually also include noise point cloud data corresponding to water clutter and ground clutter.
  • Such point cloud data correspond to clutter rather than to real obstacles.
  • the position information may specifically include the distance and angle relative to the radar.
  • The preset distance, preset angle range, and preset energy value can all be set based on historical experience.
  • The noise point cloud data in the point cloud data collected by the radar are separated, and the separated noise point cloud data can be considered to be generated by clutter. While obtaining the echo signal energy of the noise point cloud data, the noise point cloud data can also be counted to obtain the probability of noise point cloud data appearing at different distances and angles relative to the radar.
  • The distance and angle range in which the noise point cloud data have a higher probability of occurrence are determined as the preset distance and the preset angle range. In practical applications, the preset distance can be 5 meters, the preset angle range can be 10° to 15°, and the preset energy value can be 5000 W/m².
  • The azimuth information may also include the distance relative to the radar.
  • The filtering of the first point cloud data can then be: when the mission execution signal of the drone is turned on, if the distance of the target point cloud data relative to the radar is less than the preset distance, and the echo signal energy of the target point cloud data is less than the preset energy value, the target point cloud data is filtered out.
  • In this case, the preset distance can be 5 meters, and the preset energy value can be 20000 W/m².
  • The task execution signal of the drone may be, for example, a spraying signal or a sowing signal.
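The near-range, low-energy rule above can be sketched as follows (illustrative names; the thresholds are the example values quoted in the text):

```python
import numpy as np

def filter_clutter(points, distances, energies, task_signal_on,
                   preset_distance=5.0, preset_energy=20000.0):
    """While the spraying/sowing task signal is on, drop points that are both
    closer than preset_distance (m) and weaker than preset_energy (W/m^2);
    such returns are assumed to come from spray or seed clutter."""
    pts = np.asarray(points, dtype=float)
    d = np.asarray(distances, dtype=float)
    e = np.asarray(energies, dtype=float)
    if not task_signal_on:
        return pts                          # no task-related clutter expected
    noisy = (d < preset_distance) & (e < preset_energy)
    return pts[~noisy]
```

Note that both conditions must hold: a close but strong return (for example a real obstacle near the drone) is kept.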
  • The noise point cloud data can also be filtered according to the movement speed relative to the radar, as shown in FIG. 2, which is another possible implementation of step S102.
  • This filtering can be achieved as follows:
  • An inertial measurement unit (IMU) configured on the drone can measure the movement speed V_a of the drone. Then, it is judged whether the movement speed V_rm relative to the radar in the first point cloud data matches the movement speed V_a of the drone.
  • Each point cloud data point corresponds to an object in the flying environment of the UAV.
  • The movement speed V_rm relative to the radar in the point cloud data is actually the speed of the object relative to the UAV, that is, a relative speed.
  • For a stationary object, the relative speed V_rm should be equal in magnitude to the movement speed V_a of the drone, but opposite in direction.
  • For point cloud data A, there is an alternative way to determine whether the speeds match: if the vector sum of the movement speed V_rm in the point cloud data A and the movement speed V_a of the drone is not 0, the two speeds do not match, and the point cloud data A can be filtered out.
  • The speed V_rm and the speed V_a are collected by different devices on the UAV; thus, the movement speed V_rm corresponds to the radar coordinate system, while the speed V_a corresponds to the UAV body coordinate system.
  • Before the comparison, the speed V_a in the body coordinate system can be converted into the radar coordinate system using a conversion matrix R.
  • In practice, the conditions for judging whether the movement speeds match can be appropriately relaxed.
  • The relationship between the velocity vector sum and a preset speed error Δv is determined; if the vector sum is greater than the preset speed error Δv, the point cloud data A is filtered out.
  • The preset speed error Δv can be set according to historical experience, and the preset speed error can also be a speed range.
  • The above-mentioned screening method actually determines point cloud data with an unqualified movement speed as noise point cloud data and filters them out. Determining whether the movement speed is qualified can be understood as follows: a judgment criterion is set according to the movement speed V_a of the unmanned aerial vehicle and the relative speed V_rm; if the vector sum of the two speeds meets this criterion, the point is considered qualified, otherwise it is considered unqualified. From the above two situations, it can be seen that, according to actual needs, the judgment criterion can be a speed value or a speed range.
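A minimal sketch of this speed-matching filter (hypothetical names; R is the body-to-radar conversion matrix, and the preset speed error is an assumed value):

```python
import numpy as np

def filter_by_velocity(points, point_velocities, drone_velocity_body, R,
                       speed_error=0.5):
    """Keep a point only when its measured speed V_rm (radar frame) and the
    drone speed V_a (converted to the radar frame with R) nearly cancel,
    i.e. |V_rm + R @ V_a| <= speed_error, as they should for a static object."""
    pts = np.asarray(points, dtype=float)
    v_rm = np.asarray(point_velocities, dtype=float)        # (N, 3), radar frame
    v_a_radar = np.asarray(R, dtype=float) @ np.asarray(drone_velocity_body, dtype=float)
    residual = np.linalg.norm(v_rm + v_a_radar, axis=1)     # per-point vector sum
    return pts[residual <= speed_error]
```

With R as the identity and the drone moving at [2, 0, 0], a point with V_rm = [-2, 0, 0] cancels exactly and is kept, while a point with V_rm = [1, 0, 0] leaves a large residual and is filtered out.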
  • The noise point cloud data can also be filtered according to the echo signal energy and the position relative to the radar in the azimuth information, as shown in FIG. 3. That is, another optional implementation of step S102 may be:
  • S301 Fit a target plane corresponding to the ground according to the position relative to the radar in the first point cloud data.
  • The position relative to the radar in the first point cloud data can be expressed as a three-dimensional coordinate, and the target plane can then be fitted according to these three-dimensional coordinates.
  • This target plane is used to indicate the ground in the flying environment of the drone.
  • the three-dimensional coordinates can be expressed as [x, y, z]
  • the above three-dimensional coordinates [x, y, z] are also in the radar coordinate system.
  • The three-dimensional coordinates in the radar coordinate system can be converted into the horizontal coordinate system to obtain new three-dimensional coordinates [x', y', z']. The target plane is then fitted according to the three-dimensional coordinates in the horizontal coordinate system, which ensures that the target plane is a horizontal plane and is closer to the real ground.
  • In the flight environment, the ground is located below the drone, which means the angles of the point cloud data describing the ground have certain characteristics, for example falling within a certain angle range. Therefore, point cloud data whose angles obviously do not meet this requirement would, if used in the target plane fitting process, reduce the accuracy of the plane fitting.
  • For this reason, the first point cloud data can be pre-filtered according to the angle relative to the radar in the first point cloud data.
  • Third point cloud data whose angle relative to the radar meets the preset angle range are selected from the first point cloud data, and the target plane is then fitted according to the positions relative to the radar in the third point cloud data.
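Step S301 can be sketched as a least-squares fit (the patent does not specify the fitting algorithm, so this is one plausible choice). The plane is fitted as z = αx + βy + γ over the angle-filtered third point cloud data and returned in the general form a·x + b·y + c·z + d = 0:

```python
import numpy as np

def fit_ground_plane(points):
    """Least-squares fit of the ground plane to candidate ground points
    (assumed already angle-filtered and expressed in a horizontal frame).
    Solves z = alpha*x + beta*y + gamma, then returns [a, b, c, d] with
    a*x + b*y + c*z + d = 0."""
    pts = np.asarray(points, dtype=float)
    A = np.column_stack([pts[:, 0], pts[:, 1], np.ones(len(pts))])
    alpha, beta, gamma = np.linalg.lstsq(A, pts[:, 2], rcond=None)[0]
    return np.array([alpha, beta, -1.0, gamma])
```

A robust variant (e.g. RANSAC) could replace the plain least squares if outliers survive the angle pre-filter.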
  • S302 Calculate the distance of the first point cloud data relative to the target plane according to the position relative to the radar in the first point cloud data.
  • the distance between the three-dimensional coordinates in the first point cloud data and the target plane is calculated, and at the same time, the echo signal energy in the first point cloud data is obtained.
  • Assuming that the three-dimensional coordinates of point cloud data A are [x1, y1, z1] and the fitted target plane is a·x + b·y + c·z + d = 0, the distance can be expressed as h = |a·x1 + b·y1 + c·z1 + d| / √(a² + b² + c²). This distance can be understood as the distance between the point cloud data and the ground.
  • If the distance h is within the preset distance range, and the echo signal energy of the point cloud data A also meets the preset energy range, it is determined that the point cloud data A is valid, and it is retained. If the point cloud data A does not meet the aforementioned conditions, it is filtered out.
  • The preset distance range and the preset energy range can be set according to historical experience.
  • This historical experience can be obtained as follows: after many actual flights, the three-dimensional coordinates of the point cloud data collected by the radar are counted to obtain the probability of point cloud data appearing at different distances relative to the radar, as well as the probability of point cloud data appearing under different echo signal energies.
  • The embodiment shown in FIG. 3 thus calculates the height between the point cloud data and the ground based on the three-dimensional coordinates of the point cloud data, and then determines the valid point cloud data based on this height and the echo signal energy included in the point cloud data.
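The height-plus-energy validity check can be sketched as follows (hypothetical names; the distance and energy ranges are placeholder values, since the patent leaves them to historical experience):

```python
import numpy as np

def point_to_plane_distance(point, plane):
    """Distance from [x1, y1, z1] to the plane a*x + b*y + c*z + d = 0."""
    a, b, c, d = plane
    x1, y1, z1 = point
    return abs(a * x1 + b * y1 + c * z1 + d) / np.sqrt(a * a + b * b + c * c)

def keep_valid(points, energies, plane,
               h_range=(0.3, 50.0), e_range=(1000.0, 1e6)):
    """Retain a point only if its height above the fitted ground plane AND
    its echo signal energy both fall inside the preset ranges."""
    return [p for p, e in zip(points, energies)
            if h_range[0] <= point_to_plane_distance(p, plane) <= h_range[1]
            and e_range[0] <= e <= e_range[1]]
```

Points hugging the plane (ground clutter) fail the height test, and weak spurious returns fail the energy test.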
  • Figs. 1 to 3 provide a variety of ways to filter out noise point cloud data, and one or more of them can be selected and implemented according to actual needs.
  • In general, the more of these filtering methods are applied, the better the filtering effect on the noise point cloud data.
  • the remaining point cloud data is the second point cloud data.
  • the second point cloud data can be clustered to obtain at least one cluster of point cloud data.
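The patent does not name a clustering algorithm; as one simple stand-in, a breadth-first Euclidean clustering (points within a radius of each other share a cluster) can be sketched:

```python
import numpy as np
from collections import deque

def euclidean_cluster(points, radius=1.0):
    """Label each point with a cluster id; two points end up in the same
    cluster when they are connected by a chain of neighbours closer than
    `radius`. Returns an array of integer labels, one per point."""
    pts = np.asarray(points, dtype=float)
    labels = np.full(len(pts), -1, dtype=int)
    next_label = 0
    for i in range(len(pts)):
        if labels[i] != -1:
            continue                        # already assigned to a cluster
        labels[i] = next_label
        queue = deque([i])
        while queue:                        # grow the cluster by BFS
            j = queue.popleft()
            near = np.where(np.linalg.norm(pts - pts[j], axis=1) <= radius)[0]
            for k in near:
                if labels[k] == -1:
                    labels[k] = next_label
                    queue.append(k)
        next_label += 1
    return labels
```

This brute-force version is O(n²); a KD-tree neighbour search would be used for large clouds.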
  • an optional way can be:
  • S401 For the target cluster point cloud data in at least one cluster of point cloud data, obtain a center of the target cluster point cloud data, and the azimuth information of the center includes a position relative to the radar.
  • S402 Calculate the first distance between the center and the ground.
  • the center of each cluster of point cloud data can be obtained.
  • For the center of the target cluster point cloud data K, the distance between this center and the target plane can be further calculated.
  • the target cluster point cloud data K can be any one of at least one cluster of point cloud data.
  • The first distance between the center and the target plane, that is, the distance between the center and the ground, can be expressed (assuming the center has three-dimensional coordinates [x2, y2, z2] and the target plane is a·x + b·y + c·z + d = 0) as h = |a·x2 + b·y2 + c·z2 + d| / √(a² + b² + c²).
  • the point cloud data in the radar coordinate system can also be converted to the horizontal coordinate system.
  • The first reference distance corresponding to the first distance h can be calculated, for example, by linear interpolation between preset values: z_ref = z_min + (z_max - z_min) · (h - h_min) / (h_max - h_min), clamped to the interval [z_min, z_max], where z_max, z_min, h_max, and h_min are all preset values.
  • The above-mentioned first reference distance can be regarded as a lower limit of the distance. If the coordinate value z2 of the center of the target cluster point cloud data K matches the first reference distance, it is determined that the target cluster point cloud data K corresponds to an obstacle. Specifically, this matching relationship may be that the coordinate value z2 is greater than or equal to the first reference distance.
  • The first reference distance can be expressed as the curve shown in FIG. 5.
  • For example, if z2 = 1.0 m and the first reference distance calculated by the above formula is smaller than z2, then the center of the target cluster point cloud data K is located above the curve, indicating that the target cluster point cloud data K corresponds to an obstacle.
  • The method shown in FIG. 4 thus first determines whether a cluster of point cloud data corresponds to an obstacle according to the relationship between the distance h from the center of the target cluster point cloud data to the ground and the z2 value of that center.
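The exact curve of FIG. 5 is only given as a figure; one plausible reading, consistent with the preset values z_max, z_min, h_max, and h_min, is a clamped linear interpolation. A sketch of the resulting obstacle test (all numeric values illustrative):

```python
def first_reference_distance(h, h_min=1.0, h_max=10.0, z_min=0.5, z_max=3.0):
    """Guessed form of the FIG. 5 curve: a lower bound on the cluster-centre
    height z2 that rises linearly with the centre-to-ground distance h,
    clamped between z_min and z_max."""
    if h <= h_min:
        return z_min
    if h >= h_max:
        return z_max
    return z_min + (z_max - z_min) * (h - h_min) / (h_max - h_min)

def is_obstacle(z2, h, **kwargs):
    """The cluster counts as an obstacle when its centre lies on or above
    the reference curve (z2 >= reference distance)."""
    return z2 >= first_reference_distance(h, **kwargs)
```

The second reference distance of FIG. 7 would follow the same pattern with the horizontal distance d and the presets d_min and d_max in place of h, h_min, and h_max.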
  • S501 For the target cluster point cloud data in at least one cluster of point cloud data, obtain a center of the target cluster point cloud data, and the position information of the center includes a position relative to the radar.
  • Step S501 is similar to step S401 of the foregoing embodiment; reference may be made to the related description in the embodiment shown in FIG. 4, which is not repeated here.
  • S502 Calculate a second distance between the center and the center of the movable platform.
  • S503 Calculate a second reference distance according to the second distance.
  • The target cluster point cloud data K may be any one of the at least one cluster of point cloud data. Assuming that the three-dimensional coordinates of the center of the target cluster point cloud data K are [x2, y2, z2], the horizontal distance between this center and the movable platform (that is, the UAV radar center), which is the second distance, is expressed as d = √(x2² + y2²).
  • The second reference distance corresponding to the second distance d can be calculated, for example, by linear interpolation between preset values: z_ref = z_min + (z_max - z_min) · (d - d_min) / (d_max - d_min), clamped to the interval [z_min, z_max], where z_max, z_min, d_max, and d_min are all preset values.
  • the above-mentioned second reference distance can be regarded as a lower limit of distance. Then if the coordinate value z 2 of the center of the target cluster point cloud data K is equal to the second reference distance If it matches, it is determined that the target cluster point cloud data K corresponds to an obstacle. Specifically, this matching relationship may be that the coordinate value z 2 is greater than or equal to the second reference distance
  • As a function of the second distance d, the second reference distance can be represented by the curve shown in Fig. 7.
  • For example, if z 2 = 1.0 m and the second reference distance calculated by the above formula is smaller than 1.0 m, the z 2 value of the center of the target cluster point cloud data K is located above the curve, indicating that the target cluster point cloud data K corresponds to an obstacle.
  • In other words, the method shown in FIG. 6 determines whether a cluster of point cloud data corresponds to an obstacle according to the magnitude relationship between the horizontal distance d from the center of the target cluster point cloud data to the radar and the height z 2 of that center above the ground.
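As an illustration, the obstacle decision described above can be sketched in Python. The linear interpolation of the reference curve between (d_min, z_min) and (d_max, z_max) and all threshold values are assumptions, since the patent's formula and preset values are not reproduced in this text:

```python
import math

def second_reference_distance(d, d_min=1.0, d_max=10.0, z_min=0.5, z_max=3.0):
    """Assumed lower-limit curve: linear between (d_min, z_min) and
    (d_max, z_max), clamped outside that range."""
    if d <= d_min:
        return z_min
    if d >= d_max:
        return z_max
    return z_min + (z_max - z_min) * (d - d_min) / (d_max - d_min)

def is_obstacle(center, **params):
    """A cluster center [x2, y2, z2] (radar frame) is taken as an obstacle
    when z2 lies on or above the reference curve at its horizontal distance d."""
    x2, y2, z2 = center
    d = math.hypot(x2, y2)  # second distance: horizontal range to the radar
    return z2 >= second_reference_distance(d, **params)
```

A center 0.5 m away horizontally and 1.0 m up would lie above the curve and be kept as an obstacle, while a low-lying center far from the radar would fall below it and be rejected.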
  • noisy point cloud data corresponding to non-obstacles in at least one cluster of point cloud data can be filtered out.
  • the following filtering can also be performed:
  • For a cluster of point cloud data P among the remaining clusters of point cloud data, count the number of point cloud data it contains; if the number is greater than a preset value, it is determined that the cluster of point cloud data P corresponds to an obstacle.
  • this cluster of point cloud data P may be any cluster of the remaining multiple clusters of point cloud data.
  • The above filtering based on the number of point cloud data in a cluster can be performed after the embodiment shown in Figure 4 or Figure 6 is executed, or directly after clustering the second point cloud data into at least one cluster; of course, it can also be performed in both cases, so as to filter out the noisy point cloud data more accurately, ensure the accuracy of obstacle recognition, and ensure the normal flight of the drone.
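A minimal sketch of the cluster-size filtering described above; the `min_points` threshold is an assumed value, since the patent only requires the count to exceed a preset value:

```python
def classify_clusters(clusters, min_points=5):
    """Split clusters into obstacle candidates and noise by point count.

    `min_points` is an assumed value standing in for the patent's preset value."""
    obstacles = [c for c in clusters if len(c) > min_points]
    noise = [c for c in clusters if len(c) <= min_points]
    return obstacles, noise
```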
  • In this way, obstacles in the flying environment of the drone can be accurately identified, and the drone can then be accurately controlled, for example to achieve target tracking.
  • Fig. 8 is a schematic structural diagram of a control device for a movable platform provided by an embodiment of the present invention. As shown in FIG. 8, this embodiment provides a control device for a movable platform, which can execute the above-mentioned control method for a movable platform; specifically, the control device includes:
  • the acquisition module 11 is used to acquire the first point cloud data detected by the radar, and the radar is set on the movable platform.
  • the filtering module 12 is configured to filter out noisy point cloud data corresponding to non-obstacles in the first point cloud data according to at least one feature information of the first point cloud data to obtain second point cloud data .
  • the recognition module 13 is configured to recognize obstacles according to the second point cloud data.
  • the control module 14 is used to control the movement state of the movable platform according to the obstacle recognition result.
  • the device shown in FIG. 8 can also execute the methods of the embodiments shown in FIG. 1 to FIG. 7.
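The cooperation of the four modules of Fig. 8 can be sketched as a simple pipeline; the callables passed in are placeholders for the concrete embodiments of Figs. 1 to 7:

```python
class MovablePlatformController:
    """Minimal sketch of the four modules of Fig. 8.

    The radar, filters, recognizer, and controller arguments are placeholder
    callables standing in for the embodiments of Figs. 1 to 7."""

    def __init__(self, radar, filters, recognizer, controller):
        self.radar = radar            # acquisition module 11
        self.filters = filters        # filtering module 12: list of keep-predicates
        self.recognizer = recognizer  # recognition module 13
        self.controller = controller  # control module 14

    def step(self):
        first = self.radar()  # first point cloud data
        # Keep a point only if every filter predicate accepts it.
        second = [p for p in first if all(keep(p) for keep in self.filters)]
        obstacles = self.recognizer(second)
        return self.controller(obstacles)
```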
  • FIG. 9 is a schematic structural diagram of a movable platform provided by an embodiment of the present invention.
  • an embodiment of the present invention provides a movable platform, and the movable platform is at least one of the following: aircraft, automobiles, and ships.
  • the automobile can be an unmanned vehicle, an ordinary automobile, etc.
  • the ship can be an unmanned ship, an ordinary ship, and the like.
  • the movable platform includes: a body 21, a radar 22, a power system 23, and a control device 24.
  • the radar 22 is arranged on the body 21 for detecting point cloud data.
  • the power system 23 is arranged on the body 21 and used to provide power for the movable platform.
  • the control device 24 includes a memory 241 and a processor 242.
  • the memory is used to store a computer program
  • the processor is configured to run a computer program stored in the memory to realize:
  • the movement state of the movable platform is controlled according to the obstacle recognition result.
  • the at least one type of characteristic information includes: position information relative to the radar, echo signal energy, and movement speed relative to the movable platform; the position information includes a distance relative to the radar;
  • the processor 242 is further configured to filter out point cloud data whose distance to the radar is less than a preset distance in the first point cloud data, and the preset distance corresponds to the size information of the movable platform.
  • the position information includes a position relative to the radar
  • the processor 242 is further configured to filter out point cloud data located within a preset spatial range relative to the position of the radar, and the preset spatial range includes the movable platform.
  • the processor 242 is further configured to: filter out point cloud data in the first point cloud data whose azimuth information relative to the radar and echo signal energy do not meet their respective preset thresholds.
  • the position information includes the distance and angle relative to the radar
  • the processor 242 is further configured to: for the target point cloud data in the first point cloud data, if the distance of the target point cloud data to the radar is less than a preset distance, the angle of the target point cloud data relative to the radar is within a preset angle range, and the echo signal energy of the target point cloud data is less than a preset energy value, then filter out the target point cloud data.
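The combined near-field condition above (small distance, angle within a preset range, weak echo energy) can be sketched as follows; the concrete threshold values are assumptions, since the patent leaves them as preset parameters:

```python
def near_field_filter(points, max_dist=1.5, angle_range=(-30.0, 30.0), min_energy=0.2):
    """Filter out near-field, low-energy returns inside a given angular sector.

    Each point is a (distance, angle_deg, energy) tuple; all three thresholds
    are assumed values standing in for the patent's preset parameters."""
    lo, hi = angle_range
    kept = []
    for dist, ang, energy in points:
        # A point is dropped only when all three conditions hold at once.
        if dist < max_dist and lo <= ang <= hi and energy < min_energy:
            continue
        kept.append((dist, ang, energy))
    return kept
```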
  • the position information includes a distance relative to the radar
  • the processor 242 is further configured to: for the target point cloud data in the first point cloud data, if the distance between the target point cloud data and the radar is less than a preset distance and the echo signal energy of the target point cloud data is less than a preset energy value, filter out the target point cloud data, wherein the task execution signal of the movable platform is turned on.
  • processor 242 is further configured to: obtain the movement speed of the movable platform;
  • the processor 242 is further configured to: for the target point cloud data in the first point cloud data, if the magnitude of the vector sum of the moving speed of the target point cloud data relative to the radar and the moving speed of the movable platform is greater than a set speed error, filter out the target point cloud data.
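The velocity-based condition above can be sketched as follows: for a stationary obstacle, the speed measured relative to the radar should cancel the platform's own velocity, so the magnitude of the vector sum is compared against a set speed error. The error value used here is an assumption:

```python
import math

def velocity_filter(points, platform_velocity, max_error=0.5):
    """Drop points whose measured velocity does not cancel the platform's motion.

    Each point is (position, velocity), both 3-tuples, with velocity expressed
    relative to the radar. For a static obstacle, velocity + platform_velocity
    should be close to zero; `max_error` (m/s) is an assumed tolerance."""
    kept = []
    for pos, vel in points:
        residual = [v + u for v, u in zip(vel, platform_velocity)]
        if math.sqrt(sum(r * r for r in residual)) > max_error:
            continue  # inconsistent with a static return: filtered out as noise
        kept.append((pos, vel))
    return kept
```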
  • the position information includes a position relative to the radar
  • the processor 242 is further configured to fit a target plane corresponding to the ground according to the position relative to the radar in the first point cloud data;
  • the position information further includes an angle relative to the radar
  • the processor 242 is further configured to: filter out the third point cloud data whose angle relative to the radar satisfies a preset angle range from the first point cloud data, and the preset angle range corresponds to the angle of the radar relative to the ground;
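The ground-plane fitting step mentioned above can be realized, for example, with an ordinary least-squares fit of z = a·x + b·y + c; the patent does not fix a particular fitting method, so this is only one possible sketch:

```python
def fit_ground_plane(points):
    """Least-squares fit of the plane z = a*x + b*y + c to the given points.

    One common way to realize the 'fit a target plane corresponding to the
    ground' step; assumes the (x, y) positions are not all collinear."""
    sxx = sxy = sx = syy = sy = sxz = syz = sz = 0.0
    n = 0.0
    for x, y, z in points:
        sxx += x * x
        sxy += x * y
        sx += x
        syy += y * y
        sy += y
        sxz += x * z
        syz += y * z
        sz += z
        n += 1.0
    # Normal equations (A^T A) w = A^T z, as a 3x4 augmented matrix.
    m = [[sxx, sxy, sx, sxz],
         [sxy, syy, sy, syz],
         [sx,  sy,  n,  sz]]
    # Gauss-Jordan elimination to solve for w = (a, b, c).
    for i in range(3):
        piv = m[i][i]
        m[i] = [v / piv for v in m[i]]
        for j in range(3):
            if j != i:
                f = m[j][i]
                m[j] = [vj - f * vi for vj, vi in zip(m[j], m[i])]
    return m[0][3], m[1][3], m[2][3]
```

Points lying exactly on z = 0.1x + 0.2y + 0.3 recover those coefficients to rounding error.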
  • processor 242 is further configured to perform clustering processing on the second point cloud data to obtain at least one cluster of point cloud data
  • the processor 242 is further configured to: for the target cluster point cloud data in the at least one cluster of point cloud data, obtain the center of the target cluster point cloud data, where the position information of the center includes a position relative to the radar;
  • the processor 242 is further configured to: count the number of point cloud data included in the target cluster point cloud data; if the number is greater than a preset value, determine that the target cluster point cloud data corresponds to an obstacle.
  • the movable platform shown in FIG. 9 can execute the methods of the embodiments shown in FIG. 1 to FIG. 7.
  • parts that are not described in detail in this embodiment please refer to the related description of the embodiments shown in FIG. 1 to FIG. 7.
  • the implementation process and technical effects of this technical solution please refer to the description in the embodiment shown in FIG. 1 to FIG. 7, which will not be repeated here.
  • the structure of the control device for the movable platform shown in FIG. 10 can be realized as an electronic device.
  • the electronic device can be a drone, a car, a ship, etc., where the car can be a self-driving car, an ordinary car, etc., and the ship can be an unmanned ship, an ordinary ship, and so on.
  • the electronic device may include: one or more processors 31 and one or more memories 32.
  • the memory 32 is used to store a program that supports the electronic device to execute the control method for the movable platform provided in the embodiments shown in FIGS. 1 to 7 above.
  • the processor 31 is configured to execute a program stored in the memory 32.
  • the program includes one or more computer instructions, and the following steps can be implemented when one or more computer instructions are executed by the processor 31:
  • the movement state of the movable platform is controlled according to the obstacle recognition result.
  • the structure of the control device may further include a communication interface 33 for the electronic device to communicate with other devices or a communication network.
  • the at least one type of characteristic information includes: position information relative to the radar, echo signal energy, and movement speed relative to the movable platform; the position information includes a distance relative to the radar;
  • the processor 31 is further configured to filter out point cloud data whose distance to the radar is less than a preset distance in the first point cloud data, and the preset distance corresponds to the size information of the movable platform.
  • the position information includes a position relative to the radar
  • the processor 31 is further configured to filter out point cloud data located within a preset spatial range relative to the position of the radar, and the preset spatial range includes the movable platform.
  • the processor 31 is further configured to: filter out point cloud data in the first point cloud data whose azimuth information relative to the radar and echo signal energy do not meet their respective preset thresholds.
  • the position information includes the distance and angle relative to the radar
  • the processor 31 is further configured to: for the target point cloud data in the first point cloud data, if the distance between the target point cloud data and the radar is less than a preset distance, the angle of the target point cloud data relative to the radar is within a preset angle range, and the echo signal energy of the target point cloud data is less than a preset energy value, then filter out the target point cloud data.
  • the position information includes a distance relative to the radar
  • the processor 31 is further configured to: for the target point cloud data in the first point cloud data, if the distance between the target point cloud data and the radar is less than a preset distance and the echo signal energy of the target point cloud data is less than a preset energy value, filter out the target point cloud data, wherein the task execution signal of the movable platform is turned on.
  • the processor 31 is further configured to: obtain the movement speed of the movable platform;
  • the processor 31 is further configured to: for the target point cloud data in the first point cloud data, if the magnitude of the vector sum of the moving speed of the target point cloud data relative to the radar and the moving speed of the movable platform is greater than a set speed error, filter out the target point cloud data.
  • the position information includes a position relative to the radar
  • the processor 31 is further configured to: fit a target plane corresponding to the ground according to the position relative to the radar in the first point cloud data;
  • the position information further includes an angle relative to the radar
  • the processor 31 is further configured to: filter out the third point cloud data whose angle relative to the radar satisfies a preset angle range from the first point cloud data, and the preset angle range corresponds to the angle of the radar relative to the ground;
  • processor 31 is further configured to perform clustering processing on the second point cloud data to obtain at least one cluster of point cloud data
  • the processor 31 is further configured to: for the target cluster point cloud data in the at least one cluster of point cloud data, obtain the center of the target cluster point cloud data, where the position information of the center includes a position relative to the radar;
  • the processor 31 is further configured to: count the number of point cloud data included in the target cluster point cloud data; if the number is greater than a preset value, determine that the target cluster point cloud data corresponds to an obstacle.
  • the device shown in FIG. 10 can execute the methods of the embodiments shown in FIGS. 1 to 7.
  • parts that are not described in detail in this embodiment please refer to the related descriptions of the embodiments shown in FIGS. 1 to 7.
  • an embodiment of the present invention provides a computer-readable storage medium.
  • the storage medium is a computer-readable storage medium.
  • the computer-readable storage medium stores program instructions, and the program instructions are used to implement the above control method for a movable platform.
  • the related detection device is, for example, an IMU (inertial measurement unit).
  • the embodiments of the device described above are merely illustrative.
  • the division of the modules or units is only a logical function division; in actual implementation, there may be other division methods, for example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not implemented.
  • the displayed or discussed mutual coupling or direct coupling or communication connection may be indirect coupling or communication connection through some interfaces, devices, or units, and may be in electrical, mechanical, or other forms.
  • the units described as separate components may or may not be physically separated, and the components displayed as units may or may not be physical units, that is, they may be located in one place, or they may be distributed on multiple network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
  • the functional units in the various embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit.
  • the above-mentioned integrated unit can be implemented in the form of hardware or software functional unit.
  • the integrated unit is implemented in the form of a software functional unit and sold or used as an independent product, it can be stored in a computer readable storage medium.
  • the technical solution of the present invention, in essence, or the part that contributes to the prior art, or all or part of the technical solution, can be embodied in the form of a software product, and the computer software product is stored in a storage medium.
  • the aforementioned storage media include: USB flash drives, removable hard disks, read-only memory (ROM), random access memory (RAM), magnetic disks, optical disks, and other media that can store program code.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Astronomy & Astrophysics (AREA)
  • Electromagnetism (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

The invention relates to a control method for a movable platform, a movable platform, a device, and a storage medium. A radar arranged on a movable platform detects first point cloud data, and the first point cloud data is screened according to feature information of different dimensions in the first point cloud data, so as to obtain second point cloud data corresponding to an obstacle. The obstacle is then recognized according to the second point cloud data, so that the movement state of the movable platform can be better controlled according to the recognized obstacle. By using feature information of multiple dimensions in the point cloud data, the point cloud data corresponding to an obstacle can be accurately filtered out, and the obstacle is recognized more accurately, thereby ensuring the normal movement of the movable platform.
PCT/CN2020/076845 2020-02-26 2020-02-26 Procédé de commande de plateforme mobile, plateforme mobile, dispositif, et support de stockage WO2021168708A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/CN2020/076845 WO2021168708A1 (fr) 2020-02-26 2020-02-26 Procédé de commande de plateforme mobile, plateforme mobile, dispositif, et support de stockage
CN202080004213.5A CN112585553A (zh) 2020-02-26 2020-02-26 用于可移动平台的控制方法、可移动平台、设备和存储介质

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2020/076845 WO2021168708A1 (fr) 2020-02-26 2020-02-26 Procédé de commande de plateforme mobile, plateforme mobile, dispositif, et support de stockage

Publications (1)

Publication Number Publication Date
WO2021168708A1 true WO2021168708A1 (fr) 2021-09-02

Family

ID=75145414

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/076845 WO2021168708A1 (fr) 2020-02-26 2020-02-26 Procédé de commande de plateforme mobile, plateforme mobile, dispositif, et support de stockage

Country Status (2)

Country Link
CN (1) CN112585553A (fr)
WO (1) WO2021168708A1 (fr)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113791426A (zh) * 2021-09-10 2021-12-14 深圳市唯特视科技有限公司 雷达p显界面生成方法、装置、计算机设备及存储介质
CN114723830A (zh) * 2022-03-21 2022-07-08 深圳市正浩创新科技股份有限公司 障碍物的识别方法、设备及存储介质
CN114973006A (zh) * 2022-08-02 2022-08-30 四川省机械研究设计院(集团)有限公司 花椒采摘方法、装置、系统及存储介质
CN116465302A (zh) * 2023-03-31 2023-07-21 中国地震局地质研究所 一种断层运动的监测方法、装置、设备及存储介质

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113888621B (zh) * 2021-09-29 2022-08-26 中科海微(北京)科技有限公司 装载率确定方法、装置、边缘计算服务器及存储介质
CN114706070A (zh) * 2022-02-22 2022-07-05 惠州市德赛西威智能交通技术研究院有限公司 一种基于4d毫米波雷达的自动泊车车位搜索方法和系统

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104156926A (zh) * 2014-08-19 2014-11-19 武汉海达数云技术有限公司 多场景下车载激光点云噪声点去除方法
CN107064955A (zh) * 2017-04-19 2017-08-18 北京汽车集团有限公司 障碍物聚类方法及装置
CN108647646A (zh) * 2018-05-11 2018-10-12 北京理工大学 基于低线束雷达的低矮障碍物的优化检测方法及装置
CN108897311A (zh) * 2018-06-14 2018-11-27 天津大学 一种碾压机集群无人驾驶筑坝系统
US20190129039A1 (en) * 2017-10-31 2019-05-02 Analytical Mechanics Associates, Inc. Polyhedral geofences
CN109709986A (zh) * 2019-03-06 2019-05-03 华北电力大学(保定) 一种无人机控制系统及方法
CN109739256A (zh) * 2018-12-20 2019-05-10 深圳市道通智能航空技术有限公司 一种无人机降落避障方法、装置及无人机
CN110458055A (zh) * 2019-07-29 2019-11-15 江苏必得科技股份有限公司 一种障碍物检测方法及系统
WO2020023745A1 (fr) * 2018-07-26 2020-01-30 Bear Flag Robotics, Inc. Dispositifs de commande de véhicule destinés à des applications agricoles et industrielles



Also Published As

Publication number Publication date
CN112585553A (zh) 2021-03-30

Similar Documents

Publication Publication Date Title
WO2021168708A1 (fr) Procédé de commande de plateforme mobile, plateforme mobile, dispositif, et support de stockage
US20190196474A1 (en) Control method, control apparatus, control device, and movable platform
EP3640921B1 (fr) Système adaptatif de détection et d'évitement
CN108226951B (zh) 一种基于激光传感器的快速运动障碍物实时跟踪方法
JP6783950B2 (ja) 無人航空機の障害物回避制御方法および無人航空機
CN108124472B (zh) 一种闪避障碍物的方法、装置及飞行器
WO2017040254A1 (fr) Atténuation des menaces sur les petits aéronefs sans équipage
WO2017034689A1 (fr) Système et procédé pour l'échantillonnage d'une carte de profondeur par laser
CN112051575B (zh) 一种毫米波雷达与激光雷达的调整方法及相关装置
US11587445B2 (en) System and method for fusing asynchronous sensor tracks in a track fusion application
Huh et al. Vision-based sense-and-avoid framework for unmanned aerial vehicles
Cho et al. Vision-based detection and tracking of airborne obstacles in a cluttered environment
EP4085444A1 (fr) Détection et évitement à base acoustique pour aéronef
JP7406656B2 (ja) 航空機の相関動作及び検知
WO2021087751A1 (fr) Procédé de mesure de distance, dispositif de mesure de distance, plateforme mobile autonome et support de stockage
CN115033026B (zh) 搭载云台的斜侧装无人机毫米波雷达避障和定高方法
Geyer et al. Prototype sense-and-avoid system for UAVs
WO2021087737A1 (fr) Procédé et dispositif de détection d'état de montage de radar, plate-forme mobile, et support de stockage
CN108133076B (zh) 基于四维坐标的无人机碰撞模型的建模方法
US10330769B1 (en) Method and apparatus for geolocating emitters in a multi-emitter environment
CN112334880B (zh) 可移动平台对障碍物的处理方法、装置及计算机存储介质
WO2021087643A1 (fr) Procédé d'estimation de terrain et de suppression d'échos de sol, véhicule aérien sans pilote, radar rotatif et support de stockage
WO2020041959A1 (fr) Procédé de prédiction de terrain au moyen d'un radar à onde continue, dispositif, système et véhicule aérien sans pilote
Bauer et al. Real Flight Application of a Monocular Image‐Based Aircraft Collision Decision Method
Zsedrovits et al. Distant aircraft detection in sense-and-avoid on kilo-processor architectures

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20921919

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20921919

Country of ref document: EP

Kind code of ref document: A1