CN116069051A - Unmanned aerial vehicle control method, device, equipment and readable storage medium

Info

Publication number: CN116069051A
Application number: CN202111271016.2A
Authority: CN (China)
Prior art keywords: data, point cloud, cloud data, radar, target
Legal status: Granted
Other languages: Chinese (zh)
Other versions: CN116069051B (EN)
Inventors: 刘长江, 庞勃, 景华, 郭彦杰
Current Assignee: Beijing Sankuai Online Technology Co Ltd
Original Assignee: Beijing Sankuai Online Technology Co Ltd
Application filed by Beijing Sankuai Online Technology Co Ltd
Priority to CN202111271016.2A
Publication of CN116069051A
Application granted
Publication of CN116069051B

Classifications

    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00: Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/10: Simultaneous control of position or course in three dimensions
    • G05D1/101: Simultaneous control of position or course in three dimensions specially adapted for aircraft
    • G05D1/106: Change initiated in response to external conditions, e.g. avoidance of elevated terrain or of no-fly zones

Abstract

The application discloses a control method, device and equipment for an unmanned aerial vehicle, and a readable storage medium, belonging to the technical field of data processing. The method includes: acquiring a plurality of first point cloud data observed by a first radar and a plurality of second point cloud data observed by a second radar; associating the plurality of first point cloud data with the plurality of second point cloud data to obtain point cloud data pairs, where a point cloud data pair includes at least one first point cloud data and at least one second point cloud data; and performing flight control on an unmanned aerial vehicle in the radar observation area based on the point cloud data pairs. Because the point cloud data are acquired by radar, they are unaffected by weather factors such as rain, snow and illumination and have high accuracy; controlling the flight of the unmanned aerial vehicle based on such point cloud data therefore improves control precision and, in turn, flight safety.

Description

Unmanned aerial vehicle control method, device, equipment and readable storage medium
Technical Field
The embodiments of the application relate to the technical field of data processing, and in particular to a control method, device and equipment for an unmanned aerial vehicle, and a readable storage medium.
Background
An unmanned aerial vehicle is an aircraft operated by a radio remote control device or by its own program control device. As unmanned aerial vehicle technology has matured, unmanned aerial vehicles have been increasingly applied in delivery scenes, aerial photography scenes, and the like.
In the related art, controlling the flight of an unmanned aerial vehicle requires acquiring position data of the unmanned aerial vehicle from a built-in satellite navigation chip, such as a Global Navigation Satellite System (GNSS) chip, and acquiring visual positioning information from a camera or the like. The unmanned aerial vehicle is then controlled to fly based on its position data and the visual positioning information, where flying includes taking off, landing, flying along a route, and the like.
However, visual positioning information acquired by a camera or the like is easily affected by weather factors such as rain, snow and illumination, so the control precision of the unmanned aerial vehicle is low and its flight safety is poor.
Disclosure of Invention
The embodiment of the application provides a control method, a control device, control equipment and a readable storage medium of an unmanned aerial vehicle, which can be used for solving the problems in the related art.
In one aspect, an embodiment of the present application provides a control method of an unmanned aerial vehicle, where the method includes:
acquiring a plurality of first point cloud data observed by a first radar and a plurality of second point cloud data observed by a second radar;
correlating the plurality of first point cloud data with the plurality of second point cloud data to obtain point cloud data pairs, wherein the point cloud data pairs comprise at least one first point cloud data and at least one second point cloud data;
and performing flight control on the unmanned aerial vehicle in the radar observation area based on the point cloud data pair.
In a possible implementation manner, the associating the plurality of first point cloud data with the plurality of second point cloud data to obtain a point cloud data pair includes:
clustering the plurality of first point cloud data to obtain at least one first data set;
clustering the plurality of second point cloud data to obtain at least one second data set;
and correlating the at least one first data set with the at least one second data set to obtain the point cloud data pair.
In a possible implementation manner, the clustering processing is performed on the plurality of first point cloud data to obtain at least one first data set, including:
screening a plurality of target point cloud data from the plurality of first point cloud data;
and clustering the plurality of target point cloud data to obtain the at least one first data set.
In a possible implementation manner, the screening the plurality of target point cloud data from the plurality of first point cloud data includes at least one of the following:
screening target point cloud data with speed data smaller than or equal to a target speed from the plurality of first point cloud data based on the speed data contained in the plurality of first point cloud data;
screening target point cloud data with echo intensity data greater than or equal to target echo intensity from the plurality of first point cloud data based on echo intensity data contained in the plurality of first point cloud data;
screening target point cloud data with the slant range data larger than or equal to a target slant range from the plurality of first point cloud data based on slant range data contained in the plurality of first point cloud data;
and screening target point cloud data with angle data smaller than or equal to a target angle from the plurality of first point cloud data based on the angle data contained in the plurality of first point cloud data.
In a possible implementation manner, the clustering the plurality of target point cloud data to obtain the at least one first data set includes:
calculating the slant range difference between every two target point cloud data based on the slant range data contained in each of the plurality of target point cloud data;
and clustering the plurality of target point cloud data based on the slant range difference between every two target point cloud data to obtain the at least one first data set, wherein the slant range difference between any two target point cloud data in a first data set is smaller than or equal to a first threshold.
In a possible implementation manner, the associating the at least one first data set with the at least one second data set to obtain the point cloud data pair includes:
determining the center slant range of each first data set and the center slant range of each second data set;
calculating the slant range difference between a first data set and a second data set based on the center slant range of each first data set and the center slant range of each second data set;
and associating the at least one first data set with the at least one second data set based on the slant range difference between a first data set and a second data set to obtain the point cloud data pair, wherein the slant range difference between the first data set and the second data set in a point cloud data pair is smaller than or equal to a second threshold.
In one possible implementation, the determining the center slant range of each first data set includes:
calculating the slant range average of the first point cloud data in each first data set based on the slant range data contained in each first point cloud data in the first data set, so as to obtain the center slant range of each first data set.
In one possible implementation manner, there are a plurality of point cloud data pairs, and any one point cloud data pair is a point cloud data pair corresponding to the current moment;
the performing flight control on the unmanned aerial vehicle in the radar observation area based on the point cloud data pairs comprises the following steps:
determining motion state data at the previous moment based on a point cloud data pair corresponding to the previous moment, wherein the motion state data is data corresponding to the point cloud data pair in a rectangular coordinate system;
determining at least one data to be screened from a plurality of point cloud data pairs corresponding to the current moment based on the motion state data of the previous moment;
and performing flight control on the unmanned aerial vehicle in the radar observation area based on the at least one data to be screened.
In a possible implementation manner, the determining at least one data to be screened from the plurality of point cloud data pairs corresponding to the current moment based on the motion state data of the previous moment includes:
predicting and obtaining first motion state data corresponding to the current moment based on the motion state data of the previous moment;
determining each second motion state data corresponding to the current moment based on each point cloud data pair corresponding to the current moment;
and determining at least one datum to be screened from the second motion state data based on the first motion state data.
In a possible implementation manner, the performing flight control on the unmanned aerial vehicle in the radar observation area based on the at least one data to be screened includes:
acquiring position data of an unmanned aerial vehicle;
and performing flight control on the unmanned aerial vehicle in the radar observation area based on the at least one datum to be screened and the position data.
In a possible implementation manner, the performing flight control on the unmanned aerial vehicle in the radar observation area based on the at least one data to be screened and the position data includes:
determining distance data corresponding to each piece of data to be screened based on the at least one piece of data to be screened and the position data, wherein the distance data corresponding to the piece of data to be screened is used for representing the distance between an object corresponding to the piece of data to be screened and the unmanned aerial vehicle;
screening target data whose distance data is smaller than or equal to the target distance from the at least one piece of data to be screened, based on the distance data corresponding to each piece of data to be screened;
and carrying out flight control on the unmanned aerial vehicle in the radar observation area based on the target data.
In one possible implementation, the radar beam planes of the first radar and the second radar are perpendicular, the first radar includes at least two patch antennas, and the second radar includes at least two patch antennas.
On the other hand, the embodiment of the application provides a control device of an unmanned aerial vehicle, which comprises:
the acquisition module is used for acquiring a plurality of first point cloud data observed by the first radar and a plurality of second point cloud data observed by the second radar;
the association module is used for associating the plurality of first point cloud data with the plurality of second point cloud data to obtain point cloud data pairs, wherein the point cloud data pairs comprise at least one first point cloud data and at least one second point cloud data;
and the control module is used for controlling the flight of the unmanned aerial vehicle in the radar observation area based on the point cloud data pair.
In a possible implementation manner, the association module is configured to perform clustering processing on the plurality of first point cloud data to obtain at least one first data set; clustering the plurality of second point cloud data to obtain at least one second data set; and correlating the at least one first data set with the at least one second data set to obtain the point cloud data pair.
In a possible implementation manner, the association module is configured to screen a plurality of target point cloud data from the plurality of first point cloud data; and clustering the plurality of target point cloud data to obtain the at least one first data set.
In one possible implementation manner, the association module is configured to perform at least one of the following:
screening target point cloud data with speed data smaller than or equal to a target speed from the plurality of first point cloud data based on the speed data contained in the plurality of first point cloud data;
screening target point cloud data with echo intensity data greater than or equal to target echo intensity from the plurality of first point cloud data based on echo intensity data contained in the plurality of first point cloud data;
screening target point cloud data with the slant range data larger than or equal to a target slant range from the plurality of first point cloud data based on slant range data contained in the plurality of first point cloud data;
and screening target point cloud data with angle data smaller than or equal to a target angle from the plurality of first point cloud data based on the angle data contained in the plurality of first point cloud data.
In a possible implementation manner, the association module is configured to calculate the slant range difference between every two target point cloud data based on the slant range data contained in each of the plurality of target point cloud data; and cluster the plurality of target point cloud data based on the slant range difference between every two target point cloud data to obtain the at least one first data set, wherein the slant range difference between any two target point cloud data in a first data set is smaller than or equal to a first threshold.
In a possible implementation manner, the association module is configured to determine the center slant range of each first data set and the center slant range of each second data set; calculate the slant range difference between a first data set and a second data set based on the center slant range of each first data set and the center slant range of each second data set; and associate the at least one first data set with the at least one second data set based on the slant range difference between a first data set and a second data set to obtain the point cloud data pair, wherein the slant range difference between the first data set and the second data set in a point cloud data pair is smaller than or equal to a second threshold.
In a possible implementation manner, the association module is configured to calculate the slant range average of the first point cloud data in each first data set based on the slant range data contained in each first point cloud data in the first data set, so as to obtain the center slant range of each first data set.
In one possible implementation manner, there are a plurality of point cloud data pairs, and any one point cloud data pair is a point cloud data pair corresponding to the current moment;
the control module is used for determining motion state data at the previous moment based on the point cloud data pair corresponding to the previous moment, wherein the motion state data are data corresponding to the point cloud data pair in a rectangular coordinate system; determining at least one data to be screened from a plurality of point cloud data pairs corresponding to the current moment based on the motion state data of the previous moment; and performing flight control on the unmanned aerial vehicle in the radar observation area based on the at least one data to be screened.
In one possible implementation manner, the control module is configured to predict and obtain first motion state data corresponding to the current moment based on the motion state data of the previous moment; determining each second motion state data corresponding to the current moment based on each point cloud data pair corresponding to the current moment; and determining at least one datum to be screened from the second motion state data based on the first motion state data.
In one possible implementation manner, the control module is configured to obtain position data of the unmanned aerial vehicle; and performing flight control on the unmanned aerial vehicle in the radar observation area based on the at least one datum to be screened and the position data.
In a possible implementation manner, the control module is configured to determine distance data corresponding to each piece of data to be screened based on the at least one piece of data to be screened and the location data, where the distance data corresponding to the piece of data to be screened is used to characterize a distance between an object corresponding to the piece of data to be screened and the unmanned aerial vehicle; screening target data with the distance data smaller than or equal to the target distance from the at least one piece of data to be screened based on the distance data corresponding to the data to be screened; and carrying out flight control on the unmanned aerial vehicle in the radar observation area based on the target data.
In one possible implementation, the radar beam planes of the first radar and the second radar are perpendicular, the first radar includes at least two patch antennas, and the second radar includes at least two patch antennas.
On the other hand, the embodiment of the application provides an electronic device, which comprises a processor and a memory, wherein at least one program code is stored in the memory, and the at least one program code is loaded and executed by the processor, so that the electronic device realizes the control method of the unmanned aerial vehicle.
In another aspect, there is also provided a computer readable storage medium having at least one program code stored therein, the at least one program code being loaded and executed by a processor, to cause a computer to implement the control method of the unmanned aerial vehicle described in any of the above.
In another aspect, a computer program or a computer program product is provided, in which at least one computer instruction is stored, the at least one computer instruction being loaded and executed by a processor, so that a computer implements the control method of the unmanned aerial vehicle described in any one of the above.
The technical scheme provided by the embodiment of the application at least brings the following beneficial effects:
according to the technical scheme, flight control is performed on the unmanned aerial vehicle in the radar observation area based on point cloud data pairs formed by associating the plurality of first point cloud data with the plurality of second point cloud data. Because the point cloud data is acquired by means of the radar, the influence of weather factors such as rain, snow and illumination is avoided, and the data accuracy is high, when the unmanned aerial vehicle is controlled to fly based on the point cloud data, the control precision of the unmanned aerial vehicle is improved, and therefore the flight safety is improved.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed in the description of the embodiments will be briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present application, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is an implementation environment schematic diagram of a control method of an unmanned aerial vehicle according to an embodiment of the present application;
fig. 2 is a flowchart of a control method of the unmanned aerial vehicle provided in the embodiment of the present application;
FIG. 3 is a schematic view of a radar assembly according to an embodiment of the present application;
fig. 4 is a schematic diagram of point cloud data according to an embodiment of the present application;
fig. 5 is a schematic diagram of interaction between a drone and a radar system according to an embodiment of the present application;
fig. 6 is a schematic diagram of another unmanned aerial vehicle interacting with a radar system according to an embodiment of the present application;
FIG. 7 is a schematic diagram of a data process provided by an embodiment of the present application;
FIG. 8 is a schematic diagram of a track provided in an embodiment of the present application;
fig. 9 is a schematic structural diagram of a control device of an unmanned aerial vehicle according to an embodiment of the present application;
Fig. 10 is a schematic structural diagram of a terminal device provided in an embodiment of the present application;
fig. 11 is a schematic structural diagram of a server according to an embodiment of the present application.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the present application more apparent, the embodiments of the present application will be described in further detail below with reference to the accompanying drawings.
Fig. 1 is a schematic diagram of an implementation environment of the control method of an unmanned aerial vehicle according to an embodiment of the present application. As shown in fig. 1, the implementation environment includes an electronic device 11, and the control method of the unmanned aerial vehicle provided in the embodiment of the present application may be executed by the electronic device 11. The electronic device 11 may comprise, for example, at least one of a terminal device or a server.
The terminal device may be at least one of a smart phone, a game console, a desktop computer, a tablet computer, an electronic book reader, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, and a laptop portable computer.
The server may be one server, or a server cluster formed by a plurality of servers, or any one of a cloud computing platform and a virtualization center, which is not limited in the embodiments of the present application. The server may be communicatively connected to the terminal device via a wired network or a wireless network. The server may have functions of data processing, data storage, data transceiving, and the like, and is not limited in the embodiments of the present application.
Based on the above implementation environment, the embodiment of the present application provides a control method of an unmanned aerial vehicle, taking a flowchart of the control method of an unmanned aerial vehicle provided in the embodiment of the present application as shown in fig. 2 as an example, the method may be executed by the electronic device 11 in fig. 1. As shown in fig. 2, the method comprises steps 21-23.
Step 21, acquiring a plurality of first point cloud data observed by the first radar and a plurality of second point cloud data observed by the second radar.
Point cloud data are the data of point clouds: a plurality of first point clouds (i.e., first point cloud data) are observed by the first radar, and a plurality of second point clouds (i.e., second point cloud data) are observed by the second radar. Any first point cloud data or second point cloud data includes, but is not limited to, speed data, angle data, echo intensity data, slant range data, and the like.
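By way of illustration only, the per-point fields listed above can be carried in a small record type. The sketches in this description use Python, and every name below is an assumption of this illustration, not notation from the claims:

    from dataclasses import dataclass

    @dataclass
    class RadarPoint:
        """One point cloud datum from a single radar observation.

        Field names are illustrative; the patent only requires that each
        point carry speed, angle, echo intensity and slant range data.
        """
        slant_range: float     # distance from the radar to the point, in meters
        angle: float           # angle data relative to the radar beam plane, in radians
        velocity: float        # radial (Doppler) speed data, in m/s
        echo_intensity: float  # echo intensity data of the returned signal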
Wherein the radar beam planes of the first radar and the second radar are perpendicular, the first radar comprises at least two patch antennas, and the second radar comprises at least two patch antennas.
In this embodiment of the present application, the first radar and the second radar are both configured with patch antennas, and because the cost of the patch antennas is lower, the cost of the first radar and the second radar in this embodiment of the present application is lower. The first radar comprises at least two patch antennas and the second radar also comprises at least two patch antennas, by means of which the observability and the localization ability of the first radar and the second radar are improved.
The radar observes objects by means of its patch antennas: a patch antenna transmits radar wave signals, and when a transmitted radar wave signal meets an object, an echo signal is returned; the arrangement direction of the patch antennas is perpendicular to the radar beam plane of the radar wave signals. In the embodiment of the application, the radar beam planes of the first radar and the second radar are perpendicular to each other, so that the beam polarization directions of the two radars are orthogonal. This prevents mutual interference between the echoes of the first radar and the second radar, improves the accuracy of the echo signals, and thus improves the accuracy of the first point cloud data and the second point cloud data.
As shown in fig. 3, fig. 3 is a schematic structural diagram of a radar assembly according to an embodiment of the present application. The radar assembly comprises a first radar 31 and a second radar 32; the first radar 31 and the second radar 32 are on the same horizontal plane, and their radar beam planes are perpendicular to each other. The first radar 31 comprises at least two patch antennas 311 (4 patch antennas 311 are shown in fig. 3) and a processing chip 312, the at least two patch antennas 311 being on the same horizontal plane. The second radar 32 comprises at least two patch antennas 321 (4 patch antennas 321 are shown in fig. 3) and a processing chip 322, the at least two patch antennas 321 being on the same horizontal plane.
Since the first radar includes at least two patch antennas and the second radar includes at least two patch antennas, for any one object, the first radar may observe at least two first point cloud data corresponding to the object, and the second radar may observe at least two second point cloud data corresponding to the object. Also, since the object is at least one, the first radar can observe a plurality of first point cloud data, and the second radar can observe a plurality of second point cloud data.
Step 22, associating the plurality of first point cloud data with the plurality of second point cloud data to obtain a point cloud data pair, wherein the point cloud data pair comprises at least one first point cloud data and at least one second point cloud data.
In this embodiment of the application, since the first point cloud data observed by the first radar and the second point cloud data observed by the second radar may correspond to the same object, the plurality of first point cloud data and the plurality of second point cloud data are associated to obtain point cloud data pairs, so that first point cloud data and second point cloud data corresponding to the same object are associated into the same point cloud data pair. There is at least one point cloud data pair.
In one possible implementation manner, associating the plurality of first point cloud data with the plurality of second point cloud data to obtain a point cloud data pair includes: clustering the plurality of first point cloud data to obtain at least one first data set; clustering the plurality of second point cloud data to obtain at least one second data set; and correlating the at least one first data set with the at least one second data set to obtain the point cloud data pair.
In this embodiment of the present application, since the first radar may observe at least two first point cloud data corresponding to any one object, clustering processing is performed on the plurality of first point cloud data to obtain at least one first data set, so as to cluster first point cloud data corresponding to the same object in the same first data set. Similarly, since the second radar can observe at least two second point cloud data corresponding to any one object, clustering is performed on the plurality of second point cloud data to obtain at least one second data set, so that the second point cloud data corresponding to the same object are clustered in the same second data set.
And then, correlating the at least one first data set with the at least one second data set to obtain the point cloud data pair. Because the clustering processing is performed on the plurality of first point cloud data and the plurality of second point cloud data, the association speed is increased, and the data processing efficiency is improved.
In this embodiment of the present application, clustering is performed on a plurality of first point cloud data to obtain at least one first data set, including: screening a plurality of target point cloud data from the plurality of first point cloud data; and clustering the cloud data of the plurality of target points to obtain at least one first data set.
Since the first radar also observes non-target objects (in this embodiment, the target object includes the unmanned aerial vehicle), the plurality of first point cloud data obtained by the first radar need to be screened to filter out some or all of the point cloud data corresponding to non-target objects, so as to retain the point cloud data corresponding to the target object and thus obtain a plurality of target point cloud data (which may be denoted as first target point cloud data). Then, clustering is performed on the plurality of target point cloud data to obtain the at least one first data set.
Based on the same principle, clustering is carried out on the plurality of second point cloud data to obtain at least one second data set, wherein the clustering comprises the following steps: screening a plurality of target point cloud data (which can be recorded as second target point cloud data) from the plurality of second point cloud data; and clustering the cloud data of the plurality of target points to obtain at least one second data set. The description of the clustering process on the first point cloud data may be found in the foregoing, and will not be described herein.
Screening the plurality of target point cloud data from the plurality of first point cloud data includes at least one of implementations A1-A4 shown below.
In implementation A1, target point cloud data whose speed data is less than or equal to a target speed are screened from the plurality of first point cloud data based on the speed data contained in each of the plurality of first point cloud data.
When the radar observes an object, the obtained point cloud data corresponding to the object include speed data (such as Doppler speed data) of the object; therefore, any first point cloud data includes speed data. Based on the speed data contained in each first point cloud data, first point cloud data whose speed data is less than or equal to the target speed are screened from the plurality of first point cloud data and taken as target point cloud data (which may be denoted as first target point cloud data). Because the rotor of the unmanned aerial vehicle moves at a high speed, retaining only the first point cloud data whose speed data is less than or equal to the target speed filters out the first point cloud data corresponding to the rotor, reducing the influence of the rotor on flight control and improving flight safety.
Based on the same principle, when a plurality of target point cloud data (which can be recorded as second target point cloud data) are screened out from a plurality of second point cloud data, based on the speed data contained in each of the plurality of second point cloud data, target point cloud data with the speed data smaller than or equal to the target speed are screened out from the plurality of second point cloud data. The relevant descriptions are described above and are not repeated here.
It should be noted that the value of the target speed is not limited, and is determined according to the application scenario or manual experience.
In implementation A2, target point cloud data with echo intensity data greater than or equal to the target echo intensity is screened from the plurality of first point cloud data based on echo intensity data contained in each of the plurality of first point cloud data.
When the radar observes an object, the obtained point cloud data corresponding to the object includes echo intensity data, and therefore, any one of the first point cloud data includes echo intensity data. And screening first point cloud data with echo intensity data greater than or equal to the target echo intensity from the plurality of first point cloud data based on echo intensity data contained in each first point cloud data as target point cloud data (which can be recorded as first target point cloud data).
Since the first point cloud data contain noise data, interference data, and the like, in order to extract target data from the first point cloud data, it is necessary to determine whether target data are present in the first point cloud data based on whether the echo intensity data contained in the first point cloud data reach the target echo intensity.
When the echo intensity data is smaller than the target echo intensity, the first point cloud data is considered to contain no target data; when the echo intensity data is greater than or equal to the target echo intensity, the first point cloud data is considered to contain target data. Based on this principle, first point cloud data whose echo intensity data is greater than or equal to the target echo intensity are screened from the plurality of first point cloud data, which reduces the influence of noise, interference, and the like on flight control, lowers the false alarm probability, and improves flight safety.
Based on the same principle, when a plurality of target point cloud data (which can be recorded as second target point cloud data) are screened out from a plurality of second point cloud data, the target point cloud data with the echo intensity data greater than or equal to the target echo intensity are screened out from the plurality of second point cloud data based on the echo intensity data contained in the plurality of second point cloud data. The relevant descriptions are described above and are not repeated here.
It should be noted that the magnitude of the value of the target echo intensity is not limited, and is determined according to an application scenario or manual experience.
In implementation A3, target point cloud data whose slant range data is greater than or equal to a target slant range are screened from the plurality of first point cloud data based on the slant range data contained in each of the plurality of first point cloud data.
When the radar observes an object, the obtained point cloud data corresponding to the object include slant range data; therefore, any first point cloud data includes slant range data. Based on the slant range data contained in each first point cloud data, first point cloud data whose slant range data is greater than or equal to the target slant range are screened from the plurality of first point cloud data and taken as target point cloud data (which may be denoted as first target point cloud data).
Because the unmanned aerial vehicle flies at a relatively high altitude, the slant range data corresponding to the unmanned aerial vehicle are large. By retaining only the first point cloud data whose slant range data is greater than or equal to the target slant range, the first point cloud data corresponding to ground objects (which are closer to the radar and therefore have smaller slant range data) are filtered out, reducing the influence of ground clutter on flight control and improving flight safety.
Based on the same principle, when a plurality of target point cloud data (which may be denoted as second target point cloud data) are screened from the plurality of second point cloud data, target point cloud data whose slant range data is greater than or equal to the target slant range are screened from the plurality of second point cloud data based on the slant range data contained in each of the plurality of second point cloud data. The relevant descriptions are described above and are not repeated here.
It should be noted that the value of the target slant range is not limited, and is determined according to the application scenario or manual experience.
In implementation A4, based on angle data included in each of the plurality of first point cloud data, target point cloud data whose angle data is less than or equal to a target angle is selected from the plurality of first point cloud data.
When the radar observes an object, the obtained point cloud data corresponding to the object includes angle data, and therefore, any one of the first point cloud data includes angle data. And screening first point cloud data with angle data smaller than or equal to a target angle from the plurality of first point cloud data based on the angle data contained in each first point cloud data, wherein the first point cloud data is used as target point cloud data (can be recorded as first target point cloud data).
The unmanned aerial vehicle control method is achieved based on the first radar and the second radar, and the first radar and the second radar cooperatively control the unmanned aerial vehicle to fly. Therefore, the first point cloud data with the angle data smaller than or equal to the target angle are screened out from the plurality of first point cloud data, so that the object corresponding to the screened first point cloud data is in the detection range of the first radar and the second radar, namely, the object corresponding to the screened first point cloud data is in the cross beam range of the two radars, the first point cloud data in the non-cross beam range is removed, and the accuracy of flight control is improved.
Based on the same principle, when a plurality of target point cloud data (which can be recorded as second target point cloud data) are screened out from a plurality of second point cloud data, based on angle data contained in each of the plurality of second point cloud data, target point cloud data with the angle data smaller than or equal to a target angle are screened out from the plurality of second point cloud data. The relevant descriptions are described above and are not repeated here.
It should be noted that the value of the target angle is not limited, and is determined according to the application scenario or manual experience.
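As an illustration, the four screening implementations A1-A4 can be combined into a single pass over the point cloud data. This is a minimal sketch assuming the RadarPoint record from the earlier sketch; the four threshold parameters are stand-ins for the target speed, target echo intensity, target slant range and target angle, and the patent permits applying any subset of A1-A4 rather than all four:

    def screen_target_points(points, max_speed, min_echo, min_range, max_angle):
        """Screen target point cloud data using implementations A1-A4."""
        return [
            p for p in points
            if abs(p.velocity) <= max_speed       # A1: drop high-speed rotor returns
            and p.echo_intensity >= min_echo      # A2: drop noise and interference
            and p.slant_range >= min_range        # A3: drop ground clutter
            and abs(p.angle) <= max_angle         # A4: keep the crossed-beam region
        ]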
In this embodiment of the application, clustering the plurality of target point cloud data to obtain the at least one first data set includes: calculating the slant range difference between every two target point cloud data based on the slant range data contained in each of the plurality of target point cloud data; and clustering the plurality of target point cloud data based on the slant range difference between every two target point cloud data to obtain the at least one first data set, wherein the slant range difference between any two target point cloud data in a first data set is smaller than or equal to a first threshold.
According to at least one of the above implementations A1-A4, after a plurality of target point cloud data (which may be denoted as first target point cloud data) are screened from the plurality of first point cloud data, for every two target point cloud data, the slant range difference between the two is calculated based on the slant range data contained in each of the two.
When the slant range difference between two target point cloud data is smaller than or equal to the first threshold, the two target point cloud data are gathered into the same first data set; when the slant range difference between two target point cloud data is larger than the first threshold, the two target point cloud data are gathered into different first data sets. In this way, the plurality of target point cloud data are clustered into the at least one first data set.
Since the distribution of aerial objects is very sparse, it is unlikely that two or more aerial objects are present in a small area; for example, two unmanned aerial vehicles will not be present in a small area at the same time. Therefore, by gathering two target point cloud data whose slant range difference is smaller than or equal to the first threshold into the same first data set, the target point cloud data corresponding to the same object are gathered into the same first data set; that is, the plurality of target point cloud data in one first data set correspond to the same object, which improves the timeliness of the operation.
Based on the same principle, after a plurality of target point cloud data (which may be denoted as second target point cloud data) are screened from the plurality of second point cloud data, clustering the plurality of target point cloud data to obtain the at least one second data set includes: calculating the slant range difference between every two target point cloud data based on the slant range data contained in each of the plurality of target point cloud data; and clustering the plurality of target point cloud data based on the slant range difference between every two target point cloud data to obtain the at least one second data set, wherein the slant range difference between any two target point cloud data in a second data set is smaller than or equal to the first threshold. The relevant descriptions are set forth above and are not repeated here.
The value of the first threshold is not limited, and is determined according to the application scenario or manual experience. For example, the first threshold is 1 meter.
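A minimal sketch of this clustering, again assuming the RadarPoint record. The patent specifies only the property that the slant range difference within a set must not exceed the first threshold; the greedy sort-and-split below is one simple way to realize that property in one dimension, not the claimed algorithm:

    def cluster_by_slant_range(points, first_threshold):
        """Group points so that within each set the slant range difference
        between any two points is at most first_threshold."""
        clusters = []
        for p in sorted(points, key=lambda q: q.slant_range):
            # With points in ascending slant range order, a set's diameter is
            # the gap between its first point and the current candidate.
            if clusters and p.slant_range - clusters[-1][0].slant_range <= first_threshold:
                clusters[-1].append(p)
            else:
                clusters.append([p])
        return clusters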
In one possible implementation, associating the at least one first data set with the at least one second data set to obtain the point cloud data pair includes: determining the center slant range of each first data set and the center slant range of each second data set; calculating the slant range difference between a first data set and a second data set based on the center slant range of each first data set and the center slant range of each second data set; and associating the at least one first data set with the at least one second data set based on the slant range difference between a first data set and a second data set to obtain the point cloud data pair, wherein the slant range difference between the first data set and the second data set in a point cloud data pair is smaller than or equal to a second threshold.
In this embodiment, any one first data set includes at least one target point cloud data (which may be denoted as first target point cloud data). The center slant range of the first data set is determined based on the slant range data contained in each target point cloud data in the first data set.
In one possible implementation, determining the center slant range of each first data set includes: calculating the slant range average of the first point cloud data in each first data set based on the slant range data contained in each first point cloud data in the first data set, so as to obtain the center slant range of each first data set.
For any one first data set, the sum of the slant range data contained in the target point cloud data in the first data set is calculated, and the sum is divided by the number of target point cloud data in the first data set to obtain the slant range average, which is taken as the center slant range of the first data set.
It should be noted that, as described above, a plurality of target point cloud data may first be screened from the plurality of first point cloud data, and the plurality of target point cloud data are then clustered to obtain the at least one first data set. Therefore, the slant range average of the target point cloud data in each first data set may be calculated based on the slant range data contained in each target point cloud data in the first data set, so as to obtain the center slant range of each first data set.
Based on the same principle, determining the center slant range of each second data set includes: calculating the slant range average of the second point cloud data in each second data set based on the slant range data contained in each second point cloud data in the second data set, so as to obtain the center slant range of each second data set. The relevant descriptions are described above and are not repeated here.
In this embodiment, any one of the at least one first data set is combined with any one of the at least one second data set to obtain at least one combination result, where any one combination result includes one first data set and one second data set.
For each combination result, the slant range difference between the first data set and the second data set in the combination result is calculated. When the slant range difference is smaller than or equal to the second threshold, the first data set and the second data set in the combination result are associated to obtain a point cloud data pair; when the slant range difference is larger than the second threshold, it is determined that the first data set and the second data set in the combination result are not associated.
The value of the second threshold is not limited, and is determined according to the application scenario or manual experience. For example, the second threshold is 1 meter.
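A minimal sketch of the association step, assuming the clusters produced by the previous sketch; center_slant_range implements the mean described above, and the helper names are this illustration's assumptions:

    def center_slant_range(data_set):
        """Center slant range: the mean of the set's slant range data."""
        return sum(p.slant_range for p in data_set) / len(data_set)

    def associate_sets(first_sets, second_sets, second_threshold):
        """Pair each first data set with every second data set whose center
        slant range differs from it by at most second_threshold."""
        return [
            (fs, ss)
            for fs in first_sets
            for ss in second_sets
            if abs(center_slant_range(fs) - center_slant_range(ss)) <= second_threshold
        ]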
And step 23, performing flight control on the unmanned aerial vehicle in the radar observation area based on the point cloud data pair.
In this embodiment of the present application, the first radar and the second radar correspond to one radar observation area, and the first radar and the second radar are used for observing objects in the radar observation area. When the unmanned aerial vehicle in the radar observation area is subjected to flight control based on the point cloud data pair, the unmanned aerial vehicle is controlled to perform flight including ascending, landing and the like.
In one possible implementation manner, the number of the point cloud data pairs is a plurality of, and any one point cloud data pair is a point cloud data pair corresponding to the current moment; based on the point cloud data pair, performing flight control on the unmanned aerial vehicle in the radar observation area, including: determining motion state data at the previous moment based on the point cloud data pair corresponding to the previous moment, wherein the motion state data is data corresponding to the point cloud data pair in a rectangular coordinate system; determining at least one data to be screened from a plurality of point cloud data pairs corresponding to the current moment based on the motion state data of the previous moment; and performing flight control on the unmanned aerial vehicle in the radar observation area based on at least one datum to be screened.
In the embodiment of the present application, the motion state data at any time includes position data at the time and velocity component data at the time. The point cloud data pair includes first point cloud data and second point cloud data. The first point cloud data comprises slant distance data, angle data, speed data and echo intensity data, and the second point cloud data also comprises slant distance data, angle data, speed data and echo intensity data. In the embodiment of the present application, the motion state data at the previous moment may be determined based on the point cloud data pair corresponding to the previous moment.
Fig. 4 is a schematic diagram of point cloud data according to an embodiment of the present application, as shown in fig. 4. A three-dimensional coordinate system is established, with the axes denoted the x-axis, the y-axis and the z-axis. In one possible implementation, the three-dimensional coordinate system is referenced to the east, north and zenith directions, with the north direction as the y-axis, the east direction as the x-axis, and the zenith direction as the z-axis.
In the embodiment of the application, the detection distance of the radar is long and the radar is small, so any point in the radar plane can be set as the origin o of the coordinate system. As shown in fig. 4 (a), the detection direction of the first radar is located between the y-axis and the z-axis, i.e., the beam of the first radar mainly covers the yoz plane, and the detection direction of the second radar is located between the x-axis and the z-axis, i.e., the beam of the second radar mainly covers the xoz plane.
Based on the detection directions of the first radar and the second radar shown in fig. 4 (a), the three-dimensional coordinate system shown in fig. 4 (b) is established, with axes denoted the x-axis, the y-axis and the z-axis and origin o. For any one object whose position data in the three-dimensional coordinate system is (x', y', z'), the first point cloud data corresponding to the object detected by the first radar includes the slant range data R, the angle data β between the yoz plane and the line formed by the object's position and the coordinate-system origin o, the speed data of the object relative to the origin o, and the echo intensity data; the second point cloud data corresponding to the object detected by the second radar includes the slant range data R, the angle data α between the xoz plane and the line formed by the object's position and the origin o, the speed data of the object relative to the origin o, and the echo intensity data.
In the ideal case, the slant range data R included in the first point cloud data is equal to the slant range data R included in the second point cloud data, and the speed data included in the first point cloud data is equal to the speed data included in the second point cloud data. The echo intensity data included in the first point cloud data may or may not equal that included in the second point cloud data, and the angle data β included in the first point cloud data may or may not equal the angle data α included in the second point cloud data.
In this embodiment, for any one object, the position data (x', y', z') of the object in the three-dimensional coordinate system is calculated according to the following formula (1), based on the slant range data R, the speed data vd and the angle data β included in the first point cloud data corresponding to the object, and the angle data α included in the second point cloud data corresponding to the object.
x' = R × sin(α)
y' = R × sin(β)        formula (1)
z' = sqrt(R² - x'² - y'²)
vd = (vx × x' + vy × y' + vz × z') / R
The velocity data vd is a projection component of a velocity vector of an object on a line between the object and a coordinate system origin o (a point in a radar plane), vx is a velocity component of the velocity vector on an x-axis, vy is a velocity component of the velocity vector on a y-axis, vz is a velocity component of the velocity vector on a z-axis, and (vx, vy, vz) is velocity component data.
Transforming formula (1) gives the following formula (2).
R = sqrt(x'² + y'² + z'²)
sin(α) = x' / sqrt(x'² + y'² + z'²)        formula (2)
sin(β) = y' / sqrt(x'² + y'² + z'²)
vd = (vx × x' + vy × y' + vz × z') / R
where sqrt denotes the square root.
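A sketch of formulas (1) and (2) as code, where alpha is the angle data from the second radar and beta the angle data from the first radar, following formula (1); the function names and the numeric clamp are assumptions of this illustration:

    import math

    def to_cartesian(R, alpha, beta):
        """Formula (1): position (x', y', z') from the shared slant range R,
        the second radar's angle data alpha and the first radar's angle data beta."""
        x = R * math.sin(alpha)
        y = R * math.sin(beta)
        z = math.sqrt(max(R ** 2 - x ** 2 - y ** 2, 0.0))  # clamped for numeric safety
        return x, y, z

    def radial_speed(position, velocity):
        """vd = (vx*x' + vy*y' + vz*z') / R: the velocity vector projected onto
        the line from the coordinate-system origin o to the object."""
        x, y, z = position
        vx, vy, vz = velocity
        R = math.sqrt(x ** 2 + y ** 2 + z ** 2)
        return (vx * x + vy * y + vz * z) / R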
In this embodiment of the present application, for a point cloud data pair corresponding to a previous time, the point cloud data pair corresponds to one object. According to formula (1) or formula (2), predicting the motion state data of the object corresponding to the point cloud data pair in the three-dimensional coordinate system by utilizing the first point cloud data and the second point cloud data contained in the point cloud data pair, namely obtaining the motion state data of the last moment, wherein the motion state data of the last moment comprises position data and speed component data.
It should be noted that each point cloud data pair includes at least one first point cloud data and at least one second point cloud data. When calculating the motion state data corresponding to a point cloud data pair, the average of the slant range data and the average of the angle data are calculated using all the first point cloud data and all the second point cloud data in the pair. The position data corresponding to the point cloud data pair is then calculated from the average slant range data and the average angle data according to formula (1), and the speed component data corresponding to the point cloud data pair is estimated based on that position data and the speed data in the pair.
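Continuing the sketch, one point cloud data pair can be reduced to position data by averaging as just described and then applying formula (1); the helper names follow the earlier sketches and are illustrative:

    def pair_to_position(first_points, second_points):
        """Average the slant range data over all points of the pair and the
        angle data per radar, then apply formula (1) via to_cartesian."""
        all_points = list(first_points) + list(second_points)
        R = sum(p.slant_range for p in all_points) / len(all_points)
        beta = sum(p.angle for p in first_points) / len(first_points)     # first radar
        alpha = sum(p.angle for p in second_points) / len(second_points)  # second radar
        return to_cartesian(R, alpha, beta)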
Then, at least one data to be screened is determined from the plurality of point cloud data pairs corresponding to the current moment. That is, determining at least one data to be screened from the plurality of point cloud data pairs corresponding to the current moment based on the motion state data of the previous moment includes: predicting first motion state data corresponding to the current moment based on the motion state data of the previous moment; determining each second motion state data corresponding to the current moment based on each point cloud data pair corresponding to the current moment; and determining at least one data to be screened from the second motion state data based on the first motion state data.
And determining each second motion state data corresponding to the current moment according to the formula (1) or the formula (2). The detailed descriptions of the formula (1) and the formula (2) are omitted here. The first motion state data corresponding to the current time includes position data and speed component data of the current time, and the second motion state data corresponding to the current time also includes position data and speed component data of the current time.
In the embodiment of the application, the first motion state data corresponding to the current time is predicted from the motion state data of the previous time; it describes the predicted motion state of the unmanned aerial vehicle. Each point cloud data pair corresponding to the current time is converted into a piece of second motion state data corresponding to the current time; the second motion state data describes the motion state of the unmanned aerial vehicle as actually observed.
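The patent does not fix the prediction model; one simple assumption is a constant-velocity step over the sampling interval dt, sketched below in Python:

def predict_state(position, velocity, dt):
    # Constant-velocity prediction of the first motion state data at
    # the current time from the motion state data of the previous time.
    x, y, z = position
    vx, vy, vz = velocity
    return (x + vx * dt, y + vy * dt, z + vz * dt), (vx, vy, vz)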
At least one piece of data to be screened is then determined from the second motion state data. Assuming the current time is time t, the distance data corresponding to each piece of second motion state data at time t is determined from the first motion state data and that second motion state data according to the following formula (3).
d = sqrt[(x2 - x1)² + (y2 - y1)² + (z2 - z1)²]          Formula (3)
where d is the distance data corresponding to a piece of second motion state data at time t, (x2, y2, z2) is the position data in that second motion state data, and (x1, y1, z1) is the position data in the first motion state data corresponding to time t.
When the distance data corresponding to a piece of second motion state data at time t is less than or equal to the first distance, the association result between that second motion state data and the first motion state data corresponding to time t is determined to be associated; in this case the second motion state data is data to be screened.
When the distance data corresponding to a piece of second motion state data at time t is greater than the first distance, the association result between that second motion state data and the first motion state data corresponding to time t is determined to be not associated, and the second motion state data is filtered out.
The value of the first distance is not limited here and is determined according to the application scenario or manual experience; for example, the first distance is 1 meter.
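A minimal sketch of this gating step, assuming each motion state is a ((x, y, z), (vx, vy, vz)) tuple and using the 1-meter example value as the default first distance:

import math

def gate_candidates(first_state, second_states, first_distance=1.0):
    # Keep the second motion state data whose distance to the predicted
    # first motion state data, per formula (3), is within the first
    # distance; the kept states are the data to be screened.
    (x1, y1, z1), _ = first_state
    kept = []
    for state in second_states:
        (x2, y2, z2), _ = state
        d = math.sqrt((x2 - x1) ** 2 + (y2 - y1) ** 2 + (z2 - z1) ** 2)
        if d <= first_distance:
            kept.append(state)
    return kept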
In this embodiment of the present application, after the first motion state data corresponding to the current time is determined, the motion state data at the current time may also be determined based on each point cloud data pair corresponding to the current time. Based on the motion state data at the current time, at least one piece of data to be screened corresponding to the next time can then be determined from the plurality of point cloud data pairs corresponding to the next time, and flight control of the unmanned aerial vehicle in the radar observation area is performed based on that data to be screened and the position data corresponding to the next time.
When data to be screened exists, the motion state data at the current time is determined from each point cloud data pair corresponding to the current time as follows. The first motion state data corresponding to the current time is converted into a prediction data pair according to formula (1) or formula (2); the prediction data pair includes slant range data, speed data and angle data. For each point cloud data pair corresponding to the current time, an error is calculated from the slant range data, speed data and angle data in the prediction data pair and those in the point cloud data pair; this error characterizes the deviation between the prediction data pair and the point cloud data pair. The point cloud data pair with the minimum error is then selected from the point cloud data pairs corresponding to the current time as the first representative data pair, and the motion state data at the current time is determined from the first representative data pair according to formula (1) or formula (2).
Illustratively, the mean of the slant range data and the mean of the angle data are calculated over all the first point cloud data and all the second point cloud data in the first representative data pair. The position data corresponding to the first representative data pair is then calculated from these means according to formula (1) or formula (2), and the speed component data corresponding to the first representative data pair is predicted based on that position data and the speed data in the pair, which yields the motion state data at the current time.
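A sketch of selecting the first representative data pair. The error metric below — a weighted sum of absolute differences over slant range, speed and the two angles — is an assumption for illustration, since the patent does not prescribe its exact form:

def pair_error(pred, pair, w=(1.0, 1.0, 1.0, 1.0)):
    # pred and pair are dicts with illustrative keys 'R', 'vd',
    # 'alpha' and 'beta' (slant range, speed and the two angles).
    return (w[0] * abs(pred['R'] - pair['R'])
            + w[1] * abs(pred['vd'] - pair['vd'])
            + w[2] * abs(pred['alpha'] - pair['alpha'])
            + w[3] * abs(pred['beta'] - pair['beta']))

def first_representative_pair(pred, pairs):
    # The point cloud data pair with the minimum error to the
    # prediction data pair becomes the first representative data pair.
    return min(pairs, key=lambda p: pair_error(pred, p))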
When it is determined that no data to be screened exists, the motion state data at the current time is determined from each point cloud data pair corresponding to the current time as follows. For each point cloud data pair, the first point cloud data with the maximum echo intensity data, or with echo intensity data greater than the first echo intensity data, is selected from the at least one first point cloud data; similarly, the second point cloud data with the maximum echo intensity data, or with echo intensity data greater than the second echo intensity data, is selected from the at least one second point cloud data. The selected first point cloud data and second point cloud data form a second representative data pair, and the motion state data at the current time is determined from the second representative data pair according to formula (1) or formula (2). The values of the first echo intensity data and the second echo intensity data are not limited here and are determined according to the application scenario or manual experience.
Illustratively, the mean of the slant range data and the mean of the angle data are calculated over all the first point cloud data and all the second point cloud data in the second representative data pair. The position data corresponding to the second representative data pair is then calculated from these means according to formula (1) or formula (2), and the speed component data corresponding to the second representative data pair is set to (0, 0, 0), which yields the motion state data at the current time.
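A sketch of forming the second representative data pair from the strongest echoes; the 'intensity' key is an illustrative assumption:

def second_representative_pair(first_points, second_points):
    # Select the first and the second point cloud data with the maximum
    # echo intensity; their observations then yield the position, while
    # the speed component data is set to (0, 0, 0).
    first = max(first_points, key=lambda p: p['intensity'])
    second = max(second_points, key=lambda p: p['intensity'])
    return first, second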
The principle of determining the motion state data at the previous time based on the point cloud data pair corresponding to the previous time is the same as that of determining the motion state data at the current time based on each point cloud data pair corresponding to the current time, and is not repeated here.
Flight control of the unmanned aerial vehicle in the radar observation area is then performed based on the at least one piece of data to be screened. In this embodiment of the present application, this includes: acquiring the position data of the unmanned aerial vehicle; and performing flight control on the unmanned aerial vehicle in the radar observation area based on the at least one piece of data to be screened and the position data.
The unmanned aerial vehicle is provided with a sensor and acquires its own position data through this sensor; flight control is then performed on the unmanned aerial vehicle in the radar observation area based on the at least one piece of data to be screened and the position data. The type of sensor is not limited in the embodiments of the present application; for example, the sensor is a laser altimeter or a Global Navigation Satellite System (GNSS) positioning module.
Optionally, performing flight control on the unmanned aerial vehicle in the radar observation area based on the at least one piece of data to be screened and the position data includes: determining distance data corresponding to each piece of data to be screened based on the at least one piece of data to be screened and the position data, where the distance data corresponding to a piece of data to be screened characterizes the distance between the object corresponding to that piece of data and the unmanned aerial vehicle; screening out, based on the distance data corresponding to each piece of data to be screened, target data whose distance data is less than or equal to the target distance; and performing flight control on the unmanned aerial vehicle in the radar observation area based on the target data.
For any piece of data to be screened, the distance between the corresponding object and the unmanned aerial vehicle, namely the distance data corresponding to that piece of data, is determined from the data to be screened and the position data of the unmanned aerial vehicle.
When the distance data corresponding to a piece of data to be screened is less than or equal to the target distance, that piece is retained as target data; when it is greater than the target distance, that piece is filtered out.
The value of the target distance is not limited here and is determined according to the application scenario or manual experience.
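A minimal sketch of this distance screening, assuming the same motion state layout as above and a caller-supplied target distance:

import math

def screen_targets(to_screen, drone_position, target_distance):
    # Retain as target data only the objects whose distance to the
    # unmanned aerial vehicle is within the target distance.
    targets = []
    for state in to_screen:
        (x, y, z), _ = state
        if math.dist((x, y, z), drone_position) <= target_distance:
            targets.append(state)
    return targets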
It will be appreciated that the correlation calculation of step 23 can be performed in a dedicated module, illustratively a tracking filter module. In one possible implementation, for the current time t, the first motion state data corresponding to time t and each point cloud data pair corresponding to time t are input to the tracking filter module, and the tracking filter module outputs each piece of second motion state data corresponding to time t and the first motion state data corresponding to time t+1. The data processing method of the tracking filter module includes, but is not limited to, extended Kalman filtering, particle filtering, and the like; the detailed description of step 23 is not repeated here.
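As one possible realization of the tracking filter module — the patent names extended Kalman filtering and particle filtering among others — the following sketches a single predict/update cycle of a linear constant-velocity Kalman filter over the six-dimensional state [x, y, z, vx, vy, vz]. The noise levels q and r are illustrative assumptions:

import numpy as np

def kalman_step(x, P, z, dt, q=1e-2, r=0.5):
    # x: 6-D state estimate, P: 6x6 covariance, z: 3-D position
    # measurement derived from a point cloud data pair.
    F = np.eye(6)
    F[:3, 3:] = dt * np.eye(3)            # position integrates velocity
    H = np.hstack([np.eye(3), np.zeros((3, 3))])
    Q, R = q * np.eye(6), r * np.eye(3)
    x, P = F @ x, F @ P @ F.T + Q         # predict to the current time
    S = H @ P @ H.T + R                   # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)        # Kalman gain
    x = x + K @ (z - H @ x)               # update with the measurement
    P = (np.eye(6) - K @ H) @ P
    return x, P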
The method is used for controlling the flight of the unmanned aerial vehicle in the radar observation area based on point cloud data pairs formed by associating the plurality of first point cloud data with the plurality of second point cloud data. Because the point cloud data is acquired by means of the radar, the influence of weather factors such as rain, snow and illumination is avoided, and the data accuracy is high, when the unmanned aerial vehicle is controlled to fly based on the point cloud data, the control precision of the unmanned aerial vehicle is improved, and therefore the flight safety is improved.
The control method of the unmanned aerial vehicle has been described above in terms of method steps; it is now described in detail in terms of a scenario. The scenario in the embodiment of the application is an interaction scenario between the unmanned aerial vehicle and a radar system. Fig. 5 is a schematic diagram of the interaction between an unmanned aerial vehicle and a radar system according to an embodiment of the present application. As shown in Fig. 5, the radar system includes a radar observation section, a data processing section, and a communication section.
The radar observation section corresponds to an observation range and is used for observing objects, such as unmanned aerial vehicles, located within that range. The radar observation section includes the first radar and the second radar shown in Fig. 3; the related description above applies and is not repeated here.
The data processing section is used for acquiring the plurality of first point cloud data observed by the first radar and the plurality of second point cloud data observed by the second radar in the radar observation section, acquiring the position data of the unmanned aerial vehicle sent by the communication section, and performing data processing based on the plurality of first point cloud data, the plurality of second point cloud data and the position data of the unmanned aerial vehicle.
Fig. 6 is a schematic diagram of the interaction between another unmanned aerial vehicle and a radar system according to an embodiment of the present application. As shown in Fig. 6, the unmanned aerial vehicle acquires its own position data and sends it to the communication section, which forwards it to the data processing section. The radar observation section observes the first point cloud data and the second point cloud data and sends them to the data processing section; the data processing section processes them and sends the data processing result to the unmanned aerial vehicle.
Fig. 7 is a schematic diagram of data processing according to an embodiment of the present application. As shown in Fig. 7, the data processing section screens and clusters the first point cloud data, and likewise screens and clusters the second point cloud data. The clustered first point cloud data and the clustered second point cloud data are then associated to obtain point cloud data pairs, and the unmanned aerial vehicle is controlled to fly based on the point cloud data pairs and the position data. The related description above applies and is not repeated here.
Fig. 8 is a schematic diagram of tracks provided in an embodiment of the present application. Panel (1) is the track graph obtained after data processing of the first point cloud data of the first radar; its abscissa is the x-axis and its ordinate is the z-axis, it contains a number of point clouds (namely the first point cloud data), and track a and track b in panel (1) are obtained by data processing of the plurality of first point cloud data. Panel (2) is the track graph obtained after data processing of the second point cloud data of the second radar; its abscissa is the y-axis and its ordinate is the z-axis, it contains a number of point clouds (namely the second point cloud data), and track a and track b in panel (2) are obtained by data processing of the plurality of second point cloud data. As is evident from panels (1) and (2), track a and track b are more intuitive and of higher precision than the raw point clouds.
The communication section is used for communicating with the unmanned aerial vehicle: it acquires the position data sent by the unmanned aerial vehicle, and it sends the processing result of the data processing section to the unmanned aerial vehicle so as to control the flight of the unmanned aerial vehicle.
Fig. 9 is a schematic structural diagram of a control device of an unmanned aerial vehicle according to an embodiment of the present application, where, as shown in fig. 9, the device includes:
an acquisition module 901, configured to acquire a plurality of first point cloud data observed by a first radar and a plurality of second point cloud data observed by a second radar;
the associating module 902 is configured to associate the plurality of first point cloud data with the plurality of second point cloud data to obtain a point cloud data pair, where the point cloud data pair includes at least one first point cloud data and at least one second point cloud data;
the control module 903 is configured to perform flight control on the unmanned aerial vehicle in the radar observation area based on the point cloud data pair.
In a possible implementation manner, the association module 902 is configured to perform clustering processing on the plurality of first point cloud data to obtain at least one first data set; clustering the plurality of second point cloud data to obtain at least one second data set; and correlating the at least one first data set with the at least one second data set to obtain the point cloud data pair.
In one possible implementation, the associating module 902 is configured to screen out a plurality of target point cloud data from a plurality of first point cloud data; and clustering the cloud data of the plurality of target points to obtain at least one first data set.
In one possible implementation, the association module 902 is configured to perform at least one of the following screening operations (a sketch follows this list):
screening target point cloud data with the speed data smaller than or equal to the target speed from the plurality of first point cloud data based on the speed data contained in the plurality of first point cloud data;
screening target point cloud data with echo intensity data greater than or equal to target echo intensity from the plurality of first point cloud data based on echo intensity data contained in the plurality of first point cloud data;
screening target point cloud data with the slant range data larger than or equal to the target slant range from the plurality of first point cloud data based on the slant range data contained in the plurality of first point cloud data;
and screening target point cloud data with the angle data smaller than or equal to the target angle from the plurality of first point cloud data based on the angle data contained in the plurality of first point cloud data.
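A sketch of the four screening rules, where any threshold left as None is skipped, matching the "at least one of" wording above; the point layout is the illustrative one used earlier:

def screen_points(points, target_speed=None, target_intensity=None,
                  target_range=None, target_angle=None):
    kept = []
    for p in points:
        if target_speed is not None and abs(p['vd']) > target_speed:
            continue          # keep speed data <= target speed
        if target_intensity is not None and p['intensity'] < target_intensity:
            continue          # keep echo intensity >= target intensity
        if target_range is not None and p['R'] < target_range:
            continue          # keep slant range >= target slant range
        if target_angle is not None and abs(p['angle']) > target_angle:
            continue          # keep angle data <= target angle
        kept.append(p)        # p is target point cloud data
    return kept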
In one possible implementation, the association module 902 is configured to calculate the slant range difference between every two target point cloud data based on the slant range data contained in each of the plurality of target point cloud data, and to cluster the plurality of target point cloud data based on these slant range differences to obtain the at least one first data set, where the slant range difference between any two target point cloud data in a first data set is less than or equal to a first threshold.
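A greedy sketch of this clustering: after sorting by slant range, a new cluster is started whenever the spread would exceed the first threshold, which bounds the difference between any two members. The greedy strategy itself is an assumption; the patent does not name a specific algorithm:

def cluster_by_slant_range(points, first_threshold):
    clusters, current = [], []
    for p in sorted(points, key=lambda q: q['R']):
        # Starting a new cluster once the spread exceeds the threshold
        # keeps every pairwise slant range difference within the bound.
        if current and p['R'] - current[0]['R'] > first_threshold:
            clusters.append(current)
            current = []
        current.append(p)
    if current:
        clusters.append(current)
    return clusters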
In one possible implementation, the association module 902 is configured to determine the center slant range of each first data set and the center slant range of each second data set; calculate the slant range difference between each first data set and each second data set based on these center slant ranges; and associate the at least one first data set with the at least one second data set based on the slant range differences to obtain the point cloud data pairs, where the slant range difference between the first data set and the second data set in a point cloud data pair is less than or equal to a second threshold.
In one possible implementation, the association module 902 is configured to calculate the mean of the slant range data contained in the first point cloud data of each first data set, so as to obtain the center slant range of that first data set.
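A sketch of the association by center slant range, pairing every first data set with every second data set whose center slant ranges differ by at most the second threshold:

def associate_sets(first_sets, second_sets, second_threshold):
    def center_slant_range(data_set):
        # The center slant range is the mean of the member slant ranges.
        return sum(p['R'] for p in data_set) / len(data_set)
    pairs = []
    for fs in first_sets:
        for ss in second_sets:
            diff = abs(center_slant_range(fs) - center_slant_range(ss))
            if diff <= second_threshold:
                pairs.append((fs, ss))   # a point cloud data pair
    return pairs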
In one possible implementation, there are a plurality of point cloud data pairs, and any one point cloud data pair is a point cloud data pair corresponding to the current moment;
the control module 903 is configured to determine motion state data at a previous time based on a point cloud data pair corresponding to the previous time, where the motion state data is data corresponding to the point cloud data pair in a rectangular coordinate system; determining at least one data to be screened from a plurality of point cloud data pairs corresponding to the current moment based on the motion state data of the previous moment; and performing flight control on the unmanned aerial vehicle in the radar observation area based on at least one datum to be screened.
In a possible implementation manner, the control module 903 is configured to predict and obtain first motion state data corresponding to the current time based on motion state data of a previous time; determining each second motion state data corresponding to the current moment based on each point cloud data pair corresponding to the current moment; and determining at least one datum to be screened from each second motion state data based on the first motion state data.
In one possible implementation, the control module 903 is configured to obtain position data of the unmanned aerial vehicle; and performing flight control on the unmanned aerial vehicle in the radar observation area based on the at least one data to be screened and the position data.
In a possible implementation manner, the control module 903 is configured to determine distance data corresponding to each data to be screened based on at least one data to be screened and the position data, where the distance data corresponding to the data to be screened is used to characterize a distance between an object corresponding to the data to be screened and the unmanned aerial vehicle; screening target data with the distance data smaller than or equal to the target distance from at least one piece of data to be screened based on the distance data corresponding to each piece of data to be screened; based on the target data, the unmanned aerial vehicle in the radar observation area is subjected to flight control.
In one possible implementation, the radar beam planes of the first radar and the second radar are perpendicular, the first radar including at least two patch antennas, the second radar including at least two patch antennas.
The device controls the flight of the unmanned aerial vehicle in the radar observation area based on the point cloud data pairs formed by associating the plurality of first point cloud data and the plurality of second point cloud data. Because the point cloud data is acquired by means of the radar, the influence of weather factors such as rain, snow and illumination is avoided, and the data accuracy is high, when the unmanned aerial vehicle is controlled to fly based on the point cloud data, the control precision of the unmanned aerial vehicle is improved, and therefore the flight safety is improved.
It should be understood that, in implementing the functions of the apparatus provided in fig. 9, only the division of the functional modules is illustrated, and in practical application, the functional modules may be allocated to different functional modules according to needs, that is, the internal structure of the apparatus is divided into different functional modules to complete all or part of the functions described above. In addition, the apparatus and the method embodiments provided in the foregoing embodiments belong to the same concept, and specific implementation processes of the apparatus and the method embodiments are detailed in the method embodiments and are not repeated herein.
Fig. 10 shows a block diagram of a terminal device 1000 according to an exemplary embodiment of the present application. The terminal device 1000 may be a portable mobile terminal such as a smartphone, a tablet, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, a notebook, or a desktop computer. Terminal device 1000 may also be referred to by other names such as user equipment, portable terminal, laptop terminal, or desktop terminal.
In general, terminal device 1000 includes: a processor 1001 and a memory 1002.
The processor 1001 may include one or more processing cores, such as a 4-core processor or an 8-core processor. The processor 1001 may be implemented in at least one hardware form of DSP (Digital Signal Processor), FPGA (Field-Programmable Gate Array), or PLA (Programmable Logic Array). The processor 1001 may also include a main processor and a coprocessor: the main processor, also referred to as a CPU (Central Processing Unit), is a processor for processing data in the awake state; the coprocessor is a low-power processor for processing data in the standby state. In some embodiments, the processor 1001 may be integrated with a GPU (Graphics Processing Unit) responsible for rendering and drawing the content that the display screen needs to display. In some embodiments, the processor 1001 may also include an AI (Artificial Intelligence) processor for processing computing operations related to machine learning.
Memory 1002 may include one or more computer-readable storage media, which may be non-transitory. Memory 1002 may also include high-speed random access memory, as well as non-volatile memory, such as one or more magnetic disk storage devices, flash memory storage devices. In some embodiments, a non-transitory computer readable storage medium in memory 1002 is used to store at least one instruction for execution by processor 1001 to implement the control method of the drone provided by the method embodiments in the present application.
In some embodiments, terminal device 1000 can optionally further include: a peripheral interface 1003, and at least one peripheral. The processor 1001, the memory 1002, and the peripheral interface 1003 may be connected by a bus or signal line. The various peripheral devices may be connected to the peripheral device interface 1003 via a bus, signal wire, or circuit board. Specifically, the peripheral device includes: at least one of radio frequency circuitry 1004, a display 1005, a camera assembly 1006, audio circuitry 1007, a positioning assembly 1008, and a power supply 1009.
Peripheral interface 1003 may be used to connect at least one I/O (Input/Output) related peripheral to the processor 1001 and the memory 1002. In some embodiments, the processor 1001, the memory 1002, and the peripheral interface 1003 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 1001, the memory 1002, and the peripheral interface 1003 may be implemented on a separate chip or circuit board, which is not limited in this embodiment.
The radio frequency circuit 1004 is used to receive and transmit RF (Radio Frequency) signals, also known as electromagnetic signals. The radio frequency circuit 1004 communicates with communication networks and other communication devices via electromagnetic signals, converting electrical signals into electromagnetic signals for transmission and converting received electromagnetic signals into electrical signals. Optionally, the radio frequency circuit 1004 includes an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and so on. The radio frequency circuit 1004 may communicate with other terminals via at least one wireless communication protocol, including but not limited to: the World Wide Web, metropolitan area networks, intranets, the various generations of mobile communication networks (2G, 3G, 4G, and 5G), wireless local area networks, and/or WiFi (Wireless Fidelity) networks. In some embodiments, the radio frequency circuit 1004 may also include NFC (Near Field Communication) related circuitry, which is not limited in this application.
The display screen 1005 is used to display a UI (User Interface). The UI may include graphics, text, icons, video, and any combination thereof. When the display 1005 is a touch screen, the display 1005 also has the ability to capture touch signals at or above the surface of the display 1005. The touch signal may be input to the processor 1001 as a control signal for processing. At this time, the display 1005 may also be used to provide virtual buttons and/or virtual keyboards, also referred to as soft buttons and/or soft keyboards. In some embodiments, the display 1005 may be one, disposed on the front panel of the terminal device 1000; in other embodiments, at least two display screens 1005 may be respectively disposed on different surfaces of terminal device 1000 or in a folded design; in other embodiments, display 1005 may be a flexible display disposed on a curved surface or a folded surface of terminal device 1000. Even more, the display 1005 may be arranged in a non-rectangular irregular pattern, i.e., a shaped screen. The display 1005 may be made of LCD (Liquid Crystal Display ), OLED (Organic Light-Emitting Diode) or other materials.
The camera assembly 1006 is used to capture images or video. Optionally, the camera assembly 1006 includes a front camera and a rear camera. Typically, the front camera is disposed on the front panel of the terminal and the rear camera is disposed on the back of the terminal. In some embodiments, there are at least two rear cameras, each being any one of a main camera, a depth camera, a wide-angle camera, and a telephoto camera, so that the main camera and the depth camera can be fused for a background blurring function, or the main camera and the wide-angle camera can be fused for panoramic shooting, VR (Virtual Reality) shooting, or other fused shooting functions. In some embodiments, the camera assembly 1006 may also include a flash, which may be a single-color-temperature flash or a dual-color-temperature flash. A dual-color-temperature flash is a combination of a warm-light flash and a cold-light flash and can be used for light compensation at different color temperatures.
The audio circuit 1007 may include a microphone and a speaker. The microphone is used for collecting sound waves of users and environments, converting the sound waves into electric signals, and inputting the electric signals to the processor 1001 for processing, or inputting the electric signals to the radio frequency circuit 1004 for voice communication. For purposes of stereo acquisition or noise reduction, a plurality of microphones may be provided at different portions of terminal device 1000, respectively. The microphone may also be an array microphone or an omni-directional pickup microphone. The speaker is used to convert electrical signals from the processor 1001 or the radio frequency circuit 1004 into sound waves. The speaker may be a conventional thin film speaker or a piezoelectric ceramic speaker. When the speaker is a piezoelectric ceramic speaker, not only the electric signal can be converted into a sound wave audible to humans, but also the electric signal can be converted into a sound wave inaudible to humans for ranging and other purposes. In some embodiments, audio circuit 1007 may also include a headphone jack.
The positioning component 1008 is used to locate the current geographic position of terminal device 1000 to enable navigation or LBS (Location Based Service). The positioning component 1008 may be a positioning component based on the GPS (Global Positioning System) of the United States, the BeiDou system of China, or the Galileo system of the European Union.
Power supply 1009 is used to power the various components in terminal device 1000. The power source 1009 may be alternating current, direct current, disposable battery or rechargeable battery. When the power source 1009 includes a rechargeable battery, the rechargeable battery may be a wired rechargeable battery or a wireless rechargeable battery. The wired rechargeable battery is a battery charged through a wired line, and the wireless rechargeable battery is a battery charged through a wireless coil. The rechargeable battery may also be used to support fast charge technology.
In some embodiments, terminal device 1000 can further include one or more sensors 1010. The one or more sensors 1010 include, but are not limited to: acceleration sensor 1011, gyroscope sensor 1012, pressure sensor 1013, fingerprint sensor 1014, optical sensor 1015, and proximity sensor 1016.
The acceleration sensor 1011 can detect the magnitudes of accelerations on three coordinate axes of the coordinate system established with the terminal apparatus 1000. For example, the acceleration sensor 1011 may be used to detect components of gravitational acceleration in three coordinate axes. The processor 1001 may control the display screen 1005 to display a user interface in a landscape view or a portrait view according to the gravitational acceleration signal acquired by the acceleration sensor 1011. The acceleration sensor 1011 may also be used for the acquisition of motion data of a game or a user.
The gyro sensor 1012 may detect a body direction and a rotation angle of the terminal device 1000, and the gyro sensor 1012 may collect a 3D motion of the user to the terminal device 1000 in cooperation with the acceleration sensor 1011. The processor 1001 may implement the following functions according to the data collected by the gyro sensor 1012: motion sensing (e.g., changing UI according to a tilting operation by a user), image stabilization at shooting, game control, and inertial navigation.
Pressure sensor 1013 may be disposed at a side frame of terminal device 1000 and/or at a lower layer of display 1005. When the pressure sensor 1013 is provided at a side frame of the terminal apparatus 1000, a grip signal of the terminal apparatus 1000 by a user can be detected, and the processor 1001 performs right-left hand recognition or quick operation based on the grip signal collected by the pressure sensor 1013. When the pressure sensor 1013 is provided at the lower layer of the display screen 1005, the processor 1001 controls the operability control on the UI interface according to the pressure operation of the user on the display screen 1005. The operability controls include at least one of a button control, a scroll bar control, an icon control, and a menu control.
The fingerprint sensor 1014 is used to collect a fingerprint of the user, and the processor 1001 identifies the identity of the user based on the fingerprint collected by the fingerprint sensor 1014, or the fingerprint sensor 1014 identifies the identity of the user based on the collected fingerprint. Upon recognizing that the user's identity is a trusted identity, the processor 1001 authorizes the user to perform relevant sensitive operations including unlocking the screen, viewing encrypted information, downloading software, paying for and changing settings, etc. Fingerprint sensor 1014 may be disposed on the front, back, or side of terminal device 1000. When a physical key or vendor Logo is provided on terminal device 1000, fingerprint sensor 1014 may be integrated with the physical key or vendor Logo.
The optical sensor 1015 is used to collect ambient light intensity. In one embodiment, the processor 1001 may control the display brightness of the display screen 1005 based on the ambient light intensity collected by the optical sensor 1015. Specifically, when the intensity of the ambient light is high, the display brightness of the display screen 1005 is turned up; when the ambient light intensity is low, the display brightness of the display screen 1005 is turned down. In another embodiment, the processor 1001 may dynamically adjust the shooting parameters of the camera module 1006 according to the ambient light intensity collected by the optical sensor 1015.
Proximity sensor 1016, also referred to as a distance sensor, is typically located on the front panel of terminal device 1000. Proximity sensor 1016 is used to capture the distance between the user and the front face of terminal device 1000. In one embodiment, when proximity sensor 1016 detects a gradual decrease in the distance between the user and the front face of terminal device 1000, processor 1001 controls display 1005 to switch from the bright screen state to the off screen state; when the proximity sensor 1016 detects that the distance between the user and the front surface of the terminal device 1000 gradually increases, the processor 1001 controls the display screen 1005 to switch from the off-screen state to the on-screen state.
It will be appreciated by those skilled in the art that the structure shown in fig. 10 is not limiting and that terminal device 1000 may include more or fewer components than shown, or may combine certain components, or may employ a different arrangement of components.
Fig. 11 is a schematic structural diagram of a server provided in an embodiment of the present application. The server 1100 may vary considerably in configuration or performance, and may include one or more processors 1101 and one or more memories 1102, where the one or more memories 1102 store at least one program code that is loaded and executed by the one or more processors 1101 to implement the control method of the unmanned aerial vehicle provided by each of the above method embodiments. Of course, the server 1100 may also have a wired or wireless network interface, a keyboard, an input/output interface, and other components for input/output, and may include other components for implementing device functions, which are not described here.
In an exemplary embodiment, there is also provided a computer-readable storage medium having stored therein at least one program code, which is loaded and executed by a processor to cause an electronic device to implement any one of the above control methods of the unmanned aerial vehicle.
Alternatively, the above-mentioned computer-readable storage medium may be a read-only memory (ROM), a random access memory (RAM), a compact disc read-only memory (CD-ROM), a magnetic tape, a floppy disk, an optical data storage device, or the like.
In an exemplary embodiment, a computer program or computer program product is also provided, in which at least one computer instruction is stored; the computer instruction is loaded and executed by a processor to cause a computer to implement any one of the above control methods of the unmanned aerial vehicle.
It should be understood that references herein to "a plurality" are to two or more. "And/or" describes an association relationship between associated objects and indicates that three relationships may exist; for example, A and/or B may indicate that A exists alone, A and B exist together, or B exists alone. The character "/" generally indicates that the associated objects are in an "or" relationship.
The foregoing embodiment numbers of the present application are for description only and do not represent the advantages or disadvantages of the embodiments.
The foregoing description of the exemplary embodiments of the present application is not intended to limit the invention to the particular embodiments disclosed; on the contrary, the intention is to cover all modifications, equivalents, and alternatives falling within the spirit and scope of the invention.

Claims (15)

1. A method of controlling a drone, the method comprising:
acquiring a plurality of first point cloud data observed by a first radar and a plurality of second point cloud data observed by a second radar;
correlating the plurality of first point cloud data with the plurality of second point cloud data to obtain point cloud data pairs, wherein the point cloud data pairs comprise at least one first point cloud data and at least one second point cloud data;
and performing flight control on the unmanned aerial vehicle in the radar observation area based on the point cloud data pair.
2. The method of claim 1, wherein the associating the plurality of first point cloud data with the plurality of second point cloud data to obtain a point cloud data pair comprises:
clustering the plurality of first point cloud data to obtain at least one first data set;
clustering the plurality of second point cloud data to obtain at least one second data set;
and correlating the at least one first data set with the at least one second data set to obtain the point cloud data pair.
3. The method of claim 2, wherein clustering the plurality of first point cloud data to obtain at least one first data set comprises:
screening a plurality of target point cloud data from the plurality of first point cloud data;
and clustering the plurality of target point cloud data to obtain the at least one first data set.
4. The method of claim 3, wherein the screening the plurality of target point cloud data from the plurality of first point cloud data comprises at least one of:
screening target point cloud data with speed data smaller than or equal to a target speed from the plurality of first point cloud data based on the speed data contained in the plurality of first point cloud data;
screening target point cloud data with echo intensity data greater than or equal to target echo intensity from the plurality of first point cloud data based on echo intensity data contained in the plurality of first point cloud data;
screening target point cloud data with the slant range data larger than or equal to a target slant range from the plurality of first point cloud data based on slant range data contained in the plurality of first point cloud data;
and screening target point cloud data with angle data smaller than or equal to a target angle from the plurality of first point cloud data based on the angle data contained in the plurality of first point cloud data.
5. A method according to claim 3, wherein clustering the plurality of target point cloud data to obtain the at least one first data set comprises:
calculating the slant distance difference between every two target point cloud data based on slant distance data contained in each of the plurality of target point cloud data;
and clustering the plurality of target point cloud data based on the slant distance difference between every two target point cloud data to obtain at least one first data set, wherein the slant distance difference between any two target point cloud data in the first data set is smaller than or equal to a first threshold value.
6. The method of claim 2, wherein the associating the at least one first data set with the at least one second data set to obtain the point cloud data pair comprises:
determining the central slant distance of each first data set and the central slant distance of each second data set;
calculating a corresponding skew difference between the first data set and the second data set based on the center skew of the respective first data set and the center skew of the respective second data set;
and correlating the at least one first data set with the at least one second data set based on a corresponding skew difference between the first data set and the second data set to obtain the point cloud data pair, wherein the skew difference between the first data set and the second data set in the point cloud data pair is smaller than or equal to a second threshold.
7. The method of claim 6, wherein said determining the center skew of each first data set comprises:
and calculating the average value of the slant ranges of the first point cloud data in each first data set based on the slant range data contained in each first point cloud data in each first data set, so as to obtain the center slant range of each first data set.
8. The method according to any one of claims 1 to 7, wherein the number of the point cloud data pairs is plural, and any one point cloud data pair is a point cloud data pair corresponding to a current time;
based on the point cloud data pair, performing flight control on the unmanned aerial vehicle in the radar observation area, including:
determining motion state data at the previous moment based on a point cloud data pair corresponding to the previous moment, wherein the motion state data is data corresponding to the point cloud data pair in a rectangular coordinate system;
determining at least one data to be screened from a plurality of point cloud data pairs corresponding to the current moment based on the motion state data of the previous moment;
and performing flight control on the unmanned aerial vehicle in the radar observation area based on the at least one data to be screened.
9. The method according to claim 8, wherein determining at least one data to be screened from a plurality of point cloud data pairs corresponding to a current moment based on the motion state data of the previous moment includes:
predicting and obtaining first motion state data corresponding to the current moment based on the motion state data of the previous moment;
determining each second motion state data corresponding to the current moment based on each point cloud data pair corresponding to the current moment;
and determining at least one datum to be screened from the second motion state data based on the first motion state data.
10. The method of claim 8, wherein the performing flight control of the drone within the radar observation area based on the at least one data to be screened comprises:
acquiring position data of an unmanned aerial vehicle;
and performing flight control on the unmanned aerial vehicle in the radar observation area based on the at least one datum to be screened and the position data.
11. The method of claim 10, wherein the performing flight control of the drone within the radar observation area based on the at least one data to be screened and the location data comprises:
determining distance data corresponding to each piece of data to be screened based on the at least one piece of data to be screened and the position data, wherein the distance data corresponding to the piece of data to be screened is used for representing the distance between an object corresponding to the piece of data to be screened and the unmanned aerial vehicle;
screening target data with the distance data smaller than or equal to the target distance from the at least one piece of data to be screened based on the distance data corresponding to the data to be screened;
and carrying out flight control on the unmanned aerial vehicle in the radar observation area based on the target data.
12. The method of any of claims 1-7, wherein the radar beam planes of the first radar and the second radar are perpendicular, the first radar comprises at least two patch antennas, and the second radar comprises at least two patch antennas.
13. A control device for an unmanned aerial vehicle, the device comprising:
the acquisition module is used for acquiring a plurality of first point cloud data observed by the first radar and a plurality of second point cloud data observed by the second radar;
the association module is used for associating the plurality of first point cloud data with the plurality of second point cloud data to obtain point cloud data pairs, wherein the point cloud data pairs comprise at least one first point cloud data and at least one second point cloud data;
and the control module is used for controlling the flight of the unmanned aerial vehicle in the radar observation area based on the point cloud data pair.
14. An electronic device, characterized in that it comprises a processor and a memory, in which at least one program code is stored, which is loaded and executed by the processor, in order to carry out the method of controlling a drone according to any one of claims 1 to 12.
15. A computer readable storage medium, characterized in that at least one program code is stored in the computer readable storage medium, which is loaded and executed by a processor, to cause a computer to implement the control method of the drone of any one of claims 1 to 12.
CN202111271016.2A 2021-10-29 2021-10-29 Unmanned aerial vehicle control method, device, equipment and readable storage medium Active CN116069051B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111271016.2A CN116069051B (en) 2021-10-29 2021-10-29 Unmanned aerial vehicle control method, device, equipment and readable storage medium

Publications (2)

Publication Number Publication Date
CN116069051A true CN116069051A (en) 2023-05-05
CN116069051B CN116069051B (en) 2024-03-19

Family

ID=86182361

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111271016.2A Active CN116069051B (en) 2021-10-29 2021-10-29 Unmanned aerial vehicle control method, device, equipment and readable storage medium

Country Status (1)

Country Link
CN (1) CN116069051B (en)

Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105356994A (en) * 2015-12-08 2016-02-24 深圳大学 MIMO radar system and phase synchronization method of the same at dynamic target end
CN106199546A (en) * 2016-06-30 2016-12-07 西安电子科技大学 Direct-path signal method of purification based on external illuminators-based radar
CN108228798A (en) * 2017-12-29 2018-06-29 百度在线网络技术(北京)有限公司 The method and apparatus for determining the matching relationship between point cloud data
CN110441751A (en) * 2019-07-26 2019-11-12 大亚湾核电运营管理有限责任公司 Dual radars detection method, device, readable storage medium storing program for executing and terminal device
US20200166611A1 (en) * 2018-11-22 2020-05-28 Jomoo Kitchen & Bath Co., Ltd Detection method, detection device, terminal and detection system
CN111337941A (en) * 2020-03-18 2020-06-26 中国科学技术大学 Dynamic obstacle tracking method based on sparse laser radar data
CN112334788A (en) * 2019-11-11 2021-02-05 深圳市大疆创新科技有限公司 Radar component, unmanned aerial vehicle, obstacle detection method, equipment and storage medium
CN112415494A (en) * 2020-12-11 2021-02-26 福勤智能科技(昆山)有限公司 AGV double-laser-radar position calibration method, device, equipment and storage medium
CN112986973A (en) * 2019-12-18 2021-06-18 华为技术有限公司 Distance measuring method and distance measuring device
CN113030946A (en) * 2021-02-05 2021-06-25 北京航空航天大学 Secondary radar detection method, apparatus, device, system, medium, and program product
CN113220018A (en) * 2021-04-23 2021-08-06 上海发电设备成套设计研究院有限责任公司 Unmanned aerial vehicle path planning method and device, storage medium and electronic equipment
CN113359148A (en) * 2020-02-20 2021-09-07 百度在线网络技术(北京)有限公司 Laser radar point cloud data processing method, device, equipment and storage medium
CN113359136A (en) * 2020-03-06 2021-09-07 华为技术有限公司 Target detection method and device and distributed radar system

Also Published As

Publication number Publication date
CN116069051B (en) 2024-03-19


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant