CN112540622A - Radar data processing method and device and operation equipment - Google Patents

Radar data processing method and device and operation equipment

Info

Publication number
CN112540622A
Authority
CN
China
Prior art keywords
target object
radar
coordinate system
invalid
target
Prior art date
Legal status
Granted
Application number
CN202010281612.8A
Other languages
Chinese (zh)
Other versions
CN112540622B (en)
Inventor
郑立强
Current Assignee
Guangzhou Xaircraft Technology Co Ltd
Original Assignee
Guangzhou Xaircraft Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Guangzhou Xaircraft Technology Co Ltd filed Critical Guangzhou Xaircraft Technology Co Ltd
Priority to CN202010281612.8A priority Critical patent/CN112540622B/en
Publication of CN112540622A publication Critical patent/CN112540622A/en
Application granted granted Critical
Publication of CN112540622B publication Critical patent/CN112540622B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/10 Simultaneous control of position or course in three dimensions
    • G05D1/101 Simultaneous control of position or course in three dimensions specially adapted for aircraft

Abstract

The application discloses a radar data processing method and device and operation equipment. The method comprises the following steps: acquiring at least one target object detected by a radar in the operation equipment; acquiring pose information of the operation equipment and a detection pitch angle of the radar; filtering out an invalid target object in the at least one target object according to the pose information and the detection pitch angle, wherein the invalid target object is a target object that does not influence the running of the operation equipment; and controlling the running state of the operation equipment according to the remaining target objects in the at least one target object. The application solves the technical problem that, because of the radar's detection pitch angle, when the unmanned aerial vehicle flies at a certain flight angle the targets detected by the radar include many invalid targets that have no influence on the flight of the unmanned aerial vehicle, so that the data detected by the radar contains a large amount of redundant data.

Description

Radar data processing method and device and operation equipment
Technical Field
The application relates to the field of radar data processing, in particular to a radar data processing method and device and operation equipment.
Background
During operation, a plant protection unmanned aerial vehicle constantly encounters various obstacles; if it does not avoid an obstacle, it may collide with it and cause a safety accident. To address this flight safety problem, radars are currently mounted on plant protection unmanned aerial vehicles for obstacle avoidance. The detection pitch angle of the radar may cover a range as shown in fig. 1.
When the plant protection unmanned aerial vehicle flies at a certain angle, because of the radar's detection pitch angle, a target detected by the radar is not necessarily an obstacle. As shown in fig. 2, the dotted lines indicate the range of effective obstacles that can influence the flight of the plant protection unmanned aerial vehicle; because there is a certain flight angle, the target detected by the radar (the ground) does not influence the flight operation.
Because of the radar's detection pitch angle, when the unmanned aerial vehicle flies at a certain flight angle, a target detected by the radar may or may not influence the flight of the unmanned aerial vehicle. As a result, the targets detected by the radar may include many invalid targets that have no effect on the flight of the drone; that is, the data detected by the radar may contain a large amount of redundant data.
In view of the above problems, no effective solution has been proposed.
Disclosure of Invention
The embodiments of the present application provide a radar data processing method and device, and operation equipment, so as to at least solve the technical problem that, because of the radar's detection pitch angle, when the unmanned aerial vehicle flies at a certain flight angle the targets detected by the radar include many invalid targets that have no influence on the flight of the unmanned aerial vehicle, so that the data detected by the radar contains a large amount of redundant data.
According to an aspect of an embodiment of the present application, there is provided a method for processing radar data, including: acquiring at least one target object detected by a radar in the operation equipment; acquiring pose information of the operation equipment and a detection pitch angle of the radar; filtering out an invalid target object in the at least one target object according to the pose information and the detection pitch angle, wherein the invalid target object is a target object that does not influence the running of the operation equipment; and controlling the running state of the operation equipment according to the remaining target objects in the at least one target object.
Optionally, acquiring at least one target object detected by a radar in the working equipment includes: acquiring position information of at least one target object detected by a radar in a first preset coordinate system, wherein the position information comprises: the distance between at least one target object and the radar and the azimuth angle of at least one target object, wherein the first preset coordinate system is a polar coordinate system determined by taking the center of an antenna surface of the radar as an origin and taking the normal of the antenna surface as a 0-degree line.
Optionally, the operation device is an unmanned aerial vehicle, and before the invalid target object in the at least one target object is filtered out according to the pose information and the detection pitch angle, the method further includes: determining a planar rectangular coordinate system by taking a horizontal line as the Y axis, taking the intersection point of the straight line along the nose direction of the unmanned aerial vehicle with the horizontal line as the origin, and taking the straight line passing through the origin and perpendicular to the Y axis as the X axis, wherein the planar rectangular coordinate system conforms to the right-handed rule and is taken as a second preset coordinate system; determining the included angle between the extension line of the nose direction of the unmanned aerial vehicle and the positive Y-axis direction of the second preset coordinate system; and taking the included angle as the flight angle of the unmanned aerial vehicle.
Optionally, filtering an invalid target object in the at least one target object according to the pose information and the detected pitch angle, including: converting the position information of the target object in the first preset coordinate system into position information of the target object in a third preset coordinate system, wherein the third preset coordinate system is a plane rectangular coordinate system which takes the radar as an origin, takes a horizontal line where the origin is located as a Y axis, and takes a straight line which passes through the origin and is perpendicular to the Y axis as an X axis; determining the position information of the target object in the world coordinate system according to the position information, the detection pitch angle and the pose information of the target object in the third preset coordinate system; judging whether the target object is an invalid target object according to the position information of the target object in the world coordinate system; and if the target object is judged to be an invalid target object, deleting the data corresponding to the invalid target object from the data corresponding to at least one target object.
Optionally, determining the position information of the target object in the world coordinate system according to the position information, the detected pitch angle, and the pose information of the target object in the third preset coordinate system, includes: if the flight angle of the unmanned aerial vehicle is larger than-90 degrees and smaller than 0 degree, determining the position information of the target object in the world coordinate system according to a first formula, wherein the first formula is as follows:
[first formula image]
wherein x_wr, y_wr, z_wr are the position information of the target object in the world coordinate system, T_r^w is the homogeneous transformation matrix for transforming the target object from the third preset coordinate system to the world coordinate system, β is the detection pitch angle of the radar, Rot_y(·) denotes the rotation matrix obtained by rotating counterclockwise about the Y axis by the given angle, and x_r, y_r, z_r are the position information of the target object in the third preset coordinate system; if the flight angle of the unmanned aerial vehicle is larger than 0 degrees and smaller than 90 degrees, determining the position information of the target object in the world coordinate system according to a second formula, wherein the second formula is as follows:
[second formula image]
optionally, the determining whether the target object is an invalid target object according to the position information of the target object in the world coordinate system includes: if the flight angle of the unmanned aerial vehicle is greater than-90 degrees and smaller than 0 degree, and the coordinate value of the target object in the Z direction under the world coordinate system is smaller than the lower limit value of the target height range, determining that the target object is an invalid target object, wherein the target height range is the height range of an airspace which influences the flight of the unmanned aerial vehicle; and if the flight angle of the unmanned aerial vehicle is greater than 0 degree and smaller than 90 degrees and the coordinate value of the target object in the Z direction under the world coordinate system is greater than the upper limit value of the target height range, determining that the target object is an invalid target object.
Optionally, controlling the running state of the working device according to the remaining target object of the at least one target object comprises: constructing an obstacle distribution map according to the data corresponding to the residual target objects; and determining the driving route of the operation equipment according to the obstacle distribution map.
Optionally, the pose information is a pose angle and a position coordinate of the operation equipment in a world coordinate system in the running process; the detection pitch angle is an included angle formed by straight lines where the upper boundary and the lower boundary of a region which can be detected by a radar are located.
According to another aspect of the embodiments of the present application, there is also provided a radar data processing apparatus, including: the system comprises a first acquisition module, a second acquisition module and a third acquisition module, wherein the first acquisition module is used for acquiring at least one target object detected by a radar in the operation equipment; the second acquisition module is used for acquiring pose information of the operation equipment and a detection pitch angle of the radar; the processing module is used for filtering an invalid target object in at least one target object according to the pose information and the detection pitch angle, wherein the invalid target object is a target object which cannot influence the running of the operation equipment; and the control module is used for controlling the running state of the working equipment according to the rest target objects in the at least one target object.
According to another aspect of the embodiments of the present application, there is also provided a work apparatus including: a radar for detecting at least one target object; and a processor in communication connection with the radar, configured to acquire the at least one target object, the pose information of the operation equipment, and the detection pitch angle of the radar; filter out an invalid target object in the at least one target object according to the pose information and the detection pitch angle, wherein the invalid target object is a target object that does not influence the running of the operation equipment; and control the running state of the operation equipment according to the remaining target objects in the at least one target object.
According to still another aspect of the embodiments of the present application, there is provided a storage medium including a stored program, where the apparatus where the storage medium is located is controlled to execute the above radar data processing method when the program runs.
According to still another aspect of the embodiments of the present application, there is also provided a processor for executing a program stored in a memory, wherein the program executes the above radar data processing method.
In the embodiment of the application, at least one target object detected by the radar in the operation equipment is acquired; pose information of the operation equipment and the detection pitch angle of the radar are acquired; invalid target objects in the at least one target object are filtered out according to the pose information and the detection pitch angle, the invalid target objects being target objects that do not influence the running of the operation equipment; and the running state of the operation equipment is controlled according to the remaining target objects in the at least one target object. In this way, the invalid targets detected by the radar are filtered out according to the radar's detection pitch angle and the flight attitude of the operation equipment, so that the redundant data detected by the radar is removed and the amount of computation in the subsequent process of constructing an obstacle map from the radar data is reduced. This solves the technical problem that, because of the radar's detection pitch angle, when the unmanned aerial vehicle flies at a certain flight angle the targets detected by the radar include many invalid targets that have no influence on the flight of the unmanned aerial vehicle, so that the data detected by the radar contains a large amount of redundant data.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the application and together with the description serve to explain the application and not to limit the application. In the drawings:
FIG. 1 shows a schematic view of a detected pitch angle of a radar;
FIG. 2 shows a schematic diagram of radar detection when a flight angle of an unmanned aerial vehicle exists;
FIG. 3 is a flow chart of a method of processing radar data according to an embodiment of the present application;
fig. 4a and 4b are schematic views of determining a flight angle of a drone according to an embodiment of the present application;
FIG. 5 is a schematic diagram of a rectangular radar coordinate system according to an embodiment of the present application;
FIG. 6a is a schematic diagram of filtering out invalid targets according to an embodiment of the present application;
FIG. 6b is a schematic diagram of another filtering of invalid objects according to an embodiment of the present application;
fig. 7 is a schematic diagram of a body coordinate system of a drone according to an embodiment of the present application;
fig. 8 is a block diagram of a radar data processing apparatus according to an embodiment of the present application;
fig. 9 is a block diagram of a work apparatus according to an embodiment of the present application.
Detailed Description
In order to make the technical solutions better understood by those skilled in the art, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only partial embodiments of the present application, but not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
It should be noted that the terms "first," "second," and the like in the description and claims of this application and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the application described herein are capable of operation in sequences other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
In accordance with an embodiment of the present application, there is provided an embodiment of a method for radar data processing, it is noted that the steps illustrated in the flowchart of the drawings may be performed in a computer system such as a set of computer executable instructions, and that while a logical order is illustrated in the flowchart, in some cases the steps illustrated or described may be performed in an order different than here.
Fig. 3 is a flowchart of a radar data processing method according to an embodiment of the present application, and as shown in fig. 3, the method includes the following steps:
in step S302, at least one target object detected by a radar in the work apparatus is acquired.
According to an optional embodiment of the present application, the operation device in step S302 may be a plant protection unmanned aerial vehicle, or may be an ordinary manned plant protection aircraft; the radar includes, but is not limited to, millimeter-wave radar and laser radar.
And step S304, acquiring the pose information of the operation equipment and the detection pitch angle of the radar.
According to an optional embodiment of the present application, the pose information is a pose angle and a position coordinate in a world coordinate system during a traveling process of the working device; the detection pitch angle is an included angle formed by straight lines where the upper boundary and the lower boundary of a region which can be detected by a radar are located.
The working equipment is provided with sensors such as an inertial measurement unit (IMU) and a GPS, so its own pose information can be readily acquired. The acquired pose information includes the attitude angles (roll, pitch, yaw) and the position coordinates (x, y, z).
The detection pitch angle of the radar is an included angle formed by straight lines where an upper boundary and a lower boundary of a region which can be detected by the radar are located, and is shown in fig. 1.
And S306, filtering an invalid target object in the at least one target object according to the pose information and the detection pitch angle, wherein the invalid target object is a target object which does not influence the running of the operation equipment.
In step S308, the running state of the work equipment is controlled according to the remaining target object of the at least one target object.
Through the above steps, the invalid targets detected by the radar are filtered out according to the radar's detection pitch angle and the flight attitude of the operation equipment, so that the redundant data detected by the radar is removed and the amount of computation in the subsequent process of constructing an obstacle map from the radar data is reduced.
According to an alternative embodiment of the present application, step S302 may be implemented by: acquiring position information of at least one target object detected by a radar in a first preset coordinate system, wherein the position information comprises: the distance between at least one target object and the radar and the azimuth angle of at least one target object, wherein the first preset coordinate system is a polar coordinate system determined by taking the center of an antenna surface of the radar as an origin and taking the normal of the antenna surface as a 0-degree line.
The data structure of the radar is as follows:
data structure of a single target object:
[table image: data structure of a single target object]
a radar data frame is composed of a plurality of target objects, and the data structure of the radar data frame is as follows:
[table image: data structure of a radar data frame]
the position information reference coordinate of the target object detected by the radar is a polar coordinate system which takes the center of the radar antenna surface as an original point and takes the normal of the antenna surface as a 0-degree line. The target distance refers to a distance from a target object detected by the radar to the radar, and the target azimuth refers to an azimuth of the target object in the polar coordinate system.
According to an optional embodiment of the present application, the operation device is an unmanned aerial vehicle. Fig. 4a and 4b are schematic diagrams for determining the flight angle of the unmanned aerial vehicle according to an embodiment of the present application. A planar rectangular coordinate system conforming to the right-handed rule is determined by taking a horizontal line as the Y axis, taking the intersection point of the straight line along the nose direction of the unmanned aerial vehicle with the horizontal line as the origin, and taking the straight line passing through the origin and perpendicular to the Y axis as the X axis; this planar rectangular coordinate system is taken as the second preset coordinate system. The included angle between the extension line of the nose direction of the unmanned aerial vehicle and the positive Y-axis direction of the second preset coordinate system is determined, and this included angle is taken as the flight angle of the unmanned aerial vehicle. The use of this flight angle is described in detail below.
As shown in fig. 4a, the nose direction of the unmanned aerial vehicle points below the horizontal; the included angle θ between the nose direction and the positive Y-axis direction is greater than -90 degrees and smaller than 0 degrees, and this angle is taken as the flight angle when the nose of the unmanned aerial vehicle points below the horizontal.
As shown in fig. 4b, the nose direction of the unmanned aerial vehicle points above the horizontal; the included angle θ between the nose direction and the positive Y-axis direction is greater than 0 degrees and smaller than 90 degrees, and this angle is taken as the flight angle when the nose of the unmanned aerial vehicle points above the horizontal.
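As a minimal sketch of this definition, the flight angle can be obtained from the horizontal and vertical components of the nose-direction vector; the function name, inputs and sign convention below are our assumptions for illustration, not the patent's.

import math

def flight_angle_deg(horizontal_component: float, vertical_component: float) -> float:
    """Signed angle of the nose direction relative to the horizontal (Y) axis.

    horizontal_component: forward (horizontal) component of the nose-direction vector,
                          assumed positive for forward flight.
    vertical_component:   vertical component (positive = upward).
    Negative result: nose below the horizontal (fig. 4a); positive: above (fig. 4b).
    """
    return math.degrees(math.atan2(vertical_component, horizontal_component))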
In an optional embodiment of the present application, the step S306 of filtering out invalid target objects in the at least one target object according to the pose information and the detected pitch angle includes the following steps:
step S3062, converting the position information of the target object in the first preset coordinate system into position information of the target object in a third preset coordinate system, where the third preset coordinate system is a rectangular plane coordinate system determined by using the radar as an origin, using a horizontal line where the origin is located as a Y-axis, and using a straight line passing through the origin and perpendicular to the Y-axis as an X-axis.
Step S3064, determining the position information of the target object in the world coordinate system according to the position information of the target object in the third preset coordinate system, the detection pitch angle of the radar, and the pose information of the unmanned aerial vehicle.
Step S3066, determine whether the target object is an invalid target object according to the position information of the target object in the world coordinate system.
In step S3068, if the target object is determined to be an invalid target object, data corresponding to the invalid target object is deleted from the data corresponding to the at least one target object.
According to an alternative embodiment of the present application, step S3064 is implemented as follows: if the flight angle of the unmanned aerial vehicle is larger than -90 degrees and smaller than 0 degrees, the position information of the target object in the world coordinate system is determined according to a first formula, wherein the first formula is as follows:
[first formula image]
wherein x_wr, y_wr, z_wr are the position information of the target object in the world coordinate system, T_r^w is the homogeneous transformation matrix for transforming the target object from the third preset coordinate system to the world coordinate system, β is the detection pitch angle of the radar, Rot_y(·) denotes the rotation matrix obtained by rotating counterclockwise about the Y axis by the given angle, and x_r, y_r, z_r are the position information of the target object in the third preset coordinate system;
if the flight angle of the unmanned aerial vehicle is larger than 0 degrees and smaller than 90 degrees, the position information of the target object in the world coordinate system is determined according to a second formula, wherein the second formula is as follows:
[second formula image]
according to an alternative embodiment of the present application, step S3066 includes the steps of: if the flight angle of the unmanned aerial vehicle is greater than-90 degrees and smaller than 0 degree, and the coordinate value of the target object in the Z direction under the world coordinate system is smaller than the lower limit value of the target height range, determining that the target object is an invalid target object, wherein the target height range is the height range of an airspace which influences the flight of the unmanned aerial vehicle; and if the flight angle of the unmanned aerial vehicle is greater than 0 degree and smaller than 90 degrees and the coordinate value of the target object in the Z direction under the world coordinate system is greater than the upper limit value of the target height range, determining that the target object is an invalid target object.
Steps S3062 to S3066 provide a method for filtering out invalid objects, which is described below with a specific embodiment:
Fig. 5 is a schematic diagram of the radar rectangular coordinate system according to an embodiment of the present application. In step S3062, the position information of the target detected by the radar in the polar coordinate system (the first preset coordinate system above) is converted into position information in the radar rectangular coordinate system (the third preset coordinate system above) using the following formulas:
x_r = d × cos(α)
y_r = d × sin(-α)
z_r = 0
where α is the azimuth of the target and d is the distance of the target from the radar.
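A direct transcription of these formulas might look as follows (a sketch; the function and variable names are ours):

import math

def polar_to_radar_rect(distance: float, azimuth_deg: float):
    """Convert a radar detection from the polar coordinate system (first preset
    coordinate system) to the radar rectangular coordinate system (third preset
    coordinate system), following x_r = d*cos(a), y_r = d*sin(-a), z_r = 0."""
    a = math.radians(azimuth_deg)
    x_r = distance * math.cos(a)
    y_r = distance * math.sin(-a)
    z_r = 0.0
    return x_r, y_r, z_r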
As mentioned in the Background section, because of the radar's detection pitch angle, when the unmanned aerial vehicle flies at a certain flight angle the targets detected by the radar may include many invalid targets that have no influence on the flight of the unmanned aerial vehicle. Therefore, invalid targets detected by the radar need to be filtered out according to the radar's detection pitch angle and the flight attitude of the unmanned aerial vehicle.
In specific implementation, the following two methods for filtering out invalid targets can be used according to the flight angles of the unmanned aerial vehicle shown in fig. 4a and 4 b:
Fig. 6a is a schematic diagram of filtering out invalid targets according to an embodiment of the present application. As shown in fig. 6a, when the flight angle of the drone is greater than -90 degrees and smaller than 0 degrees (i.e., the nose of the drone points below the horizontal), a target can be filtered out as long as its z-direction coordinate value in the world coordinate system is determined to be below the airspace of interest of the plant protection drone (the area between the two dotted lines shown in fig. 6a; targets located within this area all affect the flight of the drone).
The lower limit Z value of the airspace is denoted Z_l and the radar detection pitch angle is β. Assuming that the target is located at the upper edge of the radar detection pitch angle, the world coordinate system coordinates of the target are as follows:
[formula image]
Therefore, as long as the target satisfies z_wr < Z_l, the target may be filtered out.
Fig. 6b is a schematic diagram of another filtering of invalid targets according to the embodiment of the present application, and as shown in fig. 6b, the flight angle of the drone is greater than 0 degree and less than 90 degrees (i.e., the head direction of the drone is horizontally upward), as long as it is determined that the coordinate value in the z direction under the world coordinate system of the target is higher than the airspace of interest of the plant protection drone, the target can be filtered.
Assuming that the target is located at the lower edge of the radar detection pitch angle, the world coordinate system coordinates of the target are as follows:
[formula image]
The upper limit value of the airspace is denoted Z_u; as long as the target satisfies z_wr > Z_u, the target may be filtered out.
The remaining target points can then be transformed from the radar rectangular coordinate system to the world coordinate system using the homogeneous transformation matrix T_r^w.
The calculation of the homogeneous transformation matrix T_r^w is introduced below:
First, some background on coordinate transformation. The homogeneous transformation matrix for a rotation by an angle θ about the x-axis is:
Rot(x, θ) =
  [ 1     0       0      0 ]
  [ 0   cos θ  -sin θ    0 ]
  [ 0   sin θ   cos θ    0 ]
  [ 0     0       0      1 ]
The homogeneous transformation matrix for a rotation by an angle θ about the y-axis is:
Rot(y, θ) =
  [  cos θ   0   sin θ   0 ]
  [    0     1     0     0 ]
  [ -sin θ   0   cos θ   0 ]
  [    0     0     0     1 ]
The homogeneous transformation matrix for a rotation by an angle θ about the z-axis is:
Rot(z, θ) =
  [ cos θ  -sin θ   0   0 ]
  [ sin θ   cos θ   0   0 ]
  [   0       0     1   0 ]
  [   0       0     0   1 ]
The homogeneous transformation matrix for a translation by (a, b, c) is:
Trans(a, b, c) =
  [ 1  0  0  a ]
  [ 0  1  0  b ]
  [ 0  0  1  c ]
  [ 0  0  0  1 ]
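For example, these standard matrices can be written down directly with NumPy (a generic sketch, not code from the patent):

import numpy as np

def rot_x(theta: float) -> np.ndarray:
    # Homogeneous rotation about the x-axis by theta (radians).
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[1, 0, 0, 0],
                     [0, c, -s, 0],
                     [0, s,  c, 0],
                     [0, 0,  0, 1]], dtype=float)

def rot_y(theta: float) -> np.ndarray:
    # Homogeneous rotation about the y-axis by theta (radians).
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[ c, 0, s, 0],
                     [ 0, 1, 0, 0],
                     [-s, 0, c, 0],
                     [ 0, 0, 0, 1]], dtype=float)

def rot_z(theta: float) -> np.ndarray:
    # Homogeneous rotation about the z-axis by theta (radians).
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0, 0],
                     [s,  c, 0, 0],
                     [0,  0, 1, 0],
                     [0,  0, 0, 1]], dtype=float)

def trans(x: float, y: float, z: float) -> np.ndarray:
    # Homogeneous translation by (x, y, z).
    t = np.eye(4)
    t[:3, 3] = [x, y, z]
    return t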
the position coordinate and the attitude angle of the plant protection unmanned aerial vehicle under the world coordinate system are (x)w,yw,zwww,
Figure BDA0002446782180000103
) Therefore, the homogeneous transformation matrix for converting the target detected by the radar from the radar rectangular coordinate system (the third predetermined coordinate system) to the world coordinate system is:
Figure BDA0002446782180000104
Figure BDA0002446782180000105
the target detected by the radar is converted into a homogeneous transformation matrix of a body coordinate system of the unmanned aerial vehicle from a radar rectangular coordinate system (the third preset coordinate system). Fig. 7 is a schematic diagram of a body coordinate system of an unmanned aerial vehicle according to an embodiment of the present application, and as shown in fig. 7, the body coordinate system is determined by taking a geometric center of the plant protection unmanned aerial vehicle as a center of a circle, taking an x axis along a head direction, taking a direction perpendicular to the x axis, taking a direction to the right of the body as a y axis, and taking a direction to the ground of a perpendicular xoy plane as a z axis.
The position coordinate and the installation angle of the radar in the body coordinate system are assumed to be (x)b,yb,zbbb,
Figure BDA0002446782180000106
) Then the target detected by the radar is converted from the rectangular coordinate system of the radar (the third preset coordinate system) to the homogeneous transformation matrix of the coordinate system of the body
Figure BDA0002446782180000107
Comprises the following steps:
Figure BDA0002446782180000108
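Since the composition itself is given only as images in the original, the sketch below assumes the common ZYX (yaw-pitch-roll) Euler convention for both the drone pose and the radar mounting pose; the ordering and helper names are assumptions for illustration, not the patent's formula. It reuses rot_x, rot_y, rot_z and trans from the previous sketch.

import numpy as np

def pose_to_transform(x, y, z, roll, pitch, yaw) -> np.ndarray:
    """Homogeneous transform for a pose, assuming a ZYX (yaw-pitch-roll) convention."""
    return trans(x, y, z) @ rot_z(yaw) @ rot_y(pitch) @ rot_x(roll)

def radar_to_world(drone_pose, radar_mount) -> np.ndarray:
    """T_r^w = T_b^w @ T_r^b: drone pose in the world frame chained with the
    radar mounting pose in the body frame."""
    T_b_w = pose_to_transform(*drone_pose)    # body -> world
    T_r_b = pose_to_transform(*radar_mount)   # radar -> body
    return T_b_w @ T_r_b

# Example usage (all values illustrative):
# drone_pose  = (x_w, y_w, z_w, roll_w, pitch_w, yaw_w)
# radar_mount = (x_b, y_b, z_b, roll_b, pitch_b, yaw_b)
# p_world = radar_to_world(drone_pose, radar_mount) @ np.array([x_r, y_r, z_r, 1.0])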
in some optional embodiments of the present application, step S308 may be implemented by: constructing an obstacle distribution map according to the data corresponding to the residual target objects; and determining the driving route of the operation equipment according to the obstacle distribution map.
And constructing an obstacle distribution map by using data corresponding to the targets remaining after the invalid targets are filtered out, and then determining the driving route of the operating equipment by using the constructed obstacle distribution map. In specific implementation, the range search may be directly used, or the kd-tree may be used to store the data corresponding to the obstacle, and the kd-tree may perform nearest neighbor search to obtain the environment information of the environment where the operating device is located. A kd-tree (short for k-dimensional tree) is a data structure that partitions a k-dimensional data space. The method is mainly applied to searching of multidimensional space key data (such as range searching and nearest neighbor searching).
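For example, with SciPy's kd-tree (a sketch; the obstacle coordinates and query values are illustrative only):

import numpy as np
from scipy.spatial import cKDTree

# World-frame coordinates of the remaining (valid) targets, one row per obstacle.
obstacles = np.array([[12.0,  3.5, 2.0],
                      [ 8.2, -1.0, 1.5],
                      [25.4,  6.7, 3.2]])

tree = cKDTree(obstacles)

drone_position = np.array([10.0, 0.0, 2.0])

# Nearest obstacle to the drone (nearest-neighbor search).
dist, idx = tree.query(drone_position)

# All obstacles within a 10 m radius (range search).
nearby = tree.query_ball_point(drone_position, r=10.0)

print(f"nearest obstacle: {obstacles[idx]} at {dist:.2f} m")
print(f"obstacles within 10 m: {nearby}")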
Fig. 8 is a block diagram of a radar data processing apparatus according to an embodiment of the present application, and as shown in fig. 8, the apparatus includes:
the first acquiring module 80 is configured to acquire at least one target object detected by a radar in the work equipment.
And a second obtaining module 82, configured to obtain pose information of the working device and a detected pitch angle of the radar.
According to an optional embodiment of the present application, the pose information is a pose angle and a position coordinate in a world coordinate system during a traveling process of the working device; the detection pitch angle is an included angle formed by straight lines where the upper boundary and the lower boundary of a region which can be detected by a radar are located.
The working equipment is provided with sensors such as an inertial measurement unit (IMU) and a GPS, so its own pose information can be readily acquired. The acquired pose information includes the attitude angles (roll, pitch, yaw) and the position coordinates (x, y, z).
The detection pitch angle of the radar is an included angle formed by straight lines where an upper boundary and a lower boundary of a region which can be detected by the radar are located, and is shown in fig. 1.
And the processing module 84 is configured to filter an invalid target object from the at least one target object according to the pose information and the detection pitch angle, where the invalid target object is a target object that does not affect the driving of the working equipment.
And the control module 86 is used for controlling the running state of the working equipment according to the rest target objects in the at least one target object.
It should be noted that, reference may be made to the description related to the embodiment shown in fig. 1 for a preferred implementation of the embodiment shown in fig. 8, and details are not repeated here.
Fig. 9 is a structural diagram of a work apparatus according to an embodiment of the present application, and as shown in fig. 9, the work apparatus includes:
a radar 90 for detecting at least one target object;
the processor 92 is in communication connection with the radar and is configured to acquire the at least one target object, the pose information of the operation equipment, and the detection pitch angle of the radar; filter out an invalid target object in the at least one target object according to the pose information and the detection pitch angle, wherein the invalid target object is a target object that does not influence the running of the operation equipment; and control the running state of the operation equipment according to the remaining target objects in the at least one target object.
It should be noted that, reference may be made to the description related to the embodiment shown in fig. 1 for a preferred implementation of the embodiment shown in fig. 9, and details are not repeated here.
The embodiment of the application also provides a storage medium, wherein the storage medium comprises a stored program, and when the program runs, the device where the storage medium is located is controlled to execute the radar data processing method.
The storage medium stores a program for executing the following functions: acquiring at least one target object detected by a radar in the operation equipment; acquiring pose information of the operation equipment and a detection pitch angle of a radar; filtering an invalid target object in at least one target object according to the pose information and the detection pitch angle, wherein the invalid target object is a target object which cannot influence the running of the operation equipment; and controlling the running state of the working equipment according to the rest target objects in the at least one target object.
The embodiment of the application also provides a processor, wherein the processor is used for running the program stored in the memory, and the above radar data processing method is executed when the program runs.
The processor is used for running a program for executing the following functions: acquiring at least one target object detected by a radar in the operation equipment; acquiring pose information of the operation equipment and a detection pitch angle of a radar; filtering an invalid target object in at least one target object according to the pose information and the detection pitch angle, wherein the invalid target object is a target object which cannot influence the running of the operation equipment; and controlling the running state of the working equipment according to the rest target objects in the at least one target object.
The above-mentioned serial numbers of the embodiments of the present application are merely for description and do not represent the merits of the embodiments.
In the above embodiments of the present application, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
In the embodiments provided in the present application, it should be understood that the disclosed technology can be implemented in other ways. The above-described embodiments of the apparatus are merely illustrative, and for example, the division of the units may be a logical division, and in actual implementation, there may be another division, for example, multiple units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, units or modules, and may be in an electrical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the part of the technical solution of the present application that is essential or that contributes to the prior art, or all or part of the technical solution, may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present application. The aforementioned storage medium includes various media that can store program code, such as a USB flash drive, a read-only memory (ROM), a random access memory (RAM), a removable hard disk, a magnetic disk, or an optical disk.
The foregoing is only a preferred embodiment of the present application and it should be noted that those skilled in the art can make several improvements and modifications without departing from the principle of the present application, and these improvements and modifications should also be considered as the protection scope of the present application.

Claims (12)

1. A method for processing radar data, comprising:
acquiring at least one target object detected by a radar in the operation equipment;
acquiring pose information of the operation equipment and a detection pitch angle of the radar;
filtering an invalid target object in the at least one target object according to the pose information and the detection pitch angle, wherein the invalid target object is a target object which does not influence the running of the operation equipment;
and controlling the running state of the operating equipment according to the rest target objects in the at least one target object.
2. The method of claim 1, wherein obtaining at least one target object detected by a radar in a work machine comprises:
acquiring position information of at least one target object detected by the radar in a first preset coordinate system, wherein the position information comprises: the distance between the at least one target object and the radar and the azimuth angle of the at least one target object are determined, wherein the first preset coordinate system is a polar coordinate system which is determined by taking the center of an antenna surface of the radar as an origin and the normal of the antenna surface as a 0-degree line.
3. The method of claim 2, wherein the working device is a drone, and wherein before filtering out invalid ones of the at least one target object based on the pose information and the pitch angle, the method further comprises:
determining a plane rectangular coordinate system by taking a horizontal line as a Y axis, taking an intersection point of a straight line where the head direction of the unmanned aerial vehicle is located and the horizontal line as an origin, and taking a straight line which passes through the origin and is perpendicular to the Y axis as an X axis, wherein the plane rectangular coordinate system conforms to the rule of a right-handed system and is taken as a second preset coordinate system;
determining an included angle between an extension line of the head direction of the unmanned aerial vehicle and the positive direction of the Y axis of the second preset coordinate system;
and taking the included angle as the flight angle of the unmanned aerial vehicle.
4. The method of claim 3, wherein filtering out invalid target objects of the at least one target object according to the pose information and the detection pitch angle comprises:
converting the position information of the target object in the first preset coordinate system into position information of the target object in a third preset coordinate system, wherein the third preset coordinate system is a planar rectangular coordinate system which is determined by taking the radar as an origin, taking a horizontal line where the origin is located as a Y axis, and taking a straight line which passes through the origin and is perpendicular to the Y axis as an X axis;
determining the position information of the target object in a world coordinate system according to the position information of the target object in the third preset coordinate system, the detection pitch angle and the pose information;
judging whether the target object is the invalid target object or not according to the position information of the target object in the world coordinate system;
and if the target object is judged to be the invalid target object, deleting the data corresponding to the invalid target object from the data corresponding to the at least one target object.
5. The method according to claim 4, wherein determining the position information of the target object in the world coordinate system according to the position information of the target object in the third preset coordinate system, the detected pitch angle and the pose information comprises:
if the flight angle of the unmanned aerial vehicle is larger than -90 degrees and smaller than 0 degrees, determining the position information of the target object in a world coordinate system according to a first formula, wherein the first formula is as follows:
[first formula image]
wherein x_wr, y_wr, z_wr are the position information of the target object in the world coordinate system, T_r^w is the homogeneous transformation matrix for transforming the target object from the third preset coordinate system to the world coordinate system, β is the detection pitch angle of the radar, Rot_y(·) denotes the rotation matrix obtained by rotating counterclockwise about the Y axis by the given angle, and x_r, y_r, z_r are the position information of the target object in the third preset coordinate system;
if the flight angle of the unmanned aerial vehicle is larger than 0 degrees and smaller than 90 degrees, determining the position information of the target object in the world coordinate system according to a second formula, wherein the second formula is as follows:
[second formula image]
6. the method of claim 5, wherein determining whether the target object is the invalid target object according to the position information of the target object in the world coordinate system comprises:
if the flight angle of the unmanned aerial vehicle is greater than -90 degrees and smaller than 0 degrees, and the coordinate value of the target object in the Z direction under the world coordinate system is smaller than the lower limit value of a target height range, determining that the target object is the invalid target object, wherein the target height range is the height range of an airspace which influences the flight of the unmanned aerial vehicle;
and if the flight angle of the unmanned aerial vehicle is greater than 0 degree and smaller than 90 degrees, and the coordinate value of the target object in the Z direction under the world coordinate system is greater than the upper limit value of the target height range, determining that the target object is the invalid target object.
7. The method according to any one of claims 1 to 6, wherein controlling the travel state of the work equipment in accordance with the remaining target objects of the at least one target object comprises:
constructing an obstacle distribution map according to the data corresponding to the residual target objects;
and determining a driving route of the operation equipment according to the obstacle distribution map.
8. The method of claim 1,
the pose information is a pose angle and a position coordinate of the operation equipment in a world coordinate system in the running process;
the detection pitch angle is an included angle formed by straight lines where an upper boundary and a lower boundary of a region which can be detected by the radar are located.
9. An apparatus for processing radar data, comprising:
the system comprises a first acquisition module, a second acquisition module and a third acquisition module, wherein the first acquisition module is used for acquiring at least one target object detected by a radar in the operation equipment;
the second acquisition module is used for acquiring the pose information of the operation equipment and the detection pitch angle of the radar;
the processing module is used for filtering an invalid target object in the at least one target object according to the pose information and the detection pitch angle, wherein the invalid target object is a target object which does not influence the running of the operation equipment;
and the control module is used for controlling the running state of the operating equipment according to the rest target objects in the at least one target object.
10. A work apparatus, comprising:
a radar for detecting at least one target object;
the processor is in communication connection with the radar and is used for acquiring the at least one target object, the pose information of the operation equipment and the detection pitch angle of the radar; filtering an invalid target object in the at least one target object according to the pose information and the detection pitch angle, wherein the invalid target object is a target object which does not influence the running of the operation equipment; and controlling the running state of the operating equipment according to the rest target objects in the at least one target object.
11. A storage medium, characterized in that the storage medium includes a stored program, wherein when the program runs, a device in which the storage medium is located is controlled to execute the radar data processing method according to any one of claims 1 to 8.
12. A processor configured to execute a program stored in a memory, wherein the program is configured to execute the method for processing radar data according to any one of claims 1 to 8 when executed.
CN202010281612.8A 2020-04-10 2020-04-10 Radar data processing method and device and operation equipment Active CN112540622B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010281612.8A CN112540622B (en) 2020-04-10 2020-04-10 Radar data processing method and device and operation equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010281612.8A CN112540622B (en) 2020-04-10 2020-04-10 Radar data processing method and device and operation equipment

Publications (2)

Publication Number Publication Date
CN112540622A true CN112540622A (en) 2021-03-23
CN112540622B CN112540622B (en) 2021-12-28

Family

ID=75013417

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010281612.8A Active CN112540622B (en) 2020-04-10 2020-04-10 Radar data processing method and device and operation equipment

Country Status (1)

Country Link
CN (1) CN112540622B (en)


Patent Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6208284B1 (en) * 1998-06-16 2001-03-27 Rockwell Science Center, Inc. Radar augmented TCAS
US20100271258A1 (en) * 2009-04-22 2010-10-28 Mitsubishi Electric Corporation Radar device
CN101893710A (en) * 2009-05-20 2010-11-24 中国科学院电子学研究所 Non-uniform distributed multi-baseline synthetic aperture radar three-dimensional imaging method
CN102508232A (en) * 2011-10-20 2012-06-20 黄安祥 Multi-source detection-based method for detecting stealth target in the sky
CN103901411A (en) * 2014-03-28 2014-07-02 长城汽车股份有限公司 Radar test device and automotive radar pitch angle test method
CN106291539A (en) * 2016-07-29 2017-01-04 山东康威通信技术股份有限公司 A kind of phased-array radar target identification system based on location filtering algorithm and method
CN106873630A (en) * 2017-04-20 2017-06-20 广州极飞科技有限公司 A kind of flight control method and device, perform equipment
CN107807659A (en) * 2017-10-24 2018-03-16 北京臻迪科技股份有限公司 A kind of UAV Flight Control method and device
CN107945580A (en) * 2017-11-17 2018-04-20 武汉理工大学 Marine traction system AIS virtually guards against mark designation system and method
CN108921003A (en) * 2018-04-26 2018-11-30 东华大学 Unmanned plane obstacle detection method based on convolutional neural networks and morphological image
CN108944929A (en) * 2018-05-31 2018-12-07 合肥中科自动控制系统有限公司 A kind of target extraction method for Vehicle Adaptive Cruising Control Systems
CN108614262A (en) * 2018-06-22 2018-10-02 安徽江淮汽车集团股份有限公司 A kind of vehicle forward target detection method and system
CN109709554A (en) * 2018-12-13 2019-05-03 广州极飞科技有限公司 Operating equipment and its control method and device
CN110930692A (en) * 2019-10-24 2020-03-27 河北德冠隆电子科技有限公司 Active vehicle continuous tracking device, system and method

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
熊有伦 et al.: "Robotics: Modeling, Control and Vision", 31 March 2018 *
蔡明辉, 刘燕: "Calculation of earth-station antenna viewing angles based on basis transformation and the mapping between azimuth and pitch angles", Video Engineering *
马国成 et al.: "Multi-lane target recognition and compensation method for vehicle-mounted radar", Transactions of Beijing Institute of Technology *

Also Published As

Publication number Publication date
CN112540622B (en) 2021-12-28

Similar Documents

Publication Publication Date Title
US11320833B2 (en) Data processing method, apparatus and terminal
CN111932588B (en) Tracking method of airborne unmanned aerial vehicle multi-target tracking system based on deep learning
CN108508916A (en) A kind of control method, device, equipment and storage medium that unmanned plane is formed into columns
EP2490178B1 (en) Methods and systems for identifying hazardous flight zone areas on a display
US20200206927A1 (en) Relocalization method and robot using the same
CN112305559A (en) Power transmission line distance measuring method, device and system based on ground fixed-point laser radar scanning and electronic equipment
CN112666963A (en) Road pavement crack detection system based on four-axis unmanned aerial vehicle and detection method thereof
CN112036274A (en) Driving region detection method and device, electronic equipment and storage medium
CN113566825A (en) Unmanned aerial vehicle navigation method and system based on vision and storage medium
CN116997771A (en) Vehicle, positioning method, device, equipment and computer readable storage medium thereof
CN112527009A (en) Radar data processing method and device and operation equipment
CN114689030A (en) Unmanned aerial vehicle auxiliary positioning method and system based on airborne vision
CN115793690A (en) Indoor inspection method, system and equipment for unmanned aerial vehicle
Hyyppä et al. Efficient coarse registration method using translation-and rotation-invariant local descriptors towards fully automated forest inventory
CN112540622B (en) Radar data processing method and device and operation equipment
CN112597946A (en) Obstacle representation method and device, electronic equipment and readable storage medium
CN112154355B (en) High-precision map positioning method, system, platform and computer readable storage medium
CN116047499B (en) High-precision real-time protection system and method for power transmission line of target construction vehicle
Duan et al. Image digital zoom based single target apriltag recognition algorithm in large scale changes on the distance
CN115686073A (en) Unmanned aerial vehicle-based power transmission line inspection control method and system
CN113156450B (en) Active rotation laser radar system on unmanned aerial vehicle and control method thereof
US10330769B1 (en) Method and apparatus for geolocating emitters in a multi-emitter environment
CN116508071A (en) System and method for annotating automotive radar data
CN112154351A (en) Terrain detection method, movable platform, control device, system and storage medium
CN112346481A (en) Method and system for unmanned aerial vehicle power inspection operation

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: 510000 Block C, 115 Gaopu Road, Tianhe District, Guangzhou City, Guangdong Province

Applicant after: Guangzhou Jifei Technology Co.,Ltd.

Address before: 510000 Block C, 115 Gaopu Road, Tianhe District, Guangzhou City, Guangdong Province

Applicant before: Guangzhou Xaircraft Technology Co.,Ltd.

GR01 Patent grant