CN115079142A - Method for simulating real-time image simulation of airborne laser radar - Google Patents

Method for simulating real-time image simulation of airborne laser radar

Info

Publication number
CN115079142A
Authority
CN
China
Prior art keywords
axis
point
point cloud
laser radar
coordinate system
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210654115.7A
Other languages
Chinese (zh)
Inventor
Sun Chuanwei (孙传伟)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
NANJING LEFEI AVIATION TECHNOLOGY CO LTD
Original Assignee
NANJING LEFEI AVIATION TECHNOLOGY CO LTD
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by NANJING LEFEI AVIATION TECHNOLOGY CO LTD filed Critical NANJING LEFEI AVIATION TECHNOLOGY CO LTD
Priority to CN202210654115.7A priority Critical patent/CN115079142A/en
Publication of CN115079142A publication Critical patent/CN115079142A/en
Pending legal-status Critical Current

Classifications

    • G — PHYSICS
    • G01 — MEASURING; TESTING
    • G01S — RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 — Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48 — Details of systems according to group G01S17/00
    • G01S7/497 — Means for monitoring or calibrating
    • G — PHYSICS
    • G01 — MEASURING; TESTING
    • G01S — RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 — Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48 — Details of systems according to group G01S17/00
    • G01S7/4802 — Using analysis of echo signal for target characterisation; Target signature; Target cross-section

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

The invention discloses a method for simulating real-time image simulation of an airborne laser radar, comprising the following steps. Step one: obtain the number of lattice points required to draw the lidar point cloud image, and solve for the intersection coordinate set Pcloud between the lidar rays and obstacles, yielding massive point cloud data. Step two: draw the point cloud image: divide the point cloud data obtained in step one into different colors according to the linear HSV-height mapping, and draw the point cloud image in real time with an osg shader; the point cloud data Pcloud obtained in step one is also converted from the UTM coordinate system to the body coordinate system, and obstacle distance, azimuth angle, and pitch angle data are output within a limited time. The invention achieves a point precision better than 5 m within a detection distance of 1200 m while sustaining a point computation rate of about 300K points/s; obstacle detection is accurate, and the point cloud data and obstacle target data can be transmitted in packets to interact with external systems.

Description

Method for simulating real-time image simulation of airborne laser radar
Technical Field
The invention belongs to the technical field of sensor detection and radar image simulation, and particularly relates to a method for simulating real-time image simulation of an airborne laser radar.
Background
Laser radar (lidar) technology is developing rapidly in many fields; compared with other remote sensing technologies, it has been widely recognized as a tool for acquiring ground three-dimensional data accurately and rapidly. Lidar sensors are becoming the primary sensors of driving-assistance systems by virtue of their three-dimensional environment-modeling capability. With the maturing of vehicle-mounted lidar technology, extending detection technology to airborne lidar sensors is an inevitable trend.
In the prior art, airborne lidar simulations improve computational efficiency by simply replacing obstacle models with capsules, spheres, and the like during calculation, so the detection effect is not accurate enough; the efficiency of point cloud data generation is also low.
Disclosure of Invention
The invention provides a method for simulating real-time image simulation of an airborne laser radar, used to simulate the detection of terrain and ground obstacles by an airborne lidar sensor, draw a point cloud image, and output three-dimensional point cloud data and obstacle information.
The technical solution for realizing the purpose of the invention is as follows:
A method for simulating real-time image simulation of an airborne laser radar comprises the following steps:
Step one: obtain the number of lattice points required for drawing the lidar point cloud image, and solve for the intersection coordinate set Pcloud between the lidar rays and obstacles to obtain massive point cloud data.
Step two: draw the point cloud image: divide the point cloud data obtained in step one into different colors according to the linear HSV-height mapping, and draw the point cloud image in real time with an osg shader; also convert the point cloud data Pcloud obtained in step one from the UTM coordinate system to the body coordinate system, and output obstacle distance, azimuth angle, and pitch angle data within a limited time.
Further, step one specifically comprises: define the maximum detection distance of the airborne lidar as R, the point cloud image point precision as d, and the lidar vertical scanning field angle as va, and solve for the point cloud row count vcount; from the lidar horizontal scanning field angle ha, solve for the point cloud column count hcount.
The simulated lidar emits vcount × hcount rays, and the intersection coordinate set Pcloud between the lidar rays and obstacles, i.e., the massive point cloud data, together with the distance between the aircraft and each obstacle point, is obtained using the osg LineSegmentIntersector class.
Further, in step one, the point cloud row count vcount and column count hcount are calculated as follows:
vcount=R×tan(va/2)×2/d;
hcount=R×tan(ha/2)×2/d。
Further, in step one, the intersection coordinate set Pcloud between the lidar rays and obstacles is obtained as follows:
In the world coordinate system, obtain the quaternion QUAT of the aircraft's real-time flight attitude and simulate the radar scanning view frustum: the radar mounting point p is the frustum eye point, and the plane at distance R from point p in the radar detection direction is the far plane xoz_distance. Set the origin of each lidar ray as p, passing through an arbitrary point e on the far plane xoz_distance; using the osg LineSegmentIntersector class with points p and e as endpoints, obtain the coordinates world_object of the obstacle data points detected by the lidar. The coordinates of all obstacle data points form the set Pcloud.
setting world coordinate systems as [ X _ AXIS, Y _ AXIS and Z _ AXIS ]; QUAT is quaternion transformation from an initial attitude to a flight attitude at a certain moment of an airplane;
QUAT=osg∷quat(α x ,X_AXIS,α y ,Y_AXIS,α z ,Z_AXIS);
wherein alpha is x 、α y 、α z The angles of the airplane rotating around the three axial directions of X _ AXIS, Y _ AXIS and Z _ AXIS are respectively divided into alpha under the initial attitude of the airplane x 、α y 、α z Are all 0;
e=QUAT×(-tx/2.0+dx×i,-ty/2.0+dy×j,R)+p;
where tx = R × tan(ha/2) is the maximum horizontal extent of the far plane xoz_distance;
ty = R × tan(va/2) is the maximum vertical extent of the far plane xoz_distance;
dx is the horizontal point precision of the point cloud image;
dy is the vertical point precision of the point cloud image;
i is the point cloud column index, in the range [0, hcount−1];
j is the point cloud row index, in the range [0, vcount−1].
Further, the specific method for drawing the point cloud image in step two is as follows: using the mapping between the height world_object.z of the obstacle data point coordinates world_object and HSV, obtain the color value (r, g, b) corresponding to world_object and draw the point cloud image:
h = (world_object.z − altitudeRangeLow) / altitudeRange × 360
where altitudeRangeLow is the lowest altitude of the terrain within the radar detection range;
altitudeRange is the altitude difference of the terrain within the radar detection range;
360 is the maximum hue value in HSV;
The HSV to RGB conversion formulas are:
h_i = ⌊h / 60⌋ mod 6
f = h / 60 − h_i
p = v × (1 − s)
q = v × (1 − f × s)
t = v × (1 − (1 − f) × s)
(r, g, b) = (v, t, p), (q, v, p), (p, v, t), (p, q, v), (t, p, v), (v, p, q) for h_i = 0, 1, 2, 3, 4, 5 respectively
where h, s, and v are hue, saturation, and value (brightness) respectively; taking s = 1 and v = 1, varying h alone suffices to distinguish different altitudes within the radar detection range.
Further, the conversion of the point cloud data from the UTM coordinate system to the body coordinate system in step two is as follows: let the body coordinate system axes be [x_axis, y_axis, z_axis], and rotate the world coordinate system to the initial body coordinate system so that the nose points along the positive y-axis of the world coordinate system.
Let the quaternion transformation from the world coordinate system to the initial body coordinate system be QUAT2:
QUAT2 = osg::Quat(α_jx, X_AXIS, α_jy, Y_AXIS, α_jz, Z_AXIS);
where α_jx, α_jy, α_jz are respectively the rotation components, about the X_AXIS, Y_AXIS, and Z_AXIS directions, between the initial nose orientation and the positive Y-axis direction of the world coordinate system;
x_axis = QUAT * QUAT2 * X_AXIS;
y_axis = QUAT * QUAT2 * Y_AXIS;
z_axis = QUAT * QUAT2 * Z_AXIS;
From the coordinates air_objpoint(x, y, z) of the lidar detection point in the body coordinate system, obtain the azimuth angle Azimuth, the pitch angle Elevation, and the slant range SlntRng of the lidar detection point at that moment.
Coordinates of the air_objpoint:
x = (dis * x_axis) / x_axis.length();
y = (dis * y_axis) / y_axis.length();
z = (dis * z_axis) / z_axis.length();
where dis is the vector from the lidar-detected obstacle data point to the fuselage center of mass, * denotes the vector dot product, and x_axis.length(), y_axis.length(), z_axis.length() are the moduli of x_axis, y_axis, and z_axis respectively.
Azimuth, Elevation, and SlntRng are:
Elevation = asin(z / air_objpoint.length());
Azimuth = atan(x / y);
SlntRng = air_objpoint.length();
where air_objpoint.length() is the modulus of air_objpoint.
Compared with the prior art, the invention has the following notable advantages:
(1) the lidar point cloud computation achieves a point precision better than 5 m within a detection distance of 1200 m while sustaining a point computation rate of about 300K points/s; (2) obstacle detection is accurate, and the point cloud data and obstacle target data can be transmitted in packets to interact with external systems.
Drawings
FIG. 1 is a flow diagram of the invention.
FIG. 2 is a schematic of the computation of the invention.
FIG. 3 is a terrain elevation display effect diagram of the invention.
FIG. 4 is an obstacle display effect diagram of the invention.
Detailed Description
The invention provides a method for simulating real-time image simulation of an airborne laser radar, described with reference to FIG. 1 and comprising the following steps:
Step one: obtain the number of lattice points needed to draw the lidar point cloud image and solve for the intersection coordinate set Pcloud between the lidar rays and obstacles, yielding massive point cloud data, as follows:
Define the radar scanning view frustum as shown in FIG. 2: the radar mounting point is the frustum eye point O; the plane ABCD is the far plane of the frustum, and OL is the maximum detection distance of the airborne lidar; the point cloud image point precision is d (the grid spacing within the plane ABCD, in meters); FOH is the lidar's vertical scanning field angle and EOG its horizontal scanning field angle. The number of point cloud points to be drawn can be obtained from the AD and AB edges together with the precision d. For a point I on the plane ABCD, the intersection point J of ray OI with an obstacle target is obtained using the osg LineSegmentIntersector class.
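Scene obstacle models are triangle meshes, so the ray–obstacle query above ultimately reduces to ray/triangle tests. The following Python sketch is an illustration of that primitive using the Möller–Trumbore algorithm, not the osgUtil::LineSegmentIntersector implementation; all names in it are chosen here:

```python
def ray_triangle_hit(o, d, v0, v1, v2, eps=1e-9):
    """Intersection of ray o + t*d (t >= 0) with triangle (v0, v1, v2).

    Möller-Trumbore algorithm; returns the hit point, or None on a miss.
    """
    def sub(a, b):
        return tuple(x - y for x, y in zip(a, b))

    def dot(a, b):
        return sum(x * y for x, y in zip(a, b))

    def cross(a, b):
        return (a[1] * b[2] - a[2] * b[1],
                a[2] * b[0] - a[0] * b[2],
                a[0] * b[1] - a[1] * b[0])

    e1, e2 = sub(v1, v0), sub(v2, v0)
    pvec = cross(d, e2)
    det = dot(e1, pvec)
    if abs(det) < eps:              # ray parallel to the triangle plane
        return None
    inv = 1.0 / det
    tvec = sub(o, v0)
    u = dot(tvec, pvec) * inv       # first barycentric coordinate
    if u < 0 or u > 1:
        return None
    qvec = cross(tvec, e1)
    v = dot(d, qvec) * inv          # second barycentric coordinate
    if v < 0 or u + v > 1:
        return None
    t = dot(e2, qvec) * inv
    if t < 0:                       # intersection behind the ray origin
        return None
    return tuple(oi + t * di for oi, di in zip(o, d))

# Ray OI from eye point O along +y toward a triangle 90 m ahead
print(ray_triangle_hit((0, 0, 0), (0, 1, 0),
                       (-10, 90, -10), (10, 90, -10), (0, 90, 10)))
```

An intersector such as the one the method uses runs this kind of test against every candidate triangle (usually pruned by a spatial hierarchy) and keeps the nearest hit.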
Define the maximum detection distance of the airborne lidar as R, the point cloud image point precision as d, and the lidar vertical scanning field angle as va, solving for the point cloud row count vcount; from the lidar horizontal scanning field angle ha, solve for the point cloud column count hcount.
The simulated lidar emits vcount × hcount rays, and the intersection coordinate set Pcloud (the massive point cloud data) between the lidar rays and obstacles, together with the distance between the aircraft and each obstacle point, is obtained using the osg LineSegmentIntersector class.
Step two: drawing a point cloud image, comprising the following steps:
and (4) dividing the point cloud data Pcloud obtained in the step one into different colors according to the linear mapping relation of HSV-height, and drawing a point cloud image in real time by using the osgShader.
And step one, converting the point cloud data Pcloud obtained in the step one from a UTM coordinate system to a machine body coordinate system, and outputting barrier distance, azimuth angle and pitch angle data (used for a receiver to draw scene barriers in real time) in a limited time.
Further, in step one, the point cloud row count vcount and column count hcount are calculated as follows:
vcount=R×tan(va/2)×2/d;
hcount=R×tan(ha/2)×2/d;
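The two formulas above can be sketched as a small routine (Python; the function and parameter names are chosen here, and angles are taken in degrees and converted):

```python
import math

def point_cloud_grid(R, d, va_deg, ha_deg):
    """Number of point-cloud rows and columns for a scan frustum.

    R: maximum detection distance (m); d: point spacing (m);
    va_deg / ha_deg: vertical / horizontal field of view (degrees).
    """
    va = math.radians(va_deg)
    ha = math.radians(ha_deg)
    vcount = int(R * math.tan(va / 2) * 2 / d)   # rows
    hcount = int(R * math.tan(ha / 2) * 2 / d)   # columns
    return vcount, hcount

# Example: 1200 m range, 5 m point spacing, 30° x 40° field of view
print(point_cloud_grid(1200, 5, 30, 40))
```

With R = 1200 m and d = 5 m, a 30° × 40° field of view yields a 128 × 174 lattice, i.e., roughly 22K rays per scan.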
Further, the first step obtains the intersection coordinates between the lidar rays and obstacles using the osg LineSegmentIntersector class, specifically:
First: in the world coordinate system, obtain the quaternion QUAT of the aircraft's real-time flight attitude and simulate the radar scanning view frustum: the radar mounting point p is the frustum eye point, and the plane at distance R from point p in the radar detection direction is the far plane xoz_distance. Set the origin of each lidar ray as p, passing through an arbitrary point e on the far plane xoz_distance; using the osg LineSegmentIntersector class with points p and e as endpoints, obtain the coordinates world_object of the obstacle data points detected by the lidar. The coordinates of all obstacle data points form the set Pcloud.
Let the world coordinate system axes be [X_AXIS, Y_AXIS, Z_AXIS]; QUAT is the quaternion transformation from the initial attitude to the flight attitude at a given moment:
QUAT = osg::Quat(α_x, X_AXIS, α_y, Y_AXIS, α_z, Z_AXIS);
where α_x, α_y, α_z are the rotation angles of the aircraft about the X_AXIS, Y_AXIS, and Z_AXIS directions respectively (in the initial attitude of the aircraft, α_x, α_y, and α_z are all 0);
e=QUAT×(-tx/2.0+dx×i,-ty/2.0+dy×j,R)+p;
where tx = R × tan(ha/2) is the maximum horizontal extent of the far plane xoz_distance;
ty = R × tan(va/2) is the maximum vertical extent of the far plane xoz_distance;
dx is the horizontal point precision of the point cloud image;
dy is the vertical point precision of the point cloud image;
i is the point cloud column index, in the range [0, hcount−1];
j is the point cloud row index, in the range [0, vcount−1].
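The endpoint formula for e can be sketched as follows (Python; a simplification that assumes the identity attitude, so the QUAT rotation is omitted, and takes the detection direction as +z to match the formula's third component; all names are chosen here):

```python
import math

def far_plane_points(p, R, d, va_deg, ha_deg):
    """Endpoints e of the simulated lidar rays on the far plane.

    Follows e = (-tx/2 + dx*i, -ty/2 + dy*j, R) + p from the text with
    dx = dy = d and QUAT = identity; in the method each offset would
    additionally be rotated by the attitude quaternion before adding p.
    """
    va, ha = math.radians(va_deg), math.radians(ha_deg)
    tx = R * math.tan(ha / 2)          # horizontal far-plane extent
    ty = R * math.tan(va / 2)          # vertical far-plane extent
    hcount = int(2 * tx / d)           # columns
    vcount = int(2 * ty / d)           # rows
    px, py, pz = p
    points = []
    for j in range(vcount):
        for i in range(hcount):
            points.append((px - tx / 2.0 + d * i,
                           py - ty / 2.0 + d * j,
                           pz + R))
    return points

pts = far_plane_points((0.0, 0.0, 0.0), 100.0, 10.0, 30.0, 30.0)
print(len(pts))
```

One ray is then cast from p through each endpoint, so the list length equals the vcount × hcount ray budget of step one.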
The specific method for drawing the point cloud image in step two: using the mapping between the height world_object.z of the obstacle data point coordinates world_object and HSV, obtain the color value (r, g, b) corresponding to world_object and draw the point cloud image; FIG. 3 shows the terrain elevation display effect and FIG. 4 the obstacle display effect:
h = (world_object.z − altitudeRangeLow) / altitudeRange × 360
where altitudeRangeLow is the lowest altitude of the terrain within the radar detection range;
altitudeRange is the altitude difference of the terrain within the radar detection range;
360 is the maximum hue value in HSV.
The HSV to RGB conversion formulas are:
h_i = ⌊h / 60⌋ mod 6
f = h / 60 − h_i
p = v × (1 − s)
q = v × (1 − f × s)
t = v × (1 − (1 − f) × s)
(r, g, b) = (v, t, p), (q, v, p), (p, v, t), (p, q, v), (t, p, v), (v, p, q) for h_i = 0, 1, 2, 3, 4, 5 respectively
where h, s, and v are hue, saturation, and value (brightness) respectively; s and v range over 0–1 and are taken as s = 1 and v = 1 by default, so varying h alone suffices to distinguish different altitudes within the radar detection range.
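The HSV-height mapping and the conversion formulas above combine into one small routine (Python sketch; altitude_to_rgb, alt_low, and alt_range are names chosen here, mirroring altitudeRangeLow and altitudeRange):

```python
def altitude_to_rgb(z, alt_low, alt_range):
    """Map altitude z to an RGB color via hue, with s = v = 1.

    Linear HSV-height mapping followed by the standard HSV-to-RGB
    conversion; each component of the result lies in [0, 1].
    """
    h = (z - alt_low) / alt_range * 360.0   # linear HSV-height mapping
    s, v = 1.0, 1.0
    hi = int(h // 60) % 6                   # hue sector
    f = h / 60.0 - int(h // 60)             # position within the sector
    p = v * (1 - s)
    q = v * (1 - f * s)
    t = v * (1 - (1 - f) * s)
    return [(v, t, p), (q, v, p), (p, v, t),
            (p, q, v), (t, p, v), (v, p, q)][hi]

# Lowest terrain maps to hue 0 (red); mid-range terrain to hue 180 (cyan)
print(altitude_to_rgb(0.0, 0.0, 100.0))
print(altitude_to_rgb(50.0, 0.0, 100.0))
```

Each point color would then be handed to the shader as a per-vertex attribute when the point cloud is drawn.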
The conversion of the point cloud data from the UTM coordinate system to the body coordinate system in step two is as follows:
Let the body coordinate system axes be [x_axis, y_axis, z_axis], and rotate the world coordinate system to the initial body coordinate system so that the nose points along the positive y-axis of the world coordinate system.
Let the quaternion transformation from the world coordinate system to the initial body coordinate system be QUAT2:
QUAT2 = osg::Quat(α_jx, X_AXIS, α_jy, Y_AXIS, α_jz, Z_AXIS);
where α_jx, α_jy, α_jz are respectively the rotation components, about the X_AXIS, Y_AXIS, and Z_AXIS directions, between the initial nose orientation and the positive Y-axis direction of the world coordinate system;
x_axis = QUAT * QUAT2 * X_AXIS;
y_axis = QUAT * QUAT2 * Y_AXIS;
z_axis = QUAT * QUAT2 * Z_AXIS;
From the coordinates air_objpoint(x, y, z) of the lidar detection point in the body coordinate system, obtain the azimuth angle Azimuth, the pitch angle Elevation, and the slant range SlntRng of the lidar detection point at that moment.
Coordinates of the air_objpoint:
x = (dis * x_axis) / x_axis.length();
y = (dis * y_axis) / y_axis.length();
z = (dis * z_axis) / z_axis.length();
where dis is the vector from the lidar-detected obstacle data point to the fuselage center of mass, * denotes the vector dot product, and x_axis.length(), y_axis.length(), z_axis.length() are the moduli of x_axis, y_axis, and z_axis respectively.
Azimuth, Elevation, and SlntRng are then obtained as:
Elevation = asin(z / air_objpoint.length());
Azimuth = atan(x / y);
SlntRng = air_objpoint.length();
where air_objpoint.length() is the modulus of air_objpoint.
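The body-frame projection and the angle formulas above can be sketched as follows (Python; the names are chosen here, the '*' in x = (dis * x_axis) / x_axis.length() is read as a dot product, and atan2 stands in for atan(x/y) to stay quadrant-safe):

```python
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def length(a):
    return math.sqrt(dot(a, a))

def to_body_frame(dis, x_axis, y_axis, z_axis):
    """Project the world-frame vector dis onto the body axes and derive
    azimuth, elevation (pitch), and slant range as in the text.

    The axes need not be unit length: each coordinate is the scalar
    projection dis . axis / |axis|.
    """
    x = dot(dis, x_axis) / length(x_axis)
    y = dot(dis, y_axis) / length(y_axis)
    z = dot(dis, z_axis) / length(z_axis)
    rng = length((x, y, z))            # SlntRng
    elevation = math.asin(z / rng)     # pitch angle
    azimuth = math.atan2(x, y)         # atan(x/y), quadrant-safe
    return azimuth, elevation, rng

# Obstacle 100 m ahead of the nose (+y) and 100 m above (+z),
# identity attitude so body axes coincide with world axes
az, el, rng = to_body_frame((0, 100, 100), (1, 0, 0), (0, 1, 0), (0, 0, 1))
print(math.degrees(el), rng)
```

These per-point (azimuth, elevation, range) triples are what the method packetizes for the receiver alongside the colored point cloud.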
The foregoing illustrates and describes the principles, main features, and advantages of the invention. Those skilled in the art will understand that the invention is not limited to the embodiments described above; the embodiments and the description only illustrate the principles of the invention, and various changes and improvements may be made without departing from its spirit and scope, all of which fall within the scope of the claimed invention. The scope of the invention is defined by the appended claims and their equivalents.

Claims (6)

1. A method for simulating real-time image simulation of an airborne laser radar, characterized by comprising the following steps:
step one: obtaining the number of lattice points required to draw the lidar point cloud image, and solving for the intersection coordinate set Pcloud between the lidar rays and obstacles to obtain massive point cloud data;
step two: drawing the point cloud image: dividing the point cloud data obtained in step one into different colors according to the linear HSV-height mapping, and drawing the point cloud image in real time with an osg shader; and converting the point cloud data Pcloud obtained in step one from the UTM coordinate system to the body coordinate system, and outputting obstacle distance, azimuth angle, and pitch angle data within a limited time.
2. The method for simulating airborne lidar real-time image simulation of claim 1,
wherein step one specifically comprises: defining the maximum detection distance of the airborne lidar as R, the point cloud image point precision as d, and the lidar vertical scanning field angle as va, solving for the point cloud row count vcount; and, from the lidar horizontal scanning field angle ha, solving for the point cloud column count hcount;
simulating the lidar emitting vcount × hcount rays, and obtaining the intersection coordinate set Pcloud between the lidar rays and obstacles, i.e., the massive point cloud data, and the distance between the aircraft and each obstacle point, using the osg LineSegmentIntersector class.
3. The method for simulating airborne lidar real-time image simulation of claim 2,
wherein, in step one, the point cloud row count vcount and column count hcount are calculated as follows:
vcount=R×tan(va/2)×2/d;
hcount=R×tan(ha/2)×2/d。
4. the method for simulating airborne lidar real-time image simulation of claim 3,
wherein, in step one, the intersection coordinate set Pcloud between the lidar rays and obstacles is obtained as follows:
in the world coordinate system, obtaining the quaternion QUAT of the aircraft's real-time flight attitude and simulating the radar scanning view frustum: the radar mounting point p is the frustum eye point, and the plane at distance R from point p in the radar detection direction is the far plane xoz_distance; setting the origin of each lidar ray as p, passing through an arbitrary point e on the far plane xoz_distance; using the osg LineSegmentIntersector class with points p and e as endpoints, obtaining the coordinates world_object of the obstacle data points detected by the lidar, the coordinates of all obstacle data points forming the set Pcloud,
letting the world coordinate system axes be [X_AXIS, Y_AXIS, Z_AXIS]; QUAT is the quaternion transformation from the initial attitude to the flight attitude at a given moment:
QUAT = osg::Quat(α_x, X_AXIS, α_y, Y_AXIS, α_z, Z_AXIS);
where α_x, α_y, α_z are the rotation angles of the aircraft about the X_AXIS, Y_AXIS, and Z_AXIS directions respectively, and in the initial attitude of the aircraft α_x, α_y, and α_z are all 0;
e=QUAT×(-tx/2.0+dx×i,-ty/2.0+dy×j,R)+p;
where tx = R × tan(ha/2) is the maximum horizontal extent of the far plane xoz_distance;
ty = R × tan(va/2) is the maximum vertical extent of the far plane xoz_distance;
dx is the horizontal point precision of the point cloud image;
dy is the vertical point precision of the point cloud image;
i is the point cloud column index, in the range [0, hcount−1];
j is the point cloud row index, in the range [0, vcount−1].
5. The method for simulating airborne lidar real-time image simulation of claim 4,
wherein the specific method for drawing the point cloud image in step two is: using the mapping between the height world_object.z of the obstacle data point coordinates world_object and HSV, obtaining the color value (r, g, b) corresponding to world_object and drawing the point cloud image:
h = (world_object.z − altitudeRangeLow) / altitudeRange × 360
where altitudeRangeLow is the lowest altitude of the terrain within the radar detection range;
altitudeRange is the altitude difference of the terrain within the radar detection range;
360 is the maximum hue value in HSV;
the HSV to RGB conversion formulas being:
h_i = ⌊h / 60⌋ mod 6
f = h / 60 − h_i
p = v × (1 − s)
q = v × (1 − f × s)
t = v × (1 − (1 − f) × s)
(r, g, b) = (v, t, p), (q, v, p), (p, v, t), (p, q, v), (t, p, v), (v, p, q) for h_i = 0, 1, 2, 3, 4, 5 respectively
where h, s, and v are hue, saturation, and value (brightness) respectively; taking s = 1 and v = 1, varying h alone suffices to distinguish different altitudes within the radar detection range.
6. The method for simulating airborne lidar real-time image simulation of claim 5,
wherein the conversion of the point cloud data from the UTM coordinate system to the body coordinate system in step two is as follows: letting the body coordinate system axes be [x_axis, y_axis, z_axis], and rotating the world coordinate system to the initial body coordinate system so that the nose points along the positive y-axis of the world coordinate system;
letting the quaternion transformation from the world coordinate system to the initial body coordinate system be QUAT2:
QUAT2 = osg::Quat(α_jx, X_AXIS, α_jy, Y_AXIS, α_jz, Z_AXIS);
where α_jx, α_jy, α_jz are respectively the rotation components, about the X_AXIS, Y_AXIS, and Z_AXIS directions, between the initial nose orientation and the positive Y-axis direction of the world coordinate system;
x_axis = QUAT * QUAT2 * X_AXIS;
y_axis = QUAT * QUAT2 * Y_AXIS;
z_axis = QUAT * QUAT2 * Z_AXIS;
obtaining, from the coordinates air_objpoint(x, y, z) of the lidar detection point in the body coordinate system, the azimuth angle Azimuth, the pitch angle Elevation, and the slant range SlntRng of the lidar detection point at that moment;
coordinates of the air_objpoint:
x = (dis * x_axis) / x_axis.length();
y = (dis * y_axis) / y_axis.length();
z = (dis * z_axis) / z_axis.length();
where dis is the vector from the lidar-detected obstacle data point to the fuselage center of mass, * denotes the vector dot product, and x_axis.length(), y_axis.length(), z_axis.length() are the moduli of x_axis, y_axis, and z_axis respectively;
Azimuth, Elevation, and SlntRng being:
Elevation = asin(z / air_objpoint.length());
Azimuth = atan(x / y);
SlntRng = air_objpoint.length();
where air_objpoint.length() is the modulus of air_objpoint.
CN202210654115.7A 2022-06-10 2022-06-10 Method for simulating real-time image simulation of airborne laser radar Pending CN115079142A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210654115.7A CN115079142A (en) 2022-06-10 2022-06-10 Method for simulating real-time image simulation of airborne laser radar


Publications (1)

Publication Number Publication Date
CN115079142A true CN115079142A (en) 2022-09-20

Family

ID=83252032

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210654115.7A Pending CN115079142A (en) 2022-06-10 2022-06-10 Method for simulating real-time image simulation of airborne laser radar

Country Status (1)

Country Link
CN (1) CN115079142A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination