CN113297668B - Three-light pod simulation method based on UE4 game engine

Info

Publication number
CN113297668B
Authority
CN
China
Prior art keywords
pod
simulation
angle
camera
steps
Prior art date
Legal status
Active
Application number
CN202110523247.1A
Other languages
Chinese (zh)
Other versions
CN113297668A (en
Inventor
徐新海
周东傲
刘兆鹏
姚剑
杨伟龙
叶帅
Current Assignee
Research Institute of War of PLA Academy of Military Science
Original Assignee
Research Institute of War of PLA Academy of Military Science
Priority date
Filing date
Publication date
Application filed by Research Institute of War of PLA Academy of Military Science
Priority to CN202110523247.1A
Publication of CN113297668A
Application granted
Publication of CN113297668B
Legal status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F30/00 Computer-aided design [CAD]
    • G06F30/10 Geometric CAD
    • G06F30/15 Vehicle, aircraft or watercraft design
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64D EQUIPMENT FOR FITTING IN OR TO AIRCRAFT; FLIGHT SUITS; PARACHUTES; ARRANGEMENTS OR MOUNTING OF POWER PLANTS OR PROPULSION TRANSMISSIONS IN AIRCRAFT
    • B64D47/00 Equipment not otherwise provided for
    • B64D47/08 Arrangements of cameras
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F30/00 Computer-aided design [CAD]
    • G06F30/20 Design optimisation, verification or simulation
    • G06F30/23 Design optimisation, verification or simulation using finite element methods [FEM] or finite difference methods [FDM]

Abstract

The invention provides a three-light pod simulation method based on the UE4 game engine, comprising pod control simulation, optical camera simulation, laser ranging simulation and infrared camera simulation. Three-degree-of-freedom pod rotation is simulated so that the pod control approaches the real situation, which reduces experiment cost. The method can obtain visible-light images, infrared images and target distance information, and through multi-information fusion it can be used to train the information perception capability of an unmanned aerial vehicle system.

Description

Three-light pod simulation method based on UE4 game engine
Technical Field
The invention belongs to the technical field of unmanned aerial vehicle system simulation, and particularly relates to a three-light pod simulation method based on a UE4 game engine.
Background
Three-light pod simulation refers to simulating the three-light pod by computer. The "three lights" are an optical camera, a laser range finder and an infrared camera; they are coaxial and are controlled by a three-degree-of-freedom rotating platform. The three-light pod is an important device for unmanned-system detection and perception and is widely used in detection at sea, on land and in the air; its carrier can be an unmanned vehicle, an unmanned boat or an unmanned aerial vehicle. Through simulation calculation of the three-light pod, a large amount of training data can be provided for the unmanned system.
A three-light pod simulation model belongs to the sensor models in unmanned aerial vehicle system simulation and performs simulation calculation of the optical scene, the target distance and the infrared scene in front of the unmanned aerial vehicle. Modern game engines such as UE4 and Unity offer excellent rendering quality, efficient ray-tracing algorithms and other advantages, so they are widely used for scene simulation of unmanned aerial vehicle systems.
Simulation of the three-light pod can provide sensor data close to the real thing, and this data can be used directly to train an unmanned aerial vehicle system, which greatly reduces research cost and shortens the development cycle while keeping the data reliable. At present, unmanned aerial vehicle systems mostly acquire sensor data through semi-physical or physical simulation. The three-light pod of a typical unmanned aerial vehicle adopts a three-light coaxial structure; it can rotate in the three directions of yaw, pitch and roll, measure the attitude of the pod on the moving carrier, feed the attitude information back to the control system, and, combined with the sensor data on the unmanned aerial vehicle, capture and track targets. Acquiring data in this way is easily limited by experimental conditions and is costly in manpower, time and expense. By using the lighting rendering, collision detection and other technologies of a modern game engine, the detection and perception functions of the three-light pod can be simulated realistically, with the advantages of arbitrarily constructed scenes and agile development.
Disclosure of Invention
(I) Technical problem to be solved
The invention provides a game-engine-based three-light pod simulation method to solve the technical problem of how to realize pod control, optical camera photographing and zooming, laser ranging, and infrared camera photographing and zooming using the UE4 game engine.
(II) technical scheme
In order to solve the technical problem, the invention provides a simulation method of a three-light pod based on a UE4 game engine, which comprises the following steps:
(1) Pod control simulation
The pod adopts a three-light coaxial structure, i.e. the pod controls the attitudes of the optical camera, the laser range finder and the infrared camera simultaneously. First, a relative coordinate system O-FRD with the center of the unmanned aerial vehicle as its origin is established in the game engine using a left-handed coordinate system, where F is the nose direction, R is the right-wing direction, and D is the downward direction along the vertical tail. In the initial state the pod and the unmanned aerial vehicle have the same attitude and the pod coordinate system O-xyz is aligned with O-FRD; the clockwise rotation angle of the pod about the z axis (clockwise when viewed along the z-axis direction) is the yaw angle α, the rotation angle of the pod about the y axis is the pitch angle β, and the clockwise rotation angle of the pod about the x axis is the roll angle γ;
at the initial state, the pod coordinate is P0Yaw angle of alpha0Pitch angle of beta0With a roll angle of gamma0(ii) a The simulated bird rotates at an angular velocity w/s to an angle (alpha)111) And satisfies the condition-180 DEG<α1<180°、-90°<β1<90°、-180°<γ1<180°;
The pod control simulation steps are as follows:
S1. Convert the real-world angular velocity into the simulated-world angular velocity: if 1 s of wall-clock time corresponds to 60 frames of simulation time, the simulated-world pod angular velocity ω per frame is:
ω = w / 60
S2. Rotate the pod attitude in the order yaw angle, pitch angle, roll angle, i.e. first rotate α1 − α0 about the x axis, then rotate β1 − β0 about the y axis, and finally rotate γ1 − γ0 about the z axis;
S3. Calculate the number of frames N required to rotate α1 − α0 about the x axis according to the following formula:
N = ⌈(α1 − α0) / ω⌉
Denote the current frame by n; from frame 1 to frame N − 1, i.e. 1 ≤ n ≤ N − 1, the target yaw angle αn of frame n is:
αn = α0 + nω
Set the pod yaw angle to αn; the angular rotation is carried out by the game engine. When n = N, set the pod yaw angle to α1;
S4. Calculate the number of frames M required to rotate β1 − β0 about the y axis according to the following formula:
M = ⌈(β1 − β0) / ω⌉
From frame n = N + 1 to n = N + M − 1, the target pitch angle βn is:
βn = β0 + (n − N)ω
Set the pod pitch angle to βn; when n = N + M, set the pod pitch angle to β1;
S5. Calculate the number of frames L required to rotate γ1 − γ0 about the z axis according to the following formula:
L = ⌈(γ1 − γ0) / ω⌉
From frame n = N + M + 1 to n = N + M + L − 1, the target roll angle γn is:
γn = γ0 + (n − N − M)ω
Set the pod roll angle to γn; when n = N + M + L, set the pod roll angle to γ1;
(2) Optical camera simulation
Carrying out optical camera shooting simulation by utilizing a graphic rendering component of a game engine;
(3) Laser ranging simulation
Performing laser ranging simulation by using a ray casting and collision query component of the game engine;
(4) Infrared camera simulation
Performing infrared camera photographing simulation by using a scene capture component of the game engine.
Further, the specific steps of the photographing simulation of the optical camera are as follows:
S1. Scene construction: constructing the scene required for optical imaging in UE4, including terrain, lighting, environment and objects;
S2. Optical camera parameter setting: this includes the photosensitive-element parameters, the base (1×) focal length, the image resolution and the Gamma value; the base focal length is set to f1, the distances from the center of the photosensitive element to its vertical and horizontal edges are dx and dy, and the camera intrinsic matrix E is calculated according to the following formula:
E = [[f1, 0, dx], [0, f1, dy], [0, 0, 1]]
Setting the output image resolution to 960 × 640 and the Gamma value to 3 through the Camera class interface provided by UE4; converting the zoom multiple n of the focal length into field-of-view angles according to the following formulas:
Horizontal field-of-view angle FOVx:
FOVx = 2·arctan(dx / (n·f1))
Vertical field-of-view angle FOVy:
FOVy = 2·arctan(dy / (n·f1))
S3. Rendering and imaging: according to the optical camera parameters provided in step S2, rendering and imaging the light within the camera field of view with the scene capture and canvas rendering components of UE4; first determining the scene capture range from the camera field-of-view angles, then creating a 960 × 640 canvas and updating the canvas resources, and finally generating a two-dimensional BMP picture by rendering;
S4. Format conversion and compression: converting the two-dimensional BMP picture into a picture in another format using the ImageWrapper library of UE4, and setting the compression rate.
Further, a laser ranging simulation is performed by using a ray casting and collision query component of the UE4, and the specific steps are as follows:
S1. Obtain the coordinates p0 of the laser range finder itself and the laser emission direction vector l from the pod attitude; given that the maximum laser ranging distance is d, the end point of the line segment is p1. Any point p(t) on the segment p0p1 is given by the following parametric equation:
p(t) = p0 + t·d·l, t ∈ [0, 1]
S2. Use the collision query component of UE4 to obtain the earliest collision point, i.e. the collision point closest to p0 along the direction l, which corresponds to the minimum value tmin of t; the distance from the laser range finder to the target, given by the point p(tmin), is then d·tmin.
Further, the infrared camera photographing simulation is performed by using the scene capturing component of the UE4, and the specific steps are as follows:
S1. Preprocessing: acquire the Actors of each class from the game world using the GetAllActorOfClass() function of the scene capture component, i.e. obtain the attributes of the various targets; then acquire the meshes of each Actor using the GetComponents() function of the scene capture component, i.e. divide each target into several parts, and assign to each target mesh an appropriate temperature and emissivity ε;
S2. Infrared camera parameter setting: this includes the thermal-sensing-element parameters, the base (1×) focal length and the image resolution; the base focal length is set to f1, the distances from the center of the sensing element to its vertical and horizontal edges are dx and dy, and the camera intrinsic matrix E is calculated according to the following formula:
E = [[f1, 0, dx], [0, f1, dy], [0, 0, 1]]
Setting the output image resolution to 960 × 640 through the Camera class interface provided by UE4; converting the zoom multiple n of the focal length into field-of-view angles according to the following formulas:
Horizontal field-of-view angle FOVx:
FOVx = 2·arctan(dx / (n·f1))
Vertical field-of-view angle FOVy:
FOVy = 2·arctan(dy / (n·f1))
S3. Post-processing to generate the infrared image: according to the infrared radiation energy of each pixel computed from step S1 and the camera parameters provided in step S2, rendering and imaging the infrared radiation energy within the camera field of view using the scene capture and canvas rendering components of UE4.
Further, in step S3 (post-processing to generate the infrared image), the specific steps of rendering and imaging the infrared radiation energy within the camera field of view are as follows:
S3.1. Calculate the radiant energy per unit wavelength according to the following formula:
E(λ) = ε·C1 / (λ^5·(e^(C2/(λ·T)) − 1))
where ε is the emissivity of the object; C1 = 3.74 × 10⁻¹² W·cm²; C2 = 1.44 cm·K; e is the base of the natural logarithm; λ is the wavelength, between 0.1 and 14 μm; T is the object temperature;
S3.2. Calculate the total radiant energy according to the following formula:
EIR = ∫[λ1, λ2] E(λ) dλ
where λ1 = 0.1 μm and λ2 = 14 μm;
S3.3. Model thermal noise and Johnson noise by white Gaussian noise εw ~ N(0, 1), and model jitter noise by Gaussian noise εf ~ N(0, 1/f);
S3.4. Calculate the total received energy according to the following formula:
Ereceive = EIR + εw + εf
S3.5. Normalize the total energy computed in S3.4 to obtain the infrared image.
(III) advantageous effects
The invention provides a three-light pod simulation method based on a game engine, comprising pod control simulation, optical camera simulation, laser ranging simulation and infrared camera simulation. The method can obtain visible-light images, infrared images and target distance information, and through multi-information fusion it can be used to train the information perception capability of an unmanned aerial vehicle system.
Drawings
FIG. 1 is a schematic view of a coordinate system of a drone and a coordinate system of a pod in an embodiment of the present invention;
FIG. 2 is a schematic diagram of the simulation principle of laser ranging in the embodiment of the present invention.
Detailed Description
In order to make the objects, contents and advantages of the present invention clearer, the following detailed description of the embodiments of the present invention will be made in conjunction with the accompanying drawings and examples.
The embodiment provides a three-light pod simulation method based on a game engine, which comprises the following contents:
(1) Pod control simulation
The pod adopts a three-light coaxial structure, i.e. the pod controls the attitudes of the optical camera, the laser range finder and the infrared camera simultaneously. As shown in FIG. 1, a relative coordinate system O-FRD with the center of the drone as its origin is first established in the game engine using a left-handed coordinate system, where F is the nose direction, R is the right-wing direction, and D is the downward direction along the vertical tail. In the initial state the pod and the drone have the same attitude and the pod coordinate system O-xyz is aligned with O-FRD; the clockwise rotation angle of the pod about the z axis (clockwise when viewed along the z-axis direction) is the yaw angle α, the clockwise rotation angle of the pod about the y axis is the pitch angle β, and the clockwise rotation angle of the pod about the x axis is the roll angle γ.
In the initial state the pod position is P0, the yaw angle is α0, the pitch angle is β0 and the roll angle is γ0. The simulated pod is rotated at an angular velocity of w (°/s) to the target attitude (α1, β1, γ1), which satisfies −180° < α1 < 180°, −90° < β1 < 90°, −180° < γ1 < 180°.
The pod control simulation steps are as follows:
S1. Convert the real-world angular velocity into the simulated-world angular velocity: if 1 s of wall-clock time corresponds to 60 frames of simulation time, the simulated-world pod angular velocity ω per frame is:
ω = w / 60
S2. Rotate the pod attitude in the order yaw angle, pitch angle, roll angle, i.e. first rotate α1 − α0 about the x axis, then rotate β1 − β0 about the y axis, and finally rotate γ1 − γ0 about the z axis.
S3. Calculate the number of frames N required to rotate α1 − α0 about the x axis according to the following formula:
N = ⌈(α1 − α0) / ω⌉
Denote the current frame by n; from frame 1 to frame N − 1, i.e. 1 ≤ n ≤ N − 1, the target yaw angle αn of frame n is:
αn = α0 + nω
Set the pod yaw angle to αn; the angular rotation is achieved by means provided in the game engine (e.g., the SetRotation() function in UE4, or the Rotate() method in Unity). When n = N, the pod yaw angle is set to α1.
S4. Calculate the number of frames M required to rotate β1 − β0 about the y axis according to the following formula:
M = ⌈(β1 − β0) / ω⌉
From frame n = N + 1 to n = N + M − 1, the target pitch angle βn is:
βn = β0 + (n − N)ω
Set the pod pitch angle to βn; when n = N + M, the pod pitch angle is set to β1.
S5. Calculate the number of frames L required to rotate γ1 − γ0 about the z axis according to the following formula:
L = ⌈(γ1 − γ0) / ω⌉
From frame n = N + M + 1 to n = N + M + L − 1, the target roll angle γn is:
γn = γ0 + (n − N − M)ω
Set the pod roll angle to γn; when n = N + M + L, the pod roll angle is set to γ1.
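To make the frame-stepping of S1–S5 concrete, the following self-contained C++ sketch computes the per-frame target attitude. It is an illustration only, not the claimed implementation: the function names, the ceiling-based frame counts and the sign handling for negative angle differences are assumptions added for the example.

```cpp
#include <cmath>
#include <cstdio>

// Frame-stepped pod rotation as in steps S1-S5: yaw, then pitch, then roll,
// advanced by omega = w / 60 degrees per frame (60 simulation frames per
// wall-clock second).
struct Attitude { double yaw, pitch, roll; };

static int framesFor(double delta, double omega) {
    return static_cast<int>(std::ceil(std::fabs(delta) / omega));   // N, M or L
}

static double step(double from, double to, double omega, int k, int kMax) {
    if (k <= 0) return from;                       // this phase has not started yet
    if (k >= kMax) return to;                      // final frame snaps to the target
    double dir = (to >= from) ? 1.0 : -1.0;
    return from + dir * k * omega;                 // e.g. alpha_n = alpha_0 + n * omega
}

// Target attitude at simulation frame n (1-based), rotating from a0 to a1
// at w degrees per wall-clock second.
Attitude targetAtFrame(const Attitude& a0, const Attitude& a1, double w, int n) {
    const double omega = w / 60.0;                         // S1: degrees per frame
    const int N = framesFor(a1.yaw - a0.yaw, omega);       // S3
    const int M = framesFor(a1.pitch - a0.pitch, omega);   // S4
    const int L = framesFor(a1.roll - a0.roll, omega);     // S5
    Attitude a;
    a.yaw   = step(a0.yaw,   a1.yaw,   omega, n,         N);
    a.pitch = step(a0.pitch, a1.pitch, omega, n - N,     M);
    a.roll  = step(a0.roll,  a1.roll,  omega, n - N - M, L);
    return a;
}

int main() {
    Attitude a0{0.0, 0.0, 0.0}, a1{30.0, -10.0, 5.0};
    for (int n = 1; n <= 95; n += 10) {
        Attitude a = targetAtFrame(a0, a1, 30.0, n);       // w = 30 deg/s
        std::printf("frame %2d: yaw %6.2f  pitch %6.2f  roll %6.2f\n",
                    n, a.yaw, a.pitch, a.roll);
    }
    return 0;
}
```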
(2) Optical camera simulation
The photo shooting simulation of the optical camera is carried out by utilizing the graphic rendering technology of the modern game engine.
Taking UE4 as an example, the specific steps of the optical camera photographing simulation are as follows:
S1. Scene construction: the scene required for optical imaging is constructed in UE4, including terrain (e.g., mountains, rivers, trees), lighting (e.g., sunlight, artificial light), environment (e.g., clouds, fog, rain), and objects (e.g., vehicles, pedestrians).
S2. Optical camera parameter setting: this includes the photosensitive-element parameters, the base (1×) focal length, the image resolution and the Gamma value. The base focal length is set to f1, the distances from the center of the photosensitive element to its vertical and horizontal edges are dx and dy, and the camera intrinsic matrix E is calculated according to the following formula:
E = [[f1, 0, dx], [0, f1, dy], [0, 0, 1]]
the output image resolution was set to 960 × 640 and the Gamma value to 3 through the Camera class interface provided by the UE 4. Converting the focal length multiple n into a field angle according to the following formula:
Horizontal field-of-view angle FOVx:
FOVx = 2·arctan(dx / (n·f1))
Vertical field-of-view angle FOVy:
FOVy = 2·arctan(dy / (n·f1))
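As a quick check of the zoom-to-FOV conversion in step S2, here is a small C++ sketch. It assumes the reconstructed pinhole relations above (half-sensor extents dx, dy and zoomed focal length n·f1); the sensor dimensions and base focal length in main() are illustrative values, not parameters taken from the patent.

```cpp
#include <cmath>
#include <cstdio>
#include <initializer_list>

// Zoom-multiple-to-field-of-view conversion for the pinhole camera of step S2:
// FOVx = 2*atan(dx / (n*f1)), FOVy = 2*atan(dy / (n*f1)), where dx and dy are
// the distances from the sensor center to its horizontal and vertical edges,
// f1 is the base (1x) focal length and n is the zoom multiple.
struct Fov { double horizontalDeg, verticalDeg; };

Fov fovForZoom(double dx, double dy, double f1, double n) {
    const double rad2deg = 180.0 / 3.14159265358979323846;
    return { 2.0 * std::atan(dx / (n * f1)) * rad2deg,
             2.0 * std::atan(dy / (n * f1)) * rad2deg };
}

int main() {
    // Illustrative sensor half-extents and base focal length in mm.
    const double dx = 6.4, dy = 4.27, f1 = 8.0;
    for (double n : {1.0, 2.0, 4.0, 8.0}) {
        Fov fov = fovForZoom(dx, dy, f1, n);
        std::printf("zoom %.0fx: FOVx = %5.1f deg, FOVy = %5.1f deg\n",
                    n, fov.horizontalDeg, fov.verticalDeg);
    }
    return 0;
}
```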
S3. Rendering and imaging: according to the optical camera parameters provided in step S2, the light within the camera field of view is rendered and imaged with the scene capture and canvas rendering components of UE4. First the scene capture range is determined from the camera field-of-view angles, then a 960 × 640 canvas is created and the canvas resources are updated, and finally a two-dimensional BMP picture is generated by rendering.
S4. Format conversion and compression: the ImageWrapper library of UE4 is used to convert the two-dimensional BMP picture into a picture in PNG, JPEG, BMP, ICO, EXR or another format, and the compression rate is set.
(3) Laser ranging simulation
The laser ranging simulation is carried out using the ray casting and collision query components of UE4, with the following specific steps:
S1. As shown in FIG. 2, obtain the coordinates p0 of the laser range finder itself and the laser emission direction vector l from the pod attitude; given that the maximum laser ranging distance is d, the end point of the line segment is p1. Any point p(t) on the segment p0p1 is given by the following parametric equation:
p(t) = p0 + t·d·l, t ∈ [0, 1]
S2. Use the collision query component of UE4 to obtain the earliest collision point, i.e. the collision point closest to p0 along the direction l, which corresponds to the minimum value tmin of t; the distance from the laser range finder to the target, given by the point p(tmin), is then d·tmin.
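The earliest-hit selection in S2 can be sketched outside the engine as follows. This is a self-contained stand-in, not the UE4 implementation: obstacles are axis-aligned boxes tested with a slab method, the direction vector l is assumed to be of unit length, and inside UE4 the same role would be played by the engine's ray-trace collision queries.

```cpp
#include <algorithm>
#include <cmath>
#include <cstdio>
#include <vector>

// Stand-in for the laser ranging query of S1-S2: points on the laser segment
// are p(t) = p0 + t*d*l with t in [0,1], and the measured range is d*t_min,
// where t_min belongs to the earliest hit.
struct Vec3 { double x, y, z; };
struct Box  { Vec3 min, max; };

// Smallest t in [0,1] at which the segment enters the box, or -1 if it misses.
double enterT(const Vec3& p0, const Vec3& l, double d, const Box& b) {
    const double o[3]   = {p0.x, p0.y, p0.z};
    const double dir[3] = {d * l.x, d * l.y, d * l.z};
    const double lo[3]  = {b.min.x, b.min.y, b.min.z};
    const double hi[3]  = {b.max.x, b.max.y, b.max.z};
    double t0 = 0.0, t1 = 1.0;
    for (int i = 0; i < 3; ++i) {
        if (std::fabs(dir[i]) < 1e-12) {
            if (o[i] < lo[i] || o[i] > hi[i]) return -1.0;   // parallel and outside
            continue;
        }
        double a = (lo[i] - o[i]) / dir[i];
        double c = (hi[i] - o[i]) / dir[i];
        if (a > c) std::swap(a, c);
        t0 = std::max(t0, a);
        t1 = std::min(t1, c);
        if (t0 > t1) return -1.0;                            // slabs do not overlap
    }
    return t0;
}

int main() {
    Vec3 p0{0.0, 0.0, 100.0};      // laser range finder position
    Vec3 l{0.0, 0.0, -1.0};        // unit emission direction (pointing down)
    const double d = 5000.0;       // maximum ranging distance
    std::vector<Box> targets = {
        {{-5.0, -5.0, 0.0},  {5.0, 5.0, 10.0}},   // ground target
        {{-2.0, -2.0, 40.0}, {2.0, 2.0, 45.0}},   // closer obstacle
    };
    double tMin = 2.0;             // anything > 1 means "no hit"
    for (const Box& b : targets) {
        double t = enterT(p0, l, d, b);
        if (t >= 0.0) tMin = std::min(tMin, t);
    }
    if (tMin <= 1.0) std::printf("range = %.1f m\n", d * tMin);   // d * t_min
    else             std::printf("no target within %.1f m\n", d);
    return 0;
}
```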
(4) Infrared camera simulation
The scene capture component of UE4 is used to carry out the infrared camera photographing simulation, with the following specific steps:
S1. Preprocessing: acquire the Actors of each class from the game world using the GetAllActorOfClass() function of the scene capture component, i.e. obtain the attributes of targets such as vehicles, trees and pedestrians; then use the GetComponents() function of the scene capture component to obtain the meshes of each Actor, i.e. each target is divided into multiple parts (for example, a person can be divided into limbs, body and head). By looking up Table 1, each target mesh is assigned an appropriate temperature T (in K) and emissivity ε.
TABLE 1. Temperature and emissivity of common objects

Object    Winter temperature (K)    Summer temperature (K)    Emissivity
Soil              278                       288                  0.914
Grass             273                       293                  0.958
Bush              273                       293                  0.986
Tree              273                       293                  0.952
Human             292                       298                  0.985
Vehicle           273                       293                  0.80
Water             273                       293                  0.96
S2. Infrared camera parameter setting: this includes the thermal-sensing-element parameters, the base (1×) focal length and the image resolution. The base focal length is set to f1, the distances from the center of the sensing element to its vertical and horizontal edges are dx and dy, and the camera intrinsic matrix E is calculated according to the following formula:
E = [[f1, 0, dx], [0, f1, dy], [0, 0, 1]]
The output image resolution is set to 960 × 640 through the Camera class interface provided by UE4. The zoom multiple n of the focal length is converted into field-of-view angles according to the following formulas:
Horizontal field-of-view angle FOVx:
FOVx = 2·arctan(dx / (n·f1))
Vertical field-of-view angle FOVy:
FOVy = 2·arctan(dy / (n·f1))
S3. Post-processing to generate the infrared image: according to the infrared radiation energy of each pixel computed from step S1 and the camera parameters provided in step S2, the infrared radiation energy within the camera field of view is rendered and imaged using the scene capture and canvas rendering components of UE4. The specific steps are as follows:
S3.1. Calculate the radiant energy per unit wavelength according to the following formula:
E(λ) = ε·C1 / (λ^5·(e^(C2/(λ·T)) − 1))
where ε is the emissivity of the object; C1 = 3.74 × 10⁻¹² W·cm²; C2 = 1.44 cm·K; e is the base of the natural logarithm; λ is the wavelength, between 0.1 and 14 μm; T is the object temperature.
S3.2. Calculate the total radiant energy according to the following formula:
EIR = ∫[λ1, λ2] E(λ) dλ
where λ1 = 0.1 μm and λ2 = 14 μm.
S3.3. When heat is transferred to the infrared thermal-sensing device it is affected by Johnson noise, jitter noise and thermal noise; thermal noise and Johnson noise are modeled by white Gaussian noise εw ~ N(0, 1), and jitter noise is modeled by Gaussian noise εf ~ N(0, 1/f).
S3.4. Calculate the total received energy according to the following formula:
Ereceive = EIR + εw + εf
S3.5. Normalize the total energy computed in S3.4 to obtain the infrared image.
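The pipeline of S3.1–S3.5 can be illustrated with the short C++ sketch below, which integrates the spectral energy over 0.1–14 μm for a few (T, ε) pairs from Table 1, adds the two Gaussian noise terms and normalizes the result. The trapezoid integration, the frame rate f, the small noise scaling (kept small so the noise perturbs rather than swamps the signal) and the fixed random seed are illustrative assumptions, not specified by the description.

```cpp
#include <algorithm>
#include <cmath>
#include <cstdio>
#include <random>
#include <vector>

// Sketch of infrared image generation, steps S3.1-S3.5. Constants follow the
// description: C1 = 3.74e-12 W*cm^2, C2 = 1.44 cm*K, so wavelengths are in cm.
static double spectralEnergy(double lambdaCm, double T, double eps) {      // S3.1
    const double C1 = 3.74e-12, C2 = 1.44;
    return eps * C1 / (std::pow(lambdaCm, 5.0) * (std::exp(C2 / (lambdaCm * T)) - 1.0));
}

static double totalEnergy(double T, double eps, int steps = 2000) {        // S3.2
    const double lo = 0.1e-4, hi = 14.0e-4;            // 0.1 um .. 14 um, in cm
    const double h = (hi - lo) / steps;                // trapezoid rule
    double sum = 0.5 * (spectralEnergy(lo, T, eps) + spectralEnergy(hi, T, eps));
    for (int i = 1; i < steps; ++i) sum += spectralEnergy(lo + i * h, T, eps);
    return sum * h;
}

int main() {
    // Tiny 2x2 "scene": per-pixel temperature (K) and emissivity from Table 1.
    struct Pixel { double T, eps; };
    const std::vector<Pixel> scene = { {292.0, 0.985}, {273.0, 0.80},
                                       {273.0, 0.952}, {273.0, 0.96} };
    std::mt19937 rng(42);
    const double f = 30.0;                                // assumed frame rate
    std::normal_distribution<double> epsW(0.0, 1.0);                 // S3.3: N(0, 1)
    std::normal_distribution<double> epsF(0.0, std::sqrt(1.0 / f));  // S3.3: N(0, 1/f)

    std::vector<double> received;
    for (const Pixel& p : scene) {
        const double eIR = totalEnergy(p.T, p.eps);
        // S3.4: E_receive = E_IR + eps_w + eps_f; the noise terms are scaled
        // down here so they stay small relative to eIR (illustrative choice).
        received.push_back(eIR + 1e-4 * eIR * (epsW(rng) + epsF(rng)));
    }
    const double mn = *std::min_element(received.begin(), received.end());
    const double mx = *std::max_element(received.begin(), received.end());
    for (int i = 0; i < static_cast<int>(received.size()); ++i)             // S3.5
        std::printf("pixel %d: gray = %.3f\n", i, (received[i] - mn) / (mx - mn + 1e-12));
    return 0;
}
```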
The above description is only a preferred embodiment of the present invention. It should be noted that those skilled in the art can make several modifications and variations without departing from the technical principle of the present invention, and these modifications and variations should also be regarded as falling within the protection scope of the present invention.

Claims (5)

1. A three-light-pod simulation method based on a UE4 game engine is characterized by comprising the following steps:
(1) Pod control simulation
The pod adopts a three-light coaxial structure, i.e. the pod controls the attitudes of the optical camera, the laser range finder and the infrared camera simultaneously; first, a relative coordinate system O-FRD with the center of the unmanned aerial vehicle as its origin is established in the game engine using a left-handed coordinate system, where F is the nose direction, R is the right-wing direction, and D is the downward direction along the vertical tail; in the initial state the pod and the unmanned aerial vehicle have the same attitude and the pod coordinate system O-xyz is aligned with O-FRD; the clockwise rotation angle of the pod about the z axis (clockwise when viewed along the z-axis direction) is the yaw angle α, the rotation angle of the pod about the y axis is the pitch angle β, and the clockwise rotation angle of the pod about the x axis is the roll angle γ;
In the initial state the pod position is P0, the yaw angle is α0, the pitch angle is β0 and the roll angle is γ0; the simulated pod rotates at an angular velocity of w (°/s) to the target attitude (α1, β1, γ1), which satisfies −180° < α1 < 180°, −90° < β1 < 90°, −180° < γ1 < 180°;
The pod control simulation steps are as follows:
S1. Convert the real-world angular velocity into the simulated-world angular velocity: if 1 s of wall-clock time corresponds to 60 frames of simulation time, the simulated-world pod angular velocity ω per frame is:
ω = w / 60
S2. Rotate the pod attitude in the order yaw angle, pitch angle, roll angle, i.e. first rotate α1 − α0 about the x axis, then rotate β1 − β0 about the y axis, and finally rotate γ1 − γ0 about the z axis;
S3. Calculate the number of frames N required to rotate α1 − α0 about the x axis according to the following formula:
N = ⌈(α1 − α0) / ω⌉
Denote the current frame by n; from frame 1 to frame N − 1, i.e. 1 ≤ n ≤ N − 1, the target yaw angle αn of frame n is:
αn = α0 + nω
Set the pod yaw angle to αn; the angular rotation is carried out by the game engine; when n = N, set the pod yaw angle to α1;
S4. Calculate the number of frames M required to rotate β1 − β0 about the y axis according to the following formula:
M = ⌈(β1 − β0) / ω⌉
From frame n = N + 1 to n = N + M − 1, the target pitch angle βn is:
βn = β0 + (n − N)ω
Set the pod pitch angle to βn; when n = N + M, set the pod pitch angle to β1;
S5. Calculate the number of frames L required to rotate γ1 − γ0 about the z axis according to the following formula:
L = ⌈(γ1 − γ0) / ω⌉
From frame n = N + M + 1 to n = N + M + L − 1, the target roll angle γn is:
γn = γ0 + (n − N − M)ω
Set the pod roll angle to γn; when n = N + M + L, set the pod roll angle to γ1;
(2) Optical camera simulation
Carrying out optical camera shooting simulation by utilizing a graphic rendering component of a game engine;
(3) Laser ranging simulation
Performing laser ranging simulation by using a ray casting and collision query component of the game engine;
(4) Infrared camera simulation
And performing infrared camera photographing simulation by using a scene capture component of the game engine.
2. The UE4 game engine-based three-light pod simulation method of claim 1, wherein the simulation of photographing with an optical camera comprises the following steps:
S1. Scene construction: constructing the scene required for optical imaging in UE4, including terrain, lighting, environment and objects;
S2. Optical camera parameter setting: this includes the photosensitive-element parameters, the base (1×) focal length, the image resolution and the Gamma value; the base focal length is set to f1, the distances from the center of the photosensitive element to its vertical and horizontal edges are dx and dy, and the camera intrinsic matrix E is calculated according to the following formula:
E = [[f1, 0, dx], [0, f1, dy], [0, 0, 1]]
Setting the output image resolution to 960 × 640 and the Gamma value to 3 through the Camera class interface provided by UE4; converting the zoom multiple n of the focal length into field-of-view angles according to the following formulas:
Horizontal field-of-view angle FOVx:
FOVx = 2·arctan(dx / (n·f1))
Vertical field-of-view angle FOVy:
FOVy = 2·arctan(dy / (n·f1))
S3. Rendering and imaging: according to the optical camera parameters provided in step S2, rendering and imaging the light within the camera field of view with the scene capture and canvas rendering components of UE4; first determining the scene capture range from the camera field-of-view angles, then creating a 960 × 640 canvas and updating the canvas resources, and finally generating a two-dimensional BMP picture by rendering;
S4. Format conversion and compression: converting the two-dimensional BMP picture into a picture in another format using the ImageWrapper library of UE4, and setting the compression rate.
3. The UE4 game engine-based three-light pod simulation method of claim 1, wherein the laser ranging simulation is performed using a ray casting and collision query component of UE4, comprising the following steps:
S1. Obtain the coordinates p0 of the laser range finder itself and the laser emission direction vector l from the pod attitude; given that the maximum laser ranging distance is d, the end point of the line segment is p1. Any point p(t) on the segment p0p1 is given by the following parametric equation:
p(t) = p0 + t·d·l, t ∈ [0, 1]
S2. Use the collision query component of UE4 to obtain the earliest collision point, i.e. the collision point closest to p0 along the direction l, which corresponds to the minimum value tmin of t; the distance from the laser range finder to the target, given by the point p(tmin), is then d·tmin.
4. The UE4 game engine-based three-light pod simulation method of claim 1, wherein the scene capture component of the UE4 is used for infrared camera photographing simulation, comprising the following steps:
S1. Preprocessing: acquire the Actors of each class from the game world using the GetAllActorOfClass() function of the scene capture component, i.e. obtain the attributes of the various targets; then acquire the meshes of each Actor using the GetComponents() function of the scene capture component, i.e. divide each target into several parts, and assign to each target mesh an appropriate temperature and emissivity ε;
S2. Infrared camera parameter setting: this includes the thermal-sensing-element parameters, the base (1×) focal length and the image resolution; the base focal length is set to f1, the distances from the center of the sensing element to its vertical and horizontal edges are dx and dy, and the camera intrinsic matrix E is calculated according to the following formula:
E = [[f1, 0, dx], [0, f1, dy], [0, 0, 1]]
Setting the output image resolution to 960 × 640 through the Camera class interface provided by UE4; converting the zoom multiple n of the focal length into field-of-view angles according to the following formulas:
Horizontal field-of-view angle FOVx:
FOVx = 2·arctan(dx / (n·f1))
Vertical field-of-view angle FOVy:
FOVy = 2·arctan(dy / (n·f1))
S3. Post-processing to generate the infrared image: according to the infrared radiation energy of each pixel computed from step S1 and the camera parameters provided in step S2, the infrared radiation energy within the camera field of view is rendered and imaged using the scene capture and canvas rendering components of UE4.
5. The method for simulating the three-light pod based on the UE4 game engine of claim 4, wherein in the step S3 of generating the infrared image by post-processing, the specific steps of rendering and imaging the infrared radiation energy in the camera field of view are as follows:
S3.1. Calculate the radiant energy per unit wavelength according to the following formula:
E(λ) = ε·C1 / (λ^5·(e^(C2/(λ·T)) − 1))
where ε is the emissivity of the object; C1 = 3.74 × 10⁻¹² W·cm²; C2 = 1.44 cm·K; e is the base of the natural logarithm; λ is the wavelength, between 0.1 and 14 μm; T is the object temperature;
S3.2. Calculate the total radiant energy according to the following formula:
EIR = ∫[λ1, λ2] E(λ) dλ
where λ1 = 0.1 μm and λ2 = 14 μm;
S3.3. Model thermal noise and Johnson noise by white Gaussian noise εw ~ N(0, 1), and model jitter noise by Gaussian noise εf ~ N(0, 1/f);
S3.4. Calculate the total received energy according to the following formula:
Ereceive = EIR + εw + εf
S3.5. Normalize the total energy computed in S3.4 to obtain the infrared image.
CN202110523247.1A 2021-05-13 2021-05-13 Three-light pod simulation method based on UE4 game engine Active CN113297668B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110523247.1A CN113297668B (en) 2021-05-13 2021-05-13 Three-light pod simulation method based on UE4 game engine


Publications (2)

Publication Number Publication Date
CN113297668A CN113297668A (en) 2021-08-24
CN113297668B (en) 2022-07-19

Family

ID=77321844

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110523247.1A Active CN113297668B (en) 2021-05-13 2021-05-13 Three-light pod simulation method based on UE4 game engine

Country Status (1)

Country Link
CN (1) CN113297668B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113436307B (en) * 2021-08-27 2021-11-16 速度时空信息科技股份有限公司 Mapping algorithm based on osgEarth image data to UE4 scene
CN114372348B (en) * 2021-12-13 2022-11-15 北京理工大学 Rapid simulation method for missile-borne linear array laser imaging fuse

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN204189339U (en) * 2014-07-30 2015-03-04 中国人民解放军海军航空工程学院青岛校区 A kind of airborne electronic equipment Jamming pod safeguards training system

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CA2293678A1 (en) * 1997-06-13 1998-12-17 Yiftach Tzori Concurrent hardware-software co-simulation
US7768527B2 (en) * 2006-05-31 2010-08-03 Beihang University Hardware-in-the-loop simulation system and method for computer vision
US10105603B2 (en) * 2015-11-13 2018-10-23 Zynga Inc. Automated tuning of computer-implemented games
CN105550458B (en) * 2015-12-25 2019-04-16 天津航天中为数据系统科技有限公司 Unmanned helicopter vibrates the modeling method and device to aerial survey gondola Imaging
US10703508B1 (en) * 2016-08-31 2020-07-07 Amazon Technologies, Inc. Stereoscopic flight simulator with data acquisition
CN110920845B (en) * 2019-11-14 2020-11-10 浙江大学 Full-guide-pipe type two-stage pod propeller with C-shaped guide vanes
CN212332992U (en) * 2019-12-30 2021-01-12 普宙飞行器科技(深圳)有限公司 Multifunctional three-light nacelle and system based on unmanned aerial vehicle carrying and unmanned aerial vehicle


Also Published As

Publication number Publication date
CN113297668A (en) 2021-08-24


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant