CN117368869B - Visualization method, device, equipment and medium for radar three-dimensional power range - Google Patents


Info

Publication number
CN117368869B
CN117368869B (application CN202311657990.1A)
Authority
CN
China
Prior art keywords
radar
vector
model
power range
dimensional power
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202311657990.1A
Other languages
Chinese (zh)
Other versions
CN117368869A (en)
Inventor
王帅
王宇翔
马海波
王亚娜
李晓明
张颖超
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Aerospace Hongtu Information Technology Co Ltd
Original Assignee
Aerospace Hongtu Information Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Aerospace Hongtu Information Technology Co Ltd filed Critical Aerospace Hongtu Information Technology Co Ltd
Priority to CN202311657990.1A priority Critical patent/CN117368869B/en
Publication of CN117368869A publication Critical patent/CN117368869A/en
Application granted granted Critical
Publication of CN117368869B publication Critical patent/CN117368869B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/02Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
    • G01S7/40Means for monitoring or calibrating

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Radar Systems Or Details Thereof (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention provides a method, a device, equipment, and a medium for visualizing a radar three-dimensional power range. The method comprises the following steps: acquiring a radar position and range attributes sent by a server side; generating a radar model based on the radar position, and generating a radar three-dimensional power range corresponding to the radar model based on the range attributes; and displaying the radar three-dimensional power range in a mixed reality scene to realize its visualization. Because the method can display the three-dimensional power range of the radar in a mixed reality scene, it significantly improves the visualization effect of the radar three-dimensional power range.

Description

Visualization method, device, equipment and medium for radar three-dimensional power range
Technical Field
The invention relates to the technical field of visualization, in particular to a method, a device, equipment and a medium for visualizing a radar three-dimensional power range.
Background
A three-dimensional radar power map is a graph showing the range of a radar's detection power. It can display the detection range of the radar in three dimensions to help users better understand the radar's detection capabilities. Three-dimensional radar power maps are commonly used in the military, aviation, and weather fields to help users better understand the detection capabilities and limitations of a radar. Although the three-dimensional radar power map is a visualization tool, its visualization effect still has room for improvement.
Disclosure of Invention
In view of the above, the present invention aims to provide a method, a device, equipment, and a medium for visualizing a three-dimensional power range of a radar, which can display the three-dimensional power range of the radar in a mixed reality scene, thereby significantly improving the visualization effect of the three-dimensional power range of the radar.
In a first aspect, an embodiment of the present invention provides a method for visualizing a radar three-dimensional power range, where the method is applied to a mixed reality platform, and the mixed reality platform is communicatively connected to a server, and includes:
acquiring radar position and range attributes sent by the server side;
generating a radar model based on the radar position, and generating a radar three-dimensional power range corresponding to the radar model based on the range attribute;
and displaying the radar three-dimensional power range in a mixed reality scene to realize the visualization of the radar three-dimensional power range.
In one embodiment, the range attribute includes a shape type of the radar three-dimensional power range, a target angle, and a detection distance, the shape type being a cone type, a rectangular field-of-view type, or a phased array type, and the target angle including a horizontal angle and a pitch angle; the step of generating the radar three-dimensional power range corresponding to the radar model based on the range attribute comprises the following steps:
if the shape type is the cone type, determining a cone included angle based on the radar position and the detection distance, generating a right circular cone model according to the cone included angle, and expressing the radar three-dimensional power range corresponding to the radar model with the right circular cone model;
if the shape type is the rectangular field-of-view type, constructing a first vector corresponding to the forward direction of the radar model; taking the vector perpendicular to the ground at the longitude and latitude of the radar model as an axis, rotating the first vector counterclockwise by the horizontal angle to obtain a second vector and clockwise by the horizontal angle to obtain a third vector; taking a vector perpendicular to the first vector and parallel to the tangent at the current point as an axis, rotating the first vector counterclockwise by the pitch angle to obtain a fourth vector and clockwise by the pitch angle to obtain a fifth vector; generating a quadrangular pyramid model based on the second vector, the third vector, the fourth vector, and the fifth vector, and expressing the radar three-dimensional power range corresponding to the radar model with the quadrangular pyramid model;
if the shape type is the phased array type, constructing a sixth vector whose starting point is the radar position and whose direction points from the earth's center toward the earth's north pole; taking the vector perpendicular to the earth's surface at the current longitude and latitude as an axis, rotating the sixth vector clockwise by the horizontal angle to obtain a seventh vector, and taking the plane of the sixth and seventh vectors as the bottom surface; taking the bisector of the sixth and seventh vectors as an eighth vector, and rotating the eighth vector by the pitch angle about the tangent at the current longitude and latitude perpendicular to the eighth vector to obtain a ninth vector; taking the sixth, seventh, and ninth vectors as edges; and generating a phased array model based on the bottom surface and the edges, and expressing the radar three-dimensional power range corresponding to the radar model with the phased array model.
In one embodiment, the method further comprises:
for any two or more radar models, determining an intersection overlapping area between the radar three-dimensional power ranges corresponding to the radar models;
and displaying the intersection overlapping area in the mixed reality scene according to a preset first target special effect.
In one embodiment, the method further comprises:
receiving celestial body operation data sent by the server side to control celestial body target movement according to the celestial body operation data;
judging whether the celestial object enters one or more radar three-dimensional power ranges corresponding to the radar model in the celestial object moving process;
if so, taking a radar model corresponding to a radar three-dimensional power range to which the current position of the celestial object belongs as a target radar model, and determining that a communication link exists between the celestial object and the target radar model;
and displaying the communication link in the mixed reality scene according to a preset second target special effect.
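The patent does not give the membership test used in the judging step above. For the cone type, deciding whether a celestial object's current position lies inside a radar's three-dimensional power range can be sketched as a point-in-cone predicate; the function name, the axis/half-angle parameterization, and the coordinates below are illustrative assumptions, not from the patent:

```python
import math

def in_cone_range(apex, axis, half_angle_deg, max_range, point):
    """Return True if `point` lies inside a cone-type power range.

    apex: (x, y, z) radar position; axis: unit boresight vector;
    half_angle_deg: cone half-angle; max_range: detection distance.
    All names are illustrative, not taken from the patent text.
    """
    v = tuple(p - a for p, a in zip(point, apex))
    dist = math.sqrt(sum(c * c for c in v))
    if dist == 0:
        return True  # the apex itself counts as inside
    if dist > max_range:
        return False  # beyond the detection distance
    # Inside iff the angle from the boresight is within the half-angle.
    cos_angle = sum(c * a for c, a in zip(v, axis)) / dist
    return cos_angle >= math.cos(math.radians(half_angle_deg))
```

When this predicate becomes true during the object's movement, the corresponding radar model would be taken as the target radar model and the communication-link special effect displayed.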
In one embodiment, the method further comprises:
receiving celestial body communication data sent by the server side, and controlling communication between the celestial body target and the target radar model according to the celestial body communication data;
when a communication event occurs between the celestial object and the target radar model, displaying the communication effect corresponding to the communication event in the mixed reality scene according to a preset third target special effect.
In one embodiment, the method further comprises:
capturing gesture information of a user;
and based on the gesture information, performing interactive operation on the radar three-dimensional power range displayed in the mixed reality scene.
In one embodiment, the method further comprises:
determining a relative positional relationship between one or more of the radar models and a pre-constructed earth surface model to display the radar model on the earth surface model in accordance with the relative positional relationship;
and receiving a radar observation angle sent by the server side, and rotating the radar model displayed on the earth surface model according to the radar observation angle.
In a second aspect, an embodiment of the present invention further provides a device for visualizing a three-dimensional power range of a radar, where the device is applied to a mixed reality platform, and the mixed reality platform is communicatively connected to a server, and includes:
the data acquisition module is used for acquiring radar position and range attributes sent by the server side;
the range generation module is used for generating a radar model based on the radar position and generating a radar three-dimensional power range corresponding to the radar model based on the range attribute;
and the range visualization module is used for displaying the radar three-dimensional power range in a mixed reality scene, so as to realize the visualization of the radar three-dimensional power range.
In a third aspect, an embodiment of the present invention further provides an electronic device comprising a processor and a memory storing computer-executable instructions executable by the processor to implement the method of any one of the first aspects.
In a fourth aspect, embodiments of the present invention also provide a computer-readable storage medium storing computer-executable instructions which, when invoked and executed by a processor, cause the processor to implement the method of any one of the first aspects.
The method, the device, the equipment and the medium for visualizing the radar three-dimensional power range are applied to a mixed reality platform, the mixed reality platform is in communication connection with a server, and the radar position and the range attribute sent by the server are firstly obtained; then, a radar model is generated based on the radar position, and a radar three-dimensional power range corresponding to the radar model is generated based on the range attribute; and finally, displaying the three-dimensional power range of the radar in a mixed reality scene so as to realize the visualization of the three-dimensional power range of the radar. According to the method, the radar model and the corresponding radar three-dimensional power range are generated based on the radar position and the range attribute through the mixed reality platform and displayed in the mixed reality scene, so that the visual effect of the radar three-dimensional power range is remarkably improved.
Additional features and advantages of the invention will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by practice of the invention. The objectives and other advantages of the invention will be realized and attained by the structure particularly pointed out in the written description and claims hereof as well as the appended drawings.
In order to make the above objects, features and advantages of the present invention more comprehensible, preferred embodiments accompanied with figures are described in detail below.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings that are needed in the description of the embodiments or the prior art will be briefly described, and it is obvious that the drawings in the description below are some embodiments of the present invention, and other drawings can be obtained according to the drawings without inventive effort for a person skilled in the art.
Fig. 1 is a schematic flow chart of a method for visualizing a radar three-dimensional power range according to an embodiment of the present invention;
fig. 2 is a diagram showing an effect of data transceiving between a target radar model and a celestial object according to an embodiment of the present invention;
Fig. 3 is a schematic structural diagram of a radar three-dimensional power range visualization device according to an embodiment of the present invention;
fig. 4 is a schematic structural diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the embodiments of the present invention more apparent, the technical solutions of the present invention will be clearly and completely described in conjunction with the embodiments, and it is apparent that the described embodiments are some embodiments of the present invention, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
Currently, the three-dimensional radar power map is used as a visualization tool, and its visualization effect still has room for improvement. On this basis, the embodiments of the present invention provide a method, a device, equipment, and a medium for visualizing the three-dimensional power range of a radar, which can display the three-dimensional power range of the radar in a mixed reality scene, thereby significantly improving the visualization effect of the radar three-dimensional power range.
The embodiment of the invention firstly introduces related concepts:
(1) Unlike two-dimensional drawing of a radar range, a three-dimensional radar can measure three coordinates of a target: horizontal direction, vertical direction, and distance. Three-dimensional radars typically scan with a pencil beam that is steered electronically in the vertical direction and moved mechanically in the horizontal direction.
Mixed Reality (MR) technology is an emerging direction in visualization technology. MR is effectively an upgrade of AR (Augmented Reality): it superimposes computer-generated virtual information (objects, pictures, videos, sounds, system prompts, and so on) onto a real scene and lets people interact with it. The virtual world and the real world are combined into a seamless virtual-real fusion in which physical entities and digital objects obey true three-dimensional projection relationships, so that people can move freely in three-dimensional space, observe from various angles, and interact in real time with "entities" in the three-dimensional virtual space, realizing true three-dimensional display and real-time interweaving.
MR technology uses space scanning, three-dimensional reconstruction, and MR spatial anchoring to assist perception, breaking through the limitations of vision alone. It mixes the real space with a displayed three-dimensional virtual scene, presents a real three-dimensional interactive scene through simulated display of various spatial entity information, drives virtual entities with real-world data, and builds an immersive virtual-real three-dimensional simulation system, providing commanders at all levels with a realistic battlefield space situation analysis and training environment.
Specifically, MR systems generally adopt either optical see-through technology, in which a virtual image is superimposed in front of the user's eyes, or video see-through technology, in which the "real" world seen by the eyes is captured and digitized in real time by a binocular camera and the picture is then re-rendered in real time by computer algorithms. Virtual images can be partially or completely superimposed, and the system can even break free of the constraints of the real picture, deleting and altering image content and rendering a new "real picture" by computer. For depth sensing, a three-color laser in the depth camera emits three beams that scan the real space; the beams are converged by a relay lens and directed onto two micro-mirrors, a fast mirror controlling horizontal scanning and a slow mirror controlling vertical movement. This yields spatial depth data, from which the space is redrawn and the application scene in the system is placed at the corresponding position in the real-space model.
MR mixes the real world and the virtual world together to create a new visual environment. Its key word is redrawing: the real world is redrawn and virtual information is superimposed on it again, and that virtual information can interact with the redrawn real world, allowing partial retention of, and free switching between, the virtual and real worlds and making the environment richer. The differences among VR, AR, and MR are shown in Table 1:
TABLE 1
Unlike a computer, which is operated with a mouse and keyboard, MR mixed reality technology achieves interaction mainly by capturing and recognizing the motion of the human hand. Devices currently on the market include HoloLens 2, Magic Leap One, Nreal, and Action One, among which HoloLens 2 is the pioneer of mixed reality equipment and launched the whole industry: its realization of optical-waveguide imaging and structured-light SLAM became a bellwether for the industry, and its third-person viewing angle and experimental work on holographic UIs brought valuable assets to the whole mixed reality field. HoloLens 2 has a unique design: four external depth cameras recognize hand motions by depth, capture the motions, and match them against preset gestures, with different gestures launching different functions. These characteristics make its interaction modes richer and more varied, and it is widely appreciated by users.
(2) Unity Shader is a technique for rendering graphics that lets users customize how the graphics card renders a picture to achieve a desired effect. In Unity, shaders are divided into three types: Surface Shaders, Vertex/Fragment Shaders, and Fixed Function Shaders. Fixed Function Shaders are deprecated and no longer used. A Surface Shader is a further layer of packaging that Unity adds over Vertex/Fragment Shaders, making shader authoring better match human thinking while accomplishing, with little code, the work required under different lighting models and on different platforms. ShaderLab is Unity's specific language for writing shaders, and Shader Graph is a tool for creating shaders without writing code. Writing a shader requires a shader programming language; currently there are three main ones: the OpenGL Shading Language (GLSL) based on OpenGL, the High Level Shading Language (HLSL) based on DirectX, and NVIDIA's C for Graphics (Cg). In Unity, the official recommendation is to write shaders using Cg/HLSL.
Specifically, a shader is a technique for graphics rendering. A shader is a program running on the GPU that handles lighting, materials, dynamic weather and sky effects, water-surface reflection and ripple effects, and so on. A shader can perform accurate real-time lighting calculations and process large amounts of three-dimensional data. Shaders typically run on the GPU because, compared with traditional CPU hardware, GPU hardware is better suited to processing large amounts of data. There are many shader languages, and software or hardware developers may define their own. Different platforms or operating systems support different graphics program interfaces, such as Microsoft's DirectX graphics interface and HLSL shader language, which are used predominantly on Windows. The ShaderLab of the Unity engine can be seen as an extension of these graphics shaders: in addition to supporting the native graphics API languages, it adds a number of custom shader-language features.
For ease of understanding the present embodiment, a method for visualizing the three-dimensional power range of a radar disclosed in an embodiment of the present invention is first described in detail. The method is applied to a Mixed Reality (MR) platform, and the mixed reality platform is communicatively connected to a server. Referring to the schematic flow chart of the visualization method shown in fig. 1, the method mainly includes steps S102 to S106:
Step S102, radar position and range attribute sent by a server side are obtained.
The radar position can be longitude and latitude data of the radar, the range attribute comprises a shape type, a target angle and a detection distance of a radar three-dimensional power range, the shape type comprises a cone type, a rectangular view field type or a phased array type, and the target angle comprises a horizontal angle and a pitching angle.
In one embodiment, the server side sends radar position and range attributes carrying radar identifications to the MR platform, and the MR platform generates a corresponding radar model and a corresponding radar three-dimensional power range according to the radar identifications based on the radar position and range attributes.
Step S104, a radar model is generated based on the radar position, and a radar three-dimensional power range corresponding to the radar model is generated based on the range attribute.
In one embodiment, the radar position may embody a relative positional relationship between the radar model and the earth model; a corresponding radar model is generated at that position by a shader editor, and the radar three-dimensional power range corresponding to the radar model is generated from the target angle and the detection distance according to the algorithm for the shape type to which the radar three-dimensional power range belongs.
Step S106, displaying the radar three-dimensional power range in the mixed reality scene to realize the visualization of the radar three-dimensional power range.
According to the method for visualizing the radar three-dimensional power range, provided by the embodiment of the invention, the radar model and the radar three-dimensional power range corresponding to the radar model are generated by the mixed reality platform based on the radar position and the range attribute and are displayed in the mixed reality scene, so that the visualization effect of the radar three-dimensional power range is remarkably improved.
The visualization method for the radar three-dimensional power range provided by the embodiment of the invention, applied in the MR mixed reality field, makes the ability to analyze and understand data significantly more prominent. Its advantages fall roughly into four categories:
(1) Environment perception: with a three-dimensional radar power map, the surrounding environment, including obstacles, terrain, and other objects, can be better perceived. This is very important for MR application development, as it helps developers better understand the environment the user is in.
(2) Spatial positioning: three-dimensional radar power maps can help MR applications locate the user's position in space more accurately. This is very important for applications that require accurate positioning, such as indoor navigation or AR games.
(3) Object identification: with a three-dimensional radar power map, objects can be better identified and tracked. This is very important for applications that need to identify and track objects, such as AR games or industrial automation.
(4) Safety monitoring: three-dimensional radar power maps can help monitor and detect potential safety issues, such as defects in building structures or malfunctions of machinery. This is important for industrial automation and the construction industry.
Therefore, in the mixed reality field, it is imperative to develop a flexible, attractive, and extensible set of visualization tools that can clearly and accurately display the radar power range and its various parameters while remaining easy to use.
Based on the above, the embodiment of the invention provides a specific implementation mode of a visualization method of a radar three-dimensional power range:
further, parameters transmitted from the server side to the MR platform are generally divided into: (1) an initial angle comprising: azimuth angle: the radar rotates in the horizontal direction by an angle of 0 degrees with a parallel vector of a vector from the center of the earth to the north pole of the earth under the condition of being perpendicular to the earth surface; pitch angle: the radar rotates at an angle of 0 degrees in the vertical direction with the parallel vector of the vector from the center of the earth to the north pole of the earth in the case of being perpendicular to the earth's surface. (2) initial position: longitude and latitude of the earth. (3) horizontal angle: the angle of the three-dimensional power range cone grid model in the horizontal direction. (4) pitch angle: angle of the three-dimensional power range cone grid model in the vertical direction. (5) detection distance: radius of three-dimensional power range cone grid model.
Further, the embodiment of the invention provides a specific implementation of generating the radar three-dimensional power range corresponding to the radar model based on the range attributes. The MR radar three-dimensional power range drawing function dynamically adjusts and controls the shape, color, angle, and so on of the radar power range according to the parameters transmitted by the server, then draws with a customized shader to generate the corresponding mesh model, and finally displays it in the MR mixed reality scene. Specifically, for the different shape types, the corresponding radar three-dimensional power range can be generated as in modes one to three below:
In the first mode, if the shape type is the cone type, a cone included angle is determined based on the radar position and the detection distance, a right circular cone model is generated according to the cone included angle, and the radar three-dimensional power range corresponding to the radar model is expressed with the right circular cone model.
Specifically, for a simple cone, the three-dimensional power range resembles a standard right circular cone. In detail, the radar is taken as the cone apex, and the specific shape of the three-dimensional power range is controlled mainly by the included angle between the line AB, from the radar position (the cone apex) to the edge of the base, and the line AC, from the radar straight up to the point at the maximum detection distance.
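Under the assumption that the cone axis points straight up from the radar (the AC direction above) and the detection distance is the slant length, the base circle of such a cone mesh could be sketched as follows; the function name, axis choice, and segment count are illustrative assumptions:

```python
import math

def cone_base_ring(apex, half_angle_deg, detection_distance, segments=36):
    """Vertices of the base circle of a right circular cone whose apex is the
    radar and whose slant length (apex to base edge) is the detection distance.

    The axis is assumed to point straight up (+z); names are illustrative."""
    theta = math.radians(half_angle_deg)          # angle between AC and AB
    height = detection_distance * math.cos(theta)  # length along the axis AC
    radius = detection_distance * math.sin(theta)  # base-circle radius
    ax, ay, az = apex
    ring = []
    for i in range(segments):
        phi = 2 * math.pi * i / segments
        ring.append((ax + radius * math.cos(phi),
                     ay + radius * math.sin(phi),
                     az + height))
    return ring
```

Connecting the apex to consecutive ring vertices would give the triangle fan a mesh model needs; every base vertex sits exactly at the detection distance from the apex.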
In the second mode, if the shape type is the rectangular field-of-view type, a first vector OX corresponding to the forward direction of the radar model is constructed; taking the vector perpendicular to the ground at the radar model's longitude and latitude as an axis, OX is rotated counterclockwise by the horizontal angle to obtain a second vector OA and clockwise by the horizontal angle to obtain a third vector OB; taking a vector perpendicular to OX and parallel to the tangent at the current point as an axis, OX is rotated counterclockwise by the pitch angle to obtain a fourth vector OC and clockwise by the pitch angle to obtain a fifth vector OD; a quadrangular pyramid model is generated based on the second, third, fourth, and fifth vectors, and the radar three-dimensional power range corresponding to the radar model is expressed with the quadrangular pyramid model.
Specifically, for a rectangular field of view, the three-dimensional power range is formed by a quadrangular pyramid and a spherical surface. In detail, the forward direction of the radar is the vector OX. Taking the vector perpendicular to the ground at the radar's longitude and latitude as an axis, OX is rotated counterclockwise by a specified angle (the horizontal angle in the parameters) to obtain the vector OA, and clockwise by the same angle to obtain the vector OB. Taking a vector perpendicular to OX and parallel to the tangent at the current point as an axis, OX is rotated counterclockwise (upward) by a specified angle (the pitch angle in the parameters) to obtain the vector OC, and clockwise (downward) by the same angle to obtain the vector OD. Then the points A, B, C, D where OA, OB, OC, OD meet the sphere centered at the radar (the cone apex O) with the detection distance as radius are obtained, yielding a quadrangular pyramid with OA, OB, OC, OD as edges; the mesh model (i.e. the model of the radar three-dimensional power range) is drawn with the spherical patch ABCD as its base.
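Each of the four edge vectors above is a rotation of OX about an axis, which can be sketched with Rodrigues' rotation formula. The local frame (forward, up, tangent) and the example angles below are illustrative assumptions, not values from the patent:

```python
import math

def rotate(v, axis, angle_deg):
    """Rodrigues' rotation of vector v about a unit axis, counterclockwise
    when looking down the axis toward the origin (right-handed)."""
    t = math.radians(angle_deg)
    c, s = math.cos(t), math.sin(t)
    ax, ay, az = axis
    x, y, z = v
    dot = ax * x + ay * y + az * z
    cx, cy, cz = ay * z - az * y, az * x - ax * z, ax * y - ay * x
    return (x * c + cx * s + ax * dot * (1 - c),
            y * c + cy * s + ay * dot * (1 - c),
            z * c + cz * s + az * dot * (1 - c))

# Illustrative local frame (all assumptions):
OX = (1.0, 0.0, 0.0)    # radar forward direction
up = (0.0, 0.0, 1.0)    # perpendicular to the ground at the radar's lat/lon
side = (0.0, 1.0, 0.0)  # perpendicular to OX, parallel to the local tangent
h, p = 30.0, 15.0       # horizontal and pitch angles from the parameters

OA = rotate(OX, up, h)      # counterclockwise by the horizontal angle
OB = rotate(OX, up, -h)     # clockwise by the horizontal angle
OC = rotate(OX, side, -p)   # tilted upward by the pitch angle in this frame
OD = rotate(OX, side, p)    # tilted downward by the pitch angle
```

Scaling each unit edge vector by the detection distance would place the points A, B, C, D on the bounding sphere described above.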
In the third mode, if the shape type is the phased array type, a sixth vector OA' is constructed whose starting point is the radar position and whose direction points from the earth's center toward the earth's north pole; taking the vector perpendicular to the earth's surface at the current longitude and latitude as an axis, OA' is rotated clockwise by the horizontal angle to obtain a seventh vector OB', and the plane of OA' and OB' is taken as the bottom surface; the bisector of OA' and OB' is taken as an eighth vector OC', and OC' is rotated by the pitch angle about the tangent at the current longitude and latitude perpendicular to OC' to obtain a ninth vector OD'; OA', OB', and OD' are taken as edges; and a phased array model is generated based on the bottom surface and the edges, expressing the radar three-dimensional power range corresponding to the radar model.
Specifically, for a phased array, the three-dimensional power range is formed by a triangular pyramid and a spherical triangle. The detailed process is as follows: take the radar position as the pyramid apex O, and let vector OA' start at the radar position and point from the earth's center toward the north pole. Taking the vector perpendicular to the earth's surface at the current longitude and latitude as the axis, rotate OA' clockwise by a specified angle (the horizontal angle in the parameters) to obtain vector OB'; the plane containing OA' and OB' is the bottom surface of the three-dimensional power range. The bisector of the angle between OA' and OB' is vector OC'; taking the tangent at the current longitude and latitude perpendicular to OC' as the axis, rotate OC' by a specified angle (the pitch angle in the parameters) to obtain vector OD'. Vectors OA', OB' and OD' are the three edges of the triangular pyramid. The spherical triangle takes point O as the sphere center and |OA'| as the radius; the region enclosed by A', B' and D' on the sphere is the spherical triangle.
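The phased-array edge construction can be sketched the same way, with Rodrigues' rotation. The local frame (north along y, surface normal along z) and the example angles below are illustrative assumptions:

```python
import numpy as np

def rotate(v, axis, angle_deg):
    """Rodrigues' rotation of v about the unit axis by angle_deg degrees."""
    axis = axis / np.linalg.norm(axis)
    t = np.radians(angle_deg)
    return (v * np.cos(t)
            + np.cross(axis, v) * np.sin(t)
            + axis * np.dot(axis, v) * (1.0 - np.cos(t)))

# Assumed local frame at the radar (apex O): OA_ points toward the north pole,
# up is perpendicular to the earth's surface at the radar's longitude/latitude.
OA_ = np.array([0.0, 1.0, 0.0])
up = np.array([0.0, 0.0, 1.0])
horizontal_angle, pitch_angle = 40.0, 25.0   # example parameter values

OB_ = rotate(OA_, up, -horizontal_angle)       # clockwise about the local vertical
OC_ = (OA_ + OB_) / np.linalg.norm(OA_ + OB_)  # bisector of the angle A'OB'
side = np.cross(up, OC_)                       # local tangent perpendicular to OC'
OD_ = rotate(OC_, side, pitch_angle)           # tilt the bisector by the pitch angle
# OA_, OB_, OD_ are the three edges; the plane of OA_ and OB_ is the bottom surface.
```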
Furthermore, the embodiment of the invention can also realize the highlighting of overlapping areas of a plurality of radar three-dimensional power ranges. In one embodiment, for any two or more radar models, determining an intersection overlapping area between radar three-dimensional power ranges corresponding to the radar models, and then displaying the intersection overlapping area in a mixed reality scene according to a preset first target special effect.
In a specific implementation, when multiple radar models exist in a mixed reality scene, requirements naturally arise such as representing the total range of the radars, representing the intersection overlapping area of two or more radars, displaying the overlapping area on its own, and highlighting the overlapping intersection area. This involves computing multiple radar envelopes, and the first target special effect, made with a shader, is displayed in the program.
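One way to approximate the intersection overlapping area is a containment test applied to sampled points. The patent does not specify the envelope-intersection algorithm, so the cone-shaped membership test and the radar parameters below are illustrative assumptions:

```python
import numpy as np

def in_cone(p, apex, boresight, half_angle_deg, max_range):
    """True if point p lies inside a cone-shaped power range."""
    d = p - apex
    r = np.linalg.norm(d)
    if r == 0 or r > max_range:
        return False
    cosang = np.dot(d / r, boresight / np.linalg.norm(boresight))
    return cosang >= np.cos(np.radians(half_angle_deg))

# Hypothetical radars facing each other; sample points to approximate their overlap.
r1 = (np.array([0.0, 0.0, 0.0]), np.array([1.0, 0.0, 0.0]), 30.0, 50.0)
r2 = (np.array([40.0, 0.0, 0.0]), np.array([-1.0, 0.0, 0.0]), 30.0, 50.0)
pts = np.random.default_rng(0).uniform(-10.0, 50.0, size=(5000, 3))
overlap = [p for p in pts if in_cone(p, *r1) and in_cone(p, *r2)]
```

The sampled `overlap` points could then be meshed or passed to the shader-based highlight effect.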
Furthermore, the embodiment of the invention can also realize the detection of the celestial object entering the radar three-dimensional power range. In one embodiment, see steps 1 to 4 below:
Step 1, receiving the celestial body operation data sent by the server side, so as to control the movement of the celestial object according to the celestial body operation data. The celestial body operation data are used for reflecting the operation track (orbit) of the celestial object, and the like.
Step 2, judging whether the celestial object enters the radar three-dimensional power range corresponding to one or more radar models during the movement of the celestial object. In one example, given the radar three-dimensional power range corresponding to a radar model and the real-time longitude and latitude coordinates of the celestial object during its movement, whether the celestial object has entered the range can be judged by checking whether the real-time longitude and latitude coordinates lie within the radar three-dimensional power range.
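The containment judgment from real-time longitude and latitude can be sketched as a distance test after converting both positions to earth-centered coordinates. The WGS-84 conversion below is standard; the distance-only criterion is an illustrative simplification of the full power-range test:

```python
import math

def geodetic_to_ecef(lat_deg, lon_deg, alt_m, a=6378137.0, f=1 / 298.257223563):
    """WGS-84 geodetic coordinates to earth-centered cartesian coordinates (metres)."""
    lat, lon = math.radians(lat_deg), math.radians(lon_deg)
    e2 = f * (2 - f)
    n = a / math.sqrt(1 - e2 * math.sin(lat) ** 2)
    x = (n + alt_m) * math.cos(lat) * math.cos(lon)
    y = (n + alt_m) * math.cos(lat) * math.sin(lon)
    z = (n * (1 - e2) + alt_m) * math.sin(lat)
    return x, y, z

def within_detection_range(radar_llh, target_llh, max_range_m):
    """Coarse containment test: is the target within the radar's detection distance?"""
    radar = geodetic_to_ecef(*radar_llh)
    target = geodetic_to_ecef(*target_llh)
    return math.dist(radar, target) <= max_range_m
```

A full implementation would additionally check the angular conditions of the cone, rectangular field-of-view or phased-array shape.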
Step 3, if so, taking the radar model corresponding to the radar three-dimensional power range in which the current position of the celestial object lies as the target radar model, and determining that a communication link exists between the celestial object and the target radar model.
Step 4, displaying the communication link in the mixed reality scene according to the preset second target special effect.
For example, referring to the display effect diagram of data transceiving between the target radar model and the celestial object shown in fig. 2, when the celestial object enters the radar-identifiable region (i.e., the radar three-dimensional power range), the embodiment of the present invention transmits and receives signals according to the data, and this process can be simulated with a second target special effect made with a shader.
Furthermore, the embodiment of the invention can also enable the radar three-dimensional power range to react to the entering celestial object. In one embodiment, celestial communication data sent by the server may be received, so as to control communication between the celestial object and the target radar model according to the celestial communication data, and when a communication event occurs between the celestial object and the target radar model, display a communication effect corresponding to the communication event in a mixed reality scene according to a preset third target special effect.
In practical application, when a celestial object travels to a certain stage of its orbit and enters the three-dimensional power range of a radar, it carries out real-time data communication with the radar, which is represented by a third target special effect made with a shader. When multiple radars exist, their three-dimensional power ranges can overlap and merge, and the same celestial body can communicate with multiple radars simultaneously.
The running of celestial bodies on their orbits is driven by program time, and when communication starts, the communication object is designated by the server data. Timing begins when the celestial body starts running on its orbit; when the time reaches a range in the data where communication is required, the third target special effect made with a shader is triggered to display the communication effect.
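The timing-driven trigger can be sketched as a lookup of server-specified communication windows against elapsed program time. The `CommWindow` structure and its field names are assumptions for illustration, not the patent's data format:

```python
from dataclasses import dataclass

@dataclass
class CommWindow:
    """A server-specified window during which the celestial body talks to one radar."""
    radar_id: str
    start_s: float   # elapsed mission time when the link opens
    end_s: float     # elapsed mission time when the link closes

def active_links(windows, elapsed_s):
    """Radars whose communication special effect should currently be displayed."""
    return [w.radar_id for w in windows if w.start_s <= elapsed_s <= w.end_s]

# Example: the same celestial body communicating with two radars in overlapping windows.
windows = [CommWindow("radar-1", 10.0, 40.0), CommWindow("radar-2", 30.0, 60.0)]
```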
Furthermore, the embodiment of the invention can also realize three-dimensional interaction of the radar three-dimensional power range model in the holographic space. In one embodiment, gesture information of a user may be captured, and based on the gesture information, an interactive operation may be performed on a radar three-dimensional power range displayed within the mixed reality scene.
In a specific implementation, the interaction and control module captures and analyzes various gesture operations and effectively controls the interaction with the related grid models. The main interaction types are: (1) UI interaction: creating a holographic 2D UI or 3D UI and operating the UI with gestures, ultimately operating the grid model. (2) Object interaction: creating a holographic object and interacting with the grid model through gesture operations. (3) Gesture interaction: the physical volume of the hand is simulated to realize rigid-body collision, and joint mapping enables the hand to interact with the holographic object.
Furthermore, the embodiment of the invention can also display the radar model on the spherical surface. In one embodiment, the relative positional relationship between one or more radar models and a pre-constructed earth surface model may be determined, so as to display the radar models on the earth surface model according to that relationship; a radar observation angle sent by the server side is then received, so as to rotate the radar model displayed on the earth surface model according to the radar observation angle.
In a specific implementation, the generation of radar stations is an important link in the scene generation process. When a radar is generated on the earth's surface, it must first be assigned an initial angle so that its data stay consistent with the server. In the program, the embodiment of the invention takes the direction of the earth's north pole as the initial 0 degrees of radar orientation, and then rotates the radar station in the scene according to the radar observation angle of the specified radar (identified by number) received from the server.
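Treating the north pole direction as 0° and rotating by the received observation angle amounts to a heading-to-direction conversion in the local east-north plane. This sketch assumes a clockwise-positive angle convention, which the patent does not state explicitly:

```python
import math

def heading_vector(observation_angle_deg):
    """Unit direction in the local east-north plane; 0 deg = due north, clockwise positive."""
    t = math.radians(observation_angle_deg)
    east, north = math.sin(t), math.cos(t)
    return east, north

# Example: a radar station received with observation angle 90 deg faces due east.
east, north = heading_vector(90.0)
```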
In summary, the embodiments of the present invention can achieve the following objectives:
(I) Holographic representation of three-dimensional radar power in a confined space:
Three-dimensional radar power has found wide application in the space field. It can be used for detecting, tracking and identifying space targets and for monitoring and analyzing space debris. In addition, the three-dimensional radar power range can be used in fields such as satellite communication, navigation and remote sensing. In satellite communication, three-dimensional radar power can help satellites position and track ground stations, thereby improving communication quality. In navigation, the three-dimensional radar power range can be used for target identification and tracking in satellite navigation systems. In remote sensing, it can be used for three-dimensional imaging and topographic measurement of the earth's surface.
Three-dimensional simulation technology can faithfully reproduce environments and scenes, giving people an immersive feeling, and provides intuitive animation, graphics and other display effects for space simulation display and command decision-making; accurate and clear display of the radar power range plays a vital role in simulating and deducing the real environment. The method breaks through the previous means of presenting three-dimensional visualization on two-dimensional display equipment such as televisions and projectors; it adopts MR technology and uses a HoloLens 2 device to construct a holographic space form, truly realizing three-dimensional display, in which one can walk through the space environment, operate freely, and observe and analyze complicated conditions in space from different directions.
(II) Three-dimensional interaction with the three-dimensional radar power range in holographic space:
Interaction with various celestial bodies in space is realized in the holographic space, including two-dimensional and three-dimensional interaction, such as switching between scenes and operations on virtual models such as dragging, zooming in and out, and rotation, all performed with gestures.
On the basis of the foregoing embodiment, the embodiment of the present invention further provides a device for visualizing a radar three-dimensional power range, where the device is applied to a mixed reality platform, and the mixed reality platform is connected with a server in a communication manner, and referring to a schematic structural diagram of the device for visualizing a radar three-dimensional power range shown in fig. 3, the device mainly includes the following parts:
The data acquisition module 302 is configured to acquire a radar position and a range attribute sent by the server side;
the range generation module 304 is configured to generate a radar model based on the radar position, and generate a radar three-dimensional power range corresponding to the radar model based on the range attribute;
the range visualization module 306 is configured to display the radar three-dimensional power range in a mixed reality scene, so as to realize the visualization of the radar three-dimensional power range.
According to the device for visualizing the radar three-dimensional power range provided by the embodiment of the invention, the mixed reality platform generates the radar model and the radar three-dimensional power range corresponding to the radar model based on the radar position and the range attribute, and displays them in the mixed reality scene, so that the visualization effect of the radar three-dimensional power range is remarkably improved.
In one embodiment, the range attribute includes a shape type of the radar three-dimensional power range, a target angle, a detection distance, the shape type includes a cone type, a rectangular field of view type, or a phased array type, and the target angle includes a horizontal angle and a pitch angle; the range generation module 304 is further configured to:
if the shape type is a cone type, determining a cone included angle based on the radar position and the detection distance, generating a regular cone model according to the cone included angle, and expressing a radar three-dimensional power range corresponding to the radar model by using the regular cone model;
If the shape type is a rectangular view field type, constructing a first vector corresponding to the forward direction of the radar model, taking the vector perpendicular to the ground at the longitude and latitude of the radar model as an axis, rotating the first vector anticlockwise by the horizontal angle to obtain a second vector, and rotating the first vector clockwise by the horizontal angle to obtain a third vector; taking a vector perpendicular to the first vector and parallel to the tangent line of the current point as an axis, rotating the first vector anticlockwise by the pitching angle to obtain a fourth vector, and rotating it clockwise by the pitching angle to obtain a fifth vector; generating a quadrangular pyramid model based on the second vector, the third vector, the fourth vector and the fifth vector, and expressing the radar three-dimensional power range corresponding to the radar model by the quadrangular pyramid model;
if the shape type is a phased array type, constructing a sixth vector, wherein the starting point of the sixth vector is the radar position, and the direction is from the sphere center of the earth toward the north pole of the earth; rotating the sixth vector clockwise by the horizontal angle, taking the vector perpendicular to the earth's surface at the current longitude and latitude coordinates as an axis, to obtain a seventh vector, and taking the plane of the sixth vector and the seventh vector as the bottom surface; taking the central line of the sixth vector and the seventh vector as an eighth vector, and rotating the eighth vector by the pitching angle, taking the tangent at the current longitude and latitude perpendicular to the eighth vector as an axis, to obtain a ninth vector; taking the sixth vector, the seventh vector and the ninth vector as edges; and generating a phased array model based on the bottom surface and the edges, and expressing the radar three-dimensional power range corresponding to the radar model by the phased array model.
In one embodiment, the method further comprises an overlapping region display module for:
for any two or more radar models, determining an intersection overlapping area between radar three-dimensional power ranges corresponding to the radar models;
and displaying the intersection overlapping area in the mixed reality scene according to the preset first target special effect.
In one embodiment, the method further comprises a communication link presentation module for:
receiving celestial body operation data sent by a server side to control celestial body target movement according to celestial body operation data;
judging whether a celestial object enters a radar three-dimensional power range corresponding to one or more radar models in the celestial object moving process;
if so, taking a radar model corresponding to the radar three-dimensional power range to which the current position of the celestial object belongs as a target radar model, and determining that a communication link exists between the celestial object and the target radar model;
and displaying the communication link in the mixed reality scene according to the preset second target special effect.
In one embodiment, the system further comprises a communication effect display module, configured to:
receiving celestial body communication data sent by a server side, and controlling communication between a celestial body target and a target radar model according to the celestial body communication data;
When a communication event occurs between the celestial object and the target radar model, displaying the communication effect corresponding to the communication event in the mixed reality scene according to a preset third target special effect.
In one embodiment, the method further comprises an interaction module for:
capturing gesture information of a user;
and based on gesture information, performing interactive operation on the radar three-dimensional power range displayed in the mixed reality scene.
In one embodiment, the system further comprises a radar station display module for:
determining a relative positional relationship between one or more radar models and a pre-constructed earth surface model to display the radar model on the earth surface model in accordance with the relative positional relationship;
and receiving a radar observation angle sent by the server side, so as to rotate the radar model displayed on the earth surface model according to the radar observation angle.
The device provided by the embodiment of the present invention has the same implementation principle and technical effects as those of the foregoing method embodiment, and for the sake of brevity, reference may be made to the corresponding content in the foregoing method embodiment where the device embodiment is not mentioned.
The embodiment of the invention provides electronic equipment, which comprises a processor and a storage device; the storage means has stored thereon a computer program which, when executed by the processor, performs the method of any of the embodiments described above.
Fig. 4 is a schematic structural diagram of an electronic device according to an embodiment of the present invention, where the electronic device 100 includes: a processor 40, a memory 41, a bus 42 and a communication interface 43, the processor 40, the communication interface 43 and the memory 41 being connected by the bus 42; the processor 40 is arranged to execute executable modules, such as computer programs, stored in the memory 41.
The memory 41 may include a high-speed random access memory (RAM, Random Access Memory), and may further include a non-volatile memory, such as at least one magnetic disk memory. The communication connection between the system network element and at least one other network element is achieved via at least one communication interface 43 (which may be wired or wireless), which may use the internet, a wide area network, a local area network, a metropolitan area network, etc.
Bus 42 may be an ISA bus, a PCI bus, an EISA bus, or the like. Buses may be classified as address buses, data buses, control buses, etc. For ease of illustration, only one bi-directional arrow is shown in FIG. 4, but this does not mean that there is only one bus or one type of bus.
The memory 41 is configured to store a program, and the processor 40 executes the program after receiving an execution instruction; the method executed by the flow-defined apparatus disclosed in any of the foregoing embodiments of the present invention may be applied to the processor 40 or implemented by the processor 40.
The processor 40 may be an integrated circuit chip having signal processing capabilities. In implementation, the steps of the above method may be performed by integrated logic circuitry in hardware or instructions in software in processor 40. The processor 40 may be a general-purpose processor, including a Central Processing Unit (CPU), a Network Processor (NP), and the like; but may also be a digital signal processor (Digital Signal Processing, DSP for short), application specific integrated circuit (Application Specific Integrated Circuit, ASIC for short), off-the-shelf programmable gate array (Field-Programmable Gate Array, FPGA for short), or other programmable logic device, discrete gate or transistor logic device, discrete hardware components. The disclosed methods, steps, and logic blocks in the embodiments of the present invention may be implemented or performed. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like. The steps of the method disclosed in connection with the embodiments of the present invention may be embodied directly in the execution of a hardware decoding processor, or in the execution of a combination of hardware and software modules in a decoding processor. The software modules may be located in a random access memory, flash memory, read only memory, programmable read only memory, or electrically erasable programmable memory, registers, etc. as well known in the art. The storage medium is located in a memory 41 and the processor 40 reads the information in the memory 41 and in combination with its hardware performs the steps of the method described above.
The computer program product of the readable storage medium provided by the embodiment of the present invention includes a computer readable storage medium storing a program code, where the program code includes instructions for executing the method described in the foregoing method embodiment, and the specific implementation may refer to the foregoing method embodiment and will not be described herein.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present invention may be embodied essentially or in a part contributing to the prior art or in a part of the technical solution, in the form of a software product stored in a storage medium, comprising several instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to perform all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a random access Memory (RAM, random Access Memory), a magnetic disk, or an optical disk, or other various media capable of storing program codes.
Finally, it should be noted that the above examples are only specific embodiments of the present invention, intended to illustrate rather than limit its technical solutions, and the protection scope of the present invention is not limited thereto. Although the present invention has been described in detail with reference to the foregoing examples, those skilled in the art will understand that any person skilled in the art may still modify the technical solutions described in the foregoing embodiments, or easily conceive of changes, or make equivalent substitutions of some technical features, while remaining within the technical scope of the present disclosure; such modifications, changes or substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present invention and are intended to be included in the protection scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (9)

1. A method for visualizing a radar three-dimensional power range, characterized in that the method is applied to a mixed reality platform, the mixed reality platform being in communication connection with a server side, the method comprising the following steps:
acquiring radar position and range attributes sent by the server side;
Generating a radar model based on the radar position, and generating a radar three-dimensional power range corresponding to the radar model based on the range attribute; the MR radar three-dimensional power range drawing function dynamically adjusts and controls the shape, color and angle of the radar power range according to parameters transmitted by the server end, and then generates a corresponding grid model by drawing with a customized shader;
displaying the radar three-dimensional power range in a mixed reality scene to realize the visualization of the radar three-dimensional power range;
the range attribute comprises a shape type, a target angle and a detection distance of the radar three-dimensional power range, wherein the shape type comprises a cone type, a rectangular view field type or a phased array type, and the target angle comprises a horizontal angle and a pitching angle; the step of generating the radar three-dimensional power range corresponding to the radar model based on the range attribute comprises the following steps:
if the shape type is the cone type, determining a cone included angle based on the radar position and the detection distance, generating a regular cone model according to the cone included angle, and expressing a radar three-dimensional power range corresponding to the radar model by using the regular cone model;
If the shape type is the rectangular view field type, constructing a first vector corresponding to the forward direction of the radar model, rotating the first vector anticlockwise by the horizontal angle to obtain a second vector and rotating the first vector clockwise by the horizontal angle to obtain a third vector by taking a vector, perpendicular to the ground, of the longitude and latitude of the radar model as an axis; rotating the pitching angle anticlockwise for the first vector to obtain a fourth vector and rotating the pitching angle clockwise for the first vector to obtain a fifth vector by taking a vector which is perpendicular to the first vector and parallel to the tangent line of the current point as an axis; generating a quadrangular pyramid model based on the second vector, the third vector, the fourth vector and the fifth vector, and expressing a radar three-dimensional power range corresponding to the radar model by using the quadrangular pyramid model;
if the shape type is the phased array type, constructing a sixth vector, wherein the starting point of the sixth vector is the radar position, and the direction is from the sphere center of the earth toward the north pole of the earth; rotating the sixth vector clockwise by the horizontal angle, taking the vector perpendicular to the earth's surface at the current longitude and latitude coordinates as an axis, to obtain a seventh vector, and taking the plane of the sixth vector and the seventh vector as the bottom surface; taking the central line of the sixth vector and the seventh vector as an eighth vector, and rotating the eighth vector by the pitching angle, taking the tangent at the current longitude and latitude perpendicular to the eighth vector as an axis, to obtain a ninth vector; taking the sixth vector, the seventh vector and the ninth vector as edges; and generating a phased array model based on the bottom surface and the edges, and expressing the radar three-dimensional power range corresponding to the radar model by the phased array model.
2. The method of visualizing a radar three-dimensional power range as in claim 1, further comprising:
for any two or more radar models, determining an intersection overlapping area between the radar three-dimensional power ranges corresponding to the radar models;
and displaying the intersection overlapping area in the mixed reality scene according to a preset first target special effect.
3. The method of visualizing a radar three-dimensional power range as in claim 1, further comprising:
receiving celestial body operation data sent by the server side to control celestial body target movement according to the celestial body operation data;
judging whether the celestial object enters one or more radar three-dimensional power ranges corresponding to the radar model in the celestial object moving process;
if so, taking a radar model corresponding to a radar three-dimensional power range to which the current position of the celestial object belongs as a target radar model, and determining that a communication link exists between the celestial object and the target radar model;
and displaying the communication link in the mixed reality scene according to a preset second target special effect.
4. A method of visualizing a radar three-dimensional power range as in claim 3, further comprising:
receiving celestial body communication data sent by the server side, and controlling communication between the celestial body target and the target radar model according to the celestial body communication data;
when a communication event occurs between the celestial object and the target radar model, displaying the communication effect corresponding to the communication event in the mixed reality scene according to a preset third target special effect.
5. The method of visualizing a radar three-dimensional power range as in claim 1, further comprising:
capturing gesture information of a user;
and based on the gesture information, performing interactive operation on the radar three-dimensional power range displayed in the mixed reality scene.
6. The method of visualizing a radar three-dimensional power range as in claim 1, further comprising:
determining a relative positional relationship between one or more of the radar models and a pre-constructed earth surface model to display the radar model on the earth surface model in accordance with the relative positional relationship;
And receiving a radar observation angle sent by the server side, and rotating the radar model displayed on the earth surface model according to the radar observation angle.
7. A device for visualizing a radar three-dimensional power range, characterized in that the device is applied to a mixed reality platform, the mixed reality platform being in communication connection with a server side, the device comprising:
the data acquisition module is used for acquiring radar position and range attributes sent by the server side;
the range generation module is used for generating a radar model based on the radar position and generating a radar three-dimensional power range corresponding to the radar model based on the range attribute; the MR radar three-dimensional power range drawing function dynamically adjusts and controls the shape, color and angle of the radar power range according to parameters transmitted by the server end, and then generates a corresponding grid model by drawing with a customized shader;
the range visualization module is used for displaying the radar three-dimensional power range in a mixed reality scene so as to realize the visualization of the radar three-dimensional power range;
the range attribute comprises a shape type, a target angle and a detection distance of the radar three-dimensional power range, wherein the shape type comprises a cone type, a rectangular view field type or a phased array type, and the target angle comprises a horizontal angle and a pitching angle; the range generation module is further configured to:
If the shape type is the cone type, determining a cone included angle based on the radar position and the detection distance, generating a regular cone model according to the cone included angle, and expressing a radar three-dimensional power range corresponding to the radar model by using the regular cone model;
if the shape type is the rectangular view field type, constructing a first vector corresponding to the forward direction of the radar model, rotating the first vector anticlockwise by the horizontal angle to obtain a second vector and rotating the first vector clockwise by the horizontal angle to obtain a third vector by taking a vector, perpendicular to the ground, of the longitude and latitude of the radar model as an axis; rotating the pitching angle anticlockwise for the first vector to obtain a fourth vector and rotating the pitching angle clockwise for the first vector to obtain a fifth vector by taking a vector which is perpendicular to the first vector and parallel to the tangent line of the current point as an axis; generating a quadrangular pyramid model based on the second vector, the third vector, the fourth vector and the fifth vector, and expressing a radar three-dimensional power range corresponding to the radar model by using the quadrangular pyramid model;
If the shape type is the phased array type, constructing a sixth vector, wherein the starting point of the sixth vector is the radar position, and the direction is from the sphere center of the earth toward the north pole of the earth; rotating the sixth vector clockwise by the horizontal angle, taking the vector perpendicular to the earth's surface at the current longitude and latitude coordinates as an axis, to obtain a seventh vector, and taking the plane of the sixth vector and the seventh vector as the bottom surface; taking the central line of the sixth vector and the seventh vector as an eighth vector, and rotating the eighth vector by the pitching angle, taking the tangent at the current longitude and latitude perpendicular to the eighth vector as an axis, to obtain a ninth vector; taking the sixth vector, the seventh vector and the ninth vector as edges; and generating a phased array model based on the bottom surface and the edges, and expressing the radar three-dimensional power range corresponding to the radar model by the phased array model.
8. An electronic device comprising a processor and a memory, the memory storing computer-executable instructions executable by the processor, the processor executing the computer-executable instructions to implement the method of any one of claims 1 to 6.
9. A computer-readable storage medium storing computer-executable instructions which, when invoked and executed by a processor, cause the processor to implement the method of any one of claims 1 to 6.
CN202311657990.1A 2023-12-06 2023-12-06 Visualization method, device, equipment and medium for radar three-dimensional power range Active CN117368869B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311657990.1A CN117368869B (en) 2023-12-06 2023-12-06 Visualization method, device, equipment and medium for radar three-dimensional power range

Publications (2)

Publication Number Publication Date
CN117368869A CN117368869A (en) 2024-01-09
CN117368869B true CN117368869B (en) 2024-03-19

Family

ID=89402591

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311657990.1A Active CN117368869B (en) 2023-12-06 2023-12-06 Visualization method, device, equipment and medium for radar three-dimensional power range

Country Status (1)

Country Link
CN (1) CN117368869B (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5757310A (en) * 1995-05-03 1998-05-26 Matra Bae Dynamics (Uk) Ltd. Tactical ballistic missle early warning radar and defence system
JPH10154243A (en) * 1996-09-30 1998-06-09 Sony Corp Information processor, information processing method and information supply medium in three-dimensional virtual reality space sharing system
CN106932772A (en) * 2017-03-15 2017-07-07 华北计算技术研究所(中国电子科技集团公司第十五研究所) A kind of radar coverage display methods by the influence of topography towards digital earth
CN107067454A (en) * 2017-04-01 2017-08-18 北京无线电测量研究所 A kind of radar coverage-diagram 3 D displaying method based on SuperMap development platform
CN110322553A (en) * 2019-07-10 2019-10-11 广州建通测绘地理信息技术股份有限公司 The method and system of laser radar point cloud mixed reality scene implementation setting-out
CN113933844A (en) * 2021-10-13 2022-01-14 黄兵 Phased array multiband integrated transmitting and receiving radar and radar detection method
CN217467690U (en) * 2022-01-07 2022-09-20 中国人民解放军火箭军工程大学 Distributed maintenance support site selection system for military Internet of things
WO2023059087A1 (en) * 2021-10-08 2023-04-13 Samsung Electronics Co., Ltd. Augmented reality interaction method and apparatus

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20200035028A1 (en) * 2018-07-30 2020-01-30 Raytheon Company Augmented reality (ar) doppler weather radar (dwr) visualization application
US11899124B2 (en) * 2020-04-17 2024-02-13 Raytheon Company Interface for realtime, 3D radar activity visualization

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Zhang Yang et al., "Research on Visualization of Radar Three-Dimensional Power Range under Jamming", Electronic Information Warfare Technology, No. 6, 2011-11-30, pp. 73-77 *


Similar Documents

Publication Publication Date Title
Haala et al. 3D urban GIS from laser altimeter and 2D map data
US20130300740A1 (en) System and Method for Displaying Data Having Spatial Coordinates
US20040181382A1 (en) Visualizing the surface of a liquid
Portalés et al. Augmented reality and photogrammetry: A synergy to visualize physical and virtual city environments
US20090237396A1 (en) System and method for correlating and synchronizing a three-dimensional site model and two-dimensional imagery
US7098915B2 (en) System and method for determining line-of-sight volume for a specified point
CN109448137B (en) Interaction method, interaction device, electronic equipment and storage medium
CN115690336B (en) Satellite beam coverage area visualization method, server and storage medium
CN111047506B (en) Environmental map generation and hole filling
CN112712582B (en) Dynamic global illumination method, electronic device and computer readable storage medium
Gomez-Jauregui et al. Quantitative evaluation of overlaying discrepancies in mobile augmented reality applications for AEC/FM
CN113593027B (en) Three-dimensional avionics display control interface device
CN109741431B (en) Two-dimensional and three-dimensional integrated electronic map frame
US9401044B1 (en) Method for conformal visualization
Liarokapis et al. Mobile augmented reality techniques for geovisualisation
RU2295772C1 (en) Method for generation of texture in real time scale and device for its realization
US7116341B2 (en) Information presentation apparatus and method in three-dimensional virtual space and computer program therefor
CN117368869B (en) Visualization method, device, equipment and medium for radar three-dimensional power range
JPH113432A (en) Image processor, game machine, its method and recording medium
Fernández-Palacios et al. Augmented reality for archaeological finds
Tao A VR/AR-based display system for arts and crafts museum
Conde et al. LiDAR Data Processing for Digitization of the Castro of Santa Trega and Integration in Unreal Engine 5
CN111325783A (en) WebGIS-based visual domain analysis method and device
Hairuddin et al. Development of a 3d cadastre augmented reality and visualization in malaysia
CN113658318A (en) Data processing method and system, training data generation method and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant