CN112802159A - Rendering method and device of air route and storage medium - Google Patents


Info

Publication number
CN112802159A
CN112802159A (application CN202110088010.5A)
Authority
CN
China
Prior art keywords
data
rendering
track
route
rendered
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110088010.5A
Other languages
Chinese (zh)
Inventor
周明瑞
吴浩原
麻广伟
赵龙
李洪亮
马三立
石清华
温宇浩
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Cennavi Technologies Co Ltd
Original Assignee
Beijing Cennavi Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Cennavi Technologies Co Ltd filed Critical Beijing Cennavi Technologies Co Ltd
Priority claimed from CN202110088010.5A
Publication of CN112802159A
Legal status: Pending

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 13/00: Animation
    • G06T 13/20: 3D [Three Dimensional] animation
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 21/00: Navigation; Navigational instruments not provided for in groups G01C 1/00 - G01C 19/00
    • G01C 21/20: Instruments for performing navigational calculations

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Navigation (AREA)

Abstract

The embodiments of this application provide a route rendering method, a route rendering device, and a storage medium. They relate to the field of computer technology and address three problems of route rendering in the related art: the methods are complex, data processing is prone to stuttering, and the rendering quality is poor. The method comprises the following steps. First, the route rendering device acquires data of a target route, where the data comprises a start position, an end position, and intermediate parameters, the intermediate parameters comprising at least one of a maximum altitude, a minimum altitude, and a unit travel distance. The device then determines the driving track of the target route from the data and converts each position in the data into a position in a spherical coordinate system. Finally, it renders the driving track according to the positions in the spherical coordinate system to obtain the rendered track.

Description

Rendering method and device of air route and storage medium
Technical Field
The invention relates to the technical field of computers, in particular to a rendering method and device of a flight path and a storage medium.
Background
At present, route rendering in the related art mainly uses either Canvas-based data loading or loading through a WebGL (Web Graphics Library) three-dimensional base engine. In the Canvas approach, front-end visualization libraries such as ECharts and MapV process and render the data in an MVC (model-view-controller) pattern; however, this approach tends to stutter when the data volume is large. The WebGL three-dimensional engine approach performs route rendering mainly by setting the exposed parameters of three-dimensional visualization libraries such as three.js and Cesium and processing the input data. However, since three.js is not a map-oriented three-dimensional visualization library, the user must compute the data and parameters needed for route rendering, which is cumbersome; in addition, Cesium renders routes poorly, and the routes become deformed.
As a result, the route rendering methods provided in the related art suffer from stuttering during data processing, complex workflows, and poor rendering quality.
Disclosure of Invention
This application aims to provide a route rendering method, a route rendering device, and a storage medium that solve the problems of the related art: complex rendering methods, stuttering during data processing, and poor rendering quality.
In a first aspect, an embodiment of this application provides a route rendering method that includes the following steps. First, the route rendering device acquires data of a target route, where the data comprises a start position, an end position, and intermediate parameters, the intermediate parameters comprising at least one of a maximum altitude, a minimum altitude, and a unit travel distance. The device then determines the driving track of the target route from the data and converts each position in the data into a position in a spherical coordinate system. Finally, it renders the driving track according to the positions in the spherical coordinate system to obtain the rendered track.
Based on the first aspect, the route rendering device determines the driving track of the target route from the acquired data, converts the coordinates of each position in the data into spherical coordinates, and renders the driving track using WebGL, i.e., draws the coordinates in the spherical coordinate system to obtain the rendered track. This addresses the problems of the related art: complex rendering methods, stuttering during data processing, and poor rendering quality.
In one possible design, before determining the driving track of the target route according to the data, the route rendering method further comprises the following steps: the rendering device of the air route divides the data into a plurality of data groups, determines the sub-track of the target air route according to the data groups, and obtains the driving track of the target air route according to the sub-track.
Based on the possible design, the rendering device of the air route divides the data into a plurality of data groups, and determines the sub-track of the target air route according to the starting position and the ending position in each data group.
In one possible design, the rendering device of the route constructs a functional relation corresponding to the target route according to the data, and determines the driving track of the target route according to the functional relation.
Based on the possible design, the rendering device of the route constructs a functional relation corresponding to the target route through data, and then determines the running track of the target route according to the functional relation, so that the determined running track of the target route is more accurate, and the running track of the target route is smoother.
In one possible design, the route rendering device draws the positions in the spherical coordinate system according to a shader to obtain the rendered track.
In one possible design, the rendering device of the flight path modifies the attribute information of the rendered trajectory according to a preset period, so that the rendered trajectory has different display modes at different moments.
Based on this possible design, the route rendering device modifies the attribute information of the rendered track, i.e., at least one of its color, line width, and transparency, so that the track presents a dynamic effect, further enhancing the rendering result.
In one possible design, the rendering device of the flight path calls an animation model, assigns a position in a spherical coordinate system to the animation model, and adjusts the attitude angle of the animation model according to data to obtain a rendered dynamic track.
Based on this possible design, the route rendering device adds an animation model to the rendered track and assigns it the positions in the spherical coordinate system so that the model stays consistent with the rendered track. On that basis, it adjusts the model's attitude angle so that the model rotates to follow the rendered track; the rendered track thus also contains the animation model, further enhancing the rendering effect.
In a second aspect, an embodiment of this application provides a route rendering apparatus that can implement the functions performed by the route rendering device in the first aspect or any of its possible designs; these functions may be implemented by hardware executing the corresponding software. The hardware or software comprises one or more modules corresponding to the functions, for example an acquisition module, a processing module, a coordinate-conversion module, and a rendering module. The acquisition module acquires data of a target route; the data comprises a start position, an end position, and intermediate parameters, the intermediate parameters comprising at least one of a maximum altitude, a minimum altitude, and a unit travel distance. The processing module determines the driving track of the target route from the data. The coordinate-conversion module converts the coordinate system of each position in the data to obtain each position in a spherical coordinate system. The rendering module renders the driving track according to the positions in the spherical coordinate system to obtain the rendered track.
In one possible design, the processing module is further configured to divide the data into a plurality of data groups, where each data group includes a start position, an end position, and intermediate parameters, the intermediate parameters including at least one of a maximum altitude, a minimum altitude, and a unit travel distance, and different data groups correspond to different times; and to determine the sub-tracks of the target route from the data groups and obtain the driving track of the target route from the sub-tracks.
In one possible design, the processing module is specifically configured to construct a functional relationship corresponding to the target route according to the data; and determining the driving track of the target route according to the functional relation.
In one possible design, the route rendering apparatus further includes a shader; the rendering module is specifically configured to draw the positions in the spherical coordinate system according to the shader to obtain the rendered track.
In one possible design, the rendering module is further configured to modify attribute information of the rendered trajectory according to a preset period, so that the rendered trajectory has different display modes at different times; wherein the attribute information includes at least one of color, line width, and transparency.
In one possible design, the rendering module is further configured to invoke an animation model and assign a position in the spherical coordinate system to the animation model; and adjusting the attitude angle of the animation model according to the data to obtain a rendered dynamic track.
In a third aspect, an embodiment of this application provides an electronic device; the route rendering apparatus may be the electronic device itself or a chip or system-on-chip inside it. The electronic device can implement the functions performed by the route rendering apparatus in the possible designs of the above aspects, and these functions may be implemented by hardware and software.
In one possible design, the electronic device may include a processor and a memory coupled to the processor. The memory stores computer program code comprising computer instructions; when the processor executes these instructions, the electronic device performs the route rendering method described in the first aspect and any of its possible designs.
In a fourth aspect, a computer-readable storage medium is provided, storing computer instructions or a program which, when run on a computer, cause the computer to perform the route rendering method set forth in the first aspect or any possible design of the first aspect.
In a fifth aspect, a computer program product containing instructions is provided which, when run on a computer, causes the computer to perform the route rendering method set forth in the first aspect or any one of its possible designs.
Drawings
FIG. 1 is a schematic flowchart of a route rendering method according to an embodiment of this application;
FIG. 2 shows a spherical coordinate system provided in an embodiment of this application;
FIG. 3 is a schematic flowchart of another route rendering method according to an embodiment of this application;
FIG. 4 is a schematic diagram of the driving track of a target route according to an embodiment of this application;
FIG. 5 is a schematic flowchart of another route rendering method according to an embodiment of this application;
FIG. 6 is a schematic flowchart of another route rendering method according to an embodiment of this application;
FIG. 7 is a schematic diagram of a rendered track according to an embodiment of this application;
FIG. 8 is a schematic diagram of another rendered track according to an embodiment of this application;
FIG. 9 is a schematic structural diagram of a route rendering apparatus according to an embodiment of this application;
FIG. 10 is a schematic composition diagram of an electronic device according to an embodiment of this application.
Detailed Description
The term "and/or" herein merely describes an association between objects and indicates that three relationships may exist; for example, "A and/or B" may mean: A alone, both A and B, or B alone.
The terms "first" and "second" and the like in the description and drawings of the present application are used for distinguishing different objects or for distinguishing different processes for the same object, and are not used for describing a specific order of the objects.
Furthermore, the terms "including" and "having," and any variations thereof, as used in the description of this application, are intended to cover non-exclusive inclusion. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to those steps or elements, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
It should be noted that in the embodiments of the present application, words such as "exemplary" or "for example" are used to indicate examples, illustrations or explanations. Any embodiment or design described herein as "exemplary" or "e.g.," is not necessarily to be construed as preferred or advantageous over other embodiments or designs. Rather, use of the word "exemplary" or "such as" is intended to present concepts related in a concrete fashion.
In the description of the present application, the meaning of "a plurality" means two or more unless otherwise specified.
To solve the problems described in the background, an embodiment of this application provides a route rendering method. The route rendering device determines the driving track of a target route from the acquired route data, converts each position in the data into a position in a spherical coordinate system, and then renders the driving track according to those positions to obtain the rendered track. The method is therefore simple, data processing stays smooth, and the quality of the rendered track improves.
The following detailed description of embodiments of the present application refers to the accompanying drawings.
Fig. 1 is a schematic flowchart of a rendering method of a flight path provided in an embodiment of the present application, and as shown in fig. 1, the rendering method includes:
s101, the rendering device of the route acquires data of a target route.
Wherein the data includes a start position, an end position, and intermediate parameters including at least one of a maximum altitude, a minimum altitude, and a unit distance traveled.
It should be noted that the intermediate parameter includes at least one of a maximum height, a minimum height, and a unit travel distance, and may be that the intermediate parameter includes only the maximum height, only the minimum height, or only the unit travel distance; the intermediate parameters may include any two or three of the maximum height, the minimum height and the unit travel distance, and may be specifically set according to the needs, and are not limited herein.
It should be noted that the target route includes airborne routes and waterborne routes, for example the flight path of an aircraft in the air or the travel path of a submarine in the sea.
For example, suppose the target route is the flight path of an aircraft between 10:00 and 12:00 (24-hour clock). The data of the target route then includes: the start position (the aircraft's position at 10:00), the end position (its position at 12:00), the maximum altitude (the greatest height above ground reached during the 10:00-12:00 flight), the minimum altitude (the smallest such height), and the unit travel distance (the mileage flown between 10:00 and 12:00).
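To make the data layout concrete, the fields above can be sketched as a plain object; the field names and values here are illustrative assumptions, not the patent's actual schema:

```javascript
// Hypothetical shape of the target-route data: start/end positions plus
// the intermediate parameters (field names are illustrative, not from the
// patent).
const targetRoute = {
  start: [116.40, 39.90], // longitude/latitude at 10:00 (illustrative)
  end: [117.02, 36.67],   // longitude/latitude at 12:00 (illustrative)
  maxAltitude: 10000,     // greatest height above ground, metres
  minAltitude: 0,         // smallest height above ground, metres
  unitDistance: 360,      // mileage flown in the window, km
};
```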
And S102, determining the driving track of the target route by the rendering device of the route according to the data.
For example, the target route may be a curve (such as a parabola) or a straight line. The determined driving track is likewise a curve or a straight line, i.e., the driving track corresponds to the shape of the target route.
For example, taking the target route as the flight route of the aircraft in the air as an example, the route rendering device may calculate, according to the data of the target route and according to a preset period, information (for example, coordinates of each position and the height of the aircraft from the ground at the current time) corresponding to each position of the aircraft on the flight route, so as to obtain the driving track of the target route.
For example, the preset period may be 1 minute (min), 5 min, or 10 seconds (s); the embodiments of this application impose no limit. Note that the shorter the preset period, the more accurate the resulting driving track of the target route.
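The periodic sampling described above can be sketched as follows, assuming a hypothetical positionAt(t) helper that returns the aircraft's position at time t (the patent does not name such a function):

```javascript
// Sample a route at a fixed preset period to build the driving track.
// `positionAt` is a hypothetical stand-in for the per-time position
// computation; a shorter periodMs yields a more accurate track.
function sampleTrack(positionAt, startMs, endMs, periodMs) {
  const track = [];
  for (let t = startMs; t <= endMs; t += periodMs) {
    track.push(positionAt(t));
  }
  return track;
}
```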
And S103, the rendering device of the flight path converts the coordinate system of each position in the data to obtain the position of each position in the spherical coordinate system.
Specifically, in the data of the target route, the starting position and the ending position are positions (i.e., longitude and latitude coordinates) in a geographic coordinate system, and all the longitude and latitude coordinates in the data are converted into positions in a spherical coordinate system, i.e., spherical coordinates.
For example, as shown in FIG. 2, a spherical coordinate system with radius R is established. Assume the horizontal rotation angle of the target route is θ ∈ (0, 2π) and the up-down (polar) rotation angle is φ ∈ (0, π). In the spherical coordinate system of FIG. 2, M is a moving point with rotation angles θ and φ; in this embodiment, the moving point M represents the aircraft. Based on this, each position in the data of the target route is represented in the spherical coordinate system as (x, y, z), where x, y, and z satisfy:

x = R·sinφ·cosθ, y = R·sinφ·sinθ, z = R·cosφ (1)

In formula (1), x is the abscissa, y the ordinate, z the vertical coordinate, and R the radius of the sphere.
It should be noted that each spherical coordinate lies in the range (-1, 1): the minimum of (x, y, z) is (-1, -1, -1) and the maximum is (1, 1, 1). In addition, to keep the volume of converted spherical-coordinate data from slowing down processing, each spherical coordinate can be stored in the route rendering device in matrix form (applied by matrix multiplication) for subsequent rendering.
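A minimal sketch of the longitude/latitude to spherical-coordinate conversion, assuming the standard parameterization x = R·sinφ·cosθ, y = R·sinφ·sinθ, z = R·cosφ with θ derived from longitude and φ from latitude; the exact axis convention the patent uses is not specified, so this mapping is an assumption:

```javascript
// Convert geographic coordinates (degrees) to a point on a sphere of
// radius R. With R = 1, every component lies in (-1, 1), matching the
// WebGL-style coordinate range described in the text.
function lngLatToSphere(lngDeg, latDeg, R = 1) {
  const theta = (lngDeg * Math.PI) / 180;      // horizontal rotation angle θ
  const phi = ((90 - latDeg) * Math.PI) / 180; // up-down (polar) angle φ
  return {
    x: R * Math.sin(phi) * Math.cos(theta),
    y: R * Math.sin(phi) * Math.sin(theta),
    z: R * Math.cos(phi),
  };
}
```

For instance, the north pole (latitude 90°) maps to (0, 0, R), and a point on the equator at longitude 0° maps to (R, 0, 0).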
And S104, rendering the driving track by the rendering device of the route according to the position in the spherical coordinate system to obtain the rendered track.
Specifically, the route rendering device may render the driving track using WebGL (Web Graphics Library). When the driving track is rendered with WebGL, the spherical coordinate system may also be called the WebGL coordinate system.
For example, the route rendering device may draw the coordinates of each position in the spherical coordinate system, add preset route parameters to the drawn coordinates, and render the driving track to obtain the rendered track. The preset route parameters may include attributes such as color, transparency, and line width.
It can be understood that once the route rendering device obtains the rendered track, it can display that track so the user sees it, improving the user's visual experience.
In this embodiment of the application, the route rendering device determines the driving track of the target route from the acquired data, converts the coordinates of each position in the data into spherical coordinates, and renders the driving track with WebGL, i.e., draws the coordinates in the spherical coordinate system to obtain the rendered track, thereby avoiding the complexity, stuttering during data processing, and poor rendering quality of the related art.
Optionally, as shown in fig. 3, before determining the driving track of the target route according to the data, the method for rendering the route further includes:
s101a, the rendering device of the flight path divides the data into a plurality of data groups.
Wherein each data group in the plurality of data groups comprises a starting position, an ending position and an intermediate parameter; the intermediate parameters include at least one of a maximum height, a minimum height, and a unit travel distance; the time corresponding to different data sets is different.
Illustratively, the plurality of data sets includes a first data set, a second data set, and a third data set; the unit durations of the first data group, the second data group and the third data group can be the same or different.
In one example, the unit durations of the first, second, and third data groups are the same; for example, the first data group corresponds to 10:00-10:05, the second to 10:05-10:10, and the third to 10:10-10:15. In another example, the unit durations differ; for example, the first data group corresponds to 10:00-10:05, the second to 10:05-10:15, and the third to 10:15-10:30.
It is understood that, in the plurality of data sets, the ending position of the previous data set is the starting position of the next data set. Taking the plurality of data sets including the first data set, the second data set and the third data set as an example, for example, the ending position of the first data set is the starting position of the second data set, and the ending position of the second data set is the starting position of the third data set.
In addition, after dividing the data into a plurality of data groups, for example, the plurality of data groups may satisfy the following format:
[[(x₁, y₁), (x₂, y₂), parmas₁], [(x₂, y₂), (x₃, y₃), parmas₂], …, [(xₙ₋₁, yₙ₋₁), (xₙ, yₙ), parmasₙ]]
where (xₙ₋₁, yₙ₋₁) is a start position, (xₙ, yₙ) the corresponding end position, and parmasₙ represents the intermediate parameters.
It should be noted that the more data groups into which the data are divided, the smoother the travel locus of the determined target route. Based on this, in one possible design, the rendering device of the flight path may also traverse each data set, subdividing each data set into a plurality of sub-data sets, which may, for example, satisfy the following format:
{type:"Feature", geometry:{type:"LineString", coordinates:[(xₒ, yₒ), (xₑ, yₑ)]}, properties:{}}
where (xₒ, yₒ) is the start position of the sub-data group and (xₑ, yₑ) its end position; the properties field carries the intermediate parameters, which may be at least one of the maximum altitude, the minimum altitude, and the unit travel distance. This format is GeoJSON data of the standard LineString type.
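The grouping into LineString-type GeoJSON features can be sketched as follows; the function name and the params argument are assumptions for illustration, not the patent's API:

```javascript
// Split an ordered array of route sample points into adjacent
// [start, end] pairs and wrap each pair as a GeoJSON LineString feature,
// so the end of one sub-track is the start of the next.
function toLineStringFeatures(points, params = {}) {
  const features = [];
  for (let i = 0; i < points.length - 1; i++) {
    features.push({
      type: "Feature",
      geometry: { type: "LineString", coordinates: [points[i], points[i + 1]] },
      properties: { ...params }, // intermediate parameters, e.g. altitudes
    });
  }
  return features;
}
```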
It should be noted that, for an example of the intermediate parameter included in each data group, reference may be made to the example of the intermediate parameter included in the data in the foregoing embodiment, and details are not described here again.
S101b, the rendering device of the air route determines the sub-track of the target air route according to the data group, and obtains the driving track of the target air route according to the sub-track.
Specifically, each data group comprises a starting position and an ending position, so that the sub-track of the target air route can be determined according to the starting position and the ending position in one data group, and the starting position and the ending position of each data group are determined to obtain the driving track of the target air route.
For example, take a parabolic target route. FIG. 4 is a schematic diagram of the driving track of a target route provided by an embodiment of this application. As shown in FIG. 4, (x1, y1) is the start position of the first data group and (x2, y2) its end position; (x2, y2) is the start position of the second data group and (x3, y3) its end position; and so on, so the start and end positions of every data group can be marked. In the driving track of FIG. 4, the segment between (x1, y1) and (x2, y2) is the sub-track determined from one data group; since the sub-tracks are contiguous, linking them together yields the driving track of the target route.
In the embodiment of the application, the rendering device of the air route divides data into a plurality of data groups, and determines the sub-track of the target air route according to the starting position and the ending position in each data group.
Alternatively, as shown in fig. 5, S102 may be replaced by S1021 and S1022. Specifically, S102 includes:
and S1021, the rendering device of the route constructs a functional relation corresponding to the target route according to the data.
For example, take a parabolic target route and assume the data include a start position with coordinates (x1, y1), an end position with coordinates (x2, y2), and a maximum-altitude point with coordinates (x3, y3). The functional relation then satisfies:

y1 = a·x1² + b·x1 + c
y2 = a·x2² + b·x2 + c
y3 = a·x3² + b·x3 + c (2)

In formula (2), a, b, and c are the coefficients of the parabola. It will be appreciated that when the target route is parabolic, the functional relation is a parabola equation.
Since (x1, y1), (x2, y2), and (x3, y3) are known, solving the system in formula (2) gives coefficients a, b, and c satisfying:

a = [x3·(y2 − y1) + x2·(y1 − y3) + x1·(y3 − y2)] / D
b = [x3²·(y1 − y2) + x2²·(y3 − y1) + x1²·(y2 − y3)] / D
c = [x2·x3·(x2 − x3)·y1 + x3·x1·(x3 − x1)·y2 + x1·x2·(x1 − x2)·y3] / D
where D = (x1 − x2)·(x1 − x3)·(x2 − x3) (3)

The coefficients a, b, and c of the parabola can be calculated from formula (3), and the parabola equation of the target route is then constructed from them.
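The three-point fit can be sketched as a small helper using the closed-form solution of the linear system; it assumes x1, x2, and x3 are pairwise distinct, and the function name is illustrative:

```javascript
// Fit y = a·x² + b·x + c through three points (start, end, apex) by the
// closed-form solution of the 3x3 linear system.
function fitParabola([x1, y1], [x2, y2], [x3, y3]) {
  const d = (x1 - x2) * (x1 - x3) * (x2 - x3); // must be nonzero
  const a = (x3 * (y2 - y1) + x2 * (y1 - y3) + x1 * (y3 - y2)) / d;
  const b =
    (x3 * x3 * (y1 - y2) + x2 * x2 * (y3 - y1) + x1 * x1 * (y2 - y3)) / d;
  const c =
    (x2 * x3 * (x2 - x3) * y1 +
      x3 * x1 * (x3 - x1) * y2 +
      x1 * x2 * (x1 - x2) * y3) / d;
  return { a, b, c };
}
```

For the start (0, 0), end (2, 0), and apex (1, 1), this yields a = -1, b = 2, c = 0, i.e. y = -x² + 2x.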
And S1022, determining the driving track of the target route by the rendering device of the route according to the functional relation.
For example, let the functional relation be y = a·x² + b·x + c. The vertex coordinates and the directrix equation are calculated from this relation, and the driving track of the target route is determined from them. Understandably, the more coordinates computed from the functional relation, the smoother the parabola; on this basis, the intersection of the parabola with the y-axis can also be computed by setting x = 0, and its intersection with the x-axis by setting y = 0.
In the embodiment of the application, the rendering device of the route constructs the functional relation corresponding to the target route through the data, and then determines the running track of the target route according to the functional relation, so that the determined running track of the target route is more accurate, and the running track of the target route is smoother.
Optionally, the route rendering device includes a shader. As shown in FIG. 6, S104 may be replaced by S1041-S1044; specifically, S104 includes:
and S1041, drawing the position in the spherical coordinate system by the rendering device of the flight path according to the shader to obtain a rendered track.
Specifically, the shaders comprise a vertex shader and a fragment shader. The vertex shader processes the vertex coordinates, i.e., the coordinates of the positions in the spherical coordinate system, and passes the processed coordinates to the fragment shader; the fragment shader then colors the drawn coordinates according to the preset route parameters to obtain the rendered track, which is finally output.
For example, FIG. 7 is a schematic diagram of a rendered track provided by an embodiment of this application. As shown in FIG. 7, the rendered track clearly shows a track from an origin to a destination, i.e., a route along which an aircraft flies from origin to destination. For example, rendered tracks include Beijing-Jinan, Taiyuan-Zhengzhou, and so on; they are not all listed here.
S1042, the rendering device of the flight path modifies the attribute information of the rendered track according to a preset period so that the rendered track has different display modes at different moments.
Wherein the attribute information includes at least one of color, line width, and transparency.
For example, in order to improve the rendering effect, the preset period may be set to 1 s; that is, the rendered track is modified every 1 s. It should be noted that only the color, only the line width, or only the transparency may be modified, or two or more of the color, line width, and transparency may be modified together; this can be set according to actual needs and is not limited in the embodiments of the application.
In the case where the route rendering device modifies the color according to the preset period, for example, the color of the modified track gradually fades as time increases until it disappears. In the case where the line width is modified according to the preset period, for example, the line width of the modified track gradually narrows with time until it becomes a thin line, or even zero. In the case where the transparency is modified according to the preset period, for example, the modified track gradually becomes more transparent as time increases until it is fully transparent.
In addition, the preset period itself may be modified; that is, the speed, length, and the like with which the modified track changes over time may be adjusted.
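As a minimal sketch of the periodic attribute modification described above (the helper name, decay factor, and cutoff are illustrative assumptions, not from the patent):

```python
# Fade the rendered track's transparency once per preset period so that
# older track segments gradually disappear, producing the dynamic effect.

def fade_alpha(initial_alpha, elapsed_s, period_s=1.0, decay=0.8):
    """Multiply alpha by `decay` for every full preset period elapsed;
    return 0.0 once the segment has effectively disappeared."""
    ticks = int(elapsed_s // period_s)   # number of completed periods
    alpha = initial_alpha * (decay ** ticks)
    return alpha if alpha > 0.01 else 0.0

a0 = fade_alpha(1.0, 0.5)    # within the first period: unchanged
a3 = fade_alpha(1.0, 3.0)    # after 3 periods: 0.8 ** 3
```

Changing `period_s` or `decay` corresponds to modifying the preset period, i.e., how quickly the track changes over time.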
In the embodiment of the application, the rendering device of the route modifies the attribute information of the rendered track, namely modifies the color, the line width and the transparency of the rendered track, so that the rendered track presents a dynamic effect, and the rendering effect is further enhanced.
And S1043, calling the animation model by the rendering device of the flight path, and assigning the position in the spherical coordinate system to the animation model.
Note that the animation model (gltf model) is stored in advance in the rendering device of the flight path. The animated model may be, for example, an airplane model.
Specifically, the route rendering device calls the animation model and uploads it to a graphics processing unit (GPU); after the GPU loads the animation model, the route rendering device assigns a position in the spherical coordinate system to the animation model, thereby ensuring that the animation model is consistent with the rendered track.
And S1044, the rendering device of the flight path adjusts the attitude angle of the animation model according to the data to obtain a rendered dynamic track.
Specifically, the attitude angle includes a heading angle, a pitch angle, and a roll angle. On this basis, at least one of a heading angle, a pitch angle, and a roll angle of the animated model may be adjusted.
Taking the heading angle of the animation model as an example: the route rendering device calculates the heading angle of the animation model at the current moment according to the data of the target route, determines the difference between the heading angle at the current moment and the target heading angle, and then rotates the animation model about the z-axis of the spherical coordinate system until the rotated heading angle is the same as the target heading angle, thereby completing the adjustment of the heading angle. It can be understood that the angle of rotation about the z-axis is exactly the difference between the heading angle of the animation model at the current moment and the target heading angle.
It should be noted that the target course angle is the course angle of the animation model at the next moment of the current moment.
In addition, if the calculated heading angle of the animation model at the current moment (denoted φ) is greater than 180°, the heading angle needs to be compensated to 2π - φ (that is, 360° - φ in degrees). Illustratively, if the heading angle of the animation model at the current moment is 190°, the compensated heading angle is 360° - 190° = 170°; that is, in a specific implementation the heading angle at the current moment is taken as 170°, the difference from the target heading angle is determined based on 170°, and finally the heading angle of the animation model is rotated about the z-axis of the spherical coordinate system by that difference until the rotated heading angle is the same as the target heading angle, completing the adjustment.
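The compensation and rotation above can be sketched as follows (function names are assumptions; angles are in degrees):

```python
# Heading-angle compensation: headings above 180 degrees are folded back
# to 360 - phi, then the model is rotated by the remaining difference.

def compensate_heading(phi_deg):
    """Fold headings greater than 180 degrees back into [0, 180]."""
    return 360.0 - phi_deg if phi_deg > 180.0 else phi_deg

def rotation_about_z(current_deg, target_deg):
    """Signed rotation (degrees) about the z-axis taking the compensated
    current heading to the target heading."""
    return target_deg - compensate_heading(current_deg)

c = compensate_heading(190.0)        # 170.0, as in the example above
r = rotation_about_z(190.0, 175.0)   # rotate by 5 degrees
```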
Fig. 8 is a schematic diagram of a rendered track provided in an embodiment of the present application. As shown in fig. 8, the rendered track includes an animation model (e.g., an airplane model); it can be seen from fig. 8 that the heading angle of the airplane model coincides with the travel trajectory of the target route, and that the opacity of the rendered track decreases with time, that is, the track close to the airplane model is more opaque and the track far from the airplane model is more transparent.
In the embodiment of the application, the route rendering device adds the animation model to the rendered track and assigns the position in the spherical coordinate system to the animation model, so as to ensure that the animation model is consistent with the rendered track; on this basis, the attitude angle of the animation model is adjusted so that the animation model rotates to follow the rendered track. The rendered track thus also includes the animation model, which further enhances the rendering effect.
The scheme provided by the embodiments of the application has been introduced mainly from the point of view of interaction between devices. It will be appreciated that, in order to carry out the above-described functions, each device includes corresponding hardware structures and/or software modules for performing each function. Those skilled in the art will readily appreciate that the various illustrative algorithm steps described in connection with the embodiments disclosed herein can be implemented as hardware or as a combination of hardware and computer software. Whether a function is performed by hardware or by computer software driving hardware depends upon the particular application and the design constraints of the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiment of the present application, functional modules may be divided according to the above method examples, for example, each functional module may be divided corresponding to each function, or two or more functions may be integrated into one processing module. The integrated module can be realized in a hardware mode, and can also be realized in a software functional module mode. It should be noted that, in the embodiment of the present application, the division of the module is schematic, and is only one logic function division, and there may be another division manner in actual implementation.
In the case of dividing function modules according to corresponding functions, fig. 9 shows a route rendering apparatus 200, which may include an acquisition module 201, a processing module 202, a coordinate conversion module 203, and a rendering module 204. For example, the route rendering apparatus 200 may be a server, or a chip applied in a server, or another combined device or component having the above-mentioned server function.
Specifically, the obtaining module 201 is configured to obtain data of a target route; the data includes a start location, an end location, and intermediate parameters including at least one of a maximum altitude, a minimum altitude, and a unit distance traveled. For example, as shown in connection with fig. 1, the obtaining module 201 may be configured to execute S101.
And the processing module 202 is used for determining the driving track of the target air route according to the data. For example, as shown in connection with fig. 1, the processing module 202 may be configured to execute S102.
And the coordinate conversion module 203 is configured to perform coordinate system conversion on each position in the data to obtain a position of each position in the spherical coordinate system. For example, as shown in connection with fig. 1, the coordinate conversion module 203 may be configured to perform S103.
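One common way to realize the conversion performed by module 203 is a geodetic-to-Cartesian mapping onto a sphere; the formulas, radius value, and names below are assumptions for illustration, since the patent does not give the conversion itself:

```python
import math

# Convert (longitude, latitude, altitude) to (x, y, z) in a
# sphere-centered Cartesian frame of radius `radius` (meters).

def to_sphere(lon_deg, lat_deg, alt, radius=6371000.0):
    lon = math.radians(lon_deg)
    lat = math.radians(lat_deg)
    r = radius + alt                      # altitude offsets the radius
    x = r * math.cos(lat) * math.cos(lon)
    y = r * math.cos(lat) * math.sin(lon)
    z = r * math.sin(lat)
    return x, y, z

p = to_sphere(0.0, 0.0, 0.0)   # on the equator at the prime meridian
```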
And the rendering module 204 is configured to render the driving track according to the position in the spherical coordinate system, so as to obtain a rendered track. For example, as shown in connection with fig. 1, rendering module 204 may be configured to perform S104.
Optionally, the processing module 202 is further configured to divide the data into a plurality of data groups; wherein each of the plurality of data sets includes a start position, an end position, and an intermediate parameter, the intermediate parameter including at least one of a maximum height, a minimum height, and a unit travel distance; the time corresponding to different data sets is different; and determining a sub-track of the target route according to the data group, and obtaining the driving track of the target route according to the sub-track. For example, as shown in connection with FIG. 3, the processing module 202 may be configured to perform S101a-S101 b.
Optionally, the processing module 202 is specifically configured to construct a functional relationship corresponding to the target route according to the data; and determining the driving track of the target route according to the functional relation. For example, as shown in connection with FIG. 5, the processing module 202 may be configured to perform S1021-S1022.
Optionally, the rendering device of the route further comprises a shader; the rendering module 204 is specifically configured to draw a position in the spherical coordinate system according to the shader, so as to obtain a rendered track. For example, as shown in connection with fig. 6, rendering module 204 may be configured to perform S1041.
Optionally, the rendering module 204 is further configured to modify attribute information of the rendered track according to a preset period, so that the rendered track has different display modes at different times; wherein the attribute information includes at least one of color, line width, and transparency. For example, as shown in connection with fig. 6, rendering module 204 may be configured to perform S1042.
Optionally, the rendering module 204 is further configured to invoke an animation model, and assign a position in the spherical coordinate system to the animation model; and adjusting the attitude angle of the animation model according to the data to obtain a rendered dynamic track. For example, as shown in connection with FIG. 6, rendering module 204 may be configured to perform S1043-S1044.
It should be noted that, for the description of the functions implemented by each module of the route rendering apparatus shown in fig. 9 and the resulting beneficial effects, reference may be made to the description and beneficial effects of the route rendering method in the foregoing embodiments, which are not repeated here.
The embodiment of the present application further provides an electronic device, as shown in fig. 10, the electronic device 300 includes a processor 301, a transceiver 302, a communication line 303, and a memory 304.
The processor 301, the memory 304 and the transceiver 302 may be connected via a communication line 303.
The processor 301 may be a central processing unit (CPU), a general-purpose processor, a network processor (NP), a digital signal processor (DSP), a microprocessor, a microcontroller, a programmable logic device (PLD), or any combination thereof. The processor 301 may also be another device with processing functions, such as a circuit, a device, or a software module, without limitation.
A transceiver 302 for communicating with other devices or other communication networks. The other communication network may be an ethernet, a Radio Access Network (RAN), a Wireless Local Area Network (WLAN), or the like. The transceiver 302 may be a module, a circuit, a transceiver, or any device capable of enabling communication.
A communication line 303 for transmitting information between the respective components included in the electronic apparatus 300.
A memory 304 for storing instructions. Wherein the instructions may be a computer program.
The memory 304 may be a read-only memory (ROM) or another type of static storage device that can store static information and/or instructions, a random access memory (RAM) or another type of dynamic storage device that can store information and/or instructions, an electrically erasable programmable read-only memory (EEPROM), a compact disc read-only memory (CD-ROM) or other optical disc storage (including a compact disc, laser disc, optical disc, digital versatile disc, blu-ray disc, etc.), a magnetic disk storage medium or other magnetic storage device, and the like, without limitation.
It is noted that the memory 304 may exist separately from the processor 301 or may be integrated with the processor 301. The memory 304 may be used for storing instructions or program code or some data or the like. The memory 304 may be located inside the electronic device 300 or outside the electronic device 300, and the embodiment of the present application is not limited thereto. The processor 301 is configured to execute the instructions stored in the memory 304 to implement the rendering method of the route provided in the embodiment of the present application.
In one example, the processor 301 may include one or more CPUs, such as CPU0 and CPU1 in fig. 10.
As an alternative implementation, the electronic device 300 may include a plurality of processors; for example, a processor 305 may be included in addition to the processor 301 in fig. 10.
It should be noted that the electronic device may be a desktop computer, a portable computer, a network server, a mobile phone, a tablet computer, a wireless terminal, an embedded device, a chip system, or a device with a similar structure as in fig. 10. Further, the constituent structure shown in fig. 10 does not constitute a limitation of the electronic apparatus, and the electronic apparatus may include more or less components than those shown in fig. 10, or combine some components, or a different arrangement of components, in addition to the components shown in fig. 10.
In the embodiment of the present application, the chip system may be composed of a chip, and may also include a chip and other discrete devices.
In addition, acts, terms, and the like referred to between the embodiments of the present application may be mutually referenced and are not limited. In the embodiment of the present application, the name of the message exchanged between the devices or the name of the parameter in the message, etc. are only an example, and other names may also be used in the specific implementation, which is not limited.
In actual implementation, the obtaining module 201, the processing module 202, the coordinate conversion module 203, and the rendering module 204 may be implemented by the processor 301 shown in fig. 10 calling the program code in the memory 304. For the specific implementation process, reference may be made to the descriptions of the route rendering method in fig. 1, fig. 3, fig. 5, and fig. 6, which are not repeated here.
The embodiment of the application also provides a computer-readable storage medium. All or part of the processes in the above method embodiments may be performed by relevant hardware instructed by a computer program, which may be stored in the computer-readable storage medium; when executed, the program may include the processes of the above method embodiments. The computer-readable storage medium may be an internal storage unit of the route rendering device (including the data sending end and/or the data receiving end) of any of the foregoing embodiments, such as a hard disk or memory of the route rendering device. The computer-readable storage medium may also be an external storage device of the route rendering device, such as a plug-in hard disk, a smart media card (SMC), a secure digital (SD) card, or a flash card provided on the route rendering device. Further, the computer-readable storage medium may include both an internal storage unit and an external storage device of the route rendering device. The computer-readable storage medium stores the computer program and other programs and data required by the terminal, and may also be used to temporarily store data that has been output or is to be output.
Through the above description of the embodiments, it is clear to those skilled in the art that, for convenience and simplicity of description, the foregoing division of the functional modules is merely used as an example, and in practical applications, the above function distribution may be completed by different functional modules according to needs, that is, the internal structure of the device may be divided into different functional modules to complete all or part of the above described functions.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described device embodiments are merely illustrative, and for example, the division of the modules or units is only one logical functional division, and there may be other divisions when actually implemented, for example, a plurality of units or components may be combined or may be integrated into another device, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may be one physical unit or a plurality of physical units, that is, may be located in one place, or may be distributed in a plurality of different places. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a readable storage medium. Based on such understanding, the technical solutions of the embodiments of the present application may be essentially or partially contributed to by the prior art, or all or part of the technical solutions may be embodied in the form of a software product, where the software product is stored in a storage medium and includes several instructions to enable a device (which may be a single chip, a chip, or the like) or a processor (processor) to execute all or part of the steps of the methods described in the embodiments of the present application. And the aforementioned storage medium includes: various media capable of storing program codes, such as a U disk, a removable hard disk, a ROM, a RAM, a magnetic disk, or an optical disk.
The above description is only an embodiment of the present application, but the scope of the present application is not limited thereto, and any changes or substitutions within the technical scope of the present disclosure should be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (14)

1. A rendering method of a route, characterized in that the method is applied to a route rendering device; the method comprises the following steps:
acquiring data of a target air route; the data includes a start location, an end location, and intermediate parameters including at least one of a maximum altitude, a minimum altitude, and a unit distance traveled;
determining a driving track of the target route according to the data;
converting a coordinate system of each position in the data to obtain the position of each position in a spherical coordinate system;
rendering the running track according to the position in the spherical coordinate system to obtain a rendered track.
2. The rendering method of claim 1, wherein prior to the determining the travel trajectory of the target route from the data, the rendering method further comprises:
dividing the data into a plurality of data groups; each data set of the plurality of data sets comprises a start position, an end position and an intermediate parameter, wherein the intermediate parameter comprises at least one of a maximum height, a minimum height and a unit travel distance; the time corresponding to different data sets is different;
and determining the sub-track of the target route according to the data group, and obtaining the driving track of the target route according to the sub-track.
3. The rendering method of claim 1, wherein determining the travel trajectory of the target route from the data comprises:
according to the data, constructing a functional relation corresponding to the target route;
and determining the driving track of the target route according to the functional relation.
4. The rendering method of claim 1, wherein the route rendering device comprises a shader; and rendering the driving track according to the position in the spherical coordinate system to obtain a rendered track comprises:
and drawing the position in the spherical coordinate system according to the shader to obtain a rendered track.
5. The rendering method according to claim 4, wherein the rendering method further comprises:
modifying the attribute information of the rendered track according to a preset period so as to enable the rendered track to have different display modes at different moments; the attribute information includes at least one of color, line width, and transparency.
6. The rendering method according to claim 5, wherein the rendering method further comprises:
calling an animation model, and assigning a position in the spherical coordinate system to the animation model;
and adjusting the attitude angle of the animation model according to the data to obtain a rendered dynamic track.
7. A route rendering apparatus, characterized by comprising:
the acquisition module is used for acquiring data of a target route; the data includes a start location, an end location, and intermediate parameters including at least one of a maximum altitude, a minimum altitude, and a unit distance traveled;
the processing module is used for determining the driving track of the target air route according to the data;
the coordinate conversion module is used for carrying out coordinate system conversion on each position in the data to obtain the position of each position in a spherical coordinate system;
and the rendering module is used for rendering the driving track according to the position in the spherical coordinate system to obtain the rendered track.
8. The rendering apparatus of claim 7, wherein the processing module is further configured to,
dividing the data into a plurality of data groups; each data set of the plurality of data sets comprises a start position, an end position and an intermediate parameter, wherein the intermediate parameter comprises at least one of a maximum height, a minimum height and a unit travel distance; the time periods corresponding to different data sets are different;
and determining the sub-track of the target route according to the data group, and obtaining the driving track of the target route according to the sub-track.
9. The rendering apparatus according to claim 7, characterized in that the processing module is specifically configured to,
according to the data, constructing a functional relation corresponding to the target route;
and determining the driving track of the target route according to the functional relation.
10. The rendering apparatus of claim 7, wherein the route rendering apparatus further comprises a shader; the rendering module is specifically configured to draw a position in the spherical coordinate system according to the shader, so as to obtain a rendered track.
11. The rendering apparatus of claim 10, wherein the rendering module is further configured to,
modifying the attribute information of the rendered track according to a preset period so as to enable the rendered track to have different display modes at different moments; the attribute information includes at least one of color, line width, and transparency.
12. The rendering apparatus of claim 11, wherein the rendering module is further configured to,
calling an animation model, and assigning a position in the spherical coordinate system to the animation model;
and adjusting the attitude angle of the animation model according to the data to obtain a rendered dynamic track.
13. An electronic device, comprising: one or more processors, and memory; the processor and the memory are coupled; the memory for storing computer program code, the computer program code comprising computer executable instructions;
the computer executable instructions, when executed by the processor, cause the electronic device to perform a method of en route rendering as claimed in any one of claims 1 to 6.
14. A computer-readable storage medium, characterized in that it stores computer instructions or a program which, when run on a computer, causes the computer to perform the method of en-route rendering of any one of claims 1-6.
CN202110088010.5A 2021-01-22 2021-01-22 Rendering method and device of air route and storage medium Pending CN112802159A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110088010.5A CN112802159A (en) 2021-01-22 2021-01-22 Rendering method and device of air route and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110088010.5A CN112802159A (en) 2021-01-22 2021-01-22 Rendering method and device of air route and storage medium

Publications (1)

Publication Number Publication Date
CN112802159A true CN112802159A (en) 2021-05-14

Family

ID=75811219

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110088010.5A Pending CN112802159A (en) 2021-01-22 2021-01-22 Rendering method and device of air route and storage medium

Country Status (1)

Country Link
CN (1) CN112802159A (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113268688A (en) * 2021-05-26 2021-08-17 海南太美航空股份有限公司 Rendering method and system for cross route
CN113267192A (en) * 2021-05-26 2021-08-17 海南太美航空股份有限公司 Method and system for improving cross route rendering efficiency
CN113268688B (en) * 2021-05-26 2023-09-05 海南太美航空股份有限公司 Rendering method and system for cross airlines
CN113505164A (en) * 2021-09-13 2021-10-15 中航信移动科技有限公司 Travel track drawing method and device, computer equipment and storage medium

Similar Documents

Publication Publication Date Title
CN112802159A (en) Rendering method and device of air route and storage medium
US11920934B2 (en) Path planning using sparse volumetric data
US7352292B2 (en) Real-time, three-dimensional synthetic vision display of sensor-validated terrain data
CN105333883B (en) A kind of guidance path track display method and device for head up display
JP7201651B2 (en) Navigation application programming interface
CN110869981A (en) Vector data encoding of high definition map data for autonomous vehicles
JP2004213663A (en) Navigation system
EP3754469A1 (en) 3d structure engine-based computation platform
JP2021508856A (en) Systems and methods for identifying grids of geographical areas on maps
CN113535169A (en) Scene rendering method, device and equipment and readable storage medium
CN113014824A (en) Video picture processing method and device and electronic equipment
KR20140049100A (en) Reality display system of air inteligence and method thereof
WO2023138199A1 (en) Route drawing method and apparatus, computer device, storage medium and program product
CN112102497A (en) System and method for attaching applications and interactions to static objects
CN115511701A (en) Method and device for converting geographic information
Stødle et al. High-performance visualisation of UAV sensor and image data with raster maps and topography in 3D
CN114463506A (en) Map element display method based on three-dimensional drawing protocol and map engine
KR101967587B1 (en) Method and apparadus for generating surveying data using task screen providing overlapped layers of construction map data, map data from external map service, and public map data
CN113888709A (en) Electronic sand table generation method and device and non-transient storage medium
KR102012362B1 (en) Method and apparatus for generating digital moving map for safe navigation of unmanned aerial vehicle
KR102547711B1 (en) Method and system for generating realtime chain corresponding to object including arc or circle included in cad data
CN114116930A (en) WebGIS-based motion model implementation method and device, intelligent terminal and storage medium
KR101955377B1 (en) system for generating 3 dimensional electronic nautical chart capable of expressing triangulated area object
CN118097079A (en) Camera roaming path processing method, device, equipment and medium
Andersson et al. Interactive Visualization of Air Traffic in OpenSpace

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination