CN111569414A - Flight display method and device of virtual aircraft, electronic equipment and storage medium - Google Patents

Flight display method and device of virtual aircraft, electronic equipment and storage medium

Info

Publication number
CN111569414A
Authority
CN
China
Prior art keywords
virtual aircraft
pose data
current pose
landing
equipment
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010514623.6A
Other languages
Chinese (zh)
Other versions
CN111569414B (en)
Inventor
揭志伟
李炳泽
武明飞
符修源
陈凯彬
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang Shangtang Technology Development Co Ltd
Original Assignee
Zhejiang Shangtang Technology Development Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang Shangtang Technology Development Co Ltd filed Critical Zhejiang Shangtang Technology Development Co Ltd
Priority to CN202010514623.6A priority Critical patent/CN111569414B/en
Publication of CN111569414A publication Critical patent/CN111569414A/en
Application granted granted Critical
Publication of CN111569414B publication Critical patent/CN111569414B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F13/50 Controlling the output signals based on the game progress
    • A63F13/52 Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A63F13/80 Special adaptations for executing a specific game genre or game mode
    • A63F13/803 Driving vehicles or craft, e.g. cars, airplanes, ships, robots or tanks
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/60 Methods for processing data by generating or executing the game program
    • A63F2300/6045 Methods for processing data by generating or executing the game program for mapping control signals received from the input arrangement into game commands
    • A63F2300/80 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
    • A63F2300/8017 Driving on land or water; Flying
    • A63F2300/8082 Virtual reality

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Processing Or Creating Images (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present disclosure provides a flight display method and apparatus for a virtual aircraft, an electronic device, and a storage medium. The flight display method includes: acquiring, in real time, a real scene image captured by an AR device of a target real scene; determining current pose data of the AR device in the target real scene based on the real scene image acquired in real time, and controlling, based on the current pose data, the AR device to display an AR scene image matching the current pose data, where the AR scene image includes a virtual aircraft controlled by the AR device; in response to a landing request for the virtual aircraft, determining a landing position in the target real scene for the virtual aircraft based on the current pose data of the virtual aircraft in the AR scene image; and controlling the AR device to display an AR special effect of the virtual aircraft during the landing process according to the current pose data of the AR device, the current pose data of the virtual aircraft, and the landing position.

Description

Flight display method and device of virtual aircraft, electronic equipment and storage medium
Technical Field
The present disclosure relates to the field of augmented reality technologies, and in particular, to a flight display method and apparatus for a virtual aircraft, an electronic device, and a storage medium.
Background
Augmented Reality (AR) technology blends virtual information with the real world. It draws on techniques such as multimedia, three-dimensional modeling, and real-time tracking, and is currently applied in a wide variety of scenarios.
Taking a game scenario such as a flying game as an example, when a user controls a virtual aircraft through AR technology, how to present more realistic display content via AR is a problem to be solved.
Disclosure of Invention
The embodiment of the disclosure at least provides a flight display scheme of a virtual aircraft.
In a first aspect, an embodiment of the present disclosure provides a flight display method for a virtual aircraft, including:
acquiring, in real time, a real scene image captured by an AR device of a target real scene;
determining current pose data of the AR device in the target real scene based on the real scene image acquired in real time, and controlling, based on the current pose data, the AR device to display an AR scene image matching the current pose data, where the AR scene image includes a virtual aircraft controlled by the AR device;
in response to a landing request for the virtual aircraft, determining a landing position in the target real scene for the virtual aircraft based on current pose data of the virtual aircraft in the AR scene image;
and controlling the AR device to display an AR special effect of the virtual aircraft during the landing process according to the current pose data of the AR device, the current pose data of the virtual aircraft, and the landing position.
In the embodiments of the present disclosure, when a landing request for a controlled virtual aircraft is received from an AR device, a corresponding landing position can be determined for the virtual aircraft based on its current pose data; the AR device can then be controlled to display an AR special effect of the virtual aircraft during the landing process according to the current pose data of the AR device, the current pose data of the virtual aircraft, and the corresponding landing position.
In one possible embodiment, determining the landing position in the target real scene for the virtual aircraft based on the current pose data of the virtual aircraft in the AR scene image includes:
determining the landing position in the target real scene for the virtual aircraft based on the current pose data of the virtual aircraft in the AR scene image and the current pose data of the AR device.
In one possible embodiment, determining the landing position in the target real scene for the virtual aircraft based on the current pose data of the virtual aircraft in the AR scene image includes:
determining the landing position in the target real scene for the virtual aircraft based on the current pose data of the virtual aircraft in the AR scene image, the current pose data of the AR device, and control data for the virtual aircraft sent by the AR device.
In one possible embodiment, controlling the AR device to display the AR special effect of the virtual aircraft during the landing process according to the current pose data of the AR device, the current pose data of the virtual aircraft, and the landing position includes:
generating landing navigation route data for the virtual aircraft based on the current pose data of the virtual aircraft and the landing position;
and controlling the AR device to display the AR special effect of the virtual aircraft during the landing process based on the pose data of the AR device and the landing navigation route data.
In one possible embodiment, controlling the AR device to display the AR special effect of the virtual aircraft during the landing process based on the pose data of the AR device and the landing navigation route data includes:
displaying, through the AR device, at least one landing navigation route for the virtual aircraft based on the current pose data of the AR device and the landing navigation route data;
and, in response to a landing control operation for the virtual aircraft that includes a target landing navigation route, controlling the AR device to display the AR special effect of the virtual aircraft during the landing process.
In one possible embodiment, the flight display method further includes:
after the virtual aircraft reaches the landing position, in response to a takeoff request for the virtual aircraft, controlling the AR device to display an AR special effect of the virtual aircraft during the takeoff process according to the current pose data of the AR device and the current pose data of the virtual aircraft.
In one possible embodiment, determining the current pose data of the AR device in the target real scene based on the real scene image acquired in real time includes:
determining the current pose data of the AR device in the target real scene based on the real scene image acquired in real time and a pre-established three-dimensional scene model representing the target real scene.
In a second aspect, an embodiment of the present disclosure provides a flight display apparatus for a virtual aircraft, including:
an image acquisition module, configured to acquire, in real time, a real scene image captured by an AR device of a target real scene;
a first determining module, configured to determine current pose data of the AR device in the target real scene based on the real scene image acquired in real time, and to control, based on the current pose data, the AR device to display an AR scene image matching the current pose data, where the AR scene image includes a virtual aircraft controlled by the AR device;
a second determining module, configured to determine, in response to a landing request for the virtual aircraft, a landing position in the target real scene for the virtual aircraft based on current pose data of the virtual aircraft in the AR scene image;
and a special effect presentation module, configured to control the AR device to display an AR special effect of the virtual aircraft during the landing process according to the current pose data of the AR device, the current pose data of the virtual aircraft, and the landing position.
In one possible embodiment, the second determining module, when determining the landing position in the target real scene for the virtual aircraft based on the current pose data of the virtual aircraft in the AR scene image, is configured to:
determine the landing position in the target real scene for the virtual aircraft based on the current pose data of the virtual aircraft in the AR scene image and the current pose data of the AR device.
In one possible embodiment, the second determining module, when determining the landing position in the target real scene for the virtual aircraft based on the current pose data of the virtual aircraft in the AR scene image, is configured to:
determine the landing position in the target real scene for the virtual aircraft based on the current pose data of the virtual aircraft in the AR scene image, the current pose data of the AR device, and control data for the virtual aircraft sent by the AR device.
In one possible embodiment, the special effect presentation module, when controlling the AR device to display the AR special effect of the virtual aircraft during the landing process according to the current pose data of the AR device, the current pose data of the virtual aircraft, and the landing position, is configured to:
generate landing navigation route data for the virtual aircraft based on the current pose data of the virtual aircraft and the landing position;
and control the AR device to display the AR special effect of the virtual aircraft during the landing process based on the pose data of the AR device and the landing navigation route data.
In one possible embodiment, the special effect presentation module, when controlling the AR device to display the AR special effect of the virtual aircraft during the landing process based on the pose data of the AR device and the landing navigation route data, is configured to:
display, through the AR device, at least one landing navigation route for the virtual aircraft based on the current pose data of the AR device and the landing navigation route data;
and, in response to a landing control operation for the virtual aircraft that includes a target landing navigation route, control the AR device to display the AR special effect of the virtual aircraft during the landing process.
In one possible implementation, the special effect presentation module is further configured to:
after the virtual aircraft reaches the landing position, in response to a takeoff request for the virtual aircraft, control the AR device to display an AR special effect of the virtual aircraft during the takeoff process according to the current pose data of the AR device and the current pose data of the virtual aircraft.
In one possible implementation, the first determining module, when determining the current pose data of the AR device in the target real scene based on the real scene image acquired in real time, is configured to:
determine the current pose data of the AR device in the target real scene based on the real scene image acquired in real time and a pre-established three-dimensional scene model representing the target real scene.
In a third aspect, an embodiment of the present disclosure provides an electronic device, including: a processor, a memory, and a bus. The memory stores machine-readable instructions executable by the processor; when the electronic device runs, the processor and the memory communicate over the bus, and the machine-readable instructions, when executed by the processor, perform the steps of the flight display method according to the first aspect.
In a fourth aspect, an embodiment of the present disclosure provides a computer-readable storage medium having a computer program stored thereon which, when executed by a processor, performs the steps of the flight display method according to the first aspect.
In order to make the aforementioned objects, features and advantages of the present disclosure more comprehensible, preferred embodiments accompanied with figures are described in detail below.
Drawings
To more clearly illustrate the technical solutions of the embodiments of the present disclosure, the drawings required by the embodiments are briefly described below. The drawings here are incorporated into and form part of the specification; they illustrate embodiments consistent with the present disclosure and, together with the description, serve to explain the technical solutions of the present disclosure. It should be understood that the following drawings depict only certain embodiments of the disclosure and are therefore not to be considered limiting of its scope; those of ordinary skill in the art can derive other related drawings from them without inventive effort.
Fig. 1 shows a flowchart of a flight display method of a virtual aircraft provided by an embodiment of the present disclosure;
Fig. 2 shows a flowchart of a method for controlling an AR device to display the landing process of a virtual aircraft provided by an embodiment of the present disclosure;
Fig. 3 shows a flowchart of a specific method for controlling an AR device to display the landing process of a virtual aircraft provided by an embodiment of the present disclosure;
Fig. 4 shows a schematic diagram of a flight display apparatus of a virtual aircraft provided by an embodiment of the present disclosure;
Fig. 5 shows a schematic diagram of an electronic device provided by an embodiment of the present disclosure.
Detailed Description
To make the objects, technical solutions, and advantages of the embodiments of the present disclosure clearer, the technical solutions in the embodiments are described below clearly and completely with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present disclosure. The components of the embodiments, as generally described and illustrated in the figures here, can be arranged and designed in a wide variety of configurations. Therefore, the following detailed description of the embodiments provided in the drawings is not intended to limit the scope of the claimed disclosure but merely represents selected embodiments. All other embodiments obtained by those skilled in the art based on the embodiments of the present disclosure without creative effort fall within the protection scope of the present disclosure.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, it need not be further defined and explained in subsequent figures.
The term "and/or" herein merely describes an associative relationship, meaning that three relationships may exist, e.g., a and/or B, may mean: a exists alone, A and B exist simultaneously, and B exists alone. In addition, the term "at least one" herein means any one of a plurality or any combination of at least two of a plurality, for example, including at least one of A, B, C, and may mean including any one or more elements selected from the group consisting of A, B and C.
At present, AR technology is applied in relatively few interactive-game scenarios. In an airship battle game, for example, a user can control a matched virtual airship through the AR device they carry. In such application scenarios, how to make the display effect more vivid through AR technology, for example when the virtual airship lands, is the technical problem discussed in this disclosure.
Based on this, the present disclosure provides a flight display method for a virtual aircraft: when a landing request for a controlled virtual aircraft is received from an AR device, a corresponding landing position is determined for the virtual aircraft based on its current pose data, and the AR device is then controlled to display an AR special effect of the virtual aircraft during the landing process according to the current pose data of the AR device, the current pose data of the virtual aircraft, and the corresponding landing position.
To facilitate understanding of the present embodiments, the flight display method of a virtual aircraft disclosed in the embodiments of the present disclosure is first described in detail. The execution subject of this method is generally a computer device with certain computing capability, for example a server or another processing device. In some possible implementations, the flight display method may be implemented by a processor calling computer-readable instructions stored in a memory.
The embodiments of the present disclosure are described below taking a server as the execution subject. Referring to fig. 1, a flowchart of a flight display method of a virtual aircraft provided by an embodiment of the present disclosure, the method includes steps S101 to S104:
and S101, acquiring a real scene image shot by the AR equipment aiming at a target real scene in real time.
Illustratively, the AR device may include AR glasses, a smartphone, a tablet, a wearable device, and the like, without limitation.
S102, determining current pose data of the AR device in the target real scene based on the real scene image acquired in real time, and controlling, based on the current pose data, the AR device to display an AR scene image matching the current pose data, where the AR scene image includes a virtual aircraft controlled by the AR device.
For example, the pose data of the AR device in the target reality scene may include a position and/or a display angle of a display component for displaying the virtual aircraft when the AR device is held or worn by the user.
The display component of the AR device refers to the component used to display the virtual aircraft. When the AR device is a mobile phone or a tablet, the display component is the display screen; when the AR device is a pair of AR glasses, the display component is the lens used for displaying the virtual aircraft.
The real scene image is the image obtained when the AR device captures the target real scene, so the pose data of the AR device in the target real scene can be determined from the real scene image; how this is done is described in detail later.
For example, after the pose data of the AR device is determined, the server may control the AR device to display an AR scene image matching the pose data according to the pose data of the AR device, for example, a virtual aircraft superimposed in a real scene may be displayed in the AR scene image.
S103, responding to the landing request aiming at the virtual aircraft, and determining the landing position in the target reality scene corresponding to the virtual aircraft based on the current pose data of the virtual aircraft in the AR scene image.
For example, when the virtual aircraft controlled by the AR device is in flight, it can be commanded to land through the AR device: the user sends a landing request for the matched virtual aircraft to the server through the carried AR device, thereby controlling the virtual aircraft to land.
For example, after detecting the landing request, the server may determine the landing position of the virtual aircraft in the target real scene based on the current pose data of the virtual aircraft in the AR scene image. The landing position may be a pre-planned parking place for the virtual aircraft, or any place in the target real scene that can bear it, such as a building roof or open ground.
For example, the current pose data of the virtual aircraft in the AR scene image may be determined according to the control data sent by the AR device controlling the virtual aircraft.
S104, controlling the AR device to display the AR special effect of the virtual aircraft during the landing process according to the current pose data of the AR device, the current pose data of the virtual aircraft, and the landing position.
For example, in one application scenario, multiple AR devices may select the same AR experience package to enter a multi-user interactive AR scene. For a game-type AR scene, each AR device may enter the same AR room to experience the AR scene by logging in to an AR game account, and the AR devices entering the same AR room may serve as an associated AR device group.
Different AR devices with different current pose data therefore see different viewing angles, and the same AR device sees a different viewing angle whenever its pose data changes. Accordingly, when the AR special effect of the virtual aircraft's landing process is generated for an AR device, it must be generated in combination with that device's current pose data, as the sketch below illustrates.
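To make this dependence on device pose concrete, the following is a minimal sketch (not taken from the disclosure) of how a server might transform the virtual aircraft's world-space position into each AR device's camera frame, so that every device in the same AR room renders the effect from its own viewing angle; the poses and coordinates are hypothetical:

```python
import numpy as np

def world_to_device(point_world, device_pose):
    """Transform a world-space point into an AR device's camera frame.

    device_pose is (R, t): R is a 3x3 rotation matrix and t a 3-vector
    giving the device's current orientation and position in the world.
    """
    R, t = device_pose
    # Inverse rigid transform: p_cam = R^T @ (p_world - t)
    return R.T @ (np.asarray(point_world, float) - np.asarray(t, float))

# Two devices in the same AR room observe the same virtual aircraft but
# obtain different camera-frame coordinates, hence different viewing angles.
aircraft_world = np.array([10.0, 2.0, 30.0])
device_a = (np.eye(3), np.array([0.0, 1.7, 0.0]))
device_b = (np.eye(3), np.array([5.0, 1.7, 10.0]))
print(world_to_device(aircraft_world, device_a))  # device A's view of the aircraft
print(world_to_device(aircraft_world, device_b))  # device B sees it elsewhere
```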
In the embodiments of the present disclosure, when a landing request for a controlled virtual aircraft is received from an AR device, a corresponding landing position can be determined for the virtual aircraft based on its current pose data; the AR device can then be controlled to display an AR special effect of the virtual aircraft during the landing process according to the current pose data of the AR device, the current pose data of the virtual aircraft, and the corresponding landing position.
The above-mentioned S101 to S104 will be explained with reference to specific embodiments.
For the above S102, when determining the current pose data of the AR device in the target real scene based on the real scene image acquired in real time, the method may include:
and determining the current pose data of the AR equipment in the target real scene based on the real scene image acquired in real time and a pre-established three-dimensional scene model for representing the target real scene.
For example, the three-dimensional scene model may be constructed from a number of sample images of the target real scene captured in advance. Specifically, during construction, feature points are extracted from each sample image to build the model. After the three-dimensional scene model is generated, the sample image corresponding to each feature point in the model, together with the capture pose of that sample image in the model, may be stored. Thus, after the real scene image captured by the AR device for the target real scene is obtained, feature points can be extracted from the real scene image, the sample image matching the real scene image can be determined from the extracted feature points, and the pose data of the AR device in the three-dimensional scene model can finally be obtained.
Because the three-dimensional scene model is a model representing the target real scene, the pose data of the AR device in the three-dimensional scene model can be used as the pose data of the AR device in the target real scene.
The three-dimensional scene model can be constructed in advance from a plurality of real scene images captured of the target real scene. After construction, the model can be corrected against a real two-dimensional map of the target real scene, yielding a three-dimensional scene model that represents the target real scene with high accuracy.
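As an illustration of the localization step described above, here is a minimal sketch assuming an OpenCV-style feature pipeline (the disclosure does not name a specific one); model_points_3d and model_descriptors are hypothetical stores saved when the three-dimensional scene model was built:

```python
import cv2
import numpy as np

def estimate_device_pose(frame, model_points_3d, model_descriptors, camera_matrix):
    """Estimate the AR device pose by matching a live real-scene frame
    against feature descriptors stored with the 3D scene model."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY) if frame.ndim == 3 else frame
    orb = cv2.ORB_create()
    keypoints, descriptors = orb.detectAndCompute(gray, None)
    if descriptors is None:
        return None

    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(descriptors, model_descriptors)
    if len(matches) < 6:
        return None  # too few correspondences for a stable pose estimate

    image_pts = np.float32([keypoints[m.queryIdx].pt for m in matches])
    object_pts = np.float32([model_points_3d[m.trainIdx] for m in matches])

    # PnP gives the camera pose in model coordinates; since the model
    # represents the target real scene, this is also the device's pose
    # in the target real scene.
    ok, rvec, tvec, _ = cv2.solvePnPRansac(object_pts, image_pts,
                                           camera_matrix, None)
    return (rvec, tvec) if ok else None
```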
In one possible implementation, for S103, determining the landing position in the target real scene for the virtual aircraft based on the current pose data of the virtual aircraft in the AR scene image may include:
determining the landing position in the target real scene for the virtual aircraft based on the current pose data of the virtual aircraft in the AR scene image and the current pose data of the AR device.
For example, the AR scene image currently displayed by the AR device may be determined from the device's current pose data; candidate landing positions suitable for the virtual aircraft, such as building roofs or open ground, may then be identified in that image; and the landing position may be chosen based on the virtual aircraft's current pose data, for example by selecting the closest candidate.
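A minimal sketch of the closest-candidate rule just mentioned; the positions are assumed to be 3D coordinates in the scene model's frame:

```python
import math

def choose_landing_position(aircraft_pos, candidate_positions):
    """Pick the candidate landing position (e.g. a building roof or open
    ground) closest to the virtual aircraft's current position."""
    return min(candidate_positions,
               key=lambda c: math.dist(aircraft_pos, c))  # Euclidean distance

# Example: the middle candidate is nearest to the aircraft.
print(choose_landing_position((0, 50, 0), [(40, 0, 40), (10, 0, 5), (90, 0, 9)]))
```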
In another possible implementation, for S103, determining the landing position in the target real scene for the virtual aircraft based on the current pose data of the virtual aircraft in the AR scene image may include:
determining the landing position in the target real scene for the virtual aircraft based on the current pose data of the virtual aircraft in the AR scene image, the current pose data of the AR device, and the control data for the virtual aircraft sent by the AR device.
For example, the control data for the virtual aircraft sent by the AR device may include its flight speed and flight direction. Using these control data together with the current pose data of the AR device and the current pose data of the virtual aircraft, the landing position for the virtual aircraft can be determined jointly.
Similarly, the AR scene image currently displayed by the AR device may be determined from the device's current pose data, and candidate landing positions for the virtual aircraft may be identified in that image. An optimal landing position may then be selected from the candidates based on the current pose data of the virtual aircraft in the AR scene image and the control data for the virtual aircraft sent by the AR device. For example, when the flight speed is high, a candidate farther from the virtual aircraft's current position may be chosen, considering landing safety and the time needed to decelerate. The flight direction may also be taken into account: if the virtual aircraft is flying toward the east of the target real scene, a candidate landing position located to the east in the AR scene content presented on the AR device may be selected as the landing position.
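One way to realize this selection is to score each candidate against the control data. The sketch below is illustrative only; the weights and the distance-proportional-to-speed heuristic are assumptions rather than part of the disclosure:

```python
import numpy as np

def choose_landing_position_with_control(aircraft_pos, velocity, candidates,
                                         secs_to_stop=10.0, braking_weight=0.05):
    """Score candidates using the virtual aircraft's control data: prefer
    positions roughly along the flight direction, and at a distance the
    aircraft can comfortably decelerate over (farther when flying fast)."""
    pos = np.asarray(aircraft_pos, float)
    vel = np.asarray(velocity, float)
    speed = np.linalg.norm(vel)
    heading = vel / speed if speed > 0 else np.zeros_like(vel)

    def score(candidate):
        offset = np.asarray(candidate, float) - pos
        distance = np.linalg.norm(offset)
        direction = offset / distance if distance > 0 else heading
        alignment = float(np.dot(heading, direction))     # 1.0 = straight ahead
        braking = abs(distance - secs_to_stop * speed)    # ideal stopping distance
        return alignment - braking_weight * braking

    return max(candidates, key=score)
```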
For the above S104, when the AR device is controlled to show the AR special effect of the virtual aircraft in the landing process according to the current pose data of the AR device, the current pose data of the virtual aircraft, and the landing position, as shown in fig. 2, the following S1041 to S1042 may be included:
S1041, generating landing navigation route data for the virtual aircraft based on the current pose data of the virtual aircraft and the landing position;
S1042, controlling the AR device to display the AR special effect of the virtual aircraft during the landing process based on the pose data of the AR device and the landing navigation route data.
Illustratively, at least one piece of landing navigation route data may be generated from the current pose data and landing position of the virtual aircraft, taking into account obstacles, such as high-rise buildings, in the AR scene content presented on the AR device.
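A minimal sketch of one possible route generator: sample a straight descent toward the landing position and lift any waypoint that would cut through an obstacle. The axis-aligned obstacle boxes and the y-up convention are assumptions for illustration:

```python
import numpy as np

def landing_route(aircraft_pos, landing_pos, obstacles, steps=20, clearance=2.0):
    """Generate landing navigation route data as a list of waypoints.

    obstacles: list of (min_corner, max_corner) axis-aligned boxes, e.g.
    high-rise buildings in the AR scene content. Waypoints that fall
    inside a box are lifted above it (world is assumed y-up).
    """
    start = np.asarray(aircraft_pos, float)
    end = np.asarray(landing_pos, float)
    route = []
    for i in range(steps + 1):
        p = start + (end - start) * (i / steps)
        for lo, hi in obstacles:
            lo, hi = np.asarray(lo, float), np.asarray(hi, float)
            if np.all(p >= lo) and np.all(p <= hi):
                p = p.copy()
                p[1] = hi[1] + clearance  # fly over the building
        route.append(p)
    return route
```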
Specifically, for S1042, controlling the AR device to display the AR special effect of the virtual aircraft during the landing process based on the pose data of the AR device and the landing navigation route data may include the following S10421 to S10422, as shown in fig. 3:
S10421, displaying, through the AR device, at least one landing navigation route for the virtual aircraft based on the current pose data of the AR device and the landing navigation route data;
S10422, in response to a landing control operation for the virtual aircraft that includes a target landing navigation route, controlling the AR device to display the AR special effect of the virtual aircraft during the landing process.
Here too, the current pose data of the AR device is used to determine the viewing angle displayed by the AR device, with the same effect as described above, which is not repeated here.
Further, the generated landing navigation route data may be sent to the AR device for display, so that the user can select a target landing navigation route on the AR device and then send a landing control operation for the virtual aircraft to the server through the AR device. The landing control operation may include the selected target landing navigation route. After detecting the landing control operation for the virtual aircraft, the server may control the AR device to display the AR special effect of the virtual aircraft during the landing process along the target landing navigation route, as sketched below.
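The server-side handling of that landing control operation might look like the following sketch; the operation payload, the route list, and the per-device play_landing_effect() call are all hypothetical names used for illustration:

```python
def handle_landing_control(operation, candidate_routes, room_devices):
    """Apply a landing control operation that carries the chosen route.

    operation: assumed to be a dict such as {"target_route_index": 1},
    sent by the AR device after the user picks a displayed route.
    room_devices: AR devices in the same AR room; each renders the
    landing special effect from its own current pose data.
    """
    target_route = candidate_routes[operation["target_route_index"]]
    for device in room_devices:
        device.play_landing_effect(target_route)  # hypothetical client call
    return target_route
```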
In one implementation, the flight display method provided in the embodiments of the present disclosure further includes:
and after the virtual aircraft reaches the landing position, responding to a takeoff request aiming at the virtual aircraft, and controlling the AR equipment to display the AR special effect of the virtual aircraft in the takeoff process according to the current pose data of the AR equipment and the current pose data of the virtual aircraft.
After the virtual aircraft reaches the landing position, the AR device can still control it to take off from that position. The user may send a takeoff request for the virtual aircraft to the server through the AR device. After receiving the takeoff request, the server may determine the takeoff direction in the target real scene according to the current pose data of the virtual aircraft, and then control the AR device to display the AR special effect of the virtual aircraft during the takeoff process in combination with the current pose data of the AR device.
The current pose data of the AR device is again used to determine the viewing angle displayed by the AR device, with the same effect as described above, which is not repeated here.
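For the takeoff direction, one simple possibility (again an assumption, not the disclosed method) is to derive it from the virtual aircraft's current pose, blending the world up axis with the craft's forward axis:

```python
import numpy as np

def takeoff_direction(aircraft_pose, up_weight=0.7):
    """Derive a unit takeoff direction from the virtual aircraft's current
    pose (R, t), where R is its 3x3 rotation matrix in a y-up world."""
    R, _ = aircraft_pose
    up = np.array([0.0, 1.0, 0.0])
    forward = R @ np.array([0.0, 0.0, 1.0])  # craft's local forward axis
    direction = up_weight * up + (1.0 - up_weight) * forward
    return direction / np.linalg.norm(direction)
```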
It will be understood by those skilled in the art that, in the methods of the present disclosure, the order in which the steps are written does not imply a strict order of execution or impose any limitation on the implementation; the specific order of execution of the steps should be determined by their functions and possible internal logic.
Based on the same technical concept, an embodiment of the present disclosure further provides a flight display apparatus of a virtual aircraft corresponding to the flight display method described above. Because the principle by which the apparatus solves the problem is similar to that of the flight display method, the implementation of the apparatus may refer to the implementation of the method, and repeated details are not described again.
Referring to fig. 4, a schematic diagram of a flight display apparatus 400 of a virtual aircraft provided by an embodiment of the present disclosure, the flight display apparatus includes:
an image acquisition module 401, configured to acquire, in real time, a real scene image shot by an AR device for a target real scene;
a first determining module 402, configured to determine current pose data of the AR device in a target real scene based on a real scene image acquired in real time, and control the AR device to display an AR scene image matched with the current pose data based on the current pose data, where the AR scene image includes a virtual aircraft controlled by the AR device;
a second determining module 403, configured to determine, in response to a landing request for the virtual aircraft, a landing position in a target reality scene corresponding to the virtual aircraft based on current pose data of the virtual aircraft in the AR scene image;
and the special effect display module 404 is configured to control the AR device to display the AR special effect of the virtual aircraft in the landing process according to the current pose data of the AR device, the current pose data of the virtual aircraft, and the landing position.
In one possible implementation, the second determining module 403, when determining the landing position in the target real scene for the virtual aircraft based on the current pose data of the virtual aircraft in the AR scene image, is configured to:
determine the landing position in the target real scene for the virtual aircraft based on the current pose data of the virtual aircraft in the AR scene image and the current pose data of the AR device.
In one possible implementation, the second determining module 403, when determining the landing position in the target real scene for the virtual aircraft based on the current pose data of the virtual aircraft in the AR scene image, is configured to:
determine the landing position in the target real scene for the virtual aircraft based on the current pose data of the virtual aircraft in the AR scene image, the current pose data of the AR device, and the control data for the virtual aircraft sent by the AR device.
In one possible implementation, the special effect presentation module 404, when controlling the AR device to display the AR special effect of the virtual aircraft during the landing process according to the current pose data of the AR device, the current pose data of the virtual aircraft, and the landing position, is configured to:
generate landing navigation route data for the virtual aircraft based on the current pose data of the virtual aircraft and the landing position;
and control the AR device to display the AR special effect of the virtual aircraft during the landing process based on the pose data of the AR device and the landing navigation route data.
In one possible implementation, the special effect presentation module 404, when controlling the AR device to display the AR special effect of the virtual aircraft during the landing process based on the pose data of the AR device and the landing navigation route data, is configured to:
display, through the AR device, at least one landing navigation route for the virtual aircraft based on the current pose data of the AR device and the landing navigation route data;
and, in response to a landing control operation for the virtual aircraft that includes a target landing navigation route, control the AR device to display the AR special effect of the virtual aircraft during the landing process.
In one possible implementation, the special effect presentation module 404 is further configured to:
after the virtual aircraft reaches the landing position, in response to a takeoff request for the virtual aircraft, control the AR device to display an AR special effect of the virtual aircraft during the takeoff process according to the current pose data of the AR device and the current pose data of the virtual aircraft.
In one possible implementation, the first determining module 402, when determining the current pose data of the AR device in the target real scene based on the real scene image acquired in real time, is configured to:
determine the current pose data of the AR device in the target real scene based on the real scene image acquired in real time and a pre-established three-dimensional scene model representing the target real scene.
The description of the processing flow of each module in the device and the interaction flow between the modules may refer to the related description in the above method embodiments, and will not be described in detail here.
Corresponding to the flight display method of the virtual aircraft in fig. 1, an embodiment of the present disclosure further provides an electronic device 500. As shown in the schematic structural diagram of fig. 5, the electronic device 500 includes:
a processor 51, a memory 52, and a bus 53. The memory 52 is used for storing execution instructions and includes an internal memory 521 and an external memory 522. The internal memory 521 temporarily stores operation data in the processor 51 and data exchanged with the external memory 522, such as a hard disk; the processor 51 exchanges data with the external memory 522 through the internal memory 521. When the electronic device 500 operates, the processor 51 communicates with the memory 52 through the bus 53, causing the processor 51 to execute the following instructions: acquiring, in real time, a real scene image captured by the AR device of a target real scene; determining current pose data of the AR device in the target real scene based on the real scene image acquired in real time, and controlling, based on the current pose data, the AR device to display an AR scene image matching the current pose data, where the AR scene image includes a virtual aircraft controlled by the AR device; in response to a landing request for the virtual aircraft, determining a landing position in the target real scene for the virtual aircraft based on current pose data of the virtual aircraft in the AR scene image; and controlling the AR device to display an AR special effect of the virtual aircraft during the landing process according to the current pose data of the AR device, the current pose data of the virtual aircraft, and the landing position.
The disclosed embodiment also provides a computer readable storage medium, on which a computer program is stored, and when the computer program is executed by a processor, the steps of the flight display method of the virtual aircraft described in the above method embodiment are executed. The storage medium may be a volatile or non-volatile computer-readable storage medium.
The computer program product of the flight display method of a virtual aircraft provided by the embodiments of the present disclosure includes a computer-readable storage medium storing program code. The instructions included in the program code may be used to execute the steps of the flight display method described in the above method embodiments; for details, refer to the above method embodiments, which are not repeated here.
The embodiments of the present disclosure also provide a computer program which, when executed by a processor, implements any one of the methods of the foregoing embodiments. The computer program product may be embodied in hardware, software, or a combination thereof. In an alternative embodiment, the computer program product is embodied as a computer storage medium; in another alternative embodiment, it is embodied as a software product, such as a Software Development Kit (SDK).
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the system and apparatus described above may refer to the corresponding processes in the foregoing method embodiments and are not repeated here. In the several embodiments provided in the present disclosure, it should be understood that the disclosed system, apparatus, and method may be implemented in other ways. The apparatus embodiments described above are merely illustrative; for example, the division of the units is only a logical division, and there may be other divisions in actual implementation; for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted or not executed. In addition, the mutual coupling, direct coupling, or communication connection shown or discussed may be indirect coupling or communication connection through some communication interfaces, apparatuses, or units, and may be electrical, mechanical, or in other forms.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present disclosure may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a non-volatile computer-readable storage medium executable by a processor. Based on such understanding, the technical solution of the present disclosure may be embodied in the form of a software product, which is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present disclosure. And the aforementioned storage medium includes: various media capable of storing program codes, such as a usb disk, a removable hard disk, a Read-only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
Finally, it should be noted that the above-mentioned embodiments are merely specific embodiments of the present disclosure, used to illustrate its technical solutions rather than to limit them, and the protection scope of the present disclosure is not limited thereto. Although the present disclosure is described in detail with reference to the foregoing embodiments, those skilled in the art should understand that any person skilled in the art can, within the technical scope of the present disclosure, modify or readily conceive of changes to the technical solutions described in the foregoing embodiments, or make equivalent substitutions of some of their technical features; such modifications, changes, or substitutions do not depart from the spirit and scope of the embodiments of the present disclosure and should be covered by it. Therefore, the protection scope of the present disclosure shall be subject to the protection scope of the claims.

Claims (10)

1. A flight display method of a virtual aircraft is characterized by comprising the following steps:
acquiring a real scene image shot by the AR equipment aiming at a target real scene in real time;
determining current pose data of the AR equipment in the target real scene based on the real scene image acquired in real time, and controlling the AR equipment to display an AR scene image matched with the current pose data based on the current pose data, wherein the AR scene image comprises a virtual aircraft controlled by the AR equipment;
in response to a landing request for the virtual aircraft, determining a landing position in the target reality scene corresponding to the virtual aircraft based on current pose data of the virtual aircraft in the AR scene image;
and controlling the AR equipment to display the AR special effect of the virtual aircraft in the landing process according to the current pose data of the AR equipment, the current pose data of the virtual aircraft and the landing position.
2. The flight presentation method of claim 1, wherein the determining a landing position in the target reality scene corresponding to the virtual aircraft based on the current pose data of the virtual aircraft in the AR scene image comprises:
and determining a landing position in the target real scene corresponding to the virtual aircraft based on the current pose data of the virtual aircraft in the AR scene image and the current pose data of the AR equipment.
3. The flight presentation method of claim 1, wherein the determining a landing position in the target reality scene corresponding to the virtual aircraft based on the current pose data of the virtual aircraft in the AR scene image comprises:
determining a landing position in the target real scene corresponding to the virtual aircraft based on the current pose data of the virtual aircraft in the AR scene image, the current pose data of the AR device and the control data which is sent by the AR device and aims at the virtual aircraft.
4. The flight display method according to any one of claims 1 to 3, wherein the controlling the AR device to display the AR special effect of the virtual aircraft in the landing process according to the current pose data of the AR device, the current pose data of the virtual aircraft, and the landing position comprises:
generating landing navigation route data for the virtual aircraft based on the current pose data of the virtual aircraft and the landing position;
and controlling the AR equipment to display the AR special effect of the virtual aircraft in the landing process based on the pose data of the AR equipment and the landing navigation route data.
5. The flight presentation method of claim 4, wherein the controlling the AR device to present the AR special effect of the virtual aircraft landing process based on the pose data of the AR device and the landing navigation route data comprises:
displaying, by an AR device, at least one landing navigation route for the virtual aircraft based on current pose data of the AR device and the landing navigation route data;
in response to a landing control operation for the virtual aircraft that includes a target landing navigation route, controlling the AR apparatus to show an AR special effect of the virtual aircraft landing process.
6. The flight display method according to any one of claims 1 to 5, further comprising:
and after the virtual aircraft reaches the landing position, responding to a takeoff request aiming at the virtual aircraft, and controlling the AR equipment to display the AR special effect of the virtual aircraft in the takeoff process according to the current pose data of the AR equipment and the current pose data of the virtual aircraft.
7. The flight presentation method of any one of claims 1 to 6, wherein the determining current pose data of the AR device in the target real scene based on the real-time acquired real-scene images comprises:
and determining the current pose data of the AR equipment in the target real scene based on the real scene image acquired in real time and a pre-established three-dimensional scene model for representing the target real scene.
8. A flight display device for a virtual aircraft, comprising:
the image acquisition module is used for acquiring a real scene image shot by the AR equipment aiming at a target real scene in real time;
the first determining module is used for determining current pose data of the AR equipment in the target real scene based on the real scene image acquired in real time, and controlling the AR equipment to display an AR scene image matched with the current pose data based on the current pose data, wherein the AR scene image comprises a virtual aircraft controlled by the AR equipment;
a second determination module, configured to determine, in response to a landing request for the virtual aircraft, a landing position in the target reality scene corresponding to the virtual aircraft based on current pose data of the virtual aircraft in the AR scene image;
and the special effect display module is used for controlling the AR equipment to display the AR special effect of the virtual aircraft in the landing process according to the current pose data of the AR equipment, the current pose data of the virtual aircraft and the landing position.
9. An electronic device, comprising: a processor, a memory and a bus, the memory storing machine-readable instructions executable by the processor, the processor and the memory communicating via the bus when the electronic device is operating, the machine-readable instructions when executed by the processor performing the steps of the flight display method according to any one of claims 1 to 7.
10. A computer-readable storage medium, characterized in that a computer program is stored on the computer-readable storage medium, which computer program, when being executed by a processor, performs the steps of the flight display method according to any one of claims 1 to 7.
CN202010514623.6A 2020-06-08 2020-06-08 Flight display method and device of virtual aircraft, electronic equipment and storage medium Active CN111569414B (en)

Priority Applications (1)

Application Number: CN202010514623.6A (granted as CN111569414B); Priority Date: 2020-06-08; Filing Date: 2020-06-08; Title: Flight display method and device of virtual aircraft, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number: CN202010514623.6A (granted as CN111569414B); Priority Date: 2020-06-08; Filing Date: 2020-06-08; Title: Flight display method and device of virtual aircraft, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN111569414A (en) 2020-08-25
CN111569414B CN111569414B (en) 2024-03-29

Family

ID=72110628

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010514623.6A Active CN111569414B (en) 2020-06-08 2020-06-08 Flight display method and device of virtual aircraft, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN111569414B (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112148125A (en) * 2020-09-23 2020-12-29 北京市商汤科技开发有限公司 AR interaction state control method, device, equipment and storage medium
CN112927293A (en) * 2021-03-26 2021-06-08 深圳市慧鲤科技有限公司 AR scene display method and device, electronic equipment and storage medium
CN114584681A (en) * 2020-11-30 2022-06-03 北京市商汤科技开发有限公司 Target object motion display method and device, electronic equipment and storage medium

Patent Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105031919A (en) * 2010-03-30 2015-11-11 索尼电脑娱乐美国公司 Method for an augmented reality character to maintain and exhibit awareness of an observer
US20120089274A1 (en) * 2010-10-06 2012-04-12 Hon Hai Precision Industry Co., Ltd. Electronic device and method for controlling unmanned aerial vehicle
RU130963U1 (ru) * 2012-11-20 2013-08-10 AVIAREAL LLC Aerial reality system for displaying information for aircraft control
CN106408515A (en) * 2016-08-31 2017-02-15 郑州捷安高科股份有限公司 Augmented reality-based vision synthesis system
CN106856566A (en) * 2016-12-16 2017-06-16 Information synchronization method and system based on AR equipment
WO2019090488A1 (en) * 2017-11-07 2019-05-16 深圳市大疆创新科技有限公司 Method for generating simulated route, method and device for flight simulation, and storage medium
CN110378990A (en) * 2019-07-03 2019-10-25 北京悉见科技有限公司 Augmented reality scene shows method, apparatus and storage medium
KR20190110499A (en) * 2019-09-03 2019-09-30 엘지전자 주식회사 Method and apparatus for landing of unmanned aerial vehicle
CN110507994A (en) * 2019-09-05 2019-11-29 Control method, apparatus, device and storage medium for virtual aircraft flight
CN110559653A (en) * 2019-09-12 2019-12-13 Control method, device, terminal and storage medium for a virtual aircraft
CN110585708A (en) * 2019-09-12 2019-12-20 Method, device and readable storage medium for landing from an aircraft in a virtual environment
CN110585711A (en) * 2019-09-12 2019-12-20 Control method, device, terminal and storage medium for a virtual aircraft
CN110716646A (en) * 2019-10-15 2020-01-21 北京市商汤科技开发有限公司 Augmented reality data presentation method, device, equipment and storage medium
CN110738737A (en) * 2019-10-15 2020-01-31 北京市商汤科技开发有限公司 AR scene image processing method and device, electronic equipment and storage medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
杨建业; 戚国庆; 盛安冬: "Design of a UAV autonomous landing simulation system based on visual servoing" (in Chinese), vol. 27, no. 11, pages 9-14 *

Also Published As

Publication number Publication date
CN111569414B (en) 2024-03-29

Similar Documents

Publication Publication Date Title
CN111638793B (en) Display method and device of aircraft, electronic equipment and storage medium
KR102417645B1 (en) AR scene image processing method, device, electronic device and storage medium
CN107820593B (en) Virtual reality interaction method, device and system
CN111880657B (en) Control method and device of virtual object, electronic equipment and storage medium
EP3978089A1 (en) Interactive method and system based on augmented reality device, electronic device, and computer readable medium
US11880956B2 (en) Image processing method and apparatus, and computer storage medium
CN111569414B (en) Flight display method and device of virtual aircraft, electronic equipment and storage medium
CN112148189A (en) Interaction method and device in AR scene, electronic equipment and storage medium
CN107016704A Virtual reality implementation method based on augmented reality
US20150317832A1 (en) World-locked display quality feedback
US20130141434A1 (en) Virtual light in augmented reality
CN111610998A (en) AR scene content generation method, display method, device and storage medium
CN112348968B (en) Display method and device in augmented reality scene, electronic equipment and storage medium
US10974132B2 (en) Systems and methods to provide a shared interactive experience across multiple presentation devices based on detection of one or more extraterrestrial bodies
CN103760972B (en) Cross-platform augmented reality experience
CN111617471A (en) Virtual shooting display method and device, electronic equipment and storage medium
CN111651057A (en) Data display method and device, electronic equipment and storage medium
CN111638797A (en) Display control method and device
CN112882576B (en) AR interaction method and device, electronic equipment and storage medium
CN112905014A (en) Interaction method and device in AR scene, electronic equipment and storage medium
CN112967404A (en) Method and device for controlling movement of virtual object, electronic equipment and storage medium
CN111632377A (en) Shooting track display method and device, electronic equipment and storage medium
US10391408B2 (en) Systems and methods to facilitate user interactions with virtual objects depicted as being present in a real-world space
US10296080B2 (en) Systems and methods to simulate user presence in a real-world three-dimensional space
CN112154440A (en) Simulation method and device for unmanned aerial vehicle and computer readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant