CN111569414B - Flight display method and device of virtual aircraft, electronic equipment and storage medium - Google Patents

Flight display method and device of virtual aircraft, electronic equipment and storage medium

Info

Publication number
CN111569414B
CN111569414B (application CN202010514623.6A)
Authority
CN
China
Prior art keywords
virtual aircraft
pose data
current pose
equipment
landing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010514623.6A
Other languages
Chinese (zh)
Other versions
CN111569414A (en)
Inventor
揭志伟
李炳泽
武明飞
符修源
陈凯彬
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang Shangtang Technology Development Co Ltd
Original Assignee
Zhejiang Shangtang Technology Development Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang Shangtang Technology Development Co Ltd filed Critical Zhejiang Shangtang Technology Development Co Ltd
Priority to CN202010514623.6A priority Critical patent/CN111569414B/en
Publication of CN111569414A publication Critical patent/CN111569414A/en
Application granted granted Critical
Publication of CN111569414B publication Critical patent/CN111569414B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50Controlling the output signals based on the game progress
    • A63F13/52Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/80Special adaptations for executing a specific game genre or game mode
    • A63F13/803Driving vehicles or craft, e.g. cars, airplanes, ships, robots or tanks
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/60Methods for processing data by generating or executing the game program
    • A63F2300/6045Methods for processing data by generating or executing the game program for mapping control signals received from the input arrangement into game commands
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/80Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
    • A63F2300/8017Driving on land or water; Flying
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/80Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
    • A63F2300/8082Virtual reality

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Processing Or Creating Images (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The disclosure provides a flight display method and device for a virtual aircraft, an electronic device, and a storage medium, wherein the flight display method comprises the following steps: acquiring, in real time, real scene images shot by an AR device for a target real scene; determining current pose data of the AR device in the target real scene based on the real scene images acquired in real time, and controlling the AR device to display an AR scene image matched with the current pose data based on that data, wherein the AR scene image comprises a virtual aircraft controlled by the AR device; in response to a landing request for the virtual aircraft, determining a landing position in the target real scene corresponding to the virtual aircraft based on current pose data of the virtual aircraft in the AR scene image; and controlling the AR device to display an AR special effect of the virtual aircraft's landing process according to the current pose data of the AR device, the current pose data of the virtual aircraft, and the landing position.

Description

Flight display method and device of virtual aircraft, electronic equipment and storage medium
Technical Field
The disclosure relates to the technical field of augmented reality, and in particular to a flight display method and device for a virtual aircraft, an electronic device, and a storage medium.
Background
Augmented reality (AR) technology ingeniously integrates virtual information with the real world. It draws on a variety of technical means, such as multimedia, three-dimensional modeling, and real-time tracking, and has already been applied in many scenarios.
Taking a game scene such as a flying game as an example, how to present more realistic display content through AR when a user controls a virtual aircraft with AR technology is a problem to be solved.
Disclosure of Invention
The embodiment of the disclosure at least provides a flight display scheme of a virtual aircraft.
In a first aspect, an embodiment of the present disclosure provides a flight display method of a virtual aircraft, including:
acquiring, in real time, real scene images shot by an AR device for a target real scene;
determining current pose data of the AR equipment in the target real scene based on the real scene image acquired in real time, and controlling the AR equipment to display an AR scene image matched with the current pose data based on the current pose data, wherein the AR scene image comprises a virtual aircraft controlled by the AR equipment;
determining a landing position in the target real scene corresponding to the virtual aircraft based on current pose data of the virtual aircraft in the AR scene image in response to a landing request for the virtual aircraft;
and controlling the AR equipment to display the AR special effect of the virtual aircraft landing process according to the current pose data of the AR equipment, the current pose data of the virtual aircraft and the landing position.
In the embodiment of the disclosure, when a landing request for a controlled virtual aircraft sent by an AR device is received, a corresponding landing position for the virtual aircraft can be determined based on the current pose data of the virtual aircraft. The AR device can then be controlled to display an AR special effect of the virtual aircraft's landing process according to the current pose data of the AR device, the current pose data of the virtual aircraft, and the corresponding landing position.
In a possible implementation manner, the determining, based on current pose data of the virtual aircraft in the AR scene image, a landing position in the target real scene corresponding to the virtual aircraft includes:
determining a landing position in the target real scene corresponding to the virtual aircraft based on the current pose data of the virtual aircraft in the AR scene image and the current pose data of the AR equipment.
In a possible implementation manner, the determining, based on current pose data of the virtual aircraft in the AR scene image, a landing position in the target real scene corresponding to the virtual aircraft includes:
determining a landing position in the target real scene corresponding to the virtual aircraft based on the current pose data of the virtual aircraft in the AR scene image, the current pose data of the AR equipment, and control data for the virtual aircraft sent by the AR equipment.
In a possible implementation manner, the controlling the AR device to display the AR special effect of the virtual aircraft landing process according to the current pose data of the AR device, the current pose data of the virtual aircraft and the landing position includes:
generating landing navigation route data for the virtual aircraft based on the current pose data of the virtual aircraft and the landing position;
and controlling the AR equipment to display the AR special effect of the virtual aircraft landing process based on the pose data of the AR equipment and the landing navigation route data.
In a possible implementation manner, the controlling the AR device to display the AR special effect of the virtual aircraft landing process based on the pose data of the AR device and the landing navigation route data includes:
displaying, by the AR device, at least one landing navigation route for the virtual aircraft based on the current pose data of the AR device and the landing navigation route data;
and controlling the AR device to display the AR special effect of the virtual aircraft landing process in response to a landing control operation, for the virtual aircraft, based on a target landing navigation route among the at least one landing navigation route.
In one possible embodiment, the flight display method further includes:
after the virtual aircraft reaches the landing position, responding to a take-off request aiming at the virtual aircraft, and controlling the AR equipment to display an AR special effect in the take-off process of the virtual aircraft according to the current pose data of the AR equipment and the current pose data of the virtual aircraft.
In a possible implementation manner, the determining current pose data of the AR device in the target real scene based on the real scene image acquired in real time includes:
and determining current pose data of the AR equipment in the target real scene based on the real scene image acquired in real time and a pre-established three-dimensional scene model used for representing the target real scene.
In a second aspect, embodiments of the present disclosure provide a flight display device for a virtual aircraft, including:
the image acquisition module is used for acquiring real scene images shot by the AR equipment aiming at the target real scene in real time;
the first determining module is used for determining current pose data of the AR equipment in the target real scene based on the real scene image acquired in real time, and controlling the AR equipment to display an AR scene image matched with the current pose data based on the current pose data, wherein the AR scene image comprises a virtual aircraft controlled by the AR equipment;
a second determining module, configured to determine, in response to a landing request for the virtual aircraft, a landing position in the target real scene corresponding to the virtual aircraft based on current pose data of the virtual aircraft in the AR scene image;
and the special effect display module is used for controlling the AR equipment to display the AR special effect of the virtual aircraft landing process according to the current pose data of the AR equipment, the current pose data of the virtual aircraft and the landing position.
In one possible implementation manner, the second determining module when configured to determine, based on current pose data of the virtual aircraft in the AR scene image, a landing position in the target real scene corresponding to the virtual aircraft includes:
determining a landing position in the target real scene corresponding to the virtual aircraft based on the current pose data of the virtual aircraft in the AR scene image and the current pose data of the AR equipment.
In one possible implementation manner, the second determining module when configured to determine, based on current pose data of the virtual aircraft in the AR scene image, a landing position in the target real scene corresponding to the virtual aircraft includes:
determining a landing position in the target real scene corresponding to the virtual aircraft based on the current pose data of the virtual aircraft in the AR scene image, the current pose data of the AR equipment, and control data for the virtual aircraft sent by the AR equipment.
In one possible implementation manner, the special effect display module, when used for controlling the AR equipment to display the AR special effect of the virtual aircraft landing process according to the current pose data of the AR equipment, the current pose data of the virtual aircraft and the landing position, includes:
generating landing navigation route data for the virtual aircraft based on the current pose data of the virtual aircraft and the landing position;
and controlling the AR equipment to display the AR special effect of the virtual aircraft landing process based on the pose data of the AR equipment and the landing navigation route data.
In one possible implementation, the special effect display module, when used for controlling the AR device to display the AR special effect of the virtual aircraft landing process based on the pose data of the AR device and the landing navigation route data, includes:
displaying, by the AR device, at least one landing navigation route for the virtual aircraft based on current pose data of the AR device and the landing navigation route data;
and controlling the AR device to display the AR special effect of the virtual aircraft landing process in response to a landing control operation, for the virtual aircraft, based on a target landing navigation route among the at least one landing navigation route.
In a possible implementation manner, the special effect display module is further used for:
after the virtual aircraft reaches the landing position, responding to a take-off request aiming at the virtual aircraft, and controlling the AR equipment to display an AR special effect in the take-off process of the virtual aircraft according to the current pose data of the AR equipment and the current pose data of the virtual aircraft.
In one possible implementation manner, the first determining module, when configured to determine current pose data of the AR device in the target real scene based on the real scene image acquired in real time, includes:
and determining current pose data of the AR equipment in the target real scene based on the real scene image acquired in real time and a pre-established three-dimensional scene model used for representing the target real scene.
In a third aspect, an embodiment of the present disclosure provides an electronic device, including: a processor, a memory and a bus, the memory storing machine-readable instructions executable by the processor, the processor and the memory in communication over the bus when the electronic device is running, the machine-readable instructions when executed by the processor performing the steps of the flight presentation method as described in the first aspect.
In a fourth aspect, embodiments of the present disclosure provide a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of the flight presentation method according to the first aspect.
The foregoing objects, features and advantages of the disclosure will be more readily apparent from the following detailed description of the preferred embodiments taken in conjunction with the accompanying drawings.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present disclosure, the drawings required for the embodiments are briefly described below; they are incorporated in and constitute a part of the specification, show embodiments consistent with the present disclosure, and together with the description serve to illustrate the technical solutions of the present disclosure. It is to be understood that the following drawings illustrate only certain embodiments of the present disclosure and are therefore not to be considered limiting of its scope; a person of ordinary skill in the art may derive other relevant drawings from them without inventive effort.
FIG. 1 illustrates a flow chart of a method of flight demonstration of a virtual aircraft provided by an embodiment of the present disclosure;
FIG. 2 illustrates a method flowchart for controlling an AR device to demonstrate a virtual aircraft landing process provided by embodiments of the present disclosure;
FIG. 3 illustrates a method flow diagram for controlling an AR device to demonstrate a specific landing procedure for a virtual aircraft provided by embodiments of the present disclosure;
FIG. 4 illustrates a schematic diagram of a flight display of a virtual aircraft provided in an embodiment of the present disclosure;
Fig. 5 shows a schematic diagram of an electronic device provided by an embodiment of the disclosure.
Detailed Description
For the purposes of making the objects, technical solutions and advantages of the embodiments of the present disclosure more apparent, the technical solutions in the embodiments of the present disclosure will be clearly and completely described below with reference to the drawings in the embodiments of the present disclosure, and it is apparent that the described embodiments are only some embodiments of the present disclosure, but not all embodiments. The components of the embodiments of the present disclosure, which are generally described and illustrated in the figures herein, may be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the present disclosure provided in the accompanying drawings is not intended to limit the scope of the disclosure, as claimed, but is merely representative of selected embodiments of the disclosure. All other embodiments, which can be made by those skilled in the art based on the embodiments of this disclosure without making any inventive effort, are intended to be within the scope of this disclosure.
It should be noted that: like reference numerals and letters denote like items in the following figures, and thus once an item is defined in one figure, no further definition or explanation thereof is necessary in the following figures.
The term "and/or" herein merely describes an association relationship between associated objects, indicating that three relationships may exist; for example, A and/or B may mean: A exists alone, A and B exist together, or B exists alone. In addition, the term "at least one" herein means any one of a plurality, or any combination of at least two of a plurality; for example, including at least one of A, B, and C may mean including any one or more elements selected from the set consisting of A, B, and C.
At present, AR technology has few application scenarios in interactive games. For example, in a large-scale flight game, a user can control a matched virtual aircraft through a carried AR device. In such application scenarios, how to make the display effect more vivid through AR technology, for example when the virtual aircraft lands, is the technical problem discussed in this disclosure.
Based on the above study, the disclosure provides a flight display method for a virtual aircraft. When a landing request for a controlled virtual aircraft sent by an AR device is received, a corresponding landing position can be determined for the virtual aircraft based on the current pose data of the virtual aircraft; the AR device is then controlled to display an AR special effect of the landing process of the virtual aircraft according to the current pose data of the AR device, the current pose data of the virtual aircraft, and the corresponding landing position. Because this process incorporates the current pose data of the AR device, the landing process can be displayed from the viewing angle of the AR device, making the flight display content of the virtual aircraft more lifelike.
For ease of understanding the present embodiment, a detailed description is first given of the flight display method of a virtual aircraft disclosed in the embodiments of the present disclosure. The execution subject of the flight display method provided in the embodiments of the present disclosure is generally a computer device with a certain computing capability, for example a server or other processing device. In some possible implementations, the flight display method of the virtual aircraft may be implemented by a processor invoking computer-readable instructions stored in a memory.
The embodiment of the present disclosure will be described by taking an execution body as a server as an example, and referring to fig. 1, a flowchart of a flight display method of a virtual aircraft provided in the embodiment of the present disclosure is shown, where the flight display method includes steps S101 to S104, where:
s101, real scene images shot by AR equipment aiming at a target real scene are acquired in real time.
For example, the AR device may include AR glasses, smartphones, tablets, wearable devices, etc., without limitation.
S102, determining current pose data of the AR equipment in a target real scene based on real scene images acquired in real time, and controlling the AR equipment to display an AR scene image matched with the current pose data based on the current pose data, wherein the AR scene image comprises a virtual aircraft controlled by the AR equipment.
For example, the pose data of the AR device in the target real scene may include the position and/or display angle of the display component of the AR device, which is used to display the virtual aircraft, while the AR device is held or worn by the user.
The display component of the AR device specifically refers to a component for displaying the virtual aircraft in the AR device, and when the AR device is a mobile phone or a tablet, the corresponding display component may be a display screen, and when the AR device is AR glasses, the corresponding display component may be a lens for displaying the virtual aircraft.
The real scene image is an image obtained when the AR device shoots a target real scene, so that pose data of the AR device in the target real scene can be determined through the real scene image, and how to determine the pose data will be described in detail later.
For example, after determining the pose data of the AR device, the server may control the AR device to display an AR scene image matching the pose data according to the pose data of the AR device, for example, may display a virtual aircraft superimposed in a real scene in the AR scene image.
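As a rough illustration (not taken from the patent; the function names and the simplified one-angle camera model are assumptions), superimposing the virtual aircraft at a screen position matched to the device pose amounts to projecting its world position through the device's camera:

```python
import math

def project_to_screen(aircraft_world, device_position, device_yaw,
                      fx=800.0, fy=800.0, cx=640.0, cy=360.0):
    """Project the virtual aircraft's world position into the AR device's
    image plane using a pinhole camera model.

    The device pose is simplified here to a position plus a yaw angle
    (rotation about the vertical axis); a real implementation would use
    the full 6-DoF pose recovered from the scene model. Returns (u, v)
    pixel coordinates, or None if the point lies behind the camera.
    """
    dx = aircraft_world[0] - device_position[0]
    dy = aircraft_world[1] - device_position[1]
    dz = aircraft_world[2] - device_position[2]
    # Rotate into the camera frame so the camera looks along its local +z.
    c, s = math.cos(-device_yaw), math.sin(-device_yaw)
    x_cam = c * dx + s * dz
    z_cam = -s * dx + c * dz
    if z_cam <= 0:
        return None  # behind the camera: nothing to draw
    return (fx * x_cam / z_cam + cx, fy * dy / z_cam + cy)
```

A production AR engine would instead use the device's calibrated intrinsics and the full tracked rotation, but the mapping from pose to on-screen overlay position follows this pattern.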
S103, in response to a landing request for the virtual aircraft, determining a landing position in the target real scene corresponding to the virtual aircraft based on the current pose data of the virtual aircraft in the AR scene image.
In an exemplary embodiment, when the virtual aircraft controlled by the AR device is in a flight state, the virtual aircraft may be controlled by the AR device to land, and in particular, the user may send a landing request for the virtual aircraft matched with the AR device to the server through the carried AR device, so as to achieve the effect of controlling the virtual aircraft to land.
For example, after detecting a landing request for the virtual aircraft, the server may determine, based on the current pose data of the virtual aircraft in the AR scene image, a landing position of the virtual aircraft in the target real scene, where the landing position may be a pre-planned parking place for the virtual aircraft, or a place in the target real scene capable of bearing the virtual aircraft, such as a building roof or open ground.
For example, the current pose data of the virtual aircraft in the AR scene image may be determined according to the control data sent by the AR device controlling the virtual aircraft.
S104, controlling the AR equipment to display the AR special effect of the virtual aircraft landing process according to the current pose data of the AR equipment, the current pose data of the virtual aircraft and the landing position.
For example, the embodiment of the present disclosure may include a plurality of AR devices, where the plurality of AR devices form an associated AR device group, and each AR device in the same associated AR device group may enter the same AR scene content. In an application scenario, the AR devices may select the same AR experience package to enter a multi-person interactive AR scene. For example, for the AR scene of a game, each AR device may enter the same AR room to experience the AR scene by logging in with an AR game account, and the AR devices entering the same AR room may serve as an associated AR device group.
Therefore, when the current pose data of different AR devices differ, the displayed viewing angles differ, and when the pose data of the same AR device changes, the displayed viewing angle changes as well. Accordingly, when the AR special effect of the virtual aircraft landing process is generated for an AR device, it must be generated in combination with the current pose data of that AR device.
In the embodiment of the disclosure, when a landing request for a controlled virtual aircraft sent by an AR device is received, a corresponding landing position for the virtual aircraft can be determined based on the current pose data of the virtual aircraft. The AR device can then be controlled to display an AR special effect of the virtual aircraft's landing process according to the current pose data of the AR device, the current pose data of the virtual aircraft, and the corresponding landing position.
The above-described S101 to S104 will be explained below in connection with specific embodiments.
For S102 described above, when determining current pose data of the AR device in the target real scene based on the real scene image acquired in real time, it may include:
based on a real scene image acquired in real time and a pre-established three-dimensional scene model for representing a target real scene, determining current pose data of the AR equipment in the target real scene.
The three-dimensional scene model may be constructed based on a plurality of sample images of the target real scene captured in advance. Specifically, the model may be constructed by extracting feature points from each sample image. After the three-dimensional scene model is generated, the sample image corresponding to each feature point in the model and the capturing pose of that sample image in the three-dimensional scene model may be saved. Then, after a real scene image of the target real scene captured by the AR device is obtained, feature points may be extracted from the real scene image, a sample image matching the real scene image may be determined based on the extracted feature points, and the pose data of the AR device in the three-dimensional scene model may finally be obtained.
Because the three-dimensional scene model is a model characterizing the target real scene, pose data of the AR device in the three-dimensional scene model can be used as pose data of the AR device in the target real scene.
The three-dimensional scene model can be constructed based on a plurality of real scene images shot of the target real scene in advance. After construction, the model can be corrected using a real two-dimensional map corresponding to the target real scene, so as to obtain a three-dimensional scene model that represents the target real scene with higher accuracy.
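A minimal retrieval-style sketch of this localization step, assuming the saved feature points are abstracted as hashable descriptor tokens (a real system would use e.g. ORB/SIFT descriptors with approximate nearest-neighbour matching; all names here are hypothetical):

```python
def localize(query_descriptors, sample_database):
    """Estimate the device pose by matching feature descriptors extracted
    from the real scene image against the sample images saved with the
    three-dimensional scene model.

    sample_database is a list of (descriptor_set, capture_pose) pairs
    stored at model-building time. Returns the capture pose of the
    best-matching sample image (or None for an empty database).
    """
    query = set(query_descriptors)
    best_pose, best_score = None, -1
    for descriptors, pose in sample_database:
        score = len(query & set(descriptors))  # matched feature points
        if score > best_score:
            best_pose, best_score = pose, score
    return best_pose
```

In practice the matched sample's capture pose would be refined (e.g. by perspective-n-point estimation) rather than used directly, but the retrieve-then-refine structure is the same.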
In one possible implementation manner, for S103, when determining a landing position in a target real scene corresponding to the virtual aircraft based on current pose data of the virtual aircraft in the AR scene image, the method may include:
and determining the landing position in the target real scene corresponding to the virtual aircraft based on the current pose data of the virtual aircraft in the AR scene image and the current pose data of the AR equipment.
For example, the AR scene image currently displayed by the AR device may be determined through the current pose data of the AR device; candidate landing positions that may be used for the virtual aircraft to land, such as a building roof or open ground, are then determined in the AR scene image; and the landing position corresponding to the virtual aircraft is then determined based on the current pose data of the virtual aircraft in the AR scene image, for example by selecting the candidate landing position closest to the virtual aircraft.
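The nearest-candidate rule described above can be sketched as follows (illustrative only; positions are assumed to be coordinates in the scene model):

```python
import math

def choose_landing_position(aircraft_position, candidate_positions):
    """Pick the candidate landing position closest (straight-line
    distance) to the virtual aircraft's current position."""
    return min(candidate_positions,
               key=lambda p: math.dist(aircraft_position, p))
```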
In another possible embodiment, for S103, when determining the landing position of the virtual aircraft in the target real scene corresponding to the virtual aircraft based on the current pose data of the virtual aircraft in the AR scene image, the method may include:
and determining the landing position in the target real scene corresponding to the virtual aircraft based on the current pose data of the virtual aircraft in the AR scene image, the current pose data of the AR equipment, and the control data for the virtual aircraft sent by the AR equipment.
For example, the control data sent by the AR device for the virtual aircraft may include the flight speed, the flight direction, and the like of the virtual aircraft; combined with the current pose data of the AR device and the current pose data of the virtual aircraft, these control data jointly determine the landing position corresponding to the virtual aircraft.
Similarly, the AR scene image currently displayed by the AR device may first be determined from the current pose data of the AR device, and candidate landing positions for the virtual aircraft may then be identified in that image. An optimal landing position is then selected from the candidates based on the current pose data of the virtual aircraft in the AR scene image and the control data for the virtual aircraft sent by the AR device. For example, when the flight speed is high, a candidate landing position farther from the current position of the virtual aircraft may be selected, allowing for the landing safety of the virtual aircraft and the time required to decelerate. The flight direction may likewise be considered by selecting a candidate landing position consistent with it: if the virtual aircraft is flying toward the east side of the target real scene, the candidate landing position located on the east side of the target real scene, in the AR scene content presented on the AR device, may be selected as the landing position of the virtual aircraft.
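One way to combine speed and direction as described above is to discard candidates behind the flight direction, require enough room to decelerate, and then take the nearest remaining candidate. The Python sketch below is illustrative only; the constant braking rate and fallback rules are assumptions, not the patent's method:

```python
import math

def choose_landing_position(pos, velocity, candidates, decel=2.0):
    """Choose a landing position using the aircraft's control data:
    discard candidates behind the flight direction, require enough
    room to brake from the current speed (v^2 / 2a), then take the
    nearest remaining candidate. `decel` is an assumed braking rate."""
    speed = math.hypot(*velocity)
    brake_dist = speed * speed / (2.0 * decel)

    def ahead(c):
        # positive dot product -> candidate lies along the flight direction
        d = [c[i] - pos[i] for i in range(3)]
        return sum(d[i] * velocity[i] for i in range(3)) > 0

    viable = [c for c in candidates
              if ahead(c) and math.dist(pos, c) >= brake_dist]
    if not viable:  # fall back: anything ahead, else any candidate at all
        viable = [c for c in candidates if ahead(c)] or list(candidates)
    return min(viable, key=lambda c: math.dist(pos, c))
```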
For S104, when the AR device is controlled to display the AR special effect of the virtual aircraft landing process according to the current pose data of the AR device, the current pose data of the virtual aircraft and the landing position, as shown in fig. 2, the following S1041 to S1042 may be included:
S1041, generating landing navigation route data for the virtual aircraft based on the current pose data and the landing position of the virtual aircraft;
S1042, controlling the AR device to display the AR special effect of the virtual aircraft landing process based on the pose data of the AR device and the landing navigation route data.
For example, at least one piece of landing navigation route data may be generated based on the current pose data and the landing position of the virtual aircraft, taking into account obstacles, such as high-rise buildings, in the AR scene content presented on the AR device.
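A minimal sketch of obstacle-aware route generation follows, under the simplifying assumptions (not from the patent) that obstacles are axis-aligned boxes and that the route is a straight descent lifted over any obstacle it would clip:

```python
def plan_landing_route(start, target, obstacles, n_points=5, clearance=2.0):
    """Interpolate waypoints from the aircraft's current position to the
    landing position, lifting any waypoint that would clip an obstacle.
    Obstacles are modelled as axis-aligned boxes (xmin, xmax, ymin, ymax,
    height) -- a deliberate simplification of 'high-rise building'."""
    route = []
    for i in range(n_points + 1):
        t = i / n_points
        x = start[0] + t * (target[0] - start[0])
        y = start[1] + t * (target[1] - start[1])
        z = start[2] + t * (target[2] - start[2])
        for (xmin, xmax, ymin, ymax, h) in obstacles:
            if xmin <= x <= xmax and ymin <= y <= ymax:
                z = max(z, h + clearance)  # fly over the building
        route.append((x, y, z))
    return route

# Hypothetical scene: descend from (0, 0, 40) to (100, 0, 0) past a
# 50-unit-high building occupying x in [30, 70].
route = plan_landing_route((0.0, 0.0, 40.0), (100.0, 0.0, 0.0),
                           [(30.0, 70.0, -10.0, 10.0, 50.0)])
```

Generating several routes (for instance with different clearances or lateral detours) would yield the "at least one" piece of route data mentioned above.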
Specifically, for S1042, when controlling the AR device to display the AR special effect of the virtual aircraft landing process based on the pose data of the AR device and the landing navigation route data, as shown in fig. 3, the following S10421 to S10422 may be included:
S10421, displaying at least one landing navigation route for the virtual aircraft through the AR equipment based on the current pose data of the AR equipment and the landing navigation route data;
S10422, in response to a landing control operation for the virtual aircraft that includes a target landing navigation route, controlling the AR device to display the AR special effect of the virtual aircraft landing process.
The current pose data of the AR device is also used to determine the viewing angle presented by the AR device; the effect is the same as described above and is not repeated here.
Further, the generated at least one piece of landing navigation route data may be sent to the AR device for display, so that the user can select a target landing navigation route on the AR device and then send, through the AR device, a landing control operation for the virtual aircraft to the server; the landing control operation may include the selected target landing navigation route. After detecting the landing control operation for the virtual aircraft, the server may control the AR device to display the AR special effect of the virtual aircraft landing process along the target landing navigation route.
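The select-then-display exchange can be sketched as a simple server-side handler. The operation format, route identifiers, and frame structure below are invented for illustration; the patent does not prescribe a wire format:

```python
def handle_landing_control(operation, routes):
    """Server-side sketch: the landing control operation carries the
    identifier of the target landing navigation route selected on the
    AR device; the server replies with the effect frames to display."""
    route = routes[operation["target_route_id"]]
    # one effect frame per waypoint; a real renderer would interpolate
    return [{"aircraft_pos": wp, "effect": "landing"} for wp in route]

# Hypothetical routes previously generated and sent to the AR device.
routes = {"r1": [(0.0, 0.0, 10.0), (0.0, 0.0, 0.0)]}
frames = handle_landing_control({"target_route_id": "r1"}, routes)
```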
In one implementation manner, the flight display method provided by the embodiment of the disclosure further includes:
after the virtual aircraft arrives at the landing position, responding to a take-off request for the virtual aircraft, and controlling the AR equipment to display the AR special effect of the take-off process of the virtual aircraft according to the current pose data of the AR equipment and the current pose data of the virtual aircraft.
After the virtual aircraft arrives at the landing position, the AR equipment can still control it to take off from that position. The user may send a take-off request for the virtual aircraft to the server through the AR equipment. After receiving the take-off request, the server can determine the take-off direction in the target real scene according to the current pose data of the virtual aircraft, and then, in combination with the current pose data of the AR equipment, control the AR equipment to display the AR special effect of the virtual aircraft take-off process.
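Deriving a take-off direction from the aircraft's current pose might look like the following sketch, under two illustrative assumptions not stated in the patent: the pose includes a yaw (heading) angle, and take-off climbs at a fixed pitch above the horizontal:

```python
import math

def takeoff_direction(yaw_deg, pitch_deg=30.0):
    """Derive a unit take-off direction in scene coordinates from the
    aircraft's current yaw, climbing at pitch_deg above the horizontal.
    The fixed climb pitch is an assumption for illustration."""
    yaw, pitch = math.radians(yaw_deg), math.radians(pitch_deg)
    return (math.cos(pitch) * math.cos(yaw),
            math.cos(pitch) * math.sin(yaw),
            math.sin(pitch))
```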
The current pose data of the AR device is also used to determine the viewing angle presented by the AR device; the effect is the same as described above and is not repeated here.
It will be appreciated by those skilled in the art that, in the methods of the specific embodiments described above, the order in which the steps are written does not imply a strict order of execution; the actual execution order of the steps should be determined by their functions and possible inherent logic.
Based on the same technical concept, an embodiment of the present disclosure further provides a flight display device of a virtual aircraft corresponding to the flight display method described above. Since the principle by which the device solves the problem is similar to that of the flight display method in the embodiments of the present disclosure, the implementation of the device may refer to the implementation of the method, and repeated description is omitted.
Referring to fig. 4, a schematic diagram of a flight display device 400 of a virtual aircraft according to an embodiment of the disclosure is provided, where the flight display device includes:
an image acquisition module 401, configured to acquire, in real time, a real scene image taken by the AR device for a target real scene;
a first determining module 402, configured to determine current pose data of the AR device in the target real scene based on the real scene image acquired in real time, and control the AR device to display an AR scene image matched with the current pose data based on the current pose data, where the AR scene image includes a virtual aircraft controlled by the AR device;
a second determining module 403, configured to determine, in response to a landing request for the virtual aircraft, a landing position in a target real scene corresponding to the virtual aircraft based on current pose data of the virtual aircraft in the AR scene image;
the special effect display module 404 is configured to control the AR device to display an AR special effect of the virtual aircraft during the landing process according to the current pose data of the AR device, the current pose data of the virtual aircraft, and the landing position.
In one possible implementation, the second determining module 403, when configured to determine a landing position in a target real scene corresponding to the virtual aircraft based on current pose data of the virtual aircraft in the AR scene image, includes:
determining the landing position of the virtual aircraft in the target real scene corresponding to the virtual aircraft based on the current pose data of the virtual aircraft in the AR scene image and the current pose data of the AR equipment.
In one possible implementation, the second determining module 403, when configured to determine a landing position in a target real scene corresponding to the virtual aircraft based on current pose data of the virtual aircraft in the AR scene image, includes:
and determining the landing position of the virtual aircraft in the target real scene corresponding to the virtual aircraft based on the current pose data of the virtual aircraft in the AR scene image, the current pose data of the AR equipment and the control data for the virtual aircraft sent by the AR equipment.
In one possible implementation, the special effects display module 404, when used to control the AR device to display the AR special effects of the virtual aircraft landing process according to the current pose data of the AR device, the current pose data of the virtual aircraft, and the landing position, includes:
generating landing navigation route data for the virtual aircraft based on the current pose data and the landing position of the virtual aircraft;
based on the pose data of the AR equipment and the landing navigation route data, the AR equipment is controlled to display the AR special effect of the virtual aircraft landing process.
In one possible implementation, the special effects display module 404, when used to control the AR device to display the AR special effects of the virtual aircraft landing process based on the pose data of the AR device and the landing navigation route data, includes:
displaying at least one landing navigation route for the virtual aircraft through the AR device based on the current pose data of the AR device and the landing navigation route data;
in response to a landing control operation for the virtual aircraft that includes the target landing navigation route, the AR device is controlled to exhibit an AR special effect of the virtual aircraft landing procedure.
In one possible implementation, the special effects display module 404 is further configured to:
after the virtual aircraft arrives at the landing position, responding to a take-off request for the virtual aircraft, and controlling the AR equipment to display the AR special effect of the take-off process of the virtual aircraft according to the current pose data of the AR equipment and the current pose data of the virtual aircraft.
In one possible implementation, the first determining module 402, when configured to determine current pose data of the AR device in the target real scene based on the real scene image acquired in real time, includes:
based on a real scene image acquired in real time and a pre-established three-dimensional scene model for representing a target real scene, determining current pose data of the AR equipment in the target real scene.
The process flow of each module in the apparatus and the interaction flow between the modules may be described with reference to the related descriptions in the above method embodiments, which are not described in detail herein.
Corresponding to the flight display method of the virtual aircraft in fig. 1, the embodiment of the disclosure further provides an electronic device 500, as shown in fig. 5, which is a schematic structural diagram of the electronic device 500 provided in the embodiment of the disclosure, including:
a processor 51, a memory 52, and a bus 53. The memory 52 is used to store execution instructions and includes an internal memory 521 and an external storage 522; the internal memory 521 temporarily stores operation data in the processor 51 and data exchanged with the external storage 522, such as a hard disk, and the processor 51 exchanges data with the external storage 522 through the internal memory 521. When the electronic device 500 runs, the processor 51 and the memory 52 communicate with each other through the bus 53, so that the processor 51 executes the following instructions: acquiring, in real time, real scene images shot by AR equipment for a target real scene; determining current pose data of the AR equipment in the target real scene based on the real scene images acquired in real time, and controlling the AR equipment to display an AR scene image matched with the current pose data based on the current pose data, wherein the AR scene image comprises a virtual aircraft controlled by the AR equipment; in response to a landing request for the virtual aircraft, determining a landing position in the target real scene corresponding to the virtual aircraft based on current pose data of the virtual aircraft in the AR scene image; and controlling the AR equipment to display the AR special effect of the virtual aircraft landing process according to the current pose data of the AR equipment, the current pose data of the virtual aircraft, and the landing position.
The disclosed embodiments also provide a computer readable storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of the flight display method of a virtual aircraft described in the method embodiments above. Wherein the storage medium may be a volatile or nonvolatile computer readable storage medium.
The computer program product of the flight display method of the virtual aircraft provided in the embodiments of the present disclosure includes a computer readable storage medium storing program code, where the program code includes instructions for executing the steps of the flight display method of the virtual aircraft described in the above method embodiments, and the detailed description thereof will be omitted herein.
The disclosed embodiments also provide a computer program which, when executed by a processor, implements any of the methods of the previous embodiments. The computer program product may be realized in particular by means of hardware, software or a combination thereof. In an alternative embodiment, the computer program product is embodied as a computer storage medium, and in another alternative embodiment, the computer program product is embodied as a software product, such as a software development kit (Software Development Kit, SDK), or the like.
It will be clear to those skilled in the art that, for convenience and brevity of description, the specific working procedures of the system and apparatus described above may refer to the corresponding procedures in the foregoing method embodiments and are not repeated here. In the several embodiments provided in the present disclosure, it should be understood that the disclosed systems, devices, and methods may be implemented in other manners. The apparatus embodiments described above are merely illustrative; for example, the division of the units is merely a division by logical function, and there may be other divisions in actual implementation; for another example, multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. In addition, the couplings or direct couplings or communication connections shown or discussed may be indirect couplings or communication connections through some communication interfaces, devices, or units, and may be electrical, mechanical, or in other forms.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in each embodiment of the present disclosure may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a non-volatile computer readable storage medium executable by a processor. Based on such understanding, the technical solution of the present disclosure may be embodied in essence or a part contributing to the prior art or a part of the technical solution, or in the form of a software product stored in a storage medium, including several instructions to cause a computer device (which may be a personal computer, a server, or a network device, etc.) to perform all or part of the steps of the method described in the embodiments of the present disclosure. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a random access Memory (Random Access Memory, RAM), a magnetic disk, or an optical disk, or other various media capable of storing program codes.
Finally, it should be noted that the foregoing examples are merely specific embodiments of the present disclosure, intended to illustrate its technical solutions rather than to limit them, and the protection scope of the present disclosure is not limited thereto. Although the present disclosure has been described in detail with reference to the foregoing embodiments, those skilled in the art will appreciate that any person familiar with the art may, within the technical scope disclosed herein, still modify the technical solutions described in the foregoing embodiments, readily conceive of changes, or make equivalent substitutions of some of the technical features; such modifications, changes, or substitutions do not cause the essence of the corresponding technical solutions to depart from the spirit and scope of the technical solutions of the embodiments of the present disclosure, and are all intended to be covered within the protection scope of the present disclosure. Therefore, the protection scope of the present disclosure shall be subject to the protection scope of the claims.

Claims (9)

1. A method of displaying a flight of a virtual aircraft, comprising:
real scene images shot by AR equipment aiming at a target real scene are acquired in real time;
determining current pose data of the AR equipment in the target real scene based on the real scene image acquired in real time, and controlling the AR equipment to display an AR scene image matched with the current pose data based on the current pose data, wherein the AR scene image comprises a virtual aircraft controlled by the AR equipment; in a case where a user holds or wears the AR equipment, the current pose data of the AR equipment in the target real scene comprise a position and/or a display angle of a display component, of the AR equipment, used for displaying the virtual aircraft;
determining a landing position in the target real scene corresponding to the virtual aircraft based on current pose data of the virtual aircraft in the AR scene image in response to a landing request for the virtual aircraft; wherein the landing position is a pre-planned site capable of parking the virtual aircraft and a site in the target real scene capable of bearing the virtual aircraft;
according to the current pose data of the AR equipment, the current pose data of the virtual aircraft and the landing position, controlling the AR equipment to display an AR special effect of the virtual aircraft in the landing process;
Wherein the determining, based on current pose data of the virtual aircraft in the AR scene image, a landing position in the target real scene corresponding to the virtual aircraft includes:
determining a landing position of the virtual aircraft in the target reality scene corresponding to the virtual aircraft based on current pose data of the virtual aircraft in the AR scene image, current pose data of the AR equipment and control data for the virtual aircraft sent by the AR equipment; the control data sent by the AR device for the virtual aircraft includes a flight speed and a flight direction for the virtual aircraft.
2. The flight presentation method of claim 1, wherein the determining a landing position in the target real scene corresponding to the virtual aircraft based on current pose data of the virtual aircraft in the AR scene image comprises:
determining a landing position in the target real scene corresponding to the virtual aircraft based on the current pose data of the virtual aircraft in the AR scene image and the current pose data of the AR equipment.
3. The flight display method according to claim 1, wherein the controlling the AR device to display the AR special effect of the virtual aircraft landing process according to the current pose data of the AR device, the current pose data of the virtual aircraft, and the landing position comprises:
generating landing navigation route data for the virtual aircraft based on the current pose data of the virtual aircraft and the landing position;
and controlling the AR equipment to display the AR special effect of the virtual aircraft landing process based on the pose data of the AR equipment and the landing navigation route data.
4. The flight presentation method of claim 3, wherein the controlling the AR device to present the AR special effects of the virtual aircraft landing procedure based on the pose data of the AR device and the landing navigation route data comprises:
displaying, by the AR device, at least one landing navigation route for the virtual aircraft based on current pose data of the AR device and the landing navigation route data;
the AR apparatus is controlled to exhibit AR special effects of the virtual aircraft landing procedure in response to a landing control operation for the virtual aircraft that includes a target landing navigation route.
5. The flight display method of any one of claims 1 to 4, further comprising:
after the virtual aircraft reaches the landing position, responding to a take-off request aiming at the virtual aircraft, and controlling the AR equipment to display an AR special effect in the take-off process of the virtual aircraft according to the current pose data of the AR equipment and the current pose data of the virtual aircraft.
6. The flight presentation method of claim 1, wherein the determining current pose data of the AR device in the target real scene based on the real scene image acquired in real time comprises:
and determining current pose data of the AR equipment in the target real scene based on the real scene image acquired in real time and a pre-established three-dimensional scene model used for representing the target real scene.
7. A flight display device for a virtual aircraft, comprising:
the image acquisition module is used for acquiring real scene images shot by the AR equipment aiming at the target real scene in real time;
the first determining module is used for determining current pose data of the AR equipment in the target real scene based on the real scene image acquired in real time, and controlling the AR equipment to display an AR scene image matched with the current pose data based on the current pose data, wherein the AR scene image comprises a virtual aircraft controlled by the AR equipment; in a case where a user holds or wears the AR equipment, the current pose data of the AR equipment in the target real scene comprise a position and/or a display angle of a display component, of the AR equipment, used for displaying the virtual aircraft;
a second determining module, configured to determine, in response to a landing request for the virtual aircraft, a landing position in the target real scene corresponding to the virtual aircraft based on current pose data of the virtual aircraft in the AR scene image; wherein the landing position is a pre-planned site capable of parking the virtual aircraft and a site in the target real scene capable of bearing the virtual aircraft;
the special effect display module is used for controlling the AR equipment to display the AR special effect of the virtual aircraft landing process according to the current pose data of the AR equipment, the current pose data of the virtual aircraft and the landing position;
the second determining module, when configured to determine a landing position of the virtual aircraft in the target real scene corresponding to the virtual aircraft based on current pose data of the virtual aircraft in the AR scene image, includes:
determining a landing position of the virtual aircraft in the target reality scene corresponding to the virtual aircraft based on current pose data of the virtual aircraft in the AR scene image, current pose data of the AR equipment and control data for the virtual aircraft sent by the AR equipment; the control data sent by the AR device for the virtual aircraft includes a flight speed and a flight direction for the virtual aircraft.
8. An electronic device, comprising: a processor, a memory and a bus, said memory storing machine readable instructions executable by said processor, said processor and said memory communicating over the bus when the electronic device is running, said machine readable instructions when executed by said processor performing the steps of the flight presentation method according to any one of claims 1 to 6.
9. A computer-readable storage medium, characterized in that it has stored thereon a computer program which, when executed by a processor, performs the steps of the flight presentation method according to any one of claims 1 to 6.
CN202010514623.6A 2020-06-08 2020-06-08 Flight display method and device of virtual aircraft, electronic equipment and storage medium Active CN111569414B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010514623.6A CN111569414B (en) 2020-06-08 2020-06-08 Flight display method and device of virtual aircraft, electronic equipment and storage medium


Publications (2)

Publication Number Publication Date
CN111569414A CN111569414A (en) 2020-08-25
CN111569414B true CN111569414B (en) 2024-03-29

Family

ID=72110628

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010514623.6A Active CN111569414B (en) 2020-06-08 2020-06-08 Flight display method and device of virtual aircraft, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN111569414B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112148125A (en) * 2020-09-23 2020-12-29 北京市商汤科技开发有限公司 AR interaction state control method, device, equipment and storage medium
CN114584681A (en) * 2020-11-30 2022-06-03 北京市商汤科技开发有限公司 Target object motion display method and device, electronic equipment and storage medium

Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
RU130963U1 (en) * 2012-11-20 2013-08-10 Общество с ограниченной ответственностью "АВИАРЕАЛ" AERIAL REALITY SYSTEM FOR DISPLAYING INFORMATION FOR AIRCRAFT CONTROL
CN105031919A (en) * 2010-03-30 2015-11-11 索尼电脑娱乐美国公司 Method for an augmented reality character to maintain and exhibit awareness of an observer
CN106408515A (en) * 2016-08-31 2017-02-15 郑州捷安高科股份有限公司 Augmented reality-based vision synthesis system
CN106856566A (en) * 2016-12-16 2017-06-16 中国商用飞机有限责任公司北京民用飞机技术研究中心 A kind of information synchronization method and system based on AR equipment
WO2019090488A1 (en) * 2017-11-07 2019-05-16 深圳市大疆创新科技有限公司 Method for generating simulated route, method and device for flight simulation, and storage medium
KR20190110499A (en) * 2019-09-03 2019-09-30 엘지전자 주식회사 Method and apparatus for landing of unmanned aerial vehicle
CN110378990A (en) * 2019-07-03 2019-10-25 北京悉见科技有限公司 Augmented reality scene shows method, apparatus and storage medium
CN110507994A (en) * 2019-09-05 2019-11-29 腾讯科技(深圳)有限公司 Control method, apparatus, equipment and the storage medium of virtual aircraft flight
CN110559653A (en) * 2019-09-12 2019-12-13 腾讯科技(深圳)有限公司 control method, device, terminal and storage medium of virtual aircraft
CN110585708A (en) * 2019-09-12 2019-12-20 腾讯科技(深圳)有限公司 Method, device and readable storage medium for landing from aircraft in virtual environment
CN110585711A (en) * 2019-09-12 2019-12-20 腾讯科技(深圳)有限公司 Control method, device, terminal and storage medium of virtual aircraft
CN110716646A (en) * 2019-10-15 2020-01-21 北京市商汤科技开发有限公司 Augmented reality data presentation method, device, equipment and storage medium
CN110738737A (en) * 2019-10-15 2020-01-31 北京市商汤科技开发有限公司 AR scene image processing method and device, electronic equipment and storage medium

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TW201215442A (en) * 2010-10-06 2012-04-16 Hon Hai Prec Ind Co Ltd Unmanned Aerial Vehicle control system and method


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Yang Jianye; Qi Guoqing; Sheng Andong. Design of a UAV autonomous landing simulation system based on visual servoing. Electronic Design Engineering. 2019, 27(11), 9-14. *

Also Published As

Publication number Publication date
CN111569414A (en) 2020-08-25

Similar Documents

Publication Publication Date Title
CN111638793B (en) Display method and device of aircraft, electronic equipment and storage medium
CN105279795B (en) Augmented reality system based on 3D marker
CN112148189A (en) Interaction method and device in AR scene, electronic equipment and storage medium
US10977869B2 (en) Interactive method and augmented reality system
KR101736477B1 (en) Local sensor augmentation of stored content and ar communication
US20220122331A1 (en) Interactive method and system based on augmented reality device, electronic device, and computer readable medium
CN107016704A (en) A kind of virtual reality implementation method based on augmented reality
US20220382051A1 (en) Virtual reality interaction method, device and system
EP3137976A1 (en) World-locked display quality feedback
US10974132B2 (en) Systems and methods to provide a shared interactive experience across multiple presentation devices based on detection of one or more extraterrestrial bodies
WO2014074465A1 (en) Cross-platform augmented reality experience
CN111610998A (en) AR scene content generation method, display method, device and storage medium
CN103760972B (en) Cross-platform augmented reality experience
CN111744202A (en) Method and device for loading virtual game, storage medium and electronic device
CN111569414B (en) Flight display method and device of virtual aircraft, electronic equipment and storage medium
Oskiper et al. Augmented reality binoculars
CN112348968B (en) Display method and device in augmented reality scene, electronic equipment and storage medium
CN109640070A (en) A kind of stereo display method, device, equipment and storage medium
CN111651057A (en) Data display method and device, electronic equipment and storage medium
CN111617471A (en) Virtual shooting display method and device, electronic equipment and storage medium
CN112882576B (en) AR interaction method and device, electronic equipment and storage medium
CN111833458A (en) Image display method and device, equipment and computer readable storage medium
CN111744180A (en) Method and device for loading virtual game, storage medium and electronic device
CN112148125A (en) AR interaction state control method, device, equipment and storage medium
CN114332429A (en) Display method and device for augmented reality AR scene

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant