WO2022056933A1 - Procédé de simulation de vol et terminal pour drone de course (Flight simulation method and terminal for a racing drone) - Google Patents

Procédé de simulation de vol et terminal pour drone de course (Flight simulation method and terminal for a racing drone)

Info

Publication number
WO2022056933A1
WO2022056933A1 PCT/CN2020/116621 CN2020116621W WO2022056933A1 WO 2022056933 A1 WO2022056933 A1 WO 2022056933A1 CN 2020116621 W CN2020116621 W CN 2020116621W WO 2022056933 A1 WO2022056933 A1 WO 2022056933A1
Authority
WO
WIPO (PCT)
Prior art keywords
real
target
scene
three-dimensional scene
Prior art date
Application number
PCT/CN2020/116621
Other languages
English (en)
Chinese (zh)
Inventor
孙晓帆
Original Assignee
深圳市大疆创新科技有限公司 (SZ DJI Technology Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳市大疆创新科技有限公司 filed Critical 深圳市大疆创新科技有限公司
Priority to PCT/CN2020/116621 priority Critical patent/WO2022056933A1/fr
Priority to CN202080035107.3A priority patent/CN113826149A/zh
Publication of WO2022056933A1 publication Critical patent/WO2022056933A1/fr

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/006Mixed reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F30/00Computer-aided design [CAD]
    • G06F30/10Geometric CAD
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F30/00Computer-aided design [CAD]
    • G06F30/20Design optimisation, verification or simulation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T17/05Geographic models

Definitions

  • The present application relates to the technical field of flight simulation for racing drones (FPV drones), and in particular to a flight simulation method, a simulation terminal, and a computer-readable storage medium for a racing drone.
  • The present application provides a flight simulation method for a racing drone, to address the technical problem that the flight scenarios provided by existing flight simulation software are limited and repetitive.
  • A first aspect of the present application provides a flight simulation method for a racing drone, including:
  • an earth model is displayed on a display interface of a simulation terminal, wherein the earth model includes a plurality of location identifiers, and the location identifiers are used to identify real three-dimensional scenes corresponding to different locations;
  • a target location identifier is determined according to the user's operation on the location identifiers of the earth model, and the target real three-dimensional scene corresponding to the target location identifier is loaded;
  • the target real three-dimensional scene is accelerated, its viewing angle is adjusted, or its scene is updated according to the user's operation of a virtual joystick or a physical joystick of a remote controller, so as to display the first-person-view picture corresponding to that joystick operation.
  • A second aspect of the present application provides a simulation terminal, including:
  • the display is used to display the first-person-view picture of the racing drone;
  • the processor implements the following steps when executing the computer program:
  • the earth model is displayed on the display interface through the display, wherein the earth model includes a plurality of location identifiers, and the location identifiers are used to identify real three-dimensional scenes corresponding to different locations;
  • the target location identifier is determined, and the target real three-dimensional scene corresponding to the target location identifier is loaded;
  • the target real three-dimensional scene is accelerated, its viewing angle is adjusted, or its scene is updated, so that the display shows the first-person-view picture corresponding to the operation of the virtual joystick or the physical joystick of the remote controller.
  • A third aspect of the present application provides a computer-readable storage medium that stores a computer program; when the computer program is executed by a processor, it implements any one of the flight simulation methods provided in the first aspect above.
  • Since the loaded scene model is a real 3D scene corresponding to a real-world scene, it can provide a more diverse flight environment than the virtual 3D scenes provided by existing flight simulation software, making flight training more effective.
  • FIG. 1 is a schematic diagram of a flight simulation scenario provided by an embodiment of the present application.
  • FIG. 2 is a flowchart of a flight simulation method provided by an embodiment of the present application.
  • FIG. 3 is a display effect diagram of the earth model provided by the embodiment of the present application.
  • FIG. 4 is a structural diagram of a simulation terminal provided by an embodiment of the present application.
  • a drone is a remotely controlled unmanned aircraft, which can establish a wireless connection with a remote control device, so that the user can implement remote control of the drone through the remote control device.
  • the remote control device may be a remote control, a combination of a remote control and a mobile terminal, or a combination of a remote control and video glasses.
  • the application does not limit the specific implementation form of the remote control device.
  • Flight training can be carried out through real-world practice, that is, the user directly flies the drone for training. However, in actual operation, a collision caused by improper control may damage or destroy the drone. Therefore, as an alternative, drone flight training can also be performed through flight simulation software to reduce training costs.
  • The flight simulation software can be installed on terminal devices such as computers, mobile phones, and smart tablets; a terminal device with the flight simulation software installed can be called a simulation terminal.
  • the user can control the drone in the flight simulation software to achieve the purpose of training or entertainment.
  • FIG. 1 is a schematic diagram of a flight simulation scenario provided by an embodiment of the present application, in which the simulation terminal 110 can be connected to the remote control device 120 of the drone, and the user can use the remote control device to control the drone displayed on the simulation terminal, thereby achieving the effect of flight training.
  • However, the 3D scenes provided by existing flight simulation software are all artificially constructed virtual 3D scenes; their content is limited and repetitive and differs noticeably from real scenes, so the training effect is unsatisfactory.
  • In addition, existing flight simulation software targets aerial photography drones; there is still no matching flight simulation content for racing drones, which cannot meet the needs of racing drone pilots.
  • An aerial photography drone is aimed at ordinary consumers: its flight control is relatively stable, its flight speed is low, and it is relatively easy to control. A racing drone, by contrast, is flown almost entirely manually, its flight speed is high and largely unrestricted, it is difficult for beginners, and it is easy to crash. Therefore, compared with aerial photography drones, the demand for flight simulation of racing drones is greater, and novices rely more on flight simulation to get started.
  • FIG. 2 is a flowchart of the flight simulation method provided by the embodiment of the present application. The method includes:
  • The simulation terminal may be a terminal device installed with flight simulation software.
  • The simulation terminal may include a display, a processor, and a memory, wherein a computer program corresponding to the flight simulation software may be stored in the memory; the processor may execute the computer program to run the flight simulation software, and while the software is running, the processor can display the corresponding picture of the flight simulation software through the display.
  • the screen corresponding to the earth model can be displayed by loading the earth model.
  • the earth model can be displayed in various ways.
  • the earth model can be displayed from the perspective of observing the earth from outer space, as shown in FIG. 3 , which is a display effect diagram of the earth model provided by the embodiment of the present application.
  • the earth model may include multiple location markers, and the location markers may be used to identify real three-dimensional scenes corresponding to different locations.
  • The location identifiers may correspond to real locations in the real world, such as countries (e.g., China, Russia, India), or to lower-level locations such as provinces (e.g., Guangdong, Hunan, Hubei) or cities (e.g., Guangzhou, Shenzhen, Beijing).
  • the display manner of the earth model can be adjusted according to the user's operation on the earth model.
  • the user can rotate, zoom, and move the earth model through keyboard, mouse, remote control, touch gestures, etc.
  • The level of the displayed location identifiers can be adjusted according to the degree of magnification.
  • For example, when the earth model is only slightly enlarged, the location identifiers in the earth model can correspond to the country level;
  • when the earth model is enlarged further, the location identifiers in the earth model can correspond to the province, city, or district level.
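  • As an illustration only, the following Python sketch shows one way such zoom-dependent identifier levels could be selected; the function names, thresholds, and data layout are assumptions for illustration and are not taken from the disclosure.

```python
# Hypothetical sketch: choose which level of location identifiers to show
# for a given zoom factor of the earth model. Thresholds are illustrative.

ZOOM_LEVELS = [
    (2.0, "country"),    # up to 2x zoom: show country-level identifiers
    (8.0, "province"),   # up to 8x zoom: show province-level identifiers
    (20.0, "city"),      # up to 20x zoom: show city-level identifiers
]

def identifier_level_for_zoom(zoom: float) -> str:
    """Return the identifier level to display for the current zoom factor."""
    for threshold, level in ZOOM_LEVELS:
        if zoom <= threshold:
            return level
    return "district"    # beyond the last threshold, show district-level identifiers

def visible_identifiers(identifiers, zoom: float):
    """Filter the earth model's location identifiers by the current zoom level."""
    level = identifier_level_for_zoom(zoom)
    return [m for m in identifiers if m["level"] == level]

# Example: identifiers is a list of dicts such as
# {"name": "Shenzhen", "level": "city", "scene_id": "..."}
```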
  • Different locations may correspond to real three-dimensional scenes of the locations.
  • The real 3D scene is a scene model corresponding to the real scene at that location. For example, if the location contains a shopping mall in reality, the real 3D scene corresponding to that location will include a 3D model of the shopping mall at the corresponding position.
  • The user can enter the target location in the search bar provided on the display interface, or select the target location directly on the screen; the target location identifier is then determined according to the user's input or selection.
  • The target real three-dimensional scene corresponding to the target location identifier can then be loaded, so that the flight simulation of the racing drone can be performed based on the target real three-dimensional scene.
  • Since the loaded scene model is a real 3D scene corresponding to a real-world scene, it can provide a more diverse flight environment than the virtual 3D scenes provided by existing flight simulation software, so the flight training effect is better.
  • a model of the real 3D scene can be constructed through a 3D reconstruction technology.
  • the real scene corresponding to the place can be photographed from multiple angles to obtain a multi-angle image corresponding to the real scene.
  • the multi-angle can include front view, side view, top view, upward view, etc.
  • the multi-angle image can include images captured by different devices such as satellite images and drone aerial images.
  • The multi-angle images can then be fused through a three-dimensional reconstruction algorithm, so as to construct the real three-dimensional scene corresponding to the real scene.
  • In one embodiment, since third-party map software has already built real three-dimensional scenes of various locations on the earth through 3D reconstruction, the flight simulation software of the embodiments of the present application can interface with the third-party map software by calling its application programming interface (API). The earth model of the third-party map software is imported while the flight simulation software is running and displayed on the display interface; after the target location identifier is determined, it can be sent to the third-party map software, and the target real three-dimensional scene returned by the third-party map software for that identifier can then be received and loaded.
  • the third-party map software can be any software that performs three-dimensional reconstruction of the real scene, such as Google Earth, Baidu Map, AutoNavi Map, and so on.
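  • A minimal sketch of such an integration is shown below, assuming a generic HTTP/JSON interface. The class name `MapProviderClient`, the endpoint paths, and the URL are hypothetical; the disclosure does not specify the concrete API of any map provider.

```python
# Hypothetical sketch of interfacing with third-party map software through an API.
# Endpoint names and response shapes are assumptions for illustration only.

import json
import urllib.request

class MapProviderClient:
    def __init__(self, base_url: str, api_key: str):
        self.base_url = base_url.rstrip("/")
        self.api_key = api_key

    def _get(self, path: str) -> dict:
        url = f"{self.base_url}{path}&key={self.api_key}"
        with urllib.request.urlopen(url) as resp:
            return json.load(resp)

    def get_earth_model(self) -> dict:
        # Import the provider's earth model (globe plus location identifiers).
        return self._get("/earth_model?format=json")

    def get_scene(self, location_id: str) -> dict:
        # Send the target location identifier and receive the corresponding
        # real three-dimensional scene (e.g. tiles or a mesh description).
        return self._get(f"/scenes?location={location_id}")

# Usage inside the flight simulation software (illustrative only):
# client = MapProviderClient("https://maps.example.com/api", api_key="...")
# earth = client.get_earth_model()          # displayed on the simulator's interface
# scene = client.get_scene(target_location) # loaded as the target real 3D scene
```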
  • After the target real 3D scene is loaded, the flight simulation of the racing drone can be performed based on it.
  • Specifically, a racing drone model can be generated in the loaded target real three-dimensional scene and associated with the flight control logic of a racing drone. When the flight picture of the racing drone is displayed from the first-person view (FPV), the target real three-dimensional scene can be accelerated, its viewing angle adjusted, or its scene updated according to the user's operation of the virtual joystick or the physical joystick of the remote controller, thereby achieving the effect of flight simulation.
  • The simulation terminal may include a connector, which may be wireless or wired, for connecting with a physical remote controller, so that when the physical remote controller is connected to the simulation terminal, the user can operate the physical remote controller to control the generated drone model and obtain a more realistic flight simulation experience.
  • Alternatively, the remote controller can be a virtual remote controller, for example one that accepts control inputs from a keyboard, a mouse, or a touch screen.
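  • As a minimal sketch of how joystick inputs might drive the simulated drone and, in turn, the displayed scene, the following Python code uses deliberately simplified kinematics; the data classes, parameters, and update rule are illustrative assumptions, not the flight control logic described in the disclosure.

```python
# Hypothetical update loop: normalized stick inputs move a simplified drone state,
# and the renderer then draws the target real 3D scene from the resulting pose to
# produce the FPV picture. Real racing-drone flight control is far more detailed.

import math
from dataclasses import dataclass

@dataclass
class Sticks:
    throttle: float  # 0..1
    yaw: float       # -1..1
    pitch: float     # -1..1
    roll: float      # -1..1

@dataclass
class DroneState:
    x: float = 0.0
    y: float = 0.0
    z: float = 20.0
    heading: float = 0.0  # radians

def step(state: DroneState, s: Sticks, dt: float,
         max_speed: float = 40.0, yaw_rate: float = 3.0, climb_rate: float = 8.0) -> DroneState:
    """Advance the drone pose by dt seconds from the current stick positions."""
    state.heading += s.yaw * yaw_rate * dt
    forward = s.pitch * max_speed              # pitch stick -> forward/backward speed
    strafe = s.roll * max_speed * 0.5          # roll stick  -> lateral speed
    state.x += (forward * math.cos(state.heading) - strafe * math.sin(state.heading)) * dt
    state.y += (forward * math.sin(state.heading) + strafe * math.cos(state.heading)) * dt
    state.z += (s.throttle - 0.5) * 2.0 * climb_rate * dt  # throttle around the hover point
    return state

# Per frame: state = step(state, read_sticks(), dt); the scene is then accelerated,
# its viewing angle adjusted, or its content updated according to the new pose.
```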
  • Because the real 3D scene can be built by 3D reconstruction, and a model built purely by image fusion gives obstacles such as buildings and plants no collision volumes, the racing drone could otherwise fly through these obstacle models unhindered when flying in the target real 3D scene, which is inconsistent with the real flying experience.
  • Therefore, the obstacles in the target real three-dimensional scene can be identified through image recognition, and collision volumes can be assigned or generated for these obstacles, so that when the racing drone collides with an obstacle, any one of the following events can be triggered: the drone crashes; the damage level of the drone increases (the drone can be given a health bar or damage bar); or the drone returns to its position before the collision.
  • Further, the category of each obstacle in the target real 3D scene can be determined, such as building, plant, object, and so on.
  • A physics engine can then assign to each obstacle the physical properties corresponding to its category, so that when a collision occurs between the racing drone and an obstacle, the hit obstacle is triggered to move and/or deform according to its assigned physical properties.
  • For example, when the racing drone collides with a building, the building can be partially damaged but will not move; when it collides with a plant, the branches and leaves can deform, and they can also sway in response to the wind pressure produced by the passing drone.
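  • A minimal sketch of this idea follows. It assumes axis-aligned bounding boxes as the collision volumes and hard-coded category reactions; the structure names, outcome policies, and printed responses are illustrative stand-ins for a real physics engine.

```python
# Hypothetical sketch: give recognized obstacles collision volumes and
# category-dependent behaviour on collision. Categories and responses are
# illustrative; the disclosure only states that obstacle categories map to
# physical properties via a physics engine.

from dataclasses import dataclass

@dataclass
class Obstacle:
    name: str
    category: str   # e.g. "building", "plant", "object"
    bbox: tuple     # axis-aligned box: (xmin, ymin, zmin, xmax, ymax, zmax)

def collides(drone_pos, obs: Obstacle) -> bool:
    x, y, z = drone_pos
    x0, y0, z0, x1, y1, z1 = obs.bbox
    return x0 <= x <= x1 and y0 <= y <= y1 and z0 <= z <= z1

def on_collision(drone: dict, obs: Obstacle, policy: str = "damage"):
    """Apply one of the collision outcomes described above."""
    if policy == "crash":
        drone["alive"] = False                              # the drone crashes
    elif policy == "damage":
        drone["damage"] = min(100, drone["damage"] + 25)    # damage bar increases
    else:                                                   # "reset": return to pre-collision pose
        drone["pos"] = drone["last_safe_pos"]
    # Category-dependent reaction of the obstacle itself (stand-ins for physics-engine calls):
    if obs.category == "plant":
        print(f"{obs.name}: branches and leaves sway/deform")
    elif obs.category == "building":
        print(f"{obs.name}: local damage, no movement")
```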
  • In addition, a target obstacle among the identified obstacles can be associated with another three-dimensional scene (which may be a real 3D scene or a virtual 3D scene), so that when the racing drone collides with a designated position of the target obstacle, a "teleport" effect is triggered: the other 3D scene associated with the target obstacle is loaded, and the flight simulation of the racing drone continues in that scene.
  • For example, a shopping mall can be associated with the three-dimensional scene of its interior, so that when the racing drone collides with the entrance of the mall, or the user performs a preset interaction with the mall, loading of the interior 3D scene is triggered and the user can fly the drone in the 3D scene inside the mall.
  • the other three-dimensional scenes may be real three-dimensional scenes obtained by three-dimensional reconstruction based on real scenes, or may be artificially constructed virtual three-dimensional scenes.
  • Further, user-created virtual 3D scenes can be supported: a virtual 3D scene imported by the user can be received, and an association between a user-specified target obstacle and the imported virtual 3D scene can be established, so that the simulation can switch from the target real 3D scene to the virtual 3D scene through that obstacle, which greatly improves the playability of the flight simulation.
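  • The sketch below illustrates one possible representation of such obstacle-to-scene associations as a simple lookup table; the identifiers and the `load_scene` callback are hypothetical names introduced for illustration.

```python
# Hypothetical sketch: associate a target obstacle (e.g. a mall entrance) with
# another 3D scene, so that hitting its designated position switches the flight
# into that scene. All identifiers are illustrative.

scene_portals = {
    # (scene_id, obstacle_id) -> scene to load next
    ("shenzhen_real", "mall_entrance_01"): "mall_interior_virtual",
}

def maybe_switch_scene(current_scene: str, hit_obstacle: str, load_scene):
    """If the hit obstacle is associated with another scene, load it and continue there."""
    target = scene_portals.get((current_scene, hit_obstacle))
    if target is not None:
        load_scene(target)       # user-imported virtual scenes can be registered too
        return target
    return current_scene

# Registering a user-created virtual scene against a user-specified obstacle:
# scene_portals[("shenzhen_real", "my_billboard")] = "my_custom_track"
```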
  • In addition, image enhancement processing can be performed on the imagery of the target real 3D scene, and the enhanced target real 3D scene can then be displayed on the display interface, giving users a better visual effect.
  • image enhancement processing such as super-resolution, color enhancement, denoising, and sharpening can be performed on the texture of the model in the target real 3D scene.
  • Refined (high-detail) models of common obstacles can also be built in advance.
  • For example, a refined model can be prepared for trees, and refined models of generic buildings can be prepared for ordinary, non-landmark buildings. After the target real 3D scene is acquired, the obstacles in it can be identified, and the model of a recognized obstacle can be replaced with the corresponding pre-built refined model.
  • For example, all obstacles identified as trees can be replaced with any of the pre-built refined tree models, and ordinary buildings can be replaced with any of the pre-built refined models of generic buildings. This improves the level of detail of the overall scene without breaking consistency with the real scene, providing users with a more refined flight environment.
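  • A minimal sketch of such a replacement step, assuming obstacles have already been classified, is shown below; the asset registry, field names, and file paths are hypothetical.

```python
# Hypothetical sketch: swap coarse reconstructed meshes for pre-built refined
# models of the same category. The asset registry below is illustrative.

import random

REFINED_MODELS = {
    "tree": ["assets/tree_fine_a.glb", "assets/tree_fine_b.glb"],
    "generic_building": ["assets/building_fine_a.glb"],
}

def refine_scene(obstacles: list) -> list:
    """Replace recognized common obstacles with refined model assets."""
    for obs in obstacles:
        candidates = REFINED_MODELS.get(obs["category"])
        if candidates:
            obs["mesh"] = random.choice(candidates)   # any pre-built fine model will do
            obs["refined"] = True
    return obstacles

# Texture enhancement (super-resolution, denoising, sharpening, colour) would be
# applied to each obstacle's texture with an image-processing pipeline before display.
```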
  • In addition, a designated area in the target real 3D scene can be set as a restricted flight (no-fly) area.
  • When the racing drone enters the restricted flight area, a countermeasure event against it can be triggered, for example a pursuit event in which at least one police drone is generated in the target real 3D scene to chase the racing drone. The pursuit makes the flight simulation more fun and also reminds users to pay attention to restricted flight areas in the real world.
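  • The following sketch, under assumed rectangular zone geometry and a naive chase rule, shows how such a restricted area and pursuit event might be wired together; the class names and the 0.1 pursuit gain are illustrative.

```python
# Hypothetical sketch: mark a rectangular region of the target scene as a
# restricted flight area and spawn a "police drone" when the racing drone enters it.

from dataclasses import dataclass, field

@dataclass
class NoFlyZone:
    xmin: float
    ymin: float
    xmax: float
    ymax: float

    def contains(self, x: float, y: float) -> bool:
        return self.xmin <= x <= self.xmax and self.ymin <= y <= self.ymax

@dataclass
class World:
    zones: list = field(default_factory=list)
    police: list = field(default_factory=list)

def check_restricted(world: World, drone_pos):
    x, y, _ = drone_pos
    if any(zone.contains(x, y) for zone in world.zones) and not world.police:
        # Countermeasure event: generate at least one police drone at the intruder's position.
        world.police.append({"pos": list(drone_pos), "state": "pursuing"})
    for cop in world.police:
        # Naive pursuit: step each police drone a fraction of the way toward the racing drone.
        cop["pos"] = [c + 0.1 * (t - c) for c, t in zip(cop["pos"], drone_pos)]
```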
  • In addition, the control parameters of the racing drone can be adjusted according to the performance parameters of the racing drone selected by the user, such as the acceleration used when accelerating the target real 3D scene and the rotation speed of the drone when the viewing angle is adjusted.
  • In this way, the generated racing drone handles more like the real aircraft and brings a better training effect to the user.
  • Specifically, before the racing drone is generated in the target real three-dimensional scene, an assembly interface for the racing drone may be provided. The assembly interface may offer the various components of the racing drone for the user to select, such as the image transmission system (camera, digital video transmission, etc.), the power system (motors, ESCs, propellers, etc.), the frame, and the battery system. The corresponding performance parameters can then be calculated from the components selected by the user.
  • The model of the racing drone can also be adjusted according to the components and the personalized skin (texture) selected by the user, so that when the flight simulation switches to a third-person perspective, the user can see the personalized drone of their own design.
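  • As a rough illustration of deriving performance parameters from selected components, the sketch below uses made-up component data and a crude thrust-to-weight formula; none of the part names, weights, or formulas come from the disclosure.

```python
# Hypothetical sketch: derive performance parameters from components chosen in
# the assembly interface. Component specs and formulas are illustrative only.

MOTORS = {
    "2306_1700kv": {"thrust_g": 1200, "weight_g": 34},
    "2207_2450kv": {"thrust_g": 1500, "weight_g": 32},
}
BATTERIES = {"4s_1500mah": {"weight_g": 190}, "6s_1100mah": {"weight_g": 210}}
FRAMES = {"5in_freestyle": {"weight_g": 120}, "5in_race": {"weight_g": 95}}

def performance(motor: str, battery: str, frame: str, payload_g: float = 60.0) -> dict:
    thrust = 4 * MOTORS[motor]["thrust_g"]                       # quad: four motors
    weight = (4 * MOTORS[motor]["weight_g"] + BATTERIES[battery]["weight_g"]
              + FRAMES[frame]["weight_g"] + payload_g)
    twr = thrust / weight                                        # thrust-to-weight ratio
    return {
        "thrust_to_weight": round(twr, 2),
        "max_vertical_accel_ms2": round(9.81 * (twr - 1.0), 1),  # rough vertical acceleration
        "agility": "high" if FRAMES[frame]["weight_g"] < 100 else "medium",
    }

# Example: performance("2207_2450kv", "6s_1100mah", "5in_race")
```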
  • In addition, the real local time of the real location corresponding to the target location identifier can be obtained, and the lighting system of the flight simulation environment can be set to a day mode or a night mode according to that real time. Since the location selected by the user can be anywhere on earth, and locations far apart have time differences, switching the lighting system between day and night according to the real local time gives users a more realistic flight simulation experience.
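  • A minimal sketch of this day/night decision is shown below; it approximates the time zone from the longitude, which is only an assumption for illustration (a real implementation would query a proper time-zone database).

```python
# Hypothetical sketch: pick the day or night lighting mode from the real local
# time of the selected location, approximated here from its longitude.

from datetime import datetime, timedelta, timezone

def lighting_mode(longitude_deg: float, now_utc=None) -> str:
    """Return 'day' or 'night' for the location's approximate local time."""
    now_utc = now_utc or datetime.now(timezone.utc)
    offset_hours = round(longitude_deg / 15.0)        # crude time-zone estimate from longitude
    local = now_utc + timedelta(hours=offset_hours)
    return "day" if 6 <= local.hour < 18 else "night"

# Example: lighting_mode(114.06) chooses the mode for a location near Shenzhen.
```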
  • In addition, music corresponding to the target location identifier selected by the user may also be acquired and played.
  • the location type of the selected location can be determined, and the characteristic music corresponding to the location type can be obtained and played.
  • For example, if the location type of the selected place is determined to be a city, more fashionable music matching the city can be played; if the user chooses a place such as Inner Mongolia for flight simulation, the location type can be determined to be grassland, and expansive music matching the grassland can be played.
  • In this way, the user hears music in the style of each place while flying there, which adds to the fun of flight simulation.
  • In addition, a follow target can be generated in the target real 3D scene: a motion route corresponding to the target real three-dimensional scene is obtained, and the follow target is controlled to move along that route, with the user controlling the racing drone to chase the follow target.
  • The generated follow target can be any model, such as a character, animal, or vehicle model, which moves quickly along a specific route as the chase target of the user-controlled racing drone and helps the user practice flying skills.
  • motion routes can be generated for the real 3D scene corresponding to each different place in advance.
  • For example, if the user selects the real 3D scene of Shenzhen, the follow target is generated in that scene and controlled to move along the motion route corresponding to the Shenzhen real 3D scene, and at the same time the user can be prompted to chase the follow target.
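  • The sketch below shows one way a pre-authored route could drive the follow target's position over time; the route coordinates, scene identifier, and constant speed are illustrative assumptions.

```python
# Hypothetical sketch: move a follow target along a pre-authored route for the
# selected scene; the user practices by chasing it. Route data is illustrative.

ROUTES = {
    "shenzhen_real": [(0, 0, 30), (120, 40, 35), (260, -20, 25), (400, 10, 40)],
}

def follow_target_position(scene_id: str, t: float, speed: float = 15.0):
    """Return the follow target's position t seconds after the chase starts (linear segments)."""
    pts = ROUTES[scene_id]
    dist = speed * t
    for a, b in zip(pts, pts[1:]):
        seg = sum((bi - ai) ** 2 for ai, bi in zip(a, b)) ** 0.5
        if dist <= seg:
            f = dist / seg
            return tuple(ai + f * (bi - ai) for ai, bi in zip(a, b))
        dist -= seg
    return pts[-1]   # hold position at the end of the route
```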
  • In addition, a recording mode can be provided so that the user or other users can analyze the recorded flight controls.
  • the recording mode can be entered according to the recording instruction of the first user.
  • In the recording mode, the displayed FPV picture can be recorded, together with the starting point, initial attitude, and initial speed of the racing drone and all of the first user's control instructions for the drone during the flight; a record file can then be generated from the recorded information.
  • the record file can be uploaded to the server, and when the server receives a request from the second user to obtain the record file, the server can send the record file to the client of the second user.
  • The second user's client can reproduce the first user's FPV picture on the second user's display interface according to the record file, and synchronously display the first user's control inputs to the racing drone at each moment, so that the second user can study the first user's flying technique and improve their own skills.
  • Specifically, according to the record file, the second user's client can generate, at the recorded starting point, a racing drone corresponding to the one flown by the first user, set its initial attitude and speed as recorded, and then drive the generated drone with the first user's recorded control instructions, so that the second user's client reproduces the first user's FPV picture.
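  • A minimal sketch of such a record file and its replay is given below; the JSON layout, field names, and the `spawn_drone`/`apply_input` callbacks are assumptions, while the recorded fields (start point, initial attitude, initial speed, control inputs) follow the description above.

```python
# Hypothetical sketch of recording a flight and replaying it on another client.

import json

def make_record(start, attitude, speed, inputs):
    """inputs: list of (timestamp_s, throttle, yaw, pitch, roll) tuples."""
    return {"start": start, "attitude": attitude, "speed": speed, "inputs": inputs}

def save_record(record: dict, path: str):
    with open(path, "w") as f:
        json.dump(record, f)

def replay(path: str, spawn_drone, apply_input):
    """Second user's client: re-create the drone and feed it the recorded inputs."""
    with open(path) as f:
        rec = json.load(f)
    drone = spawn_drone(rec["start"], rec["attitude"], rec["speed"])
    for ts, thr, yaw, pitch, roll in rec["inputs"]:
        apply_input(drone, ts, thr, yaw, pitch, roll)  # reproduces the first user's FPV flight
```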
  • While the FPV picture is displayed, controls for throttle up, throttle down, pan left, pan right, pitch forward, pitch back, roll (tilt) left, and roll (tilt) right may be shown on the display interface. Each control displays, via a colored band, the stick amount and control duration required for the operation it represents, synchronized with the FPV picture; the color or length of the band is linked to the user's actual input on the physical or virtual joystick. For example, the closer the user's joystick input is to the stick amount required for the corresponding operation, the shorter the band becomes.
  • Since the loaded scene model is a real 3D scene corresponding to a real-world scene, it can provide a more diverse flight environment than the virtual 3D scenes provided by existing flight simulation software, making flight training more effective.
  • In addition, since the loaded scene model is obtained by calling the API of third-party map software, a great deal of modeling work is saved, and because the three-dimensional scenes provided by the third-party map software cover many locations, users can be given a broad flight environment.
  • FIG. 4 is a structural diagram of a simulation terminal provided by an embodiment of the present application.
  • The simulation terminal can include:
  • the display 410, which is used to display the first-person-view picture of the racing drone;
  • the processor implements the following steps when executing the computer program:
  • the earth model is displayed on the display interface through the display, wherein the earth model includes a plurality of location identifiers, and the location identifiers are used to identify real three-dimensional scenes corresponding to different locations;
  • the target location identifier is determined, and the target real three-dimensional scene corresponding to the target location identifier is loaded;
  • the target real three-dimensional scene is accelerated, its viewing angle is adjusted, or its scene is updated, so that the display shows the first-person-view picture corresponding to the operation of the virtual joystick or the physical joystick of the remote controller.
  • the computer program stored in the memory may be a program corresponding to flight simulation software, and when the computer program is executed by the processor, the flight simulation software may be run, and the above-mentioned method steps may be implemented.
  • In one embodiment, when displaying the earth model on the display interface through the display, the processor is configured to import the earth model of the third-party map software by calling the API of the third-party map software, and to display the imported earth model on the display interface of the simulation terminal.
  • In one embodiment, when loading the target real three-dimensional scene corresponding to the target location identifier, the processor is configured to send the determined target location identifier to the third-party map software, receive the target real three-dimensional scene corresponding to the target location identifier returned by the third-party map software, and load the target real three-dimensional scene.
  • In one embodiment, the target real three-dimensional scene is established in the following manner:
  • multi-angle photography is performed on the real scene corresponding to the target location identifier to obtain multi-angle images of the real scene;
  • the multi-angle images are fused through a three-dimensional reconstruction algorithm to obtain the target real three-dimensional scene corresponding to the real scene.
  • In one embodiment, the processor is further configured to identify obstacles in the target real three-dimensional scene and assign collision volumes to the identified obstacles, so that when the racing drone collides with an obstacle, any one of the following events is triggered: the racing drone crashes, the damage level of the racing drone increases, or the racing drone returns to its position before the collision.
  • In one embodiment, the processor is further configured to determine the category of each obstacle in the target real three-dimensional scene and to assign, through a physics engine, physical attributes corresponding to that category to each obstacle, so that when the racing drone collides with an obstacle, the obstacle is triggered to move and/or deform according to its assigned physical attributes.
  • In one embodiment, the processor is further configured to identify obstacles in the target real three-dimensional scene and associate a target obstacle with another real or virtual three-dimensional scene, so that when the racing drone collides with the designated position of the target obstacle, loading of the other three-dimensional scene is triggered and the flight simulation continues in that scene.
  • the processor is further configured to perform image enhancement processing on the image in the target real three-dimensional scene.
  • the image enhancement processing includes one or more of the following: super-resolution, color enhancement, denoising, and sharpening.
  • In one embodiment, the processor is further configured to set a designated area in the target real 3D scene as a flight-restricted area, and to trigger a countermeasure event against the racing drone when it enters the flight-restricted area.
  • In one embodiment, when accelerating the target real three-dimensional scene or adjusting its viewing angle, the processor is configured to do so according to the performance parameters of the racing drone selected by the user.
  • In one embodiment, the performance parameters of the racing drone are determined in the following manner: the corresponding performance parameters are calculated according to the components of the racing drone selected by the user.
  • Since the loaded scene model is a real 3D scene corresponding to a real-world scene, it can provide a more diverse flight environment than the virtual 3D scenes provided by existing flight simulation software, making flight training more effective.
  • In addition, since the loaded scene model is obtained by calling the API of third-party map software, a great deal of modeling work is saved, and because the three-dimensional scenes provided by the third-party map software cover many locations, users can be given a broad flight environment.
  • Embodiments of the present application further provide a computer-readable storage medium that stores a computer program; when the computer program is executed by a processor, it implements the flight simulation method for a racing drone provided by the embodiments of the present application.
  • Embodiments of the present application may take the form of a computer program product implemented on one or more storage media (including, but not limited to, disk storage, CD-ROM, optical storage, etc.) having program code embodied therein.
  • Computer-usable storage media includes permanent and non-permanent, removable and non-removable media, and storage of information can be accomplished by any method or technology.
  • Information may be computer readable instructions, data structures, modules of programs, or other data.
  • Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read only memory (ROM), Electrically Erasable Programmable Read Only Memory (EEPROM), Flash Memory or other memory technology, Compact Disc Read Only Memory (CD-ROM), Digital Versatile Disc (DVD) or other optical storage, Magnetic tape cassettes, magnetic tape magnetic disk storage or other magnetic storage devices or any other non-transmission medium that can be used to store information that can be accessed by a computing device.

Abstract

The present invention relates to a flight simulation method for a racing drone, the method comprising the steps of: displaying an earth model on a display interface of a simulation terminal (S210), the earth model comprising a plurality of location identifiers, the location identifiers being used to identify real three-dimensional scenes corresponding to different locations; determining a target location identifier according to a user operation on the location identifiers of the earth model (S220); loading a target real three-dimensional scene corresponding to the target location identifier (S230); and performing acceleration, viewing-angle adjustment, or scene-update processing on the target real three-dimensional scene according to the user's operation of a virtual joystick or a physical joystick of a remote control device (S240), so as to display a first-person-view picture corresponding to the operation of the virtual joystick or the physical joystick of the remote control device. Compared with the virtual three-dimensional scenes provided by existing flight simulation software, the method can provide more diverse flight environments, thereby improving the flight training effect.
PCT/CN2020/116621 2020-09-21 2020-09-21 Procédé de simulation de vol et terminal pour drone de course WO2022056933A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/CN2020/116621 WO2022056933A1 (fr) 2020-09-21 2020-09-21 Procédé de simulation de vol et terminal pour drone de course
CN202080035107.3A CN113826149A (zh) 2020-09-21 2020-09-21 用于穿越机的飞行模拟方法及模拟终端

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2020/116621 WO2022056933A1 (fr) 2020-09-21 2020-09-21 Procédé de simulation de vol et terminal pour drone de course

Publications (1)

Publication Number Publication Date
WO2022056933A1 true WO2022056933A1 (fr) 2022-03-24

Family

ID=78924224

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/116621 WO2022056933A1 (fr) 2020-09-21 2020-09-21 Procédé de simulation de vol et terminal pour drone de course

Country Status (2)

Country Link
CN (1) CN113826149A (fr)
WO (1) WO2022056933A1 (fr)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117271045A (zh) * 2023-11-22 2023-12-22 北自所(北京)科技发展股份有限公司 基于数字孪生的设备信息展示方法、装置及电子设备


Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20200066179A1 (en) * 2013-08-30 2020-02-27 Insitu, Inc. Unmanned vehicle simulation
CN106796761A (zh) * 2014-09-30 2017-05-31 深圳市大疆创新科技有限公司 用于支持模拟移动的系统和方法
CN106530896A (zh) * 2016-11-30 2017-03-22 中国直升机设计研究所 一种用于无人机飞行演示的虚拟系统
CN106530894A (zh) * 2017-01-10 2017-03-22 北京捷安申谋军工科技有限公司 一种采用增强现实技术的飞行训练器虚拟平显方法及系统
CN108496121A (zh) * 2017-08-25 2018-09-04 深圳市大疆创新科技有限公司 无人机仿真飞行系统、方法、设备及机器可读存储介质
CN107885096A (zh) * 2017-10-16 2018-04-06 中国电力科学研究院 一种无人机巡检航迹三维仿真监控系统
CN109448108A (zh) * 2018-10-17 2019-03-08 中国人民解放军陆军航空兵学院陆军航空兵研究所 直升机飞行模拟器的成像系统及成像方法

Also Published As

Publication number Publication date
CN113826149A (zh) 2021-12-21

Similar Documents

Publication Publication Date Title
EP3882870B1 (fr) Procédé et dispositif d'affichage d'image, support de stockage et dispositif électronique
US9704295B2 (en) Construction of synthetic augmented reality environment
CN106683195B (zh) 一种基于室内定位的ar场景渲染方法
CN112076473B (zh) 虚拟道具的控制方法、装置、电子设备及存储介质
WO2021227864A1 (fr) Procédé et appareil d'affichage de scène virtuelle, support de stockage et dispositif électronique
CN106951561A (zh) 基于虚拟现实技术与gis数据的电子地图系统
US20240095998A1 (en) Virtual scene playback method and apparatus, electronic device, and storage medium
US20230073750A1 (en) Augmented reality (ar) imprinting methods and systems
CN111142967B (zh) 一种增强现实显示的方法、装置、电子设备和存储介质
WO2022056933A1 (fr) Procédé de simulation de vol et terminal pour drone de course
CN108399653A (zh) 增强现实方法、终端设备及计算机可读存储介质
CN113230652B (zh) 虚拟场景变换方法、装置、计算机设备及存储介质
US11710286B2 (en) Virtual object kit
KR20170013539A (ko) 증강현실 기반의 게임 시스템 및 방법
WO2014132988A1 (fr) Dispositif de traitement d'informations et procédé de traitement d'informations
CN111569414A (zh) 虚拟飞行器的飞行展示方法、装置、电子设备及存储介质
CN113426110B (zh) 虚拟角色交互方法、装置、计算机设备和存储介质
CN113313796B (zh) 场景生成方法、装置、计算机设备和存储介质
JP7463509B2 (ja) 検索結果における目標ゲームの目立つ表示
CN109254660B (zh) 内容显示方法、装置及设备
Xiao et al. Research and design of digital library based on virtual reality
CN115631320B (zh) 预计算单元格显示方法、预计算单元格生成方法及装置
WO2024051487A1 (fr) Procédé et appareil de traitement de paramètres pour caméra virtuelle, dispositif, support de stockage et produit programme
JP7464175B1 (ja) 仮想空間制御装置、仮想空間制御方法、およびプログラム
CN117389338B (zh) 无人机的多视角交互方法、装置及存储介质

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20953791

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20953791

Country of ref document: EP

Kind code of ref document: A1