CN115930991A - Navigation method and system applied to aircraft and aircraft - Google Patents

Navigation method and system applied to aircraft and aircraft

Info

Publication number
CN115930991A
Authority
CN
China
Prior art keywords
image
aircraft
navigation
information
augmented reality
Prior art date
Legal status
Pending
Application number
CN202211645514.3A
Other languages
Chinese (zh)
Inventor
刘恒兴
周达威
苏晓
王冬妮
Current Assignee
Guangdong Huitian Aerospace Technology Co Ltd
Original Assignee
Guangdong Huitian Aerospace Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Guangdong Huitian Aerospace Technology Co Ltd filed Critical Guangdong Huitian Aerospace Technology Co Ltd
Priority to CN202211645514.3A
Publication of CN115930991A
Pending legal-status Critical Current

Abstract

The application discloses a navigation method and system applied to an aircraft, and an aircraft. The method comprises the following steps: acquiring an environment image of the aircraft; acquiring pose information of the aircraft and spatial position information of a navigation element, and generating an augmented reality image based on the pose information of the aircraft and the spatial position information of the navigation element, wherein the augmented reality image comprises an image identifier of the navigation element, and the navigation element is an element that influences the flight path of the aircraft in the air; synthesizing the environment image and the augmented reality image to obtain a navigation image; and displaying the navigation image of the aircraft. The technical solution provided by the embodiments of the application enables the driver to intuitively understand the driving environment of the aircraft, provides more intuitive driving guidance, makes it easier for the driver to control the aircraft during navigation, and helps improve driving safety.

Description

Navigation method and system applied to aircraft and aircraft
Technical Field
The present application relates to the field of aircraft technologies, and in particular, to a navigation method and system applied to an aircraft, an aircraft, and a storage medium.
Background
With the rapid development of vehicles, navigation technology is also widely used. Currently, providing navigation to a vehicle traveling on the ground is one of the most important applications of navigation technology.
Navigation for a vehicle traveling on the ground typically works as follows: the navigation application acquires the driving environment of the vehicle (including road condition information and obstacle information), acquires the driving parameters of the vehicle (such as position, speed, acceleration, and steering angle), and gives navigation guidance based on these two kinds of information.
The above-described solution for providing navigation to a vehicle traveling on the ground is not suitable for an aircraft.
Disclosure of Invention
The application provides a navigation method and system applied to an aircraft and the aircraft.
In a first aspect, an embodiment of the present application provides a navigation method applied to an aircraft, where the method includes: acquiring an environment image of an aircraft; acquiring pose information of the aircraft and spatial position information of a navigation element, and generating an augmented reality image based on the pose information of the aircraft and the spatial position information of the navigation element, wherein the augmented reality image comprises an image identifier of the navigation element, and the navigation element is an element influencing a flight path of the aircraft in the air; synthesizing the environment image and the augmented reality image to obtain a navigation image; displaying a navigation image of the aircraft.
In a second aspect, a navigation system applied to an aircraft in an embodiment of the present application includes an image acquisition subsystem, a three-dimensional space positioning subsystem, an image synthesis subsystem, and an image display subsystem; the image acquisition subsystem is used for acquiring an environment image corresponding to the aircraft; the three-dimensional space positioning subsystem is used for acquiring pose information of the aircraft and spatial position information of a navigation element, and generating an augmented reality image based on the pose information of the aircraft and the spatial position information of the navigation element, wherein the augmented reality image comprises an image identifier of the navigation element, and the navigation element is an element influencing a flight path of the aircraft in the air; the image synthesis subsystem is used for synthesizing the environment image and the augmented reality image to obtain a navigation image of the aircraft; and the image display subsystem is used for displaying the navigation image of the aircraft.
In a third aspect, an embodiment of the present application provides an aircraft, including: one or more processors; a memory; one or more applications, wherein the one or more applications are stored in the memory and configured to be executed by the one or more processors, the one or more applications configured to perform the navigation method as applied to the aircraft as described in the first aspect.
In a fourth aspect, embodiments of the present application provide a computer-readable storage medium having stored therein computer program instructions that are invokable by a processor to perform a navigation method as applied to an aircraft according to the first aspect.
In a fifth aspect, the present application provides a computer program product for implementing the navigation method applied to an aircraft according to the first aspect when the computer program product is executed.
Compared with the prior art, in the navigation method applied to an aircraft provided by the embodiments of the application, an environment image of the aircraft is acquired, an augmented reality image is generated based on the aircraft's own pose information and the spatial position information of the navigation element, and the two images are then synthesized into a navigation image in which the virtual image identifiers of the navigation elements are superimposed on the real navigation environment of the aircraft. Displaying this navigation image lets the driver intuitively understand the driving environment of the aircraft, gives the driver more intuitive driving guidance, makes it easier for the driver to control the aircraft during navigation, and helps improve driving safety.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed in the description of the embodiments are briefly introduced below. The drawings described below are only some embodiments of the present application; those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1 is a schematic diagram of an implementation environment provided by an embodiment of the present application.
FIG. 2 is a system block diagram of an aircraft provided by an embodiment of the present application.
Fig. 3 is a flowchart of a navigation method applied to an aircraft according to an embodiment of the present application.
FIG. 4 is a schematic diagram of a composite navigation image provided by one embodiment of the present application.
Fig. 5 is a flowchart of a navigation method applied to an aircraft according to another embodiment of the present application.
Fig. 6 is a block diagram illustrating a navigation system applied to an aircraft according to an embodiment of the present application.
FIG. 7 is a block diagram of an aircraft according to an embodiment of the present application.
FIG. 8 is a block diagram of a computer storage medium provided in one embodiment of the present application.
Detailed Description
Reference will now be made in detail to embodiments of the present application, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below with reference to the accompanying drawings are exemplary only for the purpose of explaining the present application and are not to be construed as limiting the present application.
In order to make the technical solutions of the present application better understood by those skilled in the art, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application. It is to be understood that the described embodiments are only some, not all, of the embodiments of the present application. All other embodiments obtained by a person skilled in the art based on the embodiments given herein without creative effort shall fall within the protection scope of the present application. The term aircraft as used herein includes, but is not limited to, aircraft in the conventional sense and other flying vehicles.
Referring to fig. 1, a schematic diagram of an implementation environment provided by an embodiment of the application is shown. The implementation environment includes an aircraft 100. The aircraft 100 refers to a vehicle driven or towed by a power plant and used to carry persons or transport goods.
In some embodiments, referring to fig. 2 in combination, an image capture device 20 is disposed on the aircraft 100, and the image capture device 20 may be a monocular camera, a binocular camera, a panoramic camera, or the like. The embodiment of the present application does not limit the position at which the image capture device 20 is disposed on the aircraft 100. In the embodiment of the present application, the controller of the aircraft 100 further includes a camera system 30 matched with the image capture device 20, and the camera system 30 is configured to process the data captured by the image capture device 20 to output an environment image (i.e., a camera picture).
In some embodiments, the aircraft 100 includes a radar 40, by which the distances and directions of markers, the ground, obstacles (trees, mountains, signal lights, buildings, rivers, and the like), and other objects relative to the aircraft 100 may be detected. The radar 40 includes, but is not limited to: laser radar, millimeter-wave radar, ultrasonic radar, and the like.
In some embodiments, the aircraft 100 includes an inertial navigation element 50, where the inertial navigation element 50 includes an accelerometer, an angular velocity meter (gyroscope), and a barometer. The accelerometer may measure the acceleration of the aircraft 100. The angular velocity meter can be used to calculate the pitch angle and roll angle of the aircraft 100: the pitch angle is the angle between the X axis of the body coordinate system of the aircraft 100 and the horizontal plane, and the roll angle is the lateral tilt angle, that is, the angle by which the body rotates about its fore-aft (longitudinal) axis, measured as the angle between the lateral axis of the body and the horizontal plane. The barometer may detect the altitude of the aircraft 100. In some embodiments, the aircraft 100 includes a compass that can detect the yaw angle of the aircraft 100, which is the angle between the longitudinal axis of the aircraft 100 and true north.
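The disclosure attributes the pitch and roll calculation to the angular velocity meter; purely for illustration, the minimal sketch below uses a common alternative that derives pitch and roll from the gravity vector measured by the accelerometer and yaw from the compass heading. The axis convention, function names, and sample values are assumptions, not part of the disclosure.

```python
import math

def pitch_roll_from_accel(ax, ay, az):
    """Estimate pitch and roll (radians) from a static accelerometer reading.

    Assumes a body frame with X forward, Y right, Z down, and that the aircraft
    is not accelerating, so the measured specific force is gravity only.
    """
    pitch = math.atan2(-ax, math.sqrt(ay * ay + az * az))
    roll = math.atan2(ay, az)
    return pitch, roll

def yaw_from_compass(heading_deg):
    """Yaw: angle between the aircraft's longitudinal axis and true north."""
    return math.radians(heading_deg % 360.0)

# Example: level flight heading due east
print(pitch_roll_from_accel(0.0, 0.0, 9.81))   # ~ (0.0, 0.0)
print(yaw_from_compass(90.0))                  # ~ 1.5708 rad
```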
In some embodiments, the controller of the aircraft 100 further includes an autopilot system 60, and the autopilot system 60 processes data respectively acquired by the image capture device, the radar, and the inertial navigation element to obtain the pose information of the navigation elements. Navigation elements include, but are not limited to: the navigation path of the aircraft 100, track points in the navigation path, obstacles, and electronic fences. The pose information of a navigation element includes, but is not limited to: the absolute coordinates of the navigation element (including longitude, latitude, and altitude) and the distance and direction of the navigation element relative to the aircraft 100.
In some embodiments, the controller of the aircraft 100 further includes a flight control system 70, and the flight control system 70 processes data respectively collected by the radar and the inertial navigation element to obtain the pose information of the aircraft 100. The pose information of the aircraft 100 includes absolute coordinates, a pitch angle, a yaw angle, and a roll angle of the aircraft 100.
In some embodiments, a three-dimensional spatial positioning system 80 is also included in the controller of the aircraft 100, and the three-dimensional spatial positioning system 80 is configured to generate an augmented reality image based on the output data of the autopilot system 60 and the output data of the flight control system 70, the augmented reality image including the image identifiers of the navigation elements. In other possible implementations, the implementation environment further includes a head-mounted augmented reality device that establishes a communication connection with the aircraft 100; it can receive the pose information of the navigation elements, the pose information of the aircraft 100, and the environment image captured by the image capture device 20, all sent by the aircraft 100, and then generate the augmented reality image based on the received data. The head-mounted augmented reality device may be augmented reality glasses, an augmented reality helmet, or the like.
In some embodiments, the aircraft 100 further includes a picture composition system 90, the picture composition system 90 for composing the navigation image based on the environmental image and the augmented reality image.
In some embodiments, the aircraft 100 also includes a positioning system, which may be an Assisted Global Positioning System (AGPS) utilizing cellular base stations, a Real-Time Kinematic (RTK) system based on carrier-phase observations, or the like.
In some embodiments, the aircraft 100 also includes a display module, which may be a central control screen of the aircraft 100. In other embodiments, the aircraft 100 also includes a voice module, which may be a speaker, for playing voice navigation information. In other embodiments, the aircraft 100 further includes a voice capture module, which may be a microphone, for receiving voice instructions.
In the embodiments of the present application, the aircraft 100 acquires an environment image, acquires its own pose information (including spatial position information and attitude information) and the spatial position information of the navigation elements, and generates an augmented reality image containing the image identifiers of the navigation elements. The environment image and the augmented reality image are then synthesized to obtain a navigation image in which the virtual image identifiers of the navigation elements are superimposed on the real environment of the aircraft 100. Through the navigation image, the driver can more intuitively understand the environment information and driving conditions of the aircraft 100, which gives the driver more intuitive driving guidance and helps improve driving safety.
Referring to fig. 3, a flowchart of a navigation method applied to an aircraft according to an embodiment of the present application is shown. The method comprises the following steps S301-S304.
Step S301, an environment image of the aircraft is acquired.
The environment image of the aircraft is used to describe the flight environment of the aircraft. The controller acquires an environmental image acquired by an image acquisition device in the aircraft. Specifically, the image acquisition device acquires an environment image and stores the environment image to a designated storage path, and the controller reads the environment image from the designated storage path.
In one possible implementation, the image acquisition device acquires the environmental image every first predetermined period after the aircraft starts to sail.
In another possible implementation manner, after the aircraft starts to sail, if it is determined that the augmented reality navigation function is in an on state, the image acquisition device acquires the environment image every first predetermined period. Optionally, the aircraft stores a flag bit of the augmented reality navigation function, and if the value of the flag bit is a preset value, this indicates that the augmented reality navigation function is on. The preset value may be 1. The user can set the augmented reality navigation function to the on state on the operation panel of the aircraft, and the aircraft then sets the flag bit of the augmented reality navigation function to the preset value based on this setting, for subsequent querying.
In another possible implementation manner, if an augmented reality navigation instruction is received after the aircraft starts to sail, the image acquisition device acquires the environment image every first predetermined period. The augmented reality navigation instruction may be a first trigger signal for a specified virtual control displayed on the central control screen of the aircraft, a second trigger signal for a specified physical control on the operation console of the aircraft, or a voice signal containing a specified keyword. The specified keyword can be set by the aircraft by default or customized by the driver. Optionally, an augmented reality navigation application is installed on the controller of the aircraft; its user interface includes a destination input box and a navigation control, and the controller receives the augmented reality navigation instruction after the user enters a destination in the input box and triggers the navigation control.
The first predetermined period is determined by the refresh rate of the navigation image. For example, if the navigation image is refreshed every 0.5 seconds, the first predetermined period is 0.5 seconds, that is, the image acquisition device also needs to capture the environment image every 0.5 seconds. The refresh rate of the navigation image can be set by the aircraft by default or customized by the driver.
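For illustration only, a minimal sketch of tying image capture to the refresh period and the augmented reality navigation flag described above; the camera API, flag accessor, and storage path are placeholders and are not part of the disclosure.

```python
import time

REFRESH_PERIOD_S = 0.5  # first predetermined period, tied to the navigation-image refresh rate
AR_NAV_ON = 1           # preset value meaning "augmented reality navigation is on"

def save_to_storage_path(frame, path="/tmp/env_image.png"):
    frame.save(path)                          # assumes the frame object exposes save()

def capture_loop(camera, get_ar_nav_flag, is_sailing):
    """Capture an environment image every REFRESH_PERIOD_S while AR navigation is on."""
    while is_sailing():
        if get_ar_nav_flag() == AR_NAV_ON:
            frame = camera.capture()          # placeholder camera API
            save_to_storage_path(frame)       # the controller later reads it from this path
        time.sleep(REFRESH_PERIOD_S)
```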
Step S302, acquiring pose information of the aircraft and spatial position information of the navigation elements, and generating an augmented reality image based on the pose information of the aircraft and the spatial position information of the navigation elements.
The pose information of the aircraft comprises absolute coordinates, a pitch angle, a yaw angle and a roll angle of the aircraft. The process of acquiring pose information of an aircraft will be explained in the following embodiments.
A navigational element refers to an element that affects the flight path of an aircraft in the air. The navigation elements include at least one of: a navigation path of the aircraft, waypoints in the navigation path, obstacles, and an electronic fence. The navigation path of the aircraft may be a path planned in real time by the path planning module. The path points in the navigation path may be path points obtained by sampling the navigation path according to a predetermined distance. The obstacles may be signal lights, buildings, mountains, trees, flying birds, etc. The electronic fence refers to a no-fly zone identification regulated by a government agency or a commercial institution. The number of navigation elements may be one or more. The navigation elements may be all navigation elements detected by the aircraft, all navigation elements within the acquisition range of the image acquisition device, or all navigation elements detected by the aircraft and within the acquisition range of the image acquisition device.
The spatial position information of the navigation element includes absolute coordinates of the navigation element, a distance between the navigation element and the aircraft, and an angle of the navigation element with respect to the aircraft. The process of acquiring the spatial position information of the navigation element will be explained in the following embodiments.
The augmented reality image includes image identifications of navigation elements. The image identification of the navigation element is used to uniquely identify the navigation element. The generation process of the augmented reality image will be explained in the following embodiments.
The controller acquires the augmented reality image corresponding to the aircraft every second predetermined period; the second predetermined period is likewise determined by the refresh rate of the navigation image. For example, if the navigation image is refreshed every 0.5 seconds, the second predetermined period is 0.5 seconds, that is, the controller acquires the augmented reality image every 0.5 seconds. The first predetermined period and the second predetermined period may be the same or different, which is not limited in the embodiments of the present application.
The execution sequence of step S301 and step S302 is not limited in the embodiment of the present application. The aircraft may perform step S301 first and then step S302; step S302 may be executed first, and then step S301 may be executed; step S301 and step S302 may also be performed simultaneously.
Step S303, the environment image and the augmented reality image are synthesized to obtain a navigation image of the aircraft.
In the embodiments of the application, the aircraft synthesizes the environment image and the augmented reality image to obtain a navigation image in which the image identifiers of the virtual navigation elements are superimposed on the real environment of the aircraft. Through the navigation image, the driver can more intuitively understand the environment information and driving conditions of the aircraft, which helps improve driving safety.
In some embodiments, the size of the augmented reality image is the same as the size of the environment image, and the transparency of the background region of the augmented reality image is a preset value. The augmented reality image has the same size as the environment image, which means that the augmented reality image and the environment image have the same number of pixels in length and width. The preset value can be set experimentally or empirically, for example, the preset value can be 100%, that is, the background region of the augmented reality image is completely transparent.
In this embodiment, step S303 is specifically implemented as: displaying the environment image and the augmented reality image in an overlapping manner to obtain the navigation image. Since the background region of the augmented reality image is completely transparent, the driver can view the environment image through the background region of the augmented reality image.
With combined reference to fig. 4, a schematic diagram of image synthesis provided by an embodiment of the present application is shown. The image acquisition device of the aircraft acquires an environment image 41 in real time, generates an augmented reality image 42 based on pose information of the aircraft and spatial position information of the navigation elements, and synthesizes the two to obtain a navigation image 43 in which image identifiers of virtual navigation elements are superimposed on the basis of the real environment of the aircraft.
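For illustration only, a minimal sketch of this overlay step using Pillow, assuming the augmented reality image is an RGBA canvas whose background alpha is 0 (fully transparent) and whose size matches the environment image. The file names, marker position, and marker style are assumptions, not part of the disclosure.

```python
from PIL import Image, ImageDraw

# Environment image captured by the camera (RGB) and a same-size transparent AR canvas.
env = Image.open("environment.jpg").convert("RGBA")
ar = Image.new("RGBA", env.size, (0, 0, 0, 0))    # background fully transparent

# Draw an example image identifier for one navigation element (an obstacle marker).
draw = ImageDraw.Draw(ar)
u, v = 620, 310                                    # pixel coordinates from the projection step
draw.ellipse((u - 12, v - 12, u + 12, v + 12), outline=(255, 0, 0, 255), width=3)
draw.text((u + 16, v - 8), "obstacle 200 m", fill=(255, 0, 0, 255))

# Synthesize: the environment shows through the transparent background of the AR image.
navigation_image = Image.alpha_composite(env, ar)
navigation_image.convert("RGB").save("navigation.jpg")
```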
Step S304, the navigation image of the aircraft is displayed.
In some embodiments, the aircraft includes a central control screen, and the navigation image of the aircraft is displayed through the central control screen. In other embodiments, the aircraft may project the navigation image onto transparent glass (such as the front windshield) of the aircraft.
In some embodiments, the aircraft may further generate navigation prompt information based on the pose information of the aircraft and the spatial position information of the navigation element, and then superimpose the navigation prompt information on the navigation image and/or play it by voice. The prompt content of the navigation prompt information includes, but is not limited to: obstacle warnings, turn alerts, distance alerts, and the like. Taking an obstacle warning as an example, the navigation prompt information may be "there is an obstacle 200 meters ahead, please avoid it". Optionally, the aircraft may obtain the pixel coordinates of the image identifier of each navigation element in the navigation image, determine a pop-up window area that does not overlap the image identifier of any navigation element based on those pixel coordinates, and then display the navigation prompt information in the pop-up window area, so that the prompt does not block the image identifiers of the navigation elements.
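For illustration only, a minimal sketch of choosing a pop-up window area that does not overlap any image identifier; the bounding-box representation, candidate corners, and sizes are assumptions, not part of the disclosure.

```python
def boxes_overlap(a, b):
    """Axis-aligned overlap test; boxes are (x0, y0, x1, y1) in pixels."""
    return not (a[2] <= b[0] or b[2] <= a[0] or a[3] <= b[1] or b[3] <= a[1])

def choose_popup_area(image_size, marker_boxes, popup_size=(320, 80), margin=10):
    """Return the first corner-anchored region that overlaps no marker box, else None."""
    w, h = image_size
    pw, ph = popup_size
    candidates = [
        (margin, margin, margin + pw, margin + ph),                  # top-left
        (w - pw - margin, margin, w - margin, margin + ph),          # top-right
        (margin, h - ph - margin, margin + pw, h - margin),          # bottom-left
        (w - pw - margin, h - ph - margin, w - margin, h - margin),  # bottom-right
    ]
    for box in candidates:
        if not any(boxes_overlap(box, m) for m in marker_boxes):
            return box
    return None

area = choose_popup_area((1280, 720), [(600, 290, 660, 340)])
# -> (10, 10, 330, 90): draw "there is an obstacle 200 meters ahead, please avoid it" here
```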
In summary, in the technical solution provided by the embodiments of the application, an environment image of the aircraft is acquired, an augmented reality image is generated based on the aircraft's own pose information and the spatial position information of the navigation elements, and the two images are synthesized into a navigation image in which the image identifiers of the virtual navigation elements are superimposed on the real navigation environment of the aircraft. Displaying this navigation image lets the driver intuitively understand the driving environment of the aircraft, provides more intuitive driving guidance, makes it easier for the driver to control the aircraft during navigation, and helps improve driving safety.
Fig. 5 is a flowchart of a navigation method applied to an aircraft according to another embodiment of the present application; the method includes the following steps S501 to S507.
Step S501, an environment image of the aircraft is acquired.
Step S502, pose information of the aircraft is acquired.
The pose information of the aircraft includes absolute coordinates (longitude and latitude and altitude), a pitch angle, a roll angle, a yaw angle, and the like of the aircraft.
The absolute coordinates of the aircraft can be obtained in several ways. In a first possible implementation, the aircraft acquires its absolute coordinates by means of the positioning module. In a second possible implementation, when the aircraft detects a landmark object (such as a building or a beacon) by radar, the absolute coordinates of the aircraft are calculated based on the point cloud data of the landmark object (which characterizes the distance and direction of the landmark object relative to the aircraft) and the absolute coordinates of the landmark object. In a third possible implementation, when a landmark object is captured by the image acquisition device, the distance and direction of the landmark object relative to the aircraft may be determined from the field angle parameter of the image acquisition device and the pixel coordinates of the landmark object in the image, and the absolute coordinates of the aircraft may then be determined from the absolute coordinates of the landmark object together with that distance and direction.
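For illustration only, a minimal sketch of the third implementation. The disclosure does not give formulas; the sketch assumes a simple linear field-of-view model for the bearing, a landmark of known physical width for the pinhole distance estimate, and an equirectangular approximation for shifting coordinates. All parameter values are assumptions.

```python
import math

EARTH_RADIUS_M = 6371000.0

def bearing_offset_from_pixel(u, image_width, hfov_deg):
    """Angle (deg) of a pixel column from the optical axis, simple linear FOV model."""
    return (u - image_width / 2.0) / image_width * hfov_deg

def distance_from_known_width(pixel_width, real_width_m, focal_px):
    """Pinhole model: an object of known physical width subtends pixel_width pixels."""
    return focal_px * real_width_m / pixel_width

def aircraft_position_from_landmark(lm_lat, lm_lon, distance_m, bearing_to_lm_deg):
    """Shift the landmark position back along the line of sight (equirectangular approx.)."""
    b = math.radians(bearing_to_lm_deg)
    dn = -distance_m * math.cos(b)        # the aircraft lies 'behind' the landmark
    de = -distance_m * math.sin(b)
    lat = lm_lat + math.degrees(dn / EARTH_RADIUS_M)
    lon = lm_lon + math.degrees(de / (EARTH_RADIUS_M * math.cos(math.radians(lm_lat))))
    return lat, lon

# Example: landmark seen 50 px right of centre in a 1280-px-wide, 90-degree-HFOV frame
offset = bearing_offset_from_pixel(690, 1280, 90.0)      # ~3.5 degrees
dist = distance_from_known_width(40, 20.0, 640.0)        # 320 m for a 20 m wide building
print(aircraft_position_from_landmark(22.54, 113.95, dist, 30.0 + offset))
```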
It should be noted that the aircraft may select the manner of determining its absolute coordinates according to factors affecting positioning accuracy. For example, when the aircraft determines that the signal strength of the positioning module is lower than a preset strength, it selects the second or third manner to determine its absolute coordinates. For another example, when the current weather is determined to be rain, snow, or fog, the aircraft selects the first or third manner to determine its absolute coordinates.
The pitch angle and the roll angle of the aircraft can be determined through data collected by an angular velocity meter. The yaw angle of the aircraft may be determined from data collected by the compass.
In step S503, spatial position information of the navigation element is acquired.
The spatial position information of the navigation element includes absolute coordinates of the navigation element, a distance between the navigation element and the aircraft, and an angle of the navigation element with respect to the aircraft.
In the case where the navigation element is an electronic fence, the absolute coordinates of the navigation element can be read directly from pre-stored data. Optionally, the server stores the correspondence between different electronic fences and their absolute coordinates. The aircraft may send a coordinate acquisition request carrying the unique identifier of an electronic fence to the server when it detects that fence, and the server feeds back the absolute coordinates of the electronic fence according to the request. Alternatively, the aircraft may send a coordinate acquisition request containing the absolute coordinates of each path point of the navigation path to the server after the navigation path is determined; the server then returns the absolute coordinates of all electronic fences within a specified range around the navigation path, and when the aircraft subsequently detects an electronic fence, its absolute coordinates can be looked up from the information issued by the server. The latter approach acquires the absolute coordinates of an electronic fence more efficiently than the former. The distance between the navigation element and the aircraft and the angle of the navigation element relative to the aircraft can then be calculated from the absolute coordinates of the navigation element and the absolute coordinates of the aircraft.
In the case where the navigation element is an obstacle, the distance between the navigation element and the aircraft and the angle of the navigation element relative to the aircraft can be obtained from the point cloud data of the obstacle detected by the radar of the aircraft. The absolute coordinates of the navigation element can then be calculated from this distance, this angle, and the absolute coordinates of the aircraft.
In the case where the navigation element is the navigation path of the aircraft or a path point in the navigation path, the absolute coordinates of the navigation element are planned in advance by the path planning module and stored in the aircraft; the distance between the navigation element and the aircraft and the angle of the navigation element relative to the aircraft can be calculated from the absolute coordinates of the navigation element and the absolute coordinates of the aircraft.
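For illustration only, a minimal sketch of this distance-and-angle calculation from absolute coordinates. The equirectangular approximation, the choice of bearing and elevation as the "angle", and the sample coordinates are assumptions, not part of the disclosure.

```python
import math

EARTH_RADIUS_M = 6371000.0

def distance_and_angles(ac_lat, ac_lon, ac_alt, el_lat, el_lon, el_alt):
    """Slant distance, bearing and elevation of a navigation element relative to the
    aircraft, from absolute coordinates (equirectangular approximation, adequate
    for short ranges)."""
    d_lat = math.radians(el_lat - ac_lat)
    d_lon = math.radians(el_lon - ac_lon)
    north = d_lat * EARTH_RADIUS_M
    east = d_lon * EARTH_RADIUS_M * math.cos(math.radians(ac_lat))
    up = el_alt - ac_alt
    horizontal = math.hypot(north, east)
    bearing = math.degrees(math.atan2(east, north)) % 360.0     # 0 deg = true north
    elevation = math.degrees(math.atan2(up, horizontal))
    slant = math.sqrt(horizontal ** 2 + up ** 2)
    return slant, bearing, elevation

# Example: a waypoint roughly 1 km to the north-east of and 100 m above the aircraft
print(distance_and_angles(22.540, 113.950, 300.0, 22.5464, 113.9569, 400.0))
```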
Step S504, rendering information of the image identifier of the navigation element is determined based on the pose information of the aircraft and the spatial position information of the navigation element.
The server may determine the image identification of the navigation element according to the type of the navigation element. The rendering information of the image identification of the navigation element includes: pixel coordinates of an image identification of the navigation element, rendering color of an image identification of the navigation element, and so forth.
In some embodiments, the aircraft may determine the rendering information for the navigation element by: acquiring a first shooting parameter of an image acquisition device; determining a second shooting parameter of the virtual image acquisition device based on the first shooting parameter; determining pose information of the virtual image acquisition device based on the pose information of the aircraft; and processing the spatial position information of the navigation element based on the pose information of the virtual image acquisition device and the second shooting parameter to obtain rendering information of the image identifier of the navigation element.
The first shooting parameter is the parameter used by the image acquisition device when it captures the environment image. The first shooting parameters of the image acquisition device include: the focal length of the image acquisition device, the field of view (FOV), the size of the photosensitive element, and the like. The second shooting parameters are the same as the first shooting parameters, so that the generated augmented reality image can restore the viewing angle of the image acquisition device. The pose information of the virtual image acquisition device is the same as that of the image acquisition device and can be determined from the pose information of the aircraft and the installation position of the image acquisition device on the aircraft, again so that the generated augmented reality image restores the viewing angle of the image acquisition device. The aircraft may then output the rendering information of each navigation element based on a three-dimensional perspective algorithm.
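The disclosure does not give the formulas of the three-dimensional perspective algorithm; for illustration only, the sketch below is one standard pinhole-camera implementation, assuming a local north-east-down (NED) frame, a body frame with X forward / Y right / Z down, and a virtual camera at the aircraft origin looking along the body X axis. All names and sample values are assumptions.

```python
import math
import numpy as np

def focal_from_fov(image_width_px, hfov_deg):
    """Pinhole focal length in pixels from the horizontal field of view."""
    return (image_width_px / 2.0) / math.tan(math.radians(hfov_deg) / 2.0)

def rotation_ned_to_body(yaw, pitch, roll):
    """Z-Y-X (yaw-pitch-roll) rotation from the local NED frame to the body frame."""
    cy, sy = math.cos(yaw), math.sin(yaw)
    cp, sp = math.cos(pitch), math.sin(pitch)
    cr, sr = math.cos(roll), math.sin(roll)
    r_yaw = np.array([[cy, sy, 0], [-sy, cy, 0], [0, 0, 1]])
    r_pitch = np.array([[cp, 0, -sp], [0, 1, 0], [sp, 0, cp]])
    r_roll = np.array([[1, 0, 0], [0, cr, sr], [0, -sr, cr]])
    return r_roll @ r_pitch @ r_yaw

def project_element(element_ned, aircraft_ned, yaw, pitch, roll,
                    image_size=(1280, 720), hfov_deg=90.0):
    """Pixel coordinates of a navigation element, or None if it is behind the camera.

    A fixed camera-to-body mounting offset could be applied here if needed."""
    w, h = image_size
    f = focal_from_fov(w, hfov_deg)
    body = rotation_ned_to_body(yaw, pitch, roll) @ (np.asarray(element_ned, dtype=float)
                                                     - np.asarray(aircraft_ned, dtype=float))
    forward, right, down = body            # body axes: X forward, Y right, Z down
    if forward <= 0:
        return None                        # element is behind the image plane
    u = w / 2.0 + f * right / forward
    v = h / 2.0 + f * down / forward
    return u, v

# Waypoint 500 m ahead and 50 m above a level aircraft heading north -> (640.0, 296.0)
print(project_element([500, 0, -50], [0, 0, 0], yaw=0.0, pitch=0.0, roll=0.0))
```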
It should be noted that the above embodiments may be used to determine the pixel coordinates of the image identifier of the navigation element; the rendering color of the image identifier may be determined as follows: acquire the color of the environment image in the area having the same pixel coordinates as the image identifier of the navigation element, and determine the rendering color of the image identifier based on that color. Specifically, the aircraft stores a specific mapping table that records the mapping between different colors of the area having the same pixel coordinates as the image identifier and different rendering colors of the image identifier; the rendering color of the image identifier of the navigation element can be determined by looking up this mapping table.
It should be noted that the color of the region having the same pixel coordinates as the image identifier of the navigation element differs from the rendering color of the image identifier, and may even differ greatly. For example, if the environment image in the area with the same pixel coordinates as the image identifier is blue and white, the rendering color of the image identifier may be dark (such as black or brown).
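For illustration only, a minimal sketch of picking a contrasting rendering color from the environment-image region that the identifier will cover; the brightness thresholds and the three output colors stand in for the mapping table and are assumptions, not part of the disclosure.

```python
import numpy as np

def rendering_color_for_region(env_image_rgb, box):
    """Pick a marker color that contrasts with the environment pixels it will cover.

    env_image_rgb: HxWx3 uint8 array; box: (x0, y0, x1, y1) of the image identifier.
    """
    x0, y0, x1, y1 = box
    region = env_image_rgb[y0:y1, x0:x1]
    brightness = region.mean()                 # 0 (black) .. 255 (white)
    if brightness > 170:                       # bright sky / clouds -> dark marker
        return (20, 20, 20)
    if brightness < 85:                        # dark ground / night -> light marker
        return (240, 240, 240)
    return (255, 80, 0)                        # mid tones -> high-visibility orange

sky = np.full((720, 1280, 3), 220, dtype=np.uint8)
print(rendering_color_for_region(sky, (600, 280, 660, 340)))   # -> (20, 20, 20)
```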
The aircraft may perform the above steps S503 to S504 for each navigation element to determine the rendering information of the image identifier of each navigation element. In some embodiments, before determining this rendering information, the aircraft may further filter the detected navigation elements: if a detected navigation element is within the acquisition range of the image acquisition device, it is determined to be a target navigation element, and step S504 is then specifically implemented as: determining the rendering information of the image identifier of the target navigation element based on the pose information of the aircraft and the spatial position information of the target navigation element. This reduces the amount of computation performed by the aircraft and lowers energy consumption.
Step S505 is to generate an augmented reality image based on rendering information of the image identifier of the navigation element.
The aircraft renders the image identifier of each navigation element using a preset rendering algorithm and the rendering information of the image identifier of each navigation element to obtain the augmented reality image. In some embodiments, in the case where the aircraft determines only the rendering information of the image identifiers of the target navigation elements, step S505 is implemented as: generating the augmented reality image based on the rendering information of the image identifiers of the target navigation elements.
Step S506, the environment image and the augmented reality image are synthesized to obtain a navigation image.
Step S507, the navigation image of the aircraft is displayed.
In summary, in the technical solution provided by the embodiments of the application, an environment image of the aircraft is acquired, an augmented reality image is generated based on the aircraft's own pose information and the spatial position information of the navigation elements, and the two images are synthesized into a navigation image in which the image identifiers of the virtual navigation elements are superimposed on the real navigation environment of the aircraft. Displaying this navigation image lets the driver intuitively understand the driving environment of the aircraft, provides more intuitive driving guidance, makes it easier for the driver to control the aircraft during navigation, and helps improve driving safety.
Referring to fig. 6, a block diagram of a navigation system applied to an aircraft according to an embodiment of the present application is shown, where the navigation system includes an image acquisition subsystem 610, a three-dimensional spatial positioning subsystem 620, an image synthesis subsystem 630, and an image display subsystem 640.
Image acquisition subsystem 610 is used to acquire the corresponding environmental image of the aircraft. The three-dimensional space positioning subsystem 620 is configured to acquire pose information of the aircraft and spatial position information of the navigation element, and generate an augmented reality image based on the pose information of the aircraft and the spatial position information of the navigation element, where the augmented reality image includes an image identifier of the navigation element, and the navigation element is an element that affects a flight path of the aircraft in the air. The image synthesis subsystem 630 is configured to perform synthesis processing on the environment image and the augmented reality image to obtain a navigation image of the aircraft. And the image display subsystem 640 is used for displaying the navigation image of the aircraft.
In some embodiments, the three-dimensional spatial positioning subsystem 620 is specifically configured to: determining rendering information of the image identifier of the navigation element based on the pose information of the aircraft and the spatial position information of the navigation element; an augmented reality image is generated based on rendering information of the image identification of the navigation element.
In some embodiments, the aircraft is provided with an image acquisition device, and the three-dimensional spatial positioning subsystem 620 is specifically configured to: acquire a first shooting parameter of the image acquisition device; determine a second shooting parameter of the virtual image acquisition device based on the first shooting parameter; determine pose information of the virtual image acquisition device based on the pose information of the aircraft; and process the spatial position information of the navigation element based on the pose information of the virtual image acquisition device and the second shooting parameter to obtain rendering information of the image identifier of the navigation element.
In some embodiments, the transparency of the background of the augmented reality image is a preset value, and the size of the augmented reality image is the same as the size of the environment image; the image synthesis subsystem 630 is configured to display the augmented reality image and the environment image in an overlapping manner, so as to obtain a navigation image of the aircraft.
In some embodiments, the navigation system further comprises a navigation prompt subsystem (not shown in the figures). The navigation prompt subsystem is configured to generate navigation prompt information based on the pose information of the aircraft and the spatial position information of the navigation elements, and to superimpose the navigation prompt information on the navigation image and/or play the navigation prompt information by voice.
In summary, in the technical solution provided by the embodiments of the application, an environment image of the aircraft is acquired, an augmented reality image is generated based on the aircraft's own pose information and the spatial position information of the navigation elements, and the two images are synthesized into a navigation image in which the image identifiers of the virtual navigation elements are superimposed on the real navigation environment of the aircraft. Displaying this navigation image lets the driver intuitively understand the driving environment of the aircraft, provides more intuitive driving guidance, makes it easier for the driver to control the aircraft during navigation, and helps improve driving safety.
It can be clearly understood by those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described apparatuses and modules may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided in the present application, the coupling between the modules may be electrical, mechanical or other type of coupling.
In addition, functional modules in the embodiments of the present application may be integrated into one processing module, or each of the modules may exist alone physically, or two or more modules are integrated into one module. The integrated module can be realized in a hardware mode, and can also be realized in a software functional module mode.
Referring to fig. 7, an embodiment of the present application further provides an aircraft 700, where the aircraft 700 includes: one or more processors 710, a memory 720, and one or more application programs. The one or more application programs are stored in the memory, are configured to be executed by the one or more processors, and are configured to perform the methods described in the above embodiments.
The processor 710 may include one or more processing cores. The processor 710 connects the various parts of the entire aircraft 700 using various interfaces and lines, and performs various functions and processes data by running or executing the instructions, programs, code sets, or instruction sets stored in the memory 720 and invoking the data stored in the memory 720. Alternatively, the processor 710 may be implemented in hardware using at least one of Digital Signal Processing (DSP), Field-Programmable Gate Array (FPGA), and Programmable Logic Array (PLA). The processor 710 may integrate one or more of a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), a modem, and the like. The CPU mainly handles the operating system, user interface, application programs, and so on; the GPU renders and draws display content; and the modem handles wireless communication. It is understood that the modem may also not be integrated into the processor 710 and may instead be implemented by a separate communication chip.
The memory 720 may include a Random Access Memory (RAM) and/or a Read-Only Memory (ROM). The memory 720 may be used to store instructions, programs, code, code sets, or instruction sets. The memory 720 may include a program storage area and a data storage area, where the program storage area may store instructions for implementing an operating system, instructions for implementing at least one function (such as a touch function, a sound playing function, or an image playing function), instructions for implementing the above method embodiments, and the like. The data storage area may store data created by the electronic device in use (such as a phone book, audio and video data, and chat log data), and the like.
Referring to fig. 8, a computer-readable storage medium 800 is provided according to an embodiment of the present application. Computer program instructions 810 are stored in the computer-readable storage medium 800, and the computer program instructions 810 can be called by a processor to execute the methods described in the above embodiments.
The computer-readable storage medium 800 may be, for example, a flash memory, an Electrically Erasable Programmable Read-Only Memory (EEPROM), an Erasable Programmable Read-Only Memory (EPROM), a hard disk, or a Read-Only Memory (ROM). Optionally, the computer-readable storage medium includes a non-transitory computer-readable storage medium. The computer-readable storage medium 800 has storage space for the computer program instructions 810 that perform any of the method steps described above. The computer program instructions 810 may be read from or written into one or more computer program products.
Although the present application has been described with reference to the preferred embodiments, it is to be understood that the present application is not limited to the disclosed embodiments, but rather, the present application is intended to cover various modifications, equivalents and alternatives falling within the spirit and scope of the present application.

Claims (10)

1. A navigation method applied to an aircraft, characterized in that the method comprises:
acquiring an environment image of an aircraft;
acquiring pose information of the aircraft and spatial position information of a navigation element, and generating an augmented reality image based on the pose information of the aircraft and the spatial position information of the navigation element, wherein the augmented reality image comprises an image identifier of the navigation element, and the navigation element is an element influencing a flight path of the aircraft in the air; synthesizing the environment image and the augmented reality image to obtain a navigation image;
displaying a navigation image of the aircraft.
2. The method of claim 1, wherein generating an augmented reality image based on pose information of the aircraft and spatial location information of navigation elements comprises:
determining rendering information for the image identification of the navigation element based on the pose information of the aircraft and the spatial position information of the navigation element;
generating the augmented reality image based on rendering information of the image identification of the navigation element.
3. The method of claim 2, wherein the aircraft is provided with an image acquisition device, and wherein determining rendering information for the image identification of the navigation element based on pose information of the aircraft and spatial position information of the navigation element comprises:
acquiring a first field angle parameter of the image acquisition device;
determining a second field angle parameter of the virtual image acquisition device based on the first field angle parameter;
determining pose information of the virtual image acquisition device based on the pose information of the aircraft;
and processing the spatial position information of the navigation element based on the pose information of the virtual image acquisition device and the second field angle parameter to obtain rendering information of the image identifier of the navigation element.
4. The method according to any one of claims 1 to 3, wherein the transparency of the background of the augmented reality image is a preset value, and the size of the augmented reality image is the same as the size of the environment image; the synthesizing the environment image and the augmented reality image to obtain a navigation image includes:
and overlapping and displaying the augmented reality image and the environment image to obtain the navigation image.
5. The method according to any one of claims 1 to 3, wherein after the synthesizing the environment image and the augmented reality image to obtain a navigation image, the method further comprises:
generating navigation prompt information based on the pose information of the aircraft and the spatial position information of the navigation elements;
superimposing the navigation prompt information on the navigation image; and/or playing the navigation prompt information by voice.
6. The navigation system applied to the aircraft is characterized by comprising an image acquisition subsystem, a three-dimensional space positioning subsystem, an image synthesis subsystem and an image display subsystem;
the image acquisition subsystem is used for acquiring an environment image corresponding to the aircraft;
the three-dimensional space positioning subsystem is used for acquiring pose information of the aircraft and spatial position information of a navigation element, and generating an augmented reality image based on the pose information of the aircraft and the spatial position information of the navigation element, wherein the augmented reality image comprises an image identifier of the navigation element, and the navigation element is an element influencing a flight path of the aircraft in the air;
the image synthesis subsystem is used for carrying out synthesis processing on the environment image and the augmented reality image to obtain a navigation image of the aircraft;
and the image display subsystem is used for displaying the navigation image of the aircraft.
7. The system of claim 6, wherein the three-dimensional spatial localization subsystem is specifically configured to:
determining rendering information of the image identifier of the navigation element based on pose information of the aircraft and spatial position information of the navigation element;
generating the augmented reality image based on rendering information of the image identification of the navigation element.
8. The system according to claim 7, characterized in that the aircraft is provided with an image acquisition device, the three-dimensional spatial localization subsystem being in particular configured to:
acquiring a first shooting parameter of the image acquisition device;
determining second shooting parameters of the virtual image acquisition device based on the first shooting parameters;
determining pose information of the virtual image acquisition device based on the pose information of the aircraft;
and processing the spatial position information of the navigation element based on the pose information of the virtual image acquisition device and the second shooting parameter to obtain rendering information of the image identifier of the navigation element.
9. An aircraft, characterized in that it comprises:
one or more processors;
a memory;
one or more application programs, wherein one or more of the application programs are stored in the memory and configured to be executed by one or more of the processors, the one or more application programs being configured to perform the navigation method applied to an aircraft according to any one of claims 1-5.
10. A computer-readable storage medium, in which computer program instructions are stored, wherein the computer program instructions can be called by a processor to execute the navigation method applied to an aircraft according to any one of claims 1 to 5.
CN202211645514.3A 2022-12-19 2022-12-19 Navigation method and system applied to aircraft and aircraft Pending CN115930991A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211645514.3A CN115930991A (en) 2022-12-19 2022-12-19 Navigation method and system applied to aircraft and aircraft

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211645514.3A CN115930991A (en) 2022-12-19 2022-12-19 Navigation method and system applied to aircraft and aircraft

Publications (1)

Publication Number Publication Date
CN115930991A true CN115930991A (en) 2023-04-07

Family

ID=86552010

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211645514.3A Pending CN115930991A (en) 2022-12-19 2022-12-19 Navigation method and system applied to aircraft and aircraft

Country Status (1)

Country Link
CN (1) CN115930991A (en)

Similar Documents

Publication Publication Date Title
WO2021197189A1 (en) Augmented reality-based information display method, system and apparatus, and projection device
CN108801276B (en) High-precision map generation method and device
CN107554425B (en) A kind of vehicle-mounted head-up display AR-HUD of augmented reality
EP3008708B1 (en) Vision augmented navigation
US11127373B2 (en) Augmented reality wearable system for vehicle occupants
US8195386B2 (en) Movable-body navigation information display method and movable-body navigation information display unit
US8503762B2 (en) Projecting location based elements over a heads up display
CN109949439B (en) Driving live-action information labeling method and device, electronic equipment and medium
CN111460865B (en) Driving support method, driving support system, computing device, and storage medium
CN210139859U (en) Automobile collision early warning system, navigation and automobile
US11525694B2 (en) Superimposed-image display device and computer program
WO2021197190A1 (en) Information display method, system and apparatus based on augmented reality, and projection device
US20210208608A1 (en) Control method, control apparatus, control terminal for unmanned aerial vehicle
US9791287B2 (en) Drive assist system, method, and program
US8988425B2 (en) Image display control system, image display control method, and image display control program
JP6345381B2 (en) Augmented reality system
JP2022176234A (en) Information display control device, information display control method, and information display control program
US11409280B1 (en) Apparatus, method and software for assisting human operator in flying drone using remote controller
CN115930991A (en) Navigation method and system applied to aircraft and aircraft
US11200749B2 (en) Systems and methods of augmented reality visualization based on sensor data
US20240042857A1 (en) Vehicle display system, vehicle display method, and computer-readable non-transitory storage medium storing vehicle display program
KR20040025150A (en) Route guide method in car navigation system
KR102482829B1 (en) Vehicle AR display device and AR service platform
EP3767436A1 (en) Systems and methods of augmented reality visualization based on sensor data
CN117308979A (en) Path planning method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination