CN110299030B - Handheld terminal, aircraft, airspace measurement method and control method of aircraft - Google Patents

Handheld terminal, aircraft, airspace measurement method and control method of aircraft

Info

Publication number
CN110299030B
Authority
CN
China
Prior art keywords
flight
airspace
bionic
modeling
ornithopter
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910579055.5A
Other languages
Chinese (zh)
Other versions
CN110299030A (en)
Inventor
刘迎健
张立清
敬鹏生
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hanwang Technology Co Ltd
Original Assignee
Hanwang Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hanwang Technology Co Ltd
Priority to CN201910579055.5A
Publication of CN110299030A
Application granted
Publication of CN110299030B
Current legal status: Active
Anticipated expiration

Classifications

    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05D - SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 - Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/10 - Simultaneous control of position or course in three dimensions
    • G05D1/101 - Simultaneous control of position or course in three dimensions specially adapted for aircraft
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00 - Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T17/05 - Geographic models
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00 - Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T17/10 - Constructive solid geometry [CSG] using solid primitives, e.g. cylinders, cubes
    • G - PHYSICS
    • G08 - SIGNALLING
    • G08G - TRAFFIC CONTROL SYSTEMS
    • G08G5/00 - Traffic control systems for aircraft, e.g. air-traffic control [ATC]
    • G08G5/0047 - Navigation or guidance aids for a single aircraft
    • G08G5/006 - Navigation or guidance aids for a single aircraft in accordance with predefined flight zones, e.g. to avoid prohibited zones

Abstract

The disclosure provides a handheld terminal, an aircraft, an airspace measurement method and a control method for the aircraft, and a flight system and a flight method of the flight system. The handheld terminal includes: a measurement unit that measures position information of at least one target object in the peripheral space in which the aircraft is located; a modeling unit that models the peripheral space according to the position information of the at least one target object to generate modeling information; a control unit that generates control information based at least on the modeling information; and a communication unit that transmits the control information to the aircraft.

Description

Handheld terminal, aircraft, airspace measurement method and control method of aircraft
Technical Field
The present disclosure relates to the field of aircraft, and in particular to a handheld terminal, an aircraft, an airspace measurement method and a control method for the aircraft, and a flight system and a flight method thereof.
Background
Bionic ornithopters (flapping-wing aircraft) have been under development for more than 30 years. To realize flapping-wing flight, the fuselage of a bionic ornithopter generally must be very light. For example, the toy ornithopters on the market weigh about 10 grams and rarely exceed 15 grams.
This light weight brings technical limitations: low payload capacity, an onboard control module with few functions, poor autonomous flight capability, and the like. Moreover, the remote controllers available for bionic ornithopters are overly simple, typically offering only four control actions (up, down, left, right), and cannot issue even moderately complex control commands, so complex flight behaviors such as autonomous flight, obstacle-avoidance flight, following flight, and return-to-nest cannot be realized. These factors have hindered the commercial adoption of bionic ornithopters.
There is therefore a need for an aircraft that can realize complex autonomous flight control functions despite a light fuselage and low payload capacity.
Disclosure of Invention
According to an aspect of the present disclosure, there is provided a handheld terminal including: a measurement unit that measures position information of at least one target object in the peripheral space in which an aircraft is located; a modeling unit that models the peripheral space according to the position information of the at least one target object to generate modeling information; a control unit that generates control information based at least on the modeling information; and a communication unit that transmits the control information to the aircraft.
According to another aspect of the present disclosure, there is provided an aircraft including: a communication unit that receives, from a handheld terminal, control information based at least on modeling information; and a control unit that realizes flight of the aircraft based at least on the control information, wherein the modeling information is obtained by the handheld terminal measuring position information of at least one target object in the peripheral space in which the aircraft is located and modeling that space according to the position information of the at least one target object.
According to another aspect of the present disclosure, there is provided an airspace measurement method for an aircraft, comprising: measuring position information of at least one target object in the peripheral space in which the aircraft is located; modeling the peripheral space according to the position information of the at least one target object to generate modeling information; generating control information based at least on the modeling information; and transmitting the control information to the aircraft.
According to another aspect of the present disclosure, there is provided a control method for an aircraft, comprising: receiving, from a handheld terminal, control information based at least on modeling information; and controlling flight of the aircraft based at least on the control information, wherein the modeling information is obtained by the handheld terminal measuring position information of at least one target object in the peripheral space in which the aircraft is located and modeling that space according to the position information of the at least one target object.
According to another aspect of the present disclosure, there is provided a flight system comprising a handheld terminal as in the first aspect and an aircraft as in the second aspect.
According to another aspect of the present disclosure, there is provided a flight method for a flight system comprising an aircraft and a handheld terminal, the method comprising: measuring, by the handheld terminal, position information of at least one target object in the peripheral space in which the aircraft is located; modeling, by the handheld terminal, the peripheral space according to the position information of the at least one target object to generate modeling information; generating, by the handheld terminal, control information based at least on the modeling information; transmitting, by the handheld terminal, the control information to the aircraft; receiving, by the aircraft, the control information from the handheld terminal; and realizing, by the aircraft, flight of the aircraft based at least on the control information.
According to another aspect of the present disclosure, there is provided a handheld terminal including: a measurement unit that measures position information of at least one target object in the peripheral space in which an aircraft is located; a control unit that generates control information based at least on the position information of the at least one target object; and a communication unit that transmits the control information to the aircraft.
According to another aspect of the present disclosure, there is provided an aircraft including: a communication unit that receives, from a handheld terminal, control information based at least on position information of at least one target object; a modeling unit that models the peripheral space in which the aircraft is located, based on the control information, to generate modeling data; and a control unit that realizes flight of the aircraft based at least on the control information and the modeling data, wherein the position information of the at least one target object is obtained by the handheld terminal measuring the position of the at least one target object in the peripheral space in which the aircraft is located.
According to another aspect of the present disclosure, there is provided an airspace measurement method for an aircraft, comprising: measuring position information of at least one target object in the peripheral space in which the aircraft is located; generating control information based at least on the position information of the at least one target object; and transmitting the control information to the aircraft.
According to another aspect of the present disclosure, there is provided a control method for an aircraft, comprising: receiving, from a handheld terminal, control information based at least on position information of at least one target object; modeling the peripheral space in which the aircraft is located, based on the control information, to generate modeling data; and controlling flight of the aircraft based at least on the control information and the modeling data, wherein the position information of the at least one target object is obtained by the handheld terminal measuring the position of the at least one target object in the peripheral space in which the aircraft is located.
According to another aspect of the present disclosure, there is provided a flight system comprising a handheld terminal according to the seventh aspect and an aircraft according to the eighth aspect.
According to another aspect of the present disclosure, there is provided a flight method for a flight system comprising an aircraft and a handheld terminal, the method comprising: measuring, by the handheld terminal, position information of at least one target object in the peripheral space in which the aircraft is located; generating, by the handheld terminal, control information based at least on the position information of the at least one target object; transmitting, by the handheld terminal, the control information to the aircraft; receiving, by the aircraft, the control information from the handheld terminal; modeling, by the aircraft, the peripheral space in which the aircraft is located, based on the control information, to generate modeling data; and realizing, by the aircraft, flight of the aircraft based at least on the control information and the modeling data.
With the handheld terminal, the aircraft, the airspace measurement method, the control method, the flight system, and the flight method described above, more stable autonomous flight, more complex control modes, and more flexible human-machine interaction can be realized even where the fuselage must be light and the payload capacity is low, greatly improving the autonomous flight capability of lightweight aircraft.
Drawings
The foregoing and other objects, features and advantages of the disclosure will be apparent from the following more particular descriptions of exemplary embodiments of the disclosure as illustrated in the accompanying drawings wherein like reference numbers generally represent like parts throughout the drawings.
Fig. 1 is a block diagram schematically illustrating a main configuration of a handheld terminal according to an embodiment of the present disclosure;
fig. 2 is a block diagram schematically illustrating a main configuration of a handheld terminal according to another embodiment of the present disclosure;
FIG. 3 is a block diagram schematically illustrating a primary configuration of an aircraft according to an embodiment of the present disclosure;
FIG. 4 is a block diagram schematically illustrating a primary configuration of an aircraft according to another embodiment of the present disclosure;
FIG. 5 is a block diagram schematically illustrating a primary configuration of a flight system according to an embodiment of the present disclosure;
FIG. 6 is a block diagram schematically illustrating a primary configuration of a flight system according to another embodiment of the present disclosure;
FIG. 7 is a flow chart schematically illustrating the main steps of a method of airspace measurement of an aircraft according to an embodiment of the present disclosure;
FIG. 8 is a flow chart schematically illustrating the main steps of a method of airspace measurement of an aircraft according to another embodiment of the present disclosure;
FIG. 9 is a flow chart schematically illustrating the main steps of a control method of an aircraft according to an embodiment of the present disclosure;
FIG. 10 is a flow chart schematically illustrating the main steps of a method of controlling an aircraft according to another embodiment of the present disclosure;
FIG. 11 is a flow chart schematically illustrating the main steps of a flight method of a flight system according to an embodiment of the present disclosure; and
fig. 12 is a flow chart schematically illustrating the main steps of a flight method of a flight system according to another embodiment of the present disclosure.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, exemplary embodiments of the present invention will be described in detail below with reference to the accompanying drawings. It is to be understood that the described embodiments are merely a subset of the embodiments of the invention, not all of them, and that the invention is not limited to the example embodiments described herein. All other embodiments that a person skilled in the art can derive from the embodiments described in the present disclosure without inventive effort shall fall within the scope of protection of the invention.
First, a block diagram of a main configuration of a handheld terminal according to an embodiment of the present disclosure is described with reference to fig. 1.
As shown in fig. 1, the handheld terminal 100 of the embodiment of the present disclosure mainly includes a measurement unit 110, a modeling unit 120, a control unit 130, and a communication unit 140. These components are interconnected by a bus and/or other form of connection mechanism (not shown).
The measurement unit 110 is configured to measure position information of at least one target object within the peripheral space in which an aircraft corresponding to the handheld terminal 100 is located. The at least one target object refers, for example, to any boundary and/or obstacle in the three-dimensional space in which the aircraft is about to fly.
Specifically, the measurement unit 110 may include at least one of a range finder and an image collector. At least one of the range finder and the image collector may use the position of the handheld terminal 100 as a base point, and for each of at least one target object, obtain a distance of each of at least one feature point of the target object relative to the base point.
Illustratively, rangefinders include, but are not limited to, laser rangefinders, lidar, ultrasonic radar, infrared lasers, and the like. Image collectors include, but are not limited to, monocular cameras, binocular cameras, infrared cameras, visible light based color or grayscale cameras, and the like. The handheld terminal can acquire three-dimensional information, such as distance and azimuth angle, of each feature point relative to the base point through the distance measuring instrument. In addition, the handheld terminal can acquire the image information of each feature point relative to the base point through the image collector.
In one example, the measurement unit 110 includes only a rangefinder, e.g., a laser rangefinder. More specifically, the laser rangefinder includes a plurality of laser sensors. Laser light is first emitted from the base point toward at least one feature point of each target object to measure the distance from the base point to that feature point; the azimuth angle of the feature point is then measured by a gyroscope inside the laser sensor. Finally, the three-dimensional position of each feature point is computed by combining the distance and angle measured by the rangefinder.
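The patent does not spell out this last computation. As a minimal sketch (assuming the sensor also reports an elevation angle, since a distance plus a single azimuth does not fix a point in 3D), the Cartesian coordinates of a feature point relative to the base point could be computed as follows:

```python
import math

def feature_point_xyz(distance, azimuth_deg, elevation_deg):
    # Spherical-to-Cartesian conversion with the base point as origin:
    # azimuth in the horizontal plane, elevation measured up from it.
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = distance * math.cos(el) * math.cos(az)
    y = distance * math.cos(el) * math.sin(az)
    z = distance * math.sin(el)
    return (x, y, z)
```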
In another example, the measurement unit 110 includes only an image collector, such as a depth camera based on binocular stereo vision. The binocular depth camera captures multiple images of the surrounding environment by rotating at the base point, and the images are then stitched together into a representation of the surrounding space. The measurement unit 110 then identifies at least one target object in the images and determines at least one feature point of each target object, from which it extracts three-dimensional image depth information. Specifically, for example: first, the binocular depth camera is calibrated to obtain the intrinsic and extrinsic parameters and the homography matrix of the two cameras; the two original images acquired simultaneously by the binocular depth camera are rectified according to the calibration result, so that the two rectified images lie in the same plane and are row-aligned; feature points are detected in the two rectified images; the feature points of the two images are matched; and the depth of each matched feature point is computed from the matching result. The azimuth angle corresponding to a feature point can in turn be obtained by recording or measuring the rotation angle of the camera at each shot.
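The calibrate-rectify-match-reproject sequence just described maps naturally onto standard stereo-vision tooling. The following sketch uses OpenCV purely for illustration (the patent names no library); the calibration outputs K1, D1, K2, D2, R, T and the image size are assumed to come from a prior stereo calibration step, and dense block matching stands in for the feature detection and matching step:

```python
import cv2
import numpy as np

def stereo_points_3d(img_left, img_right, K1, D1, K2, D2, R, T, size):
    # Rectify both views so that matched points lie on the same image
    # row (the two corrected images are coplanar and row-aligned).
    R1, R2, P1, P2, Q, _, _ = cv2.stereoRectify(K1, D1, K2, D2, size, R, T)
    map_l = cv2.initUndistortRectifyMap(K1, D1, R1, P1, size, cv2.CV_32FC1)
    map_r = cv2.initUndistortRectifyMap(K2, D2, R2, P2, size, cv2.CV_32FC1)
    rect_l = cv2.remap(img_left, map_l[0], map_l[1], cv2.INTER_LINEAR)
    rect_r = cv2.remap(img_right, map_r[0], map_r[1], cv2.INTER_LINEAR)
    # Semi-global block matching yields a disparity per pixel, in place
    # of explicit per-feature matching.
    sgbm = cv2.StereoSGBM_create(minDisparity=0, numDisparities=64, blockSize=9)
    disparity = sgbm.compute(rect_l, rect_r).astype(np.float32) / 16.0
    # Reproject disparities to metric 3D coordinates (the depth of the
    # matched points).
    return cv2.reprojectImageTo3D(disparity, Q)
```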
In yet another example, the measurement unit 110 may include both a rangefinder and an image collector. When a binocular depth camera alone is used to acquire image depth information, as in the previous example, the computational load on the handheld terminal is high; combining a rangefinder with a camera reduces that load and improves efficiency. In this example the camera need not be binocular; a monocular camera suffices. First, the camera is rotated to capture images of the surroundings, and the images are stitched into a representation of the space. Then, the rangefinder measures the distance of each feature point in each image. The distances and the image information are combined to obtain the position information of at least one feature point of each target object.
It should be noted that, although various examples of the range finder and the image collector are enumerated herein, the present disclosure is not limited thereto, but may include any measurement unit capable of measuring three-dimensional position information of a target object. Further, the above-described measurement method is merely an example, and a person skilled in the art may measure the position information of the target object using any appropriate measurement method known in the art or developed in the future.
After obtaining the position information of the at least one target object, the modeling unit 120 models the peripheral space in which the aircraft is located according to the position information of the at least one target object to generate modeling information.
Specifically, the modeling unit 120 includes a flight airspace modeling subunit. The flight airspace modeling subunit models the flight airspace of the aircraft according to the position information of at least one first target object among the at least one target object to obtain a flight airspace model.
Here, the flight airspace refers to a three-dimensional region of space in which the aircraft is about to fly, which may or may not include obstacles. The at least one first object may for example be an object such as a ground, wall or other obstacle or the like for limiting the flight boundary of the aircraft. In other words, the flight airspace may be determined by the at least one first target.
In one example, the flight airspace may be a limited spatial region within a room. In this case, the plurality of boundaries of the flight space may be determined by a plurality of first targets.
In another example, the flight airspace may be an open outdoor area. In this case, the maximum range measurable by the measurement unit 110 of the handheld terminal 100 may be adopted as the boundary value of the flight airspace. For example, where at least one side of the outdoor area is bounded by a ground surface, a wall, or an obstacle, a boundary surface can be generated from the plane in which that ground, wall, or obstacle lies; for the remaining sides, without ground, walls, or obstacles, a corresponding boundary surface may be generated at the maximum range measurable by the measurement unit 110, and the flight airspace is then formed by the set of boundary surfaces.
After the boundary surfaces of the flight airspace are determined in this way, the flight airspace modeling subunit may model the flight airspace as a closed, regular or irregular geometric volume. Specifically, a three-dimensional coordinate system is established with the initial measurement position of the handheld terminal 100 as the origin; for example, the initial measurement position is a fixed point in the current flight airspace. The position information of the at least one feature point of each first target object measured by the measurement unit 110 then corresponds to the three-dimensional coordinates of points in this coordinate system. These coordinates are used to build the flight airspace model, which may be any regular or irregular geometric volume, including but not limited to a cuboid, cube, cone, sphere, ellipsoid, or cylinder. For example, to build a regular cuboid flight airspace model, the point closest to each of the 6 surfaces surrounding the handheld terminal 100 is selected, the three-dimensional coordinates of these 6 points are measured, and a closed cuboid flight airspace model is built with the 6 points as reference points.
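As a hedged sketch of the cuboid case (helper names are illustrative, not from the patent): given the six measured reference points, one per surrounding surface, expressed in the coordinate system whose origin is the initial measurement position, an axis-aligned box model can be derived and queried as follows:

```python
import numpy as np

def build_box_airspace(ref_points):
    # ref_points: six (x, y, z) tuples, the nearest point on each of
    # the 6 surrounding surfaces. The box spans their extremes.
    pts = np.asarray(ref_points, dtype=float)
    return pts.min(axis=0), pts.max(axis=0)  # (lower corner, upper corner)

def inside_airspace(p, lower, upper):
    # True if position p lies inside the closed cuboid flight airspace.
    p = np.asarray(p, dtype=float)
    return bool(np.all(p >= lower) and np.all(p <= upper))
```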
Further, the modeling unit 120 also includes a headroom airspace modeling subunit. First, the headroom airspace modeling subunit determines whether the flight airspace is a headroom (clearance) airspace. Here, a headroom airspace is a closed, regular or irregular geometric volume whose interior contains no obstacles.
Specifically, determining whether the flight airspace is a headroom airspace includes determining whether there is at least one second target object different from the at least one first target object. In other words, a second target object used to determine headroom is an obstacle, as distinct from the first target objects used above to determine the flight boundaries of the flight airspace. If at least one second target object is present, the flight airspace is not a headroom airspace; otherwise, it is.
In one example, the at least one first object and the at least one second object are both selected autonomously by the user. For example, before the flight of the aircraft, the user firstly selects at least one first target object for measurement through the handheld terminal to determine the flight airspace, and then selects at least one second target object for measurement through the handheld terminal to determine the clearance airspace. After the headroom is determined, the aircraft takes off. Optionally, the user may also add a new second target object again via the handheld terminal during the flight of the aircraft. Alternatively, the user may select only at least one first object and not any second object in the event that it is determined that the aircraft is not in the flight airspace in which it is about to fly.
In another example, a user autonomously selects at least one first target to measure for determining flight airspace prior to flight of the aircraft. After the flight airspace is determined, the aircraft takes off. Then, in the flying process of the aircraft, the obstacle avoidance module on board the aircraft detects obstacles around the aircraft as at least one second target object, and the position information of the at least one second target object is sent to the handheld terminal to carry out modeling of a clearance airspace.
If the headroom airspace modeling subunit determines that the flight airspace is a headroom airspace, i.e., the flight airspace contains no second target object, the flight airspace model is taken directly as the headroom airspace model.
If the headroom airspace modeling subunit determines that the flight airspace is not a headroom airspace, i.e., the flight airspace contains at least one second target object, each second target object must further be removed from the flight airspace model to obtain the headroom airspace model.
Specifically, the at least one target object measured by the measurement unit 110 may include, in addition to the at least one first target object, at least one second target object. The measurement unit 110 may measure position information, for example three-dimensional coordinates, of at least one feature point of each second target object. Based on this position information, the geometric model associated with each second target object is removed from the corresponding spatial position in the flight airspace model to obtain the headroom airspace model.
For example, the geometric model of a second target object may be set by default, selected by the user from a plurality of predetermined geometric models according to the geometric features of the second target object, or determined by the modeling unit from the predetermined geometric models according to the geometric distribution of at least one feature point of the second target object. The predetermined geometric models may include, but are not limited to, cuboids, cubes, pyramids, spheres, ellipsoids, cylinders, and the like. In this way, the geometric model of an obstacle in the flight airspace is simplified in the three-dimensional coordinate system, and the headroom airspace model is easily obtained.
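The patent prescribes no concrete data structure. One assumption-laden way to represent the resulting headroom airspace model is the flight-airspace volume minus a list of simplified obstacle primitives, queried point by point (only sphere and box primitives are shown here; the patent also lists pyramids, ellipsoids, cylinders, and so on):

```python
from dataclasses import dataclass, field
from typing import List, Tuple

Point = Tuple[float, float, float]

@dataclass
class SphereObstacle:
    center: Point
    radius: float
    def contains(self, p: Point) -> bool:
        return sum((a - b) ** 2 for a, b in zip(p, self.center)) <= self.radius ** 2

@dataclass
class BoxObstacle:
    lower: Point
    upper: Point
    def contains(self, p: Point) -> bool:
        return all(lo <= v <= hi for v, lo, hi in zip(p, self.lower, self.upper))

@dataclass
class HeadroomModel:
    airspace_lower: Point
    airspace_upper: Point
    obstacles: List = field(default_factory=list)

    def is_clear(self, p: Point) -> bool:
        # Clear = inside the flight airspace and outside every obstacle
        # geometric model removed from it.
        in_box = all(lo <= v <= hi for v, lo, hi in
                     zip(p, self.airspace_lower, self.airspace_upper))
        return in_box and not any(o.contains(p) for o in self.obstacles)
```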
Optionally, the handheld terminal 100 may further include a user instruction receiving unit. For example, the user instruction receiving unit may be a touch display screen, or a separate display screen combined with a plurality of hard keys.
The user instruction receiving unit may be configured to receive a user selection of at least one first object and a user selection of at least one second object. Alternatively, the user instruction receiving unit receives a user selection of at least one first object without a selection of a second object.
To determine the flight airspace more accurately, the user instruction receiving unit may receive user input regarding attributes of the flight airspace. For example, the display screen may show the measurement results for the first target objects used to determine the flight airspace, and the user then confirms or modifies those results by input. Illustratively, attributes of the flight airspace include, but are not limited to, its height, width, and length.
To more accurately determine the headroom area, the user instruction receiving unit may receive a user selection of a geometry of each of the at least one second target. Illustratively, the geometric shapes are selected from a predetermined geometric model. As mentioned above, the predetermined geometric model may include, but is not limited to, a cuboid, a cube, a pyramid, a sphere, an ellipsoid, a cylinder, and the like. Thus, the user can select a geometric shape suitable for the second object from the plurality of predetermined geometric models according to the geometric feature of the second object.
It should be noted that, although the above embodiments take the initial measurement position of the handheld terminal 100 as the origin of the three-dimensional coordinate system, those skilled in the art will understand that the present invention is not limited thereto: any point in the area surrounding the aircraft may serve as the coordinate origin. For example, but not limited to, the origin may be the initial position of the aircraft, or a point at a fixed bearing within the surrounding area.
It should be noted that the above-described flight airspace modeling method and headroom airspace modeling method are merely examples. Any suitable modeling method known in the art or developed in the future may be employed by those skilled in the art to determine flight airspace and/or headroom airspace.
After the modeling is completed, the modeling unit 120 of the handheld terminal 100 generates modeling information. Next, the control unit 130 generates control information based on at least the modeling information. The communication unit 140 then transmits the control information based on at least the modeling information to the aircraft.
In one embodiment, the control information generated by the control unit 130 based on at least the modeling information includes headroom model data associated with the headroom model. Thus, the communication unit 140 transmits the headroom model data associated with the headroom model to the aircraft.
Illustratively, after obtaining the headroom airspace model data, the aircraft can fly freely within the headroom airspace with the support of onboard sensors such as an obstacle avoidance module and a barometric altitude sensor, without a flight path planned in advance; the aircraft generates its route autonomously.
Alternatively, after obtaining the headroom model data, the aircraft performs flight path planning based at least on the headroom model data. In other words, after the hand-held terminal completes modeling, flight path planning is performed by the aircraft itself.
In this embodiment, when the aircraft performs flight path planning, in addition to the headroom model data, the specific position of the current position of the aircraft in the three-dimensional coordinate system needs to be known. The specific position of the aircraft may be obtained according to the following two examples.
In one example, the measurement unit 110 of the handheld terminal 100 also measures the aircraft's relative position. For example, the measurement unit 110 measures the position of the aircraft relative to the current position of the handheld terminal 100 and transmits it to the aircraft through the communication unit 140. If the current position of the handheld terminal 100 differs from its initial measurement position (i.e., the origin of the three-dimensional coordinate system used for modeling), the current position may be related back to the initial measurement position. Illustratively, a position sensor is added to the handheld terminal 100, through which three-dimensional information of the current position relative to the previously recorded initial measurement position is acquired. Alternatively, a 9-axis sensor is added to the handheld terminal 100; it measures the changes in angle, acceleration, and direction of the terminal in real time as it moves, from which the displacement of the current position relative to the initial measurement position is computed. In this way, the position of the aircraft relative to the initial measurement position of the handheld terminal 100 (i.e., the coordinate origin at modeling time) can be obtained and transmitted to the aircraft through the communication unit 140.
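The patent leaves the sensor-fusion details open. A deliberately oversimplified dead-reckoning sketch of the 9-axis idea, assuming gravity-compensated accelerations already rotated into the world frame and a fixed sample interval (a real implementation would fuse gyroscope and magnetometer data and correct drift), is:

```python
import numpy as np

def displacement_from_initial(accel_world, dt):
    # Double-integrate world-frame acceleration samples (m/s^2) taken
    # every dt seconds since the initial measurement position was
    # recorded; returns current position minus initial position.
    velocity = np.zeros(3)
    displacement = np.zeros(3)
    for a in accel_world:
        velocity += np.asarray(a, dtype=float) * dt
        displacement += velocity * dt
    return displacement
```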
In another example, aircraft position information is measured by an aircraft via position sensors on board the aircraft. Alternatively, the initial takeoff position of the aircraft is recorded, and the change of relative angle, acceleration and direction is measured by an onboard gyroscope, an acceleration sensor, a 9-axis sensor or the like on the aircraft, so that the current aircraft position information is acquired.
After obtaining the relative position information of the aircraft or the position information of the aircraft, the aircraft can plan a path where the aircraft can autonomously fly in a clearance airspace according to the obtained clearance airspace model data and the relative position information of the aircraft or the position information of the aircraft.
Specifically, the aircraft may generate at least one of path coordinate data and flight dynamics data associated with the flight path plan. For example, the flight path may be characterized by coordinate data in the three-dimensional coordinate system. In this case the flight path planning is a global planning: the aircraft stores a series of track points along the path and then flies through the designated track points. Alternatively, the flight path may be characterized by flight dynamics data. In this case the flight path planning is a dynamic planning: the flight actions and the duration of each action are planned and stored. The flight actions include, but are not limited to, forward, backward, up, down, left, right, acceleration, and deceleration.
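Purely as illustration (the names below are assumptions, not the patent's), the two plan representations could be modeled as:

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class WaypointPlan:
    # "Global planning": the stored track points the aircraft flies
    # through, as coordinates in the modeling coordinate system.
    waypoints: List[Tuple[float, float, float]]

@dataclass
class DynamicsSegment:
    # "Dynamic planning": one flight action and how long to hold it.
    action: str        # e.g. "forward", "climb", "turn_left", "decelerate"
    duration_s: float

@dataclass
class DynamicsPlan:
    segments: List[DynamicsSegment]
```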
Further, optionally, the communication unit 140 may also send the aircraft a flight control command indicating at least one of a plurality of preset flight modes, so that the aircraft performs flight path planning based at least on the following three items: the at least one flight mode; the headroom airspace model; and the aircraft relative position information or aircraft position information. For example, the at least one flight mode may be selected by the user from the preset flight modes through the user instruction receiving unit, e.g., via the touch screen or hard keys.
In another embodiment, unlike the above embodiment in which the flight path planning is performed by the aircraft itself, the flight path planning is performed by the hand-held terminal 100. In other words, since the flight path planning by the aircraft is not required, the communication unit 140 of the handheld terminal 100 does not transmit the headroom model data associated with the headroom model to the aircraft.
In this embodiment, the control unit 130 of the hand-held terminal 100 comprises a planning subunit for performing a flight path planning with respect to the aircraft.
Specifically, in one example, the communication unit 140 of the handheld terminal 100 receives aircraft location information from an aircraft. Specifically, the aircraft position information is acquired in the same manner as in the previous embodiment. For example, aircraft position information is measured by an aircraft via position sensors on board the aircraft. Alternatively, the initial takeoff position of the aircraft is recorded, and the change of relative angle, acceleration and direction is measured by an onboard gyroscope, an acceleration sensor, a 9-axis sensor or the like on the aircraft, so that the current aircraft position information is acquired.
In this example, the planning subunit of the control unit 130 performs a flight path planning with respect to the aircraft based on at least the modeling information and the aircraft position information.
In another example, the measurement unit 110 of the handheld terminal 100 also measures aircraft relative position information. For example, the measurement unit 110 of the handheld terminal 100 measures aircraft relative position information with respect to the current position of the handheld terminal 100. If the current position of the handheld terminal 100 is different from the initial measurement position of the handheld terminal 100 (i.e., the origin of coordinates of the three-dimensional coordinate system when modeled), the current position of the handheld terminal 100 may be associated with the initial measurement position. Illustratively, a position sensor is added to the handheld terminal 100, by which three-dimensional information of the current position with respect to the previously recorded initial measurement position is acquired. Alternatively, a 9-axis sensor is added to the handheld terminal 100, the change of the angle, the acceleration and the direction of the handheld terminal during the movement process is measured in real time through the 9-axis sensor, and then the change of the current position of the handheld terminal relative to the initial measurement position is obtained through calculation. Thereby, the aircraft relative position information of the initial measured position of the aircraft (i.e., the origin of coordinates of the three-dimensional coordinate system at the time of modeling) with respect to the handheld terminal 100 can be obtained.
In this example, the planning subunit of the control unit 130 performs a flight path planning with respect to the aircraft based on at least the modeling information and the aircraft relative position information.
In particular, the planning subunit generates at least one of path coordinate data and flight dynamics data associated with the flight path plan. Further, at least one of the path coordinate data and the flight dynamics data is transmitted to the aircraft through the communication unit 140. For example, the flight path may be characterized by coordinate data in a three-dimensional coordinate system. In this case, the flight path planning is a global planning, the communication unit 140 sends information of a plurality of track points on the path to the aircraft, and the aircraft flies according to the designated track points. Alternatively, the flight path may also be characterized by flight dynamics data of the aircraft. In this case, the flight path planning is a dynamic planning, and specifically includes planning and transmitting the flight dynamics and the flight time of each flight dynamics. The flight dynamics include information of forward, backward, upward, downward, leftward, rightward, acceleration, deceleration, and the like.
Optionally, the planning subunit further determines at least one flight mode from a plurality of preset flight modes, and performs flight path planning based on: the at least one flight mode; the headroom airspace model; and the aircraft relative position information or aircraft position information. For example, the at least one flight mode may be selected by the user from the preset flight modes through the user instruction receiving unit, e.g., via the touch screen or hard keys.
For both embodiments above, i.e., whether the flight path planning is performed by the aircraft itself or by the handheld terminal, the preset flight modes may include, but are not limited to: fixed-altitude flight, straight flight, wave-type forward flight, circular hovering flight, elliptical hovering flight, figure-eight hovering flight, climbing, descending, left turn, right turn, hovering, following flight, return-to-nest, landing, and the like. The aircraft may fly in any single one of these flight modes or in any combination of two or more of them.
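For illustration only, the preset modes might be encoded as an enumeration, with a combined flight expressed as an ordered sequence of modes (all identifiers below are assumptions):

```python
from enum import Enum, auto

class FlightMode(Enum):
    FIXED_ALTITUDE = auto()
    STRAIGHT = auto()
    WAVE_FORWARD = auto()
    CIRCLE_HOVER = auto()
    ELLIPSE_HOVER = auto()
    FIGURE_EIGHT_HOVER = auto()
    CLIMB = auto()
    DESCEND = auto()
    TURN_LEFT = auto()
    TURN_RIGHT = auto()
    HOVER = auto()
    FOLLOW = auto()
    RETURN_TO_NEST = auto()
    LAND = auto()

# A flight combining several modes in sequence:
mission = [FlightMode.CLIMB, FlightMode.CIRCLE_HOVER, FlightMode.RETURN_TO_NEST]
```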
It should be noted that the measurements of the position of the aircraft, whether by the aircraft itself or by the handheld terminal, may be periodic, real-time, intermittent, or event-based. Further, while various examples of sensors for measuring the current position of the hand-held terminal and/or the current position of the aircraft are enumerated herein, the present disclosure is not limited thereto, but may include any other sensor capable of measuring the current position of the hand-held terminal and/or the current position of the aircraft.
It is further noted that the flight path planning method described above is merely an example. The autonomous flight of the aircraft may be planned by one skilled in the art using any suitable flight path planning method known in the art or developed in the future.
Further, in the above-described embodiments, although the description has been given taking the case where the flight path planning is performed entirely on the hand-held terminal side and the flight path planning is performed entirely on the aircraft side as an example, it will be understood by those skilled in the art that the present invention is not limited thereto, but any manner capable of planning a flight path may be employed. Illustratively, the flight path planning is jointly completed in a manner that the handheld terminal and the aircraft are cooperated, namely, a part of the flight path planning function is completed by the handheld terminal, and the rest is completed by the aircraft. For another example, switching between the flight path planning completed on the hand-held terminal side and the flight path planning completed on the aircraft side can be performed under different scenes or applications. For example, before the aircraft takes off, the flight path of the aircraft is planned for the first time by the handheld terminal, and then the flight path can be adjusted or planned by the aircraft itself in the flight process of the aircraft.
The communication unit 140 of the hand-held terminal 100 may be implemented as a communication unit that communicates with the aircraft in various wireless communication protocols. Illustratively, the communication unit 140 may include a bluetooth communicator, a Bluetooth Low Energy (BLE) communicator, a near field communicator, a Wireless Local Area Network (WLAN) or Wi-Fi communicator, a Zigbee communicator, an infrared data association (IrDA) communicator, a Wi-Fi direct (WFD) communicator, an Ultra Wideband (UWB) communicator, and an Ant + communicator, but is not limited thereto.
In one embodiment, the communication unit 140 of the handheld terminal 100 also receives flight status information from the aircraft, and the control unit 130 monitors the flight status of the aircraft based on this information. The flight status includes, but is not limited to: flight altitude, speed, angle, direction, servo (steering engine) speed, flapping frequency, obstacle information, radar sensing information, onboard camera information, and the like.
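Illustratively (the field names below are assumptions), the reported status could be grouped as a single record:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class FlightStatus:
    altitude_m: float
    speed_mps: float
    attitude_deg: float          # flight angle
    heading_deg: float           # flight direction
    servo_rpm: float             # steering-engine (servo) speed
    flapping_hz: float           # wing vibration frequency
    obstacle_info: Optional[list] = None
    radar_info: Optional[list] = None
    camera_frame_id: Optional[int] = None
```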
In one embodiment, the aircraft returns to the current location of the handheld terminal 100 under at least one of the following conditions: the strength of the signal received by the aircraft from the communication unit 140 of the handheld terminal 100 is below a first preset threshold; the communication unit 140 of the hand-held terminal 100 issues a return command to the aircraft; and the battery level of the aircraft is below a second preset threshold.
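A minimal sketch of these three return triggers (threshold names and units are assumptions):

```python
def should_return_home(rssi_dbm: float, return_commanded: bool,
                       battery_pct: float,
                       rssi_floor_dbm: float = -90.0,
                       battery_floor_pct: float = 15.0) -> bool:
    # Return to the handheld terminal's current location if the link is
    # weak, a return command was issued, or the battery is low.
    return (rssi_dbm < rssi_floor_dbm
            or return_commanded
            or battery_pct < battery_floor_pct)
```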
It should be noted that the components and structure of the handheld terminal 100 shown in fig. 1 are exemplary only, and not limiting, and the handheld terminal 100 may have other components and structures as desired. For example, the handheld terminal 100 may also include input and output devices not shown. The input device may be a device used by a user to input instructions and may include one or more of a keyboard, a microphone, a touch screen, and the like. The output device may output various information (e.g., images or sounds) to an outside (e.g., a user), and may include one or more of a display, a speaker, and the like.
In one example, the handheld terminal 100 further includes a user instruction receiving unit for receiving manual control commands. Illustratively, during autonomous flight the user may issue manual control commands through this unit, including but not limited to commands for adjusting flight altitude, speed, angle, direction, preset flight mode, or the return function, for example: climb 1 m, descend 1 m, turn left 90 degrees, turn right 90 degrees, fly straight, hover, fly a figure-eight, and so on. After such a command is received, the flight path is re-planned by at least one of the handheld terminal and the aircraft to carry out the commanded action. After completing the command, the aircraft continues autonomous flight along the re-planned flight path until a new command is received.
In another example, the handheld terminal 100 further includes a voice collecting unit for receiving a voice of the user. In this case, the control unit 130 of the hand-held terminal 100 includes a voice recognition subunit for recognizing the voice of the user to generate a voice control command for the aircraft. The voice control commands include, but are not limited to, various control commands for adjusting altitude, speed, angle, direction, preset flight mode, homing function, etc. The speech recognition process is known to those skilled in the art and will not be described in detail herein.
In yet another example, the handheld terminal 100 further includes a display unit for displaying data acquired by an onboard camera of the aircraft. Illustratively, the display unit may be a display screen integrated on the handheld terminal 100. Alternatively, the display unit may also be a display device physically separate from the handheld terminal but communicating via wireless communication technology, such as, but not limited to, a VR display device, a head mounted display device, etc. Further, the data acquired by the onboard cameras of the aircraft may include, but is not limited to, still images and/or motion video.
The handheld terminal 100 of the embodiment of the present disclosure has been described in detail above with reference to fig. 1. By placing the measurement unit for measuring obstacles, the modeling unit for three-dimensional modeling, and the control unit for complex autonomous flight control on the handheld terminal, the payload and computational burden on the aircraft are reduced; even where the fuselage must be light and the payload capacity is low, more stable autonomous flight and more complex control modes can be realized, greatly improving the autonomous flight capability of lightweight aircraft. In addition, through the user's autonomous selections, the handheld terminal can identify obstacles more accurately and match user requirements more flexibly, and thus better adapt to the various application scenarios of lightweight aircraft.
It is noted that the disclosed embodiments are particularly applicable to bionic ornithopters with a light weight fuselage. However, the principles of the disclosed embodiments may be extended to any other type of unmanned aerial vehicle, including fixed wing aircraft, multi-rotor aircraft, and the like, particularly where these types of unmanned aerial vehicles are limited to airframes of light weight and low load capacity that require autonomous flight.
Next, a block diagram of a main configuration of a handheld terminal according to an embodiment of the present disclosure is described with reference to fig. 2.
As shown in fig. 2, the handheld terminal 200 of the embodiment of the present disclosure mainly includes a measurement unit 210, a control unit 230, and a communication unit 240. These components are interconnected by a bus and/or other form of connection mechanism (not shown).
The measurement unit 210 measures position information of at least one target object within the peripheral space in which the aircraft is located. The control unit 230 generates control information based on at least the position information of the at least one target object. The communication unit 240 transmits the control information based on at least the position information of the at least one target object to the aircraft.
It should be noted that the configuration and function of the measurement unit 210 of the handheld terminal 200 and the specific measurement method thereof are the same as the measurement unit 110 of the handheld terminal 100 described with reference to fig. 1, and are not described again here.
One difference from the handheld terminal 100 shown in fig. 1 is that the handheld terminal 200 shown in fig. 2 does not include a modeling unit: in this embodiment, three-dimensional modeling is performed on the aircraft side rather than on the handheld terminal side. The handheld terminal 200 therefore does not generate modeling information; the control unit 230 does not generate, and the communication unit 240 does not transmit, control information based on modeling information. Instead, the control unit 230 generates control information based at least on the position information of the at least one target object and transmits it to the aircraft through the communication unit 240. The position information of the at least one target object is used by the aircraft to model the peripheral space in which it is located, generate modeling data, and perform flight path planning based at least on that modeling data.
Specifically, the aircraft-side modeling process includes a flight airspace modeling subprocess: modeling a flight airspace of the aircraft according to the position information of at least one first target object in the at least one target object to obtain a flight airspace model.
Still further, the aircraft-side modeling process further includes a headroom modeling process: determining whether the flight airspace is headroom, and upon determining that the flight airspace is not headroom, for at least one second object among the at least one object, determining a geometric model of each of the at least one second object, and removing the geometric model from the flight airspace model to obtain a headroom airspace model, wherein the at least one second object is different from the at least one first object; or when the flight airspace is determined to be the clearance airspace, determining the flight airspace model as the clearance airspace model.
It should be noted that the above modeling process on the aircraft side is similar to the modeling process of the modeling unit 120 of the handheld terminal 100 described with reference to fig. 1: the flight airspace modeling process of the aircraft is similar to the processing performed by the flight airspace modeling subunit of the modeling unit 120, and the headroom airspace modeling process of the aircraft is similar to the processing performed by the headroom airspace modeling subunit of the modeling unit 120. For brevity, the detailed modeling process is not repeated here.
Another difference from the handheld terminal 100 described with reference to fig. 1 is that the control unit 230 of the handheld terminal 200 shown in fig. 2 does not perform flight path planning for the aircraft. In other words, with the handheld terminal 100 of fig. 1, flight path planning may be performed on the handheld terminal side, the aircraft side, or both, whereas with the handheld terminal 200 of fig. 2 it is performed only on the aircraft side.
Specifically, after the aircraft generates the headroom model by itself, the aircraft performs flight path planning based on at least the headroom model.
For example, in addition to the headroom model, the aircraft may obtain a specific position of the current position of the aircraft in the modeled three-dimensional coordinate system for flight path planning. For example, a specific position of an aircraft may be obtained according to the following two examples.
In one example, measurement unit 210 also measures aircraft relative position information, and communication unit 240 transmits the aircraft relative position information to the aircraft.
In another example, aircraft position information is measured by the aircraft via position sensors.
After obtaining the relative position information of the aircraft or the position information of the aircraft, the aircraft carries out flight path planning based on at least the relative position information of the aircraft or the position information of the aircraft.
It should be noted that the measurement unit 210 of the handheld terminal 200 shown in fig. 2 measures the relative position information of the aircraft in the same manner as the measurement unit 110 of the handheld terminal 100 shown in fig. 1 measures the relative position information of the aircraft, and therefore, the description thereof is omitted. The manner in which the aircraft itself in the embodiment of fig. 2 obtains the aircraft position information is the same as the manner in which the aircraft itself in the embodiment of fig. 1 obtains the aircraft position information, and therefore, the description is omitted. In addition, the way of planning the flight path of the aircraft in the embodiment of fig. 2 is the same as the way of planning the flight path of the aircraft in the partial embodiment of fig. 1, and therefore, the description is omitted.
It should also be noted that the user instruction receiving function, the voice recognition function, the flight status monitoring function, the display function, and the like of the hand-held terminal 100 described with reference to fig. 1 are also applicable to the hand-held terminal 200 shown in fig. 2. For the sake of brevity, the embodiments related to these functions will not be described in detail herein.
It should be noted that the components and structure of the handheld terminal 200 shown in fig. 2 are exemplary only, and not limiting. Similar to the handheld terminal 100 shown in fig. 1, the handheld terminal 200 shown in fig. 2 may have other components and structures as desired. For example, the handheld terminal 200 may also include input and output devices not shown. The input device may be a device used by a user to input instructions and may include one or more of a keyboard, a microphone, a touch screen, and the like. The output device may output various information (e.g., images or sounds) to an outside (e.g., a user), and may include one or more of a display, a speaker, and the like.
The handheld terminal 200 of the embodiment of the present disclosure is described in detail above with reference to fig. 2. In the handheld terminal of the embodiment of the disclosure, the measurement unit on the handheld terminal measures obstacles and the control unit performs the complex autonomous flight control, which reduces the load burden and the computational burden of the aircraft to a certain extent. More stable autonomous flight and more complex control modes thus become possible even when the aircraft's airframe is light and its load capacity is low, greatly improving the autonomous flight capability of lightweight aircraft. In addition, because the user autonomously selects the obstacles, the handheld terminal can determine them more accurately and match user requirements more flexibly, better adapting to the various application scenarios of lightweight aircraft.
Next, a block diagram of a main configuration of an aircraft according to an embodiment of the present disclosure is described with reference to fig. 3.
As shown in fig. 3, the aircraft 300 of the disclosed embodiment mainly includes a communication unit 310 and a control unit 320. These components are interconnected by a bus and/or other form of connection mechanism (not shown). The aircraft 300 shown in fig. 3 may be used with the handheld terminal 100 shown in fig. 1.
The communication unit 310 receives control information based on at least the modeling information from the handheld terminal. The control unit 320 controls the flight of the aircraft 300 at least on the basis of the control information. The modeling information is obtained by the handheld terminal by measuring the position information of at least one object in the peripheral space in which the aircraft 300 is located and modeling the peripheral space in which the aircraft 300 is located according to the position information of the at least one object.
According to an embodiment of the present disclosure, modeling the peripheral space in which the aircraft 300 is located according to the position information of the at least one target object includes: the flight airspace of the aircraft 300 is modeled by the handheld terminal according to the position information of at least one first target object in the at least one target object to obtain a flight airspace model.
According to an embodiment of the present disclosure, modeling the peripheral space in which the aircraft 300 is located according to the position information of the at least one target object further includes: determining whether the flight airspace is headroom; when it is determined that the flight airspace is not headroom, determining, for at least one second target among the at least one target, a geometric model of each of the at least one second target, and removing the geometric model from the flight airspace model to obtain a headroom airspace model, wherein the at least one second target is different from the at least one first target; or, when the flight airspace is determined to be headroom, determining the flight airspace model as the headroom airspace model.
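To make the headroom modeling above concrete, the following Python sketch represents the flight airspace as a voxel grid and carves out axis-aligned boxes standing in for the geometric models of the second targets. The voxel resolution, the `Box` representation, and the grid extent are illustrative assumptions, not part of the disclosure.

```python
import numpy as np

class Box:
    """Axis-aligned geometric model of one second target (obstacle).

    An axis-aligned box is an illustrative assumption; the disclosure
    lets the user pick a geometry for each second target.
    """
    def __init__(self, min_corner, max_corner):
        self.min_corner = np.asarray(min_corner, dtype=float)
        self.max_corner = np.asarray(max_corner, dtype=float)

def headroom_model(airspace_size, resolution, obstacles):
    """Build a headroom airspace model as a boolean voxel grid.

    Start from the flight airspace model (all voxels free); if any
    obstacles are given, remove each obstacle's geometric model from it.
    True marks free (headroom) space.
    """
    shape = tuple(int(np.ceil(s / resolution)) for s in airspace_size)
    free = np.ones(shape, dtype=bool)          # flight airspace model
    grid = np.array(shape)
    for box in obstacles:                      # flight airspace not headroom
        lo = np.clip(np.floor(box.min_corner / resolution).astype(int), 0, grid)
        hi = np.clip(np.ceil(box.max_corner / resolution).astype(int), 0, grid)
        free[lo[0]:hi[0], lo[1]:hi[1], lo[2]:hi[2]] = False
    return free                                # headroom airspace model

# Example: 20 m x 20 m x 10 m flight airspace, 0.5 m voxels, one pillar.
model = headroom_model((20.0, 20.0, 10.0), 0.5,
                       [Box((9.0, 9.0, 0.0), (11.0, 11.0, 6.0))])
```

A voxel grid is only one possible representation; the disclosure does not prescribe how the headroom airspace model is stored or transmitted.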
According to the embodiment of the present disclosure, the communication unit 310 receives headroom airspace model data associated with the headroom airspace model from the handheld terminal. Furthermore, the control unit 320 comprises a planning subunit that performs flight path planning based at least on the headroom airspace model data.
According to an embodiment of the present disclosure, the aircraft 300 further includes an onboard position sensor that measures aircraft position information. Further, the communication unit 310 transmits the aircraft position information to the handheld terminal. In this embodiment, the control information received by the communication unit 310 of the aircraft 300 from the hand-held terminal includes flight path planning information for the aircraft 300, which is obtained by the hand-held terminal based on at least the modeling information and the aircraft location information.
According to an embodiment of the present disclosure, the control information received by the communication unit 310 of the aircraft 300 from the handheld terminal includes flight path planning information for the aircraft 300, which is obtained by the handheld terminal based on at least the modeling information and the relative position information of the aircraft measured by the handheld terminal.
According to the embodiment of the disclosure, the flight path planning information includes: at least one of path coordinate data and flight power data.
According to an embodiment of the present disclosure, the communication unit 310 receives aircraft relative position information measured by the handheld terminal from the handheld terminal. Alternatively, the aircraft 300 also includes onboard position sensors that measure aircraft position information. In both cases, the planning subunit of the control unit 320 of the aircraft 300 performs flight path planning based at least on the aircraft relative position information or the aircraft position information, respectively.
According to an embodiment of the present disclosure, the communication unit 310 receives flight control commands from the handheld terminal indicating at least one of a plurality of preset flight modes. In addition, the planning subunit of the control unit 320 performs flight path planning based on three inputs: the at least one flight mode; the headroom airspace model; and the aircraft relative position information or the aircraft position information.
According to an embodiment of the present disclosure, the planning subunit of the control unit 320 generates at least one of the following data associated with the flight path planning: path coordinate data and flight power data.
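As a sketch of such a planning subunit, the function below runs a breadth-first search over the headroom voxel grid from the previous example and emits both kinds of data named above: path coordinate data (voxel-centre waypoints) and flight power data (a per-leg power figure). The flight-mode power table and the climb-power heuristic are invented for illustration; the disclosure does not fix a particular planning algorithm.

```python
from collections import deque

def plan_path(free, start, goal, resolution=0.5, flight_mode="cruise"):
    """Toy planning subunit combining the three inputs named in the text:
    a preset flight mode, the headroom airspace model (the `free` grid
    from the sketch above), and the aircraft's position given as a voxel
    index. The mode table and climb heuristic are assumptions.
    """
    mode_power = {"cruise": 0.5, "sport": 0.8, "eco": 0.35}  # hypothetical
    base = mode_power.get(flight_mode, 0.5)
    moves = [(1, 0, 0), (-1, 0, 0), (0, 1, 0), (0, -1, 0), (0, 0, 1), (0, 0, -1)]
    prev = {start: None}
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:
            break
        for d in moves:
            nxt = tuple(c + s for c, s in zip(cell, d))
            if (all(0 <= n < dim for n, dim in zip(nxt, free.shape))
                    and free[nxt] and nxt not in prev):
                prev[nxt] = cell
                queue.append(nxt)
    if goal not in prev:
        return None, None                      # no headroom route exists
    cells, c = [], goal
    while c is not None:
        cells.append(c)
        c = prev[c]
    cells.reverse()
    # Path coordinate data: voxel centres in metres.
    path_coordinates = [tuple(resolution * (i + 0.5) for i in cell) for cell in cells]
    # Flight power data: extra power on climbing legs (assumption).
    flight_power_data = [base + (0.2 if b[2] > a[2] else 0.0)
                         for a, b in zip(cells, cells[1:])]
    return path_coordinates, flight_power_data
```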
According to the embodiment of the present disclosure, the measuring, by the handheld terminal, of the position information of the at least one target object in the peripheral space where the aircraft 300 is located further includes: receiving a user selection of at least one first target object; and receiving a user selection of at least one second target object.
According to an embodiment of the present disclosure, determining the geometric model of each of the at least one second object comprises: a user input is received by the handheld terminal to determine a user selection of a geometry for each of the at least one second object.
According to an embodiment of the present disclosure, modeling the flight airspace of the aircraft 300 includes: user input of attributes of the flight airspace is received by the handheld terminal.
According to the embodiment of the present disclosure, the communication unit 310 also receives a voice control command from the handheld terminal, the voice control command being generated by the handheld terminal through voice recognition of the user's voice input.
According to the embodiment of the present disclosure, the measuring, by the handheld terminal, of the position information of the at least one target object in the peripheral space where the aircraft 300 is located further includes: obtaining, by the handheld terminal via at least one of a rangefinder and an image collector and with the position of the handheld terminal as a base point, for each of the at least one target object, the distance of each of at least one feature point of that target object relative to the base point.
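The sketch below turns such base-point measurements into coordinates. It assumes each reading pairs the measured distance with azimuth and elevation angles from the terminal's orientation sensors; the disclosure itself only requires distances relative to the base point, so the angle inputs are an illustrative assumption.

```python
import math

def feature_point_position(distance, azimuth_deg, elevation_deg):
    """Convert one rangefinder reading taken from the handheld terminal's
    position (the base point, used as the origin) into Cartesian
    coordinates. Pairing each distance with azimuth/elevation angles is
    an assumption for illustration, not part of the disclosure.
    """
    az, el = math.radians(azimuth_deg), math.radians(elevation_deg)
    return (distance * math.cos(el) * math.cos(az),
            distance * math.cos(el) * math.sin(az),
            distance * math.sin(el))

# Two feature points of one target object, measured from the base point:
base_of_tree = feature_point_position(12.0, 30.0, 2.0)
top_of_tree = feature_point_position(12.5, 30.0, 25.0)
```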
According to the embodiment of the present disclosure, the communication unit 310 also transmits the flight status information of the aircraft 300 to the handheld terminal, so that the handheld terminal monitors the flight status of the aircraft 300.
According to an embodiment of the present disclosure, the aircraft 300 further includes an onboard camera, and data acquired by the onboard camera is transmitted to the handheld terminal by the communication unit 310 and displayed by a display unit of the handheld terminal.
According to an embodiment of the present disclosure, the aircraft 300 returns to the current location of the handheld terminal under at least one of the following conditions: the strength of the signal received by the communication unit 310 from the handheld terminal is lower than a first preset threshold; the communication unit 310 receives a return command from the handheld terminal; and the battery level of the aircraft 300 is below a second preset threshold.
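The three return conditions combine naturally into a single trigger check, sketched below. It assumes a received-signal-strength reading in dBm and a battery percentage; both concrete threshold values are hypothetical stand-ins for the first and second preset thresholds, which the disclosure leaves open.

```python
def should_return(signal_dbm, return_command_received, battery_pct,
                  first_threshold_dbm=-90.0, second_threshold_pct=15.0):
    """Combine the three return-to-terminal conditions listed above.
    Threshold values are illustrative assumptions.
    """
    return (signal_dbm < first_threshold_dbm
            or return_command_received
            or battery_pct < second_threshold_pct)

assert should_return(-95.0, False, 80.0)   # a weak link alone triggers return
```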
It should be noted that the specific measurement method, modeling method, flight path planning method, and the like in the embodiments described with reference to fig. 1 are also applicable to the embodiments of fig. 3. For the sake of brevity, no further description is provided herein.
It should be noted that the components and configuration of the aircraft 300 shown in FIG. 3 are exemplary only, and not limiting, and that the aircraft 300 shown in FIG. 3 may have other components and configurations as desired. For example, the aircraft 300 may also include various on-board sensors not shown, such as one or more of gyroscopes, angular rate sensors, acceleration sensors, obstacle avoidance sensors, position sensors, barometers, on-board cameras, and so forth.
The aircraft 300 of the disclosed embodiment is described in detail above with reference to fig. 3. In the embodiment of the disclosure, the handheld terminal, rather than the aircraft, is configured with a measurement unit to measure obstacles, a modeling unit to perform three-dimensional modeling, and a control unit to perform the complex autonomous flight control, which reduces the load burden and the computational burden of the aircraft. More stable autonomous flight and more complex control modes thus become possible even when the aircraft's airframe is light and its load capacity is low, greatly improving the autonomous flight capability of lightweight aircraft. In addition, obstacles can be determined more accurately through the user's autonomous selection, and user requirements can be matched more flexibly, so that the various application scenarios of lightweight aircraft are better served.
It is noted that the disclosed embodiments are particularly applicable to bionic ornithopters with a lightweight fuselage. However, the principles of the disclosed embodiments may be extended to any other type of unmanned aerial vehicle, including fixed-wing aircraft, multi-rotor aircraft, and the like, particularly where such vehicles are limited to light airframes with low load capacity yet require autonomous flight.
Next, a block diagram of a main configuration of an aircraft according to another embodiment of the present disclosure is described with reference to fig. 4.
As shown in fig. 4, the aircraft 400 of the embodiment of the present disclosure mainly includes a communication unit 410, a control unit 420, and a modeling unit 430. These components are interconnected by a bus and/or other form of connection mechanism (not shown). The aircraft 400 shown in fig. 4 may be used with the handheld terminal 200 shown in fig. 2.
According to an embodiment of the present disclosure, the communication unit 410 receives control information based on at least position information of at least one target object from a handheld terminal. The modeling unit 430 models the peripheral space in which the aircraft 400 is located, based on the control information, to generate modeling data. Control unit 420 controls the flight of aircraft 400 based on at least the control information and the modeling data. In this embodiment, the position information of the at least one object is obtained by the handheld terminal by measuring the position of the at least one object within the peripheral space in which the aircraft 400 is located.
According to an embodiment of the present disclosure, the modeling unit 430 comprises a flight airspace modeling subunit that models the flight airspace of the aircraft 400 based on the position information of at least one first object from among the at least one object to obtain a flight airspace model.
According to an embodiment of the present disclosure, the modeling unit 430 further comprises a headroom airspace modeling subunit that determines whether the flight airspace is headroom; when it is determined that the flight airspace is not headroom, the subunit determines, for at least one second object among the at least one object, a geometric model of each of the at least one second object, wherein the at least one second object is different from the at least one first object, and removes the geometric model from the flight airspace model to obtain a headroom airspace model; or, when the flight airspace is determined to be headroom, the subunit determines the flight airspace model as the headroom airspace model.
According to an embodiment of the present disclosure, the control unit 420 comprises a planning subunit that performs flight path planning based at least on the modeling data.
According to an embodiment of the present disclosure, the communication unit 410 receives aircraft relative position information measured by the handheld terminal from the handheld terminal. Alternatively, the aircraft 400 also includes onboard position sensors that measure aircraft position information. In both cases, the planning subunit of the control unit 420 performs the flight path planning based on at least the aircraft relative position information or the aircraft position information, respectively.
According to an embodiment of the present disclosure, the communication unit 410 also receives flight control commands from the handheld terminal indicating at least one of a plurality of preset flight modes. Furthermore, the planning subunit of the control unit 420 performs flight path planning based on three inputs: the at least one flight mode; the headroom airspace model; and the aircraft relative position information or the aircraft position information.
According to an embodiment of the present disclosure, the planning subunit of the control unit 420 generates at least one of the following data associated with the flight path planning: path coordinate data and flight power data.
According to the embodiment of the present disclosure, the measuring, by the handheld terminal, of the position information of the at least one target object in the peripheral space where the aircraft 400 is located further includes: receiving a user selection of at least one first target object; and receiving a user selection of at least one second target object.
According to an embodiment of the present disclosure, determining the geometric model of each of the at least one second object comprises: a user input is received by the handheld terminal to determine a user selection of a geometry for each of the at least one second object.
According to an embodiment of the present disclosure, modeling the flight airspace of the aircraft 400 includes: user input of attributes of the flight airspace is received by the handheld terminal.
According to the embodiment of the present disclosure, the communication unit 410 also receives a voice control command from the handheld terminal, the voice control command being generated by the handheld terminal through voice recognition of the user's voice input.
According to the embodiment of the present disclosure, the measuring, by the handheld terminal, of the position information of the at least one target object in the peripheral space where the aircraft 400 is located further includes: obtaining, by the handheld terminal via at least one of a rangefinder and an image collector and with the position of the handheld terminal as a base point, for each of the at least one target object, the distance of each of at least one feature point of that target object relative to the base point.
According to the embodiment of the present disclosure, the communication unit 410 also transmits the flight status information of the aircraft 400 to the handheld terminal so that the handheld terminal monitors the flight status of the aircraft 400.
According to an embodiment of the present disclosure, the aircraft 400 further includes an onboard camera, and data acquired by the onboard camera is transmitted to the handheld terminal by the communication unit 410 and displayed by the display unit of the handheld terminal.
According to an embodiment of the present disclosure, the aircraft 400 returns to the current location of the handheld terminal under at least one of the following conditions: the strength of the signal received by the communication unit 410 from the handheld terminal is lower than a first preset threshold; the communication unit 410 receives a return command from the handheld terminal; and the battery level of the aircraft 400 is below a second preset threshold.
It should be noted that the specific measurement method, modeling method, flight path planning method, and the like in the embodiments described with reference to fig. 2 are also applicable to the embodiments of fig. 4. For the sake of brevity, no further description is provided herein.
It should be noted that the components and configuration of the aircraft 400 shown in FIG. 4 are exemplary only, and not limiting, and that the aircraft 400 shown in FIG. 4 may have other components and configurations as desired. For example, the aircraft 400 may also include various on-board sensors not shown, such as one or more of gyroscopes, angular rate sensors, acceleration sensors, obstacle avoidance sensors, position sensors, barometers, on-board cameras, and so forth.
The aircraft 400 of the disclosed embodiment is described in detail above with reference to fig. 4. In the embodiment of the disclosure, the handheld terminal, rather than the aircraft, is configured with the measurement unit to measure obstacles and with the control unit to perform the complex autonomous flight control, which reduces the load burden and the computational burden of the aircraft to a certain extent. More stable autonomous flight and more complex control modes thus become possible even when the aircraft's airframe is light and its load capacity is low, greatly improving the autonomous flight capability of lightweight aircraft. In addition, obstacles can be determined more accurately through the user's autonomous selection, and user requirements can be matched more flexibly, so that the various application scenarios of lightweight aircraft are better served.
It is noted that the disclosed embodiments are particularly applicable to bionic ornithopters with a lightweight fuselage. However, the principles of the disclosed embodiments may be extended to any other type of unmanned aerial vehicle, including fixed-wing aircraft, multi-rotor aircraft, and the like, particularly where such vehicles are limited to light airframes with low load capacity yet require autonomous flight.
Next, a block diagram of a main configuration of a flight system according to an embodiment of the present disclosure is described with reference to fig. 5.
As shown in fig. 5, a flight system 500 of an embodiment of the present disclosure includes the handheld terminal 100 and the aircraft 300. The handheld terminal 100 mainly includes a measurement unit 110, a modeling unit 120, a control unit 130, and a communication unit 140. The aircraft 300 mainly comprises a communication unit 310 and a control unit 320.
The communication unit 140 of the handheld terminal 100 and the communication unit 310 of the aircraft 300 may communicate through various wireless communication protocols. Illustratively, the wireless communication may include, but is not limited to, Bluetooth communication, Bluetooth Low Energy (BLE) communication, near-field communication, Wireless Local Area Network (WLAN) or Wi-Fi communication, Zigbee communication, Infrared Data Association (IrDA) communication, Wi-Fi Direct (WFD) communication, Ultra-Wideband (UWB) communication, and ANT+ communication.
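Whichever listed protocol carries the link, the payloads exchanged between the two communication units can be framed as small tagged messages. The sketch below uses JSON over a UDP loopback socket purely as a stand-in transport; the message kinds and field names are assumptions made for illustration, not a format the disclosure defines.

```python
import json
import socket

def send_message(sock, addr, kind, payload):
    """Frame one terminal-to-aircraft message as tagged JSON and send it.
    The message kinds mirror payloads named in this disclosure (headroom
    airspace model data, flight control commands, return commands).
    """
    sock.sendto(json.dumps({"kind": kind, "payload": payload}).encode(), addr)

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
send_message(sock, ("127.0.0.1", 9999), "flight_control_command", {"mode": "hover"})
send_message(sock, ("127.0.0.1", 9999), "return_command", {})
sock.close()
```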
Since each functional module of the handheld terminal 100 shown in fig. 5 is the same as the corresponding functional module of the handheld terminal 100 shown in fig. 1, it is not described herein again. Since the individual functional modules of the aircraft 300 shown in fig. 5 are identical to the corresponding functional modules of the aircraft 300 shown in fig. 3, no further description is provided here.
The flight system 500 of the disclosed embodiment is described in detail above with reference to fig. 5. In the embodiment of the disclosure, the handheld terminal, rather than the aircraft, is configured with a measurement unit to measure obstacles, a modeling unit to perform three-dimensional modeling, and a control unit to perform the complex autonomous flight control, which reduces the load burden and the computational burden of the aircraft. More stable autonomous flight and more complex control modes thus become possible even when the aircraft's airframe is light and its load capacity is low, greatly improving the autonomous flight capability of lightweight aircraft. In addition, obstacles can be determined more accurately through the user's autonomous selection, and user requirements can be matched more flexibly, so that the various application scenarios of lightweight aircraft are better served.
Next, a block diagram of a main configuration of a flight system according to another embodiment of the present disclosure is described with reference to fig. 6.
As shown in fig. 6, a flight system 600 of an embodiment of the present disclosure includes a handheld terminal 200 and an aerial vehicle 400. The handheld terminal 200 mainly includes a measurement unit 210, a control unit 230, and a communication unit 240. The aircraft 400 mainly comprises a communication unit 410, a control unit 420 and a modeling unit 430.
The communication unit 240 of the handheld terminal 200 and the communication unit 410 of the aircraft 400 may communicate through various wireless communication protocols. Illustratively, the wireless communication may include, but is not limited to, Bluetooth communication, Bluetooth Low Energy (BLE) communication, near-field communication, Wireless Local Area Network (WLAN) or Wi-Fi communication, Zigbee communication, Infrared Data Association (IrDA) communication, Wi-Fi Direct (WFD) communication, Ultra-Wideband (UWB) communication, and ANT+ communication.
Since each functional module of the handheld terminal 200 shown in fig. 6 is the same as the corresponding functional module of the handheld terminal 200 shown in fig. 2, it is not described herein again. Since the individual functional modules of the aircraft 400 shown in fig. 6 are identical to the corresponding functional modules of the aircraft 400 shown in fig. 4, no further description is provided here.
The flight system 600 of the disclosed embodiment is described in detail above with reference to fig. 6. In the embodiment of the disclosure, the handheld terminal, rather than the aircraft, is configured with the measurement unit to measure obstacles and with the control unit to perform the complex autonomous flight control, which reduces the load burden and the computational burden of the aircraft to a certain extent. More stable autonomous flight and more complex control modes thus become possible even when the aircraft's airframe is light and its load capacity is low, greatly improving the autonomous flight capability of lightweight aircraft. In addition, obstacles can be determined more accurately through the user's autonomous selection, and user requirements can be matched more flexibly, so that the various application scenarios of lightweight aircraft are better served.
In the following, a flow chart of the main steps of a method of airspace measurement of an aircraft according to an embodiment of the present disclosure is described with reference to fig. 7. The airspace measurement method 700 may be performed, for example, by the handheld terminal 100 shown in fig. 1.
In step S710, position information of at least one target object within a peripheral space in which the aircraft is located is measured.
In step S720, a surrounding space in which the aircraft is located is modeled according to the position information of the at least one target object to generate modeling information.
In step S730, control information based on at least the modeling information is generated.
In step S740, control information based on at least the modeling information is transmitted to the aircraft.
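Pulling the earlier sketches together, steps S710 through S740 might look as follows on the terminal side. This is a minimal sketch under the same assumptions as above (fixed airspace extent, hypothetical dictionary layout, helper functions from the previous examples), not the patented implementation.

```python
def airspace_measurement_method_700(measurements, obstacles, start, goal):
    """End-to-end sketch of steps S710-S740 on the terminal side, reusing
    feature_point_position, headroom_model, and plan_path from the
    sketches above. All field names are assumptions.
    """
    # S710: measure target positions around the aircraft.
    points = [feature_point_position(*m) for m in measurements]
    # S720: model the peripheral space to generate modeling information.
    free = headroom_model((20.0, 20.0, 10.0), 0.5, obstacles)
    # S730: generate control information based at least on that model.
    coords, power = plan_path(free, start, goal)
    control_information = {"path_coordinate_data": coords,
                           "flight_power_data": power,
                           "measured_points": points}
    # S740: transmit to the aircraft (see the link sketch above).
    return control_information
```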
It should be noted that the specific measurement method, modeling method, flight path planning method, and the like in the embodiments described with reference to fig. 1 are also applicable to the embodiment of fig. 7. For the sake of brevity, no further description is provided herein.
The airspace measurement method 700 of the aircraft of the disclosed embodiment is described in detail above with reference to FIG. 7. In the airspace measurement method of the embodiment of the disclosure, performing obstacle measurement, three-dimensional modeling, and the complex autonomous flight control on the handheld terminal side reduces the load burden and the computational burden of the aircraft. More stable autonomous flight and more complex control modes thus become possible even when the aircraft's airframe is light and its load capacity is low, greatly improving the autonomous flight capability of lightweight aircraft. In addition, obstacles can be determined more accurately through the user's autonomous selection, and user requirements can be matched more flexibly, so that the various application scenarios of lightweight aircraft are better served.
Next, a flowchart of the main steps of an airspace measurement method of an aircraft according to another embodiment of the present disclosure is described with reference to fig. 8. The airspace measurement method 800 may be performed, for example, by the handheld terminal 200 shown in fig. 2.
In step S810, position information of at least one target object within a peripheral space in which the aircraft is located is measured.
In step S820, control information based on at least the position information of at least one target object is generated.
In step S830, control information based on at least the position information of the at least one target object is transmitted to the aircraft.
It should be noted that the specific measurement method, modeling method, flight path planning method, and the like in the embodiments described with reference to fig. 2 are also applicable to the embodiment of fig. 8. For the sake of brevity, no further description is provided herein.
The airspace measurement method 800 of the embodiment of the present disclosure is described in detail above with reference to fig. 8. In the embodiment of the disclosure, performing obstacle measurement and the complex autonomous flight control on the handheld terminal side reduces the load burden and the computational burden of the aircraft to a certain extent. More stable autonomous flight and more complex control modes thus become possible even when the aircraft's airframe is light and its load capacity is low, greatly improving the autonomous flight capability of lightweight aircraft. In addition, obstacles can be determined more accurately through the user's autonomous selection, and user requirements can be matched more flexibly, so that the various application scenarios of lightweight aircraft are better served.
Next, a flowchart of the main steps of a control method of an aircraft according to an embodiment of the present disclosure is described with reference to fig. 9. The control method 900 may be performed, for example, by the aircraft 300 shown in FIG. 3.
In step S910, control information based on at least the modeling information is received from the handheld terminal.
In step S920, flight of the aircraft is effected based at least on the control information.
According to the embodiment of the disclosure, the modeling information is obtained by measuring the position information of at least one target object in the peripheral space where the aircraft is located by the handheld terminal and modeling the peripheral space where the aircraft is located according to the position information of the at least one target object.
It should be noted that the specific measurement method, modeling method, flight path planning method, and the like in the embodiments described with reference to fig. 1 are also applicable to the embodiment of fig. 9. For the sake of brevity, no further description is provided herein.
The aircraft control method 900 of the disclosed embodiment is described in detail above with reference to fig. 9. According to the control method of the embodiment of the disclosure, performing obstacle measurement, three-dimensional modeling, and the complex autonomous flight control on the handheld terminal side rather than the aircraft side reduces the load burden and the computational burden of the aircraft. More stable autonomous flight and more complex control modes thus become possible even when the aircraft's airframe is light and its load capacity is low, greatly improving the autonomous flight capability of lightweight aircraft. In addition, obstacles can be determined more accurately through the user's autonomous selection, and user requirements can be matched more flexibly, so that the various application scenarios of lightweight aircraft are better served.
Next, a flowchart of main steps of a control method of an aircraft according to another embodiment of the present disclosure is described with reference to fig. 10. The control method 1000 may be performed, for example, by the aircraft 400 shown in fig. 4.
In step S1010, control information based on at least the position information of at least one target object is received from the handheld terminal.
In step S1020, the peripheral space where the aircraft is located is modeled based on the control information to generate modeling data.
In step S1030, flight of the aircraft is effected based on at least the control information and the modeling data.
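For symmetry with the terminal-side sketch given for method 700, the following shows how steps S1010 through S1030 might look on the aircraft. Again this is only a sketch: the control-information field names and the reuse of the earlier helper functions are assumptions.

```python
def control_method_1000(control_information):
    """Aircraft-side sketch of steps S1010-S1030: here the terminal sends
    raw target geometry and the aircraft models and plans on board,
    reusing headroom_model and plan_path from the earlier sketches.
    """
    # S1010: control information carries the measured obstacle geometry.
    obstacles = control_information["obstacles"]        # list of Box models
    # S1020: model the peripheral space on board to get modeling data.
    free = headroom_model((20.0, 20.0, 10.0), 0.5, obstacles)
    # S1030: effect flight from the control info plus the modeling data.
    return plan_path(free, control_information["start"],
                     control_information["goal"])
```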
According to the embodiment of the disclosure, the position information of the at least one target object is obtained by measuring the position of the at least one target object in the peripheral space where the aircraft is located through the handheld terminal.
It should be noted that the specific measurement method, modeling method, flight path planning method, and the like in the embodiments described with reference to fig. 2 are also applicable to the embodiment of fig. 10. For the sake of brevity, no further description is provided herein.
The control method 1000 of the embodiment of the present disclosure is described in detail above with reference to fig. 10. In the embodiment of the disclosure, performing obstacle measurement and the complex autonomous flight control on the handheld terminal side rather than the aircraft side reduces the load burden and the computational burden of the aircraft to a certain extent. More stable autonomous flight and more complex control modes thus become possible even when the aircraft's airframe is light and its load capacity is low, greatly improving the autonomous flight capability of lightweight aircraft. In addition, obstacles can be determined more accurately through the user's autonomous selection, and user requirements can be matched more flexibly, so that the various application scenarios of lightweight aircraft are better served.
In the following, a flow chart of the main steps of a flight method of a flight system according to an embodiment of the disclosure is described with reference to fig. 11. The flying method 1100 may be performed, for example, by the flying system 500 shown in FIG. 5.
In step S1110, the handheld terminal measures position information of at least one target object in the peripheral space where the aircraft is located.
In step S1120, the handheld terminal models the surrounding space where the aircraft is located according to the position information of the at least one target object to generate modeling information.
In step S1130, control information based on at least the modeling information is generated by the handheld terminal.
In step S1140, control information based at least on the modeling information is transmitted to the aircraft by the handheld terminal.
In step S1150, control information is received by the aerial vehicle from the handheld terminal.
In step S1160, a flight of the aircraft is effectuated by the aircraft based at least on the control information.
It should be noted that the specific measurement method, modeling method, flight path planning method, and the like in the embodiments described with reference to fig. 1 are also applicable to the embodiment of fig. 11. For the sake of brevity, no further description is provided herein.
The flight method 1100 of the flight system of the embodiments of the present disclosure is described in detail above with reference to FIG. 11. According to the flight method, obstacle measurement, three-dimensional modeling, and the complex autonomous flight control are carried out on the handheld terminal side, which reduces the load burden and the computational burden of the aircraft. More stable autonomous flight and more complex control modes thus become possible even when the aircraft's airframe is light and its load capacity is low, greatly improving the autonomous flight capability of lightweight aircraft. In addition, obstacles can be determined more accurately through the user's autonomous selection, and user requirements can be matched more flexibly, so that the various application scenarios of lightweight aircraft are better served.
In the following, a flow chart of the main steps of a method of flight of a flight system according to another embodiment of the disclosure is described with reference to fig. 12. The flying method 1200 may be performed, for example, by the flying system 600 shown in FIG. 6.
In step S1210, position information of at least one target object in a surrounding space where the aircraft is located is measured by the handheld terminal.
In step S1220, control information based on at least the position information of the at least one target object is generated by the handheld terminal.
In step S1230, control information based on at least the position information of the at least one target object is transmitted to the aircraft by the handheld terminal.
In step S1240, control information is received by the aircraft from the handheld terminal.
In step S1250, the surrounding space where the aircraft is located is modeled by the aircraft based on the control information to generate modeling data.
In step S1260, a flight of the aircraft is effectuated by the aircraft based at least on the control information and the modeling data.
It should be noted that the specific measurement method, modeling method, flight path planning method, and the like in the embodiments described with reference to fig. 2 are also applicable to the embodiment of fig. 12. For the sake of brevity, no further description is provided herein.
The flight method 1200 of the flight system of the embodiment of the present disclosure is described in detail above with reference to fig. 12. In the embodiment of the disclosure, performing obstacle measurement and the complex autonomous flight control on the handheld terminal side reduces the load burden and the computational burden of the aircraft to a certain extent. More stable autonomous flight and more complex control modes thus become possible even when the aircraft's airframe is light and its load capacity is low, greatly improving the autonomous flight capability of lightweight aircraft. In addition, obstacles can be determined more accurately through the user's autonomous selection, and user requirements can be matched more flexibly, so that the various application scenarios of lightweight aircraft are better served.
According to another embodiment of the present disclosure, there is provided a handheld terminal including: a measurer; a wireless transceiver; a processor; a memory; and computer program instructions stored in the memory that, when executed by the processor, perform the steps of: controlling the measurer to measure the position information of at least one target object in the peripheral space where the aircraft is located; modeling a peripheral space where the aircraft is located according to the position information of the at least one target object to generate modeling information; generating control information based at least on the modeling information; and controlling the wireless transceiver to transmit the control information based at least on the modeling information to the aircraft.
According to another embodiment of the present disclosure, there is provided a handheld terminal including: a measurer; a wireless transceiver; a processor; a memory; and computer program instructions stored in the memory that, when executed by the processor, perform the steps of: controlling a measurer to measure the position information of at least one target object in the peripheral space where the aircraft is located; generating control information based at least on the position information of the at least one target object; and controlling the wireless transceiver to transmit the control information based on at least the position information of the at least one target object to the aircraft.
According to another embodiment of the present disclosure, there is provided an aircraft including: a wireless transceiver; a processor; a memory; and computer program instructions stored in the memory that, when executed by the processor, perform the steps of: controlling the wireless transceiver to receive control information based at least on the modeling information from the handheld terminal; and enabling flight of the aircraft based at least on the control information. The modeling information is obtained by measuring the position information of at least one target object in the peripheral space where the aircraft is located by the handheld terminal and modeling the peripheral space where the aircraft is located according to the position information of the at least one target object.
According to another embodiment of the present disclosure, there is provided an aircraft including: a wireless transceiver; a processor; a memory; and computer program instructions stored in the memory that, when executed by the processor, perform the steps of: controlling the wireless transceiver to receive control information based on at least position information of at least one target object from the handheld terminal; modeling a peripheral space in which the aircraft is located based on the control information to generate modeling data; and controlling the flight of the aircraft based at least on the control information and the modeling data. The position information of the at least one target object is obtained by the handheld terminal through measuring the position of the at least one target object in the peripheral space where the aircraft is located.
According to another embodiment of the present disclosure, a flight system is provided that includes a handheld terminal and an aerial vehicle. The hand-held terminal includes: a measurer; a first wireless transceiver; a first processor; a first memory; and first computer program instructions stored in the first memory which, when executed by the first processor, perform the steps of: controlling the measurer to measure the position information of at least one target object in the peripheral space where the aircraft is located; modeling a peripheral space where the aircraft is located according to the position information of the at least one target object to generate modeling information; generating control information based at least on the modeling information; and control the first wireless transceiver to transmit the control information based at least on the modeling information to the aircraft. The aircraft comprises: a second wireless transceiver; a second processor; a second memory; and second computer program instructions stored in the second memory that, when executed by the second processor, perform the steps of: controlling the second wireless transceiver to receive control information based on at least the modeling information from the handheld terminal; and controlling the flight of the aircraft based at least on the control information.
According to another embodiment of the present disclosure, a flight system is provided that includes a handheld terminal and an aerial vehicle. The hand-held terminal includes: a measurer; a first wireless transceiver; a first processor; a first memory; and first computer program instructions stored in the first memory which, when executed by the first processor, perform the steps of: controlling a measurer to measure the position information of at least one target object in the peripheral space where the aircraft is located; generating control information based at least on the position information of the at least one target object; and controlling the first wireless transceiver to transmit the control information based on at least the position information of the at least one target object to the aircraft. The aircraft comprises: a second wireless transceiver; a second processor; a second memory; and second computer program instructions stored in the second memory that, when executed by the second processor, perform the steps of: controlling the second wireless transceiver to receive control information based on at least position information of at least one target object from the handheld terminal; modeling a peripheral space in which the aircraft is located based on the control information to generate modeling data; and controlling the flight of the aircraft based at least on the control information and the modeling data.
In the above embodiments, the measurer may include at least one of a range finder and an image collector. Illustratively, rangefinders include, but are not limited to, laser rangefinders, lidar, ultrasonic radar, infrared lasers, and the like. Image collectors include, but are not limited to, monocular cameras, binocular cameras, infrared cameras, visible light based color or grayscale cameras, and the like.
In the above embodiments, the wireless transceivers (including the first wireless transceiver and the second wireless transceiver) may be implemented as wireless transceivers that communicate with the aircraft via various wireless communication protocols. Illustratively, the wireless transceiver may include a Bluetooth communicator, a Bluetooth Low Energy (BLE) communicator, a near-field communicator, a Wireless Local Area Network (WLAN) or Wi-Fi communicator, a Zigbee communicator, an Infrared Data Association (IrDA) communicator, a Wi-Fi Direct (WFD) communicator, an Ultra-Wideband (UWB) communicator, and an ANT+ communicator, but is not limited thereto.
In the above embodiments, the processor (including the first processor and the second processor) may be a Central Processing Unit (CPU) or other form of processing unit having data processing capability and/or instruction execution capability, and may cooperate with other components to perform desired functions.
In the above embodiments, the memories (including the first memory and the second memory) may comprise one or more computer program products, which may include various forms of computer-readable storage media, such as volatile memory and/or non-volatile memory. The volatile memory may include, for example, Random Access Memory (RAM), cache memory (cache), and/or the like. The non-volatile memory may include, for example, Read Only Memory (ROM), hard disk, flash memory, etc. One or more computer program instructions may be stored on the computer-readable storage medium and executed by a processor to implement the respective functions of the apparatuses of the embodiments of the present disclosure described above and/or other desired functions.
It should be noted that, in the present specification, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element introduced by the phrase "comprising a(n) …" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
In the disclosed embodiments, the units/modules may be implemented in software for execution by various types of processors. An identified module of executable code may, for instance, comprise one or more physical or logical blocks of computer instructions, which may, for instance, be organized as an object, procedure, or function. Nevertheless, the executables of an identified module need not be physically located together but may comprise disparate instructions stored in different locations which, when joined logically together, constitute the unit/module and achieve its stated purpose.
Where a unit/module can be implemented in software, then considering the level of existing hardware technology, those skilled in the art could also, cost aside, build corresponding hardware circuits to implement the same functions; such hardware circuits include conventional very-large-scale integration (VLSI) circuits or gate arrays and existing semiconductors such as logic chips and transistors, or other discrete components. A module may also be implemented in programmable hardware devices such as field-programmable gate arrays, programmable array logic, programmable logic devices, or the like.
The exemplary embodiments of the present disclosure described in detail above are merely illustrative, and not restrictive. Those skilled in the art will appreciate that various modifications, combinations, or sub-combinations of the embodiments may be made without departing from the spirit and principle of the disclosure, and that such modifications are intended to be within the scope of the disclosure.

Claims (45)

1. A handheld terminal comprising:
a measuring unit that measures position information of at least one target object in the peripheral space where the bionic ornithopter is located;
a modeling unit that models the peripheral space where the bionic ornithopter is located according to the position information of the at least one target object to generate modeling information;
a control unit that generates control information based on at least the modeling information; and
a communication unit that transmits the control information based on at least the modeling information to the bionic ornithopter,
wherein the modeling unit includes:
a flight airspace modeling subunit that models the flight airspace of the bionic ornithopter according to the position information of at least one first target object among the at least one target object to obtain a flight airspace model; and
a headroom modeling subunit that determines whether the flight airspace is headroom, and
upon determining that the flight airspace is not headroom, for at least one second object among the at least one object, determining a geometric model of each of the at least one second object, and removing the geometric model from the flight airspace model to obtain a headroom airspace model, wherein the at least one second object is different from the at least one first object; or
when the flight airspace is determined to be headroom, determining the flight airspace model as the headroom airspace model.
2. The handheld terminal of claim 1, wherein the communication unit transmits headroom airspace model data associated with the headroom airspace model to the bionic ornithopter such that the bionic ornithopter performs flight path planning based at least on the headroom airspace model data.
3. The handheld terminal of claim 1, wherein the communication unit receives bionic ornithopter position information from the bionic ornithopter,
and wherein the control unit comprises a planning subunit that performs a flight path planning for the bionic ornithopter based on at least the modeling information and the bionic ornithopter position information.
4. The hand-held terminal of claim 1, wherein the measurement unit further measures bionic ornithopter relative position information,
and wherein the control unit comprises a planning subunit that performs a flight path planning for the bionic ornithopter based on at least the modeling information and the bionic ornithopter relative position information.
5. The handheld terminal of claim 3 or 4, wherein the planning subunit further determines at least one flight mode from a plurality of preset flight modes, and performs flight path planning based on: the at least one flight mode; the headroom airspace model; and the bionic ornithopter relative position information or the bionic ornithopter position information.
6. The handheld terminal of claim 3 or 4, wherein the planning subunit further generates at least one of path coordinate data and flight power data associated with the flight path plan,
and wherein the communication unit transmits the at least one of the path coordinate data and the flight power data to the bionic ornithopter.
7. The handheld terminal of claim 3, wherein the measurement unit further measures bionic ornithopter relative position information and the communication unit further transmits the bionic ornithopter relative position information to the bionic ornithopter, or
the bionic ornithopter position information is measured by the bionic ornithopter via a position sensor;
and wherein the flight path planning is based at least on the bionic ornithopter relative position information or the bionic ornithopter position information.
8. The handheld terminal of claim 7, wherein the communication unit sends a flight control command to the bionic ornithopter indicating at least one flight mode of a plurality of preset flight modes, such that the bionic ornithopter performs the flight path planning based on at least: the at least one flight mode; the headroom airspace model; and the bionic ornithopter relative position information or the bionic ornithopter position information.
9. A handheld terminal comprising:
a measuring unit that measures position information of at least one target object in the peripheral space where the bionic ornithopter is located;
a control unit that generates control information based on at least position information of the at least one target object; and
a communication unit that transmits the control information based on at least the position information of the at least one target object to the bionic ornithopter,
wherein the position information of the at least one target object is used by the bionic ornithopter to model the peripheral space where the bionic ornithopter is located so as to generate modeling data, and flight path planning is performed based at least on the modeling data,
wherein the modeling of the peripheral space where the bionic ornithopter is located comprises:
modeling a flight airspace of the bionic ornithopter according to the position information of at least one first target object in the at least one target object to obtain a flight airspace model;
determining whether the flight airspace is headroom; and
upon determining that the flight airspace is not headroom, for at least one second object among the at least one object, determining a geometric model of each of the at least one second object, and removing the geometric model from the flight airspace model to obtain a headroom airspace model, wherein the at least one second object is different from the at least one first object; or
when the flight airspace is determined to be headroom, determining the flight airspace model as the headroom airspace model.
10. The handheld terminal of claim 9, wherein the measuring unit further measures bionic ornithopter relative position information and the communication unit further transmits the bionic ornithopter relative position information to the bionic ornithopter, or
the bionic ornithopter position information is measured by the bionic ornithopter via a position sensor;
and wherein the flight path planning is based at least on the bionic ornithopter relative position information or the bionic ornithopter position information.
11. The handheld terminal of claim 10, wherein the communication unit sends a flight control command to the bionic ornithopter indicating at least one flight mode of a plurality of preset flight modes, such that the bionic ornithopter performs the flight path planning based on at least: the at least one flight mode; the headroom airspace model; and the bionic ornithopter relative position information or the bionic ornithopter position information.
12. The handheld terminal of claim 2 or 9, further comprising:
and the user instruction receiving unit is used for receiving the selection of the at least one first target object and the selection of the at least one second target object by the user.
13. The handheld terminal of claim 12, wherein the user instruction receiving unit further receives a user selection of a geometry of each of the at least one second object.
14. The handheld terminal according to claim 12, wherein the user instruction receiving unit further receives user input on an attribute of the flight airspace.
15. The handheld terminal according to claim 1 or 9, wherein the control unit comprises:
and the voice recognition subunit recognizes the voice of the user to generate a voice control command for the bionic ornithopter.
16. The handheld terminal according to claim 1 or 9, wherein the measuring unit includes at least one of a distance meter and an image collector, and with a position of the handheld terminal as a base point, for each of the at least one target object, a distance of each of at least one feature point of the target object with respect to the base point is obtained.
17. The handheld terminal according to claim 1 or 9, wherein the communication unit further receives flight status information of the bionic ornithopter,
and wherein the control unit further monitors the flight status of the bionic ornithopter based on the flight status information.
18. The handheld terminal of claim 1 or 9, further comprising:
and the display unit is used for displaying data acquired by an airborne camera of the bionic flapping wing aircraft.
19. The handheld terminal of claim 1 or 9, wherein the bionic ornithopter returns to the current position of the handheld terminal under at least one of the following conditions:
the signal strength received by the bionic ornithopter from the communication unit is lower than a first preset threshold;
the communication unit sends a return command to the bionic ornithopter; and
the battery level of the bionic ornithopter is lower than a second preset threshold.
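The three return-to-home triggers of claim 19 reduce to a simple disjunction. In the sketch below the threshold values are illustrative assumptions; the patent only calls them a first and a second preset threshold.

```python
def should_return(signal_dbm: float, return_commanded: bool,
                  battery_pct: float,
                  min_signal_dbm: float = -85.0,   # first preset threshold (assumed value)
                  min_battery_pct: float = 20.0    # second preset threshold (assumed value)
                  ) -> bool:
    """True if any of claim 19's conditions for returning to the
    handheld terminal's current position holds."""
    return (signal_dbm < min_signal_dbm
            or return_commanded
            or battery_pct < min_battery_pct)
```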
20. A bionic ornithopter, comprising:
a communication unit that receives, from a handheld terminal, control information based at least on modeling information; and
a control unit that controls flight of the bionic ornithopter based at least on the control information,
wherein the modeling information is obtained by the handheld terminal by measuring position information of at least one target object in the peripheral space in which the bionic ornithopter is located and modeling that peripheral space according to the position information of the at least one target object,
wherein the modeling of the peripheral space in which the bionic ornithopter is located according to the position information of the at least one target object comprises:
modeling, by the handheld terminal, a flight airspace of the bionic ornithopter according to the position information of at least one first target object among the at least one target object to obtain a flight airspace model,
and further comprises:
determining whether the flight airspace is a clearance airspace; and
upon determining that the flight airspace is not a clearance airspace, determining, for at least one second target object among the at least one target object, a geometric model of each of the at least one second target object, and removing the geometric models from the flight airspace model to obtain a clearance airspace model, wherein the at least one second target object is different from the at least one first target object; or
upon determining that the flight airspace is a clearance airspace, taking the flight airspace model as the clearance airspace model.
21. The bionic ornithopter of claim 20, wherein the communication unit receives, from the handheld terminal, clearance airspace model data associated with the clearance airspace model,
and wherein the control unit comprises a planning subunit that performs flight path planning based at least on the clearance airspace model data.
22. The bionic ornithopter of claim 20, further comprising a position sensor that measures position information of the bionic ornithopter, wherein the communication unit further transmits the position information to the handheld terminal,
and wherein the control information comprises flight path planning information for the bionic ornithopter, obtained by the handheld terminal based at least on the modeling information and the position information of the bionic ornithopter.
23. The bionic ornithopter of claim 20, wherein the control information comprises flight path planning information for the bionic ornithopter, obtained by the handheld terminal based at least on the modeling information and relative position information of the bionic ornithopter measured by the handheld terminal.
24. The bionic ornithopter of claim 22 or 23, wherein the flight path planning information comprises at least one of path coordinate data and flight power data.
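Claim 24's flight path planning information — path coordinate data and flight power data — can be pictured as a small message structure. The field names and the JSON wire format below are assumptions of this sketch.

```python
from dataclasses import dataclass, field
from typing import List, Tuple
import json

@dataclass
class FlightPathPlan:
    """Planning information per claim 24: waypoint coordinates plus a
    flight power value (e.g. flapping power) for each path leg."""
    path_coordinates: List[Tuple[float, float, float]] = field(default_factory=list)
    flight_power: List[float] = field(default_factory=list)

    def to_message(self) -> bytes:
        """Serialize for the communication unit (assumed JSON transport)."""
        return json.dumps({"path": self.path_coordinates,
                           "power": self.flight_power}).encode("utf-8")
```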
25. A bionic ornithopter, comprising:
a communication unit that receives, from a handheld terminal, control information based at least on position information of at least one target object;
a modeling unit that models the peripheral space in which the bionic ornithopter is located, based on the control information, to generate modeling data; and
a control unit that controls flight of the bionic ornithopter based at least on the control information and the modeling data,
wherein the position information of the at least one target object is obtained by the handheld terminal by measuring the position of the at least one target object in the peripheral space in which the bionic ornithopter is located,
wherein the modeling unit includes:
a flight airspace modeling subunit that models a flight airspace of the bionic ornithopter according to the position information of at least one first target object among the at least one target object to obtain a flight airspace model; and
a clearance airspace modeling subunit that determines whether the flight airspace is a clearance airspace, and,
upon determining that the flight airspace is not a clearance airspace, determines, for at least one second target object among the at least one target object, a geometric model of each of the at least one second target object, and removes the geometric models from the flight airspace model to obtain a clearance airspace model, wherein the at least one second target object is different from the at least one first target object; or,
upon determining that the flight airspace is a clearance airspace, takes the flight airspace model as the clearance airspace model.
26. The bionic ornithopter of claim 25, wherein the control unit includes a planning subunit that performs flight path planning based at least on the modeling data.
27. The bionic ornithopter of claim 21 or 26, wherein the communication unit receives, from the handheld terminal, relative position information of the bionic ornithopter measured by the handheld terminal, or
the bionic ornithopter further comprises a position sensor that measures position information of the bionic ornithopter;
and wherein the planning subunit performs flight path planning based at least on the relative position information or the position information of the bionic ornithopter.
28. The bionic ornithopter of claim 27, wherein the communication unit further receives, from the handheld terminal, a flight control command indicating at least one flight mode among a plurality of preset flight modes,
and wherein the planning subunit performs flight path planning based at least on the following three items: the at least one flight mode; the clearance airspace model; and the relative position information or the position information of the bionic ornithopter.
29. The bionic ornithopter of claim 21 or 26, wherein the planning subunit generates at least one of the following data associated with the flight path planning: path coordinate data and flight power data.
30. The bionic ornithopter of claim 20 or 25, wherein the measuring, by the handheld terminal, of the position information of the at least one target object in the peripheral space in which the bionic ornithopter is located further comprises:
receiving a user's selection of the at least one first target object; and
receiving a user's selection of the at least one second target object.
31. The bionic ornithopter of claim 30, wherein the determining of a geometric model of each of the at least one second target object includes:
receiving, by the handheld terminal, user input that determines the selection of the geometry of each of the at least one second target object.
32. The bionic ornithopter of claim 30, wherein the modeling of the flight airspace of the bionic ornithopter comprises:
receiving, by the handheld terminal, user input of attributes of the flight airspace.
33. The bionic ornithopter of claim 20 or 25, wherein the communication unit further receives voice control commands from the handheld terminal, the voice control commands being generated by the handheld terminal through voice recognition of the user's voice input.
34. The bionic ornithopter of claim 20 or 25, wherein the measuring, by the handheld terminal, of the position information of the at least one target object in the peripheral space in which the bionic ornithopter is located further comprises:
obtaining, by the handheld terminal, through at least one of a rangefinder and an image collector and taking the position of the handheld terminal as a base point, the distance of each of at least one feature point of the target object relative to the base point.
35. The bionic ornithopter of claim 20 or 25, wherein the communication unit further transmits flight state information of the bionic ornithopter to the handheld terminal so that the handheld terminal monitors the flight state of the bionic ornithopter.
36. The bionic ornithopter of claim 20 or 25, further comprising an onboard camera, wherein data acquired by the onboard camera is transmitted to the handheld terminal by the communication unit,
and wherein the data is displayed by a display unit of the handheld terminal.
37. The bionic ornithopter of claim 20 or 25, wherein the bionic ornithopter returns to the current position of the handheld terminal under at least one of the following conditions:
the signal strength received by the communication unit from the handheld terminal is lower than a first preset threshold;
the communication unit receives a return command from the handheld terminal; and
the battery level of the bionic ornithopter is lower than a second preset threshold.
38. An airspace measurement method for a bionic ornithopter, comprising:
measuring position information of at least one target object in the peripheral space in which the bionic ornithopter is located;
modeling the peripheral space in which the bionic ornithopter is located according to the position information of the at least one target object to generate modeling information;
generating control information based at least on the modeling information; and
transmitting the control information based at least on the modeling information to the bionic ornithopter,
wherein the modeling of the peripheral space in which the bionic ornithopter is located according to the position information of the at least one target object to generate modeling information comprises:
modeling a flight airspace of the bionic ornithopter according to the position information of at least one first target object among the at least one target object to obtain a flight airspace model;
determining whether the flight airspace is a clearance airspace; and
upon determining that the flight airspace is not a clearance airspace, determining, for at least one second target object among the at least one target object, a geometric model of each of the at least one second target object, and removing the geometric models from the flight airspace model to obtain a clearance airspace model, wherein the at least one second target object is different from the at least one first target object; or
upon determining that the flight airspace is a clearance airspace, taking the flight airspace model as the clearance airspace model.
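Putting the steps of claim 38 together, a terminal-side flow might look like the sketch below, reusing the earlier Box/ClearanceModel sketches. The measurements mapping and the send callable stand in for the measuring and communication units and are assumptions of this sketch.

```python
def airspace_measurement_method(measurements: dict, first_ids: list,
                                second_ids: list, send) -> ClearanceModel:
    """Terminal-side flow of claim 38: measure -> model -> control info -> send.
    `measurements` maps a target-object id to its measured Box geometry;
    `send` is whatever transport reaches the bionic ornithopter."""
    airspace = airspace_from_first_targets([measurements[i] for i in first_ids])
    obstacles = [measurements[i] for i in second_ids]
    model = build_clearance_model(airspace, obstacles)  # modeling information
    send({"clearance_model": model})                    # control information
    return model
```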
39. An airspace measurement method for a bionic ornithopter, comprising:
measuring position information of at least one target object in the peripheral space in which the bionic ornithopter is located;
generating control information based at least on the position information of the at least one target object; and
transmitting the control information based at least on the position information of the at least one target object to the bionic ornithopter,
wherein the position information of the at least one target object is used by the bionic ornithopter to model the peripheral space in which the bionic ornithopter is located so as to generate modeling data, and flight path planning is performed based at least on the modeling data,
wherein the modeling of the peripheral space in which the bionic ornithopter is located comprises:
modeling a flight airspace of the bionic ornithopter according to the position information of at least one first target object among the at least one target object to obtain a flight airspace model;
determining whether the flight airspace is a clearance airspace; and
upon determining that the flight airspace is not a clearance airspace, determining, for at least one second target object among the at least one target object, a geometric model of each of the at least one second target object, and removing the geometric models from the flight airspace model to obtain a clearance airspace model, wherein the at least one second target object is different from the at least one first target object; or
upon determining that the flight airspace is a clearance airspace, taking the flight airspace model as the clearance airspace model.
40. A control method for a bionic ornithopter, comprising:
receiving, from a handheld terminal, control information based at least on modeling information; and
controlling flight of the bionic ornithopter based at least on the control information,
wherein the modeling information is obtained by the handheld terminal by measuring position information of at least one target object in the peripheral space in which the bionic ornithopter is located and modeling that peripheral space according to the position information of the at least one target object,
wherein the modeling of the peripheral space in which the bionic ornithopter is located according to the position information of the at least one target object to generate the modeling information comprises:
modeling a flight airspace of the bionic ornithopter according to the position information of at least one first target object among the at least one target object to obtain a flight airspace model;
determining whether the flight airspace is a clearance airspace; and
upon determining that the flight airspace is not a clearance airspace, determining, for at least one second target object among the at least one target object, a geometric model of each of the at least one second target object, and removing the geometric models from the flight airspace model to obtain a clearance airspace model, wherein the at least one second target object is different from the at least one first target object; or
upon determining that the flight airspace is a clearance airspace, taking the flight airspace model as the clearance airspace model.
41. A control method for a bionic ornithopter, comprising:
receiving, from a handheld terminal, control information based at least on position information of at least one target object;
modeling the peripheral space in which the bionic ornithopter is located based on the control information to generate modeling data; and
controlling flight of the bionic ornithopter based at least on the control information and the modeling data,
wherein the position information of the at least one target object is obtained by the handheld terminal by measuring the position of the at least one target object in the peripheral space in which the bionic ornithopter is located,
wherein the modeling of the peripheral space in which the bionic ornithopter is located based on the control information to generate modeling data further comprises:
modeling a flight airspace of the bionic ornithopter according to the position information of at least one first target object among the at least one target object to obtain a flight airspace model;
determining whether the flight airspace is a clearance airspace; and
upon determining that the flight airspace is not a clearance airspace, determining, for at least one second target object among the at least one target object, a geometric model of each of the at least one second target object, and removing the geometric models from the flight airspace model to obtain a clearance airspace model, wherein the at least one second target object is different from the at least one first target object; or
upon determining that the flight airspace is a clearance airspace, taking the flight airspace model as the clearance airspace model.
42. A flight system comprising the handheld terminal of claim 1 and the bionic ornithopter of claim 20.
43. A flight system comprising the handheld terminal of claim 9 and the bionic ornithopter of claim 25.
44. A flight method for a flight system comprising a bionic ornithopter and a handheld terminal, the method comprising:
measuring, by the handheld terminal, position information of at least one target object in the peripheral space in which the bionic ornithopter is located;
modeling, by the handheld terminal, the peripheral space in which the bionic ornithopter is located according to the position information of the at least one target object to generate modeling information;
generating, by the handheld terminal, control information based at least on the modeling information;
sending, by the handheld terminal, the control information based at least on the modeling information to the bionic ornithopter;
receiving, by the bionic ornithopter, the control information from the handheld terminal; and
flying, by the bionic ornithopter, based at least on the control information,
wherein the modeling, by the handheld terminal, of the peripheral space in which the bionic ornithopter is located according to the position information of the at least one target object to generate modeling information comprises:
modeling a flight airspace of the bionic ornithopter according to the position information of at least one first target object among the at least one target object to obtain a flight airspace model;
determining whether the flight airspace is a clearance airspace; and
upon determining that the flight airspace is not a clearance airspace, determining, for at least one second target object among the at least one target object, a geometric model of each of the at least one second target object, and removing the geometric models from the flight airspace model to obtain a clearance airspace model, wherein the at least one second target object is different from the at least one first target object; or
upon determining that the flight airspace is a clearance airspace, taking the flight airspace model as the clearance airspace model.
45. A flight method for a flight system comprising a bionic ornithopter and a handheld terminal, the method comprising:
measuring, by the handheld terminal, position information of at least one target object in the peripheral space in which the bionic ornithopter is located;
generating, by the handheld terminal, control information based at least on the position information of the at least one target object;
transmitting, by the handheld terminal, the control information based at least on the position information of the at least one target object to the bionic ornithopter;
receiving, by the bionic ornithopter, the control information from the handheld terminal;
modeling, by the bionic ornithopter, the peripheral space in which the bionic ornithopter is located based on the control information to generate modeling data; and
flying, by the bionic ornithopter, based at least on the control information and the modeling data,
wherein the modeling, by the bionic ornithopter, of the peripheral space in which the bionic ornithopter is located based on the control information to generate modeling data comprises:
modeling a flight airspace of the bionic ornithopter according to the position information of at least one first target object among the at least one target object to obtain a flight airspace model;
determining whether the flight airspace is a clearance airspace; and
upon determining that the flight airspace is not a clearance airspace, determining, for at least one second target object among the at least one target object, a geometric model of each of the at least one second target object, and removing the geometric models from the flight airspace model to obtain a clearance airspace model, wherein the at least one second target object is different from the at least one first target object; or
upon determining that the flight airspace is a clearance airspace, taking the flight airspace model as the clearance airspace model.
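Claims 44 and 45 differ in where the modeling runs: on the terminal (claim 44) or on board (claim 45). The sketch below shows the on-board variant of claim 45, reusing the earlier sketches; the message keys and the method name are assumptions.

```python
class OrnithopterController:
    """On-board flow of claim 45: the handheld terminal sends raw target
    positions; the ornithopter models its peripheral space and plans."""

    def on_control_info(self, msg: dict) -> list:
        airspace = airspace_from_first_targets(msg["first_targets"])    # first target objects
        model = build_clearance_model(airspace, msg["second_targets"])  # modeling data
        # Fly along waypoints that stay inside the clearance airspace.
        return plan_path(msg.get("mode", "hover"), msg["position"], model)
```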
CN201910579055.5A 2019-06-28 2019-06-28 Handheld terminal, aircraft, airspace measurement method and control method of aircraft Active CN110299030B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910579055.5A CN110299030B (en) 2019-06-28 2019-06-28 Handheld terminal, aircraft, airspace measurement method and control method of aircraft

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910579055.5A CN110299030B (en) 2019-06-28 2019-06-28 Handheld terminal, aircraft, airspace measurement method and control method of aircraft

Publications (2)

Publication Number Publication Date
CN110299030A CN110299030A (en) 2019-10-01
CN110299030B true CN110299030B (en) 2021-11-19

Family

ID=68029522

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910579055.5A Active CN110299030B (en) 2019-06-28 2019-06-28 Handheld terminal, aircraft, airspace measurement method and control method of aircraft

Country Status (1)

Country Link
CN (1) CN110299030B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112740222A (en) * 2019-11-29 2021-04-30 深圳市大疆创新科技有限公司 Altitude determination method, aircraft and computer-readable storage medium

Family Cites Families (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103714719B * 2014-01-16 2016-02-10 天津天航创力科技有限公司 Flight navigation system based on Beidou satellite navigation
CN103941748B * 2014-04-29 2016-05-25 百度在线网络技术(北京)有限公司 Autonomous navigation method and system and map building method and system
CN105492985B (en) * 2014-09-05 2019-06-04 深圳市大疆创新科技有限公司 A kind of system and method for the control loose impediment in environment
CN108132678B (en) * 2014-09-15 2021-06-04 深圳市大疆创新科技有限公司 Flight control method of aircraft and related device
CN104457704B * 2014-12-05 2016-05-25 北京大学 Unmanned aerial vehicle target positioning system and method based on enhanced geographic information
US9671791B1 (en) * 2015-06-10 2017-06-06 Amazon Technologies, Inc. Managing unmanned vehicles
CN205179207U (en) * 2015-12-07 2016-04-20 深圳市帝翼飞科技有限公司 Unmanned aerial vehicle's imaging system
CN108780325B (en) * 2016-02-26 2022-03-18 深圳市大疆创新科技有限公司 System and method for adjusting unmanned aerial vehicle trajectory
CN105955288B (en) * 2016-07-15 2021-04-09 北京远度互联科技有限公司 Aircraft positioning and control method and system
CN106681353B * 2016-11-29 2019-10-25 南京航空航天大学 Unmanned aerial vehicle obstacle avoidance method and system based on fusion of binocular vision and optical flow
CN108496134A (en) * 2017-05-31 2018-09-04 深圳市大疆创新科技有限公司 Unmanned aerial vehicle return path planning method and device
CN206833252U (en) * 2017-06-16 2018-01-02 炬大科技有限公司 A kind of mobile electronic device
CN108701164A (en) * 2017-08-25 2018-10-23 深圳市大疆创新科技有限公司 Method, apparatus, storage medium and device for obtaining flight simulation data
CN107784841A (en) * 2017-11-22 2018-03-09 成都大学 Traffic monitoring system and monitoring method based on aircraft
CN108805906A (en) * 2018-05-25 2018-11-13 哈尔滨工业大学 A kind of moving obstacle detection and localization method based on depth map
CN109634309B (en) * 2019-02-21 2024-03-26 南京晓庄学院 Autonomous obstacle avoidance system and method for aircraft and aircraft
CN109886192A (en) * 2019-02-21 2019-06-14 彭劲松 A kind of ecological environment intelligent monitor system

Also Published As

Publication number Publication date
CN110299030A (en) 2019-10-01

Similar Documents

Publication Publication Date Title
US20210072745A1 (en) Systems and methods for uav flight control
US11797009B2 (en) Unmanned aerial image capture platform
US10447912B2 (en) Systems, methods, and devices for setting camera parameters
JP6816156B2 (en) Systems and methods for adjusting UAV orbits
US11771076B2 (en) Flight control method, information processing device, program and recording medium
US11822353B2 (en) Simple multi-sensor calibration
US10375289B2 (en) System and method for providing autonomous photography and videography
US11531336B2 (en) Systems and methods for automatically customizing operation of a robotic vehicle
JP2020098567A (en) Adaptive detection/avoidance system
US10852723B2 (en) Unmanned aerial vehicle swarm photography
WO2018045538A1 (en) Unmanned aerial vehicle, obstacle avoidance method for same, and obstacle avoidance system thereof
WO2018094626A1 (en) Unmanned aerial vehicle obstacle-avoidance control method and unmanned aerial vehicle
WO2018146803A1 (en) Position processing device, flight vehicle, position processing system, flight system, position processing method, flight control method, program, and recording medium
WO2020087297A1 (en) Unmanned aerial vehicle testing method and apparatus, and storage medium
WO2021199449A1 (en) Position calculation method and information processing system
WO2020107454A1 (en) Method and apparatus for accurately locating obstacle, and computer readable storage medium
CN110299030B (en) Handheld terminal, aircraft, airspace measurement method and control method of aircraft
KR20210075647A (en) Method and apparatus for learning of controlling aviation of unmanned aviation vehicle by using depth camera
WO2022047709A1 (en) Method and apparatus for updating restricted area data, movable platform and computer storage medium
Ghosh et al. Arduino quadcopter
Ngo et al. UAV Platforms for Autonomous Navigation in GPS-Denied Environments for Search and Rescue Missions
WO2022126397A1 (en) Data fusion method and device for sensor, and storage medium
CN116804883B (en) Unmanned aerial vehicle obstacle avoidance method and device
US20230296793A1 (en) Motion-Based Calibration Of An Aerial Device
Manjanoor Towards Autonomous Drone Racing

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant