CN205353775U - Unmanned aerial vehicle - Google Patents


Info

Publication number
CN205353775U
CN205353775U (application CN201620081348.2U)
Authority
CN
China
Prior art keywords
landing
unmanned aerial vehicle
site
coordinate
landing site
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201620081348.2U
Other languages
Chinese (zh)
Inventor
谭圆圆
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
High Domain (Beijing) Intelligent Technology Research Institute Co Ltd
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Family has litigation
First worldwide family litigation filed Critical https://patents.darts-ip.com/?family=56184361&utm_source=google_patent&utm_medium=platform_link&utm_campaign=public_patent_search&patent=CN205353775(U) "Global patent litigation dataset" by Darts-ip is licensed under a Creative Commons Attribution 4.0 International License.
Application filed by Individual filed Critical Individual
Priority to CN201620081348.2U priority Critical patent/CN205353775U/en
Application granted granted Critical
Publication of CN205353775U publication Critical patent/CN205353775U/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Toys (AREA)

Abstract

The utility model provides an unmanned aerial vehicle comprising an aircraft body, a first communication module, an onboard camera, a comparison chip, and a flight control module. The first communication module receives the landing coordinates of a landing site and a landing-site reference image, sends the landing coordinates to the flight control module, and sends the landing-site reference image to the comparison chip. The onboard camera captures a photograph of the surroundings at the landing coordinates and sends the surroundings photograph to the comparison chip. The comparison chip receives the surroundings photograph and the landing-site reference image and sends the landing position to the flight control module. The flight control module controls the aircraft body to fly to the landing coordinates and performs the automatic landing operation according to the landing position. By combining preliminary positioning from the landing coordinates with fine positioning from the landing-site reference image, the unmanned aerial vehicle of the utility model achieves high-accuracy automatic landing at a low implementation cost.

Description

Unmanned aerial vehicle
Technical field
This utility model relates to the field of unmanned aerial vehicle (UAV) control, and in particular to an unmanned aerial vehicle.
Background technology
Small and micro unmanned aerial vehicles, especially multi-rotor aircraft, are currently developing rapidly. Because small and micro unmanned aerial vehicles take off and land flexibly and fly quickly, they can be applied in many fields such as surveying and monitoring, security, and aerial photography.
Control of existing small and micro unmanned aerial vehicles generally falls into two classes, fully automatic and fully manual; under either mode, the automatic landing of the unmanned aerial vehicle is an important technical problem. The takeoff position of an unmanned aerial vehicle is usually set manually, so taking off only requires performing the lift-off action. Compared with takeoff, the landing of an unmanned aerial vehicle must meet at least two requirements: the landing must be safe and the landing position must be accurate.
Under fully manual control, landing accuracy depends mainly on the skill of the operator and on environmental effects, and is therefore rather random. Under fully automatic control there is as yet no good way to land accurately; automatic landing based on satellite navigation positioning often lands inaccurately because of deviations in the navigation coordinates.
The utility model patent with application number 201510599276.0 discloses a small unmanned aerial vehicle landing method and system that performs preliminary positioning of the unmanned aerial vehicle with a landing-pad marker to achieve automatic landing. However, the unmanned aerial vehicle can only land towards the landing pad, the landing pad must be set up in advance, and the choice of landing points is very limited.
It is therefore necessary to provide an unmanned aerial vehicle that solves the problems of the prior art.
Utility model content
Embodiments of this utility model provide an unmanned aerial vehicle that is inexpensive to implement and lands with high accuracy, to solve the technical problem that the automatic landing of existing unmanned aerial vehicles is either costly to implement or poor in landing accuracy.
An embodiment of this utility model provides an unmanned aerial vehicle, comprising:
a first communication module, arranged on the aircraft body, for receiving the landing coordinates and landing-site reference image of a landing site, sending the landing coordinates to the flight control module, and sending the landing-site reference image to the comparison chip;
an onboard camera, arranged on the aircraft body, for capturing a photograph of the surroundings at the landing coordinates and sending the surroundings photograph to the comparison chip;
the comparison chip, arranged on the aircraft body, for receiving the surroundings photograph and the landing-site reference image and sending the landing position to the flight control module; and
the flight control module, arranged on the aircraft body, for controlling the aircraft body to fly to the landing coordinates and performing the automatic landing operation according to the landing position.
In the unmanned aerial vehicle of this utility model, the unmanned aerial vehicle further includes:
a landing-site storage database, arranged on the aircraft body, for storing the landing coordinates and landing-site reference images of landing sites.
In the unmanned aerial vehicle of this utility model, the unmanned aerial vehicle further includes:
a global positioning system, arranged on the aircraft body, for sending the current position coordinates of the unmanned aerial vehicle to the flight control module.
In the unmanned aerial vehicle of this utility model, the unmanned aerial vehicle further includes a flight parameter sensor for detecting the flight parameters of the aircraft body.
In the unmanned aerial vehicle of this utility model, the flight parameter sensor includes an attitude sensor and an airspeed sensor.
In the unmanned aerial vehicle of this utility model, the airspeed sensor is a three-axis velocity sensor.
In the unmanned aerial vehicle of this utility model, the attitude sensor is a pressure altitude sensor.
In the unmanned aerial vehicle of this utility model, the unmanned aerial vehicle further includes:
a control terminal for receiving the user's input instructions and sending the landing coordinates and landing-site reference image of the landing site to the unmanned aerial vehicle.
In the unmanned aerial vehicle of this utility model, the control terminal includes:
a touch display screen for receiving the user's input instructions; and
a photo module for capturing the landing-site reference image.
In the unmanned aerial vehicle of this utility model, the control terminal further includes:
a second communication module for sending the landing coordinates and landing-site reference image of the landing site to the first communication module.
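The division of labour among the claimed onboard modules can be sketched as a set of cooperating components. This is a minimal illustration only; all class and method names are invented for the sketch, not taken from the patent.

```python
from dataclasses import dataclass, field

@dataclass
class ComparisonChip:
    """Receives the landing-site reference image and the surroundings photo."""
    reference_image: object = None
    surroundings_photo: object = None

    def locate_landing_position(self):
        # Placeholder for the image-matching step: find the reference
        # image inside the surroundings photo and return its position.
        return (0, 0)

@dataclass
class FlightControlModule:
    """Flies to the landing coordinates, then lands at the landing position."""
    landing_coordinates: tuple = None

    def fly_to(self, coordinates):
        self.landing_coordinates = coordinates

@dataclass
class UnmannedAerialVehicle:
    chip: ComparisonChip = field(default_factory=ComparisonChip)
    fcm: FlightControlModule = field(default_factory=FlightControlModule)

    def receive(self, landing_coordinates, reference_image):
        # First communication module: coordinates go to flight control,
        # the reference image goes to the comparison chip.
        self.fcm.fly_to(landing_coordinates)
        self.chip.reference_image = reference_image

    def photograph_surroundings(self, photo):
        # Onboard camera: surroundings photo goes to the comparison chip.
        self.chip.surroundings_photo = photo
```

The routing mirrors the claim language: each data item is delivered only to the module that consumes it.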
Compared with the prior art, the unmanned aerial vehicle of this utility model combines preliminary positioning from the landing coordinates with fine positioning from the landing-site reference image, achieving high-accuracy automatic landing at a low implementation cost, and thereby solving the technical problem that the automatic landing of existing unmanned aerial vehicles is either costly to implement or poor in landing accuracy.
Brief description of the drawings
Fig. 1 is a flow chart of the first preferred embodiment of the automatic landing method of the unmanned aerial vehicle of this utility model;
Fig. 2 is a flow chart of the second preferred embodiment of the automatic landing method of the unmanned aerial vehicle of this utility model;
Fig. 3 is a flow chart of the third preferred embodiment of the automatic landing method of the unmanned aerial vehicle of this utility model;
Fig. 4 is a structural diagram of the first preferred embodiment of the automatic landing device of the unmanned aerial vehicle of this utility model;
Fig. 5 is a structural diagram of the second preferred embodiment of the automatic landing device of the unmanned aerial vehicle of this utility model;
Fig. 6 is a structural diagram of the landing position determination module of the second preferred embodiment of the automatic landing device of the unmanned aerial vehicle of this utility model;
Fig. 7 is a structural diagram of the third preferred embodiment of the automatic landing device of the unmanned aerial vehicle of this utility model;
Fig. 8 is a structural diagram of the landing position determination module of the third preferred embodiment of the automatic landing device of the unmanned aerial vehicle of this utility model;
Fig. 9 is a structural diagram of the first preferred embodiment of the unmanned aerial vehicle of this utility model.
Detailed description of the invention
The technical solutions in the embodiments of this utility model are described below clearly and completely with reference to the accompanying drawings. The described embodiments are obviously only some of the embodiments of this utility model, not all of them. All other embodiments obtained by those skilled in the art on the basis of these embodiments without creative work fall within the scope of protection of this utility model.
The automatic landing method of the unmanned aerial vehicle of this utility model can be carried out by various onboard electronic devices, including but not limited to microcomputers and various intelligent mobile terminals; the electronic device is preferably a flight controller. Using the landing coordinates of a landing site together with a landing-site reference image, the electronic device achieves a high-accuracy automatic landing operation of the unmanned aerial vehicle.
Refer to Fig. 1, a flow chart of the first preferred embodiment of the automatic landing method of the unmanned aerial vehicle of this utility model. The method of this preferred embodiment can be carried out by the electronic device described above and includes:
Step S101: obtain the landing coordinates and landing-site reference image of the landing site;
Step S102: control the unmanned aerial vehicle to fly to the landing coordinates;
Step S103: capture a photograph of the surroundings at the landing coordinates and, from the surroundings photograph and the landing-site reference image, determine the landing position of the unmanned aerial vehicle;
Step S104: perform the automatic landing operation of the unmanned aerial vehicle at the landing position.
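The four steps above can be sketched as a single pipeline. In this minimal illustration every callable (site lookup, flight command, camera, matcher, descent) is an assumed stand-in supplied by the caller, not an API from the patent.

```python
def auto_land(uav, get_site, fly_to, photograph, match, descend):
    """Hypothetical sketch of steps S101-S104; all callables are assumed."""
    coords, reference_image = get_site()   # S101: coordinates + reference image
    fly_to(uav, coords)                    # S102: fly to the landing coordinates
    photo = photograph(uav)                # S103: photograph the surroundings,
    landing_position = match(photo, reference_image)  # locate reference in photo
    descend(uav, landing_position)         # S104: automatic landing operation
    return landing_position
```

Expressed this way, the coarse GPS fix narrows the search and the image match supplies the final precision, which is the core combination the patent claims.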
The specific flow of each step of the automatic landing method of this preferred embodiment is described in detail below.
In step S101, when the operator requests an automatic landing of the unmanned aerial vehicle, the automatic landing device obtains the landing coordinates and landing-site reference image of the landing site. The landing site may be preset (e.g. stored in the unmanned aerial vehicle in advance) or obtained in real time (e.g. sent to the unmanned aerial vehicle from the user's mobile phone), for example an open building rooftop or an open square.
The landing coordinates of the landing site are its latitude and longitude in the global positioning system; the landing-site reference image is a photograph of the landing site. Even when the landing coordinates are known, deviations in the landing position (e.g. positioning errors) may still occur, so a photograph of the landing site is also needed to position the landing point precisely. To make the comparison easier, the reference image should have distinguishing features, such as a large colour or visual difference from the surroundings, and should as far as possible be a top-down photograph of the landing site. The flow then passes to step S102.
In step S102, after the automatic landing device has obtained the landing coordinates of the landing site, it controls the unmanned aerial vehicle to fly to those coordinates in order to perform the automatic landing operation. The flow then passes to step S103.
In step S103, the automatic landing device captures a photograph of the surroundings at the landing coordinates with the onboard camera of the unmanned aerial vehicle; this may be a photograph of the landing area within a radius of 20-50 metres around the landing coordinates.
The automatic landing device then compares this surroundings photograph with the reference image obtained in step S101, finds the position of the reference image within the surroundings photograph, and thereby determines the precise landing position of the unmanned aerial vehicle. The flow then passes to step S104.
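One way the comparison chip's step could work is exhaustive template matching: slide the reference image over the surroundings photograph and keep the best-scoring position. The patent does not specify an algorithm, so the sum-of-squared-differences search below over 2-D lists of pixel intensities is only an illustrative stand-in.

```python
def find_reference(surroundings, reference):
    """Locate the reference image inside the surroundings photo by an
    exhaustive sum-of-squared-differences search; images are 2-D lists
    of pixel intensities. Returns the (row, col) of the best match."""
    H, W = len(surroundings), len(surroundings[0])
    h, w = len(reference), len(reference[0])
    best_score, best_pos = None, (0, 0)
    for r in range(H - h + 1):
        for c in range(W - w + 1):
            # Squared difference between the reference and this patch.
            score = sum(
                (surroundings[r + i][c + j] - reference[i][j]) ** 2
                for i in range(h) for j in range(w)
            )
            if best_score is None or score < best_score:
                best_score, best_pos = score, (r, c)
    return best_pos
```

A production system would more likely use a library routine (e.g. normalized cross-correlation) and would need to handle scale and rotation, but the position returned plays the same role as the "precise landing position" here.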
In step S104, the automatic landing device performs the automatic landing operation according to the precise landing position obtained in step S103. Specifically, the operation is usually a vertical descent; since the landing position was obtained in step S103, the unmanned aerial vehicle can apply horizontal offset corrections towards the landing position during the descent to stay directly above it, for example one correction for every 5 metres of descent, or one correction every 10 seconds, to ensure the accuracy of the automatic landing.
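The stepped descent described here (drop a fixed amount, re-measure the horizontal offset from the landing position, correct, repeat) can be sketched as a simple loop. `measure_offset` and `correct_offset` are assumed callbacks standing in for the vision and control subsystems; the 5-metre step is the example interval from the text.

```python
def stepped_descent(altitude_m, measure_offset, correct_offset, step_m=5.0):
    """Vertical descent with a horizontal correction every `step_m` metres.
    The patent also suggests a time-based interval (e.g. every 10 seconds)
    as an alternative trigger for the same correction."""
    corrections = 0
    while altitude_m > 0:
        altitude_m = max(0.0, altitude_m - step_m)  # descend one step
        dx, dy = measure_offset()                   # offset from landing position
        correct_offset(dx, dy)                      # re-centre over the site
        corrections += 1
    return corrections
```

From 20 metres with a 5-metre step this yields four correction passes before touchdown.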
This completes the automatic descent of the unmanned aerial vehicle in this preferred embodiment.
The automatic landing method of this preferred embodiment combines preliminary positioning from the landing coordinates with fine positioning from the landing-site reference image, achieving high-accuracy automatic landing; since the reference image only needs to be photographed and the landing coordinates only need to be obtained by positioning at the photographed site, the method is inexpensive to implement.
Refer to Fig. 2, a flow chart of the second preferred embodiment of the automatic landing method of the unmanned aerial vehicle of this utility model. The method of this preferred embodiment can be carried out by the electronic device described above and includes:
Step S201: preset the landing coordinates and landing-site reference image of at least one landing site;
Step S202: on receiving an automatic landing instruction, obtain the landing coordinates and landing-site reference image of the at least one landing site;
Step S203: control the unmanned aerial vehicle to fly to the landing coordinates of the landing site nearest to its current position;
Step S204: from the surroundings photograph and the landing-site reference image, determine the relative position of the reference image within the surroundings photograph;
Step S205: determine the landing position of the unmanned aerial vehicle from that relative position;
Step S206: perform the automatic landing operation of the unmanned aerial vehicle at the landing position.
The specific flow of each step of the automatic landing method of this preferred embodiment is described in detail below.
In step S201, the automatic landing device can preset the landing coordinates and landing-site reference images of several landing sites; here each landing site is obtained in advance (e.g. stored in the unmanned aerial vehicle), for example an open building rooftop or an open square.
The landing coordinates of the landing site are its latitude and longitude in the global positioning system; the landing-site reference image is a photograph of the landing site. Even when the landing coordinates are known, deviations in the landing position (e.g. positioning errors) may still occur, so a photograph of the landing site is also needed to position the landing point precisely. To make the comparison easier, the reference image should have distinguishing features, such as a large colour or visual difference from the surroundings, and should as far as possible be a top-down photograph of the landing site.
For example, if extensive user testing finds that three locations A, B, and C of a certain outdoor sports square are all suitable for automatic landing, these three locations can be set as landing sites, and the landing coordinates and corresponding reference images of A, B, and C saved. The flow then passes to step S202.
In step S202, on receiving the operator's automatic landing instruction, the automatic landing device obtains the landing coordinates and reference images of all landing sites.
Of course, if the operator has already specified the landing site, the device directly obtains the landing coordinates and reference image of that site. The flow then passes to step S203.
In step S203, from the landing sites obtained in step S202, the automatic landing device selects the landing site nearest to the current position of the unmanned aerial vehicle and controls the vehicle to fly to its landing coordinates in order to perform the automatic landing operation.
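Selecting the stored landing site nearest the current position could, for instance, compare great-circle distances between GPS coordinates. The patent names no distance metric, so the haversine formula below is only one reasonable choice for this sketch.

```python
import math

def nearest_site(current, sites):
    """Pick the stored landing site closest to the current position.
    Coordinates are (lat, lon) pairs in degrees."""
    def haversine_km(a, b):
        # Great-circle distance on a sphere of Earth's mean radius.
        lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
        dlat, dlon = lat2 - lat1, lon2 - lon1
        h = math.sin(dlat / 2) ** 2 + \
            math.cos(lat1) * math.cos(lat2) * math.sin(dlon / 2) ** 2
        return 2 * 6371.0 * math.asin(math.sqrt(h))
    return min(sites, key=lambda s: haversine_km(current, s))
```

Over the short ranges involved here a flat-earth approximation would also suffice; haversine simply stays correct for arbitrarily separated sites.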
If the operator has specified the landing site, the automatic landing device controls the unmanned aerial vehicle to fly to the landing coordinates of that site in order to perform the automatic landing operation. The flow then passes to step S204.
In step S204, the automatic landing device captures a photograph of the surroundings at the landing coordinates with the onboard camera of the unmanned aerial vehicle; this may be a photograph of the landing area within a radius of 20-50 metres around the landing coordinates.
The device then compares this surroundings photograph with the reference image obtained in step S202 and determines the relative position of the reference image within the surroundings photograph. The flow then passes to step S205.
In step S205, the automatic landing device determines the precise landing position of the unmanned aerial vehicle from the relative position obtained in step S204. The flow then passes to step S206.
In step S206, the automatic landing device performs the automatic landing operation according to the precise landing position obtained in step S205. Specifically, the operation is usually a vertical descent; since the landing position was obtained in step S205, the unmanned aerial vehicle can apply horizontal offset corrections towards the landing position during the descent to stay directly above it, for example one correction for every 5 metres of descent, or one correction every 10 seconds, to ensure the accuracy of the automatic landing.
This completes the automatic descent of the unmanned aerial vehicle in this preferred embodiment.
Building on the first preferred embodiment, the automatic landing method of this preferred embodiment presets the landing coordinates and reference images of landing sites so that even a novice can complete the automatic landing operation. When several landing sites are preset, the unmanned aerial vehicle can intelligently select the nearest one for the landing operation, further reducing the execution cost of the automatic landing operation.
Refer to Fig. 3, a flow chart of the third preferred embodiment of the automatic landing method of the unmanned aerial vehicle of this utility model. The method of this preferred embodiment can be carried out by the electronic device described above and includes:
Step S301: receive the landing coordinates and landing-site reference image of a landing site in real time from a preset terminal;
Step S302: on receiving an automatic landing instruction, obtain the landing coordinates and landing-site reference image of the landing site;
Step S303: control the unmanned aerial vehicle to fly to the landing coordinates of the most recently received landing site;
Step S304: from the surroundings photograph and the landing-site reference image, determine the relative position of the reference image within the surroundings photograph;
Step S305: determine the landing position of the unmanned aerial vehicle from that relative position;
Step S306: perform the automatic landing operation of the unmanned aerial vehicle at the landing position.
The specific flow of each step of the automatic landing method of this preferred embodiment is described in detail below.
In step S301, the automatic landing device receives the landing coordinates and landing-site reference images of landing sites in real time from a preset terminal; here each landing site is obtained in real time (e.g. sent to the unmanned aerial vehicle from the user's mobile phone), for example an open building rooftop or an open square.
The landing coordinates of the landing site are its latitude and longitude in the global positioning system; the landing-site reference image is a photograph of the landing site. Even when the landing coordinates are known, deviations in the landing position (e.g. positioning errors) may still occur, so a photograph of the landing site is also needed to position the landing point precisely. To make the comparison easier, the reference image should have distinguishing features, such as a large colour or visual difference from the surroundings, and should as far as possible be a top-down photograph of the landing site.
Here the preset terminal generates the landing-site reference image by photographing the landing point, and the coordinates at which the photograph is taken can be set as the landing coordinates of the landing site. For example, the operator photographs the chosen landing point with the hand-held preset terminal to generate the reference image, while the terminal records the GPS coordinates of the spot where the photograph was taken as the landing coordinates of the landing site. The flow then passes to step S302.
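Pairing the photograph with the GPS fix taken at the moment of shooting can be sketched as a small record type. Field and function names are illustrative, and `read_gps` / `take_photo` are assumed callbacks standing in for the terminal's sensors.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class LandingSite:
    """One landing site as produced by the preset terminal: the GPS fix
    taken when the photo is shot becomes the landing coordinates, and
    the photograph becomes the landing-site reference image."""
    latitude: float
    longitude: float
    reference_image: bytes

def capture_site(read_gps, take_photo):
    """Assumed callbacks: read_gps() -> (lat, lon); take_photo() -> bytes."""
    lat, lon = read_gps()
    return LandingSite(lat, lon, take_photo())
```

The record would then be what the second communication module transmits to the vehicle's first communication module.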
In step S302, on receiving the operator's automatic landing instruction, the automatic landing device obtains the landing coordinates and corresponding reference image of the most recently received landing site. The flow then passes to step S303.
In step S303, the automatic landing device controls the unmanned aerial vehicle to fly to the landing coordinates obtained in step S302 in order to perform the automatic landing operation. The flow then passes to step S304.
In step S304, the automatic landing device captures a photograph of the surroundings at the landing coordinates with the onboard camera of the unmanned aerial vehicle; this may be a photograph of the landing area within a radius of 20-50 metres around the landing coordinates.
The device then compares this surroundings photograph with the reference image obtained in step S302 and determines the relative position of the reference image within the surroundings photograph. The flow then passes to step S305.
In step S305, the automatic landing device determines the precise landing position of the unmanned aerial vehicle from the relative position obtained in step S304. The flow then passes to step S306.
In step S306, the automatic landing device performs the automatic landing operation according to the precise landing position obtained in step S305. Specifically, the operation is usually a vertical descent; since the landing position was obtained in step S305, the unmanned aerial vehicle can apply horizontal offset corrections towards the landing position during the descent to stay directly above it, for example one correction for every 5 metres of descent, or one correction every 10 seconds, to ensure the accuracy of the automatic landing.
This completes the automatic descent of the unmanned aerial vehicle in this preferred embodiment.
Building on the first preferred embodiment, the automatic landing method of this preferred embodiment receives in real time the landing coordinates and reference image of the landing site sent by the operator, further improving the convenience of executing the method. When several landing sites are received in real time, the unmanned aerial vehicle can intelligently select the most recently received one for the landing operation, further reducing the execution cost of the automatic landing operation.
This utility model also provides for the automatic landing-gear of a kind of unmanned vehicle, refer to the structural representation of the first preferred embodiment of the automatic landing-gear that Fig. 4, Fig. 4 are unmanned vehicle of the present utility model.The automatic landing-gear of the unmanned vehicle of this preferred embodiment can use the first preferred embodiment of above-mentioned automatic landing method to be carried out.This automatic landing-gear 40 includes jump area dot information acquisition module 41, flight control modules 42, landing place determine module 43 and landing module 44 automatically.
Jump area dot information acquisition module 41 is used for obtaining the landing coordinate in landing place and jump area with reference to figure;Flight control modules 42 is used for controlling unmanned vehicle and flies to the coordinate that lands;Landing place determines that module 43 is for obtaining the surrounding photo at landing coordinate place, according to surrounding photo and jump area with reference to figure, it is determined that the landing place of unmanned vehicle;Automatically landing module 44 for carrying out the automatic landing operation of unmanned vehicle at landing place.
When the automatic landing-gear 40 of the unmanned vehicle of this preferred embodiment uses, first when flying hands requirement unmanned vehicle and automatically landing, jump area dot information acquisition module 41 can obtain the landing landing coordinate in place and jump area with reference to figure.Here landing place can be preset the place (as being stored in advance in unmanned vehicle) for the unmanned vehicle that lands obtained or the place for the unmanned vehicle that lands (as sent place to unmanned vehicle by user mobile phone) obtained in real time, top as open in certain building or certain open square etc..
The landing coordinate in landing place is landing place longitude and latitude in global positioning system;The jump area in landing place is with reference to the photo that figure is landing place.Even if due to the landing coordinate determining landing place, it is possible to still there will be the deviation (such as problems such as deviations) in landing place, therefore there is also the need to obtain the photo in landing place, in order to jump area point is accurately positioned.For the ease of contrast location, jump area here should have certain identification characteristics with reference to figure, as having bigger colour-difference XOR vision difference etc. with surrounding enviroment;And this jump area is with reference to the top view photograph from top to bottom that figure should be landing place as far as possible.
After flight control modules 42 obtains the landing coordinate in landing place subsequently, control unmanned vehicle and fly to this landing coordinate, in order to carry out automatic landing operation.
Then landing place determines that module 43 obtains the surrounding photo at landing coordinate place by the airborne photographic head of unmanned vehicle, the landing place photo that surrounding photo here can be formed for unmanned vehicle centered by the coordinate that lands, the scope of circumference 20-50 rice is taken pictures.
Landing place determines that the jump area that this surrounding photo and jump area dot information acquisition module obtain is contrasted by module 43 with reference to figure subsequently, finds jump area with reference to the position in figure around environment photo, so that it is determined that the precision approach position of unmanned vehicle.
Finally, the automatic landing module 44 carries out the automatic landing operation of the unmanned aerial vehicle according to the precise landing position obtained by the landing position determination module 43. Specifically, the automatic landing operation of the unmanned aerial vehicle is generally a vertical descent. Since the landing position has already been obtained, the unmanned aerial vehicle can perform horizontal-offset adjustments relative to the landing position during the descent to ensure that it remains directly above the landing position. For example, a horizontal-offset adjustment may be performed for every 5 meters of descent, or for every 10 seconds of descent, to ensure the accuracy of the automatic landing.
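The periodic-correction descent described above can be sketched as a short simulation: the vehicle descends vertically, drifts a little horizontally, and re-centers over the landing position at every 5-meter checkpoint. The drift rate and the perfect re-centering are assumptions for illustration; the patent only states that periodic horizontal adjustments keep the vehicle directly above the landing position.

```python
def descend(start_alt, horiz_error, drift_per_m=0.02, step_m=5.0):
    """Simulate a vertical descent from `start_alt` meters with an
    initial horizontal error of `horiz_error` meters. The vehicle
    drifts `drift_per_m` meters sideways per meter of descent; at
    every `step_m` checkpoint a horizontal-offset correction
    re-centers it over the landing position (illustrative model).
    Returns the horizontal error (meters) at touchdown."""
    alt, err = start_alt, horiz_error
    while alt > 0:
        drop = min(step_m, alt)
        alt -= drop
        err += drift_per_m * drop   # drift accumulated in this segment
        if alt > 0:                 # checkpoint reached before touchdown
            err = 0.0               # re-center over the landing position
    return err
```

With the corrections enabled, the touchdown error is bounded by the drift of a single 5-meter segment, regardless of the starting error, which is the point of the patent's periodic adjustment.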
This completes the automatic landing process of the unmanned aerial vehicle using the automatic landing device 40 of this preferred embodiment.
By combining the preliminary positioning of the landing coordinate with the fine positioning of the landing site reference image, the automatic landing device of this preferred embodiment achieves high-accuracy automatic landing of the unmanned aerial vehicle. Moreover, the reference image only needs to be obtained by taking a photograph, and the landing coordinate only needs to be obtained by positioning at the photographing site, so this automatic landing method is relatively inexpensive to implement.
Please refer to Fig. 5, which is a structural schematic diagram of the second preferred embodiment of the automatic landing device of the unmanned aerial vehicle of the present utility model. The automatic landing device of this preferred embodiment may be implemented using the second preferred embodiment of the automatic landing method described above. This automatic landing device 50 includes a landing site information presetting module 51, a landing site information acquisition module 52, a flight control module 53, a landing position determination module 54 and an automatic landing module 55.
The landing site information presetting module 51 is used to preset the landing coordinates of the landing sites and the landing site reference images. The landing site information acquisition module 52 is used to obtain the landing coordinate of a landing site and the corresponding reference image when an automatic landing instruction is received. The flight control module 53 is used to control the unmanned aerial vehicle to fly to the landing coordinate of the landing site nearest to the current position of the unmanned aerial vehicle. The landing position determination module 54 is used to obtain the surrounding-environment photograph at the landing coordinate and, according to the surrounding-environment photograph and the reference image, determine the landing position of the unmanned aerial vehicle. The automatic landing module 55 is used to carry out the automatic landing operation of the unmanned aerial vehicle at the landing position.
Please refer to Fig. 6, which is a structural schematic diagram of the landing position determination module of the second preferred embodiment of the automatic landing device of the unmanned aerial vehicle of the present utility model. This landing position determination module 54 includes a relative position determination unit 61 and a landing position determination unit 62.
The relative position determination unit 61 is used to determine the relative position of the landing site reference image within the surrounding-environment photograph according to the surrounding-environment photograph and the reference image. The landing position determination unit 62 is used to determine the landing position of the unmanned aerial vehicle according to that relative position.
When the automatic landing device 50 of the unmanned aerial vehicle of this preferred embodiment is in use, the landing site information presetting module 51 may preset the landing coordinates and reference images of many landing sites. The landing sites here may be preset sites for landing the unmanned aerial vehicle (e.g. stored in the unmanned aerial vehicle in advance), such as the open roof of a building or an open square.
The landing coordinate of the landing site is the longitude and latitude of the landing site in the global positioning system; the landing site reference image is a photograph of the landing site. Even when the landing coordinate of the landing site has been determined, a deviation from the actual landing site may still occur (e.g. due to positioning errors), so a photograph of the landing site is also needed so that the landing point can be located accurately. To facilitate comparison and localization, the landing site reference image should have certain identifying characteristics, such as a large color or visual difference from the surrounding environment; and the reference image should, as far as possible, be a top-down photograph of the landing site.
For example, if tests with a large number of users show that three locations A, B and C of a certain outdoor sports square are all suitable for automatic landing of the unmanned aerial vehicle, these three locations can be set as landing sites, and the landing coordinates and corresponding reference images of locations A, B and C are saved respectively.
Subsequently, when the automatic landing instruction from the pilot is received, the landing site information acquisition module 52 may obtain the landing coordinates and reference images of all landing sites. Of course, if the pilot has specified the landing site of the unmanned aerial vehicle, the landing coordinate and reference image of that landing site are obtained directly.
Then the flight control module 53 selects, from the landing sites obtained by the landing site information acquisition module 52, the landing site nearest to the current position of the unmanned aerial vehicle, and controls the unmanned aerial vehicle to fly to the landing coordinate of that nearest landing site in order to carry out the automatic landing operation.
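The nearest-site selection performed by the flight control module 53 can be sketched with a great-circle (haversine) distance over the stored landing coordinates. The haversine choice and the sample coordinates are illustrative assumptions; the patent only requires picking the landing site closest to the vehicle's current position.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two (lat, lon) points."""
    R = 6371000.0  # mean Earth radius, meters
    p1, p2 = radians(lat1), radians(lat2)
    dp, dl = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dp / 2) ** 2 + cos(p1) * cos(p2) * sin(dl / 2) ** 2
    return 2 * R * asin(sqrt(a))

def nearest_site(current, sites):
    """Return the (name, lat, lon) landing site nearest to `current`."""
    lat0, lon0 = current
    return min(sites, key=lambda s: haversine_m(lat0, lon0, s[1], s[2]))

# Hypothetical preset sites A, B, C of an outdoor sports square:
sites = [("A", 39.9042, 116.4074),
         ("B", 39.9100, 116.4200),
         ("C", 39.9000, 116.3900)]
```

Given the vehicle's current GPS fix, `nearest_site` returns the preset site that minimizes flight distance before the automatic landing operation begins.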
If the pilot has specified the landing site of the unmanned aerial vehicle, the flight control module 53 controls the unmanned aerial vehicle to fly to the landing coordinate of that landing site in order to carry out the automatic landing operation. Specifically, the automatic landing operation of the unmanned aerial vehicle is generally a vertical descent. Since the landing position has already been obtained, the unmanned aerial vehicle can perform horizontal-offset adjustments relative to the landing position during the descent to ensure that it remains directly above the landing position. For example, a horizontal-offset adjustment may be performed for every 5 meters of descent, or for every 10 seconds of descent, to ensure the accuracy of the automatic landing.
The relative position determination unit 61 of the landing position determination module 54 then obtains the surrounding-environment photograph at the landing coordinate through the onboard camera of the unmanned aerial vehicle. The surrounding-environment photograph here may be a photograph of the landing site formed by the unmanned aerial vehicle photographing a range of 20-50 meters around the landing coordinate.
The relative position determination unit 61 then compares this surrounding-environment photograph with the reference image obtained by the landing site information acquisition module, and determines the relative position of the reference image within the surrounding-environment photograph.
The landing position determination unit 62 of the landing position determination module 54 determines the precise landing position of the unmanned aerial vehicle according to the relative position of the reference image within the surrounding-environment photograph obtained by the relative position determination unit 61.
Finally, the automatic landing module 55 carries out the automatic landing operation of the unmanned aerial vehicle according to the precise landing position obtained by the landing position determination module 54.
This completes the automatic landing process of the unmanned aerial vehicle using the automatic landing device 50 of this preferred embodiment.
On the basis of the first preferred embodiment, the automatic landing device of this preferred embodiment presets the landing coordinates of the landing sites and the landing site reference images so that a novice can better complete the automatic landing operation of the unmanned aerial vehicle. Moreover, when multiple landing sites are provided, the unmanned aerial vehicle can intelligently select the nearest landing site for the landing operation, which further reduces the execution cost of the automatic landing operation.
Please refer to Fig. 7, which is a structural schematic diagram of the third preferred embodiment of the automatic landing device of the unmanned aerial vehicle of the present utility model. The automatic landing device of this preferred embodiment may be implemented using the third preferred embodiment of the automatic landing method described above. This automatic landing device 70 includes a landing site information receiving module 71, a landing site information acquisition module 72, a flight control module 73, a landing position determination module 74 and an automatic landing module 75.
The landing site information receiving module 71 is used to receive, in real time from a preset terminal, the landing coordinate of a landing site and the landing site reference image. The landing site information acquisition module 72 is used to obtain the landing coordinate and reference image of a landing site when an automatic landing instruction is received. The flight control module 73 is used to control the unmanned aerial vehicle to fly to the landing coordinate of the most recently received landing site. The landing position determination module 74 is used to obtain the surrounding-environment photograph at the landing coordinate and, according to the surrounding-environment photograph and the reference image, determine the landing position of the unmanned aerial vehicle. The automatic landing module 75 is used to carry out the automatic landing operation of the unmanned aerial vehicle at the landing position.
Please refer to Fig. 8, which is a structural schematic diagram of the landing position determination module of the third preferred embodiment of the automatic landing device of the unmanned aerial vehicle of the present utility model. This landing position determination module 74 includes a relative position determination unit 81 and a landing position determination unit 82.
The relative position determination unit 81 is used to determine the relative position of the landing site reference image within the surrounding-environment photograph according to the surrounding-environment photograph and the reference image. The landing position determination unit 82 is used to determine the landing position of the unmanned aerial vehicle according to that relative position.
When the automatic landing device 70 of the unmanned aerial vehicle of this preferred embodiment is in use, the landing site information receiving module 71 first receives, in real time from a preset terminal, the landing coordinates and reference images of multiple landing sites. The landing sites here may be sites for landing the unmanned aerial vehicle obtained in real time (e.g. sent to the unmanned aerial vehicle by the user's mobile phone), such as the open roof of a building or an open square.
The landing coordinate of the landing site is the longitude and latitude of the landing site in the global positioning system; the landing site reference image is a photograph of the landing site. Even when the landing coordinate of the landing site has been determined, a deviation from the actual landing site may still occur (e.g. due to positioning errors), so a photograph of the landing site is also needed so that the landing point can be located accurately. To facilitate comparison and localization, the landing site reference image should have certain identifying characteristics, such as a large color or visual difference from the surrounding environment; and the reference image should, as far as possible, be a top-down photograph of the landing site.
Here, the preset terminal generates the landing site reference image by photographing the landing site, and the coordinate of the photographing operation can be set as the landing coordinate of the landing site. For example, the pilot holds the preset terminal and photographs the selected landing site to generate the reference image, while the preset terminal simultaneously obtains the GPS coordinate of the spot where this photographing action takes place as the landing coordinate of the landing site.
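The record produced by the preset terminal in this step, a reference photograph paired with the GPS coordinate of the photographing spot, can be sketched as a small data structure. The field names and helper function are hypothetical, not from the patent; the patent only specifies that the photo and its capture coordinate together define the landing site.

```python
from dataclasses import dataclass

@dataclass
class LandingSite:
    """Landing site record sent from the preset terminal to the UAV
    (field names are illustrative, not taken from the patent)."""
    latitude: float          # GPS latitude of the photographing spot
    longitude: float         # GPS longitude of the photographing spot
    reference_image: bytes   # top-down photo of the landing site

def make_landing_site(photo: bytes, gps_fix: tuple) -> LandingSite:
    """Pair a photographed reference image with the GPS fix taken at
    the moment of photographing, as the embodiment describes."""
    lat, lon = gps_fix
    return LandingSite(latitude=lat, longitude=lon, reference_image=photo)

# Hypothetical capture on the preset terminal:
site = make_landing_site(b"<jpeg bytes>", (39.9042, 116.4074))
```

Pairing the two at capture time is what keeps the coarse coordinate and the fine reference image consistent with each other.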
Subsequently, when the automatic landing instruction from the pilot is received, the landing site information acquisition module 72 may obtain the landing coordinate of the most recently received landing site and the corresponding landing site reference image.
Then the flight control module 73 controls the unmanned aerial vehicle to fly to the landing coordinate of the landing site obtained by the landing site information acquisition module 72, in order to carry out the automatic landing operation.
The relative position determination unit 81 of the landing position determination module 74 then obtains the surrounding-environment photograph at the landing coordinate through the onboard camera of the unmanned aerial vehicle. The surrounding-environment photograph here may be a photograph of the landing site formed by the unmanned aerial vehicle photographing a range of 20-50 meters around the landing coordinate.
The relative position determination unit 81 compares this surrounding-environment photograph with the reference image obtained by the landing site information acquisition module, and determines the relative position of the reference image within the surrounding-environment photograph.
The landing position determination unit 82 of the landing position determination module 74 then determines the precise landing position of the unmanned aerial vehicle according to the relative position of the reference image within the surrounding-environment photograph obtained by the relative position determination unit 81.
Finally, the automatic landing module 75 carries out the automatic landing operation of the unmanned aerial vehicle according to the precise landing position obtained by the landing position determination module 74. Specifically, the automatic landing operation of the unmanned aerial vehicle is generally a vertical descent. Since the landing position has already been obtained, the unmanned aerial vehicle can perform horizontal-offset adjustments relative to the landing position during the descent to ensure that it remains directly above the landing position. For example, a horizontal-offset adjustment may be performed for every 5 meters of descent, or for every 10 seconds of descent, to ensure the accuracy of the automatic landing.
This completes the automatic landing process of the unmanned aerial vehicle using the automatic landing device 70 of this preferred embodiment.
On the basis of the first preferred embodiment, the automatic landing device of this preferred embodiment receives in real time the landing coordinate of the landing site and the landing site reference image sent by the pilot, which further improves the convenience of executing this automatic landing method. Moreover, when multiple real-time landing sites are received, the unmanned aerial vehicle can intelligently select the most recently received landing site for the landing operation, which further reduces the execution cost of the automatic landing operation.
Please refer to Fig. 9, which is a structural schematic diagram of the first preferred embodiment of the unmanned aerial vehicle of the present utility model. This unmanned aerial vehicle 90 includes an aircraft body 91 and a control terminal 92. The aircraft body 91 includes a first communication module 911, an onboard camera 912, a comparison chip 913, a flight control module 914, a landing site storage database 915, a global positioning system 916 and a flight parameter sensor 917.
The first communication module 911 is used to receive the landing coordinate of the landing site and the landing site reference image, send the landing coordinate to the flight control module 914, and send the reference image to the comparison chip 913. The onboard camera 912 is used to obtain the surrounding-environment photograph at the landing coordinate and send it to the comparison chip 913. The comparison chip 913 receives the surrounding-environment photograph and the reference image, and sends the landing position to the flight control module 914. The flight control module 914 is used to control the aircraft body 91 to fly to the landing coordinate and to carry out the automatic landing operation according to the landing position. The landing site storage database 915 is used to store the landing coordinates of the landing sites and the landing site reference images. The global positioning system 916 is used to send the current position coordinate of the unmanned aerial vehicle 90 to the flight control module 914. The flight parameter sensor 917 is used to detect the flight parameters of the aircraft body 91.
The flight parameter sensor 917 includes an attitude sensor and an airspeed sensor. The airspeed sensor is preferably a three-axis velocity sensor, and the attitude sensor is preferably a pressure-altitude sensor.
The control terminal 92 includes a touch display screen 921, a photographing module 922 and a second communication module 923. The touch display screen 921 is used to receive the user's input instructions; the photographing module 922 is used to obtain the landing site reference image; the second communication module 923 is used to transmit the landing coordinate of the landing site and the reference image to the first communication module 911.
When the unmanned aerial vehicle 90 of this preferred embodiment is in use, the user first inputs an instruction through the touch display screen 921 of the control terminal 92. The photographing module 922 then photographs the landing site specified by the user according to this input instruction, obtaining the landing site reference image. The second communication module 923 then transmits the landing coordinate of the landing site and the reference image to the first communication module 911 of the aircraft body.
After receiving the landing coordinate of the landing site and the reference image, the first communication module 911 of the aircraft body 91 sends the landing coordinate to the flight control module 914 and sends the reference image to the comparison chip 913.
The flight control module 914 then controls the aircraft body 91 to fly to the landing coordinate according to the current position coordinate obtained by the global positioning system 916. The onboard camera 912 obtains the surrounding-environment photograph at the landing coordinate and sends it to the comparison chip 913.
The comparison chip 913 then compares the surrounding-environment photograph with the landing site reference image, obtains the landing position of the unmanned aerial vehicle 90, and sends this landing position to the flight control module 914.
Finally, the flight control module 914 carries out the automatic landing operation according to this landing position. Throughout the flight of the aircraft body 91, the flight parameter sensor 917 detects the flight parameters of the aircraft body in real time to ensure its normal and stable flight.
This completes the automatic landing process of the unmanned aerial vehicle 90 of this preferred embodiment.
The specific working principle of the unmanned aerial vehicle of this preferred embodiment is the same as or similar to the descriptions in the preferred embodiments of the automatic landing method and automatic landing device of the unmanned aerial vehicle above; please refer to the related descriptions in those preferred embodiments.
By combining the preliminary positioning of the landing coordinate with the fine positioning of the landing site reference image, the automatic landing method, automatic landing device and unmanned aerial vehicle of the present utility model achieve high-accuracy automatic landing of the unmanned aerial vehicle at a relatively low implementation cost, thereby solving the technical problem that the automatic landing of existing unmanned aerial vehicles is either costly to implement or poor in landing accuracy.
In summary, although the present utility model is disclosed above with preferred embodiments, these preferred embodiments are not intended to limit the present utility model. Those of ordinary skill in the art can make various changes and refinements without departing from the spirit and scope of the present utility model; therefore the protection scope of the present utility model is defined by the claims.

Claims (10)

1. An unmanned aerial vehicle, characterized by comprising:
a first communication module, arranged on an aircraft body, used to receive the landing coordinate of a landing site and a landing site reference image, send said landing coordinate to a flight control module, and send said landing site reference image to a comparison chip;
an onboard camera, arranged on said aircraft body, used to obtain the surrounding-environment photograph at the landing coordinate and send said surrounding-environment photograph to said comparison chip;
said comparison chip, arranged on said aircraft body, used to receive said surrounding-environment photograph and said landing site reference image, and to send a landing position to said flight control module; and
said flight control module, arranged on said aircraft body, used to control said aircraft body to fly to said landing coordinate and to carry out an automatic landing operation according to said landing position.
2. The unmanned aerial vehicle according to claim 1, characterized in that said unmanned aerial vehicle further comprises:
a landing site storage database, arranged on said aircraft body, used to store the landing coordinate of said landing site and the landing site reference image.
3. The unmanned aerial vehicle according to claim 1, characterized in that said unmanned aerial vehicle further comprises:
a global positioning system, arranged on said aircraft body, used to send the current position coordinate of said unmanned aerial vehicle to said flight control module.
4. The unmanned aerial vehicle according to claim 1, characterized in that said unmanned aerial vehicle further comprises a flight parameter sensor used to detect the flight parameters of said aircraft body.
5. The unmanned aerial vehicle according to claim 4, characterized in that said flight parameter sensor comprises an attitude sensor and an airspeed sensor.
6. The unmanned aerial vehicle according to claim 5, characterized in that said airspeed sensor is a three-axis velocity sensor.
7. The unmanned aerial vehicle according to claim 5, characterized in that said attitude sensor is a pressure-altitude sensor.
8. The unmanned aerial vehicle according to claim 1, characterized in that said unmanned aerial vehicle further comprises:
a control terminal, used to receive the user's input instructions and to send the landing coordinate of said landing site and the landing site reference image to said unmanned aerial vehicle.
9. The unmanned aerial vehicle according to claim 8, characterized in that said control terminal comprises:
a touch display screen, used to receive the user's input instructions; and
a photographing module, used to obtain said landing site reference image.
10. The unmanned aerial vehicle according to claim 8, characterized in that said control terminal further comprises:
a second communication module, used to transmit the landing coordinate of said landing site and the landing site reference image to said first communication module.
CN201620081348.2U 2016-01-27 2016-01-27 Unmanned aerial vehicle Active CN205353775U (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201620081348.2U CN205353775U (en) 2016-01-27 2016-01-27 Unmanned aerial vehicle

Publications (1)

Publication Number Publication Date
CN205353775U true CN205353775U (en) 2016-06-29

Family

ID=56184361

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201620081348.2U Active CN205353775U (en) 2016-01-27 2016-01-27 Unmanned aerial vehicle

Country Status (1)

Country Link
CN (1) CN205353775U (en)


Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10795354B2 (en) 2014-12-25 2020-10-06 SZ DJI Technology Co., Ltd. Flight aiding method and system for unmanned aerial vehicle, unmanned aerial vehicle, and mobile terminal
US11474516B2 (en) 2014-12-25 2022-10-18 SZ DJI Technology Co., Ltd. Flight aiding method and system for unmanned aerial vehicle, unmanned aerial vehicle, and mobile terminal
CN106708065A (en) * 2017-01-17 2017-05-24 览意科技(上海)有限公司 Unmanned aerial vehicle landing control method, device and unmanned aerial vehicle


Legal Events

Date Code Title Description
C14 Grant of patent or utility model
GR01 Patent grant
C41 Transfer of patent application or patent right or utility model
TR01 Transfer of patent right

Effective date of registration: 20170220

Address after: 100020 Beijing City West Road No. 12 Chaoyang District Building No. 2 (national advertising Industrial Park incubator 25978)

Patentee after: High domain (Beijing) Intelligent Technology Research Institute Co., Ltd.

Address before: 100052 Beijing city Shunyi District Sha Yu Xiang day north lemon Lake Villa 2123

Patentee before: Tan Yuanyuan