CN106647814A - System and method for unmanned aerial vehicle vision-aided positioning and flight control based on two-dimensional code landmark recognition - Google Patents

System and method for unmanned aerial vehicle vision-aided positioning and flight control based on two-dimensional code landmark recognition

Info

Publication number
CN106647814A
CN106647814A
Authority
CN
China
Prior art keywords
information
target
module
unmanned plane
camera
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201611092540.2A
Other languages
Chinese (zh)
Other versions
CN106647814B (en)
Inventor
刘磊
谯睿智
王永骥
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huazhong University of Science and Technology
Original Assignee
Huazhong University of Science and Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huazhong University of Science and Technology filed Critical Huazhong University of Science and Technology
Priority to CN201611092540.2A priority Critical patent/CN106647814B/en
Publication of CN106647814A publication Critical patent/CN106647814A/en
Application granted granted Critical
Publication of CN106647814B publication Critical patent/CN106647814B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00: Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/12: Target-seeking control

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Navigation (AREA)

Abstract

The present invention discloses a system and method for unmanned aerial vehicle (UAV) vision-aided positioning and flight control based on two-dimensional code landmark recognition. The system comprises a UAV body, a sensor module, a tracking-trajectory generation module, a vision processing module, a sensor update module, a flight control module, a vision-aided control switching module, an instruction output module and a camera. Two-dimensional code markers arranged at specific positions along the route are visually extracted, and the result is fused with the inertial navigation system to calculate accurate position and attitude information, thereby assisting and improving the precision of the GPS integrated navigation system; the two-dimensional coded information also provides diversified information guidance for the UAV and expands the diversity of flight tasks. In addition, the present invention provides a cascaded flight control system with deviation-based adaptive compensation control, which realizes a smooth transition between the marker-recognized state and the unrecognized state, improves the stability of flight control, and improves the accuracy of recognition.

Description

A UAV vision-aided positioning and flight control system and method based on two-dimensional code landmark recognition
Technical field
The invention belongs to the technical field of unmanned aerial vehicles (UAVs), and more particularly relates to a UAV vision-aided positioning and flight control system and method based on two-dimensional code landmark recognition.
Background art
In recent years, with the development of intelligence science and control science, unmanned aerial vehicles have become a popular research topic. At present, UAVs are widely used in fields such as aerial photography, earth surveying and mapping, geological and fire rescue, and traffic monitoring. UAVs not only have practical social application value but also important research significance in engineering and science, and fields such as agricultural plant protection, power line inspection, forest fire prevention and disaster inspection offer vast potential for development.
In automatic UAV flight, traditional integrated navigation technology is limited by GPS accuracy, with a position accuracy of only about ±2 m. In applications with higher requirements on route-following and hovering accuracy, such as express logistics delivery, disaster relief support, shipboard operation and automatic return for charging, additional equipment is usually needed to help the UAV reach the flight target point with improved accuracy, which imposes certain limitations.
Summary of the invention
In view of the above drawbacks or improvement needs of the prior art, the present invention provides a UAV vision-aided positioning and flight control system based on accurate two-dimensional code landmark recognition. Several markers in the form of two-dimensional codes are arranged at specific positions along the route as key points. By visually extracting the two-dimensional code markers and fusing the result with the inertial navigation system, accurate position and attitude information is calculated, which assists and improves the precision of the traditional GPS integrated navigation system; at the same time, the coded information of the two-dimensional codes provides diversified information guidance for the UAV and expands the diversity of flight tasks. In addition, a cascaded flight control system with deviation-based adaptive compensation control is proposed, which realizes a smooth transition between the marker-recognized state and the unrecognized state, improves the stability of flight control, and further improves the accuracy and rapidity of recognition. This solves the technical problems of the prior art that traditional integrated navigation is limited by GPS accuracy, that positioning accuracy is relatively low, and that additional equipment is needed to help the UAV reach the flight target point with improved accuracy.
To achieve the above object, according to one aspect of the present invention, there is provided a UAV vision-aided positioning and flight control system based on two-dimensional code landmark recognition, characterized by comprising: a UAV body, a sensor module, a tracking-trajectory generation module, a vision processing module, a sensor update module, a flight control module, an instruction output module, a vision-aided control switching module and a camera, wherein:
The sensor module is used to obtain the position information of the UAV body and a first motion velocity vector of the UAV body;
The tracking-trajectory generation module is used to generate a route tracking trajectory according to preset task waypoint information, and to discretize the route tracking trajectory into N expected waypoints, where N is a positive integer;
The vision processing module is used to obtain the position information, attitude information and coded information of the two-dimensional code marker from the image of the marker acquired by the camera, and to derive from them a deviation distance vector of the camera relative to the two-dimensional code marker and a second motion velocity vector of the camera relative to the marker;
The sensor update module is used to fuse the position information of the UAV body, the first motion velocity vector, the deviation distance vector and the second motion velocity vector through a Kalman filtering algorithm, obtaining the Kalman-filtered target position information, target first motion velocity vector, target deviation distance vector and target second motion velocity vector of the UAV body;
The flight control module is used to generate a guidance command through deviation-based adaptive compensation from the expected position and expected velocity vector of the target expected waypoint, the target position information, the target first motion velocity vector, the target deviation distance vector and the target second motion velocity vector, and to send the guidance command to the instruction output module, wherein the target expected waypoint is the waypoint towards which the UAV is currently heading, and the guidance command includes a roll angle and a pitch angle;
The vision-aided control switching module is used to control the flight control module to calculate the guidance command from the information obtained by both the sensor module and the vision processing module when the two-dimensional code marker is in the recognized state, and to calculate the guidance command only from the information obtained by the sensor module when the marker is in the unrecognized state;
The instruction output module is used to output the guidance command.
Preferably, the camera is located at the bottom of the UAV body, and the field of view of the camera points vertically downward.
Preferably, the vision processing module includes an image graying module, an image binarization module, a binary image processing module, a two-dimensional code information extraction module and a position and attitude acquisition module, wherein:
The image graying module is used to convert the image of the two-dimensional code marker into a single-channel grayscale image;
The image binarization module is used to set a fixed threshold according to the single-channel grayscale image and convert the grayscale image into a binary image;
The binary image processing module is used to perform contour detection on the binary image, traverse all four-sided polygons in the binary image, reject polygons whose area is smaller than a preset threshold, and then rectify the remaining four-sided polygons by projection to obtain standard square images;
The two-dimensional code information extraction module is used to extract the binary coded information and corner point information from the square image according to a preset coding rule;
The position and attitude acquisition module is used to obtain, from the extracted binary coded information and corner point information, the deviation distance vector of the camera relative to the two-dimensional code marker and the second motion velocity vector of the camera relative to the marker.
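For illustration only, the sub-modules above can be chained roughly as follows. This is a minimal sketch assuming an OpenCV (version 4 or later) implementation; the function name, threshold values, output size and the omitted corner-ordering step are illustrative assumptions rather than details taken from the patent.

```python
# Minimal sketch of the marker detection pipeline described above: graying,
# fixed-threshold binarization, contour detection, four-sided polygon filtering,
# and projection of each remaining quadrilateral onto a standard square image.
# All names, thresholds and sizes here are illustrative assumptions.
import cv2
import numpy as np

def detect_marker_candidates(frame_bgr, bin_threshold=100, min_area=400, out_size=96):
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)                      # single-channel grayscale
    _, binary = cv2.threshold(gray, bin_threshold, 255, cv2.THRESH_BINARY)  # fixed threshold
    contours, _ = cv2.findContours(binary, cv2.RETR_LIST, cv2.CHAIN_APPROX_SIMPLE)

    candidates = []
    for contour in contours:
        # Keep only four-sided polygons whose area exceeds the preset threshold.
        approx = cv2.approxPolyDP(contour, 0.03 * cv2.arcLength(contour, True), True)
        if len(approx) != 4 or cv2.contourArea(approx) < min_area:
            continue
        # Project the quadrilateral onto a standard square for decoding
        # (consistent corner ordering is assumed and not checked here).
        src = approx.reshape(4, 2).astype(np.float32)
        dst = np.array([[0, 0], [out_size - 1, 0],
                        [out_size - 1, out_size - 1], [0, out_size - 1]], dtype=np.float32)
        warp = cv2.getPerspectiveTransform(src, dst)
        square = cv2.warpPerspective(gray, warp, (out_size, out_size))
        candidates.append((src, square))   # image-space corners + rectified patch
    return candidates
```

The rectified square patch would then be handed to the coding-rule extraction and to the pose computation described below.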
According to another aspect of the present invention, there is provided a UAV vision-aided positioning and flight control method based on two-dimensional code landmark recognition, characterized by comprising:
S1: obtaining the position information of the UAV and a first motion velocity vector of the UAV;
S2: generating a route tracking trajectory according to preset task waypoint information, and discretizing the route tracking trajectory into N expected waypoints, where N is a positive integer (a discretization sketch is given after this list of steps);
S3: obtaining the position information, attitude information and coded information of the two-dimensional code marker from the image of the marker acquired by the camera, and deriving from them the deviation distance vector of the camera relative to the marker and the second motion velocity vector of the camera relative to the marker;
S4: fusing the position information of the UAV, the first motion velocity vector, the deviation distance vector and the second motion velocity vector through a Kalman filtering algorithm, obtaining the Kalman-filtered target position information, target first motion velocity vector, target deviation distance vector and target second motion velocity vector of the UAV;
S5: generating a guidance command through deviation-based adaptive compensation from the expected position and expected velocity vector of the target expected waypoint, the target position information, the target first motion velocity vector, the target deviation distance vector and the target second motion velocity vector, wherein the target expected waypoint is the waypoint towards which the UAV is currently heading, and the guidance command includes a roll angle and a pitch angle;
S6: outputting the guidance command.
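As referenced in step S2 above, the discretization of the route tracking trajectory into N expected waypoints can be sketched as follows. This is an illustrative sketch only; the function name and the spacing parameter are assumptions and are not taken from the patent.

```python
# Illustrative sketch for step S2: discretize a route defined by the task
# waypoints into N expected waypoints at roughly uniform spacing along each
# segment. The spacing value and function name are assumptions.
import numpy as np

def discretize_route(task_waypoints, spacing_m=2.0):
    """task_waypoints: (M, 3) array of positions; returns an (N, 3) array of expected waypoints."""
    pts = np.asarray(task_waypoints, dtype=float)
    expected = [pts[0]]
    for p0, p1 in zip(pts[:-1], pts[1:]):
        seg = p1 - p0
        steps = max(int(np.ceil(np.linalg.norm(seg) / spacing_m)), 1)
        for k in range(1, steps + 1):
            expected.append(p0 + seg * (k / steps))   # interior points plus the segment end
    return np.vstack(expected)

# Example: three task waypoints expand into a denser list of expected waypoints.
expected_waypoints = discretize_route([[0, 0, 5], [10, 0, 5], [10, 10, 5]], spacing_m=2.0)
```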
Preferably, the camera is located at the bottom of the UAV, and the field of view of the camera points vertically downward.
Preferably, step S3 specifically includes the following sub-steps:
S301: converting the image of the two-dimensional code marker into a single-channel grayscale image;
S302: setting a fixed threshold according to the single-channel grayscale image, and converting the grayscale image into a binary image;
S303: performing contour detection on the binary image, traversing all four-sided polygons in the binary image, rejecting polygons whose area is smaller than a preset threshold, and then rectifying the remaining four-sided polygons by projection to obtain a standard square image;
S304: extracting binary coded information and corner point information from the square image according to a preset coding rule;
S305: obtaining, from the extracted binary coded information and corner point information, the deviation distance vector of the camera relative to the two-dimensional code marker and the second motion velocity vector of the camera relative to the marker.
In general, compared with the prior art, the above technical solutions conceived by the present invention mainly have the following technical advantages:
(1) By recognizing two-dimensional code markers arranged at specific positions on the ground, landmark information is obtained through two-dimensional code encoding technology, and multiple sensor measurements are fused to improve the positioning accuracy of the UAV, thereby assisting and improving the precision of the traditional GPS integrated navigation system. Moreover, since the two-dimensional code can carry rich landmark information and supports encryption, it can provide diversified information guidance for the UAV and thereby expand the diversity of flight tasks;
(2) The same cascaded flight control system is used in both the marker-recognized and unrecognized states, and a deviation-based adaptive compensation method is proposed that compensates for the additional position information obtained in the marker-recognized state. This realizes a smooth transition between the recognized and unrecognized states, improves the stability of flight control, and ensures that the rotor UAV can achieve fast and accurate recognition under various disturbance environments.
Brief description of the drawings
Fig. 1 is a hardware architecture diagram of high-precision autonomous UAV flight disclosed in an embodiment of the present invention;
Fig. 2 is a structural diagram of a UAV vision-aided positioning and flight control system based on two-dimensional code landmark recognition disclosed in an embodiment of the present invention;
Fig. 3 is an information interaction diagram of the modules of the UAV vision-aided positioning and flight control system based on two-dimensional code landmark recognition disclosed in an embodiment of the present invention;
Fig. 4 is a flow chart of a UAV vision-aided positioning and flight control method based on two-dimensional code landmark recognition disclosed in an embodiment of the present invention;
Fig. 5 is a flow chart of high-precision autonomous UAV flight disclosed in an embodiment of the present invention.
Detailed description of the embodiments
In order to make the objects, technical solutions and advantages of the present invention clearer, the present invention is further described below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described here only serve to explain the present invention and are not intended to limit it. In addition, the technical features involved in the embodiments of the invention described below can be combined with each other as long as they do not conflict.
Fig. 1 shows a hardware architecture diagram of high-precision autonomous UAV flight disclosed in an embodiment of the present invention. In the architecture shown in Fig. 1, the upper-left part is the navigation sensor set, which may include an accelerometer, a gyroscope, an ultrasonic sensor, a barometer, a magnetometer and a GPS module; each sensor can communicate with the flight control mainboard in the lower-left part through I2C and SPI interfaces. The lower-right part is the camera mounted on the UAV, which can communicate with the vision processing mainboard in the upper-right part through a USB 2.0 interface, and the vision processing mainboard can communicate with the flight control mainboard through a TTL serial port.
The flight control mainboard may use an STM32F407 embedded processor with a main frequency of 168 MHz. The navigation sensors may specifically include an MPU6050 gyroscope and accelerometer, an MS5611 high-precision barometer, an M8N GPS receiver and a US-100 ultrasonic rangefinder. The vision processing mainboard may use an S5P4418 high-performance processor with a main frequency of 1.4 GHz and 1 GB of DDR3 memory. The camera may be a KS2A17, communicating with the vision processing mainboard over USB 2.0, with a maximum frame rate of 120 fps at a resolution of 640 × 480. The vision processing mainboard can exchange data with the flight control mainboard over a TTL serial connection.
Fig. 2 is a structural diagram of a UAV vision-aided positioning and flight control system based on two-dimensional code landmark recognition disclosed in an embodiment of the present invention, and Fig. 3 is an information interaction diagram of the modules of that system. As shown in Fig. 2 and Fig. 3, the system of the present invention includes a UAV body, a sensor module, a tracking-trajectory generation module, a vision processing module, a sensor update module, a flight control module, an instruction output module, a vision-aided control switching module and a camera.
The sensor module is used to obtain the position information of the UAV body and the first motion velocity vector of the UAV body;
The tracking-trajectory generation module is used to generate a route tracking trajectory according to preset task waypoint information, and to discretize the route tracking trajectory into N expected waypoints, where N is a positive integer;
The vision processing module is used to obtain the position information, attitude information and coded information of the two-dimensional code marker from the image of the marker acquired by the camera, and to derive from them the deviation distance vector of the camera relative to the two-dimensional code marker and the second motion velocity vector of the camera relative to the marker.
The camera is located at the bottom of the UAV body, and the field of view of the camera points vertically downward.
The vision processing module includes an image graying module, an image binarization module, a binary image processing module, a two-dimensional code information extraction module and a position and attitude acquisition module, wherein:
The image graying module is used to convert the image of the two-dimensional code marker into a single-channel grayscale image;
The image binarization module is used to set a fixed threshold according to the single-channel grayscale image and convert the grayscale image into a binary image;
The binary image processing module is used to perform contour detection on the binary image, traverse all four-sided polygons in the binary image, reject polygons whose area is smaller than a preset threshold, and then rectify the remaining four-sided polygons by projection to obtain standard square images;
The two-dimensional code information extraction module is used to extract the binary coded information and corner point information from the square image according to a preset coding rule;
The position and attitude acquisition module is used to obtain, from the extracted binary coded information and corner point information, the deviation distance vector of the camera relative to the two-dimensional code marker and the second motion velocity vector of the camera relative to the marker.
The size of the two-dimensional code marker is m cm × m cm, and the corner point information of the detected marker gives positions in the image coordinates of the camera. Since the waypoint flight error is compensated in the subsequent steps, the real-world coordinates of the four corner points of the two-dimensional code marker are uniformly specified as (m, m, 0), (m, 0, 0), (0, m, 0) and (0, 0, 0). According to the camera imaging principle s·m′ = A·[R|T]·M, where A is the camera intrinsic matrix, which can be obtained by experimental calibration, m′ is the projected point in the image coordinate system of the camera, M is the corresponding point in the real-world coordinate system, and [R|T] is the rotation-translation matrix, i.e. the position and attitude of the camera relative to a point in the real-world coordinate system, the deviation distance vector of the camera relative to the two-dimensional code marker and the second motion velocity vector of the camera relative to the marker can be solved.
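One common way to recover [R|T] from the four marker corners is a perspective-n-point (PnP) solve. The sketch below assumes an OpenCV implementation and an already-calibrated intrinsic matrix A; the function and variable names, and the finite-difference estimate of the relative velocity, are illustrative assumptions rather than the patent's exact procedure.

```python
# Sketch of recovering the camera pose relative to the marker from its four
# corners, following s*m' = A*[R|T]*M. Assumes OpenCV and a calibrated camera;
# the names and the finite-difference velocity estimate are illustrative assumptions.
import cv2
import numpy as np

def marker_pose(corners_px, m_cm, camera_matrix, dist_coeffs):
    """corners_px: (4, 2) image corners ordered to match the world corners below."""
    m = m_cm / 100.0  # marker side length in metres
    world_corners = np.array([[m, m, 0], [m, 0, 0], [0, m, 0], [0, 0, 0]], dtype=np.float32)
    ok, rvec, tvec = cv2.solvePnP(world_corners, corners_px.astype(np.float32),
                                  camera_matrix, dist_coeffs)
    R, _ = cv2.Rodrigues(rvec)      # rotation part of [R|T]
    deviation = -R.T @ tvec         # camera position expressed in the marker frame
    return ok, R, deviation.ravel()

# A relative velocity (the "second motion velocity vector") can then be estimated
# by differencing successive deviation vectors: v_vision = (d_k - d_{k-1}) / dt
```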
The sensor update module is used to fuse the position information of the UAV body, the first motion velocity vector, the deviation distance vector and the second motion velocity vector through a Kalman filtering algorithm, obtaining the Kalman-filtered target position information, target first motion velocity vector, target deviation distance vector and target second motion velocity vector of the UAV body.
Measurement accuracy can be improved by designing a Kalman filter that fuses the multi-sensor information. In the state update of the Kalman filter, θ and γ are the pitch angle and roll angle in the rotation matrix R, V is the UAV velocity vector in real-world coordinates, a is the UAV acceleration in real-world coordinates, a_b is the acceleration in the UAV body frame, which can be obtained from the accelerometer on the UAV, w_b is the angular velocity vector in the UAV body frame, which can be obtained from the gyroscope on the UAV, and Δt is the filter update interval.
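To make the fusion concrete, the following is a minimal sketch of a Kalman-style predict/update cycle of the kind described above, in which the GPS position and the vision-derived position (known marker position plus the measured deviation vector) are treated as two position measurements of the same state. The state layout, noise parameters and equations here are illustrative assumptions; the patent's own state-update formula is not reproduced.

```python
# Hedged sketch of the multi-sensor fusion: an inertial prediction using the body
# acceleration a_b rotated into the world frame, corrected by the GPS position and
# by the vision-derived position. State layout and noise values are assumptions.
import numpy as np

def predict(x, P, R_wb, a_b, dt, q=0.5):
    """x = [p(3), v(3)] in world coordinates; R_wb rotates body-frame vectors to the world frame."""
    g = np.array([0.0, 0.0, -9.81])
    a_w = R_wb @ a_b + g                               # world-frame acceleration
    F = np.eye(6)
    F[0:3, 3:6] = dt * np.eye(3)                       # constant-acceleration kinematics
    x = F @ x + np.concatenate([0.5 * a_w * dt**2, a_w * dt])
    P = F @ P @ F.T + q * np.eye(6)
    return x, P

def update_position(x, P, z_pos, r=1.0):
    """Generic position update, applied once with the GPS fix and once with the
    vision fix (marker position plus deviation vector), each with its own noise r."""
    H = np.hstack([np.eye(3), np.zeros((3, 3))])
    S = H @ P @ H.T + r * np.eye(3)
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z_pos - H @ x)
    P = (np.eye(6) - K @ H) @ P
    return x, P
```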
The flight control module is used to generate a guidance command through deviation-based adaptive compensation from the expected position and expected velocity vector of the target expected waypoint, the target position information, the target first motion velocity vector, the target deviation distance vector and the target second motion velocity vector, and to send the guidance command to the instruction output module, wherein the target expected waypoint is the waypoint towards which the UAV is currently heading, and the guidance command includes a roll angle and a pitch angle.
The control objective of high-precision flight is to make the position of the UAV converge into a sufficiently small neighbourhood of the target expected waypoint. In the vision-aided high-precision flight phase, because the measurement accuracy of the vision equipment is better than that of the traditional navigation equipment, the input quantities of the controller need to be compensated; for the velocity loop:
Verr(t) = [Vd(t) - w3·V(t)] - w4·Vvision(t)
wherein Pd(t) and Vd(t) are respectively the desired position and desired velocity input vectors of the UAV; Perr(t) and Verr(t) are respectively the error input vectors of the position outer-loop controller and the velocity inner-loop controller; P(t) and V(t) are respectively the position vector and motion velocity vector of the UAV calculated by the traditional GPS integrated navigation system; T(t) and Vvision(t) are respectively the deviation distance vector and motion velocity vector of the UAV relative to the target two-dimensional code marker calculated by the vision processing module; and w1, w2, w3 and w4 are compensation weight coefficients. Typically w1 = w2 and w3 = w4 can be used; the compensation weight coefficients can take fixed values, or they can be determined adaptively, with
w2 = 1 - w1
During UAV flight, the accuracy of the information extracted from the marker is affected by the limited field of view of the camera and by disturbances in the actual flight environment. A traditional UAV control system adopts different control strategies before the ground marker is recognized and after recognition succeeds, so the flight control module may switch frequently between the recognized and unrecognized states, making the control unstable. The present invention uses the same cascaded flight control module in both the recognized and unrecognized states of the ground marker, i.e. a single flight control module, and proposes a deviation-based adaptive compensation control method that compensates for the additional position and motion velocity information obtained in the marker-recognized state. This realizes a smooth transition between the recognized and unrecognized states, improves the stability of flight control, and ensures that the UAV can achieve high-precision flight under various environments.
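The compensation described above can be sketched as follows. The position-loop term is assumed to mirror the velocity-loop formula, and the linear ramping of the vision weight when the marker appears or disappears is an illustrative assumption used only to show how a smooth transition between the two states could be obtained; none of the numeric values come from the patent.

```python
# Sketch of deviation-based adaptive compensation for the cascaded controller.
# The velocity-loop error follows Verr(t) = [Vd(t) - w3*V(t)] - w4*Vvision(t);
# the position-loop form and the weight ramp are illustrative assumptions.
import numpy as np

class DeviationCompensator:
    def __init__(self, ramp_rate=0.05):
        self.w_vision = 0.0          # weight on the vision measurement (w2, w4)
        self.ramp_rate = ramp_rate   # maximum weight change per control tick

    def step(self, marker_visible, Pd, Vd, P, V, T, V_vision):
        # Ramp the vision weight up while the marker is recognized and down while it
        # is not, so the controller inputs change smoothly instead of switching abruptly.
        target = 1.0 if marker_visible else 0.0
        delta = np.clip(target - self.w_vision, -self.ramp_rate, self.ramp_rate)
        self.w_vision = float(np.clip(self.w_vision + delta, 0.0, 1.0))

        w_nav = 1.0 - self.w_vision                           # e.g. w2 = 1 - w1, w4 = 1 - w3
        P_err = (Pd - w_nav * P) - self.w_vision * T          # assumed position-loop analogue
        V_err = (Vd - w_nav * V) - self.w_vision * V_vision   # velocity-loop formula above
        return P_err, V_err
```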
The instruction output module is used to output the guidance command.
Fig. 4 is a flow chart of a UAV vision-aided positioning and flight control method based on two-dimensional code landmark recognition disclosed in an embodiment of the present invention. The method shown in Fig. 4 comprises the following steps:
S1: obtaining the position information of the UAV and the first motion velocity vector of the UAV;
S2: generating a route tracking trajectory according to preset task waypoint information, and discretizing the route tracking trajectory into N expected waypoints, where N is a positive integer;
S3: accurately recognizing the pre-arranged two-dimensional code marker from the image acquired by the camera;
The implementation of step S3 is: obtaining the position information, attitude information and coded information of the pre-arranged two-dimensional code marker from the image acquired by the camera, and deriving from them the deviation distance vector of the camera relative to the marker and the second motion velocity vector of the camera relative to the marker;
S4: fusing the information of the recognized two-dimensional code marker, the position information of the UAV and the first motion velocity vector through a Kalman filtering algorithm;
The specific implementation of step S4 is: fusing the position information of the UAV, the first motion velocity vector, the deviation distance vector and the second motion velocity vector through a Kalman filtering algorithm, obtaining the Kalman-filtered target position information, target first motion velocity vector, target deviation distance vector and target second motion velocity vector of the UAV.
S5: generating a guidance command from the information obtained after Kalman filtering, wherein the guidance command includes a roll angle and a pitch angle;
The specific implementation of step S5 is: generating a guidance command through deviation-based adaptive compensation from the expected position and expected velocity vector of the target expected waypoint, the target position information, the target first motion velocity vector, the target deviation distance vector and the target second motion velocity vector, wherein the target expected waypoint is the waypoint towards which the UAV is currently heading, and the guidance command includes a roll angle and a pitch angle.
S6: outputting the guidance command.
Fig. 5 is a flow chart of high-precision autonomous UAV flight disclosed in an embodiment of the present invention. In Fig. 5 there are three task waypoints, of which task waypoints (n) and (n+1) are key waypoints with markers deployed on the ground: two-dimensional code 1 and two-dimensional code 2. When flying through waypoint (n-1), the UAV uses only the traditional GPS integrated navigation system. When flying through task waypoints (n) and (n+1), the position, attitude and coded information of the ground markers can be extracted to assist the traditional GPS integrated navigation system.
It will be readily understood by those skilled in the art that the foregoing is only preferred embodiments of the present invention and is not intended to limit the present invention. Any modification, equivalent replacement and improvement made within the spirit and principles of the present invention shall be included within the protection scope of the present invention.

Claims (6)

1. A UAV vision-aided positioning and flight control system based on two-dimensional code landmark recognition, characterized by comprising: a UAV body, a sensor module, a tracking-trajectory generation module, a vision processing module, a sensor update module, a flight control module, an instruction output module, a vision-aided control switching module and a camera, wherein:
The sensor module is used to obtain the position information of the UAV body and a first motion velocity vector of the UAV body;
The tracking-trajectory generation module is used to generate a route tracking trajectory according to preset task waypoint information, and to discretize the route tracking trajectory into N expected waypoints, where N is a positive integer;
The vision processing module is used to obtain the position information, attitude information and coded information of the two-dimensional code marker from the image of the marker acquired by the camera, and to derive from them a deviation distance vector of the camera relative to the two-dimensional code marker and a second motion velocity vector of the camera relative to the marker;
The sensor update module is used to fuse the position information of the UAV body, the first motion velocity vector, the deviation distance vector and the second motion velocity vector through a Kalman filtering algorithm, obtaining the Kalman-filtered target position information, target first motion velocity vector, target deviation distance vector and target second motion velocity vector of the UAV body;
The flight control module is used to generate a guidance command through deviation-based adaptive compensation from the expected position and expected velocity vector of the target expected waypoint, the target position information, the target first motion velocity vector, the target deviation distance vector and the target second motion velocity vector, and to send the guidance command to the instruction output module, wherein the target expected waypoint is the waypoint towards which the UAV is currently heading, and the guidance command includes a roll angle and a pitch angle;
The vision-aided control switching module is used to control the flight control module to calculate the guidance command from the information obtained by both the sensor module and the vision processing module when the two-dimensional code marker is in the recognized state, and to calculate the guidance command only from the information obtained by the sensor module when the marker is in the unrecognized state;
The instruction output module is used to output the guidance command.
2. The system according to claim 1, characterized in that the camera is located at the bottom of the UAV body, and the field of view of the camera points vertically downward.
3. The system according to claim 1, characterized in that the vision processing module includes an image graying module, an image binarization module, a binary image processing module, a two-dimensional code information extraction module and a position and attitude acquisition module, wherein:
The image graying module is used to convert the image of the two-dimensional code marker into a single-channel grayscale image;
The image binarization module is used to set a fixed threshold according to the single-channel grayscale image and convert the grayscale image into a binary image;
The binary image processing module is used to perform contour detection on the binary image, traverse all four-sided polygons in the binary image, reject polygons whose area is smaller than a preset threshold, and then rectify the remaining four-sided polygons by projection to obtain a standard square image;
The two-dimensional code information extraction module is used to extract binary coded information and corner point information from the square image according to a preset coding rule;
The position and attitude acquisition module is used to obtain, from the extracted binary coded information and corner point information, the deviation distance vector of the camera relative to the two-dimensional code marker and the second motion velocity vector of the camera relative to the marker.
4. A UAV vision-aided positioning and flight control method based on two-dimensional code landmark recognition, characterized by comprising:
S1: obtaining the position information of a UAV and a first motion velocity vector of the UAV;
S2: generating a route tracking trajectory according to preset task waypoint information, and discretizing the route tracking trajectory into N expected waypoints, where N is a positive integer;
S3: obtaining the position information, attitude information and coded information of a two-dimensional code marker from the image of the marker acquired by a camera, and deriving from them a deviation distance vector of the camera relative to the marker and a second motion velocity vector of the camera relative to the marker;
S4: fusing the position information of the UAV, the first motion velocity vector, the deviation distance vector and the second motion velocity vector through a Kalman filtering algorithm, obtaining the Kalman-filtered target position information, target first motion velocity vector, target deviation distance vector and target second motion velocity vector of the UAV;
S5: generating a guidance command through deviation-based adaptive compensation from the expected position and expected velocity vector of the target expected waypoint, the target position information, the target first motion velocity vector, the target deviation distance vector and the target second motion velocity vector, wherein the target expected waypoint is the waypoint towards which the UAV is currently heading, and the guidance command includes a roll angle and a pitch angle;
S6: outputting the guidance command.
5. The method according to claim 4, characterized in that the camera is located at the bottom of the UAV, and the field of view of the camera points vertically downward.
6. The method according to claim 4, characterized in that step S3 specifically includes the following sub-steps:
S301: converting the image of the two-dimensional code marker into a single-channel grayscale image;
S302: setting a fixed threshold according to the single-channel grayscale image, and converting the grayscale image into a binary image;
S303: performing contour detection on the binary image, traversing all four-sided polygons in the binary image, rejecting polygons whose area is smaller than a preset threshold, and then rectifying the remaining four-sided polygons by projection to obtain a standard square image;
S304: extracting binary coded information and corner point information from the square image according to a preset coding rule;
S305: obtaining, from the extracted binary coded information and corner point information, the deviation distance vector of the camera relative to the two-dimensional code marker and the second motion velocity vector of the camera relative to the marker.
CN201611092540.2A 2016-12-01 2016-12-01 UAV vision-aided positioning and flight control system and method based on two-dimensional code landmark recognition Active CN106647814B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201611092540.2A CN106647814B (en) 2016-12-01 2016-12-01 UAV vision-aided positioning and flight control system and method based on two-dimensional code landmark recognition

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201611092540.2A CN106647814B (en) 2016-12-01 2016-12-01 UAV vision-aided positioning and flight control system and method based on two-dimensional code landmark recognition

Publications (2)

Publication Number Publication Date
CN106647814A true CN106647814A (en) 2017-05-10
CN106647814B CN106647814B (en) 2019-08-13

Family

ID=58814148

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201611092540.2A Active CN106647814B (en) 2016-12-01 2016-12-01 UAV vision-aided positioning and flight control system and method based on two-dimensional code landmark recognition

Country Status (1)

Country Link
CN (1) CN106647814B (en)

Cited By (29)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107194399A (en) * 2017-07-14 2017-09-22 广东工业大学 A kind of vision determines calibration method, system and unmanned plane
CN107703973A (en) * 2017-09-11 2018-02-16 广州视源电子科技股份有限公司 Trajectory tracking method and device
CN108305291A (en) * 2018-01-08 2018-07-20 武汉大学 Utilize the monocular vision positioning and orientation method of the wall advertisement comprising positioning Quick Response Code
CN108803668A (en) * 2018-06-22 2018-11-13 航天图景(北京)科技有限公司 A kind of intelligent patrol detection unmanned plane Towed bird system of static object monitoring
WO2019006767A1 (en) * 2017-07-06 2019-01-10 杨顺伟 Scenic spot navigation method and device for unmanned aerial vehicle
CN109521781A (en) * 2018-10-30 2019-03-26 普宙飞行器科技(深圳)有限公司 Unmanned plane positioning system, unmanned plane and unmanned plane localization method
WO2019056982A1 (en) * 2017-09-21 2019-03-28 索尼公司 Apparatus and method in wireless communication system and computer readable storage medium
CN110325940A (en) * 2018-06-29 2019-10-11 深圳市大疆创新科技有限公司 A kind of flight control method, equipment, system and storage medium
CN110375747A (en) * 2019-08-26 2019-10-25 华东师范大学 A kind of inertial navigation system of interior unmanned plane
CN110446159A (en) * 2019-08-12 2019-11-12 上海工程技术大学 A kind of system and method for interior unmanned plane accurate positioning and independent navigation
CN110543989A (en) * 2019-08-29 2019-12-06 中国南方电网有限责任公司 Power transmission line machine patrol operation safety early warning method and device and computer equipment
CN110673619A (en) * 2019-10-21 2020-01-10 深圳市道通智能航空技术有限公司 Flight attitude control method and device, unmanned aerial vehicle and storage medium
CN111121744A (en) * 2018-10-30 2020-05-08 千寻位置网络有限公司 Positioning method and device based on sensing unit, positioning system and mobile terminal
CN111323789A (en) * 2020-03-19 2020-06-23 苏州思维慧信息科技有限公司 Ground topography scanning device and method based on unmanned aerial vehicle and solid-state radar
CN111580551A (en) * 2020-05-06 2020-08-25 杭州电子科技大学 Navigation system and method based on visual positioning
CN111930133A (en) * 2020-07-20 2020-11-13 贵州电网有限责任公司 Transformer substation secondary screen cabinet inspection method based on rotor unmanned aerial vehicle
CN112040175A (en) * 2020-07-31 2020-12-04 深圳供电局有限公司 Unmanned aerial vehicle inspection method and device, computer equipment and readable storage medium
CN112147995A (en) * 2019-06-28 2020-12-29 深圳市创客工场科技有限公司 Robot motion control method and device, robot and storage medium
CN112381464A (en) * 2020-12-07 2021-02-19 北京小米松果电子有限公司 Shared vehicle scheduling method and device and storage medium
CN112859923A (en) * 2021-01-25 2021-05-28 西北工业大学 Unmanned aerial vehicle vision formation flight control system
CN113238580A (en) * 2021-06-03 2021-08-10 一飞智控(天津)科技有限公司 Method and system for switching static placement deviation and dynamic flight deviation of unmanned aerial vehicle
CN113657256A (en) * 2021-08-16 2021-11-16 大连海事大学 Unmanned ship-borne unmanned aerial vehicle sea-air cooperative visual tracking and autonomous recovery method
TWI746973B (en) * 2018-05-09 2021-11-21 大陸商北京外號信息技術有限公司 Method for guiding a machine capable of autonomous movement through optical communication device
CN113776523A (en) * 2021-08-24 2021-12-10 武汉第二船舶设计研究所 Low-cost navigation positioning method and system for robot and application
CN114104310A (en) * 2021-12-31 2022-03-01 重庆高新区飞马创新研究院 Device and method for assisting unmanned aerial vehicle in landing based on GPS and AprilTag
CN114237262A (en) * 2021-12-24 2022-03-25 陕西欧卡电子智能科技有限公司 Automatic mooring method and system for unmanned ship on water
CN114326766A (en) * 2021-12-03 2022-04-12 深圳先进技术研究院 Vehicle-mounted machine cooperative autonomous tracking and landing method
CN114489102A (en) * 2022-01-19 2022-05-13 上海复亚智能科技有限公司 Self-inspection method and device for electric power tower, unmanned aerial vehicle and storage medium
CN115586798A (en) * 2022-12-12 2023-01-10 广东电网有限责任公司湛江供电局 Unmanned aerial vehicle anti-crash method and system

Citations (3)


Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160209850A1 (en) * 2014-12-09 2016-07-21 Embry-Riddle Aeronautical University, Inc. System and method for robust nonlinear regulation control of unmanned aerial vehicles syntetic jet actuators
CN105184776A (en) * 2015-08-17 2015-12-23 中国测绘科学研究院 Target tracking method
CN105388905A (en) * 2015-10-30 2016-03-09 深圳一电航空技术有限公司 Unmanned aerial vehicle flight control method and device

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
姚瑶, 刘磊, 王永骥: "Aircraft attitude control based on coupling compensation and active disturbance rejection", Proceedings of the 34th Chinese Control Conference *

Cited By (42)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019006767A1 (en) * 2017-07-06 2019-01-10 杨顺伟 Scenic spot navigation method and device for unmanned aerial vehicle
CN107194399A (en) * 2017-07-14 2017-09-22 广东工业大学 A kind of vision determines calibration method, system and unmanned plane
CN107194399B (en) * 2017-07-14 2023-05-09 广东工业大学 Visual calibration method, system and unmanned aerial vehicle
CN107703973A (en) * 2017-09-11 2018-02-16 广州视源电子科技股份有限公司 Trajectory tracking method and device
US11076328B2 (en) 2017-09-21 2021-07-27 Sony Corporation Apparatus and method in wireless communication system and computer readable storage medium
WO2019056982A1 (en) * 2017-09-21 2019-03-28 索尼公司 Apparatus and method in wireless communication system and computer readable storage medium
US11653276B2 (en) 2017-09-21 2023-05-16 Sony Group Corporation Apparatus and method in wireless communication system and computer readable storage medium
CN108305291A (en) * 2018-01-08 2018-07-20 武汉大学 Utilize the monocular vision positioning and orientation method of the wall advertisement comprising positioning Quick Response Code
CN108305291B (en) * 2018-01-08 2022-02-01 武汉大学 Monocular vision positioning and attitude determination method utilizing wall advertisement containing positioning two-dimensional code
US11338920B2 (en) 2018-05-09 2022-05-24 Beijing Whyhow Information Technology Co., Ltd. Method for guiding autonomously movable machine by means of optical communication device
TWI746973B (en) * 2018-05-09 2021-11-21 大陸商北京外號信息技術有限公司 Method for guiding a machine capable of autonomous movement through optical communication device
CN108803668A (en) * 2018-06-22 2018-11-13 航天图景(北京)科技有限公司 A kind of intelligent patrol detection unmanned plane Towed bird system of static object monitoring
CN108803668B (en) * 2018-06-22 2021-08-24 中国南方电网有限责任公司超高压输电公司广州局 Intelligent inspection unmanned aerial vehicle nacelle system for static target monitoring
CN110325940A (en) * 2018-06-29 2019-10-11 深圳市大疆创新科技有限公司 A kind of flight control method, equipment, system and storage medium
CN111121744A (en) * 2018-10-30 2020-05-08 千寻位置网络有限公司 Positioning method and device based on sensing unit, positioning system and mobile terminal
CN109521781A (en) * 2018-10-30 2019-03-26 普宙飞行器科技(深圳)有限公司 Unmanned plane positioning system, unmanned plane and unmanned plane localization method
CN112147995B (en) * 2019-06-28 2024-02-27 深圳市创客工场科技有限公司 Robot motion control method and device, robot and storage medium
CN112147995A (en) * 2019-06-28 2020-12-29 深圳市创客工场科技有限公司 Robot motion control method and device, robot and storage medium
CN110446159A (en) * 2019-08-12 2019-11-12 上海工程技术大学 A kind of system and method for interior unmanned plane accurate positioning and independent navigation
CN110375747A (en) * 2019-08-26 2019-10-25 华东师范大学 A kind of inertial navigation system of interior unmanned plane
CN110543989A (en) * 2019-08-29 2019-12-06 中国南方电网有限责任公司 Power transmission line machine patrol operation safety early warning method and device and computer equipment
CN110673619B (en) * 2019-10-21 2022-06-17 深圳市道通智能航空技术股份有限公司 Flight attitude control method and device, unmanned aerial vehicle and storage medium
CN110673619A (en) * 2019-10-21 2020-01-10 深圳市道通智能航空技术有限公司 Flight attitude control method and device, unmanned aerial vehicle and storage medium
CN111323789B (en) * 2020-03-19 2023-11-03 陕西思地三维科技有限公司 Ground morphology scanning device and method based on unmanned aerial vehicle and solid-state radar
CN111323789A (en) * 2020-03-19 2020-06-23 苏州思维慧信息科技有限公司 Ground topography scanning device and method based on unmanned aerial vehicle and solid-state radar
CN111580551A (en) * 2020-05-06 2020-08-25 杭州电子科技大学 Navigation system and method based on visual positioning
CN111930133A (en) * 2020-07-20 2020-11-13 贵州电网有限责任公司 Transformer substation secondary screen cabinet inspection method based on rotor unmanned aerial vehicle
CN112040175A (en) * 2020-07-31 2020-12-04 深圳供电局有限公司 Unmanned aerial vehicle inspection method and device, computer equipment and readable storage medium
CN112381464A (en) * 2020-12-07 2021-02-19 北京小米松果电子有限公司 Shared vehicle scheduling method and device and storage medium
CN112859923A (en) * 2021-01-25 2021-05-28 西北工业大学 Unmanned aerial vehicle vision formation flight control system
CN113238580A (en) * 2021-06-03 2021-08-10 一飞智控(天津)科技有限公司 Method and system for switching static placement deviation and dynamic flight deviation of unmanned aerial vehicle
CN113657256B (en) * 2021-08-16 2023-09-26 大连海事大学 Unmanned aerial vehicle sea-air cooperative vision tracking and autonomous recovery method
CN113657256A (en) * 2021-08-16 2021-11-16 大连海事大学 Unmanned ship-borne unmanned aerial vehicle sea-air cooperative visual tracking and autonomous recovery method
CN113776523B (en) * 2021-08-24 2024-03-19 武汉第二船舶设计研究所 Robot low-cost navigation positioning method, system and application
CN113776523A (en) * 2021-08-24 2021-12-10 武汉第二船舶设计研究所 Low-cost navigation positioning method and system for robot and application
CN114326766A (en) * 2021-12-03 2022-04-12 深圳先进技术研究院 Vehicle-mounted machine cooperative autonomous tracking and landing method
CN114237262B (en) * 2021-12-24 2024-01-19 陕西欧卡电子智能科技有限公司 Automatic berthing method and system for unmanned ship on water surface
CN114237262A (en) * 2021-12-24 2022-03-25 陕西欧卡电子智能科技有限公司 Automatic mooring method and system for unmanned ship on water
CN114104310A (en) * 2021-12-31 2022-03-01 重庆高新区飞马创新研究院 Device and method for assisting unmanned aerial vehicle in landing based on GPS and AprilTag
CN114489102A (en) * 2022-01-19 2022-05-13 上海复亚智能科技有限公司 Self-inspection method and device for electric power tower, unmanned aerial vehicle and storage medium
CN115586798B (en) * 2022-12-12 2023-03-24 广东电网有限责任公司湛江供电局 Unmanned aerial vehicle anti-crash method and system
CN115586798A (en) * 2022-12-12 2023-01-10 广东电网有限责任公司湛江供电局 Unmanned aerial vehicle anti-crash method and system

Also Published As

Publication number Publication date
CN106647814B (en) 2019-08-13

Similar Documents

Publication Publication Date Title
CN106647814B (en) UAV vision-aided positioning and flight control system and method based on two-dimensional code landmark recognition
US12007792B2 (en) Backup navigation system for unmanned aerial vehicles
US11693428B2 (en) Methods and system for autonomous landing
CN109270953B (en) Multi-rotor unmanned aerial vehicle autonomous landing method based on concentric circle visual identification
CN109945858B (en) Multi-sensing fusion positioning method for low-speed parking driving scene
CN105352495B (en) Acceleration and light stream Data Fusion of Sensor unmanned plane horizontal velocity control method
De Marina et al. Guidance algorithm for smooth trajectory tracking of a fixed wing UAV flying in wind flows
CN110058602A (en) Multi-rotor unmanned aerial vehicle autonomic positioning method based on deep vision
CN105759829A (en) Laser radar-based mini-sized unmanned plane control method and system
CN111426320B (en) Vehicle autonomous navigation method based on image matching/inertial navigation/milemeter
CN107389968B (en) Unmanned aerial vehicle fixed point implementation method and device based on optical flow sensor and acceleration sensor
CN102190081B (en) Vision-based fixed point robust control method for airship
Li et al. UAV autonomous landing technology based on AprilTags vision positioning algorithm
Zhao et al. Homography-based vision-aided inertial navigation of UAVs in unknown environments
Bi et al. A lightweight autonomous MAV for indoor search and rescue
Schofield et al. Autonomous power line detection and tracking system using UAVs
Deng et al. Visual–inertial estimation of velocity for multicopters based on vision motion constraint
Wang et al. Monocular vision and IMU based navigation for a small unmanned helicopter
Subramanian et al. Integrating computer vision and photogrammetry for autonomous aerial vehicle landing in static environment
Cui et al. Landmark extraction and state estimation for UAV operation in forest
Agarwal et al. Monocular vision based navigation and localisation in indoor environments
CN207379510U (en) Unmanned plane indoor locating system based on cooperative target and monocular vision
Yigit et al. Visual attitude stabilization of a unmanned helicopter in unknown environments with an embedded single-board computer
Lei et al. A high performance altitude navigation system for small rotorcraft unmanned aircraft
Xiaodong et al. Obstacle avoidance for outdoor flight of a quadrotor based on computer vision

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant