CN108007474A - Autonomous positioning and pose correction method for unmanned aerial vehicles based on ground markings - Google Patents

Autonomous positioning and pose correction method for unmanned aerial vehicles based on ground markings

Info

Publication number
CN108007474A
CN108007474A (Application No. CN201710766832.8A)
Authority
CN
China
Prior art keywords
coordinate system
unmanned aerial vehicle
axis
information
optical flow
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201710766832.8A
Other languages
Chinese (zh)
Inventor
张雨
遆晓光
范晋祥
刘飞
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Harbin Institute of Technology
Original Assignee
Harbin Institute of Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Harbin Institute of Technology filed Critical Harbin Institute of Technology
Priority to CN201710766832.8A priority Critical patent/CN108007474A/en
Publication of CN108007474A publication Critical patent/CN108007474A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C25/00 Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass
    • G01C25/005 Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass; initial alignment, calibration or starting-up of inertial devices

Abstract

An autonomous positioning and pose correction method for unmanned aerial vehicles based on ground markings, relating to the fields of automatic control and image processing. Its characteristic is that, in the absence of a GPS signal, the information of the inertial unit, magnetometer and optical flow module on board the aircraft is fused and ground markings are recognized with a camera to correct the position and attitude of the aircraft. The inertial unit and optical flow module provide the pitch and roll information and the position information of the aircraft, and the magnetometer provides the yaw information. The camera captures images of the ground markings; from the position and attitude of the markings in the image and in the initial geographic coordinate system, the position and attitude of the aircraft relative to the initial geographic coordinate system are computed. The pose information obtained from the camera is then used to correct the pose obtained from the inertial unit, magnetometer and optical flow module. Using a regularly spaced grid formed of white marking tape, or the joints between floor tiles, as the ground marking, the invention achieves global positioning and attitude correction of the aircraft.

Description

Autonomous positioning and pose correction method for unmanned aerial vehicles based on ground markings
Technical field
The present invention relates to the fields of automatic control and image processing, and is suitable for high-precision indoor navigation of unmanned aerial vehicles and for high-precision navigation in the absence of GPS signals.
Background art
In recent years the development of unmanned aerial vehicles (UAVs) has attracted growing attention in both civilian and military fields. Unmanned helicopters, fixed-wing UAVs, rotary-wing UAVs and the like are now widely used across many industries, for example reconnaissance and surveillance, search and rescue, and aerial photography and surveying. Quadrotor UAVs in particular, thanks to their simple structure, vertical take-off and landing, and ability to hover, can accomplish assigned tasks well in complex environments. During task execution, navigation and positioning is one of the most critical problems.
For outdoor environments, Global Navigation Satellite Systems (GNSS), such as the Global Positioning System (GPS) of the United States and the BeiDou Navigation Satellite System (BDS) of China, provide positioning services of fairly high accuracy and largely satisfy users' demand for location-based services in outdoor scenarios. However, many positioning needs arise indoors: personal users, indoor performances, formation flights, factory inspection and so on. Indoors, GNSS signals are blocked by buildings and attenuate rapidly, so they cannot meet the needs of indoor navigation and positioning, and existing UAV positioning methods rarely address attitude determination in indoor environments. Until new techniques solve this problem, operation can only rely on the skill of the pilot, and accidents happen from time to time. Indoor positioning of UAVs has therefore become a research hotspot in both industry and academia.
Some patents and techniques for indoor UAV positioning already exist, such as optical flow positioning, inertial positioning, wireless positioning and motion-capture positioning. Although these methods can be applied indoors, they each have problems. Optical flow positioning and inertial positioning accumulate errors and cannot run stably for long periods; wireless positioning has limited accuracy, is easily disturbed by the surrounding environment, and cannot correct the attitude of the UAV; and motion-capture systems, although they reach centimetre or even millimetre accuracy, are restricted by their high price and limited working volume, which prevent large-scale application.
In summary, existing UAV pose determination has the following problems. First, in environments without navigation satellite signals, such as indoors, good positioning and attitude determination cannot be achieved, which limits the range of UAV applications. Second, among existing indoor positioning methods, the on-board sensors that do not depend on external signals, such as the optical flow module and the inertial unit, have poor positioning accuracy, cannot operate stably for long periods, and cannot correct accumulated position and attitude errors, while systems such as motion capture and wireless positioning suffer, respectively, from high cost and from positioning accuracy that cannot meet the requirements of the control system.
Summary of the invention
The present invention aims to solve the problems of existing UAV positioning methods: poor indoor positioning performance, accumulated errors that existing indoor positioning methods cannot correct, inability to correct the attitude of the UAV, high cost, and poor positioning accuracy.
An autonomous positioning and pose correction method for unmanned aerial vehicles based on ground markings comprises the following steps:
Step 1: establish the initial geographic coordinate system, body coordinate system, body-carried geographic coordinate system, camera coordinate system, image coordinate system and optical flow module coordinate system, and install the camera and optical flow positioning module on the UAV;
Step 2: calibrate the position and attitude of the camera coordinate system relative to the body coordinate system, measure the relative position between the optical flow positioning module and the camera, and calibrate the transformation matrix from the body coordinate system to the initial geographic coordinate system;
Step 3: obtain the velocity measured by the optical flow positioning module, the acceleration measured by the inertial unit and the attitude of the UAV, and integrate the acceleration of the inertial unit; transform the optical flow velocity Vf returned by the optical flow positioning module and the inertial velocity Va obtained by integrating the acceleration into the initial geographic coordinate system, giving the optical flow velocity Vf' and the inertial velocity Va' in that system; fuse Vf' and Va' in the initial geographic coordinate system, integrate the fused velocity to obtain the rough position P0 of the UAV in the initial geographic coordinate system, and obtain the rough attitude R0 from the inertial unit and magnetometer;
Step 4: acquire the ground-marking image I captured by the camera, extract the ground-marking information from I, perform a preliminary screening of the extracted information, and transform the screened information into the initial geographic coordinate system;
Step 5: further screen the ground-marking information in the initial geographic coordinate system, and use the information remaining after the further screening to correct the position and attitude of the UAV, obtaining the accurate position P1 and accurate attitude R1;
Step 6: return to Step 3.
Preferably, in the method, the ground marking is a square grid.
Preferably, the detailed process of Step 1 comprises the following steps:
Step 1-1, define the coordinate systems:
Initial geographic coordinate system o0x0y0z0: the east-north-up coordinate system at the initial position of the UAV is taken as the initial geographic coordinate system;
Body coordinate system obxbybzb: the origin ob is at the centre of gravity of the UAV; the xb axis lies in the longitudinal plane of symmetry, parallel to the body axis and pointing in the flight direction; the yb axis is perpendicular to the longitudinal plane of symmetry and points to the right; the zb axis is perpendicular to xb and yb and points downward;
Body-carried geographic coordinate system o1x1y1z1: the origin o1 is at the centre of gravity of the UAV; the x1, y1, z1 axes are parallel to the x0, y0, z0 axes of the initial geographic coordinate system respectively;
Image coordinate system oixiyi: the origin oi is at the upper-left corner of the image; the xi axis is parallel to the top row of pixels and points to the right; the yi axis is parallel to the leftmost column of pixels and points downward;
Camera coordinate system ocxcyczc: the origin oc is at the optical centre of the camera; the zc axis points outward along the optical axis; the xc and yc axes are parallel to the xi and yi axes of the image coordinate system respectively;
Optical flow module coordinate system ofxfyfzf: the origin is at the centre of the optical flow positioning module; the xf, yf, zf axes are parallel to the xb, yb, zb axes of the body coordinate system respectively.
Step 1-2, install the optical flow positioning module underneath the UAV, rigidly attached to the body; install the camera underneath the UAV so that the zc, xc and yc axes of the camera coordinate system are parallel to the zb, yb and -xb axes of the body coordinate system respectively.
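The transformations between these frames are used throughout Steps 2 to 5. Purely as an illustration, and not part of the patent text, the rotation from the body frame to the initial geographic frame and the fixed camera-to-body rotation implied by the mounting in Step 1-2 could be written as in the following sketch; the Z-Y-X Euler convention and all function and variable names are editorial assumptions.

# Illustrative sketch only: rotation matrices implied by the frame definitions above.
import numpy as np

def body_to_initial_geo(roll, pitch, yaw):
    """Rotation taking vectors from the body frame (xb forward, yb right, zb down)
    to the initial geographic frame (x0 east, y0 north, z0 up), angles in radians."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    # Body (forward-right-down) to local north-east-down, Z-Y-X Euler angles.
    R_ned_b = (np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]]) @
               np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]]) @
               np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]]))
    # Fixed permutation from north-east-down to the east-north-up initial geographic frame.
    R_enu_ned = np.array([[0, 1, 0], [1, 0, 0], [0, 0, -1]])
    return R_enu_ned @ R_ned_b

# Fixed camera-to-body rotation for the mounting of Step 1-2
# (zc parallel to zb, xc parallel to yb, yc parallel to -xb):
R_body_cam = np.array([[0, -1, 0],
                       [1,  0, 0],
                       [0,  0, 1]])
# A vector measured in the camera frame maps to the body frame as
# v_body = R_body_cam @ v_cam, and on to the geographic frame with body_to_initial_geo.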
Preferably, the detailed process of Step 2 comprises the following steps:
Step 2-1, place the UAV and a calibration board on the same physical plane, move the calibration board directly below the camera, keep one side of the board parallel to the x axis of the body coordinate system, and calibrate the camera.
Step 2-2, measure the horizontal and vertical distances between the optical flow positioning module and the camera with a ruler.
Step 2-3, calibrate the transformation from the body coordinate system to the initial geographic coordinate system using the magnetometer and inertial unit installed on the UAV.
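The patent does not prescribe a particular calibration tool for Step 2-1. As one hedged illustration, a standard OpenCV chessboard calibration would provide the camera intrinsics needed later for the projective transformation of Step 4; the board size and square size below are arbitrary assumptions, not values from the patent.

# Illustrative sketch of Step 2-1 using a standard OpenCV chessboard calibration.
import cv2
import numpy as np

def calibrate_camera(images, board_size=(9, 6), square_size_m=0.025):
    """Estimate camera intrinsics and distortion from chessboard images
    (board_size = inner corners per row/column; square_size_m is assumed)."""
    objp = np.zeros((board_size[0] * board_size[1], 3), np.float32)
    objp[:, :2] = np.mgrid[0:board_size[0], 0:board_size[1]].T.reshape(-1, 2)
    objp *= square_size_m
    image_size = (images[0].shape[1], images[0].shape[0])   # (width, height)
    obj_points, img_points = [], []
    for img in images:
        gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
        found, corners = cv2.findChessboardCorners(gray, board_size)
        if found:
            corners = cv2.cornerSubPix(
                gray, corners, (11, 11), (-1, -1),
                (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3))
            obj_points.append(objp)
            img_points.append(corners)
    _, K, dist, _, _ = cv2.calibrateCamera(
        obj_points, img_points, image_size, None, None)
    return K, dist   # intrinsic matrix and distortion coefficients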
Preferably, the detailed process of Step 3 comprises the following steps:
Step 3-1, obtain the velocity measured by the optical flow positioning module, the acceleration measured by the inertial unit and the attitude of the UAV, and integrate the acceleration of the inertial unit to obtain the inertial velocity Va. Transform the optical flow velocity Vf returned by the optical flow positioning module and the inertial velocity Va successively into the body coordinate system and the body-carried geographic coordinate system, and finally into the initial geographic coordinate system, obtaining the optical flow velocity Vf' and the inertial velocity Va' in the initial geographic coordinate system.
Step 3-2, read the UAV height H, in metres, and select the fusion method for Vf' and Va' according to the height.
When H is greater than or equal to the threshold height H1, fuse Vf' and Va' with a Kalman filter to obtain the velocity Vo of the UAV in the initial geographic coordinate system.
When H is less than H1, add Vf' and Va' directly with weights to obtain Vo:
Vo = α·Vf' + β·Va'
where α and β are weighting parameters that vary with the height H.
Step 3-3, integrate the fused velocity Vo to obtain the rough position P0 of the UAV in the initial geographic coordinate system, and obtain the rough attitude R0 from the inertial unit and magnetometer.
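A minimal sketch of the height-dependent fusion of Steps 3-2 and 3-3 follows, under editorial assumptions: the fusion is done per axis, the simple scalar Kalman-style update below merely stands in for whatever filter an implementation would actually use above the threshold, and all names are the editor's own.

# Illustrative sketch of the height-dependent velocity fusion (Steps 3-2 and 3-3).
import numpy as np

H1 = 0.4          # threshold height in metres (preferred value given in the text)

def fuse_velocity(v_flow, v_inertial, height, state):
    """Return the fused velocity Vo in the initial geographic frame.
    v_flow = Vf', v_inertial = Va' (numpy arrays, one entry per horizontal axis)."""
    if height >= H1:
        # Above the threshold: scalar Kalman-style update per axis, treating
        # Va' as the prediction and Vf' as the measurement (illustrative choice).
        P_pred = state["P"] + state["q"]        # predicted error variance
        K = P_pred / (P_pred + state["r"])      # Kalman gain
        vo = v_inertial + K * (v_flow - v_inertial)
        state["P"] = (1.0 - K) * P_pred
        return vo
    # Below the threshold: direct weighted sum, beta = |H - H1|, alpha = 1 - beta.
    beta = abs(height - H1)
    alpha = 1.0 - beta
    return alpha * v_flow + beta * v_inertial

def integrate_position(p0, vo, dt):
    """Step 3-3: integrate the fused velocity Vo to update the rough position P0."""
    return p0 + vo * dt

# Example usage (noise values are arbitrary):
# state = {"P": 1.0, "q": 0.01, "r": 0.1}
# vo = fuse_velocity(vf_geo, va_geo, height, state)
# p0 = integrate_position(p0, vo, dt)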
Preferably, the detailed process of Step 4 comprises the following steps:
Step 4-1, convert the ground-marking image I captured by the camera to HSV colour space and, using the difference between the ground marking and the floor in the colour and brightness channels, binarize I to obtain the binary image Ib. In Ib the ground-marking pixels are white and the floor pixels are black.
Step 4-2, process the binary image Ib with morphological operations.
Step 4-3, apply a projective transformation to Ib to obtain the orthographic (top-down) binary image Ir; run a Hough line detection on Ir and perform a preliminary screening of the detected lines using the yaw angle of the UAV, deleting lines whose angle differs greatly from that of the true ground-marking lines.
Step 4-4, transform the lines remaining after the preliminary screening into the initial geographic coordinate system.
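Step 4 maps naturally onto standard OpenCV operations. The sketch below is illustrative only: the HSV thresholds, morphology kernel, Hough parameters and angle tolerance are assumptions, not values given in the patent, and the top-down warp H_img would be built from the camera intrinsics of Step 2 and the rough attitude R0.

# Illustrative sketch of Step 4 (HSV threshold -> morphology -> top-down warp ->
# Hough lines -> yaw-based preliminary screening).
import cv2
import numpy as np

def detect_marking_lines(image, H_img, yaw_deg, angle_tol_deg=15.0):
    """image: BGR frame I; H_img: 3x3 projective transform to the top-down view;
    yaw_deg: UAV yaw from the magnetometer."""
    hsv = cv2.cvtColor(image, cv2.COLOR_BGR2HSV)
    # White tape: low saturation, high value (channel ranges are assumptions).
    Ib = cv2.inRange(hsv, (0, 0, 180), (180, 60, 255))
    kernel = cv2.getStructuringElement(cv2.MORPH_RECT, (5, 5))
    Ib = cv2.morphologyEx(Ib, cv2.MORPH_OPEN, kernel)
    Ib = cv2.morphologyEx(Ib, cv2.MORPH_CLOSE, kernel)
    Ir = cv2.warpPerspective(Ib, H_img, (Ib.shape[1], Ib.shape[0]))
    lines = cv2.HoughLinesP(Ir, 1, np.pi / 180, threshold=80,
                            minLineLength=60, maxLineGap=10)
    kept = []
    if lines is not None:
        for x1, y1, x2, y2 in lines[:, 0]:
            ang = (np.degrees(np.arctan2(y2 - y1, x2 - x1)) - yaw_deg) % 90.0
            # Grid lines should be parallel or perpendicular to the grid axes.
            if min(ang, 90.0 - ang) < angle_tol_deg:
                kept.append((x1, y1, x2, y2))
    return Ir, kept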
Preferably, the detailed process of Step 5 is as follows:
Step 5-1: in the initial geographic coordinate system, further screen the lines using the UAV position, the positions and angles of the preliminarily screened lines in the initial geographic coordinate system, and the mutual relations between the lines: delete lines whose position or angle differs greatly from the expected values, and delete lines that are neither parallel nor perpendicular to the other lines.
Step 5-2, process the remaining line information: in the initial geographic coordinate system, take the difference between the true position of each screened line and its detected position; the result is the UAV position correction ΔP.
When only one line remains after the further screening, correct the UAV position in the direction perpendicular to that line; when several parallel lines remain, average the corrections.
When two mutually perpendicular lines remain, the horizontal position of the UAV can be corrected.
P1 = P0 + ΔP
where P1 is the accurate position of the UAV, P0 its rough position and ΔP the position correction.
Compute the intersections of the remaining lines and keep those whose distance from the horizontal position of the UAV is less than L. If the kept intersections contain four points of which no three are collinear, the accurate position P1 and accurate attitude R1 of the UAV can be computed directly from these intersections by a projective transformation.
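As a hedged illustration of Step 5-2, the sketch below computes the perpendicular correction ΔP for a detected line by snapping it to the nearest true grid line of a square grid of known pitch anchored at the origin of the initial geographic frame. The snapping rule and the grid-pitch parameter are editorial assumptions; the patent only states that the difference between the true and detected line positions gives ΔP.

# Illustrative sketch of the position correction Delta-P in Step 5-2.
import numpy as np

def correction_from_line(line_dir, line_point, grid_pitch):
    """line_dir: unit direction of a detected line in the initial geographic frame;
    line_point: any point on it; grid_pitch: spacing of the marking grid in metres."""
    n = np.array([-line_dir[1], line_dir[0]])              # unit normal to the line
    d_detected = np.dot(n, line_point)                     # detected offset along n
    d_true = round(d_detected / grid_pitch) * grid_pitch   # nearest true grid line
    return (d_true - d_detected) * n                       # correction, perpendicular only

def corrected_position(p0_xy, corrections):
    """P1 = P0 + Delta-P; with several parallel lines the corrections are averaged."""
    if not corrections:
        return p0_xy
    return p0_xy + np.mean(np.array(corrections), axis=0)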
Preferably, Step 6 simply returns to Step 3.
Preferably, the weighting parameters in Step 3 are α = 1 - β and β = |H - H1|, with the threshold height H1 equal to 0.4 metres. For example, at H = 0.1 m, β = 0.3 and α = 0.7; as H approaches H1 the inertial weight vanishes and the optical flow velocity is used alone.
Preferably, the distance L in Step 5 is 3 metres.
The present invention proposes to use a grid laid out on the floor with white marking tape, or an existing grid such as the joints between floor tiles, as the ground marking; it fuses the information of the inertial unit, magnetometer and optical flow positioning module on board the UAV and recognizes the ground marking with the camera, thereby correcting the position and attitude of the UAV and achieving high positioning and attitude-determination accuracy.
Brief description of the drawings
Fig. 1 is the position and attitude correction flow chart of the present invention;
Fig. 2 is a schematic diagram of the ground marking of the present invention, in which the black part is the floor and the white part is the ground marking;
Fig. 3 shows the UAV and the ground marking used in the test of the present invention.
Detailed description of the embodiments
Embodiment 1: this embodiment is described with reference to Fig. 1.
An autonomous positioning and pose correction method for unmanned aerial vehicles based on ground markings comprises the following steps:
Step 1: establish the initial geographic coordinate system, body coordinate system, body-carried geographic coordinate system, camera coordinate system, image coordinate system and optical flow module coordinate system, and install the camera and optical flow positioning module on the UAV;
Step 2: calibrate the position and attitude of the camera coordinate system relative to the body coordinate system, measure the relative position between the optical flow positioning module and the camera, and calibrate the transformation matrix from the body coordinate system to the initial geographic coordinate system;
Step 3: obtain the velocity measured by the optical flow positioning module, the acceleration measured by the inertial unit and the attitude of the UAV, and integrate the acceleration of the inertial unit; transform the optical flow velocity Vf returned by the optical flow positioning module and the inertial velocity Va obtained by integrating the acceleration into the initial geographic coordinate system, giving Vf' and Va'; fuse Vf' and Va' in the initial geographic coordinate system, integrate the fused velocity to obtain the rough position P0 of the UAV in the initial geographic coordinate system, and obtain the rough attitude R0 from the inertial unit and magnetometer;
Step 4: acquire the ground-marking image I captured by the camera, extract the ground-marking information from I, perform a preliminary screening of the extracted information, and transform the screened information into the initial geographic coordinate system;
Step 5: further screen the ground-marking information in the initial geographic coordinate system, and use the information remaining after the further screening to correct the position and attitude of the UAV, obtaining the accurate position P1 and accurate attitude R1;
Step 6: return to Step 3.
The invention proposes to use a grid laid out on the floor with white marking tape, or an existing grid such as the joints between floor tiles, as the ground marking; it fuses the information of the inertial unit, magnetometer and optical flow positioning module on board the UAV and recognizes the ground marking with the camera, thereby correcting the position and attitude of the UAV and achieving high positioning and attitude-determination accuracy.
Preferably, the ground marking is a square grid.
Embodiment 2:
The detailed process of Step 1 in this embodiment comprises the following steps:
Step 1-1, define the coordinate systems:
Initial geographic coordinate system o0x0y0z0: the east-north-up coordinate system at the initial position of the UAV is taken as the initial geographic coordinate system;
Body coordinate system obxbybzb: the origin ob is at the centre of gravity of the UAV; the xb axis lies in the longitudinal plane of symmetry, parallel to the body axis and pointing in the flight direction; the yb axis is perpendicular to the longitudinal plane of symmetry and points to the right; the zb axis is perpendicular to xb and yb and points downward;
Body-carried geographic coordinate system o1x1y1z1: the origin o1 is at the centre of gravity of the UAV; the x1, y1, z1 axes are parallel to the x0, y0, z0 axes of the initial geographic coordinate system respectively;
Image coordinate system oixiyi: the origin oi is at the upper-left corner of the image; the xi axis is parallel to the top row of pixels and points to the right; the yi axis is parallel to the leftmost column of pixels and points downward;
Camera coordinate system ocxcyczc: the origin oc is at the optical centre of the camera; the zc axis points outward along the optical axis; the xc and yc axes are parallel to the xi and yi axes of the image coordinate system respectively;
Optical flow module coordinate system ofxfyfzf: the origin is at the centre of the optical flow positioning module; the xf, yf, zf axes are parallel to the xb, yb, zb axes of the body coordinate system respectively.
Step 1-2, install the optical flow positioning module underneath the UAV, rigidly attached to the body; install the camera underneath the UAV so that the zc, xc and yc axes of the camera coordinate system are parallel to the zb, yb and -xb axes of the body coordinate system respectively.
Embodiment 3:
The detailed process of Step 2 in this embodiment comprises the following steps:
Step 2-1, place the UAV and a calibration board on the same physical plane, move the calibration board directly below the camera, keep one side of the board parallel to the x axis of the body coordinate system, and calibrate the camera.
Step 2-2, measure the horizontal and vertical distances between the optical flow positioning module and the camera with a ruler.
Step 2-3, calibrate the transformation from the body coordinate system to the initial geographic coordinate system using the magnetometer and inertial unit installed on the UAV.
Embodiment 4:
The detailed process of Step 3 in this embodiment comprises the following steps:
Step 3-1, obtain the velocity measured by the optical flow positioning module, the acceleration measured by the inertial unit and the attitude of the UAV, and integrate the acceleration to obtain the inertial velocity Va. Transform the optical flow velocity Vf returned by the optical flow positioning module and the inertial velocity Va successively into the body coordinate system and the body-carried geographic coordinate system, and finally into the initial geographic coordinate system, obtaining the optical flow velocity Vf' and the inertial velocity Va' in the initial geographic coordinate system.
Step 3-2, read the UAV height H, in metres, and select the fusion method for Vf' and Va' according to the height.
When H is greater than or equal to the threshold height H1, fuse Vf' and Va' with a Kalman filter to obtain the velocity Vo of the UAV in the initial geographic coordinate system.
When H is less than H1, add Vf' and Va' directly with weights to obtain Vo:
Vo = α·Vf' + β·Va'
where α and β are weighting parameters that vary with the height H.
Step 3-3, integrate the fused velocity Vo to obtain the rough position P0 of the UAV in the initial geographic coordinate system, and obtain the rough attitude R0 from the inertial unit and magnetometer.
Embodiment 5:
The detailed process of Step 4 in this embodiment comprises the following steps:
Step 4-1, convert the ground-marking image I captured by the camera to HSV colour space and, using the difference between the ground marking and the floor in the colour and brightness channels, binarize I to obtain the binary image Ib. In Ib the ground-marking pixels are white and the floor pixels are black.
Step 4-2, process the binary image Ib with morphological operations.
Step 4-3, apply a projective transformation to Ib to obtain the orthographic (top-down) binary image Ir; run a Hough line detection on Ir and perform a preliminary screening of the detected lines using the yaw angle of the UAV, deleting lines whose angle differs greatly from that of the true ground-marking lines.
Step 4-4, transform the lines remaining after the preliminary screening into the initial geographic coordinate system.
Embodiment 6:
The detailed process of Step 5 in this embodiment is as follows:
Step 5-1: in the initial geographic coordinate system, further screen the lines using the UAV position, the positions and angles of the preliminarily screened lines in the initial geographic coordinate system, and the mutual relations between the lines: delete lines whose position or angle differs greatly from the expected values, and delete lines that are neither parallel nor perpendicular to the other lines.
Step 5-2, process the remaining line information: in the initial geographic coordinate system, take the difference between the true position of each screened line and its detected position; the result is the UAV position correction ΔP.
When only one line remains after the further screening, correct the UAV position in the direction perpendicular to that line; when several parallel lines remain, average the corrections.
When two mutually perpendicular lines remain, the horizontal position of the UAV can be corrected.
P1 = P0 + ΔP
where P1 is the accurate position of the UAV, P0 its rough position and ΔP the position correction.
Compute the intersections of the remaining lines and keep those whose distance from the horizontal position of the UAV is less than L. If the kept intersections contain four points of which no three are collinear, the accurate position P1 and accurate attitude R1 of the UAV can be computed directly from these intersections by a projective transformation.
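The patent states that P1 and R1 can be computed directly from four non-degenerate grid intersections by a projective transformation. One standard way to realize this, shown here only as an editorial sketch and not as the patented implementation, is a planar PnP solve using the camera intrinsics from Step 2; taking the floor as the z = 0 plane of the initial geographic frame is also an assumption, and the final conversion from the camera pose to the UAV pose through the fixed camera-to-body transform is omitted.

# Illustrative sketch: pose from four (or more) retained grid intersections.
import cv2
import numpy as np

def pose_from_intersections(world_pts, image_pts, K, dist):
    """world_pts: Nx3 intersection coordinates in the initial geographic frame
    (floor taken as z = 0); image_pts: Nx2 pixel coordinates of the same points;
    K, dist: camera intrinsics and distortion from Step 2."""
    ok, rvec, tvec = cv2.solvePnP(
        np.asarray(world_pts, np.float32),
        np.asarray(image_pts, np.float32),
        K, dist, flags=cv2.SOLVEPNP_IPPE)   # IPPE expects >= 4 coplanar points
    if not ok:
        return None, None
    R_cw, _ = cv2.Rodrigues(rvec)           # world -> camera rotation
    R_wc = R_cw.T                           # camera -> world
    cam_pos = -R_wc @ tvec                  # camera position in the world frame
    # P1 and R1 of the UAV then follow from the fixed camera-to-body transform
    # calibrated in Step 2 (omitted here).
    return cam_pos.ravel(), R_wc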
Other steps and parameter are identical with one of embodiment one to six.
The position and attitude errors of the UAV after flying 10 metres indoors without a GPS signal are shown in Table 1.
Table 1. Position and attitude error after a 10 m flight
Method | Position error | Yaw angle error | Drift
Optical flow method | >1 m | attitude not determinable | yes
Method of the present invention | <5 cm | <4° | no
As Table 1 shows, the present method significantly improves the positioning accuracy compared with a pure optical flow method, and the positioning and attitude-determination errors do not grow with the distance travelled.
The position and attitude errors of the UAV after flying indoors for ten minutes without a GPS signal are shown in Table 2.
Table 2. Position and attitude error after a ten-minute flight
Method | Position error | Yaw angle error | Drift
Inertial unit | >5 m | >10° | yes
Method of the present invention | 5 cm | <4° | no
Note: the UAV used in the test is a DJI M100 quadrotor; the camera is an ordinary optical camera with a resolution of 640*480; the optical flow positioning module is the DJI Guidance module; the inertial unit, magnetometer and other sensors are those carried by the M100 quadrotor.

Claims (8)

1. An autonomous positioning and pose correction method for unmanned aerial vehicles based on ground markings, characterised by comprising:
Step 1: establishing the initial geographic coordinate system, body coordinate system, body-carried geographic coordinate system, camera coordinate system, image coordinate system and optical flow module coordinate system, and installing a camera and an optical flow positioning module on the unmanned aerial vehicle;
Step 2: calibrating the position and attitude of the camera coordinate system relative to the body coordinate system, measuring the relative position between the optical flow positioning module and the camera, and calibrating the transformation matrix from the body coordinate system to the initial geographic coordinate system;
Step 3: obtaining the velocity measured by the optical flow positioning module, the acceleration measured by the inertial unit and the attitude of the unmanned aerial vehicle; integrating the acceleration of the inertial unit to obtain the inertial velocity Va; transforming the optical flow velocity Vf returned by the optical flow positioning module and the inertial velocity Va into the initial geographic coordinate system to obtain the optical flow velocity Vf' and the inertial velocity Va' in that system; fusing Vf' and Va' in the initial geographic coordinate system to obtain the fused velocity; integrating the fused velocity to obtain the rough position P0 of the unmanned aerial vehicle in the initial geographic coordinate system; and obtaining the rough attitude R0 from the inertial unit and the magnetometer;
Step 4: acquiring the ground-marking image I captured by the camera, extracting the ground-marking information from the image I, performing a preliminary screening of the extracted ground-marking information, and transforming the screened information into the initial geographic coordinate system;
Step 5: further screening the ground-marking information in the initial geographic coordinate system, and correcting the position and attitude of the unmanned aerial vehicle according to the information remaining after the further screening, to obtain the accurate position P1 and the accurate attitude R1;
Step 6: returning to Step 3.
2. The autonomous positioning and pose correction method for unmanned aerial vehicles based on ground markings according to claim 1, characterised in that the ground marking is a square grid.
3. The autonomous positioning and pose correction method for unmanned aerial vehicles based on ground markings according to claim 1, characterised in that the detailed process of Step 1 comprises the following steps:
Step 1-1, defining the coordinate systems:
the initial geographic coordinate system o0x0y0z0: the east-north-up coordinate system at the initial position of the unmanned aerial vehicle is taken as the initial geographic coordinate system;
the body coordinate system obxbybzb: the origin ob is at the centre of gravity of the unmanned aerial vehicle; the xb axis lies in the longitudinal plane of symmetry, parallel to the body axis and pointing in the flight direction; the yb axis is perpendicular to the longitudinal plane of symmetry and points to the right; the zb axis is perpendicular to xb and yb and points downward;
the body-carried geographic coordinate system o1x1y1z1: the origin o1 is at the centre of gravity of the unmanned aerial vehicle; the x1, y1 and z1 axes are parallel to the x0, y0 and z0 axes of the initial geographic coordinate system respectively;
the image coordinate system oixiyi: the origin oi is at the upper-left corner of the image; the xi axis is parallel to the top row of pixels and points to the right; the yi axis is parallel to the leftmost column of pixels and points downward;
the camera coordinate system ocxcyczc: the origin oc is at the optical centre of the camera; the zc axis points outward along the optical axis; the xc and yc axes are parallel to the xi and yi axes of the image coordinate system respectively;
the optical flow module coordinate system ofxfyfzf: the origin is at the centre of the optical flow positioning module; the xf, yf and zf axes are parallel to the xb, yb and zb axes of the body coordinate system respectively;
Step 1-2, installing the optical flow positioning module underneath the unmanned aerial vehicle, rigidly attached to the body, and installing the camera underneath the unmanned aerial vehicle so that the zc, xc and yc axes of the camera coordinate system are parallel to the zb, yb and -xb axes of the body coordinate system respectively.
4. The autonomous positioning and pose correction method for unmanned aerial vehicles based on ground markings according to claim 2, characterised in that the detailed process of Step 2 comprises the following steps:
Step 2-1, placing the unmanned aerial vehicle and a calibration board on the same physical plane, moving the calibration board directly below the camera, keeping one side of the calibration board parallel to the x axis of the body coordinate system, and calibrating the camera;
Step 2-2, measuring the horizontal and vertical distances between the optical flow positioning module and the camera with a ruler;
Step 2-3, calibrating the transformation from the body coordinate system to the initial geographic coordinate system using the magnetometer and inertial unit installed on the unmanned aerial vehicle.
5. The autonomous positioning and pose correction method for unmanned aerial vehicles based on ground markings according to claim 3, characterised in that the detailed process of Step 3 comprises the following steps:
Step 3-1, obtaining the velocity measured by the optical flow positioning module, the acceleration measured by the inertial unit and the attitude of the unmanned aerial vehicle, and integrating the acceleration of the inertial unit; transforming the optical flow velocity Vf returned by the optical flow positioning module and the inertial velocity Va obtained by integrating the acceleration successively into the body coordinate system and the body-carried geographic coordinate system, and finally into the initial geographic coordinate system, to obtain the optical flow velocity Vf' and the inertial velocity Va' in the initial geographic coordinate system;
Step 3-2, reading the height H of the unmanned aerial vehicle, in metres, and selecting the fusion method for Vf' and Va' according to the height:
when H is greater than or equal to the threshold height H1, fusing Vf' and Va' with a Kalman filter to obtain the velocity Vo of the unmanned aerial vehicle in the initial geographic coordinate system;
when H is less than H1, adding Vf' and Va' directly with weights to obtain the velocity Vo in the initial geographic coordinate system:
Vo = α·Vf' + β·Va'
where α and β are weighting parameters that vary with the height H;
Step 3-3, integrating the fused velocity Vo to obtain the rough position P0 of the unmanned aerial vehicle in the initial geographic coordinate system, and obtaining the rough attitude R0 from the inertial unit and the magnetometer.
6. The autonomous positioning and pose correction method for unmanned aerial vehicles based on ground markings according to claim 4, characterised in that the detailed process of Step 4 comprises the following steps:
Step 4-1, converting the ground-marking image I captured by the camera to HSV colour space and, using the difference between the ground marking and the floor in the colour and brightness channels, binarizing the image I to obtain the binary image Ib, in which the ground-marking pixels are white and the floor pixels are black;
Step 4-2, processing the binary image Ib with morphological operations;
Step 4-3, applying a projective transformation to Ib to obtain the orthographic binary image Ir; performing a Hough line detection on Ir and preliminarily screening the detected lines using the yaw angle of the unmanned aerial vehicle, deleting the lines whose angle differs greatly from that of the true ground-marking lines;
Step 4-4, transforming the lines remaining after the preliminary screening into the initial geographic coordinate system.
7. The autonomous positioning and pose correction method for unmanned aerial vehicles based on ground markings according to claim 5, characterised in that the detailed process of Step 5 is as follows:
Step 5-1: in the initial geographic coordinate system, further screening the lines according to the position of the unmanned aerial vehicle, the positions and angles of the preliminarily screened lines in the initial geographic coordinate system, and the mutual relations between the lines: deleting the lines whose position or angle differs greatly from the expected values, and deleting the lines that are neither parallel nor perpendicular to the other lines;
Step 5-2, processing the remaining line information: in the initial geographic coordinate system, taking the difference between the true position of each screened line and its detected position, the result being the position correction ΔP of the unmanned aerial vehicle;
when only one line remains after the further screening, correcting the position of the unmanned aerial vehicle in the direction perpendicular to that line, and when several parallel lines remain, averaging the corrections;
when two mutually perpendicular lines remain, correcting the horizontal position of the unmanned aerial vehicle;
P1 = P0 + ΔP
where P1 is the accurate position of the unmanned aerial vehicle, P0 its rough position and ΔP the position correction;
computing the intersections of the remaining lines and keeping the intersections whose distance from the horizontal position of the unmanned aerial vehicle is less than L; when the kept intersections contain four points of which no three are collinear, computing the accurate position P1 and accurate attitude R1 of the unmanned aerial vehicle directly from the kept intersections by a projective transformation.
8. The autonomous positioning and pose correction method for unmanned aerial vehicles based on ground markings according to any one of claims 2 to 7, characterised in that the weighting parameter α in Step 3 is α = 1 - β, the weighting parameter β = |H - H1|, the threshold height H1 is 0.4 metres, and the distance L in Step 5 is 3 metres.
CN201710766832.8A 2017-08-31 2017-08-31 A kind of unmanned vehicle independent positioning and pose alignment technique based on land marking Pending CN108007474A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710766832.8A CN108007474A (en) 2017-08-31 2017-08-31 A kind of unmanned vehicle independent positioning and pose alignment technique based on land marking

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710766832.8A CN108007474A (en) 2017-08-31 2017-08-31 A kind of unmanned vehicle independent positioning and pose alignment technique based on land marking

Publications (1)

Publication Number Publication Date
CN108007474A 2018-05-08

Family

ID=62050931

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710766832.8A Pending CN108007474A (en) 2017-08-31 2017-08-31 A kind of unmanned vehicle independent positioning and pose alignment technique based on land marking

Country Status (1)

Country Link
CN (1) CN108007474A (en)


Patent Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103267523A (en) * 2013-04-19 2013-08-28 天津工业大学 Offline processing method for visual information of visual navigation system of quadcopter
CN103365297A (en) * 2013-06-29 2013-10-23 天津大学 Optical flow-based four-rotor unmanned aerial vehicle flight control method
CN103411621A (en) * 2013-08-09 2013-11-27 东南大学 Indoor-mobile-robot-oriented optical flow field vision/inertial navigation system (INS) combined navigation method
CN104729506A (en) * 2015-03-27 2015-06-24 北京航空航天大学 Unmanned aerial vehicle autonomous navigation positioning method with assistance of visual information
US20170006148A1 (en) * 2015-06-30 2017-01-05 ZEROTECH (Shenzhen) Intelligence Robot Co., Ltd. Unmanned aerial vehicle and control device thereof
CN105352495A (en) * 2015-11-17 2016-02-24 天津大学 Unmanned-plane horizontal-speed control method based on fusion of data of acceleration sensor and optical-flow sensor
CN106017463A (en) * 2016-05-26 2016-10-12 浙江大学 Aircraft positioning method based on positioning and sensing device
CN106094847A (en) * 2016-06-07 2016-11-09 廖兴池 A kind of unmanned plane automatic obstacle-avoiding controls technology and device thereof
CN106813662A (en) * 2016-06-08 2017-06-09 极翼机器人(上海)有限公司 A kind of air navigation aid based on light stream
CN106647784A (en) * 2016-11-15 2017-05-10 天津大学 Miniaturized unmanned aerial vehicle positioning and navigation method based on Beidou navigation system
CN106681353A (en) * 2016-11-29 2017-05-17 南京航空航天大学 Unmanned aerial vehicle (UAV) obstacle avoidance method and system based on binocular vision and optical flow fusion
CN106989744A (en) * 2017-02-24 2017-07-28 中山大学 A kind of rotor wing unmanned aerial vehicle autonomic positioning method for merging onboard multi-sensor

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Song Yu et al., "Positioning method for small UAVs based on optical flow and inertial navigation", Transducer and Microsystem Technologies *
Yang Tianyu et al., "Application of inertial/optical flow/magnetic integrated navigation technology in a quadrotor aircraft", Transducer and Microsystem Technologies *

Cited By (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109405821A (en) * 2018-09-21 2019-03-01 北京三快在线科技有限公司 Method, apparatus used for positioning and target device
CN109533380A (en) * 2018-12-19 2019-03-29 中山大学 Lifting airscrew based on Kalman filtering blocks gap duration prediction method
CN109533380B (en) * 2018-12-19 2022-03-15 中山大学 Kalman filtering-based helicopter rotor wing shielding gap duration prediction method
CN109903336A (en) * 2019-01-18 2019-06-18 浙江工业大学 Across the visual field estimation method of attitude of flight vehicle and device based on local feature
CN111580142A (en) * 2019-01-30 2020-08-25 北京优位智停科技有限公司 Method for entering train yard for piloting by using air flight device
CN112149659A (en) * 2019-06-27 2020-12-29 浙江商汤科技开发有限公司 Positioning method and device, electronic equipment and storage medium
WO2020258935A1 (en) * 2019-06-27 2020-12-30 浙江商汤科技开发有限公司 Positioning method and device, electronic device and storage medium
TWI749532B (en) * 2019-06-27 2021-12-11 大陸商浙江商湯科技開發有限公司 Positioning method and positioning device, electronic equipment and computer readable storage medium
CN110730301A (en) * 2019-10-14 2020-01-24 深圳市施罗德工业集团有限公司 Head adjusting method and device, electronic equipment and storage medium
CN112965507A (en) * 2021-02-03 2021-06-15 南京航空航天大学 Cluster unmanned aerial vehicle cooperative work system and method based on intelligent optimization
WO2022198590A1 (en) * 2021-03-25 2022-09-29 华为技术有限公司 Calibration method and apparatus, intelligent driving system, and vehicle
CN113237478A (en) * 2021-05-27 2021-08-10 哈尔滨工业大学 Unmanned aerial vehicle attitude and position estimation method and unmanned aerial vehicle
CN113237478B (en) * 2021-05-27 2022-10-14 哈尔滨工业大学 Unmanned aerial vehicle attitude and position estimation method and unmanned aerial vehicle
CN113551692A (en) * 2021-07-19 2021-10-26 杭州迅蚁网络科技有限公司 Unmanned aerial vehicle magnetometer and camera installation angle calibration method and device
CN113551692B (en) * 2021-07-19 2024-04-02 杭州迅蚁网络科技有限公司 Calibration method and device for installation angle of magnetometer and camera of unmanned aerial vehicle
CN113358135A (en) * 2021-08-09 2021-09-07 青州耐威航电科技有限公司 Method for correcting aircraft position by photoelectric measurement data
CN113358135B (en) * 2021-08-09 2021-11-26 青州耐威航电科技有限公司 Method for correcting aircraft position by photoelectric measurement data
CN113712469A (en) * 2021-08-11 2021-11-30 朱明 Unmanned mopping cleaning vehicle based on visual navigation, control method and base station
CN114088114A (en) * 2021-11-19 2022-02-25 智道网联科技(北京)有限公司 Vehicle pose calibration method and device and electronic equipment
CN114088114B (en) * 2021-11-19 2024-02-13 智道网联科技(北京)有限公司 Vehicle pose calibration method and device and electronic equipment
CN115790574A (en) * 2023-02-14 2023-03-14 飞联智航(北京)科技有限公司 Unmanned aerial vehicle optical flow positioning method and device and unmanned aerial vehicle

Similar Documents

Publication Publication Date Title
CN108007474A (en) A kind of unmanned vehicle independent positioning and pose alignment technique based on land marking
US11218689B2 (en) Methods and systems for selective sensor fusion
CN105928498B (en) Method, the geodetic mapping and survey system, storage medium of information about object are provided
CN104215239B (en) Guidance method using vision-based autonomous unmanned plane landing guidance device
CN106651990A (en) Indoor map construction method and indoor map-based indoor locating method
CN103822635A (en) Visual information based real-time calculation method of spatial position of flying unmanned aircraft
CN106155081B (en) A kind of a wide range of target monitoring of rotor wing unmanned aerial vehicle and accurate positioning method
CN107907900A (en) A kind of multi-sensor combined navigation system and method for GNSS double antennas auxiliary
CN104076817A (en) High-definition video aerial photography multimode sensor self-outer-sensing intelligent navigation system and method
CN106226780A (en) Many rotor-wing indoors alignment system based on scanning laser radar and implementation method
CN103175524A (en) Visual-sense-based aircraft position and attitude determination method under mark-free environment
CN109739257A (en) Merge the patrol unmanned machine closing method and system of satellite navigation and visual perception
CN106705962B (en) A kind of method and system obtaining navigation data
CN108955685A (en) A kind of tanker aircraft tapered sleeve pose measuring method based on stereoscopic vision
CN108548520A (en) A kind of antenna attitude remote data acquisition system based on NB-IOT
CN108426576A (en) Aircraft paths planning method and system based on identification point vision guided navigation and SINS
CN106370160A (en) Robot indoor positioning system and method
CN109974713A (en) A kind of navigation methods and systems based on topographical features group
CN114004977A (en) Aerial photography data target positioning method and system based on deep learning
CN110220533A (en) A kind of onboard electro-optical pod misalignment scaling method based on Transfer Alignment
CN113819904B (en) polarization/VIO three-dimensional attitude determination method based on zenith vector
CN108801225A (en) A kind of unmanned plane tilts image positioning method, system, medium and equipment
CN105424060B (en) A kind of measurement method of aircraft star sensor and strapdown inertial measurement unit installation error
CN103808309A (en) Three-dimensional aerial photograph forest measurement method for unmanned aerial vehicle
CN205176663U (en) System of falling is being fixed a position to unmanned aerial vehicle power line based on machine vision

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication
Application publication date: 20180508