CN107389968A - UAV fixed-point hovering implementation method and apparatus based on an optical flow sensor and an accelerometer - Google Patents
UAV fixed-point implementation method and apparatus based on an optical flow sensor and an accelerometer
- Publication number
- CN107389968A CN107389968A CN201710539335.4A CN201710539335A CN107389968A CN 107389968 A CN107389968 A CN 107389968A CN 201710539335 A CN201710539335 A CN 201710539335A CN 107389968 A CN107389968 A CN 107389968A
- Authority
- CN
- China
- Prior art keywords
- UAV
- matrix
- optical flow
- current time
- flow sensor
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Links
- 230000001133 acceleration Effects 0.000 title claims abstract description 65
- 238000000034 method Methods 0.000 title claims abstract description 33
- 239000011159 matrix material Substances 0.000 claims description 85
- 238000005259 measurement Methods 0.000 claims description 24
- 238000006243 chemical reaction Methods 0.000 claims description 9
- 230000010354 integration Effects 0.000 claims description 4
- 230000005540 biological transmission Effects 0.000 claims 2
- 238000012545 processing Methods 0.000 abstract description 5
- 238000001914 filtration Methods 0.000 abstract description 4
- 238000010586 diagram Methods 0.000 description 11
- 238000001514 detection method Methods 0.000 description 2
- 238000004458 analytical method Methods 0.000 description 1
- 230000008901 benefit Effects 0.000 description 1
- 238000013461 design Methods 0.000 description 1
- 238000006073 displacement reaction Methods 0.000 description 1
- 239000000686 essence Substances 0.000 description 1
- 230000006870 function Effects 0.000 description 1
- 230000006872 improvement Effects 0.000 description 1
- 238000011835 investigation Methods 0.000 description 1
- 238000012986 modification Methods 0.000 description 1
- 230000004048 modification Effects 0.000 description 1
- 230000003287 optical effect Effects 0.000 description 1
- 230000008569 process Effects 0.000 description 1
- 238000003672 processing method Methods 0.000 description 1
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01P—MEASURING LINEAR OR ANGULAR SPEED, ACCELERATION, DECELERATION, OR SHOCK; INDICATING PRESENCE, ABSENCE, OR DIRECTION, OF MOVEMENT
- G01P3/00—Measuring linear or angular speed; Measuring differences of linear or angular speeds
- G01P3/36—Devices characterised by the use of optical means, e.g. using infrared, visible, or ultraviolet light
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/10—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
- G01C21/12—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
- G01C21/16—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
- G01C21/165—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
Abstract
The present invention relates to the field of UAV technology and provides a UAV spot-hover implementation method and apparatus based on an optical flow sensor and an accelerometer. The method includes: generating a state vector I from the horizontal motion velocity obtained by the optical flow sensor, the acceleration obtained by the accelerometer, and the acceleration biases; calculating the UAV's optimal state vector value at the current time from the current-time estimate and the horizontal velocity detected by the optical flow sensor at the current time; and extracting the distance factor and velocity factor from the optimal state vector and sending the generated control instructions to the UAV's position controller and velocity controller, respectively. The present invention obtains the UAV's horizontal velocity with an optical flow sensor based on image-processing techniques, fuses the optical flow sensor and accelerometer data with Kalman filtering to obtain accurate UAV position and velocity estimates, and uses these estimates to implement spot-hover control of the UAV.
Description
【Technical field】
The present invention relates to the field of UAV technology, and more particularly to a UAV fixed-point implementation method and apparatus based on an optical flow sensor and an accelerometer.
【Background technology】
UAVs are widely used in fields such as scientific surveys, disaster prevention and recovery, and security. As UAVs become more intelligent and practical, autonomous positioning and hovering functions are increasingly indispensable.
Autonomous UAV positioning primarily means using data from various sensors to determine the UAV's position relative to an inertial coordinate system in the flight environment. The accuracy of the UAV's position estimate is the basis and precondition for hovering, trajectory planning, target tracking, and other complex flight missions.
UAV hovering is usually implemented with an IMU (inertial measurement unit): by monitoring the UAV's attitude and keeping the roll and pitch angles at zero, the UAV is prevented from displacing horizontally and thus hovers steadily. However, because of sensor errors and the open-loop nature of this hovering scheme, the UAV drifts and cannot achieve autonomous positioning and hovering.
Current UAV navigation systems mostly rely on GPS positioning, but its accuracy is low and there is almost no signal indoors, so GPS sensors cannot be used to achieve positioned flight of a UAV indoors.
The existing UAV indoor positioning systems mainly include the following:
1. Bluetooth-based positioning, which uses triangulation and locates accurately.
2. Ultra-wideband (UWB) indoor positioning, which uses ultra-wideband pulse signals and tracks and analyzes signal tags with pre-deployed sensors; it has strong multipath resolution and high precision.
3. Hanging a specially patterned marker body and a camera under the UAV. For example, the UAV station-keeping method disclosed in patent application 201410526631.7 places a special icon at the preset hover location; the camera suspended from the UAV recognizes the icon and keeps it at a fixed position in the captured image, so that the relative horizontal position of camera and icon stays constant. To prevent rotational drift, the special icon must carry a directional marking.
However, scheme 1 requires a Bluetooth module and deployed Bluetooth base stations, making indoor positioning expensive. Scheme 2 requires indoor deployment of positioning sensors; it is costly, limited in range, and poorly portable. Scheme 3 is complex to implement: specific pictures must be pre-placed at the hover location, so the UAV cannot hover at an arbitrary position, and if the special icon is damaged the UAV cannot hover accurately.
【The content of the invention】
The technical problem to be solved by the embodiments of the present invention is to improve on at least one of the problems present in the prior-art schemes above: scheme 1 requires a Bluetooth module and deployed Bluetooth base stations, so indoor positioning is expensive; scheme 2 requires indoor deployment of positioning sensors, which is costly, limited in range, and poorly portable; scheme 3 is complex to implement, requires specific pictures to be pre-placed at the hover location so the UAV cannot hover at an arbitrary position, and fails to hover accurately if the special icon is damaged.
The embodiment of the present invention adopts the following technical scheme that:
In a first aspect, the invention provides a UAV fixed-point implementation method based on an optical flow sensor and an accelerometer, including:
According to the UAV's horizontal motion velocities Vx and Vy relative to the ground obtained by the optical flow sensor, and the accelerations Ax, Ay, Az of the UAV along the x-, y-, and z-axes of the body coordinate system obtained by the accelerometer, calculating the UAV's current x-direction position Lx, y-direction position Ly, z-direction position Lz, and z-direction velocity Vz; generating a state vector I from the x-, y-, and z-direction positions, the x-direction velocity Vx, y-direction velocity Vy, and z-direction velocity Vz, and the acceleration biases along the x-, y-, and z-axes;
According to the optimal estimate I(k-1|k-1) of the state vector at the previous time and the accelerations Ax, Ay, and Az collected at the current time, calculating the current-time estimate I(k|k-1) of the state vector under the influence of the previous time, where k denotes the current time in the sequence and k-1 the previous time;
According to the current-time estimate I(k|k-1) and the horizontal velocity detected by the optical flow sensor at the current time, calculating the UAV's optimal state vector value I(k|k) at the current time;
Extracting the distance factor and/or velocity factor from the optimal state vector I(k|k) and sending the generated control instructions respectively to the UAV's position controller and/or velocity controller, so that the position controller and/or velocity controller make corresponding adjustments.
Preferably, the state vector I is a 9 × 1 matrix, specifically:
(Lx, Ly, Lz, Vx, Vy, Vz, Bx, By, Bz), where Lx, Ly, Lz are the UAV's positions along the x-, y-, and z-axes in the inertial coordinate system, Vx, Vy, and Vz are the UAV's velocities relative to the inertial coordinate system, and Bx, By, and Bz are the acceleration biases. Vz is obtained by integrating the accelerometer reading Az, and the initial parameter values of Lx, Ly, and Lz are 0.
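The state vector above can be assembled directly as a 9 × 1 numpy matrix. This is a minimal illustrative sketch; the velocity readings below are invented placeholders, not values from the patent:

```python
import numpy as np

# Hypothetical sensor-derived initial values: Vx, Vy from the optical
# flow sensor, Vz from integrating Az (here simply 0 at startup).
Vx, Vy, Vz = 0.12, -0.05, 0.0

# State vector I = (Lx, Ly, Lz, Vx, Vy, Vz, Bx, By, Bz)^T as a 9x1 matrix
I = np.array([[0.0], [0.0], [0.0],      # Lx, Ly, Lz initialized to 0
              [Vx],  [Vy],  [Vz],       # velocities
              [0.0], [0.0], [0.0]])     # Bx, By, Bz acceleration biases
```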
Preferably, the current-time estimate I(k|k-1) is obtained as follows:
The current-time estimate I(k|k-1) is calculated according to the first formula:
[I(k|k-1) - I(k-1|k-1)]/dt = A·I(k-1|k-1) + I(k-1|k-1) + B·U(k-1);
where A is the UAV's system matrix, composed of a constant matrix and the UAV's direction cosine matrix; B is the UAV's control matrix; U is the vector formed by the acceleration samples along the x-, y-, and z-axes; dt is the state vector update period; and the direction cosine matrix is obtained from the UAV's IMU.
Preferably, the system matrix A is a 9*9 matrix whose rows and columns are indexed by the 9 factors of the state vector: Lx, Ly, Lz, Vx, Vy, Vz, Bx, By, and Bz.
Entries (1,4), (2,5), and (3,6) of A are the constant 1; the matrix block from (4,7) to (6,9) holds the corresponding parameter values of the UAV's direction cosine matrix; the values at all remaining rows and columns of A are 0.
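The layout of A described above can be sketched as follows. An identity direction cosine matrix stands in for a level attitude here; in flight the 3 × 3 block would come from the IMU:

```python
import numpy as np

def build_A(C):
    """Assemble the 9x9 system matrix A as described in the text.

    C is the 3x3 direction cosine matrix from the UAV's IMU.
    Entries (1,4), (2,5), (3,6) (1-indexed) are the constant 1, the
    block from (4,7) to (6,9) holds C, and everything else is 0.
    """
    A = np.zeros((9, 9))
    A[0, 3] = A[1, 4] = A[2, 5] = 1.0   # position rows couple to velocity
    A[3:6, 6:9] = C                      # velocity rows couple to biases via C
    return A

C = np.eye(3)   # level-attitude stand-in, an assumption for illustration
A = build_A(C)
```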
Preferably, the control matrix B is a 9*3 matrix whose rows are indexed by the 9 factors of the state vector and whose columns correspond to the acceleration samples U_ax, U_ay, and U_az along the x-, y-, and z-axes. Entries (4,1), (5,2), and (6,3) of B are the constant 1, and the values at all remaining rows and columns of B are 0.
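A short sketch of B and of the product B·U, showing that an acceleration sample contributes only to the three velocity rows of the state. The sample values are illustrative placeholders:

```python
import numpy as np

# 9x3 control matrix B: entries (4,1), (5,2), (6,3) (1-indexed) are 1,
# everything else is 0, per the text above.
B = np.zeros((9, 3))
B[3, 0] = B[4, 1] = B[5, 2] = 1.0   # acceleration drives the velocity rows

U = np.array([[0.1], [-0.2], [9.8]])  # example accelerometer sample (m/s^2)
BU = B @ U                            # nonzero only in the Vx, Vy, Vz rows
```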
Preferably, calculating the UAV's optimal state vector value I(k|k) at the current time from the current-time estimate I(k|k-1) and the horizontal velocity detected by the optical flow sensor at the current time specifically includes:
The UAV's optimal state vector value I(k|k) at the current time is calculated according to the second formula:
I(k|k) = I(k|k-1) + Kg(k)·(Z(k) - H·I(k|k-1))
where Z(k) is the actual measurement matrix, formed by the horizontal velocity collected by the optical flow sensor at the current time; H is the system's measurement matrix, a constant matrix; and Kg(k) is the Kalman gain factor.
Obtaining a suitable Kg(k) by adjustment yields the optimal state vector estimate at the current time.
Preferably, the measurement matrix H is a 2*9 matrix whose rows correspond to Vx and Vy and whose columns are indexed by the 9 factors of the state vector. Entries (1,4) and (2,5) of H are 1, and the values at all remaining rows and columns are 0.
Preferably, the Kalman gain factor Kg(k) is a 9*2 matrix.
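The shapes above can be checked with a small sketch: H simply selects Vx and Vy out of the 9-element state, and Kg maps the 2-element measurement residual back into the state. The zero-initialized gain is a placeholder; its values would be found by tuning:

```python
import numpy as np

# 2x9 measurement matrix: H(1,4) and H(2,5) (1-indexed) are 1, rest 0.
H = np.zeros((2, 9))
H[0, 3] = 1.0   # row 1 picks out Vx (state factor 4)
H[1, 4] = 1.0   # row 2 picks out Vy (state factor 5)

Kg = np.zeros((9, 2))                   # 9x2 Kalman gain, values from tuning

I_pred = np.arange(9.0).reshape(9, 1)   # dummy predicted state 0..8
z_pred = H @ I_pred                     # predicted measurement (Vx, Vy)
```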
In a second aspect, the present invention also provides a UAV fixed-point implementation apparatus based on an optical flow sensor and an accelerometer. The apparatus includes a position controller, a velocity controller, a thrust-vector-to-target-attitude-angle conversion module, a Kalman filter, an optical flow sensor, and an accelerometer. Specifically:
The optical flow sensor and the accelerometer send the sensor data they each collect to the Kalman filter;
The Kalman filter calculates the UAV's optimal state vector value I(k|k) at the current time from the stored current-time estimate I(k|k-1) and the horizontal velocity detected by the optical flow sensor at the current time;
The position controller, velocity controller, and thrust-vector-to-target-attitude-angle conversion module are connected in sequence, and the adjustment control ports of the position controller and velocity controller are connected to the first and second control output ports of the Kalman filter, respectively;
The position controller receives the optimal position parameter sent by the first control output port and the target position control instruction, and converts them into a velocity adjustment instruction;
The velocity controller receives the optimal velocity parameter sent by the second control output port and the velocity adjustment instruction from the position controller, and converts them into a control instruction for the thrust-vector-to-target-attitude-angle conversion module, thereby adjusting the target attitude.
The present invention obtains the UAV's horizontal velocity with an optical flow sensor based on image-processing techniques, and fuses the optical flow sensor data with the accelerometer data using Kalman filtering to obtain accurate UAV position and velocity estimates used to implement spot-hover control of the UAV. The cost is low, portability is good, the environmental requirements are modest, and the implementation works well.
【Brief description of the drawings】
To illustrate the technical solutions of the embodiments of the present invention more clearly, the accompanying drawings required by the embodiments are briefly described below. Evidently, the drawings described below show only some embodiments of the present invention; those of ordinary skill in the art can derive other drawings from them without creative effort.
Fig. 1 is a schematic flowchart of a UAV fixed-point implementation method based on an optical flow sensor and an accelerometer provided by an embodiment of the present invention;
Fig. 2 is a schematic diagram of a state vector provided by an embodiment of the present invention;
Fig. 3 is a schematic diagram of a control matrix provided by an embodiment of the present invention;
Fig. 4 is a schematic diagram of an input matrix provided by an embodiment of the present invention;
Fig. 5 is a schematic diagram of a system matrix provided by an embodiment of the present invention;
Fig. 6 is a flowchart of a method for obtaining the optimal state vector estimate at the current time provided by an embodiment of the present invention;
Fig. 7 is a schematic diagram of an actual measurement matrix provided by an embodiment of the present invention;
Fig. 8 is a structural block diagram of a Kalman filter provided by an embodiment of the present invention;
Fig. 9 is a schematic diagram of a measurement matrix provided by an embodiment of the present invention;
Fig. 10 is a structural block diagram of a UAV spot-hover controller provided by an embodiment of the present invention.
【Embodiment】
To make the objectives, technical solutions, and advantages of the present invention clearer, the present invention is further described below with reference to the drawings and embodiments. It should be understood that the specific embodiments described here merely illustrate the present invention and do not limit it.
In the description of the invention, terms such as "inner", "outer", "longitudinal", "transverse", "upper", "lower", "top", and "bottom" indicate orientations or positional relationships based on those shown in the drawings. They are used only to ease the description of the present invention, do not require the present invention to be constructed and operated in a particular orientation, and should not be construed as limiting the present invention.
In addition, the technical features involved in the embodiments of the invention described below may be combined with each other as long as they do not conflict.
Embodiment 1:
Embodiment 1 of the present invention provides a UAV fixed-point implementation method based on an optical flow sensor and an accelerometer. As shown in Fig. 1, the method includes:
In step 201, according to the UAV's horizontal motion velocities Vx and Vy relative to the ground obtained by the optical flow sensor, and the accelerations Ax, Ay, Az of the UAV along the x-, y-, and z-axes of the body coordinate system obtained by the accelerometer, the UAV's current x-direction position, y-direction position, and z-direction position, and the z-direction velocity Vz, are calculated.
Specifically, the CMOS camera of the optical flow sensor continuously photographs the ground beneath the UAV; image-processing techniques such as image-matching and optical-flow algorithms process the ground pictures, and combined with ultrasonic height data, the UAV's horizontal velocities Vx and Vy relative to the ground are obtained.
Integrating the velocity yields the UAV's horizontal position relative to the ground. The accelerometer in the IMU provides the accelerations Ax, Ay, Az of the UAV along the x-, y-, and z-axes of the body coordinate system. Integrating the acceleration yields the UAV's velocity in the three directions, and integrating the velocity further yields the UAV's position relative to the ground. However, because the optical flow sensor and accelerometer measurements contain noise, the UAV position obtained after integration has poor accuracy and generally cannot be used for hover control or other practical applications. In the embodiments of the present invention, the x-direction position Lx, y-direction position Ly, z-direction position Lz, x-direction velocity Vx, y-direction velocity Vy, and z-direction velocity Vz calculated in step 201 therefore serve only as the initial state vector I used by the subsequent optimal-state-vector calculation.
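The velocity-to-position integration described above can be sketched with simple Euler integration. The update rate and velocity samples are invented for illustration; a real run would use the noisy optical-flow stream, which is exactly why the result is only an initial estimate:

```python
# Euler-integrate the optical-flow horizontal velocity into a raw
# (pre-filter) position estimate Lx, as described in the text.
dt = 0.02                                # assumed 50 Hz update period
vx_samples = [0.10, 0.12, 0.11, 0.09]    # m/s, illustrative readings
Lx = 0.0
for vx in vx_samples:
    Lx += vx * dt                        # position accumulates v * dt
```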
In step 202, the state vector I is generated from the x-direction position Lx, y-direction position Ly, z-direction position Lz, x-direction velocity Vx, y-direction velocity Vy, z-direction velocity Vz, and the acceleration biases along the x-, y-, and z-axes.
In step 203, according to the optimal estimate I(k-1|k-1) of the state vector at the previous time and the accelerations Ax, Ay, and Az collected at the current time, the current-time estimate I(k|k-1) of the state vector under the influence of the previous time is calculated, where k denotes the current time in the sequence and k-1 the previous time.
In step 204, according to the current-time estimate I(k|k-1) and the horizontal velocity detected by the optical flow sensor at the current time, the UAV's optimal state vector value I(k|k) at the current time is calculated.
In step 205, the distance factor and/or velocity factor are extracted from the optimal state vector I(k|k), and the generated control instructions are sent respectively to the UAV's position controller and/or velocity controller, so that the position controller and/or velocity controller make corresponding adjustments.
The embodiment of the present invention obtains the UAV's horizontal velocity with an optical flow sensor based on image-processing techniques, and fuses the optical flow sensor data with the accelerometer data using Kalman filtering to obtain accurate UAV position and velocity estimates used to implement spot-hover control of the UAV. The cost is low, portability is good, the environmental requirements are modest, and the implementation works well.
In the embodiment of the present invention, the state vector I is a 9 × 1 matrix, as shown in Fig. 2, specifically: (Lx, Ly, Lz, Vx, Vy, Vz, Bx, By, Bz), where Lx, Ly, Lz are the UAV's positions along the x-, y-, and z-axes in the inertial coordinate system, Vx, Vy, and Vz are the UAV's velocities relative to the inertial coordinate system, and Bx, By, and Bz are the acceleration biases. Vz is obtained by integrating the accelerometer reading Az, and the initial parameter values of Lx, Ly, and Lz are 0.
In the embodiment of the present invention, a preferred implementation for obtaining the current-time estimate I(k|k-1) is as follows:
The current-time estimate I(k|k-1) is calculated according to the first formula (1):
[I(k|k-1) - I(k-1|k-1)]/dt = A·I(k-1|k-1) + I(k-1|k-1) + B·U(k-1)    (1)
where A is the system matrix of the UAV, composed of a constant matrix and the UAV's direction cosine matrix; B is the control matrix of the UAV (Fig. 3 shows a typical control matrix); U is the vector formed by the acceleration samples along the x-, y-, and z-axes (it may also be regarded as a 1-dimensional matrix participating in the matrix computation; Fig. 4 shows a typical U matrix); dt is the state vector update period; and the direction cosine matrix is obtained from the UAV's IMU. Typically, an IMU contains three single-axis accelerometers and three single-axis gyros: the accelerometers detect the acceleration signals of the object along the three independent axes of the carrier coordinate system, the gyros detect the angular rate of the carrier relative to the navigation coordinate system, and from the measured three-dimensional angular rate and acceleration the attitude of the object is computed.
In the above preferred scheme of the embodiment of the present invention, as shown in Fig. 5, the system matrix A is a 9*9 matrix whose rows and columns are indexed by the 9 factors of the state vector: Lx, Ly, Lz, Vx, Vy, Vz, Bx, By, and Bz. Entries (1,4), (2,5), and (3,6) of A are the constant 1; the matrix block from (4,7) to (6,9) holds the corresponding parameter values of the UAV's direction cosine matrix; the values at all remaining rows and columns of A are 0.
In the above preferred scheme of the embodiment of the present invention, as shown in Fig. 3, the control matrix B is a 9*3 matrix whose rows are indexed by the 9 factors of the state vector and whose columns correspond to the acceleration samples U_ax, U_ay, and U_az along the x-, y-, and z-axes. Entries (4,1), (5,2), and (6,3) of B are the constant 1, and the values at all remaining rows and columns of B are 0.
In the embodiment of the present invention, for the content of step 203, namely calculating the UAV's optimal state vector value I(k|k) at the current time from the current-time estimate I(k|k-1) and the horizontal velocity detected by the optical flow sensor at the current time, a preferred implementation exists, as shown in Fig. 6, specifically including:
In step 2031, the UAV's optimal state vector value I(k|k) at the current time is calculated according to the second formula (2):
I(k|k) = I(k|k-1) + Kg(k)·(Z(k) - H·I(k|k-1))    (2)
where Z(k) is the actual measurement matrix, formed by the horizontal velocity collected by the optical flow sensor at the current time, as shown in Fig. 7; H is the system's measurement matrix, a constant matrix; and Kg(k) is the Kalman gain factor.
In step 2032, a suitable Kg(k) obtained by adjustment yields the optimal state vector estimate at the current time. The structure of the Kalman filter corresponding to formula (2) is shown in Fig. 8.
In the above preferred implementation, the measurement matrix H is a 2*9 matrix, as shown in Fig. 9, whose rows correspond to Vx and Vy and whose columns are indexed by the 9 factors of the state vector. Entries (1,4) and (2,5) of H are 1, and the values at all remaining rows and columns are 0. Also in the above preferred implementation, the Kalman gain factor Kg(k) is a 9*2 matrix.
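The measurement update of formula (2) can be sketched as follows. The gain values and the predicted state are illustrative placeholders (a real Kg would come from the Kalman gain recursion or from tuning, as the text describes); with a gain of 0.5 on the velocity rows, the estimate is pulled halfway toward the optical-flow measurement:

```python
import numpy as np

def update(I_pred, Kg, Z, H):
    """Measurement update per formula (2):
    I(k|k) = I(k|k-1) + Kg(k) (Z(k) - H I(k|k-1))."""
    return I_pred + Kg @ (Z - H @ I_pred)

H = np.zeros((2, 9)); H[0, 3] = H[1, 4] = 1.0      # selects Vx, Vy
Kg = np.zeros((9, 2)); Kg[3, 0] = Kg[4, 1] = 0.5   # illustrative gain
I_pred = np.zeros((9, 1)); I_pred[3, 0] = 0.40     # predicted Vx
Z = np.array([[0.60], [0.00]])                     # optical-flow measurement
I_opt = update(I_pred, Kg, Z, H)                   # Vx moves toward 0.60
```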
Embodiment 2:
In addition to the UAV fixed-point implementation method based on an optical flow sensor and an accelerometer described in Embodiment 1, the embodiment of the present invention also provides a UAV fixed-point implementation apparatus based on an optical flow sensor and an accelerometer, which can be used to run the method described in Embodiment 1. As shown in Fig. 10, the apparatus includes a position controller 1, a velocity controller 2, a thrust-vector-to-target-attitude-angle conversion module 3, a Kalman filter 4, an optical flow sensor 5, and an accelerometer 6. Specifically:
The optical flow sensor and the accelerometer send the sensor data they each collect to the Kalman filter;
The Kalman filter calculates the UAV's optimal state vector value I(k|k) at the current time from the stored current-time estimate I(k|k-1) and the horizontal velocity detected by the optical flow sensor at the current time;
The position controller, velocity controller, and thrust-vector-to-target-attitude-angle conversion module are connected in sequence, and the adjustment control ports of the position controller and velocity controller are connected to the first and second control output ports of the Kalman filter, respectively;
The position controller receives the optimal position parameter sent by the first control output port and the target position control instruction, and converts them into a velocity adjustment instruction;
The velocity controller receives the optimal velocity parameter sent by the second control output port and the velocity adjustment instruction from the position controller, and converts them into a control instruction for the thrust-vector-to-target-attitude-angle conversion module, thereby adjusting the target attitude.
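The cascade above (position controller feeding the velocity controller, which feeds the thrust-vector-to-target-attitude-angle conversion module) can be sketched with simple proportional laws. The proportional form and gains are assumptions for illustration; the patent does not specify the controllers' internals:

```python
def position_controller(L_opt, L_target, kp=1.0):
    """Turn a position error into a velocity setpoint (assumed P law)."""
    return kp * (L_target - L_opt)

def velocity_controller(V_opt, V_setpoint, kv=2.0):
    """Turn a velocity error into the command passed to the
    thrust-vector-to-target-attitude-angle module (assumed P law)."""
    return kv * (V_setpoint - V_opt)

# Optimal position/velocity factors would come from the Kalman filter;
# here they are placeholder numbers.
v_cmd = position_controller(L_opt=0.1, L_target=0.0)   # drift of 0.1 m
a_cmd = velocity_controller(V_opt=0.05, V_setpoint=v_cmd)
```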
The embodiment of the present invention obtains the UAV's horizontal velocity with an optical flow sensor based on image-processing techniques, and fuses the optical flow sensor data with the accelerometer data using Kalman filtering to obtain accurate UAV position and velocity estimates used to implement spot-hover control of the UAV. The cost is low, portability is good, the environmental requirements are modest, and the implementation works well.
It is worth noting that the information exchange and execution processes between the modules and units of the above apparatus are based on the same concept as the method embodiments of the present invention; for details, refer to the description in the method embodiments, which is not repeated here.
Those of ordinary skill in the art will appreciate that all or part of the steps of the various methods of the embodiments may be completed by a program instructing the relevant hardware. The program may be stored in a computer-readable storage medium, which may include: read-only memory (ROM), random access memory (RAM), a magnetic disk, an optical disc, and the like.
The foregoing describes merely preferred embodiments of the present invention and is not intended to limit the present invention; any modification, equivalent substitution, or improvement made within the spirit and principles of the present invention shall be included in the scope of protection of the present invention.
Claims (9)
- 1. A UAV fixed-point implementation method based on an optical flow sensor and an accelerometer, characterized by including: according to the UAV's horizontal motion velocities Vx and Vy relative to the inertial coordinate system obtained by the optical flow sensor, and the accelerations Ax, Ay, Az of the UAV along the x-, y-, and z-axes of the body coordinate system obtained by the accelerometer, calculating the UAV's current x-direction position Lx, y-direction position Ly, z-direction position Lz, and z-direction velocity Vz; generating a state vector I from the x-, y-, and z-direction positions, the x-direction velocity Vx, y-direction velocity Vy, and z-direction velocity Vz, and the acceleration biases along the x-, y-, and z-axes; according to the optimal estimate I(k-1|k-1) of the state vector at the previous time and the accelerations Ax, Ay, and Az collected at the current time, calculating the current-time estimate I(k|k-1) of the state vector under the influence of the previous time, where k denotes the current time in the sequence and k-1 the previous time; according to the current-time estimate I(k|k-1) and the horizontal velocity detected by the optical flow sensor at the current time, calculating the UAV's optimal state vector value I(k|k) at the current time; extracting the distance factor and/or velocity factor from the optimal state vector I(k|k) and sending the generated control instructions respectively to the UAV's position controller and/or velocity controller, so that the position controller and/or velocity controller make corresponding adjustments.
- 2. The indoor UAV fixed-point implementation method based on an optical flow sensor and an acceleration sensor according to claim 1, characterized in that the state vector I is a 9×1 matrix, specifically (Lx, Ly, Lz, Vx, Vy, Vz, Bx, By, Bz), where Lx, Ly and Lz are the positions of the UAV along the x-, y- and z-axes of the inertial coordinate system, Vx, Vy and Vz are the velocities of the UAV relative to the inertial coordinate system, and Bx, By and Bz are the acceleration biases; Vz is obtained by integrating the acceleration sensor reading Az, and the initial values of Lx, Ly and Lz are 0.
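For illustration only — the claim fixes the layout of the state vector but prescribes no code — a minimal NumPy sketch of assembling the 9×1 state of claim 2, with Vz obtained by integrating the z-axis accelerometer reading; the function names and sample values are assumptions, not part of the patent:

```python
import numpy as np

def make_state(vx, vy, vz, lx=0.0, ly=0.0, lz=0.0, bias=(0.0, 0.0, 0.0)):
    """Assemble the 9x1 state vector (Lx, Ly, Lz, Vx, Vy, Vz, Bx, By, Bz);
    the initial positions default to 0, as claim 2 states."""
    bx, by, bz = bias
    return np.array([[lx], [ly], [lz], [vx], [vy], [vz], [bx], [by], [bz]])

def integrate_vz(vz_prev, az, dt):
    """Vz is obtained by integrating the z-axis accelerometer reading Az."""
    return vz_prev + az * dt

# Horizontal velocities come from the optical flow sensor; Vz from integration.
I = make_state(vx=0.1, vy=-0.2, vz=integrate_vz(0.0, az=0.5, dt=0.02))
```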
- 3. The indoor UAV fixed-point implementation method based on an optical flow sensor and an acceleration sensor according to claim 1 or 2, characterized in that the predicted value I(k|k-1) at the current time step is obtained as follows: I(k|k-1) is computed according to a first formula, namely I(k|k-1) = I(k-1|k-1) + [A·I(k-1|k-1) + B·U(k-1)]·dt, where A is the system matrix of the UAV, composed of constant entries and the direction cosine matrix of the UAV; B is the control matrix of the UAV; U is the vector formed by the acceleration data collected along the x-, y- and z-axes; dt is the state vector update period; and the direction cosine matrix is obtained from the IMU of the UAV.
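As a non-authoritative sketch of the first formula of claim 3 (a forward-Euler prediction step), with toy A and B matrices standing in for the ones claims 4 and 5 specify:

```python
import numpy as np

def predict(I_prev, A, B, U, dt):
    """Prediction step of claim 3:
    I(k|k-1) = I(k-1|k-1) + (A @ I(k-1|k-1) + B @ U(k-1)) * dt."""
    return I_prev + (A @ I_prev + B @ U) * dt

# Toy example: only the Lx<-Vx coupling in A and the Vx<-ax entry in B are set.
A = np.zeros((9, 9)); A[0, 3] = 1.0
B = np.zeros((9, 3)); B[3, 0] = 1.0
I0 = np.zeros((9, 1)); I0[3, 0] = 2.0          # Vx = 2 m/s
U = np.array([[1.0], [0.0], [0.0]])            # ax = 1 m/s^2
I1 = predict(I0, A, B, U, dt=0.1)
```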
- 4. The indoor UAV fixed-point implementation method based on an optical flow sensor and an acceleration sensor according to claim 3, characterized in that the system matrix A is a 9×9 matrix whose rows and columns correspond to the 9 factors of the state vector, namely Lx, Ly, Lz, Vx, Vy, Vz, Bx, By and Bz; entries (1,4), (2,5) and (3,6) of the system matrix A are the constant 1; the block of the system matrix A from (4,7) to (6,9) holds the corresponding entries of the direction cosine matrix of the UAV; all remaining entries of the system matrix A are 0.
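For illustration only, a NumPy sketch of the A-matrix layout of claim 4; note the claim's indices are one-based, so entry (1,4) becomes `A[0, 3]` here, and the `dcm` argument stands in for the IMU-supplied direction cosine matrix:

```python
import numpy as np

def system_matrix(dcm):
    """Build the 9x9 system matrix A of claim 4.
    Entries (1,4), (2,5), (3,6) are 1 (position rate = velocity);
    the (4,7)..(6,9) block holds the direction cosine matrix;
    every other entry is 0."""
    A = np.zeros((9, 9))
    A[0, 3] = A[1, 4] = A[2, 5] = 1.0
    A[3:6, 6:9] = dcm  # 3x3 direction cosine matrix from the IMU
    return A
```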
- 5. The indoor UAV fixed-point implementation method based on an optical flow sensor and an acceleration sensor according to claim 3, characterized in that the control matrix B is a 9×3 matrix whose rows correspond to the 9 factors of the state vector and whose columns correspond to the acceleration data U_ax, U_ay and U_az collected along the x-, y- and z-axes; entries (4,1), (5,2) and (6,3) of the control matrix B are the constant 1, and all remaining entries of the control matrix B are 0.
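Likewise, a minimal sketch of the B-matrix layout of claim 5 (again zero-based indexing, so the claim's entry (4,1) becomes `B[3, 0]`):

```python
import numpy as np

def control_matrix():
    """Build the 9x3 control matrix B of claim 5: entries (4,1), (5,2), (6,3)
    are 1 (zero-based B[3,0], B[4,1], B[5,2]); all other entries are 0.
    B routes the measured accelerations U = (U_ax, U_ay, U_az) into the
    velocity components of the state."""
    B = np.zeros((9, 3))
    B[3, 0] = B[4, 1] = B[5, 2] = 1.0
    return B
```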
- 6. The indoor UAV fixed-point implementation method based on an optical flow sensor and an acceleration sensor according to claim 3, characterized in that computing the optimal state vector I(k|k) of the UAV at the current time step from the predicted value I(k|k-1) and the horizontal velocity detected by the optical flow sensor at the current time step specifically comprises: computing I(k|k) according to a second formula, namely I(k|k) = I(k|k-1) + Kg(k)·(Z(k) − H·I(k|k-1)), where Z(k) is the measurement matrix, formed from the horizontal velocities collected by the optical flow sensor at the current time step; H is the system measurement matrix, which is a constant matrix; and Kg(k) is the Kalman gain factor; by tuning Kg(k) appropriately, the optimal state vector estimate at the current time step is obtained.
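A non-authoritative sketch of the second formula of claim 6; the gain `Kg` used in the example is an arbitrary placeholder (the patent leaves its tuning open), not a computed Kalman gain:

```python
import numpy as np

def update(I_pred, Kg, H, z):
    """Measurement update of claim 6:
    I(k|k) = I(k|k-1) + Kg(k) @ (Z(k) - H @ I(k|k-1)),
    where z holds the horizontal velocities (Vx, Vy) measured by the
    optical flow sensor and H is the 2x9 measurement matrix."""
    return I_pred + Kg @ (z - H @ I_pred)
```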
- 7. The indoor UAV fixed-point implementation method based on an optical flow sensor and an acceleration sensor according to claim 6, characterized in that the measurement matrix H is a 2×9 matrix whose rows correspond to Vx and Vy and whose columns correspond to the 9 factors of the state vector; entries (1,4) and (2,5) of the measurement matrix H are 1, and all remaining entries of H are 0.
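For illustration, the H layout of claim 7 as a NumPy sketch (zero-based, so (1,4) becomes `H[0, 3]`); applied to a state vector, it simply selects the Vx and Vy components:

```python
import numpy as np

def measurement_matrix():
    """The 2x9 measurement matrix H of claim 7: the two rows select Vx and Vy
    from the state, i.e. entries (1,4) and (2,5) are 1 (zero-based H[0,3]
    and H[1,4]); every other entry is 0."""
    H = np.zeros((2, 9))
    H[0, 3] = H[1, 4] = 1.0
    return H
```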
- 8. The indoor UAV fixed-point implementation method based on an optical flow sensor and an acceleration sensor according to claim 6, characterized in that the Kalman gain factor Kg(k) is a 9×2 matrix.
- 9. a kind of unmanned plane fixed-point implementation device based on light stream sensor and acceleration transducer, it is characterised in that realize dress Put and passed including positioner, speed control, thrust vectoring conversion targeted attitude angle module, Kalman filter, light stream Sensor and acceleration transducer, specifically:The light stream sensor and acceleration transducer are used to the sensing data each collected being sent to Kalman's filter Ripple device;The Kalman filter is used for according to the assessed value I at current time of storage (k | k-1), current time light stream sensor The horizontal velocity detected, unmanned plane is calculated at current time optimum state vector value I (k | k);The positioner, speed control and thrust vectoring conversion targeted attitude angle module are sequentially connected, also, described The adjustment control port of positioner and speed control the first control output end mouth with the Kalman filter respectively It is connected with the second control output end mouth;The positioner, optimal location parameter and target location control for receiving the first control output end mouth transmission refer to Order, and it is translated into speed adjust instruction;The speed control, for receiving the optimal velocity parameter of the second control output end mouth transmission and the speed of positioner Adjust instruction is spent, and is translated into the control instruction of thrust vectoring conversion targeted attitude angle module, so as to realize target appearance The adjustment of state.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710539335.4A CN107389968B (en) | 2017-07-04 | 2017-07-04 | Unmanned aerial vehicle fixed point implementation method and device based on optical flow sensor and acceleration sensor |
Publications (2)
Publication Number | Publication Date |
---|---|
CN107389968A true CN107389968A (en) | 2017-11-24 |
CN107389968B CN107389968B (en) | 2020-01-24 |
Family
ID=60335108
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710539335.4A Active CN107389968B (en) | 2017-07-04 | 2017-07-04 | Unmanned aerial vehicle fixed point implementation method and device based on optical flow sensor and acceleration sensor |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN107389968B (en) |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108196582A (en) * | 2018-02-12 | 2018-06-22 | 深圳技术大学(筹) | A kind of indoor Visual Navigation unmanned plane cluster flight control system and method |
CN109634297A (en) * | 2018-12-18 | 2019-04-16 | 辽宁壮龙无人机科技有限公司 | A kind of multi-rotor unmanned aerial vehicle and control method based on light stream sensor location navigation |
CN110442142A (en) * | 2018-05-02 | 2019-11-12 | 北京京东尚科信息技术有限公司 | Speed data processing method, device, electronic equipment and computer-readable medium |
CN110737212A (en) * | 2018-07-18 | 2020-01-31 | 华为技术有限公司 | Unmanned aerial vehicle control system and method |
CN113109830A (en) * | 2021-03-29 | 2021-07-13 | 桂林电子科技大学 | Three-dimensional motion measurement method adopting optical flow and distance measurement sensor |
CN115826602A (en) * | 2022-11-17 | 2023-03-21 | 众芯汉创(北京)科技有限公司 | System and method for managing dynamic and accurate positioning of flight based on unmanned aerial vehicle |
CN115963851A (en) * | 2021-10-13 | 2023-04-14 | 北京三快在线科技有限公司 | Unmanned aerial vehicle positioning method and device |
CN117739972A (en) * | 2024-02-18 | 2024-03-22 | 中国民用航空飞行学院 | Unmanned aerial vehicle approach stage positioning method without global satellite positioning system |
Citations (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20040246463A1 (en) * | 2003-01-29 | 2004-12-09 | Milinusic Tomislav F. | Method and apparatus for optical inertial measurement |
CN102928846A (en) * | 2012-10-24 | 2013-02-13 | 华南理工大学 | Extreme-low-altitude laser radar digital terrain mapping system and extreme-low-altitude laser radar digital terrain mapping method of small-sized unmanned helicopter |
CN103365297A (en) * | 2013-06-29 | 2013-10-23 | 天津大学 | Optical flow-based four-rotor unmanned aerial vehicle flight control method |
CN103365295A (en) * | 2013-06-29 | 2013-10-23 | 天津大学 | DSP (Digital Signal Processor)-based quad-rotor unmanned aerial vehicle autonomous hover control system and method |
CN103853156A (en) * | 2014-02-07 | 2014-06-11 | 中山大学 | Small four-rotor aircraft control system and method based on airborne sensor |
CN103868521A (en) * | 2014-02-20 | 2014-06-18 | 天津大学 | Autonomous quadrotor unmanned aerial vehicle positioning and controlling method based on laser radar |
CN104062977A (en) * | 2014-06-17 | 2014-09-24 | 天津大学 | Full-autonomous flight control method for quadrotor unmanned aerial vehicle based on vision SLAM |
CN104236548A (en) * | 2014-09-12 | 2014-12-24 | 清华大学 | Indoor autonomous navigation method for micro unmanned aerial vehicle |
CN104501814A (en) * | 2014-12-12 | 2015-04-08 | 浙江大学 | Attitude and position estimation method based on vision and inertia information |
CN104567799A (en) * | 2014-11-28 | 2015-04-29 | 天津大学 | Multi-sensor information fusion-based method for measuring height of small unmanned gyroplane |
CN104808231A (en) * | 2015-03-10 | 2015-07-29 | 天津大学 | Unmanned aerial vehicle positioning method based on GPS and optical flow sensor data fusion |
CN105094138A (en) * | 2015-07-15 | 2015-11-25 | 东北农业大学 | Low-altitude autonomous navigation system for rotary-wing unmanned plane |
CN105352495A (en) * | 2015-11-17 | 2016-02-24 | 天津大学 | Unmanned-plane horizontal-speed control method based on fusion of data of acceleration sensor and optical-flow sensor |
CN105737833A (en) * | 2016-05-13 | 2016-07-06 | 上海会志信息科技有限公司 | Indoor navigation method and indoor navigation device |
CN106647784A (en) * | 2016-11-15 | 2017-05-10 | 天津大学 | Miniaturized unmanned aerial vehicle positioning and navigation method based on Beidou navigation system |
CN106708066A (en) * | 2015-12-20 | 2017-05-24 | 中国电子科技集团公司第二十研究所 | Autonomous landing method of unmanned aerial vehicle based on vision/inertial navigation |
Non-Patent Citations (5)
Title |
---|
MANKIBI M E et al.: "Hybrid ventilation control design and management", ASHRAE Transactions * |
姜成平: "Research on the Design and Implementation of a Quadrotor UAV Control System", China Excellent Master's Theses * |
曹美会 et al.: "Vision-Based Autonomous Positioning and Control System for a Quadrotor UAV", Information and Control * |
郑伟: "Research on Vision-Based Pose Estimation and Navigation of a Micro Quadrotor Flying Robot", China Excellent Master's Theses * |
鲜斌 et al.: "Vision-Based Autonomous Flight Control of a Small Quadrotor UAV", Journal of Mechanical Engineering * |
Also Published As
Publication number | Publication date |
---|---|
CN107389968B (en) | 2020-01-24 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN107389968A (en) | A kind of unmanned plane fixed-point implementation method and apparatus based on light stream sensor and acceleration transducer | |
CN111156998B (en) | Mobile robot positioning method based on RGB-D camera and IMU information fusion | |
US7395181B2 (en) | Motion tracking system | |
García Carrillo et al. | Combining stereo vision and inertial navigation system for a quad-rotor UAV | |
Wang et al. | A simple and parallel algorithm for real-time robot localization by fusing monocular vision and odometry/AHRS sensors | |
Fan et al. | Data fusion for indoor mobile robot positioning based on tightly coupled INS/UWB | |
Shen et al. | Optical flow sensor/INS/magnetometer integrated navigation system for MAV in GPS-denied environment | |
EP2029970B1 (en) | Beacon-augmented pose estimation | |
CN109540126A (en) | A kind of inertia visual combination air navigation aid based on optical flow method | |
CN110058602A (en) | Multi-rotor unmanned aerial vehicle autonomic positioning method based on deep vision | |
WO2019071916A1 (en) | Antenna beam attitude control method and system | |
CN109141433A (en) | A kind of robot indoor locating system and localization method | |
CN107314718A (en) | High speed rotating missile Attitude estimation method based on magnetic survey rolling angular rate information | |
CN108253963A (en) | A kind of robot active disturbance rejection localization method and alignment system based on Multi-sensor Fusion | |
CN111238469A (en) | Unmanned aerial vehicle formation relative navigation method based on inertia/data chain | |
Amidi et al. | Research on an autonomous vision-guided helicopter | |
CN116952229A (en) | Unmanned aerial vehicle positioning method, device, system and storage medium | |
CN110285811A (en) | The fusion and positioning method and device of satellite positioning and inertial navigation | |
CN113237485A (en) | SLAM method and system based on multi-sensor fusion | |
Mung et al. | Target State Estimation for UAV's Target Tracking and Precision Landing Control: Algorithm and Verification System | |
He et al. | Rotational coordinate transformation for visual-inertial sensor fusion | |
Taylor | Fusion of inertial, vision, and air pressure sensors for MAV navigation | |
Troiani et al. | A 3 points vision based approach for MAV localization in GPS denied environments | |
Zhang et al. | Sensor-fusion-based Trajectory Reconstruction for Mobile Devices. | |
Qin | Exploring Indoor Navigation Based on 9-dof Kalman Filter and SLAM Algorithm |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||