CN107748569A - Motion control method and device for a UAV, and UAV system - Google Patents
- Publication number
- CN107748569A CN107748569A CN201710785660.9A CN201710785660A CN107748569A CN 107748569 A CN107748569 A CN 107748569A CN 201710785660 A CN201710785660 A CN 201710785660A CN 107748569 A CN107748569 A CN 107748569A
- Authority
- CN
- China
- Prior art keywords
- key frame
- UAV
- image
- attitude information
- current frame
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/10—Simultaneous control of position or course in three dimensions
- G05D1/101—Simultaneous control of position or course in three dimensions specially adapted for aircraft
Landscapes
- Engineering & Computer Science (AREA)
- Aviation & Aerospace Engineering (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Automation & Control Theory (AREA)
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
- Image Analysis (AREA)
Abstract
The invention discloses a motion control method and device for a UAV, and a UAV system. The method includes: acquiring images captured by a monocular camera of the UAV; selecting key frames from the images, wherein the first frame is selected as a key frame, and each current frame after the first frame is compared with the previous key frame — if the change in the pixel coordinates of at least one pair of corresponding image points between the current frame and the previous key frame exceeds a set threshold, the current frame is selected as a key frame, otherwise it is a non-key frame; acquiring inertial data collected by an inertial measurement unit of the UAV; computing a first displacement and attitude of the UAV from the selected key frames; computing a second attitude of the UAV from the inertial data; and fusing the first displacement and attitude with the second attitude to perform simultaneous localization and mapping for the UAV.
Description
Technical field
The present invention relates to the field of motion tracking technology, and more particularly to a motion control method for a UAV, a motion control device for a UAV, and a UAV system.
Background technology
Because a UAV's flight environment is relatively complex, no single sensor can currently handle all the disturbances encountered in flight and provide accurate state estimation. For example, real-time simultaneous localization and mapping (SLAM) based on monocular vision cannot recover absolute scale or sense the direction of gravity, and the image information is vulnerable to interference from illumination changes, occlusion, and fast motion. It has therefore been proposed to introduce inertial data collected by an inertial measurement unit (IMU) into monocular visual SLAM, fusing vision and inertia to recover scale information and improve the robustness of the system. Although such visual-inertial fusion achieves higher accuracy, it also involves a large amount of computation, which can affect real-time performance. There is therefore a strong need for an approach that reduces the amount of computation and thereby improves the real-time performance of motion control based on visual-inertial fusion.
Summary of the invention
An object of embodiments of the present invention is to provide a new technical scheme for visual-inertial fusion that improves the real-time performance of motion control.
According to a first aspect of the present invention, there is provided a motion control method for a UAV, comprising:
acquiring images captured by a monocular camera of the UAV;
selecting key frames from the images, wherein the first frame is selected as a key frame, and each current frame after the first frame is compared with the previous key frame; if the change in the pixel coordinates of at least one pair of corresponding image points between the current frame and the previous key frame exceeds a set threshold, the current frame is selected as a key frame, otherwise the current frame is determined to be a non-key frame;
acquiring inertial data collected by an inertial measurement unit of the UAV;
computing a first displacement and attitude of the UAV from the selected key frames;
computing a second attitude of the UAV from the inertial data; and
fusing the first displacement and attitude with the second attitude to perform simultaneous localization and mapping for the UAV.
Optionally, comparing the current frame after the first frame with the previous key frame includes:
extracting feature points of the current frame;
obtaining feature points of the previous key frame;
matching the feature points of the current frame with those of the previous key frame to obtain pairs of corresponding image points; and
if the change in the pixel coordinates of at least one pair of corresponding image points exceeds the set threshold, selecting the current frame as a key frame, otherwise determining it to be a non-key frame.
Optionally, determining whether the current frame is a key frame according to the change in the pixel coordinates includes: if the change in at least one coordinate component of at least one pair of corresponding image points exceeds the set threshold, the current frame is a key frame; otherwise it is a non-key frame.
Optionally, fusing the first displacement and attitude with the second attitude to perform simultaneous localization and mapping for the UAV includes:
fusing the first displacement and attitude with the second attitude by weighted averaging to obtain an optimized pose estimate;
performing pose-graph optimization based on the optimized pose estimate; and
stitching point clouds to build a map based on the pose-graph optimization.
According to a second aspect of the present invention, there is also provided a motion control device for a UAV, comprising:
an image acquisition module for acquiring images captured by a monocular camera of the UAV;
a key frame selection module for selecting key frames from the images, wherein the first frame is selected as a key frame and each current frame after the first frame is compared with the previous key frame; if the change in the pixel coordinates of at least one pair of corresponding image points between the current frame and the previous key frame exceeds a set threshold, the current frame is selected as a key frame, otherwise the current frame is determined to be a non-key frame;
an inertial data acquisition module for acquiring inertial data collected by an inertial measurement unit of the UAV;
a first solving module for computing a first displacement and attitude of the UAV from the selected key frames;
a second solving module for computing a second attitude of the UAV from the inertial data; and
a motion control module for fusing the first displacement and attitude with the second attitude to perform simultaneous localization and mapping for the UAV.
Optionally, the key frame selection module includes:
an extraction unit for extracting feature points of the current frame;
an acquisition unit for obtaining feature points of the previous key frame;
a matching unit for matching the feature points of the current frame with those of the previous key frame to obtain pairs of corresponding image points; and
a selection unit for selecting the current frame as a key frame when the change in the pixel coordinates of at least one pair of corresponding image points exceeds the set threshold, and otherwise determining the current frame to be a non-key frame.
Optionally, the selection unit is configured to select the current frame as a key frame when the change in at least one coordinate component of at least one pair of corresponding image points exceeds the set threshold, and otherwise determine the current frame to be a non-key frame.
Optionally, the motion control module includes:
a pose estimation unit for fusing the first displacement and attitude with the second attitude by weighted averaging to obtain an optimized pose estimate;
a back-end optimization unit for performing pose-graph optimization based on the optimized pose estimate; and
a map building unit for stitching point clouds to build a map based on the pose-graph optimization.
According to a third aspect of the present invention, there is also provided a UAV system comprising the motion control device according to the second aspect.
According to a fourth aspect of the present invention, there is also provided a UAV system comprising a processor and a memory, the memory storing instructions that control the processor to perform the method according to the first aspect.
A beneficial effect of the present invention is that the method selects key frames according to the change in the pixel coordinates of corresponding image points, and performs visual-inertial fusion on the key frames to achieve real-time localization and mapping. This effectively reduces the amount of computation and helps improve the real-time performance of motion control.
Further features of the present invention and their advantages will become apparent from the following detailed description of exemplary embodiments with reference to the accompanying drawings.
Brief description of the drawings
The accompanying drawings, which are incorporated in and constitute a part of the specification, illustrate embodiments of the invention and, together with the description, serve to explain the principles of the invention.
Fig. 1 shows the structural composition of a UAV system according to an embodiment of the present invention;
Fig. 2 is a hardware block diagram of a UAV according to an embodiment of the present invention;
Fig. 3 is a hardware block diagram of a ground station according to an embodiment of the present invention;
Fig. 4 is a schematic flowchart of a motion control method according to an embodiment of the present invention;
Fig. 5 is a schematic block diagram of a motion control device according to an embodiment of the present invention;
Fig. 6 is a schematic block diagram of a UAV system according to an embodiment of the present invention.
Detailed description of embodiments
Various exemplary embodiments of the present invention will now be described in detail with reference to the accompanying drawings. It should be noted that, unless otherwise specified, the relative arrangement of components and steps, the numerical expressions, and the numerical values set forth in these embodiments do not limit the scope of the invention.
The following description of at least one exemplary embodiment is merely illustrative and is in no way intended to limit the invention, its application, or its uses.
Techniques, methods, and apparatus known to those of ordinary skill in the relevant art may not be discussed in detail, but where appropriate they should be considered part of the specification.
In all examples shown and discussed here, any specific value should be interpreted as merely exemplary rather than as a limitation; other examples of the exemplary embodiments may therefore have different values.
It should be noted that similar reference numerals and letters denote similar items in the following drawings; once an item is defined in one drawing, it need not be further discussed in subsequent drawings.
<UAV system architecture>
Fig. 1 shows the structural composition of a UAV system according to an embodiment of the present invention.
As shown in Fig. 1, the UAV system of this embodiment includes a UAV 1000 and a ground station 2000.
Fig. 2 shows the hardware configuration of the UAV 1000 according to an embodiment of the present invention.
As shown in Fig. 2, the UAV 1000 includes at least one processor 1100 and at least one memory 1200. The memory 1200 stores instructions that control the processor 1100 to perform the motion control method according to the present invention. Those skilled in the art can design such instructions according to the scheme disclosed herein. How instructions control a processor is well known in the art and is therefore not described in detail here.
The processor 1100 may be, for example, a central processing unit (CPU) or a microcontroller (MCU).
The memory 1200 includes, for example, ROM (read-only memory), RAM (random access memory), and non-volatile memory such as a hard disk.
The UAV according to an embodiment of the present invention may also include an inertial measurement unit (IMU) 1300, a communication device 1400, a camera device 1500, and a positioning device 1600.
The inertial measurement unit 1300 includes three single-axis accelerometers (or an integrated three-axis accelerometer) and three single-axis gyroscopes (or an integrated three-axis gyroscope). The accelerometers collect the UAV's acceleration signals on three axes, and the gyroscopes collect the UAV's angular velocity signals on three axes, so that the UAV's attitude can be calculated by measuring its angular velocity and acceleration in three-dimensional space.
In this embodiment of the invention, the camera device 1500 includes a monocular camera for capturing images.
The communication device 1400 may include a wireless communication device for communicatively connecting with the ground station 2000.
The positioning device 1600 locates the UAV to provide its position coordinates. The positioning device 1600 is, for example, a GPS device.
Fig. 3 shows the hardware configuration of the ground station 2000 according to an embodiment of the present invention.
As shown in Fig. 3, the ground station 2000 includes at least one processor 2100 and at least one memory 2200. The memory 2200 stores instructions that control the processor 2100 to perform the motion control method according to the present invention. Those skilled in the art can design such instructions according to the scheme disclosed herein. How instructions control a processor is well known in the art and is therefore not described in detail here.
The ground station 2000 according to an embodiment of the present invention may also include a communication device 2400, so that the UAV 1000 and the ground station 2000 can be communicatively connected through their respective communication devices 1400 and 2400 to transmit data, instructions, and so on.
The communication devices 1400 and 2400 are, for example, radio-frequency communication devices.
In addition, the ground station 2000 of this embodiment may also include an input device and a display device (not shown). The input device includes, for example, keys or a touch screen; the display device includes, for example, a display screen or a touch screen.
The UAV 1000 of this embodiment may implement the motion control method according to the present invention independently.
The UAV 1000 may also implement the motion control method together with the ground station 2000. For example, the UAV 1000 sends the captured images and inertial data to the ground station 2000, which performs the SLAM computation, derives a motion control strategy, and then sends the corresponding motion control signals to the UAV 1000 for execution.
<Method>
Fig. 4 is a schematic flowchart of a motion control method according to an embodiment of the present invention.
As shown in Fig. 4, the motion control method of the present invention may include the following steps:
Step S4010: acquire the images captured by the UAV's monocular camera.
The number of frames the monocular camera captures per second depends on its data frequency; taking a data frequency of 20 Hz as an example, the camera captures 20 frames per second.
Step S4020: select key frames from the acquired images.
In step S4020, the key frame selection method is as follows: select the first frame as a key frame; thereafter, each time a frame is acquired, compare it as the current frame with the previously selected key frame. If the change in the pixel coordinates of at least one pair of corresponding image points between the current frame and the previous key frame exceeds a set threshold, select the current frame as a key frame; otherwise determine that it is a non-key frame, which can be discarded to save memory.
According to this selection method, once the first frame is acquired, it is selected as a key frame. When the second frame is acquired, it is compared with the first frame: if the change in the pixel coordinates of at least one pair of corresponding image points between them exceeds the set threshold, the second frame is also selected as a key frame, otherwise it is a non-key frame — suppose here that the second frame is a non-key frame. When the third frame is acquired, it is compared with the previous key frame, i.e. the first frame: if the change in the pixel coordinates of at least one pair of corresponding image points between them exceeds the set threshold, the third frame is also selected as a key frame, otherwise it is a non-key frame — suppose here that the third frame is a key frame. When the fourth frame is acquired, it is compared with the previous key frame, i.e. the third frame: if the change in the pixel coordinates of at least one pair of corresponding image points between them exceeds the set threshold, the fourth frame is also selected as a key frame, otherwise it is a non-key frame — suppose here that the fourth frame is a non-key frame. And so on.
A pair of corresponding image points is a pair of pixels in two frames that represent the same physical feature.
Corresponding image points can be determined by extracting and matching feature points in the two frames.
A feature point is, for example, an edge feature or a corner feature.
Therefore, comparing the current frame after the first frame with the previous key frame can include:
Step S4021: extract the feature points of the current frame.
Step S4022: obtain the feature points of the previous key frame.
When the previous key frame was determined, its feature points also had to be extracted; they can therefore be stored in a data structure directly after extraction, and in step S4022 the feature points of the previous key frame are retrieved directly from that data structure.
Step S4023: match the feature points of the current frame with those of the previous key frame to obtain pairs of corresponding image points.
Step S4024: if the change in the pixel coordinates of at least one pair of corresponding image points exceeds the set threshold, select the current frame as a key frame; otherwise determine that it is a non-key frame.
The threshold can be set according to the required control accuracy: the smaller the threshold, the higher the accuracy; the larger the threshold, the lower the accuracy.
Because the pixel coordinates of an image point may be two-dimensional or three-dimensional (including a depth coordinate), step S4024 can further be: if the change in at least one coordinate component of at least one pair of corresponding image points exceeds the set threshold, select the current frame as a key frame; otherwise determine that it is a non-key frame.
Regarding step S4020: because adjacent frames are close together, using every frame to build the map would cause frequent map updates and consume computation time and storage. This step therefore quickly rejects redundant frames, improving computational efficiency and thus the real-time performance of control.
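As an illustration only (the patent itself gives no code), the selection logic of steps S4020–S4024 can be sketched in Python as below. Feature extraction and matching are abstracted away: each frame is represented by the pixel coordinates of its already-matched feature points, and all function and variable names are hypothetical, not from the patent:

```python
def is_key_frame(current_pts, key_frame_pts, threshold):
    """Decide whether the current frame is a key frame (step S4024).

    current_pts and key_frame_pts are lists of (x, y) pixel coordinates
    of corresponding image points, matched pairwise by index. The frame
    is a key frame if at least one coordinate component of at least one
    pair changes by more than the threshold.
    """
    for (x1, y1), (x2, y2) in zip(key_frame_pts, current_pts):
        if abs(x2 - x1) > threshold or abs(y2 - y1) > threshold:
            return True
    return False


def select_key_frames(frames, threshold):
    """Select key frames from a frame sequence (step S4020).

    frames is a list of point lists; the first frame is always a key
    frame, and every later frame is compared with the previous key
    frame. Returns the indices of the selected key frames.
    """
    key_indices = [0]                 # the first frame is a key frame
    key_pts = frames[0]
    for i, pts in enumerate(frames[1:], start=1):
        if is_key_frame(pts, key_pts, threshold):
            key_indices.append(i)     # promote to key frame
            key_pts = pts             # it becomes the new reference
        # non-key frames are simply dropped, saving memory
    return key_indices
```

With a threshold of, say, 5 pixels, a frame whose matched points have all moved less than 5 pixels in both coordinates since the previous key frame is discarded as redundant — exactly the rejection described above.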
Step S4030: acquire the inertial data collected by the UAV's inertial measurement unit.
Step S4040: compute the first displacement and attitude of the UAV from the selected key frames.
Step S4040 involves solving the relative motion (transformation) between adjacent key frames, so that observations made at different positions can be transformed into a global coordinate system, representing the UAV's own pose and the landmarks (feature points) in the environment under that global coordinate system.
Step S4040 can be decomposed into three processes: feature extraction, feature matching, and motion estimation. Because the embodiment of the present invention computes only on the selected key frames, it undoubtedly reduces the amount of computation compared with computing on every acquired frame.
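The patent does not prescribe a particular motion-estimation algorithm. As a simplified, hypothetical sketch of the third process, the following recovers a planar rotation and translation from matched point pairs using the closed-form least-squares (2D Kabsch) solution; a real monocular pipeline would instead estimate full 3D relative motion, e.g. via the essential matrix:

```python
import math

def estimate_rigid_motion_2d(pts_a, pts_b):
    """Least-squares 2D rotation and translation mapping pts_a onto pts_b.

    pts_a, pts_b: equal-length lists of matched (x, y) points.
    Returns (theta, tx, ty) such that rotating pts_a by theta about the
    origin and then translating by (tx, ty) best fits pts_b.
    """
    n = len(pts_a)
    # centroids of both point sets
    cax = sum(p[0] for p in pts_a) / n
    cay = sum(p[1] for p in pts_a) / n
    cbx = sum(p[0] for p in pts_b) / n
    cby = sum(p[1] for p in pts_b) / n
    # accumulate the 2x2 cross-covariance terms of the centered points
    sxx = sxy = syx = syy = 0.0
    for (ax, ay), (bx, by) in zip(pts_a, pts_b):
        dax, day = ax - cax, ay - cay
        dbx, dby = bx - cbx, by - cby
        sxx += dax * dbx
        sxy += dax * dby
        syx += day * dbx
        syy += day * dby
    # optimal rotation angle in closed form: atan2(sum of cross, sum of dot)
    theta = math.atan2(sxy - syx, sxx + syy)
    # translation aligns the rotated centroid of A with the centroid of B
    tx = cbx - (math.cos(theta) * cax - math.sin(theta) * cay)
    ty = cby - (math.sin(theta) * cax + math.cos(theta) * cay)
    return theta, tx, ty
```

Chaining such inter-key-frame transforms is what moves each local observation into the global coordinate system described above.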
Step S4050: compute the second attitude of the UAV from the inertial data.
Step S4050 can also further compute a second displacement of the UAV from the inertial data, forming second displacement and attitude information.
Step S4060: fuse the first displacement and attitude with the second attitude, and perform simultaneous localization and mapping for the UAV.
In an embodiment where second displacement and attitude information is obtained, step S4060 may also fuse the first displacement and attitude with the second displacement and attitude, performing simultaneous localization and mapping for the UAV and thereby realizing its motion control.
In step S4060, the attitude in the first displacement and attitude information can be corrected by the second attitude, improving the accuracy of localization and mapping.
The fusion algorithm between them can be, for example, a weighted averaging method or a Kalman filter.
Step S4060 may further include:
Step S4061: fuse the first displacement and attitude with the second attitude by weighted averaging to obtain an optimized pose estimate.
Step S4062: perform pose-graph optimization based on the optimized pose estimate, to reduce cumulative error.
Step S4063: stitch point clouds to build the map based on the pose-graph optimization.
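Step S4061's weighted averaging can be illustrated with a minimal, hypothetical sketch. The weights below are placeholders — the patent does not specify their values — and component-wise averaging of Euler angles is only a reasonable approximation when the two estimates are close:

```python
def fuse_attitude(visual_att, imu_att, w_visual=0.6, w_imu=0.4):
    """Weighted-average fusion of two attitude estimates (step S4061).

    visual_att: (roll, pitch, yaw) from the key-frame computation (S4040).
    imu_att:    (roll, pitch, yaw) from the inertial data (S4050).
    The weights must sum to 1; their values are a design choice that
    reflects the relative confidence in each source.
    """
    assert abs(w_visual + w_imu - 1.0) < 1e-12, "weights must sum to 1"
    return tuple(w_visual * v + w_imu * m for v, m in zip(visual_att, imu_att))
```

In practice the fusion would be performed on quaternions or rotation matrices rather than Euler angles to avoid angle-wrapping problems, and step S4062's pose-graph optimization and step S4063's point-cloud stitching would be handled by a back-end such as a nonlinear least-squares solver.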
As can be seen, the method according to an embodiment of the present invention selects key frames by the change in pixel coordinates. Compared with other existing selection methods, such as computing the norm of the transformation matrix between adjacent frames, it has the advantages of being real-time and effective: it greatly simplifies the computation and improves the real-time performance of control.
<Device>
Fig. 5 is a schematic block diagram of a motion control device according to an embodiment of the present invention.
As shown in Fig. 5, the motion control device of this embodiment includes an image acquisition module 5010, a key frame selection module 5020, an inertial data acquisition module 5030, a first solving module 5040, a second solving module 5050, and a motion control module 5060.
The image acquisition module 5010 acquires the images captured by the UAV's monocular camera.
The key frame selection module 5020 selects key frames from the images: the first frame is selected as a key frame, and each current frame after the first frame is compared with the previous key frame; if the change in the pixel coordinates of at least one pair of corresponding image points between the current frame and the previous key frame exceeds the set threshold, the current frame is selected as a key frame, otherwise it is a non-key frame.
The inertial data acquisition module 5030 acquires the inertial data collected by the UAV's inertial measurement unit.
The first solving module 5040 computes the first displacement and attitude of the UAV from the selected key frames.
The second solving module 5050 computes the second attitude of the UAV from the inertial data.
The motion control module 5060 fuses the first displacement and attitude with the second attitude to perform simultaneous localization and mapping (SLAM) for the UAV.
Further, the key frame selection module 5020 may include an extraction unit, an acquisition unit, a matching unit, and a selection unit (not shown). The extraction unit extracts the feature points of the current frame. The acquisition unit obtains the feature points of the previous key frame. The matching unit matches the feature points of the current frame with those of the previous key frame to obtain pairs of corresponding image points. The selection unit selects the current frame as a key frame when the change in the pixel coordinates of at least one pair of corresponding image points exceeds the set threshold, and otherwise determines the current frame to be a non-key frame.
Further, the selection unit may select the current frame as a key frame when the change in at least one coordinate component of at least one pair of corresponding image points exceeds the set threshold, and otherwise determine the current frame to be a non-key frame.
Further, the motion control module 5060 may include a pose estimation unit, a back-end optimization unit, and a map building unit. The pose estimation unit fuses the first displacement and attitude with the second attitude by weighted averaging to obtain an optimized pose estimate; the back-end optimization unit performs pose-graph optimization based on the optimized pose estimate; and the map building unit stitches point clouds to build the map based on the pose-graph optimization.
<UAS>
Fig. 6 is a schematic block diagram of an unmanned aerial vehicle system according to an embodiment of the present invention.
As shown in Fig. 6, the unmanned aerial vehicle system of the embodiment of the present invention includes the motion control device 5000 according to any embodiment of the present invention.
The motion control device 5000 may be arranged entirely on the UAV 1000, or some of its modules may be arranged on the UAV 1000 while the other modules are arranged in the ground station 2000.
The embodiments in this specification are described in a progressive manner: identical or similar parts of the embodiments may be cross-referenced, and each embodiment focuses on its differences from the others. Those skilled in the art should understand, however, that the embodiments described above may be used alone or combined with each other as needed. In addition, since the device embodiments correspond to the method embodiments, they are described relatively briefly; for related details, refer to the description of the corresponding parts of the method embodiments. The device embodiments described above are merely schematic, and the modules described as separate components may or may not be physically separate.
The present invention may be a device, a method, and/or a computer program product. The computer program product may include a computer-readable storage medium carrying computer-readable program instructions for causing a processor to implement aspects of the present invention.
The computer-readable storage medium may be a tangible device that can hold and store instructions for use by an instruction-executing device. The computer-readable storage medium may be, for example, but is not limited to, an electrical storage device, a magnetic storage device, an optical storage device, an electromagnetic storage device, a semiconductor storage device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer-readable storage medium include: a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a static random access memory (SRAM), a portable compact disc read-only memory (CD-ROM), a digital versatile disc (DVD), a memory stick, a floppy disk, a mechanically encoded device such as a punch card or a raised structure in a groove having instructions recorded thereon, and any suitable combination of the foregoing. A computer-readable storage medium, as used herein, is not to be construed as being a transitory signal per se, such as a radio wave or other freely propagating electromagnetic wave, an electromagnetic wave propagating through a waveguide or other transmission medium (for example, a light pulse through a fiber-optic cable), or an electrical signal transmitted through a wire.
Computer-readable program instructions described herein can be downloaded to respective computing/processing devices from a computer-readable storage medium, or to an external computer or external storage device via a network, for example the Internet, a local area network, a wide area network, and/or a wireless network. The network may comprise copper transmission cables, optical transmission fibers, wireless transmission, routers, firewalls, switches, gateway computers, and/or edge servers. A network adapter or network interface in each computing/processing device receives computer-readable program instructions from the network and forwards them for storage in a computer-readable storage medium within the respective computing/processing device.
Computer program instructions for carrying out operations of the present invention may be assembler instructions, instruction-set-architecture (ISA) instructions, machine instructions, machine-dependent instructions, microcode, firmware instructions, state-setting data, or source code or object code written in any combination of one or more programming languages, including object-oriented programming languages such as Smalltalk or C++, and conventional procedural programming languages such as the "C" language or similar languages. The computer-readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. Where a remote computer is involved, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider). In some embodiments, an electronic circuit, such as a programmable logic device, a field-programmable gate array (FPGA), or a programmable logic array (PLA), may be personalized by utilizing state information of the computer-readable program instructions, and the electronic circuit may execute the computer-readable program instructions so as to implement aspects of the present invention.
Aspects of the present invention are described herein with reference to flowchart illustrations and/or block diagrams of methods, devices, and computer program products according to embodiments of the invention. It should be understood that each block of the flowcharts and/or block diagrams, and combinations of blocks in the flowcharts and/or block diagrams, can be implemented by computer-readable program instructions.
These computer-readable program instructions may be provided to a processor of a general-purpose computer, a special-purpose computer, or another programmable data processing device to produce a machine, such that the instructions, when executed by the processor of the computer or other programmable data processing device, create means for implementing the functions/actions specified in one or more blocks of the flowcharts and/or block diagrams. These computer-readable program instructions may also be stored in a computer-readable storage medium; the instructions cause a computer, a programmable data processing device, and/or other devices to function in a particular manner, so that the computer-readable medium having the instructions stored thereon comprises an article of manufacture including instructions that implement aspects of the functions/actions specified in one or more blocks of the flowcharts and/or block diagrams.
The computer-readable program instructions may also be loaded onto a computer, another programmable data processing device, or other devices, causing a series of operational steps to be performed thereon to produce a computer-implemented process, such that the instructions executed on the computer, other programmable data processing device, or other devices implement the functions/actions specified in one or more blocks of the flowcharts and/or block diagrams.
The flowcharts and block diagrams in the accompanying drawings illustrate the architecture, functionality, and operation of possible implementations of devices, methods, and computer program products according to multiple embodiments of the present invention. In this regard, each block in a flowchart or block diagram may represent a module, a program segment, or a portion of instructions, which comprises one or more executable instructions for implementing the specified logical functions. In some alternative implementations, the functions noted in the blocks may occur out of the order noted in the figures. For example, two consecutive blocks may in fact be executed substantially in parallel, or sometimes in the reverse order, depending on the functionality involved. It should also be noted that each block of the block diagrams and/or flowcharts, and combinations of blocks in the block diagrams and/or flowcharts, can be implemented by a special-purpose hardware-based system that performs the specified functions or actions, or by a combination of special-purpose hardware and computer instructions. It is well known to those skilled in the art that implementation by hardware, implementation by software, and implementation by a combination of software and hardware are all equivalent.
Various embodiments of the present invention have been described above. The foregoing description is exemplary, not exhaustive, and is not limited to the disclosed embodiments. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the illustrated embodiments. The terminology used herein was chosen to best explain the principles of the embodiments, their practical application, or their technical improvement over technologies found in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein. The scope of the present invention is defined by the appended claims.
Claims (10)
- 1. A motion control method for a UAV, characterized by comprising: obtaining images captured by a monocular camera of the UAV; selecting key frames from the images, wherein the first frame image is selected as a key frame, and each current frame image after the first frame image is compared with the previous key frame: if the change in pixel coordinates of at least one pair of homologous image points between the current frame image and the previous key frame image exceeds a set threshold, the current frame image is selected as a key frame, and otherwise the current frame image is determined to be a non-key frame; obtaining inertial data collected by an inertial measurement unit of the UAV; calculating first displacement and attitude information of the UAV from the selected key frames; calculating second attitude information of the UAV from the inertial data; and fusing the first displacement and attitude information with the second attitude information to perform simultaneous localization and mapping for the UAV.
- 2. The motion control method according to claim 1, characterized in that comparing the current frame image after the first frame image with the previous key frame comprises: extracting feature points of the current frame image; obtaining feature points of the previous key frame; matching the feature points of the current frame image with the feature points of the previous key frame to obtain pairs of homologous image points; and if the change in pixel coordinates of at least one pair of homologous image points exceeds the set threshold, taking the current frame image as a key frame, and otherwise as a non-key frame.
- 3. The motion control method according to claim 2, characterized in that taking the current frame image as a key frame if the change in pixel coordinates of at least one pair of homologous image points exceeds the set threshold, and otherwise as a non-key frame, comprises: if the change in at least one pixel coordinate of at least one pair of homologous image points exceeds the set threshold, taking the current frame image as a key frame, and otherwise as a non-key frame.
- 4. The motion control method according to claim 1, characterized in that fusing the first displacement and attitude information with the second attitude information to perform simultaneous localization and mapping for the UAV comprises: fusing the first displacement and attitude information with the second attitude information by a weighted-average method to obtain an optimized pose estimate; performing pose graph optimization based on the optimized pose estimate; and stitching point clouds to build a map based on the optimized pose graph.
- 5. A motion control device for a UAV, characterized by comprising: an image acquisition module for obtaining images captured by a monocular camera of the UAV; a key frame extraction module for selecting key frames from the images, wherein the first frame image is selected as a key frame, and each current frame image after the first frame image is compared with the previous key frame: if the change in pixel coordinates of at least one pair of homologous image points between the current frame image and the previous key frame image exceeds a set threshold, the current frame image is selected as a key frame, and otherwise the current frame image is determined to be a non-key frame; an inertial data acquisition module for obtaining inertial data collected by an inertial measurement unit of the UAV; a first calculation module for calculating first displacement and attitude information of the UAV from the selected key frames; a second calculation module for calculating second attitude information of the UAV from the inertial data; and a motion control module for fusing the first displacement and attitude information with the second attitude information to perform simultaneous localization and mapping for the UAV.
- 6. The motion control device according to claim 5, characterized in that the key frame extraction module comprises: an extraction unit for extracting feature points of the current frame image; an acquisition unit for obtaining feature points of the previous key frame; a matching unit for matching the feature points of the current frame image with the feature points of the previous key frame to obtain pairs of homologous image points; and a selection unit for selecting the current frame image as a key frame when the change in pixel coordinates of at least one pair of homologous image points exceeds the set threshold, and otherwise determining the current frame to be a non-key frame.
- 7. The motion control device according to claim 6, characterized in that the selection unit is configured to: select the current frame image as a key frame when the change in at least one pixel coordinate of at least one pair of homologous image points exceeds the set threshold, and otherwise determine the current frame image to be a non-key frame.
- 8. The motion control device according to claim 5, characterized in that the motion control module comprises: a pose estimation unit for fusing the first displacement and attitude information with the second attitude information by a weighted-average method to obtain an optimized pose estimate; a back-end optimization unit for performing pose graph optimization based on the optimized pose estimate; and a mapping unit for stitching point clouds to build a map based on the optimized pose graph.
- 9. An unmanned aerial vehicle system, characterized by comprising the motion control device according to any one of claims 5 to 8.
- 10. An unmanned aerial vehicle system, characterized by comprising a processor and a memory, wherein the memory is configured to store instructions, and the instructions are configured to control the processor to operate so as to perform the method according to any one of claims 1 to 4.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710785660.9A CN107748569B (en) | 2017-09-04 | 2017-09-04 | Motion control method and device for unmanned aerial vehicle and unmanned aerial vehicle system |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710785660.9A CN107748569B (en) | 2017-09-04 | 2017-09-04 | Motion control method and device for unmanned aerial vehicle and unmanned aerial vehicle system |
Publications (2)
Publication Number | Publication Date |
---|---|
CN107748569A true CN107748569A (en) | 2018-03-02 |
CN107748569B CN107748569B (en) | 2021-02-19 |
Family
ID=61255639
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710785660.9A Active CN107748569B (en) | 2017-09-04 | 2017-09-04 | Motion control method and device for unmanned aerial vehicle and unmanned aerial vehicle system |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN107748569B (en) |
Cited By (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108596976A (en) * | 2018-04-27 | 2018-09-28 | 腾讯科技(深圳)有限公司 | Method for relocating, device, equipment and the storage medium of camera posture tracing process |
CN109326006A (en) * | 2018-09-30 | 2019-02-12 | 百度在线网络技术(北京)有限公司 | Map amalgamation method and device |
CN109579847A (en) * | 2018-12-13 | 2019-04-05 | 歌尔股份有限公司 | Extraction method of key frame, device and smart machine in synchronous superposition |
CN110132280A (en) * | 2019-05-20 | 2019-08-16 | 广州小鹏汽车科技有限公司 | Vehicle positioning method, vehicle locating device and vehicle under indoor scene |
CN110246147A (en) * | 2019-05-14 | 2019-09-17 | 中国科学院深圳先进技术研究院 | Vision inertia odometer method, vision inertia mileage counter device and mobile device |
WO2020078250A1 (en) * | 2018-10-15 | 2020-04-23 | 华为技术有限公司 | Data processing method and device for virtual scene |
CN111060948A (en) * | 2019-12-14 | 2020-04-24 | 深圳市优必选科技股份有限公司 | Positioning method, positioning device, helmet and computer readable storage medium |
CN111145248A (en) * | 2018-11-06 | 2020-05-12 | 北京地平线机器人技术研发有限公司 | Pose information determination method and device and electronic equipment |
CN111368015A (en) * | 2020-02-28 | 2020-07-03 | 北京百度网讯科技有限公司 | Method and device for compressing map |
CN111553915A (en) * | 2020-05-08 | 2020-08-18 | 深圳前海微众银行股份有限公司 | Article identification detection method, device, equipment and readable storage medium |
CN111951198A (en) * | 2019-05-16 | 2020-11-17 | 杭州海康机器人技术有限公司 | Unmanned aerial vehicle aerial image splicing optimization method and device and storage medium |
CN113465602A (en) * | 2021-05-26 | 2021-10-01 | 北京三快在线科技有限公司 | Navigation method, navigation device, electronic equipment and readable storage medium |
Citations (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101582063A (en) * | 2008-05-13 | 2009-11-18 | 华为技术有限公司 | Video service system, video service device and extraction method for key frame thereof |
CN102201115A (en) * | 2011-04-07 | 2011-09-28 | 湖南天幕智能科技有限公司 | Real-time panoramic image stitching method of aerial videos shot by unmanned plane |
CN103106688A (en) * | 2013-02-20 | 2013-05-15 | 北京工业大学 | Indoor three-dimensional scene rebuilding method based on double-layer rectification method |
CN103686345A (en) * | 2013-12-18 | 2014-03-26 | 北京航天测控技术有限公司 | Video content comparing method based on digital signal processor |
CN103761738A (en) * | 2014-01-22 | 2014-04-30 | 杭州匡伦科技有限公司 | Method for extracting video sequence key frame in three-dimensional reconstruction |
CN103871076A (en) * | 2014-02-27 | 2014-06-18 | 西安电子科技大学 | Moving object extraction method based on optical flow method and superpixel division |
CN104062977A (en) * | 2014-06-17 | 2014-09-24 | 天津大学 | Full-autonomous flight control method for quadrotor unmanned aerial vehicle based on vision SLAM |
US20150148988A1 (en) * | 2013-11-10 | 2015-05-28 | Google Inc. | Methods and Systems for Alerting and Aiding an Emergency Situation |
US20160068114A1 (en) * | 2014-09-03 | 2016-03-10 | Sharp Laboratories Of America, Inc. | Methods and Systems for Mobile-Agent Navigation |
CN105783913A (en) * | 2016-03-08 | 2016-07-20 | 中山大学 | SLAM device integrating multiple vehicle-mounted sensors and control method of device |
CN205426175U (en) * | 2016-03-08 | 2016-08-03 | 中山大学 | Fuse on -vehicle multisensor's SLAM device |
CN105953796A (en) * | 2016-05-23 | 2016-09-21 | 北京暴风魔镜科技有限公司 | Stable motion tracking method and stable motion tracking device based on integration of simple camera and IMU (inertial measurement unit) of smart cellphone |
CN106384383A (en) * | 2016-09-08 | 2017-02-08 | 哈尔滨工程大学 | RGB-D and SLAM scene reconfiguration method based on FAST and FREAK feature matching algorithm |
- 2017-09-04 CN CN201710785660.9A patent/CN107748569B/en active Active
Patent Citations (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101582063A (en) * | 2008-05-13 | 2009-11-18 | 华为技术有限公司 | Video service system, video service device and extraction method for key frame thereof |
CN102201115A (en) * | 2011-04-07 | 2011-09-28 | 湖南天幕智能科技有限公司 | Real-time panoramic image stitching method of aerial videos shot by unmanned plane |
CN103106688A (en) * | 2013-02-20 | 2013-05-15 | 北京工业大学 | Indoor three-dimensional scene rebuilding method based on double-layer rectification method |
US20150148988A1 (en) * | 2013-11-10 | 2015-05-28 | Google Inc. | Methods and Systems for Alerting and Aiding an Emergency Situation |
CN103686345A (en) * | 2013-12-18 | 2014-03-26 | 北京航天测控技术有限公司 | Video content comparing method based on digital signal processor |
CN103761738A (en) * | 2014-01-22 | 2014-04-30 | 杭州匡伦科技有限公司 | Method for extracting video sequence key frame in three-dimensional reconstruction |
CN103871076A (en) * | 2014-02-27 | 2014-06-18 | 西安电子科技大学 | Moving object extraction method based on optical flow method and superpixel division |
CN104062977A (en) * | 2014-06-17 | 2014-09-24 | 天津大学 | Full-autonomous flight control method for quadrotor unmanned aerial vehicle based on vision SLAM |
US20160068114A1 (en) * | 2014-09-03 | 2016-03-10 | Sharp Laboratories Of America, Inc. | Methods and Systems for Mobile-Agent Navigation |
CN105783913A (en) * | 2016-03-08 | 2016-07-20 | 中山大学 | SLAM device integrating multiple vehicle-mounted sensors and control method of device |
CN205426175U (en) * | 2016-03-08 | 2016-08-03 | 中山大学 | Fuse on -vehicle multisensor's SLAM device |
CN105953796A (en) * | 2016-05-23 | 2016-09-21 | 北京暴风魔镜科技有限公司 | Stable motion tracking method and stable motion tracking device based on integration of simple camera and IMU (inertial measurement unit) of smart cellphone |
CN106384383A (en) * | 2016-09-08 | 2017-02-08 | 哈尔滨工程大学 | RGB-D and SLAM scene reconfiguration method based on FAST and FREAK feature matching algorithm |
Non-Patent Citations (3)
Title |
---|
SHAOWU YANG et al.: "Multi-camera visual SLAM for autonomous navigation of micro aerial vehicles", Robotics and Autonomous Systems *
ZHANG ZHENWEI: "A real-time 3D reconstruction method for UAVs based on computer vision", Machinery & Electronics *
LI SHUNYI et al.: "Key frame extraction for motion based on inter-frame distance", Computer Engineering *
Cited By (21)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108596976A (en) * | 2018-04-27 | 2018-09-28 | 腾讯科技(深圳)有限公司 | Method for relocating, device, equipment and the storage medium of camera posture tracing process |
CN108596976B (en) * | 2018-04-27 | 2022-02-22 | 腾讯科技(深圳)有限公司 | Method, device and equipment for relocating camera attitude tracking process and storage medium |
CN109326006A (en) * | 2018-09-30 | 2019-02-12 | 百度在线网络技术(北京)有限公司 | Map amalgamation method and device |
CN109326006B (en) * | 2018-09-30 | 2023-03-28 | 阿波罗智联(北京)科技有限公司 | Map fusion method and device |
WO2020078250A1 (en) * | 2018-10-15 | 2020-04-23 | 华为技术有限公司 | Data processing method and device for virtual scene |
CN111145248A (en) * | 2018-11-06 | 2020-05-12 | 北京地平线机器人技术研发有限公司 | Pose information determination method and device and electronic equipment |
CN111145248B (en) * | 2018-11-06 | 2023-06-27 | 北京地平线机器人技术研发有限公司 | Pose information determining method and device and electronic equipment |
US11466988B2 (en) | 2018-12-13 | 2022-10-11 | Goertek Inc. | Method and device for extracting key frames in simultaneous localization and mapping and smart device |
CN109579847A (en) * | 2018-12-13 | 2019-04-05 | 歌尔股份有限公司 | Extraction method of key frame, device and smart machine in synchronous superposition |
CN109579847B (en) * | 2018-12-13 | 2022-08-16 | 歌尔股份有限公司 | Method and device for extracting key frame in synchronous positioning and map construction and intelligent equipment |
CN110246147B (en) * | 2019-05-14 | 2023-04-07 | 中国科学院深圳先进技术研究院 | Visual inertial odometer method, visual inertial odometer device and mobile equipment |
CN110246147A (en) * | 2019-05-14 | 2019-09-17 | 中国科学院深圳先进技术研究院 | Vision inertia odometer method, vision inertia mileage counter device and mobile device |
CN111951198B (en) * | 2019-05-16 | 2024-02-02 | 杭州海康威视数字技术股份有限公司 | Unmanned aerial vehicle aerial image stitching optimization method, device and storage medium |
CN111951198A (en) * | 2019-05-16 | 2020-11-17 | 杭州海康机器人技术有限公司 | Unmanned aerial vehicle aerial image splicing optimization method and device and storage medium |
CN110132280A (en) * | 2019-05-20 | 2019-08-16 | 广州小鹏汽车科技有限公司 | Vehicle positioning method, vehicle locating device and vehicle under indoor scene |
CN111060948B (en) * | 2019-12-14 | 2021-10-29 | 深圳市优必选科技股份有限公司 | Positioning method, positioning device, helmet and computer readable storage medium |
CN111060948A (en) * | 2019-12-14 | 2020-04-24 | 深圳市优必选科技股份有限公司 | Positioning method, positioning device, helmet and computer readable storage medium |
CN111368015B (en) * | 2020-02-28 | 2023-04-07 | 北京百度网讯科技有限公司 | Method and device for compressing map |
CN111368015A (en) * | 2020-02-28 | 2020-07-03 | 北京百度网讯科技有限公司 | Method and device for compressing map |
CN111553915A (en) * | 2020-05-08 | 2020-08-18 | 深圳前海微众银行股份有限公司 | Article identification detection method, device, equipment and readable storage medium |
CN113465602A (en) * | 2021-05-26 | 2021-10-01 | 北京三快在线科技有限公司 | Navigation method, navigation device, electronic equipment and readable storage medium |
Also Published As
Publication number | Publication date |
---|---|
CN107748569B (en) | 2021-02-19 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN107748569A (en) | Motion control method, device and UAS for unmanned plane | |
CN109307508B (en) | Panoramic inertial navigation SLAM method based on multiple key frames | |
CN106679648B (en) | Visual inertia combination SLAM method based on genetic algorithm | |
CN104236548B (en) | Autonomous navigation method in a kind of MAV room | |
CN110125928A (en) | A kind of binocular inertial navigation SLAM system carrying out characteristic matching based on before and after frames | |
CN109084746A (en) | Monocular mode for the autonomous platform guidance system with aiding sensors | |
CN112652016B (en) | Point cloud prediction model generation method, pose estimation method and pose estimation device | |
CN111595334B (en) | Indoor autonomous positioning method based on tight coupling of visual point-line characteristics and IMU (inertial measurement Unit) | |
CN107688391A (en) | A kind of gesture identification method and device based on monocular vision | |
CN107357427A (en) | A kind of gesture identification control method for virtual reality device | |
WO2017020766A1 (en) | Scenario extraction method, object locating method and system therefor | |
CN101794349A (en) | Experimental system and method for augmented reality of teleoperation of robot | |
CN109211251A (en) | A kind of instant positioning and map constructing method based on laser and two dimensional code fusion | |
CN106780631A (en) | A kind of robot closed loop detection method based on deep learning | |
CN104240297A (en) | Rescue robot three-dimensional environment map real-time construction method | |
CN108829116B (en) | Barrier-avoiding method and equipment based on monocular cam | |
JP2015532077A (en) | Method for determining the position and orientation of an apparatus associated with an imaging apparatus that captures at least one image | |
CN109298629A (en) | For providing the fault-tolerant of robust tracking to realize from non-autonomous position of advocating peace | |
CN111161337B (en) | Accompanying robot synchronous positioning and composition method in dynamic environment | |
CN102622762A (en) | Real-time camera tracking using depth maps | |
CN107941217A (en) | A kind of robot localization method, electronic equipment, storage medium, device | |
CN110603122B (en) | Automated personalized feedback for interactive learning applications | |
Krainin et al. | Manipulator and object tracking for in hand model acquisition | |
Ye et al. | 6-DOF pose estimation of a robotic navigation aid by tracking visual and geometric features | |
EP2851868A1 (en) | 3D Reconstruction |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||