CN110111388A - Three-dimensional object pose parameter estimation method and visual apparatus - Google Patents
- Publication number
- CN110111388A CN110111388A CN201910389377.3A CN201910389377A CN110111388A CN 110111388 A CN110111388 A CN 110111388A CN 201910389377 A CN201910389377 A CN 201910389377A CN 110111388 A CN110111388 A CN 110111388A
- Authority
- CN
- China
- Prior art keywords
- pose parameter
- three-dimensional object
- line segment
- straight-line segment
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10004—Still image; Photographic image
- G06T2207/10012—Stereo images
Abstract
This application relates to a three-dimensional object pose parameter estimation method and a visual apparatus. The method, executable on the visual apparatus, comprises: obtaining the pose parameter determined from the previous frame image of the three-dimensional object, as the initial pose parameter for determining the pose parameter corresponding to the current frame image; projecting the 3D line segments of the object onto the two-dimensional image plane based on the initial pose parameter, to obtain the projected segments of the 3D line segments in the image plane; determining, for each projected segment, the closest image line segment in the image plane, and determining the distance error between the projected segment and that closest image line segment; judging whether the distance error satisfies a preset condition; if not, determining a new pose parameter based on the distance error and taking the new pose parameter as the initial pose parameter; if so, taking the initial pose parameter as the pose parameter of the current frame image. The application thereby achieves pose parameter estimation for objects with little texture.
Description
Technical field
This application relates to the field of visual apparatus, and in particular to a three-dimensional object pose parameter estimation method and a visual apparatus.
Background art
For pose estimation of three-dimensional objects with sparse texture, the scarcity of point features degrades pose estimation accuracy, whereas relatively stable line features are favorable for pose estimation. The key problem in pose estimation with line features is matching 2D and 3D line features; because a 3D line carries very little descriptive information, direct 2D-3D line feature matching is difficult. The usual approach reduces the problem to 2D-2D matching between image line features and a reference frame with known 2D-3D line correspondences.
At present, line feature matching focuses mainly on 2D-2D matching between image line features, and has been applied with some success to three-dimensional reconstruction, structure from motion, and simultaneous localization and mapping. Line feature matching methods can be divided into feature description methods, point-line invariance methods, and line-grouping methods. Feature description methods build a feature vector from the grayscale information in the neighborhood of a line to characterize it, and decide whether two lines match by comparing feature vector similarity. Point-line invariance methods rely on matched points and use the invariance constraints of coplanar point-line configurations to decide whether lines match. Line-grouping methods use the constraint relationships among different lines to decide whether lines match.
Matching image lines against the 2D lines of a reference frame realizes 2D-3D line feature matching indirectly, but has two drawbacks. First, lines must be annotated in 2D reference frames with known 3D correspondences, which is numerous and tedious. Second, image line features must be extracted accurately, yet factors such as noise and blur affect the completeness of the extracted lines.
As the above analysis shows, indirect matching methods achieve pose estimation for texture-sparse objects, but they depend on annotated historical reference images and incur a heavy offline annotation workload.
Summary of the invention
In order to solve, or at least partially solve, the above technical problem, this application provides a three-dimensional object pose parameter estimation method and a visual apparatus.
In a first aspect, this application provides a three-dimensional object pose parameter estimation method, comprising: obtaining the pose parameter of the three-dimensional object determined from its previous frame image, as the initial pose parameter for determining the pose parameter corresponding to the current frame image of the object; projecting the 3D line segments of the object onto the two-dimensional image plane of the current frame image based on the initial pose parameter, to obtain the projected segments of the 3D line segments in the image plane; determining, for each projected segment, the closest image line segment in the image plane, and determining the distance error between the projected segment and that closest image line segment; judging whether the distance error satisfies a preset condition; if the preset condition is not satisfied, determining a new pose parameter based on the distance error and taking the new pose parameter as the initial pose parameter; and if the preset condition is satisfied, taking the initial pose parameter as the pose parameter of the current frame image of the object.
In some embodiments, before projecting the 3D line segments of the object onto the two-dimensional image plane of the current frame image based on the initial pose parameter, the method further includes: determining the angle between the sight line of the current frame image and the plane in which each 3D line segment lies, judging the projection visibility of the 3D line segment based on the angle, and removing occluded 3D line segments.
In some embodiments, determining the closest image line segment of the projected segment in the image plane and the distance error between them comprises: sampling the projected segment to obtain multiple control points; determining the direction perpendicular to the projected segment; searching bidirectionally along the perpendicular direction within a preset search range at each control point, taking the pixel with the maximum likelihood ratio as a candidate corresponding point, thereby obtaining multiple candidate corresponding points; and determining the distance error between the candidate corresponding points and the projected segment.
In some embodiments, determining a new pose parameter based on the distance error comprises: characterizing the initial pose parameter in the Lie algebra space; determining a pose increment according to the distance error; and determining the new pose parameter based on the pose increment.
In some embodiments, determining the pose increment according to the distance error comprises determining the pose increment by robust estimation. Determining the new pose parameter based on the pose increment comprises: determining the rotation increment and translation increment in Euclidean space through the exponential map, and determining a new rotation matrix and a new translation vector based on the rotation increment and the translation increment.
In a second aspect, this application provides a three-dimensional object pose parameter estimation method, comprising: obtaining an initial frame image of the three-dimensional object; extracting image geometric features of the object in the initial frame image; matching the extracted image geometric features against a reference feature library of the object to obtain the 3D space coordinates corresponding to the extracted features, wherein the reference feature library contains correspondences between image geometric features extracted from images of the object under multiple viewpoints and 3D space coordinates; determining the initial pose parameter of the object by an N-point perspective method, based on the three-dimensional model of the object and the 2D image coordinates and 3D space coordinates corresponding to the extracted features; projecting the 3D line segments of the object onto the two-dimensional image plane of the initial frame image based on the initial pose parameter, to obtain the projected segments in the image plane; determining, for each projected segment, the closest image line segment in the image plane, and determining the distance error between the projected segment and that closest image line segment; judging whether the distance error satisfies a preset condition; if the preset condition is not satisfied, determining a new pose parameter based on the distance error and taking it as the initial pose parameter; and if the preset condition is satisfied, taking the initial pose parameter as the pose parameter of the object.
In some embodiments, the image geometric features are key corner features and/or line features of the three-dimensional object.
In some embodiments, extracting the image geometric features of the object in the initial frame image comprises: extracting key corners and/or line segments of the object on the initial frame image, and determining the feature vectors of the extracted key corners and/or line segments based on their local grayscale.
In some embodiments, before projecting the 3D line segments of the object onto the two-dimensional image plane of the initial frame image based on the initial pose parameter, the method further includes: determining the angle between the sight line of the initial frame image and the plane in which each 3D line segment lies, judging the projection visibility of the 3D line segment based on the angle, and removing occluded 3D line segments.
In some embodiments, determining the closest image line segment of the projected segment in the image plane and the distance error between them comprises: sampling the projected segment to obtain multiple control points; determining the direction perpendicular to the projected segment; searching bidirectionally along the perpendicular direction within a preset search range at each control point, taking the pixel with the maximum likelihood ratio as a candidate corresponding point, thereby obtaining multiple candidate corresponding points; and determining the distance error between the candidate corresponding points and the projected segment.
In some embodiments, determining a new pose parameter based on the distance error comprises: characterizing the initial pose parameter in the Lie algebra space; determining a pose increment according to the distance error; and determining the new pose parameter based on the pose increment.
In some embodiments, determining the pose increment according to the distance error comprises determining the pose increment by robust estimation. Determining the new pose parameter based on the pose increment comprises: determining the rotation increment and translation increment in Euclidean space through the exponential map, and determining a new rotation matrix and a new translation vector based on the rotation increment and the translation increment.
In a third aspect, this application provides a visual apparatus comprising a memory, a processor, and a computer program stored on the memory and executable on the processor; when executed by the processor, the computer program implements the steps of any of the three-dimensional object pose parameter estimation methods of this application.
In a fourth aspect, this application provides a computer-readable storage medium on which a three-dimensional object pose parameter estimation program is stored; when executed by a processor, the program implements the steps of any of the three-dimensional object pose parameter estimation methods of this application.
Compared with the prior art, the technical solutions provided by the embodiments of this application have the following advantages: the method determines the pose parameter of a three-dimensional object by iterative projection, avoiding dependence on a large amount of annotated historical images, and efficiently and accurately estimates the pose parameter of texture-sparse objects. Feature matching and pose parameter estimation iterate alternately, which removes the influence of outliers and yields the matching result and the pose parameter estimate simultaneously.
Brief description of the drawings
The drawings herein are incorporated into and form part of the specification; they show embodiments consistent with the invention and, together with the specification, explain its principles.
To explain the embodiments of the invention or the prior art more clearly, the drawings needed in the description are briefly introduced below. Obviously, persons of ordinary skill in the art can obtain other drawings from these drawings without creative labor.
Fig. 1 is a hardware structural diagram of an embodiment of a visual apparatus provided by embodiments of this application;
Fig. 2 is a flowchart of an embodiment of the three-dimensional object pose parameter estimation method provided by embodiments of this application;
Fig. 3 is a flowchart of an embodiment of the distance error determination process;
Fig. 4 is a flowchart of an embodiment of determining a new pose parameter based on the distance error;
Fig. 5 is a flowchart of another embodiment of the three-dimensional object pose parameter estimation method;
Fig. 6 is a flowchart of an embodiment of the reference feature library building method;
Fig. 7 is a schematic diagram of the projection visibility judgment of a 3D line segment;
Fig. 8 is a schematic diagram of the local search around a projected segment;
Fig. 9 is a schematic diagram of the distance between candidate corresponding points and a projected segment;
Fig. 10 is a reference frame image of a cube used for initialization;
Fig. 11 is a schematic diagram of the initialization matching;
Fig. 12 shows the matching and pose estimation results of non-initial frames; and
Fig. 13 is a comparison curve between the estimated pose parameters and the annotated reference values.
Detailed description of embodiments
It should be understood that the specific embodiments described herein merely illustrate the invention and are not intended to limit it.
In the following description, suffixes such as "module", "component", or "unit" are used only to facilitate the description of the invention and carry no specific meaning by themselves; "module", "component", and "unit" may therefore be used interchangeably.
The visual apparatus provided in the embodiments of the invention includes, but is not limited to, industrial automation equipment, intelligent robots and similar devices, or user terminals; it can identify and capture a target object and provide real-time image information and real-time spatial pose information of the target object. The visual apparatus may include components such as an RF (Radio Frequency) unit, a WiFi module, an audio output unit, an A/V (audio/video) input unit, sensors, an interface unit, a memory, a processor, and a power supply.
The following description takes a visual apparatus as an example. Referring to Fig. 1, a hardware structural diagram of a visual apparatus for realizing embodiments of the invention, the visual apparatus 100 may include components such as one or more optical image sensors 101, a memory 102, a processor 103, and a power supply 104. Those skilled in the art will understand that the structure shown in Fig. 1 does not limit the visual apparatus, which may include more or fewer components than illustrated, combine certain components, or arrange the components differently.
In one embodiment, the optical image sensor 101 of the visual apparatus 100 is one or more cameras. By turning on a camera, images can be captured, realizing functions such as photographing and video recording; the position of the camera can be configured as needed.
The visual apparatus 100 may also include at least one sensor, such as an optical sensor, a motion sensor, and other sensors. As a motion sensor, an accelerometer can detect the magnitude of acceleration in each direction (generally three axes) and, when static, the magnitude and direction of gravity.
The memory 102 may be used to store software programs and various data. It may mainly include a program storage area and a data storage area, wherein the program storage area may store the operating system and the programs required for at least one function. In addition, the memory 102 may include high-speed random access memory and may also include non-volatile memory, for example at least one magnetic disk storage device, flash memory device, or other solid-state storage component.
The processor 103 is the control center of the visual apparatus. Using various interfaces and lines connecting all parts of the apparatus, it runs or executes the software programs and/or modules stored in the memory 102 and calls the data stored therein to execute the various functions of the visual apparatus and process data, thereby monitoring the apparatus as a whole. The processor 103 may include one or more processing units.
This embodiment provides a three-dimensional object pose parameter estimation method. With reference to Fig. 2, the method includes steps S201 to S206.
Step S201: obtain the pose parameter of the three-dimensional object determined from its previous frame image, as the initial pose parameter for determining the pose parameter corresponding to the current frame image of the object.
Step S202: project the 3D line segments of the object onto the two-dimensional image plane of the current frame image based on the initial pose parameter, obtaining the projected segments of the 3D line segments in the image plane.
In this embodiment, a projected segment is characterized by two-dimensional image coordinates.
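The projection in step S202 follows the standard pinhole camera model. A minimal numpy sketch (the intrinsic matrix K, the pose R, t, and the segment endpoints below are illustrative values, not taken from the patent):

```python
import numpy as np

def project_segment(P1, P2, K, R, t):
    """Project the endpoints of a 3D line segment into the image plane
    with the pinhole model: x ~ K (R X + t)."""
    pts = []
    for X in (P1, P2):
        Xc = R @ X + t                # camera-frame coordinates
        x = K @ Xc
        pts.append(x[:2] / x[2])      # perspective division
    return pts

K = np.array([[800.0, 0, 320], [0, 800.0, 240], [0, 0, 1]])
R = np.eye(3)
t = np.array([0.0, 0.0, 5.0])
a, b = project_segment(np.array([-1.0, 0, 0]), np.array([1.0, 0, 0]), K, R, t)
print(a, b)  # image coordinates of the projected endpoints
```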
Step S203: determine the closest image line segment of each projected segment in the image plane, and calculate the distance error between the projected segment and that closest image line segment.
In this embodiment, the closest image line segment can be regarded as the projection, in the two-dimensional image, of the actual 3D line segment; the distance error characterizes how accurate the projection determined from the initial pose parameter is.
Step S204: judge whether the distance error satisfies a preset condition; if the preset condition is not satisfied, go to step S205; if it is satisfied, go to step S206.
In some embodiments, in step S204 the preset condition may be that the distance error is smaller than a preset distance error: when the distance error is greater than the preset distance error, it is determined that the preset condition is not satisfied; when the distance error is less than or equal to the preset distance error, it is determined that the condition is satisfied. The embodiment is not limited thereto.
Step S205: determine a new pose parameter based on the distance error, and take the new pose parameter as the initial pose parameter of step S202.
Step S206: take the initial pose parameter as the pose parameter of the current frame image of the object.
Through this method, the pose parameter of the three-dimensional object is determined by iterative projection, avoiding dependence on a large amount of annotated historical images and efficiently and accurately estimating the pose of texture-sparse objects. Feature matching and pose estimation iterate alternately, removing outlier influence while yielding the matching result and the pose estimate simultaneously.
In some embodiments, step S205 also accumulates the number of times a new pose parameter has been determined; each time a new pose parameter is determined, the count increases by 1. In step S204, if the preset condition is not satisfied, it is judged whether the count exceeds a preset value; if it is less than the preset value, step S205 is entered. Optionally, if it is greater than the preset value, the current pose parameter is taken as the pose parameter of the current frame image of the object.
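The control flow of steps S201 to S206, including the optional iteration cap, can be sketched as follows; the scalar "pose" and the toy distance and update functions are stand-ins for the real projection-based quantities described above:

```python
def refine_pose(initial_pose, distance_error, update, tol=1e-3, max_iters=50):
    """Iterative scheme of steps S201-S206: refine the pose until the
    distance error meets the preset condition or the iteration cap is hit."""
    pose = initial_pose
    for _ in range(max_iters):
        err = distance_error(pose)
        if err <= tol:                # preset condition satisfied (S206)
            break
        pose = update(pose, err)      # new pose becomes the initial pose (S205)
    return pose

# Toy stand-ins: the "pose" is a scalar whose optimum is 2.0.
dist = lambda p: abs(p - 2.0)
upd = lambda p, e: p + 0.5 * (2.0 - p)   # move halfway toward the optimum
print(refine_pose(0.0, dist, upd))
```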
In some embodiments, some 3D line segments are invisible due to self-occlusion and the like; invisible segments can be removed by a visibility test before projection, keeping only visible segments. For this purpose, in some embodiments, before step S202 the method further includes: determining the angle between the sight line of the current frame image and the plane in which each 3D line segment lies, judging the projection visibility of the segment based on the angle, and removing occluded 3D line segments.
In some embodiments, as shown in Fig. 7, occluded 3D line segments are identified as follows: compute the scalar product of the sight-line vector s and the normal vector n_i of the plane in which the 3D line segment L_i lies, and judge the projection visibility of the segment from the scalar product:
V(L_i) = 1 if s · n_i > 0, and V(L_i) = 0 otherwise,
where V(L_i) = 1 indicates that the 3D line segment is visible and V(L_i) = 0 indicates that it is invisible.
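The visibility rule above reduces to a sign test on a dot product. A small sketch, with illustrative sight-line and normal vectors:

```python
import numpy as np

def segment_visible(sight, normal):
    """Visibility test of Fig. 7: a 3D segment is treated as visible when
    the scalar product of the sight-line vector and the normal of the
    plane containing the segment is positive."""
    return 1 if np.dot(sight, normal) > 0 else 0

s = np.array([0.0, 0.0, -1.0])                          # sight line toward the object
front = segment_visible(s, np.array([0.0, 0.0, -1.0]))  # front-facing plane
back = segment_visible(s, np.array([0.0, 0.0, 1.0]))    # back-facing plane
print(front, back)
```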
In some embodiments, if a 3D line segment is visible, its projected segment is sampled uniformly at equal intervals; the sample points are called control points.
In some embodiments, step S203 (determining the closest image line segment of the projected segment in the image plane and calculating the distance error between them) includes, with reference to Fig. 3, steps S301 to S304.
Step S301: sample the projected segment of the 3D line segment in the image plane to obtain multiple control points.
Step S302: determine the direction perpendicular to the projected segment.
Step S303: at each of the control points, search bidirectionally along the perpendicular direction within a preset search range, taking the pixel with the maximum likelihood ratio as a candidate corresponding point, obtaining multiple candidate corresponding points.
In this embodiment, the multiple candidate corresponding points can be regarded as equivalent to a line segment.
Step S304: determine the distance error between the candidate corresponding points and the projected segment.
The distance error between the projected segment and the closest image line segment in step S203 can be characterized by the distance errors between these candidate corresponding points and the projected segment.
In this way, under the constraint of the initial pose parameter, the optimal corresponding points can be searched within a preset range without searching the whole image for image features, which improves computational efficiency and reduces computational complexity.
In some embodiments, the multiple control points in step S301 can be sampled uniformly to improve search accuracy.
In some embodiments, as shown in Fig. 8, a one-dimensional search is carried out at each control point along the direction perpendicular to the projected segment, with search range {Q_j, j ∈ [-R, R]}. The likelihood ratio of each pixel in the search range is computed, and the pixel with the maximum likelihood ratio (called the maximum-likelihood-ratio point in this embodiment) is taken as the candidate corresponding point, where ζ_j denotes the likelihood ratio, computed from the absolute values of the sums of the convolution responses at p_t and Q_j; M_δ is a predetermined gradient mask in direction δ, and v(·) denotes the neighborhood pixels of a position.
In some embodiments, as shown in Fig. 9, the distance between a candidate corresponding point x'_i and its corresponding projected segment l_s(r) is the perpendicular point-to-line distance, where x̃'_i and ỹ'_i are the image coordinates of x'_i.
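The distance in Fig. 9 is the usual perpendicular point-to-line distance; a minimal sketch, with illustrative endpoints:

```python
import numpy as np

def point_line_distance(x, a, b):
    """Perpendicular distance from candidate point x to the infinite line
    through the projected segment endpoints a and b (Fig. 9)."""
    d = b - a
    n = np.array([-d[1], d[0]])           # normal of the line
    n = n / np.linalg.norm(n)
    return abs(np.dot(x - a, n))

a, b = np.array([0.0, 0.0]), np.array([10.0, 0.0])
dist_err = point_line_distance(np.array([3.0, 2.5]), a, b)
print(dist_err)
```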
In some embodiments, step S205 (determining a new pose parameter based on the distance error) includes, with reference to Fig. 4, steps S401 to S403.
Step S401: characterize the initial pose parameter in the Lie algebra space.
Step S402: determine a pose increment according to the distance error.
Step S403: determine the new pose parameter based on the pose increment.
In some embodiments, step S402 determines the pose increment by robust estimation according to the distance error. Robust estimation here means adaptively assigning weights to the distance errors using the Tukey influence function, weakening the influence of false candidate corresponding points.
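The Tukey biweight influence function mentioned here has a standard form; a sketch using the conventional cutoff constant c = 4.685 (the patent does not state its cutoff value):

```python
import numpy as np

def tukey_weight(e, c=4.685):
    """Tukey biweight: residuals beyond the cutoff c get zero weight,
    so gross outliers do not influence the pose increment."""
    e = np.abs(e)
    w = (1.0 - (e / c) ** 2) ** 2
    w[e > c] = 0.0
    return w

residuals = np.array([0.0, 1.0, 4.0, 10.0])
weights = tukey_weight(residuals)
print(weights)  # the 10-pixel outlier receives weight 0
```

These weights form the diagonal matrix used in the weighted least-squares pose update.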
In some embodiments, above-mentioned steps S403 determines new pose parameter based on above-mentioned pose increment, comprising: passes through
Index mapping determines the increment of rotation and translation increment in theorem in Euclid space, is determined based on above-mentioned increment of rotation and above-mentioned translation increment
New spin matrix and new translation vector.
In some embodiments, the pose parameter is characterized in the Lie algebra space se(3), where v is the translation vector and ω is the rotation vector. The pose increment Δξ is computed by weighted least squares as
Δξ = −(DJ)⁺ D e(r),
where (DJ)⁺ is the Moore-Penrose pseudoinverse of DJ, D = diag(ω₁, …, ω_k) is the diagonal matrix of weights computed by the Tukey influence function, and J is the Jacobian matrix of the distance-error vector e_i(r) between the candidate corresponding points and the corresponding projection straight line segments with respect to the pose.
Through the exponential map, the rotation matrix R and translation vector T of the corresponding incremental transform in Euclidean space are determined, and the pose parameter after the incremental update is obtained by composing this incremental transform with the previous pose.
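Mapping the rotation part of the increment to Euclidean space uses Rodrigues' formula; a minimal sketch follows. It treats the translation increment as a plain additive shift, omitting the SE(3) left Jacobian that couples translation with rotation in the exact exponential map (a deliberate simplification, not the patent's exact formula):

```python
import math

def exp_so3_apply(omega, p):
    """Rotate point p by exp([omega]_x) via Rodrigues' formula.

    omega is a 3-vector in the Lie algebra so(3); its norm is the angle.
    """
    theta = math.sqrt(sum(w * w for w in omega))
    if theta < 1e-12:
        return list(p)
    k = [w / theta for w in omega]                       # unit rotation axis
    kxp = [k[1]*p[2] - k[2]*p[1],                        # k x p
           k[2]*p[0] - k[0]*p[2],
           k[0]*p[1] - k[1]*p[0]]
    kdp = sum(ki * pi for ki, pi in zip(k, p))           # k . p
    c, s = math.cos(theta), math.sin(theta)
    return [p[i]*c + kxp[i]*s + k[i]*kdp*(1 - c) for i in range(3)]

def update_pose(omega, v, p):
    """Apply the incremental pose to a point: rotate by exp(omega), add v."""
    q = exp_so3_apply(omega, p)
    return [qi + vi for qi, vi in zip(q, v)]

# A 90-degree rotation about z maps (1,0,0) to (0,1,0); then shift by (0,0,1).
q = update_pose([0.0, 0.0, math.pi / 2], [0.0, 0.0, 1.0], [1.0, 0.0, 0.0])
```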
This embodiment further provides a three-dimensional object pose parameter estimation method for initializing the pose parameter of the three-dimensional object. With reference to Fig. 5, this method includes steps S501 to S509.
Step S501: obtain an initial frame image of the three-dimensional object.
Step S502: extract the image geometric features of the three-dimensional object in the initial frame image.
In some embodiments, the features are obtained by a feature detection and description algorithm; the extracted information for each feature includes its coordinates in the image coordinate system and a feature vector computed from the nearby gray values.
Step S503: match the extracted image geometric features against a reference feature library of the three-dimensional object to obtain the three-dimensional space coordinates corresponding to the extracted image geometric features.
Here, the reference feature library contains the correspondences between image geometric features extracted from images of the three-dimensional object under multiple viewing angles and their three-dimensional space coordinates.
Step S504: based on the three-dimensional model of the three-dimensional object and the two-dimensional image coordinates and three-dimensional space coordinates corresponding to the extracted image geometric features, determine the initial pose parameter of the three-dimensional object using the perspective-n-point (PnP) method.
In this embodiment, the PnP method solves, by optimization, for the pose of the three-dimensional object relative to the camera, given known image point coordinates, the corresponding three-dimensional space point coordinates, and the known camera intrinsics of the visual device.
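The quantity a PnP solver optimizes is the reprojection error under the known intrinsics. The sketch below shows that objective rather than a full solver; the function names and the toy intrinsic values (fx, fy, cx, cy) are illustrative assumptions:

```python
def project(point3d, pose, fx, fy, cx, cy):
    """Pinhole projection of a 3-D point under pose (R, t) and intrinsics.

    R is a 3x3 rotation as row-major nested lists, t a 3-vector."""
    R, t = pose
    X = [sum(R[i][j] * point3d[j] for j in range(3)) + t[i] for i in range(3)]
    return (fx * X[0] / X[2] + cx, fy * X[1] / X[2] + cy)

def reprojection_error(points3d, points2d, pose, fx, fy, cx, cy):
    """Sum of squared pixel errors that a PnP solver minimizes over pose."""
    err = 0.0
    for P, (u, v) in zip(points3d, points2d):
        pu, pv = project(P, pose, fx, fy, cx, cy)
        err += (pu - u) ** 2 + (pv - v) ** 2
    return err

identity = ([[1, 0, 0], [0, 1, 0], [0, 0, 1]], [0.0, 0.0, 0.0])
pts3d = [(0.0, 0.0, 2.0), (1.0, 0.0, 2.0)]
pts2d = [project(P, identity, 500, 500, 320, 240) for P in pts3d]
e = reprojection_error(pts3d, pts2d, identity, 500, 500, 320, 240)
```

At the true pose the error is zero; the initial pose parameter is the pose that drives this objective toward its minimum.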
Step S505: project the three-dimensional space straight line segments of the three-dimensional object onto the two-dimensional image plane of the initial frame image based on the initial pose parameter, to obtain the projection straight line segments of the three-dimensional space straight line segments in the two-dimensional image plane.
Step S506: determine the image straight line segment closest to each projection straight line segment in the two-dimensional image plane, and calculate the distance error between the projection straight line segment and the closest image straight line segment.
Step S507: judge whether the distance error meets a preset condition; if the preset condition is not met, go to step S508; if the preset condition is met, go to step S509.
Step S508: determine a new pose parameter based on the distance error, take the new pose parameter as the initial pose parameter of step S505, and return to step S505.
Step S509: take the initial pose parameter as the pose parameter of the three-dimensional object.
In some embodiments, the image geometric features are key corner features and/or straight line features of the three-dimensional object.
In some embodiments, step S502 of extracting the image geometric features of the initial frame image includes: extracting key corner points and/or straight line segments of the three-dimensional object on the initial frame image, and determining the feature vectors of the extracted key corner points and/or straight line segments based on their local gray values.
In some embodiments, before step S505 projects the three-dimensional space straight line segments of the three-dimensional object onto the two-dimensional image plane of the initial frame image to obtain the projection straight line segments in the two-dimensional image plane, the method further includes: determining the angle between the line of sight of the initial frame image and the plane in which each three-dimensional space straight line segment lies, judging the projection visibility of the three-dimensional space straight line segment based on this angle, and removing the three-dimensional space straight line segments that are occluded.
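The visibility check just described can be sketched as a back-face test. The patent phrases it as an angle between the line of sight and the segment's plane; a dot-product sign test between the face's outward normal and the viewing direction is an equivalent minimal form (an assumption, since the patent does not give its threshold):

```python
def is_edge_visible(face_normal, view_dir):
    """Back-face test: a segment on a face is visible when the face's
    outward normal points toward the camera, i.e. the dot product of the
    normal and the camera-to-surface viewing direction is negative."""
    dot = sum(n * d for n, d in zip(face_normal, view_dir))
    return dot < 0.0

# Camera looks along +z: a face whose outward normal points back toward the
# camera (-z) is visible; a face pointing away (+z) is culled.
front = is_edge_visible((0.0, 0.0, -1.0), (0.0, 0.0, 1.0))
back = is_edge_visible((0.0, 0.0, 1.0), (0.0, 0.0, 1.0))
```

Only segments passing this test are projected in step S505, so occluded model edges never generate spurious correspondences.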
In some embodiments, step S506, with reference to Fig. 3, may include: sampling the projection straight line segment to obtain multiple control points; determining the perpendicular (normal) direction of the projection straight line segment; at each of the control points, searching bidirectionally along the perpendicular direction within a preset search range and taking the pixel with the maximum likelihood ratio as the candidate corresponding point, thereby obtaining multiple candidate corresponding points; and determining the distance errors between the multiple candidate corresponding points and the projection straight line segment.
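The control-point sampling and bidirectional normal search can be sketched as below. A caller-supplied `score` function stands in for the likelihood-ratio response of the text (an assumption; the toy score here simply peaks three pixels above the segment):

```python
def sample_control_points(p0, p1, n):
    """n control points sampled uniformly along the segment p0 -> p1."""
    return [(p0[0] + (p1[0] - p0[0]) * i / (n - 1),
             p0[1] + (p1[1] - p0[1]) * i / (n - 1)) for i in range(n)]

def search_along_normal(control, normal, score, max_dist):
    """Bidirectional 1-D search along the segment normal: return the pixel
    position whose score is largest within +/- max_dist of the control point."""
    best, best_score = control, float("-inf")
    for step in range(-max_dist, max_dist + 1):
        cand = (control[0] + normal[0] * step, control[1] + normal[1] * step)
        s = score(cand)
        if s > best_score:
            best, best_score = cand, s
    return best

pts = sample_control_points((0.0, 0.0), (10.0, 0.0), 6)
# Toy score peaks at v = 3, so the candidate moves each control point up 3 px.
match = search_along_normal(pts[0], (0.0, 1.0), lambda p: -abs(p[1] - 3.0), 5)
```

Each returned candidate then contributes one entry to the distance-error vector used in the pose update.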
In some embodiments, the step of determining the new pose parameter based on the distance error, with reference to Fig. 4, may include: characterizing the initial pose parameter in the Lie algebra space; determining a pose increment according to the distance error; and determining the new pose parameter based on the pose increment.
In some embodiments, the step of determining the pose increment according to the distance error may include: determining the pose increment by robust estimation according to the distance error. Determining the new pose parameter based on the pose increment includes: determining the rotation increment and translation increment in Euclidean space through the exponential map, and determining a new rotation matrix and a new translation vector based on the rotation increment and the translation increment.
This embodiment also provides a method for building the reference feature library of this embodiment; the resulting library can be applied in the three-dimensional object pose parameter estimation method of this embodiment, in particular in step S503. With reference to Fig. 6, the reference feature library building method includes steps S601 to S604.
Step S601: acquire images of the three-dimensional object under multiple viewing angles, obtaining multiple frames of the three-dimensional object.
In this embodiment, the multiple frames may cover at least part of the three-dimensional object or views under more viewing angles, and each viewing angle may contribute one frame or several frames. In some embodiments, images under 3 to 5 viewing angles of the three-dimensional object may be included.
Step S602: extract the image geometric features of the three-dimensional object in the multiple frames.
In some embodiments, the features are obtained by a feature detection and description algorithm; the extracted information for each feature includes its coordinates in the image coordinate system and a feature vector computed from the nearby gray values.
In some embodiments, the image geometric features are key corner features and/or straight line features of the three-dimensional object.
In some embodiments, extracting the image geometric features of the three-dimensional object in the multiple frames may include: extracting key corner points and/or straight line segments of the three-dimensional object on the multiple frames, and determining the feature vectors of the extracted key corner points and/or straight line segments based on their local gray values.
Step S603: determine the three-dimensional space coordinates corresponding to the image geometric features using inverse projection, according to the three-dimensional model of the three-dimensional object.
In this embodiment, every geometric feature of each frame corresponds to a three-dimensional space coordinate.
Step S604: form the reference feature library containing the correspondences between the image geometric features of the three-dimensional object and the three-dimensional space coordinates.
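A minimal sketch of such a library is a list of (descriptor, 3-D coordinate) pairs with nearest-neighbor matching; the plain-tuple descriptors below are illustrative stand-ins for the gray-value feature vectors of the text:

```python
def build_feature_library(entries):
    """entries: (descriptor, xyz) pairs extracted from the multi-view
    reference images; kept as-is for brute-force matching."""
    return list(entries)

def match_feature(library, descriptor):
    """Return the 3-D coordinate whose stored descriptor is nearest in L2."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(library, key=lambda entry: dist2(entry[0], descriptor))[1]

lib = build_feature_library([((1.0, 0.0), (0, 0, 0)),
                             ((0.0, 1.0), (1, 1, 1))])
xyz = match_feature(lib, (0.9, 0.1))
```

In step S503, each feature extracted from the initial frame is matched this way, producing the 2D-3D correspondences consumed by the PnP step. Real systems would use a k-d tree or ratio test rather than brute force.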
In some embodiments, as an illustrative example, a three-dimensional object pose parameter estimation method includes: 1) an offline stage: build the reference feature library according to the method shown in Fig. 6; 2) an initialization stage: determine the pose parameter corresponding to the initial frame image according to the method shown in Fig. 5; 3) a pose parameter update stage: determine the pose parameters corresponding to the image frames after the initial frame image according to the method shown in Fig. 2.
In some embodiments, it may first be judged whether an image is the initial frame image. When the image is the initial frame image, the initialization stage is entered, and the pose parameter corresponding to the initial frame image is determined according to the method shown in Fig. 5; when the image is not the initial frame image, the pose parameter update stage is entered, and the pose parameter corresponding to the image frame is determined according to the method shown in Fig. 2.
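The per-frame dispatch between the initialization and update stages can be sketched as follows; `initialize` and `track` are hypothetical stand-ins for the Fig. 5 and Fig. 2 procedures:

```python
def estimate_pose(frame_index, initialize, track, prev_pose=None):
    """Dispatch as described in the text: the initial frame goes through
    initialization; every later frame is tracked, seeded with the pose
    estimated for the previous frame."""
    if frame_index == 0:
        return initialize()
    return track(prev_pose)

# Toy stand-ins: initialization yields pose 0; tracking increments the pose.
poses = []
for i in range(3):
    prev = poses[-1] if poses else None
    poses.append(estimate_pose(i, lambda: 0, lambda p: p + 1, prev))
```

This structure is what lets the method avoid per-frame global matching: the expensive feature-library lookup runs once, and subsequent frames reuse the previous pose as a warm start.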
The present application further provides a visual device, which includes a memory, a processor, and a computer program stored on the memory and executable on the processor; when the computer program is executed by the processor, the steps of the three-dimensional object pose parameter estimation method of any of the above embodiments are realized.
The present application further provides a computer-readable storage medium on which a three-dimensional object pose parameter estimation program is stored; when the program is executed by a processor, the steps of any three-dimensional object pose parameter estimation method of the present application are realized.
Experiments and Results
In this embodiment, a continuously moving cube is used as the test object; the image resolution is 640×480 pixels. The initial frame image is initialized to obtain the initial pose parameter; for each subsequent frame, the pose parameter of the previous frame serves as the initial pose parameter of the current frame, and two-dimensional to three-dimensional straight-line feature matching and pose parameter estimation are performed. All experiments were run on a portable computer with an Intel Core i5-5200HQ CPU (2.2 GHz) and 8 GB of RAM.
Images of the cube from three different viewing angles were taken as reference images to build the reference feature library; the multi-frame reference images are shown in Fig. 10. The key point correspondences between the initial frame image and the reference feature library are shown in Fig. 11: the bottom-right image is the initial frame image, corresponding point features are connected by straight lines, and the cube's three-dimensional model is projected onto the initial-frame two-dimensional image plane with the initial pose parameter of the initial frame image.
The straight-line feature matching and pose parameter estimation results for the subsequent frames are shown in Fig. 12. The cube's three-dimensional space straight line segments are re-projected onto the image plane with the estimated pose parameters; the projected lines essentially coincide with the image lines, and the orientation of the drawn rectangular coordinate system indicates the object pose. The comparison curves between the continuously estimated pose parameters and the labeled reference values are shown in Fig. 13; the estimated curves essentially coincide with the reference curves. The root-mean-square errors between the pose estimates and the reference values were calculated, as shown in Table 1. In addition, the average processing time per frame is 43 ms, which meets the real-time requirement.
Table 1. Cube pose parameter errors
The above results show that the proposed algorithm can accurately match two-dimensional and three-dimensional straight-line features while accurately estimating the pose parameters of the three-dimensional object, with low computational complexity.
It should be noted that, in this document, the terms "comprise" and "include", or any other variants thereof, are intended to cover a non-exclusive inclusion, so that a process, method, article, or device comprising a series of elements includes not only those elements but also other elements not expressly listed, or elements inherent to such a process, method, article, or device. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other identical elements in the process, method, article, or device that comprises the element.
The serial numbers of the above embodiments of the present invention are for description only and do not represent the relative merits of the embodiments.
Through the description of the above embodiments, those skilled in the art can clearly understand that the methods of the above embodiments can be implemented by software plus a necessary general-purpose hardware platform, and of course also by hardware, though in many cases the former is the preferable implementation. Based on this understanding, the technical solution of the present invention, or the part that contributes to the prior art, can be embodied in the form of a software product stored in a storage medium (such as ROM/RAM, a magnetic disk, or an optical disc), including several instructions to cause a terminal (which may be a mobile phone, a computer, a server, an air conditioner, or a network device, etc.) to execute the methods described in the embodiments of the present invention.
The embodiments of the present invention have been described above with reference to the accompanying drawings, but the invention is not limited to the above specific embodiments, which are only illustrative rather than restrictive. Under the inspiration of the present invention, those skilled in the art can devise many further forms without departing from the scope protected by the purpose of the present invention and the claims, all of which fall within the protection of the present invention.
Claims (10)
1. A three-dimensional object pose parameter estimation method, characterized by comprising:
obtaining a pose parameter of a three-dimensional object determined based on a previous frame image of the three-dimensional object, as an initial pose parameter for determining the pose parameter corresponding to a current frame image of the three-dimensional object;
projecting a three-dimensional space straight line segment of the three-dimensional object onto the two-dimensional image plane of the current frame image based on the initial pose parameter, to obtain a projection straight line segment of the three-dimensional space straight line segment in the two-dimensional image plane;
determining the image straight line segment closest to the projection straight line segment in the two-dimensional image plane, and determining the distance error between the projection straight line segment and the closest image straight line segment;
judging whether the distance error meets a preset condition;
if the preset condition is not met, determining a new pose parameter based on the distance error, and taking the new pose parameter as the initial pose parameter;
if the preset condition is met, taking the initial pose parameter as the pose parameter of the current frame image of the three-dimensional object.
2. The three-dimensional object pose parameter estimation method according to claim 1, characterized in that, before projecting the three-dimensional space straight line segment of the three-dimensional object onto the two-dimensional image plane of the current frame image based on the initial pose parameter to obtain the projection straight line segment of the three-dimensional space straight line segment in the two-dimensional image plane, the method further comprises:
determining the angle between the line of sight of the current frame image and the plane in which the three-dimensional space straight line segment lies, judging the projection visibility of the three-dimensional space straight line segment based on the angle, and removing any three-dimensional space straight line segment that is occluded.
3. The three-dimensional object pose parameter estimation method according to claim 1, characterized in that determining the image straight line segment closest to the projection straight line segment in the two-dimensional image plane, and determining the distance error between the projection straight line segment and the closest image straight line segment, comprises:
sampling the projection straight line segment to obtain multiple control points;
determining the perpendicular direction of the projection straight line segment;
at each of the multiple control points, searching bidirectionally along the perpendicular direction within a preset search range, and taking the pixel with the maximum likelihood ratio as a candidate corresponding point, to obtain multiple candidate corresponding points;
determining the distance errors between the multiple candidate corresponding points and the projection straight line segment.
4. The three-dimensional object pose parameter estimation method according to claim 1, characterized in that determining the new pose parameter based on the distance error comprises:
characterizing the initial pose parameter in the Lie algebra space;
determining a pose increment according to the distance error;
determining the new pose parameter based on the pose increment.
5. The three-dimensional object pose parameter estimation method according to claim 4, characterized in that:
determining the pose increment according to the distance error comprises: determining the pose increment by robust estimation according to the distance error;
determining the new pose parameter based on the pose increment comprises: determining the rotation increment and translation increment in Euclidean space through the exponential map, and determining a new rotation matrix and a new translation vector based on the rotation increment and the translation increment.
6. A three-dimensional object pose parameter estimation method, characterized by comprising:
obtaining an initial frame image of a three-dimensional object;
extracting image geometric features of the three-dimensional object in the initial frame image;
matching the extracted image geometric features against a reference feature library of the three-dimensional object to obtain the three-dimensional space coordinates corresponding to the extracted image geometric features, wherein the reference feature library comprises correspondences between image geometric features extracted from images of the three-dimensional object under multiple viewing angles and three-dimensional space coordinates;
determining an initial pose parameter of the three-dimensional object by a perspective-n-point method, based on a three-dimensional model of the three-dimensional object and the two-dimensional image coordinates and three-dimensional space coordinates corresponding to the extracted image geometric features;
projecting a three-dimensional space straight line segment of the three-dimensional object onto the two-dimensional image plane of the initial frame image based on the initial pose parameter, to obtain a projection straight line segment of the three-dimensional space straight line segment in the two-dimensional image plane;
determining the image straight line segment closest to the projection straight line segment in the two-dimensional image plane, and determining the distance error between the projection straight line segment and the closest image straight line segment;
judging whether the distance error meets a preset condition;
if the preset condition is not met, determining a new pose parameter based on the distance error, and taking the new pose parameter as the initial pose parameter;
if the preset condition is met, taking the initial pose parameter as the pose parameter of the three-dimensional object.
7. The three-dimensional object pose parameter estimation method according to claim 6, characterized in that the image geometric features are key corner features and/or straight line features of the three-dimensional object.
8. The three-dimensional object pose parameter estimation method according to claim 6, characterized in that extracting the image geometric features of the three-dimensional object in the initial frame image comprises:
extracting key corner points and/or straight line segments of the three-dimensional object on the initial frame image;
determining feature vectors of the extracted key corner points and/or straight line segments based on the local gray values of the extracted key corner points and/or straight line segments.
9. A visual device, characterized in that the visual device comprises:
a memory, a processor, and a computer program stored on the memory and executable on the processor;
wherein the computer program, when executed by the processor, implements the steps of the method according to any one of claims 1 to 8.
10. A computer-readable storage medium, characterized in that a three-dimensional object pose parameter estimation program is stored on the computer-readable storage medium, and the three-dimensional object pose parameter estimation program, when executed by a processor, implements the steps of the three-dimensional object pose parameter estimation method according to any one of claims 1 to 8.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910389377.3A CN110111388B (en) | 2019-05-10 | 2019-05-10 | Three-dimensional object pose parameter estimation method and visual equipment |
Publications (2)
Publication Number | Publication Date |
---|---|
CN110111388A true CN110111388A (en) | 2019-08-09 |
CN110111388B CN110111388B (en) | 2021-03-23 |
Family
ID=67489342
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910389377.3A Active CN110111388B (en) | 2019-05-10 | 2019-05-10 | Three-dimensional object pose parameter estimation method and visual equipment |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110111388B (en) |
Cited By (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110542422A (en) * | 2019-10-10 | 2019-12-06 | 上海钛米机器人科技有限公司 | Robot positioning method, device, robot and storage medium |
CN111127547A (en) * | 2019-12-17 | 2020-05-08 | 北京迈格威科技有限公司 | Positioning method, positioning device, robot and storage medium |
CN111325796A (en) * | 2020-02-28 | 2020-06-23 | 北京百度网讯科技有限公司 | Method and apparatus for determining pose of vision device |
CN111369571A (en) * | 2020-02-27 | 2020-07-03 | 北京百度网讯科技有限公司 | Three-dimensional object pose accuracy judgment method and device and electronic equipment |
CN112037282A (en) * | 2020-09-04 | 2020-12-04 | 北京航空航天大学 | Aircraft attitude estimation method and system based on key points and skeleton |
CN113256722A (en) * | 2021-06-21 | 2021-08-13 | 浙江华睿科技有限公司 | Pose determination method, pose determination device and storage medium |
CN113551661A (en) * | 2020-04-23 | 2021-10-26 | 曰轮法寺 | Pose identification and track planning method, device and system, storage medium and equipment |
CN113643356A (en) * | 2020-04-27 | 2021-11-12 | 北京达佳互联信息技术有限公司 | Camera pose determination method, camera pose determination device, virtual object display method, virtual object display device and electronic equipment |
CN113793297A (en) * | 2021-08-13 | 2021-12-14 | 北京迈格威科技有限公司 | Pose determination method and device, electronic equipment and readable storage medium |
Citations (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20100197391A1 (en) * | 2009-01-30 | 2010-08-05 | Microsoft Corporation | Visual target tracking |
EP2722136A1 (en) * | 2012-10-19 | 2014-04-23 | inos Automationssoftware GmbH | Method for in-line calibration of an industrial robot, calibration system for performing such a method and industrial robot comprising such a calibration system |
CN105096386A (en) * | 2015-07-21 | 2015-11-25 | 中国民航大学 | Method for automatically generating geographic maps for large-range complex urban environment |
CN107564062A (en) * | 2017-08-16 | 2018-01-09 | 清华大学 | Pose method for detecting abnormality and device |
CN107833253A (en) * | 2017-09-22 | 2018-03-23 | 北京航空航天大学青岛研究院 | A kind of camera pose refinement method towards the generation of RGBD three-dimensional reconstructions texture |
CN108022264A (en) * | 2016-11-01 | 2018-05-11 | 狒特科技(北京)有限公司 | Camera pose determines method and apparatus |
CN108122256A (en) * | 2017-12-25 | 2018-06-05 | 北京航空航天大学 | It is a kind of to approach under state the method for rotating object pose measurement |
CN108986161A (en) * | 2018-06-19 | 2018-12-11 | 亮风台(上海)信息科技有限公司 | A kind of three dimensional space coordinate estimation method, device, terminal and storage medium |
CN109470149A (en) * | 2018-12-12 | 2019-03-15 | 北京理工大学 | A kind of measurement method and device of pipeline pose |
CN109493384A (en) * | 2018-09-20 | 2019-03-19 | 顺丰科技有限公司 | Camera position and orientation estimation method, system, equipment and storage medium |
CN109725645A (en) * | 2019-03-29 | 2019-05-07 | 中国人民解放军国防科技大学 | Nested unmanned aerial vehicle landing cooperation sign design and relative pose acquisition method |
US20200043189A1 (en) * | 2017-01-13 | 2020-02-06 | Zhejiang University | Simultaneous positioning and dense three-dimensional reconstruction method |
Non-Patent Citations (3)
Title |
---|
KARL PAUWELS ET AL: ""Real-time Model-based Rigid Object Pose Estimation and Tracking Combining Dense and Sparse Visual Cues"", 《2013 IEEE CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION》 * |
ZHOUDI HUANG ET AL: ""BALG: An alternative for fast and robust feature matching"", 《J. VIS. COMMUN. IMAGE R.》 * |
冯春等: ""基于多传感器融合的航天器间位姿参数估计"", 《红外与激光工程》 * |
Also Published As
Publication number | Publication date |
---|---|
CN110111388B (en) | 2021-03-23 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110111388A (en) | Three-dimension object pose parameter estimation method and visual apparatus | |
CN105469405B (en) | Positioning and map constructing method while view-based access control model ranging | |
CN106446815B (en) | A kind of simultaneous localization and mapping method | |
WO2020259481A1 (en) | Positioning method and apparatus, electronic device, and readable storage medium | |
Liang et al. | Image based localization in indoor environments | |
Sankar et al. | Capturing indoor scenes with smartphones | |
Chen et al. | Rise of the indoor crowd: Reconstruction of building interior view via mobile crowdsourcing | |
CN103875024B (en) | Systems and methods for navigating camera | |
CN109307508A (en) | A kind of panorama inertial navigation SLAM method based on more key frames | |
CN110246147A (en) | Vision inertia odometer method, vision inertia mileage counter device and mobile device | |
CN109242913A (en) | Scaling method, device, equipment and the medium of collector relative parameter | |
CN109154973A (en) | Execute the method and system of convolved image transformation estimation | |
US20150286893A1 (en) | System And Method For Extracting Dominant Orientations From A Scene | |
JP6609640B2 (en) | Managing feature data for environment mapping on electronic devices | |
CN110533694A (en) | Image processing method, device, terminal and storage medium | |
US11373329B2 (en) | Method of generating 3-dimensional model data | |
CN104848861A (en) | Image vanishing point recognition technology based mobile equipment attitude measurement method | |
CN112733641A (en) | Object size measuring method, device, equipment and storage medium | |
CN109902675A (en) | The method and apparatus of the pose acquisition methods of object, scene reconstruction | |
Shalaby et al. | Algorithms and applications of structure from motion (SFM): A survey | |
Koch et al. | Wide-area egomotion estimation from known 3d structure | |
US10970869B2 (en) | Method for generating roof outlines from lateral images | |
Huttunen et al. | A monocular camera gyroscope | |
CN111091117B (en) | Target detection method, device, equipment and medium for two-dimensional panoramic image | |
CN111161138B (en) | Target detection method, device, equipment and medium for two-dimensional panoramic image |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |
GR01 | Patent grant | |