CN108109175A - Method and device for tracking image feature points - Google Patents
Method and device for tracking image feature points
- Publication number: CN108109175A
- Application number: CN201711381388.4A
- Authority: CN (China)
- Prior art keywords: first feature point, frame image, current frame, point
- Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
- G06T7/74—Determining position or orientation of objects or cameras using feature-based methods involving reference images or patches
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/246—Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
Landscapes
- Engineering & Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Multimedia (AREA)
- Image Analysis (AREA)
Abstract
This application discloses a method and device for tracking image feature points. The method includes: detecting feature points to be matched in a current frame image; detecting first feature points in at least two target frame images preceding the current frame image; obtaining predicted positions of the first feature points in the current frame image; and determining, from the feature points to be matched, second feature points that match the first feature points; wherein the current position of a second feature point in the current frame image and the predicted position of its matched first feature point in the current frame image satisfy a preset positional relationship. Because the application tracks and localizes from predictions rather than performing a global traversal of the current frame image, the amount of computation during tracking is greatly reduced, the tracking rate is increased, and fast, stable feature-point matching and tracking is achieved.
Description
Technical field
This application relates to the technical field of image processing, and in particular to a method and device for tracking image feature points.
Background art
Reliable and stable feature-point localization is an important guarantee of a good user experience in AR (augmented reality). To accurately locate the physical position of a feature point in three-dimensional space from camera images, the feature points in consecutive image frames must be matched and tracked, as in localization technologies such as binocular visual localization, monocular visual localization, and laser localization. These technologies typically achieve accurate feature-point localization by identifying, matching, and tracking feature points across successive image frames.
Currently, the commonly used feature-point tracking methods are optical flow and global, descriptor-based feature-point matching. However, the former is sensitive to illumination, while the latter requires a global traversal, so its computation is heavy, which introduces considerable delay and affects the real-time performance of feature-point tracking.
Summary of the invention
The purpose of this application is to provide a method and device for tracking image feature points, to solve the technical problem in the prior art that feature-point tracking requires heavy computation, which affects the real-time performance of feature-point tracking.
This application provides a method for tracking image feature points, including:
detecting feature points to be matched in a current frame image;
detecting first feature points in at least two target frame images preceding the current frame image;
obtaining predicted positions of the first feature points in the current frame image;
determining, from the feature points to be matched, second feature points that match the first feature points;
wherein the current position of a second feature point in the current frame image and the predicted position of its matched first feature point in the current frame image satisfy a preset positional relationship.
In the above method, preferably, the positional relationship includes: the current position is nearest to the predicted position.
In the above method, preferably, the positional relationship includes: the distance between the current position and the predicted position is less than a preset threshold.
In the above method, preferably, obtaining the predicted position of the first feature point in the current frame image includes:
obtaining target positions of the first feature point in the target frame images;
determining, based on the target positions, tracking-and-matching information between the first feature points;
obtaining, with a preset prediction algorithm and based on the tracking-and-matching information, the predicted position of the first feature point in the current frame image.
In the above method, preferably, the prediction algorithm includes: a linear prediction algorithm or a Kalman prediction algorithm.
This application also provides a device for tracking image feature points, including:
a current feature point detection unit, configured to detect feature points to be matched in a current frame image;
a first feature point detection unit, configured to detect first feature points in at least two target frame images preceding the current frame image;
a position prediction unit, configured to obtain predicted positions of the first feature points in the current frame image;
a feature point matching unit, configured to determine, from the feature points to be matched, second feature points that match the first feature points;
wherein the current position of a second feature point in the current frame image and the predicted position of its matched first feature point in the current frame image satisfy a preset positional relationship.
In the above device, preferably, the positional relationship includes: the current position is nearest to the predicted position.
In the above device, preferably, the positional relationship includes: the distance between the current position and the predicted position is less than a preset threshold.
In the above device, preferably, the position prediction unit includes:
a target position obtaining subunit, configured to obtain target positions of the first feature point in the target frame images;
a feature point tracking subunit, configured to determine, based on the target positions, tracking-and-matching information between the first feature points;
a feature point prediction subunit, configured to obtain, with a preset prediction algorithm and based on the tracking-and-matching information, the predicted position of the first feature point in the current frame image.
In the above device, preferably, the prediction algorithm includes: a linear prediction algorithm or a Kalman prediction algorithm.
As can be seen from the above solutions, the method and device for tracking image feature points provided by this application predict the positions of feature points in the current frame image, and then track feature points in the current frame image according to the prediction results, achieving accurate localization of the feature points. This application does not perform a global traversal of the current frame image but tracks and localizes purely from predictions, greatly reducing the amount of computation during tracking, increasing the tracking rate, and achieving fast, stable feature-point matching and tracking.
Description of the drawings
To describe the technical solutions in the embodiments of this application more clearly, the accompanying drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings described below are only some embodiments of this application; those of ordinary skill in the art can derive other drawings from them without creative effort.
Fig. 1 is a flowchart of a method for tracking image feature points provided by Embodiment 1 of this application;
Fig. 2, Fig. 3, Fig. 4 and Fig. 5 are application example diagrams of the embodiments of this application;
Fig. 6 is a structural diagram of a device for tracking image feature points provided by Embodiment 2 of this application;
Fig. 7 is a partial structural diagram of the device for tracking image feature points provided by Embodiment 2 of this application;
Fig. 8 is another application example diagram of the embodiments of this application.
Specific embodiment
The technical solutions in the embodiments of this application are described clearly and completely below with reference to the accompanying drawings in the embodiments of this application. Obviously, the described embodiments are only some, not all, of the embodiments of this application. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of this application without creative effort shall fall within the protection scope of this application.
Referring to Fig. 1, a flowchart of the method for tracking image feature points provided by Embodiment 1 of this application, the method is suitable for applications that track and localize feature points in consecutive frame images; for example, in games such as AR (augmented reality), character features in images are tracked and localized to provide the user with a better experience.
In this embodiment, the method may include the following steps:
Step 101: determine the current frame image in which feature points need to be tracked.
Feature-point tracking in this embodiment means finding, in the current frame image, the current positions of feature points or features from images preceding the current frame image. As shown in Fig. 2, the point corresponding to the feature point in image 2 is found in image 1, where image 2 is an image frame preceding image 1.
Step 102: detect feature points to be matched in the current frame image.
In this embodiment, a preset feature-point detection algorithm, such as the Harris corner detection algorithm or the DoG (Difference of Gaussians) algorithm, may be used to detect pixels or regions with stable features in the current frame image.
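As a rough illustration of the kind of corner detection this step relies on, below is a minimal numpy sketch of the Harris response. It is not the detector implementation used by this application; the 3×3 window and k = 0.04 are common textbook choices, not values from the text.

```python
import numpy as np

def harris_response(img, k=0.04):
    """Compute the Harris corner response for a grayscale image.

    A minimal sketch: gradients via central differences, a 3x3 box
    window for the structure tensor, response R = det(M) - k*trace(M)^2.
    """
    img = img.astype(float)
    # Image gradients (central differences along rows and columns).
    Iy, Ix = np.gradient(img)

    def box3(a):
        # 3x3 box filter with edge padding.
        p = np.pad(a, 1, mode='edge')
        return sum(p[i:i + a.shape[0], j:j + a.shape[1]]
                   for i in range(3) for j in range(3)) / 9.0

    Sxx, Syy, Sxy = box3(Ix * Ix), box3(Iy * Iy), box3(Ix * Iy)
    det = Sxx * Syy - Sxy * Sxy
    trace = Sxx + Syy
    return det - k * trace ** 2

# A bright square on a dark background: the response is positive at the
# square's corners, negative along its edges, and zero in flat regions.
img = np.zeros((20, 20))
img[5:15, 5:15] = 1.0
R = harris_response(img)
y, x = np.unravel_index(np.argmax(R), R.shape)
print((y, x))  # the strongest response lies at one of the square's corners
```

Pixels where the response exceeds a threshold would then be kept as the feature points to be matched.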
Step 103: obtain at least two target frame images preceding the current frame image.
As shown in Fig. 3, image a, image b and image c are images preceding image d, and image d is the current frame image. In this embodiment, images a to c are obtained and taken as the target frame images.
Step 104: detect the first feature points in each target frame image.
In this embodiment, a preset feature-point detection algorithm, such as the Harris corner detection algorithm or the DoG (Difference of Gaussians) algorithm, may be used to detect the first feature point in each target frame image. As shown in Fig. 4, first feature point a1 in image a, first feature point b1 in image b, and first feature point c1 in image c are feature points of the same position of the same object, for example the feature point of the left eye of a face.
It should be noted that after a feature point is detected in this embodiment, its pixel position (x, y) in the image is also known, where x is the abscissa and y is the ordinate in the image coordinate system.
Step 105: obtain the target positions of the first feature points in the target frame images.
For example, for the three target frame images in Fig. 4, the target positions of first feature points a1, b1 and c1 in their respective images are obtained, such as position A1 of first feature point a1, position B1 of first feature point b1, and position C1 of first feature point c1.
Step 106: determine, based on the target positions, the tracking-and-matching information between the first feature points.
For example, based on position A1 of first feature point a1, position B1 of first feature point b1, and position C1 of first feature point c1 in the three target frame images in Fig. 4, the tracking-and-matching information between these feature points is determined; for example, an object such as the left eye of a face moves from position A1 in image a to position B1 in image b, and then to position C1 in image c.
Step 107: obtain, with a preset prediction algorithm and based on the tracking-and-matching information, the predicted position of the first feature point in the current frame image.
In this embodiment, a linear prediction algorithm or a Kalman prediction algorithm may be used to predict the position of the first feature point in the current frame image.
In one implementation, if there are only two target frame images, a linear prediction algorithm may be used to predict where the first feature points of the target frame images are likely to appear in the current frame image. The linear prediction algorithm is similar to solving for the coefficients a and b of ax + b = y, and then predicting y for a given x. For example, in this embodiment the positions of a first feature point in the two target frame images are used to solve for the coefficients, and the resulting equation is then used to predict the abscissa and ordinate of the first feature point in the current frame image.
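The ax + b = y analogy above amounts to constant-velocity extrapolation over frame indices; a minimal sketch (the coordinates below are made up for illustration):

```python
import numpy as np

def predict_linear(p_prev, p_curr):
    """Predict the next-frame position of a feature point from its
    positions in the two preceding target frames.

    With frame index as x and each coordinate as y, two observations fix
    the line ax + b = y; evaluating it one frame ahead is equivalent to
    adding the last per-frame displacement (the slope a) once more.
    """
    p_prev = np.asarray(p_prev, dtype=float)
    p_curr = np.asarray(p_curr, dtype=float)
    velocity = p_curr - p_prev   # displacement per frame (slope a)
    return p_curr + velocity     # extrapolate one frame ahead

# A feature moved from (100, 80) to (104, 83) across the two target
# frames, so it is predicted at (108, 86) in the current frame.
print(predict_linear((100, 80), (104, 83)))  # [108.  86.]
```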
In another implementation, if there are three or more target frame images, this embodiment may use a Kalman prediction algorithm to predict where the first feature points of the target frame images are likely to appear in the current frame image.
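A minimal constant-velocity Kalman predictor over three or more target frames might look as follows. This is a sketch under stated assumptions: the state model (position plus velocity) and the noise covariances Q and R are illustrative choices, not values given in the text.

```python
import numpy as np

# Constant-velocity model: state is (x, y, vx, vy); each observed
# position updates the state, and a final predict step yields the
# expected current-frame position.
F = np.eye(4); F[0, 2] = F[1, 3] = 1.0       # x' = x + vx, y' = y + vy
H = np.zeros((2, 4)); H[0, 0] = H[1, 1] = 1  # we observe position only
Q = np.eye(4) * 1e-4                         # process noise (assumed)
R = np.eye(2) * 1e-2                         # measurement noise (assumed)

def kalman_predict_position(observations):
    """Run predict/update over the observed positions in the target
    frames, then return the predicted position for the current frame."""
    x = np.array([*observations[0], 0.0, 0.0])  # init with zero velocity
    P = np.eye(4)
    for z in observations[1:]:
        x, P = F @ x, F @ P @ F.T + Q           # predict
        S = H @ P @ H.T + R                     # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)          # Kalman gain
        x = x + K @ (np.asarray(z, float) - H @ x)
        P = (np.eye(4) - K @ H) @ P             # update
    return (F @ x)[:2]                          # one more predict step

# Near-linear motion across three target frames.
track = [(100.0, 80.0), (104.0, 83.0), (108.0, 86.0)]
print(kalman_predict_position(track))  # close to (112, 89)
```

With more noisy observations, the filter's velocity estimate smooths out jitter that pure two-frame linear extrapolation would pass through.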
Step 108: determine, from the feature points to be matched, the second feature points that match the first feature points.
Certain rules need to be followed when matching a second feature point; for example, the current position of the matched second feature point in the current frame image and the predicted position of its matched first feature point in the current frame image must satisfy a preset positional relationship.
The positional relationship here may be: among all feature points to be matched, the current position of the matched second feature point is nearest to the predicted position of the matched first feature point. For example, as shown in Fig. 5, the predicted position in the current frame image of first feature points a1, b1 and c1 from the three target frame images is X, and the current positions of several feature points to be matched in the current frame image are Y1 to Y4; the feature point at position Y2 is nearest to X, so the feature point at Y2 is determined to be the second feature point.
Alternatively, when matching a first feature point, this embodiment may compute the distances between the predicted position of the first feature point and the nearby feature points to be matched, and then take as the second feature point a feature point whose distance is less than a preset threshold, for example a preset pixel-coordinate distance, thereby achieving feature-point tracking.
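Both rules (nearest neighbor, and nearest neighbor within a threshold) can be sketched together. The positions echo the Fig. 5 example, but the concrete coordinates and the 5-pixel gate are hypothetical values for illustration only.

```python
import numpy as np

def match_nearest(predicted, candidates, max_dist=5.0):
    """Match a predicted position to the nearest candidate feature point.

    Returns the index of the closest candidate, or None when even the
    closest one exceeds the distance threshold (`max_dist` is an
    illustrative assumption, not a value given in the text).
    """
    candidates = np.asarray(candidates, dtype=float)
    dists = np.linalg.norm(candidates - np.asarray(predicted, float), axis=1)
    best = int(np.argmin(dists))
    return best if dists[best] <= max_dist else None

# Predicted position X and four candidates Y1..Y4, in the spirit of
# Fig. 5: the candidate at Y2 is nearest to X, so it is taken as the
# second feature point.
X = (108.0, 86.0)
Y = [(90.0, 60.0), (109.0, 87.0), (130.0, 86.0), (108.0, 120.0)]
print(match_nearest(X, Y))  # 1  (i.e. Y2)
```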
As can be seen from the above, the method for tracking image feature points provided by Embodiment 1 of this application predicts the positions of feature points in the current frame image, and then tracks feature points in the current frame image according to the prediction results, achieving accurate localization of the feature points. This embodiment does not perform a global traversal of the current frame image but tracks and localizes purely from predictions, greatly reducing the amount of computation during tracking, increasing the tracking rate, and achieving fast, stable feature-point matching and tracking.
Referring to Fig. 6, a structural diagram of the device for tracking image feature points provided by Embodiment 2 of this application, the device is suitable for applications that track and localize feature points in consecutive frame images; for example, in games such as AR, character features in images are tracked and localized to provide the user with a better experience.
In this embodiment, the device may include the following structures:
Current feature point detection unit 601, configured to detect feature points to be matched in the current frame image.
The current feature point detection unit 601 first determines the current frame image in which feature points need to be tracked, and then detects the feature points to be matched in the current frame image. Feature-point tracking in this embodiment means finding, in the current frame image, the current positions of feature points or features from images preceding the current frame image. As shown in Fig. 2, the point corresponding to the feature point in image 2 is found in image 1, where image 2 is an image frame preceding image 1.
It should be noted that in this embodiment a preset feature-point detection algorithm, such as the Harris corner detection algorithm or the DoG algorithm, may be used to detect pixels or regions with stable features in the current frame image.
First feature point detection unit 602, configured to detect the first feature points in at least two target frame images preceding the current frame image.
The first feature point detection unit 602 may first obtain at least two target frame images preceding the current frame image, and then detect the first feature points in each target frame image. As shown in Fig. 3, image a, image b and image c are images preceding image d, and image d is the current frame image; in this embodiment, images a to c are obtained and taken as the target frame images.
In this embodiment, a preset feature-point detection algorithm, such as the Harris corner detection algorithm or the DoG (Difference of Gaussians) algorithm, may be used to detect the first feature point in each target frame image. As shown in Fig. 4, first feature point a1 in image a, first feature point b1 in image b, and first feature point c1 in image c are feature points of the same position of the same object, for example the feature point of the left eye of a face.
It should be noted that after a feature point is detected in this embodiment, its pixel position (x, y) in the image is also known, where x is the abscissa and y is the ordinate in the image coordinate system.
Position prediction unit 603, configured to obtain the predicted positions of the first feature points in the current frame image.
In a concrete implementation, the position prediction unit 603 may be realized with the following structures, as shown in Fig. 7:
Target position obtaining subunit 701, configured to obtain the target positions of the first feature points in the target frame images.
For example, for the three target frame images in Fig. 4, the target positions of first feature points a1, b1 and c1 in their respective images are obtained, such as position A1 of first feature point a1, position B1 of first feature point b1, and position C1 of first feature point c1.
Feature point tracking subunit 702, configured to determine, based on the target positions, the tracking-and-matching information between the first feature points.
For example, based on position A1 of first feature point a1, position B1 of first feature point b1, and position C1 of first feature point c1 in the three target frame images in Fig. 4, the tracking-and-matching information between these feature points is determined; for example, an object such as the left eye of a face moves from position A1 in image a to position B1 in image b, and then to position C1 in image c.
Feature point prediction subunit 703, configured to obtain, with a preset prediction algorithm and based on the tracking-and-matching information, the predicted position of the first feature point in the current frame image.
In this embodiment, a linear prediction algorithm or a Kalman prediction algorithm may be used to predict the position of the first feature point in the current frame image.
In one implementation, if there are only two target frame images, a linear prediction algorithm may be used to predict where the first feature points of the target frame images are likely to appear in the current frame image. The linear prediction algorithm is similar to solving for the coefficients a and b of ax + b = y, and then predicting y for a given x. For example, in this embodiment the positions of a first feature point in the two target frame images are used to solve for the coefficients, and the resulting equation is then used to predict the abscissa and ordinate of the first feature point in the current frame image.
In another implementation, if there are three or more target frame images, this embodiment may use a Kalman prediction algorithm to predict where the first feature points of the target frame images are likely to appear in the current frame image.
Feature point matching unit 604, configured to determine, from the feature points to be matched, the second feature points that match the first feature points.
Here the current position of a second feature point in the current frame image and the predicted position of its matched first feature point in the current frame image satisfy a preset positional relationship; certain rules need to be followed when matching a second feature point.
The positional relationship may be: among all feature points to be matched, the current position of the matched second feature point is nearest to the predicted position of the matched first feature point. For example, as shown in Fig. 5, the predicted position in the current frame image of first feature points a1, b1 and c1 from the three target frame images is X, and the current positions of several feature points to be matched in the current frame image are Y1 to Y4; the feature point at position Y2 is nearest to X, so the feature point at Y2 is determined to be the second feature point.
Alternatively, when matching a first feature point, the feature point matching unit 604 may compute the distances between the predicted position of the first feature point and the nearby feature points to be matched, and then take as the second feature point a feature point whose distance is less than a preset threshold, for example a preset pixel-coordinate distance, thereby achieving feature-point tracking.
As can be seen from the above, the device for tracking image feature points provided by Embodiment 2 of this application predicts the positions of feature points in the current frame image, and then tracks feature points in the current frame image according to the prediction results, achieving accurate localization of the feature points. This embodiment does not perform a global traversal of the current frame image but tracks and localizes purely from predictions, greatly reducing the amount of computation during tracking, increasing the tracking rate, and achieving fast, stable feature-point matching and tracking.
The implementation in this embodiment is summarized below with reference to the flowchart in Fig. 8:
First, feature points are detected in the current frame image as the feature points to be matched.
Second, the feature-point positions of the preceding n (n > 3) frame images are obtained, and the predicted positions of these feature points in the current frame image are obtained from the matching-and-tracking information combined with a prediction algorithm such as linear prediction or Kalman prediction.
Third, among the feature points detected in the current frame, the feature point nearest to the predicted coordinate position is found as the matching feature-point pair.
Finally, repeating the above steps for all marker points, that is, feature points, yields the matching-and-tracking relationship between all feature points in the current frame and the feature points of the previous frame, which serves as the prediction-and-tracking basis for the feature points of the next frame, thereby achieving feature-point tracking and matching.
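The steps above can be sketched as a single per-frame update. This is a sketch under stated assumptions: only linear prediction is used, the 5-pixel matching gate is hypothetical, and detection is replaced by synthetic inputs rather than an actual detector.

```python
import numpy as np

def track_frame(tracks, detections, max_dist=5.0):
    """Extend each track with its matching detection in the new frame.

    Each track's current position is predicted linearly from its last two
    positions; the nearest unused detection within `max_dist` (the preset
    positional relationship; the value is an assumption) is appended.
    """
    detections = [np.asarray(d, float) for d in detections]
    used = set()
    for track in tracks:
        pred = 2 * track[-1] - track[-2]        # linear prediction
        dists = [np.linalg.norm(d - pred) if i not in used else np.inf
                 for i, d in enumerate(detections)]
        best = int(np.argmin(dists))
        if dists[best] <= max_dist:             # preset positional relationship
            track.append(detections[best])
            used.add(best)
    return tracks

# Two tracks moving with roughly constant velocity; the current frame's
# detections arrive in scrambled order, plus one spurious point.
tracks = [[np.array([10.0, 10.0]), np.array([12.0, 11.0])],
          [np.array([50.0, 40.0]), np.array([49.0, 42.0])]]
detections = [(200.0, 200.0), (48.2, 44.1), (13.9, 12.1)]
track_frame(tracks, detections)
print([tuple(t[-1]) for t in tracks])  # [(13.9, 12.1), (48.2, 44.1)]
```

The extended tracks then serve as the prediction basis for the next frame, as the summary describes.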
Those skilled in the art should understand that the embodiments of this application may be provided as a method, a system, or a computer program product. Therefore, this application may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware. Moreover, this application may take the form of a computer program product implemented on one or more computer-usable storage media (including but not limited to disk storage, CD-ROM, optical memory, etc.) containing computer-usable program code.
This application is described with reference to flowcharts and/or block diagrams of the method, device (system), and computer program product according to the embodiments of this application. It should be understood that each flow and/or block in the flowcharts and/or block diagrams, and combinations of flows and/or blocks in the flowcharts and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general-purpose computer, a special-purpose computer, an embedded processor, or another programmable data processing device to produce a machine, so that the instructions executed by the processor of the computer or other programmable data processing device produce an apparatus for realizing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or another programmable data processing device to work in a specific manner, so that the instructions stored in the computer-readable memory produce an article of manufacture including an instruction apparatus that realizes the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be loaded onto a computer or another programmable data processing device, so that a series of operation steps are performed on the computer or other programmable device to produce computer-implemented processing, whereby the instructions executed on the computer or other programmable device provide steps for realizing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
In a typical configuration, a computing device includes one or more processors (CPUs), an input/output interface, a network interface, and memory.
The memory may include computer-readable media in the form of volatile memory, random access memory (RAM) and/or non-volatile memory, such as read-only memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media include permanent and non-permanent, removable and non-removable media, and can store information by any method or technology. The information may be computer-readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technologies, compact disc read-only memory (CD-ROM), digital versatile disc (DVD) or other optical storage, magnetic cassettes, magnetic tape or disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory media, such as modulated data signals and carrier waves.
It should also be noted that the terms "comprising", "including" or any other variants thereof are intended to cover non-exclusive inclusion, so that a process, method, article or device that includes a series of elements includes not only those elements but also other elements not explicitly listed, or elements inherent to such a process, method, article or device. Without further limitation, an element defined by the phrase "including a ..." does not exclude the existence of other identical elements in the process, method, article or device that includes the element.
Those skilled in the art will understand that the embodiments of this application may be provided as a method, a system, or a computer program product. Therefore, this application may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware. Moreover, this application may take the form of a computer program product implemented on one or more computer-usable storage media (including but not limited to disk storage, CD-ROM, optical memory, etc.) containing computer-usable program code.
The above are merely embodiments of the present application and are not intended to limit the present application. Various modifications and variations of the present application will occur to those skilled in the art. Any modification, equivalent substitution, improvement, and the like made within the spirit and principles of the present application shall fall within the scope of the claims of the present application.
Claims (10)
1. A method for tracking image feature points, comprising:
detecting to-be-matched feature points in a current frame image;
detecting first feature points in at least two target frame images preceding the current frame image;
obtaining predicted positions of the first feature points in the current frame image;
determining, from the to-be-matched feature points, second feature points that match the first feature points;
wherein the current position of a second feature point in the current frame image and the predicted position of the matched first feature point in the current frame image satisfy a preset position relationship.
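For illustration only (this sketch is not part of the patent), the four claimed steps can be rendered in Python, assuming 2-D pixel coordinates and the nearest-position relationship of claim 2; all function and variable names below are hypothetical:

```python
# Hypothetical sketch of the claimed method: match each first feature
# point's predicted position to the nearest to-be-matched point detected
# in the current frame. Feature points are assumed to be 2-D coordinates.

def track_feature_points(predicted_positions, candidate_points):
    """For every first feature point (keyed by id), pick the candidate
    point in the current frame whose position is nearest to the predicted
    position (the preset position relationship)."""
    matches = {}
    for point_id, pred in predicted_positions.items():
        best = min(candidate_points,
                   key=lambda p: (p[0] - pred[0]) ** 2 + (p[1] - pred[1]) ** 2)
        matches[point_id] = best
    return matches

# Predicted positions of two first feature points in the current frame
predictions = {0: (10.0, 10.0), 1: (30.0, 5.0)}
# To-be-matched feature points detected in the current frame
candidates = [(9.5, 10.5), (29.0, 6.0), (50.0, 50.0)]
print(track_feature_points(predictions, candidates))
# → {0: (9.5, 10.5), 1: (29.0, 6.0)}
```

Because matching is driven by the predicted position rather than a global traversal of the current frame, only a local comparison per point is needed, which is the source of the speed-up described in the abstract.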
2. The method according to claim 1, wherein the position relationship comprises: the current position is nearest to the predicted position.
3. The method according to claim 1 or 2, wherein the position relationship comprises: the distance between the current position and the predicted position is less than a preset threshold.
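For illustration only, the thresholded relationship of claim 3 can be sketched as follows (assuming Python 3.8+ for `math.dist`; names are hypothetical). Rejecting even the nearest candidate when it is too far from the prediction keeps a first feature point that has left the frame from being matched to an unrelated point:

```python
import math

def match_with_threshold(predicted, candidates, threshold):
    """Thresholded position relationship: accept the nearest candidate
    only if its distance to the predicted position is below a preset
    threshold; otherwise report no match for this first feature point."""
    best = min(candidates, key=lambda p: math.dist(p, predicted))
    return best if math.dist(best, predicted) < threshold else None

print(match_with_threshold((10.0, 10.0), [(9.0, 10.0), (40.0, 40.0)], 5.0))  # → (9.0, 10.0)
print(match_with_threshold((10.0, 10.0), [(40.0, 40.0)], 5.0))               # → None
```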
4. The method according to claim 1, wherein obtaining the predicted positions of the first feature points in the current frame image comprises:
obtaining target positions of the first feature points in the target frame images;
determining, based on the target positions, tracking and matching information between the first feature points;
obtaining, using a preset prediction algorithm and based on the tracking and matching information, the predicted positions of the first feature points in the current frame image.
5. The method according to claim 4, wherein the prediction algorithm comprises: a linear prediction algorithm or a Kalman prediction algorithm.
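For illustration only, the linear option of claim 5 amounts to constant-velocity extrapolation from the target positions in two preceding frames (names below are hypothetical); a Kalman predictor, e.g. OpenCV's `cv2.KalmanFilter`, would instead maintain a filtered position/velocity state:

```python
def linear_predict(pos_two_frames_ago, pos_last_frame):
    """Linear prediction from two target frame images: extrapolate the
    last observed displacement one frame forward (constant velocity)."""
    dx = pos_last_frame[0] - pos_two_frames_ago[0]
    dy = pos_last_frame[1] - pos_two_frames_ago[1]
    return (pos_last_frame[0] + dx, pos_last_frame[1] + dy)

# A point that moved from (8, 8) to (9, 9) is predicted at (10, 10)
print(linear_predict((8.0, 8.0), (9.0, 9.0)))  # → (10.0, 10.0)
```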
6. A device for tracking image feature points, comprising:
a current feature point detection unit, configured to detect to-be-matched feature points in a current frame image;
a first feature point detection unit, configured to detect first feature points in at least two target frame images preceding the current frame image;
a position prediction unit, configured to obtain predicted positions of the first feature points in the current frame image;
a feature point matching unit, configured to determine, from the to-be-matched feature points, second feature points that match the first feature points;
wherein the current position of a second feature point in the current frame image and the predicted position of the matched first feature point in the current frame image satisfy a preset position relationship.
7. The device according to claim 6, wherein the position relationship comprises: the current position is nearest to the predicted position.
8. The device according to claim 6 or 7, wherein the position relationship comprises: the distance between the current position and the predicted position is less than a preset threshold.
9. The device according to claim 6, wherein the position prediction unit comprises:
a target position obtaining subunit, configured to obtain target positions of the first feature points in the target frame images;
a feature point tracking subunit, configured to determine, based on the target positions, tracking and matching information between the first feature points;
a feature point prediction subunit, configured to obtain, using a preset prediction algorithm and based on the tracking and matching information, the predicted positions of the first feature points in the current frame image.
10. The device according to claim 9, wherein the prediction algorithm comprises: a linear prediction algorithm or a Kalman prediction algorithm.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201711381388.4A CN108109175A (en) | 2017-12-20 | 2017-12-20 | The tracking and device of a kind of image characteristic point |
Publications (1)
Publication Number | Publication Date |
---|---|
CN108109175A true CN108109175A (en) | 2018-06-01 |
Family
ID=62211306
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201711381388.4A Pending CN108109175A (en) | 2017-12-20 | 2017-12-20 | The tracking and device of a kind of image characteristic point |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN108109175A (en) |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103985138A (en) * | 2014-05-14 | 2014-08-13 | 苏州盛景空间信息技术有限公司 | Long-sequence image SIFT feature point tracking algorithm based on Kalman filter |
CN107330943A (en) * | 2017-06-26 | 2017-11-07 | 乐视致新电子科技(天津)有限公司 | One kind positioning mark matching process, device and electronic equipment |
2017-12-20: CN application CN201711381388.4A filed (published as CN108109175A); status: Pending
Non-Patent Citations (1)
Title |
---|
高宏伟 et al.: "电子封装工艺与装备技术基础教程" (Fundamentals of Electronic Packaging Process and Equipment Technology) * |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109919190A (en) * | 2019-01-29 | 2019-06-21 | 广州视源电子科技股份有限公司 | Algorism of Matching Line Segments method, apparatus, storage medium and terminal |
CN110310333A (en) * | 2019-06-27 | 2019-10-08 | Oppo广东移动通信有限公司 | Localization method and electronic equipment, readable storage medium storing program for executing |
CN110401796A (en) * | 2019-07-05 | 2019-11-01 | 浙江大华技术股份有限公司 | A kind of jitter compensation method and device of image collecting device |
CN110401796B (en) * | 2019-07-05 | 2020-09-29 | 浙江大华技术股份有限公司 | Jitter compensation method and device of image acquisition device |
CN112819889A (en) * | 2020-12-30 | 2021-05-18 | 浙江大华技术股份有限公司 | Method and device for determining position information, storage medium and electronic device |
CN112819889B (en) * | 2020-12-30 | 2024-05-10 | 浙江大华技术股份有限公司 | Method and device for determining position information, storage medium and electronic device |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Strasdat et al. | Double window optimisation for constant time visual SLAM | |
CN107292949B (en) | Three-dimensional reconstruction method and device of scene and terminal equipment | |
Bae et al. | High-precision vision-based mobile augmented reality system for context-aware architectural, engineering, construction and facility management (AEC/FM) applications | |
CN107633526B (en) | Image tracking point acquisition method and device and storage medium | |
CN108109175A (en) | The tracking and device of a kind of image characteristic point | |
CN111428607B (en) | Tracking method and device and computer equipment | |
CN108229456A (en) | Method for tracking target and device, electronic equipment, computer storage media | |
EP2813082A1 (en) | Head pose tracking using a depth camera | |
CN106385640B (en) | Video annotation method and device | |
US10600190B2 (en) | Object detection and tracking method and system for a video | |
CN108280843A (en) | A kind of video object detecting and tracking method and apparatus | |
US20150104067A1 (en) | Method and apparatus for tracking object, and method for selecting tracking feature | |
CN106295598A (en) | A kind of across photographic head method for tracking target and device | |
CN109598744A (en) | A kind of method, apparatus of video tracking, equipment and storage medium | |
CN108122280A (en) | The method for reconstructing and device of a kind of three-dimensional point cloud | |
US20140147000A1 (en) | Image tracking device and image tracking method thereof | |
CN113392794B (en) | Vehicle line crossing identification method and device, electronic equipment and storage medium | |
JP2014235743A (en) | Method and equipment for determining position of hand on the basis of depth image | |
WO2023016182A1 (en) | Pose determination method and apparatus, electronic device, and readable storage medium | |
Pei et al. | Improved Camshift object tracking algorithm in occluded scenes based on AKAZE and Kalman | |
CN110956131B (en) | Single-target tracking method, device and system | |
JP6154759B2 (en) | Camera parameter estimation apparatus, camera parameter estimation method, and camera parameter estimation program | |
EP3676801B1 (en) | Electronic devices, methods, and computer program products for controlling 3d modeling operations based on pose metrics | |
CN110223320A (en) | Object detection tracking and detecting and tracking device | |
US11373382B2 (en) | Augmented reality implementation method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication | ||
Application publication date: 20180601 |