CN114049376A - Pose real-time tracking method, apparatus, computer readable medium and program product - Google Patents
- Publication number: CN114049376A (application number CN202111090846.5A)
- Authority: CN (China)
- Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- G06T7/251 — Analysis of motion using feature-based methods, e.g. the tracking of corners or segments, involving models
- G06T5/90 — Dynamic range modification of images or parts thereof
- G06T7/75 — Determining position or orientation of objects or cameras using feature-based methods involving models
- G06T2207/10024 — Color image
- G06T2207/30244 — Camera pose
Abstract
The application provides a real-time pose tracking method, apparatus, computer-readable medium and program product. The method comprises the following steps: collecting a current frame of an object to be tracked, and extracting the characteristic straight-line segments in the current frame; calculating a projection straight-line segment of a control straight-line segment of the three-dimensional model in the previous frame from the control straight-line segment and the object pose solved for the previous frame, wherein the projection straight-line segment represents the projection position of the control straight-line segment in the previous frame; searching for a matched characteristic straight-line segment in the current frame based on the projection straight-line segment, wherein the matched characteristic straight-line segment represents the projection position of the control straight-line segment in the current frame; and solving and updating the pose parameters of the object to be tracked based on the difference between the projection positions in the current frame and the previous frame, thereby achieving real-time pose tracking of the object to be tracked. The method improves the accuracy of the correspondence between the two-dimensional image and the three-dimensional scene, and thus the accuracy of real-time pose tracking of the target object.
Description
Technical Field
The embodiments of the present application relate to the technical field of target tracking, and in particular to a real-time pose tracking method, device, computer-readable medium and program product.
Background
Recovering the three-dimensional information (the 6-Degree-of-Freedom pose, 6-DOF for short) of a target object from two-dimensional images in order to track the object in real time has long been a research hotspot in the field of computer vision. Currently, commonly used tracking algorithms estimate the relative pose of the current frame with respect to the previous frame by analyzing the edge displacement between consecutive frames, obtain the pose of each frame of the object to be tracked through incremental updating, and thereby achieve real-time tracking.
In weak-texture scenes, for target objects with relatively salient straight-line edge features, the prior art mostly adopts the RAPiD (Real-time Attitude and Position Determination) method, which searches for strong-gradient points in the neighborhood of projected control points.
However, in the prior art, detected noise points or interference points are easily mismatched as corresponding points during the search, so that the correspondence between the two-dimensional image and the three-dimensional scene becomes inaccurate and the pose of the target object cannot be tracked accurately in real time.
Disclosure of Invention
The application provides a real-time pose tracking method, apparatus, computer-readable medium and program product, which are used to solve the prior-art problem that detected noise points or interference points are easily mismatched as corresponding points during the search, making the correspondence between the two-dimensional image and the three-dimensional scene inaccurate and preventing accurate real-time pose tracking of the target object.
In a first aspect, the present application provides a pose real-time tracking method, including:
collecting a current frame of an object to be tracked, and extracting characteristic straight-line segments in the current frame; calculating a projection straight-line segment of a control straight-line segment of the three-dimensional model in the previous frame from the control straight-line segment and the object pose solved for the previous frame, wherein the projection straight-line segment represents the projection position of the control straight-line segment in the previous frame; searching for a matched characteristic straight-line segment in the current frame based on the projection straight-line segment, wherein the matched characteristic straight-line segment represents the projection position of the control straight-line segment in the current frame; and solving and updating the pose parameters of the object to be tracked based on the difference between the projection positions in the current frame and the previous frame, thereby achieving real-time pose tracking of the object to be tracked.
In a second aspect, an embodiment of the present application provides a pose real-time tracking apparatus, including:
the acquisition module is configured to collect a current frame of an object to be tracked and extract characteristic straight-line segments in the current frame;
the first calculation module is configured to calculate a projection straight-line segment of a control straight-line segment of the three-dimensional model in the previous frame from the control straight-line segment and the object pose solved for the previous frame, wherein the projection straight-line segment represents the projection position of the control straight-line segment in the previous frame;
the searching module is configured to search for a matched characteristic straight-line segment in the current frame based on the projection straight-line segment, wherein the matched characteristic straight-line segment represents the projection position of the control straight-line segment in the current frame;
and the second calculation module is configured to solve and update the pose parameters of the object to be tracked based on the difference between the projection positions in the current frame and the previous frame, thereby achieving real-time pose tracking of the object to be tracked.
In a third aspect, an embodiment of the present application provides a pose real-time tracking device, comprising: a memory and a processor;
the memory is configured to store processor-executable instructions;
wherein the processor is configured to execute the instructions stored in the memory to perform any one of the pose real-time tracking methods of the first aspect.
In a fourth aspect, the present application provides a computer-readable storage medium, where computer-executable instructions are stored in the computer-readable storage medium, and when the computer-executable instructions are executed by a processor, the method is used to implement any one of the pose real-time tracking methods in the first aspect.
In a fifth aspect, the present application provides a computer program product, which includes a computer program, and when the computer program is executed by a processor, the computer program implements any one of the pose real-time tracking methods of the first aspect.
Provided are a real-time pose tracking method, apparatus, computer-readable medium and program product. A current frame of an object to be tracked is collected, and the characteristic straight-line segments in the current frame are extracted; a projection straight-line segment of a control straight-line segment of the three-dimensional model in the previous frame is calculated from the control straight-line segment and the object pose solved for the previous frame, the projection straight-line segment representing the projection position of the control straight-line segment in the previous frame; a matched characteristic straight-line segment is searched for in the current frame based on the projection straight-line segment, the matched characteristic straight-line segment representing the projection position of the control straight-line segment in the current frame; and the pose parameters of the object to be tracked are solved and updated based on the difference between the projection positions in the current frame and the previous frame, thereby achieving real-time pose tracking of the object to be tracked. The method improves the accuracy of the correspondence between the two-dimensional image and the three-dimensional scene, and thus the accuracy of real-time pose tracking of the target object.
Drawings
In order to more clearly illustrate the technical solutions of the present application or the prior art, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description show some embodiments of the present application, and those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1 is a flowchart of a pose real-time tracking method provided in an embodiment of the present application;
fig. 2 is a flowchart of another pose real-time tracking method provided in the embodiment of the present application;
FIG. 3 is a diagram illustrating searching for a corresponding point in a neighborhood according to an embodiment of the present disclosure;
fig. 4 is a flowchart of another pose real-time tracking method provided in the embodiment of the present application;
FIG. 5 is a diagram illustrating searching a neighborhood for corresponding straight line segments according to an embodiment of the present disclosure;
fig. 6 is a schematic diagram of a pose real-time tracking apparatus provided in an embodiment of the present application;
fig. 7 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some embodiments of the present application, but not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The terms referred to in this application are explained first:
RAPiD: Real-time Attitude and Position Determination, a model-based real-time tracking algorithm that uses observed image features to determine the pose of a known three-dimensional object relative to the camera, thereby achieving real-time tracking.
Image edge: a discontinuity in the local characteristics of an image, such as an abrupt change in gray level, color or texture structure; edges are an important feature on which image segmentation depends.
Noise points: the random error and variance of a measured variable; here the term refers specifically to image noise.
Graying processing: the process of converting a color image into a grayscale image.
Euclidean distance: also called the Euclidean metric, a commonly used definition of distance; in two and three dimensions, the Euclidean distance is the actual distance between two points.
Least squares: a mathematical optimization technique, used in this application for curve fitting; it obtains a result simply, and the result is comparatively accurate.
Hough transformation: a basic method of recognizing geometry from an image in image processing. The basic principle of Hough transformation is to transform a straight line in an image space into a parameter space, and determine the description parameters of the straight line by detecting extreme points in the parameter space, so as to extract the straight line in the image.
LSD: Line Segment Detector, a straight-line segment detection algorithm that obtains high-precision detection results in a short time. The LSD algorithm first calculates the gradient magnitude and direction of every point in the image; it then groups adjacent points whose gradient directions change little into connected regions; next, according to the rectangularity of each region, it decides by rule whether a region needs to be split, forming a number of regions of higher rectangularity; finally, it refines and screens all the generated regions, retaining those that meet the conditions as the final straight-line detection result. The advantages of the algorithm are fast detection, no need for parameter tuning, and the use of an error-control method to improve the accuracy of line detection.
The application scenario of the embodiments of the present application is any scenario in which a target object needs to be tracked, the target object generally having edge features with multiple straight lines. For example, the method can be applied to the field of intelligent video monitoring, where real-time pose tracking is achieved by using the straight-line features of objects for vehicle motion recognition and surrounding-object detection. As another example, the method can be applied to the field of robot vision: by performing position recognition and state analysis on obstacles and road signs, the movement track of the photographed object is calculated, thereby realizing visual navigation of a mobile robot. It will be appreciated that these are merely two specific usage scenarios and are not intended as limitations of the present application.
Real-time pose tracking of the target object estimates the relative pose of the current frame with respect to the previous frame, i.e. the incremental pose, by analyzing the edge displacement between consecutive frames, and obtains the pose of each frame of the object to be tracked through incremental updating. That is, assuming that the pose of the previous frame of the target object has already been obtained, if the pose change of the current frame relative to the previous frame (the incremental pose) is calculated, the pose of the current frame can be obtained by incremental updating. Furthermore, the pose of the current frame is then used as the pose of the previous frame, the pose of the next frame is calculated in the same way, and the process repeats, finally realizing real-time tracking of the target object.
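The incremental updating described above can be sketched as a simple composition of pose increments (a minimal numpy illustration; the function name and the example increment are hypothetical, not part of this application):

```python
import numpy as np

def update_pose(R, t, dR, dt):
    """Compose the incremental pose [dR | dt] onto the previous-frame pose
    [R | t], giving the current-frame pose [dR.R | t + dt]."""
    return dR @ R, t + dt

# Starting from a known first-frame pose, each new frame only requires the
# small increment to be solved and composed onto the previous estimate.
R, t = np.eye(3), np.zeros(3)                  # previous-frame pose
dR, dt = np.eye(3), np.array([0.0, 0.0, 0.1])  # hypothetical solved increment
R, t = update_pose(R, t, dR, dt)               # pose of the current frame
```

Repeating this composition frame after frame is exactly the incremental-updating scheme: the current-frame pose becomes the "previous" pose for the next iteration.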
As described above, to achieve the real-time pose tracking of the target object, the incremental pose needs to be obtained first.
In the present application, it is assumed that the pose of the first frame has already been obtained; the specific solving process for the first-frame pose is not the research focus of this application and is not explained in detail here. Meanwhile, the three-dimensional model of the object to be tracked is sampled in advance: the set of control straight-line segments of the three-dimensional model is obtained by collecting the head and tail points of the straight-line segments on the model. This process can be understood as a preparation stage before calculating the incremental pose.
Embodiments of the present application will be described below with reference to the accompanying drawings.
Fig. 1 is a flowchart of a pose real-time tracking method provided by an embodiment of the present invention, as shown in fig. 1, including:
s101, collecting a current frame of an object to be tracked, and extracting a characteristic straight-line segment in the current frame.
A current frame of the object to be tracked is collected by an image acquisition device such as a camera, with the object to be tracked appearing in the picture. The current frame is a two-dimensional image, and all characteristic straight-line segments in the current frame are extracted by any existing algorithm capable of extracting straight lines.
S102, calculating a projection straight line segment of the control straight line segment in the previous frame through the control straight line segment of the three-dimensional model and the object pose obtained by calculation of the previous frame, wherein the projection straight line segment represents the projection position of the control straight line segment in the previous frame.
In the preparation stage, the three-dimensional straight-line edges of the three-dimensional model of the object to be tracked are sampled in advance, and a set of control straight-line segments is obtained by collecting the head and tail points of the straight-line segments on the model; meanwhile, the pose of the previous frame is assumed to have been obtained. The projection straight-line segment of the object to be tracked is then calculated by a certain algorithm from the collected set of control straight-line segments and the pose of the object solved for the previous frame, wherein the projection straight-line segment represents the projection position of the control straight-line segment in the previous frame.
And S103, searching matched characteristic straight line segments in the current frame based on the projection straight line segments, wherein the matched characteristic straight line segments represent the projection positions of the control straight line segments in the current frame.
According to the projection straight-line segment obtained by calculation, a neighborhood search is performed around the projection straight-line segment to find the matched characteristic straight-line segment in the current frame. The matched characteristic straight-line segment is referred to as the corresponding straight-line segment in this embodiment, and the corresponding straight-line segment represents the projection position of the control straight-line segment in the current frame.
S104, solving and updating the pose parameters of the object to be tracked based on the difference between the projection positions in the current frame and the previous frame, thereby achieving real-time pose tracking of the object to be tracked.

After corresponding straight-line segments meeting a certain number threshold are obtained, the incremental pose is solved based on the difference between the projection positions in the current frame and the previous frame, and the pose parameters of the object to be tracked are solved and updated according to the incremental pose, thereby achieving real-time pose tracking of the object to be tracked.
According to the real-time pose tracking method provided by the embodiment of the application, a current frame of an object to be tracked is collected and the characteristic straight-line segments in it are extracted; a projection straight-line segment of a control straight-line segment of the three-dimensional model in the previous frame is calculated from the control straight-line segment and the object pose solved for the previous frame, the projection straight-line segment representing the projection position of the control straight-line segment in the previous frame; a matched characteristic straight-line segment is searched for in the current frame based on the projection straight-line segment, the matched characteristic straight-line segment representing the projection position of the control straight-line segment in the current frame; and the pose parameters of the object to be tracked are solved and updated based on the difference between the projection positions in the current frame and the previous frame, thereby achieving real-time pose tracking. The method improves the accuracy of the correspondence between the two-dimensional image and the three-dimensional scene, and thus the accuracy of real-time pose tracking of the target object.
Fig. 2 is a flowchart of another pose real-time tracking method provided by an embodiment of the present application. As shown in fig. 2, on the basis of the embodiment shown in fig. 1, step S101 of collecting the current frame of the object to be tracked and extracting the characteristic straight-line segments in the current frame may specifically be implemented by the following steps:
s1011, collecting a current frame of the object to be tracked, and carrying out gray preprocessing on the current frame.
The current frame is acquired by an image acquisition device, for example a camera, and in practice the acquired current frame is a color image, formed from the R, G and B components of each pixel. Because a color image contains more pixel data and occupies three channels, subsequent image processing would involve a large amount of calculation and a low processing speed; therefore, the acquired color image needs to be preprocessed first. Optionally, the preprocessing may be a graying processing.

Graying preprocessing of the color image unifies the R, G and B values of each pixel into the same value, so that the grayed image changes from three channels to a single channel, and single-channel data processing is simpler. Graying the image prepares it for the operations that follow.
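The graying step can be sketched with a standard luminance-weighted conversion (a minimal illustration; the BT.601-style weights below are a common convention assumed here, not values prescribed by this application):

```python
import numpy as np

def to_gray(rgb):
    """Collapse an H x W x 3 color image to one channel with the common
    luminance weights 0.299 R + 0.587 G + 0.114 B (an assumed convention)."""
    weights = np.array([0.299, 0.587, 0.114])
    return (rgb.astype(np.float64) @ weights).round().astype(np.uint8)

frame = np.full((4, 4, 3), 128, dtype=np.uint8)  # toy stand-in for a frame
gray = to_gray(frame)                            # single-channel result
```

The three channels collapse to one, which is what makes the subsequent straight-line extraction cheaper.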
And S1012, extracting the characteristic straight line segment in the grayed current frame based on a straight line extraction algorithm.
All the straight lines in the current frame are extracted; the extraction can use any existing straight-line extraction algorithm, for example the Hough transformation or the LSD algorithm. All straight-line segments in the grayscale image are obtained through the Hough transformation or the LSD algorithm and denoted {l_image}, and the end-point coordinates of the straight-line segments are obtained.
In S102, the projection straight-line segment of the control straight-line segment in the previous frame is calculated from the control straight-line segments of the three-dimensional model and the object pose solved for the previous frame. Denoting the obtained control straight-line segments of the three-dimensional model as {L}, calculating the projection straight-line segment of a control straight-line segment in the previous frame may specifically include:
and S1021, determining projection coordinates of two end points of each control straight line segment through the control straight line segment of the three-dimensional model and the object pose obtained by resolving in the previous frame.
In this embodiment, the three-dimensional model is a three-dimensional CAD model; it is understood that the three-dimensional model in this application may also be of another type. From the control straight-line segments of the obtained three-dimensional CAD model, the projection coordinates of the two end points of each control straight-line segment L_i are calculated using Formula I.

Formula I:

Z_c [u, v, 1]^T = [[α_x, 0, u_0], [0, α_y, v_0], [0, 0, 1]] [R | t] [X_w, Y_w, Z_w, 1]^T

wherein (X_w, Y_w, Z_w) are the three-dimensional coordinates of the control point E; [R | t] are the pose parameters, including rotation and displacement; (u_0, v_0) are the coordinates of the center point of the image plane in the pixel coordinate system; α_x = f/dx and α_y = f/dy are the ratios of the camera focal length to the physical size of a pixel in the x and y directions; and Z_c is the Z coordinate value of the point in the camera coordinate system.
And S1022, determining a projection straight-line segment of each control straight-line segment in the previous frame according to the projection coordinates of two end points of each control straight-line segment, wherein the projection straight-line segment represents the projection position of the control straight-line segment in the previous frame.
The projection coordinates of the two end points of each control straight-line segment are obtained from Formula I, and the two projected end points are connected to determine the projection straight-line segment, denoted l_i. It can be understood that the corresponding projection straight-line segment can be obtained by determining the two end points of any control straight-line segment.
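Formula I applied to a segment's two end points can be sketched as follows (a minimal numpy illustration; the intrinsic parameters and the example pose are made up for demonstration, not values from this application):

```python
import numpy as np

def project_point(Pw, K, R, t):
    """Project a 3-D control point (X_w, Y_w, Z_w) to pixel coordinates via
    Z_c [u, v, 1]^T = K [R | t] [X_w, Y_w, Z_w, 1]^T (Formula I)."""
    Pc = R @ Pw + t            # camera-frame coordinates; Pc[2] is Z_c
    uv1 = K @ Pc / Pc[2]       # perspective division by Z_c
    return uv1[:2]

def project_segment(E1, E2, K, R, t):
    """The projection straight-line segment l_i connects the projections of
    the control segment's two end points."""
    return project_point(E1, K, R, t), project_point(E2, K, R, t)

# Made-up intrinsics: alpha_x = alpha_y = 800, principal point (320, 240).
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])
R, t = np.eye(3), np.array([0.0, 0.0, 5.0])   # previous-frame pose [R | t]
p1, p2 = project_segment(np.array([0.0, 0.0, 0.0]),
                         np.array([1.0, 0.0, 0.0]), K, R, t)
```

Connecting `p1` and `p2` yields the projection straight-line segment l_i used for the neighborhood search in the current frame.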
Further, in S103, based on the projected straight-line segment, a matching feature straight-line segment is searched in the current frame. The matching characteristic straight line segment in this embodiment is the corresponding straight line segment. It should be noted that the corresponding straight line segment represents the projection position of the control straight line segment in the current frame.
For each projection straight-line segment l_i, i.e. the projection of the control straight-line segment L_i in the image, a neighborhood search is performed over all straight-line segments {l_image} in the current frame to find the corresponding straight-line segment matching the projection straight-line segment.
The corresponding straight-line segment l'_i found by the neighborhood search may satisfy the following conditions:

Specifically:

Condition 1: the angle between l_i and l'_i is within a preset threshold (20 degrees);

Condition 2: the distance dist between l_i and l'_i (the distance from the midpoint of l_i to l'_i) is within a preset threshold (15% of the image width);

Condition 3: when the end points of l'_i are projected in the perpendicular direction onto l_i or its extension line, l'_i and l_i overlap;

Condition 4: the length ratio of l_i to l'_i is within a preset threshold range (0.67 < l_i : l'_i < 1.5);

Condition 5: if multiple straight-line segments in {l_image} satisfy the above conditions, the straight-line segment with the smallest distance in Condition 2 is selected.

Condition 5 may be regarded as a further supplement to Condition 2: if multiple straight-line segments l'_i all satisfy Condition 2, the closest one may be selected from them. In addition, the values in parentheses above are only illustrative of possible angles, distances or lengths; different settings may be made under different application conditions, and the application is not limited thereto.
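Conditions 1 to 5 can be sketched as a predicate over segment pairs (a simplified 2-D illustration using the example thresholds quoted above; Condition 3 is reduced to a 1-D interval-overlap test along l_i's direction, and all names are illustrative):

```python
import numpy as np

def segment_matches(li, lp, img_width, max_angle_deg=20.0,
                    max_dist_frac=0.15, ratio_lo=0.67, ratio_hi=1.5):
    """li: projection segment, lp: candidate feature segment, both 2x2
    arrays of end points [[x1, y1], [x2, y2]].  Returns (matched, dist)."""
    di, dp = li[1] - li[0], lp[1] - lp[0]
    # Condition 1: angle between the two segments within the threshold.
    cosang = abs(di @ dp) / (np.linalg.norm(di) * np.linalg.norm(dp))
    if np.degrees(np.arccos(np.clip(cosang, 0.0, 1.0))) > max_angle_deg:
        return False, None
    # Condition 2: distance from the midpoint of li to the line of lp.
    n = np.array([-dp[1], dp[0]]) / np.linalg.norm(dp)   # unit normal of lp
    dist = abs((li.mean(axis=0) - lp[0]) @ n)
    if dist > max_dist_frac * img_width:
        return False, None
    # Condition 3: perpendicular projections of lp's end points onto li's
    # direction must overlap li's own interval [0, |li|].
    u = di / np.linalg.norm(di)
    s0, s1 = sorted([(lp[0] - li[0]) @ u, (lp[1] - li[0]) @ u])
    if s1 < 0.0 or s0 > np.linalg.norm(di):
        return False, None
    # Condition 4: length ratio li : lp inside (ratio_lo, ratio_hi).
    ratio = np.linalg.norm(di) / np.linalg.norm(dp)
    if not (ratio_lo < ratio < ratio_hi):
        return False, None
    return True, dist

# Condition 5: among all candidates that pass, keep the closest one.
li = np.array([[0.0, 0.0], [10.0, 0.0]])
cands = [np.array([[0.0, 2.0], [10.0, 2.0]]),
         np.array([[0.0, 5.0], [10.0, 5.0]])]
hits = []
for c in cands:
    matched, d = segment_matches(li, c, img_width=100.0)
    if matched:
        hits.append((d, c))
best = min(hits, key=lambda h: h[0])[1]
```

Here both candidates pass Conditions 1-4, so Condition 5 keeps the one whose midpoint distance is smallest.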
In the prior art, when searching for strong-gradient points along the gradient direction, the introduction of noise points or interference points means that the corresponding point found for a control point is not an accurate strong-gradient point, so the matching result is wrong. As shown in FIG. 3, e_ij is a projected point, and e'_ij1, e'_ij2 and e'_ij3 are the searched corresponding points. In this embodiment, the corresponding straight-line segment is searched for in the projection neighborhood of the control straight-line segment; since noise generally appears in the form of points and the probability of its appearing in the form of a line segment is low, this embodiment further reduces noise interference and the mismatching rate.
In addition, in the prior art, a target object with an arbitrary edge shape is identified by searching for corresponding points along the gradient direction. An arbitrary edge shape lacks the explicit analytic expression that a straight line segment has, so searching along the gradient direction is only an approximation: the projection of a control point in the current frame does not necessarily fall on the gradient direction, and the calculated distance between the projected point and the corresponding point is therefore inaccurate.
Further, fig. 4 is a flowchart of another pose real-time tracking method provided by an embodiment of the present invention. To facilitate understanding, the notation used in this embodiment is first explained: [R|t] represents the pose of the target object in the previous frame, which is assumed to have been obtained in this application; [ΔR·R|t+Δt] represents the pose in the current frame; and [ΔR|Δt] represents the pose change of the current frame relative to the previous frame, i.e., the incremental pose.
As shown in FIG. 4, real-time pose tracking of the object to be tracked is performed based on the matched characteristic straight line segments according to step S104.
Specifically, the pose real-time tracking can be realized by the following steps:
S1041: if the number of characteristic straight line segments matched with the projection straight line segments reaches the set number, the incremental pose is obtained by minimizing the Euclidean distance between the projection straight line segments and the matched characteristic straight line segments.
If the number of projection straight line segment / corresponding straight line segment pairs is greater than or equal to a certain value, for example 6 pairs, formula II is solved so that the distances between the new projection positions and the corresponding straight line segments satisfy least squares, thereby minimizing the distance between the projection straight line segments and the corresponding straight line segments and solving the incremental pose. If the number of pairs does not meet the set number, the process ends.
Formula II is as follows:
[ΔR|Δt] = argmin over (ΔR, Δt) of Σ_i dist²( Proj(L_i, [ΔR·R|t+Δt]), l'_i )
where dist is the distance from the midpoint of Proj(L_i, [ΔR·R|t+Δt]) to l'_i, as shown in fig. 5. The formula can be solved by a nonlinear optimization method, or by a system of linear equations after a linear approximation of ΔR.
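A minimal sketch of this least-squares solve is given below. It assumes a pinhole camera with known intrinsic matrix K and parameterizes ΔR by a rotation vector (an assumption; the patent does not fix a parameterization), using SciPy's nonlinear least-squares solver. Names such as `solve_incremental_pose` are illustrative only.

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

def project_midpoint(K, R, t, L):
    """Project the 3D segment L (2x3 endpoints) under pose [R|t] and
    return the midpoint of the projected 2D segment."""
    p = (K @ (R @ L.T + t.reshape(3, 1))).T   # homogeneous image points, (2, 3)
    p = p[:, :2] / p[:, 2:3]                  # dehomogenize
    return p.mean(axis=0)

def line_params(seg):
    """Normalized line coefficients (a, b, c) with a^2 + b^2 = 1, so that
    a*x + b*y + c is the signed point-to-line distance."""
    (x1, y1), (x2, y2) = seg
    a, b = y1 - y2, x2 - x1
    n = np.hypot(a, b)
    return a / n, b / n, (x1 * y2 - x2 * y1) / n

def solve_incremental_pose(K, R, t, segments_3d, matched_2d):
    """Solve [dR|dt] minimizing the midpoint-to-matched-line distances
    (formula II); x[:3] is the rotation vector of dR, x[3:] is dt."""
    def residuals(x):
        dR = Rotation.from_rotvec(x[:3]).as_matrix()
        Rn, tn = dR @ R, t + x[3:]
        res = []
        for L, l2d in zip(segments_3d, matched_2d):
            a, b, c = line_params(l2d)
            mx, my = project_midpoint(K, Rn, tn, L)
            res.append(a * mx + b * my + c)   # signed distance to matched line
        return res
    x = least_squares(residuals, np.zeros(6)).x
    return Rotation.from_rotvec(x[:3]).as_matrix(), x[3:]
```

Each segment pair contributes one residual, which is why at least 6 pairs are needed to constrain the 6 degrees of freedom of [ΔR|Δt].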
S1042: the pose parameters of the object to be tracked are solved and updated, realizing real-time pose tracking of the object to be tracked.
After the incremental pose [ΔR|Δt] is obtained, it is substituted into [ΔR·R|t+Δt], and the current-frame pose of the object to be tracked is obtained by incremental updating.
After the pose of the current frame is obtained, it is passed on to the next frame: the current-frame pose serves as the previous-frame pose described in this application and is used to solve a new current-frame pose (i.e., the pose of the next frame). This loop realizes real-time tracking of the pose of the target object.
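The incremental update and frame-to-frame propagation just described can be summarized in a short schematic loop. This is an illustration only: `solve_increment` stands in for the matching and formula-II steps described above and is passed in as a callable.

```python
import numpy as np

def update_pose(R, t, dR, dt):
    """Apply the incremental pose [dR|dt]: the current-frame pose is [dR.R | t + dt]."""
    return dR @ R, t + dt

def track(frames, initial_pose, solve_increment):
    """Schematic tracking loop: each solved pose becomes the 'previous frame'
    pose for the next frame, so tracking proceeds incrementally."""
    R, t = initial_pose
    poses = []
    for frame in frames:
        dR, dt = solve_increment(frame, R, t)  # matching + formula II
        R, t = update_pose(R, t, dR, dt)       # incremental update
        poses.append((R, t))
    return poses
```

The key point the loop makes explicit is that only the increment is estimated per frame; the absolute pose is accumulated, which is what makes the tracking real-time.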
Fig. 6 is a schematic diagram of an embodiment of a pose real-time tracking apparatus provided in the present application, where the apparatus includes:
the acquisition module 61 is used for acquiring a current frame of the object to be tracked and extracting a characteristic straight-line segment in the current frame;
the first calculation module 62 is configured to calculate a projection straight-line segment of the control straight-line segment in the previous frame through the control straight-line segment of the three-dimensional model and the object pose calculated in the previous frame, where the projection straight-line segment represents a projection position of the control straight-line segment in the previous frame;
a searching module 63, configured to search, based on the projection straight-line segment, for a matched feature straight-line segment in the current frame, where the matched feature straight-line segment represents a projection position of the control straight-line segment in the current frame;
and the second calculation module 64 is used for calculating and updating the pose parameters of the object to be tracked based on the difference of the projection positions in the current frame and the previous frame, so as to realize the real-time pose tracking of the object to be tracked.
Fig. 7 is a schematic structural diagram of a pose real-time tracking device provided by the present application. As shown in fig. 7, the electronic device may include: at least one processor 71 and a memory 72. Fig. 7 shows an electronic device with one processor as an example.
The memory 72 is used for storing a program. Specifically, the program may include program code comprising computer operating instructions.
The memory 72 may comprise high-speed RAM memory and may also include non-volatile memory (non-volatile memory), such as at least one disk memory.
The processor 71 is used for executing computer execution instructions stored in the memory 72 to realize a pose real-time tracking method;
the processor 71 may be a Central Processing Unit (CPU), an Application Specific Integrated Circuit (ASIC), or one or more Integrated circuits configured to implement the embodiments of the present Application, and the processor 71 executes instructions stored in the memory 72 to implement the pose real-time tracking.
Alternatively, in a specific implementation, if the communication interface, the memory 72 and the processor 71 are implemented independently, they may be connected to each other through a bus and communicate with each other. The bus may be an Industry Standard Architecture (ISA) bus, a Peripheral Component Interconnect (PCI) bus, an Extended ISA (EISA) bus, or the like. Buses may be classified as address buses, data buses, control buses, etc.; although drawn as a single line, this does not represent only one bus or one type of bus.
Alternatively, in a specific implementation, if the communication interface, the memory 72 and the processor 71 are integrated into a chip, the communication interface, the memory 72 and the processor 71 may complete communication through an internal interface.
The present application also provides a computer-readable storage medium that can store program information, where the program information is used for a pose real-time tracking method.
Embodiments of the present application also provide a program, which when executed by a processor, is configured to perform the pose real-time tracking method provided by the above method embodiments.
Embodiments of the present application further provide a program product, such as a computer-readable storage medium, having instructions stored therein, which when executed on a computer, cause the computer to perform the pose real-time tracking method provided by the above embodiments.
In the above embodiments, the implementation may be wholly or partially realized by software, hardware, firmware, or any combination thereof. When implemented in software, may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. The procedures or functions according to the embodiments of the invention are brought about in whole or in part when the computer program instructions are loaded and executed on a computer. The computer may be a general purpose computer, a special purpose computer, a network of computers, or other programmable device. The computer instructions may be stored in a computer readable storage medium or transmitted from one computer readable storage medium to another, for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center by wire (e.g., coaxial cable, fiber optic, Digital Subscriber Line (DSL)) or wirelessly (e.g., infrared, wireless, microwave, etc.). The computer-readable storage medium can be any available medium that can be accessed by a computer or a data storage device, such as a server, a data center, etc., that incorporates one or more of the available media. The usable medium may be a magnetic medium (e.g., floppy Disk, hard Disk, magnetic tape), an optical medium (e.g., DVD), or a semiconductor medium (e.g., Solid State Disk (SSD)), among others.
Finally, it should be noted that: the above embodiments are only used to illustrate the technical solution of the present invention, and not to limit the same; while the invention has been described in detail and with reference to the foregoing embodiments, it will be understood by those skilled in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced; and the modifications or the substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present invention.
Claims (10)
1. A pose real-time tracking method is characterized by comprising the following steps:
collecting a current frame of an object to be tracked, and extracting a characteristic straight-line segment in the current frame;
calculating a projection straight line segment of the control straight line segment in the previous frame through a control straight line segment of the three-dimensional model and an object pose obtained by calculation of the previous frame, wherein the projection straight line segment represents a projection position of the control straight line segment in the previous frame;
searching matched characteristic straight-line segments in the current frame based on the projection straight-line segments, wherein the matched characteristic straight-line segments represent projection positions of control straight-line segments in the current frame;
and resolving and updating the pose parameters of the object to be tracked based on the difference of the projection positions in the current frame and the previous frame, so as to realize the real-time pose tracking of the object to be tracked.
2. The method as claimed in claim 1, wherein said acquiring a current frame of an object to be tracked, extracting a feature straight-line segment in the current frame comprises:
collecting a current frame of an object to be tracked, and carrying out gray level preprocessing on the current frame;
and extracting the characteristic straight line segment in the grayed current frame based on a straight line extraction algorithm.
3. The method according to claim 1, wherein the calculating of the projection straight-line segment of the control straight-line segment in the previous frame through the control straight-line segment of the three-dimensional model and the object pose calculated in the previous frame comprises:
determining projection coordinates of two end points of each control straight line segment through the control straight line segment of the three-dimensional model and an object pose obtained by resolving in the previous frame;
and determining a projection straight-line segment of each control straight-line segment in the previous frame according to the projection coordinates of two end points of the control straight-line segment, wherein the projection straight-line segment represents the projection position of the control straight-line segment in the previous frame.
4. The method of claim 3, wherein said searching for a matching feature straight-line segment in said current frame based on said projected straight-line segment comprises:
and on the basis of the projection straight line segments, performing neighborhood search on each projection straight line segment in the feature straight line segment in the current frame, and determining the matched feature straight line segment, wherein the matched feature straight line segment represents the projection position of the control straight line segment in the current frame.
5. The method of claim 4, wherein the neighborhood search is performed based on the projected straight-line segment, and wherein the projected straight-line segment and the matched feature straight-line segment satisfy the following condition:
the parallelism of the projection straight-line segment and the characteristic straight-line segment is within a preset parallelism threshold range;
the distance between the projection straight line segment and the characteristic straight line segment is within a threshold range of the preset image width;
when two end points of the characteristic straight-line segment are projected to the matched projection straight-line segment or the extension line of the projection straight-line segment along the vertical direction, the characteristic straight-line segment is overlapped with the projection straight-line segment;
and the length ratio of the projection straight-line segment to the characteristic straight-line segment is within a preset threshold range.
6. The method according to any one of claims 1 to 5, wherein the calculating and updating of the pose parameters of the object to be tracked based on the difference between the projection positions in the current frame and the previous frame to realize the real-time pose tracking of the object to be tracked comprises:
if the number of the characteristic straight line segments matched with the projection straight line segments meets the set number, obtaining an increment pose by minimizing the Euclidean distance between the projection straight line segments and the matched characteristic straight line segments;
and resolving and updating the pose parameters of the object to be tracked according to the incremental pose, so as to realize the real-time pose tracking of the object to be tracked.
7. A pose real-time tracking apparatus, comprising:
the system comprises an acquisition module, a tracking module and a tracking module, wherein the acquisition module is used for acquiring a current frame of an object to be tracked and extracting a characteristic straight-line segment in the current frame;
the first calculation module is used for calculating a projection straight line segment of the control straight line segment in the previous frame through a control straight line segment of the three-dimensional model and an object pose obtained by calculation of the previous frame, wherein the projection straight line segment represents the projection position of the control straight line segment in the previous frame;
the searching module is used for searching matched characteristic straight-line segments in the current frame based on the projection straight-line segments, and the matched characteristic straight-line segments represent projection positions of the control straight-line segments in the current frame;
and the second calculation module is used for calculating and updating the pose parameters of the object to be tracked based on the difference of the projection positions in the current frame and the previous frame so as to realize the real-time pose tracking of the object to be tracked.
8. A pose real-time tracking device, comprising: a memory, a processor;
the memory is to store executable instructions;
the processor is configured to: execute the instructions stored in the memory to perform the pose real-time tracking method of any one of claims 1-6.
9. A computer-readable storage medium, wherein the computer-readable storage medium stores computer-executable instructions, which when executed by a processor, are used for implementing the pose real-time tracking method according to any one of claims 1 to 6.
10. A computer program product comprising a computer program that, when executed by a processor, implements the pose real-time tracking method of any one of claims 1-6.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111090846.5A CN114049376A (en) | 2021-09-17 | 2021-09-17 | Pose real-time tracking method, apparatus, computer readable medium and program product |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111090846.5A CN114049376A (en) | 2021-09-17 | 2021-09-17 | Pose real-time tracking method, apparatus, computer readable medium and program product |
Publications (1)
Publication Number | Publication Date |
---|---|
CN114049376A true CN114049376A (en) | 2022-02-15 |
Family
ID=80204413
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202111090846.5A Pending CN114049376A (en) | 2021-09-17 | 2021-09-17 | Pose real-time tracking method, apparatus, computer readable medium and program product |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114049376A (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114882065A (en) * | 2022-07-12 | 2022-08-09 | 深圳市瑞图生物技术有限公司 | Method and device for judging fluidity of detection object, analyzer and storage medium |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20100328682A1 (en) * | 2009-06-24 | 2010-12-30 | Canon Kabushiki Kaisha | Three-dimensional measurement apparatus, measurement method therefor, and computer-readable storage medium |
CN109544629A (en) * | 2018-11-29 | 2019-03-29 | 南京人工智能高等研究院有限公司 | Camera pose determines method and apparatus and electronic equipment |
CN110111388A (en) * | 2019-05-10 | 2019-08-09 | 北京航空航天大学 | Three-dimension object pose parameter estimation method and visual apparatus |
CN110631554A (en) * | 2018-06-22 | 2019-12-31 | 北京京东尚科信息技术有限公司 | Robot posture determining method and device, robot and readable storage medium |
- 2021-09-17: CN202111090846.5A patent/CN114049376A/en, status: active, Pending
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20100328682A1 (en) * | 2009-06-24 | 2010-12-30 | Canon Kabushiki Kaisha | Three-dimensional measurement apparatus, measurement method therefor, and computer-readable storage medium |
CN110631554A (en) * | 2018-06-22 | 2019-12-31 | 北京京东尚科信息技术有限公司 | Robot posture determining method and device, robot and readable storage medium |
CN109544629A (en) * | 2018-11-29 | 2019-03-29 | 南京人工智能高等研究院有限公司 | Camera pose determines method and apparatus and electronic equipment |
CN110111388A (en) * | 2019-05-10 | 2019-08-09 | 北京航空航天大学 | Three-dimension object pose parameter estimation method and visual apparatus |
Non-Patent Citations (1)
Title |
---|
YIJUN ZHOU等: "SVO-PL: Stereo Visual Odometry with Fusion of Points and Line Segments", 《PROCEEDINGS OF 2018 IEEE INTERNATIONAL CONFERENCE ON MECHATRONICS AND AUTOMATION》, 8 August 2018 (2018-08-08), pages 900 - 905, XP033415885, DOI: 10.1109/ICMA.2018.8484479 * |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114882065A (en) * | 2022-07-12 | 2022-08-09 | 深圳市瑞图生物技术有限公司 | Method and device for judging fluidity of detection object, analyzer and storage medium |
CN114882065B (en) * | 2022-07-12 | 2023-03-14 | 深圳市瑞图生物技术有限公司 | Method and device for judging fluidity of detection object, analyzer and storage medium |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110322500B (en) | Optimization method and device for instant positioning and map construction, medium and electronic equipment | |
CN108256394B (en) | Target tracking method based on contour gradient | |
CN107481292B (en) | Attitude error estimation method and device for vehicle-mounted camera | |
JP6095018B2 (en) | Detection and tracking of moving objects | |
US9177404B2 (en) | Systems and methods of merging multiple maps for computer vision based tracking | |
CN109242884B (en) | Remote sensing video target tracking method based on JCFNet network | |
CN107424171B (en) | Block-based anti-occlusion target tracking method | |
WO2016034059A1 (en) | Target object tracking method based on color-structure features | |
KR20180056685A (en) | System and method for non-obstacle area detection | |
US11281897B2 (en) | Gesture shaking recognition method and apparatus, and gesture recognition method | |
CN108229475B (en) | Vehicle tracking method, system, computer device and readable storage medium | |
CN109658454B (en) | Pose information determination method, related device and storage medium | |
JP2017526082A (en) | Non-transitory computer-readable medium encoded with computer program code for causing a motion estimation method, a moving body, and a processor to execute the motion estimation method | |
CN107895375B (en) | Complex road route extraction method based on visual multi-features | |
CN110930411B (en) | Human body segmentation method and system based on depth camera | |
CN110084830B (en) | Video moving object detection and tracking method | |
CN112927303B (en) | Lane line-based automatic driving vehicle-mounted camera pose estimation method and system | |
CN111444778A (en) | Lane line detection method | |
CN110598771A (en) | Visual target identification method and device based on deep semantic segmentation network | |
CN109255801B (en) | Method, device and equipment for tracking edges of three-dimensional object in video and storage medium | |
CN111507340B (en) | Target point cloud data extraction method based on three-dimensional point cloud data | |
CN116740126A (en) | Target tracking method, high-speed camera, and storage medium | |
EP2990995A2 (en) | Line parametric object estimation | |
CN114049376A (en) | Pose real-time tracking method, apparatus, computer readable medium and program product | |
CN111145634A (en) | Method and device for correcting map |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||