CN107730534A - Method and device for tracking a target object - Google Patents

Method and device for tracking a target object

Info

Publication number
CN107730534A
CN107730534A (application CN201610653855.3A)
Authority
CN
China
Prior art keywords
contour edge
point
target object
pixel
region image
Legal status
Granted
Application number
CN201610653855.3A
Other languages
Chinese (zh)
Other versions
CN107730534B (en)
Inventor
Inventor not disclosed
Current Assignee
GUIZHOU UINSHINE INFORMATION TECHNOLOGY Co.,Ltd.
Original Assignee
Shenzhen Guangqi Hezhong Technology Co Ltd
Application filed by Shenzhen Guangqi Hezhong Technology Co Ltd
Priority to CN201610653855.3A
Priority to PCT/CN2017/092030 (published as WO2018028363A1)
Publication of CN107730534A
Application granted
Publication of CN107730534B
Status: Active


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/20 Analysis of motion
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10016 Video; Image sequence

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a method and a device for tracking a target object. The method includes: acquiring a region image of the area where the target object is located; determining a contour edge line of the target object in the region image, where the contour edge line records the shape features of the target object; and tracking the target object based on the contour edge line. The invention solves the technical problem of low tracking accuracy for moving objects in the prior art.

Description

Method and device for tracking a target object
Technical field
The present invention relates to the field of target tracking, and in particular to a method and a device for tracking a target object.
Background art
At present, when video is captured with a camera, the captured video may contain a moving object (for example, a moving hand). To track the gesture of a moving hand, the prior-art scheme proceeds as follows. First, a region image of the area where the hand is located is acquired (the input image, as shown in Fig. 1). Second, the movement range of the gesture in the region image is estimated by a weighted-value estimation method, and the hand-like colors appearing within that range are selected, yielding the image shown in Fig. 2. Finally, the absolute value of the gradient of the selected hand-like colors is computed, giving the image shown in Fig. 3, i.e. the contour image of the recognized gesture, and the gesture is tracked based on this contour. However, this scheme only recognizes the contour of the gesture and performs no further processing on it; consequently, when the gesture undergoes a small movement (or change), that minute movement cannot be detected from the gesture contour alone, and accurate tracking of the gesture cannot be achieved.
For the problem of low tracking accuracy for moving objects in the prior art, no effective solution has yet been proposed.
Summary of the invention
Embodiments of the invention provide a method and a device for tracking a target object, so as to at least solve the technical problem of low tracking accuracy for moving objects in the prior art.
According to one aspect of the embodiments of the invention, a method for tracking a target object is provided. The method includes: acquiring a region image of the area where the target object is located; determining a contour edge line of the target object in the region image, where the contour edge line records the shape features of the target object; and tracking the target object based on the contour edge line.
According to another aspect of the embodiments of the invention, a device for tracking a target object is also provided. The device includes: an acquisition unit, configured to acquire a region image of the area where the target object is located; a determination unit, configured to determine a contour edge line of the target object in the region image, where the contour edge line records the shape features of the target object; and a tracking unit, configured to track the target object based on the contour edge line.
In the embodiments of the invention, the target object is tracked according to its determined contour edge line. Because the contour edge line records features of the target object (for example, its size range), any movement (or change) of the target object, even a very small one, is reflected in a change of the contour edge line. The position change of the moving target object can therefore be represented by the change of the contour edge line, so that the target can be tracked accurately through the change of the contour edge line, solving the prior-art problem of low tracking accuracy for moving objects.
Brief description of the drawings
The accompanying drawings described here are provided for a further understanding of the invention and form part of this application; the schematic embodiments of the invention and their description serve to explain the invention and do not constitute an undue limitation of it. In the drawings:
Fig. 1 is a schematic diagram of tracking a target object according to an optional scheme of the prior art;
Fig. 2 is a schematic diagram of tracking a target object according to an optional scheme of the prior art;
Fig. 3 is a schematic diagram of tracking a target object according to an optional scheme of the prior art;
Fig. 4 is a flowchart of a method for tracking a target object according to an embodiment of the invention;
Fig. 5 is a schematic diagram of optional tracking of a target object according to an embodiment of the invention;
Fig. 6 is a schematic diagram of optional tracking of a target object according to an embodiment of the invention;
Fig. 7 is a schematic diagram of optional tracking of a target object according to an embodiment of the invention;
Fig. 8 is a schematic diagram of optional tracking of a target object according to an embodiment of the invention;
Fig. 9 is a schematic diagram of optional tracking of a target object according to an embodiment of the invention;
Fig. 10 is a schematic diagram of optional tracking of a target object according to an embodiment of the invention;
Fig. 11 is a schematic diagram of optional tracking of a target object according to an embodiment of the invention;
Fig. 12 is a schematic diagram of a device for tracking a target object according to an embodiment of the invention;
Fig. 13 is a schematic diagram of an optional device for tracking a target object according to an embodiment of the invention;
Fig. 14 is a schematic diagram of an optional device for tracking a target object according to an embodiment of the invention;
Fig. 15 is a schematic diagram of an optional device for tracking a target object according to an embodiment of the invention;
Fig. 16 is a schematic diagram of an optional device for tracking a target object according to an embodiment of the invention;
Fig. 17 is a schematic diagram of an optional device for tracking a target object according to an embodiment of the invention;
Fig. 18 is a schematic diagram of an optional device for tracking a target object according to an embodiment of the invention.
Detailed description of the embodiments
To enable those skilled in the art to better understand the solution of the invention, the technical solutions in the embodiments of the invention are described below clearly and completely with reference to the accompanying drawings. Obviously, the described embodiments are only some of the embodiments of the invention, not all of them. All other embodiments obtained by those of ordinary skill in the art on the basis of the embodiments of the invention without creative effort shall fall within the scope of protection of the invention.
It should be noted that the terms "first", "second", and the like in the description, claims, and accompanying drawings are used to distinguish between similar objects and are not necessarily used to describe a particular order or sequence. It should be understood that data so used are interchangeable where appropriate, so that the embodiments of the invention described here can be implemented in orders other than those illustrated or described. Moreover, the terms "comprising" and "having", and any variations thereof, are intended to cover a non-exclusive inclusion; for example, a process, method, system, product, or device comprising a series of steps or units is not necessarily limited to the steps or units expressly listed, but may include other steps or units not expressly listed or inherent to such a process, method, product, or device.
Embodiment 1
According to an embodiment of the invention, an embodiment of a method for tracking a target object is provided. It should be noted that the steps illustrated in the flowchart of the accompanying drawings may be executed in a computer system such as a set of computer-executable instructions, and that, although a logical order is shown in the flowchart, in some cases the steps shown or described may be executed in an order different from the one shown here.
Fig. 4 is a flowchart of a method for tracking a target object according to an embodiment of the invention. As shown in Fig. 4, the tracking method comprises the following steps:
Step S402: acquiring a region image of the area where the target object is located.
Step S404: determining a contour edge line of the target object in the region image, where the contour edge line records the shape features of the target object.
Step S406: tracking the target object based on the contour edge line.
With this embodiment of the invention, the target object is tracked according to its determined contour edge line. Because the contour edge line records features of the target object (for example, its size range), any movement (or change) of the target object, even a very small one, shows up as a change of the contour edge line; that is, the position change of the moving target object can be represented by the change of the contour edge line. The target can therefore be tracked accurately through the change of the contour edge line, which solves the prior-art problem of low tracking accuracy for moving objects.
Specifically, video information containing the target object is captured by a camera; the video information contains multiple region images. Each region image containing a moved position (or changed shape) of the target object is obtained, and the contour edge line of the target object in each region image is determined. Because the contour edge line keeps changing, it can represent the position change of the target object within the region images. Therefore, after the contour edge line of the target object in each region image has been determined, the movement of the target object can be derived from the changes of the contour edge lines, achieving accurate tracking of the target object.
The above embodiment of the invention is detailed below taking the gesture of a moving hand as the target object. First, video information is captured by a camera; the video contains the moving hand. Second, the region image of the area where the moving hand is located is obtained from the video, as shown in Fig. 1 (Fig. 1 only shows, by way of example, one of the multiple region images contained in the video). After the region image shown in Fig. 1 is obtained, the pixels other than those of the hand are filtered out of the region image, producing an image containing only the gesture contour, in which the contour edge line of the gesture is determined. Finally, after the contour edge line of the gesture in each region image has been determined, the motion track of the gesture can be obtained from the changes of these contour edge lines, achieving accurate tracking of the gesture.
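By way of illustration, the minimal Python sketch below mirrors steps S402 to S406 for a sequence of frames. All function names are illustrative, not the patent's; extract_object_pixels stands in for the filtering step (whose exact method the patent does not specify), and center_point, candidate_edge_points and trace_contour are sketched in the sections below.

```python
def extract_object_pixels(frame):
    """Placeholder for the filtering step: frame is assumed to be a 2-D
    mask in which nonzero entries are the pixels forming the object."""
    return [(x, y) for y, row in enumerate(frame)
            for x, v in enumerate(row) if v]

def track_object(frames):
    """Per-frame tracking skeleton following steps S402-S406: returns one
    contour edge line (an ordered list of points) per region image; the
    change between successive contours is what tracks the object."""
    contours = []
    for frame in frames:                       # S402: region image acquired
        pixels = extract_object_pixels(frame)  # keep only shape pixels
        center = center_point(pixels)          # mean of pixel coordinates
        candidates = candidate_edge_points(pixels, center)
        contour = trace_contour(candidates, center)  # S404: contour edge line
        contours.append(contour)               # S406: track via its change
    return contours
```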
In the above embodiment of the invention, determining the contour edge line of the target object in the region image includes: extracting an object image of the target object from the region image, where the object image contains the pixels forming the shape of the target object; determining contour edge points of the target object based on those pixels; and connecting the contour edge points to generate the contour edge line.
Specifically, each region image containing a moved position (or changed shape) of the target object is obtained, and the contour edge line of the target object in each region image is determined. When determining that contour edge line, the points other than the pixels forming the shape of the target object are first filtered out of the region image, yielding an image containing only the pixels that form the shape of the target object (the object image of the target object mentioned above). Contour edge points of the target object are then determined among the pixels forming its shape. Finally, the determined contour edge points are connected; the resulting connecting line enclosing the contour of the target object is the contour edge line. Because the target object moves (or changes), the resulting contour edge lines necessarily change, and their change can represent the position change of the target object within the region images. Therefore, after the contour edge line of the target object in each region image has been determined, the movement of the target object can be derived from the changes of the contour edge lines, achieving accurate tracking of the target object.
Again taking the gesture of a moving hand as the target object: video is first captured by a camera and contains the moving hand; the region image of the area where the hand is located is then obtained from the video, as shown in Fig. 1. After the region image is obtained, the points other than the pixels forming the gesture shape are filtered out of it, giving an image containing only the gesture contour, and the contour edge points of the gesture are determined among the pixels forming the gesture shape. Finally, the determined contour edge points are connected, giving the connecting line enclosing the gesture contour, i.e. the contour edge line. After the contour edge line of the gesture in each region image has been determined, the motion track of the gesture can be obtained from the changes of these contour edge lines, achieving accurate tracking of the gesture.
Through the above embodiment, the change of the target object during its movement can be determined from the change of its contour edge line. Because the contour edge line can represent even small changes, a small change of the target object during movement is also reflected by the change of the contour edge line, improving the tracking accuracy for the target object.
In the above embodiment of the invention, determining the contour edge points of the target object based on the pixels includes: determining a center point among the pixels, where the center point is located at the center of the target object in the object image; determining candidate edge points based on the distances between the pixels and the center point; and selecting the contour edge points from the candidate edge points.
Specifically, after the region image of the area where the target object is located has been acquired, the points other than the pixels forming the shape of the target object are first filtered out of the region image, giving the image containing only the pixels forming the shape of the target object (the object image of the target object mentioned above). Second, after the object image has been obtained, the center point located at the center of the target object is determined among the pixels forming its shape, the distance between each pixel and the center point is computed, candidate edge points are determined from the computed distances, and the contour edge points of the target object are selected from the candidate edge points. Finally, the determined contour edge points are connected; the resulting connecting line enclosing the contour of the target object is the contour edge line.
Again taking the gesture of a moving hand as the target object: after the region image shown in Fig. 1 is obtained, the points other than the pixels forming the gesture shape are filtered out of it, giving the image containing only the gesture contour. From the pixels forming the gesture shape, the center point located at the center of the gesture is determined, the distance between each pixel and the center point is computed, candidate edge points are determined from the computed distances, and the contour edge points of the gesture are selected from the candidate edge points. Finally, the determined contour edge points are connected, giving the connecting line enclosing the gesture contour, i.e. the contour edge line. After the contour edge line of the gesture in each region image has been determined, the motion track of the gesture can be obtained from the changes of these contour edge lines, achieving accurate tracking of the gesture.
Through the above embodiment, the candidate edge points can be determined from the distances between the pixels forming the target object and the center point, and the contour edge points determined from them. Since only the distance parameter is needed to determine the contour edge points, the scheme is simple to implement and fast to process.
In the above embodiment of the invention, determining the candidate edge points based on the distances between the pixels and the center point includes: dividing the object image into N regions centered on the center point, and, based on the first distance between each pixel in each region and the center point, determining the pixel corresponding to the maximum first distance as a candidate edge point.
Specifically, with the center point as the center, the object image is evenly divided by a predetermined angle into N angular regions, where N is an integer greater than 1; the first distance between each pixel in each angular region and the center point is computed; and the pixel corresponding to the maximum first distance in each angular region is determined as a candidate edge point. Preferably, the predetermined angle here is 2 degrees.
Further, determining the candidate edge points based on the distances between the pixels and the center point also includes: retaining the pixel corresponding to the maximum distance in each angular region.
That is, while the pixel corresponding to the maximum distance is determined as a candidate edge point, the pixel corresponding to the maximum distance in each angular region can be retained.
Specifically, after the object image of the target object is obtained, the center point located at the center of the target object is determined among the pixels forming its shape. With the center point as the center, the object image of the target object is then divided by a predetermined angle of θ degrees into N angular regions. Each angular region may contain several of the pixels forming the target object, or may contain no pixel at all. If an angular region contains several of the pixels forming the target object, the distance between each of those pixels and the center point (the first distance mentioned above) is computed, and the pixel corresponding to the maximum computed distance is determined as a candidate edge point; if an angular region contains none of the pixels forming the target object, no computation is performed for it. Finally, the contour edge points of the target object are selected from the candidate edge points and connected, giving the connecting line enclosing the contour of the target object, i.e. the contour edge line.
Still taking the gesture of a moving hand as the target object: after the image containing only the gesture contour is obtained, the center point located at the center of the gesture is determined among the pixels forming the gesture shape, and the image is divided, with the center point as the center, by a predetermined angle of θ degrees into N angular regions. In each angular region that contains pixels forming the gesture shape, the distance between each such pixel and the center point (the first distance) is computed and the pixel at the maximum distance is determined as a candidate edge point; regions containing no such pixel require no computation. Finally, the contour edge points of the gesture are selected from the candidate edge points and connected, giving the connecting line enclosing the gesture contour, i.e. the contour edge line. After the contour edge line of the gesture in each region image has been determined, the motion track of the gesture can be obtained from the changes of these contour edge lines, achieving accurate tracking of the gesture.
Further, while the pixel corresponding to the computed maximum distance is determined as a candidate edge point, that pixel is retained and the pixels other than the one at the maximum distance are deleted.
Here, N = 360° / predetermined angle.
Through the above embodiment, the object image of the target object is divided into N regions centered on the center point, and the candidate edge points can be determined from the distances between the pixels forming the target object contained in each region and the center point, so that the contour edge points are determined. Since only the distance parameter is needed, the scheme is simple to implement and fast to process.
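As a concrete sketch of this sector-based selection, the following assumes pixels are plain (x, y) tuples and the center point is given; the function and parameter names are illustrative:

```python
import math

def candidate_edge_points(pixels, center, sector_deg=2.0):
    """In each angular region around the center point, keep only the
    pixel farthest from the center as a candidate edge point.

    pixels: (x, y) coordinates of the pixels forming the object's shape
    center: (cx, cy) center point of the object image
    sector_deg: the predetermined angle in degrees (2 degrees is the
                preferred value above, giving N = 360 / 2 = 180 regions)
    """
    cx, cy = center
    n = int(360.0 / sector_deg)            # N = 360 deg / predetermined angle
    best = {}                              # region index -> (distance, pixel)
    for x, y in pixels:
        theta = math.degrees(math.atan2(y - cy, x - cx)) % 360.0
        region = int(theta / sector_deg) % n
        dist = math.hypot(x - cx, y - cy)  # the "first distance"
        if region not in best or dist > best[region][0]:
            best[region] = (dist, (x, y))
    # angular regions containing no shape pixel are simply skipped
    return [p for _, p in best.values()]
```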
In the above embodiment of the invention, selecting the contour edge points from the candidate edge points includes: obtaining a first contour edge point from the candidate edge points; and, taking the first contour edge point as the current point, performing the following operations on the current point to determine the next contour edge point in turn, until the determined next contour edge point is the first contour edge point. The operations include: obtaining a first vector pointing from the center point to the current point; obtaining the second vectors pointing from the current point to each candidate edge point within a predetermined angle range, where the predetermined angle range is obtained by rotating clockwise by a predetermined angle with the first vector as the rotation axis; computing the angle between the first vector and each second vector, and determining the candidate edge point corresponding to the minimum angle as the next contour edge point; and taking the determined next contour edge point as the current point for the next round of operations.
Optionally, still taking the gesture of a moving hand as the target object: after the image containing only the gesture contour is obtained, the center point located at the center of the gesture is determined among the pixels forming the gesture shape, the distance between each pixel and the center point is computed, and the candidate edge points are determined from the computed distances. After the candidate edge points are determined, the point farthest from the center point is first selected from them as the starting point of the contour edge (the first contour edge point mentioned above). The first vector pointing from the center point to this starting point is obtained; with this vector as the rotation axis, the candidate edge points within the 180-degree range swept clockwise (the predetermined angle range mentioned above) are obtained, together with the multiple second vectors pointing from the starting point to those candidate edge points. The angle between the first vector and each second vector is computed, and the candidate edge point whose angle is the minimum is determined as a contour edge point. The determined contour edge point then serves as the next current point and the same operations are performed on it, until the determined contour edge point is the starting point of the contour edge, so that multiple contour edge points are determined. Finally, the determined contour edge points are connected, giving the connecting line enclosing the gesture contour, i.e. the contour edge line.
In the above embodiment, the angles between two vectors are computed and the candidate edge point corresponding to the minimum angle is determined as a contour edge point. In this scheme the contour edge points are determined from the angle parameter alone; since only simple data computation is involved, the implementation is simple and the processing fast.
In the above embodiment of the invention, obtaining the first contour edge point from the candidate edge points includes: computing the second distance between each candidate edge point and the center point; and determining the candidate edge point corresponding to the maximum second distance as the first contour edge point.
Specifically, after the multiple candidate edge points have been obtained, the distance between each candidate edge point and the center point (the second distance mentioned above) is computed from the coordinate information of the candidate edge points and the coordinate information of the center point, and the candidate edge point corresponding to the maximum of the obtained distances is determined as the starting point of the contour edge.
In the above embodiment, the candidate edge point at the maximum distance from the center point is taken as the starting point of the contour edge; once the starting point is chosen, the contour edge points can be determined more quickly.
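The sketch below combines the two steps above: it selects the candidate farthest from the center as the first contour edge point, then repeatedly picks, among the candidates within the clockwise 180° sweep of the first vector, the one making the smallest angle with it. The sign convention for "clockwise" assumes image coordinates with the y axis pointing down, and terminating on an empty sweep rather than on an explicit return to the starting point is an implementation convenience; all names are illustrative.

```python
import math

def signed_angle(v1, v2):
    """Angle in degrees from v1 to v2; positive values are clockwise in
    image coordinates (where the y axis points down)."""
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    cross = v1[0] * v2[1] - v1[1] * v2[0]
    return math.degrees(math.atan2(cross, dot))

def trace_contour(candidates, center, sweep_deg=180.0):
    """Order the candidate edge points into a contour edge line."""
    cx, cy = center
    # first contour edge point: maximum second distance from the center
    start = max(candidates, key=lambda p: math.hypot(p[0] - cx, p[1] - cy))
    contour, current = [start], start
    remaining = set(candidates) - {start}
    while remaining:
        v1 = (current[0] - cx, current[1] - cy)          # first vector
        best, best_angle = None, None
        for p in remaining:
            v2 = (p[0] - current[0], p[1] - current[1])  # second vector
            a = signed_angle(v1, v2)
            if 0.0 <= a <= sweep_deg and (best is None or a < best_angle):
                best, best_angle = p, a
        if best is None:   # nothing left in the sweep range: the walk
            break          # has closed back on the starting point
        contour.append(best)
        remaining.discard(best)
        current = best
    return contour  # connecting consecutive points (and the last point
                    # back to the first) gives the contour edge line
```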
In the above embodiment of the invention, determining the center point among the pixels includes: computing the average of the coordinate information of the pixels and taking the pixel corresponding to the average as the center point.
Further, determining the center point among the pixels includes: obtaining the abscissa and ordinate of each pixel; taking the average of the abscissas of all pixels as the abscissa of the center point, and the average of the ordinates of all pixels as the ordinate of the center point.
Still taking the gesture of a moving hand as the target object: after the image containing only the gesture contour is obtained, the abscissa and ordinate of each pixel forming the gesture shape can be obtained. By averaging, the average of the abscissas and the average of the ordinates of all pixels are obtained; the average abscissa is taken as the abscissa of the center point and the average ordinate as its ordinate, and the center point among the pixels is thus determined from its coordinates.
Through the above embodiment, the center point in the object image of the target object can be obtained; on that basis, the contour edge points of the target object can be obtained more quickly, so that the contour edge line is determined more quickly and fast tracking of the target object is achieved.
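A minimal sketch of this coordinate-averaging step, under the same (x, y)-tuple representation assumed above:

```python
def center_point(pixels):
    """Center point of the object image: the average abscissa and the
    average ordinate of all pixels forming the object's shape."""
    xs = [x for x, _ in pixels]
    ys = [y for _, y in pixels]
    return (sum(xs) / len(xs), sum(ys) / len(ys))
```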
In the above embodiment of the invention, tracking the target object based on the contour edge line includes: taking the first position of the contour edge line in the region image as the second position of the target object in the region image.
Specifically, the region image of the area where the target object is located is obtained; the points other than the pixels forming the shape of the target object are filtered out of the region image, giving the image containing only the pixels forming the shape of the target object (the object image of the target object mentioned above). The contour edge points of the target object are determined among the pixels forming its shape and connected to give the contour edge line of the target object. After the contour edge line has been obtained, its position (the first position mentioned above) is obtained from the region image; the position of the obtained contour edge line can represent the position of the target object in the region image (the second position mentioned above).
In the above embodiment, through the position of the obtained contour edge line in the region image, the position of the target object in the region image can be represented more clearly, so that the target object is tracked better and the tracking accuracy for the target object is improved.
In the above embodiment of the invention, when the region image consists of multiple frames of region images, tracking the target object based on the contour edge line includes: obtaining the first position of the contour edge line in each region image; taking the first position of the contour edge line as the second position of the target object in the corresponding region image; and generating the location track of the multiple second positions.
Specifically, each region image containing a moved position (or changed shape) of the target object is obtained, the contour edge line of the target object in each region image is determined, and the position of the corresponding contour edge line (the first position mentioned above) is obtained from each region image; that position can represent the position of the target object in the region image (the second position mentioned above). By accumulating the multiple region images, the location track of the target object is obtained from its positions in the corresponding region images. Because the target object in each region image has undergone a corresponding movement (or change), the corresponding contour edge lines keep changing, and their change can represent the position change of the target object in the region images; therefore, after the contour edge line of the target object in each region image has been determined, the movement of the target object can be obtained from the changes of the contour edge lines, achieving accurate tracking of the target object.
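A sketch of this accumulation over frames follows. The bounding box used here as the "position" of a contour edge line is one possible concrete choice, since the patent leaves the exact representation of the first and second positions open.

```python
def contour_position(contour):
    """First position of a contour edge line in its region image,
    summarized here as the contour's bounding box (one possible choice)."""
    xs = [x for x, _ in contour]
    ys = [y for _, y in contour]
    return (min(xs), min(ys), max(xs), max(ys))

def location_track(contours_per_frame):
    """Take each frame's contour position as the target object's second
    position; the sequence of these positions is the location track."""
    return [contour_position(c) for c in contours_per_frame]
```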
The above embodiment of the invention is detailed below with reference to Figs. 5 to 11, still taking the gesture of a moving hand as the target object.
Optionally, video information is captured by a camera and contains the moving hand. The region image of the area where the moving hand is located is obtained from the video information, and the pixels other than those forming the gesture shape are filtered out of the region image, giving the image containing only the gesture contour (as shown in Fig. 5), which contains the multiple pixels forming the gesture shape. By averaging, the average of the coordinate information of these pixels is computed, and the pixel corresponding to the average is taken as the center point of the target object in the image (as shown in Fig. 6). After the center point is obtained, the image containing the gesture contour is divided, with the center point as the center, by a predetermined angle of θ degrees into N angular regions (as shown in Fig. 7), and the candidate edge points are determined from the distances between the pixels contained in each angular region and the center point (as shown in Fig. 8); the details are the same as in the embodiment of determining candidate edge points described above and are not repeated here. After the candidate edge points are determined, the distances between the candidate edge points and the center point are computed, and the candidate edge point at the maximum distance from the center point is taken as the starting point of the contour edge (the first contour edge point mentioned above, as shown in Fig. 9). Then the first vector pointing from the center point to the first contour edge point is obtained, and the candidate edge points within the 180° range swept clockwise with the first vector as the rotation axis are obtained, giving the multiple second vectors pointing from the first contour edge point to the candidate edge points (as shown in Fig. 10, which shows three second vectors by way of example). The angles between the first vector and the multiple second vectors are computed, the candidate edge point corresponding to the minimum angle is determined as a contour edge point, and the determined contour edge point serves as the next current point. The above operations are repeated, multiple contour edge points are obtained from the candidate edge points, and the obtained contour edge points are connected to give the contour edge line (as shown in Fig. 11).
Through the above embodiment, after the object image (for example, the contour image) of the target object (such as the gesture of a moving hand) is obtained, the contour edge line enclosing the target object in the object image can be determined. Because the target object moves (or changes), the corresponding contour edge lines necessarily change; moreover, even a very small movement (or change) shows up in the change of the contour edge line. Tracking the change of the contour edge line therefore tracks the movement (or change) of the target object more accurately, solving the problem of low tracking precision for moving objects in the prior art.
Embodiment 2
Fig. 12 is a schematic diagram of a device for tracking a target object according to an embodiment of the invention. As shown in Fig. 12, the tracking device includes: an acquisition unit 10, a determination unit 30 and a tracking unit 50.
The acquisition unit 10 is configured to acquire a region image of the area where the target object is located.
The determination unit 30 is configured to determine the contour edge line of the target object in the region image, where the contour edge line records the shape features of the target object.
The tracking unit 50 is configured to track the target object based on the contour edge line.
With this embodiment, the device tracks the target object according to its determined contour edge line, with the same effects as the method: even a very small movement (or change) of the target object shows up in the change of the contour edge line, so the prior-art problem of low tracking accuracy for moving objects is solved. The units and modules of the device implement the corresponding steps of the method of Embodiment 1; the detailed workings and the gesture example described there apply equally here and are not repeated below.
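As an illustration of how the three units might be wired together, here is a minimal sketch reusing the helper functions from Embodiment 1 (all names are illustrative, not the patent's):

```python
class TargetObjectTrackingDevice:
    """Sketch of the device of Fig. 12: acquisition unit 10,
    determination unit 30 and tracking unit 50."""

    def __init__(self, capture):
        self.capture = capture  # acquisition unit 10: yields region images

    def determine_contour(self, frame):
        # determination unit 30: region image -> contour edge line
        pixels = extract_object_pixels(frame)  # assumed filtering step
        center = center_point(pixels)
        return trace_contour(candidate_edge_points(pixels, center), center)

    def track(self):
        # tracking unit 50: contour edge lines -> location track
        return location_track(self.determine_contour(f) for f in self.capture)
```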
As shown in Fig. 13, the determination unit 30 includes: an extraction module 301, configured to extract the object image of the target object from the region image, where the object image contains the pixels forming the shape of the target object; a first determination module 303, configured to determine the contour edge points of the target object based on the pixels; and a connection module 305, configured to connect the contour edge points to generate the contour edge line.
As shown in Fig. 14, the first determination module 303 includes: a first determination submodule 3031, configured to determine the center point among the pixels, where the center point is located at the center of the target object in the object image; a second determination submodule 3033, configured to determine the candidate edge points based on the distances between the pixels and the center point; and a selection submodule 3035, configured to select the contour edge points from the candidate edge points.
As shown in Fig. 15, the second determination submodule 3033 includes: a division submodule 30331, configured to evenly divide the object image, with the center point as the center, by a predetermined angle into N angular regions, where N is an integer greater than 1; a first processing submodule 30333, configured to compute the first distance between each pixel in each angular region and the center point; and a first determining submodule 30335, configured to determine the pixel corresponding to the maximum first distance in each angular region as a candidate edge point.
As shown in Fig. 16, the second determination submodule 3033 further includes: a retention submodule 30337, configured to retain the pixel corresponding to the maximum distance in each angular region.
Optionally, while the pixel corresponding to the maximum distance value is determined as a candidate edge point, the pixel corresponding to the maximum distance in each angular region is retained.
As shown in Fig. 17, the selection submodule 3035 includes: a first obtaining submodule 30351, configured to obtain the first contour edge point from the candidate edge points; a second processing submodule 30353, configured to take the first contour edge point as the current point and perform the following operations on the current point, determining the next contour edge point in turn, until the determined next contour edge point is the first contour edge point; a second obtaining submodule 30355, configured to obtain the first vector pointing from the center point to the current point; a third obtaining submodule 30357, configured to obtain the second vectors pointing from the current point to each candidate edge point within the predetermined angle range, where the predetermined angle range is obtained by rotating clockwise by the predetermined angle with the first vector as the rotation axis; a third processing submodule 30359, configured to compute the angle between the first vector and each second vector and determine the candidate edge point corresponding to the minimum angle as the next contour edge point; and a second determining submodule 30352, configured to take the determined next contour edge point as the current point for the next round of operations.
As shown in Fig. 18, the first obtaining submodule 30351 includes: a fourth processing submodule 303511, configured to compute the second distance between each candidate edge point and the center point; and a third determining submodule 303513, configured to determine the candidate edge point corresponding to the maximum second distance as the first contour edge point.
In the above embodiment of the invention, the first determination submodule includes: a fourth obtaining submodule, configured to obtain the abscissa and ordinate of each pixel; and a fifth processing submodule, configured to take the average of the abscissas of all pixels as the abscissa of the center point and the average of the ordinates of all pixels as the ordinate of the center point.
Still so that destination object is the gesture of hand on the move as an example, the above embodiment of the present invention is described in detail.Only wrapped After the image of profile containing gesture, the abscissa and ordinate of each pixel to form gesture shape can be obtained, is passed through The method averaged, the average value of abscissa and the average value of ordinate of all pixels point are obtained, by being averaged for abscissa Abscissa of the value as central point, the ordinate using the average value of ordinate as central point, and it is true by the coordinate of central point Make the central point in pixel.
Through the above embodiment, the center point in the object image of the target object can be obtained; on that basis, the contour edge points of the target object can be obtained more quickly, so that the contour edge line is determined more quickly and fast tracking of the target object is realized.
In the above embodiment of the present invention, the above tracking unit includes: a second determining module, configured to take the first position of the contour edge line in the area image as the second position of the target object in the area image.

Specifically, the area image of the region where the target object is located is obtained, and the points other than the pixel points forming the shape of the target object are filtered out of the area image, yielding the image containing only the pixel points forming the shape of the target object (i.e. the above object image of the target object). The contour edge points of the target object are determined from the pixel points forming the shape of the target object, and the determined contour edge points are connected to obtain the contour edge line of the target object. After the contour edge line of the target object is obtained, the position of the contour edge line (i.e. the above first position) is obtained from the area image; the obtained position of the contour edge line can represent the position of the target object in the area image (i.e. the above second position).

In the above embodiment, the position of the obtained contour edge line in the area image represents the position of the target object in the area image more clearly, so that the target object can be tracked better and the tracking accuracy for the target object is improved.
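The patent leaves open how the first position of the contour edge line is represented; one plausible convention, shown purely as an assumption of this sketch, is the axis-aligned bounding box of the contour edge points:

    def contour_position(contour_points):
        # One possible 'first position' of the contour edge line in the area
        # image: the bounding box of its points (an assumption of this sketch).
        pts = np.asarray(contour_points, float)
        x_min, y_min = pts.min(axis=0)
        x_max, y_max = pts.max(axis=0)
        return (x_min, y_min, x_max, y_max)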
In the above embodiment of the present invention, in the case where the area image is a multi-frame area image, the above tracking unit includes: an obtaining module, configured to obtain the first position of the contour edge line in each area image; a third determining module, configured to take the first position of the contour edge line as the second position of the target object in the corresponding area image; and a generating module, configured to generate the location track of the multiple second positions.

Specifically, every area image containing the moving position (or shape change) of the target object is obtained, the contour edge line of the target object in each area image is determined, and the position of the corresponding contour edge line (i.e. the above first position) is obtained from each area image; the obtained position of the contour edge line can represent the position of the target object in that area image (i.e. the above second position). By accumulating the multiple area images, the location track of the target object is obtained from its positions in the corresponding area images. Because the target object in every area image has a corresponding movement (or change), the corresponding contour edge line changes, and the change of the contour edge line can represent the change of the position of the target object in the area images; therefore, after the contour edge line of the target object in every area image is determined, the movement of the target object can be obtained, so that accurate tracking of the target object is realized based on the change of the contour edge line.
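Accumulating the per-frame positions into a location track is then a simple loop. In the sketch below, extract_contour stands in for the contour-extraction steps above and contour_position for whatever position representation is chosen; both names are assumptions:

    def location_track(frames, extract_contour, contour_position):
        # For each area image, determine the contour edge line and record its
        # position as the target object's second position in that frame.
        track = []
        for frame in frames:
            contour = extract_contour(frame)
            track.append(contour_position(contour))
        return track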
Through the above embodiment, after the object image of the target object (such as the gesture of a hand in motion) is obtained, for example its contour image, the contour edge line enclosing the target object in the object image can be determined. Because the target object moves (or changes), the corresponding multiple contour edge lines are bound to change; moreover, even a very small movement (or change) shows up as a change of the contour edge line. Therefore, by tracking the change of the contour edge line, the movement (or change) of the target object can be tracked more accurately, which solves the problem of low tracking accuracy for an object in motion in the prior art.
The serial numbers of the above embodiments of the present invention are for description only and do not represent the relative merits of the embodiments.

In the above embodiments of the present invention, the description of each embodiment has its own emphasis; for a part not described in detail in one embodiment, reference may be made to the related descriptions of the other embodiments.
In the several embodiments provided in the present application, it should be understood that the disclosed technical content may be realized in other ways. The device embodiments described above are only schematic; for example, the division into units may be a division by logical function, and other divisions are possible in actual implementation: multiple units or components may be combined or integrated into another system, or some features may be ignored or not executed. In addition, the mutual couplings, direct couplings or communication connections shown or discussed may be indirect couplings or communication connections through some interfaces, units or modules, and may be electrical or of other forms.

The units described as separate components may or may not be physically separate, and the components shown as units may or may not be physical units; they may be located in one place or distributed over multiple units. Some or all of the units may be selected according to actual needs to achieve the purpose of the scheme of this embodiment.

In addition, the functional units in the embodiments of the present invention may be integrated into one processing unit, each unit may exist physically alone, or two or more units may be integrated into one unit. The above integrated unit may be realized in the form of hardware or in the form of a software functional unit.

If the integrated unit is realized in the form of a software functional unit and sold or used as an independent product, it may be stored in a computer-readable storage medium. Based on such an understanding, the technical scheme of the present invention, in essence or in the part contributing to the prior art, or all or part of the technical scheme, may be embodied in the form of a software product; the computer software product is stored in a storage medium and includes a number of instructions for causing a computer device (which may be a personal computer, a server, a network device or the like) to perform all or part of the steps of the methods described in the embodiments of the present invention. The aforementioned storage medium includes various media that can store program code, such as a USB flash disk, a read-only memory (ROM, Read-Only Memory), a random access memory (RAM, Random Access Memory), a mobile hard disk, a magnetic disk or an optical disk.
The above is only the preferred embodiment of the present invention. It should be noted that, for those of ordinary skill in the art, some improvements and modifications can also be made without departing from the principles of the present invention, and these improvements and modifications should also be regarded as within the protection scope of the present invention.

Claims (18)

  1. A method for tracking a target object, characterized in that the method comprises:
    collecting an area image of the region where the target object is located;
    determining a contour edge line of the target object in the area image, wherein the contour edge line records the shape features of the target object;
    tracking the target object based on the contour edge line.
  2. The method according to claim 1, characterized in that determining the contour edge line of the target object in the area image comprises:
    extracting an object image of the target object from the area image, wherein the object image contains the pixel points forming the shape of the target object;
    determining contour edge points of the target object based on the pixel points;
    connecting the contour edge points to generate the contour edge line.
  3. The method according to claim 2, characterized in that determining the contour edge points of the target object based on the pixel points comprises:
    determining a center point among the pixel points, wherein the center point is located at the center of the target object in the object image;
    determining candidate edge points based on the distances between the pixel points and the center point;
    selecting the contour edge points from the candidate edge points.
  4. The method according to claim 3, characterized in that determining the candidate edge points based on the distances between the pixel points and the center point comprises:
    dividing the object image equally into N angular regions by a predetermined angle with the center point as the center, wherein N is an integer greater than 1;
    calculating a first distance between each pixel point in each angular region and the center point;
    determining the pixel point corresponding to the maximum of the first distances in each angular region as a candidate edge point.
  5. The method according to claim 4, characterized in that determining the candidate edge points based on the distances between the pixel points and the center point further comprises:
    retaining the pixel point corresponding to the maximum distance in each angular region.
  6. The method according to claim 3, characterized in that selecting the contour edge points from the candidate edge points comprises:
    obtaining a first contour edge point from the candidate edge points;
    taking the first contour edge point as a current point and performing the following operation on the current point, determining the next contour edge points in turn until the next contour edge point determined is the first contour edge point, the operation comprising:
    obtaining a first vector pointing from the center point to the current point;
    obtaining the second vectors pointing from the current point to each candidate edge point within a predetermined angular range, wherein the predetermined angular range is obtained by rotating clockwise by a predetermined angle with the first vector as the rotation axis;
    calculating the angle between the first vector and each second vector, and determining the candidate edge point corresponding to the minimum angle as the next contour edge point;
    taking the determined next contour edge point as the current point for the next execution of the operation.
  7. The method according to claim 6, characterized in that obtaining the first contour edge point from the candidate edge points comprises:
    calculating a second distance between each candidate edge point and the center point;
    determining the candidate edge point corresponding to the maximum of the second distances as the first contour edge point.
  8. The method according to claim 6, characterized in that the predetermined angular range is an angular range of 180 degrees.
  9. The method according to claim 3, characterized in that determining the center point among the pixel points comprises:
    obtaining the abscissa and ordinate of each pixel point;
    taking the average value of the abscissas of all the pixel points as the abscissa of the center point, and taking the average value of the ordinates of all the pixel points as the ordinate of the center point.
  10. The method according to claim 1, characterized in that tracking the target object based on the contour edge line comprises:
    taking a first position of the contour edge line in the area image as a second position of the target object in the area image.
  11. The method according to claim 1, characterized in that, in the case where the area image is a multi-frame area image, tracking the target object based on the contour edge line comprises:
    obtaining the first position of the contour edge line in each area image;
    taking the first position of the contour edge line as the second position of the target object in the corresponding area image;
    generating a location track of the multiple second positions.
  12. A device for tracking a target object, characterized in that the device comprises:
    a collecting unit, configured to collect an area image of the region where the target object is located;
    a determining unit, configured to determine a contour edge line of the target object in the area image, wherein the contour edge line records the shape features of the target object;
    a tracking unit, configured to track the target object based on the contour edge line.
  13. The device according to claim 12, characterized in that the determining unit comprises:
    an extraction module, configured to extract an object image of the target object from the area image, wherein the object image contains the pixel points forming the shape of the target object;
    a first determining module, configured to determine contour edge points of the target object based on the pixel points;
    a connecting module, configured to connect the contour edge points to generate the contour edge line.
  14. The device according to claim 13, characterized in that the first determining module comprises:
    a first determining sub-module, configured to determine a center point among the pixel points, wherein the center point is located at the center of the target object in the object image;
    a second determining sub-module, configured to determine candidate edge points based on the distances between the pixel points and the center point;
    a selecting sub-module, configured to select the contour edge points from the candidate edge points.
  15. The device according to claim 14, characterized in that the second determining sub-module comprises:
    a dividing sub-module, configured to divide the object image equally into N angular regions by a predetermined angle with the center point as the center, wherein N is an integer greater than 1;
    a first processing sub-module, configured to calculate a first distance between each pixel point in each angular region and the center point;
    a first determining sub-module, configured to determine the pixel point corresponding to the maximum of the first distances in each angular region as a candidate edge point.
  16. The device according to claim 15, characterized in that the second determining sub-module further comprises:
    a retaining sub-module, configured to retain the pixel point corresponding to the maximum distance in each angular region.
  17. The device according to claim 14, characterized in that the selecting sub-module comprises:
    a first obtaining sub-module, configured to obtain a first contour edge point from the candidate edge points;
    a second processing sub-module, configured to take the first contour edge point as a current point and perform the following operation on the current point, determining the next contour edge points in turn until the next contour edge point determined is the first contour edge point, the operation comprising:
    a second obtaining sub-module, configured to obtain a first vector pointing from the center point to the current point;
    a third obtaining sub-module, configured to obtain the second vectors pointing from the current point to each candidate edge point within a predetermined angular range, wherein the predetermined angular range is obtained by rotating clockwise by a predetermined angle with the first vector as the rotation axis;
    a third processing sub-module, configured to calculate the angle between the first vector and each second vector and determine the candidate edge point corresponding to the minimum angle as the next contour edge point;
    a second determining sub-module, configured to take the determined next contour edge point as the current point for the next execution of the operation.
  18. The device according to claim 17, characterized in that the first obtaining sub-module comprises:
    a fourth processing sub-module, configured to calculate a second distance between each candidate edge point and the center point;
    a third determining sub-module, configured to determine the candidate edge point corresponding to the maximum of the second distances as the first contour edge point.
CN201610653855.3A 2016-08-09 2016-08-09 Target object tracking method and device Active CN107730534B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201610653855.3A CN107730534B (en) 2016-08-09 2016-08-09 Target object tracking method and device
PCT/CN2017/092030 WO2018028363A1 (en) 2016-08-09 2017-07-06 Target object tracking method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201610653855.3A CN107730534B (en) 2016-08-09 2016-08-09 Target object tracking method and device

Publications (2)

Publication Number Publication Date
CN107730534A true CN107730534A (en) 2018-02-23
CN107730534B CN107730534B (en) 2020-10-23

Family

ID=61162626

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610653855.3A Active CN107730534B (en) 2016-08-09 2016-08-09 Target object tracking method and device

Country Status (2)

Country Link
CN (1) CN107730534B (en)
WO (1) WO2018028363A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108614263A (en) * 2018-04-20 2018-10-02 Oppo广东移动通信有限公司 Mobile terminal, method for detecting position and Related product
CN112188105A (en) * 2020-09-30 2021-01-05 苏州臻迪智能科技有限公司 Tracking shooting method and device, intelligent device and computer readable storage medium

Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1766928A (en) * 2004-10-29 2006-05-03 中国科学院计算技术研究所 A kind of motion object center of gravity track extraction method based on the dynamic background sport video
CN1770204A (en) * 2004-10-29 2006-05-10 中国科学院计算技术研究所 Method for extracting barycenter trajectory of motive object from motive video with static background
CN101236657A (en) * 2008-03-03 2008-08-06 吉林大学 Single movement target track tracking and recording method
CN101419712A (en) * 2008-12-02 2009-04-29 深圳市蓝韵实业有限公司 Method for determining external periphery outline of mammary gland
CN102074018A (en) * 2010-12-22 2011-05-25 Tcl集团股份有限公司 Depth information-based contour tracing method
CN102324032A (en) * 2011-09-08 2012-01-18 北京林业大学 Texture feature extraction method for gray level co-occurrence matrix in polar coordinate system
CN102930268A (en) * 2012-08-31 2013-02-13 西北工业大学 Accurate positioning method for data matrix code under pollution and multi-view situation
CN103577800A (en) * 2012-07-23 2014-02-12 中国航天员科研训练中心 Method for measuring human hand morphological parameters based on color images
CN103996212A (en) * 2013-02-18 2014-08-20 威达电股份有限公司 Method for automatically describing edge orientation of object
CN104217192A (en) * 2013-06-03 2014-12-17 株式会社理光 Hand positioning method and equipment based on range image
JP2015070359A (en) * 2013-09-27 2015-04-13 株式会社京三製作所 Person counting device
CN105320917A (en) * 2014-06-27 2016-02-10 南京理工大学 Pedestrian detection and tracking method based on head-shoulder contour and BP neural network

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4784709B1 (en) * 2011-03-10 2011-10-05 オムロン株式会社 Object tracking device, object tracking method, and control program
RU2014113049A (en) * 2014-04-03 2015-10-10 ЭлЭсАй Корпорейшн IMAGE PROCESSOR CONTAINING A GESTURE RECOGNITION SYSTEM WITH OBJECT TRACKING ON THE BASIS OF COMPUTING SIGNS OF CIRCUITS FOR TWO OR MORE OBJECTS


Also Published As

Publication number Publication date
CN107730534B (en) 2020-10-23
WO2018028363A1 (en) 2018-02-15


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20210721

Address after: 550000 room 212, building B, Shuobo entrepreneurship Park, Jinyang science and Technology Industrial Park, Guiyang National High tech Industrial Development Zone, Guiyang City, Guizhou Province

Patentee after: GUIZHOU UINSHINE INFORMATION TECHNOLOGY Co.,Ltd.

Address before: 518000 Guangdong, Shenzhen, Nanshan District, Nanhai Road, West Guangxi Temple Road North Sunshine Huayi Building 1 15D-02F

Patentee before: SHEN ZHEN KUANG-CHI HEZHONG TECHNOLOGY Ltd.
