CN106502380A - Method and apparatus for judging target tracking effectiveness - Google Patents
Method and apparatus for judging target tracking effectiveness
- Publication number
- CN106502380A (application CN201610830288.4A)
- Authority
- CN
- China
- Prior art keywords
- tracking target
- effect
- corner point
- corner sequence
- threshold
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/20—Movements or behaviour, e.g. gesture recognition
- G06V40/28—Recognition of hand or arm movements, e.g. recognition of deaf sign language
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Health & Medical Sciences (AREA)
- Computer Vision & Pattern Recognition (AREA)
- General Health & Medical Sciences (AREA)
- Psychiatry (AREA)
- Social Psychology (AREA)
- Multimedia (AREA)
- Image Analysis (AREA)
Abstract
Embodiments of the present invention provide a method and apparatus for judging target tracking effectiveness. The method includes: obtaining a sample target image, and applying corner detection to the sample target image to obtain a sample corner sequence; obtaining a tracked target image, and applying corner detection to the tracked target image to obtain a corner sequence of the tracked target image; and judging the tracking effect according to the corner sequence of the tracked target image and the corner sequence of the sample target image. The embodiments are based on corner detection: the corners are serialized, the corner sequences of the sample target image and the tracked target image are matched to obtain a similarity metric, and the metric can further be combined with a color-space histogram. This enables similarity measurement for deformable targets and improves the accuracy of judging the tracking effect.
Description
Technical field
Embodiments of the present invention relate to the technical field of human-computer interaction, and in particular to a method and an apparatus for judging target tracking effectiveness.
Background art
Gesture recognition is an important technology in human-computer interaction. For example, a vision-based robot can use hand-shape recognition to capture gesture actions and process them accordingly.
Gesture recognition can be roughly divided into three grades, from simple and coarse to complex and fine: two-dimensional hand-shape recognition, two-dimensional gesture recognition, and three-dimensional gesture recognition.
Two dimensions form a planar space: the position of an object can be represented by coordinate information consisting of an X coordinate and a Y coordinate, like the position of a painting on a wall. Three dimensions add "depth" information (a Z coordinate), which two dimensions do not include.
Two-dimensional hand-shape recognition, also called static two-dimensional gesture recognition, is the simplest class of gesture recognition. After receiving two-dimensional input, this technology can recognize several static hand shapes, such as a clenched fist or five spread fingers. For example, a user can control a media player with a few hand shapes: raising a palm in front of the camera starts playback, and raising the palm again pauses it.
Two-dimensional gesture recognition is slightly harder than two-dimensional hand-shape recognition, but it still involves essentially no depth information and stays on the two-dimensional plane. This technology can recognize not only hand shapes but also some simple two-dimensional gesture motions, such as waving at the camera. Two-dimensional gesture recognition is dynamic: it can track the motion of a gesture and thereby recognize compound actions that combine a hand shape with hand movement. The scope of gesture recognition is thus truly extended to the two-dimensional plane; for example, gestures can not only play/pause a computer, but also perform more complex operations such as forward/backward/page up/scroll down, which require two-dimensional coordinate change information.
Although two-dimensional gesture recognition has the same hardware requirements as two-dimensional hand-shape recognition, it benefits from more advanced computer vision algorithms and yields richer human-computer interaction. The experience is also a step up: control moves from purely static states to richer planar control. For example, two-dimensional gesture recognition has been integrated into televisions.
Three-dimensional gesture recognition works on the three-dimensional level. Its most basic difference from two-dimensional gesture recognition is that the input must include depth information, which makes three-dimensional gesture recognition more complex than two-dimensional gesture recognition in both hardware and software. For simple operations, such as pausing or resuming a video during playback, two-dimensional gestures are sufficient; but for complex human-computer interaction, such as playing games or virtual reality (VR) applications, three-dimensional gestures are necessary.
Natural gesture tracking is necessary for true augmented reality/virtual reality (AR/VR): it must recognize compound actions accurately and quickly, be provided to developers as a software development kit (SDK), and consume only a small amount of resources.
Briefly, gesture recognition uses various sensors to continuously collect the shape, displacement, and other attributes of a hand or handheld tool, builds a model at set intervals to form a sequence of model-information frames, and then converts this information sequence into corresponding instructions used to control certain operations.
The current general approach to hand-shape recognition is to train directly on sample images to obtain a classifier (such as a multi-class SVM classifier or an AdaBoost classifier). However, interference from similar hand shapes leads to low recognition rates and very high false-detection rates for some hand shapes.
It is accepted in the industry that position tracking is an important means of achieving immersion and presence in VR (virtual reality). Many technical schemes for position tracking exist in the prior art, each with its strengths and weaknesses. Whether tracking the head, an arm, a finger, or another object (such as a weapon), position tracking brings the following benefits:
The user's viewpoint can be changed according to the user's actions (for example, jumping, squatting, or leaning forward); the user's hands or other objects can be shown in the virtual world, connecting reality and the virtual world. For example, if the position of a hand can be detected, virtual objects can be moved with the hand. Complex actions can also be detected: by analyzing limb positions over a period of time, more complicated actions can be recognized, such as the user drawing a figure "8" in the air with a hand.
In gesture recognition, the hand must be tracked, and in the end how good the tracking is must be judged; that is, a so-called convergence criterion is needed.
In the course of making the present invention, the inventors found that the same problem arises in target detection: determining which target most resembles a hand. A hand deforms greatly and can take many postures. Moreover, the hand has skin color, and many skin-colored objects exist in the scene, causing significant interference. The prior art is not only limited when recognizing skin-colored targets, but also has difficulty overcoming noise interference, resulting in poor target tracking effectiveness.
Summary of the invention
The purpose of the embodiments of the present invention is to provide a method and apparatus for judging target tracking effectiveness, in order to solve the problem in the prior art that interference leads to poor judgment of the tracking effect.
The technical scheme adopted by the embodiments of the present invention is as follows.
One embodiment of the invention provides a method for judging target tracking effectiveness, the method including:
obtaining a sample target image, and applying corner detection to the sample target image to obtain a sample corner sequence;
obtaining a tracked target image, and applying corner detection to the tracked target image to obtain a corner sequence of the tracked target image;
judging the tracking effect according to the corner sequence of the tracked target image and the corner sequence of the sample target image.
Optionally, judging the tracking effect according to the corner sequence of the tracked target image and the corner sequence of the sample target image includes:
performing similarity matching between the corner sequence of the tracked target image and the corner sequence of the sample target image to obtain a similarity metric value, and judging the tracking effect according to the similarity metric value.
Optionally, judging the tracking effect according to the similarity metric value includes:
when the similarity metric value is greater than or equal to a first threshold, judging the tracking effect to be qualified;
when the similarity metric value is less than the first threshold, judging the tracking effect to be unqualified.
Optionally, judging the tracking effect according to the similarity metric value includes:
when the similarity metric value is greater than or equal to a second threshold, judging the tracking effect to be excellent;
when the similarity metric value is greater than or equal to a third threshold and less than the second threshold, judging the tracking effect to be qualified;
when the similarity metric value is less than the third threshold, judging the tracking effect to be unqualified;
wherein the third threshold is less than the second threshold.
Optionally, the method for judging target tracking effectiveness further includes:
obtaining another similarity metric value according to a histogram of the tracked target image in a color space, and further judging the tracking effect according to this other similarity metric value.
Optionally, obtaining the sample target image specifically includes: obtaining the sample target image offline.
Another embodiment of the invention provides an apparatus for judging target tracking effectiveness, the apparatus including:
a sample acquisition unit, configured to obtain a sample target image and apply corner detection to the sample target image to obtain a sample corner sequence;
a target acquisition unit, configured to obtain a tracked target image and apply corner detection to the tracked target image to obtain a corner sequence of the tracked target image;
an effect judging unit, configured to judge the tracking effect according to the corner sequence of the tracked target image and the corner sequence of the sample target image.
Optionally, the effect judging unit is further configured to perform similarity matching between the corner sequence of the tracked target image and the corner sequence of the sample target image to obtain a similarity metric value, and to judge the tracking effect according to the similarity metric value.
Optionally, the effect judging unit is further configured to judge the tracking effect to be qualified when the similarity metric value is greater than or equal to a first threshold, and to judge the tracking effect to be unqualified when the similarity metric value is less than the first threshold.
Optionally, the effect judging unit is further configured to judge the tracking effect to be excellent when the similarity metric value is greater than or equal to a second threshold; to judge the tracking effect to be qualified when the similarity metric value is greater than or equal to a third threshold and less than the second threshold; and to judge the tracking effect to be unqualified when the similarity metric value is less than the third threshold; wherein the third threshold is less than the second threshold.
Optionally, the effect judging unit is further configured to obtain another similarity metric value according to a histogram of the tracked target image in a color space, and to further judge the tracking effect according to this other similarity metric value.
Optionally, the sample acquisition unit is specifically configured to obtain the sample target image offline.
Another embodiment of the invention provides a virtual reality terminal, the virtual reality terminal including:
a hardware processor, configured to obtain a sample target image and apply corner detection to the sample target image to obtain a sample corner sequence; to obtain a tracked target image and apply corner detection to the tracked target image to obtain a corner sequence of the tracked target image; and to judge the tracking effect according to the corner sequence of the tracked target image and the corner sequence of the sample target image.
The technical scheme of the embodiments of the present invention has the following advantages.
The method and apparatus for judging target tracking effectiveness provided by the embodiments of the present invention are based on corner detection: the corners are serialized, the corner sequences of the sample target image and the tracked target image are matched to obtain a similarity metric, and the metric can further be combined with a color-space histogram. This enables similarity measurement for deformable targets and improves the accuracy of judging the tracking effect.
Description of the drawings
In order to explain the embodiments of the present invention or the technical schemes in the prior art more clearly, the accompanying drawings needed for describing the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present invention; those of ordinary skill in the art may derive other drawings from them without creative effort.
Fig. 1 is a schematic flowchart of a method for judging target tracking effectiveness according to one embodiment of the present invention;
Fig. 2 is a schematic structural diagram of an apparatus for judging target tracking effectiveness according to one embodiment of the present invention;
Fig. 3 is a schematic structural diagram of a virtual reality terminal according to another embodiment of the present invention;
Fig. 4 is a schematic structural diagram of an apparatus for judging target tracking effectiveness according to another embodiment of the present invention;
Fig. 5 is a schematic flowchart of a method for judging target tracking effectiveness according to another embodiment of the present invention.
Detailed description of the embodiments
To make the purpose, technical scheme, and advantages of the embodiments of the present invention clearer, the technical schemes in the embodiments are described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the present invention without creative effort fall within the protection scope of the present invention.
Several technical terms are used herein, including corner, corner detection, serialization, and color-space histogram; they are described as follows.
Corner: an extreme point, i.e., a point whose attributes are particularly prominent in some respect.
Corner detection: a method used in computer vision systems to obtain image features, widely applied in fields such as motion detection, image matching, video tracking, three-dimensional modeling, and object recognition; it is also referred to as feature point detection. A corner is usually defined as the intersection of two edges; more strictly, the local neighborhood of a corner should contain the boundaries of two regions with different directions. In practice, most so-called corner detection methods detect image points with specific features, not only "corners". These feature points have specific coordinates in the image and certain mathematical properties, such as a local maximum or minimum of gray level, or particular gradient features.
A very important evaluation criterion for a corner detection method is its ability to detect the same or similar features across multiple images, and its robustness to image changes such as illumination variation and image rotation.
Serialization: arranging the points in sequence in a certain order, such as counterclockwise.
Color-space histogram: a map of the color distribution based on a color space.
Matching color-space histograms yields a similarity based on the color distribution in the color space, while serializing the corners yields a similarity based on shape. Each similarity is given a specific weight, and the two weights sum to 1.
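As an illustration of the two operations just described, the following Python sketch orders detected corner points counterclockwise around their centroid and fuses a shape similarity with a color similarity using weights that sum to 1. The angle-based ordering rule and the 0.6/0.4 weight split are assumptions made for illustration; the description only requires some fixed ordering (e.g. counterclockwise) and weights summing to 1.

```python
import numpy as np

def serialize_corners(corners):
    """Serialize corner points: order them counterclockwise around their centroid.

    corners: array-like of (x, y) coordinates.
    Returns an (N, 2) array of the same points in counterclockwise order.
    """
    pts = np.asarray(corners, dtype=np.float64).reshape(-1, 2)
    centroid = pts.mean(axis=0)
    angles = np.arctan2(pts[:, 1] - centroid[1], pts[:, 0] - centroid[0])
    return pts[np.argsort(angles)]

def combined_similarity(shape_sim, color_sim, w_shape=0.6):
    """Fuse the shape-based and color-based similarities."""
    w_color = 1.0 - w_shape          # the two weights sum to 1, as required
    return w_shape * shape_sim + w_color * color_sim
```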
Fig. 1 is a schematic flowchart of a method for judging target tracking effectiveness according to one embodiment of the present invention. The method can be applied to various vision machines such as virtual reality terminals, robots, computers, and televisions, and mainly proceeds as follows.
Step 11: obtain a sample target image, and apply corner detection to the sample target image to obtain a sample corner sequence.
Step 12: obtain a tracked target image, and apply corner detection to the tracked target image to obtain a corner sequence of the tracked target image.
Step 13: judge the tracking effect according to the corner sequence of the tracked target image and the corner sequence of the sample target image.
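Steps 11 to 13 could be prototyped as in the following minimal sketch. It is not the claimed implementation: it assumes Shi-Tomasi corners via OpenCV's cv2.goodFeaturesToTrack, the counterclockwise serialization described above, and a simple resample-and-compare rule for the sequence similarity; the file names in the commented usage are placeholders.

```python
import cv2
import numpy as np

def corner_sequence(image_bgr, max_corners=64):
    """Steps 11/12: detect corners and serialize them counterclockwise."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    corners = cv2.goodFeaturesToTrack(gray, maxCorners=max_corners,
                                      qualityLevel=0.01, minDistance=5)
    if corners is None:
        return np.empty((0, 2))
    pts = corners.reshape(-1, 2)
    c = pts.mean(axis=0)
    return pts[np.argsort(np.arctan2(pts[:, 1] - c[1], pts[:, 0] - c[0]))]

def sequence_similarity(seq_a, seq_b, samples=32):
    """Step 13: similarity metric in (0, 1] between two corner sequences.

    Both sequences are resampled to the same length and normalised for
    translation and scale; the mean point distance is then inverted.
    """
    if len(seq_a) < 2 or len(seq_b) < 2:
        return 0.0
    def normalise(seq):
        idx = np.linspace(0, len(seq) - 1, samples).astype(int)
        s = seq[idx] - seq[idx].mean(axis=0)
        return s / (np.linalg.norm(s) + 1e-9)
    d = np.linalg.norm(normalise(seq_a) - normalise(seq_b), axis=1).mean()
    return 1.0 / (1.0 + d)

# Usage (placeholder file names):
#   sample_seq = corner_sequence(cv2.imread("sample_hand.png"))
#   track_seq  = corner_sequence(cv2.imread("tracked_hand.png"))
#   score      = sequence_similarity(sample_seq, track_seq)
```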
In another embodiment of the present invention, judging the tracking effect according to the corner sequence of the tracked target image and the corner sequence of the sample target image includes: performing similarity matching between the corner sequence of the tracked target image and the corner sequence of the sample target image to obtain a similarity metric value, and judging the tracking effect according to the similarity metric value.
In another embodiment of the present invention, judging the tracking effect according to the similarity metric value includes: when the similarity metric value is greater than or equal to a first threshold, judging the tracking effect to be qualified; when the similarity metric value is less than the first threshold, judging the tracking effect to be unqualified.
In another embodiment of the present invention, judging the tracking effect according to the similarity metric value includes: when the similarity metric value is greater than or equal to a second threshold, judging the tracking effect to be excellent; when the similarity metric value is greater than or equal to a third threshold and less than the second threshold, judging the tracking effect to be qualified; when the similarity metric value is less than the third threshold, judging the tracking effect to be unqualified; wherein the third threshold is less than the second threshold.
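A minimal sketch of this graded judgment is given below; the numeric values 0.8 and 0.6 for the second and third thresholds are placeholders, since the description does not fix concrete values.

```python
def grade_tracking(similarity, second_threshold=0.8, third_threshold=0.6):
    """Map the similarity metric value to a tracking-effect grade.

    third_threshold < second_threshold, as required; the numeric values
    are illustrative placeholders.
    """
    if similarity >= second_threshold:
        return "excellent"
    if similarity >= third_threshold:
        return "qualified"
    return "unqualified"
```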
In another embodiment of the present invention, the method for judging target tracking effectiveness further includes: obtaining another similarity metric value according to a histogram of the tracked target image in a color space, and further judging the tracking effect according to this other similarity metric value.
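One way to obtain such a color-space similarity metric is sketched below, assuming an HSV hue-saturation histogram compared by correlation; the choice of color space, channels, and comparison method is not fixed by the description.

```python
import cv2

def color_histogram_similarity(tracked_bgr, sample_bgr, bins=(30, 32)):
    """Correlation between the H-S histograms of the tracked and sample images."""
    def hs_hist(img):
        hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)
        hist = cv2.calcHist([hsv], [0, 1], None, list(bins), [0, 180, 0, 256])
        return cv2.normalize(hist, hist).flatten()
    return cv2.compareHist(hs_hist(tracked_bgr), hs_hist(sample_bgr),
                           cv2.HISTCMP_CORREL)
```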
As shown in Fig. 2 the structural representation of the device for a kind of judgement target following effect of one embodiment of the invention, institute
State and judge that the device of target following effect can apply to that virtual reality terminal, robot, computer and TV etc. are various to be regarded
Feel that machine, the device of the judgement target following effect include:Sample acquisition unit 21, Target Acquisition unit 22 and effect judge
Unit 23.
The sample acquisition unit 21 is configured to obtain a sample target image, and to apply corner detection to the sample target image to obtain a sample corner sequence.
The target acquisition unit 22 is configured to obtain a tracked target image, and to apply corner detection to the tracked target image to obtain a corner sequence of the tracked target image.
The effect judging unit 23 is configured to judge the tracking effect according to the corner sequence of the tracked target image and the corner sequence of the sample target image.
In another embodiment of the present invention, the effect judging unit is further configured to perform similarity matching between the corner sequence of the tracked target image and the corner sequence of the sample target image to obtain a similarity metric value, and to judge the tracking effect according to the similarity metric value.
In another embodiment of the present invention, the effect judging unit is further configured to judge the tracking effect to be qualified when the similarity metric value is greater than or equal to a first threshold, and to judge the tracking effect to be unqualified when the similarity metric value is less than the first threshold.
In another embodiment of the present invention, the effect judging unit is further configured to judge the tracking effect to be excellent when the similarity metric value is greater than or equal to a second threshold; to judge the tracking effect to be qualified when the similarity metric value is greater than or equal to a third threshold and less than the second threshold; and to judge the tracking effect to be unqualified when the similarity metric value is less than the third threshold; wherein the third threshold is less than the second threshold.
In another embodiment of the present invention, the effect judging unit is further configured to obtain another similarity metric value according to a histogram of the tracked target image in a color space, and to further judge the tracking effect according to this other similarity metric value.
Fig. 3 is a schematic structural diagram of an electronic device according to one embodiment of the present invention. The electronic device may be any of various vision machines such as a robot, a computer, or a television, and includes a processor 311, a transmitter 312, a communications interface 313, a memory 314, and a communication bus 315; the processor 311, the transmitter 312, the communications interface 313, and the memory 314 communicate with one another through the communication bus 315.
The processor 311 may be a central processing unit (CPU), an application-specific integrated circuit (ASIC), or one or more integrated circuits configured to implement the embodiments of the present invention.
The memory 314 is used to store program code, and the program code includes computer operation instructions. The memory 314 may include high-speed RAM and may also include non-volatile memory, for example at least one disk memory.
The processor 311 executes the program code to obtain a sample target image and apply corner detection to the sample target image to obtain a sample corner sequence; to obtain a tracked target image and apply corner detection to the tracked target image to obtain a corner sequence of the tracked target image; and to judge the tracking effect according to the corner sequence of the tracked target image and the corner sequence of the sample target image.
In another embodiment of the present invention, the processor 311 further executes program code to perform similarity matching between the corner sequence of the tracked target image and the corner sequence of the sample target image to obtain a similarity metric value, and to judge the tracking effect according to the similarity metric value.
In another embodiment of the present invention, the processor 311 further executes program code to judge the tracking effect to be qualified when the similarity metric value is greater than or equal to a first threshold, and to judge the tracking effect to be unqualified when the similarity metric value is less than the first threshold.
In another embodiment of the present invention, the processor 311 further executes program code to judge the tracking effect to be excellent when the similarity metric value is greater than or equal to a second threshold; to judge the tracking effect to be qualified when the similarity metric value is greater than or equal to a third threshold and less than the second threshold; and to judge the tracking effect to be unqualified when the similarity metric value is less than the third threshold; wherein the third threshold is less than the second threshold.
In another embodiment of the present invention, the processor 311 further executes program code to obtain another similarity metric value according to a histogram of the tracked target image in a color space, and to further judge the tracking effect according to this other similarity metric value.
Fig. 4 is a schematic structural diagram of an apparatus for judging target tracking effectiveness according to another embodiment of the present invention. The apparatus may be any of various vision machines such as a robot, a computer, or a television, and includes a sample acquisition unit 41, a target acquisition unit 42, an effect judging unit 43, and a memory 44.
The sample acquisition unit 41 is configured to obtain a sample target image, and to apply corner detection to the sample target image to obtain a sample corner sequence.
For example, the sample acquisition unit 41 may obtain the sample target image offline, apply corner detection to the sample target image, and serialize the corners to obtain the corner sequence of the sample target image.
"Offline" means that the sample acquisition unit 41 obtains the sample target image in non-real time. For example, when the camera has not yet acquired a target image, the sample target image can be preset. Alternatively, target images captured by the camera can be stored, and at some later time the sample acquisition unit 41 obtains these stored images as sample target images.
The camera may be a vision sensor, a wired camera, or a wireless camera, for example a USB camera, a Wi-Fi camera, a camera connected to an ARM board, or a CMOS camera.
In another embodiment of the present invention, the sample acquisition unit 41 can obtain either static or dynamic sample target images. For example, the sample acquisition unit 41 may obtain several static sample target images such as a V sign, five spread fingers, rock, scissors, and paper.
As another example, the sample acquisition unit 41 may obtain static sample target images of the head, an arm, a finger, or another object (such as a weapon).
As yet another example, the sample acquisition unit 41 may obtain dynamic sample target images of a user's actions (for example, jumping, squatting, or leaning forward).
The sample acquisition unit 41 is further configured to store the sample target image and its corner sequence in the memory 44.
The target acquisition unit 42 is configured to obtain a tracked target image, and to apply corner detection to the tracked target image to obtain a corner sequence of the tracked target image.
For example, when a target image is captured by the camera, the target acquisition unit 42 uses a tracking algorithm to take that image as the tracked target image, as in the sketch below.
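The tracking algorithm itself is not specified in the description, so the following is only a sketch of a per-frame acquisition loop assuming pyramidal Lucas-Kanade optical flow on the corner points; the camera index 0 is a placeholder.

```python
import cv2

cap = cv2.VideoCapture(0)                 # placeholder device index
ok, frame = cap.read()
if not ok:
    raise RuntimeError("camera not available")

prev_gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
prev_pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=64,
                                   qualityLevel=0.01, minDistance=5)

while prev_pts is not None and len(prev_pts) > 0:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    next_pts, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray,
                                                   prev_pts, None)
    tracked = next_pts[status.flatten() == 1]   # corners that survived this frame
    # `tracked` (with the current frame) plays the role of the tracked-target
    # data: it would be serialized and matched against the sample corner
    # sequence as in the earlier sketches.
    prev_gray, prev_pts = gray, tracked.reshape(-1, 1, 2)

cap.release()
```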
The effect judging unit 43 is configured to judge the tracking effect according to the corner sequence of the tracked target image and the corner sequence of the sample target image.
In another embodiment of the present invention, the effect judging unit 43 performs similarity matching between the corner sequence of the tracked target image and the corner sequence of the sample target image to obtain a similarity metric value, and judges the tracking effect according to the similarity metric value.
For example, judging the tracking effect according to the similarity metric value may specifically be: judging whether the similarity metric value meets a threshold; or, in another embodiment of the present invention, judging which threshold interval, corresponding to an effect grade, the similarity metric value falls into, as detailed below.
For example, the effect judging unit 43 judges the tracking effect to be qualified when the similarity metric value is greater than or equal to a first threshold, and judges the tracking effect to be unqualified when the similarity metric value is less than the first threshold.
In another embodiment of the present invention, the effect judging unit 43 judges the tracking effect to be excellent when the similarity metric value is greater than or equal to a second threshold; judges the tracking effect to be qualified when the similarity metric value is greater than or equal to a third threshold and less than the second threshold; and judges the tracking effect to be unqualified when the similarity metric value is less than the third threshold; wherein the third threshold is less than the second threshold.
In another embodiment of the present invention, the effect judging unit 43 is further configured to obtain another, more accurate similarity metric value from the histogram of the tracked target image in a color space, and to further judge the tracking effect according to this other similarity metric value. For example, the effect judging unit 43 evaluates the similarity between the color-space histogram of the tracked target image and that of the sample target image: if the similarity is greater than or equal to a threshold, the tracking effect is determined to be qualified; if the similarity is less than the threshold, the tracking effect is determined to be unqualified.
In particular, when the tracked target deforms, combining the color-space histogram enables similarity measurement of the deformed target.
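One possible way to combine the two metrics into a final judgment is sketched below; checking the corner-sequence metric first and using the histogram metric as the further confirmation is an assumption about ordering, and the threshold values are placeholders.

```python
def judge_tracking_effect(corner_similarity, histogram_similarity,
                          corner_threshold=0.6, histogram_threshold=0.5):
    """Corner-sequence metric decides first; the colour-histogram metric is
    the further confirmation. Threshold values are placeholders."""
    if corner_similarity < corner_threshold:
        return "unqualified"
    return "qualified" if histogram_similarity >= histogram_threshold else "unqualified"
```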
Fig. 5 is a schematic flowchart of a method for judging target tracking effectiveness according to another embodiment of the present invention. The method can be applied to various vision machines such as virtual reality terminals, robots, computers, and televisions, and mainly proceeds as follows.
Step 51: obtain a sample target image, and apply corner detection to the sample target image to obtain a sample corner sequence.
For example, the sample target image may be obtained offline; corner detection is applied to it and the corners are serialized to obtain the corner sequence of the sample target image.
"Offline" means obtaining the sample target image in non-real time. For example, when the camera has not yet acquired a target image, the sample target image can be preset; alternatively, target images captured by the camera can be stored and later retrieved as sample target images.
The camera may be a vision sensor, a wired camera, or a wireless camera, for example a USB camera, a Wi-Fi camera, a camera connected to an ARM board, or a CMOS camera.
In another embodiment of the present invention, either static or dynamic sample target images may be obtained; for example, several static sample target images such as a V sign, five spread fingers, rock, scissors, and paper.
As another example, static sample target images of the head, an arm, a finger, or another object (such as a weapon) may be obtained.
As yet another example, dynamic sample target images of a user's actions (for example, jumping, squatting, or leaning forward) may be obtained.
The method for judging target tracking effectiveness further includes storing the sample target image and its corner sequence.
Step 52: obtain a tracked target image, and apply corner detection to the tracked target image to obtain a corner sequence of the tracked target image.
For example, when a target image is captured by the camera, a tracking algorithm takes that image as the tracked target image.
Step 53: judge the tracking effect according to the corner sequence of the tracked target image and the corner sequence of the sample target image.
In another embodiment of the present invention, judging the tracking effect according to the corner sequence of the tracked target image and the corner sequence of the sample target image is detailed as follows.
Step 531: perform similarity matching between the corner sequence of the tracked target image and the corner sequence of the sample target image to obtain a similarity metric value.
Step 532: judge the tracking effect according to the similarity metric value.
For example, judge whether the similarity metric value meets a threshold; or, in another embodiment of the present invention, judge which threshold interval, corresponding to an effect grade, the similarity metric value falls into, as detailed below.
For example, when the similarity metric value is greater than or equal to a first threshold, the tracking effect is judged to be qualified; when the similarity metric value is less than the first threshold, the tracking effect is judged to be unqualified.
In another embodiment of the present invention, when the similarity metric value is greater than or equal to a second threshold, the tracking effect is judged to be excellent; when the similarity metric value is greater than or equal to a third threshold and less than the second threshold, the tracking effect is judged to be qualified; when the similarity metric value is less than the third threshold, the tracking effect is judged to be unqualified; wherein the third threshold is less than the second threshold.
Step 54: obtain another, more accurate similarity metric value from the histogram of the tracked target image in a color space.
In particular, when the tracked target deforms, combining the color-space histogram enables similarity measurement of the deformed target.
In summary, the method of this embodiment for judging target tracking effectiveness is based on corner detection: the corners are serialized, the corner sequences of the sample target image and the tracked target image are matched to obtain a similarity metric, and combining this with the color-space histogram enables similarity measurement of deformable targets.
The apparatus embodiments described above are merely illustrative. The units described as separate components may or may not be physically separate, and the components shown as units may or may not be physical units; they may be located in one place or distributed over multiple network elements. Some or all of the modules may be selected according to actual needs to achieve the purpose of the embodiment. Those of ordinary skill in the art can understand and implement this without creative effort.
From the above description of the embodiments, those skilled in the art can clearly understand that each embodiment can be implemented by software plus a necessary general hardware platform, or of course by hardware. Based on this understanding, the essence of the above technical scheme, or the part contributing to the prior art, can be embodied in the form of a software product. The computer software product may be stored in a computer-readable storage medium, such as ROM/RAM, a magnetic disk, or an optical disc, and includes instructions causing a computer device (which may be a personal computer, a server, a network device, or the like) to execute the methods described in the embodiments or in certain parts of the embodiments.
Finally, it should be noted that the above embodiments are intended only to describe the technical scheme of the present invention, not to limit it. Although the present invention has been described in detail with reference to the foregoing embodiments, those skilled in the art will understand that the technical schemes described in the foregoing embodiments may still be modified, or some of their technical features may be replaced by equivalents; such modifications or replacements do not depart the essence of the corresponding technical schemes from the spirit and scope of the technical schemes of the embodiments of the present invention.
Claims (10)
1. A method for judging target tracking effectiveness, characterized in that the method includes:
obtaining a sample target image, and applying corner detection to the sample target image to obtain a sample corner sequence;
obtaining a tracked target image, and applying corner detection to the tracked target image to obtain a corner sequence of the tracked target image;
judging the tracking effect according to the corner sequence of the tracked target image and the corner sequence of the sample target image.
2. The method according to claim 1, characterized in that judging the tracking effect according to the corner sequence of the tracked target image and the corner sequence of the sample target image includes:
performing similarity matching between the corner sequence of the tracked target image and the corner sequence of the sample target image to obtain a similarity metric value, and judging the tracking effect according to the similarity metric value.
3. The method according to claim 2, characterized in that judging the tracking effect according to the similarity metric value includes:
when the similarity metric value is greater than or equal to a first threshold, judging the tracking effect to be qualified;
when the similarity metric value is less than the first threshold, judging the tracking effect to be unqualified.
4. The method according to claim 2, characterized in that judging the tracking effect according to the similarity metric value includes:
when the similarity metric value is greater than or equal to a second threshold, judging the tracking effect to be excellent;
when the similarity metric value is greater than or equal to a third threshold and less than the second threshold, judging the tracking effect to be qualified;
when the similarity metric value is less than the third threshold, judging the tracking effect to be unqualified;
wherein the third threshold is less than the second threshold.
5. The method according to claim 2, characterized by further including:
obtaining another similarity metric value according to a histogram of the tracked target image in a color space, and further judging the tracking effect according to this other similarity metric value.
6. An apparatus for judging target tracking effectiveness, characterized in that the apparatus includes:
a sample acquisition unit, configured to obtain a sample target image and apply corner detection to the sample target image to obtain a sample corner sequence;
a target acquisition unit, configured to obtain a tracked target image and apply corner detection to the tracked target image to obtain a corner sequence of the tracked target image;
an effect judging unit, configured to judge the tracking effect according to the corner sequence of the tracked target image and the corner sequence of the sample target image.
7. The apparatus according to claim 6, characterized in that the effect judging unit is further configured to perform similarity matching between the corner sequence of the tracked target image and the corner sequence of the sample target image to obtain a similarity metric value, and to judge the tracking effect according to the similarity metric value.
8. The apparatus according to claim 7, characterized in that the effect judging unit is further configured to judge the tracking effect to be qualified when the similarity metric value is greater than or equal to a first threshold, and to judge the tracking effect to be unqualified when the similarity metric value is less than the first threshold.
9. The apparatus according to claim 7, characterized in that the effect judging unit is further configured to judge the tracking effect to be excellent when the similarity metric value is greater than or equal to a second threshold; to judge the tracking effect to be qualified when the similarity metric value is greater than or equal to a third threshold and less than the second threshold; and to judge the tracking effect to be unqualified when the similarity metric value is less than the third threshold; wherein the third threshold is less than the second threshold.
10. The apparatus according to claim 7, characterized in that the effect judging unit is further configured to obtain another similarity metric value according to a histogram of the tracked target image in a color space, and to further judge the tracking effect according to this other similarity metric value.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201610830288.4A CN106502380A (en) | 2016-09-18 | 2016-09-18 | A kind of method and apparatus for judging target following effect |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201610830288.4A CN106502380A (en) | 2016-09-18 | 2016-09-18 | A kind of method and apparatus for judging target following effect |
Publications (1)
Publication Number | Publication Date |
---|---|
CN106502380A true CN106502380A (en) | 2017-03-15 |
Family
ID=58290120
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201610830288.4A Pending CN106502380A (en) | 2016-09-18 | 2016-09-18 | A kind of method and apparatus for judging target following effect |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN106502380A (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111885301A (en) * | 2020-06-29 | 2020-11-03 | 浙江大华技术股份有限公司 | Gun and ball linkage tracking method and device, computer equipment and storage medium |
CN113888583A (en) * | 2021-09-28 | 2022-01-04 | 河北汉光重工有限责任公司 | Real-time judgment method and device for visual tracking accuracy |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101888479A (en) * | 2009-05-14 | 2010-11-17 | 汉王科技股份有限公司 | Method and device for detecting and tracking target image |
US20120062736A1 (en) * | 2010-09-13 | 2012-03-15 | Xiong Huaixin | Hand and indicating-point positioning method and hand gesture determining method used in human-computer interaction system |
CN104732187A (en) * | 2013-12-18 | 2015-06-24 | 杭州华为企业通信技术有限公司 | Method and equipment for image tracking processing |
- 2016-09-18: Application CN201610830288.4A filed in China (CN); published as CN106502380A; legal status: Pending
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101888479A (en) * | 2009-05-14 | 2010-11-17 | 汉王科技股份有限公司 | Method and device for detecting and tracking target image |
US20120062736A1 (en) * | 2010-09-13 | 2012-03-15 | Xiong Huaixin | Hand and indicating-point positioning method and hand gesture determining method used in human-computer interaction system |
CN104732187A (en) * | 2013-12-18 | 2015-06-24 | 杭州华为企业通信技术有限公司 | Method and equipment for image tracking processing |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111885301A (en) * | 2020-06-29 | 2020-11-03 | 浙江大华技术股份有限公司 | Gun and ball linkage tracking method and device, computer equipment and storage medium |
CN113888583A (en) * | 2021-09-28 | 2022-01-04 | 河北汉光重工有限责任公司 | Real-time judgment method and device for visual tracking accuracy |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6079832B2 (en) | Human computer interaction system, hand-to-hand pointing point positioning method, and finger gesture determination method | |
Memo et al. | Head-mounted gesture controlled interface for human-computer interaction | |
Plouffe et al. | Static and dynamic hand gesture recognition in depth data using dynamic time warping | |
US8166421B2 (en) | Three-dimensional user interface | |
US8488888B2 (en) | Classification of posture states | |
KR101581954B1 (en) | Apparatus and method for a real-time extraction of target's multiple hands information | |
Prisacariu et al. | 3D hand tracking for human computer interaction | |
CN109597485B (en) | Gesture interaction system based on double-fingered-area features and working method thereof | |
US20130077820A1 (en) | Machine learning gesture detection | |
CN107688391A (en) | A kind of gesture identification method and device based on monocular vision | |
US20030156756A1 (en) | Gesture recognition system using depth perceptive sensors | |
CN107357428A (en) | Man-machine interaction method and device based on gesture identification, system | |
CN105107200B (en) | Face Changing system and method based on real-time deep body feeling interaction and augmented reality | |
JP6487642B2 (en) | A method of detecting a finger shape, a program thereof, a storage medium of the program, and a system for detecting a shape of a finger. | |
CN110503686A (en) | Object pose estimation method and electronic equipment based on deep learning | |
CN106200971A (en) | Man-machine interactive system device based on gesture identification and operational approach | |
Plouffe et al. | Natural human-computer interaction using static and dynamic hand gestures | |
CN103577792A (en) | Device and method for estimating body posture | |
Kerdvibulvech | Hand tracking by extending distance transform and hand model in real-time | |
CN106502380A (en) | A kind of method and apparatus for judging target following effect | |
Xu et al. | A novel method for hand posture recognition based on depth information descriptor | |
Półrola et al. | Real-time hand pose estimation using classifiers | |
CN109254663B (en) | Using method of auxiliary reading robot for books of children | |
Simion et al. | Finger detection based on hand contour and colour information | |
Schlattmann et al. | Markerless 4 gestures 6 DOF real‐time visual tracking of the human hand with automatic initialization |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
WD01 | Invention patent application deemed withdrawn after publication | ||
Application publication date: 20170315 |