CN104637069A - Electronic device and method of tracking objects in video

Electronic device and method of tracking objects in video

Info

Publication number
CN104637069A
CN104637069A
Authority
CN
China
Prior art keywords
picture
film
module
defines
define
Prior art date
Legal status
Pending
Application number
CN201310573541.9A
Other languages
Chinese (zh)
Inventor
廖嘉维
詹凯轩
Current Assignee
Institute for Information Industry
Original Assignee
Institute for Information Industry
Priority date
Filing date
Publication date
Application filed by Institute for Information Industry
Priority to CN201310573541.9A
Publication of CN104637069A
Legal status: Pending


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10016 Video; Image sequence
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/30 Subject of image; Context of image processing
    • G06T 2207/30241 Trajectory

Landscapes

  • Studio Devices (AREA)

Abstract

The invention provides an electronic device and a method of tracking objects in a video. The electronic device comprises a video supply unit and a processing unit. The video supply unit supplies a video. The processing unit captures a clip of the video that includes a plurality of consecutive frames, defines the position of at least one first object in a first frame of the consecutive frames, defines the position of the at least one first object in a second frame of the consecutive frames, and, according to the positions of the first object in the first frame and the second frame, determines the positions of the first object in each frame between the first frame and the second frame.

Description

Electronic device and video object tracking method thereof
Technical field
The present invention relates to an electronic device and a video processing method thereof. More specifically, the present invention relates to an electronic device and a video object tracking method thereof.
Background
In recent years, a wide variety of electronic devices, such as televisions, computers, mobile phones, cameras and camcorders, have commonly provided video processing functions. For example, such devices can be used to record videos, play videos, edit videos and so on.
In some applications, a video object tracking function gives an electronic device higher added value. For example, in an interactive video, an electronic device with object tracking can estimate the position of an object in every frame in which it appears, and specific information or links can then be embedded at that object in each of those frames. When a user watches the video, clicking the object directly in the video retrieves the information or link associated with it. In this scenario, an electronic device with video object tracking not only saves the user the time of searching the Internet, but also benefits manufacturers' product promotion and marketing campaigns.
A key challenge of video object tracking is how to quickly estimate the motion track of an object in a video, that is, how to quickly estimate the changes of the object (including changes in shape and position) throughout the video. Thanks to advances in motion estimation algorithms, the time a conventional tracking technique needs to estimate the motion track of a single object has gradually decreased. However, when the motion tracks of several objects must be estimated, conventional techniques have to estimate them one object at a time, so the time required tends to grow linearly with the number of objects. In other words, when several objects need to be tracked in a video, the efficiency of conventional video object tracking techniques is still unsatisfactory.
Accordingly, reducing the time that conventional video object tracking techniques spend on tracking several objects is an urgent problem in the art.
Summary of the invention
A primary objective of the present invention is to reduce the time that conventional video object tracking techniques spend on tracking several objects.
To achieve the above objective, the present invention provides an electronic device. The electronic device comprises a video supply unit and a processing unit electrically connected to the video supply unit. The video supply unit is configured to supply a video. The processing unit is configured to capture a clip of the video, the clip comprising a plurality of consecutive frames; to define the position of at least one first object in a first frame of the consecutive frames; to define the position of the at least one first object in a second frame of the consecutive frames; and, according to the positions of the at least one first object in the first frame and the second frame, to determine the position of the at least one first object in each frame between the first frame and the second frame.
To achieve the above objective, the present invention further provides a video object tracking method for an electronic device. The electronic device comprises a video supply unit and a processing unit electrically connected to the video supply unit. The video object tracking method comprises the following steps:
(a) supplying a video by the video supply unit;
(b) capturing, by the processing unit, a clip of the video, the clip comprising a plurality of consecutive frames;
(c) defining, by the processing unit, the position of at least one first object in a first frame of the consecutive frames;
(d) defining, by the processing unit, the position of the at least one first object in a second frame of the consecutive frames; and
(e) determining, by the processing unit, the position of the at least one first object in each frame between the first frame and the second frame according to the positions of the at least one first object in the first frame and the second frame.
In summary, the present invention provides an electronic device and a video object tracking method thereof. Through the operation of the aforementioned video supply unit and processing unit, the electronic device and its tracking method pick two frames out of the consecutive frames of a video clip and define the positions of one or more objects in each of the two frames. Then, according to the positions of those objects in the two frames, the positions of the same objects in every frame between the two frames are determined, so that the motion tracks of the objects between the two frames are estimated. Unlike conventional video object tracking methods, which must estimate the motion tracks of different objects one after another, the present invention can estimate the motion tracks of multiple objects at once, effectively reducing the time spent on tracking several objects.
After reading the accompanying drawings and the embodiments described below, persons skilled in the art will understand other objectives of the present invention as well as the technical means and implementation aspects thereof.
Brief description of the drawings
Fig. 1 is a schematic structural diagram of the electronic device 1 according to the first embodiment of the present invention;
Fig. 2 is a schematic diagram of the video clip 22 according to the first embodiment of the present invention;
Fig. 3 is a schematic diagram of the operation of the object tracking module 135 according to the first embodiment of the present invention; and
Fig. 4 is a schematic diagram of the video object tracking method according to the second embodiment of the present invention.
Reference numerals:
1: electronic device
11: video supply unit
13: processing unit
131: video segmentation module
133: object definition module
135: object tracking module
15: user interface unit
20: video
22: video clip
40: position of the at least one first object in the first frame
42: position of the at least one first object in the second frame
43: position of the at least one second object in the second frame
44: position of the at least one second object in the third frame
45: position of the at least one third object in the first frame
46: position of the at least one third object in the third frame
60: user input
F1: first frame
F2: second frame
F3: third frame
B1: bounding box
B2: bounding box
S21, S23, S25, S27, S29: steps
X1: first object
X2: second object
X3: third object
Detailed description
The present invention will be explained below with reference to embodiments. The embodiments are not intended to restrict the invention to any particular environment, application or implementation described therein; they serve only to illustrate the invention and do not limit its scope. In the following embodiments and drawings, elements not directly related to the invention are omitted, and the relative sizes of the illustrated elements are shown only for ease of understanding and do not limit the actual scale of implementation.
The first embodiment of the present invention describes an electronic device, whose structure is schematically shown in Fig. 1. As shown in Fig. 1, the electronic device 1 comprises a video supply unit 11 and a processing unit 13, and optionally a user interface unit 15. The electronic device 1 may be, but is not limited to, a television, a computer, a mobile phone, a camera, a camcorder, etc. The video supply unit 11 and the user interface unit 15 are both electrically connected to the processing unit 13, and the units can communicate with and pass messages to one another.
The video supply unit 11 may comprise a video capture device, such as a camcorder, configured to capture a video 20 and supply the video 20 to the processing unit 13. The video supply unit 11 may also comprise a storage configured to store the video 20 and supply it to the processing unit 13. In other embodiments, the video supply unit 11 may provide the video 20 to the processing unit 13 in other ways.
The processing unit 13 is configured to capture a video clip 22 of the video 20, wherein the video clip 22 may comprise a plurality of consecutive frames. The processing unit 13 is further configured to define the position 40 of at least one first object in a first frame of the consecutive frames, and to define the position 42 of the at least one first object in a second frame of the consecutive frames. The term "at least one object" in this embodiment may be understood as one or more objects. The processing unit 13 is also configured to determine, according to the position 40 of the at least one first object in the first frame and its position 42 in the second frame, the position of the at least one first object in each frame between the first frame and the second frame, thereby estimating the motion track of the at least one first object from the first frame to the second frame.
For ease of description, the processing unit 13 of this embodiment may comprise a video segmentation module 131, an object definition module 133 and an object tracking module 135. In other embodiments, the processing unit 13 may simply be a single processor that performs the operations corresponding to the above modules.
As shown in Fig. 1, after the video supply unit 11 supplies the video 20 to the processing unit 13, the processing unit 13 first captures the video clip 22 of the video 20 through the video segmentation module 131. The purpose of the video segmentation module 131 is to ensure that the consecutive frames comprised in the captured video clip 22 share at least one common object. Accordingly, the video clip 22 may be the whole video 20 or any segment thereof.
For example, the video segmentation module 131 may judge whether the frames are continuous according to whether the camera shot of the video 20 switches. If the frames are not continuous, the video segmentation module 131 may cut the video 20 and capture a video clip 22 comprising a plurality of consecutive frames. In other embodiments, the processing unit 13 may omit the video segmentation module 131 and pass the video 20 from the video supply unit 11 directly to the object definition module 133.
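For illustration only, the following sketch shows one way the shot-change detection described above might be realized, assuming an OpenCV-based pipeline in which consecutive frames are compared by grayscale-histogram similarity; the threshold value and function names are assumptions, not part of the patent disclosure.

```python
# Hedged sketch: split a video into clips of consecutive frames at shot changes.
# A drop in histogram correlation between neighbouring frames is treated as a cut.
import cv2

def split_into_clips(video_path, cut_threshold=0.6):
    cap = cv2.VideoCapture(video_path)
    clips, current_clip, prev_hist = [], [], None
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        hist = cv2.calcHist([gray], [0], None, [64], [0, 256])
        cv2.normalize(hist, hist)
        if prev_hist is not None:
            similarity = cv2.compareHist(prev_hist, hist, cv2.HISTCMP_CORREL)
            if similarity < cut_threshold:      # shot change: close current clip
                clips.append(current_clip)
                current_clip = []
        current_clip.append(frame)
        prev_hist = hist
    cap.release()
    if current_clip:
        clips.append(current_clip)
    return clips
```

Each returned clip would then play the role of the video clip 22, whose consecutive frames are assumed to share at least one common object.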
When the electronic device 1 does not comprise the user interface unit 15, after the video segmentation module 131 of the processing unit 13 captures the video clip 22 of the video 20, the object definition module 133 of the processing unit 13 defines the position 40 of the at least one first object in the first frame of the consecutive frames and defines the position 42 of the at least one first object in the second frame of the consecutive frames.
When the electronic device 1 comprises the user interface unit 15, the object definition module 133 of the processing unit 13 may define the position 40 of the at least one first object in the first frame and the position 42 of the at least one first object in the second frame according to a user input 60 from the user interface unit 15. For example, through a touch screen, a mouse, a keyboard or any other input device (not shown), the user manually frames the at least one object in the first frame and the second frame. The user interface unit 15 then generates the user input 60 according to the regions framed by the user in the first frame and the second frame, and the object definition module 133 defines the position 40 of the at least one first object in the first frame and the position 42 of the at least one first object in the second frame according to the user input 60.
After the object definition module 133 of the processing unit 13 defines the position 40 of the at least one first object in the first frame and the position 42 of the at least one first object in the second frame, the object tracking module 135 of the processing unit 13 determines, according to the position 40 in the first frame and the position 42 in the second frame, the position of the at least one first object in each frame between the first frame and the second frame, thereby estimating the motion track of the at least one first object from the first frame to the second frame.
Fig. 2 and Fig. 3 are used below as an illustrative example to further explain the above operations of the object definition module 133 and the object tracking module 135. Fig. 2 is a schematic diagram of the video clip 22, and Fig. 3 is a schematic diagram of the operation of the object tracking module 135.
As shown in Fig. 2, the video clip 22 comprises a plurality of consecutive frames. For convenience, assume that the first frame F1 is the start frame of the consecutive frames comprised in the video clip 22 and the second frame F2 is the end frame of those consecutive frames. In other embodiments, the first frame F1 and the second frame F2 may also be any two non-adjacent frames among the consecutive frames of the video clip 22.
The object definition module 133 of the processing unit 13 first judges whether the same object appears in both the first frame F1 and the second frame F2. Since the first object X1 (a circular object) appears in both the first frame F1 and the second frame F2, the object definition module 133 defines the position 40 of the first object X1 in the first frame F1 and its position 42 in the second frame F2. The object tracking module 135 of the processing unit 13 then determines, according to the position 40 of the first object X1 in the first frame F1 and its position 42 in the second frame F2, the position of the first object X1 in each frame between the first frame F1 and the second frame F2.
If multiple first objects X1 appear in both the first frame F1 and the second frame F2, the object definition module 133 can simultaneously define the positions 40 of the several first objects X1 in the first frame F1 and their positions 42 in the second frame F2, and the object tracking module 135 then determines, according to those positions 40 and 42, the position of each first object X1 in each frame between the first frame F1 and the second frame F2.
In other embodiments, even though the frames of the video clip 22 are consecutive, some objects may not appear in the first frame F1 but appear in the second frame F2, or may appear in the first frame F1 but not in the second frame F2. The former case is illustrated by the second object X2 (a triangular object) in Fig. 2, and the latter by the third object X3 (a square object) in Fig. 2. The number of second objects X2 and third objects X3 may each be one or several.
Since the second object X2 appears only in the second frame F2 and not in the first frame F1, it may start to appear at a certain frame between the first frame F1 and the second frame F2 and remain visible until the second frame F2. Therefore, the object definition module 133 further defines the position 43 of the second object X2 in the second frame F2, selects a third frame F3 from the frames between the first frame F1 and the second frame F2, and defines the position 44 of the second object X2 in the third frame F3. The object tracking module 135 then determines, according to the position 43 of the second object X2 in the second frame F2 and its position 44 in the third frame F3, the position of the second object X2 in each frame between the second frame F2 and the third frame F3, thereby estimating the motion track of the second object X2 from the third frame F3 to the second frame F2.
Similarly, since the third object X3 appears only in the first frame F1 and not in the second frame F2, it may disappear at a certain frame between the first frame F1 and the second frame F2 and remain absent until the second frame F2. Therefore, the object definition module 133 further defines the position 45 of the third object X3 in the first frame F1, selects a third frame F3 from the frames between the first frame F1 and the second frame F2, and defines the position 46 of the third object X3 in the third frame F3. The object tracking module 135 then determines, according to the position 45 of the third object X3 in the first frame F1 and its position 46 in the third frame F3, the position of the third object X3 in each frame between the first frame F1 and the third frame F3, thereby estimating the motion track of the third object X3 from the first frame F1 to the third frame F3.
When the second object X2 and the third object X3 exist at the same time, the object definition module 133 may select two different third frames F3 from the frames between the first frame F1 and the second frame F2 to define the positions 44 and 46 of the second object X2 and the third object X3 respectively, or it may select the same third frame F3 to define both positions 44 and 46. The rule for selecting the third frame F3 from the frames between the first frame F1 and the second frame F2 may be a rule of thumb, bisection, or another known method. For example, the object definition module 133 may adopt bisection and select the middle frame between the first frame F1 and the second frame F2 as the third frame F3. In other embodiments, the third frame F3 may also be selected by the user from the frames between the first frame F1 and the second frame F2 through the user interface unit 15.
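For illustration only, a minimal sketch of the bisection rule follows; the frame indices and the object_visible predicate (for example a detector or a user confirmation through the user interface unit 15) are assumptions used to show how bisection could also narrow down the frame at which an object such as X2 first appears.

```python
# Hedged sketch: pick F3 as the middle frame between F1 and F2, and optionally
# repeat the bisection to locate the first frame in which an object is visible.
def pick_third_frame(first_index, second_index):
    return (first_index + second_index) // 2

def first_appearance(first_index, second_index, object_visible):
    # Assumes the object is absent at first_index and present at second_index.
    lo, hi = first_index, second_index
    while hi - lo > 1:
        mid = pick_third_frame(lo, hi)
        if object_visible(mid):
            hi = mid
        else:
            lo = mid
    return hi
```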
Besides the cases of the second object X2 and the third object X3 described above, some objects may appear in a frame between the first frame F1 and the second frame F2 and also disappear in a frame between the first frame F1 and the second frame F2; in other words, such objects appear in neither the first frame F1 nor the second frame F2. Based on the above disclosure, however, the object definition module 133 and the object tracking module 135 of the processing unit 13 can still effectively estimate the motion tracks of these objects.
For example, assume that the object definition module 133 defines a fourth object (not shown) in the third frame F3, and that the fourth object appears in neither the first frame F1 nor the second frame F2. The object definition module 133 may then select a fourth frame (not shown) from the frames between the first frame F1 and the third frame F3, or select a fifth frame (not shown) from the frames between the third frame F3 and the second frame F2, and define the position of the fourth object in the fourth frame and/or the fifth frame. Similarly, the object tracking module 135 may determine, according to the position of the fourth object in the fourth frame and its position in the third frame F3, the position of the fourth object in each frame between the fourth frame and the third frame F3, thereby estimating the motion track of the fourth object from the fourth frame to the third frame F3. Alternatively, the object tracking module 135 may determine, according to the position of the fourth object in the third frame F3 and its position in the fifth frame, the position of the fourth object in each frame between the third frame F3 and the fifth frame, thereby estimating the motion track of the fourth object from the third frame F3 to the fifth frame.
In other embodiments, the object definition module 133 may also determine a plurality of specific frames (not shown) between the first frame F1 and the second frame F2. For example, starting from the first frame F1, the object definition module 133 may determine one specific frame every N frames (N being a positive integer) at equal intervals until the second frame F2. Alternatively, the object definition module 133 may determine the specific frames at random between the first frame F1 and the second frame F2. After the specific frames are determined, the object definition module 133 can define the positions of a same object in any two adjacent frames among the first frame F1, the second frame F2 and the specific frames. The object tracking module 135 then determines, according to the positions of that object in the two adjacent frames, its position in each frame between the two adjacent frames.
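For illustration only, a minimal sketch of the equally spaced key-frame scheme is given below; the index arithmetic and helper names are assumptions, and tracking (as described next for the first object X1) would then be run within each adjacent pair.

```python
# Hedged sketch: determine specific frames every n frames between F1 and F2, then
# form the adjacent key-frame pairs within which objects are defined and tracked.
from typing import List, Tuple

def key_frame_indices(first_index: int, second_index: int, n: int) -> List[int]:
    indices = list(range(first_index, second_index, n))
    if not indices or indices[-1] != second_index:
        indices.append(second_index)
    return indices

def adjacent_pairs(indices: List[int]) -> List[Tuple[int, int]]:
    return list(zip(indices[:-1], indices[1:]))

# Example: F1 at frame 0, F2 at frame 100, one specific frame every 25 frames
# gives the pairs (0, 25), (25, 50), (50, 75), (75, 100).
print(adjacent_pairs(key_frame_indices(0, 100, 25)))
```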
After the object definition module 133 of the processing unit 13 defines one or more objects in any two non-adjacent frames of the video clip 22, the object tracking module 135 of the processing unit 13 can estimate the motion tracks of the one or more objects between the two non-adjacent frames according to various known video object tracking methods. As illustrated in Fig. 3, the object tracking module 135 estimates the motion track of the first object X1 from the first frame F1 to the second frame F2 by bidirectional tracking, which comprises forward tracking and backward tracking.
Assume there are three more frames between the first frame F1 and the second frame F2. The object tracking module 135 may use a bounding box B1 to define the position 40 of the first object X1 in the first frame F1, and then apply a known video object tracking technique to track the position of the first object X1 forward through the remaining four frames, where the estimated positions of the first object X1 in those four frames are each represented by a bounding box B2. The known tracking technique may be, but is not limited to, the Mean Shift algorithm, the Continuously Adaptive Mean Shift (CamShift) algorithm, the Ensemble Tracking algorithm, etc.
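For illustration only, the following sketch shows per-frame forward tracking seeded by the bounding box B1, using OpenCV's CamShift implementation of the Continuously Adaptive Mean Shift algorithm mentioned above; the frame-list input and the box format are assumptions rather than the patent's interface.

```python
# Hedged sketch: track the first object X1 forward from F1, frame by frame.
# The box B1 in the first frame seeds a hue-histogram model; CamShift then
# estimates a box (a B2) in each subsequent frame.
import cv2

def forward_track(frames, init_box):
    """frames: list of BGR images from F1 to F2; init_box: (x, y, w, h) of B1 in F1."""
    x, y, w, h = init_box
    roi = frames[0][y:y + h, x:x + w]
    hsv_roi = cv2.cvtColor(roi, cv2.COLOR_BGR2HSV)
    roi_hist = cv2.calcHist([hsv_roi], [0], None, [180], [0, 180])
    cv2.normalize(roi_hist, roi_hist, 0, 255, cv2.NORM_MINMAX)
    criteria = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 10, 1)

    window, boxes = init_box, [init_box]
    for frame in frames[1:]:
        hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
        back_proj = cv2.calcBackProject([hsv], [0], roi_hist, [0, 180], 1)
        _, window = cv2.CamShift(back_proj, window, criteria)
        boxes.append(window)        # estimated position of X1 in this frame
    return boxes
```

Backward tracking is the same procedure run on the reversed frame list, seeded by the box in the second frame F2.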
The object tracking module 135 may also calculate a weight for the estimated position of the first object X1 in each frame. For example, the weights of the positions of the first object X1 in the successive frames may be 1, 0.75, 0.5, 0.25 and 0 in order. A higher weight indicates that the bounding boxes B1 and B2 enclose the first object X1 more completely, and that the estimated position of the first object X1 in that frame is more accurate.
Similarly, the object tracking module 135 may use a bounding box B1 to define the position 42 of the first object X1 in the second frame F2, and then apply a known video object tracking technique to track the position of the first object X1 backward through the remaining four frames, where the estimated positions of the first object X1 in those four frames are likewise each represented by a bounding box B2. The object tracking module 135 may again calculate a weight for the estimated position of the first object X1 in each frame; for example, the weights in the successive frames may be 0, 0.25, 0.5, 0.75 and 1 in order.
Finally, the forward and backward estimates are combined by a weighted average using the weights calculated for the position of the first object X1 in each frame, so as to determine the position of the first object X1 in each frame and thereby estimate its motion track from the first frame F1 to the second frame F2. The motion tracks of the second object X2 and the third object X3 can be estimated in the same way.
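For illustration only, the weighted fusion of the forward and backward estimates can be sketched as follows; the linear weights reproduce the example values above (1, 0.75, 0.5, 0.25, 0 forward and 0, 0.25, 0.5, 0.75, 1 backward for five frames), while the box format is an assumption.

```python
# Hedged sketch: per-frame weighted average of the forward and backward boxes.
# backward_boxes must be re-ordered so that both lists run from F1 to F2,
# and both lists must contain at least two frames.
def fuse_tracks(forward_boxes, backward_boxes):
    assert len(forward_boxes) == len(backward_boxes) >= 2
    n = len(forward_boxes)
    fused = []
    for i, (fb, bb) in enumerate(zip(forward_boxes, backward_boxes)):
        w_fwd = 1.0 - i / (n - 1)   # 1 at F1, decreasing to 0 at F2
        w_bwd = i / (n - 1)         # 0 at F1, increasing to 1 at F2
        fused.append(tuple(w_fwd * f + w_bwd * b for f, b in zip(fb, bb)))
    return fused

# With five frames, the second frame combines the forward estimate with weight
# 0.75 and the backward estimate with weight 0.25, matching the example above.
```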
In other embodiments, the object tracking module 135 may also apply various known interpolation algorithms, based on the position 40 of the first object X1 in the first frame F1 and its position 42 in the second frame F2, to determine the position of the first object X1 in each frame between the first frame F1 and the second frame F2, thereby estimating the motion track of the first object X1 from the first frame F1 to the second frame F2.
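For illustration only, the interpolation alternative can be sketched with plain linear interpolation between the positions 40 and 42; the patent leaves the choice of interpolation algorithm open, and the box format is an assumption.

```python
# Hedged sketch: linearly interpolate the box of X1 from F1 to F2 (inclusive).
def interpolate_boxes(box_f1, box_f2, num_frames):
    """num_frames is the total number of frames from F1 to F2, at least two."""
    boxes = []
    for i in range(num_frames):
        t = i / (num_frames - 1)
        boxes.append(tuple((1 - t) * a + t * b for a, b in zip(box_f1, box_f2)))
    return boxes

# Example: an object moving from (10, 10, 40, 40) in F1 to (90, 50, 40, 40) in F2
# over five frames is placed at (50.0, 30.0, 40.0, 40.0) in the middle frame.
print(interpolate_boxes((10, 10, 40, 40), (90, 50, 40, 40), 5)[2])
```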
The second embodiment of the present invention describes a video object tracking method for an electronic device. The electronic device of this embodiment may comprise a video supply unit and a processing unit electrically connected to the video supply unit, and the processing unit may comprise a video segmentation module, an object definition module and an object tracking module. The video object tracking method of this embodiment is applicable to the electronic device 1 of the first embodiment.
The video object tracking method of this embodiment may also be executed by a computer program product. When an electronic device loads the computer program product and executes the instructions comprised in it, the video object tracking method of this embodiment is accomplished. The computer program product may be stored in a computer-readable storage medium, such as a read-only memory (ROM), a flash memory, a floppy disk, a hard disk, an optical disc, a portable drive, a magnetic tape, a database accessible over a network, or any other storage medium with the same function known to persons skilled in the art.
Fig. 4 is a schematic diagram of the video object tracking method of this embodiment. The method comprises steps S21, S23, S25, S27 and S29, and the order of the steps is not intended to limit the present invention.
As shown in Fig. 4, in step S21, a video is supplied by the video supply unit. In step S23, a clip of the video is captured by the video segmentation module, the clip comprising a plurality of consecutive frames. In step S25, the position of at least one first object is defined by the object definition module in a first frame of the consecutive frames. In step S27, the position of the at least one first object is defined by the object definition module in a second frame of the consecutive frames. Optionally, the first frame is the start frame of the consecutive frames and the second frame is the end frame of the consecutive frames. In step S29, according to the positions of the at least one first object in the first frame and the second frame, the position of the at least one first object in each frame between the first frame and the second frame is determined by the object tracking module.
In other embodiments, the electronic device may further comprise a user interface unit electrically connected to the processing unit, and the object definition module defines the positions of the at least one first object in the first frame and the second frame according to a user input from the user interface unit.
In other embodiments, the video object tracking method of this embodiment may further comprise the following steps: defining, by the object definition module, the position of at least one second object in the second frame; defining, by the object definition module, the position of the at least one second object in a third frame of the consecutive frames, the third frame being between the first frame and the second frame; and determining, by the object tracking module, the position of the at least one second object in each frame between the second frame and the third frame according to the positions of the at least one second object in the second frame and the third frame.
In other embodiments, the video object tracking method of this embodiment may further comprise the following steps: defining, by the object definition module, the position of at least one third object in the first frame; defining, by the object definition module, the position of the at least one third object in a third frame of the consecutive frames, the third frame being between the first frame and the second frame; and determining, by the object tracking module, the position of the at least one third object in each frame between the first frame and the third frame according to the positions of the at least one third object in the first frame and the third frame.
In other embodiments, the video object tracking method of this embodiment may further comprise the following steps: determining, by the object definition module, a plurality of specific frames between the first frame and the second frame; defining, by the object definition module, the positions of a same object in two adjacent frames among the first frame, the second frame and the specific frames; and determining, by the object tracking module, the position of the same object in each frame between the two adjacent frames according to the positions of the same object in the two adjacent frames.
In addition to the above steps, the video object tracking method of this embodiment may further comprise steps corresponding to all the operations of the electronic device 1 described in the first embodiment and accomplish the corresponding functions. Since persons having ordinary skill in the art can readily appreciate those steps based on the disclosure of the first embodiment, they are not repeated here.
In summary, the present invention provides an electronic device and a video object tracking method thereof. Through the operation of the aforementioned video supply unit and processing unit, the electronic device and its tracking method pick two frames out of the consecutive frames comprised in a video clip and define the positions of one or more objects in each of the two frames. Then, according to the positions of those objects in the two frames, the positions of the same objects in every frame between the two frames are determined, so that the motion tracks of the objects between the two frames are estimated. Unlike conventional video object tracking methods, which must estimate the motion tracks of different objects one after another, the present invention can estimate the motion tracks of multiple objects at once, effectively reducing the time spent on tracking several objects.
The above embodiments are intended only to illustrate some implementation aspects of the present invention and to explain its technical features, not to limit its scope of protection. Any arrangement that a person skilled in the art can easily change or that is equivalent falls within the scope claimed by the present invention, and the scope of the present invention is defined by the appended claims.

Claims (12)

1. An electronic device, characterized by comprising:
a video supply unit configured to supply a video; and
a processing unit electrically connected to the video supply unit and comprising:
a video segmentation module configured to capture a clip of the video, the clip comprising a plurality of consecutive frames;
an object definition module configured to define the position of at least one first object in a first frame of the consecutive frames and to define the position of the at least one first object in a second frame of the consecutive frames; and
an object tracking module configured to determine, according to the positions of the at least one first object in the first frame and the second frame, the position of the at least one first object in each frame between the first frame and the second frame.
2. The electronic device of claim 1, characterized in that:
the object definition module is further configured to:
define the position of at least one second object in the second frame; and
define the position of the at least one second object in a third frame of the consecutive frames, the third frame being between the first frame and the second frame; and
the object tracking module is further configured to:
determine, according to the positions of the at least one second object in the second frame and the third frame, the position of the at least one second object in each frame between the second frame and the third frame.
3. The electronic device of claim 1, characterized in that:
the object definition module is further configured to:
define the position of at least one third object in the first frame; and
define the position of the at least one third object in a third frame of the consecutive frames, the third frame being between the first frame and the second frame; and
the object tracking module is further configured to:
determine, according to the positions of the at least one third object in the first frame and the third frame, the position of the at least one third object in each frame between the first frame and the third frame.
4. The electronic device of claim 1, characterized in that the first frame is the start frame of the consecutive frames and the second frame is the end frame of the consecutive frames.
5. The electronic device of claim 1, characterized by further comprising a user interface unit electrically connected to the processing unit, wherein the object definition module defines the positions of the at least one first object in the first frame and the second frame according to a user input from the user interface unit.
6. The electronic device of claim 1, characterized in that:
the object definition module is further configured to:
determine a plurality of specific frames between the first frame and the second frame; and
define the positions of a same object in two adjacent frames among the first frame, the second frame and the specific frames; and
the object tracking module is further configured to:
determine, according to the positions of the same object in the two adjacent frames, the position of the same object in each frame between the two adjacent frames.
7. A video object tracking method for an electronic device, the electronic device comprising a video supply unit and a processing unit electrically connected to the video supply unit, the processing unit comprising a video segmentation module, an object definition module and an object tracking module, characterized in that the video object tracking method comprises the following steps:
(a) supplying a video by the video supply unit;
(b) capturing a clip of the video by the video segmentation module, the clip comprising a plurality of consecutive frames;
(c) defining, by the object definition module, the position of at least one first object in a first frame of the consecutive frames;
(d) defining, by the object definition module, the position of the at least one first object in a second frame of the consecutive frames; and
(e) determining, by the object tracking module, the position of the at least one first object in each frame between the first frame and the second frame according to the positions of the at least one first object in the first frame and the second frame.
8. The video object tracking method of claim 7, characterized by further comprising the following steps:
(f1) defining, by the object definition module, the position of at least one second object in the second frame;
(f2) defining, by the object definition module, the position of the at least one second object in a third frame of the consecutive frames, the third frame being between the first frame and the second frame; and
(f3) determining, by the object tracking module, the position of the at least one second object in each frame between the second frame and the third frame according to the positions of the at least one second object in the second frame and the third frame.
9. The video object tracking method of claim 7, characterized by further comprising the following steps:
(g1) defining, by the object definition module, the position of at least one third object in the first frame;
(g2) defining, by the object definition module, the position of the at least one third object in a third frame of the consecutive frames, the third frame being between the first frame and the second frame; and
(g3) determining, by the object tracking module, the position of the at least one third object in each frame between the first frame and the third frame according to the positions of the at least one third object in the first frame and the third frame.
10. The video object tracking method of claim 7, characterized in that the first frame is the start frame of the consecutive frames and the second frame is the end frame of the consecutive frames.
11. The video object tracking method of claim 7, characterized in that the electronic device further comprises a user interface unit electrically connected to the processing unit, and the object definition module defines the positions of the at least one first object in the first frame and the second frame according to a user input from the user interface unit.
12. The video object tracking method of claim 7, characterized by further comprising the following steps:
(h1) determining, by the object definition module, a plurality of specific frames between the first frame and the second frame;
(h2) defining, by the object definition module, the positions of a same object in two adjacent frames among the first frame, the second frame and the specific frames; and
(h3) determining, by the object tracking module, the position of the same object in each frame between the two adjacent frames according to the positions of the same object in the two adjacent frames.
CN201310573541.9A 2013-11-15 2013-11-15 Electronic device and method of tracking objects in video Pending CN104637069A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201310573541.9A CN104637069A (en) 2013-11-15 2013-11-15 Electronic device and method of tracking objects in video


Publications (1)

Publication Number Publication Date
CN104637069A true CN104637069A (en) 2015-05-20

Family

ID=53215775

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201310573541.9A Pending CN104637069A (en) 2013-11-15 2013-11-15 Electronic device and method of tracking objects in video

Country Status (1)

Country Link
CN (1) CN104637069A (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0942395A2 (en) * 1998-03-13 1999-09-15 Siemens Corporate Research, Inc. Method for digital video processing
CN1471004A (en) * 2002-06-24 2004-01-28 株式会社东芝 Cartoon image replay device, progress data and cartoon image replay method
CN101283376A (en) * 2005-10-14 2008-10-08 微软公司 Bi-directional tracking using trajectory segment analysis
CN101783020A (en) * 2010-03-04 2010-07-21 湖南大学 Video multi-target fast tracking method based on joint probability data association



Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20150520

WD01 Invention patent application deemed withdrawn after publication