US8831347B2 - Data segmenting apparatus and method - Google Patents
- Publication number: US8831347B2
- Authority
- US
- United States
- Prior art keywords
- candidate
- boundary
- data
- data segments
- segmenting
- Prior art date
- Legal status: Expired - Fee Related
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/40—Scenes; Scene-specific elements in video content
- G06V20/49—Segmenting video sequences, i.e. computational techniques such as parsing or cutting the sequence, low-level clustering or determining units such as shots or scenes
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/20—Image preprocessing
- G06V10/26—Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V30/00—Character recognition; Recognising digital ink; Document-oriented image-based pattern recognition
- G06V30/10—Character recognition
- G06V30/19—Recognition using electronic means
- G06V30/196—Recognition using electronic means using sequential comparisons of the image signals with a plurality of references
- G06V30/1983—Syntactic or structural pattern recognition, e.g. symbolic string recognition
Definitions
- the disclosure relates to the processing and analysis of electronic device data, and particularly, to an apparatus and method of segmenting an input data sequence automatically.
- an apparatus for segmenting an input data sequence may include: a boundary searching device configured to search for one or more candidate boundaries in the input data sequence, in order to obtain one or more candidate segmenting schemes formed by different combinations of these candidate boundaries; and an evaluating device configured to, with respect to each candidate segmenting scheme, generate an evaluation value of each candidate boundary in the candidate segmenting scheme, by evaluating a segmenting loss caused by using the candidate boundary in the candidate segmenting scheme to segment the input data sequence, and determine whether the candidate boundary is valid in the candidate segmenting scheme according to the evaluation value, the evaluation value reflecting association relationship between a pair of adjacent data segments adjoining to the candidate boundary and association relationship between each of one or more pairs of non-adjacent data segments, each pair of non-adjacent data segments comprising two non-adjacent data segments respectively located at two sides of the candidate boundary.
- a method for segmenting an input data sequence may include: searching for one or more candidate boundaries in the input data sequence, in order to obtain one or more candidate segmenting schemes formed by different combinations of these candidate boundaries; generating, with respect to each candidate segmenting scheme, an evaluation value of each candidate boundary in the candidate segmenting scheme by evaluating a segmenting loss caused by using the candidate boundary in the candidate segmenting scheme to segment the input data sequence; and determining whether the candidate boundary is valid in the candidate segmenting scheme according to the evaluation value, the evaluation value reflecting the association relationship between a pair of adjacent data segments adjoining to the candidate boundary and the association relationship between each of one or more pairs of non-adjacent data segments, each pair of non-adjacent data segments comprising two non-adjacent data segments respectively located at two sides of the candidate boundary.
- a computer implemented data segmenting apparatus may include: an input device configured to receive an input data sequence; and a processing device coupled to the input device.
- the processing device may include: a boundary searching device configured to search for one or more candidate boundaries in the input data sequence, in order to obtain one or more candidate segmenting schemes formed by different combinations of these candidate boundaries; and an evaluating device configured to, with respect to each candidate segmenting scheme, generate an evaluation value of each candidate boundary in the candidate segmenting scheme by evaluating a segmenting loss caused by using the candidate boundary in the candidate segmenting scheme to segment the input data sequence, and determine whether the candidate boundary is valid in the candidate segmenting scheme according to the evaluation value, the evaluation value reflecting the association relationship between a pair of adjacent data segments adjoining to the candidate boundary and the association relationship between each of one or more pairs of non-adjacent data segments, each pair of non-adjacent data segments comprising two non-adjacent data segments respectively located at two sides of the candidate boundary.
- some embodiments of the disclosure further provide a computer program for realizing the above method.
- some embodiments of the disclosure further provide computer program products, at least in the form of a computer-readable medium, on which computer program code for realizing the above method is recorded.
- FIG. 1 is a schematic block diagram illustrating the structure of a data segmenting apparatus according to an embodiment
- FIG. 2 is a schematic flow chart illustrating a data segmenting method according to the embodiment
- FIG. 3 is a schematic diagram illustrating an example of a data sequence
- FIG. 4 is a schematic block diagram illustrating the structure of an evaluating device for evaluating the validity of a candidate boundary in a candidate segmenting scheme according to a particular embodiment
- FIG. 5(A) is a schematic flow chart illustrating an example of a method of evaluating the validity of a candidate boundary in a candidate segmenting scheme
- FIG. 5(B) is a schematic flow chart illustrating another example of a method of evaluating the validity of a candidate boundary in a candidate segmenting scheme
- FIG. 6 is a schematic block diagram illustrating the structure of a data segmenting apparatus according to an embodiment
- FIG. 7(A) is a schematic flow chart illustrating a method of searching for an optimum data segmenting scheme according to a particular embodiment
- FIG. 7(B) is a schematic flow chart illustrating a method of searching for an optimum data segmenting scheme according to another particular embodiment
- FIG. 8 is a schematic flow chart illustrating a method of searching for an optimum data segmenting scheme according to another particular embodiment
- FIG. 9 is a schematic diagram illustrating an example of a dynamic programming (DP) search tree
- FIG. 10 shows a particular example of searching for an optimum preceding boundary sequence by using dynamic programming method
- FIG. 11 is a schematic diagram illustrating the determination of a random threshold to be used in boundary evaluation by using a statistic result of training samples
- FIG. 12 is a schematic diagram illustrating a process of voting on the candidate boundaries by using multiple DP steps.
- FIG. 13 is a schematic block diagram illustrating the structure of a computer for realizing the disclosure.
- Some embodiments of the disclosure provide apparatuses and methods for segmenting an input data sequence.
- the so-called data sequence refers to a sequence including one or more data segments arranged in a temporal and/or spatial order.
- the data sequence may be a scalar quantity sequence, a vector quantity sequence, text, audio, images or video (motion pictures), or any combination thereof.
- video data is taken as an example of the data sequence. It should be noted that these examples are merely illustrative, and the data sequence of the disclosure should not be regarded as being limited to them.
- FIG. 1 is a schematic block diagram illustrating the structure of a data segmenting apparatus according to an embodiment
- FIG. 2 is a schematic flow chart illustrating a data segmenting method according to the embodiment.
- the candidate boundaries of the data sequence are searched first. Then the candidate boundaries are evaluated. In the evaluation, not only the association relationship between a pair of adjacent data segments adjoining to the candidate boundary, but also the association relationships between one or more pairs of non-adjacent data segments are considered.
- data segmenting apparatus 100 may include a boundary searching device 101 and an evaluating device 103 .
- the data segmenting apparatus 100 may utilize the method as shown in FIG. 2 .
- the functions of the devices in the data segmenting apparatus 100 are described below with reference to the method flow shown in FIG. 2 .
- the boundary searching device 101 is configured to search for one or more candidate boundaries in the input data sequence (step 202 ).
- the different combinations of the candidate boundaries obtained by the boundary searching device 101 may form various different candidate segmenting schemes.
- the boundary searching device 101 may search the candidate boundaries by utilizing any appropriate method, for example, the boundary searching device 101 may utilize a traversal method, an equal-interval sampling method or a pre-segmenting method using sliding window (only the positions in the former half and latter half that have apparent differences therebetween are selected) or other methods to search the candidate boundaries, which are not enumerated and detailed herein.
- FIG. 3 shows an example of data sequence.
- b0, b1, b2, …, bn represent the boundaries between the data units. It is assumed that, after the searching in step 202, the boundary searching device 101 selects b4, b6 and b9 as the candidate boundaries.
- b0 and b10 are the start and end points of the data sequence, respectively, and should be included in each candidate segmenting scheme.
- the different combinations of these candidate boundaries together with the start and end points (b0, b4, b6, b9, b10) may form a plurality of candidate segmenting schemes different from each other.
- FIG. 3 shows only the candidate segmenting scheme that includes all the candidate boundaries (b0, b4, b6, b9, b10).
- this candidate segmenting scheme divides the data sequence into four data segments a, b, c, d. Each of the data segments may include at least one data unit.
- a data unit si (1 ≤ i ≤ n) may include one shot (each shot may include one or more image frames); and a data segment may be a scene including a plurality of contiguous shots.
- taking the data segmenting scheme including all the candidate boundaries (b0, b4, b6, b9, b10) shown in FIG. 3 as an example, data segment a includes data units s1, s2, s3, s4; data segment b includes data units s5, s6; data segment c includes data units s7, s8, s9; and data segment d includes a data unit s10.
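The combinatorial step above can be sketched in Python: given the candidate boundaries found in step 202, every subset of them, together with the fixed start and end points, yields one candidate segmenting scheme. The function name and data layout here are illustrative, not from the patent.

```python
from itertools import chain, combinations

def candidate_schemes(candidates, start, end):
    """Enumerate candidate segmenting schemes: every subset of the
    candidate boundaries, always including the start and end points."""
    subsets = chain.from_iterable(
        combinations(candidates, k) for k in range(len(candidates) + 1))
    return [[start, *s, end] for s in subsets]

# For the example of FIG. 3 (candidates b4, b6, b9) this yields
# 2**3 = 8 schemes, from [0, 10] up to [0, 4, 6, 9, 10].
schemes = candidate_schemes([4, 6, 9], start=0, end=10)
```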
- the boundary searching device 101 sends the search result to the evaluating device 103 .
- the evaluating device 103 is configured to calculate the evaluation value of a candidate boundary in a candidate segmenting scheme (step 204). Particularly, for a candidate boundary in a candidate segmenting scheme, the evaluating device 103 evaluates the segmenting loss resulting from segmenting the data sequence by using the candidate boundary, according to the association relationship between the pair of adjacent data segments adjoining to the candidate boundary and the association relationship between each of one or more pairs of non-adjacent data segments, as the evaluation value of the candidate boundary.
- the evaluation value of a candidate boundary reflects the association relationship between a pair of adjacent data segments adjoining to the candidate boundary and the association relationship between each pair of one or more pairs of non-adjacent data segments.
- a so-called pair of non-adjacent data segments refers to two data segments that are respectively located at the two sides of the candidate boundary and are not adjacent to each other.
- the association relationship between two data segments may be a feature value representing the similarity of the two data segments in one or more features.
- the feature may be any feature that characterizes one or more characteristics of the data sequence, and is not limited to any example herein. In an example in which the input data sequence is video, the feature may be color, texture and/or content, etc.
- in an example in which the input data sequence is text, the feature may be keyword and/or content, etc.
- in an example in which the input data sequence is audio, the feature may be spectrum and/or content, etc. Any appropriate feature may be selected according to the actual application scenario, and the disclosure should not be limited to the examples.
- the evaluation value (i.e. the segmenting loss resulting from segmenting the data by use of this boundary) calculated by the evaluating device 103 may be represented by the following formula:
- E(β) = Σ⟨α,γ⟩ F[A(α,γ), A(α,α), A(γ,γ)]  (1)
- alternatively, the evaluation value in the above formula may be defined as follows:
- E(β) = Σ⟨α,γ⟩ F[A(α,γ), A(α, α∪γ), A(γ, α∪γ)]  (1a)
- E(β) represents the loss function (i.e. the evaluation value of the boundary) of segmenting the data sequence at the candidate boundary β; ⟨α,γ⟩ ranges over the pairs of data segments located at the two sides of the boundary β in the data sequence (α represents a data segment preceding the boundary β, and γ represents a data segment following the boundary β).
- the pair of adjacent data segments adjoining to the boundary includes data segments b and c
- the pairs of non-adjacent data segments located at the two sides of the boundary include the pair of data segments a and c, the pair of data segments b and d, and the pair of data segments a and d. Therefore, the evaluating device 103 may calculate the evaluation value of the boundary b6 based on the association relationships between all four of these pairs of data segments, instead of based only on the association relationship between the pair of adjacent data segments b and c adjoining to the boundary.
- the evaluating device 103 determines whether the candidate boundary is valid in the candidate segmenting scheme based on the evaluation value (step 206 ).
- the evaluating device 103 may determine whether the evaluation value of the candidate boundary meets a predetermined relationship with a threshold (e.g. whether the evaluation value exceeds the threshold), and if yes, determines that the candidate boundary is valid; otherwise, determines that the candidate boundary is invalid.
- the threshold herein may be determined according to the actual application scenario, for example, it may be a predetermined theoretical value or a predetermined empirical value, or may be a value obtained by training data samples, thus it should not be limited to any particular value.
- the input data sequence may be divided into one or more data segments such that the data within the same data segment are similar to each other in certain features, while the data of a data segment differ distinctly, in the corresponding features, from the data in the data segments adjacent to it.
- the apparatus and method shown in FIG. 1 and FIG. 2 take into consideration not only the association relationship between the pair of adjacent data segments adjoining to the candidate boundary, but also the association relationship of one or more pairs of non-adjacent data segments. Therefore, compared with the method of evaluating a boundary based on only the association relationship between the pair of adjacent data segments adjoining to the boundary, the apparatus and method shown in FIG. 1 and FIG. 2 can evaluate the validity of the boundary with an improved accuracy and the data segmenting scheme thus obtained is more reasonable.
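As a rough sketch of this evaluation (with hypothetical helper names: `assoc` stands for the association measure A, and `F` for the combining function of formula (1)), the loss of a boundary can be accumulated over the adjacent pair and every non-adjacent pair straddling the boundary:

```python
def boundary_evaluation(segments, k, assoc, F):
    """Evaluate the boundary between segments[k-1] and segments[k]:
    sum F over every pair (alpha, gamma) with alpha preceding and gamma
    following the boundary -- the adjacent pair plus all non-adjacent pairs."""
    total = 0.0
    for alpha in segments[:k]:      # segments on the left of the boundary
        for gamma in segments[k:]:  # segments on the right of the boundary
            total += F(assoc(alpha, gamma), assoc(alpha, alpha), assoc(gamma, gamma))
    return total

# For the boundary b6 of FIG. 3 (k=2, segments a, b, c, d) this visits
# the four pairs (a,c), (a,d), (b,c), (b,d).
```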
- FIG. 4 is a schematic block diagram showing the structure of the evaluating device 103 according to a particular embodiment
- FIGS. 5(A) and 5(B) are schematic flow charts respectively showing two particular examples of step 204 in FIG. 2 according to the particular embodiment.
- the evaluating device 103 may include an association estimating device 103 - 1 , a loss calculating device 103 - 2 and a validity judging device 103 - 3 .
- the evaluating device 103 may utilize the method shown in FIG. 5(A) or 5(B) to evaluate a candidate boundary.
- the association estimating device 103 - 1 calculates the association relationship between the pair of adjacent data segments adjoining to the candidate boundary, and calculates the association relationship between each pair of the one or more pairs of non-adjacent data segments (step 204 - 1 A), and outputs the calculated association relationships to the loss calculating device 103 - 2 .
- the loss calculating device 103 - 2 estimates the segmenting loss of the candidate boundary according to the association relationships, as the evaluation value of the candidate boundary (step 204 - 3 A).
- the validity judging device 103 - 3 determines whether the candidate boundary is valid in the candidate segmenting scheme according to the evaluation value obtained by the loss calculating device 103 - 2 (step 206 ).
- the validity judging device 103 - 3 may determine whether the evaluation value of the candidate boundary meets a predetermined relationship with a certain threshold (e.g. whether the evaluation value exceeds the threshold), and if yes, determines that the candidate boundary is valid; otherwise, determines that the candidate boundary is invalid.
- E⃗(b,c) represents the evaluation value of the candidate boundary b6 between the data segments b and c;
- C(b,c) represents the association relationship between the pair of adjacent data segments b and c adjoining to the candidate boundary b6;
- C(a,c), C(b,d) and C(a,d) represent the association relationships between the pair of non-adjacent data segments a and c, the pair of non-adjacent data segments b and d, and the pair of non-adjacent data segments a and d, respectively.
- the association estimating device 103 - 1 of the evaluating device 103 may utilize any appropriate method to estimate the association relationship between two data segments.
- a graph cut function C( ) is used to calculate the association relationship. That is, the evaluation value of the candidate boundary b 6 may be based on a plurality of graph cut function values between a plurality of pairs of data segments at the two sides of the boundary.
- E⃗(b,c) is a feature vector formed by the plurality of cascaded graph cut function values, representing the evaluation value of the candidate boundary b6 between the data segments b and c.
- the association estimating device 103 - 1 of the evaluating device 103 may calculate the association relationship between two data segments based on the similarity between the two data segments and the similarity between the data units within each data segment of the two data segments. Taking the data segments a and d shown in FIG. 3 as an example, the association relationship between the two data segments may be calculated by the following formula:
- A(a,d) represents the similarity between the data segments a and d in a certain feature or certain features (the features are described above and not repeated herein), A(a,a) represents the similarity of the data inside the data segment a (the similarity between the data units inside the data segment a), and A(d,d) represents the similarity of the data inside the data segment d (the similarity between the data units inside the data segment d).
- the association estimating device 103 - 1 may obtain the similarity between two data segments by calculating the similarities between data units included in different ones of the two data segments.
- the association estimating device 103 - 1 may obtain the similarity within a single data segment by calculating the similarities between the data units within the single data segment. For example, the following formula (4) gives a particular example of calculating the similarity between the two data segments a and d:
- A(a,d) = Σ si∈a Σ sj∈d S(si, sj)  (4)
- the formula (4) may also be used to calculate the similarity of the data inside a single data segment, such as A(a,a) and A(d,d).
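A minimal sketch of formula (4), assuming `unit_sim` implements the unit-level similarity S(si, sj):

```python
def segment_similarity(seg_a, seg_d, unit_sim):
    """Formula (4): similarity A(a,d) between two data segments as the
    sum of pairwise unit similarities S(s_i, s_j). Calling it with the
    same segment twice gives the within-segment similarity, e.g. A(a,a)."""
    return sum(unit_sim(si, sj) for si in seg_a for sj in seg_d)
```

For example, with two data units in one segment, one unit in the other, and a toy similarity `unit_sim(x, y) = x * y`, `segment_similarity([1, 2], [3], lambda x, y: x * y)` sums 1·3 + 2·3.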
- the association estimating device 103 - 1 may utilize the lengths of the data units within the two data segments to weight the similarity between two data units from different ones of the two data segments, respectively.
- the lengths of the data units in the two data segments may be used as the weights to calculate a weighted average similarity between the two data segments (or an average similarity normalized by using the lengths), as the similarity between the two data segments.
- the association estimating device 103 - 1 may weight the similarity between the data units within a single data segment by using the lengths of the data units within the data segment.
- the lengths of the data units within the data segment may be used as the weights to calculate the weighted average similarity of the data inside the data segment (or the average similarity normalized by using the lengths), as the similarity of the data within the data segment.
- the following formula may be used to calculate the similarity between the two data segments a and d:
- A(a,d) = [Σ si∈a Σ sj∈d S(si,sj)·L(si)·L(sj)] / [Σ si∈a Σ sj∈d L(si)·L(sj)]  (4a)
- the formula (4a) may also be used to calculate the similarity of the data inside a single data segment, such as A(a,a) and A(d,d).
- si represents a data unit in the data segment a
- sj represents a data unit in the data segment d
- S(si,sj) represents the similarity between si and sj in a certain feature or certain features (in the case of video or an image sequence, for example, the feature may be color, texture and/or content, etc.).
- L(si) represents the length of the data unit si
- L(sj) represents the length of the data unit sj.
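Formula (4a) can be sketched the same way, with the unit lengths L(si) and L(sj) serving as pair weights (`length` is an assumed accessor, not from the patent):

```python
def weighted_segment_similarity(seg_a, seg_d, unit_sim, length):
    """Formula (4a): length-weighted, normalized average of the pairwise
    unit similarities; L(s_i)*L(s_j) is the weight of the pair (s_i, s_j)."""
    num = sum(unit_sim(si, sj) * length(si) * length(sj)
              for si in seg_a for sj in seg_d)
    den = sum(length(si) * length(sj)
              for si in seg_a for sj in seg_d)
    return num / den
```

Because of the normalization, a constant unit similarity yields that same constant regardless of the unit lengths.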
- the association estimating device 103 - 1 may calculate the similarity S(•) between two data units by using any appropriate feature(s) and an appropriate method according to the actual application scenario.
- in an example, the input data sequence is a video sequence.
- SC(si,sj) represents the similarity between the data units in color
- fm represents a frame in si
- fn represents a frame in sj
- HC(•) represents the HSV histogram of a frame (e.g. an HSV histogram of 64 bins, wherein each channel corresponds to two bits, and thus 6 bits in total)
- mean(•) represents a mean value function
- I(•) represents the intersection of the histograms (i.e. the overlapping ratio of the two histograms of the two frames).
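The color similarity can be sketched as follows, assuming each shot is given as a list of precomputed, normalized 64-bin HSV histograms (one per frame); the intersection of two normalized histograms is the sum of bin-wise minima:

```python
import numpy as np

def hist_intersection(h1, h2):
    """I(.): overlapping ratio of two normalized histograms."""
    return float(np.minimum(h1, h2).sum())

def color_similarity(shot_i, shot_j):
    """S_C(s_i, s_j): mean histogram intersection over all frame pairs
    (f_m, f_n) drawn from the two shots."""
    return float(np.mean([hist_intersection(hm, hn)
                          for hm in shot_i for hn in shot_j]))
```

Two shots whose frames have identical histograms score 1.0; disjoint color distributions score near 0.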
- the method of FIG. 5(B) is similar to that of FIG. 5(A); the difference is that, in FIG. 5(B), after the association estimating device 103 - 1 calculates the association relationships between the pairs of data segments in step 204 - 1 B (which is similar to step 204 - 1 A), the loss calculating device 103 - 2 weights the association relationships by using the distance between the data segments in each pair (step 204 - 2 B), and estimates the segmenting loss of the candidate boundary by using the weighted association relationships, as the evaluation value of the boundary (step 204 - 3 B).
- the validity judging device 103 - 3 determines whether the candidate boundary is valid in the corresponding candidate segmenting scheme according to the evaluation value obtained by the loss calculating device 103 - 2 (step 206). As a particular example, the validity judging device 103 - 3 may determine whether the evaluation value of the candidate boundary meets a predetermined relationship with a certain threshold (e.g. whether the evaluation value exceeds the threshold), and if yes, determine that the candidate boundary is valid; otherwise, determine that the candidate boundary is invalid.
- the evaluation value of the candidate boundary β may be calculated by using the following weighted function:
- E′(β) = Σ⟨α,γ⟩ Dd(α,γ)·F[A(α,γ), A(α,α), A(γ,γ)]  (6)
- alternatively, the evaluation value of the candidate boundary β may be calculated by using the following weighted function:
- E′(β) = Σ⟨α,γ⟩ Dd(α,γ)·F[A(α,γ), A(α, α∪γ), A(γ, α∪γ)]  (6a)
- E′(β) represents the distance-weighted evaluation value of the candidate boundary β;
- Dd(α,γ) represents an attenuation function which decreases monotonically with the distance between the data segments α and γ.
- the association relationship calculated using formula (3) may be weighted by using formula (7): C′(a,d) = exp[−v·D(a,d)]·C(a,d)  (7)
- C′(a,d) represents the weighted association relationship between the two data segments a and d
- D(a,d) represents the distance between the two data segments a and d
- v represents a predetermined constant.
- exp[−v·D(a,d)] in formula (7) is a particular example of an attenuation function which decreases monotonically with the distance between the two data segments a and d. It should be appreciated that any other appropriate attenuation function may be used, which is not detailed here.
- the distance between two data segments may be calculated by using any appropriate method.
- the distance between two data segments may be the number of data segments between the two data segments.
- the distance between two data segments may be the number of data units between the two data segments.
- the distance between two data segments may be the total length of data units between the two data segments (for example the length here may be represented by using the number of frames or time).
- the disclosure is not limited to these.
- the constant v may be an empirical value or experimental value predetermined according to the actual application scenarios.
- the value of v may be between 0.02 and 0.1, and preferably, the value of v may be 0.05 (1/second), and so on.
- the above values are merely examples, it should be appreciated that v may have other appropriate values according to the different application scenarios, which is not detailed herein.
- the distance between a pair of data segments is used to weight the association relationship between the data segments of the pair. Therefore, a pair of data segments having a larger distance therebetween corresponds to a smaller weight and thus contributes less to the resultant evaluation value.
- in this way, when two data segments are far apart, they will be divided apart even if the association relationship between them is relatively large (i.e. the similarity is relatively high). With this method, the evaluation of the candidate boundary is more reasonable, and thus the resultant data segmenting scheme is better.
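A sketch of the exponential attenuation of formula (7); `v` defaults to the 0.05 (1/second) value mentioned below, and the function name is illustrative:

```python
import math

def weighted_association(assoc_value, distance, v=0.05):
    """Formula (7): attenuate the association C(a,d) between two segments
    by exp(-v * D(a,d)), so distant pairs contribute less to the
    resultant evaluation value."""
    return math.exp(-v * distance) * assoc_value
```

At distance 0 the association is unchanged; it decays smoothly as the segments grow farther apart.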
- p represents the projection that can distinguish the correct boundaries from the erroneous ones
- Ē(b,c) represents the converted (projected) evaluation value of the boundary b6.
- the evaluating device 103 may determine whether the evaluation value (e.g. Ē(b,c) obtained by using formula (8)) of the candidate boundary meets a predetermined relationship with a certain threshold (e.g. whether the evaluation value exceeds the threshold), and if yes, determine that the candidate boundary is valid; otherwise, determine that the candidate boundary is invalid.
- the threshold may be a theoretical value or an empirical value predetermined based on the actual application scenarios, which is not detailed herein.
- the projection p in formula (8) may be obtained by training on the training samples by use of the Linear Discriminant Analysis (LDA) method.
- μ+ and Σ+ represent the mean value and covariance of the feature vectors of the positive training samples, respectively
- μ− and Σ− represent the mean value and covariance of the feature vectors of the negative training samples, respectively
- the positive training samples may include the samples generated by using the correct boundaries
- the negative training samples may include the samples generated by using the erroneous boundaries (e.g. erroneous boundaries that are randomly selected).
- the above method of generating the projection p is merely illustrative. In other examples, other appropriate methods (e.g. a support vector machine) may be used to obtain the projection p, which is not detailed herein.
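One common way to obtain such a projection, sketched here under the assumption that the standard Fisher form p = (Σ+ + Σ−)⁻¹(μ+ − μ−) is used (the text above does not reproduce the exact expression):

```python
import numpy as np

def lda_projection(pos, neg):
    """Fisher-style LDA projection separating positive (correct-boundary)
    feature vectors from negative (erroneous-boundary) ones."""
    pos, neg = np.asarray(pos, float), np.asarray(neg, float)
    mu_p, mu_n = pos.mean(axis=0), neg.mean(axis=0)
    # within-class scatter: sum of the two sample covariances
    sw = np.cov(pos, rowvar=False) + np.cov(neg, rowvar=False)
    return np.linalg.solve(sw, mu_p - mu_n)
```

Projecting a boundary's feature vector onto p yields the scalar score that is then compared against the threshold.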
- FIG. 6 is a schematic block diagram illustrating the structure of a data segmenting apparatus according to another embodiment. Similar to the data segmenting apparatus 100 in FIG. 1, the data segmenting apparatus 600 may include a boundary searching device 601 and an evaluating device 603 configured to evaluate the candidate boundaries. The difference is that the data segmenting apparatus 600 further includes a scheme selecting device 607.
- the boundary searching device 601 and the evaluating device 603 in the data segmenting apparatus 600 may utilize the method in the embodiments/examples described above with reference to FIG. 2 and FIG. 4 to evaluate the candidate boundaries, that is, the boundary searching device 601 and the evaluating device 603 may have functions and structures similar to those of the boundary searching device 101 and the evaluating device 103 in the embodiments or examples described above with reference to FIG. 1 and FIG. 3 , which is not repeated herein.
- the scheme selecting device 607 is configured to select, based on the evaluation (the validity and the evaluation value of each candidate boundary) made by the evaluating device 603 on the candidate boundaries in each candidate segmenting scheme, an optimum segmenting scheme from the one or more candidate segmenting schemes formed by the one or more candidate boundaries found by the boundary searching device 601.
- the scheme selecting device 607 may utilize the method as shown in FIG. 7(A) or FIG. 7(B) to search for the optimum segmenting scheme.
- the scheme selecting device 607 according to the embodiment is described below with reference to FIG. 7(A) .
- the scheme selecting device 607 may firstly remove, based on the validity of the candidate boundaries in each candidate segmenting scheme, one or more segmenting schemes containing an invalid boundary from one or more candidate segmenting schemes (step 708 A); then, calculate the sum or the average value or the weighted average value of the evaluation values of all the candidate boundaries in each remaining candidate segmenting scheme, as the evaluation value of each remaining candidate segmenting scheme (step 710 A), and select an optimum one from the remaining candidate segmenting schemes based on the evaluation value of each remaining candidate segmenting scheme, as the optimum candidate segmenting scheme (step 712 A).
- the scheme selecting device 607 may firstly remove, based on the validity of the candidate boundaries in each candidate segmenting scheme, one or more segmenting schemes containing an invalid boundary from the one or more candidate segmenting schemes (step 708 B); calculate the number of boundaries included in each remaining candidate segmenting scheme (step 714 B); and then determine whether there is more than one candidate segmenting scheme having the same maximum number of boundaries (step 716 B). If not, the candidate segmenting scheme having the maximum number of boundaries is determined as the optimum segmenting scheme (step 718 B).
- the scheme selecting device 607 calculates the sum or the average value or the weighted average value of the evaluation values of all the candidate boundaries in each of the more than one candidate segmenting scheme having the same maximum number of boundaries, as the evaluation value of each of the more than one candidate segmenting scheme (step 710 B), and selects an optimum one from the more than one candidate segmenting scheme having the same maximum number of boundaries, based on the evaluation value of each of the more than one candidate segmenting scheme, as the optimum candidate segmenting scheme (step 712 B).
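The two selection strategies of FIG. 7(A) and FIG. 7(B) can be sketched as follows. This is an illustrative reading, assuming each candidate segmenting scheme is given as a list of (validity, evaluation value) pairs for its boundaries and that lower evaluation values are better (matching the validity test "evaluation value less than a threshold"); `select_scheme` and its argument names are hypothetical.

```python
def select_scheme(schemes, use_boundary_count=False):
    """Pick an optimum segmenting scheme. Each scheme is a list of
    (is_valid, evaluation_value) pairs, one pair per candidate boundary."""
    # Step 708: discard any scheme containing an invalid boundary.
    valid = [s for s in schemes if all(ok for ok, _ in s)]
    if not valid:
        return None
    if use_boundary_count:
        # Steps 714B-718B: prefer schemes with the most boundaries; ties
        # fall through to the evaluation-value comparison below.
        best_len = max(len(s) for s in valid)
        valid = [s for s in valid if len(s) == best_len]
    # Steps 710/712: score each remaining scheme by the average of its
    # boundaries' evaluation values and pick the best (lowest) one.
    return min(valid, key=lambda s: sum(v for _, v in s) / len(s))
```

A sum or weighted average could replace the plain average without changing the structure.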
- the scheme selecting device 607 may select the optimum segmenting scheme based on Dynamic Programming (DP) method.
- the DP method decomposes the problem of searching for the optimum segmenting boundary sequence into recursive searching steps. In each DP step, candidate preceding boundary sequences, instead of a single neighboring boundary, are searched.
- the scheme selecting device 607 may search for the optimum segmenting scheme by iteratively performing the DP method steps as shown in FIG. 8 .
- FIG. 8 shows the DP method steps of searching for the optimum segmenting scheme in the candidate boundaries according to an embodiment.
- a predetermined criterion is used to select one with the best performance among the candidate segmenting schemes. Particularly, for a candidate boundary or a candidate boundary sequence segment containing at least one candidate boundary, an optimum one is selected from candidate preceding boundary sequences. For different candidate boundaries or candidate boundary sequence segments, the above steps are iteratively performed, until a complete data segmenting scheme is formed.
- its candidate preceding boundary sequence is a boundary sequence which contains at least one candidate boundary preceding the candidate boundary sequence segment according to an order of processing the input data sequence and does not contain any candidate boundary following the candidate boundary sequence segment according to the order of processing the input data sequence.
- each candidate boundary in the candidate preceding boundary sequence should be determined (for example by evaluating device 103 / 603 ) as valid.
- the so-called order of processing the input data sequence may be the order of sequentially processing the input data sequence from the beginning to the end of the data sequence.
- each candidate preceding boundary sequence is a boundary sequence that contains at least one candidate boundary preceding the candidate boundary sequence segment according to the data arrangement order of the input data sequence and does not contain any candidate boundary following the candidate boundary sequence segment according to the data arrangement order of the input data sequence.
- the beginning boundary of each candidate preceding boundary sequence is the beginning of the input data sequence.
- each candidate preceding boundary sequence is a boundary sequence that contains at least one candidate boundary following the candidate boundary sequence segment according to the data arrangement order of the input data sequence and does not contain any candidate boundary preceding the candidate boundary sequence segment according to the data arrangement order of the input data sequence.
- the beginning boundary of each candidate preceding boundary sequence is the end of the input data sequence.
- the scheme selecting device 607 is configured to find one or more candidate preceding boundary sequences corresponding to the candidate boundary sequence segment. Then in step 822 the scheme selecting device 607 selects one from the candidate preceding boundary sequences based on a predetermined selection criterion, as the preceding boundary sequence of the candidate boundary sequence segment. This preceding boundary sequence together with the candidate boundary sequence segment constitutes the candidate preceding boundary sequence of a candidate boundary sequence segment to be processed in the following steps.
- the scheme selecting device 607 may save the preceding boundary sequence in a storage device (not shown in the figures; the storage device may be a memory configured inside the data segmenting apparatus, or may be a memory that is provided outside the data segmenting apparatus and may be accessed by the components of the data segmenting apparatus) for the following DP steps.
- the selection criterion may be as follows:
- the scheme selecting device 607 may repeat steps 820 and 822 , until a complete data segmenting scheme is formed.
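The iterative DP steps above can be sketched as a simple recursion over candidate boundaries in data order. This is a hedged illustration under assumptions, not the patented method itself: `dp_search`, `eval_fn`, and the convention "a boundary is valid when its evaluation value is below a threshold" are choices made for the example.

```python
def dp_search(boundaries, eval_fn, threshold):
    """best[b] holds (sequence ending at boundary b, sum of evaluation values).
    eval_fn(prev, cur) returns the evaluation value of 'cur' given the
    preceding boundary 'prev'; lower is better, valid when below threshold."""
    best = {}
    for i, cur in enumerate(boundaries):
        best[cur] = ([cur], 0.0)  # fall back: start a new sequence at cur
        for prev in boundaries[:i]:
            v = eval_fn(prev, cur)
            if v >= threshold:
                continue  # cur is invalid after prev (cf. step 1024); skip
            seq, total = best[prev]
            cand_seq, cand_total = seq + [cur], total + v
            cur_seq, cur_total = best[cur]
            # Selection criterion: more boundaries first; ties broken by the
            # synthesized (average) evaluation value, lower being better.
            if (len(cand_seq) > len(cur_seq)
                    or (len(cand_seq) == len(cur_seq)
                        and cand_total / len(cand_seq) < cur_total / len(cur_seq))):
                best[cur] = (cand_seq, cand_total)
    # The complete segmenting scheme is the best sequence over all boundaries.
    return max(best.values(), key=lambda bc: (len(bc[0]), -bc[1]))[0]
```

Because each boundary stores only its best preceding sequence, the search avoids re-enumerating full combinations, which is the speedup the passage above refers to.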
- the method shown in FIG. 8 may accelerate the searching and thus it may be more applicable to bulk data searching.
- FIG. 9 shows an example of a DP search tree, in which 6 candidate boundaries are represented by the digits 0, 1, 2, 3, 4, 5, respectively. FIG. 9 illustrates the different candidate segmenting schemes formed by different combinations of the 6 candidate boundaries.
- the candidate boundary sequence (2,3,5) is processed.
- a possible preceding boundary sequence of the candidate boundary sequence is (0,2,3).
- Each DP step is to select, based on the above selection criterion, an optimum preceding boundary sequence which has the maximum number of boundaries or is optimum synthetically in the evaluation values of its boundaries.
- FIG. 10 shows a particular example of a searching step in the DP.
- the currently processed boundary sequence is (b_p, b_c, b_n), wherein b_p, b_c, b_n represent the boundaries in the sequence.
- in step 1020, a candidate preceding boundary sequence of (b_p, b_c, b_n) is searched for; this candidate preceding boundary sequence is represented by (b_f, b_p, b_c).
- b_f represents a candidate boundary preceding the sequence (b_p, b_c, b_n) according to the data arrangement order of the data sequence.
- in step 1022, the evaluation value of the candidate boundary b_p in the sequence (b_f, b_p, b_c, b_n) is calculated, and in step 1024 it is determined whether the evaluation value meets a predetermined relationship with a threshold (e.g. is less than the threshold). If yes, it is determined that the candidate boundary b_p is valid in the sequence (b_f, b_p, b_c, b_n) and the processing goes to the next step 1026; otherwise, it is determined that the candidate boundary b_p is invalid in the sequence (b_f, b_p, b_c, b_n) and the processing returns to step 1020 to search for another candidate preceding boundary sequence.
- in step 1026, it is determined whether a current preceding boundary sequence of (b_p, b_c, b_n) has already been found in a previous DP step. If yes, the processing goes to step 1030; otherwise, the processing goes to step 1028, in which (b_f, b_p, b_c) is set as the current preceding boundary sequence of (b_p, b_c, b_n), and then the processing returns to step 1020 to process another candidate preceding boundary sequence.
- in step 1030, the synthesized evaluation values of (b_f, b_p, b_c) and of the current preceding boundary sequence are compared, based on the evaluation value of each candidate boundary. If the synthesized evaluation value of (b_f, b_p, b_c) excels that of the current preceding boundary sequence, (b_f, b_p, b_c) is set as the new current preceding boundary sequence (i.e. the current preceding boundary sequence is replaced with (b_f, b_p, b_c)); otherwise, the processing returns to step 1020 to process another candidate preceding boundary sequence.
- the synthesized evaluation value of a boundary sequence may be the average value or weighted average value of the evaluation values of the boundaries contained in the boundary sequence.
- the above DP method may be repeated many times (the DP step is performed iteratively, until a complete data segmenting scheme is formed) by using the Monte Carlo method.
- the boundaries in an effective boundary sequence may be voted.
- a data segmenting scheme may be selected based on the number of votes of each candidate boundary. For example, a candidate boundary with the number of votes exceeding a threshold may be determined as a correct boundary (it is appreciated that the threshold may be determined as required by the actual application, and should not be limited to any particular value), as a boundary to be included in the resultant data segmenting scheme.
- FIG. 12 shows an example in which the boundaries are voted after multiple rounds of the DP method. In FIG. 12, the horizontal axis represents the data sequence, where 0, 1, . . . , 15 represent the numbers of the candidate boundaries in the sequence, and the dark-colored block above each candidate boundary represents the number of votes for the boundary. The taller the block is, the larger the number of votes is.
- One or more candidate boundaries corresponding to the maximum number of votes may be selected as the correct boundaries. For example, the candidate boundaries 6 and 10 gain the most votes, and thus may be selected as correct boundaries. In this way the resultant data segmenting scheme may be (0, 6, 10, 15).
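The Monte Carlo voting described above can be sketched as follows, assuming `run_dp_once` performs one full DP search (e.g. with a freshly drawn random threshold) and returns one complete boundary sequence; the function and parameter names are illustrative, not from the patent.

```python
from collections import Counter

def vote_boundaries(candidates, run_dp_once, rounds, vote_threshold):
    """Repeat the DP search 'rounds' times, give each boundary in every
    resulting sequence one vote, and keep the candidate boundaries whose
    vote count exceeds 'vote_threshold' (cf. FIG. 12)."""
    votes = Counter()
    for _ in range(rounds):
        for b in run_dp_once():
            votes[b] += 1
    return sorted(b for b in candidates if votes[b] > vote_threshold)
```

The vote threshold, as the passage notes, is application-dependent; it could equally be replaced by "take the boundaries with the maximum vote counts".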
- the threshold adopted in a DP step may be a random value obtained by performing statistics on a plurality of training samples.
- statistics may be performed on the training samples to obtain a normal distribution model N(μ, σ), and a random threshold complying with the model may be generated.
- μ represents the expectation of the random threshold
- μ⁺ represents the mean value of the feature vectors of the positive training samples
- μ⁻ represents the mean value of the feature vectors of the negative training samples
- p represents the projection that can distinguish the correct boundaries from the erroneous boundaries.
- a positive training sample may be a sample generated by using correct boundaries, and a negative training sample may be generated by using erroneous boundaries (e.g. erroneous boundaries that are selected randomly).
- the other symbols in formula (10) are the same as those in formula (9), the description of which is not repeated here.
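Under formula (10), a random DP threshold can be drawn as sketched below. The expectation is the midpoint of the projected class means, per formula (10); the particular value of the standard deviation `sigma` is an assumption, since the patent only constrains the variance qualitatively (most positive samples should stay on the valid side).

```python
import numpy as np

def random_threshold(mu_pos, mu_neg, p, sigma, rng=None):
    """Draw a threshold from N(mu, sigma^2) with
    mu = (mu- + mu+) . p / 2, following formula (10).
    'sigma' and the function name are illustrative assumptions."""
    rng = rng or np.random.default_rng()
    mu = float(np.dot(mu_neg + mu_pos, p)) / 2.0
    return rng.normal(mu, sigma)
```

Drawing a fresh threshold per DP round is what makes the Monte Carlo repetitions of the search differ from one another.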
- FIG. 11 is a schematic diagram showing the variance of the distribution of the randomly determined threshold.
- the curve C 1 represents the distribution of the positive training samples after projection
- the curve C 2 represents the distribution of the negative training samples after projection.
- the variance is selected to ensure that most of the positive training samples satisfy the threshold criterion. Using such a random threshold distribution, the correct boundaries and the erroneous boundaries may be distinguished effectively.
- since each DP step adopts a threshold different from those in the other DP steps, the DP searching may be more stable.
- heuristic search may be used to accelerate the searching. For example, for each candidate boundary, a temporary optimum segmenting scheme may be found by comparing all the valid candidate boundary sequences that contain the candidate boundary and utilize this candidate boundary as an extreme point. With the progression of the DP searching, multiple temporary optimum segmenting schemes may be found. For each candidate boundary, the number of times that this candidate boundary appears in these temporary optimum segmenting schemes may be counted, and if the number of times exceeds a threshold (the threshold may be a value predetermined according to the actual application scenario and should not be limited to any particular value), the candidate boundary may be heuristically determined as a correct boundary.
- the candidate boundary sequence excluding this candidate boundary and the sequence in which this candidate boundary is evaluated as invalid may be skipped.
- for example, if the candidate boundary 1 is heuristically determined as a correct boundary, the candidate schemes shown in the left half (the shadowed part) may be skipped in the following DP steps.
- using the heuristic search may accelerate the searching significantly.
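The heuristic described in this passage (counting how often each candidate boundary appears in the temporary optimum schemes found so far, then pruning sequences that exclude a confirmed boundary) might be sketched as follows; `confirm_boundaries`, `can_skip`, and the threshold handling are illustrative assumptions.

```python
from collections import Counter

def confirm_boundaries(temp_optima, count_threshold):
    """Count appearances of each candidate boundary in the temporary optimum
    segmenting schemes and heuristically fix as 'correct' those whose count
    exceeds the threshold."""
    counts = Counter(b for scheme in temp_optima for b in scheme)
    return {b for b, n in counts.items() if n > count_threshold}

def can_skip(candidate_sequence, confirmed):
    """A candidate boundary sequence may be skipped in later DP steps if it
    excludes any heuristically confirmed boundary."""
    return any(b not in candidate_sequence for b in confirmed)
```

Pruning in this way shrinks the DP search tree (the shadowed region of the example figure), which is where the claimed acceleration comes from.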
- the data segmenting apparatus or method in the embodiments or examples mentioned above may include an input device or an input step for receiving the input data sequence, the description of which is not detailed here.
- the data segmenting apparatus or method according to the embodiments or examples of the disclosure may be applicable to any application scenarios of processing various electronic data, such as video, audio, text, still image, or any combination thereof.
- the data segmenting apparatus or method according to the embodiments or examples of the disclosure may be applicable to various video processing apparatus or system, such as digital camera, digital video camera, digital video recorder, video server, or video monitoring system, etc.; and may be applicable to other data processing apparatus or system, such as data (including video, audio and/or text, etc.) browsing, summarization, retrieval, and/or storage, etc.
- the components, units or steps in the above apparatuses and methods can be configured with software, hardware, firmware or any combination thereof.
- the above data segmenting apparatus may be incorporated into other data processing apparatus or system as a component thereof, or may be connected to other data processing apparatus or system as a separate device.
- programs constituting the software for realizing the above method or apparatus can be installed to a computer with a specialized hardware structure (e.g. the general-purpose computer 1300 as shown in FIG. 13) from a storage medium or a network.
- the computer when installed with various programs, is capable of carrying out various functions.
- a central processing unit (CPU) 1301 executes various types of processing in accordance with programs stored in a read-only memory (ROM) 1302 , or programs loaded from a storage unit 1308 into a random access memory (RAM) 1303 .
- the RAM 1303 also stores the data required for the CPU 1301 to execute various types of processing, as required.
- the CPU 1301 , the ROM 1302 , and the RAM 1303 are connected to one another through a bus 1304 .
- the bus 1304 is also connected to an input/output interface 1305 .
- the input/output interface 1305 is connected to an input unit 1306 composed of a keyboard, a mouse, etc., an output unit 1307 composed of a cathode ray tube or a liquid crystal display, a speaker, etc., the storage unit 1308 , which includes a hard disk, and a communication unit 1309 composed of a modem, a terminal adapter, etc.
- the communication unit 1309 performs communicating processing.
- a drive 1310 is connected to the input/output interface 1305 , if needed. In the drive 1310 , for example, removable media 1311 is loaded as a recording medium containing a program of the present invention. The program is read from the removable media 1311 and is installed into the storage unit 1308 , as required.
- the programs constituting the software may be installed from a network such as Internet or a storage medium such as the removable media 1311 .
- the storage medium is not limited to the removable media 1311, such as a magnetic disk (including a flexible disc), an optical disc (including a compact-disc ROM (CD-ROM) and a digital versatile disc (DVD)), a magneto-optical disc (including an MD (Mini-Disc) (registered trademark)), or a semiconductor memory, in which the program is recorded and which is distributed to deliver the program to the user separately from the main body of a device; it may also be the ROM 1302 or the hard disk included in the storage unit 1308, in which the program is recorded and which is mounted on the main body of the device in advance and delivered to the user together with the device.
- the present disclosure further provides a program product having machine-readable instruction codes which, when being executed, may carry out the methods according to the embodiments.
- the storage medium for bearing the program product having the machine-readable instruction codes is also included in the disclosure.
- the storage medium includes, but is not limited to, a flexible disk, an optical disc, a magneto-optical disc, a storage card, a memory stick, or the like.
- the terms “comprise,” “include,” “have” and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements is not necessarily limited to those elements, but may include other elements not expressly listed or inherent to such process, method, article, or apparatus.
- the methods are not limited to a process performed in temporal sequence according to the order described therein; instead, they can be executed in another temporal sequence, in parallel, or separately. That is, the executing orders described above should not be regarded as limiting the methods thereto.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Computing Systems (AREA)
- Information Retrieval, Db Structures And Fs Structures Therefor (AREA)
Abstract
Description
$\vec{E}(b,c) = [C(b,c),\ C(a,c),\ C(b,d),\ C(a,d)]$ (2)
The formula (4) may also be used to calculate the similarity of the data inside a single data segment, such as A(a,a) and A(d,d).
The formula (4a) may also be used to calculate the similarity of the data inside a single data segment, such as A(a,a) and A(d,d).
S C·(s i ,s j)=I(meanf
$E(b,c) = \vec{E}(b,c) \cdot \vec{p}$ (8)
$\vec{p} = (\Sigma^{+} + \Sigma^{-})^{-1}\,(\vec{\mu}^{-} - \vec{\mu}^{+})$ (9)
- a) to select a candidate preceding boundary sequence that contains a maximum number of boundaries, or
- b) if more than one candidate preceding boundary sequence has the same maximum number of boundaries, to select a candidate preceding boundary sequence that is optimum synthetically (for example, a candidate preceding boundary sequence that has the optimum average value of the evaluation values or has the optimum weighted average value of the evaluation values).
$\mu = (\vec{\mu}^{-} + \vec{\mu}^{+}) \cdot \vec{p}\,/\,2$ (10)
Claims (18)
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN2011100239199A CN102591892A (en) | 2011-01-13 | 2011-01-13 | Data segmenting device and method |
CN201110023919.9 | 2011-01-13 | ||
CN201110023919 | 2011-01-13 |
Publications (2)
Publication Number | Publication Date |
---|---|
US20120183219A1 US20120183219A1 (en) | 2012-07-19 |
US8831347B2 true US8831347B2 (en) | 2014-09-09 |
Family
ID=46480561
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/342,464 Expired - Fee Related US8831347B2 (en) | 2011-01-13 | 2012-01-03 | Data segmenting apparatus and method |
Country Status (2)
Country | Link |
---|---|
US (1) | US8831347B2 (en) |
CN (1) | CN102591892A (en) |
Families Citing this family (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103729530B (en) * | 2012-10-15 | 2017-05-24 | 富士通株式会社 | Device and method for processing sequence |
CN103139663B (en) * | 2013-01-25 | 2016-05-11 | 深圳先进技术研究院 | The automatic detachment device of video and the method automatically splitting thereof |
AU2014382891B2 (en) | 2014-02-14 | 2016-11-10 | Xfusion Digital Technologies Co., Ltd. | Method and server for searching for data stream dividing point based on server |
CN106782506A (en) * | 2016-11-23 | 2017-05-31 | 语联网(武汉)信息技术有限公司 | A kind of method that recorded audio is divided into section |
CN109784126B (en) * | 2017-11-10 | 2022-11-18 | 富士通株式会社 | Data cutting method and device and article detection method and device |
CN108710878B (en) * | 2018-04-18 | 2021-11-26 | 武汉工程大学 | Railway contact network column number plate character segmentation method and system |
CN108881950B (en) * | 2018-05-30 | 2021-05-25 | 北京奇艺世纪科技有限公司 | Video processing method and device |
CN110162552B (en) * | 2019-05-09 | 2020-05-12 | 山东科技大学 | Time series feature extraction method and system based on confidence interval |
CN113407799A (en) * | 2021-06-22 | 2021-09-17 | 深圳大学 | Performance measurement method and device for measuring space division boundary and related equipment |
Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20010021268A1 (en) * | 2000-03-07 | 2001-09-13 | Lg Electronics Inc. | Hierarchical hybrid shot change detection method for MPEG-compressed video |
US6710822B1 (en) * | 1999-02-15 | 2004-03-23 | Sony Corporation | Signal processing method and image-voice processing apparatus for measuring similarities between signals |
US6738100B2 (en) * | 1996-06-07 | 2004-05-18 | Virage, Inc. | Method for detecting scene changes in a digital video stream |
US20070201746A1 (en) * | 2002-05-20 | 2007-08-30 | Konan Technology | Scene change detector algorithm in image sequence |
US20100259688A1 (en) * | 2007-11-14 | 2010-10-14 | Koninklijke Philips Electronics N.V. | method of determining a starting point of a semantic unit in an audiovisual signal |
US8195038B2 (en) * | 2008-10-24 | 2012-06-05 | At&T Intellectual Property I, L.P. | Brief and high-interest video summary generation |
US8254677B2 (en) * | 2006-09-27 | 2012-08-28 | Sony Corporation | Detection apparatus, detection method, and computer program |
US8363960B2 (en) * | 2007-03-22 | 2013-01-29 | Fraunhofer-Gesellschaft Zur Foerderung Der Angewandten Forschung E.V. | Method and device for selection of key-frames for retrieving picture contents, and method and device for temporal segmentation of a sequence of successive video pictures or a shot |
US20130194508A1 (en) * | 2008-04-17 | 2013-08-01 | Ramesh PB | Scene Break Prediction Based On Characteristics Of Previous Scenes |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6819795B1 (en) * | 2000-07-07 | 2004-11-16 | Fuji Xerox Co., Ltd. | Genetic segmentation method for data, such as image data streams |
CN1245697C (en) * | 2003-08-04 | 2006-03-15 | 北京大学计算机科学技术研究所 | Method of proceeding video frequency searching through video frequency segment |
CN100461179C (en) * | 2006-10-11 | 2009-02-11 | 北京新岸线网络技术有限公司 | Audio analysis system based on content |
2011
- 2011-01-13 CN CN2011100239199A patent/CN102591892A/en active Pending
2012
- 2012-01-03 US US13/342,464 patent/US8831347B2/en not_active Expired - Fee Related
Also Published As
Publication number | Publication date |
---|---|
US20120183219A1 (en) | 2012-07-19 |
CN102591892A (en) | 2012-07-18 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US8831347B2 (en) | Data segmenting apparatus and method | |
EP2304649B1 (en) | Frame based video matching | |
JP5479340B2 (en) | Detect and classify matches between time-based media | |
US7551234B2 (en) | Method and apparatus for estimating shot boundaries in a digital video sequence | |
EP2657884B1 (en) | Identifying multimedia objects based on multimedia fingerprint | |
US9373054B2 (en) | Method for selecting frames from video sequences based on incremental improvement | |
US8467611B2 (en) | Video key-frame extraction using bi-level sparsity | |
US8947600B2 (en) | Methods, systems, and computer-readable media for detecting scene changes in a video | |
US20110069939A1 (en) | Apparatus and method for scene segmentation | |
US20120250983A1 (en) | Object detecting apparatus and method | |
JP2010537585A5 (en) | Detect and classify matches between time-based media | |
CN110781960B (en) | Training method, classification method, device and equipment of video classification model | |
CN111683274A (en) | Bullet screen advertisement display method, device and equipment and computer readable storage medium | |
US10162867B2 (en) | Low memory sampling-based estimation of distinct elements and deduplication | |
CN111949798A (en) | Map construction method and device, computer equipment and storage medium | |
CN106598997B (en) | Method and device for calculating text theme attribution degree | |
CN110716857A (en) | Test case management method and device, computer equipment and storage medium | |
US6433709B1 (en) | Decoding method and decoding apparatus for variable length code words, and computer readable recording medium for storing decoding program for variable length code words | |
US8463725B2 (en) | Method for analyzing a multimedia content, corresponding computer program product and analysis device | |
JP4447602B2 (en) | Signal detection method, signal detection system, signal detection processing program, and recording medium recording the program | |
CN108566567B (en) | Movie editing method and device | |
JPWO2007049378A1 (en) | Video identification device | |
CN114422848A (en) | Video segmentation method and device, electronic equipment and storage medium | |
CN111680175B (en) | Face database construction method, computer equipment and computer readable storage medium | |
US20130191310A1 (en) | Prediction model refinement for information retrieval system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: SONY CORPORATION, JAPAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:HAN, BO;REEL/FRAME:027470/0381 Effective date: 20111208 |
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
FEPP | Fee payment procedure |
Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551) Year of fee payment: 4 |
FEPP | Fee payment procedure |
Free format text: MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
LAPS | Lapse for failure to pay maintenance fees |
Free format text: PATENT EXPIRED FOR FAILURE TO PAY MAINTENANCE FEES (ORIGINAL EVENT CODE: EXP.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
STCH | Information on status: patent discontinuation |
Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362 |
FP | Lapsed due to failure to pay maintenance fee |
Effective date: 20220909 |