WO2013077546A1 - Apparatus and method for detecting a scene change in a stereoscopic video - Google Patents


Info

Publication number
WO2013077546A1
Authority
WO
Grant status
Application
Patent type
Prior art keywords
scene
scene change
frame
feature point
dimensional
Prior art date
Application number
PCT/KR2012/008237
Other languages
French (fr)
Korean (ko)
Inventor
우대식
박재범
전병기
김종대
정원석
Original Assignee
에스케이플래닛 주식회사
시모스 미디어텍(주)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06K RECOGNITION OF DATA; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K9/00 Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
    • G06K9/00624 Recognising scenes, i.e. recognition of a whole field of perception; recognising scene-specific objects
    • G06K9/00711 Recognising video content, e.g. extracting audiovisual features from movies, extracting representative key-frames, discriminating news vs. sport content
    • G06K9/00765 Segmenting video sequences, i.e. computational techniques such as parsing or cutting the sequence, low-level clustering or determining units such as shots and scenes
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/10 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
    • H04N19/134 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or criterion affecting or controlling the adaptive coding
    • H04N19/142 Detection of scene cut or scene change
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/50 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding
    • H04N19/597 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding specially adapted for multi-view video sequence encoding
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N2013/0074 Stereoscopic image analysis
    • H04N2013/0081 Depth or disparity estimation from stereoscopic image signals

Abstract

The present invention relates to an apparatus and method for detecting a scene change in a stereoscopic video, wherein the apparatus includes: a module for detecting a scene change point in an input video; a module for calculating a stereoscopic scene continuity score, which extracts feature points in the last frame before the detected scene change point and calculates the score using the extracted feature points tracked in the first frame after the scene change; and a module for determining scene continuity, which compares the calculated stereoscopic scene continuity score with a preset threshold and determines whether the stereoscopic scene is continuous on the basis of the comparison result. According to the present invention, video features can be used to determine whether a typical scene change point should be divided into independent scene changes or treated as one continuous scene, that is, whether the scene change point is continuous from a stereoscopic point of view.

Description

A stereoscopic scene change detection device and method

The present invention relates to an apparatus and method for detecting a scene change in a stereoscopic video, and more particularly, to an apparatus and method that extract feature points from the last frame before a scene change detected in an input video, track the extracted feature points into the first frame after the scene change to calculate a stereoscopic scene continuity score, and compare the calculated score with a preset threshold value to determine whether the stereoscopic scene is continuous.

In general, a scene change (transition) is a phenomenon in which one scene of a continuous video ends and the video resumes with a different scene. Typical transitions include the fade-out and fade-in, in which one scene gradually disappears and another gradually appears; the overlap (dissolve), in which two scenes slowly blend into each other over time; and the simple cut, among others.

Conventional scene change detection mainly separates scene changes based on values such as the brightness or histogram of the image. Such criteria reasonably select numerically discontinuous parts, for example in terms of histogram differences, as scene changes; in the case of stereoscopic conversion, however, there are many cases in which frames that maintain the visual continuity of an object should instead be treated as one continuous scene.

That is, even across what a conventional method separates as different scenes, if a particular object keeps moving through the series of scenes, the three-dimensional characteristics of that object should be maintained so that the viewer feels less stereoscopic discomfort.

If such scenes are distinguished as separate scene changes and each separated scene is subjected to a different stereoscopic conversion method, the three-dimensional characteristics change despite the scene actually being continuous, which may increase the viewer's visual discomfort with the stereoscopic effect.

Of course, when stereoscopic conversion is performed by hand, a person can visually take the stereoscopic characteristics into account without any particular distinction of scene changes; but when automatic stereoscopic conversion must be performed through image processing, this kind of scene change detection is important because it is more efficient to perform the stereoscopic conversion on a per-scene-change basis.

The present invention has been made to solve the above-mentioned problem, and an object of the present invention is to provide an apparatus and method capable of detecting the scene changes of a stereoscopic image that are optimal from the viewpoint of stereoscopic conversion.

Another object of the invention is to provide an apparatus and method for detecting a scene change of a stereoscopic image that can determine, using features of the image, whether a conventional scene change point should be separated as an independent scene change or treated as one continuous scene, that is, whether the scene change is continuous from a stereoscopic point of view.

According to an aspect of the invention, there is provided a scene change detection device for a stereoscopic image, including: a scene change detection module for detecting a scene change in an input video; a stereoscopic scene continuity score calculation module for extracting feature points from the last frame before the detected scene change, tracking the extracted feature points into the first frame after the scene change, and calculating a stereoscopic scene continuity score using the tracked feature points; and a scene continuity state determination module for comparing the calculated stereoscopic scene continuity score with a preset threshold value and determining whether the stereoscopic scene is continuous on the basis of the comparison result.

The scene change detection module detects the scene change using at least one of correlation, sequential statistical analysis, and histogram methods.

The stereoscopic scene continuity score calculation module includes: a feature point extraction unit for extracting feature points from the last frame before the scene change detected by the scene change detection module; a feature point matching unit for matching the feature points extracted from the last frame with the first frame to obtain the number of feature points tracked in the first frame; and a stereoscopic scene continuity score calculation unit for obtaining the SAD (sum of absolute differences) over predefined blocks based on the feature points of the last frame and the feature points tracked in the first frame, and obtaining the stereoscopic scene continuity score using the obtained SAD.

The stereoscopic scene continuity score calculation module obtains the stereoscopic scene continuity score C(s) using the following equation.

Formula;

C(s) = (number of tracked feature points / total number of feature points) * Σ (1 / SAD(fn, bm))

Here, the total number of feature points is the number of feature points extracted from the last frame before the scene change, the number of tracked feature points is the number of feature points tracked in the first frame after the scene change, fn is the n-th frame, and bm is the m-th block in the frame.

The SAD (fn, bm) is determined using the following equation.

Formula;

SAD(fn, bm) = Σ abs(Frame(fn, bm) pixel(i) - Frame(fn+1, bm) pixel(i))

Here, fn is the n-th frame, bm is the m-th block of the frame, and i is the index of each pixel in the block.

The scene continuity state determination module determines that the scene is continuous when the stereoscopic scene continuity score is equal to or greater than the threshold value, and determines that a scene change has occurred when the score is less than the threshold value.

According to another aspect of the invention, there is provided a method for a scene change detection device to detect a scene change of a stereoscopic image, the method including: (a) detecting a scene change in an input image; (b) extracting feature points from the last frame before the detected scene change and calculating a stereoscopic scene continuity score using the feature points tracked in the first frame after the scene change; and (c) comparing the calculated stereoscopic scene continuity score with a predetermined threshold value and determining whether the stereoscopic scene is continuous on the basis of the comparison result.

Step (b) includes: extracting feature points from the last frame before the detected scene change; matching the feature points extracted from the last frame with the first frame to obtain the number of feature points tracked in the first frame; obtaining the SAD (sum of absolute differences) over predefined blocks based on the feature points of the last frame and the feature points tracked in the first frame; and calculating the stereoscopic scene continuity score using the obtained SAD and the number of tracked feature points.

The stereoscopic scene continuity score C(s) is obtained using the following equation.

Formula;

C(s) = (number of tracked feature points / total number of feature points) * Σ (1 / SAD(fn, bm))

Here, the total number of feature points is the number of feature points extracted from the last frame before the scene change, the number of tracked feature points is the number of feature points tracked in the first frame after the scene change, fn is the n-th frame, and bm is the m-th block in the frame.

In step (c), the scene is determined to be continuous if the stereoscopic scene continuity score is equal to or greater than the threshold value, and a scene change is determined to have occurred if the score is less than the threshold value.

In accordance with another aspect of the invention, there is provided a recording medium readable by an electronic device, on which is recorded a program for executing a method including: (a) detecting a scene change in an input image; (b) extracting feature points from the last frame before the detected scene change and calculating a stereoscopic scene continuity score using the feature points tracked in the first frame after the scene change; and (c) comparing the calculated stereoscopic scene continuity score with a predetermined threshold value and determining whether the stereoscopic scene is continuous on the basis of the comparison result.

According to the invention, it is possible to detect the scene changes that are optimal from the viewpoint of stereoscopic conversion.

Further, by determining, using features of the image, whether a conventional scene change point should be separated as an independent scene change or treated as one continuous scene, it is possible to determine whether the scene change is continuous from a stereoscopic point of view.

Figure 1 is a diagram illustrating conventional scene changes.

Figure 2 is a block diagram schematically showing the configuration of the scene change detection device for a stereoscopic image according to the present invention.

Figure 3 is an exemplary view for explaining the method of obtaining the SAD between frames in accordance with the present invention.

Figure 4 is a flowchart showing how the scene change detection device for a stereoscopic video checks whether scenes are continuous, in accordance with the present invention.

Figure 5 is a flowchart showing how the scene change detection device according to the invention obtains the stereoscopic scene continuity score.

Hereinafter, preferred embodiments of the present invention will be described in more detail with reference to the accompanying drawings. In the following description, the same or corresponding components are assigned the same reference numerals, and duplicate descriptions thereof are omitted.

Figure 2 is a block diagram schematically showing the configuration of a scene change detection device for a stereoscopic image according to the present invention, and Figure 3 is an exemplary view for explaining a method of obtaining the SAD between frames in accordance with the present invention.

Referring to Figure 2, the scene change detection device 200 for a stereoscopic image includes a scene change detection module 210, a stereoscopic scene continuity score calculation module 220, and a scene continuity state determination module 230.

The scene change detection module 210 detects a scene change in the input image. That is, the scene change detection module 210 finds the scene changes of the input video using methods such as picture brightness, correlation, sequential statistical analysis, and histogram.

Since the technique by which the scene change detection module 210 detects a scene change follows conventional techniques, a detailed description thereof is omitted.
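
For illustration only, a minimal sketch of one such conventional cut detector, based on a normalized grayscale histogram difference between consecutive frames, is shown below; the function name, bin count, and cut threshold are assumptions made for the example and are not values specified in this document.

```python
import numpy as np

def histogram_cut_detection(frames, bins=64, cut_threshold=0.4):
    """Return indices i where a cut is assumed between frames[i] and frames[i+1].

    frames: list of 2-D uint8 grayscale arrays.
    bins and cut_threshold are illustrative choices, not patent-specified values.
    """
    cuts = []
    for i in range(len(frames) - 1):
        h0, _ = np.histogram(frames[i], bins=bins, range=(0, 256))
        h1, _ = np.histogram(frames[i + 1], bins=bins, range=(0, 256))
        # Normalize so the difference score does not depend on frame size.
        h0 = h0 / h0.sum()
        h1 = h1 / h1.sum()
        diff = 0.5 * np.abs(h0 - h1).sum()  # 0 = identical histograms, 1 = disjoint
        if diff > cut_threshold:
            cuts.append(i)
    return cuts
```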

The stereoscopic scene continuity score calculation module 220 extracts feature points from the last frame before the scene change detected by the scene change detection module 210, tracks the extracted feature points into the first frame after the scene change, and then calculates the stereoscopic scene continuity score using the tracked feature points. Here, the last frame of the scene change means the last frame before the scene change, and the first frame of the scene change image means the first frame after the scene change; the two frames are divided by the scene change.

The stereoscopic scene continuity score calculation module 220 that operates as described above includes a feature point extraction unit 222, a feature point matching unit 224, and a stereoscopic scene continuity score calculation unit 226.

The feature point extraction unit 222 extracts feature points from the last frame before the scene change detected by the scene change detection module 210 using a corner extraction method such as the Harris corner detector or the SIFT (Scale Invariant Feature Transform) algorithm. Here, a feature point may be predefined to refer to, for example, an edge or a corner.

The feature point extraction unit 222 extracts the feature points from the last frame before the scene change in order to find the association between objects in the images before and after the scene change.
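
A sketch of this extraction step, assuming OpenCV's `cv2.goodFeaturesToTrack` with the Harris corner response as one possible corner detector, might look as follows; the function name and parameter values are illustrative assumptions, not the specific implementation of unit 222.

```python
import cv2
import numpy as np

def extract_feature_points(last_frame_gray, max_corners=200, quality=0.01, min_distance=8):
    """Extract corner feature points from the last frame before a scene change.

    last_frame_gray: 2-D uint8 array. Returns an (N, 2) float32 array of (x, y) corners.
    All parameter values are illustrative defaults, not values given in the patent.
    """
    corners = cv2.goodFeaturesToTrack(
        last_frame_gray,
        maxCorners=max_corners,
        qualityLevel=quality,
        minDistance=min_distance,
        useHarrisDetector=True,  # Harris corners, one of the methods mentioned above
    )
    if corners is None:
        return np.empty((0, 2), dtype=np.float32)
    return corners.reshape(-1, 2)
```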

The feature point matching unit 224 matches the feature points extracted from the last frame with the first frame after the scene change and calculates the number of feature points tracked in the first frame.

For example, if feature point 1, feature point 2, feature point 3, feature point 4, and feature point 5 are extracted from the last frame before the scene change, and matching these points against the first frame after the scene change tracks feature point 1, feature point 2, and feature point 4, the feature point matching unit 224 obtains a tracked feature point count of "3".
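
A sketch of this matching step, assuming pyramidal Lucas-Kanade optical flow (`cv2.calcOpticalFlowPyrLK`) as one way to track the extracted points across the cut, is shown below; the tracker choice and its default parameters are assumptions for illustration. For the five-point example above, a tracker of this kind reporting points 1, 2, and 4 as found would give a tracked count of 3.

```python
import cv2
import numpy as np

def track_feature_points(last_frame_gray, first_frame_gray, points):
    """Track feature points from the last frame before the cut into the first frame after it.

    points: (N, 2) float32 array from the extraction step.
    Returns (tracked_points, tracked_indices); len(tracked_indices) is the tracked count.
    """
    if len(points) == 0:
        return np.empty((0, 2), dtype=np.float32), np.array([], dtype=int)
    prev_pts = points.reshape(-1, 1, 2).astype(np.float32)
    next_pts, status, _err = cv2.calcOpticalFlowPyrLK(
        last_frame_gray, first_frame_gray, prev_pts, None)
    ok = status.reshape(-1) == 1          # 1 means the point was found in the next frame
    tracked_indices = np.flatnonzero(ok)
    return next_pts.reshape(-1, 2)[ok], tracked_indices
```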

The stereoscopic scene continuity score calculation unit 226 obtains the SAD (sum of absolute differences) over predefined blocks based on the feature points of the last frame and the feature points tracked in the first frame, and obtains the stereoscopic scene continuity score using the obtained SAD. Here, a predefined block means a block of a predetermined size that includes the pixels surrounding a feature point. The stereoscopic scene continuity score is a score indicating to what extent the feature points extracted from the last frame are maintained in the scene after the change.

The stereoscopic scene continuity score calculation unit 226 calculates the continuity score between Scene(s) and Scene(s+1) by taking into account the number of feature points that could actually be tracked relative to the total number of feature points, together with the reliability of each tracked feature point. Here, Scene(s) corresponds to the scene before the scene change and Scene(s+1) refers to the scene after the scene change.

Thus, the stereoscopic scene continuity score calculation unit 226 obtains the stereoscopic scene continuity score C(s) using Equation 1.

Equation 1

C(s) = (number of tracked feature points / total number of feature points) * Σ F(n)

Here, the total number of feature points means the number of feature points extracted from the last frame before the scene change, and the number of tracked feature points refers to the number of feature points tracked in the first frame after the scene change, as obtained from the feature point matching unit.

F(n) means the reliability of each feature point, n is the index of a trackable feature point, and s refers to the index of each scene.

The per-feature-point reliability F(n) means, for an arbitrary feature point n, the ease of tracking when tracking from the last frame of Scene(s), as divided by the conventional method, into the first frame of the next Scene(s+1). That is, the reliability of each feature point expresses how closely the image region formed around the feature point remains in the same form, and how similar it is, in the next image to be tracked.

If the image changes completely to a different place with no relation to the previous image, tracking is essentially impossible because no region with similar characteristics can be found, and such points are excluded from the tracking. If, however, a region with more than a certain degree of similarity is found, the degree of similarity is measured using the SAD (sum of absolute differences).

Thus, the stereoscopic scene continuity score calculation unit 226 obtains the stereoscopic scene continuity score using the SAD.

At this time, the stereoscopic scene continuity score calculation unit 226 calculates the SAD using Equation 2.

Equation 2

SAD(fn, bm) = Σ abs(Frame(fn, bm) pixel(i) - Frame(fn+1, bm) pixel(i))

Here, fn is the n-th frame, bm is the m-th block of the frame, i is the index of each pixel in the block, and abs means the absolute value.

The SAD (sum of absolute differences) is the sum, over all pixels within a block, of the absolute differences between blocks at the same position in two frames. Thus, a large SAD value means that the change in the image is large.

Thus, the stereoscopic scene continuity score calculation unit 226 divides the frame into blocks of a size determined in advance around each feature point, and obtains the SAD between frames for each block.

Referring to Figure 3 for the method by which the stereoscopic scene continuity score calculation unit 226 obtains the SAD: given the current frame Frame(fn), the previous frame Frame(fn-1), and the following frame Frame(fn+1), the unit calculates, for the block bm at the same position in each frame, the SAD(fn-1, bm) between the block bm of the previous frame and the block bm of the current frame, and the SAD(fn, bm) between the block bm of the current frame and the block bm of the following frame.

Accordingly, when the current frame is Frame(fn) and the following frame is Frame(fn+1), the stereoscopic scene continuity score calculation unit 226 obtains the SAD(fn, bm) between the block bm of the current frame and the block bm of the following frame at the same position.

As described above, the stereoscopic scene continuity score calculation unit 226 obtains the SAD of each block of a predefined image area size for the feature points selected at the scene change. If the calculated SAD value exceeds a threshold, the feature point is excluded from the tracked feature point count because there is no practically meaningful tracking; a trackable SAD value is one that does not exceed the threshold. Therefore, the smaller the actual SAD value, the higher the similarity and the greater the reliability; that is, SAD and reliability are in an inverse relationship.
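
A sketch of the per-block SAD of Equation 2, computed over a fixed-size block centred on a feature point in the frame before the cut and the frame after it, might look as follows; the block size is an illustrative assumption, not a value given in the patent.

```python
import numpy as np

def block_sad(frame_n, frame_n1, center_xy, block_size=16):
    """SAD(fn, bm): sum of absolute pixel differences over the block bm taken at the
    same position in frame fn and frame fn+1, centred on a feature point.

    frame_n, frame_n1: 2-D uint8 grayscale arrays of the same shape.
    center_xy: (x, y) feature point position; block_size is an illustrative choice.
    """
    h, w = frame_n.shape
    half = block_size // 2
    x, y = int(round(center_xy[0])), int(round(center_xy[1]))
    # Clamp the block to the image borders.
    x0, x1 = max(0, x - half), min(w, x + half)
    y0, y1 = max(0, y - half), min(h, y + half)
    b0 = frame_n[y0:y1, x0:x1].astype(np.int32)
    b1 = frame_n1[y0:y1, x0:x1].astype(np.int32)
    return int(np.abs(b0 - b1).sum())
```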

Thus, the stereoscopic scene continuity score calculation unit 226 can determine the reliability F(n) of each feature point using the SAD.

Therefore, the stereoscopic scene continuity score calculation unit 226 can express Equation 1 as Equation 3.

Equation 3

C(s) = (number of tracked feature points / total number of feature points) * Σ (1 / SAD(fn, bm))

Here, the total number of feature points is the number of feature points extracted from the last frame before the scene change, the number of tracked feature points is the number of feature points tracked in the first frame after the scene change, fn is the n-th frame, and bm is the m-th block in the frame.
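
Putting Equation 3 together, a sketch of the continuity score computation is shown below; the SAD exclusion threshold and the idea of feeding in per-block SAD values from a routine like the one sketched above are illustrative assumptions.

```python
import numpy as np

def continuity_score(total_points, sad_per_tracked_point, sad_threshold=4000):
    """Stereoscopic scene continuity score C(s) of Equation 3 (a sketch).

    total_points: number of feature points extracted from the last frame before the cut.
    sad_per_tracked_point: SAD(fn, bm) for each tracked feature point's block.
    sad_threshold: points whose SAD exceeds this value are treated as untrackable and
        excluded; the value is an illustrative assumption, not given in the patent.
    """
    if total_points == 0:
        return 0.0
    sads = np.asarray(sad_per_tracked_point, dtype=np.float64)
    sads = sads[sads <= sad_threshold]          # keep only usefully tracked points
    tracked = len(sads)
    # (tracked / total) * sum of per-point reliabilities F(n) = 1 / SAD(fn, bm)
    reliability_sum = np.sum(1.0 / np.maximum(sads, 1.0))  # guard against SAD == 0
    return (tracked / total_points) * reliability_sum
```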

The stereoscopic scene continuity score calculation module 220 configured as described above obtains the SAD (sum of absolute differences) over the predefined blocks based on the feature points of the last frame and the feature points tracked in the first frame after the scene change, and obtains the stereoscopic scene continuity score using the obtained SAD.

In addition, the stereoscopic scene continuity score calculation module 220 may refine the stereoscopic scene continuity score by performing the feature point matching over several frames on either side of the scene change (i.e., before or after the scene change), rather than over a single frame.

The scene continuity state determination module 230 compares the stereoscopic scene continuity score obtained by the stereoscopic scene continuity score calculation module 220 with a predetermined threshold value, and determines whether the stereoscopic scene is continuous on the basis of the comparison result.

That is, the scene continuity state determination module 230 determines that the scene is continuous when the stereoscopic scene continuity score is equal to or greater than the preset threshold value, and determines that a scene change has occurred when the score is less than the threshold value.

In other words, if the stereoscopic scene continuity score is equal to or greater than the threshold, the scene continuity state determination module 230 treats the two scenes Scene(s) and Scene(s+1), which the scene change detection module 210 separated by the conventional method, collectively as one scene for the stereoscopic image.
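
The decision step then reduces to a threshold comparison; in the sketch below, cuts whose continuity score reaches an assumed threshold are dropped so that Scene(s) and Scene(s+1) are handled as one scene for stereoscopic conversion. The threshold value is an illustrative assumption.

```python
def merge_continuous_cuts(cut_indices, continuity_scores, threshold=0.05):
    """Keep only the cuts that remain scene changes from the stereoscopic point of view.

    cut_indices: frame indices of conventional scene changes.
    continuity_scores: C(s) for each cut, in the same order.
    threshold: illustrative value; a score >= threshold means the scene is continuous,
    so that cut is dropped and the two adjacent scenes are treated as one.
    """
    kept = []
    for idx, score in zip(cut_indices, continuity_scores):
        if score >= threshold:
            continue          # continuous stereoscopic scene: merge, do not split here
        kept.append(idx)      # genuine scene change for stereoscopic conversion
    return kept
```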

The scene change detection device 200 for a stereoscopic image configured as described above handles scene change detection for stereoscopic conversion by first selecting scene changes with an existing method, and then reviewing whether each selected point should indeed be treated as a scene change.

In addition, the scene change detection device 200 for a stereoscopic image defines the criteria for determining, using features of the image, whether a conventional scene change point should be distinguished as an independent scene change or treated as one continuous scene.

Figure 4 is a flowchart illustrating a method by which the scene change detection device for a stereoscopic image according to the invention checks whether scenes are continuous.

Referring to Figure 4, the scene change detection device for a stereoscopic image detects a scene change in the input image (S302). That is, the scene change detection device detects a scene change in the input video using methods such as correlation, sequential statistical analysis, and histogram.

After S302, the scene change detection device extracts feature points from the last frame before the detected scene change (S304), and obtains the stereoscopic scene continuity score using the feature points tracked in the first frame after the scene change (S306). How the scene change detection device obtains the stereoscopic scene continuity score is described in detail with reference to Figure 5.

After S306, the scene change detection device judges whether the obtained stereoscopic scene continuity score is equal to or greater than a preset threshold value (S308).

If it is determined in S308 that the stereoscopic scene continuity score is equal to or greater than the threshold, the scene change detection device judges the frames to be a continuous scene (S310).

If it is determined in S308 that the stereoscopic scene continuity score is less than the threshold, the scene change detection device determines that a scene change has occurred at the frames (S312).

Figure 5 is a flowchart illustrating a method by which the scene change detection device according to the invention obtains the stereoscopic scene continuity score.

Referring to Figure 5, the scene change detection device extracts feature points from the last frame before the detected scene change and calculates the number of feature points tracked in the first frame after the scene change (S402). That is, the scene change detection device matches the feature points extracted from the last frame with the first frame to calculate the number of feature points tracked in the first frame.

After S402, the scene change detection device calculates the SAD (sum of absolute differences) over the predefined blocks based on the feature points of the last frame and the feature points tracked in the first frame (S404).

That is, the scene change detection device calculates the SAD using Equation 2.

After S404, the scene change detection device calculates the stereoscopic scene continuity score using the calculated SAD and the number of tracked feature points (S406).

That is, the scene change detection device obtains the stereoscopic scene continuity score using Equation 3.

The present invention can be realized as computer-readable code on a computer-readable recording medium. The computer-readable recording medium includes all kinds of recording devices in which data that can be read by a computer system is stored.

The device for detecting a scene change of a stereoscopic image in accordance with the present invention may include a processor, memory, storage, and input/output devices as components; these components may be interconnected using, for example, a system bus.

The processor may process instructions for execution within the device. In one implementation, the processor may be a single-threaded processor; in other implementations, the processor may be a multi-threaded processor. The processor may process instructions stored in the memory or in the storage device.

The memory stores information within the device. In one implementation, the memory is a computer-readable medium. In one implementation, the memory may be a volatile memory unit; in other implementations, the memory may be a non-volatile memory unit. The aforementioned storage device is capable of providing mass storage for the device. In one implementation, the storage device is a computer-readable medium. In various different implementations, the storage device may include, for example, a hard disk device, an optical disk device, or some other mass storage device.

The above-mentioned input/output devices provide input/output operations for the system according to the present invention. In one implementation, the input/output devices may include one or more network interface devices such as an Ethernet card, a serial communication device such as an RS-232 port, and/or a wireless interface device such as an 802.11 card. In another implementation, the input/output devices may include driver devices configured to receive input data and send output data to other input/output devices, such as a keyboard, a printer, and a display device.

The device according to the invention may be driven by instructions that cause one or more processors to perform the functions and processes described above. Such instructions may include, for example, interpreted instructions such as script instructions in JavaScript or ECMAScript, executable code, or other instructions stored on a computer-readable medium. Further, the device according to the present invention may be implemented in a distributed manner across a network, such as a server farm, or may be implemented as a single computer device.

Although this specification and the drawings describe an exemplary device configuration, the functional operations and implementations of the subject matter described herein may be implemented in other types of digital electronic circuitry, or in computer software, firmware, or hardware, including the structures disclosed herein and their structural equivalents, or in combinations of one or more of them. Implementations of the subject matter described in this specification may be implemented as one or more computer program products, that is, one or more modules of computer program instructions encoded on a tangible program storage medium for execution by, or to control the operation of, the device according to the present invention. The computer-readable medium may be a machine-readable storage device, a machine-readable storage substrate, a memory device, a composition of matter effecting a machine-readable propagated signal, or a combination of one or more of them.

The term "processing system", "processor" and "sub-system" is for example, encompasses all apparatus, devices, and machines for processing data, including a programmable processor, a computer, or multiple processors or computers. The processing system can include a code to be added to the hardware, forming an execution environment for example, code that makes up processor firmware, a protocol stack, a database management system, operating system, or request a computer program, such as one or more combinations of these .

A computer program (also known as a program, software, software application, script, or code) for executing the method according to the invention on the device according to the present invention can be written in any form of programming language, including compiled or interpreted languages, or declarative or procedural languages, and can be deployed in any form, including as a stand-alone program or as a module, component, subroutine, or other unit suitable for use in a computing environment. A computer program does not necessarily correspond to a file in a file system. A program can be stored in a portion of a file that holds other programs or data (e.g., one or more scripts stored in a markup language document), in a single file dedicated to the program in question, or in multiple coordinated files (e.g., files that store one or more modules, sub-programs, or portions of code). A computer program can be deployed to be executed on one computer or on multiple computers that are located at one site or distributed across multiple sites and interconnected by a communication network.

Computer-readable media suitable for storing computer program instructions and data include all forms of non-volatile memory, media and memory devices, including, for example, semiconductor memory devices such as EPROM, EEPROM, and flash memory devices; magnetic disks such as internal hard disks and removable disks; magneto-optical disks; and CD-ROM and DVD-ROM discs. The processor and the memory can be supplemented by, or incorporated in, special-purpose logic circuitry.

Implementations of the subject matter described herein may be realized in a computing system that includes a back-end component, such as a data server, or that includes a middleware component, such as an application server, or that includes a front-end component, such as a client computer having a Web browser or a graphical user interface through which a user can interact with an implementation of the subject matter described in this specification, or any combination of one or more such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication, such as a communication network.

While this specification contains many specific implementation details, these should not be construed as limitations on the scope of any invention or of what may be claimed, but rather as descriptions of features that may be specific to particular embodiments of particular inventions. Certain features that are described in this specification in the context of separate embodiments can also be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment can also be implemented in multiple embodiments separately or in any suitable sub-combination. Moreover, although features may be described above as acting in certain combinations and even initially claimed as such, one or more features from a claimed combination can in some cases be excised from the combination, and the claimed combination may be directed to a sub-combination or a variation of a sub-combination.

Similarly, while operations are depicted in the drawings in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order, or that all illustrated operations be performed, to achieve the desired result. In certain cases, multitasking and parallel processing may be advantageous. In addition, the separation of various system components in the embodiments described above should not be understood as requiring such separation in all embodiments, and it should be understood that the described program components and systems can generally be integrated together in a single software product or packaged into multiple software products.

Particular embodiments of the subject matter described in this specification have been described. Other embodiments are within the scope of the following claims. For example, the operations recited in the claims can be performed in a different order and still achieve desirable results. As one example, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve the desired results. In certain embodiments, multitasking and parallel processing may be advantageous.

This written description sets forth the best mode of the present invention and provides examples in order to illustrate the invention and to enable a person skilled in the art to make and use the invention. This written specification is not intended to limit the invention to the specific terms set forth. Thus, although the present invention has been described in detail with reference to the examples above, a person of ordinary skill in the art may apply modifications, changes and variations to the examples without departing from the scope of the invention.

Thus, persons skilled in the art will appreciate that the present invention may be embodied in other specific forms without changing its technical spirit or essential features. Therefore, the embodiments described above should be understood as illustrative and not limiting in all aspects. The scope of the invention is indicated by the claims described below rather than by the above description, and all modifications derived from the meaning and scope of the claims and their equivalent concepts should be construed as being included within the scope of the invention.

The present invention can determine, using features of the video, whether a conventional scene change point should be separated as an independent scene change or treated as one continuous scene, that is, whether the scene change is continuous from a stereoscopic point of view, and can therefore be applied to an apparatus and method for detecting a scene change in a stereoscopic video.

Claims (11)

  1. A scene change detection device for a stereoscopic image, comprising:
    a scene change detection module for detecting a scene change in an input video;
    a stereoscopic scene continuity score calculation module for extracting feature points from the last frame before the detected scene change and calculating a stereoscopic scene continuity score using the feature points tracked in the first frame after the scene change; and
    a scene continuity state determination module for comparing the calculated stereoscopic scene continuity score with a preset threshold value and determining whether the stereoscopic scene is continuous on the basis of the comparison result.
  2. The device according to claim 1, wherein the scene change detection module detects the scene change using at least one of correlation, sequential statistical analysis, and histogram methods.
  3. The device according to claim 1, wherein the stereoscopic scene continuity score calculation module comprises:
    a feature point extraction unit for extracting feature points from the last frame before the scene change detected by the scene change detection module;
    a feature point matching unit for matching the feature points extracted from the last frame with the first frame to obtain the number of feature points tracked in the first frame; and
    a stereoscopic scene continuity score calculation unit for obtaining the SAD (sum of absolute differences) over predefined blocks based on the feature points of the last frame and the feature points tracked in the first frame, and obtaining the stereoscopic scene continuity score using the obtained SAD.
  4. The device according to claim 3, wherein the stereoscopic scene continuity score calculation unit obtains the stereoscopic scene continuity score C(s) using the following equation:
    C(s) = (number of tracked feature points / total number of feature points) * Σ (1 / SAD(fn, bm))
    where the total number of feature points is the number of feature points extracted from the last frame before the scene change, the number of tracked feature points is the number of feature points tracked in the first frame after the scene change, fn is the n-th frame, and bm is the m-th block in the frame.
  5. The device according to claim 4, wherein the SAD(fn, bm) is obtained using the following equation:
    SAD(fn, bm) = Σ abs(Frame(fn, bm) pixel(i) - Frame(fn+1, bm) pixel(i))
    where fn is the n-th frame, bm is the m-th block of the frame, i is the index of each pixel in the block, and abs is the absolute value.
  6. The device according to claim 1, wherein the scene continuity state determination module determines that the scene is continuous when the stereoscopic scene continuity score is equal to or greater than the threshold value, and determines that a scene change has occurred when the score is less than the threshold value.
  7. A method for a scene change detection device to detect a scene change of a stereoscopic image, the method comprising:
    (a) detecting a scene change in an input image;
    (b) extracting feature points from the last frame before the detected scene change and calculating a stereoscopic scene continuity score using the feature points tracked in the first frame after the scene change; and
    (c) comparing the calculated stereoscopic scene continuity score with a predetermined threshold value and determining whether the stereoscopic scene is continuous on the basis of the comparison result.
  8. The method of claim 7, wherein step (b) comprises:
    extracting feature points from the last frame before the detected scene change;
    matching the feature points extracted from the last frame with the first frame to obtain the number of feature points tracked in the first frame;
    calculating the SAD (sum of absolute differences) over predefined blocks based on the feature points of the last frame and the feature points tracked in the first frame; and
    calculating the stereoscopic scene continuity score using the calculated SAD and the number of tracked feature points.
  9. The method of claim 8, wherein the stereoscopic scene continuity score C(s) is obtained using the following equation:
    C(s) = (number of tracked feature points / total number of feature points) * Σ (1 / SAD(fn, bm))
    where the total number of feature points is the number of feature points extracted from the last frame before the scene change, the number of tracked feature points is the number of feature points tracked in the first frame after the scene change, fn is the n-th frame, and bm is the m-th block in the frame.
  10. The method of claim 7, wherein in step (c), the scene is determined to be continuous when the stereoscopic scene continuity score is equal to or greater than the threshold value, and a scene change is determined to have occurred when the score is less than the threshold value.
  11. A recording medium readable by an electronic device, on which is recorded a program for executing a method comprising:
    (a) detecting a scene change in an input image;
    (b) extracting feature points from the last frame before the detected scene change and calculating a stereoscopic scene continuity score using the feature points tracked in the first frame after the scene change; and
    (c) comparing the calculated stereoscopic scene continuity score with a predetermined threshold value and determining whether the stereoscopic scene is continuous on the basis of the comparison result.
PCT/KR2012/008237 2011-11-24 2012-10-11 Apparatus and method for detecting a scene change in a stereoscopic video WO2013077546A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
KR10-2011-0123373 2011-11-24
KR20110123373A KR101667011B1 (en) 2011-11-24 2011-11-24 Apparatus and Method for detecting scene change of stereo-scopic image

Publications (1)

Publication Number Publication Date
WO2013077546A1 (en)

Family

ID=48469955

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2012/008237 WO2013077546A1 (en) 2011-11-24 2012-10-11 Apparatus and method for detecting a scene change in a stereoscopic video

Country Status (2)

Country Link
KR (1) KR101667011B1 (en)
WO (1) WO2013077546A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103489191B (en) * 2013-09-24 2016-04-13 中国科学院自动化研究所 Significant target remote sensing image change detection method

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003196662A (en) * 2001-12-27 2003-07-11 Ntt Data Corp Cut detection device and its program
JP2009540667A (en) * 2006-06-08 2009-11-19 トムソン ライセンシングThomson Licensing Method and apparatus for detecting a scene change
KR20100060330A (en) * 2008-11-27 2010-06-07 삼성전자주식회사 Apparatus and method for creating multi-view image for stereo image
KR20110050364A (en) * 2009-11-06 2011-05-13 삼성전자주식회사 Method and apparatus for parallax adjusting in stereoscopic video

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101490521B1 (en) 2007-10-10 2015-02-06 삼성전자주식회사 Method for real-time scene-change detection for rate control of video encoder, method for enhancing qulity of video telecommunication using the same, and system for the video telecommunication
KR101050255B1 (en) * 2009-08-25 2011-07-19 주식회사 노매드커넥션 Video footage split system and method
KR101167645B1 (en) * 2010-07-27 2012-07-20 (주)자람테크놀로지 Method for detecting scene change and apparatus therof

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003196662A (en) * 2001-12-27 2003-07-11 Ntt Data Corp Cut detection device and its program
JP2009540667A (en) * 2006-06-08 2009-11-19 トムソン ライセンシングThomson Licensing Method and apparatus for detecting a scene change
KR20100060330A (en) * 2008-11-27 2010-06-07 삼성전자주식회사 Apparatus and method for creating multi-view image for stereo image
KR20110050364A (en) * 2009-11-06 2011-05-13 삼성전자주식회사 Method and apparatus for parallax adjusting in stereoscopic video

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Chung-Lin Huang et al., IEEE Transactions on Circuits and Systems for Video Technology, vol. 11, 31 December 2001, pp. 1281-1288 *

Also Published As

Publication number Publication date Type
KR101667011B1 (en) 2016-10-18 grant
KR20130057585A (en) 2013-06-03 application


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 12850966

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase in:

Ref country code: DE

122 Ep: pct app. not ent. europ. phase

Ref document number: 12850966

Country of ref document: EP

Kind code of ref document: A1