WO2013077562A1 - Apparatus and method for setting feature points, and apparatus and method for tracking an object using same - Google Patents

Apparatus and method for setting feature points, and apparatus and method for tracking an object using same

Info

Publication number
WO2013077562A1
Authority
WO
WIPO (PCT)
Prior art keywords
frame
block
feature point
sad
sum
Prior art date
Application number
PCT/KR2012/008896
Other languages
English (en)
Korean (ko)
Inventor
우대식
박재범
전병기
김종대
정원석
Original Assignee
에스케이플래닛 주식회사
시모스 미디어텍(주)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from KR1020110123376A (KR101624182B1)
Priority claimed from KR1020110123372A (KR101700276B1)
Application filed by 에스케이플래닛 주식회사 and 시모스 미디어텍(주)
Publication of WO2013077562A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/20Analysis of motion
    • G06T7/246Analysis of motion using feature-based methods, e.g. the tracking of corners or segments

Definitions

  • a block having a predetermined size is formed around the designated feature point in an input image, and blocks corresponding to the size of the formed block are defined within a preset movement range.
  • each block defined within the movement range is frequency-transformed to obtain a gradient representative value, and the center point of the block having the largest gradient representative value in the movement range is reassigned as the final feature point.
  • the present invention relates to dividing each frame of the input image into a predetermined number of blocks, calculating the sum of absolute differences (SAD) between the current frame and its adjacent frames for each block, summing all the SADs between adjacent frames to obtain a total SAD for each frame, obtaining a tracking error value for each frame using the total SADs, and then selecting the frame having the smallest tracking error value as a key frame.
  • SAD: sum of absolute differences
  • the object tracking technique extracts feature points and tracks these feature points in the next image. Feature points are extracted at strong edge or corner components of the image, because such positions can be maintained and scored even when there is motion, and each feature point is assigned a score expressing its value as a feature point based on the characteristics of these components.
  • key frame extraction, highlighting, and video indexing are found in digital video search, recognition, tracking, monitoring, and blocking systems built into the various camcorders, recorders, and video players that consumers can own; to support such functions, scene change detection must be performed.
  • object tracking technology searches for the shape of the same object across consecutive video frames and attempts to keep track of its shape and direction as consistently as possible.
  • an object in an image may move while maintaining its shape or while deforming.
  • to track the shape of a specific object for stereoscopic conversion, the error in the shape of the outline representing the object should be minimized.
  • the error in the shape of the outline grows larger over the course of tracking through each scene.
  • tracking the object starting from frame 9 of the scene can reduce the errors in generating the depth map required for stereoscopic conversion, so it is most reasonable to select frame 9 as the "key frame", that is, the starting point of tracking.
  • an object of the present invention is to obtain, when a user manually designates a feature point, a representative value of the degree of variation for each block in consideration of the pixel matrix surrounding the designated feature point, to reset the user-designated feature point based on that representative value, and
  • thereby to provide a feature point setting apparatus and method that increase the accuracy of feature point tracking along an object's outline for stereoscopic conversion, and an object tracking apparatus and method using the same.
  • another object of the present invention is to find the frame with the smallest sum of errors by simple calculations, without performing tracking for each frame, so that a key frame can be easily selected, and to estimate the error generated by the movement of an object;
  • the present invention provides an apparatus and method for setting a feature point that can select the key frame minimizing the cumulative error using this estimate.
  • a feature point defining unit for receiving and storing a feature point corresponding to the initial position of the object from the user with respect to the input image;
  • a block forming unit forming a block having a predetermined size around the designated feature point, and defining blocks corresponding to the size of the formed block within a predetermined moving range;
  • a gradient representative value calculator for frequency-transforming each block defined within the moving range to obtain a gradient representative value for each block;
  • a feature point determiner for reassigning the center point of the block having the largest gradient representative value to the final feature point within the movement range.
  • a SAD calculation unit for dividing each frame of the input image into a predetermined number of blocks and obtaining, for each block, the sum of absolute differences (SAD) between the current frame and its adjacent frames;
  • a SAD total calculation unit that calculates a total SAD for each frame by adding up all SADs between adjacent frames;
  • a tracking error calculator which calculates a tracking error value for each frame by using the sum of SADs for each frame;
  • a key frame selecting unit configured to select a frame having the smallest tracking error value as a key frame.
  • a feature point setting unit for setting the feature point of the object by checking the validity of the feature point corresponding to the initial position of the object specified by the user with respect to the input image;
  • an image analyzer extracting motion vector information and residual signal information of each frame from the input image; and
  • an object tracking unit generating forward motion vector information for each unit block from the extracted motion vector information, restoring pixel information of a predetermined block from the extracted residual signal, and then generating optimal position information of the object in each frame from the position information of the set feature point, the forward motion vector information, and the restored pixel information.
  • the feature point setting method is embodied as a program recorded on a recording medium readable by an electronic device.
  • the key frame selection method is embodied as a program recorded on a recording medium readable by an electronic device.
  • the object tracking method using the feature point is embodied as a program recorded on a recording medium readable by an electronic device.
  • according to the present invention, when the user manually designates a feature point, a representative value of the degree of variation is obtained for each block in consideration of the pixel matrix surrounding the user-designated feature point, and the user-designated feature point can be reset based on that representative value.
  • a key frame can be easily selected by finding the frame having the smallest sum of errors through a simple operation, without performing tracking for each frame.
  • FIGS. 1 and 2 are views for explaining the cumulative tracking error relationship in conventional stereoscopic conversion.
  • FIG. 3 is a block diagram schematically showing the configuration of a feature point setting apparatus according to the present invention.
  • FIG. 4 is a view for explaining a method for forming a block according to the present invention.
  • FIG. 5 is a view showing a frequency-converted block according to the present invention.
  • FIG. 6 is an exemplary view for explaining a method for obtaining an inter-frame SAD according to the present invention.
  • FIG. 7 is a graph showing the total SAD of each frame in a scene to be tracked according to the present invention.
  • the feature point setting device 300 includes a feature point defining unit 310, a block forming unit 320, a gradient representative value calculating unit 330, a feature point determining unit 340, a sum of absolute difference (SAD) calculator 350, a SAD total calculator 360, a tracking error calculator 370, and a key frame selector 380.
  • the feature point definer 310 receives and stores a feature point corresponding to the initial position of the object from the user with respect to the input image. That is, the user designates a feature point corresponding to the initial position of the object in the input image, and the feature point defining unit 310 stores the feature point specified by the user.
  • the feature points refer to edges or corners.
  • the block forming unit 320 forms a block having a predetermined size based on the feature point specified by the feature point defining unit 310 and defines blocks corresponding to the size of the formed block within a preset movement range.
  • the block forming unit 320 forms an arbitrary block including peripheral pixels around the designated feature point in the input image, and sets a movement range within which the formed block can move up, down, left, and right.
  • the moving range of the block may be a range previously defined by the user.
  • blocks that can be formed within the movement range are defined.
  • the block forming unit 320 forms an arbitrary block, Block(n) 410, around the feature point. Block(n) 410 is an arbitrary block including the peripheral pixels around feature point n.
  • the feature point coordinates (i, j) 400 of Block(n) 410 are the user-designated coordinates, and the block can move up, down, left, and right by a predefined movable size of d(x), d(y). The upper-left block 420 and the lower-right block 430 therefore mark the movement range within which Block(n) 410 can move. As a result, Block(n) has a spatial movement range of (2*dx + 2*dy) positions, and a total of (2*dx + 2*dy) blocks Block(n) may be defined within the movement range.
  • the gradient representative value calculator 330 obtains the gradient representative value of each block by frequency-transforming all the blocks defined in the movement range set by the block forming unit 320. If (2*dx + 2*dy) blocks are defined in the movement range, the gradient representative value calculator 330 frequency-transforms each of the (2*dx + 2*dy) blocks and obtains a gradient representative value for each.
  • the gradient representative value calculator 330 performs frequency conversion on each block, and obtains a gradient representative value of each block by summing pixel values corresponding to a high frequency region in the frequency-converted block.
  • the score that assigns a value to each candidate feature point within the space movable around the user-designated (i, j) coordinates, that is, the reliability of the feature point, is computed using an FFT or DFT representing the frequency characteristics of the corresponding block(n).
  • when a two-dimensional fast Fourier transform (FFT) or discrete Fourier transform (DFT) is applied to block(n), a flat, simple image concentrates its spectral energy in the low-frequency region at the upper left of the transformed block, as shown in FIG. 5, while an image with large pixel variation carries more energy in the high-frequency region.
  • FFT: fast Fourier transform
  • DFT: discrete Fourier transform
  • the gradient representative value calculator 330 calculates a gradient representative value for each block defined within the (2*dx + 2*dy) space in which an arbitrary block(n) can move, that is, within the movement range.
  • the feature point determiner 340 reassigns, as the final feature point, the center point of the block having the largest gradient representative value within the movement range.
  • if the periphery selected as a feature point is a simply flat image, or the variation is not large, it is usually not easy to distinguish it in the following image; if it has a differentiating characteristic, that is, large variation between pixels, that variation is easy to compare in the next image. In other words, a part with a large degree of variation is easy to track, and its tracking error is small. If a specific part has nevertheless been selected as a feature point according to the user's intention, the feature point setting device 300 may reselect, within the preset movement range, a feature point that is more easily tracked; a minimal sketch of this rescoring follows.
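The rescoring step above can be made concrete with a short sketch. The following is a minimal illustration, not the patent's implementation: it assumes a grayscale numpy image, treats the central region of the shifted spectrum as the low-frequency part, and all names (reassign_feature_point, dx, dy) are invented.

```python
import numpy as np

def reassign_feature_point(image, i, j, block=8, dx=2, dy=2):
    """Re-score a user-designated feature point (i, j).

    Every candidate block inside the movement range is frequency-
    transformed with a 2-D FFT; the high-frequency energy serves as the
    gradient representative value, and the center of the best-scoring
    block becomes the final feature point.
    """
    half, q = block // 2, block // 4
    best_score, best_pt = -1.0, (i, j)
    for di in range(-dy, dy + 1):          # vertical offsets
        for dj in range(-dx, dx + 1):      # horizontal offsets
            ci, cj = i + di, j + dj
            patch = image[ci - half:ci + half, cj - half:cj + half]
            if patch.shape != (block, block):
                continue                   # candidate falls off the image
            spectrum = np.abs(np.fft.fftshift(np.fft.fft2(patch.astype(float))))
            # Central region of the shifted spectrum = low frequencies
            # (this split is an assumption, not taken from the patent).
            low = spectrum[half - q:half + q, half - q:half + q].sum()
            score = spectrum.sum() - low   # high-frequency energy
            if score > best_score:
                best_score, best_pt = score, (ci, cj)
    return best_pt
```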
  • the SAD calculator 350 divides the frame of the input image into a predetermined number of blocks, and then obtains SAD between the current frame and the previous frame, and the current frame and the subsequent frame, for each block.
  • each block is an N×N block, for example a 16×16, 8×8, or 4×4 block.
  • each block is called a macroblock; the frame is divided into blocks of a size appropriate to the whole image, generally between 16×16 and 32×32 (see the partitioning sketch below).
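As a rough illustration of the block partitioning described above (a minimal numpy sketch; the function name and the assumption that the frame dimensions are multiples of n are mine):

```python
import numpy as np

def split_into_blocks(frame, n=16):
    """Partition an H x W frame into non-overlapping n x n blocks."""
    h, w = frame.shape
    return (frame.reshape(h // n, n, w // n, n)
                 .swapaxes(1, 2)
                 .reshape(-1, n, n))
```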
  • the SAD calculator 350 calculates the SAD between the current frame and each adjacent frame, block by block, using Equation 1.
  • the adjacent frame refers to a previous frame of the current frame and a subsequent frame of the current frame.
  • fn is the frame number
  • bm is the m-th block of the frame
  • i is the index of a pixel within the block
  • abs is the absolute value.
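Equation 1 appears only as an image in the original publication. A plausible reconstruction from the symbol definitions above, writing P(f, b, i) for the i-th pixel of block b in frame f (P is an assumed symbol), is:

$$\mathrm{SAD}(f_n, b_m) = \sum_{i} \mathrm{abs}\!\left(P(f_n, b_m, i) - P(f_{n+1}, b_m, i)\right)$$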
  • the SAD calculator 350 obtains SAD between the current frame and the previous frame, and SAD between the current frame and the subsequent frame, for each block.
  • the SAD calculator 350 computes SADs between blocks bm at the same position in each frame, that is, SAD(fn-1, bm) between the current frame's block bm and the previous frame's block bm, and SAD(fn, bm) between the current frame's block bm and the next frame's block bm.
  • the SAD total calculation unit 360 obtains the total SAD by adding up all the per-block SADs for each frame.
  • the SAD sum calculator 360 obtains a first SAD sum by adding the SADs of all blocks between the current frame and the previous frame, obtains a second SAD sum by adding the SADs of all blocks between the current frame and the next frame, and then adds the first and second SAD sums to obtain the total SAD for each frame.
  • the SAD sum calculating unit 360 obtains the second SAD sum, SAD(fn), by adding the SADs of all blocks between the current frame and the subsequent frame, as shown in Equation 2.
  • j is an index of the block number.
  • the SAD (fn) obtained in Equation 2 is the sum of SADs between the current frame and the subsequent frame, and the SAD total calculation unit 360 should consider the previous frame as well as the subsequent frame to obtain the SAD total in the current frame.
  • the SAD sum calculator 360 calculates the total SAD for each frame, tSAD(fn), by using Equation 3 below.
  • SAD(fn-1) is the first SAD sum, obtained by adding the SADs of all blocks between the current frame and the previous frame; SAD(fn) is the second SAD sum, obtained by adding the SADs of all blocks between the current frame and the next frame.
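Equations 2 and 3 are likewise images in the original; consistent with the surrounding description, they can plausibly be reconstructed as

$$\mathrm{SAD}(f_n) = \sum_{j} \mathrm{SAD}(f_n, b_j), \qquad t\mathrm{SAD}(f_n) = \mathrm{SAD}(f_{n-1}) + \mathrm{SAD}(f_n),$$

where j runs over all blocks of the frame.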
  • the SAD sum calculated by the SAD sum calculation unit 360 is a representative value representing the magnitude of the inter-frame variability, that is, the difficulty of tracking.
  • when the total SAD obtained for each frame by the SAD total calculation unit 360 is plotted, the result is as shown in FIG. 7, which shows the total SAD, tSAD(fn), of each frame in the scene to be tracked.
  • the tracking error calculator 370 calculates a tracking error value for each frame by using the SAD total for each frame obtained by the SAD total calculator 360.
  • the tracking error calculator 370 calculates the tracking error value of each frame of the scene to be tracked by accumulating the frame's own total SAD onto the total SAD accumulated up to the previous frame.
  • the tracking error value between the current frame (fn) and the subsequent frame (fn+1) may be obtained using Equation 3, but the cumulative error between frame (fn+1) and the next frame (fn+2) must take into account the error already reflected up to the previous frame; consequently, the tracking error value from frame (fn+1) to frame (fn+2) is added onto the previous tracking error value.
  • the tracking error value in frame fn is tSAD (fn).
  • the tracking error value for each frame obtained by the tracking error calculator 370 is represented by a graph as shown in FIG. 8.
  • the tracking error value of frame f(n+1) is the result of accumulating its own total SAD onto the tracking error value of frame fn.
  • the tracking error value of frame f(n+2) is the result of accumulating its own total SAD onto the tracking error value of frame f(n+1).
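The recurrence described in the two bullets above (apparently the Equation 4 referenced later) can plausibly be written as

$$\mathrm{Err}(f_{k+1}) = \mathrm{Err}(f_k) + t\mathrm{SAD}(f_{k+1}),$$

with Err (an assumed symbol for the accumulated value) initialized to tSAD at the first frame of the scene to be tracked.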
  • the key frame selecting unit 380 selects, as the key frame, the frame whose tracking error value is the smallest among those obtained by the tracking error calculating unit 370.
  • the key frame selector 380 selects fn as a key frame of the scene.
  • the precondition under which the feature point setting device 300 selects a key frame using the SAD is that, when tracking a specific object from one image to the next across motion, more motion in the image generally increases the tracking error. A large amount of movement does not always increase the tracking error, but on average it means high volatility of the specific object being tracked, which makes tracking difficult.
  • the feature point setting device 300 therefore obtains the SAD between frames and selects a key frame using the SAD, as sketched below.
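A minimal sketch of this key frame selection follows. One hedge is needed: an error that strictly accumulates frame by frame would always be smallest at the first frame, so the sketch adopts one plausible reading, that tSAD accumulates outward from the candidate starting frame and the start minimizing the total accumulated error wins; all names are assumptions.

```python
import numpy as np

def select_key_frame(frames):
    """Pick a key frame from a list of equally sized grayscale frames."""
    f = [fr.astype(np.int64) for fr in frames]
    n = len(f)
    # Per-frame SAD to the next frame (the per-block SADs of Equation 2,
    # summed over the whole frame).
    sad = [np.abs(f[k] - f[k + 1]).sum() for k in range(n - 1)]
    # Equation 3: tSAD(fn) = SAD(fn-1) + SAD(fn); missing neighbors count 0.
    t_sad = [(sad[k - 1] if k > 0 else 0) + (sad[k] if k < n - 1 else 0)
             for k in range(n)]

    def total_error(start):
        # Accumulate tSAD outward from the candidate start frame.
        forward = np.cumsum(t_sad[start + 1:]).sum()
        backward = np.cumsum(t_sad[:start][::-1]).sum()
        return forward + backward

    return int(min(range(n), key=total_error))
```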
  • FIG. 9 is a block diagram schematically illustrating a configuration of an object tracking apparatus using feature points according to an embodiment of the present invention.
  • the object tracking apparatus 900 using the feature points includes a feature point setter 910, an image analyzer 920, and an object tracker 930.
  • the feature point setting unit 910 sets the feature point of the object by checking whether the feature point corresponding to the initial position of the object designated by the user is valid.
  • the feature point setting unit 910 forms an arbitrary block around the feature point designated by the user in the input image, sets a movement range for the formed block, and then frequency-transforms each block defined within the movement range to obtain a gradient representative value for each block. Then, the feature point setting unit 910 sets the center point of the block having the largest gradient representative value within the movement range as the final feature point. The method by which the feature point setting unit 910 sets the feature point is the same as the operation of the feature point setting apparatus of FIG. 3, so a detailed description is omitted.
  • the image analyzer 920 extracts motion vector information and residual signal information of each frame from the input image.
  • the object tracker 930 generates forward motion vector information for each unit block from the extracted motion vector information, restores pixel information of a predetermined block from the extracted residual signal, and then generates the optimal position information of the object in each frame from the position information of the feature point set by the feature point setting unit 910, the forward motion vector information, and the restored pixel information.
  • the object tracking unit 930 determines an object feature point candidate in the current frame using the pixel value difference between the current frame and the previous frame, and then determines the object feature point of the current frame by searching, within a predetermined area surrounding that candidate, for a template similar to a predetermined template that includes the feature point set by the feature point setting unit 910.
  • the object tracking unit 930 calculates an optical flow for the object feature point set by the feature point setting unit 910 using the pixel value difference between the previous frame and the current frame, and determines an object feature point candidate in the current frame using the calculated optical flow.
  • the object tracker 930 then uses template matching to search, in the area around the determined object feature point candidate of the current frame, for a template similar to the template that includes the object feature point set by the feature point setter 910.
  • the object tracking unit 930 has been described here in terms of these two methods, optical flow followed by template matching, but various conventional methods may be used; a sketch of the two-step scheme appears below.
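The two methods can be sketched with standard OpenCV calls. This is an illustrative combination under assumed parameters (patch and search sizes), not the patent's implementation:

```python
import cv2
import numpy as np

def track_feature_point(prev_gray, curr_gray, pt, patch=21, search=32):
    """Track one feature point from prev_gray to curr_gray."""
    p0 = np.array([[pt]], dtype=np.float32)
    # Step 1: pyramidal Lucas-Kanade optical flow proposes a candidate.
    p1, status, _err = cv2.calcOpticalFlowPyrLK(prev_gray, curr_gray, p0, None)
    cand = p1[0, 0] if status[0, 0] else p0[0, 0]

    # Step 2: template matching around the candidate refines the position.
    h = patch // 2
    x0, y0 = int(pt[0]), int(pt[1])
    templ = prev_gray[y0 - h:y0 + h + 1, x0 - h:x0 + h + 1]
    cx, cy = int(cand[0]), int(cand[1])
    wx0, wy0 = max(cx - search, 0), max(cy - search, 0)
    win = curr_gray[wy0:cy + search, wx0:cx + search]
    if templ.shape != (patch, patch) or win.shape[0] < patch or win.shape[1] < patch:
        return float(cand[0]), float(cand[1])   # fall back to optical flow
    res = cv2.matchTemplate(win, templ, cv2.TM_CCOEFF_NORMED)
    _, _, _, best = cv2.minMaxLoc(res)
    return float(wx0 + best[0] + h), float(wy0 + best[1] + h)
```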
  • FIG. 10 is a flowchart illustrating a method of resetting a feature point designated by a user by the apparatus for setting a feature point according to the present invention
  • FIG. 11 is an exemplary diagram for describing a case in which a feature point according to the present invention is reassigned.
  • the feature point setting apparatus 300 receives and stores a feature point corresponding to an initial position of an object from a user with respect to an input image in operation S1002.
  • After performing S1002, the feature point setting apparatus 300 forms an arbitrary block around the designated feature point in the input image (S1004) and sets a movement range for the formed block (S1006).
  • Setting the moving range by the feature point setting apparatus 300 means that the formed block is moved within the moving range so that a plurality of blocks are defined. For example, when a moving range is defined as (2 * dx + 2 * dy), (2 * dx + 2 * dy) blocks may be defined in the moving range.
  • After the operation of S1006, the feature point setting apparatus 300 obtains a representative value of the degree of variation of each block by frequency-transforming each block defined in the movement range (S1008). That is, the feature point setting apparatus 300 frequency-transforms each block and obtains the representative value for each block by adding the values of the high-frequency region in the frequency-transformed image.
  • the feature point setting device 300 reassigns the center point of the block having the largest change representative value among the blocks defined in the movement range as the final feature point (S1010).
  • a method of resetting the feature point set by the user by the feature point setting apparatus 300 will be described with reference to FIG. 11.
  • as shown in (b) of FIG. 11, the feature point setting device searches the surroundings to reassign a feature point having a better score.
  • the feature point may be set to use the original designated location.
  • at typical resolutions such as 1920x1080 or 1280x720, it is practically sufficient to reassign feature points within a neighborhood of about 3x3 to 5x5 pixels.
  • FIG. 12 is a flowchart illustrating a method for tracking an object using a feature point in the object tracking device according to the present invention.
  • the object tracking apparatus 900 sets the feature point of an object by checking whether the feature point corresponding to the initial position of the object designated by the user is valid (S1202). That is, the object tracking device 900 forms an arbitrary block around the user-designated feature point in the input image, sets a movement range for the formed block, and then frequency-transforms each block defined within the movement range to obtain a gradient representative value for each block. Then, the object tracking device 900 sets the center point of the block having the largest gradient representative value within the movement range as the final feature point.
  • After performing S1202, the object tracking device 900 extracts motion vector information and residual signal information of each frame from the input image (S1204).
  • After the operation of S1204, the object tracking device 900 generates the optimal position information of the object in each frame using the position information of the set feature point, the motion vector information, and the residual signal information (S1206). That is, the object tracking device 900 generates forward motion vector information for each unit block from the extracted motion vector information, restores pixel information of a predetermined block from the extracted residual signal, and then generates the optimal position information of the object in each frame from the position information of the set feature point, the forward motion vector information, and the restored pixel information.
  • the object tracking apparatus 900 predicts the position coordinates of each feature point in the next frame from the feature point initial position information and the forward motion vector information of the previous frame.
  • the object tracking device 900 extracts at least one candidate position coordinate from the position coordinate of the predicted feature point in order to find the position coordinate of the feature point more accurately. That is, the object tracking device 900 selects and extracts candidate position coordinates within a predetermined range based on the position coordinates of the predicted feature points. Then, the object tracking device 900 measures texture similarity energy, shape similarity energy, and motion similarity energy of each feature point candidate position coordinate, respectively.
  • the present invention can be embodied as computer readable codes on a computer readable recording medium.
  • the computer-readable recording medium includes all kinds of recording devices in which data that can be read by a computer system is stored.
  • FIG. 13 is a flowchart illustrating a method for selecting a key frame by a feature point setting apparatus according to the present invention.
  • the feature point setting apparatus 300 divides the frame of the input image into a predetermined number of blocks (S1302).
  • the feature point setting apparatus 300 obtains SAD between the current frame and the adjacent frame for each block (S1304).
  • the feature point setting apparatus 300 obtains SAD between the current frame and the previous frame, and SAD between the current frame and the subsequent frame, for each block.
  • the SAD is a sum of absolute differences of all pixels in a block, and as a result, represents a difference of blocks at the same position between frames.
  • the feature point setting apparatus 300 obtains the total SAD by adding the SADs of each block for each frame (S1306).
  • the feature point setting apparatus 300 obtains a first SAD sum by adding the SADs of all blocks between the current frame and the previous frame, obtains a second SAD sum by adding the SADs of all blocks between the current frame and the subsequent frame, and then adds the first and second SAD sums to obtain the total SAD for each frame.
  • the feature point setting apparatus 300 calculates a tracking error value for each frame using the total SAD for each frame (S1308).
  • the feature point setting apparatus 300 calculates a tracking error value by accumulating the total SAD obtained up to the previous frame and its total SAD for each frame of the scene to be tracked.
  • the feature point setting apparatus 300 calculates a tracking error value for each frame of a scene to be tracked using Equation 4 described with reference to FIG. 3.
  • the feature point setting apparatus 300 selects a frame having the smallest tracking error value as a key frame (S1310).
  • the feature point setting apparatus 300 selects a key frame through the above process; although the process looks complicated, it consists of simple sum operations and amounts to an extremely small amount of computation compared to actually tracking an object, so the optimal key frame can be predicted before tracking begins.
  • through the above process, the feature point setting apparatus 300 can easily select a key frame by a simple inspection and calculation, without performing tracking for each frame; a rough usage illustration follows.
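As a rough usage illustration of the select_key_frame sketch given earlier (synthetic data; the exact result depends on that sketch's assumed reading of the accumulation):

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic scene: three noisy, high-motion frames followed by seven calm ones.
frames = [rng.integers(0, 256, (64, 64), dtype=np.uint8) for _ in range(3)]
frames += [np.full((64, 64), 128, dtype=np.uint8) for _ in range(7)]
print(select_key_frame(frames))  # index of the cheapest frame to start tracking from
```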
  • the key frame selection method of the feature point setting apparatus 300 described above can be written as a program, and the codes and code segments constituting the program can be readily inferred by programmers skilled in the art.
  • the program implementing the key frame selection method of the feature point setting device 300 is stored on media readable by an electronic device; when it is read and executed by the electronic device, a key frame can be selected from the frames of the scene to be tracked.
  • according to the present invention, a representative value of the degree of variation is obtained for each block in consideration of the pixel matrix surrounding the user-designated feature point, and the user-designated feature point is reset based on that representative value; the invention can be applied to a feature point setting apparatus and method and to an object tracking apparatus and method using the same.
  • the present invention can easily select a key frame by finding the frame with the smallest sum of errors through simple calculations, without performing tracking for each frame; using the SAD as an estimate of the error generated by the movement of the object, it can be applied to a feature point setting apparatus and method that select the key frame minimizing the cumulative error.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)

Abstract

The present invention relates to an apparatus and method for setting feature points and to an apparatus and method for tracking an object using the same. According to the present invention, when a user designates feature points, a representative value of the variability of each block is obtained in consideration of the pixel matrix surrounding the user-designated feature points, and the user-designated feature points are reset on the basis of that representative value. Consequently, when stereoscopic conversion is performed, the accuracy of tracking feature points on the outer contour of the object can be improved. Moreover, a key frame whose total sum of errors is minimal can easily be found and selected by simple calculations, without performing tracking for each frame.
PCT/KR2012/008896 2011-11-24 2012-10-26 Apparatus and method for setting feature points, and apparatus and method for tracking an object using same WO2013077562A1

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
KR1020110123376A KR101624182B1 (ko) 2011-11-24 2011-11-24 Apparatus and method for selecting a key frame
KR1020110123372A KR101700276B1 (ko) 2011-11-24 2011-11-24 Apparatus and method for setting feature points, and apparatus and method for tracking an object using the same
KR10-2011-0123372 2011-11-24
KR10-2011-0123376 2011-11-24

Publications (1)

Publication Number Publication Date
WO2013077562A1

Family

ID=48469967

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2012/008896 WO2013077562A1 Apparatus and method for setting feature points, and apparatus and method for tracking an object using same

Country Status (1)

Country Link
WO (1) WO2013077562A1

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100466587B1 * 2002-11-14 2005-01-24 한국전자통신연구원 Camera information extraction method for a composite-image content authoring tool
KR20050085842A * 2002-12-20 2005-08-29 더 파운데이션 포 더 프로모션 오브 인더스트리얼 사이언스 Method and device for tracking a moving object in an image
KR20080017521A * 2006-08-21 2008-02-27 문철홍 Multiple-object tracking method using difference images
KR100958379B1 * 2008-07-09 2010-05-17 (주)지아트 Method and apparatus for tracking multiple objects, and storage medium
KR100996209B1 * 2008-12-23 2010-11-24 중앙대학교 산학협력단 Object modeling method and system using a change-value template

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 12851233; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 12851233; Country of ref document: EP; Kind code of ref document: A1)